WorldWideScience

Sample records for methods proposed update

  1. Test Methods for Evaluating Solid Waste, Physical/Chemical Methods. First Update. (3rd edition)

    International Nuclear Information System (INIS)

    Friedman; Sellers.

    1988-01-01

    The proposed Update is for Test Methods for Evaluating Solid Waste, Physical/Chemical Methods, SW-846, Third Edition. Attached to the report is a list of methods included in the proposed update, indicating whether each method is a new method, a partially revised method, or a totally revised method. Do not discard or replace any of the current pages in the SW-846 manual until the proposed Update I package is promulgated. Until promulgation of the update package, the methods in the update package are not officially part of the SW-846 manual and thus do not carry the status of EPA-approved methods. In addition to the proposed Update, six finalized methods are included for immediate inclusion in the Third Edition of SW-846. Four methods, originally proposed October 1, 1984, will be finalized in a soon-to-be-released rulemaking. They are, however, being submitted to subscribers for the first time in the update. These methods are 7211, 7381, 7461, and 7951. Two other methods were finalized in the 2nd Edition of SW-846. They were inadvertently omitted from the 3rd Edition and are not being proposed as new. These methods are 7081 and 7761.

  2. 76 FR 28194 - Proposed FOIA Fee Schedule Update

    Science.gov (United States)

    2011-05-16

    ... DEFENSE NUCLEAR FACILITIES SAFETY BOARD 10 CFR Part 1703 Proposed FOIA Fee Schedule Update AGENCY... publishing its proposed Freedom of Information Act (FOIA) Fee Schedule Update and solicits comments from... on the proposed fee schedule should be mailed or delivered to the Office of the General Counsel...

  3. 75 FR 27228 - Proposed FOIA Fee Schedule Update

    Science.gov (United States)

    2010-05-14

    ... DEFENSE NUCLEAR FACILITIES SAFETY BOARD 10 CFR Part 1703 Proposed FOIA Fee Schedule Update AGENCY... publishing its proposed Freedom of Information Act (FOIA) Fee Schedule Update and solicits comments from... on the proposed fee schedule should be mailed or delivered to the Office of the General Counsel...

  4. 77 FR 33980 - Proposed FOIA Fee Schedule Update

    Science.gov (United States)

    2012-06-08

    ... 1703 Proposed FOIA Fee Schedule Update AGENCY: Defense Nuclear Facilities Safety Board. ACTION: Notice... the Board's proposed FOIA Fee Schedule Update published in the Federal Register of June 1, 2012. The...: The FOIA requires each Federal agency covered by the Act to specify a schedule of fees applicable to...

  5. A parallel orbital-updating based plane-wave basis method for electronic structure calculations

    International Nuclear Information System (INIS)

    Pan, Yan; Dai, Xiaoying; Gironcoli, Stefano de; Gong, Xin-Gao; Rignanese, Gian-Marco; Zhou, Aihui

    2017-01-01

    Highlights: • Three parallel orbital-updating based plane-wave basis methods are proposed for electronic structure calculations. • The new methods avoid generating large-scale eigenvalue problems and thus reduce the computational cost. • The new methods allow for two-level parallelization, which is particularly interesting for large-scale parallelization. • Numerical experiments show that these new methods are reliable and efficient for large-scale calculations on modern supercomputers. - Abstract: Motivated by the recently proposed parallel orbital-updating approach in the real-space method, we propose a parallel orbital-updating based plane-wave basis method for electronic structure calculations, for solving the corresponding eigenvalue problems. In addition, we propose two new modified parallel orbital-updating methods. Compared to the traditional plane-wave methods, our methods allow for two-level parallelization, which is particularly interesting for large-scale parallelization. Numerical experiments show that these new methods are more reliable and efficient for large-scale calculations on modern supercomputers.

  6. Aircraft engine sensor fault diagnostics using an on-line OBEM update method.

    Directory of Open Access Journals (Sweden)

    Xiaofeng Liu

    This paper proposes a method to update the on-line health reference baseline of the On-Board Engine Model (OBEM) to maintain the effectiveness of an in-flight aircraft sensor Fault Detection and Isolation (FDI) system in which a Hybrid Kalman Filter (HKF) is incorporated. A large health condition mismatch between the engine and the OBEM, generated by rapid in-flight engine degradation, can corrupt the performance of the FDI. Therefore, it is necessary to update the OBEM online when rapid degradation occurs, but the FDI system loses estimation accuracy if the estimation and the update run simultaneously. To solve this problem, the health reference baseline for a nonlinear OBEM is updated using the proposed channel controller method. Simulations based on a turbojet engine Linear Parameter-Varying (LPV) model demonstrate the effectiveness of the proposed FDI system in the presence of substantial degradation, and the channel controller ensures that the update process finishes without interference from a single sensor fault.
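
    The HKF-based FDI scheme described above builds on the standard Kalman measurement update with a residual (innovation) test. A minimal generic sketch of that building block is given below; it is not the paper's HKF, and the gate threshold and variable names are illustrative assumptions only.

      import numpy as np

      def kf_update(x, P, z, H, R, gate=9.21):
          """One linear Kalman measurement update with a chi-square test on the innovation."""
          y = z - H @ x                              # innovation: measured minus predicted output
          S = H @ P @ H.T + R                        # innovation covariance
          d2 = float(y @ np.linalg.solve(S, y))      # Mahalanobis distance of the residual
          K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
          x_new = x + K @ y                          # updated state estimate
          P_new = (np.eye(len(x)) - K @ H) @ P       # updated covariance
          return x_new, P_new, d2 > gate             # flag a possible sensor fault if gated out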

  7. 77 FR 32433 - Proposed FOIA Fee Schedule Update

    Science.gov (United States)

    2012-06-01

    ... 1703 Proposed FOIA Fee Schedule Update AGENCY: Defense Nuclear Facilities Safety Board. ACTION: Notice... Defense Nuclear Facilities Safety Board is publishing its proposed Freedom of Information Act (FOIA) Fee.... on or before July 2, 2012. ADDRESSES: Comments on the proposed fee schedule should be mailed or...

  8. FE Model Updating on an In-Service Self-Anchored Suspension Bridge with Extra-Width Using Hybrid Method

    Directory of Open Access Journals (Sweden)

    Zhiyuan Xia

    2017-02-01

    Nowadays, more and more bridges with extra width are needed for vehicle throughput. In order to obtain a precise finite element (FE) model of such complex bridge structures, a practical hybrid updating method integrating Gaussian mutation particle swarm optimization (GMPSO), a Kriging meta-model and Latin hypercube sampling (LHS) was proposed. After demonstrating the efficiency and accuracy of the hybrid method on the model updating of a damaged simply supported beam, the proposed method was applied to the model updating of an extra-wide self-anchored suspension bridge, for which the results of an ambient vibration test showed updating to be highly necessary. The results of the bridge model updating showed that both the modal frequencies and the mode shapes of the updated model agreed closely with the experimental structure. The successful model updating of this bridge helps fill a gap in the model updating of complex self-anchored suspension bridges. Moreover, the updating process can inform other model updating work for complex bridge structures.
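
    The Gaussian mutation particle swarm component of such a hybrid scheme can be illustrated with a single swarm iteration; the coupling with the Kriging meta-model and Latin hypercube sampling is not shown, and the inertia, acceleration and mutation constants below are assumptions rather than the paper's values. In the hybrid scheme, the objective f would be the Kriging surrogate's prediction of the discrepancy between measured and computed modal data.

      import numpy as np

      rng = np.random.default_rng(0)

      def gmpso_step(pos, vel, pbest, gbest, f, w=0.7, c1=1.5, c2=1.5, pm=0.1):
          """One particle-swarm step with Gaussian mutation; f maps (n_particles, dim) -> (n_particles,)."""
          r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
          vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
          pos = pos + vel
          mask = rng.random(len(pos)) < pm                     # Gaussian mutation keeps the swarm diverse
          pos[mask] += rng.normal(scale=0.1, size=pos[mask].shape)
          better = f(pos) < f(pbest)                           # update personal bests where improved
          pbest = np.where(better[:, None], pos, pbest)
          gbest = pbest[np.argmin(f(pbest))]                   # global best over all personal bests
          return pos, vel, pbest, gbest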

  9. Improved Quasi-Newton method via PSB update for solving systems of nonlinear equations

    Science.gov (United States)

    Mamat, Mustafa; Dauda, M. K.; Waziri, M. Y.; Ahmad, Fadhilah; Mohamad, Fatma Susilawati

    2016-10-01

    The Newton method has some shortcomings, including the computation of the Jacobian matrix, which may be difficult or even impossible, and the need to solve the Newton system in every iteration. A common setback with some quasi-Newton methods is that they need to compute and store an n × n matrix at each iteration, which is computationally costly for large-scale problems. To overcome such drawbacks, an improved method for solving systems of nonlinear equations via the PSB (Powell-Symmetric-Broyden) update is proposed. In the proposed method, the approximate Jacobian inverse Hk of the PSB scheme is updated, improving efficiency and requiring low memory storage, which is the main aim of this paper. Preliminary numerical results show that the proposed method is practically efficient when applied to some benchmark problems.
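
    The classical PSB update itself has a standard closed form; a minimal sketch of the direct form is shown below. Note that the paper works with an approximate Jacobian inverse Hk, which this generic sketch does not reproduce.

      import numpy as np

      def psb_update(B, s, y):
          """PSB update of a Jacobian approximation B, with s = x_{k+1} - x_k and y = F(x_{k+1}) - F(x_k)."""
          r = y - B @ s                              # residual of the secant condition B s = y
          ss = float(s @ s)
          return (B + (np.outer(r, s) + np.outer(s, r)) / ss
                    - (float(r @ s) / ss**2) * np.outer(s, s))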

  10. Fuzzy cross-model cross-mode method and its application to update the finite element model of structures

    International Nuclear Information System (INIS)

    Liu Yang; Xu Dejian; Li Yan; Duan Zhongdong

    2011-01-01

    As a novel updating technique, the cross-model cross-mode (CMCM) method offers high efficiency and flexibility in selecting updating parameters. However, the success of this method depends on the accuracy of the measured mode shapes. In practice, the measured mode shapes are inaccurate because measurement noise is inevitable. Furthermore, the CMCM method requires complete measured mode shapes, so calculation errors may be introduced into the measured mode shapes when a modal expansion or model reduction technique is applied. Therefore, this algorithm faces challenges in updating the finite element (FE) models of practical complex structures. In this study, a fuzzy CMCM method is proposed in order to weaken the effect of errors in the measured mode shapes on the updated results. Two simulated examples are then used to compare the performance of the fuzzy CMCM method with the CMCM method. The test results show that the proposed method is more promising than the CMCM method for updating the FE models of practical structures.

  11. A Kriging Model Based Finite Element Model Updating Method for Damage Detection

    Directory of Open Access Journals (Sweden)

    Xiuming Yang

    2017-10-01

    Model updating is an effective means of damage identification, and surrogate modelling has attracted considerable attention for saving computational cost in finite element (FE) model updating, especially for large-scale structures. In this context, a surrogate model of frequency is normally constructed for damage identification, while the frequency response function (FRF) is rarely used because it usually changes dramatically with the updating parameters. This paper presents a new surrogate-model-based model updating method that takes advantage of the measured FRFs. The Frequency Domain Assurance Criterion (FDAC) is used to build the objective function, whose nonlinear response surface is constructed by a Kriging model. Then, the efficient global optimization (EGO) algorithm is introduced to obtain the model updating results. The proposed method has good accuracy and robustness, which have been verified by a numerical simulation of a cantilever beam and experimental test data from a laboratory three-story structure.
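
    The FDAC is a MAC-like correlation between measured and analytical FRF vectors evaluated at pairs of frequency lines. A generic sketch is given below; the exact form used to build the objective function in the paper may differ.

      import numpy as np

      def fdac(H_exp, H_ana):
          """FDAC matrix for complex FRF matrices of shape (n_outputs, n_frequencies)."""
          num = np.abs(H_exp.conj().T @ H_ana) ** 2               # |H_exp(w_i)^H H_ana(w_j)|^2
          d_exp = np.real(np.sum(H_exp.conj() * H_exp, axis=0))   # ||H_exp(w_i)||^2 per frequency line
          d_ana = np.real(np.sum(H_ana.conj() * H_ana, axis=0))   # ||H_ana(w_j)||^2 per frequency line
          return num / np.outer(d_exp, d_ana)                     # values in [0, 1]; 1 means full correlation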

  12. Technical Notes: Notes and Proposed Guidelines on Updated ...

    African Journals Online (AJOL)

    In light of the recent expansion in the planning and construction of major building structures as well as other infrastructure such as railways, mass housing, dams, bridges, etc., this paper reviews the extent of seismic hazard in Ethiopia and proposes a review and update of the current out-dated and - in most cases ...

  13. A Weighted Two-Level Bregman Method with Dictionary Updating for Nonconvex MR Image Reconstruction

    Directory of Open Access Journals (Sweden)

    Qiegen Liu

    2014-01-01

    Nonconvex optimization has been shown to need substantially fewer measurements than l1 minimization for exact recovery under a fixed transform/overcomplete dictionary. In this work, two efficient numerical algorithms, unified under the name weighted two-level Bregman method with dictionary updating (WTBMDU), are proposed for solving lp optimization under the dictionary learning model while subjecting the fidelity to the partial measurements. By incorporating an iteratively reweighted norm into the two-level Bregman iteration method with dictionary updating scheme (TBMDU), the modified alternating direction method (ADM) efficiently solves the model with the approximated lp-norm penalty. Specifically, the algorithms converge after a relatively small number of iterations under the formulation of iteratively reweighted l1 and l2 minimization. Experimental results on MR image simulations and real MR data, under a variety of sampling trajectories and acceleration factors, consistently demonstrate that the proposed method can efficiently reconstruct MR images from highly undersampled k-space data and offers advantages over current state-of-the-art reconstruction approaches in terms of higher PSNR and lower HFEN values.

  14. Effective updating process of seismic fragilities using Bayesian method and information entropy

    International Nuclear Information System (INIS)

    Kato, Masaaki; Takata, Takashi; Yamaguchi, Akira

    2008-01-01

    Seismic probabilistic safety assessment (SPSA) is an effective method for evaluating the overall seismic safety performance of a plant. Seismic fragilities are estimated to quantify the seismically induced accident sequences. It is a great concern that the SPSA results involve uncertainties, part of which comes from the uncertainty in the seismic fragility of equipment and systems. A straightforward approach to reduce the uncertainty is to perform a seismic qualification test and to reflect the results in the seismic fragility estimate. In this paper, we propose a figure-of-merit to find the most cost-effective conditions for the seismic qualification tests in terms of the acceleration level and the number of components tested. A mathematical method to reflect the test results in the fragility update is then developed. A Bayesian method is used for the fragility update procedure. Since the lognormal distribution used for the fragility model does not have a Bayes conjugate function, a parameterization method is proposed so that the posterior distribution expresses the characteristics of the fragility. The information entropy is used as the figure-of-merit to express the importance of the obtained evidence. It is found that the information entropy is strongly associated with the uncertainty of the fragility. (author)
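
    The broad idea of updating a lognormal fragility Pf(a) = Phi(ln(a/Am)/beta) from pass/fail qualification tests, and scoring the gain with information entropy, can be sketched as follows. The grids, flat prior and test level are hypothetical placeholders, and the paper's specific parameterization of the posterior is not reproduced here.

      import numpy as np
      from scipy.stats import norm

      Am_grid = np.linspace(0.5, 3.0, 121)           # median capacity grid [g] (placeholder range)
      beta_grid = np.linspace(0.2, 0.8, 61)          # log-standard-deviation grid (placeholder range)
      prior = np.ones((len(Am_grid), len(beta_grid)))
      prior /= prior.sum()                           # flat prior over the grid

      def update(prior, a_test, failed):
          Am, beta = np.meshgrid(Am_grid, beta_grid, indexing="ij")
          pf = norm.cdf(np.log(a_test / Am) / beta)  # failure probability at the test level a_test
          like = pf if failed else (1.0 - pf)        # likelihood of the observed pass/fail outcome
          post = prior * like
          return post / post.sum()

      def entropy(p):
          p = p[p > 0]
          return -np.sum(p * np.log(p))              # information entropy of the discrete posterior

      post = update(prior, a_test=1.2, failed=False) # one component survived a 1.2 g test (hypothetical)
      print(entropy(prior) - entropy(post))          # information gained by the test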

  15. Evaluation of two updating methods for dissipative models on a real structure

    International Nuclear Information System (INIS)

    Moine, P.; Billet, L.

    1996-01-01

    Finite element models are widely used to predict the dynamic behaviour of structures. Frequently, the model does not represent the structure with the expected accuracy, i.e. the measurements realised on the structure differ from the data predicted by the model. It is therefore necessary to update the model. Although many modelling errors come from an inadequate representation of damping phenomena, most model updating techniques have so far been restricted to conservative models. In this paper, we present two updating methods for dissipative models that use eigenmode shapes and eigenvalues as behavioural information from the structure. The first method - the modal output error method - compares the experimental eigenvectors and eigenvalues directly to the model eigenvectors and eigenvalues, whereas the second method - the error in constitutive relation method - uses an energy error derived from the equilibrium relation. In both cases the error function is minimized by a conjugate gradient algorithm and the gradient is calculated analytically. These two methods behave differently, which can be evidenced by updating a real structure consisting of a piece of pipe mounted on two viscoelastic suspensions. The updating of the model validates an updating strategy consisting of a preliminary updating with the error in constitutive relation method (a fast-to-converge but difficult-to-control method) followed by further updating with the modal output error method (a slow-to-converge but reliable and easy-to-control method). Moreover, the problems encountered during the updating process and their corresponding solutions are given. (authors)

  16. Mining Sequential Update Summarization with Hierarchical Text Analysis

    Directory of Open Access Journals (Sweden)

    Chunyun Zhang

    2016-01-01

    The outbreak of unexpected news events such as a large human-caused accident or natural disaster brings about a new information access problem where traditional approaches fail. News of these events is typically sparse early on and redundant later. Hence, it is very important to get updates and provide individuals with timely and important information about these incidents as they develop, especially in wireless and mobile Internet of Things (IoT) applications. In this paper, we define the problem of sequential update summarization extraction and present a new hierarchical update mining system which can broadcast useful, new, and timely sentence-length updates about a developing event. The new system uses a novel method that incorporates techniques from topic-level and sentence-level summarization. To evaluate the performance of the proposed system, we apply it to the sequential update summarization task of the temporal summarization (TS) track at the Text Retrieval Conference (TREC) 2013 and compute four measurements of the update mining system: expected gain, expected latency gain, comprehensiveness, and latency comprehensiveness. Experimental results show that our proposed method has good performance.

  17. Nonparametric methods in actigraphy: An update

    Directory of Open Access Journals (Sweden)

    Bruno S.B. Gonçalves

    2014-09-01

    Circadian rhythmicity in humans has been well studied using actigraphy, a method of measuring gross motor movement. As actigraphic technology continues to evolve, it is important for data analysis to keep pace with new variables and features. Our objective is to study the behaviour of two variables, interdaily stability (IS) and intradaily variability (IV), used to describe the rest-activity rhythm. Simulated data and actigraphy data from humans, rats, and marmosets were used in this study. We modified the calculation of IS and IV by varying the time intervals of analysis, and for each variable we calculated the average value across the time intervals (ISm and IVm). Simulated data showed that (1) synchronization analysis depends on sample size, and (2) fragmentation is independent of the amplitude of the generated noise. We were able to obtain a significant difference in the fragmentation patterns of stroke patients using the IVm variable, while no difference was detected using IV60. Rhythmic synchronization of activity and rest was significantly higher in young adults than in adults with Parkinson's disease when using the ISm variable; however, this difference was not seen using IS60. We propose an updated format to calculate rhythmic fragmentation, including two additional optional variables. These alternative methods of nonparametric analysis aim to detect sleep-wake cycle fragmentation and synchronization more precisely.
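
    The two nonparametric variables have standard definitions; a sketch of their computation for an activity series with a fixed sampling interval is shown below. Averaging them over several re-binned analysis intervals gives the ISm/IVm variants described above; the default of 24 bins per day is an assumption.

      import numpy as np

      def is_iv(activity, bins_per_day=24):
          """Interdaily stability (IS) and intradaily variability (IV) of an hourly activity series."""
          x = np.asarray(activity, float)
          x = x[: len(x) - len(x) % bins_per_day]              # use whole days only
          n, mean = len(x), x.mean()
          total_var = np.sum((x - mean) ** 2)
          profile = x.reshape(-1, bins_per_day).mean(axis=0)   # average 24-h activity profile
          IS = n * np.sum((profile - mean) ** 2) / (bins_per_day * total_var)
          IV = n * np.sum(np.diff(x) ** 2) / ((n - 1) * total_var)
          return IS, IV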

  18. EOP Improvement Proposal for SGTR based on The OPR PSA Update

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Hee; Cho, Jae Hyun; Kim, Dong San; Yang, Joon Eon [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    This updating process was also focused on enhancing PSA quality and reflecting the as-built and as-operated conditions of the target plants. For this purpose, the EOP (Emergency Operating Procedure) and AOP (Abnormal Operating Procedure) of the target plant were reviewed in detail, and various thermal-hydraulic (T/H) analyses were performed to build a realistic PSA accident sequence model. In this paper, the unreasonable points of the SGTR (Steam Generator Tube Rupture) EOP from a PSA perspective are identified, and EOP improvement items are proposed to enhance safety and operator convenience for the target plant.

  19. A gradual update method for simulating the steady-state solution of stiff differential equations in metabolic circuits.

    Science.gov (United States)

    Shiraishi, Emi; Maeda, Kazuhiro; Kurata, Hiroyuki

    2009-02-01

    Numerical simulation of differential equation systems plays a major role in understanding how metabolic network models generate particular cellular functions. On the other hand, although many elegant algorithms have been presented, classical technical problems with stiff differential equations remain to be solved. To relax the stiffness problem, we propose new practical methods: the gradual update of differential-algebraic equations based on gradual application of the steady-state approximation to stiff differential equations, and the gradual update of the initial values in differential-algebraic equations. These empirical methods show high efficiency in simulating the steady-state solutions of stiff differential equations that existing solvers alone cannot solve. They are effective in extending the applicability of dynamic simulation to biochemical network models.

  20. Soft sensor modelling by time difference, recursive partial least squares and adaptive model updating

    International Nuclear Information System (INIS)

    Fu, Y; Xu, O; Yang, W; Zhou, L; Wang, J

    2017-01-01

    To handle time-variant and nonlinear characteristics in industrial processes, a soft sensor modelling method based on time difference, moving-window recursive partial least squares (PLS) and adaptive model updating is proposed. In this method, time-difference values of the input and output variables are used as training samples to construct the model, which can reduce the effect of nonlinear characteristics on modelling accuracy while retaining the advantages of the recursive PLS algorithm. To avoid an excessively high model updating frequency, a confidence value is introduced, which is updated adaptively according to the results of the model performance assessment; the model is updated only when the confidence value is updated. The proposed method has been used to predict the 4-carboxybenzaldehyde (CBA) content in the purified terephthalic acid (PTA) oxidation reaction process. The results show that the proposed soft sensor modelling method can effectively reduce computation, improve prediction accuracy by making use of process information and accurately reflect the process characteristics. (paper)
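
    A minimal sketch of the two ingredients that can be made concrete from the description above, time-difference samples and an error-triggered (confidence-based) model update, is given below. The window size, threshold and use of scikit-learn's PLSRegression are assumptions, not the authors' implementation.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      def time_difference(X, y, lag=1):
          """Turn absolute samples into time-difference samples: model dX -> dy instead of X -> y."""
          return X[lag:] - X[:-lag], y[lag:] - y[:-lag]

      class AdaptiveSoftSensor:
          def __init__(self, window=200, threshold=0.5, n_components=3):
              self.window, self.threshold = window, threshold
              self.model = PLSRegression(n_components=n_components)

          def fit(self, dX, dy):
              self.dX, self.dy = dX[-self.window:], dy[-self.window:]
              self.model.fit(self.dX, self.dy)

          def step(self, dx_new, dy_new):
              """Predict one sample; retrain on the moving window only if the error exceeds the threshold."""
              err = abs(self.model.predict(dx_new.reshape(1, -1)).item() - dy_new)
              if err > self.threshold:                          # confidence exceeded: update the model
                  self.dX = np.vstack([self.dX, dx_new])[-self.window:]
                  self.dy = np.append(self.dy, dy_new)[-self.window:]
                  self.model.fit(self.dX, self.dy)
              return err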

  1. Robot Visual Tracking via Incremental Self-Updating of Appearance Model

    Directory of Open Access Journals (Sweden)

    Danpei Zhao

    2013-09-01

    This paper proposes a target tracking method called Incremental Self-Updating Visual Tracking for robot platforms. Our tracker treats the tracking problem as a binary classification between the target and the background. Greyscale, HOG and LBP features are used to represent the target and are integrated into a particle filter framework. To track the target over long sequences, the tracker has to update its model to follow the most recent appearance of the target. To deal with the wasted computation and the lack of a model-updating strategy in traditional methods, an intelligent and effective online self-updating strategy is devised to choose the optimal update opportunity. The decision to update the appearance model is based on the change in discriminative capability between the current frame and the previously updated frame. By adjusting the update step adaptively, severe waste of calculation time on needless updates can be avoided while keeping the model stable. Moreover, the appearance model is kept away from serious drift problems when the target undergoes temporary occlusion. The experimental results show that the proposed tracker achieves robust and efficient performance on several challenging benchmark video sequences with various complex environmental changes in posture, scale, illumination and occlusion.

  2. A visual tracking method based on deep learning without online model updating

    Science.gov (United States)

    Tang, Cong; Wang, Yicheng; Feng, Yunsong; Zheng, Chao; Jin, Wei

    2018-02-01

    The paper proposes a visual tracking method based on deep learning without online model updating. In consideration of the advantages of deep learning in feature representation, the deep model SSD (Single Shot Multibox Detector) is used as the object extractor in the tracking model. Simultaneously, the colour histogram feature and the HOG (Histogram of Oriented Gradients) feature are combined to select the tracking object. During tracking, a multi-scale object searching map is built to improve the detection performance of the deep detection model and the tracking efficiency. In experiments on eight representative tracking video sequences from the baseline dataset, compared with six state-of-the-art methods, the proposed method is more robust to challenging tracking factors such as deformation, scale variation, rotation, illumination variation and background clutter, and its overall performance is better than that of the other six tracking methods.

  3. A hierarchical updating method for finite element model of airbag buffer system under landing impact

    Directory of Open Access Journals (Sweden)

    He Huan

    2015-12-01

    In this paper, we propose an impact finite element (FE) model for an airbag landing buffer system. First, an impact FE model is formulated for a typical airbag landing buffer system. We exploit the independence of the structure FE model from the full impact FE model to develop a hierarchical updating scheme for the recovery module FE model and the airbag system FE model. Second, we define impact responses at key points for comparing computational and experimental results, to resolve the inconsistency between the experimental data sampling frequency and the experimental triggering. To characterise the impact dynamic response of the airbag landing buffer system, we present impact response confidence factors (IRCFs) to evaluate how consistent the computational and experimental results are. An error function is defined between the experimental and computational results at the key points of the impact response (KPIR) to serve as a modified objective function. A radial basis function (RBF) surrogate model over the updating variables is introduced to approximate the objective function, thereby converting the FE model updating problem into a solvable optimization problem. Finally, the developed method is validated through an experimental and computational study of the impact dynamics of a classic airbag landing buffer system.

  4. Updating parameters of the chicken processing line model

    DEFF Research Database (Denmark)

    Kurowicka, Dorota; Nauta, Maarten; Jozwiak, Katarzyna

    2010-01-01

    A mathematical model of chicken processing that quantitatively describes the transmission of Campylobacter on chicken carcasses from slaughter to chicken meat product has been developed in Nauta et al. (2005). This model was quantified with expert judgment. Recent availability of data allows updating parameters of the model to better describe processes observed in slaughterhouses. We propose Bayesian updating as a suitable technique to update expert judgment with microbiological data. Berrang and Dickens's data are used to demonstrate the performance of this method in updating parameters of the chicken processing line model.
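
    Bayesian updating of an expert-elicited parameter with observed counts can be sketched with a conjugate Beta-Binomial pair; the prior and data values below are placeholders and do not reproduce the model's parameters or the Berrang and Dickens data.

      from scipy import stats

      a_prior, b_prior = 2.0, 8.0                        # expert judgment encoded as a Beta prior (mean 0.2)
      k, n = 35, 100                                     # hypothetical data: k positives out of n carcasses

      a_post, b_post = a_prior + k, b_prior + (n - k)    # conjugate Beta-Binomial update
      posterior = stats.beta(a_post, b_post)
      print(posterior.mean(), posterior.interval(0.95))  # updated estimate and 95% credible interval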

  5. Two updating methods for dissipative models with non symmetric matrices

    International Nuclear Information System (INIS)

    Billet, L.; Moine, P.; Aubry, D.

    1997-01-01

    In this paper the feasibility of extending two updating methods to rotating machinery models is considered; the particularity of rotating machinery models is that they use non-symmetric stiffness and damping matrices. It is shown that the two methods described here, the inverse eigensensitivity method and the error in constitutive relation method, can be adapted to such models given some modification. As far as the inverse sensitivity method is concerned, an error function based on the difference between calculated and measured right-hand eigenmode shapes and calculated and measured eigenvalues is used. Concerning the error in constitutive relation method, the equation which defines the error has to be modified because the stiffness matrix is not positive definite. The advantage of this modification is that, in some cases, it is possible to focus the updating process on some specific model parameters. Both methods were validated on a simple test model consisting of a two-bearing and disc rotor system. (author)

  6. Medicare: Comparison of Catastrophic Health Insurance Proposals--An Update. Briefing Report to the Chairman, Select Committee on Aging, House of Representatives.

    Science.gov (United States)

    General Accounting Office, Washington, DC. Div. of Human Resources.

    This document updates a recent report by the General Accounting Office (GAO) which compared Medicare catastrophic health insurance proposals. The update includes H.R. 2470, as passed by the House of Representatives and S. 1127, as reported by the Senate Committee on Finance. An introduction explains the roles of Medicare, Medicaid, the Veterans…

  7. Updating Stiffness and Hysteretic Damping Matrices Using Measured Modal Data

    OpenAIRE

    Jiashang Jiang; Yongxin Yuan

    2018-01-01

    A new direct method for the finite element (FE) matrix updating problem in a hysteretic (or material) damping model based on measured incomplete vibration modal data is presented. With this method, the optimally approximated stiffness and hysteretic damping matrices can be easily constructed. The physical connectivity of the original model is preserved and the measured modal data are embedded in the updated model. The numerical results show that the proposed method works well.

  8. On preconditioner updates for sequences of saddle-point linear systems

    Directory of Open Access Journals (Sweden)

    Simone Valentina De

    2018-02-01

    Updating preconditioners for the solution of sequences of large and sparse saddle-point linear systems via Krylov methods has received increasing attention in the last few years, because it reduces the cost of preconditioning while keeping the efficiency of the overall solution process. This paper provides a short survey of the two approaches proposed in the literature for this problem: updating the factors of a preconditioner available in a block LDLT form, and updating a preconditioner via a limited-memory technique inspired by quasi-Newton methods.

  9. Updating Stiffness and Hysteretic Damping Matrices Using Measured Modal Data

    Directory of Open Access Journals (Sweden)

    Jiashang Jiang

    2018-01-01

    A new direct method for the finite element (FE) matrix updating problem in a hysteretic (or material) damping model based on measured incomplete vibration modal data is presented. With this method, the optimally approximated stiffness and hysteretic damping matrices can be easily constructed. The physical connectivity of the original model is preserved and the measured modal data are embedded in the updated model. The numerical results show that the proposed method works well.

  10. Novel approach to improve the attitude update rate of a star tracker.

    Science.gov (United States)

    Zhang, Shuo; Xing, Fei; Sun, Ting; You, Zheng; Wei, Minsong

    2018-03-05

    The star tracker is widely used in spacecraft attitude control systems for attitude measurement, and its attitude update rate is important for guaranteeing attitude control performance. In this paper, we propose a novel approach to improve the attitude update rate of a star tracker. The electronic rolling shutter (RS) imaging mode of the complementary metal-oxide semiconductor (CMOS) image sensor in the star tracker is applied to acquire star images in which the star spots are exposed with row-to-row time offsets, thereby reflecting the rotation of the star tracker at different times. An attitude estimation method using a single star spot is developed to realize multiple attitude updates from one star image and thus reach a high update rate. Simulations and experiments are performed to verify the proposed approach. The test results demonstrate that the proposed approach is effective and that the attitude update rate of the star tracker is increased significantly.

  11. UPDATING NATIONAL TOPOGRAPHIC DATA BASE USING CHANGE DETECTION METHODS

    Directory of Open Access Journals (Sweden)

    E. Keinan

    2016-06-01

    The traditional method for updating a topographic database on a national scale is a complex process that requires human resources, time and the development of specialized procedures. In many National Mapping and Cadaster Agencies (NMCAs), the updating cycle takes a few years. Today, reality is dynamic and changes occur every day; therefore, users expect the existing database to portray the current reality. Global mapping projects based on community volunteers, such as OSM, update their databases every day through crowdsourcing. In order to fulfil users' requirements for rapid updating, a new methodology that maps major areas of interest while preserving the associated decoding information should be developed. Until recently, automated processes did not yield satisfactory results, and a typical process included comparing images from different periods. The success rates in identifying objects were low, and most results were accompanied by a high percentage of false alarms. As a result, the automatic process required significant editorial work that made it uneconomical. In recent years, developments in mapping technologies, advances in image processing algorithms and computer vision, together with the development of digital aerial cameras with an NIR band and very high resolution satellites, allow the implementation of a cost-effective automated process. The automatic process is based on high-resolution Digital Surface Model analysis, multispectral (MS) classification, MS segmentation, object analysis and shape-forming algorithms. This article reviews the results of a novel change detection methodology as a first step towards updating the NTDB at the Survey of Israel.

  12. Updating National Topographic Data Base Using Change Detection Methods

    Science.gov (United States)

    Keinan, E.; Felus, Y. A.; Tal, Y.; Zilberstien, O.; Elihai, Y.

    2016-06-01

    The traditional method for updating a topographic database on a national scale is a complex process that requires human resources, time and the development of specialized procedures. In many National Mapping and Cadaster Agencies (NMCAs), the updating cycle takes a few years. Today, reality is dynamic and changes occur every day; therefore, users expect the existing database to portray the current reality. Global mapping projects based on community volunteers, such as OSM, update their databases every day through crowdsourcing. In order to fulfil users' requirements for rapid updating, a new methodology that maps major areas of interest while preserving the associated decoding information should be developed. Until recently, automated processes did not yield satisfactory results, and a typical process included comparing images from different periods. The success rates in identifying objects were low, and most results were accompanied by a high percentage of false alarms. As a result, the automatic process required significant editorial work that made it uneconomical. In recent years, developments in mapping technologies, advances in image processing algorithms and computer vision, together with the development of digital aerial cameras with an NIR band and very high resolution satellites, allow the implementation of a cost-effective automated process. The automatic process is based on high-resolution Digital Surface Model analysis, multispectral (MS) classification, MS segmentation, object analysis and shape-forming algorithms. This article reviews the results of a novel change detection methodology as a first step towards updating the NTDB at the Survey of Israel.

  13. Updated method guidelines for cochrane musculoskeletal group systematic reviews and metaanalyses

    DEFF Research Database (Denmark)

    Ghogomu, Elizabeth A T; Maxwell, Lara J; Buchbinder, Rachelle

    2014-01-01

    The Cochrane Musculoskeletal Group (CMSG), one of 53 groups of the not-for-profit, international Cochrane Collaboration, prepares, maintains, and disseminates systematic reviews of treatments for musculoskeletal diseases. It is important that authors conducting CMSG reviews and the readers of our reviews be aware of and use updated, state-of-the-art systematic review methodology. One hundred sixty reviews have been published. Previous method guidelines for systematic reviews of interventions in the musculoskeletal field, published in 2006, have been substantially updated to incorporate ... using network metaanalysis. Method guidelines specific to musculoskeletal disorders are provided by CMSG editors for various aspects of undertaking a systematic review. These method guidelines will help improve the quality of reporting and ensure high standards of conduct as well as consistency across ...

  14. Lagrangian relaxation technique in power systems operation planning: Multipliers updating problem

    Energy Technology Data Exchange (ETDEWEB)

    Ruzic, S. [Electric Power Utility of Serbia, Belgrade (Yugoslavia)

    1995-11-01

    All Lagrangian relaxation based approaches to power systems operation planning have an important common part: the Lagrangian multiplier correction procedure, which is the subject of this paper. Different approaches presented in the literature are discussed and an original method for updating the Lagrangian multipliers is proposed. The basic idea of this new method is to update the Lagrangian multipliers so as to satisfy the Kuhn-Tucker optimality conditions. Instead of maximizing the dual function, a 'distance of optimality' function is defined and minimized. If the Kuhn-Tucker optimality conditions are satisfied, the value of this function is in the range (-1,0); otherwise the function has a large positive value. This method, called the distance of optimality method, takes into account future changes in planned generation due to the Lagrangian multiplier updates. The influence of changes in a multiplier associated with one system constraint on the satisfaction of other system requirements is also considered. The numerical efficiency of the proposed method is analyzed and compared with results obtained using the sub-gradient technique. 20 refs, 2 tabs
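
    For comparison, the classical sub-gradient multiplier update that the proposed method is benchmarked against can be sketched in a few lines; the step-size rule below is an assumption, and the 'distance of optimality' update itself is specific to the paper and not reproduced here.

      import numpy as np

      def subgradient_update(multipliers, violation, iteration, step0=1.0):
          """Classical sub-gradient step: move multipliers along the constraint violation, keep them >= 0."""
          step = step0 / np.sqrt(iteration + 1)          # diminishing step size (one common choice)
          return np.maximum(0.0, multipliers + step * violation)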

  15. A review of methods for updating forest monitoring system estimates

    Science.gov (United States)

    Hector Franco-Lopez; Alan R. Ek; Andrew P. Robinson

    2000-01-01

    Intensifying interest in forests and the development of new monitoring technologies have induced major changes in forest monitoring systems in the last few years, including major revisions in the methods used for updating. This paper describes the methods available for projecting stand- and plot-level information, emphasizing advantages and disadvantages, and the...

  16. Analysis of Stress Updates in the Material-point Method

    DEFF Research Database (Denmark)

    Andersen, Søren; Andersen, Lars

    2009-01-01

    The material-point method (MPM) is a new numerical method for the analysis of large-strain engineering problems. The MPM applies a dual formulation, where the state of the problem (mass, stress, strain, velocity etc.) is tracked using a finite set of material points while the governing equations are solved on a background computational grid. Several references state that one of the main advantages of the material-point method is the easy application of complicated material behaviour, as the constitutive response is updated individually for each material point. However, as discussed here, the MPM way...

  17. Object Tracking Using Adaptive Covariance Descriptor and Clustering-Based Model Updating for Visual Surveillance

    Directory of Open Access Journals (Sweden)

    Lei Qin

    2014-05-01

    We propose a novel approach for tracking an arbitrary object in video sequences for visual surveillance. The first contribution of this work is an automatic feature extraction method that is able to extract compact discriminative features from a feature pool before computing the region covariance descriptor. As the feature extraction method is adaptive to the specific object of interest, we refer to the region covariance descriptor computed using the extracted features as the adaptive covariance descriptor. The second contribution is a weakly supervised method for updating the object appearance model during tracking. The method performs a mean-shift clustering procedure among the tracking result samples accumulated over a period of time and selects a group of reliable samples for updating the object appearance model. As such, the object appearance model is kept up to date and is prevented from contamination even in the case of tracking mistakes. We conducted comparative experiments on real-world video sequences, which confirmed the effectiveness of the proposed approaches. The tracking system that integrates the adaptive covariance descriptor and the clustering-based model updating method accomplished stable object tracking on challenging video sequences.

  18. Data Updating Methods for Spatial Data Infrastructure that Maintain Infrastructure Quality and Enable its Sustainable Operation

    Science.gov (United States)

    Murakami, S.; Takemoto, T.; Ito, Y.

    2012-07-01

    The Japanese government, local governments and businesses are working closely together to establish spatial data infrastructures in accordance with the Basic Act on the Advancement of Utilizing Geospatial Information (NSDI Act, established in August 2007). Spatial data infrastructures are urgently required not only to accelerate the computerization of public administration, but also to support the restoration and reconstruction of the areas struck by the Great East Japan Earthquake and future disaster prevention and mitigation. Various guidelines have been formulated for constructing a spatial data infrastructure, but once an infrastructure is constructed, maintaining it becomes a problem. In one case, an organization updates its spatial data only once every several years because of budget constraints. Departments and sections update the data on their own without careful coordination, which upsets the quality control of the entire data system; the system then loses the integrity that is crucial to a spatial data infrastructure. To ensure quality, it would ideally be desirable to update the data for the entire area every year, but that is virtually impossible given the recent budget crunch. The method we suggest is to update only the spatial data items of higher importance in order to maintain quality, rather than updating all items across the board. We have explored a method of partially updating the data on two such features, roads and buildings, while ensuring the accuracy of locations. Using this method, data on roads and buildings, which change greatly with time, can be updated almost in real time, or at least within a year. The method will help increase the availability of a spatial data infrastructure. We have conducted an experiment on the spatial data infrastructure of a municipality using these data. As a result, we have found that it is possible to update data for both features almost in real time.

  19. Ontology Update in the Cognitive Model of Ontology Learning

    Directory of Open Access Journals (Sweden)

    Zhang De-Hai

    2016-01-01

    Ontology has been used in many hot-spot fields, but most ontology construction methods are semiautomatic, and the construction of an ontology is still a tedious and painstaking task. In this paper, a cognitive model is presented for ontology learning which can simulate how human beings learn from the world. In this model, cognitive strategies are applied together with constrained axioms. Ontology update is a key step in the process of ontology learning, when new knowledge is added into the existing ontology and conflicts with old knowledge. This proposal designs and validates a method of ontology update based on the axiomatic cognitive model, which includes the ontology update postulates, axioms and operations of the learning model. It is proved that these operators conform to the established axiom system.

  20. Optimal update with multiple out-of-sequence measurements

    Science.gov (United States)

    Zhang, Shuo; Bar-Shalom, Yaakov

    2011-06-01

    In multisensor target tracking systems, receiving out-of-sequence measurements (OOSMs) from local sensors is a common situation. In the last decade many algorithms have been proposed to update a target state with an OOSM optimally or suboptimally. However, what one faces in the real world is multiple OOSMs, which arrive at the fusion center in generally arbitrary orders, e.g., in succession or interleaved with in-sequence measurements. A straightforward approach to this multi-OOSM problem is to apply a given OOSM algorithm sequentially; however, this simple solution does not guarantee an optimal update under the multi-OOSM scenario. The present paper discusses the differences between single-OOSM processing and multi-OOSM processing, and presents a general solution to the multi-OOSM problem, called the complete in-sequence information (CISI) approach. Given an OOSM, in addition to updating the target state at the most recent time, the CISI approach also updates the states between the OOSM time and the most recent time, including the state at the OOSM time. Three novel CISI methods are developed in this paper: the information filter-equivalent measurement (IF-EqM) method, the CISI fixed-point smoothing (CISI-FPS) method and the CISI fixed-interval smoothing (CISI-FIS) method. Numerical examples are given to show the optimality of these CISI methods under various multi-OOSM scenarios.

  1. Finite-element-model updating using computational intelligence techniques applications to structural dynamics

    CERN Document Server

    Marwala, Tshilidzi

    2010-01-01

    Finite element models (FEMs) are widely used to understand the dynamic behaviour of various systems. FEM updating allows FEMs to be tuned to better reflect measured data and may be conducted using two different statistical frameworks: the maximum likelihood approach and Bayesian approaches. Finite Element Model Updating Using Computational Intelligence Techniques applies both strategies to the field of structural mechanics, an area vital for aerospace, civil and mechanical engineering. Vibration data are used for the updating process. Following an introduction, a number of computational intelligence techniques to facilitate the updating process are proposed; they include: • multi-layer perceptron neural networks for real-time FEM updating; • particle swarm and genetic-algorithm-based optimization methods to accommodate the demands of global versus local optimization models; • simulated annealing to put the methodologies on a sound statistical basis; and • response surface methods and expectation m...

  2. Real Time Updating Genetic Network Programming for Adapting to the Change of Stock Prices

    Science.gov (United States)

    Chen, Yan; Mabu, Shingo; Shimada, Kaoru; Hirasawa, Kotaro

    The key in a stock trading model is to take the right trading actions at the right time, primarily based on accurate forecasts of future stock trends. Since effective trading with the given stock price information needs an intelligent decision-making strategy, we applied Genetic Network Programming (GNP) to creating a stock trading model. In this paper, we propose a new method called Real Time Updating Genetic Network Programming (RTU-GNP) for adapting to changes in stock prices. There are three important points in this paper. First, the RTU-GNP method makes stock trading decisions considering both the recommendable information of technical indices and the candlestick charts according to the real-time stock prices. Second, we combine RTU-GNP with a Sarsa learning algorithm to create the programs efficiently; sub-nodes are also introduced in each judgment and processing node to determine appropriate actions (buying/selling) and to select appropriate stock price information depending on the situation. Third, a real-time updating system is introduced for the first time in this paper to account for changes in the trend of stock prices. The experimental results on the Japanese stock market show that the trading model with the proposed RTU-GNP method outperforms other models without real-time updating. We also compared the experimental results of the proposed method with the Buy&Hold method to confirm its effectiveness, and it is clarified that the proposed trading model can obtain much higher profits than the Buy&Hold method.

  3. Update and Improve Subsection NH - Alternative Simplified Creep-Fatigue Design Methods

    International Nuclear Information System (INIS)

    Asayama, Tai

    2009-01-01

    This report describes the results of the investigation of Task 10 of the DOE/ASME Materials NGNP/Generation IV Project, based on a contract between ASME Standards Technology, LLC (ASME ST-LLC) and the Japan Atomic Energy Agency (JAEA). Task 10 is to Update and Improve Subsection NH -- Alternative Simplified Creep-Fatigue Design Methods. Five newly proposed, promising creep-fatigue evaluation methods were investigated: (1) the modified ductility exhaustion method, (2) the strain range separation method, (3) the approach for pressure vessel application, (4) the hybrid method of time fraction and ductility exhaustion, and (5) the simplified model test approach. The outlines of these methods are presented first, and their ability to predict experimental results is demonstrated using the creep-fatigue data collected in previous Tasks 3 and 5. All the methods (except the simplified model test approach, which is not ready for application) predicted the experimental results fairly accurately. On the other hand, the predicted creep-fatigue lives in long-term regions showed considerable differences among the methodologies. These differences come from the concepts each method is based on. All the new methods investigated in this report have advantages over the currently employed time fraction rule and offer technical insights that should be taken into account in improving creep-fatigue evaluation procedures. The main points of the modified ductility exhaustion method, the strain range separation method, the approach for pressure vessel application and the hybrid method can be reflected in improvements to the current time fraction rule. The simplified model test approach would offer wholly new advantages, including robustness and simplicity, which are definitely attractive, but this approach is yet to be validated for implementation at this point. Therefore, this report recommends the following two steps as a course of improvement of NH based on the newly proposed creep-fatigue evaluation

  4. Quantifying Update Effects in Citizen-Oriented Software

    Directory of Open Access Journals (Sweden)

    Ion Ivan

    2009-02-01

    Defining citizen-oriented software. Detailing technical issues regarding the update process in this kind of software. Presenting different effects triggered by types of update. Building a model for update cost estimation, including producer-side and consumer-side effects. Analyzing the model's applicability to INVMAT - large-scale matrix inversion software. Proposing a model for update effect estimation. Specifying ways of softening the effects of inaccurate updates.

  5. Modified methods for growing 3-D skin equivalents: an update.

    Science.gov (United States)

    Lamb, Rebecca; Ambler, Carrie A

    2014-01-01

    Artificial epidermis can be reconstituted in vitro by seeding primary epidermal cells (keratinocytes) onto a supportive substrate and then growing the developing skin equivalent at the air-liquid interface. In vitro skin models are widely used to study skin biology and for industrial drug and cosmetic testing. Here, we describe updated methods for growing 3-dimensional skin equivalents using de-vitalized, de-epidermalized dermis (DED) substrates including methods for DED substrate preparation, cell seeding, growth conditions, and fixation procedures.

  6. Updating Road Networks by Local Renewal from GPS Trajectories

    Directory of Open Access Journals (Sweden)

    Tao Wu

    2016-09-01

    The long production cycle and huge cost of collecting road network data often leave the data lagging behind the latest real-world conditions. However, this situation is changing rapidly as the positioning techniques ubiquitously used in mobile devices are gradually being applied to road network research and applications. Currently, the predominant approaches infer road networks directly from mobile location information (e.g., GPS trajectory data) using various extraction algorithms, which consumes expensive computational resources over large-scale areas. For this reason, we propose an alternative that renews road networks with a novel spiral strategy, including a hidden Markov model (HMM) for detecting potential problems in existing road network data and a method to update the data, on the local scale, by generating new road segments from trajectory data. The proposed approach reduces the computation cost on roads with complete or updated information by updating problem road segments only within the minimum necessary range of the road network. We evaluated the performance of our proposal using GPS traces collected from taxis and OpenStreetMap (OSM) road networks covering urban areas of Wuhan City.

  7. Dynamic model updating based on strain mode shape and natural frequency using hybrid pattern search technique

    Science.gov (United States)

    Guo, Ning; Yang, Zhichun; Wang, Le; Ouyang, Yan; Zhang, Xinping

    2018-05-01

    Aiming at providing a precise dynamic structural finite element (FE) model for dynamic strength evaluation in addition to dynamic analysis, a dynamic FE model updating method is presented to correct the uncertain parameters of the FE model of a structure using strain mode shapes and natural frequencies. The strain mode shape, which is sensitive to local changes in the structure, is used instead of the displacement mode shape to enhance model updating. A coordinate strain modal assurance criterion is developed to evaluate the correlation at each coordinate between the experimental and the analytical strain mode shapes. Moreover, the natural frequencies, which provide global information about the structure, are used to guarantee the accuracy of the modal properties of the global model. The weighted sum of the natural frequency residual and the coordinate strain modal assurance criterion residual is then used as the objective function in the proposed dynamic FE model updating procedure. A hybrid genetic/pattern-search optimization algorithm is adopted to perform the updating. A numerical simulation and a model updating experiment on a clamped-clamped beam are performed to validate the feasibility and effectiveness of the present method. The results show that the proposed method can update the uncertain parameters with good robustness, and that the updated dynamic FE model of the beam, which correctly predicts both the natural frequencies and the local dynamic strains, is reliable for subsequent dynamic analysis and dynamic strength evaluation.
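
    The structure of such an objective function, a weighted combination of a natural-frequency residual and a strain mode shape correlation residual, can be sketched as below. The correlation used here is a plain modal assurance criterion on strain modes, whereas the paper's coordinate-wise criterion and its weighting may differ in detail.

      import numpy as np

      def mac(phi_a, phi_e):
          """Modal assurance criterion between an analytical and an experimental (strain) mode shape."""
          return np.abs(phi_a @ phi_e) ** 2 / ((phi_a @ phi_a) * (phi_e @ phi_e))

      def updating_objective(freq_a, freq_e, strain_modes_a, strain_modes_e, w=0.5):
          """Weighted sum of relative frequency residuals and (1 - MAC) residuals on strain mode shapes."""
          freq_res = np.sum(((freq_a - freq_e) / freq_e) ** 2)
          mode_res = sum(1.0 - mac(pa, pe) for pa, pe in
                         zip(strain_modes_a.T, strain_modes_e.T))   # columns are mode shapes
          return w * freq_res + (1.0 - w) * mode_res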

  8. Proposal for a Five-Step Method to Elicit Expert Judgment

    Directory of Open Access Journals (Sweden)

    Duco Veen

    2017-12-01

    Elicitation is a commonly used tool to extract viable information from experts. The information held by the expert is extracted and a probabilistic representation of this knowledge is constructed. A promising avenue in psychological research is to incorporate experts' prior knowledge in the statistical analysis. Systematic reviews of the elicitation literature, however, suggest that it might be inappropriate to obtain distributional representations from experts directly. The literature qualifies experts' performance in estimating the elements of a distribution as unsatisfactory, so reliably specifying the essential elements of the parameters of interest in one elicitation step seems implausible. Providing feedback within the elicitation process can enhance the quality of the elicitation, and interactive software can be used to facilitate the feedback. Therefore, we propose to decompose the elicitation procedure into smaller steps with adjustable outcomes. We represent the tacit knowledge of experts as a location parameter and their uncertainty concerning this knowledge by a scale and a shape parameter. Using a feedback procedure, experts can accept the representation of their beliefs or adjust their input. We propose a Five-Step Method which consists of (1) eliciting the location parameter using the trial roulette method; (2) providing feedback on the location parameter and asking for confirmation or adjustment; (3) eliciting the scale and shape parameters; (4) providing feedback on the scale and shape parameters and asking for confirmation or adjustment; and (5) using the elicited and calibrated probability distribution in a statistical analysis and updating it with data, or computing a prior-data conflict within a Bayesian framework. User feasibility and internal validity of the Five-Step Method are investigated in three elicitation studies.

  9. Updating and testing of a Finnish method for mixed municipal solid waste composition studies.

    Science.gov (United States)

    Liikanen, M; Sahimaa, O; Hupponen, M; Havukainen, J; Sorvari, J; Horttanainen, M

    2016-06-01

    More efficient recycling of municipal solid waste (MSW) is an essential precondition for turning Europe into a circular economy. Thus, the recycling of MSW must increase significantly in several member states, including Finland. This has increased the interest in the composition of mixed MSW. Due to increased information needs, a method for mixed MSW composition studies was introduced in Finland in order to improve the national comparability of composition study results. The aim of this study was to further develop the method so that it corresponds to the information needed about the composition of mixed MSW and still works in practice. A survey and two mixed MSW composition studies were carried out in the study. According to the responses of the survey, the intensification of recycling, the landfill ban on organic waste and the producer responsibility for packaging waste have particularly influenced the need for information about the composition of mixed MSW. The share of biowaste in mixed MSW interested the respondents most. Additionally, biowaste proved to be the largest waste fraction in mixed MSW in the composition studies. It constituted over 40% of mixed MSW in both composition studies. For these reasons, the classification system of the method was updated by further defining the classifications of biowaste. The classifications of paper as well as paperboard and cardboard were also updated. The updated classification system provides more information on the share of avoidable food waste and waste materials suitable for recycling in mixed MSW. The updated method and the information gained from the composition studies are important in ensuring that the method will be adopted by municipal waste management companies and thus used widely in Finland. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. An update on neurotoxin products and administration methods.

    Science.gov (United States)

    Lanoue, Julien; Dong, Joanna; Do, Timothy; Goldenberg, Gary

    2016-09-01

    Since onabotulinumtoxinA for nonsurgical aesthetic enhancement of glabellar lines was initially reported, the popularity of botulinum neurotoxin (BoNT) products among both clinicians and consumers has rapidly grown, and we have seen several additional BoNT formulations enter the market. As the demand for minimally invasive cosmetic procedures continues to increase, we will see the introduction of additional formulations of BoNT products as well as new delivery devices and administration techniques. In this article, we provide a brief update on current and upcoming BoNT products and also review the literature on novel administration methods based on recently published studies.

  11. Proposal for secondary ion beams and update of data taking schedule for 2009-2013

    CERN Document Server

    Abgrall, N; Andrieu, B; Anticic, T; Antoniou, N; Argyriades, J; Asryan, A G; Baatar, B; Blondel, A; Blumer, J; Boldizsar, L; Bravar, A; Brzychczyk, J; Bunyatov, S A; Choi, K U; Christakoglou, P; Chung, P; Cleymans, J; Derkach, D A; Diakonos, F; Dominik, W; Dumarchez, J; Engel, R; Ereditato, A; Feofilov, G A; Ferrero, A; Fodor, Z; Gazdzicki, M; Golubeva, M; Grebieszkow, K; Guber, F; Hasegawa, T; Haungs, A; Hess, M; Igolkin, S; Ivanov, A S; Ivashkin, A; Kadija, K; Katrynska, N; Kielczewska, D; Kikola, D; Kim, J H; Kobayashi, T; Kolesnikov, V I; Kolev, D; Kolevatov, R S; Kondratiev, V P; Kurepin, A; Lacey, R; Laszlo, A; Lehmann, S; Lungwitz, B; Lyubushkin, V V; Maevskaya, A; Majka, Z; Malakhov, A I; Marchionni, A; Marcinek, A; Maris, I; Matveev, V; Melkumov, G L; Meregaglia, A; Messina, M; Meurer, C; Mijakowski, P; Mitrovski, M; Montaruli, T; Mrówczynski, St; Murphy, S; Nakadaira, T; Naumenko, P A; Nikolic, V; Nishikawa, K; Palczewski, T; Pálla, G; Panagiotou, A D; Peryt, W; Petridis, A; Planeta, R; Pluta, J; Popov, B A; Posiadala, M; Przewlocki, P; Rauch, W; Ravonel, M; Renfordt, R; Röhrich, D; Rondio, E; Rossi, B; Roth, M; Rubbia, A; Rybczynski, M; Sadovskii, A; Sakashita, K; Schuster, T; Sekiguchi, T; Seyboth, P; Shileev, K; Sissakian, A N; Skrzypczak, E; Slodkowski, M; Sorin, A S; Staszel, P; Stefanek, G; Stepaniak, J; Strabel, C; Ströbele, H; Susa, T; Szentpétery, I; Szuba, M; Taranenko, A; Tsenov, R; Ulrich, R; Unger, M; Vassiliou, M; Vechernin, V V; Vesztergombi, G; Wlodarczyk, Z; Wojtaszek, A; Yi, J G; Yoo, I K; CERN. Geneva. SPS and PS Experiments Committee; SPSC

    2009-01-01

    This document presents the proposal for secondary ion beams and the updated data taking schedule of the NA61 Collaboration. The modification of the original NA61 plans is necessary in order to reach compatibility between the current I-LHC and NA61 schedules. It assumes delivery of primary proton beam in 2009-2012 and of primary lead beam in 2011-2013. The primary lead beam will be fragmented into a secondary beam of lighter ions. The modified H2 beam line will serve as a fragment separator to produce the light ion species for NA61 data taking. The expected physics performance of the NA61 experiment with secondary ion beams will be sufficient to reach the primary NA61 physics goals.

  12. Notification: FY 2017 Update of Proposed Key Management Challenges and Internal Control Weaknesses Confronting the U.S. Chemical Safety and Hazard Investigation Board

    Science.gov (United States)

    Jan 5, 2017. The EPA OIG is beginning work to update for fiscal year 2017 its list of proposed key management challenges and internal control weaknesses confronting the U.S. Chemical Safety and Hazard Investigation Board (CSB).

  13. Real-time numerical shake prediction and updating for earthquake early warning

    Science.gov (United States)

    Wang, Tianyun; Jin, Xing; Wei, Yongxiang; Huang, Yandan

    2017-12-01

    Ground motion prediction is important for earthquake early warning systems, because a region's peak ground motion indicates the potential disaster. In order to predict the peak ground motion quickly and precisely with limited station wave records, we propose a real-time numerical shake prediction and updating method. Our method first predicts the ground motion based on a ground motion prediction equation after P-wave detection at several stations, denoted as the initial prediction. In order to correct the error of the initial prediction, an updating scheme based on real-time simulation of wave propagation is designed. A data assimilation technique is incorporated to predict the distribution of seismic wave energy precisely. Radiative transfer theory and Monte Carlo simulation are used to model wave propagation in 2-D space, and the peak ground motion is calculated as quickly as possible. Our method has the potential to predict the shakemap, so that the potential disaster can be anticipated before the strong shaking arrives. The 2008 MS 8.0 Wenchuan earthquake is studied as an example to show the validity of the proposed method.

  14. How update schemes influence crowd simulations

    International Nuclear Information System (INIS)

    Seitz, Michael J; Köster, Gerta

    2014-01-01

    Time discretization is a key modeling aspect of dynamic computer simulations. In current pedestrian motion models based on discrete events, e.g. cellular automata and the Optimal Steps Model, fixed-order sequential updates and shuffle updates are prevalent. We propose to use event-driven updates that process events in the order they occur, and thus better match natural movement. In addition, we present a parallel update with collision detection and resolution for situations where computational speed is crucial. Two simulation studies serve to demonstrate the practical impact of the choice of update scheme. Not only do density-speed relations differ, but there is a statistically significant effect on evacuation times. Fixed-order sequential and random shuffle updates with a short update period come close to event-driven updates. The parallel update scheme overestimates evacuation times. All schemes can be employed for arbitrary simulation models with discrete events, such as car traffic or animal behavior. (paper)
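
    The contrast between update schemes can be made concrete with a small event queue: instead of sweeping over all pedestrians in a fixed or shuffled order, events are processed in the order they occur. The agent interface below (next_event_time, step) is a hypothetical stand-in for a pedestrian model such as the Optimal Steps Model, not an implementation from the paper.

```python
import heapq

def simulate_event_driven(agents, t_end):
    """Event-driven update: each agent is processed at the time of its next step,
    in the order events occur, rather than in a fixed or shuffled sweep."""
    events = [(agent.next_event_time(0.0), i) for i, agent in enumerate(agents)]
    heapq.heapify(events)
    while events:
        t, i = heapq.heappop(events)       # earliest pending event
        if t > t_end:
            break
        agents[i].step(t)                  # move one agent; resolve conflicts locally
        heapq.heappush(events, (agents[i].next_event_time(t), i))
```

    A fixed-order sequential or shuffle update would instead iterate over the agent list once per time step, which is exactly the difference whose effect on density-speed relations and evacuation times the study quantifies.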

  15. Large-strain optical fiber sensing and real-time FEM updating of steel structures under the high temperature effect

    International Nuclear Information System (INIS)

    Huang, Ying; Fang, Xia; Xiao, Hai; Bevans, Wesley James; Chen, Genda; Zhou, Zhi

    2013-01-01

    Steel buildings are subjected to fire hazards during or immediately after a major earthquake. Under combined gravity and thermal loads, they have non-uniformly distributed stiffness and strength, and thus collapse progressively with large deformation. In this study, large-strain optical fiber sensors for high temperature applications and a temperature-dependent finite element model updating method are proposed for accurate prediction of structural behavior in real time. The optical fiber sensors can measure strains up to 10% at approximately 700 °C. Their measurements are in good agreement with those from strain gauges up to 0.5%. In comparison with the experimental results, the proposed model updating method can reduce the predicted strain errors from over 75% to below 20% at 800 °C. The minimum number of sensors in a fire zone that can properly characterize the vertical temperature distribution of heated air due to the gravity effect should be included in the proposed model updating scheme to achieve a predetermined simulation accuracy. (paper)

  16. Using Multi-Viewpoint Contracts for Negotiation of Embedded Software Updates

    Directory of Open Access Journals (Sweden)

    Sönke Holthusen

    2016-05-01

    Full Text Available In this paper we address the issue of change after deployment in safety-critical embedded system applications. Our goal is to substitute lab-based verification with in-field formal analysis to determine whether an update may be safely applied. This is challenging because it requires an automated process able to handle multiple viewpoints such as functional correctness, timing, etc. For this purpose, we propose an original methodology for contract-based negotiation of software updates. The use of contracts allows us to cleanly split the verification effort between the lab and the field. In addition, we show how to rely on existing viewpoint-specific methods for update negotiation. We illustrate our approach on a concrete example inspired by the automotive domain.

  17. Photovoltaic Shading Testbed for Module-Level Power Electronics: 2016 Performance Data Update

    Energy Technology Data Exchange (ETDEWEB)

    Deline, Chris [National Renewable Energy Lab. (NREL), Golden, CO (United States); Meydbray, Jenya [PV Evolution Labs (PVEL), Davis, CA (United States); Donovan, Matt [PV Evolution Labs (PVEL), Davis, CA (United States)

    2016-09-01

    The 2012 NREL report 'Photovoltaic Shading Testbed for Module-Level Power Electronics' provides a standard methodology for estimating the performance benefit of distributed power electronics under partial shading conditions. Since the release of the report, experiments have been conducted for a number of products and for different system configurations. Drawing from these experiences, updates to the test and analysis methods are recommended. Proposed changes in data processing have the benefit of reducing the sensitivity to measurement errors and weather variability, as well as bringing the updated performance score in line with measured and simulated values of the shade recovery benefit of distributed PV power electronics. Also, due to the emergence of new technologies including sub-module embedded power electronics, the shading method has been extended to include power electronics that operate at a finer granularity than the module level. An update to the method is proposed to account for these emerging technologies that respond to shading differently than module-level devices. The partial shading test remains a repeatable test procedure that attempts to simulate shading situations as would be experienced by typical residential or commercial rooftop photovoltaic (PV) systems. Performance data for multiple products tested using this method are discussed, based on equipment from Enphase, Solar Edge, Maxim Integrated and SMA. In general, the annual recovery of shading losses from the module-level electronics evaluated is 25-35%, with the major difference between different trials being related to the number of parallel strings in the test installation rather than differences between the equipment tested. Appendix D data has been added in this update.

  18. Updating Recursive XML Views of Relations

    DEFF Research Database (Denmark)

    Choi, Byron; Cong, Gao; Fan, Wenfei

    2009-01-01

    This paper investigates the view update problem for XML views published from relational data. We consider XML views defined in terms of mappings directed by possibly recursive DTDs compressed into DAGs and stored in relations. We provide new techniques to efficiently support XML view updates specified in terms of XPath expressions with recursion and complex filters. The interaction between XPath recursion and DAG compression of XML views makes the analysis of the XML view update problem rather intriguing. Furthermore, many issues are still open even for relational view updates, and need to be explored. In response to these, on the XML side, we revise the notion of side effects and update semantics based on the semantics of XML views, and present efficient algorithms to translate XML updates to relational view updates. On the relational side, we propose a mild condition on SPJ views, and show …

  19. A Bottom-Up Geospatial Data Update Mechanism for Spatial Data Infrastructure Updating

    Science.gov (United States)

    Tian, W.; Zhu, X.; Liu, Y.

    2012-08-01

    Currently, the top-down spatial data update mechanism has made great progress and is widely applied in many SDIs (spatial data infrastructures). However, this mechanism still has some issues. For example, the update schedule is tied to the professional department's project cycle, which is usually too long for end-users; the path from data collection to publication costs the professional department too much time and effort; and the geospatial details provided do not offer sufficient attributes. Thus, finding an effective way to deal with these problems has become important. Emerging Internet technology, 3S techniques and the geographic knowledge now widespread among the public are promoting the rapid development of volunteered geospatial information (VGI). VGI is a current "hotspot" that attracts many researchers to study its data quality, credibility, accuracy, sustainability, social benefit, applications and so on. In addition, a few scholars have also paid attention to the value of VGI in supporting SDI updating. On that basis, this paper presents a bottom-up update mechanism from VGI to SDI, which includes the processes of matching homonymous elements between VGI and SDI vector data, change detection, SDI spatial database updating, and publication of the new data product to end-users. The proposed updating cycle is then discussed in depth with respect to its feasibility: it can detect changed elements in time and shorten the update period, provide more accurate geometry and attribute data for the spatial data infrastructure, and support update propagation.

  20. Online updating of context-aware landmark detectors for prostate localization in daily treatment CT images

    Energy Technology Data Exchange (ETDEWEB)

    Dai, Xiubin [College of Geographic and Biologic Information, Nanjing University of Posts and Telecommunications, Nanjing, Jiangsu 210015, China and IDEA Lab, Department of Radiology and BRIC, University of North Carolina at Chapel Hill, 130 Mason Farm Road, Chapel Hill, North Carolina 27510 (United States); Gao, Yaozong [IDEA Lab, Department of Radiology and BRIC, University of North Carolina at Chapel Hill, 130 Mason Farm Road, Chapel Hill, North Carolina 27510 (United States); Shen, Dinggang, E-mail: dgshen@med.unc.edu [IDEA Lab, Department of Radiology and BRIC, University of North Carolina at Chapel Hill, 130 Mason Farm Road, Chapel Hill, North Carolina 27510 and Department of Brain and Cognitive Engineering, Korea University, Seoul (Korea, Republic of)

    2015-05-15

    Purpose: In image guided radiation therapy, it is crucial to localize the prostate quickly and accurately in the daily treatment images. To this end, the authors propose an online update scheme for landmark-guided prostate segmentation, which can fully exploit valuable patient-specific information contained in the previous treatment images and can achieve improved performance in landmark detection and prostate segmentation. Methods: To localize the prostate in the daily treatment images, the authors first automatically detect six anatomical landmarks on the prostate boundary by adopting a context-aware landmark detection method. Specifically, in this method, a two-layer regression forest is trained as a detector for each target landmark. Once all the newly detected landmarks from new treatment images are reviewed or adjusted (if necessary) by clinicians, they are further included into the training pool as new patient-specific information to update all the two-layer regression forests for the next treatment day. As more and more treatment images of the current patient are acquired, the two-layer regression forests can be continually updated by incorporating the patient-specific information into the training procedure. After all target landmarks are detected, a multiatlas random sample consensus (multiatlas RANSAC) method is used to segment the entire prostate by fusing multiple previously segmented prostates of the current patient after they are aligned to the current treatment image. Subsequently, the segmented prostate of the current treatment image is again reviewed (or even adjusted if needed) by clinicians before including it as a new shape example into the prostate shape dataset for helping localize the entire prostate in the next treatment image. Results: The experimental results on 330 images of 24 patients show the effectiveness of the authors’ proposed online update scheme in improving the accuracies of both landmark detection and prostate segmentation.

  1. Online updating of context-aware landmark detectors for prostate localization in daily treatment CT images

    International Nuclear Information System (INIS)

    Dai, Xiubin; Gao, Yaozong; Shen, Dinggang

    2015-01-01

    Purpose: In image guided radiation therapy, it is crucial to localize the prostate quickly and accurately in the daily treatment images. To this end, the authors propose an online update scheme for landmark-guided prostate segmentation, which can fully exploit valuable patient-specific information contained in the previous treatment images and can achieve improved performance in landmark detection and prostate segmentation. Methods: To localize the prostate in the daily treatment images, the authors first automatically detect six anatomical landmarks on the prostate boundary by adopting a context-aware landmark detection method. Specifically, in this method, a two-layer regression forest is trained as a detector for each target landmark. Once all the newly detected landmarks from new treatment images are reviewed or adjusted (if necessary) by clinicians, they are further included into the training pool as new patient-specific information to update all the two-layer regression forests for the next treatment day. As more and more treatment images of the current patient are acquired, the two-layer regression forests can be continually updated by incorporating the patient-specific information into the training procedure. After all target landmarks are detected, a multiatlas random sample consensus (multiatlas RANSAC) method is used to segment the entire prostate by fusing multiple previously segmented prostates of the current patient after they are aligned to the current treatment image. Subsequently, the segmented prostate of the current treatment image is again reviewed (or even adjusted if needed) by clinicians before including it as a new shape example into the prostate shape dataset for helping localize the entire prostate in the next treatment image. Results: The experimental results on 330 images of 24 patients show the effectiveness of the authors’ proposed online update scheme in improving the accuracies of both landmark detection and prostate segmentation.

  2. A Proposal for Updated Standards of Photographic Documentation in Aesthetic Medicine.

    Science.gov (United States)

    Prantl, Lukas; Brandl, Dirk; Ceballos, Patricia

    2017-08-01

    In 1998, DiBernardo et al. published a very helpful standardization of comparative (before and after) photographic documentation. These standards prevail to this day. Although most of them are useful for objective documentation of aesthetic results, there are at least 3 reasons why an update is necessary at this time: First, DiBernardo et al. focused on the prevalent standards of medical photography at that time. From a modern perspective, these standards are antiquated and not always correct. Second, silver-based analog photography has mutated into digital photography. Digitalization offers virtually unlimited potential for image manipulation using a vast array of digital Apps and tools including, but not limited to, image editing software like Photoshop. Digitalization has given rise to new questions, particularly regarding appropriate use of editing techniques to maximize or increase objectivity. Third, we suggest changes to a very small number of their medical standards in the interest of obtaining a better or more objective documentation of aesthetic results. This article is structured into 3 sections and is intended as a new proposal for photographic and medical standards for the documentation of aesthetic interventions: 1. The photographic standards. 2. The medical standards. 3. Description of editing tools which should be used to increase objectivity.

  3. Empirical testing of forecast update procedure for seasonal products

    DEFF Research Database (Denmark)

    Wong, Chee Yew; Johansen, John

    2008-01-01

    Updating of forecasts is essential for successful collaborative forecasting, especially for seasonal products. This paper discusses the results of a theoretical simulation and an empirical test of a proposed time-series forecast updating procedure. It involves a two-stage longitudinal case study of a toy supply chain. The theoretical simulation involves historical weekly consumer demand data for 122 toy products. The empirical test is then carried out in real time with 291 toy products. The results show that the proposed forecast updating procedure: 1) reduced forecast errors of the annual … provided less forecast accuracy improvement and it needed a longer time to achieve relatively acceptable forecast uncertainty.
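
    The sketch below is not the authors' procedure but a generic illustration of the idea of revising a seasonal forecast as weekly demand arrives, here with simple exponential smoothing; the smoothing constant and demand figures are invented for the example.

```python
def update_forecast(prev_level, observed, alpha=0.3):
    """One exponential-smoothing update step: blend the previous forecast level
    with the newly observed weekly demand."""
    return alpha * observed + (1 - alpha) * prev_level

# Usage: revise the weekly demand estimate as the selling season unfolds.
level = 120.0                      # pre-season forecast of weekly demand
for demand in [95, 110, 150, 170]: # observed weekly sales (illustrative numbers)
    level = update_forecast(level, demand)
    print(f"updated weekly demand estimate: {level:.1f}")
```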

  4. SWCD: a sliding window and self-regulated learning-based background updating method for change detection in videos

    Science.gov (United States)

    Işık, Şahin; Özkan, Kemal; Günal, Serkan; Gerek, Ömer Nezih

    2018-03-01

    Change detection with a background subtraction process remains an unresolved issue and attracts research interest due to the challenges encountered in static and dynamic scenes. The key challenge is how to update dynamically changing backgrounds from frames with an adaptive and self-regulated feedback mechanism. In order to achieve this, we present an effective change detection algorithm for pixelwise changes. A sliding window approach combined with dynamic control of the update parameters is introduced for updating background frames, which we call sliding window-based change detection. Comprehensive experiments on related test videos show that the integrated algorithm yields good objective and subjective performance by overcoming illumination variations, camera jitter, and intermittent object motion. It is argued that the proposed method is a fair alternative in most types of foreground extraction scenarios, unlike case-specific methods, which normally fail in scenarios they were not designed for.
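
    A stripped-down sketch of a sliding-window background update is given below. The median background model, the fixed threshold tau and the window length are simplifications assumed for this sketch; SWCD itself regulates its update parameters dynamically.

```python
import numpy as np

def sliding_window_step(window, new_frame, tau=30, max_window=50):
    """One background-update step with a sliding window of recent frames:
    keep the last `max_window` frames, model the background as their median,
    and flag pixels whose deviation exceeds `tau` as changed.

    window    : list of previous grayscale frames (2-D uint8 arrays)
    new_frame : current grayscale frame
    """
    window.append(new_frame)
    if len(window) > max_window:
        window.pop(0)                              # slide the window forward
    background = np.median(np.stack(window), axis=0)
    foreground = np.abs(new_frame.astype(float) - background) > tau
    return foreground, background
```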

  5. Bayesian updating of reliability of civil infrastructure facilities based on condition-state data and fault-tree model

    International Nuclear Information System (INIS)

    Ching Jianye; Leu, S.-S.

    2009-01-01

    This paper considers a difficult but practical circumstance of civil infrastructure management: deterioration/failure data of the infrastructure system are absent, while only condition-state data of its components are available. The goal is to develop a framework for estimating the time-varying reliabilities of civil infrastructure facilities under such a circumstance. A novel method of analyzing time-varying condition-state data that only report the operational/non-operational status of the components is proposed to update the reliabilities of civil infrastructure facilities. The proposed method assumes that the degradation arrivals can be modeled as a Poisson process with unknown time-varying arrival rate and damage impact, and that the target system can be represented as a fault-tree model. To accommodate large uncertainties, a Bayesian algorithm is proposed, and the reliability of the infrastructure system can be quickly updated based on the condition-state data. Use of the new method is demonstrated with a real-world example of a hydraulic spillway gate system.
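
    The conjugate building block behind such an update can be sketched as follows. The Gamma prior, the constant arrival rate and the numbers are illustrative assumptions only; the paper works with a time-varying rate, damage impacts and a fault-tree system model.

```python
import numpy as np

def update_arrival_rate(alpha, beta, n_degradations, exposure_time):
    """Conjugate Gamma-Poisson update of a degradation arrival rate.

    Prior: rate ~ Gamma(alpha, beta). Observing `n_degradations` component
    transitions to a non-operational state over `exposure_time` years gives
    the posterior Gamma(alpha + n, beta + T).
    """
    return alpha + n_degradations, beta + exposure_time

# Illustration: vague prior, then 3 observed component outages over 5 years.
a, b = update_arrival_rate(alpha=1.0, beta=1.0, n_degradations=3, exposure_time=5.0)
print("posterior mean arrival rate per year:", a / b)
```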

  6. Estimation of beam material random field properties via sensitivity-based model updating using experimental frequency response functions

    Science.gov (United States)

    Machado, M. R.; Adhikari, S.; Dos Santos, J. M. C.; Arruda, J. R. F.

    2018-03-01

    Structural parameter estimation is affected not only by measurement noise but also by unknown uncertainties which are present in the system. Deterministic structural model updating methods minimise the difference between experimentally measured data and computational prediction. Sensitivity-based methods are very efficient in solving structural model updating problems. Material and geometrical parameters of the structure such as Poisson's ratio, Young's modulus, mass density, modal damping, etc. are usually considered deterministic and homogeneous. In this paper, the distributed and non-homogeneous characteristics of these parameters are considered in the model updating. The parameters are taken as spatially correlated random fields and are expanded in a spectral Karhunen-Loève (KL) decomposition. Using the KL expansion, the spectral dynamic stiffness matrix of the beam is expanded as a series in terms of discretized parameters, which can be estimated using sensitivity-based model updating techniques. Numerical and experimental tests involving a beam with distributed bending rigidity and mass density are used to verify the proposed method. This extension of standard model updating procedures can enhance the dynamic description of structural dynamic models.
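
    A minimal sketch of the discretized Karhunen-Loève expansion used to parameterize such a random field is shown below, assuming an exponential covariance on a normalized 1-D beam axis; the KL coordinates xi are the quantities that a sensitivity-based updating scheme would estimate. This is the standard discrete KL decomposition, not the paper's spectral-element formulation.

```python
import numpy as np

def kl_expansion(x, corr_len, sigma, n_terms, xi):
    """Discrete Karhunen-Loeve expansion of a 1-D Gaussian random field with
    exponential covariance, evaluated at the points `x`.

    xi : vector of independent standard-normal variables (the KL coordinates),
         which play the role of the parameters to be estimated in model updating.
    """
    C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
    vals, vecs = np.linalg.eigh(C)
    idx = np.argsort(vals)[::-1][:n_terms]          # keep the dominant modes
    return vecs[:, idx] @ (np.sqrt(vals[idx]) * xi)

x = np.linspace(0.0, 1.0, 200)                      # beam axis (normalized)
field = kl_expansion(x, corr_len=0.3, sigma=0.1, n_terms=5, xi=np.random.randn(5))
```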

  7. Simultaneous determination of some antiprotozoal drugs in different combined dosage forms by mean centering of ratio spectra and multivariate calibration with model updating methods

    Directory of Open Access Journals (Sweden)

    Abdelaleem Eglal A

    2012-04-01

    Full Text Available Abstract Background Metronidazole (MET) and Diloxanide Furoate (DF) act as antiprotozoal drugs in their ternary mixtures with Mebeverine HCl (MEH), an effective antispasmodic drug. This work concerns the development and validation of two simple, specific and cost-effective methods, mainly for the simultaneous determination of the proposed ternary mixture. In addition, the developed multivariate calibration model has been updated to determine Metronidazole benzoate (METB) in its binary mixture with DF in Dimetrol® suspension. Results Method (I) is the mean centering of ratio spectra spectrophotometric method (MCR), which depends on using the mean centered ratio spectra in two successive steps; this eliminates the derivative steps and therefore enhances the signal-to-noise ratio. The developed MCR method has been successfully applied for the determination of MET, DF and MEH in different laboratory-prepared mixtures and in tablets. Method (II) is the partial least squares (PLS) multivariate calibration method, which has been optimized for the determination of MET, DF and MEH in Dimetrol® tablets; by updating the developed model, it has been successfully used for the prediction of binary mixtures of DF and Metronidazole benzoate ester (METB) in Dimetrol® suspension with good accuracy and precision, without reconstruction of the calibration set. Conclusion The developed methods have been validated; accuracy, precision and specificity were found to be within the acceptable limits. Moreover, results obtained by the suggested methods showed no significant difference when compared with those obtained by reported methods.

  8. Single-Shell Tank (SST) Retrieval Sequence Fiscal Year 2000 Update

    International Nuclear Information System (INIS)

    GARFIELD, J.S.

    2000-01-01

    This document describes the baseline single-shell tank (SST) waste retrieval sequence for the River Protection Project (RPP) updated for Fiscal Year 2000. The SST retrieval sequence identifies the proposed retrieval order (sequence), the tank selection and prioritization rationale, and planned retrieval dates for Hanford SSTs. In addition, the tank selection criteria and reference retrieval method for this sequence are discussed

  9. A Progressive Buffering Method for Road Map Update Using OpenStreetMap Data

    Directory of Open Access Journals (Sweden)

    Changyong Liu

    2015-07-01

    Full Text Available Web 2.0 enables a two-way interaction between servers and clients. GPS receivers become available to more citizens and are commonly found in vehicles and smart phones, enabling individuals to record and share their trajectory data on the Internet and edit them online. OpenStreetMap (OSM makes it possible for citizens to contribute to the acquisition of geographic information. This paper studies the use of OSM data to find newly mapped or built roads that do not exist in a reference road map and create its updated version. For this purpose, we propose a progressive buffering method for determining an optimal buffer radius to detect the new roads in the OSM data. In the next step, the detected new roads are merged into the reference road maps geometrically, topologically, and semantically. Experiments with OSM data and reference road maps over an area of 8494 km2 in the city of Wuhan, China and five of its 5 km × 5 km areas are conducted to demonstrate the feasibility and effectiveness of the method. It is shown that the OSM data can add 11.96% or a total of 2008.6 km of new roads to the reference road maps with an average precision of 96.49% and an average recall of 97.63%.
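
    A simplified sketch of the progressive buffering idea, using shapely, is given below: the reference roads are buffered at increasing radii and the OSM geometry left outside the buffer is taken as candidate new roads. The radii and toy geometries are illustrative assumptions; the paper derives an optimal radius from the data before merging detected roads geometrically, topologically and semantically.

```python
from shapely.geometry import LineString
from shapely.ops import unary_union

def candidate_new_roads(osm_roads, ref_roads, radii=(5, 10, 20, 40)):
    """Progressively widen a buffer (in map units) around the reference road map
    and keep the OSM segments that still fall outside it; the radius at which the
    detected length stabilizes would be taken as the working buffer."""
    ref_union = unary_union(ref_roads)
    results = {}
    for r in radii:
        buffered = ref_union.buffer(r)
        results[r] = [road.difference(buffered) for road in osm_roads
                      if not buffered.contains(road)]
    return results

# Toy illustration with two OSM lines and one reference road.
osm = [LineString([(0, 0), (100, 0)]), LineString([(0, 50), (100, 50)])]
ref = [LineString([(0, 2), (100, 2)])]
for r, segments in candidate_new_roads(osm, ref).items():
    print(r, sum(segment.length for segment in segments))
```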

  10. Low-rank Quasi-Newton updates for Robust Jacobian lagging in Newton methods

    International Nuclear Information System (INIS)

    Brown, J.; Brune, P.

    2013-01-01

    Newton-Krylov methods are standard tools for solving nonlinear problems. A common approach is to 'lag' the Jacobian when assembly or preconditioner setup is computationally expensive, in exchange for some degradation in the convergence rate and robustness. We show that this degradation may be partially mitigated by using the lagged Jacobian as an initial operator in a quasi-Newton method, which applies unassembled low-rank updates to the Jacobian until the next full reassembly. We demonstrate the effectiveness of this technique on problems in glaciology and elasticity. (authors)
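
    The idea can be illustrated with a dense toy version: the Jacobian is reassembled only every few Newton steps and, in between, Broyden rank-one corrections are applied to the lagged operator. The paper applies such updates in unassembled, matrix-free form inside Newton-Krylov solvers; the small system and the reassembly interval below are only for demonstration.

```python
import numpy as np

def newton_lagged_broyden(F, J, x0, reassemble_every=5, tol=1e-10, max_it=50):
    """Newton iteration with a lagged Jacobian that is refreshed by Broyden
    rank-one updates between full (expensive) reassemblies."""
    x = x0.copy()
    B = J(x)                                     # expensive assembly
    for k in range(max_it):
        f = F(x)
        if np.linalg.norm(f) < tol:
            break
        dx = np.linalg.solve(B, -f)
        x_new = x + dx
        if (k + 1) % reassemble_every == 0:
            B = J(x_new)                         # full reassembly
        else:                                    # low-rank (Broyden) correction
            df = F(x_new) - f
            B = B + np.outer(df - B @ dx, dx) / (dx @ dx)
        x = x_new
    return x

# Example: a small nonlinear system F(x) = 0 with solution (1, 2).
F = lambda x: np.array([x[0]**2 + x[1] - 3.0, x[0] + x[1]**2 - 5.0])
J = lambda x: np.array([[2*x[0], 1.0], [1.0, 2*x[1]]])
print(newton_lagged_broyden(F, J, np.array([1.0, 1.0])))
```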

  11. Updating of a dynamic finite element model from the Hualien scale model reactor building

    International Nuclear Information System (INIS)

    Billet, L.; Moine, P.; Lebailly, P.

    1996-08-01

    The forces occurring at the soil-structure interface of a building generally have a large influence on the way the building reacts to an earthquake. One can be tempted to characterise these forces more accurately by updating a model of the structure. However, this procedure requires an updating method suitable for dissipative models, since significant damping can be observed at the soil-structure interface of buildings. Such a method is presented here. It is based on the minimization of a mechanical energy built from the difference between eigendata calculated by the model and eigendata obtained from experimental tests on the real structure. An experimental validation of this method is then proposed on a model of the HUALIEN scale-model reactor building. This scale model, built on the HUALIEN site in TAIWAN, is devoted to the study of soil-structure interaction. The updating concerned the soil impedances, modelled by a layer of springs and viscous dampers attached to the building foundation. A good agreement was found between the eigenmodes and dynamic responses calculated by the updated model and the corresponding experimental data. (authors). 12 refs., 3 figs., 4 tabs

  12. A State Space Model for Spatial Updating of Remembered Visual Targets during Eye Movements.

    Science.gov (United States)

    Mohsenzadeh, Yalda; Dash, Suryadeep; Crawford, J Douglas

    2016-01-01

    In the oculomotor system, spatial updating is the ability to aim a saccade toward a remembered visual target position despite intervening eye movements. Although this has been the subject of extensive experimental investigation, there is still no unifying theoretical framework to explain the neural mechanism for this phenomenon, and how it influences visual signals in the brain. Here, we propose a unified state-space model (SSM) to account for the dynamics of spatial updating during two types of eye movement; saccades and smooth pursuit. Our proposed model is a non-linear SSM and implemented through a recurrent radial-basis-function neural network in a dual Extended Kalman filter (EKF) structure. The model parameters and internal states (remembered target position) are estimated sequentially using the EKF method. The proposed model replicates two fundamental experimental observations: continuous gaze-centered updating of visual memory-related activity during smooth pursuit, and predictive remapping of visual memory activity before and during saccades. Moreover, our model makes the new prediction that, when uncertainty of input signals is incorporated in the model, neural population activity and receptive fields expand just before and during saccades. These results suggest that visual remapping and motor updating are part of a common visuomotor mechanism, and that subjective perceptual constancy arises in part from training the visual system on motor tasks.
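
    The generic filter update underlying such a model can be sketched as a single predict/correct cycle in eye-centred coordinates. The linear shift by the eye displacement, the identity observation model and the noise levels below are illustrative simplifications assumed for this sketch; the paper's model is a nonlinear SSM with an RBF network in a dual-EKF structure.

```python
import numpy as np

def spatial_update_step(x, P, eye_displacement, z, Q, R):
    """One predict/correct cycle of a Kalman-style filter for a remembered
    target position in eye-centred coordinates.

    Predict: the remembered location shifts opposite to the eye displacement.
    Correct: a (possibly noisy) visual re-observation z refines the estimate.
    """
    # Prediction (linear here; a nonlinear model would linearize via its Jacobian)
    x_pred = x - eye_displacement
    P_pred = P + Q
    if z is None:                                # no new visual input this interval
        return x_pred, P_pred
    # Correction with identity observation model (H = I)
    K = P_pred @ np.linalg.inv(P_pred + R)       # Kalman gain
    x_new = x_pred + K @ (z - x_pred)
    P_new = (np.eye(len(x)) - K) @ P_pred
    return x_new, P_new

# Illustration: 10 deg rightward saccade, then a noisy re-observation of the target.
x, P = np.array([5.0, 0.0]), np.eye(2)
x, P = spatial_update_step(x, P, np.array([10.0, 0.0]), np.array([-4.6, 0.3]),
                           Q=np.eye(2) * 0.5, R=np.eye(2) * 2.0)
```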

  13. Updating the 2001 National Land Cover Database land cover classification to 2006 by using Landsat imagery change detection methods

    Science.gov (United States)

    Xian, George; Homer, Collin G.; Fry, Joyce

    2009-01-01

    The recent release of the U.S. Geological Survey (USGS) National Land Cover Database (NLCD) 2001, which represents the nation's land cover status based on a nominal date of 2001, is widely used as a baseline for national land cover conditions. To enable the updating of this land cover information in a consistent and continuous manner, a prototype method was developed to update land cover by an individual Landsat path and row. This method updates NLCD 2001 to a nominal date of 2006 by using both Landsat imagery and data from NLCD 2001 as the baseline. Pairs of Landsat scenes in the same season in 2001 and 2006 were acquired according to satellite paths and rows and normalized to allow calculation of change vectors between the two dates. Conservative thresholds based on Anderson Level I land cover classes were used to segregate the change vectors and determine areas of change and no-change. Once change areas had been identified, land cover classifications at the full NLCD resolution for 2006 areas of change were completed by sampling from NLCD 2001 in unchanged areas. Methods were developed and tested across five Landsat path/row study sites that contain several metropolitan areas including Seattle, Washington; San Diego, California; Sioux Falls, South Dakota; Jackson, Mississippi; and Manchester, New Hampshire. Results from the five study areas show that the vast majority of land cover change was captured and updated with overall land cover classification accuracies of 78.32%, 87.5%, 88.57%, 78.36%, and 83.33% for these areas. The method optimizes mapping efficiency and has the potential to provide users a flexible method to generate updated land cover at national and regional scales by using NLCD 2001 as the baseline.
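
    The core change-vector step can be sketched as below; a single global threshold is used here for brevity, whereas the prototype applies conservative, Anderson Level I class-specific thresholds per Landsat path/row.

```python
import numpy as np

def change_mask(bands_2001, bands_2006, threshold):
    """Per-pixel change-vector magnitude between two radiometrically normalized
    Landsat dates.

    bands_* : arrays of shape (n_bands, rows, cols).
    Pixels whose spectral change magnitude exceeds `threshold` are flagged as
    changed; land cover labels elsewhere are carried over from the 2001 baseline.
    """
    diff = bands_2006.astype(float) - bands_2001.astype(float)
    magnitude = np.sqrt(np.sum(diff ** 2, axis=0))
    return magnitude > threshold
```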

  14. Intraoperative magnetic resonance imaging to update interactive navigation in neurosurgery: method and preliminary experience.

    Science.gov (United States)

    Wirtz, C R; Bonsanto, M M; Knauth, M; Tronnier, V M; Albert, F K; Staubert, A; Kunze, S

    1997-01-01

    We report on the first successful intraoperative update of interactive image guidance based on an intraoperatively acquired magnetic resonance imaging (MRI) data set. To date, intraoperative imaging methods such as ultrasound, computerized tomography (CT), or MRI have not been successfully used to update interactive navigation. We developed a method of imaging patients intraoperatively with the surgical field exposed in an MRI scanner (Magnetom Open; Siemens Corp., Erlangen, Germany). In 12 patients, intraoperatively acquired 3D data sets were used for successful recalibration of neuronavigation, accounting for any anatomical changes caused by surgical manipulations. The MKM Microscope (Zeiss Corp., Oberkochen, Germany) was used as navigational system. With implantable fiducial markers, an accuracy of 0.84 +/- 0.4 mm for intraoperative reregistration was achieved. Residual tumor detected on MRI was consequently resected using navigation with the intraoperative data. No adverse effects were observed from intraoperative imaging or the use of navigation with intraoperative images, demonstrating the feasibility of recalibrating navigation with intraoperative MRI.

  15. Generalizations of the limited-memory BFGS method based on the quasi-product form of update

    Czech Academy of Sciences Publication Activity Database

    Vlček, Jan; Lukšan, Ladislav

    2013-01-01

    Vol. 241, 15 March (2013), pp. 116-129. ISSN 0377-0427. R&D Projects: GA ČR GA201/09/1957. Institutional research plan: CEZ:AV0Z10300504. Keywords: unconstrained minimization * variable metric methods * limited-memory methods * Broyden class updates * global convergence * numerical results. Subject RIV: BA - General Mathematics. Impact factor: 1.077, year: 2013
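
    For reference, the standard limited-memory BFGS step that this line of work generalizes can be written as the classic two-loop recursion; the sketch below is the textbook form, not the quasi-product-form updates studied in the paper.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop recursion: apply the implicit limited-memory BFGS inverse Hessian
    to the current gradient, using stored pairs s_k = x_{k+1}-x_k, y_k = g_{k+1}-g_k."""
    q = grad.copy()
    alphas = []
    for s, y in reversed(list(zip(s_list, y_list))):   # newest pair first
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        q -= a * y
        alphas.append((rho, a, s, y))
    if s_list:                                         # initial scaling H0 = gamma * I
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    for rho, a, s, y in reversed(alphas):              # oldest pair first
        b = rho * (y @ q)
        q += (a - b) * s
    return -q                                          # search direction
```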

  16. Updating Geospatial Data from Large Scale Data Sources

    Science.gov (United States)

    Zhao, R.; Chen, J.; Wang, D.; Shang, Y.; Wang, Z.; Li, X.; Ai, T.

    2011-08-01

    In the past decades, many geospatial databases have been established at national, regional and municipal levels around the world. Nowadays, it is widely recognized that how to update these established geospatial databases and keep them up to date is critical to their value. More and more effort has therefore been devoted to the continuous updating of these geospatial databases. Currently, there are two main types of methods for geospatial database updating: directly updating with remote sensing images or field surveying materials, and indirectly updating with other updated data, such as newly updated data at a larger scale. The former method is fundamental, because the update data sources of both methods ultimately come from field surveying and remote sensing. The latter method is often more economical and faster than the former. Therefore, after the larger scale database is updated, the smaller scale database should be updated correspondingly in order to keep the consistency of the multi-scale geospatial database. In this situation, it is very reasonable to apply map generalization technology to the process of geospatial database updating. The latter is recognized as one of the most promising methods of geospatial database updating, especially in a collaborative updating environment in terms of map scale, i.e., where databases at different scales are produced and maintained separately by organizations at different levels, such as in China. This paper focuses on applying digital map generalization to the updating of geospatial databases from a larger scale in the collaborative updating environment for SDI. The requirements for applying map generalization to spatial database updating are analyzed first. A brief review of geospatial data updating based on digital map generalization is then given. Based on the requirements analysis and review, we analyze the key factors for implementing updating of geospatial data from a larger scale, including technical

  17. Updating optical pseudoinverse associative memories.

    Science.gov (United States)

    Telfer, B; Casasent, D

    1989-07-01

    Selected algorithms for adding to and deleting from optical pseudoinverse associative memories are presented and compared. New realizations of pseudoinverse updating methods using vector inner product matrix bordering and reduced-dimensionality Karhunen-Loeve approximations (which have been used for updating optical filters) are described in the context of associative memories. Greville's theorem is reviewed and compared with the Widrow-Hoff algorithm. Kohonen's gradient projection method is expressed in a different form suitable for optical implementation. The data matrix memory is also discussed for comparison purposes. Memory size, speed and ease of updating, and key vector requirements are the comparison criteria used.

  18. An Experimental Study of Structural Identification of Bridges Using the Kinetic Energy Optimization Technique and the Direct Matrix Updating Method

    Directory of Open Access Journals (Sweden)

    Gwanghee Heo

    2016-01-01

    Full Text Available This paper aims to develop an SI (structural identification) technique using the KEOT and the DMUM to decide on the optimal location of sensors and to update the FE model, respectively, which ultimately contributes to the composition of a more effective SHM. Owing to the characteristic structural flexing behavior of cable bridges (e.g., cable-stayed bridges and suspension bridges), which makes them vulnerable to any vibration, systematic and continuous structural health monitoring (SHM) is pivotal for them. Since it is necessary to select optimal measurement locations with the fewest possible measurements and also to accurately assess the structural state of a bridge for the development of an effective SHM, an SI technique is equally important to accurately determine the modal parameters of the current structure based on the optimally obtained data. In this study, the kinetic energy optimization technique (KEOT) was utilized to determine the optimal measurement locations, while the direct matrix updating method (DMUM) was utilized for FE model updating. As a result of the experiment, the required number of measurement locations derived from KEOT based on the target mode was reduced by approximately 80% compared to the initial number of measurement locations. Moreover, compared to the eigenvalue of the modal experiment, an improved FE model with a margin of error of less than 1% was derived from DMUM. Thus, the SI technique for cable-stayed bridges proposed in this study, which utilizes both KEOT and DMUM, is proven effective in minimizing the number of sensors while accurately determining the structural dynamic characteristics.

  19. A rapid learning and dynamic stepwise updating algorithm for flat neural networks and the application to time-series prediction.

    Science.gov (United States)

    Chen, C P; Wan, J Z

    1999-01-01

    A fast learning algorithm is proposed to find the optimal weights of flat neural networks (especially, the functional-link network). Although flat networks are used for nonlinear function approximation, they can be formulated as linear systems. Thus, the weights of the networks can be solved easily using a linear least-squares method. This formulation makes it easier to update the weights instantly both for a newly added pattern and for a newly added enhancement node. A dynamic stepwise updating algorithm is proposed to update the weights of the system on-the-fly. The model is tested on several time-series data sets including an infrared laser data set, a chaotic time-series, a monthly flour price data set, and a nonlinear system identification problem. The simulation results are compared to existing models in which more complex architectures and more costly training are needed. The results indicate that the proposed model is very attractive for real-time processes.
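
    A sketch of the two ingredients is given below: a one-shot least-squares solution for the flat-network weights, and a recursive least-squares style update when a single new pattern arrives, so the whole system need not be re-solved. The regularization constant and matrix layout are assumptions made for illustration; the paper also covers the case of a newly added enhancement node, which is omitted here.

```python
import numpy as np

def fit_flat_network(H, T, reg=1e-6):
    """Solve the flat (functional-link) network weights by linear least squares.
    H : (N, d) expanded/enhancement node outputs, T : (N, m) targets."""
    P = np.linalg.inv(H.T @ H + reg * np.eye(H.shape[1]))  # kept for later updates
    W = P @ H.T @ T
    return W, P

def add_pattern(W, P, h_new, t_new):
    """Stepwise (recursive least squares) update of the weights when one new
    training pattern arrives, without re-solving the whole system."""
    h = h_new.reshape(-1, 1)
    k = P @ h / (1.0 + h.T @ P @ h)          # gain vector
    W = W + k @ (t_new.reshape(1, -1) - h.T @ W)
    P = P - k @ h.T @ P
    return W, P
```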

  20. Dual State-Parameter Updating Scheme on a Conceptual Hydrologic Model Using Sequential Monte Carlo Filters

    Science.gov (United States)

    Noh, Seong Jin; Tachikawa, Yasuto; Shiiba, Michiharu; Kim, Sunmin

    Data assimilation techniques have been widely used to improve the predictability of hydrologic modeling. Among various data assimilation techniques, sequential Monte Carlo (SMC) filters, known as "particle filters", provide the capability to handle non-linear and non-Gaussian state-space models. This paper proposes a dual state-parameter updating scheme (DUS) based on SMC methods to estimate both state and parameter variables of a hydrologic model. We introduce a kernel smoothing method for the robust estimation of uncertain model parameters in the DUS. The applicability of the dual updating scheme is illustrated using the implementation of the storage function model on a middle-sized Japanese catchment. We also compare performance results of DUS combined with various SMC methods, such as SIR, ASIR and RPF.
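
    One step of such a dual state-parameter update can be sketched as follows, with Liu-West style kernel smoothing of the parameters before an SIR resampling step. The propagate and likelihood callbacks stand in for the storage function model and its observation model and, like the shrinkage factor, are assumptions of this sketch rather than the paper's implementation.

```python
import numpy as np

def dual_update_step(states, params, obs, propagate, likelihood, shrink=0.95):
    """One dual state-parameter update with an SIR particle filter.

    states, params  : (N, dx) and (N, dp) particle arrays
    propagate(x, p) : model transition for one time step (hypothetical callback)
    likelihood(x, o): observation likelihood of each particle (hypothetical callback)
    """
    n = len(states)
    # Kernel smoothing of parameters: shrink toward the mean, then add jitter,
    # which keeps parameter diversity across resampling steps.
    std = params.std(axis=0)
    params = shrink * params + (1 - shrink) * params.mean(axis=0)
    params = params + np.random.randn(*params.shape) * np.sqrt(1 - shrink**2) * std
    # Propagate states, weight by the observation, and resample states and
    # parameters jointly so each particle keeps its own parameter set.
    states = propagate(states, params)
    w = likelihood(states, obs)
    w = w / w.sum()
    idx = np.random.choice(n, size=n, p=w)
    return states[idx], params[idx]
```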

  1. Updating the 2001 National Land Cover Database Impervious Surface Products to 2006 using Landsat imagery change detection methods

    Science.gov (United States)

    Xian, George; Homer, Collin G.

    2010-01-01

    A prototype method was developed to update the U.S. Geological Survey (USGS) National Land Cover Database (NLCD) 2001 to a nominal date of 2006. NLCD 2001 is widely used as a baseline for national land cover and impervious cover conditions. To enable the updating of this database in an optimal manner, methods are designed to be accomplished by individual Landsat scene. Using conservative change thresholds based on land cover classes, areas of change and no-change were segregated from change vectors calculated from normalized Landsat scenes from 2001 and 2006. By sampling from NLCD 2001 impervious surface in unchanged areas, impervious surface predictions were estimated for changed areas within an urban extent defined by a companion land cover classification. Methods were developed and tested for national application across six study sites containing a variety of urban impervious surface. Results show the vast majority of impervious surface change associated with urban development was captured, with overall RMSE from 6.86 to 13.12% for these areas. Changes of urban development density were also evaluated by characterizing the categories of change by percentile for impervious surface. This prototype method provides a relatively low cost, flexible approach to generate updated impervious surface using NLCD 2001 as the baseline.

  2. Update of European bioethics

    DEFF Research Database (Denmark)

    Rendtorff, Jacob Dahl

    2015-01-01

    This paper presents an update of the research on European bioethics undertaken by the author together with Professor Peter Kemp since the 1990s, on Basic ethical principles in European bioethics and biolaw. In this European approach to basic ethical principles in bioethics and biolaw, the principles of autonomy, dignity, integrity and vulnerability are proposed as the most important ethical principles for respect for the human person in biomedical and biotechnological development. This approach to bioethics and biolaw is presented here in a short updated version that integrates the earlier research in a presentation of the present understanding of the basic ethical principles in bioethics and biolaw.

  3. Information dissemination model for social media with constant updates

    Science.gov (United States)

    Zhu, Hui; Wu, Heng; Cao, Jin; Fu, Gang; Li, Hui

    2018-07-01

    With the development of social media tools and the pervasiveness of smart terminals, social media has become a significant source of information for many individuals. However, false information can spread rapidly, which may result in negative social impacts and serious economic losses. Thus, reducing the unfavorable effects of false information has become an urgent challenge. In this paper, a new competitive model called DMCU is proposed to describe the dissemination of information with constant updates in social media. In the model, we focus on the competitive relationship between the original false information and updated information, and then propose the priority of related information. To more effectively evaluate the effectiveness of the proposed model, data sets containing actual social media activity are utilized in experiments. Simulation results demonstrate that the DMCU model can precisely describe the process of information dissemination with constant updates, and that it can be used to forecast information dissemination trends on social media.

  4. Update to Proposal for an Experiment to Measure Mixing, CP Violation and Rare Decays in Charm and Beauty Particle Decays at the Fermilab Collider - BTeV

    Energy Technology Data Exchange (ETDEWEB)

    Butler, Joel [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Stone, Sheldon [Syracuse Univ., NY (United States)

    2002-03-01

    We have been requested to submit an update of the BTeV plan to the Fermilab Physics Advisory Committee, where to save money the detector has only one arm and there is no new interaction region magnet construction planned. These are to come from a currently running collider experiment at the appropriate time. The "Physics Case" section is complete and updated, with the section on the "New Physics" capabilities of BTeV greatly expanded. We show that precise measurements of rare flavor-changing neutral current processes and CP violation are and will be complementary to the Tevatron and LHC in unraveling the electroweak breaking puzzle. We include a revised summary of the physics sensitivities for the one-arm detector, which are not simply our proposal numbers divided by two, because of additional improvements. One important change resulted from an improved understanding of just how important the RICH detector is to muon and electron identification: we can indeed separate electrons from pions and muons from pions, especially at relatively large angles beyond the physical aperture of the EM calorimeter or the Muon Detector. This is documented in the "Physics Sensitivities" section. The section on the detector includes the motivation for doing b and c physics at a hadron collider, and shows the changes in the detector since the proposal based on our ongoing R&D program. We do not include here a detailed description of the entire detector. That is available in the May, 2000 proposal. We include a summary of our R&D activities for the entire experiment. Finally, we also include a fully updated cost estimate for the one-arm system.

  5. Sequential updating of a new dynamic pharmacokinetic model for caffeine in premature neonates.

    Science.gov (United States)

    Micallef, Sandrine; Amzal, Billy; Bach, Véronique; Chardon, Karen; Tourneux, Pierre; Bois, Frédéric Y

    2007-01-01

    Caffeine treatment is widely used in nursing care to reduce the risk of apnoea in premature neonates. To check the therapeutic efficacy of the treatment against apnoea, caffeine concentration in blood is an important indicator. The present study was aimed at building a pharmacokinetic model as a basis for a medical decision support tool. In the proposed model, time dependence of physiological parameters is introduced to describe rapid growth of neonates. To take into account the large variability in the population, the pharmacokinetic model is embedded in a population structure. The whole model is inferred within a Bayesian framework. To update caffeine concentration predictions as data of an incoming patient are collected, we propose a fast method that can be used in a medical context. This involves the sequential updating of model parameters (at individual and population levels) via a stochastic particle algorithm. Our model provides better predictions than the ones obtained with models previously published. We show, through an example, that sequential updating improves predictions of caffeine concentration in blood (reduce bias and length of credibility intervals). The update of the pharmacokinetic model using body mass and caffeine concentration data is studied. It shows how informative caffeine concentration data are in contrast to body mass data. This study provides the methodological basis to predict caffeine concentration in blood, after a given treatment if data are collected on the treated neonate.

  6. Update on NOx measuring methods and emission levels

    International Nuclear Information System (INIS)

    Yamada, N.; Desprets, M.

    1997-01-01

    The survey was carried out in 1995 to update the NOx report prepared for presentation at the 19th World Gas Conference held in Milan in 1994 and drawn up on the basis of the information obtained through the survey carried out in 1992. Over the past three years the work on standard developments and/or improvements on NOx emissions has progressed in several IGU member countries. For example, in Europe a report on 'Determination of emissions from appliances burning gaseous fuels during type-testing' was drafted by the European Committee for Standardization (CEN) in March 1994 as 'CR 1404 : 1994'. This report is based on the work so far carried out within MARCOGAZ, and it is expected that the NOx measuring methods specified in this report will finally be introduced into the relevant gas appliance standards issued by CEN for type-testing of the gas appliances covered by this report. (au)

  7. Effect of asynchronous updating on the stability of cellular automata

    International Nuclear Information System (INIS)

    Baetens, J.M.; Van der Weeën, P.; De Baets, B.

    2012-01-01

    Highlights: ► An upper bound on the Lyapunov exponent of asynchronously updated CA is established. ► The employed update method has repercussions on the stability of CAs. ► A decision on the employed update method should be taken with care. ► Substantial discrepancies arise between synchronously and asynchronously updated CA. ► Discrepancies between different asynchronous update schemes are less pronounced. - Abstract: Although cellular automata (CAs) were conceptualized as utter discrete mathematical models in which the states of all their spatial entities are updated simultaneously at every consecutive time step, i.e. synchronously, various CA-based models that rely on so-called asynchronous update methods have been constructed in order to overcome the limitations that are tied up with the classical way of evolving CAs. So far, only a few researchers have addressed the consequences of this way of updating on the evolved spatio-temporal patterns, and the reachable stationary states. In this paper, we exploit Lyapunov exponents to determine to what extent the stability of the rules within a family of totalistic CAs is affected by the underlying update method. For that purpose, we derive an upper bound on the maximum Lyapunov exponent of asynchronously iterated CAs, and show its validity, after which we present a comparative study between the Lyapunov exponents obtained for five different update methods, namely one synchronous method and four well-established asynchronous methods. It is found that the stability of CAs is seriously affected if one of the latter methods is employed, whereas the discrepancies arising between the different asynchronous methods are far less pronounced and, finally, we discuss the repercussions of our findings on the development of CA-based models.
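
    The practical difference between update methods can be seen with a one-dimensional totalistic CA: under a synchronous step every cell reads the old configuration, whereas under a random-shuffle asynchronous step each cell already sees its updated neighbours. The rule and lattice below are illustrative, not the family of rules analysed in the paper.

```python
import numpy as np

def step(cells, rule, scheme="synchronous"):
    """Advance a 1-D totalistic CA one time step under different update methods."""
    n = len(cells)
    if scheme == "synchronous":                   # all cells use the old configuration
        new = cells.copy()
        for i in range(n):
            new[i] = rule(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])
        return new
    order = np.random.permutation(n)              # random-shuffle asynchronous update
    for i in order:                               # each cell sees already-updated neighbours
        cells[i] = rule(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])
    return cells

# Example totalistic rule: a cell becomes 1 iff exactly one cell in its neighbourhood is 1.
rule = lambda l, c, r: 1 if (l + c + r) == 1 else 0
cells = np.random.randint(0, 2, 64)
sync_result  = step(cells.copy(), rule, "synchronous")
async_result = step(cells.copy(), rule, "shuffle")
```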

  8. A Proposed Multimedia Cone of Abstraction: Updating a Classic Instructional Design Theory

    Science.gov (United States)

    Baukal, Charles E.; Ausburn, Floyd B.; Ausburn, Lynna J.

    2013-01-01

    Advanced multimedia techniques offer significant learning potential for students. Dale (1946, 1954, 1969) developed a Cone of Experience (CoE) which is a hierarchy of learning experiences ranging from direct participation to abstract symbolic expression. This paper updates the CoE for today's technology and learning context, specifically focused…

  9. Sparsity-promoting orthogonal dictionary updating for image reconstruction from highly undersampled magnetic resonance data

    International Nuclear Information System (INIS)

    Huang, Jinhong; Guo, Li; Feng, Qianjin; Chen, Wufan; Feng, Yanqiu

    2015-01-01

    Image reconstruction from undersampled k-space data accelerates magnetic resonance imaging (MRI) by exploiting image sparseness in certain transform domains. Employing image patch representation over a learned dictionary has the advantage of being adaptive to local image structures and thus can better sparsify images than using fixed transforms (e.g. wavelets and total variations). Dictionary learning methods have recently been introduced to MRI reconstruction, and these methods demonstrate significantly reduced reconstruction errors compared to sparse MRI reconstruction using fixed transforms. However, the synthesis sparse coding problem in dictionary learning is NP-hard and computationally expensive. In this paper, we present a novel sparsity-promoting orthogonal dictionary updating method for efficient image reconstruction from highly undersampled MRI data. The orthogonality imposed on the learned dictionary enables the minimization problem in the reconstruction to be solved by an efficient optimization algorithm which alternately updates representation coefficients, orthogonal dictionary, and missing k-space data. Moreover, both sparsity level and sparse representation contribution using updated dictionaries gradually increase during iterations to recover more details, assuming the progressively improved quality of the dictionary. Simulation and real data experimental results both demonstrate that the proposed method is approximately 10 to 100 times faster than the K-SVD-based dictionary learning MRI method and simultaneously improves reconstruction accuracy. (paper)
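    The orthogonality constraint is what makes the alternating scheme cheap: sparse coding reduces to thresholding and the dictionary update becomes an orthogonal Procrustes problem. A small illustrative sketch (Python) on random patch data, not the authors' implementation or the k-space data-consistency step:

```python
import numpy as np

def sparse_code(D, X, k):
    """With an orthogonal dictionary, sparse coding reduces to hard-thresholding of D^T X."""
    A = D.T @ X
    # Keep the k largest-magnitude coefficients per patch (column).
    thresh = -np.sort(-np.abs(A), axis=0)[k - 1]
    A[np.abs(A) < thresh] = 0.0
    return A

def update_dictionary(X, A):
    """Orthogonal Procrustes step: the orthogonal D minimizing ||X - D A||_F is U V^T."""
    U, _, Vt = np.linalg.svd(X @ A.T, full_matrices=False)
    return U @ Vt

rng = np.random.default_rng(0)
X = rng.standard_normal((64, 1000))                    # e.g. 8x8 image patches as columns
D = np.linalg.qr(rng.standard_normal((64, 64)))[0]     # initial orthogonal dictionary

for sparsity in (2, 4, 8):                             # gradually increase the sparsity level
    for _ in range(10):
        A = sparse_code(D, X, sparsity)
        D = update_dictionary(X, A)
print("relative sparse-approximation residual:", np.linalg.norm(X - D @ A) / np.linalg.norm(X))
```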

  10. Sparsity-promoting orthogonal dictionary updating for image reconstruction from highly undersampled magnetic resonance data.

    Science.gov (United States)

    Huang, Jinhong; Guo, Li; Feng, Qianjin; Chen, Wufan; Feng, Yanqiu

    2015-07-21

    Image reconstruction from undersampled k-space data accelerates magnetic resonance imaging (MRI) by exploiting image sparseness in certain transform domains. Employing image patch representation over a learned dictionary has the advantage of being adaptive to local image structures and thus can better sparsify images than using fixed transforms (e.g. wavelets and total variations). Dictionary learning methods have recently been introduced to MRI reconstruction, and these methods demonstrate significantly reduced reconstruction errors compared to sparse MRI reconstruction using fixed transforms. However, the synthesis sparse coding problem in dictionary learning is NP-hard and computationally expensive. In this paper, we present a novel sparsity-promoting orthogonal dictionary updating method for efficient image reconstruction from highly undersampled MRI data. The orthogonality imposed on the learned dictionary enables the minimization problem in the reconstruction to be solved by an efficient optimization algorithm which alternately updates representation coefficients, orthogonal dictionary, and missing k-space data. Moreover, both sparsity level and sparse representation contribution using updated dictionaries gradually increase during iterations to recover more details, assuming the progressively improved quality of the dictionary. Simulation and real data experimental results both demonstrate that the proposed method is approximately 10 to 100 times faster than the K-SVD-based dictionary learning MRI method and simultaneously improves reconstruction accuracy.

  11. A study on reducing update frequency of the forecast samples in the ensemble-based 4DVar data assimilation method

    Energy Technology Data Exchange (ETDEWEB)

    Shao, Aimei; Xu, Daosheng [Lanzhou Univ. (China). Key Lab. of Arid Climatic Changing and Reducing Disaster of Gansu Province; Chinese Academy of Meteorological Sciences, Beijing (China). State Key Lab. of Severe Weather; Qiu, Xiaobin [Lanzhou Univ. (China). Key Lab. of Arid Climatic Changing and Reducing Disaster of Gansu Province; Tianjin Institute of Meteorological Science (China); Qiu, Chongjian [Lanzhou Univ. (China). Key Lab. of Arid Climatic Changing and Reducing Disaster of Gansu Province

    2013-02-15

    In the ensemble-based four dimensional variational assimilation method (SVD-En4DVar), a singular value decomposition (SVD) technique is used to select the leading eigenvectors and the analysis variables are expressed as the orthogonal bases expansion of the eigenvectors. The experiments with a two-dimensional shallow-water equation model and simulated observations show that the truncation error and rejection of observed signals due to the reduced-dimensional reconstruction of the analysis variable are the major factors that damage the analysis when the ensemble size is not large enough. However, a larger ensemble imposes a daunting computational burden. Experiments with a shallow-water equation model also show that the forecast error covariances remain relatively constant over time. For that reason, we propose an approach that increases the number of members of the forecast ensemble while reducing the update frequency of the forecast error covariance in order to increase analysis accuracy and to reduce the computational cost. A series of experiments were conducted with the shallow-water equation model to test the efficiency of this approach. The experimental results indicate that this approach is promising. Further experiments with the WRF model show that this approach is also suitable for the real atmospheric data assimilation problem, but the update frequency of the forecast error covariances should not be too low. (orig.)

  12. Optimal updating magnitude in adaptive flat-distribution sampling.

    Science.gov (United States)

    Zhang, Cheng; Drake, Justin A; Ma, Jianpeng; Pettitt, B Montgomery

    2017-11-07

    We present a study on the optimization of the updating magnitude for a class of free energy methods based on flat-distribution sampling, including the Wang-Landau (WL) algorithm and metadynamics. These methods rely on adaptive construction of a bias potential that offsets the potential of mean force by histogram-based updates. The convergence of the bias potential can be improved by decreasing the updating magnitude with an optimal schedule. We show that while the asymptotically optimal schedule for the single-bin updating scheme (commonly used in the WL algorithm) is given by the known inverse-time formula, that for the Gaussian updating scheme (commonly used in metadynamics) is often more complex. We further show that the single-bin updating scheme is optimal for very long simulations, and it can be generalized to a class of bandpass updating schemes that are similarly optimal. These bandpass updating schemes target only a few long-range distribution modes and their optimal schedule is also given by the inverse-time formula. Constructed from orthogonal polynomials, the bandpass updating schemes generalize the WL and Langfeld-Lucini-Rago algorithms as an automatic parameter tuning scheme for umbrella sampling.
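    A toy sketch of single-bin flat-distribution updating with the two-stage schedule discussed above, i.e. halving the magnitude on a roughly flat histogram and then switching to the inverse-time (1/t) formula (Python; the periodic one-dimensional free-energy profile and the flatness threshold are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy flat-distribution sampling on a periodic coordinate with free-energy profile U
# (in units of kT).  A histogram-based bias V is grown until U + V is roughly flat,
# so that -V recovers U up to an additive constant.
n_bins = 36
theta = np.linspace(0.0, 2.0 * np.pi, n_bins, endpoint=False)
U = 2.0 * (1.0 - np.cos(theta))           # profile with a 4 kT range
V = np.zeros(n_bins)
hist = np.zeros(n_bins)
i, ln_f, t = 0, 0.5, 0

for _ in range(400000):
    t += 1
    j = (i + rng.choice([-1, 1])) % n_bins
    if np.log(rng.random()) < (U[i] + V[i]) - (U[j] + V[j]):   # Metropolis step on U + V
        i = j
    V[i] += ln_f                          # single-bin (Wang-Landau style) update
    hist[i] += 1
    if ln_f > 1.0 / t:
        # Stage 1: halve the updating magnitude each time the histogram is roughly flat.
        if hist.min() > 0.8 * hist.mean():
            ln_f *= 0.5
            hist[:] = 0.0
    else:
        # Stage 2: switch to the asymptotically optimal inverse-time (1/t) schedule.
        ln_f = 1.0 / t

estimate = V.max() - V                    # recovered profile, shifted to a minimum of 0
print("max |error| of recovered profile (kT):", round(float(np.max(np.abs(estimate - U))), 2))
```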

  13. Automated finite element updating using strain data for the lifetime reliability assessment of bridges

    International Nuclear Information System (INIS)

    Okasha, Nader M.; Frangopol, Dan M.; Orcesi, André D.

    2012-01-01

    The importance of improving the understanding of the performance of structures over their lifetime under uncertainty with information obtained from structural health monitoring (SHM) has been widely recognized. However, frameworks that efficiently integrate monitoring data into the life-cycle management of structures are yet to be developed. The objective of this paper is to propose and illustrate an approach for updating the lifetime reliability of aging bridges using monitored strain data obtained from crawl tests. It is proposed to use automated finite element model updating techniques as a tool for updating the resistance parameters of the structure. In this paper, the results from crawl tests are used to update the finite element model and, in turn, update the lifetime reliability. The original and updated lifetime reliabilities are computed using advanced computational tools. The approach is illustrated on an existing bridge.

  14. Effective Solar Indices for Ionospheric Modeling: A Review and a Proposal for a Real-Time Regional IRI

    Science.gov (United States)

    Pignalberi, A.; Pezzopane, M.; Rizzi, R.; Galkin, I.

    2018-01-01

    The first part of this paper reviews methods using effective solar indices to update a background ionospheric model, focusing on those employing the Kriging method to perform the spatial interpolation. Then, it proposes a method to update the International Reference Ionosphere (IRI) model through the assimilation of data collected by a European ionosonde network. The method, called International Reference Ionosphere UPdate (IRI UP), which can potentially operate in real time, is mathematically described and validated for the period 9-25 March 2015 (a time window including the well-known St. Patrick's Day storm that occurred on 17 March), using the IRI and IRI Real Time Assimilative Model (IRTAM) models as the reference. It relies on the foF2 and M(3000)F2 ionospheric characteristics, recorded routinely by a network of 12 European ionosonde stations, which are used to calculate for each station effective values of the IRI indices IG12 and R12 (identified as IG12eff and R12eff); then, starting from this discrete dataset of values, two-dimensional (2D) maps of IG12eff and R12eff are generated through the universal Kriging method. Five variogram models are proposed and tested statistically to select the best performer for each effective index. Then, the computed maps of IG12eff and R12eff are used in the IRI model to synthesize updated values of foF2 and hmF2. To evaluate the ability of the proposed method to reproduce rapid local changes that are common under disturbed conditions, quality metrics are calculated for two test stations whose measurements were not assimilated in IRI UP, Fairford (51.7°N, 1.5°W) and San Vito (40.6°N, 17.8°E), for the IRI, IRI UP, and IRTAM models. The proposed method turns out to be very effective under highly disturbed conditions, with significant improvements in the foF2 representation and noticeable improvements in the hmF2 one. Important improvements have been verified also for quiet and moderately disturbed
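    Conceptually, an effective index is the index value that forces the background model to reproduce an observed characteristic at a station. A hedged sketch of that per-station search step (Python); the function iri_foF2 below is a smooth stand-in surrogate, not a real IRI call, and the station values are invented:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def iri_foF2(ig12, lat, lon, hour):
    """Placeholder for the IRI foF2 prediction at one station.

    A real implementation would call an IRI wrapper; here a smooth, monotone
    surrogate in IG12 is used purely for illustration.
    """
    return 4.0 + 0.05 * ig12 + 1.5 * np.cos(np.deg2rad(lat)) * np.sin(np.pi * hour / 24.0)

def effective_index(fof2_obs, lat, lon, hour, bounds=(-50.0, 250.0)):
    """IG12eff: the index value that makes the model reproduce the observed foF2."""
    res = minimize_scalar(lambda ig: (iri_foF2(ig, lat, lon, hour) - fof2_obs) ** 2,
                          bounds=bounds, method="bounded")
    return res.x

# One effective index per ionosonde station; a 2-D map would then be interpolated
# from these point values (the paper uses universal Kriging for that step).
stations = [("Fairford", 51.7, -1.5, 6.2), ("San Vito", 40.6, 17.8, 7.1)]
for name, lat, lon, fof2 in stations:
    print(name, "IG12eff =", round(effective_index(fof2, lat, lon, hour=12), 1))
```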

  15. Model parameter updating using Bayesian networks

    International Nuclear Information System (INIS)

    Treml, C.A.; Ross, Timothy J.

    2004-01-01

    This paper outlines a model parameter updating technique for a new method of model validation using a modified model reference adaptive control (MRAC) framework with Bayesian Networks (BNs). The model parameter updating within this method is generic in the sense that the model/simulation to be validated is treated as a black box. It must have updateable parameters to which its outputs are sensitive, and those outputs must have metrics that can be compared to that of the model reference, i.e., experimental data. Furthermore, no assumptions are made about the statistics of the model parameter uncertainty, only upper and lower bounds need to be specified. This method is designed for situations where a model is not intended to predict a complete point-by-point time domain description of the item/system behavior; rather, there are specific points, features, or events of interest that need to be predicted. These specific points are compared to the model reference derived from actual experimental data. The logic for updating the model parameters to match the model reference is formed via a BN. The nodes of this BN consist of updateable model input parameters and the specific output values or features of interest. Each time the model is executed, the input/output pairs are used to adapt the conditional probabilities of the BN. Each iteration further refines the inferred model parameters to produce the desired model output. After parameter updating is complete and model inputs are inferred, reliabilities for the model output are supplied. Finally, this method is applied to a simulation of a resonance control cooling system for a prototype coupled cavity linac. The results are compared to experimental data.

  16. Smoking and plastic surgery, part I. Pathophysiological aspects: update and proposed recommendations.

    Science.gov (United States)

    Pluvy, I; Garrido, I; Pauchot, J; Saboye, J; Chavoin, J P; Tropet, Y; Grolleau, J L; Chaput, B

    2015-02-01

    Smoking patients undergoing a plastic surgery intervention are exposed to increased risk of perioperative and postoperative complications. It seemed useful to us to establish an update about the negative impact of smoking, especially on wound healing, and also about the indisputable benefits of quitting. We wish to propose a minimum time lapse of withdrawal in the preoperative and postoperative period in order to reduce the risks and maximize the results of the intervention. A literature review of documents from 1972 to 2014 was carried out by searching five different databases (Medline, PubMed Central, Cochrane library, Pascal and Web of Science). Cigarette smoke has a diffuse and multifactorial impact in the body. Hypoxia, tissue ischemia and immune disorders induced by tobacco consumption cause alterations of the healing process. Some of these effects are reversible by quitting. Data from the literature recommend a preoperative smoking cessation period lasting between 3 and 8 weeks and up until 4 weeks postoperatively. Use of nicotine replacement therapies doubles the abstinence rate in the short term. When a patient is heavily dependent, the surgeon should be helped by a tobacco specialist. Total smoking cessation of 4 weeks preoperatively and lasting until primary healing of the operative site (2 weeks) appears to optimize surgical conditions without heightening anesthetic risk. Tobacco withdrawal assistance, both human and drug-based, is highly recommended. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  17. Updated clinical guidelines experience major reporting limitations

    Directory of Open Access Journals (Sweden)

    Robin W.M. Vernooij

    2017-10-01

    Abstract Background The Checklist for the Reporting of Updated Guidelines (CheckUp) was recently developed. However, so far, no systematic assessment of the reporting of updated clinical guidelines (CGs) exists. We aimed to examine (1) the completeness of reporting the updating process in CGs and (2) the inter-observer reliability of CheckUp. Methods We conducted a systematic assessment of the reporting of the updating process in a sample of updated CGs using CheckUp. We performed a systematic search to identify updated CGs published in 2015, developed by a professional society, reporting a systematic review of the evidence, and containing at least one recommendation. Three reviewers independently assessed the CGs with CheckUp (16 items). We calculated the median score per item, per domain, and overall, converting scores to a 10-point scale. Multiple linear regression analyses were used to identify differences according to country, type of organisation, scope, and health topic of updated CGs. We calculated the intraclass coefficient (ICC) and 95% confidence interval (95% CI) for domains and overall score. Results We included in total 60 updated CGs. The median domain score on a 10-point scale for presentation was 5.8 (range 1.7 to 10), for editorial independence 8.3 (range 3.3 to 10), and for methodology 5.7 (range 0 to 10). The median overall score on a 10-point scale was 6.3 (range 3.1 to 10). Presentation and justification items at recommendation level (reported by 27% and 38% of the CGs, respectively) and the methods used for the external review and implementing changes in practice were particularly poorly reported (both reported by 38% of the CGs). CGs developed by a European or international institution obtained a statistically significantly higher overall score compared to North American or Asian institutions (p = 0.014). Finally, the agreement among the reviewers on the overall score was excellent (ICC 0.88, 95% CI 0.75 to 0.95). Conclusions The

  18. Further comments on the sequential probability ratio testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Kulacsy, K. [Hungarian Academy of Sciences, Budapest (Hungary). Central Research Inst. for Physics

    1997-05-23

    The Bayesian method for belief updating proposed in Racz (1996) is examined. The interpretation of the belief function introduced therein is found, and the method is compared to the classical binary Sequential Probability Ratio Testing method (SPRT). (author).
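    For reference, the classical binary SPRT against which the belief-updating method is compared can be sketched as follows (Python; the hypotheses, error rates, and data are illustrative):

```python
import math

def sprt(samples, p0, p1, alpha=0.05, beta=0.05):
    """Classical binary SPRT: accept H1 (p = p1) or H0 (p = p0) for Bernoulli data.

    Returns the decision and the number of observations used, or ('continue', n)
    if neither Wald boundary is crossed.
    """
    upper = math.log((1 - beta) / alpha)      # accept H1 above this log-likelihood ratio
    lower = math.log(beta / (1 - alpha))      # accept H0 below it
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "accept H1", n
        if llr <= lower:
            return "accept H0", n
    return "continue", len(samples)

print(sprt([1, 0, 1, 1, 1, 1, 0, 1, 1, 1], p0=0.3, p1=0.7))
```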

  19. Mammogram segmentation using maximal cell strength updation in cellular automata.

    Science.gov (United States)

    Anitha, J; Peter, J Dinesh

    2015-08-01

    Breast cancer is the most frequently diagnosed type of cancer among women. Mammogram is one of the most effective tools for early detection of the breast cancer. Various computer-aided systems have been introduced to detect the breast cancer from mammogram images. In a computer-aided diagnosis system, detection and segmentation of breast masses from the background tissues is an important issue. In this paper, an automatic segmentation method is proposed to identify and segment the suspicious mass regions of mammogram using a modified transition rule named maximal cell strength updation in cellular automata (CA). In coarse-level segmentation, the proposed method performs an adaptive global thresholding based on the histogram peak analysis to obtain the rough region of interest. An automatic seed point selection is proposed using gray-level co-occurrence matrix-based sum average feature in the coarse segmented image. Finally, the method utilizes CA with the identified initial seed point and the modified transition rule to segment the mass region. The proposed approach is evaluated over the dataset of 70 mammograms with mass from mini-MIAS database. Experimental results show that the proposed approach yields promising results to segment the mass region in the mammograms with the sensitivity of 92.25% and accuracy of 93.48%.

  20. Design and development for updating national 1:50,000 topographic databases in China

    Directory of Open Access Journals (Sweden)

    CHEN Jun

    2010-02-01

    data quality controlling, a series of technical production specifications, and a network of updating production units in different geographic places in the country. 1.3 Results A group of updating models and technical methods were proposed after a systematic investigation of the key problems arising from the continuous updating of national 1:50,000 map databases. A set of specific software tools and packages was further developed to support large-area updating. With these innovative methodologies and tools, a total of 19,150 map sheets at 1:50,000 scale have been updated, and this massive task was completed in an acceptable time frame, i.e., from 2006 to 2010. The data currency of the national 1:50,000 map databases has been raised from 20-30 years to 5 years! 1.4 Conclusion A modern state requires accurate, up-to-date maps, and keeping them up to date on a regular basis is a massive task for a country the size of China. The National Geomatics Center of China (NGCC) has solved this problem by using the latest data sources and developing new techniques. The methodologies developed in this paper are suited to regular updating in other rapidly developing nations and establish a model which can be followed in similar circumstances throughout the world.

  1. Method for updating pipelined, single port Z-buffer by segments on a scan line

    International Nuclear Information System (INIS)

    Hannah, M.R.

    1990-01-01

    This patent describes a method for updating a Z-buffer with new Z values to replace old Z values, in a raster-scan, computer-controlled video display system that presents an image to an observer and has a Z-buffer for storing Z values and a frame buffer for storing pixel values. The method comprises: calculating a new pixel value and a new Z value for each of a plurality of pixel locations; performing a Z comparison for each new Z value by comparing the old Z value with the new Z value for each pixel location, the Z comparison being performed sequentially in one direction through the plurality of pixel locations; and updating the Z-buffer only after the Z comparison produces a fail condition for a current pixel location subsequent to producing a pass condition for the pixel location immediately preceding the current pixel location.
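    A rough software analogue of the claimed segment-wise update, written as a sketch rather than the patented hardware logic: passing pixels are accumulated and the Z-buffer and frame buffer are written one segment at a time when a failing pixel (or the end of the scan line) is reached (Python; the depth convention and sample data are assumptions):

```python
def update_scanline(zbuf, fbuf, new_z, new_pix, closer=lambda a, b: a < b):
    """Segment-wise Z-buffer update along one scan line (illustrative sketch)."""
    start = None                              # start index of the current passing segment
    for i, (z, p) in enumerate(zip(new_z, new_pix)):
        if closer(z, zbuf[i]):                # pass: new fragment is in front
            if start is None:
                start = i
        else:                                 # fail after a run of passes: flush the segment
            if start is not None:
                zbuf[start:i] = new_z[start:i]
                fbuf[start:i] = new_pix[start:i]
                start = None
    if start is not None:                     # flush a segment that reaches the line end
        zbuf[start:] = new_z[start:]
        fbuf[start:] = new_pix[start:]

zbuf = [10] * 8
fbuf = [0] * 8
update_scanline(zbuf, fbuf, new_z=[5, 4, 12, 3, 2, 11, 1, 0], new_pix=[1] * 8)
print(zbuf, fbuf)
```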

  2. Kidney function measured by clearance. Methods and indications. An update; Zur Messung der Nierenfunktion durch Clearancebestimmungen. Methoden und Indikationen. Ein Update

    Energy Technology Data Exchange (ETDEWEB)

    Durand, E. [Universitaetskrankenhaus Bicetre, Paris (France); Mueller-Suur, R. [Karolinska Inst., Danderyds Krankenhaus und Aleris Fysiologlab, Stockholm (Sweden)

    2010-09-15

    Renal function impairment can be monitored by many tests. Measurement of the plasma creatinine level is the most widely used method; 24-h creatinine clearance with urine collection is used by others, and further alternative but indirect methods also exist. However, the use of radionuclide clearances is by far the easiest method to perform, offers the highest accuracy and precision, and delivers a very low radiation dose. In the following we discuss the different radiopharmaceuticals in use, their advantages and disadvantages, the different clearance methods in use and their limitations, and we give some clinically important indications for performing clearance investigations according to consensus reports. In summary, the plasma clearance of 51-Cr-EDTA after a single injection with one blood sample can be generally recommended, with some modifications in special clinical situations, which are pointed out. Please note that this paper is, for a significant part, an update of a previous paper published in 2003 in ''Der Nuklearmediziner''. (orig.)

  3. Frequency response function (FRF) based updating of a laser spot welded structure

    Science.gov (United States)

    Zin, M. S. Mohd; Rani, M. N. Abdul; Yunus, M. A.; Sani, M. S. M.; Wan Iskandar Mirza, W. I. I.; Mat Isa, A. A.

    2018-04-01

    The objective of this paper is to present frequency response function (FRF) based updating as a method for matching the finite element (FE) model of a laser spot welded structure with a physical test structure. The FE model of the welded structure was developed using CQUAD4 and CWELD element connectors, and NASTRAN was used to calculate the natural frequencies, mode shapes and FRF. Minimization of the discrepancies between the finite element and experimental FRFs was carried out using the numerical optimization capability of NASTRAN SOL 200. The experimental work was performed under free-free boundary conditions using LMS SCADAS. A vast improvement in the finite element FRF was achieved using frequency response function (FRF) based updating with the two proposed objective functions.

  4. Model Updating Nonlinear System Identification Toolbox, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — ZONA Technology (ZONA) proposes to develop an enhanced model updating nonlinear system identification (MUNSID) methodology that utilizes flight data with...

  5. Agent Communication for Dynamic Belief Update

    Science.gov (United States)

    Kobayashi, Mikito; Tojo, Satoshi

    Thus far, various formalizations of rational/logical agent models have been proposed. In this paper, we include the notion of a communication channel and a belief modality in update logic, and introduce Belief Update Logic (BUL). First, we discuss how the inform action of FIPA-ACL can be reformalized in terms of a communication channel, which represents a connection between agents. Thus, our agents can send a message only when they believe there is, and there actually is, a channel between them and the receiver. Then, we present a static belief logic (BL) and show its soundness and completeness. Next, we develop the logic into BUL, which can update a Kripke model by the inform action; we show that in the updated model the belief operator also satisfies K45. Thereafter, we show that every sentence in BUL can be translated into BL; thus, we can contend that BUL is also sound and complete. Furthermore, we discuss the features of BUL, including the case of inconsistent information, as well as channel transmission. Finally, we summarize our contribution and discuss some future issues.

  6. Assessment of proposed electromagnetic quantum vacuum energy extraction methods

    OpenAIRE

    Moddel, Garret

    2009-01-01

    In research articles and patents several methods have been proposed for the extraction of zero-point energy from the vacuum. None has been reliably demonstrated, but the proposals remain largely unchallenged. In this paper the feasibility of these methods is assessed in terms of underlying thermodynamics principles of equilibrium, detailed balance, and conservation laws. The methods are separated into three classes: nonlinear processing of the zero-point field, mechanical extraction using Cas...

  7. DEVELOPMENT OF AUTOMATIC EXTRACTION METHOD FOR ROAD UPDATE INFORMATION BASED ON PUBLIC WORK ORDER OUTLOOK

    Science.gov (United States)

    Sekimoto, Yoshihide; Nakajo, Satoru; Minami, Yoshitaka; Yamaguchi, Syohei; Yamada, Harutoshi; Fuse, Takashi

    Recently, the disclosure of statistical data representing the financial effects or burden of public works, through the web sites of national and local governments, has enabled discussion of macroscopic financial trends. However, it is still difficult to grasp, nationwide, how each site has been changed by public works. The purpose of this research is to efficiently collect the road update information provided by various road managers, in order to realize efficient updating of maps such as car navigation maps. In particular, we develop a system that automatically extracts the public works concerned and registers summaries, including position information, in a database from the public work order outlooks released by each local government, combining several web mining technologies. Finally, we collect and register several tens of thousands of records from web sites all over Japan, and confirm the feasibility of our method.

  8. Neural network based online simultaneous policy update algorithm for solving the HJI equation in nonlinear H∞ control.

    Science.gov (United States)

    Wu, Huai-Ning; Luo, Biao

    2012-12-01

    It is well known that the nonlinear H∞ state feedback control problem relies on the solution of the Hamilton-Jacobi-Isaacs (HJI) equation, which is a nonlinear partial differential equation that has proven to be impossible to solve analytically. In this paper, a neural network (NN)-based online simultaneous policy update algorithm (SPUA) is developed to solve the HJI equation, in which knowledge of internal system dynamics is not required. First, we propose an online SPUA which can be viewed as a reinforcement learning technique for two players to learn their optimal actions in an unknown environment. The proposed online SPUA updates control and disturbance policies simultaneously; thus, only one iterative loop is needed. Second, the convergence of the online SPUA is established by proving that it is mathematically equivalent to Newton's method for finding a fixed point in a Banach space. Third, we develop an actor-critic structure for the implementation of the online SPUA, in which only one critic NN is needed for approximating the cost function, and a least-square method is given for estimating the NN weight parameters. Finally, simulation studies are provided to demonstrate the effectiveness of the proposed algorithm.

  9. Proposal of Evolutionary Simplex Method for Global Optimization Problem

    Science.gov (United States)

    Shimizu, Yoshiaki

    To make agile decisions in a rational manner under diversified customer demand, the role of optimization engineering has attracted increasing attention. From this point of view, in this paper we propose a new evolutionary method serving as an optimization technique in the paradigm of optimization engineering. The developed method has the prospect of globally solving the various complicated problems appearing in real-world applications. It evolves from the conventional Nelder and Mead simplex method by borrowing ideas from recent meta-heuristic methods such as PSO. After presenting an algorithm to handle linear inequality constraints effectively, we validate the effectiveness of the proposed method through comparison with other methods on several benchmark problems.

  10. Automatic domain updating technique for improving computational efficiency of 2-D flood-inundation simulation

    Science.gov (United States)

    Tanaka, T.; Tachikawa, Y.; Ichikawa, Y.; Yorozu, K.

    2017-12-01

    Flooding is one of the most hazardous disasters and causes serious damage to people and property around the world. To prevent and mitigate flood damage through early warning systems and/or river management planning, numerical modelling of flood-inundation processes is essential. In the literature, flood-inundation models have been extensively developed and improved to achieve flood flow simulation with complex topography at high resolution. With increasing demands on flood-inundation modelling, its computational burden is now one of the key issues. Improvements to the computational efficiency of the full shallow water equations have been made from various perspectives, such as approximations of the momentum equations, parallelization techniques, and coarsening approaches. To complement these techniques and further improve the computational efficiency of flood-inundation simulations, this study proposes an Automatic Domain Updating (ADU) method for 2-D flood-inundation simulation. The ADU method traces the wet and dry interface and automatically updates the simulation domain in response to the progress and recession of flood propagation. The updating algorithm is as follows: first, the simulation cells potentially flooded at the initial stage (such as floodplains near river channels) are registered; then, whenever a registered cell is flooded, its surrounding cells are registered as well. The cost of this additional process is kept low by checking only the cells at the wet and dry interface. The computation time is reduced by skipping the processing of non-flooded areas. This algorithm is easily applied to any type of 2-D flood-inundation model. The proposed ADU method is implemented with the 2-D local inertial equations for the Yodo River basin, Japan. Case studies for two flood events show that the simulation finishes two to ten times faster while producing the same results as the simulation without the ADU method.
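    A minimal sketch of the domain-updating bookkeeping described above (Python). The local cell solver below is a crude stand-in, not a shallow-water or local inertial solver, and the wet threshold is an arbitrary assumption:

```python
import numpy as np

def adu_step(depth, active, update_cell, wet_threshold=1e-3):
    """One time step with Automatic Domain Updating (sketch).

    Only cells in the active set are advanced; whenever a cell becomes wet, its
    four neighbours are registered so the domain grows with the flood front.
    """
    ny, nx = depth.shape
    for (i, j) in list(active):
        depth[i, j] = update_cell(depth, i, j)          # user-supplied local solver
        if depth[i, j] > wet_threshold:
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < ny and 0 <= nj < nx:
                    active.add((ni, nj))
    return depth, active

# Toy "solver": each active cell relaxes toward the mean of itself and its neighbours.
def diffuse(depth, i, j):
    nbrs = [depth[i, j]]
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        if 0 <= i + di < depth.shape[0] and 0 <= j + dj < depth.shape[1]:
            nbrs.append(depth[i + di, j + dj])
    return float(np.mean(nbrs))

depth = np.zeros((50, 50))
depth[25, 25] = 2.0                    # initial inundation source
active = {(25, 25)}                    # initially registered cells (e.g. floodplain near the channel)
for _ in range(30):
    depth, active = adu_step(depth, active, diffuse)
print("active cells:", len(active), "of", depth.size)
```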

  11. Probabilistic Fatigue Life Updating for Railway Bridges Based on Local Inspection and Repair.

    Science.gov (United States)

    Lee, Young-Joo; Kim, Robin E; Suh, Wonho; Park, Kiwon

    2017-04-24

    Railway bridges are exposed to repeated train loads, which may cause fatigue failure. As critical links in a transportation network, railway bridges are expected to survive for a target period of time, but sometimes they fail earlier than expected. To guarantee the target bridge life, bridge maintenance activities such as local inspection and repair should be undertaken properly. However, this is a challenging task because there are various sources of uncertainty associated with aging bridges, train loads, environmental conditions, and maintenance work. Therefore, to perform optimal risk-based maintenance of railway bridges, it is essential to estimate the probabilistic fatigue life of a railway bridge and update the life information based on the results of local inspections and repair. Recently, a system reliability approach was proposed to evaluate the fatigue failure risk of structural systems and update the prior risk information in various inspection scenarios. However, this approach can handle only a constant-amplitude load and has limitations in considering a cyclic load with varying amplitude levels, which is the major loading pattern generated by train traffic. In addition, it is not feasible to update the prior risk information after bridges are repaired. In this research, the system reliability approach is further developed so that it can handle a varying-amplitude load and update the system-level risk of fatigue failure for railway bridges after inspection and repair. The proposed method is applied to a numerical example of an in-service railway bridge, and the effects of inspection and repair on the probabilistic fatigue life are discussed.

  12. Model Updating Nonlinear System Identification Toolbox, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — ZONA Technology proposes to develop an enhanced model updating nonlinear system identification (MUNSID) methodology by adopting the flight data with state-of-the-art...

  13. A review on model updating of joint structure for dynamic analysis purpose

    Directory of Open Access Journals (Sweden)

    Zahari S.N.

    2016-01-01

    Structural joints provide connections between structural elements (beams, plates, etc.) in order to construct a whole assembled structure. There are many types of structural joints, such as bolted joints, riveted joints and welded joints. Joint structures significantly contribute to the structural stiffness and dynamic behaviour of structures; hence, the main objectives of this paper are to review methods of model updating for joint structures and to discuss guidelines for performing model updating for dynamic analysis purposes. This review paper firstly outlines some of the existing finite element modelling work on joint structures. Experimental modal analysis is the next step, to obtain the modal parameters (natural frequency and mode shape) used to validate and reduce the discrepancy between the experimental results and their simulation counterparts. Model updating is then carried out to minimize the differences between the two sets of results. There are two methods of model updating: the direct method and the iterative method. Sensitivity analysis is employed using SOL200 in NASTRAN, selecting suitable updating parameters to avoid ill-conditioning problems. It is best to consider both geometrical and material properties in the updating procedure rather than choosing only a number of geometrical properties alone. The iterative method was chosen as the best model updating procedure because the physical meaning of the updated parameters is guaranteed, although it requires more computational effort compared to the direct method.

  14. Working memory updating occurs independently of the need to maintain task-context: accounting for triggering updating in the AX-CPT paradigm.

    Science.gov (United States)

    Kessler, Yoav; Baruchin, Liad J; Bouhsira-Sabag, Anat

    2017-01-01

    Theoretical models suggest that maintenance and updating are two functional states of working memory (WM), which are controlled by a gate between perceptual information and WM representations. Opening the gate enables updating WM with input, while closing it enables keeping the maintained information shielded from interference. However, it is still unclear when gate opening takes place, and what is the external signal that triggers it. A version of the AX-CPT paradigm was used to examine a recent proposal in the literature, suggesting that updating is triggered whenever the maintenance of the context is necessary for task performance (context-dependent tasks). In four experiments using this paradigm, we show that (1) a task-switching cost takes place in both context-dependent and context-independent trials; (2) task-switching is additive to the dependency effect, and (3) unlike switching cost, the dependency effect is not affected by preparation and, therefore, does not reflect context-updating. We suggest that WM updating is likely to be triggered by a simple mechanism that occurs in each trial of the task regardless of whether maintaining the context is needed or not. The implications for WM updating and its relationship to task-switching are discussed.

  15. Update-in-Place Analysis for True Multidimensional Arrays

    Directory of Open Access Journals (Sweden)

    Steven M. Fitzgerald

    1996-01-01

    Applicative languages have been proposed for defining algorithms for parallel architectures because they are implicitly parallel and lack side effects. However, straightforward implementations of applicative-language compilers may induce large amounts of copying to preserve program semantics. The unnecessary copying of data can increase both the execution time and the memory requirements of an application. To eliminate the unnecessary copying of data, the Sisal compiler uses both build-in-place and update-in-place analyses. These optimizations remove unnecessary array copy operations through compile-time analysis. Both build-in-place and update-in-place are based on hierarchical ragged arrays, i.e., the vector-of-vectors array model. Although this array model is convenient for certain applications, many optimizations are precluded, e.g., vectorization. To compensate for this deficiency, new languages, such as Sisal 2.0, have extended array models that allow for both high-level array operations to be performed and efficient implementations to be devised. In this article, we introduce a new method to perform update-in-place analysis that is applicable to arrays stored either in hierarchical or in contiguous storage. Consequently, the array model that is appropriate for an application can be selected without the loss of performance. Moreover, our analysis is more amenable for distributed memory and large software systems.

  16. The choice of leasing companies for automobile fleet updating on the basis of hierarchies analysis method

    OpenAIRE

    Dorohov, А.

    2007-01-01

    The basic criteria for the choice of leasing companies by transport enterprises for automobile fleet updating, such as terms of financing, size of the advance payment, assortment, and time of existence in the market, have been determined. The determination of the best leasing company according to these parameters on the basis of the hierarchies analysis method has been proposed.

  17. Numerical model updating technique for structures using firefly algorithm

    Science.gov (United States)

    Sai Kubair, K.; Mohan, S. C.

    2018-03-01

    Numerical model updating is a technique used for updating the existing experimental models for any structures related to civil, mechanical, automobiles, marine, aerospace engineering, etc. The basic concept behind this technique is updating the numerical models to closely match with experimental data obtained from real or prototype test structures. The present work involves the development of numerical model using MATLAB as a computational tool and with mathematical equations that define the experimental model. Firefly algorithm is used as an optimization tool in this study. In this updating process a response parameter of the structure has to be chosen, which helps to correlate the numerical model developed with the experimental results obtained. The variables for the updating can be either material or geometrical properties of the model or both. In this study, to verify the proposed technique, a cantilever beam is analyzed for its tip deflection and a space frame has been analyzed for its natural frequencies. Both the models are updated with their respective response values obtained from experimental results. The numerical results after updating show that there is a close relationship that can be brought between the experimental and the numerical models.

  18. A Proposed Arabic Handwritten Text Normalization Method

    Directory of Open Access Journals (Sweden)

    Tarik Abu-Ain

    2014-11-01

    Text normalization is an important technique in document image analysis and recognition. It consists of many preprocessing stages, which include slope correction, text padding, skew correction, and straightening of the writing line. In this regard, text normalization plays an important role in many procedures such as text segmentation, feature extraction and character recognition. In the present article, a new method for text baseline detection, straightening, and slant correction for Arabic handwritten texts is proposed. The method comprises a set of sequential steps: first, component segmentation is done, followed by thinning of the component text; then, the direction features of the skeletons are extracted, and the candidate baseline regions are determined. After that, the correct baseline region is selected, and finally, the baselines of all components are aligned with the writing line. The experiments are conducted on the IFN/ENIT benchmark Arabic dataset. The results show that the proposed method has a promising and encouraging performance.

  19. Thread-Level Parallel Indexing of Update Intensive Moving-Object Workloads

    DEFF Research Database (Denmark)

    Sidlauskas, Darius; Ross, Kenneth A.; Jensen, Christian S.

    2011-01-01

    Modern processors consist of multiple cores that each support parallel processing by multiple physical threads, and they offer ample main-memory storage. This paper studies the use of such processors for the processing of update-intensive moving-object workloads that contain very frequent updates as well as contain queries. The non-trivial challenge addressed is that of avoiding contention between long-running queries and frequent updates. Specifically, the paper proposes a grid-based indexing technique. A static grid indexes a near up-to-date snapshot of the data to support queries, while a live...

  20. An empirical Bayes method for updating inferences in analysis of quantitative trait loci using information from related genome scans.

    Science.gov (United States)

    Zhang, Kui; Wiener, Howard; Beasley, Mark; George, Varghese; Amos, Christopher I; Allison, David B

    2006-08-01

    Individual genome scans for quantitative trait loci (QTL) mapping often suffer from low statistical power and imprecise estimates of QTL location and effect. This lack of precision yields large confidence intervals for QTL location, which are problematic for subsequent fine mapping and positional cloning. In prioritizing areas for follow-up after an initial genome scan and in evaluating the credibility of apparent linkage signals, investigators typically examine the results of other genome scans of the same phenotype and informally update their beliefs about which linkage signals in their scan most merit confidence and follow-up via a subjective-intuitive integration approach. A method that acknowledges the wisdom of this general paradigm but formally borrows information from other scans to increase confidence in objectivity would be a benefit. We developed an empirical Bayes analytic method to integrate information from multiple genome scans. The linkage statistic obtained from a single genome scan study is updated by incorporating statistics from other genome scans as prior information. This technique does not require that all studies have an identical marker map or a common estimated QTL effect. The updated linkage statistic can then be used for the estimation of QTL location and effect. We evaluate the performance of our method by using extensive simulations based on actual marker spacing and allele frequencies from available data. Results indicate that the empirical Bayes method can account for between-study heterogeneity, estimate the QTL location and effect more precisely, and provide narrower confidence intervals than results from any single individual study. We also compared the empirical Bayes method with a method originally developed for meta-analysis (a closely related but distinct purpose). In the face of marked heterogeneity among studies, the empirical Bayes method outperforms the comparator.

  1. A PSO Driven Intelligent Model Updating and Parameter Identification Scheme for Cable-Damper System

    Directory of Open Access Journals (Sweden)

    Danhui Dan

    2015-01-01

    The precise measurement of the cable force is very important for monitoring and evaluating the operation status of cable structures such as cable-stayed bridges. The cable system should be installed with lateral dampers to reduce the vibration, which affects the precise measurement of the cable force and other cable parameters. This paper suggests a cable model updating calculation scheme driven by the particle swarm optimization (PSO) algorithm. By first establishing a finite element model that considers the static geometric nonlinearity and stress-stiffening effect, an automatic finite element model updating procedure powered by the PSO algorithm is proposed, with the aim of precisely identifying the cable force and relevant parameters of the cable-damper system. Both numerical case studies and full-scale cable tests indicated that, after two rounds of the updating process, the algorithm can accurately identify the cable force, moment of inertia, and damping coefficient of the cable-damper system.
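    A hedged sketch of the kind of PSO-driven updating loop involved (Python). The taut-string frequency formula, measured frequencies, and parameter bounds are invented for illustration and stand in for the full finite element model of the cable-damper system:

```python
import numpy as np

rng = np.random.default_rng(0)

def pso(objective, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer used to drive model updating (sketch)."""
    lo, hi = np.array(bounds).T
    x = rng.uniform(lo, hi, size=(n_particles, len(lo)))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        g = pbest[pbest_f.argmin()].copy()
    return g, pbest_f.min()

# Hypothetical updating objective: match measured cable natural frequencies with a
# taut-string model f_n = (n / 2L) * sqrt(T / m); parameters are tension T and mass m.
L_cable, measured = 100.0, np.array([0.71, 1.42, 2.13])   # Hz, first three modes (made up)
def objective(params):
    T, m = params
    model = np.arange(1, 4) / (2 * L_cable) * np.sqrt(T / m)
    return np.sum((model - measured) ** 2)

best, err = pso(objective, bounds=[(1e5, 1e7), (10.0, 200.0)])
print("identified tension %.0f N, mass %.1f kg/m, residual %.2e" % (best[0], best[1], err))
```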

  2. Properties of the block BFGS update and its application to the limited-memory block BNS method for unconstrained minimization

    Czech Academy of Sciences Publication Activity Database

    Vlček, Jan; Lukšan, Ladislav

    Online: 02 April 2018. ISSN 1017-1398. R&D Projects: GA ČR GA13-06684S. Institutional support: RVO:67985807. Keywords: Unconstrained minimization * Block variable metric methods * Limited-memory methods * BFGS update * Global convergence * Numerical results. Subject RIV: BA - General Mathematics. OBOR OECD: Applied mathematics. Impact factor: 1.241, year: 2016

  3. Indoor Spatial Updating with Reduced Visual Information

    OpenAIRE

    Legge, Gordon E.; Gage, Rachel; Baek, Yihwa; Bochsler, Tiana M.

    2016-01-01

    Purpose Spatial updating refers to the ability to keep track of position and orientation while moving through an environment. People with impaired vision may be less accurate in spatial updating with adverse consequences for indoor navigation. In this study, we asked how artificial restrictions on visual acuity and field size affect spatial updating, and also judgments of the size of rooms. Methods Normally sighted young adults were tested with artificial restriction of acuity in Mild Blur (S...

  4. Antiretroviral treatment cohort analysis using time-updated CD4 counts: assessment of bias with different analytic methods.

    Directory of Open Access Journals (Sweden)

    Katharina Kranzer

    Survival analysis using time-updated CD4+ counts during antiretroviral therapy is frequently employed to determine the risk of clinical events. The time-point at which the CD4+ count is assumed to change potentially biases effect estimates, but the methods used to estimate this are infrequently reported. This study examined the effect of three different estimation methods: assuming (i) a constant CD4+ count from the date of measurement until the date of the next measurement, (ii) a constant CD4+ count from the midpoint of the preceding interval until the midpoint of the subsequent interval, and (iii) a linear interpolation between consecutive CD4+ measurements to provide additional midpoint measurements. Person-time, tuberculosis rates and hazard ratios by CD4+ stratum were compared using all available CD4+ counts (measurement frequency 1-3 months) and 6-monthly measurements from a clinical cohort. Simulated data were used to compare the extent of bias introduced by these methods. The midpoint method gave the closest fit to person-time spent with low CD4+ counts and for hazard ratios for outcomes, both in the clinical dataset and in the simulated data. The midpoint method presents a simple option to reduce bias in time-updated CD4+ analysis, particularly at low CD4 cell counts and rapidly increasing counts after ART initiation.
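    A small sketch of the midpoint rule for constructing time-updated intervals (Python); the dates and CD4+ values are invented, and the handling of follow-up start and end is simplified:

```python
from datetime import date

def midpoint_intervals(measurements, follow_up_end):
    """Carry each CD4+ value from the midpoint of the preceding gap to the midpoint
    of the following gap (the 'midpoint method' compared in the study).

    `measurements` is a list of (date, cd4) pairs sorted by date; the first value is
    assumed to apply from its own measurement date.
    """
    intervals = []
    for k, (d, cd4) in enumerate(measurements):
        if k == 0:
            start = d
        else:
            prev_d = measurements[k - 1][0]
            start = prev_d + (d - prev_d) / 2
        if k == len(measurements) - 1:
            stop = follow_up_end
        else:
            next_d = measurements[k + 1][0]
            stop = d + (next_d - d) / 2
        intervals.append((start, stop, cd4))
    return intervals

obs = [(date(2020, 1, 1), 150), (date(2020, 4, 1), 230), (date(2020, 10, 1), 320)]
for start, stop, cd4 in midpoint_intervals(obs, follow_up_end=date(2021, 1, 1)):
    print(start, "->", stop, "CD4 =", cd4)
```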

  5. Sensitivity study of a method for updating a finite element model of a nuclear power station cooling tower

    International Nuclear Information System (INIS)

    Billet, L.

    1994-01-01

    The Research and Development Division of Electricite de France is developing a surveillance method of cooling towers involving on-site wind-induced measurements. The method is supposed to detect structural damage in the tower. The damage is identified by tuning a finite element model of the tower on experimental mode shapes and eigenfrequencies. The sensitivity of the method was evaluated through numerical tests. First, the dynamic response of a damaged tower was simulated by varying the stiffness of some area of the model shell (from 1 % to 24 % of the total shell area). Second, the structural parameters of the undamaged cooling tower model were updated in order to make the output of the undamaged model as close as possible to the synthetic experimental data. The updating method, based on the minimization of the differences between experimental modal energies and modal energies calculated by the model, did not detect a stiffness change over less than 3 % of the shell area. Such a sensitivity is thought to be insufficient to detect tower cracks which behave like highly localized defaults. (author). 8 refs., 9 figs., 6 tabs

  6. Evaluation of a proposed optimization method for discrete-event simulation models

    Directory of Open Access Journals (Sweden)

    Alexandre Ferreira de Pinho

    2012-12-01

    Optimization methods combined with computer-based simulation have been utilized in a wide range of manufacturing applications. However, with current technology, these methods exhibit low performance and are typically only able to manipulate a single decision variable at a time. Thus, the objective of this article is to evaluate a proposed optimization method for discrete-event simulation models based on genetic algorithms, which is more efficient in terms of computational time when compared to software packages on the market. It should be emphasized that the response quality of the variables will not be altered; that is, the proposed method will maintain the solutions' effectiveness. The study therefore draws a comparison between the proposed method and a simulation tool that is already available on the market and has been examined in the academic literature. Conclusions are presented, confirming the proposed optimization method's efficiency.

  7. Second-generation speed limit map updating applications

    DEFF Research Database (Denmark)

    Tradisauskas, Nerius; Agerholm, Niels; Juhl, Jens

    2011-01-01

    Intelligent Speed Adaptation is an Intelligent Transport System developed to significantly improve road safety by helping car drivers maintain appropriate driving behaviour. The system works in connection with the speed limits on the road network. It is thus essential to keep the speed limit map used in the Intelligent Speed Adaptation scheme updated. The traditional method of updating speed limit maps on the basis of long-interval observations needed to be replaced by a more efficient speed limit updating tool. In a Danish Intelligent Speed Adaptation trial a web-based tool was therefore ... for map updating should preferably be made on the basis of a commercial map provider, such as Google Maps, and that the real challenge is to oblige road authorities to carry out updates ...

  8. The Updating of Geospatial Base Data

    Science.gov (United States)

    Alrajhi, Muhamad N.; Konecny, Gottfried

    2018-04-01

    Topographic mapping issues concern the area coverage at different scales and the age of the maps. The age of the map is determined by the system of updating. The United Nations (UNGGIM) have attempted to track the global map coverage at various scale ranges, which has greatly improved in recent decades. However, the poor state of updating of base maps is still a global problem. In Saudi Arabia, large-scale mapping is carried out for all urban, suburban and rural areas by aerial surveys. Updating is carried out by remapping every 5 to 10 years. Due to the rapid urban development this is not satisfactory, but faster update methods are foreseen through the use of high-resolution satellite imagery and the improvement of object-oriented geodatabase structures, which will permit various survey technologies to be used to update the photogrammetrically established geodatabases. The long-term goal is to create a geodata infrastructure such as already exists in Great Britain or Germany.

  9. Method of dynamic fuzzy symptom vector in intelligent diagnosis

    International Nuclear Information System (INIS)

    Sun Hongyan; Jiang Xuefeng

    2010-01-01

    To meet the requirement of real-time updating of diagnostic symptoms arising from the accumulation of diagnostic knowledge, and to bridge the large gaps in the units and values of diagnostic symptoms in multi-parameter intelligent diagnosis, the method of the dynamic fuzzy symptom vector is proposed. The concept of the dynamic fuzzy symptom vector is defined. Ontology is used to specify the vector elements, and a vector transmission method based on ontology is built. The changing law of symptom values is analyzed, and a fuzzy normalization method based on fuzzy membership functions is built. An example shows that the dynamic fuzzy symptom vector method efficiently solves the problems of symptom updating and of unifying symptom values and units. (authors)
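    A minimal sketch of fuzzy normalization via membership functions, the step that puts symptoms with different units and ranges on a common [0, 1] scale (Python); the symptom names, membership shapes, and parameter values are hypothetical:

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function mapping a raw symptom value to [0, 1]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# Hypothetical symptom definitions: each raw measurement, whatever its unit and range,
# is normalised through its own membership function for an "abnormally high" grade.
symptom_defs = {
    "coolant_temperature_C": (60, 80, 95, 110),
    "vibration_mm_s":        (2.0, 4.5, 7.0, 9.0),
    "pressure_MPa":          (10.0, 12.0, 14.0, 15.5),
}

raw = {"coolant_temperature_C": 88.0, "vibration_mm_s": 3.1, "pressure_MPa": 14.8}

# The dynamic fuzzy symptom vector: one normalised degree per currently defined symptom.
vector = {name: trapezoid(raw[name], *params) for name, params in symptom_defs.items()}
print(vector)
```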

  10. A Continuously Updated, Global Land Classification Map, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to demonstrate a fully automatic capability for generating a global, high resolution (30 m) land classification map, with continuous updates from...

  11. A Time Domain Update Method for Reservoir History Matching of Electromagnetic Data

    KAUST Repository

    Katterbauer, Klemens

    2014-03-25

    The oil & gas industry has been the backbone of the world's economy in the last century and will continue to be in the decades to come. With increasing demand and conventional reservoirs depleting, new oil industry projects have become more complex and expensive, operating in areas that were previously considered impossible and uneconomical. Therefore, good reservoir management is key to the economic success of complex projects, requiring the incorporation of reliable uncertainty estimates for reliable production forecasts and optimized reservoir exploitation. Reservoir history matching has played a key role here, incorporating production, seismic, electromagnetic and logging data for forecasting the development of reservoirs and their depletion. With the advances of the last decade, electromagnetic techniques, such as crosswell electromagnetic tomography, have enabled engineers to map reservoirs more precisely and understand their evolution. Incorporating the large amount of data efficiently and reducing uncertainty in the forecasts has been one of the key challenges for reservoir management. Computing the conductivity distribution of the field and adjusting parameters in the forecasting process by solving the inverse problem has been a challenge, due to the strong ill-posedness of the inversion problem and the extensive manual calibration required, making it impossible to include in an efficient reservoir history matching forecasting algorithm. In the presented research, we have developed a novel Finite Difference Time Domain (FDTD) based method for incorporating electromagnetic data directly into the reservoir simulator. Based on an extended Archie relationship, EM simulations are performed for both the forecasted and the porosity-saturation-retrieved conductivity parameters, which are incorporated directly into an update step for the reservoir parameters. This novel direct update method has significant advantages, such as that it overcomes the expensive and ill

  12. An iterative stochastic ensemble method for parameter estimation of subsurface flow models

    International Nuclear Information System (INIS)

    Elsheikh, Ahmed H.; Wheeler, Mary F.; Hoteit, Ibrahim

    2013-01-01

    Parameter estimation for subsurface flow models is an essential step for maximizing the value of numerical simulations for future prediction and the development of effective control strategies. We propose the iterative stochastic ensemble method (ISEM) as a general method for parameter estimation based on stochastic estimation of gradients using an ensemble of directional derivatives. ISEM eliminates the need for adjoint coding and deals with the numerical simulator as a black box. The proposed method employs directional derivatives within a Gauss–Newton iteration. The update equation in ISEM resembles the update step in the ensemble Kalman filter; however, the inverse of the output covariance matrix in ISEM is regularized using standard truncated singular value decomposition or Tikhonov regularization. We also investigate the performance of a set of shrinkage-based covariance estimators within ISEM. The proposed method is successfully applied on several nonlinear parameter estimation problems for subsurface flow models. The efficiency of the proposed algorithm is demonstrated by the small size of the utilized ensembles and in terms of error convergence rates.
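
    The sketch below gives a minimal, hypothetical reading of one such iteration: a black-box simulator is perturbed along random directions to build an ensemble of directional derivatives, and a Gauss–Newton-like step is taken with a truncated-SVD pseudo-inverse in the data space. Ensemble size, perturbation scale and truncation threshold are illustrative choices, not the authors' settings.

      import numpy as np

      def isem_step(simulate, m, d_obs, n_ens=20, sigma=0.01, rcond=1e-6):
          """One ISEM-style iteration; simulate maps a parameter vector to predicted data."""
          d0 = simulate(m)
          # Ensemble of random perturbation directions and the resulting output changes.
          dM = sigma * np.random.randn(n_ens, m.size)
          dD = np.array([simulate(m + dm) - d0 for dm in dM])
          # Least-squares sensitivity G with dD ≈ dM @ G.T (ensemble directional derivatives).
          G = np.linalg.lstsq(dM, dD, rcond=None)[0].T
          # Gauss-Newton-like step; the data-space matrix is inverted via truncated SVD.
          H_inv = np.linalg.pinv(G @ G.T, rcond=rcond)
          return m + G.T @ H_inv @ (d_obs - d0)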

  13. An iterative stochastic ensemble method for parameter estimation of subsurface flow models

    KAUST Repository

    Elsheikh, Ahmed H.

    2013-06-01

    Parameter estimation for subsurface flow models is an essential step for maximizing the value of numerical simulations for future prediction and the development of effective control strategies. We propose the iterative stochastic ensemble method (ISEM) as a general method for parameter estimation based on stochastic estimation of gradients using an ensemble of directional derivatives. ISEM eliminates the need for adjoint coding and deals with the numerical simulator as a black box. The proposed method employs directional derivatives within a Gauss-Newton iteration. The update equation in ISEM resembles the update step in the ensemble Kalman filter; however, the inverse of the output covariance matrix in ISEM is regularized using standard truncated singular value decomposition or Tikhonov regularization. We also investigate the performance of a set of shrinkage-based covariance estimators within ISEM. The proposed method is successfully applied on several nonlinear parameter estimation problems for subsurface flow models. The efficiency of the proposed algorithm is demonstrated by the small size of the utilized ensembles and in terms of error convergence rates. © 2013 Elsevier Inc.

  14. An adaptive reentry guidance method considering the influence of blackout zone

    Science.gov (United States)

    Wu, Yu; Yao, Jianyao; Qu, Xiangju

    2018-01-01

    Reentry guidance has been a popular research topic because it is critical for a successful flight. Given that existing guidance methods do not take into account the navigation error accumulated by the Inertial Navigation System (INS) in the blackout zone, in this paper an adaptive reentry guidance method is proposed to obtain the optimal reentry trajectory quickly with the objective of minimum aerodynamic heating rate. The terminal error in position and attitude can also be reduced with the proposed method. In this method, the whole reentry guidance task is divided into two phases, i.e., the trajectory updating phase and the trajectory planning phase. In the first phase, the idea of model predictive control (MPC) is used, and the receding-horizon optimization procedure ensures an optimal trajectory over the next few seconds. In the trajectory planning phase, after the vehicle has flown out of the blackout zone, the optimal reentry trajectory is obtained by online planning to adapt to the navigation information. An effective swarm intelligence algorithm, the pigeon-inspired optimization (PIO) algorithm, is applied to obtain the optimal reentry trajectory in both phases. Compared to the trajectory updating method alone, the proposed method can reduce the terminal error by about 30% considering both position and attitude; in particular, the terminal error in height is almost eliminated. Besides, the PIO algorithm performs better than the particle swarm optimization (PSO) algorithm in both the trajectory updating and trajectory planning phases.
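
    A compact sketch of the pigeon-inspired optimization heuristic referred to above is given below, with its two standard phases (map-and-compass and landmark operators); the cost function, bounds and operator parameters are placeholders, and the guidance-specific formulation of the paper is not reproduced.

      import numpy as np

      def pio_minimize(cost, lb, ub, n_pigeons=30, n_map=60, n_landmark=20, R=0.2, seed=0):
          rng = np.random.default_rng(seed)
          dim = len(lb)
          X = rng.uniform(lb, ub, size=(n_pigeons, dim))
          V = np.zeros_like(X)
          # Map-and-compass phase: pigeons are drawn toward the current global best.
          for t in range(1, n_map + 1):
              best = X[np.argmin([cost(x) for x in X])]
              V = V * np.exp(-R * t) + rng.random((n_pigeons, dim)) * (best - X)
              X = np.clip(X + V, lb, ub)
          # Landmark phase: halve the flock each step and move toward its weighted centre.
          for _ in range(n_landmark):
              f = np.array([cost(x) for x in X])
              order = np.argsort(f)
              X = X[order][: max(2, X.shape[0] // 2)]
              f = f[order][: X.shape[0]]
              w = 1.0 / (f - f.min() + 1e-12)
              centre = (X * w[:, None]).sum(axis=0) / w.sum()
              X = np.clip(X + rng.random(X.shape) * (centre - X), lb, ub)
          return X[np.argmin([cost(x) for x in X])]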

  15. Why, when and how to update a meta-ethnography qualitative synthesis.

    Science.gov (United States)

    France, Emma F; Wells, Mary; Lang, Heidi; Williams, Brian

    2016-03-15

    Meta-ethnography is a unique, systematic, qualitative synthesis approach widely used to provide robust evidence on patient and clinician beliefs and experiences and understandings of complex social phenomena. It can make important theoretical and conceptual contributions to health care policy and practice. Since beliefs, experiences, health care contexts and social phenomena change over time, the continued relevance of the findings from meta-ethnographies cannot be assumed. However, there is little guidance on whether, when and how meta-ethnographies should be updated; Cochrane guidance on updating reviews of intervention effectiveness is unlikely to be fully appropriate. This is the first in-depth discussion on updating a meta-ethnography; it explores why, when and how to update a meta-ethnography. Three main methods of updating the analysis and synthesis are examined. Advantages and disadvantages of each method are outlined, relating to the context, purpose, process and output of the update and the nature of the new data available. Recommendations are made for the appropriate use of each method, and a worked example of updating a meta-ethnography is provided. This article makes a unique contribution to this evolving area of meta-ethnography methodology.

  16. WIMS-D library update

    International Nuclear Information System (INIS)

    2007-05-01

    WIMS-D (Winfrith Improved Multigroup Scheme-D) is the name of a family of software packages for reactor lattice calculations and is one of the few reactor lattice codes in the public domain and available on noncommercial terms. WIMSD-5B has recently been released from the OECD Nuclear Energy Agency Data Bank, and features major improvements in machine portability, as well as incorporating a few minor corrections. This version supersedes WIMS-D/4, which was released by the Winfrith Technology Centre in the United Kingdom for IBM machines and has been adapted for various other computer platforms in different laboratories. The main weakness of the WIMS-D package is the multigroup constants library, which is based on very old data. The relatively good performance of WIMS-D is attributed to a series of empirical adjustments to the multigroup data. However, the adjustments are not always justified on the basis of more accurate and recent experimental measurements. Following the release of new and revised evaluated nuclear data files, it was felt that the performance of WIMS-D could be improved by updating the associated library. The WIMS-D Library Update Project (WLUP) was initiated in the early 1990s with the support of the IAEA. This project consisted of voluntary contributions from a large number of participants. Several benchmarks for testing the library were identified and analysed, the WIMSR module of the NJOY code system was upgraded and the author of NJOY accepted the proposed updates for the official code system distribution. A detailed parametric study was performed to investigate the effects of various data processing input options on the integral results. In addition, the data processing methods for the main reactor materials were optimized. Several partially updated libraries were produced for testing purposes. The final stage of the WLUP was organized as a coordinated research project (CRP) in order to speed up completion of the fully updated library

  17. A Proposed Method for Solving Fuzzy System of Linear Equations

    Directory of Open Access Journals (Sweden)

    Reza Kargar

    2014-01-01

    Full Text Available This paper proposes a new method for solving a fuzzy system of linear equations with a crisp coefficient matrix and a fuzzy or interval right-hand side. Some conditions for the existence of a fuzzy or interval solution of an m×n linear system are derived and a practical algorithm is introduced in detail. The method is based on a linear programming problem. Finally, the applicability of the proposed method is illustrated by some numerical examples.

  18. The PMIPv6-Based Group Binding Update for IoT Devices

    Directory of Open Access Journals (Sweden)

    Jianfeng Guan

    2016-01-01

    Full Text Available Internet of Things (IoT) has been booming with the rapid increase of various wearable devices, vehicle-embedded devices, and so on, and providing effective mobility management for these IoT devices has become a challenge due to the different application scenarios as well as the limited energy and bandwidth. Recently, many researchers have focused on this topic and proposed several solutions based on the combination of IoT features and traditional mobility management protocols, in which most of the schemes take the IoT devices as mobile networks and adopt NEtwork MObility (NEMO) and its variants to provide mobility support. However, these solutions face a heavy signaling cost problem. Since IoT devices are generally combined to realize complex functions, these devices may have similar movement behaviors. Clearly analyzing these characteristics and using them in mobility management will reduce the signaling cost and improve scalability. Motivated by this, we propose a PMIPv6-based group binding update method. In particular, we describe its group creation procedure, analyze its impact on mobility management, and derive its reduction ratio in terms of signaling cost. The final results show that the introduction of group binding update can remarkably reduce the signaling cost.
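
    As a back-of-the-envelope illustration of why grouping binding updates reduces signaling, the sketch below compares per-device updates with a single group update under an assumed cost model; the per-message cost c and per-member overhead e are invented parameters, not the paper's analytical model.

      # Illustrative comparison: N co-moving devices sending individual PMIPv6 binding
      # updates (cost N*c per handover) versus one group binding update (cost roughly
      # c plus a small per-member entry overhead e).
      def signaling_reduction_ratio(n_devices, c=1.0, e=0.05):
          individual = n_devices * c
          group = c + n_devices * e
          return 1.0 - group / individual

      print(signaling_reduction_ratio(50))   # -> 0.93 reduction for 50 co-moving devices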

  19. Review of BEPCo's exploration drilling environmental assessment update

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2009-07-01

    This report was presented in response to a request from the Canada-Nova Scotia Offshore Petroleum Board for advice on the accuracy of an update to an environmental assessment report related to BEPCo's proposed exploratory drilling project offshore of Nova Scotia, which had received approval in June 2005 but was then delayed. BEPCo was seeking approval to proceed with the same project from 2009 to 2015, although the proposed project lease area was half the size of the original proposal. Advice was sought with respect to potential drilling and seismic impacts to the marine environment, any new information since 2005 regarding marine benthic habitat, non-commercial fish species, marine animals, or spawning areas of critical habitat in the area, and whether DFO Science was planning to undertake research in the area during the period in question. Summaries were presented for the new environmental impact information and the new ecosystem information, including updated species at risk information. BEPCo's updated environmental assessment did not include all of the newly available information. Although it was not clear that this new information would change the conclusions of the initial environmental assessment, it would be useful in designing and implementing an Environmental Effects Monitoring program. 21 refs.

  20. A Class of Manifold Regularized Multiplicative Update Algorithms for Image Clustering.

    Science.gov (United States)

    Yang, Shangming; Yi, Zhang; He, Xiaofei; Li, Xuelong

    2015-12-01

    Multiplicative update algorithms are important tools for information retrieval, image processing, and pattern recognition. However, when the graph regularization is added to the cost function, different classes of sample data may be mapped to the same subspace, which leads to the increase of data clustering error rate. In this paper, an improved nonnegative matrix factorization (NMF) cost function is introduced. Based on the cost function, a class of novel graph regularized NMF algorithms is developed, which results in a class of extended multiplicative update algorithms with manifold structure regularization. Analysis shows that in the learning, the proposed algorithms can efficiently minimize the rank of the data representation matrix. Theoretical results presented in this paper are confirmed by simulations. For different initializations and data sets, variation curves of cost functions and decomposition data are presented to show the convergence features of the proposed update rules. Basis images, reconstructed images, and clustering results are utilized to present the efficiency of the new algorithms. Last, the clustering accuracies of different algorithms are also investigated, which shows that the proposed algorithms can achieve state-of-the-art performance in applications of image clustering.
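
    For readers unfamiliar with this family of algorithms, the sketch below shows the widely used multiplicative updates for graph-regularized NMF; the paper's improved cost function and exact update rules may differ from this generic form.

      import numpy as np

      def gnmf_updates(V, W, H, A, lam=0.1, eps=1e-9):
          """One round of multiplicative updates for V ≈ W @ H with a graph penalty on H.

          A is a symmetric affinity matrix over samples (columns of V); lam weights the
          manifold regularization term Tr(H L H^T) with Laplacian L = D - A.
          """
          D = np.diag(A.sum(axis=1))
          W *= (V @ H.T) / (W @ H @ H.T + eps)
          H *= (W.T @ V + lam * H @ A) / (W.T @ W @ H + lam * H @ D + eps)
          return W, H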

  1. LBTool: A stochastic toolkit for leave-based key updates

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2012-01-01

    Quantitative techniques have been successfully employed in the verification of information and communication systems. However, the use of such techniques is still rare in the area of security. In this paper, we present a toolkit that implements transient analysis on a key update method for wireless...... sensor networks. The analysis aims to find out the probability of a network key being compromised at a specific time point, which results in fluctuations over time for a specific key update method called Leave-based key update. For such a problem, the use of current tools is limited in many ways...

  2. CSTT Update: Fuel Quality Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Brosha, Eric L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lujan, Roger W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mukundan, Rangachary [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockward, Tommy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Romero, Christopher J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Stefan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Wilson, Mahlon S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-06

    These are slides from a presentation. The following topics are covered: project background (scope and approach), developing the prototype (timeline), update on intellectual property, analyzer comparisons (improving humidification, stabilizing the baseline, applying clean-up strategy, impact of ionomer content and improving clean-up), proposed operating mode, considerations for testing in real-world conditions (Gen 1 analyzer electronics development, testing partner identified, field trial planning), summary, and future work.

  3. Adaptive Fault Detection for Complex Dynamic Processes Based on JIT Updated Data Set

    Directory of Open Access Journals (Sweden)

    Jinna Li

    2012-01-01

    Full Text Available A novel fault detection technique is proposed to explicitly account for the nonlinear, dynamic, and multimodal problems that exist in practical and complex dynamic processes. The just-in-time (JIT) detection method and the k-nearest neighbor (KNN) rule-based statistical process control (SPC) approach are integrated to construct a flexible and adaptive detection scheme for control processes with nonlinear, dynamic, and multimodal cases. The Mahalanobis distance, representing the correlation among samples, is used to simplify and update the raw data set, which is the first merit of this paper. Based on it, the control limit is computed in terms of both the KNN rule and the SPC method, so that we can identify online whether the current data are normal or not. Note that the control limit changes as the database is updated, so that an adaptive fault detection technique is obtained that can effectively eliminate the impact of data drift and shift on the performance of the detection process, which is the second merit of this paper. The efficiency of the developed method is demonstrated by numerical examples and an industrial case.
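
    A minimal sketch of the two ingredients named above (a Mahalanobis distance over a reference set, and a KNN-distance control limit) is given below; the distance definitions, k and quantile are illustrative assumptions rather than the paper's exact scheme.

      import numpy as np

      def mahalanobis(x, X):
          """Mahalanobis distance of a sample x from the reference set X (rows = samples)."""
          mu = X.mean(axis=0)
          cov_inv = np.linalg.pinv(np.cov(X, rowvar=False))
          diff = x - mu
          return float(np.sqrt(diff @ cov_inv @ diff))

      def knn_control_limit(X_ref, k=5, quantile=0.99):
          """Control limit = empirical quantile of each reference sample's mean k-NN distance."""
          d = np.linalg.norm(X_ref[:, None, :] - X_ref[None, :, :], axis=-1)
          np.fill_diagonal(d, np.inf)
          knn_mean = np.sort(d, axis=1)[:, :k].mean(axis=1)
          return np.quantile(knn_mean, quantile)

      def is_fault(x_new, X_ref, limit, k=5):
          """Flag x_new as faulty when its mean k-NN distance exceeds the control limit."""
          d = np.linalg.norm(X_ref - x_new, axis=1)
          return np.sort(d)[:k].mean() > limit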

  4. A Fatigue Crack Size Evaluation Method Based on Lamb Wave Simulation and Limited Experimental Data

    Directory of Open Access Journals (Sweden)

    Jingjing He

    2017-09-01

    Full Text Available This paper presents a systematic and general method for Lamb wave-based crack size quantification using finite element simulations and Bayesian updating. The method consists of the construction of a baseline quantification model using finite element simulation data and Bayesian updating with limited Lamb wave data from the target structure. The baseline model correlates two proposed damage-sensitive features, namely the normalized amplitude and phase change, with the crack length through a response surface model. The two damage-sensitive features are extracted from the first received S0 mode wave package. The model parameters of the baseline model are estimated using finite element simulation data. To account for uncertainties from numerical modeling, geometry, material and manufacturing between the baseline model and the target model, a Bayesian method is employed to update the baseline model with a few measurements acquired from the actual target structure. A rigorous validation is made using in-situ fatigue testing and Lamb wave data from coupon specimens and realistic lap-joint components. The effectiveness and accuracy of the proposed method are demonstrated under different loading and damage conditions.
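
    The sketch below illustrates, under strong simplifying assumptions (Gaussian measurement noise and a single additive bias parameter on the simulation-trained response surface), how a baseline model could be updated with a few measurements by a grid-based Bayesian calculation; the actual baseline model and priors in the paper are richer.

      import numpy as np

      def posterior_bias(baseline, features, cracks_meas, sigma=0.2,
                         grid=np.linspace(-2.0, 2.0, 401)):
          """Grid posterior for an additive bias b in: crack = baseline(features) + b."""
          cracks_meas = np.asarray(cracks_meas, dtype=float)
          log_prior = -0.5 * grid ** 2                        # N(0, 1) prior on the bias
          pred = np.array([baseline(f) for f in features])    # response-surface predictions
          resid = cracks_meas[None, :] - (pred[None, :] + grid[:, None])
          log_like = -0.5 * np.sum((resid / sigma) ** 2, axis=1)
          log_post = log_prior + log_like
          w = np.exp(log_post - log_post.max())
          return grid, w / w.sum()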

  5. Knowledge structure representation and automated updates in intelligent information management systems

    Science.gov (United States)

    Corey, Stephen; Carnahan, Richard S., Jr.

    1990-01-01

    A continuing effort to apply rapid prototyping and Artificial Intelligence techniques to problems associated with projected Space Station-era information management systems is examined. In particular, timely updating of the various databases and knowledge structures within the proposed intelligent information management system (IIMS) is critical to support decision making processes. Because of the significantly large amounts of data entering the IIMS on a daily basis, information updates will need to be automatically performed with some systems requiring that data be incorporated and made available to users within a few hours. Meeting these demands depends first, on the design and implementation of information structures that are easily modified and expanded, and second, on the incorporation of intelligent automated update techniques that will allow meaningful information relationships to be established. Potential techniques are studied for developing such an automated update capability and IIMS update requirements are examined in light of results obtained from the IIMS prototyping effort.

  6. Modified Block Newton method for the lambda modes problem

    Energy Technology Data Exchange (ETDEWEB)

    González-Pintor, S., E-mail: segonpin@isirym.upv.es [Departamento de Ingeniería Química y Nuclear, Universidad Politécnica de Valencia, Camino de Vera 14, 46022 Valencia (Spain); Ginestar, D., E-mail: dginestar@mat.upv.es [Instituto de Matemática Multidisciplinar, Universidad Politécnica de Valencia, Camino de Vera 14, 46022 Valencia (Spain); Verdú, G., E-mail: gverdu@iqn.upv.es [Departamento de Ingeniería Química y Nuclear, Universidad Politécnica de Valencia, Camino de Vera 14, 46022 Valencia (Spain)

    2013-06-15

    Highlights: ► The Modal Method is based on expanding the solution in a set of dominant modes. ► Updating the set of dominant modes improves its performance. ► A Modified Block Newton method, which uses previously calculated modes, is proposed. ► The method exhibits very good local convergence with few iterations. ► Good performance results are also obtained for heavy perturbations. -- Abstract: To study the behaviour of nuclear power reactors it is necessary to solve the time-dependent neutron diffusion equation using either a rectangular mesh for PWR and BWR reactors or a hexagonal mesh for VVER reactors. This problem can be solved by means of a modal method, which uses a set of dominant modes to expand the neutron flux. For transient calculations using the modal method with a moderate number of modes, these modes must be updated each time step to maintain the accuracy of the solution. The mode-updating process is also of interest for studying perturbed configurations of a reactor. A Modified Block Newton method is studied to update the modes. The performance of the Newton method has been tested for a steady-state perturbation analysis of two 2D hexagonal reactors, a perturbed configuration of the IAEA PWR 3D reactor and two configurations associated with a boron dilution transient in a BWR reactor.

  7. Construction Method of Display Proposal for Commodities in Sales Promotion by Genetic Algorithm

    Science.gov (United States)

    Yumoto, Masaki

    In a sales promotion task, a wholesaler prepares and presents a display proposal for commodities in order to negotiate with a retailer's buyers over which commodities they should sell. To automate sales promotion tasks, the proposal has to be constructed according to the target retailer's buyer. However, it is difficult to construct a proposal suitable for the target retail store because of the enormous number of possible combinations of commodities. This paper proposes a construction method based on a genetic algorithm (GA). The proposed method represents initial display proposals for commodities as genes, improves them by GA according to an evaluation value, and rearranges the one with the highest evaluation value according to the classification of commodities. Through a practical experiment, we confirm that the display proposal produced by the proposed method is similar to one constructed by a wholesaler.
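
    A minimal genetic-algorithm sketch for this kind of selection problem is shown below; the binary gene encoding, fitness function and GA parameters are illustrative placeholders, not the evaluation model used in the paper.

      import random

      def ga_select(n_items, fitness, pop_size=40, n_gen=100, p_mut=0.02, seed=0):
          """Evolve 0/1 selection vectors (include/exclude each commodity) by a simple GA."""
          rng = random.Random(seed)
          pop = [[rng.randint(0, 1) for _ in range(n_items)] for _ in range(pop_size)]
          for _ in range(n_gen):
              pop.sort(key=fitness, reverse=True)
              survivors = pop[: pop_size // 2]               # elitist truncation selection
              children = []
              while len(survivors) + len(children) < pop_size:
                  a, b = rng.sample(survivors, 2)
                  cut = rng.randrange(1, n_items)            # one-point crossover
                  child = a[:cut] + b[cut:]
                  children.append([1 - g if rng.random() < p_mut else g for g in child])
              pop = survivors + children
          return max(pop, key=fitness)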

  8. The history of female genital tract malformation classifications and proposal of an updated system.

    Science.gov (United States)

    Acién, Pedro; Acién, Maribel I

    2011-01-01

    A correct classification of malformations of the female genital tract is essential to prevent unnecessary and inadequate surgical operations and to compare reproductive results. An ideal classification system should be based on aetiopathogenesis and should suggest the appropriate therapeutic strategy. We conducted a systematic review of relevant articles found in PubMed, Scopus, Scirus and ISI webknowledge, and analysis of historical collections of 'female genital malformations' and 'classifications'. Of 124 full-text articles assessed for eligibility, 64 were included because they contained original general, partial or modified classifications. All the existing classifications were analysed and grouped. The unification of terms and concepts was also analysed. Traditionally, malformations of the female genital tract have been catalogued and classified as Müllerian malformations due to agenesis, lack of fusion, the absence of resorption and lack of posterior development of the Müllerian ducts. The American Fertility Society classification of the late 1980s included seven basic groups of malformations also considering the Müllerian development and the relationship of the malformations to fertility. Other classifications are based on different aspects: functional, defects in vertical fusion, embryological or anatomical (Vagina, Cervix, Uterus, Adnex and Associated Malformation: VCUAM classification). However, an embryological-clinical classification system seems to be the most appropriate. Accepting the need for a new classification system of genitourinary malformations that considers the experience gained from the application of the current classification systems, the aetiopathogenesis and that also suggests the appropriate treatment, we proposed an update of our embryological-clinical classification as a new system with six groups of female genitourinary anomalies.

  9. Nonlinear model updating applied to the IMAC XXXII Round Robin benchmark system

    Science.gov (United States)

    Kurt, Mehmet; Moore, Keegan J.; Eriten, Melih; McFarland, D. Michael; Bergman, Lawrence A.; Vakakis, Alexander F.

    2017-05-01

    We consider the application of a new nonlinear model updating strategy to a computational benchmark system. The approach relies on analyzing system response time series in the frequency-energy domain by constructing both Hamiltonian and forced and damped frequency-energy plots (FEPs). The system parameters are then characterized and updated by matching the backbone branches of the FEPs with the frequency-energy wavelet transforms of experimental and/or computational time series. The main advantage of this method is that no nonlinearity model is assumed a priori, and the system model is updated solely based on simulation and/or experimental measured time series. By matching the frequency-energy plots of the benchmark system and its reduced-order model, we show that we are able to retrieve the global strongly nonlinear dynamics in the frequency and energy ranges of interest, identify bifurcations, characterize local nonlinearities, and accurately reconstruct time series. We apply the proposed methodology to a benchmark problem, which was posed to the system identification community prior to the IMAC XXXII (2014) and XXXIII (2015) Conferences as a "Round Robin Exercise on Nonlinear System Identification". We show that we are able to identify the parameters of the non-linear element in the problem with a priori knowledge about its position.

  10. WIMS nuclear data library and its updating

    Energy Technology Data Exchange (ETDEWEB)

    Bakhtyar, S; Salahuddin, A; Arshad, M

    1995-10-01

    This report gives a brief overview of the status of reactor physics computer code WIMS-D/4 and its library. It presents the details of WIMS-D/4 Library Update Project (WLUP), initiated by International Atomic Energy Agency (IAEA) with the goal of providing updated nuclear data library to the user of WIMS-D/4. The WLUP was planned to be executed in several stages. In this report the calculations performed for the first stage are presented. A number of benchmarks for light water and heavy water lattices proposed by IAEA have been analysed and the results have been compared with the average of experimental values, the IAEA reference values and the average of calculated results from different international laboratories. (author) 8 figs.

  11. WIMS nuclear data library and its updating

    International Nuclear Information System (INIS)

    Bakhtyar, S.; Salahuddin, A.; Arshad, M.

    1995-10-01

    This report gives a brief overview of the status of reactor physics computer code WIMS-D/4 and its library. It presents the details of WIMS-D/4 Library Update Project (WLUP), initiated by International Atomic Energy Agency (IAEA) with the goal of providing updated nuclear data library to the user of WIMS-D/4. The WLUP was planned to be executed in several stages. In this report the calculations performed for the first stage are presented. A number of benchmarks for light water and heavy water lattices proposed by IAEA have been analysed and the results have been compared with the average of experimental values, the IAEA reference values and the average of calculated results from different international laboratories. (author) 8 figs

  12. Update of Standard Practices for New Method Validation in Forensic Toxicology.

    Science.gov (United States)

    Wille, Sarah M R; Coucke, Wim; De Baere, Thierry; Peters, Frank T

    2017-01-01

    International agreement concerning validation guidelines is important to obtain quality forensic bioanalytical research and routine applications, as it all starts with the reporting of reliable analytical data. Standards for fundamental validation parameters are provided in guidelines such as those from the US Food and Drug Administration (FDA), the European Medicines Agency (EMA), the German-speaking Gesellschaft fur Toxikologie und Forensische Chemie (GTFCH) and the Scientific Working Group of Forensic Toxicology (SWGTOX). These validation parameters include selectivity, matrix effects, method limits, calibration, accuracy and stability, as well as other parameters such as carryover, dilution integrity and incurred sample reanalysis. It is, however, not easy for laboratories to implement these guidelines in practice, as these international guidelines remain nonbinding protocols that depend on the applied analytical technique and that need to be updated according to the analyst's method requirements and the application type. In this manuscript, a review of the current guidelines and literature concerning bioanalytical validation parameters in a forensic context is given and discussed. In addition, suggestions for the experimental set-up, the pros and cons of statistical approaches and adequate acceptance criteria for the validation of bioanalytical applications are given. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  13. Main-Memory Operation Buffering for Efficient R-Tree Update

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Saltenis, Simonas; Biveinis, Laurynas

    2007-01-01

    … the main memory that is indeed available, or do not support some of the standard index operations. Assuming a setting where the index updates need not be written to disk immediately, we propose an R-tree-based indexing technique that does not exhibit any of these drawbacks. This technique exploits the buffering of update operations in main memory as well as the grouping of operations to reduce disk I/O. In particular, operations are performed in bulk so that multiple operations are able to share I/O. The paper presents an analytical cost model that is shown to be accurate by empirical studies…

  14. Parallel main-memory indexing for moving-object query and update workloads

    DEFF Research Database (Denmark)

    Sidlauskas, Darius; Saltenis, Simonas; Jensen, Christian Søndergaard

    2012-01-01

    of supporting the location-related query and update workloads generated by very large populations of such moving objects. This paper presents a main-memory indexing technique that aims to support such workloads. The technique, called PGrid, uses a grid structure that is capable of exploiting the parallelism...... offered by modern processors. Unlike earlier proposals that maintain separate structures for updates and queries, PGrid allows both long-running queries and rapid updates to operate on a single data structure and thus offers up-to-date query results. Because PGrid does not rely on creating snapshots...... on the same current data-store state, PGrid outperforms snapshot-based techniques in terms of both query freshness and CPU cycle-wise efficiency....

  15. A local level set method based on a finite element method for unstructured meshes

    International Nuclear Information System (INIS)

    Ngo, Long Cu; Choi, Hyoung Gwon

    2016-01-01

    A local level set method for unstructured meshes has been implemented by using a finite element method. A least-square weighted residual method was employed for implicit discretization to solve the level set advection equation. By contrast, a direct re-initialization method, which is directly applicable to the local level set method for unstructured meshes, was adopted to re-correct the level set function to become a signed distance function after advection. The proposed algorithm was constructed such that the advection and direct reinitialization steps were conducted only for nodes inside the narrow band around the interface. Therefore, in the advection step, the Gauss–Seidel method was used to update the level set function using a node-by-node solution method. Some benchmark problems were solved by using the present local level set method. Numerical results have shown that the proposed algorithm is accurate and efficient in terms of computational time

  16. A local level set method based on a finite element method for unstructured meshes

    Energy Technology Data Exchange (ETDEWEB)

    Ngo, Long Cu; Choi, Hyoung Gwon [School of Mechanical Engineering, Seoul National University of Science and Technology, Seoul (Korea, Republic of)

    2016-12-15

    A local level set method for unstructured meshes has been implemented by using a finite element method. A least-square weighted residual method was employed for implicit discretization to solve the level set advection equation. By contrast, a direct re-initialization method, which is directly applicable to the local level set method for unstructured meshes, was adopted to re-correct the level set function to become a signed distance function after advection. The proposed algorithm was constructed such that the advection and direct reinitialization steps were conducted only for nodes inside the narrow band around the interface. Therefore, in the advection step, the Gauss–Seidel method was used to update the level set function using a node-by-node solution method. Some benchmark problems were solved by using the present local level set method. Numerical results have shown that the proposed algorithm is accurate and efficient in terms of computational time.

  17. Computer program CDCID: an automated quality control program using CDC update

    International Nuclear Information System (INIS)

    Singer, G.L.; Aguilar, F.

    1984-04-01

    A computer program, CDCID, has been developed in coordination with a quality control program to provide a highly automated method of documenting changes to computer codes at EG and G Idaho, Inc. The method uses the standard CDC UPDATE program in such a manner that updates and their associated documentation are easily made and retrieved in various formats. The method allows each card image of a source program to point to the document which describes it, who created the card, and when it was created. The method described is applicable to the quality control of computer programs in general. The computer program described is executable only on CDC computing systems, but the program could be modified and applied to any computing system with an adequate updating program

  18. A comparison of updating algorithms for large N reduced models

    Energy Technology Data Exchange (ETDEWEB)

    Pérez, Margarita García [Instituto de Física Teórica UAM-CSIC, Universidad Autónoma de Madrid,Nicolás Cabrera 13-15, E-28049-Madrid (Spain); González-Arroyo, Antonio [Instituto de Física Teórica UAM-CSIC, Universidad Autónoma de Madrid,Nicolás Cabrera 13-15, E-28049-Madrid (Spain); Departamento de Física Teórica, C-XI Universidad Autónoma de Madrid,E-28049 Madrid (Spain); Keegan, Liam [PH-TH, CERN,CH-1211 Geneva 23 (Switzerland); Okawa, Masanori [Graduate School of Science, Hiroshima University,Higashi-Hiroshima, Hiroshima 739-8526 (Japan); Core of Research for the Energetic Universe, Hiroshima University,Higashi-Hiroshima, Hiroshima 739-8526 (Japan); Ramos, Alberto [PH-TH, CERN,CH-1211 Geneva 23 (Switzerland)

    2015-06-29

    We investigate Monte Carlo updating algorithms for simulating SU(N) Yang-Mills fields on a single-site lattice, such as for the Twisted Eguchi-Kawai model (TEK). We show that performing only over-relaxation (OR) updates of the gauge links is a valid simulation algorithm for the Fabricius and Haan formulation of this model, and that this decorrelates observables faster than using heat-bath updates. We consider two different methods of implementing the OR update: either updating the whole SU(N) matrix at once, or iterating through SU(2) subgroups of the SU(N) matrix, we find the same critical exponent in both cases, and only a slight difference between the two.

  19. A comparison of updating algorithms for large $N$ reduced models

    CERN Document Server

    Pérez, Margarita García; Keegan, Liam; Okawa, Masanori; Ramos, Alberto

    2015-01-01

    We investigate Monte Carlo updating algorithms for simulating $SU(N)$ Yang-Mills fields on a single-site lattice, such as for the Twisted Eguchi-Kawai model (TEK). We show that performing only over-relaxation (OR) updates of the gauge links is a valid simulation algorithm for the Fabricius and Haan formulation of this model, and that this decorrelates observables faster than using heat-bath updates. We consider two different methods of implementing the OR update: either updating the whole $SU(N)$ matrix at once, or iterating through $SU(2)$ subgroups of the $SU(N)$ matrix, we find the same critical exponent in both cases, and only a slight difference between the two.

  20. An accurate and efficient reliability-based design optimization using the second order reliability method and improved stability transformation method

    Science.gov (United States)

    Meng, Zeng; Yang, Dixiong; Zhou, Huanlin; Yu, Bo

    2018-05-01

    The first order reliability method has been extensively adopted for reliability-based design optimization (RBDO), but it shows inaccuracy in calculating the failure probability with highly nonlinear performance functions. Thus, the second order reliability method is required to evaluate the reliability accurately. However, its application to RBDO is quite challenging owing to the expensive computational cost incurred by the repeated reliability evaluation and Hessian calculation of probabilistic constraints. In this article, a new improved stability transformation method is proposed to search the most probable point efficiently, and the Hessian matrix is calculated by the symmetric rank-one update. The computational capability of the proposed method is illustrated and compared to the existing RBDO approaches through three mathematical and two engineering examples. The comparison results indicate that the proposed method is very efficient and accurate, providing an alternative tool for RBDO of engineering structures.
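
    The symmetric rank-one (SR1) update mentioned above can be written compactly as follows; variable names and the skipping rule are illustrative, not tied to the paper's RBDO formulation.

      import numpy as np

      def sr1_update(B, s, y, tol=1e-8):
          """Update Hessian approximation B given step s and gradient difference y."""
          r = y - B @ s
          denom = float(r @ s)
          if abs(denom) < tol * np.linalg.norm(r) * np.linalg.norm(s):
              return B                      # skip the update when it is numerically unsafe
          return B + np.outer(r, r) / denom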

  1. Asynchronous decentralized method for interconnected electricity markets

    International Nuclear Information System (INIS)

    Huang, Anni; Joo, Sung-Kwan; Song, Kyung-Bin; Kim, Jin-Ho; Lee, Kisung

    2008-01-01

    This paper presents an asynchronous decentralized method to solve the optimization problem of interconnected electricity markets. The proposed method decomposes the optimization problem of combined electricity markets into individual optimization problems. The impact of neighboring markets' information is included in the objective function of the individual market optimization problem by the standard Lagrangian relaxation method. Most decentralized optimization methods use synchronous models of communication to exchange updated market information among markets during the iterative process. In this paper, however, the solutions of the individual optimization problems are coordinated through an asynchronous communication model until they converge to the global optimal solution of combined markets. Numerical examples are presented to demonstrate the advantages of the proposed asynchronous method over the existing synchronous methods. (author)

  2. Cybercrimes: A Proposed Taxonomy and Challenges

    Directory of Open Access Journals (Sweden)

    Harmandeep Singh Brar

    2018-01-01

    Full Text Available Cybersecurity is one of the most important concepts of the cyberworld, as it provides protection to cyberspace from various types of cybercrimes. This paper provides an updated survey of cybersecurity. We survey the security aspects of recent prominent research and categorize recent incidents in the context of various fundamental principles of cybersecurity. We propose a new taxonomy of cybercrime which can cover all types of cyberattacks. We analyze various cyberattacks according to the updated cybercrime taxonomy to identify the challenges in the field of cybersecurity and highlight various research directions as future work in this field.

  3. Proposal for Requirement Validation Criteria and Method Based on Actor Interaction

    Science.gov (United States)

    Hattori, Noboru; Yamamoto, Shuichiro; Ajisaka, Tsuneo; Kitani, Tsuyoshi

    We propose requirement validation criteria and a method based on the interaction between actors in an information system. We focus on the cyclical transitions of one actor's situation against another and clarify observable stimuli and responses based on these transitions. Both actors' situations can be listed in a state transition table, which describes the observable stimuli or responses they send or receive. Examination of the interaction between both actors in the state transition tables enables us to detect missing or defective observable stimuli or responses. Typically, this method can be applied to the examination of the interaction between a resource managed by the information system and its user. As a case study, we analyzed 332 requirement defect reports of an actual system development project in Japan. We found that there was a certain number of defects regarding missing or defective stimuli and responses, which could have been detected with our proposed method had it been used in the requirement definition phase. This means that we can reach a more complete requirement definition with our proposed method.

  4. 78 FR 69805 - Periodic Reporting (Proposals Six Through Nine)

    Science.gov (United States)

    2013-11-21

    ... Postal Service proposes to update its methodology for calculating the costs for Philatelic Sales and the... MODS Operation Groups for Productivity Calculations The Postal Service states that Proposal Eight would... MODS productivity data (TPF or TPH per workhour) for a variety of operation groups related to letter...

  5. Updating QR factorization procedure for solution of linear least squares problem with equality constraints.

    Science.gov (United States)

    Zeb, Salman; Yousaf, Muhammad

    2017-01-01

    In this article, we present a QR updating procedure as a solution approach for the linear least squares problem with equality constraints. We reduce the constrained problem to unconstrained linear least squares and partition it into a small subproblem. The QR factorization of the subproblem is calculated, and then we apply updating techniques to its upper triangular factor R to obtain its solution. We carry out the error analysis of the proposed algorithm to show that it is backward stable. We also illustrate the implementation and accuracy of the proposed algorithm by providing some numerical experiments with particular emphasis on dense problems.
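
    A minimal sketch of the general idea (factor a small subproblem once, then update the factorization when further rows arrive instead of refactorizing) is given below using SciPy's row-insertion routine; it is not the authors' algorithm for the equality-constrained case.

      import numpy as np
      from scipy.linalg import qr, qr_insert, solve_triangular

      def lsq_with_row_update(A_small, b_small, A_new, b_new):
          """Solve min ||Ax - b|| after appending rows (A_new, b_new) via QR updating."""
          Q, R = qr(A_small)                                   # full QR of the small subproblem
          Q, R = qr_insert(Q, R, A_new, A_small.shape[0], which='row')
          b = np.concatenate([b_small, b_new])
          qtb = Q.T @ b
          n = A_small.shape[1]
          return solve_triangular(R[:n, :n], qtb[:n])          # back-substitution for x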

  6. Distributed Cooperative Search Control Method of Multiple UAVs for Moving Target

    Directory of Open Access Journals (Sweden)

    Chang-jian Ru

    2015-01-01

    Full Text Available To reduce the impact of uncertainties caused by unknown motion parameters on the search plan for moving targets and to improve the efficiency of UAV searching, a novel distributed multi-UAV cooperative search control method for moving targets is proposed in this paper. Based on the detection results of onboard sensors, the target probability map is updated using Bayesian theory. A Gaussian target transition probability density function is introduced to calculate the predicted probability of moving-target existence, and then the target probability map can be further updated in real time. A performance index function combining target cost, environment cost, and cooperative cost is constructed, and the cooperative search problem can be transformed into a central optimization problem. To improve computational efficiency, the distributed model predictive control method is presented, and thus the control command of each UAV can be obtained. The simulation results verify that the proposed method can better avoid blind UAV searching and effectively improve the overall efficiency of the team.
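
    The sketch below shows one hypothetical form of the probability-map recursion described: a Bayesian correction of scanned cells after a negative observation, followed by a Gaussian prediction step for the moving target; the sensor probabilities and transition kernel are illustrative assumptions.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def update_probability_map(P, scanned_cells, p_d=0.9, p_f=0.05):
          """Bayes update of cell probabilities after a scan with no detection."""
          P = P.copy()
          for cell in scanned_cells:
              prior = P[cell]
              P[cell] = prior * (1 - p_d) / (prior * (1 - p_d) + (1 - prior) * (1 - p_f))
          return P

      def predict_probability_map(P, sigma=1.0):
          """Spread probability with a Gaussian transition model for the moving target."""
          P = gaussian_filter(P, sigma=sigma, mode='nearest')
          return P / P.sum()                   # renormalize total target probability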

  7. 77 FR 68717 - Updating OSHA Standards Based on National Consensus Standards; Head Protection

    Science.gov (United States)

    2012-11-16

    ..., 1918, and 1926 [Docket No. OSH-2011-0184] RIN 1218-AC65 Updating OSHA Standards Based on National Consensus Standards; Head Protection AGENCY: Occupational Safety and Health Administration (OSHA), Labor. ACTION: Proposed rule; withdrawal. SUMMARY: With this notice, OSHA is withdrawing the proposed rule that...

  8. Key Techniques for Dynamic Updating of National Fundamental Geographic Information Database

    Directory of Open Access Journals (Sweden)

    WANG Donghua

    2015-07-01

    Full Text Available One of the most important missions of fundamental surveying and mapping work is to keep fundamental geographic information up to date. In this respect, the National Administration of Surveying, Mapping and Geoinformation has been carrying out the project of dynamic updating of the national fundamental geographic information database since 2012, which aims to update the 1:50 000, 1:250 000 and 1:1 000 000 national fundamental geographic information databases continuously and quickly, updating and publishing once a year. This paper introduces the general technical approach of dynamic updating, describes the main technical methods, such as dynamic updating of the fundamental database, linked updating of derived databases, and multi-temporal database management and services, and finally introduces the main technical characteristics and engineering applications.

  9. Adapting to change: The role of the right hemisphere in mental model building and updating.

    Science.gov (United States)

    Filipowicz, Alex; Anderson, Britt; Danckert, James

    2016-09-01

    We recently proposed that the right hemisphere plays a crucial role in the processes underlying mental model building and updating. Here, we review the evidence we and others have garnered to support this novel account of right hemisphere function. We begin by presenting evidence from patient work that suggests a critical role for the right hemisphere in the ability to learn from the statistics in the environment (model building) and adapt to environmental change (model updating). We then provide a review of neuroimaging research that highlights a network of brain regions involved in mental model updating. Next, we outline specific roles for particular regions within the network such that the anterior insula is purported to maintain the current model of the environment, the medial prefrontal cortex determines when to explore new or alternative models, and the inferior parietal lobule represents salient and surprising information with respect to the current model. We conclude by proposing some future directions that address some of the outstanding questions in the field of mental model building and updating. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  10. Applicability of the proposed evaluation method for social infrastructures to nuclear power plants

    International Nuclear Information System (INIS)

    Ichimura, Tomiyasu

    2015-01-01

    This study proposes an evaluation method for social infrastructures and verifies its applicability by applying it to nuclear power plants, which are a type of social infrastructure. In the proposed evaluation method for social infrastructures, the authors chose four evaluation viewpoints and proposed common evaluation standards for the evaluation indexes obtained from each viewpoint. By applying this system to the evaluation of nuclear power plants, example evaluation indexes were obtained for each evaluation viewpoint. Furthermore, when the levels of the common evaluation standards of the proposed method were applied to the evaluation of the activities of nuclear power plants based on the regulations, it was confirmed that these activities are at the highest level. Through this application validation, it was clarified that the proposed evaluation method for social infrastructures has a certain effectiveness. The four evaluation viewpoints are 'service,' 'environment,' 'action factor,' and 'operation and management.' Some of the application examples for a nuclear power plant are as follows: (1) in the viewpoint of service: the operation rate of the power plant, and operation costs; and (2) in the viewpoint of environment: external influence related to nuclear waste and radioactivity, and external effects related to cooling water. (A.O.)

  11. A Review of the Extraction and Determination Methods of Thirteen Essential Vitamins to the Human Body: An Update from 2010.

    Science.gov (United States)

    Zhang, Yuan; Zhou, Wei-E; Yan, Jia-Qing; Liu, Min; Zhou, Yu; Shen, Xin; Ma, Ying-Lin; Feng, Xue-Song; Yang, Jun; Li, Guo-Hui

    2018-06-19

    Vitamins are a class of essential nutrients in the body; thus, they play important roles in human health. The chemicals are involved in many physiological functions and both their lack and excess can put health at risk. Therefore, the establishment of methods for monitoring vitamin concentrations in different matrices is necessary. In this review, an updated overview of the main pretreatments and determination methods that have been used since 2010 is given. Ultrasonic assisted extraction, liquid–liquid extraction, solid phase extraction and dispersive liquid–liquid microextraction are the most common pretreatment methods, while the determination methods involve chromatography methods, electrophoretic methods, microbiological assays, immunoassays, biosensors and several other methods. Different pretreatments and determination methods are discussed.

  12. A Review of the Extraction and Determination Methods of Thirteen Essential Vitamins to the Human Body: An Update from 2010

    Directory of Open Access Journals (Sweden)

    Yuan Zhang

    2018-06-01

    Full Text Available Vitamins are a class of essential nutrients in the body; thus, they play important roles in human health. The chemicals are involved in many physiological functions and both their lack and excess can put health at risk. Therefore, the establishment of methods for monitoring vitamin concentrations in different matrices is necessary. In this review, an updated overview of the main pretreatments and determination methods that have been used since 2010 is given. Ultrasonic assisted extraction, liquid–liquid extraction, solid phase extraction and dispersive liquid–liquid microextraction are the most common pretreatment methods, while the determination methods involve chromatography methods, electrophoretic methods, microbiological assays, immunoassays, biosensors and several other methods. Different pretreatments and determination methods are discussed.

  13. Endobronchial Ultrasound (EBUS) - Update 2017.

    Science.gov (United States)

    Darwiche, Kaid; Özkan, Filiz; Wolters, Celina; Eisenmann, Stephan

    2018-02-01

    Endobronchial ultrasound (EBUS) has revolutionized the diagnosis of lung cancer over the last decade. This minimally invasive diagnostic method has also become increasingly important in the case of other diseases such as sarcoidosis, thereby helping to avoid unnecessary diagnostic interventions. This review article provides an update regarding EBUS and discusses current and future developments of this method. © Georg Thieme Verlag KG Stuttgart · New York.

  14. Profile updating for information systems

    International Nuclear Information System (INIS)

    Abrantes, J.F.

    1983-02-01

    Profile updating methods were analysed. A method suited to the characteristics of the system used in the research (SDI/CIN/CNEN), which uses the threshold-and-weights criterion for selection, was determined. Relevance weighting theory was described and experiments to verify precision were carried out. The improvements obtained were good; nevertheless, more significant tests are required to attain more reliable results. (Author) [pt

  15. 77 FR 43018 - Updating OSHA Construction Standards Based on National Consensus Standards; Head Protection...

    Science.gov (United States)

    2012-07-23

    .... OSHA-2011-0184] RIN 1218-AC65 Updating OSHA Construction Standards Based on National Consensus... Health Administration (OSHA), Department of Labor. ACTION: Notice of proposed rulemaking; correction. SUMMARY: OSHA is correcting a notice of proposed rulemaking (NPRM) with regard to the construction...

  16. A general framework for updating belief distributions.

    Science.gov (United States)

    Bissiri, P G; Holmes, C C; Walker, S G

    2016-11-01

    We propose a framework for general Bayesian inference. We argue that a valid update of a prior belief distribution to a posterior can be made for parameters which are connected to observations through a loss function rather than the traditional likelihood function, which is recovered as a special case. Modern application areas make it increasingly challenging for Bayesians to attempt to model the true data-generating mechanism. For instance, when the object of interest is low dimensional, such as a mean or median, it is cumbersome to have to achieve this via a complete model for the whole data distribution. More importantly, there are settings where the parameter of interest does not directly index a family of density functions and thus the Bayesian approach to learning about such parameters is currently regarded as problematic. Our framework uses loss functions to connect information in the data to functionals of interest. The updating of beliefs then follows from a decision theoretic approach involving cumulative loss functions. Importantly, the procedure coincides with Bayesian updating when a true likelihood is known yet provides coherent subjective inference in much more general settings. Connections to other inference frameworks are highlighted.
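
    A minimal sketch of such a loss-based update on a parameter grid is shown below; the absolute-error loss (appropriate for a median-like parameter) and the learning-rate w are illustrative choices, not prescribed by the framework.

      import numpy as np

      def general_bayes_update(theta_grid, prior, data,
                               loss=lambda th, x: abs(x - th), w=1.0):
          """Posterior ∝ prior * exp(-w * cumulative loss), evaluated on a parameter grid."""
          cum_loss = np.array([sum(loss(th, x) for x in data) for th in theta_grid])
          post = prior * np.exp(-w * (cum_loss - cum_loss.min()))
          return post / post.sum()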

  17. Methodological proposal for environmental impact evaluation since different specific methods

    International Nuclear Information System (INIS)

    Leon Pelaez, Juan Diego; Lopera Arango Gabriel Jaime

    1999-01-01

    Some conceptual and practical elements related to environmental impact evaluation are described in connection with the preparation of the technical reports (environmental impact studies and environmental management plans) to be presented to the environmental authorities in order to obtain the environmental permits for development projects. The first part of the document summarizes the main regulatory aspects that underpin environmental impact studies in Colombia. We then propose a scheme for approaching and carrying out the environmental impact evaluation, which begins with the description of the project and of the environmental conditions in its area, proceeds to identify the impacts through a matrix method, and continues with their quantitative evaluation, for which we propose the use of the method developed by Arboleda (1994). We also propose rating the relative importance of the project activities and of the environmental components by means of a method here called agglomerate evaluation, which allows the most impacting activities and the most impacted components to be identified. Lastly, some models are presented for the preparation and presentation of the environmental management plans, the follow-up programmes and those of environmental supervision.

  18. 76 FR 14034 - Proposed Collection; Comment Request; NCI Cancer Genetics Services Directory Web-Based...

    Science.gov (United States)

    2011-03-15

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Proposed Collection; Comment Request; NCI Cancer Genetics Services Directory Web-Based Application Form and Update Mailer Summary: In... Cancer Genetics Services Directory Web-based Application Form and Update Mailer.

  19. Accelerated gradient methods for constrained image deblurring

    International Nuclear Information System (INIS)

    Bonettini, S; Zanella, R; Zanni, L; Bertero, M

    2008-01-01

    In this paper we propose a special gradient projection method for the image deblurring problem, in the framework of the maximum likelihood approach. We present the method in a very general form and we give convergence results under standard assumptions. Then we consider the deblurring problem, and the generality of the proposed algorithm allows us to add an energy conservation constraint to the maximum likelihood problem. In order to improve the convergence rate, we devise appropriate scaling strategies and steplength updating rules, especially designed for this application. The effectiveness of the method is evaluated by means of a computational study on astronomical images corrupted by Poisson noise. Comparisons with standard methods for image restoration, such as the expectation maximization algorithm, are also reported.
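
    As a minimal, hypothetical illustration of a projected gradient step on the Poisson (Kullback–Leibler) data fidelity with a nonnegativity constraint, see the sketch below; the blur operator, its adjoint, the background term and the steplength rule are simplified placeholders rather than the scaling strategies proposed in the paper.

      import numpy as np

      def gradient_projection_step(x, y, A, AT, bg=1e-6, step=1.0):
          """One projected gradient step for Poisson-likelihood deblurring.

          A(x) applies the blur operator, AT applies its adjoint, y is the observed image.
          """
          Ax = A(x) + bg
          grad = AT(1.0 - y / Ax)                   # gradient of sum(Ax + bg - y*log(Ax + bg))
          return np.maximum(x - step * grad, 0.0)   # projection onto the nonnegative orthant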

  20. Aqua/Aura Updated Inclination Adjust Maneuver Performance Prediction Model

    Science.gov (United States)

    Boone, Spencer

    2017-01-01

    This presentation will discuss the updated Inclination Adjust Maneuver (IAM) performance prediction model that was developed for Aqua and Aura following the 2017 IAM series. This updated model uses statistical regression methods to identify potential long-term trends in maneuver parameters, yielding improved predictions when re-planning past maneuvers. The presentation has been reviewed and approved by Eric Moyer, ESMO Deputy Project Manager.

  1. Identifying null meta-analyses that are ripe for updating

    Directory of Open Access Journals (Sweden)

    Fang Manchun

    2003-07-01

    Full Text Available Abstract Background As an increasingly large number of meta-analyses are published, quantitative methods are needed to help clinicians and systematic review teams determine when meta-analyses are not up to date. Methods We propose new methods for determining when non-significant meta-analytic results might be overturned, based on a prediction of the number of participants required in new studies. To guide decision making, we introduce the "new participant ratio", the ratio of the actual number of participants in new studies to the predicted number required to obtain statistical significance. A simulation study was conducted to study the performance of our methods and a real meta-analysis provides further evidence. Results In our three simulation configurations, our diagnostic test for determining whether a meta-analysis is out of date had sensitivity of 55%, 62%, and 49% with corresponding specificity of 85%, 80%, and 90% respectively. Conclusions Simulations suggest that our methods are able to detect out-of-date meta-analyses. These quick and approximate methods show promise for use by systematic review teams to help decide whether to commit the considerable resources required to update a meta-analysis. Further investigation and evaluation of the methods is required before they can be recommended for general use.
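
    As a small illustration of the "new participant ratio", the snippet below simply forms the ratio; the predicted number of participants required would come from the authors' prediction method and is passed in here as an assumed value.

        def new_participant_ratio(n_new_participants, n_predicted_required):
            """Ratio of participants actually accrued in new studies to the number
            predicted to be needed to overturn a non-significant pooled result."""
            return n_new_participants / float(n_predicted_required)

        # Hypothetical numbers: 850 new participants accrued, 700 predicted as required.
        ratio = new_participant_ratio(850, 700)
        if ratio >= 1.0:
            print(f"ratio = {ratio:.2f}: enough new evidence may have accrued; consider updating")
        else:
            print(f"ratio = {ratio:.2f}: the predicted participant requirement has not been reached")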

  2. Update to the R33 cross section file format

    International Nuclear Information System (INIS)

    Vickridge, I.C.

    2003-01-01

    In September 1991, in response to the workshop on cross sections for Ion Beam Analysis (IBA) held in Namur (July 1991, Nuclear Instruments and Methods B66(1992)), a simple ascii format was proposed to facilitate transfer and collation of nuclear reaction cross section data for Ion Beam Analysis (IBA) and especially for Nuclear Reaction Analysis (NRA). Although intended only as a discussion document, the ascii format - referred to as the R33 (Report 33) format - has become a de facto standard. In the decade since this first proposal there have been spectacular advances in computing power and in software usability; however, the cross-platform compatibility of the ascii character set has ensured that the need for an ascii format remains. Nuclear reaction cross section data for Nuclear Reaction Analysis has been collected and archived on internet web sites over the last decade. This data has largely been entered in the R33 format, although there is a series of elastic cross sections that are expressed as the ratio to the corresponding Rutherford cross sections that have been entered in a format referred to as RTR (ratio to Rutherford). During this time the R33 format has been modified and added to - firstly to take into account angular distributions, which were not catered for in the first proposal, and more recently to cater for elastic cross sections expressed as the ratio-to-Rutherford, which it is useful to have for some elastic scattering programs. It is thus timely to formally update the R33 format. There also exist the large nuclear cross section data collections of the Nuclear Data Network - of which the core centres are the OECD NEA Nuclear Data Bank, the IAEA Nuclear Data Section, the Brookhaven National Laboratory National Nuclear Data Centre and CJD IPPE Obninsk, Russia. The R33 format is now proposed to become a legal computational format for the NDN. It is thus also necessary to provide an updated formal definition of the R33 format in order to provide

  3. Updated requirements for control room annunciation: an operations perspective

    International Nuclear Information System (INIS)

    Davey, E.; Lane, L.

    2001-01-01

    The purpose of this paper is to describe the results of updating and aligning requirements for annunciation functionality and performance with current expectations for operational excellence. This redefinition of annunciation requirements was undertaken as one component of a project to characterize improvement priorities, establish the operational and economic basis for improvement, and identify preferred implementation options for Ontario Power Generation plants. The updated requirements express the kinds of information support that annunciation should provide to Operations staff for the detection and recognition of, and response to, changes in plant conditions. The updated requirements were developed using several types of information: management and industry expectations for operations excellence, previous definitions of user needs for annunciation, and operational and ergonomic principles. Operations and engineering staff at several stations have helped refine and complete the initial requirements definition. Application of these updated requirements is expected to lead to more effective and task relevant annunciation system improvements that better serve plant operation needs. The paper outlines the project rationale, reviews development objectives, discusses the approaches applied for requirements definition and organization, describes key requirements findings in relation to current operations experience, and discusses the proposed application of these requirements for guiding future annunciation system improvements. (author)

  4. An improved design method of a tuned mass damper for an in-service footbridge

    Science.gov (United States)

    Shi, Weixing; Wang, Liangkun; Lu, Zheng

    2018-03-01

    The tuned mass damper (TMD) has a wide range of applications in the vibration control of footbridges. However, the traditional engineering design method may lead to a mistuned TMD. In this paper, an improved TMD design method based on model updating is proposed. Firstly, the original finite element model (FEM) is studied and the natural characteristics of the in-service or newly built footbridge are identified by field tests, and then the original FEM is updated. The TMD is designed according to the newly updated FEM and optimized according to simulations of its vibration control effects. Finally, the installation and field measurement of the TMD are carried out. The improved design method can be applied to both in-service and newly built footbridges, and this paper illustrates it with an engineering example. The frequencies identified from the field test and from the original FEM show a relatively large difference between them. The TMD designed according to the updated FEM has a better vibration control effect than the TMD designed according to the original FEM. The site test results show that the TMD effectively controls human-induced vibrations.
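
    For orientation, the classical Den Hartog tuning rules below show the kind of starting values a TMD design uses for a given mass ratio; the paper's improvement lies in deriving the target frequency from the updated FEM rather than the original one, which is not reproduced here. The modal mass, frequency, and mass ratio are illustrative.

        import math

        def den_hartog_tmd(modal_mass, structure_freq_hz, mass_ratio=0.01):
            """Classical Den Hartog TMD parameters for a given modal mass and frequency.
            mass_ratio is the TMD mass divided by the modal mass of the footbridge mode."""
            mu = mass_ratio
            f_opt = 1.0 / (1.0 + mu)                                   # optimal frequency ratio
            zeta_opt = math.sqrt(3.0 * mu / (8.0 * (1.0 + mu) ** 3))   # optimal damping ratio
            m_d = mu * modal_mass
            f_d = f_opt * structure_freq_hz
            k_d = m_d * (2.0 * math.pi * f_d) ** 2                     # TMD spring stiffness
            c_d = 2.0 * zeta_opt * m_d * (2.0 * math.pi * f_d)         # TMD damping coefficient
            return m_d, k_d, c_d

        # Hypothetical updated-FEM values: modal mass 40 t, vertical mode at 1.9 Hz.
        print(den_hartog_tmd(40000.0, 1.9, mass_ratio=0.02))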

  5. Proposed frustrated-total-reflection acoustic sensing method

    International Nuclear Information System (INIS)

    Hull, J.R.

    1981-01-01

    Modulation of electromagnetic energy transmission through a frustrated-total-reflection device by pressure-induced changes in the index of refraction is proposed for use as an acoustic detector. Maximum sensitivity occurs for angles of incidence near the critical angle. The minimum detectable pressure in air is limited by Brownian noise. Acoustic propagation losses and diffraction of the optical beam by the acoustic signal limit the minimum acoustic wavelength to lengths of the order of the spatial extent of the optical beam. The response time of the method is fast enough to follow individual acoustic waves

  6. Working Memory Updating Latency Reflects the Cost of Switching between Maintenance and Updating Modes of Operation

    Science.gov (United States)

    Kessler, Yoav; Oberauer, Klaus

    2014-01-01

    Updating and maintenance of information are 2 conflicting demands on working memory (WM). We examined the time required to update WM (updating latency) as a function of the sequence of updated and not-updated items within a list. Participants held a list of items in WM and updated a variable subset of them in each trial. Four experiments that vary…

  7. 78 FR 58985 - Proposed Amendments to the Water Quality Regulations, Water Code and Comprehensive Plan To Update...

    Science.gov (United States)

    2013-09-25

    ..., Water Code and Comprehensive Plan to update stream quality objectives (also called ``water quality..., to Commission Secretary at 609-883-9522; if by U.S. Mail, to Commission Secretary, DRBC, P.O. Box...-7203. SUPPLEMENTARY INFORMATION: Background. The Commission in 1967 assigned stream quality objectives...

  8. Efficient model learning methods for actor-critic control.

    Science.gov (United States)

    Grondman, Ivo; Vaandrager, Maarten; Buşoniu, Lucian; Babuska, Robert; Schuitema, Erik

    2012-06-01

    We propose two new actor-critic algorithms for reinforcement learning. Both algorithms use local linear regression (LLR) to learn approximations of the functions involved. A crucial feature of the algorithms is that they also learn a process model, and this, in combination with LLR, provides an efficient policy update for faster learning. The first algorithm uses a novel model-based update rule for the actor parameters. The second algorithm does not use an explicit actor but learns a reference model which represents a desired behavior, from which desired control actions can be calculated using the inverse of the learned process model. The two novel methods and a standard actor-critic algorithm are applied to the pendulum swing-up problem, in which the novel methods achieve faster learning than the standard algorithm.
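
    A minimal sketch of the local linear regression ingredient, under our own assumptions about state and action dimensions: (state, action, next-state) samples are stored and, for a query, an affine model is fitted to the k nearest samples. The actor and critic update rules of the two proposed algorithms are not reproduced.

        import numpy as np

        class LLRModel:
            """k-nearest-neighbour local linear regression of next state from (state, action)."""
            def __init__(self, k=10):
                self.k = k
                self.inputs, self.targets = [], []

            def add_sample(self, state, action, next_state):
                self.inputs.append(np.concatenate([state, action]))
                self.targets.append(np.asarray(next_state))

            def predict(self, state, action):
                x = np.concatenate([state, action])
                X = np.asarray(self.inputs)
                Y = np.asarray(self.targets)
                idx = np.argsort(np.linalg.norm(X - x, axis=1))[: self.k]
                Xk = np.hstack([X[idx], np.ones((len(idx), 1))])   # affine local model
                beta, *_ = np.linalg.lstsq(Xk, Y[idx], rcond=None)
                return np.append(x, 1.0) @ beta

        # Hypothetical 1-D pendulum-like data: next angle = angle + 0.1 * action (plus noise).
        rng = np.random.default_rng(2)
        model = LLRModel(k=15)
        for _ in range(200):
            s, a = rng.uniform(-1, 1, size=1), rng.uniform(-1, 1, size=1)
            model.add_sample(s, a, s + 0.1 * a + 0.01 * rng.normal(size=1))
        print(model.predict(np.array([0.3]), np.array([0.5])))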

  9. FRMAC Updates

    International Nuclear Information System (INIS)

    Mueller, P.

    1995-01-01

    This talk describes updates to the following FRMAC publications and tools concerning radiation emergencies: the Monitoring and Analysis Manual; the Evaluation and Assessment Manual; the Handshake Series (biannual), including exercises participated in; the Environmental Data and Instrument Transmission System (EDITS); Plume in a Box, with all radiological data stored on a hand-held computer; and courses given

  10. Pseudodynamic systems approach based on a quadratic approximation of update equations for diffuse optical tomography.

    Science.gov (United States)

    Biswas, Samir Kumar; Kanhirodan, Rajan; Vasu, Ram Mohan; Roy, Debasish

    2011-08-01

    We explore a pseudodynamic form of the quadratic parameter update equation for diffuse optical tomographic reconstruction from noisy data. A few explicit and implicit strategies for obtaining the parameter updates via a semianalytical integration of the pseudodynamic equations are proposed. Despite the ill-posedness of the inverse problem associated with diffuse optical tomography, adoption of the quadratic update scheme combined with the pseudotime integration appears not only to yield higher convergence, but also a muted sensitivity to the regularization parameters, which include the pseudotime step size for integration. These observations are validated through reconstructions with both numerically generated and experimentally acquired data.

  11. Updating the Psoriatic Arthritis (PsA) Core Domain Set

    DEFF Research Database (Denmark)

    Orbai, Ana-Maria; de Wit, Maarten; Mease, Philip J

    2017-01-01

    OBJECTIVE: To include the patient perspective in accordance with the Outcome Measures in Rheumatology (OMERACT) Filter 2.0 in the updated Psoriatic Arthritis (PsA) Core Domain Set for randomized controlled trials (RCT) and longitudinal observational studies (LOS). METHODS: At OMERACT 2016, research...... conducted to update the PsA Core Domain Set was presented and discussed in breakout groups. The updated PsA Core Domain Set was voted on and endorsed by OMERACT participants. RESULTS: We conducted a systematic literature review of domains measured in PsA RCT and LOS, and identified 24 domains. We conducted...... and breakout groups at OMERACT 2016 in which findings were presented and discussed. The updated PsA Core Domain Set endorsed with 90% agreement by OMERACT 2016 participants included musculoskeletal disease activity, skin disease activity, fatigue, pain, patient's global assessment, physical function, health...

  12. Grey Forecast Rainfall with Flow Updating Algorithm for Real-Time Flood Forecasting

    Directory of Open Access Journals (Sweden)

    Jui-Yi Ho

    2015-04-01

    Full Text Available The dynamic relationship between watershed characteristics and rainfall-runoff has been widely studied in recent decades. Since watershed rainfall-runoff is a non-stationary process, most deterministic flood forecasting approaches are ineffective without the assistance of adaptive algorithms. The purpose of this paper is to propose an effective flow forecasting system that integrates a rainfall forecasting model, a watershed runoff model, and a real-time updating algorithm. This study adopted a grey rainfall forecasting technique, based on existing hourly rainfall data. A geomorphology-based runoff model, which can simulate the impacts of changing geo-climatic conditions on the hydrologic response of an unsteady, non-linear watershed system, was combined with a flow updating algorithm to estimate watershed runoff from measured flow data. The proposed flood forecasting system was applied to three watersheds; one in the United States and two in Northern Taiwan. Four sets of rainfall-runoff simulations were performed to test the accuracy of the proposed flow forecasting technique. The results indicated that the forecast and observed hydrographs are in good agreement for all three watersheds. The proposed flood forecasting system could assist authorities in minimizing loss of life and property during flood events.
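
    The grey forecasting component is typically a GM(1,1) model; the sketch below is a textbook GM(1,1) fit and one-step-ahead forecast on hypothetical hourly rainfall, not the paper's full integrated forecasting system.

        import numpy as np

        def gm11_forecast(x0, steps=1):
            """Textbook GM(1,1): fit on the positive series x0 and forecast `steps` ahead."""
            x0 = np.asarray(x0, dtype=float)
            x1 = np.cumsum(x0)                                  # accumulated generating operation
            z1 = 0.5 * (x1[1:] + x1[:-1])                       # background values
            B = np.column_stack([-z1, np.ones_like(z1)])
            a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]    # developing coefficient a, grey input b
            k = np.arange(len(x0) + steps)
            x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
            x0_hat = np.diff(x1_hat, prepend=x1_hat[0])
            x0_hat[0] = x0[0]
            return x0_hat[-steps:]

        rainfall = [3.2, 4.1, 5.0, 6.3, 7.9]                    # hypothetical hourly rainfall (mm)
        print("next-hour forecast:", gm11_forecast(rainfall, steps=1))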

  13. Disruption of the Right Temporoparietal Junction Impairs Probabilistic Belief Updating.

    Science.gov (United States)

    Mengotti, Paola; Dombert, Pascasie L; Fink, Gereon R; Vossel, Simone

    2017-05-31

    Generating and updating probabilistic models of the environment is a fundamental modus operandi of the human brain. Although crucial for various cognitive functions, the neural mechanisms of these inference processes remain to be elucidated. Here, we show the causal involvement of the right temporoparietal junction (rTPJ) in updating probabilistic beliefs and we provide new insights into the chronometry of the process by combining online transcranial magnetic stimulation (TMS) with computational modeling of behavioral responses. Female and male participants performed a modified location-cueing paradigm, where false information about the percentage of cue validity (%CV) was provided in half of the experimental blocks to prompt updating of prior expectations. Online double-pulse TMS over rTPJ 300 ms (but not 50 ms) after target appearance selectively decreased participants' updating of false prior beliefs concerning %CV, reflected in a decreased learning rate of a Rescorla-Wagner model. Online TMS over rTPJ also impacted on participants' explicit beliefs, causing them to overestimate %CV. These results confirm the involvement of rTPJ in updating of probabilistic beliefs, thereby advancing our understanding of this area's function during cognitive processing. SIGNIFICANCE STATEMENT Contemporary views propose that the brain maintains probabilistic models of the world to minimize surprise about sensory inputs. Here, we provide evidence that the right temporoparietal junction (rTPJ) is causally involved in this process. Because neuroimaging has suggested that rTPJ is implicated in divergent cognitive domains, the demonstration of an involvement in updating internal models provides a novel unifying explanation for these findings. We used computational modeling to characterize how participants change their beliefs after new observations. By interfering with rTPJ activity through online transcranial magnetic stimulation, we showed that participants were less able to update
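
    For reference, the Rescorla-Wagner update used in the computational modelling has the simple form below; the learning rate alpha is the quantity that TMS over rTPJ reduced, and the values shown are purely illustrative.

        def rescorla_wagner_update(belief, outcome, alpha=0.3):
            """One trial of the Rescorla-Wagner rule: move the belief toward the outcome
            by a fraction alpha of the prediction error."""
            return belief + alpha * (outcome - belief)

        # Illustrative block: belief about cue validity, updated trial by trial (1 = valid cue, 0 = invalid).
        belief = 0.5
        for outcome in [1, 1, 0, 1, 1, 1, 0, 1]:
            belief = rescorla_wagner_update(belief, outcome, alpha=0.3)
        print(round(belief, 3))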

  14. Construction and Updating of Event Models in Auditory Event Processing

    Science.gov (United States)

    Huff, Markus; Maurer, Annika E.; Brich, Irina; Pagenkopf, Anne; Wickelmaier, Florian; Papenmeier, Frank

    2018-01-01

    Humans segment the continuous stream of sensory information into distinct events at points of change. Between 2 events, humans perceive an event boundary. Present theories propose changes in the sensory information to trigger updating processes of the present event model. Increased encoding effort finally leads to a memory benefit at event…

  15. UPDATING UNDER RISK CONDITION

    Directory of Open Access Journals (Sweden)

    VĂDUVA CECILIA ELENA

    2018-02-01

    Full Text Available Investment is the foundation of future firm development. Agents are risk averse and require higher returns as the risks associated with a project increase. The investment decision determines the firm's standing in the market, increasing its market share and its ability to dominate the market. Making an investment at a certain point determines the cash flows over the life of the project, and a residual value can be obtained when it is taken out of service. The flows and payments of the investment project can be tracked more easily if a constant discount (update) rate is assumed. Based on various factors, we analyze three techniques for determining the discount rate for investment projects: the opportunity cost; the risk-free rate plus a series of risk premiums; and the weighted average cost of capital. People without financial training make value judgments about investment projects by reference to other market opportunities, comparing the returns that an investment offers with other payment options. An investor with a sum of money to invest will, if not investing in one project, invest in another that brings a certain return, choosing the most advantageous project by comparison; in this view all projects are characterized by identical risks and the agents are considered indifferent to risk. The answer given by financial theory and practice to the shortcomings of opportunity-cost rates is a discount rate calculated as the sum of the risk-free rate and a risk premium, where risk is defined as a factor whose action may cause a decrease in the available cash flows. Opportunity-cost discount rates offer greater objectivity because they refer to known variables, but they cannot be perfectly matched to the performance of the investment process.
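
    A small worked example of the build-up approach mentioned above (risk-free rate plus risk premiums) applied as a constant discount rate to a project's cash flows and residual value; all figures are illustrative.

        # Discount rate as risk-free rate plus risk premiums, applied to project cash flows (illustrative figures).
        risk_free = 0.03
        risk_premiums = [0.02, 0.015]                 # e.g. business-risk and financial-risk premiums
        rate = risk_free + sum(risk_premiums)         # constant discount (update) rate

        investment = 1000.0
        cash_flows = [300.0, 350.0, 400.0, 420.0]     # end-of-year flows over the project's life
        residual_value = 100.0

        npv = -investment
        for t, cf in enumerate(cash_flows, start=1):
            npv += cf / (1.0 + rate) ** t
        npv += residual_value / (1.0 + rate) ** len(cash_flows)
        print(f"discount rate = {rate:.3f}, NPV = {npv:.2f}")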

  16. Lazy Updating of hubs can enable more realistic models by speeding up stochastic simulations

    International Nuclear Information System (INIS)

    Ehlert, Kurt; Loewe, Laurence

    2014-01-01

    To respect the nature of discrete parts in a system, stochastic simulation algorithms (SSAs) must update for each action (i) all part counts and (ii) each action's probability of occurring next and its timing. This makes it expensive to simulate biological networks with well-connected “hubs” such as ATP that affect many actions. Temperature and volume also affect many actions and may be changed significantly in small steps by the network itself during fever and cell growth, respectively. Such trends matter for evolutionary questions, as cell volume determines doubling times and fever may affect survival, both key traits for biological evolution. Yet simulations often ignore such trends and assume constant environments to avoid many costly probability updates. Such computational convenience precludes analyses of important aspects of evolution. Here we present “Lazy Updating,” an add-on for SSAs designed to reduce the cost of simulating hubs. When a hub changes, Lazy Updating postpones all probability updates for reactions depending on this hub, until a threshold is crossed. Speedup is substantial if most computing time is spent on such updates. We implemented Lazy Updating for the Sorting Direct Method and it is easily integrated into other SSAs such as Gillespie's Direct Method or the Next Reaction Method. Testing on several toy models and a cellular metabolism model showed >10× faster simulations for its use-cases—with a small loss of accuracy. Thus we see Lazy Updating as a valuable tool for some special but important simulation problems that are difficult to address efficiently otherwise
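
    A schematic sketch of the Lazy Updating idea: when a designated hub species changes, dependent propensities are not recomputed until the hub's relative drift since the last recomputation crosses a threshold. This is our own simplified illustration; the published add-on is integrated into the Sorting Direct Method, and the threshold and counts below are assumptions.

        class LazyHub:
            """Defer propensity recomputation for reactions depending on a hub species
            until the hub count has drifted by more than `threshold` (relative change)."""
            def __init__(self, count, threshold=0.05):
                self.count = count
                self.count_at_last_update = count
                self.threshold = threshold

            def change(self, delta):
                self.count += delta
                drift = abs(self.count - self.count_at_last_update) / max(self.count_at_last_update, 1)
                return drift > self.threshold          # True -> caller must recompute propensities now

            def mark_updated(self):
                self.count_at_last_update = self.count

        # Illustrative use inside a simulation loop (the recompute hook is a placeholder).
        atp = LazyHub(count=100000, threshold=0.05)
        for delta in [-10, -500, -3000, -2500]:        # ATP consumed by reaction firings
            if atp.change(delta):
                # recompute_propensities(dependent_reactions, atp.count)  # placeholder hook
                atp.mark_updated()
        print(atp.count, atp.count_at_last_update)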

  17. 2016 updated MASCC/ESMO consensus recommendations

    DEFF Research Database (Denmark)

    Roila, Fausto; Warr, David; Hesketh, Paul J

    2017-01-01

    PURPOSE: An update of the recommendations for the prophylaxis of acute and delayed emesis induced by moderately emetogenic chemotherapy published after the last MASCC/ESMO antiemetic consensus conference in 2009 has been carried out. METHODS: A systematic literature search using PubMed from Janua...

  18. Automatic Rapid Updating of ATR Target Knowledge Bases

    National Research Council Canada - National Science Library

    Wells, Barton

    1999-01-01

    .... Methods of comparing infrared images with CAD model renderings, including object detection, feature extraction, object alignment, match quality evaluation, and CAD model updating are researched and analyzed...

  19. 75 FR 28814 - FHA Lender Approval, Annual Renewal, Periodic Updates and Required Reports From FHA Approved Lenders

    Science.gov (United States)

    2010-05-24

    ... proposal. This information is required for: (1) FHA lender approval, (2) Annual renewal of each FHA lender... following information: Title of Proposal: FHA Lender Approval, Annual Renewal, Periodic Updates and Required... and HUD-92001-C. Description of the Need for the Information and Its Proposed Use: This information is...

  20. An Algorithm of Auto-Update Threshold for Singularity Analysis of Pipeline Pressure

    Directory of Open Access Journals (Sweden)

    Jinhai Liu

    2013-01-01

    Full Text Available A precise auto-update threshold algorithm (AUTA), which imitates the short-term memory of the human brain, is proposed to search for singularities in the pipeline pressure signal. According to the characteristics of the pressure signal, the pressure can be divided into two states, known as the non-steady state and the steady state. The AUTA can distinguish these two states and then choose the corresponding method to calculate the dynamic thresholds of pressure variation in real time. Then, the parameters of AUTA are analyzed to determine their values or ranges. Finally, in simulations on actual pressure signals from oil pipelines, the effectiveness of AUTA in estimating the dynamic pressure threshold is verified.
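
    A hedged sketch of a short-memory dynamic threshold in the spirit of AUTA: a sliding window plays the role of short-term memory, the window spread is used to separate steady from non-steady operation, and the threshold band is widened accordingly. The state test and all tuning constants are assumptions, not the published parameter values.

        from collections import deque
        import statistics

        class DynamicThreshold:
            """Sliding-window pressure threshold with separate bands for steady and
            non-steady operation (all tuning constants are illustrative)."""
            def __init__(self, window=60, steady_k=3.0, nonsteady_k=6.0, steady_std=0.02):
                self.memory = deque(maxlen=window)      # short-term memory of recent samples
                self.steady_k, self.nonsteady_k = steady_k, nonsteady_k
                self.steady_std = steady_std

            def update(self, pressure):
                self.memory.append(pressure)
                if len(self.memory) < 2:
                    return None
                mean = statistics.fmean(self.memory)
                std = statistics.pstdev(self.memory)
                k = self.steady_k if std < self.steady_std else self.nonsteady_k
                return mean - k * std, mean + k * std   # band outside which a singularity is flagged

        thr = DynamicThreshold()
        for p in [4.01, 4.02, 4.00, 4.01, 3.99, 4.02, 3.60]:   # last sample simulates a pressure drop
            band = thr.update(p)
        print("current band:", band)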

  1. Memory updating and mental arithmetic

    Directory of Open Access Journals (Sweden)

    Cheng-Ching eHan

    2016-02-01

    Full Text Available Is domain-general memory updating ability predictive of calculation skills or are such skills better predicted by the capacity for updating specifically numerical information? Here, we used multidigit mental multiplication (MMM) as a measure for calculating skill as this operation requires the accurate maintenance and updating of information in addition to skills needed for arithmetic more generally. In Experiment 1, we found that only individual differences with regard to a task updating numerical information following addition (MUcalc) could predict the performance of MMM, perhaps owing to common elements between the task and MMM. In Experiment 2, new updating tasks were designed to clarify this: a spatial updating task with no numbers, a numerical task with no calculation, and a word task. The results showed that both MUcalc and the spatial task were able to predict the performance of MMM but only with the more difficult problems, while other updating tasks did not predict performance. It is concluded that relevant processes involved in updating the contents of working memory support mental arithmetic in adults.

  2. Proposal of a method for evaluating tsunami risk using response-surface methodology

    Science.gov (United States)

    Fukutani, Y.

    2017-12-01

    Information on probabilistic tsunami inundation hazards is needed to define and evaluate tsunami risk. Several methods for calculating these hazards have been proposed (e.g. Løvholt et al. (2012), Thio (2012), Fukutani et al. (2014), Goda et al. (2015)). However, these methods are inefficient, and their calculation cost is high, since they require multiple tsunami numerical simulations, therefore lacking versatility. In this study, we proposed a simpler method for tsunami risk evaluation using response-surface methodology. Kotani et al. (2016) proposed an evaluation method for the probabilistic distribution of tsunami wave-height using a response-surface methodology. We expanded their study and developed a probabilistic distribution of tsunami inundation depth. We set the depth (x1) and the slip (x2) of an earthquake fault as explanatory variables and tsunami inundation depth (y) as the objective variable. Subsequently, tsunami risk could be evaluated by conducting a Monte Carlo simulation, assuming that the generation probability of an earthquake follows a Poisson distribution, the probability distribution of tsunami inundation depth follows the distribution derived from a response-surface, and the damage probability of a target follows a lognormal distribution. We applied the proposed method to a wood building located on the coast of Tokyo Bay. We implemented a regression analysis based on the results of 25 tsunami numerical calculations and developed a response-surface, which was defined as y = a*x1 + b*x2 + c (a = 0.2615, b = 3.1763, c = -1.1802). We assumed appropriate probability distributions for earthquake generation, inundation height, and vulnerability. Based on these probabilistic distributions, we conducted Monte Carlo simulations of 1,000,000 years. We clarified that the expected damage probability of the studied wood building is 22.5%, assuming that an earthquake occurs. The proposed method is therefore a useful and simple way to evaluate tsunami risk using a response
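
    A sketch of the Monte Carlo chain described above, using the reported response-surface coefficients; the Poisson occurrence rate, the distributions of fault depth and slip, and the lognormal fragility parameters are invented for illustration only.

        import numpy as np
        from math import erf, log, sqrt

        rng = np.random.default_rng(3)
        a, b, c = 0.2615, 3.1763, -1.1802            # reported response surface: y = a*x1 + b*x2 + c

        years = 1_000_000
        lam = 1.0 / 500.0                            # assumed annual rate of the scenario earthquake
        n_events = rng.poisson(lam * years)

        x1 = rng.uniform(5.0, 25.0, n_events)        # fault depth x1 (km), assumed range
        x2 = rng.lognormal(mean=1.0, sigma=0.4, size=n_events)   # fault slip x2 (m), assumed distribution
        depth = np.maximum(a * x1 + b * x2 + c, 0.0) # tsunami inundation depth y (m)

        def lognormal_fragility(d, median=2.0, beta=0.5):
            """Assumed lognormal damage fragility for the wood building."""
            if d <= 0.0:
                return 0.0
            return 0.5 * (1.0 + erf((log(d) - log(median)) / (beta * sqrt(2.0))))

        damage = np.array([lognormal_fragility(d) for d in depth])
        print("events:", n_events, "mean damage probability given an event:", damage.mean())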

  3. Query and Update Efficient B+-Tree Based Indexing of Moving Objects

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Lin, Dan; Ooi, Beng Chin

    2004-01-01

    ... are streamed to a database. Indexes for moving objects must support queries efficiently, but must also support frequent updates. Indexes based on minimum bounding regions (MBRs) such as the R-tree exhibit high concurrency overheads during node splitting, and each individual update is known to be quite costly. This motivates the design of a solution that enables the B+-tree to manage moving objects. We represent moving-object locations as vectors that are timestamped based on their update time. By applying a novel linearization technique to these values, it is possible to index the resulting values using a single B+-tree that partitions values according to their timestamp and otherwise preserves spatial proximity. We develop algorithms for range and k nearest neighbor queries, as well as continuous queries. The proposal can be grafted into existing database systems cost effectively. An extensive experimental study explores...

  4. Semi Automated Land Cover Layer Updating Process Utilizing Spectral Analysis and GIS Data Fusion

    Science.gov (United States)

    Cohen, L.; Keinan, E.; Yaniv, M.; Tal, Y.; Felus, A.; Regev, R.

    2018-04-01

    Technological improvements in mass data gathering and analysis made in recent years have influenced the traditional methods of updating and maintaining the national topographic database, and have brought a significant increase in the number of use cases and in the demand for detailed geo-information. Processes whose purpose is to replace traditional data collection methods have been developed in many national mapping and cadastre agencies. There has been significant progress in semi-automated methodologies aiming to facilitate the updating of a national topographic geodatabase, and their implementation is expected to allow a considerable reduction of updating costs and operation times. Our previous activity has focused on automatic building extraction (Keinan, Zilberstein et al, 2015). Before semi-automatic updating methods, it was common for interpreter identification to be as detailed as possible in order to hold the most reliable database. When using semi-automatic updating methodologies, the ability to insert knowledge based on human insight is limited. Our motivation was therefore to reduce this gap by allowing end-users to add their data inputs to the basic geometric database. In this article, we present a simple land cover database updating method which combines insights extracted from the analyzed image with given spatial data from vector layers. The main stages of the proposed practice are multispectral image segmentation and supervised classification, together with geometric fusion of the given vector data, while keeping the amount of shape editing work low. All coding was done using open source software components.

  5. The updating of clinical practice guidelines: insights from an international survey

    Directory of Open Access Journals (Sweden)

    Solà Ivan

    2011-09-01

    Full Text Available Abstract Background Clinical practice guidelines (CPGs) have become increasingly popular, and the methodology to develop guidelines has evolved enormously. However, little attention has been given to the updating process, in contrast to the appraisal of the available literature. We conducted an international survey to identify current practices in CPG updating and explored the need to standardize and improve the methods. Methods We developed a questionnaire (28 items) based on a review of the existing literature about guideline updating and expert comments. We carried out the survey between March and July 2009, and it was sent by email to 106 institutions: 69 members of the Guidelines International Network who declared that they developed CPGs; 30 institutions included in the U.S. National Guideline Clearinghouse database that published more than 20 CPGs; and 7 institutions selected by an expert committee. Results Forty-four institutions answered the questionnaire (42% response rate). In the final analysis, 39 completed questionnaires were included. Thirty-six institutions (92%) reported that they update their guidelines. Thirty-one institutions (86%) have a formal procedure for updating their guidelines, and 19 (53%) have a formal procedure for deciding when a guideline becomes out of date. Institutions describe the process as moderately rigorous (36%) or acknowledge that it could certainly be more rigorous (36%). Twenty-two institutions (61%) alert guideline users on their website when a guideline is older than three to five years or when there is a risk of being outdated. Twenty-five institutions (64%) support the concept of "living guidelines," which are continuously monitored and updated. Eighteen institutions (46%) have plans to design a protocol to improve their guideline-updating process, and 21 (54%) are willing to share resources with other organizations. Conclusions Our study is the first to describe the process of updating CPGs among prominent

  6. Egocentric-updating during navigation facilitates episodic memory retrieval.

    Science.gov (United States)

    Gomez, Alice; Rousset, Stéphane; Baciu, Monica

    2009-11-01

    Influential models suggest that spatial processing is essential for episodic memory [O'Keefe, J., & Nadel, L. (1978). The hippocampus as a cognitive map. London: Oxford University Press]. However, although several types of spatial relations exist, such as allocentric (i.e. object-to-object relations), egocentric (i.e. static object-to-self relations) or egocentric updated on navigation information (i.e. self-to-environment relations in a dynamic way), usually only allocentric representations are described as potentially subserving episodic memory [Nadel, L., & Moscovitch, M. (1998). Hippocampal contributions to cortical plasticity. Neuropharmacology, 37(4-5), 431-439]. This study proposes to confront the allocentric representation hypothesis with an egocentric updated with self-motion representation hypothesis. In the present study, we explored retrieval performance in relation to these two types of spatial processing levels during learning. Episodic remembering has been assessed through Remember responses in a recall and in a recognition task, combined with a "Remember-Know-Guess" paradigm [Gardiner, J. M. (2001). Episodic memory and autonoetic consciousness: A first-person approach. Philosophical Transactions of the Royal Society B: Biological Sciences, 356(1413), 1351-1361] to assess the autonoetic level of responses. Our results show that retrieval performance was significantly higher when encoding was performed in the egocentric-updated condition. Although egocentric updated with self-motion and allocentric representations are not mutually exclusive, these results suggest that egocentric updating processing facilitates remember responses more than allocentric processing. The results are discussed according to Burgess and colleagues' model of episodic memory [Burgess, N., Becker, S., King, J. A., & O'Keefe, J. (2001). Memory for events and their spatial context: models and experiments. Philosophical Transactions of the Royal Society of London. Series B

  7. Central Venous Disease in Hemodialysis Patients: An Update

    Energy Technology Data Exchange (ETDEWEB)

    Modabber, Milad, E-mail: mmodabber@gmail.com [McMaster University, Michael G. DeGroote School of Medicine (Canada); Kundu, Sanjoy [Scarborough Hospital and Scarborough Vascular Ultrasound, The Vein Institute of Toronto (Canada)

    2013-08-01

    Central venous occlusive disease (CVD) is a common concern among the hemodialysis patient population, with the potential to cause significant morbidity. Endovascular management of CVD, comprising percutaneous balloon angioplasty and bare-metal stenting, has been established as a safe alternative to open surgical treatment. However, these available treatments have poor long-term patency, requiring close surveillance and multiple repeat interventions. Recently, covered stents have been proposed and their efficacy assessed for the treatment of recalcitrant central venous stenosis and obstruction. Moreover, newly proposed algorithms for the surgical management of CVD warrant consideration. Here, we seek to provide an updated review of the current literature on the various treatment modalities for CVD.

  8. Vehicle Speed Estimation and Forecasting Methods Based on Cellular Floating Vehicle Data

    Directory of Open Access Journals (Sweden)

    Wei-Kuang Lai

    2016-02-01

    Full Text Available Traffic information estimation and forecasting methods based on cellular floating vehicle data (CFVD) are proposed to analyze the signals (e.g., handovers (HOs), call arrivals (CAs), normal location updates (NLUs) and periodic location updates (PLUs)) from cellular networks. For traffic information estimation, analytic models are proposed to estimate the traffic flow in accordance with the amounts of HOs and NLUs and to estimate the traffic density in accordance with the amounts of CAs and PLUs. Then, the vehicle speeds can be estimated in accordance with the estimated traffic flows and estimated traffic densities. For vehicle speed forecasting, a back-propagation neural network algorithm is considered to predict the future vehicle speed in accordance with the current traffic information (i.e., the estimated vehicle speeds from CFVD). In the experimental environment, this study adopted the practical traffic information (i.e., traffic flow and vehicle speed) from Taiwan Area National Freeway Bureau as the input characteristics of the traffic simulation program and referred to the mobile station (MS) communication behaviors from Chunghwa Telecom to simulate the traffic information and communication records. The experimental results illustrated that the average accuracy of the vehicle speed forecasting method is 95.72%. Therefore, the proposed methods based on CFVD are suitable for an intelligent transportation system.
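
    A schematic of the estimation chain: event counts from the cellular network are converted into flow and density estimates, and speed follows from the fundamental relation speed = flow / density. The constants linking signal counts to vehicles are hypothetical; the paper derives its own analytic models and adds a back-propagation network for forecasting, which is not reproduced.

        def estimate_speed(ho_count, nlu_count, ca_count, plu_count,
                           flow_per_event=1.2, density_per_event=0.8, segment_km=1.0):
            """Rough CFVD-style estimate over one time interval and one road segment.
            flow_per_event and density_per_event are hypothetical calibration constants."""
            flow = flow_per_event * (ho_count + nlu_count)                      # vehicles per hour
            density = density_per_event * (ca_count + plu_count) / segment_km   # vehicles per km
            if density <= 0:
                return None
            return flow / density                                               # km per hour

        # Hypothetical counts collected during one hour on a 1 km freeway segment.
        print(estimate_speed(ho_count=620, nlu_count=180, ca_count=9, plu_count=4))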

  9. A Long-Term Performance Enhancement Method for FOG-Based Measurement While Drilling.

    Science.gov (United States)

    Zhang, Chunxi; Lin, Tie

    2016-07-28

    In the oil industry, measurement-while-drilling (MWD) systems are usually used to provide the real-time position and orientation of the bottom hole assembly (BHA) during drilling. However, present MWD systems based on magnetic surveying technology can barely ensure good performance because of magnetic interference phenomena. In this paper, an MWD surveying system based on a fiber optic gyroscope (FOG) was developed to replace the magnetic surveying system. To accommodate the size constraints of downhole drilling conditions, a new design method is adopted. In order to achieve long-term, high-precision position and orientation surveying, an integrated surveying algorithm is proposed based on an inertial navigation system (INS) and drilling features. In addition, the FOG-based MWD error model is built and the drilling features are analyzed. The state-space system model and the observation update model of the Kalman filter are built. To validate the availability and utility of the algorithm, a semi-physical simulation was conducted under laboratory conditions. Comparison of the results with traditional algorithms shows that the errors are suppressed and that the measurement precision of the proposed algorithm is better than that of the traditional ones. In addition, the proposed method uses far less time than the zero velocity update (ZUPT) method.
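
    As a hedged illustration of the measurement-update step that such an INS-based integration relies on, the snippet below applies a standard Kalman filter update with a zero-velocity pseudo-measurement (a ZUPT-style correction while the drill string is stationary); the state layout, matrices, and noise values are placeholders rather than the paper's actual error model.

        import numpy as np

        def kalman_update(x, P, z, H, R):
            """Standard Kalman measurement update: returns the corrected state and covariance."""
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x_new = x + K @ (z - H @ x)
            P_new = (np.eye(len(x)) - K @ H) @ P
            return x_new, P_new

        # Placeholder 6-state error vector: [position error (3), velocity error (3)].
        x = np.array([0.4, -0.2, 0.1, 0.05, -0.03, 0.02])
        P = np.diag([1.0, 1.0, 1.0, 0.1, 0.1, 0.1])
        H = np.hstack([np.zeros((3, 3)), np.eye(3)])      # the pseudo-measurement observes the velocity-error states
        R = 0.01 * np.eye(3)                              # assumed measurement noise
        z = np.zeros(3)                                   # stationary drill string: velocity should be zero

        x, P = kalman_update(x, P, z, H, R)
        print("corrected velocity error:", x[3:])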

  10. Tank waste remediation system retrieval and disposal mission initial updated baseline summary

    International Nuclear Information System (INIS)

    Swita, W.R.

    1998-01-01

    This document provides a summary of the proposed Tank Waste Remediation System Retrieval and Disposal Mission Initial Updated Baseline (scope, schedule, and cost) developed to demonstrate the Tank Waste Remediation System contractor's Readiness-to-Proceed in support of the Phase 1B mission

  11. Estimation of body fluids with bioimpedance spectroscopy: state of the art methods and proposal of novel methods

    International Nuclear Information System (INIS)

    Buendia, R; Seoane, F; Lindecrantz, K; Bosaeus, I; Gil-Pita, R; Johannsson, G; Ellegård, L; Ward, L C

    2015-01-01

    Determination of body fluids is a useful and common practice in the study of disease mechanisms and treatments. Bioimpedance spectroscopy (BIS) methods are non-invasive, inexpensive and rapid alternatives to reference methods such as tracer dilution. However, they are indirect and their robustness and validity are unclear. In this article, state of the art methods are reviewed, their drawbacks identified and new methods are proposed. All methods were tested on a clinical database of patients receiving growth hormone replacement therapy. Results indicated that most BIS methods are similarly accurate (e.g. <0.5 ± 3.0% mean percentage difference for total body water) for estimation of body fluids. A new model for calculation is proposed that performs equally well for all fluid compartments (total body water, extra- and intracellular water). It is suggested that the main source of error in extracellular water estimation is due to anisotropy, in total body water estimation to the uncertainty associated with intracellular resistivity, and in determination of intracellular water to a combination of both. (paper)

  12. The implementation of a simplified spherical harmonics semi-analytic nodal method in PANTHER

    International Nuclear Information System (INIS)

    Hall, S.K.; Eaton, M.D.; Knight, M.P.

    2013-01-01

    Highlights: ► An SPN nodal method is proposed. ► Consistent CMFD derived and tested. ► Mark vacuum boundary conditions applied. ► Benchmarked against other diffusion and transport codes. - Abstract: In this paper an SPN nodal method is proposed which can utilise existing multi-group neutron diffusion solvers to obtain the solution. The semi-analytic nodal method is used in conjunction with a coarse mesh finite difference (CMFD) scheme to solve the resulting set of equations. This is compared against various nuclear benchmarks to show that the method is capable of computing an accurate solution for practical cases. A few different CMFD formulations are implemented and their performance compared. It is found that the effective diffusion coefficient (EDC) can provide additional stability and require fewer power iterations on a coarse mesh. A re-arrangement of the EDC is proposed that allows the iteration matrix to be computed at the beginning of a calculation. Successive nodal updates only modify the source term, unlike existing CMFD methods, which update the iteration matrix. A set of Mark vacuum boundary conditions are also derived which can be applied to the SPN nodal method, extending its validity. This is possible due to a similarity transformation of the angular coupling matrix, which is used when applying the nodal method. It is found that the Marshak vacuum condition can also be derived, but would require the significant modification of existing neutron diffusion codes to implement it

  13. Reliability residual-life prediction method for thermal aging based on performance degradation

    International Nuclear Information System (INIS)

    Ren Shuhong; Xue Fei; Yu Weiwei; Ti Wenxin; Liu Xiaotian

    2013-01-01

    This paper studies the main pipeline of a nuclear power plant. The residual life of a main pipeline subject to thermal aging is studied using performance degradation theory and Bayesian updating methods. Firstly, the degradation of the impact properties of the main pipeline's austenitic stainless steel under thermal aging is analyzed using accelerated thermal aging test data. Then, a thermal aging residual-life prediction model based on the impact property degradation data is built with Bayesian updating methods. Finally, these models are applied to practical situations. It is shown that the proposed methods are feasible and that the prediction accuracy meets the needs of the project. The work also provides a foundation for the scientific aging management of the main pipeline. (authors)

  14. Preconditioner Updates Applied to CFD Model Problems

    Czech Academy of Sciences Publication Activity Database

    Birken, P.; Duintjer Tebbens, Jurjen; Meister, A.; Tůma, Miroslav

    2008-01-01

    Roč. 58, č. 11 (2008), s. 1628-1641 ISSN 0168-9274 R&D Projects: GA AV ČR 1ET400300415; GA AV ČR KJB100300703 Institutional research plan: CEZ:AV0Z10300504 Keywords : finite volume methods * update preconditioning * Krylov subspace methods * Euler equations * conservation laws Subject RIV: BA - General Mathematics Impact factor: 0.952, year: 2008

  15. 78 FR 65932 - Updating OSHA Standards Based on National Consensus Standards; Signage

    Science.gov (United States)

    2013-11-04

    ...; Signage AGENCY: Occupational Safety and Health Administration (OSHA), Department of Labor. ACTION... accompanied its direct final rule revising its signage standards for general industry and construction. DATES... proposed rule (NPRM) along with the direct final rule (DFR) (see 78 FR 35585) updating its signage...

  16. 75 FR 12251 - Notice of Proposed Information Collection for Public Comment; FHA Lender Approval, Annual Renewal...

    Science.gov (United States)

    2010-03-15

    ... Lenders. OMB Control Number, if applicable: 2502-0005. Description of the need for the information and proposed use: The information is used by FHA to verify that lenders meet all approval, renewal, update and... Information Collection for Public Comment; FHA Lender Approval, Annual Renewal, Periodic Updates and...

  17. 78 FR 17937 - Notice of Proposed Information Collection for Public Comment; FHA Lender Approval, Annual Renewal...

    Science.gov (United States)

    2013-03-25

    ... Lenders. OMB Control Number, if applicable: 2502-0005. Description of the need for the information and proposed use: The information is used by FHA to verify that lenders meet all approval, renewal, update and... Information Collection for Public Comment; FHA Lender Approval, Annual Renewal, Periodic Updates and...

  18. 77 FR 63323 - Notice of Proposed Information Collection for Public Comment; FHA Lender Approval, Annual Renewal...

    Science.gov (United States)

    2012-10-16

    ... Lenders OMB Control Number, if applicable: 2502-0005. Description of the need for the information and proposed use: The information is used by FHA to verify that lenders meet all approval, renewal, update and... Information Collection for Public Comment; FHA Lender Approval, Annual Renewal, Periodic Updates and...

  19. Finite element model updating of concrete structures based on imprecise probability

    Science.gov (United States)

    Biswal, S.; Ramaswamy, A.

    2017-09-01

    Imprecise probability based methods are developed in this study for parameter estimation in finite element model updating of concrete structures when the measurements are imprecisely defined. Bayesian analysis using the Metropolis-Hastings algorithm for parameter estimation is generalized to incorporate the imprecision present in the prior distribution, in the likelihood function, and in the measured responses. Three different cases are considered: (i) imprecision is present in the prior distribution and in the measurements only, (ii) imprecision is present in the parameters of the finite element model and in the measurements only, and (iii) imprecision is present in the prior distribution, in the parameters of the finite element model, and in the measurements. Procedures are also developed for integrating the imprecision in the parameters of the finite element model in the finite element software Abaqus. The proposed methods are then verified against reinforced concrete beams and prestressed concrete beams tested in our laboratory as part of this study.
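
    A minimal sketch of the Metropolis-Hastings estimation that the imprecise-probability extension builds on: a stiffness-like parameter is sampled so that model-predicted responses match measured ones. The forward model, prior, noise level, and data are placeholders; the paper's treatment of imprecision in the prior, the model, and the measurements is not reproduced.

        import numpy as np

        rng = np.random.default_rng(4)

        def forward_model(stiffness):
            """Placeholder for the finite element prediction of the measured responses
            (here: two natural frequencies that scale with sqrt(stiffness))."""
            return np.sqrt(stiffness) * np.array([1.0, 2.6])

        measured = np.array([5.1, 13.0])             # hypothetical measured frequencies (Hz)
        sigma = 0.2                                  # assumed measurement noise (Hz)

        def log_post(k):
            if k <= 0:
                return -np.inf
            log_prior = -0.5 * ((k - 25.0) / 10.0) ** 2          # assumed Gaussian prior on stiffness
            resid = measured - forward_model(k)
            return log_prior - 0.5 * np.sum((resid / sigma) ** 2)

        k, samples = 25.0, []
        for _ in range(20000):
            k_prop = k + rng.normal(scale=1.0)                   # random-walk proposal
            if np.log(rng.uniform()) < log_post(k_prop) - log_post(k):
                k = k_prop
            samples.append(k)
        print("posterior mean stiffness:", np.mean(samples[5000:]))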

  20. Updating OSHA Standards Based on National Consensus Standards; Eye and Face Protection. Final rule.

    Science.gov (United States)

    2016-03-25

    On March 13, 2015, OSHA published in the Federal Register a notice of proposed rulemaking (NPRM) to revise its eye and face protection standards for general industry, shipyard employment, marine terminals, longshoring, and construction by updating the references to national consensus standards approved by the American National Standards Institute (ANSI). OSHA received no significant objections from commenters and therefore is adopting the amendments as proposed. This final rule updates the references in OSHA's eye and face standards to reflect the most recent edition of the ANSI/International Safety Equipment Association (ISEA) eye and face protection standard. It removes the oldest-referenced edition of the same ANSI standard. It also amends other provisions of the construction eye and face protection standard to bring them into alignment with OSHA's general industry and maritime standards.

  1. Imitate or innovate: Competition of strategy updating attitudes in spatial social dilemma games

    Science.gov (United States)

    Danku, Zsuzsa; Wang, Zhen; Szolnoki, Attila

    2018-01-01

    Evolution is based on the assumption that competing players update their strategies to increase their individual payoffs. However, while the applied updating method can be different, most of previous works proposed uniform models where players use identical way to revise their strategies. In this work we explore how imitation-based or learning attitude and innovation-based or myopic best-response attitude compete for space in a complex model where both attitudes are available. In the absence of additional cost the best response trait practically dominates the whole snow-drift game parameter space which is in agreement with the average payoff difference of basic models. When additional cost is involved then the imitation attitude can gradually invade the whole parameter space but this transition happens in a highly nontrivial way. However, the role of competing attitudes is reversed in the stag-hunt parameter space where imitation is more successful in general. Interestingly, a four-state solution can be observed for the latter game which is a consequence of an emerging cyclic dominance between possible states. These phenomena can be understood by analyzing the microscopic invasion processes, which reveals the unequal propagation velocities of strategies and attitudes.

  2. A MU-MIMO CQI estimation method for MU-MIMO UEs in LTE systems

    DEFF Research Database (Denmark)

    Nguyen, Hung Tuan; Kovacs, Istvan

    2012-01-01

    Abstract—This paper addresses a method to estimate the multi user channel quality indicator (CQI) from the reported rank 1 single user CQI in LTE systems. We investigate the relationship between the multi user CQI and the channel condition. Based on that, we propose an updating mechanism where th...

  3. Proposed Sandia frequency shift for anti-islanding detection method based on artificial immune system

    Directory of Open Access Journals (Sweden)

    A.Y. Hatata

    2018-03-01

    Full Text Available Sandia frequency shift (SFS) is one of the active anti-islanding detection methods that depend on frequency drift to detect an islanding condition for inverter-based distributed generation. The non-detection zone (NDZ) of the SFS method depends to a great extent on its parameters. Improper adjusting of these parameters may result in failure of the method. This paper presents a proposed artificial immune system (AIS)-based technique to obtain optimal parameters of the SFS anti-islanding detection method. The immune system is highly distributed, highly adaptive, and self-organizing in nature, maintains a memory of past encounters, and has the ability to continually learn about new encounters. The proposed method generates less total harmonic distortion (THD) than the conventional SFS, which results in faster island detection and a better non-detection zone. The performance of the proposed method is derived analytically and simulated using Matlab/Simulink. Two case studies are used to verify the proposed method. The first case includes a photovoltaic (PV) connected to the grid and the second includes a wind turbine connected to the grid. The deduced optimized parameter setting helps to achieve the "non-islanding inverter" as well as the least potential adverse impact on power quality. Keywords: Anti-islanding detection, Sandia frequency shift (SFS), Non-detection zone (NDZ), Total harmonic distortion (THD), Artificial immune system (AIS), Clonal selection algorithm

  4. Updated Heat Atlas calculation method. Layout of flooded evaporators; Aktualisierte Waermeatlas-Rechenmethode. Auslegung ueberfluteter Verdampfer

    Energy Technology Data Exchange (ETDEWEB)

    Gorenflo, Dieter; Baumhoegger, Elmar; Herres, Gerhard [Paderborn Univ. (Germany). Thermodynamik und Energietechnik; Kotthoff, Stephan [Siemens AG, Goerlitz (Germany)

    2012-07-01

    Accurately predicting the heat transfer performance of evaporators has for years been a topical issue for efficient energy utilization. An established calculation method was updated for the new edition of the Heat Atlas with regard to flooded evaporators, which are used especially in air-conditioning and refrigeration systems. The present contribution outlines this method and discusses the innovations in detail. The description of the influence of heat flux density and boiling pressure on heat transfer during pool boiling on a single horizontal evaporator tube has been modified on the basis of measurements. Above all, the influence of the fluid can now be described more simply and more exactly. The authors compare the predictions with experimental results regarding the finning of the heating surface and the bundle effect. Furthermore, examples of close-boiling and near-azeotropic mixtures were added to the Heat Atlas. The authors also consider the positive effect of the rising bubble swarm when boiling mixtures in horizontal tube bundles.

  5. [Sampling, storage and transport of biological materials collected from living and deceased subjects for determination of concentration levels of ethyl alcohol and similarly acting substances. A proposal of updating the blood and urine sampling protocol].

    Science.gov (United States)

    Wiergowski, Marek; Reguła, Krystyna; Pieśniak, Dorota; Galer-Tatarowicz, Katarzyna; Szpiech, Beata; Jankowski, Zbigniew

    2007-01-01

    The present paper highlights the most common mistakes committed at the beginning of an analytical procedure. To shorten the time and decrease the cost of determining substances with activity similar to that of alcohol, it is proposed to introduce mass-scale screening analysis of saliva collected from the living subject at the site of the event, with all positive results confirmed in blood or urine samples. If no saliva sample is collected for toxicology, a urine sample, allowing a fast screening analysis, and a blood sample, to confirm the result, should be secured. Inappropriate storage of a blood sample in a tube without preservative can cause sample spillage and its irretrievable loss. The authors propose updating the "Blood/urine sampling protocol", with the updated version to be introduced into practice following consultations and revisions.

  6. [Contribution of the cervical vertebral maturation (CVM) method to dentofacial orthopedics: update].

    Science.gov (United States)

    Elhaddaoui, R; Benyahia, H; Azaroual, F; Zaoui, F

    2014-11-01

    The successful orthopedic treatment of skeletal Class II malocclusions is closely related to the reasoned determination of the optimal time to initiate the treatment. This is why various methods have been proposed to assess skeletal maturation, such as a hand-wrist radiograph or the cervical vertebral maturation (CVM) method. The hand-wrist radiograph was up to now the most frequently used method to assess skeletal maturation. However, the clinical and biological limitations of this technique, as well as the need to perform an additional radiograph, were reasons to develop another method to explore the maturation stages of visible cervical vertebrae on a simple lateral cephalometric radiograph. The authors compare the 2 methods and prove the greater contribution of the CVM method compared to the hand-wrist radiograph. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  7. A Proposal of Operational Risk Management Method Using FMEA for Drug Manufacturing Computerized System

    Science.gov (United States)

    Takahashi, Masakazu; Nanba, Reiji; Fukue, Yoshinori

    This paper proposes an operational Risk Management (RM) method using Failure Mode and Effects Analysis (FMEA) for a drug manufacturing computerized system (DMCS). The quality of the drug must not be influenced by failures and operational mistakes of the DMCS. To avoid such situations, sufficient risk assessment has to be conducted on the DMCS and precautions have to be taken. We propose an operational RM method using FMEA for the DMCS. To develop the method, we gathered and compared FMEA results for DMCSs and developed a list that contains failure modes, failures and countermeasures. By applying this list, RM can be conducted in the design phase, and failures can be found and countermeasures taken efficiently. Additionally, some failures that have not been found before can be identified.

  8. A proposal on evaluation method of neutron absorption performance to substitute conventional neutron attenuation test

    International Nuclear Information System (INIS)

    Kim, Je Hyun; Shim, Chang Ho; Kim, Sung Hyun; Choe, Jung Hun; Cho, In Hak; Park, Hwan Seo; Park, Hyun Seo; Kim, Jung Ho; Kim, Yoon Ho

    2016-01-01

    For the verification of newly developed neutron absorbers, one of the guidelines on the qualification and acceptance of neutron absorbers is the neutron attenuation test. However, this approach has a drawback for qualification in that it cannot distinguish how the neutrons are attenuated by the material. In this study, a method for estimating the neutron absorption performance of materials is proposed that detects both directly penetrating and back-scattered neutrons. For the verification of the proposed method, MCNP simulations with the experimental system designed in this study were carried out using polyethylene, iron, normal glass and the vitrified form. The results show that neutron absorption ability can easily be tested using a single-absorber model. Also, the simulation results for single-absorber and double-absorber models verify that the proposed method can evaluate not only the direct thermal neutrons passing through the materials but also the scattered neutrons reflected by the materials. Therefore, the neutron absorption performance can be accurately estimated using the proposed method, in comparison with the conventional neutron attenuation test. It is expected that the proposed method can contribute to increasing the reliability of the performance of neutron absorbers

  9. A proposal on evaluation method of neutron absorption performance to substitute conventional neutron attenuation test

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Je Hyun; Shim, Chang Ho [Dept. of Nuclear Engineering, Hanyang University, Seoul (Korea, Republic of); Kim, Sung Hyun [Nuclear Fuel Cycle Waste Treatment Research Division, Research Reactor Institute, Kyoto University, Osaka (Japan); Choe, Jung Hun; Cho, In Hak; Park, Hwan Seo [Ionizing Radiation Center, Nuclear Fuel Cycle Waste Treatment Research Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Park, Hyun Seo; Kim, Jung Ho; Kim, Yoon Ho [Ionizing Radiation Center, Korea Research Institute of Standards and Science, Daejeon (Korea, Republic of)

    2016-12-15

    For the verification of newly developed neutron absorbers, one of the guidelines on the qualification and acceptance of neutron absorbers is the neutron attenuation test. However, this approach has a limitation for qualification in that it cannot distinguish how the neutrons are attenuated by the material. In this study, an estimation method for the neutron absorption performance of materials is proposed that detects both directly penetrating and back-scattered neutrons. For the verification of the proposed method, MCNP simulations with the experimental system designed in this study were performed using polyethylene, iron, normal glass and a vitrified form. The results show that neutron absorption ability can easily be tested using the single-absorber model. Also, from the simulation results of the single- and double-absorber models, it is verified that the proposed method can evaluate not only the direct thermal neutrons passing through the materials but also the scattered neutrons reflected from them. Therefore, the neutron absorption performance can be estimated more accurately using the proposed method than with the conventional neutron attenuation test. It is expected that the proposed method can contribute to increasing the reliability of the performance of neutron absorbers.

  10. Invited Review Article: Tip modification methods for tip-enhanced Raman spectroscopy (TERS) and colloidal probe technique: A 10 year update (2006-2016) review

    Science.gov (United States)

    Yuan, C. C.; Zhang, D.; Gan, Y.

    2017-03-01

    Engineering atomic force microscopy tips for reliable tip-enhanced Raman spectroscopy (TERS) and the colloidal probe technique is becoming routine practice in many labs. In this 10-year update review, various new tip modification methods developed over the past decade are briefly reviewed to help researchers select the appropriate method. The perspective is placed in a larger context to discuss the opportunities and challenges in this area, including novel combinations of seemingly different methods, potential applications of some methods which were not originally intended for TERS tip fabrication, and the problems of high cost and poor reproducibility of tip fabrication.

  11. Update of CERN exchange network

    CERN Multimedia

    2003-01-01

    An update of the CERN exchange network will be done next April. Disturbances or even interruptions of telephony services may occur from 4th to 24th April during evenings from 18:30 to 00:00 but will not exceed more than 4 consecutive hours (see tentative planning below). CERN divisions are invited to avoid any change requests (set-ups, moves or removals) of telephones and fax machines from 4th to 25th April. Everything will be done to minimize potential inconveniences which may occur during this update. There will be no loss of telephone functionalities. CERN GSM portable phones won't be affected by this change. Should you need more details, please send us your questions by email to Standard.Telephone@cern.ch. Tentative planning (date, change type, affected areas): April 11, update of switch in LHC 4 (LHC 4 Point); April 14, update of switch in LHC 5 (LHC 5 Point); April 15, update of switches in LHC 3 and LHC 2 Points (LHC 3 and LHC 2); April 22, update of switch N4 (Meyrin Ouest); April 23, update of switch N6 (Prévessin Site); Ap...

  12. Genome Update: alignment of bacterial chromosomes

    DEFF Research Database (Denmark)

    Ussery, David; Jensen, Mette; Poulsen, Tine Rugh

    2004-01-01

    There are four new microbial genomes listed in this month's Genome Update, three belonging to Gram-positive bacteria and one belonging to an archaeon that lives at pH 0; all of these genomes are listed in Table 1. The method of genome comparison this month is that of genome alignment and, as an ...

  13. A Performance Evaluation of Online Warehouse Update Algorithms

    Science.gov (United States)

    1998-01-01

    able to present a fully consistent version of the warehouse to the queries while the warehouse is being updated. Multiversioning has been used...LST97]). Specialized multiversion access structures have also been proposed ([LS89, LS90, dBS96, BC97, VV97, MOPW98]). In the context of OLTP systems...collection processes. 2.1 Multiversioning MVNL supports multiple versions by using Time Travel ([Sto87]). Each row has two extra attributes, Tmin

  14. Construction and updating of event models in auditory event processing.

    Science.gov (United States)

    Huff, Markus; Maurer, Annika E; Brich, Irina; Pagenkopf, Anne; Wickelmaier, Florian; Papenmeier, Frank

    2018-02-01

    Humans segment the continuous stream of sensory information into distinct events at points of change. Between 2 events, humans perceive an event boundary. Present theories propose changes in the sensory information to trigger updating processes of the present event model. Increased encoding effort finally leads to a memory benefit at event boundaries. Evidence from reading time studies (increased reading times with increasing amount of change) suggests that updating of event models is incremental. We present results from 5 experiments that studied event processing (including memory formation processes and reading times) using an audio drama as well as a transcript thereof as stimulus material. Experiments 1a and 1b replicated the event boundary advantage effect for memory. In contrast to recent evidence from studies using visual stimulus material, Experiments 2a and 2b found no support for incremental updating with normally sighted and blind participants for recognition memory. In Experiment 3, we replicated Experiment 2a using a written transcript of the audio drama as stimulus material, allowing us to disentangle encoding and retrieval processes. Our results indicate incremental updating processes at encoding (as measured with reading times). At the same time, we again found recognition performance to be unaffected by the amount of change. We discuss these findings in light of current event cognition theories. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  15. Decentralized Consistent Updates in SDN

    KAUST Repository

    Nguyen, Thanh Dang

    2017-04-10

    We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and blackholes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.

  16. Studying the potential impact of automated document classification on scheduling a systematic review update

    Science.gov (United States)

    2012-01-01

    Background: Systematic Reviews (SRs) are an essential part of evidence-based medicine, providing support for clinical practice and policy on a wide range of medical topics. However, producing SRs is resource-intensive, and progress in the research they review leads to SRs becoming outdated, requiring updates. Although the question of how and when to update SRs has been studied, the best method for determining when to update is still unclear, necessitating further research. Methods: In this work we study the potential impact of a machine learning-based automated system for providing alerts when new publications become available within an SR topic. Some of these new publications are especially important, as they report findings that are more likely to initiate a review update. To this end, we have designed a classification algorithm to identify articles that are likely to be included in an SR update, along with an annotation scheme designed to identify the most important publications in a topic area. Using an SR database containing over 70,000 articles, we annotated articles from 9 topics that had received an update during the study period. The algorithm was then evaluated in terms of the overall correct and incorrect alert rate for publications meeting the topic inclusion criteria, as well as in terms of its ability to identify important, update-motivating publications in a topic area. Results: Our initial approach, based on our previous work in topic-specific SR publication classification, identifies over 70% of the most important new publications, while maintaining a low overall alert rate. Conclusions: We performed an initial analysis of the opportunities and challenges in aiding the SR update planning process with an informatics-based machine learning approach. Alerts could be a useful tool in the planning, scheduling, and allocation of resources for SR updates, providing an improvement in timeliness and coverage for the large number of medical topics needing SRs.
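
    The record above describes a topic-specific classifier that flags new publications likely to belong in an SR update. A minimal sketch of that idea, assuming a plain bag-of-words text classifier (the published system's actual features, model, and alert threshold are not reproduced here), might look like this:

```python
# Hedged sketch: score new abstracts for a single SR topic and alert on likely inclusions.
# Training texts, labels, and the 0.5 threshold are illustrative placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "randomized controlled trial of drug X for condition Y",
    "case report of an unrelated adverse event",
    "narrative commentary on health policy",
]
train_labels = [1, 0, 0]   # 1 = was included in a previous version of the review

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2), min_df=1),
                      LogisticRegression(max_iter=1000))
model.fit(train_texts, train_labels)

new_texts = ["new multicentre randomized trial of drug X for condition Y"]
scores = model.predict_proba(new_texts)[:, 1]
alerts = [text for text, p in zip(new_texts, scores) if p > 0.5]
print(alerts)
```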

  17. A last updating evolution model for online social networks

    Science.gov (United States)

    Bu, Zhan; Xia, Zhengyou; Wang, Jiandong; Zhang, Chengcui

    2013-05-01

    As information technology has advanced, people are turning to electronic media more frequently for communication, and social relationships are increasingly found on online channels. However, there is very limited knowledge about the actual evolution of online social networks. In this paper, we propose and study a novel evolution network model with the new concept of “last updating time”, which exists in many real-life online social networks. The last updating evolution network model can maintain the robustness of scale-free networks and can improve the network's resilience against intentional attacks. What is more, we also found that it has the “small-world effect”, which is an inherent property of most social networks. Simulation experiments based on this model show that the results and the real-life data are consistent, which means that our model is valid.

  18. Hypersensitivity to local anaesthetics--update and proposal of evaluation algorithm

    DEFF Research Database (Denmark)

    Thyssen, Jacob Pontoppidan; Menné, Torkil; Elberling, Jesper

    2008-01-01

    of patients suspected with immediate- and delayed-type immune reactions. Literature was examined using PubMed-Medline, EMBASE, Biosis and Science Citation Index. Based on the literature, the proposed algorithm may safely and rapidly distinguish between immediate-type and delayed-type allergic immune reactions....

  19. A proposal on alternative sampling-based modeling method of spherical particles in stochastic media for Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Song Hyun; Lee, Jae Yong; KIm, Do Hyun; Kim, Jong Kyung [Dept. of Nuclear Engineering, Hanyang University, Seoul (Korea, Republic of); Noh, Jae Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-08-15

    The chord length sampling method in Monte Carlo simulations is a method used to model spherical particles with a random sampling technique in stochastic media. It has received attention due to its high calculation efficiency as well as user convenience; however, a technical issue regarding the boundary effect has been noted. In this study, after analyzing the distribution characteristics of spherical particles using an explicit method, an alternative chord length sampling method is proposed. In addition, for modeling in finite media, a correction method for the boundary effect is proposed. Using the proposed method, sample probability distributions and relative errors were estimated and compared with those calculated by the explicit method. The results show that the reconstruction ability and modeling accuracy of the particle probability distribution with the proposed method were considerably high. Also, from the local packing fraction results, the proposed method can successfully solve the boundary effect problem. It is expected that the proposed method can contribute to increasing the modeling accuracy in stochastic media.
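
    The paper's alternative sampling and boundary correction are not spelled out in the abstract, so the sketch below only illustrates the conventional chord-length sampling idea it builds on: the distance travelled in the matrix before entering the next sphere is drawn from an exponential distribution. The mean matrix chord lambda = (4r/3)(1 - f)/f used here is the commonly quoted value for randomly dispersed spheres of radius r at packing fraction f and should be treated as an assumption.

```python
# Illustrative chord-length sampling between spheres (conventional CLS idea only;
# the paper's alternative sampling and boundary correction are not reproduced).
import math
import random

def sample_matrix_chords(radius, packing_fraction, n_samples, seed=0):
    rng = random.Random(seed)
    # Assumed mean matrix chord for randomly dispersed spheres.
    lam = (4.0 * radius / 3.0) * (1.0 - packing_fraction) / packing_fraction
    # Exponentially distributed distance to the next sphere entry.
    return [-lam * math.log(1.0 - rng.random()) for _ in range(n_samples)]

chords = sample_matrix_chords(radius=0.05, packing_fraction=0.3, n_samples=100000)
print("sampled mean matrix chord:", sum(chords) / len(chords))
```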

  20. A proposal on alternative sampling-based modeling method of spherical particles in stochastic media for Monte Carlo simulation

    International Nuclear Information System (INIS)

    Kim, Song Hyun; Lee, Jae Yong; KIm, Do Hyun; Kim, Jong Kyung; Noh, Jae Man

    2015-01-01

    The chord length sampling method in Monte Carlo simulations is a method used to model spherical particles with a random sampling technique in stochastic media. It has received attention due to its high calculation efficiency as well as user convenience; however, a technical issue regarding the boundary effect has been noted. In this study, after analyzing the distribution characteristics of spherical particles using an explicit method, an alternative chord length sampling method is proposed. In addition, for modeling in finite media, a correction method for the boundary effect is proposed. Using the proposed method, sample probability distributions and relative errors were estimated and compared with those calculated by the explicit method. The results show that the reconstruction ability and modeling accuracy of the particle probability distribution with the proposed method were considerably high. Also, from the local packing fraction results, the proposed method can successfully solve the boundary effect problem. It is expected that the proposed method can contribute to increasing the modeling accuracy in stochastic media.

  1. Updating known distribution models for forecasting climate change impact on endangered species.

    Science.gov (United States)

    Muñoz, Antonio-Román; Márquez, Ana Luz; Real, Raimundo

    2013-01-01

    To plan endangered species conservation and to design adequate management programmes, it is necessary to predict their distributional response to climate change, especially under the current situation of rapid change. However, these predictions are customarily done by relating the distribution of the species de novo with climatic conditions, with no regard for previously available knowledge about the factors affecting the species' distribution. We propose to take advantage of known species distribution models, updating them with the variables yielded by climatic models before projecting them to the future. To exemplify our proposal, the availability of suitable habitat across Spain for the endangered Bonelli's Eagle (Aquila fasciata) was modelled by updating a pre-existing model based on current climate and topography to a combination of different general circulation models and Special Report on Emissions Scenarios. Our results suggest that the main threat to this endangered species is not climate change, since all forecasting models show that its distribution will be maintained and increased in mainland Spain throughout the 21st century. We remark on the importance of linking conservation biology with distribution modelling by updating existing models, frequently available for endangered species, considering all the known factors conditioning the species' distribution, instead of building new models that are based on climate change variables only.

  2. Efficiencies of joint non-local update moves in Monte Carlo simulations of coarse-grained polymers

    Science.gov (United States)

    Austin, Kieran S.; Marenz, Martin; Janke, Wolfhard

    2018-03-01

    In this study four update methods are compared in their performance in a Monte Carlo simulation of polymers in continuum space. The efficiencies of the update methods and combinations thereof are compared with the aid of the autocorrelation time with a fixed (optimal) acceptance ratio. Results are obtained for polymer lengths N = 14, 28 and 42 and temperatures below, at and above the collapse transition. In terms of autocorrelation, the optimal acceptance ratio is approximately 0.4. Furthermore, an overview of the step sizes of the update methods that correspond to this optimal acceptance ratio is given. This shall serve as a guide for future studies that rely on efficient computer simulations.
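
    The reported optimum acceptance ratio of roughly 0.4 is usually reached in practice by adapting the move step size during equilibration. The sketch below shows that feedback loop for a simple single-monomer displacement move on a toy harmonic chain; the energy model and all parameter values are placeholders, not the coarse-grained polymer model of the study.

```python
# Sketch: tune a displacement step size so the Metropolis acceptance ratio approaches ~0.4.
# The harmonic-bond "polymer" below is a stand-in, not the model used in the cited work.
import numpy as np

rng = np.random.default_rng(0)
N, k_spring, beta = 14, 1.0, 1.0
pos = np.cumsum(rng.normal(scale=0.1, size=(N, 3)), axis=0)   # initial chain

def energy(p):
    bond_lengths = np.linalg.norm(np.diff(p, axis=0), axis=1)
    return 0.5 * k_spring * np.sum((bond_lengths - 1.0) ** 2)

step, target = 0.1, 0.4
accepted = trials = 0
E = energy(pos)
for sweep in range(200):
    for _ in range(N):
        i = rng.integers(N)
        trial = pos.copy()
        trial[i] += rng.uniform(-step, step, size=3)
        E_trial = energy(trial)
        if E_trial <= E or rng.random() < np.exp(-beta * (E_trial - E)):
            pos, E = trial, E_trial
            accepted += 1
        trials += 1
    # crude feedback: widen the step if accepting too often, shrink it otherwise
    step *= 1.05 if accepted / trials > target else 0.95

print(f"step = {step:.3f}, acceptance ratio = {accepted / trials:.2f}")
```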

  3. Curriculum structure, content, learning and assessment in European undergraduate dental education - update 2010.

    LENUS (Irish Health Repository)

    Manogue, M

    2011-08-01

    This paper presents an updated statement on behalf of the Association for Dental Education in Europe (ADEE) in relation to proposals for undergraduate Curriculum Structure, Content, Learning, Assessment and Student/Staff Exchange for dental education in Europe. A task force was constituted to consider these issues and the two previous, related publications produced by the Association (Plasschaert et al 2006 and 2007) were revised. The broad European dental community was circulated and contributed to the revisions. The paper was approved at the General Assembly of ADEE, held in Amsterdam in August 2010, and will be updated again in 2015.

  4. 78 FR 35585 - Updating OSHA Standards Based on National Consensus Standards; Signage

    Science.gov (United States)

    2013-06-13

    ...; Signage AGENCY: Occupational Safety and Health Administration (OSHA), Department of Labor. ACTION: Notice... Administration ("OSHA" or "the Agency") proposes to update its general industry and construction signage... standards, ANSI Z53.1-1967, Z35.1-1968, and Z35.2-1968, in its signage standards, thereby providing...

  5. A Literature Review Fuzzy Pay-Off-Method – A Modern Approach in Valuation

    Directory of Open Access Journals (Sweden)

    Daniel Manaţe

    2015-01-01

    Full Text Available This article presents a modern approach to the analysis of updated (discounted) cash flows. The approach is based on the Fuzzy Pay-Off Method (FPOM) for Real Option Valuation (ROV). The article describes a few types of models for the valuation of real options currently in use. In support of the chosen FPOM method, we include the mathematical model that stands at the basis of this method and a case study.
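
    The fuzzy pay-off method itself reduces to a short calculation once the NPV scenarios are encoded as a triangular fuzzy number: the real option value is the positive-area share of the pay-off distribution times the mean of its positive side. The sketch below evaluates this numerically for hypothetical scenario values; note that the positive-side mean is approximated here by the membership-weighted (centroid) mean rather than the Carlsson-Fullér possibilistic mean used in the formal FPOM derivation.

```python
# Numerical sketch of the fuzzy pay-off method for a triangular fuzzy NPV.
# Scenario values are hypothetical; the positive-side mean is a centroid approximation.
import numpy as np

a, b, c = -50.0, 40.0, 150.0          # pessimistic, best-guess, optimistic NPV
x = np.linspace(a, c, 20001)
mu = np.where(x <= b, (x - a) / (b - a), (c - x) / (c - b))   # triangular membership

total_area = np.trapz(mu, x)
positive = x >= 0.0
positive_area = np.trapz(mu[positive], x[positive])
mean_positive = np.trapz(x[positive] * mu[positive], x[positive]) / positive_area

real_option_value = (positive_area / total_area) * mean_positive
print(f"real option value ~ {real_option_value:.1f}")
```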

  6. Astrophysics Update 2

    CERN Document Server

    Mason, John W

    2006-01-01

    "Astrophysics Updates" is intended to serve the information needs of professional astronomers and postgraduate students about areas of astronomy, astrophysics and cosmology that are rich and active research spheres. Observational methods and the latest results of astronomical research are presented as well as their theoretical foundations and interrelations. The contributed commissioned articles are written by leading exponents in a format that will appeal to professional astronomers and astrophysicists who are interested in topics outside their own specific areas of research. This collection of timely reviews may also attract the interest of advanced amateur astronomers seeking scientifically rigorous coverage.

  7. Validation of a method for assessing resident physicians' quality improvement proposals.

    Science.gov (United States)

    Leenstra, James L; Beckman, Thomas J; Reed, Darcy A; Mundell, William C; Thomas, Kris G; Krajicek, Bryan J; Cha, Stephen S; Kolars, Joseph C; McDonald, Furman S

    2007-09-01

    Residency programs involve trainees in quality improvement (QI) projects to evaluate competency in systems-based practice and practice-based learning and improvement. Valid approaches to assess QI proposals are lacking. We developed an instrument for assessing resident QI proposals, the Quality Improvement Proposal Assessment Tool (QIPAT-7), and determined its validity and reliability. QIPAT-7 content was initially obtained from a national panel of QI experts. Through an iterative process, the instrument was refined, pilot-tested, and revised. Seven raters used the instrument to assess 45 resident QI proposals. Principal factor analysis was used to explore the dimensionality of instrument scores. Cronbach's alpha and intraclass correlations were calculated to determine internal consistency and interrater reliability, respectively. QIPAT-7 items comprised a single factor (eigenvalue = 3.4) suggesting a single assessment dimension. Interrater reliability for each item (range 0.79 to 0.93) and internal consistency reliability among the items (Cronbach's alpha = 0.87) were high. This method for assessing resident physician QI proposals is supported by content and internal structure validity evidence. QIPAT-7 is a useful tool for assessing resident QI proposals. Future research should determine the reliability of QIPAT-7 scores in other residency and fellowship training programs. Correlations should also be made between assessment scores and criteria for QI proposal success such as implementation of QI proposals, resident scholarly productivity, and improved patient outcomes.
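
    The two reliability figures quoted above (Cronbach's alpha for internal consistency, intraclass correlations for interrater agreement) are standard statistics that can be reproduced from a score matrix. A minimal sketch for the alpha part, using a hypothetical proposals-by-items score matrix rather than the study's data, is shown below.

```python
# Sketch: Cronbach's alpha from a hypothetical matrix of QI-proposal scores
# (rows = proposals, columns = the seven instrument items). Not the study's data.
import numpy as np

scores = np.array([
    [4, 5, 4, 3, 4, 5, 4],
    [2, 3, 2, 2, 3, 2, 3],
    [5, 5, 4, 5, 4, 5, 5],
    [3, 3, 3, 2, 3, 3, 2],
], dtype=float)

def cronbach_alpha(items):
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1.0)) * (1.0 - item_variances / total_variance)

print("Cronbach's alpha:", round(cronbach_alpha(scores), 3))
```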

  8. Proposal of Constraints Analysis Method Based on Network Model for Task Planning

    Science.gov (United States)

    Tomiyama, Tomoe; Sato, Tatsuhiro; Morita, Toyohisa; Sasaki, Toshiro

    Deregulation has been accelerating several activities toward reengineering business processes, such as railway through-service and modal shift in logistics. To make those activities successful, business entities have to establish new business rules or know-how (we call them ‘constraints’). According to the new constraints, they need to manage business resources such as instruments, materials, workers and so on. In this paper, we propose a constraint analysis method to define constraints for task planning of the new business processes. To visualize each constraint's influence on planning, we propose a network model which represents allocation relations between tasks and resources. The network can also represent task ordering relations and resource grouping relations. The proposed method formalizes the manual definition of constraints as repeatedly checking the network structure and finding conflicts between constraints. Application to crew scheduling problems shows that the method can adequately represent and define constraints of some task planning problems with the following fundamental features: (1) specifying work patterns for some resources, (2) restricting the number of resources for some works, (3) requiring multiple resources for some works, (4) prior allocation of some resources to some works and (5) considering the workload balance between resources.

  9. Dissociating Working Memory Updating and Automatic Updating: The Reference-Back Paradigm

    Science.gov (United States)

    Rac-Lubashevsky, Rachel; Kessler, Yoav

    2016-01-01

    Working memory (WM) updating is a controlled process through which relevant information in the environment is selected to enter the gate to WM and substitute its contents. We suggest that there is also an automatic form of updating, which influences performance in many tasks and is primarily manifested in reaction time sequential effects. The goal…

  10. 76 FR 73564 - Federal Acquisition Regulation; Updates to Contract Reporting and Central Contractor Registration

    Science.gov (United States)

    2011-11-29

    ... Federal Acquisition Regulation; Updates to Contract Reporting and Central Contractor Registration AGENCIES... Procurement Data System (FPDS). Additionally, changes are proposed for the clauses requiring contractor registration in the Central Contractor Registration (CCR) database and DUNS number reporting. DATES: Interested...

  11. Update Strength in EDAs and ACO: How to Avoid Genetic Drift

    DEFF Research Database (Denmark)

    Sudholt, Dirk; Witt, Carsten

    2016-01-01

    , showing that the update strength should be limited to 1/K, ρ = O(1/(√n log n)). In fact, choosing 1/K, ρ ∼ 1/(√n log n) both algorithms efficiently optimize OneMax in expected time O (n log n). Our analyses provide new insights into the stochastic behavior of probabilistic model-building GAs and propose...
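
    To make the notion of "update strength" concrete, the sketch below runs a compact genetic algorithm on OneMax, shifting each model frequency by 1/K toward the better of two sampled solutions; the choice of K and the border margins are illustrative and are not the exact settings analyzed in the paper.

```python
# Sketch of a compact GA (an EDA) on OneMax with update strength 1/K.
# Parameter choices are illustrative, not the paper's exact theoretical settings.
import random

def cga_onemax(n=100, K=None, max_iters=200000, seed=1):
    rng = random.Random(seed)
    K = K or int(10 * n ** 0.5)          # conservative update strength ~ 1/(10*sqrt(n))
    p = [0.5] * n                         # frequency vector of the probabilistic model
    for it in range(max_iters):
        x = [1 if rng.random() < pi else 0 for pi in p]
        y = [1 if rng.random() < pi else 0 for pi in p]
        winner, loser = (x, y) if sum(x) >= sum(y) else (y, x)
        for i in range(n):
            if winner[i] != loser[i]:
                shift = 1.0 / K if winner[i] == 1 else -1.0 / K
                p[i] = min(1.0 - 1.0 / n, max(1.0 / n, p[i] + shift))  # keep borders open
        if sum(winner) == n:              # sampled the all-ones optimum
            return it
    return max_iters

print("iterations until the optimum was sampled:", cga_onemax())
```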

  12. TCAM-based High Speed Longest Prefix Matching with Fast Incremental Table Updates

    DEFF Research Database (Denmark)

    Rasmussen, Anders; Kragelund, A.; Berger, Michael Stübert

    2013-01-01

    and consequently a higher throughput of the network search engine, since the TCAM down time caused by incremental updates is eliminated. The LPM scheme is described in HDL for FPGA implementation and compared to an existing scheme for customized CAM circuits. The paper shows that the proposed scheme can process...

  13. Characteristics of Key Update Strategies for Wireless Sensor Networks

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2011-01-01

    Wireless sensor networks offer the advantages of simple and low-resource communication. Challenged by this simplicity and low-resources, security is of particular importance in many cases such as transmission of sensitive data or strict requirements of tamper-resistance. Updating the security keys...... is one of the essential points in security, which restrict the amount of data that may be exposed when a key is compromised. In this paper, we investigate key update methods that may be used in wireless sensor networks, and benefiting from stochastic model checking we derive characteristics...

  14. A Commentary on "Updating the Duplex Design for Test-Based Accountability in the Twenty-First Century"

    Science.gov (United States)

    Brandt, Steffen

    2010-01-01

    This article presents the author's commentary on "Updating the Duplex Design for Test-Based Accountability in the Twenty-First Century," in which Isaac I. Bejar and E. Aurora Graf propose the application of a test design--the duplex design (which was proposed in 1988 by Bock and Mislevy) for application in current accountability assessments.…

  15. Software Implementation of Secure Firmware Update in IoT Concept

    Directory of Open Access Journals (Sweden)

    Lukas Kvarda

    2017-01-01

    Full Text Available This paper focuses on a survey of secure firmware update in the Internet of Things, and on the design and description of a safe and secure bootloader implementation on a UHF RFID reader, with encryption using AES-CCM and versioning with the use of an external backup flash memory device. In case of problems with HW compatibility or other unexpected errors with the new FW version, it is possible to downgrade to a previous FW image, including the factory image. Authentication is provided by the UHF RFID service tag, which is used to extract a unique initialization vector of the encryption algorithm for each update session. The results show an approximately 27% slower update speed with this new upgrade method compared to the older one, which used only the AES-CBC algorithm.
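
    The AES-CCM part of such an update pipeline can be sketched with the Python `cryptography` package, as below; the key handling, the RFID-derived initialization vector, and the version header used in the paper are all replaced by placeholders, so this is only an illustration of the encrypt side, not the paper's implementation.

```python
# Hedged sketch of AES-CCM protection for a firmware image (encrypt side only).
# File names, header contents, and key handling are hypothetical placeholders.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESCCM

key = AESCCM.generate_key(bit_length=128)   # in practice provisioned to the bootloader
aesccm = AESCCM(key)                        # 16-byte authentication tag by default

firmware = open("firmware_v2.bin", "rb").read()      # hypothetical image file
header = b"FWVER=2.0;TARGET=uhf-reader"              # authenticated but not encrypted
nonce = os.urandom(13)                               # AES-CCM accepts 7..13 byte nonces

ciphertext = aesccm.encrypt(nonce, firmware, header)
with open("firmware_v2.enc", "wb") as out:
    out.write(nonce + header + b"\n" + ciphertext)

# A bootloader would reverse this: parse nonce and header, call
# AESCCM(key).decrypt(nonce, ciphertext, header), and fall back to the previous
# image if cryptography.exceptions.InvalidTag is raised.
```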

  16. Proposal for an Evaluation Method for the Performance of Work Procedures.

    Science.gov (United States)

    Mohammed, Mouda; Mébarek, Djebabra; Wafa, Boulagouas; Makhlouf, Chati

    2016-12-01

    Noncompliance of operators with work procedures is a recurrent problem. This human behavior has been said to be situational and has been studied by many different approaches (ergonomic and others), which take noncompliance with work procedures as a given and seek to analyze its causes as well as its consequences. The aim of the proposed method is to address this problem by focusing on the performance of work procedures and ensuring improved performance on a continuous basis. This study has multiple results: (1) assessment of the work procedures' performance by a multicriteria approach; (2) the use of a continuous improvement approach as a framework for the sustainability of the assessment method of work procedures' performance; and (3) adaptation of the Stop-Card as a facilitator support for continuous improvement of work procedures. The proposed method emphasizes valuing the inputs of continuous improvement of the work procedures, in contrast with the conventional approaches, which take noncompliance with working procedures as obvious and seek to analyze the cause-effect relationships related to this unacceptable phenomenon, especially in strategic industries.

  17. Pentadiagonal alternating-direction-implicit finite-difference time-domain method for two-dimensional Schrödinger equation

    Science.gov (United States)

    Tay, Wei Choon; Tan, Eng Leong

    2014-07-01

    In this paper, we have proposed a pentadiagonal alternating-direction-implicit (Penta-ADI) finite-difference time-domain (FDTD) method for the two-dimensional Schrödinger equation. Through the separation of complex wave function into real and imaginary parts, a pentadiagonal system of equations for the ADI method is obtained, which results in our Penta-ADI method. The Penta-ADI method is further simplified into pentadiagonal fundamental ADI (Penta-FADI) method, which has matrix-operator-free right-hand-sides (RHS), leading to the simplest and most concise update equations. As the Penta-FADI method involves five stencils in the left-hand-sides (LHS) of the pentadiagonal update equations, special treatments that are required for the implementation of the Dirichlet's boundary conditions will be discussed. Using the Penta-FADI method, a significantly higher efficiency gain can be achieved over the conventional Tri-ADI method, which involves a tridiagonal system of equations.
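
    The defining feature of the scheme above is that each implicit sub-step requires solving a pentadiagonal linear system. The paper's Penta-(F)ADI coefficients are not reproduced here, but the sketch below shows how a generic pentadiagonal system can be solved with a banded solver, which is the kind of kernel such an update relies on.

```python
# Illustrative solve of a toy pentadiagonal system A u = b via a banded solver.
# The coefficients are placeholders, not the Penta-FADI update equations of the paper.
import numpy as np
from scipy.linalg import solve_banded

n = 8
main = 4.0 * np.ones(n)
off1 = -1.0 * np.ones(n - 1)     # first sub-/super-diagonal
off2 = 0.25 * np.ones(n - 2)     # second sub-/super-diagonal

# Banded storage for solve_banded((l, u), ab, b) with l = u = 2.
ab = np.zeros((5, n), dtype=complex)
ab[0, 2:]  = off2    # 2nd superdiagonal
ab[1, 1:]  = off1    # 1st superdiagonal
ab[2, :]   = main    # main diagonal
ab[3, :-1] = off1    # 1st subdiagonal
ab[4, :-2] = off2    # 2nd subdiagonal

b = np.ones(n, dtype=complex)
u = solve_banded((2, 2), ab, b)
print(u)
```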

  18. Self-shielding models of MICROX-2 code: Review and updates

    International Nuclear Information System (INIS)

    Hou, J.; Choi, H.; Ivanov, K.N.

    2014-01-01

    Highlights: • The MICROX-2 code has been improved to expand its application to advanced reactors. • New fine-group cross section libraries based on ENDF/B-VII have been generated. • Resonance self-shielding and spatial self-shielding models have been improved. • The improvements were assessed by a series of benchmark calculations against MCNPX. - Abstract: The MICROX-2 is a transport theory code that solves for the neutron slowing-down and thermalization equations of a two-region lattice cell. The MICROX-2 code has been updated to expand its application to advanced reactor concepts and fuel cycle simulations, including generation of new fine-group cross section libraries based on ENDF/B-VII. In continuation of previous work, the MICROX-2 methods are reviewed and updated in this study, focusing on its resonance self-shielding and spatial self-shielding models for neutron spectrum calculations. The improvement of self-shielding method was assessed by a series of benchmark calculations against the Monte Carlo code, using homogeneous and heterogeneous pin cell models. The results have shown that the implementation of the updated self-shielding models is correct and the accuracy of physics calculation is improved. Compared to the existing models, the updates reduced the prediction error of the infinite multiplication factor by ∼0.1% and ∼0.2% for the homogeneous and heterogeneous pin cell models, respectively, considered in this study

  19. Research of Cadastral Data Modelling and Database Updating Based on Spatio-temporal Process

    Directory of Open Access Journals (Sweden)

    ZHANG Feng

    2016-02-01

    Full Text Available The core of modern cadastre management is to renew the cadastre database and keep its currentness, topological consistency and integrity. This paper analyzed the changes and their linkage for various cadastral objects in the update process. Combining object-oriented modeling techniques with the expression of spatio-temporal objects' evolution, the paper proposes a cadastral data updating model based on the spatio-temporal process, in line with the way people think about change. Change rules based on the spatio-temporal topological relations of evolving cadastral spatio-temporal objects are drafted, and furthermore cascade updating and history trace-back of cadastral features, land use and buildings are realized. This model is implemented in the cadastral management system ReGIS. Cascade changes are triggered by the direct driving force or by perceived external events. The system records the evolution process of spatio-temporal objects to facilitate the reconstruction of history, change tracking, and the analysis and forecasting of future changes.

  20. Updating systematic reviews: an international survey.

    Directory of Open Access Journals (Sweden)

    Chantelle Garritty

    Full Text Available BACKGROUND: Systematic reviews (SRs) should be up to date to maintain their importance in informing healthcare policy and practice. However, little guidance is available about when and how to update SRs. Moreover, the updating policies and practices of organizations that commission or produce SRs are unclear. METHODOLOGY/PRINCIPAL FINDINGS: The objective was to describe the updating practices and policies of agencies that sponsor or conduct SRs. An Internet-based survey was administered to a purposive non-random sample of 195 healthcare organizations within the international SR community. Survey results were analyzed using descriptive statistics. The completed response rate was 58% (n = 114) from across 26 countries, with 70% (75/107) of participants identified as producers of SRs. Among responders, 79% (84/107) characterized the importance of updating as high or very high and 57% (60/106) of organizations reported having a formal policy for updating. However, only 29% (35/106) of organizations made reference to a written policy document. Several groups (62/105; 59%) reported updating practices as irregular, and over half (53/103) of organizational respondents estimated that more than 50% of their respective SRs were likely out of date. Authors of the original SR (42/106; 40%) were most often deemed responsible for ensuring SRs were current. Barriers to updating included resource constraints, reviewer motivation, lack of academic credit, and limited publishing formats. Most respondents (70/100; 70%) indicated that they supported centralization of updating efforts across institutions or agencies. Furthermore, 84% (83/99) of respondents indicated they favoured the development of a central registry of SRs, analogous to efforts within the clinical trials community. CONCLUSIONS/SIGNIFICANCE: Most organizations that sponsor and/or carry out SRs consider updating important. Despite this recognition, updating practices are not regular, and many organizations lack

  1. Proposed method to calculate FRMAC intervention levels for the assessment of radiologically contaminated food and comparison of the proposed method to the U.S. FDA's method to calculate derived intervention levels

    Energy Technology Data Exchange (ETDEWEB)

    Kraus, Terrence D.; Hunt, Brian D.

    2014-02-01

    This report reviews the method recommended by the U.S. Food and Drug Administration for calculating Derived Intervention Levels (DILs) and identifies potential improvements to the DIL calculation method to support more accurate ingestion pathway analyses and protective action decisions. Further, this report proposes an alternate method for use by the Federal Radiological Monitoring and Assessment Center (FRMAC) to calculate FRMAC Intervention Levels (FILs). The default approach of the FRMAC during an emergency response is to use the FDA-recommended methods. However, FRMAC recommends implementing the FIL method because we believe it to be more technically accurate. FRMAC will only implement the FIL method when approved by the FDA representative on the Federal Advisory Team for Environment, Food, and Health.

  2. Field Application of Cable Tension Estimation Technique Using the h-SI Method

    Directory of Open Access Journals (Sweden)

    Myung-Hyun Noh

    2015-01-01

    Full Text Available This paper investigates the field applicability of a new system identification technique for estimating the tensile force of a cable in long-span bridges. The newly proposed h-SI method, which combines a sensitivity-updating algorithm with an advanced hybrid microgenetic algorithm, not only avoids the trap of local minima at the initial search stage but also finds the optimal solution with better numerical efficiency than existing methods. First, this paper gives an overview of the tension estimation procedure through a theoretical formulation. Secondly, the validity of the proposed technique is numerically examined using a set of dynamic data obtained from benchmark numerical samples considering the effect of sag extensibility and bending stiffness of a sag-cable system. Finally, the feasibility of the proposed method is investigated through actual field data extracted from the cable-stayed Seohae Bridge. The test results show that the existing methods require precise initial data in advance, whereas the proposed method is not affected by such initial information. In particular, the proposed method can improve accuracy and the convergence rate toward final values. Consequently, the proposed method can be more effective than existing methods in characterizing the tensile force variation of cable structures.

  3. Are Forecast Updates Progressive?

    NARCIS (Netherlands)

    C-L. Chang (Chia-Lin); Ph.H.B.F. Franses (Philip Hans); M.J. McAleer (Michael)

    2010-01-01

    textabstractMacro-economic forecasts typically involve both a model component, which is replicable, as well as intuition, which is non-replicable. Intuition is expert knowledge possessed by a forecaster. If forecast updates are progressive, forecast updates should become more accurate, on average,

  4. Citation Discovery Tools for Conducting Adaptive Meta-analyses to Update Systematic Reviews.

    Science.gov (United States)

    Bae, Jong-Myon; Kim, Eun Hee

    2016-03-01

    The systematic review (SR) is a research methodology that aims to synthesize related evidence. Updating previously conducted SRs is necessary when new evidence has been produced, but no consensus has yet emerged on the appropriate update methodology. The authors have developed a new SR update method called 'adaptive meta-analysis' (AMA) using the 'cited by', 'similar articles', and 'related articles' citation discovery tools in the PubMed and Scopus databases. This study evaluates the usefulness of these citation discovery tools for updating SRs. Lists were constructed by applying the citation discovery tools in the two databases to the articles analyzed by a published SR. The degree of overlap between the lists and distribution of excluded results were evaluated. The articles ultimately selected for the SR update meta-analysis were found in the lists obtained from the 'cited by' and 'similar' tools in PubMed. Most of the selected articles appeared in both the 'cited by' lists in Scopus and PubMed. The Scopus 'related' tool did not identify the appropriate articles. The AMA, which involves using both citation discovery tools in PubMed, and optionally, the 'related' tool in Scopus, was found to be useful for updating an SR.
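
    The two PubMed citation-discovery lists that the AMA approach relies on can be pulled programmatically through the NCBI E-utilities elink endpoint. The sketch below uses the standard link names for "cited by" and "similar articles"; the example PMID is arbitrary, and the exact link names, rate limits, and API-key requirements should be checked against the current E-utilities documentation before use.

```python
# Hedged sketch: fetch 'cited by' and 'similar articles' lists for one PubMed record.
# Link names and JSON structure follow the documented E-utilities conventions; verify
# them (and add an API key / rate limiting) before relying on this in practice.
import requests

def elink_ids(pmid, linkname):
    url = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/elink.fcgi"
    params = {"dbfrom": "pubmed", "db": "pubmed", "id": pmid,
              "linkname": linkname, "retmode": "json"}
    data = requests.get(url, params=params, timeout=30).json()
    linksetdbs = data["linksets"][0].get("linksetdbs", [])
    return [pid for db in linksetdbs for pid in db.get("links", [])]

pmid = "23193287"                                      # arbitrary example PMID
cited_by = elink_ids(pmid, "pubmed_pubmed_citedin")    # 'cited by' list
similar = elink_ids(pmid, "pubmed_pubmed")             # 'similar articles' list
print(len(cited_by), "citing records;", len(similar), "similar records")
```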

  5. Postretrieval new learning does not reliably induce human memory updating via reconsolidation.

    Science.gov (United States)

    Hardwicke, Tom E; Taqi, Mahdi; Shanks, David R

    2016-05-10

    Reconsolidation theory proposes that retrieval can destabilize an existing memory trace, opening a time-dependent window during which that trace is amenable to modification. Support for the theory is largely drawn from nonhuman animal studies that use invasive pharmacological or electroconvulsive interventions to disrupt a putative postretrieval restabilization ("reconsolidation") process. In human reconsolidation studies, however, it is often claimed that postretrieval new learning can be used as a means of "updating" or "rewriting" existing memory traces. This proposal warrants close scrutiny because the ability to modify information stored in the memory system has profound theoretical, clinical, and ethical implications. The present study aimed to replicate and extend a prominent 3-day motor-sequence learning study [Walker MP, Brakefield T, Hobson JA, Stickgold R (2003) Nature 425(6958):616-620] that is widely cited as a convincing demonstration of human reconsolidation. However, in four direct replication attempts (n = 64), we did not observe the critical impairment effect that has previously been taken to indicate disruption of an existing motor memory trace. In three additional conceptual replications (n = 48), we explored the broader validity of reconsolidation-updating theory by using a declarative recall task and sequences similar to phone numbers or computer passwords. Rather than inducing vulnerability to interference, memory retrieval appeared to aid the preservation of existing sequence knowledge relative to a no-retrieval control group. These findings suggest that memory retrieval followed by new learning does not reliably induce human memory updating via reconsolidation.

  6. Updating risk prediction tools: a case study in prostate cancer.

    Science.gov (United States)

    Ankerst, Donna P; Koniarski, Tim; Liang, Yuanyuan; Leach, Robin J; Feng, Ziding; Sanda, Martin G; Partin, Alan W; Chan, Daniel W; Kagan, Jacob; Sokoll, Lori; Wei, John T; Thompson, Ian M

    2012-01-01

    Online risk prediction tools for common cancers are now easily accessible and widely used by patients and doctors for informed decision-making concerning screening and diagnosis. A practical problem is that, as cancer research moves forward and new biomarkers and risk factors are discovered, there is a need to update the risk algorithms to include them. Typically, the new markers and risk factors cannot be retrospectively measured on the same study participants used to develop the original prediction tool, necessitating the merging of a separate study of different participants, which may be much smaller in sample size and of a different design. Validation of the updated tool on a third independent data set is warranted before the updated tool can go online. This article reports on the application of Bayes rule for updating risk prediction tools to include a set of biomarkers measured in an external study to the original study used to develop the risk prediction tool. The procedure is illustrated in the context of updating the online Prostate Cancer Prevention Trial Risk Calculator to incorporate the new markers %freePSA and [-2]proPSA measured on an external case-control study performed in Texas, USA. Recent state-of-the-art methods in validation of risk prediction tools and evaluation of the improvement of updated to original tools are implemented using an external validation set provided by the U.S. Early Detection Research Network. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
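
    At its core, the Bayes-rule update described above converts the original tool's risk to odds, multiplies by a likelihood ratio for the newly measured markers (estimated from the external study), and converts back to a probability. A minimal sketch, with a purely hypothetical likelihood ratio, is given below.

```python
# Sketch of the odds-form Bayes update behind incorporating a new marker.
# The likelihood ratio value is hypothetical, not an estimate from the cited study.

def update_risk(prior_risk, likelihood_ratio):
    prior_odds = prior_risk / (1.0 - prior_risk)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

prior = 0.25          # risk from the original prediction tool
lr_marker = 1.8       # hypothetical LR for the observed new-marker result
print(f"updated risk: {update_risk(prior, lr_marker):.3f}")
```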

  7. A Performance Study on Synchronous and Asynchronous Update Rules for A Plug-In Direct Particle Swarm Repetitive Controller

    Directory of Open Access Journals (Sweden)

    Ufnalski Bartlomiej

    2014-12-01

    Full Text Available In this paper two different update schemes for the recently developed plug-in direct particle swarm repetitive controller (PDPSRC) are investigated and compared. The proposed approach employs the particle swarm optimizer (PSO) to solve in on-line mode a dynamic optimization problem (DOP) related to the control task in the constant-amplitude constant-frequency voltage-source inverter (CACF VSI) with an LC output filter. The effectiveness of synchronous and asynchronous update rules, both commonly used in static optimization problems (SOPs), is assessed and compared in the case of the PDPSRC. The performance of the controller, when synthesized using each of the update schemes, is studied numerically.
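
    The difference between the two update rules is simply when the global best is refreshed: once per iteration (synchronous) or immediately after each particle's evaluation (asynchronous). The sketch below contrasts them on a static sphere function; the repetitive-controller dynamic optimization problem of the paper is not modelled, and all swarm parameters are illustrative.

```python
# Sketch contrasting synchronous and asynchronous PSO global-best updates on a
# static test function (not the paper's control-related dynamic problem).
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    return float(np.sum(x ** 2))

def pso(asynchronous, n_particles=20, dim=5, iters=100, w=0.7, c1=1.5, c2=1.5):
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([sphere(p) for p in x])
    g = int(np.argmin(pbest_f))
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(dim), rng.random(dim)
            v[i] = w * v[i] + c1 * r1 * (pbest[i] - x[i]) + c2 * r2 * (pbest[g] - x[i])
            x[i] = x[i] + v[i]
            f_i = sphere(x[i])
            if f_i < pbest_f[i]:
                pbest[i], pbest_f[i] = x[i].copy(), f_i
                if asynchronous and f_i < pbest_f[g]:
                    g = i                        # refresh global best immediately
        if not asynchronous:
            g = int(np.argmin(pbest_f))          # refresh global best once per iteration
    return pbest_f[g]

print("synchronous :", pso(asynchronous=False))
print("asynchronous:", pso(asynchronous=True))
```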

  8. Altitude training for elite endurance performance: a 2012 update.

    Science.gov (United States)

    Fudge, Barry W; Pringle, Jamie S M; Maxwell, Neil S; Turner, Gareth; Ingham, Stephen A; Jones, Andrew M

    2012-01-01

    Altitude training is commonly used by endurance athletes and coaches in pursuit of enhancement of performance on return to sea level. The purpose of the current review article was to update and evaluate recent literature relevant to the practical application of altitude training for endurance athletes. Consequently, the literature can be considered in either of two categories: performance-led investigations or mechanistic advancements/insights. Each section discusses the relevant literature and proposes future directions where appropriate.

  9. Proposals of counting method for bubble detectors and their intercomparisons

    International Nuclear Information System (INIS)

    Ramalho, Eduardo; Silva, Ademir X.; Bellido, Luis F.; Facure, Alessandro; Pereira, Mario

    2009-01-01

    The study of neutron spectrometry and dosimetry has become significantly easier due to relatively new devices called bubble detectors. Insensitive to gamma rays and composed of superheated emulsions, they are still the subject of much research in radiation physics and nuclear engineering. When bubble detectors are exposed to more intense neutron fields, or for longer times, more bubbles are produced and the statistical uncertainty of the dosimetric and spectrometric processes is reduced. A proposal of this nature is set out in this work, which presents ways to perform the counting process for bubble detectors and an updated procedure for acquiring images of the irradiated detectors in order to make manual counting easier. Twelve BDS detectors were irradiated by the RDS111 cyclotron at IEN (Instituto de Engenharia Nuclear) and photographed using an assembly specially designed for this experiment. Counting was first performed manually; simultaneously, ImagePro was used to perform the counting automatically. The manual and automatic bubble counts were compared, as were the time needed to obtain them and their levels of difficulty. After the bubble counting, the detectors' standardized responses were calculated in both cases, according to the BDS manual, and they were also compared. Among the results, counting on these devices becomes very difficult at large numbers of bubbles, with higher variation among counts of many bubbles. Because of the good agreement between manual counting and the custom program, the latter proved to be a good alternative on practical and economic grounds. Despite the good results, the custom program needs further adjustment in order to achieve better accuracy at higher bubble counts for neutron measurement applications. (author)
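
    As a generic alternative to a commercial image-analysis package, an automated count can be sketched with a threshold plus connected-component labelling; the image file name, threshold, and minimum blob size below are hypothetical and would need tuning to the actual detector photographs.

```python
# Sketch of a generic automated bubble count (threshold + connected components).
# The file name and all parameter values are placeholders requiring tuning.
import numpy as np
from PIL import Image
from scipy import ndimage

img = np.asarray(Image.open("bds_detector.png").convert("L"), dtype=float)
binary = img < 100                         # bubbles assumed darker than the gel background
labels, n_blobs = ndimage.label(binary)
blob_sizes = ndimage.sum(binary, labels, range(1, n_blobs + 1))
bubble_count = int(np.sum(blob_sizes > 20))   # discard specks smaller than ~20 pixels
print("bubble count:", bubble_count)
```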

  10. Gating based on internal/external signals with dynamic correlation updates

    International Nuclear Information System (INIS)

    Wu Huanmei; Zhao Qingya; Berbeco, Ross I; Nishioka, Seiko; Shirato, Hiroki; Jiang, Steve B

    2008-01-01

    Precise localization of mobile tumor positions in real time is critical to the success of gated radiotherapy. Tumor positions are usually derived from either internal or external surrogates. Fluoroscopic gating based on internal surrogates, such as implanted fiducial markers, is accurate but requires a large imaging dose. Gating based on external surrogates, such as patient abdominal surface motion, is non-invasive but less accurate due to the uncertainty in the correlation between tumor location and external surrogates. To address these complications, we propose to investigate an approach based on hybrid gating with dynamic internal/external correlation updates. In this approach, the external signal is acquired at high frequency (such as 30 Hz) while the internal signal is sparsely acquired (such as 0.5 Hz or less). The internal signal is used to validate and update the internal/external correlation during treatment. Tumor positions are derived from the external signal based on the newly updated correlation. Two dynamic correlation updating algorithms are introduced. One is based on the motion amplitude and the other is based on the motion phase. Nine patients with synchronized internal/external motion signals are simulated retrospectively to evaluate the effectiveness of hybrid gating. The influences of different clinical conditions on hybrid gating, such as the size of the gating window, the optimal timing for internal signal acquisition and the acquisition frequency, are investigated. The results demonstrate that dynamically updating the internal/external correlation in or around the gating window reduces false positives at the cost of somewhat diminished treatment efficiency. This improvement will benefit patients with mobile tumors, especially early-stage lung cancers, in which the tumors are loosely attached or floating freely in the lung.
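
    The abstract does not give the exact amplitude- and phase-based updating algorithms, so the sketch below only illustrates the general mechanism: a simple linear external-to-internal correlation model that is refit whenever a sparse internal sample arrives and then used to gate at the external monitoring rate. All signal values, the refit window, and the gating window are hypothetical.

```python
# Hedged sketch of a dynamically updated external/internal correlation for gating.
# The linear model, window sizes, and thresholds are illustrative placeholders.
from collections import deque
import numpy as np

recent_pairs = deque(maxlen=20)        # latest (external, internal) calibration pairs
coeffs = np.array([1.0, 0.0])          # internal ~ a * external + b

def on_internal_sample(external, internal):
    """Called at the sparse internal-imaging rate (e.g. ~0.5 Hz)."""
    global coeffs
    recent_pairs.append((external, internal))
    if len(recent_pairs) >= 3:
        ext, itn = np.array(recent_pairs).T
        coeffs = np.polyfit(ext, itn, 1)          # refresh the correlation model

def beam_on(external, gate_centre=0.0, gate_width=4.0):
    """Called at the high external-monitoring rate (e.g. 30 Hz)."""
    predicted_internal = np.polyval(coeffs, external)
    return abs(predicted_internal - gate_centre) <= gate_width / 2.0
```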

  11. Gating based on internal/external signals with dynamic correlation updates

    Energy Technology Data Exchange (ETDEWEB)

    Wu Huanmei [Purdue School of Engineering and Technology, Indiana University School of Informatics, IUPUI, Indianapolis, IN (United States); Zhao Qingya [School of Health Sciences, Purdue University, West Lafayette, IN (United States); Berbeco, Ross I [Department of Radiation Oncology, Dana-Farber/Brigham and Womens Cancer Center and Harvard Medical School, Boston, MA (United States); Nishioka, Seiko [NTT East-Japan Sapporo Hospital, Sapporo (Japan); Shirato, Hiroki [Hokkaido University Graduate School of Medicine, Sapporo (Japan); Jiang, Steve B [Department of Radiation Oncology, School of Medicine, University of California, San Diego, CA (United States)], E-mail: hw9@iupui.edu, E-mail: sbjiang@ucsd.edu

    2008-12-21

    Precise localization of mobile tumor positions in real time is critical to the success of gated radiotherapy. Tumor positions are usually derived from either internal or external surrogates. Fluoroscopic gating based on internal surrogates, such as implanted fiducial markers, is accurate but requires a large imaging dose. Gating based on external surrogates, such as patient abdominal surface motion, is non-invasive but less accurate due to the uncertainty in the correlation between tumor location and external surrogates. To address these complications, we propose to investigate an approach based on hybrid gating with dynamic internal/external correlation updates. In this approach, the external signal is acquired at high frequency (such as 30 Hz) while the internal signal is sparsely acquired (such as 0.5 Hz or less). The internal signal is used to validate and update the internal/external correlation during treatment. Tumor positions are derived from the external signal based on the newly updated correlation. Two dynamic correlation updating algorithms are introduced. One is based on the motion amplitude and the other is based on the motion phase. Nine patients with synchronized internal/external motion signals are simulated retrospectively to evaluate the effectiveness of hybrid gating. The influences of different clinical conditions on hybrid gating, such as the size of the gating window, the optimal timing for internal signal acquisition and the acquisition frequency, are investigated. The results demonstrate that dynamically updating the internal/external correlation in or around the gating window reduces false positives at the cost of somewhat diminished treatment efficiency. This improvement will benefit patients with mobile tumors, especially early-stage lung cancers, in which the tumors are loosely attached or floating freely in the lung.

  12. Environmental Regulatory Update Table, May/June 1992

    Energy Technology Data Exchange (ETDEWEB)

    Houlberg, L.M.; Hawkins, G.T.; Lewis, E.B.; Salk, M.S.

    1992-07-01

    This report contains a bi-monthly update of environmental regulatory activity that is of interest to the Department of Energy. It is provided to DOE operations and contractor staff to assist and support environmental management programs by tracking regulatory developments. Any proposed regulation that raises significant issues for any DOE operation should be reported to the Office of Environmental Guidance (EH-23) as soon as possible so that the Department can make its concerns known to the appropriate regulatory agency. Items of particular interest to EH-23 are indicated by a shading of the RU#.

  13. Environmental Regulatory Update Table, May/June 1992

    Energy Technology Data Exchange (ETDEWEB)

    Houlberg, L.M.; Hawkins, G.T.; Lewis, E.B.; Salk, M.S.

    1992-07-01

    This report contains a bi-monthly update of environmental regulatory activity that is of interest to the Department of Energy. It is provided to DOE operations and contractor staff to assist and support environmental management programs by tracking regulatory developments. Any proposed regulation that raises significant issues for any DOE operation should be reported to the Office of Environmental Guidance (EH-23) as soon as possible so that the Department can make its concerns known to the appropriate regulatory agency. Items of particular interest to EH-23 are indicated by a shading of the RU#.

  14. Updating of working memory: lingering bindings.

    Science.gov (United States)

    Oberauer, Klaus; Vockenberg, Kerstin

    2009-05-01

    Three experiments investigated proactive interference and proactive facilitation in a memory-updating paradigm. Participants remembered several letters or spatial patterns, distinguished by their spatial positions, and updated them by new stimuli up to 20 times per trial. Self-paced updating times were shorter when an item previously remembered and then replaced reappeared in the same location than when it reappeared in a different location. This effect demonstrates residual memory for no-longer-relevant bindings of items to locations. The effect increased with the number of items to be remembered. With one exception, updating times did not increase, and recall of final values did not decrease, over successive updating steps, thus providing little evidence for proactive interference building up cumulatively.

  15. The Pedestrian Detection Method Using an Extension Background Subtraction about the Driving Safety Support Systems

    Science.gov (United States)

    Muranaka, Noriaki; Date, Kei; Tokumaru, Masataka; Imanishi, Shigeru

    In recent years, traffic accidents have occurred frequently as traffic density has exploded. Therefore, we think that a safe and comfortable transportation system that protects pedestrians, the most vulnerable road users, is necessary. First, we detect and recognize the pedestrian (the crossing person) by image processing. Next, we inform drivers turning right or left of the pedestrian's presence by sound, image and other means. By prompting drivers to drive safely in this way, accidents involving pedestrians can be reduced. In this paper, we use a background subtraction method to detect moving objects. In background subtraction, the method of updating the background is important; in the conventional approach, the threshold values for the subtraction processing and the background update were identical. That is, the mixing rate of the input image and the background image in the background update was a fixed value, and fine tuning in response to environmental changes such as the weather was difficult. Therefore, we propose a background image update method in which estimation errors are not easily amplified. We experiment with and compare five cases: sunshine, cloud, evening, rain and changing sunlight, excluding night. This technique can set the threshold values for the subtraction processing and the background update processing separately, suited to environmental conditions such as the weather. Therefore, the mixing rate of the input image and the background image in the background update can be tuned freely. Because parameter settings suited to the environmental conditions are important for minimizing the error rate, we also examine the setting of the parameters.
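
    The decoupling of the two thresholds argued for above can be sketched as a running-average background model in which pixels close to the model are blended in quickly while pixels flagged as different are blended in slowly; the threshold and blending values below are illustrative, not the ones tuned in the study.

```python
# Sketch of a running-average background update with separate thresholds for
# foreground detection and background blending. All numeric values are illustrative.
import numpy as np

def update_background(frame, background,
                      detect_thresh=30.0, update_thresh=15.0,
                      alpha_close=0.05, alpha_far=0.005):
    """frame, background: float grayscale arrays of the same shape."""
    diff = np.abs(frame - background)
    foreground_mask = diff > detect_thresh        # used for pedestrian detection
    # Pixels near the model are absorbed quickly; differing pixels only slowly,
    # so a briefly stopped pedestrian is not folded into the background at once.
    mixing = np.where(diff <= update_thresh, alpha_close, alpha_far)
    new_background = (1.0 - mixing) * background + mixing * frame
    return foreground_mask, new_background
```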

  16. How do we update faces? Effects of gaze direction and facial expressions on working memory updating

    Directory of Open Access Journals (Sweden)

    Caterina eArtuso

    2012-09-01

    Full Text Available The aim of the study was to investigate how the biological binding between different facial dimensions, and their social and communicative relevance, may impact updating processes in working memory (WM). We focused on WM updating because it plays a key role in ongoing processing. Gaze direction and facial expression are crucial and changeable components of face processing. Direct gaze enhances the processing of approach-oriented facial emotional expressions (e.g. joy), while averted gaze enhances the processing of avoidance-oriented facial emotional expressions (e.g. fear). Thus, the way in which these two facial dimensions are combined communicates to the observer important behavioral and social information. Updating of these two facial dimensions and their bindings has not been investigated before, despite the fact that they provide a piece of social information essential for building and maintaining an internal ongoing representation of our social environment. In Experiment 1 we created a task in which the binding between gaze direction and facial expression was manipulated: high binding conditions (e.g. joy-direct gaze) were compared to low binding conditions (e.g. joy-averted gaze). Participants had to study and update continuously a number of faces, displaying different bindings between the two dimensions. In Experiment 2 we tested whether updating was affected by the social and communicative value of the facial dimension binding; to this end, we manipulated bindings between eye and hair color, two less communicative facial dimensions. Two new results emerged. First, faster response times were found in updating combinations of facial dimensions highly bound together. Second, our data showed that the ease of the ongoing updating processing varied depending on the communicative meaning of the binding that had to be updated. The results are discussed with reference to the role of WM updating in social cognition and appraisal processes.

  17. How do we update faces? Effects of gaze direction and facial expressions on working memory updating.

    Science.gov (United States)

    Artuso, Caterina; Palladino, Paola; Ricciardelli, Paola

    2012-01-01

    The aim of the study was to investigate how the biological binding between different facial dimensions, and their social and communicative relevance, may impact updating processes in working memory (WM). We focused on WM updating because it plays a key role in ongoing processing. Gaze direction and facial expression are crucial and changeable components of face processing. Direct gaze enhances the processing of approach-oriented facial emotional expressions (e.g., joy), while averted gaze enhances the processing of avoidance-oriented facial emotional expressions (e.g., fear). Thus, the way in which these two facial dimensions are combined communicates to the observer important behavioral and social information. Updating of these two facial dimensions and their bindings has not been investigated before, despite the fact that they provide a piece of social information essential for building and maintaining an internal ongoing representation of our social environment. In Experiment 1 we created a task in which the binding between gaze direction and facial expression was manipulated: high binding conditions (e.g., joy-direct gaze) were compared to low binding conditions (e.g., joy-averted gaze). Participants had to study and update continuously a number of faces, displaying different bindings between the two dimensions. In Experiment 2 we tested whether updating was affected by the social and communicative value of the facial dimension binding; to this end, we manipulated bindings between eye and hair color, two less communicative facial dimensions. Two new results emerged. First, faster response times were found in updating combinations of facial dimensions highly bound together. Second, our data showed that the ease of the ongoing updating processing varied depending on the communicative meaning of the binding that had to be updated. The results are discussed with reference to the role of WM updating in social cognition and appraisal processes.

  18. Millimetre Level Accuracy GNSS Positioning with the Blind Adaptive Beamforming Method in Interference Environments

    Directory of Open Access Journals (Sweden)

    Saeed Daneshmand

    2016-10-01

    Full Text Available The use of antenna arrays in Global Navigation Satellite System (GNSS) applications is gaining significant attention due to its superior capability to suppress both narrowband and wideband interference. However, the phase distortions resulting from array processing may limit the applicability of these methods for high precision applications using carrier phase based positioning techniques. This paper studies the phase distortions occurring with the adaptive blind beamforming method in which satellite angle of arrival (AoA) information is not employed in the optimization problem. To cater to non-stationary interference scenarios, the array weights of the adaptive beamformer are continuously updated. The effects of these continuous updates on the tracking parameters of a GNSS receiver are analyzed. The second part of this paper focuses on reducing the phase distortions during the blind beamforming process in order to allow the receiver to perform carrier phase based positioning by applying a constraint on the structure of the array configuration and by compensating the array uncertainties. Limitations of the previous methods are studied and a new method is proposed that keeps the simplicity of the blind beamformer structure and, at the same time, reduces tracking degradations while achieving millimetre level positioning accuracy in interference environments. To verify the applicability of the proposed method and analyze the degradations, array signals corresponding to the GPS L1 band are generated using a combination of hardware and software simulators. Furthermore, the amount of degradation and performance of the proposed method under different conditions are evaluated based on Monte Carlo simulations.
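
    As a rough illustration of blind (AoA-free) weight computation, the sketch below implements a simple power-inversion beamformer: output power is minimized while one reference element is held at unit gain. This is a generic textbook scheme offered only to make the idea concrete; it is not the constrained method proposed in the record, and the parameter names are assumptions.

    import numpy as np

    def power_inversion_weights(snapshots, ref=0, loading=1e-3):
        """Blind beamformer weights computed from array snapshots alone.

        snapshots: complex array of shape (n_antennas, n_samples).
        ref: index of the reference element constrained to unit gain.
        loading: diagonal loading for numerical robustness.
        Interference dominates the sample covariance, so minimizing output
        power subject to w^H e_ref = 1 suppresses it without knowing any AoA.
        """
        n_ant, n_smp = snapshots.shape
        R = snapshots @ snapshots.conj().T / n_smp
        R += loading * (np.trace(R).real / n_ant) * np.eye(n_ant)
        e = np.zeros(n_ant, dtype=complex)
        e[ref] = 1.0
        w = np.linalg.solve(R, e)
        return w / (e.conj() @ w)

    Because the weights are recomputed for every new block of snapshots to track non-stationary interference, each recomputation can introduce a small phase jump at the beamformer output, which is exactly the kind of distortion the record seeks to bound for carrier-phase positioning.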

  19. 49 CFR 1002.3 - Updating user fees.

    Science.gov (United States)

    2010-10-01

    ... updating fees. Each fee shall be updated by updating the cost components comprising the fee. Cost... direct labor costs are direct labor costs determined by the cost study set forth in Revision of Fees For... by total office costs for the Offices directly associated with user fee activity. Actual updating of...

  20. Novel Exponentially Fitted Two-Derivative Runge-Kutta Methods with Equation-Dependent Coefficients for First-Order Differential Equations

    Directory of Open Access Journals (Sweden)

    Yanping Yang

    2016-01-01

    Full Text Available The construction of exponentially fitted two-derivative Runge-Kutta (EFTDRK) methods for the numerical solution of first-order differential equations is investigated. The revised EFTDRK methods proposed, with equation-dependent coefficients, take into consideration the errors produced in the internal stages to the update. The local truncation errors and stability of the new methods are analyzed. The numerical results are reported to show the accuracy of the new methods.
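
    For readers unfamiliar with two-derivative Runge-Kutta schemes, the one-stage method below shows the basic ingredient: each step uses both y' = f and a supplied second derivative y'' = g. The exponentially fitted, equation-dependent coefficients of the record are not reproduced here; this is only the plain second-order scheme, written as a sketch.

    def tdrk_step(f, g, t, y, h):
        """One step of the simplest two-derivative Runge-Kutta method.

        f(t, y) returns y'; g(t, y) returns y'' (the total derivative of f),
        supplied analytically. The update y_{n+1} = y_n + h*f + (h^2/2)*g is
        second order; exponentially fitted variants replace the fixed
        coefficients 1 and 1/2 with equation-dependent ones.
        """
        return y + h * f(t, y) + 0.5 * h * h * g(t, y)

    # Example: y' = -y, hence y'' = y; the exact solution is exp(-t).
    f = lambda t, y: -y
    g = lambda t, y: y
    t, y, h = 0.0, 1.0, 0.1
    for _ in range(10):
        y = tdrk_step(f, g, t, y, h)
        t += h
    # y is now close to exp(-1) ≈ 0.3679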

  1. Ontology-Based Method for Fault Diagnosis of Loaders.

    Science.gov (United States)

    Xu, Feixiang; Liu, Xinhui; Chen, Wei; Zhou, Chen; Cao, Bingwei

    2018-02-28

    This paper proposes an ontology-based fault diagnosis method which overcomes the difficulty of understanding complex fault diagnosis knowledge of loaders and offers a universal approach for fault diagnosis of all loaders. This method contains the following components: (1) an ontology-based fault diagnosis model is proposed to achieve the integration, sharing and reuse of fault diagnosis knowledge for loaders; (2) combined with ontology, CBR (case-based reasoning) is introduced to realize effective and accurate fault diagnoses following four steps (feature selection, case-retrieval, case-matching and case-updating); and (3) to compensate for the shortcomings of the CBR method when relevant cases are lacking, ontology-based RBR (rule-based reasoning) is put forward through building SWRL (Semantic Web Rule Language) rules. An application program is also developed to implement the above methods to assist in finding the fault causes, fault locations and maintenance measures of loaders. In addition, the program is validated through the analysis of a case study.
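
    The CBR cycle named in the record (feature selection, case retrieval, case matching, case updating) can be pictured with the minimal sketch below. The data layout, similarity measure and threshold are illustrative assumptions and do not reflect the ontology or SWRL rules of the actual system.

    def retrieve_best_case(case_base, query, weights=None):
        """Return the stored fault case most similar to the observed symptoms.

        case_base: list of dicts with keys 'features' (dict) and 'diagnosis'.
        query: dict of observed feature values.
        weights: per-feature importance from feature selection (default 1.0).
        """
        weights = weights or {}

        def similarity(case):
            shared = set(case["features"]) & set(query)
            if not shared:
                return 0.0
            hits = sum(weights.get(k, 1.0) * (case["features"][k] == query[k]) for k in shared)
            return hits / sum(weights.get(k, 1.0) for k in shared)

        best = max(case_base, key=similarity)
        return best, similarity(best)

    def update_case_base(case_base, query, confirmed_diagnosis, threshold=0.9):
        """Case updating: store the solved problem unless a near-duplicate already exists."""
        score = retrieve_best_case(case_base, query)[1] if case_base else 0.0
        if score < threshold:
            case_base.append({"features": dict(query), "diagnosis": confirmed_diagnosis})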

  2. Technical Update: Preimplantation Genetic Diagnosis and Screening.

    Science.gov (United States)

    Dahdouh, Elias M; Balayla, Jacques; Audibert, François; Wilson, R Douglas; Audibert, François; Brock, Jo-Ann; Campagnolo, Carla; Carroll, June; Chong, Karen; Gagnon, Alain; Johnson, Jo-Ann; MacDonald, William; Okun, Nanette; Pastuck, Melanie; Vallée-Pouliot, Karine

    2015-05-01

    To update and review the techniques and indications of preimplantation genetic diagnosis (PGD) and preimplantation genetic screening (PGS). Discussion about the genetic and technical aspects of preimplantation reproductive techniques, particularly those using new cytogenetic technologies and embryo-stage biopsy. Clinical outcomes of reproductive techniques following the use of PGD and PGS are included. This update does not discuss in detail the adverse outcomes that have been recorded in association with assisted reproductive technologies. Published literature was retrieved through searches of The Cochrane Library and Medline in April 2014 using appropriate controlled vocabulary (aneuploidy, blastocyst/physiology, genetic diseases, preimplantation diagnosis/methods, fertilization in vitro) and key words (e.g., preimplantation genetic diagnosis, preimplantation genetic screening, comprehensive chromosome screening, aCGH, SNP microarray, qPCR, and embryo selection). Results were restricted to systematic reviews, randomized controlled trials/controlled clinical trials, and observational studies published from 1990 to April 2014. There were no language restrictions. Searches were updated on a regular basis and incorporated in the update to January 2015. Additional publications were identified from the bibliographies of retrieved articles. Grey (unpublished) literature was identified through searching the websites of health technology assessment and health technology-related agencies, clinical practice guideline collections, clinical trial registries, and national and international medical specialty societies. The quality of evidence in this document was rated using the criteria described in the Report of the Canadian Task Force on Preventive Health Care. (Table 1) BENEFITS, HARMS, AND COSTS: This update will educate readers about new preimplantation genetic concepts, directions, and technologies. The major harms and costs identified are those of assisted reproductive

  3. Updating the asymmetric osmium-catalyzed dihydroxylation (AD) mnemonic. Q2MM modeling and new kinetic measurements

    DEFF Research Database (Denmark)

    Fristrup, Peter; Tanner, David Ackland; Norrby, Per-Ola

    2003-01-01

    The mnemonic device for predicting stereoselectivities in the Sharpless asymmetric dihydroxylation (AD) reaction has been updated based on extensive computational studies. Kinetic measurements from competition reactions validate the new proposal. The interactions responsible for the high stereose...

  4. Use of an advanced document system in post-refuelling updating of nuclear power plant documentation

    International Nuclear Information System (INIS)

    Puech Suanzes, P.; Cortes Soler, M.

    1993-01-01

    This paper discusses the results of the extensive use of an advanced document system to update documentation prepared by traditional methods and affected by changes in the period between two plant refuellings. The implementation of a system for the capture, retrieval and storage of drawings using optical discs is part of a plan to modernize production and management tools and thus achieve better control of document configuration. These processes are consequently optimized in that: 1. The deterioration of drawings is halted with the help of an identical, updated, legible, reliable support for all users. 2. The time required to update documentation is reduced. Given the large number of drawings, the implementation method should effectively combine costs and time. The document management tools ensure optical disc storage control so that from the moment a drawing resides in the system, any modification to it is made through the system utilities, thus ensuring quality and reducing schedules. The system described was used to update the electrical drawings of Almaraz Nuclear Power Plant. Changes made during the eighth refuelling of Unit I were incorporated and the time needed to issue the updated drawings was reduced by one month. (author)

  5. Delayed Slater determinant update algorithms for high efficiency quantum Monte Carlo

    Science.gov (United States)

    McDaniel, T.; D'Azevedo, E. F.; Li, Y. W.; Wong, K.; Kent, P. R. C.

    2017-11-01

    Within ab initio Quantum Monte Carlo simulations, the leading numerical cost for large systems is the computation of the values of the Slater determinants in the trial wavefunction. Each Monte Carlo step requires finding the determinant of a dense matrix. This is most commonly iteratively evaluated using a rank-1 Sherman-Morrison updating scheme to avoid repeated explicit calculation of the inverse. The overall computational cost is, therefore, formally cubic in the number of electrons or matrix size. To improve the numerical efficiency of this procedure, we propose a novel multiple rank delayed update scheme. This strategy enables probability evaluation with an application of accepted moves to the matrices delayed until after a predetermined number of moves, K. The accepted events are then applied to the matrices en bloc with enhanced arithmetic intensity and computational efficiency via matrix-matrix operations instead of matrix-vector operations. This procedure does not change the underlying Monte Carlo sampling or its statistical efficiency. For calculations on large systems and algorithms such as diffusion Monte Carlo, where the acceptance ratio is high, order of magnitude improvements in the update time can be obtained on both multi-core central processing units and graphical processing units.
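
    The rank-1 Sherman-Morrison update mentioned in the record replaces one row of the Slater matrix (one electron's orbital values) and refreshes the inverse in O(N^2). The sketch below shows that elementary step, which the proposed delayed scheme generalises by accumulating K accepted rank-1 corrections and applying them together as matrix-matrix products; the function name and interface are assumptions for illustration.

    import numpy as np

    def sherman_morrison_row_update(Ainv, k, new_row):
        """Replace row k of A by new_row and update the inverse in O(N^2).

        Only the current inverse Ainv is needed. The returned ratio equals
        det(A_new)/det(A_old), which is the quantity entering the Monte Carlo
        acceptance test, so the determinant itself is never recomputed.
        """
        col = Ainv[:, k]
        ratio = new_row @ col              # determinant ratio of the proposed move
        w = new_row @ Ainv
        w[k] -= 1.0                        # w = (new_row - old_row) @ Ainv
        Ainv_new = Ainv - np.outer(col, w) / ratio
        return ratio, Ainv_new

    In the delayed variant, the vectors col and w of each accepted move are stored and the accumulated rank-K correction is applied en bloc, trading many matrix-vector products for one higher-arithmetic-intensity matrix-matrix product.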

  6. Concepts of incremental updating and versioning

    CSIR Research Space (South Africa)

    Cooper, Antony K

    2004-07-01

    Full Text Available of the work undertaken recently by the Working Group (WG). The WG was voted to become a Commission by the General Assembly held at the 21st ICC in Durban, South Africa. The basic problem being addressed by the Commission is that a user compiles their data base... or election). Historically, updates have been provided in bulk, with the new data set replacing the old one. Users could: ignore the update (if it is not significant enough), manually (and selectively) update their data base, or accept the whole update...

  7. Due date assignment procedures with dynamically updated coefficients for multi-level assembly job shops

    NARCIS (Netherlands)

    Adam, N.R.; Bertrand, J.W.M.; Morehead, D.C.; Surkis, J.

    1993-01-01

    This paper presents a study of due date assignment procedures in job shop environments where multi-level assembly jobs are processed and due dates are internally assigned. Most of the reported studies in the literature have focused on string type jobs. We propose a dynamic update approach (which

  8. Frequently updated noise threat maps created with use of supercomputing grid

    Directory of Open Access Journals (Sweden)

    Szczodrak Maciej

    2014-09-01

    Full Text Available Innovative supercomputing grid services devoted to noise threat evaluation were presented. The services described in this paper concern two issues: the first is related to noise mapping, while the second focuses on assessment of the noise dose and its influence on the human hearing system. The discussed services were developed within the PL-Grid Plus Infrastructure, which brings together Polish academic supercomputer centers. Selected experimental results achieved by the usage of the proposed services were presented. The assessment of environmental noise threats includes creation of noise maps using either offline or online data acquired through a grid of monitoring stations. A concept of estimating the source model parameters from measured sound levels for the purpose of creating frequently updated noise maps was presented. Connecting the noise mapping grid service with a distributed sensor network enables noise maps to be updated automatically for a specified time period. Moreover, a unique attribute of the developed software is the estimation of the auditory effects evoked by exposure to noise. The estimation method uses a modified psychoacoustic model of hearing and is based on the calculated noise level values and on the given exposure period. Potential use scenarios of the grid services for research or educational purposes were introduced. Presentation of the predicted hearing threshold shift caused by exposure to excessive noise can raise public awareness of noise threats.

  9. Real Time Updating in Distributed Urban Rainfall Runoff Modelling

    DEFF Research Database (Denmark)

    Borup, Morten; Madsen, Henrik

    that are being updated from system measurements was studied. The results showed that the fact alone that it takes time for rainfall data to travel the distance between gauges and catchments has such a big negative effect on the forecast skill of updated models, that it can justify the choice of even very...... as in a real data case study. The results confirmed that the method is indeed suitable for DUDMs and that it can be used to utilise upstream as well as downstream water level and flow observations to improve model estimates and forecasts. Due to upper and lower sensor limits many sensors in urban drainage...

  10. Effective Filtering of Query Results on Updated User Behavioral Profiles in Web Mining

    Directory of Open Access Journals (Sweden)

    S. Sadesh

    2015-01-01

    Full Text Available The web, with its tremendous volume of information, retrieves results for user-related queries. With the rapid growth of web page recommendation, results retrieved with data mining techniques did not offer a high filtering rate because the relationships between user profiles and queries were not analyzed in an extensive manner. At the same time, existing user-profile-based prediction in web data mining is not exhaustive in producing a personalized result rate. To improve the query result rate with respect to the dynamics of user behavior over time, the Hamilton Filtered Regime Switching User Query Probability (HFRS-UQP) framework is proposed. The HFRS-UQP framework is split into two processes, in which filtering and switching are carried out. The data-mining-based filtering in our research work uses the Hamilton filtering framework to filter user results based on personalized information from profiles that are automatically updated through the search engine. A maximized result set is fetched, that is, filtered, with respect to user behavior profiles. The switching performs accurate filtering on the updated profiles using regime switching. The updating on profile change (i.e., regime switches) in the HFRS-UQP framework identifies the second- and higher-order association of query results with the updated profiles. Experiments are conducted on factors such as personalized information search retrieval rate, filtering efficiency, and precision ratio.

  11. Optimal plot size in the evaluation of papaya scions: proposal and comparison of methods

    Directory of Open Access Journals (Sweden)

    Humberto Felipe Celanti

    Full Text Available ABSTRACT Evaluating the quality of scions is extremely important and it can be done through characteristics of the shoots and roots. This experiment evaluated the height of the aerial part, stem diameter, number of leaves, petiole length and root length of papaya seedlings. Analyses were performed on a blank trial with 240 seedlings of "Golden Pecíolo Curto". The optimum plot size was determined by applying the maximum curvature method, the maximum curvature method of the coefficient of variation, and a newly proposed method which incorporates bootstrap resampling simulation into the maximum curvature method. According to the results obtained, five is the optimal number of seedlings of papaya "Golden Pecíolo Curto" per plot. The proposed method of bootstrap simulation with replacement provides optimal plot sizes equal to or larger than those of the maximum curvature method, and the same plot size as the maximum curvature method of the coefficient of variation.
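
    The record does not spell out the resampling procedure, so the following is only a plausible sketch of combining bootstrap resampling with a numerically located maximum-curvature point of the CV versus plot-size curve; the grouping of seedlings into plots and all default values are assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    def cv_by_plot_size(values, sizes):
        """Coefficient of variation (%) among plots formed from consecutive seedlings.

        Choose sizes so that at least two plots remain for every plot size.
        """
        cvs = []
        for s in sizes:
            n_plots = len(values) // s
            plots = values[: n_plots * s].reshape(n_plots, s).mean(axis=1)
            cvs.append(100.0 * plots.std(ddof=1) / plots.mean())
        return np.array(cvs, dtype=float)

    def optimal_size_max_curvature(sizes, cvs):
        """Plot size at the numerically estimated maximum curvature of CV(size)."""
        d1 = np.gradient(cvs, sizes)
        d2 = np.gradient(d1, sizes)
        curvature = np.abs(d2) / (1.0 + d1 ** 2) ** 1.5
        return sizes[int(np.argmax(curvature))]

    def bootstrap_optimal_size(values, sizes, n_boot=1000):
        """Average optimal plot size over bootstrap resamples (with replacement) of the seedlings."""
        sizes = np.asarray(sizes, dtype=float)
        estimates = [
            optimal_size_max_curvature(sizes, cv_by_plot_size(rng.choice(values, len(values)), sizes))
            for _ in range(n_boot)
        ]
        return float(np.mean(estimates))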

  12. 77 FR 32137 - Agency Information Collection Activities: Proposed Collection; Comments Requested; Strategic...

    Science.gov (United States)

    2012-05-31

    ...] Agency Information Collection Activities: Proposed Collection; Comments Requested; Strategic Planning... Form/Collection: Strategic Planning Environmental Assessment Outreach. (3) Agency form number, if any... satisfaction. This act requires that agencies update and revise their strategic plans every three years. The...

  13. FDA Developments: Food Code 2013 and Proposed Trans Fat Determination

    NARCIS (Netherlands)

    Grossman, M.R.

    2014-01-01

    I. Food Code 2013 and Food Code Reference System. Since 1993, the US Food and Drug Administration has published a Food Code, now updated every four years. In November 2013, the

  14. Proposed Project Selection Method for Human Support Research and Technology Development (HSR&TD)

    Science.gov (United States)

    Jones, Harry

    2005-01-01

    The purpose of HSR&TD is to deliver human support technologies to the Exploration Systems Mission Directorate (ESMD) that will be selected for future missions. This requires identifying promising candidate technologies and advancing them in technology readiness until they are acceptable. HSR&TD must select an array of technology development projects, guide them, and either terminate or continue them, so as to maximize the resulting number of usable advanced human support technologies. This paper proposes an effective project scoring methodology to support managing the HSR&TD project portfolio. Researchers strongly disagree as to what are the best technology project selection methods, or even if there are any proven ones. Technology development is risky and outstanding achievements are rare and unpredictable. There is no simple formula for success. Organizations that are satisfied with their project selection approach typically use a mix of financial, strategic, and scoring methods in an open, established, explicit, formal process. This approach helps to build consensus and develop management insight. It encourages better project proposals by clarifying the desired project attributes. We propose a project scoring technique based on a method previously used in a federal laboratory and supported by recent research. Projects are ranked by their perceived relevance, risk, and return - a new 3 R's. Relevance is the degree to which the project objective supports the HSR&TD goal of developing usable advanced human support technologies. Risk is the estimated probability that the project will achieve its specific objective. Return is the reduction in mission life cycle cost obtained if the project is successful. If the project objective technology performs a new function with no current cost, its return is the estimated cash value of performing the new function. The proposed project selection scoring method includes definitions of the criteria, a project evaluation
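
    The scoring idea in the record (rank by relevance, risk, and return) lends itself to a simple expected-value calculation. The sketch below is purely illustrative: the multiplicative combination, the example projects, and all numbers are assumptions, not values from the proposal.

    def project_score(relevance, success_probability, lifecycle_cost_saving):
        """Illustrative combined score for the '3 R' ranking.

        relevance: 0-1, degree to which the objective supports the HSR&TD goal.
        success_probability: 0-1, the 'risk' criterion (chance of meeting the objective).
        lifecycle_cost_saving: mission life-cycle cost reduction if successful,
            or the cash value of a new function (e.g. in $M).
        """
        return relevance * success_probability * lifecycle_cost_saving

    # Hypothetical example ranking.
    projects = {
        "regenerative CO2 removal": project_score(0.9, 0.6, 120.0),
        "advanced water recovery": project_score(0.8, 0.7, 90.0),
    }
    ranking = sorted(projects, key=projects.get, reverse=True)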

  15. Characterization of heterogeneous reservoirs: sentinels method and quantification of uncertainties; Caracterisation des reservoirs heterogenes: methode des sentinelles et quantification des incertitudes

    Energy Technology Data Exchange (ETDEWEB)

    Mezghani, M.

    1999-02-11

    The aim of this thesis is to propose a new inversion method that allows both improved reservoir characterization and management of uncertainties. In this approach, the identification of the permeability distribution is conducted using the sentinel method in order to match the pressure data. This approach, based on optimal control theory, can be seen as an alternative to the least-squares method. Here, we prove the existence of exact sentinels under a regularity hypothesis. From a numerical point of view, we consider regularized sentinels. We suggest a novel approach to updating the penalization coefficient in order to improve numerical robustness. Moreover, the flexibility of the sentinel method makes it possible to develop a way to treat noisy pressure data. To deal with geostatistical modelling of the permeability distribution, we propose to link the pilot point method with sentinels to achieve the identification of permeability. We particularly focus on the optimal location of pilot points. Finally, we present an original method, based on adjoint state computations, to quantify the contribution of dynamic data to the characterisation of a calibrated geostatistical model. (author) 67 refs.

  16. 78 FR 19190 - Proposed Information Collection; Comment Request; 2013 Company Organization Survey

    Science.gov (United States)

    2013-03-29

    ... DEPARTMENT OF COMMERCE U.S. Census Bureau Proposed Information Collection; Comment Request; 2013 Company Organization Survey AGENCY: U.S. Census Bureau, Commerce. ACTION: Notice. SUMMARY: The Department... Bureau conducts the annual Company Organization Survey (COS) to update and maintain a central...

  17. 76 FR 71511 - Proposed Information Collection; Comment Request; 2012 Company Organization Survey

    Science.gov (United States)

    2011-11-18

    ... DEPARTMENT OF COMMERCE U.S. Census Bureau Proposed Information Collection; Comment Request; 2012 Company Organization Survey AGENCY: U.S. Census Bureau, Commerce. ACTION: Notice. SUMMARY: The Department... Bureau conducts the annual Company Organization Survey (COS) to update and maintain a central...

  18. Methodology, Measurement and Analysis of Flow Table Update Characteristics in Hardware OpenFlow Switches

    KAUST Repository

    Kuźniar, Maciej

    2018-02-15

    Software-Defined Networking (SDN) and OpenFlow are actively being standardized and deployed. These deployments rely on switches that come from various vendors and differ in terms of performance and available features. Understanding these differences and performance characteristics is essential for ensuring successful and safe deployments. We propose a systematic methodology for SDN switch performance analysis and devise a series of experiments based on this methodology. The methodology relies on sending a stream of rule updates, while relying on both observing the control plane view as reported by the switch and probing the data plane state to determine switch characteristics by comparing these views. We measure, report and explain the performance characteristics of flow table updates in six hardware OpenFlow switches. Our results describing rule update rates can help SDN designers make their controllers efficient. Further, we also highlight differences between the OpenFlow specification and its implementations that, if ignored, pose a serious threat to network security and correctness.

  19. Taxonomy of the family Arenaviridae and the order Bunyavirales: update 2018.

    Science.gov (United States)

    Maes, Piet; Alkhovsky, Sergey V; Bào, Yīmíng; Beer, Martin; Birkhead, Monica; Briese, Thomas; Buchmeier, Michael J; Calisher, Charles H; Charrel, Rémi N; Choi, Il Ryong; Clegg, Christopher S; de la Torre, Juan Carlos; Delwart, Eric; DeRisi, Joseph L; Di Bello, Patrick L; Di Serio, Francesco; Digiaro, Michele; Dolja, Valerian V; Drosten, Christian; Druciarek, Tobiasz Z; Du, Jiang; Ebihara, Hideki; Elbeaino, Toufic; Gergerich, Rose C; Gillis, Amethyst N; Gonzalez, Jean-Paul J; Haenni, Anne-Lise; Hepojoki, Jussi; Hetzel, Udo; Hồ, Thiện; Hóng, Ní; Jain, Rakesh K; Jansen van Vuren, Petrus; Jin, Qi; Jonson, Miranda Gilda; Junglen, Sandra; Keller, Karen E; Kemp, Alan; Kipar, Anja; Kondov, Nikola O; Koonin, Eugene V; Kormelink, Richard; Korzyukov, Yegor; Krupovic, Mart; Lambert, Amy J; Laney, Alma G; LeBreton, Matthew; Lukashevich, Igor S; Marklewitz, Marco; Markotter, Wanda; Martelli, Giovanni P; Martin, Robert R; Mielke-Ehret, Nicole; Mühlbach, Hans-Peter; Navarro, Beatriz; Ng, Terry Fei Fan; Nunes, Márcio Roberto Teixeira; Palacios, Gustavo; Pawęska, Janusz T; Peters, Clarence J; Plyusnin, Alexander; Radoshitzky, Sheli R; Romanowski, Víctor; Salmenperä, Pertteli; Salvato, Maria S; Sanfaçon, Hélène; Sasaya, Takahide; Schmaljohn, Connie; Schneider, Bradley S; Shirako, Yukio; Siddell, Stuart; Sironen, Tarja A; Stenglein, Mark D; Storm, Nadia; Sudini, Harikishan; Tesh, Robert B; Tzanetakis, Ioannis E; Uppala, Mangala; Vapalahti, Olli; Vasilakis, Nikos; Walker, Peter J; Wáng, Guópíng; Wáng, Lìpíng; Wáng, Yànxiăng; Wèi, Tàiyún; Wiley, Michael R; Wolf, Yuri I; Wolfe, Nathan D; Wú, Zhìqiáng; Xú, Wénxìng; Yang, Li; Yāng, Zuòkūn; Yeh, Shyi-Dong; Zhāng, Yǒng-Zhèn; Zhèng, Yàzhōu; Zhou, Xueping; Zhū, Chénxī; Zirkel, Florian; Kuhn, Jens H

    2018-04-21

    In 2018, the family Arenaviridae was expanded by inclusion of 1 new genus and 5 novel species. At the same time, the recently established order Bunyavirales was expanded by 3 species. This article presents the updated taxonomy of the family Arenaviridae and the order Bunyavirales as now accepted by the International Committee on Taxonomy of Viruses (ICTV) and summarizes additional taxonomic proposals that may affect the order in the near future.

  20. Xanthones of Lichen Source: A 2016 Update.

    Science.gov (United States)

    Le Pogam, Pierre; Boustie, Joël

    2016-03-02

    An update of xanthones encountered in lichens is proposed as more than 20 new xanthones have been described since the publication of the compendium of lichen metabolites by Huneck and Yoshimura in 1996. The last decades witnessed major advances regarding the elucidation of biosynthetic schemes leading to these fascinating compounds, accounting for the unique substitution patterns of a very vast majority of lichen xanthones. Besides a comprehensive analysis of the structures of xanthones described in lichens, their bioactivities and the emerging analytical strategies used to pinpoint them within lichens are presented here together with physico-chemical properties (including NMR data) as reported since 1996.

  1. Receiver Operating Characteristic Curve-Based Prediction Model for Periodontal Disease Updated With the Calibrated Community Periodontal Index.

    Science.gov (United States)

    Su, Chiu-Wen; Yen, Amy Ming-Fang; Lai, Hongmin; Chen, Hsiu-Hsi; Chen, Sam Li-Sheng

    2017-12-01

    The accuracy of a prediction model for periodontal disease using the community periodontal index (CPI) has previously been assessed with the area under a receiver operating characteristic (AUROC) curve. How the uncalibrated CPI, as measured by general dentists trained by periodontists in a large epidemiologic study, affects the performance of a prediction model has not been researched yet. A two-stage design was used, first conducting a validation study to calibrate the CPI between a senior periodontal specialist and the trained general dentists who measured CPIs in the main study of a nationwide survey. A Bayesian hierarchical logistic regression model was applied to estimate the non-updated and updated clinical weights used for building up risk scores. How the calibrated CPI affected the performance of the updated prediction model was quantified by comparing AUROC curves between the original and updated models. Estimates regarding calibration of the CPI obtained from the validation study were 66% and 85% for sensitivity and specificity, respectively. After updating, the clinical weights of each predictor were inflated, and the risk score for the highest risk category was elevated from 434 to 630. This update improved the AUROC performance of the two corresponding prediction models from 62.6% (95% confidence interval [CI]: 61.7% to 63.6%) for the non-updated model to 68.9% (95% CI: 68.0% to 69.6%) for the updated one, a statistically significant difference. Improved performance of the prediction model was thus demonstrated for periodontal disease as measured by the calibrated CPI derived from a large epidemiologic survey.

  2. Reacting to different types of concept drift: the Accuracy Updated Ensemble algorithm.

    Science.gov (United States)

    Brzezinski, Dariusz; Stefanowski, Jerzy

    2014-01-01

    Data stream mining has been receiving increased attention due to its presence in a wide range of applications, such as sensor networks, banking, and telecommunication. One of the most important challenges in learning from data streams is reacting to concept drift, i.e., unforeseen changes of the stream's underlying data distribution. Several classification algorithms that cope with concept drift have been put forward, however, most of them specialize in one type of change. In this paper, we propose a new data stream classifier, called the Accuracy Updated Ensemble (AUE2), which aims at reacting equally well to different types of drift. AUE2 combines accuracy-based weighting mechanisms known from block-based ensembles with the incremental nature of Hoeffding Trees. The proposed algorithm is experimentally compared with 11 state-of-the-art stream methods, including single classifiers, block-based and online ensembles, and hybrid approaches in different drift scenarios. Out of all the compared algorithms, AUE2 provided best average classification accuracy while proving to be less memory consuming than other ensemble approaches. Experimental results show that AUE2 can be considered suitable for scenarios, involving many types of drift as well as static environments.
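
    The accuracy-based weighting that gives AUE2 its name can be illustrated with the simplified sketch below, which weights ensemble members by their error on the newest data block. It is not the exact AUE2 weighting function or its Hoeffding-tree machinery; classifier objects with a scikit-learn-style predict_proba interface are assumed.

    import numpy as np

    def block_weights(members, X_block, y_block, eps=1e-6):
        """Weight each member inversely to its squared error on the newest block.

        Members tuned to the current concept receive large weights, so the
        ensemble adapts after a drift. Assumes class labels are 0..k-1 and
        match the columns of predict_proba.
        """
        weights = []
        for clf in members:
            proba = clf.predict_proba(X_block)
            p_true = proba[np.arange(len(y_block)), y_block]
            mse = np.mean((1.0 - p_true) ** 2)
            weights.append(1.0 / (mse + eps))
        return np.array(weights)

    def ensemble_predict(members, weights, X):
        """Weighted soft vote over all ensemble members."""
        votes = sum(w * clf.predict_proba(X) for w, clf in zip(weights, members))
        return votes.argmax(axis=1)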

  3. Modelling precipitation extremes in the Czech Republic: update of intensity–duration–frequency curves

    Directory of Open Access Journals (Sweden)

    Michal Fusek

    2016-11-01

    Full Text Available Precipitation records from six stations of the Czech Hydrometeorological Institute were subjected to statistical analysis with the objectives of updating the intensity–duration–frequency (IDF) curves, by applying extreme value distributions, and comparing the updated curves against those produced by an empirical procedure in 1958. Another objective was to investigate differences between the two sets of curves, which could be explained by such factors as different measuring instruments, measuring station altitudes and data analysis methods. It has been shown that the differences between the two sets of IDF curves are significantly influenced by the chosen method of data analysis.
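
    A common way to produce one point of an IDF curve from annual maxima is to fit a generalized extreme value (GEV) distribution and evaluate its quantile for the desired return period; the record does not state which extreme value family was used, so the GEV choice, the function name and the units below are assumptions.

    import numpy as np
    from scipy.stats import genextreme

    def idf_intensity(annual_max_depth_mm, duration_min, return_period_yr):
        """Design rainfall intensity (mm/h) for one duration and return period.

        annual_max_depth_mm: annual maximum rainfall depths for this duration.
        A GEV distribution is fitted to the maxima and its quantile at the
        non-exceedance probability 1 - 1/T gives the design depth; repeating
        this over several durations and return periods yields the IDF curves.
        """
        shape, loc, scale = genextreme.fit(np.asarray(annual_max_depth_mm, dtype=float))
        depth = genextreme.ppf(1.0 - 1.0 / return_period_yr, shape, loc=loc, scale=scale)
        return depth / (duration_min / 60.0)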

  4. The string-junction picture of multiquark states: an update

    CERN Document Server

    Rossi, Giancarlo

    2016-06-07

    We recall and update, both theoretically and phenomenologically, our (nearly) forty-year-old proposal of a string-junction as a necessary complement to the conventional classification of hadrons based just on their quark-antiquark constituents. In that proposal single (though in general metastable) hadronic states are associated with "irreducible" gauge-invariant operators consisting of Wilson lines (visualized as strings of color flux tubes) that may either end on a quark or an antiquark, or annihilate in triplets at a junction $J$ or an anti-junction $\bar{J}$. For the junction-free sector (ordinary $q\,\bar{q}$ mesons and glueballs) the picture is supported by large-$N$ (number of colors) considerations as well as by a lattice strong-coupling expansion. Both imply the famous OZI rule suppressing quark-antiquark annihilation diagrams. For hadrons with $J$ and/or $\bar{J}$ constituents the same expansions support our proposal, including its generalization of the OZI rule to the suppression of $J-\bar{J}$ a...

  5. Comments and Remarks over Classic Linear Loop-Gain Method for Oscillator Design and Analysis. New Proposed Method Based on NDF/RRT

    Directory of Open Access Journals (Sweden)

    J. L. Jimenez-Martin

    2012-04-01

    Full Text Available The present paper describes a new method for designing oscillators based on the Normalized Determinant Function (NDF) and Return Relations (RRT). First a review of the loop-gain method is performed, showing pros and cons and including some examples that explore wrong solutions provided by this method. These are wrong solutions because certain conditions have to be fulfilled beforehand in order to obtain right ones; these conditions are described, and it is finally demonstrated that NDF analysis is necessary, including the usefulness of the Return Relations (RRT), which in fact are related to the true loop gain. Concluding the paper, steps for oscillator design and analysis using the proposed NDF/RRT method are presented and compared with the previous wrong solutions, pointing out the new accuracy achieved in the prediction of the oscillation frequency and QL. In addition, more examples of reference-plane oscillators (Z/Y/rho) are added, for which application of the loop-gain method is clearly difficult or even impossible, and they are solved with the new proposed NDF/RRT method.

  6. "Etxadi-Gangoiti" Scale: A Proposal to Evaluate the Family Contexts of Two-Year-Old Children

    Science.gov (United States)

    Arranz Freijo, Enrique B.; Olabarrieta Artetxe, Fernando; Manzano Fernández, Ainhoa; Martín Ayala, Juan luís; Galende Pérez, Nuria

    2014-01-01

    This paper makes a proposal for the comprehensive assessment of the family context of children aged two years. It offers an updated resource based on recent research into the assessment of family contexts and their influence on children's psychological development. The proposal explores the following areas: Presence of learning materials;…

  7. 78 FR 33138 - Self-Regulatory Organizations; The Options Clearing Corporation; Notice of Filing of Proposed...

    Science.gov (United States)

    2013-06-03

    ... subject to the same risk of undue pressure from investors. These situations are also less likely to fit... provisions, to conform cross-references contained in other By-Laws to changes being proposed herein and to... changes being proposed are conforming in nature in that they update cross-references to By-Laws and Rules...

  8. The LANDFIRE Refresh strategy: updating the national dataset

    Science.gov (United States)

    Nelson, Kurtis J.; Connot, Joel A.; Peterson, Birgit E.; Martin, Charley

    2013-01-01

    The LANDFIRE Program provides comprehensive vegetation and fuel datasets for the entire United States. As with many large-scale ecological datasets, vegetation and landscape conditions must be updated periodically to account for disturbances, growth, and natural succession. The LANDFIRE Refresh effort was the first attempt to consistently update these products nationwide. It incorporated a combination of specific systematic improvements to the original LANDFIRE National data, remote sensing based disturbance detection methods, field collected disturbance information, vegetation growth and succession modeling, and vegetation transition processes. This resulted in the creation of two complete datasets for all 50 states: LANDFIRE Refresh 2001, which includes the systematic improvements, and LANDFIRE Refresh 2008, which includes the disturbance and succession updates to the vegetation and fuel data. The new datasets are comparable for studying landscape changes in vegetation type and structure over a decadal period, and provide the most recent characterization of fuel conditions across the country. The applicability of the new layers is discussed and the effects of using the new fuel datasets are demonstrated through a fire behavior modeling exercise using the 2011 Wallow Fire in eastern Arizona as an example.

  9. Incrementally Detecting Change Types of Spatial Area Object: A Hierarchical Matching Method Considering Change Process

    Directory of Open Access Journals (Sweden)

    Yanhui Wang

    2018-01-01

    Full Text Available Detecting and extracting the change types of spatial area objects can track area objects' spatiotemporal change patterns and provide a change backtracking mechanism for incrementally updating spatial datasets. To address the problems of the high complexity of detection methods, the high redundancy rate of detection factors, and the low degree of automation during the incremental update process, we take into account the change process of area objects in an integrated way and propose a hierarchical matching method to detect the nine types of changes of area objects, while minimizing the complexity of the algorithm and the redundancy rate of detection factors. We illustrate in detail the identification, extraction, and database entry of change types, and how we achieve a close connection and organic coupling of incremental information extraction and object type-of-change detection so as to characterize the whole change process. The experimental results show that this method can successfully detect incremental information about area objects in practical applications, with the overall accuracy reaching above 90%, which is much higher than the existing weighted matching method, making it quite feasible and applicable. It helps establish the corresponding relation between new-version and old-version objects, and facilitates the linked update processing and quality control of spatial data.

  10. Recommendations of diagnosis and treatment of pleural effusion. Update.

    Science.gov (United States)

    Villena Garrido, Victoria; Cases Viedma, Enrique; Fernández Villar, Alberto; de Pablo Gafas, Alicia; Pérez Rodríguez, Esteban; Porcel Pérez, José Manuel; Rodríguez Panadero, Francisco; Ruiz Martínez, Carlos; Salvatierra Velázquez, Angel; Valdés Cuadrado, Luis

    2014-06-01

    Although during the last few years there have been several important changes in diagnostic and therapeutic methods, pleural effusion is still one of the diseases that the respiratory specialist has to evaluate frequently. The aim of this paper is to update the knowledge about pleural effusions, rather than to review the causes of pleural diseases exhaustively. These recommendations deal at greater length with the subjects of direct clinical usefulness, but a brief update on other pleural diseases has also been included. Among the main scientific advances covered are thoracic ultrasonography, intrapleural fibrinolytics, pleurodesis agents, and new pleural drainage techniques. Copyright © 2013 SEPAR. Published by Elsevier Espana. All rights reserved.

  11. Circular Updates

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Circular Updates are periodic sequentially numbered instructions to debriefing staff and observers informing them of changes or additions to scientific and specimen...

  12. Important update of CERN Mail Services

    CERN Multimedia

    IT Department

    2009-01-01

    The CERN Mail Services are evolving. In the course of June and July 2009, all CERN mailboxes will be updated with a new infrastructure for hosting mailboxes, running Exchange 2007. This update is taking place in order to provide the capacity upgrade for the constantly growing volume of CERN mailboxes. It is also the opportunity to provide a number of improvements to CERN mailboxes: new and improved Outlook Web Access (the web interface used to access your mailbox from a web browser, also known as "webmail"), new features in the Out-of-Office auto-reply assistant, easier spam management... The update will preserve the mailbox configuration and no specific action is required by users. During the next weeks, each mailbox will be individually notified of the upcoming update the day before it takes place. We invite all users to carefully read this notification as it will contain the latest information for this update. The mailbox will be unavailable for a short time during the ni...

  13. Updated embrittlement trend curve for reactor pressure vessel steels

    International Nuclear Information System (INIS)

    Kirk, M.; Santos, C.; Eason, E.; Wright, J.; Odette, G.R.

    2003-01-01

    The reactor pressure vessels of commercial nuclear power plants are subject to embrittlement due to exposure to high energy neutrons from the core. Irradiation embrittlement of RPV belt-line materials is currently evaluated using US Regulatory Guide 1.99 Revision 2 (RG 1.99 Rev 2), which presents methods for estimating the Charpy transition temperature shift (ΔT30) at 30 ft-lb (41 J) and the drop in Charpy upper shelf energy (ΔUSE). A more recent embrittlement model, based on a broader database and more recent research results, is presented in NUREG/CR-6551. The objective of this paper is to describe the most recent update to the embrittlement model in NUREG/CR-6551, based upon additional data and increased understanding of embrittlement mechanisms. The updated ΔT30 and USE models include fluence, copper, nickel, phosphorus content, and product form; the ΔT30 model also includes coolant temperature, irradiation time (or flux), and a long-time term. The models were developed using multi-variable surface fitting techniques, understanding of the ΔT30 mechanisms, and engineering judgment. The updated ΔT30 model reduces scatter significantly relative to RG 1.99 Rev 2 on the currently available database for plates, forgings, and welds. This updated embrittlement trend curve will form the basis of revision 3 to Regulatory Guide 1.99. (author)

  14. 76 FR 24911 - Notice of Submission of Proposed Information Collection to OMB; Public Housing Physical Needs...

    Science.gov (United States)

    2011-05-03

    ...The proposed information collection requirement described below has been submitted to the Office of Management and Budget (OMB) for review, as required by the Paperwork Reduction Act. The Department is soliciting public comments on the subject proposal. PHAs will complete a PNA once every 5 years, will update the PNA annually, and will submit information electronically to HUD. The information is used by PHAs as a strategic and capital planning tool. The information uploaded to HUD will be used for aggregation of an estimate of the capital needs across the Public Housing portfolio and evaluation of the impact of the Capital Fund in meeting the physical needs based upon review of the annual updates.

  15. Fast Updating National Geo-Spatial Databases with High Resolution Imagery: China's Methodology and Experience

    Science.gov (United States)

    Chen, J.; Wang, D.; Zhao, R. L.; Zhang, H.; Liao, A.; Jiu, J.

    2014-04-01

    Geospatial databases are an irreplaceable national treasure of immense importance. Their up-to-dateness, meaning their consistency with respect to the real world, plays a critical role in their value and applications. The continuous updating of map databases at 1:50,000 scale is a massive and difficult task for larger countries spanning several million square kilometres. This paper presents the research and technological development supporting national map updating at 1:50,000 scale in China, including the development of updating models and methods, production tools and systems for large-scale and rapid updating, as well as the design and implementation of the continuous updating workflow. The use of many data sources and the integration of these data to form a high-accuracy, quality-checked product were required. This in turn required up-to-date techniques of image matching, semantic integration, generalization, database management and conflict resolution. Specific software tools and packages were designed and developed to support large-scale updating production with high resolution imagery and large-scale data generalization, such as map generalization, GIS-supported change interpretation from imagery, DEM interpolation, image-matching-based orthophoto generation, and data control at different levels. A national 1:50,000 database updating strategy and its production workflow were designed, including a full-coverage updating pattern characterized by all-element topographic data modeling, change detection in all related areas, and whole-process data quality control, a series of technical production specifications, and a network of updating production units at different geographic locations in the country.

  16. DATMAN: A reliability data analysis program using Bayesian updating

    International Nuclear Information System (INIS)

    Becker, M.; Feltus, M.A.

    1996-01-01

    Preventive maintenance (PM) techniques focus on the prevention of failures, in particular, system components that are important to plant functions. Reliability-centered maintenance (RCM) improves on the PM techniques by introducing a set of guidelines by which to evaluate the system functions. It also minimizes intrusive maintenance, labor, and equipment downtime without sacrificing system performance when its function is essential for plant safety. Both the PM and RCM approaches require that system reliability data be updated as more component failures and operation time are acquired. Systems reliability and the likelihood of component failures can be calculated by Bayesian statistical methods, which can update these data. The DATMAN computer code has been developed at Penn State to simplify the Bayesian analysis by performing tedious calculations needed for RCM reliability analysis. DATMAN reads data for updating, fits a distribution that best fits the data, and calculates component reliability. DATMAN provides a user-friendly interface menu that allows the user to choose from several common prior and posterior distributions, insert new failure data, and visually select the distribution that matches the data most accurately
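
    As a concrete picture of the Bayesian updating that such a tool automates, the sketch below performs the standard conjugate Gamma-Poisson update of a component failure rate from new operating experience. DATMAN supports several prior and posterior families chosen interactively; this shows only one conjugate pair, and the prior numbers are made-up examples.

    def update_failure_rate(prior_alpha, prior_beta_hours, failures, exposure_hours):
        """Conjugate Gamma-Poisson update of a constant failure rate.

        Prior: lambda ~ Gamma(alpha, beta) with beta in hours, so the prior
        mean rate is alpha/beta per hour. Observing `failures` events over
        `exposure_hours` of operation gives the posterior parameters below;
        the posterior mean is the updated point estimate for RCM decisions.
        """
        post_alpha = prior_alpha + failures
        post_beta = prior_beta_hours + exposure_hours
        return post_alpha, post_beta, post_alpha / post_beta

    # Example: a weakly informative prior updated with 2 failures in 10,000 h.
    alpha, beta, mean_rate = update_failure_rate(0.5, 1000.0, 2, 10000.0)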

  17. Where's the beef? An update on meat irradiation in the USA

    International Nuclear Information System (INIS)

    Adams, Patterson

    2000-01-01

    Since the US Food and Drug Administration (FDA) approved irradiation of red meats in December 1997, the irradiation industry has been focused on this potential new utilization of our technology. In February 1999, the United States Department of Agriculture (USDA) finally issued a proposed rule, which will allow processors to begin irradiating red meats for human consumption. This presentation provides a brief update of the rules, regulations and prospects for this promising application. (author)

  18. The Complete and Updated "Rotifer Polyculture Method" for Rearing First Feeding Zebrafish

    Science.gov (United States)

    Lawrence, Christian; Best, Jason; Cockington, Jason; Henry, Eric C.; Hurley, Shane; James, Althea; Lapointe, Christopher; Maloney, Kara; Sanders, Erik

    2016-01-01

    The zebrafish (Danio rerio) is a model organism of increasing importance in many fields of science. One of the most demanding technical aspects of culture of this species in the laboratory is rearing first-feeding larvae to the juvenile stage with high rates of growth and survival. The central management challenge of this developmental period revolves around delivering highly nutritious feed items to the fish on a nearly continuous basis without compromising water quality. Because larval zebrafish are well-adapted to feed on small zooplankton in the water column, live prey items such as brachionid rotifers, Artemia, and Paramecium are widely recognized as the feeds of choice, at least until the fish reach the juvenile stage and are able to efficiently feed on processed diets. This protocol describes a method whereby newly hatched zebrafish larvae are cultured together with live saltwater rotifers (Brachionus plicatilis) in the same system. This polyculture approach provides fish with an "on-demand", nutrient-rich live food source without producing chemical waste at levels that would otherwise limit performance. Importantly, because the system harnesses both the natural high productivity of the rotifers and the behavioral preferences of the fish, the labor involved with maintenance is low. The following protocol details an updated, step-by-step procedure that incorporates rotifer production (scalable to any desired level) for use in a polyculture of zebrafish larvae and rotifers that promotes maximal performance during the first 5 days of exogenous feeding. PMID:26863035

  19. Uncertainty for Part Density Determination: An Update

    Energy Technology Data Exchange (ETDEWEB)

    Valdez, Mario Orlando [Los Alamos National Laboratory

    2016-12-14

    Accurate and precise density measurements by hydrostatic weighing require the use of an analytical balance, configured with a suspension system, to measure the weight of a part both in water and in air. Additionally, the densities of these liquid media (water and air) must be precisely known for the part density determination. To validate the accuracy and precision of these measurements, uncertainty statements are required. The work in this report is a revision of an original report written more than a decade ago, specifically applying principles and guidelines suggested by the Guide to the Expression of Uncertainty in Measurement (GUM) for determining the part density uncertainty through sensitivity analysis. In this work, updated derivations are provided; an original example is revised with the updated derivations; and an appendix is provided, devoted solely to uncertainty evaluations using Monte Carlo techniques, specifically the NIST Uncertainty Machine, as a viable alternative method.
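
    A bare-bones Monte Carlo propagation for hydrostatic density, in the spirit of the appendix described above, is sketched here. It neglects air buoyancy and uses made-up balance readings and uncertainties purely for illustration; the full treatment in the report includes the air density terms.

    import numpy as np

    rng = np.random.default_rng(1)

    def density_mc(m_air, u_air, m_water, u_water, rho_water, u_rho_water, n=100_000):
        """Monte Carlo mean and standard uncertainty of a part's density.

        m_air, m_water: balance readings (g) in air and suspended in water,
        with standard uncertainties u_air, u_water; rho_water is the water
        density (g/cm^3) with uncertainty u_rho_water. Archimedes' principle
        gives the displaced volume as (m_air - m_water) / rho_water.
        """
        a = rng.normal(m_air, u_air, n)
        w = rng.normal(m_water, u_water, n)
        rw = rng.normal(rho_water, u_rho_water, n)
        rho = a * rw / (a - w)
        return rho.mean(), rho.std(ddof=1)

    # Hypothetical readings: the numbers below are not from the report.
    mean_rho, u_rho = density_mc(156.32, 0.01, 147.75, 0.01, 0.99705, 0.00005)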

  20. 77 FR 17502 - Agency Information Collection Activities: Proposed Collection; Comments Requested: Strategic...

    Science.gov (United States)

    2012-03-26

    ...] Agency Information Collection Activities: Proposed Collection; Comments Requested: Strategic Planning... agencies update and revise their strategic plans every three years. The Strategic Planning Office at ATF... accepted for ``sixty days'' until May 25, 2012. This process is conducted in accordance with 5 CFR 1320.10...

  1. A Lightweight Surface Reconstruction Method for Online 3D Scanning Point Cloud Data Oriented toward 3D Printing

    Directory of Open Access Journals (Sweden)

    Buyun Sheng

    2018-01-01

    Full Text Available The existing surface reconstruction algorithms currently reconstruct large amounts of mesh data. Consequently, many of these algorithms cannot meet the efficiency requirements of real-time data transmission in a web environment. This paper proposes a lightweight surface reconstruction method for online 3D scanned point cloud data oriented toward 3D printing. The proposed online lightweight surface reconstruction algorithm is composed of a point cloud update algorithm (PCU), a rapid iterative closest point algorithm (RICP), and an improved Poisson surface reconstruction algorithm (IPSR). The generated lightweight point cloud data are pretreated using an updating and rapid registration method. The Poisson surface reconstruction is also accomplished by a pretreatment to recompute the point cloud normal vectors; this approach is based on a least squares method, and the postprocessing of the PDE patch generation was based on biharmonic-like fourth-order PDEs, which effectively reduces the amount of reconstructed mesh data and improves the efficiency of the algorithm. This method was verified using an online personalized customization system that was developed with WebGL and oriented toward 3D printing. The experimental results indicate that this method can generate a lightweight 3D scanning mesh rapidly and efficiently in a web environment.
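
    The "rapid iterative closest point" stage referred to in the record is, at its core, the classic ICP loop. The following generic sketch of one point-to-point ICP iteration with an SVD (Kabsch) pose estimate is given only to make that step concrete; it is not the authors' RICP variant.

    import numpy as np
    from scipy.spatial import cKDTree

    def icp_iteration(source, target):
        """One point-to-point ICP iteration: match, then solve the rigid pose.

        source, target: (N, 3) and (M, 3) point arrays. Returns (R, t) that
        best aligns source to its nearest neighbours in target.
        """
        idx = cKDTree(target).query(source)[1]
        matched = target[idx]
        mu_s, mu_t = source.mean(axis=0), matched.mean(axis=0)
        H = (source - mu_s).T @ (matched - mu_t)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:          # guard against a reflection solution
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_t - R @ mu_s
        return R, t

    def icp(source, target, iters=20):
        """Run a fixed number of ICP iterations and return the aligned copy of source."""
        src = np.array(source, dtype=float)
        for _ in range(iters):
            R, t = icp_iteration(src, target)
            src = src @ R.T + t
        return src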

  2. Amended proposal for R&D on a cluster klystron

    International Nuclear Information System (INIS)

    Fernow, R.C.; Fischer, J.; Gallardo, J.C.; Kirk, H.G.; Ko, S.K.; Palmer, R.B.; Ulc, S.; Wang, H.

    1993-01-01

    This Proposal is an updated version of FWP submitted in March 1992. Significant work has been done since the original proposal, and much of this is reported on in this update. In addition there have been several changes made, some in response to suggestions made by the three reviews sent to us in December, 1992. The new information and changes include: Technical information on the proposed design of the magnetron gun, the magnet, acceleration gap, and electrical system (including a comment on efficiency loss due to high-voltage leakage current). Modification of the phase I and II tests to allow operation of the gun and klystron off the axis of the magnet, thus simulating the magnet situation when multiple beams are used. Modification of phases III and IV to test a cluster of three beams: first a three beam gun, and then three beams with a klystron on one of them. We have added a phase V which would be the testing of a full three-beam demonstration klystron. The mod-anode pulser would now be located on the high voltage deck instead of externally. Power for the pulser and other high voltage components would now be provided by an isolation transformer instead of from a lead battery. We believe these changes have improved the proposed program and thank the reviewers for their constructive suggestions. The design is still evolving. Relatively little work has been done on the detailed klystron design, and none on the beam dump

  3. A Proposal of New Spherical Particle Modeling Method Based on Stochastic Sampling of Particle Locations in Monte Carlo Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Song Hyun; Kim, Do Hyun; Kim, Jong Kyung [Hanyang Univ., Seoul (Korea, Republic of); Noh, Jea Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    Owing to its high computational efficiency and user convenience, the implicit method has received attention; however, the implicit method in previous studies has low accuracy at high packing fractions. In this study, a new implicit method, which can be used at any packing fraction with high accuracy, is proposed for modeling media with distributed spherical particles in MC simulations. A new concept for spherical particle sampling was developed to solve the problems of the previous implicit methods. The sampling method was verified by simulation in infinite and finite media. The results show that implicit modeling of particles with the proposed method was performed accurately at all packing fractions. It is expected that the proposed method can be efficiently utilized for media with distributed spherical particles, such as fusion reactor blankets, VHTR reactors, and shielding analyses.
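
    For orientation, the sketch below shows the simplest stochastic placement of sphere centers (random sequential addition with rejection of overlaps); this plain rejection scheme stalls at high packing fractions, which is precisely the limitation the proposed sampling concept addresses, so it is only a baseline illustration and not the authors' method. All names and values are assumptions.

        import numpy as np

        def sample_sphere_centers(box, radius, packing_fraction, rng=np.random.default_rng()):
            """Randomly place non-overlapping spheres in a cubic box (random sequential addition)."""
            v_sphere = 4.0 / 3.0 * np.pi * radius**3
            n_target = int(packing_fraction * box**3 / v_sphere)
            centers = []
            while len(centers) < n_target:
                c = rng.uniform(radius, box - radius, size=3)   # keep spheres inside the box
                if all(np.linalg.norm(c - p) >= 2 * radius for p in centers):
                    centers.append(c)                           # accept only non-overlapping centers
            return np.array(centers)

        # Example: a 1 cm box, 0.05 cm particles, 10% packing fraction.
        print(len(sample_sphere_centers(1.0, 0.05, 0.10)))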

  4. Thalamocortical dysrhythmia: a theoretical update in tinnitus

    Directory of Open Access Journals (Sweden)

    Dirk eDe Ridder

    2015-06-01

    Full Text Available Tinnitus is the perception of a sound in the absence of an external sound source. Pathophysiologically it has been attributed to bottom up deafferentation and/or top down noise-cancelling deficit. Both mechanisms are proposed to alter auditory thalamocortical signal transmission resulting in thalamocortical dysrhythmia (TCD. In deafferentation, TCD is characterized by a slowing down of resting state alpha to theta activity associated with an increase in surrounding gamma activity, resulting in persisting cross-frequency coupling between theta and gamma activity. Theta burst-firing increases network synchrony and recruitment, a mechanism which might enable long range synchrony, which in turn could represent a means for finding the missing thalamocortical information and for gaining access to consciousness. Theta oscillations could function as a carrier wave to integrate the tinnitus related focal auditory gamma activity in a consciousness enabling network, as envisioned by the global workspace model. This model suggests that focal activity in the brain does not reach consciousness, except if the focal activity becomes functionally coupled to a consciousness enabling network, aka the global workspace. In limited deafferentation the missing information can be retrieved from the auditory cortical neighborhood, decreasing surround inhibition, resulting in TCD. When the deafferentation is too wide in bandwidth it is hypothesized that the missing information is retrieved from theta mediated parahippocampal auditory memory. This suggests that based on the amount of deafferentation TCD might change to parahippocampo-cortical persisting and thus pathological theta-gamma rhythm. From a Bayesian point of view, in which the brain is conceived as a prediction machine that updates its memory-based predictions through sensory updating, tinnitus is the result of a prediction error between the predicted and sensed auditory input. The decrease in sensory updating

  5. 75 FR 41140 - Agency Information Collection Activities: Proposed Collection; Comment Request-Child Nutrition...

    Science.gov (United States)

    2010-07-15

    ... nutrient data from the food service industry to update and expand the Child Nutrition Database in support... DEPARTMENT OF AGRICULTURE Food and Nutrition Service Agency Information Collection Activities: Proposed Collection; Comment Request--Child Nutrition Database AGENCY: Food and Nutrition Service, USDA...

  6. 78 FR 25059 - Proposed Information Collection; Comment Request; Business and Professional Classification Report...

    Science.gov (United States)

    2013-04-29

    ... business activity, company structure, size, and business operations. This information is used to update the.... Form Number: SQ-CLASS. Type of Review: Regular. Affected Public: Businesses and other organizations in... DEPARTMENT OF COMMERCE Census Bureau Proposed Information Collection; Comment Request; Business...

  7. 7. Mentor update and support: what do mentors need from an update?

    Science.gov (United States)

    Phillips, Mari; Marshall, Joyce

    2015-04-01

    Mentorship is the 14th series of 'Midwifery basics' targeted at practising midwives. The aim of these articles is to provide information to raise awareness of the impact of the work of midwives on women's experience, and encourage midwives to seek further information through a series of activities relating to the topic. In this seventh article Mari Phillips and Joyce Marshall consider some of the key issues related to mentor update and support and consider what mentors need from their annual update.

  8. A proposed safety assurance method and its application to the fusion experimental reactor

    International Nuclear Information System (INIS)

    Okazaki, T.; Seki, Y.; Inabe, T.; Aoki, I.

    1995-01-01

    Importance categorization and hazard identification methods have been proposed for a fusion experimental reactor. A parameter, the system index, is introduced in the categorization method. The relative importance of systems with safety functions can be classified by the magnitude of the system index and by whether or not the system acts as a boundary for radioactive materials. This categorization can be used as the basic principle in determining structural design assessment, seismic design criteria, etc. For hazard identification, the system-time-energy matrix is proposed, in which the time and spatial distributions of hazard energies are used. This approach is formulated more systematically than an ad hoc identification of hazard events, and it is useful for selecting design basis events which are employed in the assessment of safety designs. (orig.)

  9. Seismic detection method for small-scale discontinuities based on dictionary learning and sparse representation

    Science.gov (United States)

    Yu, Caixia; Zhao, Jingtao; Wang, Yanfei

    2017-02-01

    Studying small-scale geologic discontinuities, such as faults, cavities and fractures, plays a vital role in analyzing the inner conditions of reservoirs, as these geologic structures and elements can provide storage spaces and migration pathways for petroleum. However, these geologic discontinuities have weak energy and are easily contaminated by noise, and therefore effectively extracting them from seismic data becomes a challenging problem. In this paper, a method for detecting small-scale discontinuities using dictionary learning and sparse representation is proposed that can dig up high-resolution information by sparse coding. A K-SVD (K-means clustering via Singular Value Decomposition) sparse representation model, which contains a two-stage iteration procedure of sparse coding and dictionary updating, is suggested for mathematically expressing these seismic small-scale discontinuities. Generally, the orthogonal matching pursuit (OMP) algorithm is employed for sparse coding. However, this method can only select one dictionary atom at a time. In order to improve calculation efficiency, a regularized version of the OMP algorithm is presented for simultaneously updating a number of atoms at one time. Two numerical experiments demonstrate the validity of the developed method for clarifying and enhancing small-scale discontinuities. The field example of carbonate reservoirs further demonstrates its effectiveness in revealing masked tiny faults and small-scale cavities.
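
    A minimal sketch of the standard OMP sparse-coding step mentioned above, selecting one atom per pass against a fixed dictionary; it illustrates the baseline the record improves upon, not the regularized multi-atom variant, and the dictionary and signal here are synthetic assumptions.

        import numpy as np

        def omp(D, y, k):
            """Orthogonal matching pursuit: sparse code y over dictionary D with at most k atoms."""
            residual, support = y.copy(), []
            for _ in range(k):
                # Pick the atom most correlated with the current residual (one atom per pass).
                support.append(int(np.argmax(np.abs(D.T @ residual))))
                # Re-fit the coefficients of all selected atoms by least squares.
                coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
                residual = y - D[:, support] @ coef
            x = np.zeros(D.shape[1])
            x[support] = coef
            return x

        # Toy usage: a random unit-norm dictionary and a 3-sparse signal.
        rng = np.random.default_rng(0)
        D = rng.standard_normal((64, 256)); D /= np.linalg.norm(D, axis=0)
        x_true = np.zeros(256); x_true[[5, 50, 200]] = [1.0, -2.0, 0.5]
        x_hat = omp(D, D @ x_true, k=3)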

  10. Email Updates

    Science.gov (United States)

    MedlinePlus email updates subscription page: https://medlineplus.gov/listserv.html

  11. Valence-Dependent Belief Updating: Computational Validation

    Directory of Open Access Journals (Sweden)

    Bojana Kuzmanovic

    2017-06-01

    Full Text Available People tend to update beliefs about their future outcomes in a valence-dependent way: they are likely to incorporate good news and to neglect bad news. However, belief formation is a complex process which depends not only on motivational factors such as the desire for favorable conclusions, but also on multiple cognitive variables such as prior beliefs, knowledge about personal vulnerabilities and resources, and the size of the probabilities and estimation errors. Thus, we applied computational modeling in order to test for valence-induced biases in updating while formally controlling for relevant cognitive factors. We compared biased and unbiased Bayesian models of belief updating, and specified alternative models based on reinforcement learning. The experiment consisted of 80 trials with 80 different adverse future life events. In each trial, participants estimated the base rate of one of these events and estimated their own risk of experiencing the event before and after being confronted with the actual base rate. Belief updates corresponded to the difference between the two self-risk estimates. Valence-dependent updating was assessed by comparing trials with good news (better-than-expected base rates) with trials with bad news (worse-than-expected base rates). After receiving bad relative to good news, participants' updates were smaller and deviated more strongly from rational Bayesian predictions, indicating a valence-induced bias. Model comparison revealed that the biased (i.e., optimistic) Bayesian model of belief updating better accounted for the data than the unbiased (i.e., rational) Bayesian model, confirming that the valence of the new information influenced the amount of updating. Moreover, alternative computational modeling based on reinforcement learning demonstrated higher learning rates for good than for bad news, as well as a moderating role of personal knowledge. Finally, in this specific experimental context, the approach based on
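
    A minimal sketch of the reinforcement-learning-style asymmetry described above, with separate learning rates for good and bad news; the learning-rate values are invented for illustration and do not come from the study.

        def update_risk(prior_risk, estimation_error, lr_good=0.6, lr_bad=0.3):
            """Update a self-risk estimate; 'good news' (error < 0) is weighted more than bad news."""
            lr = lr_good if estimation_error < 0 else lr_bad
            return prior_risk + lr * estimation_error

        # Example: prior self-risk 40%, observed base rate 30% -> error of -0.10 (good news),
        # giving a larger downward revision than an equal-sized bad-news error would give upward.
        posterior = update_risk(0.40, 0.30 - 0.40)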

  12. Level Set Projection Method for Incompressible Navier-Stokes on Arbitrary Boundaries

    KAUST Repository

    Williams-Rioux, Bertrand

    2012-01-12

    A second-order level set projection method for the incompressible Navier-Stokes equations is proposed to solve flow around arbitrary geometries. We used a rectilinear grid with collocated cell-centered velocity and pressure. An explicit Godunov procedure is used to address the nonlinear advection terms, and an implicit Crank-Nicolson method to update viscous effects. An approximate pressure projection is implemented at the end of the time stepping using multigrid as a conventional fast iterative method. The level set method developed by Osher and Sethian [17] is implemented to address real momentum and pressure boundary conditions by the advection of a distance function, as proposed by Aslam [3]. Numerical results for the Strouhal number and drag coefficients validated the model with good accuracy for flow over a cylinder in the parallel shedding regime (47 < Re < 180). Simulations for an array of cylinders and an oscillating cylinder were performed, with the latter demonstrating our method's ability to handle dynamic boundary conditions.

  13. A Proposal on the Advanced Sampling Based Sensitivity and Uncertainty Analysis Method for the Eigenvalue Uncertainty Analysis

    International Nuclear Information System (INIS)

    Kim, Song Hyun; Song, Myung Sub; Shin, Chang Ho; Noh, Jae Man

    2014-01-01

    When using the perturbation theory, the uncertainty of the response can be estimated by a single transport simulation, and therefore it requires a small computational load. However, it has the disadvantage that the computational methodology must be modified whenever a different response type, such as multiplication factor, flux, or power distribution, is estimated. Hence, it is suitable for analyzing few responses with many perturbed parameters. The statistical approach is a sampling based method which uses cross sections randomly sampled from covariance data for analyzing the uncertainty of the response. XSUSA is a code based on the statistical approach. The cross sections are only modified with the sampling based method; thus, general transport codes can be directly utilized for the S/U analysis without any code modifications. However, to calculate the uncertainty distribution from the result, the code simulation must be repeated many times with randomly sampled cross sections. This inefficiency is known as a disadvantage of the stochastic method. In this study, to increase the estimation efficiency of the sampling based S/U method, an advanced sampling and estimation method for the cross sections is proposed and verified. The main feature of the proposed method is that the cross section averaged from each single sampled cross section is used. The proposed method was validated using the perturbation theory

  14. 75 FR 43235 - Medicare Program; Home Health Prospective Payment System Rate Update for Calendar Year 2011...

    Science.gov (United States)

    2010-07-23

    ... Hypertension''), fail to include the NHLBI Blood Pressure (BP) guidelines and classification terminology. The.... Updates to the HH PPS II. Provisions of the Proposed Regulation A. Case-Mix Measurement B. Hypertension... points for secondary diagnoses, whereas the system prior to the refinements did not. Longstanding OASIS...

  15. Eastern US seismic hazard characterization update

    International Nuclear Information System (INIS)

    Savy, J.B.; Boissonnade, A.C.; Mensing, R.W.; Short, C.M.

    1993-06-01

    In January 1989, LLNL published the results of a multi-year project, funded by NRC, on estimating seismic hazard at nuclear plant sites east of the Rockies. The goal of this study was twofold: to develop a good central estimate (median) of the seismic hazard and to characterize the uncertainty in the estimates of this hazard. In 1989, LLNL was asked by DOE to develop site-specific estimates of the seismic hazard at the Savannah River Site (SRS) in South Carolina as part of the New Production Reactor (NPR) project. For the purpose of the NPR, a complete review of the methodology and of the data acquisition process was performed. Work done under the NPR project has shown that first order improvement in the estimates of the uncertainty (i.e., lower mean hazard values) could be easily achieved by updating the modeling of the seismicity and ground motion attenuation uncertainty. To this effect, NRC sponsored LLNL to perform a re-elicitation to update the seismicity and ground motion experts' inputs and to revise methods to combine seismicity and ground motion inputs in the seismic hazard analysis for nuclear power plant sites east of the Rocky Mountains. The objective of the recent study was to include the first order improvements that reflect the latest knowledge in seismicity and ground motion modeling and produce an update of all the hazard results produced in the 1989 study. In particular, it had been demonstrated that eliciting seismicity information in terms of rates of earthquakes rather than a- and b-values, and changing the elicitation format to a one-on-one interview, improved our ability to express the uncertainty of earthquake rates of occurrence at large magnitudes. Thus, NRC sponsored this update study to refine the model of uncertainty, to re-elicit the experts' interpretations of the zonation and seismicity, and to re-elicit the ground motion models, based on the current state of knowledge.

  16. An LPV Adaptive Observer for Updating a Map Applied to an MAF Sensor in a Diesel Engine.

    Science.gov (United States)

    Liu, Zhiyuan; Wang, Changhui

    2015-10-23

    In this paper, a new method for mass air flow (MAF) sensor error compensation and online updating of the error map (or lookup table), where the error is due to installation and aging in a diesel engine, is developed. Since the MAF sensor error depends on the engine operating point, the error model is represented as a two-dimensional (2D) map with two inputs, fuel mass injection quantity and engine speed. The 2D map representing the MAF sensor error is described as a piecewise bilinear interpolation model, which can be written as a dot product between the regression vector and the parameter vector using a membership function. By combining the 2D map regression model and the diesel engine air path system, an LPV adaptive observer with low computational load is designed to estimate states and parameters jointly. The convergence of the proposed algorithm is proven under the conditions of persistent excitation and given inequalities. The observer is validated against simulation data from the engine software enDYNA provided by Tesis. The results demonstrate that the operating point-dependent error of the MAF sensor can be approximated acceptably by the 2D map obtained from the proposed method.
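
    A minimal sketch of the idea that a 2D lookup table with bilinear interpolation can be written as a dot product between a membership (regression) vector and the parameter vector, which is what makes the table entries amenable to linear adaptive estimation; the grid ranges and inputs below are illustrative assumptions, not values from the paper.

        import numpy as np

        def membership_vector(x, y, x_grid, y_grid):
            """Bilinear-interpolation weights over a 2D grid, flattened into a regression vector."""
            i = np.clip(np.searchsorted(x_grid, x) - 1, 0, len(x_grid) - 2)
            j = np.clip(np.searchsorted(y_grid, y) - 1, 0, len(y_grid) - 2)
            tx = (x - x_grid[i]) / (x_grid[i + 1] - x_grid[i])
            ty = (y - y_grid[j]) / (y_grid[j + 1] - y_grid[j])
            phi = np.zeros((len(x_grid), len(y_grid)))
            phi[i, j], phi[i + 1, j] = (1 - tx) * (1 - ty), tx * (1 - ty)
            phi[i, j + 1], phi[i + 1, j + 1] = (1 - tx) * ty, tx * ty
            return phi.ravel()

        # Map output = phi . theta, so theta (the table entries) can be updated by any linear adaptive law.
        x_grid = np.linspace(0, 60, 7)       # e.g. fuel injection quantity (illustrative units)
        y_grid = np.linspace(800, 4000, 9)   # e.g. engine speed in rpm (illustrative)
        theta = np.zeros(len(x_grid) * len(y_grid))
        phi = membership_vector(25.0, 2100.0, x_grid, y_grid)
        error_estimate = phi @ theta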

  17. 49 CFR 360.5 - Updating user fees.

    Science.gov (United States)

    2010-10-01

    ... updating the cost components comprising the fee. Cost components shall be updated as follows: (1) Direct... determined by the cost study in Regulations Governing Fees For Service, 1 I.C.C. 2d 60 (1984), or subsequent... by total office costs for the office directly associated with user fee activity. Actual updating of...

  18. Updating of states in operational hydrological models

    Science.gov (United States)

    Bruland, O.; Kolberg, S.; Engeland, K.; Gragne, A. S.; Liston, G.; Sand, K.; Tøfte, L.; Alfredsen, K.

    2012-04-01

    Operationally, the main purpose of hydrological models is to provide runoff forecasts. The quality of the model state and the accuracy of the weather forecast, together with the model quality, define the runoff forecast quality. Input and model errors accumulate over time and may leave the model in a poor state. Usually model states can be related to observable conditions in the catchment. Updating these states, knowing their relation to observable catchment conditions, directly influences the forecast quality. Norway is internationally at the forefront of hydropower scheduling on both short and long terms. The inflow forecasts are fundamental to this scheduling. Their quality directly influences the producers' profit as they optimize hydropower production to market demand and at the same time minimize spill of water and maximize available hydraulic head. The quality of the inflow forecasts strongly depends on the quality of the models applied and the quality of the information they use. In this project the focus has been to improve the quality of the model states upon which the forecast is based. Runoff and snow storage are two observable quantities that reflect the model state and are used in this project for updating. Generally the methods used can be divided into three groups: the first re-estimates the forcing data in the updating period; the second alters the weights in the forecast ensemble; and the third directly changes the model states. The uncertainty related to the forcing data through the updating period is due both to uncertainty in the actual observations and to how well the gauging stations represent the catchment with respect to temperature and precipitation. The project looks at methodologies that automatically re-estimate the forcing data and tests the results against the observed response. Model uncertainty is reflected in a joint distribution of model parameters estimated using the DREAM algorithm.

  19. A Resampling-Based Stochastic Approximation Method for Analysis of Large Geostatistical Data

    KAUST Repository

    Liang, Faming

    2013-03-01

    The Gaussian geostatistical model has been widely used in modeling of spatial data. However, it is challenging to computationally implement this method because it requires the inversion of a large covariance matrix, particularly when there is a large number of observations. This article proposes a resampling-based stochastic approximation method to address this challenge. At each iteration of the proposed method, a small subsample is drawn from the full dataset, and then the current estimate of the parameters is updated accordingly under the framework of stochastic approximation. Since the proposed method makes use of only a small proportion of the data at each iteration, it avoids inverting large covariance matrices and thus is scalable to large datasets. The proposed method also leads to a general parameter estimation approach, maximum mean log-likelihood estimation, which includes the popular maximum (log)-likelihood estimation (MLE) approach as a special case and is expected to play an important role in analyzing large datasets. Under mild conditions, it is shown that the estimator resulting from the proposed method converges in probability to a set of parameter values of equivalent Gaussian probability measures, and that the estimator is asymptotically normally distributed. To the best of the authors' knowledge, the present study is the first one on asymptotic normality under infill asymptotics for general covariance functions. The proposed method is illustrated with large datasets, both simulated and real. Supplementary materials for this article are available online. © 2013 American Statistical Association.
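
    A minimal sketch of the subsample-and-update idea (a Robbins-Monro step on the score of a random subsample); it uses an i.i.d. Gaussian mean estimate purely as a stand-in for the article's geostatistical covariance model, and all names and values are assumptions.

        import numpy as np

        rng = np.random.default_rng(1)
        data = rng.normal(loc=3.0, scale=2.0, size=100_000)   # large "dataset"

        theta = 0.0                                           # parameter to estimate (the mean)
        for t in range(1, 2001):
            subsample = rng.choice(data, size=200, replace=False)
            score = np.mean(subsample - theta)                # gradient of the mean log-likelihood (variance held fixed)
            theta += (1.0 / t) * score                        # Robbins-Monro step size decreasing to zero
        print(theta)                                          # close to 3.0 without ever using all data at once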

  20. Statistical Bayesian method for reliability evaluation based on ADT data

    Science.gov (United States)

    Lu, Dawei; Wang, Lizhi; Sun, Yusheng; Wang, Xiaohong

    2018-05-01

    Accelerated degradation testing (ADT) is frequently conducted in the laboratory to predict the products’ reliability under normal operating conditions. Two kinds of methods, degradation path models and stochastic process models, are utilized to analyze degradation data, and the latter is the more popular. However, some limitations, such as an imprecise solution process and estimation of the degradation ratio, still exist, which may affect the accuracy of the acceleration model and the extrapolated value. Moreover, the usual solution to this problem, the Bayesian method, loses key information when unifying the degradation data. In this paper, a new data processing and parameter inference method based on the Bayesian method is proposed to handle degradation data and solve the problems above. First, the Wiener process and an acceleration model are chosen; second, the initial values of the degradation model and the parameters of the prior and posterior distributions under each stress level are calculated, with updating and iteration of the estimated values; third, the lifetime and reliability values are estimated on the basis of the estimated parameters; finally, a case study is provided to demonstrate the validity of the proposed method. The results illustrate that the proposed method is quite effective and accurate in estimating the lifetime and reliability of a product.
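
    A minimal sketch, assuming a Wiener degradation process with known diffusion and a normal prior on the drift; the conjugate update below is a textbook illustration of Bayesian parameter updating from degradation increments, not the authors' full inference scheme, and all values are invented.

        import numpy as np

        def update_drift(prior_mean, prior_var, dt, dy, sigma2):
            """Conjugate Bayesian update of the Wiener-process drift from increments dy over steps dt."""
            # Each increment dy_i ~ N(mu * dt_i, sigma2 * dt_i), so the posterior of mu stays normal.
            post_prec = 1.0 / prior_var + np.sum(dt) / sigma2
            post_mean = (prior_mean / prior_var + np.sum(dy) / sigma2) / post_prec
            return post_mean, 1.0 / post_prec

        rng = np.random.default_rng(0)
        dt = np.full(50, 1.0)
        dy = rng.normal(0.8 * dt, np.sqrt(0.04 * dt))   # simulated degradation increments (true drift 0.8)
        mu_hat, var_hat = update_drift(prior_mean=1.0, prior_var=0.5, dt=dt, dy=dy, sigma2=0.04)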

  1. Updating design information questionnaire (DIQ) experiences

    International Nuclear Information System (INIS)

    Palafox-Garcia, P.

    2001-01-01

    , we had to update our two DIQs and go through the previously mentioned steps. That means that, in order to get the updated Facility Attachment for our MBAs, we have to wait at least two years, and meanwhile the IAEA Safeguards Inspectors have to work without the proper Facility Attachment during their inspections in the facility. The main problem is that, while we are dealing with the updated DIQs, unexpected changes have occurred in one of the MBAs. The situation we are facing in that MBA is that, now that the Fuel Fabrication Pilot Plant (FFPP) has been stopped, they want to take advantage of certain parts of this area and some equipment in order to work at the facility at a research level; but since the nuclear material is still there, they have to comply with all the security, physical protection, radiological and safeguards requirements. They are therefore planning to leave only a small quantity of nuclear material at the MBA, so that complying with the security regulations is not so difficult, laborious and expensive. Most of the nuclear material is planned to be packed in 200-litre metallic drums and sent to the other MBA. 4. Conclusion - The updated DIQ we are dealing with was sent after they stopped the FFPP, and when we get the updated Facility Attachment we will send a third, further updated DIQ; this will be done as soon as they can obtain all the permissions and fulfil the requirements needed for the manoeuvres previously described, and we do not know how long it will take, but they are working on it. As a proposal, in order to avoid such a long wait for the new Facility Attachment, the IAEA should establish a delivery time for each review of the facilities. (author)

  2. Kepler Stellar Properties Catalog Update for Q1-Q17 DR25 Transit Search

    Science.gov (United States)

    Mathur, Savita; Huber, Daniel

    2016-01-01

    Huber et al. (2014) presented revised stellar properties for 196,468 Kepler targets, which were used for the Q1-Q16 TPSDV planet search (Tenenbaum et al. 2014). The catalog was based on atmospheric properties (i.e., temperature (Teff), surface gravity (log(g)), and metallicity ([Fe/H])) published in the literature using a variety of methods (e.g., asteroseismology, spectroscopy, exoplanet transits, photometry), which were then homogeneously fitted to a grid of Dartmouth (DSEP) isochrones (Dotter et al. 2008). The catalog was updated in early 2015 for the Q1-Q17 Data Release (DR) 24 transit search (Seader et al. 2015) based on the latest classifications of Kepler targets in the literature at that time. The methodology followed Huber et al. (2014). Here we provide updated stellar properties of 197,096 Kepler targets. Like the previous catalog, this update is based on atmospheric properties that were either published in the literature or provided by the Kepler community follow-up program (CFOP). The input values again come from different methods: asteroseismology, spectroscopy, flicker, and photometry. This catalog update was developed to support the SOC 9.3 TPSDV planet search (Twicken et al. 2016), which is expected to be the final search and data release by the Kepler project. In this document, we describe the method and the inputs that were used to build the catalog. The methodology follows Huber et al. (2014) with a few improvements as described in Section 2.

  3. Genetic Algorithm (GA) Method for Optimization of Multi-Reservoir Systems Operation

    Directory of Open Access Journals (Sweden)

    Shervin Momtahen

    2006-01-01

    Full Text Available A Genetic Algorithm (GA) method for optimization of multi-reservoir systems operation is proposed in this paper. In this method, the parameters of operating policies are optimized using system simulation results. Hence, any operating problem with any sort of objective function, constraints and structure of operating policy can be optimized by GA. The method is applied to a 3-reservoir system and is compared with two traditional methods, Stochastic Dynamic Programming and Dynamic Programming and Regression. The results show that GA is superior both in objective function value and in computational speed. The proposed method is further improved using a mutation power updating rule and a varying period simulation method. The latter is a novel procedure proposed in this paper that is believed to help in solving the computational time problem in large systems. These revisions are evaluated and proved to be very useful in converging to better solutions in much less time. The final GA method is eventually evaluated as a very efficient procedure that is able to solve problems of large multi-reservoir systems, which is usually impossible with traditional methods. In fact, the real performance of the GA method starts where others fail to function.
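
    A minimal GA loop of the kind described above, optimizing a vector of operating-policy parameters against a simulation-based objective and decaying the mutation power over generations; the simulate() objective and parameter bounds are placeholders, not the paper's reservoir model.

        import numpy as np

        rng = np.random.default_rng(0)

        def simulate(policy):
            """Placeholder objective: stand-in for simulating the reservoir system under a policy."""
            return -np.sum((policy - 0.5) ** 2)          # higher is better

        def ga(n_params=6, pop_size=40, generations=200, mutation_power=0.2):
            pop = rng.uniform(0, 1, (pop_size, n_params))
            for _ in range(generations):
                fitness = np.array([simulate(p) for p in pop])
                parents = pop[np.argsort(fitness)[-pop_size // 2:]]                 # keep the better half
                children = parents[rng.integers(len(parents), size=pop_size - len(parents))]
                children = children + rng.normal(0, mutation_power, children.shape) # Gaussian mutation
                mutation_power *= 0.99                    # simple mutation-power updating rule
                pop = np.vstack([parents, np.clip(children, 0, 1)])
            return pop[np.argmax([simulate(p) for p in pop])]

        best_policy = ga()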

  4. Update of the ADEME 2035-2050 energy-climate scenario

    International Nuclear Information System (INIS)

    Combet, Emmanuel; Marchal, David; Vincent, Isabelle; Mairet, Nicolas; Briand, Vincent; Cals, Guilain; Sidat, Patricia; Bellini, Robert; Guenard, Vincent; Berthomieu, Nadine; Canal, David; Laplaige, Philippe; Delanoe, Julien; Morlot, Rodolphe; Biscaglia, Stephane; Cardona Maestro, Astrid; Thouin, Simon; Bardinal, Marc; Eglin, Thomas; Gagnepain, Bruno; Martin, Sarah; Proharam, Florence; Streiff, Frederic; El Khamlichi, Aicha; Bodineau, Luc; Bastide, Guillaume; Moch, Yves; Marry, Solene; Lefranc, Anne; Mefflet-Piperel, Mathieu; Chassignet, Mathieu; Boulard, Severine; Carballes, Sandrine; Ducreux, Bertrand- Olivier; Barbusse, Stephane; Plassat, Gabriel; Pasquier, Maxime; Tremeac, Yann; Nauleau, Marie-Laure; Meunier, Laurent; Callonnec, Gael

    2017-08-01

    After a discussion of the evolution of the economic and demographic context, this report, illustrated by many data tables and graphs, proposes an analysis of demand evolutions in the building and urban sectors, in the transport and mobility sectors, and in the food, agriculture and soil use sectors, and assesses the corresponding evolution of final energy consumption. The second part analyses, discusses and assesses the various and substantial potentials for renewable energy production: a proposal of two variants for the decarbonised energy mix, biomass mobilisation, directly used renewable energy sources, the production mix of heat networks, and the gas system. It then proposes a synthetic overview and discussion of the evolution of energy supply. The evolution of greenhouse gas emissions is then assessed and discussed. Lessons learned from this update are highlighted, and perspectives are discussed. A summary version of this report is also provided

  5. A study of internet of things real-time data updating based on WebSocket

    Science.gov (United States)

    Wei, Shoulin; Yu, Konglin; Dai, Wei; Liang, Bo; Zhang, Xiaoli

    2015-12-01

    The Internet of Things (IoT) is gradually entering the industrial stage. Web applications in IoT, such as monitoring, instant messaging, and real-time quote systems, need changes to be transmitted to the client in real time without the client constantly refreshing and sending requests. These applications often need to be as fast as possible and provide nearly real-time components. Real-time data updating is becoming the core part of application-layer visualization technology in IoT. With support for data push on the server side, the running state of "Things" in IoT can be displayed in real time. This paper discusses several current real-time data updating methods and explores the advantages and disadvantages of each. We explore the use of WebSocket in a new approach for real-time data updating in IoT, since WebSocket provides a low-delay, low-network-throughput solution for full-duplex communication.
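
    A minimal sketch of server-side WebSocket push as discussed above; it assumes the third-party Python websockets package (the handler signature follows recent releases), and the device name and reading are invented, so treat this as an illustration of the push pattern rather than the paper's implementation.

        import asyncio, json, random
        import websockets   # third-party package; an assumption, not from the article

        async def push_sensor_state(websocket):
            """Push a simulated device reading to the connected client once per second."""
            while True:
                reading = {"device": "pump-01", "temperature": 20 + random.random()}
                await websocket.send(json.dumps(reading))   # server-initiated push, no client polling
                await asyncio.sleep(1)

        async def main():
            async with websockets.serve(push_sensor_state, "localhost", 8765):
                await asyncio.Future()                      # run forever

        if __name__ == "__main__":
            asyncio.run(main())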

  6. Update of CERN exchange network

    CERN Multimedia

    2003-01-01

    An update of the CERN exchange network will be done next April. Disturbances or even interruptions of telephony services may occur from 4th to 24th April during evenings from 18:30 to 00:00 but will not exceed 4 consecutive hours (see tentative planning below). In addition, the voice messaging system will be shut down on 26th March from 18:00 to 00:00. Calls supposed to be routed to the voice messaging system will not be possible during the shutdown. CERN divisions are invited to avoid any change requests (set-ups, moves or removals) of telephones and fax machines from 4th to 25th April. Everything will be done to minimize potential inconveniences which may occur during this update. There will be no loss of telephone functionalities. CERN GSM portable phones won't be affected by this change. Should you need more details, please send us your questions by email to Standard.Telephone@cern.ch. Tentative planning (date - change type - affected areas): 26 March - update of the voice messaging system - all CERN sites; 4 April - updat...

  7. 75 FR 41106 - Amendments to the Water Quality Regulations, Water Code and Comprehensive Plan to Update Water...

    Science.gov (United States)

    2010-07-15

    ... DELAWARE RIVER BASIN COMMISSION 18 CFR Part 410 Amendments to the Water Quality Regulations, Water Code and Comprehensive Plan to Update Water Quality Criteria for Toxic Pollutants in the Delaware... hold a public hearing to receive comments on proposed amendments to the Commission's Water Quality...

  8. Maximum Likelihood-Based Methods for Target Velocity Estimation with Distributed MIMO Radar

    Directory of Open Access Journals (Sweden)

    Zhenxin Cao

    2018-02-01

    Full Text Available The estimation problem for target velocity is addressed in this paper in the scenario of a distributed multiple-input multiple-output (MIMO) radar system. A maximum likelihood (ML)-based estimation method is derived with knowledge of the target position. Then, in the scenario without knowledge of the target position, an iterative method is proposed to estimate the target velocity by updating the position information iteratively. Moreover, the Cramér-Rao Lower Bounds (CRLBs) for both scenarios are derived, and the performance degradation of velocity estimation without the position information is also expressed. Simulation results show that the proposed estimation methods can approach the CRLBs, and the velocity estimation performance can be further improved by increasing either the number of radar antennas or the information accuracy of the target position. Furthermore, compared with the existing methods, a better estimation performance can be achieved.

  9. Breast Cancer and Estrogen-Alone Update

    Science.gov (United States)

    ... Breast Cancer and Estrogen-Alone Update (Summer 2006) ... hormone therapy does not increase the risk of breast cancer in postmenopausal women, according to an updated analysis ...

  10. Working Memory Updating as a Predictor of Academic Attainment

    Science.gov (United States)

    Lechuga, M. Teresa; Pelegrina, Santiago; Pelaez, Jose L.; Martin-Puga, M. Eva; Justicia, M. Jose

    2016-01-01

    There is growing evidence supporting the importance of executive functions, and specifically working memory updating (WMU), for children's academic achievement. This study aimed to assess the specific contribution of updating to the prediction of academic performance. Two updating tasks, which included different updating components, were…

  11. Study of Updating Initiating Event Frequency using Prognostics

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyeonmin; Lee, Sang-Hwan; Park, Jun-seok; Kim, Hyungdae; Chang, Yoon-Suk; Heo, Gyunyoung [Kyung Hee Univ., Yongin (Korea, Republic of)

    2014-10-15

    The Probabilistic Safety Assessment (PSA) model enables finding the relative priority of accident scenarios, weak points in achieving accident prevention or mitigation, and insights to improve those vulnerabilities. Thus, PSA requires realistic calculation for precise and confident results. However, the PSA model still has 'conservative' aspects in the procedures for developing a PSA model. One of the sources of the conservatism is the assumptions of the safety analysis and the estimation of failure frequency. Recently, Surveillance, Diagnosis, and Prognosis (SDP) has become a growing trend, applied in particular to space and aviation systems. Furthermore, a study dealing with the applicable areas and state-of-the-art status of SDP in the nuclear industry was published. SDP, which utilizes massive databases and information technology among such enabling techniques, is worth highlighting for its capability of alleviating the conservatism in conventional PSA. This paper reviews the concept of integrating PSA and SDP and suggests an updated methodology for Initiating Event (IE) frequencies using prognostics. In more detail, we focus on the IE of the Steam Generator Tube Rupture (SGTR) considering tube degradation. This paper extends our previously suggested research. In this paper, the concept of integrating PSA and SDP is suggested. Prognostics algorithms in SDP are applied to IEs and BEs (basic events) in the Level 1 PSA. As an example, updating the SGTR IE and its ageing was considered. Tube ageing was analyzed by using PASTA and the Monte Carlo method. After analyzing the tube ageing, the conventional SGTR IE was updated by using a Bayesian approach. The studied method can help to address the static nature and conservatism in PSA.

  12. Study of Updating Initiating Event Frequency using Prognostics

    International Nuclear Information System (INIS)

    Kim, Hyeonmin; Lee, Sang-Hwan; Park, Jun-seok; Kim, Hyungdae; Chang, Yoon-Suk; Heo, Gyunyoung

    2014-01-01

    The Probabilistic Safety Assessment (PSA) model enables finding the relative priority of accident scenarios, weak points in achieving accident prevention or mitigation, and insights to improve those vulnerabilities. Thus, PSA requires realistic calculation for precise and confident results. However, the PSA model still has 'conservative' aspects in the procedures for developing a PSA model. One of the sources of the conservatism is the assumptions of the safety analysis and the estimation of failure frequency. Recently, Surveillance, Diagnosis, and Prognosis (SDP) has become a growing trend, applied in particular to space and aviation systems. Furthermore, a study dealing with the applicable areas and state-of-the-art status of SDP in the nuclear industry was published. SDP, which utilizes massive databases and information technology among such enabling techniques, is worth highlighting for its capability of alleviating the conservatism in conventional PSA. This paper reviews the concept of integrating PSA and SDP and suggests an updated methodology for Initiating Event (IE) frequencies using prognostics. In more detail, we focus on the IE of the Steam Generator Tube Rupture (SGTR) considering tube degradation. This paper extends our previously suggested research. In this paper, the concept of integrating PSA and SDP is suggested. Prognostics algorithms in SDP are applied to IEs and BEs (basic events) in the Level 1 PSA. As an example, updating the SGTR IE and its ageing was considered. Tube ageing was analyzed by using PASTA and the Monte Carlo method. After analyzing the tube ageing, the conventional SGTR IE was updated by using a Bayesian approach. The studied method can help to address the static nature and conservatism in PSA.

  13. A New Key-lock Method for User Authentication and Access Control

    Institute of Scientific and Technical Information of China (English)

    JI Dongyao; ZHANG Futai; WANG Yumin

    2001-01-01

    We propose a new key-lock method for user authentication and access control based on the Chinese remainder theorem, the concepts of the access control matrix, key-lock-pair, time stamp, and the NS public key protocol. Our method is dynamic and needs a minimum amount of computation in the sense that it only updates at most one key/lock for each access request. We also demonstrate how an authentication protocol can be integrated into the access control method. By applying a time stamp, the method can not only withstand replay attack, but also strengthen the authenticating mechanism, which could not be achieved simultaneously in previous key-lock methods.
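
    For context, the sketch below illustrates the classic key-lock-pair idea built on the Chinese remainder theorem, where one lock per object encodes each user's access right modulo that user's key; this is a simplified textbook construction used only to make the mechanism concrete, not the authors' dynamic scheme, and the keys and rights are invented.

        from math import prod

        def make_lock(keys, rights):
            """Chinese remainder theorem: build one lock L with L mod key_i == rights_i for every user i."""
            M = prod(keys)
            lock = 0
            for k, r in zip(keys, rights):
                m = M // k
                lock += r * m * pow(m, -1, k)     # pow(m, -1, k) is the modular inverse (Python 3.8+)
            return lock % M

        keys = [7, 11, 13]                        # pairwise-coprime user keys
        rights = [1, 0, 3]                        # each user's access right to one object (e.g. read=1, own=3)
        lock = make_lock(keys, rights)
        assert [lock % k for k in keys] == rights # each user recovers only their own right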

  14. Machine learning in updating predictive models of planning and scheduling transportation projects

    Science.gov (United States)

    1997-01-01

    A method combining machine learning and regression analysis to automatically and intelligently update predictive models used in the Kansas Department of Transportation's (KDOT's) internal management system is presented. The predictive models used...

  15. Proposal and Implementation of a Robust Sensing Method for DVB-T Signal

    Science.gov (United States)

    Song, Chunyi; Rahman, Mohammad Azizur; Harada, Hiroshi

    This paper proposes a sensing method for TV signals of the DVB-T standard to realize effective TV White Space (TVWS) communication. In the TVWS technology trial organized by the Infocomm Development Authority (iDA) of Singapore, the requirement with regard to sensing level and sensing time is to detect a DVB-T signal at the level of -120 dBm over an 8 MHz channel with a sensing time below 1 second. To fulfill such a strict sensing requirement, we propose a smart sensing method which combines feature detection and energy detection (CFED), and which is also characterized by dynamic threshold selection (DTS) based on a threshold table to improve sensing robustness to noise uncertainty. The DTS-based CFED (DTS-CFED) is evaluated by computer simulations and is also implemented in a hardware sensing prototype. The results show that the DTS-CFED achieves a detection probability above 0.9 for a target false alarm probability of 0.1 for DVB-T signals at the level of -120 dBm over an 8 MHz channel with a sensing time equal to 0.1 second.
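
    A minimal energy detector with a threshold set from an assumed noise power and target false-alarm probability (Gaussian approximation of the noise-only statistic); this is only the plain energy-detection half of the story, not the paper's combined feature/energy detector or its threshold table, and the sample values are synthetic.

        import numpy as np
        from scipy.stats import norm

        def energy_detect(samples, noise_power, target_pfa=0.1):
            """Decide 'signal present' if the measured energy exceeds a noise-dependent threshold."""
            n = len(samples)
            energy = np.mean(np.abs(samples) ** 2)
            # Gaussian approximation of the noise-only energy statistic sets the threshold for the target Pfa.
            threshold = noise_power * (1 + norm.ppf(1 - target_pfa) / np.sqrt(n))
            return energy > threshold

        rng = np.random.default_rng(0)
        noise = rng.normal(0, 1, 10_000) + 1j * rng.normal(0, 1, 10_000)   # complex noise, power ~ 2
        print(energy_detect(noise, noise_power=2.0))                        # mostly False at Pfa = 0.1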

  16. THE USE OF MULTIPLE DATA SOURCES IN THE PROCESS OF TOPOGRAPHIC MAPS UPDATING

    Directory of Open Access Journals (Sweden)

    A. Cantemir

    2016-06-01

    Full Text Available The methods used in the process of updating maps have evolved and become more complex, especially with the development of digital technology. At the same time, the development of technology has led to an abundance of available data that can be used in the updating process. The data sources come in a great variety of forms and formats from different acquisition sensors. Satellite images provided by certain satellite missions are now available on space agency portals. Images stored in the archives of satellite missions such as Sentinel, Landsat and others can be downloaded free of charge. The main advantages are the large coverage area and rather good spatial resolution, which enable the use of these images for map updating at an appropriate scale. In our study we focused our research on these images for the 1:50,000 scale map. DEMs that are globally available can represent an appropriate input for watershed delineation and stream network generation, which can be used as support for updating the hydrography thematic layer. If, in addition to remote sensing, aerial photogrammetry and LiDAR data are used, the accuracy of the data sources is enhanced. Orthophotoimages and Digital Terrain Models are the main products that can be used for feature extraction and update. On the other hand, the use of georeferenced analogue basemaps represents a significant addition to the process. Concerning the thematic maps, the classic representation of the terrain by contour lines derived from a DTM remains the best method of depicting the earth's surface on a map; nevertheless, correlation with other layers such as hydrography is mandatory. In the context of the current national coverage of the Digital Terrain Model, one of the main concerns of the National Center of Cartography, through the Cartography and Photogrammetry Department, is the exploitation of the available data in order to update the layers of the Topographic Reference Map 1:5000, known as

  17. Update of identification and estimation of socioeconomic impacts resulting from perceived risks and changing images: An annotated bibliography

    International Nuclear Information System (INIS)

    Nieves, L.A.; Clark, D.E.; Wernette, D.

    1991-08-01

    This annotated bibliography reviews selected literature published through August 1991 on the identification of perceived risks and methods for estimating the economic impacts of risk perception. It updates the literature review found in Argonne National Laboratory report ANL/EAIS/TM-24 (February 1990). Included in this update are (1) a literature review of the risk perception process, of the relationship between risk perception and economic impacts, of economic methods and empirical applications, and interregional market interactions and adjustments; (2) a working bibliography (that includes the documents abstracted in the 1990 report); (3) a topical index to the abstracts found in both reports; and (4) abstracts of selected articles found in this update

  18. Update of identification and estimation of socioeconomic impacts resulting from perceived risks and changing images: An annotated bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Nieves, L.A.; Clark, D.E.; Wernette, D.

    1991-08-01

    This annotated bibliography reviews selected literature published through August 1991 on the identification of perceived risks and methods for estimating the economic impacts of risk perception. It updates the literature review found in Argonne National Laboratory report ANL/EAIS/TM-24 (February 1990). Included in this update are (1) a literature review of the risk perception process, of the relationship between risk perception and economic impacts, of economic methods and empirical applications, and interregional market interactions and adjustments; (2) a working bibliography (that includes the documents abstracted in the 1990 report); (3) a topical index to the abstracts found in both reports; and (4) abstracts of selected articles found in this update.

  19. Combining A Priori Knowledge and Sensor Information for Updating the Global Position of an Autonomous Vehicle

    NARCIS (Netherlands)

    Zivkovic, Z.; Schoute, Albert L.; van der Heijden, Ferdinand; van Amerongen, J.; Jonker, B.; Regtien, P.P.L; Stramigioli, S.

    The problem of updating the global position of an autonomous vehicle is considered. An iterative procedure is proposed to fit a map to a set of noisy measurements. The procedure is inspired by a non-parametric procedure for probability density function mode searching. We show how this could be used

  20. Expanded and updated data and a query pipeline for iBeetle-Base.

    Science.gov (United States)

    Dönitz, Jürgen; Gerischer, Lizzy; Hahnke, Stefan; Pfeiffer, Stefan; Bucher, Gregor

    2018-01-04

    The iBeetle-Base provides access to sequence and phenotype information for genes of the beetle Tribolium castaneum. It has been updated to include more and newer data and new functions. RNAi phenotypes are now available for >50% of the genes, which represents an expansion of 60% compared to the previous version. Gene sequence information has been updated based on the new official gene set OGS3 and covers all genes. Interoperability with FlyBase has been enhanced: first, gene information pages of homologous genes are interlinked between both databases; second, some steps of a new query pipeline allow transforming gene lists from either species into lists with related gene IDs, names or GO terms. This facilitates the comparative analysis of gene functions between fly and beetle. The backend of the pipeline is implemented as endpoints of a RESTful interface, such that it can be reused by other projects or tools. A novel online interface allows the community to propose GO terms for their gene of interest, expanding the range of animals where GO terms are defined. iBeetle-Base is available at http://ibeetle-base.uni-goettingen.de/. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  1. Channel stability in the Mackenzie Delta, NWT: 1992/93 update

    Energy Technology Data Exchange (ETDEWEB)

    Carson, M A

    1992-12-01

    An update is presented of a previous report on channel stability in the Mackenzie Delta, prepared as part of a program dealing with sediment-related aspects of northern hydrocarbon development. The report presents an overview of proprietary literature from industry in the 1970s dealing with channel stability in the outer delta, a review of work of the Geological Survey of Canada in 1990-91 dealing with channel stability at proposed pipeline crossings in the Niglintgak, Taglu and Swimming Point areas, and a search of Russian literature dealing with hydrothermal erosion. 45 refs., 46 figs.

  2. Extensions and Enhancements to “the Secure Remote Update Protocol”

    Directory of Open Access Journals (Sweden)

    Andrew John Poulter

    2017-09-01

    Full Text Available This paper builds on previous work introducing the Secure Remote Update Protocol (SRUP), a secure communications protocol for Command and Control applications in the Internet of Things, built on top of MQTT. This paper extends the original protocol and introduces a number of additional message types, adding further capabilities to the protocol. We also discuss the difficulty of proving that a physical device has an identity corresponding to a logical device on the network and propose a mechanism to overcome this within the protocol.

  3. 77 FR 67427 - Self-Regulatory Organizations; ICE Clear Europe Limited; Notice of Filing of Proposed Rule Change...

    Science.gov (United States)

    2012-11-09

    ... Rules provide updates related to anti-money laundering legislation applicable to customers, clarify... proposed rule change will have any impact or impose any burden on competition. C. Self-Regulatory...

  4. National Security in the Nuclear Age: Public Library Proposal and Booklist. May 1987 Update.

    Science.gov (United States)

    Dane, Ernest B.

    To increase public understanding of national security issues, this document proposes that a balanced and up-to-date collection of books and other materials on national security in the nuclear age be included in all U.S. public libraries. The proposal suggests that the books be grouped together on an identified shelf. Selection criteria for the…

  5. Proposal and Evaluation of Management Method for College Mechatronics Education Applying the Project Management

    Science.gov (United States)

    Ando, Yoshinobu; Eguchi, Yuya; Mizukawa, Makoto

    In this research, we proposed and evaluated a management method for college mechatronics education, applying project management to the curriculum. We applied our management method to the seminar “Microcomputer Seminar” for 3rd-grade students belonging to the Department of Electrical Engineering, Shibaura Institute of Technology. We succeeded in the management of the Microcomputer Seminar in 2006 and obtained a good evaluation of our management method by means of a questionnaire.

  6. Quantitative critical thinking: Student activities using Bayesian updating

    Science.gov (United States)

    Warren, Aaron R.

    2018-05-01

    One of the central roles of physics education is the development of students' ability to evaluate proposed hypotheses and models. This ability is important not just for students' understanding of physics but also to prepare students for future learning beyond physics. In particular, it is often hoped that students will better understand the manner in which physicists leverage the availability of prior knowledge to guide and constrain the construction of new knowledge. Here, we discuss how the use of Bayes' Theorem to update the estimated likelihood of hypotheses and models can help achieve these educational goals through its integration with evaluative activities that use hypothetico-deductive reasoning. Several types of classroom and laboratory activities are presented that engage students in the practice of Bayesian likelihood updating on the basis of either consistency with experimental data or consistency with pre-established principles and models. This approach is sufficiently simple for introductory physics students while offering a robust mechanism to guide relatively sophisticated student reflection concerning models, hypotheses, and problem-solutions. A quasi-experimental study utilizing algebra-based introductory courses is presented to assess the impact of these activities on student epistemological development. The results indicate gains on the Epistemological Beliefs Assessment for Physical Science (EBAPS) at a minimal cost of class-time.
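
    A minimal numerical illustration of the likelihood updating described above; the hypotheses, priors and likelihoods are invented for the example and are not taken from the study.

        def bayes_update(priors, likelihoods):
            """Posterior probability of each hypothesis after observing one piece of evidence."""
            unnormalized = [p * l for p, l in zip(priors, likelihoods)]
            total = sum(unnormalized)
            return [u / total for u in unnormalized]

        # Two competing models; the measured data are twice as probable under model A as under model B.
        priors = [0.5, 0.5]
        likelihoods = [0.8, 0.4]                   # P(data | A), P(data | B)
        print(bayes_update(priors, likelihoods))   # -> [0.667, 0.333]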

  7. Update on the Health Services Research Doctoral Core Competencies.

    Science.gov (United States)

    Burgess, James F; Menachemi, Nir; Maciejewski, Matthew L

    2018-03-13

    To present revised core competencies for doctoral programs in health services research (HSR), modalities to deliver these competencies, and suggested methods for assessing mastery of these competencies. Core competencies were originally developed in 2005, updated (but unpublished) in 2008, modestly updated for a 2016 HSR workforce conference, and revised based on feedback from attendees. Additional feedback was obtained from doctoral program directors, employer/workforce experts, and attendees of a presentation on these competencies at AcademyHealth's June 2017 Annual Research Meeting. The current version (V2.1) competencies include the ethical conduct of research, conceptual models, development of research questions, study designs, data measurement and collection methods, statistical methods for analyzing data, professional collaboration, and knowledge dissemination. These competencies represent a core that defines what HSR researchers should master in order to address the complexities of microsystem-to-macrosystem research that HSR entails. There are opportunities to conduct formal evaluation of newer delivery modalities (e.g., flipped classrooms) and to integrate the new Learning Health System Researcher Core Competencies, developed by AHRQ, into the HSR core competencies. Core competencies in HSR are a continually evolving work in progress because new research questions arise, new methods are developed, and the trans-disciplinary nature of the field leads to new multidisciplinary and team-building needs. © Health Research and Educational Trust.

  8. Macroscopic Quantum Resonators (MAQRO): 2015 update

    International Nuclear Information System (INIS)

    Kaltenbaek, Rainer; Aspelmeyer, Markus; Kiesel, Nikolai; Barker, Peter F.; Bose, Sougato; Bassi, Angelo; Bateman, James; Bongs, Kai; Cruise, Adrian Michael; Braxmaier, Claus; Brukner, Caslav; Christophe, Bruno; Rodrigues, Manuel; Chwalla, Michael; Johann, Ulrich; Cohadon, Pierre-Francois; Heidmann, Antoine; Lambrecht, Astrid; Reynaud, Serge; Curceanu, Catalina; Dholakia, Kishan; Mazilu, Michael; Diosi, Lajos; Doeringshoff, Klaus; Peters, Achim; Ertmer, Wolfgang; Rasel, Ernst M.; Gieseler, Jan; Novotny, Lukas; Rondin, Loic; Guerlebeck, Norman; Herrmann, Sven; Laemmerzahl, Claus; Hechenblaikner, Gerald; Hossenfelder, Sabine; Kim, Myungshik; Milburn, Gerard J.; Mueller, Holger; Paternostro, Mauro; Pikovski, Igor; Pilan Zanoni, Andre; Riedel, Charles Jess; Roura, Albert; Schleich, Wolfgang P.; Schmiedmayer, Joerg; Schuldt, Thilo; Schwab, Keith C.; Tajmar, Martin; Tino, Guglielmo M.; Ulbricht, Hendrik; Ursin, Rupert; Vedral, Vlatko

    2016-01-01

    Do the laws of quantum physics still hold for macroscopic objects - this is at the heart of Schroedinger's cat paradox - or do gravitation or yet unknown effects set a limit for massive particles? What is the fundamental relation between quantum physics and gravity? Ground-based experiments addressing these questions may soon face limitations due to limited free-fall times and the quality of vacuum and microgravity. The proposed mission Macroscopic Quantum Resonators (MAQRO) may overcome these limitations and allow addressing such fundamental questions. MAQRO harnesses recent developments in quantum optomechanics, high-mass matter-wave interferometry as well as state-of-the-art space technology to push macroscopic quantum experiments towards their ultimate performance limits and to open new horizons for applying quantum technology in space. The main scientific goal is to probe the vastly unexplored 'quantum-classical' transition for increasingly massive objects, testing the predictions of quantum theory for objects in a size and mass regime unachievable in ground-based experiments. The hardware will largely be based on available space technology. Here, we present the MAQRO proposal submitted in response to the 4th Cosmic Vision call for a medium-sized mission (M4) in 2014 of the European Space Agency (ESA) with a possible launch in 2025, and we review the progress with respect to the original MAQRO proposal for the 3rd Cosmic Vision call for a medium-sized mission (M3) in 2010. In particular, the updated proposal overcomes several critical issues of the original proposal by relying on established experimental techniques from high-mass matter-wave interferometry and by introducing novel ideas for particle loading and manipulation. Moreover, the mission design was improved to better fulfill the stringent environmental requirements for macroscopic quantum experiments. (orig.)

  9. Macroscopic Quantum Resonators (MAQRO): 2015 update

    Energy Technology Data Exchange (ETDEWEB)

    Kaltenbaek, Rainer [University of Vienna, Vienna Center for Quantum Science and Technology, Vienna (Austria); Aspelmeyer, Markus; Kiesel, Nikolai [University of Vienna, Vienna Center for Quantum Science and Technology, Vienna (Austria); Barker, Peter F.; Bose, Sougato [University College London, Department of Physics and Astronomy, London (United Kingdom); Bassi, Angelo [University of Trieste, Department of Physics, Trieste (Italy); INFN - Trieste Section, Trieste (Italy); Bateman, James [University of Swansea, Department of Physics, College of Science, Swansea (United Kingdom); Bongs, Kai; Cruise, Adrian Michael [University of Birmingham, School of Physics and Astronomy, Birmingham (United Kingdom); Braxmaier, Claus [University of Bremen, Center of Applied Space Technology and Micro Gravity (ZARM), Bremen (Germany); Institute of Space Systems, German Aerospace Center (DLR), Bremen (Germany); Brukner, Caslav [University of Vienna, Vienna Center for Quantum Science and Technology, Vienna (Austria); Austrian Academy of Sciences, Institute of Quantum Optics and Quantum Information (IQOQI), Vienna (Austria); Christophe, Bruno; Rodrigues, Manuel [The French Aerospace Lab, ONERA, Chatillon (France); Chwalla, Michael; Johann, Ulrich [Airbus Defence and Space GmbH, Immenstaad (Germany); Cohadon, Pierre-Francois; Heidmann, Antoine; Lambrecht, Astrid; Reynaud, Serge [ENS-PSL Research University, Laboratoire Kastler Brossel, UPMC-Sorbonne Universites, CNRS, College de France, Paris (France); Curceanu, Catalina [Laboratori Nazionali di Frascati dell' INFN, Frascati (Italy); Dholakia, Kishan; Mazilu, Michael [University of St. Andrews, School of Physics and Astronomy, St. Andrews (United Kingdom); Diosi, Lajos [Wigner Research Center for Physics, P.O. Box 49, Budapest (Hungary); Doeringshoff, Klaus; Peters, Achim [Humboldt-Universitaet zu Berlin, Institut fuer Physik, Berlin (Germany); Ertmer, Wolfgang; Rasel, Ernst M. [Leibniz Universitaet Hannover, Institut fuer Quantenoptik, Hannover (Germany); Gieseler, Jan; Novotny, Lukas; Rondin, Loic [ETH Zuerich, Photonics Laboratory, Zuerich (Switzerland); Guerlebeck, Norman; Herrmann, Sven; Laemmerzahl, Claus [University of Bremen, Center of Applied Space Technology and Micro Gravity (ZARM), Bremen (Germany); Hechenblaikner, Gerald [Airbus Defence and Space GmbH, Immenstaad (Germany); European Southern Observatory (ESO), Garching bei Muenchen (Germany); Hossenfelder, Sabine [KTH Royal Institute of Technology and Stockholm University, Nordita, Stockholm (Sweden); Kim, Myungshik [Imperial College London, QOLS, Blackett Laboratory, London (United Kingdom); Milburn, Gerard J. [University of Queensland, ARC Centre for Engineered Quantum Systems, Brisbane (Australia); Mueller, Holger [University of California, Department of Physics, Berkeley, CA (United States); Paternostro, Mauro [Queen' s University, Centre for Theoretical Atomic, Molecular and Optical Physics, School of Mathematics and Physics, Belfast (United Kingdom); Pikovski, Igor [Harvard-Smithsonian Center for Astrophysics, ITAMP, Cambridge, MA (United States); Pilan Zanoni, Andre [Airbus Defence and Space GmbH, Immenstaad (Germany); CERN - European Organization for Nuclear Research, EN-STI-TCD, Geneva (Switzerland); Riedel, Charles Jess [Perimeter Institute for Theoretical Physics, Waterloo, ON (Canada); Roura, Albert [Universitaet Ulm, Institut fuer Quantenphysik, Ulm (Germany); Schleich, Wolfgang P. 
[Universitaet Ulm, Institut fuer Quantenphysik, Ulm (Germany); Texas A and M University Institute for Advanced Study (TIAS), Institute for Quantum Science and Engineering (IQSE), and Department of Physics and Astronomy, College Station, TX (United States); Schmiedmayer, Joerg [Vienna University of Technology, Vienna Center for Quantum Science and Technology, Institute of Atomic and Subatomic Physics, Vienna (Austria); Schuldt, Thilo [Institute of Space Systems, German Aerospace Center (DLR), Bremen (Germany); Schwab, Keith C. [California Institute of Technology, Applied Physics, Pasadena, CA (United States); Tajmar, Martin [Technische Universitaet Dresden, Institut fuer Luft- und Raumfahrttechnik, Dresden (Germany); Tino, Guglielmo M. [Universita di Firenze, Dipartimento di Fisica e Astronomia and LENS, INFN, Sesto Fiorentino, Firenze (Italy); Ulbricht, Hendrik [University of Southampton, Physics and Astronomy, Southampton (United Kingdom); Ursin, Rupert [Austrian Academy of Sciences, Institute of Quantum Optics and Quantum Information (IQOQI), Vienna (Austria); Vedral, Vlatko [University of Oxford, Atomic and Laser Physics, Clarendon Laboratory, Oxford (United Kingdom); National University of Singapore, Center for Quantum Technologies, Singapore (SG)

    2016-12-15

    Do the laws of quantum physics still hold for macroscopic objects - this is at the heart of Schroedinger's cat paradox - or do gravitation or yet unknown effects set a limit for massive particles? What is the fundamental relation between quantum physics and gravity? Ground-based experiments addressing these questions may soon face limitations due to limited free-fall times and the quality of vacuum and microgravity. The proposed mission Macroscopic Quantum Resonators (MAQRO) may overcome these limitations and allow addressing such fundamental questions. MAQRO harnesses recent developments in quantum optomechanics, high-mass matter-wave interferometry as well as state-of-the-art space technology to push macroscopic quantum experiments towards their ultimate performance limits and to open new horizons for applying quantum technology in space. The main scientific goal is to probe the vastly unexplored 'quantum-classical' transition for increasingly massive objects, testing the predictions of quantum theory for objects in a size and mass regime unachievable in ground-based experiments. The hardware will largely be based on available space technology. Here, we present the MAQRO proposal submitted in response to the 4th Cosmic Vision call for a medium-sized mission (M4) in 2014 of the European Space Agency (ESA) with a possible launch in 2025, and we review the progress with respect to the original MAQRO proposal for the 3rd Cosmic Vision call for a medium-sized mission (M3) in 2010. In particular, the updated proposal overcomes several critical issues of the original proposal by relying on established experimental techniques from high-mass matter-wave interferometry and by introducing novel ideas for particle loading and manipulation. Moreover, the mission design was improved to better fulfill the stringent environmental requirements for macroscopic quantum experiments. (orig.)

  10. Update of CERN exchange network

    CERN Multimedia

    2003-01-01

    An update of the CERN exchange network will be done next April. Disturbances or even interruptions of telephony services may occur from 4th to 24th April during evenings from 18:30 to 00:00 but will not exceed 4 consecutive hours (see tentative planning below). In addition, the voice messaging system will be shut down on 26th April from 18:00 to 00:00. Calls supposed to be routed to the voice messaging system will not be possible during the shutdown. CERN divisions are invited to avoid any change requests (set-ups, moves or removals) of telephones and fax machines from 4th to 25th April. Everything will be done to minimize potential inconveniences which may occur during this update. There will be no loss of telephone functionalities. CERN GSM portable phones won't be affected by this change. Should you need more details, please send us your questions by email to Standard.Telephone@cern.ch. Date Change type Affected areas April 8 Update of switch in LHC 7 LHC 7 Point April 9 Update of...

  11. A new preconditioner update strategy for the solution of sequences of linear systems in structural mechanics: application to saddle point problems in elasticity

    Science.gov (United States)

    Mercier, Sylvain; Gratton, Serge; Tardieu, Nicolas; Vasseur, Xavier

    2017-12-01

    Many applications in structural mechanics require the numerical solution of sequences of linear systems typically issued from a finite element discretization of the governing equations on fine meshes. The method of Lagrange multipliers is often used to take into account mechanical constraints. The resulting matrices then exhibit a saddle point structure, and the iterative solution of such preconditioned linear systems is considered challenging. A popular strategy is then to combine preconditioning and deflation to yield an efficient method. We propose an alternative that is applicable to the general case and not only to matrices with a saddle point structure. In this approach, we consider updating an existing algebraic or application-based preconditioner, using specific available information that exploits knowledge of an approximate invariant subspace or of matrix-vector products. The resulting preconditioner has the form of a limited memory quasi-Newton matrix and requires a small number of linearly independent vectors. Numerical experiments performed on three large-scale applications in elasticity highlight the relevance of the new approach. We show that the proposed method outperforms the deflation method when considering sequences of linear systems with varying matrices.
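
    A minimal sketch of the kind of limited-memory update described above, assuming a symmetric positive definite matrix A, an existing preconditioner applied by the callable apply_M, and a tall matrix S whose few columns span an approximate invariant subspace (all names are illustrative, not the authors' code):

    import numpy as np

    def lmp_update(apply_M, A, S):
        """Limited-memory quasi-Newton-style update of an existing preconditioner.
        apply_M : callable applying the current preconditioner M to a vector
        A       : SPD system matrix (dense here for simplicity)
        S       : n x k matrix of linearly independent vectors (k small)
        Returns a callable applying the updated preconditioner H."""
        AS = A @ S
        G = np.linalg.inv(S.T @ AS)              # small k x k Gram matrix
        def apply_H(r):
            r1 = r - AS @ (G @ (S.T @ r))        # deflate the captured subspace
            z = apply_M(r1)                      # apply the existing preconditioner
            z = z - S @ (G @ (AS.T @ z))         # symmetrize the correction
            return z + S @ (G @ (S.T @ r))       # exact action on the subspace
        return apply_H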

  12. N Reactor updated safety analysis report, NUSAR

    International Nuclear Information System (INIS)

    1978-01-01

    An update of the N Reactor safety analysis is presented to reconfirm that the continued operation does not pose undue risk to DOE personnel and property, the public, or the environment. A reanalysis of LOCA and reactivity transients utilizing current codes and methods is made. The principal aspects of the overall submission, a general description, and site characteristics including geography and demography, nearby industrial, transportation and military facilities, meteorology, hydraulic engineering, and geology and seismology are described

  13. Gloss uniformity measurement update for ISO/IEC 19751

    Science.gov (United States)

    Ng, Yee S.; Cui, Chengwu; Kuo, Chunghui; Maggard, Eric; Mashtare, Dale; Morris, Peter

    2005-01-01

    To address the standardization issues of perceptually based image quality for printing systems, ISO/IEC JTC1/SC28, the standardization committee for office equipment, chartered the W1.1 project with the responsibility of drafting a proposal for an international standard for the evaluation of printed image quality1. An ISO draft Standard2, ISO/WD 19751-1, Office Equipment - Appearance-based image quality standards for printers - Part 1: Overview, Procedure and Common Methods, 2004, describes the overview of this multi-part appearance-based image quality standard. One of the tasks of the ISO 19751 multi-part Standard is to address the appearance-based gloss and gloss uniformity issues (in ISO 19751-2). This paper summarizes the current status and technical progress since the last two updates3, 4. In particular, we discuss our attempt to include the 75-degree gloss (G75) objective measurement5 in differential gloss and within-page gloss uniformity. The result of a round-robin experiment involving objective measurement of differential gloss using G60 and G75 gloss measurement geometries is described. The results of two perceptual round-robin experiments, relating to the effect of haze on the perception of gloss and to gloss artifacts (gloss streaks/bands, gloss graininess/mottle), are discussed.

  14. Decentralized Gauss-Newton method for nonlinear least squares on wide area network

    Science.gov (United States)

    Liu, Lanchao; Ling, Qing; Han, Zhu

    2014-10-01

    This paper presents a decentralized approach of the Gauss-Newton (GN) method for nonlinear least squares (NLLS) on a wide area network (WAN). In a multi-agent system, a centralized GN for NLLS requires the global GN Hessian matrix to be available at a central computing unit, which may incur large communication overhead. In the proposed decentralized alternative, each agent only needs its local GN Hessian matrix to update iterates with the cooperation of neighbors. The detailed formulation of decentralized NLLS on WAN is given, and the iteration at each agent is defined. The convergence property of the decentralized approach is analyzed, and numerical results validate the effectiveness of the proposed algorithm.
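
    As a rough illustration of the idea rather than the paper's exact algorithm, the sketch below mixes local Gauss-Newton Hessians and gradients across neighbors by a few rounds of average consensus before each agent takes its own GN step; residual_fns, jacobian_fns and neighbors are hypothetical placeholders:

    import numpy as np

    def decentralized_gn_step(x, residual_fns, jacobian_fns, neighbors, n_consensus=10):
        """One decentralized Gauss-Newton iterate (illustrative sketch).
        Each agent i evaluates its local residual r_i(x) and Jacobian J_i(x),
        then mixes the local normal-equation terms with its neighbors
        (neighbors[i] lists agent i's neighbors, including itself)."""
        n_agents = len(residual_fns)
        H = [jacobian_fns[i](x).T @ jacobian_fns[i](x) for i in range(n_agents)]
        g = [jacobian_fns[i](x).T @ residual_fns[i](x) for i in range(n_agents)]
        for _ in range(n_consensus):             # consensus rounds over the WAN
            H = [np.mean([H[j] for j in neighbors[i]], axis=0) for i in range(n_agents)]
            g = [np.mean([g[j] for j in neighbors[i]], axis=0) for i in range(n_agents)]
        # each agent solves its own small normal equation with the mixed terms
        return [x - np.linalg.solve(H[i], g[i]) for i in range(n_agents)]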

  15. A Damage Prognosis Method of Girder Structures Based on Wavelet Neural Networks

    Directory of Open Access Journals (Sweden)

    Rumian Zhong

    2014-01-01

    Full Text Available Based on the basic theory of wavelet neural networks and the finite element model updating method, a basic framework of a damage prognosis method is proposed in this paper. Firstly, a damaged I-steel beam model test is used to verify the feasibility and effectiveness of the proposed damage prognosis method. The results show that the predicted results of the damage prognosis method and the measured results are very consistent, and the maximum error is less than 5%. Furthermore, the Xinyihe Bridge on the Beijing-Shanghai Highway is selected as the engineering background, and the damage prognosis is conducted based on the data from the structural health monitoring system. The results show that the traffic volume will increase and seasonal differences will decrease in the next year and a half. The displacement shows a slight increase and seasonal characteristics in the critical mid-span section, but the strain will increase distinctly. The analysis results indicate that the proposed method can be applied to the damage prognosis of girder bridge structures and has potential for bridge health monitoring and safety prognosis.

  16. A Proposal of Estimation Methodology to Improve Calculation Efficiency of Sampling-based Method in Nuclear Data Sensitivity and Uncertainty Analysis

    International Nuclear Information System (INIS)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung; Noh, Jae Man

    2014-01-01

    The uncertainty with the sampling-based method is evaluated by repeating transport calculations with a number of cross section data sets sampled from the covariance uncertainty data. In the transport calculation with the sampling-based method, the transport equation is not modified; therefore, all uncertainties of the responses such as k_eff, reaction rates, flux and power distribution can be directly obtained all at one time without code modification. However, a major drawback of the sampling-based method is that it requires an expensive computational load to obtain statistically reliable results (within a 0.95 confidence level) in the uncertainty analysis. The purpose of this study is to develop a method for improving the computational efficiency and obtaining highly reliable uncertainty results when using the sampling-based method with Monte Carlo simulation. The proposed method reduces the convergence time of the response uncertainty by using multiple sets of sampled group cross sections in a single Monte Carlo simulation. The proposed method was verified on the GODIVA benchmark problem and the results were compared with those of the conventional sampling-based method. In this study, a sampling-based method based on the central limit theorem is proposed to improve calculation efficiency by reducing the number of repetitive Monte Carlo transport calculations required to obtain reliable uncertainty analysis results. Each set of sampled group cross sections is assigned to one active cycle group in a single Monte Carlo simulation. The criticality uncertainty for the GODIVA problem is evaluated by the proposed and the previous method. The results show that the proposed sampling-based method can efficiently decrease the number of Monte Carlo simulations required to evaluate the uncertainty of k_eff. It is expected that the proposed method will improve the computational efficiency of uncertainty analysis with the sampling-based method.
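
    For reference, the conventional brute-force estimate that the proposed method accelerates can be sketched as follows, with transport_solver standing in for a full Monte Carlo transport calculation returning k_eff for one sampled cross-section set (all names are hypothetical):

    import numpy as np

    def sampled_keff_uncertainty(transport_solver, xs_mean, xs_cov, n_samples, rng):
        """Conventional sampling-based uncertainty estimate: draw cross-section
        sets from the covariance data, rerun the transport calculation for each,
        and take the spread of k_eff as the nuclear-data uncertainty."""
        samples = rng.multivariate_normal(xs_mean, xs_cov, size=n_samples)
        keff = np.array([transport_solver(xs) for xs in samples])
        return keff.mean(), keff.std(ddof=1)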

  17. Spumaretroviruses: Updated taxonomy and nomenclature.

    Science.gov (United States)

    Khan, Arifa S; Bodem, Jochen; Buseyne, Florence; Gessain, Antoine; Johnson, Welkin; Kuhn, Jens H; Kuzmak, Jacek; Lindemann, Dirk; Linial, Maxine L; Löchelt, Martin; Materniak-Kornas, Magdalena; Soares, Marcelo A; Switzer, William M

    2018-03-01

    Spumaretroviruses, commonly referred to as foamy viruses, are complex retroviruses belonging to the subfamily Spumaretrovirinae, family Retroviridae, which naturally infect a variety of animals including nonhuman primates (NHPs). Additionally, cross-species transmissions of simian foamy viruses (SFVs) to humans have occurred following exposure to tissues of infected NHPs. Recent research has led to the identification of previously unknown exogenous foamy viruses, and to the discovery of endogenous spumaretrovirus sequences in a variety of host genomes. Here, we describe an updated spumaretrovirus taxonomy that has been recently accepted by the International Committee on Taxonomy of Viruses (ICTV) Executive Committee, and describe a virus nomenclature that is generally consistent with that used for other retroviruses, such as lentiviruses and deltaretroviruses. This taxonomy can be applied to distinguish different, but closely related, primate (e.g., human, ape, simian) foamy viruses as well as those from other hosts. This proposal accounts for host-virus co-speciation and cross-species transmission. Published by Elsevier Inc.

  18. Proposal for outline of training and evaluation method for non-technical skills

    International Nuclear Information System (INIS)

    Nagasaka, Akihiko; Shibue, Hisao

    2015-01-01

    The purpose of this study is to systematize measures for the improvement of emergency response capability focused on non-technical skills. As a result of investigating emergency training at nuclear power plants and referring to CRM training, the following two issues were identified. 1) Lack of a practical training method for the improvement of non-technical skills. 2) Lack of an evaluation method for non-technical skills. Then, on the premise that the 7 non-technical skills 'situational awareness', 'decision making', 'communication', 'teamworking', 'leadership', 'managing stress' and 'coping with fatigue' are promoting factors for emergency response capability, we propose a practical training method for each non-technical skill. We also give examples of behavioral markers as evaluation factors, and indicate approaches to introduce the evaluation method of non-technical skills. (author)

  19. Updated US and Canadian Normalization Factors for TRACI 2.1

    Science.gov (United States)

    The objective of this study is to update the normalization factors (NFs) of U.S. EPA's TRACI 2.1 LCIA method (Bare, 2012) for the United States (US) and US-Canadian (US-CA) regions. This is done for the reference year 2008. This was deemed necessary to maintain the representative...

  20. Online updating procedures for a real-time hydrological forecasting system

    International Nuclear Information System (INIS)

    Kahl, B; Nachtnebel, H P

    2008-01-01

    Rainfall-runoff models can explain major parts of the natural runoff pattern but never simulate the observed hydrograph exactly. The errors stem from various sources of uncertainty embedded in the model forecasting system: measurement errors, the selected time period for calibration and validation, parametric uncertainty, and model imprecision. In online forecasting systems, forecast input data are used, which adds a further major uncertainty to the hydrological forecasting system. Techniques for partially compensating these uncertainties are investigated in the present study in a medium-sized catchment in the Austrian part of the Danube basin. The catchment area is about 1000 km². The forecasting system consists of a semi-distributed continuous rainfall-runoff model that uses quantitative precipitation and temperature forecasts. To provide adequate system states at the beginning of the forecasting period, continuous simulation is required, especially in winter. In this study two online updating methods are used and combined to enhance the runoff forecasts. The first method updates the system states at the beginning of the forecasting period by changing the precipitation input. The second method is an autoregressive error model, which is used to eliminate systematic errors in the model output. In combination these two methods work well together, as each method is more effective in different runoff situations.
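
    A minimal sketch of the second updating step, an autoregressive error model of order one that propagates the last simulation error into the forecast horizon (the real system may use a different model order and fitting procedure):

    import numpy as np

    def ar1_error_update(observed, simulated, forecast):
        """Fit rho on past residuals e_t = observed - simulated and decay the
        last residual into the forecast period to remove systematic bias."""
        e = np.asarray(observed, dtype=float) - np.asarray(simulated, dtype=float)
        rho = np.sum(e[1:] * e[:-1]) / np.sum(e[:-1] ** 2)   # lag-1 regression
        corrected = np.array(forecast, dtype=float)
        last = e[-1]
        for k in range(len(corrected)):
            last *= rho                                      # error decays towards zero
            corrected[k] += last
        return corrected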

  1. 78 FR 79009 - Proposed Information Collection; Radiation Sampling and Exposure Records (Pertains to Underground...

    Science.gov (United States)

    2013-12-27

    ... soliciting comments concerning the proposed information collection for updating Radiation Sampling and... exposed with no adverse effects have been established and are expressed as working levels (WL). The... mandatory samplings. Records must include the sample date, location, and results, and must be retained at...

  2. Application of Quasi-Newton methods to the analysis of axisymmetric pressure vessels

    International Nuclear Information System (INIS)

    Parisi, D.A.C.

    1987-01-01

    This work studies the application of Quasi-Newton techniques to the material nonlinear analysis of axisymmetrical pressure vessels by the finite element method. In the formulation the material behavior is described by an isotropic elastoplastic model with strain hardening. The continuum is discretized through triangular finite elements of axisymmetrical solids with linear interpolation of the displacement field. The incremental governing equations are derived from the principle of virtual work. The system of simultaneous nonlinear equations is solved iteratively by the Quasi-Newton method employing the BFGS update. The numerical performance of the proposed method is compared with the Newton-Raphson method and some of its variants through selected examples. (author) [pt
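
    The BFGS update in such equilibrium iterations is usually applied matrix-free; a compact sketch of the standard two-loop recursion, assuming a factorized initial stiffness applied by apply_K0_inv (names are illustrative):

    import numpy as np

    def bfgs_direction(residual, s_list, y_list, apply_K0_inv):
        """Apply the inverse-BFGS-updated stiffness to the current residual.
        s_list, y_list hold displacement and residual increments from earlier
        iterations (oldest first); apply_K0_inv applies the factorized initial
        tangent stiffness."""
        q = residual.copy()
        alphas = []
        for s, y in reversed(list(zip(s_list, y_list))):     # newest pair first
            a = (s @ q) / (y @ s)
            alphas.append(a)
            q -= a * y
        r = apply_K0_inv(q)
        for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):
            b = (y @ r) / (y @ s)
            r += (a - b) * s
        return r          # quasi-Newton estimate of K^-1 times the residual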

  3. Better Plants Progress Update Fall 2013

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2013-09-23

    This Progress Update summarizes the significant energy saving achievements and cumulative cost savings made by these industry leaders from 2010-2012. The update also shares the plans and priorities over the next year for the Better Plants Program to continue to advance energy efficiency in the industrial sector.

  4. Non-Linear Approximation of Bayesian Update

    KAUST Repository

    Litvinenko, Alexander

    2016-01-01

    We develop a non-linear approximation of the expensive Bayesian formula. This non-linear approximation is applied directly to the Polynomial Chaos Coefficients. In this way, we avoid Monte Carlo sampling and sampling error. We can show that the famous Kalman Update formula is a particular case of this update.

  5. Non-Linear Approximation of Bayesian Update

    KAUST Repository

    Litvinenko, Alexander

    2016-06-23

    We develop a non-linear approximation of the expensive Bayesian formula. This non-linear approximation is applied directly to the Polynomial Chaos Coefficients. In this way, we avoid Monte Carlo sampling and sampling error. We can show that the famous Kalman Update formula is a particular case of this update.
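
    For concreteness, the linear-Gaussian special case recovered by the approximation is the familiar Kalman update; a small sketch with illustrative variable names:

    import numpy as np

    def kalman_update(x_prior, P_prior, H, R, y):
        """Linear Bayesian (Kalman) update of mean x_prior and covariance P_prior
        given observation y = H x + noise with noise covariance R."""
        S = H @ P_prior @ H.T + R                    # innovation covariance
        K = P_prior @ H.T @ np.linalg.inv(S)         # Kalman gain
        x_post = x_prior + K @ (y - H @ x_prior)
        P_post = (np.eye(len(x_prior)) - K @ H) @ P_prior
        return x_post, P_post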

  6. Survey on Prognostics Techniques for Updating Initiating Event Frequency in PSA

    International Nuclear Information System (INIS)

    Kim, Hyeonmin; Heo, Gyunyoung

    2015-01-01

    One of the applications of PSA is the risk monitor. Risk monitoring is a real-time analysis tool for determining the current risk based on the actual state of components and systems. In order to utilize it more effectively, methodologies that manipulate data from prognostics have been suggested. Generally, prognostics comprehensively includes not only prognosis but also monitoring and diagnostics. Prognostic methods require condition monitoring. When PHM is applied to a PSA model, the latest condition of NPPs can be identified more clearly. To reduce conservatism and uncertainties, we previously suggested a concept that updates the initiating event frequency in a PSA model using a Bayesian approach, which is one of the prognostics techniques. Previous research showed the possibility of updating PSA more correctly by using such data. In reliability theory, the bathtub curve is divided into three parts (infant failure, constant and random failure, and wearout failure). In this paper, in order to investigate the applicability of prognostic methods in updating quantitative data in a PSA model, the OLM acceptance criteria from NUREG, the concept of how to use prognostics in PSA, and the enabling prognostic techniques are presented. The motivation for prognostics is that improved predictive capabilities using existing monitoring systems, data, and information will enable more accurate equipment risk assessment for improved decision-making.

  7. Survey on Prognostics Techniques for Updating Initiating Event Frequency in PSA

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyeonmin; Heo, Gyunyoung [Kyung Hee University, Yongin (Korea, Republic of)

    2015-05-15

    One of the applications of PSA is the risk monitor. Risk monitoring is a real-time analysis tool for determining the current risk based on the actual state of components and systems. In order to utilize it more effectively, methodologies that manipulate data from prognostics have been suggested. Generally, prognostics comprehensively includes not only prognosis but also monitoring and diagnostics. Prognostic methods require condition monitoring. When PHM is applied to a PSA model, the latest condition of NPPs can be identified more clearly. To reduce conservatism and uncertainties, we previously suggested a concept that updates the initiating event frequency in a PSA model using a Bayesian approach, which is one of the prognostics techniques. Previous research showed the possibility of updating PSA more correctly by using such data. In reliability theory, the bathtub curve is divided into three parts (infant failure, constant and random failure, and wearout failure). In this paper, in order to investigate the applicability of prognostic methods in updating quantitative data in a PSA model, the OLM acceptance criteria from NUREG, the concept of how to use prognostics in PSA, and the enabling prognostic techniques are presented. The motivation for prognostics is that improved predictive capabilities using existing monitoring systems, data, and information will enable more accurate equipment risk assessment for improved decision-making.
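
    One simple, standard way to realize the Bayesian updating of an initiating event frequency mentioned above is a conjugate gamma-Poisson update with new operating evidence; the sketch below illustrates that idea and is not the authors' implementation:

    from scipy import stats

    def update_ie_frequency(alpha0, beta0, n_events, exposure_time):
        """Conjugate gamma-Poisson update of an initiating event frequency.
        alpha0, beta0   : prior gamma parameters (pseudo-events, pseudo reactor-years)
        n_events        : events observed in the new evidence period
        exposure_time   : corresponding exposure (e.g. reactor-years)"""
        alpha_post = alpha0 + n_events
        beta_post = beta0 + exposure_time
        posterior = stats.gamma(a=alpha_post, scale=1.0 / beta_post)
        return posterior.mean(), posterior.interval(0.90)    # point estimate, 90% interval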

  8. Online updating and uncertainty quantification using nonstationary output-only measurement

    Science.gov (United States)

    Yuen, Ka-Veng; Kuok, Sin-Chi

    2016-01-01

    Extended Kalman filter (EKF) is widely adopted for state estimation and parametric identification of dynamical systems. In this algorithm, it is required to specify the covariance matrices of the process noise and measurement noise based on prior knowledge. However, improper assignment of these noise covariance matrices leads to unreliable estimation and misleading uncertainty estimation on the system state and model parameters. Furthermore, it may induce diverging estimation. To resolve these problems, we propose a Bayesian probabilistic algorithm for online estimation of the noise parameters which are used to characterize the noise covariance matrices. There are three major appealing features of the proposed approach. First, it resolves the divergence problem in the conventional usage of EKF due to improper choice of the noise covariance matrices. Second, the proposed approach ensures the reliability of the uncertainty quantification. Finally, since the noise parameters are allowed to be time-varying, nonstationary process noise and/or measurement noise are explicitly taken into account. Examples using stationary/nonstationary response of linear/nonlinear time-varying dynamical systems are presented to demonstrate the efficacy of the proposed approach. Furthermore, comparison with the conventional usage of EKF will be provided to reveal the necessity of the proposed approach for reliable model updating and uncertainty quantification.
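
    A bare-bones extended Kalman filter step is sketched below; in the proposed approach the noise covariances Q and R (or the noise parameters characterizing them) would themselves be estimated online from the measured response rather than fixed in advance. All function and variable names are illustrative:

    import numpy as np

    def ekf_step(x, P, f, F, h, H, Q, R, y):
        """One EKF predict/update step with state transition f (Jacobian F),
        observation map h (Jacobian H), process noise Q, measurement noise R,
        and new measurement y."""
        # predict
        x_pred = f(x)
        P_pred = F(x) @ P @ F(x).T + Q
        # update
        Hk = H(x_pred)
        S = Hk @ P_pred @ Hk.T + R
        K = P_pred @ Hk.T @ np.linalg.inv(S)
        x_new = x_pred + K @ (y - h(x_pred))
        P_new = (np.eye(len(x)) - K @ Hk) @ P_pred
        return x_new, P_new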

  9. Evaluating the fermionic determinant of dynamical configurations

    International Nuclear Information System (INIS)

    Hasenfratz, Anna; Alexandru, Andrei

    2002-01-01

    We propose and study an improved method to calculate the fermionic determinant of dynamical configurations. The evaluation or at least stochastic estimation of the ratios of fermionic determinants is essential for a recently proposed updating method of smeared link dynamical fermions. This update creates a sequence of configurations by changing a subset of the gauge links by a pure gauge heat bath or over-relaxation step. The acceptance of the proposed configuration depends on the ratio of the fermionic determinants on the new and original configurations. We study this ratio as a function of the number of links that are changed in the heat bath update. We find that even when every link of a given direction and parity of a 10 fm⁴ configuration is updated, the average of the determinant ratio is still close to one and with the improved stochastic estimator the proposed change is accepted with about 20% probability. This improvement suggests that the new updating technique can be efficient even on large lattices and could provide an updating method for dynamical overlap actions

  10. A proposal framework for investigating website success in the context of e-banking:an analytic network process approach

    OpenAIRE

    Salehi, Mona; Keramati, Abbas

    2009-01-01

    This study proposes a framework to investigate website success factors and their relative importance in selecting the most preferred e-banking website. First, the updated DeLone and McLean IS success model is chosen to extract significant website success factors in the context of e-banking in Iran. Second, the updated DeLone and McLean IS success model is extended by applying an analytic network process (ANP) approach in order to investigate the relative importance of each factor and ...

  11. Updating expected action outcome in the medial frontal cortex involves an evaluation of error type.

    Science.gov (United States)

    Maier, Martin E; Steinhauser, Marco

    2013-10-02

    Forming expectations about the outcome of an action is an important prerequisite for action control and reinforcement learning in the human brain. The medial frontal cortex (MFC) has been shown to play an important role in the representation of outcome expectations, particularly when an update of expected outcome becomes necessary because an error is detected. However, error detection alone is not always sufficient to compute expected outcome because errors can occur in various ways and different types of errors may be associated with different outcomes. In the present study, we therefore investigate whether updating expected outcome in the human MFC is based on an evaluation of error type. Our approach was to consider an electrophysiological correlate of MFC activity on errors, the error-related negativity (Ne/ERN), in a task in which two types of errors could occur. Because the two error types were associated with different amounts of monetary loss, updating expected outcomes on error trials required an evaluation of error type. Our data revealed a pattern of Ne/ERN amplitudes that closely mirrored the amount of monetary loss associated with each error type, suggesting that outcome expectations are updated based on an evaluation of error type. We propose that this is achieved by a proactive evaluation process that anticipates error types by continuously monitoring error sources or by dynamically representing possible response-outcome relations.

  12. Updated folate data in the Dutch Food Composition Database and implications for intake estimates

    Directory of Open Access Journals (Sweden)

    Susanne Westenbrink

    2012-04-01

    Full Text Available Background and objective: Nutrient values are influenced by the analytical method used. Food folate measured by high performance liquid chromatography (HPLC) or by microbiological assay (MA) yields different results, with in general higher results from MA than from HPLC. This leads to the question of how to deal with different analytical methods in compiling standardised and internationally comparable food composition databases. A recent inventory on folate in European food composition databases indicated that currently MA is more widely used than HPLC. Since older Dutch values were produced by HPLC and newer values by MA, the analytical methods and procedures for compiling folate data in the Dutch Food Composition Database (NEVO) were reconsidered and folate values were updated. This article describes the impact of this revision of folate values in the NEVO database as well as the expected impact on the folate intake assessment in the Dutch National Food Consumption Survey (DNFCS). Design: The folate values were revised by replacing HPLC values with MA values from recent Dutch analyses. Previously, MA folate values taken from foreign food composition tables had been recalculated to the HPLC level, assuming a 27% lower value from HPLC analyses. These recalculated values were replaced by the original MA values. Dutch HPLC and MA values were compared to each other. Folate intake was assessed for a subgroup within the DNFCS to estimate the impact of the update. Results: In the updated NEVO database nearly all folate values were produced by MA or derived from MA values, which resulted in an average increase of 24%. The median habitual folate intake in young children increased by 11–15% using the updated folate values. Conclusion: The current approach for folate in NEVO resulted in more transparency in data production and documentation and higher comparability among European databases. Results of food consumption surveys are expected to show higher folate intakes.

  13. UPEML, Computer Independent Emulator of CDC Update Utility

    International Nuclear Information System (INIS)

    2002-01-01

    1 - Description of program or function: UPEML is a machine-portable CDC UPDATE emulation program. It is capable of emulating a significant subset of the standard CDC UPDATE functions, including program library creation and subsequent modification. 2 - Method of solution: UPEML was originally written to facilitate the use of CDC-based scientific packages on alternate computers. In addition to supporting computers such as the VAX/VMS, IBM, and CRAY/COS, Version 3.0 now supports UNIX workstations and the CRAY/UNICOS operating system. Several program bugs have been corrected in Version 3.0. Version 3.0 has several new features including 1) improved error checking, 2) the ability to use *ADDFILE and READ from nested files, 3) creation of compile file on creation, 4) allows identifiers to begin with numbers, and 5) ability to control warning messages and program termination on error conditions. 3 - Restrictions on the complexity of the problem: None noted

  14. Fertility Preservation for Patients With Cancer: American Society of Clinical Oncology Clinical Practice Guideline Update

    Science.gov (United States)

    Loren, Alison W.; Mangu, Pamela B.; Beck, Lindsay Nohr; Brennan, Lawrence; Magdalinski, Anthony J.; Partridge, Ann H.; Quinn, Gwendolyn; Wallace, W. Hamish; Oktay, Kutluk

    2013-01-01

    Purpose To update guidance for health care providers about fertility preservation for adults and children with cancer. Methods A systematic review of the literature published from March 2006 through January 2013 was completed using MEDLINE and the Cochrane Collaboration Library. An Update Panel reviewed the evidence and updated the recommendation language. Results There were 222 new publications that met inclusion criteria. A majority were observational studies, cohort studies, and case series or reports, with few randomized clinical trials. After review of the new evidence, the Update Panel concluded that no major, substantive revisions to the 2006 American Society of Clinical Oncology recommendations were warranted, but clarifications were added. Recommendations As part of education and informed consent before cancer therapy, health care providers (including medical oncologists, radiation oncologists, gynecologic oncologists, urologists, hematologists, pediatric oncologists, and surgeons) should address the possibility of infertility with patients treated during their reproductive years (or with parents or guardians of children) and be prepared to discuss fertility preservation options and/or to refer all potential patients to appropriate reproductive specialists. Although patients may be focused initially on their cancer diagnosis, the Update Panel encourages providers to advise patients regarding potential threats to fertility as early as possible in the treatment process so as to allow for the widest array of options for fertility preservation. The discussion should be documented. Sperm and embryo cryopreservation as well as oocyte cryopreservation are considered standard practice and are widely available. Other fertility preservation methods should be considered investigational and should be performed by providers with the necessary expertise. PMID:23715580

  15. A proposed analytic framework for determining the impact of an antimicrobial resistance intervention.

    Science.gov (United States)

    Grohn, Yrjo T; Carson, Carolee; Lanzas, Cristina; Pullum, Laura; Stanhope, Michael; Volkova, Victoriya

    2017-06-01

    Antimicrobial use (AMU) is increasingly threatened by antimicrobial resistance (AMR). The FDA is implementing risk mitigation measures promoting prudent AMU in food animals. Their evaluation is crucial: the AMU/AMR relationship is complex; a suitable framework to analyze interventions is unavailable. Systems science analysis, depicting variables and their associations, would help integrate mathematics/epidemiology to evaluate the relationship. This would identify informative data and models to evaluate interventions. This National Institute for Mathematical and Biological Synthesis AMR Working Group's report proposes a system framework to address the methodological gap linking livestock AMU and AMR in foodborne bacteria. It could evaluate how AMU (and interventions) impact AMR. We will evaluate pharmacokinetic/dynamic modeling techniques for projecting AMR selection pressure on enteric bacteria. We study two methods to model phenotypic AMR changes in bacteria in the food supply and evolutionary genotypic analyses determining molecular changes in phenotypic AMR. Systems science analysis integrates the methods, showing how resistance in the food supply is explained by AMU and concurrent factors influencing the whole system. This process is updated with data and techniques to improve prediction and inform improvements for AMU/AMR surveillance. Our proposed framework reflects both the AMR system's complexity, and desire for simple, reliable conclusions.

  16. Action plan for energy efficiency 2003-2006. A Working Group Proposal

    International Nuclear Information System (INIS)

    2003-02-01

    The updating of the Action Plan for Energy Efficiency is closely related to the need to further intensify measures for promoting energy conservation that was highlighted in the debate in Parliament on the National Climate Strategy and the building of a new nuclear power plant. The Working Group responsible for preparing the update has assessed the implementation and impact of the previous Action Plan for Energy Efficiency and sought to come up with new measures and ways of increasing the effect of the actions in the previous action plan. The main instruments presented in the updated action plan are developing new technologies, economic instruments, energy conservation agreements, laws and regulations, and information and training. The action plan comprises proposals for increasing the budget for energy subsidies for companies and bodies and finding new formulas for the funding of energy saving investments. Further, the aid for the renovation of buildings is proposed to be enhanced. More effort is also needed as regards disseminating information on energy saving. The development of new technologies requires that the funding from the National Technology Agency (Tekes) for energy efficiency is kept at least at the level of 1999. Implementation of the proposed measures would require a contribution from the state amounting to about EUR 80 million per year. The system of Energy Conservation Agreements is proposed to be further extended and developed. The agreements could, to a larger extent than before, cover research and product development processes and processes for purchasing goods and services. The Working Group proposes further examination of the possibility of imposing binding targets and applying sanctions. Energy taxation is proposed to be developed further in order to promote energy saving and co-generation, with the impact of the future Directive on emission allowance trading in mind. New research and development projects are

  17. Staged Optimization Design for Updating Urban Drainage Systems in a City of China

    Directory of Open Access Journals (Sweden)

    Kui Xu

    2018-01-01

    Full Text Available Flooding has been reported more often than in the past in most cities of China in recent years. In response, China’s State Council has urged the 36 largest cities to update the preparedness to handle the 50-year rainfall, which would be a massive project with large investments. We propose a staged optimization design for updating urban drainage that is not only a flexible option against environmental changes, but also an effective way to reduce the cost of the project. The staged cost optimization model involving the hydraulic model was developed in Fuzhou City, China. This model was established to minimize the total present costs, including intervention costs and flooding costs, with full consideration of the constraints of specific local conditions. The results show that considerable financial savings could be achieved by a staged design rather than the implement-once scheme. The model’s sensitivities to four data parameters were analyzed, including rainfall increase rate, flood unit cost, storage unit cost, and discount rate. The results confirm the applicability and robustness of the model for updating drainage systems to meet the requirements. The findings of this study may have important implications on urban flood management in the cities of developing countries with limited construction investments.

  18. Draft environmental assessment -- Test Area North pool stabilization project update

    International Nuclear Information System (INIS)

    1997-06-01

    The purpose of this Environmental Assessment (EA) is to update the "Test Area North Pool Stabilization Project" EA (DOE/EA-1050) and finding of no significant impact (FONSI) issued May 6, 1996. This update analyzes the environmental and health impacts of a drying process for the Three Mile Island (TMI) nuclear reactor core debris canisters now stored underwater in a facility on the Idaho National Engineering and Environmental Laboratory (INEEL). A drying process was analyzed in the predecision versions of the EA released in 1995, but that particular process was determined to be ineffective and dropped from the EA/FONSI issued May 6, 1996. The origin and nature of the TMI core debris and the proposed drying process are described and analyzed in detail in this EA. As did the 1996 EA, this update analyzes the environmental and health impacts of removing various radioactive materials from underwater storage, dewatering these materials, constructing a new interim dry storage facility, and transporting and placing the materials into the new facility. Also, as did the 1996 EA, this EA analyzes the removal, treatment and disposal of water from the pool, and placement of the facility into a safe, standby condition. The entire action would take place within the boundaries of the INEEL. The materials are currently stored underwater in the Test Area North (TAN) building 607 pool; the new interim dry storage facility would be constructed at the Idaho Chemical Processing Plant (ICPP), which is about 25 miles south of TAN

  19. Nonsynchronous updating in the multiverse of cellular automata.

    Science.gov (United States)

    Reia, Sandro M; Kinouchi, Osame

    2015-04-01

    In this paper we study updating effects on cellular automata rule space. We consider a subset of 6144 order-3 automata from the space of 262144 bidimensional outer-totalistic rules. We compare synchronous to asynchronous and sequential updatings. Focusing on two automata, we discuss how update changes destroy typical structures of these rules. Besides, we show that the first-order phase transition in the multiverse of synchronous cellular automata, revealed with the use of a recently introduced control parameter, seems to be robust not only to changes in update schema but also to different initial densities.

  20. Nonsynchronous updating in the multiverse of cellular automata

    Science.gov (United States)

    Reia, Sandro M.; Kinouchi, Osame

    2015-04-01

    In this paper we study updating effects on cellular automata rule space. We consider a subset of 6144 order-3 automata from the space of 262144 bidimensional outer-totalistic rules. We compare synchronous to asynchronous and sequential updatings. Focusing on two automata, we discuss how update changes destroy typical structures of these rules. Besides, we show that the first-order phase transition in the multiverse of synchronous cellular automata, revealed with the use of a recently introduced control parameter, seems to be robust not only to changes in update schema but also to different initial densities.
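
    To make the distinction between updating schemes concrete, here is a small sketch using an outer-totalistic example rule (Conway's Game of Life, chosen for familiarity rather than being one of the rules studied in the paper):

    import numpy as np

    def life_rule(grid, i, j):
        """Outer-totalistic example rule with periodic boundaries."""
        n, m = grid.shape
        s = sum(grid[(i + di) % n, (j + dj) % m]
                for di in (-1, 0, 1) for dj in (-1, 0, 1)) - grid[i, j]
        return 1 if (s == 3 or (grid[i, j] == 1 and s == 2)) else 0

    def step_synchronous(grid, rule):
        """All cells read the same old configuration and change simultaneously."""
        return np.array([[rule(grid, i, j) for j in range(grid.shape[1])]
                         for i in range(grid.shape[0])])

    def step_asynchronous(grid, rule, rng):
        """Cells update one at a time in random order, each seeing earlier updates."""
        cells = [(i, j) for i in range(grid.shape[0]) for j in range(grid.shape[1])]
        for i, j in rng.permutation(cells):
            grid[i, j] = rule(grid, i, j)
        return grid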

  1. New uses of sulfur - update

    Energy Technology Data Exchange (ETDEWEB)

    Almond, K.P.

    1995-07-01

    An update to an extensive bibliography on alternate uses of sulfur was presented. Alberta Sulphur Research Ltd. previously compiled a bibliography in volume 24 of this quarterly bulletin. This update provides an additional 44 new publications. The information regarding current research focuses on the use of sulfur in oil and gas applications, mining and metallurgy, concretes and other structural materials, waste management, rubber and textile products, asphalts and other paving and highway applications.

  2. Visual assessment of BIPV retrofit design proposals for selected historical buildings using the saliency map method

    Directory of Open Access Journals (Sweden)

    Ran Xu

    2015-06-01

    Full Text Available With the increasing awareness of energy efficiency, many old buildings have to undergo a massive facade energy retrofit. How to predict the visual impact that solar installations have on the aesthetic and cultural value of these buildings has been a heated debate in Switzerland (and throughout the world). The usual evaluation method for describing the visual impact of BIPV is based on semantic and qualitative descriptors, and is strongly dependent on personal preferences. The evaluation scale is therefore relative, flexible and imprecise. This paper proposes a new method to accurately measure the visual impact which BIPV installations have on a historical building by using the saliency map method. By imitating the working principles of the human eye, the method measures how much the BIPV design proposals differ from the original building facade in terms of attracting human visual attention. The result is presented directly in a quantitative manner, and can be used to compare the fitness of different BIPV design proposals. The measuring process is numeric, objective and more precise.
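
    A rough sketch of the kind of comparison described: compute a saliency map for renderings of the facade before and after the BIPV retrofit and measure how much the attention pattern shifts. Here OpenCV's spectral-residual saliency model (from opencv-contrib-python) stands in for the paper's saliency computation, and the normalised L1 score is an illustrative choice:

    import cv2
    import numpy as np

    def saliency_shift(facade_before, facade_after):
        """Return a scalar measure of how much visual attention is redistributed
        between the original facade image and the BIPV design proposal."""
        model = cv2.saliency.StaticSaliencySpectralResidual_create()
        ok1, sal_before = model.computeSaliency(facade_before)
        ok2, sal_after = model.computeSaliency(facade_after)
        assert ok1 and ok2
        # 0 means the two saliency maps are identical; larger means more impact
        return float(np.abs(sal_before - sal_after).mean())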

  3. Perforated peptic ulcer - an update.

    Science.gov (United States)

    Chung, Kin Tong; Shelat, Vishalkumar G

    2017-01-27

    Peptic ulcer disease (PUD) affects 4 million people worldwide annually. The incidence of PUD has been estimated at around 1.5% to 3%. Perforated peptic ulcer (PPU) is a serious complication of PUD and patients with PPU often present with acute abdomen that carries high risk for morbidity and mortality. The lifetime prevalence of perforation in patients with PUD is about 5%. PPU carries a mortality ranging from 1.3% to 20%. Thirty-day mortality rate reaching 20% and 90-d mortality rate of up to 30% have been reported. In this review we have summarized the current evidence on PPU to update readers. This literature review includes the most updated information such as common causes, clinical features, diagnostic methods, non-operative and operative management, post-operative complications and different scoring systems of PPU. With the advancement of medical technology, PUD can now be treated with medications instead of elective surgery. The classic triad of sudden onset of abdominal pain, tachycardia and abdominal rigidity is the hallmark of PPU. Erect chest radiograph may miss 15% of cases with air under the diaphragm in patients with bowel perforation. Early diagnosis, prompt resuscitation and urgent surgical intervention are essential to improve outcomes. Exploratory laparotomy and omental patch repair remains the gold standard. Laparoscopic surgery should be considered when expertise is available. Gastrectomy is recommended in patients with large or malignant ulcer.

  4. Perforated peptic ulcer - an update

    Science.gov (United States)

    Chung, Kin Tong; Shelat, Vishalkumar G

    2017-01-01

    Peptic ulcer disease (PUD) affects 4 million people worldwide annually. The incidence of PUD has been estimated at around 1.5% to 3%. Perforated peptic ulcer (PPU) is a serious complication of PUD and patients with PPU often present with acute abdomen that carries high risk for morbidity and mortality. The lifetime prevalence of perforation in patients with PUD is about 5%. PPU carries a mortality ranging from 1.3% to 20%. Thirty-day mortality rate reaching 20% and 90-d mortality rate of up to 30% have been reported. In this review we have summarized the current evidence on PPU to update readers. This literature review includes the most updated information such as common causes, clinical features, diagnostic methods, non-operative and operative management, post-operative complications and different scoring systems of PPU. With the advancement of medical technology, PUD can now be treated with medications instead of elective surgery. The classic triad of sudden onset of abdominal pain, tachycardia and abdominal rigidity is the hallmark of PPU. Erect chest radiograph may miss 15% of cases with air under the diaphragm in patients with bowel perforation. Early diagnosis, prompt resuscitation and urgent surgical intervention are essential to improve outcomes. Exploratory laparotomy and omental patch repair remains the gold standard. Laparoscopic surgery should be considered when expertise is available. Gastrectomy is recommended in patients with large or malignant ulcer. PMID:28138363

  5. News and Features Updates from USA.gov

    Data.gov (United States)

    General Services Administration — Stay on top of important government news and information with the USA.gov Updates: News and Features RSS feed. We'll update this feed when we add news and featured...

  6. [Social determinants of health and disability: updating the model for determination].

    Science.gov (United States)

    Tamayo, Mauro; Besoaín, Álvaro; Rebolledo, Jaime

    Social determinants of health (SDH) are the conditions in which people live. These conditions impact their lives, health status and level of social inclusion. In line with the conceptual and comprehensive progression of disability, it is important to update the SDH model because of its broad implications for implementing health interventions in society. This proposal supports incorporating disability into the model as a structural determinant, as it would lead to the same social inclusion/exclusion of people described for other structural SDH. This proposal encourages giving importance to designing and implementing public policies to improve societal conditions and contribute to social equity. This would be an act of reparation, justice and fulfilment of the Convention on the Rights of Persons with Disabilities. Copyright © 2017 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.

  7. Improving Semantic Updating Method on 3d City Models Using Hybrid Semantic-Geometric 3d Segmentation Technique

    Science.gov (United States)

    Sharkawi, K.-H.; Abdul-Rahman, A.

    2013-09-01

    to LoD4. The accuracy and structural complexity of the 3D objects increases with the LoD level, where LoD0 is the simplest LoD (2.5D; Digital Terrain Model (DTM) + building or roof print) while LoD4 is the most complex LoD (architectural details with interior structures). Semantic information is one of the main components in CityGML and 3D city models, and provides important information for any analyses. However, more often than not, the semantic information is not available for the 3D city model due to the unstandardized modelling process. One example is where a building is generated as one object (without specific feature layers such as Roof, Ground floor, Level 1, Level 2, Block A, Block B, etc.). This research attempts to develop a method to improve the semantic data updating process by segmenting the 3D building into simpler parts, which will make it easier for users to select and update the semantic information. The methodology is implemented for 3D buildings in LoD2, where the buildings are generated without architectural details but with distinct roof structures. This paper also introduces a hybrid semantic-geometric 3D segmentation method that deals with hierarchical segmentation of a 3D building based on its semantic value and surface characteristics, fitted by one of the predefined primitives. For future work, the segmentation method will be implemented as part of a change detection module that can detect any changes on the 3D buildings, store and retrieve semantic information of the changed structure, automatically update the 3D models and visualize the results in a user-friendly graphical user interface (GUI).

  8. A Preconditioning Technique for First-Order Primal-Dual Splitting Method in Convex Optimization

    Directory of Open Access Journals (Sweden)

    Meng Wen

    2017-01-01

    Full Text Available We introduce a preconditioning technique for the first-order primal-dual splitting method. The primal-dual splitting method offers a very general framework for solving a large class of optimization problems arising in image processing. The key idea of the preconditioning technique is that the constant iterative parameters are updated self-adaptively in the iteration process. We also give a simple and easy way to choose the diagonal preconditioners while the convergence of the iterative algorithm is maintained. The efficiency of the proposed method is demonstrated on an image denoising problem. Numerical results show that the preconditioned iterative algorithm performs better than the original one.
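
    A sketch of self-adaptive diagonal step sizes and the resulting preconditioned primal-dual iteration, following the familiar Pock-Chambolle style choice (the paper's exact update rules may differ); K is a scipy sparse linear operator and prox_G, prox_Fstar are the proximal maps of the two objective terms:

    import numpy as np

    def diagonal_preconditioners(K, alpha=1.0):
        """Diagonal step sizes tau_j = 1 / sum_i |K_ij|^(2-alpha) and
        sigma_i = 1 / sum_j |K_ij|^alpha for a scipy sparse matrix K."""
        A = abs(K)
        sigma = 1.0 / np.maximum(np.asarray(A.power(alpha).sum(axis=1)).ravel(), 1e-12)
        tau = 1.0 / np.maximum(np.asarray(A.power(2 - alpha).sum(axis=0)).ravel(), 1e-12)
        return tau, sigma

    def preconditioned_pdhg(K, prox_G, prox_Fstar, x0, y0, n_iter=200):
        """Generic preconditioned primal-dual iteration with diagonal steps."""
        tau, sigma = diagonal_preconditioners(K)
        x, y = x0.copy(), y0.copy()
        for _ in range(n_iter):
            x_new = prox_G(x - tau * (K.T @ y), tau)                  # primal step
            y = prox_Fstar(y + sigma * (K @ (2 * x_new - x)), sigma)  # dual step
            x = x_new
        return x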

  9. Rebuild America partner update, November--December 1998

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-11-01

    This issue of the update includes articles on retrofitting Duke University facilities, energy efficiency updates to buildings in Portland, Oregon, Salisbury, North Carolina, Hawaii, Roanoke-Chowan, Virginia, and energy savings centered designs for lighting systems.

  10. Quasi-Newton methods for the acceleration of multi-physics codes

    CSIR Research Space (South Africa)

    Haelterman, R

    2017-08-01


  11. Co-operation and Phase Behavior under the Mixed Updating Rules

    International Nuclear Information System (INIS)

    Zhang Wen; Li Yao-Sheng; Xu Chen

    2015-01-01

    We present a model considering two updating rules when agents play the prisoner's dilemma on a square lattice. Agents can update their strategies by imitating one of their neighbors with a higher payoff under the imitation updating rule, or be directly replaced by one of their neighbors under the death-birth updating rule. The frequency of co-operation is related to the probability q of applying the imitation updating or the death-birth updating and to the game parameter b. The death-birth updating rule favors co-operation while the imitation updating rule favors defection on the lattice, although both rules suppress co-operation in the well-mixed population. Therefore a totally co-operative state may emerge when the death-birth updating is involved in the evolution and b is relatively small. We also obtain a phase diagram on the q-b plane. There are three phases on the plane: two pure phases, a totally co-operative state and a totally defective state, and a mixing phase of mixed strategies. Based on the pair approximation, we theoretically analyze the phase behavior and obtain quantitative agreement with the simulation results. (paper)
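
    The mixed updating process can be illustrated with a simplified Monte Carlo sketch: with probability q a site imitates a higher-payoff neighbour, otherwise it adopts a neighbour's strategy under the death-birth rule. The deterministic imitation step, the uniform replacement, the weak-dilemma payoffs and the lattice size are illustrative simplifications of the model described in the record, not its exact transition probabilities.

      import numpy as np

      NEIGH = ((1, 0), (-1, 0), (0, 1), (0, -1))

      def site_payoff(strats, i, j, b, L):
          # Weak prisoner's dilemma payoffs: C-C pays 1, D-C pays b, otherwise 0.
          s = strats[i, j]                        # 1 = co-operate, 0 = defect
          total = 0.0
          for di, dj in NEIGH:
              nbr = strats[(i + di) % L, (j + dj) % L]
              total += nbr if s == 1 else b * nbr
          return total

      def mixed_update(L=50, b=1.05, q=0.5, mc_steps=200, seed=0):
          rng = np.random.default_rng(seed)
          strats = rng.integers(0, 2, size=(L, L))
          for _ in range(mc_steps):
              for _ in range(L * L):              # one Monte Carlo step = L*L elementary updates
                  i, j = rng.integers(0, L, size=2)
                  di, dj = NEIGH[rng.integers(0, 4)]
                  ni, nj = (i + di) % L, (j + dj) % L
                  if rng.random() < q:
                      # Imitation rule (simplified): copy the neighbour only if it earns more.
                      if site_payoff(strats, ni, nj, b, L) > site_payoff(strats, i, j, b, L):
                          strats[i, j] = strats[ni, nj]
                  else:
                      # Death-birth rule (simplified): the focal site adopts the neighbour's strategy.
                      strats[i, j] = strats[ni, nj]
          return strats.mean()                    # frequency of co-operation

      if __name__ == "__main__":
          for q in (0.0, 0.5, 1.0):
              print(f"q = {q:.1f}  co-operation frequency = {mixed_update(q=q):.3f}")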

  12. A method proposal for cumulative environmental impact assessment based on the landscape vulnerability evaluation

    International Nuclear Information System (INIS)

    Pavlickova, Katarina; Vyskupova, Monika

    2015-01-01

    Cumulative environmental impact assessment is only occasionally used in the practical application of the environmental impact assessment process. The main reasons are the difficulty of cumulative impact identification caused by lack of data, the inability to measure the intensity and spatial effect of all types of impacts, and the uncertainty of their future evolution. This work presents a method proposal to predict cumulative impacts on the basis of landscape vulnerability evaluation. For this purpose, a qualitative assessment of landscape ecological stability is conducted and major vulnerability indicators of environmental and socio-economic receptors are specified and valuated. Potential cumulative impacts and the overall impact significance are predicted quantitatively in modified Argonne multiple matrices while considering the vulnerability of affected landscape receptors and the significance of impacts identified individually. The method was employed in a concrete environmental impact assessment process conducted in Slovakia. The results obtained in this case study show that this methodology is simple to apply, valid for all types of impacts and projects, inexpensive and not time-consuming. The objectivity of the partial methods used in this procedure is improved by quantitative landscape ecological stability evaluation, assignment of weights to vulnerability indicators based on the detailed characteristics of affected factors, and grading of impact significance. - Highlights: • This paper suggests a method proposal for cumulative impact prediction. • The method includes landscape vulnerability evaluation. • The vulnerability of affected receptors is determined by their sensitivity. • This method can increase the objectivity of impact prediction in the EIA process

  13. Federal Education Update, December 2004. Commission Update 04-17.

    Science.gov (United States)

    California Postsecondary Education Commission, 2004

    2004-01-01

    This update presents some of the major issues affecting education occurring at the national level. These include: Higher Education Act Extended for One Year; New Law Increases Loan Forgiveness for Teachers; Domestic Appropriations Measures Completed; Change in Federal Student Aid Rules; Bush Advisor Nominated To Be Education Secretary In Second…

  14. A proposed assessment method for image of regional educational institutions

    Directory of Open Access Journals (Sweden)

    Kataeva Natalya

    2017-01-01

    Full Text Available The market of educational services in the current Russian economic conditions comprises a huge variety of educational institutions. This market is already experiencing a significant influence from the demographic situation in Russia, which means that higher education institutions are forced into tough competition for high school students. Increased competition in the educational market forces universities to find new methods of non-price competition in attracting potential students and throughout their own educational and economic activities. The commercialization of education places universities on the same plane as commercial companies, which regard a positive perception of image and reputation as a competitive advantage; this is quite applicable to the strategic and current activities of higher education institutions to ensure the competitiveness of educational services and of the educational institution as a whole. Nevertheless, due to the lack of evidence-based proposals in this area, there is a need for scientific research to justify the organizational and methodological aspects of using image as a factor in the competitiveness of a higher education institution. In theory and in practice there are different methods and ways of evaluating a company's image. The article provides a comparative assessment of existing methods for evaluating corporate image and the author's method for estimating the image of higher education institutions based on the key influencing factors. The method has been tested on the Vyatka State Agricultural Academy (Russia). The results also indicate the strengths and weaknesses of the institution, highlight ways of improving, and help adjust the efforts for image improvement.

  15. Communication technology update and fundamentals

    CERN Document Server

    Grant, August E

    2010-01-01

    New communication technologies are being introduced at an astonishing rate. Making sense of these technologies is increasingly difficult. Communication Technology Update and Fundamentals is the single best source for the latest developments, trends, and issues in communication technology. Featuring the fundamental framework along with the history and background of communication technologies, Communication Technology Update and Fundamentals, 12th edition helps you stay ahead of these ever-changing and emerging technologies. As always, every chapter has been completely rewritten to reflect the latest developments.

  16. Communication technology update and fundamentals

    CERN Document Server

    Grant, August E

    2008-01-01

    New communication technologies are being introduced at an astonishing rate. Making sense of these technologies is increasingly difficult. Communication Technology Update is the single best source for the latest developments, trends, and issues in communication technology. Now in its 11th edition, Communication Technology Update has become an indispensable information resource for business, government, and academia. As always, every chapter has been completely rewritten to reflect the latest developments and market statistics, and now covers mobile computing, dig

  17. Updated safety analysis of ITER

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, Neill, E-mail: neill.taylor@iter.org [ITER Organization, CS 90 046, 13067 St Paul Lez Durance Cedex (France); Baker, Dennis; Ciattaglia, Sergio; Cortes, Pierre; Elbez-Uzan, Joelle; Iseli, Markus; Reyes, Susana; Rodriguez-Rodrigo, Lina; Rosanvallon, Sandrine; Topilski, Leonid [ITER Organization, CS 90 046, 13067 St Paul Lez Durance Cedex (France)

    2011-10-15

    An updated version of the ITER Preliminary Safety Report has been produced and submitted to the licensing authorities. It is revised and expanded in response to requests from the authorities after their review of an earlier version in 2008, to reflect enhancements in ITER safety provisions through design changes, to incorporate new and improved safety analyses and to take into account other ITER design evolution. The updated analyses show that changes to the Tokamak cooling water system design have enhanced confinement and reduced potential radiological releases as well as removing decay heat with very high reliability. New and updated accident scenario analyses, together with fire and explosion risk analyses, have shown that design provisions are sufficient to minimize the likelihood of accidents and reduce potential consequences to a very low level. Taken together, the improvements provided a stronger demonstration of the very good safety performance of the ITER design.

  18. Updated safety analysis of ITER

    International Nuclear Information System (INIS)

    Taylor, Neill; Baker, Dennis; Ciattaglia, Sergio; Cortes, Pierre; Elbez-Uzan, Joelle; Iseli, Markus; Reyes, Susana; Rodriguez-Rodrigo, Lina; Rosanvallon, Sandrine; Topilski, Leonid

    2011-01-01

    An updated version of the ITER Preliminary Safety Report has been produced and submitted to the licensing authorities. It is revised and expanded in response to requests from the authorities after their review of an earlier version in 2008, to reflect enhancements in ITER safety provisions through design changes, to incorporate new and improved safety analyses and to take into account other ITER design evolution. The updated analyses show that changes to the Tokamak cooling water system design have enhanced confinement and reduced potential radiological releases as well as removing decay heat with very high reliability. New and updated accident scenario analyses, together with fire and explosion risk analyses, have shown that design provisions are sufficient to minimize the likelihood of accidents and reduce potential consequences to a very low level. Taken together, the improvements provided a stronger demonstration of the very good safety performance of the ITER design.

  19. Updating Dosimetry for Emergency Response Dose Projections.

    Science.gov (United States)

    DeCair, Sara

    2016-02-01

    In 2013, the U.S. Environmental Protection Agency (EPA) proposed an update to the 1992 Protective Action Guides (PAG) Manual. The PAG Manual provides guidance to state and local officials planning for radiological emergencies. EPA requested public comment on the proposed revisions, while making them available for interim use by officials faced with an emergency situation. Developed with interagency partners, EPA's proposal incorporates newer dosimetric methods, identifies tools and guidelines developed since the current document was issued, and extends the scope of the PAGs to all significant radiological incidents, including radiological dispersal devices or improvised nuclear devices. In order to best serve the emergency management community, scientific policy direction had to be set on how to use International Commission on Radiological Protection Publication 60 age groups in dose assessment when implementing emergency guidelines. Certain guidelines that lend themselves to different PAGs for different subpopulations are the PAGs for potassium iodide (KI), food, and water. These guidelines provide age-specific recommendations because of the radiosensitivity of the thyroid and young children with respect to ingestion and inhalation doses in particular. Taking protective actions like using KI, avoiding certain foods or using alternative sources of drinking water can be relatively simple to implement by the parents of young children. Clear public messages can convey which age groups should take which action, unlike how an evacuation or relocation order should apply to entire households or neighborhoods. New in the PAG Manual is planning guidance for the late phase of an incident, after the situation is stabilized and efforts turn toward recovery. Because the late phase can take years to complete, decision makers are faced with managing public exposures in areas not fully remediated. The proposal includes quick-reference operational guidelines to inform re-entry to

  20. Finite element model updating of natural fibre reinforced composite structure in structural dynamics

    Directory of Open Access Journals (Sweden)

    Sani M.S.M.

    2016-01-01

    Full Text Available Model updating is a process of adjusting certain parameters of a finite element model in order to reduce the discrepancy between the analytical predictions of the finite element (FE) model and experimental results. Finite element model updating is considered an important field of study, as practical application of the finite element method often shows discrepancies with test results. The aim of this research is to perform a model updating procedure on a composite structure and to improve the presumed geometrical and material properties of the tested composite structure in the finite element prediction. The composite structure concerned in this study is a plate of kenaf fibre reinforced with epoxy. Modal properties (natural frequencies, mode shapes, and damping ratios) of the kenaf fibre structure will be determined using both experimental modal analysis (EMA) and finite element analysis (FEA). In EMA, modal testing will be carried out using an impact hammer test, while normal mode analysis in FEA will be carried out using MSC Nastran/Patran software. Correlation of the data will be carried out before optimizing the data from FEA. Several parameters will be considered and selected for the model updating procedure.

  1. Food irradiation. An update of legal and analytical aspects

    International Nuclear Information System (INIS)

    Masotti, P.; Zonta, F.

    1999-01-01

    A new European directive concerning ionising radiation treatment of foodstuffs has recently been adopted, although national laws may continue to be applied at least until 31 December 2000. A brief updated review dealing with the legal and analytical aspects of food irradiation is presented. The legal status of the food irradiation issue presently in force in Italy, in the European Union and in the USA is discussed. Some of the most widely used and reliable analytical methods for detecting irradiated foodstuffs, with special reference to the standardised methods of the European Committee for Standardization, are listed.

  2. High-efficiency wavefunction updates for large scale Quantum Monte Carlo

    Science.gov (United States)

    Kent, Paul; McDaniel, Tyler; Li, Ying Wai; D'Azevedo, Ed

    Within ab initio Quantum Monte Carlo (QMC) simulations, the leading numerical cost for large systems is the computation of the values of the Slater determinants in the trial wavefunctions. The evaluation of each Monte Carlo move requires finding the determinant of a dense matrix, which is traditionally iteratively evaluated using a rank-1 Sherman-Morrison updating scheme to avoid repeated explicit calculation of the inverse. For calculations with thousands of electrons, this operation dominates the execution profile. We propose a novel rank-k delayed update scheme. This strategy enables probability evaluation for multiple successive Monte Carlo moves, with application of accepted moves to the matrices delayed until after a predetermined number of moves, k. Accepted events grouped in this manner are then applied to the matrices en bloc with enhanced arithmetic intensity and computational efficiency. This procedure does not change the underlying Monte Carlo sampling or the sampling efficiency. For large systems and algorithms such as diffusion Monte Carlo where the acceptance ratio is high, order-of-magnitude speedups can be obtained on both multi-core CPUs and GPUs, making this algorithm highly advantageous for current petascale and future exascale computations.
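
    The rank-1 Sherman-Morrison step that the proposed rank-k delayed scheme batches together can be sketched as follows. This is a minimal illustration with a random matrix standing in for the Slater matrix; the delayed (rank-k) accumulation itself is not shown.

      import numpy as np

      def det_ratio(A_inv, new_row, r):
          # Ratio det(A') / det(A) when row r of A is replaced by new_row,
          # obtained in O(N) from the current inverse.
          return new_row @ A_inv[:, r]

      def sherman_morrison_row_update(A_inv, new_row, r):
          # O(N^2) update of the inverse after the row replacement is accepted.
          ratio = det_ratio(A_inv, new_row, r)
          w = new_row @ A_inv          # v^T A^{-1} (row vector)
          w[r] -= 1.0                  # subtract e_r^T, i.e. the old row's contribution
          return A_inv - np.outer(A_inv[:, r], w) / ratio

      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          N, r = 6, 2
          A = rng.standard_normal((N, N))
          A_inv = np.linalg.inv(A)
          v = rng.standard_normal(N)
          A_new = A.copy()
          A_new[r] = v
          assert np.isclose(det_ratio(A_inv, v, r), np.linalg.det(A_new) / np.linalg.det(A))
          assert np.allclose(sherman_morrison_row_update(A_inv, v, r), np.linalg.inv(A_new))
          print("Sherman-Morrison row update verified")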

  3. Estimation of effective brain connectivity with dual Kalman filter and EEG source localization methods.

    Science.gov (United States)

    Rajabioun, Mehdi; Nasrabadi, Ali Motie; Shamsollahi, Mohammad Bagher

    2017-09-01

    Effective connectivity is one of the most important considerations in brain functional mapping via EEG. It describes the effect of a particular active brain region on others. In this paper, a new method based on the dual Kalman filter is proposed. In this method, a source localization method (standardized low-resolution brain electromagnetic tomography) is first applied to the EEG signal to extract the active regions, and an appropriate temporal model (a multivariate autoregressive model) is fitted to the extracted active sources to evaluate their activity and the time dependence between them. Then, the dual Kalman filter is used to estimate the model parameters, i.e. the effective connectivity between active regions. The advantage of this method is that the activity of different brain parts is estimated simultaneously with the calculation of effective connectivity between active regions. By combining the dual Kalman filter with brain source localization methods, the source activity is updated over time in addition to the connectivity estimation between parts. The performance of the proposed method was evaluated first by applying it to simulated EEG signals with simulated interacting connectivity between active parts. Noisy simulated signals with different signal-to-noise ratios were used to evaluate the method's sensitivity to noise and to compare its performance with other methods. The method was then applied to real signals and the estimation error over a sweeping window was calculated. In both simulated and real conditions, the proposed method gives acceptable results with the lowest mean square error.
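
    The parameter-tracking step can be illustrated with a single (rather than dual) Kalman filter that follows time-varying multivariate autoregressive (MVAR) coefficients for two simulated source signals; the random-walk state model, the noise variances and the MVAR order are illustrative assumptions, and the source localization stage is omitted.

      import numpy as np

      def kalman_tvar(data, p=2, q_var=1e-4, r_var=1e-2):
          # data: (n_channels, n_samples). Track time-varying MVAR(p) coefficients
          # under a random-walk state model theta_t = theta_{t-1} + w_t.
          n, T = data.shape
          d = n * n * p
          theta = np.zeros(d)                    # stacked AR coefficients (state estimate)
          P = np.eye(d)                          # state covariance
          Q = q_var * np.eye(d)                  # process noise: how fast coefficients may drift
          R = r_var * np.eye(n)                  # observation noise
          history = []
          for t in range(p, T):
              past = data[:, t - p:t][:, ::-1].ravel()   # [x_{t-1}, ..., x_{t-p}] per channel
              H = np.kron(np.eye(n), past)               # observation model y_t = H @ theta + v_t
              P = P + Q                                  # predict (random walk leaves theta unchanged)
              y = data[:, t]
              S = H @ P @ H.T + R
              K = P @ H.T @ np.linalg.inv(S)
              theta = theta + K @ (y - H @ theta)        # update with the new sample
              P = (np.eye(d) - K @ H) @ P
              history.append(theta.copy())
          return np.array(history)               # coefficient trajectories, shape (T - p, n*n*p)

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          T = 500
          x = np.zeros((2, T))
          for t in range(2, T):                  # fixed MVAR system with a coupling from source 1 to 2
              x[0, t] = 0.5 * x[0, t - 1] + rng.normal(scale=0.1)
              x[1, t] = 0.4 * x[1, t - 1] + 0.3 * x[0, t - 1] + rng.normal(scale=0.1)
          print(np.round(kalman_tvar(x)[-1], 2))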

  4. Run-time Phenomena in Dynamic Software Updating: Causes and Effects

    DEFF Research Database (Denmark)

    Gregersen, Allan Raundahl; Jørgensen, Bo Nørregaard

    2011-01-01

    The development of a dynamic software updating system for statically-typed object-oriented programming languages has turned out to be a challenging task. Despite the fact that the present state of the art in dynamic updating systems, like JRebel, Dynamic Code Evolution VM, JVolve and Javeleon, all provide very transparent and flexible technical solutions to dynamic updating, case studies have shown that designing dynamically updatable applications still remains a challenging task. This challenge has its roots in a number of run-time phenomena that are inherent to dynamic updating of applications written in statically-typed object-oriented programming languages. In this paper, we present our experience from developing dynamically updatable applications using a state-of-the-art dynamic updating system for Java. We believe that the findings presented in this paper provide an important step towards...

  5. Proposed waste form performance criteria and testing methods for low-level mixed waste

    International Nuclear Information System (INIS)

    Franz, E.M.; Fuhrmann, M.; Bowerman, B.

    1995-01-01

    Proposed waste form performance criteria and testing methods were developed as guidance in judging the suitability of solidified waste as a physico-chemical barrier to releases of radionuclides and RCRA regulated hazardous components. The criteria follow from the assumption that release of contaminants by leaching is the single most important property for judging the effectiveness of a waste form. A two-tier regimen is proposed. The first tier consists of a leach test designed to determine the net, forward leach rate of the solidified waste and a leach test required by the Environmental Protection Agency (EPA). The second tier of tests is to determine if a set of stresses (i.e., radiation, freeze-thaw, wet-dry cycling) on the waste form adversely impacts its ability to retain contaminants and remain physically intact. In the absence of site-specific performance assessments (PA), two generic modeling exercises are described which were used to calculate proposed acceptable leachates

  6. 34 CFR 668.55 - Updating information.

    Science.gov (United States)

    2010-07-01

    Title 34, Education, § 668.55 Updating information. (a)(1) Unless the provisions of paragraph (a)(2) or (a)(3) of this... applicant to verify the information contained in his or her application for assistance in an award year if...

  7. Heterogeneous Data Fusion Method to Estimate Travel Time Distributions in Congested Road Networks.

    Science.gov (United States)

    Shi, Chaoyang; Chen, Bi Yu; Lam, William H K; Li, Qingquan

    2017-12-06

    Travel times in congested urban road networks are highly stochastic. Provision of travel time distribution information, including both mean and variance, can be very useful for travelers to make reliable path choice decisions to ensure higher probability of on-time arrival. To this end, a heterogeneous data fusion method is proposed to estimate travel time distributions by fusing heterogeneous data from point and interval detectors. In the proposed method, link travel time distributions are first estimated from point detector observations. The travel time distributions of links without point detectors are imputed based on their spatial correlations with links that have point detectors. The estimated link travel time distributions are then fused with path travel time distributions obtained from the interval detectors using Dempster-Shafer evidence theory. Based on fused path travel time distribution, an optimization technique is further introduced to update link travel time distributions and their spatial correlations. A case study was performed using real-world data from Hong Kong and showed that the proposed method obtained accurate and robust estimations of link and path travel time distributions in congested road networks.
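
    The evidence-combination step can be illustrated with Dempster's rule applied to two basic probability assignments over a discretized frame of path travel-time intervals, one notionally derived from summed point-detector link estimates and one from interval detectors. The intervals and mass values below are purely illustrative, not data from the study.

      from itertools import product

      def dempster_combine(m1, m2):
          # Dempster's rule of combination; focal elements are frozensets, masses sum to 1.
          combined = {}
          conflict = 0.0
          for (a, ma), (b, mb) in product(m1.items(), m2.items()):
              inter = a & b
              if inter:
                  combined[inter] = combined.get(inter, 0.0) + ma * mb
              else:
                  conflict += ma * mb
          if conflict >= 1.0:
              raise ValueError("total conflict: the sources cannot be combined")
          return {k: v / (1.0 - conflict) for k, v in combined.items()}

      if __name__ == "__main__":
          short, mid, long_ = "5-6 min", "6-7 min", "7-8 min"   # frame of discernment
          m_point = {frozenset([short]): 0.2,                   # evidence from summed link estimates
                     frozenset([mid]): 0.5,
                     frozenset([short, mid, long_]): 0.3}       # residual ignorance
          m_interval = {frozenset([mid]): 0.6,                  # evidence from interval detectors
                        frozenset([mid, long_]): 0.3,
                        frozenset([short, mid, long_]): 0.1}
          fused = dempster_combine(m_point, m_interval)
          for focal, mass in sorted(fused.items(), key=lambda kv: -kv[1]):
              print(sorted(focal), round(mass, 3))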

  8. Creep-fatigue evaluation method for weld joint of Mod.9Cr-1Mo steel Part II: Plate bending test and proposal of a simplified evaluation method

    Energy Technology Data Exchange (ETDEWEB)

    Ando, Masanori, E-mail: ando.masanori@jaea.go.jp; Takaya, Shigeru, E-mail: takaya.shigeru@jaea.go.jp

    2016-12-15

    Highlights: • Creep-fatigue evaluation method for weld joint of Mod.9Cr-1Mo steel is proposed. • A simplified evaluation method is also proposed for the codification. • Both proposed evaluation method was validated by the plate bending test. • For codification, the local stress and strain behavior was analyzed. - Abstract: In the present study, to develop an evaluation procedure and design rules for Mod.9Cr-1Mo steel weld joints, a method for evaluating the creep-fatigue life of Mod.9Cr-1Mo steel weld joints was proposed based on finite element analysis (FEA) and a series of cyclic plate bending tests of longitudinal and horizontal seamed plates. The strain concentration and redistribution behaviors were evaluated and the failure cycles were estimated using FEA by considering the test conditions and metallurgical discontinuities in the weld joints. Inelastic FEA models consisting of the base metal, heat-affected zone and weld metal were employed to estimate the elastic follow-up behavior caused by the metallurgical discontinuities. The elastic follow-up factors determined by comparing the elastic and inelastic FEA results were determined to be less than 1.5. Based on the estimated elastic follow-up factors obtained via inelastic FEA, a simplified technique using elastic FEA was proposed for evaluating the creep-fatigue life in Mod.9Cr-1Mo steel weld joints. The creep-fatigue life obtained using the plate bending test was compared to those estimated from the results of inelastic FEA and by a simplified evaluation method.

  9. Creep-fatigue evaluation method for weld joint of Mod.9Cr-1Mo steel Part II: Plate bending test and proposal of a simplified evaluation method

    International Nuclear Information System (INIS)

    Ando, Masanori; Takaya, Shigeru

    2016-01-01

    Highlights: • Creep-fatigue evaluation method for weld joint of Mod.9Cr-1Mo steel is proposed. • A simplified evaluation method is also proposed for the codification. • Both proposed evaluation method was validated by the plate bending test. • For codification, the local stress and strain behavior was analyzed. - Abstract: In the present study, to develop an evaluation procedure and design rules for Mod.9Cr-1Mo steel weld joints, a method for evaluating the creep-fatigue life of Mod.9Cr-1Mo steel weld joints was proposed based on finite element analysis (FEA) and a series of cyclic plate bending tests of longitudinal and horizontal seamed plates. The strain concentration and redistribution behaviors were evaluated and the failure cycles were estimated using FEA by considering the test conditions and metallurgical discontinuities in the weld joints. Inelastic FEA models consisting of the base metal, heat-affected zone and weld metal were employed to estimate the elastic follow-up behavior caused by the metallurgical discontinuities. The elastic follow-up factors determined by comparing the elastic and inelastic FEA results were determined to be less than 1.5. Based on the estimated elastic follow-up factors obtained via inelastic FEA, a simplified technique using elastic FEA was proposed for evaluating the creep-fatigue life in Mod.9Cr-1Mo steel weld joints. The creep-fatigue life obtained using the plate bending test was compared to those estimated from the results of inelastic FEA and by a simplified evaluation method.

  10. AN INVESTIGATION OF AUTOMATIC CHANGE DETECTION FOR TOPOGRAPHIC MAP UPDATING

    Directory of Open Access Journals (Sweden)

    P. Duncan

    2012-08-01

    Full Text Available Changes to the landscape are constantly occurring and it is essential for geospatial and mapping organisations that these changes are regularly detected and captured, so that map databases can be updated to reflect the current status of the landscape. The Chief Directorate of National Geospatial Information (CD: NGI), South Africa's national mapping agency, currently relies on manual methods of detecting and capturing these changes. These manual methods are time consuming and labour intensive, and rely on the skills and interpretation of the operator. It is therefore necessary to move towards more automated methods in the production process at CD: NGI. The aim of this research is to investigate a methodology for automatic or semi-automatic change detection for the purpose of updating topographic databases. The method investigated for detecting changes is through image classification as well as spatial analysis, and is focussed on urban landscapes. The major data inputs into this study are high resolution aerial imagery and existing topographic vector data. Initial results indicate that traditional pixel-based image classification approaches are unsatisfactory for large scale land-use mapping and that object-oriented approaches hold more promise. Even with object-oriented image classification, broad-scale generalization of techniques has produced inconsistent results. A solution may lie with a hybrid approach of pixel and object-oriented techniques.

  11. Towards Dynamic Updates in Service Composition

    Directory of Open Access Journals (Sweden)

    Mario Bravetti

    2015-12-01

    Full Text Available We survey our results about verification of adaptable processes. We present adaptable processes as a way of overcoming the limitations that process calculi have for describing patterns of dynamic process evolution. Such patterns rely on direct ways of controlling the behavior and location of running processes, and so they are at the heart of the adaptation capabilities present in many modern concurrent systems. Adaptable processes have named scopes and are sensitive to actions of dynamic update at runtime; this allows us to express dynamic and static topologies of adaptable processes as well as different evolvability patterns for concurrent processes. We introduce a core calculus of adaptable processes and consider verification problems for them: first based on specific properties related to error occurrence, that we call bounded and eventual adaptation, and then by considering a simple yet expressive temporal logic over adaptable processes. We provide (un)decidability results for such verification problems over adaptable processes considering the spectrum of topologies/evolvability patterns introduced. We then consider distributed adaptability, where a process can update part of a protocol by performing dynamic distributed updates over a set of protocol participants. Dynamic updates in this context are presented as an extension of our work on choreographies and behavioural contracts in multiparty interactions. We show how update mechanisms considered for adaptable processes can be used to extend the theory of choreography and orchestration/contracts, allowing them to be modified at run-time by internal (self-adaptation) or external intervention.

  12. Finite element model updating of a prestressed concrete box girder bridge using subproblem approximation

    Science.gov (United States)

    Chen, G. W.; Omenzetter, P.

    2016-04-01

    This paper presents the implementation of an updating procedure for the finite element model (FEM) of a prestressed concrete continuous box-girder highway off-ramp bridge. Ambient vibration testing was conducted to excite the bridge, assisted by linear chirp sweepings induced by two small electrodynamic shakers deployed to enhance the excitation levels, since the bridge was closed to traffic. The data-driven stochastic subspace identification method was executed to recover the modal properties from measurement data. An initial FEM was developed and correlation between the experimental modal results and their analytical counterparts was studied. Modelling of the pier and abutment bearings was carefully adjusted to reflect the real operational conditions of the bridge. The subproblem approximation method was subsequently utilized to automatically update the FEM. For this purpose, the influences of bearing stiffness, and mass density and Young's modulus of materials were examined as uncertain parameters using sensitivity analysis. The updating objective function was defined based on a summation of squared values of relative errors of natural frequencies between the FEM and experimentation. All the identified modes were used as the target responses with the purpose of imposing more constraints on the optimization process and decreasing the number of potentially feasible combinations for parameter changes. The updated FEM of the bridge was able to produce sufficient improvements in natural frequencies in most modes of interest, and can serve for a more precise dynamic response prediction or future investigation of the bridge health.
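
    The structure of the updating loop can be illustrated on a toy problem: a 2-DOF spring-mass model whose stiffnesses are adjusted so that its natural frequencies match "measured" values by minimizing a summation of squared relative frequency errors, the same form of objective described above. The toy model, the synthetic targets and the Nelder-Mead optimizer are illustrative assumptions, not the bridge FEM or the subproblem approximation algorithm.

      import numpy as np
      from scipy.optimize import minimize

      def natural_frequencies(k1, k2, m1=1.0, m2=1.0):
          # 2-DOF spring-mass chain: ground -k1- m1 -k2- m2.
          # Frequencies (Hz) from the generalized eigenproblem K*phi = omega^2 * M*phi.
          K = np.array([[k1 + k2, -k2], [-k2, k2]])
          M = np.diag([m1, m2])
          eigvals = np.linalg.eigvals(np.linalg.solve(M, K))
          return np.sqrt(np.sort(eigvals.real)) / (2.0 * np.pi)

      def objective(params, measured):
          # Summation of squared relative errors of natural frequencies (model vs. test).
          if np.any(params <= 0.0):
              return 1e9                                   # keep the optimizer in the physical range
          freqs = natural_frequencies(*params)
          return float(np.sum(((freqs - measured) / measured) ** 2))

      if __name__ == "__main__":
          measured = natural_frequencies(1200.0, 800.0)    # synthetic "measured" frequencies
          start = np.array([1000.0, 1000.0])               # initial analytical parameter guesses
          result = minimize(objective, start, args=(measured,), method="Nelder-Mead")
          print("updated stiffnesses:", np.round(result.x, 1))
          print("residual objective:", objective(result.x, measured))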

  13. Off-Highway Gasoline Consumption Estimation Models Used in the Federal Highway Administration Attribution Process: 2008 Updates

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Ho-Ling [ORNL; Davis, Stacy Cagle [ORNL

    2009-12-01

    This report is designed to document the analysis process and estimation models currently used by the Federal Highway Administration (FHWA) to estimate the off-highway gasoline consumption and public sector fuel consumption. An overview of the entire FHWA attribution process is provided along with specifics related to the latest update (2008) on the Off-Highway Gasoline Use Model and the Public Use of Gasoline Model. The Off-Highway Gasoline Use Model is made up of five individual modules, one for each of the off-highway categories: agricultural, industrial and commercial, construction, aviation, and marine. This 2008 update of the off-highway models was the second major update (the first model update was conducted during 2002-2003) after they were originally developed in the mid-1990s. The agricultural model methodology, specifically, underwent a significant revision because of changes in data availability since 2003. Some revision to the model was necessary due to removal of certain data elements used in the original estimation method. The revised agricultural model also made use of some newly available information, published by the data source agency in recent years. The other model methodologies were not drastically changed, though many data elements were updated to improve the accuracy of these models. Note that components in the Public Use of Gasoline Model were not updated in 2008. A major challenge in updating estimation methods applied by the public-use model is that they would have to rely on significant new data collection efforts. In addition, due to resource limitations, several components of the models (both off-highway and public-use models) that utilized regression modeling approaches were not recalibrated under the 2008 study. An investigation of the Environmental Protection Agency's NONROAD2005 model was also carried out under the 2008 model update. Results generated from the NONROAD2005 model were analyzed, examined, and compared, to the extent that

  14. Proposal of evaluation method of tsunami wave pressure using 2D depth-integrated flow simulation

    International Nuclear Information System (INIS)

    Arimitsu, Tsuyoshi; Ooe, Kazuya; Kawasaki, Koji

    2012-01-01

    To design and construct land structures resistant to tsunami forces, it is essential to evaluate tsunami pressure quantitatively. The existing hydrostatic formula, in general, tends to underestimate tsunami wave pressure under the condition of inundation flow with a large Froude number. An estimation method for the tsunami pressure acting on a land structure is proposed, using the inundation depth and horizontal velocity at the front of the structure, calculated with a 2D depth-integrated flow model based on an unstructured grid system. The comparison between the numerical and experimental results revealed that the proposed method could reasonably reproduce the vertical distribution of the maximum tsunami pressure as well as the time variation of the tsunami pressure exerted on the structure. (author)

  15. Boundary element method for modelling creep behaviour

    International Nuclear Information System (INIS)

    Zarina Masood; Shah Nor Basri; Abdel Majid Hamouda; Prithvi Raj Arora

    2002-01-01

    A two-dimensional initial-strain direct boundary element method is proposed to numerically model creep behaviour. The boundary of the body is discretized into quadratic elements and the domain into quadratic quadrilaterals. The variables are also assumed to have a quadratic variation over the elements. The boundary integral equation is solved for each boundary node and assembled into a matrix. This matrix is solved by Gauss elimination with partial pivoting to obtain the variables on the boundary and in the interior. Due to the time-dependent nature of creep, the solution has to be derived over increments of time. An automatic time incrementation technique and the backward Euler method for updating the variables are implemented to assure stability and accuracy of the results. A flowchart of the solution strategy is also presented. (Author)
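
    The backward Euler update of the creep variables over automatic time increments can be sketched for a simple uniaxial case; the Norton creep law, the material constants and the constant-total-strain relaxation setup are illustrative assumptions and do not reproduce the boundary element formulation itself.

      import numpy as np

      def backward_euler_creep(eps_total=0.002, E=200e3, A=1e-20, n=5.0,
                               t_end=1e4, dt=100.0, tol=1e-10):
          # Uniaxial stress relaxation at constant total strain with Norton creep
          # (creep rate = A * sigma^n), integrated implicitly with backward Euler.
          eps_c, t, history = 0.0, 0.0, []
          while t < t_end:
              eps_c_new = eps_c
              for _ in range(50):                          # Newton iterations on the scalar residual
                  sigma = E * (eps_total - eps_c_new)
                  residual = eps_c_new - eps_c - dt * A * sigma ** n
                  d_residual = 1.0 + dt * A * n * sigma ** (n - 1) * E
                  delta = residual / d_residual
                  eps_c_new -= delta
                  if abs(delta) < tol:
                      break
              eps_c = eps_c_new
              t += dt
              history.append((t, E * (eps_total - eps_c)))   # remaining stress (MPa)
          return history

      if __name__ == "__main__":
          for t, sigma in backward_euler_creep()[::20]:
              print(f"t = {t:8.0f} s   stress = {sigma:8.2f} MPa")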

  16. PCIU: Hardware Implementations of an Efficient Packet Classification Algorithm with an Incremental Update Capability

    Directory of Open Access Journals (Sweden)

    O. Ahmed

    2011-01-01

    Full Text Available Packet classification plays a crucial role for a number of network services such as policy-based routing, firewalls, and traffic billing, to name a few. However, classification can be a bottleneck in the above-mentioned applications if not implemented properly and efficiently. In this paper, we propose PCIU, a novel classification algorithm, which improves upon previously published work. PCIU provides lower preprocessing time, lower memory consumption, ease of incremental rule update, and reasonable classification time compared to state-of-the-art algorithms. The proposed algorithm was evaluated and compared to RFC and HiCut using several benchmarks. Results obtained indicate that PCIU outperforms these algorithms in terms of speed, memory usage, incremental update capability, and preprocessing time. The algorithm, furthermore, was improved and made more accessible for a variety of applications through implementation in hardware. Two such implementations are detailed and discussed in this paper. The results indicate that a hardware/software codesign approach results in a PCIU solution that is slower but easier to optimize and improve within time constraints. A hardware accelerator based on an ESL approach using Handel-C, on the other hand, resulted in a 31x speed-up over a pure software implementation running on a state-of-the-art Xeon processor.

  17. A Globally Convergent Matrix-Free Method for Constrained Equations and Its Linear Convergence Rate

    Directory of Open Access Journals (Sweden)

    Min Sun

    2014-01-01

    Full Text Available A matrix-free method for constrained equations is proposed, which is a combination of the well-known PRP (Polak-Ribière-Polyak) conjugate gradient method and the famous hyperplane projection method. The new method is not only derivative-free, but also completely matrix-free, and consequently, it can be applied to solve large-scale constrained equations. We obtain global convergence of the new method without any differentiability requirement on the constrained equations. Compared with the existing gradient methods for solving such problems, the new method possesses a linear convergence rate under standard conditions, and a relaxation factor γ is included in the update step to accelerate convergence. Preliminary numerical results show that it is promising in practice.
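
    A minimal sketch of the combination described above (PRP-type directions plus the hyperplane projection step with a relaxation factor γ) is given below for a monotone system F(x) = 0 restricted to the nonnegative orthant; the test mapping, the line-search constants and the tolerances are illustrative assumptions rather than the paper's exact algorithm.

      import numpy as np

      def solve_constrained_equations(F, x0, project, gamma=1.8, rho=0.5,
                                      sigma=1e-4, tol=1e-8, max_iter=500):
          # Matrix-free PRP directions combined with the hyperplane projection step
          # for monotone constrained equations F(x) = 0, x in C.
          x = project(np.asarray(x0, dtype=float))
          Fx = F(x)
          d = -Fx
          for _ in range(max_iter):
              if np.linalg.norm(Fx) < tol:
                  break
              # Backtracking search for alpha with -F(x + alpha*d)^T d >= sigma*alpha*||d||^2.
              alpha = 1.0
              while True:
                  z = x + alpha * d
                  Fz = F(z)
                  if -Fz @ d >= sigma * alpha * (d @ d) or alpha < 1e-12:
                      break
                  alpha *= rho
              if np.linalg.norm(Fz) < tol:
                  x, Fx = z, Fz
                  break
              # Relaxed projection onto the separating hyperplane, then back onto C.
              step = gamma * (Fz @ (x - z)) / (Fz @ Fz)
              x_new = project(x - step * Fz)
              Fx_new = F(x_new)
              beta = Fx_new @ (Fx_new - Fx) / (Fx @ Fx)    # PRP parameter
              d = -Fx_new + beta * d
              x, Fx = x_new, Fx_new
          return x, Fx

      if __name__ == "__main__":
          F = lambda x: x + np.sin(x)                      # a monotone mapping with solution x = 0
          project = lambda x: np.maximum(x, 0.0)           # C = nonnegative orthant
          x, Fx = solve_constrained_equations(F, np.full(5, 2.0), project)
          print("solution:", np.round(x, 6), " residual norm:", np.linalg.norm(Fx))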

  18. Diagnosis, management and response criteria of iron overload in myelodysplastic syndromes (MDS): updated recommendations of the Austrian MDS platform.

    Science.gov (United States)

    Valent, Peter; Stauder, Reinhard; Theurl, Igor; Geissler, Klaus; Sliwa, Thamer; Sperr, Wolfgang R; Bettelheim, Peter; Sill, Heinz; Pfeilstöcker, Michael

    2018-02-01

    Despite the availability of effective iron chelators, transfusion-related morbidity is still a challenge in chronically transfused patients with myelodysplastic syndromes (MDS). In these patients, transfusion-induced iron overload may lead to organ dysfunction or even organ failure. In addition, iron overload is associated with reduced overall survival in MDS. Areas covered: During the past 10 years, various guidelines for the management of MDS patients with iron overload have been proposed. In the present article, we provide our updated recommendations for the diagnosis, prevention and therapy of iron overload in MDS. In addition, we propose refined treatment response criteria. As in 2006 and 2007, recommendations were discussed and formulated by participants of our Austrian MDS platform in a series of meetings in 2016 and 2017. Expert commentary: Our updated recommendations should support early recognition of iron overload, optimal patient management and the measurement of clinical responses to chelation treatment in daily practice.

  19. An update on risk factors for cartilage loss in knee osteoarthritis assessed using MRI-based semiquantitative grading methods

    Energy Technology Data Exchange (ETDEWEB)

    Alizai, Hamza [Boston University School of Medicine, Quantitative Imaging Center, Department of Radiology, Boston, MA (United States); Aspetar Orthopaedic and Sports Medicine Hospital, Doha (Qatar); University of Texas Health Science Center at San Antonio, Department of Radiology, San Antonio, TX (United States); Roemer, Frank W. [Boston University School of Medicine, Quantitative Imaging Center, Department of Radiology, Boston, MA (United States); Aspetar Orthopaedic and Sports Medicine Hospital, Doha (Qatar); University of Erlangen-Nuremberg, Department of Radiology, Erlangen (Germany); Hayashi, Daichi [Boston University School of Medicine, Quantitative Imaging Center, Department of Radiology, Boston, MA (United States); Aspetar Orthopaedic and Sports Medicine Hospital, Doha (Qatar); Yale University School of Medicine, Department of Radiology, Bridgeport Hospital, Bridgeport, CT (United States); Crema, Michel D. [Boston University School of Medicine, Quantitative Imaging Center, Department of Radiology, Boston, MA (United States); Aspetar Orthopaedic and Sports Medicine Hospital, Doha (Qatar); Hospital do Coracao and Teleimagem, Department of Radiology, Sao Paulo (Brazil); Felson, David T. [Boston University School of Medicine, Clinical Epidemiology Research and Training Unit, Boston, MA (United States); Guermazi, Ali [Boston University School of Medicine, Quantitative Imaging Center, Department of Radiology, Boston, MA (United States); Aspetar Orthopaedic and Sports Medicine Hospital, Doha (Qatar); Boston Medical Center, Boston, MA (United States)

    2014-11-07

    Arthroscopy-based semiquantitative scoring systems such as the Outerbridge and Noyes scores were the first to be developed for the purpose of grading cartilage defects. As magnetic resonance imaging (MRI) became available for evaluation of the osteoarthritic knee joint, these systems were adapted for use with MRI. Later on, grading methods such as the Whole Organ Magnetic Resonance Score, the Boston-Leeds Osteoarthritis Knee Score and the MRI Osteoarthritis Knee Score were designed specifically for performing whole-organ assessment of the knee joint structures, including cartilage. Cartilage grades on MRI obtained with these scoring systems represent optimal outcome measures for longitudinal studies, and are designed to enhance understanding of the knee osteoarthritis disease process. The purpose of this narrative review is to describe cartilage assessment in knee osteoarthritis using currently available MRI-based semiquantitative whole-organ scoring systems, and to provide an update on the risk factors for cartilage loss in knee osteoarthritis as assessed with these scoring systems. (orig.)

  20. An update on risk factors for cartilage loss in knee osteoarthritis assessed using MRI-based semiquantitative grading methods

    International Nuclear Information System (INIS)

    Alizai, Hamza; Roemer, Frank W.; Hayashi, Daichi; Crema, Michel D.; Felson, David T.; Guermazi, Ali

    2015-01-01

    Arthroscopy-based semiquantitative scoring systems such as the Outerbridge and Noyes scores were the first to be developed for the purpose of grading cartilage defects. As magnetic resonance imaging (MRI) became available for evaluation of the osteoarthritic knee joint, these systems were adapted for use with MRI. Later on, grading methods such as the Whole Organ Magnetic Resonance Score, the Boston-Leeds Osteoarthritis Knee Score and the MRI Osteoarthritis Knee Score were designed specifically for performing whole-organ assessment of the knee joint structures, including cartilage. Cartilage grades on MRI obtained with these scoring systems represent optimal outcome measures for longitudinal studies, and are designed to enhance understanding of the knee osteoarthritis disease process. The purpose of this narrative review is to describe cartilage assessment in knee osteoarthritis using currently available MRI-based semiquantitative whole-organ scoring systems, and to provide an update on the risk factors for cartilage loss in knee osteoarthritis as assessed with these scoring systems. (orig.)

  1. A hybrid degradation tendency measurement method for mechanical equipment based on moving window and Grey–Markov model

    International Nuclear Information System (INIS)

    Jiang, Wei; Zhou, Jianzhong; Zheng, Yang; Liu, Han

    2017-01-01

    Accurate degradation tendency measurement is vital for the secure operation of mechanical equipment. However, the existing techniques and methodologies for degradation measurement still face challenges, such as lack of appropriate degradation indicator, insufficient accuracy, and poor capability to track the data fluctuation. To solve these problems, a hybrid degradation tendency measurement method for mechanical equipment based on a moving window and Grey–Markov model is proposed in this paper. In the proposed method, a 1D normalized degradation index based on multi-feature fusion is designed to assess the extent of degradation. Subsequently, the moving window algorithm is integrated with the Grey–Markov model for the dynamic update of the model. Two key parameters, namely the step size and the number of states, contribute to the adaptive modeling and multi-step prediction. Finally, three types of combination prediction models are established to measure the degradation trend of equipment. The effectiveness of the proposed method is validated with a case study on the health monitoring of turbine engines. Experimental results show that the proposed method has better performance, in terms of both measuring accuracy and data fluctuation tracing, in comparison with other conventional methods. (paper)

  2. A hybrid degradation tendency measurement method for mechanical equipment based on moving window and Grey-Markov model

    Science.gov (United States)

    Jiang, Wei; Zhou, Jianzhong; Zheng, Yang; Liu, Han

    2017-11-01

    Accurate degradation tendency measurement is vital for the secure operation of mechanical equipment. However, the existing techniques and methodologies for degradation measurement still face challenges, such as lack of appropriate degradation indicator, insufficient accuracy, and poor capability to track the data fluctuation. To solve these problems, a hybrid degradation tendency measurement method for mechanical equipment based on a moving window and Grey-Markov model is proposed in this paper. In the proposed method, a 1D normalized degradation index based on multi-feature fusion is designed to assess the extent of degradation. Subsequently, the moving window algorithm is integrated with the Grey-Markov model for the dynamic update of the model. Two key parameters, namely the step size and the number of states, contribute to the adaptive modeling and multi-step prediction. Finally, three types of combination prediction models are established to measure the degradation trend of equipment. The effectiveness of the proposed method is validated with a case study on the health monitoring of turbine engines. Experimental results show that the proposed method has better performance, in terms of both measuring accuracy and data fluctuation tracing, in comparison with other conventional methods.
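
    The moving-window grey-model update at the core of the method can be sketched as follows: a GM(1,1) model is refitted on the most recent window of a degradation index and used for one-step-ahead prediction. The Markov state correction and the multi-feature fusion index are omitted, and the window length and synthetic data are illustrative assumptions.

      import numpy as np

      def gm11_forecast(x0):
          # Fit a GM(1,1) grey model to a short positive series and return the
          # one-step-ahead prediction of the original series.
          x0 = np.asarray(x0, dtype=float)
          n = len(x0)
          x1 = np.cumsum(x0)                               # accumulated generating operation (AGO)
          z1 = 0.5 * (x1[1:] + x1[:-1])                    # background values
          B = np.column_stack([-z1, np.ones(n - 1)])
          a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
          # Time response of the whitened equation dx1/dt + a*x1 = b (0-based index k).
          x1_hat = lambda k: (x0[0] - b / a) * np.exp(-a * k) + b / a
          return x1_hat(n) - x1_hat(n - 1)                 # inverse AGO gives the next x0 value

      def moving_window_forecast(series, window=8):
          # Refit the grey model on the latest window at every step (dynamic model update).
          return np.array([gm11_forecast(series[t - window:t])
                           for t in range(window, len(series))])

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          t = np.arange(60)
          index = 0.2 + 0.01 * t + 0.005 * rng.standard_normal(60)   # synthetic degradation index
          preds = moving_window_forecast(index, window=8)
          print("mean absolute error:", np.round(np.mean(np.abs(preds - index[8:])), 4))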

  3. Decentralized Consistent Network Updates in SDN with ez-Segway

    KAUST Repository

    Nguyen, Thanh Dang

    2017-03-06

    We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and black-holes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.

  4. BE-2004: International meeting on updates in best estimate methods in nuclear installation safety analysis. Proceedings

    International Nuclear Information System (INIS)

    2004-01-01

    BE-2004 is the second in a series of embedded conferences that focus on generating and sustaining the dialogue regarding the use of best estimate plus uncertainty tools to license operational and advanced nuclear systems. The first conference in the series was held during the 2000 American Nuclear Society Winter Meeting in Washington. BE-2004 is international in scope, as evidenced by the multinational sources of the papers, and is intended to serve as an opportunity for information exchange between research scientists, practicing engineers, and regulators. However, as appropriate to a follow-on conference, the primary theme of BE-2004 is to provide updates reflecting the progress in best estimate methodologies in the last four years. Examples include research activities that evolved from the current Generation-IV initiative and other new designs [Nuclear Energy Research Initiative (NERI), etc.], core design and neutronic calculations that support best estimate analysis, use of advanced methodologies to produce plant licensing procedures competitive with best estimate methods, and of course current philosophical and technical issues that need to be considered in implementing best estimate codes as an established part of the international licensing framework

  5. Methodology for updating terrain object data from remote sensing data : the application of Landsat TM data with respect to agricultural fields

    NARCIS (Netherlands)

    Janssen, L.

    1994-01-01

    This thesis describes some methods for updating the thematic and geometrical data of terrain objects that are contained in a Geographic Information System (GIS). The updating is based on the application of digital interpretation techniques on high resolution satellite data. The potential

  6. Library Services and Construction Act. Long Range Plan, 1982-1986 Updates.

    Science.gov (United States)

    Seidenberg, Edward

    This 1982-86 update to long-range planning designed to continue the improvement of library facilities and services in Texas includes a review of how the plan developed, the various environmental factors affecting library operations, the present development of libraries, information needs and approaches to satisfying those needs, and methods for…

  7. Roads Data Conflation Using Update High Resolution Satellite Images

    Science.gov (United States)

    Abdollahi, A.; Riyahi Bakhtiari, H. R.

    2017-11-01

    Urbanization, industrialization and modernization are growing rapidly in developing countries. New industrial cities, with all the problems brought on by rapid population growth, need infrastructure to support the growth. This has led to the expansion and development of the road network. A great deal of road network data has been produced using traditional methods in past years. Over time, a large amount of descriptive information has been assigned to these map data, but their geometric accuracy and precision are not appropriate to today's needs. In this regard, improving the geometric accuracy of road network data while preserving the descriptive data attributed to them, and updating the existing geodatabases, is necessary. Due to the size and extent of the country, updating the road network maps using traditional methods is time consuming and costly. Conversely, using remote sensing technology and geographic information systems can reduce costs, save time and increase accuracy and speed. With the increasing availability of high-resolution satellite imagery and geospatial datasets, there is an urgent need to combine geographic information from overlapping sources to retain accurate data, minimize redundancy, and reconcile data conflicts. In this research, an innovative method for vector-to-imagery conflation, integrating several image-based and vector-based algorithms, is presented. The SVM method was used for image classification and the Level Set method to extract the roads; the different types of road intersections were extracted from the imagery using morphological operators. To match the extracted points and find correspondences, a matching function based on the nearest-neighbour method was applied. Finally, after identifying the matching points, a rubber-sheeting method was used to align the two datasets. Residual and RMSE criteria were used to evaluate accuracy. The results demonstrated excellent performance. The average root-mean-square error decreased from 11.8 to 4.1 m.

  8. Identification of material parameters for plasticity models: A comparative study on the finite element model updating and the virtual fields method

    Science.gov (United States)

    Martins, J. M. P.; Thuillier, S.; Andrade-Campos, A.

    2018-05-01

    The identification of material parameters, for a given constitutive model, can be seen as the first step before any practical application. In the last years, the field of material parameters identification received an important boost with the development of full-field measurement techniques, such as Digital Image Correlation. These techniques enable the use of heterogeneous displacement/strain fields, which contain more information than the classical homogeneous tests. Consequently, different techniques have been developed to extract material parameters from full-field measurements. In this study, two of these techniques are addressed, the Finite Element Model Updating (FEMU) and the Virtual Fields Method (VFM). The main idea behind FEMU is to update the parameters of a constitutive model implemented in a finite element model until both numerical and experimental results match, whereas VFM makes use of the Principle of Virtual Work and does not require any finite element simulation. Though both techniques proved their feasibility in linear and non-linear constitutive models, it is rather difficult to rank their robustness in plasticity. The purpose of this work is to perform a comparative study in the case of elasto-plastic models. Details concerning the implementation of each strategy are presented. Moreover, a dedicated code for VFM within a large strain framework is developed. The reconstruction of the stress field is performed through a user subroutine. A heterogeneous tensile test is considered to compare FEMU and VFM strategies.

  9. Minnesota's forest statistics, 1987: an inventory update.

    Science.gov (United States)

    Jerold T. Hahn; W. Brad Smith

    1987-01-01

    The Minnesota 1987 inventory update, derived by using tree growth models, reports 13.5 million acres of timberland, a decline of less than 1% since 1977. This bulletin presents findings from the inventory update in tables detailing timberland area, volume, and biomass.

  10. Wisconsin's forest statistics, 1987: an inventory update.

    Science.gov (United States)

    W. Brad Smith; Jerold T. Hahn

    1989-01-01

    The Wisconsin 1987 inventory update, derived by using tree growth models, reports 14.7 million acres of timberland, a decline of less than 1% since 1983. This bulletin presents findings from the inventory update in tables detailing timberland area, volume, and biomass.

  11. Updated measurement of the tau lifetime at SLD

    International Nuclear Information System (INIS)

    1996-01-01

    We present an updated measurement of the tau lifetime at SLD. 4316 τ-pair events, selected from a 150k Z0 data sample, are analyzed using three techniques: decay length, impact parameter, and impact parameter difference methods. The measurement benefits from the small and stable interaction region at the SLC and the precision CCD pixel vertex detector of the SLD. The combined result is: τ_τ = 288.1 ± 6.1 (stat) ± 3.3 (syst) fs

  12. Research on Topographic Map Updating

    Directory of Open Access Journals (Sweden)

    Ivana Javorović

    2013-04-01

    Full Text Available The investigation of the interpretability of a panchromatic IRS-1C satellite image integrated with a multispectral Landsat TM image, with the purpose of updating the topographic map sheet at the scale of 1:25 000, is described. The geocoding of the source map was based on trigonometric points of the map sheet. Satellite images were geocoded using control points selected from the map. The contents of the map were vectorized and a topographic database designed. Digital image processing improved the interpretability of the images; the vectorization of new contents was then carried out. Change detection of the forest and water areas was performed using unsupervised classification of spatially and spectrally merged images. Verification of the results was made using corresponding aerial photographs. Although this methodology could not ensure the complete updating of the topographic map at the scale of 1:25 000, the database has been updated with a large amount of data. Erdas Imagine 8.3 software was used.

  13. AN EFFICIENT SELF-UPDATING FACE RECOGNITION SYSTEM FOR PLASTIC SURGERY FACE

    Directory of Open Access Journals (Sweden)

    A. Devi

    2016-08-01

    Full Text Available A facial recognition system is fundamentally a computer application for the automatic identification of a person from a digitized image or a video source. A major cause of poor overall performance is change in the user's appearance due to factors such as ageing, beard growth, and sun-tan. To overcome this drawback, a self-update process has been developed in which the system learns the biometric attributes of the user every time the user interacts with it, and the stored information is updated automatically. Plastic surgery procedures offer a skilled and durable means of enhancing facial appearance by correcting feature anomalies and treating the facial skin to obtain a youthful look. When plastic surgery is performed on an individual, the features of the face are reconstructed either locally or globally. However, the changes introduced by plastic surgery are hard to model with existing face recognition systems and degrade the performance of face recognition algorithms. Facial plastic surgery therefore changes facial features to a large extent and poses a significant challenge to face recognition systems. This work introduces a new multimodal biometric approach that uses novel techniques to boost the recognition rate and security. The proposed method consists of several stages: face segmentation using an Active Appearance Model (AAM), face normalization using a Kernel Density Estimate/Point Distribution Model (KDE-PDM), feature extraction using Local Gabor XOR Patterns (LGXP), and classification using Independent Component Analysis (ICA). Efficient techniques have been used in each phase of the FRAS in order to obtain improved results.

  14. FPGA remote update for nuclear environments

    Energy Technology Data Exchange (ETDEWEB)

    Fernandes, Ana; Pereira, Rita C.; Sousa, Jorge; Carvalho, Paulo F.; Correia, Miguel; Rodrigues, Antonio P.; Carvalho, Bernardo B.; Goncalves, Bruno [Instituto de Plasmas e Fusao Nuclear, Instituto Superior Tecnico, Universidade de Lisboa, 1049-001 Lisboa, (Portugal); Correia, Carlos M.B.A. [Centro de Instrumentacao, Dept. de Fisica, Universidade de Coimbra, 3004-516 Coimbra, (Portugal)

    2015-07-01

    The Instituto de Plasmas e Fusao Nuclear (IPFN) has developed dedicated re-configurable modules based on field programmable gate array (FPGA) devices for several nuclear fusion machines worldwide. Moreover, new Advanced Telecommunication Computing Architecture (ATCA) based modules developed by IPFN are already included in the ITER catalogue. One of the requirements for re-configurable modules operating in future nuclear environments, including ITER, is remote update capability. Accordingly, this work presents an alternative method for FPGA remote programming to be implemented in new ATCA based re-configurable modules. FPGAs are volatile devices and their programming code is usually stored in dedicated flash memories for proper configuration during module power-on. The presented method is capable of storing new FPGA code in Serial Peripheral Interface (SPI) flash memories using the PCI Express (PCIe) network established on the ATCA back-plane, linking data acquisition endpoints and the data switch blades. The method is based on the Xilinx Quick Boot application note, adapted to the PCIe protocol and ATCA based modules. (authors)

  15. Simplified Analytical Method for Optimized Initial Shape Analysis of Self-Anchored Suspension Bridges and Its Verification

    Directory of Open Access Journals (Sweden)

    Myung-Rag Jung

    2015-01-01

    Full Text Available A simplified analytical method providing accurate unstrained lengths of all structural elements is proposed to find the optimized initial state of self-anchored suspension bridges under dead loads. For this, equilibrium equations of the main girder and the main cable system are derived and solved by evaluating the self-weights of cable members using unstrained cable lengths and iteratively updating both the horizontal tension component and the vertical profile of the main cable. Furthermore, to demonstrate the validity of the simplified analytical method, the unstrained element length method (ULM is applied to suspension bridge models based on the unstressed lengths of both cable and frame members calculated from the analytical method. Through numerical examples, it is demonstrated that the proposed analytical method can indeed provide an optimized initial solution by showing that both the simplified method and the nonlinear FE procedure lead to practically identical initial configurations with only localized small bending moment distributions.
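
    As a loose illustration of the iterative idea described above, the sketch below repeatedly updates the horizontal tension of a main cable so that it is consistent with a cable self-weight evaluated from the unstrained cable length. The parabolic-cable simplification, the fixed target sag, and all numbers are illustrative assumptions, not the paper's formulation.

```python
# Much-simplified fixed-point sketch: evaluate cable self-weight from the
# unstrained length, update the horizontal tension, and strip the elastic
# elongation to get a new unstrained length. All values are hypothetical.
span = 300.0          # main span (m)
sag = 30.0            # target sag at mid-span (m)
w_deck = 100.0        # dead load carried by the hangers (kN per m of span)
gamma_cable = 2.0     # cable self-weight (kN per m of cable length)
E_A = 2.0e7           # cable axial stiffness EA (kN)

s = span * (1.0 + (8.0 / 3.0) * (sag / span) ** 2)   # stressed parabolic length
L0 = s                                               # initial unstrained-length guess
for _ in range(100):
    w_total = w_deck + gamma_cable * L0 / span       # load per m of span
    H = w_total * span ** 2 / (8.0 * sag)            # parabolic cable equilibrium
    T_avg = H * (1.0 + (4.0 * sag / span) ** 2) ** 0.5   # rough average tension
    L0_new = s / (1.0 + T_avg / E_A)                 # remove elastic elongation
    if abs(L0_new - L0) < 1e-9:
        L0 = L0_new
        break
    L0 = L0_new

print(f"horizontal tension H ≈ {H:.1f} kN, unstrained length ≈ {L0:.3f} m")
```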

  16. 76 FR 70209 - Proposed Information Collection Request; Notice of New Requirements and Procedures for Grant...

    Science.gov (United States)

    2011-11-10

    ... invites the public and other Federal agencies to comment on a proposed information collection concerning... Administrations (OAs).\\1\\ DOT is updating systems that support grant payments and there will be changes to the way... System (GTS), the Federal Highway Administration's Rapid Approval State Payment System (RASPS), or...

  17. Recommender engine for continuous-time quantum Monte Carlo methods

    Science.gov (United States)

    Huang, Li; Yang, Yi-feng; Wang, Lei

    2017-03-01

    Recommender systems play an essential role in the modern business world. They recommend favorable items such as books, movies, and search queries to users based on their past preferences. Applying similar ideas and techniques to Monte Carlo simulations of physical systems boosts their efficiency without sacrificing accuracy. Exploiting the quantum to classical mapping inherent in the continuous-time quantum Monte Carlo methods, we construct a classical molecular gas model to reproduce the quantum distributions. We then utilize powerful molecular simulation techniques to propose efficient quantum Monte Carlo updates. The recommender engine approach provides a general way to speed up the quantum impurity solvers.

  18. Software Updating in Wireless Sensor Networks: A Survey and Lacunae

    Directory of Open Access Journals (Sweden)

    Cormac J. Sreenan

    2013-11-01

    Full Text Available Wireless Sensor Networks are moving out of the laboratory and into the field. For a number of reasons there is often a need to update sensor node software, or node configuration, after deployment. The need for over-the-air updates is driven both by the scale of deployments, and by the remoteness and inaccessibility of sensor nodes. This need has been recognized since the early days of sensor networks, and research results from the related areas of mobile networking and distributed systems have been applied to this area. In order to avoid any manual intervention, the update process needs to be autonomous. This paper presents a comprehensive survey of software updating in Wireless Sensor Networks, and analyses the features required to make these updates autonomous. A new taxonomy of software update features and a new model for fault detection and recovery are presented. The paper concludes by identifying the lacunae relating to autonomous software updates, providing direction for future research.

  19. Which Individuals To Choose To Update the Reference Population? Minimizing the Loss of Genetic Diversity in Animal Genomic Selection Programs

    Directory of Open Access Journals (Sweden)

    Sonia E. Eynard

    2018-01-01

    Full Text Available Genomic selection (GS) is commonly used in livestock and increasingly in plant breeding. Relying on phenotypes and genotypes of a reference population, GS allows performance prediction for young individuals having only genotypes. This is expected to achieve fast high genetic gain but with a potential loss of genetic diversity. Existing methods to conserve genetic diversity depend mostly on the choice of the breeding individuals. In this study, we propose a modification of the reference population composition to mitigate diversity loss. Since the high cost of phenotyping is the limiting factor for GS, our findings are of major economic interest. This study aims to answer the following questions: how would decisions on the reference population affect the breeding population, and how to best select individuals to update the reference population and balance maximizing genetic gain and minimizing loss of genetic diversity? We investigated three updating strategies for the reference population: random, truncation, and optimal contribution (OC) strategies. OC maximizes genetic merit for a fixed loss of genetic diversity. A French Montbéliarde dairy cattle population with 50K SNP chip genotypes and simulations over 10 generations were used to compare these different strategies using milk production as the trait of interest. Candidates were selected to update the reference population. Prediction bias and both genetic merit and diversity were measured. Changes in the reference population composition slightly affected the breeding population. Optimal contribution strategy appeared to be an acceptable compromise to maintain both genetic gain and diversity in the reference and the breeding populations.

  20. Which Individuals To Choose To Update the Reference Population? Minimizing the Loss of Genetic Diversity in Animal Genomic Selection Programs.

    Science.gov (United States)

    Eynard, Sonia E; Croiseau, Pascal; Laloë, Denis; Fritz, Sebastien; Calus, Mario P L; Restoux, Gwendal

    2018-01-04

    Genomic selection (GS) is commonly used in livestock and increasingly in plant breeding. Relying on phenotypes and genotypes of a reference population, GS allows performance prediction for young individuals having only genotypes. This is expected to achieve fast high genetic gain but with a potential loss of genetic diversity. Existing methods to conserve genetic diversity depend mostly on the choice of the breeding individuals. In this study, we propose a modification of the reference population composition to mitigate diversity loss. Since the high cost of phenotyping is the limiting factor for GS, our findings are of major economic interest. This study aims to answer the following questions: how would decisions on the reference population affect the breeding population, and how to best select individuals to update the reference population and balance maximizing genetic gain and minimizing loss of genetic diversity? We investigated three updating strategies for the reference population: random, truncation, and optimal contribution (OC) strategies. OC maximizes genetic merit for a fixed loss of genetic diversity. A French Montbéliarde dairy cattle population with 50K SNP chip genotypes and simulations over 10 generations were used to compare these different strategies using milk production as the trait of interest. Candidates were selected to update the reference population. Prediction bias and both genetic merit and diversity were measured. Changes in the reference population composition slightly affected the breeding population. Optimal contribution strategy appeared to be an acceptable compromise to maintain both genetic gain and diversity in the reference and the breeding populations. Copyright © 2018 Eynard et al.
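
    As a toy illustration of two of the updating strategies compared in the records above, the sketch below selects candidates for the reference population at random and by truncation on an estimated merit. The data are simulated and the names are hypothetical; the optimal contribution strategy is not shown, since it additionally constrains relatedness through a constrained optimization.

```python
# Illustrative sketch of random vs. truncation selection of candidates to add
# to a reference population; all values are simulated, not from the study.
import numpy as np

rng = np.random.default_rng(0)
n_candidates, n_to_add = 1000, 100
estimated_merit = rng.normal(size=n_candidates)   # e.g. genomic EBVs of candidates

random_choice = rng.choice(n_candidates, size=n_to_add, replace=False)
truncation_choice = np.argsort(estimated_merit)[-n_to_add:]   # top candidates only

print("mean merit, random    :", estimated_merit[random_choice].mean())
print("mean merit, truncation:", estimated_merit[truncation_choice].mean())
```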

  1. 77 FR 41258 - FOIA Fee Schedule Update

    Science.gov (United States)

    2012-07-13

    ... DEFENSE NUCLEAR FACILITIES SAFETY BOARD 10 CFR Part 1703 FOIA Fee Schedule Update AGENCY: Defense Nuclear Facilities Safety Board. ACTION: Establishment of FOIA Fee Schedule. SUMMARY: The Defense Nuclear Facilities Safety Board is publishing its Freedom of Information Act (FOIA) Fee Schedule Update pursuant to...

  2. 76 FR 43819 - FOIA Fee Schedule Update

    Science.gov (United States)

    2011-07-22

    ... DEFENSE NUCLEAR FACILITIES SAFETY BOARD 10 CFR Part 1703 FOIA Fee Schedule Update AGENCY: Defense Nuclear Facilities Safety Board. ACTION: Establishment of FOIA Fee Schedule. SUMMARY: The Defense Nuclear Facilities Safety Board is publishing its Freedom of Information Act (FOIA) Fee Schedule Update pursuant to...

  3. A methodological survey identified eight proposed frameworks for the adaptation of health related guidelines.

    Science.gov (United States)

    Darzi, Andrea; Abou-Jaoude, Elias A; Agarwal, Arnav; Lakis, Chantal; Wiercioch, Wojtek; Santesso, Nancy; Brax, Hneine; El-Jardali, Fadi; Schünemann, Holger J; Akl, Elie A

    2017-06-01

    Our objective was to identify and describe published frameworks for adaptation of clinical, public health, and health services guidelines. We included reports describing methods of adaptation of guidelines in sufficient detail to allow their reproducibility. We searched the Medline and EMBASE databases. We also searched personal files, as well as manuals and handbooks of organizations and professional societies that proposed methods of adaptation and adoption of guidelines. We followed standard systematic review methodology. Our search captured 12,021 citations, out of which we identified eight proposed methods of guideline adaptation: ADAPTE, Adapted ADAPTE, Alberta Ambassador Program adaptation phase, GRADE-ADOLOPMENT, MAGIC, RAPADAPTE, Royal College of Nursing (RCN), and Systematic Guideline Review (SGR). The ADAPTE framework consists of a 24-step process to adapt guidelines to a local context taking into consideration the needs, priorities, legislation, policies, and resources. The Alexandria Center for Evidence-Based Clinical Practice Guidelines updated one of ADAPTE's tools, modified three tools, and added three new ones. In addition, they proposed optionally using three other tools. The Alberta Ambassador Program adaptation phase consists of 11 steps and focused on adapting good-quality guidelines for nonspecific low back pain into a local context. GRADE-ADOLOPMENT is an eight-step process based on the GRADE Working Group's Evidence to Decision frameworks and applied in 22 guidelines in the context of a national guideline development program. The MAGIC research program developed a five-step adaptation process, informed by ADAPTE and the GRADE approach, in the context of adapting thrombosis guidelines. The RAPADAPTE framework consists of 12 steps based on ADAPTE and using synthesized evidence databases, retrospectively derived from the experience of producing a high-quality guideline for the treatment of breast cancer with limited resources in Costa Rica. The RCN outlines

  4. 77 FR 24684 - Proposed Information Collection; Comment Request; 2013-2015 American Community Survey Methods...

    Science.gov (United States)

    2012-04-25

    ... proposed content changes. Thus, we need to test an alternative questionnaire design to accommodate additional content on the ACS mail questionnaire. In the 2013 ACS Questionnaire Design Test, we will study... in Puerto Rico. II. Method of Collection Questionnaire Design Test--Data collection for this test...

  5. Programming the finite element method

    CERN Document Server

    Smith, I M; Margetts, L

    2013-01-01

    Many students, engineers, scientists and researchers have benefited from the practical, programming-oriented style of the previous editions of Programming the Finite Element Method, learning how to develop computer programs to solve specific engineering problems using the finite element method. This new fifth edition offers timely revisions that include programs and subroutine libraries fully updated to Fortran 2003, which are freely available online, and provides updated material on advances in parallel computing, thermal stress analysis, plasticity return algorithms, convection boundary c

  6. Environmental Regulatory Update Table, December 1989

    International Nuclear Information System (INIS)

    Houlberg, L.M.; Langston, M.E.; Nikbakht, A.; Salk, M.S.

    1990-01-01

    The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action

  7. Environmental regulatory update table, March 1989

    International Nuclear Information System (INIS)

    Houlberg, L.; Langston, M.E.; Nikbakht, A.; Salk, M.S.

    1989-04-01

    The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action

  8. Environmental Regulatory Update Table, April 1989

    International Nuclear Information System (INIS)

    Houlberg, L.; Langston, M.E.; Nikbakht, A.; Salk, M.S.

    1989-05-01

    The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action

  9. Environmental Regulatory Update Table, December 1991

    Energy Technology Data Exchange (ETDEWEB)

    Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.

    1992-01-01

    The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.

  10. Environmental Regulatory Update Table, August 1990

    International Nuclear Information System (INIS)

    Houlberg, L.M.; Nikbakht, A.; Salk, M.S.

    1990-09-01

    The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action

  11. Environmental Regulatory Update Table, October 1991

    Energy Technology Data Exchange (ETDEWEB)

    Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.

    1991-11-01

    The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.

  12. Environmental Regulatory Update Table, November 1991

    Energy Technology Data Exchange (ETDEWEB)

    Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.

    1991-12-01

    The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.

  13. Environmental Regulatory Update Table, September 1991

    Energy Technology Data Exchange (ETDEWEB)

    Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.

    1991-10-01

    The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.

  14. Gene set analysis: limitations in popular existing methods and proposed improvements.

    Science.gov (United States)

    Mishra, Pashupati; Törönen, Petri; Leino, Yrjö; Holm, Liisa

    2014-10-01

    Gene set analysis is the analysis of a set of genes that collectively contribute to a biological process. Most popular gene set analysis methods are based on empirical P-values, which require a large number of permutations. Despite the numerous gene set analysis methods developed in the past decade, the most popular methods still suffer from serious limitations. We present a gene set analysis method (mGSZ) based on the Gene Set Z-scoring function (GSZ) and asymptotic P-values. Asymptotic P-value calculation requires fewer permutations and thus speeds up the gene set analysis process. We compare the GSZ scoring function with seven popular gene set scoring functions and show that GSZ stands out as the best scoring function. In addition, we show improved performance of the GSA method when the max-mean statistic is replaced by the GSZ scoring function. We demonstrate the importance of both gene and sample permutations by showing the consequences of omitting one or the other. A comparison of asymptotic and empirical methods of P-value estimation demonstrates a clear advantage of asymptotic P-values over empirical P-values. We show that mGSZ outperforms state-of-the-art methods based on two different evaluations. We compared mGSZ results with permutation and rotation tests and show that rotation does not improve our asymptotic P-values. We also propose well-known asymptotic distribution models for three of the compared methods. mGSZ is available as an R package from cran.r-project.org. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
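
    To illustrate the empirical-versus-asymptotic contrast discussed above, the sketch below compares a permutation-based P-value with a normal-approximation P-value for a simple standardized-mean gene set statistic. This is not the GSZ function or the mGSZ package; the statistic, data, and sizes are illustrative assumptions only.

```python
# Empirical (permutation) vs. asymptotic (normal-approximation) P-values for a
# toy mean-based gene set statistic; NOT the GSZ statistic from the paper.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
gene_scores = rng.normal(size=5000)                  # per-gene differential scores
gene_set = rng.choice(5000, size=50, replace=False)  # indices of the gene set

m = gene_set.size
observed = gene_scores[gene_set].mean()

# Empirical P-value: resample random gene sets of the same size many times.
perms = np.array([rng.choice(gene_scores, size=m, replace=False).mean()
                  for _ in range(10000)])
p_empirical = (np.sum(np.abs(perms) >= abs(observed)) + 1) / (perms.size + 1)

# Asymptotic P-value: normal approximation for the mean of m gene scores.
se = gene_scores.std(ddof=1) / np.sqrt(m)
p_asymptotic = 2 * stats.norm.sf(abs(observed - gene_scores.mean()) / se)

print("empirical:", p_empirical, "asymptotic:", p_asymptotic)
```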

  15. An updated distribution of Solidago × niederederi (Asteraceae) in Poland

    Directory of Open Access Journals (Sweden)

    Pliszko Artur

    2017-12-01

    Full Text Available In this paper, an updated map of the distribution of Solidago ×niederederi, a natural hybrid between S. canadensis and S. virgaurea, in Poland is presented using the ATPOL cartogram method. A compiled list of 55 localities of the hybrid within 40 cartogram units (10-km squares) is provided, and its negative impact on S. virgaurea is highlighted.

  16. Online sequential condition prediction method of natural circulation systems based on EOS-ELM and phase space reconstruction

    International Nuclear Information System (INIS)

    Chen, Hanying; Gao, Puzhen; Tan, Sichao; Tang, Jiguo; Yuan, Hongsheng

    2017-01-01

    Highlights: • An online condition prediction method for natural circulation systems in NPPs was proposed based on EOS-ELM. • The proposed online prediction method was validated using experimental data. • The training speed of the proposed method is significantly fast. • The proposed method can achieve good accuracy over a wide parameter range. -- Abstract: Natural circulation designs are widely used in the passive safety systems of advanced nuclear power reactors. Irregular and chaotic flow oscillations are often observed in boiling natural circulation systems, so it is difficult for operators to monitor and predict the condition of these systems. An online condition forecasting method for natural circulation systems is proposed in this study as an assisting technique for plant operators. The proposed prediction approach was developed based on the Ensemble of Online Sequential Extreme Learning Machine (EOS-ELM) and phase space reconstruction. Online Sequential Extreme Learning Machine (OS-ELM) is an online sequential learning neural network algorithm, and EOS-ELM is its ensemble variant. The proposed condition prediction method can be initiated with a small chunk of monitoring data and updated with newly arrived data at very high speed during online prediction. Simulation experiments were conducted on data from two natural circulation loops to validate the performance of the proposed method. The simulation results show that the proposed prediction model can successfully recognize different types of flow oscillations and accurately forecast the trend of monitored plant variables. The influence of the number of hidden nodes and neural network inputs on prediction performance was studied, and the proposed model can achieve good accuracy over a wide parameter range. Moreover, the comparison results show that the proposed condition prediction method has a much faster online learning speed and better prediction accuracy than a conventional neural network model.
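
    The phase space reconstruction step mentioned above is typically a delay embedding of the monitored time series. The sketch below shows such an embedding; the embedding dimension, delay, and test signal are arbitrary illustrative choices, and the OS-ELM regressor itself is not shown. The resulting (X, y) pairs could then be fed chunk by chunk to any online sequential learner.

```python
# Minimal delay-embedding sketch for phase space reconstruction of a
# monitored signal; parameters below are illustrative only.
import numpy as np

def delay_embed(series, dim=3, tau=2):
    """Build the delay-coordinate matrix X and one-step-ahead targets y."""
    series = np.asarray(series, dtype=float)
    n = series.size - (dim - 1) * tau - 1
    X = np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])
    y = series[(dim - 1) * tau + 1 : (dim - 1) * tau + 1 + n]
    return X, y

signal = np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.05 * np.random.randn(2000)
X, y = delay_embed(signal, dim=3, tau=2)
print(X.shape, y.shape)   # rows of X are reconstructed states, y the next samples
```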

  17. How Do We Update Faces? Effects of Gaze Direction and Facial Expressions on Working Memory Updating

    OpenAIRE

    Artuso, Caterina; Palladino, Paola; Ricciardelli, Paola

    2012-01-01

    The aim of the study was to investigate how the biological binding between different facial dimensions, and their social and communicative relevance, may impact updating processes in working memory (WM). We focused on WM updating because it plays a key role in ongoing processing. Gaze direction and facial expression are crucial and changeable components of face processing. Direct gaze enhances the processing of approach-oriented facial emotional expressions (e.g., joy), while averted gaze enh...

  18. Key Update Assistant for Resource-Constrained Networks

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2012-01-01

    developed a push-button solution - powered by stochastic model checking - that network designers can easily benefit from, and it paves the way for consumers to set up key update related security parameters. Key Update Assistant, as we named it, runs necessary model checking operations and determines...

  19. A Method for Proposing Value-Adding Attributes in Customized Housing

    Directory of Open Access Journals (Sweden)

    Cynthia S. Hentschke

    2014-12-01

    Full Text Available In most emerging economies, there have been many incentives and a high availability of funding for low-cost housing projects. This has encouraged product standardization and the application of mass production ideas, based on the assumption that this is the most effective strategy for reducing costs. However, the delivery of highly standardized housing units to customers with different needs, without considering their lifestyle and perception of value, often results in inadequate products. Mass customization has been pointed out as an effective strategy to improve value generation in low-cost housing projects and to avoid the waste caused by renovations made in dwellings soon after occupancy. However, one of the main challenges for the implementation of mass customization is the definition of a set of relevant options based on users' perceived value. The aim of this paper is to propose a method for defining value-adding attributes in customized housing projects, which can support decision-making in product development. The means-end chain theory was used as a theoretical framework to connect product attributes and customers' values, through the application of the laddering technique. The method was tested in two house-building projects delivered by a company from Brazil. The main contribution of this method is to indicate the customization units that are most important for users, along with an explanation of why those units are the most relevant ones.

  20. 78 FR 36560 - 30-Day Notice of Proposed Information Collection: FHA Lender Approval, Annual Renewal, Periodic...

    Science.gov (United States)

    2013-06-18

    ... 92001-C Non-compliances on Title I Lenders Description of the need for the information and proposed use... Information Collection: FHA Lender Approval, Annual Renewal, Periodic Updates and Noncompliance Reporting by FHA Approved Lenders AGENCY: Office of the Chief Information Officer, HUD. ACTION: Notice. SUMMARY...

  1. [French Society for Biological Psychiatry and Neuropsychopharmacology task force. Formal consensus for the treatment of bipolar disorder: an update (2014)].

    Science.gov (United States)

    Samalin, L; Guillaume, S; Courtet, P; Abbar, M; Lancrenon, S; Llorca, P-M

    2015-02-01

    As part of a process to improve the quality of care, the French Society for Biological Psychiatry and Neuropsychopharmacology developed formal consensus guidelines for the treatment of bipolar disorder in 2010. The evolution of the therapeutic options available in France for the treatment of bipolar disorder has justified the update of this guideline. The purpose of this work was to provide an updated and ergonomic document to promote its use by clinicians. This update focuses on two of the six themes previously published (acute treatment and long-term treatment). Aspects of the treatment of bipolar patients that spark debate and questions from clinicians (use of antidepressants, place of dual therapy, interest of long-acting antipsychotics…) were also covered. Finally, we proposed graded recommendations taking into account specifically the risk-benefit balance of each molecule. Copyright © 2014 L'Encéphale, Paris. Published by Elsevier Masson SAS. All rights reserved.

  2. Nonadditive entropy maximization is inconsistent with Bayesian updating

    Science.gov (United States)

    Pressé, Steve

    2014-11-01

    The maximum entropy method—used to infer probabilistic models from data—is a special case of Bayes's model inference prescription which, in turn, is grounded in basic propositional logic. By contrast to the maximum entropy method, the compatibility of nonadditive entropy maximization with Bayes's model inference prescription has never been established. Here we demonstrate that nonadditive entropy maximization is incompatible with Bayesian updating and discuss the immediate implications of this finding. We focus our attention on special cases as illustrations.

  3. Updating Human Factors Engineering Guidelines for Conducting Safety Reviews of Nuclear Power Plants

    International Nuclear Information System (INIS)

    O'Hara, J.M.; Higgins, J.; Fleger, Stephen

    2011-01-01

    The U.S. Nuclear Regulatory Commission (NRC) reviews the human factors engineering (HFE) programs of applicants for nuclear power plant construction permits, operating licenses, standard design certifications, and combined operating licenses. The purpose of these safety reviews is to help ensure that personnel performance and reliability are appropriately supported. Detailed design review procedures and guidance for the evaluations is provided in three key documents: the Standard Review Plan (NUREG-0800), the HFE Program Review Model (NUREG-0711), and the Human-System Interface Design Review Guidelines (NUREG-0700). These documents were last revised in 2007, 2004 and 2002, respectively. The NRC is committed to the periodic update and improvement of the guidance to ensure that it remains a state-of-the-art design evaluation tool. To this end, the NRC is updating its guidance to stay current with recent research on human performance, advances in HFE methods and tools, and new technology being employed in plant and control room design. This paper describes the role of HFE guidelines in the safety review process and the content of the key HFE guidelines used. Then we will present the methodology used to develop HFE guidance and update these documents, and describe the current status of the update program.

  4. Environmental Regulatory Update Table, August 1991

    Energy Technology Data Exchange (ETDEWEB)

    Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.

    1991-09-01

    This Environmental Regulatory Update Table (August 1991) provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.

  5. Environmental regulatory update table, July 1991

    Energy Technology Data Exchange (ETDEWEB)

    Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.

    1991-08-01

    This Environmental Regulatory Update Table (July 1991) provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.

  6. The Value of Ensari’s Proposal in Evaluating the Mucosal Pathology of Childhood Celiac Disease: Old Classification versus New Version

    Directory of Open Access Journals (Sweden)

    Gülçin Güler Şimşek

    2012-09-01

    Full Text Available Objective: Small intestinal biopsy remains the gold standard in diagnosing celiac disease (CD); however, the wide spectrum of histopathological states and the differential diagnosis of CD still pose a diagnostic problem for pathologists. Recently, Ensari reviewed the literature and proposed an update of the histopathological diagnosis and classification for CD. Materials and Methods: In this study, the histopathological materials of 54 children in whom CD was diagnosed at our hospital were reviewed to compare the previous Marsh and Modified Marsh-Oberhuber classifications with this new proposal. Results: In this study, we show that the Ensari classification is as accurate as the Marsh and Modified Marsh classifications in describing the consecutive states of mucosal damage seen in CD. Conclusions: Ensari's classification is simple, practical and facilitative in diagnosing and subtyping the mucosal pathology of CD.

  7. Dynamics of random Boolean networks under fully asynchronous stochastic update based on linear representation.

    Directory of Open Access Journals (Sweden)

    Chao Luo

    Full Text Available A novel algebraic approach is proposed to study the dynamics of asynchronous random Boolean networks in which a random number of nodes can be updated at each time step (ARBNs). In this article, the logical equations of ARBNs are converted into a discrete-time linear representation and the dynamical behaviors of the systems are investigated. We provide a general formula for the network transition matrices of ARBNs as well as a necessary and sufficient algebraic criterion to determine whether a group of given states composes an attractor of a given length in ARBNs. Consequently, algorithms are obtained to find all of the attractors and basins in ARBNs. Examples are shown to demonstrate the feasibility of the proposed scheme.
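
    As a brute-force illustration of the update scheme described above, the sketch below builds the full asynchronous transition graph of a tiny Boolean network (any non-empty subset of nodes may be updated in one step) and reads off its attractors. The three update rules are arbitrary examples, not from the paper, and networkx's attracting components stand in for the paper's algebraic (linear-representation) criterion.

```python
# Tiny fully asynchronous Boolean network: enumerate all transitions where any
# non-empty subset of nodes is updated, then find the attractors.
from itertools import combinations, product
import networkx as nx

rules = [
    lambda s: s[1] and not s[2],   # update rule for node 0 (arbitrary example)
    lambda s: s[0] or s[2],        # update rule for node 1
    lambda s: not s[0],            # update rule for node 2
]
n = len(rules)

G = nx.DiGraph()
for state in product([0, 1], repeat=n):
    target = tuple(int(rules[i](state)) for i in range(n))      # fully updated state
    for k in range(1, n + 1):
        for subset in combinations(range(n), k):                # nodes updated this step
            nxt = tuple(target[i] if i in subset else state[i] for i in range(n))
            G.add_edge(state, nxt)

attractors = list(nx.attracting_components(G))   # terminal strongly connected components
print("attractors:", attractors)
```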

  8. Sugammadex: An Update

    Directory of Open Access Journals (Sweden)

    Ezri Tiberiu

    2016-01-01

    Full Text Available The purpose of this update is to provide recent knowledge and debates regarding the use of sugammadex in the fields of anesthesia and critical care. The review is not intended to provide a comprehensive description of sugammadex and its clinical use.

  9. A 2D/1D coupling neutron transport method based on the matrix MOC and NEM methods

    International Nuclear Information System (INIS)

    Zhang, H.; Zheng, Y.; Wu, H.; Cao, L.

    2013-01-01

    A new 2D/1D coupling method based on the matrix MOC method (MMOC) and nodal expansion method (NEM) is proposed for solving the three-dimensional heterogeneous neutron transport problem. The MMOC method, used for the radial two-dimensional calculation, constructs a response matrix between source and flux with only one sweep and then solves the linear system by using the restarted GMRES algorithm instead of the traditional trajectory sweeping process during the within-group iteration for the angular flux update. Long characteristics are generated by customizing the commercial software AutoCAD. A one-dimensional diffusion calculation is carried out in the axial direction by employing the NEM method. The 2D and 1D solutions are coupled through the transverse leakage terms. The 3D CMFD method is used to ensure the global neutron balance and adjust the different convergence properties of the radial and axial solvers. A computational code is developed based on these theories. Two benchmarks are calculated to verify the coupling method and the code. It is observed that the corresponding numerical results agree well with references, which indicates that the new method is capable of solving the 3D heterogeneous neutron transport problem directly. (authors)

  10. A 2D/1D coupling neutron transport method based on the matrix MOC and NEM methods

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, H.; Zheng, Y.; Wu, H.; Cao, L. [School of Nuclear Science and Technology, Xi' an Jiaotong University, No. 28, Xianning West Road, Xi' an, Shaanxi 710049 (China)

    2013-07-01

    A new 2D/1D coupling method based on the matrix MOC method (MMOC) and nodal expansion method (NEM) is proposed for solving the three-dimensional heterogeneous neutron transport problem. The MMOC method, used for the radial two-dimensional calculation, constructs a response matrix between source and flux with only one sweep and then solves the linear system by using the restarted GMRES algorithm instead of the traditional trajectory sweeping process during the within-group iteration for the angular flux update. Long characteristics are generated by customizing the commercial software AutoCAD. A one-dimensional diffusion calculation is carried out in the axial direction by employing the NEM method. The 2D and 1D solutions are coupled through the transverse leakage terms. The 3D CMFD method is used to ensure the global neutron balance and adjust the different convergence properties of the radial and axial solvers. A computational code is developed based on these theories. Two benchmarks are calculated to verify the coupling method and the code. It is observed that the corresponding numerical results agree well with references, which indicates that the new method is capable of solving the 3D heterogeneous neutron transport problem directly. (authors)
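
    The linear-algebra step these two records describe, solving the within-group system built from a response matrix with restarted GMRES rather than repeated transport sweeps, can be sketched as below. The small random system merely stands in for the response-matrix system; it does not reproduce the MMOC construction.

```python
# Illustrative use of restarted GMRES on a stand-in for the within-group
# response-matrix linear system (the matrix and source are placeholders).
import numpy as np
from scipy.sparse.linalg import gmres, LinearOperator

rng = np.random.default_rng(42)
n = 200
R = np.eye(n) + 0.1 * rng.random((n, n)) / n    # stand-in for the system matrix
b = rng.random(n)                               # stand-in for the fixed source term

A = LinearOperator((n, n), matvec=lambda x: R @ x)
flux, info = gmres(A, b, restart=30)            # restarted GMRES solve
print("converged:", info == 0, "residual:", np.linalg.norm(R @ flux - b))
```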

  11. Non-linear Bayesian update of PCE coefficients

    KAUST Repository

    Litvinenko, Alexander

    2014-01-06

    Given: a physical system modeled by a PDE or ODE with uncertain coefficient q(ω), and a measurement operator Y(u(q), q), where u(q, ω) is the uncertain solution. Aim: to identify q(ω). The mapping from parameters to observations is usually not invertible, hence this inverse identification problem is generally ill-posed. To identify q(ω) we derived a non-linear Bayesian update from the variational problem associated with conditional expectation. To reduce the cost of the Bayesian update we offer a functional approximation, e.g. polynomial chaos expansion (PCE). New: we apply the Bayesian update to the PCE coefficients of the random coefficient q(ω) (not to the probability density function of q).

  12. Non-linear Bayesian update of PCE coefficients

    KAUST Repository

    Litvinenko, Alexander; Matthies, Hermann G.; Pojonk, Oliver; Rosic, Bojana V.; Zander, Elmar

    2014-01-01

    Given: a physical system modeled by a PDE or ODE with uncertain coefficient q(ω), and a measurement operator Y(u(q), q), where u(q, ω) is the uncertain solution. Aim: to identify q(ω). The mapping from parameters to observations is usually not invertible, hence this inverse identification problem is generally ill-posed. To identify q(ω) we derived a non-linear Bayesian update from the variational problem associated with conditional expectation. To reduce the cost of the Bayesian update we offer a functional approximation, e.g. polynomial chaos expansion (PCE). New: we apply the Bayesian update to the PCE coefficients of the random coefficient q(ω) (not to the probability density function of q).

  13. Topology optimization based on the harmony search method

    International Nuclear Information System (INIS)

    Lee, Seung-Min; Han, Seog-Young

    2017-01-01

    A new topology optimization scheme based on a Harmony search (HS) as a metaheuristic method was proposed and applied to static stiffness topology optimization problems. To apply the HS to topology optimization, the variables in HS were transformed to those in topology optimization. Compliance was used as an objective function, and harmony memory was defined as the set of the optimized topology. Also, a parametric study for Harmony memory considering rate (HMCR), Pitch adjusting rate (PAR), and Bandwidth (BW) was performed to find the appropriate range for topology optimization. Various techniques were employed such as a filtering scheme, simple average scheme and harmony rate. To provide a robust optimized topology, the concept of the harmony rate update rule was also implemented. Numerical examples are provided to verify the effectiveness of the HS by comparing the optimal layouts of the HS with those of Bidirectional evolutionary structural optimization (BESO) and Artificial bee colony algorithm (ABCA). The following conclusions could be made: (1) The proposed topology scheme is very effective for static stiffness topology optimization problems in terms of stability, robustness and convergence rate. (2) The suggested method provides a symmetric optimized topology despite the fact that the HS is a stochastic method like the ABCA. (3) The proposed scheme is applicable and practical in manufacturing since it produces a solid-void design of the optimized topology. (4) The suggested method appears to be very effective for large scale problems like topology optimization.

  14. Topology optimization based on the harmony search method

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung-Min; Han, Seog-Young [Hanyang University, Seoul (Korea, Republic of)

    2017-06-15

    A new topology optimization scheme based on a Harmony search (HS) as a metaheuristic method was proposed and applied to static stiffness topology optimization problems. To apply the HS to topology optimization, the variables in HS were transformed to those in topology optimization. Compliance was used as an objective function, and harmony memory was defined as the set of the optimized topology. Also, a parametric study for Harmony memory considering rate (HMCR), Pitch adjusting rate (PAR), and Bandwidth (BW) was performed to find the appropriate range for topology optimization. Various techniques were employed such as a filtering scheme, simple average scheme and harmony rate. To provide a robust optimized topology, the concept of the harmony rate update rule was also implemented. Numerical examples are provided to verify the effectiveness of the HS by comparing the optimal layouts of the HS with those of Bidirectional evolutionary structural optimization (BESO) and Artificial bee colony algorithm (ABCA). The following conclusions could be made: (1) The proposed topology scheme is very effective for static stiffness topology optimization problems in terms of stability, robustness and convergence rate. (2) The suggested method provides a symmetric optimized topology despite the fact that the HS is a stochastic method like the ABCA. (3) The proposed scheme is applicable and practical in manufacturing since it produces a solid-void design of the optimized topology. (4) The suggested method appears to be very effective for large scale problems like topology optimization.
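
    To illustrate the HMCR, PAR, and BW parameters discussed in these two records, the sketch below runs a minimal harmony search on a toy continuous minimization problem. The sphere objective stands in for the compliance objective and the topology design variables of the paper; all parameter values are illustrative.

```python
# Minimal harmony search sketch on a toy problem (sphere function as a
# stand-in objective); HMCR/PAR/BW values are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
dim, hm_size, iters = 5, 20, 2000
hmcr, par, bw = 0.9, 0.3, 0.05          # memory considering rate, pitch adjust, bandwidth
lower, upper = -1.0, 1.0

def objective(x):                        # stand-in for the compliance objective
    return float(np.sum(x ** 2))

hm = rng.uniform(lower, upper, size=(hm_size, dim))      # harmony memory
cost = np.array([objective(x) for x in hm])

for _ in range(iters):
    new = np.empty(dim)
    for j in range(dim):
        if rng.random() < hmcr:                          # take value from memory
            new[j] = hm[rng.integers(hm_size), j]
            if rng.random() < par:                       # pitch adjustment
                new[j] += bw * rng.uniform(-1, 1)
        else:                                            # random value within bounds
            new[j] = rng.uniform(lower, upper)
    new = np.clip(new, lower, upper)
    f = objective(new)
    worst = int(np.argmax(cost))
    if f < cost[worst]:                                  # replace the worst harmony
        hm[worst], cost[worst] = new, f

best = hm[int(np.argmin(cost))]
print("best cost:", cost.min(), "best solution:", best)
```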

  15. The Role of the Oculomotor System in Updating Visual-Spatial Working Memory across Saccades

    OpenAIRE

    Boon, Paul J.; Belopolsky, Artem V.; Theeuwes, Jan

    2016-01-01

    Visual-spatial working memory (VSWM) helps us to maintain and manipulate visual information in the absence of sensory input. It has been proposed that VSWM is an emergent property of the oculomotor system. In the present study we investigated the role of the oculomotor system in updating of spatial working memory representations across saccades. Participants had to maintain a location in memory while making a saccade to a different location. During the saccade the target was displaced, which ...

  16. Updating and prospective validation of a prognostic model for high sickness absence

    NARCIS (Netherlands)

    Roelen, C.A.M.; Heymans, M.W.; Twisk, J.W.R.; van Rhenen, W.; Pallesen, S.; Bjorvatn, B.; Moen, B.E.; Mageroy, N.

    2015-01-01

    Objectives To further develop and validate a Dutch prognostic model for high sickness absence (SA). Methods Three-wave longitudinal cohort study of 2,059 Norwegian nurses. The Dutch prognostic model was used to predict high SA among Norwegian nurses at wave 2. Subsequently, the model was updated by

  17. Proposed rule package on fracture toughness and thermal annealing requirements and guidance for light water reactor vessels

    International Nuclear Information System (INIS)

    Allen Hiser, J.R.

    1993-01-01

    In the framework of updating and clarification of the fracture toughness and thermal annealing requirements and guidance for light water reactor pressure vessels, proposed revisions concerning the pressurized thermal shock rule, fracture toughness requirements and reactor vessel material surveillance program requirements, are described. A new rule concerning thermal annealing requirements and a draft regulatory guide on 'Format and Content of Application for Approval for Thermal Annealing of RPV' are also proposed

  18. Proposed rule package on fracture toughness and thermal annealing requirements and guidance for light water reactor vessels

    Energy Technology Data Exchange (ETDEWEB)

    Allen Hiser, J R [UKAEA Harwell Lab. (United Kingdom). Engineering Div.

    1994-12-31

    In the framework of updating and clarification of the fracture toughness and thermal annealing requirements and guidance for light water reactor pressure vessels, proposed revisions concerning the pressurized thermal shock rule, fracture toughness requirements and reactor vessel material surveillance program requirements, are described. A new rule concerning thermal annealing requirements and a draft regulatory guide on `Format and Content of Application for Approval for Thermal Annealing of RPV` are also proposed.

  19. Workers Experience Guides Karaoke in Updating Status in Facebook as Interpersonal Relations and Personal Communication with Customers

    OpenAIRE

    Amelia Sari, Kiki; Suprihartini, M.Si, Dra. Taufik

    2016-01-01

    The rapid and practical nature of information technology can bring about changes in behavior and lifestyle. One such development is the emergence of social networks, namely Facebook. Karaoke guides also actively use Facebook to update their status and communicate with customers. Using qualitative methods, this study aims to describe the "Experience of Guides Karaoke Workers when Updating status on Facebook as Interpersonal and Personality Communica...

  20. Updates of CORESTA Recommended Methods after Further Collaborative Studies Carried Out under Both ISO and Health Canada Intense Smoking Regimes

    Directory of Open Access Journals (Sweden)

    Purkis SW

    2014-12-01

    Full Text Available During 2012, three CORESTA Recommended Methods (CRMs) (1-3) were updated to include smoke yield and variability data under both the ISO (4) and the Canadian Intense (CI) (5) smoking regimes. At that time, repeatability and reproducibility data under the CI regime on smoke analytes other than "tar", nicotine and carbon monoxide (6) and tobacco-specific nitrosamines (TSNAs) (7) were not available in the public literature. The subsequent work involved the determination of the mainstream smoke yields of benzo[a]pyrene, selected volatiles (benzene, toluene, 1,3-butadiene, isoprene, acrylonitrile), and selected carbonyls (acetaldehyde, formaldehyde, propionaldehyde, butyraldehyde, crotonaldehyde, acrolein, acetone and 2-butanone) in ten cigarette products, followed by statistical analyses according to the ISO protocol (8). This paper provides some additional perspective on the data variability under the ISO and CI smoking regimes not given in the CRMs.