WorldWideScience

Sample records for model updating procedures

  1. PSA Update Procedures, an Ultimate Need for Living PSA

    International Nuclear Information System (INIS)

    Hegedus, D.

    1998-01-01

    Nuclear facilities, by their complex nature, change with time. These changes can be physical (plant modifications, etc.), operational (enhanced procedures, etc.) and organizational. In addition, there are also changes in our understanding of the plant, due to operational experience, data collection, technology enhancements, etc. Therefore, it is imperative that the PSA model be frequently updated or modified to reflect these changes. Over the last ten years, there has been a remarkable growth in the use of Probabilistic Safety Assessments (PSAs). The most rapidly growing area of PSA applications is their use to support operational decision-making. Many of these applications are characterized by the potential not only for improving the safety level but also for providing guidance on the optimal use of resources and reducing regulatory burden. To enable a wider use of the PSA model as a tool for safety activities it is essential to maintain the model in a controlled state. Moreover, to fulfill the requirements for a 'Living PSA', the PSA model has to be constantly updated and/or monitored to reflect the current plant configuration. It should be noted that the PSA model should not only represent the plant design but should also represent the operational and emergency procedures. To keep the PSA model up-to-date several issues should be clearly defined, including: - responsibility should be divided within the PSA group, - procedures for implementing changes should be established, and - QA requirements/programs should be established to assure documentation and reporting. (author)

  2. Empirical testing of forecast update procedure for seasonal products

    DEFF Research Database (Denmark)

    Wong, Chee Yew; Johansen, John

    2008-01-01

    Updating of forecasts is essential for successful collaborative forecasting, especially for seasonal products. This paper discusses the results of a theoretical simulation and an empirical test of a proposed time-series forecast updating procedure. It involves a two-stage longitudinal case study...... of a toy supply chain. The theoretical simulation involves historical weekly consumer demand data for 122 toy products. The empirical test is then carried out in real-time with 291 toy products. The results show that the proposed forecast updating procedure: 1) reduced forecast errors of the annual...... provided less forecast accuracy improvement and it needed a longer time to achieve relatively acceptable forecast uncertainty....

  3. The update of the accounting procedures in Agricultural Cooperatives.

    Directory of Open Access Journals (Sweden)

    Rafael Enrique Viña Echevarría

    2014-06-01

    Full Text Available As part of the implementation of internal control in agricultural cooperatives under the standards established by the General Controller of the Republic, and the harmonization of accounting procedures with Cuban Accounting Standards, it is necessary to update the accounting procedure manuals that guide and regulate flows, timing and registration bases, considering current legislation; this is the purpose of the discussion in this investigation. The results focus on the organizational dynamics of cooperatives, serving the agricultural cooperative sector and its relation to internal control and accounting management, guided by the economic and social policy guidelines of the Party and the Revolution, as well as on updating the procedure manuals. The study also showed limitations in the application of internal control and accounting procedures according to the current regulations in Cuba, expressing the need to continue their development.

  4. Finite element model updating in structural dynamics using design sensitivity and optimisation

    OpenAIRE

    Calvi, Adriano

    1998-01-01

    Model updating is an important issue in engineering. In fact a well-correlated model provides for accurate evaluation of the structure loads and responses. The main objectives of the study were to exploit available optimisation programs to create an error localisation and updating procedure of finite element models that minimises the "error" between experimental and analytical modal data, addressing in particular the updating of large scale finite element models with se...

  5. A review on model updating of joint structure for dynamic analysis purpose

    Directory of Open Access Journals (Sweden)

    Zahari S.N.

    2016-01-01

    Full Text Available Structural joints provide connections between structural elements (beam, plate, etc.) in order to construct a whole assembled structure. There are many types of structural joints, such as bolted joints, riveted joints and welded joints. Joint structures contribute significantly to the structural stiffness and dynamic behaviour of structures; hence the main objectives of this paper are to review methods of model updating for joint structures and to discuss guidelines for performing model updating for dynamic analysis purposes. This review first outlines some of the existing finite element modelling work on joint structures. Experimental modal analysis is the next step, used to obtain modal parameters (natural frequency and mode shape) to validate and reduce the discrepancy between results obtained from experiment and their simulation counterparts. Model updating is then carried out to minimize the differences between the two sets of results. There are two methods of model updating: the direct method and the iterative method. Sensitivity analysis is employed using SOL200 in NASTRAN, selecting suitable updating parameters to avoid ill-conditioning problems. It is best to consider both geometrical and material properties in the updating procedure rather than choosing only a number of geometrical properties alone. The iterative method was identified as the preferred model updating procedure because the physical meaning of the updated parameters is guaranteed, although this method requires more computational effort compared to the direct method.
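
    As an illustrative sketch of the iterative (sensitivity-based) updating method described above, the following hedged Python example corrects the stiffnesses of a toy two-degree-of-freedom model by Gauss-Newton iteration on a finite-difference sensitivity matrix. The system, parameter values and convergence settings are invented for illustration; a production workflow would obtain sensitivities from, e.g., NASTRAN SOL200 rather than by finite differences.

```python
import numpy as np

def natural_freqs(k1, k2, m1=1.0, m2=1.0):
    # Natural frequencies (Hz) of a 2-DOF chain: wall --k1-- m1 --k2-- m2
    K = np.array([[k1 + k2, -k2], [-k2, k2]])
    M = np.diag([m1, m2])
    lam = np.linalg.eigvals(np.linalg.solve(M, K))
    return np.sqrt(np.sort(lam.real)) / (2.0 * np.pi)

# "Measured" frequencies generated from the true stiffnesses to be recovered
f_meas = natural_freqs(1200.0, 800.0)

theta = np.array([1000.0, 1000.0])    # initial (wrong) stiffness estimates
for _ in range(20):
    r = f_meas - natural_freqs(*theta)            # frequency residuals
    S = np.empty((2, 2))                          # finite-difference sensitivities
    for j in range(2):
        dt = np.zeros(2)
        dt[j] = 1e-4 * theta[j]
        S[:, j] = (natural_freqs(*(theta + dt)) - natural_freqs(*theta)) / dt[j]
    theta = theta + np.linalg.lstsq(S, r, rcond=None)[0]   # Gauss-Newton step

print(np.round(theta, 1))   # recovers approximately [1200., 800.]
```

    Because the physical parameters (the stiffnesses) are updated directly, the corrected values retain their physical meaning, which is the advantage of the iterative method noted in the abstract.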

  6. Analogous Mechanisms of Selection and Updating in Declarative and Procedural Working Memory: Experiments and a Computational Model

    Science.gov (United States)

    Oberauer, Klaus; Souza, Alessandra S.; Druey, Michel D.; Gade, Miriam

    2013-01-01

    The article investigates the mechanisms of selecting and updating representations in declarative and procedural working memory (WM). Declarative WM holds the objects of thought available, whereas procedural WM holds representations of what to do with these objects. Both systems consist of three embedded components: activated long-term memory, a…

  7. Fast model updating coupling Bayesian inference and PGD model reduction

    Science.gov (United States)

    Rubio, Paul-Baptiste; Louf, François; Chamoin, Ludovic

    2018-04-01

    The paper focuses on a coupled Bayesian-Proper Generalized Decomposition (PGD) approach for the real-time identification and updating of numerical models. The purpose is to use the most general case of Bayesian inference theory in order to address inverse problems and to deal with different sources of uncertainty (measurement and model errors, stochastic parameters). In order to do so at a reasonable CPU cost, the idea is to replace the direct model called during Monte Carlo sampling with a PGD reduced model, and in some cases to compute the probability density functions directly from the obtained analytical formulation. This procedure is first applied to a welding control example with the updating of a deterministic parameter. In the second application, the identification of a stochastic parameter is studied through a glued assembly example.
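
    The offline/online split described above can be mimicked in a few lines: a cheap surrogate (here an ordinary polynomial fit, standing in for a PGD reduced model) replaces the direct model, and the posterior density of a single deterministic parameter is then evaluated directly on a grid. The forward model, noise level and prior bounds below are invented for illustration only.

```python
import numpy as np

# Forward model standing in for an expensive FE solve: deflection vs. a stiffness-like p
def full_model(p):
    return 1.0 / p

# Offline stage: fit a cheap surrogate once (playing the role of the PGD reduced model)
p_grid = np.linspace(0.5, 3.0, 30)
coef = np.polyfit(p_grid, full_model(p_grid), 7)
surrogate = lambda p: np.polyval(coef, p)

# Synthetic measurement of the response at the true parameter, with Gaussian noise
rng = np.random.default_rng(0)
p_true, sigma = 1.3, 0.005
y_obs = full_model(p_true) + rng.normal(0.0, sigma)

# Online stage: the unnormalised posterior is evaluated on a fine grid
# (uniform prior on [0.5, 3]); every evaluation hits only the cheap surrogate
p_fine = np.linspace(0.5, 3.0, 2001)
post = np.exp(-0.5 * ((y_obs - surrogate(p_fine)) / sigma) ** 2)
post /= post.sum() * (p_fine[1] - p_fine[0])   # normalise to a discrete density

p_map = p_fine[np.argmax(post)]
print(p_map)   # close to p_true = 1.3
```

    The expensive model is called only during the offline fit; the online identification touches the surrogate alone, which is what makes the real-time setting feasible.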

  8. Updated Lagrangian finite element formulations of various biological soft tissue non-linear material models: a comprehensive procedure and review.

    Science.gov (United States)

    Townsend, Molly T; Sarigul-Klijn, Nesrin

    2016-01-01

    Simplified material models are commonly used in computational simulation of biological soft tissue as an approximation of the complicated material response and to minimize computational resources. However, the simulation of complex loadings, such as long-duration tissue swelling, necessitates complex models that are not easy to formulate. This paper strives to offer a comprehensive updated Lagrangian formulation procedure for various non-linear material models for the finite element analysis of biological soft tissues, including a definition of the Cauchy stress and the spatial tangential stiffness. The relationships between water content, osmotic pressure, ionic concentration and the pore pressure stress of the tissue are discussed, along with the merits of these models and their applications.

  9. On-line Bayesian model updating for structural health monitoring

    Science.gov (United States)

    Rocchetta, Roberto; Broggi, Matteo; Huchet, Quentin; Patelli, Edoardo

    2018-03-01

    Fatigue-induced cracking is a dangerous failure mechanism which affects mechanical components subject to alternating load cycles. System health monitoring should be adopted to identify cracks which can jeopardise the structure. Real-time damage detection may fail to identify cracks due to different sources of uncertainty which have been poorly assessed or even fully neglected. In this paper, a novel efficient and robust procedure is used for the detection of crack locations and lengths in mechanical components. A Bayesian model updating framework is employed, which allows accounting for relevant sources of uncertainty. The idea underpinning the approach is to identify the most probable crack consistent with the experimental measurements. To tackle the computational cost of the Bayesian approach, an emulator is adopted to replace the computationally costly finite element model. To improve the overall robustness of the procedure, different numerical likelihoods, measurement noises and imprecisions in the values of model parameters are analysed and their effects quantified. The accuracy of the stochastic updating and the efficiency of the numerical procedure are discussed. An experimental aluminium frame and a numerical model of a typical car suspension arm are used to demonstrate the applicability of the approach.

  10. Finite element model updating of natural fibre reinforced composite structure in structural dynamics

    Directory of Open Access Journals (Sweden)

    Sani M.S.M.

    2016-01-01

    Full Text Available Model updating is a process of adjusting certain parameters of a finite element model in order to reduce the discrepancy between analytical predictions of the finite element (FE) model and experimental results. Finite element model updating is considered an important field of study, as practical applications of the finite element method often show discrepancies with test results. The aim of this research is to perform a model updating procedure on a composite structure, as well as to improve the presumed geometrical and material properties of the tested composite structure in the finite element prediction. The composite structure concerned in this study is a kenaf fiber reinforced epoxy plate. Modal properties (natural frequency, mode shapes, and damping ratio) of the kenaf fiber structure are determined using both experimental modal analysis (EMA) and finite element analysis (FEA). In EMA, modal testing is carried out using an impact hammer test, while normal mode analysis in FEA is carried out using MSC Nastran/Patran software. Correlation of the data is carried out before optimizing the data from FEA. Several parameters are considered and selected for the model updating procedure.
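
    A common way to carry out the correlation step mentioned above is the Modal Assurance Criterion (MAC) between experimental and analytical mode shapes. The abstract does not name the specific criterion used, so the following sketch is only a plausible illustration with made-up mode shapes.

```python
import numpy as np

def mac(phi_a, phi_e):
    """Modal Assurance Criterion between analytical and experimental mode shapes.
    phi_a, phi_e: (n_dof, n_modes) arrays; returns an (n_modes, n_modes) MAC matrix."""
    num = np.abs(phi_a.T @ phi_e) ** 2
    den = np.outer(np.sum(phi_a * phi_a, axis=0), np.sum(phi_e * phi_e, axis=0))
    return num / den

# Two analytical modes vs. slightly noisy "experimental" versions of the same shapes
rng = np.random.default_rng(1)
phi_fe = np.array([[1.0, 1.0], [2.0, -1.0], [3.0, 0.5]])
phi_ema = phi_fe + 0.01 * rng.standard_normal(phi_fe.shape)
M = mac(phi_fe, phi_ema)
print(np.round(M, 3))   # diagonal near 1 (well-paired modes), off-diagonal small
```

    A MAC diagonal close to one indicates that the FE and EMA mode shapes are well paired before any updating parameters are optimized.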

  11. Updating of a dynamic finite element model from the Hualien scale model reactor building

    International Nuclear Information System (INIS)

    Billet, L.; Moine, P.; Lebailly, P.

    1996-08-01

    The forces occurring at the soil-structure interface of a building generally have a large influence on the way the building reacts to an earthquake. One can be tempted to characterise these forces more accurately by updating a model of the structure. However, this procedure requires an updating method suitable for dissipative models, since significant damping can be observed at the soil-structure interface of buildings. Such a method is presented here. It is based on the minimization of a mechanical energy built from the difference between eigendata calculated by the model and eigendata issued from experimental tests on the real structure. An experimental validation of this method is then proposed on a model of the HUALIEN scale-model reactor building. This scale model, built on the HUALIEN site in TAIWAN, is devoted to the study of soil-structure interaction. The updating concerned the soil impedances, modelled by a layer of springs and viscous dampers attached to the building foundation. A good agreement was found between the eigenmodes and dynamic responses calculated by the updated model and the corresponding experimental data. (authors). 12 refs., 3 figs., 4 tabs

  12. Standard Review Plan Update and Development Program. Implementing Procedures Document

    Energy Technology Data Exchange (ETDEWEB)

    1992-05-01

    This implementing procedures document (IPD) was prepared for use in implementing tasks under the standard review plan update and development program (SRP-UDP). The IPD provides comprehensive guidance and detailed procedures for SRP-UDP tasks. The IPD is mandatory for contractors performing work for the SRP-UDP. It is guidance for the staff. At the completion of the SRP-UDP, the IPD will be revised (to remove the UDP aspects) and will replace NRR Office Letter No. 800 as long-term maintenance procedures.

  13. Dynamic model updating based on strain mode shape and natural frequency using hybrid pattern search technique

    Science.gov (United States)

    Guo, Ning; Yang, Zhichun; Wang, Le; Ouyang, Yan; Zhang, Xinping

    2018-05-01

    Aiming to provide a precise dynamic structural finite element (FE) model for dynamic strength evaluation in addition to dynamic analysis, a dynamic FE model updating method is presented to correct the uncertain parameters of the FE model of a structure using strain mode shapes and natural frequencies. The strain mode shape, which is sensitive to local changes in the structure, is used instead of the displacement mode shape to enhance model updating. The coordinate strain modal assurance criterion is developed to evaluate the correlation level at each coordinate between the experimental and the analytical strain mode shapes. Moreover, the natural frequencies, which provide global information about the structure, are used to guarantee the accuracy of the modal properties of the global model. The weighted summation of the natural frequency residual and the coordinate strain modal assurance criterion residual is then used as the objective function in the proposed dynamic FE model updating procedure. A hybrid genetic/pattern-search optimization algorithm is adopted to perform the updating. Numerical simulation and a model updating experiment on a clamped-clamped beam are performed to validate the feasibility and effectiveness of the present method. The results show that the proposed method can update the uncertain parameters with good robustness, and that the updated dynamic FE model of the beam structure, which correctly predicts both the natural frequencies and the local dynamic strains, is reliable for subsequent dynamic analysis and dynamic strength evaluation.
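
    A minimal sketch of a weighted objective of the kind described above might combine a relative natural-frequency residual with a COMAC-style per-coordinate correlation of strain mode shapes. The exact definition of the paper's coordinate strain modal assurance criterion may differ; the shapes, frequencies and weights below are invented.

```python
import numpy as np

def updating_objective(f_num, f_exp, psi_num, psi_exp, w_f=1.0, w_s=1.0):
    # Relative natural-frequency residual (global information)
    r_freq = np.sum(((np.asarray(f_num) - np.asarray(f_exp)) / np.asarray(f_exp)) ** 2)
    # COMAC-style per-coordinate correlation of strain mode shapes (local
    # information); rows of psi_* are coordinates, columns are modes
    num = np.sum(psi_num * psi_exp, axis=1) ** 2
    den = np.sum(psi_num**2, axis=1) * np.sum(psi_exp**2, axis=1)
    r_strain = np.sum(1.0 - num / den)
    return w_f * r_freq + w_s * r_strain

# A perfect match gives zero; perturbing the frequencies gives a positive value
f_exp = np.array([12.1, 33.4, 61.0])
psi_exp = np.array([[0.2, 1.0], [0.9, -0.4], [1.0, 0.3]])
print(updating_objective(f_exp, f_exp, psi_exp, psi_exp))            # → 0.0
print(updating_objective(f_exp * 1.05, f_exp, psi_exp, psi_exp) > 0) # → True
```

    An optimizer (such as the hybrid genetic/pattern-search algorithm of the paper) would then drive this objective toward zero over the uncertain model parameters.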

  14. Updating Small Generator Interconnection Procedures for New Market Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Coddington, M.; Fox, K.; Stanfield, S.; Varnado, L.; Culley, T.; Sheehan, M.

    2012-12-01

    Federal and state regulators are faced with the challenge of keeping interconnection procedures updated against a backdrop of evolving technology, new codes and standards, and considerably transformed market conditions. This report is intended to educate policymakers and stakeholders on beneficial reforms that will keep interconnection processes efficient and cost-effective while maintaining a safe and reliable power system.

  15. Finite element model updating of a small steel frame using neural networks

    International Nuclear Information System (INIS)

    Zapico, J L; González, M P; Alonso, R; González-Buelga, A

    2008-01-01

    This paper presents an experimental and analytical dynamic study of a small-scale steel frame. The experimental model was physically built and dynamically tested on a shaking table in a series of different configurations obtained from the original one by changing the mass and by causing structural damage. Finite element modelling and parameterization with physical meaning is iteratively tried for the original undamaged configuration. The finite element model is updated through a neural network, the natural frequencies of the model being the net input. The updating process is made more accurate and robust by using a regressive procedure, which constitutes an original contribution of this work. A novel simplified analytical model has been developed to evaluate the reduction of bending stiffness of the elements due to damage. The experimental results of the rest of the configurations have been used to validate both the updated finite element model and the analytical one. The statistical properties of the identified modal data are evaluated. From these, the statistical properties and a confidence interval for the estimated model parameters are obtained by using the Latin Hypercube sampling technique. The results obtained are successful: the updated model accurately reproduces the low modes identified experimentally for all configurations, and the statistical study of the transmission of errors yields a narrow confidence interval for all the identified parameters

  16. Model parameter updating using Bayesian networks

    International Nuclear Information System (INIS)

    Treml, C.A.; Ross, Timothy J.

    2004-01-01

    This paper outlines a model parameter updating technique for a new method of model validation using a modified model reference adaptive control (MRAC) framework with Bayesian Networks (BNs). The model parameter updating within this method is generic in the sense that the model/simulation to be validated is treated as a black box. It must have updateable parameters to which its outputs are sensitive, and those outputs must have metrics that can be compared to that of the model reference, i.e., experimental data. Furthermore, no assumptions are made about the statistics of the model parameter uncertainty, only upper and lower bounds need to be specified. This method is designed for situations where a model is not intended to predict a complete point-by-point time domain description of the item/system behavior; rather, there are specific points, features, or events of interest that need to be predicted. These specific points are compared to the model reference derived from actual experimental data. The logic for updating the model parameters to match the model reference is formed via a BN. The nodes of this BN consist of updateable model input parameters and the specific output values or features of interest. Each time the model is executed, the input/output pairs are used to adapt the conditional probabilities of the BN. Each iteration further refines the inferred model parameters to produce the desired model output. After parameter updating is complete and model inputs are inferred, reliabilities for the model output are supplied. Finally, this method is applied to a simulation of a resonance control cooling system for a prototype coupled cavity linac. The results are compared to experimental data.

  17. Estimation of beam material random field properties via sensitivity-based model updating using experimental frequency response functions

    Science.gov (United States)

    Machado, M. R.; Adhikari, S.; Dos Santos, J. M. C.; Arruda, J. R. F.

    2018-03-01

    Structural parameter estimation is affected not only by measurement noise but also by unknown uncertainties which are present in the system. Deterministic structural model updating methods minimise the difference between experimentally measured data and computational prediction. Sensitivity-based methods are very efficient in solving structural model updating problems. Material and geometrical parameters of the structure such as Poisson's ratio, Young's modulus, mass density, modal damping, etc. are usually considered deterministic and homogeneous. In this paper, the distributed and non-homogeneous characteristics of these parameters are considered in the model updating. The parameters are taken as spatially correlated random fields and are expanded in a spectral Karhunen-Loève (KL) decomposition. Using the KL expansion, the spectral dynamic stiffness matrix of the beam is expanded as a series in terms of discretized parameters, which can be estimated using sensitivity-based model updating techniques. Numerical and experimental tests involving a beam with distributed bending rigidity and mass density are used to verify the proposed method. This extension of standard model updating procedures can enhance the dynamic description of structural dynamic models.
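
    The KL machinery sketched above can be illustrated numerically: discretise an assumed exponential covariance kernel over the beam, take its leading eigenpairs, and synthesise one realisation of the random field. The kernel, correlation length and coefficient of variation are assumptions for illustration, not the paper's values.

```python
import numpy as np

# Discretise a 1 m beam and assume an exponential covariance for the bending
# rigidity fluctuation: 10 % coefficient of variation, 0.3 m correlation length
n, L_c, sigma = 200, 0.3, 0.1
x = np.linspace(0.0, 1.0, n)
C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / L_c)

# Truncated KL expansion: keep the M largest eigenpairs of the covariance matrix
lam, phi = np.linalg.eigh(C)          # eigh returns ascending order
lam, phi = lam[::-1], phi[:, ::-1]    # re-sort descending
M = 6
energy = lam[:M].sum() / lam.sum()    # fraction of total variance captured

# One realisation: EI(x)/EI0 = 1 + sum_k sqrt(lam_k) * xi_k * phi_k(x)
rng = np.random.default_rng(2)
xi = rng.standard_normal(M)
field = 1.0 + phi[:, :M] @ (np.sqrt(lam[:M]) * xi)
print(round(energy, 3))   # only M = 6 terms already capture most of the variance
```

    The discretized coefficients sqrt(lam_k)*xi_k play the role of the updating parameters: a sensitivity-based procedure estimates them from measured frequency response functions instead of sampling them.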

  18. Updating Procedures Can Reorganize the Neural Circuit Supporting a Fear Memory.

    Science.gov (United States)

    Kwapis, Janine L; Jarome, Timothy J; Ferrara, Nicole C; Helmstetter, Fred J

    2017-07-01

    Established memories undergo a period of vulnerability following retrieval, a process termed 'reconsolidation.' Recent work has shown that the hypothetical process of reconsolidation is only triggered when new information is presented during retrieval, suggesting that this process may allow existing memories to be modified. Reconsolidation has received increasing attention as a possible therapeutic target for treating disorders that stem from traumatic memories, yet little is known about how this process changes the original memory. In particular, it is unknown whether reconsolidation can reorganize the neural circuit supporting an existing memory after that memory is modified with new information. Here, we show that trace fear memory undergoes a protein synthesis-dependent reconsolidation process following exposure to a single updating trial of delay conditioning. Further, this reconsolidation-dependent updating process appears to reorganize the neural circuit supporting the trace-trained memory, so that it better reflects the circuit supporting delay fear. Specifically, after a trace-to-delay update session, the amygdala is now required for extinction of the updated memory but the retrosplenial cortex is no longer required for retrieval. These results suggest that updating procedures could be used to force a complex, poorly defined memory circuit to rely on a better-defined neural circuit that may be more amenable to behavioral or pharmacological manipulation. This is the first evidence that exposure to new information can fundamentally reorganize the neural circuit supporting an existing memory.

  19. Updating parameters of the chicken processing line model

    DEFF Research Database (Denmark)

    Kurowicka, Dorota; Nauta, Maarten; Jozwiak, Katarzyna

    2010-01-01

    A mathematical model of chicken processing that quantitatively describes the transmission of Campylobacter on chicken carcasses from slaughter to chicken meat product has been developed in Nauta et al. (2005). This model was quantified with expert judgment. Recent availability of data allows...... updating parameters of the model to better describe processes observed in slaughterhouses. We propose Bayesian updating as a suitable technique to update expert judgment with microbiological data. Berrang and Dickens’s data are used to demonstrate performance of this method in updating parameters...... of the chicken processing line model....

  20. Updating of a finite element model of the Cruas 2 cooling tower

    International Nuclear Information System (INIS)

    Billet, L.

    1994-03-01

    A method based on modal analysis and inversion of a dynamic FEM model is used to detect changes in the dynamic behavior of nuclear plant cooling towers. Prior to detection, it is necessary to build a representative model of the structure. This paper gives details of the CRUAS N. 2 cooling tower modelling and the updating procedure used to match the model to on-site measurements. First, previous numerical and experimental studies of cooling tower vibrations were reviewed. We found that the first eigenfrequencies of cooling towers are very sensitive to the boundary conditions at the top and the bottom of the structure. We then built a beam-and-plate FEM model of the CRUAS N. 2 cooling tower. The first calculated modes were located in the proper frequency band (0.9 Hz - 1.30 Hz) but were not distributed according to the experimental order. We decided to update the numerical model with MADMACS, a model updating software package. It was necessary to: - decrease the shell stiffness by 30%; - increase the top ring stiffness by 300%; - modify the boundary conditions at the bottom by taking into account the soil impedance, in order to obtain a difference of less than 1% between the measured and the corresponding calculated frequencies. The model was then judged to be realistic enough. (author). 23 figs., 13 refs., 1 annex

  1. SFC/SFBMN guidelines update for nuclear cardiology procedures: stress testing in adults and children

    International Nuclear Information System (INIS)

    Manrique, A.; Marie, P.Y.; Maunoury, Ch.; Acar, Ph.; Agostini, D.

    2002-01-01

    The updated guidelines for nuclear cardiology procedures are presented in this article, covering the minimum technical conditions for stress testing practice, the recommendations for the different ischemia activation tests, and the choice of the stress test. (N.C.)

  2. Numerical model updating technique for structures using firefly algorithm

    Science.gov (United States)

    Sai Kubair, K.; Mohan, S. C.

    2018-03-01

    Numerical model updating is a technique for updating existing numerical models of structures in civil, mechanical, automotive, marine, aerospace engineering, etc. The basic concept behind this technique is to update the numerical model so that it closely matches experimental data obtained from real or prototype test structures. The present work involves the development of a numerical model using MATLAB as a computational tool, with mathematical equations that define the experimental model. The firefly algorithm is used as the optimization tool in this study. In this updating process a response parameter of the structure has to be chosen which helps to correlate the numerical model with the experimental results. The variables for the updating can be either material or geometrical properties of the model, or both. In this study, to verify the proposed technique, a cantilever beam is analyzed for its tip deflection and a space frame is analyzed for its natural frequencies. Both models are updated with their respective response values obtained from experimental results. The numerical results after updating show a close relationship between the experimental and the numerical models.
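
    As a hedged sketch of the idea, the following minimal firefly search recovers a Young's modulus from a "measured" cantilever tip deflection, delta = P L^3 / (3 E I). All numerical values and algorithm constants are invented for illustration and are not taken from the paper.

```python
import numpy as np

# Cantilever tip deflection delta = P L^3 / (3 E I); update E to match a
# "measured" deflection generated with E = 70 GPa (illustrative values)
P, L, I = 1000.0, 2.0, 8e-6
delta_meas = P * L**3 / (3 * 70e9 * I)

def cost(E):
    return (P * L**3 / (3 * E * I) - delta_meas) ** 2

rng = np.random.default_rng(3)
n_ff, n_iter = 15, 60
E = rng.uniform(30e9, 120e9, n_ff)        # firefly positions (candidate E values)
beta0, gamma, alpha = 1.0, 1e-21, 1e9     # gamma scaled to (Pa)^2 distances
for _ in range(n_iter):
    f = np.array([cost(e) for e in E])    # brightness = inverse of cost
    for i in range(n_ff):
        for j in range(n_ff):
            if f[j] < f[i]:               # move firefly i toward brighter j
                r2 = (E[i] - E[j]) ** 2
                E[i] += beta0 * np.exp(-gamma * r2) * (E[j] - E[i]) \
                        + alpha * (rng.random() - 0.5)
    alpha *= 0.95                         # cool down the random walk

E_best = E[np.argmin([cost(e) for e in E])]
print(E_best / 1e9)   # close to 70 (GPa)
```

    The same loop applies unchanged to several parameters at once (e.g., E and I together); only the position vector and the distance r2 become multi-dimensional.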

  3. A Kriging Model Based Finite Element Model Updating Method for Damage Detection

    Directory of Open Access Journals (Sweden)

    Xiuming Yang

    2017-10-01

    Full Text Available Model updating is an effective means of damage identification, and surrogate modelling has attracted considerable attention for saving computational cost in finite element (FE) model updating, especially for large-scale structures. In this context, a surrogate model of frequency is normally constructed for damage identification, while the frequency response function (FRF) is rarely used, as it usually changes dramatically with the updating parameters. This paper presents a new surrogate-model-based model updating method that takes advantage of the measured FRFs. The Frequency Domain Assurance Criterion (FDAC) is used to build the objective function, whose nonlinear response surface is constructed by the Kriging model. The efficient global optimization (EGO) algorithm is then introduced to obtain the model updating results. The proposed method has good accuracy and robustness, which have been verified by a numerical simulation of a cantilever and experimental test data from a laboratory three-story structure.
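
    A zero-mean ("simple") kriging predictor with a Gaussian kernel can stand in for the response surface described above. For brevity this sketch minimises the surrogate on a grid rather than running the full EGO loop, and the 1D objective is invented for illustration.

```python
import numpy as np

def kriging_fit(X, y, ell=0.2, s2=1.0, nugget=1e-8):
    # Zero-mean ("simple") kriging with a Gaussian correlation kernel
    K = s2 * np.exp(-0.5 * ((X[:, None] - X[None, :]) / ell) ** 2)
    w = np.linalg.solve(K + nugget * np.eye(len(X)), y)
    return lambda x: (s2 * np.exp(-0.5 * ((x[:, None] - X[None, :]) / ell) ** 2)) @ w

# Hypothetical 1D updating objective with its minimum near theta = 0.6
obj = lambda t: (t - 0.6) ** 2 + 0.05 * np.sin(8.0 * t)
X_train = np.linspace(0.0, 1.5, 12)
model = kriging_fit(X_train, obj(X_train))

# Cheap minimisation on the surrogate (EGO would instead maximise expected
# improvement and add the new sample to the training set iteratively)
t_grid = np.linspace(0.0, 1.5, 601)
t_best = t_grid[np.argmin(model(t_grid))]
print(round(t_best, 2))   # near the true minimiser (~0.58)
```

    In the paper's setting the objective would be an FDAC-based discrepancy between measured and predicted FRFs, and EGO would decide where to evaluate the expensive FE model next.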

  4. Updating QR factorization procedure for solution of linear least squares problem with equality constraints.

    Science.gov (United States)

    Zeb, Salman; Yousaf, Muhammad

    2017-01-01

    In this article, we present a QR updating procedure as a solution approach for the linear least squares problem with equality constraints. We reduce the constrained problem to an unconstrained linear least squares problem and partition it into a small subproblem. The QR factorization of the subproblem is calculated, and updating techniques are then applied to its upper triangular factor R to obtain the solution. We carry out an error analysis of the proposed algorithm to show that it is backward stable. We also illustrate the implementation and accuracy of the proposed algorithm through numerical experiments, with particular emphasis on dense problems.
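
    One standard way to realise the reduction described above is the null-space method: a QR factorization of the constraint matrix eliminates the constraints, leaving a smaller unconstrained least-squares subproblem. The sketch below shows that reduction only, not the paper's R-factor updating techniques.

```python
import numpy as np

def lse_nullspace(A, b, B, d):
    """Solve min ||A x - b||_2 subject to B x = d via QR of B^T (null-space
    method): eliminate the constraints first, then solve a smaller
    unconstrained least-squares subproblem."""
    p = B.shape[0]
    Q, R = np.linalg.qr(B.T, mode="complete")   # B^T = Q [R1; 0]
    R1 = R[:p, :]
    y1 = np.linalg.solve(R1.T, d)               # particular part enforcing B x = d
    Q1, Q2 = Q[:, :p], Q[:, p:]
    # Unconstrained subproblem posed in the null space of B
    y2 = np.linalg.lstsq(A @ Q2, b - A @ Q1 @ y1, rcond=None)[0]
    return Q1 @ y1 + Q2 @ y2

# Small check: fit a line to points while forcing it through (0, 1)
A = np.array([[0.0, 1.0], [1.0, 1.0], [2.0, 1.0], [3.0, 1.0]])  # columns: slope, intercept
b = np.array([0.9, 2.1, 2.9, 4.2])
B = np.array([[0.0, 1.0]])                                      # constraint: intercept = 1
x = lse_nullspace(A, b, B, np.array([1.0]))
print(x)   # intercept is exactly 1.0; the slope fits the remaining freedom
```

    The subproblem `A @ Q2` has only n - p columns; the paper's contribution is to update the QR factors of such subproblems cheaply instead of refactorizing.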

  5. Due date assignment procedures with dynamically updated coefficients for multi-level assembly job shops

    NARCIS (Netherlands)

    Adam, N.R.; Bertrand, J.W.M.; Morehead, D.C.; Surkis, J.

    1993-01-01

    This paper presents a study of due date assignment procedures in job shop environments where multi-level assembly jobs are processed and due dates are internally assigned. Most of the reported studies in the literature have focused on string type jobs. We propose a dynamic update approach (which

  6. Object Tracking Using Adaptive Covariance Descriptor and Clustering-Based Model Updating for Visual Surveillance

    Directory of Open Access Journals (Sweden)

    Lei Qin

    2014-05-01

    Full Text Available We propose a novel approach for tracking an arbitrary object in video sequences for visual surveillance. The first contribution of this work is an automatic feature extraction method that is able to extract compact discriminative features from a feature pool before computing the region covariance descriptor. As the feature extraction method is adaptive to a specific object of interest, we refer to the region covariance descriptor computed using the extracted features as the adaptive covariance descriptor. The second contribution is a weakly supervised method for updating the object appearance model during tracking. The method performs a mean-shift clustering procedure among the tracking result samples accumulated over a period of time and selects a group of reliable samples for updating the object appearance model. As such, the object appearance model is kept up to date and is protected from contamination even in the case of tracking mistakes. We conducted comparative experiments on real-world video sequences, which confirmed the effectiveness of the proposed approaches. The tracking system that integrates the adaptive covariance descriptor and the clustering-based model updating method accomplished stable object tracking on challenging video sequences.
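
    The region covariance descriptor at the heart of the approach can be sketched in a few lines: per-pixel features are stacked and their covariance matrix becomes the region descriptor. The feature set below (coordinates, intensity, gradient magnitudes) is a common textbook choice, not the adaptively selected features of the paper.

```python
import numpy as np

def covariance_descriptor(patch):
    """Region covariance descriptor of a grayscale patch (2D float array):
    covariance of the per-pixel features [x, y, I, |dI/dx|, |dI/dy|]."""
    h, w = patch.shape
    yy, xx = np.mgrid[0:h, 0:w]
    gy, gx = np.gradient(patch)               # gradients along rows and columns
    F = np.stack([xx.ravel(), yy.ravel(), patch.ravel(),
                  np.abs(gx.ravel()), np.abs(gy.ravel())])   # (5, h*w) features
    return np.cov(F)                                          # (5, 5) descriptor

# Descriptors of two patches; identical content gives identical descriptors
rng = np.random.default_rng(4)
p1 = rng.random((16, 16))
C1 = covariance_descriptor(p1)
C2 = covariance_descriptor(p1.copy())
print(C1.shape, np.allclose(C1, C2))   # (5, 5) True
```

    In a tracker, candidate regions are compared to the model descriptor with a metric suited to covariance matrices (covariances live on a Riemannian manifold, so plain Euclidean distance is a poor fit).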

  7. Application of Real Time Models Updating in ABO Central Field

    International Nuclear Information System (INIS)

    Heikal, S.; Adewale, D.; Doghmi, A.; Augustine, U.

    2003-01-01

    ABO central field is the first deep offshore oil production in Nigeria, located in OML 125 (ex-OPL316). The field was developed in a water depth of between 500 and 800 meters. Deep-water development requires much faster data handling and model updates in order to make the best possible technical decisions. This required an easy way to incorporate the latest information and a dynamic update of the reservoir model, enabling real time reservoir management. The paper aims at discussing the benefits of real time static and dynamic model updates and illustrates with a horizontal well example how this updating was beneficial prior to and during the drilling operation, minimizing the project CAPEX. Prior to drilling, a 3D geological model was built based on seismic and offset wells' data. The geological model was updated twice, once after the pilot hole drilling and then after reaching the landing point, prior to drilling the horizontal section. Forward modeling was carried out along the planned trajectory. During the drilling process both geo-steering and LWD data were loaded in real time into the 3D modeling software. The data were analyzed and compared with the predicted model. The location of markers was changed as drilling progressed and the entire 3D geological model was rapidly updated. The target zones were re-evaluated in the light of the new model updates. Recommendations were communicated to the field, and the well trajectory was modified to take into account the new information. The combination of speed, flexibility and update-ability of the 3D modeling software enabled continuous geological model updates, on which the asset team based their trajectory modification decisions throughout the drilling phase. The well was geo-steered through 7 meters thickness of sand.
After the drilling, testing showed excellent results; productivity and fluid properties data were used to update the dynamic model, reviewing the well production plateau and providing optimum reservoir

  8. Robot Visual Tracking via Incremental Self-Updating of Appearance Model

    Directory of Open Access Journals (Sweden)

    Danpei Zhao

    2013-09-01

    Full Text Available This paper proposes a target tracking method called Incremental Self-Updating Visual Tracking for robot platforms. Our tracker treats the tracking problem as a binary classification: the target and the background. The greyscale, HOG and LBP features are used in this work to represent the target and are integrated into a particle filter framework. To track the target over long time sequences, the tracker has to update its model to follow the most recent target. In order to deal with the problems of calculation waste and lack of model-updating strategy with the traditional methods, an intelligent and effective online self-updating strategy is devised to choose the optimal update opportunity. The strategy of updating the appearance model can be achieved based on the change in the discriminative capability between the current frame and the previous updated frame. By adjusting the update step adaptively, severe waste of calculation time for needless updates can be avoided while keeping the stability of the model. Moreover, the appearance model can be kept away from serious drift problems when the target undergoes temporary occlusion. The experimental results show that the proposed tracker can achieve robust and efficient performance in several benchmark-challenging video sequences with various complex environment changes in posture, scale, illumination and occlusion.
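
    A minimal sketch of the kind of update-opportunity test described above: the appearance model is refreshed only when the discriminative capability has dropped noticeably since the last update. The threshold value and the score semantics are hypothetical, not taken from the paper:

```python
def should_update(score_now, score_at_last_update, drop_threshold=0.15):
    """Trigger a model update only when the discriminative score has
    degraded by more than drop_threshold (relative) since the last
    updated frame; stable frames are skipped, avoiding needless
    recomputation."""
    drop = (score_at_last_update - score_now) / max(score_at_last_update, 1e-9)
    return drop > drop_threshold
```

    Gating updates this way also helps keep a temporarily occluded target from being written into the model.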

  9. Updated procedure for the safety evaluation of natural flavor complexes used as ingredients in food

    NARCIS (Netherlands)

    Cohen, Samuel M.; Eisenbrand, Gerhard; Fukushima, Shoji; Gooderham, Nigel J.; Guengerich, F.P.; Hecht, Stephen S.; Rietjens, Ivonne M.C.M.; Davidsen, Jeanne M.; Harman, Christie L.; Taylor, Sean V.

    2018-01-01

    An effective and thorough approach for the safety evaluation of natural flavor complexes (NFCs) was published in 2005 by the Expert Panel of the Flavor and Extract Manufacturers Association (FEMA). An updated procedure is provided here, which maintains the essential concepts of the use of the

  10. State updating of a distributed hydrological model with Ensemble Kalman Filtering: Effects of updating frequency and observation network density on forecast accuracy

    Science.gov (United States)

    Rakovec, O.; Weerts, A.; Hazenberg, P.; Torfs, P.; Uijlenhoet, R.

    2012-12-01

    This paper presents a study on the optimal setup for discharge assimilation within a spatially distributed hydrological model (Rakovec et al., 2012a). The Ensemble Kalman filter (EnKF) is employed to update the grid-based distributed states of such an hourly spatially distributed version of the HBV-96 model. By using a physically based model for the routing, the time delay and attenuation are modelled more realistically. The discharge and states at a given time step are assumed to be dependent on the previous time step only (Markov property). Synthetic and real world experiments are carried out for the Upper Ourthe (1600 km2), a relatively quickly responding catchment in the Belgian Ardennes. The uncertain precipitation model forcings were obtained using a time-dependent multivariate spatial conditional simulation method (Rakovec et al., 2012b), which is further made conditional on preceding simulations. We assess the impact on the forecasted discharge of (1) various sets of the spatially distributed discharge gauges and (2) the filtering frequency. The results show that the hydrological forecast at the catchment outlet is improved by assimilating interior gauges. This augmentation of the observation vector improves the forecast more than increasing the updating frequency. In terms of the model states, the EnKF procedure is found to mainly change the pdfs of the two routing model storages, even when the uncertainty in the discharge simulations is smaller than the defined observation uncertainty. Rakovec, O., Weerts, A. H., Hazenberg, P., Torfs, P. J. J. F., and Uijlenhoet, R.: State updating of a distributed hydrological model with Ensemble Kalman Filtering: effects of updating frequency and observation network density on forecast accuracy, Hydrol. Earth Syst. Sci. Discuss., 9, 3961-3999, doi:10.5194/hessd-9-3961-2012, 2012a. Rakovec, O., Hazenberg, P., Torfs, P. J. J. F., Weerts, A. H., and Uijlenhoet, R.: Generating spatial precipitation ensembles: impact of
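
    A stochastic (perturbed-observation) EnKF analysis step of the sort used for this state updating can be sketched as follows; the two-state layout and the linear observation operator are illustrative, not the HBV-96 configuration:

```python
import numpy as np

def enkf_update(ensemble, H, y_obs, obs_var, rng):
    """One stochastic EnKF analysis step. 'ensemble' is (n_members,
    n_states) and is updated in place; H maps a state vector to the
    observed quantities (e.g. discharge at the gauges)."""
    n = ensemble.shape[0]
    anomalies = ensemble - ensemble.mean(axis=0)
    P = anomalies.T @ anomalies / (n - 1)           # ensemble covariance
    S = H @ P @ H.T + obs_var * np.eye(len(y_obs))  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                  # Kalman gain
    for i in range(n):
        # perturbed observations keep the analysis spread consistent
        y_pert = y_obs + rng.normal(0.0, np.sqrt(obs_var), size=len(y_obs))
        ensemble[i] += K @ (y_pert - H @ ensemble[i])
    return ensemble
```

    Adding interior gauges corresponds to adding rows to `H` and entries to `y_obs`, which is the augmentation of the observation vector discussed in the abstract.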

  11. Soft sensor modelling by time difference, recursive partial least squares and adaptive model updating

    International Nuclear Information System (INIS)

    Fu, Y; Xu, O; Yang, W; Zhou, L; Wang, J

    2017-01-01

    To investigate time-variant and nonlinear characteristics in industrial processes, a soft sensor modelling method based on time difference, moving-window recursive partial least squares (PLS) and adaptive model updating is proposed. In this method, time difference values of input and output variables are used as training samples to construct the model, which can reduce the effects of the nonlinear characteristic on modelling accuracy and retain the advantages of the recursive PLS algorithm. To avoid an excessively high updating frequency of the model, a confidence value is introduced, which can be updated adaptively according to the results of the model performance assessment. Only once the confidence value is updated can the model be updated. The proposed method has been used to predict the 4-carboxy-benz-aldehyde (CBA) content in the purified terephthalic acid (PTA) oxidation reaction process. The results show that the proposed soft sensor modelling method can reduce computation effectively, improve prediction accuracy by making use of process information and reflect the process characteristics accurately. (paper)

  12. FEM Updating of the Heritage Court Building Structure

    DEFF Research Database (Denmark)

    Ventura, C. E.; Brincker, Rune; Dascotte, E.

    2001-01-01

    This paper describes results of a model updating study conducted on a 15-storey reinforced concrete shear core building. The output-only modal identification results obtained from ambient vibration measurements of the building were used to update a finite element model of the structure. The starting model of the structure was developed from the information provided in the design documentation of the building. Different parameters of the model were then modified using an automated procedure to improve the correlation between measured and calculated modal parameters. Careful attention...

  13. State updating of a distributed hydrological model with Ensemble Kalman Filtering: effects of updating frequency and observation network density on forecast accuracy

    Directory of Open Access Journals (Sweden)

    O. Rakovec

    2012-09-01

    Full Text Available This paper presents a study on the optimal setup for discharge assimilation within a spatially distributed hydrological model. The Ensemble Kalman filter (EnKF) is employed to update the grid-based distributed states of such an hourly spatially distributed version of the HBV-96 model. By using a physically based model for the routing, the time delay and attenuation are modelled more realistically. The discharge and states at a given time step are assumed to be dependent on the previous time step only (Markov property).

    Synthetic and real world experiments are carried out for the Upper Ourthe (1600 km2), a relatively quickly responding catchment in the Belgian Ardennes. We assess the impact on the forecasted discharge of (1) various sets of the spatially distributed discharge gauges and (2) the filtering frequency. The results show that the hydrological forecast at the catchment outlet is improved by assimilating interior gauges. This augmentation of the observation vector improves the forecast more than increasing the updating frequency. In terms of the model states, the EnKF procedure is found to mainly change the pdfs of the two routing model storages, even when the uncertainty in the discharge simulations is smaller than the defined observation uncertainty.

  14. Online updating procedures for a real-time hydrological forecasting system

    International Nuclear Information System (INIS)

    Kahl, B; Nachtnebel, H P

    2008-01-01

    Rainfall-runoff models can explain major parts of the natural runoff pattern but never simulate the observed hydrograph exactly. The errors stem from various sources of uncertainty embedded in the model forecasting system: measurement errors, the selected time period for calibration and validation, parametric uncertainty and model imprecision. In online forecasting systems, forecasted input data are used, which generates an additional major uncertainty for the hydrological forecasting system. Techniques for partially compensating these uncertainties are investigated in the present study in a medium-sized catchment in the Austrian part of the Danube basin. The catchment area is about 1000 km2. The forecasting system consists of a semi-distributed continuous rainfall-runoff model that uses quantitative precipitation and temperature forecasts. To provide adequate system states at the beginning of the forecasting period, continuous simulation is required, especially in winter. In this study two online updating methods are used and combined to enhance the runoff forecasts. The first method updates the system states at the beginning of the forecasting period by changing the precipitation input. The second method is an autoregressive error model, which is used to eliminate systematic errors in the model output. In combination the two methods work well together, as each method is more effective in different runoff situations.
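
    The autoregressive error-model correction (the second method) can be sketched as follows, assuming a first-order model whose coefficient has already been fitted; the value of `phi` is illustrative:

```python
def ar1_error_correction(raw_forecast, obs_at_start, phi=0.8):
    """Correct a runoff forecast with an AR(1) error model: the error
    observed at forecast start decays with lead time as phi**k and is
    added back to the raw model output at each step."""
    err = obs_at_start - raw_forecast[0]
    return [q + (phi ** k) * err for k, q in enumerate(raw_forecast)]
```

    At lead time zero the corrected forecast matches the observation exactly, and the correction fades toward the raw model output as the lead time grows.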

  15. Updating of states in operational hydrological models

    Science.gov (United States)

    Bruland, O.; Kolberg, S.; Engeland, K.; Gragne, A. S.; Liston, G.; Sand, K.; Tøfte, L.; Alfredsen, K.

    2012-04-01

    Operationally, the main purpose of hydrological models is to provide runoff forecasts. The quality of the model state and the accuracy of the weather forecast, together with the model quality, define the runoff forecast quality. Input and model errors accumulate over time and may leave the model in a poor state. Usually model states can be related to observable conditions in the catchment. Updating these states, knowing their relation to observable catchment conditions, directly influences the forecast quality. Norway is at the international forefront of hydropower scheduling on both short and long terms. The inflow forecasts are fundamental to this scheduling. Their quality directly influences the producers' profit as they optimize hydropower production to market demand and at the same time minimize spill of water and maximize available hydraulic head. The quality of the inflow forecasts strongly depends on the quality of the models applied and the quality of the information they use. In this project the focus has been on improving the quality of the model states upon which the forecast is based. Runoff and snow storage are two observable quantities that reflect the model state and are used in this project for updating. Generally the methods used can be divided into three groups: the first re-estimates the forcing data in the updating period; the second alters the weights in the forecast ensemble; and the third directly changes the model states. The uncertainty related to the forcing data through the updating period is due both to uncertainty in the actual observations and to how well the gauging stations represent the catchment with respect to temperature and precipitation. The project looks at methodologies that automatically re-estimate the forcing data and tests the results against the observed response. Model uncertainty is reflected in a joint distribution of model parameters estimated using the DREAM algorithm.

  16. Prediction-error variance in Bayesian model updating: a comparative study

    Science.gov (United States)

    Asadollahi, Parisa; Li, Jian; Huang, Yong

    2017-04-01

    In Bayesian model updating, the likelihood function is commonly formulated by stochastic embedding, in which the maximum information entropy probability model of the prediction error variances plays an important role; it is a Gaussian distribution subject to the first two moments as constraints. The selection of prediction error variances can be formulated as a model class selection problem, which automatically involves a trade-off between the average data-fit of the model class and the information it extracts from the data. Therefore, it is critical for robustness in updating the structural model, especially in the presence of modeling errors. To date, three ways of considering prediction error variances have been seen in the literature: 1) setting constant values empirically, 2) estimating them based on the goodness-of-fit of the measured data, and 3) updating them as uncertain parameters by applying Bayes' Theorem at the model class level. In this paper, the effect of different strategies to deal with the prediction error variances on the model updating performance is investigated explicitly. A six-story shear building model with six uncertain stiffness parameters is employed as an illustrative example. Transitional Markov Chain Monte Carlo is used to draw samples of the posterior probability density function of the structural model parameters as well as the uncertain prediction variances. The different levels of modeling uncertainty and complexity are modeled through three FE models, including a true model, a model with more complexity, and a model with modeling error. Bayesian updating is performed for the three FE models considering the three aforementioned treatments of the prediction error variances. The effect of the number of measurements on the model updating performance is also examined in the study. The results are compared based on model class assessment and indicate that updating the prediction error variances as uncertain parameters at the model
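
    A minimal random-walk Metropolis sketch of strategy 3), treating the prediction error variance as an uncertain parameter sampled alongside a single stiffness-like parameter. The one-parameter "modal" model, the priors and the step size are toy assumptions, not the paper's six-story example or its Transitional MCMC sampler:

```python
import numpy as np

def log_posterior(theta, y):
    """theta = (k, log_s2): a stiffness-like parameter and the log of the
    prediction error variance, both sampled (strategy 3)."""
    k, log_s2 = theta
    if k <= 0.0:
        return -np.inf
    s2 = np.exp(log_s2)
    resid = y - np.sqrt(k)                   # toy 'modal' prediction
    return (-0.5 * np.sum(resid ** 2) / s2   # Gaussian likelihood
            - 0.5 * len(y) * np.log(s2)
            - 0.5 * log_s2 ** 2)             # weak prior on log_s2

def metropolis(y, n_iter=20000, step=0.1, seed=0):
    """Random-walk Metropolis over (k, log_s2)."""
    rng = np.random.default_rng(seed)
    theta = np.array([1.0, 0.0])
    lp = log_posterior(theta, y)
    chain = np.empty((n_iter, 2))
    for i in range(n_iter):
        prop = theta + rng.normal(0.0, step, size=2)
        lp_prop = log_posterior(prop, y)
        if np.log(rng.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain[i] = theta
    return chain
```

    Because the variance is sampled rather than fixed, the data-fit term and the Occam-like log-variance penalty trade off automatically, which is the point of treating it at the model class level.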

  17. Updates to the Demographic and Spatial Allocation Models to ...

    Science.gov (United States)

    EPA announced the availability of the draft report, Updates to the Demographic and Spatial Allocation Models to Produce Integrated Climate and Land Use Scenarios (ICLUS), for a 30-day public comment period. The ICLUS version 2 (v2) modeling tool furthered land change modeling by providing nationwide housing development scenarios up to 2100. ICLUS v2 includes updated population and land use data sets and addresses limitations identified in ICLUS v1 in both the migration and spatial allocation models. The companion user guide describes the development of ICLUS v2 and the updates that were made to the original data sets and the demographic and spatial allocation models. The GIS tool enables users to run SERGoM with the population projections developed for the ICLUS project and allows users to modify the spatial allocation of housing density across the landscape.

  18. 2017 Updates: Earth Gravitational Model 2020

    Science.gov (United States)

    Barnes, D. E.; Holmes, S. A.; Ingalls, S.; Beale, J.; Presicci, M. R.; Minter, C.

    2017-12-01

    The National Geospatial-Intelligence Agency [NGA], in conjunction with its U.S. and international partners, has begun preliminary work on its next Earth Gravitational Model, to replace EGM2008. The new `Earth Gravitational Model 2020' [EGM2020] has an expected public release date of 2020, and will retain the same harmonic basis and resolution as EGM2008. As such, EGM2020 will be essentially an ellipsoidal harmonic model up to degree (n) and order (m) 2159, but will be released as a spherical harmonic model to degree 2190 and order 2159. EGM2020 will benefit from new data sources and procedures. Updated satellite gravity information from the GOCE and GRACE missions will better support the lower harmonics globally. Multiple new acquisitions (terrestrial, airborne and shipborne) of gravimetric data over specific geographical areas (Antarctica, Greenland ...) will provide improved global coverage and resolution over the land, as well as for coastal and some ocean areas. Ongoing accumulation of satellite altimetry data, as well as improvements in the treatment of this data, will better define the marine gravity field, most notably in polar and near-coastal regions. NGA and partners are evaluating different approaches for optimally combining the new GOCE/GRACE satellite gravity models with the terrestrial data. These include the latest methods employing a full covariance adjustment. NGA is also working to assess systematically the quality of its entire gravimetry database, towards correcting biases and other egregious errors. Public release number 15-564

  19. A Provenance Tracking Model for Data Updates

    Directory of Open Access Journals (Sweden)

    Gabriel Ciobanu

    2012-08-01

    Full Text Available For data-centric systems, provenance tracking is particularly important when the system is open and decentralised, such as the Web of Linked Data. In this paper, a concise but expressive calculus which models data updates is presented. The calculus is used to provide an operational semantics for a system where data and updates interact concurrently. The operational semantics of the calculus also tracks the provenance of data with respect to updates. This provides a new formal semantics extending provenance diagrams which takes into account the execution of processes in a concurrent setting. Moreover, a sound and complete model for the calculus based on ideals of series-parallel DAGs is provided. The notion of provenance introduced can be used as a subjective indicator of the quality of data in concurrent interacting systems.

  20. Automatic procedure to keep updated the activity levels for each radionuclide existing in a radioactivity laboratory

    International Nuclear Information System (INIS)

    Los Arcos, J.M.

    1988-01-01

    An automatic procedure to keep updated the activity levels of each radionuclide existing in a radioactivity laboratory, and its classification according to the Spanish Regulations on Nuclear and Radioactive Establishments, is described. This procedure takes into account the mixed composition of each source, whether it is sealed, and the activity and mass variation due to extraction or evaporation in non-sealed sources. For a given date and time, the procedure prints out a complete listing of the activity of each radioactive source, the accumulated activity for each radionuclide and for each kind of radionuclide, and the actual classification of the laboratory according to the legal regulations mentioned above. (Author)
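
    The core decay bookkeeping behind such a procedure can be sketched as follows; the function names and the source-record tuple layout are hypothetical:

```python
import math

def current_activity(a0_bq, half_life_days, elapsed_days):
    """Decay-correct a source's reference activity to a given date:
    A(t) = A0 * exp(-ln2 * t / T_half)."""
    return a0_bq * math.exp(-math.log(2.0) * elapsed_days / half_life_days)

def inventory_by_nuclide(sources):
    """Accumulate decay-corrected activity per radionuclide; 'sources' is
    an iterable of (nuclide, a0_bq, half_life_days, elapsed_days) tuples."""
    totals = {}
    for nuclide, a0, t_half, t in sources:
        totals[nuclide] = totals.get(nuclide, 0.0) + current_activity(a0, t_half, t)
    return totals
```

    Comparing the per-nuclide totals against the regulatory exemption limits would then yield the laboratory's classification.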

  1. Evaluation of two updating methods for dissipative models on a real structure

    International Nuclear Information System (INIS)

    Moine, P.; Billet, L.

    1996-01-01

    Finite element models are widely used to predict the dynamic behaviour of structures. Frequently, the model does not represent the structure with all the expected accuracy, i.e. the measurements realised on the structure differ from the data predicted by the model. It is therefore necessary to update the model. Although many modeling errors come from inadequate representation of damping phenomena, most model updating techniques are up to now restricted to conservative models only. In this paper, we present two updating methods for dissipative models using eigenmode shapes and eigenvalues as behavioural information from the structure. The first method - the modal output error method - directly compares the experimental eigenvectors and eigenvalues to the model eigenvectors and eigenvalues, whereas the second method - the error in constitutive relation method - uses an energy error derived from the equilibrium relation. The error function, in both cases, is minimized by a conjugate gradient algorithm and the gradient is calculated analytically. These two methods behave differently, which can be evidenced by updating a real structure consisting of a piece of pipe mounted on two visco-elastic suspensions. The updating of the model validates an updating strategy consisting of a preliminary update with the error in constitutive relation method (a fast-to-converge but difficult-to-control method) followed by further updating with the modal output error method (a slow-to-converge but reliable and easy-to-control method). Moreover the problems encountered during the updating process and their corresponding solutions are given. (authors)
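
    A toy version of the modal output error idea on a 2-DOF spring-mass model: the stiffness parameters are adjusted by a conjugate gradient routine to match "measured" eigenvalues. Unlike the paper, the gradient here is numerical rather than analytic, and the structure is purely illustrative:

```python
import numpy as np
from scipy.optimize import minimize

def model_eigvals(k):
    """Eigenvalues of a 2-DOF chain with unit masses and stiffnesses k1, k2."""
    k1, k2 = k
    K = np.array([[k1 + k2, -k2], [-k2, k2]])
    return np.sort(np.linalg.eigvalsh(K))

def modal_output_error(k, lam_measured):
    """Squared discrepancy between model and measured eigenvalues."""
    return np.sum((model_eigvals(k) - lam_measured) ** 2)

# 'Measured' eigenvalues are synthesized from a target stiffness pair.
lam_measured = model_eigvals([2.0, 1.0])
result = minimize(modal_output_error, x0=[1.0, 0.5],
                  args=(lam_measured,), method="CG")
```

    In the paper's setting the error function also involves mode shapes (and, for the second method, an energy norm), which regularizes the otherwise possibly non-unique eigenvalue match.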

  2. A comparison of updating algorithms for large N reduced models

    Energy Technology Data Exchange (ETDEWEB)

    Pérez, Margarita García [Instituto de Física Teórica UAM-CSIC, Universidad Autónoma de Madrid,Nicolás Cabrera 13-15, E-28049-Madrid (Spain); González-Arroyo, Antonio [Instituto de Física Teórica UAM-CSIC, Universidad Autónoma de Madrid,Nicolás Cabrera 13-15, E-28049-Madrid (Spain); Departamento de Física Teórica, C-XI Universidad Autónoma de Madrid,E-28049 Madrid (Spain); Keegan, Liam [PH-TH, CERN,CH-1211 Geneva 23 (Switzerland); Okawa, Masanori [Graduate School of Science, Hiroshima University,Higashi-Hiroshima, Hiroshima 739-8526 (Japan); Core of Research for the Energetic Universe, Hiroshima University,Higashi-Hiroshima, Hiroshima 739-8526 (Japan); Ramos, Alberto [PH-TH, CERN,CH-1211 Geneva 23 (Switzerland)

    2015-06-29

    We investigate Monte Carlo updating algorithms for simulating SU(N) Yang-Mills fields on a single-site lattice, such as for the Twisted Eguchi-Kawai model (TEK). We show that performing only over-relaxation (OR) updates of the gauge links is a valid simulation algorithm for the Fabricius and Haan formulation of this model, and that this decorrelates observables faster than using heat-bath updates. We consider two different methods of implementing the OR update: either updating the whole SU(N) matrix at once, or iterating through SU(2) subgroups of the SU(N) matrix. We find the same critical exponent in both cases, and only a slight difference between the two.

  3. A comparison of updating algorithms for large $N$ reduced models

    CERN Document Server

    Pérez, Margarita García; Keegan, Liam; Okawa, Masanori; Ramos, Alberto

    2015-01-01

    We investigate Monte Carlo updating algorithms for simulating $SU(N)$ Yang-Mills fields on a single-site lattice, such as for the Twisted Eguchi-Kawai model (TEK). We show that performing only over-relaxation (OR) updates of the gauge links is a valid simulation algorithm for the Fabricius and Haan formulation of this model, and that this decorrelates observables faster than using heat-bath updates. We consider two different methods of implementing the OR update: either updating the whole $SU(N)$ matrix at once, or iterating through $SU(2)$ subgroups of the $SU(N)$ matrix. We find the same critical exponent in both cases, and only a slight difference between the two.

  4. Finite element model updating of a prestressed concrete box girder bridge using subproblem approximation

    Science.gov (United States)

    Chen, G. W.; Omenzetter, P.

    2016-04-01

    This paper presents the implementation of an updating procedure for the finite element model (FEM) of a prestressed concrete continuous box-girder highway off-ramp bridge. Ambient vibration testing was conducted to excite the bridge, assisted by linear chirp sweepings induced by two small electrodynamic shakers deployed to enhance the excitation levels, since the bridge was closed to traffic. The data-driven stochastic subspace identification method was executed to recover the modal properties from measurement data. An initial FEM was developed and correlation between the experimental modal results and their analytical counterparts was studied. Modelling of the pier and abutment bearings was carefully adjusted to reflect the real operational conditions of the bridge. The subproblem approximation method was subsequently utilized to automatically update the FEM. For this purpose, the influences of bearing stiffness, and mass density and Young's modulus of materials were examined as uncertain parameters using sensitivity analysis. The updating objective function was defined as a summation of squared values of relative errors of natural frequencies between the FEM and experimentation. All the identified modes were used as the target responses with the purpose of imposing more constraints on the optimization process and decreasing the number of potentially feasible combinations for parameter changes. The updated FEM of the bridge was able to produce sufficient improvements in natural frequencies in most modes of interest, and can serve for more precise dynamic response prediction or future investigation of the bridge's health.
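
    The updating objective described above, a summation of squared relative natural-frequency errors between model and experiment, can be written directly as:

```python
import numpy as np

def updating_objective(f_fem, f_exp):
    """Sum of squared relative natural-frequency errors between the FE
    model and the experimentally identified modes; using all identified
    modes constrains the optimization more tightly."""
    f_fem = np.asarray(f_fem, dtype=float)
    f_exp = np.asarray(f_exp, dtype=float)
    return float(np.sum(((f_fem - f_exp) / f_exp) ** 2))
```

    The subproblem approximation method then minimizes this scalar over the uncertain bearing-stiffness and material parameters.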

  5. Net Metering and Interconnection Procedures-- Incorporating Best Practices

    Energy Technology Data Exchange (ETDEWEB)

    Jason Keyes, Kevin Fox, Joseph Wiedman, Staff at North Carolina Solar Center

    2009-04-01

    State utility commissions and utilities themselves are actively developing and revising their procedures for the interconnection and net metering of distributed generation. However, the procedures most often used by regulators and utilities as models have not been updated in the past three years, in which time most of the distributed solar facilities in the United States have been installed. In that period, the Interstate Renewable Energy Council (IREC) has been a participant in more than thirty state utility commission rulemakings regarding interconnection and net metering of distributed generation. With the knowledge gained from this experience, IREC has updated its model procedures to incorporate current best practices. This paper presents the most significant changes made to IREC’s model interconnection and net metering procedures.

  6. Fusion strategies for selecting multiple tuning parameters for multivariate calibration and other penalty based processes: A model updating application for pharmaceutical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Tencate, Alister J. [Department of Chemistry, Idaho State University, Pocatello, ID 83209 (United States); Kalivas, John H., E-mail: kalijohn@isu.edu [Department of Chemistry, Idaho State University, Pocatello, ID 83209 (United States); White, Alexander J. [Department of Physics and Optical Engineering, Rose-Hulman Institute of Technology, Terre Haute, IN 47803 (United States)

    2016-05-19

    New multivariate calibration methods and other processes are being developed that require selection of multiple tuning parameter (penalty) values to form the final model. With one or more tuning parameters, using only one measure of model quality to select final tuning parameter values is not sufficient. Optimization of several model quality measures is challenging. Thus, three fusion ranking methods are investigated for simultaneous assessment of multiple measures of model quality for selecting tuning parameter values. One is a supervised learning fusion rule named sum of ranking differences (SRD). The other two are non-supervised learning processes based on the sum and median operations. The effect of the number of models evaluated on the three fusion rules is also evaluated using three procedures. One procedure uses all models from all possible combinations of the tuning parameters. To reduce the number of models evaluated, an iterative process (only applicable to SRD) is applied and thresholding a model quality measure before applying the fusion rules is also used. A near infrared pharmaceutical data set requiring model updating is used to evaluate the three fusion rules. In this case, calibration of the primary conditions is for the active pharmaceutical ingredient (API) of tablets produced in a laboratory. The secondary conditions for calibration updating are for tablets produced in the full batch setting. Two model updating processes requiring selection of two unique tuning parameter values are studied. One is based on Tikhonov regularization (TR) and the other is a variation of partial least squares (PLS). The three fusion methods are shown to provide equivalent and acceptable results allowing automatic selection of the tuning parameter values. Best tuning parameter values are selected when model quality measures used with the fusion rules are for the small secondary sample set used to form the updated models.
In this model updating situation, evaluation of
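
    A rough SRD-style sketch of the ranking fusion, assuming a merit matrix where lower is better and taking the row-wise minimum as the reference column; the full published SRD procedure (including its randomization validation test) is more involved than this:

```python
import numpy as np
from scipy.stats import rankdata

def srd_scores(merits):
    """'merits' is (n_measures, n_models), lower = better. The reference
    column is the row-wise minimum (a hypothetical ideal model); each
    model is scored by the summed absolute difference between its
    within-column ranking of the measures and the reference ranking."""
    ref_rank = rankdata(merits.min(axis=1))
    return np.array([np.abs(rankdata(col) - ref_rank).sum()
                     for col in merits.T])

def select_model(merits):
    """Pick the candidate model closest in ranking to the reference."""
    return int(np.argmin(srd_scores(merits)))
```

    Each column here would correspond to one combination of the two tuning parameter values, so the fusion selects both parameters at once.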

  7. Fusion strategies for selecting multiple tuning parameters for multivariate calibration and other penalty based processes: A model updating application for pharmaceutical analysis

    International Nuclear Information System (INIS)

    Tencate, Alister J.; Kalivas, John H.; White, Alexander J.

    2016-01-01

    New multivariate calibration methods and other processes are being developed that require selection of multiple tuning parameter (penalty) values to form the final model. With one or more tuning parameters, using only one measure of model quality to select final tuning parameter values is not sufficient. Optimization of several model quality measures is challenging. Thus, three fusion ranking methods are investigated for simultaneous assessment of multiple measures of model quality for selecting tuning parameter values. One is a supervised learning fusion rule named sum of ranking differences (SRD). The other two are non-supervised learning processes based on the sum and median operations. The effect of the number of models evaluated on the three fusion rules is also evaluated using three procedures. One procedure uses all models from all possible combinations of the tuning parameters. To reduce the number of models evaluated, an iterative process (only applicable to SRD) is applied and thresholding a model quality measure before applying the fusion rules is also used. A near infrared pharmaceutical data set requiring model updating is used to evaluate the three fusion rules. In this case, calibration of the primary conditions is for the active pharmaceutical ingredient (API) of tablets produced in a laboratory. The secondary conditions for calibration updating are for tablets produced in the full batch setting. Two model updating processes requiring selection of two unique tuning parameter values are studied. One is based on Tikhonov regularization (TR) and the other is a variation of partial least squares (PLS). The three fusion methods are shown to provide equivalent and acceptable results allowing automatic selection of the tuning parameter values. Best tuning parameter values are selected when model quality measures used with the fusion rules are for the small secondary sample set used to form the updated models.
In this model updating situation, evaluation of
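
A minimal sketch of the non-supervised sum-rule fusion described above: each candidate model (here, a tuning-parameter pair) is ranked under every quality measure, and the ranks are summed so the lowest total wins. The candidate scores and measure names are invented for illustration, not taken from the study.

```python
# Rank-sum fusion of multiple model-quality measures; illustrative only.

def rank(values):
    """Return the rank of each entry (0 = best, i.e. smallest value)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0] * len(values)
    for r, i in enumerate(order):
        ranks[i] = r
    return ranks

def fuse_by_sum(measures):
    """measures: one list of values per quality measure, aligned by candidate.
    Returns the index of the candidate with the smallest summed rank."""
    n = len(measures[0])
    totals = [0] * n
    for values in measures:
        for i, r in enumerate(rank(values)):
            totals[i] += r
    return min(range(n), key=lambda i: totals[i])

# Three hypothetical tuning-parameter pairs scored by two measures,
# e.g. RMSE on the secondary set and a model-complexity norm:
rmse = [0.30, 0.10, 0.20]
norm = [0.50, 0.40, 0.90]
best = fuse_by_sum([rmse, norm])   # candidate 1 wins both rankings
```

A median rule would replace the inner sum with a median of the per-measure ranks.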

  8. Ontology Update in the Cognitive Model of Ontology Learning

    Directory of Open Access Journals (Sweden)

    Zhang De-Hai

    2016-01-01

    Full Text Available Ontology has been used in many hot-spot fields, but most ontology construction methods are semiautomatic, and constructing an ontology is still a tedious and painstaking task. In this paper, a cognitive model is presented for ontology learning that can simulate how human beings learn from the world. In this model, cognitive strategies are applied with constrained axioms. Ontology update is a key step when new knowledge is added into the existing ontology and conflicts with old knowledge during ontology learning. This proposal designs and validates a method of ontology update based on the axiomatic cognitive model, which includes the ontology update postulates, axioms, and operations of the learning model. It is proved that these operators conform to the established axiom system.

  9. Aqua/Aura Updated Inclination Adjust Maneuver Performance Prediction Model

    Science.gov (United States)

    Boone, Spencer

    2017-01-01

    This presentation will discuss the updated Inclination Adjust Maneuver (IAM) performance prediction model that was developed for Aqua and Aura following the 2017 IAM series. This updated model uses statistical regression methods to identify potential long-term trends in maneuver parameters, yielding improved predictions when re-planning past maneuvers. The presentation has been reviewed and approved by Eric Moyer, ESMO Deputy Project Manager.

  10. Finite element model updating of concrete structures based on imprecise probability

    Science.gov (United States)

    Biswal, S.; Ramaswamy, A.

    2017-09-01

    Imprecise probability based methods are developed in this study for parameter estimation in finite element model updating for concrete structures, when the measurements are imprecisely defined. Bayesian analysis using the Metropolis-Hastings algorithm for parameter estimation is generalized to incorporate the imprecision present in the prior distribution, in the likelihood function, and in the measured responses. Three different cases are considered: (i) imprecision is present in the prior distribution and in the measurements only, (ii) imprecision is present in the parameters of the finite element model and in the measurements only, and (iii) imprecision is present in the prior distribution, in the parameters of the finite element model, and in the measurements. Procedures are also developed for integrating the imprecision in the parameters of the finite element model in the finite element software Abaqus. The proposed methods are then verified against reinforced concrete beams and prestressed concrete beams tested in our laboratory as part of this study.
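
The Metropolis-Hastings step at the core of such Bayesian updating can be sketched as follows; the one-parameter Gaussian model, synthetic data, proposal width, and prior are illustrative assumptions, not the paper's concrete-structure model.

```python
# Minimal random-walk Metropolis-Hastings sampler for one parameter.
import math, random

random.seed(0)
data = [2.1, 1.9, 2.0, 2.2, 1.8]   # synthetic "measured responses"
sigma = 0.2                         # assumed measurement noise

def log_post(theta):
    # Broad N(0, 10^2) prior plus independent Gaussian likelihood terms
    lp = -theta**2 / (2 * 10.0**2)
    for y in data:
        lp += -(y - theta)**2 / (2 * sigma**2)
    return lp

theta, samples = 0.0, []
for _ in range(5000):
    prop = theta + random.gauss(0.0, 0.5)        # random-walk proposal
    if log_post(prop) - log_post(theta) > math.log(random.random()):
        theta = prop                              # accept the move
    samples.append(theta)

# Discard a burn-in period before summarizing the posterior
posterior_mean = sum(samples[1000:]) / len(samples[1000:])
```

The imprecise-probability generalization would repeat such runs over sets of priors and likelihoods rather than a single pair.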

  11. Information dissemination model for social media with constant updates

    Science.gov (United States)

    Zhu, Hui; Wu, Heng; Cao, Jin; Fu, Gang; Li, Hui

    2018-07-01

    With the development of social media tools and the pervasiveness of smart terminals, social media has become a significant source of information for many individuals. However, false information can spread rapidly, which may result in negative social impacts and serious economic losses. Thus, reducing the unfavorable effects of false information has become an urgent challenge. In this paper, a new competitive model called DMCU is proposed to describe the dissemination of information with constant updates in social media. In the model, we focus on the competitive relationship between the original false information and updated information, and then propose the priority of related information. To evaluate the effectiveness of the proposed model more rigorously, data sets containing actual social media activity are utilized in experiments. Simulation results demonstrate that the DMCU model can precisely describe the process of information dissemination with constant updates, and that it can be used to forecast information dissemination trends on social media.
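
A toy discrete-time competition between original false information and prioritized updated information, in the spirit of (but far simpler than) the DMCU model; every rate, weight, and population size below is invented for illustration.

```python
# Two competing messages spread through a pool of undecided users; the
# updated message is given priority via a higher adoption weight.

def simulate(steps, n=10000, false0=100, updated0=100,
             w_false=1.0, w_updated=2.0, contact=0.00005):
    false_c, updated_c = float(false0), float(updated0)
    undecided = n - false_c - updated_c
    for _ in range(steps):
        new_false = contact * w_false * false_c * undecided
        new_updated = contact * w_updated * updated_c * undecided
        total = min(new_false + new_updated, undecided)  # cannot overdraw
        share = (new_false / (new_false + new_updated)
                 if new_false + new_updated > 0 else 0.0)
        false_c += total * share
        updated_c += total * (1 - share)
        undecided -= total
    return false_c, updated_c

false_final, updated_final = simulate(200)
# With priority weight 2.0, the updated message out-competes the false one.
```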

  12. Updated climatological model predictions of ionospheric and HF propagation parameters

    International Nuclear Information System (INIS)

    Reilly, M.H.; Rhoads, F.J.; Goodman, J.M.; Singh, M.

    1991-01-01

    The prediction performances of several climatological models, including the ionospheric conductivity and electron density model, RADAR C, and Ionospheric Communications Analysis and Predictions Program, are evaluated for different regions and sunspot number inputs. Particular attention is given to the near-real-time (NRT) predictions associated with single-station updates. It is shown that a dramatic improvement can be obtained by using single-station ionospheric data to update the driving parameters for an ionospheric model for NRT predictions of foF2 and other ionospheric and HF circuit parameters. For middle latitudes, the improvement extends out thousands of kilometers from the update point to points of comparable corrected geomagnetic latitude. 10 refs

  13. [Purity Detection Model Update of Maize Seeds Based on Active Learning].

    Science.gov (United States)

    Tang, Jin-ya; Huang, Min; Zhu, Qi-bing

    2015-08-01

    Seed purity reflects the degree to which seed varieties show typical, consistent characteristics, so improving the reliability and accuracy of seed purity detection is of great importance for guaranteeing seed quality. Hyperspectral imaging can reflect the internal and external characteristics of seeds at the same time, and has been widely used in nondestructive detection of agricultural products. The essence of nondestructive detection of agricultural products using hyperspectral imaging is to establish a mathematical model between the spectral information and the quality of the product. Since the spectral information is easily affected by the sample growth environment, the stability and generalization of a model weaken when test samples are harvested from a different origin or year. An active learning algorithm was investigated to add representative samples to expand the sample space of the original model, so as to enable rapid updating of the model. Random selection (RS) and the Kennard-Stone algorithm (KS) were used to compare the model update effect against the active learning algorithm. The experimental results indicated that, for different sample-set split ratios (1:1, 3:1, 4:1), the updated purity detection model for maize seeds from 2010, to which 40 samples selected by active learning from 2011 were added, increased the prediction accuracy for new 2011 samples from 47%, 33.75%, and 49% to 98.89%, 98.33%, and 98.33%. For the updated 2011 purity detection model, prediction accuracy for new 2010 samples increased from 50.83%, 54.58%, and 53.75% to 94.57%, 94.02%, and 94.57% after adding 56 new samples from 2010. In both cases the model updated by active learning outperformed RS and KS. Therefore, updating the purity detection model of maize seeds by active learning is feasible.
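
The sample-selection idea can be sketched with one common active-learning query strategy, greedy farthest-point selection: repeatedly add the new-season sample farthest (in spectral space) from the current calibration set, so the least-represented samples extend the model first. The strategy and the tiny 2-D "spectra" are illustrative assumptions; the abstract does not specify the exact criterion used.

```python
# Greedy farthest-point selection of update samples; illustrative only.

def dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def select_updates(pool, calib, n_add):
    """Pick n_add pool samples, each farthest from the growing calibration set."""
    calib, pool, chosen = list(calib), list(pool), []
    for _ in range(n_add):
        # distance of each pool sample to its nearest calibration sample
        i = max(range(len(pool)),
                key=lambda k: min(dist(pool[k], c) for c in calib))
        chosen.append(pool[i])
        calib.append(pool.pop(i))   # the pick joins the calibration set
    return chosen

calib = [(0.0, 0.0), (1.0, 0.0)]                      # original-year samples
pool = [(0.5, 0.1), (5.0, 5.0), (0.9, 0.2), (4.0, 4.0)]  # new-year candidates
picked = select_updates(pool, calib, 2)
# The two outlying samples are chosen first, since they extend coverage most.
```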

  14. General Separations Area (GSA) Groundwater Flow Model Update: Hydrostratigraphic Data

    Energy Technology Data Exchange (ETDEWEB)

    Bagwell, L. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Bennett, P. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Flach, G. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-02-21

    This document describes the assembly, selection, and interpretation of hydrostratigraphic data for input to an updated groundwater flow model for the General Separations Area (GSA; Figure 1) at the Department of Energy’s (DOE) Savannah River Site (SRS). This report is one of several discrete but interrelated tasks that support development of an updated groundwater model (Bagwell and Flach, 2016).

  15. Summary of Expansions, Updates, and Results in GREET 2017 Suite of Models

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Michael [Argonne National Lab. (ANL), Argonne, IL (United States); Elgowainy, Amgad [Argonne National Lab. (ANL), Argonne, IL (United States); Han, Jeongwoo [Argonne National Lab. (ANL), Argonne, IL (United States); Benavides, Pahola Thathiana [Argonne National Lab. (ANL), Argonne, IL (United States); Burnham, Andrew [Argonne National Lab. (ANL), Argonne, IL (United States); Cai, Hao [Argonne National Lab. (ANL), Argonne, IL (United States); Canter, Christina [Argonne National Lab. (ANL), Argonne, IL (United States); Chen, Rui [Argonne National Lab. (ANL), Argonne, IL (United States); Dai, Qiang [Argonne National Lab. (ANL), Argonne, IL (United States); Kelly, Jarod [Argonne National Lab. (ANL), Argonne, IL (United States); Lee, Dong-Yeon [Argonne National Lab. (ANL), Argonne, IL (United States); Lee, Uisung [Argonne National Lab. (ANL), Argonne, IL (United States); Li, Qianfeng [Argonne National Lab. (ANL), Argonne, IL (United States); Lu, Zifeng [Argonne National Lab. (ANL), Argonne, IL (United States); Qin, Zhangcai [Argonne National Lab. (ANL), Argonne, IL (United States); Sun, Pingping [Argonne National Lab. (ANL), Argonne, IL (United States); Supekar, Sarang D. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-11-01

    This report provides a technical summary of the expansions and updates to the 2017 release of Argonne National Laboratory’s Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation (GREET®) model, including references and links to key technical documents related to these expansions and updates. The GREET 2017 release includes an updated version of the GREET1 (the fuel-cycle GREET model) and GREET2 (the vehicle-cycle GREET model), both in the Microsoft Excel platform and in the GREET.net modeling platform. Figure 1 shows the structure of the GREET Excel modeling platform. The .net platform integrates all GREET modules together seamlessly.

  16. Self-shielding models of MICROX-2 code: Review and updates

    International Nuclear Information System (INIS)

    Hou, J.; Choi, H.; Ivanov, K.N.

    2014-01-01

    Highlights: • The MICROX-2 code has been improved to expand its application to advanced reactors. • New fine-group cross section libraries based on ENDF/B-VII have been generated. • Resonance self-shielding and spatial self-shielding models have been improved. • The improvements were assessed by a series of benchmark calculations against MCNPX. - Abstract: The MICROX-2 is a transport theory code that solves for the neutron slowing-down and thermalization equations of a two-region lattice cell. The MICROX-2 code has been updated to expand its application to advanced reactor concepts and fuel cycle simulations, including generation of new fine-group cross section libraries based on ENDF/B-VII. In continuation of previous work, the MICROX-2 methods are reviewed and updated in this study, focusing on its resonance self-shielding and spatial self-shielding models for neutron spectrum calculations. The improvement of self-shielding method was assessed by a series of benchmark calculations against the Monte Carlo code, using homogeneous and heterogeneous pin cell models. The results have shown that the implementation of the updated self-shielding models is correct and the accuracy of physics calculation is improved. Compared to the existing models, the updates reduced the prediction error of the infinite multiplication factor by ∼0.1% and ∼0.2% for the homogeneous and heterogeneous pin cell models, respectively, considered in this study

  17. Finite-element-model updating using computational intelligence techniques applications to structural dynamics

    CERN Document Server

    Marwala, Tshilidzi

    2010-01-01

    Finite element models (FEMs) are widely used to understand the dynamic behaviour of various systems. FEM updating allows FEMs to be tuned better to reflect measured data and may be conducted using two different statistical frameworks: the maximum likelihood approach and Bayesian approaches. Finite Element Model Updating Using Computational Intelligence Techniques applies both strategies to the field of structural mechanics, an area vital for aerospace, civil and mechanical engineering. Vibration data is used for the updating process. Following an introduction a number of computational intelligence techniques to facilitate the updating process are proposed; they include: • multi-layer perceptron neural networks for real-time FEM updating; • particle swarm and genetic-algorithm-based optimization methods to accommodate the demands of global versus local optimization models; • simulated annealing to put the methodologies into a sound statistical basis; and • response surface methods and expectation m...

  18. FE Model Updating on an In-Service Self-Anchored Suspension Bridge with Extra-Width Using Hybrid Method

    Directory of Open Access Journals (Sweden)

    Zhiyuan Xia

    2017-02-01

    Full Text Available Nowadays, many more extra-wide bridges are needed for vehicle throughput. In order to obtain a precise finite element (FE) model of such complex bridge structures, a practical hybrid updating method integrating Gaussian mutation particle swarm optimization (GMPSO), a Kriging meta-model, and Latin hypercube sampling (LHS) was proposed. After demonstrating the efficiency and accuracy of the hybrid method through the model updating of a damaged simply supported beam, the proposed method was applied to the model updating of an extra-wide self-anchored suspension bridge, which proved necessary considering the results of the ambient vibration test. The results of the bridge model updating showed that both the mode frequencies and the mode shapes agreed well between the updated model and the experimental structure. The successful model updating of this bridge fills a gap in the model updating of complex self-anchored suspension bridges. Moreover, the updating process informs other model updating issues for complex bridge structures.
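
Of the three ingredients in the hybrid method, Latin hypercube sampling is the simplest to illustrate: each parameter range is split into n equal strata, each stratum is sampled exactly once, and the strata are shuffled independently per dimension. This is a generic LHS sketch, not the paper's implementation.

```python
# Minimal Latin hypercube design on the unit cube; illustrative only.
import random

def lhs(n, d, seed=0):
    """n points in d dimensions; every dimension hits each of n strata once."""
    rng = random.Random(seed)
    cols = []
    for _ in range(d):
        # one point per stratum [i/n, (i+1)/n), then shuffle the strata
        col = [(i + rng.random()) / n for i in range(n)]
        rng.shuffle(col)
        cols.append(col)
    return [tuple(c[i] for c in cols) for i in range(n)]

design = lhs(10, 2)   # e.g. 10 training sites for a 2-parameter Kriging model
```

Each design point would then be scaled to the physical ranges of the updating parameters before evaluating the FE model.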

  19. Model Updating Nonlinear System Identification Toolbox, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — ZONA Technology (ZONA) proposes to develop an enhanced model updating nonlinear system identification (MUNSID) methodology that utilizes flight data with...

  20. Fuzzy cross-model cross-mode method and its application to update the finite element model of structures

    International Nuclear Information System (INIS)

    Liu Yang; Xu Dejian; Li Yan; Duan Zhongdong

    2011-01-01

    As a novel updating technique, the cross-model cross-mode (CMCM) method offers high efficiency and flexibility in selecting updating parameters. However, the success of this method depends on the accuracy of the measured mode shapes. Usually, the measured mode shapes are inaccurate, since measurement noise of many kinds is inevitable. Furthermore, the CMCM method requires complete test mode shapes, so calculation errors may be introduced into the measured mode shapes by the modal expansion or model reduction technique. Therefore, this algorithm faces a challenge in updating the finite element (FE) models of practical complex structures. In this study, the fuzzy CMCM method is proposed in order to weaken the effect of errors in the measured mode shapes on the updated results. Two simulated examples are then used to compare the performance of the fuzzy CMCM method with the CMCM method. The test results show that the proposed method is more promising than the CMCM method for updating the FE models of practical structures.

  1. Model Updating Nonlinear System Identification Toolbox, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — ZONA Technology proposes to develop an enhanced model updating nonlinear system identification (MUNSID) methodology by adopting the flight data with state-of-the-art...

  2. Rotating shaft model updating from modal data by a direct energy approach : a feasibility study

    International Nuclear Information System (INIS)

    Audebert, S.

    1996-01-01

    Investigations to improve rotating machinery monitoring increasingly rely on numerical models. The aim is to obtain multi-fluid-bearing rotor models that correctly represent the dynamic behaviour, of either modal or forced-response type. The possibility of extending the direct energy method, initially developed for undamped structures, to rotating machinery is studied. It is based on minimizing the kinetic and strain energy gap between experimental and analytic modal data. The preliminary determination of the eigenmodes of a multi-linear-bearing rotor system shows the complexity of the problem in comparison with undamped non-rotating structures: taking into account gyroscopic effects and bearing damping, which are functions of rotor velocity, leads to eigenmodes with complex components; moreover, the non-symmetric matrices related to the stiffness and damping contributions of the bearings induce distinct left-hand and right-hand eigenmodes (the left-hand eigenmodes correspond to the adjoint structure). Theoretically, the extension of the energy method is studied by considering first the intermediate case of an undamped non-gyroscopic structure, and second the general case of a rotating shaft; the data used in the updating procedure are eigenfrequencies and left- and right-hand mode shapes. Since left-hand mode shapes cannot be directly measured, they are replaced by analytic ones. The method is tested on a two-bearing rotor system with an added mass; simulated data are used, relative to a non-compatible structure, i.e. one that is not part of the set of modified analytic possible structures. Parameters to be corrected are the mass density, the Young's modulus, and the linearized stiffness and damping characteristics of the bearings. If the parameters are influential with regard to the modes to be updated, the updating method permits a significant improvement of the gap between analytic and experimental modes, even for modes not involved in the procedure.
Modal damping appears to be more

  3. Real Time Updating in Distributed Urban Rainfall Runoff Modelling

    DEFF Research Database (Denmark)

    Borup, Morten; Madsen, Henrik

    that are being updated from system measurements was studied. The results showed that the fact alone that it takes time for rainfall data to travel the distance between gauges and catchments has such a big negative effect on the forecast skill of updated models, that it can justify the choice of even very...... as in a real data case study. The results confirmed that the method is indeed suitable for DUDMs and that it can be used to utilise upstream as well as downstream water level and flow observations to improve model estimates and forecasts. Due to upper and lower sensor limits many sensors in urban drainage...

  4. Modelling precipitation extremes in the Czech Republic: update of intensity–duration–frequency curves

    Directory of Open Access Journals (Sweden)

    Michal Fusek

    2016-11-01

    Full Text Available Precipitation records from six stations of the Czech Hydrometeorological Institute were subjected to statistical analysis with the objectives of updating the intensity–duration–frequency (IDF) curves by applying extreme value distributions, and of comparing the updated curves against those produced by an empirical procedure in 1958. Another objective was to investigate differences between the two sets of curves, which could be explained by factors such as different measuring instruments, measuring station altitudes, and data analysis methods. It has been shown that the differences between the two sets of IDF curves are significantly influenced by the chosen method of data analysis.
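
An extreme-value computation of the kind behind IDF updating can be sketched with a method-of-moments Gumbel fit to annual-maximum intensities, followed by a T-year return level. The annual maxima below and the choice of the Gumbel (EV1) family are illustrative assumptions, not the study's data or distribution.

```python
# Gumbel fit by method of moments and a return-level calculation.
import math

maxima = [42.0, 55.0, 38.0, 61.0, 47.0, 52.0, 44.0, 58.0, 40.0, 50.0]

n = len(maxima)
mean = sum(maxima) / n
std = (sum((x - mean) ** 2 for x in maxima) / (n - 1)) ** 0.5

# Method-of-moments estimates of the Gumbel scale and location
scale = std * math.sqrt(6) / math.pi
loc = mean - 0.5772 * scale          # 0.5772: Euler-Mascheroni constant

def return_level(T):
    """Intensity exceeded on average once every T years."""
    return loc - scale * math.log(-math.log(1 - 1 / T))

level_100 = return_level(100.0)      # 100-year design intensity
```

Repeating this per rainfall duration yields one point of each updated IDF curve.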

  5. A last updating evolution model for online social networks

    Science.gov (United States)

    Bu, Zhan; Xia, Zhengyou; Wang, Jiandong; Zhang, Chengcui

    2013-05-01

    As information technology has advanced, people turn to electronic media more frequently for communication, and social relationships are increasingly found on online channels. However, there is very limited knowledge about the actual evolution of online social networks. In this paper, we propose and study a novel evolution network model built on the new concept of “last updating time”, which exists in many real-life online social networks. The last updating evolution network model can maintain the robustness of scale-free networks and can improve network resilience against intentional attacks. Moreover, we found that it exhibits the “small-world effect”, an inherent property of most social networks. Simulation experiments based on this model show that the results are consistent with real-life data, which indicates that our model is valid.
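
A toy growth rule in the spirit of a "last updating time" model: each new node links to an existing node with probability proportional to its degree, discounted by how long ago that node was last updated. The decay form and all constants are invented for illustration, not taken from the paper.

```python
# Recency-discounted preferential attachment; illustrative only.
import random

def grow(n, seed=0):
    rng = random.Random(seed)
    degree = {0: 1, 1: 1}          # start from a single edge (0, 1)
    last_update = {0: 0, 1: 0}
    edges = [(0, 1)]
    for t in range(2, n):
        # attachment weight = degree / (1 + time since last update)
        weights = {v: degree[v] / (1 + t - last_update[v]) for v in degree}
        total = sum(weights.values())
        r, acc, target = rng.random() * total, 0.0, 0
        for v, w in weights.items():
            acc += w
            if r <= acc:
                target = v
                break
        edges.append((t, target))
        degree[t] = 1
        degree[target] += 1
        last_update[target] = t    # receiving a link counts as an update
        last_update[t] = t
    return degree, edges

degree, edges = grow(200)
```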

  6. Reservoir management under geological uncertainty using fast model update

    NARCIS (Netherlands)

    Hanea, R.; Evensen, G.; Hustoft, L.; Ek, T.; Chitu, A.; Wilschut, F.

    2015-01-01

    Statoil is implementing "Fast Model Update (FMU)," an integrated and automated workflow for reservoir modeling and characterization. FMU connects all steps and disciplines from seismic depth conversion to prediction and reservoir management taking into account relevant reservoir uncertainty. FMU

  7. Off-Highway Gasoline Consumption Estimation Models Used in the Federal Highway Administration Attribution Process: 2008 Updates

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Ho-Ling [ORNL; Davis, Stacy Cagle [ORNL

    2009-12-01

    This report is designed to document the analysis process and estimation models currently used by the Federal Highway Administration (FHWA) to estimate off-highway gasoline consumption and public sector fuel consumption. An overview of the entire FHWA attribution process is provided along with specifics related to the latest update (2008) of the Off-Highway Gasoline Use Model and the Public Use of Gasoline Model. The Off-Highway Gasoline Use Model is made up of five individual modules, one for each of the off-highway categories: agricultural, industrial and commercial, construction, aviation, and marine. This 2008 update of the off-highway models was the second major update (the first was conducted during 2002-2003) since they were originally developed in the mid-1990s. The agricultural model methodology, specifically, underwent a significant revision because of changes in data availability since 2003. Some revision to the model was necessary due to the removal of certain data elements used in the original estimation method. The revised agricultural model also made use of some newly available information published by the data source agency in recent years. The other model methodologies were not drastically changed, though many data elements were updated to improve the accuracy of these models. Note that components of the Public Use of Gasoline Model were not updated in 2008. A major challenge in updating the estimation methods applied by the public-use model is that they would have to rely on significant new data collection efforts. In addition, due to resource limitations, several components of the models (both off-highway and public-use models) that utilized regression modeling approaches were not recalibrated under the 2008 study. An investigation of the Environmental Protection Agency's NONROAD2005 model was also carried out under the 2008 model update. Results generated from the NONROAD2005 model were analyzed, examined, and compared, to the extent that

  8. Emergency evacuation/transportation plan update: Traffic model development and evaluation of early closure procedures. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-10-28

    Prolonged delays in traffic experienced by Laboratory personnel during a recent early dismissal in inclement weather, coupled with reconstruction efforts along NM 502 east of the White Rock Wye for the next 1 to 2 years, have prompted Los Alamos National Laboratory (LANL) to re-evaluate and improve the present transportation plan and its integration with contingency plans maintained in other organizations. Facilities planners and emergency operations staff need to evaluate the transportation system's capability to efficiently and safely evacuate LANL under different low-level emergency conditions. A variety of potential procedures governing the release of employees from the different technical areas (TAs) requires evaluation, perhaps with regard to multiple emergency-condition scenarios, with one or more optimal procedures ultimately presented for adoption by Lab Management. The work undertaken in this project should lay a foundation for an on-going, progressive transportation system analysis capability. It utilizes microscale simulation techniques to affirm, reassess, and validate the Laboratory's Early Dismissal/Closure/Delayed Opening Plan. The Laboratory is required by Federal guidelines, and compelled by prudent practice and conscientious regard for the welfare of employees and nearby residents, to maintain plans and operating procedures for evacuation if the need arises. The tools developed during this process can be used outside of contingency planning. It is anticipated that the traffic models developed will allow site planners to evaluate changes to the traffic network which could better serve normal traffic levels. Changes in roadway configuration, control strategies (signalization and signing), response strategies to traffic accidents, and patterns of demand can be modelled using the analysis tools developed during this project. Such scenarios typically are important considerations in master planning and facilities programming.

  9. Sequential updating of a new dynamic pharmacokinetic model for caffeine in premature neonates.

    Science.gov (United States)

    Micallef, Sandrine; Amzal, Billy; Bach, Véronique; Chardon, Karen; Tourneux, Pierre; Bois, Frédéric Y

    2007-01-01

    Caffeine treatment is widely used in nursing care to reduce the risk of apnoea in premature neonates. To check the therapeutic efficacy of the treatment against apnoea, caffeine concentration in blood is an important indicator. The present study was aimed at building a pharmacokinetic model as a basis for a medical decision support tool. In the proposed model, time dependence of physiological parameters is introduced to describe the rapid growth of neonates. To take into account the large variability in the population, the pharmacokinetic model is embedded in a population structure. The whole model is inferred within a Bayesian framework. To update caffeine concentration predictions as data from an incoming patient are collected, we propose a fast method that can be used in a medical context. This involves the sequential updating of model parameters (at individual and population levels) via a stochastic particle algorithm. Our model provides better predictions than those obtained with previously published models. We show, through an example, that sequential updating improves predictions of caffeine concentration in blood (reduced bias and shorter credibility intervals). The updating of the pharmacokinetic model using body mass and caffeine concentration data is studied. It shows how informative caffeine concentration data are in contrast to body mass data. This study provides the methodological basis to predict caffeine concentration in blood, after a given treatment, if data are collected on the treated neonate.
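
The sequential particle-based update can be sketched as importance reweighting of prior parameter draws as each new concentration measurement arrives. The one-compartment elimination model, noise level, and data below are invented, and no resampling or population level is shown.

```python
# Particle reweighting for sequential Bayesian updating; illustrative only.
import math, random

rng = random.Random(1)
particles = [rng.uniform(0.5, 4.0) for _ in range(2000)]  # prior draws of theta
weights = [1.0] * len(particles)
sigma = 0.3                          # assumed assay noise (concentration units)

def predict(theta, dose=10.0, t=1.0):
    # One-compartment, unit-volume model: C(t) = dose * exp(-theta * t)
    return dose * math.exp(-theta * t)

def update(obs, t):
    """Reweight every particle by the likelihood of the new observation."""
    for i, theta in enumerate(particles):
        err = obs - predict(theta, t=t)
        weights[i] *= math.exp(-err**2 / (2 * sigma**2))

def posterior_mean():
    s = sum(weights)
    return sum(w, ) if False else sum(w * p for w, p in zip(weights, particles)) / s

for t, obs in [(1.0, 3.7), (2.0, 1.3)]:   # measurements arrive sequentially
    update(obs, t)

theta_hat = posterior_mean()   # both observations point to theta near 1.0
```

In practice a resampling step would follow each update to avoid weight degeneracy.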

  10. A general framework for updating belief distributions.

    Science.gov (United States)

    Bissiri, P G; Holmes, C C; Walker, S G

    2016-11-01

    We propose a framework for general Bayesian inference. We argue that a valid update of a prior belief distribution to a posterior can be made for parameters which are connected to observations through a loss function rather than the traditional likelihood function, which is recovered as a special case. Modern application areas make it increasingly challenging for Bayesians to attempt to model the true data-generating mechanism. For instance, when the object of interest is low dimensional, such as a mean or median, it is cumbersome to have to achieve this via a complete model for the whole data distribution. More importantly, there are settings where the parameter of interest does not directly index a family of density functions and thus the Bayesian approach to learning about such parameters is currently regarded as problematic. Our framework uses loss functions to connect information in the data to functionals of interest. The updating of beliefs then follows from a decision theoretic approach involving cumulative loss functions. Importantly, the procedure coincides with Bayesian updating when a true likelihood is known yet provides coherent subjective inference in much more general settings. Connections to other inference frameworks are highlighted.
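
A minimal grid sketch of the loss-based update described above: the posterior is proportional to prior times exp(minus the cumulative loss), here with absolute-error loss so the parameter of interest is the median, without modelling the full data distribution. The data, grid, and unit loss weight are invented.

```python
# General Bayesian (loss-based) update on a parameter grid; illustrative only.
import math

data = [1.0, 2.0, 2.5, 3.0, 10.0]                # note the outlier
grid = [i * 0.1 for i in range(0, 121)]          # candidate medians 0..12
prior = [1.0 / len(grid)] * len(grid)            # flat prior over the grid

def general_update(prior, data, loss, w=1.0):
    """Posterior ∝ prior * exp(-w * cumulative loss); uses the global grid."""
    post = [p * math.exp(-w * sum(loss(theta, y) for y in data))
            for p, theta in zip(prior, grid)]
    z = sum(post)
    return [p / z for p in post]

# Absolute-error loss targets the median, robustly ignoring the outlier
post = general_update(prior, data, lambda th, y: abs(th - y))
theta_map = grid[max(range(len(post)), key=lambda i: post[i])]
```

Substituting the negative log-likelihood as the loss with w = 1 recovers standard Bayesian updating, as the abstract notes.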

  11. Finite element model updating using bayesian framework and modal properties

    CSIR Research Space (South Africa)

    Marwala, T

    2005-01-01

    Full Text Available Finite element (FE) models are widely used to predict the dynamic characteristics of aerospace structures. These models often give results that differ from measured results and therefore need to be updated to match measured results. Some...

  12. Advanced Test Reactor Core Modeling Update Project Annual Report for Fiscal Year 2011

    International Nuclear Information System (INIS)

    Nigg, David W.; Steuhm, Devin A.

    2011-01-01

    . Furthermore, a capability for rigorous sensitivity analysis and uncertainty quantification based on the TSUNAMI system is being implemented and initial computational results have been obtained. This capability will have many applications in 2011 and beyond as a tool for understanding the margins of uncertainty in the new models as well as for validation experiment design and interpretation. Finally we note that although full implementation of the new computational models and protocols will extend over a period 3-4 years as noted above, interim applications in the much nearer term have already been demonstrated. In particular, these demonstrations included an analysis that was useful for understanding the cause of some issues in December 2009 that were triggered by a larger than acceptable discrepancy between the measured excess core reactivity and a calculated value that was based on the legacy computational methods. As the Modeling Update project proceeds we anticipate further such interim, informal, applications in parallel with formal qualification of the system under the applicable INL Quality Assurance procedures and standards.

  13. Advanced Test Reactor Core Modeling Update Project Annual Report for Fiscal Year 2011

    Energy Technology Data Exchange (ETDEWEB)

    David W. Nigg; Devin A. Steuhm

    2011-09-01

    , a capability for rigorous sensitivity analysis and uncertainty quantification based on the TSUNAMI system is being implemented and initial computational results have been obtained. This capability will have many applications in 2011 and beyond as a tool for understanding the margins of uncertainty in the new models as well as for validation experiment design and interpretation. Finally we note that although full implementation of the new computational models and protocols will extend over a period 3-4 years as noted above, interim applications in the much nearer term have already been demonstrated. In particular, these demonstrations included an analysis that was useful for understanding the cause of some issues in December 2009 that were triggered by a larger than acceptable discrepancy between the measured excess core reactivity and a calculated value that was based on the legacy computational methods. As the Modeling Update project proceeds we anticipate further such interim, informal, applications in parallel with formal qualification of the system under the applicable INL Quality Assurance procedures and standards.

  14. Probabilistic Modeling of Updating Epistemic Uncertainty In Pile Capacity Prediction With a Single Failure Test Result

    Directory of Open Access Journals (Sweden)

    Indra Djati Sidi

    2017-12-01

    The model error N has been introduced to denote the discrepancy between the measured and predicted capacity of a pile foundation. This model error is recognized as epistemic uncertainty in pile capacity prediction. The statistics of N have been evaluated based on data gathered from various sites and may be considered only as a general-error trend in capacity prediction, providing crude estimates of the model error in the absence of more specific data from the site. The results of even a single load test to failure should provide direct evidence of the pile capacity at a given site. Bayes' theorem has been used as a rational basis for combining new data with previous data to revise the assessment of uncertainty and reliability. This study is devoted to the development of procedures for updating the model error (N), and subsequently the predicted pile capacity, with the results of a single failure test.
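
    The conjugate-normal mechanics behind such a one-test update can be sketched numerically. Everything below (prior statistics of ln N, the test outcome, the within-test scatter) is invented for illustration and is not taken from the paper:

```python
import math

# Conjugate-normal sketch of updating the lognormal model error N
# (N = measured / predicted capacity) with one load test to failure.
# All numbers are illustrative assumptions, not data from the study.

def update_model_error(mu_prior, sigma_prior, obs_ln_n, sigma_test):
    """One Bayes step on the mean of ln N given a single observation."""
    w = sigma_prior**2 / (sigma_prior**2 + sigma_test**2)
    mu_post = mu_prior + w * (obs_ln_n - mu_prior)
    sigma_post = math.sqrt(w) * sigma_test   # posterior std of the mean
    return mu_post, sigma_post

mu0, sig0 = 0.0, 0.3          # prior: median N = 1 (unbiased), wide spread
obs = math.log(1.2)           # single test: capacity 20% above prediction
mu1, sig1 = update_model_error(mu0, sig0, obs, sigma_test=0.15)

updated_n_median = math.exp(mu1)   # site-specific median model error
```

    A single informative test pulls the site-specific median of N most of the way toward the observed ratio while shrinking its uncertainty, which is exactly the behaviour the abstract describes.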

  15. Finite element modelling and updating of friction stir welding (FSW) joint for vibration analysis

    Directory of Open Access Journals (Sweden)

    Zahari Siti Norazila

    2017-01-01

    Friction stir welded joints of aluminium alloys are widely used in automotive and aerospace applications due to their advanced and lightweight properties. The behaviour of FSW joints plays a significant role in the dynamic characteristics of the structure; because of their complexities and uncertainties, the representation of an accurate finite element model of these joints has become a research issue. In this paper, various finite element (FE) modelling techniques for the prediction of the dynamic properties of sheet metal joined by friction stir welding are presented. First, nine sets of flat plates in two series of aluminium alloy, AA7075 and AA6061, joined by FSW are used. The nine sets of specimens were fabricated using various welding parameters. In order to find the optimal set of FSW plates, a finite element model using an equivalence technique was developed, and the model was validated using experimental modal analysis (EMA) on the nine sets of specimens and finite element analysis (FEA). Three types of modelling were employed in this study: rigid body element Type 2 (RBE2), bar element (CBAR) and spot weld element connector (CWELD). The CBAR element was chosen to represent the weld model for FSW joints due to its accurate prediction of mode shapes, and because it contains an updating parameter for weld modelling, unlike the other weld models. Model updating was performed to improve the correlation between EMA and FEA; before proceeding to updating, a sensitivity analysis was done to select the most sensitive updating parameters. After performing model updating, the total error of the natural frequencies for the CBAR model improved significantly. Therefore, the CBAR element was selected as the most reliable element in FE modelling to represent the FSW weld joint.

  16. Two updating methods for dissipative models with non symmetric matrices

    International Nuclear Information System (INIS)

    Billet, L.; Moine, P.; Aubry, D.

    1997-01-01

    In this paper the feasibility of extending two updating methods to rotating machinery models is considered; the particularity of rotating machinery models is that they use non-symmetric stiffness and damping matrices. It is shown that the two methods described here, the inverse eigensensitivity method and the error in constitutive relation method, can be adapted to such models given some modifications. As far as the inverse sensitivity method is concerned, an error function based on the difference between calculated and measured right-hand eigenmode shapes and calculated and measured eigenvalues is used. Concerning the error in constitutive relation method, the equation which defines the error has to be modified because the stiffness matrix is not positive definite. The advantage of this modification is that, in some cases, it is possible to focus the updating process on some specific model parameters. Both methods were validated on a simple test model consisting of a two-bearing and disc rotor system. (author)
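
    As a toy illustration of the inverse eigensensitivity idea, here is one Gauss-Newton loop on a symmetric 2-DOF chain (the paper's actual contribution, the extension to non-symmetric rotor matrices, is not reproduced). All stiffness values are invented:

```python
import numpy as np

# Inverse eigensensitivity sketch on a 2-DOF spring-mass chain with unit
# masses; only the stiffness k2 is updated to match "measured" eigenvalues.

def eig_and_modes(k1, k2):
    K = np.array([[k1 + k2, -k2], [-k2, k2]])
    lam, phi = np.linalg.eigh(K)   # M = I, so eigh modes are mass-normalized
    return lam, phi

# "Measured" eigenvalues, generated from the true value k2 = 1.3
lam_meas, _ = eig_and_modes(1.0, 1.3)

k2 = 1.0                                     # initial (wrong) model
dK = np.array([[1.0, -1.0], [-1.0, 1.0]])    # dK/dk2
for _ in range(10):
    lam, phi = eig_and_modes(1.0, k2)
    # First-order eigenvalue sensitivities: d(lam_i)/d(k2) = phi_i^T dK phi_i
    s = np.array([phi[:, i] @ dK @ phi[:, i] for i in range(2)])
    r = lam_meas - lam                       # eigenvalue residuals
    k2 += (s @ r) / (s @ s)                  # least-squares Gauss-Newton step

converged_k2 = k2
```

    The loop recovers the true stiffness from the eigenvalue residuals alone; the method in the paper additionally uses right-hand mode-shape residuals, which matter once the matrices are non-symmetric.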

  17. Construction and updating of event models in auditory event processing.

    Science.gov (United States)

    Huff, Markus; Maurer, Annika E; Brich, Irina; Pagenkopf, Anne; Wickelmaier, Florian; Papenmeier, Frank

    2018-02-01

    Humans segment the continuous stream of sensory information into distinct events at points of change. Between 2 events, humans perceive an event boundary. Present theories propose that changes in the sensory information trigger updating processes of the present event model. Increased encoding effort finally leads to a memory benefit at event boundaries. Evidence from reading time studies (increased reading times with increasing amount of change) suggests that updating of event models is incremental. We present results from 5 experiments that studied event processing (including memory formation processes and reading times) using an audio drama as well as a transcript thereof as stimulus material. Experiments 1a and 1b replicated the event boundary advantage effect for memory. In contrast to recent evidence from studies using visual stimulus material, Experiments 2a and 2b found no support for incremental updating with normally sighted and blind participants for recognition memory. In Experiment 3, we replicated Experiment 2a using a written transcript of the audio drama as stimulus material, allowing us to disentangle encoding and retrieval processes. Our results indicate incremental updating processes at encoding (as measured with reading times). At the same time, we again found recognition performance to be unaffected by the amount of change. We discuss these findings in light of current event cognition theories. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  18. Adapting to change: The role of the right hemisphere in mental model building and updating.

    Science.gov (United States)

    Filipowicz, Alex; Anderson, Britt; Danckert, James

    2016-09-01

    We recently proposed that the right hemisphere plays a crucial role in the processes underlying mental model building and updating. Here, we review the evidence we and others have garnered to support this novel account of right hemisphere function. We begin by presenting evidence from patient work that suggests a critical role for the right hemisphere in the ability to learn from the statistics in the environment (model building) and adapt to environmental change (model updating). We then provide a review of neuroimaging research that highlights a network of brain regions involved in mental model updating. Next, we outline specific roles for particular regions within the network such that the anterior insula is purported to maintain the current model of the environment, the medial prefrontal cortex determines when to explore new or alternative models, and the inferior parietal lobule represents salient and surprising information with respect to the current model. We conclude by proposing some future directions that address some of the outstanding questions in the field of mental model building and updating. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  19. Receiver Operating Characteristic Curve-Based Prediction Model for Periodontal Disease Updated With the Calibrated Community Periodontal Index.

    Science.gov (United States)

    Su, Chiu-Wen; Yen, Amy Ming-Fang; Lai, Hongmin; Chen, Hsiu-Hsi; Chen, Sam Li-Sheng

    2017-12-01

    The accuracy of a prediction model for periodontal disease using the community periodontal index (CPI) has been assessed using the area under the receiver operating characteristic (AUROC) curve. How the uncalibrated CPI, as measured by general dentists trained by periodontists in a large epidemiologic study, affects the performance of a prediction model has not yet been researched. A two-stage design was conducted, first proposing a validation study to calibrate the CPI between a senior periodontal specialist and the trained general dentists who measured CPIs in the main study of a nationwide survey. A Bayesian hierarchical logistic regression model was applied to estimate the non-updated and updated clinical weights used for building up risk scores. How the calibrated CPI affected the performance of the updated prediction model was quantified by comparing the AUROC curves of the original and updated models. Estimates regarding the calibration of CPI obtained from the validation study were 66% and 85% for sensitivity and specificity, respectively. After updating, the clinical weights of each predictor were inflated, and the risk score for the highest risk category was elevated from 434 to 630. This update improved the AUROC performance of the two corresponding prediction models from 62.6% (95% confidence interval [CI]: 61.7% to 63.6%) for the non-updated model to 68.9% (95% CI: 68.0% to 69.6%) for the updated one, a statistically significant difference. An updated prediction model was thus demonstrated for periodontal disease as measured by the calibrated CPI derived from a large epidemiologic survey.
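
    The AUROC comparison at the heart of this abstract can be reproduced in miniature. The rank-based (Mann-Whitney) estimator below is standard; the scores and labels are invented toy data, not the study's risk scores or its 62.6%/68.9% values:

```python
# AUROC via the Mann-Whitney U statistic, on invented toy risk scores.

def auroc(scores, labels):
    """AUROC = P(score of a random case > score of a random non-case)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [0, 0, 0, 0, 1, 1, 1, 1]
original_score = [10, 30, 35, 50, 20, 40, 55, 60]   # weaker separation
updated_score = [10, 30, 35, 50, 45, 52, 55, 60]    # inflated case scores

auc_orig = auroc(original_score, labels)   # 0.75
auc_upd = auroc(updated_score, labels)     # 0.9375
```

    Inflating the scores of the cases relative to the non-cases, which is what recalibrated clinical weights do, raises the AUROC because more case/non-case pairs are ranked correctly.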

  20. Seabrook Station Level 2 PRA Update to Include Accident Management

    International Nuclear Information System (INIS)

    Lutz, Robert; Lucci, Melissa; Kiper, Kenneth; Henry, Robert

    2006-01-01

    A ground-breaking study was recently completed as part of the Seabrook Level 2 PRA update. This study updates the post-core damage phenomena to be consistent with the most recent information and includes accident management activities that should be modeled in the Level 2 PRA. Overall, the result is a Level 2 PRA that fully meets the requirements of the ASME PRA Standard with respect to modeling accident management in the LERF assessment and NRC requirements in Regulatory Guide 1.174 for considering late containment failures. This technical paper deals only with the incorporation of operator actions into the Level 2 PRA based on a comprehensive study of the Seabrook Station accident response procedures and guidance. The paper describes the process used to identify the key operator actions that can influence the Level 2 PRA results and the development of success criteria for these key operator actions. This addresses a key requirement of the ASME PRA Standard for considering SAMG. An important benefit of this assessment was the identification of Seabrook specific accident management insights that can be fed back into the Seabrook Station accident management procedures and guidance or the training provided to plant personnel for these procedures and guidance. (authors)

  1. Updating known distribution models for forecasting climate change impact on endangered species.

    Science.gov (United States)

    Muñoz, Antonio-Román; Márquez, Ana Luz; Real, Raimundo

    2013-01-01

    To plan endangered species conservation and to design adequate management programmes, it is necessary to predict their distributional response to climate change, especially under the current situation of rapid change. However, these predictions are customarily done by relating the distribution of the species de novo with climatic conditions, with no regard for previously available knowledge about the factors affecting the species' distribution. We propose to take advantage of known species distribution models, updating them with the variables yielded by climatic models before projecting them into the future. To exemplify our proposal, the availability of suitable habitat across Spain for the endangered Bonelli's Eagle (Aquila fasciata) was modelled by updating a pre-existing model based on current climate and topography with a combination of different general circulation models and Special Report on Emissions Scenarios. Our results suggest that the main threat to this endangered species would not be climate change, since all forecasting models show that its distribution will be maintained and increased in mainland Spain throughout the 21st century. We remark on the importance of linking conservation biology with distribution modelling by updating existing models, which are frequently available for endangered species and account for all the known factors conditioning the species' distribution, instead of building new models based on climate change variables only.

  2. A sow replacement model using Bayesian updating in a three-level hierarchic Markov process. II. Optimization model

    DEFF Research Database (Denmark)

    Kristensen, Anders Ringgaard; Søllested, Thomas Algot

    2004-01-01

    Recent methodological improvements in replacement models comprising multi-level hierarchical Markov processes and Bayesian updating have hardly been implemented in any replacement model, and the aim of this study is to present a sow replacement model that really uses these methodological improvements. The biological model of the replacement model is described in a previous paper, and in this paper the optimization model is described. The model is developed as a prototype for use under practical conditions. The application of the model is demonstrated using data from two commercial Danish sow herds. It is concluded that the Bayesian updating technique and the hierarchical structure decrease the size of the state space dramatically. Since parameter estimates vary considerably among herds it is concluded that decision support concerning sow replacement only makes sense with parameters...

  3. The 2014 update to the National Seismic Hazard Model in California

    Science.gov (United States)

    Powers, Peter; Field, Edward H.

    2015-01-01

    The 2014 update to the U.S. Geological Survey National Seismic Hazard Model in California introduces a new earthquake rate model and new ground motion models (GMMs) that give rise to numerous changes to seismic hazard throughout the state. The updated earthquake rate model is the third version of the Uniform California Earthquake Rupture Forecast (UCERF3), wherein the rates of all ruptures are determined via a self-consistent inverse methodology. This approach accommodates multifault ruptures and reduces the overprediction of moderate earthquake rates exhibited by the previous model (UCERF2). UCERF3 introduces new faults, changes to slip or moment rates on existing faults, and adaptively smoothed gridded seismicity source models, all of which contribute to significant changes in hazard. New GMMs increase ground motion near large strike-slip faults and reduce hazard over dip-slip faults. The addition of very large strike-slip ruptures and decreased reverse fault rupture rates in UCERF3 further enhances these effects.

  4. How update schemes influence crowd simulations

    International Nuclear Information System (INIS)

    Seitz, Michael J; Köster, Gerta

    2014-01-01

    Time discretization is a key modeling aspect of dynamic computer simulations. In current pedestrian motion models based on discrete events, e.g. cellular automata and the Optimal Steps Model, fixed-order sequential updates and shuffle updates are prevalent. We propose to use event-driven updates that process events in the order they occur, and thus better match natural movement. In addition, we present a parallel update with collision detection and resolution for situations where computational speed is crucial. Two simulation studies serve to demonstrate the practical impact of the choice of update scheme. Not only do density-speed relations differ, but there is a statistically significant effect on evacuation times. Fixed-order sequential and random shuffle updates with a short update period come close to event-driven updates. The parallel update scheme overestimates evacuation times. All schemes can be employed for arbitrary simulation models with discrete events, such as car traffic or animal behavior. (paper)
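
    A minimal sketch of how two of the schemes order agent moves, assuming invented agents and speeds rather than the paper's pedestrian models; a move by an agent of speed v takes 1/v time units:

```python
import heapq

# Contrast a fixed-order sequential update with an event-driven update
# for two agents with different speeds (invented toy setup).

speeds = {"a": 2.0, "b": 1.0}          # "a" moves twice as often as "b"

def fixed_order_updates(n_steps):
    """Fixed-order sequential: every agent moves exactly once per step."""
    order = []
    for _ in range(n_steps):
        for agent in sorted(speeds):
            order.append(agent)
    return order

def event_driven_updates(t_end):
    """Event-driven: process moves in the order they occur in time."""
    events = [(1.0 / v, agent) for agent, v in sorted(speeds.items())]
    heapq.heapify(events)
    order = []
    while events:
        t, agent = heapq.heappop(events)
        if t > t_end:
            break
        order.append(agent)
        heapq.heappush(events, (t + 1.0 / speeds[agent], agent))
    return order

seq = fixed_order_updates(2)     # both agents get equal turns per step
evt = event_driven_updates(2.0)  # fast agent correctly moves twice as often
```

    The fixed-order scheme gives both agents one move per step regardless of speed, while the event queue lets the faster agent move twice as often, which is the kind of difference that shows up in the density-speed relations and evacuation times the abstract reports.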

  5. Ambient modal testing of a double-arch dam: the experimental campaign and model updating

    International Nuclear Information System (INIS)

    García-Palacios, Jaime H.; Soria, José M.; Díaz, Iván M.; Tirado-Andrés, Francisco

    2016-01-01

    A finite element model updating of a double-curvature arch dam (La Tajera, Spain) is carried out here using the modal parameters obtained from an operational modal analysis. That is, the system's modal dampings, natural frequencies and mode shapes were identified using output-only identification techniques under environmental loads (wind, vehicles). A finite element model of the dam-reservoir-foundation system was initially created. A testing campaign was then carried out at the most significant test points using high-sensitivity accelerometers, wirelessly synchronized. Afterwards, model updating of the initial model was performed using a Monte Carlo based approach in order to match it to the recorded dynamic behaviour. The updated model may be used within a structural health monitoring system for damage detection or, for instance, for the analysis of the seismic response of the coupled arch dam-reservoir-foundation system. (paper)
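
    The Monte Carlo flavour of model updating can be sketched on a toy 2-DOF "structure": sample candidate stiffnesses and keep the pair whose natural frequencies best match the identified ones. All values are invented and unrelated to the dam model:

```python
import numpy as np

# Monte Carlo model updating sketch: random search over two stiffness
# parameters of a 2-DOF chain with unit masses, scored against
# "identified" natural frequencies. Toy values throughout.

rng = np.random.default_rng(0)

def natural_freqs(k1, k2):
    """Natural frequencies (Hz) of the 2-DOF chain."""
    K = np.array([[k1 + k2, -k2], [-k2, k2]])
    lam = np.linalg.eigvalsh(K)
    return np.sqrt(lam) / (2.0 * np.pi)

f_meas = natural_freqs(4.0e4, 2.5e4)   # pretend these came from the OMA

best, best_err = None, np.inf
for _ in range(5000):
    k1, k2 = rng.uniform(1.0e4, 8.0e4, size=2)
    err = np.linalg.norm(natural_freqs(k1, k2) - f_meas)
    if err < best_err:
        best, best_err = (k1, k2), err
```

    One honest caveat this toy makes visible: with two free stiffnesses, the pair (5.0e4, 2.0e4) reproduces exactly the same two frequencies as (4.0e4, 2.5e4), so matching frequencies alone does not pin down the parameters; real updating campaigns therefore also exploit mode-shape information, as this one does.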

  6. Bacteriophages: update on application as models for viruses in water

    African Journals Online (AJOL)

    Bacteriophages: update on application as models for viruses in water. ... the resistance of human viruses to water treatment and disinfection processes. ... highly sensitive molecular techniques viruses have been detected in drinking water ...

  7. An Updated Site Scale Saturated Zone Ground Water Transport Model For Yucca Mountain

    International Nuclear Information System (INIS)

    S. Kelkar; H. Viswanathan; A. Eddebbarrh; M. Ding; P. Reimus; B. Robinson; B. Arnold; A. Meijer

    2006-01-01

    The Yucca Mountain site scale saturated zone transport model has been revised to incorporate the updated flow model based on a hydrogeologic framework model using the latest lithology data, increased grid resolution that better resolves the geology within the model domain, updated Kd distributions for radionuclides of interest, and updated retardation factor distributions for colloid filtration. The resulting numerical transport model is used for performance assessment predictions of radionuclide transport and to guide future data collection and modeling activities. The transport model results are validated by comparing the model transport pathways with those derived from geochemical data, and by comparing the transit times from the repository footprint to the compliance boundary at the accessible environment with those derived from 14C-based age estimates. The transport model includes the processes of advection, dispersion, fracture flow, matrix diffusion, sorption, and colloid-facilitated transport. The transport of sorbing radionuclides in the aqueous phase is modeled as a linear, equilibrium process using the Kd model. The colloid-facilitated transport of radionuclides is modeled using two approaches: colloids with irreversibly embedded radionuclides undergo reversible filtration only, while the migration of radionuclides that reversibly sorb to colloids is modeled with modified values for the sorption coefficient and matrix diffusion coefficient. Model breakthrough curves for various radionuclides at the compliance boundary are presented along with their sensitivity to various parameters.
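
    For a simple 1-D estimate, the linear-equilibrium Kd model mentioned above reduces to scaling the advective travel time by the retardation factor R = 1 + rho_b * Kd / theta. The property values below are invented for illustration and are not Yucca Mountain site data:

```python
# Retardation-factor sketch for the linear-equilibrium Kd model.
# All property values are assumed, purely for illustration.

rho_b = 1.6e3      # bulk density, kg/m^3 (assumed)
theta = 0.2        # effective (flowing) porosity (assumed)
v_water = 5.0      # ground water velocity, m/yr (assumed)
path_len = 1.8e4   # transport distance, m (assumed)

def travel_time_years(kd):
    """Advective travel time for a species with sorption coefficient Kd (m^3/kg)."""
    r = 1.0 + rho_b * kd / theta    # retardation factor
    return path_len * r / v_water

t_conservative = travel_time_years(0.0)      # non-sorbing tracer: R = 1
t_sorbing = travel_time_years(1.0e-3)        # weakly sorbing species: R = 9
```

    Even a modest Kd multiplies the transit time by an order of magnitude here, which is why the updated Kd distributions drive the breakthrough curves discussed in the abstract.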

  8. A revised model of Jupiter's inner electron belts: Updating the Divine radiation model

    Science.gov (United States)

    Garrett, Henry B.; Levin, Steven M.; Bolton, Scott J.; Evans, Robin W.; Bhattacharya, Bidushi

    2005-02-01

    In 1983, Divine presented a comprehensive model of the Jovian charged particle environment that has long served as a reference for missions to Jupiter. However, in situ observations by Galileo and synchrotron observations from Earth indicate the need to update the model in the inner radiation zone. Specifically, a review of the model for 1 MeV data. Further modifications incorporating observations from the Galileo and Cassini spacecraft will be reported in the future.

  9. Experimental liver fibrosis research: update on animal models, legal issues and translational aspects

    Science.gov (United States)

    2013-01-01

    Liver fibrosis is defined as excessive extracellular matrix deposition and is based on complex interactions between matrix-producing hepatic stellate cells and an abundance of liver-resident and infiltrating cells. Investigation of these processes requires in vitro and in vivo experimental work in animals. However, the use of animals in translational research will be increasingly challenged, at least in countries of the European Union, because of the adoption of new animal welfare rules in 2013. These rules will create an urgent need for optimized standard operating procedures regarding animal experimentation and improved international communication in the liver fibrosis community. This review gives an update on current animal models, techniques and underlying pathomechanisms with the aim of fostering a critical discussion of the limitations and potential of up-to-date animal experimentation. We discuss potential complications in experimental liver fibrosis and provide examples of how the findings of studies in which these models are used can be translated to human disease and therapy. In this review, we want to motivate the international community to design more standardized animal models which might help to address the legally requested replacement, refinement and reduction of animals in fibrosis research. PMID:24274743

  10. Research of Cadastral Data Modelling and Database Updating Based on Spatio-temporal Process

    Directory of Open Access Journals (Sweden)

    ZHANG Feng

    2016-02-01

    The core of modern cadastre management is to renew the cadastral database and keep its currency, topological consistency and integrity. This paper analyses the changes to various cadastral objects, and their linkage, in the update process. Combining object-oriented modelling techniques with the expression of spatio-temporal object evolution, the paper proposes a cadastral data updating model based on the spatio-temporal process. Change rules based on the spatio-temporal topological relations of evolving cadastral spatio-temporal objects are drafted; furthermore, cascade updating and history back-tracing of cadastral features, land use and buildings are realized. This model is implemented in the cadastral management system ReGIS. Cascade changes are triggered by direct driving forces or perceived external events. The system records the evolution process of spatio-temporal objects to facilitate the reconstruction of history, change tracking, and the analysis and forecasting of future changes.

  11. A hierarchical updating method for finite element model of airbag buffer system under landing impact

    Directory of Open Access Journals (Sweden)

    He Huan

    2015-12-01

    In this paper, we propose an impact finite element (FE) model for an airbag landing buffer system. First, an impact FE model is formulated for a typical airbag landing buffer system. We use the independence of the structure FE model from the full impact FE model to develop a hierarchical updating scheme for the recovery module FE model and the airbag system FE model. Second, we define impact responses at key points for comparing the computational and experimental results, resolving the inconsistency between the experimental data sampling frequency and experimental triggering. To determine the typical characteristics of the impact dynamics response of the airbag landing buffer system, we present impact response confidence factors (IRCFs) to evaluate how consistent the computational and experimental results are. An error function is defined between the experimental and computational results at key points of the impact response (KPIR) to serve as a modified objective function. A radial basis function (RBF) is introduced to construct updating variables for a surrogate model of the objective function, thereby converting the FE model updating problem into a tractable optimization problem. Finally, the developed method is validated using an experimental and computational study on the impact dynamics of a classic airbag landing buffer system.
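
    The RBF-surrogate step can be sketched generically: interpolate a handful of evaluations of an expensive error function with Gaussian RBFs, then minimize the cheap interpolant instead. The error function, shape parameter and design points below are stand-ins, not the airbag system model:

```python
import numpy as np

# Gaussian RBF surrogate of an "expensive" scalar error function,
# minimized by dense evaluation of the cheap interpolant. Toy setup.

def true_error(x):                     # pretend each call is one FE run
    return (x - 0.7) ** 2 + 0.1

centers = np.linspace(0.0, 2.0, 9)     # design points (expensive evaluations)
values = true_error(centers)

eps = 3.0                              # RBF shape parameter (assumed)
def phi(r):
    return np.exp(-(eps * r) ** 2)     # Gaussian basis

A = phi(np.abs(centers[:, None] - centers[None, :]))
w = np.linalg.solve(A, values)         # interpolation weights

def surrogate(x):
    x = np.atleast_1d(x)
    return phi(np.abs(x[:, None] - centers[None, :])) @ w

xs = np.linspace(0.0, 2.0, 2001)       # cheap dense search on the surrogate
x_best = float(xs[np.argmin(surrogate(xs))])
```

    Nine model evaluations plus thousands of surrogate evaluations locate the minimum, which is the economics that makes surrogate-based FE model updating attractive.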

  12. Procedural Modeling for Digital Cultural Heritage

    Directory of Open Access Journals (Sweden)

    Simon Haegler

    2009-01-01

    Full Text Available The rapid development of computer graphics and imaging provides the modern archeologist with several tools to realistically model and visualize archeological sites in 3D. This, however, creates a tension between veridical and realistic modeling. Visually compelling models may lead people to falsely believe that there exists very precise knowledge about the past appearance of a site. In order to make the underlying uncertainty visible, it has been proposed to encode this uncertainty with different levels of transparency in the rendering, or of decoloration of the textures. We argue that procedural modeling technology based on shape grammars provides an interesting alternative to such measures, as they tend to spoil the experience for the observer. Both its efficiency and compactness make procedural modeling a tool to produce multiple models, which together sample the space of possibilities. Variations between the different models express levels of uncertainty implicitly, while letting each individual model keeping its realistic appearance. The underlying, structural description makes the uncertainty explicit. Additionally, procedural modeling also yields the flexibility to incorporate changes as knowledge of an archeological site gets refined. Annotations explaining modeling decisions can be included. We demonstrate our procedural modeling implementation with several recent examples.

  13. A State Space Model for Spatial Updating of Remembered Visual Targets during Eye Movements.

    Science.gov (United States)

    Mohsenzadeh, Yalda; Dash, Suryadeep; Crawford, J Douglas

    2016-01-01

    In the oculomotor system, spatial updating is the ability to aim a saccade toward a remembered visual target position despite intervening eye movements. Although this has been the subject of extensive experimental investigation, there is still no unifying theoretical framework to explain the neural mechanism for this phenomenon, and how it influences visual signals in the brain. Here, we propose a unified state-space model (SSM) to account for the dynamics of spatial updating during two types of eye movement; saccades and smooth pursuit. Our proposed model is a non-linear SSM and implemented through a recurrent radial-basis-function neural network in a dual Extended Kalman filter (EKF) structure. The model parameters and internal states (remembered target position) are estimated sequentially using the EKF method. The proposed model replicates two fundamental experimental observations: continuous gaze-centered updating of visual memory-related activity during smooth pursuit, and predictive remapping of visual memory activity before and during saccades. Moreover, our model makes the new prediction that, when uncertainty of input signals is incorporated in the model, neural population activity and receptive fields expand just before and during saccades. These results suggest that visual remapping and motor updating are part of a common visuomotor mechanism, and that subjective perceptual constancy arises in part from training the visual system on motor tasks.
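
    A deliberately simplified linear 1-D Kalman filter conveys the predict/correct logic of spatial updating (the paper's model is a nonlinear SSM implemented as a recurrent RBF network with dual extended Kalman filtering, which is not reproduced here). All numbers are invented:

```python
# 1-D linear Kalman sketch of gaze-centred spatial updating:
# an eye movement shifts the remembered eye-centred target position,
# and a noisy visual observation refines it. Invented numbers.

def predict(x, p, eye_step, q=0.01):
    """Eye moves by eye_step, so the eye-centred target shifts by -eye_step."""
    return x - eye_step, p + q     # q: process noise added per movement

def correct(x, p, z, r=0.25):
    """Fuse a noisy visual observation z of the remembered target."""
    k = p / (p + r)                # Kalman gain (r: observation noise)
    return x + k * (z - x), (1.0 - k) * p

x, p = 10.0, 1.0                   # remembered target: 10 deg right of gaze
x, p = predict(x, p, eye_step=4.0) # a 4 deg saccade toward the target
x, p = correct(x, p, z=5.8)        # brief visual feedback refines the estimate
```

    The predict step is the motor-driven remapping (uncertainty grows with each movement) and the correct step is the visual update; the paper's prediction about receptive fields expanding around saccades corresponds to the growth of p during predict.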

  14. Lazy Updating of hubs can enable more realistic models by speeding up stochastic simulations

    International Nuclear Information System (INIS)

    Ehlert, Kurt; Loewe, Laurence

    2014-01-01

    To respect the nature of discrete parts in a system, stochastic simulation algorithms (SSAs) must update for each action (i) all part counts and (ii) each action's probability of occurring next and its timing. This makes it expensive to simulate biological networks with well-connected “hubs” such as ATP that affect many actions. Temperature and volume also affect many actions and may be changed significantly in small steps by the network itself during fever and cell growth, respectively. Such trends matter for evolutionary questions, as cell volume determines doubling times and fever may affect survival, both key traits for biological evolution. Yet simulations often ignore such trends and assume constant environments to avoid many costly probability updates. Such computational convenience precludes analyses of important aspects of evolution. Here we present “Lazy Updating,” an add-on for SSAs designed to reduce the cost of simulating hubs. When a hub changes, Lazy Updating postpones all probability updates for reactions depending on this hub, until a threshold is crossed. Speedup is substantial if most computing time is spent on such updates. We implemented Lazy Updating for the Sorting Direct Method and it is easily integrated into other SSAs such as Gillespie's Direct Method or the Next Reaction Method. Testing on several toy models and a cellular metabolism model showed >10× faster simulations for its use-cases, with a small loss of accuracy. Thus we see Lazy Updating as a valuable tool for some special but important simulation problems that are difficult to address efficiently otherwise.
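
    A minimal sketch of the Lazy Updating idea grafted onto Gillespie's direct method, with an invented toy network (one hub species H consumed via two catalytic channels plus a zeroth-order source); the threshold logic, not the chemistry, is the point:

```python
import random

# Gillespie direct method with lazy refresh of hub-dependent propensities:
# they are recomputed only once the hub count H has drifted by more than a
# relative threshold since the last refresh. Toy rates and counts.

random.seed(1)
x = {"H": 10000, "A": 100, "B": 100}
k = {"use_H_A": 1e-6, "use_H_B": 1e-6, "make_H": 5.0}

def exact_propensities():
    return {
        "use_H_A": k["use_H_A"] * x["H"] * x["A"],  # A catalyses H consumption
        "use_H_B": k["use_H_B"] * x["H"] * x["B"],  # B catalyses H consumption
        "make_H": k["make_H"],                      # zeroth-order source of H
    }

a = exact_propensities()
h_at_refresh = x["H"]
threshold = 0.05                   # refresh once H has changed by >5%
lazy_refreshes = 0
t = 0.0

for _ in range(2000):
    a_total = sum(a.values())
    t += random.expovariate(a_total)            # time to next event
    u, acc = random.uniform(0.0, a_total), 0.0
    chosen = "make_H"                           # fallback for fp round-off
    for name, ai in a.items():                  # pick reaction ~ propensity
        acc += ai
        if u <= acc:
            chosen = name
            break
    x["H"] += 1 if chosen == "make_H" else -1   # only the hub count changes
    # Lazy step: skip recomputing H-dependent propensities until threshold
    if abs(x["H"] - h_at_refresh) > threshold * h_at_refresh:
        a = exact_propensities()
        h_at_refresh = x["H"]
        lazy_refreshes += 1
```

    With thousands of events but only a handful of refreshes, the propensity-update cost is decoupled from the hub's event rate, at the price of simulating with slightly stale propensities between refreshes, which is the accuracy trade-off the abstract describes.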

  15. Machine learning in updating predictive models of planning and scheduling transportation projects

    Science.gov (United States)

    1997-01-01

    A method combining machine learning and regression analysis to automatically and intelligently update predictive models used in the Kansas Department of Transportation's (KDOT's) internal management system is presented. The predictive models used...

  16. An Update on Modifications to Water Treatment Plant Model

    Science.gov (United States)

    The water treatment plant (WTP) model is an EPA tool for informing regulatory options. WTP has a few versions: 1) WTP2.2 can help in regulatory analysis. An updated version (WTP3.0) will allow plant-specific analysis (WTP-ccam) and thus help meet plant-specific treatment objectives...

  17. Quantifying Update Effects in Citizen-Oriented Software

    Directory of Open Access Journals (Sweden)

    Ion Ivan

    2009-02-01

    Defining citizen-oriented software. Detailing technical issues regarding the update process in this kind of software. Presenting different effects triggered by types of update. Building a model for update cost estimation, including producer-side and consumer-side effects. Analyzing the model's applicability to INVMAT – large-scale matrix inversion software. Proposing a model for update effects estimation. Specifying ways to soften the effects of inaccurate updates.

  18. The updated geodetic mean dynamic topography model – DTU15MDT

    DEFF Research Database (Denmark)

    Knudsen, Per; Andersen, Ole Baltazar; Maximenko, Nikolai

    An update to the global mean dynamic topography model DTU13MDT is presented. For DTU15MDT the newer gravity model EIGEN-6C4 has been combined with the DTU15MSS mean sea surface model to construct this global mean dynamic topography model. The EIGEN-6C4 is derived using the full series of GOCE data...

  19. Updating representation of land surface-atmosphere feedbacks in airborne campaign modeling analysis

    Science.gov (United States)

    Huang, M.; Carmichael, G. R.; Crawford, J. H.; Chan, S.; Xu, X.; Fisher, J. A.

    2017-12-01

    An updated modeling system to support airborne field campaigns is being built at NASA Ames Pleiades, with a focus on adjusting the representation of land surface-atmosphere feedbacks. The main updates, referring to previous experiences with ARCTAS-CARB and CalNex in the western US to study air pollution inflows, include: 1) migrating the WRF (Weather Research and Forecasting) coupled land surface model from Noah to improved/more complex models, especially Noah-MP and Rapid Update Cycle; 2) enabling WRF land initialization with suitably spun-up land model output; and 3) incorporating satellite land cover, vegetation dynamics, and soil moisture data (i.e., assimilating Soil Moisture Active Passive data using the ensemble Kalman filter approach) into WRF. Examples are given comparing the model fields with available aircraft observations during the spring-summer 2016 field campaigns conducted on the eastern sides of continents (KORUS-AQ in South Korea and ACT-America in the eastern US), the air pollution export regions. Under fair weather and stormy conditions, air pollution vertical distributions and column amounts, as well as the impact of the land surface, are compared. These comparisons help identify challenges and opportunities for LEO/GEO satellite remote sensing and modeling of air quality in the northern hemisphere. Finally, we briefly show applications of this system to simulating Australian conditions, which explore the needs for further development of the observing system in the southern hemisphere and inform the Clean Air and Urban Landscapes (https://www.nespurban.edu.au) modelers.

  20. Nonlinear model updating applied to the IMAC XXXII Round Robin benchmark system

    Science.gov (United States)

    Kurt, Mehmet; Moore, Keegan J.; Eriten, Melih; McFarland, D. Michael; Bergman, Lawrence A.; Vakakis, Alexander F.

    2017-05-01

    We consider the application of a new nonlinear model updating strategy to a computational benchmark system. The approach relies on analyzing system response time series in the frequency-energy domain by constructing both Hamiltonian and forced and damped frequency-energy plots (FEPs). The system parameters are then characterized and updated by matching the backbone branches of the FEPs with the frequency-energy wavelet transforms of experimental and/or computational time series. The main advantage of this method is that no nonlinearity model is assumed a priori, and the system model is updated solely based on simulation and/or experimental measured time series. By matching the frequency-energy plots of the benchmark system and its reduced-order model, we show that we are able to retrieve the global strongly nonlinear dynamics in the frequency and energy ranges of interest, identify bifurcations, characterize local nonlinearities, and accurately reconstruct time series. We apply the proposed methodology to a benchmark problem, which was posed to the system identification community prior to the IMAC XXXII (2014) and XXXIII (2015) Conferences as a "Round Robin Exercise on Nonlinear System Identification". We show that we are able to identify the parameters of the non-linear element in the problem with a priori knowledge about its position.

  1. Seismic source characterization for the 2014 update of the U.S. National Seismic Hazard Model

    Science.gov (United States)

    Moschetti, Morgan P.; Powers, Peter; Petersen, Mark D.; Boyd, Oliver; Chen, Rui; Field, Edward H.; Frankel, Arthur; Haller, Kathleen; Harmsen, Stephen; Mueller, Charles S.; Wheeler, Russell; Zeng, Yuehua

    2015-01-01

    We present the updated seismic source characterization (SSC) for the 2014 update of the National Seismic Hazard Model (NSHM) for the conterminous United States. Construction of the seismic source models employs the methodology that was developed for the 1996 NSHM but includes new and updated data, data types, source models, and source parameters that reflect the current state of knowledge of earthquake occurrence and state of practice for seismic hazard analyses. We review the SSC parameterization and describe the methods used to estimate earthquake rates, magnitudes, locations, and geometries for all seismic source models, with an emphasis on new source model components. We highlight the effects that two new model components—incorporation of slip rates from combined geodetic-geologic inversions and the incorporation of adaptively smoothed seismicity models—have on probabilistic ground motions, because these sources span multiple regions of the conterminous United States and provide important additional epistemic uncertainty for the 2014 NSHM.
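    The smoothed-seismicity component mentioned above turns a catalog of past epicenters into a spatially smooth grid of earthquake rates. A much-simplified sketch of the idea follows; note that the NSHM implementation adapts the kernel bandwidth to local event density, whereas this toy version uses a fixed bandwidth, and the grid, counts, and bandwidth are invented for illustration.

```python
import numpy as np

def smooth_seismicity(counts, sigma_cells):
    """Smooth a 2-D grid of earthquake counts with a fixed-bandwidth
    Gaussian kernel, preserving each event's unit weight. A toy stand-in
    for smoothed-seismicity source models; real implementations work on
    geographic grids and may adapt the bandwidth to event density."""
    ny, nx = counts.shape
    y, x = np.mgrid[0:ny, 0:nx]
    out = np.zeros_like(counts, dtype=float)
    for iy, ix in zip(*np.nonzero(counts)):
        w = np.exp(-((x - ix) ** 2 + (y - iy) ** 2) / (2.0 * sigma_cells**2))
        out += counts[iy, ix] * w / w.sum()   # normalize: event weight is kept
    return out

counts = np.zeros((21, 21))
counts[10, 10] = 5                 # five events in a single cell
rates = smooth_seismicity(counts, sigma_cells=2.0)
print(round(rates.sum(), 6))       # total rate is preserved: 5.0
```

Normalizing each kernel by its own grid sum (rather than the analytic 2-D Gaussian norm) keeps the total rate exact even when the kernel is truncated at the grid edge.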

  2. 77 FR 19077 - Adoption of Updated EDGAR Filer Manual

    Science.gov (United States)

    2012-03-30

    ... practice, publication for notice and comment is not required under the Administrative Procedure Act (APA...-30008] Adoption of Updated EDGAR Filer Manual AGENCY: Securities and Exchange Commission. ACTION: Final... Electronic Data Gathering, Analysis, and Retrieval System (EDGAR) Filer Manual to reflect updates to the...

  3. 76 FR 73506 - Adoption of Updated EDGAR Filer Manual

    Science.gov (United States)

    2011-11-29

    ... practice, publication for notice and comment is not required under the Administrative Procedure Act (APA...-29868] Adoption of Updated EDGAR Filer Manual AGENCY: Securities and Exchange Commission. ACTION: Final... Electronic Data Gathering, Analysis, and Retrieval System (EDGAR) Filer Manual to reflect updates to the...

  4. "Updates to Model Algorithms & Inputs for the Biogenic Emissions Inventory System (BEIS) Model"

    Science.gov (United States)

    We have developed new canopy emission algorithms and land use data for BEIS. Simulations with BEIS v3.4 and these updates in CMAQ v5.0.2 are compared to the Model of Emissions of Gases and Aerosols from Nature (MEGAN) and evaluated against observatio...

  5. A procedure for building product models

    DEFF Research Database (Denmark)

    Hvam, Lars; Riis, Jesper; Malis, Martin

    2001-01-01

    This article presents a procedure for building product models to support the specification processes dealing with sales, design of product variants and production preparation. The procedure includes, as the first phase, an analysis and redesign of the business processes, which are to be supported...... with product models. The next phase includes an analysis of the product assortment, and the set up of a so-called product master. Finally the product model is designed and implemented using object oriented modelling. The procedure is developed in order to ensure that the product models constructed are fit...... for the business processes they support, and properly structured and documented, in order to facilitate that the systems can be maintained continually and further developed. The research has been carried out at the Centre for Industrialisation of Engineering, Department of Manufacturing Engineering, Technical...

  6. Using temporal information to construct, update, and retrieve situation models of narratives

    NARCIS (Netherlands)

    Rinck, M.; Hähnel, A.; Becker, G.

    2001-01-01

    Four experiments explored how readers use temporal information to construct and update situation models and retrieve them from memory. In Experiment 1, readers spontaneously constructed temporal and spatial situation models of single sentences. In Experiment 2, temporal inconsistencies caused

  7. Updating and prospective validation of a prognostic model for high sickness absence

    NARCIS (Netherlands)

    Roelen, C.A.M.; Heymans, M.W.; Twisk, J.W.R.; van Rhenen, W.; Pallesen, S.; Bjorvatn, B.; Moen, B.E.; Mageroy, N.

    2015-01-01

    Objectives To further develop and validate a Dutch prognostic model for high sickness absence (SA). Methods Three-wave longitudinal cohort study of 2,059 Norwegian nurses. The Dutch prognostic model was used to predict high SA among Norwegian nurses at wave 2. Subsequently, the model was updated by

  8. Model Hosting for continuous updating and transparent Water Resources Management

    Science.gov (United States)

    Jódar, Jorge; Almolda, Xavier; Batlle, Francisco; Carrera, Jesús

    2013-04-01

    Numerical models have become a standard tool for water resources management. They are required for water volume bookkeeping and help in decision making. Nevertheless, numerical models are complex and can be used only by highly qualified technicians, who are often far from the decision makers. Moreover, they need to be maintained. That is, they require updating of their state, by assimilation of measurements, of natural and anthropic actions (e.g., pumping and weather data), and of model parameters. Worse, their very complexity means that they are viewed as obscure and remote, which hinders transparency and governance. We propose internet model hosting as an alternative to overcome these limitations. The basic idea is to keep the model hosted in the cloud. The model is updated as new data (measurements and external forcing) become available, which ensures continuous maintenance with minimal human cost (required only to address modelling problems). Internet access facilitates model use not only by modellers, but also by people responsible for data gathering and by water managers. As a result, the model becomes an institutional tool shared by water agencies to help them not only in decision making for sustainable management of water resources, but also in generating a common discussion platform. By promoting intra-agency sharing, the model becomes the common official position of the agency, which facilitates commitment to its adopted decisions regarding water management. Moreover, by facilitating access for stakeholders and the general public, the state of the aquifer and the impacts of alternative decisions become transparent. We have developed a tool (GAC, Global Aquifer Control) to address the above requirements. The application has been developed using Cloud Computing technologies, which facilitate the above operations. That is, GAC automatically updates the numerical models with the new available measurements, and then simulates numerous management options

  9. Design Transformations for Rule-based Procedural Modeling

    KAUST Repository

    Lienhard, Stefan; Lau, Cheryl; Mü ller, Pascal; Wonka, Peter; Pauly, Mark

    2017-01-01

    We introduce design transformations for rule-based procedural models, e.g., for buildings and plants. Given two or more procedural designs, each specified by a grammar, a design transformation combines elements of the existing designs to generate new designs. We introduce two technical components to enable design transformations. First, we extend the concept of discrete rule switching to rule merging, leading to a very large shape space for combining procedural models. Second, we propose an algorithm to jointly derive two or more grammars, called grammar co-derivation. We demonstrate two applications of our work: we show that our framework leads to a larger variety of models than previous work, and we show fine-grained transformation sequences between two procedural models.
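    As a toy illustration of rule-based derivation with productions drawn from more than one grammar: the actual rule-merging and grammar co-derivation machinery of the paper is far more involved, and all grammars, rules, and symbols below are invented.

```python
import random

# Two toy "facade" grammars. Letting one derivation pick rules from either
# grammar is a much-simplified nod to rule merging.
GRAMMAR_A = {"Facade": [["Floor", "Facade"], ["Floor"]],
             "Floor": [["window", "window"]]}
GRAMMAR_B = {"Facade": [["Floor"]],
             "Floor": [["door", "window"]]}

def derive(symbol, grammars, rng, depth=0, max_depth=6):
    """Expand `symbol` by picking, at each step, a production from any
    grammar that defines it; lowercase symbols are terminals."""
    if symbol.islower() or depth >= max_depth:
        return [symbol]
    options = [rule for g in grammars if symbol in g for rule in g[symbol]]
    out = []
    for s in rng.choice(options):
        out.extend(derive(s, grammars, rng, depth + 1, max_depth))
    return out

print(derive("Facade", [GRAMMAR_A, GRAMMAR_B], random.Random(0)))
```

Seeding the random generator makes a derivation reproducible, which is convenient when exploring the combined shape space.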

  11. Using radar altimetry to update a routing model of the Zambezi River Basin

    DEFF Research Database (Denmark)

    Michailovsky, Claire Irene B.; Bauer-Gottwein, Peter

    2012-01-01

    Satellite radar altimetry allows for the global monitoring of lake and river levels. However, the widespread use of altimetry for hydrological studies is limited by the coarse temporal and spatial resolution provided by current altimetric missions and the fact that discharge rather than level...... is needed for hydrological applications. To overcome these limitations, altimetry river levels can be combined with hydrological modeling in a data-assimilation framework. This study focuses on the updating of a river routing model of the Zambezi using river levels from radar altimetry. A hydrological model...... of the basin was built to simulate the land phase of the water cycle and produce inflows to a Muskingum routing model. River altimetry from the ENVISAT mission was then used to update the storages in the reaches of the Muskingum model using the Extended Kalman Filter. The method showed improvements in modeled...
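    The two building blocks named in the abstract, a Muskingum routing step and a Kalman-style update of the routed state against an altimetry-derived observation, can be sketched as follows. The reach parameters, variances, and the assumption that altimetric stage has already been converted to discharge are all illustrative; the study itself updates Muskingum storages with an Extended Kalman Filter, which this scalar sketch only gestures at.

```python
# Muskingum coefficients from travel time K [h], weighting X, time step dt [h]
# (parameter values invented for illustration)
K, X, dt = 12.0, 0.2, 6.0
den = 2 * K * (1 - X) + dt
C0 = (dt - 2 * K * X) / den
C1 = (dt + 2 * K * X) / den
C2 = (2 * K * (1 - X) - dt) / den   # C0 + C1 + C2 == 1 by construction

def route_step(inflow_prev, inflow_now, outflow_prev):
    """One Muskingum routing step: O2 = C0*I2 + C1*I1 + C2*O1."""
    return C0 * inflow_now + C1 * inflow_prev + C2 * outflow_prev

def kalman_update(model_q, model_var, obs_q, obs_var):
    """Scalar Kalman update blending modeled and observed discharge."""
    gain = model_var / (model_var + obs_var)
    return model_q + gain * (obs_q - model_q), (1 - gain) * model_var

q_model = route_step(100.0, 150.0, 100.0)              # routed outflow [m3/s]
q_upd, var_upd = kalman_update(q_model, 400.0, 110.0, 100.0)
```

The updated discharge always lies between the model value and the observation, weighted by their relative uncertainties.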

  12. 78 FR 4766 - Adoption of Updated EDGAR Filer Manual

    Science.gov (United States)

    2013-01-23

    ..., publication for notice and comment is not required under the Administrative Procedure Act (APA).\\7\\ It follows...-68644; 39-2488; IC-30348] Adoption of Updated EDGAR Filer Manual AGENCY: Securities and Exchange...) Filer Manual and related rules to reflect updates to the EDGAR system. The revisions are being made...

  13. MARMOT update for oxide fuel modeling

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Yongfeng [Idaho National Lab. (INL), Idaho Falls, ID (United States); Schwen, Daniel [Idaho National Lab. (INL), Idaho Falls, ID (United States); Chakraborty, Pritam [Idaho National Lab. (INL), Idaho Falls, ID (United States); Jiang, Chao [Idaho National Lab. (INL), Idaho Falls, ID (United States); Aagesen, Larry [Idaho National Lab. (INL), Idaho Falls, ID (United States); Ahmed, Karim [Idaho National Lab. (INL), Idaho Falls, ID (United States); Jiang, Wen [Idaho National Lab. (INL), Idaho Falls, ID (United States); Biner, Bulent [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bai, Xianming [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Tonks, Michael [Pennsylvania State Univ., University Park, PA (United States); Millett, Paul [Univ. of Arkansas, Fayetteville, AR (United States)

    2016-09-01

    This report summarizes the lower-length-scale research and development progress in FY16 at Idaho National Laboratory in developing mechanistic materials models for oxide fuels, in parallel with the development of the MARMOT code, which will be summarized in a separate report. This effort is a critical component of the microstructure-based fuel performance modeling approach, supported by the Fuels Product Line in the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. The progress can be classified into three categories: 1) development of materials models to be used in engineering-scale fuel performance modeling regarding the effect of lattice defects on thermal conductivity, 2) development of modeling capabilities for mesoscale fuel behaviors including stage-3 gas release, grain growth, high burn-up structure, fracture and creep, and 3) improved understanding of materials science by calculating the anisotropic grain boundary energies in UO$_2$ and obtaining thermodynamic data for solid fission products. Many of these topics are still under active development; they are reported here with an appropriate level of detail. For some topics, separate reports have been generated in parallel, as stated in the text. These accomplishments have led to a better understanding of fuel behaviors and enhanced the capability of the MOOSE-BISON-MARMOT toolkit.

  14. A PSO Driven Intelligent Model Updating and Parameter Identification Scheme for Cable-Damper System

    Directory of Open Access Journals (Sweden)

    Danhui Dan

    2015-01-01

    The precise measurement of the cable force is very important for monitoring and evaluating the operational status of cable structures such as cable-stayed bridges. The cable system should be installed with lateral dampers to reduce vibration, which affects the precise measurement of the cable force and other cable parameters. This paper proposes a cable model updating scheme driven by the particle swarm optimization (PSO) algorithm. After first establishing a finite element model that accounts for static geometric nonlinearity and the stress-stiffening effect, an automatic finite element model updating procedure powered by the PSO algorithm is proposed, with the aim of precisely identifying the cable force and the relevant parameters of the cable-damper system. Both numerical case studies and full-scale cable tests indicated that, after two rounds of the updating process, the algorithm can accurately identify the cable force, moment of inertia, and damping coefficient of the cable-damper system.
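    A minimal sketch of PSO-driven parameter identification, using a taut-string frequency model in place of the paper's geometrically nonlinear finite element model: the cable properties, the synthetic "measurements", and all PSO hyperparameters below are invented for illustration.

```python
import math
import random

# Cable properties and synthetic "measured" frequencies (values invented):
L_c, m_c = 100.0, 50.0          # length [m], mass per unit length [kg/m]
T_true = 4.0e6                  # tension [N], used only to synthesize data
f_meas = [n / (2 * L_c) * math.sqrt(T_true / m_c) for n in (1, 2, 3)]

def objective(T):
    """Squared misfit between measured and taut-string model frequencies."""
    return sum((n / (2 * L_c) * math.sqrt(T / m_c) - f) ** 2
               for n, f in zip((1, 2, 3), f_meas))

def pso(obj, lo, hi, n_particles=20, iters=60, seed=1):
    """Minimal particle swarm optimizer over one scalar parameter."""
    rng = random.Random(seed)
    x = [rng.uniform(lo, hi) for _ in range(n_particles)]
    v = [0.0] * n_particles
    pbest, pval = x[:], [obj(p) for p in x]
    g = pbest[pval.index(min(pval))]          # global best so far
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            v[i] = 0.7 * v[i] + 1.5 * r1 * (pbest[i] - x[i]) + 1.5 * r2 * (g - x[i])
            x[i] = min(max(x[i] + v[i], lo), hi)   # keep within bounds
            fx = obj(x[i])
            if fx < pval[i]:
                pbest[i], pval[i] = x[i], fx
        g = pbest[pval.index(min(pval))]
    return g

T_est = pso(objective, 1.0e6, 1.0e7)
```

The real scheme iterates PSO against a finite element model rather than a closed-form frequency formula, but the loop structure is the same.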

  15. Recent Updates to the System Advisor Model (SAM)

    Energy Technology Data Exchange (ETDEWEB)

    DiOrio, Nicholas A [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-02-14

    The System Advisor Model (SAM) is a mature suite of techno-economic models for many renewable energy technologies that can be downloaded for free as a desktop application or software development kit. SAM is used for system-level modeling, including generating performance profiles. A major recent update is the release of the code as an open source project on GitHub. Other additions that will be covered include the ability to download data directly into SAM from the National Solar Radiation Database (NSRDB) and updates to a user-interface macro that assists with PV system sizing. A brief update on SAM's battery model and its integration with the detailed photovoltaic model will also be discussed. Finally, an outline of planned work for the next year will be presented, including the addition of a bifacial model, support for multiple MPPT inputs for detailed inverter modeling, and the addition of a model for inverter thermal behavior.

  16. Guideline for Monitoring and Management of Pediatric Patients Before, During, and After Sedation for Diagnostic and Therapeutic Procedures: Update 2016.

    Science.gov (United States)

    2016-10-15

    The safe sedation of children for procedures requires a systematic approach that includes the following: no administration of sedating medication without the safety net of medical/dental supervision, careful presedation evaluation for underlying medical or surgical conditions that would place the child at increased risk from sedating medications, appropriate fasting for elective procedures and a balance between the depth of sedation and risk for those who are unable to fast because of the urgent nature of the procedure, a focused airway examination for large (kissing) tonsils or anatomic airway abnormalities that might increase the potential for airway obstruction, a clear understanding of the medication's pharmacokinetic and pharmacodynamic effects and drug interactions, appropriate training and skills in airway management to allow rescue of the patient, age- and size-appropriate equipment for airway management and venous access, appropriate medications and reversal agents, sufficient numbers of staff to both carry out the procedure and monitor the patient, appropriate physiologic monitoring during and after the procedure, a properly equipped and staffed recovery area, recovery to the presedation level of consciousness before discharge from medical/dental supervision, and appropriate discharge instructions. This report was developed through a collaborative effort of the American Academy of Pediatrics and the American Academy of Pediatric Dentistry to offer pediatric providers updated information and guidance in delivering safe sedation to children.

  17. Construction and Updating of Event Models in Auditory Event Processing

    Science.gov (United States)

    Huff, Markus; Maurer, Annika E.; Brich, Irina; Pagenkopf, Anne; Wickelmaier, Florian; Papenmeier, Frank

    2018-01-01

    Humans segment the continuous stream of sensory information into distinct events at points of change. Between 2 events, humans perceive an event boundary. Present theories propose changes in the sensory information to trigger updating processes of the present event model. Increased encoding effort finally leads to a memory benefit at event…

  18. A Survey on Procedural Modelling for Virtual Worlds

    NARCIS (Netherlands)

    Smelik, R.M.; Tutenel, T.; Bidarra, R.; Benes, B.

    2014-01-01

    Procedural modelling deals with (semi-)automatic content generation by means of a program or procedure. Among other advantages, its data compression and the potential to generate a large variety of detailed content with reduced human intervention, have made procedural modelling attractive for

  19. Comparison of two optimization algorithms for fuzzy finite element model updating for damage detection in a wind turbine blade

    Science.gov (United States)

    Turnbull, Heather; Omenzetter, Piotr

    2018-03-01

    Difficulties associated with current health monitoring and inspection practices, combined with the harsh, often remote, operational environments of wind turbines, highlight the requirement for a non-destructive evaluation system capable of remotely monitoring the current structural state of turbine blades. This research adopted a physics-based structural health monitoring methodology through calibration of a finite element model using inverse techniques. A 2.36 m blade from a 5 kW turbine was used as an experimental specimen, with operational modal analysis techniques utilised to obtain the modal properties of the system. Modelling the experimental responses as fuzzy numbers using the sub-level technique, uncertainty in the response parameters was propagated back through the model and into the updating parameters. Initially, experimental responses of the blade were obtained, and a numerical model of the blade was created and updated. Deterministic updating was carried out through formulation and minimisation of a deterministic objective function using both the firefly algorithm and the virus optimisation algorithm. Uncertainty in the experimental responses was modelled using triangular membership functions, allowing membership functions of the updating parameters (Young's modulus and shear modulus) to be obtained. The firefly algorithm and virus optimisation algorithm were again utilised, this time in the solution of fuzzy objective functions, enabling the uncertainty associated with the updating parameters to be quantified. Varying damage location and severity were simulated experimentally through the addition of small masses to the structure, intended to cause a structural alteration. A damaged model was created, modelling four variable-magnitude nonstructural masses at predefined points, and updated to provide a deterministic damage prediction as well as information on the parameters' uncertainty via fuzzy updating.
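    The sub-level (alpha-cut) idea, slicing a triangular fuzzy response into intervals and mapping each interval back to an interval of the updating parameter, can be sketched with a deliberately simple one-parameter monotone model. Real fuzzy updating, as in the paper, requires running an optimizer such as the firefly algorithm at every alpha level; the frequencies and the model below are invented.

```python
import math

def tri_alpha_cut(lo, peak, hi, alpha):
    """Interval of a triangular fuzzy number at membership level alpha."""
    return (lo + alpha * (peak - lo), hi - alpha * (hi - peak))

def freq_model(E):
    """Toy monotone model: first natural frequency proportional to sqrt(E)."""
    return 0.5 * math.sqrt(E)

# "Measured" fuzzy frequency [Hz] as a triangular number (numbers invented):
for alpha in (0.0, 0.5, 1.0):
    f_lo, f_hi = tri_alpha_cut(9.0, 10.0, 11.0, alpha)
    # Because the model is monotone, inverting the interval endpoints gives
    # the alpha-cut of the updating parameter directly: E = (2 f)^2.
    E_lo, E_hi = (2 * f_lo) ** 2, (2 * f_hi) ** 2
    assert abs(freq_model(E_hi) - f_hi) < 1e-9   # round-trip check
    print(alpha, E_lo, E_hi)
```

Stacking the parameter intervals over all alpha levels reconstructs the membership function of the updating parameter, which is what the fuzzy objective-function optimization delivers in the general (non-invertible) case.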

  20. Evaluation of Lower East Fork Poplar Creek Mercury Sources - Model Update

    Energy Technology Data Exchange (ETDEWEB)

    Ketelle, Richard [East Tennessee Technology Park (ETTP), Oak Ridge, TN (United States); Brandt, Craig C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Peterson, Mark J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bevelhimer, Mark S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Watson, David B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Brooks, Scott C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Mayes, Melanie [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); DeRolph, Christopher R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Dickson, Johnbull O. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Olsen, Todd A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-08-01

    The purpose of this report is to assess new data that has become available and provide an update to the evaluations and modeling presented in the Oak Ridge National Laboratory (ORNL) Technical Manuscript Evaluation of lower East Fork Poplar Creek (LEFPC) Mercury Sources (Watson et al., 2016). Primary sources of field and laboratory data for this update include multiple US Department of Energy (DOE) programs including Environmental Management (EM; e.g., Biological Monitoring and Abatement Program, Mercury Remediation Technology Development [TD], and Applied Field Research Initiative), Office of Science (Mercury Science Focus Areas [SFA] project), and the Y-12 National Security Complex (Y-12) Compliance Department.

  1. Modified methods for growing 3-D skin equivalents: an update.

    Science.gov (United States)

    Lamb, Rebecca; Ambler, Carrie A

    2014-01-01

    Artificial epidermis can be reconstituted in vitro by seeding primary epidermal cells (keratinocytes) onto a supportive substrate and then growing the developing skin equivalent at the air-liquid interface. In vitro skin models are widely used to study skin biology and for industrial drug and cosmetic testing. Here, we describe updated methods for growing 3-dimensional skin equivalents using de-vitalized, de-epidermalized dermis (DED) substrates including methods for DED substrate preparation, cell seeding, growth conditions, and fixation procedures.

  2. Update on Nonsurgical Lung Volume Reduction Procedures

    Directory of Open Access Journals (Sweden)

    J. Alberto Neder

    2016-01-01

    There has been a surge of interest in endoscopic lung volume reduction (ELVR) strategies for advanced COPD. Valve implants, coil implants, biological LVR (BioLVR), bronchial thermal vapour ablation, and airway stents are used to induce lung deflation with the ultimate goal of improving respiratory mechanics and chronic dyspnea. Patients presenting with severe air trapping (e.g., inspiratory capacity/total lung capacity (IC/TLC) <25%, residual volume (RV) >225% predicted) and thoracic hyperinflation (TLC > 150% predicted) have the greatest potential to derive benefit from ELVR procedures. Pre-LVRS or ELVR assessment should ideally include cardiological evaluation, high resolution CT scan, ventilation and perfusion scintigraphy, full pulmonary function tests, and cardiopulmonary exercise testing. ELVR procedures are currently available in selected Canadian research centers as part of ethically approved clinical trials. If a decision is made to offer an ELVR procedure, one-way valves are the first option in the presence of complete lobar exclusion and no significant collateral ventilation. When the fissure is not complete, when collateral ventilation is evident in heterogeneous emphysema, or when emphysema is homogeneous, coil implants or BioLVR (in that order) are the next logical alternatives.

  3. Improvement and Validation of Weld Residual Stress Modelling Procedure

    International Nuclear Information System (INIS)

    Zang, Weilin; Gunnars, Jens; Dong, Pingsha; Hong, Jeong K.

    2009-06-01

    The objective of this work is to identify and evaluate improvements for the residual stress modelling procedure currently used in Sweden. There is a growing demand to eliminate any unnecessary conservatism involved in residual stress assumptions. The study was focused on the development and validation of an improved weld residual stress modelling procedure, by taking advantage of the recent advances in residual stress modelling and stress measurement techniques. The major changes applied in the new weld residual stress modelling procedure are: - Improved procedure for heat source calibration based on use of analytical solutions. - Use of an isotropic hardening model where mixed hardening data is not available. - Use of an annealing model for improved simulation of strain relaxation in re-heated material. The new modelling procedure is demonstrated to capture the main characteristics of the through thickness stress distributions by validation to experimental measurements. Three austenitic stainless steel butt-welds cases are analysed, covering a large range of pipe geometries. From the cases it is evident that there can be large differences between the residual stresses predicted using the new procedure, and the earlier procedure or handbook recommendations. Previously recommended profiles could give misleading fracture assessment results. The stress profiles according to the new procedure agree well with the measured data. If data is available then a mixed hardening model should be used

  5. Development of the updated system of city underground pipelines based on Visual Studio

    Science.gov (United States)

    Zhang, Jianxiong; Zhu, Yun; Li, Xiangdong

    2009-10-01

    Our city operates an integrated pipeline network management system built on ArcGIS Engine 9.1 as the underlying development platform, with Oracle9i as the basic database for storing data. In this system, ArcGIS SDE 9.1 serves as the spatial data engine, and the system is a synthetic management application developed with the Visual Studio visual development tools. Because the pipeline update function of the system suffered from slow updates and occasional data loss, we developed a new update module for the system to ensure that the underground pipeline data can be updated conveniently and frequently in real time, preserving the currency and integrity of the underground pipeline data. The module has powerful data update functions and supports data input, output, and rapid bulk updates. The new module was developed with the Visual Studio visual development tools and uses Access as the basic database for storing data. Graphics can be edited in the AutoCAD software, and the database is updated through the link between the graphics and the system. Practice shows that the update module has good compatibility with the original system and updates the database reliably and efficiently.

  6. Updating Sea Spray Aerosol Emissions in the Community Multiscale Air Quality Model

    Science.gov (United States)

    Gantt, B.; Bash, J. O.; Kelly, J.

    2014-12-01

    Sea spray aerosols (SSA) impact the particle mass concentration and gas-particle partitioning in coastal environments, with implications for human and ecosystem health. In this study, the Community Multiscale Air Quality (CMAQ) model is updated to enhance fine mode SSA emissions, include sea surface temperature (SST) dependency, and revise surf zone emissions. Based on evaluation with several regional and national observational datasets in the continental U.S., the updated emissions generally improve surface concentration predictions of primary aerosols composed of sea-salt and secondary aerosols affected by sea-salt chemistry in coastal and near-coastal sites. Specifically, the updated emissions lead to better predictions of the magnitude and coastal-to-inland gradient of sodium, chloride, and nitrate concentrations at Bay Regional Atmospheric Chemistry Experiment (BRACE) sites near Tampa, FL. Including SST-dependency to the SSA emission parameterization leads to increased sodium concentrations in the southeast U.S. and decreased concentrations along the Pacific coast and northeastern U.S., bringing predictions into closer agreement with observations at most Interagency Monitoring of Protected Visual Environments (IMPROVE) and Chemical Speciation Network (CSN) sites. Model comparison with California Research at the Nexus of Air Quality and Climate Change (CalNex) observations will also be discussed, with particular focus on the South Coast Air Basin where clean marine air mixes with anthropogenic pollution in a complex environment. These SSA emission updates enable more realistic simulation of chemical processes in coastal environments, both in clean marine air masses and mixtures of clean marine and polluted conditions.
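    How an SST dependency enters an emission parameterization can be illustrated with a simple multiplicative scale factor applied to a base flux. The cubic coefficients below follow the commonly quoted Jaegle et al. (2011) SST fit and are shown purely for illustration; the abstract does not state which functional form the CMAQ update adopts.

```python
import numpy as np

def sst_scaling(sst_c):
    """Cubic SST-dependent scale factor for sea spray emissions.
    Coefficients follow the commonly quoted Jaegle et al. (2011) fit
    and are illustrative here."""
    return 0.3 + 0.1 * sst_c - 0.0076 * sst_c**2 + 0.00021 * sst_c**3

base_flux = np.full(3, 1.0)            # base emission flux, arbitrary units
sst = np.array([5.0, 15.0, 25.0])      # sea surface temperature [deg C]
flux = base_flux * sst_scaling(sst)    # warmer water -> stronger emissions
print(np.round(flux, 3))
```

In a model like CMAQ the scale factor would be applied per grid cell and per size mode, with a separate treatment for the surf zone.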

  7. State energy-price system: 1981 update

    Energy Technology Data Exchange (ETDEWEB)

    Fang, J.M.; Imhoff, K.L.; Hood, L.J.

    1983-08-01

    This report updates the State Energy Price Data System (STEPS) to include state-level energy prices by fuel and by end-use sectors for 1981. Both physical unit prices and Btu prices are presented. Basic documentation of the data base remains generally the same as in the original report: State Energy Price System; Volume 1: Overview and Technical Documentation (DOE/NBB-0029 Volume 1 of 2, November 1982). The present report documents only the changes in procedures necessitated by the update to 1981 and the corrections to the basic documentation.

  8. Flow Forecasting using Deterministic Updating of Water Levels in Distributed Hydrodynamic Urban Drainage Models

    DEFF Research Database (Denmark)

    Hansen, Lisbet Sneftrup; Borup, Morten; Moller, Arne

    2014-01-01

    drainage models and reduce a number of unavoidable discrepancies between the model and reality. The latter can be achieved partly by inserting measured water levels from the sewer system into the model. This article describes how deterministic updating of model states in this manner affects a simulation...

  9. SAM Photovoltaic Model Technical Reference 2016 Update

    Energy Technology Data Exchange (ETDEWEB)

    Gilman, Paul [National Renewable Energy Laboratory (NREL), Golden, CO (United States); DiOrio, Nicholas A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Freeman, Janine M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Janzou, Steven [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Dobos, Aron [No longer NREL employee; Ryberg, David [No longer NREL employee

    2018-03-19

    This manual describes the photovoltaic performance model in the System Advisor Model (SAM) software, Version 2016.3.14 Revision 4 (SSC Version 160). It is an update to the 2015 edition of the manual, which describes the photovoltaic model in SAM 2015.1.30 (SSC 41). This new edition includes corrections of errors in the 2015 edition and descriptions of new features introduced in SAM 2016.3.14, including: a 3D shade calculator; a battery storage model; DC power optimizer loss inputs; a snow loss model; a plane-of-array irradiance input from weather file option; support for sub-hourly simulations; self-shading that works with all four subarrays and uses the same algorithm for fixed arrays and one-axis tracking; a linear self-shading algorithm for thin-film modules; and loss percentages that replace derate factors. The photovoltaic performance model is one of the modules in the SAM Simulation Core (SSC), which is part of both SAM and the SAM SDK. SAM is a user-friendly desktop application for analysis of renewable energy projects. The SAM SDK (Software Development Kit) is for developers writing their own renewable energy analysis software based on SSC. This manual is written for users of both SAM and the SAM SDK who want to learn more about the details of SAM's photovoltaic model.

  10. DUAL STATE-PARAMETER UPDATING SCHEME ON A CONCEPTUAL HYDROLOGIC MODEL USING SEQUENTIAL MONTE CARLO FILTERS

    Science.gov (United States)

    Noh, Seong Jin; Tachikawa, Yasuto; Shiiba, Michiharu; Kim, Sunmin

    Data assimilation techniques have been widely used to improve the predictability of hydrologic modeling. Among various data assimilation techniques, sequential Monte Carlo (SMC) filters, known as "particle filters", provide the capability to handle non-linear and non-Gaussian state-space models. This paper proposes a dual state-parameter updating scheme (DUS) based on SMC methods to estimate both state and parameter variables of a hydrologic model. We introduce a kernel smoothing method for the robust estimation of uncertain model parameters in the DUS. The applicability of the dual updating scheme is illustrated using the implementation of the storage function model on a middle-sized Japanese catchment. We also compare performance results of DUS combined with various SMC methods, such as SIR, ASIR and RPF.
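The sampling-importance-resampling (SIR) step at the core of such a dual scheme can be sketched as follows. This is a minimal illustration of joint state-parameter resampling, not the authors' storage function model: the dynamics, noise levels, and names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sir_step(states, params, obs, obs_std=0.5):
    """One SIR update: propagate, weight by likelihood, resample."""
    # Propagate states with the (uncertain) parameter plus process noise.
    states = params * states + rng.normal(0.0, 0.1, states.shape)
    # Weight particles by the Gaussian likelihood of the observation.
    w = np.exp(-0.5 * ((obs - states) / obs_std) ** 2)
    w /= w.sum()
    # Resample states and parameters with the same indices: this is the
    # "dual" update, where parameters ride along with their states.
    idx = rng.choice(len(states), size=len(states), p=w)
    return states[idx], params[idx]

states = rng.normal(1.0, 0.5, 1000)       # initial state ensemble
params = rng.uniform(0.5, 1.0, 1000)      # uncertain recession parameter
states, params = sir_step(states, params, obs=0.8)
```

In a full DUS, a kernel smoothing step would perturb the resampled parameters to avoid sample impoverishment; that refinement is omitted here for brevity.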

  11. Standard Review Plan Update and Development Program

    International Nuclear Information System (INIS)

    1992-05-01

    This implementing procedures document (IPD) was prepared for use in implementing tasks under the standard review plan update and development program (SRP-UDP). The IPD provides comprehensive guidance and detailed procedures for SRP-UDP tasks. The IPD is mandatory for contractors performing work for the SRP-UDP. It is guidance for the staff. At the completion of the SRP-UDP, the IPD will be revised (to remove the UDP aspects) and will replace NRR Office Letter No. 800 as long-term maintenance procedures.

  12. Standard Review Plan Update and Development Program

    Energy Technology Data Exchange (ETDEWEB)

    1992-05-01

    This implementing procedures document (IPD) was prepared for use in implementing tasks under the standard review plan update and development program (SRP-UDP). The IPD provides comprehensive guidance and detailed procedures for SRP-UDP tasks. The IPD is mandatory for contractors performing work for the SRP-UDP. It is guidance for the staff. At the completion of the SRP-UDP, the IPD will be revised (to remove the UDP aspects) and will replace NRR Office Letter No. 800 as long-term maintenance procedures.

  13. The 2018 and 2020 Updates of the U.S. National Seismic Hazard Models

    Science.gov (United States)

    Petersen, M. D.

    2017-12-01

    During 2018 the USGS will update the 2014 National Seismic Hazard Models by incorporating new seismicity models, ground motion models, site factors, and fault inputs, and by improving weights to ground motion models using empirical and other data. We will update the earthquake catalog for the U.S. and introduce new rate models. Additional fault data will be used to improve rate estimates on active faults. New ground motion models (GMMs) and site factors for Vs30 have been released by the Pacific Earthquake Engineering Research Center (PEER), and we will consider these in assessing ground motions in craton and extended margin regions of the central and eastern U.S. The USGS will also include basin-depth terms for selected urban areas of the western United States to improve long-period shaking assessments, using published depth estimates to the 1.0 and 2.5 km/s shear wave velocities. We will produce hazard maps for input into the building codes that span a broad range of periods (0.1 to 5 s) and site classes (shear wave velocity from 2000 m/s to 200 m/s in the upper 30 m of the crust, Vs30). In the 2020 update we plan to include: a new national crustal model that defines the basin depths required in the latest GMMs, new 3-D ground motion simulations for several urban areas, new magnitude-area equations, and new fault geodetic and geologic strain rate models. The USGS will also consider new 3-D ground motion simulations for inclusion in these long-period maps. These new models are being evaluated and will be discussed at one or more regional and topical workshops held at the beginning of 2018.

  14. Impact of time displaced precipitation estimates for on-line updated models

    DEFF Research Database (Denmark)

    Borup, Morten; Grum, Morten; Mikkelsen, Peter Steen

    2012-01-01

    When an online runoff model is updated from system measurements the requirements to the precipitation estimates change. Using rain gauge data as precipitation input there will be a displacement between the time where the rain intensity hits the gauge and the time where the rain hits the actual...

  15. Environmental Measurements Laboratory (EML) procedures manual

    International Nuclear Information System (INIS)

    Chieco, N.A.; Bogen, D.C.; Knutson, E.O.

    1990-11-01

    Volume 1 of this manual documents the procedures and existing technology that are currently used by the Environmental Measurements Laboratory. A section devoted to quality assurance has been included. These procedures have been updated and revised and new procedures have been added. They include: sampling; radiation measurements; analytical chemistry; radionuclide data; special facilities; and specifications. 228 refs., 62 figs., 37 tabs. (FL)

  16. Are Forecast Updates Progressive?

    NARCIS (Netherlands)

    C-L. Chang (Chia-Lin); Ph.H.B.F. Franses (Philip Hans); M.J. McAleer (Michael)

    2010-01-01

    textabstractMacro-economic forecasts typically involve both a model component, which is replicable, as well as intuition, which is non-replicable. Intuition is expert knowledge possessed by a forecaster. If forecast updates are progressive, forecast updates should become more accurate, on average,

  17. Valence-Dependent Belief Updating: Computational Validation

    Directory of Open Access Journals (Sweden)

    Bojana Kuzmanovic

    2017-06-01

    Full Text Available People tend to update beliefs about their future outcomes in a valence-dependent way: they are likely to incorporate good news and to neglect bad news. However, belief formation is a complex process which depends not only on motivational factors such as the desire for favorable conclusions, but also on multiple cognitive variables such as prior beliefs, knowledge about personal vulnerabilities and resources, and the size of the probabilities and estimation errors. Thus, we applied computational modeling in order to test for valence-induced biases in updating while formally controlling for relevant cognitive factors. We compared biased and unbiased Bayesian models of belief updating, and specified alternative models based on reinforcement learning. The experiment consisted of 80 trials with 80 different adverse future life events. In each trial, participants estimated the base rate of one of these events and estimated their own risk of experiencing the event before and after being confronted with the actual base rate. Belief updates corresponded to the difference between the two self-risk estimates. Valence-dependent updating was assessed by comparing trials with good news (better-than-expected base rates) with trials with bad news (worse-than-expected base rates). After receiving bad relative to good news, participants' updates were smaller and deviated more strongly from rational Bayesian predictions, indicating a valence-induced bias. Model comparison revealed that the biased (i.e., optimistic) Bayesian model of belief updating better accounted for the data than the unbiased (i.e., rational) Bayesian model, confirming that the valence of the new information influenced the amount of updating. Moreover, alternative computational modeling based on reinforcement learning demonstrated higher learning rates for good than for bad news, as well as a moderating role of personal knowledge. Finally, in this specific experimental context, the approach based on
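The asymmetric-learning-rate account described above can be illustrated with a toy update rule. The function name and the learning rates below are illustrative assumptions, not the values fitted in the study.

```python
def update_risk(self_risk, base_rate, lr_good=0.6, lr_bad=0.3):
    """Update a self-risk estimate after seeing the actual base rate.

    Good news (a base rate lower than the current estimate) is weighted
    more heavily than bad news, producing an optimistic bias.
    """
    error = base_rate - self_risk           # estimation error
    lr = lr_good if error < 0 else lr_bad   # good news: risk lower than feared
    return self_risk + lr * error

# Participant thought their risk was 40%; the actual base rate is 20%.
good = update_risk(0.40, 0.20)   # ≈ 0.28 (large update toward good news)
# Same prior, but the actual base rate is 60%.
bad = update_risk(0.40, 0.60)    # ≈ 0.46 (smaller update toward bad news)
```

With equal learning rates the two updates would be symmetric; the asymmetry is what the model comparison in the abstract attributes to valence.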

  18. Typical NRC inspection procedures for model plant

    International Nuclear Information System (INIS)

    Blaylock, J.

    1984-01-01

    A summary of NRC inspection procedures for a model LEU fuel fabrication plant is presented. Procedures and methods for combining inventory data, seals, measurement techniques, and statistical analysis are emphasized

  19. Comparing the impact of time displaced and biased precipitation estimates for online updated urban runoff models.

    Science.gov (United States)

    Borup, Morten; Grum, Morten; Mikkelsen, Peter Steen

    2013-01-01

    When an online runoff model is updated from system measurements, the requirements of the precipitation input change. Using rain gauge data as precipitation input there will be a displacement between the time when the rain hits the gauge and the time when the rain hits the actual catchment, due to the time it takes for the rain cell to travel from the rain gauge to the catchment. Since this time displacement is not present for system measurements, the data assimilation scheme might already have updated the model to include the impact of the particular rain cell by the time the rain data are forced upon the model, so the model run ends up including the same rain twice. This paper compares the forecast accuracy of updated models using time-displaced rain input with that of rain input with constant biases. This is done using a simple time-area model and historic rain series that are either displaced in time or affected by a bias. The results show that for a 10 minute forecast, time displacements of 5 and 10 minutes compare to biases of 60 and 100%, respectively, independent of the catchment's time of concentration.

  20. Update of the ITER MELCOR model for the validation of the Cryostat design

    Energy Technology Data Exchange (ETDEWEB)

    Martínez, M.; Labarta, C.; Terrón, S.; Izquierdo, J.; Perlado, J.M.

    2015-07-01

    Some transients can compromise the vacuum in the cryostat of ITER and cause significant loads. A MELCOR model has been updated in order to assess these loads. Transients have been run with this model, and its results will be used in the mechanical assessment of the cryostat. (Author)

  1. iTree-Hydro: Snow hydrology update for the urban forest hydrology model

    Science.gov (United States)

    Yang Yang; Theodore A. Endreny; David J. Nowak

    2011-01-01

    This article presents snow hydrology updates made to iTree-Hydro, previously called the Urban Forest Effects—Hydrology model. iTree-Hydro Version 1 was a warm climate model developed by the USDA Forest Service to provide a process-based planning tool with robust water quantity and quality predictions given data limitations common to most urban areas. Cold climate...

  2. Updating sea spray aerosol emissions in the Community Multiscale Air Quality (CMAQ) model version 5.0.2

    Science.gov (United States)

    Gantt, B.; Kelly, J. T.; Bash, J. O.

    2015-11-01

    Sea spray aerosols (SSAs) impact the particle mass concentration and gas-particle partitioning in coastal environments, with implications for human and ecosystem health. Model evaluations of SSA emissions have mainly focused on the global scale, but regional-scale evaluations are also important due to the localized impact of SSAs on atmospheric chemistry near the coast. In this study, SSA emissions in the Community Multiscale Air Quality (CMAQ) model were updated to enhance the fine-mode size distribution, include sea surface temperature (SST) dependency, and reduce surf-enhanced emissions. Predictions from the updated CMAQ model and those of the previous release version, CMAQv5.0.2, were evaluated using several coastal and national observational data sets in the continental US. The updated emissions generally reduced model underestimates of sodium, chloride, and nitrate surface concentrations for coastal sites in the Bay Regional Atmospheric Chemistry Experiment (BRACE) near Tampa, Florida. Including SST dependency to the SSA emission parameterization led to increased sodium concentrations in the southeastern US and decreased concentrations along parts of the Pacific coast and northeastern US. The influence of sodium on the gas-particle partitioning of nitrate resulted in higher nitrate particle concentrations in many coastal urban areas due to increased condensation of nitric acid in the updated simulations, potentially affecting the predicted nitrogen deposition in sensitive ecosystems. Application of the updated SSA emissions to the California Research at the Nexus of Air Quality and Climate Change (CalNex) study period resulted in a modest improvement in the predicted surface concentration of sodium and nitrate at several central and southern California coastal sites. This update of SSA emissions enabled a more realistic simulation of the atmospheric chemistry in coastal environments where marine air mixes with urban pollution.

  3. The USEtox story: A survey of model developer visions and user requirements

    DEFF Research Database (Denmark)

    Westh, Torbjørn Bochsen; Hauschild, Michael Zwicky; Birkved, Morten

    2015-01-01

    into LCA software and methods, (4) improve update/testing procedures, (5) strengthen communication between developers and users, and (6) extend model scope. By generalizing our recommendations to guide scientific model development in a broader context, we emphasize to acknowledge different levels of user...... expertise to integrate sound revision and update procedures and to facilitate modularity, data import/export, and incorporation into relevant software and databases during model design and development. Our fully documented approach can inspire performing similar surveys on other LCA-related tools...

  4. Cognition and procedure representational requirements for predictive human performance models

    Science.gov (United States)

    Corker, K.

    1992-01-01

    Models and modeling environments for human performance are becoming significant contributors to early system design and analysis procedures. Issues of levels of automation, physical environment, informational environment, and manning requirements are being addressed by such man/machine analysis systems. The research reported here investigates the close interaction between models of human cognition and models that describe procedural performance. We describe a methodology for the decomposition of aircrew procedures that supports interaction with models of cognition on the basis of procedures observed; that serves to identify cockpit/avionics information sources and crew information requirements; and that provides the structure to support methods for function allocation among crew and aiding systems. Our approach is to develop an object-oriented, modular, executable software representation of the aircrew, the aircraft, and the procedures necessary to satisfy flight-phase goals. We then encode in a time-based language taxonomies of the conceptual, relational, and procedural constraints among the cockpit avionics and control system and the aircrew. We have designed and implemented a goals/procedures hierarchic representation sufficient to describe procedural flow in the cockpit. We then execute the procedural representation in simulation software and calculate the values of the flight instruments, aircraft state variables and crew resources using the constraints available from the relationship taxonomies. The system provides a flexible, extensible, manipulable and executable representation of aircrew and procedures that is generally applicable to crew/procedure task analysis. The representation supports developed methods of intent inference, and is extensible to include issues of information requirements and functional allocation. We are attempting to link the procedural representation to models of cognitive functions to establish several intent inference methods.

  5. UPDATING NATIONAL TOPOGRAPHIC DATA BASE USING CHANGE DETECTION METHODS

    Directory of Open Access Journals (Sweden)

    E. Keinan

    2016-06-01

    Full Text Available The traditional method for updating a topographic database on a national scale is a complex process that requires human resources, time and the development of specialized procedures. In many National Mapping and Cadaster Agencies (NMCA), the updating cycle takes a few years. Today, the reality is dynamic and changes occur every day; therefore, users expect the existing database to portray the current reality. Global mapping projects which are based on community volunteers, such as OSM, update their database every day based on crowdsourcing. In order to fulfil users' requirements for rapid updating, a new methodology that maps major interest areas while preserving associated decoding information should be developed. Until recently, automated processes did not yield satisfactory results, and a typical process included comparing images from different periods. The success rates in identifying the objects were low, and most were accompanied by a high percentage of false alarms. As a result, the automatic process required significant editorial work that made it uneconomical. In recent years, the development of technologies in mapping, advancement in image processing algorithms and computer vision, together with the development of digital aerial cameras with NIR band and Very High Resolution satellites, allow the implementation of a cost-effective automated process. The automatic process is based on high-resolution Digital Surface Model analysis, Multi Spectral (MS) classification, MS segmentation, object analysis and shape forming algorithms. This article reviews the results of a novel change detection methodology as a first step for updating NTDB in the Survey of Israel.

  6. Updating National Topographic Data Base Using Change Detection Methods

    Science.gov (United States)

    Keinan, E.; Felus, Y. A.; Tal, Y.; Zilberstien, O.; Elihai, Y.

    2016-06-01

    The traditional method for updating a topographic database on a national scale is a complex process that requires human resources, time and the development of specialized procedures. In many National Mapping and Cadaster Agencies (NMCA), the updating cycle takes a few years. Today, the reality is dynamic and changes occur every day; therefore, users expect the existing database to portray the current reality. Global mapping projects which are based on community volunteers, such as OSM, update their database every day based on crowdsourcing. In order to fulfil users' requirements for rapid updating, a new methodology that maps major interest areas while preserving associated decoding information should be developed. Until recently, automated processes did not yield satisfactory results, and a typical process included comparing images from different periods. The success rates in identifying the objects were low, and most were accompanied by a high percentage of false alarms. As a result, the automatic process required significant editorial work that made it uneconomical. In recent years, the development of technologies in mapping, advancement in image processing algorithms and computer vision, together with the development of digital aerial cameras with NIR band and Very High Resolution satellites, allow the implementation of a cost-effective automated process. The automatic process is based on high-resolution Digital Surface Model analysis, Multi Spectral (MS) classification, MS segmentation, object analysis and shape forming algorithms. This article reviews the results of a novel change detection methodology as a first step for updating NTDB in the Survey of Israel.
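One building block of such an automated process, Digital Surface Model differencing, can be sketched as follows: difference two DSM rasters and flag cells whose height changed by more than a threshold. The threshold, grid, and function name are illustrative assumptions, not the Survey of Israel's actual workflow.

```python
import numpy as np

def dsm_change_mask(dsm_old, dsm_new, threshold=2.0):
    """Boolean mask of cells whose |height change| exceeds the threshold (metres)."""
    return np.abs(dsm_new - dsm_old) > threshold

old = np.zeros((5, 5))            # flat terrain in the old epoch
new = old.copy()
new[1:3, 1:3] = 6.0               # a new building footprint in the new epoch
mask = dsm_change_mask(old, new)  # four cells flagged as changed
```

A real pipeline would then pass the flagged regions to MS classification, segmentation, and shape-forming stages to decide which changes represent mappable objects rather than noise or vegetation growth.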

  7. Guidelines for Monitoring and Management of Pediatric Patients Before, During, and After Sedation for Diagnostic and Therapeutic Procedures: Update 2016.

    Science.gov (United States)

    Coté, Charles J; Wilson, Stephen

    2016-07-01

    The safe sedation of children for procedures requires a systematic approach that includes the following: no administration of sedating medication without the safety net of medical/dental supervision, careful presedation evaluation for underlying medical or surgical conditions that would place the child at increased risk from sedating medications, appropriate fasting for elective procedures and a balance between the depth of sedation and risk for those who are unable to fast because of the urgent nature of the procedure, a focused airway examination for large (kissing) tonsils or anatomic airway abnormalities that might increase the potential for airway obstruction, a clear understanding of the medication's pharmacokinetic and pharmacodynamic effects and drug interactions, appropriate training and skills in airway management to allow rescue of the patient, age- and size-appropriate equipment for airway management and venous access, appropriate medications and reversal agents, sufficient numbers of staff to both carry out the procedure and monitor the patient, appropriate physiologic monitoring during and after the procedure, a properly equipped and staffed recovery area, recovery to the presedation level of consciousness before discharge from medical/dental supervision, and appropriate discharge instructions. This report was developed through a collaborative effort of the American Academy of Pediatrics and the American Academy of Pediatric Dentistry to offer pediatric providers updated information and guidance in delivering safe sedation to children. Copyright © 2016 American Academy of Pediatric Dentistry and American Academy of Pediatrics. This report is being published concurrently in Pediatric Dentistry July 2016. The articles are identical. Either citation can be used when citing this report.

  8. Study of Updating Initiating Event Frequency using Prognostics

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyeonmin; Lee, Sang-Hwan; Park, Jun-seok; Kim, Hyungdae; Chang, Yoon-Suk; Heo, Gyunyoung [Kyung Hee Univ., Yongin (Korea, Republic of)

    2014-10-15

    The Probabilistic Safety Assessment (PSA) model makes it possible to find the relative priority of accident scenarios, weak points in achieving accident prevention or mitigation, and insights to improve those vulnerabilities. Thus, PSA requires realistic calculation to obtain precise and confident results. However, PSA models still contain 'conservative' aspects introduced during model development. One source of this conservatism is the assumptions made in safety analysis and in the estimation of failure frequency. Recently, Surveillance, Diagnosis, and Prognosis (SDP) has become a growing trend, applied in particular to space and aviation systems. Furthermore, a study dealing with the applicable areas and state-of-the-art status of SDP in the nuclear industry was published. SDP, which utilizes massive databases and information technology among other enabling techniques, is worth highlighting for its capability to alleviate the conservatism in conventional PSA. This paper reviews the concept of integrating PSA and SDP and suggests an updated methodology for initiating event (IE) frequency using prognostics. In more detail, we focus on the IE of steam generator tube rupture (SGTR), considering tube degradation; this extends our previously reported research. In this paper, the concept of integrating PSA and SDP is presented, and prognostics algorithms from SDP are applied to IEs and basic events (BEs) in the Level 1 PSA. As an example, updating the SGTR IE frequency and tube ageing were considered. Tube ageing was analyzed using PASTA and the Monte Carlo method; the conventional SGTR IE frequency was then updated using a Bayesian approach. The studied method can help to address the static nature and conservatism of PSA.

  9. Study of Updating Initiating Event Frequency using Prognostics

    International Nuclear Information System (INIS)

    Kim, Hyeonmin; Lee, Sang-Hwan; Park, Jun-seok; Kim, Hyungdae; Chang, Yoon-Suk; Heo, Gyunyoung

    2014-01-01

    The Probabilistic Safety Assessment (PSA) model makes it possible to find the relative priority of accident scenarios, weak points in achieving accident prevention or mitigation, and insights to improve those vulnerabilities. Thus, PSA requires realistic calculation to obtain precise and confident results. However, PSA models still contain 'conservative' aspects introduced during model development. One source of this conservatism is the assumptions made in safety analysis and in the estimation of failure frequency. Recently, Surveillance, Diagnosis, and Prognosis (SDP) has become a growing trend, applied in particular to space and aviation systems. Furthermore, a study dealing with the applicable areas and state-of-the-art status of SDP in the nuclear industry was published. SDP, which utilizes massive databases and information technology among other enabling techniques, is worth highlighting for its capability to alleviate the conservatism in conventional PSA. This paper reviews the concept of integrating PSA and SDP and suggests an updated methodology for initiating event (IE) frequency using prognostics. In more detail, we focus on the IE of steam generator tube rupture (SGTR), considering tube degradation; this extends our previously reported research. In this paper, the concept of integrating PSA and SDP is presented, and prognostics algorithms from SDP are applied to IEs and basic events (BEs) in the Level 1 PSA. As an example, updating the SGTR IE frequency and tube ageing were considered. Tube ageing was analyzed using PASTA and the Monte Carlo method; the conventional SGTR IE frequency was then updated using a Bayesian approach. The studied method can help to address the static nature and conservatism of PSA.
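Bayesian updating of an initiating event frequency, as described in the record above, is commonly implemented with gamma-Poisson conjugacy: a gamma prior on the frequency is updated with an observed event count over an exposure time. The prior parameters and data below are illustrative assumptions, not the paper's values.

```python
def update_ie_frequency(alpha, beta, events, exposure_years):
    """Gamma-Poisson conjugate update of an initiating event frequency.

    Prior: frequency ~ Gamma(alpha, beta) with mean alpha/beta per year.
    Returns the posterior parameters and the posterior mean frequency.
    """
    alpha_post = alpha + events
    beta_post = beta + exposure_years
    return alpha_post, beta_post, alpha_post / beta_post

# Illustrative prior with mean 1e-3/yr; no SGTR events in 2000 reactor-years.
a, b, mean = update_ie_frequency(alpha=0.5, beta=500.0,
                                 events=0, exposure_years=2000.0)
# mean = 2e-4 per year: the evidence of event-free operation pulls the
# frequency below the prior mean.
```

A prognostics-informed variant would replace the raw event count with degradation-based evidence (e.g., predicted tube rupture probability), but the conjugate update step has the same shape.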

  10. Model of Procedure Usage – Results from a Qualitative Study to Inform Design of Computer-Based Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Johanna H Oxstrand; Katya L Le Blanc

    2012-07-01

    The nuclear industry is constantly trying to find ways to decrease the human error rate, especially for human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however, most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned from these previous efforts, we are now exploring a less explored application for computer-based procedures: field procedures, i.e., procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory, the Institute for Energy Technology, and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field operators. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how to best design the computer-based procedures to do this. The underlying philosophy in the research effort is "Stop – Start – Continue", i.e., what features from the use of paper-based procedures should we not incorporate (Stop), what should we keep (Continue), and what new features or work processes should be added (Start). One step in identifying the Stop – Start – Continue was to conduct a baseline study in which affordances related to the current usage of paper-based procedures were identified. The purpose of the study was to develop a model of paper-based procedure use which will help to identify desirable features for computer-based procedure prototypes. Affordances such as note taking, markups

  11. A baseline-free procedure for transformation models under interval censorship.

    Science.gov (United States)

    Gu, Ming Gao; Sun, Liuquan; Zuo, Guoxin

    2005-12-01

    An important property of the Cox regression model is that the estimation of regression parameters using the partial likelihood procedure does not depend on its baseline survival function. We call such a procedure baseline-free. Using marginal likelihood, we show that a baseline-free procedure can be derived for a class of general transformation models under the interval censoring framework. The baseline-free procedure results in a simplified and stable computation algorithm for some complicated and important semiparametric models, such as frailty models and heteroscedastic hazard/rank regression models, where the estimation procedures available so far involve estimation of the infinite-dimensional baseline function. A detailed computational algorithm using Markov chain Monte Carlo stochastic approximation is presented. The proposed procedure is demonstrated through extensive simulation studies, showing the validity of asymptotic consistency and normality. We also illustrate the procedure with a real data set from a study of breast cancer. A heuristic argument showing that the score function is a mean-zero martingale is provided.

  12. Updated Results from the Michigan Titan Thermospheric General Circulation Model (TTGCM)

    Science.gov (United States)

    Bell, J. M.; Bougher, S. W.; de Lahaye, V.; Waite, J. H.; Ridley, A.

    2006-05-01

    This paper presents updated results from the Michigan Titan Thermospheric General Circulation Model (TTGCM) that was recently unveiled in operational form (Bell et al. 2005 Spring AGU). Since then, we have incorporated a suite of chemical reactions for the major neutral constituents in Titan's upper atmosphere (N2, CH4). Additionally, some selected minor neutral constituents and major ionic species are also supported in the framework. At this time, HCN, which remains one of the critical thermally active species in the upper atmosphere, remains specified at all altitudes, utilizing profiles derived from recent Cassini-Huygens measurements. In addition to these improvements, a parallel effort is underway to develop a non-hydrostatic Titan Thermospheric General Circulation Model for further comparisons. In this work, we emphasize the impacts of self-consistent chemistry on the results of the updated TTGCM relative to its frozen chemistry predecessor. Meanwhile, the thermosphere's thermodynamics remains determined by the interplay of solar EUV forcing and HCN rotational cooling, which is calculated by a full line-by-line radiative transfer routine along the lines of Yelle (1991) and Mueller-Wodarg (2000, 2002). In addition to these primary drivers, a treatment of magnetospheric heating is further tested. The model's results will be compared with both the Cassini INMS data and the model of Mueller-Wodarg (2000, 2002).

  13. Updates to the Demographic and Spatial Allocation Models to Produce Integrated Climate and Land Use Scenarios (ICLUS) (Final Report, Version 2)

    Science.gov (United States)

    EPA announced the availability of the final report, Updates to the Demographic and Spatial Allocation Models to Produce Integrated Climate and Land Use Scenarios (ICLUS) (Version 2). This update furthered land change modeling by providing nationwide housing developmen...

  14. Updating and prospective validation of a prognostic model for high sickness absence.

    Science.gov (United States)

    Roelen, C A M; Heymans, M W; Twisk, J W R; van Rhenen, W; Pallesen, S; Bjorvatn, B; Moen, B E; Magerøy, N

    2015-01-01

    To further develop and validate a Dutch prognostic model for high sickness absence (SA). Three-wave longitudinal cohort study of 2,059 Norwegian nurses. The Dutch prognostic model was used to predict high SA among Norwegian nurses at wave 2. Subsequently, the model was updated by adding person-related (age, gender, marital status, children at home, and coping strategies), health-related (BMI, physical activity, smoking, and caffeine and alcohol intake), and work-related (job satisfaction, job demands, decision latitude, social support at work, and both work-to-family and family-to-work spillover) variables. The updated model was then prospectively validated for predictions at wave 3. 1,557 (77 %) nurses had complete data at wave 2 and 1,342 (65 %) at wave 3. The risk of high SA was under-estimated by the Dutch model, but discrimination between high-risk and low-risk nurses was fair after re-calibration to the Norwegian data. Gender, marital status, BMI, physical activity, smoking, alcohol intake, job satisfaction, job demands, decision latitude, support at the workplace, and work-to-family spillover were identified as potential predictors of high SA. However, these predictors did not improve the model's discriminative ability, which remained fair at wave 3. The prognostic model correctly identifies 73 % of Norwegian nurses at risk of high SA, although additional predictors are needed before the model can be used to screen working populations for risk of high SA.
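The re-calibration step described in this record (adjusting a prognostic model so its predictions fit a new population) can be illustrated with a calibration-in-the-large intercept update. This is a generic sketch, not the authors' actual procedure; the function names and example linear predictors are hypothetical.

```python
import math

def predict(lp):
    # Convert a linear predictor to a predicted risk via the logistic function.
    return 1.0 / (1.0 + math.exp(-lp))

def recalibrate_intercept(linear_predictors, observed_rate, tol=1e-9):
    # Find the intercept shift 'a' such that the mean predicted risk equals
    # the observed event rate in the validation cohort (calibration-in-the-
    # large), using bisection; mean risk is monotone increasing in 'a'.
    lo, hi = -10.0, 10.0
    while hi - lo > tol:
        a = (lo + hi) / 2.0
        mean_risk = sum(predict(lp + a) for lp in linear_predictors) / len(linear_predictors)
        if mean_risk < observed_rate:
            lo = a
        else:
            hi = a
    return (lo + hi) / 2.0

# Hypothetical linear predictors from an existing model, applied to a new
# cohort whose observed high-SA rate is 30%.
lps = [-2.0, -1.5, -1.0, -0.5, 0.0, 0.5]
shift = recalibrate_intercept(lps, 0.30)
```

Note that, as in the record above, such re-calibration fixes systematic under- or over-estimation of risk but cannot by itself improve discrimination between high-risk and low-risk individuals.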

  15. Do lateral boundary condition update frequency and the resolution of the boundary data affect the regional model COSMO-CLM? A sensitivity study.

    Science.gov (United States)

    Pankatz, K.; Kerkweg, A.

    2014-12-01

    The work presented is part of the joint project "DecReg" ("Regional decadal predictability"), which is in turn part of the project "MiKlip" ("Decadal predictions"), an effort funded by the German Federal Ministry of Education and Research to improve decadal predictions on a global and regional scale. In regional climate modeling it is common to update the lateral boundary conditions (LBC) of the regional model every six hours. This is mainly due to the fact that reference data sets like ERA are only available every six hours. Additionally, for offline coupling procedures it would be too costly to store LBC data at higher temporal resolution for climate simulations. However, theoretically, the coupling frequency could be as high as the time step of the driving model. Meanwhile, it is unclear whether a more frequent update of the LBC has a significant effect on the climate in the domain of the regional model (RCM). This study uses the RCM COSMO-CLM/MESSy (Kerkweg and Jöckel, 2012) to couple COSMO-CLM offline to the GCM ECHAM5. The first study examines a 30-year time slice experiment for three update frequencies of the LBC, namely six hours, one hour and six minutes. The evaluation of means, standard deviations and statistics of the climate in the regional domain shows only small deviations, some statistically significant though, of 2m temperature, sea level pressure and precipitation. The second scope of this study assesses parameters linked to cyclone activity, which is affected by the LBC update frequency. Differences in track density and strength are found when comparing the simulations. The second study examines the quality of decadal hind-casts of the decade 2001-2010 when the horizontal resolution of the driving model, namely T42, T63, T85, T106, from which the LBC are calculated, is altered. Two sets of simulations are evaluated. For the first set of simulations, the GCM simulations are performed at different resolutions using the same boundary conditions for GHGs and SSTs, thus

  16. FEM Updating of Tall Buildings using Ambient Vibration Data

    DEFF Research Database (Denmark)

    Ventura, C. E.; Lord, J. F.; Turek, M.

    2005-01-01

    Ambient vibration testing is the most economical non-destructive testing method to acquire vibration data from large civil engineering structures. The purpose of this paper is to demonstrate how ambient vibration Modal Identification techniques can be effectively used with Model Updating tools...... to develop reliable finite element models of large civil engineering structures. A fifteen story and a forty-eight story reinforced concrete buildings are used as case studies for this purpose. The dynamic characteristics of interest for this study were the first few lateral and torsional natural frequencies...... the information provided in the design documentation of the building. Different parameters of the model were then modified using an automated procedure to improve the correlation between measured and calculated modal parameters. Careful attention was placed to the selection of the parameters to be modified...

  17. Updates to Model Algorithms & Inputs for the Biogenic ...

    Science.gov (United States)

    We have developed new canopy emission algorithms and land use data for BEIS. Simulations with BEIS v3.4 and these updates in CMAQ v5.0.2 are compared to the Model of Emissions of Gases and Aerosols from Nature (MEGAN) and evaluated against observations. This has resulted in improvements in model evaluations of modeled isoprene, NOx, and O3. The National Exposure Research Laboratory (NERL) Atmospheric Modeling and Analysis Division (AMAD) conducts research in support of the EPA's mission to protect human health and the environment. AMAD's research program is engaged in developing and evaluating predictive atmospheric models on all spatial and temporal scales for forecasting air quality and for assessing changes in air quality and air pollutant exposures, as affected by changes in ecosystem management and regulatory decisions. AMAD is responsible for providing a sound scientific and technical basis for regulatory policies based on air quality models to improve ambient air quality. The models developed by AMAD are being used by EPA, NOAA, and the air pollution community in understanding and forecasting not only the magnitude of the air pollution problem, but also in developing emission control policies and regulations for air quality improvements.

  18. Image processing of full-field strain data and its use in model updating

    International Nuclear Information System (INIS)

    Wang, W; Mottershead, J E; Sebastian, C M; Patterson, E A

    2011-01-01

    Finite element model updating is an inverse problem based on measured structural outputs, typically natural frequencies. Full-field responses such as static stress/strain patterns and vibration mode shapes contain valuable information for model updating, but within large volumes of highly redundant data. Pattern recognition and image processing provide feasible techniques to extract effective and efficient information, often known as shape features, from these data. For instance, the Zernike polynomials, having the properties of orthogonality and rotational invariance, are powerful decomposition kernels for a shape defined within a unit circle. In this paper, full-field strain patterns for a specimen, in the form of a square plate with a circular hole, under a tensile load are considered. Effective shape features can be constructed by a set of modified Zernike polynomials. The modification includes the application of a weighting function to the Zernike polynomials so that high strain magnitudes around the hole are well represented. The Gram-Schmidt process is then used to ensure orthogonality of the obtained decomposition kernels over the domain of the specimen. The difference between full-field strain patterns measured by digital image correlation (DIC) and reconstructed using 15 shape features (Zernike moment descriptors, ZMDs) at different steps in the elasto-plastic deformation of the specimen is found to be very small. It is significant that only a very small number of shape features is necessary and sufficient to represent the full-field data. Model updating of nonlinear elasto-plastic material properties is carried out by adjusting the parameters of an FE model until the FE strain pattern converges upon the measured strains as determined using ZMDs.
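The orthogonalization step described above can be sketched generically: orthonormalize a set of decomposition kernels sampled over the domain, then project the measured field onto them to obtain a small number of shape features. The sketch below uses simple 1-D monomials as stand-in kernels rather than the paper's weighted Zernike polynomials; all names and data are illustrative.

```python
import numpy as np

def orthonormal_basis(raw_basis):
    # Gram-Schmidt over the sampled domain: raw_basis is (n_points, n_kernels).
    # Returns columns orthonormal w.r.t. the discrete inner product.
    q = []
    for k in range(raw_basis.shape[1]):
        v = raw_basis[:, k].astype(float)
        for u in q:
            v = v - (u @ v) * u   # remove components along earlier kernels
        v = v / np.linalg.norm(v)
        q.append(v)
    return np.column_stack(q)

# Hypothetical 1-D stand-in for the plate domain; real use would sample the
# weighted Zernike polynomials on the 2-D specimen geometry.
x = np.linspace(-1.0, 1.0, 200)
raw = np.column_stack([x**p for p in range(5)])   # monomial "kernels"
basis = orthonormal_basis(raw)

field = np.exp(-4 * x**2)          # stand-in for a measured strain field
features = basis.T @ field         # shape features (analogous to ZMDs)
reconstruction = basis @ features  # low-order reconstruction of the field
```

The key property, as in the record above, is that a handful of feature coefficients can stand in for the full-field data during model updating.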

  19. Developments in UK defect assessment procedures R6 revision 4 and BS7910

    International Nuclear Information System (INIS)

    Sharples, J.K.; Ainsworth, R.A.; Budden, P.J.

    2003-01-01

    The R6 defect assessment procedures have been developed over many years by the UK nuclear power generation industry. The procedures are updated on a regular basis, taking into account the information resulting from the R6 development programme and other available information worldwide. A major revision, Revision 4, of the R6 procedures was released in 2000. Just prior to that release, in 1999, the British Standards flaw assessment procedure BS7910 was issued, which combined and updated the previously published documents PD6493 and PD6539, for components operating at temperatures where creep is negligible and important, respectively. BS7910 is also under constant development. This paper provides a brief overview of the BS7910 and R6 Revision 4 procedures and describes updates to the respective documents since they were first issued. Some ongoing developments which will lead to future revisions to the documents are also described. (author)

  20. Model checking as an aid to procedure design

    International Nuclear Information System (INIS)

    Zhang, Wenhu

    2001-01-01

    The OECD Halden Reactor Project has been actively working on computer assisted operating procedures for many years. The objective of the research has been to provide computerised assistance for procedure design, verification and validation, implementation and maintenance. For the verification purpose, the application of formal methods has been considered in several reports. The recent formal verification activity conducted at the Halden Project is based on applying model checking to the verification of procedures. This report presents verification approaches based on different model checking techniques and tools for the formalization and verification of operating procedures. Possible problems and relative merits of the different approaches are discussed. A case study of one of the approaches is presented to show the practical application of formal verification. Application of formal verification in the traditional procedure design process can reduce the human resources involved in reviews and simulations, and hence reduce the cost of verification and validation. A discussion of the integration of the formal verification with the traditional procedure design process is given at the end of this report. (Author)

  1. Predicting Individual Physiological Responses During Marksmanship Field Training Using an Updated SCENARIO-J Model

    National Research Council Canada - National Science Library

    Yokota, Miyo

    2004-01-01

    ...)) for individual variation and a metabolic rate (M) correction during downhill movements. This study evaluated the updated version of the model incorporating these new features, using a dataset collected during U.S. Marine Corps (USMC...

  2. Combining Multi-Source Remotely Sensed Data and a Process-Based Model for Forest Aboveground Biomass Updating.

    Science.gov (United States)

    Lu, Xiaoman; Zheng, Guang; Miller, Colton; Alvarado, Ernesto

    2017-09-08

    Monitoring and understanding the spatio-temporal variations of forest aboveground biomass (AGB) is a key basis to quantitatively assess the carbon sequestration capacity of a forest ecosystem. To map and update forest AGB in the Greater Khingan Mountains (GKM) of China, this work proposes a physical-based approach. Based on the baseline forest AGB from Landsat Enhanced Thematic Mapper Plus (ETM+) images in 2008, we dynamically updated the annual forest AGB from 2009 to 2012 by adding the annual AGB increment (ABI) obtained from the simulated daily and annual net primary productivity (NPP) using the Boreal Ecosystem Productivity Simulator (BEPS) model. The 2012 result was validated by both field- and aerial laser scanning (ALS)-based AGBs. The predicted forest AGB for 2012 estimated from the process-based model can explain 31% ( n = 35, p forest AGBs, respectively. However, due to the saturation of optical remote sensing-based spectral signals and contribution of understory vegetation, the BEPS-based AGB tended to underestimate/overestimate the AGB for dense/sparse forests. Generally, our results showed that the remotely sensed forest AGB estimates could serve as the initial carbon pool to parameterize the process-based model for NPP simulation, and the combination of the baseline forest AGB and BEPS model could effectively update the spatiotemporal distribution of forest AGB.
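The updating scheme described above (a baseline AGB carried forward by adding each year's NPP-derived aboveground biomass increment) reduces to a running sum. A minimal sketch, with purely illustrative numbers:

```python
def update_agb(baseline_agb, annual_abi):
    # Propagate a baseline aboveground biomass (e.g. Mg/ha from a remote-
    # sensing estimate) forward by adding each year's simulated aboveground
    # biomass increment (ABI), here assumed to come from modeled NPP.
    series = {2008: baseline_agb}
    agb = baseline_agb
    for year, abi in sorted(annual_abi.items()):
        agb += abi
        series[year] = agb
    return series

# Illustrative values only (not taken from the study).
abi_from_npp = {2009: 2.1, 2010: 1.8, 2011: 2.4, 2012: 2.0}
trajectory = update_agb(55.0, abi_from_npp)
```

The design choice mirrors the record above: the remotely sensed baseline initializes the carbon pool, and the process model supplies only the year-to-year increments.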

  3. Enhancing photogrammetric 3d city models with procedural modeling techniques for urban planning support

    International Nuclear Information System (INIS)

    Schubiger-Banz, S; Arisona, S M; Zhong, C

    2014-01-01

    This paper presents a workflow to increase the level of detail of reality-based 3D urban models. It combines the established workflows from photogrammetry and procedural modeling in order to exploit distinct advantages of both approaches. The combination has advantages over purely automatic acquisition in terms of visual quality, accuracy and model semantics. Compared to manual modeling, procedural techniques can be much more time effective while maintaining the qualitative properties of the modeled environment. In addition, our method includes processes for procedurally adding additional features such as road and rail networks. The resulting models meet the increasing needs in urban environments for planning, inventory, and analysis

  4. Applying Modeling Tools to Ground System Procedures

    Science.gov (United States)

    Di Pasquale, Peter

    2012-01-01

    As part of a long-term effort to revitalize the Ground Systems (GS) Engineering Section practices, Systems Modeling Language (SysML) and Business Process Model and Notation (BPMN) have been used to model existing GS products and the procedures GS engineers use to produce them.

  5. Key Update Assistant for Resource-Constrained Networks

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2012-01-01

    developed a push-button solution - powered by stochastic model checking - that network designers can easily benefit from, and it paves the way for consumers to set up key update related security parameters. Key Update Assistant, as we named it, runs necessary model checking operations and determines...

  6. Update on procedure-related risks for prenatal diagnosis techniques

    DEFF Research Database (Denmark)

    Tabor, Ann; Alfirevic, Zarko

    2010-01-01

    Introduction: As a consequence of the introduction of effective screening methods, the number of invasive prenatal diagnostic procedures is steadily declining. The aim of this review is to summarize the risks related to these procedures. Material and Methods: Review of the literature. Results: Data...... from randomised controlled trials as well as from systematic reviews and a large national registry study are consistent with a procedure-related miscarriage rate of 0.5-1.0% for amniocentesis as well as for chorionic villus sampling (CVS). In single-center studies performance may be remarkably good due...... not be performed before 15 + 0 weeks' gestation. CVS on the other hand should not be performed before 10 weeks' gestation due to a possible increase in risk of limb reduction defects. Discussion: Experienced operators have a higher success rate and a lower complication rate. The decreasing number of prenatal...

  7. Body Dysmorphic Disorder: Neurobiological Features and an Updated Model

    Science.gov (United States)

    Li, Wei; Arienzo, Donatello; Feusner, Jamie D.

    2013-01-01

    Body Dysmorphic Disorder (BDD) affects approximately 2% of the population and involves misperceived defects of appearance along with obsessive preoccupation and compulsive behaviors. There is evidence of neurobiological abnormalities associated with symptoms in BDD, although research to date is still limited. This review covers the latest neuropsychological, genetic, neurochemical, psychophysical, and neuroimaging studies and synthesizes these findings into an updated (yet still preliminary) neurobiological model of the pathophysiology of BDD. We propose a model in which visual perceptual abnormalities, along with frontostriatal and limbic system dysfunction, may combine to contribute to the symptoms of impaired insight and obsessive thoughts and compulsive behaviors expressed in BDD. Further research is necessary to gain a greater understanding of the etiological formation of BDD symptoms and their evolution over time. PMID:25419211

  8. The European Stroke Organisation Guidelines: a standard operating procedure.

    Science.gov (United States)

    Ntaios, George; Bornstein, Natan M; Caso, Valeria; Christensen, Hanne; De Keyser, Jacques; Diener, Hans-Christoph; Diez-Tejedor, Exuperio; Ferro, Jose M; Ford, Gary A; Grau, Armin; Keller, Emanuella; Leys, Didier; Russell, David; Toni, Danilo; Turc, Guillaume; Van der Worp, Bart; Wahlgren, Nils; Steiner, Thorsten

    2015-10-01

    In 2008, the recently founded European Stroke Organisation published its guidelines for the management of ischemic stroke and transient ischemic attack. This highly cited document was translated into several languages and was updated in 2009. Since then, the European Stroke Organisation has published guidelines for the management of intracranial aneurysms and subarachnoid hemorrhage, for the establishment of stroke units and stroke centers, and recently for the management of intracerebral hemorrhage. In recent years, the methodology for the development of guidelines has evolved significantly. To keep pace with this progress and driven by the strong determination of the European Stroke Organisation to further promote stroke management, education, and research, the European Stroke Organisation decided to delineate a detailed standard operating procedure for its guidelines. There are two important cornerstones in this standard operating procedure: the first is the implementation of the Grading of Recommendations Assessment, Development, and Evaluation methodology for the development of its Guideline Documents. The second is the decision of the European Stroke Organisation to move from the classical model of a single Guideline Document about a major topic (e.g. management of ischemic stroke) to focused modules (i.e. subdivisions of a major topic). This will enable the European Stroke Organisation to react faster when new developments in a specific stroke field occur and to update its recommendations on the related module rather swiftly; with the previous approach of a single large Guideline Document, its entire revision had to be completed before an updated publication, delaying the production of up-to-date guidelines. After discussion within the European Stroke Organisation Guidelines Committee and significant input from European Stroke Organisation members as well as methodologists and analysts, this document presents the official standard operating procedure for

  9. Updating river basin models with radar altimetry

    DEFF Research Database (Denmark)

    Michailovsky, Claire Irene B.

    suited for use in data assimilation frameworks which combine the information content from models and current observations to produce improved forecasts and reduce prediction uncertainty. The focus of the second and third papers of this thesis was therefore the use of radar altimetry as update data...... of political unwillingness to share data which is a common problem in particular in transboundary settings. In this context, remote sensing (RS) datasets provide an appealing alternative to traditional in-situ data and much research effort has gone into the use of these datasets for hydrological applications...... response of a catchment to meteorological forcing. While river discharge cannot be directly measured from space, radar altimetry (RA) can measure water level variations in rivers at the locations where the satellite ground track and river network intersect called virtual stations or VS. In this PhD study...

  10. Some safe and sensible shortcuts for efficiently upscaled updates of existing elevation models.

    Science.gov (United States)

    Knudsen, Thomas; Aasbjerg Nielsen, Allan

    2013-04-01

    The Danish national elevation model, DK-DEM, was introduced in 2009 and is based on LiDAR data collected in the time frame 2005-2007. Hence, DK-DEM is aging, and it is time to consider how to integrate new data with the current model in a way that improves the representation of new landscape features, while still preserving the overall (very high) quality of the model. In LiDAR terms, 2005 is equivalent to some time between the palaeolithic and the neolithic. So evidently, when (and if) an update project is launched, we may expect some notable improvements due to the technical and scientific developments from the last half decade. To estimate the magnitude of these potential improvements, and to devise efficient and effective ways of integrating the new and old data, we currently carry out a number of case studies based on comparisons between the current terrain model (with a ground sample distance, GSD, of 1.6 m) and a number of new high resolution point clouds (10-70 points/m2). Not knowing anything about the terms of a potential update project, we consider multiple scenarios, ranging from business as usual (a new model with the same GSD, but improved precision) to aggressive upscaling (a new model with 4 times better GSD, i.e. a 16-fold increase in the amount of data). Especially in the latter case, speeding up the gridding process is important. Luckily, recent results from one of our case studies reveal that for very high resolution data in smooth terrain (which is the common case in Denmark), using the local mean (LM) as grid value estimator is only negligibly worse than using the theoretically "best" estimator, i.e. ordinary kriging (OK) with rigorous modelling of the semivariogram. The bias in a leave-one-out cross validation differs on the micrometer level, while the RMSE differs on the 0.1 mm level. This is fortunate, since an LM estimator can be implemented in plain stream mode, letting the points from the unstructured point cloud (i.e. no TIN generation) stream
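The stream-mode local mean (LM) gridding mentioned above can be sketched as a single pass over the point cloud, accumulating per-cell sums and counts without building a TIN; the cell size and points below are hypothetical:

```python
from collections import defaultdict

def local_mean_grid(points, cell_size):
    # Stream over an unstructured (x, y, z) point cloud and estimate each
    # grid cell as the local mean (LM) of the points falling inside it.
    # Only a running sum and count per cell are kept, so the input can be
    # an arbitrarily large iterator rather than an in-memory array.
    sums = defaultdict(float)
    counts = defaultdict(int)
    for x, y, z in points:
        cell = (int(x // cell_size), int(y // cell_size))
        sums[cell] += z
        counts[cell] += 1
    return {cell: sums[cell] / counts[cell] for cell in sums}

# Hypothetical points; a real run would stream millions of LiDAR returns.
pts = [(0.2, 0.3, 10.0), (0.7, 0.9, 12.0), (1.5, 0.1, 20.0)]
grid = local_mean_grid(pts, cell_size=1.0)
```

This single-pass structure is what makes LM attractive over ordinary kriging at a 16-fold data increase: memory scales with the number of occupied cells, not the number of points.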

  11. The updating of clinical practice guidelines: insights from an international survey

    Directory of Open Access Journals (Sweden)

    Solà Ivan

    2011-09-01

    Full Text Available Abstract Background Clinical practice guidelines (CPGs) have become increasingly popular, and the methodology to develop guidelines has evolved enormously. However, little attention has been given to the updating process, in contrast to the appraisal of the available literature. We conducted an international survey to identify current practices in CPG updating and explored the need to standardize and improve the methods. Methods We developed a questionnaire (28 items) based on a review of the existing literature about guideline updating and expert comments. We carried out the survey between March and July 2009, and it was sent by email to 106 institutions: 69 members of the Guidelines International Network who declared that they developed CPGs; 30 institutions included in the U.S. National Guideline Clearinghouse database that published more than 20 CPGs; and 7 institutions selected by an expert committee. Results Forty-four institutions answered the questionnaire (42% response rate). In the final analysis, 39 completed questionnaires were included. Thirty-six institutions (92%) reported that they update their guidelines. Thirty-one institutions (86%) have a formal procedure for updating their guidelines, and 19 (53%) have a formal procedure for deciding when a guideline becomes out of date. Institutions describe the process as moderately rigorous (36%) or acknowledge that it could certainly be more rigorous (36%). Twenty-two institutions (61%) alert guideline users on their website when a guideline is older than three to five years or when there is a risk of being outdated. Twenty-five institutions (64%) support the concept of "living guidelines," which are continuously monitored and updated. Eighteen institutions (46%) have plans to design a protocol to improve their guideline-updating process, and 21 (54%) are willing to share resources with other organizations. Conclusions Our study is the first to describe the process of updating CPGs among prominent

  12. Updating sea spray aerosol emissions in the Community Multiscale Air Quality (CMAQ) model version 5.0.2

    Data.gov (United States)

    U.S. Environmental Protection Agency — The uploaded data consists of the BRACE Na aerosol observations paired with CMAQ model output, the updated model's parameterization of sea salt aerosol emission size...

  13. EOP Improvement Proposal for SGTR based on The OPR PSA Update

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Hee; Cho, Jae Hyun; Kim, Dong San; Yang, Joon Eon [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    This updating process was also focused on enhancing PSA quality and reflecting the as-built and as-operated conditions of the target plants. For this purpose, the EOP (Emergency Operating Procedure) and AOP (Abnormal Operating Procedure) of the target plant were reviewed in detail and various thermal hydraulic (T/H) analyses were also performed to develop a realistic PSA accident sequence model. In this paper, the unreasonable points of the SGTR (Steam Generator Tube Rupture) EOP from a PSA perspective were identified, and EOP improvement items are proposed to enhance safety and operator convenience for the target plant.

  14. Updating Human Factors Engineering Guidelines for Conducting Safety Reviews of Nuclear Power Plants

    International Nuclear Information System (INIS)

    O'Hara, J.M.; Higgins, J.; Fleger, Stephen

    2011-01-01

    The U.S. Nuclear Regulatory Commission (NRC) reviews the human factors engineering (HFE) programs of applicants for nuclear power plant construction permits, operating licenses, standard design certifications, and combined operating licenses. The purpose of these safety reviews is to help ensure that personnel performance and reliability are appropriately supported. Detailed design review procedures and guidance for the evaluations are provided in three key documents: the Standard Review Plan (NUREG-0800), the HFE Program Review Model (NUREG-0711), and the Human-System Interface Design Review Guidelines (NUREG-0700). These documents were last revised in 2007, 2004 and 2002, respectively. The NRC is committed to the periodic update and improvement of the guidance to ensure that it remains a state-of-the-art design evaluation tool. To this end, the NRC is updating its guidance to stay current with recent research on human performance, advances in HFE methods and tools, and new technology being employed in plant and control room design. This paper describes the role of HFE guidelines in the safety review process and the content of the key HFE guidelines used. We then present the methodology used to develop and update this HFE guidance, and describe the current status of the update program.

  15. A simple but accurate procedure for solving the five-parameter model

    International Nuclear Information System (INIS)

    Mares, Oana; Paulescu, Marius; Badescu, Viorel

    2015-01-01

    Highlights: • A new procedure for extracting the parameters of the one-diode model is proposed. • Only the basic information listed in the datasheet of PV modules is required. • Results demonstrate a simple, robust and accurate procedure. - Abstract: The current–voltage characteristic of a photovoltaic module is typically evaluated by using a model based on the solar cell equivalent circuit. The complexity of the procedure applied for extracting the model parameters depends on the data available in the manufacturer's datasheet. Since the datasheet is not detailed enough, simplified models have to be used in many cases. This paper proposes a new procedure for extracting the parameters of the one-diode model in standard test conditions, using only the basic data listed by all manufacturers in datasheets (short circuit current, open circuit voltage and maximum power point). The procedure is validated by using manufacturers' data for six commercial crystalline silicon photovoltaic modules. Comparing the computed and measured current–voltage characteristics, the determination coefficient is in the range 0.976–0.998. Thus, the proposed procedure represents a feasible tool for solving the five-parameter model applied to crystalline silicon photovoltaic modules. The procedure is described in detail, to guide potential users to derive similar models for other types of photovoltaic modules.
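For context, the one-diode model this procedure solves relates terminal current and voltage through an implicit equation. The sketch below evaluates that characteristic for an assumed parameter set via bisection; it is not the authors' extraction procedure, and all parameter values are illustrative rather than datasheet-derived.

```python
import math

def diode_current(v, iph=8.2, i0=1e-9, a=1.5, rs=0.3, rsh=300.0):
    # Solve the implicit one-diode equation
    #   I = Iph - I0*(exp((V + I*Rs)/a) - 1) - (V + I*Rs)/Rsh
    # for the terminal current I at voltage V, using bisection.
    # iph: photocurrent, i0: saturation current, a: modified ideality
    # factor, rs/rsh: series/shunt resistance -- illustrative values only.
    def residual(i):
        return iph - i0 * (math.exp((v + i * rs) / a) - 1.0) - (v + i * rs) / rsh - i
    lo, hi = -1.0, iph + 1.0       # residual(lo) > 0 > residual(hi)
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if residual(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

i_sc = diode_current(0.0)          # short-circuit current, close to iph
```

An extraction procedure like the one in the record works in the opposite direction: it chooses the five parameters so that the curve produced by this equation matches the datasheet's short circuit, open circuit, and maximum power points.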

  16. Agent Communication for Dynamic Belief Update

    Science.gov (United States)

    Kobayashi, Mikito; Tojo, Satoshi

    Thus far, various formalizations of rational/logical agent models have been proposed. In this paper, we include the notion of a communication channel and a belief modality in update logic, and introduce Belief Update Logic (BUL). First, we discuss how we can reformalize the inform action of FIPA-ACL in terms of a communication channel, which represents a connection between agents. Thus, our agents can send a message only when they believe there is, and there actually is, a channel between themselves and a receiver. Then, we present a static belief logic (BL) and show its soundness and completeness. Next, we develop the logic into BUL, which can update a Kripke model by the inform action; we show that in the updated model the belief operator still satisfies K45. Thereafter, we show that every sentence in BUL can be translated into BL; thus, we can contend that BUL is also sound and complete. Furthermore, we discuss the features of BUL, including the case of inconsistent information, as well as channel transmission. Finally, we summarize our contribution and discuss some future issues.
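The model-update idea (an inform action restricting what the receiver considers possible) can be sketched with a toy Kripke structure. This is a generic public-announcement-style restriction, not the paper's exact BUL semantics, which also involve channels and K45 properties; all names here are hypothetical.

```python
def inform_update(access, valuation, receiver, fact):
    # Toy Kripke-style belief update: after an inform(fact) message reaches
    # `receiver`, drop the receiver's accessibility edges pointing to worlds
    # where `fact` does not hold. Other agents' relations are untouched.
    new_access = dict(access)
    new_access[receiver] = {
        (w, v) for (w, v) in access[receiver] if fact in valuation[v]
    }
    return new_access

valuation = {"w1": {"p"}, "w2": set()}            # p holds only at w1
access = {"alice": {("w1", "w1"), ("w1", "w2")}}  # alice considers both worlds
updated = inform_update(access, valuation, "alice", "p")
# After the update, alice only considers p-worlds possible from w1.
```

Restricting the accessibility relation in this way is the standard mechanism by which update logics make a belief operator track newly received information.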

  17. Update on Small Modular Reactors Dynamic System Modeling Tool: Web Application

    International Nuclear Information System (INIS)

    Hale, Richard Edward; Cetiner, Sacit M.; Fugate, David L.; Batteh, John J; Tiller, Michael M.

    2015-01-01

    Previous reports focused on the development of component and system models as well as end-to-end system models using Modelica and Dymola for two advanced reactor architectures: (1) Advanced Liquid Metal Reactor and (2) fluoride high-temperature reactor (FHR). The focus of this report is the release of the first beta version of the web-based application for model use and collaboration, as well as an update on the FHR model. The web-based application allows novice users to configure end-to-end system models from preconfigured choices to investigate the instrumentation and controls implications of these designs and allows for the collaborative development of individual component models that can be benchmarked against test systems for potential inclusion in the model library. A description of this application is provided along with examples of its use and a listing and discussion of all the models that currently exist in the library.

  18. Management around invasive procedures in mastocytosis : An update

    NARCIS (Netherlands)

    Hermans, Maud A. W.; Arends, Nicolette J. T.; van Wijk, Roy Gerth; van Hagen, P. Martin; Kluin-Nelemans, Hanneke C.; Elberink, Hanneke N. G. Oude; Pasmans, Suzanne G. M. A.; van Daele, Paul L. A.

    2017-01-01

    Objective: Mastocytosis is a chronic hematologic disorder that is characterized by the accumulation of aberrant mast cells and typically involves the skin and/or bone marrow. Patients with mastocytosis are at increased risk of anaphylaxis. Based on theoretical assumptions, medical procedures

  19. Land Boundary Conditions for the Goddard Earth Observing System Model Version 5 (GEOS-5) Climate Modeling System: Recent Updates and Data File Descriptions

    Science.gov (United States)

    Mahanama, Sarith P.; Koster, Randal D.; Walker, Gregory K.; Takacs, Lawrence L.; Reichle, Rolf H.; De Lannoy, Gabrielle; Liu, Qing; Zhao, Bin; Suarez, Max J.

    2015-01-01

    The Earth's land surface boundary conditions in the Goddard Earth Observing System version 5 (GEOS-5) modeling system were updated using recent high spatial and temporal resolution global data products. The updates include: (i) construction of a global 10-arcsec land-ocean-lakes-ice mask; (ii) incorporation of a 10-arcsec Globcover 2009 land cover dataset; (iii) implementation of Level 12 Pfafstetter hydrologic catchments; (iv) use of hybridized SRTM global topography data; (v) construction of the HWSDv1.21-STATSGO2 merged global 30-arcsec soil mineral and carbon data in conjunction with a highly refined soil classification system; (vi) production of diffuse visible and near-infrared 8-day MODIS albedo climatologies at 30-arcsec from the period 2001-2011; and (vii) production of the GEOLAND2 and MODIS merged 8-day LAI climatology at 30-arcsec for GEOS-5. The global data sets were preprocessed and used to construct global raster data files for the software (mkCatchParam) that computes parameters on catchment-tiles for various atmospheric grids. The updates also include a few bug fixes in mkCatchParam, as well as changes (improvements in algorithms, etc.) that allow it to produce tile-space parameters efficiently for high-resolution AGCM grids. The update process also includes the construction of data files describing the vegetation type fractions, soil background albedo, nitrogen deposition and mean annual 2 m air temperature to be used with the future Catchment CN model, and the global stream channel network to be used with the future global runoff routing model. This report provides detailed descriptions of the data production process and the data file format of each updated data set.

  20. OSATE Overview & Community Updates

    Science.gov (United States)

    2015-02-15

    Author: Julien Delange. The update covers OSATE's main language capabilities, modeling patterns and model samples for beginners, Error-Model (EMV2) examples and model constructs, and a demonstration of tools.

  1. Drawing-Based Procedural Modeling of Chinese Architectures.

    Science.gov (United States)

    Fei Hou; Yue Qi; Hong Qin

    2012-01-01

    This paper presents a novel modeling framework to build 3D models of Chinese architectures from elevation drawings. Our algorithm integrates the capability of automatic drawing recognition with powerful procedural modeling to extract production rules from an elevation drawing. First, differing from previous symbol-based floor plan recognition, and based on the novel concept of repetitive pattern trees, small horizontal repetitive regions of the elevation drawing are clustered in a bottom-up manner to form architectural components with maximum repetition, which collectively serve as building blocks for 3D model generation. Second, to discover the global architectural structure and its components' interdependencies, the components are structured into a shape tree in a top-down subdivision manner and recognized hierarchically at each level of the shape tree based on Markov Random Fields (MRFs). Third, shape grammar rules can be derived to construct a 3D semantic model and its possible variations with the help of a 3D component repository. The salient contribution lies in the novel integration of procedural modeling with elevation drawings, with a unique application to Chinese architectures.
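
    The bottom-up clustering step hinges on spotting horizontal repetitions. As a toy building block only (not the paper's repetitive-pattern-tree algorithm), the smallest repeating unit of a row of facade symbols can be found with the classic KMP failure function:

```python
def smallest_period(seq):
    """Length of the smallest repeating unit of seq, computed from the
    KMP failure function; returns len(seq) if seq does not repeat evenly."""
    n = len(seq)
    if n == 0:
        return 0
    fail = [0] * n  # fail[i]: length of longest proper border of seq[:i+1]
    k = 0
    for i in range(1, n):
        while k and seq[i] != seq[k]:
            k = fail[k - 1]
        if seq[i] == seq[k]:
            k += 1
        fail[i] = k
    period = n - fail[-1]
    return period if n % period == 0 else n
```

    For instance, a facade strip encoded as ["window", "wall", "window", "wall"] has a repeating unit of length 2, which a pattern-tree style method could then treat as one repeated component.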

  2. Experimental Update of the Overtopping Model Used for the Wave Dragon Wave Energy Converter

    Directory of Open Access Journals (Sweden)

    Erik Friis-Madsen

    2013-04-01

    An overtopping model specifically suited to Wave Dragon is needed in order to improve the reliability of its performance estimates. The model must account for all the relevant physical processes that affect overtopping and be flexible enough to adapt to any local conditions and device configuration. An experimental investigation is carried out to update an existing formulation suited for 2D draft-limited, low-crested structures, in order to include the effects on the overtopping flow of the wave steepness, the 3D geometry of Wave Dragon, the wing reflectors, the device motions and the non-rigid connection between platform and reflectors. The study is carried out in four phases, each of them specifically targeted at quantifying one of these effects through a sensitivity analysis and at modeling it through custom-made parameters. These depend on features of the wave or the device configuration, all of which can be measured in real time. Instead of using new fitting coefficients, this approach allows a broader applicability of the model beyond the Wave Dragon case, to any overtopping WEC or structure within the range of tested conditions. The reliability of overtopping predictions for Wave Dragon has increased, as the updated model offers improved accuracy and precision with respect to the former version.
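
    Formulations of this family are typically exponential in the relative crest freeboard, with multiplicative corrections layered on top. The sketch below uses the classic EurOtop-type form q / sqrt(g*Hs^3) = a * exp(b * Rc / Hs) with illustrative coefficients and a single lumped correction factor standing in for the custom parameters described above; it is not the fitted Wave Dragon model.

```python
import math

G = 9.81  # gravitational acceleration [m/s^2]

def overtopping_rate(hs, rc, a=0.2, b=-2.6, correction=1.0):
    """Mean overtopping discharge per metre of crest [m^3/s/m] from the
    exponential low-crested-structure form:
        q / sqrt(g * Hs^3) = a * exp(b * Rc / Hs)
    where Hs is the significant wave height [m] and Rc the crest freeboard [m].
    The coefficients a and b are illustrative EurOtop-type values, and the
    'correction' factor lumps together device-specific effects (steepness,
    3D geometry, motions, ...) such as those the abstract parameterizes."""
    q_dimensionless = a * math.exp(b * rc / hs)
    return correction * q_dimensionless * math.sqrt(G * hs ** 3)
```

    As expected for this family of models, raising the crest freeboard reduces the predicted discharge exponentially, while the correction factor scales it linearly.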

  3. A sow replacement model using Bayesian updating in a three-level hierarchic Markov process. I. Biological model

    DEFF Research Database (Denmark)

    Kristensen, Anders Ringgaard; Søllested, Thomas Algot

    2004-01-01

    Several replacement models have been presented in the literature. In other application areas, such as dairy cow replacement, methodological improvements like hierarchical Markov processes and Bayesian updating have been implemented, but not in sow models. Furthermore, there are methodological improvements, such as multi-level hierarchical Markov processes with decisions on multiple time scales, efficient methods for parameter estimation at herd level, and standard software, that have hardly been implemented in any replacement model. The aim of this study is to present a sow replacement model that really uses all these methodological improvements. In this paper, the biological model describing the performance and feed intake of sows is presented. In particular, estimation of herd-specific parameters is emphasized. The optimization model is described in a subsequent paper.

  4. Hydrogeological structure model of the Olkiluoto Site. Update in 2010

    International Nuclear Information System (INIS)

    Vaittinen, T.; Ahokas, H.; Nummela, J.; Paulamaeki, S.

    2011-09-01

    As part of the programme for the final disposal of spent nuclear fuel, a hydrogeological structure model containing the hydraulically significant zones on Olkiluoto Island has been compiled. The structure model describes the deterministic site-scale zones that dominate the groundwater flow. The main objective of the study is to provide the geometry and the hydrogeological properties related to the groundwater flow for the zones and the sparsely fractured bedrock, to be used in the numerical modelling of groundwater flow and geochemical transport and thereby in the safety assessment. These zones should also be taken into account in the repository layout and in the construction of the disposal facility, and they have a long-term impact on the evolution of the site and the safety of the disposal repository. The previous hydrogeological model was compiled in 2008 and this updated version is based on data available at the end of May 2010. The updating was based on new hydrogeological observations and a systematic approach covering all drillholes to assess measured fracture transmissivities typical of the site-scale hydrogeological zones. New data consisted of head observations and interpreted pressure and flow responses caused by field activities. Essential background data for the modelling included the ductile deformation model and the site-scale brittle deformation zones modelled in the geological site model (GSM) version 2.0. The GSM combines both geological and geophysical investigation data on the site. As a result of the modelling campaign, hydrogeological zones HZ001, HZ008, HZ19A, HZ19B, HZ19C, HZ20A, HZ20B, HZ21, HZ21B, HZ039, HZ099, OL-BFZ100, and HZ146 were included in the structure model. Compared with the previous model, zone HZ004 was replaced with zone HZ146 and zone HZ039 was introduced for the first time. Alternative zone HZ21B was included in the basic model.
For the modelled zones, both the zone intersections, describing the fractures with dominating groundwater

  5. Fena Valley Reservoir watershed and water-balance model updates and expansion of watershed modeling to southern Guam

    Science.gov (United States)

    Rosa, Sarah N.; Hay, Lauren E.

    2017-12-01

    In 2014, the U.S. Geological Survey, in cooperation with the U.S. Department of Defense's Strategic Environmental Research and Development Program, initiated a project to evaluate the potential impacts of projected climate change on Department of Defense installations that rely on Guam's water resources. A major task of that project was to develop a watershed model of southern Guam and a water-balance model for the Fena Valley Reservoir. The southern Guam watershed model provides a physically based tool to estimate surface-water availability in southern Guam. The U.S. Geological Survey's Precipitation Runoff Modeling System, PRMS-IV, was used to construct the watershed model. The PRMS-IV code simulates different parts of the hydrologic cycle based on a set of user-defined modules. The southern Guam watershed model was constructed by updating a watershed model for the Fena Valley watersheds and expanding the modeled area to include all of southern Guam. The Fena Valley watershed model was combined with a previously developed, but recently updated and recalibrated, Fena Valley Reservoir water-balance model. Two important surface-water resources for the U.S. Navy and the citizens of Guam were modeled in this study; the extended model now includes the Ugum River watershed and improves upon the previous model of the Fena Valley watersheds. Surface water from the Ugum River watershed is diverted and treated for drinking water, and the Fena Valley watersheds feed the largest surface-water reservoir on Guam. The southern Guam watershed model performed "very good," according to the criteria of Moriasi and others (2007), in the Ugum River watershed above Talofofo Falls, with monthly Nash-Sutcliffe efficiency statistic values of 0.97 for the calibration period and 0.93 for the verification period (a value of 1.0 represents perfect model fit). In the Fena Valley watershed, monthly simulated streamflow volumes from the watershed model compared reasonably well with the
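
    The monthly Nash-Sutcliffe efficiency values quoted above (0.97 and 0.93) come from a standard goodness-of-fit statistic comparing simulated against observed streamflow. A minimal sketch with made-up flow values:

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1.0 is a perfect fit; 0.0 means the model
    predicts no better than the mean of the observations; it can go negative."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot
```

    Applied to monthly streamflow volumes, a simulation tracking the observations closely scores near 1.0, while one that always predicts the observed mean scores exactly 0.0.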

  6. Updated Life-Cycle Assessment of Aluminum Production and Semi-fabrication for the GREET Model

    Energy Technology Data Exchange (ETDEWEB)

    Dai, Qiang [Argonne National Lab. (ANL), Argonne, IL (United States); Kelly, Jarod C. [Argonne National Lab. (ANL), Argonne, IL (United States); Burnham, Andrew [Argonne National Lab. (ANL), Argonne, IL (United States); Elgowainy, Amgad [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-09-01

    This report serves as an update of the life-cycle analysis (LCA) of aluminum production, based on the most recent data representing the state of the art of the industry in North America. The 2013 Aluminum Association (AA) LCA report on the environmental footprint of semifinished aluminum products in North America provides the basis for the update (The Aluminum Association, 2013). The scope of this study covers primary aluminum production, secondary aluminum production, and aluminum semi-fabrication processes, including hot rolling, cold rolling, extrusion and shape casting. This report focuses on energy consumption, material inputs and criteria air pollutant emissions for each process from the cradle to the gate of aluminum, which starts with bauxite extraction and ends with the manufacturing of semi-fabricated aluminum products. The life-cycle inventory (LCI) tables compiled are to be incorporated into the vehicle-cycle model of Argonne National Laboratory's Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation (GREET) model for the release of its 2015 version.

  7. A Stepwise Fitting Procedure for automated fitting of Ecopath with Ecosim models

    Directory of Open Access Journals (Sweden)

    Erin Scott

    2016-01-01

    The Stepwise Fitting Procedure automates the testing of alternative hypotheses used for fitting Ecopath with Ecosim (EwE) models to observed reference data (Mackinson et al. 2009). The calibration of EwE model predictions to observed data is important for evaluating any model that will be used for ecosystem-based management. Thus far, the model fitting procedure in EwE has been carried out manually: a repetitive task involving setting up >1000 specific individual searches to find the statistically 'best fit' model. The novel fitting procedure automates the manual procedure, thereby producing accurate results and letting the modeller concentrate on investigating the 'best fit' model for ecological accuracy.
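
    The automated search can be thought of as a loop over candidate hypotheses that keeps whichever addition most improves the fit statistic. A generic greedy sketch under that reading; the evaluate callback (e.g. returning AIC or a sum of squares) and the hypothesis encoding are illustrative, not the EwE plugin's actual interface:

```python
def stepwise_fit(candidates, evaluate):
    """Greedy stepwise search over alternative hypotheses: at each step,
    add the single candidate that most improves the fit statistic
    (lower is better, e.g. AIC or sum of squares); stop when none helps."""
    selected = []
    best = evaluate(selected)
    improved = True
    while improved:
        improved = False
        for cand in candidates:
            if cand in selected:
                continue
            score = evaluate(selected + [cand])
            if score < best:
                best, chosen, improved = score, cand, True
        if improved:
            selected.append(chosen)
    return selected, best
```

    With a toy fit statistic that is minimized exactly when the hypotheses "a" and "b" are selected, the search adds them one at a time and then stops, mirroring how the automated procedure replaces the manual one-search-at-a-time workflow.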

  8. A new frequency matching technique for FRF-based model updating

    Science.gov (United States)

    Yang, Xiuming; Guo, Xinglin; Ouyang, Huajiang; Li, Dongsheng

    2017-05-01

    Frequency Response Function (FRF) residues have been widely used to update finite element models. As raw measurement information, they offer rich data and avoid modal-extraction errors. However, like other sensitivity-based methods, an FRF-based identification method must also face the ill-conditioning problem, which is even more serious here since the sensitivity of the FRF in the vicinity of a resonance is much greater than elsewhere. Furthermore, for a given measured frequency, directly using the theoretical FRF at that frequency may lead to a huge difference between the theoretical FRF and the corresponding experimental FRF, which in turn amplifies the effects of measurement errors and damping. Hence, in the solution process, correct selection of the appropriate frequency at which to evaluate the theoretical FRF in every iteration of the sensitivity-based approach is an effective way to improve the robustness of an FRF-based algorithm. A primary tool for frequency selection based on the correlation of FRFs is the Frequency Domain Assurance Criterion. This paper presents a new frequency selection method that directly finds the frequency minimizing the difference in order of magnitude between the theoretical and experimental FRFs. A simulated truss structure is used to compare the performance of the different frequency selection methods. For realism, it is assumed that not all degrees of freedom (DoFs) are available for measurement. The minimum number of DoFs each approach requires to correctly update the analytical model is used as the identification criterion.
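
    The core idea, matching frequencies by order of magnitude of the FRF rather than by equal frequency value, can be sketched on a single-DoF oscillator. The paper itself uses a truss model; the system, parameters and search grid here are purely illustrative:

```python
import math

def receptance(omega, m, k, c):
    """Driving-point FRF (receptance) of a single-DoF oscillator."""
    return 1.0 / complex(k - m * omega ** 2, c * omega)

def match_frequency(h_measured, omega_grid, m, k, c):
    """Select the analytical frequency whose theoretical FRF magnitude is
    closest to the measured magnitude on a log (order-of-magnitude) scale,
    instead of reusing the measured frequency itself."""
    target = math.log10(abs(h_measured))
    return min(omega_grid,
               key=lambda w: abs(math.log10(abs(receptance(w, m, k, c))) - target))
```

    For example, a measurement taken at 8.0 rad/s on a "true" system (m = 1, k = 100, c = 0.5) is matched to a slightly higher frequency on a stiffer analytical model (k = 110), where the model's FRF has essentially the same magnitude, so the residue compared in the update is not dominated by the stiffness shift alone.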

  9. A skin abscess model for teaching incision and drainage procedures.

    Science.gov (United States)

    Fitch, Michael T; Manthey, David E; McGinnis, Henderson D; Nicks, Bret A; Pariyadath, Manoj

    2008-07-03

    Skin and soft tissue infections are increasingly prevalent clinical problems, and it is important for health care practitioners to be well trained in how to treat skin abscesses. A realistic model of abscess incision and drainage will allow trainees to learn and practice this basic physician procedure. We developed a realistic model of skin abscess formation to demonstrate the technique of incision and drainage for educational purposes. The creation of this model is described in detail in this report. This model has been successfully used to develop and disseminate a multimedia video production for teaching this medical procedure. Clinical faculty and resident physicians find this model to be a realistic method for demonstrating abscess incision and drainage. This manuscript provides a detailed description of our model of abscess incision and drainage for medical education. Clinical educators can incorporate this model into skills labs or demonstrations for teaching this basic procedure.

  10. Minimally invasive splenectomy: an update and review

    Science.gov (United States)

    Gamme, Gary; Birch, Daniel W.; Karmali, Shahzeer

    2013-01-01

    Laparoscopic splenectomy (LS) has become an established standard of care in the management of surgical diseases of the spleen. The present article is an update and review of current procedures and controversies regarding minimally invasive splenectomy. We review the indications and contraindications for LS as well as preoperative considerations. An individual assessment of the procedures and outcomes of multiport laparoscopic splenectomy, hand-assisted laparoscopic splenectomy, robotic splenectomy, natural orifice transluminal endoscopic splenectomy and single-port splenectomy is included. Furthermore, this review examines postoperative considerations after LS, including the postoperative course of uncomplicated patients, postoperative portal vein thrombosis, infections and malignancy. PMID:23883500

  11. Effects of lateral boundary condition resolution and update frequency on regional climate model predictions

    Science.gov (United States)

    Pankatz, Klaus; Kerkweg, Astrid

    2015-04-01

    The work presented is part of the joint project "DecReg" ("Regional decadal predictability"), which is in turn part of the project "MiKlip" ("Decadal predictions"), an effort funded by the German Federal Ministry of Education and Research to improve decadal predictions on global and regional scales. In MiKlip, one big question is whether regional climate modeling shows "added value", i.e. whether regional climate models (RCMs) produce better results than the driving models. However, the scope of this study is to look more closely at the setup-specific details of regional climate modeling. As regional models only simulate a small domain, they have to inherit information about the state of the atmosphere at their lateral boundaries from external data sets. There are many unresolved questions concerning the setup of lateral boundary conditions (LBCs). External data sets come from global models or from global reanalysis data sets. A temporal resolution of six hours is common for this kind of data, mainly because storage space is a limiting factor, especially for climate simulations. Theoretically, however, the coupling frequency could be as high as the time step of the driving model, and it is unclear whether a more frequent update of the LBCs has a significant effect on the climate in the domain of the RCM. The first study examines how the RCM reacts to a higher update frequency. It is based on a 30-year time-slice experiment for three update frequencies of the LBC, namely six hours, one hour and six minutes. The evaluation of means, standard deviations and statistics of the climate in the regional domain shows only small deviations, some statistically significant though, in 2 m temperature, sea level pressure and precipitation. The second part of the first study assesses parameters linked to cyclone activity, which is affected by the LBC update frequency.
Differences in track density and strength are found when comparing the simulations

  12. Procedural Personas for Player Decision Modeling and Procedural Content Generation

    DEFF Research Database (Denmark)

    Holmgård, Christoffer

    2016-01-01

    How can player models and artificially intelligent (AI) agents be useful in early-stage iterative game and simulation design? One answer may be as ways of generating synthetic play-test data, before a game or level has ever seen a player, or when the sampled amount of play-test data is very low. This thesis explores methods for creating low-complexity, easily interpretable, generative AI agents for use in game and simulation design. Based on insights from decision theory and behavioral economics, the thesis investigates how player decision-making styles may be defined, operationalised, and measured. These methods for constructing procedural personas are then integrated with existing procedural content generation systems, acting as critics that shape the output of those systems, optimizing generated content for different personas and, by extension, different kinds of players and their decision-making styles.

  13. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

  15. Standard Model updates and new physics analysis with the Unitarity Triangle fit

    International Nuclear Information System (INIS)

    Bevan, A.; Bona, M.; Ciuchini, M.; Derkach, D.; Franco, E.; Silvestrini, L.; Lubicz, V.; Tarantino, C.; Martinelli, G.; Parodi, F.; Schiavi, C.; Pierini, M.; Sordini, V.; Stocchi, A.; Vagnoni, V.

    2013-01-01

    We present the summer 2012 update of the Unitarity Triangle (UT) analysis performed by the UTfit Collaboration within the Standard Model (SM) and beyond. The increased accuracy of several of the fundamental constraints is now enhancing some of the tensions amongst and within the constraints themselves. In particular, the long-standing tension between exclusive and inclusive determinations of the V_ub and V_cb CKM matrix elements is now playing a major role. We then present the generalisation of the UT analysis to investigate new physics (NP) effects, updating the constraints on NP contributions to ΔF=2 processes. In the NP analysis, both CKM and NP parameters are fitted simultaneously to obtain the possible NP effects in any specific sector. Finally, based on the NP constraints, we derive upper bounds on the coefficients of the most general ΔF=2 effective Hamiltonian. These upper bounds can be translated into lower bounds on the scale of NP that contributes to these low-energy effective interactions.

  16. A systematic decision-making process on the need for updating clinical practice guidelines proved to be feasible in a pilot study.

    Science.gov (United States)

    Becker, Monika; Jaschinski, Thomas; Eikermann, Michaela; Mathes, Tim; Bühn, Stefanie; Koppert, Wolfgang; Leffler, Andreas; Neugebauer, Edmund; Pieper, Dawid

    2018-04-01

    The objective of this study was to test and evaluate a new decision-making process on the need for updating within the update of a German clinical practice guideline (CPG). The pilot study comprised (1) limited searches in PubMed to identify new potentially relevant evidence, (2) an online survey among the members of the CPG group to assess the need for an update, and (3) a consensus conference for the determination and prioritization of guideline sections with a high need for an update. Subsequently, we conducted a second online survey to evaluate the procedure. The searches resulted in 902 abstracts that were graded as new potentially relevant evidence. Twenty-five of 39 members of the CPG group (64%) participated in the online survey. Seventy-six percent of those took part in the second online survey. The evaluation study found an average support grade of 3.65 (standard deviation: 0.76) for the procedure for determining the need for an update, on a Likert scale from 1 = "no support" to 5 = "very strong support." The procedure presents a systematic approach for assessing whether and to what extent a CPG requires updating, and enables setting priorities as to which particular guideline sections to update within a CPG.

  17. Procedural Skills Education – Colonoscopy as a Model

    Directory of Open Access Journals (Sweden)

    Maitreyi Raman

    2008-01-01

    Traditionally, surgical and procedural apprenticeship has been an assumed activity of students, without a formal educational context. With increasing barriers to patient and operating room access, such as shorter work weeks for residents and operating room and endoscopy time at a premium, alternative strategies for maximizing procedural skill development are being considered. Recently, the traditional surgical apprenticeship model has been challenged, with greater emphasis on the need for surgical and procedural skills training to be more transparent and for alternatives to patient-based training to be considered. Colonoscopy performance is a complex psychomotor skill requiring practitioners to integrate multiple sensory inputs, and it involves higher cortical centres for optimal performance. Colonoscopy skills involve mastery in the cognitive, technical and process domains. In the present review, we propose a model for teaching colonoscopy to the novice trainee based on educational theory.

  18. State Token Petri Net modeling method for formal verification of computerized procedure including operator's interruptions of procedure execution flow

    International Nuclear Information System (INIS)

    Kim, Yun Goo; Seong, Poong Hyun

    2012-01-01

    The Computerized Procedure System (CPS) is one of the primary operating support systems in the digital main control room. The CPS displays procedures on the computer screen in the form of a flow chart, and displays plant operating information along with the procedure instructions. It also supports operator decision making by providing a system decision. A procedure flow should be correct and reliable, as an error could lead to operator misjudgement and inadequate control. In this paper we present a modeling method for the CPS that enables formal verification based on Petri nets. The proposed State Token Petri Nets (STPN) also support modeling of a procedure flow that has various interruptions by the operator, according to the plant condition. STPN modeling is compared with Coloured Petri Nets when both are applied to an emergency operating computerized procedure. A converting program from a Computerized Procedure (CP) to STPN has also been developed. The formal verification and validation of CPs with STPN increases the safety of a nuclear power plant and provides the digital quality assurance means that are needed as the role and function of the CPS grow.
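
    A place/transition net captures the essence of the approach: tokens mark the current procedure step, and firing a transition moves execution forward or, for an operator interruption, back. A minimal toy sketch with ordinary Petri net semantics, not the paper's STPN extension:

```python
class PetriNet:
    """Minimal place/transition net with an integer token marking."""

    def __init__(self, marking):
        self.marking = dict(marking)  # place name -> token count
        self.transitions = {}         # transition name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (list(inputs), list(outputs))

    def enabled(self, name):
        # a transition is enabled when every input place holds a token
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        # consume one token from each input place, produce one in each output
        if not self.enabled(name):
            raise ValueError("transition %r not enabled" % name)
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1
```

    A two-step procedure flow can then be modeled with a "proceed" transition from step 1 to step 2 and an "interrupt" transition that returns the token to step 1, the kind of operator-driven jump in execution flow the abstract refers to; reachability of markings is what a formal verification would then check.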

  19. Minnesota's forest statistics, 1987: an inventory update.

    Science.gov (United States)

    Jerold T. Hahn; W. Brad Smith

    1987-01-01

    The Minnesota 1987 inventory update, derived by using tree growth models, reports 13.5 million acres of timberland, a decline of less than 1% since 1977. This bulletin presents findings from the inventory update in tables detailing timberland area, volume, and biomass.

  20. Wisconsin's forest statistics, 1987: an inventory update.

    Science.gov (United States)

    W. Brad Smith; Jerold T. Hahn

    1989-01-01

    The Wisconsin 1987 inventory update, derived by using tree growth models, reports 14.7 million acres of timberland, a decline of less than 1% since 1983. This bulletin presents findings from the inventory update in tables detailing timberland area, volume, and biomass.

  1. [Complementary and alternative procedures for fibromyalgia syndrome : Updated guidelines 2017 and overview of systematic review articles].

    Science.gov (United States)

    Langhorst, J; Heldmann, P; Henningsen, P; Kopke, K; Krumbein, L; Lucius, H; Winkelmann, A; Wolf, B; Häuser, W

    2017-06-01

    The regular update of the guidelines on fibromyalgia syndrome, AWMF number 145/004, was scheduled for April 2017. The guidelines were developed by 13 scientific societies and 2 patient self-help organizations, coordinated by the German Pain Society. Working groups (n = 8) with a total of 42 members were formed, balanced with respect to gender, medical expertise, position in the medical or scientific hierarchy and potential conflicts of interest. A search of the literature for systematic reviews of randomized controlled trials of complementary and alternative therapies from December 2010 to May 2016 was performed in the Cochrane Library, MEDLINE, PsycINFO and Scopus databases. Levels of evidence were assigned according to the classification system of the Oxford Centre for Evidence-Based Medicine, version 2009. The strength of each recommendation was determined by a multi-step formalized consensus procedure. Efficacy, risks, patient preferences and applicability of available therapies were weighed against each other. The guidelines were reviewed and approved by the boards of directors of the societies engaged in their development. Meditative movement therapies (e.g. qigong, tai chi and yoga) are strongly recommended. Acupuncture and weight reduction in cases of obesity can be considered.

  2. Cuba "updates" its economic model: perspectives for cooperation with the European Union

    OpenAIRE

    Schmieg, Evita

    2017-01-01

    Following the thawing of relations with the United States under Obama, Cuba is now seeking closer integration into the global economy through a programme of "guidelines" for updating the country’s economic model adopted in 2011. The central goals are increasing exports, substituting imports and encouraging foreign direct investment in order to improve the country’s hard currency situation, increase domestic value creation and reduce dependency on Venezuela. The guidelines also expand the spac...

  3. Energy Economic Data Base (EEDB) Program: Phase VI update (1983) report

    International Nuclear Information System (INIS)

    1984-09-01

    This update of the Energy Economic Data Base is the latest in a series of technical and cost studies prepared by United Engineers and Constructors Inc., during the last 18 years. The data base was developed during 1978 and has been updated annually since then. The purpose of the updates has been to reflect the impact of changing regulations and technology on the costs of electric power generating stations. This Phase VI (Sixth) Update report documents the results of the 1983 EEDB Program update effort. The latest effort was a comprehensive update of the technical and capital cost information for the pressurized water reactor, boiling water reactor, and liquid metal fast breeder reactor nuclear power plant data models and for the 800 MWe and 500 MWe high sulfur coal-fired power plant data models. The update provided representative costs for these nuclear and coal-fired power plants for the 1980's. In addition, the updated nuclear power plant data models for the 1980's were modified to provide anticipated costs for nuclear power plants for the 1990's. Consequently, the Phase VI Update has continued to provide important benchmark information through which technical and capital cost trends may be identified that have occurred since January 1, 1978

  4. On the general procedure for modelling complex ecological systems

    International Nuclear Information System (INIS)

    He Shanyu.

    1987-12-01

    In this paper, the principle of a general procedure for modelling complex ecological systems, i.e. the Adaptive Superposition Procedure (ASP), is briefly stated. The result of applying ASP in a national project for ecological regionalization is also described. (author). 3 refs

  5. On Realism of Architectural Procedural Models

    Czech Academy of Sciences Publication Activity Database

    Beneš, J.; Kelly, T.; Děchtěrenko, Filip; Křivánek, J.; Müller, P.

    2017-01-01

    Roč. 36, č. 2 (2017), s. 225-234 ISSN 0167-7055 Grant - others:AV ČR(CZ) StrategieAV21/14 Program:StrategieAV Institutional support: RVO:68081740 Keywords : realism * procedural modeling * architecture Subject RIV: IN - Informatics, Computer Science OBOR OECD: Cognitive sciences Impact factor: 1.611, year: 2016

  6. Combining A Priori Knowledge and Sensor Information for Updating the Global Position of an Autonomous Vehicle

    NARCIS (Netherlands)

    Zivkovic, Z.; Schoute, Albert L.; van der Heijden, Ferdinand; van Amerongen, J.; Jonker, B.; Regtien, P.P.L; Stramigioli, S.

    The problem of updating the global position of an autonomous vehicle is considered. An iterative procedure is proposed to fit a map to a set of noisy measurements. The procedure is inspired by a non-parametric procedure for probability density function mode searching. We show how this could be used

  7. Delayed Slater determinant update algorithms for high efficiency quantum Monte Carlo

    Science.gov (United States)

    McDaniel, T.; D'Azevedo, E. F.; Li, Y. W.; Wong, K.; Kent, P. R. C.

    2017-11-01

    Within ab initio Quantum Monte Carlo simulations, the leading numerical cost for large systems is the computation of the values of the Slater determinants in the trial wavefunction. Each Monte Carlo step requires finding the determinant of a dense matrix. This is most commonly iteratively evaluated using a rank-1 Sherman-Morrison updating scheme to avoid repeated explicit calculation of the inverse. The overall computational cost is, therefore, formally cubic in the number of electrons or matrix size. To improve the numerical efficiency of this procedure, we propose a novel multiple rank delayed update scheme. This strategy enables probability evaluation with an application of accepted moves to the matrices delayed until after a predetermined number of moves, K. The accepted events are then applied to the matrices en bloc with enhanced arithmetic intensity and computational efficiency via matrix-matrix operations instead of matrix-vector operations. This procedure does not change the underlying Monte Carlo sampling or its statistical efficiency. For calculations on large systems and algorithms such as diffusion Monte Carlo, where the acceptance ratio is high, order of magnitude improvements in the update time can be obtained on both multi-core central processing units and graphical processing units.
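    The rank-1 Sherman-Morrison scheme the abstract contrasts with its delayed rank-K update can be sketched as follows. This is an illustrative NumPy example, not the authors' implementation; the matrix size, the moved row, and all variable names are invented for the demonstration:

```python
import numpy as np

# Sketch of the rank-1 Sherman-Morrison update described above: when one
# electron moves, a single row of the Slater matrix A changes, and its
# inverse can be updated in O(n^2) instead of recomputed in O(n^3).
rng = np.random.default_rng(0)
n = 6
A = rng.standard_normal((n, n))
Ainv = np.linalg.inv(A)

k = 2                         # row (electron) being moved
r = rng.standard_normal(n)    # proposed new row of orbital values

# Acceptance ratio det(A')/det(A) collapses to a single dot product.
ratio = r @ Ainv[:, k]

# Sherman-Morrison: with A' = A + e_k v^T and v = r - A[k],
#   A'^{-1} = A^{-1} - (A^{-1} e_k)(v^T A^{-1}) / (1 + v^T A^{-1} e_k)
v = r - A[k]
Ainv_new = Ainv - np.outer(Ainv[:, k], v @ Ainv) / (1.0 + v @ Ainv[:, k])

A_new = A.copy()
A_new[k] = r
assert np.allclose(Ainv_new, np.linalg.inv(A_new))
assert np.isclose(ratio, np.linalg.det(A_new) / np.linalg.det(A))
```

    The delayed scheme in the paper accumulates K such accepted moves and applies them to the inverse en bloc with matrix-matrix products, which is what raises the arithmetic intensity relative to the matrix-vector operations above.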

  8. Office Operative Hysteroscopy: An Update.

    Science.gov (United States)

    Salazar, Christina Alicia; Isaacson, Keith B

    2018-02-01

    Hysteroscopy is considered the gold standard for the evaluation of intracavitary pathology in both premenopausal and postmenopausal patients associated with abnormal uterine bleeding, as well as for the evaluation of infertile patients with suspected cavity abnormalities. Office-based operative hysteroscopy allows patients to resume activities immediately and successfully integrates clinical practice into a "see and treat" modality, avoiding the added risks of anesthesia and the inconvenience of the operating room. For 2017, the Centers for Medicare and Medicaid Services has provided a substantial increase in reimbursement for a select number of office-based hysteroscopic procedures. This review provides an update on the indications, equipment, and procedures for office hysteroscopy, as well as the management of complications that may arise within an office-based practice.

  9. Computer-Based Procedures for Field Workers in Nuclear Power Plants: Development of a Model of Procedure Usage and Identification of Requirements

    Energy Technology Data Exchange (ETDEWEB)

    Katya Le Blanc; Johanna Oxstrand

    2012-04-01

    The nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however, most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned from these previous efforts, we are now exploring a less familiar application for computer-based procedures: field procedures, i.e. procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field workers. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how to best design the computer-based procedures to do so. This paper describes the development of a Model of Procedure Use and the qualitative study on which the model is based. The study was conducted in collaboration with four nuclear utilities and five research institutes. During the qualitative study and the model development, requirements for computer-based procedures were identified.

  10. Operating procedures for the Pajarito Site Critical Assembly Facility

    International Nuclear Information System (INIS)

    Malenfant, R.E.

    1983-03-01

    Operating procedures consistent with DOE Order 5480.2, Chapter VI, and the American National Standard Safety Guide for the Performance of Critical Experiments are defined for the Pajarito Site Critical Assembly Facility of the Los Alamos National Laboratory. These operating procedures supersede and update those previously published in 1973 and apply to any criticality experiment performed at the facility

  11. Communication and Procedural Models of the E-Commerce Systems

    Directory of Open Access Journals (Sweden)

    Petr SUCHÁNEK

    2009-06-01

    Full Text Available E-commerce systems have become a standard interface between sellers (or suppliers) and customers. A basic condition for an e-commerce system to be efficient is the correct definition and description of all internal and external processes, all of which are targeted at customers' needs and requirements. Modeling and simulation are the optimal and most exact way to find the best structure for an e-commerce system and its processes within a company. In this article the author presents a basic model of communication between customers and sellers, in connection with customer feedback, together with procedural models of e-commerce systems in terms of e-shops. The procedural model was built with the aid of an SOA definition.

  12. Standard interface files and procedures for reactor physics codes. Version IV

    International Nuclear Information System (INIS)

    O'Dell, R.D.

    1977-09-01

    Standards, procedures, and recommendations of the Committee on Computer Code Coordination for promoting the exchange of reactor physics codes are updated to Version IV status. Standards and procedures covering general programming, program structure, standard interface files, and file management and handling subroutines are included

  13. Updated thermal model using simplified short-wave radiosity calculations

    International Nuclear Information System (INIS)

    Smith, J.A.; Goltz, S.M.

    1994-01-01

    An extension to a forest canopy thermal radiance model is described that computes the short-wave energy flux absorbed within the canopy by solving simplified radiosity equations describing flux transfers between canopy ensemble classes partitioned by vegetation layer and leaf slope. Integrated short-wave reflectance and transmittance-factors obtained from measured leaf optical properties were found to be nearly equal for the canopy studied. Short-wave view factor matrices were approximated by combining the average leaf scattering coefficient with the long-wave view factor matrices already incorporated in the model. Both the updated and original models were evaluated for a dense spruce fir forest study site in Central Maine. Canopy short-wave absorption coefficients estimated from detailed Monte Carlo ray tracing calculations were 0.60, 0.04, and 0.03 for the top, middle, and lower canopy layers corresponding to leaf area indices of 4.0, 1.05, and 0.25. The simplified radiosity technique yielded analogous absorption values of 0.55, 0.03, and 0.01. The resulting root mean square error in modeled versus measured canopy temperatures for all layers was less than 1°C with either technique. Maximum error in predicted temperature using the simplified radiosity technique was approximately 2°C during peak solar heating. (author)

  14. Updated thermal model using simplified short-wave radiosity calculations

    Energy Technology Data Exchange (ETDEWEB)

    Smith, J. A.; Goltz, S. M.

    1994-02-15

    An extension to a forest canopy thermal radiance model is described that computes the short-wave energy flux absorbed within the canopy by solving simplified radiosity equations describing flux transfers between canopy ensemble classes partitioned by vegetation layer and leaf slope. Integrated short-wave reflectance and transmittance-factors obtained from measured leaf optical properties were found to be nearly equal for the canopy studied. Short-wave view factor matrices were approximated by combining the average leaf scattering coefficient with the long-wave view factor matrices already incorporated in the model. Both the updated and original models were evaluated for a dense spruce fir forest study site in Central Maine. Canopy short-wave absorption coefficients estimated from detailed Monte Carlo ray tracing calculations were 0.60, 0.04, and 0.03 for the top, middle, and lower canopy layers corresponding to leaf area indices of 4.0, 1.05, and 0.25. The simplified radiosity technique yielded analogous absorption values of 0.55, 0.03, and 0.01. The resulting root mean square error in modeled versus measured canopy temperatures for all layers was less than 1°C with either technique. Maximum error in predicted temperature using the simplified radiosity technique was approximately 2°C during peak solar heating. (author)

  15. Prediction error, ketamine and psychosis: An updated model.

    Science.gov (United States)

    Corlett, Philip R; Honey, Garry D; Fletcher, Paul C

    2016-11-01

    In 2007, we proposed an explanation of delusion formation as aberrant prediction error-driven associative learning. Further, we argued that the NMDA receptor antagonist ketamine provided a good model for this process. Subsequently, we validated the model in patients with psychosis, relating aberrant prediction error signals to delusion severity. During the ensuing period, we have developed these ideas, drawing on the simple principle that brains build a model of the world and refine it by minimising prediction errors, as well as using it to guide perceptual inferences. While previously we focused on the prediction error signal per se, an updated view takes into account its precision, as well as the precision of prior expectations. With this expanded perspective, we see several possible routes to psychotic symptoms - which may explain the heterogeneity of psychotic illness, as well as the fact that other drugs, with different pharmacological actions, can produce psychotomimetic effects. In this article, we review the basic principles of this model and highlight specific ways in which prediction errors can be perturbed, in particular considering the reliability and uncertainty of predictions. The expanded model explains hallucinations as perturbations of the uncertainty mediated balance between expectation and prediction error. Here, expectations dominate and create perceptions by suppressing or ignoring actual inputs. Negative symptoms may arise due to poor reliability of predictions in service of action. By mapping from biology to belief and perception, the account proffers new explanations of psychosis. However, challenges remain. We attempt to address some of these concerns and suggest future directions, incorporating other symptoms into the model, building towards better understanding of psychosis.

  16. 40 CFR 1065.202 - Data updating, recording, and control.

    Science.gov (United States)

    2010-07-01

    Excerpt from the regulation's table of minimum frequencies: for speed and torque during an engine step-map (§ 1065.510), the minimum command and control frequency is 1 Hz and the minimum recording frequency is 1 mean value per step; for speed and torque during an engine sweep-map (§ 1065.510), the minimums are 5 Hz and 1 Hz, respectively. [40 CFR Part 1065, Engine-Testing Procedures, Measurement Instruments, § 1065.202 Data updating, recording, and control.]

  17. Inhibition, Updating Working Memory, and Shifting Predict Reading Disability Symptoms in a Hybrid Model: Project KIDS.

    Science.gov (United States)

    Daucourt, Mia C; Schatschneider, Christopher; Connor, Carol M; Al Otaiba, Stephanie; Hart, Sara A

    2018-01-01

    Recent achievement research suggests that executive function (EF), a set of regulatory processes that control both thought and action necessary for goal-directed behavior, is related to typical and atypical reading performance. This project examines the relation of EF, as measured by its components, Inhibition, Updating Working Memory, and Shifting, with a hybrid model of reading disability (RD). Our sample included 420 children who participated in a broader intervention project when they were in KG-third grade (age M = 6.63 years, SD = 1.04 years, range = 4.79-10.40 years). At the time their EF was assessed, using a parent-report Behavior Rating Inventory of Executive Function (BRIEF), they had a mean age of 13.21 years (SD = 1.54 years; range = 10.47-16.63 years). The hybrid model of RD was operationalized as a composite consisting of four symptoms, and set so that any child could have any one, any two, any three, any four, or none of the symptoms included in the hybrid model. The four symptoms include low word reading achievement, unexpected low word reading achievement, poorer reading comprehension compared to listening comprehension, and dual-discrepancy response-to-intervention, requiring both low achievement and low growth in word reading. The results of our multilevel ordinal logistic regression analyses showed a significant relation between all three components of EF (Inhibition, Updating Working Memory, and Shifting) and the hybrid model of RD, and that the strength of EF's predictive power for RD classification was the highest when RD was modeled as having at least one or more symptoms. Importantly, the chances of being classified as having RD increased as EF performance worsened and decreased as EF performance improved. The question of whether any one EF component would emerge as a superior predictor was also examined and results showed that Inhibition, Updating Working Memory, and Shifting were equally valuable as predictors of the hybrid model of RD.

  18. Averaging models: parameters estimation with the R-Average procedure

    Directory of Open Access Journals (Sweden)

    S. Noventa

    2010-01-01

    Full Text Available The Functional Measurement approach, proposed within the theoretical framework of Information Integration Theory (Anderson, 1981, 1982), can be a useful multi-attribute analysis tool. Compared to the majority of statistical models, the averaging model can account for interaction effects without adding complexity. The R-Average method (Vidotto & Vicentini, 2007) can be used to estimate the parameters of these models. By the use of multiple information criteria in the model selection procedure, R-Average allows for the identification of the best subset of parameters that account for the data. After a review of the general method, we present an implementation of the procedure in the framework of R-project, followed by some experiments using a Monte Carlo method.
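    The averaging rule underlying these models is easy to state concretely. The sketch below is not the R-Average code (which is an R package for parameter estimation); it is a minimal Python illustration of Anderson's averaging equation, with the function name and all values invented for the example:

```python
import numpy as np

# Minimal sketch of Anderson's averaging model: the integrated response to
# attributes with scale values s_i and weights w_i is a weighted mean that
# also includes an initial state (s0, w0). Names and values are illustrative.
def averaging_response(s, w, s0=0.0, w0=1.0):
    s = np.asarray(s, dtype=float)
    w = np.asarray(w, dtype=float)
    return (w0 * s0 + np.sum(w * s)) / (w0 + np.sum(w))

# Why averaging can mimic an interaction without extra parameters:
# adding a mildly positive attribute *lowers* the overall judgment.
high_only = averaging_response([8.0], [1.0])            # (0 + 8) / 2 = 4.0
with_mild = averaging_response([8.0, 2.0], [1.0, 1.0])  # (0 + 10) / 3 < 4.0
```

    Estimating the weights and scale values from judgment data, and selecting the best subset of parameters via information criteria, is the part that R-Average automates.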

  19. Disruption of the Right Temporoparietal Junction Impairs Probabilistic Belief Updating.

    Science.gov (United States)

    Mengotti, Paola; Dombert, Pascasie L; Fink, Gereon R; Vossel, Simone

    2017-05-31

    Generating and updating probabilistic models of the environment is a fundamental modus operandi of the human brain. Although crucial for various cognitive functions, the neural mechanisms of these inference processes remain to be elucidated. Here, we show the causal involvement of the right temporoparietal junction (rTPJ) in updating probabilistic beliefs and we provide new insights into the chronometry of the process by combining online transcranial magnetic stimulation (TMS) with computational modeling of behavioral responses. Female and male participants performed a modified location-cueing paradigm, where false information about the percentage of cue validity (%CV) was provided in half of the experimental blocks to prompt updating of prior expectations. Online double-pulse TMS over rTPJ 300 ms (but not 50 ms) after target appearance selectively decreased participants' updating of false prior beliefs concerning %CV, reflected in a decreased learning rate of a Rescorla-Wagner model. Online TMS over rTPJ also impacted on participants' explicit beliefs, causing them to overestimate %CV. These results confirm the involvement of rTPJ in updating of probabilistic beliefs, thereby advancing our understanding of this area's function during cognitive processing. SIGNIFICANCE STATEMENT Contemporary views propose that the brain maintains probabilistic models of the world to minimize surprise about sensory inputs. Here, we provide evidence that the right temporoparietal junction (rTPJ) is causally involved in this process. Because neuroimaging has suggested that rTPJ is implicated in divergent cognitive domains, the demonstration of an involvement in updating internal models provides a novel unifying explanation for these findings. We used computational modeling to characterize how participants change their beliefs after new observations. 
By interfering with rTPJ activity through online transcranial magnetic stimulation, we showed that participants were less able to update
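    The learning rate referred to here is the alpha of the standard Rescorla-Wagner delta rule. A minimal sketch of that rule follows; the trial sequence and alpha values are illustrative, not the study's fitted parameters:

```python
# Sketch of the Rescorla-Wagner delta rule used to model updating of the
# believed cue validity (%CV). v is the current estimate; each trial outcome
# o is 1 (valid cue) or 0 (invalid); alpha is the learning rate the TMS
# manipulation reduced. Trial counts and alpha values are illustrative.
def rw_track(outcomes, alpha, v0=0.5):
    v = v0
    trace = []
    for o in outcomes:
        v = v + alpha * (o - v)   # move toward the outcome by a fraction alpha
        trace.append(v)
    return trace

# Ten consecutive valid trials: a smaller learning rate (as after TMS over
# rTPJ) leaves the belief about cue validity further from the true value of 1.
fast = rw_track([1] * 10, alpha=0.4)
slow = rw_track([1] * 10, alpha=0.1)
assert fast[-1] > slow[-1] > 0.5
```

    With all-valid trials the update has the closed form v_k = 1 - (1 - v0)(1 - alpha)^k, which makes the slower convergence under a reduced alpha explicit.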

  20. MO-DE-304-00: Workforce Assessment Committee Update

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2015-06-15

    The Abt study of medical physicist work values for radiation oncology physics services, Round IV, is completed. It supersedes the Abt III study of 2008. The 2015 Abt study measured qualified medical physicist (QMP) work associated with routine radiation oncology procedures as well as some special procedures. As before, a work model was created to allow the medical physicist to defend QMP work based on both routine and special procedures service mix. The work model can be used to develop a cost justification report for setting charges for radiation oncology physics services. The Abt study Round IV was designed to empower the medical physicist to negotiate a service or employment contract with providers based on measured national QMP workforce and staffing data. For a variety of reasons, the diagnostic imaging contingent of AAPM has had a more difficult time trying to estimate workforce requirements than their therapy counterparts. Over the past several years, the Diagnostic Work and Workforce Study Subcommittee (DWWSS) has collected survey data from AAPM members, but the data have been very difficult to interpret. The DWWSS has reached out to include more AAPM volunteers to create a fuller and more accurate representation of actual clinical practice models on the subcommittee. Though much work remains, through hours of discussion and brainstorming, the DWWSS has somewhat of a clear path forward. This talk will provide attendees with an update on the efforts of the subcommittee. Learning Objectives: Understand the new information documented in the Abt studies. Understand how to use the Abt studies to justify medical physicist staffing. Learn relevant historical information on imaging physicist workforce. Understand the process of the DWWSS in 2014. Understand the intended path forward for the DWWSS.

  1. MO-DE-304-00: Workforce Assessment Committee Update

    International Nuclear Information System (INIS)

    2015-01-01

    The Abt study of medical physicist work values for radiation oncology physics services, Round IV, is completed. It supersedes the Abt III study of 2008. The 2015 Abt study measured qualified medical physicist (QMP) work associated with routine radiation oncology procedures as well as some special procedures. As before, a work model was created to allow the medical physicist to defend QMP work based on both routine and special procedures service mix. The work model can be used to develop a cost justification report for setting charges for radiation oncology physics services. The Abt study Round IV was designed to empower the medical physicist to negotiate a service or employment contract with providers based on measured national QMP workforce and staffing data. For a variety of reasons, the diagnostic imaging contingent of AAPM has had a more difficult time trying to estimate workforce requirements than their therapy counterparts. Over the past several years, the Diagnostic Work and Workforce Study Subcommittee (DWWSS) has collected survey data from AAPM members, but the data have been very difficult to interpret. The DWWSS has reached out to include more AAPM volunteers to create a fuller and more accurate representation of actual clinical practice models on the subcommittee. Though much work remains, through hours of discussion and brainstorming, the DWWSS has somewhat of a clear path forward. This talk will provide attendees with an update on the efforts of the subcommittee. Learning Objectives: Understand the new information documented in the Abt studies. Understand how to use the Abt studies to justify medical physicist staffing. Learn relevant historical information on imaging physicist workforce. Understand the process of the DWWSS in 2014. Understand the intended path forward for the DWWSS

  2. Summary of Expansions, Updates, and Results in GREET® 2016 Suite of Models

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2016-10-01

    This report documents the technical content of the expansions and updates in Argonne National Laboratory’s GREET® 2016 release and provides references and links to key documents related to these expansions and updates.

  3. A Review of the Updated Pharmacophore for the Alpha 5 GABA(A) Benzodiazepine Receptor Model

    Directory of Open Access Journals (Sweden)

    Terry Clayton

    2015-01-01

    Full Text Available An updated model of the GABA(A) benzodiazepine receptor pharmacophore of the α5-BzR/GABA(A) subtype has been constructed, prompted by the synthesis of subtype-selective ligands in light of recent developments in ligand synthesis, behavioral studies, and molecular modeling studies of the binding site itself. A number of BzR/GABA(A) α5-subtype-selective compounds were synthesized, notably the α5-subtype-selective inverse agonist PWZ-029 (1), which is active in enhancing cognition in both rodents and primates. In addition, a chiral positive allosteric modulator (PAM), SH-053-2′F-R-CH3 (2), has been shown to reverse the deleterious effects in the MAM model of schizophrenia as well as to alleviate constriction in airway smooth muscle. Presented here is an updated model of the pharmacophore for α5β2γ2 Bz/GABA(A) receptors, including a rendering of PWZ-029 docked within the α5 binding pocket showing specific interactions of the molecule with the receptor. Differences in the included volume as compared to α1β2γ2, α2β2γ2, and α3β2γ2 will be illustrated for clarity. These new models enhance the ability to understand structural characteristics of ligands which act as agonists, antagonists, or inverse agonists at the Bz BS of GABA(A) receptors.

  4. Update on dexmedetomidine: use in nonintubated patients requiring sedation for surgical procedures

    Directory of Open Access Journals (Sweden)

    Mohanad Shukry

    2010-03-01

    Full Text Available Mohanad Shukry, Jeffrey A Miller, University of Oklahoma Health Sciences Center, Department of Anesthesiology, Children’s Hospital of Oklahoma, Oklahoma City, OK, USA. Abstract: Dexmedetomidine was introduced two decades ago as a sedative and supplement to sedation in the intensive care unit for patients whose trachea was intubated. However, since that time dexmedetomidine has been commonly used as a sedative and hypnotic for patients undergoing procedures without the need for tracheal intubation. This review focuses on the application of dexmedetomidine as a sedative and/or total anesthetic in patients undergoing procedures without the need for tracheal intubation. Dexmedetomidine was used for sedation in monitored anesthesia care (MAC), airway procedures including fiberoptic bronchoscopy, dental procedures, ophthalmological procedures, head and neck procedures, neurosurgery, and vascular surgery. Additionally, dexmedetomidine was used for the sedation of pediatric patients undergoing different types of procedures such as cardiac catheterization and magnetic resonance imaging. Dexmedetomidine loading dose ranged from 0.5 to 5 μg kg-1, and infusion dose ranged from 0.2 to 10 μg kg-1 h-1. Dexmedetomidine was administered in conjunction with local anesthesia and/or other sedatives. Ketamine was administered with dexmedetomidine and opposed its bradycardic effects. Dexmedetomidine may be useful in patients needing sedation without tracheal intubation. The literature suggests potential use of dexmedetomidine solely or as an adjunctive agent to other sedation agents. Dexmedetomidine was especially useful when spontaneous breathing was essential, such as in procedures on the airway, or when sudden awakening from sedation was required, such as for cooperative clinical examination during craniotomies. Keywords: dexmedetomidine, sedation, nonintubated patients

  5. Effect of asynchronous updating on the stability of cellular automata

    International Nuclear Information System (INIS)

    Baetens, J.M.; Van der Weeën, P.; De Baets, B.

    2012-01-01

    Highlights: ► An upper bound on the Lyapunov exponent of asynchronously updated CA is established. ► The employed update method has repercussions on the stability of CAs. ► A decision on the employed update method should be taken with care. ► Substantial discrepancies arise between synchronously and asynchronously updated CA. ► Discrepancies between different asynchronous update schemes are less pronounced. - Abstract: Although cellular automata (CAs) were conceptualized as utterly discrete mathematical models in which the states of all their spatial entities are updated simultaneously at every consecutive time step, i.e. synchronously, various CA-based models that rely on so-called asynchronous update methods have been constructed in order to overcome the limitations that are tied up with the classical way of evolving CAs. So far, only a few researchers have addressed the consequences of this way of updating on the evolved spatio-temporal patterns, and the reachable stationary states. In this paper, we exploit Lyapunov exponents to determine to what extent the stability of the rules within a family of totalistic CAs is affected by the underlying update method. For that purpose, we derive an upper bound on the maximum Lyapunov exponent of asynchronously iterated CAs, and show its validity, after which we present a comparative study between the Lyapunov exponents obtained for five different update methods, namely one synchronous method and four well-established asynchronous methods. It is found that the stability of CAs is seriously affected if one of the latter methods is employed, whereas the discrepancies arising between the different asynchronous methods are far less pronounced. Finally, we discuss the repercussions of our findings on the development of CA-based models.
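    A toy illustration of why the update method matters: under synchronous updating every cell sees the previous lattice, whereas an asynchronous sweep lets later cells see already-updated neighbours. The rule (elementary rule 90), lattice size, and the single random-order asynchronous scheme below are illustrative choices; the paper studies totalistic CAs under four established asynchronous methods:

```python
import random

# Synchronous vs. random-order asynchronous updating of an elementary CA
# (rule 90: new cell = left XOR right) on a ring of cells.
def step_sync(state, rule):
    n = len(state)
    # Every cell is computed from the *old* lattice (state[-1] wraps the ring).
    return [rule(state[i - 1], state[i], state[(i + 1) % n]) for i in range(n)]

def step_async(state, rule, rng):
    # Cells are visited one at a time in a random order; each update sees the
    # partially updated lattice, unlike the synchronous step.
    n = len(state)
    s = list(state)
    order = list(range(n))
    rng.shuffle(order)
    for i in order:
        s[i] = rule(s[i - 1], s[i], s[(i + 1) % n])
    return s

rule90 = lambda left, centre, right: left ^ right

seed = [0] * 8
seed[4] = 1
sync_next = step_sync(seed, rule90)   # deterministic: [0, 0, 0, 1, 0, 1, 0, 0]
async_next = step_async(seed, rule90, random.Random(0))
```

    Iterating both maps from the same seed and measuring how perturbations grow is, in spirit, what the paper's Lyapunov-exponent comparison does at scale.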

  6. Summary Analysis: Hanford Site Composite Analysis Update

    Energy Technology Data Exchange (ETDEWEB)

    Nichols, W. E. [CH2M HILL Plateau Remediation Company, Richland, WA (United States); Lehman, L. L. [CH2M HILL Plateau Remediation Company, Richland, WA (United States)

    2017-06-05

    The Hanford Site’s currently maintained Composite Analysis, originally completed in 1998, requires an update. A previous update effort was undertaken by the U.S. Department of Energy (DOE) in 2001-2005, but was ended before completion to allow the Tank Closure & Waste Management Environmental Impact Statement (TC&WM EIS) (DOE/EIS-0391) to be prepared without potential for conflicting sitewide models. This EIS was issued in 2012, and the deferral was ended with guidance in memorandum “Modeling to Support Regulatory Decision Making at Hanford” (Williams, 2012) provided with the aim of ensuring subsequent modeling is consistent with the EIS.

  7. A model to determine payments associated with radiology procedures.

    Science.gov (United States)

    Mabotuwana, Thusitha; Hall, Christopher S; Thomas, Shiby; Wald, Christoph

    2017-12-01

    Across the United States, there is a growing number of patients in Accountable Care Organizations and under risk contracts with commercial insurance. This is due to the proliferation of new value-based payment models and care delivery reform efforts. In this context, the business model of radiology within a hospital or health system is shifting from a primary profit-center to a cost-center with a goal of cost savings. Radiology departments increasingly need to understand how the transactional nature of the business relates to financial rewards. The main challenge with current reporting systems is that the information is presented only at an aggregated level, and often not broken down further, for instance, by type of exam. As such, the primary objective of this research is to provide better visibility into payments associated with individual radiology procedures in order to better calibrate the expense/capital structure of the imaging enterprise to the actual revenue or value-add to the organization it belongs to. We propose a methodology that can be used to determine technical payments at a procedure level. We use a proportion-based model to allocate payments to individual radiology procedures based on total charges (which also include non-radiology related charges). Using a production dataset containing 424,250 radiology exams, we calculated the overall average technical charge for Radiology to be $873.08 per procedure and the corresponding average payment to be $326.43 (range: $48.27 for XR to $2750.11 for PET/CT), resulting in an average payment percentage of 37.39% across all exams. We describe how charges associated with a procedure can be used to approximate technical payments at a more granular level with a focus on Radiology. The methodology is generalizable to approximate payment for other services as well. Understanding payments associated with each procedure can be useful during strategic practice planning. Charge-to-total charge ratio can be used to
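
    The proportion-based allocation described above can be sketched as follows; the procedure names and dollar amounts are invented for illustration:

    ```python
    def allocate_payment(encounter_payment, charges):
        """Allocate a lump-sum payment to individual procedures in proportion
        to each procedure's share of the total billed charges (the charge
        dictionary may include non-radiology charge lines)."""
        total = sum(charges.values())
        if total <= 0:
            raise ValueError("total charges must be positive")
        return {proc: encounter_payment * c / total for proc, c in charges.items()}

    # Hypothetical encounter: $1,000 paid against $2,500 of total charges.
    charges = {"CT head w/o contrast": 1500.0, "Chest XR": 250.0, "Lab panel": 750.0}
    allocation = allocate_payment(1000.0, charges)
    ```

    The radiology share of the payment is then the sum of the allocated amounts over the radiology procedure keys, which is exactly the charge-to-total-charge ratio applied to the payment.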

  8. An update of the classical Bokhman’s dualistic model of endometrial cancer

    Directory of Open Access Journals (Sweden)

    Miłosz Wilczyński

    2016-07-01

    According to the classical dualistic model introduced by Bokhman in 1983, endometrial cancer (EC) is divided into two basic types. The prototypical histological types for type I and type II EC are endometrioid carcinoma and serous carcinoma, respectively. The traditional classification is based on clinical, endocrine and histopathological features; however, it sometimes does not reflect the full heterogeneity of EC. New molecular evidence, supported by the clinical diversity of the cancer, indicates that the classical dualistic model is valid only to some extent. The review updates the mutational diversity of EC, introducing a new molecular classification of the tumour with regard to data presented by The Cancer Genome Atlas Research Network (TCGA).

  9. Repeat-swap homology modeling of secondary active transporters: updated protocol and prediction of elevator-type mechanisms.

    Science.gov (United States)

    Vergara-Jaque, Ariela; Fenollar-Ferrer, Cristina; Kaufmann, Desirée; Forrest, Lucy R

    2015-01-01

    Secondary active transporters are critical for neurotransmitter clearance and recycling during synaptic transmission and uptake of nutrients. These proteins mediate the movement of solutes against their concentration gradients by using the energy released in the movement of ions down pre-existing concentration gradients. To achieve this, transporters conform to the so-called alternating-access hypothesis, whereby the protein adopts at least two conformations in which the substrate binding sites are exposed to one or the other side of the membrane, but not both simultaneously. Structures of a bacterial homolog of neuronal glutamate transporters, GltPh, in several different conformational states have revealed that the protein structure is asymmetric in the outward- and inward-open states, and that the conformational change connecting them involves an elevator-like movement of a substrate binding domain across the membrane. The structural asymmetry is created by inverted-topology repeats, i.e., structural repeats with similar overall folds whose transmembrane topologies are related to each other by two-fold pseudo-symmetry around an axis parallel to the membrane plane. Inverted repeats have been found in around three-quarters of secondary transporter folds. Moreover, the (a)symmetry of these systems has been successfully used as a bioinformatic tool, called "repeat-swap modeling", to predict structural models of a transporter in one conformation using the known structure of the transporter in the complementary conformation as a template. Here, we describe an updated repeat-swap homology modeling protocol, and calibrate the accuracy of the method using GltPh, for which both inward- and outward-facing conformations are known. We then apply this repeat-swap homology modeling procedure to a concentrative nucleoside transporter, VcCNT, which has a three-dimensional arrangement related to that of GltPh. The repeat-swapped model of VcCNT predicts that nucleoside transport also

  10. Repeat-swap homology modeling of secondary active transporters: updated protocol and prediction of elevator-type mechanisms

    Directory of Open Access Journals (Sweden)

    Cristina eFenollar Ferrer

    2015-09-01

    Secondary active transporters are critical for neurotransmitter clearance and recycling during synaptic transmission and uptake of nutrients. These proteins mediate the movement of solutes against their concentration gradients by using the energy released in the movement of ions down pre-existing concentration gradients. To achieve this, transporters conform to the so-called alternating-access hypothesis, whereby the protein adopts at least two conformations in which the substrate binding sites are exposed to either the outside or inside of the membrane, but not both simultaneously. Structures of a bacterial homolog of neuronal glutamate transporters, GltPh, in several different conformational states have revealed that the protein structure is asymmetric in the outward- and inward-open states, and that the conformational change connecting them involves an elevator-like movement of a substrate binding domain across the membrane. The structural asymmetry is created by inverted-topology repeats, i.e., structural repeats with similar overall folds whose transmembrane topologies are related to each other by two-fold pseudo-symmetry around an axis parallel to the membrane plane. Inverted repeats have been found in around three-quarters of secondary transporter folds. Moreover, the (a)symmetry of these systems has been successfully used as a bioinformatic tool, called "repeat-swap modeling", to predict structural models of a transporter in one conformation using the known structure of the transporter in the complementary conformation as a template. Here, we describe an updated repeat-swap homology modeling protocol, and calibrate the accuracy of the method using GltPh, for which both inward- and outward-facing conformations are known. We then apply this repeat-swap homology modeling procedure to a concentrative nucleoside transporter, VcCNT, which has a three-dimensional arrangement related to that of GltPh. The repeat-swapped model of VcCNT predicts that

  11. The value of information updating in new product development

    CERN Document Server

    Artmann, Christian

    2009-01-01

    This work shows how managing uncertainty in new product development can be improved by conducting an information update during the development process. The book details the comprehensive model needed to perform that information update.

  12. Advantages for EDF of using and updating PSAs for the probabilistic analysis of accident scenarios in nuclear power plants

    International Nuclear Information System (INIS)

    Feuillade, Gilles

    2000-01-01

    This paper shows the advantages for EDF of using and updating PSA models of PWR units for the probabilistic assessment of accident scenarios. These advantages may be classified in various categories: The construction of PSA models makes it possible to aggregate knowledge in a variety of fields: thermohydraulics, the behavior of equipment and systems, organization, plant operation, etc. The updating of PSA models makes it possible to monitor the state of progress of PWR unit safety levels and also allows a variety of applications to be used throughout the service life of the unit. The results obtained are directly applicable to the units, as the 'reference PSA models' developed by EDF conform to the units in service. The use of PSA for the examination of incident or accident scenarios makes it possible to verify the adequacy of the resources both in terms of 'systems' and 'plant operation'. These two notions are to be taken in the broadest sense, as they cover the aspects of system design, reliability and availability of equipment, organization of plant operation, comprehensiveness of operating procedures, human redundancy, etc. The use of PSA models through the different applications (analyses of predominant sequences, analyses of main equipment failures, analyses of main operator actions, analyses according to the power units, etc.) is becoming an indispensable supplement to conventional deterministic analyses of the envisaged accident scenarios. Within the scope of accident prevention, it constitutes a tool to assist the decision-maker, especially when evaluating the pertinence of the General Operating Rules (operating procedures, operating technical specifications, periodic testing, etc.). (S.Y.)

  13. Update on radiation safety and dose reduction in pediatric neuroradiology

    International Nuclear Information System (INIS)

    Mahesh, Mahadevappa

    2015-01-01

    The number of medical X-ray imaging procedures is growing exponentially across the globe. Even though the overall benefit from medical X-ray imaging procedures far outweighs any associated risks, it is crucial to take all necessary steps to minimize radiation risks to children without jeopardizing image quality. Among the X-ray imaging studies, except for interventional fluoroscopy procedures, CT studies constitute higher dose and therefore draw considerable scrutiny. A number of technological advances have provided ways for better and safer CT imaging. This article provides an update on the radiation safety of patients and staff and discusses dose optimization in medical X-ray imaging within pediatric neuroradiology. (orig.)

  14. Update on radiation safety and dose reduction in pediatric neuroradiology

    Energy Technology Data Exchange (ETDEWEB)

    Mahesh, Mahadevappa [Johns Hopkins University School of Medicine, The Russell H. Morgan Department of Radiology and Radiological Science, Baltimore, MD (United States)

    2015-09-15

    The number of medical X-ray imaging procedures is growing exponentially across the globe. Even though the overall benefit from medical X-ray imaging procedures far outweighs any associated risks, it is crucial to take all necessary steps to minimize radiation risks to children without jeopardizing image quality. Among the X-ray imaging studies, except for interventional fluoroscopy procedures, CT studies constitute higher dose and therefore draw considerable scrutiny. A number of technological advances have provided ways for better and safer CT imaging. This article provides an update on the radiation safety of patients and staff and discusses dose optimization in medical X-ray imaging within pediatric neuroradiology. (orig.)

  15. A visual tracking method based on deep learning without online model updating

    Science.gov (United States)

    Tang, Cong; Wang, Yicheng; Feng, Yunsong; Zheng, Chao; Jin, Wei

    2018-02-01

    The paper proposes a visual tracking method based on deep learning without online model updating. In consideration of the advantages of deep learning in feature representation, the deep model SSD (Single Shot Multibox Detector) is used as the object extractor in the tracking model. Simultaneously, the color histogram feature and the HOG (Histogram of Oriented Gradient) feature are combined to select the tracking object. In the process of tracking, a multi-scale object searching map is built to improve the detection performance of the deep detection model and the tracking efficiency. In experiments on eight tracking video sequences from the baseline dataset, compared with six state-of-the-art methods, the proposed method is more robust to challenging factors such as deformation, scale variation, rotation variation, illumination variation, and background clutter. Moreover, its overall performance is better than that of the other six tracking methods.
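
    The feature-fusion selection step, scoring each detected candidate by a combination of color-histogram and HOG similarity to a template, can be sketched as below. The histogram-intersection similarity and the equal weighting are assumptions for illustration, not details taken from the paper:

    ```python
    def normalise(h):
        # L1-normalise a histogram (returned unchanged if empty/all-zero)
        s = sum(h)
        return [x / s for x in h] if s else h

    def hist_intersection(h1, h2):
        # similarity of two L1-normalised histograms, in [0, 1]
        return sum(min(a, b) for a, b in zip(h1, h2))

    def candidate_score(color_sim, hog_sim, alpha=0.5):
        # convex combination of the two feature similarities
        return alpha * color_sim + (1 - alpha) * hog_sim

    def pick_candidate(candidates, template_color, template_hog, alpha=0.5):
        """candidates: list of (name, color_hist, hog_hist) for each detection;
        returns the name of the candidate most similar to the template."""
        tc, th = normalise(template_color), normalise(template_hog)
        def score(cand):
            _, ch, hh = cand
            return candidate_score(hist_intersection(normalise(ch), tc),
                                   hist_intersection(normalise(hh), th), alpha)
        return max(candidates, key=score)[0]
    ```

    In the full method the candidates would come from the SSD detector's boxes; here they are just labelled histograms.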

  16. Preconditioner Updates Applied to CFD Model Problems

    Czech Academy of Sciences Publication Activity Database

    Birken, P.; Duintjer Tebbens, Jurjen; Meister, A.; Tůma, Miroslav

    2008-01-01

    Roč. 58, č. 11 (2008), s. 1628-1641 ISSN 0168-9274 R&D Projects: GA AV ČR 1ET400300415; GA AV ČR KJB100300703 Institutional research plan: CEZ:AV0Z10300504 Keywords : finite volume methods * update preconditioning * Krylov subspace methods * Euler equations * conservation laws Subject RIV: BA - General Mathematics Impact factor: 0.952, year: 2008

  17. Office-based deep sedation for pediatric ophthalmologic procedures using a sedation service model.

    Science.gov (United States)

    Lalwani, Kirk; Tomlinson, Matthew; Koh, Jeffrey; Wheeler, David

    2012-01-01

    Aims. (1) To assess the efficacy and safety of pediatric office-based sedation for ophthalmologic procedures using a pediatric sedation service model. (2) To assess the reduction in hospital charges of this model of care delivery compared to the operating room (OR) setting for similar procedures. Background. Sedation is used to facilitate pediatric procedures and to immobilize patients for imaging and examination. We believe that the pediatric sedation service model can be used to facilitate office-based deep sedation for brief ophthalmologic procedures and examinations. Methods. After IRB approval, all children who underwent office-based ophthalmologic procedures at our institution between January 1, 2000 and July 31, 2008 were identified using the sedation service database and the electronic health record. A comparison of hospital charges between similar procedures in the operating room was performed. Results. A total of 855 procedures were reviewed. Procedure completion rate was 100% (C.I. 99.62-100). There were no serious complications or unanticipated admissions. Our analysis showed a significant reduction in hospital charges (average of $1287 per patient) as a result of absent OR and recovery unit charges. Conclusions. Pediatric ophthalmologic minor procedures can be performed using a sedation service model with significant reductions in hospital charges.

  18. Office-Based Deep Sedation for Pediatric Ophthalmologic Procedures Using a Sedation Service Model

    Directory of Open Access Journals (Sweden)

    Kirk Lalwani

    2012-01-01

    Aims. (1) To assess the efficacy and safety of pediatric office-based sedation for ophthalmologic procedures using a pediatric sedation service model. (2) To assess the reduction in hospital charges of this model of care delivery compared to the operating room (OR) setting for similar procedures. Background. Sedation is used to facilitate pediatric procedures and to immobilize patients for imaging and examination. We believe that the pediatric sedation service model can be used to facilitate office-based deep sedation for brief ophthalmologic procedures and examinations. Methods. After IRB approval, all children who underwent office-based ophthalmologic procedures at our institution between January 1, 2000 and July 31, 2008 were identified using the sedation service database and the electronic health record. A comparison of hospital charges between similar procedures in the operating room was performed. Results. A total of 855 procedures were reviewed. Procedure completion rate was 100% (C.I. 99.62–100). There were no serious complications or unanticipated admissions. Our analysis showed a significant reduction in hospital charges (average of $1287 per patient) as a result of absent OR and recovery unit charges. Conclusions. Pediatric ophthalmologic minor procedures can be performed using a sedation service model with significant reductions in hospital charges.

  19. Updated Reference Model for Heat Generation in the Lithosphere

    Science.gov (United States)

    Wipperfurth, S. A.; Sramek, O.; Roskovec, B.; Mantovani, F.; McDonough, W. F.

    2017-12-01

    Models integrating geophysics and geochemistry allow for characterization of the Earth's heat budget and geochemical evolution. Global lithospheric geophysical models are now constrained by surface and body wave data and are classified into several unique tectonic types. Global lithospheric geochemical models have evolved from petrological characterization of layers to a combination of petrologic and seismic constraints. Because of these advances regarding our knowledge of the lithosphere, it is necessary to create an updated chemical and physical reference model. We are developing a global lithospheric reference model based on LITHO1.0 (segmented into 1°lon x 1°lat x 9-layers) and seismological-geochemical relationships. Uncertainty assignments and correlations are assessed for its physical attributes, including layer thickness, Vp and Vs, and density. This approach yields uncertainties for the masses of the crust and lithospheric mantle. Heat producing element abundances (HPE: U, Th, and K) are ascribed to each volume element. These chemical attributes are based upon the composition of subducting sediment (sediment layers), composition of surface rocks (upper crust), a combination of petrologic and seismic correlations (middle and lower crust), and a compilation of xenolith data (lithospheric mantle). The HPE abundances are correlated within each voxel, but not vertically between layers. Efforts to provide correlation of abundances horizontally between each voxel are discussed. These models are used further to critically evaluate the bulk lithosphere heat production in the continents and the oceans. Cross-checks between our model and results from: 1) heat flux (Artemieva, 2006; Davies, 2013; Cammarano and Guerri, 2017), 2) gravity (Reguzzoni and Sampietro, 2015), and 3) geochemical and petrological models (Rudnick and Gao, 2014; Hacker et al. 2015) are performed.
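
    The voxel bookkeeping behind such a reference model reduces to mass times HPE mass fraction times elemental heat productivity, summed over U, Th, and K. The sketch below uses approximate literature heat-production rates and invented abundances purely as an illustration; it is not the LITHO1.0-based model itself:

    ```python
    # Approximate radiogenic heat production per kilogram of element (W/kg);
    # representative literature values, used here only as an assumption.
    HEAT_RATE = {"U": 98e-6, "Th": 26e-6, "K": 3.5e-9}

    def voxel_heat(mass_kg, abundances):
        """Radiogenic heat (W) of one voxel, given its mass and its HPE mass
        fractions (kg of element per kg of rock)."""
        return mass_kg * sum(HEAT_RATE[el] * abundances[el] for el in HEAT_RATE)

    def lithosphere_heat(voxels):
        # total heat production of a list of (mass_kg, abundances) voxels
        return sum(voxel_heat(m, ab) for m, ab in voxels)
    ```

    In the actual model each 1°x1°x9-layer voxel would carry correlated uncertainties on both the physical attributes (mass) and the chemical attributes (abundances), so the total would be a distribution rather than a single number.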

  20. Procedural Content Graphs for Urban Modeling

    Directory of Open Access Journals (Sweden)

    Pedro Brandão Silva

    2015-01-01

    Massive procedural content creation, for example, for virtual urban environments, is a difficult, yet important challenge. While shape grammars are a popular and effective formalism for architectural modeling, they have clear limitations regarding readability, manageability, and expressive power when addressing a variety of complex structural designs. Moreover, shape grammars aim at geometry specification and do not facilitate integration with other types of content, such as textures or light sources, which could rather accompany the generation process. We present procedural content graphs, a graph-based solution for procedural generation that addresses all these issues in a visual, flexible, and more expressive manner. Besides integrating handling of diverse types of content, this approach introduces collective entity manipulation as lists, seamlessly providing features such as advanced filtering, grouping, merging, ordering, and aggregation, essentially unavailable in shape grammars. In this way, separate entities can easily be merged or analyzed together in order to perform a variety of context-based decisions and operations. The advantages of this approach are illustrated via examples of tasks that are either very cumbersome or simply impossible to express with previous grammar approaches.

  1. Medi SPICE : an update

    OpenAIRE

    Mc Caffery, Fergal; Dorling, Alec; Casey, Valentine

    2010-01-01

    This paper provides an update on the development of a software process assessment and improvement model (Medi SPICE) specifically for the medical device industry. The development of Medi SPICE was launched at the SPICE 2009 Conference. Medi SPICE will consist of a Process Reference Model and a Process Assessment Model. The Medi SPICE Process Assessment Model will be used to perform conformant assessments of the software process capability of medical device suppliers in accord...

  2. Cassini Spacecraft Uncertainty Analysis Data and Methodology Review and Update/Volume 1: Updated Parameter Uncertainty Models for the Consequence Analysis

    Energy Technology Data Exchange (ETDEWEB)

    WHEELER, TIMOTHY A.; WYSS, GREGORY D.; HARPER, FREDERICK T.

    2000-11-01

    Uncertainty distributions for specific parameters of the Cassini General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) Final Safety Analysis Report consequence risk analysis were revised and updated. The revisions and updates were done for all consequence parameters for which relevant information exists from the joint project on Probabilistic Accident Consequence Uncertainty Analysis by the United States Nuclear Regulatory Commission and the Commission of European Communities.

  3. Procedure for Application of Software Reliability Growth Models to NPP PSA

    International Nuclear Information System (INIS)

    Son, Han Seong; Kang, Hyun Gook; Chang, Seung Cheol

    2009-01-01

    As the use of software increases at nuclear power plants (NPPs), the necessity for including software reliability and/or safety in the NPP Probabilistic Safety Assessment (PSA) rises. This work proposes an application procedure for software reliability growth models (RGMs), which are most widely used to quantify software reliability, to NPP PSA. Through the proposed procedure, it can be determined whether a software reliability growth model can be applied to the NPP PSA before its real application. The procedure proposed in this work is expected to be very helpful for incorporating software into NPP PSA.
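
    As one concrete instance of the reliability growth models such a procedure would screen, the Goel-Okumoto NHPP model is sketched below. It is chosen here only for illustration; the paper does not single out a specific model:

    ```python
    import math

    def go_mean(t, a, b):
        # Goel-Okumoto expected cumulative failures by test time t:
        # m(t) = a * (1 - exp(-b t)), with a = total expected failures
        return a * (1.0 - math.exp(-b * t))

    def go_intensity(t, a, b):
        # failure intensity lambda(t) = m'(t) = a * b * exp(-b t)
        return a * b * math.exp(-b * t)

    def residual_failures(t, a, b):
        # expected failures remaining after test time t
        return a - go_mean(t, a, b)
    ```

    In practice the parameters a and b would be fit to observed failure data, e.g. by maximum likelihood, and quantities such as the residual failure count or the current intensity would feed the applicability checks and the PSA quantification.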

  4. Updated science systems on USCGC Healy

    Science.gov (United States)

    Chayes, D. N.; Roberts, S. D.; Arko, R. A.; Hiller, S. M.

    2008-12-01

    The USCG cutter Healy is the U.S. Arctic research icebreaker. Prior to the 2008 season, a number of upgrades and improvements were made to the science systems. These included the addition of two Bell BGM-3 marine gravity meters. The vessel's existing meteorological sensors were enhanced with two RM Young model 85004 heated ultrasonic anemometers; a Paroscientific, Inc. model "MET-3A" air temperature, humidity, and barometric pressure subsystem; and an RM Young model 50202 heated rain gauge. The flow-through seawater system was updated with new flow meters, a SeaBird SBE45 thermosalinograph, long- and short-wave radiation sensors, and a Seapoint fluorometer. A Milltech Marine Smart Radio model SR161 Automatic Identification System (AIS) receiver and an updated interface to real-time winch and wire performance have been added. Our onboard real-time GIS has been updated to include real-time plotting of other ships' tracks from our AIS receiver and the ability for users to save and share planned tracks. For the HLY0806 leg, we implemented a SWAP ship-to-ship wireless connection for our two-ship operations with the Canadian icebreaker Louis S. St. Laurent, similar to the one we implemented for our two-ship program with the Swedish icebreaker Oden in 2005. We updated our routine delivery of underway data to investigators, as well as a copy for archiving to the NSF-supported Marine Geoscience Data System (MGDS), using portable "boomerang" drives. An end-user workstation was added to accommodate increasing demand for onboard processing. Technical support for science on the Healy is funded by the U.S. National Science Foundation.

  5. Merrill's Atlas of radiographic positions and radiologic procedures. Volumes 1-3. Sixth edition

    International Nuclear Information System (INIS)

    Ballinger, P.W.

    1985-01-01

    Merrill's Atlas describes and explains the routine and specialized radiologic procedures for all body systems. This edition has been thoroughly reorganized, updated, and expanded. Volumes one and two describe all routine and fluoroscopic procedures, and volume three describes more specialized areas in the profession.

  6. 2014 Update of the United States National Seismic Hazard Maps

    Science.gov (United States)

    Petersen, M.D.; Mueller, C.S.; Haller, K.M.; Moschetti, M.; Harmsen, S.C.; Field, E.H.; Rukstales, K.S.; Zeng, Y.; Perkins, D.M.; Powers, P.; Rezaeian, S.; Luco, N.; Olsen, A.; Williams, R.

    2012-01-01

    The U.S. National Seismic Hazard Maps are revised every six years, corresponding with the update cycle of the International Building Code. These maps cover the conterminous U.S. and will be updated in 2014 using the best-available science that is obtained from colleagues at regional and topical workshops, which are convened in 2012-2013. Maps for Alaska and Hawaii will be updated shortly following this update. Alternative seismic hazard models discussed at the workshops will be implemented in a logic tree framework and will be used to develop the seismic hazard maps and associated products. In this paper we describe the plan to update the hazard maps, the issues raised in workshops up to March 2012, and topics that will be discussed at future workshops. An advisory panel will guide the development of the hazard maps and ensure that the maps are acceptable to a broad segment of the science and engineering communities. These updated maps will then be considered by end-users for inclusion in building codes, risk models, and public policy documents.
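
    A logic-tree framework combines the alternative hazard models as weighted branches; the combination step can be sketched as a weighted mean of branch hazard curves. This is schematic only, not the USGS implementation:

    ```python
    def combine_hazard_curves(branch_curves, weights):
        """Weighted mean of annual exceedance probabilities across logic-tree
        branches; each curve lists P(exceedance) at the same ground-motion
        levels, and the branch weights must sum to one."""
        if abs(sum(weights) - 1.0) > 1e-9:
            raise ValueError("branch weights must sum to 1")
        npts = len(branch_curves[0])
        return [sum(w * curve[i] for w, curve in zip(weights, branch_curves))
                for i in range(npts)]

    # Two hypothetical branches evaluated at two ground-motion levels.
    mean_curve = combine_hazard_curves([[0.1, 0.01], [0.3, 0.03]], [0.5, 0.5])
    ```

    Branch weights encode the advisory panel's relative belief in each alternative model; fractile curves (e.g. the 84th percentile) would require keeping the branch ensemble rather than just the mean.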

  7. Identification of material parameters for plasticity models: A comparative study on the finite element model updating and the virtual fields method

    Science.gov (United States)

    Martins, J. M. P.; Thuillier, S.; Andrade-Campos, A.

    2018-05-01

    The identification of material parameters, for a given constitutive model, can be seen as the first step before any practical application. In the last years, the field of material parameters identification received an important boost with the development of full-field measurement techniques, such as Digital Image Correlation. These techniques enable the use of heterogeneous displacement/strain fields, which contain more information than the classical homogeneous tests. Consequently, different techniques have been developed to extract material parameters from full-field measurements. In this study, two of these techniques are addressed, the Finite Element Model Updating (FEMU) and the Virtual Fields Method (VFM). The main idea behind FEMU is to update the parameters of a constitutive model implemented in a finite element model until both numerical and experimental results match, whereas VFM makes use of the Principle of Virtual Work and does not require any finite element simulation. Though both techniques proved their feasibility in linear and non-linear constitutive models, it is rather difficult to rank their robustness in plasticity. The purpose of this work is to perform a comparative study in the case of elasto-plastic models. Details concerning the implementation of each strategy are presented. Moreover, a dedicated code for VFM within a large strain framework is developed. The reconstruction of the stress field is performed through a user subroutine. A heterogeneous tensile test is considered to compare FEMU and VFM strategies.
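
    The FEMU idea, iterating model parameters until simulated and measured responses match, can be sketched with a toy constitutive law standing in for the finite element model. The Hollomon hardening law and the coarse-to-fine search below are illustrative assumptions, not the paper's actual solver:

    ```python
    import itertools

    def misfit(params, simulate, measured):
        # sum of squared residuals between simulated and measured responses
        return sum((simulate(params, x) - y) ** 2 for x, y in measured)

    def femu_identify(simulate, measured, bounds, levels=6, npts=11):
        """Coarse-to-fine parameter sweep standing in for the FEMU update loop:
        at each level, evaluate a grid over the current bounds and shrink the
        bounds around the best point found so far."""
        lo = [b[0] for b in bounds]
        hi = [b[1] for b in bounds]
        best = None
        for _ in range(levels):
            axes = [[lo[d] + (hi[d] - lo[d]) * i / (npts - 1) for i in range(npts)]
                    for d in range(len(bounds))]
            for p in itertools.product(*axes):
                r = misfit(p, simulate, measured)
                if best is None or r < best[0]:
                    best = (r, p)
            centre = best[1]
            span = [(hi[d] - lo[d]) / (npts - 1) for d in range(len(bounds))]
            lo = [centre[d] - span[d] for d in range(len(bounds))]
            hi = [centre[d] + span[d] for d in range(len(bounds))]
        return best[1]

    # Toy "model": Hollomon hardening law sigma = K * eps**n.
    def hollomon(params, eps):
        K, n = params
        return K * eps ** n
    ```

    With synthetic "measurements" generated at K = 500, n = 0.2, the loop recovers the parameters; a real FEMU run would replace `hollomon` with a finite element simulation of the heterogeneous test and typically use a gradient-based optimizer instead of a sweep.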

  8. Bariatric Surgery and Endoluminal Procedures: IFSO Worldwide Survey 2014.

    Science.gov (United States)

    Angrisani, L; Santonicola, A; Iovino, P; Vitiello, A; Zundel, N; Buchwald, H; Scopinaro, N

    2017-09-01

    Several bariatric surgery worldwide surveys have been previously published to illustrate the evolution of bariatric surgery over the last decades. The aim of this survey is to report an updated overview of all bariatric procedures performed in 2014. For the first time, a special section on endoluminal techniques was added. The 2014 International Federation for the Surgery of Obesity and Metabolic Disorders (IFSO) survey form evaluating the number and the type of surgical and endoluminal bariatric procedures was emailed to all IFSO societies. Trend analyses from 2011 to 2014 were also performed. There were 56/60 (93.3%) responders. The total number of bariatric/metabolic procedures performed in 2014 consisted of 579,517 (97.6%) surgical operations and 14,725 (2.4%) endoluminal procedures. The most commonly performed procedure in the world was sleeve gastrectomy (SG), which reached 45.9%, followed by Roux-en-Y gastric bypass (RYGB) (39.6%) and adjustable gastric banding (AGB) (7.4%). The annual percentage changes from 2013 revealed an increase of SG and a decrease of RYGB in all the IFSO regions (USA/Canada, Europe, and Asia/Pacific), with the exception of Latin/South America, where SG decreased and RYGB represented the most frequent procedure. There was a further increase in the total number of bariatric/metabolic procedures in 2014, and SG is currently the most frequent surgical procedure in the world. This is the first survey that describes the endoluminal procedures, but the accuracy of the provided data will hopefully improve in the near future. We encourage the creation of further national registries and their continuous update, taking into account all new bariatric procedures, including the endoscopic procedures that will gain increasing importance in the near future.

  9. Comparing the impact of time displaced and biased precipitation estimates for online updated urban runoff models

    DEFF Research Database (Denmark)

    Borup, Morten; Mikkelsen, Peter Steen; Borup, Morten

    2013-01-01

    When an online runoff model is updated from system measurements, the requirements of the precipitation input change. Using rain gauge data as precipitation input there will be a displacement between the time when the rain hits the gauge and the time where the rain hits the actual catchment, due...

  10. Updating risk prediction tools: a case study in prostate cancer.

    Science.gov (United States)

    Ankerst, Donna P; Koniarski, Tim; Liang, Yuanyuan; Leach, Robin J; Feng, Ziding; Sanda, Martin G; Partin, Alan W; Chan, Daniel W; Kagan, Jacob; Sokoll, Lori; Wei, John T; Thompson, Ian M

    2012-01-01

    Online risk prediction tools for common cancers are now easily accessible and widely used by patients and doctors for informed decision-making concerning screening and diagnosis. A practical problem is that, as cancer research moves forward and new biomarkers and risk factors are discovered, there is a need to update the risk algorithms to include them. Typically, the new markers and risk factors cannot be retrospectively measured on the same study participants used to develop the original prediction tool, necessitating the merging of a separate study of different participants, which may be much smaller in sample size and of a different design. Validation of the updated tool on a third independent data set is warranted before the updated tool can go online. This article reports on the application of Bayes rule for updating risk prediction tools to include a set of biomarkers measured in an external study to the original study used to develop the risk prediction tool. The procedure is illustrated in the context of updating the online Prostate Cancer Prevention Trial Risk Calculator to incorporate the new markers %freePSA and [-2]proPSA measured in an external case-control study performed in Texas, U.S. Recent state-of-the-art methods in validation of risk prediction tools and evaluation of the improvement of updated to original tools are implemented using an external validation set provided by the U.S. Early Detection Research Network. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
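
    At its core, folding a new marker into an existing risk tool via Bayes rule amounts to multiplying prior odds by the marker's likelihood ratio. The sketch below shows that schematic step, not the PCPT calculator's exact algebra:

    ```python
    def update_risk(prior_risk, likelihood_ratio):
        """Posterior risk after observing new markers, where likelihood_ratio
        is P(marker values | cancer) / P(marker values | no cancer),
        typically estimated from the external case-control study."""
        prior_odds = prior_risk / (1.0 - prior_risk)
        posterior_odds = prior_odds * likelihood_ratio
        return posterior_odds / (1.0 + posterior_odds)
    ```

    For example, a prior risk of 0.20 combined with a marker likelihood ratio of 3 gives posterior odds of 0.25 × 3 = 0.75, i.e. a posterior risk of about 0.43; a likelihood ratio of 1 leaves the risk unchanged.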

  11. PROCRU: A model for analyzing crew procedures in approach to landing

    Science.gov (United States)

    Baron, S.; Muralidharan, R.; Lancraft, R.; Zacharias, G.

    1980-01-01

    A model for analyzing crew procedures in approach to landing is developed. The model employs the information-processing structure used in the optimal control model and in recent models for monitoring and failure detection. Mechanisms are added to this basic structure to model crew decision making in this multitask environment. Decisions are based on probability assessments and potential mission impact (or gain). Submodels for procedural activities are included. The model distinguishes among external visual, instrument visual, and auditory sources of information. The external visual scene perception models incorporate limitations in obtaining information. The auditory information channel contains a buffer to allow for storage in memory until that information can be processed.

  12. Best Practice Updates for Pediatric/Adolescent Weight Loss Surgery

    Science.gov (United States)

    Pratt, Janey S.A.; Lenders, Carine M.; Dionne, Emily A.; Hoppin, Alison G.; Hsu, George L.K.; Inge, Thomas H.; Lawlor, David F.; Marino, Margaret F.; Meyers, Alan F.; Rosenblum, Jennifer L.; Sanchez, Vivian M.

    2011-01-01

    The objective of this study is to update evidence-based best practice guidelines for pediatric/adolescent weight loss surgery (WLS). We performed a systematic search of English-language literature on WLS and pediatric, adolescent, gastric bypass, laparoscopic gastric banding, and extreme obesity published between April 2004 and May 2007 in PubMed, MEDLINE, and the Cochrane Library. Keywords were used to narrow the search for a selective review of abstracts, retrieval of full articles, and grading of evidence according to systems used in established evidence-based models. In light of evidence on the natural history of obesity and on outcomes of WLS in adolescents, guidelines for surgical treatment of obesity in this age group need to be updated. We recommend modification of selection criteria to include adolescents with BMI ≥ 35 and specific obesity-related comorbidities for which there is clear evidence of important short-term morbidity (i.e., type 2 diabetes, severe steatohepatitis, pseudotumor cerebri, and moderate-to-severe obstructive sleep apnea). In addition, WLS should be considered for adolescents with extreme obesity (BMI ≥ 40) and other comorbidities associated with long-term risks. We identified >1,085 papers; 186 of the most relevant were reviewed in detail. Regular updates of evidence-based recommendations for best practices in pediatric/adolescent WLS are required to address advances in technology and the growing evidence base in pediatric WLS. Key considerations in patient safety include carefully designed criteria for patient selection, multidisciplinary evaluation, choice of appropriate procedure, thorough screening and management of comorbidities, optimization of long-term compliance, and age-appropriate fully informed consent. PMID:19396070

  13. Inhibition, Updating Working Memory, and Shifting Predict Reading Disability Symptoms in a Hybrid Model: Project KIDS

    Directory of Open Access Journals (Sweden)

    Mia C. Daucourt

    2018-03-01

    Full Text Available Recent achievement research suggests that executive function (EF), a set of regulatory processes that control both thought and action necessary for goal-directed behavior, is related to typical and atypical reading performance. This project examines the relation of EF, as measured by its components, Inhibition, Updating Working Memory, and Shifting, with a hybrid model of reading disability (RD). Our sample included 420 children who participated in a broader intervention project when they were in KG-third grade (age M = 6.63 years, SD = 1.04 years, range = 4.79–10.40 years). At the time their EF was assessed, using a parent-report Behavior Rating Inventory of Executive Function (BRIEF), they had a mean age of 13.21 years (SD = 1.54 years; range = 10.47–16.63 years). The hybrid model of RD was operationalized as a composite consisting of four symptoms, and set so that any child could have any one, any two, any three, any four, or none of the symptoms included in the hybrid model. The four symptoms include low word reading achievement, unexpected low word reading achievement, poorer reading comprehension compared to listening comprehension, and dual-discrepancy response-to-intervention, requiring both low achievement and low growth in word reading. The results of our multilevel ordinal logistic regression analyses showed a significant relation between all three components of EF (Inhibition, Updating Working Memory, and Shifting) and the hybrid model of RD, and that the strength of EF’s predictive power for RD classification was the highest when RD was modeled as having at least one or more symptoms. Importantly, the chances of being classified as having RD increased as EF performance worsened and decreased as EF performance improved. The question of whether any one EF component would emerge as a superior predictor was also examined and results showed that Inhibition, Updating Working Memory, and Shifting were equally valuable as predictors of the

  14. EANM procedural guidelines for radionuclide myocardial perfusion imaging with SPECT and SPECT/CT

    DEFF Research Database (Denmark)

    Verberne, Hein J; Acampa, Wanda; Anagnostopoulos, Constantinos

    2015-01-01

    Since the publication of the European Association of Nuclear Medicine (EANM) procedural guidelines for radionuclide myocardial perfusion imaging (MPI) in 2005, many small and some larger steps of progress have been made, improving MPI procedures. In this paper, the major changes from the updated ...

  15. Updated embrittlement trend curve for reactor pressure vessel steels

    International Nuclear Information System (INIS)

    Kirk, M.; Santos, C.; Eason, E.; Wright, J.; Odette, G.R.

    2003-01-01

    The reactor pressure vessels of commercial nuclear power plants are subject to embrittlement due to exposure to high energy neutrons from the core. Irradiation embrittlement of RPV belt-line materials is currently evaluated using US Regulatory Guide 1.99 Revision 2 (RG 1.99 Rev 2), which presents methods for estimating the Charpy transition temperature shift (ΔT30) at 30 ft-lb (41 J) and the drop in Charpy upper shelf energy (ΔUSE). A more recent embrittlement model, based on a broader database and more recent research results, is presented in NUREG/CR-6551. The objective of this paper is to describe the most recent update to the embrittlement model in NUREG/CR-6551, based upon additional data and increased understanding of embrittlement mechanisms. The updated ΔT30 and USE models include fluence, copper, nickel, and phosphorus content, and product form; the ΔT30 model also includes coolant temperature, irradiation time (or flux), and a long-time term. The models were developed using multi-variable surface fitting techniques, understanding of the ΔT30 mechanisms, and engineering judgment. The updated ΔT30 model reduces scatter significantly relative to RG 1.99 Rev 2 on the currently available database for plates, forgings, and welds. This updated embrittlement trend curve will form the basis of revision 3 to Regulatory Guide 1.99. (author)
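    As a toy illustration of the multi-variable surface-fitting step, the sketch below fits a simple shift model, ΔT30 = c0 + c1·Cu + c2·log10(fluence), to synthetic surveillance points by ordinary least squares. The functional form and every number are invented for illustration; this is not the NUREG/CR-6551 model, which uses far more variables and mechanistic terms.

    ```python
    import math

    def solve3(A, b):
        """Gaussian elimination with partial pivoting for a small linear system Ax = b."""
        n = len(A)
        M = [row[:] + [b[i]] for i, row in enumerate(A)]
        for col in range(n):
            pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
            M[col], M[pivot] = M[pivot], M[col]
            for r in range(col + 1, n):
                f = M[r][col] / M[col][col]
                for c in range(col, n + 1):
                    M[r][c] -= f * M[col][c]
        x = [0.0] * n
        for r in range(n - 1, -1, -1):
            x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
        return x

    def fit_shift(data):
        """Ordinary least squares for dT30 = c0 + c1*Cu + c2*log10(fluence),
        via the normal equations (X^T X) c = X^T y."""
        X = [[1.0, cu, math.log10(f)] for cu, f, _ in data]
        y = [shift for _, _, shift in data]
        XtX = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
        Xty = [sum(r[i] * yk for r, yk in zip(X, y)) for i in range(3)]
        return solve3(XtX, Xty)

    # Synthetic surveillance points: (Cu wt%, fluence in n/cm^2, measured shift in degF)
    data = [(0.05, 1e18, 20.0), (0.20, 1e18, 45.0),
            (0.05, 1e19, 55.0), (0.20, 1e19, 95.0), (0.30, 5e18, 90.0)]
    c0, c1, c2 = fit_shift(data)
    ```

    On these invented data the fitted coefficients for copper and fluence come out positive, reflecting the trend (more copper or fluence, more shift) that the real trend curves capture with a richer functional form.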

  16. Statistical elements in calculation procedures for air quality control; Elementi di statistica nelle procedure di calcolo per il controllo della qualita' dell'aria

    Energy Technology Data Exchange (ETDEWEB)

    Mura, M.C. [Istituto Superiore di Sanita' , Laboratorio di Igiene Ambientale, Rome (Italy)

    2001-07-01

    The statistical processing of data from the monitoring of chemical atmospheric pollution, aimed at air quality control, is presented in the form of procedural models intended to offer a practical instrument to operators in the sector. The procedural models are modular and can easily be integrated with other models. They include elementary calculation procedures and mathematical methods for statistical analysis. The calculation elements have been developed by probabilistic induction so as to relate them to the statistical models, which are the basis of the methods used for the study and forecasting of atmospheric pollution. This report is part of the updating and training activity that the Istituto Superiore di Sanita' has been carrying on for over twenty years, addressed to operators in the environmental field.

  17. [The emphases and basic procedures of genetic counseling in psychotherapeutic model].

    Science.gov (United States)

    Zhang, Yuan-Zhi; Zhong, Nanbert

    2006-11-01

    The emphases and basic procedures of genetic counseling in the psychotherapeutic model differ from those in older models. In the psychotherapeutic model, genetic counseling focuses not only on counselees' genetic disorders and birth defects but also on their psychological problems. "Client-centered therapy", the term coined by Carl Rogers, plays an important role in the genetic counseling process. The basic procedures of the psychotherapeutic model of genetic counseling include 7 steps: initial contact, introduction, agendas, inquiry of family history, presenting information, closing the session, and follow-up.

  18. Updating radon daughter bronchial dosimetry

    International Nuclear Information System (INIS)

    Harley, N.H.; Cohen, B.S.

    1990-01-01

    It is of value to update radon daughter bronchial dosimetry as new information becomes available. Measurements have now been performed using hollow casts of the human bronchial tree with a larynx to determine convective or turbulent deposition in the upper airways. These measurements allow a more realistic calculation of bronchial deposition by diffusion. Particle diameters of 0.15 and 0.2 μm were used, which correspond to the activity median diameters for radon daughters in both environmental and mining atmospheres. The total model incorporates Yeh/Schum bronchial morphometry, deposition of unattached and attached radon daughters, buildup and decay of the daughters, and mucociliary clearance. The alpha dose to target cells in the bronchial epithelium is calculated for the updated model and compared with previous calculations of bronchial dose.

  19. A RENORMALIZATION PROCEDURE FOR TENSOR MODELS AND SCALAR-TENSOR THEORIES OF GRAVITY

    OpenAIRE

    SASAKURA, NAOKI

    2010-01-01

    Tensor models are more-index generalizations of the so-called matrix models, and provide models of quantum gravity with the idea that spaces and general relativity are emergent phenomena. In this paper, a renormalization procedure for the tensor models whose dynamical variable is a totally symmetric real three-tensor is discussed. It is proven that configurations with certain Gaussian forms are the attractors of the three-tensor under the renormalization procedure. Since these Gaussian config...

  20. Vasectomy reversal: a clinical update

    Directory of Open Access Journals (Sweden)

    Abhishek P Patel

    2016-01-01

    Full Text Available Vasectomy is a safe and effective method of contraception used by 42-60 million men worldwide. Approximately 3%-6% of men opt for a vasectomy reversal due to the death of a child or divorce and remarriage, change in financial situation, desire for more children within the same marriage, or to alleviate the dreaded postvasectomy pain syndrome. Unlike vasectomy, vasectomy reversal is a much more technically challenging procedure that is performed only by a minority of urologists and places a larger financial strain on the patient since it is usually not covered by insurance. Interest in this procedure has increased since the operating microscope became available in the 1970s, which consequently led to improved patency and pregnancy rates following the procedure. In this clinical update, we discuss patient evaluation, variables that may influence reversal success rates, factors to consider in choosing to perform vasovasostomy versus vasoepididymostomy, and the usefulness of vasectomy reversal to alleviate postvasectomy pain syndrome. We also review the use of robotics for vasectomy reversal and other novel techniques and instrumentation that have emerged in recent years to aid in the success of this surgery.

  1. 76 FR 43890 - Technical Amendment to Commission Procedures for Filing Applications for Orders for Exemptive...

    Science.gov (United States)

    2011-07-22

    ... to clarify and update references to an SEC Web site address and to eliminate certain formatting... Commission is amending Sec. 240.0-12(b) to update references to an SEC Web site address to be used in... formatting requirements. I. Certain Findings Under the Administrative Procedure Act (``APA''), notice of...

  2. Non-linear Bayesian update of PCE coefficients

    KAUST Repository

    Litvinenko, Alexander

    2014-01-06

    Given: a physical system modeled by a PDE or ODE with uncertain coefficient q(ω), and a measurement operator Y(u(q), q), where u(q, ω) is the uncertain solution. Aim: to identify q(ω). The mapping from parameters to observations is usually not invertible, hence this inverse identification problem is generally ill-posed. To identify q(ω) we derived a non-linear Bayesian update from the variational problem associated with conditional expectation. To reduce the cost of the Bayesian update we offer a functional approximation, e.g. a polynomial chaos expansion (PCE). New: we apply the Bayesian update to the PCE coefficients of the random coefficient q(ω) (not to the probability density function of q).

  3. Non-linear Bayesian update of PCE coefficients

    KAUST Repository

    Litvinenko, Alexander; Matthies, Hermann G.; Pojonk, Oliver; Rosic, Bojana V.; Zander, Elmar

    2014-01-01

    Given: a physical system modeled by a PDE or ODE with uncertain coefficient q(ω), and a measurement operator Y(u(q), q), where u(q, ω) is the uncertain solution. Aim: to identify q(ω). The mapping from parameters to observations is usually not invertible, hence this inverse identification problem is generally ill-posed. To identify q(ω) we derived a non-linear Bayesian update from the variational problem associated with conditional expectation. To reduce the cost of the Bayesian update we offer a functional approximation, e.g. a polynomial chaos expansion (PCE). New: we apply the Bayesian update to the PCE coefficients of the random coefficient q(ω) (not to the probability density function of q).
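    A minimal scalar sketch of the linear (Kalman-type) special case of updating PCE coefficients directly, assuming a one-term Hermite expansion q = c0 + c1·ξ1 with Gaussian observation noise carried by its own germ ξ2. The abstract's actual method is the more general non-linear update derived from conditional expectation; this toy only shows what "updating the coefficients, not the density" means.

    ```python
    def linear_bayes_update_pce(coeffs, noise_std, y_obs):
        """Kalman-type linear Bayesian update applied directly to PCE coefficients.
        coeffs = [c0, c1]: prior expansion q = c0 + c1*xi1, with xi1 ~ N(0, 1).
        The observation noise gets its own germ xi2, so the posterior expansion
        is [c0', c1', c2'] over (xi1, xi2)."""
        c0, c1 = coeffs
        gain = c1**2 / (c1**2 + noise_std**2)   # Kalman gain from prior/noise variances
        return [c0 + gain * (y_obs - c0),       # updated mean coefficient
                (1.0 - gain) * c1,              # shrunk coefficient on the prior germ
                gain * noise_std]               # new coefficient on the noise germ xi2

    # Prior q ~ N(2, 1) written as PCE [2.0, 1.0]; one noisy observation y = 3.0
    post = linear_bayes_update_pce([2.0, 1.0], noise_std=0.5, y_obs=3.0)
    posterior_mean = post[0]
    posterior_var = post[1] ** 2 + post[2] ** 2
    ```

    The posterior mean and variance read off the updated coefficients agree with the textbook Gaussian conjugate update, which is the point of doing Bayes on the coefficients themselves.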

  4. Guide for External Beam Radiotherapy. Procedures 2007

    International Nuclear Information System (INIS)

    Ardiet, Jean-Michel; Bourhis, Jean; Eschwege, Francois; Gerard, Jean-Pierre; Martin, Philippe; Mazeron, Jean-Jacques; Barillot, Isabelle; Bey, Pierre; Cosset, Jean-Marc; Thomas, Olivier; Bolla, Michel; Bourguignon, Michel; Godet, Jean-Luc; Krembel, David; Valero, Marc; Bara, Christine; Beauvais-March, Helene; Derreumaux, Sylvie; Vidal, Jean-Pierre; Drouard, Jean; Sarrazin, Thierry; Lindecker-Cournil, Valerie; Robin, Sun Hee Lee; Thevenet, Nicolas; Depenweiller, Christian; Le Tallec, Philippe; Ortholan, Cecile; Aimone, Nicole; Baldeschi, Carine; Cantelli, Andree; Estivalet, Stephane; Le Prince, Cyrille; QUERO, Laurent; Costa, Andre; Gerard, Jean-Pierre; Ardiet, Jean-Michel; Bensadoun, Rene-Jean; Bourhis, Jean; Calais, Gilles; Lartigau, Eric; Ginot, Aurelie; Girard, Nicolas; Mornex, Francoise; Bolla, Michel; Chauvet, Bruno; Maingon, Philippe; Martin, Etienne; Azria, David; Gerard, Jean-Pierre; Grehange, Gilles; Hennequin, Christophe; Peiffert, Didier; Toledano, Alain; Belkacemi, Yazid; Courdi, Adel; Belliere, Aurelie; Peignaux, Karine; Mahe, Marc; Bondiau, Pierre-Yves; Kantor, Guy; Lepechoux, Cecile; Carrie, Christian; Claude, Line

    2007-01-01

    In order to optimize quality and security in the delivery of radiation treatment, the French SFRO (Societe francaise de radiotherapie oncologique) is publishing a Guide for Radiotherapy. This guide is realized according to the HAS (Haute Autorite de sante) methodology of 'structured experts consensus'. This document is made of two parts: a general description of external beam radiation therapy and chapters describing the technical procedures of the main tumors to be irradiated (24). For each procedure, a special attention is given to dose constraints in the organs at risk. This guide will be regularly updated

  5. Sensitivity study of a method for updating a finite element model of a nuclear power station cooling tower

    International Nuclear Information System (INIS)

    Billet, L.

    1994-01-01

    The Research and Development Division of Electricite de France is developing a surveillance method for cooling towers based on on-site measurements of wind-induced response. The method is supposed to detect structural damage in the tower. The damage is identified by tuning a finite element model of the tower on experimental mode shapes and eigenfrequencies. The sensitivity of the method was evaluated through numerical tests. First, the dynamic response of a damaged tower was simulated by varying the stiffness of some area of the model shell (from 1% to 24% of the total shell area). Second, the structural parameters of the undamaged cooling tower model were updated in order to make the output of the undamaged model as close as possible to the synthetic experimental data. The updating method, based on minimization of the differences between experimental modal energies and modal energies calculated by the model, did not detect a stiffness change affecting less than 3% of the shell area. Such a sensitivity is thought to be insufficient to detect tower cracks, which behave like highly localized defects. (author). 8 refs., 9 figs., 6 tabs
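    The updating loop can be illustrated on a 2-degree-of-freedom spring-mass chain: "measured" frequencies are generated from a model with a reduced stiffness (the simulated damage), and the undamaged model's stiffness parameter is tuned until its modal frequencies match. This is a hypothetical toy with a frequency-mismatch objective, not EDF's modal-energy method.

    ```python
    import math

    def modal_freqs(k1, k2, m=1.0):
        """Natural frequencies (rad/s) of a 2-DOF spring-mass chain with unit masses.
        Eigenvalues solve lam^2 - (k1 + 2*k2)/m * lam + k1*k2/m^2 = 0."""
        tr, det = (k1 + 2 * k2) / m, k1 * k2 / m**2
        disc = math.sqrt(tr**2 - 4 * det)
        return [math.sqrt((tr - disc) / 2), math.sqrt((tr + disc) / 2)]

    def update_stiffness(measured, k1, k2_grid):
        """Pick the k2 whose model frequencies best match the measured ones
        (least-squares mismatch over a simple parameter grid)."""
        def err(k2):
            return sum((f - g) ** 2 for f, g in zip(modal_freqs(k1, k2), measured))
        return min(k2_grid, key=err)

    # 'Measured' modes come from a damaged structure whose k2 dropped to 0.7
    measured = modal_freqs(k1=1.0, k2=0.7)
    k2_est = update_stiffness(measured, k1=1.0, k2_grid=[i / 100 for i in range(10, 151)])
    ```

    The tuned parameter recovers the simulated damage level; the abstract's point is that with realistic noise and a shell model, changes confined to a few percent of the structure fall below what such tuning can resolve.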

  6. WNP-2 core model upgrade

    International Nuclear Information System (INIS)

    Golightly, C.E.; Ravindranath, T.K.; Belblidia, L.A.; O'Farrell, D.; Andersen, P.S.

    2006-01-01

    The paper describes the core model upgrade of the WNP-2 training simulator and the reasons for the upgrade. The core model, as well as its interface with the rest of the simulator, is briefly described. The paper also describes the procedure that will be used by WNP-2 to update the simulator core data after future core reloads. Results from the fully integrated simulator are presented. (author)

  7. The diagnostic value of specific IgE to Ara h 2 to predict peanut allergy in children is comparable to a validated and updated diagnostic prediction model.

    Science.gov (United States)

    Klemans, Rob J B; Otte, Dianne; Knol, Mirjam; Knol, Edward F; Meijer, Yolanda; Gmelig-Meyling, Frits H J; Bruijnzeel-Koomen, Carla A F M; Knulst, André C; Pasmans, Suzanne G M A

    2013-01-01

    A diagnostic prediction model for peanut allergy in children was recently published, using 6 predictors: sex, age, history, skin prick test, peanut specific immunoglobulin E (sIgE), and total IgE minus peanut sIgE. To validate this model and update it by adding allergic rhinitis, atopic dermatitis, and sIgE to peanut components Ara h 1, 2, 3, and 8 as candidate predictors. To develop a new model based only on sIgE to peanut components. Validation was performed by testing discrimination (diagnostic value) with an area under the receiver operating characteristic curve and calibration (agreement between predicted and observed frequencies of peanut allergy) with the Hosmer-Lemeshow test and a calibration plot. The performance of the (updated) models was similarly analyzed. Validation of the model in 100 patients showed good discrimination (88%) but poor calibration. The updated model retained 4 predictors from the original model: sex, skin prick test, peanut sIgE, and total IgE minus sIgE. When building a model with sIgE to peanut components, Ara h 2 was the only predictor, with a discriminative ability of 90%. Cutoff values with 100% positive and negative predictive values could be calculated for both the updated model and sIgE to Ara h 2. In this way, the outcome of the food challenge could be predicted with 100% accuracy in 59% (updated model) and 50% (Ara h 2) of the patients. Discrimination of the validated model was good; however, calibration was poor. The discriminative ability of Ara h 2 was almost comparable to that of the updated model, containing 4 predictors. With both models, the need for peanut challenges could be reduced by at least 50%. Copyright © 2012 American Academy of Allergy, Asthma & Immunology. Published by Mosby, Inc. All rights reserved.
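    Discrimination, the statistic reported above, is the area under the ROC curve: the probability that a randomly chosen allergic child scores higher on the predictor than a randomly chosen tolerant one. A self-contained sketch with invented sIgE values (the study's data are not reproduced here):

    ```python
    def auc(scores_pos, scores_neg):
        """Area under the ROC curve via the rank (Mann-Whitney) formulation:
        fraction of case/control pairs where the case scores higher (ties count half)."""
        wins = sum((p > n) + 0.5 * (p == n) for p in scores_pos for n in scores_neg)
        return wins / (len(scores_pos) * len(scores_neg))

    # Hypothetical sIgE to Ara h 2 (kU/L) in challenge-positive vs challenge-negative children
    allergic = [5.2, 8.1, 0.9, 12.4, 3.3]
    tolerant = [0.1, 0.4, 1.2, 0.2, 0.05]
    discrimination = auc(allergic, tolerant)
    ```

    An AUC near 0.9, as reported for Ara h 2, means a single well-chosen marker can rank patients almost as well as the full multi-predictor model.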

  8. 78 FR 20148 - Reporting Procedure for Mathematical Models Selected To Predict Heated Effluent Dispersion in...

    Science.gov (United States)

    2013-04-03

    ... procedure acceptable to the NRC staff for providing summary details of mathematical modeling methods used in... NUCLEAR REGULATORY COMMISSION [NRC-2013-0062] Reporting Procedure for Mathematical Models Selected... Regulatory Guide (RG) 4.4, ``Reporting Procedure for Mathematical Models Selected to Predict Heated Effluent...

  9. Slab2 - Updated Subduction Zone Geometries and Modeling Tools

    Science.gov (United States)

    Moore, G.; Hayes, G. P.; Portner, D. E.; Furtney, M.; Flamme, H. E.; Hearne, M. G.

    2017-12-01

    The U.S. Geological Survey database of global subduction zone geometries (Slab1.0) is a highly utilized dataset that has been applied to a wide range of geophysical problems. In 2017, these models have been improved and expanded upon as part of the Slab2 modeling effort. With a new data-driven approach that can be applied to a broader range of tectonic settings and geophysical data sets, we have generated a model set that will serve as a more comprehensive, reliable, and reproducible resource for three-dimensional slab geometries at all of the world's convergent margins. The newly developed framework of Slab2 is guided by: (1) a large integrated dataset, consisting of a variety of geophysical sources (e.g., earthquake hypocenters, moment tensors, active-source seismic survey images of the shallow slab, tomography models, receiver functions, bathymetry, trench ages, and sediment thickness information); (2) a dynamic filtering scheme aimed at constraining incorporated seismicity to only slab-related events; (3) a 3-D data interpolation approach which captures both high-resolution shallow geometries and instances of slab rollback and overlap at depth; and (4) an algorithm which incorporates uncertainties of contributing datasets to identify the most probable surface depth over the extent of each subduction zone. Further layers will also be added to the base geometry dataset, such as historic moment release, earthquake tectonic provenance, and interface coupling. Along with access to several queryable data formats, all components have been wrapped into an open source library in Python, such that suites of updated models can be released as further data becomes available. This presentation will discuss the extent of Slab2 development, as well as the current availability of the model and modeling tools.

  10. Updating Stiffness and Hysteretic Damping Matrices Using Measured Modal Data

    OpenAIRE

    Jiashang Jiang; Yongxin Yuan

    2018-01-01

    A new direct method for the finite element (FE) matrix updating problem in a hysteretic (or material) damping model based on measured incomplete vibration modal data is presented. With this method, the optimally approximated stiffness and hysteretic damping matrices can be easily constructed. The physical connectivity of the original model is preserved and the measured modal data are embedded in the updated model. The numerical results show that the proposed method works well.

  11. Q2/Q3 2017 Solar Industry Update

    Energy Technology Data Exchange (ETDEWEB)

    Feldman, David J. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hoskins, Jack [Dept. of Energy (DOE), Washington DC (United States); Margolis, Robert M. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-10-24

    This technical presentation provides an update on the major trends that occurred in the solar industry in Q2 and Q3 of 2017. Major topics of focus include global and U.S. supply and demand, module and system price, investment trends and business models, and updates on U.S. government programs supporting the solar industry.

  12. Q2/Q3 2016 Solar Industry Update

    Energy Technology Data Exchange (ETDEWEB)

    Feldman, David; Boff, Daniel; Margolis, Robert

    2016-10-11

    This technical presentation provides an update on the major trends that occurred in the solar industry in Q2 and Q3 of 2016. Major topics of focus include global and U.S. supply and demand, module and system price, investment trends and business models, and updates on U.S. government programs supporting the solar industry.

  13. Q3/Q4 2016 Solar Industry Update

    Energy Technology Data Exchange (ETDEWEB)

    Feldman, David; Boff, Daniel; Margolis, Robert

    2016-12-21

    This technical presentation provides an update on the major trends that occurred in the solar industry in Q3 and Q4 of 2016. Major topics of focus include global and U.S. supply and demand, module and system price, investment trends and business models, and updates on U.S. government programs supporting the solar industry.

  14. Q3/Q4 2017 Solar Industry Update

    Energy Technology Data Exchange (ETDEWEB)

    Feldman, David J. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hoskins, Jack [Dept. of Energy (DOE), Washington DC (United States); Margolis, Robert M. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2018-02-15

    This technical presentation provides an update on the major trends that occurred in the solar industry in Q3 and Q4 of 2017. Major topics of focus include global and U.S. supply and demand, module and system price, investment trends and business models, and updates on U.S. government programs supporting the solar industry.

  15. Bayesian updating of reliability of civil infrastructure facilities based on condition-state data and fault-tree model

    International Nuclear Information System (INIS)

    Ching Jianye; Leu, S.-S.

    2009-01-01

    This paper considers a difficult but practical circumstance of civil infrastructure management: deterioration/failure data for the infrastructure system are absent, while only condition-state data for its components are available. The goal is to develop a framework for estimating time-varying reliabilities of civil infrastructure facilities under such a circumstance. A novel method of analyzing time-varying condition-state data that report only the operational/non-operational status of the components is proposed to update the reliabilities of civil infrastructure facilities. The proposed method assumes that degradation arrivals can be modeled as a Poisson process with unknown time-varying arrival rate and damage impact, and that the target system can be represented as a fault-tree model. To accommodate large uncertainties, a Bayesian algorithm is proposed, and the reliability of the infrastructure system can be quickly updated based on the condition-state data. Use of the new method is demonstrated with a real-world example of a hydraulic spillway gate system.
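    For intuition about updating an unknown Poisson arrival rate from inspection data, here is the conjugate Gamma-Poisson special case with invented numbers. The paper's actual algorithm handles a full fault-tree model with a time-varying rate, so this is only a minimal sketch of the underlying Bayesian mechanics.

    ```python
    def update_rate(alpha, beta, arrivals, exposure):
        """Conjugate Bayesian update of a Poisson degradation-arrival rate:
        a Gamma(alpha, beta) prior becomes Gamma(alpha + arrivals, beta + exposure)
        after observing `arrivals` events over `exposure` time units."""
        return alpha + arrivals, beta + exposure

    # Prior belief: about 2 arrivals/year (alpha=2, beta=1); inspections then find
    # 3 non-operational component states over 4 years of exposure.
    a, b = update_rate(2.0, 1.0, arrivals=3, exposure=4.0)
    posterior_mean_rate = a / b   # posterior expected arrivals per year
    ```

    The posterior mean rate (here pulled down from 2 toward the observed 0.75/year) would then feed the component reliabilities that propagate through the fault tree.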

  16. Selective updating of working memory content modulates meso-cortico-striatal activity.

    Science.gov (United States)

    Murty, Vishnu P; Sambataro, Fabio; Radulescu, Eugenia; Altamura, Mario; Iudicello, Jennifer; Zoltick, Bradley; Weinberger, Daniel R; Goldberg, Terry E; Mattay, Venkata S

    2011-08-01

    Accumulating evidence from non-human primates and computational modeling suggests that dopaminergic signals arising from the midbrain (substantia nigra/ventral tegmental area) mediate striatal gating of the prefrontal cortex during the selective updating of working memory. Using event-related functional magnetic resonance imaging, we explored the neural mechanisms underlying the selective updating of information stored in working memory. Participants were scanned during a novel working memory task that parses the neurophysiology underlying working memory maintenance, overwriting, and selective updating. Analyses revealed a functionally coupled network consisting of a midbrain region encompassing the substantia nigra/ventral tegmental area, caudate, and dorsolateral prefrontal cortex that was selectively engaged during working memory updating compared to the overwriting and maintenance of working memory content. Further analysis revealed differential midbrain-dorsolateral prefrontal interactions during selective updating between low-performing and high-performing individuals. These findings highlight the role of this meso-cortico-striatal circuitry during the selective updating of working memory in humans, which complements previous research in behavioral neuroscience and computational modeling. Published by Elsevier Inc.

  17. Procedural Optimization Models for Multiobjective Flexible JSSP

    Directory of Open Access Journals (Sweden)

    Elena Simona NICOARA

    2013-01-01

    Full Text Available The most challenging issues related to manufacturing efficiency occur if the jobs to be scheduled are structurally different, if these jobs allow flexible routings on the equipment, and if multiple objectives are required. This framework, called Multi-objective Flexible Job Shop Scheduling Problems (MOFJSSP), applicable to many real processes, has been less reported in the literature than the JSSP framework, which has been extensively formalized, modeled and analyzed from many perspectives. MOFJSSPs lie, like many other NP-hard problems, in the difficult territory where optimization theory meets the real-world context. The paper discusses the optimization models best suited to MOFJSSP and analyzes in detail genetic algorithms and agent-based models as the most appropriate procedural models.
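    As a minimal illustration of the genetic-algorithm approach the paper favors, the sketch below evolves a job permutation for a deliberately simplified single-machine total-tardiness problem. Real MOFJSSP instances add flexible machine routings and multiple objectives; all data and GA parameters here are invented.

    ```python
    import random

    def tardiness(order, jobs):
        """Total tardiness of a job sequence; jobs = [(processing_time, due_date), ...]."""
        t = total = 0
        for j in order:
            t += jobs[j][0]
            total += max(0, t - jobs[j][1])
        return total

    def ga_schedule(jobs, pop_size=30, generations=200, seed=1):
        """Tiny permutation GA: keep the fitter half each generation and create
        children by swap mutation of the survivors (a deliberately minimal variant
        without crossover)."""
        rng = random.Random(seed)
        pop = [rng.sample(range(len(jobs)), len(jobs)) for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=lambda p: tardiness(p, jobs))
            survivors = pop[:pop_size // 2]
            children = []
            for parent in survivors:
                child = parent[:]
                i, j = rng.randrange(len(child)), rng.randrange(len(child))
                child[i], child[j] = child[j], child[i]  # swap mutation
                children.append(child)
            pop = survivors + children
        return min(pop, key=lambda p: tardiness(p, jobs))

    jobs = [(3, 4), (2, 2), (5, 10), (1, 3)]  # (processing time, due date)
    best = ga_schedule(jobs)
    ```

    On this 4-job instance exhaustive search shows the minimum total tardiness is 3, which the GA reaches easily; the design choice worth noting is elitism (survivors carry over unchanged), which guarantees the best-found schedule is never lost.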

  18. Comments on the Updated Tetrapartite Pallium Model in the Mouse and Chick, Featuring a Homologous Claustro-Insular Complex.

    Science.gov (United States)

    Puelles, Luis

    2017-01-01

    This essay reviews step by step the conceptual changes of the updated tetrapartite pallium model from its tripartite and early tetrapartite antecedents. The crucial observations in mouse material are explained first in the context of assumptions, tentative interpretations, and literature data. Errors and the solutions offered to resolve them are made explicit. Next, attention is centered on the lateral pallium sector of the updated model, whose definition is novel in incorporating a claustro-insular complex distinct from both olfactory centers (ventral pallium) and the isocortex (dorsal pallium). The general validity of the model is postulated at least for tetrapods. Genoarchitectonic studies performed to check the presence of a claustro-insular field homolog in the avian brain are reviewed next. These studies have indeed revealed the existence of such a complex in the avian mesopallium (though stratified outside-in rather than inside-out as in mammals), and there are indications that the same pattern may be found in reptiles as well. Peculiar pallio-pallial tangential migratory phenomena are apparently shared as well between mice and chicks. The issue of whether the avian mesopallium has connections that are similar to the known connections of the mammalian claustro-insular complex is considered next. Accrued data are consistent with similar connections for the avian insula homolog, but they are judged to be insufficient to reach definitive conclusions about the avian claustrum. An aside discusses that conserved connections are not a necessary feature of field-homologous neural centers. Finally, the present scenario on the evolution of the pallium of sauropsids and mammals is briefly visited, as highlighted by the updated tetrapartite model and present results. © 2017 S. Karger AG, Basel.

  19. Automatic Rapid Updating of ATR Target Knowledge Bases

    National Research Council Canada - National Science Library

    Wells, Barton

    1999-01-01

    .... Methods of comparing infrared images with CAD model renderings, including object detection, feature extraction, object alignment, match quality evaluation, and CAD model updating are researched and analyzed...

  20. An incremental procedure model for e-learning projects at universities

    Directory of Open Access Journals (Sweden)

    Pahlke, Friedrich

    2006-11-01

E-learning projects at universities are produced under different conditions than in industry. The main characteristic of many university projects is that they are realized essentially single-handedly. In private industry, by contrast, the different interdisciplinary skills that are necessary for the development of e-learning are typically supplied by a multimedia agency. A specific procedure tailored for use at universities is therefore required to help master the number and complexity of the tasks. In this paper an incremental procedure model is presented which describes the proceeding in every phase of the project. It allows a high degree of flexibility and emphasizes the didactical concept rather than the technical implementation. In the second part, we illustrate the practical use of the theoretical procedure model based on the project “Online training in Genetic Epidemiology”.

  1. Updating Stiffness and Hysteretic Damping Matrices Using Measured Modal Data

    Directory of Open Access Journals (Sweden)

    Jiashang Jiang

    2018-01-01

A new direct method for the finite element (FE) matrix updating problem in a hysteretic (or material) damping model based on measured incomplete vibration modal data is presented. With this method, the optimally approximated stiffness and hysteretic damping matrices can be easily constructed. The physical connectivity of the original model is preserved and the measured modal data are embedded in the updated model. The numerical results show that the proposed method works well.
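    The direct-updating idea in this record — correcting an analytical stiffness matrix so that it reproduces measured modes while remaining symmetric and close to the original — can be sketched with the classical Berman–Nagy least-change update. This is a generic reference-model formula, not necessarily the method of the paper, and all matrices below are synthetic:

```python
import numpy as np

def berman_nagy_update(K, M, Phi, Lam):
    """Least-change symmetric stiffness update so that K_u @ Phi = M @ Phi @ Lam.

    K, M : analytical stiffness/mass matrices (n x n, symmetric)
    Phi  : measured mass-normalised mode shapes (n x m), Phi.T @ M @ Phi = I
    Lam  : diagonal matrix of measured eigenvalues (m x m)
    """
    MP = M @ Phi
    return (K - K @ Phi @ MP.T - MP @ Phi.T @ K
              + MP @ (Phi.T @ K @ Phi) @ MP.T + MP @ Lam @ MP.T)

# toy example: 4-DOF system, identity mass matrix, 2 "measured" modes
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
K = A @ A.T + 4 * np.eye(4)                        # analytical stiffness (SPD)
M = np.eye(4)                                      # mass matrix
Phi, _ = np.linalg.qr(rng.standard_normal((4, 2))) # mass-normalised shapes
Lam = np.diag([2.0, 5.0])                          # measured eigenvalues

Ku = berman_nagy_update(K, M, Phi, Lam)
```

By construction the update embeds the measured modal data (Ku reproduces the measured eigenpairs exactly) while every correction term is symmetric, which is the same pair of requirements the abstract states.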

  2. New Procedure to Develop Lumped Kinetic Models for Heavy Fuel Oil Combustion

    KAUST Repository

    Han, Yunqing

    2016-09-20

A new procedure to develop accurate lumped kinetic models for complex fuels is proposed and applied to experimental thermogravimetry data for heavy fuel oil. The new procedure is based on pseudocomponents representing different reaction stages, which are determined by a systematic optimization process to ensure that the different reaction stages are separated with the highest accuracy. The procedure was implemented, and the model prediction was compared against that from a conventional method, yielding significantly improved agreement with the experimental data. © 2016 American Chemical Society.

  3. Atmospheric release model for the E-area low-level waste facility: Updates and modifications

    International Nuclear Information System (INIS)

    None, None

    2017-01-01

    The atmospheric release model (ARM) utilizes GoldSim® Monte Carlo simulation software (GTG, 2017) to evaluate the flux of gaseous radionuclides as they volatilize from E-Area disposal facility waste zones, diffuse into the air-filled soil pores surrounding the waste, and emanate at the land surface. This report documents the updates and modifications to the ARM for the next planned E-Area PA considering recommendations from the 2015 PA strategic planning team outlined by Butcher and Phifer.
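    The ARM itself is a GoldSim model, but the transport step the abstract describes — gaseous radionuclides diffusing through air-filled soil pores to the land surface — can be sketched with steady-state Fickian diffusion and a Millington–Quirk effective diffusivity. All parameter values below are illustrative, not taken from the report:

```python
def millington_quirk_deff(d_air, porosity, theta_a):
    """Effective gas diffusivity in unsaturated soil (Millington-Quirk).

    d_air    : free-air diffusivity of the gas (m^2/s)
    porosity : total soil porosity (-)
    theta_a  : air-filled porosity (-)
    """
    return d_air * theta_a ** (10.0 / 3.0) / porosity ** 2

def surface_flux(c_waste, c_surface, depth, d_eff):
    """Steady-state 1-D Fickian flux (Bq/m^2/s) from waste zone to surface."""
    return d_eff * (c_waste - c_surface) / depth

# illustrative numbers: tritiated water vapour through 3 m of cover soil
d_eff = millington_quirk_deff(d_air=1.2e-5, porosity=0.4, theta_a=0.25)
flux = surface_flux(c_waste=5.0e3, c_surface=0.0, depth=3.0, d_eff=d_eff)
```

Tortuosity sharply reduces the effective diffusivity (here by roughly a factor of 16 relative to free air), which is why the air-filled porosity of the cover soil dominates the emanation rate in models of this kind.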

  4. Measuring online learning systems success: applying the updated DeLone and McLean model.

    Science.gov (United States)

    Lin, Hsiu-Fen

    2007-12-01

    Based on a survey of 232 undergraduate students, this study used the updated DeLone and McLean information systems success model to examine the determinants for successful use of online learning systems (OLS). The results provided an expanded understanding of the factors that measure OLS success. The results also showed that system quality, information quality, and service quality had a significant effect on actual OLS use through user satisfaction and behavioral intention to use OLS.

  5. Atmospheric release model for the E-area low-level waste facility: Updates and modifications

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2017-11-16

    The atmospheric release model (ARM) utilizes GoldSim® Monte Carlo simulation software (GTG, 2017) to evaluate the flux of gaseous radionuclides as they volatilize from E-Area disposal facility waste zones, diffuse into the air-filled soil pores surrounding the waste, and emanate at the land surface. This report documents the updates and modifications to the ARM for the next planned E-Area PA considering recommendations from the 2015 PA strategic planning team outlined by Butcher and Phifer.

  6. Summary report of third research coordination meeting on updated decay data library for actinides

    International Nuclear Information System (INIS)

    Kellett, M.A.

    2009-07-01

    The third meeting of the Coordinated Research Project on 'Updated Decay Data Library for Actinides' was held at the IAEA, Vienna on 8-10 October 2008. A summary of the presentations made by each participant is given, along with subsequent discussions. The evaluation procedure was reviewed, and a short tutorial session was given on the use of software adopted from the Decay Data Evaluation Project (DDEP). The list of radionuclides under review and evaluation was updated, along with their agreed allocation amongst participants. (author)

  7. Update on scalar singlet dark matter

    NARCIS (Netherlands)

    Cline, J.M.; Scott, P.; Kainulainen, K.; Weniger, C.

    2013-01-01

One of the simplest models of dark matter is where a scalar singlet field S comprises some or all of the dark matter and interacts with the standard model through a |H|²S² coupling to the Higgs boson. We update the present limits on the model from LHC searches for

  8. Using radar altimetry to update a large-scale hydrological model of the Brahmaputra river basin

    DEFF Research Database (Denmark)

    Finsen, F.; Milzow, Christian; Smith, R.

    2014-01-01

Measurements of river and lake water levels from space-borne radar altimeters (past missions include ERS, Envisat, Jason, Topex) are useful for calibration and validation of large-scale hydrological models in poorly gauged river basins. Altimetry data availability over the downstream reaches of the Brahmaputra is excellent (17 high-quality virtual stations from ERS-2, 6 from Topex and 10 from Envisat are available for the Brahmaputra). In this study, altimetry data are used to update a large-scale Budyko-type hydrological model of the Brahmaputra river basin in real time. Altimetry measurements improved model performance considerably. The Nash-Sutcliffe model efficiency increased from 0.77 to 0.83. Real-time river basin modelling using radar altimetry has the potential to improve the predictive capability of large-scale hydrological models elsewhere on the planet.
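    The kind of real-time updating reported here can be illustrated with a toy bucket model whose storage is nudged toward synthetic "observed" water levels, with the Nash–Sutcliffe efficiency (NSE) quantifying the gain. All numbers are synthetic; this is a generic nudging scheme, not the Budyko model of the study:

```python
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is perfect, 0 matches the mean of obs."""
    sim, obs = np.asarray(sim), np.asarray(obs)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def bucket(rain, k, s0, obs=None, gain=0.0):
    """Linear-reservoir bucket model; optionally nudge storage toward obs."""
    s, qs, ss = s0, [], []
    for t, r in enumerate(rain):
        s += r
        q = k * s                     # outflow proportional to storage
        s -= q
        if obs is not None:
            s += gain * (obs[t] - s)  # real-time updating step
        qs.append(q); ss.append(s)
    return np.array(qs), np.array(ss)

rng = np.random.default_rng(1)
rain = rng.gamma(2.0, 2.0, 300)                        # "true" forcing
rain_noisy = np.clip(rain + rng.normal(0.0, 1.0, 300), 0.0, None)

q_true, s_true = bucket(rain, k=0.3, s0=10.0)          # synthetic truth
q_open, _ = bucket(rain_noisy, k=0.3, s0=25.0)         # no updating
q_upd, _ = bucket(rain_noisy, k=0.3, s0=25.0, obs=s_true, gain=0.7)
```

With noisy forcing and a wrong initial storage, the nudged run recovers a higher NSE than the open-loop run, mirroring the 0.77-to-0.83 improvement reported in the abstract.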

  9. Update schemes of multi-velocity floor field cellular automaton for pedestrian dynamics

    Science.gov (United States)

    Luo, Lin; Fu, Zhijian; Cheng, Han; Yang, Lizhong

    2018-02-01

Modeling pedestrian movement is an interesting problem both in statistical physics and in computational physics. Update schemes of cellular automaton (CA) models for pedestrian dynamics govern the schedule of pedestrian movement. Different update schemes usually make the models behave in different ways, which must be carefully recalibrated. Thus, in this paper, we investigated the influence of four different update schemes, namely the parallel/synchronous scheme, random scheme, ordered-sequential scheme and shuffled scheme, on pedestrian dynamics. The multi-velocity floor field cellular automaton (FFCA), which considers the changes of pedestrians' moving properties along walking paths and the heterogeneity of pedestrians' walking abilities, was used. For the parallel scheme only, collision detection and resolution must be considered, resulting in a great difference from the other update schemes. For pedestrian evacuation, the evacuation time is enlarged, and the difference in pedestrians' walking abilities is better reflected, under the parallel scheme. In front of a bottleneck, for example an exit, using a parallel scheme leads to a longer congestion period and a more dispersive density distribution. The exit flow and the space-time distributions of density and velocity show significant discrepancies under the four different update schemes when we simulate pedestrian flow with high desired velocity. Update schemes may have no influence on the simulated pedestrians' tendency to follow others, but the sequential and shuffled update schemes may enhance the effect of pedestrians' familiarity with their environment.
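    The scheme-dependence the abstract describes can be seen even in a deliberately minimal 1-D corridor CA (far simpler than the multi-velocity FFCA of the paper): under a parallel update, agents blocked in the old configuration must wait a full step, so evacuation takes longer than under a shuffled per-sweep update. In this single-file setting no two agents can target the same cell, so the conflict resolution that a 2-D parallel scheme needs does not arise:

```python
import random

def evacuate(start_cells, length, scheme, rng):
    """1-D corridor CA: agents hop one cell toward the exit past cell length-1.
    Returns the number of time steps until the corridor is empty."""
    pos = list(start_cells)                    # one entry per agent
    t = 0
    while pos:
        t += 1
        if scheme == "parallel":
            occ = set(pos)
            new = []
            for x in pos:                      # all decide on the OLD config
                nxt = x + 1
                if nxt >= length:
                    continue                   # agent exits the corridor
                new.append(nxt if nxt not in occ else x)
            pos = new
        else:  # "shuffled": each agent updated once per sweep, random order
            occ = set(pos)
            order = list(range(len(pos)))
            rng.shuffle(order)
            for i in order:
                x, nxt = pos[i], pos[i] + 1
                if nxt >= length:
                    occ.discard(x); pos[i] = None
                elif nxt not in occ:
                    occ.discard(x); occ.add(nxt); pos[i] = nxt
            pos = [x for x in pos if x is not None]
    return t

rng = random.Random(42)
t_par = evacuate(range(10), 20, "parallel", rng)
t_shuf = evacuate(range(10), 20, "shuffled", rng)
```

A compact queue "melts" one agent per step under the parallel rule, while a lucky shuffled order lets whole chains advance in one sweep, so t_par is an upper bound on t_shuf, consistent with the enlarged evacuation times the abstract reports for the parallel scheme.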

  10. An Update to the NASA Reference Solar Sail Thrust Model

    Science.gov (United States)

    Heaton, Andrew F.; Artusio-Glimpse, Alexandra B.

    2015-01-01

    An optical model of solar sail material originally derived at JPL in 1978 has since served as the de facto standard for NASA and other solar sail researchers. The optical model includes terms for specular and diffuse reflection, thermal emission, and non-Lambertian diffuse reflection. The standard coefficients for these terms are based on tests of 2.5 micrometer Kapton sail material coated with 100 nm of aluminum on the front side and chromium on the back side. The original derivation of these coefficients was documented in an internal JPL technical memorandum that is no longer available. Additionally more recent optical testing has taken place and different materials have been used or are under consideration by various researchers for solar sails. Here, where possible, we re-derive the optical coefficients from the 1978 model and update them to accommodate newer test results and sail material. The source of the commonly used value for the front side non-Lambertian coefficient is not clear, so we investigate that coefficient in detail. Although this research is primarily designed to support the upcoming NASA NEA Scout and Lunar Flashlight solar sail missions, the results are also of interest to the wider solar sail community.
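    The optical model the abstract refers to has a widely reproduced closed form for the force on a flat sail, with terms for specular and diffuse reflection, non-Lambertian scattering, and front/back thermal emission. The sketch below uses that standard form with the coefficients commonly quoted for the 1978 JPL sail material; the exact values re-derived in this paper may differ:

```python
import math

def sail_force(P, A, alpha, r=0.88, s=0.94, Bf=0.79, Bb=0.55, ef=0.05, eb=0.55):
    """Normal/tangential force components (N) of a flat non-ideal solar sail.

    P: solar radiation pressure (N/m^2), A: sail area (m^2),
    alpha: cone angle between sail normal and sunline (rad),
    r: total reflectivity, s: specular fraction, Bf/Bb: front/back
    non-Lambertian coefficients, ef/eb: front/back emissivities.
    Defaults are the commonly quoted 1978 JPL sail-material coefficients.
    """
    ca, sa = math.cos(alpha), math.sin(alpha)
    fn = P * A * ca * ((1 + r * s) * ca
                       + Bf * (1 - s) * r
                       + (1 - r) * (ef * Bf - eb * Bb) / (ef + eb))
    ft = P * A * (1 - r * s) * ca * sa
    return fn, ft

# ideal-sail limit: a perfect specular reflector gives F_n = 2*P*A*cos^2(alpha)
fn_ideal, ft_ideal = sail_force(P=4.563e-6, A=100.0, alpha=0.3, r=1.0, s=1.0)
fn_real, ft_real = sail_force(P=4.563e-6, A=100.0, alpha=0.3)
```

The ideal limit (r = s = 1) is a useful sanity check: the tangential component vanishes and the normal force reduces to twice the incident photon pressure times the projected area.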

  11. Design and development for updating national 1:50,000 topographic databases in China

    Directory of Open Access Journals (Sweden)

    CHEN Jun

    2010-02-01

1.1 Objective Map databases are an irreplaceable national treasure of immense importance. Their currency, i.e. their consistency with the real world, plays a critical role in their value and applications. The continuous updating of map databases at 1:50,000 scale is a massive and difficult task for large countries covering several million square kilometers. This paper presents the research and technological development supporting the national map updating at 1:50,000 scale in China, including the development of updating models and methods, production tools and systems for large-scale and rapid updating, as well as the design and implementation of the continuous updating workflow. 1.2 Methodology The updating of map databases differs from their original creation, and a number of new problems must be solved, such as change detection using the latest multi-source data, incremental object revision and relation amendment. The methodology of this paper consists of the following three parts: (1) Examine the four key aspects of map database updating and develop basic updating models and methods, such as currentness-oriented integration of multi-resource data, completeness-based incremental change detection in the context of existing datasets, consistency-aware processing of updated data sets, and user-friendly propagation and services of updates. (2) Design and develop specific software tools and packages to support large-scale updating production with high-resolution imagery and large-scale data generalization, such as map generalization, GIS-supported change interpretation from imagery, DEM interpolation, image matching-based orthophoto generation, and data control at different levels. (3) Design a national 1:50,000 database updating strategy and its production workflow, including a full-coverage updating pattern characterized by all-element topographic data modeling, change detection in all related areas, and whole process

  12. Software Updating in Wireless Sensor Networks: A Survey and Lacunae

    Directory of Open Access Journals (Sweden)

    Cormac J. Sreenan

    2013-11-01

Wireless Sensor Networks are moving out of the laboratory and into the field. For a number of reasons there is often a need to update sensor node software, or node configuration, after deployment. The need for over-the-air updates is driven both by the scale of deployments, and by the remoteness and inaccessibility of sensor nodes. This need has been recognized since the early days of sensor networks, and research results from the related areas of mobile networking and distributed systems have been applied to this area. In order to avoid any manual intervention, the update process needs to be autonomous. This paper presents a comprehensive survey of software updating in Wireless Sensor Networks, and analyses the features required to make these updates autonomous. A new taxonomy of software update features and a new model for fault detection and recovery are presented. The paper concludes by identifying the lacunae relating to autonomous software updates, providing direction for future research.

  13. Experimental test of spatial updating models for monkey eye-head gaze shifts.

    Directory of Open Access Journals (Sweden)

    Tom J Van Grootel

How the brain maintains an accurate and stable representation of visual target locations despite the occurrence of saccadic gaze shifts is a classical problem in oculomotor research. Here we test and dissociate the predictions of different conceptual models for head-unrestrained gaze-localization behavior of macaque monkeys. We adopted the double-step paradigm with rapid eye-head gaze shifts to measure localization accuracy in response to flashed visual stimuli in darkness. We presented the second target flash either before (static) or during (dynamic) the first gaze displacement. In the dynamic case the brief visual flash induced a small retinal streak of up to about 20 deg at an unpredictable moment and retinal location during the eye-head gaze shift, which poses serious challenges for the gaze-control system. However, for both stimulus conditions, monkeys localized the flashed targets with accurate gaze shifts, which rules out several models of visuomotor control. First, these findings exclude the possibility that gaze-shift programming relies on retinal inputs only. Instead, they support the notion that accurate eye-head motor feedback updates the gaze-saccade coordinates. Second, in dynamic trials the visuomotor system cannot rely on the coordinates of the planned first eye-head saccade either, which rules out remapping on the basis of a predictive corollary gaze-displacement signal. Finally, because gaze-related head movements were also goal-directed, requiring continuous access to eye-in-head position, we propose that our results best support a dynamic feedback scheme for spatial updating in which visuomotor control incorporates accurate signals about instantaneous eye and head positions rather than relative eye and head displacements.

  14. Foothills model forest grizzly bear study : project update

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-01-01

    This report updates a five year study launched in 1999 to ensure the continued healthy existence of grizzly bears in west-central Alberta by integrating their needs into land management decisions. The objective was to gather better information and to develop computer-based maps and models regarding grizzly bear migration, habitat use and response to human activities. The study area covers 9,700 square km in west-central Alberta where 66 to 147 grizzly bears exist. During the first 3 field seasons, researchers captured and radio collared 60 bears. Researchers at the University of Calgary used remote sensing tools and satellite images to develop grizzly bear habitat maps. Collaborators at the University of Washington used trained dogs to find bear scat which was analyzed for DNA, stress levels and reproductive hormones. Resource Selection Function models are being developed by researchers at the University of Alberta to identify bear locations and to see how habitat is influenced by vegetation cover and oil, gas, forestry and mining activities. The health of the bears is being studied by researchers at the University of Saskatchewan and the Canadian Cooperative Wildlife Health Centre. The study has already advanced the scientific knowledge of grizzly bear behaviour. Preliminary results indicate that grizzlies continue to find mates, reproduce and gain weight and establish dens. These are all good indicators of a healthy population. Most bear deaths have been related to poaching. The study will continue for another two years. 1 fig.

  15. Updating sea spray aerosol emissions in the Community Multiscale Air Quality (CMAQ) model version 5.0.2

    OpenAIRE

    Gantt, B.; Kelly, J. T.; Bash, J. O.

    2015-01-01

    Sea spray aerosols (SSAs) impact the particle mass concentration and gas-particle partitioning in coastal environments, with implications for human and ecosystem health. Model evaluations of SSA emissions have mainly focused on the global scale, but regional-scale evaluations are also important due to the localized impact of SSAs on atmospheric chemistry near the coast. In this study, SSA emissions in the Community Multiscale Air Quality (CMAQ) model were updated to enhance the...

  16. Co-operation and Phase Behavior under the Mixed Updating Rules

    International Nuclear Information System (INIS)

    Zhang Wen; Li Yao-Sheng; Xu Chen

    2015-01-01

We present a model by considering two updating rules when the agents play the prisoner's dilemma on a square lattice. Agents can update their strategies by referencing one of their neighbors of higher payoff under the imitation updating rule, or be directly replaced by one of their neighbors according to the death-birth updating rule. The frequency of co-operation is related to the probability q of occurrence of the imitation updating or the death-birth updating and the game parameter b. The death-birth updating rule favors co-operation while the imitation updating rule favors defection on the lattice, although both rules suppress co-operation in the well-mixed population. Therefore a totally co-operative state may emerge when the death-birth updating is involved in the evolution and b is relatively small. We also obtain a phase diagram on the q-b plane. There are three phases on the plane: two pure phases, a totally co-operative state and a totally defective state, and a mixing phase of mixed strategies. Based on the pair approximation, we theoretically analyze the phase behavior and obtain a quantitative agreement with the simulation results. (paper)
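    A schematic simulation of the mixed-rule setup is easy to write down: each elementary step applies imitation with probability q (copy a randomly chosen richer neighbor) and death-birth replacement otherwise (here taken as payoff-proportional choice among neighbors, a common convention that may differ in detail from the paper). Payoffs follow the weak prisoner's dilemma: mutual co-operation pays 1, defecting against a co-operator pays b, everything else pays 0:

```python
import numpy as np

def payoffs(S, b):
    """Total payoff of each site on a periodic square lattice (S: 1=C, 0=D)."""
    pay = np.zeros_like(S, dtype=float)
    for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        N = np.roll(S, shift, axis=(0, 1))
        pay += np.where(S == 1, N * 1.0, N * b)   # C vs C -> 1, D vs C -> b
    return pay

def step(S, b, q, rng):
    """One asynchronous update: imitation with prob. q, death-birth otherwise."""
    L = S.shape[0]
    pay = payoffs(S, b)
    i, j = rng.integers(L), rng.integers(L)
    nbrs = [((i + 1) % L, j), ((i - 1) % L, j), (i, (j + 1) % L), (i, (j - 1) % L)]
    if rng.random() < q:                 # imitation: copy a richer random neighbor
        ni, nj = nbrs[rng.integers(4)]
        if pay[ni, nj] > pay[i, j]:
            S[i, j] = S[ni, nj]
    else:                                # death-birth: replaced by a neighbor
        w = np.array([pay[x] for x in nbrs]) + 1e-9   # chosen prop. to payoff
        ni, nj = nbrs[rng.choice(4, p=w / w.sum())]
        S[i, j] = S[ni, nj]

def coop_frequency(L=16, b=1.05, q=0.5, sweeps=40, seed=7):
    rng = np.random.default_rng(seed)
    S = rng.integers(0, 2, size=(L, L))
    for _ in range(sweeps * L * L):
        step(S, b, q, rng)
    return S.mean()

f = coop_frequency()
```

Sweeping q and b with this kind of simulation is how the q-b phase diagram in the abstract is mapped out numerically.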

  17. Correct coding for laboratory procedures during assisted reproductive technology cycles.

    Science.gov (United States)

    2016-04-01

    This document provides updated coding information for services related to assisted reproductive technology procedures. This document replaces the 2012 ASRM document of the same name. Copyright © 2016 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  18. A single model procedure for tank calibration function estimation

    International Nuclear Information System (INIS)

    York, J.C.; Liebetrau, A.M.

    1995-01-01

    Reliable tank calibrations are a vital component of any measurement control and accountability program for bulk materials in a nuclear reprocessing facility. Tank volume calibration functions used in nuclear materials safeguards and accountability programs are typically constructed from several segments, each of which is estimated independently. Ideally, the segments correspond to structural features in the tank. In this paper the authors use an extension of the Thomas-Liebetrau model to estimate the entire calibration function in a single step. This procedure automatically takes significant run-to-run differences into account and yields an estimate of the entire calibration function in one operation. As with other procedures, the first step is to define suitable calibration segments. Next, a polynomial of low degree is specified for each segment. In contrast with the conventional practice of constructing a separate model for each segment, this information is used to set up the design matrix for a single model that encompasses all of the calibration data. Estimation of the model parameters is then done using conventional statistical methods. The method described here has several advantages over traditional methods. First, modeled run-to-run differences can be taken into account automatically at the estimation step. Second, no interpolation is required between successive segments. Third, variance estimates are based on all the data, rather than that from a single segment, with the result that discontinuities in confidence intervals at segment boundaries are eliminated. Fourth, the restrictive assumption of the Thomas-Liebetrau method, that the measured volumes be the same for all runs, is not required. Finally, the proposed methods are readily implemented using standard statistical procedures and widely-used software packages
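    The single-model idea — one design matrix spanning all calibration segments, estimated in a single least-squares operation — can be illustrated with a hypothetical two-segment tank. The segment break, coefficients and data below are invented for the sketch; this is not the Thomas-Liebetrau model itself:

```python
import numpy as np

# synthetic calibration data: volume vs. liquid height, two segments
rng = np.random.default_rng(0)
h = np.sort(rng.uniform(0.0, 2.0, 80))        # measured heights (m)
BREAK = 1.0                                   # segment boundary (m), e.g. a
                                              # change in tank cross-section
true = np.where(h < BREAK, 3.0 * h, 3.0 * BREAK + 5.0 * (h - BREAK))
v = true + rng.normal(0.0, 0.01, h.size)      # noisy volume readings (m^3)

# single design matrix for a continuous piecewise-linear calibration function:
# columns = [h, max(h - BREAK, 0)]; one lstsq call estimates both segments
X = np.column_stack([h, np.maximum(h - BREAK, 0.0)])
coef, *_ = np.linalg.lstsq(X, v, rcond=None)

def calibration(height):
    """Estimated volume at a given liquid height."""
    return coef[0] * height + coef[1] * max(height - BREAK, 0.0)
```

Because both segments sit in one model, the fitted function is automatically continuous at the boundary and its variance estimate uses all the data — the two advantages the abstract highlights over fitting each segment separately.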

  19. The nuclear licensing procedure

    International Nuclear Information System (INIS)

    Wagner, H.

    1976-01-01

    To begin with, the present nuclear licensing procedure is illustrated by a diagram. The relationship between the state and the Laender, the various experts (GRS - IRS + LRA -, TUEV, DWD, university institutes, firms of consulting engineers, etc), participation of the public, e.g. publication of the relevant documents, questions, objections (made by individuals or by groups such as citizens' initiatives), public discussion, official notice, appeals against the decision, the right of immediate execution of the decision are shortly dealt with. Finally, ways to improve the licensing procedure are discussed, from the evaluation of the documents to be submitted, published, and examined by the authorities (and their experts) up to an improvement of the administrative procedure. An improved licensing procedure should satisfy the well-founded claims of the public for more transparency as well as the equally justifiable claims of industry and utilities in order to ensure that the citizens' legal right to have safe and adequate electric power is guaranteed. The updated energy programme established by the Federal Government is mentioned along with the effectiveness of dealing with nuclear problems on the various levels of a Land government. (orig.) [de

  20. Subgrid-scale scalar flux modelling based on optimal estimation theory and machine-learning procedures

    Science.gov (United States)

    Vollant, A.; Balarac, G.; Corre, C.

    2017-09-01

New procedures are explored for the development of models in the context of large eddy simulation (LES) of a passive scalar. They rely on the combination of optimal estimator theory with machine-learning algorithms. The concept of the optimal estimator makes it possible to identify the most accurate set of parameters to be used when deriving a model. The model itself can then be defined by training an artificial neural network (ANN) on a database derived from filtering direct numerical simulation (DNS) results. This procedure leads to a subgrid-scale model displaying good structural performance, which allows LESs to be performed very close to the filtered DNS results. However, this first procedure does not control the functional performance, so the model can fail when the flow configuration differs from the training database. Another procedure is then proposed, in which the model's functional form is imposed and the ANN is used only to define the model coefficients. The training step is a bi-objective optimisation designed to control both structural and functional performance. The model derived from this second procedure proves to be more robust. It also provides stable LESs for a turbulent plane jet flow configuration very far from the training database, but over-estimates the mixing process in that case.
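    The optimal estimator of a target quantity given a set of inputs is the conditional expectation E[target | inputs]; its residual (the irreducible error) ranks candidate parameter sets before any model is trained. A binned numpy sketch of that ranking step, on synthetic data rather than the LES database of the paper:

```python
import numpy as np

def irreducible_error(x, y, bins=32):
    """Residual variance of the optimal estimator E[y|x], estimated by
    binning x into equal-probability bins and taking conditional means."""
    edges = np.quantile(x, np.linspace(0.0, 1.0, bins + 1))
    idx = np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)
    cond_mean = np.array([y[idx == k].mean() for k in range(bins)])
    return np.mean((y - cond_mean[idx]) ** 2)

rng = np.random.default_rng(0)
g = rng.standard_normal(20000)     # a resolved-scale quantity that actually
y = np.tanh(g) + 0.05 * rng.standard_normal(20000)   # drives the target term
z = rng.standard_normal(20000)     # an irrelevant candidate input

err_g = irreducible_error(g, y)    # small: g is an informative parameter
err_z = irreducible_error(z, y)    # large: z carries no information about y
```

A parameter set with a low irreducible error is worth feeding to the ANN; one whose irreducible error approaches the variance of the target cannot yield an accurate model no matter how the network is trained, which is the selection logic the abstract describes.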

  1. Optimizing dynamic downscaling in one-way nesting using a regional ocean model

    Science.gov (United States)

    Pham, Van Sy; Hwang, Jin Hwan; Ku, Hyeyun

    2016-10-01

Dynamical downscaling with nested regional oceanographic models has been demonstrated to be an effective approach both for operational marine weather forecasting on regional scales and for projections of future climate change and its impact on the ocean. However, when nesting procedures are carried out in dynamic downscaling from a larger-scale model or set of observations to a smaller scale, errors are unavoidable due to the differences in grid sizes and updating intervals. The present work assesses the impact of errors produced by nesting procedures on the downscaled results from Ocean Regional Circulation Models (ORCMs). Errors are identified and evaluated based on their sources and characteristics by employing the Big-Brother Experiment (BBE). The BBE uses the same model to produce both the nesting and the nested simulations, so it addresses those error sources separately (i.e., without combining the contributions of errors from different sources). Here, we focus on discussing errors resulting from differences in spatial grids, updating intervals and domain sizes. After the BBE was run separately for diverse cases, a Taylor diagram was used to analyze the results and recommend an optimal combination of grid size, updating period and domain size. Finally, the suggested setups for the downscaling were evaluated by examining the spatial correlations of variables and the relative magnitudes of variances between the nested model and the original data.

  2. Working memory updating occurs independently of the need to maintain task-context: accounting for triggering updating in the AX-CPT paradigm.

    Science.gov (United States)

    Kessler, Yoav; Baruchin, Liad J; Bouhsira-Sabag, Anat

    2017-01-01

    Theoretical models suggest that maintenance and updating are two functional states of working memory (WM), which are controlled by a gate between perceptual information and WM representations. Opening the gate enables updating WM with input, while closing it enables keeping the maintained information shielded from interference. However, it is still unclear when gate opening takes place, and what is the external signal that triggers it. A version of the AX-CPT paradigm was used to examine a recent proposal in the literature, suggesting that updating is triggered whenever the maintenance of the context is necessary for task performance (context-dependent tasks). In four experiments using this paradigm, we show that (1) a task-switching cost takes place in both context-dependent and context-independent trials; (2) task-switching is additive to the dependency effect, and (3) unlike switching cost, the dependency effect is not affected by preparation and, therefore, does not reflect context-updating. We suggest that WM updating is likely to be triggered by a simple mechanism that occurs in each trial of the task regardless of whether maintaining the context is needed or not. The implications for WM updating and its relationship to task-switching are discussed.

  3. Likelihood updating of random process load and resistance parameters by monitoring

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Ditlevsen, Ove Dalager

    2003-01-01

Spectral parameters for a stationary Gaussian process are most often estimated by Fourier transformation of a realization followed by some smoothing procedure. This smoothing is often a weighted least square fitting of some prespecified parametric form of the spectrum. In this paper it is shown that maximum likelihood estimation is a rational alternative to an arbitrary weighting for least square fitting. The derived likelihood function gets singularities if the spectrum is prescribed with zero values at some frequencies. This is often the case for models of technically relevant processes. The likelihood function, even though it is of complicated mathematical form, allows an approximate Bayesian updating and control of the time development of the parameters. Some of these parameters can be structural parameters that by too much change reveal progressing damage or other malfunctioning. Thus current process
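    As a toy version of likelihood-based spectrum fitting, the Whittle approximation treats periodogram ordinates as independent exponential variables with mean S(f); maximizing it for a flat (white-noise) spectrum S(f) = sigma^2 gives the periodogram mean as the estimate. This illustrates the general idea of replacing an arbitrary least-squares weighting with a likelihood, not the paper's specific load/resistance formulation:

```python
import numpy as np

def periodogram(x):
    """Periodogram ordinates (zero frequency excluded)."""
    n = x.size
    I = np.abs(np.fft.rfft(x)) ** 2 / n
    return I[1:]

def whittle_nll(sigma2, I):
    """Whittle negative log-likelihood for a flat spectrum S(f) = sigma2.
    Note the log term: it is singular wherever the model spectrum is zero,
    the same singularity issue the abstract points out."""
    return np.sum(np.log(sigma2) + I / sigma2)

rng = np.random.default_rng(0)
x = rng.normal(0.0, 2.0, 4096)           # white noise, true variance 4

I = periodogram(x)
sigma2_hat = I.mean()                    # closed-form Whittle MLE

# confirm the closed form by minimising the likelihood on a grid
grid = np.linspace(2.0, 8.0, 601)
best = grid[np.argmin([whittle_nll(s, I) for s in grid])]
```

For richer parametric spectra there is no closed form and the likelihood is minimised numerically, but the weighting of each frequency then comes from the likelihood itself rather than from an arbitrary least-squares choice.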

  4. Eastern US seismic hazard characterization update

    International Nuclear Information System (INIS)

    Savy, J.B.; Boissonnade, A.C.; Mensing, R.W.; Short, C.M.

    1993-06-01

In January 1989, LLNL published the results of a multi-year project, funded by NRC, on estimating seismic hazard at nuclear plant sites east of the Rockies. The goal of this study was twofold: to develop a good central estimate (median) of the seismic hazard and to characterize the uncertainty in the estimates of this hazard. In 1989, LLNL was asked by DOE to develop site-specific estimates of the seismic hazard at the Savannah River Site (SRS) in South Carolina as part of the New Production Reactor (NPR) project. For the purpose of the NPR, a complete review of the methodology and of the data acquisition process was performed. Work done under the NPR project has shown that a first-order improvement in the estimates of the uncertainty (i.e., lower mean hazard values) could be easily achieved by updating the modeling of the seismicity and ground motion attenuation uncertainty. To this effect, NRC sponsored LLNL to perform a re-elicitation to update the seismicity and ground motion experts' inputs and to revise methods to combine seismicity and ground motion inputs in the seismic hazard analysis for nuclear power plant sites east of the Rocky Mountains. The objective of the recent study was to include the first-order improvements that reflect the latest knowledge in seismicity and ground motion modeling and to produce an update of all the hazard results produced in the 1989 study. In particular, it had been demonstrated that eliciting seismicity information in terms of rates of earthquakes rather than a- and b-values, and changing the elicitation format to a one-on-one interview, improved our ability to express the uncertainty of earthquake rates of occurrence at large magnitudes. Thus, NRC sponsored this update study to refine the model of uncertainty, to re-elicit the experts' interpretations of the zonation and seismicity, and to re-elicit the ground motion models, based on the current state of knowledge.

  5. The Pajarito Site operating procedures for the Los Alamos Critical Experiments Facility

    International Nuclear Information System (INIS)

    Malenfant, R.E.

    1991-12-01

    Operating procedures consistent with DOE Order 5480.6, and the American National Standard Safety Guide for the Performance of Critical Experiments are defined for the Los Alamos Critical Experiments Facility (LACEF) of the Los Alamos National Laboratory. These operating procedures supersede and update those previously published in 1983 and apply to any criticality experiment performed at the facility. 11 refs

  6. A Logic of Blockchain Updates

    OpenAIRE

    Brünnler, Kai; Flumini, Dandolo; Studer, Thomas

    2017-01-01

    Blockchains are distributed data structures that are used to achieve consensus in systems for cryptocurrencies (like Bitcoin) or smart contracts (like Ethereum). Although blockchains gained a lot of popularity recently, there is no logic-based model for blockchains available. We introduce BCL, a dynamic logic to reason about blockchain updates, and show that BCL is sound and complete with respect to a simple blockchain model.

  7. Crucial role of strategy updating for coexistence of strategies in interaction networks

    Science.gov (United States)

    Zhang, Jianlei; Zhang, Chunyan; Cao, Ming; Weissing, Franz J.

    2015-04-01

    Network models are useful tools for studying the dynamics of social interactions in a structured population. After a round of interactions with the players in their local neighborhood, players update their strategy based on the comparison of their own payoff with the payoff of one of their neighbors. Here we show that the assumptions made on strategy updating are of crucial importance for the strategy dynamics. In the first step, we demonstrate that seemingly small deviations from the standard assumptions on updating have major implications for the evolutionary outcome of two cooperation games: cooperation can more easily persist in a Prisoner's Dilemma game, while it can go more easily extinct in a Snowdrift game. To explain these outcomes, we develop a general model for the updating of states in a network that allows us to derive conditions for the steady-state coexistence of states (or strategies). The analysis reveals that coexistence crucially depends on the number of agents consulted for updating. We conclude that updating rules are as important for evolution on a network as network structure and the nature of the interaction.
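    The dependence of the outcome on the updating rule can be illustrated with a toy simulation. The sketch below is illustrative only (assumptions: a ring network, a standard Prisoner's Dilemma payoff matrix, and an imitate-if-better rule consulting a single random neighbor; this is not the authors' exact model):

```python
import random

def play_pd(s1, s2, R=3, S=0, T=5, P=1):
    """Prisoner's Dilemma payoff to the first player (standard ordering T>R>P>S)."""
    if s1 == 'C':
        return R if s2 == 'C' else S
    return T if s2 == 'C' else P

def neighborhood_payoff(strategies, i):
    """Payoff accumulated by player i against its two ring neighbors."""
    n = len(strategies)
    left, right = strategies[(i - 1) % n], strategies[(i + 1) % n]
    return play_pd(strategies[i], left) + play_pd(strategies[i], right)

def update_round(strategies, rng):
    """Each player consults ONE random neighbor and imitates it if it did better."""
    n = len(strategies)
    payoffs = [neighborhood_payoff(strategies, i) for i in range(n)]
    new = list(strategies)
    for i in range(n):
        j = (i + rng.choice([-1, 1])) % n  # the single consulted neighbor
        if payoffs[j] > payoffs[i]:
            new[i] = strategies[j]
    return new

rng = random.Random(42)
strategies = [rng.choice('CD') for _ in range(50)]
for _ in range(20):
    strategies = update_round(strategies, rng)
print(strategies.count('C'), strategies.count('D'))
```

    Changing how many neighbors are consulted inside `update_round` is exactly the kind of seemingly small modification that the abstract identifies as decisive for whether strategies coexist.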

  8. The plant operating procedure information modeling system for creation and maintenance of procedures

    International Nuclear Information System (INIS)

    Fanto, S.V.; Petras, D.S.; Reiner, R.T.; Frost, D.R.; Orendi, R.G.

    1990-01-01

    This paper reports that, as a result of the accident at Three Mile Island, regulatory requirements were issued to upgrade Emergency Operating Procedures (EOPs) for nuclear power plants. The use of human-factored, function-oriented EOPs was mandated to improve human reliability and to mitigate the consequences of a broad range of initiating events, subsequent failures and operator errors, without having to first diagnose the specific events. The Westinghouse Owners Group responded by developing the Emergency Response Guidelines (ERGs) in a human-factored, two-column format to aid in the transfer of the improved technical information to the operator during transients and accidents. The ERGs are a network of 43 interrelated guidelines which specify operator actions to be taken during plant emergencies to restore the plant to a safe and stable condition. Each utility then translates these guidelines into plant-specific EOPs. The creation and maintenance of this large web of interconnecting ERGs/EOPs is an extremely complex task. In order to aid procedure documentation specialists with this time-consuming and tedious task, the Plant Operating Procedure Information Modeling system was developed to provide a controlled and consistent means to build and maintain the ERGs/EOPs and their supporting documentation

  9. Q4 2017/Q1 2018 Solar Industry Update

    Energy Technology Data Exchange (ETDEWEB)

    Feldman, David J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Margolis, Robert M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hoskins, Jack [U.S. Department of Energy

    2018-05-16

    This technical presentation provides an update on the major trends that occurred in the solar industry in Q4 2017 and Q1 2018. Major topics of focus include global and U.S. supply and demand, module and system price, investment trends and business models, and updates on U.S. government programs supporting the solar industry.

  10. Memory reconsolidation mediates the updating of hippocampal memory content

    OpenAIRE

    Jonathan L C Lee

    2010-01-01

    The retrieval or reactivation of a memory places it into a labile state, requiring a process of reconsolidation to restabilize it. This retrieval-induced plasticity is a potential mechanism for the modification of the existing memory. Following previous data supportive of a functional role for memory reconsolidation in the modification of memory strength, here I show that hippocampal memory reconsolidation also supports the updating of contextual memory content. Using a procedure that se...

  11. Rapid Update Cycle (RUC) [20 km

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Rapid Update Cycle (RUC) weather forecast model was developed by the National Centers for Environmental Prediction (NCEP). On May 1, 2012, the RUC was replaced...

  12. Rapid Update Cycle (RUC) [13 km

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Rapid Update Cycle (RUC) weather forecast model was developed by the National Centers for Environmental Prediction (NCEP). On May 1, 2012, the RUC was replaced...

  13. New Procedure to Develop Lumped Kinetic Models for Heavy Fuel Oil Combustion

    KAUST Repository

    Han, Yunqing; Elbaz, Ayman M.; Roberts, William L.; Im, Hong G.

    2016-01-01

    A new procedure to develop accurate lumped kinetic models for complex fuels is proposed, and applied to the experimental data of the heavy fuel oil measured by thermogravimetry. The new procedure is based on the pseudocomponents representing

  14. Automated finite element updating using strain data for the lifetime reliability assessment of bridges

    International Nuclear Information System (INIS)

    Okasha, Nader M.; Frangopol, Dan M.; Orcesi, André D.

    2012-01-01

    The importance of improving the understanding of the performance of structures over their lifetime under uncertainty with information obtained from structural health monitoring (SHM) has been widely recognized. However, frameworks that efficiently integrate monitoring data into the life-cycle management of structures are yet to be developed. The objective of this paper is to propose and illustrate an approach for updating the lifetime reliability of aging bridges using monitored strain data obtained from crawl tests. It is proposed to use automated finite element model updating techniques as a tool for updating the resistance parameters of the structure. In this paper, the results from crawl tests are used to update the finite element model and, in turn, update the lifetime reliability. The original and updated lifetime reliabilities are computed using advanced computational tools. The approach is illustrated on an existing bridge.

  15. Egocentric-updating during navigation facilitates episodic memory retrieval.

    Science.gov (United States)

    Gomez, Alice; Rousset, Stéphane; Baciu, Monica

    2009-11-01

    Influential models suggest that spatial processing is essential for episodic memory [O'Keefe, J., & Nadel, L. (1978). The hippocampus as a cognitive map. London: Oxford University Press]. However, although several types of spatial relations exist, such as allocentric (i.e. object-to-object relations), egocentric (i.e. static object-to-self relations) or egocentric updated on navigation information (i.e. self-to-environment relations in a dynamic way), usually only allocentric representations are described as potentially subserving episodic memory [Nadel, L., & Moscovitch, M. (1998). Hippocampal contributions to cortical plasticity. Neuropharmacology, 37(4-5), 431-439]. This study proposes to confront the allocentric representation hypothesis with an egocentric updated with self-motion representation hypothesis. In the present study, we explored retrieval performance in relation to these two types of spatial processing levels during learning. Episodic remembering has been assessed through Remember responses in a recall and in a recognition task, combined with a "Remember-Know-Guess" paradigm [Gardiner, J. M. (2001). Episodic memory and autonoetic consciousness: A first-person approach. Philosophical Transactions of the Royal Society B: Biological Sciences, 356(1413), 1351-1361] to assess the autonoetic level of responses. Our results show that retrieval performance was significantly higher when encoding was performed in the egocentric-updated condition. Although egocentric updated with self-motion and allocentric representations are not mutually exclusive, these results suggest that egocentric updating processing facilitates remember responses more than allocentric processing. The results are discussed according to Burgess and colleagues' model of episodic memory [Burgess, N., Becker, S., King, J. A., & O'Keefe, J. (2001). Memory for events and their spatial context: models and experiments. Philosophical Transactions of the Royal Society of London. Series B

  16. Heat bath method for the twisted Eguchi-Kawai model

    Energy Technology Data Exchange (ETDEWEB)

    Fabricius, K.; Haan, O.

    1984-08-16

    We reformulate the twisted Eguchi-Kawai model in a way that allows us to use the heat bath method for the updating procedure of the link matrices. This new formulation is more efficient by a factor of 2.5 in computer time and 2.3 in memory requirements.

  17. Belief update as social choice

    NARCIS (Netherlands)

    van Benthem, J.; Girard, P.; Roy, O.; Marion, M.

    2011-01-01

    Dynamic epistemic-doxastic logics describe the new knowledge or new beliefs of agents after some informational event has happened. Technically, this requires an update rule that turns a doxastic-epistemic model M (recording the current information state of the agents) and a dynamic ‘event

  18. Inter-firm Networks, Organizational Learning and Knowledge Updating: An Empirical Study

    Science.gov (United States)

    Zhang, Su-rong; Wang, Wen-ping

    In the era of the knowledge-based economy, in which information technology develops rapidly, the rate of knowledge updating has become a critical factor for enterprises to gain competitive advantage. We build an interactional theoretical model among inter-firm networks, organizational learning and knowledge updating, and demonstrate it with an empirical study. The result shows that inter-firm networks and organizational learning are the sources of knowledge updating.

  19. A Procedure for Modeling Photovoltaic Arrays under Any Configuration and Shading Conditions

    Directory of Open Access Journals (Sweden)

    Daniel Gonzalez Montoya

    2018-03-01

    Full Text Available Photovoltaic (PV) arrays can be connected following regular or irregular connection patterns to form regular configurations (e.g., series-parallel, total cross-tied, bridge-linked, etc.) or irregular configurations, respectively. Several reported works propose models for a single configuration; hence, evaluating arrays with different configurations is a considerably time-consuming task. Moreover, if the PV array adopts an irregular configuration, the classical models cannot be used for its analysis. This paper proposes a modeling procedure for PV arrays connected in any configuration and operating under uniform or partial shading conditions. The procedure divides the array into smaller arrays, named sub-arrays, which can be independently solved. The modeling procedure selects the mesh-current solution or the node-voltage solution depending on the topology of each sub-array. Therefore, the proposed approach analyzes the PV array using the least number of nonlinear equations. The proposed solution is validated through simulation and experimental results, which demonstrate the proposed model's capacity to reproduce the electrical behavior of PV arrays connected in any configuration.

  20. Visual perception of procedural textures: identifying perceptual dimensions and predicting generation models.

    Science.gov (United States)

    Liu, Jun; Dong, Junyu; Cai, Xiaoxu; Qi, Lin; Chantler, Mike

    2015-01-01

    Procedural models are widely used in computer graphics for generating realistic, natural-looking textures. However, these mathematical models are not perceptually meaningful, whereas the users, such as artists and designers, would prefer to make descriptions using intuitive and perceptual characteristics like "repetitive," "directional," "structured," and so on. To make up for this gap, we investigated the perceptual dimensions of textures generated by a collection of procedural models. Two psychophysical experiments were conducted: free-grouping and rating. We applied Hierarchical Cluster Analysis (HCA) and Singular Value Decomposition (SVD) to discover the perceptual features used by the observers in grouping similar textures. The results suggested that existing dimensions in literature cannot accommodate random textures. We therefore utilized isometric feature mapping (Isomap) to establish a three-dimensional perceptual texture space which better explains the features used by humans in texture similarity judgment. Finally, we proposed computational models to map perceptual features to the perceptual texture space, which can suggest a procedural model to produce textures according to user-defined perceptual scales.

  1. Visual perception of procedural textures: identifying perceptual dimensions and predicting generation models.

    Directory of Open Access Journals (Sweden)

    Jun Liu

    Full Text Available Procedural models are widely used in computer graphics for generating realistic, natural-looking textures. However, these mathematical models are not perceptually meaningful, whereas the users, such as artists and designers, would prefer to make descriptions using intuitive and perceptual characteristics like "repetitive," "directional," "structured," and so on. To make up for this gap, we investigated the perceptual dimensions of textures generated by a collection of procedural models. Two psychophysical experiments were conducted: free-grouping and rating. We applied Hierarchical Cluster Analysis (HCA) and Singular Value Decomposition (SVD) to discover the perceptual features used by the observers in grouping similar textures. The results suggested that existing dimensions in literature cannot accommodate random textures. We therefore utilized isometric feature mapping (Isomap) to establish a three-dimensional perceptual texture space which better explains the features used by humans in texture similarity judgment. Finally, we proposed computational models to map perceptual features to the perceptual texture space, which can suggest a procedural model to produce textures according to user-defined perceptual scales.

  2. Analyzing longitudinal data with the linear mixed models procedure in SPSS.

    Science.gov (United States)

    West, Brady T

    2009-09-01

    Many applied researchers analyzing longitudinal data share a common misconception: that specialized statistical software is necessary to fit hierarchical linear models (also known as linear mixed models [LMMs], or multilevel models) to longitudinal data sets. Although several specialized statistical software programs of high quality are available that allow researchers to fit these models to longitudinal data sets (e.g., HLM), rapid advances in general purpose statistical software packages have recently enabled analysts to fit these same models when using preferred packages that also enable other more common analyses. One of these general purpose statistical packages is SPSS, which includes a very flexible and powerful procedure for fitting LMMs to longitudinal data sets with continuous outcomes. This article aims to present readers with a practical discussion of how to analyze longitudinal data using the LMMs procedure in the SPSS statistical software package.

  3. Feature Extraction for Structural Dynamics Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Farrar, Charles [Los Alamos National Laboratory; Nishio, Mayuko [Yokohama University; Hemez, Francois [Los Alamos National Laboratory; Stull, Chris [Los Alamos National Laboratory; Park, Gyuhae [Chonnam Univesity; Cornwell, Phil [Rose-Hulman Institute of Technology; Figueiredo, Eloi [Universidade Lusófona; Luscher, D. J. [Los Alamos National Laboratory; Worden, Keith [University of Sheffield

    2016-01-13

    As structural dynamics becomes increasingly non-modal, stochastic and nonlinear, finite element model-updating technology must adopt the broader notions of model validation and uncertainty quantification. For example, particular re-sampling procedures must be implemented to propagate uncertainty through a forward calculation, and non-modal features must be defined to analyze nonlinear data sets. The latter topic is the focus of this report, but first, some more general comments regarding the concept of model validation will be discussed.

  4. Transvaginal mesh procedures for pelvic organ prolapse.

    Science.gov (United States)

    Walter, Jens-Erik

    2011-02-01

    To provide an update on transvaginal mesh procedures, newly available minimally invasive surgical techniques for pelvic floor repair. The discussion is limited to minimally invasive transvaginal mesh procedures. PubMed and Medline were searched for articles published in English, using the key words "pelvic organ prolapse," transvaginal mesh," and "minimally invasive surgery." Results were restricted to systematic reviews, randomized control trials/controlled clinical trials, and observational studies. Searches were updated on a regular basis, and articles were incorporated in the guideline to May 2010. Grey (unpublished) literature was identified through searching the websites of health technology assessment and health technology assessment-related agencies, clinical practice guideline collections, clinical trial registries, and national and international medical specialty societies. The quality of evidence was rated using the criteria described in the Report of the Canadian Task Force on the Preventive Health Care. Recommendations for practice were ranked according to the method described in that report (Table 1). Counselling for the surgical treatment of pelvic organ prolapse should consider all benefits, harms, and costs of the surgical procedure, with particular emphasis on the use of mesh. 1. Patients should be counselled that transvaginal mesh procedures are considered novel techniques for pelvic floor repair that demonstrate high rates of anatomical cure in uncontrolled short-term case series. (II-2B) 2. Patients should be informed of the range of success rates until stronger evidence of superiority is published. (II-2B) 3. Training specific to transvaginal mesh procedures should be undertaken before procedures are performed. (III-C) 4. 
Patients should undergo thorough preoperative counselling regarding (a) the potential serious adverse sequelae of transvaginal mesh repairs, including mesh exposure, pain, and dyspareunia; and (b) the limited data available

  5. Aircraft engine sensor fault diagnostics using an on-line OBEM update method.

    Directory of Open Access Journals (Sweden)

    Xiaofeng Liu

    Full Text Available This paper proposes a method to update the on-line health reference baseline of the On-Board Engine Model (OBEM) to maintain the effectiveness of an in-flight aircraft sensor Fault Detection and Isolation (FDI) system, in which a Hybrid Kalman Filter (HKF) was incorporated. A large health-condition mismatch between the engine and the OBEM, generated by rapid in-flight engine degradation, can corrupt the performance of the FDI. Therefore, it is necessary to update the OBEM online when rapid degradation occurs, but the FDI system will lose estimation accuracy if the estimation and update run simultaneously. To solve this problem, the health reference baseline for a nonlinear OBEM was updated using the proposed channel controller method. Simulations based on the turbojet engine Linear-Parameter-Varying (LPV) model demonstrated the effectiveness of the proposed FDI system in the presence of substantial degradation, and the channel controller can ensure that the update process finishes without interference from a single sensor fault.

  6. [Social determinants of health and disability: updating the model for determination].

    Science.gov (United States)

    Tamayo, Mauro; Besoaín, Álvaro; Rebolledo, Jaime

    Social determinants of health (SDH) are conditions in which people live. These conditions impact their lives, health status and social inclusion level. In line with the conceptual and comprehensive progression of disability, it is important to update SDH due to their broad implications in implementing health interventions in society. This proposal supports incorporating disability in the model as a structural determinant, as it would lead to the same social inclusion/exclusion of people described in other structural SDH. This proposal encourages giving importance to designing and implementing public policies to improve societal conditions and contribute to social equity. This will be an act of reparation, justice and fulfilment with the Convention on the Rights of Persons with Disabilities. Copyright © 2017 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.

  7. An alternative atmospheric diffusion model for control room habitability assessments

    International Nuclear Information System (INIS)

    Ramsdell, J.V. Jr.

    1990-01-01

    The US Nuclear Regulatory Commission (NRC) staff uses procedures to evaluate control room designs for compliance with General Design Criterion 19 of the Code of Federal Regulations, 10 CFR Part 50, Appendix A. These procedures deal primarily with radiation protection. However, other hazardous materials, for example chlorine, pose a potential threat to control room habitability. The NRC is considering changes to its current procedures to update the methods and extend their applicability. Two changes to the current procedures are suggested: using a puff diffusion model to estimate concentrations at air intakes, and using a new method to estimate diffusion coefficients
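    As a rough illustration of the puff approach, the concentration at a receptor such as a control room air intake can be evaluated from the standard Gaussian puff kernel. The sketch below is illustrative only: it ignores ground reflection and puff advection, and the parameter values are not taken from the NRC procedures.

```python
import math

def puff_concentration(q, x, y, z, sx, sy, sz):
    """Concentration (mass/volume) from an instantaneous puff of mass q
    centered at the origin, evaluated at receptor offset (x, y, z), with
    dispersion coefficients (sx, sy, sz); ground reflection is ignored."""
    norm = q / ((2.0 * math.pi) ** 1.5 * sx * sy * sz)
    return norm * math.exp(-0.5 * ((x / sx) ** 2
                                   + (y / sy) ** 2
                                   + (z / sz) ** 2))

# Concentration directly at the puff center vs. 50 m downwind of it
c_center = puff_concentration(1.0, 0.0, 0.0, 0.0, 10.0, 10.0, 5.0)
c_offset = puff_concentration(1.0, 50.0, 0.0, 0.0, 10.0, 10.0, 5.0)
```

    In an actual assessment the puff center would be advected with the wind and the dispersion coefficients would grow with travel time, so the intake concentration becomes a time series rather than a single value.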

  8. Interactive Procedural Modelling of Coherent Waterfall Scenes

    OpenAIRE

    Emilien , Arnaud; Poulin , Pierre; Cani , Marie-Paule; Vimont , Ulysse

    2015-01-01

    Combining procedural generation and user control is a fundamental challenge for the interactive design of natural scenery. This is particularly true for modelling complex waterfall scenes where, in addition to taking charge of geometric details, an ideal tool should also provide a user with the freedom to shape the running streams and falls, while automatically maintaining physical plausibility in terms of flow network, embedding into the terrain, and visual aspects of...

  9. Q4 2016/Q1 2017 Solar Industry Update

    Energy Technology Data Exchange (ETDEWEB)

    Margolis, Robert; Feldman, David; Boff, Daniel

    2017-05-17

    This technical presentation provides an update on the major trends that occurred in the solar industry in the fourth quarter of 2016 and the first quarter of 2017. Major topics of focus include global and U.S. supply and demand, module and system price, investment trends and business models, and updates on U.S. government programs supporting the solar industry.

  10. Identification of cracks in thick beams with a cracked beam element model

    Science.gov (United States)

    Hou, Chuanchuan; Lu, Yong

    2016-12-01

    The effect of a crack on the vibration of a beam is a classical problem, and various models have been proposed, ranging from the basic stiffness reduction method to more sophisticated models formulated in terms of the additional flexibility due to a crack. However, in damage identification or finite element model updating applications, it is still common practice to employ a simple stiffness reduction factor to represent a crack in the identification process, whereas the use of a more realistic crack model is rather limited. In this paper, the issues with the simple stiffness reduction method, particularly concerning thick beams, are highlighted along with a review of several other crack models. A robust finite element model updating procedure is then presented for the detection of cracks in beams. The description of the crack parameters is based on the cracked-beam flexibility formulated by means of fracture mechanics, and it takes into consideration shear deformation and coupling between translational and longitudinal vibrations, and thus is particularly suitable for thick beams. The identification procedure employs a global searching technique using Genetic Algorithms, and there is no restriction on the location, severity and number of cracks to be identified. The procedure is verified to yield satisfactory identification for practically any configuration of cracks in a beam.
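    The global-search idea can be sketched with a toy example. In the fragment below, everything is invented for illustration: the closed-form "frequency" surrogate stands in for the cracked-beam finite element model, and the GA operators are the simplest possible choices, not the authors' algorithm. A basic genetic algorithm recovers a crack's normalized location and severity from two target frequencies:

```python
import random

def model_freqs(loc, sev):
    """Hypothetical surrogate: first two natural frequencies of a beam whose
    stiffness is reduced by severity `sev` at normalized position `loc`."""
    return (10.0 * (1 - 0.3 * sev * (1 - loc)),
            25.0 * (1 - 0.2 * sev * loc))

def fitness(ind, target):
    """Negative squared mismatch between model and 'measured' frequencies."""
    f = model_freqs(*ind)
    return -sum((a - b) ** 2 for a, b in zip(f, target))

def ga(target, rng, pop_size=40, gens=60):
    pop = [(rng.random(), rng.random()) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda ind: fitness(ind, target), reverse=True)
        elite = pop[: pop_size // 2]                 # keep the best half
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            child = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)            # crossover
            child = (min(1.0, max(0.0, child[0] + rng.gauss(0, 0.05))),
                     min(1.0, max(0.0, child[1] + rng.gauss(0, 0.05))))  # mutation
            children.append(child)
        pop = elite + children
    return max(pop, key=lambda ind: fitness(ind, target))

rng = random.Random(1)
true_loc, true_sev = 0.4, 0.6              # "true" crack location and severity
best = ga(model_freqs(true_loc, true_sev), rng)
```

    The same loop applies when `model_freqs` is replaced by a full cracked-beam finite element solve; the GA imposes no restriction on the number of crack parameters in the search vector.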

  11. Deductive Updating Is Not Bayesian

    Science.gov (United States)

    Markovits, Henry; Brisson, Janie; de Chantal, Pier-Luc

    2015-01-01

    One of the major debates concerning the nature of inferential reasoning is between counterexample-based theories such as mental model theory and probabilistic theories. This study looks at conclusion updating after the addition of statistical information to examine the hypothesis that deductive reasoning cannot be explained by probabilistic…

  12. Procedure guideline for thyroid scintigraphy (version 3)

    International Nuclear Information System (INIS)

    Dietlein, M.; Schicha, H.; Eschner, W.; Deutsche Gesellschaft fuer Medizinische Physik; Koeln Univ.; Leisner, B.; Allgemeines Krankenhaus St. Georg, Hamburg; Reiners, C.; Wuerzburg Univ.

    2007-01-01

    The version 3 of the procedure guideline for thyroid scintigraphy is an update of the procedure guideline previously published in 2003. The interpretation of the scintigraphy requires knowledge of the patient's history, palpation of the neck, the laboratory parameters and the sonography. The interpretation of the technetium-99m uptake requires knowledge of the TSH level. As a consequence of the improved alimentary iodine supply, the 99mTc uptake has decreased; 100 000 counts per scintigraphy should be acquired. For this, an imaging time of 10 minutes is generally needed, using a high-resolution collimator for thyroid imaging. (orig.)

  13. Effective updating process of seismic fragilities using Bayesian method and information entropy

    International Nuclear Information System (INIS)

    Kato, Masaaki; Takata, Takashi; Yamaguchi, Akira

    2008-01-01

    Seismic probabilistic safety assessment (SPSA) is an effective method for evaluating overall performance of seismic safety of a plant. Seismic fragilities are estimated to quantify the seismically induced accident sequences. It is a great concern that the SPSA results involve uncertainties, a part of which comes from the uncertainty in the seismic fragility of equipment and systems. A straightforward approach to reduce the uncertainty is to perform a seismic qualification test and to reflect the results on the seismic fragility estimate. In this paper, we propose a figure-of-merit to find the most cost-effective condition of the seismic qualification tests about the acceleration level and number of components tested. Then a mathematical method to reflect the test results on the fragility update is developed. A Bayesian method is used for the fragility update procedure. Since a lognormal distribution that is used for the fragility model does not have a Bayes conjugate function, a parameterization method is proposed so that the posterior distribution expresses the characteristics of the fragility. The information entropy is used as the figure-of-merit to express importance of obtained evidence. It is found that the information entropy is strongly associated with the uncertainty of the fragility. (author)
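    The update step described above can be sketched numerically. The fragment below is a simplified illustration, not the authors' parameterization: the fragility median is discretized on a grid, the lognormal standard deviation β is held fixed, and a single pass/fail qualification test result updates the distribution over the median via Bayes' rule. The information entropy of the posterior then quantifies how informative the test was:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def frag(a, a_m, beta=0.4):
    """Lognormal fragility: failure probability at acceleration a,
    for median capacity a_m and (fixed) log-standard-deviation beta."""
    return norm_cdf(math.log(a / a_m) / beta)

def bayes_update(prior, medians, test_level, survived):
    """Posterior over candidate medians given one pass/fail test outcome."""
    like = [(1.0 - frag(test_level, m)) if survived else frag(test_level, m)
            for m in medians]
    post = [p * l for p, l in zip(prior, like)]
    z = sum(post)
    return [p / z for p in post]

def entropy(p):
    """Information entropy of a discrete distribution (nats)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0.0)

medians = [0.5 + 0.1 * k for k in range(16)]     # candidate median grid [g]
prior = [1.0 / len(medians)] * len(medians)      # uniform prior
post = bayes_update(prior, medians, test_level=1.0, survived=True)
print(entropy(prior), entropy(post))
```

    A surviving test shifts posterior mass toward larger median capacities and lowers the entropy; comparing the entropy drop across candidate test levels is one way to pick the most cost-effective qualification test, in the spirit of the figure-of-merit proposed in the paper.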

  14. Introducing GEOPHIRES v2.0: Updated Geothermal Techno-Economic Simulation Tool

    Energy Technology Data Exchange (ETDEWEB)

    Beckers, Koenraad J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); McCabe, Kevin [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-02-14

    This paper presents an updated version of the geothermal techno-economic simulation tool GEOPHIRES (GEOthermal energy for Production of Heat and electricity ('IR') Economically Simulated). GEOPHIRES combines engineering models of the reservoir, wellbores, and surface plant facilities of a geothermal plant with an economic model to estimate the capital and operation and maintenance costs, lifetime energy production, and overall levelized cost of energy. The available end-use options are electricity, direct-use heat, and cogeneration. The main updates in the new version include conversion of the source code from FORTRAN to Python, the option to import temperature data (e.g., measured or from stand-alone reservoir simulator), updated cost correlations, and more flexibility in selecting the time step and number of injection and production wells. In this paper, we provide an overview of all the updates and two case studies to illustrate the tool's new capabilities.

  15. Heat bath method for the twisted Eguchi-Kawai model

    International Nuclear Information System (INIS)

    Fabricius, K.; Haan, O.

    1984-01-01

    We reformulate the twisted Eguchi-Kawai model in a way that allows us to use the heat bath method for the updating procedure of the link matrices. This new formulation is more efficient by a factor of 2.5 in computer time and 2.3 in memory requirements. (orig.)

  16. Experimental Update of the Overtopping Model Used for the Wave Dragon Wave Energy Converter

    DEFF Research Database (Denmark)

    Parmeggiani, Stefano; Kofoed, Jens Peter; Friis-Madsen, Erik

    2013-01-01

    An overtopping model specifically suited for Wave Dragon is needed in order to improve the reliability of its performance estimates. The model shall be comprehensive of all relevant physical processes that affect overtopping and flexible to adapt to any local conditions and device configuration....... An experimental investigation is carried out to update an existing formulation suited for 2D draft-limited, low-crested structures, in order to include the effects on the overtopping flow of the wave steepness, the 3D geometry of Wave Dragon, the wing reflectors, the device motions and the non-rigid connection...... of which can be measured in real-time. Instead of using new fitting coefficients, this approach allows a broader applicability of the model beyond the Wave Dragon case, to any overtopping WEC or structure within the range of tested conditions. Predictions reliability of overtopping over Wave Dragon...

  17. An Iterative Ensemble Kalman Filter with One-Step-Ahead Smoothing for State-Parameters Estimation of Contaminant Transport Models

    KAUST Repository

    Gharamti, M. E.

    2015-05-11

    The ensemble Kalman filter (EnKF) is a popular method for state-parameters estimation of subsurface flow and transport models based on field measurements. The common filtering procedure is to directly update the state and parameters as one single vector, which is known as the Joint-EnKF. In this study, we follow the one-step-ahead smoothing formulation of the filtering problem to derive a new joint-based EnKF which involves a smoothing step of the state between two successive analysis steps. The new state-parameters estimation scheme is derived in a consistent Bayesian filtering framework and results in separate update steps for the state and the parameters. This new algorithm bears strong resemblance to the Dual-EnKF, but unlike the latter, which first propagates the state with the model and then updates it with the new observation, the proposed scheme starts with an update step, followed by a model integration step. We exploit this new formulation of the joint filtering problem and propose an efficient model-integration-free iterative procedure on the update step of the parameters only, for further improved performance. Numerical experiments are conducted with a two-dimensional synthetic subsurface transport model simulating the migration of a contaminant plume in a heterogeneous aquifer domain. Contaminant concentration data are assimilated to estimate both the contaminant state and the hydraulic conductivity field. Assimilation runs are performed under imperfect modeling conditions and various observational scenarios. Simulation results suggest that the proposed scheme efficiently recovers both the contaminant state and the aquifer conductivity, providing more accurate estimates than the standard Joint and Dual EnKFs in all tested scenarios. Iterating on the update step of the new scheme further enhances the proposed filter’s behavior. In terms of computational cost, the new Joint-EnKF is almost equivalent to that of the Dual-EnKF, but requires twice more model
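    For readers unfamiliar with the EnKF machinery being compared, the analysis (update) step can be sketched for the simplest possible case. The fragment below is an illustrative perturbed-observation update for a scalar, directly observed state; it is not the proposed one-step-ahead smoothing scheme, and the toy numbers are assumptions:

```python
import random

def enkf_update(ensemble, obs, obs_var, rng):
    """Perturbed-observation EnKF analysis step for a scalar state that is
    observed directly: each member is nudged toward a perturbed copy of the
    observation by the Kalman gain computed from the ensemble spread."""
    n = len(ensemble)
    mean = sum(ensemble) / n
    var = sum((x - mean) ** 2 for x in ensemble) / (n - 1)  # forecast variance
    gain = var / (var + obs_var)                            # Kalman gain
    return [x + gain * (obs + rng.gauss(0.0, obs_var ** 0.5) - x)
            for x in ensemble]

rng = random.Random(0)
prior = [rng.gauss(5.0, 2.0) for _ in range(200)]   # forecast ensemble
post = enkf_update(prior, obs=1.0, obs_var=0.5, rng=rng)
post_mean = sum(post) / len(post)
```

    In the Joint-EnKF the state vector would also carry the parameters (here, conductivities) so that this single update corrects both; the scheme in the paper instead separates the state and parameter updates and inserts a smoothing step between analyses.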

  18. Updates to the Demographic and Spatial Allocation Models to Produce Integrated Climate and Land Use Scenarios (ICLUS) (Version 2) (External Review Draft)

    Science.gov (United States)

    EPA announced the availability of the draft report, Updates to the Demographic and Spatial Allocation Models to Produce Integrated Climate and Land Use Scenarios (ICLUS) for a 30-day public comment period. The ICLUS version 2 (v2) modeling tool furthered land change mod...

  19. Finite element model updating of multi-span steel-arch-steel-girder bridges based on ambient vibrations

    Science.gov (United States)

    Hou, Tsung-Chin; Gao, Wei-Yuan; Chang, Chia-Sheng; Zhu, Guan-Rong; Su, Yu-Min

    2017-04-01

    The three-span steel-arch-steel-girder Jiaxian Bridge was newly constructed in 2010 to replace the former bridge, which had been destroyed by Typhoon Sinlaku (2008, Taiwan). It was designed and built to continue the domestic service requirement, as well as to improve the tourism business of the Kaohsiung city government, Taiwan. This study aimed at establishing a baseline model of Jiaxian Bridge for simulating hazardous scenarios such as typhoons, floods and earthquakes. The necessity of these precautions is attributable to the inherent vulnerabilities of the site: proximity to a fault and a river crossing. The uncalibrated baseline bridge model was built with structural finite elements in accordance with the blueprints. Ambient vibration measurements were performed repeatedly to acquire the elastic dynamic characteristics of the bridge structure. Two frequency-domain system identification algorithms were employed to extract the measured operational modal parameters. Mode shapes, frequencies, and modal assurance criteria (MAC) were configured as the fitting targets used to calibrate/update the structural parameters of the baseline model. It has been recognized that different types of structural parameters contribute to the fitting targets to distinguishably different degrees, as this study likewise found. For steel-arch-steel-girder bridges, and this case in particular, the joint rigidity of the steel components was found to be dominant, while material properties and section geometries were relatively minor. The updated model is capable of providing more rational elastic responses of the bridge superstructure under normal service conditions as well as hazardous scenarios, and can be used to manage the health condition of the bridge structure.
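    The modal assurance criterion used above as one of the fitting targets measures the consistency between a measured and an analytical mode shape, yielding a value between 0 (orthogonal shapes) and 1 (identical up to scaling). A minimal sketch of the standard MAC formula (not the authors' code):

```python
import numpy as np

def mac(phi_a, phi_b):
    """Modal Assurance Criterion between two mode-shape vectors.

    Returns |phi_a^H phi_b|^2 / ((phi_a^H phi_a)(phi_b^H phi_b)), in [0, 1].
    """
    num = np.abs(phi_a.conj() @ phi_b) ** 2
    den = (phi_a.conj() @ phi_a).real * (phi_b.conj() @ phi_b).real
    return float(num / den)
```

Because the formula is scale-invariant, a measured mode shape need not be mass-normalized before comparison with its finite-element counterpart.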

  20. Update on the Essure System for Permanent Birth Control.

    Science.gov (United States)

    Fantasia, Heidi Collins

    In 2002, the U.S. Food and Drug Administration approved the Essure system for permanent birth control. Implantation with this device offers a minimally invasive option for permanent female contraception that is placed during a brief office visit. Unlike laparoscopic tubal sterilization, the Essure procedure requires no hospitalization or general anesthesia, resulting in minimal recovery time. After a decade of stability in the report of adverse effects, the U.S. Food and Drug Administration noted a sharp increase in patient-reported adverse events, including chronic pelvic pain, irregular bleeding, allergic reactions, and autoimmune-like reactions. In response to this increase in complaints, the U.S. Food and Drug Administration issued updated guidelines for patient education and counseling. This article discusses those updates, as well as implications for nurses who provide health care to women seeking permanent contraception. © 2017 AWHONN, the Association of Women’s Health, Obstetric and Neonatal Nurses.

  1. A Procedure for Building Product Models in Intelligent Agent-based Operations Management

    DEFF Research Database (Denmark)

    Hvam, Lars; Riis, Jesper; Malis, Martin

    2003-01-01

    This article presents a procedure for building product models to support the specification processes dealing with sales, design of product variants and production preparation. The procedure includes, as the first phase, an analysis and redesign of the business processes that are to be supported b...

  2. Lagrangian relaxation technique in power systems operation planning: Multipliers updating problem

    Energy Technology Data Exchange (ETDEWEB)

    Ruzic, S. [Electric Power Utility of Serbia, Belgrade (Yugoslavia)

    1995-11-01

    All Lagrangian relaxation based approaches to power systems operation planning share an important common part: the Lagrangian multipliers correction procedure. It is the subject of this paper. Different approaches presented in the literature are discussed and an original method for updating the Lagrangian multipliers is proposed. The basic idea of this new method is to update the Lagrangian multipliers so as to satisfy the Kuhn-Tucker optimality conditions. Instead of maximizing the dual function, a "distance of optimality" function is defined and minimized. If the Kuhn-Tucker optimality conditions are satisfied, the value of this function lies in the range (-1,0); otherwise the function takes a large positive value. This method, called "the distance of optimality method", takes into account future changes in planned generation due to the updating of the Lagrangian multipliers. The influence of changes in a multiplier associated with one system constraint on the satisfaction of other system requirements is also considered. The numerical efficiency of the proposed method is analyzed and compared with results obtained using the sub-gradient technique. 20 refs, 2 tabs
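    For contrast with the proposed method, the baseline sub-gradient multiplier update is easy to sketch: each multiplier moves in proportion to the violation of its constraint and is clamped at zero. The demand/dispatch framing below is an illustrative assumption in the spirit of unit-commitment planning, not the paper's formulation:

```python
def update_multipliers(lam, demand, dispatch, step):
    """One sub-gradient ascent step on the dual multipliers.

    For each period, the sub-gradient of the dual function is the
    constraint violation (demand minus scheduled generation); the
    multiplier rises when demand is unmet, falls otherwise, and is
    projected back to the non-negative orthant.
    """
    return [max(0.0, l + step * (d - g))
            for l, d, g in zip(lam, demand, dispatch)]
```

A fixed or diminishing step size is the usual choice here; the paper's "distance of optimality" method replaces this rule with one that anticipates how the planned generation responds to the multiplier change.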

  3. The Updated BaSTI Stellar Evolution Models and Isochrones. I. Solar-scaled Calculations

    DEFF Research Database (Denmark)

    Hidalgo, Sebastian L.; Pietrinferni, Adriano; Cassisi, Santi

    2018-01-01

    We present an updated release of the BaSTI (a Bag of Stellar Tracks and Isochrones) stellar model and isochrone library for a solar-scaled heavy element distribution. The main input physics that have been changed from the previous BaSTI release include the solar metal mixture, electron conduction...... to metal enrichment ratio dY/dZ = 1.31. The isochrones cover an age range between 20 Myr and 14.5 Gyr, consistently take into account the pre-main-sequence phase, and have been translated to a large number of popular photometric systems. Asteroseismic properties of the theoretical models have also been...... calculated. We compare our isochrones with results from independent databases and with several sets of observations to test the accuracy of the calculations. All stellar evolution tracks, asteroseismic properties, and isochrones are made available through a dedicated web site....

  4. A Bottom-Up Geospatial Data Update Mechanism for Spatial Data Infrastructure Updating

    Science.gov (United States)

    Tian, W.; Zhu, X.; Liu, Y.

    2012-08-01

    Currently, the top-down spatial data update mechanism has made great progress and is widely applied in many SDIs (spatial data infrastructures). However, this mechanism still has some issues. For example, the update schedule is constrained by the professional department's project cycle, which is usually too long for end-users; moving data from collection to publication costs the professional department considerable time and effort; and the geospatial information does not provide sufficient attribute detail. Thus, finding a way to deal with these problems has become a pressing concern. Emerging Internet technology, 3S techniques and the geographic information knowledge now widespread among the public are promoting the rapid development of volunteered geospatial information in the geosciences. Volunteered geospatial information (VGI) is a current "hotspot" that has attracted many researchers to study its data quality and credibility, accuracy, sustainability, social benefit, applications, and so on. In addition, a few scholars have paid attention to the value of VGI for supporting SDI updating. On that basis, this paper presents a bottom-up update mechanism from VGI to SDI, which includes the processes of matching homonymous elements between VGI and SDI vector data, change detection, SDI spatial database updating, and publication of new data products to end-users. The feasibility of the proposed updating cycle is then discussed in depth: it can detect changed elements in time and shorten the update period, provide more accurate geometry and attribute data for the spatial data infrastructure, and support update propagation.

  5. Using genetic algorithm and TOPSIS for Xinanjiang model calibration with a single procedure

    Science.gov (United States)

    Cheng, Chun-Tian; Zhao, Ming-Yan; Chau, K. W.; Wu, Xin-Yu

    2006-01-01

    Genetic Algorithms (GAs) are globally oriented in their search and thus useful in optimizing multiobjective problems, especially where the objective functions are ill-defined. Conceptual rainfall-runoff models, which aim at predicting streamflow from knowledge of precipitation over a catchment, have become a basic tool for flood forecasting. The parameter calibration of a conceptual model usually involves multiple criteria for judging performance against observed data. However, it is often difficult to combine all objective functions in the parameter calibration problem of a conceptual model. Thus, a new method for the multiple-criteria parameter calibration problem, which combines a GA with TOPSIS (technique for order preference by similarity to ideal solution) for the Xinanjiang model, is presented. This study is an immediate further development of the authors' previous research (Cheng, C.T., Ou, C.P., Chau, K.W., 2002. Combining a fuzzy optimal model with a genetic algorithm to solve multi-objective rainfall-runoff model calibration. Journal of Hydrology, 268, 72-86), whose obvious disadvantages are that it splits the whole procedure into two parts and makes it difficult to grasp the overall behavior of the model during the calibration procedure. The current method integrates the two parts of Xinanjiang rainfall-runoff model calibration, simplifying the procedures of model calibration and validation and readily revealing the intrinsic character of the observed data as a whole. Comparison with the two-step procedure shows that the current methodology gives similar results to the previous method and is likewise feasible and robust, but simpler and easier to apply in practice.
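    The TOPSIS step that ranks candidate parameter sets can be sketched independently of the GA. This is the standard closeness-to-ideal computation with illustrative names, not the authors' implementation:

```python
import numpy as np

def topsis(scores, weights, benefit):
    """Rank alternatives by relative closeness to the ideal solution.

    scores  : (n_alt, n_crit) decision matrix, one row per alternative
    weights : (n_crit,) criterion weights
    benefit : (n_crit,) booleans, True where larger values are better
    Returns a closeness score in [0, 1] per alternative (larger is better).
    """
    # Vector-normalize each criterion column, then apply weights
    v = scores / np.linalg.norm(scores, axis=0) * weights
    # Ideal and anti-ideal points, per criterion direction
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_best = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - anti, axis=1)
    return d_worst / (d_best + d_worst)
```

In the calibration context described above, each "alternative" would be a candidate Xinanjiang parameter set and each criterion one of the calibration objectives.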

  6. SURVEY OF SELECTED PROCEDURES FOR THE INDIRECT DETERMINATION OF THE GROUP REFRACTIVE INDEX OF AIR

    Directory of Open Access Journals (Sweden)

    Filip Dvořáček

    2018-02-01

    The main aim of the research was to evaluate numerical procedures for the indirect determination of the group refractive index of air and to choose those suitable for the requirements of ordinary- and high-accuracy distance measurement in geodesy and length metrology. For this purpose, 10 existing computation methods were derived from the various authors' original publications and all were analysed over wide intervals of wavelengths and atmospheric parameters. The determination of the phase and group refractive indices is an essential part of evaluating the first velocity corrections of laser interferometers and electronic distance meters. The validity of modern procedures was tested with respect to the updated CIPM-2007 equations for the density of air. The refraction model of the Leica AT401 laser tracker was also analysed.
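    As an illustration of one such indirect procedure, the sketch below computes the phase refractive index of standard air from the updated Edlén dispersion formula (Birch and Downs) and derives the group index as n_g = n − λ·dn/dλ via a numerical derivative. This is one classical formula of the kind surveyed, shown only as an example, and it covers standard air only (no temperature, pressure or humidity corrections):

```python
def phase_index_standard_air(lam_um):
    """Phase refractive index of standard air (updated Edlen dispersion
    formula, Birch & Downs); lam_um is the vacuum wavelength in micrometres."""
    s2 = (1.0 / lam_um) ** 2  # squared wavenumber in um^-2
    return 1.0 + 1e-8 * (8342.54
                         + 2406147.0 / (130.0 - s2)
                         + 15998.0 / (38.9 - s2))

def group_index_standard_air(lam_um, h=1e-4):
    """Group refractive index n_g = n - lambda * dn/dlambda, with the
    dispersion term evaluated by a central finite difference of width h."""
    dn = (phase_index_standard_air(lam_um + h)
          - phase_index_standard_air(lam_um - h)) / (2.0 * h)
    return phase_index_standard_air(lam_um) - lam_um * dn
```

Because air exhibits normal dispersion in the visible range, the group index always exceeds the phase index; at the common 633 nm laser wavelength both are about 1.00028.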

  7. Working Memory Updating Latency Reflects the Cost of Switching between Maintenance and Updating Modes of Operation

    Science.gov (United States)

    Kessler, Yoav; Oberauer, Klaus

    2014-01-01

    Updating and maintenance of information are 2 conflicting demands on working memory (WM). We examined the time required to update WM (updating latency) as a function of the sequence of updated and not-updated items within a list. Participants held a list of items in WM and updated a variable subset of them in each trial. Four experiments that vary…

  8. A testing procedure for wind turbine generators based on the power grid statistical model

    DEFF Research Database (Denmark)

    Farajzadehbibalan, Saber; Ramezani, Mohammad Hossein; Nielsen, Peter

    2017-01-01

    In this study, a comprehensive test procedure is developed to test wind turbine generators with a hardware-in-loop setup. The procedure employs the statistical model of the power grid considering the restrictions of the test facility and system dynamics. Given the model in the latent space...

  9. EANM procedural guidelines for radionuclide myocardial perfusion imaging with SPECT and SPECT/CT: 2015 revision

    International Nuclear Information System (INIS)

    Verberne, Hein J.; Eck-Smit, Berthe L.F. van; Wit, Tim C. de; Acampa, Wanda; Anagnostopoulos, Constantinos; Ballinger, Jim; Bengel, Frank; Bondt, Pieter De; Buechel, Ronny R.; Kaufmann, Philip A.; Cuocolo, Alberto; Flotats, Albert; Hacker, Marcus; Hindorf, Cecilia; Lindner, Oliver; Ljungberg, Michael; Lonsdale, Markus; Manrique, Alain; Minarik, David; Scholte, Arthur J.H.A.; Slart, Riemer H.J.A.; Traegaardh, Elin; Hesse, Birger

    2015-01-01

    Since the publication of the European Association of Nuclear Medicine (EANM) procedural guidelines for radionuclide myocardial perfusion imaging (MPI) in 2005, many small and some larger steps of progress have been made, improving MPI procedures. In this paper, the major changes from the updated 2015 procedural guidelines are highlighted, focusing on the important changes related to new instrumentation with improved image information and the possibility to reduce radiation exposure, which is further discussed in relation to the recent developments of new International Commission on Radiological Protection (ICRP) models. Introduction of the selective coronary vasodilator regadenoson and the use of coronary CT-contrast agents for hybrid imaging with SPECT/CT angiography are other important areas for nuclear cardiology that were not included in the previous guidelines. A large number of minor changes have been described in more detail in the fully revised version available at the EANM home page: http://eanm.org/publications/guidelines/2015_07_EANM_FINAL_myocardial_perfusion_guideline.pdf. (orig.)

  10. FRMAC Updates

    International Nuclear Information System (INIS)

    Mueller, P.

    1995-01-01

    This talk describes updates to the following FRMAC publications concerning radiation emergencies: the Monitoring and Analysis Manual; the Evaluation and Assessment Manual; the Handshake Series (biannual), including exercises participated in; the Environmental Data and Instrument Transmission System (EDITS); Plume in a Box, with all radiological data stored on a hand-held computer; and courses given

  11. Olkiluoto surface hydrological modelling: Update 2012 including salt transport modelling

    International Nuclear Information System (INIS)

    Karvonen, T.

    2013-11-01

    Posiva Oy is responsible for implementing a final disposal program for spent nuclear fuel of its owners Teollisuuden Voima Oyj and Fortum Power and Heat Oy. The spent nuclear fuel is planned to be disposed at a depth of about 400-450 meters in the crystalline bedrock at the Olkiluoto site. Leakages located at or close to the spent fuel repository may give rise to the upconing of deep, highly saline groundwater, which is a concern with regard to the performance of the tunnel backfill material after the closure of the tunnels. Therefore a salt transport sub-model was added to the Olkiluoto surface hydrological model (SHYD). The other improvements include an update of the particle tracking algorithm and the possibility to estimate the influence of open drillholes in a case where overpressure in inflatable packers decreases, causing a hydraulic short-circuit between hydrogeological zones HZ19 and HZ20 along the drillhole. Four new hydrogeological zones, HZ056, HZ146, BFZ100 and HZ039, were added to the model. In addition, zones HZ20A and HZ20B intersect with each other in the new structure model, which influences salinity upconing caused by leakages in shafts. The aim of modelling the long-term influence of ONKALO, the shafts and the repository tunnels was to provide computational results that can be used to suggest limits for allowed leakages. The model input data included all the existing leakages into ONKALO (35-38 l/min) and the shafts in present-day conditions. The influence of shafts was computed using eight different values for total shaft leakage: 5, 11, 20, 30, 40, 50, 60 and 70 l/min. The selection of the leakage criteria for the shafts was influenced by the fact that upconing of saline water increases TDS-values close to the repository areas although HZ20B does not intersect any deposition tunnels. The total limit for all leakages was suggested to be 120 l/min. The limit for the HZ20 zones was proposed to be 40 l/min: about 5 l/min the present day leakages to access tunnel, 25 l/min from

  12. Olkiluoto surface hydrological modelling: Update 2012 including salt transport modelling

    Energy Technology Data Exchange (ETDEWEB)

    Karvonen, T. [WaterHope, Helsinki (Finland)

    2013-11-15

    Posiva Oy is responsible for implementing a final disposal program for spent nuclear fuel of its owners Teollisuuden Voima Oyj and Fortum Power and Heat Oy. The spent nuclear fuel is planned to be disposed at a depth of about 400-450 meters in the crystalline bedrock at the Olkiluoto site. Leakages located at or close to the spent fuel repository may give rise to the upconing of deep, highly saline groundwater, which is a concern with regard to the performance of the tunnel backfill material after the closure of the tunnels. Therefore a salt transport sub-model was added to the Olkiluoto surface hydrological model (SHYD). The other improvements include an update of the particle tracking algorithm and the possibility to estimate the influence of open drillholes in a case where overpressure in inflatable packers decreases, causing a hydraulic short-circuit between hydrogeological zones HZ19 and HZ20 along the drillhole. Four new hydrogeological zones, HZ056, HZ146, BFZ100 and HZ039, were added to the model. In addition, zones HZ20A and HZ20B intersect with each other in the new structure model, which influences salinity upconing caused by leakages in shafts. The aim of modelling the long-term influence of ONKALO, the shafts and the repository tunnels was to provide computational results that can be used to suggest limits for allowed leakages. The model input data included all the existing leakages into ONKALO (35-38 l/min) and the shafts in present-day conditions. The influence of shafts was computed using eight different values for total shaft leakage: 5, 11, 20, 30, 40, 50, 60 and 70 l/min. The selection of the leakage criteria for the shafts was influenced by the fact that upconing of saline water increases TDS-values close to the repository areas although HZ20B does not intersect any deposition tunnels. The total limit for all leakages was suggested to be 120 l/min. The limit for the HZ20 zones was proposed to be 40 l/min: about 5 l/min the present day leakages to access tunnel, 25 l/min from

  13. Application and Evaluation of an Expert Judgment Elicitation Procedure for Correlations.

    Science.gov (United States)

    Zondervan-Zwijnenburg, Mariëlle; van de Schoot-Hubeek, Wenneke; Lek, Kimberley; Hoijtink, Herbert; van de Schoot, Rens

    2017-01-01

    The purpose of the current study was to apply and evaluate a procedure to elicit expert judgments about correlations, and to update this information with empirical data. The result is a face-to-face group elicitation procedure whose central element is a trial roulette question that elicits experts' judgments expressed as distributions. During the elicitation procedure, a concordance probability question was used to provide feedback to the experts on their judgments. We evaluated the elicitation procedure in terms of validity and reliability by means of an application with a small sample of experts. Validity means that the elicited distributions accurately represent the experts' judgments. Reliability concerns the consistency of the elicited judgments over time. Four behavioral scientists provided their judgments with respect to the correlation between cognitive potential and academic performance for two separate populations enrolled at a specific school in the Netherlands that provides special education to youth with severe behavioral problems: youth with autism spectrum disorder (ASD), and youth with diagnoses other than ASD. Measures of face-validity, feasibility, convergent validity, coherence, and intra-rater reliability showed promising results. Furthermore, the current study illustrates the use of the elicitation procedure and elicited distributions in a social science application. The elicited distributions were used as a prior for the correlation, and updated with data for both populations collected at the school of interest. The current study shows that the newly developed elicitation procedure combining the trial roulette method with the elicitation of correlations is a promising tool, and that the results of the procedure are useful as prior information in a Bayesian analysis.
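    One simple way to carry out the Bayesian update the study describes is a conjugate normal update on the Fisher-z scale, where a sample correlation from n observations has approximate sampling variance 1/(n − 3). This is a common textbook approximation chosen here for illustration, not necessarily the authors' exact model; the parameter names are assumptions:

```python
import math

def update_correlation_prior(prior_rho, prior_sd_z, r_obs, n_obs):
    """Posterior mean correlation from a normal prior on Fisher-z(rho)
    combined with the approximate likelihood z(r) ~ N(z(rho), 1/(n-3)).

    prior_rho  : prior mean correlation (elicited from experts)
    prior_sd_z : prior standard deviation on the Fisher-z scale
    r_obs      : observed sample correlation
    n_obs      : sample size (must exceed 3)
    """
    z = math.atanh  # Fisher z-transform
    prior_prec = 1.0 / prior_sd_z ** 2
    like_prec = float(n_obs - 3)
    # Precision-weighted average of prior and data on the z scale
    post_mu = (prior_prec * z(prior_rho) + like_prec * z(r_obs)) \
              / (prior_prec + like_prec)
    return math.tanh(post_mu)  # back-transform to the correlation scale
```

As the sample grows, the posterior mean moves from the elicited prior toward the observed correlation, which matches the intended role of the elicited distributions as priors.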

  14. Comparative Effectiveness of Echoic and Modeling Procedures in Language Instruction With Culturally Disadvantaged Children.

    Science.gov (United States)

    Stern, Carolyn; Keislar, Evan

    In an attempt to explore a systematic approach to language expansion and improved sentence structure, echoic and modeling procedures for language instruction were compared. Four hypotheses were formulated: (1) children who use modeling procedures will produce better structured sentences than children who use echoic prompting, (2) both echoic and…

  15. Updating Allergy and/or Hypersensitivity Diagnostic Procedures in the WHO ICD-11 Revision.

    Science.gov (United States)

    Tanno, Luciana Kase; Calderon, Moises A; Li, James; Casale, Thomas; Demoly, Pascal

    2016-01-01

    The classification of allergy and/or hypersensitivity conditions for the World Health Organization (WHO) International Classification of Diseases (ICD)-11 provides the appropriate corresponding codes for allergic diseases, assuming that the final diagnosis is correct. This classification should be linked to in vitro and in vivo diagnostic procedures. Considering the impact on our specialty, we decided to review the codification of these procedures in the ICD, aiming to establish a baseline and to suggest changes and/or submit new proposals. To that end, we prepared a list of the relevant allergy and/or hypersensitivity diagnostic procedures that health care professionals deal with on a daily basis. This was based on the main current guidelines, and we selected all possible and relevant corresponding terms from the ICD-10 (2015 version) and the ICD-11 β phase foundation (June 2015 version). More than 90% of the very specific and important diagnostic procedures currently used by the allergist community on a daily basis are missing. We observed that some concepts used daily by the allergist community are not fully recognized by other specialties. The whole scheme and the correspondences in the ICD-10 (2015 version) and ICD-11 foundation (June 2015 version) gave us a broad picture of the missing or imprecise terms and how they are scattered in the current ICD-11 framework, allowing us to submit new proposals to increase the visibility of allergy and/or hypersensitivity conditions and diagnostic procedures. Copyright © 2016 American Academy of Allergy, Asthma & Immunology. All rights reserved.

  16. 76 FR 38992 - Approval and Promulgation of Air Quality Implementation Plans; Pennsylvania; Update to Materials...

    Science.gov (United States)

    2011-07-05

    ... for Pennsylvania. Since the publication of the last IBR update, EPA has approved the following... revision'' and ``Applicable geographic area'' columns for the entry ``Continuous Source Testing Manual.'' 2... section 553(b)(3)(B) of the Administrative Procedure Act (APA) which, upon finding ``good cause...

  17. AN EFFICIENT SELF-UPDATING FACE RECOGNITION SYSTEM FOR PLASTIC SURGERY FACE

    Directory of Open Access Journals (Sweden)

    A. Devi

    2016-08-01

    A facial recognition system is fundamentally a computer application for the automatic identification of a person from a digitized image or a video source. A major cause of poor overall performance is transformation in the appearance of the user due to aspects such as ageing, beard growth, sun-tan, etc. To overcome this drawback, a self-update process has been developed in which the system learns the biometric attributes of the user every time the user interacts with it, and the information is updated automatically. Plastic surgery procedures provide a skilled and durable means of enhancing facial appearance by correcting anomalies in the features and then treating the facial skin with the aim of producing a youthful look. When plastic surgery is performed on an individual, the features of the face undergo reconstruction either locally or globally. However, the changes newly introduced by plastic surgery remain hard to model with the available face recognition systems, and they deteriorate the performance of face recognition algorithms. Facial plastic surgery thus changes facial features to a large extent and thereby creates a significant challenge for face recognition systems. This work introduces a fresh multimodal biometric approach making use of novel techniques to boost the recognition rate and security. The proposed method consists of several processes: face segmentation using the Active Appearance Model (AAM), face normalization using a Kernel Density Estimate/Point Distribution Model (KDE-PDM), feature extraction using Local Gabor XOR Patterns (LGXP), and classification using Independent Component Analysis (ICA). Efficient techniques have been used in each phase of the FRAS in order to obtain improved results.

  18. Update-in-Place Analysis for True Multidimensional Arrays

    Directory of Open Access Journals (Sweden)

    Steven M. Fitzgerald

    1996-01-01

    Applicative languages have been proposed for defining algorithms for parallel architectures because they are implicitly parallel and lack side effects. However, straightforward implementations of applicative-language compilers may induce large amounts of copying to preserve program semantics. The unnecessary copying of data can increase both the execution time and the memory requirements of an application. To eliminate the unnecessary copying of data, the Sisal compiler uses both build-in-place and update-in-place analyses. These optimizations remove unnecessary array copy operations through compile-time analysis. Both build-in-place and update-in-place are based on hierarchical ragged arrays, i.e., the vector-of-vectors array model. Although this array model is convenient for certain applications, many optimizations are precluded, e.g., vectorization. To compensate for this deficiency, new languages, such as Sisal 2.0, have extended array models that allow both high-level array operations to be performed and efficient implementations to be devised. In this article, we introduce a new method to perform update-in-place analysis that is applicable to arrays stored either in hierarchical or in contiguous storage. Consequently, the array model that is appropriate for an application can be selected without loss of performance. Moreover, our analysis is more amenable to distributed memory and large software systems.
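    The decision that update-in-place analysis makes at compile time (mutate an array in place when it has a single owner, copy it when it is shared) can be mimicked at runtime with an explicit reference count. A toy sketch of that decision rule, purely illustrative and unrelated to the Sisal compiler's actual implementation:

```python
class CowArray:
    """Toy copy-on-write array: updates happen in place only when the
    array is uniquely referenced; otherwise the update copies first."""

    def __init__(self, data):
        self._data = list(data)
        self._refs = 1  # explicit reference count for illustration

    def share(self):
        """Register another logical reference to this array."""
        self._refs += 1
        return self

    def updated(self, i, value):
        """Functional-style update of element i, avoiding copies when safe."""
        if self._refs == 1:
            # Sole owner: semantics are preserved by mutating in place
            self._data[i] = value
            return self
        # Shared: drop one reference, copy, and update the fresh copy
        self._refs -= 1
        fresh = CowArray(self._data)
        fresh._data[i] = value
        return fresh

    def __getitem__(self, i):
        return self._data[i]
```

A compiler performing the analysis statically makes the same in-place-or-copy choice, but without paying for reference counts at runtime.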

  19. Formal Dismissal Procedures Under Illinois Teacher Tenure Laws. Revised Edition.

    Science.gov (United States)

    Jenkins, Newell N.; And Others

    This handbook, an updated and revised version of the 1975 original, contains the new statutory requirements, state board of education rules, and court decisions pertaining to teacher dismissal in Illinois. This handbook is designed to acquaint Illinois school administrators and school board members with the legal procedures necessary in dismissal…

  20. Updating the procedure for metaiodobenzylguanidine labelling with iodine radioisotopes employed in industrial production.

    Science.gov (United States)

    Franceschini, R; Mosca, R; Bonino, C

    1991-01-01

    The classical procedure used for the preparation of [125I]- and [131I]metaiodobenzylguanidine (MIBG) is the solid-phase isotopic exchange between MIBG and radioiodide. This reaction requires 1.5 hours at 160 degrees C to obtain maximum total labelling yields of 75-80%. Recently, the importance of rapid procedures for the preparation of 123I-MIBG has been highlighted. A highly efficient procedure for the industrial production of 123I-MIBG using ascorbic acid, tin sulfate and copper sulfate pentahydrate in 0.01 M sulfuric acid is reported. Sequential radio-TLC analysis of the labelling mixture shows that the labelling yield reaches 98% within 45 min at 100 degrees C. The specific activity of the 123I-MIBG produced in this manner is on the order of 100 Ci/mmol.

  1. Geometric subspace updates with applications to online adaptive nonlinear model reduction

    DEFF Research Database (Denmark)

    Zimmermann, Ralf; Peherstorfer, Benjamin; Willcox, Karen

    2018-01-01

    In many scientific applications, including model reduction and image processing, subspaces are used as ansatz spaces for the low-dimensional approximation and reconstruction of the state vectors of interest. We introduce a procedure for adapting an existing subspace based on information from...... Estimation (GROUSE). We establish for GROUSE a closed-form expression for the residual function along the geodesic descent direction. Specific applications of subspace adaptation are discussed in the context of image processing and model reduction of nonlinear partial differential equation systems....
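    The GROUSE update mentioned above moves an orthonormal basis along a Grassmann geodesic toward each incoming data vector. The sketch below handles only the fully observed case (GROUSE proper works with partially observed vectors) and is an illustrative reconstruction, not the authors' code:

```python
import numpy as np

def grouse_step(U, v, eta):
    """One GROUSE-style rank-one geodesic update of an orthonormal basis
    U (n x d) toward a fully observed data vector v, with step scale eta."""
    w = U.T @ v                       # least-squares weights (U orthonormal)
    p = U @ w                         # projection of v onto span(U)
    r = v - p                         # residual, orthogonal to span(U)
    pn, rn, wn = np.linalg.norm(p), np.linalg.norm(r), np.linalg.norm(w)
    if rn < 1e-12 or wn < 1e-12:
        return U                      # v carries no usable new direction
    t = eta * pn * rn                 # step length along the geodesic
    # Rank-one geodesic move: rotate the projection direction toward
    # the residual direction, leaving the rest of the basis untouched
    step_dir = (np.cos(t) - 1.0) * p / pn + np.sin(t) * r / rn
    return U + np.outer(step_dir, w / wn)
```

The rank-one form of the correction is what makes the update cheap, and the geodesic parametrization keeps the basis orthonormal after every step.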

  2. Conceptual Development of a National Volcanic Hazard Model for New Zealand

    Science.gov (United States)

    Stirling, Mark; Bebbington, Mark; Brenna, Marco; Cronin, Shane; Christophersen, Annemarie; Deligne, Natalia; Hurst, Tony; Jolly, Art; Jolly, Gill; Kennedy, Ben; Kereszturi, Gabor; Lindsay, Jan; Neall, Vince; Procter, Jonathan; Rhoades, David; Scott, Brad; Shane, Phil; Smith, Ian; Smith, Richard; Wang, Ting; White, James D. L.; Wilson, Colin J. N.; Wilson, Tom

    2017-06-01

    We provide a synthesis of a workshop held in February 2016 to define the goals, challenges and next steps for developing a national probabilistic volcanic hazard model for New Zealand. The workshop involved volcanologists, statisticians, and hazards scientists from GNS Science, Massey University, University of Otago, Victoria University of Wellington, University of Auckland, and University of Canterbury. We also outline key activities that will develop the model components, define procedures for periodic update of the model, and effectively articulate the model to end-users and stakeholders. The development of a National Volcanic Hazard Model is a formidable task that will require long-term stability in terms of team effort, collaboration and resources. Development of the model in stages or editions that are modular will make the process a manageable one that progressively incorporates additional volcanic hazards over time, and additional functionalities (e.g. short-term forecasting). The first edition is likely to be limited to updating and incorporating existing ashfall hazard models, with the other hazards associated with lahar, pyroclastic density currents, lava flow, ballistics, debris avalanche, and gases/aerosols being considered in subsequent updates.

  3. Conceptual Development of a National Volcanic Hazard Model for New Zealand

    Directory of Open Access Journals (Sweden)

    Mark Stirling

    2017-06-01

    Full Text Available We provide a synthesis of a workshop held in February 2016 to define the goals, challenges and next steps for developing a national probabilistic volcanic hazard model for New Zealand. The workshop involved volcanologists, statisticians, and hazards scientists from GNS Science, Massey University, University of Otago, Victoria University of Wellington, University of Auckland, and University of Canterbury. We also outline key activities that will develop the model components, define procedures for periodic update of the model, and effectively articulate the model to end-users and stakeholders. The development of a National Volcanic Hazard Model is a formidable task that will require long-term stability in terms of team effort, collaboration, and resources. Development of the model in stages or editions that are modular will make the process a manageable one that progressively incorporates additional volcanic hazards over time, and additional functionalities (e.g., short-term forecasting). The first edition is likely to be limited to updating and incorporating existing ashfall hazard models, with the other hazards associated with lahar, pyroclastic density currents, lava flow, ballistics, debris avalanche, and gases/aerosols being considered in subsequent updates.

  4. 78 FR 15807 - Energy Conservation Program: Test Procedures for Television Sets

    Science.gov (United States)

    2013-03-12

    ... displaying dynamic visual content from wired or wireless sources including but not limited to: * * *''. DOE...) standard ``CEA-2037-A, Determination of Television Average Power Consumption'' into the SNOPR. In today's SNOPR, DOE proposes to update the input power requirements in the TVs test procedure NOPR by referencing...

  5. Metal-rich, Metal-poor: Updated Stellar Population Models for Old Stellar Systems

    Science.gov (United States)

    Conroy, Charlie; Villaume, Alexa; van Dokkum, Pieter G.; Lind, Karin

    2018-02-01

    We present updated stellar population models appropriate for old ages (>1 Gyr) and covering a wide range in metallicities (‑1.5 ≲ [Fe/H] ≲ 0.3). These models predict the full spectral variation associated with individual element abundance variation as a function of metallicity and age. The models span the optical–NIR wavelength range (0.37–2.4 μm), include a range of initial mass functions, and contain the flexibility to vary 18 individual elements including C, N, O, Mg, Si, Ca, Ti, and Fe. To test the fidelity of the models, we fit them to integrated light optical spectra of 41 Galactic globular clusters (GCs). The value of testing models against GCs is that their ages, metallicities, and detailed abundance patterns have been derived from the Hertzsprung–Russell diagram in combination with high-resolution spectroscopy of individual stars. We determine stellar population parameters from fits to all wavelengths simultaneously (“full spectrum fitting”), and demonstrate explicitly with mock tests that this approach produces smaller uncertainties at fixed signal-to-noise ratio than fitting a standard set of 14 line indices. Comparison of our integrated-light results to literature values reveals good agreement in metallicity, [Fe/H]. When restricting to GCs without prominent blue horizontal branch populations, we also find good agreement with literature values for ages, [Mg/Fe], [Si/Fe], and [Ti/Fe].

  6. Evaluating procedural modelling for 3D models of informal settlements in urban design activities

    Directory of Open Access Journals (Sweden)

    Victoria Rautenbach

    2015-11-01

    Full Text Available Three-dimensional (3D) modelling and visualisation is one of the fastest growing application fields in geographic information science. 3D city models are being researched extensively for a variety of purposes and in various domains, including urban design, disaster management, education and computer gaming. These models typically depict urban business districts (downtown) or suburban residential areas. Despite informal settlements being a prevailing feature of many cities in developing countries, 3D models of informal settlements are virtually non-existent. 3D models of informal settlements could be useful in various ways, e.g. to gather information about the current environment in the informal settlements, to design upgrades, to communicate these and to educate inhabitants about environmental challenges. In this article, we describe the development of a 3D model of the Slovo Park informal settlement in the City of Johannesburg Metropolitan Municipality, South Africa. Instead of using time-consuming traditional manual methods, we followed the procedural modelling technique. Visualisation characteristics of 3D models of informal settlements were described and the importance of each characteristic in urban design activities for informal settlement upgrades was assessed. Next, the visualisation characteristics of the Slovo Park model were evaluated. The results of the evaluation showed that the 3D model produced by the procedural modelling technique is suitable for urban design activities in informal settlements. The visualisation characteristics and their assessment are also useful as guidelines for developing 3D models of informal settlements. In future, we plan to empirically test the use of such 3D models in urban design projects in informal settlements.

  7. A Methodology to Detect and Update Active Deformation Areas Based on Sentinel-1 SAR Images

    Directory of Open Access Journals (Sweden)

    Anna Barra

    2017-09-01

    Full Text Available This work is focused on deformation activity mapping and monitoring using Sentinel-1 (S-1) data and the DInSAR (Differential Interferometric Synthetic Aperture Radar) technique. The main goal is to present a procedure to periodically update and assess the geohazard activity (volcanic activity, landslides and ground-subsidence) of a given area by exploiting the wide area coverage and the high coherence and temporal sampling (revisit time up to six days) provided by the S-1 satellites. The main products of the procedure are two updatable maps: the deformation activity map and the active deformation areas map. These maps present two different levels of information aimed at different levels of geohazard risk management, from a very simplified level of information to the classical deformation map based on SAR interferometry. The methodology has been successfully applied to La Gomera, Tenerife and Gran Canaria Islands (Canary Island archipelago). The main obtained results are discussed.

  8. Memory updating and mental arithmetic

    Directory of Open Access Journals (Sweden)

    Cheng-Ching Han

    2016-02-01

    Full Text Available Is domain-general memory updating ability predictive of calculation skills or are such skills better predicted by the capacity for updating specifically numerical information? Here, we used multidigit mental multiplication (MMM) as a measure for calculating skill as this operation requires the accurate maintenance and updating of information in addition to skills needed for arithmetic more generally. In Experiment 1, we found that only individual differences with regard to a task updating numerical information following addition (MUcalc) could predict the performance of MMM, perhaps owing to common elements between the task and MMM. In Experiment 2, new updating tasks were designed to clarify this: a spatial updating task with no numbers, a numerical task with no calculation, and a word task. The results showed that both MUcalc and the spatial task were able to predict the performance of MMM but only with the more difficult problems, while other updating tasks did not predict performance. It is concluded that relevant processes involved in updating the contents of working memory support mental arithmetic in adults.

  9. Update in clinical allergy and immunology.

    Science.gov (United States)

    von Gunten, S; Marsland, B J; von Garnier, C; Simon, D

    2012-12-01

    In the recent years, a tremendous body of studies has addressed a broad variety of distinct topics in clinical allergy and immunology. In this update, we discuss selected recent data that provide clinically and pathogenetically relevant insights or identify potential novel targets and strategies for therapy. The role of the microbiome in shaping allergic immune responses and molecular, as well as cellular mechanisms of disease, is discussed separately and in the context of atopic dermatitis, as an allergic model disease. Besides summarizing novel evidence, this update highlights current areas of uncertainties and debates that, as we hope, shall stimulate scientific discussions and research activities in the field. © 2012 John Wiley & Sons A/S.

  10. The Community WRF-Hydro Modeling System Version 4 Updates: Merging Toward Capabilities of the National Water Model

    Science.gov (United States)

    McAllister, M.; Gochis, D.; Dugger, A. L.; Karsten, L. R.; McCreight, J. L.; Pan, L.; Rafieeinasab, A.; Read, L. K.; Sampson, K. M.; Yu, W.

    2017-12-01

    The community WRF-Hydro modeling system is publicly available and provides researchers and operational forecasters a flexible and extensible capability for performing multi-scale, multi-physics hydrologic modeling that can be run independently of, or fully interactively with, the WRF atmospheric model. The core WRF-Hydro physics model contains very high-resolution descriptions of terrestrial hydrologic process representations such as land-atmosphere exchanges of energy and moisture, snowpack evolution, infiltration, terrain routing, channel routing, basic reservoir representation and hydrologic data assimilation. Complementing the core physics components of WRF-Hydro is an ecosystem of pre- and post-processing tools that facilitate the preparation of terrain and meteorological input data, an open-source hydrologic model evaluation toolset (Rwrfhydro), hydrologic data assimilation capabilities with DART and advanced model visualization capabilities. The National Center for Atmospheric Research (NCAR), through collaborative support from the National Science Foundation and other funding partners, provides community support for the entire WRF-Hydro system through a variety of mechanisms. This presentation summarizes the enhanced user support capabilities that are being developed for the community WRF-Hydro modeling system. These products and services include a new website, open-source code repositories, documentation and user guides, test cases, online training materials, live hands-on training sessions, an email listserv, and individual user support via email through a new help desk ticketing system. The WRF-Hydro modeling system and supporting tools, which now include re-gridding scripts and model calibration, have recently been updated to Version 4 and are merging toward capabilities of the National Water Model.

  11. A PROCEDURAL SOLUTION TO MODEL ROMAN MASONRY STRUCTURES

    Directory of Open Access Journals (Sweden)

    V. Cappellini

    2013-07-01

    Full Text Available The paper will describe a new approach based on the development of a procedural modelling methodology for archaeological data representation. This is a custom-designed solution based on the recognition of the rules belonging to the construction methods used in Roman times. We have conceived a tool for 3D reconstruction of masonry structures starting from photogrammetric surveying. Our protocol considers different steps. Firstly we have focused on the classification of opus based on the basic interconnections that can lead to a descriptive system used for their unequivocal identification and design. Secondly, we have chosen an automatic, accurate, flexible and open-source photogrammetric pipeline named Pastis Apero Micmac – PAM, developed by IGN (Paris). We have employed it to generate ortho-images from non-oriented images, using a user-friendly interface implemented by CNRS Marseille (France). Thirdly, the masonry elements are created in a parametric and interactive way, and finally they are adapted to the photogrammetric data. The presented application, currently under construction, is developed with an open source programming language called Processing, useful for visual, animated or static, 2D or 3D, interactive creations. Using this computer language, a Java environment has been developed. Therefore, even if the procedural modelling reveals an accuracy level inferior to the one obtained by manual modelling (brick by brick), this method can be useful when taking into account the static evaluation on buildings (requiring quantitative aspects) and metric measures for restoration purposes.

  12. a Procedural Solution to Model Roman Masonry Structures

    Science.gov (United States)

    Cappellini, V.; Saleri, R.; Stefani, C.; Nony, N.; De Luca, L.

    2013-07-01

    The paper will describe a new approach based on the development of a procedural modelling methodology for archaeological data representation. This is a custom-designed solution based on the recognition of the rules belonging to the construction methods used in Roman times. We have conceived a tool for 3D reconstruction of masonry structures starting from photogrammetric surveying. Our protocol considers different steps. Firstly we have focused on the classification of opus based on the basic interconnections that can lead to a descriptive system used for their unequivocal identification and design. Secondly, we have chosen an automatic, accurate, flexible and open-source photogrammetric pipeline named Pastis Apero Micmac - PAM, developed by IGN (Paris). We have employed it to generate ortho-images from non-oriented images, using a user-friendly interface implemented by CNRS Marseille (France). Thirdly, the masonry elements are created in a parametric and interactive way, and finally they are adapted to the photogrammetric data. The presented application, currently under construction, is developed with an open source programming language called Processing, useful for visual, animated or static, 2D or 3D, interactive creations. Using this computer language, a Java environment has been developed. Therefore, even if the procedural modelling reveals an accuracy level inferior to the one obtained by manual modelling (brick by brick), this method can be useful when taking into account the static evaluation on buildings (requiring quantitative aspects) and metric measures for restoration purposes.

  13. Application of the Conceptual Understanding Procedures (CUPs) Learning Model to Overcome Students' Mathematical Misconceptions

    Directory of Open Access Journals (Sweden)

    Asri Gita

    2018-01-01

    Full Text Available Errors in understanding concepts are one of the factors that cause misconceptions in mathematics. Misconceptions about plane figures are caused by students learning by memorising the basic shapes without understanding the relationships between plane figures and their properties. The effort made to overcome these misconceptions is to apply constructivist learning. One constructivist learning model is Conceptual Understanding Procedures (CUPs). The aim of this study is to investigate the application of the Conceptual Understanding Procedures (CUPs) learning model as a means of overcoming students' mathematical misconceptions about the properties of quadrilaterals. The subjects were 12 junior high school students who held misconceptions about the properties of quadrilaterals. Data were collected through tests, video, observation, and interviews. Validity and reliability of the data were established through credibility, dependability, transferability, and confirmability. The results show that the application of the CUPs learning model, which consists of an individual phase, a triplet group phase, and a whole-class interpretation phase, can overcome students' misconceptions about the properties of quadrilaterals. The change in students' misconceptions can also be seen in the improvement between their initial and final test scores. Keywords: Conceptual Understanding Procedures (CUPs), misconceptions, quadrilaterals.

  14. Kalman filter to update forest cover estimates

    Science.gov (United States)

    Raymond L. Czaplewski

    1990-01-01

    The Kalman filter is a statistical estimator that combines a time-series of independent estimates, using a prediction model that describes expected changes in the state of a system over time. An expensive inventory can be updated using model predictions that are adjusted with more recent, but less expensive and precise, monitoring data. The concepts of the Kalman...
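The combination described here, a model prediction adjusted by more recent but less precise monitoring data, reduces in the scalar case to a few lines. A minimal sketch, with illustrative numbers rather than any forest-inventory data:

```python
def kalman_update(x_prev, P_prev, z, F=1.0, Q=0.0, R=1.0):
    """One cycle of a scalar Kalman filter: predict the state with the
    model x' = F*x (adding process-noise variance Q), then blend in a
    new measurement z whose variance is R."""
    # Predict: propagate the old estimate and its variance through the model
    x_pred = F * x_prev
    P_pred = F * P_prev * F + Q
    # Update: the gain weights the measurement by its relative precision
    K = P_pred / (P_pred + R)
    x_new = x_pred + K * (z - x_pred)
    P_new = (1.0 - K) * P_pred
    return x_new, P_new
```

For instance, an old inventory estimate of 100 (variance 25, growing by Q=4 under the prediction model) combined with a monitoring reading of 110 (variance 9) yields an updated estimate between the two, with a variance smaller than either input's.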

  15. 76 FR 64017 - Approval and Promulgation of Air Quality Implementation Plans; South Carolina; Update to...

    Science.gov (United States)

    2011-10-17

    ... of Air Quality Implementation Plans; South Carolina; Update to Materials Incorporated by Reference... the ``good cause'' exemption in section 553(b)(3)(B) of the Administrative Procedure Act (APA) which... also finds that there is good cause under APA section 553(d)(3) for this correction to become effective...

  16. Bootstrap procedure in the quasinuclear quark model

    International Nuclear Information System (INIS)

    Anisovich, V.V.; Gerasyuta, S.M.; Keltuyala, I.V.

    1983-01-01

    The scattering amplitude for quarks (dressed quarks of a single flavour, and three colours) is obtained by means of a bootstrap procedure with the introduction of an initial point-wise interaction due to a heavy gluon exchange. The obtained quasi-nuclear model (effective short-range interaction in the S-wave states) has reasonable properties: there exist colourless meson states J^P = 0^-, 1^-; there are no bound states in coloured channels; and a virtual diquark level J^P = 1^+ appears in the coloured state anti-3_c.

  17. Application of a Bayesian algorithm for the Statistical Energy model updating of a railway coach

    DEFF Research Database (Denmark)

    Sadri, Mehran; Brunskog, Jonas; Younesian, Davood

    2016-01-01

    The classical statistical energy analysis (SEA) theory is a common approach for vibroacoustic analysis of coupled complex structures, being efficient to predict high-frequency noise and vibration of engineering systems. There are however some limitations in applying the conventional SEA...... To evaluate the performance of the proposed strategy, the SEA model updating of a railway passenger coach is carried out. First, a sensitivity analysis is carried out to select the most sensitive parameters of the SEA model. For the selected parameters of the model, prior probability density functions are then taken into account based on published data on comparison between experimental and theoretical results, so that the variance of the theory is estimated. The Monte Carlo Metropolis Hastings algorithm is employed to estimate the modified values of the parameters. It is shown that the algorithm can be efficiently used......
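The Monte Carlo Metropolis Hastings sampling used for this kind of Bayesian model updating can be sketched generically. The one-parameter posterior below (Gaussian prior times Gaussian likelihood) is a toy illustration, not the SEA model or data of the paper:

```python
import math
import random

def metropolis_hastings(log_post, theta0, n_samples=20000, step=0.5, seed=42):
    """Random-walk Metropolis-Hastings: draw samples from a posterior
    known only up to a normalizing constant via its log-density."""
    rng = random.Random(seed)
    theta, lp = theta0, log_post(theta0)
    samples = []
    for _ in range(n_samples):
        cand = theta + rng.gauss(0.0, step)   # symmetric proposal
        lp_cand = log_post(cand)
        # Accept with probability min(1, post(cand) / post(theta))
        if math.log(rng.random()) < lp_cand - lp:
            theta, lp = cand, lp_cand
        samples.append(theta)
    return samples

# Toy posterior for one model parameter: Gaussian prior (mean 2, sd 1)
# times a Gaussian likelihood around an "observed" value 3 (sd 0.5);
# the exact posterior is Gaussian with mean 2.8.
def log_post(theta):
    return -0.5 * (theta - 2.0) ** 2 - 0.5 * ((theta - 3.0) / 0.5) ** 2
```

After discarding a burn-in, the chain's mean approximates the posterior mean, which is how "modified values of the parameters" are estimated in this framework.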

  18. MoDOT pavement preservation research program volume VII, re-calibration of triggers and performance models.

    Science.gov (United States)

    2015-10-01

    The objective of this task is to develop the concept and framework for a procedure to routinely create, re-calibrate, and update the : Trigger Tables and Performance Models. The scope of work for Task 6 includes a limited review of the recent pavemen...

  19. Up-date of the BCG code library

    International Nuclear Information System (INIS)

    Caldeira, A.D.; Garcia, R.D.M.

    1990-01-01

    Procedures for generating an updated material library for the BCG code were established. A new library was generated by processing ENDF/B-IV data with the 89-1 version of the LINEAR, RECENT and SIGMA1 programs. The effect of the library change on the neutron spectrum and effective multiplication factor of a fast reactor cell was analysed. During the course of this study, an error was detected in the BCG code. Although localized in a narrow energy range, the discrepancies in neutron spectrum caused by the error were large enough to yield a difference of about 1% in the effective multiplication factor of the test cell. (author)

  20. ‘It is time to prepare the next patient’ real-time prediction of procedure duration in laparoscopic cholecystectomies

    NARCIS (Netherlands)

    Geudon, A.C.P.; Paalvast, M.S.M.; Meeuwsen, F.C.; Tax, D.M.J.; van Dijke, A.P.; Wauben, L.S.G.L.; van der Elst, M; Dankelman, J.; van den Dobbelsteen, J.J.

    2016-01-01

    Operating Room (OR) scheduling is crucial to allow efficient use of ORs. Currently, the predicted durations of surgical procedures are unreliable and the OR schedulers have to follow the progress of the procedures in order to update the daily planning accordingly. The OR schedulers often acquire

  1. Statistical and perceptual updating: correlated impairments in right brain injury.

    Science.gov (United States)

    Stöttinger, Elisabeth; Filipowicz, Alex; Marandi, Elahe; Quehl, Nadine; Danckert, James; Anderson, Britt

    2014-06-01

    It has been hypothesized that many of the cognitive impairments commonly seen after right brain damage (RBD) can be characterized as a failure to build or update mental models. We (Danckert et al. in Neglect as a disorder of representational updating. NOVA Open Access, New York, 2012a; Cereb Cortex 22:2745-2760, 2012b) were the first to directly assess the association between RBD and updating and found that RBD patients were unable to exploit a strongly biased play strategy in their opponent in the children's game rock, paper, scissors. Given that this game required many other cognitive capacities (i.e., working memory, sustained attention, reward processing), RBD patients could have failed this task for various reasons other than a failure to update. To assess the generality of updating deficits after RBD, we had RBD, left brain-damaged (LBD) patients and healthy controls (HCs) describe line drawings that evolved gradually from one figure (e.g., rabbit) to another (e.g., duck) in addition to the RPS updating task. RBD patients took significantly longer to alter their perceptual report from the initial object to the final object than did LBD patients and HCs. Although both patient groups performed poorly on the RPS task, only the RBD patients showed a significant correlation between the two, very different, updating tasks. We suggest these data indicate a general deficiency in the ability to update mental representations following RBD.

  2. Uranium hexafluoride: handling procedures and container criteria

    International Nuclear Information System (INIS)

    1977-04-01

    The U.S. Energy Research and Development Administration's (ERDA) procedures for packaging, measuring, and transferring uranium hexafluoride (UF6) have been undergoing continual review and revision for several years to keep them in phase with developing agreements for the supply of enriched uranium. This report, first issued in 1966, was reissued in 1967 to make editorial changes and to provide for minor revisions in procedural information. In 1968 and 1972, Revisions 2 and 3, respectively, were issued as part of the continuing effort to present updated information. This document, Revision 4, primarily includes revisions to UF6 cylinders, valves, and methods of use. This revision supersedes all previous issues of this report. The procedures will normally apply in all transactions involving receipt or shipment of UF6 by ERDA, unless stipulated otherwise by contracts or agreements with ERDA or by notices published in the Federal Register.

  3. Update and revision of WHO air quality guidelines for Europe

    Energy Technology Data Exchange (ETDEWEB)

    Younes, M. [WHO European Centre for Environment and Health, Bilthoven (Netherlands). Bilthoven Div.

    1995-12-31

    The WHO Air Quality Guidelines for Europe (AQG), published in 1987, have provided a uniform basis for the development of strategies for the control of air pollution, and have contributed to the maintenance and improvement of public health in several countries. The aim of the guidelines is to provide a basis for protecting public health from adverse effects of air pollutants, and for eliminating or reducing to a minimum those contaminants that are known or likely to be hazardous to human health and wellbeing. Since the publication of the first edition of the AQG, new scientific data in the fields of air pollution toxicology and epidemiology have accumulated and new developments in risk assessment methodologies have taken place, requiring updating and/or revision of the existing guidelines. This fact was recognized during the preparation of the initial work plan of the European Centre for Environment and Health, and it was recommended that the Centre undertake any necessary amendments and extensions to the Air Quality Guidelines. The updating procedure is being carried out in cooperation with the International Programme on Chemical Safety (IPCS) and the Commission of the European Communities (CEC) and will be implemented through working group meetings which require the preparation of working documents on specific air pollutants or mixtures and a final consultation to discuss the updated document. It was initiated by a Planning Meeting which was organized in January 1993. The purpose of the planning meeting was to set the framework for the updating and revision process, in particular to discuss the scope and purpose, the contents and the format of the revised AQG publication, to define the details of and the time schedule for the updating process and to identify the working groups needed and their way of operation. (author)

  4. Update and revision of WHO air quality guidelines for Europe

    Energy Technology Data Exchange (ETDEWEB)

    Younes, M. [WHO European Centre for Environment and Health, Bilthoven (Netherlands). Bilthoven Div.

    1996-12-31

    The WHO Air Quality Guidelines for Europe (AQG), published in 1987, have provided a uniform basis for the development of strategies for the control of air pollution, and have contributed to the maintenance and improvement of public health in several countries. The aim of the guidelines is to provide a basis for protecting public health from adverse effects of air pollutants, and for eliminating or reducing to a minimum those contaminants that are known or likely to be hazardous to human health and wellbeing. Since the publication of the first edition of the AQG, new scientific data in the fields of air pollution toxicology and epidemiology have accumulated and new developments in risk assessment methodologies have taken place, requiring updating and/or revision of the existing guidelines. This fact was recognized during the preparation of the initial work plan of the European Centre for Environment and Health, and it was recommended that the Centre undertake any necessary amendments and extensions to the Air Quality Guidelines. The updating procedure is being carried out in cooperation with the International Programme on Chemical Safety (IPCS) and the Commission of the European Communities (CEC) and will be implemented through working group meetings which require the preparation of working documents on specific air pollutants or mixtures and a final consultation to discuss the updated document. It was initiated by a Planning Meeting which was organized in January 1993. The purpose of the planning meeting was to set the framework for the updating and revision process, in particular to discuss the scope and purpose, the contents and the format of the revised AQG publication, to define the details of and the time schedule for the updating process and to identify the working groups needed and their way of operation. (author)

  5. A combination of HARMONIE short time direct normal irradiance forecasts and machine learning: The #hashtdim procedure

    Science.gov (United States)

    Gastón, Martín; Fernández-Peruchena, Carlos; Körnich, Heiner; Landelius, Tomas

    2017-06-01

    The present work describes a first version of a new procedure to forecast Direct Normal Irradiance (DNI): #hashtdim, which combines ground measurements and Numerical Weather Predictions. The system is aimed at generating predictions for the very short term. It combines the outputs of the Numerical Weather Prediction model HARMONIE with an adaptive methodology based on machine learning. The DNI predictions are generated with 15-minute and hourly temporal resolutions and are updated every 3 hours. Each update offers forecasts for the next 12 hours; the first nine hours are generated at 15-minute temporal resolution, while the last three hours are at hourly temporal resolution. The system is evaluated at a Spanish site with an operational BSRN station in the south of Spain (PSA station). #hashtdim has been implemented in the framework of the Direct Normal Irradiance Nowcasting methods for optimized operation of concentrating solar technologies (DNICast) project, under the European Union's Seventh Framework Programme for research, technological development and demonstration.
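The abstract does not specify the machine-learning method, but the general idea of adaptively correcting NWP output against recent ground measurements can be sketched with a simple online least-mean-squares corrector. This is purely illustrative, not #hashtdim's actual algorithm; inputs are assumed to be normalized quantities of order one (e.g. clear-sky indices), and the learning rate `mu` is arbitrary.

```python
def lms_corrector(forecasts, observations, mu=0.05):
    """Online least-mean-squares correction of raw NWP forecasts:
    predict y_hat = a*f + b, then adapt (a, b) after each new ground
    measurement arrives. Returns the corrected forecast series."""
    a, b = 1.0, 0.0                # start from the uncorrected forecast
    corrected = []
    for f, y in zip(forecasts, observations):
        y_hat = a * f + b
        corrected.append(y_hat)
        err = y - y_hat            # error against the ground measurement
        a += mu * err * f          # stochastic-gradient step on squared error
        b += mu * err
    return corrected
```

As measurements stream in, the corrector learns a systematic bias in the raw forecasts and the corrected series tracks the observations increasingly closely.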

  6. Fracture network modeling and GoldSim simulation support

    International Nuclear Information System (INIS)

    Sugita, Kenichirou; Dershowitz, W.

    2005-01-01

    During Heisei-16, Golder Associates provided support for JNC Tokai through discrete fracture network data analysis and simulation of the Mizunami Underground Research Laboratory (MIU), participation in Task 6 of the Äspö Task Force on Modeling of Groundwater Flow and Transport, and development of methodologies for analysis of repository site characterization strategies and safety assessment. MIU support during H-16 involved updating the H-15 FracMan discrete fracture network (DFN) models for the MIU shaft region, and developing improved simulation procedures. Updates to the conceptual model included incorporation of 'Step2' (2004) versions of the deterministic structures, and revision of background fractures to be consistent with conductive structure data from the DH-2 borehole. Golder developed improved simulation procedures for these models through the use of hybrid discrete fracture network (DFN), equivalent porous medium (EPM), and nested DFN/EPM approaches. For each of these models, procedures were documented for the entire modeling process including model implementation, MMP simulation, and shaft grouting simulation. Golder supported JNC participation in Tasks 6AB, 6D and 6E of the Äspö Task Force on Modeling of Groundwater Flow and Transport during H-16. For Task 6AB, Golder developed a new technique to evaluate the role of grout in performance assessment time-scale transport. For Task 6D, Golder submitted a report of H-15 simulations to SKB. For Task 6E, Golder carried out safety assessment time-scale simulations at the block scale, using the Laplace Transform Galerkin method. During H-16, Golder supported JNC's Total System Performance Assessment (TSPA) strategy by developing technologies for the analysis of the use of site characterization data in safety assessment. This approach will aid in understanding how site characterization can progressively reduce site characterization uncertainty. (author)

  7. The New Italian Seismic Hazard Model

    Science.gov (United States)

    Marzocchi, W.; Meletti, C.; Albarello, D.; D'Amico, V.; Luzi, L.; Martinelli, F.; Pace, B.; Pignone, M.; Rovida, A.; Visini, F.

    2017-12-01

    In 2015 the Seismic Hazard Center (Centro Pericolosità Sismica - CPS) of the National Institute of Geophysics and Volcanology was commissioned to coordinate the national scientific community in elaborating a new reference seismic hazard model, mainly aimed at updating the seismic building code. The CPS designed a roadmap for releasing, within three years, a significantly renewed PSHA model, with regard both to the updated input elements and to the strategies to be followed. The main requirements of the model were discussed in meetings with experts on earthquake engineering who will then participate in the revision of the building code. The activities were organized in 6 tasks: program coordination, input data, seismicity models, ground motion predictive equations (GMPEs), computation and rendering, and testing. The input data task has selected the most updated information about seismicity (historical and instrumental), seismogenic faults, and deformation (from both seismicity and geodetic data). The seismicity models have been elaborated in terms of classic source areas, fault sources, and gridded seismicity based on different approaches. The GMPEs task has selected the most recent models, accounting for their tectonic suitability and forecasting performance. The testing phase has been planned to design statistical procedures to test, with the available data, the whole seismic hazard model as well as single components such as the seismicity models and the GMPEs. In this talk we show some preliminary results, summarize the overall strategy for building the new Italian PSHA model, and discuss in detail important novelties that we put forward. Specifically, we adopt a new formal probabilistic framework to interpret the outcomes of the model and to test it meaningfully; this requires a proper definition and characterization of both aleatory variability and epistemic uncertainty, which we accomplish through an ensemble modeling strategy. We use a weighting scheme
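    The ensemble modeling strategy mentioned above can be sketched as a weighted mixture of hazard curves from alternative models, where the weighted mean gives the central estimate and the spread across members quantifies epistemic uncertainty. The curves and weights below are invented for illustration, not taken from the Italian model.

    ```python
    import numpy as np

    # Annual exceedance probabilities from three alternative hazard models
    # (values are illustrative, not from the actual Italian PSHA model).
    pga = np.array([0.05, 0.10, 0.20, 0.40])          # ground motion levels (g)
    curves = np.array([
        [2e-2, 8e-3, 2e-3, 3e-4],    # model A
        [3e-2, 1e-2, 3e-3, 5e-4],    # model B
        [1e-2, 5e-3, 1e-3, 1e-4],    # model C
    ])
    weights = np.array([0.5, 0.3, 0.2])               # must sum to 1

    # Ensemble (mixture) hazard: the weighted mean is the central model,
    # while percentiles across members express epistemic uncertainty.
    mean_curve = weights @ curves
    p16, p84 = np.percentile(curves, [16, 84], axis=0)

    for a, m in zip(pga, mean_curve):
        print(f"PGA {a:.2f} g -> annual P(exceed) {m:.2e}")
    ```
    
    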

  8. Calibration procedure for a potato crop growth model using information from across Europe

    DEFF Research Database (Denmark)

    Heidmann, Tove; Tofteng, Charlotte; Abrahamsen, Per

    2008-01-01

    In the FertOrgaNic EU project, 3 years of field experiments with drip irrigation and fertigation were carried out at six different sites across Europe, involving seven different varieties of potato. The Daisy model, which simulates plant growth together with water and nitrogen dynamics, was used...... for adaptation of the Daisy model to new potato varieties or for the improvement of the existing parameter set. The procedure is then, as a starting point, to focus the calibration process on the recommended list of parameters to change. We demonstrate this approach by showing the procedure for recalibrating...... three varieties using all relevant data from the sites. We believe these new parameterisations to be more robust, because they were indirectly based on information from the six different sites. We claim that this procedure combines both local and specific modeller expertise in a way that results in more......

  9. On the Update Problems for Software Defined Networks

    Directory of Open Access Journals (Sweden)

    V. A. Zakharov

    2014-01-01

    Full Text Available The design of network update algorithms is a pressing issue in the development of SDN control software. A particular case of the Network Update Problem is that of seamlessly restoring a given network configuration after some packet forwarding rules have been disabled (say, at the expiry of their time-outs. We study this problem in the framework of a formal model of SDN, develop correct and safe network recovering algorithms, and show that in the general case there is no way to restore a network configuration seamlessly without referring to priorities of packet forwarding rules.
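    A classic way to restore a path configuration without blackholing packets, which the paper's formal treatment generalizes, is to install rules downstream-first: a switch's rule is installed only after its next hop can already forward. A toy sketch (switch names are illustrative):

    ```python
    def restore_path(path, install):
        """Restore per-switch forwarding rules for a path s1 -> ... -> dst
        downstream-first, so no packet is blackholed mid-update."""
        for sw, nxt in reversed(list(zip(path, path[1:]))):
            install(sw, nxt)

    # Toy network state: which switches currently hold a rule for the flow.
    rules = {}

    def install(sw, nxt):
        # Safety invariant: the next hop must already forward (or be the host).
        assert nxt == "dst" or nxt in rules, f"{sw} would blackhole at {nxt}"
        rules[sw] = nxt

    restore_path(["s1", "s2", "s3", "dst"], install)
    print(rules)  # {'s3': 'dst', 's2': 's3', 's1': 's2'}
    ```

    The paper's negative result concerns the harder setting where rules carry priorities and such a safe ordering alone is not sufficient.
    
    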

  10. Semantic Modeling of Administrative Procedures from a Spanish Regional Public Administration

    Directory of Open Access Journals (Sweden)

    Francisco José Hidalgo López

    2018-02-01

    Full Text Available Over the past few years, Public Administrations have been providing systems for the electronic processing of procedures and files to ensure compliance with regulations and provide public services to citizens. Although each administration provides similar services to its citizens, these systems usually differ from the internal information management point of view, since they usually come from different products and manufacturers. The common framework that regulations demand, and that Public Administrations must respect when processing electronic files, provides a unique opportunity for the development of intelligent agents in the field of administrative processes. However, for this development to be truly effective and applicable to the public sector, it is necessary to have a common representation model for these administrative processes. Although a lot of work has already been done in the development of public information reuse initiatives and common vocabulary standardization, this has not been carried out at the process level. In this paper, we propose a semantic representation model for Public Administrations covering both process models and process instances: the procedures and the administrative files. The goal is to improve public administration open data initiatives and help to develop their sustainability policies, such as improving decision-making procedures and administrative management sustainability. As a case study, we modelled public administrative processes and files in collaboration with a Regional Public Administration in Spain, the Principality of Asturias, which provided access to its information systems and helped evaluate our approach.

  11. Development of an updated PBPK model for trichloroethylene and metabolites in mice, and its application to discern the role of oxidative metabolism in TCE-induced hepatomegaly

    International Nuclear Information System (INIS)

    Evans, M.V.; Chiu, W.A.; Okino, M.S.; Caldwell, J.C.

    2009-01-01

    Trichloroethylene (TCE) is a lipophilic solvent rapidly absorbed and metabolized via oxidation and conjugation to a variety of metabolites that cause toxicity to several internal targets. Increases in liver weight (hepatomegaly) have been reported to occur quickly in rodents after TCE exposure, with liver tumor induction reported in mice after long-term exposure. An integrated dataset for gavage and inhalation TCE exposure and oral data for exposure to two of its oxidative metabolites (TCA and DCA) was used, in combination with an updated and more accurate physiologically-based pharmacokinetic (PBPK) model, to examine the question as to whether the presence of TCA in the liver is responsible for TCE-induced hepatomegaly in mice. The updated PBPK model was used to help discern the quantitative contribution of metabolites to this effect. The update of the model was based on a detailed evaluation of predictions from previously published models and additional preliminary analyses based on gas uptake inhalation data in mice. The parameters of the updated model were calibrated using Bayesian methods with an expanded pharmacokinetic database consisting of oral, inhalation, and iv studies of TCE administration as well as studies of TCE metabolites in mice. The dose-response relationships for hepatomegaly derived from the multi-study database showed that the proportionality of dose to response for TCE- and DCA-induced hepatomegaly is not observed for administered doses of TCA in the studied range. The updated PBPK model was used to make a quantitative comparison of internal dose of metabolized and administered TCA. While the internal dose of TCA predicted by modeling of TCE exposure (i.e., mg TCA/kg-d) showed a linear relationship with hepatomegaly, the slope of the relationship was much greater than that for directly administered TCA. Thus, the degree of hepatomegaly induced per unit of TCA produced through TCE oxidation is greater than that expected per unit of TCA

  12. Development of an updated PBPK model for trichloroethylene and metabolites in mice, and its application to discern the role of oxidative metabolism in TCE-induced hepatomegaly.

    Science.gov (United States)

    Evans, M V; Chiu, W A; Okino, M S; Caldwell, J C

    2009-05-01

    Trichloroethylene (TCE) is a lipophilic solvent rapidly absorbed and metabolized via oxidation and conjugation to a variety of metabolites that cause toxicity to several internal targets. Increases in liver weight (hepatomegaly) have been reported to occur quickly in rodents after TCE exposure, with liver tumor induction reported in mice after long-term exposure. An integrated dataset for gavage and inhalation TCE exposure and oral data for exposure to two of its oxidative metabolites (TCA and DCA) was used, in combination with an updated and more accurate physiologically-based pharmacokinetic (PBPK) model, to examine the question as to whether the presence of TCA in the liver is responsible for TCE-induced hepatomegaly in mice. The updated PBPK model was used to help discern the quantitative contribution of metabolites to this effect. The update of the model was based on a detailed evaluation of predictions from previously published models and additional preliminary analyses based on gas uptake inhalation data in mice. The parameters of the updated model were calibrated using Bayesian methods with an expanded pharmacokinetic database consisting of oral, inhalation, and iv studies of TCE administration as well as studies of TCE metabolites in mice. The dose-response relationships for hepatomegaly derived from the multi-study database showed that the proportionality of dose to response for TCE- and DCA-induced hepatomegaly is not observed for administered doses of TCA in the studied range. The updated PBPK model was used to make a quantitative comparison of internal dose of metabolized and administered TCA. While the internal dose of TCA predicted by modeling of TCE exposure (i.e., mg TCA/kg-d) showed a linear relationship with hepatomegaly, the slope of the relationship was much greater than that for directly administered TCA. Thus, the degree of hepatomegaly induced per unit of TCA produced through TCE oxidation is greater than that expected per unit of TCA
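    The Bayesian calibration step can be illustrated on a drastically simplified stand-in: a one-compartment model fitted to synthetic concentration data with a random-walk Metropolis sampler. Everything here (model structure, flat prior, lognormal error, parameter values) is an illustrative assumption, not the actual mouse PBPK model or its hierarchical calibration.

    ```python
    import numpy as np

    def conc(t, k, V, dose=10.0):
        """One-compartment kinetics (a toy stand-in for the full PBPK model)."""
        return dose / V * np.exp(-k * t)

    rng = np.random.default_rng(0)
    t = np.linspace(0.5, 8, 8)
    obs = conc(t, k=0.4, V=5.0) * rng.lognormal(0.0, 0.1, t.size)  # synthetic data

    def log_post(theta):
        k, V = theta
        if k <= 0 or V <= 0:
            return -np.inf                      # flat prior on positive values
        resid = np.log(obs) - np.log(conc(t, k, V))
        return -0.5 * np.sum(resid**2) / 0.1**2  # lognormal measurement error

    # Random-walk Metropolis sampling of the posterior
    theta = np.array([0.2, 3.0])
    lp = log_post(theta)
    chain = []
    for _ in range(5000):
        prop = theta + rng.normal(0, [0.02, 0.2])
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta)
    post = np.array(chain)[2500:]               # discard burn-in
    print(post.mean(axis=0))                     # posterior means near (0.4, 5.0)
    ```

    The actual study used a far richer hierarchical posterior over many PBPK parameters, informed jointly by oral, inhalation, and iv datasets.
    
    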

  13. An update on neurotoxin products and administration methods.

    Science.gov (United States)

    Lanoue, Julien; Dong, Joanna; Do, Timothy; Goldenberg, Gary

    2016-09-01

    Since onabotulinumtoxinA for nonsurgical aesthetic enhancement of glabellar lines was initially reported, the popularity of botulinum neurotoxin (BoNT) products among both clinicians and consumers has rapidly grown, and we have seen several additional BoNT formulations enter the market. As the demand for minimally invasive cosmetic procedures continues to increase, we will see the introduction of additional formulations of BoNT products as well as new delivery devices and administration techniques. In this article, we provide a brief update on current and upcoming BoNT products and also review the literature on novel administration methods based on recently published studies.

  14. The LANDFIRE Refresh strategy: updating the national dataset

    Science.gov (United States)

    Nelson, Kurtis J.; Connot, Joel A.; Peterson, Birgit E.; Martin, Charley

    2013-01-01

    The LANDFIRE Program provides comprehensive vegetation and fuel datasets for the entire United States. As with many large-scale ecological datasets, vegetation and landscape conditions must be updated periodically to account for disturbances, growth, and natural succession. The LANDFIRE Refresh effort was the first attempt to consistently update these products nationwide. It incorporated a combination of specific systematic improvements to the original LANDFIRE National data, remote sensing based disturbance detection methods, field collected disturbance information, vegetation growth and succession modeling, and vegetation transition processes. This resulted in the creation of two complete datasets for all 50 states: LANDFIRE Refresh 2001, which includes the systematic improvements, and LANDFIRE Refresh 2008, which includes the disturbance and succession updates to the vegetation and fuel data. The new datasets are comparable for studying landscape changes in vegetation type and structure over a decadal period, and provide the most recent characterization of fuel conditions across the country. The applicability of the new layers is discussed and the effects of using the new fuel datasets are demonstrated through a fire behavior modeling exercise using the 2011 Wallow Fire in eastern Arizona as an example.

  15. ERM model analysis for adaptation to hydrological model errors

    Science.gov (United States)

    Baymani-Nezhad, M.; Han, D.

    2018-05-01

    Hydrological conditions change continuously, and these changes introduce errors into flood forecasting models that can lead to unrealistic results. To overcome these difficulties, a concept called model updating has been proposed in hydrological studies. Real-time model updating is one of the challenging processes in hydrological sciences and has not been entirely solved, due to lack of knowledge about the future state of the catchment under study. In terms of the flood forecasting process, errors propagated from the rainfall-runoff model are the main source of uncertainty in the forecasting model. Hence, to control these errors, several methods have been proposed to update rainfall-runoff models, such as parameter updating, model state updating, and correction of input data. The current study investigates the ability of rainfall-runoff model parameters to cope with three types of error common in hydrological modelling: timing, shape and volume. The new lumped model, the ERM model, was selected for this study to evaluate whether its parameters can be updated to cope with the stated errors. Investigation of ten events shows that the ERM model parameters can be updated to cope with the errors without the need to recalibrate the model.
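    Parameter updating of this kind can be sketched with a toy lumped model whose gain (volume error) and lag (timing error) are re-estimated against an observed hydrograph. The model and parameter values below are invented stand-ins, not the actual ERM formulation.

    ```python
    import numpy as np

    def simulate(rain, gain, lag):
        """Toy lumped rainfall-runoff model (a stand-in for ERM):
        a short unit hydrograph, scaled by `gain` and delayed by `lag`."""
        q = gain * np.convolve(rain, [0.5, 0.3, 0.2])[:rain.size]
        return np.roll(q, lag)

    rng = np.random.default_rng(1)
    rain = rng.gamma(2.0, 2.0, 50)
    observed = simulate(rain, gain=0.8, lag=2)   # "true" catchment response

    # Model updating: re-estimate (gain, lag) to remove volume and timing
    # errors, without touching the rest of the model structure.
    best = min(
        ((g, l) for g in np.linspace(0.4, 1.2, 81) for l in range(5)),
        key=lambda p: np.sum((simulate(rain, *p) - observed) ** 2),
    )
    print(best)  # ~ (0.8, 2)
    ```

    In practice the update would run in real time against streaming observations, which is exactly where the lack of knowledge about the catchment's future state makes the problem hard.
    
    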

  16. 1997 update for the applications guide to vehicle SNM monitors

    International Nuclear Information System (INIS)

    York, R.L.; Fehlau, P.E.

    1997-04-01

    Ten years have elapsed since the publication of the original applications guide to vehicle special nuclear material (SNM) monitors. During that interval, use of automatic vehicle monitors has become more commonplace, and formal procedures for monitor upkeep and evaluation have become available. New concepts for vehicle monitoring are being explored, as well. This update report reviews the basics of vehicle SNM monitoring, discusses what is new in vehicle SNM monitoring, and catalogs the vehicle SNM monitors that are commercially available

  17. Deep Unfolding for Topic Models.

    Science.gov (United States)

    Chien, Jen-Tzung; Lee, Chao-Hsi

    2018-02-01

    Deep unfolding provides an approach to integrate probabilistic generative models and deterministic neural networks. Such an approach benefits from deep representation, easy interpretation, flexible learning and stochastic modeling. This study develops unsupervised and supervised learning of deep unfolded topic models for document representation and classification. Conventionally, unsupervised and supervised topic models are inferred via the variational inference algorithm, where the model parameters are estimated by maximizing the lower bound of the logarithm of the marginal likelihood using input documents without and with class labels, respectively. The representation capability or classification accuracy is constrained by the variational lower bound and the tied model parameters across the inference procedure. This paper aims to relax these constraints by directly maximizing the end performance criterion and continuously untying the parameters in the learning process via deep unfolding inference (DUI). The inference procedure is treated as layer-wise learning in a deep neural network. The end performance is iteratively improved by using the estimated topic parameters according to the exponentiated updates. Deep learning of topic models is therefore implemented through a back-propagation procedure. Experimental results show the merits of DUI with increasing number of layers compared with variational inference in unsupervised as well as supervised topic models.
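    The core idea, treating each inference update as one layer of a feed-forward network, can be sketched with exponentiated-gradient updates for the topic proportions of a single document. The topic matrix and counts below are illustrative; the paper additionally unties and learns per-layer parameters by back-propagation.

    ```python
    import numpy as np

    def unfold_infer(x, Phi, layers=20, eta=0.1):
        """Infer topic proportions by unfolding exponentiated-gradient
        updates into `layers` feed-forward layers (one update per layer)."""
        K = Phi.shape[0]
        theta = np.full(K, 1.0 / K)               # uniform initialization
        for _ in range(layers):
            p = theta @ Phi                        # predicted word distribution
            grad = Phi @ (x / p)                   # d log-likelihood / d theta
            theta = theta * np.exp(eta * grad)     # exponentiated update
            theta /= theta.sum()                   # stay on the simplex
        return theta

    Phi = np.array([[0.7, 0.2, 0.1],
                    [0.1, 0.2, 0.7]])              # two topics over three words
    x = np.array([8.0, 2.0, 0.0])                  # document word counts
    theta = unfold_infer(x, Phi)
    print(theta.round(3))  # dominated by topic 0
    ```

    Because each layer is differentiable, the whole unrolled computation can be trained end-to-end, which is what lets DUI optimize the end performance criterion directly.
    
    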

  18. A Micro-simulation model of updating expected travel time in provision of travel information : A bayesian belief approach implemented in a multi-state supernetwork

    NARCIS (Netherlands)

    Parvaneh, Z.; Liao, F.; Arentze, T.A.; Timmermans, H.J.P.; Shakshuki, Elhadi; Yasar, Ansar

    2014-01-01

    This study introduces a model of individual belief updating of subjective travel times as a function of the provision of different types of travel information. Travel information includes real-time prescriptive or descriptive, and public or personal information. The model is embedded in a

  19. Updated aerosol module and its application to simulate secondary organic aerosols during IMPACT campaign May 2008

    Directory of Open Access Journals (Sweden)

    Y. P. Li

    2013-07-01

    Full Text Available The formation of secondary organic aerosol (SOA) was simulated with the Secondary ORGanic Aerosol Model (SORGAM) using a classical gas-particle partitioning concept, the two-product model approach, which is widely used in chemical transport models. In this study, we extensively updated SORGAM with three major modifications: firstly, we derived temperature-dependence functions of the SOA yields for aromatics and biogenic VOCs (volatile organic compounds), based on recent chamber studies, within a sophisticated mathematical optimization framework; secondly, we implemented the SOA formation pathways from photo-oxidation (OH-initiated) of isoprene; thirdly, we implemented the SOA formation channel from NO3-initiated oxidation of reactive biogenic hydrocarbons (isoprene and monoterpenes). The temperature-dependence functions of the SOA yields were validated against available chamber experiments, and the updated SORGAM with temperature-dependence functions was evaluated with the chamber data. Good performance was found, with a normalized mean error of less than 30%. Moreover, the whole updated SORGAM module was validated against ambient SOA observations, represented by the summed oxygenated organic aerosol (OOA) concentrations extracted from aerosol mass spectrometer (AMS) measurements at a rural site near Rotterdam, the Netherlands, performed during the IMPACT campaign in May 2008. In this case, we embedded both the original and the updated SORGAM modules into the EURopean Air pollution and Dispersion-Inverse Model (EURAD-IM), which showed generally good agreement with the observed meteorological parameters and several secondary products such as O3, sulfate and nitrate. With the updated SORGAM module, the EURAD-IM model also captured the observed SOA concentrations reasonably well, especially those during nighttime. In contrast, the EURAD-IM model before the update underestimated the observations by a factor of up to 5. The large improvements of the modeled
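    The two-product gas-particle partitioning concept with temperature-dependent partitioning coefficients can be sketched as follows. The stoichiometric yields, partitioning coefficients, and effective vaporization enthalpy are illustrative assumptions, not SORGAM's fitted values.

    ```python
    import numpy as np

    R = 8.314  # gas constant, J mol-1 K-1

    def K_T(K_ref, T, T_ref=298.0, dH=30e3):
        """Clausius-Clapeyron temperature scaling of the partitioning
        coefficient (dH = assumed effective vaporization enthalpy)."""
        return K_ref * (T_ref / T) * np.exp(dH / R * (1.0 / T - 1.0 / T_ref))

    def soa_yield(M0, T, alpha=(0.05, 0.2), K_ref=(0.1, 0.003)):
        """Odum two-product SOA yield as a function of absorbing organic
        mass M0 (ug m-3) and temperature T (K): Y = M0 * sum_i a_i K_i / (1 + K_i M0)."""
        y = 0.0
        for a, k in zip(alpha, K_ref):
            K = K_T(k, T)
            y += a * K / (1.0 + K * M0)
        return M0 * y

    for T in (288.0, 298.0, 308.0):
        print(f"T={T:.0f} K: yield at M0=10 -> {soa_yield(10.0, T):.4f}")
    ```

    Colder temperatures shift the semivolatile products toward the particle phase (larger K), raising the yield, which is the behavior the updated temperature-dependence functions capture.
    
    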

  20. Defects in railway bridges and procedures for maintenance:UIC Code 778-4R

    OpenAIRE

    Elfgren, Lennart

    2009-01-01

    This leaflet gives guidelines and recommendations covering procedures for the maintenance and strengthening of railway bridges. Arrangements and methods for inspection are presented; defects are described; methods for monitoring and assessment are given; and procedures for maintenance, repair, strengthening and renewal are defined. The purpose is to update the 1989 edition of UIC Code 778-4R and to implement results from a European Integrated Research Project (2003-2007) on “Sustainable Bridge...

  1. Fast Updating National Geo-Spatial Databases with High Resolution Imagery: China's Methodology and Experience

    Science.gov (United States)

    Chen, J.; Wang, D.; Zhao, R. L.; Zhang, H.; Liao, A.; Jiu, J.

    2014-04-01

    Geospatial databases are an irreplaceable national treasure of immense importance. Their up-to-dateness, i.e. their consistency with respect to the real world, plays a critical role in their value and applications. The continuous updating of map databases at 1:50,000 scale is a massive and difficult task for large countries covering several million square kilometers. This paper presents the research and technological development supporting national map updating at 1:50,000 scale in China, including the development of updating models and methods, production tools and systems for large-scale and rapid updating, as well as the design and implementation of the continuous updating workflow. The use of many data sources, and the integration of these data to form a high-accuracy, quality-checked product, were required. This in turn required up-to-date techniques of image matching, semantic integration, generalization, database management and conflict resolution. Specific software tools and packages were designed and developed to support large-scale updating production with high-resolution imagery and large-scale data generalization, such as map generalization, GIS-supported change interpretation from imagery, DEM interpolation, image matching-based orthophoto generation, and data control at different levels. A national 1:50,000 database updating strategy and its production workflow were designed, including a full-coverage updating pattern characterized by all-element topographic data modeling, change detection in all related areas, and whole-process data quality control; a series of technical production specifications; and a network of updating production units in different geographic places across the country.
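    The change-detection step of such an updating workflow can be illustrated with a minimal raster-differencing sketch. Thresholds and data are invented for illustration; production systems rely on far more sophisticated image matching and change interpretation.

    ```python
    import numpy as np

    def detect_change(before, after, threshold=0.15):
        """Flag pixels whose normalized difference exceeds a threshold --
        a minimal stand-in for imagery-based change interpretation."""
        diff = np.abs(after.astype(float) - before.astype(float))
        norm = diff / (before + after + 1e-9)    # normalized difference
        return norm > threshold

    rng = np.random.default_rng(3)
    before = rng.uniform(50, 60, (100, 100))     # "old" image band
    after = before.copy()
    after[20:40, 20:40] *= 1.6                   # a simulated land-cover change

    mask = detect_change(before, after)
    print(mask.sum())                            # 400 changed pixels detected
    ```

    The flagged areas would then drive targeted re-interpretation and database edits, rather than re-mapping the full coverage.
    
    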

  2. Sampling and Analysis Plan Update for Groundwater Monitoring 1100-EM-1 Operable Unit

    International Nuclear Information System (INIS)

    DR Newcomer

    1999-01-01

    This document updates the sampling and analysis plan (Department of Energy/Richland Operations--95-50) to reflect current groundwater monitoring at the 1100-EM-1 Operable Unit. Items requiring updating included the sampling and analysis protocol, quality assurance and quality control, the groundwater level measurement procedure, and data management. The plan covers groundwater monitoring, as specified in the 1993 Record of Decision, during the 5-year review period from 1995 through 1999. Following the 5-year review period, groundwater-monitoring data will be reviewed by the Environmental Protection Agency to evaluate the progress of natural attenuation of trichloroethylene. Monitored natural attenuation, with institutional controls on groundwater use at the inactive Horn Rapids Landfill, was the selected remedy specified in the Record of Decision

  3. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring

    2012-06-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management by eliminating the need to update hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  4. Update of the Dutch manual for costing studies in health care.

    Directory of Open Access Journals (Sweden)

    Tim A Kanters

    Full Text Available Dutch health economic guidelines include a costing manual, which describes preferred research methodology for costing studies and reference prices to ensure high-quality studies and comparability between study outcomes. This paper describes the most important revisions of the costing manual compared to the previous version. An online survey was sent out to potential users of the costing manual to identify topics for improvement. The costing manual was aligned with contemporary health economic guidelines. All methodology sections and parameter values needed for costing studies, particularly reference prices, were updated. An expert panel of health economists was consulted several times during the review process. The revised manual was reviewed by two members of the expert panel and by reviewers of the Dutch Health Care Institute. The majority of survey respondents were satisfied with the content and usability of the existing costing manual. Respondents recommended updating reference prices and adding some commonly needed reference prices. Cost categories were adjusted to the international standard: (1) costs within the health care sector; (2) patient and family costs; and (3) costs in other sectors. Reference prices were updated to reflect 2014 values. The methodology chapter was rewritten to match the requirements of the costing manual and the preferences of the users. Reference prices for nursing days of specific wards, for diagnostic procedures, and for nurse practitioners were added. The usability of the costing manual was increased and parameter values were updated. The costing manual was integrated into the new health economic guidelines.

  5. Computational Modelling in Development of a Design Procedure for Concrete Road

    Directory of Open Access Journals (Sweden)

    B. Novotný

    2000-01-01

    Full Text Available Computational modelling plays a decisive part in the development of a new design procedure for concrete pavements by quantifying the impacts of individual design factors. In the present paper, emphasis is placed on modelling the structural response of a jointed concrete pavement as a system of interacting rectangular slabs transferring wheel loads into an elastic layered subgrade. The finite element plate analysis is combined with the assumption of a linear contact stress variation over triangular elements of the contact region division. Linking forces are introduced to model load transfer across the joints. The unknown contact stress nodal intensities as well as the unknown linking forces are determined iteratively to fulfil slab/foundation and slab/slab contact conditions. Temperature effects are also considered, and space is reserved for modelling inelastic and additional environmental effects. It is pointed out that pavement design should be based on full data of pavement stressing, in contrast to procedures that account only for axle-load-induced stresses.

  6. The use of flow models for design of plant operating procedures

    International Nuclear Information System (INIS)

    Lind, M.

    1982-03-01

    The report describes a systematic approach to the design of operating procedures or sequence automatics for process plant control. It is shown how flow models representing the topology of mass and energy flows at different levels of function provide plant information that is important for the design problem considered. The modelling methodology leads to the definition of three categories of control tasks. Two tasks relate to the regulation and control of changes in levels and flows of mass and energy in a system within a defined mode of operation. The third relates to the control actions necessary for switching operations involved in changes of operating mode. These control tasks are identified for a given plant as part of the flow modelling activity. It is discussed how the flow model deals with the problem of assigning control task precedence in time, e.g. during start-up or shut-down operations. The method may be a basis for providing automated procedure support to the operator in unforeseen situations or may be a tool for control design. (auth.)
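    The third task category, ordering control actions during a change of operating mode, can be sketched as a topological sort over the flow topology: each flow function is established only after the upstream functions it depends on are available. Component names below are illustrative.

    ```python
    from graphlib import TopologicalSorter

    # Toy flow topology: each component maps to the set of components it
    # depends on along the mass/energy flow path (names are illustrative).
    flow = {
        "pump": {"tank"},       # pump draws from the tank
        "heater": {"pump"},     # heater needs flow from the pump
        "turbine": {"heater"},  # turbine needs heated flow
    }

    # Start-up ordering derived from the flow model: dependencies first.
    print(list(TopologicalSorter(flow).static_order()))
    ```

    Shut-down ordering would simply be the reverse of this sequence, which is one way a flow model can generate mode-change procedures automatically.
    
    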

  7. Procedural 3d Modelling for Traditional Settlements. The Case Study of Central Zagori

    Science.gov (United States)

    Kitsakis, D.; Tsiliakou, E.; Labropoulos, T.; Dimopoulou, E.

    2017-02-01

    Over the last decades, 3D modelling has been a fast-growing field in Geographic Information Science, extensively applied in various domains including reconstruction and visualization of cultural heritage, especially monuments and traditional settlements. Technological advances in computer graphics allow for modelling of complex 3D objects with high precision and accuracy. Procedural modelling is an effective tool and a relatively novel method, based on the concept of algorithmic modelling. It is utilized for the generation of accurate 3D models and composite facade textures from sets of rules, called Computer Generated Architecture grammars (CGA grammars), which define the objects' detailed geometry, rather than altering or editing the model manually. In this paper, procedural modelling tools have been exploited to generate the 3D model of a traditional settlement in the region of Central Zagori in Greece. The detailed geometries of the 3D models were derived from applying shape grammars to selected footprints, and the process resulted in a final 3D model, optimally describing the built environment of Central Zagori in three Levels of Detail (LoD). The final 3D scene was exported and published as a 3D web scene, which can be viewed with the 3D CityEngine viewer, allowing a walkthrough of the whole model, as in virtual reality or game environments. This research work addresses issues regarding texture precision, LoD for 3D objects and interactive visualization within one 3D scene, as well as the effectiveness of large-scale modelling, along with the benefits and drawbacks that derive from procedural modelling techniques in the field of cultural heritage, and more specifically in 3D modelling of traditional settlements.
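    The rule-based derivation idea can be sketched in a few lines: a toy grammar splits a building into floors and facade tiles, roughly mimicking how CGA rules refine shapes. This is illustrative Python, not actual CityEngine CGA syntax.

    ```python
    def derive_building(height=10.0, width=8.0, floor_h=3.0, tile_w=2.0):
        """Toy CGA-style derivation: Building -> Floors -> Tiles.
        Each tile is (label, x, y, w, h); parameters are illustrative."""
        shapes = []
        n_floors = int(height // floor_h)      # rule: repeat floors vertically
        n_tiles = int(width // tile_w)         # rule: split each floor into tiles
        for f in range(n_floors):
            for t in range(n_tiles):
                # rule: one door tile on the ground floor, windows elsewhere
                label = "door" if f == 0 and t == 0 else "window"
                shapes.append((label, t * tile_w, f * floor_h, tile_w, floor_h))
        return shapes

    facade = derive_building()
    print(len(facade), sum(1 for s in facade if s[0] == "door"))  # 12 1
    ```

    A real CGA grammar would additionally attach textures, handle remainders of the splits, and branch rules per building style, which is how settlement-scale variation is produced from a small rule set.
    
    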

  8. Thermocouple module halt failure acceptance test procedure for Tank 241-SY-101 DACS-1

    International Nuclear Information System (INIS)

    Ermi, A.M.

    1997-01-01

    The readiness of the Tank 241-SY-101 Data Acquisition and Control System (DACS-1) to provide monitoring and alarms for a halt failure of any thermocouple module will be tested during the performance of this procedure. Updated DACS-1 ''I/O MODULE HEALTH STATUS'', ''MININ1'', and ''MININ2'' screens, which now provide indication of thermocouple module failure, will also be tested as part of this procedure
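    The halt-failure detection being tested can be sketched as a simple heartbeat watchdog: a module that fails to report within a timeout is flagged as halted. This is an illustrative sketch with hypothetical module names, not the actual DACS-1 logic.

    ```python
    class ModuleWatchdog:
        """Minimal halt-failure monitor: each module must heartbeat
        within `timeout` time units or it is reported as halted."""

        def __init__(self, modules, timeout=5.0):
            self.timeout = timeout
            self.last_seen = {m: 0.0 for m in modules}

        def heartbeat(self, module, t):
            """Record that `module` reported at time `t`."""
            self.last_seen[module] = t

        def halted(self, now):
            """Return modules that have been silent longer than the timeout."""
            return sorted(m for m, t in self.last_seen.items()
                          if now - t > self.timeout)

    wd = ModuleWatchdog(["TC-01", "TC-02"])
    wd.heartbeat("TC-01", t=8.0)   # TC-01 reports; TC-02 stays silent
    print(wd.halted(now=10.0))     # ['TC-02']
    ```

    An acceptance test like the one described would deliberately halt a module and verify that the monitoring screens raise the corresponding alarm.
    
    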

  9. Linear mixed models a practical guide using statistical software

    CERN Document Server

    West, Brady T; Galecki, Andrzej T

    2014-01-01

    Highly recommended by JASA, Technometrics, and other journals, the first edition of this bestseller showed how to easily perform complex linear mixed model (LMM) analyses via a variety of software programs. Linear Mixed Models: A Practical Guide Using Statistical Software, Second Edition continues to lead readers step by step through the process of fitting LMMs. This second edition covers additional topics on the application of LMMs that are valuable for data analysts in all fields. It also updates the case studies using the latest versions of the software procedures and provides up-to-date information on the options and features of the software procedures available for fitting LMMs in SAS, SPSS, Stata, R/S-plus, and HLM.New to the Second Edition A new chapter on models with crossed random effects that uses a case study to illustrate software procedures capable of fitting these models Power analysis methods for longitudinal and clustered study designs, including software options for power analyses and suggest...
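    The simplest LMM, a random-intercept model, can be illustrated without any of those packages using a method-of-moments variance decomposition on balanced simulated data (the software covered by the book estimates such variance components by ML/REML instead; all values here are simulated for illustration).

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    g, n = 30, 20                        # 30 groups, 20 observations each
    u = rng.normal(0, 2.0, g)            # random intercepts, sd = 2
    y = 5.0 + u[:, None] + rng.normal(0, 1.0, (g, n))   # residual sd = 1

    # Method-of-moments (one-way ANOVA) estimates for a balanced design:
    group_means = y.mean(axis=1)
    msw = ((y - group_means[:, None]) ** 2).sum() / (g * (n - 1))   # within
    msb = n * ((group_means - y.mean()) ** 2).sum() / (g - 1)       # between
    sigma2_e = msw                        # residual variance (true: 1.0)
    sigma2_u = max((msb - msw) / n, 0.0)  # random-intercept variance (true: 4.0)
    print(f"residual var ~ {sigma2_e:.2f}, random-intercept var ~ {sigma2_u:.2f}")
    ```

    With unbalanced data, crossed random effects, or random slopes, this closed-form decomposition no longer applies, which is precisely where the ML/REML procedures in SAS, SPSS, Stata, R and HLM earn their keep.
    
    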

  10. Conflicts of Interest in Clinical Guidelines: Update of U.S. Preventive Services Task Force Policies and Procedures.

    Science.gov (United States)

    Ngo-Metzger, Quyen; Moyer, Virginia; Grossman, David; Ebell, Mark; Woo, Meghan; Miller, Therese; Brummer, Tana; Chowdhury, Joya; Kato, Elisabeth; Siu, Albert; Phillips, William; Davidson, Karina; Phipps, Maureen; Bibbins-Domingo, Kirsten

    2018-01-01

    The U.S. Preventive Services Task Force (USPSTF) provides independent, objective, and scientifically rigorous recommendations for clinical preventive services. A primary concern is to avoid even the appearance of members having special interests that might influence their ability to judge evidence and formulate unbiased recommendations. The conflicts of interest policy for the USPSTF is described, as is the formal process by which best practices were incorporated to update the policy. The USPSTF performed a literature review, conducted key informant interviews, and reviewed the conflicts of interest policies of ten similar organizations. Important findings included transparency and public accessibility; full disclosure of financial relationships; disclosure of non-financial relationships (that create the potential for bias and compromise a member's objective judgment); disclosure of family members' conflicts of interest; and establishment of appropriate reporting periods. Controversies in best practices include the threshold of financial disclosures, ease of access to conflicts of interest policies and declarations, vague definition of non-financial biases, and requests for family members' conflicts of interest (particularly those that are non-financial in nature). The USPSTF conflicts of interest policy includes disclosures for immediate family members, a clear definition of non-financial conflicts of interest, a long look-back period, and application of the policy to prospective members. Conflicts of interest disclosures are solicited from all members every 4 months, formally reviewed, adjudicated, and made publicly available. The USPSTF conflicts of interest policy is publicly available as part of the USPSTF Procedure Manual. A continuous improvement process can be applied to conflicts of interest policies to enhance public trust in members of panels, such as the USPSTF, that produce clinical guidelines and recommendations. Copyright © 2018 American Journal of Preventive Medicine.

  11. Minimum mean square error estimation and approximation of the Bayesian update

    KAUST Repository

    Litvinenko, Alexander; Matthies, Hermann G.; Zander, Elmar

    2015-01-01

    Given: a physical system modeled by a PDE or ODE with uncertain coefficient q(w), and a measurement operator Y(u(q); q), where u(q; w) is the uncertain solution. Aim: to identify q(w). The mapping from parameters to observations is usually not invertible, hence this inverse identification problem is generally ill-posed. To identify q(w) we derive a non-linear Bayesian update from the variational problem associated with the conditional expectation. To reduce the cost of the Bayesian update we offer a functional approximation, e.g. polynomial chaos expansion (PCE). New: we derive linear, quadratic, and higher-order approximations of the full Bayesian update.
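    The linear approximation of the conditional-expectation update reduces, in the scalar sampled case, to the familiar Kalman-type minimum mean square error formula q_post = q + K(y_obs - y) with K = Cov(q, y)/Var(y). The sketch below assumes only this standard formula; the toy forward model and all numbers are invented for illustration.

```python
import numpy as np

# Linear (Kalman-type) MMSE Bayesian update estimated from prior samples.
# Toy forward model Y(q) = 2q + noise; all values are illustrative.
rng = np.random.default_rng(0)
n = 50_000

q = rng.normal(1.0, 0.5, n)          # prior samples of uncertain parameter
noise = rng.normal(0.0, 0.1, n)      # measurement-noise samples
y = 2.0 * q + noise                  # predicted observations

y_obs = 2.6                          # the actual measurement
K = np.cov(q, y)[0, 1] / np.var(y)   # sample Kalman gain, Cov(q,y)/Var(y)
q_post = q + K * (y_obs - y)         # updated (posterior) samples

print(q_post.mean())                 # close to the analytic value ~1.297
```

The nonlinear update in the paper replaces the linear map q + K(y_obs - y) by a higher-order polynomial in the observation; the linear case above is its first-order special case.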

  13. EANM procedural guidelines for radionuclide myocardial perfusion imaging with SPECT and SPECT/CT: 2015 revision

    Energy Technology Data Exchange (ETDEWEB)

    Verberne, Hein J.; Eck-Smit, Berthe L.F. van; Wit, Tim C. de [University of Amsterdam, Department of Nuclear Medicine, F2-238, Academic Medical Center, Amsterdam (Netherlands); Acampa, Wanda [National Council of Research, Institute of Biostructures and Bioimaging, Naples (Italy); Anagnostopoulos, Constantinos [Academy of Athens, Center for Experimental Surgery, Clinical and Translational Research, Biomedical Research Foundation, Athens (Greece); Ballinger, Jim [Guy' s Hospital - Guy' s and St Thomas' Trust Foundation, Department of Nuclear Medicine, London (United Kingdom); Bengel, Frank [Hannover Medical School, Department of Nuclear Medicine, Hannover (Germany); Bondt, Pieter De [OLV Hospital, Department of Nuclear Medicine, Aalst (Belgium); Buechel, Ronny R.; Kaufmann, Philip A. [University Hospital Zurich, Cardiac Imaging, Zurich (Switzerland); Cuocolo, Alberto [University Federico II, Department of Advanced Biomedical Sciences, Naples (Italy); Flotats, Albert [Universitat Autonoma de Barcelona, Nuclear Medicine Department, Hospital de la Santa Creu i Sant Pau, Barcelona (Spain); Hacker, Marcus [Medical University of Vienna, Division of Nuclear Medicine, Department of Biomedical Imaging and Image-Guided Therapy, Vienna (Austria); Hindorf, Cecilia [Skaane University Hospital, Department of Radiation Physics, Lund (Sweden); Lindner, Oliver [University Hospital of the Ruhr-University Bochum, Heart and Diabetes Center North Rhine-Westphalia, Institute for Radiology, Nuclear Medicine and Molecular Imaging, Bad Oeynhausen (Germany); Ljungberg, Michael [Lund University, Department of Medical Radiation Physics, Lund (Sweden); Lonsdale, Markus [Bispebjerg Hospital, Department of Clinical Physiology and Nuclear Medicine, Copenhagen (Denmark); Manrique, Alain [Caen University Hospital, Department of Nuclear Medicine, Service Commun Investigations chez l' Homme, GIP Cyceron, Caen (France); Minarik, David [Skaane University Hospital, Radiation Physics, Malmoe (Sweden); 
Scholte, Arthur J.H.A. [Leiden University Medical Center, Department of Cardiology, Leiden (Netherlands); Slart, Riemer H.J.A. [University of Groningen, University Medical Center Groningen, Department of Nuclear Medicine and Molecular Imaging, Groningen (Netherlands); Traegaardh, Elin [Skaane University Hospital and Lund University, Clinical Physiology and Nuclear Medicine, Malmoe (Sweden); Hesse, Birger [University Hospital of Copenhagen, Department of Clinical Physiology and Nuclear Medicine and PET, Rigshospitalet, Copenhagen (Denmark)

    2015-11-15

    Since the publication of the European Association of Nuclear Medicine (EANM) procedural guidelines for radionuclide myocardial perfusion imaging (MPI) in 2005, many small and some larger steps of progress have been made, improving MPI procedures. In this paper, the major changes from the updated 2015 procedural guidelines are highlighted, focusing on the important changes related to new instrumentation with improved image information and the possibility to reduce radiation exposure, which is further discussed in relation to the recent developments of new International Commission on Radiological Protection (ICRP) models. Introduction of the selective coronary vasodilator regadenoson and the use of coronary CT-contrast agents for hybrid imaging with SPECT/CT angiography are other important areas for nuclear cardiology that were not included in the previous guidelines. A large number of minor changes have been described in more detail in the fully revised version available at the EANM home page: http://eanm.org/publications/guidelines/2015_07_EANM_FINAL_myocardial_perfusion_guideline.pdf. (orig.)

  14. Update in women's health.

    Science.gov (United States)

    Ganschow, Pamela S; Jacobs, Elizabeth A; Mackinnon, Jennifer; Charney, Pamela

    2009-06-01

    average 50-year-old woman, is provided in the guidelines. In addition, available risk prediction models, such as the NIH Web site calculator (http://www.cancer.gov/bcrisktool/) can also be used to estimate quantitative breast cancer risk. This model was updated in 2008 with race-specific data for calculating risk in African-American women.18 The harms and benefits of mammography should be discussed and incorporated along with a woman's preferences and breast cancer risk profile into the decision on when to begin screening. If a woman decides to forgo mammography, the decision should be readdressed every 1 to 2 years. STD screening guidelines19 USPSTF and CDC Routine screening for this infection is now recommended for ALL sexually active women age 24 and under, based on the recent high prevalence estimates for chlamydia. It is not recommended for women (pregnant or nonpregnant) age 25 and older, unless they are at increased risk for infection. STD treatment guidelines20 CDC Fluoroquinolones are NO longer recommended for treatment of N. gonorrhoeae, due to increasing resistance (as high as 15% of isolates in 2006). For uncomplicated infections, treatment of gonorrhea should be initiated with ceftriaxone 125 mg IM or cefixime 400 mg PO and co-treatment for chlamydia infection (unless ruled out with testing). Recent estimates demonstrate that almost 50% of persons with gonorrhea have concomitant chlamydia infection.21 STD = sexually transmitted disease, NIH = National Institutes of Health, ACP = American College of Physicians, USPSTF = United States Preventive Services Task Force, CDC = Centers for Disease Control.

  15. Macrophyte and pH buffering updates to the Klamath River water-quality model upstream of Keno Dam, Oregon

    Science.gov (United States)

    Sullivan, Annett B.; Rounds, Stewart A.; Asbill-Case, Jessica R.; Deas, Michael L.

    2013-01-01

    A hydrodynamic, water temperature, and water-quality model of the Link River to Keno Dam reach of the upper Klamath River was updated to account for macrophytes and enhanced pH buffering from dissolved organic matter, ammonia, and orthophosphorus. Macrophytes had been observed in this reach by field personnel, so macrophyte field data were collected in summer and fall (June-October) 2011 to provide a dataset to guide the inclusion of macrophytes in the model. Three types of macrophytes were most common: pondweed (Potamogeton species), coontail (Ceratophyllum demersum), and common waterweed (Elodea canadensis). Pondweed was found throughout the Link River to Keno Dam reach in early summer with densities declining by mid-summer and fall. Coontail and common waterweed were more common in the lower reach near Keno Dam and were at highest density in summer. All species were most dense in shallow water (less than 2 meters deep) near shore. The highest estimated dry weight biomass for any sample during the study was 202 grams per square meter for coontail in August. Guided by field results, three macrophyte groups were incorporated into the CE-QUAL-W2 model for calendar years 2006-09. The CE-QUAL-W2 model code was adjusted to allow the user to initialize macrophyte populations spatially across the model grid. The default CE-QUAL-W2 model includes pH buffering by carbonates, but does not include pH buffering by organic matter, ammonia, or orthophosphorus. These three constituents, especially dissolved organic matter, are present in the upper Klamath River at concentrations that provide substantial pH buffering capacity. In this study, CE-QUAL-W2 was updated to include this enhanced buffering capacity in the simulation of pH. Acid dissociation constants for ammonium and phosphoric acid were taken from the literature. 
For dissolved organic matter, the number of organic acid groups and each group's acid dissociation constant (Ka) and site density (moles of sites per mole of
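    The enhanced buffering described above rests on standard acid-base speciation: each dissolved acid group contributes buffering according to its dissociation constant. A minimal sketch of that speciation calculation follows; the ammonium pKa is the textbook value of about 9.25, and the example pH is illustrative (site densities and concentrations from the study are omitted).

```python
# Fraction of a monoprotic acid group that is deprotonated at a given pH:
#   alpha = Ka / (Ka + [H+])
# Multiplying alpha by the total site concentration gives the dissociated
# sites, the quantity a pH-buffering routine needs for each acid group.

def dissociated_fraction(pKa, pH):
    Ka = 10.0 ** -pKa
    h = 10.0 ** -pH       # hydrogen-ion activity
    return Ka / (Ka + h)

# Ammonium (pKa ~ 9.25) is almost entirely protonated (NH4+) at pH 7:
print(round(dissociated_fraction(9.25, 7.0), 4))   # -> 0.0056
```

At pH equal to the pKa the fraction is exactly 0.5, which is where the group's buffering capacity peaks.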

  16. The Updated BaSTI Stellar Evolution Models and Isochrones. I. Solar-scaled Calculations

    Science.gov (United States)

    Hidalgo, Sebastian L.; Pietrinferni, Adriano; Cassisi, Santi; Salaris, Maurizio; Mucciarelli, Alessio; Savino, Alessandro; Aparicio, Antonio; Silva Aguirre, Victor; Verma, Kuldeep

    2018-04-01

    We present an updated release of the BaSTI (a Bag of Stellar Tracks and Isochrones) stellar model and isochrone library for a solar-scaled heavy element distribution. The main input physics that have been changed from the previous BaSTI release include the solar metal mixture, electron conduction opacities, a few nuclear reaction rates, bolometric corrections, and the treatment of the overshooting efficiency for shrinking convective cores. The new model calculations cover a mass range between 0.1 and 15 M ⊙, 22 initial chemical compositions between [Fe/H] = ‑3.20 and +0.45, with helium to metal enrichment ratio dY/dZ = 1.31. The isochrones cover an age range between 20 Myr and 14.5 Gyr, consistently take into account the pre-main-sequence phase, and have been translated to a large number of popular photometric systems. Asteroseismic properties of the theoretical models have also been calculated. We compare our isochrones with results from independent databases and with several sets of observations to test the accuracy of the calculations. All stellar evolution tracks, asteroseismic properties, and isochrones are made available through a dedicated web site.

  17. Using plant procedures as the basis for conducting a job and task analysis

    International Nuclear Information System (INIS)

    Haynes, F.H.; Ruth, B.W.

    1985-01-01

    Plant procedures were selected, by Northeast Utilities (NU), as the basis for conducting Job and Task Analyses (JTA). The resultant JTA was used to design procedure based simulator training programs for Millstone 1, 2, and Connecticut Yankee. The task listings were both plant specific and exhibited excellent correlation to INPO's generic PWR and BWR task analyses. Using the procedures based method enabled us to perform the JTA using plant and training staff. This proved cost effective in terms of both time and money. Learning objectives developed from the JTA were easily justified and correlated directly to job performance within the context of the plant procedures. In addition, the analysis generated a comprehensive review of plant procedures and, conversely, the plant's normal procedure revision process generated an automatic trigger for updating the task data

  18. Update of CERN exchange network

    CERN Multimedia

    2003-01-01

    An update of the CERN exchange network will be done next April. Disturbances or even interruptions of telephony services may occur from 4th to 24th April during evenings from 18:30 to 00:00 but will not exceed more than 4 consecutive hours (see tentative planning below). CERN divisions are invited to avoid any change requests (set-ups, moves or removals) of telephones and fax machines from 4th to 25th April. Everything will be done to minimize potential inconveniences which may occur during this update. There will be no loss of telephone functionalities. CERN GSM portable phones won't be affected by this change. Should you need more details, please send us your questions by email to Standard.Telephone@cern.ch.
    Date        Change type                                     Affected areas
    April 11    Update of switch in LHC 4                       LHC 4 Point
    April 14    Update of switch in LHC 5                       LHC 5 Point
    April 15    Update of switches in LHC 3 and LHC 2 Points    LHC 3 and LHC 2
    April 22    Update of switch N4                             Meyrin Ouest
    April 23    Update of switch N6                             Prévessin Site
    Ap...

  19. Updating flood maps efficiently using existing hydraulic models, very-high-accuracy elevation data, and a geographic information system; a pilot study on the Nisqually River, Washington

    Science.gov (United States)

    Jones, Joseph L.; Haluska, Tana L.; Kresch, David L.

    2001-01-01

    A method of updating flood inundation maps at a fraction of the expense of using traditional methods was piloted in Washington State as part of the U.S. Geological Survey Urban Geologic and Hydrologic Hazards Initiative. Large savings in expense may be achieved by building upon previous Flood Insurance Studies and automating the process of flood delineation with a Geographic Information System (GIS); increases in accuracy and detail result from the use of very-high-accuracy elevation data and automated delineation; and the resulting digital data sets contain valuable ancillary information such as flood depth, as well as greatly facilitating map storage and utility. The method consists of creating stage-discharge relations from the archived output of the existing hydraulic model, using these relations to create updated flood stages for recalculated flood discharges, and using a GIS to automate the map generation process. Many of the effective flood maps were created in the late 1970s and early 1980s, and suffer from a number of well recognized deficiencies such as out-of-date or inaccurate estimates of discharges for selected recurrence intervals, changes in basin characteristics, and relatively low quality elevation data used for flood delineation. FEMA estimates that 45 percent of effective maps are over 10 years old (FEMA, 1997). Consequently, Congress has mandated the updating and periodic review of existing maps, which have cost the Nation almost 3 billion (1997) dollars. The need to update maps and the cost of doing so were the primary motivations for piloting a more cost-effective and efficient updating method. New technologies such as Geographic Information Systems and LIDAR (Light Detection and Ranging) elevation mapping are key to improving the efficiency of flood map updating, but they also improve the accuracy, detail, and usefulness of the resulting digital flood maps. GISs produce digital maps without manual estimation of inundated areas between
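    The two core automated steps described above, looking up a flood stage from an archived stage-discharge relation and comparing the water surface to elevation data, can be sketched as follows. The rating points and cell elevations are invented for illustration; a production workflow operates on a full GIS elevation grid and also enforces hydraulic connectivity, which this sketch omits.

```python
# (1) Interpolate a flood stage from an archived stage-discharge relation.
# (2) Flag elevation cells below the resulting water surface as inundated.

def stage_for_discharge(rating, q):
    """Linear interpolation in a sorted (discharge, stage) rating table."""
    for (q0, s0), (q1, s1) in zip(rating, rating[1:]):
        if q0 <= q <= q1:
            return s0 + (s1 - s0) * (q - q0) / (q1 - q0)
    raise ValueError("discharge outside rating table")

# Hypothetical rating curve from an archived hydraulic-model run.
rating = [(100.0, 10.0), (500.0, 12.5), (1000.0, 14.0)]
stage = stage_for_discharge(rating, 750.0)     # -> 13.25

dem = [12.0, 13.0, 13.2, 13.6, 15.0]           # tiny stand-in for a DEM
inundated = [z < stage for z in dem]
print(stage, inundated)   # 13.25 [True, True, True, False, False]
```

The depth ancillary data mentioned in the abstract falls out of the same comparison: stage minus cell elevation wherever the cell is inundated.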

  20. Using Cell Phone Technology for Self-Monitoring Procedures in Inclusive Settings

    Science.gov (United States)

    Bedesem, Pena L.

    2012-01-01

    The purpose of this study was to determine the effects and social validity of an innovative method of self-monitoring for middle school students with high-incidence disabilities in inclusive settings. An updated self-monitoring procedure, called CellF-Monitoring, utilized a cell phone as an all-inclusive self-monitoring device. The study took…

  1. Decentralized Consistent Updates in SDN

    KAUST Repository

    Nguyen, Thanh Dang

    2017-04-10

    We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and blackholes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.
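    The forwarding-anomaly problem that ez-Segway targets can be illustrated for a single flow: if new next-hop rules are installed from the destination backwards, every switch that starts forwarding on the new path already has a configured downstream rule, so no blackhole forms. The sketch below shows only this classic ordering idea, not the ez-Segway protocol itself; the switch names and paths are invented.

```python
# Anomaly-free single-flow update: install new next-hops downstream-first,
# then remove old-path rules that are no longer needed.  Illustrative only.

def safe_update(rules, old_path, new_path):
    # Install the new path's next-hops from the destination backwards
    # (the destination itself, new_path[-1], needs no rule).
    for i in range(len(new_path) - 2, -1, -1):
        rules[new_path[i]] = new_path[i + 1]
    # Old-path switches not on the new path can now be cleaned up safely.
    for sw in old_path[:-1]:
        if sw not in new_path:
            del rules[sw]
    return rules

rules = {"s1": "s2", "s2": "s4", "s4": "dst"}          # current forwarding
safe_update(rules, ["s1", "s2", "s4", "dst"],          # old path
                   ["s1", "s3", "s4", "dst"])          # new path
print(rules)   # {'s1': 's3', 's4': 'dst', 's3': 's4'}
```

ez-Segway generalizes this to many flows at once, with switches coordinating the ordering among themselves via message passing instead of waiting on the controller.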

  2. Updated RENORM/MBR Predictions for Diffraction at the LHC

    CERN Document Server

    Goulianos, K

    2015-01-01

    Updated RENORM/MBR-model predictions of diffractive, total, and total-inelastic cross sections at the LHC are presented and compared with experimental results and predictions from other models. In addition, expectations for diffraction at the upcoming LHC run at √s = 13 TeV are discussed.

  3. Methodology update for determination of the erosion coefficient (Z)

    Directory of Open Access Journals (Sweden)

    Tošić Radislav

    2012-01-01

    Full Text Available Research into, and mapping of, the intensity of mechanical water erosion based on the empirical methodology of S. Gavrilović has continued, with varying intensity, from the mid-twentieth century to the present. Many decades of work on these issues pointed to shortcomings of the existing methodology, and thus to the need for its innovation. In this sense, R. Lazarević made certain adjustments to the empirical methodology of S. Gavrilović by changing the tables for determining the coefficients Φ, X and Y, that is, the tables for determining the mean erosion coefficient (Z). The main objective of this paper is to update the existing methodology for determining the erosion coefficient (Z), building on the empirical methodology of S. Gavrilović and the amendments made by R. Lazarević (1985), while better adapting it to modern information technologies and the needs of modern society. The proposed procedure, that is, the model for determining the erosion coefficient (Z) presented in this paper, is the result of ten years of scientific research and project work on mapping the intensity of mechanical water erosion and modelling it with various erosion models in the Republic of Srpska and Serbia. By analyzing the correlation between results obtained by regression models and results obtained during the mapping of erosion on the territory of the Republic of Srpska, a high degree of correlation (R² = 0.9963) was established, which essentially confirms the quality of the proposed models.
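    The erosion coefficient at the heart of this methodology comes from Gavrilović's Erosion Potential Method, commonly written Z = Y · Xa · (φ + √J). The sketch below assumes that standard form; the coefficient values are illustrative and are not taken from the study.

```python
import math

# Gavrilovic erosion coefficient:  Z = Y * Xa * (phi + sqrt(J))
#   Y   - soil erodibility coefficient
#   Xa  - soil protection (land cover) coefficient
#   phi - coefficient of the observed erosion processes
#   J   - mean terrain slope as a decimal gradient
# All input values below are illustrative.

def erosion_coefficient(Y, Xa, phi, J):
    return Y * Xa * (phi + math.sqrt(J))

Z = erosion_coefficient(Y=1.0, Xa=0.6, phi=0.4, J=0.25)
print(round(Z, 2))   # -> 0.54
```

In practice Y, Xa and φ are read from the (updated) coefficient tables discussed in the abstract, and Z then classifies the erosion intensity of each mapped unit.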

  4. «Soft Power»: the Updated Theoretical Concept and Russian Assembly Model

    Directory of Open Access Journals (Sweden)

    Владимир Сергеевич Изотов

    2011-12-01

    Full Text Available The article addresses critically important informational and ideological aspects of Russia's foreign policy. Its goal is to revise and specify the notion of soft power in the context of the rapidly changing space of global politics. In recent years, Russia's international isolation, including in the informational and ideological sphere, has been increasing. The way to overcome this negative trend is to modernize foreign policy strategy by updating its operational tools and ideological accents. It is becoming obvious that real foreign policy success in the global world system is achieved through the use of soft power. The author seeks to specify and conceptualize the phenomenon of Russia's soft power as a purposeful external ideology facing an urgent need for updating.

  5. Performance of the Line-By-Line Radiative Transfer Model (LBLRTM for temperature, water vapor, and trace gas retrievals: recent updates evaluated with IASI case studies

    Directory of Open Access Journals (Sweden)

    M. J. Alvarado

    2013-07-01

    Full Text Available Modern data assimilation algorithms depend on accurate infrared spectroscopy in order to make use of the information related to temperature, water vapor (H2O, and other trace gases provided by satellite observations. Reducing the uncertainties in our knowledge of spectroscopic line parameters and continuum absorption is thus important to improve the application of satellite data to weather forecasting. Here we present the results of a rigorous validation of spectroscopic updates to an advanced radiative transfer model, the Line-By-Line Radiative Transfer Model (LBLRTM, against a global dataset of 120 near-nadir, over-ocean, nighttime spectra from the Infrared Atmospheric Sounding Interferometer (IASI. We compare calculations from the latest version of LBLRTM (v12.1 to those from a previous version (v9.4+ to determine the impact of spectroscopic updates to the model on spectral residuals as well as retrieved temperature and H2O profiles. We show that the spectroscopy in the CO2 ν2 and ν3 bands is significantly improved in LBLRTM v12.1 relative to v9.4+, and that these spectroscopic updates lead to mean changes of ~0.5 K in the retrieved vertical temperature profiles between the surface and 10 hPa, with the sign of the change and the variability among cases depending on altitude. We also find that temperature retrievals using each of these two CO2 bands are remarkably consistent in LBLRTM v12.1, potentially allowing these bands to be used to retrieve atmospheric temperature simultaneously. The updated H2O spectroscopy in LBLRTM v12.1 substantially improves the a posteriori residuals in the P-branch of the H2O ν2 band, while the improvements in the R-branch are more modest. The H2O amounts retrieved with LBLRTM v12.1 are on average 14% lower between 100 and 200 hPa, 42% higher near 562 hPa, and 31% higher near the surface compared to the amounts retrieved with v9.4+ due to a combination of the different retrieved temperature profiles and the

  6. A combined deterministic and probabilistic procedure for safety assessment of components with cracks - Handbook.

    Energy Technology Data Exchange (ETDEWEB)

    Dillstroem, Peter; Bergman, Mats; Brickstad, Bjoern; Weilin Zang; Sattari-Far, Iradj; Andersson, Peder; Sund, Goeran; Dahlberg, Lars; Nilsson, Fred (Inspecta Technology AB, Stockholm (Sweden))

    2008-07-01

    SSM has supported research work for the further development of a previously developed procedure/handbook (SKI Report 99:49) for assessment of detected cracks and tolerance for defect analysis. During the operative use of the handbook it was identified needs to update the deterministic part of the procedure and to introduce a new probabilistic flaw evaluation procedure. Another identified need was a better description of the theoretical basis to the computer program. The principal aim of the project has been to update the deterministic part of the recently developed procedure and to introduce a new probabilistic flaw evaluation procedure. Other objectives of the project have been to validate the conservatism of the procedure, make the procedure well defined and easy to use and make the handbook that documents the procedure as complete as possible. The procedure/handbook and computer program ProSACC, Probabilistic Safety Assessment of Components with Cracks, has been extensively revised within this project. The major differences compared to the last revision are within the following areas: It is now possible to deal with a combination of deterministic and probabilistic data. It is possible to include J-controlled stable crack growth. The appendices on material data to be used for nuclear applications and on residual stresses are revised. A new deterministic safety evaluation system is included. The conservatism in the method for evaluation of the secondary stresses for ductile materials is reduced. A new geometry, a circular bar with a circumferential surface crack has been introduced. The results of this project will be of use to SSM in safety assessments of components with cracks and in assessments of the interval between the inspections of components in nuclear power plants

  7. Sensitivity Analysis of the Influence of Structural Parameters on Dynamic Behaviour of Highly Redundant Cable-Stayed Bridges

    Directory of Open Access Journals (Sweden)

    B. Asgari

    2013-01-01

    Full Text Available The model tuning through sensitivity analysis is a prominent procedure to assess the structural behavior and dynamic characteristics of cable-stayed bridges. Most of the previous sensitivity-based model tuning methods are automatic iterative processes; however, the results of recent studies show that the most reasonable results are achievable by applying the manual methods to update the analytical model of cable-stayed bridges. This paper presents a model updating algorithm for highly redundant cable-stayed bridges that can be used as an iterative manual procedure. The updating parameters are selected through the sensitivity analysis which helps to better understand the structural behavior of the bridge. The finite element model of Tatara Bridge is considered for the numerical studies. The results of the simulations indicate the efficiency and applicability of the presented manual tuning method for updating the finite element model of cable-stayed bridges. The new aspects regarding effective material and structural parameters and model tuning procedure presented in this paper will be useful for analyzing and model updating of cable-stayed bridges.
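    One iteration of such a sensitivity-based tuning loop can be sketched on a deliberately tiny model: perturb a stiffness parameter, estimate the sensitivity of a natural frequency by finite differences, and correct the parameter to close the gap to a measured frequency. The single-degree-of-freedom formula below stands in for a full finite element analysis, and the target frequency and stiffness values are invented.

```python
import math

# Sensitivity-based model updating on a 1-DOF stand-in for an FE model:
#   f = sqrt(k/m) / (2*pi)
# Each iteration estimates df/dk by finite differences and applies a
# Newton-style correction toward the "measured" frequency.

def frequency(k, m=1000.0):
    return math.sqrt(k / m) / (2.0 * math.pi)   # natural frequency, Hz

f_measured = 2.0         # target (assumed measured) frequency, Hz
k = 120_000.0            # initial stiffness guess

for _ in range(5):       # a few tuning iterations
    f = frequency(k)
    dk = 0.01 * k                               # small perturbation
    sens = (frequency(k + dk) - f) / dk         # finite-difference df/dk
    k += (f_measured - f) / sens                # update the parameter

print(round(frequency(k), 4))   # converges to 2.0 Hz
```

In the manual procedure advocated above, the analyst inspects the sensitivities and chooses which parameters to adjust at each step instead of letting the loop run blindly.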

  8. Technical Note: Procedure for the calibration and validation of kilo-voltage cone-beam CT models

    Energy Technology Data Exchange (ETDEWEB)

    Vilches-Freixas, Gloria; Létang, Jean Michel; Rit, Simon, E-mail: simon.rit@creatis.insa-lyon.fr [Université de Lyon, CREATIS, CNRS UMR5220, Inserm U1206, INSA-Lyon, Université Lyon 1, Centre Léon Bérard, Lyon 69373 Cedex 08 (France); Brousmiche, Sébastien [Ion Beam Application, Louvain-la-Neuve 1348 (Belgium); Romero, Edward; Vila Oliva, Marc [Université de Lyon, CREATIS, CNRS UMR5220, Inserm U1206, INSA-Lyon, Université Lyon 1, Centre Léon Bérard, Lyon 69373 Cedex 08, France and Ion Beam Application, Louvain-la-Neuve 1348 (Belgium); Kellner, Daniel; Deutschmann, Heinz; Keuschnigg, Peter; Steininger, Philipp [Institute for Research and Development on Advanced Radiation Technologies, Paracelsus Medical University, Salzburg 5020 (Austria)

    2016-09-15

    Purpose: The aim of this work is to propose a general and simple procedure for the calibration and validation of kilo-voltage cone-beam CT (kV CBCT) models against experimental data. Methods: The calibration and validation of the CT model is a two-step procedure: the source model then the detector model. The source is described by the direction dependent photon energy spectrum at each voltage while the detector is described by the pixel intensity value as a function of the direction and the energy of incident photons. The measurements for the source consist of a series of dose measurements in air performed at each voltage with varying filter thicknesses and materials in front of the x-ray tube. The measurements for the detector are acquisitions of projection images using the same filters and several tube voltages. The proposed procedure has been applied to calibrate and assess the accuracy of simple models of the source and the detector of three commercial kV CBCT units. If the CBCT system models had been calibrated differently, the current procedure would have been exclusively used to validate the models. Several high-purity attenuation filters of aluminum, copper, and silver combined with a dosimeter which is sensitive to the range of voltages of interest were used. A sensitivity analysis of the model has also been conducted for each parameter of the source and the detector models. Results: Average deviations between experimental and theoretical dose values are below 1.5% after calibration for the three x-ray sources. The predicted energy deposited in the detector agrees with experimental data within 4% for all imaging systems. Conclusions: The authors developed and applied an experimental procedure to calibrate and validate any model of the source and the detector of a CBCT unit. The present protocol has been successfully applied to three x-ray imaging systems. 
The minimum requirements in terms of material and equipment would make its implementation suitable in
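    The least-squares spirit of such a source calibration can be sketched with a synthetic example: model the dose measured behind each filter as a weighted sum of exponentially attenuated energy bins, then recover the bin weights from measurements at several filter thicknesses. All attenuation coefficients, thicknesses, and "measurements" below are invented, not from the paper's protocol.

```python
import numpy as np

# Dose behind filter thickness t modeled as  D(t) = sum_i w_i exp(-mu_i t),
# giving a linear system A w = dose for the spectrum weights w.
mu = np.array([0.8, 0.4, 0.2])            # 1/mm per energy bin (illustrative)
t = np.array([0.0, 1.0, 2.0, 4.0, 8.0])   # aluminum-like filter thicknesses, mm

A = np.exp(-np.outer(t, mu))              # A[j, i] = exp(-mu_i * t_j)
w_true = np.array([0.2, 0.5, 0.3])        # "unknown" spectrum weights
dose = A @ w_true                         # synthetic, noise-free dose readings

w_est, *_ = np.linalg.lstsq(A, dose, rcond=None)
print(np.round(w_est, 3))                 # recovers w_true ~ [0.2, 0.5, 0.3]
```

With real, noisy measurements one would add more filters and materials than unknowns and constrain the weights to be non-negative, but the same attenuation model underlies the fit.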

  9. Updating Recursive XML Views of Relations

    DEFF Research Database (Denmark)

    Choi, Byron; Cong, Gao; Fan, Wenfei

    2009-01-01

    This paper investigates the view update problem for XML views published from relational data. We consider XML views defined in terms of mappings directed by possibly recursive DTDs compressed into DAGs and stored in relations. We provide new techniques to efficiently support XML view updates specified in terms of XPath expressions with recursion and complex filters. The interaction between XPath recursion and DAG compression of XML views makes the analysis of the XML view update problem rather intriguing. Furthermore, many issues are still open even for relational view updates, and need to be explored. In response to these, on the XML side, we revise the notion of side effects and update semantics based on the semantics of XML views, and present efficient algorithms to translate XML updates to relational view updates. On the relational side, we propose a mild condition on SPJ views, and show...

  10. Dissociating Working Memory Updating and Automatic Updating: The Reference-Back Paradigm

    Science.gov (United States)

    Rac-Lubashevsky, Rachel; Kessler, Yoav

    2016-01-01

    Working memory (WM) updating is a controlled process through which relevant information in the environment is selected to enter the gate to WM and substitute its contents. We suggest that there is also an automatic form of updating, which influences performance in many tasks and is primarily manifested in reaction time sequential effects. The goal…

  11. Memory reconsolidation mediates the updating of hippocampal memory content

    Directory of Open Access Journals (Sweden)

    Jonathan L C Lee

    2010-11-01

    Full Text Available The retrieval or reactivation of a memory places it into a labile state, requiring a process of reconsolidation to restabilize it. This retrieval-induced plasticity is a potential mechanism for the modification of the existing memory. Following previous data supportive of a functional role for memory reconsolidation in the modification of memory strength, here I show that hippocampal memory reconsolidation also supports the updating of contextual memory content. Using a procedure that separates the learning of pure context from footshock-motivated contextual fear learning, I demonstrate doubly dissociable hippocampal mechanisms of initial context learning and subsequent updating of the neutral contextual representation to incorporate the footshock. Contextual memory consolidation was dependent upon BDNF expression in the dorsal hippocampus, whereas the footshock modification of the contextual representation required the expression of Zif268. These mechanisms match those previously shown to be selectively involved in hippocampal memory consolidation and reconsolidation, respectively. Moreover, memory reactivation is a necessary step in modifying memory content, as inhibition of hippocampal synaptic protein degradation also prevented the footshock-mediated memory modification. Finally, dorsal hippocampal knockdown of Zif268 impaired the reconsolidation of the pure contextual memory only under conditions of weak context memory training, as well as failing to disrupt contextual freezing when a strong contextual fear memory is reactivated by further conditioning. Therefore, an adaptive function of the reactivation and reconsolidation process is to enable the updating of memory content.

  12. Characteristics of Key Update Strategies for Wireless Sensor Networks

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2011-01-01

Wireless sensor networks offer the advantages of simple and low-resource communication. Challenged by this simplicity and low-resources, security is of particular importance in many cases such as transmission of sensitive data or strict requirements of tamper-resistance. Updating the security keys...... is one of the essential points in security, which restricts the amount of data that may be exposed when a key is compromised. In this paper, we investigate key update methods that may be used in wireless sensor networks, and benefiting from stochastic model checking we derive characteristics...
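The central trade-off named in this record, that the rekey interval bounds the data exposed after a key compromise, can be shown with a tiny deterministic calculation. This is a generic sketch, not one of the paper's stochastic models; the message counts are arbitrary.

```python
# Sketch: how the rekey interval k bounds the number of messages exposed after
# a key compromise. If the key leaks just before message m is sent, every
# message until the next scheduled key update is readable by the attacker.

def exposed_messages(m, k):
    """Messages readable when the key held at message m is compromised,
    under a periodic key update every k messages."""
    return k - (m % k)

def mean_exposure(k, horizon):
    """Average exposure over a uniformly random compromise time."""
    return sum(exposed_messages(m, k) for m in range(horizon)) / horizon

# Halving the update interval roughly halves the expected exposure window.
print(mean_exposure(10, 1000), mean_exposure(5, 1000))  # prints 5.5 3.0
```

The expected exposure under a uniform compromise time is (k + 1) / 2 messages, which is the kind of characteristic a stochastic model checker can derive exactly for richer, probabilistic update schemes.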

  13. Responding to Changes in Building Legislation. Updating Training for the Building Regulations 1985 and Supporting Documents.

    Science.gov (United States)

    Harris, Robert; Phillips, Alan

    A project sought to develop a means of updating and retraining those required to comply with Britain's 1985 Building Regulations, which are substantially different from the previous ones in regard to procedures and technical content. The training needs analysis conducted indicated that the new training should be flexible and use practical and…

  14. Development of the Nonstationary Incremental Analysis Update Algorithm for Sequential Data Assimilation System

    Directory of Open Access Journals (Sweden)

    Yoo-Geun Ham

    2016-01-01

Full Text Available This study introduces a modified version of the incremental analysis updates (IAU), called the nonstationary IAU (NIAU) method, to improve the assimilation accuracy of the IAU while keeping the continuity of the analysis. Similar to the IAU, the NIAU is designed to add analysis increments at every model time step to improve the continuity in the intermittent data assimilation. However, unlike the IAU, the NIAU procedure uses time-evolved forcing using the forward operator as corrections to the model. In terms of the accuracy of the analysis field, the solution of the NIAU is superior to that of the forward IAU, in which the analysis is performed at the beginning of the time window for adding the IAU forcing. This is because, in linear systems, the NIAU solution equals that of an intermittent data assimilation method at the end of the assimilation interval. To have the filtering property in the NIAU, a forward operator to propagate the increment is reconstructed with only dominant singular vectors. An illustration of those advantages of the NIAU is given using the simple 40-variable Lorenz model.
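The intermittent/IAU/NIAU contrast can be reproduced in a scalar linear toy model. The exponent convention used for the NIAU forcing below is my reading of "time-evolved forcing via the forward operator", and all numbers are illustrative, not the paper's Lorenz-model setup.

```python
# Toy scalar model x_{t+1} = a * x_t, used to contrast three ways of adding an
# analysis increment delta over an N-step assimilation window.
a, x0, delta, N = 0.9, 1.0, 0.5, 10

def run(x, forcings):
    """Integrate the model, adding the forcing term after each propagation step."""
    for f in forcings:
        x = a * x + f
    return x

# 1) Intermittent assimilation: add the whole increment at the window start.
x_int = run(x0 + delta, [0.0] * N)

# 2) IAU: spread delta/N uniformly over the window (smooth analysis, but the
#    endpoint no longer matches the intermittent analysis).
x_iau = run(x0, [delta / N] * N)

# 3) NIAU (sketch): propagate each increment slice with the forward model
#    before adding it, i.e. apply a**(t+1) * delta / N at step t.
x_niau = run(x0, [a ** (t + 1) * delta / N for t in range(N)])

print(x_int, x_iau, x_niau)  # x_niau equals x_int at the window end
```

Each NIAU slice arrives at the window end as a^(N-1-t) * a^(t+1) * delta/N = a^N * delta/N, so the N slices sum to exactly the propagated intermittent increment, which is the linear-system equivalence the abstract states.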

  15. Is percutaneous fine-needle biopsy a hazard? An update

    International Nuclear Information System (INIS)

    Smith, E.H.

    1987-01-01

Fine-needle biopsy (FNB) has become a commonplace diagnostic procedure in most radiology departments with the assumption that risks are nonexistent. Animal experiments conclusively indicate the leakage of tumor cells after biopsy, but clinical evidence appears to point to a paucity of complications. In a prior review of the literature, two needle tract seedings (NTS) and two fatalities after FNB were discovered. A questionnaire at that time uncovered three more cases of NTS and four deaths. An updated literature search and questionnaire showed an additional 14 cases of NTS and nine deaths. An analysis of the data is reported.

  16. GENERATION OF MULTI-LOD 3D CITY MODELS IN CITYGML WITH THE PROCEDURAL MODELLING ENGINE RANDOM3DCITY

    Directory of Open Access Journals (Sweden)

    F. Biljecki

    2016-09-01

    Full Text Available The production and dissemination of semantic 3D city models is rapidly increasing benefiting a growing number of use cases. However, their availability in multiple LODs and in the CityGML format is still problematic in practice. This hinders applications and experiments where multi-LOD datasets are required as input, for instance, to determine the performance of different LODs in a spatial analysis. An alternative approach to obtain 3D city models is to generate them with procedural modelling, which is – as we discuss in this paper – well suited as a method to source multi-LOD datasets useful for a number of applications. However, procedural modelling has not yet been employed for this purpose. Therefore, we have developed RANDOM3DCITY, an experimental procedural modelling engine for generating synthetic datasets of buildings and other urban features. The engine is designed to produce models in CityGML and does so in multiple LODs. Besides the generation of multiple geometric LODs, we implement the realisation of multiple levels of spatiosemantic coherence, geometric reference variants, and indoor representations. As a result of their permutations, each building can be generated in 392 different CityGML representations, an unprecedented number of modelling variants of the same feature. The datasets produced by RANDOM3DCITY are suited for several applications, as we show in this paper with documented uses. The developed engine is available under an open-source licence at Github at http://github.com/tudelft3d/Random3Dcity.

  17. Update and extension of the Brazil SimSmoke model to estimate the health impact of cigarette smoking by pregnant women in Brazil

    OpenAIRE

    Szklo, André Salem; Yuan, Zhe; Levy, David

    2017-01-01

    Abstract: A previous application of the Brazil SimSmoke tobacco control policy simulation model was used to show the effect of policies implemented between 1989 and 2010 on smoking-attributable deaths (SADs). In this study, we updated and further validated the Brazil SimSmoke model to incorporate policies implemented since 2011 (e.g., a new tax structure with the purpose of increasing revenues/real prices). In addition, we extended the model to estimate smoking-attributable maternal and child...

  18. Update on Fresh Fuel Characterization of U-Mo Alloys

    International Nuclear Information System (INIS)

    Burkes, D.E.; Wachs, D.M.; Keiser, D.D.; Okuniewski, M.A.; Jue, J.F.; Rice, F.J.; Prabhakaran, R.

    2009-01-01

The need to provide more accurate property information on U-Mo fuel alloys to operators, modellers, researchers, fabricators, and government increases as success of the GTRI Reactor Convert program continues. This presentation provides an update on fresh fuel characterization activities that have occurred at the INL since the RERTR 2008 conference in Washington, D.C. The update is particularly focused on properties recently obtained and on the development progress of new measurement techniques. Furthermore, areas where useful and necessary information is still lacking are discussed. The update deals with mechanical, physical, and microstructural properties for both integrated and separate effects. Appropriate discussion of fabrication characteristics, impurities, thermodynamic response, and effects on the topic areas is provided, along with a background on the characterization techniques used and developed to obtain the information. Efforts to measure similar characteristics on irradiated fuel plates are discussed.

  19. Spiral model of procedural cycle of educational process management

    Directory of Open Access Journals (Sweden)

    Bezrukov Valery I.

    2016-01-01

Full Text Available The article analyzes the nature and characteristics of the spiral model of the procedural cycle of educational systems management. The authors identify patterns between the development of information and communication technologies and the transformation of the education management process, and characterize the concepts of "information literacy" and "media education". They consider the design function and determine its potential in changing the traditional educational paradigm to the new, information-based one.

  20. A Comparison of Exposure Control Procedures in CAT Systems Based on Different Measurement Models for Testlets

    Science.gov (United States)

    Boyd, Aimee M.; Dodd, Barbara; Fitzpatrick, Steven

    2013-01-01

    This study compared several exposure control procedures for CAT systems based on the three-parameter logistic testlet response theory model (Wang, Bradlow, & Wainer, 2002) and Masters' (1982) partial credit model when applied to a pool consisting entirely of testlets. The exposure control procedures studied were the modified within 0.10 logits…
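The idea of an exposure control procedure can be illustrated with the randomesque rule (select at random among the k most informative items) against unconstrained maximum-information selection. This is a generic sketch with invented item informations, not the specific procedures or testlet models compared in the study.

```python
import random

random.seed(0)

# Hypothetical item pool: each item's information at the current ability
# estimate (values invented for illustration).
info = [0.9, 0.85, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1]

def pick_max():
    """No exposure control: always administer the most informative item."""
    return max(range(len(info)), key=lambda i: info[i])

def pick_randomesque(k=5):
    """Randomesque control: choose at random among the k most informative."""
    top = sorted(range(len(info)), key=lambda i: info[i], reverse=True)[:k]
    return random.choice(top)

def exposure(select, trials=10_000):
    """Maximum item exposure rate over many simulated examinees."""
    counts = [0] * len(info)
    for _ in range(trials):
        counts[select()] += 1
    return max(counts) / trials

print(exposure(pick_max), exposure(pick_randomesque))
```

Maximum-information selection exposes its best item to every examinee (rate 1.0), while the randomesque rule caps the rate near 1/k, at a small cost in measurement precision; procedures such as Sympson-Hetter manage the same trade-off probabilistically.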

  1. On a computational method for modelling complex ecosystems by superposition procedure

    International Nuclear Information System (INIS)

    He Shanyu.

    1986-12-01

In this paper, the Superposition Procedure is concisely described, and a computational method for modelling a complex ecosystem is proposed. With this method, the information contained in acceptable submodels and observed data can be utilized to the maximal degree. (author). 1 ref

  2. Badhwar-O'Neill 2011 Galactic Cosmic Ray Model Update and Future Improvements

    Science.gov (United States)

    O'Neill, Pat M.; Kim, Myung-Hee Y.

    2014-01-01

The Badhwar-O'Neill Galactic Cosmic Ray (GCR) Model, based on actual GCR measurements, is used by deep space mission planners for the certification of micro-electronic systems and the analysis of radiation health risks to astronauts in space missions. The BO GCR Model provides GCR flux in deep space (outside the earth's magnetosphere) for any given time from 1645 to present. The energy spectrum from 50 MeV/n-20 GeV/n is provided for ions from hydrogen to uranium. This work describes the most recent version of the BO GCR model (BO'11). BO'11 determines the GCR flux at a given time by applying an empirical time delay function to past sunspot activity. We describe the GCR measurement data used in the BO'11 update - modern data from BESS, PAMELA, CAPRICE, and ACE emphasized more than the older balloon data used for the previous BO model (BO'10). We look at the GCR flux for the last 24 solar minima and show how much greater the flux was for the cycle 24 minimum in 2010. The BO'11 Model uses the traditional, steady-state Fokker-Planck differential equation to account for particle transport in the heliosphere due to diffusion, convection, and adiabatic deceleration. It assumes a radially symmetrical diffusion coefficient derived from magnetic disturbances caused by sunspots carried onward by a constant solar wind. A more complex differential equation is now being tested to account for particle transport in the heliosphere in the next generation BO model. This new model is time-dependent (no longer a steady-state model). In the new model, the dynamics and anti-symmetrical features of the actual heliosphere are accounted for, so empirical time delay functions will no longer be required. The new model will be capable of simulating the more subtle features of modulation - such as the Sun's polarity and modulation dependence on the gradient and curvature drift. This improvement is expected to significantly improve the fidelity of the BO GCR model. Preliminary results of its

  3. Computerized Adaptive Testing with R: Recent Updates of the Package catR

    Directory of Open Access Journals (Sweden)

    David Magis

    2017-01-01

Full Text Available The purpose of this paper is to list the recent updates of the R package catR. This package allows for generating response patterns under a computerized adaptive testing (CAT) framework with underlying item response theory (IRT) models. Among the most important updates, well-known polytomous IRT models are now supported by catR; several item selection rules have been added; and it is now possible to perform post-hoc simulations. Some functions were also rewritten or withdrawn to improve the usefulness and performance of the package.
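catR is an R package; as a language-neutral illustration of what such a package automates, the sketch below implements the 3PL response model and maximum-information item selection, a standard CAT rule. The item parameters are invented and this is not catR's code.

```python
import math

def p3pl(theta, a, b, c):
    """3PL probability of a correct response at ability theta."""
    return c + (1 - c) / (1 + math.exp(-a * (theta - b)))

def info3pl(theta, a, b, c):
    """Fisher information of a 3PL item at ability theta."""
    p = p3pl(theta, a, b, c)
    return a * a * ((p - c) ** 2 / (1 - c) ** 2) * ((1 - p) / p)

pool = [  # (a, b, c) per item, invented for illustration
    (1.2, -1.0, 0.20),
    (0.9, 0.0, 0.25),
    (1.5, 0.2, 0.15),
    (1.1, 1.3, 0.20),
]

def next_item(theta, administered):
    """Pick the not-yet-administered item with maximal information at theta."""
    candidates = [i for i in range(len(pool)) if i not in administered]
    return max(candidates, key=lambda i: info3pl(theta, *pool[i]))

print(next_item(0.0, set()))  # index of the most informative item at theta = 0
```

A full CAT engine layers onto this core the pieces the abstract mentions: polytomous models, alternative selection rules, exposure control, and ability estimation after each response.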

  4. Updating systematic reviews: an international survey.

    Directory of Open Access Journals (Sweden)

    Chantelle Garritty

Full Text Available BACKGROUND: Systematic reviews (SRs) should be up to date to maintain their importance in informing healthcare policy and practice. However, little guidance is available about when and how to update SRs. Moreover, the updating policies and practices of organizations that commission or produce SRs are unclear. METHODOLOGY/PRINCIPAL FINDINGS: The objective was to describe the updating practices and policies of agencies that sponsor or conduct SRs. An Internet-based survey was administered to a purposive non-random sample of 195 healthcare organizations within the international SR community. Survey results were analyzed using descriptive statistics. The completed response rate was 58% (n = 114) from across 26 countries with 70% (75/107) of participants identified as producers of SRs. Among responders, 79% (84/107) characterized the importance of updating as high or very-high and 57% (60/106) of organizations reported to have a formal policy for updating. However, only 29% (35/106) of organizations made reference to a written policy document. Several groups (62/105; 59%) reported updating practices as irregular, and over half (53/103) of organizational respondents estimated that more than 50% of their respective SRs were likely out of date. Authors of the original SR (42/106; 40%) were most often deemed responsible for ensuring SRs were current. Barriers to updating included resource constraints, reviewer motivation, lack of academic credit, and limited publishing formats. Most respondents (70/100; 70%) indicated that they supported centralization of updating efforts across institutions or agencies. Furthermore, 84% (83/99) of respondents indicated they favoured the development of a central registry of SRs, analogous to efforts within the clinical trials community. CONCLUSIONS/SIGNIFICANCE: Most organizations that sponsor and/or carry out SRs consider updating important. 
Despite this recognition, updating practices are not regular, and many organizations lack

  5. Training working memory updating in young adults.

    Science.gov (United States)

    Linares, Rocío; Borella, Erika; Lechuga, M Teresa; Carretti, Barbara; Pelegrina, Santiago

    2018-05-01

    Working memory updating (WMU) is a core mechanism in the human mental architecture and a good predictor of a wide range of cognitive processes. This study analyzed the benefits of two different WMU training procedures, near transfer effects on a working memory measure, and far transfer effects on nonverbal reasoning. Maintenance of any benefits a month later was also assessed. Participants were randomly assigned to: an adaptive training group that performed two numerical WMU tasks during four sessions; a non-adaptive training group that performed the same tasks but on a constant and less demanding level of difficulty; or an active control group that performed other tasks unrelated with working memory. After the training, all three groups showed improvements in most of the tasks, and these benefits were maintained a month later. The gain in one of the two WMU measures was larger for the adaptive and non-adaptive groups than for the control group. This specific gain in a task similar to the one trained would indicate the use of a better strategy for performing the task. Besides this nearest transfer effect, no other transfer effects were found. The adaptability of the training procedure did not produce greater improvements. These results are discussed in terms of the training procedure and the feasibility of training WMU.

  6. Update of the Unitarity Triangle Analysis

    CERN Document Server

    Bevan, A.J.; Ciuchini, M.; Derkach, D.; Stocchi, A.; Franco, E.; Silvestrini, L.; Lubicz, V.; Tarantino, Cecilia; Martinelli, G.; Parodi, F.; Schiavi, C.; Pierini, M.; Sordini, V.; Vagnoni, V.

    2010-01-01

We present the status of the Unitarity Triangle Analysis (UTA), within the Standard Model (SM) and beyond, with experimental and theoretical inputs updated for the ICHEP 2010 conference. Within the SM, we find that the general consistency among all the constraints leaves room only for some tension (between the UTA prediction and the experimental measurement) in BR(B -> tau nu), sin(2 beta) and epsilon_K. In the UTA beyond the SM, we allow for New Physics (NP) effects in (Delta F)=2 processes. The hint of NP at the 2.9 sigma level in the B_s-\bar B_s mixing turns out to be confirmed by the present update, which includes the new D0 result on the dimuon charge asymmetry but not the new CDF measurement of phi_s, as its likelihood has not yet been released.

  7. CONSIDERATIONS FOR THE TREATMENT OF COMPUTERIZED PROCEDURES IN HUMAN RELIABILITY ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; David I. Gertman

    2012-07-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room. Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  8. Analysis of evacuation procedure after the accident of the Fukushima Daiichi Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Murayama, T.; Iizuka, F.; El-Asaad, H. [Tokyo Inst. of Tech., Tokyo (Japan)

    2014-07-01

    After the Great East Japan Earthquake of March 2011 struck the coast of Eastern Japan, evacuation procedures were undermined due to the unexpected magnitude and severity of the disaster. Also, communications between local and national government were weakened, leading to dismemberment between society and government. Consequently this left the affected people without sufficient information or updates regarding evacuation procedures. This paper will concentrate on evacuation procedures led by locating residents with the help of media outlets (local newspapers and news reports). Analyzing movements of evacuees will help improve the evacuation method both for local residents and government bodies. (author)

  9. Analysis of evacuation procedure after the accident of the Fukushima Daiichi Nuclear Power Plant

    International Nuclear Information System (INIS)

    Murayama, T.; Iizuka, F.; El-Asaad, H.

    2014-01-01

    After the Great East Japan Earthquake of March 2011 struck the coast of Eastern Japan, evacuation procedures were undermined due to the unexpected magnitude and severity of the disaster. Also, communications between local and national government were weakened, leading to dismemberment between society and government. Consequently this left the affected people without sufficient information or updates regarding evacuation procedures. This paper will concentrate on evacuation procedures led by locating residents with the help of media outlets (local newspapers and news reports). Analyzing movements of evacuees will help improve the evacuation method both for local residents and government bodies. (author)

  10. Introducing GEOPHIRES v2.0: Updated Geothermal Techno-Economic Simulation Tool: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Beckers, Koenraad J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); McCabe, Kevin [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-02-16

    This paper presents an updated version of the geothermal techno-economic simulation tool GEOPHIRES (GEOthermal Energy for Production of Heat and electricity (IR) Economically Simulated). GEOPHIRES combines reservoir, wellbore, surface plant and economic models to estimate the capital, and operation and maintenance costs, lifetime energy production, and overall levelized cost of energy of a geothermal plant. The available end-use options are electricity, direct-use heat and cogeneration. The main updates in the new version include conversion of the source code from FORTRAN to Python, the option to couple to an external reservoir simulator, updated cost correlations, and more flexibility in selecting the time step and number of injection and production wells. An overview of all the updates and two case-studies to illustrate the tool's new capabilities are provided in this paper.
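The levelized cost of energy that GEOPHIRES reports can be sketched as discounted lifetime cost divided by discounted lifetime energy. The function below is a generic LCOE calculation, not GEOPHIRES code, and every number in the example call is an invented placeholder.

```python
def lcoe(capex, opex_per_year, energy_per_year, years, discount_rate):
    """Levelized cost of energy: discounted lifetime cost over discounted
    lifetime energy (simple flat-profile version)."""
    disc_cost = capex          # capital cost incurred at year 0
    disc_energy = 0.0
    for t in range(1, years + 1):
        df = (1 + discount_rate) ** t
        disc_cost += opex_per_year / df
        disc_energy += energy_per_year / df
    return disc_cost / disc_energy

# Hypothetical plant: 30 MUSD capital, 1 MUSD/yr O&M, 40 GWh/yr output,
# 25-year lifetime, 7% discount rate; result is in USD per kWh.
print(lcoe(30e6, 1e6, 40e6, 25, 0.07))
```

A tool like GEOPHIRES obtains `capex`, `opex_per_year`, and the (generally time-varying) yearly energy from its coupled reservoir, wellbore, surface-plant, and cost models rather than from constants.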

  11. The Potosi Reservoir Model 2013c, Property Modeling Update

    Energy Technology Data Exchange (ETDEWEB)

    Adushita, Yasmin; Smith, Valerie; Leetaru, Hannes

    2014-09-30

property modeling workflows and layering. This model was retained as the base case. In the preceding Task [1], the Potosi reservoir model was updated to take into account the new data from the Verification Well #2 (VW2) which was drilled in 2012. The porosity and permeability modeling was revised to take into account the log data from the new well. Revisions of the 2010 modeling assumptions were also done on relative permeability, capillary pressures, formation water salinity, and the maximum allowable well bottomhole pressure. Dynamic simulations were run using the injection target of 3.5 million tons per annum (3.2 MTPA) for 30 years. This dynamic model was named Potosi Dynamic Model 2013b. In this Task, a new property modeling workflow was applied, where seismic inversion data guided the porosity mapping and geobody extraction. The static reservoir model was fully guided by PorosityCube interpretations and derivations coupled with petrophysical logs from three wells. The two main assumptions are: porosity features in the PorosityCube that correlate with lost circulation zones represent vugular zones, and that these vugular zones are laterally continuous. Extrapolation was done carefully to populate the vugular facies and their corresponding properties outside the seismic footprint up to the boundary of the 30 by 30 mi (48 by 48 km) model. Dynamic simulations were also run using the injection target of 3.5 million tons per annum (3.2 MTPA) for 30 years. This new dynamic model was named Potosi Dynamic Model 2013c. Reservoir simulation with the latest model gives a cumulative injection of 43 million tons (39 MT) in 30 years with a single well, which corresponds to 40% of the injection target. The injection rate is approx. 3.2 MTPA in the first six months as the well is injecting into the surrounding vugs, and declines rapidly to 1.8 million tons per annum (1.6 MTPA) in year 3 once the surrounding vugs are full and the CO2 starts to reach the matrix. 
After, the injection

  12. The TVT-obturator surgical procedure for the treatment of female stress urinary incontinence: a clinical update.

    Science.gov (United States)

    Waltregny, David; de Leval, Jean

    2009-03-01

    Six years ago, the inside-out transobturator tape TVT-O procedure was developed for the surgical treatment of female stress urinary incontinence (SUI) with the aim of minimizing the risk of urethra and bladder injuries and ensuring minimal tissue dissection. Initial feasibility and efficacy studies suggested that the TVT-O procedure is associated with high SUI cure rates and low morbidity at short term. A recent analysis of medium-term results indicated that the TVT-O procedure is efficient, with maintenance, after a 3-year minimum follow-up, of cure rates comparing favorably with those reported for TVT. No late complications were observed. As of July 2008, more than 35 clinical papers, including ten randomized trials and two national registries, have been published on the outcome of the TVT-O surgery. Results from these studies have confirmed that the TVT-O procedure is safe and as efficient as the TVT procedure, at least in the short/medium term.

  13. The Danish (Q)SAR Database Update Project

    DEFF Research Database (Denmark)

    Nikolov, Nikolai Georgiev; Dybdahl, Marianne; Abildgaard Rosenberg, Sine

    2013-01-01

    The Danish (Q)SAR Database is a collection of predictions from quantitative structure–activity relationship ((Q)SAR) models for over 70 environmental and human health-related endpoints (covering biodegradation, metabolism, allergy, irritation, endocrine disruption, teratogenicity, mutagenicity......, carcinogenicity and others), each of them available for 185,000 organic substances. The database has been available online since 2005 (http://qsar.food.dtu.dk). A major update project for the Danish (Q)SAR database is under way, with a new online release planned in the beginning of 2015. The updated version...... will contain more than 600,000 discrete organic structures and new, more precise predictions for all endpoints, derived by consensus algorithms from a number of state-of-the-art individual predictions. Copyright © 2013 Published by Elsevier Ireland Ltd....
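A consensus call over several individual (Q)SAR predictions can be as simple as a majority vote with abstention on ties. The sketch below is purely illustrative and is not the database's actual consensus algorithm, whose details the record does not give.

```python
def consensus(predictions):
    """Combine individual (Q)SAR calls for one endpoint into a single call.

    predictions: list of 'positive' / 'negative' / 'inconclusive' strings.
    Inconclusive models abstain; ties (or no votes) stay inconclusive.
    """
    votes = [p for p in predictions if p != "inconclusive"]
    pos, neg = votes.count("positive"), votes.count("negative")
    if not votes or pos == neg:
        return "inconclusive"
    return "positive" if pos > neg else "negative"

print(consensus(["positive", "positive", "negative"]))      # positive
print(consensus(["positive", "negative", "inconclusive"]))  # inconclusive
```

Production systems typically also weight each model by its applicability domain and past accuracy, which is one route to the "more precise" consensus predictions the record describes.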

  14. Phase 9 update (1987) report for the Energy Economic Data Base Program EEDB-IX

    International Nuclear Information System (INIS)

    1988-07-01

This document is a review of the 1987 update of detailed reference powerplant cost estimates. This distribution is the latest in a series published since 1978. The overall program purpose is to provide periodically updated, detailed base construction cost estimates for large nuclear electric operating plants. These data, which are representative of current US powerplant construction cost experience, are a useful contribution to program planning by the Office of the Assistant Secretary for Nuclear Energy. The ninth update includes three new models as a basis for projecting future costs of both small and large nuclear power plants and examining the potential for cost reduction by incorporation of improved and advanced design features. The models designated as improved include the advantages of modular construction and a standardized approach to design and construction. The advanced model includes many of the features of the improved models as well as application of passive or near-passive safety systems and design simplification

  15. Updates to building-code maps for the 2015 NEHRP recommended seismic provisions

    Science.gov (United States)

    Luco, Nicolas; Bachman, Robert; Crouse, C.B; Harris, James R.; Hooper, John D.; Kircher, Charles A.; Caldwell, Phillp; Rukstales, Kenneth S.

    2015-01-01

    With the 2014 update of the U.S. Geological Survey (USGS) National Seismic Hazard Model (NSHM) as a basis, the Building Seismic Safety Council (BSSC) has updated the earthquake ground motion maps in the National Earthquake Hazards Reduction Program (NEHRP) Recommended Seismic Provisions for New Buildings and Other Structures, with partial funding from the Federal Emergency Management Agency. Anticipated adoption of the updated maps into the American Society of Civil Engineers Minimum Design Loads for Building and Other Structures and the International Building and Residential Codes is underway. Relative to the ground motions in the prior edition of each of these documents, most of the updated values are within a ±20% change. The larger changes are, in most cases, due to the USGS NSHM updates, reasons for which are given in companion publications. In some cases, the larger changes are partly due to a BSSC update of the slope of the fragility curve that is used to calculate the risk-targeted ground motions, and/or the introduction by BSSC of a quantitative definition of “active faults” used to calculate deterministic ground motions.
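The risk-targeted ground-motion calculation mentioned here combines a hazard curve with a lognormal collapse fragility and iterates the fragility median until a target collapse risk is met. The sketch below uses a power-law hazard curve and constants that are illustrative only, not the USGS/BSSC values; the fragility slope and the 1%-in-50-years target are stated as assumptions.

```python
import math

BETA = 0.6            # fragility-curve slope (log std. dev.), an assumption
TARGET_50YR = 0.01    # assumed target: 1% collapse probability in 50 years

def hazard(a):
    """Annual frequency of exceeding acceleration a (g); illustrative power law."""
    return 1e-4 * a ** -2.5

def frag(a, median):
    """Lognormal collapse fragility: P(collapse | acceleration a)."""
    return 0.5 * (1 + math.erf(math.log(a / median) / (BETA * math.sqrt(2))))

def annual_collapse_freq(median, a_lo=0.01, a_hi=10.0, n=2000):
    """Integrate P(collapse|a) against the hazard curve (midpoint rule, log grid)."""
    total = 0.0
    for i in range(n):
        a0 = a_lo * (a_hi / a_lo) ** (i / n)
        a1 = a_lo * (a_hi / a_lo) ** ((i + 1) / n)
        total += frag(math.sqrt(a0 * a1), median) * (hazard(a0) - hazard(a1))
    return total

def risk_targeted_median(lo=0.05, hi=5.0):
    """Bisect on the fragility median so the 50-year risk hits the target."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        risk_50 = 1 - math.exp(-annual_collapse_freq(mid) * 50)
        lo, hi = (mid, hi) if risk_50 > TARGET_50YR else (lo, mid)
    return 0.5 * (lo + hi)

print(risk_targeted_median())  # fragility median (in g) meeting the target risk
```

Because the collapse risk integral depends directly on the fragility slope, the BSSC's update of that slope changes the mapped risk-targeted ground motions even where the hazard curves did not change, which is the mechanism the record alludes to.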

  16. Performing and updating an inventory of Oregon's expanding irrigated agricultural lands utilizing remote sensing technology

    Science.gov (United States)

    Hall, M. J.

    1981-01-01

An inventory technique based upon using remote sensing technology, interpreting both high altitude aerial photography and LANDSAT multispectral scanner imagery, is discussed. It is noted that once the final land use inventory maps of irrigated agricultural lands are available and approximately scaled, they may be overlaid directly onto either multispectral scanner or return beam vidicon prints, thereby providing an inexpensive updating procedure.

  17. Life cycle reliability assessment of new products—A Bayesian model updating approach

    International Nuclear Information System (INIS)

    Peng, Weiwen; Huang, Hong-Zhong; Li, Yanfeng; Zuo, Ming J.; Xie, Min

    2013-01-01

The rapidly increasing pace and continuously evolving reliability requirements of new products have made life cycle reliability assessment of new products an imperative yet difficult task. While much work has been done to separately estimate reliability of new products in specific stages, a gap exists in carrying out life cycle reliability assessment throughout all life cycle stages. We present a Bayesian model updating approach (BMUA) for life cycle reliability assessment of new products. Novel features of this approach are the development of Bayesian information toolkits by separately including "reliability improvement factor" and "information fusion factor", which allow the integration of subjective information in a specific life cycle stage and the transition of integrated information between adjacent life cycle stages. They lead to the unique characteristics of the BMUA in which information generated throughout life cycle stages are integrated coherently. To illustrate the approach, an application to the life cycle reliability assessment of a newly developed Gantry Machining Center is shown
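The flavor of Bayesian updating across life cycle stages can be shown with a conjugate beta-binomial toy model. The stage transition below, which simply discounts the carried-over pseudo-counts, is a crude stand-in for the paper's "information fusion factor" toolkit; all counts and factors are invented.

```python
def update(alpha, beta, successes, failures):
    """Conjugate beta-binomial update with one stage's pass/fail test data."""
    return alpha + successes, beta + failures

def carry_over(alpha, beta, fusion=0.5):
    """Transition between life cycle stages: keep only a fraction of the
    accumulated pseudo-counts (hypothetical fusion rule)."""
    return 1 + fusion * (alpha - 1), 1 + fusion * (beta - 1)

a, b = 1.0, 1.0                # vague Beta(1, 1) prior at the design stage
a, b = update(a, b, 18, 2)     # design-stage tests: 18 pass / 2 fail
a, b = carry_over(a, b)        # move to the field stage, discounting the prior
a, b = update(a, b, 45, 1)     # field data: 45 pass / 1 fail
print(a / (a + b))             # posterior mean reliability
```

The point of the discounting step is that evidence from an earlier stage should inform, but not dominate, the assessment of the (possibly improved) product in the next stage.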

  18. Price adjustment for traditional Chinese medicine procedures: Based on a standardized value parity model.

    Science.gov (United States)

    Wang, Haiyin; Jin, Chunlin; Jiang, Qingwu

    2017-11-20

Traditional Chinese medicine (TCM) is an important part of China's medical system. Due to the prolonged low prices of TCM procedures and the lack of an effective mechanism for dynamic price adjustment, the development of TCM has markedly lagged behind Western medicine. The World Health Organization (WHO) has emphasized the need to enhance the development of alternative and traditional medicine when creating national health care systems. Establishing scientific and appropriate mechanisms to adjust the prices of TCM procedures is therefore crucial to promoting the development of TCM. This study examined the incorporation of value indicators (basic manpower expended, time spent, technical difficulty, and degree of risk) into the latest standards for the price of medical procedures in China, and it offers a price adjustment model with the relative price ratio as a key index. The study examined 144 TCM procedures and found that their prices were mainly based on the value of the medical care provided; on average, medical care provided accounted for 89% of the price. Current price levels were generally low, with the current price accounting for 56% of the standardized value of a procedure on average. Current prices accounted for a markedly lower share of the standardized value for acupuncture, moxibustion, special TCM treatments, and comprehensive TCM procedures. The study selected a total of 79 procedures and adjusted them by priority; the relationship between the prices of TCM procedures and the suggested prices was significantly optimized. Price adjustment based on a standardized value parity model is a scientific and suitable method that can serve as a reference for other provinces and municipalities in China, as well as for other countries and regions that mainly have fee-for-service (FFS) medical care.
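The key index described above, the ratio of current price to standardized value, can be sketched directly. The procedure names and figures below are illustrative assumptions, not data from the study:

```python
# Hedged sketch of the relative price ratio used to rank procedures for
# adjustment priority (all names and numbers are illustrative).
procedures = {
    "acupuncture": (80.0, 200.0),    # (current price, standardized value)
    "moxibustion": (60.0, 130.0),
    "tuina massage": (90.0, 120.0),
}
ratios = {name: price / value for name, (price, value) in procedures.items()}
# Lowest ratio = most underpriced relative to its standardized value,
# so it is adjusted first.
priority = sorted(ratios, key=ratios.get)
```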

  19. Recreation of architectural structures using procedural modeling based on volumes

    Directory of Open Access Journals (Sweden)

    Santiago Barroso Juan

    2013-11-01

While the procedural modeling of buildings and other architectural structures has evolved significantly in recent years, there is a noticeable absence of high-level tools that allow a designer, an artist, or a historian to recreate important buildings or architectonic structures of a particular city. In this paper we present a tool for creating buildings in a simple and clear way, following rules that use the designer's own language and methodology for creating buildings, while hiding the algorithmic details of model creation from the user.
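Rule-based procedural modeling of this kind is often built on derivation rules that rewrite shapes into sub-shapes. The toy split-grammar below is a hedged illustration of that general idea, not the paper's tool; all rule names are invented:

```python
# Hedged toy illustration of rule-based procedural modeling: non-terminal
# shape symbols are rewritten by named rules until only terminal building
# elements remain.
RULES = {
    "building": ["ground_floor", "floor", "floor", "roof"],
    "ground_floor": ["door", "window", "window"],
    "floor": ["window", "window", "window"],
}

def derive(symbol):
    """Recursively expand a shape symbol into its terminal elements."""
    if symbol not in RULES:
        return [symbol]          # terminal element: door, window, roof, ...
    out = []
    for child in RULES[symbol]:
        out.extend(derive(child))
    return out

elements = derive("building")
```

The user-facing rules stay declarative (the `RULES` table) while the derivation algorithm stays hidden, mirroring the separation the abstract describes.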

  20. Procedure guideline for radioiodine test (version 3)

    International Nuclear Information System (INIS)

    Dietlein, M.; Schicha, H.; Eschner, W.; Deutsche Gesellschaft fuer Medizinische Physik; Koeln Univ.; Lassmann, M.; Deutsche Gesellschaft fuer Medizinische Physik; Wuerzburg Univ.; Leisner, B.; Allgemeines Krankenhaus St. Georg, Hamburg; Reiners, C.; Wuerzburg Univ.

    2007-01-01

Version 3 of the procedure guideline for the radioiodine test is an update of the guideline previously published in 2003. The guideline discusses the pros and cons of a single measurement versus repeated measurements of iodine-131 uptake and their optimal timing. Different formulas are described for when one, two, or three values of the radioiodine kinetics are available. The probe with a sodium iodide crystal and, alternatively or additionally, the gamma camera using the ROI technique are the instruments for measuring iodine-131 uptake. A possible source of error is an inappropriate measurement (sonography) of the target volume. The patient's preparation includes the withdrawal of antithyroid drugs 2-3 days before radioiodine administration. The patient has to avoid iodine-containing medication, and the possibility of iodine additives in vitamin and electrolyte supplements has to be considered. (orig.)

  1. Main-Memory Operation Buffering for Efficient R-Tree Update

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Saltenis, Simonas; Biveinis, Laurynas

    2007-01-01

......the main memory that is indeed available, or do not support some of the standard index operations. Assuming a setting where the index updates need not be written to disk immediately, we propose an R-tree-based indexing technique that does not exhibit any of these drawbacks. This technique exploits the buffering of update operations in main memory as well as the grouping of operations to reduce disk I/O. In particular, operations are performed in bulk so that multiple operations are able to share I/O. The paper presents an analytical cost model that is shown to be accurate by empirical studies......
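The buffering-and-bulk-flush idea can be sketched with a generic write buffer; this is a hedged illustration of the pattern, not the paper's R-tree algorithm, and the flush policy here (a simple size threshold) is an assumption:

```python
# Hedged sketch of main-memory operation buffering: updates for the same
# object collapse in the buffer, and a flush applies the surviving
# operations to the "disk" index in one bulk pass.
class BufferedIndex:
    def __init__(self, flush_threshold=3):
        self.disk = {}            # stands in for the on-disk R-tree
        self.buffer = {}          # object id -> latest buffered operation
        self.flushes = 0
        self.flush_threshold = flush_threshold

    def update(self, oid, value):
        self.buffer[oid] = ("upsert", value)   # overwrites any older op
        self._maybe_flush()

    def delete(self, oid):
        self.buffer[oid] = ("delete", None)
        self._maybe_flush()

    def _maybe_flush(self):
        if len(self.buffer) >= self.flush_threshold:
            self.flush()

    def flush(self):
        # One bulk pass: multiple buffered operations share this "I/O".
        for oid, (op, value) in self.buffer.items():
            if op == "upsert":
                self.disk[oid] = value
            else:
                self.disk.pop(oid, None)
        self.buffer.clear()
        self.flushes += 1

idx = BufferedIndex()
idx.update(1, (10, 20))
idx.update(1, (11, 21))   # collapses with the previous update in memory
idx.update(2, (5, 5))
idx.delete(2)             # cancels the buffered insert before any I/O
idx.update(3, (0, 0))     # third distinct object triggers one bulk flush
```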

  2. Predictive market segmentation model: An application of logistic regression model and CHAID procedure

    Directory of Open Access Journals (Sweden)

    Soldić-Aleksić Jasna

    2009-01-01

Market segmentation is one of the key concepts of modern marketing. Its main goal is to create groups (segments) of customers that have similar characteristics, needs, wishes, and/or similar behavior regarding the purchase of a specific product or service. Companies can create a specific marketing plan for each of these segments and thereby gain a short- or long-term competitive advantage on the market. Depending on the specific marketing goal, different segmentation schemes and techniques may be applied. This paper presents a predictive market segmentation model based on the application of a logistic regression model and CHAID analysis. The logistic regression model was used for variable selection (from the initial pool of eleven variables), retaining those that are statistically significant for explaining the dependent variable. The selected variables were afterwards included in the CHAID procedure, which generated the predictive market segmentation model. The model results are presented through a concrete empirical example in the following form: summary model results, the CHAID tree, a gain chart, an index chart, and risk and classification tables.

  3. Update of axion CDM energy density

    International Nuclear Information System (INIS)

    Huh, Ji-Haeng

    2008-01-01

We update the cosmological bound on the axion model. The contributions from the anharmonic effect and the newly introduced initial overshoot correction are considered. We present an explicit formula for the axion relic density in terms of the QCD scale Λ_QCD, the current quark masses m_q, and the Peccei-Quinn scale F_a, including the newly introduced factor of 1.85 that arises from the initial overshoot.

  4. Updating of working memory: lingering bindings.

    Science.gov (United States)

    Oberauer, Klaus; Vockenberg, Kerstin

    2009-05-01

    Three experiments investigated proactive interference and proactive facilitation in a memory-updating paradigm. Participants remembered several letters or spatial patterns, distinguished by their spatial positions, and updated them by new stimuli up to 20 times per trial. Self-paced updating times were shorter when an item previously remembered and then replaced reappeared in the same location than when it reappeared in a different location. This effect demonstrates residual memory for no-longer-relevant bindings of items to locations. The effect increased with the number of items to be remembered. With one exception, updating times did not increase, and recall of final values did not decrease, over successive updating steps, thus providing little evidence for proactive interference building up cumulatively.

  5. Behavioural Procedural Models – a multipurpose mechanistic account

    Directory of Open Access Journals (Sweden)

    Leonardo Ivarola

    2012-05-01

In this paper we outline an epistemological defence of what we call Behavioural Procedural Models (BPMs), which represent the processes of individual decisions that lead to relevant economic patterns as psychologically (rather than rationally) driven. Their general structure is shown, along with the way in which they may be incorporated into a multipurpose view of models where the representational and interventionist goals are combined. It is argued that BPMs may provide “mechanistic-based explanations” in the sense defended by Hedström and Ylikoski (2010), which involve invariant regularities in Woodward's sense. Such mechanisms provide a causal sort of explanation of anomalous economic patterns, which allows for extra-market intervention and manipulability in order to correct and improve some key individual decisions. This capability sets the basis for so-called libertarian paternalism (Sunstein and Thaler 2003).

  6. Evaluation of alternative surface runoff accounting procedures using the SWAT model

    Science.gov (United States)

For surface runoff estimation in the Soil and Water Assessment Tool (SWAT) model, the curve number (CN) procedure is commonly adopted to calculate surface runoff by utilizing the antecedent soil moisture condition (SCSI) in the field. In the recent version of SWAT (SWAT2005), an alternative approach is available...
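The CN procedure mentioned above rests on the standard SCS curve number runoff equation; a minimal sketch (the CN value and rainfall depth are illustrative, not from the study):

```python
# Standard SCS curve number runoff equation underlying SWAT's CN procedure
# (units: inches).
def scs_runoff(precip_in, cn, ia_ratio=0.2):
    """Direct surface runoff Q from storm precipitation P and curve number CN."""
    s = 1000.0 / cn - 10.0        # potential maximum retention after runoff begins
    ia = ia_ratio * s             # initial abstraction (conventional 0.2*S)
    if precip_in <= ia:
        return 0.0                # all rainfall absorbed before runoff starts
    return (precip_in - ia) ** 2 / (precip_in - ia + s)

q = scs_runoff(4.0, 75)           # 4 inches of rain on a CN = 75 watershed
```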

  7. PSHA in Israel by using the synthetic ground motions from simulated seismicity: the modified SvE procedure

    Science.gov (United States)

    Meirova, T.; Shapira, A.; Eppelbaum, L.

    2018-05-01

In this study, we updated and modified the SvE approach of Shapira and van Eck (Nat Hazards 8:201-215, 1993), which may be applied as an alternative to the conventional probabilistic seismic hazard assessment (PSHA) in Israel and other regions of low and moderate seismicity where measurements of strong ground motions are scarce. The new computational code SvE overcomes difficulties associated with the description of the earthquake source model and regional ground-motion scaling. In the modified SvE procedure, generating suites of regional ground motion is based on the extended two-dimensional source model of Motazedian and Atkinson (Bull Seism Soc Amer 95:995-1010, 2005a) and updated regional ground-motion scaling (Meirova and Hofstetter, Bull Earthq Eng 15:3417-3436, 2017). The analytical approach of Mavroeidis and Papageorgiou (Bull Seism Soc Amer 93:1099-1131, 2003) is used to simulate near-fault acceleration with the near-fault effects. The comparison of hazard estimates obtained by using the conventional method implemented in the National Building Code for design provisions for earthquake resistance of structures and the modified SvE procedure for rock-site conditions indicates a general agreement, with some perceptible differences at periods of 0.2 and 0.5 s. For periods above 0.5 s, the SvE estimates are systematically greater and can increase by a factor of 1.6. For soft-soil sites, the SvE hazard estimates at a period of 0.2 s are greater than those based on the CB2008 ground-motion prediction equation (GMPE) by a factor of 1.3-1.6. We suggest that hazard estimates for sites with soft-soil conditions calculated by the modified SvE procedure are more reliable than those obtained by means of conventional PSHA. This result agrees with the view that using a standard GMPE with the NEHRP soil classification based on the V_S30 parameter may be inappropriate for PSHA at many sites in Israel.

  8. How do we update faces? Effects of gaze direction and facial expressions on working memory updating

    Directory of Open Access Journals (Sweden)

Caterina Artuso

    2012-09-01

The aim of the study was to investigate how the biological binding between different facial dimensions, and their social and communicative relevance, may impact updating processes in working memory (WM). We focused on WM updating because it plays a key role in ongoing processing. Gaze direction and facial expression are crucial and changeable components of face processing. Direct gaze enhances the processing of approach-oriented facial emotional expressions (e.g., joy), while averted gaze enhances the processing of avoidance-oriented facial emotional expressions (e.g., fear). Thus, the way in which these two facial dimensions are combined communicates to the observer important behavioral and social information. Updating of these two facial dimensions and their bindings has not been investigated before, despite the fact that they provide a piece of social information essential for building and maintaining an internal ongoing representation of our social environment. In Experiment 1 we created a task in which the binding between gaze direction and facial expression was manipulated: high binding conditions (e.g., joy-direct gaze) were compared to low binding conditions (e.g., joy-averted gaze). Participants had to study and update continuously a number of faces, displaying different bindings between the two dimensions. In Experiment 2 we tested whether updating was affected by the social and communicative value of the facial dimension binding; to this end, we manipulated bindings between eye and hair color, two less communicative facial dimensions. Two new results emerged. First, faster response times were found in updating combinations of facial dimensions highly bound together. Second, our data showed that the ease of the ongoing updating processing varied depending on the communicative meaning of the binding that had to be updated. The results are discussed with reference to the role of WM updating in social cognition and appraisal processes.

  9. How do we update faces? Effects of gaze direction and facial expressions on working memory updating.

    Science.gov (United States)

    Artuso, Caterina; Palladino, Paola; Ricciardelli, Paola

    2012-01-01

    The aim of the study was to investigate how the biological binding between different facial dimensions, and their social and communicative relevance, may impact updating processes in working memory (WM). We focused on WM updating because it plays a key role in ongoing processing. Gaze direction and facial expression are crucial and changeable components of face processing. Direct gaze enhances the processing of approach-oriented facial emotional expressions (e.g., joy), while averted gaze enhances the processing of avoidance-oriented facial emotional expressions (e.g., fear). Thus, the way in which these two facial dimensions are combined communicates to the observer important behavioral and social information. Updating of these two facial dimensions and their bindings has not been investigated before, despite the fact that they provide a piece of social information essential for building and maintaining an internal ongoing representation of our social environment. In Experiment 1 we created a task in which the binding between gaze direction and facial expression was manipulated: high binding conditions (e.g., joy-direct gaze) were compared to low binding conditions (e.g., joy-averted gaze). Participants had to study and update continuously a number of faces, displaying different bindings between the two dimensions. In Experiment 2 we tested whether updating was affected by the social and communicative value of the facial dimension binding; to this end, we manipulated bindings between eye and hair color, two less communicative facial dimensions. Two new results emerged. First, faster response times were found in updating combinations of facial dimensions highly bound together. Second, our data showed that the ease of the ongoing updating processing varied depending on the communicative meaning of the binding that had to be updated. The results are discussed with reference to the role of WM updating in social cognition and appraisal processes.

  10. Simultaneous travel time tomography for updating both velocity and reflector geometry in triangular/tetrahedral cell model

    Science.gov (United States)

    Bai, Chao-ying; He, Lei-yu; Li, Xing-wang; Sun, Jia-yu

    2018-05-01

To conduct forward modeling and simultaneous inversion in a complex geological model, including irregular topography (or an irregular reflector or velocity anomaly), in this paper we combined our previous multiphase arrival tracking method (referred to as the triangular shortest-path method, TSPM) in a triangular (2D) or tetrahedral (3D) cell model with a linearized inversion solver (referred to as the damped minimum norms and constrained least squares problem solved using the conjugate gradient method, DMNCLS-CG) to formulate a simultaneous travel time inversion method for updating both velocity and reflector geometry by using multiphase arrival times. In the triangular/tetrahedral cells, we deduced the partial derivative of the velocity variation with respect to the depth change of the reflector. The numerical simulation results show that the computational accuracy can be tuned to high precision in forward modeling, and that irregular velocity anomalies and reflector geometry can be accurately captured in the simultaneous inversion, because the triangular/tetrahedral cells can easily stitch the irregular topography or subsurface interfaces.
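The damped least-squares step at the core of such linearized inversions can be sketched in miniature. This is a hedged stand-in for the DMNCLS-CG solver: a direct 2x2 solve of the damped normal equations rather than conjugate gradients, with an invented toy sensitivity matrix:

```python
# Minimal damped least-squares illustration: solve (G^T G + lam*I) m = G^T d
# for a tiny 2-parameter linearized travel-time problem. The damping term
# lam stabilizes the inversion against noise and poor conditioning.
def damped_lsq_2param(G, d, lam):
    gtg = [[sum(G[k][i] * G[k][j] for k in range(len(G))) for j in range(2)]
           for i in range(2)]
    gtd = [sum(G[k][i] * d[k] for k in range(len(G))) for i in range(2)]
    a11, a12 = gtg[0][0] + lam, gtg[0][1]
    a21, a22 = gtg[1][0], gtg[1][1] + lam
    det = a11 * a22 - a12 * a21
    return [(gtd[0] * a22 - a12 * gtd[1]) / det,
            (a11 * gtd[1] - a21 * gtd[0]) / det]

# Rows of G: partial derivatives of each travel time with respect to the two
# model parameters (e.g., a slowness perturbation and a reflector-depth change).
G = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]]
d = [1.0, 3.0, 5.0]              # observed-minus-predicted travel times
m = damped_lsq_2param(G, d, lam=0.01)   # small damping barely biases the fit
```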

  11. 49 CFR 1002.3 - Updating user fees.

    Science.gov (United States)

    2010-10-01

    ... updating fees. Each fee shall be updated by updating the cost components comprising the fee. Cost... direct labor costs are direct labor costs determined by the cost study set forth in Revision of Fees For... by total office costs for the Offices directly associated with user fee activity. Actual updating of...

  12. Qualitative mechanism models and the rationalization of procedures

    Science.gov (United States)

    Farley, Arthur M.

    1989-01-01

    A qualitative, cluster-based approach to the representation of hydraulic systems is described and its potential for generating and explaining procedures is demonstrated. Many ideas are formalized and implemented as part of an interactive, computer-based system. The system allows for designing, displaying, and reasoning about hydraulic systems. The interactive system has an interface consisting of three windows: a design/control window, a cluster window, and a diagnosis/plan window. A qualitative mechanism model for the ORS (Orbital Refueling System) is presented to coordinate with ongoing research on this system being conducted at NASA Ames Research Center.

  13. Extension of emergency operating procedures for severe accident management

    International Nuclear Information System (INIS)

    Chiang, S.C.

    1992-01-01

Enhancing the capability of reactor operators to cope with hypothetical severe accidents is the key issue for utilities. Taiwan Power Company has started enhancement programs on the extension of emergency operating procedures (EOPs). This includes a review of the existing EOPs based on the conclusions and recommendations of probabilistic risk assessment studies to confirm the operator actions. Then the plant-specific analysis for the accident management strategy will be performed, and the existing EOPs will be updated accordingly.

  14. A Numerical Procedure for Model Identifiability Analysis Applied to Enzyme Kinetics

    DEFF Research Database (Denmark)

Van Daele, Timothy; Van Hoey, Stijn; Gernaey, Krist

    2015-01-01

The proper calibration of models describing enzyme kinetics can be quite challenging. In the literature, different procedures are available to calibrate these enzymatic models in an efficient way. However, in most cases the model structure is already decided on prior to the actual calibration...... and Pronzato (1997) and which can be easily set up for any type of model. In this paper the proposed approach is applied to the forward reaction rate of the enzyme kinetics proposed by Shin and Kim (1998). Structural identifiability analysis showed that no local structural model problems were occurring......) identifiability problems. By using the presented approach it is possible to detect potential identifiability problems and avoid pointless calibration (and experimental!) effort.
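A common way to screen for the practical identifiability problems the abstract warns about is to check whether parameter sensitivity columns are nearly collinear. The sketch below is a hedged illustration of that general technique (not this paper's procedure), applied to a standard Michaelis-Menten rate, with invented substrate designs:

```python
# Hedged sketch of a local, sensitivity-based identifiability check for a
# Michaelis-Menten rate v = Vmax*S/(Km + S): near-collinear sensitivity
# columns signal that Vmax and Km cannot be estimated separately from the
# sampled substrate range.
import math

def sensitivities(vmax, km, substrates):
    d_vmax = [s / (km + s) for s in substrates]          # dv/dVmax
    d_km = [-vmax * s / (km + s) ** 2 for s in substrates]  # dv/dKm
    return d_vmax, d_km

def collinearity(x, y):
    """Absolute cosine between two sensitivity columns (near 1 = trouble)."""
    dot = sum(a * b for a, b in zip(x, y))
    nx = math.sqrt(sum(a * a for a in x))
    ny = math.sqrt(sum(b * b for b in y))
    return abs(dot) / (nx * ny)

wide = [0.1, 0.5, 1.0, 5.0, 20.0]   # substrate levels spanning Km
narrow = [18.0, 19.0, 20.0, 21.0]   # all far above Km: saturated regime only
c_wide = collinearity(*sensitivities(1.0, 1.0, wide))
c_narrow = collinearity(*sensitivities(1.0, 1.0, narrow))
```

Sampling only the saturated regime makes the two columns nearly parallel, which is exactly the kind of practical identifiability problem that wastes calibration (and experimental) effort.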

  15. Rule-Governed Imitative Verbal Behavior as a Function of Modeling Procedures

    Science.gov (United States)

    Clinton, LeRoy; Boyce, Kathleen D.

    1975-01-01

    Investigated the effectiveness of modeling procedures alone and complemented by the appropriate rule statement on the production of plurals. Subjects were 20 normal and 20 retarded children who were randomly assigned to one of two learning conditions and who received either affective or informative social reinforcement. (Author/SDH)

  16. Advanced Test Reactor Core Modeling Update Project Annual Report for Fiscal Year 2010

    Energy Technology Data Exchange (ETDEWEB)

    Rahmat Aryaeinejad; Douglas S. Crawford; Mark D. DeHart; George W. Griffith; D. Scott Lucas; Joseph W. Nielsen; David W. Nigg; James R. Parry; Jorge Navarro

    2010-09-01

    Legacy computational reactor physics software tools and protocols currently used for support of Advanced Test Reactor (ATR) core fuel management and safety assurance and, to some extent, experiment management are obsolete, inconsistent with the state of modern nuclear engineering practice, and are becoming increasingly difficult to properly verify and validate (V&V). Furthermore, the legacy staff knowledge required for application of these tools and protocols from the 1960s and 1970s is rapidly being lost due to staff turnover and retirements. In 2009 the Idaho National Laboratory (INL) initiated a focused effort to address this situation through the introduction of modern high-fidelity computational software and protocols, with appropriate V&V, within the next 3-4 years via the ATR Core Modeling and Simulation and V&V Update (or “Core Modeling Update”) Project. This aggressive computational and experimental campaign will have a broad strategic impact on the operation of the ATR, both in terms of improved computational efficiency and accuracy for support of ongoing DOE programs as well as in terms of national and international recognition of the ATR National Scientific User Facility (NSUF).

  17. Longitudinal Data Analyses Using Linear Mixed Models in SPSS: Concepts, Procedures and Illustrations

    Directory of Open Access Journals (Sweden)

    Daniel T. L. Shek

    2011-01-01

Although different methods are available for the analyses of longitudinal data, analyses based on generalized linear models (GLM) are criticized as violating the assumption of independence of observations. Alternatively, linear mixed models (LMM) are commonly used to understand changes in human behavior over time. In this paper, the basic concepts surrounding LMM (or hierarchical linear models) are outlined. Although SPSS is a statistical analyses package commonly used by researchers, documentation on LMM procedures in SPSS is not thorough or user friendly. With reference to this limitation, the related procedures for performing analyses based on LMM in SPSS are described. To demonstrate the application of LMM analyses in SPSS, findings based on six waves of data collected in the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) in Hong Kong are presented.

  18. Longitudinal data analyses using linear mixed models in SPSS: concepts, procedures and illustrations.

    Science.gov (United States)

    Shek, Daniel T L; Ma, Cecilia M S

    2011-01-05

    Although different methods are available for the analyses of longitudinal data, analyses based on generalized linear models (GLM) are criticized as violating the assumption of independence of observations. Alternatively, linear mixed models (LMM) are commonly used to understand changes in human behavior over time. In this paper, the basic concepts surrounding LMM (or hierarchical linear models) are outlined. Although SPSS is a statistical analyses package commonly used by researchers, documentation on LMM procedures in SPSS is not thorough or user friendly. With reference to this limitation, the related procedures for performing analyses based on LMM in SPSS are described. To demonstrate the application of LMM analyses in SPSS, findings based on six waves of data collected in the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) in Hong Kong are presented.
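The quantities a random-intercept LMM partitions can be illustrated without SPSS. The sketch below uses a moment-based (balanced one-way ANOVA) estimator of the between-subject and residual variance components; SPSS MIXED estimates these by ML/REML, and the two only coincide for balanced data like the invented measurements here:

```python
# Hedged illustration of what a random-intercept mixed model partitions:
# between-subject vs residual variance, via the balanced one-way ANOVA
# moment estimator (illustrative data, not from the P.A.T.H.S. project).
data = {                         # subject -> repeated measurements
    "s1": [10, 12, 11, 11],
    "s2": [20, 22, 21, 21],
    "s3": [30, 32, 31, 31],
}
n = len(next(iter(data.values())))   # observations per subject (balanced)
k = len(data)                        # number of subjects
grand = sum(sum(v) for v in data.values()) / (n * k)
means = {s: sum(v) / n for s, v in data.items()}

ms_between = n * sum((m - grand) ** 2 for m in means.values()) / (k - 1)
ms_within = sum(sum((x - means[s]) ** 2 for x in v)
                for s, v in data.items()) / (k * (n - 1))

var_residual = ms_within                      # within-subject noise
var_subject = (ms_between - ms_within) / n    # random-intercept variance
icc = var_subject / (var_subject + var_residual)  # share due to subjects
```

A high intraclass correlation (ICC), as in this toy data, is precisely the non-independence that makes plain GLM analyses inappropriate for repeated measures.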

  19. A P-value model for theoretical power analysis and its applications in multiple testing procedures

    Directory of Open Access Journals (Sweden)

    Fengqing Zhang

    2016-10-01

Background: Power analysis is a critical aspect of the design of experiments to detect an effect of a given size. When multiple hypotheses are tested simultaneously, multiplicity adjustments to p-values should be taken into account in power analysis. There are a limited number of studies on power analysis in multiple testing procedures. For some methods the theoretical analysis is difficult and extensive numerical simulations are often needed, while other methods oversimplify the information under the alternative hypothesis. To this end, this paper aims to develop a new statistical model for power analysis in multiple testing procedures. Methods: We propose a step-function-based p-value model under the alternative hypothesis, which is simple enough to perform power analysis without simulations, but not too simple to lose the information from the alternative hypothesis. The first step is to transform the distributions of different test statistics (e.g., t, chi-square, or F) to distributions of the corresponding p-values. We then use a step function to approximate each p-value distribution by matching the mean and variance. Lastly, the step-function-based p-value model can be used for theoretical power analysis. Results: The proposed model is applied to problems in multiple testing procedures. We first show how the most powerful critical constants can be chosen using the step-function-based p-value model. Our model is then applied to the field of multiple testing procedures to explain the assumption of monotonicity of the critical constants. Lastly, we apply our model to a behavioral weight loss and maintenance study to select the optimal critical constants. Conclusions: The proposed model is easy to implement and preserves the information from the alternative hypothesis.
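The first transformation the abstract describes, from a test statistic's distribution under the alternative to a p-value distribution, has a closed form in the simplest case. The sketch below is a hedged illustration using a one-sided z-test (not the paper's step-function model), with an assumed effect size, comparing power at an unadjusted versus a Bonferroni-adjusted threshold:

```python
# Hedged sketch: exact p-value CDF for a one-sided z-test under the
# alternative, i.e., the power attained at significance level alpha.
from statistics import NormalDist

std_normal = NormalDist()

def pvalue_cdf(alpha, effect):
    """P(p <= alpha) when the z statistic is N(effect, 1)."""
    return 1.0 - std_normal.cdf(std_normal.inv_cdf(1.0 - alpha) - effect)

effect = 2.8                 # assumed standardized effect size (illustrative)
m = 10                       # number of simultaneous hypotheses
power_raw = pvalue_cdf(0.05, effect)
power_bonf = pvalue_cdf(0.05 / m, effect)   # Bonferroni-adjusted threshold
```

Under the null (effect = 0) the p-value is uniform, so `pvalue_cdf(0.05, 0.0)` returns exactly 0.05; under the alternative the CDF is concave, which is the information a power model must preserve.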

  20. HUMAN RELIABILITY ANALYSIS FOR COMPUTERIZED PROCEDURES, PART TWO: APPLICABILITY OF CURRENT METHODS

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; David I. Gertman

    2012-10-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no U.S. nuclear power plant has implemented CPs in its main control room. Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  1. An Update of the Analytical Groundwater Modeling to Assess Water Resource Impacts at the Afton Solar Energy Zone

    Energy Technology Data Exchange (ETDEWEB)

    Quinn, John J. [Argonne National Lab. (ANL), Argonne, IL (United States); Greer, Christopher B. [Argonne National Lab. (ANL), Argonne, IL (United States); Carr, Adrianne E. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-10-01

    The purpose of this study is to update a one-dimensional analytical groundwater flow model to examine the influence of potential groundwater withdrawal in support of utility-scale solar energy development at the Afton Solar Energy Zone (SEZ) as a part of the Bureau of Land Management’s (BLM’s) Solar Energy Program. This report describes the modeling for assessing the drawdown associated with SEZ groundwater pumping rates for a 20-year duration considering three categories of water demand (high, medium, and low) based on technology-specific considerations. The 2012 modeling effort published in the Final Programmatic Environmental Impact Statement for Solar Energy Development in Six Southwestern States (Solar PEIS; BLM and DOE 2012) has been refined based on additional information described below in an expanded hydrogeologic discussion.
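The report's one-dimensional analytical model is not reproduced in the abstract; as a hedged stand-in, the standard Theis solution illustrates the kind of analytical drawdown calculation involved in assessing 20 years of pumping. All parameter values below are illustrative, not the Afton SEZ inputs:

```python
# Hedged sketch: Theis analytical drawdown s = Q/(4*pi*T) * W(u), with
# u = r^2 * S / (4 * T * t), using the convergent series for the well
# function W(u).
import math

def well_function(u, terms=60):
    """W(u) = -gamma - ln(u) + sum over n of (-1)^(n+1) * u^n / (n * n!)."""
    euler_gamma = 0.5772156649015329
    total = -euler_gamma - math.log(u)
    term = 1.0
    for n in range(1, terms + 1):
        term *= u / n                       # running value of u^n / n!
        total += (-1) ** (n + 1) * term / n
    return total

def theis_drawdown(pump_rate, transmissivity, storativity, radius, time):
    u = radius ** 2 * storativity / (4.0 * transmissivity * time)
    return pump_rate / (4.0 * math.pi * transmissivity) * well_function(u)

# Illustrative values: Q in m^3/d, T in m^2/d, t in days (20 years), r in m.
s = theis_drawdown(pump_rate=500.0, transmissivity=250.0,
                   storativity=1e-4, radius=100.0, time=365.0 * 20)
```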

  2. Nonadditive entropy maximization is inconsistent with Bayesian updating

    Science.gov (United States)

    Pressé, Steve

    2014-11-01

    The maximum entropy method—used to infer probabilistic models from data—is a special case of Bayes's model inference prescription which, in turn, is grounded in basic propositional logic. By contrast to the maximum entropy method, the compatibility of nonadditive entropy maximization with Bayes's model inference prescription has never been established. Here we demonstrate that nonadditive entropy maximization is incompatible with Bayesian updating and discuss the immediate implications of this finding. We focus our attention on special cases as illustrations.
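The additive (Shannon) maximum entropy method the abstract contrasts with nonadditive entropies can be shown concretely: under a mean constraint, the maxent distribution is exponential in the observable, with the Lagrange multiplier fixed by the constraint. A minimal sketch using the classic loaded-die example:

```python
# Shannon maximum entropy under a mean constraint: the solution is a Gibbs
# (exponential-family) distribution, with the multiplier found by bisection.
import math

FACES = [1, 2, 3, 4, 5, 6]

def gibbs(lam):
    weights = [math.exp(lam * x) for x in FACES]
    z = sum(weights)                     # partition function
    return [w / z for w in weights]

def mean(p):
    return sum(x * px for x, px in zip(FACES, p))

def solve_lambda(target_mean, lo=-5.0, hi=5.0, iters=100):
    # mean(gibbs(lam)) is increasing in lam, so bisection converges.
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean(gibbs(mid)) < target_mean:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

lam = solve_lambda(4.5)    # die constrained to an average roll of 4.5
p = gibbs(lam)
```

It is this exponential-family structure, tied to additivity, that meshes with Bayesian updating; the paper's point is that nonadditive entropy maximization breaks that compatibility.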

  3. Concepts of incremental updating and versioning

    CSIR Research Space (South Africa)

    Cooper, Antony K

    2004-07-01

...of the work undertaken recently by the Working Group (WG). The WG was voted to become a Commission by the General Assembly held at the 21st ICC in Durban, South Africa. The basic problem being addressed by the Commission is that a user compiles their data base... or election). Historically, updates have been provided in bulk, with the new data set replacing the old one. Users could ignore the update (if it is not significant enough), manually (and selectively) update their data base, or accept the whole update...

  4. P- and S-wave velocity models incorporating the Cascadia subduction zone for 3D earthquake ground motion simulations—Update for Open-File Report 2007–1348

    Science.gov (United States)

    Stephenson, William J.; Reitman, Nadine G.; Angster, Stephen J.

    2017-12-20

In support of earthquake hazards studies and ground motion simulations in the Pacific Northwest, three-dimensional (3D) P- and S-wave velocity (VP and VS, respectively) models incorporating the Cascadia subduction zone were previously developed for the region encompassed from about 40.2°N. to 50°N. latitude, and from about 122°W. to 129°W. longitude (fig. 1). This report describes updates to the Cascadia velocity property volumes of model version 1.3 ([V1.3]; Stephenson, 2007), herein called model version 1.6 (V1.6). As in model V1.3, the updated V1.6 model volume includes depths from 0 kilometers (km) (mean sea level) to 60 km, and it is intended to be a reference for researchers who have used, or are planning to use, this model in their earth science investigations. To this end, it is intended that the VP and VS property volumes of model V1.6 will be considered a template for a community velocity model of the Cascadia region as additional results become available. With the recent and ongoing development of the National Crustal Model (NCM; Boyd and Shah, 2016), we envision that any future versions of this model will be directly integrated with that effort.

  5. Implications of the Declarative/Procedural Model for Improving Second Language Learning: The Role of Memory Enhancement Techniques

    Science.gov (United States)

    Ullman, Michael T.; Lovelett, Jarrett T.

    2018-01-01

    The declarative/procedural (DP) model posits that the learning, storage, and use of language critically depend on two learning and memory systems in the brain: declarative memory and procedural memory. Thus, on the basis of independent research on the memory systems, the model can generate specific and often novel predictions for language. Till…

  6. SHINE Virtual Machine Model for In-flight Updates of Critical Mission Software

    Science.gov (United States)

    Plesea, Lucian

    2008-01-01

    This software is a new target for the Spacecraft Health Inference Engine (SHINE) knowledge base that compiles a knowledge base to a language called Tiny C - an interpreted version of C that can be embedded on flight processors. This new target allows portions of a running SHINE knowledge base to be updated on a "live" system without needing to halt and restart the containing SHINE application. This enhancement will directly provide this capability without the risk of software validation problems and can also enable complete integration of BEAM and SHINE into a single application. This innovation enables SHINE deployment in domains where autonomy is used during flight-critical applications that require updates. This capability eliminates the need for halting the application and performing potentially serious total system uploads before resuming the application with the loss of system integrity. This software enables additional applications at JPL (microsensors, embedded mission hardware) and increases the marketability of these applications outside of JPL.

  7. Comparing adaptive procedures for estimating the psychometric function for an auditory gap detection task.

    Science.gov (United States)

    Shen, Yi

    2013-05-01

    A subject's sensitivity to a stimulus variation can be studied by estimating the psychometric function. Generally speaking, three parameters of the psychometric function are of interest: the performance threshold, the slope of the function, and the rate at which attention lapses occur. In the present study, three psychophysical procedures were used to estimate the three-parameter psychometric function for an auditory gap detection task. These were an up-down staircase (up-down) procedure, an entropy-based Bayesian (entropy) procedure, and an updated maximum-likelihood (UML) procedure. Data collected from four young, normal-hearing listeners showed that while all three procedures provided similar estimates of the threshold parameter, the up-down procedure performed slightly better in estimating the slope and lapse rate for 200 trials of data collection. When the lapse rate was increased by mixing in random responses for the three adaptive procedures, the larger lapse rate was especially detrimental to the efficiency of the up-down procedure, and the UML procedure provided better estimates of the threshold and slope than did the other two procedures.
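    The up-down staircase named above can be sketched in a few lines. The following is a minimal illustration, not the study's implementation: it assumes a hypothetical deterministic listener whose true gap threshold is known, plus an optional lapse probability that mixes in random responses, mirroring the manipulation described in the abstract.

```python
import random

def staircase(true_threshold=0.05, start=0.2, step=0.02,
              trials=200, lapse=0.0, seed=1):
    """1-up/2-down staircase for a toy gap-detection listener.

    The simulated listener is an assumption for illustration: it detects
    the gap whenever the gap duration exceeds `true_threshold`, except on
    lapse trials, where it guesses at random with probability `lapse`.
    """
    rng = random.Random(seed)
    gap = start
    streak = 0            # consecutive correct responses
    direction = None      # last movement of the track
    reversals = []        # gap values where the track changed direction
    for _ in range(trials):
        detected = gap > true_threshold
        if rng.random() < lapse:              # attentional lapse: guess
            detected = rng.random() < 0.5
        if detected:
            streak += 1
            if streak == 2:                   # two correct -> shorter gap
                streak = 0
                if direction == 'up':
                    reversals.append(gap)
                direction = 'down'
                gap = max(gap - step, 0.0)
        else:                                  # one wrong -> longer gap
            streak = 0
            if direction == 'down':
                reversals.append(gap)
            direction = 'up'
            gap += step
    tail = reversals[-6:]                      # average the last reversals
    return sum(tail) / len(tail) if tail else gap
```

With no lapses the track ends up oscillating around the listener's threshold, so the mean of the final reversals recovers it; raising `lapse` degrades the estimate, which is the effect the study quantifies.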

  8. Improving Semantic Updating Method on 3d City Models Using Hybrid Semantic-Geometric 3d Segmentation Technique

    Science.gov (United States)

    Sharkawi, K.-H.; Abdul-Rahman, A.

    2013-09-01

    to LoD4. The accuracy and structural complexity of the 3D objects increases with the LoD level where LoD0 is the simplest LoD (2.5D; Digital Terrain Model (DTM) + building or roof print) while LoD4 is the most complex LoD (architectural details with interior structures). Semantic information is one of the main components in CityGML and 3D City Models, and provides important information for any analyses. However, more often than not, the semantic information is not available for the 3D city model due to the unstandardized modelling process. One of the examples is where a building is normally generated as one object (without specific feature layers such as Roof, Ground floor, Level 1, Level 2, Block A, Block B, etc). This research attempts to develop a method to improve the semantic data updating process by segmenting the 3D building into simpler parts which will make it easier for the users to select and update the semantic information. The methodology is implemented for 3D buildings in LoD2 where the buildings are generated without architectural details but with distinct roof structures. This paper also introduces a hybrid semantic-geometric 3D segmentation method that deals with hierarchical segmentation of a 3D building based on its semantic value and surface characteristics, fitted by one of the predefined primitives. For future work, the segmentation method will be implemented as part of the change detection module that can detect any changes on the 3D buildings, store and retrieve semantic information of the changed structure, automatically update the 3D models and visualize the results in a user-friendly graphical user interface (GUI).

  9. On the number of different dynamics in Boolean networks with deterministic update schedules.

    Science.gov (United States)

    Aracena, J; Demongeot, J; Fanchon, E; Montalva, M

    2013-04-01

    Deterministic Boolean networks are a type of discrete dynamical systems widely used in the modeling of genetic networks. The dynamics of such systems is characterized by the local activation functions and the update schedule, i.e., the order in which the nodes are updated. In this paper, we address the problem of knowing the different dynamics of a Boolean network when the update schedule is changed. We begin by proving that the problem of the existence of a pair of update schedules with different dynamics is NP-complete. However, we show that certain structural properties of the interaction digraph are sufficient for guaranteeing distinct dynamics of a network. In [1] the authors define equivalence classes which have the property that all the update schedules of a given class yield the same dynamics. In order to determine the dynamics associated to a network, we develop an algorithm to efficiently enumerate the above equivalence classes by selecting a representative update schedule for each class with a minimum number of blocks. Finally, we run this algorithm on the well known Arabidopsis thaliana network to determine the full spectrum of its different dynamics. Copyright © 2013 Elsevier Inc. All rights reserved.
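    The dependence of the dynamics on the update schedule is easy to demonstrate on a toy network. The sketch below is illustrative only (the node functions are assumptions, not the Arabidopsis thaliana network): it runs the same two-node Boolean network under a fully synchronous schedule and a block-sequential one and obtains different successor states and limit cycles.

```python
# Toy two-node Boolean network (illustrative; not the Arabidopsis network):
#   x0' = x1,   x1' = not x0
fs = [lambda s: s[1], lambda s: 1 - s[0]]

def step(state, schedule):
    """One full pass of a deterministic update schedule.

    `schedule` is a list of blocks: nodes inside a block are updated
    synchronously, and blocks are applied in order (block-sequential update).
    """
    s = list(state)
    for block in schedule:
        new = {i: fs[i](s) for i in block}   # read the current partial state
        for i, v in new.items():
            s[i] = v
    return tuple(s)

def trajectory(state, schedule, steps):
    out = [state]
    for _ in range(steps):
        state = step(state, schedule)
        out.append(state)
    return out

parallel = [[0, 1]]        # fully synchronous: one block containing all nodes
sequential = [[0], [1]]    # update node 0 first, then node 1

# The same network behaves differently under the two schedules:
# from (1, 0) the parallel schedule goes to (0, 0), the sequential to (0, 1).
```

The parallel schedule traverses a length-4 cycle through all four states, while the sequential schedule collapses onto a length-2 cycle, which is exactly the kind of schedule-dependent dynamics the paper enumerates by equivalence class.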

  10. Updating Geospatial Data from Large Scale Data Sources

    Science.gov (United States)

    Zhao, R.; Chen, J.; Wang, D.; Shang, Y.; Wang, Z.; Li, X.; Ai, T.

    2011-08-01

    In the past decades, many geospatial databases have been established at national, regional and municipal levels over the world. Nowadays, it is widely recognized that keeping these established geospatial databases up to date is critical to their value, so more and more effort has been devoted to their continuous updating. Currently, there exist two main types of methods for geospatial database updating: directly updating with remote sensing images or field surveying materials, and indirectly updating with other updated data, such as newly updated larger scale data. The former method is the basis, because the update data sources in both methods ultimately derive from field surveying and remote sensing. The latter method is often more economical and faster than the former. Therefore, after the larger scale database is updated, the smaller scale database should be updated correspondingly in order to keep the consistency of the multi-scale geospatial database. In this situation, it is very reasonable to apply map generalization technology to the process of geospatial database updating. The latter is recognized as one of the most promising methods of geospatial database updating, especially in a collaborative updating environment in terms of map scale, i.e., where databases at different scales are produced and maintained separately by organizations at different levels, as in China. This paper focuses on applying digital map generalization to the updating of geospatial databases from large scale data in the collaborative updating environment for SDI. The requirements for applying map generalization to spatial database updating are analyzed first, and a brief review of geospatial data updating based on digital map generalization is then given. Based on the requirements analysis and review, we analyze the key factors for implementing updating geospatial data from large scale including technical

  11. Quantitative critical thinking: Student activities using Bayesian updating

    Science.gov (United States)

    Warren, Aaron R.

    2018-05-01

    One of the central roles of physics education is the development of students' ability to evaluate proposed hypotheses and models. This ability is important not just for students' understanding of physics but also to prepare students for future learning beyond physics. In particular, it is often hoped that students will better understand the manner in which physicists leverage the availability of prior knowledge to guide and constrain the construction of new knowledge. Here, we discuss how the use of Bayes' Theorem to update the estimated likelihood of hypotheses and models can help achieve these educational goals through its integration with evaluative activities that use hypothetico-deductive reasoning. Several types of classroom and laboratory activities are presented that engage students in the practice of Bayesian likelihood updating on the basis of either consistency with experimental data or consistency with pre-established principles and models. This approach is sufficiently simple for introductory physics students while offering a robust mechanism to guide relatively sophisticated student reflection concerning models, hypotheses, and problem-solutions. A quasi-experimental study utilizing algebra-based introductory courses is presented to assess the impact of these activities on student epistemological development. The results indicate gains on the Epistemological Beliefs Assessment for Physical Science (EBAPS) at a minimal cost of class-time.
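    The core classroom computation in such activities is a single application of Bayes' theorem across competing hypotheses. A minimal sketch, with made-up prior and likelihood numbers (the hypotheses and values are illustrative, not taken from the study's activities):

```python
def bayes_update(priors, likelihoods):
    """One Bayesian update: posterior = prior * likelihood, renormalized.

    priors:      dict mapping hypothesis -> prior probability
    likelihoods: dict mapping hypothesis -> P(evidence | hypothesis)
    """
    unnorm = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(unnorm.values())
    return {h: p / total for h, p in unnorm.items()}

# Two hypothetical models of a cart's motion; the measured time interval is
# predicted well by model A (likelihood 0.8) and poorly by model B (0.2).
posterior = bayes_update({'A': 0.5, 'B': 0.5}, {'A': 0.8, 'B': 0.2})
# posterior['A'] -> 0.8, posterior['B'] -> 0.2
```

Repeating the call with each new piece of evidence (feeding the posterior back in as the next prior) reproduces the iterative likelihood-updating cycle the activities are built around.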

  12. Development of a diagnosis- and procedure-based risk model for 30-day outcome after pediatric cardiac surgery.

    Science.gov (United States)

    Crowe, Sonya; Brown, Kate L; Pagel, Christina; Muthialu, Nagarajan; Cunningham, David; Gibbs, John; Bull, Catherine; Franklin, Rodney; Utley, Martin; Tsang, Victor T

    2013-05-01

    The study objective was to develop a risk model incorporating diagnostic information to adjust for case-mix severity during routine monitoring of outcomes for pediatric cardiac surgery. Data from the Central Cardiac Audit Database for all pediatric cardiac surgery procedures performed in the United Kingdom between 2000 and 2010 were included: 70% for model development and 30% for validation. Units of analysis were 30-day episodes after the first surgical procedure. We used logistic regression for 30-day mortality. Risk factors considered included procedural information based on Central Cardiac Audit Database "specific procedures," diagnostic information defined by 24 "primary" cardiac diagnoses and "univentricular" status, and other patient characteristics. Of the 27,140 30-day episodes in the development set, 25,613 were survivals, 834 were deaths, and 693 were of unknown status (mortality, 3.2%). The risk model includes procedure, cardiac diagnosis, univentricular status, age band (neonate, infant, child), continuous age, continuous weight, presence of non-Down syndrome comorbidity, bypass, and year of operation 2007 or later (because of decreasing mortality). A risk score was calculated for 95% of cases in the validation set (weight missing in 5%). The model discriminated well; the C-index for validation set was 0.77 (0.81 for post-2007 data). Removal of all but procedural information gave a reduced C-index of 0.72. The model performed well across the spectrum of predicted risk, but there was evidence of underestimation of mortality risk in neonates undergoing operation from 2007. The risk model performs well. Diagnostic information added useful discriminatory power. A future application is risk adjustment during routine monitoring of outcomes in the United Kingdom to assist quality assurance. Copyright © 2013 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.
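    Two of the quantities in the abstract, a logistic-regression risk score and the C-index used for validation, can be sketched as follows. The coefficient names and values are hypothetical placeholders, not the published Central Cardiac Audit Database model.

```python
import math

def predicted_risk(intercept, coeffs, features):
    """Predicted 30-day mortality from a fitted logistic model.

    The intercept and coefficients below are invented for illustration;
    a real risk model would supply values estimated from the audit data.
    """
    z = intercept + sum(coeffs[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

def c_index(risks, outcomes):
    """C-index: chance a randomly chosen death outranks a random survival."""
    deaths = [r for r, y in zip(risks, outcomes) if y == 1]
    survivors = [r for r, y in zip(risks, outcomes) if y == 0]
    pairs = concordant = ties = 0
    for d in deaths:
        for s in survivors:
            pairs += 1
            concordant += d > s
            ties += d == s
    return (concordant + 0.5 * ties) / pairs

# Hypothetical coefficients for two of the risk factors named in the abstract:
coeffs = {'neonate': 1.1, 'univentricular': 0.9}
risk = predicted_risk(-3.4, coeffs, {'neonate': 1, 'univentricular': 0})
```

A C-index of 0.77, as reported for the validation set, means a randomly chosen death received a higher predicted risk than a randomly chosen survival 77% of the time.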

  13. Cost update technology, safety, and costs of decommissioning a reference uranium hexafluoride conversion plant

    International Nuclear Information System (INIS)

    Miles, T.L.; Liu, Y.

    1995-08-01

    The purpose of this study is to update the cost estimates developed in a previous report, NUREG/CR-1757 (Elder 1980) for decommissioning a reference uranium hexafluoride conversion plant from the original mid-1981 dollars to values representative of January 1993. The cost updates were performed by using escalation factors derived from cost index trends over the past 11.5 years. Contemporary price quotes were used for costs that have increased drastically or for which it is difficult to find a cost trend. No changes were made in the decommissioning procedures or cost element requirements assumed in NUREG/CR-1757. This report includes only information that was changed from NUREG/CR-1757. Thus, for those interested in detailed descriptions and associated information for the reference uranium hexafluoride conversion plant, a copy of NUREG/CR-1757 will be needed.
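    The escalation approach described, scaling mid-1981 dollars to January-1993 dollars by a cost-index ratio, amounts to a one-line computation. The index values below are invented for illustration and are not taken from the report:

```python
def escalate(base_cost, index_base, index_target):
    """Scale a cost by the ratio of cost-index values (the escalation factor)."""
    return base_cost * (index_target / index_base)

# Hypothetical index values: a $100,000 mid-1981 cost item with the relevant
# cost index rising from 80.0 to 124.0 escalates by a factor of 1.55.
updated = escalate(100_000, 80.0, 124.0)   # about 155,000
```

Costs without a usable index trend fall back on contemporary price quotes, as the abstract notes, rather than on this ratio.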

  14. Experiences with a procedure for modeling product knowledge

    DEFF Research Database (Denmark)

    Hansen, Benjamin Loer; Hvam, Lars

    2002-01-01

    This paper presents experiences with a procedure for building configurators. The procedure has been used in an American company producing custom-made precision air conditioning equipment. The paper describes experiences with the use of the procedure and experiences with the project in general.

  15. WIMS-D library update

    International Nuclear Information System (INIS)

    2007-05-01

    WIMS-D (Winfrith Improved Multigroup Scheme-D) is the name of a family of software packages for reactor lattice calculations and is one of the few reactor lattice codes in the public domain and available on noncommercial terms. WIMSD-5B has recently been released from the OECD Nuclear Energy Agency Data Bank, and features major improvements in machine portability, as well as incorporating a few minor corrections. This version supersedes WIMS-D/4, which was released by the Winfrith Technology Centre in the United Kingdom for IBM machines and has been adapted for various other computer platforms in different laboratories. The main weakness of the WIMS-D package is the multigroup constants library, which is based on very old data. The relatively good performance of WIMS-D is attributed to a series of empirical adjustments to the multigroup data. However, the adjustments are not always justified on the basis of more accurate and recent experimental measurements. Following the release of new and revised evaluated nuclear data files, it was felt that the performance of WIMS-D could be improved by updating the associated library. The WIMS-D Library Update Project (WLUP) was initiated in the early 1990s with the support of the IAEA. This project consisted of voluntary contributions from a large number of participants. Several benchmarks for testing the library were identified and analysed, the WIMSR module of the NJOY code system was upgraded and the author of NJOY accepted the proposed updates for the official code system distribution. A detailed parametric study was performed to investigate the effects of various data processing input options on the integral results. In addition, the data processing methods for the main reactor materials were optimized. Several partially updated libraries were produced for testing purposes. The final stage of the WLUP was organized as a coordinated research project (CRP) in order to speed up completion of the fully updated library

  16. A Procedure for Identification of Appropriate State Space and ARIMA Models Based on Time-Series Cross-Validation

    Directory of Open Access Journals (Sweden)

    Patrícia Ramos

    2016-11-01

    Full Text Available In this work, a cross-validation procedure is used to identify an appropriate Autoregressive Integrated Moving Average (ARIMA) model and an appropriate state space model for a time series. A minimum size for the training set is specified. The procedure is based on one-step forecasts and uses different training sets, each containing one more observation than the previous one. All possible state space models and all ARIMA models whose orders are allowed to range reasonably are fitted, considering raw data and log-transformed data with regular differencing (up to second-order differences) and, if the time series is seasonal, seasonal differencing (up to first-order differences). The root mean squared error for each model is calculated over the one-step forecasts obtained. The model with the lowest root mean squared error that passes the Ljung–Box test using all of the available data at a reasonable significance level is selected from among all the ARIMA and state space models considered. The procedure is exemplified in this paper with a case study of retail sales of different categories of women's footwear from a Portuguese retailer, and its accuracy is compared with three reliable forecasting approaches. The results show that our procedure consistently forecasts more accurately than the other approaches, and the improvements in accuracy are significant.
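    The rolling-origin evaluation underlying the procedure, expanding training sets with one-step forecasts, can be sketched independently of any particular model family. Here a naive last-value forecaster stands in for the fitted ARIMA/state-space models purely to keep the sketch dependency-free:

```python
import math

def rolling_one_step_rmse(series, forecaster, min_train):
    """Rolling-origin evaluation with expanding training sets.

    Each training set contains one more observation than the previous one;
    `forecaster` maps a training list to a one-step-ahead forecast. A real
    application would fit an ARIMA or state space model at each origin and
    pick the candidate with the lowest resulting RMSE.
    """
    errors = []
    for t in range(min_train, len(series)):
        forecast = forecaster(series[:t])      # train on observations [0, t)
        errors.append(series[t] - forecast)    # score against observation t
    return math.sqrt(sum(e * e for e in errors) / len(errors))

naive = lambda train: train[-1]                # stand-in model: repeat last value
rmse = rolling_one_step_rmse([1, 2, 3, 4, 5, 6, 7, 8], naive, min_train=5)
# rmse -> 1.0 (each one-step naive forecast is off by exactly 1)
```

Running this loop once per candidate model and comparing the RMSE values reproduces the selection step of the procedure; the Ljung–Box residual check described in the abstract would be applied to the winning model afterwards.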

  17. Modelling of groundwater flow and solute transport in Olkiluoto. Update 2008

    International Nuclear Information System (INIS)

    Loefman, J.; Pitkaenen, P.; Meszaros, F.; Keto, V.; Ahokas, H.

    2009-10-01

    Posiva Oy is preparing for the final disposal of spent nuclear fuel in the crystalline bedrock in Finland. Olkiluoto in Eurajoki has been selected as the primary site for the repository, subject to further detailed characterisation which is currently focused on the construction of an underground rock characterisation and research facility (the ONKALO). An essential part of the site investigation programme is analysis of the deep groundwater flow by means of numerical flow modelling. This study is the latest update concerning the site-scale flow modelling and is based on all the hydrogeological data gathered from field investigations by the end of 2007. The work is divided into two separate modelling tasks: 1) characterization of the baseline groundwater flow conditions before excavation of the ONKALO, and 2) a prediction/outcome (P/O) study of the potential hydrogeological disturbances due to the ONKALO. The flow model was calibrated by using all the available data that was appropriate for the applied, deterministic, equivalent porous medium (EPM) / dual-porosity (DP) approach. In the baseline modelling, calibration of the flow model focused on improving the agreement between the calculated results and the undisturbed observations. The calibration resulted in a satisfactory agreement with the measured pumping test responses, a very good overall agreement with the observed pressures in the deep drill holes and a fairly good agreement with the observed salinity. Some discrepancies still remained in a few single drill hole sections, because the fresh water infiltration in the model tends to dilute the groundwater too much at shallow depths. In the P/O calculations the flow model was further calibrated by using the monitoring data on the ONKALO disturbances. Having significantly more information on the inflows to the tunnel (compared with the previous study) allowed better calibration of the model, enabling it to capture very well the observed inflow, the

  18. EFAM GTP 02 - the GKSS test procedure for determining the fracture behaviour of materials

    International Nuclear Information System (INIS)

    Schwalbe, K.H.; Heerens, J.; Zerbst, U.; Kocak, M.

    2002-01-01

    This document describes a unified fracture mechanics test method in procedural form for quasi-static testing of materials. It is based on the ESIS Procedures P1 and P2 and introduces additional features, such as middle cracked tension specimens, shallow cracks, the δ5 crack tip opening displacement, the crack tip opening angle, the rate of dissipated energy, testing of weldments, and guidance for statistical treatment of scatter. Special validity criteria are given for tests on specimens with low constraint. This document represents an updated version of EFAM GTP 94.

  19. Dynamical modeling procedure of a Li-ion battery pack suitable for real-time applications

    International Nuclear Information System (INIS)

    Castano, S.; Gauchia, L.; Voncila, E.; Sanz, J.

    2015-01-01

    Highlights: • Dynamical modeling of a 50 A h battery pack composed of 56 cells. • Detailed analysis of SOC tests at realistic performance range imposed by BMS. • We propose an electrical circuit that improves how the battery capacity is modeled. • The model is validated in the SOC range using a real-time experimental setup. - Abstract: This paper presents the modeling of a 50 A h battery pack composed of 56 cells, taking into account real battery performance conditions imposed by the BMS control. The modeling procedure starts with a detailed analysis of experimental charge and discharge SOC tests. Results from these tests are used to obtain the battery model parameters at a realistic performance range (20–80% SOC). The model topology aims to better describe the finite charge contained in a battery pack. The model has been validated at three different SOC values in order to verify the model response at real battery pack operation conditions. The validation tests show that the battery pack model is able to simulate the real battery response with excellent accuracy in the range tested. The proposed modeling procedure is fully applicable to any Li-ion battery pack, regardless of the number of series or parallel cells or its rated capacity
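    A common ingredient of such battery pack models is tracking the state of charge (SOC) by Coulomb counting between the limits imposed by the BMS. A minimal sketch (the function and its parameter values are illustrative assumptions, not the paper's identified model):

```python
def soc_update(soc, current_a, dt_s, capacity_ah, eta=1.0):
    """Coulomb-counting SOC update; discharge current is positive.

    soc:         current state of charge, as a fraction in [0, 1]
    current_a:   pack current in amperes
    dt_s:        time step in seconds
    capacity_ah: rated pack capacity in ampere-hours
    eta:         coulombic efficiency (assumed 1.0 here)
    """
    return soc - eta * current_a * dt_s / (capacity_ah * 3600.0)

# Illustrative numbers: a 50 Ah pack discharged at 50 A for 6 minutes
# loses 10% of its capacity.
soc = soc_update(0.8, 50.0, 360.0, 50.0)   # 0.8 -> about 0.7
```

Stepping this update through a measured current profile, while an RC circuit branch reproduces the voltage transients, is the usual structure of the real-time equivalent-circuit models the abstract describes; keeping `soc` inside the 20–80% window mirrors the realistic operating range imposed by the BMS.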

  20. Evaluation of the groundwater flow model for southern Utah and Goshen Valleys, Utah, updated to conditions through 2011, with new projections and groundwater management simulations

    Science.gov (United States)

    Brooks, Lynette E.

    2013-01-01

    The U.S. Geological Survey (USGS), in cooperation with the Southern Utah Valley Municipal Water Association, updated an existing USGS model of southern Utah and Goshen Valleys for hydrologic and climatic conditions from 1991 to 2011 and used the model for projection and groundwater management simulations. All model files used in the transient model were updated to be compatible with MODFLOW-2005 and with the additional stress periods. The well and recharge files had the most extensive changes. Discharge to pumping wells in southern Utah and Goshen Valleys was estimated and simulated on an annual basis from 1991 to 2011. Recharge estimates for 1991 to 2011 were included in the updated model by using precipitation, streamflow, canal diversions, and irrigation groundwater withdrawals for each year. The model was evaluated to determine how well it simulates groundwater conditions during recent increased withdrawals and drought, and to determine if the model is adequate for use in future planning. In southern Utah Valley, the magnitude and direction of annual water-level fluctuation simulated by the updated model reasonably match measured water-level changes, but they do not simulate as much decline as was measured in some locations from 2000 to 2002. Both the rapid increase in groundwater withdrawals and the total groundwater withdrawals in southern Utah Valley during this period exceed the variations and magnitudes simulated during the 1949 to 1990 calibration period. It is possible that hydraulic properties may be locally incorrect or that changes, such as land use or irrigation diversions, occurred that are not simulated. In the northern part of Goshen Valley, simulated water-level changes reasonably match measured changes. Farther south, however, simulated declines are much less than measured declines. Land-use changes indicate that groundwater withdrawals in Goshen Valley are possibly greater than estimated and simulated. It is also possible that irrigation