How update schemes influence crowd simulations
International Nuclear Information System (INIS)
Seitz, Michael J; Köster, Gerta
2014-01-01
Time discretization is a key modeling aspect of dynamic computer simulations. In current pedestrian motion models based on discrete events, e.g. cellular automata and the Optimal Steps Model, fixed-order sequential updates and shuffle updates are prevalent. We propose to use event-driven updates that process events in the order they occur, and thus better match natural movement. In addition, we present a parallel update with collision detection and resolution for situations where computational speed is crucial. Two simulation studies serve to demonstrate the practical impact of the choice of update scheme. Not only do density-speed relations differ, but there is a statistically significant effect on evacuation times. Fixed-order sequential and random shuffle updates with a short update period come close to event-driven updates. The parallel update scheme overestimates evacuation times. All schemes can be employed for arbitrary simulation models with discrete events, such as car traffic or animal behavior. (paper)
Introducing GEOPHIRES v2.0: Updated Geothermal Techno-Economic Simulation Tool
Energy Technology Data Exchange (ETDEWEB)
Beckers, Koenraad J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); McCabe, Kevin [National Renewable Energy Laboratory (NREL), Golden, CO (United States)
2018-02-14
This paper presents an updated version of the geothermal techno-economic simulation tool GEOPHIRES (GEOthermal energy for Production of Heat and electricity ('IR') Economically Simulated). GEOPHIRES combines engineering models of the reservoir, wellbores, and surface plant facilities of a geothermal plant with an economic model to estimate the capital and operation and maintenance costs, lifetime energy production, and overall levelized cost of energy. The available end-use options are electricity, direct-use heat, and cogeneration. The main updates in the new version include conversion of the source code from FORTRAN to Python, the option to import temperature data (e.g., measured or from stand-alone reservoir simulator), updated cost correlations, and more flexibility in selecting the time step and number of injection and production wells. In this paper, we provide an overview of all the updates and two case studies to illustrate the tool's new capabilities.
Introducing GEOPHIRES v2.0: Updated Geothermal Techno-Economic Simulation Tool: Preprint
Energy Technology Data Exchange (ETDEWEB)
Beckers, Koenraad J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); McCabe, Kevin [National Renewable Energy Laboratory (NREL), Golden, CO (United States)
2018-02-16
This paper presents an updated version of the geothermal techno-economic simulation tool GEOPHIRES (GEOthermal Energy for Production of Heat and electricity (IR) Economically Simulated). GEOPHIRES combines reservoir, wellbore, surface plant and economic models to estimate the capital, and operation and maintenance costs, lifetime energy production, and overall levelized cost of energy of a geothermal plant. The available end-use options are electricity, direct-use heat and cogeneration. The main updates in the new version include conversion of the source code from FORTRAN to Python, the option to couple to an external reservoir simulator, updated cost correlations, and more flexibility in selecting the time step and number of injection and production wells. An overview of all the updates and two case-studies to illustrate the tool's new capabilities are provided in this paper.
Lazy Updating of hubs can enable more realistic models by speeding up stochastic simulations
International Nuclear Information System (INIS)
Ehlert, Kurt; Loewe, Laurence
2014-01-01
To respect the nature of discrete parts in a system, stochastic simulation algorithms (SSAs) must update for each action (i) all part counts and (ii) each action's probability of occurring next and its timing. This makes it expensive to simulate biological networks with well-connected “hubs” such as ATP that affect many actions. Temperature and volume also affect many actions and may be changed significantly in small steps by the network itself during fever and cell growth, respectively. Such trends matter for evolutionary questions, as cell volume determines doubling times and fever may affect survival, both key traits for biological evolution. Yet simulations often ignore such trends and assume constant environments to avoid many costly probability updates. Such computational convenience precludes analyses of important aspects of evolution. Here we present “Lazy Updating,” an add-on for SSAs designed to reduce the cost of simulating hubs. When a hub changes, Lazy Updating postpones all probability updates for reactions depending on this hub, until a threshold is crossed. Speedup is substantial if most computing time is spent on such updates. We implemented Lazy Updating for the Sorting Direct Method and it is easily integrated into other SSAs such as Gillespie's Direct Method or the Next Reaction Method. Testing on several toy models and a cellular metabolism model showed >10× faster simulations for its use-cases—with a small loss of accuracy. Thus we see Lazy Updating as a valuable tool for some special but important simulation problems that are difficult to address efficiently otherwise
Update 0.2 to "pysimm: A python package for simulation of molecular systems"
Demidov, Alexander G.; Fortunato, Michael E.; Colina, Coray M.
2018-01-01
An update to the pysimm Python molecular simulation API is presented. A major part of the update is the implementation of a new interface with CASSANDRA - a modern, versatile Monte Carlo molecular simulation program. Several significant improvements in the LAMMPS communication module that allow better and more versatile simulation setup are reported as well. An example of an application implementing iterative CASSANDRA-LAMMPS interaction is illustrated.
O'Keeffe, C J; Ren, Ruichao; Orkoulas, G
2007-11-21
Spatial updating grand canonical Monte Carlo algorithms are generalizations of random and sequential updating algorithms for lattice systems to continuum fluid models. The elementary steps, insertions or removals, are constructed by generating points in space either at random (random updating) or in a prescribed order (sequential updating). These algorithms have previously been developed only for systems of impenetrable spheres for which no particle overlap occurs. In this work, spatial updating grand canonical algorithms are generalized to continuous, soft-core potentials to account for overlapping configurations. Results on two- and three-dimensional Lennard-Jones fluids indicate that spatial updating grand canonical algorithms, both random and sequential, converge faster than standard grand canonical algorithms. Spatial algorithms based on sequential updating not only exhibit the fastest convergence but also are ideal for parallel implementation due to the absence of strict detailed balance and the nature of the updating that minimizes interprocessor communication. Parallel simulation results for three-dimensional Lennard-Jones fluids show a substantial reduction of simulation time for systems of moderate and large size. The efficiency improvement by parallel processing through domain decomposition is always in addition to the efficiency improvement by sequential updating.
Efficiencies of joint non-local update moves in Monte Carlo simulations of coarse-grained polymers
Austin, Kieran S.; Marenz, Martin; Janke, Wolfhard
2018-03-01
In this study four update methods are compared in their performance in a Monte Carlo simulation of polymers in continuum space. The efficiencies of the update methods and combinations thereof are compared with the aid of the autocorrelation time with a fixed (optimal) acceptance ratio. Results are obtained for polymer lengths N = 14, 28 and 42 and temperatures below, at and above the collapse transition. In terms of autocorrelation, the optimal acceptance ratio is approximately 0.4. Furthermore, an overview of the step sizes of the update methods that correspond to this optimal acceptance ratio is given. This shall serve as a guide for future studies that rely on efficient computer simulations.
Improving precipitation simulation from updated surface characteristics in South America
Pereira, Gabriel; Silva, Maria Elisa Siqueira; Moraes, Elisabete Caria; Chiquetto, Júlio Barboza; da Silva Cardozo, Francielle
2017-07-01
Land use and land cover maps and their physical-chemical and biological properties are important variables in the numerical modeling of Earth systems. In this context, the main objective of this study is to analyze the improvements resulting from the land use and land cover map update in numerical simulations performed using the Regional Climate Model system version 4 (RegCM4), as well as the seasonal variations of physical parameters used by the Biosphere Atmosphere Transfer Scheme (BATS). In general, the update of the South America 2007 land use and land cover map, used by the BATS, improved the simulation of precipitation by 10 %, increasing the mean temporal correlation coefficient, compared to observed data, from 0.84 to 0.92 (significant at p Atlantic convergence zone (SACZ) positioning, presenting a spatial pattern of alternated areas with higher and lower precipitation rates. These important differences occur due to the replacement of tropical rainforest for pasture and agriculture and the replacement of agricultural areas for pasture, scrubland, and deciduous forest.
Updated Simulation Studies of Damage Limit of LHC Tertiary Collimators
AUTHOR|(CDS)2085459; Bertarelli, Alessandro; Bruce, Roderik; Carra, Federico; Cerutti, Francesco; Gradassi, Paolo; Lechner, Anton; Redaelli, Stefano; Skordis, Eleftherios
2015-01-01
The tertiary collimators (TCTs) in the LHC, installed in front of the experiments, in standard operation intercept fractions of 10−3 halo particles. However, they risk to be hit by high-intensity primary beams in case of asynchronous beam dump. TCT damage thresholds were initially inferred from results of destructive tests on a TCT jaw, supported by numerical simulations, assuming simplified impact scenarios with one single bunch hitting the jaw with a given impact parameter. In this paper, more realistic failure conditions, including a train of bunches and taking into account the full collimation hierarchy, are used to derive updated damage limits. The results are used to update the margins in the collimation hierarchy and could thus potentially have an influence on the LHC performance.
Intelligent launch and range operations virtual testbed (ILRO-VTB)
Bardina, Jorge; Rajkumar, Thirumalainambi
2003-09-01
Intelligent Launch and Range Operations Virtual Test Bed (ILRO-VTB) is a real-time web-based command and control, communication, and intelligent simulation environment of ground-vehicle, launch and range operation activities. ILRO-VTB consists of a variety of simulation models combined with commercial and indigenous software developments (NASA Ames). It creates a hybrid software/hardware environment suitable for testing various integrated control system components of launch and range. The dynamic interactions of the integrated simulated control systems are not well understood. Insight into such systems can only be achieved through simulation/emulation. For that reason, NASA has established a VTB where we can learn the actual control and dynamics of designs for future space programs, including testing and performance evaluation. The current implementation of the VTB simulates the operations of a sub-orbital vehicle of mission, control, ground-vehicle engineering, launch and range operations. The present development of the test bed simulates the operations of Space Shuttle Vehicle (SSV) at NASA Kennedy Space Center. The test bed supports a wide variety of shuttle missions with ancillary modeling capabilities like weather forecasting, lightning tracker, toxic gas dispersion model, debris dispersion model, telemetry, trajectory modeling, ground operations, payload models and etc. To achieve the simulations, all models are linked using Common Object Request Broker Architecture (CORBA). The test bed provides opportunities for government, universities, researchers and industries to do a real time of shuttle launch in cyber space.
An Updated Nuclear Equation of State for Neutron Stars and Supernova Simulations
Meixner, M. A.; Mathews, G. J.; Dalhed, H. E.; Lan, N. Q.
2011-10-01
We present an updated and improved Equation of State based upon the framework originally developed by Bowers & Wilson. The details of the EoS and improvements are described along with a description of how to access this EOS for numerical simulations. Among the improvements are an updated compressibility based upon recent measurements, the possibility of the formation of proton excess (Ye> 0.5) material and an improved treatment of the nuclear statistical equilibrium and the transition to pasta nuclei as the density approaches nuclear matter density. The possibility of a QCD chiral phase transition is also included at densities above nuclear matter density. We show comparisons of this EOS with the other two publicly available equations of state used in supernova collapse simulations. The advantages of the present EoS is that it is easily amenable to phenomenological parameterization to fit observed explosion properties and to accommodate new physical parameters.
Electromagnetic Simulation Seminar and Opera/Tosca update Seminar
IT Department
2012-01-01
9 May 2012 Kjell Johnsen Auditorium – Room 30-7-018 Electromagnetic Simulation Seminar & Opera/Tosca update Seminar By Cobham Technical Services – Vector Fields Software Virtual prototyping using electromagnetic simulation software plays an important role in the design stage of many devices and the Opera software has been in use for this purpose at CERN for over a decade. A technical seminar will take place concerning the latest developments in electromagnetic design, analysis and multi-physics applications for large scientific experiments. Information will be presented on applications such as superconducting magnets and ion-beam sources. The seminar will be presented by engineers/physicists from Cobham Technical Services – Vector Fields Software who develop the Opera program. It is FREE to attend and is open to both current Opera software users and also those who wish to expand their knowledge and understand better the capabilities on offer. Prog...
Tanaka, T.; Tachikawa, Y.; Ichikawa, Y.; Yorozu, K.
2017-12-01
Flood is one of the most hazardous disasters and causes serious damage to people and property around the world. To prevent/mitigate flood damage through early warning system and/or river management planning, numerical modelling of flood-inundation processes is essential. In a literature, flood-inundation models have been extensively developed and improved to achieve flood flow simulation with complex topography at high resolution. With increasing demands on flood-inundation modelling, its computational burden is now one of the key issues. Improvements of computational efficiency of full shallow water equations are made from various perspectives such as approximations of the momentum equations, parallelization technique, and coarsening approaches. To support these techniques and more improve the computational efficiency of flood-inundation simulations, this study proposes an Automatic Domain Updating (ADU) method of 2-D flood-inundation simulation. The ADU method traces the wet and dry interface and automatically updates the simulation domain in response to the progress and recession of flood propagation. The updating algorithm is as follow: first, to register the simulation cells potentially flooded at initial stage (such as floodplains nearby river channels), and then if a registered cell is flooded, to register its surrounding cells. The time for this additional process is saved by checking only cells at wet and dry interface. The computation time is reduced by skipping the processing time of non-flooded area. This algorithm is easily applied to any types of 2-D flood inundation models. The proposed ADU method is implemented to 2-D local inertial equations for the Yodo River basin, Japan. Case studies for two flood events show that the simulation is finished within two to 10 times smaller time showing the same result as that without the ADU method.
Nuclear engine system simulation (NESS) program update
International Nuclear Information System (INIS)
Scheil, C.M.; Pelaccio, D.G.; Petrosky, L.J.
1993-01-01
The second phase of development of a Nuclear Thermal Propulsion (NTP) engine system design analysis code has been completed. The standalone, versatile Nuclear Engine System Simulation (NESS) code provides an accurate, detailed assessment of engine system operating performance, weight, and sizes. The critical information is required to support ongoing and future engine system and stage design study efforts. This recent development effort included incorporation of an updated solid-core nuclear thermal reactor model that yields a reduced core weight and higher fuel power density when compared to a NERVA type reactor. NESS can now analyze expander, gas generator, and bleed cycles, along with multi-redundant propellant pump feed systems. Performance and weight of efficient multi-stage axial turbopump can now be determined, in addition to the traditional centrifugal pump
Stupl, Jan; Faber, Nicolas; Foster, Cyrus; Yang, Fan Yang; Nelson, Bron; Aziz, Jonathan; Nuttall, Andrew; Henze, Chris; Levit, Creon
2014-01-01
This paper provides an updated efficiency analysis of the LightForce space debris collision avoidance scheme. LightForce aims to prevent collisions on warning by utilizing photon pressure from ground based, commercial off the shelf lasers. Past research has shown that a few ground-based systems consisting of 10 kilowatt class lasers directed by 1.5 meter telescopes with adaptive optics could lower the expected number of collisions in Low Earth Orbit (LEO) by an order of magnitude. Our simulation approach utilizes the entire Two Line Element (TLE) catalogue in LEO for a given day as initial input. Least-squares fitting of a TLE time series is used for an improved orbit estimate. We then calculate the probability of collision for all LEO objects in the catalogue for a time step of the simulation. The conjunctions that exceed a threshold probability of collision are then engaged by a simulated network of laser ground stations. After those engagements, the perturbed orbits are used to re-assess the probability of collision and evaluate the efficiency of the system. This paper describes new simulations with three updated aspects: 1) By utilizing a highly parallel simulation approach employing hundreds of processors, we have extended our analysis to a much broader dataset. The simulation time is extended to one year. 2) We analyze not only the efficiency of LightForce on conjunctions that naturally occur, but also take into account conjunctions caused by orbit perturbations due to LightForce engagements. 3) We use a new simulation approach that is regularly updating the LightForce engagement strategy, as it would be during actual operations. In this paper we present our simulation approach to parallelize the efficiency analysis, its computational performance and the resulting expected efficiency of the LightForce collision avoidance system. Results indicate that utilizing a network of four LightForce stations with 20 kilowatt lasers, 85% of all conjunctions with a
Shiraishi, Emi; Maeda, Kazuhiro; Kurata, Hiroyuki
2009-02-01
Numerical simulation of differential equation systems plays a major role in the understanding of how metabolic network models generate particular cellular functions. On the other hand, the classical and technical problems for stiff differential equations still remain to be solved, while many elegant algorithms have been presented. To relax the stiffness problem, we propose new practical methods: the gradual update of differential-algebraic equations based on gradual application of the steady-state approximation to stiff differential equations, and the gradual update of the initial values in differential-algebraic equations. These empirical methods show a high efficiency for simulating the steady-state solutions for the stiff differential equations that existing solvers alone cannot solve. They are effective in extending the applicability of dynamic simulation to biochemical network models.
Updating of the program for simulation of Darlington shutdown and regulation systems
International Nuclear Information System (INIS)
1988-07-01
This report describes the current status of the developments of a simulation of the Darlington Nuclear Generating Station shutdown and regulating systems, DARSIM done under contract to the Atomic Energy Control Board (AECB). The DARSIM program simulates the spatial neutron dynamics, the regulation of the reactor power, and shutdown system 1 and shutdown system 2 software. The DARSIM program operates in the interactive simulation program environment. DARSIM was installed on the APOLLO computer at the AECB and a version for an IBM-PC was also provided for the exclusive use of the AECB. Shutdown system software was updated to incorporate the latest revisions in the functional specifications. Additional developments have been provided to assist in the use and interpretation of the DARSIM results
Electron-cloud updated simulation results for the PSR, and recent results for the SNS
International Nuclear Information System (INIS)
Pivi, M.; Furman, M.A.
2002-01-01
Recent simulation results for the main features of the electron cloud in the storage ring of the Spallation Neutron Source (SNS) at Oak Ridge, and updated results for the Proton Storage Ring (PSR) at Los Alamos are presented in this paper. A refined model for the secondary emission process including the so called true secondary, rediffused and backscattered electrons has recently been included in the electron-cloud code
Directory of Open Access Journals (Sweden)
Y. P. Li
2013-07-01
Full Text Available The formation of Secondary organic aerosol (SOA was simulated with the Secondary ORGanic Aerosol Model (SORGAM by a classical gas-particle partitioning concept, using the two-product model approach, which is widely used in chemical transport models. In this study, we extensively updated SORGAM including three major modifications: firstly, we derived temperature dependence functions of the SOA yields for aromatics and biogenic VOCs (volatile organic compounds, based on recent chamber studies within a sophisticated mathematic optimization framework; secondly, we implemented the SOA formation pathways from photo oxidation (OH initiated of isoprene; thirdly, we implemented the SOA formation channel from NO3-initiated oxidation of reactive biogenic hydrocarbons (isoprene and monoterpenes. The temperature dependence functions of the SOA yields were validated against available chamber experiments, and the updated SORGAM with temperature dependence functions was evaluated with the chamber data. Good performance was found with the normalized mean error of less than 30%. Moreover, the whole updated SORGAM module was validated against ambient SOA observations represented by the summed oxygenated organic aerosol (OOA concentrations abstracted from aerosol mass spectrometer (AMS measurements at a rural site near Rotterdam, the Netherlands, performed during the IMPACT campaign in May 2008. In this case, we embedded both the original and the updated SORGAM module into the EURopean Air pollution and Dispersion-Inverse Model (EURAD-IM, which showed general good agreements with the observed meteorological parameters and several secondary products such as O3, sulfate and nitrate. With the updated SORGAM module, the EURAD-IM model also captured the observed SOA concentrations reasonably well especially those during nighttime. In contrast, the EURAD-IM model before update underestimated the observations by a factor of up to 5. The large improvements of the modeled
Update on comparison of the particle production using Mars simulation code
Prior, G; Kirk, H G; Souchlas, N; Ding, X
2011-01-01
In the International Design Study for the Neutrino Factory (IDS-NF), a 5-15 GeV (kinetic energy) proton beam impinges a Hg jet target, in order to produce pions that will decay into muons. The muons are captured and transformed into a beam, then passed to the downstream acceleration system. The target sits in a solenoid eld tapering from 20 T down to below 2 T, over several meters, permitting an optimized capture of the pions that will produce useful muons for the machine. The target and pion capture systems have been simulated using MARS. This paper presents an updated comparison of the particles production using the MARS code versions m1507 and m1510 on different machines located at the European Organization for Nuclear Research (CERN) and Brookhaven National Laboratory (BNL).
A comparison of updating algorithms for large N reduced models
Energy Technology Data Exchange (ETDEWEB)
Pérez, Margarita García [Instituto de Física Teórica UAM-CSIC, Universidad Autónoma de Madrid,Nicolás Cabrera 13-15, E-28049-Madrid (Spain); González-Arroyo, Antonio [Instituto de Física Teórica UAM-CSIC, Universidad Autónoma de Madrid,Nicolás Cabrera 13-15, E-28049-Madrid (Spain); Departamento de Física Teórica, C-XI Universidad Autónoma de Madrid,E-28049 Madrid (Spain); Keegan, Liam [PH-TH, CERN,CH-1211 Geneva 23 (Switzerland); Okawa, Masanori [Graduate School of Science, Hiroshima University,Higashi-Hiroshima, Hiroshima 739-8526 (Japan); Core of Research for the Energetic Universe, Hiroshima University,Higashi-Hiroshima, Hiroshima 739-8526 (Japan); Ramos, Alberto [PH-TH, CERN,CH-1211 Geneva 23 (Switzerland)
2015-06-29
We investigate Monte Carlo updating algorithms for simulating SU(N) Yang-Mills fields on a single-site lattice, such as for the Twisted Eguchi-Kawai model (TEK). We show that performing only over-relaxation (OR) updates of the gauge links is a valid simulation algorithm for the Fabricius and Haan formulation of this model, and that this decorrelates observables faster than using heat-bath updates. We consider two different methods of implementing the OR update: either updating the whole SU(N) matrix at once, or iterating through SU(2) subgroups of the SU(N) matrix, we find the same critical exponent in both cases, and only a slight difference between the two.
A comparison of updating algorithms for large $N$ reduced models
Pérez, Margarita García; Keegan, Liam; Okawa, Masanori; Ramos, Alberto
2015-01-01
We investigate Monte Carlo updating algorithms for simulating $SU(N)$ Yang-Mills fields on a single-site lattice, such as for the Twisted Eguchi-Kawai model (TEK). We show that performing only over-relaxation (OR) updates of the gauge links is a valid simulation algorithm for the Fabricius and Haan formulation of this model, and that this decorrelates observables faster than using heat-bath updates. We consider two different methods of implementing the OR update: either updating the whole $SU(N)$ matrix at once, or iterating through $SU(2)$ subgroups of the $SU(N)$ matrix, we find the same critical exponent in both cases, and only a slight difference between the two.
Decentralized Consistent Updates in SDN
Nguyen, Thanh Dang
2017-04-10
We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and blackholes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.
Empirical testing of forecast update procedure forseasonal products
DEFF Research Database (Denmark)
Wong, Chee Yew; Johansen, John
2008-01-01
Updating of forecasts is essential for successful collaborative forecasting, especially for seasonal products. This paper discusses the results of a theoretical simulation and an empirical test of a proposed time-series forecast updating procedure. It involves a two-stage longitudinal case study...... of a toy supply chain. The theoretical simulation involves historical weekly consumer demand data for 122 toy products. The empirical test is then carried out in real-time with 291 toy products. The results show that the proposed forecast updating procedure: 1) reduced forecast errors of the annual...... provided less forecast accuracy improvement and it needed a longer time to achieve relatively acceptable forecast uncertainty....
Model parameter updating using Bayesian networks
International Nuclear Information System (INIS)
Treml, C.A.; Ross, Timothy J.
2004-01-01
This paper outlines a model parameter updating technique for a new method of model validation using a modified model reference adaptive control (MRAC) framework with Bayesian Networks (BNs). The model parameter updating within this method is generic in the sense that the model/simulation to be validated is treated as a black box. It must have updateable parameters to which its outputs are sensitive, and those outputs must have metrics that can be compared to that of the model reference, i.e., experimental data. Furthermore, no assumptions are made about the statistics of the model parameter uncertainty, only upper and lower bounds need to be specified. This method is designed for situations where a model is not intended to predict a complete point-by-point time domain description of the item/system behavior; rather, there are specific points, features, or events of interest that need to be predicted. These specific points are compared to the model reference derived from actual experimental data. The logic for updating the model parameters to match the model reference is formed via a BN. The nodes of this BN consist of updateable model input parameters and the specific output values or features of interest. Each time the model is executed, the input/output pairs are used to adapt the conditional probabilities of the BN. Each iteration further refines the inferred model parameters to produce the desired model output. After parameter updating is complete and model inputs are inferred, reliabilities for the model output are supplied. Finally, this method is applied to a simulation of a resonance control cooling system for a prototype coupled cavity linac. The results are compared to experimental data.
Brooks, Lynette E.
2013-01-01
The U.S. Geological Survey (USGS), in cooperation with the Southern Utah Valley Municipal Water Association, updated an existing USGS model of southern Utah and Goshen Valleys for hydrologic and climatic conditions from 1991 to 2011 and used the model for projection and groundwater management simulations. All model files used in the transient model were updated to be compatible with MODFLOW-2005 and with the additional stress periods. The well and recharge files had the most extensive changes. Discharge to pumping wells in southern Utah and Goshen Valleys was estimated and simulated on an annual basis from 1991 to 2011. Recharge estimates for 1991 to 2011 were included in the updated model by using precipitation, streamflow, canal diversions, and irrigation groundwater withdrawals for each year. The model was evaluated to determine how well it simulates groundwater conditions during recent increased withdrawals and drought, and to determine if the model is adequate for use in future planning. In southern Utah Valley, the magnitude and direction of annual water-level fluctuation simulated by the updated model reasonably match measured water-level changes, but they do not simulate as much decline as was measured in some locations from 2000 to 2002. Both the rapid increase in groundwater withdrawals and the total groundwater withdrawals in southern Utah Valley during this period exceed the variations and magnitudes simulated during the 1949 to 1990 calibration period. It is possible that hydraulic properties may be locally incorrect or that changes, such as land use or irrigation diversions, occurred that are not simulated. In the northern part of Goshen Valley, simulated water-level changes reasonably match measured changes. Farther south, however, simulated declines are much less than measured declines. Land-use changes indicate that groundwater withdrawals in Goshen Valley are possibly greater than estimated and simulated. It is also possible that irrigation
Optimal updating magnitude in adaptive flat-distribution sampling.
Zhang, Cheng; Drake, Justin A; Ma, Jianpeng; Pettitt, B Montgomery
2017-11-07
We present a study on the optimization of the updating magnitude for a class of free energy methods based on flat-distribution sampling, including the Wang-Landau (WL) algorithm and metadynamics. These methods rely on adaptive construction of a bias potential that offsets the potential of mean force by histogram-based updates. The convergence of the bias potential can be improved by decreasing the updating magnitude with an optimal schedule. We show that while the asymptotically optimal schedule for the single-bin updating scheme (commonly used in the WL algorithm) is given by the known inverse-time formula, that for the Gaussian updating scheme (commonly used in metadynamics) is often more complex. We further show that the single-bin updating scheme is optimal for very long simulations, and it can be generalized to a class of bandpass updating schemes that are similarly optimal. These bandpass updating schemes target only a few long-range distribution modes and their optimal schedule is also given by the inverse-time formula. Constructed from orthogonal polynomials, the bandpass updating schemes generalize the WL and Langfeld-Lucini-Rago algorithms as an automatic parameter tuning scheme for umbrella sampling.
Impacts of updated green vegetation fraction data on WRF simulations of the 2006 European heat wave
Refslund, J.; Dellwik, E.; Hahmann, A. N.; Barlage, M. J.; Boegh, E.
2012-12-01
Climate change studies suggest an increase in heat wave occurrences over Europe in the coming decades. Extreme events with excessive heat and associated drought will impact vegetation growth and health and lead to alterations in the partitioning of the surface energy. In this study, the atmospheric conditions during the heat wave year 2006 over Europe were simulated using the Weather Research and Forecasting (WRF) model. To account for the drought effects on the vegetation, new high-resolution green vegetation fraction (GVF) data were developed for the domain using NDVI data from MODIS satellite observations. Many empirical relationships exist to convert NDVI to GVF and both a linear and a quadratic formulation were evaluated. The new GVF product has a spatial resolution of 1 km2 and a temporal resolution of 8 days. To minimize impacts from low-quality satellite retrievals in the NDVI series, as well as for comparison with the default GVF climatology in WRF, a new background climatology using 10 recent years of observations was also developed. The annual time series of the new GVF climatology was compared to the default WRF GVF climatology at 18 km2 grid resolution for the most common land use classes in the European domain. The new climatology generally has higher GVF levels throughout the year, in particular an extended autumnal growth season. Comparison of 2006 GVF with the climatology clearly indicates vegetation stresses related to heat and drought. The GVF product based on a quadratic NDVI relationship shows the best agreement with the magnitude and annual range of the default input data, in addition to including updated seasonality for various land use classes. The new GVF products were tested in WRF and found to work well for the spring of 2006 where the difference between the default and new GVF products was small. The WRF 2006 heat wave simulations were verified by comparison with daily gridded observations of mean, minimum and maximum temperature and
An Update on the Banana Effect
Schulte, Daniel
2002-01-01
Wakefield in the main linac of the future linear collider can strongly affect the beam-beam interaction at the collision point and potentially lead to a significant luminosity loss. This paper gives an update on the status of the simulations of this effect.
Decentralized Consistent Network Updates in SDN with ez-Segway
Nguyen, Thanh Dang
2017-03-06
We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and black-holes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.
Editorial for special issue on Perception and Navigation for Autonomous Vehicles
Laugier , Christian; Philippe , Martinet; Urbano , Nunes
2014-01-01
International audience; This Special Issue of the IEEE Robotics and Automation Magazine has been prepared in the scope of the activities of the Technical Committee on "Autonomous Ground Vehicle and Intelligent Transportation System" (AGV-ITS) (http://www.ieee-ras.org/autonomous-groundvehicles- and-intelligent-transportation-systems) of the IEEE Robotics and Automation Society (IEEE RAS).
Update schemes of multi-velocity floor field cellular automaton for pedestrian dynamics
Luo, Lin; Fu, Zhijian; Cheng, Han; Yang, Lizhong
2018-02-01
Modeling pedestrian movement is an interesting problem both in statistical physics and in computational physics. Update schemes of cellular automaton (CA) models for pedestrian dynamics govern the schedule of pedestrian movement. Usually, different update schemes make the models behave in different ways, which should be carefully recalibrated. Thus, in this paper, we investigated the influence of four different update schemes, namely parallel/synchronous scheme, random scheme, order-sequential scheme and shuffled scheme, on pedestrian dynamics. The multi-velocity floor field cellular automaton (FFCA) considering the changes of pedestrians' moving properties along walking paths and heterogeneity of pedestrians' walking abilities was used. As for parallel scheme only, the collisions detection and resolution should be considered, resulting in a great difference from any other update schemes. For pedestrian evacuation, the evacuation time is enlarged, and the difference in pedestrians' walking abilities is better reflected, under parallel scheme. In face of a bottleneck, for example a exit, using a parallel scheme leads to a longer congestion period and a more dispersive density distribution. The exit flow and the space-time distribution of density and velocity have significant discrepancies under four different update schemes when we simulate pedestrian flow with high desired velocity. Update schemes may have no influence on pedestrians in simulation to create tendency to follow others, but sequential and shuffled update scheme may enhance the effect of pedestrians' familiarity with environments.
Co-operation and Phase Behavior under the Mixed Updating Rules
International Nuclear Information System (INIS)
Zhang Wen; Li Yao-Sheng; Xu Chen
2015-01-01
We present a model by considering two updating rules when the agents play prisoner's dilemma on a square lattice. Agents can update their strategies by referencing one of his neighbors of higher payoffs under the imitation updating rule or directly replaced by one of his neighbors according to the death-birth updating rule. The frequency of co-operation is related to the probability q of occurrence of the imitation updating or the death-birth updating and the game parameter b. The death-birth updating rule favors the co-operation while the imitation updating rule favors the defection on the lattice, although both rules suppress the co-operation in the well-mixed population. Therefore a totally co-operative state may emerge when the death-birth updating is involved in the evolution when b is relatively small. We also obtain a phase diagram on the q-b plane. There are three phases on the plane with two pure phases of a totally co-operative state and a totally defective state and a mixing phase of mixed strategies. Based on the pair approximation, we theoretically analyze the phase behavior and obtain a quantitative agreement with the simulation results. (paper)
Regan, R. Steve; LaFontaine, Jacob H.
2017-10-05
This report documents seven enhancements to the U.S. Geological Survey (USGS) Precipitation-Runoff Modeling System (PRMS) hydrologic simulation code: two time-series input options, two new output options, and three updates of existing capabilities. The enhancements are (1) new dynamic parameter module, (2) new water-use module, (3) new Hydrologic Response Unit (HRU) summary output module, (4) new basin variables summary output module, (5) new stream and lake flow routing module, (6) update to surface-depression storage and flow simulation, and (7) update to the initial-conditions specification. This report relies heavily upon U.S. Geological Survey Techniques and Methods, book 6, chapter B7, which documents PRMS version 4 (PRMS-IV). A brief description of PRMS is included in this report.
A New Simulation Technique for Study of Collisionless Shocks: Self-Adaptive Simulations
International Nuclear Information System (INIS)
Karimabadi, H.; Omelchenko, Y.; Driscoll, J.; Krauss-Varban, D.; Fujimoto, R.; Perumalla, K.
2005-01-01
The traditional technique for simulating physical systems modeled by partial differential equations is by means of time-stepping methodology where the state of the system is updated at regular discrete time intervals. This method has inherent inefficiencies. In contrast to this methodology, we have developed a new asynchronous type of simulation based on a discrete-event-driven (as opposed to time-driven) approach, where the simulation state is updated on a 'need-to-be-done-only' basis. Here we report on this new technique, show an example of particle acceleration in a fast magnetosonic shockwave, and briefly discuss additional issues that we are addressing concerning algorithm development and parallel execution
MEMS Stirling Cooler Development Update
Moran, Matthew E.; Wesolek, Danielle
2003-01-01
This presentation provides an update on the effort to build and test a prototype unit of the patented MEMS Stirling cooler concept. A micro-scale regenerator has been fabricated by Polar Thermal Technologies and is currently being integrated into a Stirling cycle simulator at Johns Hopkins University Applied Physics Laboratory. A discussion of the analysis, design, assembly, and test plans for the prototype will be presented.
A Lookahead Behavior Model for Multi-Agent Hybrid Simulation
Directory of Open Access Journals (Sweden)
Mei Yang
2017-10-01
Full Text Available In the military field, multi-agent simulation (MAS plays an important role in studying wars statistically. For a military simulation system, which involves large-scale entities and generates a very large number of interactions during the runtime, the issue of how to improve the running efficiency is of great concern for researchers. Current solutions mainly use hybrid simulation to gain fewer updates and synchronizations, where some important continuous models are maintained implicitly to keep the system dynamics, and partial resynchronization (PR is chosen as the preferable state update mechanism. However, problems, such as resynchronization interval selection and cyclic dependency, remain unsolved in PR, which easily lead to low update efficiency and infinite looping of the state update process. To address these problems, this paper proposes a lookahead behavior model (LBM to implement a PR-based hybrid simulation. In LBM, a minimal safe time window is used to predict the interactions between implicit models, upon which the resynchronization interval can be efficiently determined. Moreover, the LBM gives an estimated state value in the lookahead process so as to break the state-dependent cycle. The simulation results show that, compared with traditional mechanisms, LBM requires fewer updates and synchronizations.
Full-scope training simulators
International Nuclear Information System (INIS)
Ugedo, E.
1986-01-01
The following topics to be covered in this report are: Reasons justifying the use of full-scope simulators for operator qualification. Full-scope simulator description: the control room, the physical models, the computer complex, the instructor's console. Main features of full-scope simulators. Merits of simulator training. The role of full-scope simulators in the training programs. The process of ordering and acquiring a full-scope simulator. Maintaining and updating simulator capabilities. (orig./GL)
ROMI 4.0: Updated Rough Mill Simulator
Timo Grueneberg; R. Edward Thomas; Urs Buehlmann
2012-01-01
In the secondary hardwood industry, rough mills convert hardwood lumber into dimension parts for furniture, cabinets, and other wood products. ROMI 4.0, the US Department of Agriculture Forest Service's ROugh-MIll simulator, is a software package designed to simulate the cut-up of hardwood lumber in rough mills in such a way that a maximum possible component yield...
Ontology Update in the Cognitive Model of Ontology Learning
Directory of Open Access Journals (Sweden)
Zhang De-Hai
2016-01-01
Full Text Available Ontology has been used in many hot-spot fields, but most ontology construction methods are semiautomatic, and the construction process of ontology is still a tedious and painstaking task. In this paper, a kind of cognitive models is presented for ontology learning which can simulate human being’s learning from world. In this model, the cognitive strategies are applied with the constrained axioms. Ontology update is a key step when the new knowledge adds into the existing ontology and conflict with old knowledge in the process of ontology learning. This proposal designs and validates the method of ontology update based on the axiomatic cognitive model, which include the ontology update postulates, axioms and operations of the learning model. It is proved that these operators subject to the established axiom system.
Sustained qualification process for full scope nuclear power plant simulators
International Nuclear Information System (INIS)
Pirson, J.; Stubbe, E.; Vanhoenacker, L.
1994-01-01
In the past decade, simulator training for all nuclear power plant operators has evolved into a vital requirement. To assure a correct training, the simulator qualification process is an important issue not only for the initial validation but also following major simulator updates, which are necessary during the lifetime of the simulator. In order to avoid degradation of the simulator validated software, the modifications have to be introduced according to a rigorous methodology and a practical requalification process has to be applied. Such methodology has to be enforced at every phase of the simulator construction or updating process from plant data package constitution, over simulator software development to simulator response qualification. The initial qualification and requalification process is based on the 3 levels identified by the ANSI/ANS 3-5 standard for steady-state, operational transients and accident conditions. For the initial certification of the full scope simulators in Belgium, a practical qualification methodology has been applied, which has been adapted into a set of non regression tests for the requalification after major simulator updates. (orig.) (4 refs., 3 figs.)
Rakovec, O.; Weerts, A.; Hazenberg, P.; Torfs, P.; Uijlenhoet, R.
2012-12-01
This paper presents a study on the optimal setup for discharge assimilation within a spatially distributed hydrological model (Rakovec et al., 2012a). The Ensemble Kalman filter (EnKF) is employed to update the grid-based distributed states of such an hourly spatially distributed version of the HBV-96 model. By using a physically based model for the routing, the time delay and attenuation are modelled more realistically. The discharge and states at a given time step are assumed to be dependent on the previous time step only (Markov property). Synthetic and real world experiments are carried out for the Upper Ourthe (1600 km2), a relatively quickly responding catchment in the Belgian Ardennes. The uncertain precipitation model forcings were obtained using a time-dependent multivariate spatial conditional simulation method (Rakovec et al., 2012b), which is further made conditional on preceding simulations. We assess the impact on the forecasted discharge of (1) various sets of the spatially distributed discharge gauges and (2) the filtering frequency. The results show that the hydrological forecast at the catchment outlet is improved by assimilating interior gauges. This augmentation of the observation vector improves the forecast more than increasing the updating frequency. In terms of the model states, the EnKF procedure is found to mainly change the pdfs of the two routing model storages, even when the uncertainty in the discharge simulations is smaller than the defined observation uncertainty. Rakovec, O., Weerts, A. H., Hazenberg, P., Torfs, P. J. J. F., and Uijlenhoet, R.: State updating of a distributed hydrological model with Ensemble Kalman Filtering: effects of updating frequency and observation network density on forecast accuracy, Hydrol. Earth Syst. Sci. Discuss., 9, 3961-3999, doi:10.5194/hessd-9-3961-2012, 2012a. Rakovec, O., Hazenberg, P., Torfs, P. J. J. F., Weerts, A. H., and Uijlenhoet, R.: Generating spatial precipitation ensembles: impact of
Marwala, Tshilidzi
2010-01-01
Finite element models (FEMs) are widely used to understand the dynamic behaviour of various systems. FEM updating allows FEMs to be tuned better to reflect measured data and may be conducted using two different statistical frameworks: the maximum likelihood approach and Bayesian approaches. Finite Element Model Updating Using Computational Intelligence Techniques applies both strategies to the field of structural mechanics, an area vital for aerospace, civil and mechanical engineering. Vibration data is used for the updating process. Following an introduction a number of computational intelligence techniques to facilitate the updating process are proposed; they include: • multi-layer perceptron neural networks for real-time FEM updating; • particle swarm and genetic-algorithm-based optimization methods to accommodate the demands of global versus local optimization models; • simulated annealing to put the methodologies into a sound statistical basis; and • response surface methods and expectation m...
A Kriging Model Based Finite Element Model Updating Method for Damage Detection
Directory of Open Access Journals (Sweden)
Xiuming Yang
2017-10-01
Full Text Available Model updating is an effective means of damage identification and surrogate modeling has attracted considerable attention for saving computational cost in finite element (FE model updating, especially for large-scale structures. In this context, a surrogate model of frequency is normally constructed for damage identification, while the frequency response function (FRF is rarely used as it usually changes dramatically with updating parameters. This paper presents a new surrogate model based model updating method taking advantage of the measured FRFs. The Frequency Domain Assurance Criterion (FDAC is used to build the objective function, whose nonlinear response surface is constructed by the Kriging model. Then, the efficient global optimization (EGO algorithm is introduced to get the model updating results. The proposed method has good accuracy and robustness, which have been verified by a numerical simulation of a cantilever and experimental test data of a laboratory three-story structure.
Directory of Open Access Journals (Sweden)
Ulrike Stumvoll
2016-01-01
Full Text Available In an Enterprise Resource Planning (ERP system, production planning is influenced by a variety of parameters. Previous investigations show that setting parameter values is highly relevant to a company’s target system. Parameter settings should be checked and adjusted, e.g., after a change in environmental factors, by material planners. In practice, updating the parameters is difficult due to several reasons. This paper presents a simulation-based decision support system, which helps material planners in all stages of decision-making processes. It will present the system prototype’s user interface and the results of applying the system to a case study.
Demeter, R M; Kristensen, A R; Dijkstra, J; Oude Lansink, A G J M; Meuwissen, M P M; van Arendonk, J A M
2011-12-01
Herd optimization models that determine economically optimal insemination and replacement decisions are valuable research tools to study various aspects of farming systems. The aim of this study was to develop a herd optimization and simulation model for dairy cattle. The model determines economically optimal insemination and replacement decisions for individual cows and simulates whole-herd results that follow from optimal decisions. The optimization problem was formulated as a multi-level hierarchic Markov process, and a state space model with Bayesian updating was applied to model variation in milk yield. Methodological developments were incorporated in 2 main aspects. First, we introduced an additional level to the model hierarchy to obtain a more tractable and efficient structure. Second, we included a recently developed cattle feed intake model. In addition to methodological developments, new parameters were used in the state space model and other biological functions. Results were generated for Dutch farming conditions, and outcomes were in line with actual herd performance in the Netherlands. Optimal culling decisions were sensitive to variation in milk yield but insensitive to energy requirements for maintenance and feed intake capacity. We anticipate that the model will be applied in research and extension. Copyright © 2011 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Aircraft engine sensor fault diagnostics using an on-line OBEM update method.
Directory of Open Access Journals (Sweden)
Xiaofeng Liu
Full Text Available This paper proposed a method to update the on-line health reference baseline of the On-Board Engine Model (OBEM to maintain the effectiveness of an in-flight aircraft sensor Fault Detection and Isolation (FDI system, in which a Hybrid Kalman Filter (HKF was incorporated. Generated from a rapid in-flight engine degradation, a large health condition mismatch between the engine and the OBEM can corrupt the performance of the FDI. Therefore, it is necessary to update the OBEM online when a rapid degradation occurs, but the FDI system will lose estimation accuracy if the estimation and update are running simultaneously. To solve this problem, the health reference baseline for a nonlinear OBEM was updated using the proposed channel controller method. Simulations based on the turbojet engine Linear-Parameter Varying (LPV model demonstrated the effectiveness of the proposed FDI system in the presence of substantial degradation, and the channel controller can ensure that the update process finishes without interference from a single sensor fault.
Novel approach to improve the attitude update rate of a star tracker.
Zhang, Shuo; Xing, Fei; Sun, Ting; You, Zheng; Wei, Minsong
2018-03-05
The star tracker is widely used in attitude control systems of spacecraft for attitude measurement. The attitude update rate of a star tracker is important to guarantee the attitude control performance. In this paper, we propose a novel approach to improve the attitude update rate of a star tracker. The electronic Rolling Shutter (RS) imaging mode of the complementary metal-oxide semiconductor (CMOS) image sensor in the star tracker is applied to acquire star images in which the star spots are exposed with row-to-row time offsets, thereby reflecting the rotation of star tracker at different times. The attitude estimation method with a single star spot is developed to realize the multiple attitude updates by a star image, so as to reach a high update rate. The simulation and experiment are performed to verify the proposed approaches. The test results demonstrate that the proposed approach is effective and the attitude update rate of a star tracker is increased significantly.
"Updates to Model Algorithms & Inputs for the Biogenic Emissions Inventory System (BEIS) Model"
We have developed new canopy emission algorithms and land use data for BEIS. Simulations with BEIS v3.4 and these updates in CMAQ v5.0.2 are compared these changes to the Model of Emissions of Gases and Aerosols from Nature (MEGAN) and evaluated the simulations against observatio...
The Rubber Band Revisited: Wang-Landau Simulation
Ferreira, Lucas S.; Caparica, Alvaro A.; Neto, Minos A.; Galiceanu, Mircea D.
2012-01-01
In this work we apply Wang-Landau simulations to a simple model which has exact solutions both in the microcanonical and canonical formalisms. The simulations were carried out by using an updated version of the Wang-Landau sampling. We consider a homopolymer chain consisting of $N$ monomers units which may assume any configuration on the two-dimensional lattice. By imposing constraints to the moves of the polymers we obtain three different models. Our results show that updating the density of...
A review on model updating of joint structure for dynamic analysis purpose
Directory of Open Access Journals (Sweden)
Zahari S.N.
2016-01-01
Full Text Available Structural joints provide connection between structural element (beam, plate etc. in order to construct a whole assembled structure. There are many types of structural joints such as bolted joint, riveted joints and welded joints. The joints structures significantly contribute to structural stiffness and dynamic behaviour of structures hence the main objectives of this paper are to review on method of model updating on joints structure and to discuss the guidelines to perform model updating for dynamic analysis purpose. This review paper firstly will outline some of the existing finite element modelling works of joints structure. Experimental modal analysis is the next step to obtain modal parameters (natural frequency & mode shape to validate and improve the discrepancy between results obtained from experimental and the simulation counterparts. Hence model updating will be carried out to minimize the differences between the two results. There are two methods of model updating; direct method and iterative method. Sensitivity analysis employed using SOL200 in NASTRAN by selecting the suitable updating parameters to avoid ill-conditioning problem. It is best to consider both geometrical and material properties in the updating procedure rather than choosing only a number of geometrical properties alone. Iterative method was chosen as the best model updating procedure because the physical meaning of updated parameters are guaranteed although this method required computational effort compare to direct method.
Design and use of an engineering simulator for power plant and training simulator updates
International Nuclear Information System (INIS)
Sharawy, P.S.; Kennard, J.R.; Chou, Q.B.
1990-01-01
The advancement in real-time simulators has been facilitated by the availability of increasingly powerful computing devices at reduced costs for use in conjunction with high-fidelity simulation software. Ontario Hydro's commitment to the safe and reliable operation of its nuclear power plants was one of the factors which influenced its decision to build a plant-replica operator training simulator for each of its nuclear generating stations. This investment soon proved to have advantages beyond those originally envisaged. It become apparent that because the software developed for these simulators met rigorous acceptance criteria, it could be used on an engineering simulator to effectively investigate problems occurring at the stations. It could also serve as a design aid for station modifications. Encouraged by the success of early experimentation in the use of its training simulators for concept validation and verification, Ontario Hydro is developing a low-cost central facility - the Instrumentation and Control Engineering Simulator (ICES) - for use in its design work. This facility incorporates the software of its training simulators and includes a user-friendly generic interface which enables designers to configure and operate it. Inclusion of the engineering simulator in all phases of the design process, from the original concept to implementation and verification, will make it possible to shorten the design period significantly while achieving a high level of quality. It will also facilitate the rapid retrofit of simulators to reflect station modifications. This paper will recount Ontario Hydro's experience in the use of simulators for design work and will specifically discuss the design features and system performance of its engineering simulator
Least squares approach for initial data recovery in dynamic data-driven applications simulations
Douglas, C.
2010-12-01
In this paper, we consider the initial data recovery and the solution update based on the local measured data that are acquired during simulations. Each time new data is obtained, the initial condition, which is a representation of the solution at a previous time step, is updated. The update is performed using the least squares approach. The objective function is set up based on both a measurement error as well as a penalization term that depends on the prior knowledge about the solution at previous time steps (or initial data). Various numerical examples are considered, where the penalization term is varied during the simulations. Numerical examples demonstrate that the predictions are more accurate if the initial data are updated during the simulations. © Springer-Verlag 2011.
Information dissemination model for social media with constant updates
Zhu, Hui; Wu, Heng; Cao, Jin; Fu, Gang; Li, Hui
2018-07-01
With the development of social media tools and the pervasiveness of smart terminals, social media has become a significant source of information for many individuals. However, false information can spread rapidly, which may result in negative social impacts and serious economic losses. Thus, reducing the unfavorable effects of false information has become an urgent challenge. In this paper, a new competitive model called DMCU is proposed to describe the dissemination of information with constant updates in social media. In the model, we focus on the competitive relationship between the original false information and updated information, and then propose the priority of related information. To more effectively evaluate the effectiveness of the proposed model, data sets containing actual social media activity are utilized in experiments. Simulation results demonstrate that the DMCU model can precisely describe the process of information dissemination with constant updates, and that it can be used to forecast information dissemination trends on social media.
Simulated and observed 2010 floodwater elevations in the Pawcatuck and Wood Rivers, Rhode Island
Zarriello, Phillip J.; Straub, David E.; Smith, Thor E.
2014-01-01
Heavy, persistent rains from late February through March 2010 caused severe flooding that set, or nearly set, peaks of record for streamflows and water levels at many long-term U.S. Geological Survey streamgages in Rhode Island. In response to this flood, hydraulic models of Pawcatuck River (26.9 miles) and Wood River (11.6 miles) were updated from the most recent approved U.S. Department of Homeland Security-Federal Emergency Management Agency flood insurance study (FIS) to simulate water-surface elevations (WSEs) for specified flows and boundary conditions. The hydraulic models were updated to Hydrologic Engineering Center-River Analysis System (HEC-RAS) using steady-state simulations and incorporate new field-survey data at structures, high resolution land-surface elevation data, and updated flood flows from a related study. The models were used to simulate the 0.2-percent annual exceedance probability (AEP) flood, which is the AEP determined for the 2010 flood in the Pawcatuck and Wood Rivers. The simulated WSEs were compared to high-water mark (HWM) elevation data obtained in a related study following the March–April 2010 flood, which included 39 HWMs along the Pawcatuck River and 11 HWMs along the Wood River. The 2010 peak flow generally was larger than the 0.2-percent AEP flow, which, in part, resulted in the FIS and updated model WSEs to be lower than the 2010 HWMs. The 2010 HWMs for the Pawcatuck River averaged about 1.6 feet (ft) higher than the 0.2-percent AEP WSEs simulated in the updated model and 2.5 ft higher than the WSEs in the FIS. The 2010 HWMs for the Wood River averaged about 1.3 ft higher than the WSEs simulated in the updated model and 2.5 ft higher than the WSEs in the FIS. The improved agreement of the updated simulated water elevations to observed 2010 HWMs provides a measure of the hydraulic model performance, which indicates the updated models better represent flooding at other AEPs than the existing FIS models.
Computerized Adaptive Testing with R: Recent Updates of the Package catR
Directory of Open Access Journals (Sweden)
David Magis
2017-01-01
Full Text Available The purpose of this paper is to list the recent updates of the R package catR. This package allows for generating response patterns under a computerized adaptive testing (CAT framework with underlying item response theory (IRT models. Among the most important updates, well-known polytomous IRT models are now supported by catR; several item selection rules have been added; and it is now possible to perform post-hoc simulations. Some functions were also rewritten or withdrawn to improve the usefulness and performances of the package.
Remote collaboration system based on large scale simulation
International Nuclear Information System (INIS)
Kishimoto, Yasuaki; Sugahara, Akihiro; Li, J.Q.
2008-01-01
Large scale simulation using super-computer, which generally requires long CPU time and produces large amount of data, has been extensively studied as a third pillar in various advanced science fields in parallel to theory and experiment. Such a simulation is expected to lead new scientific discoveries through elucidation of various complex phenomena, which are hardly identified only by conventional theoretical and experimental approaches. In order to assist such large simulation studies for which many collaborators working at geographically different places participate and contribute, we have developed a unique remote collaboration system, referred to as SIMON (simulation monitoring system), which is based on client-server system control introducing an idea of up-date processing, contrary to that of widely used post-processing. As a key ingredient, we have developed a trigger method, which transmits various requests for the up-date processing from the simulation (client) running on a super-computer to a workstation (server). Namely, the simulation running on a super-computer actively controls the timing of up-date processing. The server that has received the requests from the ongoing simulation such as data transfer, data analyses, and visualizations, etc. starts operations according to the requests during the simulation. The server makes the latest results available to web browsers, so that the collaborators can monitor the results at any place and time in the world. By applying the system to a specific simulation project of laser-matter interaction, we have confirmed that the system works well and plays an important role as a collaboration platform on which many collaborators work with one another
The rubber band revisited: Wang–Landau simulation
International Nuclear Information System (INIS)
Ferreira, Lucas S; Caparica, Álvaro A; Neto, Minos A; Galiceanu, Mircea D
2012-01-01
In this work we apply Wang–Landau simulations to a simple model which has exact solutions both in the microcanonical and canonical formalisms. The simulations were carried out by using an updated version of the Wang–Landau sampling. We consider a homopolymer chain consisting of N monomers units which may assume any configuration on the two-dimensional lattice. By imposing constraints to the moves of the polymers we obtain three different models. Our results show that updating the density of states only after every N monomer moves leads to a better precision. We obtain the specific heat and the end-to-end distance per monomer and test the precision of our simulations by comparing the location of the maximum of the specific heat with the exact results and conventional Wang–Landau simulations for the three types of walk. (paper)
Timing Interactions in Social Simulations: The Voter Model
Fernández-Gracia, Juan; Eguíluz, Víctor M.; Miguel, Maxi San
The recent availability of huge high resolution datasets on human activities has revealed the heavy-tailed nature of the interevent time distributions. In social simulations of interacting agents the standard approach has been to use Poisson processes to update the state of the agents, which gives rise to very homogeneous activity patterns with a well defined characteristic interevent time. As a paradigmatic opinion model we investigate the voter model and review the standard update rules and propose two new update rules which are able to account for heterogeneous activity patterns. For the new update rules each node gets updated with a probability that depends on the time since the last event of the node, where an event can be an update attempt (exogenous update) or a change of state (endogenous update). We find that both update rules can give rise to power law interevent time distributions, although the endogenous one more robustly. Apart from that for the exogenous update rule and the standard update rules the voter model does not reach consensus in the infinite size limit, while for the endogenous update there exist a coarsening process that drives the system toward consensus configurations.
Full LCD detector simulation with GISMO
International Nuclear Information System (INIS)
Cassell, Ronald
2001-01-01
We present a status update of a full simulation package using GISMO. This package is a functioning tool producing simulation data for the two standard LCD detector designs, in a framework allowing easy changes to the detector designs. The simulation engine, GISMO, is separated from the application code, GISMOAPPS, to allow for a future upgrade to GEANT4 within the same framework
Lippert, Ross A.; Predescu, Cristian; Ierardi, Douglas J.; Mackenzie, Kenneth M.; Eastwood, Michael P.; Dror, Ron O.; Shaw, David E.
2013-10-01
In molecular dynamics simulations, control over temperature and pressure is typically achieved by augmenting the original system with additional dynamical variables to create a thermostat and a barostat, respectively. These variables generally evolve on timescales much longer than those of particle motion, but typical integrator implementations update the additional variables along with the particle positions and momenta at each time step. We present a framework that replaces the traditional integration procedure with separate barostat, thermostat, and Newtonian particle motion updates, allowing thermostat and barostat updates to be applied infrequently. Such infrequent updates provide a particularly substantial performance advantage for simulations parallelized across many computer processors, because thermostat and barostat updates typically require communication among all processors. Infrequent updates can also improve accuracy by alleviating certain sources of error associated with limited-precision arithmetic. In addition, separating the barostat, thermostat, and particle motion update steps reduces certain truncation errors, bringing the time-average pressure closer to its target value. Finally, this framework, which we have implemented on both general-purpose and special-purpose hardware, reduces software complexity and improves software modularity.
Dimension reduction of Karhunen-Loeve expansion for simulation of stochastic processes
Liu, Zhangjun; Liu, Zixin; Peng, Yongbo
2017-11-01
Conventional Karhunen-Loeve expansions for simulation of stochastic processes often encounter the challenge of dealing with hundreds of random variables. For breaking through the barrier, a random function embedded Karhunen-Loeve expansion method is proposed in this paper. The updated scheme has a similar form to the conventional Karhunen-Loeve expansion, both involving a summation of a series of deterministic orthonormal basis and uncorrelated random variables. While the difference from the updated scheme lies in the dimension reduction of Karhunen-Loeve expansion through introducing random functions as a conditional constraint upon uncorrelated random variables. The random function is expressed as a single-elementary-random-variable orthogonal function in polynomial format (non-Gaussian variables) or trigonometric format (non-Gaussian and Gaussian variables). For illustrative purposes, the simulation of seismic ground motion is carried out using the updated scheme. Numerical investigations reveal that the Karhunen-Loeve expansion with random functions could gain desirable simulation results in case of a moderate sample number, except the Hermite polynomials and the Laguerre polynomials. It has the sound applicability and efficiency in simulation of stochastic processes. Besides, the updated scheme has the benefit of integrating with probability density evolution method, readily for the stochastic analysis of nonlinear structures.
Optimization Model for Web Based Multimodal Interactive Simulations.
Halic, Tansel; Ahn, Woojin; De, Suvranu
2015-07-15
This paper presents a technique for optimizing the performance of web based multimodal interactive simulations. For such applications where visual quality and the performance of simulations directly influence user experience, overloading of hardware resources may result in unsatisfactory reduction in the quality of the simulation and user satisfaction. However, optimization of simulation performance on individual hardware platforms is not practical. Hence, we present a mixed integer programming model to optimize the performance of graphical rendering and simulation performance while satisfying application specific constraints. Our approach includes three distinct phases: identification, optimization and update . In the identification phase, the computing and rendering capabilities of the client device are evaluated using an exploratory proxy code. This data is utilized in conjunction with user specified design requirements in the optimization phase to ensure best possible computational resource allocation. The optimum solution is used for rendering (e.g. texture size, canvas resolution) and simulation parameters (e.g. simulation domain) in the update phase. Test results are presented on multiple hardware platforms with diverse computing and graphics capabilities to demonstrate the effectiveness of our approach.
a Bottom-Up Geosptial Data Update Mechanism for Spatial Data Infrastructure Updating
Tian, W.; Zhu, X.; Liu, Y.
2012-08-01
Currently, the top-down spatial data update mechanism has made a big progress and it is wildly applied in many SDI (spatial data infrastructure). However, this mechanism still has some issues. For example, the update schedule is limited by the professional department's project, usually which is too long for the end-user; the data form collection to public cost too much time and energy for professional department; the details of geospatial information does not provide sufficient attribute, etc. Thus, how to deal with the problems has become the effective shortcut. Emerging Internet technology, 3S technique and geographic information knowledge which is popular in the public promote the booming development of geoscience in volunteered geospatial information. Volunteered geospatial information is the current "hotspot", which attracts many researchers to study its data quality and credibility, accuracy, sustainability, social benefit, application and so on. In addition to this, a few scholars also pay attention to the value of VGI to support the SDI updating. And on that basis, this paper presents a bottom-up update mechanism form VGI to SDI, which includes the processes of match homonymous elements between VGI and SDI vector data , change data detection, SDI spatial database update and new data product publication to end-users. Then, the proposed updating cycle is deeply discussed about the feasibility of which can detect the changed elements in time and shorten the update period, provide more accurate geometry and attribute data for spatial data infrastructure and support update propagation.
Kessler, Yoav; Oberauer, Klaus
2014-01-01
Updating and maintenance of information are 2 conflicting demands on working memory (WM). We examined the time required to update WM (updating latency) as a function of the sequence of updated and not-updated items within a list. Participants held a list of items in WM and updated a variable subset of them in each trial. Four experiments that vary…
ECLOUD in PS2, PS+, SPS+: AN UPDATE
International Nuclear Information System (INIS)
Furman, M.A.
2007-01-01
We present an update of our results for the electron-cloud build-up for several upgrades proposed for the LHC injectors. Specifically, we have re-examined our published results for the ecloud heat load [1] from the perspective of numerical convergence of the simulations vis-a-vis the integration time step Δt. We repeated most of the simulations with ever smaller values of Δt until we reached stable results, indicating numerical convergence; this was achieved at 200-500 slices per bunch, depending on the particular case. In all cases examined, the simulated heat load decreases monotonically, until the limit is reached, as Δt decreases in the range explored, hence the stable results are more favorable vis-a-vis the heat load than the previous ones. This is particularly true for a bunch spacing t b = 25 ns
International Nuclear Information System (INIS)
Mueller, P.
1995-01-01
This talks describes updates in the following updates in FRMAC publications concerning radiation emergencies: Monitoring and Analysis Manual; Evaluation and Assessment Manual; Handshake Series (Biannual) including exercises participated in; environmental Data and Instrument Transmission System (EDITS); Plume in a Box with all radiological data stored onto a hand-held computer; and courses given
Lu, Xiaoman; Zheng, Guang; Miller, Colton; Alvarado, Ernesto
2017-09-08
Monitoring and understanding the spatio-temporal variations of forest aboveground biomass (AGB) is a key basis to quantitatively assess the carbon sequestration capacity of a forest ecosystem. To map and update forest AGB in the Greater Khingan Mountains (GKM) of China, this work proposes a physical-based approach. Based on the baseline forest AGB from Landsat Enhanced Thematic Mapper Plus (ETM+) images in 2008, we dynamically updated the annual forest AGB from 2009 to 2012 by adding the annual AGB increment (ABI) obtained from the simulated daily and annual net primary productivity (NPP) using the Boreal Ecosystem Productivity Simulator (BEPS) model. The 2012 result was validated by both field- and aerial laser scanning (ALS)-based AGBs. The predicted forest AGB for 2012 estimated from the process-based model can explain 31% ( n = 35, p forest AGBs, respectively. However, due to the saturation of optical remote sensing-based spectral signals and contribution of understory vegetation, the BEPS-based AGB tended to underestimate/overestimate the AGB for dense/sparse forests. Generally, our results showed that the remotely sensed forest AGB estimates could serve as the initial carbon pool to parameterize the process-based model for NPP simulation, and the combination of the baseline forest AGB and BEPS model could effectively update the spatiotemporal distribution of forest AGB.
Finite Difference Time Domain (FDTD) Simulations Using Graphics Processors
National Research Council Canada - National Science Library
Adams, Samuel; Payne, Jason; Boppana, Rajendra
2007-01-01
.... This paper shows how GPUs can be used to greatly speedup FDTD simulations. The main objective is to leverage GPU processing power for FDTD update calculations and complete computationally expensive simulations in reasonable time...
Photovoltaic Shading Testbed for Module-Level Power Electronics: 2016 Performance Data Update
Energy Technology Data Exchange (ETDEWEB)
Deline, Chris [National Renewable Energy Lab. (NREL), Golden, CO (United States); Meydbray, Jenya [PV Evolution Labs (PVEL), Davis, CA (United States); Donovan, Matt [PV Evolution Labs (PVEL), Davis, CA (United States)
2016-09-01
The 2012 NREL report 'Photovoltaic Shading Testbed for Module-Level Power Electronics' provides a standard methodology for estimating the performance benefit of distributed power electronics under partial shading conditions. Since the release of the report, experiments have been conducted for a number of products and for different system configurations. Drawing from these experiences, updates to the test and analysis methods are recommended. Proposed changes in data processing have the benefit of reducing the sensitivity to measurement errors and weather variability, as well as bringing the updated performance score in line with measured and simulated values of the shade recovery benefit of distributed PV power electronics. Also, due to the emergence of new technologies including sub-module embedded power electronics, the shading method has been extended to include power electronics that operate at a finer granularity than the module level. An update to the method is proposed to account for these emerging technologies that respond to shading differently than module-level devices. The partial shading test remains a repeatable test procedure that attempts to simulate shading situations as would be experienced by typical residential or commercial rooftop photovoltaic (PV) systems. Performance data for multiple products tested using this method are discussed, based on equipment from Enphase, Solar Edge, Maxim Integrated and SMA. In general, the annual recovery of shading losses from the module-level electronics evaluated is 25-35%, with the major difference between different trials being related to the number of parallel strings in the test installation rather than differences between the equipment tested. Appendix D data has been added in this update.
Real-time numerical shake prediction and updating for earthquake early warning
Wang, Tianyun; Jin, Xing; Wei, Yongxiang; Huang, Yandan
2017-12-01
Ground motion prediction is important for earthquake early warning systems, because the region's peak ground motion indicates the potential disaster. In order to predict the peak ground motion quickly and precisely with limited station wave records, we propose a real-time numerical shake prediction and updating method. Our method first predicts the ground motion based on the ground motion prediction equation after P waves detection of several stations, denoted as the initial prediction. In order to correct the prediction error of the initial prediction, an updating scheme based on real-time simulation of wave propagation is designed. Data assimilation technique is incorporated to predict the distribution of seismic wave energy precisely. Radiative transfer theory and Monte Carlo simulation are used for modeling wave propagation in 2-D space, and the peak ground motion is calculated as quickly as possible. Our method has potential to predict shakemap, making the potential disaster be predicted before the real disaster happens. 2008 M S8.0 Wenchuan earthquake is studied as an example to show the validity of the proposed method.
A last updating evolution model for online social networks
Bu, Zhan; Xia, Zhengyou; Wang, Jiandong; Zhang, Chengcui
2013-05-01
As information technology has advanced, people are turning to electronic media more frequently for communication, and social relationships are increasingly found on online channels. However, there is very limited knowledge about the actual evolution of the online social networks. In this paper, we propose and study a novel evolution network model with the new concept of “last updating time”, which exists in many real-life online social networks. The last updating evolution network model can maintain the robustness of scale-free networks and can improve the network reliance against intentional attacks. What is more, we also found that it has the “small-world effect”, which is the inherent property of most social networks. Simulation experiment based on this model show that the results and the real-life data are consistent, which means that our model is valid.
Memory updating and mental arithmetic
Directory of Open Access Journals (Sweden)
Cheng-Ching eHan
2016-02-01
Full Text Available Is domain-general memory updating ability predictive of calculation skills or are such skills better predicted by the capacity for updating specifically numerical information? Here, we used multidigit mental multiplication (MMM as a measure for calculating skill as this operation requires the accurate maintenance and updating of information in addition to skills needed for arithmetic more generally. In Experiment 1, we found that only individual differences with regard to a task updating numerical information following addition (MUcalc could predict the performance of MMM, perhaps owing to common elements between the task and MMM. In Experiment 2, new updating tasks were designed to clarify this: a spatial updating task with no numbers, a numerical task with no calculation, and a word task. The results showed that both MUcalc and the spatial task were able to predict the performance of MMM but only with the more difficult problems, while other updating tasks did not predict performance. It is concluded that relevant processes involved in updating the contents of working memory support mental arithmetic in adults.
Nuclear power plant diagnostics study at the Midland Training Simulator
International Nuclear Information System (INIS)
Reifman, J.; Rank, P.; Lee, J.C.; Wehe, D.K.
1991-01-01
This paper discusses the implementation of two advanced diagnostic concepts for nuclear power plant diagnostics, the systematic generation and updating of a rule-based system and the simulation filter, at the Midland Nuclear Power Plant Unit 2 Training Simulator. The authors use an entropy minimax pattern recognition algorithm for the systematic construction of the diagnostic rule base. By extracting information from a transient database constructed with the Midland Simulator, the algorithm searches for trends in plant parameters, forming patterns or rules that describe the behavior of the transients. The rules are updated in an incremental manner within the context of the entropy minimax algorithm. The simulation filter is a nonlinear parameter estimation algorithm based on the extended Kalman filter. The authors use the simulation filter to improve the results of crude simulation models by optimally estimating system states given a set of measurements and results from a nonlinear simulation program. The Midland Simulator results of the Three Mile Island accident are significantly improved with the use of the simulation filter
Energy Technology Data Exchange (ETDEWEB)
Cort, Katherine A.; Belzer, David B.; Winiarski, David W.; Richman, Eric E.
2004-04-30
The state of North Dakota is considering updating its commercial building energy code. This report evaluates the potential costs and benefits to North Dakota residents from updating and requiring compliance with ASHRAE Standard 90.1-2001. Both qualitative and quantitative benefits and costs are assessed in the analysis. Energy and economic impacts are estimated using the Building Loads Analysis and System Thermodynamics (BLAST simulation combined with a Life-cycle Cost (LCC) approach to assess correspodning economic costs and benefits.
Self-shielding models of MICROX-2 code: Review and updates
International Nuclear Information System (INIS)
Hou, J.; Choi, H.; Ivanov, K.N.
2014-01-01
Highlights: • The MICROX-2 code has been improved to expand its application to advanced reactors. • New fine-group cross section libraries based on ENDF/B-VII have been generated. • Resonance self-shielding and spatial self-shielding models have been improved. • The improvements were assessed by a series of benchmark calculations against MCNPX. - Abstract: The MICROX-2 is a transport theory code that solves for the neutron slowing-down and thermalization equations of a two-region lattice cell. The MICROX-2 code has been updated to expand its application to advanced reactor concepts and fuel cycle simulations, including generation of new fine-group cross section libraries based on ENDF/B-VII. In continuation of previous work, the MICROX-2 methods are reviewed and updated in this study, focusing on its resonance self-shielding and spatial self-shielding models for neutron spectrum calculations. The improvement of self-shielding method was assessed by a series of benchmark calculations against the Monte Carlo code, using homogeneous and heterogeneous pin cell models. The results have shown that the implementation of the updated self-shielding models is correct and the accuracy of physics calculation is improved. Compared to the existing models, the updates reduced the prediction error of the infinite multiplication factor by ∼0.1% and ∼0.2% for the homogeneous and heterogeneous pin cell models, respectively, considered in this study
A Model for Capturing Team Adaptation in Simulated Emergencies
DEFF Research Database (Denmark)
Paltved, Charlotte; Musaeus, Peter
2013-01-01
and conceptualizes team processes through recursive cycles of updates. In the 29 simulation scenarios, 94 updates were recorded. There were between 0 and 8 updates per scenario (mean 3,2). Level five was achieved in 13 scenarios, level four in 8 scenarios and finally, level two and three were achieved in four...... is required to meaningfully account for communication exchanges in context. As such, this theoretical framework might provide a vocabulary for operationalizing the differences between "effective and ineffective" communication. Moving beyond counting communication events or the frequency of certain...
Gantt, B.; Kelly, J. T.; Bash, J. O.
2015-11-01
Sea spray aerosols (SSAs) impact the particle mass concentration and gas-particle partitioning in coastal environments, with implications for human and ecosystem health. Model evaluations of SSA emissions have mainly focused on the global scale, but regional-scale evaluations are also important due to the localized impact of SSAs on atmospheric chemistry near the coast. In this study, SSA emissions in the Community Multiscale Air Quality (CMAQ) model were updated to enhance the fine-mode size distribution, include sea surface temperature (SST) dependency, and reduce surf-enhanced emissions. Predictions from the updated CMAQ model and those of the previous release version, CMAQv5.0.2, were evaluated using several coastal and national observational data sets in the continental US. The updated emissions generally reduced model underestimates of sodium, chloride, and nitrate surface concentrations for coastal sites in the Bay Regional Atmospheric Chemistry Experiment (BRACE) near Tampa, Florida. Including SST dependency to the SSA emission parameterization led to increased sodium concentrations in the southeastern US and decreased concentrations along parts of the Pacific coast and northeastern US. The influence of sodium on the gas-particle partitioning of nitrate resulted in higher nitrate particle concentrations in many coastal urban areas due to increased condensation of nitric acid in the updated simulations, potentially affecting the predicted nitrogen deposition in sensitive ecosystems. Application of the updated SSA emissions to the California Research at the Nexus of Air Quality and Climate Change (CalNex) study period resulted in a modest improvement in the predicted surface concentration of sodium and nitrate at several central and southern California coastal sites. This update of SSA emissions enabled a more realistic simulation of the atmospheric chemistry in coastal environments where marine air mixes with urban pollution.
Update of CERN exchange network
2003-01-01
An update of the CERN exchange network will be done next April. Disturbances or even interruptions of telephony services may occur from 4th to 24th April during evenings from 18:30 to 00:00 but will not exceed more than 4 consecutive hours (see tentative planning below). CERN divisions are invited to avoid any change requests (set-ups, move or removals) of telephones and fax machines from 4th to 25th April. Everything will be done to minimize potential inconveniences which may occur during this update. There will be no loss of telephone functionalities. CERN GSM portable phones won't be affected by this change. Should you need more details, please send us your questions by email to Standard.Telephone@cern.ch. DateChange typeAffected areas April 11 Update of switch in LHC 4 LHC 4 Point April 14 Update of switch in LHC 5 LHC 5 Point April 15 Update of switches in LHC 3 and LHC 2 Points LHC 3 and LHC 2 April 22 Update of switch N4 Meyrin Ouest April 23 Update of switch N6 Prévessin Site Ap...
DEFF Research Database (Denmark)
Hansen, Lisbet Sneftrup; Borup, Morten; Moller, Arne
2014-01-01
drainage models and reduce a number of unavoidable discrepancies between the model and reality. The latter can be achieved partly by inserting measured water levels from the sewer system into the model. This article describes how deterministic updating of model states in this manner affects a simulation...
International Nuclear Information System (INIS)
Liu Yang; Xu Dejian; Li Yan; Duan Zhongdong
2011-01-01
As a novel updating technique, cross-model cross-mode (CMCM) method possesses a high efficiency and capability of flexible selecting updating parameters. However, the success of this method depends on the accuracy of measured modal shapes. Usually, the measured modal shapes are inaccurate since many kinds of measured noises are inevitable. Furthermore, the complete testing modal shapes are required by CMCM method so that the calculating errors may be introduced into the measured modal shapes by conducting the modal expansion or model reduction technique. Therefore, this algorithm is faced with the challenge of updating the finite element (FE) model of practical complex structures. In this study, the fuzzy CMCM method is proposed in order to weaken the effect of errors of the measured modal shapes on the updated results. Then two simulated examples are applied to compare the performance of the fuzzy CMCM method with the CMCM method. The test results show that proposed method is more promising to update the FE model of practical structures than CMCM method.
Gating based on internal/external signals with dynamic correlation updates
International Nuclear Information System (INIS)
Wu Huanmei; Zhao Qingya; Berbeco, Ross I; Nishioka, Seiko; Shirato, Hiroki; Jiang, Steve B
2008-01-01
Precise localization of mobile tumor positions in real time is critical to the success of gated radiotherapy. Tumor positions are usually derived from either internal or external surrogates. Fluoroscopic gating based on internal surrogates, such as implanted fiducial markers, is accurate however requiring a large amount of imaging dose. Gating based on external surrogates, such as patient abdominal surface motion, is non-invasive however less accurate due to the uncertainty in the correlation between tumor location and external surrogates. To address these complications, we propose to investigate an approach based on hybrid gating with dynamic internal/external correlation updates. In this approach, the external signal is acquired at high frequency (such as 30 Hz) while the internal signal is sparsely acquired (such as 0.5 Hz or less). The internal signal is used to validate and update the internal/external correlation during treatment. Tumor positions are derived from the external signal based on the newly updated correlation. Two dynamic correlation updating algorithms are introduced. One is based on the motion amplitude and the other is based on the motion phase. Nine patients with synchronized internal/external motion signals are simulated retrospectively to evaluate the effectiveness of hybrid gating. The influences of different clinical conditions on hybrid gating, such as the size of gating windows, the optimal timing for internal signal acquisition and the acquisition frequency are investigated. The results demonstrate that dynamically updating the internal/external correlation in or around the gating window will reduce false positive with relatively diminished treatment efficiency. This improvement will benefit patients with mobile tumors, especially greater for early stage lung cancers, for which the tumors are less attached or freely floating in the lung.
Gating based on internal/external signals with dynamic correlation updates
Energy Technology Data Exchange (ETDEWEB)
Wu Huanmei [Purdue School of Engineering and Technology, Indiana University School of Informatics, IUPUI, Indianapolis, IN (United States); Zhao Qingya [School of Health Sciences, Purdue University, West Lafayette, IN (United States); Berbeco, Ross I [Department of Radiation Oncology, Dana-Farber/Brigham and Womens Cancer Center and Harvard Medical School, Boston, MA (United States); Nishioka, Seiko [NTT East-Japan Sapporo Hospital, Sapporo (Japan); Shirato, Hiroki [Hokkaido University Graduate School of Medicine, Sapporo (Japan); Jiang, Steve B [Department of Radiation Oncology, School of Medicine, University of California, San Diego, CA (United States)], E-mail: hw9@iupui.edu, E-mail: sbjiang@ucsd.edu
2008-12-21
Precise localization of mobile tumor positions in real time is critical to the success of gated radiotherapy. Tumor positions are usually derived from either internal or external surrogates. Fluoroscopic gating based on internal surrogates, such as implanted fiducial markers, is accurate however requiring a large amount of imaging dose. Gating based on external surrogates, such as patient abdominal surface motion, is non-invasive however less accurate due to the uncertainty in the correlation between tumor location and external surrogates. To address these complications, we propose to investigate an approach based on hybrid gating with dynamic internal/external correlation updates. In this approach, the external signal is acquired at high frequency (such as 30 Hz) while the internal signal is sparsely acquired (such as 0.5 Hz or less). The internal signal is used to validate and update the internal/external correlation during treatment. Tumor positions are derived from the external signal based on the newly updated correlation. Two dynamic correlation updating algorithms are introduced. One is based on the motion amplitude and the other is based on the motion phase. Nine patients with synchronized internal/external motion signals are simulated retrospectively to evaluate the effectiveness of hybrid gating. The influences of different clinical conditions on hybrid gating, such as the size of gating windows, the optimal timing for internal signal acquisition and the acquisition frequency are investigated. The results demonstrate that dynamically updating the internal/external correlation in or around the gating window will reduce false positive with relatively diminished treatment efficiency. This improvement will benefit patients with mobile tumors, especially greater for early stage lung cancers, for which the tumors are less attached or freely floating in the lung.
Updating Recursive XML Views of Relations
DEFF Research Database (Denmark)
Choi, Byron; Cong, Gao; Fan, Wenfei
2009-01-01
This paper investigates the view update problem for XML views published from relational data. We consider XML views defined in terms of mappings directed by possibly recursive DTDs compressed into DAGs and stored in relations. We provide new techniques to efficiently support XML view updates...... specified in terms of XPath expressions with recursion and complex filters. The interaction between XPath recursion and DAG compression of XML views makes the analysis of the XML view update problem rather intriguing. Furthermore, many issues are still open even for relational view updates, and need...... to be explored. In response to these, on the XML side, we revise the notion of side effects and update semantics based on the semantics of XML views, and present effecient algorithms to translate XML updates to relational view updates. On the relational side, we propose a mild condition on SPJ views, and show...
Dissociating Working Memory Updating and Automatic Updating: The Reference-Back Paradigm
Rac-Lubashevsky, Rachel; Kessler, Yoav
2016-01-01
Working memory (WM) updating is a controlled process through which relevant information in the environment is selected to enter the gate to WM and substitute its contents. We suggest that there is also an automatic form of updating, which influences performance in many tasks and is primarily manifested in reaction time sequential effects. The goal…
ALICES: an advanced object-oriented software workshop for simulators
International Nuclear Information System (INIS)
Sayet, R.L.; Rouault, G.; Pieroux, D.; Houte, U. Van
1999-01-01
Reducing simulator development costs while improving model quality, user-friendliness and teaching capabilities, is a major target for many years in the simulation industry. It has led to the development of specific software tools which have been improved progressively following the new features and capabilities offered by the software industry. Unlike most of these software tools, ALICES (which is a French acronym for 'Interactive Software Workshop for the Design of Simulators') is not an upgrade of a previous generation of tools, like putting a graphical front-end to a classical code generator, but a really new development. Its design specification is based on previous experience with different tools as well as on new capabilities of software technology, mainly in Object Oriented Design. This allowed us to make a real technological 'jump' in the simulation industry, beyond the constraints of some traditional approaches. The main objectives behind the development of ALICES were the following: (1) Minimizing the simulator development time and costs: a simulator development consists mainly in developing software. One way to reduce costs is to facilitate reuse of existing software by developing standard components, and by defining interface standards, (2) Insuring that the produced simulator can be maintained and updated at a minimal cost: a simulator must evolve along with the simulated process, and it is then necessary to update periodically the simulator. The cost of an adequate maintenance is highly dependent of the quality of the software workshop, (3) Covering the whole simulator development process: from the data package to the acceptance tests and for maintenance and upgrade activities; with the whole development team, even if it is dispatched at different working sites; respecting the Quality Assurance rules and procedures (CORYS T.E.S.S. and TRACTEBEL are ISO-9001 certified). The development of ALICES was also done to comply with the following two main
Delayed Slater determinant update algorithms for high efficiency quantum Monte Carlo
McDaniel, T.; D'Azevedo, E. F.; Li, Y. W.; Wong, K.; Kent, P. R. C.
2017-11-01
Within ab initio Quantum Monte Carlo simulations, the leading numerical cost for large systems is the computation of the values of the Slater determinants in the trial wavefunction. Each Monte Carlo step requires finding the determinant of a dense matrix. This is most commonly iteratively evaluated using a rank-1 Sherman-Morrison updating scheme to avoid repeated explicit calculation of the inverse. The overall computational cost is, therefore, formally cubic in the number of electrons or matrix size. To improve the numerical efficiency of this procedure, we propose a novel multiple rank delayed update scheme. This strategy enables probability evaluation with an application of accepted moves to the matrices delayed until after a predetermined number of moves, K. The accepted events are then applied to the matrices en bloc with enhanced arithmetic intensity and computational efficiency via matrix-matrix operations instead of matrix-vector operations. This procedure does not change the underlying Monte Carlo sampling or its statistical efficiency. For calculations on large systems and algorithms such as diffusion Monte Carlo, where the acceptance ratio is high, order of magnitude improvements in the update time can be obtained on both multi-core central processing units and graphical processing units.
A Class of Manifold Regularized Multiplicative Update Algorithms for Image Clustering.
Yang, Shangming; Yi, Zhang; He, Xiaofei; Li, Xuelong
2015-12-01
Multiplicative update algorithms are important tools for information retrieval, image processing, and pattern recognition. However, when the graph regularization is added to the cost function, different classes of sample data may be mapped to the same subspace, which leads to the increase of data clustering error rate. In this paper, an improved nonnegative matrix factorization (NMF) cost function is introduced. Based on the cost function, a class of novel graph regularized NMF algorithms is developed, which results in a class of extended multiplicative update algorithms with manifold structure regularization. Analysis shows that in the learning, the proposed algorithms can efficiently minimize the rank of the data representation matrix. Theoretical results presented in this paper are confirmed by simulations. For different initializations and data sets, variation curves of cost functions and decomposition data are presented to show the convergence features of the proposed update rules. Basis images, reconstructed images, and clustering results are utilized to present the efficiency of the new algorithms. Last, the clustering accuracies of different algorithms are also investigated, which shows that the proposed algorithms can achieve state-of-the-art performance in applications of image clustering.
Guo, Ning; Yang, Zhichun; Wang, Le; Ouyang, Yan; Zhang, Xinping
2018-05-01
Aiming at providing a precise dynamic structural finite element (FE) model for dynamic strength evaluation in addition to dynamic analysis. A dynamic FE model updating method is presented to correct the uncertain parameters of the FE model of a structure using strain mode shapes and natural frequencies. The strain mode shape, which is sensitive to local changes in structure, is used instead of the displacement mode for enhancing model updating. The coordinate strain modal assurance criterion is developed to evaluate the correlation level at each coordinate over the experimental and the analytical strain mode shapes. Moreover, the natural frequencies which provide the global information of the structure are used to guarantee the accuracy of modal properties of the global model. Then, the weighted summation of the natural frequency residual and the coordinate strain modal assurance criterion residual is used as the objective function in the proposed dynamic FE model updating procedure. The hybrid genetic/pattern-search optimization algorithm is adopted to perform the dynamic FE model updating procedure. Numerical simulation and model updating experiment for a clamped-clamped beam are performed to validate the feasibility and effectiveness of the present method. The results show that the proposed method can be used to update the uncertain parameters with good robustness. And the updated dynamic FE model of the beam structure, which can correctly predict both the natural frequencies and the local dynamic strains, is reliable for the following dynamic analysis and dynamic strength evaluation.
Updating systematic reviews: an international survey.
Directory of Open Access Journals (Sweden)
Chantelle Garritty
Full Text Available BACKGROUND: Systematic reviews (SRs should be up to date to maintain their importance in informing healthcare policy and practice. However, little guidance is available about when and how to update SRs. Moreover, the updating policies and practices of organizations that commission or produce SRs are unclear. METHODOLOGY/PRINCIPAL FINDINGS: The objective was to describe the updating practices and policies of agencies that sponsor or conduct SRs. An Internet-based survey was administered to a purposive non-random sample of 195 healthcare organizations within the international SR community. Survey results were analyzed using descriptive statistics. The completed response rate was 58% (n = 114 from across 26 countries with 70% (75/107 of participants identified as producers of SRs. Among responders, 79% (84/107 characterized the importance of updating as high or very-high and 57% (60/106 of organizations reported to have a formal policy for updating. However, only 29% (35/106 of organizations made reference to a written policy document. Several groups (62/105; 59% reported updating practices as irregular, and over half (53/103 of organizational respondents estimated that more than 50% of their respective SRs were likely out of date. Authors of the original SR (42/106; 40% were most often deemed responsible for ensuring SRs were current. Barriers to updating included resource constraints, reviewer motivation, lack of academic credit, and limited publishing formats. Most respondents (70/100; 70% indicated that they supported centralization of updating efforts across institutions or agencies. Furthermore, 84% (83/99 of respondents indicated they favoured the development of a central registry of SRs, analogous to efforts within the clinical trials community. CONCLUSIONS/SIGNIFICANCE: Most organizations that sponsor and/or carry out SRs consider updating important. Despite this recognition, updating practices are not regular, and many organizations lack
Zarriello, Phillip J.; Olson, Scott A.; Flynn, Robert H.; Strauch, Kellan R.; Murphy, Elizabeth A.
2014-01-01
Heavy, persistent rains from late February through March 2010 caused severe flooding that set, or nearly set, peaks of record for streamflows and water levels at many long-term streamgages in Rhode Island. In response to this event, hydraulic models were updated for selected reaches covering about 56 river miles in the Pawtuxet River Basin to simulate water-surface elevations (WSEs) at specified flows and boundary conditions. Reaches modeled included the main stem of the Pawtuxet River, the North and South Branches of the Pawtuxet River, Pocasset River, Simmons Brook, Dry Brook, Meshanticut Brook, Furnace Hill Brook, Flat River, Quidneck Brook, and two unnamed tributaries referred to as South Branch Pawtuxet River Tributary A1 and Tributary A2. All the hydraulic models were updated to Hydrologic Engineering Center-River Analysis System (HEC-RAS) version 4.1.0 using steady-state simulations. Updates to the models included incorporation of new field-survey data at structures, high resolution land-surface elevation data, and updated flood flows from a related study. The models were assessed using high-water marks (HWMs) obtained in a related study following the March– April 2010 flood and the simulated water levels at the 0.2-percent annual exceedance probability (AEP), which is the estimated AEP of the 2010 flood in the basin. HWMs were obtained at 110 sites along the main stem of the Pawtuxet River, the North and South Branches of the Pawtuxet River, Pocasset River, Simmons Brook, Furnace Hill Brook, Flat River, and Quidneck Brook. Differences between the 2010 HWM elevations and the simulated 0.2-percent AEP WSEs from flood insurance studies (FISs) and the updated models developed in this study varied with most differences attributed to the magnitude of the 0.2-percent AEP flows. WSEs from the updated models generally are in closer agreement with the observed 2010 HWMs than with the FIS WSEs. The improved agreement of the updated simulated water elevations to
Grey Forecast Rainfall with Flow Updating Algorithm for Real-Time Flood Forecasting
Directory of Open Access Journals (Sweden)
Jui-Yi Ho
2015-04-01
Full Text Available The dynamic relationship between watershed characteristics and rainfall-runoff has been widely studied in recent decades. Since watershed rainfall-runoff is a non-stationary process, most deterministic flood forecasting approaches are ineffective without the assistance of adaptive algorithms. The purpose of this paper is to propose an effective flow forecasting system that integrates a rainfall forecasting model, watershed runoff model, and real-time updating algorithm. This study adopted a grey rainfall forecasting technique, based on existing hourly rainfall data. A geomorphology-based runoff model can be used for simulating impacts of the changing geo-climatic conditions on the hydrologic response of unsteady and non-linear watershed system, and flow updating algorithm were combined to estimate watershed runoff according to measured flow data. The proposed flood forecasting system was applied to three watersheds; one in the United States and two in Northern Taiwan. Four sets of rainfall-runoff simulations were performed to test the accuracy of the proposed flow forecasting technique. The results indicated that the forecast and observed hydrographs are in good agreement for all three watersheds. The proposed flow forecasting system could assist authorities in minimizing loss of life and property during flood events.
Quantifying Update Effects in Citizen-Oriented Software
Directory of Open Access Journals (Sweden)
Ion Ivan
2009-02-01
Full Text Available Defining citizen-oriented software. Detailing technical issues regarding update process in this kind of software. Presenting different effects triggered by types of update. Building model for update costs estimation, including producer-side and consumer-side effects. Analyzing model applicability on INVMAT – large scale matrix inversion software. Proposing a model for update effects estimation. Specifying ways for softening effects of inaccurate updates.
Are Forecast Updates Progressive?
C-L. Chang (Chia-Lin); Ph.H.B.F. Franses (Philip Hans); M.J. McAleer (Michael)
2010-01-01
textabstractMacro-economic forecasts typically involve both a model component, which is replicable, as well as intuition, which is non-replicable. Intuition is expert knowledge possessed by a forecaster. If forecast updates are progressive, forecast updates should become more accurate, on average,
Online updating procedures for a real-time hydrological forecasting system
International Nuclear Information System (INIS)
Kahl, B; Nachtnebel, H P
2008-01-01
Rainfall-runoff-models can explain major parts of the natural runoff pattern but never simulate the observed hydrograph exactly. Reasons for errors are various sources of uncertainties embedded in the model forecasting system. Errors are due to measurement errors, the selected time period for calibration and validation, the parametric uncertainty and the model imprecision. In on-line forecasting systems forecasted input data is used which additionally generates a major uncertainty for the hydrological forecasting system. Techniques for partially compensating these uncertainties are investigated in the recent study in a medium sized catchment in the Austrian part of the Danube basin. The catchment area is about 1000 km2. The forecasting system consists of a semi-distributed continuous rainfall-runoff model that uses quantitative precipitation and temperature forecasts. To provide adequate system states at the beginning of the forecasting period continuous simulation is required, especially in winter. In this study two online updating methods are used and combined for enhancing the runoff forecasts. The first method is used for updating the system states at the beginning of the forecasting period by changing the precipitation input. The second method is an autoregressive error model, which is used to eliminate systematic errors in the model output. In combination those two methods work together well as each method is more effective in different runoff situations.
Annual Report: Carbon Capture Simulation Initiative (CCSI) (30 September 2013)
Energy Technology Data Exchange (ETDEWEB)
Miller, David C. [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Syamlal, Madhava [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Cottrell, Roger [URS Corporation. (URS), San Francisco, CA (United States); National Energy Technology Lab. (NETL), Morgantown, WV (United States); Kress, Joel D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sundaresan, S. [Princeton Univ., NJ (United States); Sun, Xin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Storlie, C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bhattacharyya, D. [West Virginia Univ., Morgantown, WV (United States); National Energy Technology Lab. (NETL), Morgantown, WV (United States); Tong, Charles [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Zitney, Stephen E [National Energy Technology Lab. (NETL), Pittsburgh, PA, (United States); Dale, Crystal [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Engel, Dave [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Agarwal, Deb [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Calafiura, Paolo [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Shinn, John [SynPatEco, Pleasant Hill, CA (United States)
2013-09-30
Virginia University, Boston University and the University of Texas at Austin) bring unparalleled expertise in multiphase flow reactors, combustion, process synthesis and optimization, planning and scheduling, and process control techniques for energy processes. During Fiscal Year (FY) 13, CCSI announced the initial release of its first set of computational tools and models during the October 2012 meeting of its Industry Advisory Board. This initial release led to five companies licensing the CCSI Toolset under a Test and Evaluation Agreement this year. By the end of FY13, the CCSI Technical Team had completed development of an updated suite of computational tools and models. The list below summarizes the new and enhanced toolset components that were released following comprehensive testing during October 2013. 1. FOQUS. Framework for Optimization and Quantification of Uncertainty and Sensitivity. Package includes: FOQUS Graphic User Interface (GUI), simulation-based optimization engine, Turbine Client, and heat integration capabilities. There is also an updated simulation interface and new configuration GUI for connecting Aspen Plus or Aspen Custom Modeler (ACM) simulations to FOQUS and the Turbine Science Gateway. 2. A new MFIX-based Computational Fluid Dynamics (CFD) model to predict particle attrition. 3. A new dynamic reduced model (RM) builder, which generates computationally efficient RMs of the behavior of a dynamic system. 4. A completely re-written version of the algebraic surrogate model builder for optimization (ALAMO). The new version is several orders of magnitude faster than the initial release and eliminates the MATLAB dependency. 5. A new suite of high resolution filtered models for the hydrodynamics associated with horizontal cylindrical objects in a flow path. 6. The new Turbine Science Gateway (Cluster), which supports FOQUS for running multiple simulations for optimization or UQ using a local computer or cluster. 7. A new statistical tool (BSS
Pankatz, K.; Kerkweg, A.
2014-12-01
The work presented is part of the joint project "DecReg" ("Regional decadal predictability") which is in turn part of the project "MiKlip" ("Decadal predictions"), an effort funded by the german Federal Ministry of Education and Research to improve decadal predictions on a global and regional scale. In regional climate modeling it is common to update the lateral boundary conditions (LBC) of the regional model every six hours. This is mainly due to the fact, that reference data sets like ERA are only available every six hours. Additionally, for offline coupling procedures it would be too costly to store LBC data in higher temporal resolution for climate simulations. However, theoretically, the coupling frequency could be as high as the time step of the driving model. Meanwhile, it is unclear if a more frequent update of the LBC has a significant effect on the climate in the domain of the regional model (RCM). This study uses the RCM COSMO-CLM/MESSy (Kerkweg and Jöckel, 2012) to couple COSMO-CLM offline to the GCM ECHAM5. One study examines a 30 year time slice experiment for three update frequencies of the LBC, namely six hours, one hour and six minutes. The evaluation of means, standard deviations and statistics of the climate in regional domain shows only small deviations, some stastically significant though, of 2m temperature, sea level pressure and precipitaion.The second scope of the study assesses parameters linked to cyclone activity, which is affected by the LBC update frequency. Differences in track density and strength are found when comparing the simulations.The second study examines the quality of decadal hind-casts of the decade 2001-2010 when the horizontal resolution of the driving model, namely T42, T63, T85, T106, from which the LBC are calculated, is altered. Two sets of simulations are evaluated. For the first set of simulations, the GCM simulations are performed at different resolutions using the same boundary conditions for GHGs and SSTs, thus
International Nuclear Information System (INIS)
Xia Chengyi; Wang Lei; Wang Jinsong; Wang Juan
2012-01-01
We combine the Fermi and Moran update rules in the spatial prisoner's dilemma and snowdrift games to investigate the behavior of collective cooperation among agents on the regular lattice. Large-scale simulations indicate that, compared to the model with only one update rule, the cooperation behavior exhibits the richer phenomena, and the role of update dynamics should be paid more attention in the evolutionary game theory. Meanwhile, we also observe that the introduction of Moran rule, which needs to consider all neighbor's information, can markedly promote the aggregate cooperation level, that is, randomly selecting the neighbor proportional to its payoff to imitate will facilitate the cooperation among agents. Current results will contribute to further understand the cooperation dynamics and evolutionary behaviors within many biological, economic and social systems.
Xia, Cheng-Yi; Wang, Lei; Wang, Juan; Wang, Jin-Song
2012-09-01
We combine the Fermi and Moran update rules in the spatial prisoner's dilemma and snowdrift games to investigate the behavior of collective cooperation among agents on the regular lattice. Large-scale simulations indicate that, compared to the model with only one update rule, the cooperation behavior exhibits the richer phenomena, and the role of update dynamics should be paid more attention in the evolutionary game theory. Meanwhile, we also observe that the introduction of Moran rule, which needs to consider all neighbor's information, can markedly promote the aggregate cooperation level, that is, randomly selecting the neighbor proportional to its payoff to imitate will facilitate the cooperation among agents. Current results will contribute to further understand the cooperation dynamics and evolutionary behaviors within many biological, economic and social systems.
Updating of working memory: lingering bindings.
Oberauer, Klaus; Vockenberg, Kerstin
2009-05-01
Three experiments investigated proactive interference and proactive facilitation in a memory-updating paradigm. Participants remembered several letters or spatial patterns, distinguished by their spatial positions, and updated them by new stimuli up to 20 times per trial. Self-paced updating times were shorter when an item previously remembered and then replaced reappeared in the same location than when it reappeared in a different location. This effect demonstrates residual memory for no-longer-relevant bindings of items to locations. The effect increased with the number of items to be remembered. With one exception, updating times did not increase, and recall of final values did not decrease, over successive updating steps, thus providing little evidence for proactive interference building up cumulatively.
How do we update faces? Effects of gaze direction and facial expressions on working memory updating
Directory of Open Access Journals (Sweden)
Caterina eArtuso
2012-09-01
Full Text Available The aim of the study was to investigate how the biological binding between different facial dimensions, and their social and communicative relevance, may impact updating processes in working memory (WM. We focused on WM updating because it plays a key role in ongoing processing. Gaze direction and facial expression are crucial and changeable components of face processing. Direct gaze enhances the processing of approach-oriented facial emotional expressions (e.g. joy, while averted gaze enhances the processing of avoidance-oriented facial emotional expressions (e.g. fear. Thus, the way in which these two facial dimensions are combined communicates to the observer important behavioral and social information. Updating of these two facial dimensions and their bindings has not been investigated before, despite the fact that they provide a piece of social information essential for building and maintaining an internal ongoing representation of our social environment. In Experiment 1 we created a task in which the binding between gaze direction and facial expression was manipulated: high binding conditions (e.g. joy-direct gaze were compared to low binding conditions (e.g. joy-averted gaze. Participants had to study and update continuously a number of faces, displaying different bindings between the two dimensions. In Experiment 2 we tested whether updating was affected by the social and communicative value of the facial dimension binding; to this end, we manipulated bindings between eye and hair color, two less communicative facial dimensions. Two new results emerged. First, faster response times were found in updating combinations of facial dimensions highly bound together. Second, our data showed that the ease of the ongoing updating processing varied depending on the communicative meaning of the binding that had to be updated. The results are discussed with reference to the role of WM updating in social cognition and appraisal processes.
How do we update faces? Effects of gaze direction and facial expressions on working memory updating.
Artuso, Caterina; Palladino, Paola; Ricciardelli, Paola
2012-01-01
The aim of the study was to investigate how the biological binding between different facial dimensions, and their social and communicative relevance, may impact updating processes in working memory (WM). We focused on WM updating because it plays a key role in ongoing processing. Gaze direction and facial expression are crucial and changeable components of face processing. Direct gaze enhances the processing of approach-oriented facial emotional expressions (e.g., joy), while averted gaze enhances the processing of avoidance-oriented facial emotional expressions (e.g., fear). Thus, the way in which these two facial dimensions are combined communicates to the observer important behavioral and social information. Updating of these two facial dimensions and their bindings has not been investigated before, despite the fact that they provide a piece of social information essential for building and maintaining an internal ongoing representation of our social environment. In Experiment 1 we created a task in which the binding between gaze direction and facial expression was manipulated: high binding conditions (e.g., joy-direct gaze) were compared to low binding conditions (e.g., joy-averted gaze). Participants had to study and update continuously a number of faces, displaying different bindings between the two dimensions. In Experiment 2 we tested whether updating was affected by the social and communicative value of the facial dimension binding; to this end, we manipulated bindings between eye and hair color, two less communicative facial dimensions. Two new results emerged. First, faster response times were found in updating combinations of facial dimensions highly bound together. Second, our data showed that the ease of the ongoing updating processing varied depending on the communicative meaning of the binding that had to be updated. The results are discussed with reference to the role of WM updating in social cognition and appraisal processes.
49 CFR 1002.3 - Updating user fees.
2010-10-01
... updating fees. Each fee shall be updated by updating the cost components comprising the fee. Cost... direct labor costs are direct labor costs determined by the cost study set forth in Revision of Fees For... by total office costs for the Offices directly associated with user fee activity. Actual updating of...
Topological zero modes in Monte Carlo simulations
International Nuclear Information System (INIS)
Dilger, H.
1994-08-01
We present an improvement of global Metropolis updating steps, the instanton hits, used in a hybrid Monte Carlo simulation of the two-flavor Schwinger model with staggered fermions. These hits are designed to change the topological sector of the gauge field. In order to match these hits to an unquenched simulation with pseudofermions, the approximate zero mode structure of the lattice Dirac operator has to be considered explicitly. (orig.)
Status and update of the National Ignition Facility radiation effects testing program
International Nuclear Information System (INIS)
Davis, J F; Serduke, F J; Wuest, C R.
1998-01-01
We are progressing in our efforts to make the National Ignition Facility (NIF) available to the nation as a radiation effects simulator to support the Services needs for nuclear hardness and survivability testing and validation. Details of our program were summarized in a paper presented at the 1998 HEART Conference [1]. This paper describes recent activities and updates plans for NIF radiation effects testing. research. Radiation Effects Testing
Concepts of incremental updating and versioning
CSIR Research Space (South Africa)
Cooper, Antony K
2004-07-01
Full Text Available of the work undertaken recently by the Working Group (WG). The WG was voted for a Commission by the General Assembly held at the 21st ICC in Durban, South Africa. The basic problem being addressed by the Commission is that a user compiles their data base... or election). Historically, updates have been provided in bulk, with the new data set replacing the old one. User could: ignore update (if it is not significant enough), manually (and selectively) update their data base, or accept the whole update...
Updating Geospatial Data from Large Scale Data Sources
Zhao, R.; Chen, J.; Wang, D.; Shang, Y.; Wang, Z.; Li, X.; Ai, T.
2011-08-01
In the past decades, many geospatial databases have been established at national, regional and municipal levels over the world. Nowadays, it has been widely recognized that how to update these established geo-spatial database and keep them up to date is most critical for the value of geo-spatial database. So, more and more efforts have been devoted to the continuous updating of these geospatial databases. Currently, there exist two main types of methods for Geo-spatial database updating: directly updating with remote sensing images or field surveying materials, and indirectly updating with other updated data result such as larger scale newly updated data. The former method is the basis because the update data sources in the two methods finally root from field surveying and remote sensing. The later method is often more economical and faster than the former. Therefore, after the larger scale database is updated, the smaller scale database should be updated correspondingly in order to keep the consistency of multi-scale geo-spatial database. In this situation, it is very reasonable to apply map generalization technology into the process of geo-spatial database updating. The latter is recognized as one of most promising methods of geo-spatial database updating, especially in collaborative updating environment in terms of map scale, i.e , different scale database are produced and maintained separately by different level organizations such as in China. This paper is focused on applying digital map generalization into the updating of geo-spatial database from large scale in the collaborative updating environment for SDI. The requirements of the application of map generalization into spatial database updating are analyzed firstly. A brief review on geospatial data updating based digital map generalization is then given. Based on the requirements analysis and review, we analyze the key factors for implementing updating geospatial data from large scale including technical
Numerical simulation of plasmas
International Nuclear Information System (INIS)
Dnestrovskii, Y.N.; Kostomarov, D.P.
1986-01-01
This book contains a modern consistent and systematic presentation of numerical computer simulation of plasmas in controlled thermonuclear fusion. The authors focus on the Soviet research in mathematical modelling of Tokamak plasmas, and present kinetic hydrodynamic and transport models with special emphasis on the more recent hybrid models. Compared with the first edition (in Russian) this book has been greatly revised and updated. (orig./WL)
Improved Lunar and Martian Regolith Simulant Production, Phase II
National Aeronautics and Space Administration — The technical objective of the Phase II project is to provide a more complete investigation of the long-term needs of the simulant community based on the updated...
A hybrid parallel framework for the cellular Potts model simulations
Energy Technology Data Exchange (ETDEWEB)
Jiang, Yi [Los Alamos National Laboratory; He, Kejing [SOUTH CHINA UNIV; Dong, Shoubin [SOUTH CHINA UNIV
2009-01-01
The Cellular Potts Model (CPM) has been widely used for biological simulations. However, most current implementations are either sequential or approximated, which can't be used for large scale complex 3D simulation. In this paper we present a hybrid parallel framework for CPM simulations. The time-consuming POE solving, cell division, and cell reaction operation are distributed to clusters using the Message Passing Interface (MPI). The Monte Carlo lattice update is parallelized on shared-memory SMP system using OpenMP. Because the Monte Carlo lattice update is much faster than the POE solving and SMP systems are more and more common, this hybrid approach achieves good performance and high accuracy at the same time. Based on the parallel Cellular Potts Model, we studied the avascular tumor growth using a multiscale model. The application and performance analysis show that the hybrid parallel framework is quite efficient. The hybrid parallel CPM can be used for the large scale simulation ({approx}10{sup 8} sites) of complex collective behavior of numerous cells ({approx}10{sup 6}).
Massively parallel multicanonical simulations
Gross, Jonathan; Zierenberg, Johannes; Weigel, Martin; Janke, Wolfhard
2018-03-01
Generalized-ensemble Monte Carlo simulations such as the multicanonical method and similar techniques are among the most efficient approaches for simulations of systems undergoing discontinuous phase transitions or with rugged free-energy landscapes. As Markov chain methods, they are inherently serial computationally. It was demonstrated recently, however, that a combination of independent simulations that communicate weight updates at variable intervals allows for the efficient utilization of parallel computational resources for multicanonical simulations. Implementing this approach for the many-thread architecture provided by current generations of graphics processing units (GPUs), we show how it can be efficiently employed with of the order of 104 parallel walkers and beyond, thus constituting a versatile tool for Monte Carlo simulations in the era of massively parallel computing. We provide the fully documented source code for the approach applied to the paradigmatic example of the two-dimensional Ising model as starting point and reference for practitioners in the field.
Polarized positrons for the ILC. Update on simulations
Energy Technology Data Exchange (ETDEWEB)
Ushakov, A.; Adeyemi, O.S.; Moortgat-Pick, G. [Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik; Staufenbiel, F.; Riemann, S. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany)
2012-02-15
To achieve the extremely high luminosity for colliding electron-positron beams at the future International Linear Collider [1] (ILC) an undulator-based source with about 230 meters helical undulator and a thin titanium-alloy target rim rotated with tangential velocity of about 100 meters per second are foreseen. The very high density of heat deposited in the target has to be analyzed carefully. The energy deposited by the photon beam in the target has been calculated in FLUKA. The resulting stress in the target material after one bunch train has been simulated in ANSYS. (orig.)
High-Performance Beam Simulator for the LANSCE Linac
International Nuclear Information System (INIS)
Pang, Xiaoying; Rybarcyk, Lawrence J.; Baily, Scott A.
2012-01-01
A high performance multiparticle tracking simulator is currently under development at Los Alamos. The heart of the simulator is based upon the beam dynamics simulation algorithms of the PARMILA code, but implemented in C++ on Graphics Processing Unit (GPU) hardware using NVIDIA's CUDA platform. Linac operating set points are provided to the simulator via the EPICS control system so that changes of the real time linac parameters are tracked and the simulation results updated automatically. This simulator will provide valuable insight into the beam dynamics along a linac in pseudo real-time, especially where direct measurements of the beam properties do not exist. Details regarding the approach, benefits and performance are presented.
RELAP5 based engineering simulator
International Nuclear Information System (INIS)
Charlton, T.R.; Laats, E.T.; Burtt, J.D.
1990-01-01
The INEL Engineering Simulation Center was established in 1988 to provide a modern, flexible, state-of-the-art simulation facility. This facility and two of the major projects which are part of the simulation center, the Advance Test Reactor (ATR) engineering simulator project and the Experimental Breeder Reactor (EBR-II) advanced reactor control system, have been the subject of several papers in the past few years. Two components of the ATR engineering simulator project, RELAP5 and the Nuclear Plant Analyzer (NPA), have recently been improved significantly. This paper presents an overview of the INEL Engineering Simulation Center, and discusses the RELAP5/MOD3 and NPA/MOD1 codes, specifically how they are being used at the INEL Engineering Simulation Center. It provides an update on the modifications to these two codes and their application to the ATR engineering simulator project, as well as, a discussion on the reactor system representation, control system modeling, two phase flow and heat transfer modeling. It will also discuss how these two codes are providing desktop, stand-alone reactor simulation
Update on Simulating Ice-Cliff Failure
Parizek, B. R.; Christianson, K. A.; Alley, R. B.; Voytenko, D.; Vankova, I.; Dixon, T. H.; Walker, R. T.; Holland, D.
2017-12-01
Using a 2D full-Stokes diagnostic ice-flow model and engineering and glaciological failure criteria, we simulate the limiting physical conditions for rapid structural failure of subaerial ice cliffs. Previously, using a higher-order flowline model, we reported that the threshold height, in crevassed ice and/or under favorable conditions for hydrofracture or crack lubrication, may be only slightly above the 100-m maximum observed today and that under well-drained or low-melt conditions, mechanically-competent ice supports cliff heights up to 220 m (with a likely range of 180-275 m) before ultimately succumbing to tensional and compressive failure along a listric surface. However, proximal to calving fronts, bridging effects lead to variations in vertical normal stress from the background glaciostatic stress state that give rise to the along-flow gradients in vertical shear stress that are included within a full-Stokes momentum balance. When including all flowline stresses within the physics core, diagnostic solutions continue to support our earlier findings that slumping failure ultimately limits the upper bound for cliff heights. Shear failure still requires low cohesive strength, tensile failure leads to deeper dry-crevasse propagation (albeit, less than halfway through the cliff), and compressive failure drops the threshold height for triggering rapid ice-front retreat via slumping to 200 m (145-280 m).
Status update of the BWR cask simulator
Energy Technology Data Exchange (ETDEWEB)
Lindgren, Eric R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Durbin, Samuel G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2015-09-01
The performance of commercial nuclear spent fuel dry storage casks are typically evaluated through detailed numerical analysis of the system's thermal performance. These modeling efforts are performed by the vendor to demonstrate the performance and regulatory compliance and are independently verified by the Nuclear Regulatory Commission (NRC). Carefully measured data sets generated from testing of full sized casks or smaller cask analogs are widely recognized as vital for validating these models. Numerous studies have been previously conducted. Recent advances in dry storage cask designs have moved the storage location from above ground to below ground and significantly increased the maximum thermal load allowed in a cask in part by increasing the canister helium pressure. Previous cask performance validation testing did not capture these parameters. The purpose of the investigation described in this report is to produce a data set that can be used to test the validity of the assumptions associated with the calculations presently used to determine steady-state cladding temperatures in modern dry casks. These modern cask designs utilize elevated helium pressure in the sealed canister or are intended for subsurface storage. The BWR cask simulator (BCS) has been designed in detail for both the above ground and below ground venting configurations. The pressure vessel representing the canister has been designed, fabricated, and pressure tested for a maximum allowable pressure (MAWP) rating of 24 bar at 400 C. An existing electrically heated but otherwise prototypic BWR Incoloy-clad test assembly is being deployed inside of a representative storage basket and cylindrical pressure vessel that represents the canister. The symmetric single assembly geometry with well-controlled boundary conditions simplifies interpretation of results. Various configurations of outer concentric ducting will be used to mimic conditions for above and below ground storage configurations
Updated clinical guidelines experience major reporting limitations
Directory of Open Access Journals (Sweden)
Robin W.M. Vernooij
2017-10-01
Full Text Available Abstract Background The Checklist for the Reporting of Updated Guidelines (CheckUp was recently developed. However, so far, no systematic assessment of the reporting of updated clinical guidelines (CGs exists. We aimed to examine (1 the completeness of reporting the updating process in CGs and (2 the inter-observer reliability of CheckUp. Methods We conducted a systematic assessment of the reporting of the updating process in a sample of updated CGs using CheckUp. We performed a systematic search to identify updated CGs published in 2015, developed by a professional society, reporting a systematic review of the evidence, and containing at least one recommendation. Three reviewers independently assessed the CGs with CheckUp (16 items. We calculated the median score per item, per domain, and overall, converting scores to a 10-point scale. Multiple linear regression analyses were used to identify differences according to country, type of organisation, scope, and health topic of updated CGs. We calculated the intraclass coefficient (ICC and 95% confidence interval (95% CI for domains and overall score. Results We included in total 60 updated CGs. The median domain score on a 10-point scale for presentation was 5.8 (range 1.7 to 10, for editorial independence 8.3 (range 3.3 to 10, and for methodology 5.7 (range 0 to 10. The median overall score on a 10-point scale was 6.3 (range 3.1 to 10. Presentation and justification items at recommendation level (respectively reported by 27 and 38% of the CGs and the methods used for the external review and implementing changes in practice were particularly poorly reported (both reported by 38% of the CGs. CGs developed by a European or international institution obtained a statistically significant higher overall score compared to North American or Asian institutions (p = 0.014. Finally, the agreement among the reviewers on the overall score was excellent (ICC 0.88, 95% CI 0.75 to 0.95. Conclusions The
National Oceanic and Atmospheric Administration, Department of Commerce — Circular Updates are periodic sequentially numbered instructions to debriefing staff and observers informing them of changes or additions to scientific and specimen...
Important update of CERN Mail Services
IT Department
2009-01-01
The CERN Mail Services are evolving. In the course of June and July 2009, all CERN mailboxes will be updated with a new infrastructure for hosting mailboxes, running Exchange 2007. This update is taking place in order to provide the capacity upgrade for the constantly growing volume of CERN mailboxes. It is also the opportunity to provide a number of improvements to CERN mailboxes: new and improved Outlook Web Access (the web interface used to access your mailbox from a web browser, also known as "webmail"), new features in the Out-of-Office auto-reply assistant, easier spam management... The update will preserve the mailbox configuration and no specific action is required by users. During the next weeks, each mailbox will be individually notified of the upcoming update the day before it takes place. We invite all users to carefully read this notification as it will contain the latest information for this update. The mailbox will be unavailable for a short time during the ni...
The Updating of Geospatial Base Data
Alrajhi, Muhamad N.; Konecny, Gottfried
2018-04-01
Topopographic mapping issues concern the area coverage at different scales and their age. The age of the map is determined by the system of updating. The United Nations (UNGGIM) have attempted to track the global map coverage at various scale ranges, which has greatly improved in recent decades. However the poor state of updating of base maps is still a global problem. In Saudi Arabia large scale mapping is carried out for all urban, suburban and rural areas by aerial surveys. Updating is carried out by remapping every 5 to 10 years. Due to the rapid urban development this is not satisfactory, but faster update methods are forseen by use of high resolution satellite imagery and the improvement of object oriented geodatabase structures, which will permit to utilize various survey technologies to update the photogrammetry established geodatabases. The longterm goal is to create an geodata infrastructure, which exists in Great Britain or Germany.
Electron-cloud simulation results for the PSR and SNS
International Nuclear Information System (INIS)
Pivi, M.; Furman, M.A.
2002-01-01
We present recent simulation results for the main features of the electron cloud in the storage ring of the Spallation Neutron Source (SNS) at Oak Ridge, and updated results for the Proton Storage Ring (PSR) at Los Alamos. In particular, a complete refined model for the secondary emission process including the so called true secondary, rediffused and backscattered electrons has been included in the simulation code
Updating optical pseudoinverse associative memories.
Telfer, B; Casasent, D
1989-07-01
Selected algorithms for adding to and deleting from optical pseudoinverse associative memories are presented and compared. New realizations of pseudoinverse updating methods using vector inner product matrix bordering and reduced-dimensionality Karhunen-Loeve approximations (which have been used for updating optical filters) are described in the context of associative memories. Greville's theorem is reviewed and compared with the Widrow-Hoff algorithm. Kohonen's gradient projection method is expressed in a different form suitable for optical implementation. The data matrix memory is also discussed for comparison purposes. Memory size, speed and ease of updating, and key vector requirements are the comparison criteria used.
Nonlinear model updating applied to the IMAC XXXII Round Robin benchmark system
Kurt, Mehmet; Moore, Keegan J.; Eriten, Melih; McFarland, D. Michael; Bergman, Lawrence A.; Vakakis, Alexander F.
2017-05-01
We consider the application of a new nonlinear model updating strategy to a computational benchmark system. The approach relies on analyzing system response time series in the frequency-energy domain by constructing both Hamiltonian and forced and damped frequency-energy plots (FEPs). The system parameters are then characterized and updated by matching the backbone branches of the FEPs with the frequency-energy wavelet transforms of experimental and/or computational time series. The main advantage of this method is that no nonlinearity model is assumed a priori, and the system model is updated solely based on simulation and/or experimental measured time series. By matching the frequency-energy plots of the benchmark system and its reduced-order model, we show that we are able to retrieve the global strongly nonlinear dynamics in the frequency and energy ranges of interest, identify bifurcations, characterize local nonlinearities, and accurately reconstruct time series. We apply the proposed methodology to a benchmark problem, which was posed to the system identification community prior to the IMAC XXXII (2014) and XXXIII (2015) Conferences as a "Round Robin Exercise on Nonlinear System Identification". We show that we are able to identify the parameters of the non-linear element in the problem with a priori knowledge about its position.
Using radar altimetry to update a routing model of the Zambezi River Basin
DEFF Research Database (Denmark)
Michailovsky, Claire Irene B.; Bauer-Gottwein, Peter
2012-01-01
Satellite radar altimetry allows for the global monitoring of lakes and river levels. However, the widespread use of altimetry for hydrological studies is limited by the coarse temporal and spatial resolution provided by current altimetric missions and the fact that discharge rather than level...... is needed for hydrological applications. To overcome these limitations, altimetry river levels can be combined with hydrological modeling in a dataassimilation framework. This study focuses on the updating of a river routing model of the Zambezi using river levels from radar altimetry. A hydrological model...... of the basin was built to simulate the land phase of the water cycle and produce inflows to a Muskingum routing model. River altimetry from the ENVISAT mission was then used to update the storages in the reaches of the Muskingum model using the Extended Kalman Filter. The method showed improvements in modeled...
Updating Sea Spray Aerosol Emissions in the Community Multiscale Air Quality Model
Gantt, B.; Bash, J. O.; Kelly, J.
2014-12-01
Sea spray aerosols (SSA) impact the particle mass concentration and gas-particle partitioning in coastal environments, with implications for human and ecosystem health. In this study, the Community Multiscale Air Quality (CMAQ) model is updated to enhance fine mode SSA emissions, include sea surface temperature (SST) dependency, and revise surf zone emissions. Based on evaluation with several regional and national observational datasets in the continental U.S., the updated emissions generally improve surface concentrations predictions of primary aerosols composed of sea-salt and secondary aerosols affected by sea-salt chemistry in coastal and near-coastal sites. Specifically, the updated emissions lead to better predictions of the magnitude and coastal-to-inland gradient of sodium, chloride, and nitrate concentrations at Bay Regional Atmospheric Chemistry Experiment (BRACE) sites near Tampa, FL. Including SST-dependency to the SSA emission parameterization leads to increased sodium concentrations in the southeast U.S. and decreased concentrations along the Pacific coast and northeastern U.S., bringing predictions into closer agreement with observations at most Interagency Monitoring of Protected Visual Environments (IMPROVE) and Chemical Speciation Network (CSN) sites. Model comparison with California Research at the Nexus of Air Quality and Climate Change (CalNex) observations will also be discussed, with particular focus on the South Coast Air Basin where clean marine air mixes with anthropogenic pollution in a complex environment. These SSA emission updates enable more realistic simulation of chemical processes in coastal environments, both in clean marine air masses and mixtures of clean marine and polluted conditions.
A Fatigue Crack Size Evaluation Method Based on Lamb Wave Simulation and Limited Experimental Data
Directory of Open Access Journals (Sweden)
Jingjing He
2017-09-01
Full Text Available This paper presents a systematic and general method for Lamb wave-based crack size quantification using finite element simulations and Bayesian updating. The method consists of construction of a baseline quantification model using finite element simulation data and Bayesian updating with limited Lamb wave data from target structure. The baseline model correlates two proposed damage sensitive features, namely the normalized amplitude and phase change, with the crack length through a response surface model. The two damage sensitive features are extracted from the first received S0 mode wave package. The model parameters of the baseline model are estimated using finite element simulation data. To account for uncertainties from numerical modeling, geometry, material and manufacturing between the baseline model and the target model, Bayesian method is employed to update the baseline model with a few measurements acquired from the actual target structure. A rigorous validation is made using in-situ fatigue testing and Lamb wave data from coupon specimens and realistic lap-joint components. The effectiveness and accuracy of the proposed method is demonstrated under different loading and damage conditions.
Real-time simulation of contact and cutting of heterogeneous soft-tissues.
Courtecuisse, Hadrien; Allard, Jérémie; Kerfriden, Pierre; Bordas, Stéphane P A; Cotin, Stéphane; Duriez, Christian
2014-02-01
This paper presents a numerical method for interactive (real-time) simulations, which considerably improves the accuracy of the response of heterogeneous soft-tissue models undergoing contact, cutting and other topological changes. We provide an integrated methodology able to deal both with the ill-conditioning issues associated with material heterogeneities, contact boundary conditions which are one of the main sources of inaccuracies, and cutting which is one of the most challenging issues in interactive simulations. Our approach is based on an implicit time integration of a non-linear finite element model. To enable real-time computations, we propose a new preconditioning technique, based on an asynchronous update at low frequency. The preconditioner is not only used to improve the computation of the deformation of the tissues, but also to simulate the contact response of homogeneous and heterogeneous bodies with the same accuracy. We also address the problem of cutting the heterogeneous structures and propose a method to update the preconditioner according to the topological modifications. Finally, we apply our approach to three challenging demonstrators: (i) a simulation of cataract surgery (ii) a simulation of laparoscopic hepatectomy (iii) a brain tumor surgery. Copyright © 2013 Elsevier B.V. All rights reserved.
Voogd, J.M.; Roza, M.
2015-01-01
The Dutch Ministry of Defense (NL-MoD) has recently acquired an update of its medium range anti tank (MRAT) missile system, called the GILL. The update to the SPIKE Long Range (LR) weapon system is accompanied with the acquisition of new simulation training devices (STDs). These devices are bought
Langevin simulations of QCD, including fermions
International Nuclear Information System (INIS)
Kronfeld, A.S.
1986-02-01
We encounter critical slow down in updating when xi/a -> infinite and in matrix inversion (needed to include fermions) when msub(q)a -> 0. A simulation that purports to solve QCD numerically will encounter these limits, so to face the challenge in the title of this workshop, we must cure the disease of critical slow down. Physically, this critical slow down is due to the reluctance of changes at short distances to propagate to large distances. Numerically, the stability of an algorithm at short wavelengths requires a (moderately) small step size; critical slow down occurs when the effective long wavelength step size becomes tiny. The remedy for this disease is an algorithm that propagates signals quickly throughout the system; i.e. one whose effective step size is not reduced for the long wavelength conponents of the fields. (Here the effective ''step size'' is essentially an inverse decorrelation time.) To do so one must resolve various wavelengths of the system and modify the dynamics (in CPU time) of the simulation so that all modes evolve at roughly the same rate. This can be achieved by introducing Fourier transforms. I show how to implement Fourier acceleration for Langevin updating and for conjugate gradient matrix inversion. The crucial feature of these algorithms that lends them to Fourier acceleration is that they update the lattice globally; hence the Fourier transforms are computed once per sweep rather than once per hit. (orig./HSI)
On the rejection-based algorithm for simulation and analysis of large-scale reaction networks
Energy Technology Data Exchange (ETDEWEB)
Thanh, Vo Hong, E-mail: vo@cosbi.eu [The Microsoft Research-University of Trento Centre for Computational and Systems Biology, Piazza Manifattura 1, Rovereto 38068 (Italy); Zunino, Roberto, E-mail: roberto.zunino@unitn.it [Department of Mathematics, University of Trento, Trento (Italy); Priami, Corrado, E-mail: priami@cosbi.eu [The Microsoft Research-University of Trento Centre for Computational and Systems Biology, Piazza Manifattura 1, Rovereto 38068 (Italy); Department of Mathematics, University of Trento, Trento (Italy)
2015-06-28
Stochastic simulation for in silico studies of large biochemical networks requires a great amount of computational time. We recently proposed a new exact simulation algorithm, called the rejection-based stochastic simulation algorithm (RSSA) [Thanh et al., J. Chem. Phys. 141(13), 134116 (2014)], to improve simulation performance by postponing and collapsing as much as possible the propensity updates. In this paper, we analyze the performance of this algorithm in detail, and improve it for simulating large-scale biochemical reaction networks. We also present a new algorithm, called simultaneous RSSA (SRSSA), which generates many independent trajectories simultaneously for the analysis of the biochemical behavior. SRSSA improves simulation performance by utilizing a single data structure across simulations to select reaction firings and forming trajectories. The memory requirement for building and storing the data structure is thus independent of the number of trajectories. The updating of the data structure when needed is performed collectively in a single operation across the simulations. The trajectories generated by SRSSA are exact and independent of each other by exploiting the rejection-based mechanism. We test our new improvement on real biological systems with a wide range of reaction networks to demonstrate its applicability and efficiency.
7. Mentor update and support: what do mentors need from an update?
Phillips, Mari; Marshall, Joyce
2015-04-01
Mentorship is the 14th series of 'Midwifery basics' targeted at practising midwives. The aim of these articles is to provide information to raise awareness of the impact of the work of midwives on women's experience, and encourage midwives to seek further information through a series of activities relating to the topic. In this seventh article Mari Phillips and Joyce Marshall consider some of the key issues related to mentor update and support and consider what mentors need from their annual update.
... of this page: https://medlineplus.gov/listserv.html Email Updates To use the sharing features on this ... view your email history or unsubscribe. Prevent MedlinePlus emails from being marked as "spam" or "junk" To ...
Valence-Dependent Belief Updating: Computational Validation
Directory of Open Access Journals (Sweden)
Bojana Kuzmanovic
2017-06-01
Full Text Available People tend to update beliefs about their future outcomes in a valence-dependent way: they are likely to incorporate good news and to neglect bad news. However, belief formation is a complex process which depends not only on motivational factors such as the desire for favorable conclusions, but also on multiple cognitive variables such as prior beliefs, knowledge about personal vulnerabilities and resources, and the size of the probabilities and estimation errors. Thus, we applied computational modeling in order to test for valence-induced biases in updating while formally controlling for relevant cognitive factors. We compared biased and unbiased Bayesian models of belief updating, and specified alternative models based on reinforcement learning. The experiment consisted of 80 trials with 80 different adverse future life events. In each trial, participants estimated the base rate of one of these events and estimated their own risk of experiencing the event before and after being confronted with the actual base rate. Belief updates corresponded to the difference between the two self-risk estimates. Valence-dependent updating was assessed by comparing trials with good news (better-than-expected base rates with trials with bad news (worse-than-expected base rates. After receiving bad relative to good news, participants' updates were smaller and deviated more strongly from rational Bayesian predictions, indicating a valence-induced bias. Model comparison revealed that the biased (i.e., optimistic Bayesian model of belief updating better accounted for data than the unbiased (i.e., rational Bayesian model, confirming that the valence of the new information influenced the amount of updating. Moreover, alternative computational modeling based on reinforcement learning demonstrated higher learning rates for good than for bad news, as well as a moderating role of personal knowledge. Finally, in this specific experimental context, the approach based on
49 CFR 360.5 - Updating user fees.
2010-10-01
... updating the cost components comprising the fee. Cost components shall be updated as follows: (1) Direct... determined by the cost study in Regulations Governing Fees For Service, 1 I.C.C. 2d 60 (1984), or subsequent... by total office costs for the office directly associated with user fee activity. Actual updating of...
Second-generation speed limit map updating applications
DEFF Research Database (Denmark)
Tradisauskas, Nerius; Agerholm, Niels; Juhl, Jens
2011-01-01
Intelligent Speed Adaptation is an Intelligent Transport System developed to significantly improve road safety in helping car drivers maintain appropriate driving behaviour. The system works in connection with the speed limits on the road network. It is thus essential to keep the speed limit map...... used in the Intelligent Speed Adaptation scheme updated. The traditional method of updating speed limit maps on the basis of long time interval observations needed to be replaced by a more efficient speed limit updating tool. In a Danish Intelligent Speed Adaptation trial a web-based tool was therefore...... for map updating should preferably be made on the basis of a commercial map provider, 2 such as Google Maps and that the real challenge is to oblige road authorities to carry out updates....
The 2018 and 2020 Updates of the U.S. National Seismic Hazard Models
Petersen, M. D.
2017-12-01
During 2018 the USGS will update the 2014 National Seismic Hazard Models by incorporating new seismicity models, ground motion models, site factors, fault inputs, and by improving weights to ground motion models using empirical and other data. We will update the earthquake catalog for the U.S. and introduce new rate models. Additional fault data will be used to improve rate estimates on active faults. New ground motion models (GMMs) and site factors for Vs30 have been released by the Pacific Earthquake Engineering Research Center (PEER) and we will consider these in assessing ground motions in craton and extended margin regions of the central and eastern U.S. The USGS will also include basin-depth terms for selected urban areas of the western United States to improve long-period shaking assessments using published depth estimates to 1.0 and 2.5 km/s shear wave velocities. We will produce hazard maps for input into the building codes that span a broad range of periods (0.1 to 5 s) and site classes (shear wave velocity from 2000 m/s to 200 m/s in the upper 30 m of the crust, Vs30). In the 2020 update we plan on including: a new national crustal model that defines basin depths required in the latest GMMs, new 3-D ground motion simulations for several urban areas, new magnitude-area equations, and new fault geodetic and geologic strain rate models. The USGS will also consider including new 3-D ground motion simulations for inclusion in these long-period maps. These new models are being evaluated and will be discussed at one or more regional and topical workshops held at the beginning of 2018.
Impact of Neutrino Opacities on Core-collapse Supernova Simulations
Kotake, Kei; Takiwaki, Tomoya; Fischer, Tobias; Nakamura, Ko; Martínez-Pinedo, Gabriel
2018-02-01
The accurate description of neutrino opacities is central to both the core-collapse supernova (CCSN) phenomenon and the validity of the explosion mechanism itself. In this work, we study in a systematic fashion the role of a variety of well-selected neutrino opacities in CCSN simulations where the multi-energy, three-flavor neutrino transport is solved using the isotropic diffusion source approximation (IDSA) scheme. To verify our code, we first present results from one-dimensional (1D) simulations following the core collapse, bounce, and ∼250 ms postbounce of a 15 {M}ȯ star using a standard set of neutrino opacities by Bruenn. A detailed comparison with published results supports the reliability of our three-flavor IDSA scheme using the standard opacity set. We then investigate in 1D simulations how individual opacity updates lead to differences with the baseline run with the standard opacity set. Through detailed comparisons with previous work, we check the validity of our implementation of each update in a step-by-step manner. Individual neutrino opacities with the largest impact on the overall evolution in 1D simulations are selected for systematic comparisons in our two-dimensional (2D) simulations. Special attention is given to the criterion of explodability in the 2D models. We discuss the implications of these results as well as its limitations and the requirements for future, more elaborate CCSN modeling.
RELAP5 based engineering simulator
International Nuclear Information System (INIS)
Charlton, T.R.; Laats, E.T.; Burtt, J.D.
1990-01-01
The INEL Engineering Simulation Center was established in 1988 to provide a modern, flexible, state-of-the-art simulation facility. This facility and two of the major projects which are part of the simulation center, the Advance Test Reactor (ATR) engineering simulator project and the Experimental Breeder Reactor II (EBR-II) advanced reactor control system, have been the subject of several papers in the past few years. Two components of the ATR engineering simulator project, RELAP5 and the Nuclear Plant Analyzer (NPA), have recently been improved significantly. This paper will present an overview of the INEL Engineering Simulation Center, and discuss the RELAP5/MOD3 and NPA/MOD1 codes, specifically how they are being used at the INEL Engineering Simulation Center. It will provide an update on the modifications to these two codes and their application to the ATR engineering simulator project, as well as, a discussion on the reactor system representation, control system modeling, two phase flow and heat transfer modeling. It will also discuss how these two codes are providing desktop, stand-alone reactor simulation. 12 refs., 2 figs
Update of CERN exchange network
2003-01-01
An update of the CERN exchange network will be done next April. Disturbances or even interruptions of telephony services may occur from 4th to 24th April during evenings from 18:30 to 00:00 but will not exceed more than 4 consecutive hours (see tentative planning below). In addition, the voice messaging system will be shut down on March, 26th April from 18:00 to 00:00. Calls supposed to be routed to the voice messaging system will not be possible during the shutdown. CERN divisions are invited to avoid any change requests (set-ups, move or removals) of telephones and fax machines from 4th to 25th April. Everything will be done to minimize potential inconveniences which may occur during this update. There will be no loss of telephone functionalities. CERN GSM portable phones won't be affected by this change. Should you need more details, please send us your questions by email to Standard.Telephone@cern.ch. DateChange typeAffected areas March 26Update of the voice messaging systemAll CERN sites April 4Updat...
Energy Technology Data Exchange (ETDEWEB)
Dornsife, William P.; Kirk, J. Scott; Shaw, Chris G. [Waste Control Specialists LLC, Andrews, Texas (United States)
2012-07-01
This Performance Assessment (PA) submittal is an update to the original PA that was developed to support the licensing of the Waste Control Specialists LLC Low-Level Radioactive Waste (LLRW) disposal facility. This update includes both the Compact Waste Facility (CWF) and the Federal Waste Facility (FWF), in accordance with Radioactive Material License (RML) No. R04100, License Condition (LC) 87. While many of the baseline assumptions supporting the initial license application PA were incorporated in this update, a new transport code, GoldSim, and new deterministic groundwater flow codes, including HYDRUS and MODFLOWSURFACT{sup TM}, were employed to demonstrate compliance with the performance objectives codified in the regulations and RML No. R04100, LC 87. A revised source term, provided by the Texas Commission on Environmental Quality staff, was used to match the initial 15 year license term. This updated PA clearly confirms and demonstrates the robustness of the characteristics of the site's geology and the advanced engineering design of the disposal units. Based on the simulations from fate and transport models, the radiation doses to members of the general public and site workers predicted in the initial and updated PA were a small fraction of the criterion doses of 0.25 mSv and 50 mSv, respectively. In a comparison between the results of the updated PA against the one developed in support of the initial license, both clearly demonstrated the robustness of the characteristics of the site's geology and engineering design of the disposal units. Based on the simulations from fate and transport models, the radiation doses to members of the general public predicted in the initial and updated PA were a fraction of the allowable 25 mrem/yr (0.25 m sievert/yr) dose standard for tens-of-thousands of years into the future. Draft Texas guidance on performance assessment (TCEQ, 2004) recommends a period of analysis equal to 1,000 years or until peak doses from
Breast Cancer and Estrogen-Alone Update
... Current Issue Past Issues Research News From NIH Breast Cancer and Estrogen-Alone Update Past Issues / Summer 2006 ... hormone therapy does not increase the risk of breast cancer in postmenopausal women, according to an updated analysis ...
Working Memory Updating as a Predictor of Academic Attainment
Lechuga, M. Teresa; Pelegrina, Santiago; Pelaez, Jose L.; Martin-Puga, M. Eva; Justicia, M. Jose
2016-01-01
There is growing evidence supporting the importance of executive functions, and specifically working memory updating (WMU), for children's academic achievement. This study aimed to assess the specific contribution of updating to the prediction of academic performance. Two updating tasks, which included different updating components, were…
Simulation of Optimal Decision-Making Under the Impacts of Climate Change
DEFF Research Database (Denmark)
Møller, Lea Ravnkilde; Drews, Martin; Larsen, Morten Andreas Dahl
2017-01-01
Climate change causes transformations to the conditions of existing agricultural practices appointing farmers to continuously evaluate their agricultural strategies, e.g., towards optimising revenue. In this light, this paper presents a framework for applying Bayesian updating to simulate decision...... crops, irrigated crops and livestock) by a continuous updating of beliefs relative to realised trajectories of climate (change), represented by projections of temperature and precipitation. The climate data is based on combinations of output from three global/regional climate model combinations and two...
FEFTRA {sup TM} verification. Update 2013
Energy Technology Data Exchange (ETDEWEB)
Loefman, J. [VTT Technical Research Centre of Finland, Espoo (Finland); Meszaros, F. [The Relief Lab., Harskut, (Hungary)
2013-12-15
FEFTRA is a finite element program package developed at VTT for the analyses of groundwater flow in Posiva's site evaluation programme that seeks a final repository for spent nuclear fuel in Finland. The code is capable of modelling steady-state or transient groundwater flow, solute transport and heat transfer as coupled or separate phenomena. Being a typical research tool used only by its developers, the FEFTRA code lacked long of a competent testing system and precise documentation of the verification of the code. In 2006 a project was launched, in which the objective was to reorganise all the material related to the existing verification cases and place them into the FEFTRA program path under the version-control system. The work also included development of a new testing system, which automatically calculates the selected cases, checks the new results against the old approved results and constructs a summary of the test run. All the existing cases were gathered together, checked and added into the new testing system. The documentation of each case was rewritten with the LATEX document preparation system and added into the testing system in a way that the whole test documentation (this report) could easily be generated in a postscript or pdf-format. The current report is the updated version of the verification report published in 2007. At the moment the report includes mainly the cases related to the testing of the primary result quantities (i.e. hydraulic head, pressure, salinity concentration, temperature). The selected cases, however, represent typical hydrological applications, in which the program package has been and will be employed in the Posiva's site evaluation programme, i.e. the simulations of groundwater flow, solute transport and heat transfer as separate or coupled phenomena. The comparison of the FEFTRA results to the analytical, semianalytical and/or other numerical solutions proves the capability of FEFTRA to simulate such problems
Directory of Open Access Journals (Sweden)
O. Rakovec
2012-09-01
Full Text Available This paper presents a study on the optimal setup for discharge assimilation within a spatially distributed hydrological model. The Ensemble Kalman filter (EnKF is employed to update the grid-based distributed states of such an hourly spatially distributed version of the HBV-96 model. By using a physically based model for the routing, the time delay and attenuation are modelled more realistically. The discharge and states at a given time step are assumed to be dependent on the previous time step only (Markov property.
Synthetic and real world experiments are carried out for the Upper Ourthe (1600 km2, a relatively quickly responding catchment in the Belgian Ardennes. We assess the impact on the forecasted discharge of (1 various sets of the spatially distributed discharge gauges and (2 the filtering frequency. The results show that the hydrological forecast at the catchment outlet is improved by assimilating interior gauges. This augmentation of the observation vector improves the forecast more than increasing the updating frequency. In terms of the model states, the EnKF procedure is found to mainly change the pdfs of the two routing model storages, even when the uncertainty in the discharge simulations is smaller than the defined observation uncertainty.
ORIGEN2: a revised and updated version of the Oak Ridge isotope generation and depletion code
International Nuclear Information System (INIS)
Croff, A.G.
1980-07-01
ORIGEN2 is a versatile point depletion and decay computer code for use in simulating nuclear fuel cycles and calculating the nuclide compositions of materials contained therein. This code represents a revision and update of the original ORIGEN computer code which has been distributed world-wide beginning in the early 1970s. The purpose of this report is to give a summary description of a revised and updated version of the original ORIGEN computer code, which has been designated ORIGEN2. A detailed description of the computer code ORIGEN2 is presented. The methods used by ORIGEN2 to solve the nuclear depletion and decay equations are included. Input information necessary to use ORIGEN2 that has not been documented in supporting reports is documented
Placement by thermodynamic simulated annealing
International Nuclear Information System (INIS)
Vicente, Juan de; Lanchares, Juan; Hermida, Roman
2003-01-01
Combinatorial optimization problems arise in different fields of science and engineering. There exist some general techniques coping with these problems such as simulated annealing (SA). In spite of SA success, it usually requires costly experimental studies in fine tuning the most suitable annealing schedule. In this Letter, the classical integrated circuit placement problem is faced by Thermodynamic Simulated Annealing (TSA). TSA provides a new annealing schedule derived from thermodynamic laws. Unlike SA, temperature in TSA is free to evolve and its value is continuously updated from the variation of state functions as the internal energy and entropy. Thereby, TSA achieves the high quality results of SA while providing interesting adaptive features
High-efficiency wavefunction updates for large scale Quantum Monte Carlo
Kent, Paul; McDaniel, Tyler; Li, Ying Wai; D'Azevedo, Ed
Within ab intio Quantum Monte Carlo (QMC) simulations, the leading numerical cost for large systems is the computation of the values of the Slater determinants in the trial wavefunctions. The evaluation of each Monte Carlo move requires finding the determinant of a dense matrix, which is traditionally iteratively evaluated using a rank-1 Sherman-Morrison updating scheme to avoid repeated explicit calculation of the inverse. For calculations with thousands of electrons, this operation dominates the execution profile. We propose a novel rank- k delayed update scheme. This strategy enables probability evaluation for multiple successive Monte Carlo moves, with application of accepted moves to the matrices delayed until after a predetermined number of moves, k. Accepted events grouped in this manner are then applied to the matrices en bloc with enhanced arithmetic intensity and computational efficiency. This procedure does not change the underlying Monte Carlo sampling or the sampling efficiency. For large systems and algorithms such as diffusion Monte Carlo where the acceptance ratio is high, order of magnitude speedups can be obtained on both multi-core CPU and on GPUs, making this algorithm highly advantageous for current petascale and future exascale computations.
International Nuclear Information System (INIS)
Huang, Ying; Fang, Xia; Xiao, Hai; Bevans, Wesley James; Chen, Genda; Zhou, Zhi
2013-01-01
Steel buildings are subjected to fire hazards during or immediately after a major earthquake. Under combined gravity and thermal loads, they have non-uniformly distributed stiffness and strength, and thus collapse progressively with large deformation. In this study, large-strain optical fiber sensors for high temperature applications and a temperature-dependent finite element model updating method are proposed for accurate prediction of structural behavior in real time. The optical fiber sensors can measure strains up to 10% at approximately 700 °C. Their measurements are in good agreement with those from strain gauges up to 0.5%. In comparison with the experimental results, the proposed model updating method can reduce the predicted strain errors from over 75% to below 20% at 800 °C. The minimum number of sensors in a fire zone that can properly characterize the vertical temperature distribution of heated air due to the gravity effect should be included in the proposed model updating scheme to achieve a predetermined simulation accuracy. (paper)
Accelerating rejection-based simulation of biochemical reactions with bounded acceptance probability
Energy Technology Data Exchange (ETDEWEB)
Thanh, Vo Hong, E-mail: vo@cosbi.eu [The Microsoft Research - University of Trento Centre for Computational and Systems Biology, Piazza Manifattura 1, Rovereto 38068 (Italy); Priami, Corrado, E-mail: priami@cosbi.eu [The Microsoft Research - University of Trento Centre for Computational and Systems Biology, Piazza Manifattura 1, Rovereto 38068 (Italy); Department of Mathematics, University of Trento, Trento (Italy); Zunino, Roberto, E-mail: roberto.zunino@unitn.it [Department of Mathematics, University of Trento, Trento (Italy)
2016-06-14
Stochastic simulation of large biochemical reaction networks is often computationally expensive due to the disparate reaction rates and high variability of population of chemical species. An approach to accelerate the simulation is to allow multiple reaction firings before performing update by assuming that reaction propensities are changing of a negligible amount during a time interval. Species with small population in the firings of fast reactions significantly affect both performance and accuracy of this simulation approach. It is even worse when these small population species are involved in a large number of reactions. We present in this paper a new approximate algorithm to cope with this problem. It is based on bounding the acceptance probability of a reaction selected by the exact rejection-based simulation algorithm, which employs propensity bounds of reactions and the rejection-based mechanism to select next reaction firings. The reaction is ensured to be selected to fire with an acceptance rate greater than a predefined probability in which the selection becomes exact if the probability is set to one. Our new algorithm improves the computational cost for selecting the next reaction firing and reduces the updating the propensities of reactions.
Neurosurgery simulation using non-linear finite element modeling and haptic interaction
Lee, Huai-Ping; Audette, Michel; Joldes, Grand R.; Enquobahrie, Andinet
2012-02-01
Real-time surgical simulation is becoming an important component of surgical training. To meet the realtime requirement, however, the accuracy of the biomechancial modeling of soft tissue is often compromised due to computing resource constraints. Furthermore, haptic integration presents an additional challenge with its requirement for a high update rate. As a result, most real-time surgical simulation systems employ a linear elasticity model, simplified numerical methods such as the boundary element method or spring-particle systems, and coarse volumetric meshes. However, these systems are not clinically realistic. We present here an ongoing work aimed at developing an efficient and physically realistic neurosurgery simulator using a non-linear finite element method (FEM) with haptic interaction. Real-time finite element analysis is achieved by utilizing the total Lagrangian explicit dynamic (TLED) formulation and GPU acceleration of per-node and per-element operations. We employ a virtual coupling method for separating deformable body simulation and collision detection from haptic rendering, which needs to be updated at a much higher rate than the visual simulation. The system provides accurate biomechancial modeling of soft tissue while retaining a real-time performance with haptic interaction. However, our experiments showed that the stability of the simulator depends heavily on the material property of the tissue and the speed of colliding objects. Hence, additional efforts including dynamic relaxation are required to improve the stability of the system.
Update of CERN exchange network
2003-01-01
An update of the CERN exchange network will be done next April. Disturbances or even interruptions of telephony services may occur from 4th to 24th April during evenings from 18:30 to 00:00 but will not exceed more than 4 consecutive hours (see tentative planning below). In addition, the voice messaging system will be shut down on March, 26th April from 18:00 to 00:00. Calls supposed to be routed to the voice messaging system will not be possible during the shutdown. CERN divisions are invited to avoid any change requests (set-ups, move or removals) of telephones and fax machines from 4th to 25th April. Everything will be done to minimize potential inconveniences which may occur during this update. There will be no loss of telephone functionalities. CERN GSM portable phones won't be affected by this change. Should you need more details, please send us your questions by email to Standard.Telephone@cern.ch. Date Change type Affected areas April 8 Update of switch in LHC 7 LHC 7 Point April 9 Update of...
Electron-cloud simulation results for the SPS and recent results for the LHC
International Nuclear Information System (INIS)
Furman, M.A.; Pivi, M.T.F.
2002-01-01
We present an update of computer simulation results for some features of the electron cloud at the Large Hadron Collider (LHC) and recent simulation results for the Super Proton Synchrotron (SPS). We focus on the sensitivity of the power deposition on the LHC beam screen to the emitted electron spectrum, which we study by means of a refined secondary electron (SE) emission model recently included in our simulation code
An update on the BQCD Hybrid Monte Carlo program
Haar, Taylor Ryan; Nakamura, Yoshifumi; Stüben, Hinnerk
2018-03-01
We present an update of BQCD, our Hybrid Monte Carlo program for simulating lattice QCD. BQCD is one of the main production codes of the QCDSF collaboration and is used by CSSM and in some Japanese finite temperature and finite density projects. Since the first publication of the code at Lattice 2010 the program has been extended in various ways. New features of the code include: dynamical QED, action modification in order to compute matrix elements by using Feynman-Hellman theory, more trace measurements (like Tr(D-n) for K, cSW and chemical potential reweighting), a more flexible integration scheme, polynomial filtering, term-splitting for RHMC, and a portable implementation of performance critical parts employing SIMD.
The electron-cloud instability in PEP-II: An update
International Nuclear Information System (INIS)
Furman, M.A.; Lambertson, G.R.
1997-05-01
The authors present an update on the estimate of the growth time of the multi-bunch transverse instability in the PEP-II collider arising from the interaction of the positron beam with the accumulated electron cloud. They estimate the contributions to the growth rate arising from the dipole magnets and from the pumping straight sections. They emphasize those quantities upon which the instability is most sensitive. The simulation includes measured data on the secondary emission yield for TiN-coated samples of the actual vacuum chamber. Although the analysis is still in progress, they conclude that the instability risetime is of order 1 ms, which is well within the range controllable by the feedback system
Turnbull, Heather; Omenzetter, Piotr
2018-03-01
vDifficulties associated with current health monitoring and inspection practices combined with harsh, often remote, operational environments of wind turbines highlight the requirement for a non-destructive evaluation system capable of remotely monitoring the current structural state of turbine blades. This research adopted a physics based structural health monitoring methodology through calibration of a finite element model using inverse techniques. A 2.36m blade from a 5kW turbine was used as an experimental specimen, with operational modal analysis techniques utilised to realize the modal properties of the system. Modelling the experimental responses as fuzzy numbers using the sub-level technique, uncertainty in the response parameters was propagated back through the model and into the updating parameters. Initially, experimental responses of the blade were obtained, with a numerical model of the blade created and updated. Deterministic updating was carried out through formulation and minimisation of a deterministic objective function using both firefly algorithm and virus optimisation algorithm. Uncertainty in experimental responses were modelled using triangular membership functions, allowing membership functions of updating parameters (Young's modulus and shear modulus) to be obtained. Firefly algorithm and virus optimisation algorithm were again utilised, however, this time in the solution of fuzzy objective functions. This enabled uncertainty associated with updating parameters to be quantified. Varying damage location and severity was simulated experimentally through addition of small masses to the structure intended to cause a structural alteration. A damaged model was created, modelling four variable magnitude nonstructural masses at predefined points and updated to provide a deterministic damage prediction and information in relation to the parameters uncertainty via fuzzy updating.
Improved local lattice Monte Carlo simulation for charged systems
Jiang, Jian; Wang, Zhen-Gang
2018-03-01
Maggs and Rossetto [Phys. Rev. Lett. 88, 196402 (2002)] proposed a local lattice Monte Carlo algorithm for simulating charged systems based on Gauss's law, which scales with the particle number N as O(N). This method includes two degrees of freedom: the configuration of the mobile charged particles and the electric field. In this work, we consider two important issues in the implementation of the method, the acceptance rate of configurational change (particle move) and the ergodicity in the phase space sampled by the electric field. We propose a simple method to improve the acceptance rate of particle moves based on the superposition principle for electric field. Furthermore, we introduce an additional updating step for the field, named "open-circuit update," to ensure that the system is fully ergodic under periodic boundary conditions. We apply this improved local Monte Carlo simulation to an electrolyte solution confined between two low dielectric plates. The results show excellent agreement with previous theoretical work.
Fracture network modeling and GoldSim simulation support
International Nuclear Information System (INIS)
Sugita, Kenichirou; Dershowitz, W.
2005-01-01
During Heisei-16, Golder Associates provided support for JNC Tokai through discrete fracture network data analysis and simulation of the Mizunami Underground Research Laboratory (MIU), participation in Task 6 of the AEspoe Task Force on Modeling of Groundwater Flow and Transport, and development of methodologies for analysis of repository site characterization strategies and safety assessment. MIU support during H-16 involved updating the H-15 FracMan discrete fracture network (DFN) models for the MIU shaft region, and developing improved simulation procedures. Updates to the conceptual model included incorporation of 'Step2' (2004) versions of the deterministic structures, and revision of background fractures to be consistent with conductive structure data from the DH-2 borehole. Golder developed improved simulation procedures for these models through the use of hybrid discrete fracture network (DFN), equivalent porous medium (EPM), and nested DFN/EPM approaches. For each of these models, procedures were documented for the entire modeling process including model implementation, MMP simulation, and shaft grouting simulation. Golder supported JNC participation in Task 6AB, 6D and 6E of the AEspoe Task Force on Modeling of Groundwater Flow and Transport during H-16. For Task 6AB, Golder developed a new technique to evaluate the role of grout in performance assessment time-scale transport. For Task 6D, Golder submitted a report of H-15 simulations to SKB. For Task 6E, Golder carried out safety assessment time-scale simulations at the block scale, using the Laplace Transform Galerkin method. During H-16, Golder supported JNC's Total System Performance Assessment (TSPA) strategy by developing technologies for the analysis of the use site characterization data in safety assessment. This approach will aid in the understanding of the use of site characterization to progressively reduce site characterization uncertainty. (author)
Identifying null meta-analyses that are ripe for updating
Directory of Open Access Journals (Sweden)
Fang Manchun
2003-07-01
Full Text Available Abstract Background As an increasingly large number of meta-analyses are published, quantitative methods are needed to help clinicians and systematic review teams determine when meta-analyses are not up to date. Methods We propose new methods for determining when non-significant meta-analytic results might be overturned, based on a prediction of the number of participants required in new studies. To guide decision making, we introduce the "new participant ratio", the ratio of the actual number of participants in new studies to the predicted number required to obtain statistical significance. A simulation study was conducted to study the performance of our methods and a real meta-analysis provides further evidence. Results In our three simulation configurations, our diagnostic test for determining whether a meta-analysis is out of date had sensitivity of 55%, 62%, and 49% with corresponding specificity of 85%, 80%, and 90% respectively. Conclusions Simulations suggest that our methods are able to detect out-of-date meta-analyses. These quick and approximate methods show promise for use by systematic review teams to help decide whether to commit the considerable resources required to update a meta-analysis. Further investigation and evaluation of the methods is required before they can be recommended for general use.
Updating parameters of the chicken processing line model
DEFF Research Database (Denmark)
Kurowicka, Dorota; Nauta, Maarten; Jozwiak, Katarzyna
2010-01-01
A mathematical model of chicken processing that quantitatively describes the transmission of Campylobacter on chicken carcasses from slaughter to chicken meat product has been developed in Nauta et al. (2005). This model was quantified with expert judgment. Recent availability of data allows...... updating parameters of the model to better describe processes observed in slaughterhouses. We propose Bayesian updating as a suitable technique to update expert judgment with microbiological data. Berrang and Dickens’s data are used to demonstrate performance of this method in updating parameters...... of the chicken processing line model....
Simulation of secondary emission calorimeter for future colliders
Yetkin, E. A.; Yetkin, T.; Ozok, F.; Iren, E.; Erduran, M. N.
2018-03-01
We present updated results from a simulation study of a conceptual sampling electromagnetic calorimeter based on secondary electron emission process. We implemented the secondary electron emission process in Geant4 as a user physics list and produced the energy spectrum and yield of secondary electrons. The energy resolution of the SEE calorimeter was σ/E = (41%) GeV1/2/√E and the response linearity to electromagnetic showers was to within 1.5%. The simulation results were also compared with a traditional scintillator calorimeter.
Nonparametric methods in actigraphy: An update
Directory of Open Access Journals (Sweden)
Bruno S.B. Gonçalves
2014-09-01
Full Text Available Circadian rhythmicity in humans has been well studied using actigraphy, a method of measuring gross motor movement. As actigraphic technology continues to evolve, it is important for data analysis to keep pace with new variables and features. Our objective is to study the behavior of two variables, interdaily stability and intradaily variability, to describe rest activity rhythm. Simulated data and actigraphy data of humans, rats, and marmosets were used in this study. We modified the method of calculation for IV and IS by modifying the time intervals of analysis. For each variable, we calculated the average value (IVm and ISm results for each time interval. Simulated data showed that (1 synchronization analysis depends on sample size, and (2 fragmentation is independent of the amplitude of the generated noise. We were able to obtain a significant difference in the fragmentation patterns of stroke patients using an IVm variable, while the variable IV60 was not identified. Rhythmic synchronization of activity and rest was significantly higher in young than adults with Parkinson׳s when using the ISM variable; however, this difference was not seen using IS60. We propose an updated format to calculate rhythmic fragmentation, including two additional optional variables. These alternative methods of nonparametric analysis aim to more precisely detect sleep–wake cycle fragmentation and synchronization.
Better Plants Progress Update Fall 2013
Energy Technology Data Exchange (ETDEWEB)
none,
2013-09-23
This Progress Update summarizes the significant energy saving achievements and cumulative cost savings made by these industry leaders from 2010-2012. The update also shares the plans and priorities over the next year for the Better Plants Program to continue to advance energy efficiency in the industrial sector.
Non-Linear Approximation of Bayesian Update
Litvinenko, Alexander
2016-01-01
We develop a non-linear approximation of expensive Bayesian formula. This non-linear approximation is applied directly to Polynomial Chaos Coefficients. In this way, we avoid Monte Carlo sampling and sampling error. We can show that the famous Kalman Update formula is a particular case of this update.
Non-Linear Approximation of Bayesian Update
Litvinenko, Alexander
2016-06-23
We develop a non-linear approximation of expensive Bayesian formula. This non-linear approximation is applied directly to Polynomial Chaos Coefficients. In this way, we avoid Monte Carlo sampling and sampling error. We can show that the famous Kalman Update formula is a particular case of this update.
Indoor Spatial Updating with Reduced Visual Information
Legge, Gordon E.; Gage, Rachel; Baek, Yihwa; Bochsler, Tiana M.
2016-01-01
Purpose Spatial updating refers to the ability to keep track of position and orientation while moving through an environment. People with impaired vision may be less accurate in spatial updating with adverse consequences for indoor navigation. In this study, we asked how artificial restrictions on visual acuity and field size affect spatial updating, and also judgments of the size of rooms. Methods Normally sighted young adults were tested with artificial restriction of acuity in Mild Blur (S...
International Nuclear Information System (INIS)
Huang, Jinhong; Guo, Li; Feng, Qianjin; Chen, Wufan; Feng, Yanqiu
2015-01-01
Image reconstruction from undersampled k-space data accelerates magnetic resonance imaging (MRI) by exploiting image sparseness in certain transform domains. Employing image patch representation over a learned dictionary has the advantage of being adaptive to local image structures and thus can better sparsify images than using fixed transforms (e.g. wavelets and total variations). Dictionary learning methods have recently been introduced to MRI reconstruction, and these methods demonstrate significantly reduced reconstruction errors compared to sparse MRI reconstruction using fixed transforms. However, the synthesis sparse coding problem in dictionary learning is NP-hard and computationally expensive. In this paper, we present a novel sparsity-promoting orthogonal dictionary updating method for efficient image reconstruction from highly undersampled MRI data. The orthogonality imposed on the learned dictionary enables the minimization problem in the reconstruction to be solved by an efficient optimization algorithm which alternately updates representation coefficients, orthogonal dictionary, and missing k-space data. Moreover, both sparsity level and sparse representation contribution using updated dictionaries gradually increase during iterations to recover more details, assuming the progressively improved quality of the dictionary. Simulation and real data experimental results both demonstrate that the proposed method is approximately 10 to 100 times faster than the K-SVD-based dictionary learning MRI method and simultaneously improves reconstruction accuracy. (paper)
Huang, Jinhong; Guo, Li; Feng, Qianjin; Chen, Wufan; Feng, Yanqiu
2015-07-21
Image reconstruction from undersampled k-space data accelerates magnetic resonance imaging (MRI) by exploiting image sparseness in certain transform domains. Employing image patch representation over a learned dictionary has the advantage of being adaptive to local image structures and thus can better sparsify images than using fixed transforms (e.g. wavelets and total variations). Dictionary learning methods have recently been introduced to MRI reconstruction, and these methods demonstrate significantly reduced reconstruction errors compared to sparse MRI reconstruction using fixed transforms. However, the synthesis sparse coding problem in dictionary learning is NP-hard and computationally expensive. In this paper, we present a novel sparsity-promoting orthogonal dictionary updating method for efficient image reconstruction from highly undersampled MRI data. The orthogonality imposed on the learned dictionary enables the minimization problem in the reconstruction to be solved by an efficient optimization algorithm which alternately updates representation coefficients, orthogonal dictionary, and missing k-space data. Moreover, both sparsity level and sparse representation contribution using updated dictionaries gradually increase during iterations to recover more details, assuming the progressively improved quality of the dictionary. Simulation and real data experimental results both demonstrate that the proposed method is approximately 10 to 100 times faster than the K-SVD-based dictionary learning MRI method and simultaneously improves reconstruction accuracy.
Directory of Open Access Journals (Sweden)
Jelena Grujić
Full Text Available The presence of costly cooperation between otherwise selfish actors is not trivial. A prominent mechanism that promotes cooperation is spatial population structure. However, recent experiments with human subjects report substantially lower level of cooperation then predicted by theoretical models. We analyze the data of such an experiment in which a total of 400 players play a Prisoner's Dilemma on a 4×4 square lattice in two treatments, either interacting via a fixed square lattice (15 independent groups or with a population structure changing after each interaction (10 independent groups. We analyze the statistics of individual decisions and infer in which way they can be matched with the typical models of evolutionary game theorists. We find no difference in the strategy updating between the two treatments. However, the strategy updates are distinct from the most popular models which lead to the promotion of cooperation as shown by computer simulations of the strategy updating. This suggests that the promotion of cooperation by population structure is not as straightforward in humans as often envisioned in theoretical models.
Digitalization and networking of analog simulators and portal images.
Pesznyák, Csilla; Zaránd, Pál; Mayer, Arpád
2007-03-01
Many departments have analog simulators and irradiation facilities (especially cobalt units) without electronic portal imaging. Import of the images into the R&V (Record & Verify) system is required. Simulator images are grabbed while portal films scanned by using a laser scanner and both converted into DICOM RT (Digital Imaging and Communications in Medicine Radiotherapy) images. Image intensifier output of a simulator and portal films are converted to DICOM RT images and used in clinical practice. The simulator software was developed in cooperation at the authors' hospital. The digitalization of analog simulators is a valuable updating in clinical use replacing screen-film technique. Film scanning and digitalization permit the electronic archiving of films. Conversion into DICOM RT images is a precondition of importing to the R&V system.
Atmospheric release model for the E-area low-level waste facility: Updates and modifications
International Nuclear Information System (INIS)
None, None
2017-01-01
The atmospheric release model (ARM) utilizes GoldSim® Monte Carlo simulation software (GTG, 2017) to evaluate the flux of gaseous radionuclides as they volatilize from E-Area disposal facility waste zones, diffuse into the air-filled soil pores surrounding the waste, and emanate at the land surface. This report documents the updates and modifications to the ARM for the next planned E-Area PA considering recommendations from the 2015 PA strategic planning team outlined by Butcher and Phifer.
Atmospheric release model for the E-area low-level waste facility: Updates and modifications
Energy Technology Data Exchange (ETDEWEB)
None, None
2017-11-16
The atmospheric release model (ARM) utilizes GoldSim® Monte Carlo simulation software (GTG, 2017) to evaluate the flux of gaseous radionuclides as they volatilize from E-Area disposal facility waste zones, diffuse into the air-filled soil pores surrounding the waste, and emanate at the land surface. This report documents the updates and modifications to the ARM for the next planned E-Area PA considering recommendations from the 2015 PA strategic planning team outlined by Butcher and Phifer.
Chen, C P; Wan, J Z
1999-01-01
A fast learning algorithm is proposed to find an optimal weights of the flat neural networks (especially, the functional-link network). Although the flat networks are used for nonlinear function approximation, they can be formulated as linear systems. Thus, the weights of the networks can be solved easily using a linear least-square method. This formulation makes it easier to update the weights instantly for both a new added pattern and a new added enhancement node. A dynamic stepwise updating algorithm is proposed to update the weights of the system on-the-fly. The model is tested on several time-series data including an infrared laser data set, a chaotic time-series, a monthly flour price data set, and a nonlinear system identification problem. The simulation results are compared to existing models in which more complex architectures and more costly training are needed. The results indicate that the proposed model is very attractive to real-time processes.
SAM Photovoltaic Model Technical Reference 2016 Update
Energy Technology Data Exchange (ETDEWEB)
Gilman, Paul [National Renewable Energy Laboratory (NREL), Golden, CO (United States); DiOrio, Nicholas A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Freeman, Janine M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Janzou, Steven [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Dobos, Aron [No longer NREL employee; Ryberg, David [No longer NREL employee
2018-03-19
This manual describes the photovoltaic performance model in the System Advisor Model (SAM) software, Version 2016.3.14 Revision 4 (SSC Version 160). It is an update to the 2015 edition of the manual, which describes the photovoltaic model in SAM 2015.1.30 (SSC 41). This new edition includes corrections of errors in the 2015 edition and descriptions of new features introduced in SAM 2016.3.14, including: 3D shade calculator Battery storage model DC power optimizer loss inputs Snow loss model Plane-of-array irradiance input from weather file option Support for sub-hourly simulations Self-shading works with all four subarrays, and uses same algorithm for fixed arrays and one-axis tracking Linear self-shading algorithm for thin-film modules Loss percentages replace derate factors. The photovoltaic performance model is one of the modules in the SAM Simulation Core (SSC), which is part of both SAM and the SAM SDK. SAM is a user-friedly desktop application for analysis of renewable energy projects. The SAM SDK (Software Development Kit) is for developers writing their own renewable energy analysis software based on SSC. This manual is written for users of both SAM and the SAM SDK wanting to learn more about the details of SAM's photovoltaic model.
Beyond "The Total Organization": A Graduate-Level Simulation
Kane, Kathleen R.; Goldgehn, Leslie A.
2011-01-01
This simulation is designed to help students understand the complexity of organizational life and learn how to navigate a work world of chaos, conflict, and uncertainty. This adaptation and update of an exercise by Cohen, Fink, Gadon, and Willits has been a successful addition to MBA and EMBA courses. The participants must self-organize, choose…
Cooperation and charity in spatial public goods game under different strategy update rules
Li, Yixiao; Jin, Xiaogang; Su, Xianchuang; Kong, Fansheng; Peng, Chengbin
2010-03-01
Human cooperation can be influenced by other human behaviors and recent years have witnessed the flourishing of studying the coevolution of cooperation and punishment, yet the common behavior of charity is seldom considered in game-theoretical models. In this article, we investigate the coevolution of altruistic cooperation and egalitarian charity in spatial public goods game, by considering charity as the behavior of reducing inter-individual payoff differences. Our model is that, in each generation of the evolution, individuals play games first and accumulate payoff benefits, and then each egalitarian makes a charity donation by payoff transfer in its neighborhood. To study the individual-level evolutionary dynamics, we adopt different strategy update rules and investigate their effects on charity and cooperation. These rules can be classified into two global rules: random selection rule in which individuals randomly update strategies, and threshold selection rule where only those with payoffs below a threshold update strategies. Simulation results show that random selection enhances the cooperation level, while threshold selection lowers the threshold of the multiplication factor to maintain cooperation. When charity is considered, it is incapable in promoting cooperation under random selection, whereas it promotes cooperation under threshold selection. Interestingly, the evolution of charity strongly depends on the dispersion of payoff acquisitions of the population, which agrees with previous results. Our work may shed light on understanding human egalitarianism.
Nonsynchronous updating in the multiverse of cellular automata.
Reia, Sandro M; Kinouchi, Osame
2015-04-01
In this paper we study updating effects on cellular automata rule space. We consider a subset of 6144 order-3 automata from the space of 262144 bidimensional outer-totalistic rules. We compare synchronous to asynchronous and sequential updatings. Focusing on two automata, we discuss how update changes destroy typical structures of these rules. Besides, we show that the first-order phase transition in the multiverse of synchronous cellular automata, revealed with the use of a recently introduced control parameter, seems to be robust not only to changes in update schema but also to different initial densities.
Nonsynchronous updating in the multiverse of cellular automata
Reia, Sandro M.; Kinouchi, Osame
2015-04-01
In this paper we study updating effects on cellular automata rule space. We consider a subset of 6144 order-3 automata from the space of 262144 bidimensional outer-totalistic rules. We compare synchronous to asynchronous and sequential updatings. Focusing on two automata, we discuss how update changes destroy typical structures of these rules. Besides, we show that the first-order phase transition in the multiverse of synchronous cellular automata, revealed with the use of a recently introduced control parameter, seems to be robust not only to changes in update schema but also to different initial densities.
Energy Technology Data Exchange (ETDEWEB)
Almond, K.P.
1995-07-01
An update to an extensive bibliography on alternate uses of sulfur was presented. Alberta Sulphur Research Ltd., previously compiled a bibliography in volume 24 of this quarterly bulletin. This update provides an additional 44 new publications. The information regarding current research focusses on topics regarding the use of sulfur in oil and gas applications, mining and metallurgy, concretes and other structural materials, waste management, rubber and textile products, asphalts and other paving and highway applications.
News and Features Updates from USA.gov
General Services Administration — Stay on top of important government news and information with the USA.gov Updates: News and Features RSS feed. We'll update this feed when we add news and featured...
Updating representation of land surface-atmosphere feedbacks in airborne campaign modeling analysis
Huang, M.; Carmichael, G. R.; Crawford, J. H.; Chan, S.; Xu, X.; Fisher, J. A.
2017-12-01
An updated modeling system to support airborne field campaigns is being built at NASA Ames Pleiades, with focus on adjusting the representation of land surface-atmosphere feedbacks. The main updates, referring to previous experiences with ARCTAS-CARB and CalNex in the western US to study air pollution inflows, include: 1) migrating the WRF (Weather Research and Forecasting) coupled land surface model from Noah to improved/more complex models especially Noah-MP and Rapid Update Cycle; 2) enabling the WRF land initialization with suitably spun-up land model output; 3) incorporating satellite land cover, vegetation dynamics, and soil moisture data (i.e., assimilating Soil Moisture Active Passive data using the ensemble Kalman filter approach) into WRF. Examples are given of comparing the model fields with available aircraft observations during spring-summer 2016 field campaigns taken place at the eastern side of continents (KORUS-AQ in South Korea and ACT-America in the eastern US), the air pollution export regions. Under fair weather and stormy conditions, air pollution vertical distributions and column amounts, as well as the impact from land surface, are compared. These help identify challenges and opportunities for LEO/GEO satellite remote sensing and modeling of air quality in the northern hemisphere. Finally, we briefly show applications of this system on simulating Australian conditions, which would explore the needs for further development of the observing system in the southern hemisphere and inform the Clean Air and Urban Landscapes (https://www.nespurban.edu.au) modelers.
Rebuild America partner update, November--December 1998
Energy Technology Data Exchange (ETDEWEB)
NONE
1998-11-01
This issue of the update includes articles on retrofitting Duke University facilities, energy efficiency updates to buildings in Portland, Oregon, Salisbury, North Carolina, Hawaii, Roanoke-Chowan, Virginia, and energy savings centered designs for lighting systems.
Scalable Simulation of Electromagnetic Hybrid Codes
International Nuclear Information System (INIS)
Perumalla, Kalyan S.; Fujimoto, Richard; Karimabadi, Dr. Homa
2006-01-01
New discrete-event formulations of physics simulation models are emerging that can outperform models based on traditional time-stepped techniques. Detailed simulation of the Earth's magnetosphere, for example, requires execution of sub-models that are at widely differing timescales. In contrast to time-stepped simulation which requires tightly coupled updates to entire system state at regular time intervals, the new discrete event simulation (DES) approaches help evolve the states of sub-models on relatively independent timescales. However, parallel execution of DES-based models raises challenges with respect to their scalability and performance. One of the key challenges is to improve the computation granularity to offset synchronization and communication overheads within and across processors. Our previous work was limited in scalability and runtime performance due to the parallelization challenges. Here we report on optimizations we performed on DES-based plasma simulation models to improve parallel performance. The net result is the capability to simulate hybrid particle-in-cell (PIC) models with over 2 billion ion particles using 512 processors on supercomputing platforms
Federal Education Update, December 2004. Commission Update 04-17.
California Postsecondary Education Commission, 2004
2004-01-01
This update presents some of the major issues affecting education occurring at the national level. These include: Higher Education Act Extended for One Year; New Law Increases Loan Forgiveness for Teachers; Domestic Appropriations Measures Completed; Change in Federal Student Aid Rules; Bush Advisor Nominated To Be Education Secretary In Second…
Agent Communication for Dynamic Belief Update
Kobayashi, Mikito; Tojo, Satoshi
Thus far, various formalizations of rational / logical agent model have been proposed. In this paper, we include the notion of communication channel and belief modality into update logic, and introduce Belief Update Logic (BUL). First, we discuss that how we can reformalize the inform action of FIPA-ACL into communication channel, which represents a connection between agents. Thus, our agents can send a message only when they believe, and also there actually is, a channel between him / her and a receiver. Then, we present a static belief logic (BL) and show its soundness and completeness. Next, we develop the logic to BUL, which can update Kripke model by the inform action; in which we show that in the updated model the belief operator also satisfies K45. Thereafter, we show that every sentence in BUL can be translated into BL; thus, we can contend that BUL is also sound and complete. Furthermore, we discuss the features of CUL, including the case of inconsistent information, as well as channel transmission. Finally, we summarize our contribution and discuss some future issues.
Migrating to a real-time distributed parallel simulator architecture- An update
CSIR Research Space (South Africa)
Duvenhage, B
2007-09-01
Full Text Available A legacy non-distributed logical time simulator was previously migrated to a distributed architecture to parallelise execution. The existing Discrete Time System Specification (DTSS) modelling formalism was retained to simplify the reuse of existing...
Communication technology update and fundamentals
Grant, August E
2010-01-01
New communication technologies are being introduced at an astonishing rate. Making sense of these technologies is increasingly difficult. Communication Technology Update and Fundamentals is the single best source for the latest developments, trends, and issues in communication technology. Featuring the fundamental framework along with the history and background of communication technologies, Communication Technology Update and Fundamentals, 12th edition helps you stay ahead of these ever-changing and emerging technologies.As always, every chapter ha
Communication technology update and fundamentals
Grant, August E
2008-01-01
New communication technologies are being introduced at an astonishing rate. Making sense of these technologies is increasingly difficult. Communication Technology Update is the single best source for the latest developments, trends, and issues in communication technology. Now in its 11th edition, Communication Technology Update has become an indispensable information resource for business, government, and academia. As always, every chapter has been completely rewritten to reflect the latest developments and market statistics, and now covers mobile computing, dig
Updated safety analysis of ITER
Energy Technology Data Exchange (ETDEWEB)
Taylor, Neill, E-mail: neill.taylor@iter.org [ITER Organization, CS 90 046, 13067 St Paul Lez Durance Cedex (France); Baker, Dennis; Ciattaglia, Sergio; Cortes, Pierre; Elbez-Uzan, Joelle; Iseli, Markus; Reyes, Susana; Rodriguez-Rodrigo, Lina; Rosanvallon, Sandrine; Topilski, Leonid [ITER Organization, CS 90 046, 13067 St Paul Lez Durance Cedex (France)
2011-10-15
An updated version of the ITER Preliminary Safety Report has been produced and submitted to the licensing authorities. It is revised and expanded in response to requests from the authorities after their review of an earlier version in 2008, to reflect enhancements in ITER safety provisions through design changes, to incorporate new and improved safety analyses and to take into account other ITER design evolution. The updated analyses show that changes to the Tokamak cooling water system design have enhanced confinement and reduced potential radiological releases as well as removing decay heat with very high reliability. New and updated accident scenario analyses, together with fire and explosion risk analyses, have shown that design provisions are sufficient to minimize the likelihood of accidents and reduce potential consequences to a very low level. Taken together, the improvements provided a stronger demonstration of the very good safety performance of the ITER design.
Updated safety analysis of ITER
International Nuclear Information System (INIS)
Taylor, Neill; Baker, Dennis; Ciattaglia, Sergio; Cortes, Pierre; Elbez-Uzan, Joelle; Iseli, Markus; Reyes, Susana; Rodriguez-Rodrigo, Lina; Rosanvallon, Sandrine; Topilski, Leonid
2011-01-01
An updated version of the ITER Preliminary Safety Report has been produced and submitted to the licensing authorities. It is revised and expanded in response to requests from the authorities after their review of an earlier version in 2008, to reflect enhancements in ITER safety provisions through design changes, to incorporate new and improved safety analyses and to take into account other ITER design evolution. The updated analyses show that changes to the Tokamak cooling water system design have enhanced confinement and reduced potential radiological releases as well as removing decay heat with very high reliability. New and updated accident scenario analyses, together with fire and explosion risk analyses, have shown that design provisions are sufficient to minimize the likelihood of accidents and reduce potential consequences to a very low level. Taken together, the improvements provided a stronger demonstration of the very good safety performance of the ITER design.
Mining Sequential Update Summarization with Hierarchical Text Analysis
Directory of Open Access Journals (Sweden)
Chunyun Zhang
2016-01-01
Full Text Available The outbreak of unexpected news events such as large human accident or natural disaster brings about a new information access problem where traditional approaches fail. Mostly, news of these events shows characteristics that are early sparse and later redundant. Hence, it is very important to get updates and provide individuals with timely and important information of these incidents during their development, especially when being applied in wireless and mobile Internet of Things (IoT. In this paper, we define the problem of sequential update summarization extraction and present a new hierarchical update mining system which can broadcast with useful, new, and timely sentence-length updates about a developing event. The new system proposes a novel method, which incorporates techniques from topic-level and sentence-level summarization. To evaluate the performance of the proposed system, we apply it to the task of sequential update summarization of temporal summarization (TS track at Text Retrieval Conference (TREC 2013 to compute four measurements of the update mining system: the expected gain, expected latency gain, comprehensiveness, and latency comprehensiveness. Experimental results show that our proposed method has good performance.
Run-time Phenomena in Dynamic Software Updating: Causes and Effects
DEFF Research Database (Denmark)
Gregersen, Allan Raundahl; Jørgensen, Bo Nørregaard
2011-01-01
The development of a dynamic software updating system for statically-typed object-oriented programming languages has turned out to be a challenging task. Despite the fact that the present state of the art in dynamic updating systems, like JRebel, Dynamic Code Evolution VM, JVolve and Javeleon, all...... written in statically-typed object-oriented programming languages. In this paper, we present our experience from developing dynamically updatable applications using a state-of-the-art dynamic updating system for Java. We believe that the findings presented in this paper provide an important step towards...... provide very transparent and flexible technical solutions to dynamic updating, case studies have shown that designing dynamically updatable applications still remains a challenging task. This challenge has its roots in a number of run-time phenomena that are inherent to dynamic updating of applications...
34 CFR 668.55 - Updating information.
2010-07-01
... 34 Education 3 2010-07-01 2010-07-01 false Updating information. 668.55 Section 668.55 Education... Information § 668.55 Updating information. (a)(1) Unless the provisions of paragraph (a)(2) or (a)(3) of this... applicant to verify the information contained in his or her application for assistance in an award year if...
Choosing the speed of dynamic mental simulations.
Makin, Alexis D J
2017-01-01
The brain continuously maintains a current representation of its immediate surroundings. Perceptual representations are often updated when the world changes, e.g., when we notice an object move. However, we can also update representations internally, without incoming signals from the senses. In other words, we can run internal simulations of dynamic events. This ability is evident during mental object rotation. These uncontroversial observations lead to an obvious question that nevertheless remains to be answered: How does the brain control the speed of dynamic mental simulations? Is there a central rate controller or pacemaker module in the brain that can be temporarily coupled to sensory maps? We can refer to this as the common rate control theory. Alternatively, the primitive intelligence within each map could tune into the speed of recent changes and use this information to keep going after stimuli disappear. We can call this the separate rate control theory. Preliminary evidence from prediction motion experiments supports common rate control, although local predictive mechanisms may cover short gaps of cognitive timing literature. Indirect neuroimaging evidence suggests rate control is a function of the core timing system in the dorsal striatum. © 2017 Elsevier B.V. All rights reserved.
An update on the BQCD Hybrid Monte Carlo program
Directory of Open Access Journals (Sweden)
Haar Taylor Ryan
2018-01-01
Full Text Available We present an update of BQCD, our Hybrid Monte Carlo program for simulating lattice QCD. BQCD is one of the main production codes of the QCDSF collaboration and is used by CSSM and in some Japanese finite temperature and finite density projects. Since the first publication of the code at Lattice 2010 the program has been extended in various ways. New features of the code include: dynamical QED, action modification in order to compute matrix elements by using Feynman-Hellman theory, more trace measurements (like Tr(D-n for K, cSW and chemical potential reweighting, a more flexible integration scheme, polynomial filtering, term-splitting for RHMC, and a portable implementation of performance critical parts employing SIMD.
Towards Dynamic Updates in Service Composition
Directory of Open Access Journals (Sweden)
Mario Bravetti
2015-12-01
Full Text Available We survey our results about verification of adaptable processes. We present adaptable processes as a way of overcoming the limitations that process calculi have for describing patterns of dynamic process evolution. Such patterns rely on direct ways of controlling the behavior and location of running processes, and so they are at the heart of the adaptation capabilities present in many modern concurrent systems. Adaptable processes have named scopes and are sensible to actions of dynamic update at runtime; this allows to express dynamic and static topologies of adaptable processes as well as different evolvability patterns for concurrent processes. We introduce a core calculus of adaptable processes and consider verification problems for them: first based on specific properties related to error occurrence, that we call bounded and eventual adaptation, and then by considering a simple yet expressive temporal logic over adaptable processes. We provide (undecidability results of such verification problems over adaptable processes considering the spectrum of topologies/evolvability patterns introduced. We then consider distributed adaptability, where a process can update part of a protocol by performing dynamic distributed updates over a set of protocol participants. Dynamic updates in this context are presented as an extension of our work on choreographies and behavioural contracts in multiparty interactions. We show how update mechanisms considered for adaptable processes can be used to extend the theory of choreography and orchestration/contracts, allowing them to be modified at run-time by internal (self-adaptation or external intervention.
Kessler, Yoav; Baruchin, Liad J; Bouhsira-Sabag, Anat
2017-01-01
Theoretical models suggest that maintenance and updating are two functional states of working memory (WM), which are controlled by a gate between perceptual information and WM representations. Opening the gate enables updating WM with input, while closing it enables keeping the maintained information shielded from interference. However, it is still unclear when gate opening takes place, and what is the external signal that triggers it. A version of the AX-CPT paradigm was used to examine a recent proposal in the literature, suggesting that updating is triggered whenever the maintenance of the context is necessary for task performance (context-dependent tasks). In four experiments using this paradigm, we show that (1) a task-switching cost takes place in both context-dependent and context-independent trials; (2) task-switching is additive to the dependency effect, and (3) unlike switching cost, the dependency effect is not affected by preparation and, therefore, does not reflect context-updating. We suggest that WM updating is likely to be triggered by a simple mechanism that occurs in each trial of the task regardless of whether maintaining the context is needed or not. The implications for WM updating and its relationship to task-switching are discussed.
Numerical model updating technique for structures using firefly algorithm
Sai Kubair, K.; Mohan, S. C.
2018-03-01
Numerical model updating is a technique used for updating the existing experimental models for any structures related to civil, mechanical, automobiles, marine, aerospace engineering, etc. The basic concept behind this technique is updating the numerical models to closely match with experimental data obtained from real or prototype test structures. The present work involves the development of numerical model using MATLAB as a computational tool and with mathematical equations that define the experimental model. Firefly algorithm is used as an optimization tool in this study. In this updating process a response parameter of the structure has to be chosen, which helps to correlate the numerical model developed with the experimental results obtained. The variables for the updating can be either material or geometrical properties of the model or both. In this study, to verify the proposed technique, a cantilever beam is analyzed for its tip deflection and a space frame has been analyzed for its natural frequencies. Both the models are updated with their respective response values obtained from experimental results. The numerical results after updating show that there is a close relationship that can be brought between the experimental and the numerical models.
DEFF Research Database (Denmark)
Rendtorff, Jacob Dahl
2015-01-01
This paper presents an update of the research on European bioethics undertaken by the author together with Professor Peter Kemp since the 1990s, on Basic ethical principles in European bioethics and biolaw. In this European approach to basic ethical principles in bioethics and biolaw......, the principles of autonomy, dignity, integrity and vulnerability are proposed as the most important ethical principles for respect for the human person in biomedical and biotechnological development. This approach to bioethics and biolaw is presented here in a short updated version that integrates the earlier...... research in a presentation of the present understanding of the basic ethical principles in bioethics and biolaw....
Robot Visual Tracking via Incremental Self-Updating of Appearance Model
Directory of Open Access Journals (Sweden)
Danpei Zhao
2013-09-01
Full Text Available This paper proposes a target tracking method called Incremental Self-Updating Visual Tracking for robot platforms. Our tracker treats the tracking problem as a binary classification: the target and the background. The greyscale, HOG and LBP features are used in this work to represent the target and are integrated into a particle filter framework. To track the target over long time sequences, the tracker has to update its model to follow the most recent target. In order to deal with the problems of calculation waste and lack of model-updating strategy with the traditional methods, an intelligent and effective online self-updating strategy is devised to choose the optimal update opportunity. The strategy of updating the appearance model can be achieved based on the change in the discriminative capability between the current frame and the previous updated frame. By adjusting the update step adaptively, severe waste of calculation time for needless updates can be avoided while keeping the stability of the model. Moreover, the appearance model can be kept away from serious drift problems when the target undergoes temporary occlusion. The experimental results show that the proposed tracker can achieve robust and efficient performance in several benchmark-challenging video sequences with various complex environment changes in posture, scale, illumination and occlusion.
International Nuclear Information System (INIS)
Larson, H.A.; Dean, E.M.; Koenig, J.F.; Gale, J.G.; Lehto, W.K.
1984-01-01
The DSNP Simulation Language facilitates whole reactor plant simulation and design. Verification includes DSNP dynamic modeling of Experimental Breeder Reactor No. 2 (EBR-II) plant experiments as well as comparisons with verified simulation programs. Great flexibility is allowed in expanding the DSNP language and accommodate other computer languages. The component modules of DSNP, contained in libraries, are continually updated with new, improved, and verified modules. The modules are used to simulate the dynamic response of LMFBR reactor systems to upset and transient conditions, with special emphasis on investigations of inherent shutdown mechanisms
Garfjeld Roberts, Patrick; Guyver, Paul; Baldwin, Mathew; Akhtar, Kash; Alvand, Abtin; Price, Andrew J; Rees, Jonathan L
2017-02-01
To assess the construct and face validity of ArthroS, a passive haptic VR simulator. A secondary aim was to evaluate the novel performance metrics produced by this simulator. Two groups of 30 participants, each divided into novice, intermediate or expert based on arthroscopic experience, completed three separate tasks on either the knee or shoulder module of the simulator. Performance was recorded using 12 automatically generated performance metrics and video footage of the arthroscopic procedures. The videos were blindly assessed using a validated global rating scale (GRS). Participants completed a survey about the simulator's realism and training utility. This new simulator demonstrated construct validity of its tasks when evaluated against a GRS (p ≤ 0.003 in all cases). Regarding it's automatically generated performance metrics, established outputs such as time taken (p ≤ 0.001) and instrument path length (p ≤ 0.007) also demonstrated good construct validity. However, two-thirds of the proposed 'novel metrics' the simulator reports could not distinguish participants based on arthroscopic experience. Face validity assessment rated the simulator as a realistic and useful tool for trainees, but the passive haptic feedback (a key feature of this simulator) is rated as less realistic. The ArthroS simulator has good task construct validity based on established objective outputs, but some of the novel performance metrics could not distinguish between surgical experience. The passive haptic feedback of the simulator also needs improvement. If simulators could offer automated and validated performance feedback, this would facilitate improvements in the delivery of training by allowing trainees to practise and self-assess.
An Update on Improvements to NiCE Support for RELAP-7
Energy Technology Data Exchange (ETDEWEB)
McCaskey, Alex [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wojtowicz, Anna [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Deyton, Jordan H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Patterson, Taylor C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Billings, Jay Jay [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
2015-01-01
The Multiphysics Object-Oriented Simulation Environment (MOOSE) is a framework that facilitates the development of applications that rely on finite-element analysis to solve a coupled, nonlinear system of partial differential equations. RELAP-7 represents an update to the venerable RELAP-5 simulator that is built upon this framework and attempts to model the balance-of-plant concerns in a full nuclear plant. This report details the continued support and integration of RELAP-7 and the NEAMS Integrated Computational Environment (NiCE). RELAP-7 is fully supported by the NiCE due to on-going work to tightly integrate NiCE with the MOOSE framework, and subsequently the applications built upon it. NiCE development throughout the first quarter of FY15 has focused on improvements, bug fixes, and feature additions to existing MOOSE-based application support. Specifically, this report will focus on improvements to the NiCE MOOSE Model Builder, the MOOSE application job launcher, and the 3D Nuclear Plant Viewer. This report also includes a comprehensive tutorial that guides RELAP-7 users through the basic NiCE workflow: from input generation and 3D Plant modeling, to massively parallel job launch and post-simulation data visualization.
Minnesota's forest statistics, 1987: an inventory update.
Jerold T. Hahn; W. Brad Smith
1987-01-01
The Minnesota 1987 inventory update, derived by using tree growth models, reports 13.5 million acres of timberland, a decline of less than 1% since 1977. This bulletin presents findings from the inventory update in tables detailing timer land area, volume, and biomass.
Wisconsin's forest statistics, 1987: an inventory update.
W. Brad Smith; Jerold T. Hahn
1989-01-01
The Wisconsin 1987 inventory update, derived by using tree growth models, reports 14.7 million acres of timberland, a decline of less than 1% since 1983. This bulletin presents findings from the inventory update in tables detailing timberland area, volume, and biomass.
Research on Topographic Map Updating
Directory of Open Access Journals (Sweden)
Ivana Javorović
2013-04-01
Full Text Available The investigation of interpretability of panchromatic satellite image IRS-1C integrated with multispectral Landsat TM image with the purpose of updating the topographic map sheet at the scale of 1:25 000 has been described. The geocoding of source map was based on trigonometric points of the map sheet. Satellite images were geocoded using control points selected from the map. The contents of map have been vectorized and topographic database designed. The digital image processing improved the interpretability of images. Then, the vectorization of new contents was made. The change detection of the forest and water area was defined by using unsupervised classification of spatial and spectral merged images. Verification of the results was made using corresponding aerial photographs. Although this methodology could not insure the complete updating of topographic map at the scale of 1:25 000, the database has been updated with huge amount of data. Erdas Imagine 8.3. software was used.
Jalali, Mohammad; Ramazi, Hamidreza
2018-04-01
This article is devoted to application of a simulation algorithm based on geostatistical methods to compile and update seismotectonic provinces in which Iran has been chosen as a case study. Traditionally, tectonic maps together with seismological data and information (e.g., earthquake catalogues, earthquake mechanism, and microseismic data) have been used to update seismotectonic provinces. In many cases, incomplete earthquake catalogues are one of the important challenges in this procedure. To overcome this problem, a geostatistical simulation algorithm, turning band simulation, TBSIM, was applied to make a synthetic data to improve incomplete earthquake catalogues. Then, the synthetic data was added to the traditional information to study the seismicity homogeneity and classify the areas according to tectonic and seismic properties to update seismotectonic provinces. In this paper, (i) different magnitude types in the studied catalogues have been homogenized to moment magnitude (Mw), and earthquake declustering was then carried out to remove aftershocks and foreshocks; (ii) time normalization method was introduced to decrease the uncertainty in a temporal domain prior to start the simulation procedure; (iii) variography has been carried out in each subregion to study spatial regressions (e.g., west-southwestern area showed a spatial regression from 0.4 to 1.4 decimal degrees; the maximum range identified in the azimuth of 135 ± 10); (iv) TBSIM algorithm was then applied to make simulated events which gave rise to make 68,800 synthetic events according to the spatial regression found in several directions; (v) simulated events (i.e., magnitudes) were classified based on their intensity in ArcGIS packages and homogenous seismic zones have been determined. Finally, according to the synthetic data, tectonic features, and actual earthquake catalogues, 17 seismotectonic provinces were introduced in four major classes introduced as very high, high, moderate, and low
International Nuclear Information System (INIS)
Marchetti, Luca; Priami, Corrado; Thanh, Vo Hong
2016-01-01
This paper introduces HRSSA (Hybrid Rejection-based Stochastic Simulation Algorithm), a new efficient hybrid stochastic simulation algorithm for spatially homogeneous biochemical reaction networks. HRSSA is built on top of RSSA, an exact stochastic simulation algorithm which relies on propensity bounds to select next reaction firings and to reduce the average number of reaction propensity updates needed during the simulation. HRSSA exploits the computational advantage of propensity bounds to manage time-varying transition propensities and to apply dynamic partitioning of reactions, which constitute the two most significant bottlenecks of hybrid simulation. A comprehensive set of simulation benchmarks is provided for evaluating performance and accuracy of HRSSA against other state of the art algorithms.
Energy Technology Data Exchange (ETDEWEB)
Marchetti, Luca, E-mail: marchetti@cosbi.eu [The Microsoft Research – University of Trento Centre for Computational and Systems Biology (COSBI), Piazza Manifattura, 1, 38068 Rovereto (Italy); Priami, Corrado, E-mail: priami@cosbi.eu [The Microsoft Research – University of Trento Centre for Computational and Systems Biology (COSBI), Piazza Manifattura, 1, 38068 Rovereto (Italy); University of Trento, Department of Mathematics (Italy); Thanh, Vo Hong, E-mail: vo@cosbi.eu [The Microsoft Research – University of Trento Centre for Computational and Systems Biology (COSBI), Piazza Manifattura, 1, 38068 Rovereto (Italy)
2016-07-15
This paper introduces HRSSA (Hybrid Rejection-based Stochastic Simulation Algorithm), a new efficient hybrid stochastic simulation algorithm for spatially homogeneous biochemical reaction networks. HRSSA is built on top of RSSA, an exact stochastic simulation algorithm which relies on propensity bounds to select next reaction firings and to reduce the average number of reaction propensity updates needed during the simulation. HRSSA exploits the computational advantage of propensity bounds to manage time-varying transition propensities and to apply dynamic partitioning of reactions, which constitute the two most significant bottlenecks of hybrid simulation. A comprehensive set of simulation benchmarks is provided for evaluating performance and accuracy of HRSSA against other state of the art algorithms.
ROBOSIM, a simulator for robotic systems
Hinman, Elaine M.; Fernandez, Ken; Cook, George E.
1991-01-01
ROBOSIM, a simulator for robotic systems, was developed by NASA to aid in the rapid prototyping of automation. ROBOSIM has allowed the development of improved robotic systems concepts for both earth-based and proposed on-orbit applications while significantly reducing development costs. In a cooperative effort with an area university, ROBOSIM was further developed for use in the classroom as a safe and cost-effective way of allowing students to study robotic systems. Students have used ROBOSIM to study existing robotic systems and systems which they have designed in the classroom. Since an advanced simulator/trainer of this type is beneficial not only to NASA projects and programs but industry and academia as well, NASA is in the process of developing this technology for wider public use. An update on the simulators's new application areas, the improvements made to the simulator's design, and current efforts to ensure the timely transfer of this technology are presented.
Software Updating in Wireless Sensor Networks: A Survey and Lacunae
Directory of Open Access Journals (Sweden)
Cormac J. Sreenan
2013-11-01
Full Text Available Wireless Sensor Networks are moving out of the laboratory and into the field. For a number of reasons there is often a need to update sensor node software, or node configuration, after deployment. The need for over-the-air updates is driven both by the scale of deployments, and by the remoteness and inaccessibility of sensor nodes. This need has been recognized since the early days of sensor networks, and research results from the related areas of mobile networking and distributed systems have been applied to this area. In order to avoid any manual intervention, the update process needs to be autonomous. This paper presents a comprehensive survey of software updating in Wireless Sensor Networks, and analyses the features required to make these updates autonomous. A new taxonomy of software update features and a new model for fault detection and recovery are presented. The paper concludes by identifying the lacunae relating to autonomous software updates, providing direction for future research.
Two-way coupling of magnetohydrodynamic simulations with embedded particle-in-cell simulations
Makwana, K. D.; Keppens, R.; Lapenta, G.
2017-12-01
We describe a method for coupling an embedded domain in a magnetohydrodynamic (MHD) simulation with a particle-in-cell (PIC) method. In this two-way coupling we follow the work of Daldorff et al. (2014) [19] in which the PIC domain receives its initial and boundary conditions from MHD variables (MHD to PIC coupling) while the MHD simulation is updated based on the PIC variables (PIC to MHD coupling). This method can be useful for simulating large plasma systems, where kinetic effects captured by particle-in-cell simulations are localized but affect global dynamics. We describe the numerical implementation of this coupling, its time-stepping algorithm, and its parallelization strategy, emphasizing the novel aspects of it. We test the stability and energy/momentum conservation of this method by simulating a steady-state plasma. We test the dynamics of this coupling by propagating plasma waves through the embedded PIC domain. Coupling with MHD shows satisfactory results for the fast magnetosonic wave, but significant distortion for the circularly polarized Alfvén wave. Coupling with Hall-MHD shows excellent coupling for the whistler wave. We also apply this methodology to simulate a Geospace Environmental Modeling (GEM) challenge type of reconnection with the diffusion region simulated by PIC coupled to larger scales with MHD and Hall-MHD. In both these cases we see the expected signatures of kinetic reconnection in the PIC domain, implying that this method can be used for reconnection studies.
77 FR 41258 - FOIA Fee Schedule Update
2012-07-13
... DEFENSE NUCLEAR FACILITIES SAFETY BOARD 10 CFR Part 1703 FOIA Fee Schedule Update AGENCY: Defense Nuclear Facilities Safety Board. ACTION: Establishment of FOIA Fee Schedule. SUMMARY: The Defense Nuclear Facilities Safety Board is publishing its Freedom of Information Act (FOIA) Fee Schedule Update pursuant to...
76 FR 43819 - FOIA Fee Schedule Update
2011-07-22
... DEFENSE NUCLEAR FACILITIES SAFETY BOARD 10 CFR Part 1703 FOIA Fee Schedule Update AGENCY: Defense Nuclear Facilities Safety Board. ACTION: Establishment of FOIA Fee Schedule. SUMMARY: The Defense Nuclear Facilities Safety Board is publishing its Freedom of Information Act (FOIA) Fee Schedule Update pursuant to...
Lang, S. E.; Tao, W. K.; Wu, D.
2016-12-01
The Goddard Convective-Stratiform Heating (or CSH) algorithm is used to retrieve estimates of cloud heating over the global Tropics using TRMM rainfall data and a set of look-up-tables (LUTs) derived from a series of multi-week cloud-resolving model (CRM) simulations using the Goddard Cumulus Ensemble model (or GCE). These simulations link satellite observables (i.e., surface rainfall and stratiform fraction) with cloud heating profiles, which are not directly observable. The strength of the algorithm relies in part on the representativeness of the simulations; more realistic simulations provide a stronger link between the observables and simulated heating profiles. The current "TRMM" version of the CSH algorithm relies on 2D GCE simulations using an improved version of the Goddard 3-class ice scheme (3ICE), a moderate-sized domain, and 1-km horizontal resolution. Updating the LUTs, which are suitable for tropical and continental summertime environments requires new, more realistic GCE simulations. New simulations are performed using a new, improved 4-class ice scheme, which has been shown to outperform the 3ICE scheme, especially for intense convection. Additional grid configurations are also tested and evaluated to find the best overall setup to for re-deriving and updating the CSH tropical/summertime LUTs.
Exploring cluster Monte Carlo updates with Boltzmann machines.
Wang, Lei
2017-11-01
Boltzmann machines are physics informed generative models with broad applications in machine learning. They model the probability distribution of an input data set with latent variables and generate new samples accordingly. Applying the Boltzmann machines back to physics, they are ideal recommender systems to accelerate the Monte Carlo simulation of physical systems due to their flexibility and effectiveness. More intriguingly, we show that the generative sampling of the Boltzmann machines can even give different cluster Monte Carlo algorithms. The latent representation of the Boltzmann machines can be designed to mediate complex interactions and identify clusters of the physical system. We demonstrate these findings with concrete examples of the classical Ising model with and without four-spin plaquette interactions. In the future, automatic searches in the algorithm space parametrized by Boltzmann machines may discover more innovative Monte Carlo updates.
Exploring cluster Monte Carlo updates with Boltzmann machines
Wang, Lei
2017-11-01
Boltzmann machines are physics informed generative models with broad applications in machine learning. They model the probability distribution of an input data set with latent variables and generate new samples accordingly. Applying the Boltzmann machines back to physics, they are ideal recommender systems to accelerate the Monte Carlo simulation of physical systems due to their flexibility and effectiveness. More intriguingly, we show that the generative sampling of the Boltzmann machines can even give different cluster Monte Carlo algorithms. The latent representation of the Boltzmann machines can be designed to mediate complex interactions and identify clusters of the physical system. We demonstrate these findings with concrete examples of the classical Ising model with and without four-spin plaquette interactions. In the future, automatic searches in the algorithm space parametrized by Boltzmann machines may discover more innovative Monte Carlo updates.
Simulated Annealing-Based Krill Herd Algorithm for Global Optimization
Directory of Open Access Journals (Sweden)
Gai-Ge Wang
2013-01-01
Full Text Available Recently, Gandomi and Alavi proposed a novel swarm intelligent method, called krill herd (KH, for global optimization. To enhance the performance of the KH method, in this paper, a new improved meta-heuristic simulated annealing-based krill herd (SKH method is proposed for optimization tasks. A new krill selecting (KS operator is used to refine krill behavior when updating krill’s position so as to enhance its reliability and robustness dealing with optimization problems. The introduced KS operator involves greedy strategy and accepting few not-so-good solutions with a low probability originally used in simulated annealing (SA. In addition, a kind of elitism scheme is used to save the best individuals in the population in the process of the krill updating. The merits of these improvements are verified by fourteen standard benchmarking functions and experimental results show that, in most cases, the performance of this improved meta-heuristic SKH method is superior to, or at least highly competitive with, the standard KH and other optimization methods.
Environmental Regulatory Update Table, December 1989
International Nuclear Information System (INIS)
Houlbert, L.M.; Langston, M.E.; Nikbakht, A.; Salk, M.S.
1990-01-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action
Environmental regulatory update table, March 1989
International Nuclear Information System (INIS)
Houlberg, L.; Langston, M.E.; Nikbakht, A.; Salk, M.S.
1989-04-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action
Environmental Regulatory Update Table, April 1989
International Nuclear Information System (INIS)
Houlberg, L.; Langston, M.E.; Nikbakht, A.; Salk, M.S.
1989-05-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action
Environmental Regulatory Update Table, December 1991
Energy Technology Data Exchange (ETDEWEB)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.
1992-01-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, August 1990
International Nuclear Information System (INIS)
Houlberg, L.M.; Nikbakht, A.; Salk, M.S.
1990-09-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action
Environmental Regulatory Update Table, October 1991
Energy Technology Data Exchange (ETDEWEB)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.
1991-11-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, November 1991
Energy Technology Data Exchange (ETDEWEB)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.
1991-12-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, September 1991
Energy Technology Data Exchange (ETDEWEB)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.
1991-10-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
How Do We Update Faces? Effects of Gaze Direction and Facial Expressions on Working Memory Updating
Artuso, Caterina; Palladino, Paola; Ricciardelli, Paola
2012-01-01
The aim of the study was to investigate how the biological binding between different facial dimensions, and their social and communicative relevance, may impact updating processes in working memory (WM). We focused on WM updating because it plays a key role in ongoing processing. Gaze direction and facial expression are crucial and changeable components of face processing. Direct gaze enhances the processing of approach-oriented facial emotional expressions (e.g., joy), while averted gaze enh...
Key Update Assistant for Resource-Constrained Networks
DEFF Research Database (Denmark)
Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming
2012-01-01
developed a push-button solution - powered by stochastic model checking - that network designers can easily benefit from, and it paves the way for consumers to set up key update related security parameters. Key Update Assistant, as we named it, runs necessary model checking operations and determines...
Guillermo A. Mendoza; Roger J. Meimban; Philip A. Araman; William G. Luppold
1991-01-01
A log inventory model and a real-time hardwood process simulation model were developed and combined into an integrated production planning and control system for hardwood sawmills. The log inventory model was designed to monitor and periodically update the status of the logs in the log yard. The process simulation model was designed to estimate various sawmill...
Systems Thinking and Simulation Modeling to Inform Childhood Obesity Policy and Practice.
Powell, Kenneth E; Kibbe, Debra L; Ferencik, Rachel; Soderquist, Chris; Phillips, Mary Ann; Vall, Emily Anne; Minyard, Karen J
In 2007, 31.7% of Georgia adolescents in grades 9-12 were overweight or obese. Understanding the impact of policies and interventions on obesity prevalence among young people can help determine statewide public health and policy strategies. This article describes a systems model, originally launched in 2008 and updated in 2014, that simulates the impact of policy interventions on the prevalence of childhood obesity in Georgia through 2034. In 2008, using information from peer-reviewed reports and quantitative estimates by experts in childhood obesity, physical activity, nutrition, and health economics and policy, a group of legislators, legislative staff members, and experts trained in systems thinking and system dynamics modeling constructed a model simulating the impact of policy interventions on the prevalence of childhood obesity in Georgia through 2034. Use of the 2008 model contributed to passage of a bill requiring annual fitness testing of schoolchildren and stricter enforcement of physical education requirements. We updated the model in 2014. With no policy change, the updated model projects that the prevalence of obesity among children and adolescents aged ≤18 in Georgia would hold at 18% from 2014 through 2034. Mandating daily school physical education (which would reduce prevalence to 12%) and integrating moderate to vigorous physical activity into elementary classrooms (which would reduce prevalence to 10%) would have the largest projected impact. Enacting all policies simultaneously would lower the prevalence of childhood obesity from 18% to 3%. Systems thinking, especially with simulation models, facilitates understanding of complex health policy problems. Using a simulation model to educate legislators, educators, and health experts about the policies that have the greatest short- and long-term impact should encourage strategic investment in low-cost, high-return policies.
Competency-Based Training and Simulation: Making a "Valid" Argument.
Noureldin, Yasser A; Lee, Jason Y; McDougall, Elspeth M; Sweet, Robert M
2018-02-01
The use of simulation as an assessment tool is much more controversial than is its utility as an educational tool. However, without valid simulation-based assessment tools, the ability to objectively assess technical skill competencies in a competency-based medical education framework will remain challenging. The current literature in urologic simulation-based training and assessment uses a definition and framework of validity that is now outdated. This is probably due to the absence of awareness rather than an absence of comprehension. The following review article provides the urologic community an updated taxonomy on validity theory as it relates to simulation-based training and assessments and translates our simulation literature to date into this framework. While the old taxonomy considered validity as distinct subcategories and focused on the simulator itself, the modern taxonomy, for which we translate the literature evidence, considers validity as a unitary construct with a focus on interpretation of simulator data/scores.
Automated Boundary Conditions for Wind Tunnel Simulations
Carlson, Jan-Renee
2018-01-01
Computational fluid dynamic (CFD) simulations of models tested in wind tunnels require a high level of fidelity and accuracy particularly for the purposes of CFD validation efforts. Considerable effort is required to ensure the proper characterization of both the physical geometry of the wind tunnel and recreating the correct flow conditions inside the wind tunnel. The typical trial-and-error effort used for determining the boundary condition values for a particular tunnel configuration are time and computer resource intensive. This paper describes a method for calculating and updating the back pressure boundary condition in wind tunnel simulations by using a proportional-integral-derivative controller. The controller methodology and equations are discussed, and simulations using the controller to set a tunnel Mach number in the NASA Langley 14- by 22-Foot Subsonic Tunnel are demonstrated.
Environmental Regulatory Update Table, August 1991
Energy Technology Data Exchange (ETDEWEB)
Houlberg, L.M., Hawkins, G.T.; Salk, M.S.
1991-09-01
This Environmental Regulatory Update Table (August 1991) provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental regulatory update table, July 1991
Energy Technology Data Exchange (ETDEWEB)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.
1991-08-01
This Environmental Regulatory Update Table (July 1991) provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Digitalization and networking of analog simulators and portal images
Energy Technology Data Exchange (ETDEWEB)
Pesznyak, C.; Zarand, P.; Mayer, A. [Uzsoki Hospital, Budapest (Hungary). Inst. of Oncoradiology
2007-03-15
Background: Many departments have analog simulators and irradiation facilities (especially cobalt units) without electronic portal imaging. Import of the images into the R and V (Record and Verify) system is required. Material and Methods: Simulator images are grabbed while portal films scanned by using a laser scanner and both converted into DICOM RT (Digital Imaging and Communications in Medicine Radiotherapy) images. Results: Image intensifier output of a simulator and portal films are converted to DICOM RT images and used in clinical practice. The simulator software was developed in cooperation at the authors' hospital. Conclusion: The digitalization of analog simulators is a valuable updating in clinical use replacing screen-film technique. Film scanning and digitalization permit the electronic archiving of films. Conversion into DICOM RT images is a precondition of importing to the R and V system. (orig.)
Directory of Open Access Journals (Sweden)
Ezri Tiberiu
2016-01-01
Full Text Available The purpose of this update is to provide recent knowledge and debates regarding the use of sugammadex in the fields of anesthesia and critical care. The review is not intended to provide a comprehensive description of sugammadex and its clinical use.
"Updates to Model Algorithms & Inputs for the Biogenic ...
We have developed new canopy emission algorithms and land use data for BEIS. Simulations with BEIS v3.4 and these updates in CMAQ v5.0.2 are compared these changes to the Model of Emissions of Gases and Aerosols from Nature (MEGAN) and evaluated the simulations against observations. This has resulted in improvements in model evaluations of modeled isoprene, NOx, and O3. The National Exposure Research Laboratory (NERL) Atmospheric Modeling and Analysis Division (AMAD) conducts research in support of EPA mission to protect human health and the environment. AMAD research program is engaged in developing and evaluating predictive atmospheric models on all spatial and temporal scales for forecasting the air quality and for assessing changes in air quality and air pollutant exposures, as affected by changes in ecosystem management and regulatory decisions. AMAD is responsible for providing a sound scientific and technical basis for regulatory policies based on air quality models to improve ambient air quality. The models developed by AMAD are being used by EPA, NOAA, and the air pollution community in understanding and forecasting not only the magnitude of the air pollution problem, but also in developing emission control policies and regulations for air quality improvements.
Utilization of negative beat-frequencies for maximizing the update-rate of OFDR
Gabai, Haniel; Botsev, Yakov; Hahami, Meir; Eyal, Avishay
2015-07-01
In traditional OFDR systems, the backscattered profile of a sensing fiber is inefficiently duplicated to the negative band of spectrum. In this work, we present a new OFDR design and algorithm that remove this redundancy and make use of negative beat frequencies. In contrary to conventional OFDR designs, it facilitates efficient use of the available system bandwidth and enables distributed sensing with the maximum allowable interrogation update-rate for a given fiber length. To enable the reconstruction of negative beat frequencies an I/Q type receiver is used. In this receiver, both the in-phase (I) and quadrature (Q) components of the backscatter field are detected. Following detection, both components are digitally combined to produce a complex backscatter signal. Accordingly, due to its asymmetric nature, the produced spectrum will not be corrupted by the appearance of negative beat-frequencies. Here, via a comprehensive computer simulation, we show that in contrast to conventional OFDR systems, I/Q OFDR can be operated at maximum interrogation update-rate for a given fiber length. In addition, we experimentally demonstrate, for the first time, the ability of I/Q OFDR to utilize negative beat-frequencies for long-range distributed sensing.
Non-linear Bayesian update of PCE coefficients
Litvinenko, Alexander
2014-01-06
Given: a physical system modeled by a PDE or ODE with uncertain coefficient q(?), a measurement operator Y (u(q), q), where u(q, ?) uncertain solution. Aim: to identify q(?). The mapping from parameters to observations is usually not invertible, hence this inverse identification problem is generally ill-posed. To identify q(!) we derived non-linear Bayesian update from the variational problem associated with conditional expectation. To reduce cost of the Bayesian update we offer a unctional approximation, e.g. polynomial chaos expansion (PCE). New: We apply Bayesian update to the PCE coefficients of the random coefficient q(?) (not to the probability density function of q).
Non-linear Bayesian update of PCE coefficients
Litvinenko, Alexander; Matthies, Hermann G.; Pojonk, Oliver; Rosic, Bojana V.; Zander, Elmar
2014-01-01
Given: a physical system modeled by a PDE or ODE with uncertain coefficient q(?), a measurement operator Y (u(q), q), where u(q, ?) uncertain solution. Aim: to identify q(?). The mapping from parameters to observations is usually not invertible, hence this inverse identification problem is generally ill-posed. To identify q(!) we derived non-linear Bayesian update from the variational problem associated with conditional expectation. To reduce cost of the Bayesian update we offer a unctional approximation, e.g. polynomial chaos expansion (PCE). New: We apply Bayesian update to the PCE coefficients of the random coefficient q(?) (not to the probability density function of q).
Modernizing the ATLAS simulation infrastructure
Di Simone, A.; CollaborationAlbert-Ludwigs-Universitt Freiburg, ATLAS; Institut, Physikalisches; Br., 79104 Freiburg i.; Germany
2017-10-01
The ATLAS Simulation infrastructure has been used to produce upwards of 50 billion proton-proton collision events for analyses ranging from detailed Standard Model measurements to searches for exotic new phenomena. In the last several years, the infrastructure has been heavily revised to allow intuitive multithreading and significantly improved maintainability. Such a massive update of a legacy code base requires careful choices about what pieces of code to completely rewrite and what to wrap or revise. The initialization of the complex geometry was generalized to allow new tools and geometry description languages, popular in some detector groups. The addition of multithreading requires Geant4-MT and GaudiHive, two frameworks with fundamentally different approaches to multithreading, to work together. It also required enforcing thread safety throughout a large code base, which required the redesign of several aspects of the simulation, including truth, the record of particle interactions with the detector during the simulation. These advances were possible thanks to close interactions with the Geant4 developers.
Real-time Global Illumination by Simulating Photon Mapping
DEFF Research Database (Denmark)
Larsen, Bent Dalgaard
2004-01-01
This thesis introduces a new method for simulating photon mapping in realtime. The method uses a variety of both CPU and GPU based algorithms for speeding up the different elements in global illumination. The idea behind the method is to calculate each illumination element individually in a progr......This thesis introduces a new method for simulating photon mapping in realtime. The method uses a variety of both CPU and GPU based algorithms for speeding up the different elements in global illumination. The idea behind the method is to calculate each illumination element individually...... in a progressive and efficient manner. This has been done by analyzing the photon mapping method and by selecting efficient methods, either CPU based or GPU based, which replaces the original photon mapping algorithms. We have chosen to focus on the indirect illumination and the caustics. In our method we first...... divide the photon map into several photon maps in order to make local updates possible. Then indirect illumination is added using light maps that are selectively updated by using selective photon tracing on the CPU. The final gathering step is calculated by using fragment programs and GPU based...
Update of CERN exchange network
2003-01-01
An update of the CERN exchange network will be done next April. Disturbances or even interruptions of telephony services may occur from 4th to 24th April during evenings from 18:30 to 00:00 but will not exceed more than 4 consecutive hours (see tentative planning below). In addition, the voice messaging system will be shut down on March, 26th April from 18:00 to 00:00. Calls supposed to be routed to the voice messaging system will not be possible during the shutdown. CERN divisions are invited to avoid any change requests (set-ups, move or removals) of telephones and fax machines from 4th to 25th April. Everything will be done to minimize potential inconveniences which may occur during this update. There will be no loss of telephone functionalities. CERN GSM portable phones won't be affected by this change. Should you need more details, please send us your questions by email to Standard.Telephone@cern.ch. Date Change type Affected areas March 26 Update of the voice messaging system All CERN sites April...
Control of Interference during Working Memory Updating
Szmalec, Arnaud; Verbruggen, Frederick; Vandierendonck, Andre; Kemps, Eva
2011-01-01
The current study examined the nature of the processes underlying working memory updating. In 4 experiments using the n-back paradigm, the authors demonstrate that continuous updating of items in working memory prevents strong binding of those items to their contexts in working memory, and hence leads to an increased susceptibility to proactive…
42 CFR 414.30 - Conversion factor update.
2010-10-01
... 42 Public Health 3 2010-10-01 2010-10-01 false Conversion factor update. 414.30 Section 414.30... Practitioners § 414.30 Conversion factor update. Unless Congress acts in accordance with section 1848(d)(3) of... preceding FY over the third preceding FY exceeds the performance standard rate of increase established for...
Application of Real Time Models Updating in ABO Central Field
International Nuclear Information System (INIS)
Heikal, S.; Adewale, D.; Doghmi, A.; Augustine, U.
2003-01-01
ABO central field is the first deep offshore oil production in Nigeria located in OML 125 (ex-OPL316). The field was developed in a water depth of between 500 and 800 meters. Deep-water development requires much faster data handling and model updates in order to make the best possible technical decision. This required an easy way to incorporate the latest information and dynamic update of the reservoir model enabling real time reservoir management. The paper aims at discussing the benefits of real time static and dynamic model update and illustrates with a horizontal well example how this update was beneficial prior and during the drilling operation minimizing the project CAPEX Prior to drilling, a 3D geological model was built based on seismic and offset wells' data. The geological model was updated twice, once after the pilot hole drilling and then after reaching the landing point and prior drilling the horizontal section .Forward modeling ws made was well using the along the planned trajectory. During the drilling process both geo- steering and LWD data were loaded in real time to the 3D modeling software. The data was analyzed and compared with the predicted model. The location of markers was changed as drilling progressed and the entire 3D Geological model was rapidly updated. The target zones were revaluated in the light of the new model updates. Recommendations were communicated to the field, and the well trajectory was modified to take into account the new information. The combination of speed, flexibility and update-ability of the 3D modeling software enabled continues geological model update on which the asset team based their trajectory modification decisions throughout the drilling phase. The well was geo-steered through 7 meters thickness of sand. After the drilling, the testing showed excellent results with a productivity and fluid properties data were used to update the dynamic model reviewing the well production plateau providing optimum reservoir
Effect of asynchronous updating on the stability of cellular automata
International Nuclear Information System (INIS)
Baetens, J.M.; Van der Weeën, P.; De Baets, B.
2012-01-01
Highlights: ► An upper bound on the Lyapunov exponent of asynchronously updated CA is established. ► The employed update method has repercussions on the stability of CAs. ► A decision on the employed update method should be taken with care. ► Substantial discrepancies arise between synchronously and asynchronously updated CA. ► Discrepancies between different asynchronous update schemes are less pronounced. - Abstract: Although cellular automata (CAs) were conceptualized as utter discrete mathematical models in which the states of all their spatial entities are updated simultaneously at every consecutive time step, i.e. synchronously, various CA-based models that rely on so-called asynchronous update methods have been constructed in order to overcome the limitations that are tied up with the classical way of evolving CAs. So far, only a few researchers have addressed the consequences of this way of updating on the evolved spatio-temporal patterns, and the reachable stationary states. In this paper, we exploit Lyapunov exponents to determine to what extent the stability of the rules within a family of totalistic CAs is affected by the underlying update method. For that purpose, we derive an upper bound on the maximum Lyapunov exponent of asynchronously iterated CAs, and show its validity, after which we present a comparative study between the Lyapunov exponents obtained for five different update methods, namely one synchronous method and four well-established asynchronous methods. It is found that the stability of CAs is seriously affected if one of the latter methods is employed, whereas the discrepancies arising between the different asynchronous methods are far less pronounced and, finally, we discuss the repercussions of our findings on the development of CA-based models.
Further experience in simulation of rod drop experiments in the Loviisa and Mochovce reactors
International Nuclear Information System (INIS)
Siltanen, P.; Kaloinen, E.; Tanskanen, A.; Mattila, R.
2001-01-01
Simulations of reactor scram experiments using the 3-dimensional kinetics code HEXTRAN have been updated for the initial cores of Loviisa-1 and 2 Mochovce-1 and have been extended to burned cores of Loviisa-1. In these simulations, the entire experiment is simulated dynamically, including the behaviour of the core, the signal of the ionization chamber, and the inverse point kinetics of the reactivity meter. The predicted output of the reactivity meter is compared with the output observed during the experiment (Authors)
A Novel Simulator of Nonstationary Random MIMO Channels in Rayleigh Fading Scenarios
Directory of Open Access Journals (Sweden)
Qiuming Zhu
2016-01-01
Full Text Available For simulations of nonstationary multiple-input multiple-output (MIMO Rayleigh fading channels in time-variant scattering environments, a novel channel simulator is proposed based on the superposition of chirp signals. This new method has the advantages of low complexity and implementation simplicity as the sum of sinusoids (SOS method. In order to reproduce realistic time varying statistics for dynamic channels, an efficient parameter computation method is also proposed for updating the frequency parameters of employed chirp signals. Simulation results indicate that the proposed simulator is effective in generating nonstationary MIMO channels with close approximation of the time-variant statistical characteristics in accordance with the expected theoretical counterparts.
Energy Economic Data Base (EEDB) Program: Phase VI update (1983) report
International Nuclear Information System (INIS)
1984-09-01
This update of the Energy Economic Data Base is the latest in a series of technical and cost studies prepared by United Engineers and Constructors Inc., during the last 18 years. The data base was developed during 1978 and has been updated annually since then. The purpose of the updates has been to reflect the impact of changing regulations and technology on the costs of electric power generating stations. This Phase VI (Sixth) Update report documents the results of the 1983 EEDB Program update effort. The latest effort was a comprehensive update of the technical and capital cost information for the pressurized water reactor, boiling water reactor, and liquid metal fast breeder reactor nuclear power plant data models and for the 800 MWe and 500 MWe high sulfur coal-fired power plant data models. The update provided representative costs for these nuclear and coal-fired power plants for the 1980's. In addition, the updated nuclear power plant data models for the 1980's were modified to provide anticipated costs for nuclear power plants for the 1990's. Consequently, the Phase VI Update has continued to provide important benchmark information through which technical and capital cost trends may be identified that have occurred since January 1, 1978
Indoor Spatial Updating with Reduced Visual Information.
Legge, Gordon E; Gage, Rachel; Baek, Yihwa; Bochsler, Tiana M
2016-01-01
Spatial updating refers to the ability to keep track of position and orientation while moving through an environment. People with impaired vision may be less accurate in spatial updating with adverse consequences for indoor navigation. In this study, we asked how artificial restrictions on visual acuity and field size affect spatial updating, and also judgments of the size of rooms. Normally sighted young adults were tested with artificial restriction of acuity in Mild Blur (Snellen 20/135) and Severe Blur (Snellen 20/900) conditions, and a Narrow Field (8°) condition. The subjects estimated the dimensions of seven rectangular rooms with and without these visual restrictions. They were also guided along three-segment paths in the rooms. At the end of each path, they were asked to estimate the distance and direction to the starting location. In Experiment 1, the subjects walked along the path. In Experiment 2, they were pushed in a wheelchair to determine if reduced proprioceptive input would result in poorer spatial updating. With unrestricted vision, mean Weber fractions for room-size estimates were near 20%. Severe Blur but not Mild Blur yielded larger errors in room-size judgments. The Narrow Field was associated with increased error, but less than with Severe Blur. There was no effect of visual restriction on estimates of distance back to the starting location, and only Severe Blur yielded larger errors in the direction estimates. Contrary to expectation, the wheelchair subjects did not exhibit poorer updating performance than the walking subjects, nor did they show greater dependence on visual condition. If our results generalize to people with low vision, severe deficits in acuity or field will adversely affect the ability to judge the size of indoor spaces, but updating of position and orientation may be less affected by visual impairment.
Indoor Spatial Updating with Reduced Visual Information.
Directory of Open Access Journals (Sweden)
Gordon E Legge
Full Text Available Spatial updating refers to the ability to keep track of position and orientation while moving through an environment. People with impaired vision may be less accurate in spatial updating with adverse consequences for indoor navigation. In this study, we asked how artificial restrictions on visual acuity and field size affect spatial updating, and also judgments of the size of rooms.Normally sighted young adults were tested with artificial restriction of acuity in Mild Blur (Snellen 20/135 and Severe Blur (Snellen 20/900 conditions, and a Narrow Field (8° condition. The subjects estimated the dimensions of seven rectangular rooms with and without these visual restrictions. They were also guided along three-segment paths in the rooms. At the end of each path, they were asked to estimate the distance and direction to the starting location. In Experiment 1, the subjects walked along the path. In Experiment 2, they were pushed in a wheelchair to determine if reduced proprioceptive input would result in poorer spatial updating.With unrestricted vision, mean Weber fractions for room-size estimates were near 20%. Severe Blur but not Mild Blur yielded larger errors in room-size judgments. The Narrow Field was associated with increased error, but less than with Severe Blur. There was no effect of visual restriction on estimates of distance back to the starting location, and only Severe Blur yielded larger errors in the direction estimates. Contrary to expectation, the wheelchair subjects did not exhibit poorer updating performance than the walking subjects, nor did they show greater dependence on visual condition.If our results generalize to people with low vision, severe deficits in acuity or field will adversely affect the ability to judge the size of indoor spaces, but updating of position and orientation may be less affected by visual impairment.
Supervising simulations with the Prodiguer Messaging Platform
Greenslade, Mark; Carenton, Nicolas; Denvil, Sebastien
2015-04-01
At any one moment in time, researchers affiliated with the Institut Pierre Simon Laplace (IPSL) climate modeling group, are running hundreds of global climate simulations. These simulations execute upon a heterogeneous set of High Performance Computing (HPC) environments spread throughout France. The IPSL's simulation execution runtime is called libIGCM (library for IPSL Global Climate Modeling group). libIGCM has recently been enhanced so as to support realtime operational use cases. Such use cases include simulation monitoring, data publication, environment metrics collection, automated simulation control … etc. At the core of this enhancement is the Prodiguer messaging platform. libIGCM now emits information, in the form of messages, for remote processing at IPSL servers in Paris. The remote message processing takes several forms, for example: 1. Persisting message content to database(s); 2. Notifying an operator of changes in a simulation's execution status; 3. Launching rollback jobs upon simulation failure; 4. Dynamically updating controlled vocabularies; 5. Notifying downstream applications such as the Prodiguer web portal; We will describe how the messaging platform has been implemented from a technical perspective and demonstrate the Prodiguer web portal receiving realtime notifications.
Advanced Wear Simulation for Bulk Metal Forming Processes
Directory of Open Access Journals (Sweden)
Behrens Bernd-Arno
2016-01-01
Full Text Available In the recent decades the finite element method has become an essential tool for the cost-efficient virtual process design in the metal forming sector in order to counter the constantly increasing quality standards, particularly from the automotive industry as well as intensified international competition in the forging industry. An optimized process design taking precise tool wear prediction into account is a way to increase the cost-efficiency of the bulk metal forming processes. The main objective of the work presented in this paper is a modelling algorithm, which allows predicting die wear with respect to a geometry update during the forming simulation. Changes in the contact area caused by geometry update lead to the different die wear distribution. It primarily concerns the die areas, which undergo high thermal and mechanical loads.
Indurkhya, Sagar; Beal, Jacob
2010-01-06
ODE simulations of chemical systems perform poorly when some of the species have extremely low concentrations. Stochastic simulation methods, which can handle this case, have been impractical for large systems due to computational complexity. We observe, however, that when modeling complex biological systems: (1) a small number of reactions tend to occur a disproportionately large percentage of the time, and (2) a small number of species tend to participate in a disproportionately large percentage of reactions. We exploit these properties in LOLCAT Method, a new implementation of the Gillespie Algorithm. First, factoring reaction propensities allows many propensities dependent on a single species to be updated in a single operation. Second, representing dependencies between reactions with a bipartite graph of reactions and species requires only storage for reactions, rather than the required for a graph that includes only reactions. Together, these improvements allow our implementation of LOLCAT Method to execute orders of magnitude faster than currently existing Gillespie Algorithm variants when simulating several yeast MAPK cascade models.
Pankatz, Klaus; Kerkweg, Astrid
2015-04-01
The work presented is part of the joint project "DecReg" ("Regional decadal predictability") which is in turn part of the project "MiKlip" ("Decadal predictions"), an effort funded by the German Federal Ministry of Education and Research to improve decadal predictions on a global and regional scale. In MiKlip, one big question is if regional climate modeling shows "added value", i.e. to evaluate, if regional climate models (RCM) produce better results than the driving models. However, the scope of this study is to look more closely at the setup specific details of regional climate modeling. As regional models only simulate a small domain, they have to inherit information about the state of the atmosphere at their lateral boundaries from external data sets. There are many unresolved questions concerning the setup of lateral boundary conditions (LBC). External data sets come from global models or from global reanalysis data-sets. A temporal resolution of six hours is common for this kind of data. This is mainly due to the fact, that storage space is a limiting factor, especially for climate simulations. However, theoretically, the coupling frequency could be as high as the time step of the driving model. Meanwhile, it is unclear if a more frequent update of the LBCs has a significant effect on the climate in the domain of the RCM. The first study examines how the RCM reacts to a higher update frequency. The study is based on a 30 year time slice experiment for three update frequencies of the LBC, namely six hours, one hour and six minutes. The evaluation of means, standard deviations and statistics of the climate in the regional domain shows only small deviations, some statistically significant though, of 2m temperature, sea level pressure and precipitation. The second part of the first study assesses parameters linked to cyclone activity, which is affected by the LBC update frequency. Differences in track density and strength are found when comparing the simulations
Modeling and Simulation of Cyber Battlefield
Directory of Open Access Journals (Sweden)
AliJabar Rashidi
2017-12-01
Full Text Available In order to protect cyberspace against cyber-attacks we need cyber situation awareness framework for the implementation of our cyber maneuvers. This article allows execution cyber maneuvers with dynamic cyber battlefield simulator. Cyber battlefield contains essential information for the detection of cyber events, therefore, it can be considered most important and complicated factor in the high-level fusion. Cyber battlefield by gather detail data of cyberspace elements, including knowledge repository of vulnerability, tangible and intangible elements of cyberspace and the relationships between them, can provide and execute cyber maneuvers, penetration testing, cyber-attacks injection, attack tracking, visualization, cyber-attacks impact assessment and risk assessment. The dynamic maker Engine in simulator is designed to update the knowledge base of vulnerabilities, change the topology elements, and change the access list, services, hosts and users. Evaluation of simulator do with qualitative method of research and with create a focus group.
Impact of the updating scheme on stationary states of networks
International Nuclear Information System (INIS)
Radicchi, F; Ahn, Y Y; Meyer-Ortmanns, H
2008-01-01
From Boolean networks it is well known that the number of attractors as a function of the system size depends on the updating scheme which is chosen either synchronously or asynchronously. In this contribution, we report on a systematic interpolation between synchronous and asynchronous updating in a one-dimensional chain of Ising spins. The stationary state for fully synchronous updating is antiferromagnetic. The interpolation allows us to locate a phase transition between phases with an absorbing and a fluctuating stationary state. The associated universality class is that of parity conservation. We also report on a more recent study of asynchronous updates applied to the yeast cell-cycle network. Compared to the synchronous update, the basin of attraction of the largest attractor considerably shrinks and the convergence to the biological pathway slows down and is less dominant. Both examples illustrate how sensitively the stationary states and the properties of attractors can depend on the updating mode of the algorithm
Map updates in a dynamic Voronoi data structure
DEFF Research Database (Denmark)
Mioc, Darka; Antón Castro, Francesc/François; Gold, C. M.
2006-01-01
In this paper we are using local and sequential map updates in the Voronoi data structure, which allows us to automatically record each event and performed map updates within the system. These map updates are executed through map construction commands that are composed of atomic actions (geometric...... algorithms for addition, deletion, and motion of spatial objects) on the dynamic Voronoi data structure. The formalization of map commands led to the development of a spatial language comprising a set of atomic operations or constructs on spatial primitives (points and lines), powerful enough to define...
A Weighted Two-Level Bregman Method with Dictionary Updating for Nonconvex MR Image Reconstruction
Directory of Open Access Journals (Sweden)
Qiegen Liu
2014-01-01
Full Text Available Nonconvex optimization has shown that it needs substantially fewer measurements than l1 minimization for exact recovery under fixed transform/overcomplete dictionary. In this work, two efficient numerical algorithms which are unified by the method named weighted two-level Bregman method with dictionary updating (WTBMDU are proposed for solving lp optimization under the dictionary learning model and subjecting the fidelity to the partial measurements. By incorporating the iteratively reweighted norm into the two-level Bregman iteration method with dictionary updating scheme (TBMDU, the modified alternating direction method (ADM solves the model of pursuing the approximated lp-norm penalty efficiently. Specifically, the algorithms converge after a relatively small number of iterations, under the formulation of iteratively reweighted l1 and l2 minimization. Experimental results on MR image simulations and real MR data, under a variety of sampling trajectories and acceleration factors, consistently demonstrate that the proposed method can efficiently reconstruct MR images from highly undersampled k-space data and presents advantages over the current state-of-the-art reconstruction approaches, in terms of higher PSNR and lower HFEN values.
Key Techniques for Dynamic Updating of National Fundamental Geographic Information Database
Directory of Open Access Journals (Sweden)
WANG Donghua
2015-07-01
Full Text Available One of the most important missions of fundamental surveying and mapping work is to keep the fundamental geographic information fresh. In this respect, National Administration of Surveying, Mapping and Geoinformation has launched the project of dynamic updating of national fundamental geographic information database since 2012, which aims to update 1:50 000, 1:250 000 and 1:1 000 000 national fundamental geographic information database continuously and quickly, by updating and publishing once a year. This paper introduces the general technical thinking of dynamic updating, states main technical methods, such as dynamic updating of fundamental database, linkage updating of derived databases, and multi-tense database management and service and so on, and finally introduces main technical characteristics and engineering applications.
Zarriello, Phillip J.; Straub, David E.; Westenbroek, Stephen M.
2014-01-01
Heavy persistent rains from late February through March 2010 caused severe flooding and set, or nearly set, peaks of record for streamflows and water levels at many long-term U.S. Geological Survey streamgages in Rhode Island. In response to this flood, hydraulic models were updated for selected reaches covering about 33 river miles in Moshassuck and Woonasquatucket River Basins from the most recent approved Federal Emergency Management Agency flood insurance study (FIS) to simulate water-surface elevations (WSEs) from specified flows and boundary conditions. Reaches modeled include the main stem of the Moshassuck River and its main tributary, the West River, and three tributaries to the West River—Upper Canada Brook, Lincoln Downs Brook, and East Branch West River; and the main stem of the Woonasquatucket River. All the hydraulic models were updated to Hydrologic Engineering Center-River Analysis System (HEC-RAS) version 4.1.0 and incorporate new field-survey data at structures, high-resolution land-surface elevation data, and flood flows from a related study. The models were used to simulate steady-state WSEs at the 1- and 2-percent annual exceedance probability (AEP) flows, which is the estimated AEP of the 2010 flood in the Moshassuck River Basin and the Woonasquatucket River, respectively. The simulated WSEs were compared to the high-water mark (HWM) elevation data obtained in these basins in a related study following the March–April 2010 flood, which included 18 HWMs along the Moshassuck River and 45 HWMs along the Woonasquatucket River. Differences between the 2010 HWMs and the simulated 2- and 1-percent AEP WSEs from the FISs and the updated models developed in this study varied along the reach. Most differences could be attributed to the magnitude of the 2- and 1-percent AEP flows used in the FIS and updated model flows. Overall, the updated model and the FIS WSEs were not appreciably different when compared to the observed 2010 HWMs along the
Least squares approach for initial data recovery in dynamic data-driven applications simulations
Douglas, C.; Efendiev, Y.; Ewing, R.; Ginting, V.; Lazarov, R.; Cole, M.; Jones, G.
2010-01-01
In this paper, we consider the initial data recovery and the solution update based on the local measured data that are acquired during simulations. Each time new data is obtained, the initial condition, which is a representation of the solution at a
Statistical and perceptual updating: correlated impairments in right brain injury.
Stöttinger, Elisabeth; Filipowicz, Alex; Marandi, Elahe; Quehl, Nadine; Danckert, James; Anderson, Britt
2014-06-01
It has been hypothesized that many of the cognitive impairments commonly seen after right brain damage (RBD) can be characterized as a failure to build or update mental models. We (Danckert et al. in Neglect as a disorder of representational updating. NOVA Open Access, New York, 2012a; Cereb Cortex 22:2745-2760, 2012b) were the first to directly assess the association between RBD and updating and found that RBD patients were unable to exploit a strongly biased play strategy in their opponent in the children's game rock, paper, scissors. Given that this game required many other cognitive capacities (i.e., working memory, sustained attention, reward processing), RBD patients could have failed this task for various reasons other than a failure to update. To assess the generality of updating deficits after RBD, we had RBD, left brain-damaged (LBD) patients and healthy controls (HCs) describe line drawings that evolved gradually from one figure (e.g., rabbit) to another (e.g., duck) in addition to the RPS updating task. RBD patients took significantly longer to alter their perceptual report from the initial object to the final object than did LBD patients and HCs. Although both patient groups performed poorly on the RPS task, only the RBD patients showed a significant correlation between the two, very different, updating tasks. We suggest these data indicate a general deficiency in the ability to update mental representations following RBD.
Supporting Frequent Updates in R-Trees: A Bottom-Up Approach
DEFF Research Database (Denmark)
Lee, Mong Li; Hsu, Wynne; Jensen, Christian Søndergaard
2003-01-01
Advances in hardware-related technologies promise to enable new data management applications that monitor continuous processes. In these applications, enormous amounts of state samples are obtained via sensors and are streamed to a database. Further, updates are very frequent and may exhibit...... and aims to improve update performance. It has different levels of reorganizationranging from global to localduring updates, avoiding expensive top-down updates. A compact main-memory summary structure that allows direct access to the R-tree index nodes is used together with efficient bottom...
The distance effect in numerical memory-updating tasks.
Lendínez, Cristina; Pelegrina, Santiago; Lechuga, Teresa
2011-05-01
Two experiments examined the role of numerical distance in updating numerical information in working memory. In the first experiment, participants had to memorize a new number only when it was smaller than a previously memorized number. In the second experiment, updating was based on an external signal, which removed the need to perform any numerical comparison. In both experiments, distance between the memorized number and the new one was manipulated. The results showed that smaller distances between the new and the old information led to shorter updating times. This graded facilitation suggests that the process by which information is substituted in the focus of attention involves maintaining the shared features between the new and the old number activated and selecting other new features to be activated. Thus, the updating cost may be related to amount of new features to be activated in the focus of attention.
Full-f gyrokinetic simulation of edge pedestal in Textor
Energy Technology Data Exchange (ETDEWEB)
Kiviniemi, Timo [Aalto Univ. (Finland)
2016-11-01
In ongoing simulations we have noticed that change phase angle between electric field and density oscillation may be important for changes in particle transport for different isotopes which could explain part of the so-called isotope-effect. Even the present database from the PRACE simulation (about 20 cases and some 4 TB of data) can still be further explored for this as the 3D data for both electric field and density exists. After finishing the PRACE project the code has been updated to include scrape-off-layer (SOL) which has opened several possibilities for future research.
Environmental Regulatory Update Table, January/February 1995
Energy Technology Data Exchange (ETDEWEB)
Houlberg, L.M.; Hawkins, G.T.; Bock, R.E.; Mayer, S.J.; Salk, M.S.
1995-03-01
The Environmental Regulatory Update Table provides information on regulatory initiatives impacting environmental, health, and safety management responsibilities. the table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, January/February 1995
International Nuclear Information System (INIS)
Houlberg, L.M.; Hawkins, G.T.; Bock, R.E.; Mayer, S.J.; Salk, M.S.
1995-03-01
The Environmental Regulatory Update Table provides information on regulatory initiatives impacting environmental, health, and safety management responsibilities. the table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action
UCI2001: The updated catalogue of Italy
International Nuclear Information System (INIS)
Peresan, A.; Panza, G.F.
2002-05-01
A new updated earthquake catalogue for the Italian territory, named UCI2001, is described here; it consists of an updated and revised version of the CCI1996 catalogue (Peresan et al., 1997). The revision essentially corresponds to the incorporation of data from the NEIC (National Earthquake Information Centre) and ALPOR (Catalogo delle Alpi Orientali) catalogues, while the updating is performed using the NEIC Preliminary Determinations of Epicenters since 1986. A brief overview of the catalogues used for the monitoring of seismicity in the Italian area is provided, together with the essential information about the structure of the UCI2001 catalogue and a description of its format. A complete list of the events, as on May 1 2002, is given in the Appendix. (author)
AEGIS geologic simulation model
International Nuclear Information System (INIS)
Foley, M.G.
1982-01-01
The Geologic Simulation Model (GSM) is used by the AEGIS (Assessment of Effectiveness of Geologic Isolation Systems) program at the Pacific Northwest Laboratory to simulate the dynamic geology and hydrology of a geologic nuclear waste repository site over a million-year period following repository closure. The GSM helps to organize geologic/hydrologic data; to focus attention on active natural processes by requiring their simulation; and, through interactive simulation and calibration, to reduce subjective evaluations of the geologic system. During each computer run, the GSM produces a million-year geologic history that is possible for the region and the repository site. In addition, the GSM records in permanent history files everything that occurred during that time span. Statistical analyses of data in the history files of several hundred simulations are used to classify typical evolutionary paths, to establish the probabilities associated with deviations from the typical paths, and to determine which types of perturbations of the geologic/hydrologic system, if any, are most likely to occur. These simulations will be evaluated by geologists familiar with the repository region to determine validity of the results. Perturbed systems that are determined to be the most realistic, within whatever probability limits are established, will be used for the analyses that involve radionuclide transport and dose models. The GSM is designed to be continuously refined and updated. Simulation models are site specific, and, although the submodels may have limited general applicability, the input data equirements necessitate detailed characterization of each site before application
Updated radio Σ−D relation for galactic supernova remnants
Directory of Open Access Journals (Sweden)
Pavlović M.Z.
2014-01-01
Full Text Available We present the updated empirical radio surface-brightness-to-diameter (Σ − D relation for supernova remnants (SNRs in our Galaxy. Our original calibration sample of Galactic SNRs with independently determined distances (Pavlović et al. 2013, hereafter Paper I is reconsidered and updated with data which became available in the past two years. The orthogonal fitting procedure and probability-density-function-based (PDF method are applied to the calibration sample in the logΣ − logD plane. Non-standard orthogonal regression keeps the Σ−D and D−Σ relations invariant within estimated uncertainties. Our previous Monte Carlo simulations verified that the slopes of the empirical Σ−D relation should be determined by using the orthogonal regression, because of its good performances for data sets with severe scatter. The updated calibration sample contains 65 shell SNRs. 6 new Galactic SNRs are added to the sample from Paper I, one is omitted and distances are changed for 10 SNRs. The slope derived is here slightly steeper (β ≈ 5.2 than the Σ−D slope in Paper I (β ≈ 4.8. The PDF method relies on data points density maps which can provide more reliable calibrations that preserve more information contained in the calibration sample. We estimate distances to five new faint Galactic SNRs discovered for the first time by Canadian Galactic Plane Survey, and obtained distances of 2.3, 4.0, 1.3, 2.9 and 4.7 kiloparsecs for G108.5+11.0, G128.5+2.6, G149.5+3.2, G150.8+3.8 and G160.1−1.1, respectively. The updated empirical relation is used to estimate distances of 160 shell Galactic SNRs and new results change their distance scales up to 15 per cent, compared to the results from Paper I. The PDF calculation can provide even few times higher or lower values in comparison with the orthogonal fit, as it uses a totally different approach. However, on average, this difference is 32, 24 and 18 per cent for mode, median and mean distances
Monte Carlo Simulation Tool Installation and Operation Guide
Energy Technology Data Exchange (ETDEWEB)
Aguayo Navarrete, Estanislao; Ankney, Austin S.; Berguson, Timothy J.; Kouzes, Richard T.; Orrell, John L.; Troy, Meredith D.; Wiseman, Clinton G.
2013-09-02
This document provides information on software and procedures for Monte Carlo simulations based on the Geant4 toolkit, the ROOT data analysis software and the CRY cosmic ray library. These tools have been chosen for its application to shield design and activation studies as part of the simulation task for the Majorana Collaboration. This document includes instructions for installation, operation and modification of the simulation code in a high cyber-security computing environment, such as the Pacific Northwest National Laboratory network. It is intended as a living document, and will be periodically updated. It is a starting point for information collection by an experimenter, and is not the definitive source. Users should consult with one of the authors for guidance on how to find the most current information for their needs.
2014 Update of the United States National Seismic Hazard Maps
Petersen, M.D.; Mueller, C.S.; Haller, K.M.; Moschetti, M.; Harmsen, S.C.; Field, E.H.; Rukstales, K.S.; Zeng, Y.; Perkins, D.M.; Powers, P.; Rezaeian, S.; Luco, N.; Olsen, A.; Williams, R.
2012-01-01
The U.S. National Seismic Hazard Maps are revised every six years, corresponding with the update cycle of the International Building Code. These maps cover the conterminous U.S. and will be updated in 2014 using the best-available science that is obtained from colleagues at regional and topical workshops, which are convened in 2012-2013. Maps for Alaska and Hawaii will be updated shortly following this update. Alternative seismic hazard models discussed at the workshops will be implemented in a logic tree framework and will be used to develop the seismic hazard maps and associated products. In this paper we describe the plan to update the hazard maps, the issues raised in workshops up to March 2012, and topics that will be discussed at future workshops. An advisory panel will guide the development of the hazard maps and ensure that the maps are acceptable to a broad segment of the science and engineering communities. These updated maps will then be considered by end-users for inclusion in building codes, risk models, and public policy documents.
78 FR 26244 - Updating of Employer Identification Numbers
2013-05-06
... Number, or EIN. Employers are required to know the identity of their responsible party. The amount of...-BK02 Updating of Employer Identification Numbers AGENCY: Internal Revenue Service (IRS), Treasury... assigned an employer identification number (EIN) to provide updated information to the IRS in the manner...
76 FR 28194 - Proposed FOIA Fee Schedule Update
2011-05-16
... DEFENSE NUCLEAR FACILITIES SAFETY BOARD 10 CFR Part 1703 Proposed FOIA Fee Schedule Update AGENCY... publishing its proposed Freedom of Information Act (FOIA) Fee Schedule Update and solicits comments from... on the proposed fee schedule should be mailed or delivered to the Office of the General Counsel...
75 FR 27228 - Proposed FOIA Fee Schedule Update
2010-05-14
... DEFENSE NUCLEAR FACILITIES SAFETY BOARD 10 CFR Part 1703 Proposed FOIA Fee Schedule Update AGENCY... publishing its proposed Freedom of Information Act (FOIA) Fee Schedule Update and solicits comments from... on the proposed fee schedule should be mailed or delivered to the Office of the General Counsel...
Energy Technology Data Exchange (ETDEWEB)
Ravnik, M; Trkov, A [Inst. Jozef Stefan, Ljubljana (Slovenia); Holubar, A [Ustav Jaderneho Vyzkumu CSKAE, Rez (Serbia and Montenegro)
1992-07-01
At the end of 1990 the WIMS Library Update Project (WLUP) has been initiated at the International Atomic Energy Agency. The project was organized as an international research project, coordinated at the J. Stefan Institute. Up to now, 22 laboratories from 19 countries joined the project. Phase 1 of the project, which included WIMS input optimization for five experimental benchmark lattices, has been completed. The work presented in this paper describes also the results of Phase 2 of the Project, in which the cross sections based on ENDF/B-IV evaluated nuclear data library have been processed. (author) [Slovenian] Konec 1990 se je na Mednarodni agenciji za atomsko energijo zacel projekt obnove knjiznice presekov programa WIMS (WIMS Library Updating Project, WLUP). V projektu sodeluje 22 laboratorijev iz 19 drzav, koordiniramo pa ga na Institutu Jozef Stefan. Doslej je koncana faza 1 tega projekta, ki obsega optimizacijo vhodnega modela programa WIMS za pet eksperimentalnih testnih problemov. Podani so tudi rezultati faze 2, v kateri so se procesirali preseki na osnovi ENDF/B-IV datoteke. (author)
Environmental Regulatory Update Table, January--February 1993
Energy Technology Data Exchange (ETDEWEB)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.; Danford, G.S.; Lewis, E.B.
1993-03-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, November--December 1993
Energy Technology Data Exchange (ETDEWEB)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.; Danford, G.S.; Lewis, E.B.
1994-01-01
The Environmental Regulatory Update Table provides information on regulatory of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental regulatory update table November--December 1994
Energy Technology Data Exchange (ETDEWEB)
Houlberg, L.M.; Hawkins, G.T.; Bock, R.E.; Mayer, S.J.; Salk, M.S.
1995-01-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, May--June 1994
Energy Technology Data Exchange (ETDEWEB)
Houlberg, L.M.; Hawkins, G.T.; Bock, R.E.; Salk, M.S.
1994-07-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bimonthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, May/June 1993
Energy Technology Data Exchange (ETDEWEB)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.; Danford, G.S.; Lewis, E.B.
1993-07-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bimonthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental regulatory update table, March--April 1994
Energy Technology Data Exchange (ETDEWEB)
Houlberg, L.M.; Hawkins, G.T.; Bock, R.E. [Oak Ridge National Lab., TN (United States). Health Sciences Research Div.; Salk, M.S. [Oak Ridge National Lab., TN (United States). Environmental Sciences Div.
1994-03-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table July/August 1993
Energy Technology Data Exchange (ETDEWEB)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.; Danford, G.S.; Lewis, E.B.
1993-09-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, March/April 1993
Energy Technology Data Exchange (ETDEWEB)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.; Danford, G.S.; Lewis, E.B.
1993-05-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bimonthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, November--December 1992
Energy Technology Data Exchange (ETDEWEB)
Houlberg, L.M.; Hawkins, G.T.; Lewis, E.B.; Salk, M.S.
1993-01-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bi-monthly wit information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, July--August 1992
Energy Technology Data Exchange (ETDEWEB)
Houlberg, L.M.; Hawkins, G.T.; Lewis, E.B.; Salk, M.S.
1992-09-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, September/October 1993
Energy Technology Data Exchange (ETDEWEB)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.; Danford, G.S.; Lewis, E.B.
1993-11-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operation and contractor staff with environmental management responsibilities. The table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, January--February 1994
Energy Technology Data Exchange (ETDEWEB)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.; Danford, G.S.; Lewis, E.B.
1994-03-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations ad contractor staff with environmental management responsibilities. The table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental regulatory update table, September--October 1992
Energy Technology Data Exchange (ETDEWEB)
Houlberg, L.M.; Hawkins, G.T.; Lewis, E.B.; Salk, M.S.
1992-11-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental regulatory update table, July/August 1994
Energy Technology Data Exchange (ETDEWEB)
Houlberg, L.M.; Hawkins, G.T.; Bock, R.E.; Salk, M.S.
1994-09-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental Regulatory Update Table, March/April 1992
Energy Technology Data Exchange (ETDEWEB)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.
1992-05-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Environmental regulatory update table, July/August 1994
International Nuclear Information System (INIS)
Houlberg, L.M.; Hawkins, G.T.; Bock, R.E.; Salk, M.S.
1994-09-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action
Updating radiation protection regulations in Egypt
International Nuclear Information System (INIS)
Gomaa, M.A.; El-Naggar, A.M.
1996-01-01
The aim of this treatise is to present -the rational steps taken in the process of updating the Radiation Protection Regulations in Egypt. The contents of the review will include a historical synopsis, and the current state of art regarding competent authorities. Furthermore, the various committees formed with responsibilities for specific issues are indicated, including the role of the Ministry of Health (MOH), and that of the Atomic Energy Authority (AEA). Finally, the efforts made towards updating the radiation Protection Regulations in Egypt are highlighted. (author)
National Pediatric Program Update
International Nuclear Information System (INIS)
2008-01-01
The book of the National Pediatric Program Update, issued by the Argentina Society of Pediatrics, describes important issues, including: effective treatment of addictions (drugs); defects of the neural tube; and the use of radiation imaging in diagnosis. [es
77 FR 33980 - Proposed FOIA Fee Schedule Update
2012-06-08
... 1703 Proposed FOIA Fee Schedule Update AGENCY: Defense Nuclear Facilities Safety Board. ACTION: Notice... the Board's proposed FOIA Fee Schedule Update published in the Federal Register of June 1, 2012. The...: The FOIA requires each Federal agency covered by the Act to specify a schedule of fees applicable to...
A Provenance Tracking Model for Data Updates
Directory of Open Access Journals (Sweden)
Gabriel Ciobanu
2012-08-01
Full Text Available For data-centric systems, provenance tracking is particularly important when the system is open and decentralised, such as the Web of Linked Data. In this paper, a concise but expressive calculus which models data updates is presented. The calculus is used to provide an operational semantics for a system where data and updates interact concurrently. The operational semantics of the calculus also tracks the provenance of data with respect to updates. This provides a new formal semantics extending provenance diagrams which takes into account the execution of processes in a concurrent setting. Moreover, a sound and complete model for the calculus based on ideals of series-parallel DAGs is provided. The notion of provenance introduced can be used as a subjective indicator of the quality of data in concurrent interacting systems.
Clean Coal Technology Programs: Program Update 2009
Energy Technology Data Exchange (ETDEWEB)
None
2009-10-01
The purpose of the Clean Coal Technology Programs: Program Update 2009 is to provide an updated status of the U.S. Department of Energy (DOE) commercial-scale demonstrations of clean coal technologies (CCT). These demonstrations have been performed under the Clean Coal Technology Demonstration Program (CCTDP), the Power Plant Improvement Initiative (PPII), and the Clean Coal Power Initiative (CCPI). Program Update 2009 provides: (1) a discussion of the role of clean coal technology demonstrations in improving the nation’s energy security and reliability, while protecting the environment using the nation’s most abundant energy resource—coal; (2) a summary of the funding and costs of the demonstrations; and (3) an overview of the technologies being demonstrated, along with fact sheets for projects that are active, recently completed, or recently discontinued.
A Time Domain Update Method for Reservoir History Matching of Electromagnetic Data
Katterbauer, Klemens
2014-03-25
The oil & gas industry has been the backbone of the world\\'s economy in the last century and will continue to be in the decades to come. With increasing demand and conventional reservoirs depleting, new oil industry projects have become more complex and expensive, operating in areas that were previously considered impossible and uneconomical. Therefore, good reservoir management is key for the economical success of complex projects requiring the incorporation of reliable uncertainty estimates for reliable production forecasts and optimizing reservoir exploitation. Reservoir history matching has played here a key role incorporating production, seismic, electromagnetic and logging data for forecasting the development of reservoirs and its depletion. With the advances in the last decade, electromagnetic techniques, such as crosswell electromagnetic tomography, have enabled engineers to more precisely map the reservoirs and understand their evolution. Incorporating the large amount of data efficiently and reducing uncertainty in the forecasts has been one of the key challenges for reservoir management. Computing the conductivity distribution for the field for adjusting parameters in the forecasting process via solving the inverse problem has been a challenge, due to the strong ill-posedness of the inversion problem and the extensive manual calibration required, making it impossible to be included into an efficient reservoir history matching forecasting algorithm. In the presented research, we have developed a novel Finite Difference Time Domain (FDTD) based method for incorporating electromagnetic data directly into the reservoir simulator. Based on an extended Archie relationship, EM simulations are performed for both forecasted and Porosity-Saturation retrieved conductivity parameters being incorporated directly into an update step for the reservoir parameters. This novel direct update method has significant advantages such as that it overcomes the expensive and ill
Taylor, Kelley R.
2009-01-01
"Chief Justice Flubs Oath." "Justice Ginsburg Has Cancer Surgery." At the start of this year, those were the news headlines about the U.S. Supreme Court. But January 2009 also brought news about key education cases--one resolved and two others on the docket--of which school administrators should take particular note. The Supreme Court updates on…
Directory of Open Access Journals (Sweden)
Katharina Kranzer
Full Text Available Survival analysis using time-updated CD4+ counts during antiretroviral therapy is frequently employed to determine risk of clinical events. The time-point when the CD4+ count is assumed to change potentially biases effect estimates but methods used to estimate this are infrequently reported.This study examined the effect of three different estimation methods: assuming i a constant CD4+ count from date of measurement until the date of next measurement, ii a constant CD4+ count from the midpoint of the preceding interval until the midpoint of the subsequent interval and iii a linear interpolation between consecutive CD4+ measurements to provide additional midpoint measurements. Person-time, tuberculosis rates and hazard ratios by CD4+ stratum were compared using all available CD4+ counts (measurement frequency 1-3 months and 6 monthly measurements from a clinical cohort. Simulated data were used to compare the extent of bias introduced by these methods.The midpoint method gave the closest fit to person-time spent with low CD4+ counts and for hazard ratios for outcomes both in the clinical dataset and the simulated data.The midpoint method presents a simple option to reduce bias in time-updated CD4+ analysis, particularly at low CD4 cell counts and rapidly increasing counts after ART initiation.
The value of information updating in new product development
Artmann, Christian
2009-01-01
This work shows how managing uncertainty in new product development can be improved by conducting an information update during the development process. The book details the comprehensive model needed to perform that information update.
Risk assessment, management, communication: a guide to selected sources. Update. Information guide
International Nuclear Information System (INIS)
1987-05-01
This is the first update to the March 1987 publication entitled Risk Assessment, Management, Communication: A Guide to Selected Sources. The risk update series is divided into three major sections: Assessment, Management, and Communication. This update also includes subsections on hazardous waste, radiation, and a number of specific chemicals. Due to the expanding literature on risk, other subsections may be added to updates in the future. Each Table of Contents contains a complete list of the subsections. Updates are produced on a quarterly basis
Update History of This Database - DMPD | LSDB Archive [Life Science Database Archive metadata
Lifescience Database Archive (English)
Full Text Available switchLanguage; BLAST Search Image Search Home About Archive Update History Data ...List Contact us DMPD Update History of This Database Date Update contents 2010/03/29 DMPD English archive si....jp/macrophage/ ) is released. About This Database Database Description Download License Update History of Thi...s Database Site Policy | Contact Us Update History of This Database - DMPD | LSDB Archive ...
Ganschow, Pamela S; Jacobs, Elizabeth A; Mackinnon, Jennifer; Charney, Pamela
2009-06-01
The aim of this clinical update is to summarize articles and guidelines published in the last year with the potential to change current clinical practice as it relates to women's health. We used two independent search strategies to identify articles relevant to women's health published between March 1, 2007 and February 29, 2008. First, we reviewed the Cochrane Database of Systematic Reviews and journal indices from the ACP Journal Club, Annals of Internal Medicine, Archives of Internal Medicine, British Medical Journal, Circulation, Diabetes, JAMA, JGIM, Journal of Women's Health, Lancet, NEJM, Obstetrics and Gynecology, and Women's Health Journal Watch. Second, we performed a MEDLINE search using the medical subject heading term "sex factors." The authors, who all have clinical and/or research experience in the area of women's health, reviewed all article titles, abstracts, and, when indicated, full publications. We excluded articles related to obstetrical aspects of women's health focusing on those relevant to general internists. We had two acceptance criteria, scientific rigor and potential to impact women's health. We also identified new and/or updated women's health guidelines released during the same time period. We identified over 250 publications with potential relevance to women's health. Forty-six articles were selected for presentation as part of the Clinical Update, and nine were selected for a more detailed discussion in this paper. Evidence-based women's health guidelines are listed in Table 1. Table 1 Important Women's Health Guidelines in 2007-2008: New or Updated Topic Issuing organization Updated recommendations and comments Mammography screening in women 40-4917 ACP Individualized risk assessment and informed decision making should be used to guide decisions about mammography screening in this age group. To aid in the risk assessment, a discussion of the risk factors, which if present in a woman in her 40s increases her risk to above that of an
Simulation of Optimal Decision-Making Under the Impacts of Climate Change.
Møller, Lea Ravnkilde; Drews, Martin; Larsen, Morten Andreas Dahl
2017-07-01
Climate change causes transformations to the conditions of existing agricultural practices appointing farmers to continuously evaluate their agricultural strategies, e.g., towards optimising revenue. In this light, this paper presents a framework for applying Bayesian updating to simulate decision-making, reaction patterns and updating of beliefs among farmers in a developing country, when faced with the complexity of adapting agricultural systems to climate change. We apply the approach to a case study from Ghana, where farmers seek to decide on the most profitable of three agricultural systems (dryland crops, irrigated crops and livestock) by a continuous updating of beliefs relative to realised trajectories of climate (change), represented by projections of temperature and precipitation. The climate data is based on combinations of output from three global/regional climate model combinations and two future scenarios (RCP4.5 and RCP8.5) representing moderate and unsubstantial greenhouse gas reduction policies, respectively. The results indicate that the climate scenario (input) holds a significant influence on the development of beliefs, net revenues and thereby optimal farming practices. Further, despite uncertainties in the underlying net revenue functions, the study shows that when the beliefs of the farmer (decision-maker) opposes the development of the realised climate, the Bayesian methodology allows for simulating an adjustment of such beliefs, when improved information becomes available. The framework can, therefore, help facilitating the optimal choice between agricultural systems considering the influence of climate change.
State energy-price system: 1981 update
Energy Technology Data Exchange (ETDEWEB)
Fang, J.M.; Imhoff, K.L.; Hood, L.J.
1983-08-01
This report updates the State Energy Price Data System (STEPS) to include state-level energy prices by fuel and by end-use sectors for 1981. Both physical unit prices and Btu prices are presented. Basic documentation of the data base remains generally the same as in the original report: State Energy Price System; Volume 1: Overview and Technical Documentation (DOE/NBB-0029 Volume 1 of 2, November 1982). The present report documents only the changes in procedures necessitated by the update to 1981 and the corrections to the basic documentation.
International Nuclear Information System (INIS)
2007-05-01
WIMS-D (Winfrith Improved Multigroup Scheme-D) is the name of a family of software packages for reactor lattice calculations and is one of the few reactor lattice codes in the public domain and available on noncommercial terms. WIMSD-5B has recently been released from the OECD Nuclear Energy Agency Data Bank, and features major improvements in machine portability, as well as incorporating a few minor corrections. This version supersedes WIMS-D/4, which was released by the Winfrith Technology Centre in the United Kingdom for IBM machines and has been adapted for various other computer platforms in different laboratories. The main weakness of the WIMS-D package is the multigroup constants library, which is based on very old data. The relatively good performance of WIMS-D is attributed to a series of empirical adjustments to the multigroup data. However, the adjustments are not always justified on the basis of more accurate and recent experimental measurements. Following the release of new and revised evaluated nuclear data files, it was felt that the performance of WIMS-D could be improved by updating the associated library. The WIMS-D Library Update Project (WLUP) was initiated in the early 1990s with the support of the IAEA. This project consisted of voluntary contributions from a large number of participants. Several benchmarks for testing the library were identified and analysed, the WIMSR module of the NJOY code system was upgraded and the author of NJOY accepted the proposed updates for the official code system distribution. A detailed parametric study was performed to investigate the effects of various data processing input options on the integral results. In addition, the data processing methods for the main reactor materials were optimized. Several partially updated libraries were produced for testing purposes. The final stage of the WLUP was organized as a coordinated research project (CRP) in order to speed up completion of the fully updated library
Who's Who? Memory updating and character reference in children's narratives.
Whitely, Cristy; Colozzo, Paola
2013-10-01
The capacity to update and monitor the contents of working memory is an executive function presumed to play a critical role in language processing. The current study used an individual differences approach to consider the relationship between memory updating and accurate reference to story characters in the narratives of typically developing children. English-speaking children from kindergarten to grade 2 ( N = 63; M age = 7.0 years) completed updating tasks, short-term memory tasks, and narrative productions. The authors used multiple regression to test whether updating accounted for independent variability in referential adequacy. The capacity to update working memory was related to adequate character reference beyond the effects of age and of short-term memory capacity, with the strongest relationship emerging for maintaining reference over multiple utterances. This individual differences study is the first to show a link between updating and performance in a discourse production task for young school-age children. The findings contribute to the growing body of research investigating the role of working memory in shaping language production. This study invites extension to children of different ages and language abilities as well as to other language production tasks.
An Algorithm of Auto-Update Threshold for Singularity Analysis of Pipeline Pressure
Directory of Open Access Journals (Sweden)
Jinhai Liu
2013-01-01
Full Text Available A precise auto-update threshold algorithm (AUTA which imitates the short-term memory of human brain is proposed to search singularities in pipeline pressure signal. According to the characteristics of the pressure signal, the pressure can be divided into two states known as nonsteady state and steady state. The AUTA can distinguish these two states and then choose corresponding method to calculate the dynamic thresholds of pressure variation in real time. Then, the parameters of AUTA are analyzed to determine their values or ranges. Finally, in the simulations to the actual pressure signal from oil pipelines, we verified the effectiveness of AUTA in estimating the dynamic threshold value of pressure.
Update History of This Database - DGBY | LSDB Archive [Life Science Database Archive metadata
Lifescience Database Archive (English)
Full Text Available switchLanguage; BLAST Search Image Search Home About Archive Update History Data ...List Contact us DGBY Update History of This Database Date Update contents 2014/10/20 The URL of the portal s...aro.affrc.go.jp/yakudachi/yeast/index.html ) is opened. About This Database Database Description Download License Update Hi...story of This Database Site Policy | Contact Us Update History of This Database - DGBY | LSDB Archive ... ... Expression of attribution in License is updated. 2012/03/08 DGBY English archive site is opened. 2006/10/02
Why, when and how to update a meta-ethnography qualitative synthesis.
France, Emma F; Wells, Mary; Lang, Heidi; Williams, Brian
2016-03-15
Meta-ethnography is a unique, systematic, qualitative synthesis approach widely used to provide robust evidence on patient and clinician beliefs and experiences and understandings of complex social phenomena. It can make important theoretical and conceptual contributions to health care policy and practice. Since beliefs, experiences, health care contexts and social phenomena change over time, the continued relevance of the findings from meta-ethnographies cannot be assumed. However, there is little guidance on whether, when and how meta-ethnographies should be updated; Cochrane guidance on updating reviews of intervention effectiveness is unlikely to be fully appropriate. This is the first in-depth discussion on updating a meta-ethnography; it explores why, when and how to update a meta-ethnography. Three main methods of updating the analysis and synthesis are examined. Advantages and disadvantages of each method are outlined, relating to the context, purpose, process and output of the update and the nature of the new data available. Recommendations are made for the appropriate use of each method, and a worked example of updating a meta-ethnography is provided. This article makes a unique contribution to this evolving area of meta-ethnography methodology.
RISC-type microprocessors may revolutionize aerospace simulation
Jackson, Albert S.
The author explores the application of RISC (reduced instruction set computer) processors in massively parallel computer (MPC) designs for aerospace simulation. The MPC approach is shown to be well adapted to the needs of aerospace simulation. It is shown that any of the three common types of interconnection schemes used with MPCs are effective for general-purpose simulation, although the bus-or switch-oriented machines are somewhat easier to use. For partial differential equation models, the hypercube approach at first glance appears more efficient because the nearest-neighbor connections required for three-dimensional models are hardwired in a hypercube machine. However, the data broadcast ability of a bus system, combined with the fact that data can be transmitted over a bus as soon as it has been updated, makes the bus approach very competitive with the hypercube approach even for these types of models.
Kempka, T.; Norden, B.; Tillner, E.; Nakaten, B.; Kühn, M.
2012-04-01
Geological modelling and dynamic flow simulations were conducted at the Ketzin pilot site showing a good agreement of history matched geological models with CO2 arrival times in both observation wells and timely development of reservoir pressure determined in the injection well. Recently, a re-evaluation of the seismic 3D data enabled a refinement of the structural site model and the implementation of the fault system present at the top of the Ketzin anticline. The updated geological model (model size: 5 km x 5 km) shows a horizontal discretization of 5 x 5 m and consists of three vertical zones, with the finest discretization at the top (0.5 m). According to the revised seismic analysis, the facies modelling to simulate the channel and floodplain facies distribution at Ketzin was updated. Using a sequential Gaussian simulator for the distribution of total and effective porosities and an empiric porosity-permeability relationship based on site and literature data available, the structural model was parameterized. Based on this revised reservoir model of the Stuttgart formation, numerical simulations using the TOUGH2-MP/ECO2N and Schlumberger Information Services (SIS) ECLIPSE 100 black-oil simulators were undertaken in order to evaluate the long-term (up to 10,000 years) migration of the injected CO2 (about 57,000 t at the end of 2011) and the development of reservoir pressure over time. The simulation results enabled us to quantitatively compare both reservoir simulators based on current operational data considering the long-term effects of CO2 storage including CO2 dissolution in the formation fluid. While the integration of the static geological model developed in the SIS Petrel modelling package into the ECLIPSE simulator is relatively flawless, a work-flow allowing for the export of Petrel models into the TOUGH2-MP input file format had to be implemented within the scope of this study. The challenge in this task was mainly determined by the presence of a
Relating Standardized Visual Perception Measures to Simulator Visual System Performance
Kaiser, Mary K.; Sweet, Barbara T.
2013-01-01
Human vision is quantified through the use of standardized clinical vision measurements. These measurements typically include visual acuity (near and far), contrast sensitivity, color vision, stereopsis (a.k.a. stereo acuity), and visual field periphery. Simulator visual system performance is specified in terms such as brightness, contrast, color depth, color gamut, gamma, resolution, and field-of-view. How do these simulator performance characteristics relate to the perceptual experience of the pilot in the simulator? In this paper, visual acuity and contrast sensitivity will be related to simulator visual system resolution, contrast, and dynamic range; similarly, color vision will be related to color depth/color gamut. Finally, we will consider how some characteristics of human vision not typically included in current clinical assessments could be used to better inform simulator requirements (e.g., relating dynamic characteristics of human vision to update rate and other temporal display characteristics).
32 CFR 635.24 - Updating the COPS MPRS.
2010-07-01
... 32 National Defense 4 2010-07-01 2010-07-01 true Updating the COPS MPRS. 635.24 Section 635.24 National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY (CONTINUED) LAW ENFORCEMENT AND CRIMINAL INVESTIGATIONS LAW ENFORCEMENT REPORTING Offense Reporting § 635.24 Updating the COPS MPRS. Installation Provost Marshals/Directors of...
Realtime mine ventilation simulation
International Nuclear Information System (INIS)
McDaniel, K.H.
1997-01-01
This paper describes the development of a Windows based, interactive mine ventilation simulation software program at the Waste Isolation Pilot Plant (WIPP). To enhance the operation of the underground ventilation system, Westinghouse Electric Corporation developed the program called WIPPVENT. While WIPPVENT includes most of the functions of the commercially available simulation program VNETPC and uses the same subroutine to calculate airflow distributions, the user interface has been completely rewritten as a Windows application with screen graphics. WIPPVENT is designed to interact with WIPP ventilation monitoring systems through the sitewise Central monitoring System. Data can be continuously collected from the Underground Ventilation Remote Monitoring and Control System (e.g., air quantity and differential pressure) and the Mine Weather Stations (psychrometric data). Furthermore, WIPPVENT incorporates regulator characteristic curves specific to the site. The program utilizes this data to create and continuously update a REAL-TIME ventilation model. This paper discusses the design, key features, and interactive capabilities of WIPPVENT
Biomass Gasifier for Computer Simulation; Biomassa foergasare foer Computer Simulation
Energy Technology Data Exchange (ETDEWEB)
Hansson, Jens; Leveau, Andreas; Hulteberg, Christian [Nordlight AB, Limhamn (Sweden)
2011-08-15
This report is an effort to summarize the existing data on biomass gasifiers as the authors have taken part in various projects aiming at computer simulations of systems that include biomass gasification. Reliable input data is paramount for any computer simulation, but so far there is no easy-accessible biomass gasifier database available for this purpose. This study aims at benchmarking current and past gasifier systems in order to create a comprehensive database for computer simulation purposes. The result of the investigation is presented in a Microsoft Excel sheet, so that the user easily can implement the data in their specific model. In addition to provide simulation data, the technology is described briefly for every studied gasifier system. The primary pieces of information that are sought for are temperatures, pressures, stream compositions and energy consumption. At present the resulting database contains 17 gasifiers, with one or more gasifier within the different gasification technology types normally discussed in this context: 1. Fixed bed 2. Fluidised bed 3. Entrained flow. It also contains gasifiers in the range from 100 kW to 120 MW, with several gasifiers in between these two values. Finally, there are gasifiers representing both direct and indirect heating. This allows for a more qualified and better available choice of starting data sets for simulations. In addition to this, with multiple data sets available for several of the operating modes, sensitivity analysis of various inputs will improve simulations performed. However, there have been fewer answers to the survey than expected/hoped for, which could have improved the database further. However, the use of online sources and other public information has to some extent counterbalanced the low response frequency of the survey. In addition to that, the database is preferred to be a living document, continuously updated with new gasifiers and improved information on existing gasifiers.
Efficient Multiplicative Updates for Support Vector Machines
DEFF Research Database (Denmark)
Potluru, Vamsi K.; Plis, Sergie N; Mørup, Morten
2009-01-01
(NMF) problem. This allows us to derive a novel multiplicative algorithm for solving hard and soft margin SVM. The algorithm follows as a natural extension of the updates for NMF and semi-NMF. No additional parameter setting, such as choosing learning rate, is required. Exploiting the connection......The dual formulation of the support vector machine (SVM) objective function is an instance of a nonnegative quadratic programming problem. We reformulate the SVM objective function as a matrix factorization problem which establishes a connection with the regularized nonnegative matrix factorization...... between SVM and NMF formulation, we show how NMF algorithms can be applied to the SVM problem. Multiplicative updates that we derive for SVM problem also represent novel updates for semi-NMF. Further this unified view yields algorithmic insights in both directions: we demonstrate that the Kernel Adatron...
Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; DelGuercio, Chris
2008-01-01
The Sequence History Update Tool performs Web-based sequence statistics archiving for Mars Reconnaissance Orbiter (MRO). Using a single UNIX command, the software takes advantage of sequencing conventions to automatically extract the needed statistics from multiple files. This information is then used to populate a PHP database, which is then seamlessly formatted into a dynamic Web page. This tool replaces a previous tedious and error-prone process of manually editing HTML code to construct a Web-based table. Because the tool manages all of the statistics gathering and file delivery to and from multiple data sources spread across multiple servers, there is also a considerable time and effort savings. With the use of The Sequence History Update Tool what previously took minutes is now done in less than 30 seconds, and now provides a more accurate archival record of the sequence commanding for MRO.
A Coordinated Initialization Process for the Distributed Space Exploration Simulation
Crues, Edwin Z.; Phillips, Robert G.; Dexter, Dan; Hasan, David
2007-01-01
A viewgraph presentation on the federate initialization process for the Distributed Space Exploration Simulation (DSES) is described. The topics include: 1) Background: DSES; 2) Simulation requirements; 3) Nine Step Initialization; 4) Step 1: Create the Federation; 5) Step 2: Publish and Subscribe; 6) Step 3: Create Object Instances; 7) Step 4: Confirm All Federates Have Joined; 8) Step 5: Achieve initialize Synchronization Point; 9) Step 6: Update Object Instances With Initial Data; 10) Step 7: Wait for Object Reflections; 11) Step 8: Set Up Time Management; 12) Step 9: Achieve startup Synchronization Point; and 13) Conclusions
Rosa, Sarah N.; Hay, Lauren E.
2017-12-01
In 2014, the U.S. Geological Survey, in cooperation with the U.S. Department of Defense’s Strategic Environmental Research and Development Program, initiated a project to evaluate the potential impacts of projected climate-change on Department of Defense installations that rely on Guam’s water resources. A major task of that project was to develop a watershed model of southern Guam and a water-balance model for the Fena Valley Reservoir. The southern Guam watershed model provides a physically based tool to estimate surface-water availability in southern Guam. The U.S. Geological Survey’s Precipitation Runoff Modeling System, PRMS-IV, was used to construct the watershed model. The PRMS-IV code simulates different parts of the hydrologic cycle based on a set of user-defined modules. The southern Guam watershed model was constructed by updating a watershed model for the Fena Valley watersheds, and expanding the modeled area to include all of southern Guam. The Fena Valley watershed model was combined with a previously developed, but recently updated and recalibrated Fena Valley Reservoir water-balance model.Two important surface-water resources for the U.S. Navy and the citizens of Guam were modeled in this study; the extended model now includes the Ugum River watershed and improves upon the previous model of the Fena Valley watersheds. Surface water from the Ugum River watershed is diverted and treated for drinking water, and the Fena Valley watersheds feed the largest surface-water reservoir on Guam. The southern Guam watershed model performed “very good,” according to the criteria of Moriasi and others (2007), in the Ugum River watershed above Talofofo Falls with monthly Nash-Sutcliffe efficiency statistic values of 0.97 for the calibration period and 0.93 for the verification period (a value of 1.0 represents perfect model fit). In the Fena Valley watershed, monthly simulated streamflow volumes from the watershed model compared reasonably well with the
Update History of This Database - SAHG | LSDB Archive [Life Science Database Archive metadata
Lifescience Database Archive (English)
Full Text Available switchLanguage; BLAST Search Image Search Home About Archive Update History Data ...List Contact us SAHG Update History of This Database Date Update contents 2016/05/09 SAHG English archive si...te is opened. 2009/10 SAHG ( http://bird.cbrc.jp/sahg ) is opened. About This Database Database Description ...Download License Update History of This Database Site Policy | Contact Us Update History of This Database - SAHG | LSDB Archive ...
Update History of This Database - RMOS | LSDB Archive [Life Science Database Archive metadata
Lifescience Database Archive (English)
Full Text Available switchLanguage; BLAST Search Image Search Home About Archive Update History Data ...List Contact us RMOS Update History of This Database Date Update contents 2015/10/27 RMOS English archive si...12 RMOS (http://cdna01.dna.affrc.go.jp/RMOS/) is opened. About This Database Database Description Download License Update Hi...story of This Database Site Policy | Contact Us Update History of This Database - RMOS | LSDB Archive ...
Updating the Psoriatic Arthritis (PsA) Core Domain Set
DEFF Research Database (Denmark)
Orbai, Ana-Maria; de Wit, Maarten; Mease, Philip J
2017-01-01
OBJECTIVE: To include the patient perspective in accordance with the Outcome Measures in Rheumatology (OMERACT) Filter 2.0 in the updated Psoriatic Arthritis (PsA) Core Domain Set for randomized controlled trials (RCT) and longitudinal observational studies (LOS). METHODS: At OMERACT 2016, research...... conducted to update the PsA Core Domain Set was presented and discussed in breakout groups. The updated PsA Core Domain Set was voted on and endorsed by OMERACT participants. RESULTS: We conducted a systematic literature review of domains measured in PsA RCT and LOS, and identified 24 domains. We conducted...... and breakout groups at OMERACT 2016 in which findings were presented and discussed. The updated PsA Core Domain Set endorsed with 90% agreement by OMERACT 2016 participants included musculoskeletal disease activity, skin disease activity, fatigue, pain, patient's global assessment, physical function, health...
Environmental Regulatory Update Table, January/February 1992
Energy Technology Data Exchange (ETDEWEB)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.
1992-03-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action. This table is for January/February 1992.
Böddeker, B.; Teichler, H.
The MD simulation program TABB is motivated by the need of long time simulations for the investigation of slow processes near the glass transition of glass forming alloys. TABB is written in C++ with a high degree of flexibility: TABB allows the use of any short ranged pair potentials or EAM potentials, by generating and using a spline representation of all functions and their derivatives. TABB supports several numerical integration algorithms like the Runge-Kotta or the modified Gear-predictor-corrector algorithm of order five. The boundary conditions can be chosen to resemble the geometry of bulk materials or films. The simulation box length or the pressure can be fixed for each dimension separately. TABB may be used in isokinetic, isoenergeric or canonic (with random forces) mode. TABB contains a simple instruction interpreter to easily control the parameters and options during the simulation. The same source code can be compiled either for workstations or for parallel computers. The main optimization goal of TABB is to allow long time simulations of medium or small sized systems. To make this possible, much attention is spent on the optimized communication between the nodes. TABB uses a domain decomposition procedure. To use many nodes with a small system, the domain size has to be small compared to the range of particle interactions. In the limit of many nodes for only few atoms, the bottle neck of communication is the latency time. TABB minimizes the number of pairs of domains containing atoms that interact between these domains. This procedure minimizes the need of communication calls between pairs of nodes. TABB decides automatically, to how many, and to which directions the decomposition shall be applied. E.g., in the case of one dimensional domain decomposition, the simulation box is only split into "slabs" along a selected direction. The three dimensional domain decomposition is best with respect to the number of interacting domains only for simulations
Logistic simulation of the Cryomagnet Activities at Point 18
Lundgren, M
2002-01-01
The purpose of this work is to build a model in a simulation software (AutoMod) that can be used to improve the logistic planning and scheduling of the assembly and installation procedures concerning the Cryomagnets for the LHC project at point 18. The model should provide necessary information that simplifies the scheduling of activities based on time and resource availability. This report contains the results gathered from these simulations. This work been performed in the framework of a Master thesis. It is based on data available in November 2001. It has not been updated following latter decisions but the methodology and main results are still valid.
Future prospects of nuclear power plant simulators in Czechoslovakia
International Nuclear Information System (INIS)
Panek, J.
1983-01-01
The article is devoted to the innovation of the WWER-440 simulator which is being updated because of the obsolete design of the RPP 16 S system whose production has been discontinued. The innovation will replace the greater part of the computer system. Technical requirements and software of the new computer are described. The innovation will proceed in three stages: the acquisition of hardware, modification and complementation of mathematical models and the development of software. No innovation will be made in interface unit DASIO-600 which is conceptually new and has been practically proven. This is also true of the reactor instrumentation simulation which is based on the modular microprocessor system SM 50-40 and meets all requirements. The development of the WWER-1000 simulator has been approved. The development and construction of the simulator proceeds in the following stages: functional specification of the simulator, mathematical model, project of unit control room, assembly and debugging of the computer system, processing of user programs, production and activation of the unit control room and assembly, activation and adjustment of the whole simulator. (B.S.)
Developed hydraulic simulation model for water pipeline networks
Directory of Open Access Journals (Sweden)
A. Ayad
2013-03-01
Full Text Available A numerical method that uses linear graph theory is presented for both steady state, and extended period simulation in a pipe network including its hydraulic components (pumps, valves, junctions, etc.. The developed model is based on the Extended Linear Graph Theory (ELGT technique. This technique is modified to include new network components such as flow control valves and tanks. The technique also expanded for extended period simulation (EPS. A newly modified method for the calculation of updated flows improving the convergence rate is being introduced. Both benchmarks, ad Actual networks are analyzed to check the reliability of the proposed method. The results reveal the finer performance of the proposed method.
An Updated Geophysical Model for AMSR-E and SSMIS Brightness Temperature Simulations over Oceans
Directory of Open Access Journals (Sweden)
Elizaveta Zabolotskikh
2014-03-01
Full Text Available In this study, we considered the geophysical model for microwave brightness temperature (BT simulation for the Atmosphere-Ocean System under non-precipitating conditions. The model is presented as a combination of atmospheric absorption and ocean emission models. We validated this model for two satellite instruments—for Advanced Microwave Sounding Radiometer-Earth Observing System (AMSR-E onboard Aqua satellite and for Special Sensor Microwave Imager/Sounder (SSMIS onboard F16 satellite of Defense Meteorological Satellite Program (DMSP series. We compared simulated BT values with satellite BT measurements for different combinations of various water vapor and oxygen absorption models and wind induced ocean emission models. A dataset of clear sky atmospheric and oceanic parameters, collocated in time and space with satellite measurements, was used for the comparison. We found the best model combination, providing the least root mean square error between calculations and measurements. A single combination of models ensured the best results for all considered radiometric channels. We also obtained the adjustments to simulated BT values, as averaged differences between the model simulations and satellite measurements. These adjustments can be used in any research based on modeling data for removing model/calibration inconsistencies. We demonstrated the application of the model by means of the development of the new algorithm for sea surface wind speed retrieval from AMSR-E data.
Directory of Open Access Journals (Sweden)
Sagar Indurkhya
Full Text Available ODE simulations of chemical systems perform poorly when some of the species have extremely low concentrations. Stochastic simulation methods, which can handle this case, have been impractical for large systems due to computational complexity. We observe, however, that when modeling complex biological systems: (1 a small number of reactions tend to occur a disproportionately large percentage of the time, and (2 a small number of species tend to participate in a disproportionately large percentage of reactions. We exploit these properties in LOLCAT Method, a new implementation of the Gillespie Algorithm. First, factoring reaction propensities allows many propensities dependent on a single species to be updated in a single operation. Second, representing dependencies between reactions with a bipartite graph of reactions and species requires only storage for reactions, rather than the required for a graph that includes only reactions. Together, these improvements allow our implementation of LOLCAT Method to execute orders of magnitude faster than currently existing Gillespie Algorithm variants when simulating several yeast MAPK cascade models.
Indurkhya, Sagar; Beal, Jacob
2010-01-01
ODE simulations of chemical systems perform poorly when some of the species have extremely low concentrations. Stochastic simulation methods, which can handle this case, have been impractical for large systems due to computational complexity. We observe, however, that when modeling complex biological systems: (1) a small number of reactions tend to occur a disproportionately large percentage of the time, and (2) a small number of species tend to participate in a disproportionately large percentage of reactions. We exploit these properties in LOLCAT Method, a new implementation of the Gillespie Algorithm. First, factoring reaction propensities allows many propensities dependent on a single species to be updated in a single operation. Second, representing dependencies between reactions with a bipartite graph of reactions and species requires only storage for reactions, rather than the required for a graph that includes only reactions. Together, these improvements allow our implementation of LOLCAT Method to execute orders of magnitude faster than currently existing Gillespie Algorithm variants when simulating several yeast MAPK cascade models. PMID:20066048
Evidence-based guideline update
DEFF Research Database (Denmark)
Tfelt-Hansen, Peer Carsten
2013-01-01
Peer Carsten Tfelt-Hansen, Glostrup, Denmark: According to the recent American Academy of Neurology (AAN) guideline update, a drug can be recommended as possibly effective for migraine prevention if it had demonstrated efficacy in one Class II study.(1) Eight drugs are recommended as possibly...
Update History of This Database - SSBD | LSDB Archive [Life Science Database Archive metadata
Lifescience Database Archive (English)
Full Text Available switchLanguage; BLAST Search Image Search Home About Archive Update History Data ...List Contact us SSBD Update History of This Database Date Update contents 2016/07/25 SSBD English archive si...tion Download License Update History of This Database Site Policy | Contact Us Update History of This Database - SSBD | LSDB Archive ... ...te is opened. 2013/09/03 SSBD ( http://ssbd.qbic.riken.jp/ ) is opened. About This Database Database Descrip
Q2/Q3 2017 Solar Industry Update
Energy Technology Data Exchange (ETDEWEB)
Feldman, David J. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hoskins, Jack [Dept. of Energy (DOE), Washington DC (United States); Margolis, Robert M. [National Renewable Energy Lab. (NREL), Golden, CO (United States)
2017-10-24
This technical presentation provides an update on the major trends that occurred in the solar industry in Q2 and Q3 of 2017. Major topics of focus include global and U.S. supply and demand, module and system price, investment trends and business models, and updates on U.S. government programs supporting the solar industry.
Q2/Q3 2016 Solar Industry Update
Energy Technology Data Exchange (ETDEWEB)
Feldman, David; Boff, Daniel; Margolis, Robert
2016-10-11
This technical presentation provides an update on the major trends that occurred in the solar industry in the Q2 and Q3 of 2016. Major topics of focus include global and U.S. supply and demand, module and system price, investment trends and business models, and updates on U.S. government programs supporting the solar industry.
Q3/Q4 2016 Solar Industry Update
Energy Technology Data Exchange (ETDEWEB)
Feldman, David; Boff, Daniel; Margolis, Robert
2016-12-21
This technical presentation provides an update on the major trends that occurred in the solar industry in the Q3 and Q4 of 2016. Major topics of focus include global and U.S. supply and demand, module and system price, investment trends and business models, and updates on U.S. government programs supporting the solar industry.
Q3/Q4 2017 Solar Industry Update
Energy Technology Data Exchange (ETDEWEB)
Feldman, David J. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hoskins, Jack [Dept. of Energy (DOE), Washington DC (United States); Margolis, Robert M. [National Renewable Energy Lab. (NREL), Golden, CO (United States)
2018-02-15
This technical presentation provides an update on the major trends that occurred in the solar industry in the Q3 and Q4 of 2017. Major topics of focus include global and U.S. supply and demand, module and system price, investment trends and business models, and updates on U.S. government programs supporting the solar industry.
Utilization, maintenance and upgrading of the ANGRA-2 NPP simulator
International Nuclear Information System (INIS)
Rodrigues, J.C.
1997-01-01
In order to provide a highly qualified training for the future operation teams of the ANGRA-2 and 3 Nuclear Power Plants, ELECTRONUCLEAR has built a training center, at the Plant site, including a full-scope Simulator based on the Siemens/KWU design, PWR, 1300 MW ANGRA-2 Nuclear Power Plant. The Simulator is a full-scope replica of that of the ANGRA-2 Plant. The CTAS has implemented, since the beginning of its operation in Brazil, an extensive training program for operators, technical managers and commissioning and licensing personnel from foreign nuclear power plants, as well as specialists for national and foreign organizations involved with several activities related to nuclear installations and also with the operation of conventional electric power plants. After the installation of the Simulator in Brazil, the Simulator software had been continuously modified and updated. (author)
A multilevel-skin neighbor list algorithm for molecular dynamics simulation
Zhang, Chenglong; Zhao, Mingcan; Hou, Chaofeng; Ge, Wei
2018-01-01
Searching of the interaction pairs and organization of the interaction processes are important steps in molecular dynamics (MD) algorithms and are critical to the overall efficiency of the simulation. Neighbor lists are widely used for these steps, where thicker skin can reduce the frequency of list updating but is discounted by more computation in distance check for the particle pairs. In this paper, we propose a new neighbor-list-based algorithm with a precisely designed multilevel skin which can reduce unnecessary computation on inter-particle distances. The performance advantages over traditional methods are then analyzed against the main simulation parameters on Intel CPUs and MICs (many integrated cores), and are clearly demonstrated. The algorithm can be generalized for various discrete simulations using neighbor lists.
Updating of visual orientation in a gravity-based reference frame.
Niehof, Nynke; Tramper, Julian J; Doeller, Christian F; Medendorp, W Pieter
2017-10-01
The brain can use multiple reference frames to code line orientation, including head-, object-, and gravity-centered references. If these frames change orientation, their representations must be updated to keep register with actual line orientation. We tested this internal updating during head rotation in roll, exploiting the rod-and-frame effect: The illusory tilt of a vertical line surrounded by a tilted visual frame. If line orientation is stored relative to gravity, these distortions should also affect the updating process. Alternatively, if coding is head- or frame-centered, updating errors should be related to the changes in their orientation. Ten subjects were instructed to memorize the orientation of a briefly flashed line, surrounded by a tilted visual frame, then rotate their head, and subsequently judge the orientation of a second line relative to the memorized first while the frame was upright. Results showed that updating errors were mostly related to the amount of subjective distortion of gravity at both the initial and final head orientation, rather than to the amount of intervening head rotation. In some subjects, a smaller part of the updating error was also related to the change of visual frame orientation. We conclude that the brain relies primarily on a gravity-based reference to remember line orientation during head roll.
[Preoperative fasting guidelines: an update].
López Muñoz, A C; Busto Aguirreurreta, N; Tomás Braulio, J
2015-03-01
Anesthesiology societies have issued various guidelines on preoperative fasting since 1990, not only to decrease the incidence of lung aspiration and anesthetic morbidity, but also to increase patient comfort prior to anesthesia. Some of these societies have been updating their guidelines, as such that, since 2010, we now have 2 evidence-based preoperative fasting guidelines available. In this article, an attempt is made to review these updated guidelines, as well as the current instructions for more controversial patients such as infants, the obese, and a particular type of ophthalmic surgery. Copyright © 2014 Sociedad Española de Anestesiología, Reanimación y Terapéutica del Dolor. Publicado por Elsevier España, S.L.U. All rights reserved.
Dynamic bounds coupled with Monte Carlo simulations
Energy Technology Data Exchange (ETDEWEB)
Rajabalinejad, M., E-mail: M.Rajabalinejad@tudelft.n [Faculty of Civil Engineering, Delft University of Technology, Delft (Netherlands); Meester, L.E. [Delft Institute of Applied Mathematics, Delft University of Technology, Delft (Netherlands); Gelder, P.H.A.J.M. van; Vrijling, J.K. [Faculty of Civil Engineering, Delft University of Technology, Delft (Netherlands)
2011-02-15
For the reliability analysis of engineering structures a variety of methods is known, of which Monte Carlo (MC) simulation is widely considered to be among the most robust and most generally applicable. To reduce simulation cost of the MC method, variance reduction methods are applied. This paper describes a method to reduce the simulation cost even further, while retaining the accuracy of Monte Carlo, by taking into account widely present monotonicity. For models exhibiting monotonic (decreasing or increasing) behavior, dynamic bounds (DB) are defined, which in a coupled Monte Carlo simulation are updated dynamically, resulting in a failure probability estimate, as well as a strict (non-probabilistic) upper and lower bounds. Accurate results are obtained at a much lower cost than an equivalent ordinary Monte Carlo simulation. In a two-dimensional and a four-dimensional numerical example, the cost reduction factors are 130 and 9, respectively, where the relative error is smaller than 5%. At higher accuracy levels, this factor increases, though this effect is expected to be smaller with increasing dimension. To show the application of DB method to real world problems, it is applied to a complex finite element model of a flood wall in New Orleans.
Stochastic Simulation of Process Calculi for Biology
Directory of Open Access Journals (Sweden)
Andrew Phillips
2010-10-01
Full Text Available Biological systems typically involve large numbers of components with complex, highly parallel interactions and intrinsic stochasticity. To model this complexity, numerous programming languages based on process calculi have been developed, many of which are expressive enough to generate unbounded numbers of molecular species and reactions. As a result of this expressiveness, such calculi cannot rely on standard reaction-based simulation methods, which require fixed numbers of species and reactions. Rather than implementing custom stochastic simulation algorithms for each process calculus, we propose to use a generic abstract machine that can be instantiated to a range of process calculi and a range of reaction-based simulation algorithms. The abstract machine functions as a just-in-time compiler, which dynamically updates the set of possible reactions and chooses the next reaction in an iterative cycle. In this short paper we give a brief summary of the generic abstract machine, and show how it can be instantiated with the stochastic simulation algorithm known as Gillespie's Direct Method. We also discuss the wider implications of such an abstract machine, and outline how it can be used to simulate multiple calculi simultaneously within a common framework.
Wear simulation of apex seal in rotary engine under mixed lubrication
Jiang, Hanying; Zuo, Zhengxing; Liu, Jinxiang
2018-05-01
In this work, the wear of apex seal's running face under mixed lubrication is studied. Numerical simulation is carried out by employing the couple model of Reynolds equation, Greenwood and Tripp model and Archard's wear law. The simulation is performed both for one circle and multi circle. In the multi circle simulation, the change of contact position due to wear is considered. A method that is able to find the new contact position based on the updated apex seal's contour profile is proposed, validated and used. The result of multi circle simulation indicates that contact position changes obviously around the maximum swing angles both on leading and trailing sides with the increase number of circles. The wear depth distribution becomes more uniform with the increase of operation circle number.
Energy Technology Data Exchange (ETDEWEB)
WHEELER, TIMOTHY A.; WYSS, GREGORY D.; HARPER, FREDERICK T.
2000-11-01
Uncertainty distributions for specific parameters of the Cassini General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) Final Safety Analysis Report consequence risk analysis were revised and updated. The revisions and updates were done for all consequence parameters for which relevant information exists from the joint project on Probabilistic Accident Consequence Uncertainty Analysis by the United States Nuclear Regulatory Commission and the Commission of European Communities.
Update History of This Database - RED | LSDB Archive [Life Science Database Archive metadata
Lifescience Database Archive (English)
Full Text Available switchLanguage; BLAST Search Image Search Home About Archive Update History Data ...List Contact us RED Update History of This Database Date Update contents 2015/12/21 Rice Expression Database English archi...s Database Database Description Download License Update History of This Database Site Policy | Contact Us Update History of This Database - RED | LSDB Archive ... ...ve site is opened. 2000/10/1 Rice Expression Database ( http://red.dna.affrc.go.jp/RED/ ) is opened. About Thi
Modernizing the ATLAS simulation infrastructure
AUTHOR|(INSPIRE)INSPIRE-00213431; The ATLAS collaboration
2017-01-01
The ATLAS Simulation infrastructure has been used to produce upwards of 50 billion proton-proton collision events for analyses ranging from detailed Standard Model measurements to searches for exotic new phenomena. In the last several years, the infrastructure has been heavily revised to allow intuitive multithreading and significantly improved maintainability. Such a massive update of a legacy code base requires careful choices about what pieces of code to completely rewrite and what to wrap or revise. The initialization of the complex geometry was generalized to allow new tools and geometry description languages, popular in some detector groups. The addition of multithreading requires Geant4-MT and GaudiHive, two frameworks with fundamentally different approaches to multithreading, to work together. It also required enforcing thread safety throughout a large code base, which required the redesign of several aspects of the simulation, including truth, the record of particle interactions with the detector dur...
Townsend, Molly T; Sarigul-Klijn, Nesrin
2016-01-01
Simplified material models are commonly used in computational simulation of biological soft tissue as an approximation of the complicated material response and to minimize computational resources. However, the simulation of complex loadings, such as long-duration tissue swelling, necessitates complex models that are not easy to formulate. This paper strives to offer the updated Lagrangian formulation comprehensive procedure of various non-linear material models for the application of finite element analysis of biological soft tissues including a definition of the Cauchy stress and the spatial tangential stiffness. The relationships between water content, osmotic pressure, ionic concentration and the pore pressure stress of the tissue are discussed with the merits of these models and their applications.
Selective updating of working memory content modulates meso-cortico-striatal activity.
Murty, Vishnu P; Sambataro, Fabio; Radulescu, Eugenia; Altamura, Mario; Iudicello, Jennifer; Zoltick, Bradley; Weinberger, Daniel R; Goldberg, Terry E; Mattay, Venkata S
2011-08-01
Accumulating evidence from non-human primates and computational modeling suggests that dopaminergic signals arising from the midbrain (substantia nigra/ventral tegmental area) mediate striatal gating of the prefrontal cortex during the selective updating of working memory. Using event-related functional magnetic resonance imaging, we explored the neural mechanisms underlying the selective updating of information stored in working memory. Participants were scanned during a novel working memory task that parses the neurophysiology underlying working memory maintenance, overwriting, and selective updating. Analyses revealed a functionally coupled network consisting of a midbrain region encompassing the substantia nigra/ventral tegmental area, caudate, and dorsolateral prefrontal cortex that was selectively engaged during working memory updating compared to the overwriting and maintenance of working memory content. Further analysis revealed differential midbrain-dorsolateral prefrontal interactions during selective updating between low-performing and high-performing individuals. These findings highlight the role of this meso-cortico-striatal circuitry during the selective updating of working memory in humans, which complements previous research in behavioral neuroscience and computational modeling. Published by Elsevier Inc.
Second update The Gordon Bell Competition entry gb110s2
International Nuclear Information System (INIS)
Vranas, P; Soltz, R
2006-01-01
Since the update to our entry of October 20th we have just made a significant improvement. We understand that this is past the deadline for updates and very close to the conference date. However, Lawrence Livermore National Laboratory has just updated the BG/L system software on their full 64 BG/L supercomputer to IBM-BGL Release 3. As we discussed in our update of October 20 this release includes our custom L1 and SRAM access functions that allow us to achieve higher sustained performance. Just a few hours ago we got access to the full system and obtained the fastest sustained performance point. In the full 131,072 CPU-cores system QCD sustains 70.9 Teraflops for the Dirac operator and 67.9 teraflops for the full Conjugate Gradient inverter. This is about 20% faster than our last update. We attach the corresponding speedup figure. As you can tell the speedup is perfect. This figure is the same as Figure 1 of our October 20th update except that it now includes the 131,072 CPU-cores point
Evolution of ERP Systems in the Cloud: A Study on System Updates
Directory of Open Access Journals (Sweden)
Elise Bjelland
2018-06-01
Full Text Available Cloud-based enterprise resource planning (ERP systems emerged around the new millennium, and since then there has been a lack of research regarding the evolution and update processes of these systems. From the users’ perspective, updates in a traditional on-premise ERP system are carried at their own request; while cloud-based ERPs are compulsory updated. Through an established ERP lifecycle framework, this study investigates how the process of updates is conducted in a cloud ERP context, from both the users’ and vendors’ perspectives. A multiple case study was conducted in Norway at 10 client organizations, as well as a cloud ERP vendor. Our main findings suggest that the vendor and the users view the process of updates differently. The main challenges with the process of updates from the users’ perspective are the size and date of the updates, lack of information and communication during the process, and extinction of certain functionalities. Yet, the main advantages are that all system users will always have the same version of the system, users do not need to spend time on updating the system and paying attention to the ERP market, which leads to more focus on their core competences instead.
Update History of This Database - RPD | LSDB Archive [Life Science Database Archive metadata
Lifescience Database Archive (English)
Full Text Available switchLanguage; BLAST Search Image Search Home About Archive Update History Data ...List Contact us RPD Update History of This Database Date Update contents 2016/02/02 Rice Proteome Database English archi...s Database Database Description Download License Update History of This Database Site Policy | Contact Us Update History of This Database - RPD | LSDB Archive ... ...ve site is opened. 2003/01/07 Rice Proteome Database ( http://gene64.dna.affrc.go.jp/RPD/ ) is opened. About Thi
Update History of This Database - PLACE | LSDB Archive [Life Science Database Archive metadata
Lifescience Database Archive (English)
Full Text Available switchLanguage; BLAST Search Image Search Home About Archive Update History Data ...List Contact us PLACE Update History of This Database Date Update contents 2016/08/22 The contact address is...s Database Database Description Download License Update History of Thi...s Database Site Policy | Contact Us Update History of This Database - PLACE | LSDB Archive ... ... changed. 2014/10/20 The URLs of the database maintenance site and the portal site are changed. 2014/07/17 PLACE English archi
77 FR 19077 - Adoption of Updated EDGAR Filer Manual
2012-03-30
... practice, publication for notice and comment is not required under the Administrative Procedure Act (APA...-30008] Adoption of Updated EDGAR Filer Manual AGENCY: Securities and Exchange Commission. ACTION: Final... Electronic Data Gathering, Analysis, and Retrieval System (EDGAR) Filer Manual to reflect updates to the...
76 FR 73506 - Adoption of Updated EDGAR Filer Manual
2011-11-29
... practice, publication for notice and comment is not required under the Administrative Procedure Act (APA...-29868] Adoption of Updated EDGAR Filer Manual AGENCY: Securities and Exchange Commission. ACTION: Final... Electronic Data Gathering, Analysis, and Retrieval System (EDGAR) Filer Manual to reflect updates to the...
77 FR 71089 - Pilot Loading of Aeronautical Database Updates
2012-11-29
...) card, rather than in resident memory. The database update was accomplished by removing the SD card with... frequency distance measuring equipment (DME), and any updates that affect system operating software--that... developed with attention to data integrity. Current technology uses databases which are developed in...
The effect of Fukushima in the training simulators
International Nuclear Information System (INIS)
Garces, M.; Garcia, S.
2012-01-01
The main areas where improvements in comparison to the current situation will be required are: response to prolonged loss of power supply, availability of additional cooling of the core sources, new measures to ensure the integrity of the containment, monitoring and control of the cooling of the spent fuel pool, revision of guidelines, management of severe accidents and existence of centers or organizations external emergency control. Associated training and coaching needs will require the updating of simulators.
LBTool: A stochastic toolkit for leave-based key updates
DEFF Research Database (Denmark)
Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming
2012-01-01
Quantitative techniques have been successfully employed in verification of information and communication systems. However, the use of such techniques are still rare in the area of security. In this paper, we present a toolkit that implements transient analysis on a key update method for wireless...... sensor networks. The analysis aims to find out the probability of a network key being compromised at a specific time point, which result in fluctuations over time for a specific key update method called Leave-based key update. For such a problem, the use of current tools is limited in many ways...
Nevada Nuclear Waste Storage Investigations, January-June 1987: An update
International Nuclear Information System (INIS)
Tamura, A.T.; Lorenz, J.J.
1988-03-01
This update contains information on the Nevada Nuclear Waste Storage Investigations (NNWSI) that was added to the DOE Energy Data Base during the first six months of 1987. The update is categorized by principal NNWSI Project participating organization, and items are arranged in chronological order. Participant-sponsored subcontractor reports, papers, and articles are included in the sponsoring organization's list. The publication following this update will be a supplement to the first bibliography (DOE/TIC-3406) and will include all information retrieved from January 1, 1986, to December 31, 1987. It will be a cumulation of all updates for this two-year interval and will include indexing for: Corporate Author, Personal Author, Subject, Contract Number, Report Number, Order Number Correlation, and Key Word in Context
On low-rank updates to the singular value and Tucker decompositions
Energy Technology Data Exchange (ETDEWEB)
O' Hara, M J
2009-10-06
The singular value decomposition is widely used in signal processing and data mining. Since the data often arrives in a stream, the problem of updating matrix decompositions under low-rank modification has been widely studied. Brand developed a technique in 2006 that has many advantages. However, the technique does not directly approximate the updated matrix, but rather its previous low-rank approximation added to the new update, which needs justification. Further, the technique is still too slow for large information processing problems. We show that the technique minimizes the change in error per update, so if the error is small initially it remains small. We show that an updating algorithm for large sparse matrices should be sub-linear in the matrix dimension in order to be practical for large problems, and demonstrate a simple modification to the original technique that meets the requirements.
Nuclear data sheets update for A = 197
Energy Technology Data Exchange (ETDEWEB)
Chunmei, Zhou [Chinese Nuclear Data Center, Beijing, BJ (China)
1996-06-01
The Nuclear Data Sheet for A = 197 has been carried out on the basis of the nuclear reaction and decay experiments leading to all the nuclei with mass number A = 197 since cutoff date of the last evaluation, December 1989. Most evaluation data have been updated or revised. The nuclei of updated data mainly are {sup 197}Hg, {sup 197}Pb, {sup 197}Bi and {sup 197}Po. The adopted levels and adopted gamma radiations for all nuclei are shown in the tables.
Update of Part 61 impacts analysis methodology
International Nuclear Information System (INIS)
Oztunali, O.I.; Roles, G.W.
1986-01-01
The US Nuclear Regulatory Commission is expanding the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of costs and impacts of disposal of waste that exceeds Class C concentrations. The project includes updating the computer codes that comprise the methodology, reviewing and updating data assumptions on waste streams and disposal technologies, and calculation of costs for small as well as large disposal facilities. This paper outlines work done to date on this project
Update of Part 61 impacts analysis methodology
International Nuclear Information System (INIS)
Oztunali, O.I.; Roles, G.W.; US Nuclear Regulatory Commission, Washington, DC 20555)
1985-01-01
The US Nuclear Regulatory Commission is expanding the impacts analysis methodology used during the development of the 10 CFR Part 61 regulation to allow improved consideration of costs and impacts of disposal of waste that exceeds Class C concentrations. The project includes updating the computer codes that comprise the methodology, reviewing and updating data assumptions on waste streams and disposal technologies, and calculation of costs for small as well as large disposal facilities. This paper outlines work done to date on this project
Fracture network modeling and GoldSim simulation support
International Nuclear Information System (INIS)
Sugita, Kenichiro; Dershowitz, William
2003-01-01
During Heisei-14, Golder Associates provided support for JNC Tokai through data analysis and simulation of the MIU Underground Rock Laboratory, participation in Task 6 of the Aespoe Task Force on Modelling of Groundwater Flow and Transport, and analysis of repository safety assessment technologies including cell networks for evaluation of the disturbed rock zone (DRZ) and total systems performance assessment (TSPA). MIU Underground Rock Laboratory support during H-14 involved discrete fracture network (DFN) modelling in support of the Multiple Modelling Project (MMP) and the Long Term Pumping Test (LPT). Golder developed updated DFN models for the MIU site, reflecting updated analyses of fracture data. Golder also developed scripts to support JNC simulations of flow and transport pathways within the MMP. Golder supported JNC participation in Task 6 of the Aespoe Task Force on Modelling of Groundwater Flow and Transport during H-14. Task 6A and 6B compared safety assessment (PA) and experimental time scale simulations along a pipe transport pathway. Task 6B2 extended Task 6B simulations from 1-D to 2-D. For Task 6B2, Golder carried out single fracture transport simulations on a wide variety of generic heterogeneous 2D fractures using both experimental and safety assessment boundary conditions. The heterogeneous 2D fractures were implemented according to a variety of in plane heterogeneity patterns. Multiple immobile zones were considered including stagnant zones, infillings, altered wall rock, and intact rock. During H-14, JNC carried out extensive studies of the distributed rock zone (DRZ) surrounding repository tunnels and drifts. Golder supported this activity be evaluating the calculation time necessary for simulating a reference heterogeneous DRZ cell network for a range of computational strategies. To support the development of JNC's total system performance assessment (TSPA) strategy, Golder carried out a review of the US DOE Yucca Mountain Project TSPA. This
Development of the updated system of city underground pipelines based on Visual Studio
Zhang, Jianxiong; Zhu, Yun; Li, Xiangdong
2009-10-01
Our city has owned the integrated pipeline network management system with ArcGIS Engine 9.1 as the bottom development platform and with Oracle9i as basic database for storaging data. In this system, ArcGIS SDE9.1 is applied as the spatial data engine, and the system was a synthetic management software developed with Visual Studio visualization procedures development tools. As the pipeline update function of the system has the phenomenon of slower update and even sometimes the data lost, to ensure the underground pipeline data can real-time be updated conveniently and frequently, and the actuality and integrity of the underground pipeline data, we have increased a new update module in the system developed and researched by ourselves. The module has the powerful data update function, and can realize the function of inputting and outputting and rapid update volume of data. The new developed module adopts Visual Studio visualization procedures development tools, and uses access as the basic database to storage data. We can edit the graphics in AutoCAD software, and realize the database update using link between the graphics and the system. Practice shows that the update module has good compatibility with the original system, reliable and high update efficient of the database.
Institutional Ethics Committee Regulations and Current Updates in India.
Mahuli, Amit V; Mahuli, Simpy A; Patil, Shankargouda; Bhandi, Shilpa
2017-08-01
The aim of the review is to provide current updates on regulations for ethics committees and researchers in India. Ethical dilemmas in research since time immemorial have been a major concern for researchers worldwide. The question "what makes clinical research ethical" is significant and difficult to answer as multiple factors are involved. The research involving human participants in clinical trials should follow the required rules, regulations, and guidelines in one's own country. It is a dynamic process, and updates have to be learned by researcher and committee members. The review highlights the ethical regulation from the Drug Controller General of India, Clinical Trial Registry of India, and Indian Council of Medical Research guidelines. In this article, the updates on Indian scenario of the Ethical Committee and guidelines are compiled. The review comes handy for clinical researchers and ethics committee members in academic institutions to check on the current updates and keep abreast with the knowledge on regulations of ethics in India.
Updates to the Demographic and Spatial Allocation Models to ...
EPA announced the availability of the draft report, Updates to the Demographic and Spatial Allocation Models to Produce Integrated Climate and Land Use Scenarios (ICLUS) for a 30-day public comment period. The ICLUS version 2 (v2) modeling tool furthered land change modeling by providing nationwide housing development scenarios up to 2100. ICLUS V2 includes updated population and land use data sets and addressing limitations identified in ICLUS v1 in both the migration and spatial allocation models. The companion user guide describes the development of ICLUS v2 and the updates that were made to the original data sets and the demographic and spatial allocation models. [2017 UPDATE] Get the latest version of ICLUS and stay up-to-date by signing up to the ICLUS mailing list. The GIS tool enables users to run SERGoM with the population projections developed for the ICLUS project and allows users to modify the spatial allocation housing density across the landscape.
Internet Journal of Medical Update
African Journals Online (AJOL)
admin
Internet Journal of Medical Update 2010 July;5(2):8-14. Internet Journal ... hospitalizations. This study of Nigerian patients with diabetes examined the adequacy of ..... Physicians need .... relationship between patient education and glycaemic ...
Updated science systems on USCGC Healy
Chayes, D. N.; Roberts, S. D.; Arko, R. A.; Hiller, S. M.
2008-12-01
The USCG cutter Healy is the U.S. Arctic research icebreaker. Prior to the 2008 season, a number of upgrades and improvements were made to the science systems. These included the addition of two Bell BGM-3 marine gravity meters. The vessel's existing meterological sensors were enhanced with two RM Young model 85004 heated ultrasonic anemometers; a Paroscientific, Inc. model "MET-3A" air temperature, humidity and barometric pressure subsystem; and an RM Young model 50202 heated rain gauge. The flow through sea water system was updated with new flow meters, a SeaBird SBE45 thermosalinograph, long and a short wave radiation sensors, a Seapoint fluorometer. A Milltech Marine Smart Radio model SR161 Automatic Identification System (AIS) receiver and an updated interface to real-time winch and wire performance have been added. Our onboard real-time GIS has been updated to include real-time plotting of other ship tracks from our AIS receiver and the ability for users to save and share planned tracks. For the HLY0806 leg, we implemented a SWAP ship-to ship wireless connection for our two-ship operations with the Canadian icebreaker Louis S. St. Laurent similar to the one we implemented for our two-ship program with the Swedish icebreaker Oden in 2005. We updated our routine delivery of underway data to investigators, as well as a copy for archiving to the NSF-supported Marine Geoscience Data System (MGDS), using portable "boomerang" drives. An end-user workstation was added to accommodate increasing demand for onboard processing. Technical support for science on the Healy is supported by the U.S. National Science Foundation.
Quantitative evaluation of ozone and selected climate parameters in a set of EMAC simulations
Directory of Open Access Journals (Sweden)
M. Righi
2015-03-01
Full Text Available Four simulations with the ECHAM/MESSy Atmospheric Chemistry (EMAC model have been evaluated with the Earth System Model Validation Tool (ESMValTool to identify differences in simulated ozone and selected climate parameters that resulted from (i different setups of the EMAC model (nudged vs. free-running and (ii different boundary conditions (emissions, sea surface temperatures (SSTs and sea ice concentrations (SICs. To assess the relative performance of the simulations, quantitative performance metrics are calculated consistently for the climate parameters and ozone. This is important for the interpretation of the evaluation results since biases in climate can impact on biases in chemistry and vice versa. The observational data sets used for the evaluation include ozonesonde and aircraft data, meteorological reanalyses and satellite measurements. The results from a previous EMAC evaluation of a model simulation with nudging towards realistic meteorology in the troposphere have been compared to new simulations with different model setups and updated emission data sets in free-running time slice and nudged quasi chemistry-transport model (QCTM mode. The latter two configurations are particularly important for chemistry-climate projections and for the quantification of individual sources (e.g., the transport sector that lead to small chemical perturbations of the climate system, respectively. With the exception of some specific features which are detailed in this study, no large differences that could be related to the different setups (nudged vs. free-running of the EMAC simulations were found, which offers the possibility to evaluate and improve the overall model with the help of shorter nudged simulations. The main differences between the two setups is a better representation of the tropospheric and stratospheric temperature in the nudged simulations, which also better reproduce stratospheric water vapor concentrations, due to the improved
Updating radon daughter bronchial dosimetry
International Nuclear Information System (INIS)
Harley, N.H.; Cohen, B.S.
1990-01-01
It is of value to update radon daughter bronchial dosimetry as new information becomes available. Measurements have now been performed using hollow casts of the human bronchial tree with a larynx to determine convective or turbulent deposition in the upper airways. These measurements allow a more realistic calculation of bronchial deposition by diffusion. Particle diameters of 0.15 and 0.2 μm were used which correspond to the activity median diameters for radon daughters in both environmental and mining atmospheres. The total model incorporates Yeh/Schum bronchial morphometry, deposition of unattached and attached radon daughters, build up and decay of the daughters and mucociliary clearance. The alpha dose to target cells in the bronchial epithelium is calculated for the updated model and compared with previous calculations of bronchial dose
Evaluation of two updating methods for dissipative models on a real structure
International Nuclear Information System (INIS)
Moine, P.; Billet, L.
1996-01-01
Finite Element Models are widely used to predict the dynamic behaviour from structures. Frequently, the model does not represent the structure with all be expected accuracy i.e. the measurements realised on the structure differ from the data predicted by the model. It is therefore necessary to update the model. Although many modeling errors come from inadequate representation of the damping phenomena, most of the model updating techniques are up to now restricted to conservative models only. In this paper, we present two updating methods for dissipative models using Eigen mode shapes and Eigen values as behavioural information from the structure. The first method - the modal output error method - compares directly the experimental Eigen vectors and Eigen values to the model Eigen vectors and Eigen values whereas the second method - the error in constitutive relation method - uses an energy error derived from the equilibrium relation. The error function, in both cases, is minimized by a conjugate gradient algorithm and the gradient is calculated analytically. These two methods behave differently which can be evidenced by updating a real structure constituted from a piece of pipe mounted on two viscous elastic suspensions. The updating of the model validates an updating strategy consisting in realizing a preliminary updating with the error in constitutive relation method (a fast to converge but difficult to control method) and then to pursue the updating with the modal output error method (a slow to converge but reliable and easy to control method). Moreover the problems encountered during the updating process and their corresponding solutions are given. (authors)
A general concurrent algorithm for plasma particle-in-cell simulation codes
International Nuclear Information System (INIS)
Liewer, P.C.; Decyk, V.K.
1989-01-01
We have developed a new algorithm for implementing plasma particle-in-cell (PIC) simulation codes on concurrent processors with distributed memory. This algorithm, named the general concurrent PIC algorithm (GCPIC), has been used to implement an electrostatic PIC code on the 33-node JPL Mark III Hypercube parallel computer. To decompose at PIC code using the GCPIC algorithm, the physical domain of the particle simulation is divided into sub-domains, equal in number to the number of processors, such that all sub-domains have roughly equal numbers of particles. For problems with non-uniform particle densities, these sub-domains will be of unequal physical size. Each processor is assigned a sub-domain and is responsible for updating the particles in its sub-domain. This algorithm has led to a a very efficient parallel implementation of a well-benchmarked 1-dimensional PIC code. The dominant portion of the code, updating the particle positions and velocities, is nearly 100% efficient when the number of particles is increased linearly with the number of hypercube processors used so that the number of particles per processor is constant. For example, the increase in time spent updating particles in going from a problem with 11,264 particles run on 1 processor to 360,448 particles on 32 processors was only 3% (parallel efficiency of 97%). Although implemented on a hypercube concurrent computer, this algorithm should also be efficient for PIC codes on other parallel architectures and for large PIC codes on sequential computers where part of the data must reside on external disks. copyright 1989 Academic Press, Inc
Action simulation: time course and representational mechanisms
Springer, Anne; Parkinson, Jim; Prinz, Wolfgang
2013-01-01
The notion of action simulation refers to the ability to re-enact foreign actions (i.e., actions observed in other individuals). Simulating others' actions implies a mirroring of their activities, based on one's own sensorimotor competencies. Here, we discuss theoretical and experimental approaches to action simulation and the study of its representational underpinnings. One focus of our discussion is on the timing of internal simulation and its relation to the timing of external action, and a paradigm that requires participants to predict the future course of actions that are temporarily occluded from view. We address transitions between perceptual mechanisms (referring to action representation before and after occlusion) and simulation mechanisms (referring to action representation during occlusion). Findings suggest that action simulation runs in real-time; acting on newly created action representations rather than relying on continuous visual extrapolations. A further focus of our discussion pertains to the functional characteristics of the mechanisms involved in predicting other people's actions. We propose that two processes are engaged, dynamic updating and static matching, which may draw on both semantic and motor information. In a concluding section, we discuss these findings in the context of broader theoretical issues related to action and event representation, arguing that a detailed functional analysis of action simulation in cognitive, neural, and computational terms may help to further advance our understanding of action cognition and motor control. PMID:23847563
Update History of This Database - RMG | LSDB Archive [Life Science Database Archive metadata
Lifescience Database Archive (English)
Full Text Available switchLanguage; BLAST Search Image Search Home About Archive Update History Data ...List Contact us RMG Update History of This Database Date Update contents 2016/08/22 The contact address is c...dna.affrc.go.jp/ ) is opened. About This Database Database Description Download License Update Hi...story of This Database Site Policy | Contact Us Update History of This Database - RMG | LSDB Archive ... ... URL of the portal site is changed. 2013/08/07 RMG archive site is opened. 2002/09/25 RMG ( http://rmg.rice.
2006-01-01
As part of the upgrade of telephone services, the CERN switching centre will be updated on Wednesday 14 June between 8.00 p.m. and midnight. Telephone services may be disrupted and possibly even interrupted during this operation. We apologise in advance for any inconvenience this may cause. CERN TELECOM Service
OSATE Overview & Community Updates
2015-02-15
update 5a. CONTRACT NUMBER 5b. GRANT NUMBER 5c. PROGRAM ELEMENT NUMBER 6. AUTHOR(S) Delange /Julien 5d. PROJECT NUMBER 5e. TASK NUMBER 5f. WORK...main language capabilities Modeling patterns & model samples for beginners Error-Model examples EMV2 model constructs Demonstration of tools Case
78 FR 4766 - Adoption of Updated EDGAR Filer Manual
2013-01-23
..., publication for notice and comment is not required under the Administrative Procedure Act (APA).\\7\\ It follows...-68644; 39-2488; IC-30348] Adoption of Updated EDGAR Filer Manual AGENCY: Securities and Exchange...) Filer Manual and related rules to reflect updates to the EDGAR system. The revisions are being made...
Design and development for updating national 1:50,000 topographic databases in China
Directory of Open Access Journals (Sweden)
CHEN Jun
2010-02-01
Full Text Available 1.1 Objective Map databases are irreplaceable national treasure of immense importance. Their currency referring to its consistency with respect to the real world plays a critical role in its value and applications. The continuous updating of map databases at 1:50,000 scales is a massive and difficult task for larger countries of the size of more than several million’s kilometer squares. This paper presents the research and technological development to support the national map updating at 1:50,000 scales in China, including the development of updating models and methods, production tools and systems for large-scale and rapid updating, as well as the design and implementation of the continuous updating workflow. 1.2 Methodology The updating of map databases is different than its original creation, and a number of new problems should be solved, such as change detection using latest multi-source data, incremental object revision and relation amendment. The methodology of this paper consists of the following three parts: 1 Examine the four key aspects of map database updating and develop basic updating models/methods, such as currentness-oriented integration of multi-resource data, completeness-based incremental change detection in the context of existing datasets, consistency-aware processing of updated data sets, and user-friendly propagation and services of updates. 2 Design and develop specific software tools and packages to support the large-scale updating production with high resolution imagery and large-scale data generalization, such as map generalization, GIS-supported change interpretation from imagery, DEM interpolation, image matching-based orthophoto generation, data control at different levels. 3 Design a national 1:50,000 databases updating strategy and its production workflow, including a full coverage updating pattern characterized by all element topographic data modeling, change detection in all related areas, and whole process
A new frequency matching technique for FRF-based model updating
Yang, Xiuming; Guo, Xinglin; Ouyang, Huajiang; Li, Dongsheng
2017-05-01
Frequency Response Function (FRF) residues have been widely used to update Finite Element models. They are a kind of original measurement information and have the advantages of rich data and no extraction errors, etc. However, like other sensitivity-based methods, an FRF-based identification method also needs to face the ill-conditioning problem which is even more serious since the sensitivity of the FRF in the vicinity of a resonance is much greater than elsewhere. Furthermore, for a given frequency measurement, directly using a theoretical FRF at a frequency may lead to a huge difference between the theoretical FRF and the corresponding experimental FRF which finally results in larger effects of measurement errors and damping. Hence in the solution process, correct selection of the appropriate frequency to get the theoretical FRF in every iteration in the sensitivity-based approach is an effective way to improve the robustness of an FRF-based algorithm. A primary tool for right frequency selection based on the correlation of FRFs is the Frequency Domain Assurance Criterion. This paper presents a new frequency selection method which directly finds the frequency that minimizes the difference of the order of magnitude between the theoretical and experimental FRFs. A simulated truss structure is used to compare the performance of different frequency selection methods. For the sake of reality, it is assumed that not all the degrees of freedom (DoFs) are available for measurement. The minimum number of DoFs required in each approach to correctly update the analytical model is regarded as the right identification standard.
SAVY-4000 Field Surveillance Plan Update for 2017
Energy Technology Data Exchange (ETDEWEB)
Kelly, Elizabeth J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Stone, Timothy Amos [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Smith, Paul Herrick [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Reeves, Kirk Patrick [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Veirs, Douglas Kirk [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Prochnow, David Adrian [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-03-20
The Packaging Surveillance Program section of the Department of Energy (DOE) Manual 441.1-1, Nuclear Material Packaging Manual (DOE 2008), requires DOE contractors to “ensure that a surveillance program is established and implemented to ensure the nuclear material storage package continues to meet its design criteria.”This 2017 update reflects changes to the surveillance plan resulting from surveillance findings as documented in Reeves et al. 2016. These findings include observations of corrosion in SAVY and Hagan containers and the indication (in one SAVY container) of possible filter membrane thermal degradation. This surveillance plan update documents the rationale for selecting surveillance containers, specifies the containers for 2017 surveillance, and identifies a minimum set of containers for 2018 surveillance. This update contains important changes to the previous surveillance plans.
An ORIGEN-2 update for PCs and mainframes
International Nuclear Information System (INIS)
Ludwig, S.B.
1992-01-01
This paper reports that an updated version of the ORIGEN2 code package has been prepared by Oak Ridge National Laboratory. ORIGEN2 is used extensively by the DOE office of Civilian Radioactive Waste Management (OCRWM) and its contractors to determine the characteristics of spent fuel and high-level radioactive waste due to irradiation, decay, and processing. Included in this update are revised ORIGEN2 cross-section libraries for standard- and extended-burnup PWRs and BWRs. This update also includes improvements to the ORIGEN2 computer code (designated as ORIGEN2, Version 2.1 8-1-91 release). This version of ORIGEN2 provides a single source code that may be executed on both mainframes and 80386 or 80486 PCs, effectively smashing the 640 KB barrier that limited previous PC implementations
Dark matter search in a Beam-Dump eXperiment (BDX) at Jefferson Lab: an update on PR12-16-001
Energy Technology Data Exchange (ETDEWEB)
Battaglieri, M. [Istituto Nazionale di Fisica Nucleare (INFN), Genova (Italy); et. al.
2017-12-07
This document is an update to the proposal PR12-16-001 Dark matter search in a Beam-Dump eXperiment (BDX) at Jefferson Lab submitted to JLab-PAC44 in 2016 reporting progress in addressing questions raised regarding the beam-on backgrounds. The concerns are addressed by adopting a new simulation tool, FLUKA, and planning measurements of muon fluxes from the dump with its existing shielding around the dump. First, we have implemented the detailed BDX experimental geometry into a FLUKA simulation, in consultation with experts from the JLab Radiation Control Group. The FLUKA simulation has been compared directly to our GEANT4 simulations and shown to agree in regions of validity. The FLUKA interaction package, with a tuned set of biasing weights, is naturally able to generate reliable particle distributions with very small probabilities and therefore predict rates at the detector location beyond the planned shielding around the beam dump. Second, we have developed a plan to conduct measurements of the muon ux from the Hall-A dump in its current configuration to validate our simulations.
Update to the Fissile Materials Disposition program SST/SGT transportation estimation
International Nuclear Information System (INIS)
John Didlake
1999-01-01
This report is an update to ''Fissile Materials Disposition Program SST/SGT Transportation Estimation,'' SAND98-8244, June 1998. The Department of Energy Office of Fissile Materials Disposition requested this update as a basis for providing the public with an updated estimation of the number of transportation loads, load miles, and costs associated with the preferred alternative in the Surplus Plutonium Disposition Final Environmental Impact Statement (EIS)
2010-10-01
... 42 Public Health 5 2010-10-01 2010-10-01 false Approval of the State Medicaid HIT plan, the HIT PAPD and update, the HIT IAPD and update, and the annual HIT IAPD. 495.344 Section 495.344 Public... Requirements Specific to the Medicaid Program § 495.344 Approval of the State Medicaid HIT plan, the HIT PAPD...
The updating of clinical practice guidelines: insights from an international survey
Directory of Open Access Journals (Sweden)
Solà Ivan
2011-09-01
Full Text Available Abstract Background Clinical practice guidelines (CPGs have become increasingly popular, and the methodology to develop guidelines has evolved enormously. However, little attention has been given to the updating process, in contrast to the appraisal of the available literature. We conducted an international survey to identify current practices in CPG updating and explored the need to standardize and improve the methods. Methods We developed a questionnaire (28 items based on a review of the existing literature about guideline updating and expert comments. We carried out the survey between March and July 2009, and it was sent by email to 106 institutions: 69 members of the Guidelines International Network who declared that they developed CPGs; 30 institutions included in the U.S. National Guideline Clearinghouse database that published more than 20 CPGs; and 7 institutions selected by an expert committee. Results Forty-four institutions answered the questionnaire (42% response rate. In the final analysis, 39 completed questionnaires were included. Thirty-six institutions (92% reported that they update their guidelines. Thirty-one institutions (86% have a formal procedure for updating their guidelines, and 19 (53% have a formal procedure for deciding when a guideline becomes out of date. Institutions describe the process as moderately rigorous (36% or acknowledge that it could certainly be more rigorous (36%. Twenty-two institutions (61% alert guideline users on their website when a guideline is older than three to five years or when there is a risk of being outdated. Twenty-five institutions (64% support the concept of "living guidelines," which are continuously monitored and updated. Eighteen institutions (46% have plans to design a protocol to improve their guideline-updating process, and 21 (54% are willing to share resources with other organizations. Conclusions Our study is the first to describe the process of updating CPGs among prominent
MARMOT update for oxide fuel modeling
Energy Technology Data Exchange (ETDEWEB)
Zhang, Yongfeng [Idaho National Lab. (INL), Idaho Falls, ID (United States); Schwen, Daniel [Idaho National Lab. (INL), Idaho Falls, ID (United States); Chakraborty, Pritam [Idaho National Lab. (INL), Idaho Falls, ID (United States); Jiang, Chao [Idaho National Lab. (INL), Idaho Falls, ID (United States); Aagesen, Larry [Idaho National Lab. (INL), Idaho Falls, ID (United States); Ahmed, Karim [Idaho National Lab. (INL), Idaho Falls, ID (United States); Jiang, Wen [Idaho National Lab. (INL), Idaho Falls, ID (United States); Biner, Bulent [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bai, Xianming [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Tonks, Michael [Pennsylvania State Univ., University Park, PA (United States); Millett, Paul [Univ. of Arkansas, Fayetteville, AR (United States)
2016-09-01
This report summarizes the lower-length-scale research and development progresses in FY16 at Idaho National Laboratory in developing mechanistic materials models for oxide fuels, in parallel to the development of the MARMOT code which will be summarized in a separate report. This effort is a critical component of the microstructure based fuel performance modeling approach, supported by the Fuels Product Line in the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. The progresses can be classified into three categories: 1) development of materials models to be used in engineering scale fuel performance modeling regarding the effect of lattice defects on thermal conductivity, 2) development of modeling capabilities for mesoscale fuel behaviors including stage-3 gas release, grain growth, high burn-up structure, fracture and creep, and 3) improved understanding in material science by calculating the anisotropic grain boundary energies in UO$_2$ and obtaining thermodynamic data for solid fission products. Many of these topics are still under active development. They are updated in the report with proper amount of details. For some topics, separate reports are generated in parallel and so stated in the text. The accomplishments have led to better understanding of fuel behaviors and enhance capability of the MOOSE-BISON-MARMOT toolkit.
Argonne Wakefield Accelerator Update '92
International Nuclear Information System (INIS)
Rosing, M.; Balka, L.; Chojnacki, E.; Gai, W.; Ho, C.; Konecny, R.; Power, J.; Schoessow, P.; Simpson, J.
1992-01-01
The Argonne Wakefield Accelerator (AWA) is an experiment designed to test various ideas related to wakefield technology. Construction is now underway for a 100 nC electron beam in December of 1992. This report updates this progress
Update History of This Database - KOME | LSDB Archive [Life Science Database Archive metadata
Lifescience Database Archive (English)
Full Text Available List Contact us KOME Update History of This Database Date Update contents 2014/10/22 The URL of the whole da...site is opened. 2003/07/18 KOME ( http://cdna01.dna.affrc.go.jp/cDNA/ ) is opened. About This Database Dat...abase Description Download License Update History of This Database Site Policy | Contact Us Update History of This Database - KOME | LSDB Archive ...
Update History of This Database - PSCDB | LSDB Archive [Life Science Database Archive metadata
Lifescience Database Archive (English)
Full Text Available List Contact us PSCDB Update History of This Database Date Update contents 2016/11/30 PSCDB English archive ...site is opened. 2011/11/13 PSCDB ( http://idp1.force.cs.is.nagoya-u.ac.jp/pscdb/ ) is opened. About This Database Database... Description Download License Update History of This Database Site Policy | Contact Us Update History of This Database - PSCDB | LSDB Archive ...
Update History of This Database - fRNAdb | LSDB Archive [Life Science Database Archive metadata
Lifescience Database Archive (English)
Full Text Available switchLanguage; BLAST Search Image Search Home About Archive Update History Data ...List Contact us fRNAdb Update History of This Database Date Update contents 2016/03/29 fRNAdb English archiv...on Download License Update History of This Database Site Policy | Contact Us Update History of This Database - fRNAdb | LSDB Archive ... ...e site is opened. 2006/12 fRNAdb ( http://www.ncrna.org/ ) is opened. About This Database Database Descripti
Modernizing the ATLAS Simulation Infrastructure
Di Simone, Andrea; The ATLAS collaboration
2016-01-01
The ATLAS Simulation infrastructure has been used to produce upwards of 50 billion proton-proton collision events for analyses ranging from detailed Standard Model measurements to searches for exotic new phenomena. In the last several years, the infrastructure has been heavily revised to allow intuitive multithreading and significantly improved maintainability. Such a massive update of a legacy code base requires careful choices about what pieces of code to completely rewrite and what to wrap or revise. The initialization of the complex geometry was generalized to allow new tools and geometry description languages, popular in some detector groups. The addition of multithreading requires Geant4 MT and GaudiHive, two frameworks with fundamentally different approaches to multithreading, to work together. It also required enforcing thread safety throughout a large code base, which required the redesign of several aspects of the simulation, including “truth,” the record of particle interactions with the detect...
Landsat-D thematic mapper simulator
Flanagan, G. F.; Tilton, E. L., III
The design and testing program for the airborne Landsat-D thematic-mapper simulator (TMS) is summarized. The TMS is intended to provide data similar enough to those expected from Landsat-D to facilitate the development of data-processing software. The design process comprised mainly modifications on the existing MSS-simulator fiber optics, dichroics, and detectors to provide 7-channel coverage of the 0.45-12.3-micron range at 60-deg angle of view, corresponding to a 418-element, 13.8-km-wide ground swath. The TMS is carried on a Lear 23 aircraft operating at 750 km/h and 12-m altitude and equipped with a 15.2-cm aerial mapping camera and a ground-updated inertial navigational system. Agricultural, forestry, and geological trial applications are reviewed, and some sample results are given. The significant improvements predicted for the Landsat-D thematic mapper (relative to the Landsat MSS) are seen as confirmed, with the possible exception of the 120-m-resolution version of channel 7.
Disruption of the Right Temporoparietal Junction Impairs Probabilistic Belief Updating.
Mengotti, Paola; Dombert, Pascasie L; Fink, Gereon R; Vossel, Simone
2017-05-31
Generating and updating probabilistic models of the environment is a fundamental modus operandi of the human brain. Although crucial for various cognitive functions, the neural mechanisms of these inference processes remain to be elucidated. Here, we show the causal involvement of the right temporoparietal junction (rTPJ) in updating probabilistic beliefs and we provide new insights into the chronometry of the process by combining online transcranial magnetic stimulation (TMS) with computational modeling of behavioral responses. Female and male participants performed a modified location-cueing paradigm, where false information about the percentage of cue validity (%CV) was provided in half of the experimental blocks to prompt updating of prior expectations. Online double-pulse TMS over rTPJ 300 ms (but not 50 ms) after target appearance selectively decreased participants' updating of false prior beliefs concerning %CV, reflected in a decreased learning rate of a Rescorla-Wagner model. Online TMS over rTPJ also impacted on participants' explicit beliefs, causing them to overestimate %CV. These results confirm the involvement of rTPJ in updating of probabilistic beliefs, thereby advancing our understanding of this area's function during cognitive processing. SIGNIFICANCE STATEMENT Contemporary views propose that the brain maintains probabilistic models of the world to minimize surprise about sensory inputs. Here, we provide evidence that the right temporoparietal junction (rTPJ) is causally involved in this process. Because neuroimaging has suggested that rTPJ is implicated in divergent cognitive domains, the demonstration of an involvement in updating internal models provides a novel unifying explanation for these findings. We used computational modeling to characterize how participants change their beliefs after new observations. By interfering with rTPJ activity through online transcranial magnetic stimulation, we showed that participants were less able to update
Updating risk prediction tools: a case study in prostate cancer.
Ankerst, Donna P; Koniarski, Tim; Liang, Yuanyuan; Leach, Robin J; Feng, Ziding; Sanda, Martin G; Partin, Alan W; Chan, Daniel W; Kagan, Jacob; Sokoll, Lori; Wei, John T; Thompson, Ian M
2012-01-01
Online risk prediction tools for common cancers are now easily accessible and widely used by patients and doctors for informed decision-making concerning screening and diagnosis. A practical problem is as cancer research moves forward and new biomarkers and risk factors are discovered, there is a need to update the risk algorithms to include them. Typically, the new markers and risk factors cannot be retrospectively measured on the same study participants used to develop the original prediction tool, necessitating the merging of a separate study of different participants, which may be much smaller in sample size and of a different design. Validation of the updated tool on a third independent data set is warranted before the updated tool can go online. This article reports on the application of Bayes rule for updating risk prediction tools to include a set of biomarkers measured in an external study to the original study used to develop the risk prediction tool. The procedure is illustrated in the context of updating the online Prostate Cancer Prevention Trial Risk Calculator to incorporate the new markers %freePSA and [-2]proPSA measured on an external case-control study performed in Texas, U.S.. Recent state-of-the art methods in validation of risk prediction tools and evaluation of the improvement of updated to original tools are implemented using an external validation set provided by the U.S. Early Detection Research Network. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
LightForce: An Update on Orbital Collision Avoidance Using Photon Pressure
Stupl, Jan; Mason, James; De Vries, Willem; Smith, Craig; Levit, Creon; Marshall, William; Salas, Alberto Guillen; Pertica, Alexander; Olivier, Scot; Ting, Wang
2012-01-01
We present an update on our research on collision avoidance using photon-pressure induced by ground-based lasers. In the past, we have shown the general feasibility of employing small orbit perturbations, induced by photon pressure from ground-based laser illumination, for collision avoidance in space. Possible applications would be protecting space assets from impacts with debris and stabilizing the orbital debris environment. Focusing on collision avoidance rather than de-orbit, the scheme avoids some of the security and liability implications of active debris removal, and requires less sophisticated hardware than laser ablation. In earlier research we concluded that one ground based system consisting of a 10 kW class laser, directed by a 1.5 m telescope with adaptive optics, could avoid a significant fraction of debris-debris collisions in low Earth orbit. This paper describes our recent efforts, which include refining our original analysis, employing higher fidelity simulations and performing experimental tracking tests. We investigate the efficacy of one or more laser ground stations for debris-debris collision avoidance and satellite protection using simulations to investigate multiple case studies. The approach includes modeling of laser beam propagation through the atmosphere, the debris environment (including actual trajectories and physical parameters), laser facility operations, and simulations of the resulting photon pressure. We also present the results of experimental laser debris tracking tests. These tests track potential targets of a first technical demonstration and quantify the achievable tracking performance.
Corey, Stephen; Carnahan, Richard S., Jr.
1990-01-01
A continuing effort to apply rapid prototyping and Artificial Intelligence techniques to problems associated with projected Space Station-era information management systems is examined. In particular, timely updating of the various databases and knowledge structures within the proposed intelligent information management system (IIMS) is critical to support decision making processes. Because of the significantly large amounts of data entering the IIMS on a daily basis, information updates will need to be automatically performed with some systems requiring that data be incorporated and made available to users within a few hours. Meeting these demands depends first, on the design and implementation of information structures that are easily modified and expanded, and second, on the incorporation of intelligent automated update techniques that will allow meaningful information relationships to be established. Potential techniques are studied for developing such an automated update capability and IIMS update requirements are examined in light of results obtained from the IIMS prototyping effort.
Inter-firm Networks, Organizational Learning and Knowledge Updating: An Empirical Study
Zhang, Su-rong; Wang, Wen-ping
In the era of knowledge-based economy which information technology develops rapidly, the rate of knowledge updating has become a critical factor for enterprises to gaining competitive advantage .We build an interactional theoretical model among inter-firm networks, organizational learning and knowledge updating thereby and demonstrate it with empirical study at last. The result shows that inter-firm networks and organizational learning is the source of knowledge updating.
Directory of Open Access Journals (Sweden)
VĂDUVA CECILIA ELENA
2018-02-01
Full Text Available The foundation for future firm development is investment. Agents have a risk aversion requiring higher returns as the risks associated with the project will be greater. The investment decision determines the market firm's affirmation, increasing the market share, dominating the market. Making an investment at a certain point will determine certain cash flows throughout the life of the project, and a residual value can be obtained when it is taken out of service. The flows and payments for the investment project can be more easily tracked if we are proposing a constant update rate. We will be able to analyze, based on various factors, three techniques for determining the discount rate for investment projects: the opportunity cost, the risk-free rate, and a series of risk premiums, the weighted average cost of capital. People without financial training make value judgments for investment projects based on other market opportunities, comparing the returns that any investment offers to other pay options. An investor has a sum of money he wants to make - if he does not invest in a project, he will invest in another, that will bring him a certain amount of money, choosing the most advantageous project by comparison. All projects are characterized by identical risks, and the agents are considered indifferent to the risks. The answer given by financial theory and practice to the disadvantage of rates in the opportunity cost category is the discount rate calculated as a sum of the risk-free rate and a risk premium, defining the risk as a factor whose action may cause a possible decrease in cash of the available flows. Higher objectivity is presented by the opportunity cost update rates of update because it refers to known variables but cannot be perfectly matched to the performance of the investment process.
2006-01-01
As part of the upgrade of telephone services, the CERN switching centre will be updated on Monday 3 July between 8.00 p.m. and 3.00 a.m. Telephone services may be disrupted and possibly even interrupted during this operation.We apologise in advance for any inconvenience this may cause. CERN TELECOM Service
2006-01-01
As part of the upgrade of telephone services, the CERN switching centre will be updated on Monday 3 July between 8.00 p.m. and 3.00 a.m. Telephone services may be disrupted and possibly even interrupted during this operation. We apologise in advance for any inconvenience this may cause. CERN TELECOM Service
Update History of This Database - GenLibi | LSDB Archive [Life Science Database Archive metadata
Lifescience Database Archive (English)
Full Text Available switchLanguage; BLAST Search Image Search Home About Archive Update History Data ...List Contact us GenLibi Update History of This Database Date Update contents 2014/03/25 GenLibi English archi...base Description Download License Update History of This Database Site Policy | Contact Us Update History of This Database - GenLibi | LSDB Archive ... ...ve site is opened. 2007/03/01 GenLibi ( http://gene.biosciencedbc.jp/ ) is opened. About This Database Data
Update History of This Database - TogoTV | LSDB Archive [Life Science Database Archive metadata
Lifescience Database Archive (English)
Full Text Available switchLanguage; BLAST Search Image Search Home About Archive Update History Data ...List Contact us TogoTV Update History of This Database Date Update contents 2017/05/12 TogoTV English archiv...ription Download License Update History of This Database Site Policy | Contact Us Update History of This Database - TogoTV | LSDB Archive ... ...e site is opened. 2007/07/20 TogoTV ( http://togotv.dbcls.jp/ ) is opened. About This Database Database Desc
Update History of This Database - ConfC | LSDB Archive [Life Science Database Archive metadata
Lifescience Database Archive (English)
Full Text Available switchLanguage; BLAST Search Image Search Home About Archive Update History Data ...List Contact us ConfC Update History of This Database Date Update contents 2016/09/20 ConfC English archive ...tion Download License Update History of This Database Site Policy | Contact Us Update History of This Database - ConfC | LSDB Archive ... ...site is opened. 2005/05/01 ConfC (http://mbs.cbrc.jp/ConfC/) is opened. About This Database Database Descrip
Update History of This Database - dbQSNP | LSDB Archive [Life Science Database Archive metadata
Lifescience Database Archive (English)
Full Text Available switchLanguage; BLAST Search Image Search Home About Archive Update History Data ...List Contact us dbQSNP Update History of This Database Date Update contents 2017/02/16 dbQSNP English archiv...e Description Download License Update History of This Database Site Policy | Contact Us Update History of This Database - dbQSNP | LSDB Archive ... ...e site is opened. 2002/10/23 dbQSNP (http://qsnp.gen.kyushu-u.ac.jp/) is opened. About This Database Databas
On preconditioner updates for sequences of saddle-point linear systems
Directory of Open Access Journals (Sweden)
Simone Valentina De
2018-02-01
Full Text Available Updating preconditioners for the solution of sequences of large and sparse saddle- point linear systems via Krylov methods has received increasing attention in the last few years, because it allows to reduce the cost of preconditioning while keeping the efficiency of the overall solution process. This paper provides a short survey of the two approaches proposed in the literature for this problem: updating the factors of a preconditioner available in a block LDLT form, and updating a preconditioner via a limited-memory technique inspired by quasi-Newton methods.
Thread-Level Parallel Indexing of Update Intensive Moving-Object Workloads
DEFF Research Database (Denmark)
Sidlauskas, Darius; Ross, Kenneth A.; Jensen, Christian S.
2011-01-01
as well as contain queries. The non-trivial challenge addressed is that of avoiding contention between long-running queries and frequent updates. Specifically, the paper proposes a grid-based indexing technique. A static grid indexes a near up-to-date snapshot of the data to support queries, while a live......Modern processors consist of multiple cores that each support parallel processing by multiple physical threads, and they offer ample main-memory storage. This paper studies the use of such processors for the processing of update-intensive moving-object workloads that contain very frequent updates...
Neural basis for dynamic updating of object representation in visual working memory.
Takahama, Sachiko; Miyauchi, Satoru; Saiki, Jun
2010-02-15
In real world, objects have multiple features and change dynamically. Thus, object representations must satisfy dynamic updating and feature binding. Previous studies have investigated the neural activity of dynamic updating or feature binding alone, but not both simultaneously. We investigated the neural basis of feature-bound object representation in a dynamically updating situation by conducting a multiple object permanence tracking task, which required observers to simultaneously process both the maintenance and dynamic updating of feature-bound objects. Using an event-related design, we separated activities during memory maintenance and change detection. In the search for regions showing selective activation in dynamic updating of feature-bound objects, we identified a network during memory maintenance that was comprised of the inferior precentral sulcus, superior parietal lobule, and middle frontal gyrus. In the change detection period, various prefrontal regions, including the anterior prefrontal cortex, were activated. In updating object representation of dynamically moving objects, the inferior precentral sulcus closely cooperates with a so-called "frontoparietal network", and subregions of the frontoparietal network can be decomposed into those sensitive to spatial updating and feature binding. The anterior prefrontal cortex identifies changes in object representation by comparing memory and perceptual representations rather than maintaining object representations per se, as previously suggested. Copyright 2009 Elsevier Inc. All rights reserved.
Updating visual memory across eye movements for ocular and arm motor control.
Thompson, Aidan A; Henriques, Denise Y P
2008-11-01
Remembered object locations are stored in an eye-fixed reference frame, so that every time the eyes move, spatial representations must be updated for the arm-motor system to reflect the target's new relative position. To date, studies have not investigated how the brain updates these spatial representations during other types of eye movements, such as smooth-pursuit. Further, it is unclear what information is used in spatial updating. To address these questions we investigated whether remembered locations of pointing targets are updated following smooth-pursuit eye movements, as they are following saccades, and also investigated the role of visual information in estimating eye-movement amplitude for updating spatial memory. Misestimates of eye-movement amplitude were induced when participants visually tracked stimuli presented with a background that moved in either the same or opposite direction of the eye before pointing or looking back to the remembered target location. We found that gaze-dependent pointing errors were similar following saccades and smooth-pursuit and that incongruent background motion did result in a misestimate of eye-movement amplitude. However, the background motion had no effect on spatial updating for pointing, but did when subjects made a return saccade, suggesting that the oculomotor and arm-motor systems may rely on different sources of information for spatial updating.
Upgrade of the Hunterston B AGR operator training simulator
International Nuclear Information System (INIS)
Morrison, J.; Nicol, D.; Hacking, D.
1997-01-01
Nuclear power plant simulators provide a vital tool in the training of operational staff in the statutory procedures and operational requirements of the nuclear industry. Scottish Nuclear, and its predecessor the South of Scotland Electricity Board, recognised the value such facilities offered to safety and efficiency and commissioned the construction of the Hunterston Operator Training Simulator as early as 1980. The simulator is a full scope, total plant, and real time system, with a complete 'as plant' replication of the operator interface, together with extensive instructor and tutorial facilities. Its uses have extended beyond the operator training role into plant engineering post incident analysis, evolving to be an essential feature of the station as a whole. Operation of the simulator for the foreseeable life of the station was the main driving force behind the current simulator update project, and whilst the need to move to a new computing platform, avoiding impending obsolescence problems, was the prime reason, the retention of 17 years of software development was seen as a valuable legacy to preserve. This paper discusses the main criteria considered during the simulator upgrade programme, highlighting the main technical issues and risks involved. (author)
Simulated Performance Evaluation of a Selective Tracker Through Random Scenario Generation
DEFF Research Database (Denmark)
Hussain, Dil Muhammad Akbar
2006-01-01
performance assessment. Therefore, a random target motion scenario is adopted. Its implementation in particular for testing the proposed selective track splitting algorithm using Kalman filters is investigated through a number of performance parameters which gives the activity profile of the tracking scenario...... The paper presents a simulation study on the performance of a target tracker using selective track splitting filter algorithm through a random scenario implemented on a digital signal processor. In a typical track splitting filter all the observation which fall inside a likelihood ellipse...... are used for update, however, in our proposed selective track splitting filter less number of observations are used for track update. Much of the previous performance work [1] has been done on specific (deterministic) scenarios. One of the reasons for considering the specific scenarios, which were...
Cardiovascular Update: Risk, Guidelines, and Recommendations.
Pearson, Tamera
2015-09-01
This article provides an update of the current status of cardiovascular disease (CVD) in the United States, including a brief review of the underlying pathophysiology and epidemiology. This article presents a discussion of the latest American Heart Association guidelines that introduce the concept of promoting ideal cardiovascular health, defined by seven identified metrics. Specific CVD risk factors and utilization of the 10-year CVD event prediction calculator are discussed. In addition, current management recommendations of health-related conditions that increase risk for CVD, such as hypertension and hypercholesterolemia, are provided. Finally, a discussion of detailed evidence-based lifestyle recommendations to promote cardiovascular health and reduce CVD risks concludes the update. © 2015 The Author(s).
Update in clinical allergy and immunology.
von Gunten, S; Marsland, B J; von Garnier, C; Simon, D
2012-12-01
In the recent years, a tremendous body of studies has addressed a broad variety of distinct topics in clinical allergy and immunology. In this update, we discuss selected recent data that provide clinically and pathogenetically relevant insights or identify potential novel targets and strategies for therapy. The role of the microbiome in shaping allergic immune responses and molecular, as well as cellular mechanisms of disease, is discussed separately and in the context of atopic dermatitis, as an allergic model disease. Besides summarizing novel evidence, this update highlights current areas of uncertainties and debates that, as we hope, shall stimulate scientific discussions and research activities in the field. © 2012 John Wiley & Sons A/S.
Modeling and Simulation in the Army Intermediate Level Education Critical Thinking Curriculum
2015-06-12
and bring me back to reality when I become too technical. These characteristics also apply to the MMAS board that guided me in researching this paper...curriculum 1 ProCon.org, “Critical Thinking Quote: Adrienne Rich ,” last updated August 8, 2013...ray-odierno/. 5 embraced visual learning to teach the millennial generation, to include using simulations to enhance understanding of critical
Updating the Nomographical Diagrams for Dimensioning the Beams
Pop Maria T.
2015-01-01
In order to reduce the time period needed for structures design it is strongly recommended to use nomographical diagrams. The base for formation and updating the nomographical diagrams, stands on the charts presented by different technical publications. The updated charts use the same algorithm and calculation elements as the former diagrams in accordance to the latest prescriptions and European standards. The result consists in a chart, having the same properties, similar with the nomogragra...
Federal Geothermal Research Program Update - Fiscal Year 2001
Energy Technology Data Exchange (ETDEWEB)
Laney, P.T.
2002-08-31
This Federal Geothermal Program Research Update reviews the specific objectives, status, and accomplishments of DOE's Geothermal Program for Federal Fiscal Year (FY) 2001. The information contained in this Research Update illustrates how the mission and goals of the Office of Geothermal Technologies are reflected in each R&D activity. The Geothermal Program, from its guiding principles to the most detailed research activities, is focused on expanding the use of geothermal energy.
Update History of This Database - AcEST | LSDB Archive [Life Science Database Archive metadata
Lifescience Database Archive (English)
Full Text Available switchLanguage; BLAST Search Image Search Home About Archive Update History Data ...List Contact us AcEST Update History of This Database Date Update contents 2013/01/10 Errors found on AcEST ...s Database Database Description Download License Update History of This Data...base Site Policy | Contact Us Update History of This Database - AcEST | LSDB Archive ... ...Conting data have been correceted. For details, please refer to the following page. Data correction 2010/03/29 AcEST English archi
Update History of This Database - FANTOM5 | LSDB Archive [Life Science Database Archive metadata
Lifescience Database Archive (English)
Full Text Available CAGE TSS aggregation 」 「 CAGE peaks 」 2015/12/07 FANTOM5 archive site is opened. (Archive V1) 2014/03/27 FANTOM5 ( http://fantom...switchLanguage; BLAST Search Image Search Home About Archive Update History Data List Contact us FANTOM...5 Update History of This Database Date Update contents 2017/03/14 FANTOM5 English arch...escription Download License Update History of This Database Site Policy | Contact Us Update History of This Database - FANTOM5 | LSDB Archive ...
Crucial role of strategy updating for coexistence of strategies in interaction networks
Zhang, Jianlei; Zhang, Chunyan; Cao, Ming; Weissing, Franz J.
2015-04-01
Network models are useful tools for studying the dynamics of social interactions in a structured population. After a round of interactions with the players in their local neighborhood, players update their strategy based on the comparison of their own payoff with the payoff of one of their neighbors. Here we show that the assumptions made on strategy updating are of crucial importance for the strategy dynamics. In the first step, we demonstrate that seemingly small deviations from the standard assumptions on updating have major implications for the evolutionary outcome of two cooperation games: cooperation can more easily persist in a Prisoner's Dilemma game, while it can go more easily extinct in a Snowdrift game. To explain these outcomes, we develop a general model for the updating of states in a network that allows us to derive conditions for the steady-state coexistence of states (or strategies). The analysis reveals that coexistence crucially depends on the number of agents consulted for updating. We conclude that updating rules are as important for evolution on a network as network structure and the nature of the interaction.
2006-01-01
As part of the upgrade of telephone services, the CERN switching centre will be updated on between Monday 23 October 8.00 p.m. and Tuesday 24 October 2.00 a.m. Telephone services may be disrupted and possibly even interrupted during this operation. We apologise in advance for any inconvenience this may cause. CERN TELECOM Service
Q4 2017/Q1 2018 Solar Industry Update
Energy Technology Data Exchange (ETDEWEB)
Feldman, David J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Margolis, Robert M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hoskins, Jack [U.S. Department of Energy
2018-05-16
This technical presentation provides an update on the major trends that occurred in the solar industry in Q4 2017 and Q1 2018. Major topics of focus include global and U.S. supply and demand, module and system price, investment trends and business models, and updates on U.S. government programs supporting the solar industry.
Energy Technology Data Exchange (ETDEWEB)
Georgieva, Emiliya Lyudmilova
2016-06-06
The novel academic contributions are summarized as follows. A) A cross-section modelling methodology and a cycle-specific cross-section update procedure are developed to meet fidelity requirements applicable to a cycle-specific reactor core simulation, as well as particular customer needs and practices supporting VVER-1000 operation and safety. B) A real-time version of the Nodal Expansion Method code is developed and implemented into Kozloduy 6 full-scope replica control room simulator.
Update History of This Database - AT Atlas | LSDB Archive [Life Science Database Archive metadata
Lifescience Database Archive (English)
Full Text Available List Contact us AT Atlas Update History of This Database Date Update contents 2013/12/16 The email address i... ( http://www.tanpaku.org/atatlas/ ) is opened. About This Database Database Description Download License Update History of This Data...base Site Policy | Contact Us Update History of This Database - AT Atlas | LSDB Archive ...
Directory of Open Access Journals (Sweden)
O. H. Otterå
2009-11-01
Full Text Available The Bergen Climate Model (BCM is a fully-coupled atmosphere-ocean-sea-ice model that provides state-of-the-art computer simulations of the Earth's past, present, and future climate. Here, a pre-industrial multi-century simulation with an updated version of BCM is described and compared to observational data. The model is run without any form of flux adjustments and is stable for several centuries. The simulated climate reproduces the general large-scale circulation in the atmosphere reasonably well, except for a positive bias in the high latitude sea level pressure distribution. Also, by introducing an updated turbulence scheme in the atmosphere model a persistent cold bias has been eliminated. For the ocean part, the model drifts in sea surface temperatures and salinities are considerably reduced compared to earlier versions of BCM. Improved conservation properties in the ocean model have contributed to this. Furthermore, by choosing a reference pressure at 2000 m and including thermobaric effects in the ocean model, a more realistic meridional overturning circulation is simulated in the Atlantic Ocean. The simulated sea-ice extent in the Northern Hemisphere is in general agreement with observational data except for summer where the extent is somewhat underestimated. In the Southern Hemisphere, large negative biases are found in the simulated sea-ice extent. This is partly related to problems with the mixed layer parametrization, causing the mixed layer in the Southern Ocean to be too deep, which in turn makes it hard to maintain a realistic sea-ice cover here. However, despite some problematic issues, the pre-industrial control simulation presented here should still be appropriate for climate change studies requiring multi-century simulations.
A Mathematics Software Database Update.
Cunningham, R. S.; Smith, David A.
1987-01-01
Contains an update of an earlier listing of software for mathematics instruction at the college level. Topics are: advanced mathematics, algebra, calculus, differential equations, discrete mathematics, equation solving, general mathematics, geometry, linear and matrix algebra, logic, statistics and probability, and trigonometry. (PK)
Treatability study sample exemption: update
International Nuclear Information System (INIS)
1997-01-01
This document is a RCRA Information Brief intended to update the information in the 1991 Small-Scale Treatability Study Information Brief, and to address questions about the waste and treatability study sample exemptions that have arisen since References 3 and 5 were published
Earth Observing System Covariance Realism Updates
Ojeda Romero, Juan A.; Miguel, Fred
2017-01-01
This presentation will be given at the International Earth Science Constellation Mission Operations Working Group meetings June 13-15, 2017 to discuss the Earth Observing System Covariance Realism updates.
Computer simulation of radiation-induced nanostructure formation in amorphous materials
International Nuclear Information System (INIS)
Li, K.-D.; Perez-Bergquist, Alejandro; Wang, Lumin
2009-01-01
In this study, 3D simulations based on a theoretical model were developed to investigate radiation-induced nanostructure formation in amorphous materials. Model variables include vacancy production and recombination rates, ion sputtering effects, and redeposition of sputtered atoms. In addition, a phase field model was developed to predict vacancy diffusion as a function of free energies of mixing and interfacial energies. The distribution profile of the vacancy production rate along the depth of an irradiated matrix was considered as a near Gaussian approximation according to Monte-Carlo TRIM code calculations. Dynamic processes responsible for nanostructure evolution were simulated by updating the vacancy concentration profile over time. Simulated morphologies include cellular nanoholes, nanowalls, nanovoids, and nanofibers, with the resultant morphology dependant upon the incident ion species and ion fluence. These simulated morphologies are consistent with experimental observations achieved under comparable experimental conditions. Our model provides a distinct numerical approach to accurately predicting morphological results for ion-irradiation-induced nanostructures.
USAR managing and updating process
International Nuclear Information System (INIS)
Prah, M.; Spiler, J.
1996-01-01
In this paper basis and background of the FSAR (Final Safety Analysis Report) document and its conversion process to the USAR (Updated Safety Analysis Report) document are described. In addition, there are internal and external reviews as approval process presented. The following is included in our new approach to manage USAR changes: initiating the USAR change, technical reviewing, preparing a safety evaluation, KSC (Krsko Safety Committee) and KOC (Krsko Operating Committee) review, ESD Director approval, and the Regulatory Body review or approval. The intensive technological modification activities started in the year 1992 when the NEK Engineering Services Division was established. These activities are one of the most important reason for a very intensive USAR items change. The other reason for its conversation to an electronic format is a possibility for easier and faster searching, updating and changing process and introducing a new systematic USAR managing approach as mentioned above. (author)
Which Updates During an Equity Crowdfunding Campaign Increase Crowd Participation?
J.H. Block (Jörn); L. Hornuf (Lars); A. Moritz (Alexandra)
2016-01-01
textabstractStart-ups often post updates during equity crowdfunding campaigns. Yet, little is known about the effects of such updates on funding success. We investigate this question using hand-collected data from 71 funding campaigns on two German equity crowdfunding portals. Using a combination of
77 FR 4034 - Annual Update of the HHS Poverty Guidelines
2012-01-26
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Office of the Secretary Annual Update of the HHS Poverty... update of the Department of Health and Human Services (HHS) poverty guidelines to account for last... program. For information about poverty figures for immigration forms, the Hill-Burton Uncompensated...
76 FR 3637 - Annual Update of the HHS Poverty Guidelines
2011-01-20
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Office of the Secretary Annual Update of the HHS Poverty... update of the Department of Health and Human Services (HHS) poverty guidelines to account for last... program. For information about poverty figures for immigration forms, the Hill-Burton Uncompensated...
78 FR 5182 - Annual Update of the HHS Poverty Guidelines
2013-01-24
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Office of the Secretary Annual Update of the HHS Poverty... update of the Department of Health and Human Services (HHS) poverty guidelines to account for last... program. For information about poverty figures for immigration forms, the Hill-Burton Uncompensated...
Which updates during an equity crowdfunding campaign increase crowd participation?
Block, J. (Jörn); Hornuf, L. (Lars); Moritz, A. (Alexandra)
2017-01-01
textabstractStart-ups often post updates during equity crowdfunding campaigns. However, little is known about the effects of such updates on crowd participation. We investigate this question by using hand-collected data from 71 funding campaigns and 39,399 investment decisions on two German equity
Impaired Working Memory Updating for Emotional Stimuli in Depressed Patients.
Zhang, Dandan; Xie, Hui; He, Zhenhong; Wei, Zhaoguo; Gu, Ruolei
2018-01-01
Although two previous studies have demonstrated that depressed individuals showed deficits in working memory (WM) updating of both negative and positive contents, the effects were confounded by shifting dysfunctions and the detailed neural mechanism associated with the failure in N-back task is not clear. Using a 2-back task, the current study examined the WM updating of positive, negative and neutral contents in depressed patients. It is found that depressed patients performed poorer than healthy controls only when updating positive material. Using event-related potential (ERP) technique, the current study also investigated the neural correlates of updating deficits in depression. According to previous studies, the n-back task was divided into three sub-processes, i.e., encoding, matching and maintaining. Our ERP results showed that depressed patients had smaller occipital P1 for positive material compared to healthy controls, indicating their insensitivity to positive items on early encoding stage. Besides, depressed patients had larger frontal P2 and parietal late positive potential (LPP) than healthy controls irrespective of the valence of the words, reflecting that patients are inefficient during matching (P2) and maintaining (LPP) processes. These two mechanisms (insufficient attention to positive stimuli and low efficiency in matching and maintaining) together lead to the deficits of WM updating in depression.
Update History of This Database - KAIKOcDNA | LSDB Archive [Life Science Database Archive metadata
Lifescience Database Archive (English)
Full Text Available switchLanguage; BLAST Search Image Search Home About Archive Update History Data ...List Contact us KAIKOcDNA Update History of This Database Date Update contents 2014/10/20 The URL of the dat... database ( http://sgp.dna.affrc.go.jp/EST/ ) is opened. About This Database Database Description Download License Update Hi...story of This Database Site Policy | Contact Us Update History of This Database - KAIKOcDNA | LSDB Archive ... ...abase maintenance site is changed. 2014/10/08 KAIKOcDNA English archive site is opened. 2004/04/12 KAIKOcDNA
Update History of This Database - TP Atlas | LSDB Archive [Life Science Database Archive metadata
Lifescience Database Archive (English)
Full Text Available switchLanguage; BLAST Search Image Search Home About Archive Update History Data ...List Contact us TP Atlas Update History of This Database Date Update contents 2013/12/16 The email address i...s ( http://www.tanpaku.org/tpatlas/ ) is opened. About This Database Database Description Download License Update History of Thi...s Database Site Policy | Contact Us Update History of This Database - TP Atlas | LSDB Archive ... ...n the contact information is corrected. 2013/11/19 TP Atlas English archive site is opened. 2008/4/1 TP Atla
Lifescience Database Archive (English)
Full Text Available glish archive site is opened. 2010/10/01 KEGG MEDICUS ( http://www.kegg.jp/kegg/medicus/ ) is opened. About ...[ Credits ] English ]; } else if ( url.search(//en//) != -1 ) { url = url.replace(/...switchLanguage; BLAST Search Image Search Home About Archive Update History Data List Contact us KEGG MEDI...CUS Update History of This Database Date Update contents 2014/05/09 KEGG MEDICUS En...This Database Database Description Download License Update History of This Database Site Policy | Contact Us Update History of This Database - KEGG MEDICUS | LSDB Archive ...
Egocentric-updating during navigation facilitates episodic memory retrieval.
Gomez, Alice; Rousset, Stéphane; Baciu, Monica
2009-11-01
Influential models suggest that spatial processing is essential for episodic memory [O'Keefe, J., & Nadel, L. (1978). The hippocampus as a cognitive map. London: Oxford University Press]. However, although several types of spatial relations exist, such as allocentric (i.e. object-to-object relations), egocentric (i.e. static object-to-self relations) or egocentric updated on navigation information (i.e. self-to-environment relations in a dynamic way), usually only allocentric representations are described as potentially subserving episodic memory [Nadel, L., & Moscovitch, M. (1998). Hippocampal contributions to cortical plasticity. Neuropharmacology, 37(4-5), 431-439]. This study proposes to confront the allocentric representation hypothesis with an egocentric updated with self-motion representation hypothesis. In the present study, we explored retrieval performance in relation to these two types of spatial processing levels during learning. Episodic remembering has been assessed through Remember responses in a recall and in a recognition task, combined with a "Remember-Know-Guess" paradigm [Gardiner, J. M. (2001). Episodic memory and autonoetic consciousness: A first-person approach. Philosophical Transactions of the Royal Society B: Biological Sciences, 356(1413), 1351-1361] to assess the autonoetic level of responses. Our results show that retrieval performance was significantly higher when encoding was performed in the egocentric-updated condition. Although egocentric updated with self-motion and allocentric representations are not mutually exclusive, these results suggest that egocentric updating processing facilitates remember responses more than allocentric processing. The results are discussed according to Burgess and colleagues' model of episodic memory [Burgess, N., Becker, S., King, J. A., & O'Keefe, J. (2001). Memory for events and their spatial context: models and experiments. Philosophical Transactions of the Royal Society of London. Series B
Summary Analysis: Hanford Site Composite Analysis Update
Energy Technology Data Exchange (ETDEWEB)
Nichols, W. E. [CH2M HILL Plateau Remediation Company, Richland, WA (United States); Lehman, L. L. [CH2M HILL Plateau Remediation Company, Richland, WA (United States)
2017-06-05
The Hanford Site’s currently maintained Composite Analysis, originally completed in 1998, requires an update. A previous update effort was undertaken by the U.S. Department of Energy (DOE) in 2001-2005, but was ended before completion to allow the Tank Closure & Waste Management Environmental Impact Statement (TC&WM EIS) (DOE/EIS-0391) to be prepared without potential for conflicting sitewide models. This EIS was issued in 2012, and the deferral was ended with guidance in memorandum “Modeling to Support Regulatory Decision Making at Hanford” (Williams, 2012) provided with the aim of ensuring subsequent modeling is consistent with the EIS.
Göschl, Daniel
2018-03-01
We discuss simulation strategies for the massless lattice Schwinger model with a topological term and finite chemical potential. The simulation is done in a dual representation where the complex action problem is solved and the partition function is a sum over fermion loops, fermion dimers and plaquette-occupation numbers. We explore strategies to update the fermion loops coupled to the gauge degrees of freedom and check our results with conventional simulations (without topological term and at zero chemical potential), as well as with exact summation on small volumes. Some physical implications of the results are discussed.
Q4 2016/Q1 2017 Solar Industry Update
Energy Technology Data Exchange (ETDEWEB)
Margolis, Robert; Feldman, David; Boff, Daniel
2017-05-17
This technical presentation provides an update on the major trends that occurred in the solar industry in the fourth quarter of 2016 and the first quarter of 2017. Major topics of focus include global and U.S. supply and demand, module and system price, investment trends and business models, and updates on U.S. government programs supporting the solar industry.
Aqua/Aura Updated Inclination Adjust Maneuver Performance Prediction Model
Boone, Spencer
2017-01-01
This presentation will discuss the updated Inclination Adjust Maneuver (IAM) performance prediction model that was developed for Aqua and Aura following the 2017 IAM series. This updated model uses statistical regression methods to identify potential long-term trends in maneuver parameters, yielding improved predictions when re-planning past maneuvers. The presentation has been reviewed and approved by Eric Moyer, ESMO Deputy Project Manager.
Veterans and agent orange: update 2000
National Research Council Canada - National Science Library
Committee to Review the Health Effects in Vietnam Veterans of Exposure to Herbicides (Third Biennial Update), Division of Health Promotion and Disease Prevention
2001-01-01
Veterans and Agent Orange: Update 2000 examines the state of the scientific evidence regarding associations between diseases and exposure to dioxin and other chemical compounds in herbicides used in Vietnam...
Update of the FANTOM web resource
DEFF Research Database (Denmark)
Lizio, Marina; Harshbarger, Jayson; Abugessaisa, Imad
2017-01-01
Upon the first publication of the fifth iteration of the Functional Annotation of Mammalian Genomes collaborative project, FANTOM5, we gathered a series of primary data and database systems into the FANTOM web resource (http://fantom.gsc.riken.jp) to facilitate researchers to explore...... transcriptional regulation and cellular states. In the course of the collaboration, primary data and analysis results have been expanded, and functionalities of the database systems enhanced. We believe that our data and web systems are invaluable resources, and we think the scientific community will benefit...... for this recent update to deepen their understanding of mammalian cellular organization. We introduce the contents of FANTOM5 here, report recent updates in the web resource and provide future perspectives....
29 CFR 4281.43 - Notices of insolvency and annual updates.
2010-07-01
... 29 Labor 9 2010-07-01 2010-07-01 false Notices of insolvency and annual updates. 4281.43 Section 4281.43 Labor Regulations Relating to Labor (Continued) PENSION BENEFIT GUARANTY CORPORATION INSOLVENCY... MASS WITHDRAWAL Benefit Suspensions § 4281.43 Notices of insolvency and annual updates. (a) Requirement...
Comprehensive Thematic T-matrix Reference Database: a 2013-2014 Update
Mishchenko, Michael I.; Zakharova, Nadezhda T.; Khlebtsov, Nikolai G.; Wriedt, Thomas; Videen, Gorden
2014-01-01
This paper is the sixth update to the comprehensive thematic database of peer-reviewedT-matrix publications initiated by us in 2004 and includes relevant publications that have appeared since 2013. It also lists several earlier publications not incorporated in the original database and previous updates.
CANDU 9 nuclear power plant simulator
International Nuclear Information System (INIS)
Kattan, M.; MacBeth, M.J.; Lam, K.
1995-01-01
Simulators are playing, an important role in the design and operations of CANDU reactors. They are used to analyze operating procedures under standard and upset conditions. The CANDU 9 nuclear power plant simulator is a low fidelity, near full scope capability simulator. It is designed to play an integral part in the design and verification of the control centre mock-up located in the AECL design office. It will also provide CANDU plant process dynamic data to the plant display system (PDS), distributed control system (DCS) and to the mock-up panel devices. The simulator model employs dynamic mathematical models of the various process and control components that make up a nuclear power plant. It provides the flexibility to add, remove or update user supplied component models. A block oriented process input is provided with the simulator. Individual blocks which represent independent algorithms of the model are linked together to generate the required overall plant model. As a design tool the simulator will be used for control strategy development, human factors studies (information access, readability, graphical display design, operability), analysis of overall plant control performance, tuning estimates for major control loops and commissioning strategy development. As a design evaluation tool, the simulator will be used to perform routine and non-routine procedures, practice 'what if' scenarios for operational strategy development, practice malfunction recovery procedures and verify human factors activities. This paper will describe the CANDU 9 plant simulator and demonstrate its implementation and proposed utility as a tool in the control system and control centre design of a CANDU 9 nuclear power plant. (author). 2 figs
2010-07-01
... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CHEMICAL ACCIDENT PREVENTION PROVISIONS Risk Management Plan § 68.190 Updates. (a) The owner or operator shall... later than the date on which a new regulated substance is first present in an already covered process...
Upgrade trigger: Biannual performance update
Aaij, Roel; Couturier, Ben; Esen, Sevda; De Cian, Michel; De Vries, Jacco Andreas; Dziurda, Agnieszka; Fitzpatrick, Conor; Fontana, Marianna; Grillo, Lucia; Hasse, Christoph; Jones, Christopher Rob; Le Gac, Renaud; Matev, Rosen; Neufeld, Niko; Nikodem, Thomas; Polci, Francesco; Del Buono, Luigi; Quagliani, Renato; Schwemmer, Rainer; Seyfert, Paul; Stahl, Sascha; Szumlak, Tomasz; Vesterinen, Mika Anton; Wanczyk, Joanna; Williams, Mark Richard James; Yin, Hang; Zacharjasz, Emilia Anna
2017-01-01
This document presents the performance of the LHCb Upgrade trigger reconstruction sequence, incorporating changes to the underlying reconstruction algorithms and detector description since the Trigger and Online Upgrade TDR. An updated extrapolation is presented using the most recent example of an Event Filter Farm node.
Gabus, Vincent; Tran, Van Nam; Regamey, Julien; Pascale, Patrizio; Monney, Pierre; Hullin, Roger; Vogt, Pierre
2017-01-11
In 2016 the European Society of Cardiology (ESC) published new guidelines. These documents update the knowledge in various fields such as atrial fibrillation, heart failure, cardiovascular prevention and dyslipidemia. Of course it is impossible to summarize these guidelines in detail. Nevertheless, we decided to highlight the major modifications, and to emphasize some key points that are especially useful for the primary care physician.
Optimal update with multiple out-of-sequence measurements
Zhang, Shuo; Bar-Shalom, Yaakov
2011-06-01
In multisensor target tracking systems receiving out-of-sequence measurements from local sensors is a common situation. In the last decade many algorithms have been proposed to update a target state with an OOSM optimally or suboptimally. However, what one faces in the real world is multiple OOSMs, which arrive at the fusion center in, generally, arbitrary orders, e.g., in succession or interleaved with in-sequence measurements. A straightforward approach to deal with this multi-OOSM problem is by sequentially applying a given OOSM algorithm; however, this simple solution does not guarantee optimal update under the multi-OOSM scenario. The present paper discusses the differences between the single-OOSM processing and the multi-OOSM processing, and presents the general solution to the multi-OOSM problem, called the complete in-sequence information (CISI) approach. Given an OOSM, in addition to updating the target state at the most recent time, the CISI approach also updates the states between the OOSM time and the most recent time, including the state at the OOSM time. Three novel CISI methods are developed in this paper: the information filter-equivalent measurement (IF-EqM) method, the CISI fixed-point smoothing (CISI-FPS) method and the CISI fixed-interval smoothing (CISI-FIS) method. Numerical examples are given to show the optimality of these CISI methods under various multi-OOSM scenarios.
Simulation studies of macroparticles falling into the LHC Proton Beam
Fuster Martinez, N; Zimmermann, F; Baer, T; Giovannozzi, M; Holzer, E B; Nebot Del Busto, E; Nordt, A; Sapinski, M; Yang, Z
2011-01-01
We report updated simulations on the interaction of macroparticles falling from the top of the vacuum chamber into the circulating LHC proton beam. The path and charge state of micron size micro-particles are computed together with the resulting beam losses, which — if high enough — can lead to the local quench of superconducting (SC) magnets. The simulated time evolution of the beam loss is compared with observations in order to constrain some macroparticle parameters. We also discuss the possibility of a “multiple crossing” by the same macroparticle, the effect of a strong dipole field, and the dependence of peak loss rate and loss duration on beam current and on beam size.
Prediction-error variance in Bayesian model updating: a comparative study
Asadollahi, Parisa; Li, Jian; Huang, Yong
2017-04-01
In Bayesian model updating, the likelihood function is commonly formulated by stochastic embedding in which the maximum information entropy probability model of prediction error variances plays an important role and it is Gaussian distribution subject to the first two moments as constraints. The selection of prediction error variances can be formulated as a model class selection problem, which automatically involves a trade-off between the average data-fit of the model class and the information it extracts from the data. Therefore, it is critical for the robustness in the updating of the structural model especially in the presence of modeling errors. To date, three ways of considering prediction error variances have been seem in the literature: 1) setting constant values empirically, 2) estimating them based on the goodness-of-fit of the measured data, and 3) updating them as uncertain parameters by applying Bayes' Theorem at the model class level. In this paper, the effect of different strategies to deal with the prediction error variances on the model updating performance is investigated explicitly. A six-story shear building model with six uncertain stiffness parameters is employed as an illustrative example. Transitional Markov Chain Monte Carlo is used to draw samples of the posterior probability density function of the structure model parameters as well as the uncertain prediction variances. The different levels of modeling uncertainty and complexity are modeled through three FE models, including a true model, a model with more complexity, and a model with modeling error. Bayesian updating is performed for the three FE models considering the three aforementioned treatments of the prediction error variances. The effect of number of measurements on the model updating performance is also examined in the study. The results are compared based on model class assessment and indicate that updating the prediction error variances as uncertain parameters at the model
Citation Discovery Tools for Conducting Adaptive Meta-analyses to Update Systematic Reviews.
Bae, Jong-Myon; Kim, Eun Hee
2016-03-01
The systematic review (SR) is a research methodology that aims to synthesize related evidence. Updating previously conducted SRs is necessary when new evidence has been produced, but no consensus has yet emerged on the appropriate update methodology. The authors have developed a new SR update method called 'adaptive meta-analysis' (AMA) using the 'cited by', 'similar articles', and 'related articles' citation discovery tools in the PubMed and Scopus databases. This study evaluates the usefulness of these citation discovery tools for updating SRs. Lists were constructed by applying the citation discovery tools in the two databases to the articles analyzed by a published SR. The degree of overlap between the lists and distribution of excluded results were evaluated. The articles ultimately selected for the SR update meta-analysis were found in the lists obtained from the 'cited by' and 'similar' tools in PubMed. Most of the selected articles appeared in both the 'cited by' lists in Scopus and PubMed. The Scopus 'related' tool did not identify the appropriate articles. The AMA, which involves using both citation discovery tools in PubMed, and optionally, the 'related' tool in Scopus, was found to be useful for updating an SR.
Updates to building-code maps for the 2015 NEHRP recommended seismic provisions
Luco, Nicolas; Bachman, Robert; Crouse, C.B; Harris, James R.; Hooper, John D.; Kircher, Charles A.; Caldwell, Phillp; Rukstales, Kenneth S.
2015-01-01
With the 2014 update of the U.S. Geological Survey (USGS) National Seismic Hazard Model (NSHM) as a basis, the Building Seismic Safety Council (BSSC) has updated the earthquake ground motion maps in the National Earthquake Hazards Reduction Program (NEHRP) Recommended Seismic Provisions for New Buildings and Other Structures, with partial funding from the Federal Emergency Management Agency. Anticipated adoption of the updated maps into the American Society of Civil Engineers Minimum Design Loads for Building and Other Structures and the International Building and Residential Codes is underway. Relative to the ground motions in the prior edition of each of these documents, most of the updated values are within a ±20% change. The larger changes are, in most cases, due to the USGS NSHM updates, reasons for which are given in companion publications. In some cases, the larger changes are partly due to a BSSC update of the slope of the fragility curve that is used to calculate the risk-targeted ground motions, and/or the introduction by BSSC of a quantitative definition of “active faults” used to calculate deterministic ground motions.
A New Multiscale Technique for Time-Accurate Geophysics Simulations
Omelchenko, Y. A.; Karimabadi, H.
2006-12-01
Large-scale geophysics systems are frequently described by multiscale reactive flow models (e.g., wildfire and climate models, multiphase flows in porous rocks, etc.). Accurate and robust simulations of such systems by traditional time-stepping techniques face a formidable computational challenge. Explicit time integration suffers from global (CFL and accuracy) timestep restrictions due to inhomogeneous convective and diffusion processes, as well as closely coupled physical and chemical reactions. Application of adaptive mesh refinement (AMR) to such systems may not be always sufficient since its success critically depends on a careful choice of domain refinement strategy. On the other hand, implicit and timestep-splitting integrations may result in a considerable loss of accuracy when fast transients in the solution become important. To address this issue, we developed an alternative explicit approach to time-accurate integration of such systems: Discrete-Event Simulation (DES). DES enables asynchronous computation by automatically adjusting the CPU resources in accordance with local timescales. This is done by encapsulating flux- conservative updates of numerical variables in the form of events, whose execution and synchronization is explicitly controlled by imposing accuracy and causality constraints. As a result, at each time step DES self- adaptively updates only a fraction of the global system state, which eliminates unnecessary computation of inactive elements. DES can be naturally combined with various mesh generation techniques. The event-driven paradigm results in robust and fast simulation codes, which can be efficiently parallelized via a new preemptive event processing (PEP) technique. We discuss applications of this novel technology to time-dependent diffusion-advection-reaction and CFD models representative of various geophysics applications.
National Solar Radiation Database 1991-2005 Update: User's Manual
Energy Technology Data Exchange (ETDEWEB)
Wilcox, S.
2007-04-01
This manual describes how to obtain and interpret the data products from the updated 1991-2005 National Solar Radiation Database (NSRDB). This is an update of the original 1961-1990 NSRDB released in 1992.
Simulation of Axial Combustion Instability Development and Suppression in Solid Rocket Motors
David R. Greatrix
2009-01-01
In the design of solid-propellant rocket motors, the ability to understand and predict the expected behaviour of a given motor under unsteady conditions is important. Research towards predicting, quantifying, and ultimately suppressing undesirable strong transient axial combustion instability symptoms necessitates a comprehensive numerical model for internal ballistic simulation under dynamic flow and combustion conditions. An updated numerical model incorporating recent developments in predi...
Update History of This Database - Q-TARO | LSDB Archive [Life Science Database Archive metadata
Lifescience Database Archive (English)
Full Text Available switchLanguage; BLAST Search Image Search Home About Archive Update History Data ...List Contact us Q-TARO Update History of This Database Date Update contents 2014/10/20 The URL of the portal...ption Download License Update History of This Database Site Policy | Contact Us Update History of This Database - Q-TARO | LSDB Archive ... ... site is changed. 2013/12/17 The URL of the portal site is changed. 2013/12/13 Q-TARO English archive site i...s opened. 2009/11/15 Q-TARO ( http://qtaro.abr.affrc.go.jp/ ) is opened. About This Database Database Descri
UPDATING NATIONAL TOPOGRAPHIC DATA BASE USING CHANGE DETECTION METHODS
Directory of Open Access Journals (Sweden)
E. Keinan
2016-06-01
Full Text Available The traditional method for updating a topographic database on a national scale is a complex process that requires human resources, time and the development of specialized procedures. In many National Mapping and Cadaster Agencies (NMCA, the updating cycle takes a few years. Today, the reality is dynamic and the changes occur every day, therefore, the users expect that the existing database will portray the current reality. Global mapping projects which are based on community volunteers, such as OSM, update their database every day based on crowdsourcing. In order to fulfil user's requirements for rapid updating, a new methodology that maps major interest areas while preserving associated decoding information, should be developed. Until recently, automated processes did not yield satisfactory results, and a typically process included comparing images from different periods. The success rates in identifying the objects were low, and most were accompanied by a high percentage of false alarms. As a result, the automatic process required significant editorial work that made it uneconomical. In the recent years, the development of technologies in mapping, advancement in image processing algorithms and computer vision, together with the development of digital aerial cameras with NIR band and Very High Resolution satellites, allow the implementation of a cost effective automated process. The automatic process is based on high-resolution Digital Surface Model analysis, Multi Spectral (MS classification, MS segmentation, object analysis and shape forming algorithms. This article reviews the results of a novel change detection methodology as a first step for updating NTDB in the Survey of Israel.
Updating National Topographic Data Base Using Change Detection Methods
Keinan, E.; Felus, Y. A.; Tal, Y.; Zilberstien, O.; Elihai, Y.
2016-06-01
The traditional method for updating a topographic database on a national scale is a complex process that requires human resources, time and the development of specialized procedures. In many National Mapping and Cadaster Agencies (NMCA), the updating cycle takes a few years. Today, the reality is dynamic and the changes occur every day, therefore, the users expect that the existing database will portray the current reality. Global mapping projects which are based on community volunteers, such as OSM, update their database every day based on crowdsourcing. In order to fulfil user's requirements for rapid updating, a new methodology that maps major interest areas while preserving associated decoding information, should be developed. Until recently, automated processes did not yield satisfactory results, and a typically process included comparing images from different periods. The success rates in identifying the objects were low, and most were accompanied by a high percentage of false alarms. As a result, the automatic process required significant editorial work that made it uneconomical. In the recent years, the development of technologies in mapping, advancement in image processing algorithms and computer vision, together with the development of digital aerial cameras with NIR band and Very High Resolution satellites, allow the implementation of a cost effective automated process. The automatic process is based on high-resolution Digital Surface Model analysis, Multi Spectral (MS) classification, MS segmentation, object analysis and shape forming algorithms. This article reviews the results of a novel change detection methodology as a first step for updating NTDB in the Survey of Israel.
Environmental sciences division: Environmental regulatory update table July 1988
International Nuclear Information System (INIS)
Langston, M.E.; Nikbakht, A.; Salk, M.S.
1988-08-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action
Environmental Regulatory Update Table, March/April 1993. Revision 1
Energy Technology Data Exchange (ETDEWEB)
Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.; Danford, G.S.; Lewis, E.B.
1993-05-01
The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated bimonthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.
Belief update as social choice
van Benthem, J.; Girard, P.; Roy, O.; Marion, M.
2011-01-01
Dynamic epistemic-doxastic logics describe the new knowledge or new beliefs indexBelief of agents after some informational event has happened. Technically, this requires an update rule that turns a doxastic-epistemic modelM(recording the current information state of the agents) and a dynamic ‘event
Do Facebook Status Updates Reflect Subjective Well-Being?
Liu, Pan; Tov, William; Kosinski, Michal; Stillwell, David J; Qiu, Lin
2015-07-01
Nowadays, millions of people around the world use social networking sites to express everyday thoughts and feelings. Many researchers have tried to make use of social media to study users' online behaviors and psychological states. However, previous studies show mixed results about whether self-generated contents on Facebook reflect users' subjective well-being (SWB). This study analyzed Facebook status updates to determine the extent to which users' emotional expression predicted their SWB-specifically their self-reported satisfaction with life. It was found that positive emotional expressions on Facebook did not correlate with life satisfaction, whereas negative emotional expressions within the past 9-10 months (but not beyond) were significantly related to life satisfaction. These findings suggest that both the type of emotional expressions and the time frame of status updates determine whether emotional expressions in Facebook status updates can effectively reflect users' SWB. The findings shed light on the characteristics of online social media and improve the understanding of how user-generated contents reflect users' psychological states.
Update of GRASP/Ada reverse engineering tools for Ada
Cross, James H., II
1993-01-01
The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) successfully created and prototyped a new algorithmic level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis was on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional pretty printed Ada source code. In Phase 1 of the GRASP/Ada project, the CSD graphical constructs were created and applied manually to several small Ada programs. A prototype CSD generator (Version 1) was designed and implemented using FLEX and BISON running under VMS on a VAX 11-780. In Phase 2, the prototype was improved and ported to the Sun 4 platform under UNIX. A user interface was designed and partially implemented using the HP widget toolkit and the X Windows System. In Phase 3, the user interface was extensively reworked using the Athena widget toolkit and X Windows. The prototype was applied successfully to numerous Ada programs ranging in size from several hundred to several thousand lines of source code. Following Phase 3,e two update phases were completed. Update'92 focused on the initial analysis of evaluation data collected from software engineering students at Auburn University and the addition of significant enhancements to the user interface. Update'93 (the current update) focused on the statistical analysis of the data collected in the previous update and preparation of Version 3.4 of the prototype for limited distribution to facilitate further evaluation. The current prototype provides the capability for the user to generate CSD's from Ada PDL or source code in a reverse engineering as well as forward engineering mode with a level of flexibility suitable for practical
A general framework for updating belief distributions.
Bissiri, P G; Holmes, C C; Walker, S G
2016-11-01
We propose a framework for general Bayesian inference. We argue that a valid update of a prior belief distribution to a posterior can be made for parameters which are connected to observations through a loss function rather than the traditional likelihood function, which is recovered as a special case. Modern application areas make it increasingly challenging for Bayesians to attempt to model the true data-generating mechanism. For instance, when the object of interest is low dimensional, such as a mean or median, it is cumbersome to have to achieve this via a complete model for the whole data distribution. More importantly, there are settings where the parameter of interest does not directly index a family of density functions and thus the Bayesian approach to learning about such parameters is currently regarded as problematic. Our framework uses loss functions to connect information in the data to functionals of interest. The updating of beliefs then follows from a decision theoretic approach involving cumulative loss functions. Importantly, the procedure coincides with Bayesian updating when a true likelihood is known yet provides coherent subjective inference in much more general settings. Connections to other inference frameworks are highlighted.
Eastern US seismic hazard characterization update
International Nuclear Information System (INIS)
Savy, J.B.; Boissonnade, A.C.; Mensing, R.W.; Short, C.M.
1993-06-01
In January 1989, LLNL published the results of a multi-year project, funded by NRC, on estimating seismic hazard at nuclear plant sites east of the Rockies. The goal of this study was twofold: to develop a good central estimate (median) of the seismic hazard and to characterize the uncertainty in the estimates of this hazard. In 1989, LLNL was asked by DOE to develop site specific estimates of the seismic hazard at the Savannah River Site (SRS) in South Carolina as part of the New Production Reactor (NPR) project. For the purpose of the NPR, a complete review of the methodology and of the data acquisition process was performed. Work done under the NPR project has shown that first order improvement in the estimates of the uncertainty (i.e., lower mean hazard values) could be easily achieved by updating the modeling of the seismicity and ground motion attenuation uncertainty. To this effect, NRC sponsored LLNL to perform a reelicitation to update the seismicity and ground motion experts' inputs and to revise methods to combine seismicity and ground motion inputs in the seismic hazard analysis for nuclear power plant sites east of the Rocky Mountains. The objective of the recent study was to include the first order improvements that reflect the latest knowledge in seismicity and ground motion modeling and produce an update of all the hazard results produced in the 1989 study. In particular, it had been demonstrated that eliciting seismicity information in terms of rates of earthquakes rather than a- and b-values, and changing the elicitation format to a one-on-one interview, improved our ability to express the uncertainty of earthquake rates of occurrence at large magnitudes. Thus, NRC sponsored this update study to refine the model of uncertainty, and to re-elicitate of the experts' interpretations of the zonation and seismicity, as well as to reelicitate the ground motion models, based on current state of knowledge
Parvaneh, Z.; Liao, F.; Arentze, T.A.; Timmermans, H.J.P.; Shakshuki, Elhadi; Yasar, Ansar
2014-01-01
This study introduces a model of individual belief updating of subjective travel times as a function of the provision of different types of travel information. Travel information includes real-time prescriptive or descriptive, and public or personal information. The model is embedded in a
Finite element model updating in structural dynamics using design sensitivity and optimisation
Calvi, Adriano
1998-01-01
Model updating is an important issue in engineering. In fact a well-correlated model provides for accurate evaluation of the structure loads and responses. The main objectives of the study were to exploit available optimisation programs to create an error localisation and updating procedure of nite element models that minimises the "error" between experimental and analytical modal data, addressing in particular the updating of large scale nite element models with se...
Simulation and the Monte Carlo method
Rubinstein, Reuven Y
2016-01-01
Simulation and the Monte Carlo Method, Third Edition reflects the latest developments in the field and presents a fully updated and comprehensive account of the major topics that have emerged in Monte Carlo simulation since the publication of the classic First Edition over more than a quarter of a century ago. While maintaining its accessible and intuitive approach, this revised edition features a wealth of up-to-date information that facilitates a deeper understanding of problem solving across a wide array of subject areas, such as engineering, statistics, computer science, mathematics, and the physical and life sciences. The book begins with a modernized introduction that addresses the basic concepts of probability, Markov processes, and convex optimization. Subsequent chapters discuss the dramatic changes that have occurred in the field of the Monte Carlo method, with coverage of many modern topics including: Markov Chain Monte Carlo, variance reduction techniques such as the transform likelihood ratio...
AAP Updates Recommendations on Car Seats
... Size Email Print Share AAP Updates Recommendations on Car Seats Page Content Article Body Children should ride ... of approved car safety seats. Healthy Children Radio: Car Seat Safety Dennis Durbin, MD, FAAP, lead author ...
Pension Fund
2011-01-01
All members and beneficiaries of the Pension Fund are invited to attend the Annual Pension Fund Update to be held in the CERN Council Chamber on Tuesday 20 September 2011 from 10-00 to 12-00 a.m. Copies of the 2010 Financial Statements are available from departmental secretariats. Coffee and croissants will be served prior to the meeting as of 9-30 a.m.
International Nuclear Information System (INIS)
Anon.
1992-01-01
Ontario Hydro's Demand/Supply Plan (DSP), the 25 year plan which was submitted in December 1989, is currently being reviewed by the Environmental Assessment Board (EAB). Since 1989 there have been several changes which have led Ontario Hydro to update the original Demand/Supply Plan. This information sheet gives a quick overview of what has changed and how Ontario Hydro is adapting to that change
FEM Updating of the Heritage Court Building Structure
DEFF Research Database (Denmark)
Ventura, C. E.; Brincker, Rune; Dascotte, E.
2001-01-01
. The starting model of the structure was developed from the information provided in the design documentation of the building. Different parameters of the model were then modified using an automated procedure to improve the correlation between measured and calculated modal parameters. Careful attention......This paper describes results of a model updating study conducted on a 15-storey reinforced concrete shear core building. The output-only modal identification results obtained from ambient vibration measurements of the building were used to update a finite element model of the structure...
Deductive Updating Is Not Bayesian
Markovits, Henry; Brisson, Janie; de Chantal, Pier-Luc
2015-01-01
One of the major debates concerning the nature of inferential reasoning is between counterexample-based theories such as mental model theory and probabilistic theories. This study looks at conclusion updating after the addition of statistical information to examine the hypothesis that deductive reasoning cannot be explained by probabilistic…
Eczema and ceramides: an update
DEFF Research Database (Denmark)
Jungersted, Jakob Mutanu; Agner, Tove
2013-01-01
types of treatment. We also consider the genetic influence on stratum corneum lipids. The review is an update on research indexed in PubMed following the discovery of the filaggrin mutations in atopic dermatitis in 2006, but when newer publications cannot stand alone, we include publications from before...
Pelegrina, Santiago; Capodieci, Agnese; Carretti, Barbara; Cornoldi, Cesare
2015-01-01
It has been argued that children with learning disabilities (LD) encounter severe problems in working memory (WM) tasks, especially when they need to update information stored in their WM. It is not clear, however, to what extent this is due to a generally poor updating ability or to a difficulty specific to the domain to be processed. To examine this issue, two groups of children with arithmetic or reading comprehension LD and a group of typically developing children (9 to 10 years old) were assessed using two updating tasks requiring to select the smallest numbers or objects presented. The results showed that children with an arithmetic disability failed in a number updating task, but not in the object updating task. The opposite was true for the group with poor reading comprehension, whose performance was worse in the object than in the number updating task. It may be concluded that the problem of WM updating in children with LD is also due to a poor representation of the material to be updated. In addition, our findings suggest that the mental representation of the size of objects relates to the semantic representation of the objects' properties and differs from the quantitative representation of numbers. © Hammill Institute on Disabilities 2014.
Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1
International Nuclear Information System (INIS)
Oztunali, O.I.; Roles, G.W.
1986-01-01
Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology
Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1
Energy Technology Data Exchange (ETDEWEB)
Oztunali, O.I.; Roles, G.W.
1986-01-01
Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.
75 FR 54592 - Pale Cyst Nematode; Update of Quarantined Areas
2010-09-08
...] Pale Cyst Nematode; Update of Quarantined Areas AGENCY: Animal and Plant Health Inspection Service... made changes to the area in the State of Idaho that is quarantined to prevent the spread of pale cyst nematode. The description of the quarantined area was updated on April 26, 2010. As a result of these...
CERN's web application updates for electron and laser beam technologies
Sigas, Christos
2017-01-01
This report describes the modifications at CERN's web application for electron and laser beam technologies. There are updates at both the front and the back end of the application. New electron and laser machines were added and also old machines were updated. There is also a new feature for printing needed information.
International Nuclear Information System (INIS)
Okasha, Nader M.; Frangopol, Dan M.; Orcesi, André D.
2012-01-01
The importance of improving the understanding of the performance of structures over their lifetime under uncertainty with information obtained from structural health monitoring (SHM) has been widely recognized. However, frameworks that efficiently integrate monitoring data into the life-cycle management of structures are yet to be developed. The objective of this paper is to propose and illustrate an approach for updating the lifetime reliability of aging bridges using monitored strain data obtained from crawl tests. It is proposed to use automated finite element model updating techniques as a tool for updating the resistance parameters of the structure. In this paper, the results from crawl tests are used to update the finite element model and, in turn, update the lifetime reliability. The original and updated lifetime reliabilities are computed using advanced computational tools. The approach is illustrated on an existing bridge.
Updated Higgs cross section at approximate N3LO
International Nuclear Information System (INIS)
Bonvini, Marco; Ball, Richard D; Forte, Stefano; Marzani, Simone; Ridolfi, Giovanni
2014-01-01
We update our estimate of the cross section for Higgs production in gluon fusion at next-to-next-to-next-to-leading order in α s in view of the recent full computation of the result in the soft limit for infinite top mass, which determines a previously unknown constant. We briefly discuss the phenomenological implications. Results are available through the updated version of the ggHiggs code. (paper)
Updated Higgs cross section at approximate N3LO
International Nuclear Information System (INIS)
Bonvini, Marco; Ball, Richard D.; Marzani, Simone
2014-04-01
We update our estimate of the cross section for Higgs production in gluon fusion at next-to-next-to-next-to-leading order (N 3 LO) in α s in view of the recent full computation of the result in the soft limit for infinite top mass, which determines a previously unknown constant. We briefly discuss the phenomenological implications. Results are available through the updated version of the ggHiggs code.
Updating design information questionnaire (DIQ) experiences
International Nuclear Information System (INIS)
Palafox-Garcia, P.
2001-01-01
Full text: 1. Introduction - Once the State signed with the International Atomic Energy Agency the Non-Proliferation Treaty (NPT), the State has to declare to the IAEA their facilities where they handle Nuclear Material. Each facility will have their own Safeguards Agreement and these are called Subsidiary Arrangements. In order to have a good control and accountability of this material, each facility is named Material Balance Area (MBA). Based on the Subsidiary Arrangements each MBA has to fill a proper IAEA format named DIQ in order to get the Facility Attachment. The DIQ format varies, relying on the kind of facility. 2. Facility - In the NNRI, we have two MBA's and the experiences that we have had to fill the DIQ formats had been, that it takes quite a time to get the proper Facility Attachment, because first you have to have the proper format, then you fill it properly with all their respective annexes and once it is reviewed and approved by the people involved, this is signed and sent to the IAEA, this first step took six months. Once the format is reviewed by the IAEA, they send it back to the facility, asking for proper comments in order to clarify it properly, this took three months. The facility update the comments and send it back, this took three months. With this format the IAEA prepares the Facility Attachment of the MBA and send it to the facility for its approval or comments, this took five months. The facility reviewed it and sent it back with some comments or doubts after tree months. The IAEA clarifies the comments and doubts and send to the facility the approved Facility Attachment, four months later. So in order to get the proper Facility Attachments for each of our MBA's, it has been taken 24 months (two years) at least. 3. Actual situation - At present, now that the nuclear activities have been diminished and consequently the nuclear material movements, because the Fuel Fabrication Pilot Plant (FFPP) we have, was stopped for financial reasons
Rapid Update Cycle (RUC) [20 km
National Oceanic and Atmospheric Administration, Department of Commerce — The Rapid Update Cycle (RUC) weather forecast model was developed by the National Centers for Environmental Prediction (NCEP). On May 1, 2012, the RUC was replaced...
Rapid Update Cycle (RUC) [13 km
National Oceanic and Atmospheric Administration, Department of Commerce — The Rapid Update Cycle (RUC) weather forecast model was developed by the National Centers for Environmental Prediction (NCEP). On May 1, 2012, the RUC was replaced...
1990 Kansas Land Cover Patterns Update
Kansas Data Access and Support Center — In 2008, an update of the 1990 Kansas Land Cover Patterns (KLCP) database was undertaken. The 1990 KLCP database depicts 10 general land cover classes for the State...
Updated preparedness and response framework for influenza pandemics.
Holloway, Rachel; Rasmussen, Sonja A; Zaza, Stephanie; Cox, Nancy J; Jernigan, Daniel B
2014-09-26
The complexities of planning for and responding to the emergence of novel influenza viruses emphasize the need for systematic frameworks to describe the progression of the event; weigh the risk of emergence and potential public health impact; evaluate transmissibility, antiviral resistance, and severity; and make decisions about interventions. On the basis of experience from recent influenza responses, CDC has updated its framework to describe influenza pandemic progression using six intervals (two prepandemic and four pandemic intervals) and eight domains. This updated framework can be used for influenza pandemic planning and serves as recommendations for risk assessment, decision-making, and action in the United States. The updated framework replaces the U.S. federal government stages from the 2006 implementation plan for the National Strategy for Pandemic Influenza (US Homeland Security Council. National strategy for pandemic influenza: implementation plan. Washington, DC: US Homeland Security Council; 2006. Available at http://www.flu.gov/planning-preparedness/federal/pandemic-influenza-implementation.pdf). The six intervals of the updated framework are as follows: 1) investigation of cases of novel influenza, 2) recognition of increased potential for ongoing transmission, 3) initiation of a pandemic wave, 4) acceleration of a pandemic wave, 5) deceleration of a pandemic wave, and 6) preparation for future pandemic waves. The following eight domains are used to organize response efforts within each interval: incident management, surveillance and epidemiology, laboratory, community mitigation, medical care and countermeasures, vaccine, risk communications, and state/local coordination. Compared with the previous U.S. government stages, this updated framework provides greater detail and clarity regarding the potential timing of key decisions and actions aimed at slowing the spread and mitigating the impact of an emerging pandemic. Use of this updated framework is
Supporting Documentation for the 2008 Update to the Insulation Fact Sheet
Energy Technology Data Exchange (ETDEWEB)
Stovall, Therese K [ORNL
2008-02-01
The Insulation Fact Sheet provides consumers for general guidance and recommended insulation levels for their home. This fact sheet has been on-line since 1995 and this update addresses new insulation materials, as well as updated costs for energy and materials.
Computational simulation of the creep-rupture process in filamentary composite materials
Slattery, Kerry T.; Hackett, Robert M.
1991-01-01
A computational simulation of the internal damage accumulation which causes the creep-rupture phenomenon in filamentary composite materials is developed. The creep-rupture process involves complex interactions between several damage mechanisms. A statistically-based computational simulation using a time-differencing approach is employed to model these progressive interactions. The finite element method is used to calculate the internal stresses. The fibers are modeled as a series of bar elements which are connected transversely by matrix elements. Flaws are distributed randomly throughout the elements in the model. Load is applied, and the properties of the individual elements are updated at the end of each time step as a function of the stress history. The simulation is continued until failure occurs. Several cases, with different initial flaw dispersions, are run to establish a statistical distribution of the time-to-failure. The calculations are performed on a supercomputer. The simulation results compare favorably with the results of creep-rupture experiments conducted at the Lawrence Livermore National Laboratory.
Update History of This Database - D-HaploDB | LSDB Archive [Life Science Database Archive metadata
Lifescience Database Archive (English)
Full Text Available List Contact us D-HaploDB Update History of This Database Date Update contents 2016/12/13 Description of the.../orca.gen.kyushu-u.ac.jp/) is released. About This Database Database Description Download License Update History of This Database... Site Policy | Contact Us Update History of This Database - D-HaploDB | LSDB Archive ...
Simulating a Direction-Finder Search for an ELT
Bream, Bruce
2005-01-01
A computer program simulates the operation of direction-finding equipment engaged in a search for an emergency locator transmitter (ELT) aboard an aircraft that has crashed. The simulated equipment is patterned after the equipment used by the Civil Air Patrol to search for missing aircraft. The program is designed to be used for training in radio direction-finding and/or searching for missing aircraft without incurring the expense and risk of using real aircraft and ground search resources. The program places a hidden ELT on a map and enables the user to search for the location of the ELT by moving a 14 NASA Tech Briefs, March 2005 small aircraft image around the map while observing signal-strength and direction readings on a simulated direction- finding locator instrument. As the simulated aircraft is turned and moved on the map, the program updates the readings on the direction-finding instrument to reflect the current position and heading of the aircraft relative to the location of the ELT. The software is distributed in a zip file that contains an installation program. The software runs on the Microsoft Windows 9x, NT, and XP operating systems.
Update: Applications of Research in Music Education Yearbook. Volume 24
Rowman & Littlefield Education, 2006
2006-01-01
Readers of the online journal "Update: Applications of Research in Music Education" who prefer a printed copy of articles most relevant to their work will find them in the new 2005-2006 "Update Yearbook." Now available to everyone interested in the latest music education trends, the Yearbook contains in print the entire online issues for…
Early Limits on the Verbal Updating of an Object's Location
Ganea, Patricia A.; Harris, Paul L.
2013-01-01
Recent research has shown that by 30 months of age, children can successfully update their representation of an absent object's location on the basis of new verbal information, whereas 23-month-olds often return to the object's prior location. The current results show that this updating failure persisted even when (a) toddlers received visual and…