WorldWideScience

Sample records for modeling project update

  1. Model Updating Nonlinear System Identification Toolbox Project

    Data.gov (United States)

    National Aeronautics and Space Administration — ZONA Technology (ZONA) proposes to develop an enhanced model updating nonlinear system identification (MUNSID) methodology that utilizes flight data with...

  2. Model Updating Nonlinear System Identification Toolbox Project

    Data.gov (United States)

    National Aeronautics and Space Administration — ZONA Technology proposes to develop an enhanced model updating nonlinear system identification (MUNSID) methodology by adopting the flight data with state-of-the-art...

  3. The Lunar Mapping and Modeling Project Update

    Science.gov (United States)

    Noble, S.; French, R.; Nall, M.; Muery, K.

    2010-01-01

    The Lunar Mapping and Modeling Project (LMMP) is managing the development of a suite of lunar mapping and modeling tools and data products that support lunar exploration activities, including the planning, design, development, test, and operations associated with crewed and/or robotic operations on the lunar surface. In addition, LMMP should prove to be a convenient and useful tool for scientific analysis and for education and public outreach (E/PO) activities. LMMP will utilize data predominately from the Lunar Reconnaissance Orbiter, but also historical and international lunar mission data (e.g. Lunar Prospector, Clementine, Apollo, Lunar Orbiter, Kaguya, and Chandrayaan-1) as available and appropriate. LMMP will provide such products as image mosaics, DEMs, hazard assessment maps, temperature maps, lighting maps and models, gravity models, and resource maps. We are working closely with the LRO team to prevent duplication of efforts and ensure the highest quality data products. A beta version of the LMMP software was released for limited distribution in December 2009, with the public release of version 1 expected in the Fall of 2010.

  4. Foothills model forest grizzly bear study : project update

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-01-01

    This report updates a five-year study launched in 1999 to ensure the continued healthy existence of grizzly bears in west-central Alberta by integrating their needs into land management decisions. The objective was to gather better information and to develop computer-based maps and models regarding grizzly bear migration, habitat use and response to human activities. The study area covers 9,700 square km in west-central Alberta where 66 to 147 grizzly bears exist. During the first 3 field seasons, researchers captured and radio-collared 60 bears. Researchers at the University of Calgary used remote sensing tools and satellite images to develop grizzly bear habitat maps. Collaborators at the University of Washington used trained dogs to find bear scat, which was analyzed for DNA, stress levels and reproductive hormones. Resource Selection Function models are being developed by researchers at the University of Alberta to identify bear locations and to see how habitat is influenced by vegetation cover and oil, gas, forestry and mining activities. The health of the bears is being studied by researchers at the University of Saskatchewan and the Canadian Cooperative Wildlife Health Centre. The study has already advanced the scientific knowledge of grizzly bear behaviour. Preliminary results indicate that grizzlies continue to find mates, reproduce, gain weight and establish dens. These are all good indicators of a healthy population. Most bear deaths have been related to poaching. The study will continue for another two years. 1 fig.

  5. Model Updating and Uncertainty Management for Aircraft Prognostic Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal addresses the integration of physics-based damage propagation models with diagnostic measures of current state of health in a mathematically rigorous...

  6. Advanced Test Reactor Core Modeling Update Project Annual Report for Fiscal Year 2011

    Energy Technology Data Exchange (ETDEWEB)

    David W. Nigg; Devin A. Steuhm

    2011-09-01

    Legacy computational reactor physics software tools and protocols currently used for support of Advanced Test Reactor (ATR) core fuel management and safety assurance and, to some extent, experiment management are obsolete, inconsistent with the state of modern nuclear engineering practice, and are becoming increasingly difficult to properly verify and validate (V&V). Furthermore, the legacy staff knowledge required for application of these tools and protocols from the 1960s and 1970s is rapidly being lost due to staff turnover and retirements. In 2009 the Idaho National Laboratory (INL) initiated a focused effort to address this situation through the introduction of modern high-fidelity computational software and protocols, with appropriate V&V, within the next 3-4 years via the ATR Core Modeling and Simulation and V&V Update (or 'Core Modeling Update') Project. This aggressive computational and experimental campaign will have a broad strategic impact on the operation of the ATR, both in terms of improved computational efficiency and accuracy for support of ongoing DOE programs as well as in terms of national and international recognition of the ATR National Scientific User Facility (NSUF). The ATR Core Modeling Update Project, targeted for full implementation in phase with the anticipated ATR Core Internals Changeout (CIC) in the 2014 time frame, began during the last quarter of Fiscal Year 2009, and has just completed its first full year. Key accomplishments so far have encompassed both computational as well as experimental work. A new suite of stochastic and deterministic transport theory based reactor physics codes and their supporting nuclear data libraries (SCALE, KENO-6, HELIOS, NEWT, and ATTILA) has been installed at the INL under various permanent sitewide license agreements, and corresponding baseline models of the ATR and ATRC are now operational, demonstrating the basic feasibility of these code packages for their intended purpose. Furthermore

  7. Advanced Test Reactor Core Modeling Update Project Annual Report for Fiscal Year 2010

    Energy Technology Data Exchange (ETDEWEB)

    Rahmat Aryaeinejad; Douglas S. Crawford; Mark D. DeHart; George W. Griffith; D. Scott Lucas; Joseph W. Nielsen; David W. Nigg; James R. Parry; Jorge Navarro

    2010-09-01

    Legacy computational reactor physics software tools and protocols currently used for support of Advanced Test Reactor (ATR) core fuel management and safety assurance and, to some extent, experiment management are obsolete, inconsistent with the state of modern nuclear engineering practice, and are becoming increasingly difficult to properly verify and validate (V&V). Furthermore, the legacy staff knowledge required for application of these tools and protocols from the 1960s and 1970s is rapidly being lost due to staff turnover and retirements. In 2009 the Idaho National Laboratory (INL) initiated a focused effort to address this situation through the introduction of modern high-fidelity computational software and protocols, with appropriate V&V, within the next 3-4 years via the ATR Core Modeling and Simulation and V&V Update (or “Core Modeling Update”) Project. This aggressive computational and experimental campaign will have a broad strategic impact on the operation of the ATR, both in terms of improved computational efficiency and accuracy for support of ongoing DOE programs as well as in terms of national and international recognition of the ATR National Scientific User Facility (NSUF).

  8. Advanced Test Reactor Core Modeling Update Project Annual Report for Fiscal Year 2013

    Energy Technology Data Exchange (ETDEWEB)

    David W. Nigg

    2013-09-01

    Legacy computational reactor physics software tools and protocols currently used for support of Advanced Test Reactor (ATR) core fuel management and safety assurance, and to some extent, experiment management, are inconsistent with the state of modern nuclear engineering practice, and are difficult, if not impossible, to verify and validate (V&V) according to modern standards. Furthermore, the legacy staff knowledge required for effective application of these tools and protocols from the 1960s and 1970s is rapidly being lost due to staff turnover and retirements. In late 2009, the Idaho National Laboratory (INL) initiated a focused effort, the ATR Core Modeling Update Project, to address this situation through the introduction of modern high-fidelity computational software and protocols. This aggressive computational and experimental campaign will have a broad strategic impact on the operation of the ATR, both in terms of improved computational efficiency and accuracy for support of ongoing DOE programs as well as in terms of national and international recognition of the ATR National Scientific User Facility (NSUF).

  9. Advanced Test Reactor Core Modeling Update Project Annual Report for Fiscal Year 2012

    Energy Technology Data Exchange (ETDEWEB)

    David W. Nigg, Principal Investigator; Kevin A. Steuhm, Project Manager

    2012-09-01

    Legacy computational reactor physics software tools and protocols currently used for support of Advanced Test Reactor (ATR) core fuel management and safety assurance, and to some extent, experiment management, are inconsistent with the state of modern nuclear engineering practice, and are difficult, if not impossible, to properly verify and validate (V&V) according to modern standards. Furthermore, the legacy staff knowledge required for application of these tools and protocols from the 1960s and 1970s is rapidly being lost due to staff turnover and retirements. In late 2009, the Idaho National Laboratory (INL) initiated a focused effort, the ATR Core Modeling Update Project, to address this situation through the introduction of modern high-fidelity computational software and protocols. This aggressive computational and experimental campaign will have a broad strategic impact on the operation of the ATR, both in terms of improved computational efficiency and accuracy for support of ongoing DOE programs as well as in terms of national and international recognition of the ATR National Scientific User Facility (NSUF). The ATR Core Modeling Update Project, targeted for full implementation in phase with the next anticipated ATR Core Internals Changeout (CIC) in the 2014-2015 time frame, began during the last quarter of Fiscal Year 2009, and has just completed its third full year. Key accomplishments so far have encompassed both computational as well as experimental work. A new suite of stochastic and deterministic transport theory based reactor physics codes and their supporting nuclear data libraries (HELIOS, KENO6/SCALE, NEWT/SCALE, ATTILA, and an extended implementation of MCNP5) has been installed at the INL under various licensing arrangements. Corresponding models of the ATR and ATRC are now operational with all five codes, demonstrating the basic feasibility of the new code packages for their intended purpose. Of particular importance, a set of as-run core

  10. NACRE Update and Extension Project

    Science.gov (United States)

    Aikawa, Masayuki; Arai, Koji; Arnould, Marcel; Takahashi, Kohji; Utsunomiya, Hiroaki

    2006-04-01

    NACRE, the `nuclear astrophysics compilation of reaction rates', has been widely utilized in stellar evolution and nucleosynthesis studies since its publication in 1999. We describe here the current status of a Konan-Université Libre de Bruxelles (ULB) joint project that aims at its update and extension.

  11. Eastern slopes grizzly bear project : project update

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-01-01

    This report updates a study to examine the cumulative effects of human activities on the grizzly bears in the central Canadian Rockies. The project was initiated in 1994 to acquire accurate scientific information on the habitat and populations of grizzly bears in the area of the Banff National Park and Kananaskis Country. This area is probably the most heavily used and developed area where the grizzly still survives. The information gathered throughout the course of the study is used to better protect and manage the bears and other sensitive carnivores in the region. Using telemetry, researchers monitored 25 grizzly bears which were radio-collared in a 22,000 square-kilometer area in the upper Bow Valley drainage of the eastern Alberta slopes. The researchers worked with representatives from Husky Oil and Rigel Energy on the development of the Moose Mountain oil and gas field without adversely affecting the grizzly bear population. Information collected over eight years indicates that the grizzly bears have few and infrequent offspring. Using the information gathered thus far, the location of the Moose Mountain to Jumping Pound pipeline was carefully selected, since the bears suffer from high mortality, and the food and cover had already been compromised by the high number of roads, trails and other human activities in the area. The research, concluded in November 2001, provides sufficient information to accurately assess the status of the grizzly bear population and habitat. The data will be analyzed and integrated in 2002 into models that reflect the variables affecting grizzly bears and a final report will be published.

  12. Updated Core Libraries of the ALPS Project

    CERN Document Server

    Gaenko, A; Carcassi, G; Chen, T; Chen, X; Dong, Q; Gamper, L; Gukelberger, J; Igarashi, R; Iskakov, S; Könz, M; LeBlanc, J P F; Levy, R; Ma, P N; Paki, J E; Shinaoka, H; Todo, S; Troyer, M; Gull, E

    2016-01-01

    The open source ALPS (Algorithms and Libraries for Physics Simulations) project provides a collection of physics libraries and applications, with a focus on simulations of lattice models and strongly correlated systems. The libraries provide a convenient set of well-documented and reusable components for developing condensed matter physics simulation code, and the applications strive to make commonly used and proven computational algorithms available to a non-expert community. In this paper we present an updated and refactored version of the core ALPS libraries geared at the computational physics software development community, rewritten with focus on documentation, ease of installation, and software maintainability.

  13. Dynamic Model Updating Using Virtual Antiresonances

    Directory of Open Access Journals (Sweden)

    Walter D’Ambrogio

    2004-01-01

    This paper considers an extension of the model updating method that minimizes the antiresonance error, besides the natural frequency error. By defining virtual antiresonances, this extension allows the use of previously identified modal data. Virtual antiresonances can be evaluated from a truncated modal expansion, and do not correspond to any physical system. The method is applied to the Finite Element model updating of the GARTEUR benchmark, used within a European project on updating. Results are compared with those previously obtained by estimating actual antiresonances after computing low- and high-frequency residuals, and with results obtained by using the correlation (MAC) between identified and analytical mode shapes.
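
The objective sketched in this abstract can be illustrated numerically. Below is a minimal sketch (not the paper's implementation, and with made-up stiffness values): for a 2-DOF spring-mass chain, the antiresonances of the driving-point FRF at a given DOF equal the natural frequencies of the system with that DOF grounded, and the updating objective combines natural-frequency and antiresonance errors.

```python
import numpy as np

def natural_freqs(K, M):
    """Natural frequencies (rad/s) from the generalized eigenproblem K v = w^2 M v."""
    lam = np.linalg.eigvals(np.linalg.solve(M, K))
    return np.sort(np.sqrt(np.real(lam)))

def driving_point_antires(K, M, dof):
    """Antiresonances of the driving-point FRF at `dof`: the natural
    frequencies of the substructure with that DOF grounded (row/column deleted)."""
    keep = [i for i in range(K.shape[0]) if i != dof]
    ix = np.ix_(keep, keep)
    return natural_freqs(K[ix], M[ix])

def chain_K(k1, k2):
    """Stiffness matrix of a 2-DOF spring-mass chain (hypothetical structure)."""
    return np.array([[k1 + k2, -k2], [-k2, k2]])

M = np.eye(2)                      # unit masses (assumed)
K_true = chain_K(2.0, 1.0)         # stands in for the "experimental" structure
w_exp = natural_freqs(K_true, M)
a_exp = driving_point_antires(K_true, M, 0)

def objective(k1, k2):
    """Sum of squared relative errors in natural frequencies and antiresonances."""
    K = chain_K(k1, k2)
    w = natural_freqs(K, M)
    a = driving_point_antires(K, M, 0)
    return np.sum(((w - w_exp) / w_exp) ** 2) + np.sum(((a - a_exp) / a_exp) ** 2)

# A coarse grid search stands in for the optimizer used in practice.
grid = np.linspace(0.5, 3.0, 51)
best = min((objective(k1, k2), k1, k2) for k1 in grid for k2 in grid)
print(best[1], best[2])  # recovers the "true" stiffnesses near 2.0 and 1.0
```

The virtual-antiresonance idea in the paper replaces `a_exp` with values reconstructed from a truncated modal expansion rather than from measured FRF zeros.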

  14. MU-SPIN Project Update

    Science.gov (United States)

    Harrington, James L., Jr.

    2000-01-01

    The Minority University-Space Interdisciplinary Network (MU-SPIN) project is a comprehensive outreach and education initiative that focuses on the transfer of advanced computer networking technologies and relevant science to Historically Black Colleges and Universities (HBCUs) and Other Minority Universities (OMUs) for supporting multi-disciplinary education research.

  15. Adjustment or updating of models

    Indian Academy of Sciences (India)

    D J Ewins

    2000-06-01

    In this paper, first a review of the terminology used in model adjustment or updating is presented. This is followed by an outline of the major updating algorithms currently available, together with a discussion of the advantages and disadvantages of each, and the current state of the art of this important application and part of optimum design technology.

  16. GMRT-EoR Project Update

    Science.gov (United States)

    Chang, Tzu-Ching; GMRT-EoR Team

    2013-01-01

    The GMRT-EoR project has continued the multi-year campaign to measure the neutral hydrogen 21cm fluctuations at z=8.5, probing the reionization epoch using the Giant Metrewave Radio Telescope (GMRT) in India. We have acquired new data and continued to improve our analysis strategy and process. To cope with the stringent foreground subtraction requirement, our team has developed a new method for component separation; I will present updates on the latest analysis and future prospects.

  17. Evaluation of the groundwater flow model for southern Utah and Goshen Valleys, Utah, updated to conditions through 2011, with new projections and groundwater management simulations

    Science.gov (United States)

    Brooks, Lynette E.

    2013-01-01

    The U.S. Geological Survey (USGS), in cooperation with the Southern Utah Valley Municipal Water Association, updated an existing USGS model of southern Utah and Goshen Valleys for hydrologic and climatic conditions from 1991 to 2011 and used the model for projection and groundwater management simulations. All model files used in the transient model were updated to be compatible with MODFLOW-2005 and with the additional stress periods. The well and recharge files had the most extensive changes. Discharge to pumping wells in southern Utah and Goshen Valleys was estimated and simulated on an annual basis from 1991 to 2011. Recharge estimates for 1991 to 2011 were included in the updated model by using precipitation, streamflow, canal diversions, and irrigation groundwater withdrawals for each year. The model was evaluated to determine how well it simulates groundwater conditions during recent increased withdrawals and drought, and to determine if the model is adequate for use in future planning. In southern Utah Valley, the magnitude and direction of annual water-level fluctuation simulated by the updated model reasonably match measured water-level changes, but they do not simulate as much decline as was measured in some locations from 2000 to 2002. Both the rapid increase in groundwater withdrawals and the total groundwater withdrawals in southern Utah Valley during this period exceed the variations and magnitudes simulated during the 1949 to 1990 calibration period. It is possible that hydraulic properties may be locally incorrect or that changes, such as land use or irrigation diversions, occurred that are not simulated. In the northern part of Goshen Valley, simulated water-level changes reasonably match measured changes. Farther south, however, simulated declines are much less than measured declines. Land-use changes indicate that groundwater withdrawals in Goshen Valley are possibly greater than estimated and simulated. It is also possible that irrigation

  18. Model validation: Correlation for updating

    Indian Academy of Sciences (India)

    D J Ewins

    2000-06-01

    In this paper, a review is presented of the various methods which are available for the purpose of performing a systematic comparison and correlation between two sets of vibration data. In the present case, the application of interest is in conducting this correlation process as a prelude to model correlation or updating activity.

  19. High-dose neutron detector project update

    Energy Technology Data Exchange (ETDEWEB)

    Menlove, Howard Olsen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Henzlova, Daniela [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-10

    These are the slides for a progress review meeting by the sponsor. This is an update on the high-dose neutron detector project. In summary, improvements in both boron coating and signal amplification have been achieved; improved boron coating materials and procedures have increased efficiency by ~ 30-40% without the corresponding increase in the detector plate area; low dead-time via thin cell design (~ 4 mm gas gaps) and fast amplifiers; prototype PDT 8” pod has been received and testing is in progress; significant improvements in efficiency and stability have been verified; use commercial PDT 10B design and fabrication to obtain a faster path from research to a practical high-dose neutron detector.

  1. CTL Model Update for System Modifications

    CERN Document Server

    Ding, Yulin; Zhang, Yan; DOI: 10.1613/jair.2420

    2011-01-01

    Model checking is a promising technology, which has been applied for verification of many hardware and software systems. In this paper, we introduce the concept of model update towards the development of an automatic system modification tool that extends model checking functions. We define primitive update operations on the models of Computation Tree Logic (CTL) and formalize the principle of minimal change for CTL model update. These primitive update operations, together with the underlying minimal change principle, serve as the foundation for CTL model update. Essential semantic and computational characterizations are provided for our CTL model update approach. We then describe a formal algorithm that implements this approach. We also illustrate two case studies of CTL model updates for the well-known microwave oven example and the Andrew File System 1, from which we further propose a method to optimize the update results in complex system modifications.
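
The notion of a primitive update operation under a minimal-change principle can be illustrated on a toy Kripke structure. The sketch below uses hypothetical states and labels (not the paper's microwave-oven model or its algorithm): it checks a simple invariant AG safe and applies one primitive operation, deletion of relation elements, to restore it.

```python
def reachable(trans, s0):
    """All states reachable from s0 over the transition relation."""
    seen, stack = {s0}, [s0]
    while stack:
        s = stack.pop()
        for (a, b) in trans:
            if a == s and b not in seen:
                seen.add(b)
                stack.append(b)
    return seen

def holds_AG(trans, labels, s0, prop):
    """CTL invariant AG prop: prop labels every state reachable from s0."""
    return all(prop in labels[s] for s in reachable(trans, s0))

def update_by_deletion(trans, labels, prop):
    """Primitive update: delete relation elements that enter states violating
    prop. Deleting only offending edges keeps the change minimal."""
    return {(a, b) for (a, b) in trans if prop in labels[b]}

labels = {0: {"safe"}, 1: {"safe"}, 2: set()}    # state 2 violates "safe"
trans = {(0, 1), (1, 2), (1, 0), (2, 0)}

assert not holds_AG(trans, labels, 0, "safe")    # model checking fails
updated = update_by_deletion(trans, labels, "safe")
assert holds_AG(updated, labels, 0, "safe")      # property restored
print(trans - updated)                           # the single deleted edge
```

A full CTL updater must handle arbitrary formulas and compare candidate updates by a formal closeness ordering; this sketch only shows the deletion primitive and the intuition behind minimal change.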

  2. The Danish (Q)SAR Database Update Project

    DEFF Research Database (Denmark)

    Nikolov, Nikolai Georgiev; Dybdahl, Marianne; Abildgaard Rosenberg, Sine

    2013-01-01

    ...carcinogenicity and others), each of them available for 185,000 organic substances. The database has been available online since 2005 (http://qsar.food.dtu.dk). A major update project for the Danish (Q)SAR database is under way, with a new online release planned in the beginning of 2015. The updated version...

  3. Experimental model updating using frequency response functions

    Science.gov (United States)

    Hong, Yu; Liu, Xi; Dong, Xinjun; Wang, Yang; Pu, Qianhui

    2016-04-01

    In order to obtain a finite element (FE) model that can more accurately describe structural behaviors, experimental data measured from the actual structure can be used to update the FE model. The process is known as FE model updating. In this paper, a frequency response function (FRF)-based model updating approach is presented. The approach attempts to minimize the difference between analytical and experimental FRFs, while the experimental FRFs are calculated using simultaneously measured dynamic excitation and corresponding structural responses. In this study, the FRF-based model updating method is validated through laboratory experiments on a four-story shear-frame structure. To obtain the experimental FRFs, shake table tests and impact hammer tests are performed. The FRF-based model updating method is shown to successfully update the stiffness, mass and damping parameters of the four-story structure, so that the analytical and experimental FRFs match well with each other.
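
The core of an FRF-based update is the receptance matrix H(ω) = (K − ω²M + iωC)⁻¹ and an objective penalizing the misfit between analytical and experimental FRFs. The following is a minimal sketch with a 2-DOF system and made-up parameters, not the authors' four-story model; a single stiffness scale factor is updated by grid search.

```python
import numpy as np

def receptance(K, M, C, omegas):
    """Receptance FRF matrices H(w) = (K - w^2 M + j w C)^-1 at each frequency line."""
    return np.array([np.linalg.inv(K - w**2 * M + 1j * w * C) for w in omegas])

M = np.diag([1.0, 1.0])
C = 0.05 * np.eye(2)                           # light damping (assumed)

def K_of(scale):                               # stiffness governed by one parameter
    return scale * np.array([[3.0, -1.0], [-1.0, 1.0]])

omegas = np.linspace(0.2, 2.5, 40)             # "measured" frequency lines
H_exp = receptance(K_of(1.0), M, C, omegas)    # experimental FRFs (true scale = 1)

def misfit(scale):
    """Sum of squared magnitudes of the FRF difference over all lines and DOFs."""
    return np.sum(np.abs(receptance(K_of(scale), M, C, omegas) - H_exp) ** 2)

# One-parameter grid search stands in for the gradient-based optimizer
# typically used in FRF-based updating.
grid = np.linspace(0.7, 1.3, 61)
scale_hat = grid[np.argmin([misfit(s) for s in grid])]
print(scale_hat)   # close to 1.0: the updated model reproduces the measured FRFs
```

In practice the experimental FRFs come from simultaneously measured excitation and response signals (shake table or impact hammer), and several stiffness, mass and damping parameters are updated at once.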

  4. Sand Lake WMD vegetation mapping project update

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — Final report on the vegetation mapping project at Sand Lake Wetland Management District. This project is being completed by the use of SPRING software and ground...

  5. Resource Tracking Model Updates and Trade Studies

    Science.gov (United States)

    Chambliss, Joe; Stambaugh, Imelda; Moore, Michael

    2016-01-01

    The Resource Tracking Model has been updated to capture system manager and project manager inputs. Both the Trick/General Use Nodal Network Solver Resource Tracking Model (RTM) simulator and the RTM mass balance spreadsheet have been revised to address inputs from system managers and to refine the way mass balance is illustrated. The revisions to the RTM included the addition of a Plasma Pyrolysis Assembly (PPA) to recover hydrogen from Sabatier Reactor methane, which was vented in the prior version of the RTM. The effect of the PPA on the overall balance of resources in an exploration vehicle is illustrated in the increased recycle of vehicle oxygen. Case studies have been run to show the relative effect of performance changes on vehicle resources.

  6. An updating of the SIRM model

    Science.gov (United States)

    Perna, L.; Pezzopane, M.; Pietrella, M.; Zolesi, B.; Cander, L. R.

    2017-09-01

    The SIRM model proposed by Zolesi et al. (1993, 1996) is an ionospheric regional model for predicting the vertical-sounding characteristics that has been frequently used in developing ionospheric web prediction services (Zolesi and Cander, 2014). Recently the model and its outputs were implemented in the framework of two European projects: DIAS (DIgital upper Atmosphere Server; http://www.iono.noa.gr/DIAS/) (Belehaki et al., 2005, 2015) and ESPAS (Near-Earth Space Data Infrastructure for e-Science; http://www.espas-fp7.eu/) (Belehaki et al., 2016). In this paper an updated version of the SIRM model, called SIRMPol, is described and corresponding outputs in terms of the F2-layer critical frequency (foF2) are compared with values recorded at the mid-latitude station of Rome (41.8°N, 12.5°E), for extremely high (year 1958) and low (years 2008 and 2009) solar activity. The main novelties introduced in the SIRMPol model are: (1) an extension of the Rome ionosonde input dataset that, besides data from 1957 to 1987, also includes data from 1988 to 2007; (2) the use of second-order polynomial regressions, instead of linear ones, to fit the relation foF2 vs. solar activity index R12; (3) the use of polynomial relations, instead of linear ones, to fit the relations A0 vs. R12, An vs. R12 and Yn vs. R12, where A0, An and Yn are the coefficients of the Fourier analysis performed by the SIRM model to reproduce the values calculated by using relations in (2). The obtained results show that SIRMPol outputs are better than those of the SIRM model. As the SIRMPol model represents only a partial updating of the SIRM model based on inputs from only Rome ionosonde data, it can be considered a particular case of a single-station model. Nevertheless, the development of the SIRMPol model allowed getting some useful guidelines for a future complete and more accurate updating of the SIRM model, from which both DIAS and ESPAS could benefit.
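
The key change in SIRMPol, replacing linear with second-order polynomial regressions of foF2 against R12, is easy to sketch with `np.polyfit`. The data below are synthetic (an illustrative saturating curve, not the Rome ionosonde measurements):

```python
import numpy as np

# Synthetic monthly-median foF2 (MHz) versus smoothed sunspot number R12.
# The quadratic term mimics the saturation of foF2 at high solar activity.
R12 = np.linspace(0.0, 200.0, 21)
foF2 = 5.0 + 0.045 * R12 - 9e-5 * R12**2

lin = np.polyfit(R12, foF2, 1)     # SIRM-style linear regression
quad = np.polyfit(R12, foF2, 2)    # SIRMPol-style second-order regression

rss_lin = np.sum((np.polyval(lin, R12) - foF2) ** 2)
rss_quad = np.sum((np.polyval(quad, R12) - foF2) ** 2)
print(rss_lin, rss_quad)   # the quadratic fit captures the curvature; the line cannot
```

In the actual model the same idea is applied to the Fourier coefficients A0, An and Yn as functions of R12, using the extended 1957-2007 input dataset.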

  7. Online Sequential Projection Vector Machine with Adaptive Data Mean Update

    Directory of Open Access Journals (Sweden)

    Lin Chen

    2016-01-01

    We propose a simple online learning algorithm especially suited to high-dimensional data. The algorithm is referred to as the online sequential projection vector machine (OSPVM), which derives from the projection vector machine and can learn from data in one-by-one or chunk-by-chunk mode. In OSPVM, data centering, dimension reduction, and neural network training are integrated seamlessly. In particular, the model parameters, including (1) the projection vectors for dimension reduction, (2) the input weights, biases, and output weights, and (3) the number of hidden nodes, can be updated simultaneously. Moreover, only one parameter, the number of hidden nodes, needs to be determined manually, which makes the algorithm easy to use in real applications. Performance comparison was made on various high-dimensional classification problems for OSPVM against other fast online algorithms, including the budgeted stochastic gradient descent (BSGD) approach, adaptive multihyperplane machine (AMM), primal estimated subgradient solver (Pegasos), online sequential extreme learning machine (OSELM), and SVD + OSELM (feature selection based on SVD performed before OSELM). The results obtained demonstrate the superior generalization performance and efficiency of OSPVM.
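
The "adaptive data mean update" in the title refers to recentring the data incrementally as samples arrive. A minimal sketch of such an online mean, using the standard incremental formula rather than OSPVM itself, shows the chunk-by-chunk mode:

```python
import numpy as np

def update_mean(mu, n, chunk):
    """Merge the mean `mu` of n earlier samples with a new chunk of rows,
    without revisiting the earlier data (chunk-by-chunk mode)."""
    k = chunk.shape[0]
    new_mu = (n * mu + chunk.sum(axis=0)) / (n + k)
    return new_mu, n + k

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))            # hypothetical high-dimensional stream

mu, n = np.zeros(5), 0
for start in range(0, 100, 7):           # uneven chunks; one-by-one also works
    mu, n = update_mean(mu, n, X[start:start + 7])

print(np.allclose(mu, X.mean(axis=0)))   # online result matches the batch mean
```

In OSPVM the same incremental bookkeeping is extended to the projection vectors and network weights, so that dimension reduction and training stay consistent with the updated data centre.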

  8. Utah trumpeter swan project update #1

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — Summary of activities for the Utah swan project for the year of 1996. This summary discusses core sampling that took place at Bear River Migratory Bird Refuge and...

  9. Rubbertown NGEM Demonstration Project - Update to Industry

    Science.gov (United States)

    Follow-up communication to the Rubbertown industry group as part of the planning process for the Rubbertown NGEM demonstration study. These slides are for discussion purposes and will not be presented publicly beyond the project team and industry group.

  10. Numerical modelling of mine workings: annual update 1999/2000.

    CSIR Research Space (South Africa)

    Lightfoot, N

    1999-09-01

    The SIMRAC project GAP629 has two aspects. Firstly, the production of an updated edition of the guidebook Numerical Modelling of Mine Workings. The original document was launched to the South African mining industry in April 1999. Secondly...

  11. Stochastic model updating using distance discrimination analysis

    Institute of Scientific and Technical Information of China (English)

    Deng Zhongmin; Bi Sifeng; Sez Atamturktur

    2014-01-01

    This manuscript presents a stochastic model updating method, taking both uncertainties in models and variability in testing into account. The updated finite element (FE) models obtained through the proposed technique can aid in the analysis and design of structural systems. The authors developed a stochastic model updating method integrating distance discrimination analysis (DDA) and an advanced Monte Carlo (MC) technique to (1) enable more efficient MC by using a response surface model, (2) calibrate parameters with an iterative test-analysis correlation based upon DDA, and (3) utilize and compare different distance functions as correlation metrics. Using DDA, the influence of distance functions on model updating results is analyzed. The proposed stochastic method makes it possible to obtain a precise model updating outcome with acceptable calculation cost. The stochastic method is demonstrated on a helicopter case study updated using both Euclidean and Mahalanobis distance metrics. It is observed that the selected distance function influences the iterative calibration process and thus the calibration outcome, indicating that an integration of different metrics might yield improved results.
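
The two distance metrics compared in the study can be sketched directly. With a cloud of "analytical" feature samples (random and purely illustrative here), Euclidean distance ignores correlations between features, while Mahalanobis distance whitens them with the inverse covariance of the cloud:

```python
import numpy as np

def euclidean(x, mu):
    return float(np.linalg.norm(x - mu))

def mahalanobis(x, mu, cov):
    """sqrt((x-mu)^T C^-1 (x-mu)): distance in units of the cloud's spread."""
    d = x - mu
    return float(np.sqrt(d @ np.linalg.solve(cov, d)))

rng = np.random.default_rng(1)
# Correlated 2-D "analytical" features (e.g. two predicted modal frequencies).
A = rng.normal(size=(500, 2)) @ np.array([[1.0, 0.8], [0.0, 0.6]])
mu, cov = A.mean(axis=0), np.cov(A, rowvar=False)

x_test = mu + np.array([1.0, 1.0])   # a hypothetical test observation
print(euclidean(x_test, mu))         # fixed for any direction of the offset
print(mahalanobis(x_test, mu, cov))  # rescaled by the cloud's covariance
```

In the paper's calibration loop, such a distance between test features and response-surface predictions drives the iterative test-analysis correlation, which is why the metric choice affects the outcome.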

  12. Updated Budget Projections: 2016 to 2026

    Science.gov (United States)

    2016-03-01

    ...projections in that report accounted for the direct budgetary effects of legislation enacted through the end of December 2015; they were based on the economic ... assets. Beyond the 10-year period, if current laws remained in place, the pressures that contributed to rising deficits during the baseline period ... neutral benchmark against which potential changes in law can be measured. Future legislative action could lead to markedly different outcomes ...

  13. Updating Dosimetry for Emergency Response Dose Projections.

    Science.gov (United States)

    DeCair, Sara

    2016-02-01

    In 2013, the U.S. Environmental Protection Agency (EPA) proposed an update to the 1992 Protective Action Guides (PAG) Manual. The PAG Manual provides guidance to state and local officials planning for radiological emergencies. EPA requested public comment on the proposed revisions, while making them available for interim use by officials faced with an emergency situation. Developed with interagency partners, EPA's proposal incorporates newer dosimetric methods, identifies tools and guidelines developed since the current document was issued, and extends the scope of the PAGs to all significant radiological incidents, including radiological dispersal devices or improvised nuclear devices. In order to best serve the emergency management community, scientific policy direction had to be set on how to use International Commission on Radiological Protection Publication 60 age groups in dose assessment when implementing emergency guidelines. Certain guidelines that lend themselves to different PAGs for different subpopulations are the PAGs for potassium iodide (KI), food, and water. These guidelines provide age-specific recommendations because of the radiosensitivity of the thyroid and young children with respect to ingestion and inhalation doses in particular. Taking protective actions like using KI, avoiding certain foods or using alternative sources of drinking water can be relatively simple to implement by the parents of young children. Clear public messages can convey which age groups should take which action, unlike how an evacuation or relocation order should apply to entire households or neighborhoods. New in the PAG Manual is planning guidance for the late phase of an incident, after the situation is stabilized and efforts turn toward recovery. Because the late phase can take years to complete, decision makers are faced with managing public exposures in areas not fully remediated. 
The proposal includes quick-reference operational guidelines to inform re-entry to

  14. The International Project 1992 Update Including "Microfilming Projects Abroad."

    Science.gov (United States)

    Rutimann, Hans

    1993-01-01

    Describes microfilming projects in 30 countries collected from questionnaire responses. Additional topics discussed include cooperative programs for preservation and access; an overview of national programs; mass deacidification; new technologies, such as digital preservation; microfilming projects abroad; and future priorities. (Contains 10…

  15. Neuman systems model in holland: an update.

    Science.gov (United States)

    Merks, André; Verberk, Frans; de Kuiper, Marlou; Lowry, Lois W

    2012-10-01

The authors of this column, leading members of the International Neuman Systems Model Association, provide an update on the use of the Neuman systems model in Holland and document the various changes in The Netherlands that have influenced the use of the model in that country. The model's links to systems theory and stress theory are discussed, as well as a shift toward greater emphasis on patient self-management. The model is also linked to healthcare quality improvement and interprofessional collaboration in Holland.

  16. Updating the U.S. Life Cycle GHG Petroleum Baseline to 2014 with Projections to 2040 Using Open-Source Engineering-Based Models.

    Science.gov (United States)

    Cooney, Gregory; Jamieson, Matthew; Marriott, Joe; Bergerson, Joule; Brandt, Adam; Skone, Timothy J

    2017-01-17

The National Energy Technology Laboratory produced a well-to-wheels (WTW) life cycle greenhouse gas analysis of petroleum-based fuels consumed in the U.S. in 2005, known as the NETL 2005 Petroleum Baseline. This study uses a set of engineering-based, open-source models combined with publicly available data to calculate baseline results for 2014. The increase between the 2005 baseline and the 2014 results presented here (e.g., 92.4 vs 96.2 g CO2e/MJ gasoline, +4.1%) is due to changes both in the modeling platform and in the U.S. petroleum sector. An updated result for 2005 was calculated to minimize the effect of the change in modeling platform, and emissions for gasoline in 2014 were about 2% lower than in 2005 (98.1 vs 96.2 g CO2e/MJ gasoline). The same methods were utilized to forecast emissions from fuels out to 2040, indicating maximum changes from the 2014 gasoline result between +2.1% and -1.4%. The changing baseline values lead to potential compliance challenges with frameworks such as the Energy Independence and Security Act (EISA) Section 526, which states that Federal agencies should not purchase alternative fuels unless their life cycle GHG emissions are less than those of conventionally produced, petroleum-derived fuels.

  17. A Provenance Tracking Model for Data Updates

    Directory of Open Access Journals (Sweden)

    Gabriel Ciobanu

    2012-08-01

    Full Text Available For data-centric systems, provenance tracking is particularly important when the system is open and decentralised, such as the Web of Linked Data. In this paper, a concise but expressive calculus which models data updates is presented. The calculus is used to provide an operational semantics for a system where data and updates interact concurrently. The operational semantics of the calculus also tracks the provenance of data with respect to updates. This provides a new formal semantics extending provenance diagrams which takes into account the execution of processes in a concurrent setting. Moreover, a sound and complete model for the calculus based on ideals of series-parallel DAGs is provided. The notion of provenance introduced can be used as a subjective indicator of the quality of data in concurrent interacting systems.
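The core idea of attaching a provenance trail to data as updates are applied can be illustrated with a toy sketch. This is a deliberate simplification: the paper's calculus models *concurrent* updates, with provenance forming series-parallel DAGs rather than the linear trail below; all names here are invented:

```python
from dataclasses import dataclass, field

@dataclass
class Tracked:
    """A data item that records which updates produced its current value."""
    value: object
    provenance: list = field(default_factory=list)  # ordered update ids

    def apply(self, update_id, fn):
        """Apply an update and append its id to the provenance trail."""
        self.value = fn(self.value)
        self.provenance.append(update_id)
        return self

item = Tracked(value={"city": "Paris"})
item.apply("u1", lambda v: {**v, "country": "France"})
item.apply("u2", lambda v: {**v, "population": 2_100_000})
# item.provenance now records that u1 then u2 produced item.value
```

In the concurrent setting of the paper, the trail would instead be a partial order, so that independent updates are not artificially sequenced.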

  18. An Updated AP2 Beamline TURTLE Model

    Energy Technology Data Exchange (ETDEWEB)

Gormley, M.; O'Day, S.

    1991-08-23

    This note describes a TURTLE model of the AP2 beamline. This model was created by D. Johnson and improved by J. Hangst. The authors of this note have made additional improvements which reflect recent element and magnet setting changes. The magnet characteristics measurements and survey data compiled to update the model will be presented. A printout of the actual TURTLE deck may be found in appendix A.

  19. OSPREY Model Development Status Update

    Energy Technology Data Exchange (ETDEWEB)

    Veronica J Rutledge

    2014-04-01

During the processing of used nuclear fuel, volatile radionuclides will be discharged to the atmosphere if no recovery processes are in place to limit their release. The volatile radionuclides of concern are 3H, 14C, 85Kr, and 129I. Methods are being developed, via adsorption and absorption unit operations, to capture these radionuclides. It is necessary to model these unit operations to aid in the evaluation of technologies and in the future development of an advanced used nuclear fuel processing plant. A collaboration between Fuel Cycle Research and Development Offgas Sigma Team member INL and a NEUP grant including ORNL, Syracuse University, and Georgia Institute of Technology has been formed to develop off-gas models and support off-gas research. Georgia Institute of Technology is developing fundamental-level models to describe the equilibrium and kinetics of the adsorption process, which are to be integrated with OSPREY. This report discusses the progress made on expanding OSPREY to handle multiple components and on the integration of macroscale and microscale level models. Also included in this report is a brief OSPREY user guide.

  20. 100% DD Energy Model Update

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2011-06-30

The Miami Science Museum energy model has been used during DD to test the building's potential for energy savings as measured by ASHRAE 90.1-2007 Appendix G. This standard compares the designed building's yearly energy cost with that of a code-compliant building. The building is currently on track to show a 20% or better improvement over the ASHRAE 90.1-2007 Appendix G baseline; this performance would ensure minimum compliance with both LEED 2.2 and the current Florida Energy Code, which both reference a less strict version of ASHRAE 90.1. In addition to being an exercise in energy code compliance, the energy model has been used as a design tool to show the relative performance benefit of individual energy conservation measures (ECMs). These ECMs are areas where the design team has improved upon code-minimum design paths to improve the energy performance of the building. By adding ECMs one at a time to a code-compliant baseline building, the current analysis identifies which ECMs are most effective in helping the building meet its energy performance goals.

  1. Assessment of stochastically updated finite element models using reliability indicator

    Science.gov (United States)

    Hua, X. G.; Wen, Q.; Ni, Y. Q.; Chen, Z. Q.

    2017-01-01

Finite element (FE) model updating techniques have been a viable approach to correcting an initial mathematical model based on test data. Validation of the updated FE models is usually conducted by comparing model predictions with independent test data that have not been used for model updating. This approach of model validation cannot be readily applied in the case of a stochastically updated FE model. In recognizing that structural reliability is a major decision factor throughout the lifecycle of a structure, this study investigates the use of structural reliability as a measure for assessing the quality of stochastically updated FE models. A recently developed perturbation method for stochastic FE model updating is first applied to attain the stochastically updated models by using the measured modal parameters with uncertainty. The reliability index and failure probability for predefined limit states are computed for the initial and the stochastically updated models, respectively, and are compared with those obtained from the 'true' model to assess the quality of the two models. Numerical simulation of a truss bridge is provided as an example. The simulated modal parameters involving different uncertainty magnitudes are used to update an initial model of the bridge. It is shown that the reliability index obtained from the updated model is much closer to the true reliability index than that obtained from the initial model in the case of small uncertainty magnitude; in the case of large uncertainty magnitude, the reliability index computed from the initial model rather than from the updated model is closer to the true value. The present study confirms the usefulness of measurement-calibrated FE models and at the same time also highlights the importance of uncertainty reduction in test data for reliable model updating and reliability evaluation.
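The reliability indicator used in this record is standard: the failure probability Pf for a limit state g < 0 maps to a reliability index beta = -Phi^{-1}(Pf). A minimal Monte Carlo sketch for a linear limit state g = R - S with normal R and S, where beta has the closed form (muR - muS)/sqrt(sigR^2 + sigS^2); all numbers are illustrative:

```python
import random
from statistics import NormalDist

def reliability_mc(mu_r, sig_r, mu_s, sig_s, n=200_000, seed=1):
    """Monte Carlo failure probability and reliability index for the
    limit state g = R - S (failure when g < 0)."""
    rng = random.Random(seed)
    failures = sum(
        rng.gauss(mu_r, sig_r) - rng.gauss(mu_s, sig_s) < 0.0
        for _ in range(n)
    )
    pf = failures / n
    beta = -NormalDist().inv_cdf(pf)  # reliability index
    return pf, beta

pf, beta = reliability_mc(mu_r=300.0, sig_r=30.0, mu_s=200.0, sig_s=40.0)
beta_exact = (300.0 - 200.0) / (30.0**2 + 40.0**2) ** 0.5  # = 2.0
```

Running the initial and the stochastically updated FE models through the same limit-state evaluation and comparing the resulting betas against the 'true' model is the essence of the assessment proposed in the record.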

  2. Update History of This Database - Budding yeast cDNA sequencing project | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

Full Text Available Update History of This Database: Budding yeast cDNA sequencing project. Date / update contents: 2010/03/29 ...

  3. Can model updating tell the truth?

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, F.M.

    1998-02-01

This paper discusses to what extent updating methods may be able to correct finite element models in such a way that the test structure is better simulated. After unifying some of the most popular modal residues used as the basis for optimization algorithms, the relationship between modal residues and model correlation is investigated. This theoretical approach leads to an error estimator that may be implemented to provide an a priori upper bound on a model's predictive quality relative to test data. These estimates, however, assume that a full measurement set is available. Finally, an application example is presented that illustrates the effectiveness of the proposed estimator when fewer measurement points than degrees of freedom are available.
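A widely used correlation measure between test and FE mode shapes, of the kind underlying the modal residues discussed above, is the Modal Assurance Criterion (MAC); the mode shapes below are invented for illustration:

```python
import numpy as np

def mac(phi_a, phi_x):
    """Modal Assurance Criterion between an analytical mode shape phi_a
    and an experimental one phi_x (1.0 = perfectly correlated)."""
    num = abs(np.vdot(phi_a, phi_x)) ** 2
    den = np.vdot(phi_a, phi_a).real * np.vdot(phi_x, phi_x).real
    return float(num / den)

phi_fe = np.array([0.0, 0.5, 0.87, 1.0])      # FE mode shape (hypothetical)
mac_same = mac(phi_fe, 2.5 * phi_fe)           # scaling does not matter
phi_test = np.array([0.05, 0.55, 0.80, 1.0])   # noisy measurement
mac_noisy = mac(phi_fe, phi_test)
```

Because the MAC is scale-invariant, a value near 1.0 indicates correlated shapes even when the test and model modes are normalized differently.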

  4. Update of the SPS Impedance Model

    CERN Document Server

    Salvant, B; Zannini, C; Arduini, G; Berrig, O; Caspers, F; Grudiev, A; Métral, E; Rumolo, G; Shaposhnikova, E; Zotter, B; Migliorati, M; Spataro, B

    2010-01-01

    The beam coupling impedance of the CERN SPS is expected to be one of the limitations to an intensity upgrade of the LHC complex. In order to be able to reduce the SPS impedance, its main contributors need to be identified. An impedance model for the SPS has been gathered from theoretical calculations, electromagnetic simulations and bench measurements of single SPS elements. The current model accounts for the longitudinal and transverse impedance of the kickers, the horizontal and vertical electrostatic beam position monitors, the RF cavities and the 6.7 km beam pipe. In order to assess the validity of this model, macroparticle simulations of a bunch interacting with this updated SPS impedance model are compared to measurements performed with the SPS beam.

  5. Frequency response function-based model updating using Kriging model

    Science.gov (United States)

    Wang, J. T.; Wang, C. J.; Zhao, J. P.

    2017-03-01

An acceleration frequency response function (FRF) based model updating method is presented in this paper, which introduces a Kriging model as a metamodel into the optimization process instead of iterating the finite element analysis directly. The Kriging model is taken as a fast-running model that can reduce solving time and facilitate the application of intelligent algorithms in model updating. The training samples for the Kriging model are generated by design of experiment (DOE); their responses correspond to the difference between the experimental acceleration FRFs and their counterparts from the finite element model (FEM) at selected frequency points. The boundary condition is taken into account, and a two-step DOE method is proposed for reducing the number of training samples. The first step is to select the design variables from the boundary condition; the selected variables are passed to the second step for generating the training samples. The optimization results of the design variables are taken as the updated values of the design variables to calibrate the FEM, and then the analytical FRFs tend to coincide with the experimental FRFs. The proposed method is applied successfully to a composite honeycomb sandwich beam structure; after model updating, the analytical acceleration FRFs match the experimental data significantly better, especially when the damping ratios are adjusted.
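The role of the Kriging metamodel above is to replace expensive FE runs with a cheap interpolator of the FRF discrepancy. A minimal pure-NumPy sketch of simple Kriging (zero-mean Gaussian process with a squared-exponential correlation) on a hypothetical one-variable objective; the kernel length, nugget, and objective are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def kriging_fit(X, y, length=0.3, nugget=1e-10):
    """Fit a zero-mean Kriging (Gaussian-process) interpolator with a
    squared-exponential correlation model; returns a fast predictor."""
    K = np.exp(-((X[:, None] - X[None, :]) / length) ** 2)
    alpha = np.linalg.solve(K + nugget * np.eye(len(X)), y)

    def predict(x_new):
        k = np.exp(-((x_new - X) / length) ** 2)
        return float(k @ alpha)

    return predict

# Hypothetical scalar objective: discrepancy between experimental and
# analytical FRFs as a function of one design variable theta.
def objective(theta):
    return (theta - 0.3) ** 2

X_train = np.linspace(0.0, 1.0, 11)            # DOE samples
y_train = np.array([objective(t) for t in X_train])
surrogate = kriging_fit(X_train, y_train)
pred = surrogate(0.37)  # cheap evaluation inside an optimizer loop
```

An optimizer can then call `surrogate` thousands of times at negligible cost, only occasionally refreshing the DOE with new FE evaluations.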

  6. Enrollment Projection Model.

    Science.gov (United States)

    Gustafson, B. Kerry; Hample, Stephen R.

    General documentation for the Enrollment Projection Model used by the Maryland Council for Higher Education (MCHE) is provided. The manual is directed toward both the potential users of the model as well as others interested in enrollment projections. The first four chapters offer administrators or planners insight into the derivation of the…

  7. MARMOT update for oxide fuel modeling

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Yongfeng [Idaho National Lab. (INL), Idaho Falls, ID (United States); Schwen, Daniel [Idaho National Lab. (INL), Idaho Falls, ID (United States); Chakraborty, Pritam [Idaho National Lab. (INL), Idaho Falls, ID (United States); Jiang, Chao [Idaho National Lab. (INL), Idaho Falls, ID (United States); Aagesen, Larry [Idaho National Lab. (INL), Idaho Falls, ID (United States); Ahmed, Karim [Idaho National Lab. (INL), Idaho Falls, ID (United States); Jiang, Wen [Idaho National Lab. (INL), Idaho Falls, ID (United States); Biner, Bulent [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bai, Xianming [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Tonks, Michael [Pennsylvania State Univ., University Park, PA (United States); Millett, Paul [Univ. of Arkansas, Fayetteville, AR (United States)

    2016-09-01

This report summarizes the lower-length-scale research and development progress in FY16 at Idaho National Laboratory in developing mechanistic materials models for oxide fuels, in parallel to the development of the MARMOT code, which will be summarized in a separate report. This effort is a critical component of the microstructure-based fuel performance modeling approach, supported by the Fuels Product Line in the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. The progress can be classified into three categories: 1) development of materials models to be used in engineering-scale fuel performance modeling regarding the effect of lattice defects on thermal conductivity; 2) development of modeling capabilities for mesoscale fuel behaviors, including stage-3 gas release, grain growth, high-burnup structure, fracture, and creep; and 3) improved understanding of materials science by calculating the anisotropic grain boundary energies in UO2 and obtaining thermodynamic data for solid fission products. Many of these topics are still under active development; they are updated in this report with an appropriate level of detail. For some topics, separate reports are generated in parallel, as stated in the text. These accomplishments have led to a better understanding of fuel behaviors and enhance the capability of the MOOSE-BISON-MARMOT toolkit.

  8. Updating river basin models with radar altimetry

    DEFF Research Database (Denmark)

    Michailovsky, Claire Irene B.

    response of a catchment to meteorological forcing. While river discharge cannot be directly measured from space, radar altimetry (RA) can measure water level variations in rivers at the locations where the satellite ground track and river network intersect called virtual stations or VS. In this PhD study...... been between 10 and 35 days for altimetry missions until now. The location of the VS is also not necessarily the point at which measurements are needed. On the other hand, one of the main strengths of the dataset is its availability in near-real time. These characteristics make radar altimetry ideally...... suited for use in data assimilation frameworks which combine the information content from models and current observations to produce improved forecasts and reduce prediction uncertainty. The focus of the second and third papers of this thesis was therefore the use of radar altimetry as update data...

  9. Operational modal analysis by updating autoregressive model

    Science.gov (United States)

    Vu, V. H.; Thomas, M.; Lakis, A. A.; Marcouiller, L.

    2011-04-01

This paper presents improvements of a multivariable autoregressive (AR) model for applications in operational modal analysis, considering simultaneously the temporal response data of multi-channel measurements. The parameters are estimated by using the least squares method via the implementation of the QR factorization. A new noise rate-based factor called the Noise rate Order Factor (NOF) is introduced for use in the effective selection of model order and noise rate estimation. For the selection of structural modes, an orderwise criterion called the Order Modal Assurance Criterion (OMAC) is used, based on the correlation of mode shapes computed from two successive orders. Specifically, the algorithm is updated with respect to model order from a small value to produce a cost-effective computation. Furthermore, the confidence intervals of each natural frequency, damping ratio, and mode shape are also computed and evaluated with respect to model order and noise rate. This method is thus very effective for identifying modal parameters under ambient vibration, as encountered in modern output-only modal analysis. Simulations and discussions on a steel plate structure are presented, and the experimental results show good agreement with the finite element analysis.
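The least-squares-via-QR step at the heart of the AR parameter estimation can be sketched on a single channel. A minimal example: fit an AR(2) model to a noiseless sinusoid and recover its frequency from the AR characteristic roots (the multivariable, noise-rate, and order-selection machinery of the paper is omitted; all values are illustrative):

```python
import numpy as np

# Simulated single-channel response: an undamped mode at f = 5 Hz
fs = 100.0                   # sampling rate (Hz)
f_true = 5.0
n = np.arange(500)
x = np.cos(2 * np.pi * f_true * n / fs)

# Least-squares AR(2) fit via QR: x[k] = a1*x[k-1] + a2*x[k-2]
A = np.column_stack([x[1:-1], x[:-2]])
b = x[2:]
Q, R = np.linalg.qr(A)
a1, a2 = np.linalg.solve(R, Q.T @ b)

# Natural frequency from the AR characteristic roots z = exp(+/- i*2*pi*f/fs)
roots = np.roots([1.0, -a1, -a2])
f_est = abs(np.angle(roots[0])) * fs / (2 * np.pi)
```

For damped responses the root magnitudes additionally yield the damping ratios; with measurement noise, a higher model order and criteria such as the paper's NOF/OMAC are needed to separate structural modes from noise modes.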

  10. Declarative XML Update Language Based on a Higher Data Model

    Institute of Scientific and Technical Information of China (English)

    Guo-Ren Wang; Xiao-Lin Zhang

    2005-01-01

With the extensive use of XML in applications over the Web, how to update XML data is becoming an important issue, because the role of XML has expanded beyond traditional applications in which XML is used for information exchange and data representation over the Web. So far, several languages have been proposed for updating XML data, but they are all based on lower-level, so-called graph-based or tree-based data models. Update requests are thus expressed in a nonintuitive and unnatural way, and update statements are too complicated to comprehend. This paper presents a novel declarative XML update language which is an extension of the XML-RL query language. Compared with other existing XML update languages, it has the following features. First, it is the only XML data manipulation language based on a higher data model. Second, this language can express complex update requests at multiple levels in a hierarchy in a simple and flat way. Third, this language directly supports the functionality of updating complex objects while all other update languages do not support these operations. Lastly, most existing languages use rename to modify attribute and element names, which differs from the way values are updated. The proposed language modifies tag names, values, and objects in a unified way by the introduction of three kinds of logical binding variables: object variables, value variables, and name variables.
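For contrast with the declarative style the record describes, the same kind of multi-level update is typically written imperatively against a tree-based API. A minimal Python sketch using the standard library (the document contents are hypothetical; this does not show XML-RL syntax):

```python
import xml.etree.ElementTree as ET

doc = ET.fromstring(
    "<library>"
    "<book id='b1'><title>Old Title</title><price>10</price></book>"
    "<book id='b2'><title>Other</title><price>12</price></book>"
    "</library>"
)

# Updates at multiple levels of the hierarchy: raise every price,
# then modify one title selected by attribute.
for price in doc.findall("book/price"):
    price.text = str(int(price.text) + 1)
doc.find("book[@id='b1']/title").text = "New Title"

new_title = doc.find("book[@id='b1']/title").text
prices = [p.text for p in doc.findall("book/price")]
```

The paper's point is that a higher data model lets such requests be stated flatly and declaratively, instead of as explicit tree navigation and mutation like the loop above.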

  11. General Separations Area (GSA) Groundwater Flow Model Update: Hydrostratigraphic Data

    Energy Technology Data Exchange (ETDEWEB)

    Bagwell, L. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Bennett, P. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Flach, G. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-02-21

    This document describes the assembly, selection, and interpretation of hydrostratigraphic data for input to an updated groundwater flow model for the General Separations Area (GSA; Figure 1) at the Department of Energy’s (DOE) Savannah River Site (SRS). This report is one of several discrete but interrelated tasks that support development of an updated groundwater model (Bagwell and Flach, 2016).

  12. A Review of Recent Updates of Sea-Level Projections at Global and Regional Scales

    Science.gov (United States)

    Slangen, A. B. A.; Adloff, F.; Jevrejeva, S.; Leclercq, P. W.; Marzeion, B.; Wada, Y.; Winkelmann, R.

    2017-01-01

    Sea-level change (SLC) is a much-studied topic in the area of climate research, integrating a range of climate science disciplines, and is expected to impact coastal communities around the world. As a result, this field is rapidly moving, and the knowledge and understanding of processes contributing to SLC is increasing. Here, we discuss noteworthy recent developments in the projection of SLC contributions and in the global mean and regional sea-level projections. For the Greenland Ice Sheet contribution to SLC, earlier estimates have been confirmed in recent research, but part of the source of this contribution has shifted from dynamics to surface melting. New insights into dynamic discharge processes and the onset of marine ice sheet instability increase the projected range for the Antarctic contribution by the end of the century. The contribution from both ice sheets is projected to increase further in the coming centuries to millennia. Recent updates of the global glacier outline database and new global glacier models have led to slightly lower projections for the glacier contribution to SLC (7-17 cm by 2100), but still project the glaciers to be an important contribution. For global mean sea-level projections, the focus has shifted to better estimating the uncertainty distributions of the projection time series, which may not necessarily follow a normal distribution. Instead, recent studies use skewed distributions with longer tails to higher uncertainties. Regional projections have been used to study regional uncertainty distributions, and regional projections are increasingly being applied to specific regions, countries, and coastal areas.

  13. A Review of Recent Updates of Sea-Level Projections at Global and Regional Scales

    Science.gov (United States)

    Slangen, A. B. A.; Adloff, F.; Jevrejeva, S.; Leclercq, P. W.; Marzeion, B.; Wada, Y.; Winkelmann, R.

    2016-06-01

    Sea-level change (SLC) is a much-studied topic in the area of climate research, integrating a range of climate science disciplines, and is expected to impact coastal communities around the world. As a result, this field is rapidly moving, and the knowledge and understanding of processes contributing to SLC is increasing. Here, we discuss noteworthy recent developments in the projection of SLC contributions and in the global mean and regional sea-level projections. For the Greenland Ice Sheet contribution to SLC, earlier estimates have been confirmed in recent research, but part of the source of this contribution has shifted from dynamics to surface melting. New insights into dynamic discharge processes and the onset of marine ice sheet instability increase the projected range for the Antarctic contribution by the end of the century. The contribution from both ice sheets is projected to increase further in the coming centuries to millennia. Recent updates of the global glacier outline database and new global glacier models have led to slightly lower projections for the glacier contribution to SLC (7-17 cm by 2100), but still project the glaciers to be an important contribution. For global mean sea-level projections, the focus has shifted to better estimating the uncertainty distributions of the projection time series, which may not necessarily follow a normal distribution. Instead, recent studies use skewed distributions with longer tails to higher uncertainties. Regional projections have been used to study regional uncertainty distributions, and regional projections are increasingly being applied to specific regions, countries, and coastal areas.

  14. A Continuously Updated, Global Land Classification Map Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to demonstrate a fully automatic capability for generating a global, high resolution (30 m) land classification map, with continuous updates from...

  15. Finite element model updating using bayesian framework and modal properties

    CSIR Research Space (South Africa)

    Marwala, T

    2005-01-01

    Full Text Available Finite element (FE) models are widely used to predict the dynamic characteristics of aerospace structures. These models often give results that differ from measured results and therefore need to be updated to match measured results. Some...
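The flavor of Bayesian calibration described in this record can be sketched for a single parameter: update a normalized stiffness k from one measured natural frequency using a grid-based posterior. The model f(k) = f0*sqrt(k), the prior, and all numbers are invented for illustration, not taken from the study:

```python
import numpy as np

# Hypothetical one-parameter example: update a normalized stiffness k
# from a measured natural frequency, with model f(k) = f0 * sqrt(k).
f0, sigma_f = 10.0, 0.2   # nominal frequency (Hz), measurement std
f_meas = 10.5             # measured frequency (Hz)

k = np.linspace(0.5, 1.5, 2001)                         # parameter grid
prior = np.exp(-0.5 * ((k - 1.0) / 0.2) ** 2)           # Gaussian prior on k
like = np.exp(-0.5 * ((f_meas - f0 * np.sqrt(k)) / sigma_f) ** 2)
post = prior * like
post /= post.sum()                                      # discrete normalization

k_map = k[np.argmax(post)]  # posterior mode: the updated stiffness
```

The posterior mode lands between the prior mean (1.0) and the value implied by the measurement alone (about 1.10), weighted by the relative precisions, which is the basic trade-off any Bayesian FE updating framework encodes.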

  16. Reservoir management under geological uncertainty using fast model update

    NARCIS (Netherlands)

    Hanea, R.; Evensen, G.; Hustoft, L.; Ek, T.; Chitu, A.; Wilschut, F.

    2015-01-01

    Statoil is implementing "Fast Model Update (FMU)," an integrated and automated workflow for reservoir modeling and characterization. FMU connects all steps and disciplines from seismic depth conversion to prediction and reservoir management taking into account relevant reservoir uncertainty. FMU del

  17. Update Legal Documents Using Hierarchical Ranking Models and Word Clustering

    OpenAIRE

    Pham, Minh Quang Nhat; Nguyen, Minh Le; Shimazu, Akira

    2010-01-01

Our research addresses the task of updating legal documents when new information emerges. In this paper, we employ a hierarchical ranking model for the task of updating legal documents. Word clustering features are incorporated into the ranking models to exploit semantic relations between words. Experimental results on legal data built from the United States Code show that the hierarchical ranking model with word clustering outperforms baseline methods using the Vector Space Model, and word cluster-based ...

  18. Structural Finite Element Model Updating Using Vibration Tests and Modal Analysis for NPL footbridge - SHM demonstrator

    Science.gov (United States)

    Barton, E.; Middleton, C.; Koo, K.; Crocker, L.; Brownjohn, J.

    2011-07-01

This paper presents the results from collaboration between the National Physical Laboratory (NPL) and the University of Sheffield on an ongoing research project at NPL. A 50-year-old reinforced concrete footbridge has been converted to a full-scale structural health monitoring (SHM) demonstrator. The structure is monitored using a variety of techniques; however, interrelating results and converting data to knowledge are not possible without a reliable numerical model. During the first stage of the project, the work concentrated on static loading, and an FE model of the undamaged bridge was created, and updated, under specified static loading and temperature conditions. This model was found to accurately represent the response under static loading and was used to identify locations for sensor installation. The next stage involves the evaluation of repair/strengthening patches under both static and dynamic loading. Therefore, before deliberately introducing significant damage, the first set of dynamic tests was conducted and modal properties were estimated. The measured modal properties did not match the modal analysis from the statically updated FE model; it was clear that the existing model required updating. This paper introduces the results of the dynamic testing and model updating. It is shown that the structure exhibits large nonlinear, amplitude-dependent characteristics. This creates a difficult updating process, but we attempt to produce the best linear representation of the structure. A sensitivity analysis is performed to determine the most sensitive locations for planned damage/repair scenarios and is used to decide whether additional sensors will be necessary.

  19. Structural Finite Element Model Updating Using Vibration Tests and Modal Analysis for NPL footbridge - SHM demonstrator

    Energy Technology Data Exchange (ETDEWEB)

    Barton, E; Crocker, L [Structural health monitoring, National Physical Laboratory, Hampton Road, Teddington, Middlesex, TW11 0LW (United Kingdom); Middleton, C; Koo, K; Brownjohn, J, E-mail: elena.barton@npl.co.uk, E-mail: C.J.Middleton@sheffield.ac.uk, E-mail: k.koo@sheffield.ac.uk, E-mail: louise.crocker@npl.co.uk, E-mail: j.brownjohn@sheffield.ac.uk [University of Sheffield, Department of Civil and Structural Engineering, Vibration Engineering Research Section, Sir Frederick Mappin Building Mappin Street, Sheffield, S1 3JD (United Kingdom)

    2011-07-19

This paper presents the results from collaboration between the National Physical Laboratory (NPL) and the University of Sheffield on an ongoing research project at NPL. A 50-year-old reinforced concrete footbridge has been converted to a full-scale structural health monitoring (SHM) demonstrator. The structure is monitored using a variety of techniques; however, interrelating results and converting data to knowledge are not possible without a reliable numerical model. During the first stage of the project, the work concentrated on static loading, and an FE model of the undamaged bridge was created, and updated, under specified static loading and temperature conditions. This model was found to accurately represent the response under static loading and was used to identify locations for sensor installation. The next stage involves the evaluation of repair/strengthening patches under both static and dynamic loading. Therefore, before deliberately introducing significant damage, the first set of dynamic tests was conducted and modal properties were estimated. The measured modal properties did not match the modal analysis from the statically updated FE model; it was clear that the existing model required updating. This paper introduces the results of the dynamic testing and model updating. It is shown that the structure exhibits large nonlinear, amplitude-dependent characteristics. This creates a difficult updating process, but we attempt to produce the best linear representation of the structure. A sensitivity analysis is performed to determine the most sensitive locations for planned damage/repair scenarios and is used to decide whether additional sensors will be necessary.

  20. EPA Project Updates: DSSTox and ToxCast Generating New ...

    Science.gov (United States)

    EPA's National Center for Computational Toxicology is building capabilities to support a new paradigm for toxicity screening and prediction. The DSSTox project is improving public access to quality structure-annotated chemical toxicity information in less summarized forms than traditionally employed in SAR modeling, and in ways that facilitate data mining and data read-across. The DSSTox Structure-Browser, launched in September 2007, provides structure searchability across the entire published DSSTox toxicity-related inventory and is enabling linkages between previously isolated toxicity data resources. As of early March 2008, the public DSSTox inventory has been integrated into PubChem, allowing a user to take full advantage of PubChem structure-activity and bioassay clustering features. The most recent DSSTox version of the Carcinogenic Potency Database file (CPDBAS) illustrates ways in which various summary definitions of carcinogenic activity can be employed in modeling and data mining. Phase I of the ToxCast project is generating high-throughput screening data from several hundred biochemical and cell-based assays for a set of 320 chemicals, mostly pesticide actives, with rich toxicology profiles. Incorporating and expanding traditional SAR concepts into this new high-throughput, data-rich world poses conceptual and practical challenges, but also holds great promise for improving predictive capabilities.

  1. Model updating of nonlinear structures from measured FRFs

    Science.gov (United States)

    Canbaloğlu, Güvenç; Özgüven, H. Nevzat

    2016-12-01

    There are always discrepancies between the modal and response data of a structure obtained from its mathematical model and those measured experimentally. It is therefore general practice to update the theoretical model using experimental measurements in order to obtain a more accurate model. Most of the model updating methods used in structural dynamics are for linear systems. In real-life applications, however, most structures have nonlinearities, which prevent the application of model updating techniques developed for linear structures unless the structures operate in their linear range. Well-established frequency response function (FRF) based model updating methods could easily be extended to a nonlinear system if the FRFs of the underlying linear system (linear FRFs) could be measured experimentally. When a frictional type of nonlinearity coexists with other types of nonlinearities, it is not possible to obtain linear FRFs experimentally by using low-level forcing. In this study a method, named the Pseudo Receptance Difference (PRD) method, is presented for obtaining the linear FRFs of a nonlinear structure having multiple nonlinearities, including friction-type nonlinearity. The PRD method calculates the linear FRFs of a nonlinear structure by using FRFs measured at various forcing levels, and simultaneously identifies all nonlinearities in the system. Any model updating method can then be used to update the linear part of the mathematical model. In the present work, the PRD method is used to predict the linear FRFs from measured nonlinear FRFs, and the inverse eigensensitivity method is employed to update the linear finite element (FE) model of the nonlinear structure. The proposed method is validated with different case studies using a nonlinear lumped single-degree-of-freedom system, as well as a continuous system. Finally, a real nonlinear T-beam test structure is used to show the application and the accuracy of the proposed method. The accuracy of the updated nonlinear model of the
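    The linear receptance targeted by such FRF-based methods can be illustrated with the simplest possible case. Below is a minimal sketch (not the PRD method itself) of the receptance FRF of a single-degree-of-freedom linear system; the parameter values m, c, k are arbitrary illustrations:

    ```python
    import math

    def receptance(omega, m, c, k):
        """Linear receptance FRF H(w) = X/F of the SDOF system m*x'' + c*x' + k*x = f."""
        return 1.0 / complex(k - m * omega ** 2, c * omega)

    # Example: m = 1 kg, k = 100 N/m (natural frequency 10 rad/s), light damping
    m, c, k = 1.0, 0.4, 100.0
    wn = math.sqrt(k / m)
    H_res = receptance(wn, m, c, k)      # response at resonance, limited only by damping
    H_static = receptance(0.0, m, c, k)  # static compliance 1/k

    print(abs(H_res), abs(H_static))
    ```

    At resonance the magnitude reduces to 1/(c*wn), which is why low-level forcing is normally needed to keep a real structure in this linear regime.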

  2. Update of the Z Refurbishment project (ZR) at Sandia National Laboratories.

    Energy Technology Data Exchange (ETDEWEB)

    Moncayo, Carla; Bloomquist, Douglas D.; Weed, John Woodruff; Tabor, Debra Ann; Donovan, Guy Louis; McKee, G. Randall; Weinbrecht, Edward A.; Faturos, Thomas V.; McDaniel, Dillon Heirman

    2007-08-01

    Sandia's Z Refurbishment (ZR) Project formally began in February 2002 to increase the Z Accelerator's utilization by providing the capability to perform more shots, improve precision and pulse shape variability, and increase delivered current. A project update was provided at the 15th International Pulsed Power Conference in 2005. The Z facility was shut down in July 2006 for structural/infrastructure modifications and installation of new pulsed power systems. The refurbishment will conclude in 2007. This paper provides a status update of the project covering the past 2 years of activities.

  3. 76 FR 18214 - Idaho Power; Notice of Availability of Land Management Plan Update for the Shoshone Falls Project...

    Science.gov (United States)

    2011-04-01

    ... a Land Management Plan (LMP) update for the project. The LMP is a comprehensive plan to manage... Shoshone Falls Project and Soliciting Comments, Motions To Intervene, and Protests Take notice that the... inspection: a. Application Type: Land Management Plan Update. b. Project No.: 2055-087. c. Date Filed...

  4. 76 FR 18213 - Idaho Power; Notice of Availability of Land Management Plan Update for the Shoshone Falls Project...

    Science.gov (United States)

    2011-04-01

    ... a Land Management Plan (LMP) update for the project. The LMP is a comprehensive plan to manage... Shoshone Falls Project and Soliciting Comments, Motions To Intervene, and Protests Take notice that the... inspection: a. Application Type: Land Management Plan Update. b. Project No.: 2778-062. c. Date Filed...

  5. Structural Dynamics Model Updating with Positive Definiteness and No Spillover

    Directory of Open Access Journals (Sweden)

    Yongxin Yuan

    2014-01-01

    Full Text Available Model updating is a common method to improve the correlation between structural dynamics models and measured data. In conducting the updating, it is desirable to match only the measured spectral data without tampering with the other unmeasured and unknown eigeninformation in the original model (if so, the model is said to be updated with no spillover) and to maintain the positive definiteness of the coefficient matrices. In this paper, an efficient numerical method for updating mass and stiffness matrices simultaneously is presented. The method first updates the modal frequencies. A method is then presented to construct a transformation matrix, which is used to correct the analytical eigenvectors so that the updated model is compatible with the measured eigenvectors. The method preserves both no spillover and the symmetric positive definiteness of the mass and stiffness matrices, and it is computationally efficient, as neither iteration nor numerical optimization is required. The numerical example shows that the presented method is quite accurate and efficient.
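    As a small illustration of the positive-definiteness requirement emphasized above (this is only the check an updated model must pass, not the paper's updating algorithm; the matrices are made-up examples):

    ```python
    import numpy as np

    def is_spd(A, tol=1e-10):
        """Check symmetric positive definiteness via a Cholesky attempt."""
        if not np.allclose(A, A.T, atol=tol):
            return False
        try:
            np.linalg.cholesky(A)  # succeeds iff A is positive definite
            return True
        except np.linalg.LinAlgError:
            return False

    K = np.array([[250.0, -150.0], [-150.0, 150.0]])  # a plausible stiffness matrix
    dK = np.array([[5.0, -2.0], [-2.0, 3.0]])         # a small symmetric correction
    print(is_spd(K), is_spd(K + dK))
    ```

    An updating scheme without this property could return matrices that are numerically optimal yet physically meaningless.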

  6. Nonlinear structural finite element model updating and uncertainty quantification

    Science.gov (United States)

    Ebrahimian, Hamed; Astroza, Rodrigo; Conte, Joel P.

    2015-04-01

    This paper presents a framework for nonlinear finite element (FE) model updating, in which state-of-the-art nonlinear structural FE modeling and analysis techniques are combined with the maximum likelihood estimation method (MLE) to estimate time-invariant parameters governing the nonlinear hysteretic material constitutive models used in the FE model of the structure. The estimation uncertainties are evaluated based on the Cramer-Rao lower bound (CRLB) theorem. A proof-of-concept example, consisting of a cantilever steel column representing a bridge pier, is provided to verify the proposed nonlinear FE model updating framework.
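    The pairing of maximum likelihood estimation with the Cramer-Rao lower bound can be sketched on a scalar toy problem (not the paper's FE framework; all numbers are invented):

    ```python
    import random
    import statistics

    # Estimate the mean of Gaussian data with known noise sigma.
    random.seed(0)
    theta_true, sigma, n = 2.5, 0.3, 400
    data = [random.gauss(theta_true, sigma) for _ in range(n)]

    # For a Gaussian likelihood with known sigma, the MLE of the mean is the sample mean
    theta_hat = statistics.fmean(data)

    # Cramer-Rao lower bound: no unbiased estimator can have variance below sigma^2 / n
    crlb = sigma ** 2 / n

    print(theta_hat, crlb)
    ```

    In the paper's setting the "data" are measured structural responses and the parameters enter through a nonlinear FE model, but the same two quantities, the likelihood maximizer and the CRLB on its variance, are what is reported.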

  7. Coupled vibro-acoustic model updating using frequency response functions

    Science.gov (United States)

    Nehete, D. V.; Modak, S. V.; Gupta, K.

    2016-03-01

    Interior noise in cavities of motorized vehicles is of increasing significance due to the lightweight design of these structures. Accurate coupled vibro-acoustic FE models of such cavities are required so as to allow a reliable design and analysis. It is, however, experienced that the vibro-acoustic predictions using these models do not often correlate acceptably well with the experimental measurements and hence require model updating. Both the structural and the acoustic parameters addressing the stiffness as well as the damping modeling inaccuracies need to be considered simultaneously in the model updating framework in order to obtain an accurate estimate of these parameters. It is also noted that the acoustic absorption properties are generally frequency dependent. This makes use of modal data based methods for updating vibro-acoustic FE models difficult. In view of this, the present paper proposes a method based on vibro-acoustic frequency response functions that allow updating of a coupled FE model by considering simultaneously the parameters associated with both the structural as well as the acoustic model of the cavity. The effectiveness of the proposed method is demonstrated through numerical studies on a 3D rectangular box cavity with a flexible plate. Updating parameters related to the material property, stiffness of joints between the plate and the rectangular cavity and the properties of absorbing surfaces of the acoustic cavity are considered. The robustness of the method under presence of noise is also studied.

  8. An update on projection methods for transient incompressible viscous flow

    Energy Technology Data Exchange (ETDEWEB)

    Gresho, P.M.; Chan, S.T.

    1995-07-01

    The biharmonic equation (for the pressure) and the concomitant "biharmonic miracle", which arises when transient incompressible viscous flow is solved approximately by a projection method, were introduced in 1990. Herein is introduced the "biharmonic catastrophe" that sometimes occurs with these same projection methods.

  9. Baseline projections of transportation energy consumption by mode: 1981 update

    Energy Technology Data Exchange (ETDEWEB)

    Millar, M; Bunch, J; Vyas, A; Kaplan, M; Knorr, R; Mendiratta, V; Saricks, C

    1982-04-01

    A comprehensive set of activity and energy-demand projections for each of the major transportation modes and submodes is presented. Projections are developed for a business-as-usual scenario, which provides a benchmark for assessing the effects of potential conservation strategies. This baseline scenario assumes a continuation of present trends, including fuel-efficiency improvements likely to result from current efforts of vehicle manufacturers. Because of anticipated changes in fuel efficiency, fuel price, modal shifts, and a lower-than-historic rate of economic growth, projected growth rates in transportation activity and energy consumption depart from historic patterns. The text discusses the factors responsible for this departure, documents the assumptions and methodologies used to develop the modal projections, and compares the projections with other efforts.

  10. The International Reference Ionosphere: Model Update 2016

    Science.gov (United States)

    Bilitza, Dieter; Altadill, David; Reinisch, Bodo; Galkin, Ivan; Shubin, Valentin; Truhlik, Vladimir

    2016-04-01

    The International Reference Ionosphere (IRI) is recognized as the official standard for the ionosphere (COSPAR, URSI, ISO) and is widely used for a multitude of different applications as evidenced by the many papers in science and engineering journals that acknowledge the use of IRI (e.g., about 11% of all Radio Science papers each year). One of the shortcomings of the model has been the dependence of the F2 peak height modeling on the propagation factor M(3000)F2. With the 2016 version of IRI, two new models will be introduced for hmF2 that were developed directly based on hmF2 measurements by ionosondes [Altadill et al., 2013] and by COSMIC radio occultation [Shubin, 2015], respectively. In addition IRI-2016 will include an improved representation of the ionosphere during the very low solar activities that were reached during the last solar minimum in 2008/2009. This presentation will review these and other improvements that are being implemented with the 2016 version of the IRI model. We will also discuss recent IRI workshops and their findings and results. One of the most exciting new projects is the development of the Real-Time IRI [Galkin et al., 2012]. We will discuss the current status and plans for the future. Altadill, D., S. Magdaleno, J.M. Torta, E. Blanch (2013), Global empirical models of the density peak height and of the equivalent scale height for quiet conditions, Advances in Space Research 52, 1756-1769, doi:10.1016/j.asr.2012.11.018. Galkin, I.A., B.W. Reinisch, X. Huang, and D. Bilitza (2012), Assimilation of GIRO Data into a Real-Time IRI, Radio Science, 47, RS0L07, doi:10.1029/2011RS004952. Shubin V.N. (2015), Global median model of the F2-layer peak height based on ionospheric radio-occultation and ground-based Digisonde observations, Advances in Space Research 56, 916-928, doi:10.1016/j.asr.2015.05.029.

  11. Update to Core reporting practices in structural equation modeling.

    Science.gov (United States)

    Schreiber, James B

    2016-07-21

    This paper is a technical update to "Core Reporting Practices in Structural Equation Modeling."(1) The content covered includes sample size; missing data; specification and identification of models; estimation method choices; fit and residual concerns; nested, alternative, and equivalent models; and unique issues within the SEM family of techniques.

  12. Three Case Studies in Finite Element Model Updating

    Directory of Open Access Journals (Sweden)

    M. Imregun

    1995-01-01

    Full Text Available This article summarizes the basic formulation of two well-established finite element model (FEM updating techniques for improved dynamic analysis, namely the response function method (RFM and the inverse eigensensitivity method (IESM. Emphasis is placed on the similarities in their mathematical formulation, numerical treatment, and on the uniqueness of the resulting updated models. Three case studies that include welded L-plate specimens, a car exhaust system, and a highway bridge were examined in some detail and measured vibration data were used throughout the investigation. It was experimentally observed that significant dynamic behavior discrepancies existed between some of the nominally identical structures, a feature that makes the task of model updating even more difficult because no unequivocal reference data exist in this particular case. Although significant improvements were obtained in all cases where the updating of the FE model was possible, it was found that the success of the updated models depended very heavily on the parameters used, such as the selection and number of the frequency points for RFM, and the selection of modes and the balancing of the sensitivity matrix for IESM. Finally, the performance of the two methods was compared from general applicability, numerical stability, and computational effort standpoints.

  13. Project SFR 1 SAR-08. Update of priority of FEPs from Project SAFE

    Energy Technology Data Exchange (ETDEWEB)

    Gordon, Anna (Swedish Nuclear Fuel and Waste Management Co., Stockholm (SE)); Loefgren, Martin; Lindgren, Maria (Kemakta Konsult AB, Stockholm (SE)) (eds.)

    2008-03-15

    SFR 1 is a repository for the final disposal of low- and intermediate-level radioactive waste produced at Swedish nuclear power plants, as well as at Swedish industrial, research, and medical facilities. The repository obtained its operating license in March 1988. The aim of Project SFR 1 SAR-08 is to perform an updated safety analysis, according to requirements in the regulations. A major difference from previous safety analyses is that repository safety must now be demonstrated for 100,000 years after repository closure, compared with the 10,000-year time frame of the safety assessment in Project SAFE. Due to the extended time frame, permafrost and glaciation have to be considered in the reference evolution of Project SFR 1 SAR-08. Other rationales for the update are recent input from the authorities concerning SAFE documents and the SFR 1 repository, as well as new data concerning the SFR 1 inventory. This report describes the outcome of revisiting the qualitative FEP (Features, Events and Processes) analysis carried out within Project SAFE for the SFR 1 repository. Every interaction definition, as defined in SAFE, has been examined with the aim of ensuring that the SAFE interaction matrices are also applicable for SAR-08. It was found that this is generally the case, but seven new interactions were defined in order to make the interaction matrices more applicable for SAR-08. The priority of all interactions assigned priority 1, and many interactions assigned priority 2, in SAFE has been carefully examined. The examination has been made in the context of the general initial and boundary conditions that should also form the basis for the SAR-08 main scenario and less probable scenarios. In 48 cases, the priority of the interaction needed upgrading compared to SAFE.
In a majority of these cases, the upgrade is due to the extended time frame of the safety assessment, from 10,000 years in SAFE to 100,000 years in SAR

  14. Updated Conceptual Model for the 300 Area Uranium Groundwater Plume

    Energy Technology Data Exchange (ETDEWEB)

    Zachara, John M.; Freshley, Mark D.; Last, George V.; Peterson, Robert E.; Bjornstad, Bruce N.

    2012-11-01

    The 300 Area uranium groundwater plume in the 300-FF-5 Operable Unit is residual from past discharge of nuclear fuel fabrication wastes to a number of liquid (and solid) disposal sites. The source zones in the disposal sites were remediated by excavation and backfilled to grade, but sorbed uranium remains in deeper, unexcavated vadose zone sediments. In spite of source term removal, the groundwater plume has shown remarkable persistence, with concentrations exceeding the drinking water standard over an area of approximately 1 km2. The plume resides within a coupled vadose zone, groundwater, river zone system of immense complexity and scale. Interactions between geologic structure, the hydrologic system driven by the Columbia River, groundwater-river exchange points, and the geochemistry of uranium contribute to persistence of the plume. The U.S. Department of Energy (DOE) recently completed a Remedial Investigation/Feasibility Study (RI/FS) to document characterization of the 300 Area uranium plume and plan for beginning to implement proposed remedial actions. As part of the RI/FS document, a conceptual model was developed that integrates knowledge of the hydrogeologic and geochemical properties of the 300 Area and controlling processes to yield an understanding of how the system behaves and the variables that control it. Recent results from the Hanford Integrated Field Research Challenge site and the Subsurface Biogeochemistry Scientific Focus Area Project funded by the DOE Office of Science were used to update the conceptual model and provide an assessment of key factors controlling plume persistence.

  15. Model updating in flexible-link multibody systems

    Science.gov (United States)

    Belotti, R.; Caneva, G.; Palomba, I.; Richiedei, D.; Trevisani, A.

    2016-09-01

    The dynamic response of flexible-link multibody systems (FLMSs) can be predicted through nonlinear models based on finite elements, which describe the coupling between rigid-body and elastic behaviour. Their accuracy should be as high as possible for synthesizing controllers and observers; model updating based on experimental measurements is hence necessary. Taking advantage of experimental modal analysis, this work proposes a model updating procedure for FLMSs and applies it experimentally to a planar robot. Several peculiarities of FLMS models must be carefully tackled. On the one hand, nonlinear models of an FLMS should be linearized about static equilibrium configurations. On the other hand, the experimental mode shapes should be corrected to be consistent with the elastic displacements represented in the model, which are defined with respect to a fictitious moving reference (the equivalent rigid-link system). Then, since rotational degrees of freedom are also represented in the model, interpolation of the experimental data should be performed to match the model displacement vector. Model updating is finally cast as an optimization problem in the presence of bounds on the feasible values, also adopting methods to improve the numerical conditioning and to compute meaningful updated inertial and elastic parameters.

  16. Model update mechanism for mean-shift tracking

    Institute of Scientific and Technical Information of China (English)

    Peng Ningsong; Yang Jie; Liu Erqi

    2005-01-01

    In order to solve the model update problem in mean-shift based trackers, a novel mechanism is proposed. A Kalman filter is employed to update the object model by filtering the object kernel histogram using the previous model and the current candidate. A self-tuning method is used to adaptively adjust all parameters of the filters based on analysis of the filtering residuals. In addition, hypothesis testing serves as the criterion for deciding whether to accept the filtering result. The tracker therefore has the ability to handle occlusion and avoid over-updating. The experimental results show that our method can not only keep up with changes in object appearance and scale but is also robust to occlusion.
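    The filtering step described above can be sketched per histogram bin with a scalar Kalman filter; the noise variances Q and R and the histograms below are invented for illustration:

    ```python
    def kalman_update(model, candidate, P, Q, R):
        """One scalar Kalman step per histogram bin: blend the previous model with
        the current candidate histogram, weighted by the Kalman gain."""
        P_pred = P + Q                      # predict: the model drifts with variance Q
        K = P_pred / (P_pred + R)           # Kalman gain (shared by every bin here)
        updated = [mu + K * (z - mu) for mu, z in zip(model, candidate)]
        return updated, (1 - K) * P_pred    # updated model and error variance

    model = [0.2, 0.5, 0.3]                 # previous kernel histogram of the target
    candidate = [0.25, 0.45, 0.30]          # histogram of the current candidate region
    new_model, P = kalman_update(model, candidate, P=0.01, Q=0.001, R=0.05)
    print(new_model)
    ```

    Each updated bin lands between the old model and the candidate, which is the gradual appearance adaptation the abstract refers to; the self-tuning and hypothesis-testing steps that gate this update are not shown.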

  17. New York State Energy Research and Development Authority. Research projects' update project status as of March 31, 1997

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-07-01

    This report provides an update of the New York State Energy Research and Development Authority (NYSERDA) program. The NYSERDA research and development program has five major areas: industry, buildings, energy resources, transportation, and environment. NYSERDA organizes projects within these five major areas based on energy use and supply, and end-use sectors. Therefore, issues such as waste management, energy products and renewable energy technologies are addressed in several areas of the program. The project descriptions presented are organized within the five program areas. Descriptions of projects completed between the period April 1, 1996, and March 31, 1997, including technology-transfer activities, are at the end of each subprogram section.

  18. Update of the Solar Concentrator Advanced Development Project

    Science.gov (United States)

    Corrigan, Robert D.; Peterson, Todd T.; Ehresman, Derik T.

    1989-01-01

    The Solar Concentrator Advanced Development Project, which has achieved the successful design, fabrication, and testing of a full-scale prototypical solar dynamic concentrator, is discussed. The design and fabrication process are summarized, and the test results for the reflective facet optical performance and the concentrator structural repeatability are reported. Initial testing of structural repeatability of a seven panel portion of the concentrator was followed by assembly and testing of the full nineteen-panel structure. The testing, which consisted of theodolite and optical measurements over an assembly-disassembly-reassembly cycle, demonstrated that the concentrator maintained the as-built contour and optical characteristics. The facet development effort, which entailed developing a vapor-deposited reflective facet, produced a viable design with demonstrated optical characteristics that are within the project goals.

  19. NCBI Reference Sequence project: update and current status.

    Science.gov (United States)

    Pruitt, Kim D; Tatusova, Tatiana; Maglott, Donna R

    2003-01-01

    The goal of the NCBI Reference Sequence (RefSeq) project is to provide the single best non-redundant and comprehensive collection of naturally occurring biological molecules, representing the central dogma. Nucleotide and protein sequences are explicitly linked on a residue-by-residue basis in this collection. Ideally all molecule types will be available for each well-studied organism, but the initial database collection pragmatically includes only those molecules and organisms that are most readily identified. Thus different amounts of information are available for different organisms at any given time. Furthermore, for some organisms additional intermediate records are provided when the genome sequence is not yet finished. The collection is supplied by NCBI through three distinct pipelines in addition to collaborations with community groups. The collection is curated on an ongoing basis. Additional information about the NCBI RefSeq project is available at http://www.ncbi.nih.gov/RefSeq/.

  20. Finite Element Model Updating Using Response Surface Method

    CERN Document Server

    Marwala, Tshilidzi

    2007-01-01

    This paper proposes the response surface method for finite element model updating. The response surface method is implemented by approximating the finite element model surface response equation with a multi-layer perceptron. The updated parameters of the finite element model were calculated using a genetic algorithm by optimizing the surface response equation. The proposed method was compared to existing methods that use simulated annealing or a genetic algorithm together with a full finite element model for finite element model updating. The proposed method was tested on an unsymmetrical H-shaped structure. It was observed that the proposed method gave updated natural frequencies and mode shapes that were of the same order of accuracy as those given by simulated annealing and the genetic algorithm. Furthermore, it was observed that the response surface method achieved these results at a computational speed that was more than 2.5 times as fast as the genetic algorithm and a full finite element model and 24 ti...
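    The idea of replacing repeated FE runs with a cheap surrogate can be sketched as follows; here a quadratic polynomial and a grid search stand in for the paper's multi-layer perceptron and genetic algorithm, and the SDOF "model" and measured frequency are invented:

    ```python
    import numpy as np

    # "Measured" natural frequency the model should reproduce (hypothetical value)
    f_measured = 10.2

    # DOE: evaluate the "FE model" (here just sqrt(k) for an SDOF with unit mass)
    # at a few stiffness values
    ks = np.array([80.0, 90.0, 100.0, 110.0, 120.0])
    fs = np.sqrt(ks)

    # Response surface: a quadratic fit replacing the expensive FE model
    surface = np.poly1d(np.polyfit(ks, fs, 2))

    # Update: search the cheap surface for the stiffness matching the measurement
    k_grid = np.linspace(80.0, 120.0, 4001)
    k_updated = k_grid[np.argmin((surface(k_grid) - f_measured) ** 2)]
    print(k_updated)
    ```

    The speed-up reported in the abstract comes from exactly this substitution: the optimizer queries the surrogate thousands of times while the real model is evaluated only at the DOE points.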

  1. Application of firefly algorithm to the dynamic model updating problem

    Science.gov (United States)

    Shabbir, Faisal; Omenzetter, Piotr

    2015-04-01

    Model updating can be considered a branch of optimization problems in which calibration of the finite element (FE) model is undertaken by comparing the modal properties of the actual structure with those of the FE predictions. The attainment of a global solution in a multidimensional search space is a challenging problem. Nature-inspired algorithms have gained increasing attention in the past decade for solving such complex optimization problems. This study applies the novel Firefly Algorithm (FA), a global optimization search technique, to a dynamic model updating problem; to the authors' best knowledge, this is the first time the FA has been applied to model updating. The working of the FA is inspired by the flashing characteristics of fireflies. Each firefly represents a randomly generated solution which is assigned a brightness according to the value of the objective function. The physical structure under consideration is a full-scale cable-stayed pedestrian bridge with a composite bridge deck. Data from dynamic testing of the bridge were used to correlate and update the initial model using the FA. The algorithm aimed at minimizing the difference between the natural frequencies and mode shapes of the structure. The performance of the algorithm in finding the optimal solution in a multidimensional search space is analyzed. The paper concludes with an investigation of the efficacy of the algorithm in obtaining a reference finite element model that correctly represents the as-built original structure.
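    The attraction-toward-brighter-fireflies mechanics can be sketched on a toy objective (a simple sphere function standing in for the frequency/mode-shape discrepancy; all constants are arbitrary choices, not the paper's tuning):

    ```python
    import math
    import random

    random.seed(1)

    def objective(x):
        """Stand-in discrepancy measure: the sphere function, minimum 0 at the origin."""
        return sum(xi * xi for xi in x)

    def firefly(n=15, dim=2, iters=60, beta0=1.0, gamma=1.0, alpha=0.2):
        pop = [[random.uniform(-2, 2) for _ in range(dim)] for _ in range(n)]
        for _ in range(iters):
            bright = [-objective(x) for x in pop]        # brighter = lower objective
            for i in range(n):
                for j in range(n):
                    if bright[j] > bright[i]:            # move firefly i toward brighter j
                        r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                        beta = beta0 * math.exp(-gamma * r2)   # attraction decays with distance
                        pop[i] = [a + beta * (b - a) + alpha * (random.random() - 0.5)
                                  for a, b in zip(pop[i], pop[j])]
            alpha *= 0.97                                # cool down the random walk
        return min(pop, key=objective)

    best = firefly()
    print(best, objective(best))
    ```

    Because every firefly is drawn toward all brighter ones rather than a single global best, the swarm can explore several basins at once, which is the property that makes the method attractive for multimodal updating problems.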

  2. LVRF fuel bundle manufacture for Bruce - project update

    Energy Technology Data Exchange (ETDEWEB)

    Pant, A. [Zircatec Precision Industries, Port Hope, Ontario (Canada)

    2005-07-01

    In response to the Power Uprate program at Bruce Power, Zircatec has committed to introduce, by Spring 2006, a new manufacturing line for the production of 43-element Bruce LVRF bundles containing Slightly Enriched Uranium (SEU) with a centre pin of blended dysprosia/urania (BDU). This is a new fuel design and the first change in fuel design since the introduction of the current 37-element fuel over 20 years ago. Introduction of this new line has involved significant changes to an environment that is not used to rapid changes with significant impact. At ZPI we have been able to build on our innovative capabilities in new fuel manufacturing, the strength and experience of our core team, and our prevailing management philosophy of 'support the doer'. The presentation will discuss some of the novel aspects of this fuel introduction and the mix of innovative and classical project management methods being used to ensure that project deliverables are met. Supporting presentations will highlight some of the issues in more detail. (author)

  3. Model updating of a full-scale FE model with nonlinear constraint equations and sensitivity-based cluster analysis for updating parameters

    Science.gov (United States)

    Jang, Jinwoo; Smyth, Andrew W.

    2017-01-01

    The objective of structural model updating is to reduce inherent modeling errors in Finite Element (FE) models due to simplifications, idealized connections, and uncertainties in material properties. Updated FE models, which have fewer discrepancies with the real structures, give more precise predictions of dynamic behavior for future analyses. However, model updating becomes more difficult when applied to civil structures with a large number of structural components and complicated connections. In this paper, a full-scale FE model of a major long-span bridge has been updated for improved consistency with real measured data. Two methods are applied to improve the model updating process. The first method focuses on improving the agreement of the updated mode shapes with the measured data. A nonlinear inequality constraint equation is added to the optimization procedure, providing the capability to keep updated mode shapes in reasonable agreement with those observed. An interior point algorithm deals with nonlinearity in the objective function and constraints. The second method finds efficient updating parameters in a more systematic way. The selection of updating parameters in FE models is essential for a successful updating result because the parameters are directly related to the modal properties of dynamic systems. An in-depth sensitivity analysis is carried out in an effort to precisely understand the effects of physical parameters in the FE model on the natural frequencies. Based on the sensitivity analysis, cluster analysis is conducted to find an efficient set of updating parameters.
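    The kind of frequency-versus-parameter sensitivity analysis described can be sketched on a toy 2-DOF system, using finite differences in place of analytic sensitivities; the spring stiffnesses are invented values:

    ```python
    import numpy as np

    def stiffness(k1, k2):
        """2-DOF spring-mass chain with unit masses: K assembled from two springs."""
        return np.array([[k1 + k2, -k2],
                         [-k2,      k2]])

    def frequencies(k1, k2):
        # With M = I the generalized eigenproblem reduces to a standard symmetric one
        return np.sqrt(np.linalg.eigvalsh(stiffness(k1, k2)))  # rad/s, ascending

    # Central finite-difference sensitivity of each frequency to each stiffness
    k0, h = np.array([100.0, 150.0]), 1e-5
    S = np.zeros((2, 2))
    for j in range(2):
        dk = np.zeros(2)
        dk[j] = h
        S[:, j] = (frequencies(*(k0 + dk)) - frequencies(*(k0 - dk))) / (2 * h)

    print(S)  # S[i, j] = d(freq_i)/d(k_j)
    ```

    Parameters whose column of S is large move the measured frequencies efficiently and are therefore good updating candidates; clustering rows or columns of such a matrix is one way to arrive at the compact parameter sets the paper describes.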

  4. Knowledge Model: Project Knowledge Management

    DEFF Research Database (Denmark)

    Durao, Frederico; Dolog, Peter; Grolin, Daniel

    2009-01-01

    The Knowledge model for project management serves several goals:Introducing relevant concepts of project management area for software development (Section 1). Reviewing and understanding the real case requirements from the industrial perspective. (Section 2). Giving some preliminary suggestions f...

  6. Crushed-salt constitutive model update

    Energy Technology Data Exchange (ETDEWEB)

    Callahan, G.D.; Loken, M.C.; Mellegard, K.D. [RE/SPEC Inc., Rapid City, SD (United States); Hansen, F.D. [Sandia National Labs., Albuquerque, NM (United States)

    1998-01-01

    Modifications to the constitutive model used to describe the deformation of crushed salt are presented in this report. Two mechanisms, dislocation creep and grain-boundary diffusional pressure solutioning, which were defined previously but used separately, are combined to form the basis for the constitutive model governing the deformation of crushed salt. The constitutive model is generalized to represent three-dimensional states of stress. New creep consolidation tests are combined with an existing database that includes hydrostatic consolidation and shear consolidation tests conducted on Waste Isolation Pilot Plant and southeastern New Mexico salt to determine material parameters for the constitutive model. Nonlinear least-squares model fitting to data from the shear consolidation tests and a combination of the shear and hydrostatic consolidation tests produced two sets of material parameter values for the model. The change in material parameter values from test group to test group indicates the empirical nature of the model but demonstrates improvement over earlier work with the previous models. Key improvements are the ability to capture lateral strain reversal and better resolution of parameter values. To demonstrate the predictive capability of the model, each parameter value set was used to predict each of the tests in the database. Based on the fitting statistics and the ability of the model to predict the test data, the model appears to capture the creep consolidation behavior of crushed salt quite well.
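    The nonlinear least-squares fitting step can be illustrated in miniature. The consolidation law below is a hypothetical stand-in (a transient term plus a steady creep term), not the actual crushed-salt constitutive model; `scipy.optimize.least_squares` plays the role of the fitting procedure used to extract material parameters from the test database.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 50)

# Hypothetical consolidation law: transient + steady-state creep terms.
def strain(p, t):
    A, B, C = p
    return A * (1.0 - np.exp(-B * t)) + C * t

# Synthetic "test data" from assumed true parameters, with measurement noise.
p_true = np.array([0.08, 0.9, 0.004])
data = strain(p_true, t) + rng.normal(0.0, 1e-4, t.size)

# Nonlinear least squares on the residuals recovers the material parameters.
fit = least_squares(lambda p: strain(p, t) - data, x0=[0.01, 0.1, 0.01])
print(fit.x)
```

    Fitting the same functional form to different test groups (shear-only versus shear plus hydrostatic) would yield different parameter sets, mirroring the empirical character noted in the abstract.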

  7. A Successive Selection Method for finite element model updating

    Science.gov (United States)

    Gou, Baiyong; Zhang, Weijie; Lu, Qiuhai; Wang, Bo

    2016-03-01

    A Finite Element (FE) model can be updated effectively and efficiently by using the Response Surface Method (RSM). However, this often involves performance trade-offs, such as a high computational cost for better accuracy or a loss of efficiency when many design parameters must be updated. This paper proposes a Successive Selection Method (SSM), which is based on the linear Response Surface (RS) function and orthogonal design. SSM rewrites the linear RS function into a number of linear equations to adjust the Design of Experiment (DOE) after every FE calculation. SSM aims to interpret the implicit information provided by the FE analysis, to locate the DOE points more quickly and accurately, and thereby to alleviate the computational burden. This paper introduces the SSM and its application, describes the solution steps of point selection for the DOE in detail, and analyzes SSM's high efficiency and accuracy in FE model updating. A numerical example of a simply supported beam and a practical example of a vehicle brake disc show that the SSM can provide higher speed and precision in FE model updating for engineering problems than the traditional RSM.
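    The linear response-surface step underlying such methods can be sketched generically (this is plain linear RSM, not the SSM itself): evaluate the "FE analysis" at a few DOE points around the current estimate, fit a linear RS, and solve the resulting linear equations for the parameters that hit the measured responses. The response function and all values here are hypothetical.

```python
import numpy as np

# Stand-in "FE analysis": responses as a function of two design parameters
# (mildly nonlinear, unknown to the response-surface fit).
def fe_response(x):
    return np.array([3.0 * x[0] + 1.0 * x[1],
                     1.0 * x[0] + 2.0 * x[1] + 0.1 * x[0] * x[1]])

target = fe_response(np.array([1.2, 0.8]))       # "measured" responses

# Small DOE around the current estimate; fit a linear RS y = y0 + G (x - x0).
x0 = np.array([1.0, 1.0])
delta = 0.1
designs = [x0, x0 + [delta, 0.0], x0 + [0.0, delta]]
ys = np.array([fe_response(x) for x in designs])
G = np.column_stack([(ys[1] - ys[0]) / delta, (ys[2] - ys[0]) / delta])

# One RSM update step: solve the linear RS for the parameters matching target.
x1 = x0 + np.linalg.solve(G, target - ys[0])
print(x1)                                        # moves toward (1.2, 0.8)
```

    The SSM's contribution, per the abstract, is in choosing where to place the next DOE points after each FE calculation rather than in this basic solve.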

  8. Updating Known Distribution Models for Forecasting Climate Change Impact on Endangered Species

    Science.gov (United States)

    Muñoz, Antonio-Román; Márquez, Ana Luz; Real, Raimundo

    2013-01-01

    To plan endangered species conservation and to design adequate management programmes, it is necessary to predict their distributional response to climate change, especially under the current situation of rapid change. However, these predictions are customarily made by relating de novo the distribution of the species to climatic conditions, with no regard to previously available knowledge about the factors affecting the species distribution. We propose to take advantage of known species distribution models, updating them with the variables yielded by climatic models before projecting them into the future. To exemplify our proposal, the availability of suitable habitat across Spain for the endangered Bonelli's Eagle (Aquila fasciata) was modelled by updating a pre-existing model based on current climate and topography to a combination of different general circulation models and Special Report on Emissions Scenarios. Our results suggested that the main threat for this endangered species would not be climate change, since all forecasting models show that its distribution will be maintained and will increase in mainland Spain throughout the 21st century. We remark on the importance of linking conservation biology with distribution modelling by updating existing models, frequently available for endangered species, considering all the known factors conditioning the species' distribution, instead of building new models that are based on climate change variables only. PMID:23840330
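    The proposed workflow, keep a fitted distribution model and swap in projected climate variables, can be sketched with a toy logistic favourability model. The coefficients, predictors, and scenario warming below are all hypothetical; the point is that the same fitted model is evaluated under current and future climate grids rather than being refit from scratch.

```python
import numpy as np

# Hypothetical pre-existing favourability model: logit(F) = b0 + b1*temp + b2*slope.
b = np.array([-2.0, 0.8, 0.5])

def favourability(temp, slope):
    z = b[0] + b[1] * temp + b[2] * slope
    return 1.0 / (1.0 + np.exp(-z))

# Four grid cells: topography is fixed, climate changes under the scenario.
slope = np.array([0.5, 1.0, 1.5, 2.0])
temp_now = np.array([1.0, 1.5, 2.0, 2.5])
temp_2100 = temp_now + 1.2            # assumed GCM-scenario warming

suitable_now = favourability(temp_now, slope) > 0.5
suitable_2100 = favourability(temp_2100, slope) > 0.5
print(suitable_now.sum(), suitable_2100.sum())   # suitable habitat expands: 2 -> 4
```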

  10. Neurodevelopmental model of schizophrenia: update 2012

    National Research Council Canada - National Science Library

    Rapoport, J L; Giedd, J N; Gogtay, N

    2012-01-01

    The neurodevelopmental model of schizophrenia, which posits that the illness is the end state of abnormal neurodevelopmental processes that started years before the illness onset, is widely accepted...

  11. A last updating evolution model for online social networks

    Science.gov (United States)

    Bu, Zhan; Xia, Zhengyou; Wang, Jiandong; Zhang, Chengcui

    2013-05-01

    As information technology has advanced, people are turning to electronic media more frequently for communication, and social relationships are increasingly found on online channels. However, there is very limited knowledge about the actual evolution of online social networks. In this paper, we propose and study a novel evolution network model based on the new concept of "last updating time", which exists in many real-life online social networks. The last updating evolution network model can maintain the robustness of scale-free networks and can improve network resilience against intentional attacks. Moreover, we found that it exhibits the "small-world effect", which is an inherent property of most social networks. Simulation experiments based on this model show that the results are consistent with real-life data, which indicates that our model is valid.
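    A minimal growth model in this spirit can be simulated directly. The recency-weighted attachment rule below is an illustrative stand-in for the paper's "last updating time" mechanism: new nodes preferentially link to recently active nodes, and receiving a link counts as an update.

```python
import random

random.seed(1)
N = 200
last_update = {0: 0, 1: 0}           # node -> time of its last update
edges = [(0, 1)]

for t in range(2, N):
    # Attachment weight decays with time since a node's last update
    # (a simple stand-in for the paper's last-updating-time rule).
    nodes = list(last_update)
    weights = [1.0 / (1 + t - last_update[v]) for v in nodes]
    target = random.choices(nodes, weights=weights)[0]
    edges.append((t, target))
    last_update[target] = t          # receiving a link "updates" the node
    last_update[t] = t

degree = {v: 0 for v in last_update}
for a, b in edges:
    degree[a] += 1
    degree[b] += 1
print(len(last_update), len(edges), max(degree.values()))
```

    Growing the tree one node per step yields N nodes and N - 1 edges; whether the degree distribution is scale-free depends on the chosen decay law, which is an assumption here.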

  12. Engineered materials characterization report, volume 3 - corrosion data and modeling update for viability assessments

    Energy Technology Data Exchange (ETDEWEB)

    McCright, R D

    1998-06-30

    This Engineered Materials Characterization Report (EMCR), Volume 3, discusses in considerable detail the work of the past 18 months on testing the candidate materials proposed for the waste-package (WP) container and on modeling the performance of those materials in the Yucca Mountain (YM) repository setting. This report was prepared as an update of information and serves as one of the supporting documents to the Viability Assessment (VA) of the Yucca Mountain Project. Previous versions of the EMCR have provided a history and background of container-materials selection and evaluation (Volume 1), a compilation of physical and mechanical properties for the WP design effort (Volume 2), and corrosion-test data and performance-modeling activities (Volume 3). Because the information in Volumes 1 and 2 is still largely current, those volumes are not being revised. As new information becomes available in the testing and modeling efforts, Volume 3 is periodically updated to include that information.

  13. School District Impact Projections: Operations and Maintenance Costs. Everett Navy Homeport Project 1992 Update

    Science.gov (United States)

    1992-08-01

    ...project-related school children inmigrants, regardless of whether a school district has a projected unhoused student need in the year(s) in which the... Projections of Homeport project-related school children inmigrants under all three versions are taken directly from the separate Homeport School Impact... this latter report document for a description of methods and assumptions used to project Homeport project-related inmigrants and unhoused student need

  14. Updating the debate on model complexity

    Science.gov (United States)

    Simmons, Craig T.; Hunt, Randall J.

    2012-01-01

    As scientists who are trying to understand a complex natural world that cannot be fully characterized in the field, how can we best inform the society in which we live? This founding context was addressed in a special session, “Complexity in Modeling: How Much is Too Much?” convened at the 2011 Geological Society of America Annual Meeting. The session had a variety of thought-provoking presentations—ranging from philosophy to cost-benefit analyses—and provided some areas of broad agreement that were not evident in discussions of the topic in 1998 (Hunt and Zheng, 1999). The session began with a short introduction during which model complexity was framed borrowing from an economic concept, the Law of Diminishing Returns, and an example of enjoyment derived by eating ice cream. Initially, there is increasing satisfaction gained from eating more ice cream, to a point where the gain in satisfaction starts to decrease, ending at a point when the eater sees no value in eating more ice cream. A traditional view of model complexity is similar—understanding gained from modeling can actually decrease if models become unnecessarily complex. However, oversimplified models—those that omit important aspects of the problem needed to make a good prediction—can also limit and confound our understanding. Thus, the goal of all modeling is to find the “sweet spot” of model sophistication—regardless of whether complexity was added sequentially to an overly simple model or collapsed from an initial highly parameterized framework that uses mathematics and statistics to attain an optimum (e.g., Hunt et al., 2007). Thus, holistic parsimony is attained, incorporating “as simple as possible,” as well as the equally important corollary “but no simpler.”

  15. Update on the Battery Projects at NREL (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Santhanagopalan, S.; Pesaran, A.

    2010-10-01

    NREL collaborates with industry, universities, and other national laboratories as part of the DOE integrated Energy Storage Program to develop advanced batteries for vehicle applications. Our efforts are focused in the following areas: thermal characterization and analysis, evaluation of thermal abuse tolerance via modeling and experimental analysis, and implications on battery life and cost. Our activities support DOE goals, FreedomCAR targets, the USABC Tech Team, and battery developers. We develop tools to support the industry, both through one-on-one collaborations and by dissemination of information in the form of presentations in conferences and journal publications.

  16. An update of Leighton's solar dynamo model

    CERN Document Server

    Cameron, R H

    2016-01-01

    In 1969 Leighton developed a quasi-1D mathematical model of the solar dynamo, building upon the phenomenological scenario of Babcock(1961). Here we present a modification and extension of Leighton's model. Using the axisymmetric component of the magnetic field, we consider the radial field component at the solar surface and the radially integrated toroidal magnetic flux in the convection zone, both as functions of latitude. No assumptions are made with regard to the radial location of the toroidal flux. The model includes the effects of turbulent diffusion at the surface and in the convection zone, poleward meridional flow at the surface and an equatorward return flow affecting the toroidal flux, latitudinal differential rotation and the near-surface layer of radial rotational shear, downward convective pumping of magnetic flux in the shear layer, and flux emergence in the form of tilted bipolar magnetic regions. While the parameters relevant for the transport of the surface field are taken from observations,...

  17. Updates on Projections of Physics Reach with the Upgraded CMS Detector for High Luminosity LHC

    CERN Document Server

    CMS Collaboration

    2016-01-01

    This document contains updates on projections of the physics reach with the upgraded CMS detector for the HL-LHC. Selected measurements in Higgs physics, top physics, heavy-flavor physics, and searches for dark matter and new heavy particles are presented, highlighting the performance of the planned upgrades of the CMS detector. The projections take into account the effects of high-pileup conditions and detector performance, based on the CMS Phase II Technical Proposal (CMS-TDR-15-002). Some of the studies are performed using the Delphes fast simulation of the upgraded CMS detector.

  18. National Bioenergy Center, Biochemical Platform Integration Project: Quarterly Update, Winter 2011-2012 (Newsletter)

    Energy Technology Data Exchange (ETDEWEB)

    2012-04-01

    Winter 2011-2012 issue of the National Bioenergy Center Biochemical Platform Integration Project quarterly update. Issue topics: 34th Symposium on Biotechnology for Fuels and Chemicals; feasibility of NIR spectroscopy-based rapid feedstock reactive screening; demonstrating integrated pilot-scale biomass conversion. The Biochemical Process Integration Task focuses on integrating the processing steps in enzyme-based lignocellulose conversion technology. This project supports the U.S. Department of Energy's efforts to foster development, demonstration, and deployment of 'biochemical platform' biorefineries that economically produce ethanol or other fuels, as well as commodity sugars and a variety of other chemical products, from renewable lignocellulosic biomass.

  19. High-speed AMB machining spindle model updating and model validation

    Science.gov (United States)

    Wroblewski, Adam C.; Sawicki, Jerzy T.; Pesch, Alexander H.

    2011-04-01

    High-Speed Machining (HSM) spindles equipped with Active Magnetic Bearings (AMBs) have been envisioned to be capable of automated self-identification and self-optimization in efforts to accurately calculate parameters for stable high-speed machining operation. With this in mind, this work presents rotor model development accompanied by an automated model-updating methodology, followed by updated model validation. The model updating methodology is developed to address the dynamic inaccuracies of the nominal open-loop plant model when compared with experimental open-loop transfer function data obtained by the built-in AMB sensors. The nominal open-loop model is altered by utilizing an unconstrained optimization algorithm to adjust only parameters that are a result of engineering assumptions and simplifications, in this case Young's modulus of selected finite elements. Minimizing the error of both resonance and anti-resonance frequencies simultaneously (between model and experimental data) takes into account rotor natural frequencies and mode shape information. To verify the predictive ability of the updated rotor model, its performance is assessed at the tool location, which is independent of the experimental transfer function data used in the model updating procedures. Verification of the updated model is carried out with complementary temporal and spatial response comparisons, substantiating that the updating methodology is effective for derivation of open-loop models for predictive use.
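    The core idea, unconstrained optimization of stiffness-like parameters to match both resonances and anti-resonances, can be sketched on a 2-DOF chain. This is a toy illustration, not the spindle model: element stiffnesses stand in for the updated Young's moduli, and the drive-point anti-resonance is computed as the natural frequency of the system with the driven DOF grounded (a standard structural-dynamics result).

```python
import numpy as np
from scipy.linalg import eigh
from scipy.optimize import minimize

M = np.diag([1.0, 1.0])

def K(e):                      # element stiffnesses play the role of Young's moduli
    e1, e2 = e
    return np.array([[e1 + e2, -e2], [-e2, e2]])

def res_antires(e):
    w2, _ = eigh(K(e), M)      # resonances: eigenvalues of (K, M)
    wa2 = K(e)[0, 0] / M[0, 0] # drive-point anti-resonance at DOF 2:
    return np.sqrt(w2), np.sqrt(wa2)   # natural frequency with DOF 2 grounded

e_true = np.array([3.0, 1.5])
w_m, wa_m = res_antires(e_true)        # "experimental" FRF peaks and dips

def err(e):                    # minimize resonance + anti-resonance error together
    w, wa = res_antires(e)
    return np.sum((w - w_m) ** 2) + (wa - wa_m) ** 2

res = minimize(err, x0=[2.0, 2.0], method='Nelder-Mead')
print(res.x)                   # recovers e_true = [3.0, 1.5]
```

    Including the anti-resonance alongside the resonances is what brings mode-shape information into the objective without requiring measured mode shapes directly.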

  20. An update of Leighton's solar dynamo model

    Science.gov (United States)

    Cameron, R. H.; Schüssler, M.

    2017-02-01

    In 1969, Leighton developed a quasi-1D mathematical model of the solar dynamo, building upon the phenomenological scenario of Babcock published in 1961. Here we present a modification and extension of Leighton's model. Using the axisymmetric component (longitudinal average) of the magnetic field, we consider the radial field component at the solar surface and the radially integrated toroidal magnetic flux in the convection zone, both as functions of latitude. No assumptions are made with regard to the radial location of the toroidal flux. The model includes the effects of (i) turbulent diffusion at the surface and in the convection zone; (ii) poleward meridional flow at the surface and an equatorward return flow affecting the toroidal flux; (iii) latitudinal differential rotation and the near-surface layer of radial rotational shear; (iv) downward convective pumping of magnetic flux in the shear layer; and (v) flux emergence in the form of tilted bipolar magnetic regions treated as a source term for the radial surface field. While the parameters relevant for the transport of the surface field are taken from observations, the model condenses the unknown properties of magnetic field and flow in the convection zone into a few free parameters (turbulent diffusivity, effective return flow, amplitude of the source term, and a parameter describing the effective radial shear). Comparison with the results of 2D flux transport dynamo codes shows that the model captures the essential features of these simulations. We make use of the computational efficiency of the model to carry out an extended parameter study. We cover an extended domain of the 4D parameter space and identify the parameter ranges that provide solar-like solutions. Dipole parity is always preferred and solutions with periods around 22 yr and a correct phase difference between flux emergence in low latitudes and the strength of the polar fields are found for a return flow speed around 2 m s-1, turbulent

  1. A comparison of the updated very high resolution model RegCM3_10km with the previous version RegCM3_25km over the complex terrain of Greece: present and future projections

    Science.gov (United States)

    Tolika, Konstantia; Anagnostopoulou, Christina; Velikou, Kondylia; Vagenas, Christos

    2016-11-01

    The ability of a fine resolution regional climate model (10 × 10 km) in simulating efficiently the climate characteristics (temperature, precipitation, and wind) over Greece, in comparison to the previous version of the model with a 25 × 25 km resolution, is examined and analyzed in the present study. Overall, the results showed that the finer resolution model presented a better skill in generating low winter temperatures at high altitudinal areas, the temperature difference between the islands and the surrounding sea, high rainfall totals over the mountainous areas, the thermal storms during summer, and the wind maxima over the Aegean Sea. Regarding the future projections, even though the two models agree on the climatic signal, differences are found mainly to the magnitude of change of the selected parameters. Finally, it was found that at higher pressure levels, the present day projections of the two models do not show significant differences since the topography and terrain does not play such an important role as the general atmospheric circulation.

  2. Substructure System Identification for Finite Element Model Updating

    Science.gov (United States)

    Craig, Roy R., Jr.; Blades, Eric L.

    1997-01-01

    This report summarizes research conducted under a NASA grant on the topic 'Substructure System Identification for Finite Element Model Updating.' The research concerns ongoing development of the Substructure System Identification Algorithm (SSID Algorithm), a system identification algorithm that can be used to obtain mathematical models of substructures, like Space Shuttle payloads. In the present study, particular attention was given to the following topics: making the algorithm robust to noisy test data, extending the algorithm to accept experimental FRF data that covers a broad frequency bandwidth, and developing a test analytical model (TAM) for use in relating test data to reduced-order finite element models.

  3. Update on Advection-Diffusion Purge Flow Model

    Science.gov (United States)

    Brieda, Lubos

    2015-01-01

    Gaseous purge is commonly used in sensitive spacecraft optical or electronic instruments to prevent infiltration of contaminants and/or water vapor. Typically, purge is sized using simplistic zero-dimensional models that do not take into account instrument geometry, surface effects, and the dependence of diffusive flux on the concentration gradient. For this reason, an axisymmetric computational fluid dynamics (CFD) simulation was recently developed to model contaminant infiltration and removal by purge. The solver uses a combined Navier-Stokes and Advection-Diffusion approach. In this talk, we report on updates in the model, namely inclusion of a particulate transport model.
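    A one-dimensional caricature of the advection-diffusion balance in a purge can be computed directly: purge flow advects contaminant out while ambient contaminant diffuses in against the flow, and the steady profile depends on the concentration gradient, which the simplistic zero-dimensional sizing models ignore. Geometry, flow speed, and diffusivity below are illustrative values, not from the talk.

```python
import numpy as np

# 1D advection-diffusion along a purge path: purge speed u, diffusivity D.
L, n = 0.1, 101
u, D = 0.01, 1e-4                   # m/s, m^2/s  ->  Peclet number u*L/D = 10
x = np.linspace(0.0, L, n)
dx = x[1] - x[0]
dt = 0.2 * dx * dx / D              # stable explicit time step

c = np.zeros(n)
c[-1] = 1.0                         # ambient contaminant at the open end
for _ in range(50000):              # march to steady state
    adv = -u * (c[1:-1] - c[:-2]) / dx            # upwind advection (flow -> +x)
    dif = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
    c[1:-1] += dt * (adv + dif)
    c[0] = 0.0                      # clean purge gas enters at x = 0
    c[-1] = 1.0

# Analytic steady solution of u c' = D c'' with these boundary conditions.
exact = (np.exp(u * x / D) - 1.0) / (np.exp(u * L / D) - 1.0)
print(float(np.abs(c - exact).max()))
```

    The exponential profile shows why infiltration is confined to a thin layer near the opening at a high Peclet number; first-order upwinding adds some numerical diffusion, so the match to the analytic solution is close but not exact.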

  4. Maximum likelihood reconstruction for Ising models with asynchronous updates

    CERN Document Server

    Zeng, Hong-Li; Aurell, Erik; Hertz, John; Roudi, Yasser

    2012-01-01

    We describe how the couplings in a non-equilibrium Ising model can be inferred from observing the model history. Two cases of an asynchronous update scheme are considered: one in which we know both the spin history and the update times (times at which an attempt was made to flip a spin) and one in which we only know the spin history (i.e., the times at which spins were actually flipped). In both cases, maximizing the likelihood of the data leads to exact learning rules for the couplings in the model. For the first case, we show that one can average over all possible choices of update times to obtain a learning rule that depends only on spin correlations and not on the specific spin history. For the second case, the same rule can be derived within a further decoupling approximation. We study all methods numerically for fully asymmetric Sherrington-Kirkpatrick models, varying the data length, system size, temperature, and external field. Good convergence is observed in accordance with the theoretical expectatio...

  5. An updated geospatial liquefaction model for global application

    Science.gov (United States)

    Zhu, Jing; Baise, Laurie G.; Thompson, Eric

    2017-01-01

    We present an updated geospatial approach to estimation of earthquake-induced liquefaction from globally available geospatial proxies. Our previous iteration of the geospatial liquefaction model was based on mapped liquefaction surface effects from four earthquakes in Christchurch, New Zealand, and Kobe, Japan, paired with geospatial explanatory variables including slope-derived VS30, compound topographic index, and magnitude-adjusted peak ground acceleration from ShakeMap. The updated geospatial liquefaction model presented herein improves the performance and the generality of the model. The updates include (1) expanding the liquefaction database to 27 earthquake events across 6 countries, (2) addressing the sampling of nonliquefaction for incomplete liquefaction inventories, (3) testing interaction effects between explanatory variables, and (4) overall improving model performance. While we test 14 geospatial proxies for soil density and soil saturation, the most promising geospatial parameters are slope-derived VS30, modeled water table depth, distance to coast, distance to river, distance to closest water body, and precipitation. We found that peak ground velocity (PGV) performs better than peak ground acceleration (PGA) as the shaking intensity parameter. We present two models which offer improved performance over prior models. We evaluate model performance using the area under the Receiver Operating Characteristic (ROC) curve (AUC) and the Brier score. The best-performing model in a coastal setting uses distance to coast but is problematic for regions away from the coast. The second best model, using PGV, VS30, water table depth, distance to closest water body, and precipitation, performs better in noncoastal regions and thus is the model we recommend for global implementation.

  6. On grey relation projection model based on projection pursuit

    Institute of Scientific and Technical Information of China (English)

    Wang Shuo; Yang Shanlin; Ma Xijun

    2008-01-01

    Multidimensional grey relation projection values can be synthesized into a one-dimensional projection value by using a projection pursuit model. The larger the projection value is, the better the model. Thus, according to the projection value, the best model can be chosen from the model aggregation. Because projection pursuit modeling based on an accelerating genetic algorithm can simplify the implementation procedure of the projection pursuit technique and overcome its complex calculation, as well as the difficulty of implementing its program, a new method can be obtained for choosing the best grey relation projection model based on the projection pursuit technique.
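    The synthesis step can be sketched with a toy example: each candidate model has grey relational values against several criteria, a unit projection direction collapses them to one dimension, and the candidate with the largest projection value wins. The data are hypothetical, and simple random search stands in for the paper's accelerating genetic algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Grey relational values of 5 candidate models against 4 criteria (hypothetical).
R = np.array([[0.72, 0.55, 0.60, 0.81],
              [0.66, 0.70, 0.58, 0.75],
              [0.90, 0.82, 0.77, 0.88],
              [0.51, 0.49, 0.62, 0.57],
              [0.78, 0.64, 0.71, 0.69]])

# Projection pursuit: search nonnegative unit directions a, scoring each by a
# simple projection index (spread of the projected values).
best_a, best_idx = None, -np.inf
for _ in range(2000):
    a = rng.normal(size=4)
    a = np.abs(a) / np.linalg.norm(a)
    z = R @ a
    if z.std() > best_idx:
        best_idx, best_a = z.std(), a

z = R @ best_a
print(int(np.argmax(z)))            # candidate 2 (0-based) is selected
```

    Here candidate 2 dominates the others on every criterion, so any nonnegative direction selects it; with non-dominated candidates the optimized direction would decide.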

  7. Updated scalar sector constraints in Higgs triplet model

    CERN Document Server

    Das, Dipankar

    2016-01-01

    We show that in the Higgs triplet model, after the Higgs discovery, the mixing angle in the CP-even sector can be strongly constrained from unitarity. We also discuss how large quantum effects in $h\\to\\gamma\\gamma$ may arise in a SM-like scenario and a certain part of the parameter space can be ruled out from the diphoton signal strength. Using $T$-parameter and diphoton signal strength measurements, we update the bounds on the nonstandard scalar masses.

  8. An improved optimal elemental method for updating finite element models

    Institute of Scientific and Technical Information of China (English)

    Duan Zhongdong(段忠东); Spencer B.F.; Yan Guirong(闫桂荣); Ou Jinping(欧进萍)

    2004-01-01

    The optimal matrix method and optimal elemental method used to update finite element models may not provide accurate results. This situation occurs when the test modal model is incomplete, as is often the case in practice. An improved optimal elemental method is presented that defines a new objective function and, as a byproduct, circumvents the need for mass-normalized mode shapes, which are also not readily available in practice. To solve the group of nonlinear equations created by the improved optimal method, the Lagrange multiplier method and the Matlab function fmincon are employed. To deal with actual complex structures, the float-encoding genetic algorithm (FGA) is introduced to enhance the capability of the improved method. Two examples, a 7-degree-of-freedom (DOF) mass-spring system and a 53-DOF planar frame, respectively, are updated using the improved method. The example results demonstrate the advantages of the improved method over existing optimal methods, and show that the genetic algorithm is an effective way to update the models used for actual complex structures.
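    A float-encoding GA for model updating can be sketched in a few lines. The example below (a 3-DOF chain rather than the paper's 7-DOF system, and a minimal GA rather than the authors' FGA) searches real-valued stiffness vectors to match measured natural frequencies; like the improved method, the objective needs no mass-normalized mode shapes.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
M = np.eye(3)

def K(k):                              # grounded 3-spring chain
    k1, k2, k3 = k
    return np.array([[k1 + k2, -k2, 0.0],
                     [-k2, k2 + k3, -k3],
                     [0.0, -k3, k3]])

k_true = np.array([2.0, 1.5, 1.0])
w_meas = np.sqrt(eigh(K(k_true), M, eigvals_only=True))

def fitness(k):                        # frequency error only, no mode shapes
    w = np.sqrt(eigh(K(k), M, eigvals_only=True))
    return -np.sum((w - w_meas) ** 2)

# Minimal float-encoding GA: tournament selection, blend crossover, mutation.
pop = rng.uniform(0.5, 3.0, (60, 3))
for gen in range(120):
    fit = np.array([fitness(ind) for ind in pop])
    new = [pop[np.argmax(fit)]]                      # elitism
    while len(new) < len(pop):
        a, b = [pop[max(rng.integers(0, len(pop), 2), key=lambda i: fit[i])]
                for _ in range(2)]
        lam = rng.random()
        child = lam * a + (1 - lam) * b              # blend crossover
        child += rng.normal(0.0, 0.05, 3)            # float-encoded mutation
        new.append(np.clip(child, 0.1, 5.0))
    pop = np.array(new)
best = pop[np.argmax([fitness(ind) for ind in pop])]
print(best)                            # reproduces the measured frequencies
```

    Because chromosomes are real-valued vectors, no binary encoding/decoding is needed, which is the practical appeal of float encoding for stiffness parameters.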

  9. TA 55 Reinvestment Project II Phase C Update Project Status May 23, 2017

    Energy Technology Data Exchange (ETDEWEB)

    Giordano, Anthony P. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-05-25

    The TA-55 Reinvestment Project (TRP) II Phase C is a critical infrastructure project focused on improving safety and reliability of the Los Alamos National Laboratory (LANL) TA-55 Complex. The Project recapitalizes and revitalizes aging and obsolete facility and safety systems providing a sustainable nuclear facility for National Security Missions.

  10. An updated subgrid orographic parameterization for global atmospheric forecast models

    Science.gov (United States)

    Choi, Hyun-Joo; Hong, Song-You

    2015-12-01

    A subgrid orographic parameterization (SOP) is updated by including the effects of orographic anisotropy and flow-blocking drag (FBD). The impact of the updated SOP on short-range forecasts is investigated using a global atmospheric forecast model applied to a heavy snowfall event over Korea on 4 January 2010. When the SOP is updated, the orographic drag in the lower troposphere noticeably increases owing to the additional FBD over mountainous regions. The enhanced drag directly weakens the excessive wind speed in the low troposphere and indirectly improves the temperature and mass fields over East Asia. In addition, the snowfall overestimation over Korea is improved by the reduced heat fluxes from the surface. The forecast improvements are robust regardless of the horizontal resolution of the model between T126 and T510. The parameterization is statistically evaluated based on the skill of the medium-range forecasts for February 2014. For the medium-range forecasts, the skill improvements of the wind speed and temperature in the low troposphere are observed globally and for East Asia while both positive and negative effects appear indirectly in the middle-upper troposphere. The statistical skill for the precipitation is mostly improved due to the improvements in the synoptic fields. The improvements are also found for seasonal simulation throughout the troposphere and stratosphere during boreal winter.

  11. Two dimensional cellular automaton for evacuation modeling: hybrid shuffle update

    CERN Document Server

    Arita, Chikashi; Appert-Rolland, Cécile

    2015-01-01

    We consider a cellular automaton model with a static floor field for pedestrians evacuating a room. After identifying some properties of real pedestrian flows, we discuss various update schemes and introduce a new one, the hybrid shuffle update. The properties specific to pedestrians are incorporated in variables associated with the particles, called phases, which represent their step cycles. The dynamics of the phases naturally gives rise to some friction and makes it possible to reproduce several features observed in experiments. In particular, we study the crossover between a low- and a high-density regime that occurs when the pedestrian density increases, the dependence of the outflow on the strength of the floor field, and the shape of the queue in front of the exit.
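    A bare-bones version of such a model is easy to simulate. The sketch below uses a static floor field (distance to the exit) and a plain random shuffle update, re-shuffling the update order every step; it omits the paper's phase variables and hybrid scheme, and the room layout is arbitrary.

```python
import random

random.seed(2)
W, H = 10, 10
exit_cell = (0, 5)

# Static floor field: distance to the exit (smaller = more attractive).
field = {(x, y): abs(x - exit_cell[0]) + abs(y - exit_cell[1])
         for x in range(W) for y in range(H)}

peds = {(x, y) for x in range(4, 8) for y in range(2, 8)}   # 24 pedestrians
steps = 0
while peds:
    steps += 1
    order = list(peds)
    random.shuffle(order)            # random shuffle update: new order each step
    for p in order:
        if p not in peds:
            continue
        x, y = p
        nbrs = [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if (x + dx, y + dy) in field]
        free = [n for n in nbrs if n not in peds]
        if not free:
            continue                 # blocked: wait this step
        best = min(free, key=lambda n: field[n])
        if field[best] < field[p]:   # move only downhill in the floor field
            peds.remove(p)
            peds.add(best)
            if best == exit_cell:
                peds.discard(best)   # pedestrian leaves through the exit
print(steps)                         # evacuation time in update steps
```

    Processing pedestrians sequentially within each shuffled pass resolves conflicts for free, which is the main practical appeal of shuffle updates over parallel ones.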

  12. An updating method for structural dynamics models with unknown excitations

    Energy Technology Data Exchange (ETDEWEB)

    Louf, F; Charbonnel, P E; Ladeveze, P [LMT-Cachan (ENS Cachan/CNRS/Paris 6 University) 61, avenue du Prsident Wilson, F-94235 Cachan Cedex (France); Gratien, C [Astrium (EADS space transportation) - Service TE 343 66, Route de Verneuil, 78133 Les Mureaux Cedex (France)], E-mail: charbonnel@lmt.ens-cachan.fr, E-mail: ladeveze@lmt.ens-cachan.fr, E-mail: louf@lmt.ens-cachan.fr, E-mail: christian.gratien@astrium.eads.net

    2008-11-01

    This paper presents an extension of the Constitutive Relation Error (CRE) updating method to complex industrial structures, such as space launchers, for which tests carried out in the functional context can provide significant amounts of information. Indeed, since several sources of excitation are involved simultaneously, a flight test can be viewed as a multiple test. However, there is a serious difficulty in that these sources of excitation are partially unknown. The CRE updating method enables one to obtain an estimate of these excitations. We present a first application of the method using a very simple finite element model of the Ariane V launcher along with measurements performed at the end of an atmospheric flight.

  13. UPDATING THE FREIGHT TRUCK STOCK ADJUSTMENT MODEL: 1997 VEHICLE INVENTORY AND USE SURVEY DATA

    Energy Technology Data Exchange (ETDEWEB)

    Davis, S.C.

    2000-11-16

    The Energy Information Administration's (EIA's) National Energy Modeling System (NEMS) Freight Truck Stock Adjustment Model (FTSAM) was created in 1995, relying heavily on input data from the 1992 Economic Census, Truck Inventory and Use Survey (TIUS). The FTSAM is part of the NEMS Transportation Sector Model, which provides baseline energy projections and analyzes the impacts of various technology scenarios on consumption, efficiency, and carbon emissions. The base data for the FTSAM can be updated every five years as new Economic Census information is released. Because of its expertise in using the TIUS database, Oak Ridge National Laboratory (ORNL) was asked to assist the EIA when the new Economic Census data became available. ORNL provided the necessary base data from the 1997 Vehicle Inventory and Use Survey (VIUS) and other sources to update the FTSAM. The next Economic Census will be in the year 2002. When those data become available, the EIA will again want to update the FTSAM using the VIUS. This report, which details the methodology of estimating and extracting data from the 1997 VIUS Microdata File, should be used as a guide for generating the data from the next VIUS so that the new data will be as compatible as possible with the data in the model.

  14. An Updated Probabilistic Seismic Hazard Assessment for Romania and Comparison with the Approach and Outcomes of the SHARE Project

    Science.gov (United States)

    Pavel, Florin; Vacareanu, Radu; Douglas, John; Radulian, Mircea; Cioflan, Carmen; Barbat, Alex

    2016-06-01

    The probabilistic seismic hazard analysis for Romania is revisited within the framework of the BIGSEES national research project (http://infp.infp.ro/bigsees/default.htm), financed by the Romanian Ministry of Education and Scientific Research in the period 2012-2016. The scope of this project is to provide a refined description of the seismic action for Romanian sites according to the requirements of Eurocode 8. To this end, the seismicity of all the sources influencing the Romanian territory is updated based on new data acquired in recent years. The ground-motion models used in the analysis, as well as their corresponding weights, are selected based on the results of several recent papers also published within the framework of the BIGSEES project. The seismic hazard analysis for Romania performed in this study is based on the traditional Cornell-McGuire approach. Finally, the results are discussed and compared with the values obtained in the recently completed SHARE research project. The BIGSEES and SHARE results are not directly comparable since the considered soil conditions are different: actual soil classes for BIGSEES and rock for SHARE. Nevertheless, the analyses of the seismic hazard results for 200 sites in Romania reveal considerable differences between the seismic hazard levels obtained in the present study and the SHARE results, and point out the need for further analyses and thorough discussion of the two seismic hazard models, especially in the light of a possible future harmonized hazard map for Europe.
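The Cornell-McGuire approach named above can be illustrated with a minimal hazard-curve computation. This is a hedged sketch only: the point sources, activity rates, median ground motions, and sigma are invented for illustration and are not BIGSEES or SHARE values.

```python
from math import exp, log
from statistics import NormalDist

def exceedance_rate(a, sources, sigma_ln=0.6):
    """Annual rate of ground motion exceeding level a (g).

    Each source is (annual_rate, median_gm): the yearly rate of its
    characteristic earthquake and the median ground motion it produces
    at the site. Aleatory variability is lognormal with sigma_ln.
    """
    rate = 0.0
    for nu, median in sources:
        # P(GM > a | event) under a lognormal ground-motion model
        p_exceed = 1.0 - NormalDist(log(median), sigma_ln).cdf(log(a))
        rate += nu * p_exceed
    return rate

def prob_exceedance(a, sources, years=50):
    """Poissonian probability of at least one exceedance in 'years' years."""
    return 1.0 - exp(-years * exceedance_rate(a, sources))

sources = [(0.02, 0.15), (0.005, 0.30)]   # illustrative (rate/yr, median g)
p50 = prob_exceedance(0.2, sources)       # 50-yr exceedance prob. at 0.2 g
```

Summing rates over sources and converting to a Poissonian probability is the core of the classical approach; a real analysis integrates over magnitude-distance distributions and uses published ground-motion models.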

  15. Performance Comparison of Two Meta-Model for the Application to Finite Element Model Updating of Structures

    Institute of Scientific and Technical Information of China (English)

    Yang Liu; DeJun Wang; Jun Ma; Yang Li

    2014-01-01

    To investigate the application of meta-models to finite element (FE) model updating of structures, the performance of two popular meta-models, i.e., the Kriging model and the response surface model (RSM), was compared in detail. Firstly, the two kinds of meta-models are introduced briefly. Secondly, some key issues in applying meta-models to FE model updating of structures are proposed and discussed, and recommendations are presented for selecting a reasonable meta-model for updating the FE model of a structure. Finally, the procedure of FE model updating based on a meta-model was implemented by updating the FE model of a truss bridge model with the measured modal parameters. The results showed that the Kriging model is more suitable for FE model updating of complex structures.
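As a concrete illustration of the response-surface idea compared here, the sketch below fits a quadratic RSM to a few samples of an "expensive" response by least squares. The basis, sample points, and response function are invented for illustration and are not those of the paper.

```python
def fit_quadratic_rsm(xs, ys):
    """Least-squares fit of y ~ c0 + c1*x + c2*x^2 via the normal equations."""
    # Moments sum(x^0) .. sum(x^4) build the 3x3 normal-equation system A c = b
    S = [sum(x**k for x in xs) for k in range(5)]
    b = [sum(y * x**k for x, y in zip(xs, ys)) for k in range(3)]
    A = [[S[i + j] for j in range(3)] for i in range(3)]
    # Gaussian elimination with partial pivoting (small fixed-size system)
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    c = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):   # back substitution
        c[r] = (b[r] - sum(A[r][k] * c[k] for k in range(r + 1, 3))) / A[r][r]
    return c

# Surrogate for an "expensive" FE response, here y = 1 + 2x + 3x^2 exactly
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [1 + 2 * x + 3 * x * x for x in xs]
c0, c1, c2 = fit_quadratic_rsm(xs, ys)
```

Once fitted, the cheap polynomial stands in for the FE solver inside the updating loop; Kriging replaces the fixed polynomial basis with an interpolating correlation model.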

  16. Soft sensor modelling by time difference, recursive partial least squares and adaptive model updating

    Science.gov (United States)

    Fu, Y.; Yang, W.; Xu, O.; Zhou, L.; Wang, J.

    2017-04-01

    To handle time-variant and nonlinear characteristics in industrial processes, a soft sensor modelling method based on time difference, moving-window recursive partial least squares (PLS) and adaptive model updating is proposed. In this method, time-difference values of the input and output variables are used as training samples to construct the model, which reduces the effect of the nonlinear characteristics on modelling accuracy while retaining the advantages of the recursive PLS algorithm. To avoid an excessively high model-updating frequency, a confidence value is introduced, which is updated adaptively according to the results of the model performance assessment; the model is updated only once the confidence value has been updated. The proposed method has been used to predict the 4-carboxy-benz-aldehyde (CBA) content in the purified terephthalic acid (PTA) oxidation reaction process. The results show that the proposed soft sensor modelling method reduces computation effectively, improves prediction accuracy by making use of process information, and reflects the process characteristics accurately.
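A hedged sketch of the time-difference idea, using a single input, ordinary least squares in place of recursive PLS, and an error-triggered "confidence" update; all class names and thresholds are illustrative, not the paper's.

```python
def fit_slope(dx, dy):
    """OLS slope through the origin for time-difference pairs."""
    return sum(a * b for a, b in zip(dx, dy)) / sum(d * d for d in dx)

class TimeDiffSoftSensor:
    """Predicts y_t from x_t using differenced data: y_t ~ y_{t-1} + theta*dx_t.

    The model is refitted only when the prediction error exceeds a
    threshold, mimicking assessment-driven (adaptive) model updating.
    """
    def __init__(self, window=20, threshold=0.1):
        self.window, self.threshold = window, threshold
        self.dx, self.dy = [], []
        self.theta = None
        self.prev_x = self.prev_y = None

    def step(self, x, y):
        """Feed one (x, y) sample; returns the prediction made for y."""
        pred = None
        if self.prev_x is not None:
            dx, dy = x - self.prev_x, y - self.prev_y
            if self.theta is not None:
                pred = self.prev_y + self.theta * dx
            self.dx.append(dx); self.dy.append(dy)
            self.dx = self.dx[-self.window:]; self.dy = self.dy[-self.window:]
            err = abs(pred - y) if pred is not None else float("inf")
            if err > self.threshold and len(self.dx) >= 2:
                self.theta = fit_slope(self.dx, self.dy)   # model update
        self.prev_x, self.prev_y = x, y
        return pred

sensor = TimeDiffSoftSensor()
# y = 2x + slowly drifting bias; differencing removes the slow drift
preds = [sensor.step(t * 0.5, 2 * (t * 0.5) + 5 + 0.001 * t) for t in range(50)]
```

Because the drift is slow, the differenced samples obey a nearly constant linear relation, so after the first refit the model tracks the process without further updates.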

  17. An Updated Gas/grain Sulfur Network for Astrochemical Models

    Science.gov (United States)

    Laas, Jacob; Caselli, Paola

    2017-06-01

    Sulfur is a chemical element that enjoys one of the highest cosmic abundances. However, it has traditionally played a relatively minor role in the field of astrochemistry, being drowned out by other chemistries after it depletes from the gas phase during the transition from a diffuse cloud to a dense one. A wealth of laboratory studies have provided clues to its rich chemistry in the condensed phase, and most recently, a report by a team behind the Rosetta spacecraft has significantly helped to unveil its rich cometary chemistry. We have set forth to use this information to greatly update/extend the sulfur reactions within the OSU gas/grain astrochemical network in a systematic way, to provide more realistic chemical models of sulfur for a variety of interstellar environments. We present here some results and implications of these models.

  18. Finite element modelling and updating of a lively footbridge: The complete process

    Science.gov (United States)

    Živanović, Stana; Pavic, Aleksandar; Reynolds, Paul

    2007-03-01

    The finite element (FE) model updating technology was originally developed in the aerospace and mechanical engineering disciplines to automatically update numerical models of structures to match their experimentally measured counterparts. The updating process identifies the drawbacks of the FE modelling, and the updated FE model can be used to produce more reliable results in further dynamic analysis. In the last decade, the updating technology has been introduced into civil structural engineering, where it can serve as an advanced tool for obtaining reliable modal properties of large structures. The updating process has four key phases: initial FE modelling, modal testing, manual model tuning, and automatic updating (conducted using specialist software). However, the published literature does not connect these phases well, although this is crucial when implementing the updating technology. This paper therefore aims to clarify the importance of this linking and to describe the complete model updating process as applicable in civil structural engineering. The complete process consisting of the four phases is outlined, and brief theory is presented as appropriate. The procedure is then implemented on a lively steel box girder footbridge. It was found that even a very detailed initial FE model underestimated the natural frequencies of all seven experimentally identified modes of vibration, with the maximum error being almost 30%. Manual FE model tuning by trial and error found that flexible supports in the longitudinal direction should be introduced at the girder ends to improve correlation between the measured and FE-calculated modes. This significantly reduced the maximum frequency error to only 4%. It was demonstrated that only then could the FE model be automatically updated in a meaningful way. The automatic updating was successfully conducted by updating 22 uncertain structural parameters. Finally, a physical interpretation of all parameter changes is discussed.

  19. Project SAFE. Update of the SFR-1 safety assessment. Phase 1

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Johan [ed.] [Golder Associates AB (Sweden); Riggare, P. [ed.] [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden); Skagius, K. [ed.] [Kemakta Konsult AB, Stockholm (Sweden)

    1998-10-01

    SFR-1 is a facility for disposal of low-level radioactive operational waste from the nuclear power plants in Sweden. Low-level radioactive waste from industry, medicine, and research is also disposed of in SFR-1. The facility is situated in bedrock beneath the Baltic Sea, 1 km off the coast near the Forsmark nuclear power plant. SFR-1 was built between 1983 and 1988. An assessment of the long-term performance of the facility was included in the extensive documentation that formed part of the application for an operational licence; the assessment was presented in the form of a final safety report. The operational licence for SFR-1 states that renewed safety assessments should be carried out at least every ten years. In order to meet this demand, SKB has launched a special project, SAFE (Safety Assessment of Final Disposal of Operational Radioactive Waste). The aim of the project is to update the safety analysis and to prepare a safety report to be presented to the Swedish authorities no later than the year 2000. Project SAFE is divided into three phases. The first phase is a prestudy, and the results of the prestudy are given in this report. The aim of the prestudy is to identify issues where additional studies would improve the basis for the updated safety analysis, and to suggest how these studies should be carried out. The work has been divided into six topics: the inventory, the near field, the far field, the biosphere, radionuclide transport calculations, and scenarios. For each topic the former safety reports and regulatory reviews are scrutinised and needs for additional work are identified. The evaluations are given in appendices covering the respective topics. The main report is a summary of the appendices, with a more stringent description of the repository system and the processes that are of interest and should therefore be addressed in an updated safety assessment.

  20. Updates to Model Algorithms & Inputs for the Biogenic ...

    Science.gov (United States)

    We have developed new canopy emission algorithms and land use data for BEIS. Simulations with BEIS v3.4 and these updates in CMAQ v5.0.2 are compared to the Model of Emissions of Gases and Aerosols from Nature (MEGAN), and the simulations are evaluated against observations. This has resulted in improved model evaluations of isoprene, NOx, and O3. The National Exposure Research Laboratory (NERL) Atmospheric Modeling and Analysis Division (AMAD) conducts research in support of the EPA's mission to protect human health and the environment. AMAD's research program is engaged in developing and evaluating predictive atmospheric models on all spatial and temporal scales for forecasting air quality and for assessing changes in air quality and air pollutant exposures as affected by changes in ecosystem management and regulatory decisions. AMAD is responsible for providing a sound scientific and technical basis for regulatory policies based on air quality models to improve ambient air quality. The models developed by AMAD are used by EPA, NOAA, and the air pollution community in understanding and forecasting not only the magnitude of the air pollution problem, but also in developing emission control policies and regulations for air quality improvements.

  1. Updated Delft Mass Transport model DMT-2: computation and validation

    Science.gov (United States)

    Hashemi Farahani, Hassan; Ditmar, Pavel; Inacio, Pedro; Klees, Roland; Guo, Jing; Guo, Xiang; Liu, Xianglin; Zhao, Qile; Didova, Olga; Ran, Jiangjun; Sun, Yu; Tangdamrongsub, Natthachet; Gunter, Brian; Riva, Ricardo; Steele-Dunne, Susan

    2014-05-01

    A number of research centers compute models of mass transport in the Earth's system using primarily K-Band Ranging (KBR) data from the Gravity Recovery And Climate Experiment (GRACE) satellite mission. These models typically consist of a time series of monthly solutions, each of which is defined in terms of a set of spherical harmonic coefficients up to degree 60-120. One such model, the Delft Mass Transport, release 2 (DMT-2), is computed at the Delft University of Technology (The Netherlands) in collaboration with Wuhan University; an updated variant of this model has been produced recently. A unique feature of the computational scheme designed to compute DMT-2 is the preparation of an accurate stochastic description of data noise in the frequency domain using an Auto-Regressive Moving-Average (ARMA) model, which is derived for each particular month. The benefits of such an approach are a proper frequency-dependent data weighting in the data inversion and an accurate variance-covariance matrix of noise in the estimated spherical harmonic coefficients. Furthermore, the data prior to the inversion are subject to an advanced high-pass filtering, which makes use of a spatially dependent weighting scheme, so that noise is primarily estimated on the basis of data collected over areas with minor mass transport signals (e.g., oceans). On the one hand, this procedure efficiently suppresses noise caused by inaccuracies in satellite orbits; on the other hand, it preserves mass transport signals in the data. Finally, the unconstrained monthly solutions are filtered using a Wiener filter, which is based on estimates of the signal and noise variance-covariance matrices. In combination with a proper data weighting, this noticeably improves the spatial resolution of the monthly gravity models and the associated mass transport models. For instance, the computed solutions allow long-term negative trends to be clearly seen in sufficiently small regions.
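In the simplified case of diagonal signal and noise variance-covariance matrices, the Wiener-filtering step described above reduces to shrinking each estimated coefficient by its signal-to-(signal-plus-noise) variance ratio. A minimal sketch with invented variances:

```python
def wiener_filter(coeffs, signal_var, noise_var):
    """Diagonal Wiener filter: shrink each estimated spherical-harmonic
    coefficient by its signal/(signal + noise) variance ratio."""
    return [c * s / (s + n) for c, s, n in zip(coeffs, signal_var, noise_var)]

# Illustrative values: high-degree coefficients are noise-dominated
# and are therefore strongly damped, low-degree ones pass almost unchanged.
coeffs     = [1.00, 0.50, 0.20, 0.05]
signal_var = [1.00, 0.40, 0.10, 0.01]
noise_var  = [0.01, 0.05, 0.10, 0.20]
filtered = wiener_filter(coeffs, signal_var, noise_var)
```

The full-matrix version used for DMT-2 additionally exploits correlations between coefficients, but the shrinkage intuition is the same.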

  2. Impact of the economic downturn on total joint replacement demand in the United States: updated projections to 2021.

    Science.gov (United States)

    Kurtz, Steven M; Ong, Kevin L; Lau, Edmund; Bozic, Kevin J

    2014-04-16

    Few studies have explored the role of the National Health Expenditure and macroeconomics in the utilization of total joint replacement. The economic downturn has raised questions about the sustainability of growth for total joint replacement in the future. Previous projections of total joint replacement demand in the United States were based on data up to 2003 and used a statistical methodology that neglected macroeconomic factors, such as the National Health Expenditure. Data from the Nationwide Inpatient Sample (1993 to 2010) were used with United States Census and National Health Expenditure data to quantify historical trends in total joint replacement rates, including the two economic downturns in the 2000s. Primary and revision hip and knee arthroplasty were identified using codes from the International Classification of Diseases, Ninth Revision, Clinical Modification. Projections in total joint replacement were estimated using a regression model incorporating the growth in population and rate of arthroplasties from 1993 to 2010 as a function of age, sex, race, and census region, with the National Health Expenditure as the independent variable. The regression model was used in conjunction with government projections of National Health Expenditure from 2011 to 2021 to estimate future arthroplasty rates in subpopulations of the United States and to derive national estimates. The growth trend for the incidence of joint arthroplasty, for the overall United States population as well as for the United States workforce, was insensitive to economic downturns. From 2009 to 2010, the total number of procedures increased by 6.0% for primary total hip arthroplasty, 6.1% for primary total knee arthroplasty, 10.8% for revision total hip arthroplasty, and 13.5% for revision total knee arthroplasty. The National Health Expenditure model projections for primary hip replacement in 2020 were higher than those of a previously published projection model.

  3. Update on microkinetic modeling of lean NOx trap chemistry.

    Energy Technology Data Exchange (ETDEWEB)

    Larson, Richard S.; Daw, C. Stuart (Oak Ridge National Laboratory, Oak Ridge, TN); Pihl, Josh A. (Oak Ridge National Laboratory, Oak Ridge, TN); Choi, Jae-Soon (Oak Ridge National Laboratory, Oak Ridge, TN); Chakravarthy, V. Kalyana (Oak Ridge National Laboratory, Oak Ridge, TN)

    2010-04-01

    Our previously developed microkinetic model for lean NOx trap (LNT) storage and regeneration has been updated to address some longstanding issues, in particular the formation of N2O during the regeneration phase at low temperatures. To this finalized mechanism has been added a relatively simple (12-step) scheme that accounts semi-quantitatively for the main features observed during sulfation and desulfation experiments, namely (a) the essentially complete trapping of SO2 at normal LNT operating temperatures, (b) the plug-like sulfation of both barium oxide (NOx storage) and cerium oxide (oxygen storage) sites, (c) the degradation of NOx storage behavior arising from sulfation, (d) the evolution of H2S and SO2 during high temperature desulfation (temperature programmed reduction) under H2, and (e) the complete restoration of NOx storage capacity achievable through the chosen desulfation procedure.

  4. Simple brane-world inflationary models — An update

    Science.gov (United States)

    Okada, Nobuchika; Okada, Satomi

    2016-05-01

    In the light of the Planck 2015 results, we update simple inflationary models based on the quadratic, quartic, Higgs and Coleman-Weinberg potentials in the context of the Randall-Sundrum brane-world cosmology. The brane-world cosmological effect alters the inflationary predictions of the spectral index (ns) and the tensor-to-scalar ratio (r) from those obtained in the standard cosmology. In particular, the tensor-to-scalar ratio is enhanced in the presence of the 5th dimension. In order to maintain consistency with the Planck 2015 results for the inflationary predictions in the standard cosmology, we find a lower bound on the five-dimensional Planck mass (M5). On the other hand, inflationary predictions lying outside of the Planck allowed region can be pushed into the allowed region by the brane-world cosmological effect with a suitable choice of M5.

  5. Simple brane-world inflationary models: an update

    CERN Document Server

    Okada, Nobuchika

    2015-01-01

    In the light of the Planck 2015 results, we update simple inflationary models based on the quadratic, quartic, Higgs and Coleman-Weinberg potentials in the context of the Randall-Sundrum brane-world cosmology. The brane-world cosmological effect alters the inflationary predictions of the spectral index ($n_s$) and the tensor-to-scalar ratio ($r$) from those obtained in the standard cosmology. In particular, the tensor-to-scalar ratio is enhanced in the presence of the 5th dimension. In order to maintain consistency with the Planck 2015 results for the inflationary predictions in the standard cosmology, we find a lower bound on the five-dimensional Planck mass. On the other hand, inflationary predictions lying outside of the Planck allowed region can be pushed into the allowed region by the brane-world cosmological effect.

  6. Finite element model updating of existing steel bridge based on structural health monitoring

    Institute of Scientific and Technical Information of China (English)

    HE Xu-hui; YU Zhi-wu; CHEN Zheng-qing

    2008-01-01

    Based on the physical meaning of sensitivity, a new finite element (FE) model updating method was proposed. In this method, a three-dimensional FE model of the Nanjing Yangtze River Bridge (NYRB) was established with the ANSYS program and updated by modifying some design parameters. To further validate the updated FE model, the analytical stress-time history responses of main members induced by a moving train were compared with the measured ones. The results show that the relative error of the maximum stress is 2.49% and the minimum relative coefficient of the analytical stress-time history responses is 0.793. The updated model shows good agreement between the calculated and the measured data, and provides a current baseline FE model for long-term health monitoring and condition assessment of the NYRB. At the same time, the stress-time history responses validate the method as feasible and practical for railway steel bridge model updating.
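The sensitivity-based parameter-modification step of such an updating method can be illustrated on a single-degree-of-freedom idealization: iterate on a stiffness parameter until the model's natural frequency matches the measured one. A hedged sketch (the mass-spring model and all values are illustrative, not NYRB parameters):

```python
from math import pi, sqrt

def natural_freq(k, m):
    """Natural frequency (Hz) of a single-DOF mass-spring model."""
    return sqrt(k / m) / (2 * pi)

def update_stiffness(k0, m, f_measured, tol=1e-9, max_iter=50):
    """Sensitivity (Newton) iteration on stiffness k so that the model
    frequency matches the measured one: k <- k - residual / (df/dk)."""
    k = k0
    for _ in range(max_iter):
        r = natural_freq(k, m) - f_measured   # frequency residual
        if abs(r) < tol:
            break
        sens = natural_freq(k, m) / (2 * k)   # df/dk, since f is prop. to sqrt(k)
        k -= r / sens
    return k

m, k_true = 1000.0, 4.0e6                     # kg, N/m (illustrative)
f_meas = natural_freq(k_true, m)              # stands in for a measured frequency
k_updated = update_stiffness(2.5e6, m, f_meas)
```

Real FE updating does the same thing with a vector of parameters and a sensitivity matrix of many modal residuals, but each entry of that matrix plays exactly the role of `sens` here.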

  7. A Procedural Model for Process Improvement Projects

    OpenAIRE

    Kreimeyer, Matthias; Daniilidis, Charampos; Lindemann, Udo

    2017-01-01

    Process improvement projects are of a complex nature. It is therefore necessary to use experience and knowledge gained in previous projects when executing a new project. Yet there are few pragmatic planning aids, and transferring institutional knowledge from one project to the next is difficult. This paper proposes a procedural model that extends common models for project planning to enable staff on a process improvement project to adequately plan their projects, enabling them to document ...

  8. Implicit Value Updating Explains Transitive Inference Performance: The Betasort Model.

    Directory of Open Access Journals (Sweden)

    Greg Jensen

    Transitive inference (the ability to infer that B > D given that B > C and C > D) is a widespread characteristic of serial learning, observed in dozens of species. Despite these robust behavioral effects, reinforcement learning models reliant on reward prediction error or associative strength routinely fail to perform these inferences. We propose an algorithm called betasort, inspired by cognitive processes, which performs transitive inference at low computational cost. This is accomplished by (1) representing stimulus positions along a unit span using beta distributions, (2) treating positive and negative feedback asymmetrically, and (3) updating the position of every stimulus during every trial, whether that stimulus was visible or not. Performance was compared for rhesus macaques, humans, and the betasort algorithm, as well as Q-learning, an established reward-prediction error (RPE) model. Of these, only Q-learning failed to respond above chance during critical test trials. Betasort's success (when compared to RPE models) and its computational efficiency (when compared to full Markov decision process implementations) suggest that the study of reinforcement learning in organisms will be best served by a feature-driven approach to comparing formal models.
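A simplified sketch in the spirit of points (1)-(3): stimulus positions as means of beta distributions, an explicit update of the presented pair, and an implicit update of non-presented stimuli. This is not the published betasort algorithm; the update rule, constants, and training schedule are invented for illustration.

```python
def position(ab):
    """Mean of a Beta(a, b) distribution: the stimulus's estimated position."""
    a, b = ab
    return a / (a + b)

def train_pair(params, hi, lo):
    """Feedback 'hi > lo': explicit update of the presented pair, plus an
    implicit update nudging every non-presented stimulus to stay on the
    side of the pair it already occupies (simplified from the paper's idea)."""
    pos = [position(p) for p in params]
    top, bot = max(pos[hi], pos[lo]), min(pos[hi], pos[lo])
    params[hi][0] += 1           # positive evidence for the higher item
    params[lo][1] += 1           # negative evidence for the lower item
    for z, p in enumerate(params):
        if z in (hi, lo):
            continue
        if pos[z] > top:
            p[0] += 1            # implicitly push items above the pair up
        elif pos[z] < bot:
            p[1] += 1            # and items below the pair down

# Five stimuli A > B > C > D > E; train only adjacent pairs, never A vs E
params = [[1.0, 1.0] for _ in range(5)]
for _ in range(10):
    for i in range(4):
        train_pair(params, i, i + 1)
positions = [position(p) for p in params]
```

Even though the end items are never compared directly, their estimated positions separate, which is the behavioral signature of transitive inference the paper's full algorithm reproduces.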

  9. Methods for the Update and Verification of Forest Surface Model

    Science.gov (United States)

    Rybansky, M.; Brenova, M.; Zerzan, P.; Simon, J.; Mikita, T.

    2016-06-01

    The digital terrain model (DTM) represents the bare-ground earth's surface without any objects such as vegetation and buildings. In contrast to a DTM, a digital surface model (DSM) represents the earth's surface including all objects on it. The DTM mostly does not change as frequently as the DSM. The most important changes of the DSM occur in forest areas due to vegetation growth. Using LIDAR technology, the canopy height model (CHM) is obtained by subtracting the DTM from the corresponding DSM. The DSM is calculated from the first pulse echo and the DTM from the last pulse echo data. The main problem in using DSM and CHM data is the currency of the airborne laser scanning. This paper describes a method of calculating changes in the CHM and DSM data using the relations between canopy height and tree age. To obtain a present basic reference data model of the canopy height, photogrammetric and trigonometric measurements of single trees were used. By comparing the heights of corresponding trees on aerial photographs of various ages, statistical sets of the tree growth rate were obtained. These statistical data and LIDAR data were compared with the growth curve of a spruce forest corresponding to a similar natural environment (soil quality, climate characteristics, geographic location, etc.) to obtain the updating characteristics.
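The CHM computation described here (DSM minus DTM per grid cell) can be sketched directly; the grid values are invented, and small negative differences from sensor or interpolation noise are clamped to ground level:

```python
def canopy_height_model(dsm, dtm, min_height=0.0):
    """CHM = DSM - DTM per grid cell; small negative differences
    (sensor/interpolation noise) are clamped to ground level."""
    return [[max(s - t, min_height) for s, t in zip(srow, trow)]
            for srow, trow in zip(dsm, dtm)]

dtm = [[100.0, 101.0], [102.0, 103.0]]   # bare-earth elevations (m), last echo
dsm = [[118.5, 101.0], [121.7, 102.9]]   # surface elevations (m), first echo
chm = canopy_height_model(dsm, dtm)      # canopy heights above ground (m)
```

Cells where the first and last echoes coincide (open ground) yield zero canopy height; forested cells yield the tree height, which is what the paper's growth-curve comparison operates on.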

  10. Update rules and interevent time distributions: slow ordering versus no ordering in the voter model.

    Science.gov (United States)

    Fernández-Gracia, J; Eguíluz, V M; San Miguel, M

    2011-07-01

    We introduce a general methodology of update rules accounting for arbitrary interevent time (IET) distributions in simulations of interacting agents. We consider in particular update rules that depend on the state of the agent, so that the update becomes part of the dynamical model. As an illustration we consider the voter model in fully connected, random, and scale-free networks with an activation probability inversely proportional to the time since the last action, where an action can be an update attempt (an exogenous update) or a change of state (an endogenous update). We find that in the thermodynamic limit, at variance with standard updates and the exogenous update, the system orders slowly for the endogenous update. The approach to the absorbing state is characterized by a power-law decay of the density of interfaces, observing that the mean time to reach the absorbing state might not be well defined. The IET distributions resulting from both update schemes show power-law tails.
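A hedged sketch of the endogenous update rule on a complete graph: a node's activation probability decays as the inverse of the time since its last change of state. The system size, step count, and stopping rule are illustrative, and this toy version makes no claim about the scaling results reported above.

```python
import random

def voter_model_endogenous(n=50, steps=20000, seed=1):
    """Voter model on a complete graph with an endogenous update rule:
    a node's activation probability is 1/(time since its last change of
    state + 1), so 'persistent' nodes become progressively harder to activate."""
    rng = random.Random(seed)
    state = [rng.randint(0, 1) for _ in range(n)]
    last_change = [0] * n
    for t in range(1, steps + 1):
        i = rng.randrange(n)
        if rng.random() < 1.0 / (t - last_change[i] + 1):   # activation test
            j = rng.randrange(n)            # random neighbour (complete graph)
            if j != i and state[j] != state[i]:
                state[i] = state[j]         # copy; an endogenous event
                last_change[i] = t
        if len(set(state)) == 1:            # absorbing (consensus) state
            break
    return state

final = voter_model_endogenous()
```

Resetting the clock only on a change of state is what distinguishes the endogenous rule from the exogenous one, where every update attempt resets it.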

  11. Update rules and interevent time distributions: Slow ordering versus no ordering in the voter model

    Science.gov (United States)

    Fernández-Gracia, J.; Eguíluz, V. M.; San Miguel, M.

    2011-07-01

    We introduce a general methodology of update rules accounting for arbitrary interevent time (IET) distributions in simulations of interacting agents. We consider in particular update rules that depend on the state of the agent, so that the update becomes part of the dynamical model. As an illustration we consider the voter model in fully connected, random, and scale-free networks with an activation probability inversely proportional to the time since the last action, where an action can be an update attempt (an exogenous update) or a change of state (an endogenous update). We find that in the thermodynamic limit, at variance with standard updates and the exogenous update, the system orders slowly for the endogenous update. The approach to the absorbing state is characterized by a power-law decay of the density of interfaces, observing that the mean time to reach the absorbing state might not be well defined. The IET distributions resulting from both update schemes show power-law tails.

  12. 77 FR 55466 - Environmental Impact Statement for Short Range-Projects and Update of the Real Property Master...

    Science.gov (United States)

    2012-09-10

    ... Property Master Plan for Fort Belvoir, VA AGENCY: Department of the Army, DoD. ACTION: Notice of Intent... proposed short-range improvement projects and the proposed update of the Real Property Master Plan (RPMP... Master Plan (as amended in the 2007 BRAC EIS) would remain in effect. Other reasonable...

  13. Passive Remote Sensing of Oceanic Whitecaps: Updated Geophysical Model Function

    Science.gov (United States)

    Anguelova, M. D.; Bettenhausen, M. H.; Johnston, W.; Gaiser, P. W.

    2016-12-01

    Many air-sea interaction processes are quantified in terms of whitecap fraction W because oceanic whitecaps are the most visible and direct way of observing the breaking of wind waves in the open ocean. Enhanced by breaking waves, surface fluxes of momentum, heat, and mass are critical for ocean-atmosphere coupling and thus affect the accuracy of models used to forecast weather, predict storm intensification, and study climate change. Whitecap fraction has traditionally been measured from photographs or video images collected from towers, ships, and aircraft. Satellite-based passive remote sensing of whitecap fraction is a recent development that allows long-term, consistent observations of whitecapping on a global scale. The method relies on changes of ocean surface emissivity at microwave frequencies (e.g., 6 to 37 GHz) due to the presence of sea foam on a rough sea surface. These changes at the ocean surface are observed from the satellite as brightness temperature TB. A year-long W database built with this algorithm has proven useful in analyzing and quantifying the variability of W, as well as in estimating fluxes of CO2 and sea spray production. The algorithm to obtain W from satellite observations of TB was developed at the Naval Research Laboratory within the framework of the WindSat mission. The W(TB) algorithm estimates W by minimizing the differences between measured and modeled TB data. A geophysical model function (GMF) calculates TB at the top of the atmosphere as contributions from the atmosphere and the ocean surface. The ocean surface emissivity combines the emissivity of the rough sea surface and the emissivity of areas covered with foam. Wind speed and direction, sea surface temperature, water vapor, and cloud liquid water are inputs to the atmospheric, roughness, and foam models comprising the GMF. The W(TB) algorithm has recently been updated to use new sources and products for the input variables. We present a new version of the W(TB) algorithm that uses these updated input sources and products.
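Ignoring the atmospheric terms of the GMF, the surface contribution is a foam-fraction mixture of rough-surface and foam signals, so a single-channel retrieval reduces to inverting a linear mixing model. A simplified sketch (not the operational NRL algorithm; brightness temperatures are invented):

```python
def whitecap_fraction(tb_measured, tb_rough, tb_foam):
    """Invert the linear mixing model
        TB = (1 - W) * TB_rough + W * TB_foam
    for the whitecap fraction W, clipped to the physical range [0, 1]."""
    w = (tb_measured - tb_rough) / (tb_foam - tb_rough)
    return min(max(w, 0.0), 1.0)

# Illustrative surface brightness temperatures (K) at one channel
tb_rough, tb_foam = 110.0, 210.0
w = whitecap_fraction(114.0, tb_rough, tb_foam)
```

Because foam is much more emissive than a foam-free rough surface, even a few Kelvin of excess brightness temperature maps to a small but measurable whitecap fraction; the operational algorithm does this jointly over channels while also modeling the atmosphere.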

  14. An Update to the NASA Reference Solar Sail Thrust Model

    Science.gov (United States)

    Heaton, Andrew F.; Artusio-Glimpse, Alexandra B.

    2015-01-01

    An optical model of solar sail material originally derived at JPL in 1978 has since served as the de facto standard for NASA and other solar sail researchers. The optical model includes terms for specular and diffuse reflection, thermal emission, and non-Lambertian diffuse reflection. The standard coefficients for these terms are based on tests of 2.5 micrometer Kapton sail material coated with 100 nm of aluminum on the front side and chromium on the back side. The original derivation of these coefficients was documented in an internal JPL technical memorandum that is no longer available. Additionally more recent optical testing has taken place and different materials have been used or are under consideration by various researchers for solar sails. Here, where possible, we re-derive the optical coefficients from the 1978 model and update them to accommodate newer test results and sail material. The source of the commonly used value for the front side non-Lambertian coefficient is not clear, so we investigate that coefficient in detail. Although this research is primarily designed to support the upcoming NASA NEA Scout and Lunar Flashlight solar sail missions, the results are also of interest to the wider solar sail community.

  15. Real time hybrid simulation with online model updating: An analysis of accuracy

    Science.gov (United States)

    Ou, Ge; Dyke, Shirley J.; Prakash, Arun

    2017-02-01

    In conventional hybrid simulation (HS) and real time hybrid simulation (RTHS) applications, the information exchanged between the experimental substructure and numerical substructure is typically restricted to the interface boundary conditions (force, displacement, acceleration, etc.). With additional demands being placed on RTHS and recent advances in recursive system identification techniques, an opportunity arises to improve the fidelity by extracting information from the experimental substructure. Online model updating algorithms enable the numerical models of components similar to the physical specimen (herein named the target model) to be modified accordingly. This manuscript demonstrates the power of integrating a model updating algorithm into RTHS (RTHSMU) and explores the possible challenges of this approach through a practical simulation. Two Bouc-Wen models with varying levels of complexity are used as target models to validate the concept and evaluate the performance of this approach. The constrained unscented Kalman filter (CUKF) is selected for use in the model updating algorithm. The accuracy of RTHSMU is evaluated through an estimation output error indicator, a model updating output error indicator, and a system identification error indicator. The results illustrate that, under applicable constraints, by integrating model updating into RTHS, the global response accuracy can be improved when the target model is unknown. A discussion on model updating parameter sensitivity to updating accuracy is also presented to provide guidance for potential users.
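
    The Bouc-Wen target models mentioned above can be illustrated in a few lines: a generic single-variable Bouc-Wen hysteresis law integrated with explicit Euler. The parameter values and the sinusoidal input are illustrative only; this is not the specific models or the CUKF of the study.

```python
import math

# Generic Bouc-Wen hysteresis: dz/dt = A*dx - beta*|dx|*|z|^(n-1)*z - gamma*dx*|z|^n
# Parameters are illustrative; with A=1 and beta+gamma=1, |z| saturates near 1.
A, beta, gamma, n = 1.0, 0.5, 0.5, 1.0

def bouc_wen_z(x_of_t, dt, steps):
    """Integrate the hysteretic variable z for displacement history x_of_t."""
    z, zs = 0.0, []
    x_prev = x_of_t(0.0)
    for k in range(1, steps + 1):
        x = x_of_t(k * dt)
        dx = x - x_prev
        z += A * dx - beta * abs(dx) * abs(z) ** (n - 1) * z - gamma * dx * abs(z) ** n
        x_prev = x
        zs.append(z)
    return zs

# Sinusoidal displacement input; the hysteretic variable stays bounded.
zs = bouc_wen_z(lambda t: 2.0 * math.sin(2.0 * math.pi * t), 1e-3, 5000)
zmax = max(abs(z) for z in zs)
```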

  16. A proposed model for construction project management ...

    African Journals Online (AJOL)

    The lack of a proper communication skills model for project management may ... done to identify the most important project management communication skills and applications of communication that effective project managers should possess.

  17. Prediction error, ketamine and psychosis: An updated model.

    Science.gov (United States)

    Corlett, Philip R; Honey, Garry D; Fletcher, Paul C

    2016-11-01

    In 2007, we proposed an explanation of delusion formation as aberrant prediction error-driven associative learning. Further, we argued that the NMDA receptor antagonist ketamine provided a good model for this process. Subsequently, we validated the model in patients with psychosis, relating aberrant prediction error signals to delusion severity. During the ensuing period, we have developed these ideas, drawing on the simple principle that brains build a model of the world and refine it by minimising prediction errors, as well as using it to guide perceptual inferences. While previously we focused on the prediction error signal per se, an updated view takes into account its precision, as well as the precision of prior expectations. With this expanded perspective, we see several possible routes to psychotic symptoms - which may explain the heterogeneity of psychotic illness, as well as the fact that other drugs, with different pharmacological actions, can produce psychotomimetic effects. In this article, we review the basic principles of this model and highlight specific ways in which prediction errors can be perturbed, in particular considering the reliability and uncertainty of predictions. The expanded model explains hallucinations as perturbations of the uncertainty mediated balance between expectation and prediction error. Here, expectations dominate and create perceptions by suppressing or ignoring actual inputs. Negative symptoms may arise due to poor reliability of predictions in service of action. By mapping from biology to belief and perception, the account proffers new explanations of psychosis. However, challenges remain. We attempt to address some of these concerns and suggest future directions, incorporating other symptoms into the model, building towards better understanding of psychosis. © The Author(s) 2016.

  18. The Cosmetics Europe strategy for animal-free genotoxicity testing: project status up-date.

    Science.gov (United States)

    Pfuhler, S; Fautz, R; Ouedraogo, G; Latil, A; Kenny, J; Moore, C; Diembeck, W; Hewitt, N J; Reisinger, K; Barroso, J

    2014-02-01

    The Cosmetics Europe (formerly COLIPA) Genotoxicity Task Force has driven and funded three projects to help address the high rate of misleading positives in in vitro genotoxicity tests: The completed "False Positives" project optimized current mammalian cell assays and showed that the predictive capacity of the in vitro micronucleus assay was improved dramatically by selecting more relevant cells and more sensitive toxicity measures. The on-going "3D skin model" project has been developed and is now validating the use of human reconstructed skin (RS) models in combination with the micronucleus (MN) and Comet assays. These models better reflect the in use conditions of dermally applied products, such as cosmetics. Both assays have demonstrated good inter- and intra-laboratory reproducibility and are entering validation stages. The completed "Metabolism" project investigated enzyme capacities of human skin and RS models. The RS models were shown to have comparable metabolic capacity to native human skin, confirming their usefulness for testing of compounds with dermal exposure. The program has already helped to improve the initial test battery predictivity and the RS projects have provided sound support for their use as a follow-up test in the assessment of the genotoxic hazard of cosmetic ingredients in the absence of in vivo data. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Network inference using asynchronously updated kinetic Ising Model

    CERN Document Server

    Zeng, Hong-Li; Alava, Mikko; Mahmoudi, Hamed

    2010-01-01

    Network structures are reconstructed from dynamical data using the naive mean field (nMF) and Thouless-Anderson-Palmer (TAP) approximations. For the TAP approximation, we use two methods to reconstruct the network: (a) an iteration method; (b) casting the inference formula into a set of cubic equations and solving it directly. We investigate inference of the asymmetric Sherrington-Kirkpatrick (S-K) model using asynchronous updates. The solutions of the set of cubic equations depend on the temperature T of the S-K model, and a critical temperature Tc is found around 2.1. For T > Tc there is a single real root, while for T < Tc there are three real roots. The iteration method is convergent only if the cubic equations have three real solutions. The two methods give the same results when the iteration method is convergent. Compared to nMF, TAP is somewhat better at low temperatures, but approaches the same performance as the temperature increases. Both methods behave better for longer data lengths, and the improvement is more pronounced for TAP.
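
    The root-counting behaviour at Tc can be illustrated with a toy cubic whose number of real roots changes at a critical parameter value. The cubic below is an illustrative stand-in, not the actual TAP inference equation.

```python
import numpy as np

# Toy cubic x^3 - 3x + T: it has three real roots for |T| < 2 and a single
# real root for |T| > 2, mirroring the root-count transition described above.
def real_root_count(T, tol=1e-8):
    roots = np.roots([1.0, 0.0, -3.0, T])   # coefficients of x^3 - 3x + T
    return int(np.sum(np.abs(roots.imag) < tol))

low_T = real_root_count(1.0)    # below the critical value: three real roots
high_T = real_root_count(4.0)   # above it: a single real root
```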

  20. Model updating of rotor systems by using Nonlinear least square optimization

    Science.gov (United States)

    Jha, A. K.; Dewangan, P.; Sarangi, M.

    2016-07-01

    Mathematical models of structures or machinery always differ from the existing physical system, because numerical predictions of a physical system's behavior are limited by the assumptions used in developing the mathematical model. Model updating is therefore necessary so that the updated model replicates the physical system. This work focuses on the model updating of rotor systems at various speeds as well as at different modes of vibration. Support bearing characteristics severely influence the dynamics of rotor systems like turbines, compressors, pumps, electrical machines, machine tool spindles, etc. Bearing parameters (stiffness and damping) are therefore taken as the updating parameters. A finite element model of the rotor system is developed using Timoshenko beam elements. The unbalance response in the time domain and the frequency response function have been calculated by numerical techniques and compared with the experimental data to update the FE model of the rotor system. An algorithm based on the unbalance response in the time domain is proposed for updating the rotor system at different running speeds. An Unbalance Response Assurance Criterion (URAC) is defined to check the degree of correlation between the updated FE model and the physical model.

  1. "Updates to Model Algorithms & Inputs for the Biogenic Emissions Inventory System (BEIS) Model"

    Science.gov (United States)

    We have developed new canopy emission algorithms and land use data for BEIS. Simulations with BEIS v3.4 and these updates in CMAQ v5.0.2 are compared to the Model of Emissions of Gases and Aerosols from Nature (MEGAN) and evaluated against observatio...

  2. Finite-element-model updating using computational intelligence techniques applications to structural dynamics

    CERN Document Server

    Marwala, Tshilidzi

    2010-01-01

    Finite element models (FEMs) are widely used to understand the dynamic behaviour of various systems. FEM updating allows FEMs to be tuned to better reflect measured data and may be conducted using two different statistical frameworks: the maximum likelihood approach and Bayesian approaches. Finite Element Model Updating Using Computational Intelligence Techniques applies both strategies to the field of structural mechanics, an area vital for aerospace, civil and mechanical engineering. Vibration data are used for the updating process. Following an introduction, a number of computational intelligence techniques to facilitate the updating process are proposed; they include: • multi-layer perceptron neural networks for real-time FEM updating; • particle swarm and genetic-algorithm-based optimization methods to accommodate the demands of global versus local optimization models; • simulated annealing to place the methodologies on a sound statistical basis; and • response surface methods and expectation m...

  3. Development of a cyber-physical experimental platform for real-time dynamic model updating

    Science.gov (United States)

    Song, Wei; Dyke, Shirley

    2013-05-01

    Model updating procedures are traditionally performed off-line. With the significant recent advances in embedded systems and the related real-time computing capabilities, online or real-time, model updating can be performed to inform decision making and controller actions. The applications for real-time model updating are mainly in the areas of (i) condition diagnosis and prognosis of engineering systems; and (ii) control systems that benefit from accurate modeling of the system plant. Herein, the development of a cyber-physical real-time model updating experimental platform, including real-time computing environment, model updating algorithm, hardware architecture and testbed, is described. Results from two challenging experimental implementations are presented to illustrate the performance of this cyber-physical platform in achieving the goal of updating nonlinear systems in real-time. The experiments consider typical nonlinear engineering systems that exhibit hysteresis. Among the available algorithms capable of identification of such complex nonlinearities, the unscented Kalman filter (UKF) is selected for these experiments as an effective method to update nonlinear dynamic system models under realistic conditions. The implementation of the platform is discussed for successful completion of these experiments, including required timing constraints and overall evaluation of the system.

  4. Systemic change increases model projection uncertainty

    Science.gov (United States)

    Verstegen, Judith; Karssenberg, Derek; van der Hilst, Floor; Faaij, André

    2014-05-01

    Most spatio-temporal models are based on the assumption that the relationship between system state change and its explanatory processes is stationary. This means that model structure and parameterization are usually kept constant over time, ignoring potential systemic changes in this relationship resulting from e.g., climatic or societal changes, thereby overlooking a source of uncertainty. We define systemic change as a change in the system indicated by a system state change that cannot be simulated using a constant model structure. We have developed a method to detect systemic change, using a Bayesian data assimilation technique, the particle filter. The particle filter was used to update the prior knowledge about the model structure. In contrast to the traditional particle filter approach (e.g., Verstegen et al., 2014), we apply the filter separately for each point in time for which observations are available, obtaining the optimal model structure for each of the time periods in between. This allows us to create a time series of the evolution of the model structure. The Runs test (Wald and Wolfowitz, 1940), a stationarity test, is used to check whether variation in this time series can be attributed to randomness or not. If not, this indicates systemic change. The uncertainty that the systemic change adds to the existing model projection uncertainty can be determined by comparing model outcomes of a model with a stationary model structure and a model with a model structure changing according to the variation found in the time series. To test the systemic change detection methodology, we apply it to a land use change cellular automaton (CA) (Verstegen et al., 2012) and use observations of real land use from all years from 2004 to 2012 and associated uncertainty as observational data in the particle filter. A systemic change was detected for the period 2006 to 2008. In this period the influence on the location of sugar cane expansion of the driver sugar cane in
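
    The stationarity check named above, the Wald-Wolfowitz Runs test, is short enough to sketch directly. The normal-approximation form below is the standard one; the binary sequences are illustrative.

```python
import math

# Wald-Wolfowitz runs test: a z-statistic far from zero indicates that the
# sequence of values is not random (too many or too few runs).
def runs_test_z(seq):
    """Normal-approximation z-statistic for the number of runs in a 0/1 sequence."""
    n1 = sum(seq)
    n2 = len(seq) - n1
    runs = 1 + sum(1 for a, b in zip(seq, seq[1:]) if a != b)
    n = n1 + n2
    mu = 2.0 * n1 * n2 / n + 1.0
    var = 2.0 * n1 * n2 * (2.0 * n1 * n2 - n) / (n ** 2 * (n - 1))
    return (runs - mu) / math.sqrt(var)

# A strictly alternating sequence has far more runs than expected by chance,
# while a strongly clustered one has far fewer; both reject randomness.
z_alt = runs_test_z([0, 1] * 20)
z_clump = runs_test_z([0] * 20 + [1] * 20)
```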

  5. Experimental Studies on Finite Element Model Updating for a Heated Beam-Like Structure

    Directory of Open Access Journals (Sweden)

    Kaipeng Sun

    2015-01-01

    An experimental study was made of the identification of time-varying modal parameters and the finite element model updating of a beam-like thermal structure in both steady and unsteady high-temperature environments. An improved time-varying autoregressive method was first proposed to extract the instantaneous natural frequencies of the structure in the unsteady high-temperature environment. Based on the identified modal parameters, a finite element model of the structure was then updated by using a Kriging meta-model and an optimization-based finite element model updating method. The temperature-dependent parameters to be updated were expressed as low-order polynomials of the temperature increase, and the finite element model updating problem was solved by updating several coefficients of the polynomials. The experimental results demonstrated the effectiveness of the time-varying modal parameter identification method and showed that the instantaneous natural frequencies of the updated model tracked the trends of the measured values with high accuracy.

  6. Variance analysis for model updating with a finite element based subspace fitting approach

    Science.gov (United States)

    Gautier, Guillaume; Mevel, Laurent; Mencik, Jean-Mathieu; Serra, Roger; Döhler, Michael

    2017-07-01

    Recently, a subspace fitting approach has been proposed for vibration-based finite element model updating. The approach makes use of subspace-based system identification, where the extended observability matrix is estimated from vibration measurements. Finite element model updating is performed by correlating the model-based observability matrix with the estimated one, by using a single set of experimental data. Hence, the updated finite element model only reflects this single test case. However, estimates from vibration measurements are inherently exposed to uncertainty due to unknown excitation, measurement noise and finite data length. In this paper, a covariance estimation procedure for the updated model parameters is proposed, which propagates the data-related covariance to the updated model parameters by considering a first-order sensitivity analysis. In particular, this propagation is performed through each iteration step of the updating minimization problem, by taking into account the covariance between the updated parameters and the data-related quantities. Simulated vibration signals are used to demonstrate the accuracy and practicability of the derived expressions. Furthermore, an application is shown on experimental data of a beam.

  7. Cassini Spacecraft Uncertainty Analysis Data and Methodology Review and Update/Volume 1: Updated Parameter Uncertainty Models for the Consequence Analysis

    Energy Technology Data Exchange (ETDEWEB)

    WHEELER, TIMOTHY A.; WYSS, GREGORY D.; HARPER, FREDERICK T.

    2000-11-01

    Uncertainty distributions for specific parameters of the Cassini General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) Final Safety Analysis Report consequence risk analysis were revised and updated. The revisions and updates were done for all consequence parameters for which relevant information exists from the joint project on Probabilistic Accident Consequence Uncertainty Analysis by the United States Nuclear Regulatory Commission and the Commission of European Communities.

  8. The psychopharmacology algorithm project at the Harvard South Shore Program: an update on schizophrenia.

    Science.gov (United States)

    Osser, David N; Roudsari, Mohsen Jalali; Manschreck, Theo

    2013-01-01

    This article is an update of the algorithm for schizophrenia from the Psychopharmacology Algorithm Project at the Harvard South Shore Program. A literature review was conducted focusing on new data since the last published version (1999-2001). The first-line treatment recommendation for new-onset schizophrenia is with amisulpride, aripiprazole, risperidone, or ziprasidone for four to six weeks. In some settings the trial could be shorter, considering that evidence of clear improvement with antipsychotics usually occurs within the first two weeks. If the trial of the first antipsychotic cannot be completed due to intolerance, try another until one of the four is tolerated and given an adequate trial. There should be evidence of bioavailability. If the response to this adequate trial is unsatisfactory, try a second monotherapy. If the response to this second adequate trial is also unsatisfactory, and if at least one of the first two trials was with risperidone, olanzapine, or a first-generation (typical) antipsychotic, then clozapine is recommended for the third trial. If neither trial was with any of these three options, a third trial prior to clozapine should occur, using one of those three. If the response to monotherapy with clozapine (with dose adjusted by using plasma levels) is unsatisfactory, consider adding risperidone, lamotrigine, or ECT. Beyond that point, there is little solid evidence to support further psychopharmacological treatment choices, though we do review possible options.

  9. Progress update of NASA's free-piston Stirling space power converter technology project

    Science.gov (United States)

    Dudenhoefer, James E.; Winter, Jerry M.; Alger, Donald

    1992-01-01

    A progress update is presented of the NASA LeRC Free-Piston Stirling Space Power Converter Technology Project. This work is being conducted under NASA's Civil Space Technology Initiative (CSTI). The goal of the CSTI High Capacity Power Element is to develop the technology base needed to meet the long duration, high capacity power requirements for future NASA space initiatives. Efforts are focused upon increasing system power output and system thermal and electric energy conversion efficiency at least five fold over current SP-100 technology, and on achieving systems that are compatible with space nuclear reactors. This paper will discuss progress toward 1050 K Stirling Space Power Converters. Fabrication is nearly completed for the 1050 K Component Test Power Converter (CTPC); results of motoring tests of the cold end (525 K) are presented. The success of these and future designs is dependent upon supporting research and technology efforts including heat pipes, bearings, superalloy joining technologies, high efficiency alternators, life and reliability testing, and predictive methodologies. This paper will compare progress in significant areas of component development from the start of the program with the Space Power Development Engine (SPDE) to the present work on the CTPC.

  10. Update rules and interevent time distributions: Slow ordering vs. no ordering in the Voter Model

    CERN Document Server

    Fernández-Gracia, Juan; Miguel, M San

    2011-01-01

    We introduce a general methodology of update rules accounting for arbitrary interevent time distributions in simulations of interacting agents. In particular, we consider update rules that depend on the state of the agent, so that the update becomes part of the dynamical model. As an illustration, we consider the voter model in fully connected, random, and scale-free networks with an update probability inversely proportional to the persistence, that is, the time since the last event. We find that in the thermodynamic limit, at variance with standard updates, the system orders slowly. The approach to the absorbing state is characterized by a power-law decay of the density of interfaces, and we observe that the mean time to reach the absorbing state might not be well defined.
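
    The persistence-dependent update rule described above can be sketched as follows: at each sweep an agent updates with probability inversely proportional to the time since its last state change, then copies a randomly chosen agent on a fully connected graph. System size and sweep count are illustrative.

```python
import random

# Voter model on a complete graph with update probability 1/persistence.
# Interface density = fraction of disagreeing pairs; it decays as the
# system orders (slowly, because long-persistent agents rarely update).
random.seed(0)
N, sweeps = 200, 400
state = [random.choice([-1, 1]) for _ in range(N)]
persistence = [1] * N

def interface_density(s):
    """Fraction of disagreeing pairs on the fully connected graph."""
    up = s.count(1)
    return up * (len(s) - up) / (len(s) * (len(s) - 1) / 2)

rho0 = interface_density(state)
for _ in range(sweeps):
    for i in range(N):
        persistence[i] += 1
        if random.random() < 1.0 / persistence[i]:
            new = state[random.randrange(N)]   # copy a random agent's state
            if new != state[i]:
                state[i] = new
                persistence[i] = 1             # reset the persistence clock
rho_final = interface_density(state)
```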

  11. Spread and Quote-Update Frequency of the Limit-Order Driven Sergei Maslov Model

    Institute of Scientific and Technical Information of China (English)

    QIU Tian; CHEN Guang

    2007-01-01

    We perform numerical simulations of the limit-order driven Sergei Maslov (SM) model and investigate the probability distribution and autocorrelation function of the bid-ask spread S and the quote-update frequency U. For the probability distribution, the model successfully reproduces the power-law decay of the spread and the exponential decay of the quote-update frequency. For the autocorrelation function, both the spread and the quote-update frequency of the model decay by a power law, which is consistent with empirical studies. We obtain a power-law exponent of 0.54 for the spread, which is in good agreement with the real financial market.

  12. Modelling of Transport Projects Uncertainties

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2009-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests to combine the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits and underestimating...... investment costs, with a quantitative risk analysis based on Monte Carlo simulation and to make use of a set of exploratory scenarios. The analysis is carried out by using the CBA-DK model representing the Danish standard approach to socio-economic cost-benefit analysis. Specifically, the paper proposes......-based graphs which function as risk-related decision support for the appraised transport infrastructure project....
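
    The combination of Optimism Bias and Monte Carlo simulation described above can be sketched as follows; the distributions, the 20% cost uplift, and the monetary figures are illustrative assumptions, not the CBA-DK model's actual inputs.

```python
import random

# Monte Carlo sketch of a risk-adjusted cost-benefit appraisal: benefits and
# costs are drawn from assumed distributions, with an Optimism Bias uplift
# applied to investment costs, and the benefit-cost ratio (BCR) is recorded.
random.seed(1)

def simulate_bcr(n=20000, uplift=1.2):
    """Return simulated benefit-cost ratios with an uplift on costs."""
    ratios = []
    for _ in range(n):
        benefits = random.gauss(120.0, 20.0)                     # PV of benefits
        costs = uplift * random.triangular(80.0, 130.0, 100.0)   # PV of costs
        ratios.append(benefits / costs)
    return ratios

bcr = simulate_bcr()
p_viable = sum(r > 1.0 for r in bcr) / len(bcr)   # share of runs with BCR > 1
```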

  13. Space market model development project

    Science.gov (United States)

    Bishop, Peter C.

    1987-01-01

    The objectives of the research program, Space Market Model Development Project, (Phase 1) were: (1) to study the need for business information in the commercial development of space; and (2) to propose a design for an information system to meet the identified needs. Three simultaneous research strategies were used in proceeding toward this goal: (1) to describe the space business information which currently exists; (2) to survey government and business representatives on the information they would like to have; and (3) to investigate the feasibility of generating new economical information about the space industry.

  14. Research on the iterative method for model updating based on the frequency response function

    Institute of Scientific and Technical Information of China (English)

    Wei-Ming Li; Jia-Zhen Hong

    2012-01-01

    A model reduction technique is usually employed in the model updating process. In this paper, a new model updating method, named the cross-model cross-frequency response function (CMCF) method, is proposed, and a new iterative method associating the model updating method with the model reduction technique is investigated. The new model updating method utilizes the frequency response function to avoid the modal analysis process, and it does not need to pair or scale the measured and the analytical frequency response functions, which greatly increases the number of equations and of updating parameters. Based on the traditional iterative method, a correction term related to the errors resulting from the replacement of the reduction matrix of the experimental model with that of the finite element model is added in the new iterative method. Comparisons between the traditional iterative method and the proposed iterative method are shown through model updating examples of solar panels; both iterative methods combine the CMCF method with the succession-level approximate reduction technique. The results show the effectiveness of the CMCF method and the proposed iterative method.

  15. Execution model for limited ambiguity rules and its application to derived data update

    Energy Technology Data Exchange (ETDEWEB)

    Chen, I.M.A. [Lawrence Berkeley National Lab., CA (United States); Hull, R. [Univ. of Colorado, Boulder, CO (United States); McLeod, D. [Univ. of Southern California, Los Angeles, CA (United States)

    1995-12-01

    A novel execution model for rule application in active databases is developed and applied to the problem of updating derived data in a database represented using a semantic, object-based database model. The execution model is based on the use of "limited ambiguity rules" (LARs), which permit disjunction in rule actions. The execution model essentially performs a breadth-first exploration of alternative extensions of a user-requested update. Given an object-based database scheme, both integrity constraints and specifications of derived classes and attributes are compiled into a family of limited ambiguity rules. A theoretical analysis shows that the approach is sound: the execution model returns all valid "completions" of a user-requested update, or terminates with an appropriate error notification. The complexity of the approach in connection with derived data update is considered. 42 refs., 10 figs., 3 tabs.
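
    The breadth-first exploration of disjunctive rule actions can be sketched with a toy rule engine. The dict-based database and the salary/grade rule below are hypothetical illustrations, not the paper's semantic object-based formalism.

```python
from collections import deque

# Breadth-first exploration of alternative completions of an update under
# rules with disjunctive actions: each firing rule branches the search on
# every alternative; states where no rule fires are valid completions.
def complete(update, rules, max_depth=10):
    """Return all consistent completions reachable from the requested update."""
    seen, valid = set(), []
    queue = deque([(frozenset(update.items()), 0)])
    while queue:
        state, depth = queue.popleft()
        if state in seen or depth > max_depth:
            continue
        seen.add(state)
        fired = False
        for condition, alternatives in rules:
            db = dict(state)
            if condition(db):
                fired = True
                for action in alternatives:       # disjunction: branch on each
                    new = dict(db)
                    action(new)
                    queue.append((frozenset(new.items()), depth + 1))
                break
        if not fired:
            valid.append(dict(state))
    return valid

# Hypothetical rule: if salary exceeds 50, the grade must become 'A' or 'B'.
rules = [(lambda db: db.get("salary", 0) > 50 and db.get("grade") not in ("A", "B"),
          [lambda db: db.update(grade="A"), lambda db: db.update(grade="B")])]
completions = complete({"salary": 60, "grade": "C"}, rules)
```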

  16. An inner-outer nonlinear programming approach for constrained quadratic matrix model updating

    Science.gov (United States)

    Andretta, M.; Birgin, E. G.; Raydan, M.

    2016-01-01

    The Quadratic Finite Element Model Updating Problem (QFEMUP) is concerned with updating a symmetric second-order finite element model so that it remains symmetric and the updated model reproduces a given set of desired eigenvalues and eigenvectors by replacing the corresponding ones from the original model. Taking advantage of the special structure of the constraint set, it is first shown that the QFEMUP can be formulated as a suitable constrained nonlinear programming problem. Using this formulation, a method based on successive optimizations is then proposed and analyzed. To avoid that spurious modes (eigenvectors) appear in the frequency range of interest (eigenvalues) after the model has been updated, additional constraints based on a quadratic Rayleigh quotient are dynamically included in the constraint set. A distinct practical feature of the proposed method is that it can be implemented by computing only a few eigenvalues and eigenvectors of the associated quadratic matrix pencil.

  17. Flow Forecasting in Urban Drainage Systems using Deterministic Updating of Water Levels in Distributed Hydraulic Models

    DEFF Research Database (Denmark)

    Hansen, Lisbeth S.; Borup, Morten; Møller, A.;

    2011-01-01

    … to eliminate some of the unavoidable discrepancies between model and reality. The latter can partly be achieved by using the commercial tool MOUSE UPDATE, which is capable of inserting measured water levels from the system into the distributed, physically based MOUSE model. This study evaluates and documents the performance of the updating procedure for flow forecasting. Measured water levels in combination with rain gauge input are used as the basis for the evaluation. When compared to simulations without updating, the results show that it is possible to obtain an improvement in the 20 minute forecast of the water level…

  18. W-320 Project thermal modeling

    Energy Technology Data Exchange (ETDEWEB)

    Sathyanarayana, K., Fluor Daniel Hanford

    1997-03-18

    This report summarizes the results of thermal analysis performed to provide a technical basis in support of Project W-320 to retrieve by sluicing the sludge in Tank 241-C-106 and to transfer it into Tank 241-AY-102. Prior thermal evaluations in support of Project W-320 safety analysis assumed the availability of 2000 to 3000 CFM, as provided by Tank Farm Operations, for tank floor cooling channels from the secondary ventilation system. As this flow availability has no technical basis, a detailed Tank 241-AY-102 secondary ventilation and floor cooling channel flow model was developed and analyzed. The results of the analysis show that only about 150 cfm of flow reaches the floor cooling channels. A Tank 241-AY-102 thermal evaluation was performed to determine the necessary cooling flow for the floor cooling channels using the W-030 primary ventilation system for different quantities of Tank 241-C-106 sludge transferred into Tank 241-AY-102. These sludge transfers meet different options for the project along with the minimum required modification of the ventilation system. The results of the analysis for the amount of sludge transfer using the current system are also presented. The effects of the sludge fluffing factor, the heat generation rate, and its distribution between supernatant and sludge in Tank 241-AY-102 on the amount of sludge transferred from Tank 241-C-106 were evaluated and the results are discussed. A transient thermal analysis was also performed to estimate the time to reach steady state. For a 2 feet sludge transfer, about 3 months will be required to reach steady state. Therefore, for the purpose of process control, a detailed transient thermal analysis using the GOTH computer code will be required to determine the transient response of the sludge in Tank 241-AY-102. Process control considerations to eliminate the potential for a steam bump during retrieval and storage in Tanks 241-C-106 and 241-AY-102, respectively, are also discussed.

  19. Sharks, Minnows, and Wheelbarrows: Calculus Modeling Projects

    Science.gov (United States)

    Smith, Michael D.

    2011-01-01

    The purpose of this article is to present two very active applied modeling projects that were successfully implemented in a first semester calculus course at Hollins University. The first project uses a logistic equation to model the spread of a new disease such as swine flu. The second project is a human take on the popular article "Do Dogs Know…
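
    The first project's logistic model is simple enough to sketch directly: Euler integration of dI/dt = r·I·(1 − I/K) for the number infected. The growth rate and carrying capacity below are illustrative classroom values.

```python
# Logistic spread of a disease: infections grow exponentially at first, then
# saturate at the carrying capacity K. Explicit Euler integration.
def logistic_spread(i0=1.0, r=0.5, K=1000.0, dt=0.1, days=40.0):
    """Return infected counts at each Euler step, starting from i0."""
    i, history = i0, [i0]
    steps = int(days / dt)
    for _ in range(steps):
        i += dt * r * i * (1.0 - i / K)
        history.append(i)
    return history

infected = logistic_spread()   # approaches K over the simulated period
```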

  20. Active Magnetic Bearing Rotor Model Updating Using Resonance and MAC Error

    Directory of Open Access Journals (Sweden)

    Yuanping Xu

    2015-01-01

    Modern control techniques can improve the performance and robustness of a rotor active magnetic bearing (AMB) system. Since those control methods usually rely on system models, it is important to obtain a precise rotor AMB analytical model. However, the interference fits and shrink effects of the rotor AMB introduce inaccuracy into the final system model. In this paper, an experiment-based model updating method is proposed to improve the accuracy of the finite element (FE) model used in a rotor AMB system. Modelling error is minimized by applying the Nelder-Mead simplex optimization algorithm to properly adjust the FE model parameters. Both the resonance frequency errors and the modal assurance criterion (MAC) errors are minimized simultaneously, to account for the rotor natural frequencies as well as the mode shapes. Verification of the updated rotor model is performed by comparing the experimental and analytical frequency responses. The close agreement demonstrates the effectiveness of the proposed model updating methodology.
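
    The updating scheme described above, Nelder-Mead minimization of combined resonance-frequency and MAC errors, can be sketched on a toy two-degree-of-freedom spring-mass model. The model, the use of scipy, and all numbers are illustrative assumptions, not the paper's rotor AMB system.

```python
import numpy as np
from scipy.optimize import minimize

# Toy FE updating: adjust the two stiffnesses of a 2-DOF chain (unit masses)
# with Nelder-Mead so that natural frequencies and MAC values match a
# "measured" reference generated from known true stiffnesses.
def modes(k):
    K = np.array([[k[0] + k[1], -k[1]], [-k[1], k[1]]])
    w2, phi = np.linalg.eigh(K)          # unit masses: M = I
    return np.sqrt(w2), phi

def mac(a, b):
    """Modal assurance criterion between two mode shape vectors."""
    return (a @ b) ** 2 / ((a @ a) * (b @ b))

k_true = np.array([2.0, 1.0])
w_ref, phi_ref = modes(k_true)

def cost(k):
    if np.any(k <= 0):
        return 1e6                       # keep stiffnesses physical
    w, phi = modes(k)
    freq_err = np.sum((w - w_ref) ** 2)
    mac_err = sum(1.0 - mac(phi[:, i], phi_ref[:, i]) for i in range(2))
    return freq_err + mac_err

res = minimize(cost, x0=[1.0, 2.0], method="Nelder-Mead")
k_updated = res.x                        # converges to the true stiffnesses
```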

  1. Model Updating in Online Aircraft Prognosis Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this research is to develop algorithms for online health monitoring and prognostics (prediction of the remaining life of a component or system) in...

  2. Model Updating in Online Aircraft Prognosis Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Diagnostic and prognostic algorithms for many aircraft subsystems are steadily maturing. Unfortunately there is little experience integrating these technologies into...

  3. Modelling of Transport Projects Uncertainties

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2012-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests combining the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits and underestimating...... investment costs, with a quantitative risk analysis based on Monte Carlo simulation, and making use of a set of exploratory scenarios. The analysis is carried out by using the CBA-DK model representing the Danish standard approach to socio-economic cost-benefit analysis. Specifically, the paper proposes......-based graphs which function as risk-related decision support for the appraised transport infrastructure project. The presentation of RSF is demonstrated by using an appraisal case concerning a new airfield in the capital of Greenland, Nuuk.
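
The combination of an optimism-bias uplift with Monte Carlo simulation can be sketched roughly as follows; the triangular distributions, uplift value, and cost/benefit figures are invented for illustration and are not taken from the CBA-DK model:

```python
import random

random.seed(1)

def simulate_bcr(n_runs=20000, base_benefit=100.0, base_cost=80.0,
                 optimism_bias=0.45):
    """Crude CBA risk simulation: costs get an uplift for the historical
    optimism bias, then benefits and costs are drawn from triangular
    distributions and the benefit-cost ratio (BCR) is recorded."""
    cost = base_cost * (1.0 + optimism_bias)   # optimism-bias uplift on costs
    bcrs = []
    for _ in range(n_runs):
        b = random.triangular(0.7 * base_benefit, 1.1 * base_benefit, base_benefit)
        c = random.triangular(0.9 * cost, 1.4 * cost, cost)
        bcrs.append(b / c)
    return bcrs

bcrs = simulate_bcr()
p_viable = sum(1 for r in bcrs if r > 1.0) / len(bcrs)
print(round(p_viable, 2))  # share of simulated runs with BCR > 1
```

The resulting distribution of BCR values is what the paper's risk-related decision-support graphs would be drawn from.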

  4. Flow Forecasting using Deterministic Updating of Water Levels in Distributed Hydrodynamic Urban Drainage Models

    Directory of Open Access Journals (Sweden)

    Lisbet Sneftrup Hansen

    2014-07-01

    Full Text Available There is a growing requirement to generate more precise model simulations and forecasts of flows in urban drainage systems in both offline and online situations. Data assimilation tools are hence needed to make it possible to include system measurements in distributed, physically-based urban drainage models and reduce a number of unavoidable discrepancies between the model and reality. The latter can be achieved partly by inserting measured water levels from the sewer system into the model. This article describes how deterministic updating of model states in this manner affects a simulation, and then evaluates and documents the performance of this particular updating procedure for flow forecasting. A hypothetical case study and synthetic observations are used to illustrate how the Update method works and affects downstream nodes. A real case study in a 544 ha urban catchment furthermore shows that it is possible to improve the 20-min forecast of water levels in an updated node and the three-hour forecast of flow through a downstream node, compared to simulations without updating. Deterministic water level updating produces better forecasts when implemented in large networks with slow flow dynamics and with measurements from upstream basins that contribute significantly to the flow at the forecast location.
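
A minimal sketch of deterministic state updating, assuming a hypothetical two-reservoir cascade in place of a distributed hydrodynamic model: the upstream state is overwritten with a "measured" level before each step, which corrects the downstream flow even when the rain input is biased:

```python
def step(levels, inflow, k=0.3):
    """One time step of a two-reservoir cascade: outflow = k * level."""
    out1 = k * levels[0]
    out2 = k * levels[1]
    return [levels[0] + inflow - out1, levels[1] + out1 - out2], out2

def run(inflows, measured=None):
    levels, flows = [0.0, 0.0], []
    for t, q in enumerate(inflows):
        if measured is not None:
            levels[0] = measured[t]        # deterministic water level update
        levels, out = step(levels, q)
        flows.append(out)
    return flows

true_in = [5, 8, 12, 9, 4, 2, 1, 0, 0, 0]
model_in = [q * 0.7 for q in true_in]      # biased rain input
truth = run(true_in)

# synthetic "measurements" of the upstream level, taken from the true run
meas, lv = [], [0.0, 0.0]
for q in true_in:
    meas.append(lv[0])
    lv, _ = step(lv, q)

open_loop = run(model_in)
updated = run(model_in, measured=meas)
err = lambda sim: sum(abs(a - b) for a, b in zip(sim, truth))
print(err(updated) < err(open_loop))  # True: updating reduces downstream error
```

As in the article, the benefit is largest downstream of the updated node, because the corrected state feeds the correct outflow into the rest of the network.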

  5. Inherently irrational? A computational model of escalation of commitment as Bayesian Updating.

    Science.gov (United States)

    Gilroy, Shawn P; Hantula, Donald A

    2016-06-01

    Monte Carlo simulations were performed to analyze the degree to which two-, three- and four-step learning histories of losses and gains correlated with escalation and persistence in extended extinction (continuous loss) conditions. Simulated learning histories were randomly generated at varying lengths and compositions, and warranted probabilities were determined using Bayesian updating methods. Bayesian updating predicted instances where particular learning sequences were more likely to engender escalation and persistence under extinction conditions. All simulations revealed greater rates of escalation and persistence in the presence of heterogeneous (e.g., both wins and losses) lag sequences, with substantially increased rates of escalation when lag sequences composed predominantly of losses were followed by wins. These methods were then applied to human investment choices from earlier experiments. The Bayesian updating models corresponded with the data obtained from these experiments. These findings suggest that Bayesian updating can be utilized as a model for understanding how and when individual commitment may escalate and persist despite continued failures.
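
A minimal sketch of the warranted-probability idea, assuming a Beta-Bernoulli model (a common textbook form of Bayesian updating; the paper's exact formulation may differ):

```python
def warranted_prob(history, a=1.0, b=1.0):
    """Posterior mean P(win) after a win/loss history, under a Beta(a, b) prior."""
    wins = history.count("W")
    losses = history.count("L")
    return (a + wins) / (a + b + wins + losses)

# Probability the next trial pays off, as warranted by each learning history
for hist in ["WWW", "WLW", "LLW", "LLL"]:
    print(hist, round(warranted_prob(hist), 2))
```

A history of pure losses ("LLL") warrants the lowest continuation probability, while a mixed history ending in a win ("LLW") still warrants a higher one, which is the mechanism by which heterogeneous lag sequences can sustain escalation under extinction.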

  6. Model updating of complex structures using the combination of component mode synthesis and Kriging predictor.

    Science.gov (United States)

    Liu, Yang; Li, Yan; Wang, Dejun; Zhang, Shaoyi

    2014-01-01

    Updating the structural model of complex structures is time-consuming due to the large size of the finite element model (FEM). Using conventional methods for these cases is computationally expensive or even impossible. A two-level method, which combined the Kriging predictor and the component mode synthesis (CMS) technique, was proposed to ensure the successful implementation of FEM updating for large-scale structures. In the first level, the CMS was applied to build a reasonable condensed FEM of complex structures. In the second level, the Kriging predictor, which was deemed a surrogate FEM in structural dynamics, was generated based on the condensed FEM. Some key issues in the application of the metamodel (surrogate FEM) to FEM updating were also discussed. Finally, the effectiveness of the proposed method was demonstrated by updating the FEM of a real arch bridge with the measured modal parameters.

  7. Model Updating of Complex Structures Using the Combination of Component Mode Synthesis and Kriging Predictor

    Directory of Open Access Journals (Sweden)

    Yang Liu

    2014-01-01

    Full Text Available Updating the structural model of complex structures is time-consuming due to the large size of the finite element model (FEM). Using conventional methods for these cases is computationally expensive or even impossible. A two-level method, which combined the Kriging predictor and the component mode synthesis (CMS) technique, was proposed to ensure the successful implementation of FEM updating for large-scale structures. In the first level, the CMS was applied to build a reasonable condensed FEM of complex structures. In the second level, the Kriging predictor, which was deemed a surrogate FEM in structural dynamics, was generated based on the condensed FEM. Some key issues in the application of the metamodel (surrogate FEM) to FEM updating were also discussed. Finally, the effectiveness of the proposed method was demonstrated by updating the FEM of a real arch bridge with the measured modal parameters.
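
The two-level idea, an expensive condensed model replaced by a cheap surrogate, can be sketched as below. The `condensed_fem` function and the Gaussian-correlation interpolator are simplified stand-ins (no regression trend or nugget term, as a full Kriging predictor would have):

```python
import math

def condensed_fem(k):
    """Stand-in for an expensive condensed-FEM run: first natural frequency."""
    return math.sqrt(k) / (2 * math.pi)

def solve(A, y):
    """Naive Gaussian elimination with partial pivoting (small systems only)."""
    n = len(A)
    M = [row[:] + [y[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def kriging_fit(xs, ys, theta=0.5):
    """Interpolating Gaussian-correlation predictor (simple Kriging flavour)."""
    A = [[math.exp(-theta * (xi - xj) ** 2) for xj in xs] for xi in xs]
    w = solve(A, ys)
    return lambda x: sum(wi * math.exp(-theta * (x - xi) ** 2)
                         for wi, xi in zip(w, xs))

# a handful of expensive condensed-FEM runs train the cheap surrogate
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
surrogate = kriging_fit(xs, [condensed_fem(x) for x in xs])
print(abs(surrogate(3.5) - condensed_fem(3.5)) < 1e-2)  # cheap and accurate
```

During updating, the optimizer would then evaluate `surrogate` thousands of times instead of re-running the condensed FEM.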

  8. Updated projections of future vCJD deaths in the UK

    Directory of Open Access Journals (Sweden)

    Ferguson Neil M

    2003-04-01

    Full Text Available Abstract Background Past projections of the future course of the vCJD epidemic in the UK have shown considerable uncertainty, with wide confidence bounds. However, recent vCJD case data have indicated a decrease in the annual incidence of deaths over the past two years. Methods A detailed survival model is fitted to the 121 vCJD deaths reported by the end of 2002, stratified by age and calendar time, to obtain projections of future incidence. The model is additionally fitted to recent results from a survey of appendix tissues. Results Our results show a substantial decrease in the uncertainty of the future course of the primary epidemic in the susceptible genotype (MM-homozygous at codon 129 of the prion protein gene), with a best estimate of 40 future deaths (95% prediction interval 9–540) based on fitting to the vCJD case data alone. Additional fitting of the appendix data increases these estimates (best estimate 100, 95% prediction interval 10–2,600), but they remain lower than previous projections. Conclusions The primary vCJD epidemic in the known susceptible genotype in the UK appears to be in decline.

  9. Modeling Conservative Updates in Multi-Hash Approximate Count Sketches

    OpenAIRE

    2012-01-01

    Multi-hash-based count sketches are fast and memory efficient probabilistic data structures that are widely used in scalable online traffic monitoring applications. Their accuracy significantly improves with an optimization, called conservative update, which is especially effective when the aim is to discriminate a relatively small number of heavy hitters in a traffic stream consisting of an extremely large number of flows. Despite its widespread application, a thorough u...
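
A count-min sketch with the conservative-update optimization can be sketched as follows; the widths, depths, and toy stream are arbitrary:

```python
import random

class CountMinSketch:
    def __init__(self, width=64, depth=4, seed=7):
        rnd = random.Random(seed)
        self.width = width
        self.tables = [[0] * width for _ in range(depth)]
        self.salts = [rnd.getrandbits(32) for _ in range(depth)]

    def _cells(self, key):
        return [(t, hash((salt, key)) % self.width)
                for t, salt in enumerate(self.salts)]

    def query(self, key):
        return min(self.tables[t][i] for t, i in self._cells(key))

    def update(self, key, conservative=True):
        cells = self._cells(key)
        if conservative:
            # raise only counters at the minimum: never overshoot more than needed
            new_val = min(self.tables[t][i] for t, i in cells) + 1
            for t, i in cells:
                self.tables[t][i] = max(self.tables[t][i], new_val)
        else:
            for t, i in cells:
                self.tables[t][i] += 1

stream = ["heavy"] * 100 + [f"flow{i}" for i in range(300)]
plain, cons = CountMinSketch(), CountMinSketch()
for item in stream:
    plain.update(item, conservative=False)
    cons.update(item)

# Both variants never under-estimate; the conservative update over-estimates less.
print(cons.query("heavy") <= plain.query("heavy"))  # True
```

This dominance (conservative-update counts are pointwise at most the vanilla counts) is what makes heavy hitters stand out more cleanly against the many small flows.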

  10. NASA's In-Space Manufacturing Project: Materials and Manufacturing Process Development Update

    Science.gov (United States)

    Prater, Tracie; Bean, Quincy; Werkheiser, Niki; Ledbetter, Frank

    2017-01-01

    The mission of NASA's In-Space Manufacturing (ISM) project is to identify, design, and implement on-demand, sustainable manufacturing solutions for fabrication, maintenance and repair during exploration missions. ISM has undertaken a phased strategy of incrementally increasing manufacturing capabilities to achieve this goal. The ISM project began with the development of the first 3D printer for the International Space Station. To date, the printer has completed two phases of flight operations. Results from phase I specimens indicated some differences in material properties between ground-processed and ISS-processed specimens, but results of follow-on analyses of these parts and a ground-based study with an equivalent printer strongly indicate that this variability is likely attributable to differences in manufacturing process settings between the ground and flight prints rather than microgravity effects on the fused deposition modeling (FDM) process. Analysis of phase II specimens from the 3D Printing in Zero G tech demo, which shed further light on the sources of material variability, will be presented. The ISM project has also developed a materials characterization plan for the Additive Manufacturing Facility, the follow-on commercial multimaterial 3D printing facility developed for ISS by Made in Space. This work will yield a suite of characteristic property values that can inform use of AMF by space system designers. Other project activities include development of an integrated 3D printer and recycler, known as the Refabricator, by Tethers Unlimited, which will be operational on ISS in 2018. The project also recently issued a broad area announcement for a multimaterial fabrication laboratory, which may include in-space manufacturing capabilities for metals, electronics, and polymeric materials, to be deployed on ISS in the 2022 timeframe.

  11. Proposed reporting model update creates dialogue between FASB and not-for-profits.

    Science.gov (United States)

    Mosrie, Norman C

    2016-04-01

    Seeing a need to refresh the current guidelines, the Financial Accounting Standards Board (FASB) proposed an update to the financial accounting and reporting model for not-for-profit entities. In a response to solicited feedback, the board is now revisiting its proposed update and has set forth a plan to finalize its new guidelines. The FASB continues to solicit and respond to feedback as the process progresses.

  12. Towards cost-sensitive adaptation: when is it worth updating your predictive model?

    OpenAIRE

    Zliobaite, Indre; Budka, Marcin; Stahl, Frederic

    2015-01-01

    Our digital universe is rapidly expanding: more and more daily activities are digitally recorded, and data arrives in streams, needs to be analyzed in real time, and may evolve over time. In the last decade many adaptive learning algorithms and prediction systems, which can automatically update themselves with new incoming data, have been developed. The majority of those algorithms focus on improving the predictive performance and assume that a model update is always desired as soon as possib...

  13. Comparing the impact of time displaced and biased precipitation estimates for online updated urban runoff models

    DEFF Research Database (Denmark)

    2013-01-01

    When an online runoff model is updated from system measurements, the requirements of the precipitation input change. Using rain gauge data as precipitation input there will be a displacement between the time when the rain hits the gauge and the time when the rain hits the actual catchment, due to the time it takes for the rain cell to travel from the rain gauge to the catchment. Since this time displacement is not present for system measurements, the data assimilation scheme might already have updated the model to include the impact from the particular rain cell when the rain data is forced upon the model, which therefore will end up including the same rain twice in the model run. This paper compares the forecast accuracy of updated models when using time displaced rain input to that of rain input with constant biases. This is done using a simple time-area model and historic rain series that are either...
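
The time-area model and the two distorted rain inputs the paper compares can be sketched as below (the time-area curve and rain series are invented for illustration):

```python
def time_area_runoff(rain, time_area_curve=(0.1, 0.3, 0.4, 0.2)):
    """Simple time-area model: runoff is the rain series convolved with a
    time-area curve describing how much of the catchment contributes per lag."""
    n = len(rain)
    flow = [0.0] * n
    for t, r in enumerate(rain):
        for lag, frac in enumerate(time_area_curve):
            if t + lag < n:
                flow[t + lag] += r * frac
    return flow

rain = [0, 0, 5, 12, 8, 3, 0, 0, 0, 0]
displaced = [0] + rain[:-1]        # the gauge sees the rain cell one step late
biased = [r * 1.2 for r in rain]   # 20 % biased but correctly timed

truth = time_area_runoff(rain)
err = lambda series: sum(abs(a - b) for a, b in
                         zip(time_area_runoff(series), truth))
print(err(displaced), err(biased))  # hydrograph errors of the two distortions
```

Because the model is linear in the rain input, the biased series simply scales the hydrograph, while the displaced series shifts it in time, which is the distinction that matters once the assimilation scheme has already ingested the same rain cell via the system measurements.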

  14. Impact of time displaced precipitation estimates for on-line updated models

    DEFF Research Database (Denmark)

    Borup, Morten; Grum, Morten; Mikkelsen, Peter Steen

    2012-01-01

    When an online runoff model is updated from system measurements, the requirements to the precipitation estimates change. Using rain gauge data as precipitation input there will be a displacement between the time where the rain intensity hits the gauge and the time where the rain hits the actual catchment, due to the time it takes for the rain cell to travel from the rain gauge to the catchment. Since this time displacement is not present for system measurements, the data assimilation scheme might already have updated the model to include the impact from the particular rain cell when the rain data is forced upon the model, which therefore will end up including the same rain twice in the model run. This paper compares forecast accuracy of updated models when using time displaced rain input to that of rain input with constant biases. This is done using a simple time-area model and historic rain series...

  15. Seismic source characterization for the 2014 update of the U.S. National Seismic Hazard Model

    Science.gov (United States)

    Moschetti, Morgan P.; Powers, Peter; Petersen, Mark D.; Boyd, Oliver; Chen, Rui; Field, Edward H.; Frankel, Arthur; Haller, Kathleen; Harmsen, Stephen; Mueller, Charles S.; Wheeler, Russell; Zeng, Yuehua

    2015-01-01

    We present the updated seismic source characterization (SSC) for the 2014 update of the National Seismic Hazard Model (NSHM) for the conterminous United States. Construction of the seismic source models employs the methodology that was developed for the 1996 NSHM but includes new and updated data, data types, source models, and source parameters that reflect the current state of knowledge of earthquake occurrence and state of practice for seismic hazard analyses. We review the SSC parameterization and describe the methods used to estimate earthquake rates, magnitudes, locations, and geometries for all seismic source models, with an emphasis on new source model components. We highlight the effects that two new model components—incorporation of slip rates from combined geodetic-geologic inversions and the incorporation of adaptively smoothed seismicity models—have on probabilistic ground motions, because these sources span multiple regions of the conterminous United States and provide important additional epistemic uncertainty for the 2014 NSHM.

  16. Modeling Research Project Risks with Fuzzy Maps

    Science.gov (United States)

    Bodea, Constanta Nicoleta; Dascalu, Mariana Iuliana

    2009-01-01

    The authors propose a risk evaluation model for research projects. The model is based on fuzzy inference. The knowledge base for the fuzzy process is built with a causal and cognitive map of risks. The map was especially developed for research projects, taking into account their typical lifecycle. The model was applied to an e-testing research…
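
A hedged sketch of fuzzy risk inference with two invented rules; the membership functions, rule base, and consequent values are illustrative only and not taken from the authors' causal map:

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def risk_level(novelty, staff_experience):
    """Two illustrative rules, as might be read off a causal risk map:
    R1: IF novelty high AND experience low  THEN risk high
    R2: IF novelty low  OR  experience high THEN risk low"""
    nov_high = tri(novelty, 0.4, 1.0, 1.6)        # memberships on [0, 1] inputs
    nov_low = tri(novelty, -0.6, 0.0, 0.6)
    exp_high = tri(staff_experience, 0.4, 1.0, 1.6)
    exp_low = tri(staff_experience, -0.6, 0.0, 0.6)
    w_high = min(nov_high, exp_low)               # AND -> min
    w_low = max(nov_low, exp_high)                # OR  -> max
    if w_high + w_low == 0:
        return 0.5
    # weighted average of the rule consequents (risk high = 0.9, low = 0.1)
    return (0.9 * w_high + 0.1 * w_low) / (w_high + w_low)

print(round(risk_level(0.9, 0.2), 2))  # novel project, green team -> high risk
print(round(risk_level(0.2, 0.9), 2))  # routine project, expert team -> low risk
```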

  17. On-line updating of a distributed flow routing model - River Vistula case study

    Science.gov (United States)

    Karamuz, Emilia; Romanowicz, Renata; Napiorkowski, Jaroslaw

    2015-04-01

    This paper presents an application of methods of on-line updating in the River Vistula flow forecasting system. All flow-routing codes make simplifying assumptions and consider only a reduced set of the processes known to occur during a flood. Hence, all models are subject to a degree of structural error that is typically compensated for by calibration of the friction parameters. Calibrated parameter values are therefore not physically realistic, as in estimating them we also make allowance for a number of distinctly non-physical effects, such as model structural error and any energy losses or flow processes which occur at sub-grid scales. Calibrated model parameters are therefore area-effective, scale-dependent values which are not drawn from the same underlying statistical distribution as the equivalent at-a-point parameter of the same name. The aim of this paper is the derivation of real-time updated, on-line flow forecasts at certain strategic locations along the river, over a specified time horizon into the future, based on information on the behaviour of the flood wave upstream and available on-line measurements at a site. Depending on the length of the river reach and the slope of the river bed, a realistic forecast lead time obtained in this manner may range from hours to days. The information upstream can include observations of river levels and/or rainfall measurements. The proposed forecasting system will integrate distributed modelling, acting as a spatial interpolator, with lumped-parameter Stochastic Transfer Function models. Daily stage data from gauging stations are typically available at sites 10-60 km apart and test only the average routing performance of hydraulic models, not their ability to produce spatial predictions. Application of a distributed flow routing model makes it possible to interpolate forecasts both in time and space.
This work was partly supported by the project "Stochastic flood forecasting system (The River Vistula reach

  18. The Canvas model in project formulation

    OpenAIRE

    Ferreira-Herrera, Diana Carolina

    2016-01-01

    Purpose: The aim of this article is to determine the relevance of the Canvas methodology in project formulation through model characterization, thus answering the question: Is the Canvas methodology a relevant model for project management in an entrepreneurial context? Description: The Canvas model seeks to manage projects as business units. It is a model intended for emphasizing the entrepreneurial potential in project management. For this, texts and articles that have provided the basis for...

  19. Denmark's greenhouse gas projections until 2012, an update including a preliminary projection until 2017

    Energy Technology Data Exchange (ETDEWEB)

    Fenham, J. [Risoe National Lab., Roskilde (Denmark)

    2003-07-01

    solvent use and other sources is included as a source of CO{sub 2} emissions. A separate chapter is dedicated to each of these sectors. However, the report starts with a summary of the emissions, with a section for each of the pollutants treated. At the end of each of these sections, the main differences between the present calculation and the values in Denmark's Second National Communication on Climate Change are described briefly. For each of the pollutants, the development of the emissions in the period 1972-2012 and the various emission targets in Danish sector plans or international conventions are shown in a figure. Below the figures, the emissions for the main emitting sectors are shown in a table. The years shown in these tables are not the same for all pollutants. When a column is marked with '2010', it means that the values in the column are averaged over the first commitment period 2008-2012. It is not possible in this report to present all the data from the emission calculations. The data are contained in an Excel workbook model. Appendix 1 contains a table with time-series for 1975-2012 for the greenhouse gases CO{sub 2}, CH{sub 4} and N{sub 2}O for all emitting sectors. In Appendix 2, the results of the projections 2000-2012 are shown in the IPCC/CRF Sectoral Tables format, in CO{sub 2} equivalents, for each greenhouse gas and in total (only source and sink categories with greenhouse gas emissions or removals are shown). The model is structured as a set of worksheets for the primary energy consuming sectors, and the model contains similar sets for each of the pollutants. Additional sheets have been included for the relevant pollutants where emissions originate from non-combustion processes. Each of these spreadsheets contains time-series for the emissions from each of the primary fuels consumed in the sector. (ba)

  20. A stochastic model updating strategy-based improved response surface model and advanced Monte Carlo simulation

    Science.gov (United States)

    Zhai, Xue; Fei, Cheng-Wei; Choy, Yat-Sze; Wang, Jian-Jun

    2017-01-01

    To improve the accuracy and efficiency of computational models for complex structures, a stochastic model updating (SMU) strategy was proposed by combining an improved response surface model (IRSM) with an advanced Monte Carlo (MC) method, based on experimental static tests, prior information and uncertainties. First, the IRSM and its mathematical model were developed with emphasis on the moving least-squares method, and the advanced MC simulation method was developed based on the Latin hypercube sampling method. The SMU procedure was then presented, using an experimental static test of a complex structure. SMUs of a simply-supported beam and an aeroengine stator system (casings) were implemented to validate the proposed IRSM and advanced MC simulation method. The results show that (1) the SMU strategy holds high computational precision and efficiency for the SMU of complex structural systems; (2) the IRSM is demonstrated to be an effective model because its SMU time is far less than that of the traditional response surface method, which promises to improve the computational speed and accuracy of SMU; and (3) the advanced MC method markedly decreases the number of samples drawn from finite element simulations and the elapsed time of SMU. The efforts of this paper provide a promising SMU strategy for complex structures and enrich the theory of model updating.
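
The Latin hypercube sampling step can be sketched in a few lines; this is the standard stratified construction on the unit cube, not necessarily the exact variant used in the paper:

```python
import random

def latin_hypercube(n_samples, n_dims, seed=42):
    """Latin hypercube sample on the unit cube: each dimension is split into
    n_samples equal strata and every stratum is hit exactly once."""
    rng = random.Random(seed)
    columns = []
    for _ in range(n_dims):
        strata = list(range(n_samples))
        rng.shuffle(strata)                       # random pairing of strata
        columns.append([(s + rng.random()) / n_samples for s in strata])
    return [[col[i] for col in columns] for i in range(n_samples)]

pts = latin_hypercube(10, 2)
# stratification check: each of the 10 bins in each dimension holds one point
for d in range(2):
    assert sorted(int(p[d] * 10) for p in pts) == list(range(10))
print(len(pts))  # 10
```

Compared with plain random sampling, this guarantees the whole parameter range is covered even with few expensive finite element runs, which is exactly why it reduces the sample count in the SMU.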

  1. The role of anti-resonance frequencies from operational modal analysis in finite element model updating

    Science.gov (United States)

    Hanson, D.; Waters, T. P.; Thompson, D. J.; Randall, R. B.; Ford, R. A. J.

    2007-01-01

    Finite element model updating traditionally makes use of both resonance and mode shape information. The mode shape information can also be obtained from anti-resonance frequencies, as has been suggested by a number of researchers in recent years. Anti-resonance frequencies have the advantage over mode shapes that they can be much more accurately identified from measured frequency response functions. Moreover, anti-resonance frequencies can, in principle, be estimated from output-only measurements on operating machinery. The motivation behind this paper is to explore whether the availability of anti-resonances from such output-only techniques would add genuinely new information to the model updating process, which is not already available from using only resonance frequencies. This investigation employs two-degree-of-freedom models of a rigid beam supported on two springs. It includes an assessment of the contribution made to the overall anti-resonance sensitivity by the mode shape components, and also considers model updating through Monte Carlo simulations, experimental verification of the simulation results, and application to a practical mechanical system, in this case a petrol generator set. Analytical expressions are derived for the sensitivity of anti-resonance frequencies to updating parameters such as the ratio of spring stiffnesses, the position of the centre of gravity, and the beam's radius of gyration. These anti-resonance sensitivities are written in terms of natural frequency and mode shape sensitivities so their relative contributions can be assessed. It is found that the contribution made by the mode shape sensitivity varies considerably depending on the value of the parameters, contributing no new information for significant combinations of parameter values. 
The Monte Carlo simulations compare the performance of the update achieved when using information from: the resonances only; the resonances and either anti-resonance; and the resonances and both

  2. Integrating High Penetrations of PV into Southern California: Year 2 Project Update; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Mather, B.; Neal, R.

    2012-08-01

    Southern California Edison (SCE) is well into a five-year project to install a total of 500 MW of distributed photovoltaic (PV) energy within its utility service territory. Typical installations to date are 1-3 MW peak rooftop PV systems that interconnect to medium-voltage urban distribution circuits or larger (5 MW peak) ground-mounted systems that connect to medium-voltage rural distribution circuits. Some of the PV system interconnections have resulted in distribution circuits that have a significant amount of PV generation compared to customer load, resulting in high-penetration PV integration scenarios. The National Renewable Energy Laboratory (NREL) and SCE have assembled a team of distribution modeling, resource assessment, and PV inverter technology experts in order to investigate a few of the high-penetration PV distribution circuits. Currently, the distribution circuits being studied include an urban circuit with a PV penetration of approximately 46% and a rural circuit with a PV penetration of approximately 60%. In both cases, power flow on the circuit reverses direction, compared to traditional circuit operation, during periods of high PV power production and low circuit loading. Research efforts during year two of the five-year project were focused on modeling the distribution system level impacts of high-penetration PV integrations, the development and installation of distribution circuit data acquisition equipment appropriate for quantifying the impacts of high-penetration PV integrations, and investigating high-penetration PV impact mitigation strategies. This paper outlines these research efforts and discusses the following activities in more detail: the development of a quasi-static time-series test feeder for evaluating high-penetration PV integration modeling tools; the advanced inverter functions being investigated for deployment in the project's field demonstration and a power hardware-in-loop test of a 500-kW PV inverter implementing a

  3. MLE's bias pathology, Model Updated Maximum Likelihood Estimates and Wallace's Minimum Message Length method

    OpenAIRE

    Yatracos, Yannis G.

    2013-01-01

    The inherent bias pathology of the maximum likelihood (ML) estimation method is confirmed for models with unknown parameters $\theta$ and $\psi$ when the MLE $\hat\psi$ is a function of the MLE $\hat\theta$. To reduce $\hat\psi$'s bias, the likelihood equation to be solved for $\psi$ is updated using the model for the data $Y$ in it. The model updated (MU) MLE, $\hat\psi_{MU}$, often reduces either totally or partially $\hat\psi$'s bias when estimating a shape parameter $\psi$. For the Pareto model $\hat...

  4. Ambient modal testing of a double-arch dam: the experimental campaign and model updating

    Science.gov (United States)

    García-Palacios, Jaime H.; Soria, José M.; Díaz, Iván M.; Tirado-Andrés, Francisco

    2016-09-01

    A finite element model updating of a double-curvature arch dam (La Tajera, Spain) is carried out herein using the modal parameters obtained from an operational modal analysis. That is, the system modal dampings, natural frequencies and mode shapes have been identified using output-only identification techniques under environmental loads (wind, vehicles). A finite element model of the dam-reservoir-foundation system was initially created. A testing campaign was then carried out at the most significant test points, using wirelessly synchronized high-sensitivity accelerometers. Afterwards, the initial model was updated using a Monte Carlo based approach in order to match the recorded dynamic behaviour. The updated model may be used within a structural health monitoring system for damage detection or, for instance, for the analysis of the seismic response of the coupled arch dam-reservoir-foundation system.

  5. Inspection Uncertainty and Model Uncertainty Updating for Ship Structures Subjected to Corrosion Deterioration

    Institute of Scientific and Technical Information of China (English)

    LI Dian-qing; ZHANG Sheng-kun

    2004-01-01

    The classical probability theory cannot effectively quantify the parameter uncertainty in the probability of detection. Furthermore, the conventional data analytic method and expert judgment method fail to handle the problem of model uncertainty updating with information from nondestructive inspection. To overcome these disadvantages, a Bayesian approach was proposed to quantify the parameter uncertainty in the probability of detection. Furthermore, formulae for the multiplication factors used to measure the statistical uncertainties in the probability of detection following the Weibull distribution were derived. A Bayesian updating method was applied to compute the posterior probabilities of model weights and the posterior probability density functions of the distribution parameters of the probability of detection. A total probability model method was proposed to analyze the problem of multi-layered model uncertainty updating. This method was then applied to the problem of multi-layered corrosion model uncertainty updating for ship structures. The results indicate that the proposed method is very effective in analyzing the problem of multi-layered model uncertainty updating.
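
The model-weight updating step can be sketched with two invented candidate POD curves; the exponential POD forms, crack sizes, and inspection outcomes below are illustrative assumptions (the paper works with Weibull-distributed POD):

```python
import math

# Two hypothetical probability-of-detection (POD) models for a crack of size a (mm)
def pod_optimistic(a):
    return 1.0 - math.exp(-1.5 * a)

def pod_pessimistic(a):
    return 1.0 - math.exp(-0.5 * a)

models = {"optimistic": pod_optimistic, "pessimistic": pod_pessimistic}
weights = {name: 0.5 for name in models}          # equal prior model weights

# inspection record: (crack size, detected?)
inspections = [(1.0, True), (2.0, True), (0.5, False), (3.0, True), (1.5, True)]

for a, hit in inspections:
    likes = {name: (pod(a) if hit else 1.0 - pod(a))
             for name, pod in models.items()}
    total = sum(weights[n] * likes[n] for n in models)
    weights = {n: weights[n] * likes[n] / total for n in models}  # Bayes update

print({n: round(w, 3) for n, w in weights.items()})
# the frequent detections shift posterior weight toward the optimistic model
```

The posterior weights would then feed a total-probability combination of the candidate POD models, in the spirit of the multi-layered updating the abstract describes.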

  6. SHM-Based Probabilistic Fatigue Life Prediction for Bridges Based on FE Model Updating.

    Science.gov (United States)

    Lee, Young-Joo; Cho, Soojin

    2016-03-02

    Fatigue life prediction for a bridge should be based on the current condition of the bridge, and various sources of uncertainty, such as material properties, anticipated vehicle loads and environmental conditions, make the prediction very challenging. This paper presents a new approach for probabilistic fatigue life prediction for bridges using finite element (FE) model updating based on structural health monitoring (SHM) data. Recently, various types of SHM systems have been used to monitor and evaluate the long-term structural performance of bridges. For example, SHM data can be used to estimate the degradation of an in-service bridge, which makes it possible to update the initial FE model. The proposed method consists of three steps: (1) identifying the modal properties of a bridge, such as mode shapes and natural frequencies, based on the ambient vibration under passing vehicles; (2) updating the structural parameters of an initial FE model using the identified modal properties; and (3) predicting the probabilistic fatigue life using the updated FE model. The proposed method is demonstrated by application to a numerical model of a bridge, and the impact of FE model updating on the bridge fatigue life is discussed.

  7. Dynamic finite element model updating of prestressed concrete continuous box-girder bridge

    Institute of Scientific and Technical Information of China (English)

    Lin Xiankun; Zhang Lingmi; Guo Qintao; Zhang Yufeng

    2009-01-01

    The dynamic finite element model (FEM) of a prestressed concrete continuous box-girder bridge, called the Tongyang Canal Bridge, is built and updated based on the results of ambient vibration testing (AVT) using a real-coded accelerating genetic algorithm (RAGA). The objective functions are defined based on natural frequency and modal assurance criterion (MAC) metrics to evaluate the updated FEM. Two objective functions are defined to fully account for the relative errors and standard deviations of the natural frequencies and MAC between the AVT results and the updated FEM predictions. The dynamically updated FEM of the bridge can better represent its structural dynamics and serve as a baseline in long-term health monitoring, condition assessment and damage identification over the service life of the bridge.

  8. Sensitivity-based finite element model updating using constrained optimization with a trust region algorithm

    Science.gov (United States)

    Bakir, Pelin Gundes; Reynders, Edwin; De Roeck, Guido

    2007-08-01

    The use of changes in dynamic system characteristics to detect damage has received considerable attention in recent years. Within this context, the FE model updating technique, which belongs to the class of inverse problems in classical mechanics, is used to detect, locate and quantify damage. In this study, a sensitivity-based finite element (FE) model updating scheme using a trust region algorithm is developed and applied to a complex structure. A damage scenario is applied to the structure in which the stiffness values of the beam elements close to the beam-column joints are decreased by stiffness reduction factors. A worst-case, complex damage pattern is assumed such that the stiffnesses of adjacent elements are decreased by substantially different stiffness reduction factors. The objective of the model updating is to minimize the eigenfrequency and eigenmode residuals. The updating parameters of the structure are the stiffness reduction factors, and their changes are determined iteratively by solving a nonlinear constrained optimization problem. The FE model updating algorithm is also tested in the presence of two levels of noise in simulated measurements. In all three cases, the updated MAC values are above 99% and the relative eigenfrequency differences improve substantially after model updating. In the cases without noise and with moderate noise, detection, localization and quantification of damage are successfully accomplished. In the case of substantially noisy measurements, detection and localization of damage are successfully realized. Damage quantification is also promising at high noise levels, as the algorithm can still predict 18 of the 24 damage parameters relatively accurately.
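
    The iterative scheme described above can be sketched, under heavy simplification, as a bound-constrained Gauss-Newton loop with a capped step length standing in for the trust region; the toy 2-DOF spring-mass chain, stiffness values and reduction factors below are all illustrative, not the structure from the paper:

```python
import math

def freqs(a, k=(100.0, 100.0)):
    """Natural frequencies (rad/s) of a 2-DOF spring-mass chain (unit masses)
    whose spring stiffnesses are scaled by reduction factors a = (a1, a2)."""
    k1, k2 = a[0] * k[0], a[1] * k[1]
    tr, det = k1 + 2.0 * k2, k1 * k2
    disc = math.sqrt(tr * tr - 4.0 * det)
    return [math.sqrt((tr - disc) / 2.0), math.sqrt((tr + disc) / 2.0)]

def update(a_true=(0.7, 0.4), n_iter=30, radius=0.3):
    f_meas = freqs(a_true)               # "measured" eigenfrequencies
    a = [1.0, 1.0]                       # undamaged initial FE model
    for _ in range(n_iter):
        r = [f - fm for f, fm in zip(freqs(a), f_meas)]
        # finite-difference Jacobian, column j = d(residual)/d(a_j)
        h, J = 1e-6, [[0.0, 0.0], [0.0, 0.0]]
        for j in range(2):
            ap = list(a); ap[j] += h
            fp = freqs(ap)
            for i in range(2):
                J[i][j] = (fp[i] - f_meas[i] - r[i]) / h
        # Gauss-Newton step: solve J d = -r (Cramer's rule, 2x2 system)
        det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
        d = [(-r[0] * J[1][1] + r[1] * J[0][1]) / det,
             (-r[1] * J[0][0] + r[0] * J[1][0]) / det]
        # trust-region-style cap on the step, then bound constraints on a
        d = [max(-radius, min(radius, di)) for di in d]
        a = [min(1.0, max(1e-3, ai + di)) for ai, di in zip(a, d)]
    return a

a = update()   # recovered stiffness reduction factors
```

    A genuine trust region algorithm adapts the step radius from the agreement between model and objective; the fixed cap here only conveys the idea of restricting each update.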

  9. Lazy Updating of hubs can enable more realistic models by speeding up stochastic simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ehlert, Kurt; Loewe, Laurence, E-mail: loewe@wisc.edu [Laboratory of Genetics, University of Wisconsin-Madison, Madison, Wisconsin 53706 (United States); Wisconsin Institute for Discovery, University of Wisconsin-Madison, Madison, Wisconsin 53715 (United States)

    2014-11-28

    To respect the nature of discrete parts in a system, stochastic simulation algorithms (SSAs) must update for each action (i) all part counts and (ii) each action's probability of occurring next and its timing. This makes it expensive to simulate biological networks with well-connected “hubs” such as ATP that affect many actions. Temperature and volume also affect many actions and may be changed significantly in small steps by the network itself during fever and cell growth, respectively. Such trends matter for evolutionary questions, as cell volume determines doubling times and fever may affect survival, both key traits for biological evolution. Yet simulations often ignore such trends and assume constant environments to avoid many costly probability updates. Such computational convenience precludes analyses of important aspects of evolution. Here we present “Lazy Updating,” an add-on for SSAs designed to reduce the cost of simulating hubs. When a hub changes, Lazy Updating postpones all probability updates for reactions depending on this hub, until a threshold is crossed. Speedup is substantial if most computing time is spent on such updates. We implemented Lazy Updating for the Sorting Direct Method and it is easily integrated into other SSAs such as Gillespie's Direct Method or the Next Reaction Method. Testing on several toy models and a cellular metabolism model showed >10× faster simulations for its use-cases—with a small loss of accuracy. Thus we see Lazy Updating as a valuable tool for some special but important simulation problems that are difficult to address efficiently otherwise.
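
    The bookkeeping behind Lazy Updating can be sketched as follows; the class name, threshold and counts are illustrative, not the authors' implementation for the Sorting Direct Method:

```python
# Minimal sketch of Lazy Updating: propensities of reactions that depend on
# a hub species are recomputed only when the hub count drifts past a
# relative threshold from the value last used for those propensities.

class LazyHub:
    def __init__(self, count, eps=0.05):
        self.count = count        # current hub copy number (e.g. ATP)
        self.snapshot = count     # value last used to compute propensities
        self.eps = eps            # relative-drift threshold

    def change(self, delta):
        """Apply a count change; return True if propensities must be refreshed."""
        self.count += delta
        drift = abs(self.count - self.snapshot) / max(1, self.snapshot)
        if drift > self.eps:
            self.snapshot = self.count   # threshold crossed: refresh now
            return True
        return False                     # postpone the expensive updates

hub = LazyHub(10000, eps=0.05)
updates = sum(hub.change(+1) for _ in range(600))   # 600 hub increments
```

    With a 5% threshold, 600 unit increments trigger only a single batch of propensity updates instead of 600, which is exactly where the speedup comes from.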

  10. Updated Peach Bottom Model for MELCOR 1.8.6: Description and Comparisons

    Energy Technology Data Exchange (ETDEWEB)

    Robb, Kevin R [ORNL

    2014-09-01

    A MELCOR 1.8.5 model of the Peach Bottom Unit 2 or 3 has been updated for MELCOR 1.8.6. Primarily, this update involved modification of the lower head modeling. Three additional updates were also performed. First, a finer nodalization of the containment wet well was employed. Second, the pressure differential used by the logic controlling the safety relief valve actuation was modified. Finally, an additional stochastic failure mechanism for the safety relief valves was added. Simulation results from models with and without the modifications were compared. All the analysis was performed by comparing key figures of merit from simulations of a long-term station blackout scenario. This report describes the model changes and the results of the comparisons.

  11. A simplified model of software project dynamics

    OpenAIRE

    Ruiz Carreira, Mercedes; Ramos Román, Isabel; Toro Bonilla, Miguel

    2001-01-01

    The simulation of a dynamic model for software development projects (hereinafter SDPs) helps to investigate the impact of technological change, of different management policies, and of an organisation's maturity level on the whole project. In the beginning of the 1990s, with the appearance of the dynamic model for SDPs by Abdel-Hamid and Madnick [Software Project Dynamics: An Integrated Approach, Prentice-Hall, Englewood Cliffs, NJ, 1991], a significant advance took place in the field of p...

  12. The hourly updated US High-Resolution Rapid Refresh (HRRR) storm-scale forecast model

    Science.gov (United States)

    Alexander, Curtis; Dowell, David; Benjamin, Stan; Weygandt, Stephen; Olson, Joseph; Kenyon, Jaymes; Grell, Georg; Smirnova, Tanya; Ladwig, Terra; Brown, John; James, Eric; Hu, Ming

    2016-04-01

    The 3-km convection-allowing High-Resolution Rapid Refresh (HRRR) is a US NOAA hourly updating weather forecast model that uses a specially configured version of the Advanced Research WRF (ARW) model and assimilates many novel and most conventional observation types on an hourly basis using Gridpoint Statistical Interpolation (GSI). Included in this assimilation is a procedure for initializing ongoing precipitation systems from observed radar reflectivity data (and proxy reflectivity from lightning and satellite data), a cloud analysis to initialize stable-layer clouds from METAR and satellite observations, and special techniques to enhance retention of surface observation information. The HRRR is run hourly out to 15 forecast hours over a domain covering the entire conterminous United States, using initial and boundary conditions from the hourly-cycled 13-km Rapid Refresh (RAP, using similar physics and data assimilation) covering North America and a significant part of the Northern Hemisphere. The HRRR is continually developed and refined at NOAA's Earth System Research Laboratory, and an initial version was implemented into the operational NOAA/NCEP production suite in September 2014. Ongoing experimental RAP and HRRR model development throughout 2014 and 2015 has culminated in a set of data assimilation and model enhancements that will be incorporated into the first simultaneous upgrade of both the operational RAP and HRRR, scheduled for spring 2016 at NCEP. This presentation will discuss the operational RAP and HRRR changes contained in this upgrade. The RAP domain is being expanded to encompass the NAM domain, and the forecast lengths of both the RAP and HRRR are being extended. RAP and HRRR assimilation enhancements have focused on (1) extending surface data assimilation to include mesonet observations and improved use of all surface observations through better background estimates of 2-m temperature and dewpoint including projection of 2-m temperature

  13. Collaborative Project. A Flexible Atmospheric Modeling Framework for the Community Earth System Model (CESM)

    Energy Technology Data Exchange (ETDEWEB)

    Gettelman, Andrew [University Corporation For Atmospheric Research (UCAR), Boulder, CO (United States)

    2015-10-01

    In this project we have been upgrading the Multiscale Modeling Framework (MMF) in the Community Atmosphere Model (CAM), also known as Super-Parameterized CAM (SP-CAM). This has included a major effort to update the coding standards and interface with CAM so that it can be placed on the main development trunk. It has also included development of a new software structure for CAM to be able to handle sub-grid column information. These efforts have formed the major thrust of the work.

  14. Transfer Innovations Fund Updating Project. BC Council on Admissions and Transfer. Tourism Management Articulation

    Science.gov (United States)

    British Columbia Council on Admissions and Transfer, 2010

    2010-01-01

    In 2008, a number of changes were identified that expanded the scope of the updating required for Block Transfer for tourism management as follows: a new core curriculum for diploma programs; the need for expanded information on diploma to diploma transfer; and, a growing need for an expanded system of transfer identified in Campus 2020…

  15. Enhancement of ELDA Tracker Based on CNN Features and Adaptive Model Update.

    Science.gov (United States)

    Gao, Changxin; Shi, Huizhang; Yu, Jin-Gang; Sang, Nong

    2016-04-15

    Appearance representation and the observation model are the most important components in designing a robust visual tracking algorithm for video-based sensors. Additionally, the exemplar-based linear discriminant analysis (ELDA) model has shown good performance in object tracking. Based on that, we improve the ELDA tracking algorithm by deep convolutional neural network (CNN) features and adaptive model update. Deep CNN features have been successfully used in various computer vision tasks. Extracting CNN features on all of the candidate windows is time consuming. To address this problem, a two-step CNN feature extraction method is proposed by separately computing convolutional layers and fully-connected layers. Due to the strong discriminative ability of CNN features and the exemplar-based model, we update both object and background models to improve their adaptivity and to deal with the tradeoff between discriminative ability and adaptivity. An object updating method is proposed to select the "good" models (detectors), which are quite discriminative and uncorrelated to other selected models. Meanwhile, we build the background model as a Gaussian mixture model (GMM) to adapt to complex scenes, which is initialized offline and updated online. The proposed tracker is evaluated on a benchmark dataset of 50 video sequences with various challenges. It achieves the best overall performance among the compared state-of-the-art trackers, which demonstrates the effectiveness and robustness of our tracking algorithm.
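
    The selection of "good" detectors can be illustrated with a simple greedy rule: keep the highest-scoring detectors whose correlation with every already-selected one stays under a threshold. The scores, correlation matrix and threshold below are illustrative stand-ins, not the paper's actual quantities:

```python
# Greedy selection of discriminative, mutually uncorrelated detectors.

def select_detectors(scores, corr, max_corr=0.5):
    """scores[i]: discriminative score; corr[i][j]: correlation between
    detectors i and j. Returns indices of the kept detectors."""
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    chosen = []
    for i in order:                       # best score first
        if all(corr[i][j] < max_corr for j in chosen):
            chosen.append(i)              # keep only if uncorrelated enough
    return chosen

scores = [0.9, 0.8, 0.6, 0.4]
corr = [[1.0, 0.7, 0.2, 0.1],
        [0.7, 1.0, 0.3, 0.2],
        [0.2, 0.3, 1.0, 0.6],
        [0.1, 0.2, 0.6, 1.0]]
kept = select_detectors(scores, corr)
# detector 0 kept; 1 rejected (corr 0.7 with 0); 2 kept; 3 rejected (0.6 with 2)
```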

  16. Enhancement of ELDA Tracker Based on CNN Features and Adaptive Model Update

    Directory of Open Access Journals (Sweden)

    Changxin Gao

    2016-04-01

    Full Text Available Appearance representation and the observation model are the most important components in designing a robust visual tracking algorithm for video-based sensors. Additionally, the exemplar-based linear discriminant analysis (ELDA) model has shown good performance in object tracking. Based on that, we improve the ELDA tracking algorithm by deep convolutional neural network (CNN) features and adaptive model update. Deep CNN features have been successfully used in various computer vision tasks. Extracting CNN features on all of the candidate windows is time consuming. To address this problem, a two-step CNN feature extraction method is proposed by separately computing convolutional layers and fully-connected layers. Due to the strong discriminative ability of CNN features and the exemplar-based model, we update both object and background models to improve their adaptivity and to deal with the tradeoff between discriminative ability and adaptivity. An object updating method is proposed to select the “good” models (detectors), which are quite discriminative and uncorrelated to other selected models. Meanwhile, we build the background model as a Gaussian mixture model (GMM) to adapt to complex scenes, which is initialized offline and updated online. The proposed tracker is evaluated on a benchmark dataset of 50 video sequences with various challenges. It achieves the best overall performance among the compared state-of-the-art trackers, which demonstrates the effectiveness and robustness of our tracking algorithm.

  17. Function-weighted frequency response function sensitivity method for analytical model updating

    Science.gov (United States)

    Lin, R. M.

    2017-09-01

    Since the frequency response function (FRF) sensitivity method was first proposed [26], it has become one of the most powerful and practical methods for analytical model updating. Nevertheless, the original formulation of the FRF sensitivity method suffers from the limitation that the initial analytical model to be updated should be reasonably close to the final updated model to be sought, due to the first-order approximation implicit in most sensitivity-based methods. Convergence to the correct model is not guaranteed when large modelling errors exist, and blind application often leads to optimal solutions that are not those truly sought. This paper examines all the important numerical characteristics of the original FRF sensitivity method, including frequency data selection, numerical balance and convergence performance. To further improve the applicability of the method to cases of large modelling errors, a novel function-weighted sensitivity method is developed. The new method shows much superior convergence performance even in the presence of large modelling errors. Extensive numerical case studies based on a mass-spring system and a GARTEUR structure have been conducted and very encouraging results have been achieved. The effect of measurement noise has been examined and the method works reasonably well in the presence of measurement uncertainties. The new method removes the restriction that the modelling error magnitude be of second order in Euclidean norm compared with that of the system matrices, thereby making it a truly general method applicable to most practical model updating problems.

  18. MODELING THE EFFECTS OF UPDATING THE INFLUENZA VACCINE ON THE EFFICACY OF REPEATED VACCINATION.

    Energy Technology Data Exchange (ETDEWEB)

    D. SMITH; A. LAPEDES; ET AL

    2000-11-01

    The accumulated wisdom is to update the vaccine strain to the expected epidemic strain only when there is at least a 4-fold difference [measured by the hemagglutination inhibition (HI) assay] between the current vaccine strain and the expected epidemic strain. In this study we investigate the effect, on repeat vaccinees, of updating the vaccine when there is a less than 4-fold difference. Methods: Using a computer model of the immune response to repeated vaccination, we simulated updating the vaccine on a 2-fold difference and compared this to not updating the vaccine, in each case predicting the vaccine efficacy in first-time and repeat vaccinees for a variety of possible epidemic strains. Results: Updating the vaccine strain on a 2-fold difference resulted in increased vaccine efficacy in repeat vaccinees compared to leaving the vaccine unchanged. Conclusions: These results suggest that updating the vaccine strain on a 2-fold difference between the existing vaccine strain and the expected epidemic strain will increase vaccine efficacy in repeat vaccinees compared to leaving the vaccine unchanged.

  19. Active control and parameter updating techniques for nonlinear thermal network models

    Science.gov (United States)

    Papalexandris, M. V.; Milman, M. H.

    The present article reports on active control and parameter updating techniques for thermal models based on the network approach. Emphasis is placed on applications where radiation plays a dominant role. Examples of such applications are the thermal design and modeling of spacecrafts and space-based science instruments. Active thermal control of a system aims to approximate a desired temperature distribution or to minimize a suitably defined temperature-dependent functional. Similarly, parameter updating aims to update the values of certain parameters of the thermal model so that the output approximates a distribution obtained through direct measurements. Both problems are formulated as nonlinear, least-square optimization problems. The proposed strategies for their solution are explained in detail and their efficiency is demonstrated through numerical tests. Finally, certain theoretical results pertaining to the characterization of solutions of the problems of interest are also presented.
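
    A toy version of the parameter-updating problem (not the authors' network formulation) treats a single radiative coupling G in the steady-state balance Q = G·T^4 and fits it by scalar Gauss-Newton least squares so that model temperatures match measurements at several power levels; all values below are illustrative:

```python
import math

def model_T(Q, G):
    """Steady-state node temperature from the radiative balance Q = G*T^4."""
    return (Q / G) ** 0.25

def fit_G(Qs, T_meas, G0=1e-9, n_iter=40):
    """Scalar Gauss-Newton on the residuals r_i = T_model(Q_i, G) - T_meas_i."""
    G = G0
    for _ in range(n_iter):
        r = [model_T(Q, G) - Tm for Q, Tm in zip(Qs, T_meas)]
        J = [-model_T(Q, G) / (4.0 * G) for Q in Qs]   # dT/dG = -T/(4G)
        G -= sum(j * ri for j, ri in zip(J, r)) / sum(j * j for j in J)
    return G

# synthetic "measurements" generated from a known coupling, then recovered
G_true = 5e-8
Qs = [50.0, 100.0, 200.0]
T_meas = [model_T(Q, G_true) for Q in Qs]
G_fit = fit_G(Qs, T_meas)
```

    The real problem involves many coupled nodes and parameters, but each Gauss-Newton sweep has the same structure: residuals, sensitivities, and a least-squares step.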

  20. Adaptive update using visual models for lifting-based motion-compensated temporal filtering

    Science.gov (United States)

    Li, Song; Xiong, H. K.; Wu, Feng; Chen, Hong

    2005-03-01

    Motion-compensated temporal filtering is a useful framework for fully scalable video compression schemes. However, when the assumed motion model cannot represent the real motion perfectly, both the temporal high- and low-frequency sub-bands may contain artificial edges, which can reduce coding efficiency, and ghosting artifacts appear in the reconstructed video sequence at lower bit rates or under temporal scaling. We propose a new technique that utilizes visual models to mitigate ghosting artifacts in the temporal low-frequency sub-bands. Specifically, we propose content-adaptive update schemes in which visual models determine image-dependent upper bounds on the information to be updated. Experimental results show that the proposed algorithm can significantly improve the subjective visual quality of the low-pass temporal frames while coding performance matches or exceeds that of the classical update step.
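
    The content-adaptive update step can be sketched as clipping each update-step sample to a per-pixel bound supplied by a visual model; the pixel values and thresholds below are illustrative, and a real scheme would derive the bounds from a just-noticeable-difference model of the frame content:

```python
# Clip the lifting update contribution so that no low-pass pixel changes by
# more than a visually derived bound, limiting ghosting to masked energy.

def adaptive_update(low_pass, update, bounds):
    """Add each update sample to the low-pass frame, clipped to [-b, b]."""
    return [l + max(-b, min(b, u))
            for l, u, b in zip(low_pass, update, bounds)]

low = [120.0, 121.0, 119.0, 200.0]
upd = [0.5, -8.0, 1.0, 30.0]     # large samples signal motion-model failure
jnd = [4.0, 4.0, 4.0, 4.0]       # illustrative per-pixel visual thresholds
out = adaptive_update(low, upd, jnd)
```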

  1. Dynamic test and finite element model updating of bridge structures based on ambient vibration

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The dynamic characteristics of bridge structures are the basis of structural dynamic response and seismic analysis, and are also an important target of health condition monitoring. In this paper, a three-dimensional finite-element model is first established for a highway bridge over a railroad on No. 312 National Highway. Based on design drawings, the dynamic characteristics of the bridge are studied using finite element analysis and ambient vibration measurements. A set of updating parameters is then selected based on sensitivity analysis and optimization theory, and the finite element model of the bridge is updated. The numerical and experimental results show that the updating method is simple and effective, that the updated finite element model better reflects the dynamic characteristics of the bridge, and that it can be used to predict the dynamic response under complex external forces. It is also helpful for further damage identification and health condition monitoring.

  2. A hierarchical updating method for finite element model of airbag buffer system under landing impact

    Institute of Scientific and Technical Information of China (English)

    He Huan; Chen Zhe; He Cheng; Ni Lei; Chen Guoping

    2015-01-01

    In this paper, we propose an impact finite element (FE) model for an airbag landing buffer system. First, an impact FE model has been formulated for a typical airbag landing buffer system. We use the independence of the structure FE model from the full impact FE model to develop a hierarchical updating scheme for the recovery module FE model and the airbag system FE model. Second, we define impact responses at key points to compare the computational and experimental results to resolve the inconsistency between the experimental data sampling frequency and experimental triggering. To determine the typical characteristics of the impact dynamics response of the airbag landing buffer system, we present the impact response confidence factors (IRCFs) to evaluate how consistent the computational and experiment results are. An error function is defined between the experimental and computational results at key points of the impact response (KPIR) to serve as a modified objective function. A radial basis function (RBF) is introduced to construct updating variables for a surrogate model for updating the objective function, thereby converting the FE model updating problem to a soluble optimization problem. Finally, the developed method has been validated using an experimental and computational study on the impact dynamics of a classic airbag landing buffer system.

  3. A hierarchical updating method for finite element model of airbag buffer system under landing impact

    Directory of Open Access Journals (Sweden)

    He Huan

    2015-12-01

    Full Text Available In this paper, we propose an impact finite element (FE) model for an airbag landing buffer system. First, an impact FE model has been formulated for a typical airbag landing buffer system. We use the independence of the structure FE model from the full impact FE model to develop a hierarchical updating scheme for the recovery module FE model and the airbag system FE model. Second, we define impact responses at key points to compare the computational and experimental results to resolve the inconsistency between the experimental data sampling frequency and experimental triggering. To determine the typical characteristics of the impact dynamics response of the airbag landing buffer system, we present the impact response confidence factors (IRCFs) to evaluate how consistent the computational and experiment results are. An error function is defined between the experimental and computational results at key points of the impact response (KPIR) to serve as a modified objective function. A radial basis function (RBF) is introduced to construct updating variables for a surrogate model for updating the objective function, thereby converting the FE model updating problem to a soluble optimization problem. Finally, the developed method has been validated using an experimental and computational study on the impact dynamics of a classic airbag landing buffer system.

  4. World energy projection system: Model documentation

    Energy Technology Data Exchange (ETDEWEB)

    1992-06-01

    The World Energy Projection System (WEPS) is an accounting framework that incorporates projections from independently documented models and assumptions about the future energy intensity of economic activity (ratios of total energy consumption divided by gross domestic product) and about the rate of incremental energy requirements met by hydropower, geothermal, coal, and natural gas to produce the projections of world energy consumption published annually by the Energy Information Administration (EIA) in the International Energy Outlook (IEO) (Figure 1). Two independently documented models presented in Figure 1, the Oil Market Simulation (OMS) model and the World Integrated Nuclear Evaluation System (WINES), provide projections of oil and nuclear power consumption published in the IEO. Output from a third independently documented model, the International Coal Trade Model (ICTM), is not published in the IEO but is used in WEPS as a supply check on projections of world coal consumption produced by WEPS and published in the IEO. A WEPS model of natural gas production documented in this report provides the same type of implicit supply check on the WEPS projections of world natural gas consumption published in the IEO. Two additional models are included in Figure 1, the OPEC Capacity model and the Non-OPEC Oil Production model. These WEPS models provide inputs to the OMS model and are documented in this report.

  5. Real-Time Flood Forecasting System Using Channel Flow Routing Model with Updating by Particle Filter

    Science.gov (United States)

    Kudo, R.; Chikamori, H.; Nagai, A.

    2008-12-01

    A real-time flood forecasting system using a channel flow routing model was developed for runoff forecasting at gauged and ungaged points along river channels. The system is based on a flood runoff model composed of upstream part models, tributary part models and downstream part models. The upstream and tributary part models are lumped rainfall-runoff models, and the downstream part models consist of a lumped rainfall-runoff model for hillslopes adjacent to a river channel and a kinematic flow routing model for the river channel. The flow forecast of this model is updated by particle filtering of the downstream part model, as well as by extended Kalman filtering of the upstream and tributary part models. Particle filtering is a simple and powerful updating algorithm for non-linear, non-Gaussian systems, so it can easily be applied to the downstream part model without complicated linearization. The presented flood runoff model has an advantage over grid-based distributed models in the simplicity of the updating procedure, because it has fewer state variables. The system was applied to the Gono-kawa River Basin in Japan, and the forecasting accuracy of the system with both particle filtering and extended Kalman filtering was compared with that of the system with only extended Kalman filtering. In this study, water gauging stations in the objective basin were divided into two types: reference stations and verification stations. Reference stations were regarded as ordinary water gauging stations, and observed data at these stations were used for calibration and updating of the model. Verification stations were considered as ungaged or arbitrary points, and observed data at these stations were used neither for calibration nor updating but only for evaluation of forecasting accuracy. The result confirms that particle filtering of the downstream part model improves forecasting accuracy of runoff at
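
    One sequential-importance-resampling step of the particle filter can be sketched as follows; the linear storage-discharge relation q = k·s is an illustrative stand-in for the kinematic channel routing model, and all parameter values are invented:

```python
import math
import random

random.seed(1)

def pf_update(particles, obs, k=0.5, sigma=0.2):
    """One SIR particle-filter step: weight each storage particle by the
    Gaussian likelihood of the observed discharge, then resample."""
    w = [math.exp(-0.5 * ((k * s - obs) / sigma) ** 2) for s in particles]
    total = sum(w)
    w = [wi / total for wi in w]
    # systematic resampling: one random offset, evenly spaced thereafter
    n, out = len(particles), []
    u = random.random() / n
    c, j = w[0], 0
    for _ in range(n):
        while u > c:
            j += 1
            c += w[j]
        out.append(particles[j])
        u += 1.0 / n
    return out

# uniform prior over storage; the observation q = 2.0 with k = 0.5 implies
# a true storage near s = 4.0
particles = [random.uniform(0.0, 10.0) for _ in range(1000)]
updated = pf_update(particles, obs=2.0)
mean_s = sum(updated) / len(updated)
```

    No linearization of the routing model is needed, which is the advantage over the extended Kalman filter noted in the abstract.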

  6. Finite element modelling and updating of friction stir welding (FSW) joint for vibration analysis

    Directory of Open Access Journals (Sweden)

    Zahari Siti Norazila

    2017-01-01

    Full Text Available Friction stir welding of aluminium alloys is widely used in automotive and aerospace applications due to its advanced and lightweight properties. The behaviour of FSW joints plays a significant role in the dynamic characteristics of the structure, and because of their complexities and uncertainties the representation of an accurate finite element model of these joints has become a research issue. In this paper, various finite element (FE) modelling techniques for predicting the dynamic properties of sheet metal joined by friction stir welding are presented. Nine sets of flat plates of two aluminium alloy series, AA7075 and AA6061, joined by FSW are used; the nine sets of specimens were fabricated with various welding parameters. To find the optimum set of FSW plates, a finite element model using an equivalence technique was developed and validated against experimental modal analysis (EMA) on the nine sets of specimens and finite element analysis (FEA). Three types of modelling were engaged in this study: rigid body element Type 2 (RBE2), bar element (CBAR) and spot weld element connector (CWELD). The CBAR element was chosen to represent the weld model for FSW joints due to its accurate prediction of mode shapes, and because it contains an updating parameter for weld modelling, unlike the other weld models. Model updating was performed to improve the correlation between EMA and FEA; before updating, a sensitivity analysis was done to select the most sensitive updating parameters. After model updating, the total error of the natural frequencies for the CBAR model improved significantly. Therefore, the CBAR element was selected as the most reliable element in FE modelling to represent the FSW weld joint.

  7. Remote sensing and climate data as a key for understanding fasciolosis transmission in the Andes: review and update of an ongoing interdisciplinary project

    Directory of Open Access Journals (Sweden)

    Màrius V. Fuentes

    2006-11-01

    Full Text Available Fasciolosis caused by Fasciola hepatica in various South American countries located on the slopes of the Andes has been recognized as an important public health problem. However, the importance of this zoonotic hepatic parasite was neglected until the last decade. Countries such as Peru and Bolivia are considered to be hyperendemic areas for human and animal fasciolosis, and other countries such as Chile, Ecuador, Colombia and Venezuela are also affected. At the beginning of the 1990s a multidisciplinary project was launched with the aim to shed light on the problems related to this parasitic disease in the Northern Bolivian Altiplano. A few years later, a geographic information system (GIS was incorporated into this multidisciplinary project analysing the epidemiology of human and animal fasciolosis in this South American Andean region. Various GIS projects were developed in some Andean regions using climatic data, climatic forecast indices and remote sensing data. Step by step, all these GIS projects concerning the forecast of the fasciolosis transmission risk in the Andean mountain range were revised and in some cases updated taking into account new data. The first of these projects was developed on a regional scale for the central Chilean regions and the proposed model was validated on a local scale in the Northern Bolivian Altiplano. This validated mixed model, based on both fasciolosis climatic forecast indices and normalized difference vegetation index values from Advanced Very High Resolution Radiometer satellite sensor, was extrapolated to other human and/or animal endemic areas of Peru and Ecuador. The resulting fasciolosis risk maps make it possible to show the known human endemic areas of, mainly, the Peruvian Altiplano, Cajamarca and Mantaro Peruvian valleys, and some valleys of the Ecuadorian Cotopaxi province. Nevertheless, more climate and remote sensing data, as well as more accurate epidemiological reports, have to be

  8. A new Gibbs sampling based algorithm for Bayesian model updating with incomplete complex modal data

    Science.gov (United States)

    Cheung, Sai Hung; Bansal, Sahil

    2017-08-01

    Model updating using measured system dynamic response has a wide range of applications in system response evaluation and control, health monitoring, or reliability and risk assessment. In this paper, we are interested in model updating of a linear dynamic system with non-classical damping based on incomplete modal data including modal frequencies, damping ratios and partial complex mode shapes of some of the dominant modes. In the proposed algorithm, the identification model is based on a linear structural model where the mass and stiffness matrix are represented as a linear sum of contribution of the corresponding mass and stiffness matrices from the individual prescribed substructures, and the damping matrix is represented as a sum of individual substructures in the case of viscous damping, in terms of mass and stiffness matrices in the case of Rayleigh damping or a combination of the former. To quantify the uncertainties and plausibility of the model parameters, a Bayesian approach is developed. A new Gibbs-sampling based algorithm is proposed that allows for an efficient update of the probability distribution of the model parameters. In addition to the model parameters, the probability distribution of complete mode shapes is also updated. Convergence issues and numerical issues arising in the case of high-dimensionality of the problem are addressed and solutions to tackle these problems are proposed. The effectiveness and efficiency of the proposed method are illustrated by numerical examples with complex modes.
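
    The flavor of a Gibbs sweep, cycling through full conditional posteriors, can be shown on a generic Normal model with unknown mean and precision (conjugate updates); this is a stand-in illustration, not the paper's structural parameterization, and the data values are invented:

```python
import math
import random

random.seed(0)

# Gibbs sampling for a Normal model: alternately draw the mean given the
# precision and the precision given the mean, each from its full
# conditional posterior (flat prior on mu, Jeffreys-like prior on tau).

data = [4.9, 5.2, 5.1, 4.8, 5.0, 5.3]    # e.g. identified modal frequencies
n, xbar = len(data), sum(data) / len(data)

mu, tau = 0.0, 1.0                        # arbitrary starting point
samples = []
for it in range(5000):
    # mu | tau, data ~ Normal(xbar, 1 / (n * tau))
    mu = random.gauss(xbar, 1.0 / math.sqrt(n * tau))
    # tau | mu, data ~ Gamma(shape = n/2, scale = 2 / sum((x - mu)^2))
    sse = sum((x - mu) ** 2 for x in data)
    tau = random.gammavariate(n / 2.0, 2.0 / sse)
    if it >= 1000:                        # discard burn-in
        samples.append(mu)
post_mean = sum(samples) / len(samples)
```

    The paper's sampler cycles through substructure stiffness, mass and damping parameters and the complete mode shapes in the same alternating fashion, with conditionals tailored to the incomplete complex modal data.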

  9. The sequential propensity household projection model

    Directory of Open Access Journals (Sweden)

    Tom Wilson

    2013-04-01

    Full Text Available BACKGROUND The standard method of projecting living arrangements and households in Australia and New Zealand is the 'propensity model', a type of extended headship rate model. Unfortunately it possesses a number of serious shortcomings, including internal inconsistencies, difficulties in setting living arrangement assumptions, and very limited scenario creation capabilities. Data allowing the application of more sophisticated dynamic household projection models are unavailable in Australia. OBJECTIVE The aim was to create a projection model to overcome these shortcomings whilst minimising input data requirements and costs, and retaining the projection outputs users are familiar with. METHODS The sequential propensity household projection model is proposed. Living arrangement projections take place in a sequence of calculations, with progressively more detailed living arrangement categories calculated in each step. In doing so the model largely overcomes the three serious deficiencies of the standard propensity model noted above. RESULTS The model is illustrated by three scenarios produced for one case study State, Queensland. They are: a baseline scenario in which all propensities are held constant to demonstrate the effects of population growth and ageing, a housing crisis scenario where housing affordability declines, and a prosperity scenario where families and individuals enjoy greater real incomes. A sensitivity analysis in which assumptions are varied one by one is also presented. CONCLUSIONS The sequential propensity model offers a more effective method of producing household and living arrangement projections than the standard propensity model, and is a practical alternative to dynamic projection models for countries and regions where the data and resources to apply such models are unavailable.
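
    The propensity (headship-rate) approach that the sequential model extends reduces to one line of arithmetic: projected households are the sum, over age groups, of the projected population times the propensity to be a household reference person. All figures below are invented for illustration:

```python
# Hypothetical age-group populations and household-reference-person
# propensities (headship rates); the numbers are illustrative only.
population = {"15-29": 100_000, "30-59": 150_000, "60+": 80_000}
propensity = {"15-29": 0.20, "30-59": 0.55, "60+": 0.60}

# Projected households = sum of population x propensity over age groups.
households = sum(population[g] * propensity[g] for g in population)
print(households)  # about 150,500 households
```

    The sequential model applies the same multiplication repeatedly, refining broad living-arrangement categories into progressively more detailed ones at each step.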

  10. A Model of Project and Organisational Dynamics

    Directory of Open Access Journals (Sweden)

    Jenny Leonard

    2012-04-01

    Full Text Available The strategic, transformational nature of many information systems projects is now widely understood. Large-scale implementations of systems are known to require significant management of organisational change in order to be successful. Moreover, projects are rarely executed in isolation – most organisations have a large programme of projects being implemented at any one time. However, project and value management methodologies provide ad hoc definitions of the relationship between a project and its environment. This limits the ability of an organisation to manage the larger dynamics between projects and organisations, over time, and between projects. The contribution of this paper, therefore, is to use literature on organisational theory to provide a more systematic understanding of this area. The organisational facilitators required to obtain value from a project are categorised, and the processes required to develop those facilitators are defined. This formalisation facilitates generalisation between projects and highlights any time and path dependencies required in developing organisational facilitators. The model therefore has the potential to contribute to the development of IS project management theory within dynamic organisational contexts. Six cases illustrate how this model could be used.

  11. Clustering of Parameter Sensitivities: Examples from a Helicopter Airframe Model Updating Exercise

    OpenAIRE

    Shahverdi, H.; C. Mares; W. Wang; J. E. Mottershead

    2009-01-01

    The need for high fidelity models in the aerospace industry has become ever more important as increasingly stringent requirements on noise and vibration levels, reliability, maintenance costs etc. come into effect. In this paper, the results of a finite element model updating exercise on a Westland Lynx XZ649 helicopter are presented. For large and complex structures, such as a helicopter airframe, the finite element model represents the main tool for obtaining accurate models which could pre...

  12. K3 projective models in scrolls

    CERN Document Server

    Johnsen, Trygve

    2004-01-01

    The exposition studies projective models of K3 surfaces whose hyperplane sections are non-Clifford general curves. These models are contained in rational normal scrolls. The exposition supplements standard descriptions of models of general K3 surfaces in projective spaces of low dimension, and leads to a classification of K3 surfaces in projective spaces of dimension at most 10. The authors bring further the ideas in Saint-Donat's classical article from 1974, lifting results from canonical curves to K3 surfaces and incorporating much of the Brill-Noether theory of curves and theory of syzygies developed in the meantime.

  13. Yucca Mountain Site Characterization Project Bibliography, July--December 1994: An update

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    Following a reorganization of the Office of Civilian Radioactive Waste Management in 1990, the Yucca Mountain Project was renamed Yucca Mountain Site Characterization Project. The title of this bibliography was also changed to Yucca Mountain Site Characterization Project Bibliography. Prior to August 5, 1988, this project was called the Nevada Nuclear Waste Storage Investigations. This bibliography contains information on this ongoing project that was added to the Department of Energy's Energy Science and Technology Database from July 1, 1994 through December 31, 1994. The bibliography is categorized by principal project participating organization. Participant-sponsored subcontractor reports, papers, and articles are included in the sponsoring organization's list. Another section contains information about publications on the Energy Science and Technology Database that were not sponsored by the project but have some relevance to it.

  14. Yucca Mountain Site Characterization Project bibliography, January--June 1995. Supplement 4, Add.3: An update

    Energy Technology Data Exchange (ETDEWEB)

    Stephan, P.M. [ed.]

    1996-01-01

    Following a reorganization of the Office of Civilian Radioactive Waste Management in 1990, the Yucca Mountain Project was renamed Yucca Mountain Site Characterization Project. The title of this bibliography was also changed to Yucca Mountain Site Characterization Project Bibliography. Prior to August 5, 1988, this project was called the Nevada Nuclear Waste Storage Investigations. This bibliography contains information on this ongoing project that was added to the Department of Energy's Energy Science and Technology Database from January 1, 1995, through June 30, 1995. The bibliography is categorized by principal project participating organization. Participant-sponsored subcontractor reports, papers, and articles are included in the sponsoring organization's list. Another section contains information about publications on the Energy Science and Technology Database that were not sponsored by the project but have some relevance to it.

  15. Yucca Mountain Site Characterization Project Bibliography, January--June 1993. An update: Supplement 4, Addendum 1

    Energy Technology Data Exchange (ETDEWEB)

    Stephan, P.M. [ed.]

    1995-01-01

    Following a reorganization of the Office of Civilian Radioactive Waste Management in 1990, the Yucca Mountain Project was renamed Yucca Mountain Site Characterization Project. The title of this bibliography was also changed to Yucca Mountain Site Characterization Project Bibliography. Prior to August 5, 1988, this project was called the Nevada Nuclear Waste Storage Investigations. This bibliography contains information on this ongoing project that was added to the Department of Energy's Energy Science and Technology Database from January 1, 1994 through June 30, 1994. The bibliography is categorized by principal project participating organization. Participant-sponsored subcontractor reports, papers, and articles are included in the sponsoring organization's list. Another section contains information about publications on the Energy Science and Technology Database that were not sponsored by the project but have some relevance to it.

  16. A New Probability of Detection Model for Updating Crack Distribution of Offshore Structures

    Institute of Scientific and Technical Information of China (English)

    李典庆; 张圣坤; 唐文勇

    2003-01-01

    There exists model uncertainty of probability of detection for inspecting ship structures with nondestructive inspection techniques. Based on a comparison of several existing probability of detection (POD) models, a new probability of detection model is proposed for the updating of crack size distribution. Furthermore, the theoretical derivation shows that most existing probability of detection models are special cases of the new probability of detection model. The least square method is adopted for determining the values of parameters in the new POD model. This new model is also compared with other existing probability of detection models. The results indicate that the new probability of detection model can fit the inspection data better. This new probability of detection model is then applied to the analysis of the problem of crack size updating for offshore structures. The Bayesian updating method is used to analyze the effect of probability of detection models on the posterior distribution of a crack size. The results show that different probabilities of detection models generate different posterior distributions of a crack size for offshore structures.
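
    A widely used POD form (shown here for illustration; it is not the paper's new model, which is not reproduced in this abstract) is the log-odds curve POD(a) = exp(alpha + beta*ln a) / (1 + exp(alpha + beta*ln a)). After a logit transform its parameters can be fitted by the least square method the abstract mentions:

```python
import numpy as np

# Illustrative inspection data: crack sizes a (mm) and observed
# detection fractions; the values are invented for demonstration.
a = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
pod_obs = np.array([0.10, 0.30, 0.60, 0.85, 0.95])

# Log-odds model: logit(POD) = alpha + beta * ln(a), a linear least-squares fit.
X = np.column_stack([np.ones_like(a), np.log(a)])
y = np.log(pod_obs / (1 - pod_obs))
(alpha, beta), *_ = np.linalg.lstsq(X, y, rcond=None)

def pod(size):
    """Fitted probability of detection for a crack of the given size."""
    z = alpha + beta * np.log(size)
    return 1 / (1 + np.exp(-z))

print(alpha, beta)
```

    The fitted curve can then drive a Bayesian update of the crack-size distribution, since P(crack not found | size a) = 1 - POD(a) reweights the prior.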

  17. Nonlinear structural joint model updating based on instantaneous characteristics of dynamic responses

    Science.gov (United States)

    Wang, Zuo-Cai; Xin, Yu; Ren, Wei-Xin

    2016-08-01

    This paper proposes a new nonlinear joint model updating method for shear type structures based on the instantaneous characteristics of the decomposed structural dynamic responses. To obtain an accurate representation of a nonlinear system's dynamics, the nonlinear joint model is described as the nonlinear spring element with bilinear stiffness. The instantaneous frequencies and amplitudes of the decomposed mono-component are first extracted by the analytical mode decomposition (AMD) method. Then, an objective function based on the residuals of the instantaneous frequencies and amplitudes between the experimental structure and the nonlinear model is created for the nonlinear joint model updating. The optimal values of the nonlinear joint model parameters are obtained by minimizing the objective function using the simulated annealing global optimization method. To validate the effectiveness of the proposed method, a single-story shear type structure subjected to earthquake and harmonic excitations is simulated as a numerical example. Then, a beam structure with multiple local nonlinear elements subjected to earthquake excitation is also simulated. The nonlinear beam structure is updated based on the global and local model using the proposed method. The results show that the proposed local nonlinear model updating method is more effective for structures with multiple local nonlinear elements. Finally, the proposed method is verified by the shake table test of a real high voltage switch structure. The accuracy of the proposed method is quantified both in numerical and experimental applications using the defined error indices. Both the numerical and experimental results have shown that the proposed method can effectively update the nonlinear joint model.

  18. Finite element model updating of natural fibre reinforced composite structure in structural dynamics

    Directory of Open Access Journals (Sweden)

    Sani M.S.M.

    2016-01-01

    Full Text Available Model updating is a process of adjusting certain parameters of a finite element model in order to reduce the discrepancy between analytical predictions of the finite element (FE) model and experimental results. Finite element model updating is considered an important field of study, as practical applications of the finite element method often show discrepancies with test results. The aim of this research is to perform a model updating procedure on a composite structure and to improve the presumed geometrical and material properties of the tested composite structure in the finite element prediction. The composite structure concerned in this study is a plate of kenaf fiber reinforced epoxy. Modal properties (natural frequencies, mode shapes, and damping ratios) of the kenaf fiber structure will be determined using both experimental modal analysis (EMA) and finite element analysis (FEA). In EMA, modal testing will be carried out using an impact hammer test, while normal mode analysis in FEA will be carried out using MSC Nastran/Patran software. Correlation of the data will be carried out before optimizing the data from FEA. Several parameters will be considered and selected for the model updating procedure.
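
    The correlation step between EMA and FEA results is commonly quantified with the modal assurance criterion (MAC); a minimal real-valued implementation (generic, not specific to the MSC Nastran/Patran workflow above):

```python
import numpy as np

def mac(phi_a, phi_e):
    """Modal assurance criterion between two real mode-shape vectors.

    Returns 1.0 for perfectly correlated shapes and 0.0 for orthogonal ones;
    for complex modes the dot products would need conjugation.
    """
    num = abs(phi_a @ phi_e) ** 2
    return num / ((phi_a @ phi_a) * (phi_e @ phi_e))

# Illustrative shapes: a vector against a scaled copy of itself and
# against an orthogonal vector.
phi = np.array([1.0, 0.8, 0.3, -0.2])
print(mac(phi, 2.5 * phi))  # ~1.0: scaling does not affect MAC
print(mac(np.array([1.0, 0.0]), np.array([0.0, 1.0])))  # 0.0
```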

  19. Teaching mathematical modelling through project work

    DEFF Research Database (Denmark)

    Blomhøj, Morten; Kjeldsen, Tinne Hoff

    2006-01-01

    The paper presents and analyses experiences from developing and running an in-service course in project work and mathematical modelling for mathematics teachers in the Danish gymnasium, e.g. upper secondary level, grade 10-12. The course objective is to support the teachers to develop, try out in their own classes, evaluate and report a project based problem oriented course in mathematical modelling. The in-service course runs over one semester and includes three seminars of 3, 1 and 2 days. Experiences show that the course objectives in general are fulfilled and that the course projects are reported in manners suitable for internet publication for colleagues. The reports and the related discussions reveal interesting dilemmas concerning the teaching of mathematical modelling and how to cope with these through “setting the scene” for the students modelling projects and through dialogues...

  20. Causal Models for Safety Assurance Technologies Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Fulfillment of NASA's System-Wide Safety and Assurance Technology (SSAT) project at NASA requires leveraging vast amounts of data into actionable knowledge. Models...

  1. Teaching mathematical modelling through project work

    DEFF Research Database (Denmark)

    Blomhøj, Morten; Kjeldsen, Tinne Hoff

    2006-01-01

    The paper presents and analyses experiences from developing and running an in-service course in project work and mathematical modelling for mathematics teachers in the Danish gymnasium, e.g. upper secondary level, grade 10-12. The course objective is to support the teachers to develop, try out...... in their own classes, evaluate and report a project based problem oriented course in mathematical modelling. The in-service course runs over one semester and includes three seminars of 3, 1 and 2 days. Experiences show that the course objectives in general are fulfilled and that the course projects...... are reported in manners suitable for internet publication for colleagues. The reports and the related discussions reveal interesting dilemmas concerning the teaching of mathematical modelling and how to cope with these through “setting the scene” for the students modelling projects and through dialogues...

  2. Developing Project Duration Models in Software Engineering

    Institute of Scientific and Technical Information of China (English)

    Pierre Bourque; Serge Oligny; Alain Abran; Bertrand Fournier

    2007-01-01

    Based on the empirical analysis of data contained in the International Software Benchmarking Standards Group (ISBSG) repository, this paper presents software engineering project duration models based on project effort. Duration models are built for the entire dataset and for subsets of projects developed for personal computer, mid-range and mainframe platforms. Duration models are also constructed for projects requiring fewer than 400 person-hours of effort and for projects requiring more than 400 person-hours of effort. The usefulness of adding the maximum number of assigned resources as a second independent variable to explain duration is also analyzed. The opportunity to build duration models directly from project functional size in function points is investigated as well.
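
    Effort-based duration models of this kind are often power laws, duration = a * effort^b, fitted by ordinary least squares in log-log space. The sketch below uses invented project data, not the ISBSG repository:

```python
import numpy as np

# Invented (effort in person-hours, duration in months) project data.
effort = np.array([100.0, 400.0, 1200.0, 5000.0, 20000.0])
duration = np.array([2.0, 3.5, 6.0, 11.0, 24.0])

# Fit ln(duration) = ln(a) + b * ln(effort) by least squares.
b, log_a = np.polyfit(np.log(effort), np.log(duration), 1)
a = np.exp(log_a)

def predict(e):
    """Predicted duration (months) for a project of effort e person-hours."""
    return a * e ** b

print(a, b)
```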

  3. Using radar altimetry to update a routing model of the Zambezi River Basin

    DEFF Research Database (Denmark)

    Michailovsky, Claire Irene B.; Bauer-Gottwein, Peter

    2012-01-01

    Satellite radar altimetry allows for the global monitoring of lakes and river levels. However, the widespread use of altimetry for hydrological studies is limited by the coarse temporal and spatial resolution provided by current altimetric missions and the fact that discharge rather than level is needed for hydrological applications. To overcome these limitations, altimetry river levels can be combined with hydrological modeling in a data assimilation framework. This study focuses on the updating of a river routing model of the Zambezi using river levels from radar altimetry. A hydrological model of the basin was built to simulate the land phase of the water cycle and produce inflows to a Muskingum routing model. River altimetry from the ENVISAT mission was then used to update the storages in the reaches of the Muskingum model using the Extended Kalman Filter. The method showed improvements in modeled...
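
    The Muskingum scheme referenced here routes inflow to outflow per reach with three coefficients derived from the storage constant K, the weighting factor X and the time step dt; the parameter values below are illustrative, not the Zambezi calibration:

```python
def muskingum_coeffs(K, X, dt):
    """Standard Muskingum routing coefficients; they sum to 1."""
    denom = 2 * K * (1 - X) + dt
    c0 = (dt - 2 * K * X) / denom
    c1 = (dt + 2 * K * X) / denom
    c2 = (2 * K * (1 - X) - dt) / denom
    return c0, c1, c2

def route(inflow, K=12.0, X=0.2, dt=6.0):
    """Route an inflow hydrograph (m3/s, step dt hours) through one reach."""
    c0, c1, c2 = muskingum_coeffs(K, X, dt)
    outflow = [inflow[0]]  # assume initial steady state
    for i_prev, i_now in zip(inflow, inflow[1:]):
        outflow.append(c0 * i_now + c1 * i_prev + c2 * outflow[-1])
    return outflow

hydrograph = [10, 30, 70, 50, 30, 20, 15, 12, 10]
print(route(hydrograph))  # attenuated, delayed peak
```

    In the assimilation step of the study, the reach storages implied by this recursion are the states updated by the Extended Kalman Filter using the altimetric levels.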

  4. A pose-based structural dynamic model updating method for serial modular robots

    Science.gov (United States)

    Mohamed, Richard Phillip; Xi, Fengfeng (Jeff); Chen, Tianyan

    2017-02-01

    A new approach is presented for updating the structural dynamic component models of serial modular robots using experimental data from component tests such that the updated model of the entire robot assembly can provide accurate results in any pose. To accomplish this, a test-analysis component mode synthesis (CMS) model with fixed-free component boundaries is implemented to directly compare measured frequency response functions (FRFs) from vibration experiments of individual modules. The experimental boundary conditions are made to emulate module connection interfaces and can enable individual joint and link modules to be tested in arbitrary poses. By doing so, changes in the joint dynamics can be observed and more FRF data points can be obtained from experiments to be used in the updating process. Because this process yields an overdetermined system of equations, a direct search method with nonlinear constraints on the resonances and antiresonances is used to update the FRFs of the analytical component models. The effectiveness of the method is demonstrated with experimental case studies on an adjustable modular linkage system. Overall, the method can enable virtual testing of modular robot systems without the need to perform further testing on entire assemblies.

  5. Progressive collapse analysis using updated models for alternate path analysis after a blast

    Science.gov (United States)

    Eskew, Edward; Jang, Shinae; Bertolaccini, Kelly

    2016-04-01

    Progressive collapse is of rising importance within the structural engineering community due to several recent cases. The alternate path method is a design technique to determine the ability of a structure to sustain the loss of a critical element, or elements, and still resist progressive collapse. However, the alternate path method only considers the removal of the critical elements. In the event of a blast, significant damage may occur to nearby members not included in the alternate path design scenarios. To achieve an accurate assessment of the current condition of the structure after a blast or other extreme event, it may be necessary to reduce the strength of, or remove, additional elements beyond the critical members designated in the alternate path design method. In this paper, a rapid model updating technique utilizing vibration measurements is used to update the structural model to represent the real-time condition of the structure after a blast occurs. Based upon the updated model, damaged elements will either have their strength reduced or will be removed from the simulation. The alternate path analysis will then be performed, but utilizing only the updated structural model instead of numerous scenarios. After the analysis, the simulated response will be compared to failure conditions to determine the building's post-event condition. This method has the ability to incorporate damage to noncritical members into the analysis. This paper will utilize numerical simulations based upon a unified facilities criteria (UFC) example structure subjected to an equivalent blast to validate the methodology.

  6. Updating sea spray aerosol emissions in the Community Multiscale Air Quality (CMAQ) model

    Science.gov (United States)

    Sea spray aerosols (SSA) impact the particle mass concentration and gas-particle partitioning in coastal environments, with implications for human and ecosystem health. In this study, the Community Multiscale Air Quality (CMAQ) model is updated to enhance fine mode SSA emissions,...

  7. Spatial coincidence modelling, automated database updating and data consistency in vector GIS.

    NARCIS (Netherlands)

    Kufoniyi, O.

    1995-01-01

    This thesis presents formal approaches for automated database updating and consistency control in vector- structured spatial databases. To serve as a framework, a conceptual data model is formalized for the representation of geo-data from multiple map layers in which a map layer denotes a set of ter

  8. Towards an integrated workflow for structural reservoir model updating and history matching

    NARCIS (Netherlands)

    Leeuwenburgh, O.; Peters, E.; Wilschut, F.

    2011-01-01

    A history matching workflow, as typically used for updating of petrophysical reservoir model properties, is modified to include structural parameters including the top reservoir and several fault properties: position, slope, throw and transmissibility. A simple 2D synthetic oil reservoir produced by

  10. Custom map projections for regional groundwater models

    Science.gov (United States)

    Kuniansky, Eve L.

    2017-01-01

    For regional groundwater flow models (areas greater than 100,000 km2), improper choice of map projection parameters can result in model error for boundary conditions dependent on area (recharge or evapotranspiration simulated by application of a rate using cell area from model discretization) and length (rivers simulated with head-dependent flux boundary). Smaller model areas can use local map coordinates, such as State Plane (United States) or Universal Transverse Mercator (correct zone), without introducing large errors. Map projections vary in order to preserve one or more of the following properties: area, shape, distance (length), or direction. Numerous map projections have been developed for different purposes, as all four properties cannot be preserved simultaneously. Preservation of area and length is most critical for groundwater models. The Albers equal-area conic projection with custom standard parallels, selected by dividing the north-south length by 6 and placing the standard parallels 1/6th of that length inside the southern and northern extents, preserves both area and length for continental areas in mid latitudes oriented east-west. Custom map projection parameters can also minimize area and length error in non-ideal projections. Additionally, one must use consistent vertical and horizontal datums for all geographic data. The generalized polygon for the Floridan aquifer system study area (306,247.59 km2) is used to provide quantitative examples of the effect of map projections on length and area with different projections and parameter choices. Use of an improper map projection is one model construction problem easily avoided.
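
    The standard-parallel rule described above (divide the north-south extent by six and place the parallels one-sixth of that length inside the southern and northern limits) is simple arithmetic. The latitude limits below are round illustrative values, not the exact Floridan study polygon:

```python
def custom_standard_parallels(lat_south, lat_north):
    """Albers equal-area standard parallels by the one-sixth rule:
    each parallel sits 1/6 of the north-south extent inside a limit."""
    sixth = (lat_north - lat_south) / 6.0
    return lat_south + sixth, lat_north - sixth

# Illustrative domain spanning roughly 24N to 36N.
sp1, sp2 = custom_standard_parallels(24.0, 36.0)
print(sp1, sp2)  # 26.0 34.0
```

    The resulting pair would feed the projection definition (for example PROJ's `+proj=aea +lat_1=... +lat_2=...`) together with a central meridian and latitude of origin chosen for the model domain.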

  11. A model updating method for hybrid composite/aluminum bolted joints using modal test data

    Science.gov (United States)

    Adel, Farhad; Shokrollahi, Saeed; Jamal-Omidi, Majid; Ahmadian, Hamid

    2017-05-01

    The aim of this paper is to present a simple and applicable model for predicting the dynamic behavior of bolted joints in hybrid aluminum/composite structures, and its model updating using modal test data. To this end, building on earlier investigations of bolted joints in metallic structures, which led to a new concept called the joint affected region (JAR) published in Shokrollahi and Adel (2016), a doubly connective layer is established in order to simulate the bolted joint interfaces in hybrid structures. Using the proposed model, the natural frequencies of the hybrid bolted joint structure are computed and compared to the modal test results in order to evaluate and verify the new model predictions. Because of differences between the results of the two approaches, the finite element (FE) model is updated based on the genetic algorithm (GA) by minimizing the differences between the analytical model and test results. This is done by identifying the parameters at the JAR, including the isotropic Young's modulus of the metallic substructure and that of the anisotropic composite substructure. The updated model simulates the experimental results more closely than the initial model. Therefore, the proposed model can be used for modal analysis of hybrid joint interfaces in complex and large structures.

  12. Stochastic filtering for damage identification through nonlinear structural finite element model updating

    Science.gov (United States)

    Astroza, Rodrigo; Ebrahimian, Hamed; Conte, Joel P.

    2015-03-01

    This paper describes a novel framework that combines advanced mechanics-based nonlinear (hysteretic) finite element (FE) models and stochastic filtering techniques to estimate unknown time-invariant parameters of nonlinear inelastic material models used in the FE model. Using input-output data recorded during earthquake events, the proposed framework updates the nonlinear FE model of the structure. The updated FE model can be directly used for damage identification and further used for damage prognosis. To update the unknown time-invariant parameters of the FE model, two alternative stochastic filtering methods are used: the extended Kalman filter (EKF) and the unscented Kalman filter (UKF). A three-dimensional, 5-story, 2-by-1 bay reinforced concrete (RC) frame is used to verify the proposed framework. The RC frame is modeled using fiber-section displacement-based beam-column elements with distributed plasticity and is subjected to the ground motion recorded at the Sylmar station during the 1994 Northridge earthquake. The results indicate that the proposed framework accurately estimates the unknown material parameters of the nonlinear FE model. The UKF outperforms the EKF when the relative root-mean-square errors of the recorded responses are compared. In addition, the results suggest that the convergence of the estimates of the modeling parameters is smoother and faster when the UKF is utilized.
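
    The joint state-parameter estimation idea behind the EKF variant can be sketched on a scalar toy system: the unknown time-invariant parameter is appended to the state vector and estimated from noisy measurements. This is a generic joint EKF, not the paper's fiber-section FE formulation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Truth: x_{k+1} = theta * x_k + w_k, y_k = x_k + v_k, with theta unknown.
theta_true, q, r = 0.9, 0.1, 0.05
x, ys = 1.0, []
for _ in range(400):
    x = theta_true * x + rng.normal(0.0, np.sqrt(q))
    ys.append(x + rng.normal(0.0, np.sqrt(r)))

# Joint EKF over the augmented state z = [x, theta].
z = np.array([0.0, 0.5])           # initial guesses for x and theta
P = np.diag([1.0, 1.0])
Q = np.diag([q, 0.0])              # theta is time-invariant: no process noise
H = np.array([[1.0, 0.0]])
for y in ys:
    F = np.array([[z[1], z[0]],    # Jacobian of f(z) = [theta * x, theta]
                  [0.0, 1.0]])
    z = np.array([z[1] * z[0], z[1]])        # predict
    P = F @ P @ F.T + Q
    S = (H @ P @ H.T + r)[0, 0]              # innovation variance
    K = (P @ H.T / S).ravel()                # Kalman gain
    z = z + K * (y - z[0])                   # measurement update
    P = (np.eye(2) - np.outer(K, H)) @ P

print(z[1])  # estimate of theta, drifting toward 0.9
```

    The UKF replaces this Jacobian linearization with sigma-point propagation, which is what the paper finds gives smoother and faster parameter convergence.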

  13. Updating parameters of the chicken processing line model

    DEFF Research Database (Denmark)

    Kurowicka, Dorota; Nauta, Maarten; Jozwiak, Katarzyna

    2010-01-01

    A mathematical model of chicken processing that quantitatively describes the transmission of Campylobacter on chicken carcasses from slaughter to chicken meat product has been developed in Nauta et al. (2005). This model was quantified with expert judgment. Recent availability of data allows...... of the chicken processing line model....

  14. Assessing Model Selection Uncertainty Using a Bootstrap Approach: An Update

    NARCIS (Netherlands)

    Lubke, Gitta H.; Campbell, Ian; McArtor, Dan; Miller, Patrick; Luningham, Justin; van den Berg, Stéphanie Martine

    2017-01-01

    Model comparisons in the behavioral sciences often aim at selecting the model that best describes the structure in the population. Model selection is usually based on fit indexes such as Akaike’s information criterion (AIC) or Bayesian information criterion (BIC), and inference is done based on the

  15. Updated Results for the Wake Vortex Inverse Model

    Science.gov (United States)

    Robins, Robert E.; Lai, David Y.; Delisi, Donald P.; Mellman, George R.

    2008-01-01

    NorthWest Research Associates (NWRA) has developed an Inverse Model for inverting aircraft wake vortex data. The objective of the inverse modeling is to obtain estimates of the vortex circulation decay and crosswind vertical profiles, using time history measurements of the lateral and vertical position of aircraft vortices. The Inverse Model performs iterative forward model runs using estimates of vortex parameters, vertical crosswind profiles, and vortex circulation as a function of wake age. Iterations are performed until a user-defined criterion is satisfied. Outputs from an Inverse Model run are the best estimates of the time history of the vortex circulation derived from the observed data, the vertical crosswind profile, and several vortex parameters. The forward model, named SHRAPA, used in this inverse modeling is a modified version of the Shear-APA model, and it is described in Section 2 of this document. Details of the Inverse Model are presented in Section 3. The Inverse Model was applied to lidar-observed vortex data at three airports: FAA acquired data from San Francisco International Airport (SFO) and Denver International Airport (DEN), and NASA acquired data from Memphis International Airport (MEM). The results are compared with observed data. This Inverse Model validation is documented in Section 4. A summary is given in Section 5. A user's guide for the inverse wake vortex model is presented in a separate NorthWest Research Associates technical report (Lai and Delisi, 2007a).

  16. Assessing Model Selection Uncertainty Using a Bootstrap Approach: An Update

    NARCIS (Netherlands)

    Lubke, Gitta H.; Campbell, Ian; McArtor, Dan; Miller, Patrick; Luningham, Justin; Berg, van den Stephanie M.

    2016-01-01

    Model comparisons in the behavioral sciences often aim at selecting the model that best describes the structure in the population. Model selection is usually based on fit indexes such as Akaike’s information criterion (AIC) or Bayesian information criterion (BIC), and inference is done based on the

  18. BPS-ICF model, a tool to measure biopsychosocial functioning and disability within ICF concepts: theory and practice updated.

    Science.gov (United States)

    Talo, Seija A; Rytökoski, Ulla M

    2016-03-01

    The transformation of the International Classification of Impairments, Disabilities and Handicaps into the International Classification of Functioning, Disability and Health (ICF) meant a lot for those needing to communicate in terms of the functioning concept in their daily work. With ICF's commonly understood language, decades of uncertainty about which concepts and terms describe functioning and disability seemed to be dispelled. Operationalizing ICF to measure the level of functioning along with the new nomenclature, however, has not been as unambiguous. Transforming linguistic terms into quantified functioning seems to require another type of theorizing. Despite the challenge, numerous projects over the past decades have applied ICF for measurement purposes. This article updates one of them, the so-called biopsychosocial-ICF model, which uses all ICF categories but classifies them into more components than ICF does for measurement purposes. The model suggests that both disabilities and functional resources should be described by collecting and organizing functional measurement data in a multidisciplinary, biopsychosocial data matrix.

  19. The Middle Eastern Regional Irrigation Management Information Systems project-update

    Science.gov (United States)

    The Middle Eastern Regional Irrigation Management Information Systems Project (MERIMIS) was formulated at a meeting of experts from the region in Jordan in 2003. Funded by the U.S. Department of State, it is a cooperative regional project bringing together participants from Israel, Jordan, Palestini...

  20. Updating Finite Element Model of a Wind Turbine Blade Section Using Experimental Modal Analysis Results

    DEFF Research Database (Denmark)

    Luczak, Marcin; Manzato, Simone; Peeters, Bart;

    2014-01-01

    The aim is to validate a finite element model of the modified wind turbine blade section mounted in the flexible support structure according to the experimental results. Bend-twist coupling was implemented by adding angled unidirectional layers on the suction and pressure sides of the blade. Dynamic tests and simulations were performed on a section of a full-scale wind turbine blade provided by Vestas Wind Systems A/S. The numerical results are compared to the experimental measurements and the discrepancies are assessed by natural frequency difference and modal assurance criterion. Based on sensitivity analysis, a set of model parameters was selected for the model updating process. Design of experiments and the response surface method were implemented to find values of model parameters yielding results closest to the experimental ones. The updated finite element model produces results more consistent with the measurements.

  1. PRINCIPAL COMPONENT DECOMPOSITION BASED FINITE ELEMENT MODEL UPDATING FOR STRAIN-RATE-DEPENDENCE NONLINEAR DYNAMIC PROBLEMS

    Institute of Scientific and Technical Information of China (English)

    GUO Qintao; ZHANG Lingmi; TAO Zheng

    2008-01-01

    Thin-walled components are utilized to absorb the impact energy of a structure. However, the dynamic behavior of such thin-walled structures is highly nonlinear, with material, geometric and boundary nonlinearity. A model updating and validation procedure is proposed to build an accurate finite element model of a frame structure with a nonlinear thin-walled component for dynamic analysis. Design of experiments (DOE) and a principal component decomposition (PCD) approach are applied to extract dynamic features from the nonlinear impact response for correlation of the impact test results and the FE model of the nonlinear structure. A strain-rate-dependent nonlinear model updating method is then developed to build an accurate FE model of the structure. Computer simulation and a real frame structure with a highly nonlinear thin-walled component are employed to demonstrate the feasibility and effectiveness of the proposed approach.
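    As a rough illustration of the PCD step, the sketch below reduces an ensemble of synthetic impact-response histories to a few principal-component scores via SVD; all signals, dimensions and parameter ranges are invented, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(8)
t = np.linspace(0.0, 1.0, 200)
# synthetic impact responses: decaying oscillations with varied amplitude and
# decay rate, standing in for test / FE runs of the nonlinear structure
runs = np.array([a * np.exp(-d * t) * np.sin(20 * t)
                 for a, d in zip(rng.uniform(0.8, 1.2, 20),
                                 rng.uniform(2.0, 6.0, 20))])

# principal component decomposition of the centred response ensemble
runs_c = runs - runs.mean(axis=0)
U, s, Vt = np.linalg.svd(runs_c, full_matrices=False)
scores = runs_c @ Vt[:2].T                 # 2-component feature per run
var_explained = float((s[:2] ** 2).sum() / (s ** 2).sum())
print(scores.shape, round(var_explained, 3))
```

The low-dimensional scores, rather than the full time histories, are what get correlated between test and FE model during updating.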

  2. Parabolic Trough Collector Cost Update for the System Advisor Model (SAM)

    Energy Technology Data Exchange (ETDEWEB)

    Kurup, Parthiv [National Renewable Energy Lab. (NREL), Golden, CO (United States); Turchi, Craig S. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2015-11-01

    This report updates the baseline cost for parabolic trough solar fields in the United States within NREL's System Advisor Model (SAM). SAM, available at no cost at https://sam.nrel.gov/, is a performance and financial model designed to facilitate decision making for people involved in the renewable energy industry. SAM is the primary tool used by NREL and the U.S. Department of Energy (DOE) for estimating the performance and cost of concentrating solar power (CSP) technologies and projects. The study performed a bottom-up build and cost estimate for two state-of-the-art parabolic trough designs -- the SkyTrough and the Ultimate Trough. The SkyTrough analysis estimated the potential installed cost for a solar field of 1500 SCAs as $170/m2 +/- $6/m2. The investigation found that SkyTrough installed costs were sensitive to factors such as raw aluminum alloy cost and production volume. For example, in the case of the SkyTrough, the installed cost would rise to nearly $210/m2 if the aluminum alloy cost was $1.70/lb instead of $1.03/lb. Accordingly, one must be aware of fluctuations in the relevant commodities markets to track system cost over time. The estimated installed cost for the Ultimate Trough was only slightly higher at $178/m2, which includes an assembly facility of $11.6 million amortized over the required production volume. Considering the size and overall cost of a 700 SCA Ultimate Trough solar field, two parallel production lines in a fully covered assembly facility, each with the specific torque box, module and mirror jigs, would be justified for a full CSP plant.

  3. EXPENSES FORECASTING MODEL IN UNIVERSITY PROJECTS PLANNING

    Directory of Open Access Journals (Sweden)

    Sergei A. Arustamov

    2016-11-01

    The paper deals with a mathematical model of cash flows in project funding. We describe different types of expenses linked to university project activities and identify the budgeting problems that contribute the most uncertainty. As an example of the model implementation, we consider the calculation of vacation allowance expenses for project participants. We consider several approaches to forecasting funds reservation: calculation based on the methodology established by the Ministry of Education and Science; calculation according to the vacation schedule; and prediction of the most probable amount. A stochastic model for vacation allowance expenses has been developed. We propose methods and solutions that increase the accuracy of forecasting for funds reservation, based on 2015 data.

  4. Applications of EOR (enhanced oil recovery) technology in field projects--1990 update

    Energy Technology Data Exchange (ETDEWEB)

    Pautz, J.F.; Thomas, R.D.

    1991-01-01

    Trends in the type and number of US enhanced oil recovery (EOR) projects are analyzed for the period from 1980 through 1989. The analysis is based on current literature and news media and the Department of Energy (DOE) EOR Project Data Base, which contains information on over 1,348 projects. The characteristics of the EOR projects are grouped by starting date and process type to identify trends in reservoir statistics and applications of process technologies. Twenty-two EOR project starts were identified for 1989 and ten for 1988. An obvious trend has been the decline in the number of project starts from 1981 until 1988, corresponding to the oil price decline during that period. The modest recovery in project starts in 1989 lags the modest recovery of oil prices in 1987, which was reconfirmed in 1989. From 1980 to 1989 there was a gradual improvement in the costs of operating EOR technology: the perceived average cost of EOR came down from the $30/bbl range to the low $20/bbl range. These operating costs seem to stay at, or slightly above, the price of oil, resulting in marginal profitability. The use of polymer flooding has drastically decreased in both actual and relative numbers of project starts since the oil price drop in 1986, and production from polymer flooding is down more than 50%. Long-term plans for large, high-cost projects such as CO{sub 2} flooding in West Texas, steamflooding in California, and hydrocarbon flooding on the North Slope have continued to be implemented. EOR process technologies have been refined to be more cost effective, as shown by the continued application and rising production attributable to EOR. 8 refs., 6 figs., 13 tabs.

  5. Updated cloud physics in a regional atmospheric climate model improves the modelled surface energy balance of Antarctica

    NARCIS (Netherlands)

    van Wessem, J.M.; Reijmer, C.H.; Lenaerts, J.T.M.; van de Berg, W.J.; van den Broeke, M.R.; van Meijgaard, E.

    2014-01-01

    In this study the effects of changes in the physics package of the regional atmospheric climate model RACMO2 on the modelled surface energy balance, nearsurface temperature and wind speed of Antarctica are presented. The physics package update primarily consists of an improved turbulent and radiativ

  6. Base Flow Model Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The program focuses on turbulence modeling enhancements for predicting high-speed rocket base flows. A key component of the effort is the collection of high-fidelity...

  7. Marshal: Maintaining Evolving Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — SIFT proposes to design and develop the Marshal system, a mixed-initiative tool for maintaining task models over the course of evolving missions. Marshal-enabled...

  8. Advanced Spacecraft Thermal Modeling Project

    Data.gov (United States)

    National Aeronautics and Space Administration — For spacecraft developers who spend millions to billions of dollars per unit and require 3 to 7 years to deploy, the LoadPath reduced-order (RO) modeling thermal...

  9. Base Flow Model Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is the systematic "building-block" validation of CFD/turbulence models employing a GUI driven CFD code (RPFM) and existing as well as new data sets to...

  10. Model update and variability assessment for automotive crash simulations

    NARCIS (Netherlands)

    Sun, J.; He, J.; Vlahopoulos, N.; Ast, P. van

    2007-01-01

    In order to develop confidence in numerical models which are used for automotive crash simulations, results are often compared with test data, and in some cases the numerical models are adjusted in order to improve the correlation. Comparisons between the time history of acceleration responses from

  11. Varying facets of a model of competitive learning: the role of updates and memory

    CERN Document Server

    Bhat, Ajaz Ahmad

    2011-01-01

    The effects of memory and different updating paradigms in a game-theoretic model of competitive learning, comprising two distinct agent types, are analysed. For nearly all the updating schemes, the phase diagram of the model consists of a disordered phase separating two ordered phases at coexistence: the critical exponents of these transitions belong to the generalised universality class of the voter model. Also, as appropriate for a model of competing strategies, we examine the situation when the two types have different characteristics, i.e. their parameters are chosen to be away from coexistence. We find linear response behaviour in the expected regimes but, more interestingly, are able to probe the effect of memory. This suggests that even the less successful agent types can win over the more successful ones, provided they have better retentive powers.

  12. An update on land-ice modeling in the CESM

    Energy Technology Data Exchange (ETDEWEB)

    Lipscomb, William H [Los Alamos National Laboratory

    2011-01-18

    Mass loss from land ice, including the Greenland and Antarctic ice sheets as well as smaller glaciers and ice caps, is making a large and growing contribution to global sea-level rise. Land ice is only beginning to be incorporated in climate models. The goal of the Land Ice Working Group (LIWG) is to develop improved land-ice models and incorporate them in CESM, in order to provide useful, physically-based sea-level predictions. LIWG efforts to date have led to the inclusion of a dynamic ice-sheet model (the Glimmer Community Ice Sheet Model, or Glimmer-CISM) in the Community Earth System Model (CESM), which was released in June 2010. CESM also includes a new surface-mass-balance scheme for ice sheets in the Community Land Model. Initial modeling efforts are focused on the Greenland ice sheet. Preliminary results are promising. In particular, the simulated surface mass balance for Greenland is in good agreement with observations and regional model results. The current model, however, has significant limitations: the land-ice coupling is one-way; we are using a serial version of Glimmer-CISM with the shallow-ice approximation; and there is no ice-ocean coupling. During the next year we plan to implement two-way coupling (including ice-ocean coupling with a dynamic Antarctic ice sheet) with a parallel, higher-order version of Glimmer-CISM. We will also add parameterizations of small glaciers and ice caps. With these model improvements, CESM will be able to simulate all the major contributors to 21st century global sea-level rise. Results of the first round of simulations should be available in time to be included in the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change.

  13. Hydraulic model with roughness coefficient updating method based on Kalman filter for channel flood forecast

    Directory of Open Access Journals (Sweden)

    Hong-jun BAO

    2011-03-01

    A real-time channel flood forecast model was developed to simulate channel flow in plain rivers based on the dynamic wave theory. Taking into consideration channel shape differences along the channel, a roughness updating technique was developed using the Kalman filter method to update Manning’s roughness coefficient at each time step of the calculation processes. Channel shapes were simplified as rectangles, triangles, and parabolas, and the relationships between hydraulic radius and water depth were developed for plain rivers. Based on the relationship between the Froude number and the inertia terms of the momentum equation in the Saint-Venant equations, the relationship between Manning’s roughness coefficient and water depth was obtained. Using the channel of the Huaihe River from Wangjiaba to Lutaizi stations as a case, to test the performance and rationality of the present flood routing model, the original hydraulic model was compared with the developed model. Results show that the stage hydrographs calculated by the developed flood routing model with the updated Manning’s roughness coefficient have a good agreement with the observed stage hydrographs. This model performs better than the original hydraulic model.
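    The roughness-updating idea can be sketched with a scalar Kalman filter in which Manning's n is a random-walk state and, purely for illustration, the observed stage is assumed to depend linearly on n near the operating point; all numbers are invented, not from the Huaihe River case:

```python
import numpy as np

# Scalar Kalman filter updating Manning's n at each time step.
# Assumed linearised observation model: h_obs ≈ H * n + noise.
n_est, P = 0.030, 1e-4        # initial roughness estimate and its variance
Q, R, H = 1e-6, 1e-4, 10.0    # process noise, obs-noise variance, sensitivity

rng = np.random.default_rng(2)
n_true = 0.045                # "true" roughness the filter should recover
for _ in range(50):
    h_obs = H * n_true + rng.normal(0.0, R ** 0.5)
    # predict (random-walk state), then update with the Kalman gain
    P = P + Q
    K = P * H / (H * H * P + R)
    n_est = n_est + K * (h_obs - H * n_est)
    P = (1.0 - K * H) * P

print(round(n_est, 3))
```

In the real model the stage-roughness relation comes from the Saint-Venant equations and the simplified channel geometries, so the observation operator is nonlinear and re-linearised each step, but the predict/update cycle is the same.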

  14. Project Update: A Collaborative Partnership Between Federal, State, and Non-Government Organizations (NGOs)

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This document provides a project overview, accomplishments, and future efforts in the Integrated Waterbird Management and Monitoring (IWMM) initiative.

  15. NASA's International Lunar Network Anchor Nodes and Robotic Lunar Lander Project Update

    Science.gov (United States)

    Cohen, Barbara A.; Bassler, Julie A.; Ballard, Benjamin; Chavers, Greg; Eng, Doug S.; Hammond, Monica S.; Hill, Larry A.; Harris, Danny W.; Hollaway, Todd A.; Kubota, Sanae; Morse, Brian J.; Mulac, Brian D.; Reed, Cheryl L.

    2010-01-01

    NASA Marshall Space Flight Center and The Johns Hopkins University Applied Physics Laboratory have been conducting mission studies and performing risk reduction activities for NASA's robotic lunar lander flight projects. Additional mission studies have been conducted to support other objectives of the lunar science and exploration community and extensive risk reduction design and testing has been performed to advance the design of the lander system and reduce development risk for flight projects.

  16. Dynamic causal modelling of electrographic seizure activity using Bayesian belief updating.

    Science.gov (United States)

    Cooray, Gerald K; Sengupta, Biswa; Douglas, Pamela K; Friston, Karl

    2016-01-15

    Seizure activity in EEG recordings can persist for hours with seizure dynamics changing rapidly over time and space. To characterise the spatiotemporal evolution of seizure activity, large data sets often need to be analysed. Dynamic causal modelling (DCM) can be used to estimate the synaptic drivers of cortical dynamics during a seizure; however, the requisite (Bayesian) inversion procedure is computationally expensive. In this note, we describe a straightforward procedure, within the DCM framework, that provides efficient inversion of seizure activity measured with non-invasive and invasive physiological recordings; namely, EEG/ECoG. We describe the theoretical background behind a Bayesian belief updating scheme for DCM. The scheme is tested on simulated and empirical seizure activity (recorded both invasively and non-invasively) and compared with standard Bayesian inversion. We show that the Bayesian belief updating scheme provides similar estimates of time-varying synaptic parameters, compared to standard schemes, indicating no significant qualitative change in accuracy. The difference in variance explained was small (less than 5%). The updating method was substantially more efficient, taking approximately 5-10min compared to approximately 1-2h. Moreover, the setup of the model under the updating scheme allows for a clear specification of how neuronal variables fluctuate over separable timescales. This method now allows us to investigate the effect of fast (neuronal) activity on slow fluctuations in (synaptic) parameters, paving a way forward to understand how seizure activity is generated.
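    The general flavour of belief updating, where the posterior after one window of data becomes the prior for the next, can be sketched with a conjugate Gaussian toy example; this is far simpler than DCM's variational scheme, and all values are illustrative:

```python
import numpy as np

# Sequential Bayesian belief updating of a scalar "synaptic parameter":
# conjugate Gaussian prior, known observation variance, posterior of each
# data window carried forward as the prior for the next window.
rng = np.random.default_rng(3)
theta_true, obs_var = 2.0, 0.25

mu, prec = 0.0, 1.0               # prior belief: N(0, 1)
for window in range(10):
    data = theta_true + rng.normal(0.0, obs_var ** 0.5, 20)
    prec_new = prec + data.size / obs_var
    mu = (prec * mu + data.sum() / obs_var) / prec_new
    prec = prec_new               # belief forwarded to the next window

print(round(mu, 2))
```

The efficiency gain reported in the abstract comes from exactly this structure: each window is inverted against a compact summary (the running belief) instead of re-inverting the whole recording.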

  17. Interval model updating using perturbation method and Radial Basis Function neural networks

    Science.gov (United States)

    Deng, Zhongmin; Guo, Zhaopu; Zhang, Xinjie

    2017-02-01

    In recent years, stochastic model updating techniques have been applied to the quantification of uncertainties inherently existing in real-world engineering structures. In engineering practice, however, probability density functions of structural parameters are often unavailable due to insufficient information about the structural system. In this circumstance, interval analysis shows a significant advantage in handling uncertain problems, since only the upper and lower bounds of inputs and outputs are defined. To this end, a new method for the interval identification of structural parameters is proposed using the first-order perturbation method and Radial Basis Function (RBF) neural networks. With the perturbation method, each random variable is denoted as a perturbation around the mean value of its parameter interval, so that these terms can be used in a two-step deterministic updating sense. Interval model updating equations are then developed on the basis of the perturbation technique. The two-step method is used for updating the mean values of the structural parameters and subsequently estimating the interval radii. Experimental and numerical case studies are given to illustrate and verify the proposed method for the interval identification of structural parameters.
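    A minimal numerical sketch of the first-order perturbation step, propagating a parameter interval through a response function via the local sensitivity; the response formula and numbers are invented, and the RBF surrogate stage is omitted:

```python
import numpy as np

# Illustrative response: a natural frequency f = c*sqrt(k/m) evaluated at the
# interval midpoint of a stiffness parameter k, with the interval radius
# mapped through the local sensitivity df/dk (first-order perturbation).
def freq(k, m=1.0, c=1.0 / (2.0 * np.pi)):
    return c * np.sqrt(k / m)

k_mid, k_rad = 100.0, 5.0          # interval midpoint and radius for k
eps = 1e-6
sens = (freq(k_mid + eps) - freq(k_mid - eps)) / (2 * eps)   # df/dk

f_mid = freq(k_mid)                # response at the mean (step 1: mean update)
f_rad = abs(sens) * k_rad          # first-order response radius (step 2)
print(round(f_mid, 3), round(f_rad, 4))
```

This mirrors the two-step structure in the abstract: deterministic updating at the interval midpoints, then interval radii estimated from the perturbation terms.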

  18. Finite element model updating of a prestressed concrete box girder bridge using subproblem approximation

    Science.gov (United States)

    Chen, G. W.; Omenzetter, P.

    2016-04-01

    This paper presents the implementation of an updating procedure for the finite element model (FEM) of a prestressed concrete continuous box-girder highway off-ramp bridge. Ambient vibration testing was conducted to excite the bridge, assisted by linear chirp sweepings induced by two small electrodynamic shakers deployed to enhance the excitation levels, since the bridge was closed to traffic. The data-driven stochastic subspace identification method was executed to recover the modal properties from measurement data. An initial FEM was developed and correlation between the experimental modal results and their analytical counterparts was studied. Modelling of the pier and abutment bearings was carefully adjusted to reflect the real operational conditions of the bridge. The subproblem approximation method was subsequently utilized to automatically update the FEM. For this purpose, the influences of bearing stiffness, and mass density and Young's modulus of materials were examined as uncertain parameters using sensitivity analysis. The updating objective function was defined as a summation of squared relative errors of natural frequencies between the FEM and experimentation. All the identified modes were used as the target responses with the purpose of putting more constraints on the optimization process and decreasing the number of potentially feasible combinations of parameter changes. The updated FEM of the bridge was able to produce sufficient improvements in natural frequencies in most modes of interest, and can serve for a more precise dynamic response prediction or future investigation of the bridge health.
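    The two correlation measures mentioned, the summed squared relative frequency error used as the updating objective and the modal assurance criterion, can be sketched as follows; frequencies and mode shapes are made-up values, not the bridge's:

```python
import numpy as np

# Updating objective: sum of squared relative natural-frequency errors.
def objective(f_fem, f_exp):
    return float(np.sum(((f_fem - f_exp) / f_exp) ** 2))

# Modal Assurance Criterion between two (real) mode-shape vectors:
# 1 means identical shapes up to scaling, 0 means orthogonal.
def mac(phi_a, phi_b):
    num = np.abs(phi_a @ phi_b) ** 2
    return float(num / ((phi_a @ phi_a) * (phi_b @ phi_b)))

f_exp = np.array([2.1, 5.6, 9.8])           # identified frequencies [Hz]
f_fem = np.array([2.3, 5.4, 10.1])          # initial FE predictions [Hz]
phi1 = np.array([0.30, 0.70, 1.00])
phi2 = np.array([0.31, 0.69, 0.98])         # nearly identical shape

print(round(objective(f_fem, f_exp), 4), round(mac(phi1, phi2), 3))
```

In the paper's procedure an optimizer drives the uncertain parameters to minimise `objective`, while MAC is used to pair FE and experimental modes correctly before the frequencies are compared.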

  19. An Updating Method for Structural Dynamics Models with Uncertainties

    Directory of Open Access Journals (Sweden)

    B. Faverjon

    2008-01-01

    One challenge in the numerical simulation of industrial structures is model validation based on experimental data. Among the indirect or parametric methods available, one is based on the "mechanical" concept of the constitutive relation error estimator, introduced in order to quantify the quality of finite element analyses. In the case of uncertain measurements obtained from a family of quasi-identical structures, parameters need to be modeled randomly. In this paper, we consider the case of a damped structure modeled with stochastic variables. Polynomial chaos expansion and reduced bases are used to solve the stochastic problems involved in the calculation of the error.

  20. iTree-Hydro: Snow hydrology update for the urban forest hydrology model

    Science.gov (United States)

    Yang Yang; Theodore A. Endreny; David J. Nowak

    2011-01-01

    This article presents snow hydrology updates made to iTree-Hydro, previously called the Urban Forest Effects—Hydrology model. iTree-Hydro Version 1 was a warm climate model developed by the USDA Forest Service to provide a process-based planning tool with robust water quantity and quality predictions given data limitations common to most urban areas. Cold climate...

  1. PACIAE 2.0: An Updated Parton and Hadron Cascade Model (Program) for Relativistic Nuclear Collisions

    Institute of Scientific and Technical Information of China (English)

    SA; Ben-hao; ZHOU; Dai-mei; YAN; Yu-liang; LI; Xiao-mei; FENG; Sheng-qing; DONG; Bao-guo; CAI; Xu

    2012-01-01

    We have updated the parton and hadron cascade model PACIAE for relativistic nuclear collisions, previously based on JETSET 6.4 and PYTHIA 5.7; the new version is referred to as PACIAE 2.0. The main physics concerning the stages of parton initiation, parton rescattering, hadronization, and hadron rescattering is discussed. The structure of the programs is briefly explained. In addition, some calculated examples are compared with experimental data. It turns out that this model (program) works well.

  2. Dental caries: an updated medical model of risk assessment.

    Science.gov (United States)

    Kutsch, V Kim

    2014-04-01

    Dental caries is a transmissible, complex biofilm disease that creates prolonged periods of low pH in the mouth, resulting in a net mineral loss from the teeth. Historically, the disease model for dental caries consisted of mutans streptococci and Lactobacillus species, and the dental profession focused on restoring the lesions/damage from the disease by using a surgical model. The current recommendation is to implement a risk-assessment-based medical model called CAMBRA (caries management by risk assessment) to diagnose and treat dental caries. Unfortunately, many of the suggestions of CAMBRA have been overly complicated and confusing for clinicians. The risk of caries, however, is usually related to just a few common factors, and these factors result in common patterns of disease. This article examines the biofilm model of dental caries, identifies the common disease patterns, and discusses their targeted therapeutic strategies to make CAMBRA more easily adaptable for the privately practicing professional.

  3. Model county ordinance for wind projects

    Energy Technology Data Exchange (ETDEWEB)

    Bain, D.A. [Oregon Office of Energy, Portland, OR (United States)

    1997-12-31

    Permitting is a crucial step in the development cycle of a wind project and permits affect the timing, cost, location, feasibility, layout, and impacts of wind projects. Counties often have the lead responsibility for permitting yet few have appropriate siting regulations for wind projects. A model ordinance allows a county to quickly adopt appropriate permitting procedures. The model county wind ordinance developed for use by northwest states is generally applicable across the country and counties seeking to adopt siting or zoning regulations for wind will find it a good starting place. The model includes permitting procedures for wind measurement devices and two types of wind systems. Both discretionary and nondiscretionary standards apply to wind systems and a conditional use permit would be issued. The standards, criteria, conditions for approval, and process procedures are defined for each. Adaptation examples for the four northwest states are provided along with a model Wind Resource Overlay Zone.

  4. Updating sea spray aerosol emissions in the Community Multiscale Air Quality (CMAQ) model version 5.0.2

    Data.gov (United States)

    U.S. Environmental Protection Agency — The uploaded data consists of the BRACE Na aerosol observations paired with CMAQ model output, the updated model's parameterization of sea salt aerosol emission size...

  5. Disentangling density-dependent dynamics using full annual cycle models and Bayesian model weight updating

    Science.gov (United States)

    Robinson, Orin J.; McGowan, Conor; Devers, Patrick K.

    2017-01-01

    Density dependence regulates populations of many species across all taxonomic groups. Understanding density dependence is vital for predicting the effects of climate, habitat loss and/or management actions on wild populations. Migratory species likely experience seasonal changes in the relative influence of density dependence on population processes such as survival and recruitment throughout the annual cycle. These effects must be accounted for when characterizing migratory populations via population models. To evaluate effects of density on seasonal survival and recruitment of a migratory species, we used an existing full annual cycle model framework for American black ducks Anas rubripes, and tested different density effects (including no effects) on survival and recruitment. We then used a Bayesian model weight updating routine to determine which population model best fit observed breeding population survey data between 1990 and 2014. The models that best fit the survey data suggested that survival and recruitment were affected by density dependence and that density effects were stronger on adult survival during the breeding season than during the non-breeding season. Analysis also suggests that regulation of survival and recruitment by density varied over time. Our results showed that different characterizations of density regulation changed every 8-12 years (three times in the 25-year period) for our population. Synthesis and applications. Using a full annual cycle modelling framework and model weighting routine will be helpful in evaluating density dependence for migratory species in both the short and long term. We used this method to disentangle the seasonal effects of density on the continental American black duck population, which will allow managers to better evaluate the effects of habitat loss and potential habitat management actions throughout the annual cycle. The method here may allow researchers to home in on the proper form and/or strength of
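    The model weight updating routine can be sketched generically: each year, every candidate model's weight is multiplied by the likelihood it assigns to the new survey count and the weights are renormalised. Both population models and every number below are hypothetical:

```python
import numpy as np

# Two hypothetical candidate models for next year's count.
def predict(n, model):
    if model == "density_dep":
        return n * (1.0 + 0.3 * (1.0 - n / 500.0))   # logistic-type growth
    return n * 1.05                                   # density independent

rng = np.random.default_rng(4)
weights = {"density_dep": 0.5, "density_indep": 0.5}  # equal prior weights
n_obs = 400.0
for year in range(15):
    # "truth" generated by the density-dependent model plus survey noise
    n_next = predict(n_obs, "density_dep") + rng.normal(0.0, 10.0)
    for m in weights:
        # Gaussian likelihood of the observed count under each model
        like = np.exp(-0.5 * ((n_next - predict(n_obs, m)) / 10.0) ** 2)
        weights[m] *= like
    total = sum(weights.values())
    weights = {m: w / total for m, w in weights.items()}
    n_obs = n_next

print({m: round(w, 3) for m, w in weights.items()})
```

Tracking how the weights shift between candidate parameterisations over the survey years is what lets the approach detect the changes in density regulation the abstract reports.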

  6. Comparing the impact of time displaced and biased precipitation estimates for online updated urban runoff models.

    Science.gov (United States)

    Borup, Morten; Grum, Morten; Mikkelsen, Peter Steen

    2013-01-01

    When an online runoff model is updated from system measurements, the requirements of the precipitation input change. Using rain gauge data as precipitation input, there will be a displacement between the time when the rain hits the gauge and the time when it hits the actual catchment, due to the time it takes for the rain cell to travel from the gauge to the catchment. Since this time displacement is not present in the system measurements, the data assimilation scheme may already have updated the model to include the impact of a particular rain cell by the time the rain data is forced upon the model, which will then end up including the same rain twice in the model run. This paper compares the forecast accuracy of updated models using time-displaced rain input with that of rain input with constant biases. This is done using a simple time-area model and historic rain series that are either displaced in time or affected by a bias. The results show that for a 10 minute forecast, time displacements of 5 and 10 minutes compare to biases of 60 and 100%, respectively, independent of the catchment's time of concentration.
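    The comparison can be mimicked with a toy time-area model: drive it with a true rain series, then with the same series displaced in time or scaled by a bias, and compare the simulated runoff. The unit hydrograph and rain statistics below are invented, not the paper's setup:

```python
import numpy as np

# Simple time-area model: runoff is rain convolved with a unit hydrograph.
def runoff(rain, uh):
    return np.convolve(rain, uh)[: rain.size]

uh = np.array([0.1, 0.3, 0.4, 0.2])        # time-area unit hydrograph (sums to 1)
rng = np.random.default_rng(5)
rain = rng.gamma(0.3, 2.0, 200)            # intermittent synthetic rain series

q_true = runoff(rain, uh)
q_shift = runoff(np.roll(rain, 1), uh)     # 1-step time displacement
q_bias = runoff(rain * 1.6, uh)            # 60% positive bias

rmse = lambda q: float(np.sqrt(np.mean((q - q_true) ** 2)))
print(round(rmse(q_shift), 3), round(rmse(q_bias), 3))
```

Comparing the two RMSE values for a given forecast horizon is, in miniature, the equivalence the paper establishes between displacement minutes and bias percentages.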

  7. Clustering of Parameter Sensitivities: Examples from a Helicopter Airframe Model Updating Exercise

    Directory of Open Access Journals (Sweden)

    H. Shahverdi

    2009-01-01

    The need for high-fidelity models in the aerospace industry has become ever more important as increasingly stringent requirements on noise and vibration levels, reliability, maintenance costs, etc. come into effect. In this paper, the results of a finite element model updating exercise on a Westland Lynx XZ649 helicopter are presented. For large and complex structures, such as a helicopter airframe, the finite element model represents the main tool for obtaining accurate models which can predict the sensitivities of responses to structural changes and optimisation of the vibration levels. In this study, the eigenvalue sensitivities with respect to Young's modulus and mass density are used in a detailed parameterisation of the structure. A new methodology is developed using an unsupervised learning technique based on similarity clustering of the columns of the sensitivity matrix. An assessment of model updating strategies is given and comparative results for the correction of vibration modes are discussed in detail. The role of the clustering technique in updating large-scale models is emphasised.
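    A minimal sketch of the clustering idea, grouping columns of a sensitivity matrix by similarity, here with a hand-rolled k-means (k = 2) on invented sensitivity patterns rather than the paper's actual algorithm:

```python
import numpy as np

rng = np.random.default_rng(6)
base_a = np.array([1.0, 0.2, 0.1])          # two underlying sensitivity patterns
base_b = np.array([0.1, 0.9, 0.8])
cols = [base_a + rng.normal(0, 0.05, 3) for _ in range(5)] + \
       [base_b + rng.normal(0, 0.05, 3) for _ in range(5)]
S = np.column_stack(cols)                    # (modes x parameters) sensitivities
X = (S / np.linalg.norm(S, axis=0)).T        # unit-norm columns as samples

# minimal k-means with one seed drawn from each pattern
centers = X[[0, 5]].copy()
for _ in range(10):
    labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
    centers = np.array([X[labels == k].mean(0) for k in range(2)])

print(labels)
```

Parameters whose sensitivity columns fall in the same cluster affect the modes in the same way, so one representative per cluster suffices in the updating parameterisation, which is what makes the approach attractive for large models.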

  8. Finite element model updating for large span spatial steel structure considering uncertainties

    Institute of Scientific and Technical Information of China (English)

    TENG Jun; ZHU Yan-huang; ZHOU Feng; LI Hui; OU Jin-ping

    2010-01-01

    In order to establish the baseline finite element model for structural health monitoring, a new method of model updating was proposed after analyzing the uncertainties of the measured data and the error of the finite element model. In the new method, the finite element model was replaced by a multi-output support vector regression machine (MSVR). The interval variables of the measured frequency were sampled by the Latin hypercube sampling method. The samples of frequency were regarded as the inputs of the trained MSVR. The outputs of the MSVR were the target values of the design parameters. The steel structure of the National Aquatic Center for the Beijing Olympic Games was introduced as a case for finite element model updating. The results show that the proposed method avoids complicated calculation. Both the estimated values and the associated uncertainties of the structure parameters can be obtained by the method. The static and dynamic characteristics of the updated finite element model are in good agreement with the measured data.

  9. Modelling of Transport Projects Uncertainties

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2009-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests combining the principle of Optimism Bias, which depicts the historical tendency of overestimating transport-related benefits and underestimating...... investment costs, with a quantitative risk analysis based on Monte Carlo simulation, and making use of a set of exploratory scenarios. The analysis is carried out by using the CBA-DK model representing the Danish standard approach to socio-economic cost-benefit analysis. Specifically, the paper proposes...... to supplement Optimism Bias and the associated Reference Class Forecasting (RCF) technique with a new technique that makes use of a scenario-grid. We tentatively introduce and refer to this as Reference Scenario Forecasting (RSF). The final RSF output from the CBA-DK model consists of a set of scenario...
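
The Monte Carlo core of such a risk analysis can be sketched as follows (the triangular distributions and the 20% cost uplift are illustrative only, not CBA-DK or Reference Class Forecasting values):

```python
import random

def simulate_bcr(n=10000, seed=1):
    """Monte Carlo draws of a benefit-cost ratio, with an optimism-bias
    uplift applied to investment costs. All numbers are illustrative."""
    rng = random.Random(seed)
    ratios = []
    for _ in range(n):
        benefits = rng.triangular(80.0, 120.0, 100.0)     # NPV of benefits
        costs = rng.triangular(70.0, 130.0, 90.0) * 1.20  # uplifted costs
        ratios.append(benefits / costs)
    return ratios

ratios = simulate_bcr()
# Probability that benefits exceed (uplifted) costs:
p_viable = sum(r > 1.0 for r in ratios) / len(ratios)
```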

  10. Status Update: Modeling Energy Balance in NIF Hohlraums

    Energy Technology Data Exchange (ETDEWEB)

    Jones, O. S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-07-22

    We have developed a standardized methodology to model hohlraum drive in NIF experiments. We compare simulation results to experiments by 1) comparing hohlraum x-ray fluxes and 2) comparing capsule metrics, such as bang times. Long-pulse, high gas-fill hohlraums require a 20-28% reduction in simulated drive and inclusion of ~15% backscatter to match experiment through (1) and (2). Short-pulse, low-fill or near-vacuum hohlraums require a 10% reduction in simulated drive to match experiment through (2), and no reduction through (1). Ongoing work focuses on physical model modifications to improve these matches.

  12. Using radar altimetry to update a large-scale hydrological model of the Brahmaputra river basin

    DEFF Research Database (Denmark)

    Finsen, F.; Milzow, Christian; Smith, R.

    2014-01-01

    Measurements of river and lake water levels from space-borne radar altimeters (past missions include ERS, Envisat, Jason, Topex) are useful for calibration and validation of large-scale hydrological models in poorly gauged river basins. Altimetry data availability over the downstream reaches...... of the Brahmaputra is excellent (17 high-quality virtual stations from ERS-2, 6 from Topex and 10 from Envisat are available for the Brahmaputra). In this study, altimetry data are used to update a large-scale Budyko-type hydrological model of the Brahmaputra river basin in real time. Altimetry measurements...... are converted to discharge using rating curves of simulated discharge versus observed altimetry. This approach makes it possible to use altimetry data from river cross sections where neither in-situ rating curves nor accurate river cross-section geometry are available. Model updating based on radar altimetry...
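
The rating-curve conversion can be sketched as a log-space least-squares fit (hypothetical level/discharge pairs; the power-law form Q = a(h - h0)^b is a standard rating-curve choice, not necessarily the exact one used in the study):

```python
import math

def fit_rating_curve(levels, discharges, h0=0.0):
    """Fit Q = a * (h - h0)**b by ordinary least squares in log space,
    standing in for a rating curve of simulated discharge versus
    observed altimetry levels."""
    xs = [math.log(h - h0) for h in levels]
    ys = [math.log(q) for q in discharges]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b

levels = [2.0, 3.0, 4.0, 5.0]              # altimetry water levels (m)
discharges = [400.0, 1100.0, 2200.0, 3700.0]  # simulated discharge (m3/s)
a, b = fit_rating_curve(levels, discharges)
q_new = a * 4.5 ** b   # discharge inferred from a new altimetry level of 4.5 m
```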

  13. An updated natural history model of cervical cancer: derivation of model parameters.

    Science.gov (United States)

    Campos, Nicole G; Burger, Emily A; Sy, Stephen; Sharma, Monisha; Schiffman, Mark; Rodriguez, Ana Cecilia; Hildesheim, Allan; Herrero, Rolando; Kim, Jane J

    2014-09-01

    Mathematical models of cervical cancer have been widely used to evaluate the comparative effectiveness and cost-effectiveness of preventive strategies. Major advances in the understanding of cervical carcinogenesis motivate the creation of a new disease paradigm in such models. To keep pace with the most recent evidence, we updated a previously developed microsimulation model of human papillomavirus (HPV) infection and cervical cancer to reflect 1) a shift towards health states based on HPV rather than poorly reproducible histological diagnoses and 2) HPV clearance and progression to precancer as a function of infection duration and genotype, as derived from the control arm of the Costa Rica Vaccine Trial (2004-2010). The model was calibrated leveraging empirical data from the New Mexico Surveillance, Epidemiology, and End Results Registry (1980-1999) and a state-of-the-art cervical cancer screening registry in New Mexico (2007-2009). The calibrated model had good correspondence with data on genotype- and age-specific HPV prevalence, genotype frequency in precancer and cancer, and age-specific cancer incidence. We present this model in response to a call for new natural history models of cervical cancer intended for decision analysis and economic evaluation at a time when global cervical cancer prevention policy continues to evolve and evidence of the long-term health effects of cervical interventions remains critical. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health 2014. This work is written by (a) US Government employee(s) and is in the public domain in the US.
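
The duration-dependent clearance mechanism can be illustrated with a toy microsimulation step (the annual clearance probabilities below are invented for illustration, not Costa Rica Vaccine Trial estimates):

```python
import random

def simulate_infection(clearance_by_year, max_years=10, rng=None):
    """Toy microsimulation of one HPV infection: the annual clearance
    probability depends on how long the infection has already lasted,
    mirroring the duration-dependent structure of the updated model."""
    rng = rng or random.Random()
    for year in range(max_years):
        p = clearance_by_year[min(year, len(clearance_by_year) - 1)]
        if rng.random() < p:
            return year + 1        # cleared after this many years
    return None                    # persisted: at risk of progressing

rng = random.Random(42)
# Clearance is most likely early in the infection, then tails off.
durations = [simulate_infection([0.5, 0.3, 0.1], rng=rng) for _ in range(1000)]
persistent = sum(d is None for d in durations)
```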

  14. Uncertainty Quantification in Climate Modeling and Projection

    Energy Technology Data Exchange (ETDEWEB)

    Qian, Yun; Jackson, Charles; Giorgi, Filippo; Booth, Ben; Duan, Qingyun; Forest, Chris; Higdon, Dave; Hou, Z. Jason; Huerta, Gabriel

    2016-05-01

    The projection of future climate is one of the most complex problems undertaken by the scientific community. Although scientists have been striving to better understand the physical basis of the climate system and to improve climate models, the overall uncertainty in projections of future climate has not been significantly reduced (e.g., from the IPCC AR4 to AR5). With the rapid increase of complexity in Earth system models, reducing uncertainties in climate projections becomes extremely challenging. Since uncertainties always exist in climate models, interpreting the strengths and limitations of future climate projections is key to evaluating risks, and climate change information for use in Vulnerability, Impact, and Adaptation (VIA) studies should be provided with both well-characterized and well-quantified uncertainty. The workshop aimed at providing participants, many of them from developing countries, information on strategies to quantify the uncertainty in climate model projections and assess the reliability of climate change information for decision-making. The program included a mixture of lectures on fundamental concepts in Bayesian inference and sampling, applications, and hands-on computer laboratory exercises employing software packages for Bayesian inference, Markov Chain Monte Carlo methods, and global sensitivity analyses. The lectures covered a range of scientific issues underlying the evaluation of uncertainties in climate projections, such as the effects of uncertain initial and boundary conditions, uncertain physics, and limitations of observational records. Progress in quantitatively estimating uncertainties in hydrologic, land surface, and atmospheric models at both regional and global scales was also reviewed. The application of Uncertainty Quantification (UQ) concepts to coupled climate system models is still in its infancy. The Coupled Model Intercomparison Project (CMIP) multi-model ensemble currently represents the primary data for
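
A minimal example of the kind of MCMC routine covered in the hands-on exercises is a random-walk Metropolis sampler (the standard-normal target and the tuning values are illustrative, not tied to any workshop package):

```python
import math
import random

def metropolis(logpost, x0, steps=20000, scale=1.0, seed=0):
    """Random-walk Metropolis: propose a Gaussian step, accept with
    probability min(1, posterior ratio), otherwise stay put."""
    rng = random.Random(seed)
    x, lp = x0, logpost(x0)
    chain = []
    for _ in range(steps):
        cand = x + rng.gauss(0.0, scale)
        lp_cand = logpost(cand)
        if lp_cand >= lp or rng.random() < math.exp(lp_cand - lp):
            x, lp = cand, lp_cand
        chain.append(x)
    return chain

# Target: standard normal, log-density -x^2/2 (up to a constant).
chain = metropolis(lambda x: -0.5 * x * x, 0.0)
mean = sum(chain) / len(chain)
```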

  15. An updated summary of MATHEW/ADPIC model evaluation studies

    Energy Technology Data Exchange (ETDEWEB)

    Foster, K.T.; Dickerson, M.H.

    1990-05-01

    This paper summarizes the major model evaluation studies conducted for the MATHEW/ADPIC atmospheric transport and diffusion models used by the US Department of Energy's Atmospheric Release Advisory Capability. These studies have taken place over the last 15 years and involve field tracer releases influenced by a variety of meteorological and topographical conditions. Neutrally buoyant tracers released as both surface and elevated point sources, as well as material dispersed by explosive, thermally buoyant release mechanisms, have been studied. Results from these studies show that the MATHEW/ADPIC models estimate the tracer air concentrations to within a factor of two of the measured values 20% to 50% of the time, and within a factor of five of the measurements 35% to 85% of the time, depending on the complexity of the meteorology and terrain, and the release height of the tracer. Comparisons of model estimates to peak downwind deposition and air concentration measurements from explosive releases are shown to be generally within a factor of two to three. 24 refs., 14 figs., 3 tabs.
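
The factor-of-N agreement metric used in these evaluations can be computed as follows (toy data; treating the bounds as inclusive is one reasonable convention, not necessarily the report's):

```python
def fraction_within_factor(pred, obs, factor):
    """Fraction of prediction/observation pairs whose ratio lies within
    a multiplicative factor, the skill score quoted for MATHEW/ADPIC."""
    pairs = [(p, o) for p, o in zip(pred, obs) if p > 0 and o > 0]
    hits = sum(1 for p, o in pairs if 1.0 / factor <= p / o <= factor)
    return hits / len(pairs)

obs  = [1.0, 2.0, 5.0, 10.0, 0.5, 4.0]   # measured concentrations
pred = [1.8, 1.0, 30.0, 12.0, 0.2, 3.5]  # modeled concentrations
f2 = fraction_within_factor(pred, obs, 2.0)   # within a factor of two
f5 = fraction_within_factor(pred, obs, 5.0)   # within a factor of five
```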

  16. General equilibrium basic needs policy model, (updating part).

    OpenAIRE

    Kouwenaar A

    1985-01-01

    ILO pub-WEP pub-PREALC pub. Working paper, econometric model for the assessment of structural change affecting development planning for basic needs satisfaction in Ecuador - considers population growth, family size (households), labour force participation, labour supply, wages, income distribution, profit rates, capital ownership, etc.; examines nutrition, education and health as factors influencing productivity. Diagram, graph, references, statistical tables.

  17. Adaptive Nonlinear Model Predictive Control Using an On-line Support Vector Regression Updating Strategy

    Institute of Scientific and Technical Information of China (English)

    Ping Wang; Chaohe Yang; Xuemin Tian; Dexian Huang

    2014-01-01

    The performance of data-driven models relies heavily on the amount and quality of training samples, so it might deteriorate significantly in the regions where samples are scarce. The objective of this paper is to develop an on-line SVR model updating strategy to track the change in the process characteristics efficiently with affordable computational burden. This is achieved by adding a new sample that violates the Karush-Kuhn-Tucker conditions of the existing SVR model and by deleting the old sample that has the maximum distance with respect to the newly added sample in feature space. The benefits offered by such an updating strategy are exploited to develop an adaptive model-based control scheme, where model updating and the control task are performed alternately. The effectiveness of the adaptive controller is demonstrated by a simulation study on a continuous stirred tank reactor. The results reveal that the adaptive MPC scheme outperforms its non-adaptive counterpart for large-magnitude set point changes and variations in process parameters.
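
The add/delete bookkeeping of the updating strategy can be sketched as follows (simplified: a residual test on the epsilon tube stands in for the full KKT check, and Euclidean distance stands in for feature-space distance under the kernel):

```python
def kkt_violated(predict, x, y, eps=0.1):
    """A new sample is flagged when its residual falls outside the epsilon
    tube -- a simplified proxy for violating the eps-SVR KKT conditions."""
    return abs(predict(x) - y) > eps

def update_window(window, x_new, y_new):
    """Add the violating sample and delete the old sample farthest from it
    (Euclidean distance as a stand-in for feature-space distance), keeping
    the training set size constant."""
    window.append((x_new, y_new))
    farthest = max(window[:-1],
                   key=lambda s: sum((a - b) ** 2 for a, b in zip(s[0], x_new)))
    window.remove(farthest)
    return window

predict = lambda x: 0.0                       # stand-in for the current SVR
window = [((0.0, 0.0), 0.0), ((5.0, 5.0), 1.0), ((1.0, 1.0), 0.1)]
if kkt_violated(predict, (0.9, 0.9), 0.5):
    update_window(window, (0.9, 0.9), 0.5)    # (5.0, 5.0) is dropped
```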

  18. Employing incomplete complex modes for model updating and damage detection of damped structures

    Institute of Scientific and Technical Information of China (English)

    LI HuaJun; LIU FuShun; HU Sau-Lon James

    2008-01-01

    In the study of finite element model updating or damage detection, most papers are devoted to undamped systems. Thus, their objective has been exclusively restricted to the correction of the mass and stiffness matrices. In contrast, this paper performs model updating and damage detection for damped structures. A theoretical contribution of this paper is to extend the cross-model cross-mode (CMCM) method to simultaneously update the mass, damping and stiffness matrices of a finite element model when only a few spatially incomplete, complex-valued modes are available. Numerical studies are conducted for a 30-DOF (degree-of-freedom) cantilever beam with multiple damaged elements, as the measured modes are synthesized from finite element models. The numerical results reveal that applying the CMCM method, together with an iterative Guyan reduction scheme, can yield good damage detection in general. When the measured modes utilized in the CMCM method are corrupted with irregular errors, assessing damage at the location that possesses larger modal strain energy is less sensitive to the corrupted modes.

  20. An Update on Experimental Climate Prediction and Analysis Products Being Developed at NASA's Global Modeling and Assimilation Office

    Science.gov (United States)

    Schubert, Siegfried

    2011-01-01

    The Global Modeling and Assimilation Office at NASA's Goddard Space Flight Center is developing a number of experimental prediction and analysis products suitable for research and applications. The prediction products include a large suite of subseasonal and seasonal hindcasts and forecasts (as a contribution to the US National MME), a suite of decadal (10-year) hindcasts (as a contribution to the IPCC decadal prediction project), and a series of large ensemble and high resolution simulations of selected extreme events, including the 2010 Russian and 2011 US heat waves. The analysis products include an experimental atlas of climate (in particular drought) and weather extremes. This talk will provide an update on those activities, and discuss recent efforts by WCRP to leverage off these and similar efforts at other institutions throughout the world to develop an experimental global drought early warning system.

  1. POMP - Pervasive Object Model Project

    DEFF Research Database (Denmark)

    Schougaard, Kari Rye; Schultz, Ulrik Pagh

    applications, we consider it essential that a standard object-oriented style of programming can be used for those parts of the application that do not concern its mobility. This position paper describes an ongoing effort to implement a language and a virtual machine for applications that execute in a pervasive...... mobility. Mobile agent platforms are often based on such virtual machines, but typically do not provide strong mobility (the ability to migrate at any program point), and have limited support for multi-threaded applications, although there are exceptions. For a virtual machine to support mobile...... computing environment. This system, named POM (Pervasive Object Model), supports applications split into coarse-grained, strongly mobile units that communicate using method invocations through proxies. We are currently investigating efficient execution of mobile applications, scalability to suit...

  2. Conducting and Evaluating Stakeholder Workshops to Facilitate Updates to a Storm Surge Forecasting Model for Coastal Louisiana

    Science.gov (United States)

    DeLorme, D.; Lea, K.; Hagen, S. C.

    2016-12-01

    As coastal Louisiana evolves morphologically, ecologically, and from engineering advancements, there is a crucial need to continually adjust real-time forecasting and coastal restoration planning models. This presentation discusses planning, conducting, and evaluating stakeholder workshops to support such an endeavor. The workshops are part of an ongoing Louisiana Sea Grant-sponsored project. The project involves updating an ADCIRC (Advanced Circulation) mesh representation of topography including levees and other flood control structures by applying previously-collected elevation data and new data acquired during the project. The workshops are designed to educate, solicit input, and ensure incorporation of topographic features into the framework is accomplished in the best interest of stakeholders. During this project's first year, three one-day workshops directed to levee managers and other local officials were convened at agricultural extension facilities in Hammond, Houma, and Lake Charles, Louisiana. The objectives were to provide a forum for participants to learn about the ADCIRC framework, understand the importance of accurate elevations for a robust surge model, discuss and identify additional data sources, and become familiar with the CERA (Coastal Emergency Risks Assessment) visualization tool. The workshop structure consisted of several scientific presentations with questions/answer time (ADCIRC simulation inputs and outputs; ADCIRC framework elevation component; description and examples of topographic features such as levees, roadways, railroads, etc. currently utilized in the mesh; ADCIRC model validation demonstration through historic event simulations; CERA demonstration), a breakout activity for participant groups to identify and discuss raised features not currently in the mesh and document them on provided worksheets, and a closing session for debriefing and discussion of future model improvements. 
Evaluation involved developing and analyzing a

  3. World Energy Projection System model documentation

    Energy Technology Data Exchange (ETDEWEB)

    Hutzler, M.J.; Anderson, A.T.

    1997-09-01

    The World Energy Projection System (WEPS) was developed by the Office of Integrated Analysis and Forecasting within the Energy Information Administration (EIA), the independent statistical and analytical agency of the US Department of Energy. WEPS is an integrated set of personal-computer-based spreadsheets containing data compilations, assumption specifications, descriptive analysis procedures, and projection models. The WEPS accounting framework incorporates projections from independently documented models and assumptions about the future energy intensity of economic activity (ratios of total energy consumption to gross domestic product, GDP), and about the rate of incremental energy requirements met by natural gas, coal, and renewable energy sources (hydroelectricity, geothermal, solar, wind, biomass, and other renewable resources). Projections produced by WEPS are published in the annual report, International Energy Outlook. This report documents the structure and procedures incorporated in the 1998 version of the WEPS model. It has been written to provide an overview of the structure of the system and technical details about the operation of each component of the model for persons who wish to know how WEPS projections are produced by EIA.
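
The accounting identity at the heart of such a framework (consumption = economic activity x energy intensity) can be sketched as follows; the growth and intensity-decline rates are illustrative, not EIA assumptions:

```python
def project_energy(gdp0, growth, intensity0, intensity_decline, years):
    """Accounting-style projection: each year GDP grows and energy
    intensity (consumption per unit GDP) declines at fixed rates, and
    projected consumption is their product."""
    path = []
    gdp, intensity = gdp0, intensity0
    for _ in range(years):
        gdp *= 1 + growth
        intensity *= 1 - intensity_decline
        path.append(gdp * intensity)
    return path

# Illustrative inputs: GDP index 100, 3%/yr growth, 1%/yr intensity decline.
path = project_energy(gdp0=100.0, growth=0.03,
                      intensity0=1.0, intensity_decline=0.01, years=10)
```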

  4. Revisiting the Carrington Event: Updated modeling of atmospheric effects

    CERN Document Server

    Thomas, Brian C; Snyder, Brock R

    2011-01-01

    The terrestrial effects of major solar events such as the Carrington white-light flare and subsequent geomagnetic storm of August-September 1859 are of considerable interest, especially in light of recent predictions that such extreme events will be more likely over the coming decades. Here we present results of modeling the atmospheric effects, especially production of odd nitrogen compounds and subsequent depletion of ozone, by solar protons associated with the Carrington event. This study combines approaches from two previous studies of the atmospheric effect of this event. We investigate changes in NOy compounds as well as depletion of O3 using a two-dimensional atmospheric chemistry and dynamics model. Atmospheric ionization is computed using a range-energy relation with four different proxy proton spectra associated with more recent well-known solar proton events. We find that changes in atmospheric constituents are in reasonable agreement with previous studies, but effects of the four proxy spectra use...

  5. Off-Highway Gasoline Consumption Estimation Models Used in the Federal Highway Administration Attribution Process: 2008 Updates

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Ho-Ling [ORNL; Davis, Stacy Cagle [ORNL

    2009-12-01

    This report is designed to document the analysis process and estimation models currently used by the Federal Highway Administration (FHWA) to estimate the off-highway gasoline consumption and public sector fuel consumption. An overview of the entire FHWA attribution process is provided along with specifics related to the latest update (2008) on the Off-Highway Gasoline Use Model and the Public Use of Gasoline Model. The Off-Highway Gasoline Use Model is made up of five individual modules, one for each of the off-highway categories: agricultural, industrial and commercial, construction, aviation, and marine. This 2008 update of the off-highway models was the second major update (the first model update was conducted during 2002-2003) after they were originally developed in the mid-1990s. The agricultural model methodology, specifically, underwent a significant revision because of changes in data availability since 2003. Some revision to the model was necessary due to removal of certain data elements used in the original estimation method. The revised agricultural model also made use of some newly available information, published by the data source agency in recent years. The other model methodologies were not drastically changed, though many data elements were updated to improve the accuracy of these models. Note that components in the Public Use of Gasoline Model were not updated in 2008. A major challenge in updating estimation methods applied by the public-use model is that they would have to rely on significant new data collection efforts. In addition, due to resource limitation, several components of the models (both off-highway and public-use models) that utilized regression modeling approaches were not recalibrated under the 2008 study. An investigation of the Environmental Protection Agency's NONROAD2005 model was also carried out under the 2008 model update. Results generated from the NONROAD2005 model were analyzed, examined, and compared, to the extent that

  6. A Quest for Missing Proteins : update 2015 on Chromosome-Centric Human Proteome Project

    NARCIS (Netherlands)

    Horvatovich, Péter; Lundberg, Emma K; Chen, Yu-Ju; Sung, Ting-Yi; He, Fuchu; Nice, Edouard C; Goode, Robert J A; Yu, Simon; Ranganathan, Shoba; Baker, Mark S; Domont, Gilberto B; Velasquez, Erika; Li, Dong; Liu, Siqi; Wang, Quanhui; He, Qing-Yu; Menon, Rajasree; Guan, Yuanfang; Corrales, Fernando Jose; Segura, Victor; Casal, José Ignacio; Pascual-Montano, Alberto; Albar, Juan Pablo; Fuentes, Manuel; Gonzalez-Gonzalez, Maria; Diez, Paula; Ibarrola, Nieves; Degano, Rosa M; Mohammed, Yassene; Borchers, Christoph H; Urbani, Andrea; Soggiu, Alessio; Yamamoto, Tadashi; Archakov, Alexander I; Ponomarenko, Elena; Lisitsa, Andrey V; Lichti, Cheryl F; Mostovenko, Ekaterina; Kroes, Roger A; Rezeli, Melinda; Vegvari, Akos; Fehniger, Thomas E; Bischoff, Rainer; Vizcaíno, Juan Antonio; Deutsch, Eric W; Lane, Lydie; Nilsson, Carol L; Marko-Varga, György; Omenn, Gilbert S; Jeong, Seul-Ki; Cho, Jin-Young; Paik, Young-Ki; Hancock, William S

    2015-01-01

    This paper summarizes the recent activities of the Chromosome-Centric Human Proteome Project (C-HPP) consortium, which develops new technologies to identify yet-to-be annotated proteins (termed "missing proteins") in biological samples that lack sufficient experimental evidence at the protein level

  7. Connected Mathematics Project (CMP). What Works Clearinghouse Intervention Report. Updated February 2017

    Science.gov (United States)

    What Works Clearinghouse, 2017

    2017-01-01

    "Connected Mathematics Project" (CMP) is a math curriculum for students in grades 6-8. It uses interactive problems and everyday situations to explore mathematical ideas, with a goal of fostering a problem-centered, inquiry-based learning environment. At each grade level, the curriculum covers numbers, algebra, geometry/measurement,…

  8. The CGEM-IT of the BESIII experiment: project update and test results in magnetic field

    Science.gov (United States)

    Mezzadri, G.

    2016-08-01

    The BESIII experiment is a multi-purpose detector operating at the electron-positron collider BEPCII in Beijing. Since 2008, the world's largest samples of J/ψ and ψ’ events have been collected. Due to increasing luminosity, the inner drift chamber is showing signs of aging. In 2014, an upgrade was proposed by the Italian collaboration based on the Cylindrical Gas Electron Multiplier (CGEM) technology, developed within the KLOE-II experiment, but with several new features and innovations. In this contribution, an overview of the project will be presented. Preliminary results of a beam test will be shown, with particular focus on the detector performance in a magnetic field, with different configurations of the electric field. A new readout mode, the µTPC readout, will also be described. The project has been recognized as a Significant Research Project within the Executive Programme for Scientific and Technological Cooperation between Italy and P.R.C. for the years 2013-2015, and more recently has been selected as one of the projects funded by the European Commission within the call H2020-MSCA-RISE-2014.

  10. National Bioenergy Center - Biochemical Platform Integration Project: Quarterly Update, Winter 2010

    Energy Technology Data Exchange (ETDEWEB)

    Schell, D.

    2011-02-01

    Winter 2011 edition of the National Bioenergy Center's Biochemical Platform Integration Project quarterly newsletter. Issue topics: 33rd Symposium on Biotechnology for Fuels and Chemicals program topic areas; results from reactive membrane extraction of inhibitors from dilute-acid pretreated corn stover; list of 2010 task publications.

  11. An Updated Account of the WISELAV Project: A Visual Construction of the English Verb System

    Science.gov (United States)

    Pablos, Andrés Palacios

    2016-01-01

    This article presents the state of the art in WISELAV, an on-going research project based on the metaphor Languages Are (like) Visuals (LAV) and its mapping Words-In-Shapes Exchange (WISE). First, the cognitive premises that motivate the proposal are recalled: the power of images, students' increasingly visual cognitive learning style, and the…

  12. Impact of time displaced precipitation estimates for on-line updated models

    DEFF Research Database (Denmark)

    Borup, Morten; Grum, Morten; Mikkelsen, Peter Steen

    2012-01-01

    is forced upon the model, which therefore will end up including the same rain twice in the model run. This paper compares forecast accuracy of updated models when using time displaced rain input to that of rain input with constant biases. This is done using a simple timearea model and historic rain series...... that are either displaced in time or affected with a bias. The results show that for a 10 minute forecast, time displacements of 5 and 10 minutes compare to biases of 60% and 100%, respectively, independent of the catchments time of concentration....
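
A toy time-area model makes the comparison concrete: time-displaced rain shifts the whole simulated hydrograph, while biased rain scales it (the time-area ordinates and rain series below are invented, not from the paper's catchments):

```python
def time_area_runoff(rain, unit_areas):
    """Toy time-area model: runoff is the rain series convolved with the
    catchment's time-area diagram (ordinates summing to 1)."""
    n = len(rain)
    return [
        sum(rain[t - k] * unit_areas[k]
            for k in range(len(unit_areas)) if 0 <= t - k < n)
        for t in range(n + len(unit_areas) - 1)
    ]

rain = [0.0, 2.0, 4.0, 1.0, 0.0, 0.0]
ta = [0.2, 0.5, 0.3]                                 # time-area ordinates
q_true = time_area_runoff(rain, ta)
q_shift = time_area_runoff([0.0] + rain[:-1], ta)    # rain displaced one step
q_bias = time_area_runoff([r * 1.6 for r in rain], ta)  # 60% rain bias
```

The displaced input reproduces the true hydrograph one step late, whereas the biased input gets the timing right but every ordinate wrong, which is why the two error types trade off differently in short-lead forecasts.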

  13. Object Tracking Using Adaptive Covariance Descriptor and Clustering-Based Model Updating for Visual Surveillance

    Directory of Open Access Journals (Sweden)

    Lei Qin

    2014-05-01

    We propose a novel approach for tracking an arbitrary object in video sequences for visual surveillance. The first contribution of this work is an automatic feature extraction method that is able to extract compact discriminative features from a feature pool before computing the region covariance descriptor. As the feature extraction method is adaptive to a specific object of interest, we refer to the region covariance descriptor computed using the extracted features as the adaptive covariance descriptor. The second contribution is a weakly supervised method for updating the object appearance model during tracking. The method performs a mean-shift clustering procedure among the tracking result samples accumulated during a period of time and selects a group of reliable samples for updating the object appearance model. As such, the object appearance model is kept up to date and is prevented from contamination even in case of tracking mistakes. We conducted comparative experiments on real-world video sequences, which confirmed the effectiveness of the proposed approaches. The tracking system that integrates the adaptive covariance descriptor and the clustering-based model updating method accomplished stable object tracking on challenging video sequences.
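
The clustering-based sample selection can be sketched in one dimension (a flat-kernel mean-shift on scalar confidence scores; the scores and bandwidth are invented stand-ins for the paper's tracking-result samples):

```python
def mean_shift_1d(points, bandwidth, iters=50):
    """1-D mean-shift with a flat kernel: every point repeatedly moves to
    the mean of its neighbours within the bandwidth, converging on a mode."""
    modes = list(points)
    for _ in range(iters):
        modes = [
            sum(p for p in points if abs(p - m) <= bandwidth)
            / sum(1 for p in points if abs(p - m) <= bandwidth)
            for m in modes
        ]
    return modes

# Tracking-result scores accumulated over a period; 5.0 plays the role of a
# contaminated sample caused by a tracking mistake.
samples = [0.9, 1.0, 1.1, 1.05, 5.0]
modes = mean_shift_1d(samples, bandwidth=0.5)
dominant = max(set(modes), key=modes.count)   # mode of the largest cluster
reliable = [s for s, m in zip(samples, modes) if m == dominant]
```

Only the samples attached to the dominant mode are used to update the appearance model, so the outlier never contaminates it.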

  14. The Chandra X-Ray Observatory Radiation Environmental Model Update

    Science.gov (United States)

    Blackwell, William C.; Minow, Joseph I.; ODell, Stephen L.; Cameron, Robert A.; Virani, Shanil N.

    2003-01-01

    CRMFLX (Chandra Radiation Model of ion FLUX) is a radiation environment risk mitigation tool for use as a decision aid in planning the operation times for Chandra's Advanced CCD Imaging Spectrometer (ACIS) detector. The accurate prediction of the proton flux environment with energies of 100 - 200 keV is needed in order to protect the ACIS detector against proton degradation. Unfortunately, protons of this energy are abundant in the region of space where Chandra must operate. In addition, on-board particle detectors do not measure proton flux levels of the required energy range. CRMFLX is an engineering environment model developed to predict the proton flux in the solar wind, magnetosheath, and magnetosphere phenomenological regions of geospace. This paper describes the upgrades to the ion flux databases for the magnetosphere, magnetosheath, and solar wind regions. These data files were created by using Geotail and Polar spacecraft flux measurements only when the Advanced Composition Explorer (ACE) spacecraft's 0.14 MeV particle flux was below a threshold value. This new database allows for CRMFLX output to be correlated with both the geomagnetic activity level, as represented by the Kp index, as well as with solar proton events. Also, reported in this paper are results of analysis leading to a change in Chandra operations that successfully mitigates the false trigger rate for autonomous radiation events caused by relativistic electron flux contamination of proton channels.

  15. Updates on measurements and modeling techniques for expendable countermeasures

    Science.gov (United States)

    Gignilliat, Robert; Tepfer, Kathleen; Wilson, Rebekah F.; Taczak, Thomas M.

    2016-10-01

    The potential threat of recently-advertised anti-ship missiles has instigated research at the United States (US) Naval Research Laboratory (NRL) into the improvement of measurement techniques for visual band countermeasures. The goal of measurements is the collection of radiometric imagery for use in the building and validation of digital models of expendable countermeasures. This paper will present an overview of measurement requirements unique to the visual band and differences between visual band and infrared (IR) band measurements. A review of the metrics used to characterize signatures in the visible band will be presented and contrasted to those commonly used in IR band measurements. For example, the visual band measurements require higher fidelity characterization of the background, including improved high-transmittance measurements and better characterization of solar conditions to correlate results more closely with changes in the environment. The range of relevant engagement angles has also been expanded to include higher altitude measurements of targets and countermeasures. In addition to the discussion of measurement techniques, a top-level qualitative summary of modeling approaches will be presented. No quantitative results or data will be presented.

  16. Varieties of update

    Directory of Open Access Journals (Sweden)

    Sarah E Murray

    2014-03-01

    Full Text Available This paper discusses three potential varieties of update: updates to the common ground, structuring updates, and updates that introduce discourse referents. These different types of update are used to model different aspects of natural language phenomena. Not-at-issue information directly updates the common ground. The illocutionary mood of a sentence structures the context. Other updates introduce discourse referents of various types, including propositional discourse referents for at-issue information. Distinguishing these types of update allows a unified treatment of a broad range of phenomena, including the grammatical evidentials found in Cheyenne (Algonquian) as well as English evidential parentheticals, appositives, and mood marking. An update semantics that can formalize all of these varieties of update is given, integrating the different kinds of semantic contributions into a single representation of meaning. http://dx.doi.org/10.3765/sp.7.2
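    The distinction between the update varieties can be illustrated with a toy system (a simplification, not Murray's actual formalism): a context pairs a common ground, modeled as a set of possible worlds, with a store of discourse referents; not-at-issue content intersects the common ground directly, while at-issue content first only introduces a propositional discourse referent:

```python
# Worlds are frozensets of the atomic facts that hold in them.
w1 = frozenset({"rain"})
w2 = frozenset({"rain", "wind"})
w3 = frozenset()
context = {"cg": {w1, w2, w3}, "drefs": {}}

def update_cg(ctx, fact):
    """Not-at-issue content: directly restrict the common ground."""
    return {"cg": {w for w in ctx["cg"] if fact in w},
            "drefs": dict(ctx["drefs"])}

def introduce_dref(ctx, name, fact):
    """At-issue content: introduce a propositional discourse referent
    without (yet) restricting the common ground."""
    drefs = dict(ctx["drefs"])
    drefs[name] = {w for w in ctx["cg"] if fact in w}
    return {"cg": set(ctx["cg"]), "drefs": drefs}

ctx1 = update_cg(context, "rain")         # CG shrinks to rain-worlds
assert ctx1["cg"] == {w1, w2}
ctx2 = introduce_dref(ctx1, "p", "wind")  # CG unchanged; p now available
assert ctx2["cg"] == {w1, w2} and ctx2["drefs"]["p"] == {w2}
```

    The referent `p` can later be accepted into the common ground or targeted by anaphora, which is what makes at-issue content negotiable in a way not-at-issue content is not.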

  17. Box models for the evolution of atmospheric oxygen: an update

    Science.gov (United States)

    Kasting, J. F.

    1991-01-01

    A simple 3-box model of the atmosphere/ocean system is used to describe the various stages in the evolution of atmospheric oxygen. In Stage I, which probably lasted until redbeds began to form about 2.0 Ga ago, the Earth's surface environment was generally devoid of free O2, except possibly in localized regions of high productivity in the surface ocean. In Stage II, which may have lasted for less than 150 Ma, the atmosphere and surface ocean were oxidizing, while the deep ocean remained anoxic. In Stage III, which commenced with the disappearance of banded iron formations around 1.85 Ga ago and has lasted until the present, all three surface reservoirs contained appreciable amounts of free O2. Recent and not-so-recent controversies regarding the abundance of oxygen in the Archean atmosphere are identified and discussed. The rate of O2 increase during the Middle and Late Proterozoic is identified as another outstanding question.

  19. An Updated GA Signaling 'Relief of Repression' Regulatory Model

    Institute of Scientific and Technical Information of China (English)

    Xiu-Hua Gao; Sen-Lin Xiao; Qin-Fang Yao; Yu-Juan Wang; Xiang-Dong Fu

    2011-01-01

    Gibberellic acid (GA) regulates many aspects of plant growth and development. The DELLA proteins act to restrain plant growth, and GA relieves this repression by promoting their degradation via the 26S proteasome pathway. The elucidation of the crystal structure of the soluble GA receptor GID1 protein represents an important breakthrough for understanding the way in which GA is perceived and how it induces the destabilization of the DELLA proteins. Recent advances have revealed that the DELLA proteins are involved in protein-protein interactions within various environmental and hormone signaling pathways. In this review, we highlight our current understanding of the 'relief of repression' model that aims to explain the role of GA and the function of the DELLA proteins, incorporating the many aspects of cross-talk shown to exist in the control of plant development and the response to stress.

  20. PROJECT ACTIVITY ANALYSIS WITHOUT THE NETWORK MODEL

    Directory of Open Access Journals (Sweden)

    S. Munapo

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: This paper presents a new procedure for analysing and managing activity sequences in projects. The new procedure determines critical activities, critical path, start times, free floats, crash limits, and other useful information without the use of the network model. Even though network models have been used successfully in project management so far, there are weaknesses associated with their use. A network is not easy to generate, and the dummy activities usually associated with it make the network diagram complex – yet dummy activities have no meaning in the original project management problem. The network model for projects can be avoided while still obtaining all the useful information that is required for project management. All that is required is the activities, their accurate durations, and their predecessors.

    AFRIKAANS ABSTRACT (translated): This research describes a new method for analysing and managing the sequential activities of projects. The proposed method determines critical activities, the critical path, start times, float, crash limits, and other quantities without the use of a network model. The method performs satisfactorily in practice and bypasses the administrative burden of the traditional network models.
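    The stated inputs — activities, accurate durations, and predecessors — are indeed enough for a forward/backward pass that yields start times, floats, and the critical path without ever drawing a network diagram. A minimal sketch of that idea (not the paper's actual procedure; it assumes positive durations, so decreasing early-finish order is a valid backward ordering):

```python
def analyse(activities):
    """Critical-path analysis straight from an activity table
    (activity -> (duration, [predecessors])), no network diagram.
    Returns earliest starts, latest starts, and critical activities.
    Assumes acyclic precedence and positive durations."""
    # Forward pass in dependency order: earliest start/finish.
    es, ef, done = {}, {}, set()
    while len(done) < len(activities):
        for a, (d, preds) in activities.items():
            if a in done or any(p not in done for p in preds):
                continue
            es[a] = max((ef[p] for p in preds), default=0)
            ef[a] = es[a] + d
            done.add(a)
    horizon = max(ef.values())
    # Backward pass (successors first): latest finish/start.
    lf, ls = {}, {}
    for a in sorted(done, key=lambda x: -ef[x]):
        succs = [b for b, (_, ps) in activities.items() if a in ps]
        lf[a] = min((ls[b] for b in succs), default=horizon)
        ls[a] = lf[a] - activities[a][0]
    critical = [a for a in activities if es[a] == ls[a]]  # zero float
    return es, ls, critical

acts = {"A": (3, []), "B": (2, ["A"]), "C": (4, ["A"]), "D": (1, ["B", "C"])}
es, ls, crit = analyse(acts)
assert crit == ["A", "C", "D"]   # critical path A -> C -> D
assert ls["B"] - es["B"] == 2    # activity B has 2 units of float
```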

  1. Effect of overpasses in the Biham-Middleton-Levine traffic flow model with random and parallel update rule

    Science.gov (United States)

    Ding, Zhong-Jun; Jiang, Rui; Gao, Zi-You; Wang, Bing-Hong; Long, Jiancheng

    2013-08-01

    The effect of overpasses in the Biham-Middleton-Levine traffic flow model with random and parallel update rules has been studied. An overpass is a site that can be occupied simultaneously by an eastbound car and a northbound one. Under periodic boundary conditions, both self-organized and random patterns are observed in the free-flowing phase of the parallel update model, while only the random pattern is observed in the random update model. We have developed mean-field analysis for the moving phase of the random update model, which agrees with the simulation results well. An intermediate phase is observed in which some cars could pass through the jamming cluster due to the existence of free paths in the random update model. Two intermediate states are observed in the parallel update model, which have been ignored in previous studies. The intermediate phases in which the jamming skeleton is only oriented along the diagonal line in both models have been analyzed, with the analyses agreeing well with the simulation results. With the increase of overpass ratio, the jamming phase and the intermediate phases disappear in succession for both models. Under open boundary conditions, the system exhibits only two phases when the ratio of overpasses is below a threshold in the random update model. When the ratio of the overpass is close to 1, three phases could be observed, similar to the totally asymmetric simple exclusion process model. The dependence of the average velocity, the density, and the flow rate on the injection probability in the moving phase has also been obtained through mean-field analysis. The results of the parallel model under open boundary conditions are similar to that of the random update model.
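    A heavily simplified sketch of the parallel-update BML rule with overpass sites (illustrative only, far smaller than the paper's simulations): an overpass may hold one eastbound and one northbound car at once, so a car is blocked there only by a car of its own direction:

```python
def bml_step(east, north, overpasses, size, move_east):
    """One parallel sub-step of the Biham-Middleton-Levine model.
    east/north: sets of (row, col) car positions on a periodic
    size x size grid; overpasses: sites that may be occupied
    simultaneously by an eastbound AND a northbound car.
    Eastbound cars move right, then northbound cars move up."""
    movers, others = (east, north) if move_east else (north, east)
    moved = set()
    for (r, c) in movers:
        nr, nc = (r, (c + 1) % size) if move_east else ((r - 1) % size, c)
        blocked = (nr, nc) in movers or (
            (nr, nc) in others and (nr, nc) not in overpasses)
        moved.add((r, c) if blocked else (nr, nc))
    return (moved, north) if move_east else (east, moved)

size = 8
east = {(0, 0), (0, 1)}
north = {(0, 2), (1, 2)}
overpasses = {(0, 2)}
for t in range(10):
    east, north = bml_step(east, north, overpasses, size,
                           move_east=(t % 2 == 0))
assert len(east) == 2 and len(north) == 2   # cars are conserved
assert (east & north) <= overpasses         # dual occupancy only there
```

    Replacing the alternating sub-steps with randomly chosen ones gives the random-update variant the paper contrasts against.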

  2. User's guide to the MESOI diffusion model and to the utility programs UPDATE and LOGRVU

    Energy Technology Data Exchange (ETDEWEB)

    Athey, G.F.; Allwine, K.J.; Ramsdell, J.V.

    1981-11-01

    MESOI is an interactive, Lagrangian puff trajectory diffusion model. The model is documented separately (Ramsdell and Athey, 1981); this report is intended to provide MESOI users with the information needed to successfully conduct model simulations. The user is also provided with guidance in the use of the data file maintenance and review programs, UPDATE and LOGRVU. Complete examples are given for the operation of all three programs, and an appendix documents UPDATE and LOGRVU.

  3. Unsaturated Zone Flow Model Expert Elicitation Project

    Energy Technology Data Exchange (ETDEWEB)

    Coppersmith, K. J.

    1997-05-30

    This report presents results of the Unsaturated Zone Flow Model Expert Elicitation (UZFMEE) project at Yucca Mountain, Nevada. This project was sponsored by the US Department of Energy (DOE) and managed by Geomatrix Consultants, Inc. (Geomatrix), for TRW Environmental Safety Systems, Inc. The objective of this project was to identify and assess the uncertainties associated with certain key components of the unsaturated zone flow system at Yucca Mountain. This assessment reviewed the data inputs, modeling approaches, and results of the unsaturated zone flow model (termed the "UZ site-scale model") being developed by Lawrence Berkeley National Laboratory (LBNL) and the US Geological Survey (USGS). In addition to data input and modeling issues, the assessment focused on percolation flux (volumetric flow rate per unit cross-sectional area) at the potential repository horizon. An understanding of unsaturated zone processes is critical to evaluating the performance of the potential high-level nuclear waste repository at Yucca Mountain. A major goal of the project was to capture the uncertainties involved in assessing the unsaturated flow processes, including uncertainty in both the models used to represent physical controls on unsaturated zone flow and the parameter values used in the models. To ensure that the analysis included a wide range of perspectives, multiple individual judgments were elicited from members of an expert panel. The panel members, who were experts from within and outside the Yucca Mountain project, represented a range of experience and expertise. A deliberate process was followed in facilitating interactions among the experts, in training them to express their uncertainties, and in eliciting their interpretations. The resulting assessments and probability distributions, therefore, provide a reasonable aggregate representation of the knowledge and uncertainties about key issues regarding the unsaturated zone at the Yucca Mountain site.

  4. ACCOUNTING STANDARD SETTING IN THE INTERNATIONAL ARENA: UPDATE ON THE CONVERGENCE PROJECT

    Directory of Open Access Journals (Sweden)

    Bonaci Carmen Giorgiana

    2012-07-01

    Full Text Available Our paper contributes to the literature on international accounting by focusing on the standard setting process. As documented by the research literature, accounting regulation can enhance corporate governance (Melis and Carta, 2010), corporate reporting being expected to reduce information asymmetry. Based on accounting research and trade literature, we first synthesize recent evolutions in the international accounting arena. We therefore position our study within current realities significantly marked by uncertainty in relation to the worldwide globalization process. The objective of our paper is to perform an analysis that would help assess further developments of the convergence project. This is done by looking at the current status of the projects being developed under the IASB-FASB collaboration, as well as by developing a comparison between IFRS and US GAAP. The employed research methodology relies on analyzing data provided through the IASB and the FASB's websites, as well as other official documents issued by the two Boards. The assessment of the projects was done by reviewing exposure documents and monitoring the Boards' deliberations, while the developed comparison requires content analysis of the accounting regulations. Concluding upon the Boards' ongoing projects, we might identify areas in which convergence seems to be quite close (such as revenue recognition and leasing), but also areas in which convergence becomes even more challenging (such as financial instruments or the particular case of offsetting). Similar to other studies developed within the accounting research and trade literature (SEC 2011: 8), we may conclude that, generally, US GAAP presents more detailed, specific requirements than IFRS.

  5. Update on the SKA Offset Optics Design for the U.S. Technology Development Project

    Science.gov (United States)

    Imbriale, William A.; Cortes-Medellin, German; Baker, Lynn

    2011-01-01

    The U.S. design concept for the Square Kilometre Array (SKA) program is based on utilizing a large number of small-diameter dish antennas in the 12 to 15 meter diameter range. The Technology Development Project (TDP) is planning to design and build the first of these antennas to provide a demonstration of the technology and a solid base on which to estimate costs. The latest considerations for selecting both the optics and feed design are presented.

  6. Squaring the Project Management Circle: Updating the Cost, Schedule, and Performance Methodology

    Science.gov (United States)

    2016-04-30

    ... measure execution. This paper suggests that the cost, schedule, and performance paradigm, while still effective as a measure of managing programs, needs ... acquisition and its relationship to the measures of cost, schedule, and performance—the project management circle (see Figure 1). Rather than use the ... management circle. The circle recognizes the interrelationships, necessary equilibrium, and the influencing and balancing effects that these three ...

  7. Update on PHELIX Pulsed-Power Hydrodynamics Experiments and Modeling

    Science.gov (United States)

    Rousculp, Christopher; Reass, William; Oro, David; Griego, Jeffery; Turchi, Peter; Reinovsky, Robert; Devolder, Barbara

    2013-10-01

    The PHELIX pulsed-power driver is a 300 kJ, portable, transformer-coupled capacitor bank capable of delivering a 3-5 MA, 10 μs pulse into a low-inductance load. Here we describe further testing and hydrodynamics experiments. First, a 4 nH static inductive load has been constructed. This allows for repetitive high-voltage, high-current testing of the system. Results are used in the calibration of simple circuit models and numerical simulations across a range of bank charges (+/-20 < V0 < +/-40 kV). Furthermore, a dynamic liner-on-target load experiment has been conducted to explore the shock-launched transport of particulates (diam. ~ 1 μm) from a surface. The trajectories of the particulates are diagnosed with radiography. Results are compared to 2D hydro-code simulations. Finally, initial studies are underway to assess the feasibility of using the PHELIX driver as an electromagnetic launcher for planar shock-physics experiments. Work supported by United States-DOE under contract DE-AC52-06NA25396.

  8. Project SAFE. Update of the SFR-1 safety assessment. Phase 1. Appendix A6: Biosphere

    Energy Technology Data Exchange (ETDEWEB)

    Kautsky, U. [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden); Bergstroem, U. [Studsvik Eco and Safety AB, Nykoeping (Sweden)

    1998-10-01

    There has been considerable development of the models used for describing the turnover of radionuclides and other pollutants in the biosphere. New regulations require realistic assessments and descriptions of effects on fauna and flora. The use of trophic transfer models will thus be a more appropriate way to model the biosphere. These models take all accumulations of radionuclides in the ecosystem into account, not only direct pathways to man, and must therefore be developed for this area. Moreover, the turnover of loose deposits needs to be modelled. To be able to use these models there is a need to collect data on sediment composition, ecosystem structure, and potential changes due, for example, to sea-level fluctuations. These data will be collected from the literature and, where necessary, complemented with field surveys. In some cases new models need to be developed. The integration of the geosphere and biosphere models is identified as an important issue. 32 refs, 4 figs
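    A trophic transfer model of the kind proposed tracks radionuclide inventories in every compartment of the ecosystem, not just along the direct pathway to man. A minimal first-order compartment sketch (compartments and rate constants are purely illustrative):

```python
def step(inventory, rates, dt):
    """One explicit Euler step of a first-order compartment model.
    rates: {(src, dst): k}, giving a transfer flux k * inventory[src]
    from compartment src to compartment dst."""
    change = {c: 0.0 for c in inventory}
    for (src, dst), k in rates.items():
        flux = k * inventory[src] * dt
        change[src] -= flux
        change[dst] += flux
    return {c: inventory[c] + change[c] for c in inventory}

inv = {"water": 100.0, "sediment": 0.0, "biota": 0.0}
rates = {("water", "sediment"): 0.05, ("water", "biota"): 0.01,
         ("biota", "sediment"): 0.02}
for _ in range(200):
    inv = step(inv, rates, dt=0.1)
assert abs(sum(inv.values()) - 100.0) < 1e-9   # activity is conserved
assert inv["sediment"] > inv["biota"]          # sediment accumulates most
```

    Because every compartment is carried explicitly, accumulation in sediment or biota falls out of the same bookkeeping that yields the dose pathway to man.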

  9. Research of Cadastral Data Modelling and Database Updating Based on Spatio-temporal Process

    Directory of Open Access Journals (Sweden)

    ZHANG Feng

    2016-02-01

    Full Text Available The core of modern cadastre management is to renew the cadastral database and keep its currentness, topological consistency, and integrity. This paper analyses the changes, and the linkage between them, of various cadastral objects in the update process. Combining object-oriented modeling techniques with the expression of spatio-temporal objects' evolution, the paper proposes a cadastral data updating model based on the spatio-temporal process, following the way people think about such changes. Change rules based on the spatio-temporal topological relations of evolving cadastral spatio-temporal objects are drafted, and furthermore, cascading updates and history trace-back of cadastral features, land use, and buildings are realized. This model is implemented in the cadastral management system ReGIS. Cascading changes are triggered either by a direct driving force or by perceived external events. The system records the evolution process of spatio-temporal objects to facilitate the reconstruction of history, change tracking, and the analysis and forecasting of future changes.
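    The cascade-updating and history-trace-back idea can be sketched as follows (a toy illustration, not the ReGIS data model): each cadastral object carries its temporal validity and its lineage, and a split operation ends the parent's validity rather than deleting it, so history remains reconstructable:

```python
class Parcel:
    def __init__(self, pid, area, born, parent=None):
        self.pid, self.area = pid, area
        self.born, self.died = born, None   # temporal validity interval
        self.parent = parent                # lineage for history trace-back

def split(registry, pid, shares, when):
    """End the parent parcel's validity and create child parcels.
    The change cascades to the registry, and the recorded lineage
    supports reconstructing the state at any earlier time."""
    parent = registry[pid]
    parent.died = when                      # closed, never deleted
    children = []
    for i, share in enumerate(shares, 1):
        child = Parcel(f"{pid}-{i}", parent.area * share, when, parent=pid)
        registry[child.pid] = child
        children.append(child)
    return children

reg = {"P1": Parcel("P1", 1000.0, born=2001)}
kids = split(reg, "P1", shares=[0.25, 0.75], when=2016)
assert reg["P1"].died == 2016                  # parent closed, not deleted
assert [k.area for k in kids] == [250.0, 750.0]
assert all(k.parent == "P1" for k in kids)     # history back-trace
```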

  10. Sensitivity-based model updating for structural damage identification using total variation regularization

    Science.gov (United States)

    Grip, Niklas; Sabourova, Natalia; Tu, Yongming

    2017-02-01

    Sensitivity-based Finite Element Model Updating (FEMU) is one of the widely accepted techniques used for damage identification in structures. FEMU can be formulated as a numerical optimization problem and solved iteratively, automatically updating the unknown model parameters by minimizing the difference between measured and analytical structural properties. However, in the presence of noise in the measurements, the updating results are usually prone to errors. This is mathematically described as instability of the damage identification as an inverse problem. One way to resolve this problem is by using regularization. In this paper, we compare a well-established interpolation-based regularization method against methods based on the minimization of the total variation of the unknown model parameters. These are new regularization methods for structural damage identification. We investigate how using Huber and pseudo-Huber functions in the definition of total variation affects important properties of the methods. For instance, for well-localized damages the results show a clear advantage of the total variation based regularization in terms of the identified location and severity of damage compared with the interpolation-based solution. For a practical test of the proposed method we use a reinforced concrete plate. Measurements and analysis were performed first on an undamaged plate, and then repeated after applying four different degrees of damage.
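    The pseudo-Huber function mentioned above is the standard smooth approximation of the absolute value, and a total-variation penalty sums it over differences between neighbouring parameters — which is what favours the well-localized damage patterns the authors report. A minimal 1-D sketch (not the paper's FEMU formulation):

```python
import math

def pseudo_huber(x, delta):
    """Pseudo-Huber function: quadratic (~x**2/2) near zero,
    linear (~delta*|x|) for large |x|; smooth everywhere."""
    return delta ** 2 * (math.sqrt(1.0 + (x / delta) ** 2) - 1.0)

def total_variation(theta, delta):
    """Pseudo-Huber total variation of a 1-D parameter vector:
    the sum of smoothed jumps between neighbouring parameters."""
    return sum(pseudo_huber(theta[i + 1] - theta[i], delta)
               for i in range(len(theta) - 1))

smooth = [1.0, 1.0, 1.0, 1.0]          # undamaged: no jumps
localized = [1.0, 1.0, 0.2, 1.0]       # one well-localized "damage"
assert total_variation(smooth, 0.1) == 0.0
assert total_variation(localized, 0.1) > 0.0
```

    Added to the FEMU misfit as a regularization term, this penalty discourages noisy, spread-out parameter changes while tolerating a few sharp ones.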

  11. Stabilizing a Bicycle: A Modeling Project

    Science.gov (United States)

    Pennings, Timothy J.; Williams, Blair R.

    2010-01-01

    This article is a project that takes students through the process of forming a mathematical model of bicycle dynamics. Beginning with basic ideas from Newtonian mechanics (forces and torques), students use techniques from calculus and differential equations to develop the equations of rotational motion for a bicycle-rider system as it tips from…

  12. Updating prediction models by dynamical relaxation - An examination of the technique. [for numerical weather forecasting

    Science.gov (United States)

    Davies, H. C.; Turner, R. E.

    1977-01-01

    A dynamical relaxation technique for updating prediction models is analyzed with the help of the linear and nonlinear barotropic primitive equations. It is assumed that a complete four-dimensional time history of some prescribed subset of the meteorological variables is known. The rate of adaptation of the flow variables toward the true state is determined for a linearized f-model, and for mid-latitude and equatorial beta-plane models. The results of the analysis are corroborated by numerical experiments with the nonlinear shallow-water equations.
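    Dynamical relaxation ("nudging") adds a term to the model equations that relaxes the predicted state toward the known time history. A minimal sketch with a scalar stand-in for the primitive equations (all names, dynamics, and values are illustrative):

```python
def nudge(model_state, truth, f, lam, dt, steps):
    """Euler integration of dx/dt = f(x) + lam*(x_obs - x): the model
    state is relaxed ("nudged") toward the observed true state, whose
    complete time history is assumed known."""
    x, xt = model_state, truth
    for _ in range(steps):
        x += (f(x) + lam * (xt - x)) * dt
        xt += f(xt) * dt          # the true state evolves freely
    return x, xt

f = lambda x: -0.5 * x            # stand-in for the model dynamics
x0, truth0 = 10.0, 2.0            # model starts far from the truth
x, xt = nudge(x0, truth0, f, lam=2.0, dt=0.01, steps=500)
assert abs(x - xt) < abs(x0 - truth0) / 100   # error shrank >100x
```

    The rate of adaptation is set by the relaxation coefficient `lam`, mirroring the paper's question of how fast the unobserved variables converge to the true state.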

  13. Testing the prognostic accuracy of the updated pediatric sepsis biomarker risk model.

    Directory of Open Access Journals (Sweden)

    Hector R Wong

    Full Text Available BACKGROUND: We previously derived and validated a risk model to estimate mortality probability in children with septic shock (PERSEVERE: PEdiatRic SEpsis biomarkEr Risk modEl). PERSEVERE uses five biomarkers and age to estimate mortality probability. After the initial derivation and validation of PERSEVERE, we combined the derivation and validation cohorts (n = 355) and updated PERSEVERE. An important step in the development of updated risk models is to test their accuracy using an independent test cohort. OBJECTIVE: To test the prognostic accuracy of the updated version of PERSEVERE in an independent test cohort. METHODS: Study subjects were recruited from multiple pediatric intensive care units in the United States. Biomarkers were measured in 182 pediatric subjects with septic shock using serum samples obtained during the first 24 hours of presentation. The accuracy of the PERSEVERE 28-day mortality risk estimate was tested using diagnostic test statistics, and the net reclassification improvement (NRI) was used to test whether PERSEVERE adds information to a physiology-based scoring system. RESULTS: Mortality in the test cohort was 13.2%. Using a risk cut-off of 2.5%, the sensitivity of PERSEVERE for mortality was 83% (95% CI 62-95), specificity was 75% (68-82), positive predictive value was 34% (22-47), and negative predictive value was 97% (91-99). The area under the receiver operating characteristic curve was 0.81 (0.70-0.92). The false positive subjects had a greater degree of organ failure burden and longer intensive care unit length of stay, compared to the true negative subjects. When adding PERSEVERE to a physiology-based scoring system, the net reclassification improvement was 0.91 (0.47-1.35; p<0.001). CONCLUSIONS: The updated version of PERSEVERE estimates mortality probability reliably in a heterogeneous test cohort of children with septic shock and provides information over and above a physiology-based scoring system.
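    The diagnostic test statistics used to assess PERSEVERE follow directly from a 2x2 confusion table. A minimal sketch with illustrative counts (chosen to resemble, but not reproduce, the reported cohort):

```python
def diagnostic_stats(tp, fp, fn, tn):
    """Standard diagnostic test statistics from a 2x2 table of
    true/false positives and negatives."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),   # positive predictive value
        "npv": tn / (tn + fn),   # negative predictive value
    }

# Illustrative counts (NOT the PERSEVERE data): 24 deaths, 158 survivors.
stats = diagnostic_stats(tp=20, fp=40, fn=4, tn=118)
assert round(stats["sensitivity"], 2) == 0.83
assert round(stats["specificity"], 2) == 0.75
assert round(stats["ppv"], 2) == 0.33
```

    With a low-prevalence outcome, a modest PPV alongside a high NPV — the pattern reported here — is exactly what these formulas predict.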

  14. An update of the Death Valley regional groundwater flow system transient model, Nevada and California

    Science.gov (United States)

    Belcher, Wayne R.; Sweetkind, Donald S.; Faunt, Claudia C.; Pavelko, Michael T.; Hill, Mary C.

    2017-01-19

    Since the original publication of the Death Valley regional groundwater flow system (DVRFS) numerical model in 2004, more information on the regional groundwater flow system in the form of new data and interpretations has been compiled. Cooperators such as the Bureau of Land Management, National Park Service, U.S. Fish and Wildlife Service, the Department of Energy, and Nye County, Nevada, recognized a need to update the existing regional numerical model to maintain its viability as a groundwater management tool for regional stakeholders. The existing DVRFS numerical flow model was converted to MODFLOW-2005, updated with the latest available data, and recalibrated. Five main data sets were revised: (1) recharge from precipitation varying in time and space, (2) pumping data, (3) water-level observations, (4) an updated regional potentiometric map, and (5) a revision to the digital hydrogeologic framework model. The resulting DVRFS version 2.0 (v. 2.0) numerical flow model simulates groundwater flow conditions for the Death Valley region from 1913 to 2003 to correspond to the time frame for the most recently published (2008) water-use data. The DVRFS v. 2.0 model was calibrated by using the Tikhonov regularization functionality in the parameter estimation and predictive uncertainty software PEST. In order to assess the accuracy of the numerical flow model in simulating regional flow, the fit of simulated to target values (consisting of hydraulic heads and flows, including evapotranspiration and spring discharge, flow across the model boundary, and interbasin flow; the regional water budget; values of parameter estimates; and sensitivities) was evaluated. This evaluation showed that DVRFS v. 2.0 simulates conditions similar to DVRFS v. 1.0. Comparisons of the target values with simulated values also indicate that they match reasonably well and in some cases (boundary flows and discharge) significantly better than in DVRFS v. 1.0.

  15. Investigating the Impact on Modeled Ozone Concentrations Using Meteorological Fields From WRF With an Updated Four-Dimensional Data Assimilation Approach

    Science.gov (United States)

    The four-dimensional data assimilation (FDDA) technique in the Weather Research and Forecasting (WRF) meteorological model has recently undergone an important update from the original version. Previous evaluation results have demonstrated that the updated FDDA approach in WRF pr...

  16. Project management system model development and experimental research

    OpenAIRE

    Golubeva, Viktorija

    2006-01-01

    Project management is the application of knowledge, skills, tools, and techniques to project activities to meet project requirements. A Project Management Information System is tightly connected with the organizational structure and the particularities of the executed projects. However, the main objective of this research was to identify a project management model that would be universal, helpful, and easily used with small and medium projects. In the analysis phase we reviewed different methodologies, project ...

  17. Updated projections of radioactive wastes to be generated by the U. S. nuclear power industry

    Energy Technology Data Exchange (ETDEWEB)

    Kee, C.W.; Croft, A.G.; Blomeke, J.O.

    1976-12-01

    Eleven types of radioactive wastes to be generated within the fuel cycle operations of the U.S. nuclear power industry are defined, and projections are presented of their annual generation rates, shipping requirements, and accumulated characteristics over the remainder of this century. The power reactor complex is assumed to consist of uranium- and plutonium-fueled LWRs, HTGRs, and LMFBRs, and the installed nuclear electric capacity of the U.S. is taken as 68.1, 252, and 510 GW at the ends of calendar years 1980, 1990, and 2000, respectively. 72 tables.

  18. Automatically updating predictive modeling workflows support decision-making in drug design.

    Science.gov (United States)

    Muegge, Ingo; Bentzien, Jörg; Mukherjee, Prasenjit; Hughes, Robert O

    2016-09-01

    Using predictive models for early decision-making in drug discovery has become standard practice. We suggest that model building needs to be automated with minimum input and low technical maintenance requirements. Models perform best when tailored to answering specific compound optimization related questions. If qualitative answers are required, 2-bin classification models are preferred. Integrating predictive modeling results with structural information stimulates better decision making. For in silico models supporting rapid structure-activity relationship cycles, the performance deteriorates within weeks. Frequent automated updates of predictive models ensure best predictions. Consensus between multiple modeling approaches increases the prediction confidence. Combining qualified and nonqualified data optimally uses all available information. Dose predictions provide a holistic alternative to multiple individual property predictions for reaching complex decisions.
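    The 2-bin classification and consensus ideas can be sketched together (a toy illustration, not the authors' workflow): each model's score is binned at a threshold, and the level of agreement across models serves as the prediction confidence:

```python
def consensus(predictions, threshold):
    """2-bin classification by consensus: each model's score is
    binned at `threshold` into active/inactive; the fraction of
    agreeing models is reported as the prediction confidence."""
    bins = ["active" if p >= threshold else "inactive" for p in predictions]
    label = max(set(bins), key=bins.count)       # majority bin
    return label, bins.count(label) / len(bins)

# Scores from four (hypothetical) independently built models.
label, confidence = consensus([0.9, 0.8, 0.7, 0.4], threshold=0.5)
assert label == "active" and confidence == 0.75
```

    In an automatically updating workflow, each retrained model simply replaces its predecessor in the ensemble, so the consensus stays current without manual maintenance.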

  19. Update on Small Modular Reactors Dynamic System Modeling Tool: Web Application

    Energy Technology Data Exchange (ETDEWEB)

    Hale, Richard Edward [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Cetiner, Sacit M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Fugate, David L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Batteh, John J [Modelon Corporation (Sweden); Tiller, Michael M. [Xogeny Corporation (United States)

    2015-01-01

    Previous reports focused on the development of component and system models as well as end-to-end system models using Modelica and Dymola for two advanced reactor architectures: (1) Advanced Liquid Metal Reactor and (2) fluoride high-temperature reactor (FHR). The focus of this report is the release of the first beta version of the web-based application for model use and collaboration, as well as an update on the FHR model. The web-based application allows novice users to configure end-to-end system models from preconfigured choices to investigate the instrumentation and controls implications of these designs and allows for the collaborative development of individual component models that can be benchmarked against test systems for potential inclusion in the model library. A description of this application is provided along with examples of its use and a listing and discussion of all the models that currently exist in the library.

  20. Quest for Missing Proteins: Update 2015 on Chromosome-Centric Human Proteome Project.

    Science.gov (United States)

    Horvatovich, Péter; Lundberg, Emma K; Chen, Yu-Ju; Sung, Ting-Yi; He, Fuchu; Nice, Edouard C; Goode, Robert J; Yu, Simon; Ranganathan, Shoba; Baker, Mark S; Domont, Gilberto B; Velasquez, Erika; Li, Dong; Liu, Siqi; Wang, Quanhui; He, Qing-Yu; Menon, Rajasree; Guan, Yuanfang; Corrales, Fernando J; Segura, Victor; Casal, J Ignacio; Pascual-Montano, Alberto; Albar, Juan P; Fuentes, Manuel; Gonzalez-Gonzalez, Maria; Diez, Paula; Ibarrola, Nieves; Degano, Rosa M; Mohammed, Yassene; Borchers, Christoph H; Urbani, Andrea; Soggiu, Alessio; Yamamoto, Tadashi; Salekdeh, Ghasem Hosseini; Archakov, Alexander; Ponomarenko, Elena; Lisitsa, Andrey; Lichti, Cheryl F; Mostovenko, Ekaterina; Kroes, Roger A; Rezeli, Melinda; Végvári, Ákos; Fehniger, Thomas E; Bischoff, Rainer; Vizcaíno, Juan Antonio; Deutsch, Eric W; Lane, Lydie; Nilsson, Carol L; Marko-Varga, György; Omenn, Gilbert S; Jeong, Seul-Ki; Lim, Jong-Sun; Paik, Young-Ki; Hancock, William S

    2015-09-04

    This paper summarizes the recent activities of the Chromosome-Centric Human Proteome Project (C-HPP) consortium, which develops new technologies to identify yet-to-be-annotated proteins (termed "missing proteins") in biological samples that lack sufficient experimental evidence at the protein level for confident protein identification. The C-HPP also aims to identify new protein forms that may be caused by genetic variability, post-translational modifications, and alternative splicing. Proteogenomic data integration forms the basis of the C-HPP's activities; therefore, we have summarized some of the key approaches and their roles in the project. We present new analytical technologies that extend the chemical space covered and lower detection limits, coupled with bioinformatics tools and some publicly available resources that can be used to improve data analysis or support the development of analytical assays. Most of this paper's content has been compiled from posters, slides, and discussions presented in the series of C-HPP workshops held during 2014. All data (posters, presentations) used are available at the C-HPP Wiki (http://c-hpp.webhosting.rug.nl/) and in the Supporting Information.

  1. Constraints on a φCDM model from strong gravitational lensing and updated Hubble parameter measurements

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yun; Geng, Chao-Qiang [Department of Physics, National Tsing Hua University, Hsinchu, 300 Taiwan (China); Cao, Shuo; Huang, Yu-Mei; Zhu, Zong-Hong, E-mail: chenyun@bao.ac.cn, E-mail: geng@phys.nthu.edu.tw, E-mail: caoshuo@bnu.edu.cn, E-mail: huangymei@gmail.com, E-mail: zhuzh@bnu.edu.cn [Department of Astronomy, Beijing Normal University, Beijing 100875 (China)

    2015-02-01

    We constrain the scalar field dark energy model with an inverse power-law potential, i.e., V(φ) ∝ φ^(−α) (α > 0), using a set of recent cosmological observations, compiling an updated sample of Hubble parameter measurements that includes 30 independent data points. Our results show that the constraining power of the updated sample of H(z) data with the HST prior on H_0 is stronger than that of the SCP Union2 and Union2.1 compilations. A recent sample of strong gravitational lensing systems is also adopted to constrain the model, although the results are not significant. A joint analysis of the strong gravitational lensing data with the more restrictive updated Hubble parameter measurements and the Type Ia supernovae data from SCP Union2 indicates that recent observations still cannot distinguish whether dark energy is a time-independent cosmological constant or a time-varying dynamical component.
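The H(z)-constraint method described above can be sketched as a minimal χ² fit. The data points, the fixed H0, and the use of a flat ΛCDM expansion rate (the time-independent limit the paper tests the φCDM model against) are all illustrative assumptions, not the paper's actual 30-point compilation or its φCDM solver:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical H(z) data points (z, H in km/s/Mpc, sigma) -- illustrative only,
# not the 30-point compilation used in the paper.
z = np.array([0.07, 0.20, 0.40, 0.90, 1.30, 1.75])
H_obs = np.array([69.0, 72.9, 82.0, 117.0, 168.0, 202.0])
sigma = np.array([19.6, 29.6, 8.6, 23.0, 17.0, 40.0])

H0 = 70.0  # km/s/Mpc, fixed here for simplicity (the paper uses an HST prior)

def H_model(zz, omega_m):
    """Flat LambdaCDM expansion rate: the cosmological-constant limit."""
    return H0 * np.sqrt(omega_m * (1 + zz) ** 3 + (1 - omega_m))

def chi2(omega_m):
    return np.sum(((H_model(z, omega_m) - H_obs) / sigma) ** 2)

res = minimize_scalar(chi2, bounds=(0.05, 0.6), method="bounded")
print(f"best-fit Omega_m = {res.x:.3f}, chi2 = {res.fun:.2f}")
```

A full analysis would scan the (α, Ωm) plane of the φCDM model the same way, comparing χ² surfaces between the dynamical and constant dark energy cases.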

  2. Propagation Modeling of Food Safety Crisis Information Update Based on the Multi-agent System

    Directory of Open Access Journals (Sweden)

    Meihong Wu

    2015-08-01

    This study proposes a new multi-agent system framework based on epistemic default complex adaptive theory and uses agent-based simulation and modeling of the information-updating process to study food safety crisis information dissemination. We then explore the interaction effects among the agents in food safety crisis information dissemination in the current environment, revealing in particular how the government agent, food company agent, and network media agent influence users' confidence in food safety. The information-updating process describes how to guide the normal spread of food safety crisis information in public opinion in the current environment and how to enhance average users' confidence in food quality and safety.

  3. From up to date climate and ocean evidence with updated UN emissions projections, the time is now to recommend an immediate massive effort on CO2.

    Science.gov (United States)

    Carter, Peter

    2017-04-01

    This paper provides further compelling evidence for 'an immediate, massive effort to control CO2 emissions, stopped by mid-century' (Cai, Lenton & Lontzek, 2016). Atmospheric CO2, now above 405 ppm (actual and trend) and still accelerating despite flat emissions since 2014, with a 2015 spike of >3 ppm that is unprecedented in Earth history (A. Glikson), is on the worst-case IPCC scenario. Atmospheric methane is increasing faster than its past 20-year rate, almost on the worst-case IPCC AR5 scenario (Global Carbon Project, 2016). Observed effects of atmospheric greenhouse gas (GHG) pollution are increasing faster. These include long-lived atmospheric GHG concentrations, radiative forcing, surface average warming, Greenland ice sheet melting, the Arctic daily sea ice anomaly, ocean heat (and the rate at which it penetrates deeper), ocean acidification, and ocean de-oxygenation. The atmospheric GHG concentration of 485 ppm CO2 eq (WMO, 2015) commits us to 'about 2°C' equilibrium (AR5). Keeping to 2°C by 2100 would require 'substantial emissions reductions over the next few decades' (AR5). Instead, the May 2016 UN update on 'intended' national emissions targets under the Paris Agreement projects that global emissions will be 16% higher by 2030, and the November 2016 International Energy Agency update projects that energy-related CO2 eq emissions will be 30% higher by 2030, leading to 'around 2.7°C by 2100 and above 3°C thereafter'. Climate change feedback will be positive this century, and multiple large vulnerable sources of amplifying feedback exist (AR5). 'Extensive tree mortality and widespread forest die-back linked to drought and temperature stress have been documented on all vegetated continents' (AR5). 'Recent studies suggest a weakening of the land sink, further amplifying atmospheric growth of CO2' (WMO, 2016). Under all but the best-case IPCC AR5 scenario, surface temperature is projected to increase above 2°C by 2100, which is above 3°C (equilibrium) after 2100, with ocean acidification still increasing at

  4. Subglacial Hydrology Model Intercomparison Project (SHMIP)

    Science.gov (United States)

    Werder, Mauro A.; de Fleurian, Basile; Creyts, Timothy T.; Damsgaard, Anders; Delaney, Ian; Dow, Christine F.; Gagliardini, Olivier; Hoffman, Matthew J.; Seguinot, Julien; Sommers, Aleah; Irarrazaval Bustos, Inigo; Downs, Jakob

    2017-04-01

    The SHMIP project is the first intercomparison project for subglacial drainage models (http://shmip.bitbucket.org). Its synthetic test suites and evaluation were designed so that any subglacial hydrology model producing effective pressure can participate. In contrast to ice deformation, the physical processes of subglacial hydrology (which in turn impacts basal sliding of glaciers) are poorly known. A further complication is that different glacial and geological settings can lead to different drainage physics. The aim of the project is therefore to qualitatively compare the outputs of the participating models for a wide range of water forcings and glacier geometries. This makes it possible to put existing studies, which use different drainage models, into context and helps new studies select the most suitable model for the problem at hand. We present the results from the just-completed intercomparison exercise. Twelve models participated: eight 2D and four 1D models. Nine include both an efficient and an inefficient drainage system; the other three include only one of the two. All but two models use R-channels as the efficient system and/or a linked-cavity-like inefficient system; one exception uses porous layers with different characteristics for each of the systems, and the other is based on canals. The main variable used for the comparison is effective pressure, as it is a direct proxy for basal sliding of glaciers. The models produce large differences in the effective pressure fields, in particular for higher water input scenarios. This shows that the selection of a subglacial drainage model will likely impact the conclusions of a study significantly.

  5. Updating irradiated graphite disposal: Project 'GRAPA' and the international decommissioning network.

    Science.gov (United States)

    Wickham, Anthony; Steinmetz, Hans-Jürgen; O'Sullivan, Patrick; Ojovan, Michael I

    2017-05-01

    Demonstrating competence in planning and executing the disposal of radioactive wastes is a key factor in the public perception of the nuclear power industry and must be demonstrated when making the case for new nuclear build. This work addresses the particular waste stream of irradiated graphite, mostly derived from reactor moderators and amounting to more than 250,000 tonnes world-wide. Use may be made of its unique chemical and physical properties to consider possible processing and disposal options outside the normal simple classifications and repository options for mixed low or intermediate-level wastes. The IAEA has an obvious involvement in radioactive waste disposal and has established a new project 'GRAPA' - Irradiated Graphite Processing Approaches - to encourage an international debate and collaborative work aimed at optimising and facilitating the treatment of irradiated graphite.

  6. Update on HI data collection from GBT, Parkes and Arecibo telescopes for the Cosmic Flows project

    CERN Document Server

    Courtois, Helene M

    2014-01-01

    Cosmic Flows is an international multi-element project with the goal to map motions of galaxies in the Local Universe. Kinematic information from observations in the radio HI line and photometry at optical or near-infrared bands are acquired to derive the large majority of distances that are obtained through the luminosity-linewidth or Tully-Fisher relation. This paper gathers additional observational radio data, frequently unpublished, retrieved from the archives of Green Bank, Parkes and Arecibo telescopes. Extracted HI profiles are consistently processed to produce linewidth measurements. Our current "All-Digital HI Catalog" contains a total of 20,343 HI spectra for 17,738 galaxies with 14,802 galaxies with accurate linewidth measurement useful for Tully-Fisher galaxy distances. This addition of 4,117 new measurements represents an augmentation of 34% compared to our last release.
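The luminosity-linewidth (Tully-Fisher) distance step mentioned above can be sketched in a few lines. The calibration slope and zero point below are hypothetical placeholders, not the Cosmic Flows calibration:

```python
import math

# Hypothetical Tully-Fisher calibration: M = SLOPE*(log10(W) - 2.5) + ZP.
# SLOPE and ZP are illustrative placeholders, not the project's fitted values.
SLOPE, ZP = -8.0, -20.0

def tf_distance_mpc(apparent_mag, linewidth_kms):
    """Distance from an HI linewidth via the luminosity-linewidth relation."""
    M = SLOPE * (math.log10(linewidth_kms) - 2.5) + ZP  # absolute magnitude
    mu = apparent_mag - M                               # distance modulus
    return 10 ** ((mu - 25.0) / 5.0)                    # modulus -> Mpc

# Example: a galaxy with m = 12.5 mag and HI linewidth W = 400 km/s
print(f"{tf_distance_mpc(12.5, 400.0):.1f} Mpc")
```

In practice the linewidth is inclination-corrected and the magnitude extinction-corrected before the relation is applied.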

  7. Foods and beverages and colorectal cancer risk: a systematic review and meta-analysis of cohort studies, an update of the evidence of the WCRF-AICR Continuous Update Project.

    Science.gov (United States)

    Vieira, A R; Abar, L; Chan, D S M; Vingeliene, S; Polemiti, E; Stevens, C; Greenwood, D; Norat, T

    2017-08-01

    As part of the World Cancer Research Fund International Continuous Update Project, we updated the systematic review and meta-analysis of prospective studies to quantify the dose-response relation between food and beverage intake and colorectal cancer risk. We searched PubMed and several other databases up to 31 May 2015 for prospective studies reporting adjusted relative risk estimates for the association of specific food groups and beverages with risk of colorectal, colon and rectal cancer, and performed dose-response meta-analyses using random effects models to estimate summary relative risks (RRs). About 400 individual study estimates from 111 unique cohort studies were included. Overall, colorectal cancer risk increases by 12% for each 100 g/day increase of red and processed meat intake (95% CI = 4-21%, I2 = 70%, pheterogeneity (ph) < 0.01) and by 7% for each 10 g/day increase of ethanol intake from alcoholic drinks (95% CI = 5-9%, I2 = 25%, ph = 0.21). Colorectal cancer risk decreases by 17% for each 90 g/day increase of whole grains (95% CI = 11-21%, I2 = 0%, ph = 0.30, 6 studies) and by 13% for each 400 g/day increase of dairy products intake (95% CI = 10-17%, I2 = 18%, ph = 0.27, 10 studies). Inverse associations were also observed for vegetables intake (RR per 100 g/day = 0.98, 95% CI = 0.96-0.99, I2 = 0%, ph = 0.48, 11 studies) and for fish intake (RR per 100 g/day = 0.89, 95% CI = 0.80-0.99, I2 = 0%, ph = 0.52, 11 studies); these associations were weak for vegetables and driven by one study for fish. Intakes of fruits, coffee, tea, cheese, poultry and legumes were not associated with colorectal cancer risk. Our results reinforce the evidence that high intake of red and processed meat and alcohol increases the risk of colorectal cancer. Milk and whole grains may have a protective role against colorectal cancer. The evidence for vegetables and fish was less convincing.
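The summary RRs above come from random-effects models. A minimal DerSimonian-Laird pooling of log relative risks, with made-up study inputs rather than the review's actual estimates, looks like this:

```python
import math

# Illustrative study-level estimates: (RR, 95% CI lower, 95% CI upper)
# per unit of intake. These numbers are hypothetical, not from the review.
studies = [
    (1.10, 1.01, 1.20),
    (1.15, 1.03, 1.28),
    (1.08, 0.95, 1.23),
]

def pool(studies):
    """DerSimonian-Laird random-effects pooled RR."""
    logs = [math.log(rr) for rr, lo, hi in studies]
    # SE of log RR recovered from the CI width (log CI = logRR +/- 1.96*SE)
    ses = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for _, lo, hi in studies]
    w = [1 / se ** 2 for se in ses]                           # fixed-effect weights
    fixed = sum(wi * li for wi, li in zip(w, logs)) / sum(w)
    q = sum(wi * (li - fixed) ** 2 for wi, li in zip(w, logs))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(studies) - 1)) / c)             # between-study variance
    w_re = [1 / (se ** 2 + tau2) for se in ses]               # random-effects weights
    pooled = sum(wi * li for wi, li in zip(w_re, logs)) / sum(w_re)
    return math.exp(pooled)

print(f"pooled RR = {pool(studies):.3f}")
```

A dose-response meta-analysis additionally fits a trend over exposure levels within each study before pooling, but the weighting principle is the same.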

  8. Receiver Operating Characteristic (ROC) Curve-based Prediction Model for Periodontal Disease Updated With the Calibrated Community Periodontal Index.

    Science.gov (United States)

    Su, Chiu-Wen; Ming-Fang Yen, Amy; Lai, Hongmin; Chen, Hsiu-Hsi; Chen, Sam Li-Sheng

    2017-07-28

    Background: The accuracy of a prediction model for periodontal disease based on the community periodontal index (CPI) has been assessed using the area under the receiver operating characteristic (AUROC) curve, but how the uncalibrated CPI, as measured by general dentists trained by periodontists in a large epidemiological study, affects the performance of such a prediction model has not yet been researched. Methods: We conducted a two-stage design, first carrying out a validation study to calibrate the CPI between a senior periodontal specialist and the trained general dentists who measured CPIs in the main study of a nationwide survey. A Bayesian hierarchical logistic regression model was applied to estimate the non-updated and updated clinical weights used for building risk scores. How the calibrated CPI affected the performance of the updated prediction model was quantified by comparing the AUROC curves of the original and updated models. Results: The calibration estimates obtained from the validation study were 66% sensitivity and 85% specificity. After updating, the clinical weights of each predictor were inflated, and the risk score for the highest risk category rose from 434 to 630. This update improved the AUROC of the prediction model from 62.6% (95% CI: 61.7%-63.6%) for the non-updated model to 68.9% (95% CI: 68.0%-69.6%) for the updated one, a statistically significant difference (P < 0.05). Conclusion: Updating with the calibrated CPI derived from a large epidemiological survey improved the performance of the prediction model for periodontal disease.
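The AUROC used to compare the two models equals the probability that a randomly chosen case receives a higher risk score than a randomly chosen non-case. A rank-based sketch, with made-up risk scores and disease labels rather than the survey data:

```python
# Rank-based AUROC: fraction of (case, non-case) pairs where the case scores
# higher, counting ties as half. Scores and labels below are illustrative.
def auroc(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [630, 520, 434, 610, 300, 410, 505, 280]  # hypothetical risk scores
labels = [1,   0,   1,   1,   0,   0,   1,   0]    # 1 = periodontal disease
print(f"AUROC = {auroc(scores, labels):.3f}")
```

With many observations this pairwise count is computed via ranks (the Mann-Whitney U statistic) rather than the explicit double loop.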

  9. Update of the Computing Models of the WLCG and the LHC Experiments

    CERN Document Server

    Bird, I; Carminati, F; Cattaneo, M; Clarke, P; Fisk, I; Girone, M; Harvey, J; Kersevan, B; Mato, P; Mount, R; Panzer-Steindel, B; CERN. Geneva. The LHC experiments Committee; LHCC

    2014-01-01

    In preparation for the data collection and analysis in LHC Run 2, the LHCC and Computing Scrutiny Group of the RRB requested a detailed review of the current computing models of the LHC experiments and a consolidated plan for the future computing needs. This document represents the status of the work of the WLCG collaboration and the four LHC experiments in updating the computing models to reflect the advances in understanding of the most effective ways to use the distributed computing and storage resources, based upon the experience gained during LHC Run 1.

  10. Neuroadaptation in nicotine addiction: update on the sensitization-homeostasis model.

    Science.gov (United States)

    DiFranza, Joseph R; Huang, Wei; King, Jean

    2012-10-17

    The role of neuronal plasticity in supporting the addictive state has generated much research and some conceptual theories. One such theory, the sensitization-homeostasis (SH) model, postulates that nicotine suppresses craving circuits, and this triggers the development of homeostatic adaptations that autonomously support craving. Based on clinical studies, the SH model predicts the existence of three distinct forms of neuroplasticity that are responsible for withdrawal, tolerance and the resolution of withdrawal. Over the past decade, many controversial aspects of the SH model have become well established by the literature, while some details have been disproven. Here we update the model based on new studies showing that nicotine dependence develops through a set sequence of symptoms in all smokers, and that the latency to withdrawal, the time it takes for withdrawal symptoms to appear during abstinence, is initially very long but shortens by several orders of magnitude over time. We conclude by outlining directions for future research based on the updated model, and commenting on how new experimental studies can gain from the framework put forth in the SH model.

  11. Neuroadaptation in Nicotine Addiction: Update on the Sensitization-Homeostasis Model

    Directory of Open Access Journals (Sweden)

    Jean King

    2012-10-01

    The role of neuronal plasticity in supporting the addictive state has generated much research and some conceptual theories. One such theory, the sensitization-homeostasis (SH) model, postulates that nicotine suppresses craving circuits, and this triggers the development of homeostatic adaptations that autonomously support craving. Based on clinical studies, the SH model predicts the existence of three distinct forms of neuroplasticity that are responsible for withdrawal, tolerance and the resolution of withdrawal. Over the past decade, many controversial aspects of the SH model have become well established by the literature, while some details have been disproven. Here we update the model based on new studies showing that nicotine dependence develops through a set sequence of symptoms in all smokers, and that the latency to withdrawal, the time it takes for withdrawal symptoms to appear during abstinence, is initially very long but shortens by several orders of magnitude over time. We conclude by outlining directions for future research based on the updated model, and commenting on how new experimental studies can gain from the framework put forth in the SH model.

  12. Adaptive Equalizer Using Selective Partial Update Algorithm and Selective Regressor Affine Projection Algorithm over Shallow Water Acoustic Channels

    Directory of Open Access Journals (Sweden)

    Masoumeh Soflaei

    2014-01-01

    One of the most important obstacles to reliable communication in shallow water channels is intersymbol interference (ISI), which is caused by scattering from the surface and reflection from the bottom. Using adaptive equalizers in the receiver is one of the best ways to overcome this problem. In this paper, we apply the family of selective regressor affine projection algorithms (SR-APA) and the family of selective partial update APA (SPU-APA), which have low computational complexity, an important factor in adaptive equalizer performance. We use experimental data from the Strait of Hormuz to examine the efficiency of the proposed methods over a shallow water channel. We observe that the steady-state mean square error (MSE) of SR-APA and SPU-APA decreases by 5.8 dB and 5.5 dB, respectively, in comparison with the least mean square (LMS) algorithm. The SPU-APA and SR-APA families also converge faster than LMS-type algorithms.
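For background, the full affine projection algorithm (APA), of which the SR and SPU families evaluate only selected regressors or update only subsets of coefficients to cut complexity, can be sketched for identifying an unknown channel. The channel taps, step size, and projection order below are illustrative assumptions, not the paper's experimental setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Unknown FIR channel to identify (illustrative taps, not a measured channel).
h_true = np.array([0.8, -0.4, 0.2])
N, L, P = len(h_true), 2000, 4          # taps, samples, projection order
mu, delta = 0.5, 1e-3                   # step size, regularization

x = rng.standard_normal(L)                               # input signal
d = np.convolve(x, h_true)[:L] + 0.01 * rng.standard_normal(L)  # noisy output

w = np.zeros(N)
for n in range(N + P - 1, L):
    # X stacks the P most recent input regressors (P x N); e holds the
    # corresponding a priori errors.
    X = np.array([x[n - p - N + 1 : n - p + 1][::-1] for p in range(P)])
    e = d[n : n - P : -1] - X @ w
    # APA update: project the error through the regularized Gram matrix.
    w += mu * X.T @ np.linalg.solve(X @ X.T + delta * np.eye(P), e)

print(np.round(w, 3))  # should approach h_true
```

SPU variants would update only a subset of the entries of `w` each iteration, and SR variants would keep only a subset of the rows of `X`, trading a small convergence penalty for lower cost.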

  13. THE CELL CENTERED DATABASE PROJECT: AN UPDATE ON BUILDING COMMUNITY RESOURCES FOR MANAGING AND SHARING 3D IMAGING DATA

    Science.gov (United States)

    Martone, Maryann E.; Tran, Joshua; Wong, Willy W.; Sargis, Joy; Fong, Lisa; Larson, Stephen; Lamont, Stephan P.; Gupta, Amarnath; Ellisman, Mark H.

    2008-01-01

    Databases have become integral parts of data management, dissemination and mining in biology. At the Second Annual Conference on Electron Tomography, held in Amsterdam in 2001, we proposed that electron tomography data should be shared in a manner analogous to structural data at the protein and sequence scales. At that time, we outlined our progress in creating a database to bring together cell level imaging data across scales, The Cell Centered Database (CCDB). The CCDB was formally launched in 2002 as an on-line repository of high-resolution 3D light and electron microscopic reconstructions of cells and subcellular structures. It contains 2D, 3D and 4D structural and protein distribution information from confocal, multiphoton and electron microscopy, including correlated light and electron microscopy. Many of the data sets are derived from electron tomography of cells and tissues. In the five years since its debut, we have moved the CCDB from a prototype to a stable resource and expanded the scope of the project to include data management and knowledge engineering. Here we provide an update on the CCDB and how it is used by the scientific community. We also describe our work in developing additional knowledge tools, e.g., ontologies, for annotation and query of electron microscopic data. PMID:18054501

  14. A New Replacement for the Deep Diving Submersible ALVIN: Initial Project Update and Concept

    Science.gov (United States)

    Brown, R.; Tivey, M. A.; Walden, B.; Detrick, R. S.

    2004-12-01

    In August 2004, the National Science Foundation (NSF) funded the first phase of a 4-year project proposed by Woods Hole Oceanographic Institution (WHOI) to build a replacement submersible for the present human occupied vehicle (HOV) ALVIN, operated by WHOI as part of the National Deep Submergence Facility. The design of the replacement HOV is the result of almost 10 years of deliberations among the scientific community and several studies, including a recent 2004 National Research Council report on the "Future Needs of Deep Submergence Science". The overriding design philosophy was to enhance capabilities without detracting from the present ALVIN capabilities that have made it one of the premier research tools in oceanography. The replacement submersible will have a nominal depth capability of 6500 meters, allowing access to over 99% of the world's ocean floor. The submersible is planned to have a sphere diameter of 2.1 m, providing 27 cu. ft. of additional internal volume over the present ALVIN. A key improvement will be the viewport design, with five viewports giving a total 245-degree viewing area and with the forward three viewports having overlapping fields of view. This will provide an unprecedented view of the seafloor. The central pilot viewport is 7" in diameter, with two forward 6" observer viewports and two lateral 5" observer viewports. The replacement vehicle will continue to operate with 1 pilot and 2 scientists inside the sphere. Reaching the greater depths will require increased descent and ascent rates. The new vehicle will operate with a variable water ballast system that can provide trim angles of up to +/-25 degrees for use on descent and ascent, and will also enable the vehicle to stop in midwater to conduct experiments and sampling. Important design constraints are imposed by the capacity of ALVIN's present mother ship, Atlantis, and the A-frame launch system. Due to these restrictions the replacement HOV will weigh 44

  15. Current Developments in Dementia Risk Prediction Modelling: An Updated Systematic Review.

    Directory of Open Access Journals (Sweden)

    Eugene Y H Tang

    Accurate identification of individuals at high risk of dementia influences clinical care, inclusion criteria for clinical trials, and the development of preventative strategies. Numerous models have been developed for predicting dementia. To evaluate these models we undertook a systematic review in 2010 and updated it in 2014 owing to the increase in research published in this area. Here we include a critique of the variables selected for inclusion and an assessment of model prognostic performance. Our previous systematic review was updated with a search from January 2009 to March 2014 in electronic databases (MEDLINE, Embase, Scopus, Web of Science). Articles examining risk of dementia in non-demented individuals and including measures of sensitivity, specificity, or the area under the curve (AUC) or c-statistic were included. In total, 1,234 articles were identified from the search; 21 articles met the inclusion criteria. New developments in dementia risk prediction include the testing of non-APOE genes, use of non-traditional dementia risk factors, incorporation of diet, physical function and ethnicity, and model development in specific subgroups of the population, including individuals with diabetes and those with different educational levels. Four models have been externally validated. Three studies considered the time or cost implications of computing the model. There is no one model that is recommended for dementia risk prediction in population-based settings; further, it is unlikely that one model will fit all. Consideration of the optimal features of new models should focus on methodology (setting/sample, model development and testing in a replication cohort) and the acceptability and cost of attaining the risk variables included in the prediction score. Further work is required to validate existing models or develop new ones in different populations, as well as to determine the ethical implications of dementia risk prediction, before applying the particular

  16. Updating Finite Element Model of a Wind Turbine Blade Section Using Experimental Modal Analysis Results

    Directory of Open Access Journals (Sweden)

    Marcin Luczak

    2014-01-01

    This paper presents selected results and aspects of multidisciplinary and interdisciplinary research on the experimental and numerical study of the structural dynamics of a bend-twist coupled, full-scale section of a wind turbine blade. The main goal of the research is to validate the finite element model of the modified wind turbine blade section, mounted in a flexible support structure, against the experimental results. Bend-twist coupling was implemented by adding angled unidirectional layers on the suction and pressure sides of the blade. Dynamic tests and simulations were performed on a section of a full-scale wind turbine blade provided by Vestas Wind Systems A/S. The numerical results are compared to the experimental measurements, and the discrepancies are assessed by natural frequency differences and the modal assurance criterion. Based on a sensitivity analysis, a set of model parameters was selected for the model updating process. Design of experiments and response surface methods were implemented to find the values of the model parameters yielding results closest to the experimental ones. The updated finite element model produces results more consistent with the measurements.
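The modal assurance criterion used to assess the discrepancies is the normalized squared inner product of two mode shapes: values near 1 indicate a well-correlated pair. A minimal sketch with illustrative mode-shape vectors (not the blade's measured modes):

```python
import numpy as np

def mac(phi_a, phi_b):
    """Modal assurance criterion between two mode-shape vectors."""
    num = np.abs(phi_a.conj() @ phi_b) ** 2
    return num / ((phi_a.conj() @ phi_a).real * (phi_b.conj() @ phi_b).real)

# Hypothetical measured vs. FE-predicted mode shapes at four sensor locations.
measured = np.array([1.00, 0.62, -0.41, -0.98])
predicted = np.array([0.97, 0.66, -0.38, -1.02])
print(f"MAC = {mac(measured, predicted):.4f}")  # close to 1 => good match
```

In model updating, a full MAC matrix between all measured and predicted modes is used to pair modes before comparing their natural frequencies.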

  17. A Review of the Updated Pharmacophore for the Alpha 5 GABA(A Benzodiazepine Receptor Model

    Directory of Open Access Journals (Sweden)

    Terry Clayton

    2015-01-01

    An updated model of the GABA(A) benzodiazepine receptor pharmacophore of the α5-BzR/GABA(A) subtype has been constructed, prompted by the synthesis of subtype-selective ligands in light of recent developments in ligand synthesis, behavioral studies, and molecular modeling of the binding site itself. A number of BzR/GABA(A) α5-subtype-selective compounds were synthesized, notably the α5-subtype-selective inverse agonist PWZ-029 (1), which enhances cognition in both rodents and primates. In addition, a chiral positive allosteric modulator (PAM), SH-053-2′F-R-CH3 (2), has been shown to reverse the deleterious effects in the MAM model of schizophrenia as well as alleviate constriction in airway smooth muscle. Presented here is an updated model of the pharmacophore for α5β2γ2 Bz/GABA(A) receptors, including a rendering of PWZ-029 docked within the α5 binding pocket showing specific interactions of the molecule with the receptor. Differences in the included volume compared with α1β2γ2, α2β2γ2, and α3β2γ2 are illustrated for clarity. These new models enhance the ability to understand the structural characteristics of ligands that act as agonists, antagonists, or inverse agonists at the Bz binding site of GABA(A) receptors.

  18. Experimental Update of the Overtopping Model Used for the Wave Dragon Wave Energy Converter

    Energy Technology Data Exchange (ETDEWEB)

    Parmeggiani, Stefano [Wave Dragon Ltd., London (United Kingdom); Kofoed, Jens Peter [Aalborg Univ. (Denmark). Department of Civil Engineering; Friis-Madsen, Erik [Wave Dragon Ltd., London (United Kingdom)

    2013-04-15

    An overtopping model specifically suited to Wave Dragon is needed in order to improve the reliability of its performance estimates. The model must encompass all relevant physical processes that affect overtopping and be flexible enough to adapt to any local conditions and device configuration. An experimental investigation is carried out to update an existing formulation suited for 2D draft-limited, low-crested structures, in order to include the effects on the overtopping flow of the wave steepness, the 3D geometry of Wave Dragon, the wing reflectors, the device motions and the non-rigid connection between platform and reflectors. The study is carried out in four phases, each specifically targeted at quantifying one of these effects through a sensitivity analysis and at modeling it through custom-made parameters. These parameters depend on features of the wave or the device configuration, all of which can be measured in real time. By avoiding new fitting coefficients, this approach allows a broader applicability of the model beyond the Wave Dragon case, to any overtopping WEC or structure within the range of tested conditions. The reliability of overtopping predictions for Wave Dragon has increased, as the updated model offers improved accuracy and precision with respect to the former version.

  19. Experimental Update of the Overtopping Model Used for the Wave Dragon Wave Energy Converter

    Directory of Open Access Journals (Sweden)

    Erik Friis-Madsen

    2013-04-01

    An overtopping model specifically suited to Wave Dragon is needed in order to improve the reliability of its performance estimates. The model must encompass all relevant physical processes that affect overtopping and be flexible enough to adapt to any local conditions and device configuration. An experimental investigation is carried out to update an existing formulation suited for 2D draft-limited, low-crested structures, in order to include the effects on the overtopping flow of the wave steepness, the 3D geometry of Wave Dragon, the wing reflectors, the device motions and the non-rigid connection between platform and reflectors. The study is carried out in four phases, each specifically targeted at quantifying one of these effects through a sensitivity analysis and at modeling it through custom-made parameters. These parameters depend on features of the wave or the device configuration, all of which can be measured in real time. By avoiding new fitting coefficients, this approach allows a broader applicability of the model beyond the Wave Dragon case, to any overtopping WEC or structure within the range of tested conditions. The reliability of overtopping predictions for Wave Dragon has increased, as the updated model offers improved accuracy and precision with respect to the former version.

  20. Updating the CHAOS series of field models using Swarm data and resulting candidate models for IGRF-12

    DEFF Research Database (Denmark)

    Finlay, Chris; Olsen, Nils; Tøffner-Clausen, Lars

    Ten months of data from ESA's Swarm mission, together with recent ground observatory monthly means, are used to update the CHAOS series of geomagnetic field models, with a focus on time changes of the core field. As for previous CHAOS field models, quiet-time, night-side data selection criteria are applied. Time dependence is captured by a spline representation with knot points spaced at 0.5-year intervals. The resulting field model is able to consistently fit data from six independent low Earth orbit satellites: Oersted, CHAMP, SAC-C and the three Swarm satellites. As an example, we present comparisons of the excellent model

  1. Regional Climate Model Intercomparison Project for Asia.

    Science.gov (United States)

    Fu, Congbin; Wang, Shuyu; Xiong, Zhe; Gutowski, William J.; Lee, Dong-Kyou; McGregor, John L.; Sato, Yasuo; Kato, Hisashi; Kim, Jeong-Woo; Suh, Myoung-Seok

    2005-02-01

    Improving the simulation of regional climate change is one of the high-priority areas of climate study because regional information is needed for climate change impact assessments. Such information is especially important for the region covered by the East Asian monsoon where there is high variability in both space and time. To this end, the Regional Climate Model Intercomparison Project (RMIP) for Asia has been established to evaluate and improve regional climate model (RCM) simulations of the monsoon climate. RMIP operates under joint support of the Asia-Pacific Network for Global Change Research (APN), the Global Change System for Analysis, Research and Training (START), the Chinese Academy of Sciences, and several projects of participating nations. The project currently involves 10 research groups from Australia, China, Japan, South Korea, and the United States, as well as scientists from India, Italy, Mongolia, North Korea, and Russia. RMIP has three simulation phases: March 1997-August 1998, which covers a full annual cycle and extremes in monsoon behavior; January 1989-December 1998, which examines simulated climatology; and a regional climate change scenario, involving nesting with a global model. This paper is a brief report of RMIP goals, implementation design, and some initial results from the first phase studies.

  2. Projects to expand energy sources in the western states: an update of Information Circular 8719. [24 states west of the Mississippi River

    Energy Technology Data Exchange (ETDEWEB)

    Rich, C.H. Jr.

    1977-01-01

    This report is an expansion and update of BM-IC-8719 and comprises maps and tables listing the name, location, and other pertinent data concerning certain fuel-related projects. The maps show the locations of the planned or proposed facilities. The tables include information on projects involving the proposed or planned development of fuel resources, as well as the development of storage, transportation, and conversion facilities. The report covers the 24 states west of the Mississippi River including Alaska and Hawaii. Of the 808 projects for which information is provided, 219 concern coal mines, 246 concern electric generating plants, and 115 concern uranium mines; Energy Supply and Environmental Coordination Act coal conversion notices are also included. Because of the dynamic nature of the energy industry, many uncertainties exist and some of the listed projects may never become realities. Also, no attempt has been made to determine the degree of certainty or viability of each project.

  3. Nonlinear finite element model updating for damage identification of civil structures using batch Bayesian estimation

    Science.gov (United States)

    Ebrahimian, Hamed; Astroza, Rodrigo; Conte, Joel P.; de Callafon, Raymond A.

    2017-02-01

    This paper presents a framework for structural health monitoring (SHM) and damage identification of civil structures. This framework integrates advanced mechanics-based nonlinear finite element (FE) modeling and analysis techniques with a batch Bayesian estimation approach to estimate time-invariant model parameters used in the FE model of the structure of interest. The framework uses the input excitation and dynamic response of the structure and updates a nonlinear FE model of the structure to minimize the discrepancies between predicted and measured response time histories. The updated FE model can then be interrogated to detect, localize, classify, and quantify the state of damage and predict the remaining useful life of the structure. As opposed to recursive estimation methods, in the batch Bayesian estimation approach, the entire time histories of the input excitation and output response of the structure are used as a batch of data to estimate the FE model parameters through a number of iterations. In the case of a non-informative prior, the batch Bayesian method leads to an extended maximum likelihood (ML) estimation method to estimate jointly the time-invariant model parameters and the measurement noise amplitude. The extended ML estimation problem is solved efficiently using a gradient-based interior-point optimization algorithm. Gradient-based optimization algorithms require the FE response sensitivities with respect to the model parameters to be identified. The FE response sensitivities are computed accurately and efficiently using the direct differentiation method (DDM). The estimation uncertainties are evaluated based on the Cramer-Rao lower bound (CRLB) theorem by computing the exact Fisher Information matrix using the FE response sensitivities with respect to the model parameters. The accuracy of the proposed uncertainty quantification approach is verified using a sampling approach based on the unscented transformation. Two validation studies, based on realistic

  4. Updated Hungarian Gravity Field Solution Based on Fifth Generation GOCE Gravity Field Models

    Science.gov (United States)

    Toth, Gyula; Foldvary, Lorant

    2015-03-01

    With the completion of ESA's GOCE satellite mission, fifth generation gravity field models are available from ESA's GOCE High-level Processing Facility. Our contribution is an updated gravity field solution for Hungary using the latest DIR R05 GOCE gravity field model. The solution methodology is least squares gravity field parameter estimation using Spherical Radial Base Functions (SRBF). Regional datasets include deflections of the vertical (DOV), gravity anomalies and quasigeoid heights by GPS/levelling. The GOCE DIR R05 model has been combined with the EGM2008 model and has been evaluated in comparison with the EGM2008 and EIGEN-6C3stat models to assess the performance of our regional gravity field solution.

  5. Contact-based model for strategy updating and evolution of cooperation

    Science.gov (United States)

    Zhang, Jianlei; Chen, Zengqiang

    2016-06-01

    Establishing a workable model of players' strategy decision process is not easy, and the heated debate over the related strategy updating rules remains intriguing. Models for evolutionary games have traditionally assumed that players imitate their successful partners by comparing respective payoffs, raising the question of what happens if the game information is not easily available. Focusing on this yet-unsolved case, the motivation behind the work presented here is to establish a novel model for the updating of states in a spatial population, bypassing the payoffs required in previous models and instead considering players' contact patterns. It is handy and understandable to employ switching probabilities for determining the microscopic dynamics of strategy evolution. Our results illuminate the conditions under which the steady coexistence of competing strategies is possible. These findings reveal that the evolutionary fate of the coexisting strategies can be calculated analytically, and provide novel hints for the resolution of cooperative dilemmas in a competitive context. We hope that our results have disclosed new explanations about the survival and coexistence of competing strategies in structured populations.

  6. Avoiding unintentional eviction from integral projection models.

    Science.gov (United States)

    Williams, Jennifer L; Miller, Tom E X; Ellner, Stephen P

    2012-09-01

    Integral projection models (IPMs) are increasingly being applied to study size-structured populations. Here we call attention to a potential problem in their construction that can have important consequences for model results. IPMs are implemented using an approximating matrix and bounded size range. Individuals near the size limits can be unknowingly "evicted" from the model because their predicted future size is outside the range. We provide simple measures for the magnitude of eviction and the sensitivity of the population growth rate (lambda) to eviction, allowing modelers to assess the severity of the problem in their IPM. For IPMs of three plant species, we found that eviction occurred in all cases and caused underestimation of the population growth rate (lambda) relative to eviction-free models; it is likely that other models are similarly affected. Models with frequent eviction should be modified because eviction is only possible when size transitions are badly mis-specified. We offer several solutions to eviction problems, but we emphasize that the modeler must choose the most appropriate solution based on an understanding of why eviction occurs in the first place. We recommend testing IPMs for eviction problems and resolving them, so that population dynamics are modeled more accurately.
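    The eviction check the authors recommend can be sketched numerically: when an IPM is implemented as an approximating matrix on a bounded size range, column sums of the discretized growth kernel that fall below one reveal probability mass "evicted" past the size limits. The kernel and all parameter values below are illustrative stand-ins, not taken from the paper.

    ```python
    import numpy as np

    # Hypothetical Gaussian growth kernel on a bounded size domain [L, U].
    L, U, n = 0.0, 10.0, 100
    edges = np.linspace(L, U, n + 1)
    z = 0.5 * (edges[:-1] + edges[1:])   # midpoints of the size bins
    h = edges[1] - edges[0]

    def growth_pdf(z1, z):
        # next-size distribution: mean grows linearly, sd constant (illustrative)
        mu = 1.0 + 0.9 * z
        sd = 0.8
        return np.exp(-0.5 * ((z1 - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

    # G[i, j] ~ probability that an individual of size z_j moves to bin i
    G = h * growth_pdf(z.reshape(-1, 1), z.reshape(1, -1))

    # Column sums below 1 mean probability mass leaves [L, U]: eviction.
    evicted = 1.0 - G.sum(axis=0)
    print("max eviction probability:", evicted.max())

    # One common fix: return the evicted mass to the nearest boundary bin.
    lost = np.clip(1.0 - G.sum(axis=0), 0.0, None)
    G_fixed = G.copy()
    G_fixed[-1, :] += lost * (z > (L + U) / 2)   # large sizes -> top bin
    G_fixed[0, :]  += lost * (z <= (L + U) / 2)  # small sizes -> bottom bin
    ```

    With this hypothetical kernel, eviction is severe near the upper size limit (nearly half the transition mass for the largest sizes), which is exactly the situation the authors warn causes underestimation of lambda.
    
    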

  7. Simulation-based Estimation of Thermal Behavior of Direct Feed Drive Mechanism with Updated Finite Element Model

    Institute of Scientific and Technical Information of China (English)

    LIN Xiankun; LI Yanjun; LI Haolin

    2014-01-01

    Linear motors generate high heat and cause significant deformation in high speed direct feed drive mechanisms. It is relevant to estimate their deformation behavior to improve their application in precision machine tools. This paper describes a method to estimate the thermal deformation based on updated finite element (FE) model methods. Firstly, an FE model is established for a linear motor drive test rig that includes the correlation between temperature rise and its resulting deformation. The relationship between the input and output variables of the FE model is identified with a modified multivariate input/output least square support vector regression machine. Additionally, the temperature rise and displacements at some critical points on the mechanism are obtained experimentally by a system of thermocouples and an interferometer. The FE model is updated through intelligent comparison between the experimentally measured values and the results from the regression machine. The experiments for testing thermal behavior, along with the updated FE model simulations, are conducted on the test rig under reciprocating cycle drive conditions. The results show that the intelligently updated FE model can be implemented to analyze the temperature variation distribution of the mechanism and to estimate its thermal behavior. The accuracy of the thermal behavior estimation with the optimally updated method can be more than double that of the initial theoretical FE model. This paper provides a simulation method that is effective to estimate the thermal behavior of the direct feed drive mechanism with high accuracy.

  8. Projecting Policy Effects with Statistical Models

    Directory of Open Access Journals (Sweden)

    Christopher Sims

    1988-03-01

    Full Text Available This paper attempts to briefly discuss the current frontiers in quantitative modeling for forecasting and policy analysis. It does so by summarizing some recent developments in three areas: reduced form forecasting models; theoretical models including elements of stochastic optimization; and identification. In the process, the paper tries to provide some remarks on the direction we seem to be headed.

  9. The Chancellor's Model School Project (CMSP)

    Science.gov (United States)

    Lopez, Gil

    1999-01-01

    What does it take to create and implement a 7th to 8th grade middle school program where the great majority of students achieve at high academic levels regardless of their previous elementary school backgrounds? This was the major question that guided the research and development of a 7-year long project effort entitled the Chancellor's Model School Project (CMSP) from September 1991 to August 1998. The CMSP effort conducted largely in two New York City public schools was aimed at creating and testing a prototype 7th and 8th grade model program that was organized and test-implemented in two distinct project phases: Phase I of the CMSP effort was conducted from 1991 to 1995 as a 7th to 8th grade extension of an existing K-6 elementary school, and Phase II was conducted from 1995 to 1998 as a 7th to 8th grade middle school program that became an integral part of a newly established 7-12th grade high school. In Phase I, the CMSP demonstrated that with a highly structured curriculum coupled with strong academic support and increased learning time, students participating in the CMSP were able to develop a strong foundation for rigorous high school coursework within the space of 2 years (at the 7th and 8th grades). Mathematics and Reading test score data during Phase I of the project, clearly indicated that significant academic gains were obtained by almost all students -- at both the high and low ends of the spectrum -- regardless of their previous academic performance in the K-6 elementary school experience. The CMSP effort expanded in Phase II to include a fully operating 7-12 high school model. Achievement gains at the 7th and 8th grade levels in Phase II were tempered by the fact that incoming 7th grade students' academic background at the CMSP High School was significantly lower than students participating in Phase 1. Student performance in Phase II was also affected by the broadening of the CMSP effort from a 7-8th grade program to a fully functioning 7-12 high

  10. Finite element model updating of concrete structures based on imprecise probability

    Science.gov (United States)

    Biswal, S.; Ramaswamy, A.

    2017-09-01

    Imprecise probability based methods are developed in this study for parameter estimation in finite element model updating for concrete structures, when the measurements are imprecisely defined. Bayesian analysis using the Metropolis-Hastings algorithm for parameter estimation is generalized to incorporate the imprecision present in the prior distribution, in the likelihood function, and in the measured responses. Three different cases are considered: (i) imprecision is present in the prior distribution and in the measurements only; (ii) imprecision is present in the parameters of the finite element model and in the measurements only; and (iii) imprecision is present in the prior distribution, in the parameters of the finite element model, and in the measurements. Procedures are also developed for integrating the imprecision in the parameters of the finite element model in the finite element software Abaqus. The proposed methods are then verified against reinforced concrete beams and prestressed concrete beams tested in our laboratory as part of this study.
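    The Metropolis-Hastings step at the core of such Bayesian model updating can be sketched with a one-degree-of-freedom stand-in for the finite element model: a stiffness parameter is sampled from its posterior given a "measured" natural frequency. The frequency map, prior bounds, and noise level are assumptions for illustration, not values from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy stand-in for an FE model: natural frequency of a 1-DOF oscillator
    # as a function of stiffness k, with unit mass. Illustrative only.
    def model_freq(k):
        return np.sqrt(k) / (2 * np.pi)

    f_meas = 1.2     # "measured" frequency in Hz (assumed value)
    sigma = 0.05     # measurement noise standard deviation (assumed)

    def log_post(k):
        if not (10.0 < k < 200.0):           # uniform prior bounds (assumed)
            return -np.inf
        return -0.5 * ((model_freq(k) - f_meas) / sigma) ** 2

    # Random-walk Metropolis-Hastings sampling of the stiffness posterior.
    k = 50.0
    samples = []
    for _ in range(20000):
        k_new = k + rng.normal(0.0, 2.0)     # symmetric proposal
        if np.log(rng.uniform()) < log_post(k_new) - log_post(k):
            k = k_new                        # accept
        samples.append(k)
    post = np.array(samples[5000:])          # discard burn-in
    print("posterior mean k:", post.mean())
    ```

    The posterior concentrates near the stiffness consistent with the measured frequency (about (2*pi*1.2)^2, i.e. roughly 57 in these units); the imprecise-probability extension in the paper would repeat such runs over sets of priors and measurement bounds.
    
    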

  11. The updated geodetic mean dynamic topography model – DTU15MDT

    DEFF Research Database (Denmark)

    Knudsen, Per; Andersen, Ole Baltazar; Maximenko, Nikolai

    An update to the global mean dynamic topography model DTU13MDT is presented. For DTU15MDT the newer gravity model EIGEN-6C4 has been combined with the DTU15MSS mean sea surface model to construct this global mean dynamic topography model. The EIGEN-6C4 is derived using the full series of GOCE data...... re-tracked CRYOSAT-2 altimetry also, hence, increasing its resolution. Also, some issues in the Polar regions have been solved. Finally, the filtering was re-evaluated by adjusting the quasi-gaussian filter width to optimize the fit to drifter velocities. Subsequently, geostrophic surface currents...... were derived from the DTU15MDT. The results show that geostrophic surface currents associated with the mean circulation have been further improved and that currents having speeds down to below 4 cm/s have been recovered....

  12. Development of integrated software project planning model

    OpenAIRE

    Manalif, Ekananta; Capretz, Luiz Fernando; Ho, Danny

    2012-01-01

    As the most uncertain and complex type of project when compared to other types of projects, a software development project depends heavily on the outcome of the software project planning phase, which helps project managers by predicting the project's demands with respect to budgeting, scheduling, and the allocation of resources. The two main activities in software project planning are effort estimation and risk assessment, which have to be executed together because the accuracy of the effort estimation is ...

  13. Rotating shaft model updating from modal data by a direct energy approach : a feasibility study

    Energy Technology Data Exchange (ETDEWEB)

    Audebert, S. [Electricite de France (EDF), 75 - Paris (France). Direction des Etudes et Recherches; Girard, A.; Chatelain, J. [Intespace - Division Etudes et Recherche, 31 - Toulouse (France)

    1996-12-31

    Investigations to improve the monitoring of rotating machinery tend more and more to use numerical models. The aim is to obtain multi-fluid-bearing rotor models which are able to correctly represent their dynamic behaviour, of either modal or forced-response type. The possibility of extending the direct energy method, initially developed for undamped structures, to rotating machinery is studied. It is based on the minimization of the kinetic and strain energy gap between experimental and analytic modal data. The preliminary determination of the eigenmodes of a multi-linear-bearing rotor system shows the complexity of the problem in comparison with undamped non-rotating structures: taking into account gyroscopic effects and bearing damping, as factors of rotor velocities, leads to eigenmodes with complex components; moreover, the non-symmetric matrices related to the stiffness and damping contributions of the bearings induce distinct left- and right-hand-side eigenmodes (the left-hand-side eigenmodes correspond to the adjoint structure). Theoretically, the extension of the energy method is studied by considering first the intermediate case of an undamped non-gyroscopic structure, and second the general case of a rotating shaft: the data used for the updating procedure are eigenfrequencies and left- and right-hand-side mode shapes. Since left-hand-side mode shapes cannot be directly measured, they are replaced by analytic ones. The method is tested on a two-bearing rotor system with an added mass; simulated data are used, relative to a non-compatible structure, i.e. one which is not part of the set of modified analytic possible structures. The parameters to be corrected are the mass density, the Young's modulus, and the linearized stiffness and damping characteristics of the bearings. If the parameters are influential with regard to the modes to be updated, the updating method permits a significant reduction of the gap between analytic and experimental modes, even for modes not involved in the procedure. Modal damping appears to be more

  14. GCSS Idealized Cirrus Model Comparison Project

    Science.gov (United States)

    Starr, David OC.; Benedetti, Angela; Boehm, Matt; Brown, Philip R. A.; Gierens, Klaus; Girard, Eric; Giraud, Vincent; Jakob, Christian; Jensen, Eric; Khvorostyanov, Vitaly

    2000-01-01

    The GCSS Working Group on Cirrus Cloud Systems (WG2) is conducting a systematic comparison and evaluation of cirrus cloud models. This fundamental activity seeks to support the improvement of models used for climate simulation and numerical weather prediction through assessment and improvement of the "process" models underlying parametric treatments of cirrus cloud processes in large-scale models. The WG2 Idealized Cirrus Model Comparison Project is an initial comparison of cirrus cloud simulations by a variety of cloud models for a series of idealized situations with relatively simple initial conditions and forcing. The models (16) represent the state-of-the-art and include 3-dimensional large eddy simulation (LES) models, two-dimensional cloud resolving models (CRMs), and single column model (SCM) versions of GCMs. The model microphysical components are similarly varied, ranging from single-moment bulk (relative humidity) schemes to fully size-resolved (bin) treatments where ice crystal growth is explicitly calculated. Radiative processes are included in the physics package of each model. The baseline simulations include "warm" and "cold" cirrus cases where cloud top initially occurs at about -47C and -66C, respectively. All simulations are for nighttime conditions (no solar radiation) where the cloud is generated in an ice supersaturated layer, about 1 km in depth, with an ice pseudoadiabatic thermal stratification (neutral). Continuing cloud formation is forced via an imposed diabatic cooling representing a 3 cm/s uplift over a 4-hour time span followed by a 2-hour dissipation stage with no cooling. Variations of these baseline cases include no-radiation and stable-thermal-stratification cases. Preliminary results indicated the great importance of ice crystal fallout in determining even the gross cloud characteristics, such as average vertically-integrated ice water path (IWP). Significant inter-model differences were found. Ice water fall speed is directly

  15. An All-Time-Domain Moving Object Data Model, Location Updating Strategy, and Position Estimation

    National Research Council Canada - National Science Library

    Wu, Qunyong; Huang, Junyi; Luo, Jianping; Yang, Jianjun

    2015-01-01

    .... Secondly, we proposed a new dynamic threshold location updating strategy. The location updating threshold was given dynamically in accordance with the velocity, accuracy, and azimuth positioning information from the GPS...

  16. Summary of Expansions, Updates, and Results in GREET® 2016 Suite of Models

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2016-10-01

    This report documents the technical content of the expansions and updates in Argonne National Laboratory’s GREET® 2016 release and provides references and links to key documents related to these expansions and updates.

  17. Project-matrix models of marketing organization

    Directory of Open Access Journals (Sweden)

    Gutić Dragutin

    2009-01-01

    Full Text Available Unlike the theory and practice of corporate organization, marketing organization has to this day not developed the numerous forms and structures at its disposal. It can be fairly estimated that marketing organization today, in most of our companies and in almost all of its parts, noticeably lags behind corporate organization. Marketing managers have always been occupied by basic, narrow marketing activities such as: sales growth, market analysis, market growth and market share, marketing research, introduction of new products, modification of products, promotion, distribution etc. They rarely found it necessary to focus more on other aspects of marketing management, for example: marketing planning and marketing control, marketing organization and leadership. This paper deals with aspects of project-matrix marketing organization management. Two-dimensional and multi-dimensional models are presented. Among the two-dimensional, these models are analyzed: market management/product management model; product management/management of product lifecycle phases on the market model; customer management/marketing functions management model; demand management/marketing functions management model; market position management/marketing functions management model.

  18. Building information models for astronomy projects

    Science.gov (United States)

    Ariño, Javier; Murga, Gaizka; Campo, Ramón; Eletxigerra, Iñigo; Ampuero, Pedro

    2012-09-01

    A Building Information Model (BIM) is a digital representation of the physical and functional characteristics of a building. BIMs represent the geometrical characteristics of the building, but also properties like bills of quantities, definitions of COTS components, status of material in the different stages of the project, project economic data, etc. The BIM methodology, which is well established in the Architecture, Engineering and Construction (AEC) domain for conventional buildings, has been brought one step forward in its application to astronomical/scientific facilities. In these facilities, steel/concrete structures have high dynamic and seismic requirements, M&E installations are complex, and a large amount of special equipment and mechanisms is involved as a fundamental part of the facility. The detail design definition is typically implemented by different design teams in specialized design software packages. In order to allow the coordinated work of different engineering teams, the overall model, and its associated engineering database, is progressively integrated using coordination and roaming software which can be used before starting the construction phase for checking interferences, planning the construction sequence, studying maintenance operations, reporting to the project office, etc. This integrated design and construction approach allows efficient planning of the construction sequence (4D), a powerful tool to study and analyze in detail alternative construction sequences and ideally coordinate the work of different construction teams. In addition, the engineering, construction and operational databases can be linked to the virtual model (6D), which gives end users an invaluable tool for lifecycle management, as all the facility information can be easily accessed, added or replaced. This paper presents the BIM methodology as implemented by IDOM, with the E-ELT and ATST enclosures as application examples.

  19. THE SCHEME FOR THE DATABASE BUILDING AND UPDATING OF 1:10 000 DIGITAL ELEVATION MODELS

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The National Bureau of Surveying and Mapping of China has planned to speed up the development of spatial data infrastructure (SDI) in the coming few years. This SDI consists of four types of digital products, i.e., digital orthophotos, digital elevation models, digital line graphs and digital raster graphs. For the DEM, a scheme for the database building and updating of 1:10 000 digital elevation models has been proposed and some experimental tests have also been accomplished. This paper describes the theoretical (and/or technical) background and reports some of the experimental results to support the scheme. Various aspects of the scheme such as accuracy, data sources, data sampling, spatial resolution, terrain modeling, data organization, etc. are discussed.

  20. Experimental Update of the Overtopping Model Used for the Wave Dragon Wave Energy Converter

    DEFF Research Database (Denmark)

    Parmeggiani, Stefano; Kofoed, Jens Peter; Friis-Madsen, Erik

    2013-01-01

    An overtopping model specifically suited for Wave Dragon is needed in order to improve the reliability of its performance estimates. The model shall be comprehensive of all relevant physical processes that affect overtopping and flexible to adapt to any local conditions and device configuration....... An experimental investigation is carried out to update an existing formulation suited for 2D draft-limited, low-crested structures, in order to include the effects on the overtopping flow of the wave steepness, the 3D geometry of Wave Dragon, the wing reflectors, the device motions and the non-rigid connection...... of which can be measured in real-time. Instead of using new fitting coefficients, this approach allows a broader applicability of the model beyond the Wave Dragon case, to any overtopping WEC or structure within the range of tested conditions. Predictions reliability of overtopping over Wave Dragon...

  1. Updated Life-Cycle Assessment of Aluminum Production and Semi-fabrication for the GREET Model

    Energy Technology Data Exchange (ETDEWEB)

    Dai, Qiang [Argonne National Lab. (ANL), Argonne, IL (United States); Kelly, Jarod C. [Argonne National Lab. (ANL), Argonne, IL (United States); Burnham, Andrew [Argonne National Lab. (ANL), Argonne, IL (United States); Elgowainy, Amgad [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-09-01

    This report serves as an update for the life-cycle analysis (LCA) of aluminum production based on the most recent data representing the state-of-the-art of the industry in North America. The 2013 Aluminum Association (AA) LCA report on the environmental footprint of semifinished aluminum products in North America provides the basis for the update (The Aluminum Association, 2013). The scope of this study covers primary aluminum production, secondary aluminum production, as well as aluminum semi-fabrication processes including hot rolling, cold rolling, extrusion and shape casting. This report focuses on energy consumptions, material inputs and criteria air pollutant emissions for each process from the cradle-to-gate of aluminum, which starts from bauxite extraction, and ends with manufacturing of semi-fabricated aluminum products. The life-cycle inventory (LCI) tables compiled are to be incorporated into the vehicle cycle model of Argonne National Laboratory’s Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation (GREET) Model for the release of its 2015 version.

  2. Update of structural models at SFR nuclear waste repository, Forsmark, Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Axelsson, C.L.; Maersk Hansen, L. [Golder Associates AB (Sweden)

    1997-12-01

    The final repository for low- and medium-level waste, SFR, is located below the Baltic, off Forsmark. A number of geo-scientific investigations have been performed and used to design a conceptual model of the fracture system, to be used in hydraulic modeling for a performance assessment study of the SFR facility in 1987. An updated study was reported in 1993. No formal basic revision of the original conceptual model of the fracture system around SFR has so far been made. During review, uncertainties in the model of the fracture system were found. The previous local structure model is reviewed and an alternative model is presented together with evidence for the new interpretation. The model is based on review of geophysical data, geological mapping, core logs, hydraulic testing, water inflow, etc. The fact that two different models can result from the same data represents an interpretation uncertainty which cannot be resolved without more data and basic interpretations of such data. Further refinement of the structure model could only be motivated in case the two different models discussed here would lead to significantly different consequences. 20 refs, 24 figs, 16 tabs

  3. SCRI acrylamide project update

    Science.gov (United States)

    The US potato industry, with $3.5 billion in raw product value, identified acrylamide as its number one research funding priority in 2010 because of potential health concerns related to the presence of acrylamide in potato products. Acrylamide is present in many carbohydrate-rich foods processed at ...

  4. An Updated Analytical Structural Pounding Force Model Based on Viscoelasticity of Materials

    Directory of Open Access Journals (Sweden)

    Qichao Xue

    2016-01-01

    Full Text Available Based on a summary of existing analytical pounding force models, an updated pounding force analysis method is proposed by introducing a viscoelastic constitutive model and contact mechanics methods. The traditional Kelvin viscoelastic pounding force model can be expanded to a 3-parameter linear viscoelastic model by separating the classic pounding model parameters into geometry parameters and viscoelastic material parameters. Two existing pounding examples, steel-to-steel and concrete-to-concrete pounding, are recalculated using the proposed method. The calculation results are then compared with other pounding force models and show a certain accuracy in the proposed model. The relative normalized errors of the steel-to-steel and concrete-to-concrete experiments are 19.8% and 12.5%, respectively. Furthermore, a steel-to-polymer pounding example is calculated, and the application of the proposed method in vibration control analysis for a pounding tuned mass damper (TMD) is simulated consequently. However, due to insufficient experimental details, the proposed model can only give a rough trend for both the single pounding process and the vibration control process. Despite the promising prospects, the study in this paper is only the first step of pounding force calculation. It still needs a more careful assessment of the model performance, especially in the presence of inelastic response.
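    The Kelvin viscoelastic pounding model that this line of work extends combines a linear contact spring and a dashpot, active only while the colliding structures overlap. A minimal sketch follows; the stiffness and damping values are illustrative, not the paper's identified parameters.

    ```python
    # Kelvin (Kelvin-Voigt) pounding force: spring + dashpot during contact only.
    # Parameter values are illustrative stand-ins, not from the paper.
    k = 2.0e6    # contact stiffness [N/m]
    c = 1.0e3    # contact damping  [N*s/m]

    def pounding_force(delta, delta_dot):
        """delta: penetration depth [m] (>0 means contact); delta_dot: its rate [m/s]."""
        if delta <= 0.0:
            return 0.0                      # structures separated: no force
        f = k * delta + c * delta_dot       # Kelvin model: elastic + viscous term
        return max(f, 0.0)                  # no tensile contact force at restitution

    print(pounding_force(0.001, 0.1))       # loading phase -> 2100.0 N
    ```

    The 3-parameter extension in the paper replaces this single spring-dashpot pair with a viscoelastic network whose material and geometry parameters are identified separately.
    
    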

  5. A sow replacement model using Bayesian updating in a three-level hierarchic Markov process. I. Biological model

    DEFF Research Database (Denmark)

    Kristensen, Anders Ringgaard; Søllested, Thomas Algot

    2004-01-01

    Several replacement models have been presented in the literature. In other application areas, such as dairy cow replacement, various methodological improvements like hierarchical Markov processes and Bayesian updating have been implemented, but not in sow models. Furthermore, there are methodological improvements like multi-level hierarchical Markov processes with decisions on multiple time scales, efficient methods for parameter estimation at herd level, and standard software that have hardly been implemented at all in any replacement model. The aim of this study is to present a sow replacement model that really uses all these methodological improvements. In this paper, the biological model describing the performance and feed intake of sows is presented. In particular, estimation of herd-specific parameters is emphasized. The optimization model is described in a subsequent paper.

  6. Uncertainty quantification of voice signal production mechanical model and experimental updating

    Science.gov (United States)

    Cataldo, E.; Soize, C.; Sampaio, R.

    2013-11-01

    The aim of this paper is to analyze uncertainty quantification in a voice production mechanical model and to update the probability density function corresponding to the tension parameter using the Bayes method and experimental data. Three parameters are considered uncertain in the voice production mechanical model used: the tension parameter, the neutral glottal area and the subglottal pressure. The tension parameter of the vocal folds is mainly responsible for changes in the fundamental frequency of a voice signal generated by a mechanical/mathematical model for producing voiced sounds. The three uncertain parameters are modeled by random variables. The probability density function related to the tension parameter is considered uniform, and the probability density functions related to the neutral glottal area and the subglottal pressure are constructed using the Maximum Entropy Principle. The output of the stochastic computational model is the random voice signal, and the Monte Carlo method is used to solve the stochastic equations, allowing realizations of the random voice signals to be generated. For each realization of the random voice signal, the corresponding realization of the random fundamental frequency is calculated, and the prior pdf of this random fundamental frequency is then estimated. Experimental data are available for the fundamental frequency, and the posterior probability density function of the random tension parameter is then estimated using the Bayes method. In addition, an application is performed considering a case with a pathology in the vocal folds. The strategy developed here is important mainly for two reasons. The first is the possibility of updating the probability density function of a parameter, the tension parameter of the vocal folds, which cannot be measured directly, and the second is the construction of the likelihood function. In general, it is predefined using the known pdf. Here, it is
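    The Bayes-by-Monte-Carlo procedure the abstract describes can be sketched with a toy forward model. The square-root frequency law and all numeric values below are assumptions for illustration, not the paper's two-mass vocal-fold model or its data:

    ```python
    import random, math

    random.seed(1)

    # Hypothetical forward model: fundamental frequency grows with the
    # square root of the dimensionless tension parameter q.
    def f0_model(q):
        return 100.0 * math.sqrt(q)          # [Hz], illustrative scaling

    # Uniform prior on q, as in the paper; sampled by Monte Carlo.
    prior = [random.uniform(0.5, 2.0) for _ in range(20000)]

    f0_measured, sigma = 130.0, 5.0          # hypothetical experimental data

    # Bayes by importance weighting: w_i proportional to the Gaussian
    # likelihood of the measured fundamental frequency given q_i.
    weights = [math.exp(-0.5 * ((f0_model(q) - f0_measured) / sigma) ** 2)
               for q in prior]
    total = sum(weights)
    q_post_mean = sum(w * q for w, q in zip(weights, prior)) / total
    ```

    With these toy numbers the posterior mean lands near (130/100)² ≈ 1.69, i.e. the prior is sharpened around the tension value consistent with the measured pitch, which is the essence of the paper's updating step.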

  7. A decision model for energy companies that sorts projects, classifies the project manager and recommends the final match between project and project manager

    Directory of Open Access Journals (Sweden)

    Elaine Cristina Batista de Oliveira

    2016-03-01

    Full Text Available Abstract This study presents an integrated model to support the process of classifying projects and selecting project managers for these projects in accordance with their characteristics and skills, using a multiple criteria decision aid (MCDA) approach. Such criteria are often conflicting. The model also supports the process of allocating project managers to projects by evaluating the characteristics/types of projects. The framework consists of a set of structured techniques and methods that are deemed very appropriate within the context of project management. A practical application of the proposed model was performed in a Brazilian electric energy company, which has a portfolio of projects that are specifically related to the company's defined strategic plan. As a result, it was possible to classify the projects and project managers into definable categories, thus enabling more effective management, as different projects require different levels of skills and abilities.

  8. A Traffic Information Estimation Model Using Periodic Location Update Events from Cellular Network

    Science.gov (United States)

    Lin, Bon-Yeh; Chen, Chi-Hua; Lo, Chi-Chun

    In recent years, considerable interest has arisen in building Intelligent Transportation Systems (ITS), which focus on efficiently managing the road network. One of the important purposes of ITS is to improve the usability of transportation resources so as to extend vehicle durability and reduce fuel consumption and transportation times. Before this goal can be achieved, it is vital to obtain correct and real-time traffic information, so that traffic information services can be provided in a timely and effective manner. Using Mobile Stations (MS) as probes to track vehicle movement is a low-cost and immediate solution for obtaining real-time traffic information. In this paper, we propose a model to analyze the relation between the number of Periodic Location Update (PLU) events and traffic density. Finally, numerical analysis shows that this model is feasible for estimating traffic density.
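    The basic PLU-to-density relation can be sketched as below. The counting model (one PLU per handset per fixed period, a fixed handsets-per-vehicle ratio) is an assumption for illustration, not the paper's estimator:

    ```python
    def estimate_density(n_plu, window_s, plu_period_s, ms_per_vehicle, road_km):
        """Rough traffic-density estimate from Periodic Location Update counts.

        Hypothetical model: every MS in the cell sends one PLU every
        plu_period_s seconds, so each vehicle contributes
        (window_s / plu_period_s) * ms_per_vehicle events during the window.
        """
        events_per_ms = window_s / plu_period_s
        vehicles = n_plu / (events_per_ms * ms_per_vehicle)
        return vehicles / road_km           # vehicles per km

    # Example: 1800 PLU events in one hour, PLU period 3600 s,
    # 1.2 handsets per vehicle, 5 km of covered road
    density = estimate_density(1800, 3600.0, 3600.0, 1.2, 5.0)  # 300 veh/km
    ```

    In practice the PLU period, handset penetration, and cell-to-road mapping all carry uncertainty, which is exactly why the paper models the relation statistically rather than using a fixed ratio.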

  9. Latest cosmological constraints on Cardassian expansion models including the updated gamma-ray bursts

    Institute of Scientific and Technical Information of China (English)

    Nan Liang; Pu-Xun Wua; Zong-Hong Zhu

    2011-01-01

    We constrain the Cardassian expansion models from the latest observations, including the updated Gamma-ray bursts (GRBs), which are calibrated using a cosmology-independent method from the Union2 compilation of type Ia supernovae (SNe Ia). By combining the GRB data with the joint observations from the Union2 SNe Ia set, along with the results from the Cosmic Microwave Background radiation observation from the seven-year Wilkinson Microwave Anisotropy Probe and the baryonic acoustic oscillation observation galaxy sample from the spectroscopic Sloan Digital Sky Survey Data Release, we find significant constraints on the model parameters: ΩM0 = 0.282^{+0.015}_{-0.014}, n = 0.03^{+0.05}_{-0.05} for the original Cardassian model; and n = -0.16^{+0.25}_{-3.26}, β = -0.76^{+0.34}_{-0.58} for the modified polytropic Cardassian model, which are consistent with the ΛCDM model in a 1-σ confidence region. From the reconstruction of the deceleration parameter q(z) in Cardassian models, we obtain the transition redshift zT = 0.73 ± 0.04 for the original Cardassian model and zT = 0.68 ± 0.04 for the modified polytropic Cardassian model.

  10. Updating the Finite Element Model of the Aerostructures Test Wing using Ground Vibration Test Data

    Science.gov (United States)

    Lung, Shun-fat; Pak, Chan-gi

    2009-01-01

    Improved and/or accelerated decision making is a crucial step during flutter certification processes. Unfortunately, most finite element structural dynamics models have uncertainties associated with model validity. Tuning the finite element model using measured data to minimize the model uncertainties is a challenging task in the area of structural dynamics. The model tuning process requires not only satisfactory correlations between analytical and experimental results, but also the retention of the mass and stiffness properties of the structures. Minimizing the difference between analytical and experimental results is a type of optimization problem. By utilizing a multidisciplinary design, analysis, and optimization (MDAO) tool to optimize the objective function and constraints, the mass properties, the natural frequencies, and the mode shapes can be matched to the target data while retaining mass matrix orthogonality. This approach has been applied to minimize the model uncertainties for the structural dynamics model of the Aerostructures Test Wing (ATW), which was designed and tested at the National Aeronautics and Space Administration (NASA) Dryden Flight Research Center (DFRC) (Edwards, California). This study has shown that natural frequencies and corresponding mode shapes from the updated finite element model are in excellent agreement with the corresponding measured data.
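    The "model tuning as optimization" idea can be sketched on a toy 2-DOF spring-mass chain in place of a full finite element model; the stiffness values and the grid-search optimizer are illustrative assumptions, not the MDAO tool used in the study:

    ```python
    import numpy as np

    M = np.diag([1.0, 1.0])                      # mass matrix [kg], held fixed

    def natural_freqs(k):
        """Natural frequencies [rad/s] of a 2-DOF chain with both springs
        of stiffness k (a stand-in for the FEM stiffness parameter)."""
        K = np.array([[2.0 * k, -k], [-k, k]])
        lam = np.linalg.eigvalsh(np.linalg.inv(M) @ K)  # symmetric: M is I here
        return np.sqrt(lam)

    # Pretend ground-vibration-test data, generated with k_true = 1.3e4 N/m.
    f_measured = natural_freqs(1.3e4)

    # Model updating as optimization: choose k minimizing frequency mismatch.
    candidates = np.linspace(1e3, 1e5, 20001)
    errors = [np.sum((natural_freqs(k) - f_measured) ** 2) for k in candidates]
    k_updated = candidates[int(np.argmin(errors))]
    ```

    A real update would also constrain mode-shape correlation (e.g. MAC values) and mass-matrix orthogonality, as the abstract notes, rather than matching frequencies alone.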

  11. Development of a simple model for batch and boundary information updation for a similar ship's block model

    Institute of Scientific and Technical Information of China (English)

    LEE Hyeon-deok; SON Myeong-jo; OH Min-jae; LEE Hyung-woo; KIM Tae-wan

    2012-01-01

    In early 2000, large domestic shipyards introduced shipbuilding 3D computer-aided design (CAD) to the hull production design process to define manufacturing and assembly information. The production design process accounts for most of the man-hours (M/H) of the entire design process and is closely connected to yard production, because designs must take into account the production schedule of the shipyard, the current state of the dock needed to mount the ship's block, and supply information. Therefore, many shipyards are investigating the complete automation of the production design process to reduce the M/H for designers. However, these problems are still unresolved, and a clear direction is needed for research on automatic design based on manufacturing rules, batches reflecting changed building specifications, batch updates of boundary information for hull members, and management of the hull model change history, in order to automate the production design process. In this study, a process was developed to aid production design engineers in designing a new ship's hull block model from that of a similar ship previously built, based on AVEVA Marine. An automation system that uses the similar ship's hull block model is proposed to reduce M/H and human errors by the production design engineer. First, scheme files holding important information were constructed in a database to automatically update hull block model modifications. Second, for batch updates, the database's tables, including building specifications, and the referential integrity of the relational database were compared. In particular, this study focused on reflecting frequent modifications of building specifications and regenerating the boundary information of adjacent panels due to changes in a specific panel. Third, a rollback function is proposed in which the database (DB) is used to return to the previously designed panels.
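    The rollback idea (undoing a failed batch update by returning to the previously stored panel revision) can be sketched with a plain relational database transaction. The schema and values are hypothetical, not AVEVA Marine's data model:

    ```python
    import sqlite3

    # Toy versioned panel table; one transaction per batch update.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE panel (name TEXT, rev INTEGER, spec TEXT)")
    con.execute("INSERT INTO panel VALUES ('P100', 1, 'thickness=12mm')")
    con.commit()

    try:
        with con:  # commits on success, rolls back on exception
            con.execute("UPDATE panel SET rev = 2, spec = 'thickness=14mm' "
                        "WHERE name = 'P100'")
            # Simulate a failure partway through the batch, e.g. boundary
            # regeneration of an adjacent panel not completing:
            raise RuntimeError("boundary regeneration failed")
    except RuntimeError:
        pass  # the transaction was rolled back by the context manager

    rev = con.execute("SELECT rev FROM panel WHERE name='P100'").fetchone()[0]
    # rev is still 1: the partial batch change did not survive
    ```

    Wrapping each batch update in a single transaction is one standard way to get the "return to previously designed panels" behavior the abstract describes.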

  12. Project 2010 For Dummies

    CERN Document Server

    Muir, Nancy C

    2010-01-01

    A friendly reference guide to Microsoft Project, the leading enterprise project management software. As project management software, Microsoft Project allows you to oversee your business activities effectively. You can manage resources, share project info, perform modeling and scenario analysis, and standardize reporting processes. This easy-to-understand guide is completely updated to cover the latest changes and newest enhancements to Project 2010 and shows you how to get Project 2010 to work for you. After an introduction to basic project management concepts, you'll discover the mechanics o

  13. Alaska Regional Energy Resources Planning Project, Phase 2: coal, hydroelectric, and energy alternatives. Volume III. Alaska's alternative energies and regional assessment inventory update

    Energy Technology Data Exchange (ETDEWEB)

    1979-01-01

    The Alaska Regional Energy Resources Planning Project is presented in three volumes. This volume, Vol. III, considers alternative energies and the regional assessment inventory update. The introductory chapter, Chapter 12, examines the historical background, current technological status, environmental impact, applicability to Alaska, and siting considerations for a number of alternative systems. All of the systems considered use or could use renewable energy resources. The chapters that follow are entitled: Very Small Hydropower (about 12 kW or less for rural and remote villages); Low-Temperature Geothermal Space Heating; Wind; Fuel Cells; Siting Criteria and Preliminary Screening of Communities for Alternate Energy Use; Wood Residues; Waste Heat; and Regional Assessment Inventory Update. (MCW)

  14. External validation and updating of a Dutch prediction model for low hemoglobin deferral in Irish whole blood donors

    NARCIS (Netherlands)

    Baart, A.M.; Atsma, F.; McSweeney, E.N.; Moons, K.G.; Vergouwe, Y.; Kort, W.L. de

    2014-01-01

    BACKGROUND: Recently, sex-specific prediction models for low hemoglobin (Hb) deferral have been developed in Dutch whole blood donors. In the present study, we validated and updated the models in a cohort of Irish whole blood donors. STUDY DESIGN AND METHODS: Prospectively collected data from 45,031

  15. Projections of global changes in precipitation extremes from Coupled Model Intercomparison Project Phase 5 models

    NARCIS (Netherlands)

    Toreti, A.; Naveau, P.; Zampieri, M.; Schindler, A.; Scoccimarro, E.; Xoplaki, E.; Dijkstra, H.A.|info:eu-repo/dai/nl/073504467; Gualdi, S.; Luterbacher, J.

    2013-01-01

    Precipitation extremes are expected to increase in a warming climate; thus, it is essential to characterize their potential future changes. Here we evaluate eight high-resolution global climate model simulations in the twentieth century and provide new evidence on projected global precipitation

  17. Updated cloud physics improve the modelled near surface climate of Antarctica of a regional atmospheric climate model

    Directory of Open Access Journals (Sweden)

    J. M. van Wessem

    2013-07-01

    Full Text Available The physics package of the polar version of the regional atmospheric climate model RACMO2 has been updated from RACMO2.1 to RACMO2.3. The update constitutes, amongst others, the inclusion of a parameterization for cloud ice super-saturation, an improved turbulent and radiative flux scheme and a changed cloud scheme. In this study the effects of these changes on the modelled near-surface climate of Antarctica are presented. Significant biases remain, but overall RACMO2.3 better represents the near-surface climate in terms of the modelled surface energy balance, based on a comparison with > 750 months of data from nine automatic weather stations located in East Antarctica. Especially the representation of the sensible heat flux and net longwave radiative flux has improved with a decrease in biases of up to 40 %. These improvements are mainly caused by the inclusion of ice super-saturation, which has led to more moisture being transported onto the continent, resulting in more and optically thicker clouds and more downward longwave radiation. As a result, modelled surface temperatures have increased and the bias, when compared to 10 m snow temperatures from 64 ice core observations, has decreased from −2.3 K to −1.3 K. The weaker surface temperature inversion consequently improves the representation of the sensible heat flux, whereas wind speed remains unchanged.

  18. The sigma model on complex projective superspaces

    Energy Technology Data Exchange (ETDEWEB)

    Candu, Constantin; Mitev, Vladimir; Schomerus, Volker [DESY, Hamburg (Germany). Theory Group; Quella, Thomas [Amsterdam Univ. (Netherlands). Inst. for Theoretical Physics; Saleur, Hubert [CEA Saclay, 91 - Gif-sur-Yvette (France). Inst. de Physique Theorique; USC, Los Angeles, CA (United States). Physics Dept.

    2009-08-15

    The sigma model on projective superspaces CP^(S-1|S) gives rise to a continuous family of interacting 2D conformal field theories which are parametrized by the curvature radius R and the theta angle θ. Our main goal is to determine the spectrum of the model, non-perturbatively, as a function of both parameters. We succeed in doing so for all open boundary conditions preserving the full global symmetry of the model. In string theory parlance, these correspond to volume-filling branes that are equipped with a monopole line bundle and connection. The paper consists of two parts. In the first part, we approach the problem within the continuum formulation. Combining combinatorial arguments with perturbative studies and some simple free field calculations, we determine a closed formula for the partition function of the theory. This is then tested numerically in the second part. There we propose a spin chain regularization of the CP^(S-1|S) model with open boundary conditions and use it to determine the spectrum at the conformal fixed point. The numerical results are in remarkable agreement with the continuum analysis. (orig.)

  19. Bayesian updating in a fault tree model for shipwreck risk assessment.

    Science.gov (United States)

    Landquist, H; Rosén, L; Lindhe, A; Norberg, T; Hassellöv, I-M

    2017-03-14

    Shipwrecks containing oil and other hazardous substances have been deteriorating on the seabeds of the world for many years and are threatening to pollute the marine environment. The status of the wrecks and the potential volume of harmful substances present in the wrecks are affected by a multitude of uncertainties. Each shipwreck poses a unique threat, the nature of which is determined by the structural status of the wreck and possible damage resulting from hazardous activities that could potentially cause a discharge. Decision support is required to ensure the efficiency of the prioritisation process and the allocation of resources required to carry out risk mitigation measures. Whilst risk assessments can provide the requisite decision support, comprehensive methods that take into account key uncertainties related to shipwrecks are limited. The aim of this paper was to develop a method for estimating the probability of discharge of hazardous substances from shipwrecks. The method is based on Bayesian updating of generic information on the hazards posed by different activities in the surroundings of the wreck, with information on site-specific and wreck-specific conditions in a fault tree model. Bayesian updating is performed using Monte Carlo simulations for estimating the probability of a discharge of hazardous substances and formal handling of intrinsic uncertainties. An example application involving two wrecks located off the Swedish coast is presented. Results show the estimated probability of opening, discharge and volume of the discharge for the two wrecks and illustrate the capability of the model to provide decision support. Together with consequence estimations of a discharge of hazardous substances, the suggested model enables comprehensive and probabilistic risk assessments of shipwrecks to be made.
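    The core of the method (conjugate Bayesian updating of generic hazard probabilities with wreck-specific evidence, then Monte Carlo propagation through the fault tree) can be sketched as follows. The Beta priors, the observation counts, and the two-gate tree are all hypothetical stand-ins for the paper's full model:

    ```python
    import random

    random.seed(0)

    # Generic knowledge: probability that a hazardous activity (e.g. a
    # trawling pass) breaches the hull, encoded as Beta(2, 18) (hypothetical).
    a, b = 2, 18

    # Wreck-specific evidence: say 1 hull breach in 5 surveyed incidents.
    # Beta-Binomial conjugacy gives the Bayesian update in closed form:
    successes, trials = 1, 5
    a_post, b_post = a + successes, b + trials - successes

    # Monte Carlo propagation of the fault tree's top event (discharge):
    # discharge requires damage AND hazardous substance still on board.
    N = 100_000
    discharges = 0
    for _ in range(N):
        p_damage = random.betavariate(a_post, b_post)
        p_contains_oil = random.uniform(0.6, 0.9)   # cargo-volume uncertainty
        discharges += random.random() < p_damage * p_contains_oil

    p_discharge = discharges / N   # roughly 0.12 * 0.75 ≈ 0.09 here
    ```

    Real fault trees combine many such basic events through AND/OR gates, and the same sampling loop then yields full probability distributions for opening, discharge, and discharged volume, as in the paper's two-wreck application.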

  20. Modelling in Evaluating a Working Life Project in Higher Education

    Science.gov (United States)

    Sarja, Anneli; Janhonen, Sirpa; Havukainen, Pirjo; Vesterinen, Anne

    2012-01-01

    This article describes an evaluation method based on collaboration between higher education, a care home and a university, in an R&D project. The aim of the project was to elaborate modelling as a tool of developmental evaluation for innovation and competence in project cooperation. The approach was based on activity theory. Modelling enabled a…

  1. Pataha Creek Model Watershed : 1998 Habitat Conservation Projects.

    Energy Technology Data Exchange (ETDEWEB)

    Bartels, Duane G.

    1999-12-01

    The projects outlined in detail on the attached project reports are a few of the many projects implemented in the Pataha Creek Model Watershed since it was selected as a model in 1993. 1998 was a year where a focused effort was made to work on the upland conservation practices to reduce the sedimentation into Pataha Creek.

  2. The Digital Astronaut Project Bone Remodeling Model

    Science.gov (United States)

    Pennline, James A.; Mulugeta, Lealem; Lewandowski, Beth E.; Thompson, William K.; Sibonga, Jean D.

    2014-01-01

    Under the conditions of microgravity, astronauts lose bone mass at a rate of 1% to 2% a month, particularly in the lower extremities such as the proximal femur (1). The most commonly used countermeasure against bone loss has been prescribed exercise (2). However, current exercise countermeasures do not completely eliminate bone loss in long duration, 4 to 6 months, spaceflight (3,4), leaving the astronaut susceptible to early onset osteoporosis and a greater risk of fracture later in their lives. The introduction of the Advanced Resistive Exercise Device, coupled with improved nutrition, has further minimized the 4 to 6 month bone loss, but further work is needed to implement optimal exercise prescriptions (5). In this light, NASA's Digital Astronaut Project (DAP) is working with NASA physiologists to implement well-validated computational models that can help understand the mechanisms of bone demineralization in microgravity, and enhance exercise countermeasure development.
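    The headline 1-2% per month figure compounds over a mission; a minimal sketch of that arithmetic (a constant-rate assumption, far simpler than the DAP remodeling model) is:

    ```python
    def bone_mass_remaining(months, monthly_loss=0.015):
        """Fraction of initial bone mineral density left after `months`
        in microgravity, assuming a constant compounding loss rate
        (1.5%/month, the middle of the 1-2% range cited above)."""
        return (1.0 - monthly_loss) ** months

    # A 6-month mission at 1.5%/month leaves roughly 91% of initial BMD.
    six_month_fraction = bone_mass_remaining(6)
    ```

    The DAP model instead tracks the underlying remodeling dynamics (osteoclast/osteoblast activity and mechanical loading), which is what allows exercise countermeasures to be evaluated rather than just extrapolating a fixed rate.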

  3. Towards a neural basis of music perception -- A review and updated model

    Directory of Open Access Journals (Sweden)

    Stefan eKoelsch

    2011-06-01

    Full Text Available Music perception involves acoustic analysis, auditory memory, auditory scene analysis, processing of interval relations, of musical syntax and semantics, and activation of (pre)motor representations of actions. Moreover, music perception potentially elicits emotions, thus giving rise to the modulation of emotional effector systems such as the subjective feeling system, the autonomic nervous system, the hormonal, and the immune system. Building on a previous article (Koelsch & Siebel, 2005), this review presents an updated model of music perception and its neural correlates. The article describes processes involved in music perception, and reports EEG and fMRI studies that inform about the time course of these processes, as well as about where in the brain these processes might be located.

  4. [Social determinants of health and disability: updating the model for determination].

    Science.gov (United States)

    Tamayo, Mauro; Besoaín, Álvaro; Rebolledo, Jame

    2017-03-05

    Social determinants of health (SDH) are conditions in which people live. These conditions impact their lives, health status and social inclusion level. In line with the conceptual and comprehensive progression of disability, it is important to update SDH due to their broad implications in implementing health interventions in society. This proposal supports incorporating disability in the model as a structural determinant, as it would lead to the same social inclusion/exclusion of people described in other structural SDH. This proposal encourages giving importance to designing and implementing public policies to improve societal conditions and contribute to social equity. This will be an act of reparation, justice and fulfilment with the Convention on the Rights of Persons with Disabilities.

  5. Toward a Neural Basis of Music Perception – A Review and Updated Model

    Science.gov (United States)

    Koelsch, Stefan

    2011-01-01

    Music perception involves acoustic analysis, auditory memory, auditory scene analysis, processing of interval relations, of musical syntax and semantics, and activation of (pre)motor representations of actions. Moreover, music perception potentially elicits emotions, thus giving rise to the modulation of emotional effector systems such as the subjective feeling system, the autonomic nervous system, the hormonal, and the immune system. Building on a previous article (Koelsch and Siebel, 2005), this review presents an updated model of music perception and its neural correlates. The article describes processes involved in music perception, and reports EEG and fMRI studies that inform about the time course of these processes, as well as about where in the brain these processes might be located. PMID:21713060

  6. Integration of intraoperative and model-updated images into an industry-standard neuronavigation system: initial results

    Science.gov (United States)

    Schaewe, Timothy J.; Fan, Xiaoyao; Ji, Songbai; Hartov, Alex; Hiemenz Holton, Leslie; Roberts, David W.; Paulsen, Keith D.; Simon, David A.

    2013-03-01

    Dartmouth and Medtronic have established an academic-industrial partnership to develop, validate, and evaluate a multimodality neurosurgical image-guidance platform for brain tumor resection surgery that is capable of updating the spatial relationships between preoperative images and the current surgical field. Previous studies have shown that brain shift compensation through a modeling framework using intraoperative ultrasound and/or visible light stereovision to update preoperative MRI appears to result in improved accuracy in navigation. However, image updates have thus far only been produced retrospective to surgery in large part because of gaps in the software integration and information flow between the co-registration and tracking, image acquisition and processing, and image warping tasks which are required during a case. This paper reports the first demonstration of integration of a deformation-based image updating process for brain shift modeling with an industry-standard image guided surgery platform. Specifically, we have completed the first and most critical data transfer operation to transmit volumetric image data generated by the Dartmouth brain shift modeling process to the Medtronic StealthStation® system. StealthStation® comparison views, which allow the surgeon to verify the correspondence of the received updated image volume relative to the preoperative MRI, are presented, along with other displays of image data such as the intraoperative 3D ultrasound used to update the model. These views and data represent the first time that externally acquired and manipulated image data has been imported into the StealthStation® system through the StealthLink® portal and visualized on the StealthStation® display.

  7. Update on Small Modular Reactors Dynamics System Modeling Tool -- Molten Salt Cooled Architecture

    Energy Technology Data Exchange (ETDEWEB)

    Hale, Richard Edward [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Cetiner, Sacit M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Fugate, David L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Qualls, A L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Borum, Robert C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Chaleff, Ethan S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Rogerson, Doug W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Batteh, John J. [Modelon Corporation (Sweden); Tiller, Michael M. [Xogeny Corporation, Canton, MI (United States)

    2014-08-01

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the third year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled) concepts, including the use of multiple coupled reactors at a single site. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor SMR models, ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface (ICHMI) technical area, and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the Modular Dynamic SIMulation (MoDSIM) tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the program, (2) developing a library of baseline component modules that can be assembled into full plant models using existing geometry and thermal-hydraulic data, (3) defining modeling conventions for interconnecting component models, and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  8. Experimental test of spatial updating models for monkey eye-head gaze shifts.

    Directory of Open Access Journals (Sweden)

    Tom J Van Grootel

    Full Text Available How the brain maintains an accurate and stable representation of visual target locations despite the occurrence of saccadic gaze shifts is a classical problem in oculomotor research. Here we test and dissociate the predictions of different conceptual models for head-unrestrained gaze-localization behavior of macaque monkeys. We adopted the double-step paradigm with rapid eye-head gaze shifts to measure localization accuracy in response to flashed visual stimuli in darkness. We presented the second target flash either before (static), or during (dynamic), the first gaze displacement. In the dynamic case the brief visual flash induced a small retinal streak of up to about 20 deg at an unpredictable moment and retinal location during the eye-head gaze shift, which provides serious challenges for the gaze-control system. However, for both stimulus conditions, monkeys localized the flashed targets with accurate gaze shifts, which rules out several models of visuomotor control. First, these findings exclude the possibility that gaze-shift programming relies on retinal inputs only. Instead, they support the notion that accurate eye-head motor feedback updates the gaze-saccade coordinates. Second, in dynamic trials the visuomotor system cannot rely on the coordinates of the planned first eye-head saccade either, which rules out remapping on the basis of a predictive corollary gaze-displacement signal. Finally, because gaze-related head movements were also goal-directed, requiring continuous access to eye-in-head position, we propose that our results best support a dynamic feedback scheme for spatial updating in which visuomotor control incorporates accurate signals about instantaneous eye- and head positions rather than relative eye- and head displacements.
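    The motor-feedback updating scheme the study supports reduces, in its simplest vector form, to subtracting the actual gaze displacement from the remembered retinal location. This two-line sketch is an illustration of that principle only, not the paper's dynamic feedback model:

    ```python
    def update_target(target_retinal, gaze_displacement):
        """Spatial updating by motor feedback, simplified to 2-D vectors
        (horizontal, vertical, in deg): after an intervening eye-head gaze
        shift, the goal for the next saccade is the original retinal
        location minus the gaze displacement that actually occurred."""
        return (target_retinal[0] - gaze_displacement[0],
                target_retinal[1] - gaze_displacement[1])

    # Target flashed 10 deg right of the fovea; gaze then shifts 25 deg
    # right, so the second saccade must go 15 deg left: (-15, 0).
    goal = update_target((10.0, 0.0), (25.0, 0.0))
    ```

    The dynamic-trial result implies the brain uses the measured (feedback) displacement here, not the planned one, since the two differ when the flash arrives mid-movement.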

  9. Slab2 - Providing updated subduction zone geometries and modeling tools to the community

    Science.gov (United States)

    Hayes, G. P.; Hearne, M. G.; Portner, D. E.; Borjas, C.; Moore, G.; Flamme, H.

    2015-12-01

    The U.S. Geological Survey database of global subduction zone geometries (Slab1.0) combines a variety of geophysical data sets (earthquake hypocenters, moment tensors, active source seismic survey images of the shallow subduction zone, bathymetry, trench locations, and sediment thickness information) to image the shape of subducting slabs in three dimensions, at approximately 85% of the world's convergent margins. The database is used extensively for a variety of purposes, from earthquake source imaging, to magnetotelluric modeling. Gaps in Slab1.0 exist where input data are sparse and/or where slabs are geometrically complex (and difficult to image with an automated approach). Slab1.0 also does not include information on the uncertainty in the modeled geometrical parameters, or the input data used to image them, and provides no means to reproduce the models it described. Currently underway, Slab2 will update and replace Slab1.0 by: (1) extending modeled slab geometries to all global subduction zones; (2) incorporating regional data sets that may describe slab geometry in finer detail than do previously used teleseismic data; (3) providing information on the uncertainties in each modeled slab surface; (4) modifying our modeling approach to a fully-three dimensional data interpolation, rather than following the 2-D to 3-D steps of Slab1.0; (5) migrating the slab modeling code base to a more universally distributable language, Python; and (6) providing the code base and input data we use to create our models, such that the community can both reproduce the slab geometries, and add their own data sets to ours to further improve upon those models in the future. In this presentation we describe our vision for Slab2, and the first results of this modeling process.

  10. Updating sea spray aerosol emissions in the Community Multiscale Air Quality (CMAQ) model version 5.0.2

    Directory of Open Access Journals (Sweden)

    B. Gantt

    2015-05-01

    Full Text Available Sea spray aerosols (SSA) impact the particle mass concentration and gas-particle partitioning in coastal environments, with implications for human and ecosystem health. Despite their importance, the emission magnitude of SSA remains highly uncertain, with global estimates varying by nearly two orders of magnitude. In this study, the Community Multiscale Air Quality (CMAQ) model was updated to enhance fine-mode SSA emissions, include a sea surface temperature (SST) dependency, and reduce coastally enhanced emissions. Predictions from the updated CMAQ model and those of the previous release version, CMAQv5.0.2, were evaluated using several regional and national observational datasets in the continental US. The updated emissions generally reduced model underestimates of sodium, chloride, and nitrate surface concentrations for an inland site of the Bay Regional Atmospheric Chemistry Experiment (BRACE) near Tampa, Florida. Adding the SST dependency to the SSA emission parameterization led to increased sodium concentrations in the southeast US and decreased concentrations along parts of the Pacific coast and northeastern US. The influence of sodium on the gas-particle partitioning of nitrate resulted in higher nitrate particle concentrations in many coastal urban areas due to increased condensation of nitric acid in the updated simulations, potentially affecting the predicted nitrogen deposition in sensitive ecosystems. Application of the updated SSA emissions to the California Research at the Nexus of Air Quality and Climate Change (CalNex) study period resulted in modest improvement in the predicted surface concentrations of sodium and nitrate at several central and southern California coastal sites. This SSA emission update enabled a more realistic simulation of the atmospheric chemistry in environments where marine air mixes with urban pollution.
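    A hedged sketch of what an SST-modulated sea spray source function can look like: the wind-speed exponent follows a common Monahan-type whitecap dependence and the SST polynomial is illustrative, not CMAQ's actual parameterization; all coefficients are assumptions.

```python
# Illustrative SST scaling of a sea spray emission flux (arbitrary units).
# Emissions rise with 10 m wind speed (~u10**3.41, Monahan-type) and are
# modulated by a polynomial factor in sea surface temperature (deg C).
def ssa_emission(u10, sst_c, base=1.0e-3):
    wind_term = u10 ** 3.41                       # whitecap-driven dependence
    sst_factor = 0.3 + 0.1 * sst_c - 0.0076 * sst_c**2 + 0.00021 * sst_c**3
    return base * wind_term * max(0.0, sst_factor)

cold, warm = ssa_emission(8.0, 5.0), ssa_emission(8.0, 25.0)
print(cold < warm)  # warmer water emits more in this toy parameterisation
```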

  11. Projective Market Model Approach to AHP Decision-Making

    CERN Document Server

    Szczypinska, Anna

    2007-01-01

    In this paper we describe the market in the language of projective geometry and define a matrix of market rates, which is related to the matrix rate of return and the matrix of judgements in the Analytic Hierarchy Process (AHP). We use these observations to extend the AHP model to the projective geometry formalism and generalise it to the intransitive case. We give financial interpretations of such a generalised model in the Projective Model of Market (PMM) and propose its simplification. The unification of the AHP model and the projective aspect of portfolio theory suggests a wide spectrum of new applications for such an extended model.

  12. Update of the Polar SWIFT model for polar stratospheric ozone loss (Polar SWIFT version 2)

    Science.gov (United States)

    Wohltmann, Ingo; Lehmann, Ralph; Rex, Markus

    2017-07-01

    The Polar SWIFT model is a fast scheme for calculating the chemistry of stratospheric ozone depletion in polar winter. It is intended for use in global climate models (GCMs) and Earth system models (ESMs) to enable the simulation of mutual interactions between the ozone layer and climate. To date, climate models often use prescribed ozone fields, since a full stratospheric chemistry scheme is computationally very expensive. Polar SWIFT is based on a set of coupled differential equations, which simulate the polar vortex-averaged mixing ratios of the key species involved in polar ozone depletion on a given vertical level. These species are O3, chemically active chlorine (ClOx), HCl, ClONO2 and HNO3. The only external input parameters that drive the model are the fraction of the polar vortex in sunlight and the fraction of the polar vortex below the temperatures necessary for the formation of polar stratospheric clouds. Here, we present an update of the Polar SWIFT model introducing several improvements over the original model formulation. In particular, the model is now trained on vortex-averaged reaction rates of the ATLAS Chemistry and Transport Model, which enables a detailed look at individual processes and an independent validation of the different parameterizations contained in the differential equations. The training of the original Polar SWIFT model was based on fitting complete model runs to satellite observations and did not allow for this. A revised formulation of the system of differential equations is developed, which closely fits vortex-averaged reaction rates from ATLAS that represent the main chemical processes influencing ozone. In addition, a parameterization for the HNO3 change by denitrification is included. The rates of change of the concentrations of the chemical species of the Polar SWIFT model are purely chemical rates of change in the new version, whereas in the original Polar SWIFT model, they included a transport effect caused by the
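    The coupled-differential-equation idea above can be sketched as a toy vortex-averaged box model; the species set is reduced and all rate constants and forcings below are invented for illustration, not the fitted Polar SWIFT coefficients:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy vortex-averaged model in the spirit of Polar SWIFT: O3 is destroyed by
# active chlorine (ClOx) in sunlight; ClOx is activated from HCl on polar
# stratospheric clouds (PSCs). f_sun and f_psc are the sunlit and below-PSC-
# temperature vortex fractions; all numbers are invented.
def rhs(t, y, f_sun=0.3, f_psc=0.5, k_act=0.05, k_loss=0.02):
    o3, clox, hcl = y
    activation = k_act * f_psc * hcl         # HCl -> ClOx on PSC surfaces
    ozone_loss = k_loss * f_sun * clox * o3  # catalytic loss needs sunlight
    return [-ozone_loss, activation, -activation]

sol = solve_ivp(rhs, (0.0, 90.0), [3.0, 0.0, 3.0])  # ~90 days of winter
o3_end, clox_end, hcl_end = sol.y[:, -1]
print(round(o3_end, 2), round(clox_end, 2))
```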

  13. PACIAE 2.1: An Updated Issue of Parton and Hadron Cascade Model PACIAE 2.0

    Institute of Scientific and Technical Information of China (English)

    SA Ben-hao; ZHOU Dai-mei; YAN Yu-liang; DONG Bao-guo; CAI Xu

    2013-01-01

    We have updated the parton and hadron cascade model PACIAE 2.0 to the new issue PACIAE 2.1. The PACIAE model is based on PYTHIA. In the PYTHIA model, once the generated particle or parton transverse momentum pT is randomly sampled, the px and py components are originally put on the circle with radius pT randomly. Now, it is put
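    The sampling step described above can be sketched as follows (a minimal illustration, not the PACIAE code):

```python
import math
import random

# Given a transverse momentum pT, place (px, py) at a random azimuth on the
# circle of radius pT, as in the original PYTHIA-style sampling.
def sample_px_py(pt, rng=random.Random(42)):
    phi = rng.uniform(0.0, 2.0 * math.pi)
    return pt * math.cos(phi), pt * math.sin(phi)

px, py = sample_px_py(1.5)
print(math.hypot(px, py))  # recovers the input pT up to rounding
```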

  14. Introduction to Financial Projection Models. Business Management Instructional Software.

    Science.gov (United States)

    Pomeroy, Robert W., III

    This guidebook and teacher's guide accompany a personal computer software program and introduce the key elements of financial projection modeling to project the financial statements of an industrial enterprise. The student will then build a model on an electronic spreadsheet. The guidebook teaches the purpose of a financial model and the steps…

  15. Flow Forecasting using Deterministic Updating of Water Levels in Distributed Hydrodynamic Urban Drainage Models

    DEFF Research Database (Denmark)

    Hansen, Lisbet Sneftrup; Borup, Morten; Moller, Arne

    2014-01-01

    , and then evaluates and documents the performance of this particular updating procedure for flow forecasting. A hypothetical case study and synthetic observations are used to illustrate how the Update method works and affects downstream nodes. A real case study in a 544 ha urban catchment furthermore shows...

  16. Modeling Uncertainty when Estimating IT Projects Costs

    OpenAIRE

    Winter, Michel; Mirbel, Isabelle; Crescenzo, Pierre

    2014-01-01

    In the current economic context, optimizing projects' cost is an obligation for a company to remain competitive in its market. Introducing statistical uncertainty in cost estimation is a good way to tackle the risk of going too far while minimizing the project budget: it allows the company to determine the best possible trade-off between estimated cost and acceptable risk. In this paper, we present new statistical estimators derived from the way IT companies estimate the projects' costs. In t...

  17. Explicit incremental-update algorithm for modeling crystal elasto-viscoplastic response in finite element simulation

    Institute of Scientific and Technical Information of China (English)

    LI Hong-wei; YANG He; SUN Zhi-chao

    2006-01-01

    Computational stability and efficiency are the key problems for numerical modeling of crystal plasticity, which evidently limit its development and application in finite element (FE) simulation. Since implicit iterative algorithms are inefficient and have difficulty determining initial values, an explicit incremental-update algorithm for the elasto-viscoplastic constitutive relation was developed in the intermediate frame by using the second Piola-Kirchhoff (P-K) stress and Green strain. The increments of stress and slip resistance were solved by a calculation loop over sets of linear equations. The reorientation of the crystal as well as the elastic strain can be obtained from a polar decomposition of the elastic deformation gradient. A user material subroutine VUMAT was developed to combine the crystal elasto-viscoplastic constitutive model with ABAQUS/Explicit. Numerical studies were performed on a cubic upset model with OFHC material (FCC crystal). The comparison of the numerical results with those obtained by an implicit iterative algorithm and with those from experiments demonstrates that the explicit algorithm is reliable. Furthermore, the effects of material anisotropy, the rate sensitivity coefficient (RSC) and loading speed on the deformation were studied. The numerical studies indicate that the explicit algorithm is suitable and efficient for large deformation analyses where anisotropy due to texture is important.
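    A scalar toy version of an explicit incremental stress update illustrates the idea (one-dimensional, with invented parameters; the actual algorithm operates on the full crystal-plasticity tensor quantities):

```python
# 1-D sketch of explicit incremental-update stress integration: each time
# increment updates the stress elastically, then subtracts viscoplastic flow.
# All parameters are invented for illustration.
E = 200e3            # elastic modulus (MPa)
s0, n = 100.0, 5.0   # flow resistance (MPa) and rate-sensitivity exponent
dt, strain_rate = 1e-3, 1e-1

stress = 0.0
for step in range(2000):
    plastic_rate = 1e-3 * (abs(stress) / s0) ** n     # viscoplastic flow rule
    stress += E * (strain_rate - plastic_rate) * dt   # explicit update

print(round(stress, 1))  # saturates near the viscoplastic flow stress
```

Explicit schemes like this avoid the iterative solve of an implicit update, at the cost of a time-step restriction for stability.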

  18. Experimental liver fibrosis research: update on animal models, legal issues and translational aspects

    Science.gov (United States)

    2013-01-01

    Liver fibrosis is defined as excessive extracellular matrix deposition and is based on complex interactions between matrix-producing hepatic stellate cells and an abundance of liver-resident and infiltrating cells. Investigation of these processes requires in vitro and in vivo experimental work in animals. However, the use of animals in translational research will be increasingly challenged, at least in countries of the European Union, because of the adoption of new animal welfare rules in 2013. These rules will create an urgent need for optimized standard operating procedures regarding animal experimentation and improved international communication in the liver fibrosis community. This review gives an update on current animal models, techniques and underlying pathomechanisms with the aim of fostering a critical discussion of the limitations and potential of up-to-date animal experimentation. We discuss potential complications in experimental liver fibrosis and provide examples of how the findings of studies in which these models are used can be translated to human disease and therapy. In this review, we want to motivate the international community to design more standardized animal models which might help to address the legally requested replacement, refinement and reduction of animals in fibrosis research. PMID:24274743

  19. An updated analytic model for the attenuation by the intergalactic medium

    CERN Document Server

    Inoue, Akio K; Iwata, Ikuru

    2014-01-01

    We present an updated version of the so-called Madau model for the attenuation by intergalactic neutral hydrogen of the radiation from distant objects. First, we derive a distribution function of the intergalactic absorbers from the latest observational statistics of the Ly$\\alpha$ forest, Lyman limit systems, and damped Ly$\\alpha$ systems. The distribution function excellently reproduces the observed redshift evolutions of the Ly$\\alpha$ depression and of the mean free path of the Lyman continuum simultaneously. Then, we derive a set of analytic functions which describe the mean intergalactic attenuation curve for objects at $z>0.5$. For some redshifts, our new model predicts attenuation magnitudes through usual broad-band filters that differ by more than 0.5--1 mag from the original Madau model. Such a difference would cause an uncertainty of about 0.2 in the photometric redshift, in particular at $z\\simeq3$--4. Finally, we find a more than 0.5 mag overestimation of the Lyman continuum attenuation i...

  20. Downplaying model power in IT project work

    DEFF Research Database (Denmark)

    Richter, Anne; Buhl, Henrik

    2004-01-01

    Executives and information technology specialists often manage IT projects in project teams. Integrative IT systems provide opportunities to manage and restructure work functions, but the process of change often causes serious problems in implementation and diffusion. A central issue...... possible to put issues such as team functions and quality of work on the agenda. Simultaneously, participation competencies seem to have been enhanced....

  1. Dynamic Damage Modeling for IRAC Simulations Project

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA's Integrated Resilient Aircraft Control (IRAC) Project, Preliminary Technical Plan Summary identifies several causal and contributing factors that can lead to...

  2. Enterprise Projects Set Risk Element Transmission Chaotic Genetic Model

    Directory of Open Access Journals (Sweden)

    Cunbin Li

    2012-08-01

    Full Text Available In order to study the risk transfer process in project sets and to improve the efficiency of risk management in project management, we combine chaos theory and a genetic algorithm and put forward an enterprise project-set risk element transmission chaos genetic model. Using a mixture of the logistic chaos mapping and the Chebyshev chaos mapping, a hybrid chaotic mapping system is constructed. The steps of adopting the hybrid chaos mapping for genetic operation include project-set initialization, calculation of fitness, selection, crossover and mutation operators, fitness adjustment and condition judgment. The results show that the model can simulate the enterprise project-set risk transmission process very well, and it also provides a basis for enterprise managers to make decisions.
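    The hybrid chaotic mapping idea can be sketched as follows; the alternation scheme, rescaling and parameters are illustrative assumptions, not the authors' exact construction:

```python
import math

# Hybrid chaotic sequence: alternate a logistic map with a Chebyshev map.
# Values are kept in [0, 1] so they can drive GA operators (e.g. selection
# or mutation probabilities). Parameters are invented for illustration.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

def chebyshev(x, k=4):
    # Chebyshev map on [-1, 1]: T_k(x) = cos(k * arccos(x))
    return math.cos(k * math.acos(max(-1.0, min(1.0, x))))

def hybrid_sequence(x0=0.3, n=10):
    seq, x = [], x0
    for i in range(n):
        if i % 2 == 0:
            x = logistic(x)
        else:
            x = (chebyshev(2.0 * x - 1.0) + 1.0) / 2.0  # rescale to [0, 1]
        seq.append(x)
    return seq

print(hybrid_sequence()[:3])
```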

  3. SR 97. Alternative models project. Stochastic continuum modelling of Aberg

    Energy Technology Data Exchange (ETDEWEB)

    Widen, H. [Kemakta AB, Stockholm (Sweden); Walker, D. [INTERA KB/DE and S (Sweden)

    1999-08-01

    As part of studies into the siting of a deep repository for nuclear waste, Swedish Nuclear Fuel and Waste Management Company (SKB) has commissioned the Alternative Models Project (AMP). The AMP is a comparison of three alternative modelling approaches to bedrock performance assessment for a single hypothetical repository, arbitrarily named Aberg. The Aberg repository will adopt input parameters from the Aespoe Hard Rock Laboratory in southern Sweden. The models are restricted to an explicit domain, boundary conditions and canister location to facilitate the comparison. The boundary conditions are based on the regional groundwater model provided in digital format. This study is the application of HYDRASTAR, a stochastic continuum groundwater flow and transport-modelling program. The study uses 34 realisations of 945 canister locations in the hypothetical repository to evaluate the uncertainty of the advective travel time, canister flux (Darcy velocity at a canister) and F-ratio. Several comparisons of variability are constructed between individual canister locations and individual realisations. For the ensemble of all realisations with all canister locations, the study found a median travel time of 27 years, a median canister flux of 7.1 x 10{sup -4} m/yr and a median F-ratio of 3.3 x 10{sup 5} yr/m. The overall pattern of regional flow is preserved in the site-scale model, as is reflected in flow paths and exit locations. The site-scale model slightly over-predicts the boundary fluxes from the single realisation of the regional model. The explicitly prescribed domain was seen to be slightly restrictive, with 6% of the stream tubes failing to exit the upper surface of the model. Sensitivity analysis and calibration are suggested as possible extensions of the modelling study.
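    The ensemble statistics reported above (34 realisations of 945 canister locations, with median travel time and per-realisation variability) can be mimicked with a toy Monte Carlo; the lognormal distribution and its parameters are invented for illustration and are not the HYDRASTAR results:

```python
import numpy as np

# Toy ensemble: lognormal advective travel times for many canister locations
# in each stochastic realisation, then medians over the whole ensemble and
# per realisation. All numbers are invented.
rng = np.random.default_rng(1)
n_real, n_canisters = 34, 945
travel_times = rng.lognormal(mean=np.log(27.0), sigma=1.0, size=(n_real, n_canisters))

median_all = np.median(travel_times)               # ensemble of all realisations
median_per_real = np.median(travel_times, axis=1)  # variability between realisations
print(round(float(median_all), 1), median_per_real.shape)
```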

  4. Integrated Medical Model (IMM) Project Verification, Validation, and Credibility (VV&C)

    Science.gov (United States)

    Walton, M.; Boley, L.; Keenan, L.; Kerstman, E.; Shah, R.; Young, M.; Saile, L.; Garcia, Y.; Meyers, J.; Reyes, D.

    2015-01-01

    The Integrated Medical Model (IMM) Project supports end user requests by employing the Integrated Medical Evidence Database (iMED) and IMM tools as well as subject matter expertise within the Project. The iMED houses data used by the IMM. The IMM is designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic model based on historical data, cohort data, and subject matter expert opinion. A stochastic approach is taken because deterministic results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess the input information, framework and code, and output results of the IMM, to ensure that end user requests and requirements were considered during all stages of model development and implementation, and to lay the foundation for external review and application. METHODS: In 2008, the Project team developed a comprehensive verification and validation (V&V) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When NASA-STD-7009 (7009) [1] was published, the Project team updated their verification, validation, and credibility (VV&C) project plan to meet 7009 requirements and to include 7009 tools in reporting the VV&C status of the IMM. Construction of these tools included meeting documentation and evidence requirements sufficient to meet external review success criteria. RESULTS: IMM Project VV&C updates are compiled recurrently and include updates to the 7009 Compliance and Credibility matrices. Reporting tools have evolved over the lifetime of

  5. Badhwar-O'Neill 2011 Galactic Cosmic Ray Model Update and Future Improvements

    Science.gov (United States)

    O'Neill, Pat M.; Kim, Myung-Hee Y.

    2014-01-01

    The Badhwar-O'Neill Galactic Cosmic Ray (GCR) Model, based on actual GCR measurements, is used by deep space mission planners for the certification of micro-electronic systems and the analysis of radiation health risks to astronauts in space missions. The BO GCR Model provides the GCR flux in deep space (outside the earth's magnetosphere) for any given time from 1645 to the present. The energy spectrum from 50 MeV/n to 20 GeV/n is provided for ions from hydrogen to uranium. This work describes the most recent version of the BO GCR model (BO'11). BO'11 determines the GCR flux at a given time by applying an empirical time delay function to past sunspot activity. We describe the GCR measurement data used in the BO'11 update - modern data from BESS, PAMELA, CAPRICE, and ACE, emphasized over the older balloon data used for the previous BO model (BO'10). We look at the GCR flux for the last 24 solar minima and show how much greater the flux was for the cycle 24 minimum in 2010. The BO'11 Model uses the traditional, steady-state Fokker-Planck differential equation to account for particle transport in the heliosphere due to diffusion, convection, and adiabatic deceleration. It assumes a radially symmetrical diffusion coefficient derived from magnetic disturbances caused by sunspots carried onward by a constant solar wind. A more complex differential equation is now being tested to account for particle transport in the heliosphere in the next-generation BO model. This new model is time-dependent (no longer a steady-state model). In the new model, the dynamics and anti-symmetrical features of the actual heliosphere are accounted for, so empirical time delay functions will no longer be required. The new model will be capable of simulating the more subtle features of modulation - such as the Sun's polarity and the dependence of modulation on gradient and curvature drift. This improvement is expected to significantly improve the fidelity of the BO GCR model. Preliminary results of its

  6. Multi-Agent Modeling in Managing Six Sigma Projects

    Directory of Open Access Journals (Sweden)

    K. Y. Chau

    2009-10-01

    Full Text Available In this paper, a multi-agent model is proposed for considering the human resources factor in decision making in relation to the six sigma project. The proposed multi-agent system is expected to increase the accuracy of project prioritization and to stabilize the human resources service level. A simulation of the proposed multi-agent model is conducted. The results show that a multi-agent model which takes human resources into consideration when making decisions about project selection and project team formation is important in enabling efficient and effective project management. The multi-agent modeling approach provides an alternative approach for improving communication and the autonomy of six sigma projects in business organizations.

  7. Business models for renewable energy in the built environment. Updated version

    Energy Technology Data Exchange (ETDEWEB)

    Wuertenberger, L.; Menkveld, M.; Vethman, P.; Van Tilburg, X. [ECN Policy Studies, Amsterdam (Netherlands); Bleyl, J.W. [Energetic Solutions, Graz (Austria)

    2012-04-15

    The project RE-BIZZ aims to provide insight to policy makers and market actors in the way new and innovative business models (and/or policy measures) can stimulate the deployment of renewable energy technologies (RET) and energy efficiency (EE) measures in the built environment. The project is initiated and funded by the IEA Implementing Agreement for Renewable Energy Technology Deployment (IEA-RETD). It analysed ten business models in three categories (amongst others different types of Energy Service Companies (ESCOs), Developing properties certified with a 'green' building label, Building owners profiting from rent increases after EE measures, Property Assessed Clean Energy (PACE) financing, On-bill financing, and Leasing of RET equipment) including their organisational and financial structure, the existing market and policy context, and an analysis of Strengths, Weaknesses, Opportunities and Threats (SWOT). The study concludes with recommendations for policy makers and other market actors.

  8. Update of an Object Oriented Track Reconstruction Model for LHC Experiments

    Institute of Scientific and Technical Information of China (English)

    David Candilin; Sijin Qian; et al.

    2001-01-01

    In this update report about an Object Oriented (OO) track reconstruction model, which was presented at CHEP'97, CHEP'98, and CHEP'2000, we shall describe subsequent new developments since the beginning of year 2000. The OO model for the Kalman filtering method has been designed for high energy physics experiments at high luminosity hadron colliders. It has been coded in the C++ programming language originally for the CMS experiment at the future Large Hadron Collider (LHC) at CERN, and later has been successfully implemented into three different OO computing environments (including the level-2 trigger and offline software systems) of ATLAS (another major experiment at the LHC). For the level-2 trigger software environment, we shall selectively present some latest performance results (e.g. the B-physics event selection for the ATLAS level-2 trigger, the robustness study result, etc.). For the offline environment, we shall present a new 3-D space point package which provides the essential offline input. A major development after CHEP'2000 is the implementation of the OO model into the new OO software framework "Athena" of the ATLAS experiment. The new modularization of this OO package enables the model to be more flexible and more easily implemented into different software environments. It also provides the potential to handle more complicated realistic situations (e.g. to include the calibration correction and the alignment correction, etc.). Some general interface issues (e.g. the design of the common track class) of the algorithms to different framework environments have been investigated by using this OO package.
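    The Kalman filtering method at the core of the model can be illustrated with a minimal one-dimensional track fit; the geometry, noise levels and straight-line track model below are invented for illustration, and the real package works on full detector geometry in C++:

```python
import numpy as np

# Minimal 1-D Kalman filter: estimate a straight track (position, slope)
# from noisy hits on successive detector layers. All values are invented.
F = np.array([[1.0, 1.0], [0.0, 1.0]])  # propagate state one layer ahead
H = np.array([[1.0, 0.0]])              # we measure position only
Q = 1e-6 * np.eye(2)                    # small process noise (scattering)
R = np.array([[0.04]])                  # hit resolution 0.2 (squared)

x = np.array([0.0, 0.0])                # initial (position, slope) guess
P = np.eye(2)                           # initial covariance
true_slope = 0.5
rng = np.random.default_rng(7)
for layer in range(1, 11):
    z = true_slope * layer + rng.normal(0.0, 0.2)  # noisy hit on this layer
    x, P = F @ x, F @ P @ F.T + Q                  # predict
    S = H @ P @ H.T + R                            # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                 # Kalman gain
    x = x + K @ (np.array([z]) - H @ x)            # update state with the hit
    P = (np.eye(2) - K @ H) @ P                    # update covariance

print(round(float(x[1]), 2))  # fitted slope, close to the true 0.5
```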

  9. Benchmarking Exercises To Validate The Updated ELLWF GoldSim Slit Trench Model

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, G. A.; Hiergesell, R. A.

    2013-11-12

    The Savannah River National Laboratory (SRNL) results of the 2008 Performance Assessment (PA) (WSRC, 2008) sensitivity/uncertainty analyses conducted for the trenches located in the E-Area Low-Level Waste Facility (ELLWF) were subject to review by the United States Department of Energy (U.S. DOE) Low-Level Waste Disposal Facility Federal Review Group (LFRG) (LFRG, 2008). LFRG comments were generally approving of the use of probabilistic modeling in GoldSim to support the quantitative sensitivity analysis. A recommendation was made, however, that the probabilistic models be revised and updated to bolster their defensibility. SRS committed to addressing those comments and, in response, contracted with Neptune and Company to rewrite the three GoldSim models. The initial portion of this work, development of the Slit Trench (ST), Engineered Trench (ET) and Components-in-Grout (CIG) trench GoldSim models, has been completed. The work described in this report utilizes these revised models to test and evaluate the results against the 2008 PORFLOW model results. This was accomplished by first performing a rigorous code-to-code comparison of the PORFLOW and GoldSim codes and then performing a deterministic comparison of the two-dimensional (2D) unsaturated zone and three-dimensional (3D) saturated zone PORFLOW Slit Trench models against results from the one-dimensional (1D) GoldSim Slit Trench model. The results of the code-to-code comparison indicate that when the mechanisms of radioactive decay, partitioning of contaminants between solid and fluid, implementation of specific boundary conditions and the imposition of solubility controls were all tested using identical flow fields, GoldSim and PORFLOW produce nearly identical results. It is also noted that GoldSim has an advantage over PORFLOW in that it simulates all radionuclides simultaneously - thus avoiding a potential problem as demonstrated in the Case Study (see Section 2.6). Hence, it was concluded that the follow

  10. A Team Building Model for Software Engineering Courses Term Projects

    Science.gov (United States)

    Sahin, Yasar Guneri

    2011-01-01

    This paper proposes a new model for team building, which enables teachers to build coherent teams rapidly and fairly for the term projects of software engineering courses. Moreover, the model can also be used to build teams for any type of project, if the team member candidates are students, or if they are inexperienced on a certain subject. The…

  12. Bayesian-based Project Monitoring: Framework Development and Model Testing

    Directory of Open Access Journals (Sweden)

    Budi Hartono

    2015-12-01

    Full Text Available During project implementation, risk becomes an integral part of project monitoring. Therefore, a tool that can dynamically include elements of risk in project progress monitoring is needed. The objective of this study is to develop a general framework that addresses such a concern. The developed framework consists of three interrelated major building blocks, namely the Risk Register (RR), the Bayesian Network (BN), and the Project Time Network (PTN), for dynamic project monitoring. The RR is used to list and categorize identified project risks. The PTN is utilized for modeling the relationships between project activities. The BN is used to reflect the interdependence among risk factors and to bridge the RR and PTN. A residential development project is chosen as a working example, and the result shows that the proposed framework has been successfully applied. The specific model of the development project is also successfully developed and is used to monitor the project progress. It is shown in this study that the proposed BN-based model provides superior forecast accuracy compared to the extant models.
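    The core Bayesian mechanism, updating belief in a risk factor from observed progress, reduces to Bayes' rule in the single-factor case; the probabilities below are invented for illustration:

```python
# Minimal Bayesian update: a single risk factor R influences whether an
# activity is delayed (D). Observing a delay updates our belief in R.
# All probabilities are invented for illustration.
p_r = 0.2              # prior P(R)
p_d_given_r = 0.7      # P(delay | risk active)
p_d_given_not_r = 0.1  # P(delay | risk inactive)

p_d = p_d_given_r * p_r + p_d_given_not_r * (1 - p_r)  # total probability
p_r_given_d = p_d_given_r * p_r / p_d                  # Bayes' rule

print(round(p_d, 3), round(p_r_given_d, 3))  # 0.22 0.636
```

In the full framework this update propagates through the Bayesian Network to revise the duration forecasts of downstream activities in the project time network.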

  13. An evolutionary cascade model for sauropod dinosaur gigantism--overview, update and tests.

    Science.gov (United States)

    Sander, P Martin

    2013-01-01

    Sauropod dinosaurs are a group of herbivorous dinosaurs which exceeded all other terrestrial vertebrates in mean and maximal body size. Sauropod dinosaurs were also the most successful and long-lived herbivorous tetrapod clade, but no abiological factors such as global environmental parameters conducive to their gigantism can be identified. These facts justify major efforts by evolutionary biologists and paleontologists to understand sauropods as living animals and to explain their evolutionary success and uniquely gigantic body size. Contributions to this research program have come from many fields and can be synthesized into a biological evolutionary cascade model of sauropod dinosaur gigantism (sauropod gigantism ECM). This review focuses on the sauropod gigantism ECM, providing an updated version based on the contributions to the PLoS ONE sauropod gigantism collection and on other very recent published evidence. The model consists of five separate evolutionary cascades ("Reproduction", "Feeding", "Head and neck", "Avian-style lung", and "Metabolism"). Each cascade starts with observed or inferred basal traits that may be either plesiomorphic or derived at the level of Sauropoda. Each trait confers hypothetical selective advantages which permit the evolution of the next trait. Feedback loops in the ECM consist of selective advantages originating from traits higher in the cascades but affecting lower traits. All cascades end in the trait "Very high body mass". Each cascade is linked to at least one other cascade. Important plesiomorphic traits of sauropod dinosaurs that entered the model were oviparity as well as the absence of mastication of food. Important evolutionary innovations (derived traits) were an avian-style respiratory system and an elevated basal metabolic rate. Comparison with other tetrapod lineages identifies factors limiting body size.

  15. A Game-Dynamic Model of Gas Transportation Routes and Its Application to the Turkish Gas Market [Updated November 2003

    OpenAIRE

    Klaassen, G.; Matrosov, I.; Roehrl, R.A.; A.M. Tarasyev

    2003-01-01

    The purpose of this paper is to study an optimal structure of a system of international gas pipelines competing for a gas market. We develop a game-dynamic model of the operation of several interacting gas pipeline projects, with project owners acting as players in the game. The model treats the projects' commercialization times as the major players' controls. Current quantities of gas supply are modeled as approximations of Nash equilibrium points in instantaneous "gas supply games", in which each p...

  16. BUSINESS PROCESS MODELLING FOR PROJECTS COSTS MANAGEMENT IN AN ORGANIZATION

    Directory of Open Access Journals (Sweden)

    PĂTRAŞCU AURELIA

    2014-05-01

    Full Text Available Using Information Technologies in organizations represents evident progress for a company, saving money and time, and generates value for the organization. In this paper the author proposes to model the business processes of an organization that manages project costs, because modelling is an important part of any software development process. Using software for project cost management is essential because it allows the management of all operations according to the established parameters, the management of project groups, as well as the management of projects and subprojects at different complexity levels.

  17. Recent updates in the aerosol component of the C-IFS model run by ECMWF

    Science.gov (United States)

    Remy, Samuel; Boucher, Olivier; Hauglustaine, Didier; Kipling, Zak; Flemming, Johannes

    2017-04-01

    The Composition-Integrated Forecast System (C-IFS) is a global atmospheric composition forecasting tool, run by ECMWF within the framework of the Copernicus Atmospheric Monitoring Service (CAMS). The aerosol model of C-IFS is a simple bulk scheme that forecasts 5 species: dust, sea-salt, black carbon, organic matter and sulfate. Three bins represent the dust and sea-salt, covering the super-coarse, coarse and fine modes of these species (Morcrette et al., 2009). This talk will present recent updates of the aerosol model and introduce forthcoming developments. It will also present the impact of these changes, as measured by scores against AERONET Aerosol Optical Depth (AOD) and Airbase PM10 observations. The next cycle of C-IFS will include a mass fixer, because the semi-Lagrangian advection scheme used in C-IFS is not mass-conservative. C-IFS now offers the possibility to emit biomass-burning aerosols at an injection height that is provided by a new version of the Global Fire Assimilation System (GFAS). Secondary Organic Aerosol (SOA) production will be scaled on non-biomass-burning CO fluxes. This approach makes it possible to represent the anthropogenic contribution to SOA production; it brought a notable improvement in the skill of the model, especially over Europe. Lastly, the emissions of SO2 are now provided by the MACCity inventory instead of an older version of the EDGAR dataset. The seasonal and yearly variability of SO2 emissions is better captured by the MACCity dataset. Upcoming developments of the aerosol model of C-IFS consist mainly of the implementation of a nitrate and ammonium module, with 2 bins (fine and coarse) for nitrate. Nitrate and ammonium sulfate particle formation from gaseous precursors is represented following Hauglustaine et al. (2014); formation of coarse nitrate on pre-existing sea-salt or dust particles is also represented. This extension of the forward model improved scores over heavily populated areas such as Europe, China and Eastern

  18. Validation of HEDR models. Hanford Environmental Dose Reconstruction Project

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.; Simpson, J.C.; Eslinger, P.W.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1994-05-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model estimates with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not mean that a model is valid. Rather, the collection of tests together provide a level of confidence that the HEDR models are valid.

  19. Wake models developed during the Wind Shadow project

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, S.; Ott, S.; Pena, A.; Berg, J.; Nielsen, M.; Rathmann, O.; Joergensen, H.

    2011-11-15

    The Wind Shadow project has developed and validated improved models for determining wake losses, and thereby the array efficiency, of very large, closely packed wind farms. The rationale behind the project has been that the existing software covered these types of wind farms poorly, both with respect to the densely packed turbines and the large fetches needed to describe the collective shadow effects of one farm on the next. Further, the project has developed the necessary software for the use of the models. Guidelines with recommendations for the use of the models are included in the model deliverables. The project has been carried out as a collaboration between Risoe DTU, DONG, Vattenfall, DNV and VESTAS, and it has been financed by energinet.dk grant no. 10086. (Author)

  20. A novel Q-based online model updating strategy and its application in statistical process control for rubber mixing

    Institute of Scientific and Technical Information of China (English)

    Chunying Zhang; Sun Chen; Fang Wu; Kai Song

    2015-01-01

    To overcome the large time delay in measuring the hardness of mixed rubber, rheological parameters were used to predict the hardness. A novel Q-based model updating strategy was proposed as a universal platform to track time-varying properties. Using a few selected support samples to update the model, the strategy can dramatically save storage cost and overcome the adverse influence of low signal-to-noise-ratio samples. Moreover, it can be applied to any statistical process monitoring system without drastic changes, which is practical for industrial use. As examples, the Q-based strategy was integrated with three popular algorithms (partial least squares (PLS), recursive PLS (RPLS), and kernel PLS (KPLS)) to form novel regression variants: QPLS, QRPLS and QKPLS, respectively. Applications to predicting mixed-rubber hardness at a large-scale tire plant in east China confirm the theoretical considerations.
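
The Q statistic (squared prediction error) mentioned in the record above is a standard way to decide which new samples a latent-variable model fails to explain. The sketch below is an illustrative, PCA-based stand-in for the paper's PLS-based strategy, using invented data; samples whose Q exceeds an empirical control limit are kept as candidate support samples for a model update. All names and thresholds are assumptions, not the paper's method.

```python
import numpy as np

def fit_pca(X, n_components):
    """Fit a PCA model (mean + loadings) to the data."""
    mu = X.mean(axis=0)
    # SVD of the mean-centered data gives loadings as right singular vectors
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    P = Vt[:n_components].T          # (n_features, n_components)
    return mu, P

def q_statistic(X, mu, P):
    """Q (squared prediction error) of each sample under the PCA model."""
    Xc = X - mu
    residual = Xc - Xc @ P @ P.T     # part not explained by the model
    return np.sum(residual ** 2, axis=1)

rng = np.random.default_rng(0)
# training data living mostly along one latent direction
scores = rng.normal(size=(200, 1))
X_train = scores @ np.array([[1.0, 0.8, 0.5]]) + 0.05 * rng.normal(size=(200, 3))

mu, P = fit_pca(X_train, n_components=1)
limit = np.percentile(q_statistic(X_train, mu, P), 95)  # empirical control limit

# new batch: half follows the old correlation, half reflects a process shift
X_new = np.vstack([
    rng.normal(size=(20, 1)) @ np.array([[1.0, 0.8, 0.5]]),
    rng.normal(size=(20, 1)) @ np.array([[1.0, -0.8, 0.5]]),
])
# samples the old model cannot explain become support samples for the update
support = X_new[q_statistic(X_new, mu, P) > limit]
```

In the paper's setting the same selection logic would sit in front of a PLS/RPLS/KPLS regression update rather than a PCA model.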

  1. Development of roughness updating based on artificial neural network in a river hydraulic model for flash flood forecasting

    Indian Academy of Sciences (India)

    J C Fu; M H Hsu; Y Duann

    2016-02-01

    Flood is the worst weather-related hazard in Taiwan because of steep terrain and storms. Tropical storms often result in disastrous flash floods. Providing reliable forecasts of water stages in rivers is indispensable for proper actions in the emergency response during floods. A river hydraulic model based on dynamic wave theory, using an implicit finite-difference method, is developed with river roughness updating for flash flood forecasting. An artificial neural network (ANN) is employed to update the roughness of rivers in accordance with the observed river stages at each time-step of the flood routing process. Several typhoon events at the Tamsui River are utilized to evaluate the accuracy of flood forecasting. The results show that the adaptive n-values of roughness for the river hydraulic model provide a better flow state for subsequent forecasting at significant locations and along longitudinal profiles of rivers.

  2. Food for thought: Overconfidence in model projections

    DEFF Research Database (Denmark)

    Brander, Keith; Neuheimer, Anna; Andersen, Ken Haste

    2013-01-01

    There is considerable public and political interest in the state of marine ecosystems and fisheries, but the reliability of some recent projections has been called into question. New information about declining fish stocks, loss of biodiversity, climate impacts, and management failure is frequently...

  3. Rapid Energy Modeling Workflow Demonstration Project

    Science.gov (United States)

    2014-01-01

    Points of contact: John Sullivan, Autodesk, Inc., 111 McInnis Parkway, San Rafael, CA 94903 … Phone: 703-827-7213, E-Mail: john.rittling@autodesk.com; Collaborator: Mark Frost, Autodesk, Inc., 111 McInnis

  4. Basic models modeling resistance training: an update for basic scientists interested in studying skeletal muscle hypertrophy.

    Science.gov (United States)

    Cholewa, Jason; Guimarães-Ferreira, Lucas; da Silva Teixeira, Tamiris; Naimo, Marshall Alan; Zhi, Xia; de Sá, Rafaele Bis Dal Ponte; Lodetti, Alice; Cardozo, Mayara Quadros; Zanchi, Nelo Eidy

    2014-09-01

    Human muscle hypertrophy brought about by voluntary exercise in laboratory conditions is the most common way to study resistance exercise training, especially because of its reliability, stimulus control and easy application to resistance training exercise sessions at fitness centers. However, because of the complexity of the blood factors and organs involved, invasive data are difficult to obtain in human exercise training studies owing to the integration of several organs, including adipose tissue, liver, brain and skeletal muscle. In contrast, studying skeletal muscle remodeling in animal models is easier, as the organs can be readily obtained after euthanasia; however, not all models of resistance training in animals display a robust capacity to hypertrophy the desired muscle. Moreover, some models of resistance training rely on voluntary effort, which complicates the interpretation of results, since voluntary capacity is practically impossible to measure in rodents. With this information in mind, we review the modalities used to simulate resistance training in animals in order to present to investigators the benefits and risks of different animal models capable of provoking skeletal muscle hypertrophy. Our second objective is to help investigators analyze and select the experimental resistance training model that best suits the research question and desired endpoints.

  5. Soil as natural heat resource for very shallow geothermal application: laboratory and test site updates from ITER Project

    Science.gov (United States)

    Di Sipio, Eloisa; Bertermann, David

    2017-04-01

    Nowadays renewable energy resources for heating/cooling residential and tertiary buildings and agricultural greenhouses are becoming increasingly important. In this framework, a possible, natural and valid alternative for thermal energy supply is represented by soils. In fact, since 1980 soils have been studied and used as heat reservoirs in geothermal applications, acting as a heat source (in winter) or sink (in summer), coupled mainly with heat pumps. Therefore, knowledge of soil thermal properties and of heat and mass transfer in soils plays an important role in modeling the performance, reliability and environmental impact, in the short and long term, of engineering applications. However, soil thermal behavior varies with soil physical characteristics such as texture and water content. The available data are often scattered and incomplete for geothermal applications, especially very shallow geothermal systems (up to 10 m depth), so a better understanding of how different soil typologies (i.e. sand, loamy sand, etc.) affect and are affected by the heat exchange with very shallow geothermal installations (i.e. horizontal collector systems and special forms) is of considerable interest. Taking these premises into consideration, the ITER Project (Improving Thermal Efficiency of horizontal ground heat exchangers, http://iter-geo.eu/), funded by the European Union, is presented here. An overview of the variation of physical-thermal properties under different moisture and load conditions for different mixtures of natural material is shown, based on laboratory and field test data. The test site, located in Eltersdorf, near Erlangen (Germany), consists of 5 trenches, each filled with a different material, in which 5 helix-type heat exchangers have been installed horizontally instead of in the traditional vertical configuration.

  6. Multilevel modelling of mechanical properties of textile composites: ITOOL Project

    NARCIS (Netherlands)

    Van Den Broucke, Bjorn; Drechsler, Klaus; Hanisch, Vera; Hartung, Daniel; Ivanov, Dimitry S.; Koissin, Vitaly E.; Lomov, Stepan V.; Middendorf, Peter

    2007-01-01

    The paper presents an overview of the multi-level modelling of textile composites in the ITOOL project, focusing on the models of textile reinforcements, which serve as a basis for micromechanical models of textile composites at the unit cell level. The modelling is performed using finite element analysis.

  7. Update on Multi-Variable Parametric Cost Models for Ground and Space Telescopes

    Science.gov (United States)

    Stahl, H. Philip; Henrichs, Todd; Luedtke, Alexander; West, Miranda

    2012-01-01

    Parametric cost models can be used by designers and project managers to perform relative cost comparisons between major architectural cost drivers and allow high-level design trades; enable cost-benefit analysis for technology development investment; and, provide a basis for estimating total project cost between related concepts. This paper reports on recent revisions and improvements to our ground telescope cost model and refinements of our understanding of space telescope cost models. One interesting observation is that while space telescopes are 50X to 100X more expensive than ground telescopes, their respective scaling relationships are similar. Another interesting speculation is that the role of technology development may be different between ground and space telescopes. For ground telescopes, the data indicates that technology development tends to reduce cost by approximately 50% every 20 years. But for space telescopes, there appears to be no such cost reduction because we do not tend to re-fly similar systems. Thus, instead of reducing cost, 20 years of technology development may be required to enable a doubling of space telescope capability. Other findings include: mass should not be used to estimate cost; spacecraft and science instrument costs account for approximately 50% of total mission cost; and, integration and testing accounts for only about 10% of total mission cost.

  8. Forecasting project schedule performance using probabilistic and deterministic models

    Directory of Open Access Journals (Sweden)

    S.A. Abdel Azeem

    2014-04-01

    Full Text Available Earned value management (EVM) was originally developed for cost management and has not been widely used for forecasting project duration. In addition, EVM-based formulas for cost or schedule forecasting are still deterministic and do not provide any information about the range of possible outcomes or the probability of meeting the project objectives. The objective of this paper is to develop three models to forecast the estimated duration at completion. Two of these models are deterministic: the earned value (EV) and earned schedule (ES) models. The third model is probabilistic and is developed based on the Kalman filter algorithm and earned schedule management. The accuracies of the EV, ES and Kalman Filter Forecasting Model (KFFM) through the different project periods are assessed and compared with other forecasting methods such as the Critical Path Method (CPM), which makes the time forecast at the activity level by revising the actual reporting data for each activity at a certain data date. A case study project is used to validate the results of the three models, and the best model is selected based on the lowest average percentage error. The results showed that the KFFM developed in this study provides probabilistic prediction bounds of project duration at completion and can be applied through the different project periods with smaller errors than those observed in the EV and ES forecasting models.
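
The earned schedule (ES) forecast used by the record above can be sketched in a few lines. The planned-value curve, data date and earned value below are made-up numbers for illustration: ES is the time at which the planned-value curve reaches the current earned value, and the estimated duration at completion follows from the time-based schedule performance index.

```python
def earned_schedule(pv_curve, ev):
    """Time at which the cumulative planned-value curve reaches the current
    EV, with linear interpolation between whole reporting periods."""
    for t in range(1, len(pv_curve)):
        if pv_curve[t] >= ev:
            return (t - 1) + (ev - pv_curve[t - 1]) / (pv_curve[t] - pv_curve[t - 1])
    return float(len(pv_curve) - 1)

pv = [0, 100, 250, 450, 700, 1000]   # cumulative planned value per period
planned_duration = 5                  # periods
actual_time = 3                       # data date (periods elapsed)
ev = 330                              # earned value at the data date

es = earned_schedule(pv, ev)          # schedule "earned" so far, in periods
spi_t = es / actual_time              # time-based schedule performance index
edac = planned_duration / spi_t       # estimated duration at completion

print(es, spi_t, edac)
```

Here ES = 2.4 periods after 3 elapsed, so SPI(t) = 0.8 and the deterministic ES forecast is 6.25 periods; the paper's KFFM wraps a probabilistic filter around exactly this kind of point estimate.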

  9. An updated probabilistic seismic hazard assessment for Romania and comparison with the approach and outcomes of the SHARE Project

    OpenAIRE

    Pavel, Florin; Vacareanu, Radu; Douglas, John; Radulian, Micrea; Cioflan, Carmen; Barbat Barbat, Horia Alejandro

    2016-01-01

    The probabilistic seismic hazard analysis for Romania is revisited within the framework of the BIGSEES national research project (http://infp.infp.ro/bigsees/default.htm) financed by the Romanian Ministry of Education and Scientific Research in the period 2012-2016. The scope of this project is to provide a refined description of the seismic action for Romanian sites according to the requirements of Eurocode 8. To this aim, the seismicity of all the sources influencing the Romanian territory ...

  10. Recent updates in the aerosol model of C-IFS and their impact on skill scores

    Science.gov (United States)

    Remy, Samuel; Boucher, Olivier; Hauglustaine, Didier

    2016-04-01

    The Composition-Integrated Forecast System (C-IFS) is a global atmospheric composition forecasting tool, run by ECMWF within the framework of the Copernicus Atmospheric Monitoring Services (CAMS). The aerosol model of C-IFS is a simple bulk scheme that forecasts 5 species: dust, sea-salt, black carbon, organic matter and sulfate. Three bins represent the dust and sea-salt, covering the super-coarse, coarse and fine modes of these species (Morcrette et al., 2009). This talk will present recent updates of the aerosol model and introduce upcoming upgrades. It will also present evaluations of the model against AERONET observations. The next cycle of C-IFS will include a mass fixer, because the semi-Lagrangian advection scheme used in C-IFS is not mass-conservative. This modification has a negligible impact for most species except black carbon and organic matter; it makes it possible to close the budgets between sources and sinks in the diagnostics. Dust emissions have been tuned to favor the emission of large particles, which were under-represented. This brought an overall decrease of the burden of dust aerosol and improved scores, especially close to source regions. Biomass-burning aerosols are now emitted at an injection height that is provided by a new version of the Global Fire Assimilation System (GFAS). This brought a small increase in biomass-burning aerosols and a better representation of some large fire events. Lastly, SO2 emissions are now provided by the MACCity dataset instead of an older version of the EDGAR dataset. The seasonal and yearly variability of SO2 emissions is better captured by the MACCity dataset, the use of which brought significant improvements of the forecasts against observations. Upcoming upgrades of the aerosol model of C-IFS consist mainly of an overhaul of the representation of secondary aerosols. Secondary Organic Aerosol (SOA) production will be dynamically estimated by scaling it on CO fluxes. This approach has been

  11. Pastures to Prairies to Pools: An Update on Natural Resource Damages Settlement Projects at the Fernald Preserve - 13198

    Energy Technology Data Exchange (ETDEWEB)

    Powell, Jane [Fernald Preserve Site Manager, DOE Office of Legacy Management, Harrison, Ohio (United States); Schneider, Tom [Fernald Project Manager, Ohio Environmental Protection Agency, Dayton, Ohio (United States); Hertel, Bill [Project Manager, S.M. Stoller Corporation, Harrison, Ohio (United States); Homer, John [Environmental Scientist, S.M. Stoller Corporation, Harrison, Ohio (United States)

    2013-07-01

    The DOE Office of Legacy Management oversees implementation and monitoring of two ecological restoration projects at the Fernald Preserve, Fernald, Ohio, that are funded through a CERCLA natural resource damage settlement. Planning and implementation of on-property ecological restoration projects is one component of compensation for natural resource injury. The Paddys Run Tributary Project involves creation of vernal pool wetland habitat with adjacent forest restoration. The Triangle Area Project is a mesic tall-grass prairie establishment, similar to other efforts at the Fernald Preserve. The goal of the Fernald Natural Resource Trustees is to establish habitat for Ambystomatid salamander species, as well as grassland birds. Field implementation of these projects was completed in May 2012. Herbaceous cover and woody vegetation survival were determined in August and September 2012. Results show successful establishment of native vegetation. Additional monitoring will be needed to determine whether project goals have been met. As with the rest of the Fernald Preserve, ecological restoration has helped turn a DOE liability into a community asset. (authors)

  12. Building Context with Tumor Growth Modeling Projects in Differential Equations

    Science.gov (United States)

    Beier, Julie C.; Gevertz, Jana L.; Howard, Keith E.

    2015-01-01

    The use of modeling projects serves to integrate, reinforce, and extend student knowledge. Here we present two projects related to tumor growth appropriate for a first course in differential equations. They illustrate the use of problem-based learning to reinforce and extend course content via a writing or research experience. Here we discuss…

  13. Improving Project Management Using Formal Models and Architectures

    Science.gov (United States)

    Kahn, Theodore; Sturken, Ian

    2011-01-01

    This talk discusses the advantages that formal modeling and architecture bring to project management. These emerging technologies hold both great potential and challenges for improving the information available for decision-making. The presentation covers standards, tools, and cultural issues needing consideration, and includes lessons learned from projects the presenters have worked on.

  14. Modelling precipitation extremes in the Czech Republic: update of intensity–duration–frequency curves

    Directory of Open Access Journals (Sweden)

    Michal Fusek

    2016-11-01

    Full Text Available Precipitation records from six stations of the Czech Hydrometeorological Institute were subjected to statistical analysis with the objectives of updating the intensity–duration–frequency (IDF) curves by applying extreme value distributions, and of comparing the updated curves against those produced by an empirical procedure in 1958. Another objective was to investigate differences between the two sets of curves, which could be explained by factors such as different measuring instruments, measuring station altitudes and data analysis methods. It has been shown that the differences between the two sets of IDF curves are significantly influenced by the chosen method of data analysis.
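
The extreme-value step behind the IDF update above typically amounts to fitting an EV Type I (Gumbel) distribution to annual-maximum intensities for each duration and reading off return levels. A minimal method-of-moments sketch, using invented annual maxima rather than the Czech station records:

```python
import math

# Hypothetical annual-maximum 60-min rainfall intensities (mm/h), invented
# for illustration; real IDF work would use a long station record.
annual_max = [22.1, 30.4, 18.7, 25.9, 41.2, 27.5, 33.0, 20.8, 29.3, 36.6,
              24.4, 31.8, 19.9, 28.2, 45.0, 26.1, 23.7, 34.9, 21.5, 38.3]

n = len(annual_max)
mean = sum(annual_max) / n
var = sum((x - mean) ** 2 for x in annual_max) / (n - 1)

# Method-of-moments Gumbel (EV Type I) parameters
beta = math.sqrt(6 * var) / math.pi   # scale
mu = mean - 0.5772 * beta             # location (Euler-Mascheroni constant)

def return_level(T):
    """Intensity with return period T years: Gumbel quantile at 1 - 1/T."""
    return mu - beta * math.log(-math.log(1 - 1 / T))

print(return_level(10), return_level(100))
```

Repeating the fit for each storm duration and plotting return level against duration, one curve per return period, reproduces the IDF family; maximum-likelihood or L-moment fits would be the more rigorous alternatives to the moment estimates used here.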

  15. Updated U.S. Geothermal Supply Characterization and Representation for Market Penetration Model Input

    Energy Technology Data Exchange (ETDEWEB)

    Augustine, C.

    2011-10-01

    The U.S. Department of Energy (DOE) Geothermal Technologies Program (GTP) tasked the National Renewable Energy Laboratory (NREL) with conducting the annual geothermal supply curve update. This report documents the approach taken to identify geothermal resources, determine the electrical producing potential of these resources, and estimate the levelized cost of electricity (LCOE), capital costs, and operating and maintenance costs from these geothermal resources at present and future timeframes under various GTP funding levels. Finally, this report discusses the resulting supply curve representation and how improvements can be made to future supply curve updates.

  16. Updates on Modeling the Water Cycle with the NASA Ames Mars Global Climate Model

    Science.gov (United States)

    Kahre, M. A.; Haberle, R. M.; Hollingsworth, J. L.; Montmessin, F.; Brecht, A. S.; Urata, R.; Klassen, D. R.; Wolff, M. J.

    2017-01-01

    Global Circulation Models (GCMs) have made steady progress in simulating the current Mars water cycle. It is now widely recognized that clouds are a critical component that can significantly affect the nature of the simulated water cycle. Two processes in particular are key to implementing clouds in a GCM: the microphysical processes of formation and dissipation, and their radiative effects on heating/ cooling rates. Together, these processes alter the thermal structure, change the dynamics, and regulate inter-hemispheric transport. We have made considerable progress representing these processes in the NASA Ames GCM, particularly in the presence of radiatively active water ice clouds. We present the current state of our group's water cycle modeling efforts, show results from selected simulations, highlight some of the issues, and discuss avenues for further investigation.

  17. LITHO1.0 - An Updated Crust and Lithospheric Model of the Earth Developed Using Multiple Data Constraints

    Science.gov (United States)

    Pasyanos, M. E.; Masters, G.; Laske, G.; Ma, Z.

    2012-12-01

    Models such as CRUST2.0 (Bassin et al., 2000) have proven very useful to many seismic studies on regional, continental, and global scales. We have developed an updated, higher resolution model called LITHO1.0 that extends deeper to include the lithospheric lid, and includes mantle anisotropy, potentially making it more useful for a wider variety of applications. The model is evolving away from the crustal types strongly used in CRUST5.1 (Mooney et al., 1998) to a more data-driven model. This is accomplished by performing a targeted grid search with multiple data inputs. We seek to find the most plausible model which is able to fit multiple constraints, including updated sediment and crustal thickness models, upper mantle velocities derived from travel times, and surface wave dispersion. The latter comes from a new, very large, global surface wave dataset built using a new, efficient measurement technique that employs cluster analysis (Ma et al., 2012), and includes the group and phase velocities of both Love and Rayleigh waves. We will discuss datasets and methodology, highlight significant features of the model, and provide detailed information on the availability of the model in various formats.

  18. QMU in Integrated Spacecraft System Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — ACTA and Sandia National Laboratories propose to quantify and propagate substructure modeling uncertainty for reduced-order substructure models to higher levels of...

  20. On-line updating Gaussian mixture model for aircraft wing spar damage evaluation under time-varying boundary condition

    Science.gov (United States)

    Qiu, Lei; Yuan, Shenfang; Chang, Fu-Kuo; Bao, Qiao; Mei, Hanfei

    2014-12-01

    Structural health monitoring technology for aerospace structures has gradually turned from fundamental research to practical implementation. However, real aerospace structures work under time-varying conditions that introduce uncertainties into the signal features extracted from sensor signals, giving rise to difficulty in reliably evaluating damage. This paper proposes an online-updating Gaussian Mixture Model (GMM)-based damage evaluation method to improve damage evaluation reliability under time-varying conditions. In this method, Lamb-wave-signal variation indexes and principal component analysis (PCA) are adopted to obtain the signal features. A baseline GMM is constructed on the signal features acquired under time-varying conditions when the structure is in a healthy state. By adopting an online updating mechanism based on a moving feature sample set and inner probability structure reconstruction, the probability structures of the GMM can be updated over time with new monitoring signal features to track the damage progress online continuously under time-varying conditions. This method can be implemented without any physical model of the damage or structure. A real aircraft wing spar, an important load-bearing structure of an aircraft, is adopted to validate the proposed method. The validation results show that the method is effective for monitoring edge crack growth at the bolt holes of the wing spar under time-varying changes in the tightness of the bolts.
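
As a rough illustration of the moving-feature-window idea in the record above, the sketch below replaces the paper's multi-component GMM with a single Gaussian whose mean and covariance are re-estimated over a sliding window of recent features: slow, condition-driven drift is absorbed into the baseline, while a genuinely novel (damage-like) feature still scores a large Mahalanobis distance. The class name, window size and data are all hypothetical, not the paper's implementation.

```python
import numpy as np
from collections import deque

class MovingGaussianBaseline:
    """Single-Gaussian stand-in for an online-updated GMM baseline: the mean
    and covariance are re-estimated over a moving window of recent features."""

    def __init__(self, window=50):
        self.window = deque(maxlen=window)  # old samples fall out automatically

    def update(self, feature):
        self.window.append(np.asarray(feature, dtype=float))

    def mahalanobis(self, feature):
        X = np.array(self.window)
        mu = X.mean(axis=0)
        cov = np.cov(X.T) + 1e-6 * np.eye(X.shape[1])  # regularized covariance
        d = np.asarray(feature, dtype=float) - mu
        return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

rng = np.random.default_rng(1)
baseline = MovingGaussianBaseline(window=50)

# healthy features drift slowly with the (time-varying) boundary condition
for t in range(200):
    baseline.update(rng.normal(loc=[0.01 * t, 0.0], scale=0.2))

healthy = baseline.mahalanobis([0.01 * 200, 0.0])         # follows the drift
damaged = baseline.mahalanobis([0.01 * 200 + 2.0, 1.5])   # novel feature
```

A full GMM version would carry several weighted components through the same window update, which is what lets the paper's method track multimodal feature distributions.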

  1. A Systematic Approach to Modelling Change Processes in Construction Projects

    Directory of Open Access Journals (Sweden)

    Ibrahim Motawa

    2012-11-01

    Full Text Available Modelling change processes within construction projects is essential to implement changes efficiently. Incomplete information on the project variables at the early stages of projects leads to inadequate knowledge of future states and imprecision arising from ambiguity in project parameters. This lack of knowledge is considered among the main sources of change in construction. Change identification and evaluation, in addition to predicting its impacts on project parameters, can help in minimising the disruptive effects of changes. This paper presents a systematic approach to modelling the change process within construction projects that helps improve change identification and evaluation. The approach represents the key decisions required to implement changes. The requirements of an effective change process are presented first. The variables defined for efficient change assessment and diagnosis are then presented. Assessment of construction changes requires an analysis of the project characteristics that lead to change and also an analysis of the relationship between the change causes and effects. The paper concludes that, at the early stages of a project, projects with a high likelihood of change occurrence should have a control mechanism over the project characteristics that have a high influence on the project. It also concludes that, for the relationship between change causes and effects, the multiple causes of change should be modelled in a way that enables evaluating the change effects more accurately. The proposed approach is the framework for tackling such conclusions and can be used for evaluating change cases depending on the available information at the early stages of construction projects.

  2. Physics-Based Pneumatic Hammer Instability Model Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this project is to develop a physics-based pneumatic hammer instability model that accurately predicts the stability of hydrostatic bearings...

  3. Automation of Safety Analysis with SysML Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This project was a small proof-of-concept case study, generating SysML model information as a side effect of safety analysis. A prototype FMEA Assistant was...

  4. Final Project Report Load Modeling Transmission Research

    Energy Technology Data Exchange (ETDEWEB)

    Lesieutre, Bernard [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bravo, Richard [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Yinger, Robert [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Chassin, Dave [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Huang, Henry [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lu, Ning [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hiskens, Ian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Venkataramanan, Giri [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-03-31

    The research presented in this report focuses primarily on improving power system load models to better represent their impact on system behavior. The previous standard load model fails to capture the delayed voltage recovery events that are observed in the Southwest and elsewhere. These events are attributed to air conditioner units stalling after a fault. To gain a better understanding of their role in these events and to guide modeling efforts, typical air conditioner units were tested in laboratories. Using data obtained from these extensive tests, new load models were developed to match air conditioner behavior. An air conditioner model is incorporated in the new WECC composite load model. These models are used in dynamic studies of the West and can impact power transfer limits for California. Unit-level and system-level solutions are proposed as potential remedies for the delayed voltage recovery problem.

  5. Reliability analysis and updating of deteriorating systems with subset simulation

    DEFF Research Database (Denmark)

    Schneider, Ronald; Thöns, Sebastian; Straub, Daniel

    2017-01-01

    Bayesian updating of the system deterioration model. The updated system reliability is then obtained through coupling the updated deterioration model with a probabilistic structural model. The underlying high-dimensional structural reliability problems are solved using subset simulation, which...

  6. Numerical modeling in photonic crystals integrated technology: the COPERNICUS Project

    DEFF Research Database (Denmark)

    Malaguti, Stefania; Armaroli, Andrea; Bellanca, Gaetano

    2011-01-01

    Photonic crystals will play a fundamental role in the future of optical communications. The relevance of the numerical modeling for the success of this technology is assessed by using some examples concerning the experience of the COPERNICUS Project.

  7. Numerical modeling in photonic crystals integrated technology: the COPERNICUS Project

    DEFF Research Database (Denmark)

    Malaguti, Stefania; Armaroli, Andrea; Bellanca, Gaetano

    2011-01-01

    Photonic crystals will play a fundamental role in the future of optical communications. The relevance of the numerical modeling for the success of this technology is assessed by using some examples concerning the experience of the COPERNICUS Project.

  8. On Helical Projection and Its Application in Screw Modeling

    Directory of Open Access Journals (Sweden)

    Riliang Liu

    2014-04-01

    Full Text Available As helical surfaces, in their many and varied forms, are finding more and more applications in engineering, new approaches to their efficient design and manufacture are desired. To that end, the helical projection method that uses curvilinear projection lines to map a space object to a plane is examined in this paper, focusing on its mathematical model and characteristics in terms of graphical representation of helical objects. A number of interesting projective properties are identified in regard to straight lines, curves, and planes, and then the method is further investigated with respect to screws. The result shows that the helical projection of a cylindrical screw turns out to be a Jordan curve, which is determined by the screw's axial profile and number of flights. Based on the projection theory, a practical approach to the modeling of screws and helical surfaces is proposed and illustrated with examples, and its possible application in screw manufacturing is discussed.

  9. INFORMATIONAL-ANALYTIC MODEL OF REGIONAL PROJECT PORTFOLIO FORMING

    Directory of Open Access Journals (Sweden)

    I. A. Osaulenko

    2016-01-01

    Full Text Available The article addresses the problem of regional project portfolio management in the context of interaction among the motive forces of regional development. The features of innovation development at the regional level and their influence on the portfolio-forming process are considered; existing approaches to portfolio modelling and formal criteria for project selection are analyzed, along with the organization of interaction among the key subjects of regional development. The aim of the article is to investigate the informational aspects of project selection in the process of interaction of the main motive forces of development and to validate an analytic model of portfolio filling, taking into account the stakeholders' inclination to reach consensus. The Triple Helix conception is used to define more concretely the functions of the motive forces of regional development. It is asserted that any component of the innovation triad «science–business–government» can initiate a regional project, but it needs the support of the two other components. The theory of non-power interaction is proposed for investigating the interrelations of subjects in the course of joint activity. One of the key concepts of this theory is information distance, which characterizes the parties' inclination to reach consensus based on statistics. Projections of the information distance onto the development axes are proposed for a more accurate definition of mutual positions along all lines of development. Another important model parameter influencing project support is the stakeholders' awareness of the project; a formalized description of the project as a set of parameters is proposed for determining this awareness. Weighting coefficients for each parameter are assigned by experts, and the precision with which each parameter is specified is determined for all presented projects. On the basis of the appointed values of information distances and

  10. Parameter selection and stochastic model updating using perturbation methods with parameter weighting matrix assignment

    Science.gov (United States)

    Abu Husain, Nurulakmar; Haddad Khodaparast, Hamed; Ouyang, Huajiang

    2012-10-01

    Parameterisation in stochastic problems is a major issue in real applications. In addition, complexity of test structures (for example, those assembled through laser spot welds) is another challenge. The objective of this paper is two-fold: (1) stochastic uncertainty in two sets of different structures (i.e., simple flat plates, and more complicated formed structures) is investigated to observe how updating can be adequately performed using the perturbation method, and (2) stochastic uncertainty in a set of welded structures is studied by using two parameter weighting matrix approaches. Different combinations of parameters are explored in the first part; it is found that geometrical features alone cannot converge the predicted outputs to the measured counterparts, hence material properties must be included in the updating process. In the second part, statistical properties of experimental data are considered and updating parameters are treated as random variables. Two weighting approaches are compared; results from one of the approaches are in very good agreement with the experimental data and excellent correlation between the predicted and measured covariances of the outputs is achieved. It is concluded that proper selection of parameters in solving stochastic updating problems is crucial. Furthermore, appropriate weighting must be used in order to obtain excellent convergence between the predicted mean natural frequencies and their measured data.

  11. Leaf Area Index in Earth System Models: evaluation and projections

    Directory of Open Access Journals (Sweden)

    N. Mahowald

    2015-04-01

    Full Text Available The amount of leaves in a plant canopy (measured as leaf area index, LAI) modulates key land–atmosphere interactions, including the exchange of energy, moisture, carbon dioxide (CO2), and other trace gases, and is therefore an essential variable in predicting terrestrial carbon, water, and energy fluxes. The latest generation of Earth system models (ESMs) simulate LAI, as well as provide projections of LAI in the future to improve simulations of biophysical and biogeochemical processes, and for use in climate impact studies. Here we use satellite measurements of LAI to answer the following questions: (1) are the models accurately simulating the mean LAI spatial distribution? (2) Are the models accurately simulating the seasonal cycle in LAI? (3) Are the models correctly simulating the processes driving interannual variability in the current climate? And finally, based on this analysis, (4) can we reduce the uncertainty in future projections of LAI by using each model's skill in the current climate? Overall, models are able to capture some of the main characteristics of the LAI mean and seasonal cycle, but all of the models can be improved in one or more regions. Comparison of the modeled and observed interannual variability in the current climate suggested that in high latitudes the models may overpredict increases in LAI based on warming temperature, while in the tropics the models may overpredict the negative impacts of warming temperature on LAI. We expect, however, larger uncertainties in observational estimates of interannual LAI compared to estimates of seasonal or mean LAI. Future projections of LAI by the ESMs are largely optimistic, with only limited regions seeing reductions in LAI. Future projections of LAI in the models are quite different, and are sensitive to climate model projections of precipitation. They also strongly depend on the amount of carbon dioxide fertilization in high latitudes. Based on comparisons between model simulated

  12. Process simulation and parametric modeling for strategic project management

    CERN Document Server

    Morales, Peter J

    2013-01-01

    Process Simulation and Parametric Modeling for Strategic Project Management offers CIOs, CTOs, software development managers, and IT graduate students an introduction to a set of technologies that will help them understand how to better plan software development projects, manage risk, and gain better insight into the complexities of the software development process. A novel methodology is introduced that allows a software development manager to better plan and assess risks in the early planning of a project. By providing a better model for early software development estimation and softw

  13. Variable Fidelity Aeroelastic Toolkit - Structural Model Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed innovation is a methodology to incorporate variable fidelity structural models into steady and unsteady aeroelastic and aeroservoelastic analyses in...

  14. Computational Models for Nonlinear Aeroelastic Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Clear Science Corp. and Duke University propose to develop and demonstrate new and efficient computational methods of modeling nonlinear aeroelastic systems. The...

  15. A Regional Climate Model Evaluation System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Develop a packaged data management infrastructure for the comparison of generated climate model output to existing observational datasets that includes capabilities...

  16. Geometry control of long-span continuous girder concrete bridge during construction through finite element model updating

    Science.gov (United States)

    Wu, Jie; Yan, Quan-sheng; Li, Jian; Hu, Min-yi

    2016-04-01

    In bridge construction, geometry control is critical to ensure that the final constructed bridge has a shape consistent with the design. A common method is to predict the deflections of the bridge during each construction phase through the associated finite element models, so that the cambers of the bridge during different construction phases can be determined beforehand. These finite element models are mostly based on the design drawings and nominal material properties. However, the errors in these bridge models can be large due to significant uncertainties in the actual properties of the materials used in construction. Therefore, the predicted cambers may not be accurate enough to ensure agreement of the bridge geometry with the design, especially for long-span bridges. In this paper, an improved geometry control method is described, which incorporates finite element (FE) model updating during the construction process based on measured bridge deflections. A method based on the Kriging model and Latin hypercube sampling is proposed to perform the FE model updating due to its simplicity and efficiency. The proposed method has been applied to a long-span continuous girder concrete bridge during its construction. Results show that the method is effective in reducing construction error and ensuring the accuracy of the geometry of the final constructed bridge.
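
    The surrogate-based updating idea in this abstract can be sketched in a few lines: sample candidate parameters with Latin hypercube sampling, train a Kriging (Gaussian-process) surrogate on a handful of expensive model runs, then search the cheap surrogate for the parameters that best reproduce a measured response. The sketch below stands in a toy beam-deflection formula for the FE model; all numbers, the parameter choices, and the RBF kernel are illustrative assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def lhs(n, bounds):
    # Latin hypercube sample: one point per stratum in each dimension.
    d = len(bounds)
    strata = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T
    u = (strata + rng.random((n, d))) / n
    lo, hi = np.array(bounds).T
    return lo + u * (hi - lo)

def fe_deflection(theta):
    # Stand-in for the expensive FE run: midspan deflection of a simply
    # supported beam, delta = 5 q L^4 / (384 E I), with theta = (E, I).
    E, I = theta[..., 0], theta[..., 1]
    q, L = 12e3, 40.0
    return 5 * q * L**4 / (384 * E * I)

def kriging_fit(X, y, length=0.3, nugget=1e-8):
    # Interpolating Gaussian-process (Kriging) surrogate with an RBF kernel,
    # trained on inputs scaled to the unit square.
    sq = lambda A, B: ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    w = np.linalg.solve(np.exp(-sq(X, X) / (2 * length**2))
                        + nugget * np.eye(len(X)), y)
    return lambda Xq: np.exp(-sq(Xq, X) / (2 * length**2)) @ w

# 1) Design of experiments over the uncertain parameters (E in Pa, I in m^4).
bounds = [(25e9, 40e9), (0.8, 1.4)]
X = lhs(30, bounds)
lo, hi = np.array(bounds).T
surrogate = kriging_fit((X - lo) / (hi - lo), fe_deflection(X))

# 2) Updating step: keep the candidate whose predicted deflection best
#    matches the measured one (synthesized here from "true" parameters).
measured = fe_deflection(np.array([32e9, 1.1]))
grid = lhs(4000, bounds)
best = grid[np.argmin(np.abs(surrogate((grid - lo) / (hi - lo)) - measured))]
print(best)
```

    In practice the deflection would be measured on the bridge and the surrogate would be rebuilt at each construction phase; the appeal of the Kriging-plus-LHS combination is that only a few dozen FE runs are needed per update.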

  17. Systemic change increases model projection uncertainty

    NARCIS (Netherlands)

    Verstegen, Judith; Karssenberg, Derek; van der Hilst, Floortje; Faaij, André

    2014-01-01

    Most spatio-temporal models are based on the assumption that the relationship between system state change and its explanatory processes is stationary. This means that model structure and parameterization are usually kept constant over time, ignoring potential systemic changes in this relationship re

  18. A MODEL FOR ALIGNING SOFTWARE PROJECTS REQUIREMENTS WITH PROJECT TEAM MEMBERS REQUIREMENTS

    Directory of Open Access Journals (Sweden)

    Robert Hans

    2013-02-01

    Full Text Available The fast-paced, dynamic environment within which information and communication technology (ICT) projects are run, as well as ICT professionals' constantly changing requirements, present a challenge for project managers in terms of aligning projects' requirements with project team members' requirements. This research paper argues that if projects' requirements are properly aligned with team members' requirements, the result is a balanced decision approach; moreover, such an alignment leads to the realization of employees' needs as well as meeting the project's needs. This paper presents a Project's requirements and project Team members' requirements (PrTr) alignment model and argues that a balanced decision which meets both software project's requirements and team members' requirements can be achieved through the application of the PrTr alignment model.

  19. Proposal of New PRORISK Model for GSD Projects

    Directory of Open Access Journals (Sweden)

    M. Rizwan Jameel Qureshi

    2015-05-01

    Full Text Available The level of complexity and the risks associated with software are increasing exponentially because of the competitive environment, especially in geographically distributed projects. Global software development (GSD) faces challenges such as distance, communication and coordination, and the communication and coordination challenges are the main causes of failure in GSD. Project Oriented Risk Management (PRORISK) is one of the models that address the importance of risk management and project management processes in standard software projects. However, the existing model is not designed to handle GSD-associated risks, which warrants the proposal of a new PRORISK model to manage the risks of GSD. A survey is used as the research design to validate the proposed solution. We anticipate that the proposed solution will help software companies to mitigate the risks associated with GSD.

  20. Service Oriented Spacecraft Modeling Environment Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The I-Logix team proposes development of the Service Oriented Spacecraft Modeling Environment (SOSME) to allow faster and more effective spacecraft system design...

  1. Generalized Reduced Order Model Generation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — M4 Engineering proposes to develop a generalized reduced order model generation method. This method will allow for creation of reduced order aeroservoelastic state...

  2. Computational Models for Nonlinear Aeroelastic Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Clear Science Corp. and Duke University propose to develop and demonstrate a new and efficient computational method of modeling nonlinear aeroelastic systems. The...

  3. Multiscale Modeling of Hall Thrusters Project

    Data.gov (United States)

    National Aeronautics and Space Administration — New multiscale modeling capability for analyzing advanced Hall thrusters is proposed. This technology offers NASA the ability to reduce development effort of new...

  4. Crew Autonomy Measures and Models (CAMM) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — SA Technologies will employ a two-part solution including measures and models for evaluating crew autonomy in exploratory space missions. An integrated measurement...

  5. The geothermal energy potential in Denmark - updating the database and new structural and thermal models

    Science.gov (United States)

    Nielsen, Lars Henrik; Sparre Andersen, Morten; Balling, Niels; Boldreel, Lars Ole; Fuchs, Sven; Leth Hjuler, Morten; Kristensen, Lars; Mathiesen, Anders; Olivarius, Mette; Weibel, Rikke

    2017-04-01

    Knowledge of the structural, hydraulic and thermal conditions of the subsurface is fundamental for the planning and use of hydrothermal energy. In the framework of a project under the Danish research program 'Sustainable Energy and Environment', funded by the 'Danish Agency for Science, Technology and Innovation', fundamental geological and geophysical information of importance for the utilization of geothermal energy in Denmark was compiled, analyzed and re-interpreted. A 3D geological model was constructed and used as the structural basis for the development of a national subsurface temperature model. In that frame, all available reflection seismic data were interpreted, quality controlled and integrated to improve the regional structural understanding. The analysis and interpretation of the available relevant data (i.e. old and new seismic profiles, core and well-log data, literature data) and a new time-depth conversion allowed a consistent correlation of seismic surfaces across the whole of Denmark and across tectonic features. On this basis, new topologically consistent depth and thickness maps for 16 geological units from the top pre-Zechstein to the surface were drawn, and a new 3D structural geological model was developed with special emphasis on potential geothermal reservoirs. The interpretation of petrophysical data (core data and well-logs) makes it possible to evaluate the hydraulic and thermal properties of potential geothermal reservoirs and to develop a parameterized numerical 3D conductive subsurface temperature model. Reservoir properties and quality were estimated by integrating petrography and diagenesis studies with porosity-permeability data. Detailed interpretation of the reservoir quality of the geological formations was made by estimating net reservoir sandstone thickness based on well-log analysis, determination of mineralogy including sediment provenance analysis, and burial history data. 
New local surface heat-flow values (range: 64-84 mW/m2) were determined for the Danish

  6. Discovering Plate Boundaries Update: Builds Content Knowledge and Models Inquiry-based Learning

    Science.gov (United States)

    Sawyer, D. S.; Pringle, M. S.; Henning, A. T.

    2009-12-01

    Discovering Plate Boundaries (DPB) is a jigsaw-structured classroom exercise in which students explore the fundamental datasets from which plate boundary processes were discovered. The exercise has been widely used in the past ten years as a classroom activity for students in fifth grade through high school, and for Earth Science major and general education courses in college. Perhaps more importantly, the exercise has been used extensively for professional development of in-service and pre-service K-12 science teachers, where it simultaneously builds content knowledge in plate boundary processes (including natural hazards), models an effective data-rich, inquiry-based pedagogy, and provides a set of lesson plans and materials which teachers can port directly into their own classroom (see Pringle, et al, this session for a specific example). DPB is based on 4 “specialty” data maps, 1) earthquake locations, 2) modern volcanic activity, 3) seafloor age, and 4) topography and bathymetry, plus a fifth map of (undifferentiated) plate boundary locations. The jigsaw is structured so that students are first split into one of the four “specialties,” then re-arranged into groups with each of the four specialties to describe the boundaries of a particular plate. We have taken the original DPB materials, used the latest digital data sets to update all the basic maps, and expanded the opportunities for further student and teacher learning. The earthquake maps now cover the recent period including the deadly Banda Aceh event. The topography/bathymetry map now has global coverage and uses ice-free elevations, which can, for example, extend to further inquiry about mantle viscosity and loading processes (why are significant portions of the bedrock surface of Greenland and Antarctica below sea level?). The volcanic activity map now differentiates volcano type and primary volcanic lithology, allowing a more elaborate understanding of volcanism at different plate boundaries

  7. Land Boundary Conditions for the Goddard Earth Observing System Model Version 5 (GEOS-5) Climate Modeling System: Recent Updates and Data File Descriptions

    Science.gov (United States)

    Mahanama, Sarith P.; Koster, Randal D.; Walker, Gregory K.; Takacs, Lawrence L.; Reichle, Rolf H.; De Lannoy, Gabrielle; Liu, Qing; Zhao, Bin; Suarez, Max J.

    2015-01-01

    The Earth's land surface boundary conditions in the Goddard Earth Observing System version 5 (GEOS-5) modeling system were updated using recent high spatial and temporal resolution global data products. The updates include: (i) construction of a global 10-arcsec land-ocean lakes-ice mask; (ii) incorporation of a 10-arcsec Globcover 2009 land cover dataset; (iii) implementation of Level 12 Pfafstetter hydrologic catchments; (iv) use of hybridized SRTM global topography data; (v) construction of the HWSDv1.21-STATSGO2 merged global 30-arcsec soil mineral and carbon data in conjunction with a highly refined soil classification system; (vi) production of diffuse visible and near-infrared 8-day MODIS albedo climatologies at 30-arcsec from the period 2001-2011; and (vii) production of the GEOLAND2 and MODIS merged 8-day LAI climatology at 30-arcsec for GEOS-5. The global data sets were preprocessed and used to construct global raster data files for the software (mkCatchParam) that computes parameters on catchment-tiles for various atmospheric grids. The updates also include a few bug fixes in mkCatchParam, as well as changes (improvements in algorithms, etc.) to mkCatchParam that allow it to produce tile-space parameters efficiently for high resolution AGCM grids. The update process also includes the construction of data files describing the vegetation type fractions, soil background albedo, nitrogen deposition and mean annual 2m air temperature to be used with the future Catchment CN model and the global stream channel network to be used with the future global runoff routing model. This report provides detailed descriptions of the data production process and data file format of each updated data set.

  8. Combining Multi-Source Remotely Sensed Data and a Process-Based Model for Forest Aboveground Biomass Updating.

    Science.gov (United States)

    Lu, Xiaoman; Zheng, Guang; Miller, Colton; Alvarado, Ernesto

    2017-09-08

    Monitoring and understanding the spatio-temporal variations of forest aboveground biomass (AGB) is a key basis to quantitatively assess the carbon sequestration capacity of a forest ecosystem. To map and update forest AGB in the Greater Khingan Mountains (GKM) of China, this work proposes a physical-based approach. Based on the baseline forest AGB from Landsat Enhanced Thematic Mapper Plus (ETM+) images in 2008, we dynamically updated the annual forest AGB from 2009 to 2012 by adding the annual AGB increment (ABI) obtained from the simulated daily and annual net primary productivity (NPP) using the Boreal Ecosystem Productivity Simulator (BEPS) model. The 2012 result was validated by both field- and aerial laser scanning (ALS)-based AGBs. The predicted forest AGB for 2012 estimated from the process-based model can explain 31% (n = 35, p < 0.05, RMSE = 2.20 kg/m²) and 85% (n = 100, p < 0.01, RMSE = 1.71 kg/m²) of variation in field- and ALS-based forest AGBs, respectively. However, due to the saturation of optical remote sensing-based spectral signals and contribution of understory vegetation, the BEPS-based AGB tended to underestimate/overestimate the AGB for dense/sparse forests. Generally, our results showed that the remotely sensed forest AGB estimates could serve as the initial carbon pool to parameterize the process-based model for NPP simulation, and the combination of the baseline forest AGB and BEPS model could effectively update the spatiotemporal distribution of forest AGB.
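
    The updating scheme described here, a baseline AGB map plus an annual biomass increment derived from simulated NPP, reduces to a simple recurrence. The sketch below is a minimal illustration under an assumed fixed aboveground allocation fraction; the BEPS model itself, and the actual increment calculation used in the paper, are not reproduced.

```python
import numpy as np

def update_agb(agb_baseline, annual_npp, base_year=2008, allocation=0.45):
    # Roll the baseline AGB map forward: each year's map is the previous one
    # plus the aboveground biomass increment (ABI), taken here as a fixed
    # aboveground fraction of that year's simulated NPP (assumed value).
    maps = {base_year: np.asarray(agb_baseline, dtype=float)}
    year = base_year
    for npp in annual_npp:
        year += 1
        maps[year] = maps[year - 1] + allocation * np.asarray(npp)
    return maps

# Toy 2x2 "maps": AGB in kg/m^2, NPP in kg C/m^2/yr for 2009-2012.
baseline = [[6.0, 4.5], [5.2, 3.8]]
npp_by_year = [np.full((2, 2), 0.6)] * 4
maps = update_agb(baseline, npp_by_year)
print(maps[2012])
```

    The same recurrence applies pixel-by-pixel to full rasters; only the baseline map (from Landsat ETM+) and the NPP maps (from BEPS) change.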

  9. Projected increase in total knee arthroplasty in the United States - an alternative projection model.

    Science.gov (United States)

    Inacio, M C S; Paxton, E W; Graves, S E; Namba, R S; Nemes, S

    2017-08-08

    The purpose of our study was to estimate the future incidence rate (IR) and volume of primary total knee arthroplasty (TKA) in the United States from 2015 to 2050 using a conservative projection model that assumes a maximum IR of procedures. Furthermore, our study compared these projections to a model assuming exponential growth, as done in previous studies, for illustrative purposes. A population based epidemiological study was conducted using data from US National Inpatient Sample (NIS) and Census Bureau. Primary TKA procedures performed between 1993 and 2012 were identified. The IR, 95% confidence intervals (CI), or prediction intervals (PI) of TKA per 100,000 US citizens over the age of 40 years were calculated. The estimated IR was used as the outcome of a regression modelling with a logistic regression (i.e., conservative model) and Poisson regression equation (i.e., exponential growth model). Logistic regression modelling suggests the IR of TKA is expected to increase 69% by 2050 compared to 2012, from 429 (95%CI 374-453) procedures/100,000 in 2012 to 725 (95%PI 121-1041) in 2050. This translates into a 143% projected increase in TKA volume. Using the Poisson model, the IR in 2050 was projected to increase 565%, to 2854 (95%CI 2278-4004) procedures/100,000 IR, which is an 855% projected increase in volume compared to 2012. Even after using a conservative projection approach, the number of TKAs in the US, which already has the highest IR of knee arthroplasty in the world, is expected to increase 143% by 2050. Copyright © 2017 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.
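
    The contrast between the two projection approaches can be illustrated with synthetic data: fit a saturating (logistic) curve and a log-linear (exponential) trend to the same historical series, then extrapolate both to 2050. The numbers below are invented for illustration, not the NIS incidence rates; only the qualitative divergence of the two models mirrors the study.

```python
import numpy as np

def logistic(t, K, r, t0):
    # Saturating growth: the incidence rate levels off at a ceiling K.
    return K / (1 + np.exp(-r * (t - t0)))

# Illustrative incidence rates per 100,000 (synthetic, not the NIS data),
# generated from a saturating curve so the two models visibly diverge.
t = np.arange(20.0)                       # years since 1993
ir = logistic(t, 750.0, 0.08, 22.0)

# Exponential projection: log-linear fit, analogous to a Poisson trend.
slope, intercept = np.polyfit(t, np.log(ir), 1)

# Logistic projection: for a trial ceiling K, logit(ir/K) is linear in t,
# so grid-search K and keep the best linear fit.
best = None
for K in np.linspace(400.0, 1200.0, 801):
    if K <= ir.max():
        continue
    z = np.log(ir / (K - ir))
    r, c = np.polyfit(t, z, 1)
    resid = np.sum((z - (r * t + c)) ** 2)
    if best is None or resid < best[0]:
        best = (resid, K, r, c)
_, K, r, c = best

t2050 = 2050 - 1993
proj_logistic = K / (1 + np.exp(-(r * t2050 + c)))
proj_exponential = np.exp(intercept + slope * t2050)
print(round(proj_logistic), round(proj_exponential))
```

    Even on the same history, the exponential extrapolation runs several times higher by 2050 than the saturating one, which is the point the study makes with its conservative model.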

  10. WCRF-AICR continuous update project: Systematic literature review of prospective studies on circulating 25-hydroxyvitamin D and kidney cancer risk.

    Science.gov (United States)

    Darling, Andrea L; Abar, Leila; Norat, Teresa

    2016-11-01

    As part of the World Cancer Research Fund/American Institute for Cancer Research (WCRF-AICR) Continuous Update project we performed a systematic review of prospective studies with data for both measured or predicted 25(OH)D concentration and kidney cancer risk. PubMed was searched from inception until 1st December 2014 using WCRF/AICR search criteria. The search identified 4 papers suitable for inclusion, reporting data from three prospective cohort studies, one nested case-control study and the Vitamin D Pooling Project of Rarer Cancers (8 nested case-control studies). Summary effect sizes could not be computed due to incompatibility between studies. All studies except the Pooling Project suggested a reduced risk of kidney cancer by 19-40% with higher or adequate vitamin D status. However, these estimates only reached statistical significance in one cohort (Copenhagen City Heart Study; CCHS, HR=0.75 (0.58 to 0.96)). In the European Prospective Investigation into Cancer and Nutrition (EPIC) study, a significant reduction in risk by 18% was seen when using combined matched and non-matched controls OR=0.82 (0.68, 0.99), but not when using only matched controls (OR=0.81 (0.65, 1.00)). Pooled (but not single cohort) data for predicted 25(OH)D from the Nurses' Health Study (NHS) and Health Professionals Follow-up Study (HPFS) showed a statistically significant reduction in risk by 37% (HR=0.63 (0.44, 0.91)). There is no clear explanation for the inconsistency of results between studies, but reasons may include prevalence of smoking or other study population characteristics. Methods for assessing circulating 25(OH)D levels and control for confounders including seasonality or hypertension do not seem explanatory. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Applying a Hybrid MCDM Model for Six Sigma Project Selection

    Directory of Open Access Journals (Sweden)

    Fu-Kwun Wang

    2014-01-01

    Full Text Available Six Sigma is a project-driven methodology; the projects that provide the maximum financial benefits and other impacts to the organization must be prioritized. Project selection (PS) is a type of multiple criteria decision making (MCDM) problem. In this study, we present a hybrid MCDM model combining the decision-making trial and evaluation laboratory (DEMATEL) technique, the analytic network process (ANP), and the VIKOR method to evaluate and improve Six Sigma projects, reducing performance gaps in each criterion and dimension. We consider the film printing industry of Taiwan as an empirical case. The results show that our approach not only selects the best project, but can also be used to analyze the gaps between existing performance values and aspiration levels for each dimension and criterion, based on the influential network relation map.
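
    Of the three methods combined in the paper, VIKOR is the final ranking step, and its core computation is compact: per-criterion regrets are weighted and aggregated into a group utility S, an individual (worst-criterion) regret R, and a compromise index Q. The sketch below assumes benefit-type criteria and uses illustrative scores and weights; in the paper the weights come from DEMATEL and ANP, which are not reproduced here.

```python
import numpy as np

def vikor(f, w, v=0.5):
    # VIKOR for benefit-type criteria: weighted normalized regrets are
    # aggregated into group utility S, individual regret R, and index Q.
    f_best, f_worst = f.max(axis=0), f.min(axis=0)
    d = w * (f_best - f) / (f_best - f_worst)
    S, R = d.sum(axis=1), d.max(axis=1)
    Q = (v * (S - S.min()) / (S.max() - S.min())
         + (1 - v) * (R - R.min()) / (R.max() - R.min()))
    return S, R, Q

# Four candidate Six Sigma projects scored on three benefit criteria
# (illustrative scores; in the paper the weights come from DEMATEL/ANP).
f = np.array([[7.0, 5.0, 8.0],
              [9.0, 6.0, 6.0],
              [6.0, 8.0, 7.0],
              [8.0, 7.0, 9.0]])
w = np.array([0.5, 0.3, 0.2])
S, R, Q = vikor(f, w)
print(np.argsort(Q))    # ranking, best (smallest Q) first
```

    The weight v trades off group utility against the worst single-criterion regret; v = 0.5 is the usual compromise setting.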

  12. Ensemble of regional climate model projections for Ireland

    Science.gov (United States)

    Nolan, Paul; McGrath, Ray

    2016-04-01

    The method of Regional Climate Modelling (RCM) was employed to assess the impacts of a warming climate on the mid-21st-century climate of Ireland. The RCM simulations were run at high spatial resolution, up to 4 km, thus allowing a better evaluation of the local effects of climate change. Simulations were run for a reference period 1981-2000 and future period 2041-2060. Differences between the two periods provide a measure of climate change. To address the issue of uncertainty, a multi-model ensemble approach was employed. Specifically, the future climate of Ireland was simulated using three different RCMs, driven by four Global Climate Models (GCMs). To account for the uncertainty in future emissions, a number of SRES (B1, A1B, A2) and RCP (4.5, 8.5) emission scenarios were used to simulate the future climate. Through the ensemble approach, the uncertainty in the RCM projections can be partially quantified, thus providing a measure of confidence in the predictions. In addition, likelihood values can be assigned to the projections. The RCMs used in this work are the COnsortium for Small-scale MOdeling-Climate Limited-area Modelling (COSMO-CLM, versions 3 and 4) model and the Weather Research and Forecasting (WRF) model. The GCMs used are the Max Planck Institute's ECHAM5, the UK Met Office's HadGEM2-ES, the CGCM3.1 model from the Canadian Centre for Climate Modelling and the EC-Earth consortium GCM. The projections for mid-century indicate an increase of 1-1.6°C in mean annual temperatures, with the largest increases seen in the east of the country. Warming is enhanced for the extremes (i.e. hot or cold days), with the warmest 5% of daily maximum summer temperatures projected to increase by 0.7-2.6°C. The coldest 5% of night-time temperatures in winter are projected to rise by 1.1-3.1°C. Averaged over the whole country, the number of frost days is projected to decrease by over 50%. The projections indicate an average increase in the length of the growing season

  13. Benefits of a Cohort Survival Projection Model

    Science.gov (United States)

    Suslow, Sidney

    1977-01-01

    A cohort survival model of student attendance provides primary and secondary benefits in the form of accurate student information not previously available. At Berkeley the computerized Cohort Survival History File, in use for two years, has been successful in assessing various aspects of students' academic behavior and student flow problems. (Editor/LBH)
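
    The kind of cohort survival bookkeeping described can be sketched as a simple progression-ratio projection; the ratios and entering-class size below are hypothetical, not Berkeley's figures:

```python
# Hypothetical cohort survival projection: each year, a fixed fraction of
# students in program year g "survives" (continues) into year g+1.
survival = [0.90, 0.85, 0.95]  # year 1->2, 2->3, 3->4 progression ratios
entering_class = 1000          # new first-year students each fall

def project_enrollment(years: int) -> list[list[int]]:
    """Project enrollment by year-in-program over a number of academic years."""
    cohort = [entering_class, 0, 0, 0]
    history = [cohort[:]]
    for _ in range(years):
        nxt = [entering_class]
        for g, ratio in enumerate(survival):
            nxt.append(round(cohort[g] * ratio))
        cohort = nxt
        history.append(cohort[:])
    return history

for row in project_enrollment(4):
    print(row)
```

    After a few years the projection reaches a steady state, which is what makes this simple model useful for studying student flow.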

  14. Modeling Change in Project Duration and Completion

    DEFF Research Database (Denmark)

    Wiltshire, Travis; Butner, Jonathan E.; Pirtle, Zachary

    2017-01-01

    In complex work domains and organizations, understanding scheduling dynamics can ensure objectives are reached and delays are mitigated. In the current paper, we examine the scheduling dynamics for NASA’s Exploration Flight Test 1 (EFT-1) activities. For this examination, we specifically modeled...

  15. Probability of Inconsistencies in Theory Revision: A multi-agent model for updating logically interconnected beliefs under bounded confidence

    CERN Document Server

    Wenmackers, Sylvia; Douven, Igor

    2014-01-01

    We present a model for studying communities of epistemically interacting agents who update their belief states by averaging (in a specified way) the belief states of other agents in the community. The agents in our model have a rich belief state, involving multiple independent issues which are interrelated in such a way that they form a theory of the world. Our main goal is to calculate the probability for an agent to end up in an inconsistent belief state due to updating (in the given way). To that end, an analytical expression is given and evaluated numerically, both exactly and using statistical sampling. It is shown that, under the assumptions of our model, an agent always has a probability of less than 2% of ending up in an inconsistent belief state. Moreover, this probability can be made arbitrarily small by increasing the number of independent issues the agents have to judge or by increasing the group size. A real-world situation to which this model applies is a group of experts participating in a Delp...
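
    The authors' model tracks logically interconnected beliefs forming a theory of the world, but the underlying averaging-under-bounded-confidence dynamic can be illustrated with a minimal single-issue sketch in the Hegselmann-Krause style. All parameters here are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def bc_update(beliefs: np.ndarray, eps: float) -> np.ndarray:
    """One bounded-confidence step: each agent averages the beliefs of all
    agents (itself included) whose opinion lies within eps of its own."""
    new = np.empty_like(beliefs)
    for i, b in enumerate(beliefs):
        peers = beliefs[np.abs(beliefs - b) <= eps]
        new[i] = peers.mean()
    return new

rng = np.random.default_rng(1)
beliefs = rng.uniform(0, 1, size=20)  # 20 agents, one issue on [0, 1]
for _ in range(50):
    beliefs = bc_update(beliefs, eps=0.2)
print(np.unique(beliefs.round(3)))    # surviving opinion clusters
```

    Repeated averaging drives agents into a small number of opinion clusters; in the paper's richer setting, each agent holds a vector of interrelated beliefs, and the question is how often the averaged result is logically inconsistent.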

  16. Towards an Intelligent Project Based Organization Business Model

    Directory of Open Access Journals (Sweden)

    Alami Marrouni Oussama

    2013-01-01

    Full Text Available The global economy is undergoing a recession phase that has made competition tougher and imposed a new business framework. Businesses have to shift from classical management approaches to an Intelligent Project Based Organization (IPBO) model that provides flexibility and agility. The IPBO model is intended to reinforce the proven advantages of the Project Based Organization (PBO) by the use of suitable Enterprise Intelligence (EI) systems. The goal of this paper is to propose an IPBO model that combines the benefits of PBO and EI and helps overcome their pitfalls

  17. The Geoengineering Model Intercomparison Project (GeoMIP)

    KAUST Repository

    Kravitz, Ben

    2011-01-31

    To evaluate the effects of stratospheric geoengineering with sulphate aerosols, we propose standard forcing scenarios to be applied to multiple climate models to compare their results and determine the robustness of their responses. Thus far, different modeling groups have used different forcing scenarios for both global warming and geoengineering, complicating the comparison of results. We recommend four experiments to explore the extent to which geoengineering might offset climate change projected in some of the Coupled Model Intercomparison Project 5 (CMIP5) experiments. These experiments focus on stratospheric aerosols, but future experiments under this framework may focus on different means of geoengineering. © 2011 Royal Meteorological Society.

  18. Development and application of new quality model for software projects.

    Science.gov (United States)

    Karnavel, K; Dillibabu, R

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects.

  19. Observed and projected impacts of climate change on marine fisheries, aquaculture, coastal tourism, and human health: an update

    Directory of Open Access Journals (Sweden)

    Lauren V Weatherdon

    2016-04-01

    Full Text Available The Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) states that climate change and ocean acidification are altering the oceans at a rate that is unprecedented compared with the recent past, leading to multifaceted impacts on marine ecosystems, associated goods and services, and human societies. AR5 underlined key uncertainties that remain regarding how synergistic changes in the ocean are likely to affect human systems, and how humans are likely to respond to these events. As climate change research has accelerated rapidly following AR5, an updated synthesis of available knowledge is necessary to identify emerging evidence, and to thereby better inform policy discussions. This paper reviews the literature to capture corroborating, conflicting, and novel findings published following the cut-off date for contribution to AR5. Specifically, we highlight key scientific developments on the impacts of climate-induced changes in the ocean on key socioeconomic sectors, including fisheries, aquaculture and tourism. New evidence continues to support a climate-induced redistribution of benefits and losses at multiple scales and across coastal and marine socio-ecological systems, partly resulting from species and ecosystem range shifts and changes in primary productivity. New efforts have been made to characterize and value ecosystem services in the context of climate change, with specific relevance to ecosystem-based adaptation. Recent studies have also explored synergistic interactions between climatic drivers, and have found strong variability between impacts on species at different life stages. Although climate change may improve conditions for some types of freshwater aquaculture, potentially providing alternative opportunities to adapt to impacts on wild capture fisheries, ocean acidification poses a risk to shellfish fisheries and aquaculture. The risk of increased prevalence of disease under warmer temperatures is

  20. The Frontier Fields Lens Modeling Comparison Project

    CERN Document Server

    Meneghetti, M; Coe, D; Contini, E; De Lucia, G; Giocoli, C; Acebron, A; Borgani, S; Bradac, M; Diego, J M; Hoag, A; Ishigaki, M; Johnson, T L; Jullo, E; Kawamata, R; Lam, D; Limousin, M; Liesenborgs, J; Oguri, M; Sebesta, K; Sharon, K; Williams, L L R; Zitrin, A

    2016-01-01

    Gravitational lensing by clusters of galaxies offers a powerful probe of their structure and mass distribution. Deriving a lens magnification map for a galaxy cluster is a classic inversion problem and many methods have been developed over the past two decades to solve it. Several research groups have developed techniques independently to map the predominantly dark matter distribution in cluster lenses. While these methods have all provided remarkably high precision mass maps, particularly with exquisite imaging data from the Hubble Space Telescope (HST), the reconstructions themselves have never been directly compared. In this paper, we report the results of comparing various independent lens modeling techniques employed by individual research groups in the community. Here we present for the first time a detailed and robust comparison of methodologies for fidelity, accuracy and precision. For this collaborative exercise, the lens modeling community was provided simulated cluster images -- of two clusters Are...

  1. Cacao Intensification in Sulawesi: A Green Prosperity Model Project

    Energy Technology Data Exchange (ETDEWEB)

    Moriarty, K.; Elchinger, M.; Hill, G.; Katz, J.; Barnett, J.

    2014-09-01

    NREL conducted eight model projects for Millennium Challenge Corporation's (MCC) Compact with Indonesia. Green Prosperity, the largest project of the Compact, seeks to address critical constraints to economic growth while supporting the Government of Indonesia's commitment to a more sustainable, less carbon-intensive future. This study evaluates techniques to improve cacao farming in Sulawesi, Indonesia, with an emphasis on Farmer Field Schools and Cocoa Development Centers to educate farmers and to support train-the-trainer programs. The study estimates the economic viability of cacao farming if smallholders implement yield-increasing techniques, as well as the social and environmental impacts of the project.

  2. Human genome education model project. Ethical, legal, and social implications of the human genome project: Education of interdisciplinary professionals

    Energy Technology Data Exchange (ETDEWEB)

    Weiss, J.O. [Alliance of Genetic Support Groups, Chevy Chase, MD (United States); Lapham, E.V. [Georgetown Univ., Washington, DC (United States). Child Development Center

    1996-12-31

    This meeting was held June 10, 1996 at Georgetown University. The purpose of this meeting was to provide a multidisciplinary forum for exchange of state-of-the-art information on the human genome education model. Topics of discussion include the following: psychosocial issues; ethical issues for professionals; legislative issues and update; and education issues.

  3. Animal models of autism with a particular focus on the neural basis of changes in social behaviour: an update article.

    Science.gov (United States)

    Olexová, Lucia; Talarovičová, Alžbeta; Lewis-Evans, Ben; Borbélyová, Veronika; Kršková, Lucia

    2012-12-01

    Research on autism has been gaining more and more attention. However, its aetiology is not entirely known and several factors are thought to contribute to the development of this neurodevelopmental disorder. These potential contributing factors range from genetic heritability to environmental effects. A significant number of reviews have already been published on different aspects of autism research, including the use of animal models to help expand current knowledge around its aetiology. However, the diverse range of symptoms and possible causes of autism have resulted in an equally wide variety of animal models of autism. In this update article we focus only on the animal models with neurobehavioural characteristics of social deficit related to autism and present an overview of the animal models with alterations in brain regions, neurotransmitters, or hormones that are involved in a decrease in sociability.

  4. Updated Lagrangian finite element formulations of various biological soft tissue non-linear material models: a comprehensive procedure and review.

    Science.gov (United States)

    Townsend, Molly T; Sarigul-Klijn, Nesrin

    2016-01-01

    Simplified material models are commonly used in computational simulation of biological soft tissue as an approximation of the complicated material response and to minimize computational resources. However, the simulation of complex loadings, such as long-duration tissue swelling, necessitates complex models that are not easy to formulate. This paper strives to offer a comprehensive procedure for the updated Lagrangian formulation of various non-linear material models for the application of finite element analysis of biological soft tissues, including a definition of the Cauchy stress and the spatial tangential stiffness. The relationships between water content, osmotic pressure, ionic concentration and the pore pressure stress of the tissue are discussed, along with the merits of these models and their applications.

  5. Building Models from the Bottom Up: The HOBBES Project

    Science.gov (United States)

    Medellin-Azuara, J.; Sandoval Solis, S.; Lund, J. R.; Chu, W.

    2013-12-01

    Water problems are often bigger than the technical and data challenges associated with representing a water system using a model. Controversy and complexity are inherent when water is to be allocated among different uses, making it difficult to maintain coherent and productive discussions on addressing water problems. Quantification of a water supply system through models has proven helpful for improving understanding and for exploring and developing adaptable solutions to water problems. However, models often become too large and complex, and become hostages of endless discussions of their assumptions, their algorithms and their limitations. Data management, organization and documentation keep models flexible and useful over time. The UC Davis HOBBES project is a new approach, building models from the bottom up. Reversing the traditional model development, where data are arranged around a model algorithm, in Hobbes the data structure, organization and documentation are established first, followed by application of simulation or optimization modeling algorithms for a particular problem at hand. The HOBBES project establishes standards for storing, documenting and sharing datasets on the California water system. This allows models to be developed and modified more easily and transparently, with greater comparability. Elements in the database have a spatial definition and can aggregate several infrastructural elements into detailed to coarse representations of the water system. Elements in the database represent reservoirs, groundwater basins, pumping stations, hydropower and water treatment facilities, demand areas and conveyance infrastructure statewide. These elements also host time series, economic and other information from hydrologic, economic, climate and other models. This presentation provides an overview of the HOBBES project, its applications and prospects for California and elsewhere.

  6. Proposed best practice for projects that involve modelling and simulation.

    Science.gov (United States)

    O'Kelly, Michael; Anisimov, Vladimir; Campbell, Chris; Hamilton, Sinéad

    2017-03-01

    Modelling and simulation has been used in many ways when developing new treatments. To be useful and credible, it is generally agreed that modelling and simulation should be undertaken according to some kind of best practice. A number of authors have suggested elements required for best practice in modelling and simulation. Elements that have been suggested include the pre-specification of goals, assumptions, methods, and outputs. However, a project that involves modelling and simulation could be simple or complex and could be of relatively low or high importance to the project. It has been argued that the level of detail and the strictness of pre-specification should be allowed to vary, depending on the complexity and importance of the project. This best practice document does not prescribe how to develop a statistical model. Rather, it describes the elements required for the specification of a project and requires that the practitioner justify in the specification the omission of any of the elements and, in addition, justify the level of detail provided about each element. This document is an initiative of the Special Interest Group for modelling and simulation. The Special Interest Group for modelling and simulation is a body open to members of Statisticians in the Pharmaceutical Industry and the European Federation of Statisticians in the Pharmaceutical Industry. Examples of a very detailed specification and a less detailed specification are included as appendices. Copyright © 2016 John Wiley & Sons, Ltd.

  7. Semiclassical projection of hedgehog models with quarks

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, T.D.; Broniowski, W.

    1986-12-01

    A simple semiclassical method is presented for calculating physical observables in states with good angular momentum and isospin for models whose mean-field solutions are hedgehogs. The method is applicable for theories which have both quark and meson degrees of freedom. The basic approach is to find slowly rotating solutions to the time-dependent mean-field equations. A nontrivial set of differential equations must be solved to find the quark configuration for these rotating hedgehogs. The parameters which specify the rotating solutions are treated as the collective degrees of freedom. They are requantized by imposing a set of commutation relations which ensures the correct algebra for the SU(2) x SU(2) group of angular momentum and isospin. Collective wave functions can then be found and with these wave functions all matrix elements can be calculated. The method is applied to a simple version of the chiral quark-meson model. A number of physical quantities such as magnetic moments, charge distributions, g_A, g_{πNN}, N-Δ mass splitting, properties of the N-Δ transition, etc., are calculated.

  8. Rotating and linear synchronous generators for renewable electric energy conversion - an update of the ongoing research projects at Uppsala University

    Energy Technology Data Exchange (ETDEWEB)

    Bolund, B.; Segergren, E.; Solum, A.; Perers, R.; Lundstroem, L.; Lindblom, A.; Thorburn, K.; Eriksson, M.; Nilsson, K.; Ivanova, I.; Danielsson, O.; Eriksson, S.; Bengtsson, H.; Sjoestedt, E.; Isberg, J.; Sundberg, J.; Bernhoff, H.; Karlsson, K.-E.; Wolfbrandt, A.; Aagren, O.; Leijon, M.

    2004-07-01

    The discussion regarding renewable energy has gone on for several years. The many ideas and opinions that are presented in this field reflect the great impact future energy production has on people all over the world. This paper describes the new direction of the division of Electricity at Uppsala University after the admission of the new professor, Mats Leijon, in February 2001. Full electromagnetic dynamics can be used in order to improve performance of existing electromagnetic conversion systems and to adapt new technology to the renewable power in nature. These ideas are adopted in wind power, wave power, water-current power, bio-fuelled plants as well as in conventional hydropower, i.e. in every different area where the division is active. This paper is a coarse description of the different activities at the division and aims to highlight their links to each other. Theoretical and experimental results from the different PhD projects are briefly introduced and summarized. (author)

  9. The Timber Resource Inventory Model (TRIM): a projection model for timber supply and policy analysis.

    Science.gov (United States)

    P.L. Tedder; R.N. La Mont; J.C. Kincaid

    1987-01-01

    TRIM (Timber Resource Inventory Model) is a yield table projection system developed for timber supply projections and policy analysis. TRIM simulates timber growth, inventories, management and area changes, and removals over the projection period. Programs in the TRIM system, card-by-card descriptions of required inputs, table formats, and sample results are presented...

  10. On reducibility and ergodicity of population projection matrix models

    DEFF Research Database (Denmark)

    Stott, Iain; Townley, Stuart; Carslake, David

    2010-01-01

    1. Population projection matrices (PPMs) are probably the most commonly used empirical population models. To be useful for predictive or prospective analyses, PPM models should generally be irreducible (the associated life cycle graph contains the necessary transition rates to facilitate pathways...... structure used in the population projection). In our sample of published PPMs, 15·6% are non-ergodic. 3. This presents a problem: reducible–ergodic models often defy biological rationale in their description of the life cycle but may or may not prove problematic for analysis as they often behave similarly...... to irreducible models. Reducible–non-ergodic models will usually defy biological rationale in their description of both the life cycle and population dynamics, hence contravening most analytical methods. 4. We provide simple methods to evaluate reducibility and ergodicity of PPM models, present illustrative...
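
    The reducibility and ergodicity evaluations the authors advocate can be sketched with standard matrix checks: an s-stage PPM A is irreducible iff (I + A)^(s-1) has no zero entries, and a common ergodicity check is that the dominant left eigenvector of A (the reproductive value vector) is strictly positive. The matrix below is hypothetical, not from the authors' sample:

```python
import numpy as np

def is_irreducible(A: np.ndarray) -> bool:
    """A is irreducible iff (I + A)^(s-1) is strictly positive."""
    s = A.shape[0]
    return bool(np.all(np.linalg.matrix_power(np.eye(s) + A, s - 1) > 0))

def is_ergodic(A: np.ndarray) -> bool:
    """Ergodic iff the dominant left eigenvector is strictly positive, so
    all initial stage structures converge to the same stable distribution."""
    vals, vecs = np.linalg.eig(A.T)          # left eigenvectors of A
    v = np.real(vecs[:, np.argmax(np.real(vals))])
    v = v / v[np.argmax(np.abs(v))]          # fix overall sign
    return bool(np.all(v > 1e-12))

# Hypothetical 3-stage PPM with a post-reproductive final stage that feeds
# nothing back into the rest of the life cycle: reducible and non-ergodic.
A = np.array([[0.0, 1.5, 0.0],
              [0.5, 0.0, 0.0],
              [0.0, 0.3, 0.8]])
print(is_irreducible(A), is_ergodic(A))
```

    A reducible matrix like this one can be spotted from the zero block in (I + A)^(s-1); whether it is also non-ergodic is what the left-eigenvector test decides.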

  11. A Model for Crises Management in Software Projects

    Directory of Open Access Journals (Sweden)

    Mohammad Tarawneh

    2011-11-01

    Full Text Available Today software projects are an important part of almost every business application. It is the quality, efficiency and effectiveness of these applications that will determine the failure or success of many business solutions. Consequently, businesses often find that they need to gain a competitive advantage through the development and improvement of software projects that support critical business activities. The quality of a software project is determined by the quality of the software development process, and improvements in the development process can lead to significant improvements in software quality. Based on the risks and problems a software engineering project may face, this research tries to shed light on the mechanisms for dealing with crises in software engineering projects. It suggests a set of rules and guidelines that help software project managers prevent and deal with software project crises. A model is also proposed, consisting of a set of steps that must be implemented when a crisis emerges or before it happens. Crisis management starts with understanding the crisis and then preparing a careful review of it, looking for the regions or aspects of turmoil and failure. The next step is the classification of the crisis, followed by the preparation of a contingency plan, which must be implemented immediately upon the occurrence of the crisis. Finally, the established plan is carried out soon after the crisis. It should be noted that a software engineering project team that has been trained on virtual models of various crises develops management skills, and that one should not fail to acknowledge a problem when it starts, nor underestimate it or take it lightly.

  12. Projected Dipole Model for Quantum Plasmonics

    DEFF Research Database (Denmark)

    Yan, Wei; Wubs, Martijn; Mortensen, N. Asger

    2015-01-01

    Quantum effects of plasmonic phenomena have been explored through ab initio studies, but only for exceedingly small metallic nanostructures, leaving most experimentally relevant structures too large to handle. We propose instead an effective description with the computationally appealing features...... of classical electrodynamics, while quantum properties are described accurately through an infinitely thin layer of dipoles oriented normally to the metal surface. The nonlocal polarizability of the dipole layer-the only introduced parameter-is mapped from the free-electron distribution near the metal surface...... as obtained with 1D quantum calculations, such as time-dependent density-functional theory (TDDFT), and is determined once and for all. The model can be applied in two and three dimensions to any system size that is tractable within classical electrodynamics, while capturing quantum plasmonic aspects...

  13. A Hybrid Authorization Model For Project-Oriented Workflow

    Institute of Scientific and Technical Information of China (English)

    Zhang Xiaoguang(张晓光); Cao Jian; Zhang Shensheng

    2003-01-01

    In the context of workflow systems, the security-relevant aspect is related to the assignment of activities to (human or automated) agents. This paper intends to cast light on the management of project-oriented workflow. A comprehensive authorization model is proposed from the perspective of project management. In this model, the concepts of activity decomposition and team are introduced, which improves the security of conventional role-based access control. Furthermore, policy is provided to define static and dynamic constraints such as Separation of Duty (SoD). Validity of constraints is proposed to provide fine-grained assignment, which improves the performance of policy management. The model is applicable not only to project-oriented workflow applications but also to other teamwork environments such as the virtual enterprise.

  14. A CONCEPTUAL MODEL FOR IMPROVED PROJECT SELECTION AND PRIORITISATION

    Directory of Open Access Journals (Sweden)

    P. J. Viljoen

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Project portfolio management processes are often designed and operated as a series of stages (or project phases) and gates. However, the flow of such a process is often slow, characterised by queues waiting for a gate decision and by repeated work from previous stages waiting for additional information or for re-processing. In this paper the authors propose a conceptual model that applies supply chain and constraint management principles to the project portfolio management process. An advantage of the proposed model is that it provides the ability to select and prioritise projects without undue changes to project schedules. This should result in faster flow through the system.

    AFRIKAANSE OPSOMMING (translated): Processes for managing portfolios of projects are normally designed and operated as a series of phases and gates. The flow through such a process is often slow and is characterised by queues waiting for decisions at the gates, and by rework from earlier phases waiting for further information or for re-processing. In this article a conceptual model is proposed. The model rests on the principles of supply chains as well as constraint management, and offers the advantage that projects can be selected and prioritised without unnecessary changes to project schedules. This should lead to accelerated flow through the system.

  15. Modelling of Airship Flight Mechanics by the Projection Equivalent Method

    OpenAIRE

    Frantisek Jelenciak; Michael Gerke; Ulrich Borgolte

    2015-01-01

    This article describes the projection equivalent method (PEM) as a specific and relatively simple approach for the modelling of aircraft dynamics. By the PEM it is possible to obtain a mathematical model of the aerodynamic forces and momentums acting on different kinds of aircraft during flight. It is characteristic of the PEM that - in principle - it provides an acceptable regression model of aerodynamic forces and momentums which exhibits reasonable and plausible behaviour from a...

  16. Suitability Analysis of Continuous-Use Reliability Growth Projection Models

    Science.gov (United States)

    2015-03-26

    exists for all types, shapes, and sizes. The primary focus of this study is a comparison of reliability growth projection models designed for...requirements to use reliability growth models, recent studies have noted trends in reliability failures throughout the DoD. In [14] Dr. Michael Gilmore...so a strict exponential distribution was used to stay within their assumptions. In reality, however, reliability growth models often must be used

  17. Heart Modeling, Computational Physiology and the IUPS Physiome Project

    Science.gov (United States)

    Hunter, Peter J.

    The Physiome Project of the International Union of Physiological Sciences (IUPS) is attempting to provide a comprehensive framework for modelling the human body using computational methods which can incorporate the biochemistry, biophysics and anatomy of cells, tissues and organs. A major goal of the project is to use computational modelling to analyse integrative biological function in terms of underlying structure and molecular mechanisms. To support that goal the project is developing XML markup languages (CellML & FieldML) for encoding models, and software tools for creating, visualizing and executing these models. It is also establishing web-accessible physiological databases dealing with model-related data at the cell, tissue, organ and organ system levels. Two major developments in current medicine are, on the one hand, the much publicised genomics (and soon proteomics) revolution and, on the other, the revolution in medical imaging in which the physiological function of the human body can be studied with a plethora of imaging devices such as MRI, CT, PET, ultrasound, electrical mapping, etc. The challenge for the Physiome Project is to link these two developments for an individual - to use complementary genomic and medical imaging data, together with computational modelling tailored to the anatomy, physiology and genetics of that individual, for patient-specific diagnosis and treatment.

  18. An updated synthesis of the observed and projected impacts of climate change on the chemical, physical and biological processes in the oceans

    Directory of Open Access Journals (Sweden)

    Ella Louise Howes

    2015-06-01

    Full Text Available The 5th Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC) states with very high certainty that anthropogenic emissions have caused measurable changes in the physical ocean environment. These changes are summarized with special focus on those that are predicted to have the strongest, most direct effects on ocean biological processes; namely, ocean warming and associated phenomena (including stratification and sea level rise) as well as deoxygenation and ocean acidification. The biological effects of these changes are then discussed for microbes (including phytoplankton), plants, animals, warm- and cold-water corals, and ecosystems. The IPCC AR5 highlighted several areas related to both the physical and biological processes that required further research. As a rapidly developing field, there have been many pertinent studies published since the cut-off dates for the AR5, which have increased our understanding of the processes at work. This study undertook an extensive review of recently published literature to update the findings of the AR5 and provide a synthesized review on the main issues facing future oceans. The level of detail provided in the AR5 and subsequent work provided a basis for constructing projections of the state of ocean ecosystems in 2100 under the two Representative Concentration Pathways, RCP4.5 and RCP8.5. Finally, the review highlights notable additions, clarifications and points of departure from AR5 provided by subsequent studies.

  19. Construction project investment control model based on instant information

    Institute of Scientific and Technical Information of China (English)

    WANG Xue-tong

    2006-01-01

    Change of construction conditions always influences project investment by causing the loss of construction work time and extending the duration. To resolve such problems as the difficulty of dynamic control in a construction work plan, this article presents a concept of instant optimization that adjusts the operation time of each working procedure so as to minimize investment change. Based on this concept, its mathematical model is established and a strict mathematical justification is performed. The instant optimization model takes advantage of instant information in the construction process to duly complete adjustments of construction; thus the cost efficiency of project investment is maximized.

  20. Updated collider and direct detection constraints on Dark Matter models for the Galactic Center gamma-ray excess

    Science.gov (United States)

    Escudero, Miguel; Hooper, Dan; Witte, Samuel J.

    2017-02-01

    Utilizing an exhaustive set of simplified models, we revisit dark matter scenarios potentially capable of generating the observed Galactic Center gamma-ray excess, updating constraints from the LUX and PandaX-II experiments, as well as from the LHC and other colliders. We identify a variety of pseudoscalar mediated models that remain consistent with all constraints. In contrast, dark matter candidates which annihilate through a spin-1 mediator are ruled out by direct detection constraints unless the mass of the mediator is near an annihilation resonance, or the mediator has a purely vector coupling to the dark matter and a purely axial coupling to Standard Model fermions. All scenarios in which the dark matter annihilates through t-channel processes are now ruled out by a combination of the constraints from LUX/PandaX-II and the LHC.