WorldWideScience

Sample records for modeling efforts required

  1. Prosocial apathy for helping others when effort is required.

    Science.gov (United States)

    Lockwood, Patricia L; Hamonet, Mathilde; Zhang, Samuel H; Ratnavel, Anya; Salmony, Florentine U; Husain, Masud; Apps, Matthew A J

    2017-07-01

    Prosocial acts - those that are costly to ourselves but benefit others - are a central component of human co-existence [1-3]. While the financial and moral costs of prosocial behaviours are well understood [4-6], everyday prosocial acts do not typically come at such costs. Instead, they require effort. Here, using computational modelling of an effort-based task we show that people are prosocially apathetic. They are less willing to choose to initiate highly effortful acts that benefit others compared to benefitting themselves. Moreover, even when choosing to initiate effortful prosocial acts, people show superficiality, exerting less force into actions that benefit others than themselves. These findings replicated, were present when the other was anonymous or not, and when choices were made to earn rewards or avoid losses. Importantly, the least prosocially motivated people had higher subclinical levels of psychopathy and social apathy. Thus, although people sometimes 'help out', they are less motivated to benefit others and sometimes 'superficially prosocial', which may characterise everyday prosociality and its disruption in social disorders.
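
    The paper's computational modelling is not reproduced here, but a minimal sketch of a generic effort-discounting choice model of the kind used in this literature is shown below; the parabolic cost form, the discount parameters k_self and k_other, and the softmax temperature are illustrative assumptions, not the authors' fitted values.

```python
import numpy as np

def subjective_value(reward, effort, k):
    """Parabolic effort discounting (assumed form): SV = reward - k * effort^2."""
    return reward - k * effort ** 2

def p_choose_work(reward, effort, k, beta, baseline_sv=1.0):
    """Softmax probability of accepting the effortful offer over a low-effort baseline."""
    sv_work = subjective_value(reward, effort, k)
    return 1.0 / (1.0 + np.exp(-beta * (sv_work - baseline_sv)))

# Illustrative comparison: a more "prosocially apathetic" agent discounts effort
# more steeply (higher k) when the reward goes to another person.
efforts = np.linspace(0.1, 1.0, 5)       # normalized effort levels
reward = 5.0                             # credits on offer
k_self, k_other, beta = 2.0, 4.0, 3.0    # hypothetical discount/temperature parameters
for e in efforts:
    print(f"effort={e:.1f}  P(work|self)={p_choose_work(reward, e, k_self, beta):.2f}"
          f"  P(work|other)={p_choose_work(reward, e, k_other, beta):.2f}")
```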

  2. Manage changes in the requirements definition through a collaborative effort

    CSIR Research Space (South Africa)

    Joseph-Malherbe, S

    2009-08-01

    Full Text Available Updating or changing the requirements statement during the systems engineering process may impact adversely on project parameters such as sequence, dependencies, effort, and duration of tasks, usually with an increase in development time and cost...

  3. Identification of efforts required for continued safe operation of KANUPP

    International Nuclear Information System (INIS)

    Ghafoor, M.A.; Hashmi, J.A.; Siddiqui, Z.H.

    1991-01-01

    Kanupp, the first commercial CANDU PHWR, rated at 137 MWe, was built on a turnkey basis by the Canadian General Electric Company for the Pakistan Atomic Energy Commission, and went operational in October 1972 near Karachi. It has operated since then with a lifetime average availability factor of 51.5% and capacity factor of 25%. In 1976, Kanupp suffered loss of technical support from its original vendors due to the Canadian embargo on export of nuclear technology. Simultaneously, the world experienced the most explosive development and advancement in electronic and computer technology, accelerating the obsolescence of such equipment and systems installed in Kanupp. Replacement and upgrading of obsolete computers, control and instrumentation was thus the first major set of efforts realized as essential for continued safe operation. On the other hand, Kanupp was able to cope with the normal maintenance of its process, mechanical and electrical equipment until the late 1980s. But now many of these components are reaching the end of their useful life, and developing chronic problems due to ageing, which can only be solved by complete replacement. This is much more difficult for custom-made nuclear process equipment, e.g. the reactor internals and the fuelling machine. Public awareness and international concern about nuclear safety have increased significantly since the TMI and Chernobyl events. Corresponding realization of the critical role of human factors and the importance of operational experience feedback has helped Kanupp by opening international channels of communication, including renewed cooperation on CANDU technology. The safety standards and criteria for CANDU as well as other NPPs have matured and evolved gradually over the past two decades. First, Kanupp has to ensure that its present ageing-induced equipment problems are resolved to satisfy the original safety requirements and public risk targets, which are still internationally acceptable. But as a policy, we

  4. ERP services effort estimation strategies based on early requirements

    NARCIS (Netherlands)

    Erasmus, I.P.; Daneva, Maia; Kalenborg, Axel; Trapp, Marcus

    2015-01-01

    ERP clients and vendors necessarily estimate their project interventions at a very early stage, before the full requirements for an ERP solution are known and often before a contract is finalized between a vendor/consulting company and a client. ERP project estimation at the stage of early

  5. Efforts and models of education for parents

    DEFF Research Database (Denmark)

    Jensen, Niels Rosendal

    2010-01-01

    The article reviews the models of parent education that are primarily used in Denmark, and situates these models within broader perspectives on the education system and the current discourse on making parents responsible. Publication date: March 2010...

  6. How to use COSMIC Functional Size in Effort Estimation Models?

    OpenAIRE

    Gencel, Cigdem

    2008-01-01

    Although Functional Size Measurement (FSM) methods have become widely used by the software organizations, the functional size based effort estimation still needs further investigation. Most of the studies on effort estimation consider total functional size of the software as the primary input to estimation models and they mostly focus on identifying the project parameters which might have a significant effect on the size-effort relationship. This study brings suggestions on how to use COSMIC ...

  7. Incorporating Responsiveness to Marketing Efforts When Modeling Brand Choice

    NARCIS (Netherlands)

    D. Fok (Dennis); Ph.H.B.F. Franses (Philip Hans); R. Paap (Richard)

    2001-01-01

    In this paper we put forward a brand choice model which incorporates responsiveness to marketing efforts as a form of structural heterogeneity. We introduce two latent segments of households. The households in the first segment are assumed to respond to marketing efforts while households

  8. Efforts - Final technical report on task 4. Physical modelling validation

    DEFF Research Database (Denmark)

    Andreasen, Jan Lasson; Olsson, David Dam; Christensen, T. W.

    The present report documents the work carried out in Task 4 at DTU, Physical modelling - validation, on the Brite/Euram project No. BE96-3340, contract No. BRPR-CT97-0398, with the title Enhanced Framework for forging design using reliable three-dimensional simulation (EFFORTS). The report...

  9. 29 CFR 1620.16 - Jobs requiring equal effort in performance.

    Science.gov (United States)

    2010-07-01

    29 CFR 1620.16, Jobs requiring equal effort in performance (Equal Employment Opportunity Commission, regulations relating to labor): factors which cause mental fatigue and stress, as well as those which alleviate fatigue, are to be...

  10. Requirements for guidelines systems: implementation challenges and lessons from existing software-engineering efforts.

    Science.gov (United States)

    Shah, Hemant; Allard, Raymond D; Enberg, Robert; Krishnan, Ganesh; Williams, Patricia; Nadkarni, Prakash M

    2012-03-09

    A large body of work in the clinical guidelines field has identified requirements for guideline systems, but there are formidable challenges in translating such requirements into production-quality systems that can be used in routine patient care. Detailed analysis of requirements from an implementation perspective can be useful in helping define sub-requirements to the point where they are implementable. Further, additional requirements emerge as a result of such analysis. During such an analysis, study of examples of existing, software-engineering efforts in non-biomedical fields can provide useful signposts to the implementer of a clinical guideline system. In addition to requirements described by guideline-system authors, comparative reviews of such systems, and publications discussing information needs for guideline systems and clinical decision support systems in general, we have incorporated additional requirements related to production-system robustness and functionality from publications in the business workflow domain, in addition to drawing on our own experience in the development of the Proteus guideline system (http://proteme.org). The sub-requirements are discussed by conveniently grouping them into the categories used by the review of Isern and Moreno 2008. We cite previous work under each category and then provide sub-requirements under each category, and provide example of similar work in software-engineering efforts that have addressed a similar problem in a non-biomedical context. When analyzing requirements from the implementation viewpoint, knowledge of successes and failures in related software-engineering efforts can guide implementers in the choice of effective design and development strategies.

  11. Linking effort and fishing mortality in a mixed fisheries model

    DEFF Research Database (Denmark)

    Thøgersen, Thomas Talund; Hoff, Ayoe; Frost, Hans Staby

    2012-01-01

    Since the implementation of the Common Fisheries Policy of the European Union in 1983, the management of EU fisheries has been enormously challenging. The abundance of many fish stocks has declined because too much fishing capacity has been utilised on healthy fish stocks. Today, this decline...... in fish stocks has led to overcapacity in many fisheries, leading to incentives for overfishing. Recent research has shown that the allocation of effort among fleets can play an important role in mitigating overfishing when the targeting covers a range of species (multi-species—i.e., so-called mixed...... fisheries), while simultaneously optimising the overall economic performance of the fleets. The so-called FcubEcon model, in particular, has elucidated both the biologically and economically optimal method for allocating catches—and thus effort—between fishing fleets, while ensuring that the quotas...

  12. Characterization of infiltration rates from landfills: supporting groundwater modeling efforts.

    Science.gov (United States)

    Moo-Young, Horace; Johnson, Barnes; Johnson, Ann; Carson, David; Lew, Christine; Liu, Salley; Hancocks, Katherine

    2004-01-01

    The purpose of this paper is to review the literature to characterize infiltration rates from landfill liners to support groundwater modeling efforts. The focus of this investigation was on collecting studies that describe the performance of liners 'as installed' or 'as operated'. This document reviews the state of the science and practice on the infiltration rate through compacted clay liner (CCL) for 149 sites and geosynthetic clay liner (GCL) for 1 site. In addition, it reviews the leakage rate through geomembrane (GM) liners and composite liners for 259 sites. For compacted clay liners (CCL), there was limited information on infiltration rates (i.e., only 9 sites reported infiltration rates), thus it was difficult to develop a national distribution. The field hydraulic conductivities for natural clay liners range from 1 × 10^-9 cm/s to 1 × 10^-4 cm/s, with an average of 6.5 × 10^-8 cm/s. There was limited information on geosynthetic clay liners. For composite lined and geomembrane systems, the leak detection system flow rates were utilized. The average monthly flow rate for composite liners ranged from 0 to 32 lphd for geomembrane and GCL systems to 0 to 1410 lphd for geomembrane and CCL systems. The increased infiltration for the geomembrane and CCL system may be attributed to consolidation water from the clay.
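
    As a rough worked example (not from the paper), Darcy's law q = K * i converts the reported average CCL hydraulic conductivity into an infiltration rate in litres per hectare per day (lphd), assuming a unit hydraulic gradient, which is a simplification that overstates flow for thick liners.

```python
# Convert a hydraulic conductivity to an infiltration flux via Darcy's law, q = K * i.
# Assumes a unit hydraulic gradient (i = 1); real liners see lower effective gradients.
K_cm_per_s = 6.5e-8          # average CCL field hydraulic conductivity cited in the review
i = 1.0                      # assumed hydraulic gradient

q_cm_per_day = K_cm_per_s * i * 86400                      # cm/day
q_m_per_day = q_cm_per_day / 100.0                         # m/day
litres_per_hectare_per_day = q_m_per_day * 10_000 * 1000   # 1 ha = 10,000 m^2; 1 m^3 = 1000 L
print(f"{litres_per_hectare_per_day:.0f} lphd")            # roughly 560 lphd under these assumptions
```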

  13. Colloids and Radionuclide Transport: A Field, Experimental and Modeling Effort

    Science.gov (United States)

    Zhao, P.; Zavarin, M.; Sylwester, E. E.; Allen, P. G.; Williams, R. W.; Kersting, A. B.

    2002-05-01

    The contribution of natural inorganic colloids (colloid-facilitated transport) to the transport of low-solubility actinides, such as Pu, is still not well understood. In an effort to better understand the dominant geochemical mechanisms that control Pu transport, we have performed a series of sorption/desorption experiments using mineral colloids. We focused on natural colloidal minerals present in water samples collected from both saturated and vadose zone waters at the Nevada Test Site. These minerals include zeolites, clays, silica, Mn-oxides, Fe-oxides, and calcite. X-ray absorption fine-structure spectroscopy (both XANES and EXAFS) was performed in order to characterize the speciation of sorbed plutonium. The XANES spectra show that only Pu(IV) was detected (within experimental error) on these mineral surfaces when the starting Pu oxidation state was +5, indicating that Pu(V) was reduced to Pu(IV) during sorption. The EXAFS detected Pu-M and Pu-C interactions (where M=Fe, Mn, or Si), indicating Pu(IV) surface complexation along with carbonate ternary complex formation on most of the minerals tested. Although the plutonium sorption as Pu(IV) species is mineral independent, the actual sorption paths are different for different minerals. The sorption rates were compared to the rates of plutonium disproportionation under similar conditions. The batch sorption/desorption experiments of Pu(IV) and Pu(V) onto colloidal zeolite (clinoptilolite, colloid particle size 171 ± 25 nm) were conducted in synthetic groundwater (similar to J-13, Yucca Mountain standard) with a pH range from 4 to 10 and an initial plutonium concentration of 10^-9 M. The results show that Pu(IV) sorption takes place within an hour, while the rate of Pu(V) sorption onto the colloids is much slower and mineral dependent. The kinetic results from the batch sorption/desorption experiments, coupled with redox kinetics of plutonium in solution, will be used in geochemical modeling of Pu surface complexation to colloids and

  14. A Pilot Study to Compare Programming Effort for Two Parallel Programming Models (PREPRINT)

    National Research Council Canada - National Science Library

    Hochstein, Lorin; Basili, Victor R; Vishkin, Uzi; Gilbert, John

    2007-01-01

    CONTEXT: Writing software for the current generation of parallel systems requires significant programmer effort, and the community is seeking alternatives that reduce effort while still achieving good performance. OBJECTIVE...

  15. Report Summarizing the Effort Required to Initiate Welding of Irradiated Materials within the Welding Cubicle

    Energy Technology Data Exchange (ETDEWEB)

    Frederick, Greg [Electric Power Research Institute (EPRI), Palo Alto, CA (United States); Sutton, Benjamin J. [Electric Power Research Institute (EPRI), Palo Alto, CA (United States); Tatman, Jonathan K. [Electric Power Research Institute (EPRI), Palo Alto, CA (United States); Vance, Mark Christopher [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Smith, Allen W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Clark, Scarlett R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Feng, Zhili [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Miller, Roger G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Chen, Jian [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Tang, Wei [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hu, Xunxiang [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Gibson, Brian T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-06-01

    The advanced welding facility within a hot cell at the Radiochemical Engineering Development Center of Oak Ridge National Laboratory (ORNL), which has been jointly funded by the U.S. Department of Energy (DOE), Office of Nuclear Energy, Light Water Reactor Sustainability Program and the Electric Power Research Institute, Long Term Operations Program and the Welding and Repair Technology Center, is in the final phase of development. Research and development activities in this facility will involve direct testing of advanced welding technologies on irradiated materials in order to address the primary technical challenge of helium induced cracking that can arise when conventional fusion welding techniques are utilized on neutron irradiated stainless steels and nickel-base alloys. This report details the effort that has been required since the beginning of fiscal year 2017 to initiate welding research and development activities on irradiated materials within the hot cell cubicle, which houses welding sub-systems that include laser beam welding (LBW) and friction stir welding (FSW) and provides material containment within the hot cell.

  16. Hybrid discrete choice models: Gained insights versus increasing effort

    International Nuclear Information System (INIS)

    Mariel, Petr; Meyerhoff, Jürgen

    2016-01-01

    Hybrid choice models expand the standard models in discrete choice modelling by incorporating psychological factors as latent variables. They could therefore provide further insights into choice processes and underlying taste heterogeneity but the costs of estimating these models often significantly increase. This paper aims at comparing the results from a hybrid choice model and a classical random parameter logit. Point of departure for this analysis is whether researchers and practitioners should add hybrid choice models to their suite of models routinely estimated. Our comparison reveals, in line with the few prior studies, that hybrid models gain in efficiency by the inclusion of additional information. The use of one of the two proposed approaches, however, depends on the objective of the analysis. If disentangling preference heterogeneity is most important, hybrid model seems to be preferable. If the focus is on predictive power, a standard random parameter logit model might be the better choice. Finally, we give recommendations for an adequate use of hybrid choice models based on known principles of elementary scientific inference. - Highlights: • The paper compares performance of a Hybrid Choice Model (HCM) and a classical Random Parameter Logit (RPL) model. • The HCM indeed provides insights regarding preference heterogeneity not gained from the RPL. • The RPL has similar predictive power as the HCM in our data. • The costs of estimating HCM seem to be justified when learning more on taste heterogeneity is a major study objective.

  17. Hybrid discrete choice models: Gained insights versus increasing effort

    Energy Technology Data Exchange (ETDEWEB)

    Mariel, Petr, E-mail: petr.mariel@ehu.es [UPV/EHU, Economía Aplicada III, Avda. Lehendakari Aguire, 83, 48015 Bilbao (Spain); Meyerhoff, Jürgen [Institute for Landscape Architecture and Environmental Planning, Technical University of Berlin, D-10623 Berlin, Germany and The Kiel Institute for the World Economy, Duesternbrooker Weg 120, 24105 Kiel (Germany)

    2016-10-15

    Hybrid choice models expand the standard models in discrete choice modelling by incorporating psychological factors as latent variables. They could therefore provide further insights into choice processes and underlying taste heterogeneity but the costs of estimating these models often significantly increase. This paper aims at comparing the results from a hybrid choice model and a classical random parameter logit. Point of departure for this analysis is whether researchers and practitioners should add hybrid choice models to their suite of models routinely estimated. Our comparison reveals, in line with the few prior studies, that hybrid models gain in efficiency by the inclusion of additional information. The use of one of the two proposed approaches, however, depends on the objective of the analysis. If disentangling preference heterogeneity is most important, hybrid model seems to be preferable. If the focus is on predictive power, a standard random parameter logit model might be the better choice. Finally, we give recommendations for an adequate use of hybrid choice models based on known principles of elementary scientific inference. - Highlights: • The paper compares performance of a Hybrid Choice Model (HCM) and a classical Random Parameter Logit (RPL) model. • The HCM indeed provides insights regarding preference heterogeneity not gained from the RPL. • The RPL has similar predictive power as the HCM in our data. • The costs of estimating HCM seem to be justified when learning more on taste heterogeneity is a major study objective.

  18. Hybrid discrete choice models: Gained insights versus increasing effort.

    Science.gov (United States)

    Mariel, Petr; Meyerhoff, Jürgen

    2016-10-15

    Hybrid choice models expand the standard models in discrete choice modelling by incorporating psychological factors as latent variables. They could therefore provide further insights into choice processes and underlying taste heterogeneity but the costs of estimating these models often significantly increase. This paper aims at comparing the results from a hybrid choice model and a classical random parameter logit. Point of departure for this analysis is whether researchers and practitioners should add hybrid choice models to their suite of models routinely estimated. Our comparison reveals, in line with the few prior studies, that hybrid models gain in efficiency by the inclusion of additional information. The use of one of the two proposed approaches, however, depends on the objective of the analysis. If disentangling preference heterogeneity is most important, hybrid model seems to be preferable. If the focus is on predictive power, a standard random parameter logit model might be the better choice. Finally, we give recommendations for an adequate use of hybrid choice models based on known principles of elementary scientific inference. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Requirements for Medical Modeling Languages

    Science.gov (United States)

    van der Maas, Arnoud A.F.; Ter Hofstede, Arthur H.M.; Ten Hoopen, A. Johannes

    2001-01-01

    Objective: The development of tailor-made domain-specific modeling languages is sometimes desirable in medical informatics. Naturally, the development of such languages should be guided. The purpose of this article is to introduce a set of requirements for such languages and show their application in analyzing and comparing existing modeling languages. Design: The requirements arise from the practical experience of the authors and others in the development of modeling languages in both general informatics and medical informatics. The requirements initially emerged from the analysis of information modeling techniques. The requirements are designed to be orthogonal, i.e., one requirement can be violated without violation of the others. Results: The proposed requirements for any modeling language are that it be “formal” with regard to syntax and semantics, “conceptual,” “expressive,” “comprehensible,” “suitable,” and “executable.” The requirements are illustrated using both the medical logic modules of the Arden Syntax as a running example and selected examples from other modeling languages. Conclusion: Activity diagrams of the Unified Modeling Language, task structures for work flows, and Petri nets are discussed with regard to the list of requirements, and various tradeoffs are thus made explicit. It is concluded that this set of requirements has the potential to play a vital role in both the evaluation of existing domain-specific languages and the development of new ones. PMID:11230383

  20. Requirements for effective modelling strategies.

    NARCIS (Netherlands)

    Gaunt, J.L.; Riley, J.; Stein, A.; Penning de Vries, F.W.T.

    1997-01-01

    As the result of a recent BBSRC-funded workshop between soil scientists, modellers, statisticians and others to discuss issues relating to the derivation of complex environmental models, a set of modelling guidelines is presented and the required associated research areas are discussed.

  1. Nuclear Hybrid Energy Systems FY16 Modeling Efforts at ORNL

    Energy Technology Data Exchange (ETDEWEB)

    Cetiner, Sacit M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Greenwood, Michael Scott [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Harrison, Thomas J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Qualls, A. L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Guler Yigitoglu, Askin [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Fugate, David W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-01

    A nuclear hybrid system uses a nuclear reactor as the basic power generation unit. The power generated by the nuclear reactor is utilized by one or more power customers as either thermal power, electrical power, or both. In general, a nuclear hybrid system will couple the nuclear reactor to at least one thermal power user in addition to the power conversion system. The definition and architecture of a particular nuclear hybrid system is flexible, depending on local market needs and opportunities. For example, locations in need of potable water may be best served by coupling a desalination plant to the nuclear system. Similarly, an area near oil refineries may have a need for emission-free hydrogen production. A nuclear hybrid system expands the nuclear power plant from its more familiar central power station role by diversifying its immediately and directly connected customer base. The definition, design, analysis, and optimization work currently performed with respect to the nuclear hybrid systems represents the work of three national laboratories. Idaho National Laboratory (INL) is the lead lab working with Argonne National Laboratory (ANL) and Oak Ridge National Laboratory. Each laboratory is providing modeling and simulation expertise for the integration of the hybrid system.

  2. APPLYING TEACHING-LEARNING TO ARTIFICIAL BEE COLONY FOR PARAMETER OPTIMIZATION OF SOFTWARE EFFORT ESTIMATION MODEL

    Directory of Open Access Journals (Sweden)

    THANH TUNG KHUAT

    2017-05-01

    Full Text Available Artificial Bee Colony, inspired by the foraging behaviour of honey bees, is a novel meta-heuristic optimization algorithm in the community of swarm intelligence algorithms. Nevertheless, it is still insufficient in the speed of convergence and the quality of solutions. This paper proposes an approach to tackle these downsides by combining the positive aspects of Teaching-Learning based optimization and Artificial Bee Colony. The performance of the proposed method is assessed on the software effort estimation problem, which is a complex and important issue in project management. Software developers often carry out software estimation in the early stages of the software development life cycle to derive the required cost and schedule for a project. There are a large number of methods for effort estimation, of which COCOMO II is one of the most widely used models. However, this model has some restrictions because its parameters have not been optimized yet. In this work, therefore, we present an approach to overcome this limitation of the COCOMO II model. The experiments were conducted on the NASA software project dataset, and the obtained results indicated that the improvement of parameters provided better estimation capabilities compared to the original COCOMO II model.
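
    For context, a minimal sketch of the standard COCOMO II post-architecture effort equation whose parameters such approaches optimize is given below; A = 2.94 and B = 0.91 are the commonly published COCOMO II.2000 calibration constants, while the scale-factor and effort-multiplier values are illustrative inputs rather than values from the paper.

```python
from math import prod

def cocomo2_effort(ksloc, scale_factors, effort_multipliers, A=2.94, B=0.91):
    """COCOMO II post-architecture model: PM = A * Size^E * product(EM),
    with E = B + 0.01 * sum(SF). A and B default to the published COCOMO II.2000 constants."""
    E = B + 0.01 * sum(scale_factors)
    return A * ksloc ** E * prod(effort_multipliers)

# Illustrative project: 40 KSLOC with made-up factor ratings (not from any dataset).
sf = [3.72, 3.04, 4.24, 3.29, 4.68]            # five scale factors
em = [1.0, 1.10, 0.87, 1.0, 1.0, 1.17, 1.0]    # a subset of effort multipliers
print(f"Estimated effort: {cocomo2_effort(40, sf, em):.1f} person-months")
```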

  3. Evaluation of Arroyo Channel Restoration Efforts using Hydrological Modeling: Rancho San Bernardino, Sonora, MX

    Science.gov (United States)

    Jemison, N. E.; DeLong, S.; Henderson, W. M.; Adams, J.

    2012-12-01

    In the drylands of the southwestern U.S. and northwestern Mexico, historical river channel incision (arroyo cutting) has led to the destruction of riparian ecological systems and ciénega wetlands in many locations. Along Silver Creek on the Arizona-Sonora border, the Cuenca Los Ojos Foundation has been installing rock gabions and concrete and earthen berms with a goal of slowing flash floods, raising groundwater levels, and refilling arroyo channels with sediment in an area that changed from a broad, perennially wet ciénega to a narrow sand- and gravel-dominated arroyo channel with an average depth of ~6 m. The engineering efforts hope to restore desert wetlands, regrow riparian vegetation, and promote sediment deposition along the arroyo floor. Hydrological modeling allows us to predict how rare flood events interact with the restoration efforts and may guide future approaches to dryland ecological restoration. This modeling is complemented by detailed topographic surveying and use of streamflow sensors to monitor hydrological processes in the restoration project. We evaluate the inundation associated with model 10-, 50-, 100-, 500-, and 1,000-year floods through the study area using FLO-2D and HEC-RAS modeling environments in order to evaluate the possibility of returning surface inundation to the former ciénega surface. According to HEC-RAS model predictions, given current channel configuration, it would require a 500-year flood to overtop the channel banks and reinundate the ciénega (now terrace) surface, though the 100-year flood may lead to limited terrace surface inundation. Based on our models, 10-year floods were ~2 m from overtopping the arroyo walls, 50-year floods came ~1.5 m from overtopping the arroyos, 100-year floods were ~1.2 m from overtopping, and 500- and 1,000-year floods at least partially inundated the ciénega surface. The current topography of Silver Creek does not allow for frequent flooding of the former ciénega; model predictions

  4. 34 CFR 403.184 - How does a State request a waiver of the maintenance of effort requirement?

    Science.gov (United States)

    2010-07-01

    34 CFR 403.184, Applied Technology Education Program, regulations of the Department of Education (What Financial Conditions Must Be Met by a State?): How does a State request a waiver of the maintenance of effort requirement?...

  5. AN ENHANCED MODEL TO ESTIMATE EFFORT, PERFORMANCE AND COST OF THE SOFTWARE PROJECTS

    Directory of Open Access Journals (Sweden)

    M. Pauline

    2013-04-01

    Full Text Available The authors have proposed a model that first captures the fundamentals of software metrics in phase 1, consisting of three primitive primary software engineering metrics: person-months (PM), function points (FP), and lines of code (LOC). Phase 2 consists of the proposed function point, which is obtained by grouping the adjustment factors to simplify the process of adjustment and to ensure more consistency in the adjustments. In the proposed method, fuzzy logic is used for quantifying the quality of requirements and is added as one of the adjustment factors; thus a fuzzy-based approach for the Enhanced General System Characteristics to Estimate Effort of the Software Projects using productivity has been obtained. Phase 3 takes the calculated function point from this work and gives it as input to the static single-variable models (i.e. Intermediate COCOMO and COCOMO II) for cost estimation. The authors have tailored the cost factors in Intermediate COCOMO, and both cost and scale factors are tailored in COCOMO II, to suit the individual development environment, which is very important for the accuracy of the cost estimates. The software performance indicators, namely project duration, schedule predictability, requirements completion ratio and post-release defect density, are also measured for the software projects in this work. A comparative study for effort, performance measurement and cost estimation of the software projects is done between the existing model and the authors' proposed work. Thus this work analyzes the interactional process through which the estimation tasks were collectively accomplished.
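
    A brief sketch of the conventional value-adjustment step that this kind of model builds on is shown below: the 14 general system characteristics (GSCs) scale the unadjusted function points. The fuzzy requirements-quality adjustment proposed in the paper is only hinted at by a placeholder extra factor; the numbers are illustrative.

```python
def adjusted_function_points(ufp, gsc_ratings, extra_factor=0.0):
    """Classic value adjustment: AFP = UFP * (0.65 + 0.01 * sum(GSC ratings)).
    `extra_factor` is a placeholder for an additional requirements-quality adjustment
    (the paper's actual fuzzy rule base is not reproduced here)."""
    if len(gsc_ratings) != 14:
        raise ValueError("Expecting the 14 general system characteristics")
    vaf = 0.65 + 0.01 * (sum(gsc_ratings) + extra_factor)
    return ufp * vaf

ufp = 320                          # unadjusted function points (illustrative)
gsc = [3] * 14                     # all characteristics rated 'average'
print(adjusted_function_points(ufp, gsc, extra_factor=2.5))
```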

  6. Systematic Identification of Stakeholders for Engagement with Systems Modeling Efforts in the Snohomish Basin, Washington, USA

    Science.gov (United States)

    Even as stakeholder engagement in systems dynamic modeling efforts is increasingly promoted, the mechanisms for identifying which stakeholders should be included are rarely documented. Accordingly, for an Environmental Protection Agency’s Triple Value Simulation (3VS) mode...

  7. Adaptive effort investment in cognitive and physical tasks: a neurocomputational model.

    Science.gov (United States)

    Verguts, Tom; Vassena, Eliana; Silvetti, Massimo

    2015-01-01

    Despite its importance in everyday life, the computational nature of effort investment remains poorly understood. We propose an effort model obtained from optimality considerations, and a neurocomputational approximation to the optimal model. Both are couched in the framework of reinforcement learning. It is shown that choosing when or when not to exert effort can be adaptively learned, depending on rewards, costs, and task difficulty. In the neurocomputational model, the limbic loop comprising anterior cingulate cortex (ACC) and ventral striatum in the basal ganglia allocates effort to cortical stimulus-action pathways whenever this is valuable. We demonstrate that the model approximates optimality. Next, we consider two hallmark effects from the cognitive control literature, namely proportion congruency and sequential congruency effects. It is shown that the model exerts both proactive and reactive cognitive control. Then, we simulate two physical effort tasks. In line with empirical work, impairing the model's dopaminergic pathway leads to apathetic behavior. Thus, we conceptually unify the exertion of cognitive and physical effort, studied across a variety of literatures (e.g., motivation and cognitive control) and animal species.
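
    A toy sketch of the basic idea, that exerting or withholding effort can be learned from rewards and effort costs by reinforcement learning, is given below; it is far simpler than the neurocomputational model described (no ACC or striatal pathway is modelled), and all parameters are illustrative.

```python
import random

# Two actions: exert effort (higher success probability, but an effort cost) or rest.
# Simple value learning over many trials discovers when effort is worth it.
alpha, n_trials = 0.1, 5000
reward, effort_cost, p_success_effort, p_success_rest = 1.0, 0.4, 0.9, 0.3
Q = {"effort": 0.0, "rest": 0.0}

for _ in range(n_trials):
    # epsilon-greedy action selection
    action = random.choice(list(Q)) if random.random() < 0.1 else max(Q, key=Q.get)
    p = p_success_effort if action == "effort" else p_success_rest
    outcome = reward if random.random() < p else 0.0
    cost = effort_cost if action == "effort" else 0.0
    Q[action] += alpha * ((outcome - cost) - Q[action])

print(Q)  # with these settings, effort (0.9 - 0.4 = 0.5) beats rest (0.3)
```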

  8. Does the incremental shuttle walk test require maximal effort in young obese women?

    Directory of Open Access Journals (Sweden)

    S.P. Jürgensen

    2016-01-01

    Full Text Available Obesity is a chronic disease with a multifaceted treatment approach that includes nutritional counseling, structured exercise training, and increased daily physical activity. Increased body mass elicits higher cardiovascular, ventilatory and metabolic demands to varying degrees during exercise. With functional capacity assessment, this variability can be evaluated so individualized guidance for exercise training and daily physical activity can be provided. The aim of the present study was to compare cardiovascular, ventilatory and metabolic responses obtained during a symptom-limited cardiopulmonary exercise test (CPX) on a treadmill to responses obtained by the incremental shuttle walk test (ISWT) in obese women and to propose a peak oxygen consumption (VO2) prediction equation through variables obtained during the ISWT. Forty obese women (BMI ≥30 kg/m2) performed one treadmill CPX and two ISWTs. Heart rate (HR), arterial blood pressure (ABP) and perceived exertion by the Borg scale were measured at rest, during each stage of the exercise protocol, and throughout the recovery period. The predicted maximal heart rate (HRmax) was calculated (210 - age in years) (16) and compared to the HR response during the CPX. Peak VO2 obtained during CPX correlated significantly (P<0.05) with ISWT peak VO2 (r=0.79) as well as ISWT distance (r=0.65). The predictive model for CPX peak VO2, using age and ISWT distance, explained 67% of the variability. The current study indicates the ISWT may be used to predict aerobic capacity in obese women when CPX is not a viable option.

  9. Adaptive Effort Investment in Cognitive and Physical Tasks: A Neurocomputational Model

    Directory of Open Access Journals (Sweden)

    Tom eVerguts

    2015-03-01

    Full Text Available Despite its importance in everyday life, the computational nature of effort investment remains poorly understood. We propose an effort model obtained from optimality considerations, and a neurocomputational approximation to the optimal model. Both are couched in the framework of reinforcement learning. It is shown that choosing when or when not to exert effort can be adaptively learned, depending on rewards, costs, and task difficulty. In the neurocomputational model, the limbic loop comprising anterior cingulate cortex and ventral striatum in the basal ganglia allocates effort to cortical stimulus-action pathways whenever this is valuable. We demonstrate that the model approximates optimality. Next, we consider two hallmark effects from the cognitive control literature, namely proportion congruency and sequential congruency effects. It is shown that the model exerts both proactive and reactive cognitive control. Then, we simulate two physical effort tasks. In line with empirical work, impairing the model’s dopaminergic pathway leads to apathetic behavior. Thus, we conceptually unify the exertion of cognitive and physical effort, studied across a variety of literatures (e.g., motivation and cognitive control) and animal species.

  10. Empirical Study of Homogeneous and Heterogeneous Ensemble Models for Software Development Effort Estimation

    Directory of Open Access Journals (Sweden)

    Mahmoud O. Elish

    2013-01-01

    Full Text Available Accurate estimation of software development effort is essential for effective management and control of software development projects. Many software effort estimation methods have been proposed in the literature including computational intelligence models. However, none of the existing models proved to be suitable under all circumstances; that is, their performance varies from one dataset to another. The goal of an ensemble model is to manage each of its individual models’ strengths and weaknesses automatically, leading to the best possible decision being taken overall. In this paper, we have developed different homogeneous and heterogeneous ensembles of optimized hybrid computational intelligence models for software development effort estimation. Different linear and nonlinear combiners have been used to combine the base hybrid learners. We have conducted an empirical study to evaluate and compare the performance of these ensembles using five popular datasets. The results confirm that individual models are not reliable as their performance is inconsistent and unstable across different datasets. Although none of the ensemble models was consistently the best, many of them were frequently among the best models for each dataset. The homogeneous ensemble of support vector regression (SVR), with the nonlinear combiner adaptive neurofuzzy inference systems-subtractive clustering (ANFIS-SC), was the best model when considering the average rank of each model across the five datasets.
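
    A minimal sketch of a homogeneous ensemble for effort estimation in the spirit described above: several scikit-learn SVR base learners trained on bootstrap samples and combined by a simple unweighted average (a linear combiner). The ANFIS-SC nonlinear combiner from the paper is not reproduced, and the data are random placeholders.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
# Placeholder project data: rows are projects, columns are size/complexity drivers.
X = rng.uniform(0, 1, size=(60, 4))
y = 5 + 20 * X[:, 0] + 10 * X[:, 1] ** 2 + rng.normal(0, 1, 60)   # synthetic "effort"
X_train, y_train, X_test = X[:50], y[:50], X[50:]

# Homogeneous ensemble: several SVR base learners on bootstrap resamples,
# combined with the simplest linear combiner (an unweighted average of predictions).
predictions = []
for seed in range(25):
    idx = rng.integers(0, len(X_train), len(X_train))      # bootstrap resample
    model = SVR(C=10.0, epsilon=0.1).fit(X_train[idx], y_train[idx])
    predictions.append(model.predict(X_test))
print(np.mean(predictions, axis=0).round(1))
```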

  11. Time and Effort Required by Persons with Spinal Cord Injury to Learn to Use a Powered Exoskeleton for Assisted Walking.

    Science.gov (United States)

    Kozlowski, Allan J; Bryce, Thomas N; Dijkers, Marcel P

    2015-01-01

    Powered exoskeletons have been demonstrated as being safe for persons with spinal cord injury (SCI), but little is known about how users learn to manage these devices. To quantify the time and effort required by persons with SCI to learn to use an exoskeleton for assisted walking. A convenience sample was enrolled to learn to use the first-generation Ekso powered exoskeleton to walk. Participants were given up to 24 weekly sessions of instruction. Data were collected on assistance level, walking distance and speed, heart rate, perceived exertion, and adverse events. Time and effort was quantified by the number of sessions required for participants to stand up, walk for 30 minutes, and sit down, initially with minimal and subsequently with contact guard assistance. Of 22 enrolled participants, 9 screen-failed, and 7 had complete data. All of these 7 were men; 2 had tetraplegia and 5 had motor-complete injuries. Of these, 5 participants could stand, walk, and sit with contact guard or close supervision assistance, and 2 required minimal to moderate assistance. Walk times ranged from 28 to 94 minutes with average speeds ranging from 0.11 to 0.21 m/s. For all participants, heart rate changes and reported perceived exertion were consistent with light to moderate exercise. This study provides preliminary evidence that persons with neurological weakness due to SCI can learn to walk with little or no assistance and light to somewhat hard perceived exertion using a powered exoskeleton. Persons with different severities of injury, including those with motor complete C7 tetraplegia and motor incomplete C4 tetraplegia, may be able to learn to use this device.

  12. Effort dynamics in a fisheries bioeconomic model: A vessel level approach through Game Theory

    Directory of Open Access Journals (Sweden)

    Gorka Merino

    2007-09-01

    Full Text Available Red shrimp, Aristeus antennatus (Risso, 1816), is one of the most important resources for the bottom-trawl fleets in the northwestern Mediterranean, in terms of both landings and economic value. A simple bioeconomic model introducing Game Theory for the prediction of effort dynamics at vessel level is proposed. The game is performed by the twelve vessels exploiting red shrimp in Blanes. Within the game, two solutions are performed: non-cooperation and cooperation. The first is proposed as a realistic method for the prediction of individual effort strategies and the second is used to illustrate the potential profitability of the analysed fishery. The effort strategy for each vessel is the number of fishing days per year and their objective is profit maximisation, individual profits for the non-cooperative solution and total profits for the cooperative one. In the present analysis, strategic conflicts arise from the differences between vessels in technical efficiency (catchability coefficient) and economic efficiency (defined here). The ten-year and 1000-iteration stochastic simulations performed for the two effort solutions show that the best strategy from both an economic and a conservationist perspective is homogeneous effort cooperation. However, the results under non-cooperation are more similar to the observed data on effort strategies and landings.
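
    A compact sketch of an effort game of the kind described, using a stylized one-period profit function for three heterogeneous vessels; the parameters and the grid search over effort levels are illustrative and much cruder than the stochastic ten-year simulations in the paper.

```python
import itertools
import numpy as np

# Stylized one-period effort game: each vessel picks fishing days per year.
# The stock available per unit effort falls as total effort rises (a crude stock effect).
price, B0, E_cap = 12.0, 100.0, 1000.0
q = np.array([0.010, 0.012, 0.008])        # technical efficiency (catchability) per vessel
c = np.array([0.9, 1.1, 0.7])              # cost per fishing day per vessel
efforts = np.arange(0, 366, 15)            # candidate fishing days per year

def profits(E):
    E = np.asarray(E, dtype=float)
    stock = B0 * max(0.0, 1.0 - E.sum() / E_cap)
    return price * q * E * stock - c * E

# Non-cooperative solution: iterate best responses until no vessel wants to deviate.
E = np.zeros(3)
for _ in range(100):
    previous = E.copy()
    for i in range(3):
        candidates = [profits(np.where(np.arange(3) == i, e, E))[i] for e in efforts]
        E[i] = efforts[int(np.argmax(candidates))]
    if np.array_equal(E, previous):
        break
print("Non-cooperative effort (days):", E, " total profit:", round(profits(E).sum(), 1))

# Cooperative solution: choose the joint effort vector maximizing total fleet profit.
best = max(itertools.product(efforts, repeat=3), key=lambda e: profits(e).sum())
print("Cooperative effort (days):    ", np.array(best), " total profit:", round(profits(best).sum(), 1))
```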

  13. A Covariance Structure Model Test of Antecedents of Adolescent Alcohol Misuse and a Prevention Effort.

    Science.gov (United States)

    Dielman, T. E.; And Others

    1989-01-01

    Questionnaires were administered to 4,157 junior high school students to determine levels of alcohol misuse, exposure to peer use and misuse of alcohol, susceptibility to peer pressure, internal health locus of control, and self-esteem. Conceptual model of antecendents of adolescent alcohol misuse and effectiveness of a prevention effort was…

  14. Commonalities in WEPP and WEPS and efforts towards a single erosion process model

    NARCIS (Netherlands)

    Visser, S.M.; Flanagan, D.C.

    2004-01-01

    Since the late 1980's, the Agricultural Research Service (ARS) of the United States Department of Agriculture (USDA) has been developing process-based erosion models to predict water erosion and wind erosion. During much of that time, the development efforts of the Water Erosion Prediction Project

  15. A Robotics Systems Design Need: A Design Standard to Provide the Systems Focus that is Required for Longterm Exploration Efforts

    Science.gov (United States)

    Dischinger, H. Charles., Jr.; Mullins, Jeffrey B.

    2005-01-01

    The United States is entering a new period of human exploration of the inner Solar System, and robotic human helpers will be partners in that effort. In order to support integration of these new worker robots into existing and new human systems, a new design standard should be developed, to be called the Robot-Systems Integration Standard (RSIS). It will address the requirements for and constraints upon robotic collaborators with humans. These workers are subject to the same functional constraints as humans of work, reach, and visibility/situational awareness envelopes, and they will deal with the same maintenance and communication interfaces. Thus, the RSIS will be created by discipline experts with the same sort of perspective on these and other interface concerns as human engineers.

  16. Integrating multiple distribution models to guide conservation efforts of an endangered toad

    Science.gov (United States)

    Treglia, Michael L.; Fisher, Robert N.; Fitzgerald, Lee A.

    2015-01-01

    Species distribution models are used for numerous purposes such as predicting changes in species’ ranges and identifying biodiversity hotspots. Although implications of distribution models for conservation are often implicit, few studies use these tools explicitly to inform conservation efforts. Herein, we illustrate how multiple distribution models developed using distinct sets of environmental variables can be integrated to aid in the identification of sites for use in conservation. We focus on the endangered arroyo toad (Anaxyrus californicus), which relies on open, sandy streams and surrounding floodplains in southern California, USA, and northern Baja California, Mexico. Declines of the species are largely attributed to habitat degradation associated with vegetation encroachment, invasive predators, and altered hydrologic regimes. We had three main goals: 1) develop a model of potential habitat for arroyo toads, based on long-term environmental variables and all available locality data; 2) develop a model of the species’ current habitat by incorporating recent remotely-sensed variables and only using recent locality data; and 3) integrate results of both models to identify sites that may be employed in conservation efforts. We used a machine learning technique, Random Forests, to develop the models, focused on riparian zones in southern California. We identified 14.37% and 10.50% of our study area as potential and current habitat for the arroyo toad, respectively. Generally, inclusion of remotely-sensed variables reduced modeled suitability of sites, thus many areas modeled as potential habitat were not modeled as current habitat. We propose such sites could be made suitable for arroyo toads through active management, increasing current habitat by up to 67.02%. Our general approach can be employed to guide conservation efforts of virtually any species with sufficient data necessary to develop appropriate distribution models.
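
    A minimal sketch of a Random Forests distribution model of the general kind used in the study, with scikit-learn standing in for the authors' implementation; the environmental predictors, occurrence records, and the split into "potential" (long-term variables only) versus "current" (adding a remotely sensed variable) models are synthetic placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)
# Synthetic presence/absence records with placeholder environmental predictors.
n = 500
X = np.column_stack([
    rng.uniform(0, 40, n),      # e.g., mean summer temperature (long-term)
    rng.uniform(0, 2000, n),    # e.g., annual precipitation (long-term)
    rng.uniform(0, 1, n),       # e.g., recent NDVI from remote sensing
])
presence = ((X[:, 0] > 15) & (X[:, 1] > 300) & (X[:, 2] < 0.6)).astype(int)

# "Potential habitat" model: long-term variables only; "current habitat": add remote sensing.
potential = RandomForestClassifier(n_estimators=200, random_state=0).fit(X[:, :2], presence)
current = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, presence)

cell = np.array([[22.0, 450.0, 0.8]])       # one hypothetical stream reach
print("potential suitability:", potential.predict_proba(cell[:, :2])[0, 1])
print("current suitability:  ", current.predict_proba(cell)[0, 1])
```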

  17. The Effect of the Demand Control and Effort Reward Imbalance Models on the Academic Burnout of Korean Adolescents

    Science.gov (United States)

    Lee, Jayoung; Puig, Ana; Lee, Sang Min

    2012-01-01

    The purpose of this study was to examine the effects of the Demand Control Model (DCM) and the Effort Reward Imbalance Model (ERIM) on academic burnout for Korean students. Specifically, this study identified the effects of the predictor variables based on DCM and ERIM (i.e., demand, control, effort, reward, Demand Control Ratio, Effort Reward…

  18. A New Software Reliability Growth Model: Multigeneration Faults and a Power-Law Testing-Effort Function

    Directory of Open Access Journals (Sweden)

    Fan Li

    2016-01-01

    Full Text Available Software reliability growth models (SRGMs) based on a nonhomogeneous Poisson process (NHPP) are widely used to describe the stochastic failure behavior and assess the reliability of software systems. For these models, the testing-effort effect and the fault interdependency play significant roles. Considering a power-law function of testing effort and the interdependency of multigeneration faults, we propose a modified SRGM to reconsider the reliability of open source software (OSS) systems and then to validate the model’s performance using several real-world data. Our empirical experiments show that the model well fits the failure data and presents a high-level prediction capability. We also formally examine the optimal policy of software release, considering both the testing cost and the reliability requirement. By conducting sensitivity analysis, we find that if the testing-effort effect or the fault interdependency was ignored, the best time to release software would be seriously delayed and more resources would be misplaced in testing the software.
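
    A small sketch of an NHPP reliability growth model driven by a power-law testing-effort function, in the spirit of the model described (the multigeneration fault-interdependency terms are omitted); parameter values are illustrative, not fitted to any dataset.

```python
import numpy as np

def testing_effort(t, alpha=10.0, beta=0.8):
    """Cumulative testing effort, assumed to follow a power law: W(t) = alpha * t^beta."""
    return alpha * t ** beta

def expected_failures(t, a=120.0, b=0.02):
    """NHPP mean value function driven by testing effort: m(t) = a * (1 - exp(-b * W(t)))."""
    return a * (1.0 - np.exp(-b * testing_effort(t)))

def reliability(x, t):
    """Probability of no failure in (t, t + x] given testing up to time t."""
    return np.exp(-(expected_failures(t + x) - expected_failures(t)))

for week in (4, 12, 26, 52):
    print(f"week {week:2d}: expected faults found = {expected_failures(week):5.1f}, "
          f"R(1 week) = {reliability(1, week):.3f}")
```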

  19. Fundamental Drop Dynamics and Mass Transfer Experiments to Support Solvent Extraction Modeling Efforts

    International Nuclear Information System (INIS)

    Christensen, Kristi; Rutledge, Veronica; Garn, Troy

    2011-01-01

    In support of the Nuclear Energy Advanced Modeling Simulation Safeguards and Separations (NEAMS SafeSep) program, the Idaho National Laboratory (INL) worked in collaboration with Los Alamos National Laboratory (LANL) to further a modeling effort designed to predict mass transfer behavior for selected metal species between individual dispersed drops and a continuous phase in a two phase liquid-liquid extraction (LLE) system. The purpose of the model is to understand the fundamental processes of mass transfer that occur at the drop interface. This fundamental understanding can be extended to support modeling of larger LLE equipment such as mixer settlers, pulse columns, and centrifugal contactors. The work performed at the INL involved gathering the necessary experimental data to support the modeling effort. A custom experimental apparatus was designed and built for performing drop contact experiments to measure mass transfer coefficients as a function of contact time. A high speed digital camera was used in conjunction with the apparatus to measure size, shape, and velocity of the drops. In addition to drop data, the physical properties of the experimental fluids were measured to be used as input data for the model. Physical properties measurements included density, viscosity, surface tension and interfacial tension. Additionally, self diffusion coefficients for the selected metal species in each experimental solution were measured, and the distribution coefficient for the metal partitioning between phases was determined. At the completion of this work, the INL has determined the mass transfer coefficient and a velocity profile for drops rising by buoyancy through a continuous medium under a specific set of experimental conditions. Additionally, a complete set of experimentally determined fluid properties has been obtained. All data will be provided to LANL to support the modeling effort.

  20. Quantitative Analysis of the Security of Software-Defined Network Controller Using Threat/Effort Model

    Directory of Open Access Journals (Sweden)

    Zehui Wu

    2017-01-01

    Full Text Available The SDN-based controller, which is responsible for the configuration and management of the network, is the core of Software-Defined Networks. Current methods, which focus on the secure mechanism, use qualitative analysis to estimate the security of controllers, frequently leading to inaccurate results. In this paper, we employ a quantitative approach to overcome this shortcoming. Under the analysis of the controller threat model, we give the formal model results of the APIs, the protocol interfaces, and the data items of the controller, and further provide our Threat/Effort quantitative calculation model. With the help of the Threat/Effort model, we are able to compare not only the security of different versions of the same kind of controller but also different kinds of controllers, and we provide a basis for controller selection and secure development. We evaluated our approach on four widely used SDN-based controllers: POX, OpenDaylight, Floodlight, and Ryu. The test, which shows similar outcomes to the traditional qualitative analysis, demonstrates that with our approach we are able to obtain specific security values for different controllers and present more accurate results.

  1. Model recommendations meet management reality: implementation and evaluation of a network-informed vaccination effort for endangered Hawaiian monk seals

    Science.gov (United States)

    Barbieri, Michelle M.; Murphy, Samantha; Baker, Jason D.; Harting, Albert L.; Craft, Meggan E.; Littnan, Charles L.

    2018-01-01

    Where disease threatens endangered wildlife populations, substantial resources are required for management actions such as vaccination. While network models provide a promising tool for identifying key spreaders and prioritizing efforts to maximize efficiency, population-scale vaccination remains rare, providing few opportunities to evaluate performance of model-informed strategies under realistic scenarios. Because the endangered Hawaiian monk seal could be heavily impacted by disease threats such as morbillivirus, we implemented a prophylactic vaccination programme. We used contact networks to prioritize vaccinating animals with high contact rates. We used dynamic network models to simulate morbillivirus outbreaks under real and idealized vaccination scenarios. We then evaluated the efficacy of model recommendations in this real-world vaccination project. We found that deviating from the model recommendations decreased efficiency, requiring 44% more vaccinations to achieve a given decrease in outbreak size. However, we gained protection more quickly by vaccinating available animals rather than waiting to encounter priority seals. This work demonstrates the value of network models, but also makes trade-offs clear. If vaccines were limited but time was ample, vaccinating only priority animals would maximize herd protection. However, where time is the limiting factor, vaccinating additional lower-priority animals could more quickly protect the population. PMID:29321294
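
    A small sketch of the degree-based targeting idea using networkx: rank animals by contact-network degree, vaccinate the most-connected ones, and compare against vaccinating whichever animals happen to be available. The toy network and the crude connected-component outbreak proxy stand in for the paper's dynamic network models.

```python
import random
import networkx as nx

random.seed(1)
# Toy contact network standing in for an observed seal association network.
G = nx.barabasi_albert_graph(n=200, m=2, seed=1)

def outbreak_size(graph, vaccinated, trials=200):
    """Crude proxy: average reachable set from a random unvaccinated index case
    after removing vaccinated nodes (ignores transmission probability and timing)."""
    H = graph.copy()
    H.remove_nodes_from(vaccinated)
    sizes = []
    for _ in range(trials):
        index_case = random.choice(list(H.nodes))
        sizes.append(len(nx.node_connected_component(H, index_case)))
    return sum(sizes) / len(sizes)

n_doses = 30
priority = [n for n, _ in sorted(G.degree, key=lambda kv: kv[1], reverse=True)][:n_doses]
opportunistic = random.sample(list(G.nodes), n_doses)   # whichever animals are encountered

print("mean outbreak size, degree-targeted :", round(outbreak_size(G, priority), 1))
print("mean outbreak size, opportunistic   :", round(outbreak_size(G, opportunistic), 1))
```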

  2. Meta-requirements that Model Change

    OpenAIRE

    Gouri Prakash

    2010-01-01

    One of the common problems encountered in software engineering is addressing and responding to the changing nature of requirements. While several approaches have been devised to address this issue, ranging from instilling resistance to changing requirements in order to mitigate impact to project schedules, to developing an agile mindset towards requirements, the approach discussed in this paper is one of conceptualizing the delta in requirement and modeling it, in order t...

  3. Simulation and modeling efforts to support decision making in healthcare supply chain management.

    Science.gov (United States)

    AbuKhousa, Eman; Al-Jaroodi, Jameela; Lazarova-Molnar, Sanja; Mohamed, Nader

    2014-01-01

    Recently, most healthcare organizations focus their attention on reducing the cost of their supply chain management (SCM) by improving the efficiency of the associated decision-making processes. The availability of products through healthcare SCM is often a matter of life or death to the patient; therefore, trial and error approaches are not an option in this environment. Simulation and modeling (SM) has been presented as an alternative approach for supply chain managers in healthcare organizations to test solutions and to support decision making processes associated with various SCM problems. This paper presents and analyzes past SM efforts to support decision making in healthcare SCM and identifies the key challenges associated with healthcare SCM modeling. We also present and discuss emerging technologies to meet these challenges.

  4. Simulation and Modeling Efforts to Support Decision Making in Healthcare Supply Chain Management

    Directory of Open Access Journals (Sweden)

    Eman AbuKhousa

    2014-01-01

    Full Text Available Recently, most healthcare organizations focus their attention on reducing the cost of their supply chain management (SCM) by improving the efficiency of the associated decision-making processes. The availability of products through healthcare SCM is often a matter of life or death to the patient; therefore, trial and error approaches are not an option in this environment. Simulation and modeling (SM) has been presented as an alternative approach for supply chain managers in healthcare organizations to test solutions and to support decision making processes associated with various SCM problems. This paper presents and analyzes past SM efforts to support decision making in healthcare SCM and identifies the key challenges associated with healthcare SCM modeling. We also present and discuss emerging technologies to meet these challenges.

  5. Competition for marine space: modelling the Baltic Sea fisheries and effort displacement under spatial restrictions

    DEFF Research Database (Denmark)

    Bastardie, Francois; Nielsen, J. Rasmus; Eigaard, Ole Ritzau

    2015-01-01

    to fishery and from vessel to vessel. The impact assessment of new spatial plans involving fisheries should be based on quantitative bioeconomic analyses that take into account individual vessel decisions, and trade-offs in cross-sector conflicting interests. We use a vessel-oriented decision-support tool (the...... DISPLACE model) to combine stochastic variations in spatial fishing activities with harvested resource dynamics in scenario projections. The assessment computes economic and stock status indicators by modelling the activity of Danish, Swedish, and German vessels (≥12 m) in the international western Baltic...... Sea commercial fishery, together with the underlying size-based distribution dynamics of the main fishery resources of sprat, herring, and cod. The outcomes of alternative scenarios for spatial effort displacement are exemplified by evaluating the fishers’ abilities to adapt to spatial plans under...

  6. Modeling Efforts to Aid in the Determination of Process Enrichment Levels for Identifying Potential Material Diversion

    International Nuclear Information System (INIS)

    Guenther, C F; Elayat, H A; O'Connell, W J

    2006-01-01

    Efforts have been under way at Lawrence Livermore National Laboratory (LLNL) to develop detailed analytical models that simulate enrichment and conversion facilities for the purpose of aiding in the detection of material diversion as part of an overall safeguards strategy. These models could be used to confirm proper accountability of the nuclear materials at facilities worldwide. Operation of an enrichment process for manufacturing commercial reactor fuel presents proliferation concerns including both diversion and the potential for further enrichment to make weapons-grade material. While inspections of foreign reprocessing facilities by the International Atomic Energy Agency (IAEA) are meant to ensure that such diversion is not occurring, it must be verified that such diversion is not taking place, through both examination of the facility and specific measurements such as the radiation fields outside of various process lines. Our current effort is to develop algorithms, to be incorporated into the current process models, that would provide both neutron and gamma radiation fields outside any process line for the purpose of determining the most effective locations for placing in-plant monitoring equipment. These algorithms, while providing dose and spectral information, could also be designed to provide detector responses that could be physically measured at various points on the process line. Such information could be used to optimize detector locations in support of real-time on-site monitoring to determine the enrichment levels within a process stream. The results of parametric analyses to establish expected variations for several different process streams and configurations are presented. Based upon these results, the capability of a sodium iodide (NaI(Tl)), high-purity germanium (HPGe), or neutron detection system is being investigated from the standpoint of their viability in quantitatively measuring and discerning the enrichment and potential

  7. Requirements for clinical information modelling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Kalra, Dipak

    2015-07-01

    This study proposes consensus requirements for clinical information modelling tools that can support modelling tasks in medium/large scale institutions. Rather than identify which functionalities are currently available in existing tools, the study has focused on functionalities that should be covered in order to provide guidance about how to evolve the existing tools. After identifying a set of 56 requirements for clinical information modelling tools based on a literature review and interviews with experts, a classical Delphi study methodology was applied to conduct a two-round survey in order to classify them as essential or recommended. Essential requirements are those that must be met by any tool that claims to be suitable for clinical information modelling, and, should a certified tools list one day exist, any tool that does not meet the essential criteria would be excluded. Recommended requirements are those more advanced requirements that may be met by tools offering a superior product or only needed in certain modelling situations. According to the answers provided by 57 experts from 14 different countries, we found a high level of agreement, enabling the study to identify 20 essential and 21 recommended requirements for these tools. It is expected that this list of identified requirements will guide developers on the inclusion of new basic and advanced functionalities that have strong support by end users. This list could also help regulators identify requirements that could be demanded of tools adopted within their institutions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  8. Linking sociological with physiological data: the model of effort-reward imbalance at work.

    Science.gov (United States)

    Siegrist, J; Klein, D; Voigt, K H

    1997-01-01

    While socio-epidemiologic studies have documented impressive associations of indicators of chronic psychosocial stress with cardiovascular (c.v.) disease, evidence on patho-physiologic processes is still limited. In this regard, the concept of heightened c.v. and hormonal reactivity (RE) to mental stress was proposed and explored. While this concept is a static one, we suggest a more dynamic two-stage model of RE where recurrent high responsiveness (stage 1) in the long run results in attenuated, reduced maximal RE due to functional adaptation (stage 2). We present results of an indirect test of this hypothesis in a group of 68 healthy middle-aged men undergoing a modified Stroop Test: in men suffering from high chronic work stress in terms of effort-reward imbalance, significantly reduced RE in heart rate, adrenaline, and cortisol was found after adjusting for relevant confounders. In conclusion, results underscore the potential of linking sociological with physiological data in stress research.

  9. A Collaborative Effort Between Caribbean States for Tsunami Numerical Modeling: Case Study CaribeWave15

    Science.gov (United States)

    Chacón-Barrantes, Silvia; López-Venegas, Alberto; Sánchez-Escobar, Rónald; Luque-Vergara, Néstor

    2017-10-01

    Historical records have shown that tsunamis have affected the Caribbean region in the past. Although infrequent, recent studies have demonstrated that they pose a latent hazard for countries within this basin. The Hazard Assessment Working Group of the ICG/CARIBE-EWS (Intergovernmental Coordination Group of the Early Warning System for Tsunamis and Other Coastal Threats for the Caribbean Sea and Adjacent Regions) of IOC/UNESCO has a modeling subgroup, which seeks to develop a modeling platform to assess the effects of possible tsunami sources within the basin. The CaribeWave tsunami exercise is carried out annually in the Caribbean region to increase awareness and test tsunami preparedness of countries within the basin. In this study we present results of tsunami inundation using the CaribeWave15 exercise scenario for four selected locations within the Caribbean basin (Colombia, Costa Rica, Panamá and Puerto Rico), performed by tsunami modeling researchers from those selected countries. The purpose of this study was to provide the states with additional results for the exercise. The results obtained here were compared to co-seismic deformation and tsunami heights within the basin (energy plots) provided for the exercise to assess the performance of the decision support tools distributed by PTWC (Pacific Tsunami Warning Center), the tsunami service provider for the Caribbean basin. However, comparison of coastal tsunami heights was not possible, due to inconsistencies between the provided fault parameters and the modeling results within the provided exercise products. Still, the modeling performed here allowed us to analyze tsunami characteristics at the mentioned states from sources within the North Panamá Deformed Belt. The occurrence of a tsunami in the Caribbean may affect several countries because a great variety of them share coastal zones in this basin. Therefore, collaborative efforts similar to the one presented in this study, particularly between neighboring

  10. Index of Effort: An Analytical Model for Evaluating and Re-Directing Student Recruitment Activities for a Local Community College.

    Science.gov (United States)

    Landini, Albert J.

    This index of effort is proposed as a means by which those in charge of student recruitment activities at community colleges can be sure that their efforts are being directed toward all of the appropriate population. The index is an analytical model based on the concept of socio-economic profiles, using small area 1970 census data, and is the…

  11. Prediction Model for Object Oriented Software Development Effort Estimation Using One Hidden Layer Feed Forward Neural Network with Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Chandra Shekhar Yadav

    2014-01-01

    Full Text Available The budget computation for software development is affected by the prediction of software development effort and schedule. Software development effort and schedule can be predicted precisely on the basis of past software project data sets. In this paper, a model for object-oriented software development effort estimation using one hidden layer feed forward neural network (OHFNN) has been developed. The model has been further optimized with the help of genetic algorithm by taking weight vector obtained from OHFNN as initial population for the genetic algorithm. Convergence has been obtained by minimizing the sum of squared errors of each input vector and optimal weight vector has been determined to predict the software development effort. The model has been empirically validated on the PROMISE software engineering repository dataset. Performance of the model is more accurate than the well-established constructive cost model (COCOMO).
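    A minimal sketch of the general approach is given below: a one-hidden-layer feed-forward network whose flattened weight vector is tuned by a naive genetic algorithm minimizing the sum of squared errors. It is an illustration only, on synthetic data; the paper seeds the GA with weights from trained OHFNNs and validates on the PROMISE repository, neither of which is reproduced here.

```python
# Illustrative sketch (not the paper's implementation): a one hidden layer
# feed-forward network whose flattened weight vector is optimized by a
# naive genetic algorithm that minimizes the sum of squared errors.
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((60, 4))                       # synthetic project features
y = X @ np.array([2.0, 1.0, 0.5, 3.0]) + 0.1 * rng.standard_normal(60)

n_in, n_hid = X.shape[1], 6
n_w = n_in * n_hid + n_hid + n_hid + 1        # weights + biases, both layers

def predict(w, X):
    W1 = w[: n_in * n_hid].reshape(n_in, n_hid)
    b1 = w[n_in * n_hid : n_in * n_hid + n_hid]
    W2 = w[n_in * n_hid + n_hid : -1]
    b2 = w[-1]
    h = np.tanh(X @ W1 + b1)                  # hidden layer
    return h @ W2 + b2                        # linear output (effort estimate)

def sse(w):
    return float(np.sum((predict(w, X) - y) ** 2))

# Naive GA: truncation selection, blend crossover, Gaussian mutation.
pop = rng.standard_normal((40, n_w))
for generation in range(200):
    fitness = np.array([sse(ind) for ind in pop])
    parents = pop[np.argsort(fitness)[:10]]   # keep the 10 best
    children = []
    while len(children) < len(pop) - len(parents):
        a, b = parents[rng.integers(10)], parents[rng.integers(10)]
        children.append(0.5 * (a + b) + 0.05 * rng.standard_normal(n_w))
    pop = np.vstack([parents, children])

best = pop[np.argmin([sse(ind) for ind in pop])]
print("best SSE:", sse(best))
```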

  12. Habitat models to assist plant protection efforts in Shenandoah National Park, Virginia, USA

    Science.gov (United States)

    Van Manen, F.T.; Young, J.A.; Thatcher, C.A.; Cass, W.B.; Ulrey, C.

    2005-01-01

    During 2002, the National Park Service initiated a demonstration project to develop science-based law enforcement strategies for the protection of at-risk natural resources, including American ginseng (Panax quinquefolius L.), bloodroot (Sanguinaria canadensis L.), and black cohosh (Cimicifuga racemosa (L.) Nutt. [syn. Actaea racemosa L.]). Harvest pressure on these species is increasing because of the growing herbal remedy market. We developed habitat models for Shenandoah National Park and the northern portion of the Blue Ridge Parkway to determine the distribution of favorable habitats of these three plant species and to demonstrate the use of that information to support plant protection activities. We compiled locations for the three plant species to delineate favorable habitats with a geographic information system (GIS). We mapped potential habitat quality for each species by calculating a multivariate statistic, Mahalanobis distance, based on GIS layers that characterized the topography, land cover, and geology of the plant locations (10-m resolution). We tested model performance with an independent dataset of plant locations, which indicated a significant relationship between Mahalanobis distance values and species occurrence. We also generated null models by examining the distribution of the Mahalanobis distance values had plants been distributed randomly. For all species, the habitat models performed markedly better than their respective null models. We used our models to direct field searches to the most favorable habitats, resulting in a sizeable number of new plant locations (82 ginseng, 73 bloodroot, and 139 black cohosh locations). The odds of finding new plant locations based on the habitat models were 4.5 (black cohosh) to 12.3 (American ginseng) times greater than random searches; thus, the habitat models can be used to improve the efficiency of plant protection efforts (e.g., marking of plants, law enforcement activities). The field searches also
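    The core of such a habitat model is the Mahalanobis distance of each grid cell's environmental covariates from the multivariate mean of known plant locations; smaller distances indicate more favorable habitat. The sketch below illustrates the calculation with invented covariates (elevation, slope, canopy cover) rather than the study's GIS layers.

```python
# Sketch of the core habitat-model calculation: squared Mahalanobis distance
# of each grid cell's covariates from the multivariate mean of occupied cells.
# Low values = favorable habitat. Covariates are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(1)
occupied = rng.normal(loc=[800.0, 15.0, 0.8], scale=[50.0, 5.0, 0.1], size=(200, 3))

mean = occupied.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(occupied, rowvar=False))

def mahalanobis_sq(cells):
    d = cells - mean
    return np.einsum("ij,jk,ik->i", d, cov_inv, d)   # quadratic form per cell

# Score two hypothetical grid cells: one similar to occupied sites, one not.
cells = np.array([[810.0, 14.0, 0.75], [300.0, 2.0, 0.1]])
print(mahalanobis_sq(cells))  # small value = favorable, large = unfavorable
```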

  13. On the importance of controlling for effort in analysis of count survey data: Modeling population change from Christmas Bird Count data

    Science.gov (United States)

    Link, W.A.; Sauer, J.R.; Helbig, Andreas J.; Flade, Martin

    1999-01-01

    Count survey data are commonly used for estimating temporal and spatial patterns of population change. Since count surveys are not censuses, counts can be influenced by 'nuisance factors' related to the probability of detecting animals but unrelated to the actual population size. The effects of systematic changes in these factors can be confounded with patterns of population change. Thus, valid analysis of count survey data requires the identification of nuisance factors and flexible models for their effects. We illustrate using data from the Christmas Bird Count (CBC), a midwinter survey of bird populations in North America. CBC survey effort has substantially increased in recent years, suggesting that unadjusted counts may overstate population growth (or understate declines). We describe a flexible family of models for the effect of effort that includes models in which increasing effort leads to diminishing returns in terms of the number of birds counted.
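    One simple member of such a family is a power-law effort adjustment, in which the expected count grows as effort raised to a power b with 0 < b < 1, so doubling effort yields less than double the count. The sketch below fits this generic form to synthetic data; it is for orientation only and is not necessarily the exact model family used in the paper.

```python
# Hedged illustration of an effort adjustment with diminishing returns:
# expected count = N * effort**b, with b < 1 implying diminishing returns.
# Data are synthetic stand-ins for CBC circle counts and party-hours.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)
effort = rng.uniform(1.0, 20.0, size=100)          # e.g., party-hours per circle
true_N, true_b = 30.0, 0.4
counts = rng.poisson(true_N * effort**true_b)

def expected_count(effort, N, b):
    return N * effort**b

(N_hat, b_hat), _ = curve_fit(expected_count, effort, counts, p0=[10.0, 1.0])
print(f"N = {N_hat:.1f}, b = {b_hat:.2f}  (b < 1 implies diminishing returns)")
```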

  14. Modeling the Movement of Homicide by Type to Inform Public Health Prevention Efforts.

    Science.gov (United States)

    Zeoli, April M; Grady, Sue; Pizarro, Jesenia M; Melde, Chris

    2015-10-01

    We modeled the spatiotemporal movement of hotspot clusters of homicide by motive in Newark, New Jersey, to investigate whether different homicide types have different patterns of clustering and movement. We obtained homicide data from the Newark Police Department Homicide Unit's investigative files from 1997 through 2007 (n = 560). We geocoded the address at which each homicide victim was found and recorded the date of and the motive for the homicide. We used cluster detection software to model the spatiotemporal movement of statistically significant homicide clusters by motive, using census tract and month of occurrence as the spatial and temporal units of analysis. Gang-motivated homicides showed evidence of clustering and diffusion through Newark. Additionally, gang-motivated homicide clusters overlapped to a degree with revenge and drug-motivated homicide clusters. Escalating dispute and nonintimate familial homicides clustered; however, there was no evidence of diffusion. Intimate partner and robbery homicides did not cluster. By tracking how homicide types diffuse through communities and determining which places have ongoing or emerging homicide problems by type, we can better inform the deployment of prevention and intervention efforts.

  15. Examination of a Process Model of Adolescent Smoking Self-Change Efforts in Relation to Gender

    Science.gov (United States)

    MacPherson, Laura; Myers, Mark G.

    2010-01-01

    Little information describes how adolescents change their smoking behavior. This study investigated the role of gender in the relationship of motivation and cognitive variables with adolescent smoking self-change efforts. Self-report and semi-structured interview data from a prospective study of smoking self-change efforts were examined among 98…

  16. A fuzzy model for exploiting customer requirements

    Directory of Open Access Journals (Sweden)

    Zahra Javadirad

    2016-09-01

    Full Text Available Nowadays, Quality function deployment (QFD) is one of the total quality management tools, where customers’ views and requirements are gathered and, through various techniques, used to improve production requirements and operations. The QFD department, after identification and analysis of the competitors, uses customers’ feedback to meet the customers’ demands for the products compared with the competitors. In this study, a comprehensive model for assessing the importance of the customer requirements in the products or services for an organization is proposed. The proposed study uses linguistic variables, as a more comprehensive approach, to increase the precision of the expression evaluations. The importance of these requirements specifies the strengths and weaknesses of the organization in meeting the requirements relative to competitors. The results of these experiments show that the proposed method performs better than the other methods.

  17. MODELING THE STRUCTURAL RELATIONS AMONG LEARNING STRATEGIES, SELF-EFFICACY BELIEFS, AND EFFORT REGULATION

    Directory of Open Access Journals (Sweden)

    Şenol Şen

    2016-06-01

    Full Text Available This research examined the relations among students’ learning strategies (elaboration, organization, critical thinking, and metacognitive learning strategies), self-efficacy beliefs, and effort regulation. The Motivated Strategies for Learning Questionnaire (MSLQ) was used to measure students’ learning strategies, self-efficacy beliefs, and effort regulation. A total of 227 high school students participated in the research. Confirmatory factor analysis and path analysis were performed to examine the relations among the variables of the research. Results revealed that students’ metacognitive learning strategies and self-efficacy beliefs statistically significantly predicted their effort regulation. In addition, the students’ self-efficacy beliefs directly affected deep cognitive learning strategies and effort regulation but indirectly affected metacognitive learning strategies. Furthermore, 88.6% of the variance in effort regulation was explained by metacognitive learning strategies and self-efficacy beliefs.

  18. Economic effort management in multispecies fisheries: the FcubEcon model

    DEFF Research Database (Denmark)

    Hoff, Ayoe; Frost, Hans; Ulrich, Clara

    2010-01-01

    in the development of management tools based on fleets, fisheries, and areas, rather than on unit fish stocks. A natural consequence of this has been to consider effort rather than quota management, a final effort decision being based on fleet-harvest potential and fish-stock-preservation considerations. Effort allocation between fleets should not be based on biological considerations alone, but also on the economic behaviour of fishers, because fisheries management has a significant impact on human behaviour as well as on ecosystem development. The FcubEcon management framework for effort allocation between fleets......

  19. Upending the social ecological model to guide health promotion efforts toward policy and environmental change.

    Science.gov (United States)

    Golden, Shelley D; McLeroy, Kenneth R; Green, Lawrence W; Earp, Jo Anne L; Lieberman, Lisa D

    2015-04-01

    Efforts to change policies and the environments in which people live, work, and play have gained increasing attention over the past several decades. Yet health promotion frameworks that illustrate the complex processes that produce health-enhancing structural changes are limited. Building on the experiences of health educators, community activists, and community-based researchers described in this supplement and elsewhere, as well as several political, social, and behavioral science theories, we propose a new framework to organize our thinking about producing policy, environmental, and other structural changes. We build on the social ecological model, a framework widely employed in public health research and practice, by turning it inside out, placing health-related and other social policies and environments at the center, and conceptualizing the ways in which individuals, their social networks, and organized groups produce a community context that fosters healthy policy and environmental development. We conclude by describing how health promotion practitioners and researchers can foster structural change by (1) conveying the health and social relevance of policy and environmental change initiatives, (2) building partnerships to support them, and (3) promoting more equitable distributions of the resources necessary for people to meet their daily needs, control their lives, and freely participate in the public sphere. © 2015 Society for Public Health Education.

  20. Overview of past, ongoing and future efforts of the integrated modeling of global change for Northern Eurasia

    Science.gov (United States)

    Monier, Erwan; Kicklighter, David; Sokolov, Andrei; Zhuang, Qianlai; Melillo, Jerry; Reilly, John

    2016-04-01

    Northern Eurasia is both a major player in the global carbon budget (it includes roughly 70% of the Earth's boreal forest and more than two-thirds of the Earth's permafrost) and a region that has experienced dramatic climate change (increase in temperature, growing season length, floods and droughts) over the past century. Northern Eurasia has also undergone significant land-use change, both driven by human activity (including deforestation, expansion of agricultural lands and urbanization) and natural disturbances (such as wildfires and insect outbreaks). These large environmental and socioeconomic impacts have major implications for the carbon cycle in the region. Northern Eurasia is made up of a diverse set of ecosystems that range from tundra to forests, with significant areas of croplands and pastures as well as deserts, with major urban areas. As such, it represents a complex system with substantial challenges for the modeling community. In this presentation, we provide an overview of past, ongoing and possible future efforts of the integrated modeling of global change for Northern Eurasia. We review the variety of existing modeling approaches to investigate specific components of Earth system dynamics in the region. While there are a limited number of studies that try to integrate various aspects of the Earth system (through scale, teleconnections or processes), we point out that there are few systematic analyses of the various feedbacks within the Earth system (between components, regions or scale). As a result, there is a lack of knowledge of the relative importance of such feedbacks, and it is unclear how policy relevant current studies are that fail to account for these feedbacks. We review the role of Earth system models, and their advantages/limitations compared to detailed single component models. We further introduce the human activity system (global trade, economic models, demographic model and so on), the need for coupled human/earth system models

  1. Modeling requirements for in situ vitrification

    International Nuclear Information System (INIS)

    MacKinnon, R.J.; Mecham, D.C.; Hagrman, D.L.; Johnson, R.W.; Murray, P.E.; Slater, C.E.; Marwil, E.S.; Weaver, R.A.; Argyle, M.D.

    1991-11-01

    This document outlines the requirements for the model being developed at the INEL which will provide analytical support for the ISV technology assessment program. The model includes representations of the electric potential field, thermal transport with melting, gas and particulate release, vapor migration, off-gas combustion and process chemistry. The modeling objectives are to (1) help determine the safety of the process by assessing the air and surrounding soil radionuclide and chemical pollution hazards, the nuclear criticality hazard, and the explosion and fire hazards, (2) help determine the suitability of the ISV process for stabilizing the buried wastes involved, and (3) help design laboratory and field tests and interpret results therefrom

  2. The complex nature of mixed farming systems requires multidimensional actions supported by integrative research and development efforts

    DEFF Research Database (Denmark)

    González-García, E; Gourdine, J L; Alexandre, G

    2012-01-01

    and animal systems and finally increasing opportunities in rural livelihoods. Focusing our analysis and discussion on field experiences and empirical knowledge in the Caribbean islands, this paper discusses the opportunities for a change needed in current MFS research–development philosophy. The importance...... the requirement for a change in research strategies and initiatives through the development of a complex but necessary multi-/inter-/trans-disciplinary teamwork spirit. We stress as essential the collaboration and active participation of local and regional actors, stakeholders and end-users in the identification...

  3. Evaluation of an ARPS-based canopy flow modeling system for use in future operational smoke prediction efforts

    Science.gov (United States)

    M. T. Kiefer; S. Zhong; W. E. Heilman; J. J. Charney; X. Bian

    2013-01-01

    Efforts to develop a canopy flow modeling system based on the Advanced Regional Prediction System (ARPS) model are discussed. The standard version of ARPS is modified to account for the effect of drag forces on mean and turbulent flow through a vegetation canopy, via production and sink terms in the momentum and subgrid-scale turbulent kinetic energy (TKE) equations....
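    For orientation, a commonly used canopy-drag parameterization is shown below; the exact terms added to ARPS in this work may differ. Here C_d is the drag coefficient, a(z) the leaf area density, |U| the wind speed, u_i a velocity component, e the subgrid-scale TKE, and beta_p, beta_d empirical coefficients.

```latex
% Generic Shaw-and-Schumann-type canopy terms, shown for illustration only:
% a drag sink in the momentum equations and production/dissipation terms
% in the subgrid-scale TKE equation.
\begin{align}
  \left.\frac{\partial u_i}{\partial t}\right|_{\mathrm{canopy}} &= -\,C_d\, a(z)\, \lvert U \rvert\, u_i,\\[4pt]
  \left.\frac{\partial e}{\partial t}\right|_{\mathrm{canopy}} &= C_d\, a(z)\left(\beta_p \lvert U \rvert^{3} - \beta_d \lvert U \rvert\, e\right).
\end{align}
```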

  4. DISPLACE: a dynamic, individual-based model for spatial fishing planning and effort displacement: Integrating underlying fish population models

    DEFF Research Database (Denmark)

    Bastardie, Francois; Nielsen, J. Rasmus; Miethe, Tanja

    We previously developed a spatially explicit, individual-based model (IBM) evaluating the bio-economic efficiency of fishing vessel movements between regions according to the catching and targeting of different species based on the most recent high resolution spatial fishery data. The main purpose...... was to test the effects of alternative fishing effort allocation scenarios related to fuel consumption, energy efficiency (value per litre of fuel), sustainable fish stock harvesting, and profitability of the fisheries. The assumption here was constant underlying resource availability. Now, an advanced...... or to the alteration of individual fishing patterns. We demonstrate that integrating the spatial activity of vessels and local fish stock abundance dynamics allows for interactions and more realistic predictions of fishermen behaviour, revenues and stock abundance......

  5. Modelling human resource requirements for the nuclear industry in Europe

    Energy Technology Data Exchange (ETDEWEB)

    Roelofs, Ferry [Nuclear Research and Consultancy Group (NRG) (Netherlands); Flore, Massimo; Estorff, Ulrik von [Joint Research Center (JRC) (Netherlands)

    2017-11-15

    The European Human Resource Observatory for Nuclear (EHRO-N) provides the European Commission with essential data related to supply and demand for nuclear experts in the EU-28 and the enlargement and integration countries based on bottom-up information from the nuclear industry. The objective is to assess how the supply of experts for the nuclear industry responds to the needs for the same experts for present and future nuclear projects in the region. Complementary to the bottom-up approach taken by the EHRO-N team at JRC, a top-down modelling approach has been taken in a collaboration with NRG in the Netherlands. This top-down modelling approach focuses on the human resource requirements for operation, construction, decommissioning, and efforts for long term operation of nuclear power plants. This paper describes the top-down methodology, the model input, the main assumptions, and the results of the analyses.

  6. One State's Systems Change Efforts to Reduce Child Care Expulsion: Taking the Pyramid Model to Scale

    Science.gov (United States)

    Vinh, Megan; Strain, Phil; Davidon, Sarah; Smith, Barbara J.

    2016-01-01

    This article describes the efforts funded by the state of Colorado to address unacceptably high rates of expulsion from child care. Based on the results of a 2006 survey, the state of Colorado launched two complementary policy initiatives in 2009 to impact expulsion rates and to improve the use of evidence-based practices related to challenging…

  7. Bodily Effort Enhances Learning and Metacognition: Investigating the Relation Between Physical Effort and Cognition Using Dual-Process Models of Embodiment.

    Science.gov (United States)

    Skulmowski, Alexander; Rey, Günter Daniel

    2017-01-01

    Recent embodiment research revealed that cognitive processes can be influenced by bodily cues. Some of these cues were found to elicit disparate effects on cognition. For instance, weight sensations can inhibit problem-solving performance, but were shown to increase judgments regarding recall probability (judgments of learning; JOLs) in memory tasks. We investigated the effects of physical effort on learning and metacognition by conducting two studies in which we varied whether a backpack was worn or not while 20 nouns were to be learned. Participants entered a JOL for each word and completed a recall test. Experiment 1 (N = 18) revealed that exerting physical effort by wearing a backpack led to higher JOLs for easy nouns, without a notable effect on difficult nouns. Participants who wore a backpack reached higher recall scores. Therefore, physical effort may act as a form of desirable difficulty during learning. In Experiment 2 (N = 30), the influence of physical effort on JOLs and learning disappeared when more difficult nouns were to be learned, implying that a high cognitive load may diminish bodily effects. These findings suggest that physical effort mainly influences superficial modes of thought and raise doubts concerning the explanatory power of metaphor-centered accounts of embodiment for higher-level cognition.

  8. 2D and 3D Modeling Efforts in Fuel Film Cooling of Liquid Rocket Engines (Conference Paper with Briefing Charts)

    Science.gov (United States)

    2017-01-12

    Conference paper with briefing charts (dates covered: 17 November 2016 – 12 January 2017). 2D and 3D Modeling Efforts in Fuel Film Cooling of Liquid Rocket Engines, by Kevin C. Brown, Edward B. Coy, and ... wide. As a consequence, the 3D simulations may better model the experimental setup used, but are perhaps not representative of the long circumferential

  9. Health Promotion Efforts as Predictors of Physical Activity in Schools: An Application of the Diffusion of Innovations Model

    Science.gov (United States)

    Glowacki, Elizabeth M.; Centeio, Erin E.; Van Dongen, Daniel J.; Carson, Russell L.; Castelli, Darla M.

    2016-01-01

    Background: Implementing a comprehensive school physical activity program (CSPAP) effectively addresses public health issues by providing opportunities for physical activity (PA). Grounded in the Diffusion of Innovations model, the purpose of this study was to identify how health promotion efforts facilitate opportunities for PA. Methods: Physical…

  10. Inverse problem for a physiologically structured population model with variable-effort harvesting

    Directory of Open Access Journals (Sweden)

    Andrusyak Ruslan V.

    2017-04-01

    Full Text Available We consider the inverse problem of determining how the physiological structure of a harvested population evolves in time, and of finding the time-dependent effort to be expended in harvesting, so that the weighted integral of the density, which may be, for example, the total number of individuals or the total biomass, has prescribed dynamics. We give conditions for the existence of a unique, global, weak solution to the problem. Our investigation is carried out using the method of characteristics and a generalization of the Banach fixed-point theorem.

  11. Markov Modeling of Component Fault Growth Over A Derived Domain of Feasible Output Control Effort Modifications

    Data.gov (United States)

    National Aeronautics and Space Administration — This paper introduces a novel Markov process formulation of stochastic fault growth modeling, in order to facilitate the development and analysis of...

  12. The European Integrated Tokamak Modelling (ITM) effort: achievements and first physics results

    NARCIS (Netherlands)

    G.L. Falchetto,; Coster, D.; Coelho, R.; Scott, B. D.; Figini, L.; Kalupin, D.; Nardon, E.; Nowak, S.; L.L. Alves,; Artaud, J. F.; Basiuk, V.; João P.S. Bizarro,; C. Boulbe,; Dinklage, A.; Farina, D.; B. Faugeras,; Ferreira, J.; Figueiredo, A.; Huynh, P.; Imbeaux, F.; Ivanova-Stanik, I.; Jonsson, T.; H.-J. Klingshirn,; Konz, C.; Kus, A.; Marushchenko, N. B.; Pereverzev, G.; M. Owsiak,; Poli, E.; Peysson, Y.; R. Reimer,; Signoret, J.; Sauter, O.; Stankiewicz, R.; Strand, P.; Voitsekhovitch, I.; Westerhof, E.; T. Zok,; Zwingmann, W.; ITM-TF contributors,; ASDEX Upgrade team,; JET-EFDA Contributors,

    2014-01-01

    A selection of achievements and first physics results are presented of the European Integrated Tokamak Modelling Task Force (EFDA ITM-TF) simulation framework, which aims to provide a standardized platform and an integrated modelling suite of validated numerical codes for the simulation and

  13. Artificial Neural Networks for Reducing Computational Effort in Active Truncated Model Testing of Mooring Lines

    DEFF Research Database (Denmark)

    Christiansen, Niels Hørbye; Voie, Per Erlend Torbergsen; Høgsberg, Jan Becker

    2015-01-01

    is by active truncated models. In these models only the very top part of the system is represented by a physical model whereas the behavior of the part below the truncation is calculated by numerical models and accounted for in the physical model by active actuators applying relevant forces to the physical...... orders of magnitude faster than conventional numerical methods. The ANN ability to learn and predict the nonlinear relation between a given input and the corresponding output makes the hybrid method tailor-made for the active actuators used in the truncated experiments. All the ANN training can be done...... prior to the experiment and with a properly trained ANN it is no problem to obtain accurate simulations much faster than real time, without any need for large computational capacity. The present study demonstrates how this hybrid method can be applied to the active truncated experiments yielding a system...
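    The idea of replacing the numerical model with a fast surrogate can be sketched in a few lines: a small feed-forward network is trained offline to map the motion measured at the truncation interface to the force the actuators should apply, and is then evaluated in real time during the test. The data, scaling, and architecture below are synthetic placeholders, not those of the study.

```python
# Hedged sketch of the surrogate idea: train a small feed-forward ANN offline
# to map interface motion to the actuator force, replacing the slow numerical
# model of the truncated mooring-line section during the experiment.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
motion = rng.uniform(-1.0, 1.0, size=(5000, 3))     # surge, heave, pitch at interface
# Placeholder "numerical model" output: a nonlinear restoring force.
force = np.tanh(2.0 * motion[:, 0]) + 0.5 * motion[:, 1] ** 3 - 0.2 * motion[:, 2]

ann = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
ann.fit(motion[:4000], force[:4000])                # offline training

# During the experiment the ANN evaluates almost instantly instead of
# running the full numerical model of the truncated part.
print("validation R^2:", ann.score(motion[4000:], force[4000:]))
```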

  14. LMDzT-INCA dust forecast model developments and associated validation efforts

    International Nuclear Information System (INIS)

    Schulz, M; Cozic, A; Szopa, S

    2009-01-01

    The nudged atmosphere global climate model LMDzT-INCA is used to forecast global dust fields. Evaluation is undertaken retrospectively for the forecast results of the year 2006. For this purpose, AERONET/Photons sites in Northern Africa and on the Arabian Peninsula are chosen where aerosol optical depth is dominated by dust. Despite its coarse resolution, the model captures 48% of the day-to-day dust variability near Dakar on the initial day of the forecast. On weekly and monthly scales the model captures 62% and 68% of the variability, respectively. Correlation coefficients between daily AOD values observed and modelled at Dakar decrease from 0.69 for the initial forecast day to 0.59 and 0.41 respectively for two days ahead and five days ahead. If the model is required to issue a warning whenever aerosol optical depth exceeds 0.5, and no warning otherwise, it was wrong in 29% of the cases for day 0, 32% for day 2, and 35% for day 5. A reanalysis run with archived ECMWF winds is only slightly better (r=0.71) but was in error in 25% of the cases. Both the improved simulation of the monthly versus daily variability and the deterioration of the forecast with time can be explained by model failure to simulate the exact timing of a dust event.
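    The two skill measures quoted above are easy to reproduce on any paired series of observed and forecast AOD: the correlation coefficient, and the fraction of days on which a warning decision at the 0.5 AOD threshold would have been wrong. The sketch below uses synthetic data in place of the AERONET/model pairs.

```python
# Small worked example of the two skill measures in the abstract:
# correlation of observed vs forecast daily AOD, and the fraction of days
# on which a warning decision at AOD > 0.5 was wrong (misses + false alarms).
import numpy as np

rng = np.random.default_rng(4)
observed = rng.gamma(shape=2.0, scale=0.25, size=365)      # daily AOD (synthetic)
forecast = observed + rng.normal(0.0, 0.15, size=365)      # imperfect forecast

r = np.corrcoef(observed, forecast)[0, 1]
wrong = np.mean((forecast > 0.5) != (observed > 0.5))

print(f"correlation r = {r:.2f}, wrong warning decisions = {100 * wrong:.0f}%")
```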

  15. Evaluation of Thin Plate Hydrodynamic Stability through a Combined Numerical Modeling and Experimental Effort

    Energy Technology Data Exchange (ETDEWEB)

    Tentner, A. [Argonne National Lab. (ANL), Argonne, IL (United States); Bojanowski, C. [Argonne National Lab. (ANL), Argonne, IL (United States); Feldman, E. [Argonne National Lab. (ANL), Argonne, IL (United States); Wilson, E. [Argonne National Lab. (ANL), Argonne, IL (United States); Solbrekken, G [Univ. of Missouri, Columbia, MO (United States); Jesse, C. [Univ. of Missouri, Columbia, MO (United States); Kennedy, J. [Univ. of Missouri, Columbia, MO (United States); Rivers, J. [Univ. of Missouri, Columbia, MO (United States); Schnieders, G. [Univ. of Missouri, Columbia, MO (United States)

    2017-05-01

    An experimental and computational effort was undertaken in order to evaluate the capability of the fluid-structure interaction (FSI) simulation tools to describe the hydrodynamically induced deflection of a Missouri University Research Reactor (MURR) fuel element plate redesigned for conversion to low-enriched uranium (LEU) fuel. Experiments involving both flat plates and curved plates were conducted in a water flow test loop located at the University of Missouri (MU), at conditions and geometries that can be related to the MURR LEU fuel element. A wider channel gap on one side of the test plate and a narrower one on the other represent the differences that could be encountered in a MURR element due to allowed fabrication variability. The difference in the channel gaps leads to a pressure differential across the plate, which causes plate deflection. The plate deflection induced by the pressure difference was measured at specified locations using a laser measurement technique. High fidelity 3-D simulations of the experiments were performed at MU using the computational fluid dynamics code STAR-CCM+ coupled with the structural mechanics code ABAQUS. Independent simulations of the experiments were performed at Argonne National Laboratory (ANL) using the STAR-CCM+ code and its built-in structural mechanics solver. The simulation results obtained at MU and ANL were compared with the corresponding measured plate deflections.

  16. Integrated modelling requires mass collaboration (Invited)

    Science.gov (United States)

    Moore, R. V.

    2009-12-01

    The need for sustainable solutions to the world’s problems is self evident; the challenge is to anticipate where, in the environment, economy or society, the proposed solution will have negative consequences. If we failed to realise that the switch to biofuels would have the seemingly obvious result of reduced food production, how much harder will it be to predict the likely impact of policies whose impacts may be more subtle? It has been clear for a long time that models and data will be important tools for assessing the impact of events and the measures for their mitigation. They are an effective way of encapsulating knowledge of a process and using it for prediction. However, most models represent a single or small group of processes. The sustainability challenges that face us now require not just the prediction of a single process but the prediction of how many interacting processes will respond in given circumstances. These processes will not be confined to a single discipline but will often straddle many. For example, the question, “What will be the impact on river water quality of the medical plans for managing a ‘flu pandemic and could they cause a further health hazard?” spans medical planning, the absorption of drugs by the body, the spread of disease, the hydraulic and chemical processes in sewers and sewage treatment works and river water quality. This question nicely reflects the present state of the art. We have models of the processes and standards, such as the Open Modelling Interface (the OpenMI), allow them to be linked together and to datasets. We can therefore answer the question but with the important proviso that we thought to ask it. The next and greater challenge is to deal with the open question, “What are the implications of the medical plans for managing a ‘flu pandemic?”. This implies a system that can make connections that may well not have occurred to us and then evaluate their probable impact. The final touch will be to

  17. Modeling Impact and Cost-Effectiveness of Increased Efforts to Attract Voluntary Medical Male Circumcision Clients Ages 20-29 in Zimbabwe.

    Directory of Open Access Journals (Sweden)

    Katharine Kripke

    Full Text Available Zimbabwe aims to increase circumcision coverage to 80% among 13- to 29-year-olds. However, implementation data suggest that high coverage among men ages 20 and older may not be achievable without efforts specifically targeted to these men, incurring additional costs per circumcision. Scale-up scenarios were created based on trends in implementation data in Zimbabwe, and the cost-effectiveness of increasing efforts to recruit clients ages 20-29 was examined. Zimbabwe voluntary medical male circumcision (VMMC) program data were used to project trends in male circumcision coverage by age into the future. The projection informed a base scenario in which, by 2018, the country achieves 80% circumcision coverage among males ages 10-19 and lower levels of coverage among men above age 20. The Zimbabwe DMPPT 2.0 model was used to project costs and impacts, assuming a US$109 VMMC unit cost in the base scenario and a 3% discount rate. Two other scenarios assumed that the program could increase coverage among clients ages 20-29 with a corresponding increase in unit cost for these age groups. When circumcision coverage among men ages 20-29 is increased compared with a base scenario reflecting current implementation trends, fewer VMMCs are required to avert one infection. If more than 50% additional effort (reflected as multiplying the unit cost by >1.5) is required to double the increase in coverage among this age group compared with the base scenario, the cost per HIV infection averted is higher than in the base scenario. Although increased investment in recruiting VMMC clients ages 20-29 may lead to greater overall impact if recruitment efforts are successful, it may also lead to lower cost-effectiveness, depending on the cost of increasing recruitment. Programs should measure the relationship between increased effort and increased ability to attract this age group.
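    The trade-off can be illustrated with back-of-the-envelope arithmetic: if the additional clients ages 20-29 can only be recruited at a multiple of the US$109 base unit cost, the cost per infection averted may rise even as total impact grows. The numbers below are invented placeholders, not DMPPT 2.0 outputs.

```python
# Back-of-the-envelope sketch of the cost-effectiveness trade-off described
# above. All circumcision and infection counts are illustrative placeholders.
BASE_UNIT_COST = 109.0  # US$ per circumcision (base scenario, from the abstract)

def cost_per_infection_averted(base_vmmcs, extra_vmmcs, infections_averted,
                               effort_multiplier=1.0):
    # Extra clients in the harder-to-reach age group cost a multiple of the base.
    total_cost = (base_vmmcs + extra_vmmcs * effort_multiplier) * BASE_UNIT_COST
    return total_cost / infections_averted

base = cost_per_infection_averted(1_000_000, 0, 80_000)
more_effort = cost_per_infection_averted(1_000_000, 200_000, 110_000,
                                         effort_multiplier=1.5)

print(f"base scenario: ${base:,.0f} per HIV infection averted")
print(f"added 20-29 recruitment: ${more_effort:,.0f} per HIV infection averted")
```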

  18. RECONSTRUCTION OF PENSION FUND PERFORMANCE MODEL AS AN EFFORT TO WORTHY PENSION FUND GOVERNANCE

    Directory of Open Access Journals (Sweden)

    Apriyanto Gaguk

    2017-08-01

    Full Text Available This study aims to reconstruct the performance assessment model for pension funds by modifying the Baldrige Assessment method, adjusted to the conditions in Dana Pensiun A (Pension Fund A), in order to realize Good Pension Fund Governance. The study uses a case study design and was conducted at Dana Pensiun A. The informants in the study included the employer, supervisory board, pension fund management, active and passive pension fund participants, as well as financial services authority elements as the regulator. The result of this research is a comprehensive and profound pension fund performance assessment model that pays attention to aspects of growth and fair distribution. The model includes the parameters of leadership, strategic planning, stakeholders focus, measurement, analysis, and knowledge management, workforce focus, standard operational procedure focus, result, just and fair distribution of wealth and power.

  19. MCNP6 and DRiFT modeling efforts for the NEUANCE/DANCE detector array

    Energy Technology Data Exchange (ETDEWEB)

    Pinilla, Maria Isabel [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-30

    This report seeks to study and benchmark code predictions against experimental data; determine parameters to match MCNP-simulated detector response functions to experimental stilbene measurements; add stilbene processing capabilities to DRiFT; and improve NEUANCE detector array modeling and analysis using new MCNP6 and DRiFT features.

  20. Dynamic material flow modeling: an effort to calibrate and validate aluminum stocks and flows in Austria.

    Science.gov (United States)

    Buchner, Hanno; Laner, David; Rechberger, Helmut; Fellner, Johann

    2015-05-05

    A calibrated and validated dynamic material flow model of Austrian aluminum (Al) stocks and flows between 1964 and 2012 was developed. Calibration and extensive plausibility testing was performed to illustrate how the quality of dynamic material flow analysis can be improved on the basis of the consideration of independent bottom-up estimates. According to the model, total Austrian in-use Al stocks reached a level of 360 kg/capita in 2012, with buildings (45%) and transport applications (32%) being the major in-use stocks. Old scrap generation (including export of end-of-life vehicles) amounted to 12.5 kg/capita in 2012, still being on the increase, while Al final demand has remained rather constant at around 25 kg/capita in the past few years. The application of global sensitivity analysis showed that only small parts of the total variance of old scrap generation could be explained by the variation of single parameters, emphasizing the need for comprehensive sensitivity analysis tools accounting for interaction between parameters and time-delay effects in dynamic material flow models. Overall, it was possible to generate a detailed understanding of the evolution of Al stocks and flows in Austria, including plausibility evaluations of the results. Such models constitute a reliable basis for evaluating future recycling potentials, in particular with respect to application-specific qualities of current and future national Al scrap generation and utilization.
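    The backbone of such a dynamic material flow model is an inflow-driven stock with a product lifetime distribution: each year's inflow enters the in-use stock and is discarded over later years according to that distribution. The sketch below uses an invented inflow series and lifetime parameters, not the calibrated Austrian aluminium values.

```python
# Minimal inflow-driven dynamic stock model of the kind used in dynamic MFA:
# cohorts of inflow leave the in-use stock after a normally distributed lifetime.
# Inflow series and lifetime parameters are invented placeholders.
import numpy as np
from scipy.stats import norm

years = np.arange(1964, 2013)
inflow = np.linspace(5.0, 25.0, len(years))        # kg/capita/yr (synthetic)
mean_life, sd_life = 30.0, 10.0                    # product lifetime distribution

outflow = np.zeros_like(inflow)
for i, y in enumerate(years):
    age = years[i:] - y
    # Share of cohort i discarded in each later year (difference of lifetime CDF).
    discard = np.diff(norm.cdf(np.append(age, age[-1] + 1), loc=mean_life, scale=sd_life))
    outflow[i:] += inflow[i] * discard

stock = np.cumsum(inflow - outflow)
print(f"in-use stock in 2012: {stock[-1]:.0f} kg/capita")
```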

  1. Ideals, activities, dissonance, and processing: a conceptual model to guide educators' efforts to stimulate student reflection.

    Science.gov (United States)

    Thompson, Britta M; Teal, Cayla R; Rogers, John C; Paterniti, Debora A; Haidet, Paul

    2010-05-01

    Medical schools are increasingly incorporating opportunities for reflection into their curricula. However, little is known about the cognitive and/or emotional processes that occur when learners participate in activities designed to promote reflection. The purpose of this study was to identify and elucidate those processes. In 2008, the authors analyzed qualitative data from focus groups that were originally conducted to evaluate an educational activity designed to promote reflection. These data afforded the opportunity to explore the processes of reflection in detail. Transcripts (94 pages, single-spaced) from four focus groups were analyzed using a narrative framework. The authors spent approximately 40 hours in group and 240 hours in individual coding activities. The authors developed a conceptual model of five major elements in students' reflective processes: the educational activity, the presence or absence of cognitive or emotional dissonance, and two methods of processing dissonance (preservation or reconciliation). The model also incorporates the relationship between the student's internal ideal of what a doctor is or does and the student's perception of the teacher's ideal of what a doctor is or does. The model further identifies points at which educators may be able to influence the processes of reflection and the development of professional ideals. Students' cognitive and emotional processes have important effects on the success of educational activities intended to stimulate reflection. Although additional research is needed, this model-which incorporates ideals, activities, dissonance, and processing-can guide educators as they plan and implement such activities.

  2. A coupled modelling effort to study the fate of contaminated sediments downstream of the Coles Hill deposit, Virginia, USA

    Directory of Open Access Journals (Sweden)

    C. F. Castro-Bolinaga

    2015-03-01

    Full Text Available This paper presents the preliminary results of a coupled modelling effort to study the fate of tailings (radioactive waste by-product) downstream of the Coles Hill uranium deposit located in Virginia, USA. The implementation of the overall modelling process includes a one-dimensional hydraulic model to qualitatively characterize the sediment transport process under severe flooding conditions downstream of the potential mining site, a two-dimensional ANSYS Fluent model to simulate the release of tailings from a containment cell located partially above the local ground surface into the nearby streams, and a one-dimensional finite-volume sediment transport model to examine the propagation of a tailings sediment pulse in the river network located downstream. The findings of this investigation aim to assist in estimating the potential impacts that tailings would have if they were transported into rivers and reservoirs located downstream of the Coles Hill deposit that serve as municipal drinking water supplies.

  3. Controls over Ocean Mesopelagic Interior Carbon Storage (COMICS: fieldwork, synthesis and modelling efforts

    Directory of Open Access Journals (Sweden)

    Richard John Sanders

    2016-08-01

    Full Text Available The ocean’s biological carbon pump plays a central role in regulating atmospheric CO2 levels. In particular, the depth at which sinking organic carbon is broken down and respired in the mesopelagic zone is critical, with deeper remineralisation resulting in greater carbon storage. Until recently, however, a balanced budget of the supply and consumption of organic carbon in the mesopelagic had not been constructed in any region of the ocean, and the processes controlling organic carbon turnover are still poorly understood. Large-scale data syntheses suggest that a wide range of factors can influence remineralisation depth, including upper-ocean ecological interactions, and interior dissolved oxygen concentration and temperature. However, these analyses do not provide a mechanistic understanding of remineralisation, which increases the challenge of appropriately modelling the mesopelagic carbon dynamics. In light of this, the UK Natural Environment Research Council has funded a programme with this mechanistic understanding as its aim, spanning targeted fieldwork right through to the implementation of a new parameterisation for mesopelagic remineralisation within an IPCC-class global biogeochemical model. The Controls over Ocean Mesopelagic Interior Carbon Storage (COMICS) programme will deliver new insights into the processes of carbon cycling in the mesopelagic zone and how these influence ocean carbon storage. Here we outline the programme’s rationale, its goals, planned fieldwork and modelling activities, with the aim of stimulating international collaboration.

  4. Combined observational and modeling efforts of aerosol-cloud-precipitation interactions over Southeast Asia

    Science.gov (United States)

    Loftus, Adrian; Tsay, Si-Chee; Nguyen, Xuan Anh

    2016-04-01

    Low-level stratocumulus (Sc) clouds cover more of the Earth's surface than any other cloud type rendering them critical for Earth's energy balance, primarily via reflection of solar radiation, as well as their role in the global hydrological cycle. Stratocumuli are particularly sensitive to changes in aerosol loading on both microphysical and macrophysical scales, yet the complex feedbacks involved in aerosol-cloud-precipitation interactions remain poorly understood. Moreover, research on these clouds has largely been confined to marine environments, with far fewer studies over land where major sources of anthropogenic aerosols exist. The aerosol burden over Southeast Asia (SEA) in boreal spring, attributed to biomass burning (BB), exhibits highly consistent spatiotemporal distribution patterns, with major variability due to changes in aerosol loading mediated by processes ranging from large-scale climate factors to diurnal meteorological events. Downwind from source regions, the transported BB aerosols often overlap with low-level Sc cloud decks associated with the development of the region's pre-monsoon system, providing a unique, natural laboratory for further exploring their complex micro- and macro-scale relationships. Compared to other locations worldwide, studies of springtime biomass-burning aerosols and the predominately Sc cloud systems over SEA and their ensuing interactions are underrepresented in scientific literature. Measurements of aerosol and cloud properties, whether ground-based or from satellites, generally lack information on microphysical processes; thus cloud-resolving models are often employed to simulate the underlying physical processes in aerosol-cloud-precipitation interactions. The Goddard Cumulus Ensemble (GCE) cloud model has recently been enhanced with a triple-moment (3M) bulk microphysics scheme as well as the Regional Atmospheric Modeling System (RAMS) version 6 aerosol module. Because the aerosol burden not only affects cloud

  5. Exploring Spatiotemporal Trends in Commercial Fishing Effort of an Abalone Fishing Zone: A GIS-Based Hotspot Model.

    Directory of Open Access Journals (Sweden)

    M Ali Jalali

    Full Text Available Assessing patterns of fisheries activity at a scale related to resource exploitation has received particular attention in recent times. However, acquiring data about the distribution and spatiotemporal allocation of catch and fishing effort in small-scale benthic fisheries remains challenging. Here, we used GIS-based spatio-statistical models to investigate the footprint of commercial diving events on blacklip abalone (Haliotis rubra) stocks along the south-west coast of Victoria, Australia from 2008 to 2011. Using abalone catch data matched with GPS locations, we found that catch per unit of fishing effort (CPUE) was not uniformly spatially and temporally distributed across the study area. Spatial autocorrelation and hotspot analysis revealed significant spatiotemporal clusters of CPUE (with distance thresholds of hundreds of metres) among years, indicating the presence of CPUE hotspots focused on specific reefs. Cumulative hotspot maps indicated that certain reef complexes were consistently targeted across years but with varying intensity, although often only a relatively small proportion of the full reef extent was targeted. Integrating CPUE with remotely-sensed light detection and ranging (LiDAR)-derived bathymetry data using a generalized additive mixed model corroborated that fishing pressure primarily coincided with shallow, rugose and complex components of reef structures. This study demonstrates that a geospatial approach is efficient in detecting patterns and trends in commercial fishing effort and its association with seafloor characteristics.
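    A simplified version of the hotspot calculation can be illustrated on a CPUE grid: compare each cell's neighbourhood mean against the global mean in z-score form, a Getis-Ord-style idea stripped to its essentials. The study itself used dedicated GIS spatio-statistical tools; the grid, neighbourhood radius, and threshold below are synthetic.

```python
# Simplified local hotspot statistic on a synthetic CPUE grid: z-score of the
# neighbourhood mean relative to the global mean (a stripped-down stand-in
# for the GIS hotspot tools used in the study).
import numpy as np

rng = np.random.default_rng(5)
cpue = rng.gamma(2.0, 5.0, size=(50, 50))
cpue[20:25, 30:35] += 40.0                      # plant an artificial hotspot

def local_z(grid, radius=2):
    z = np.zeros_like(grid)
    mu, sigma = grid.mean(), grid.std()
    n_rows, n_cols = grid.shape
    for r in range(n_rows):
        for c in range(n_cols):
            window = grid[max(0, r - radius): r + radius + 1,
                          max(0, c - radius): c + radius + 1]
            z[r, c] = (window.mean() - mu) / (sigma / np.sqrt(window.size))
    return z

hot = local_z(cpue) > 2.58                      # roughly the 99% confidence level
print("hotspot cells:", int(hot.sum()))
```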

  6. Community effort endorsing multiscale modelling, multiscale data science and multiscale computing for systems medicine.

    Science.gov (United States)

    Zanin, Massimiliano; Chorbev, Ivan; Stres, Blaz; Stalidzans, Egils; Vera, Julio; Tieri, Paolo; Castiglione, Filippo; Groen, Derek; Zheng, Huiru; Baumbach, Jan; Schmid, Johannes A; Basilio, José; Klimek, Peter; Debeljak, Nataša; Rozman, Damjana; Schmidt, Harald H H W

    2017-12-05

    Systems medicine holds many promises, but has so far provided only a limited number of proofs of principle. To address this road block, possible barriers and challenges of translating systems medicine into clinical practice need to be identified and addressed. The members of the European Cooperation in Science and Technology (COST) Action CA15120 Open Multiscale Systems Medicine (OpenMultiMed) wish to engage the scientific community of systems medicine and multiscale modelling, data science and computing, to provide their feedback in a structured manner. This will result in follow-up white papers and open access resources to accelerate the clinical translation of systems medicine. © The Author 2017. Published by Oxford University Press.

  7. Evaluating the equation-of-state models of nitrogen in the dissociation regime: an experimental effort

    Science.gov (United States)

    Li, Jiangtao; Chen, Qifeng; Fu, Zhijian; Gu, Yunjun; Zheng, Jun; Li, Chengjun

    2017-06-01

    A number of experiments were designed so that pre-compressed nitrogen (20 MPa) was shock-compressed reverberatively into a regime where molecular dissociation is expected to influence significantly the equation-of-state and transport properties. The equation of state of nitrogen after each compression process was probed by a joint diagnostics of multichannel optical pyrometer (MCOP) and Doppler pin system (DPS). The equation of state data thereby obtained span a pressure-density range of about 0.02-130 GPa and 0.22-5.9 g/cc. Furthermore, based on the uncertainties of the measurements, a Monte Carlo method was employed to evaluate the probability distribution of the thermodynamic state after each compression. According to Monte Carlo results, a number of equation-of-state models or calculations for nitrogen in the dissociation regime were assessed.
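
    The Monte Carlo step described above — propagating measurement uncertainty into a probability distribution over the post-shock thermodynamic state — can be sketched as follows. This minimal example uses the single-shock Rankine-Hugoniot jump conditions with invented velocity uncertainties; the paper's reverberating-compression analysis and actual data are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed initial state and single-shock measurements (illustrative values, not the paper's data).
rho0, p0 = 0.23, 0.02          # initial density (g/cc) and pressure (GPa) of pre-compressed nitrogen
us_mean, us_sigma = 12.0, 0.3  # shock velocity (km/s), hypothetical mean and 1-sigma uncertainty
up_mean, up_sigma = 8.0, 0.2   # particle velocity (km/s), hypothetical

n_draws = 100_000
us = rng.normal(us_mean, us_sigma, n_draws)
up = rng.normal(up_mean, up_sigma, n_draws)

# Rankine-Hugoniot jump conditions for a single shock; g/cc * (km/s)^2 gives GPa directly.
p1 = p0 + rho0 * us * up
rho1 = rho0 * us / (us - up)

print(f"P   = {p1.mean():.1f} +/- {p1.std():.1f} GPa")
print(f"rho = {rho1.mean():.2f} +/- {rho1.std():.2f} g/cc")
```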

  8. From requirements to Java in a snap model-driven requirements engineering in practice

    CERN Document Server

    Smialek, Michal

    2015-01-01

    This book provides a coherent methodology for Model-Driven Requirements Engineering which stresses the systematic treatment of requirements within the realm of modelling and model transformations. The underlying basic assumption is that detailed requirements models are used as first-class artefacts playing a direct role in constructing software. To this end, the book presents the Requirements Specification Language (RSL) that allows precision and formality, which eventually permits automation of the process of turning requirements into a working system by applying model transformations and co

  9. Connecting Requirements to Architecture and Analysis via Model-Based Systems Engineering

    Science.gov (United States)

    Cole, Bjorn F.; Jenkins, J. Steven

    2015-01-01

    In traditional systems engineering practice, architecture, concept development, and requirements development are related but still separate activities. Concepts for operation, key technical approaches, and related proofs of concept are developed. These inform the formulation of an architecture at multiple levels, starting with the overall system composition and functionality and progressing into more detail. As this formulation is done, a parallel activity develops a set of English statements that constrain solutions. These requirements are often called "shall statements" since they are formulated to use "shall." The separation of requirements from design is exacerbated by well-meaning tools like the Dynamic Object-Oriented Requirements System (DOORS), which remain separated from engineering design tools. With the Europa Clipper project, efforts are being made to change the requirements development approach from a separate activity to one intimately embedded in the formulation effort. This paper presents a modeling approach and related tooling to generate English requirement statements from constraints embedded in the architecture definition.

  10. Modeling uncertainty in requirements engineering decision support

    Science.gov (United States)

    Feather, Martin S.; Maynard-Zhang, Pedrito; Kiper, James D.

    2005-01-01

    One inherent characteristic of requirements engineering is a lack of certainty during this early phase of a project. Nevertheless, decisions about requirements must be made in spite of this uncertainty. Here we describe the context in which we are exploring this, and some initial work to support elicitation of uncertain requirements, and to deal with the combination of such information from multiple stakeholders.

  11. A goal-oriented requirements modelling language for enterprise architecture

    NARCIS (Netherlands)

    Quartel, Dick; Engelsman, W.; Jonkers, Henk; van Sinderen, Marten J.

    2009-01-01

    Methods for enterprise architecture, such as TOGAF, acknowledge the importance of requirements engineering in the development of enterprise architectures. Modelling support is needed to specify, document, communicate and reason about goals and requirements. Current modelling techniques for

  12. Specification of advanced safety modeling requirements (Rev. 0).

    Energy Technology Data Exchange (ETDEWEB)

    Fanning, T. H.; Tautges, T. J.

    2008-06-30

    The U.S. Department of Energy's Global Nuclear Energy Partnership has led to renewed interest in liquid-metal-cooled fast reactors for the purpose of closing the nuclear fuel cycle and making more efficient use of future repository capacity. However, the U.S. has not designed or constructed a fast reactor in nearly 30 years. Accurate, high-fidelity, whole-plant dynamics safety simulations will play a crucial role by providing confidence that component and system designs will satisfy established design limits and safety margins under a wide variety of operational, design basis, and beyond design basis transient conditions. Current modeling capabilities for fast reactor safety analyses have resulted from several hundred person-years of code development effort supported by experimental validation. The broad spectrum of mechanistic and phenomenological models that have been developed represents an enormous amount of institutional knowledge that needs to be maintained. Complicating this, the existing code architectures for safety modeling evolved from programming practices of the 1970s. This has led to monolithic applications with interdependent data models which require significant knowledge of the complexities of the entire code in order for each component to be maintained. In order to develop an advanced fast reactor safety modeling capability, the limitations of the existing code architecture must be overcome while preserving the capabilities that already exist. To accomplish this, a set of advanced safety modeling requirements is defined, based on modern programming practices, that focuses on modular development within a flexible coupling framework. An approach for integrating the existing capabilities of the SAS4A/SASSYS-1 fast reactor safety analysis code into the SHARP framework is provided in order to preserve existing capabilities while providing a smooth transition to advanced modeling capabilities. In doing this, the advanced fast reactor safety models

  13. Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition

    Science.gov (United States)

    Ewing, Anthony; Adams, Charles

    2004-01-01

    Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficiently to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.

  14. Applying the effort-reward imbalance model to household and family work: a population-based study of German mothers.

    Science.gov (United States)

    Sperlich, Stefanie; Peter, Richard; Geyer, Siegfried

    2012-01-06

    This paper reports on results of a newly developed questionnaire for the assessment of effort-reward imbalance (ERI) in unpaid household and family work. Using a cross-sectional population-based survey of German mothers (n = 3129), the dimensional structure of the theoretical ERI model was validated by means of Confirmatory Factor Analysis (CFA). Analyses of Variance were computed to examine relationships between ERI and social factors and health outcomes. CFA revealed good psychometric properties indicating that the subscale 'effort' is based on one latent factor and the subscale 'reward' is composed of four dimensions: 'intrinsic value of family and household work', 'societal esteem', 'recognition from the partner', and 'affection from the child(ren)'. About 19.3% of mothers perceived lack of reciprocity and 23.8% showed high rates of overcommitment in terms of inability to withdraw from household and family obligations. Socially disadvantaged mothers were at higher risk of ERI, in particular with respect to the perception of low societal esteem. Gender inequality in the division of household and family work and work-family conflict accounted most for ERI in household and family work. Analogous to ERI in paid work we could demonstrate that ERI affects self-rated health, somatic complaints, mental health and, to some extent, hypertension. The newly developed questionnaire demonstrates satisfactory validity and promising results for extending the ERI model to household and family work.
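
    For readers unfamiliar with ERI scoring, the conventional effort-reward ratio is e / (r · c), where c corrects for the unequal number of effort and reward items; a minimal sketch on hypothetical item scores (not the study's adapted household and family questionnaire) follows.

```python
import numpy as np

def eri_ratio(effort_items, reward_items):
    """Conventional effort-reward imbalance ratio e / (r * c).

    c corrects for the unequal number of effort and reward items;
    a ratio > 1 indicates lack of reciprocity (high effort, low reward).
    """
    effort_items = np.asarray(effort_items, dtype=float)
    reward_items = np.asarray(reward_items, dtype=float)
    c = len(effort_items) / len(reward_items)
    return effort_items.sum() / (reward_items.sum() * c)

# Hypothetical Likert responses (1-4) for one respondent -- illustrative only.
effort = [3, 4, 3, 4]                # e.g. time pressure, interruptions, responsibility, demands
reward = [2, 1, 2, 2, 1, 2, 2, 1]    # e.g. esteem, recognition, intrinsic-value items
print(f"ERI ratio: {eri_ratio(effort, reward):.2f}")   # > 1 -> imbalance
```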

  15. Applying the effort-reward imbalance model to household and family work: a population-based study of German mothers

    Directory of Open Access Journals (Sweden)

    Sperlich Stefanie

    2012-01-01

    Full Text Available Abstract Background This paper reports on results of a newly developed questionnaire for the assessment of effort-reward imbalance (ERI) in unpaid household and family work. Methods Using a cross-sectional population-based survey of German mothers (n = 3129), the dimensional structure of the theoretical ERI model was validated by means of Confirmatory Factor Analysis (CFA). Analyses of Variance were computed to examine relationships between ERI and social factors and health outcomes. Results CFA revealed good psychometric properties indicating that the subscale 'effort' is based on one latent factor and the subscale 'reward' is composed of four dimensions: 'intrinsic value of family and household work', 'societal esteem', 'recognition from the partner', and 'affection from the child(ren)'. About 19.3% of mothers perceived lack of reciprocity and 23.8% showed high rates of overcommitment in terms of inability to withdraw from household and family obligations. Socially disadvantaged mothers were at higher risk of ERI, in particular with respect to the perception of low societal esteem. Gender inequality in the division of household and family work and work-family conflict accounted most for ERI in household and family work. Analogous to ERI in paid work we could demonstrate that ERI affects self-rated health, somatic complaints, mental health and, to some extent, hypertension. Conclusions The newly developed questionnaire demonstrates satisfactory validity and promising results for extending the ERI model to household and family work.

  16. REQUIREMENTS PARTICLE NETWORKS: AN APPROACH TO FORMAL SOFTWARE FUNCTIONAL REQUIREMENTS MODELLING

    OpenAIRE

    Wiwat Vatanawood; Wanchai Rivepiboon

    2001-01-01

    In this paper, an approach to software functional requirements modelling using requirements particle networks is presented. In our approach, a set of requirements particles is defined as an essential tool to construct a visual model of software functional requirements specification during the software analysis phase and the relevant formal specification is systematically generated without the experience of writing formal specification. A number of algorithms are presented to perform these for...

  17. Non-monotonic modelling from initial requirements: a proposal and comparison with monotonic modelling methods

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wupper, H.; Wieringa, Roelf J.

    2008-01-01

    Researchers make a significant effort to develop new modelling languages and tools. However, they spend less effort developing methods for constructing models using these languages and tools. We are developing a method for building an embedded system model for formal verification. Our method

  18. Building traceable Event-B models from requirements

    OpenAIRE

    Alkhammash, Eman; Butler, Michael; Fathabadi, Asieh Salehi; Cîrstea, Corina

    2015-01-01

    Abstract Bridging the gap between informal requirements and formal specifications is a key challenge in systems engineering. Constructing appropriate abstractions in formal models requires skill and managing the complexity of the relationships between requirements and formal models can be difficult. In this paper we present an approach that aims to address the twin challenges of finding appropriate abstractions and managing traceability between requirements and models. Our approach is based o...

  19. Mixing Formal and Informal Model Elements for Tracing Requirements

    DEFF Research Database (Denmark)

    Jastram, Michael; Hallerstede, Stefan; Ladenberger, Lukas

    2011-01-01

    Tracing between informal requirements and formal models is challenging. A method for such tracing should permit dealing efficiently with changes to both the requirements and the model. A particular challenge is posed by the persisting interplay of formal and informal elements. In this paper, we describe an incremental approach to requirements validation and systems modelling. Formal modelling facilitates a high degree of automation: it serves for validation and traceability. The foundation for our approach is requirements that are structured according to the WRSPM reference model. We provide a system for traceability with a state-based formal method that supports refinement. We do not require all specification elements to be modelled formally and support incremental incorporation of new specification elements into the formal model. Refinement is used to deal with larger amounts of requirements...

  20. Spreadsheet Decision Support Model for Training Exercise Material Requirements Planning

    National Research Council Canada - National Science Library

    Tringali, Arthur

    1997-01-01

    ... associated with military training exercises. The model combines the business practice of Material Requirements Planning and the commercial spreadsheet software capabilities of Lotus 1-2-3 to calculate the requirements for food, consumable...

  1. Extending enterprise architecture modelling with business goals and requirements

    NARCIS (Netherlands)

    Engelsman, W.; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten J.

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling

  2. Formal Requirements Modeling for Simulation-Based Verification

    OpenAIRE

    Otter, Martin; Thuy, Nguyen; Bouskela, Daniel; Buffoni, Lena; Elmqvist, Hilding; Fritzson, Peter; Garro, Alfredo; Jardin, Audrey; Olsson, Hans; Payelleville, Maxime; Schamai, Wladimir; Thomas, Eric; Tundis, Andrea

    2015-01-01

    This paper describes a proposal on how to model formal requirements in Modelica for simulation-based verification. The approach is implemented in the open source Modelica_Requirements library. It requires extensions to the Modelica language that have been prototypically implemented in the Dymola and Open-Modelica software. The design of the library is based on the FOrmal Requirement Modeling Language (FORM-L) defined by EDF, and on industrial use cases from EDF and Dassault Aviation. It uses...

  3. Extending enterprise architecture modelling with business goals and requirements

    Science.gov (United States)

    Engelsman, Wilco; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten

    2011-02-01

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling techniques for EA focus on the products, services, processes and applications of an enterprise. In addition, techniques may be provided to describe structured requirements lists and use cases. Little support is available however for modelling the underlying motivation of EAs in terms of stakeholder concerns and the high-level goals that address these concerns. This article describes a language that supports the modelling of this motivation. The definition of the language is based on existing work on high-level goal and requirements modelling and is aligned with an existing standard for enterprise modelling: the ArchiMate language. Furthermore, the article illustrates how EA can benefit from analysis techniques from the requirements engineering domain.

  4. Using cognitive modeling for requirements engineering in anesthesiology

    NARCIS (Netherlands)

    Pott, C; le Feber, J

    2005-01-01

    Cognitive modeling is a complexity reducing method to describe significant cognitive processes under a specified research focus. Here, a cognitive process model for decision making in anesthesiology is presented and applied in requirements engineering. Three decision making situations of

  5. Short-term dispersal of Fukushima-derived radionuclides off Japan: modeling efforts and model-data intercomparison

    Directory of Open Access Journals (Sweden)

    I. I. Rypina

    2013-07-01

    Full Text Available The Great East Japan Earthquake and tsunami that caused a loss of power at the Fukushima nuclear power plants (FNPP) resulted in emission of radioactive isotopes into the atmosphere and the ocean. In June of 2011, an international survey measuring a variety of radionuclide isotopes, including 137Cs, was conducted in surface and subsurface waters off Japan. This paper presents the results of numerical simulations specifically aimed at interpreting these observations and investigating the spread of Fukushima-derived radionuclides off the coast of Japan and into the greater Pacific Ocean. Together, the simulations and observations allow us to study the dominant mechanisms governing this process, and to estimate the total amount of radionuclides in discharged coolant waters and atmospheric airborne radionuclide fallout. The numerical simulations are based on two different ocean circulation models, one inferred from AVISO altimetry and NCEP/NCAR reanalysis wind stress, and the second generated numerically by the NCOM model. Our simulations determine that > 95% of 137Cs remaining in the water within ~600 km of Fukushima, Japan in mid-June 2011 was due to the direct oceanic discharge. The estimated strength of the oceanic source is 16.2 ± 1.6 PBq, based on minimizing the model-data mismatch. We cannot make an accurate estimate for the atmospheric source strength since most of the fallout cesium had left the survey area by mid-June. The model explained several key features of the observed 137Cs distribution. First, the absence of 137Cs at the southernmost stations is attributed to the Kuroshio Current acting as a transport barrier against the southward progression of 137Cs. Second, the largest 137Cs concentrations were associated with a semi-permanent eddy that entrained 137Cs-rich waters, collecting and stirring them around the eddy perimeter. Finally, the intermediate 137Cs concentrations at the westernmost stations are attributed to younger, and
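
    A minimal sketch of the Lagrangian view underlying such simulations — passive tracer particles advected through a prescribed velocity field — is given below; the idealised eddy-plus-mean-flow field and release parameters are invented stand-ins for the altimetry- and NCOM-derived currents used in the study.

```python
import numpy as np

def advect(particles, velocity, dt, n_steps):
    """Advance passive tracer particles with forward-Euler steps.

    particles : (N, 2) array of x, y positions (km)
    velocity  : callable (positions, t) -> (N, 2) velocities (km/day)
    """
    trajectory = [particles.copy()]
    for step in range(n_steps):
        particles = particles + dt * velocity(particles, step * dt)
        trajectory.append(particles.copy())
    return np.array(trajectory)

def velocity(p, t, center=(150.0, 0.0), omega=0.2, length=50.0, mean_u=5.0):
    """Idealised field: uniform eastward flow plus a Gaussian eddy (a stand-in for
    the observed semi-permanent eddy; not the study's circulation fields)."""
    dx, dy = p[:, 0] - center[0], p[:, 1] - center[1]
    envelope = np.exp(-(dx**2 + dy**2) / length**2)
    u = mean_u - omega * dy * envelope
    v = omega * dx * envelope
    return np.column_stack([u, v])

rng = np.random.default_rng(1)
release = rng.normal([0.0, 0.0], 10.0, size=(500, 2))          # near-source release cloud (km)
trajectories = advect(release, velocity, dt=1.0, n_steps=90)   # ~3 months of daily steps
print(trajectories.shape)                                      # (91, 500, 2)
```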

  6. Modeling Domain Variability in Requirements Engineering with Contexts

    Science.gov (United States)

    Lapouchnian, Alexei; Mylopoulos, John

    Various characteristics of the problem domain define the context in which the system is to operate and thus impact heavily on its requirements. However, most requirements specifications do not consider contextual properties and few modeling notations explicitly specify how domain variability affects the requirements. In this paper, we propose an approach for using contexts to model domain variability in goal models. We discuss the modeling of contexts, the specification of their effects on system goals, and the analysis of goal models with contextual variability. The approach is illustrated with a case study.

  7. Associations of Extrinsic and Intrinsic Components of Work Stress with Health: A Systematic Review of Evidence on the Effort-Reward Imbalance Model.

    Science.gov (United States)

    Siegrist, Johannes; Li, Jian

    2016-04-19

    Mainstream psychological stress theory claims that it is important to include information on people's ways of coping with work stress when assessing the impact of stressful psychosocial work environments on health. Yet, some widely used respective theoretical models focus exclusively on extrinsic factors. The model of effort-reward imbalance (ERI) differs from them as it explicitly combines information on extrinsic and intrinsic factors in studying workers' health. As a growing number of studies used the ERI model in the recent past, we conducted a systematic review of available evidence, with a special focus on the distinct contribution of its intrinsic component, the coping pattern "over-commitment", towards explaining health. Moreover, we explore whether the interaction of intrinsic and extrinsic components exceeds the size of effects on health attributable to single components. Results based on 51 reports document an independent explanatory role of "over-commitment" in explaining workers' health in a majority of studies. However, support in favour of the interaction hypothesis is limited and requires further exploration. In conclusion, the findings of this review support the usefulness of a work stress model that combines extrinsic and intrinsic components in terms of scientific explanation and of designing more comprehensive worksite stress prevention programs.

  8. Software Requirements Specification Verifiable Fuel Cycle Simulation (VISION) Model

    International Nuclear Information System (INIS)

    D. E. Shropshire; W. H. West

    2005-01-01

    The purpose of this Software Requirements Specification (SRS) is to define the top-level requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). This simulation model is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies.

  9. Capabilities and requirements for modelling radionuclide transport in the geosphere

    International Nuclear Information System (INIS)

    Paige, R.W.; Piper, D.

    1989-02-01

    This report gives an overview of geosphere flow and transport models suitable for use by the Department of the Environment in the performance assessment of radioactive waste disposal sites. An outline methodology for geosphere modelling is proposed, consisting of a number of different types of model. A brief description of each of the component models is given, indicating the purpose of the model, the processes being modelled and the methodologies adopted. Areas requiring development are noted. (author)

  10. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    Requirements validation is a critical task in any engineering project. The confrontation of stakeholders with static requirements models is not enough, since stakeholders with non-computer science education are not able to discover all the inter-dependencies between the elicited requirements. Even … requirements, where the system to be built must explicitly support the interaction between people within a pervasive cooperative workflow execution. A case study from a real project is used to illustrate the proposed approach.

  11. Parenting and the Development of Effortful Control from Early Childhood to Early Adolescence: A Transactional Developmental Model

    Science.gov (United States)

    Capaldi, Deborah M.; Kerr, David C. R.; Bertrand, Maria; Pears, Katherine C.; Owen, Lee

    2016-01-01

    Poor effortful control is a key temperamental factor underlying behavioral problems. The bidirectional association of child effortful control with both positive parenting and negative discipline was examined from ages approximately 3 to 13–14 years, involving 5 time points, and using data from parents and children in the Oregon Youth Study-Three Generational Study (N = 318 children from 150 families). Based on a dynamic developmental systems approach, it was hypothesized that there would be concurrent associations between parenting and child effortful control and bidirectional effects across time from each aspect of parenting to effortful control and from effortful control to each aspect of parenting. It was also hypothesized that associations would be more robust in early childhood, from ages 3 to 7 years, and would diminish as indicated by significantly weaker effects at the older ages, 11–12 to 13–14 years. Longitudinal feedback or mediated effects were also tested. Findings supported (a) stability in each construct over multiple developmental periods; (b) concurrent associations, which were significantly weaker at the older ages; (c) bidirectional effects, consistent with the interpretation that at younger ages children’s effortful control influenced parenting, whereas at older child ages, parenting influenced effortful control; and (d) a transactional effect, such that maternal parenting in late childhood was a mechanism explaining children’s development of effortful control from midchildhood to early adolescence. PMID:27427809

  12. Improved Traceability of Mission Concept to Requirements Using Model Based Systems Engineering

    Science.gov (United States)

    Reil, Robin

    2014-01-01

    Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve the traditional document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g. time and cost saved, increased product quality). This thesis presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to analyze the completeness and consistency of the requirements to the mission concept. Overall experience and methodology are presented for both the MBSE and original DBSE design efforts of SporeSat.

  13. Improved Traceability of a Small Satellite Mission Concept to Requirements Using Model Based System Engineering

    Science.gov (United States)

    Reil, Robin L.

    2014-01-01

    Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve the "traditional" document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g. time and cost saved, increased product quality). This paper presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to demonstrate the completeness and consistency of the requirements to the mission concept. Anecdotal information and process-duration metrics are presented for both the MBSE and original DBSE design efforts of SporeSat.
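
    The "active dependency relationship" idea can be illustrated outside any modeling tool as a simple trace-link check: every requirement should be allocated to at least one concept element, and every concept element should have at least one requirement traced to it. The names below are hypothetical, not taken from the SporeSat model.

```python
# Hypothetical requirement -> concept-element trace links (illustrative names only).
traces = {
    "REQ-001 pointing accuracy": ["ADCS"],
    "REQ-002 science data volume": ["Payload", "C&DH"],
    "REQ-003 downlink availability": [],          # orphan: no allocation yet
}
concept_elements = {"ADCS", "Payload", "C&DH", "EPS"}

# Completeness: requirements without any allocation; coverage: elements with no trace.
orphan_reqs = [r for r, elems in traces.items() if not elems]
allocated = {e for elems in traces.values() for e in elems}
uncovered_elements = concept_elements - allocated

print("Requirements without allocation:", orphan_reqs)
print("Concept elements with no requirement traced to them:", sorted(uncovered_elements))
```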

  14. Using the DayCent Ecosystem Model to Predict Methane Emissions from Wetland Rice Production in Support for Mitigation Efforts

    Science.gov (United States)

    Ogle, S. M.; Parton, W. J.; Cheng, K.; Pan, G.

    2014-12-01

    Wetland rice production is a major source of greenhouse gas (GHG) emissions to the atmosphere, and rice production is predicted to increase dramatically in the future due to expected growth in human populations. Mitigating GHG emissions from future rice production is possible with best management practices for water management, residue management and organic amendments. Policy initiatives and programs that promote practices to reduce GHG emissions from rice production will likely need robust methods for quantifying emission reductions. Frameworks based on process-based models provide one alternative for estimating emissions reductions. The advantages of this approach are that the models are relatively inexpensive to apply, incorporate a variety of management and environmental drivers influencing emissions, and can be used to predict future emissions for planning purposes. The disadvantages are that the models can be challenging to parameterize and evaluate, and require a relatively large amount of data. The DayCent ecosystem model simulates plant and soil processes, and is an example of a model that could be used to quantify emission reductions for reporting mitigation activities associated with rice production systems. DayCent estimates methane emissions, which are the major source of GHG emissions from wetland rice, but also estimates nitrous oxide emissions and soil organic C stock changes. DayCent has been evaluated using data from China, explaining 83% of the variation in methane emissions from 72 experimental rice fields. In addition, DayCent has been applied regionally in the United States to estimate methane, nitrous oxide emissions, and soil C stock changes, in compliance with the guidelines for reporting GHG emissions to the UN Framework Convention on Climate Change. Given the cost of alternatives, process-based models such as DayCent may offer the best way forward for estimating GHG emissions from rice production, and with quantification of uncertainty
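
    The evaluation statistic quoted above (83% of variation explained) is a coefficient of determination between simulated and observed emissions; a small sketch with synthetic stand-in data shows how such a figure is computed.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic stand-ins for observed and simulated seasonal CH4 emissions (72 fields).
observed = rng.gamma(shape=4.0, scale=60.0, size=72)
simulated = observed * rng.normal(1.0, 0.15, size=72) + rng.normal(0, 20, size=72)

# Fraction of observed variance explained by the model (coefficient of determination).
ss_res = np.sum((observed - simulated) ** 2)
ss_tot = np.sum((observed - observed.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"variance explained: {100 * r2:.0f}%")
```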

  15. Requirements Traceability and Transformation Conformance in Model-Driven Development

    NARCIS (Netherlands)

    Andrade Almeida, João; van Eck, Pascal; Iacob, Maria Eugenia

    2006-01-01

    The variety of design artefacts (models) produced in a model-driven design process results in an intricate relationship between requirements and the various models. This paper proposes a methodological framework that simplifies management of this relationship. This framework is a basis for tracing

  16. Commonsense Psychology and the Functional Requirements of Cognitive Models

    National Research Council Canada - National Science Library

    Gordon, Andrew S

    2005-01-01

    In this paper we argue that previous models of cognitive abilities (e.g. memory, analogy) have been constructed to satisfy functional requirements of implicit commonsense psychological theories held by researchers and nonresearchers alike...

  17. Requirements engineering for cross-sectional information chain models.

    Science.gov (United States)

    Hübner, U; Cruel, E; Gök, M; Garthaus, M; Zimansky, M; Remmers, H; Rienhoff, O

    2012-01-01

    Despite the wealth of literature on requirements engineering, little is known about engineering very generic, innovative and emerging requirements, such as those for cross-sectional information chains. The IKM health project aims at building information chain reference models for the care of patients with chronic wounds, cancer-related pain and back pain. Our question therefore was how to appropriately capture information and process requirements that are both generally applicable and practically useful. To this end, we started with recommendations from clinical guidelines and put them up for discussion in Delphi surveys and expert interviews. Despite the heterogeneity we encountered in all three methods, it was possible to obtain requirements suitable for building reference models. We evaluated three modelling languages and then chose to write the models in UML (class and activity diagrams). On the basis of the current project results, the pros and cons of our approach are discussed.

  18. Spreadsheet Decision Support Model for Training Exercise Material Requirements Planning

    National Research Council Canada - National Science Library

    Tringali, Arthur

    1997-01-01

    This thesis focuses on developing a spreadsheet decision support model that can be used by combat engineer platoon and company commanders in determining the material requirements and estimated costs...

  19. Hybriding CMMI and requirement engineering maturity and capability models

    OpenAIRE

    Buglione, Luigi; Hauck, Jean Carlo R.; Gresse von Wangenheim, Christiane; Mc Caffery, Fergal

    2012-01-01

    Estimation represents one of the most critical processes for any project and it is highly dependent on the quality of requirements elicitation and management. Therefore, the management of requirements should be prioritised in any process improvement program, because the less precise the requirements gathering, analysis and sizing, the greater the error in terms of time and cost estimation. Maturity and Capability Models (MCM) represent a good tool for assessing the status of ...

  20. Generic skills requirements (KSA model) towards future mechanical ...

    African Journals Online (AJOL)

    Generic Skills is a basic requirement that engineers need to master in all areas of Engineering. This study was conducted throughout the peninsular Malaysia involving small, medium and heavy industries using the KSA Model. The objectives of this study are studying the level of requirement of Generic Skills that need to be ...

  1. Irrigation Requirement Estimation Using Vegetation Indices and Inverse Biophysical Modeling

    Science.gov (United States)

    Bounoua, Lahouari; Imhoff, Marc L.; Franks, Shannon

    2010-01-01

    We explore an inverse biophysical modeling process forced by satellite and climatological data to quantify irrigation requirements in semi-arid agricultural areas. We constrain the carbon and water cycles modeled under both equilibrium (balance between vegetation and climate) and non-equilibrium (water added through irrigation) conditions. We postulate that the degree to which irrigated dry lands vary from equilibrium climate conditions is related to the amount of irrigation. The amount of water required over and above precipitation is considered the irrigation requirement. For July, results show that spray irrigation resulted in an additional 1.3 mm of water per occurrence, with a frequency of 24.6 hours. In contrast, drip irrigation required only 0.6 mm every 45.6 hours, or 46% of that simulated by the spray irrigation. The modeled estimates account for 87% of the total reported irrigation water use where soil salinity is not important, and 66% in saline lands.
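
    The core bookkeeping — irrigation requirement as the water needed over and above precipitation to sustain the modeled (non-equilibrium) water use — can be sketched as a simple daily deficit; all numbers below are hypothetical, and the real method inverts a full biophysical model rather than a toy water balance.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical July series: modelled non-equilibrium water use vs. sparse summer rainfall.
days = 31
et_modelled = rng.normal(6.0, 1.0, days)                              # mm/day (assumed)
precipitation = rng.choice([0.0, 0.0, 0.0, 4.0, 10.0], size=days)     # mm/day (assumed)

# Irrigation requirement: the deficit between modelled water use and precipitation.
daily_requirement = np.clip(et_modelled - precipitation, 0.0, None)
print(f"July irrigation requirement: {daily_requirement.sum():.0f} mm")
```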

  2. Efforts to Address the Aging Academic Workforce: Assessing Progress through a Three-Stage Model of Institutional Change

    Science.gov (United States)

    Kaskie, Brian; Walker, Mark; Andersson, Matthew

    2017-01-01

    The aging of the academic workforce is becoming more relevant to policy discussions in higher education. Yet there has been no formal, large-scale analysis of institutional efforts to develop policies and programs for aging employees. We fielded a representative survey of human resource specialists at 187 colleges and universities across the…

  3. Business Process Simulation: Requirements for Business and Resource Models

    OpenAIRE

    Audrius Rima; Olegas Vasilecas

    2015-01-01

    The purpose of Business Process Model and Notation (BPMN) is to provide an easily understandable graphical representation of business processes. Thus BPMN is widely used and applied in various areas, one of them being business process simulation. This paper addresses some BPMN model based business process simulation problems. The paper formulates requirements for business process and resource models, enabling their use for business process simulation.

  4. Business Process Simulation: Requirements for Business and Resource Models

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2015-07-01

    Full Text Available The purpose of Business Process Model and Notation (BPMN) is to provide an easily understandable graphical representation of business processes. Thus BPMN is widely used and applied in various areas, one of them being business process simulation. This paper addresses some BPMN model based business process simulation problems. The paper formulates requirements for business process and resource models, enabling their use for business process simulation.

  5. Building a Narrative Based Requirements Engineering Mediation Model

    Science.gov (United States)

    Ma, Nan; Hall, Tracy; Barker, Trevor

    This paper presents a narrative-based Requirements Engineering (RE) mediation model to help RE practitioners to effectively identify, define, and resolve conflicts of interest, goals, and requirements. Within the SPI community, there is a common belief that social, human, and organizational issues significantly impact on the effectiveness of software process improvement in general and the requirements engineering process in particular. Conflicts among different stakeholders are an important human and social issue that needs more research attention in the SPI and RE community. By drawing on the conflict resolution literature and IS literature, we argue that conflict resolution in RE is a mediated process, in which a requirements engineer can act as a mediator among different stakeholders. To address socio-psychological aspects of conflict in RE and SPI, Winslade and Monk (2000)'s narrative mediation model is introduced, justified, and translated into the context of RE.

  6. A proposal for a coordinated effort for the determination of brainwide neuroanatomical connectivity in model organisms at a mesoscopic scale.

    Directory of Open Access Journals (Sweden)

    Jason W Bohland

    2009-03-01

    Full Text Available In this era of complete genomes, our knowledge of neuroanatomical circuitry remains surprisingly sparse. Such knowledge is critical, however, for both basic and clinical research into brain function. Here we advocate for a concerted effort to fill this gap, through systematic, experimental mapping of neural circuits at a mesoscopic scale of resolution suitable for comprehensive, brainwide coverage, using injections of tracers or viral vectors. We detail the scientific and medical rationale and briefly review existing knowledge and experimental techniques. We define a set of desiderata, including brainwide coverage; validated and extensible experimental techniques suitable for standardization and automation; centralized, open-access data repository; compatibility with existing resources; and tractability with current informatics technology. We discuss a hypothetical but tractable plan for mouse, additional efforts for the macaque, and technique development for human. We estimate that the mouse connectivity project could be completed within five years with a comparatively modest budget.

  7. Effects of fishing effort allocation scenarios on energy efficiency and profitability: an individual-based model applied to Danish fisheries

    DEFF Research Database (Denmark)

    Bastardie, Francois; Nielsen, J. Rasmus; Andersen, Bo Sølgaard

    2010-01-01

    Global concerns about CO2 emissions, national CO2 quotas, and rising fuel prices are incentives for the commercial fishing fleet industry to change their fishing practices and reduce fuel consumption, which constitutes a significant part of fishing costs. Vessel-based fuel consumption, energy … against three alternative effort allocation scenarios for the assumed fishermen's adaptation to these factors: (A) preferring nearby fishing grounds rather than distant grounds with potentially larger catches and higher values, (B) shifting to other fisheries targeting resources located closer … engine specifications, and fish and fuel prices. The outcomes of scenarios A and B indicate a trade-off between fuel savings and energy efficiency improvements when effort is displaced closer to the harbour compared to reductions in total landing amounts and profit. Scenario C indicates that historic

  8. Operation TOMODACHI: A Model for American Disaster Response Efforts and the Collective use of Military Forces Abroad

    Science.gov (United States)

    2012-01-01

    Gen Burton Field (Yokota AB, Japan), March 2011; Knight, Bill, Colonel, 374 AW/CV, et al., "Operation TOMODACHI, MAF Response to Japan's Nuclear...was best described by the 459th Airlift Squadron Commander, Lt Col Eugene "Gene" Capone during an interview after the completion of mapping efforts...; Capone, Eugene, Lt Col, 459 AS/CC, personal interview with Dr. John Treiber, transcribed by Dr. John Treiber

  9. Modelling Security Requirements Through Extending Scrum Agile Development Framework

    OpenAIRE

    Alotaibi, Minahi

    2016-01-01

    Security is today considered as a basic foundation in software development and therefore, the modelling and implementation of security requirements is an essential part of the production of secure software systems. Information technology organisations are moving towards agile development methods in order to satisfy customers' changing requirements in light of accelerated evolution and time restrictions with their competitors in software production. Security engineering is considered difficult...

  10. Modeling Requirements for Simulating the Effects of Extreme Acts of Terrorism: A White Paper

    Energy Technology Data Exchange (ETDEWEB)

    Allen, M.; Hiebert-Dodd, K.; Marozas, D.; Paananen, O.; Pryor, R.J.; Reinert, R.K.

    1998-10-01

    This white paper presents the initial requirements for developing a new computer model for simulating the effects of extreme acts of terrorism in the United States. General characteristics of the model are proposed and the level of effort to prepare a complete written description of the model, prior to coding, is detailed. The model would simulate the decision processes and interactions of complex U. S. systems engaged in responding to and recovering from four types of terrorist incidents. The incident scenarios span the space of extreme acts of terrorism that have the potential to affect not only the impacted area, but also the entire nation. The model would be useful to decision-makers in assessing and analyzing the vulnerability of the nation's complex infrastructures, in prioritizing resources to reduce risk, and in planning strategies for immediate response and for subsequent recovery from terrorist incidents.

  11. The Efforts to Improve Mathematics Learning Achievement Results of High School Students as Required by Competency-Based Curriculum and Lesson Level-Based Curriculum

    Science.gov (United States)

    Sidabutar, Ropinus

    2016-01-01

    The research aimed to investigate the effect of various innovative teaching models on improving students' achievement in various topics in Mathematics. The study conducted experiments using innovative teaching with contextual, media-based and web-based approaches, which were compared with conventional teaching methods. The result showed the innovation in the…

  12. Ethnicity, Effort, Self-Efficacy, Worry, and Statistics Achievement in Malaysia: A Construct Validation of the State-Trait Motivation Model

    Science.gov (United States)

    Awang-Hashim, Rosa; O'Neil, Harold F., Jr.; Hocevar, Dennis

    2002-01-01

    The relations between motivational constructs, effort, self-efficacy and worry, and statistics achievement were investigated in a sample of 360 undergraduates in Malaysia. Both trait (cross-situational) and state (task-specific) measures of each construct were used to test a mediational trait → state → performance (TSP) model. As hypothesized,…

  13. Model-based human reliability analysis: prospects and requirements

    International Nuclear Information System (INIS)

    Mosleh, A.; Chang, Y.H.

    2004-01-01

    Major limitations of the conventional methods for human reliability analysis (HRA), particularly those developed for operator response analysis in probabilistic safety assessments (PSA) of nuclear power plants, are summarized to motivate the need for, and provide a basis for developing requirements for, the next generation of HRA methods. It is argued that a model-based approach that provides explicit cognitive causal links between operator behaviors and directly or indirectly measurable causal factors should be at the core of the advanced methods. An example of such a causal model is briefly reviewed; due to its complexity and input requirements, it can currently be implemented only in a dynamic PSA environment. The computer simulation code developed for this purpose is also described briefly, together with current limitations in the models, data, and the computer implementation

  14. NASA Standard for Models and Simulations: Philosophy and Requirements Overview

    Science.gov (United States)

    Blattnig, Steve R.; Luckring, James M.; Morrison, Joseph H.; Sylvester, Andre J.; Tripathi, Ram K.; Zang, Thomas A.

    2013-01-01

    Following the Columbia Accident Investigation Board report, the NASA Administrator chartered an executive team (known as the Diaz Team) to identify those CAIB report elements with NASA-wide applicability and to develop corrective measures to address each element. One such measure was the development of a standard for the development, documentation, and operation of models and simulations. This report describes the philosophy and requirements overview of the resulting NASA Standard for Models and Simulations.

  15. Critical Business Requirements Model and Metrics for Intranet ROI

    OpenAIRE

    Luqi; Jacoby, Grant A.

    2005-01-01

    Journal of Electronic Commerce Research, Vol. 6, No. 1, pp. 1-30. This research provides the first theoretical model, the Intranet Efficiency and Effectiveness Model (IEEM), to measure intranet overall value contributions based on a corporation’s critical business requirements by applying a balanced baseline of metrics and conversion ratios linked to key business processes of knowledge workers, IT managers and business decision makers -- in effect, closing the gap of understanding...

  16. Applying a Theory-Driven Framework to Guide Quality Improvement Efforts in Nursing Homes: The LOCK Model.

    Science.gov (United States)

    Mills, Whitney L; Pimentel, Camilla B; Palmer, Jennifer A; Snow, A Lynn; Wewiorski, Nancy J; Allen, Rebecca S; Hartmann, Christine W

    2017-06-23

    Implementing quality improvement (QI) programs in nursing homes continues to encounter significant challenges, despite recognized need. QI approaches provide nursing home staff with opportunities to collaborate on developing and testing strategies for improving care delivery. We present a theory-driven and user-friendly adaptable framework and facilitation package to overcome existing challenges and guide QI efforts in nursing homes. The framework is grounded in the foundational concepts of strengths-based learning, observation, relationship-based teams, efficiency, and organizational learning. We adapted these concepts to QI in the nursing home setting, creating the "LOCK" framework. The LOCK framework is currently being disseminated across the Veterans Health Administration. The LOCK framework has five tenets: (a) Look for the bright spots, (b) Observe, (c) Collaborate in huddles, (d) Keep it bite-sized, and (e) facilitation. Each tenet is described. We also present a case study documenting how a fictional nursing home can implement the LOCK framework as part of a QI effort to improve engagement between staff and residents. The case study describes sample observations, processes, and outcomes. We also discuss practical applications for nursing home staff, the adaptability of LOCK for different QI projects, the specific role of facilitation, and lessons learned. The proposed framework complements national efforts to improve quality of care and quality of life for nursing home residents and may be valuable across long-term care settings and QI project types. Published by Oxford University Press on behalf of The Gerontological Society of America 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  17. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  18. Models of protein and amino acid requirements for cattle

    Directory of Open Access Journals (Sweden)

    Luis Orlindo Tedeschi

    2015-03-01

    Full Text Available Protein supply and requirements by ruminants have been studied for more than a century. These studies led to the accumulation of lots of scientific information about digestion and metabolism of protein by ruminants as well as the characterization of the dietary protein in order to maximize animal performance. During the 1980s and 1990s, when computers became more accessible and powerful, scientists began to conceptualize and develop mathematical nutrition models, and to program them into computers to assist with ration balancing and formulation for domesticated ruminants, specifically dairy and beef cattle. The most commonly known nutrition models developed during this period were the National Research Council (NRC) in the United States, Agricultural Research Council (ARC) in the United Kingdom, Institut National de la Recherche Agronomique (INRA) in France, and the Commonwealth Scientific and Industrial Research Organization (CSIRO) in Australia. Others were derivative works from these models with different degrees of modifications in the supply or requirement calculations, and the modeling nature (e.g., static or dynamic, mechanistic, or deterministic). Circa 1990s, most models adopted the metabolizable protein (MP) system over the crude protein (CP) and digestible CP systems to estimate supply of MP and the factorial system to calculate MP required by the animal. The MP system included two portions of protein (i.e., the rumen-undegraded dietary CP - RUP - and the contributions of microbial CP - MCP) as the main sources of MP for the animal. Some models would explicitly account for the impact of dry matter intake (DMI) on the MP required for maintenance (MPm; e.g., Cornell Net Carbohydrate and Protein System - CNCPS, the Dutch system - DVE/OEB), while others would simply account for scurf, urinary, metabolic fecal, and endogenous contributions independently of DMI. All models included milk yield and its components in estimating MP required for lactation
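
    A toy version of the factorial approach mentioned above is sketched below: total MP requirement as the sum of a maintenance term and a lactation term. The coefficients are placeholders for illustration; each published system (NRC, ARC, INRA, CSIRO, CNCPS, DVE/OEB) defines its own maintenance components and efficiencies.

```python
def mp_requirement(body_weight_kg, milk_kg_per_day, milk_true_protein_pct=3.2,
                   maint_coeff=3.8, efficiency_lactation=0.67):
    """Toy factorial metabolizable-protein requirement (g MP/day).

    All coefficients are illustrative placeholders, not those of any specific system.
    """
    mp_maintenance = maint_coeff * body_weight_kg ** 0.75          # scurf, urinary, metabolic fecal, endogenous
    milk_protein_g = milk_kg_per_day * milk_true_protein_pct * 10  # g true protein secreted per day
    mp_lactation = milk_protein_g / efficiency_lactation           # MP needed to support that secretion
    return mp_maintenance + mp_lactation

print(f"{mp_requirement(650, 35):.0f} g MP/day")   # hypothetical lactating cow
```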

  19. A Stage-Structured Prey-Predator Fishery Model In The Presence Of Toxicity With Taxation As A Control Parameter of Harvesting Effort

    Directory of Open Access Journals (Sweden)

    Sumit Kaur Bhatia

    2017-08-01

    Full Text Available In this paper we have considered a stage-structured fishery model in the presence of toxicity, where the stock is diminishing due to the current excessive use of fishing effort, resulting in devastating consequences. The purpose of this study is to propose a bio-economic mathematical model by introducing taxes on the profit per unit biomass of the harvested fish of each species, with the intention of controlling fishing effort in the presence of toxicity. We obtained both boundary and interior equilibrium points along with the conditions ensuring their validity. Local stability for the interior equilibrium point has been found by the trace-determinant criterion and global stability has been analyzed through a suitable Lyapunov function. We have also obtained the optimal harvesting policy with the help of Pontryagin's maximum principle. Lastly, numerical simulations have been performed with the help of MATLAB, and thus the results of the formulated model have been established.
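
    The paper's exact equations are not reproduced here, but the general shape of a stage-structured prey-predator model with toxicity and dynamic, after-tax harvesting effort can be sketched and integrated numerically; all parameters and functional forms below are illustrative assumptions, not the cited model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Generic stage-structured prey (juvenile x1, adult x2) and predator (y) with
# dynamic harvesting effort E; values and forms are illustrative only.
r, m, d1, d2 = 2.0, 0.8, 0.1, 0.2           # birth, maturation, stage death rates
a, b, d3 = 0.5, 0.3, 0.4                    # predation, conversion, predator death
q, p, tau, c, k = 0.6, 5.0, 1.5, 1.0, 0.5   # catchability, price, tax, cost, effort stiffness
gamma = 0.05                                # toxicity-induced mortality on the adult stage

def rhs(t, z):
    x1, x2, y, E = z
    dx1 = r * x2 - m * x1 - d1 * x1
    dx2 = m * x1 - d2 * x2 - a * x2 * y - gamma * x2**2 - q * E * x2
    dy = b * a * x2 * y - d3 * y
    dE = k * E * ((p - tau) * q * x2 - c)   # effort follows after-tax profit per unit effort
    return [dx1, dx2, dy, dE]

sol = solve_ivp(rhs, (0, 200), [1.0, 1.0, 0.5, 0.2], rtol=1e-8)
x1, x2, y, E = sol.y[:, -1]
print(f"long-run state: juveniles={x1:.2f}, adults={x2:.2f}, predators={y:.2f}, effort={E:.2f}")
```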

  20. Atmospheric disturbance modelling requirements for flying qualities applications

    Science.gov (United States)

    Moorhouse, D. J.

    1978-01-01

    Flying qualities are defined as those airplane characteristics which govern the ease or precision with which the pilot can accomplish the mission. Some atmospheric disturbance modelling requirements for aircraft flying qualities applications are reviewed. It is concluded that some simplifications are justified in identifying the primary influence on aircraft response and pilot control. It is recommended that a universal environmental model be developed, which could form the reference for different applications. This model should include the latest information on winds, turbulence, gusts, visibility, icing and precipitation. A chosen model would be kept by a national agency and updated regularly by feedback from users. A user manual is believed to be an essential part of such a model.

  1. Information Models, Data Requirements, and Agile Data Curation

    Science.gov (United States)

    Hughes, John S.; Crichton, Dan; Ritschel, Bernd; Hardman, Sean; Joyner, Ron

    2015-04-01

    The Planetary Data System's next generation system, PDS4, is an example of the successful use of an ontology-based Information Model (IM) to drive the development and operations of a data system. In traditional systems engineering, requirements or statements about what is necessary for the system are collected and analyzed for input into the design stage of systems development. With the advent of big data, the requirements associated with data have begun to dominate, and an ontology-based information model can be used to provide a formalized and rigorous set of data requirements. These requirements address not only the usual issues of data quantity, quality, and disposition but also data representation, integrity, provenance, context, and semantics. In addition, the use of these data requirements during systems development has many characteristics of Agile Curation as proposed by Young et al. [Taking Another Look at the Data Management Life Cycle: Deconstruction, Agile, and Community, AGU 2014], namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. For example, customers can be satisfied through early and continuous delivery of system software and services that are configured directly from the information model. This presentation will describe the PDS4 architecture and its three principal parts: the ontology-based Information Model (IM), the federated registries and repositories, and the REST-based service layer for search, retrieval, and distribution. The development of the IM will be highlighted with special emphasis on knowledge acquisition, the impact of the IM on development and operations, and the use of shared ontologies at multiple governance levels to promote system interoperability and data correlation.
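
    A minimal illustration of "requirements as an information model": attribute rules are held as data and validation is derived from them rather than hand-coded. The class and attribute names below are invented for the sketch, not PDS4's actual schema machinery.

```python
# Attribute definitions treated as data; validation logic is configured from the model.
information_model = {
    "product.observational": {
        "target_name": {"type": str,   "required": True},
        "start_date":  {"type": str,   "required": True},
        "exposure_s":  {"type": float, "required": False, "min": 0.0},
    }
}

def validate(record, class_name, im=information_model):
    """Check a record against the attribute rules defined in the information model."""
    errors = []
    for attr, rule in im[class_name].items():
        if attr not in record:
            if rule["required"]:
                errors.append(f"missing required attribute: {attr}")
            continue
        value = record[attr]
        if not isinstance(value, rule["type"]):
            errors.append(f"{attr}: expected {rule['type'].__name__}")
        elif "min" in rule and value < rule["min"]:
            errors.append(f"{attr}: below minimum {rule['min']}")
    return errors

print(validate({"target_name": "Mars", "exposure_s": -1.0}, "product.observational"))
```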

  2. A community effort towards a knowledge-base and mathematical model of the human pathogen Salmonella Typhimurium LT2

    Energy Technology Data Exchange (ETDEWEB)

    Thiele, Ines; Hyduke, Daniel R.; Steeb, Benjamin; Fankam, Guy; Allen, Douglas K.; Bazzani, Susanna; Charusanti, Pep; Chen, Feng-Chi; Fleming, Ronan MT; Hsiung, Chao A.; De Keersmaecker, Sigrid CJ; Liao, Yu-Chieh; Marchal, Kathleen; Mo, Monica L.; Özdemir, Emre; Raghunathan, Anu; Reed, Jennifer L.; Shin, Sook-Il; Sigurbjörnsdóttir, Sara; Steinmann, Jonas; Sudarsan, Suresh; Swainston, Neil; Thijs, Inge M.; Zengler, Karsten; Palsson, Bernhard O.; Adkins, Joshua N.; Bumann, Dirk

    2011-01-01

    Metabolic reconstructions (MRs) are common denominators in systems biology and represent biochemical, genetic, and genomic (BiGG) knowledge-bases for target organisms by capturing currently available information in a consistent, structured manner. Salmonella enterica subspecies I serovar Typhimurium is a human pathogen that causes various diseases, and its increasing antibiotic resistance poses a public health problem. Here, we describe a community-driven effort, in which more than 20 experts in S. Typhimurium biology and systems biology collaborated to reconcile and expand the S. Typhimurium BiGG knowledge-base. The consensus MR was obtained starting from two independently developed MRs for S. Typhimurium. Key results of this reconstruction jamboree include i) development and implementation of a community-based workflow for MR annotation and reconciliation; ii) incorporation of thermodynamic information; and iii) use of the consensus MR to identify potential multi-target drug therapy approaches. Finally, with the growing number of parallel MRs, a structured, community-driven approach will be necessary to maximize quality while increasing adoption of MRs in experimental design and interpretation.
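
    A toy flux-balance calculation on a made-up three-reaction network, showing how a metabolic reconstruction (a stoichiometric matrix plus flux bounds) is used computationally; the network and numbers are invented, and a real analysis would load the consensus S. Typhimurium reconstruction with a dedicated tool rather than build the matrix by hand.

```python
import numpy as np
from scipy.optimize import linprog

# Metabolites: A, B.  Reactions: R1 (uptake -> A), R2 (A -> B), R3 (B -> biomass).
S = np.array([[1, -1,  0],    # mass balance for A
              [0,  1, -1]])   # mass balance for B
bounds = [(0, 10), (0, 8), (0, None)]   # per-reaction flux bounds (invented)
c = np.array([0.0, 0.0, -1.0])          # maximise biomass flux v3  <=>  minimise -v3

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal fluxes:", res.x)   # expected [8, 8, 8]: growth limited by the bound on R2
```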

  3. [Psychosocial stress and disease risks in occupational life. Results of international studies on the demand-control and the effort-reward imbalance models].

    Science.gov (United States)

    Siegrist, J; Dragano, N

    2008-03-01

    Given the far-reaching changes of modern working life, psychosocial stress at work has received increased attention. Its influence on stress-related disease risks is analysed with the help of standardised measurements based on theoretical models. Two such models have gained special prominence in recent years, the demand-control model and the effort-reward imbalance model. The former model places its emphasis on a distinct combination of job characteristics, whereas the latter model's focus is on the imbalance between efforts spent and rewards received in turn. The predictive power of these models with respect to coronary or cardiovascular disease and depression was tested in a number of prospective epidemiological investigations. In summary, twofold elevated disease risks are observed. Effects on cardiovascular disease are particularly pronounced among men, whereas no gender differences are observed for depression. Additional evidence derived from experimental and ambulatory monitoring studies supplements this body of findings. Current scientific evidence justifies an increased awareness and assessment of these newly discovered occupational risks, in particular by occupational health professionals. Moreover, structural and interpersonal measures of stress prevention and health promotion at work are warranted, with special emphasis on gender differences.

  4. Breckinridge Project, initial effort

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    1982-09-01

    Report III, Volume 1 contains those specifications numbered A through J, as follows: General Specifications (A); Specifications for Pressure Vessels (C); Specifications for Tanks (D); Specifications for Exchangers (E); Specifications for Fired Heaters (F); Specifications for Pumps and Drivers (G); and Specifications for Instrumentation (J). The standard specifications of Bechtel Petroleum Incorporated have been amended as necessary to reflect the specific requirements of the Breckinridge Project, and the more stringent specifications of Ashland Synthetic Fuels, Inc. These standard specifications are available to the Initial Effort (Phase Zero) work performed by all contractors and subcontractors. Report III, Volume 1 also contains the unique specifications prepared for Plants 8, 15, and 27. These specifications will be substantially reviewed during Phase I of the project, and modified as necessary for use during the engineering, procurement, and construction of this project.

  5. Benthic macroinvertebrate field sampling effort required to ...

    Science.gov (United States)

    This multi-year pilot study evaluated a proposed field method for its effectiveness in the collection of a benthic macroinvertebrate sample adequate for use in the condition assessment of streams and rivers in the Neuquén Province, Argentina. A total of 13 sites, distributed across three rivers, were sampled. At each site, benthic macroinvertebrates were collected at 11 transects. Each sample was processed independently in the field and laboratory. Based on a literature review and resource considerations, the collection of 300 organisms (minimum) at each site was determined to be necessary to support a robust condition assessment, and therefore was selected as the criterion for judging the adequacy of the method. This targeted number of organisms was collected at all sites, at a minimum, when collections from all 11 transects were combined. Subsequent bootstrapping analysis of the data was used to estimate whether collecting at fewer transects would reach the minimum target number of organisms for all sites. In a subset of sites, the total number of organisms frequently fell below the target when fewer than 11 transect collections were combined. Site conditions where <300 organisms might be collected are discussed. These preliminary results suggest that the proposed field method results in a sample that is adequate for robust condition assessment of the rivers and streams of interest. When data become available from a broader range of sites, the adequacy of the field
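
    A hedged sketch of the bootstrapping idea described above: resample transect-level counts to estimate how often a reduced number of transects would still yield the 300-organism target. The counts are invented for illustration and do not come from the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)
transect_counts = np.array([42, 18, 35, 27, 51, 22, 30, 12, 44, 25, 38])  # 11 transects (invented)
target = 300

def prob_reaching_target(n_transects, n_boot=10_000):
    """Bootstrap the total count obtained from n_transects resampled transects."""
    totals = rng.choice(transect_counts, size=(n_boot, n_transects), replace=True).sum(axis=1)
    return (totals >= target).mean()

for n in range(5, 12):
    print(f"{n:2d} transects -> P(total >= {target}) = {prob_reaching_target(n):.2f}")
```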

  6. Effort rights-based management

    DEFF Research Database (Denmark)

    Squires, Dale; Maunder, Mark; Allen, Robin

    2017-01-01

    Effort rights-based fisheries management (RBM) is less widely used than catch rights, whether for groups or individuals. Because RBM on catch or effort necessarily requires a total allowable catch (TAC) or total allowable effort (TAE), RBM is discussed in conjunction with issues in assessing fish populations and providing TACs or TAEs. Both approaches have advantages and disadvantages, and there are trade-offs between the two approaches. In a narrow economic sense, catch rights are superior because of the type of incentives created, but once the costs of research to improve stock assessments ... Hybrid systems are employed to manage marine fisheries to capture the advantages of both approaches. In hybrid systems, catch or effort RBM dominates and controls on the other supplement it. RBM using either catch or effort by itself addresses only the target species stock externality and not the remaining externalities ...

  7. Modeling the effects of promotional efforts on aggregate pharmaceutical demand : What we know and challenges for the future

    NARCIS (Netherlands)

    Wieringa, J.E.; Osinga, E.C.; Conde, E.R.; Leeflang, P.S.H.; Stern, P.; Ding, M.; Eliashberg, J.; Stremersch, S.

    2014-01-01

    Pharmaceutical marketing is becoming an important area of research in its own right, as evidenced by the steady increase in relevant papers published in the major marketing journals in recent years. These papers utilize different modeling techniques and types of data. In this chapter we focus on

  8. Formal Requirements Modeling for Reactive Systems with Coloured Petri Nets

    DEFF Research Database (Denmark)

    Tjell, Simon

    An approach to addressing the problem of validating formal requirements models through interactive graphical animations is presented. Executable Use Cases (EUCs) provide a framework for integrating three tiers of descriptions of specifications and environment assumptions: the lower tier is an informal description ... The formalization allows the modeler to distinguish the modeling artifacts describing the environment from those describing the specifications for a reactive system. This is important because it represents the identification of what is being designed (the reactive system), and what is given and being made assumptions about (the environment). The representation of the environment is further partitioned to distinguish human actors from non-human actors. The formalization allows for clear identification of interfaces between interacting domains, where the interaction takes place through an abstraction of possibly parameterized states ...

  9. Betting on change: Tenet deal with Vanguard shows it's primed to try ACO effort, new payment model.

    Science.gov (United States)

    Kutscher, Beth

    2013-07-01

    Tenet Healthcare Corp.'s acquisition of Vanguard Health Systems is a sign the investor-owned chain is willing to take a chance on alternative payment models such as accountable care organizations. There's no certainty that ACOs will deliver the improvements on quality or cost savings, but Vanguard Vice Chairman Keith Pitts, left, says his system's Pioneer ACO in Detroit has already achieved some cost savings.

  10. Experimental development based on mapping rule between requirements analysis model and web framework specific design model.

    Science.gov (United States)

    Okuda, Hirotaka; Ogata, Shinpei; Matsuura, Saeko

    2013-12-01

    Model Driven Development is a promising approach to develop high quality software systems. We have proposed a method of model-driven requirements analysis using the Unified Modeling Language (UML). The main feature of our method is to automatically generate a Web user interface prototype from the UML requirements analysis model so that we can confirm the validity of input/output data for each page and of page transitions on the system by directly operating the prototype. We propose a mapping rule in which design information independent of each web application framework implementation is defined based on the requirements analysis model, so as to improve the traceability from the valid requirements analysis model to the final product. This paper discusses the result of applying our method to the development of a Group Work Support System that is currently running in our department.

  11. Baseline requirements of the proposed action for the Transportation Management Division routing models

    International Nuclear Information System (INIS)

    Johnson, P.E.; Joy, D.S.

    1995-02-01

    The potential impacts associated with the transportation of hazardous materials are important to shippers, carriers, and the general public. This is particularly true for shipments of radioactive material. The shippers are primarily concerned with safety, security, efficiency, and equipment requirements. The carriers are concerned with the potential impact that radioactive shipments may have on their operations--particularly if such materials are involved in an accident. The general public has also expressed concerns regarding the safety of transporting radioactive and other hazardous materials through their communities. Because transportation routes are a central concern in hazardous material transport, the prediction of likely routes is the first step toward resolution of these issues. In response to these routing needs, several models have been developed over the past fifteen years at Oak Ridge National Laboratory (ORNL). The HIGHWAY routing model is used to predict routes for truck transportation, the INTERLINE routing model is used to predict both rail and barge routes, and the AIRPORT locator model is used to determine airports with specified criteria near a specific location. As part of the ongoing improvement of the US Department of Energy's (DOE) Environmental Management Transportation Management Division's (EM-261) computer systems and development efforts, a Baseline Requirements Assessment Session on the HIGHWAY, INTERLINE, and AIRPORT models was held at ORNL on April 27, 1994. The purpose of this meeting was to discuss the existing capabilities of the models and data bases and to review enhancements of the models and data bases to expand their usefulness. The results of the Baseline Requirements Assessment Session will be discussed in this report. The discussions pertaining to the different models are contained in separate sections.
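
    An illustrative route prediction on a toy network using Dijkstra's shortest-path algorithm; the ORNL HIGHWAY and INTERLINE models use far richer networks and impedance functions, so this is only a sketch of the underlying idea, and the node names and weights are invented.

```python
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([
    ("Origin", "A", 120), ("Origin", "B", 90),
    ("A", "C", 60), ("B", "C", 110), ("C", "Destination", 80),
    ("B", "Destination", 210),
])  # weights could encode distance, travel time, or population exposure

route = nx.shortest_path(G, "Origin", "Destination", weight="weight")
length = nx.shortest_path_length(G, "Origin", "Destination", weight="weight")
print(route, length)   # ['Origin', 'A', 'C', 'Destination'] with total impedance 260
```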

  12. CFD Modelling and Experimental Testing of Thermal Calcination of Kaolinite Rich Clay Particles - An Effort towards Green Concrete

    DEFF Research Database (Denmark)

    Gebremariam, Abraham Teklay

    The cement industry is one of the major industrial emitters of greenhouse gases, generating 5-7% of the total anthropogenic CO2 emissions. Consequently, use of supplementary cementitious materials (SCM) to replace part of the CO2-intensive cement clinker is an attractive way to mitigate CO2 emissions ... processes in a calciner and develop a useful tool that can aid in the design of a smart clay calcination technology, which constitutes the major objective of this study. In this thesis, a numerical approach is mainly used to investigate the flash calcination of clay particles. A transient one-dimensional particle ... are crucial not only to maximize the yield of the desired product but also to minimize the energy consumption during operation. Thus, the experimentally validated calcination model and simulation results can aid in an improved understanding of the clay calcination process and also new conceptual design ...

  13. EFFORTS TO IMPROVE MATH LEARNING RESULT OF FOURTH GRADE STUDENTS THROUGH CONTEXTUAL MODEL TEACHING AND LEARNING WITH CUISENAIRE RODS MEDIA

    Directory of Open Access Journals (Sweden)

    Intan Kurnia Sari

    2017-05-01

    Full Text Available The purpose of this research is to improve students' learning outcomes in mathematics through the Contextual Teaching and Learning (CTL) model with Cuisenaire Rods media in grade IV of Dukuh 03 Salatiga Elementary School, semester II, year 2015/2016. This research was conducted to help teachers who still used conventional methods and had not yet maximized learning media in the classroom. This research is a classroom action research that consists of two cycles. Each cycle consists of four phases: planning, implementation, observation, and reflection. The subjects of this study were 26 students. Data collection techniques used in this study were observation, testing, and documentation. The instruments used were test items and the student activity and teacher observation sheets. Data analysis was performed by using a comparative descriptive analysis, comparing the results of the pre-cycle, the first cycle, and the second cycle. The indicator of success in this study was that 75% of students reach a score of ≥ 64. The research showed an increase in the class average from 61.77 in the pre-cycle to 78 in the first cycle and up to 85 in the second cycle. The number of students who passed increased from 11 students (42.31%) in the pre-cycle to 20 students (76.92%) in the first cycle and up to 24 students (92.31%) in the second cycle, so it can be concluded that the application of the Contextual Teaching and Learning (CTL) model using Cuisenaire Rods media can improve students' learning outcomes in mathematics in the area of adding and subtracting fractions with the same denominator and with different denominators.

  14. A Utility-Based Model for Determining Functional Requirement Target Values (Model Penentuan Nilai Target Functional Requirement Berbasis Utilitas)

    Directory of Open Access Journals (Sweden)

    Cucuk Nur Rosyidi

    2012-01-01

    Full Text Available In a product design and development process, a designer faces the problem of deciding functional requirement (FR) target values. That decision is made under risk since it is conducted in the early design phase using incomplete information. A utility function can be used to reflect the decision maker's attitude towards risk in making such a decision. In this research, we develop a utility-based model to determine FR target values using a quadratic utility function and information from Quality Function Deployment (QFD). A pencil design is used as a numerical example, using a quadratic utility function for each FR. The model can be applied to balance customer and designer interests in determining FR target values.
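
    A minimal sketch of choosing an FR target value by maximising a weighted combination of quadratic utilities; the utility parameters and the customer/designer weights below are hypothetical and are not taken from the paper's pencil example.

```python
import numpy as np

def quadratic_utility(x, a, b, c):
    """U(x) = a + b*x - c*x**2, concave for c > 0 (diminishing returns)."""
    return a + b * x - c * x**2

targets = np.linspace(0.0, 10.0, 201)                      # candidate FR target values
u_customer = quadratic_utility(targets, 0.0, 1.0, 0.06)    # customer's view (invented)
u_designer = quadratic_utility(targets, 0.0, 0.6, 0.02)    # designer/cost view (invented)
w = 0.6                                                    # weight placed on the customer
combined = w * u_customer + (1 - w) * u_designer

best = targets[np.argmax(combined)]
print(f"FR target value balancing both interests: {best:.2f}")
```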

  15. Bayesian Analysis for Risk Assessment of Selected Medical Events in Support of the Integrated Medical Model Effort

    Science.gov (United States)

    Gilkey, Kelly M.; Myers, Jerry G.; McRae, Michael P.; Griffin, Elise A.; Kallrui, Aditya S.

    2012-01-01

    The Exploration Medical Capability project is creating a catalog of risk assessments using the Integrated Medical Model (IMM). The IMM is a software-based system intended to assist mission planners in preparing for spaceflight missions by helping them to make informed decisions about medical preparations and supplies needed for combating and treating various medical events using Probabilistic Risk Assessment. The objective is to use statistical analyses to inform the IMM decision tool with estimated probabilities of medical events occurring during an exploration mission. Because data regarding astronaut health are limited, Bayesian statistical analysis is used. Bayesian inference combines prior knowledge, such as data from the general U.S. population, the U.S. Submarine Force, or the analog astronaut population located at the NASA Johnson Space Center, with observed data for the medical condition of interest. The posterior results reflect the best evidence for specific medical events occurring in flight. Bayes theorem provides a formal mechanism for combining available observed data with data from similar studies to support the quantification process. The IMM team performed Bayesian updates on the following medical events: angina, appendicitis, atrial fibrillation, atrial flutter, dental abscess, dental caries, dental periodontal disease, gallstone disease, herpes zoster, renal stones, seizure, and stroke.
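
    A sketch of the Bayesian updating idea described above: combine prior evidence (for example, from an analog population) with observed data using a conjugate Beta-Binomial model. All counts here are invented, and the IMM team's actual priors and likelihood structures are more involved.

```python
from scipy import stats

# Prior: 4 events observed in 2000 person-years from an analog population -> Beta(4, 1996)
a_prior, b_prior = 4, 1996
# Newly observed data: 1 event in 300 person-years (treated as Bernoulli trials for simplicity)
events, exposure = 1, 300

a_post, b_post = a_prior + events, b_prior + (exposure - events)
posterior = stats.beta(a_post, b_post)
print(f"posterior mean event probability: {posterior.mean():.4f}")
print(f"95% credible interval: {posterior.interval(0.95)}")
```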

  16. Requirements traceability in model-driven development: Applying model and transformation conformance

    NARCIS (Netherlands)

    Andrade Almeida, João; Iacob, Maria Eugenia; van Eck, Pascal

    The variety of design artifacts (models) produced in a model-driven design process results in an intricate relationship between requirements and the various models. This paper proposes a methodological framework that simplifies management of this relationship, which helps in assessing the quality of

  17. Intersection of Effort and Risk: Ethological and Neurobiological Perspectives

    Directory of Open Access Journals (Sweden)

    Mike A Miller

    2013-11-01

    Full Text Available The physical effort required to seek out and extract a resource is an important consideration for a foraging animal. A second consideration is the variability or risk associated with resource delivery. An intriguing observation from ethological studies is that animals shift their preference from stable to variable food sources under conditions of increased physical effort or falling energetic reserves. Although theoretical models for this effect exist, no exploration into its biological basis has been pursued. Recent advances in understanding the neural basis of effort- and risk-guided decision making suggest that opportunities exist for determining how effort influences risk preference. In this review, we describe the intersection between the neural systems involved in effort- and risk-guided decision making and outline two mechanisms by which effort-induced changes in dopamine release may increase the preference for variable rewards.

  18. Glass Property Models and Constraints for Estimating the Glass to be Produced at Hanford by Implementing Current Advanced Glass Formulation Efforts

    Energy Technology Data Exchange (ETDEWEB)

    Vienna, John D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kim, Dong-Sang [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Skorski, Daniel C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Matyas, Josef [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-07-01

    Recent glass formulation and melter testing data have suggested that significant increases in waste loading in HLW and LAW glasses are possible over current system planning estimates. The data (although limited in some cases) were evaluated to determine a set of constraints and models that could be used to estimate the maximum loading of specific waste compositions in glass. It is recommended that these models and constraints be used to estimate the likely HLW and LAW glass volumes that would result if the current glass formulation studies are successfully completed. It is recognized that some of the models are preliminary in nature and will change in the coming years. Plus the models do not currently address the prediction uncertainties that would be needed before they could be used in plant operations. The models and constraints are only meant to give an indication of rough glass volumes and are not intended to be used in plant operation or waste form qualification activities. A current research program is in place to develop the data, models, and uncertainty descriptions for that purpose. A fundamental tenet underlying the research reported in this document is to try to be less conservative than previous studies when developing constraints for estimating the glass to be produced by implementing current advanced glass formulation efforts. The less conservative approach documented herein should allow for the estimate of glass masses that may be realized if the current efforts in advanced glass formulations are completed over the coming years and are as successful as early indications suggest they may be. Because of this approach there is an unquantifiable uncertainty in the ultimate glass volume projections due to model prediction uncertainties that has to be considered along with other system uncertainties such as waste compositions and amounts to be immobilized, split factors between LAW and HLW, etc.
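
    A purely illustrative scan of waste loading against property-model constraints, showing how constraints and models can be combined to estimate a maximum loading; the property functions and limits below are invented placeholders, not the PNNL glass property models or acceptance limits.

```python
import numpy as np

def viscosity_pas(frac_waste):        # hypothetical property model (placeholder)
    return 6.0 - 3.0 * frac_waste

def pct_crystallinity(frac_waste):    # hypothetical property model (placeholder)
    return 0.5 + 4.0 * frac_waste

def constraints_met(frac_waste):
    """Check the candidate loading against invented processing/quality limits."""
    return (2.0 <= viscosity_pas(frac_waste) <= 8.0) and pct_crystallinity(frac_waste) <= 2.0

loadings = np.linspace(0.0, 0.6, 61)
feasible = [w for w in loadings if constraints_met(w)]
print(f"estimated maximum waste loading (toy example): {max(feasible):.2f}")
```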

  19. A GENERALIZATION OF TRADITIONAL KANO MODEL FOR CUSTOMER REQUIREMENTS ANALYSIS

    Directory of Open Access Journals (Sweden)

    Renáta Turisová

    2015-07-01

    Full Text Available Purpose: The theory of attractiveness determines the relationship between the technically achieved and the customer-perceived quality of product attributes. The most frequently used approach in the theory of attractiveness is the implementation of Kano's model. Many generalizations of that model exist which take into consideration various aspects and approaches focused on understanding customer preferences and identifying their priorities for a product. The aim of this article is to outline another possible generalization of Kano's model. Methodology/Approach: The traditional Kano's model captures the nonlinear relationship between the achieved attributes of quality and customer requirements. The individual attributes of quality are divided into three main categories: must-be, one-dimensional, and attractive quality, and into two side categories: indifferent and reverse quality. A well-selling product has to contain the must-be attributes. It should contain as many one-dimensional attributes as possible. If there are also supplementary attractive attributes, the attractiveness of the entire product, from the viewpoint of the customer, rises sharply and nonlinearly, which has a direct positive impact on the decision of a potential customer when purchasing the product. In this article, we show that the assignment of a product's individual quality attributes to the mentioned categories depends, among other things, on the life-cycle costs of the product and on its price on the market. Findings: In practice, products are often placed into different price categories: lower, middle, and upper class. For certain types of products the category is either directly declared by the producer (especially in the automotive industry) or is determined by the customer by means of an assessment of available market prices. Different customer expectations can be assigned to each of those product groups

  20. Dopamine, behavioral economics, and effort

    Directory of Open Access Journals (Sweden)

    John D Salamone

    2009-09-01

    Full Text Available Abstract. There are numerous problems with the hypothesis that brain dopamine (DA systems, particularly in the nucleus accumbens, directly mediate the rewarding or primary motivational characteristics of natural stimuli such as food. Research and theory related to the functions of mesolimbic DA are undergoing a substantial conceptual restructuring, with the traditional emphasis on hedonia and primary reward yielding to other concepts and lines of inquiry. The present review is focused upon the involvement of nucleus accumbens DA in behavioral activation and effort-related processes. Viewed from the framework of behavioral economics, the effects of accumbens DA depletions and antagonism on food-reinforced behavior are highly dependent upon the work requirements of the instrumental task, and DA depleted rats are more sensitive to increases in response costs (i.e., ratio requirements. Moreover, interference with accumbens DA transmission exerts a powerful influence over effort-related choice behavior. Rats with accumbens DA depletions or antagonism reallocate their instrumental behavior away from food-reinforced tasks that have high response requirements, and instead these rats select a less-effortful type of food-seeking behavior. Nucleus accumbens DA and adenosine interact in the regulation of effort-related functions, and other brain structures (anterior cingulate cortex, amygdala, ventral pallidum also are involved. Studies of the brain systems regulating effort-based processes may have implications for understanding drug abuse, as well as energy-related disorders such as psychomotor slowing, fatigue or anergia in depression and other neurological disorders.

  1. Modeling the minimum enzymatic requirements for optimal cellulose conversion

    International Nuclear Information System (INIS)

    Den Haan, R; Van Zyl, W H; Van Zyl, J M; Harms, T M

    2013-01-01

    Hydrolysis of cellulose is achieved by the synergistic action of endoglucanases, exoglucanases and β-glucosidases. Most cellulolytic microorganisms produce a varied array of these enzymes and the relative roles of the components are not easily defined or quantified. In this study we have used partially purified cellulases produced heterologously in the yeast Saccharomyces cerevisiae to increase our understanding of the roles of some of these components. CBH1 (Cel7), CBH2 (Cel6) and EG2 (Cel5) were separately produced in recombinant yeast strains, allowing their isolation free of any contaminating cellulolytic activity. Binary and ternary mixtures of the enzymes at loadings ranging between 3 and 100 mg g −1 Avicel allowed us to illustrate the relative roles of the enzymes and their levels of synergy. A mathematical model was created to simulate the interactions of these enzymes on crystalline cellulose, under both isolated and synergistic conditions. Laboratory results from the various mixtures at a range of loadings of recombinant enzymes allowed refinement of the mathematical model. The model can further be used to predict the optimal synergistic mixes of the enzymes. This information can subsequently be applied to help to determine the minimum protein requirement for complete hydrolysis of cellulose. Such knowledge will be greatly informative for the design of better enzymatic cocktails or processing organisms for the conversion of cellulosic biomass to commodity products. (letter)

  2. WNA CORDEL Code Convergence Effort, The example of Harmonisation of NDE Qualification requirements. AP1000R Global Plant Licensing. EPR Family Presentation. WNA CORDEL report - What can nuclear learn from aviation?

    International Nuclear Information System (INIS)

    Wasylyk, Andrew; Delong, Richard; Green, John; Bouteille, Francois; Pouget-Abadie, Xavier; Raetzke, Christian

    2013-01-01

    The industry representatives provided their insights about new reactor activities, what reactor designers, operators/licensees, and representatives from standards development organizations are doing to promote standardization of designs and convergence of standards, and what their expectations are toward MDEP to further enhance standardization of designs and convergence of standards. The industry emphasised that they are embracing harmonisation to address new reactor issues and that they would hope that the regulators do the same. AREVA, Westinghouse and CORDEL described their efforts in maintaining standard designs as much as possible to gain efficiency in licensing, constructing and operating new nuclear power plants worldwide. They considered that MDEP work was valuable, but should be pursued further to avoid differences in the design driven by differing regulatory requirements. The need for the regulators to identify areas where convergence is not likely to be reached was also underlined. Cooperation between regulators involved in the licensing of aircraft was mentioned as an example to be followed.

  3. Research requirements for a unified approach to modelling chemical effects associated with radioactive waste disposal

    International Nuclear Information System (INIS)

    Krol, A.A.; Read, D.

    1986-09-01

    This report contains the results of a review of the current modelling, laboratory experiments and field experiments being conducted in the United Kingdom to aid understanding and improve prediction of the effects of chemistry on the disposal of radioactive wastes. The aim has been to summarise present work and derive a structure for future research effort that would support the use of probabilistic risk assessment (pra) methods for the disposal of radioactive wastes. The review was conducted by a combination of letter and personal visits, and preliminary results were reported to a plenary meeting of participants held in April, 1986. Following this meeting, copies of the report were circulated to participants at draft stage, so that the finalised report should be taken to provide as far as possible a consensus of opinion of research requirements. (author)

  4. Research on Computer Aided Innovation Model of Weapon Equipment Requirement Demonstration

    Science.gov (United States)

    Li, Yong; Guo, Qisheng; Wang, Rui; Li, Liang

    Firstly, in order to overcome the shortcoming of using AD or TRIZ alone, and to solve the problems that currently exist in weapon equipment requirement demonstration, the paper constructs a method system for weapon equipment requirement demonstration combining QFD, AD, TRIZ, and FA. Then, we construct a CAI model framework for weapon equipment requirement demonstration, which includes a requirement decomposition model, a requirement mapping model and a requirement plan optimization model. Finally, we construct the computer aided innovation model of weapon equipment requirement demonstration and develop CAI software for equipment requirement demonstration.

  5. A 2nd generation static model of greenhouse energy requirements (horticern) : a comparison with dynamic models

    CERN Document Server

    Jolliet, O; Munday, G L

    1989-01-01

    Optimisation of a greenhouse and its components requires a suitable model permitting precise determination of its energy requirements. Existing static models are simple but lack precision; dynamic models, though more precise, are unsuitable for use over long periods and difficult to handle in practice. A theoretical study and measurements from the CERN trial greenhouse have allowed the development of a new static model named "HORTICERN", which is precise and easy to use for predicting energy consumption and takes into account the effects of solar energy, wind and radiative loss to the sky. This paper compares the HORTICERN model with the dynamic models of Bot, Takakura, Van Bavel and Gembloux, and demonstrates that its precision is comparable; differences on average are less than 5%, and it is independent of the type of greenhouse (e.g. single or double glazing, Hortiplus, etc.) and climate. The HORTICERN method has been developed for PC use and is proving to be a powerful tool for greenhouse optimisation by research work...
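
    A rough static energy-balance sketch in the spirit of a HORTICERN-type model: heating need over a time step equals envelope losses minus useful solar gains. The coefficients and the simple loss term are assumptions for illustration, not the published model's parameters.

```python
def heating_requirement_wh(u_value, area_m2, t_inside, t_outside,
                           solar_gain_wh, solar_utilisation=0.7, hours=1.0):
    """Return the heating energy (Wh) needed for one time step of `hours` hours."""
    conduction_loss = u_value * area_m2 * (t_inside - t_outside) * hours   # W * h = Wh
    useful_solar = solar_utilisation * solar_gain_wh                       # only part is useful
    return max(conduction_loss - useful_solar, 0.0)

# Example: 1000 m2 double-glazed greenhouse (U ~ 3 W/m2K), 18 degC inside, 5 degC outside,
# with 20 kWh of solar radiation transmitted during the hour.
print(heating_requirement_wh(3.0, 1000.0, 18.0, 5.0, solar_gain_wh=20_000))  # -> 25000.0 Wh
```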

  6. Finite element model approach of a cylindrical lithium ion battery cell with a focus on minimization of the computational effort and short circuit prediction

    Science.gov (United States)

    Raffler, Marco; Sevarin, Alessio; Ellersdorfer, Christian; Heindl, Simon F.; Breitfuss, Christoph; Sinz, Wolfgang

    2017-08-01

    In this research, a parameterized beam-element-based mechanical modeling approach for cylindrical lithium ion batteries is developed. With the goal to use the cell model in entire vehicle crash simulations, focus of development is on minimizing the computational effort whilst simultaneously obtaining accurate mechanical behavior. The cylindrical cell shape is approximated by radial beams connected to each other in circumferential and longitudinal directions. The discrete beam formulation is used to define an anisotropic material behavior. An 18650 lithium ion cell model constructed in LS-Dyna is used to show the high degree of parameterization of the approach. A criterion which considers the positive pole deformation and the radial deformation of the cell is developed for short circuit prediction during simulation. An abuse testing program, consisting of radial crush, axial crush, and penetration is performed to evaluate the mechanical properties and internal short circuit behavior of a commercially available 18650 lithium cell. Additional 3-point-bending tests are performed to verify the approach objectively. By reducing the number of strength-related elements to 1600, a fast and accurate cell model can be created. Compared to typical cell models in technical literature, simulation time of a single cell load case can be reduced by approx. 90%.

  7. Effort in Multitasking: Local and Global Assessment of Effort.

    Science.gov (United States)

    Kiesel, Andrea; Dignath, David

    2017-01-01

    When performing multiple tasks in succession, self-organization of task order might be superior compared to external-controlled task schedules, because self-organization allows optimizing processing modes and thus reduces switch costs, and it increases commitment to task goals. However, self-organization is an additional executive control process that is not required if task order is externally specified and as such it is considered as time-consuming and effortful. To compare self-organized and externally controlled task scheduling, we suggest assessing global subjective and objectives measures of effort in addition to local performance measures. In our new experimental approach, we combined characteristics of dual tasking settings and task switching settings and compared local and global measures of effort in a condition with free choice of task sequence and a condition with cued task sequence. In a multi-tasking environment, participants chose the task order while the task requirement of the not-yet-performed task remained the same. This task preview allowed participants to work on the previously non-chosen items in parallel and resulted in faster responses and fewer errors in task switch trials than in task repetition trials. The free-choice group profited more from this task preview than the cued group when considering local performance measures. Nevertheless, the free-choice group invested more effort than the cued group when considering global measures. Thus, self-organization in task scheduling seems to be effortful even in conditions in which it is beneficiary for task processing. In a second experiment, we reduced the possibility of task preview for the not-yet-performed tasks in order to hinder efficient self-organization. Here neither local nor global measures revealed substantial differences between the free-choice and a cued task sequence condition. Based on the results of both experiments, we suggest that global assessment of effort in addition to

  8. Relational-model based change management for non-functional requirements: approach and experiment

    NARCIS (Netherlands)

    Kassab, M.; Ormandjieva, O.; Daneva, Maia; Rolland, Colette; Collard, Martine

    2011-01-01

    In software industry, many organizations either focus their traceability efforts on Functional Requirements (FRs) or else fail entirely to implement an effective traceability process. NonFunctional Requirements (NFRs) such as security, safety, performance, and reliability are treated in a rather ad

  9. Models and data requirements for human reliability analysis

    International Nuclear Information System (INIS)

    1989-03-01

    It has been widely recognised for many years that the safety of the nuclear power generation depends heavily on the human factors related to plant operation. This has been confirmed by the accidents at Three Mile Island and Chernobyl. Both these cases revealed how human actions can defeat engineered safeguards and the need for special operator training to cover the possibility of unexpected plant conditions. The importance of the human factor also stands out in the analysis of abnormal events and insights from probabilistic safety assessments (PSA's), which reveal a large proportion of cases having their origin in faulty operator performance. A consultants' meeting, organized jointly by the International Atomic Energy Agency (IAEA) and the International Institute for Applied Systems Analysis (IIASA) was held at IIASA in Laxenburg, Austria, December 7-11, 1987, with the aim of reviewing existing models used in Probabilistic Safety Assessment (PSA) for Human Reliability Analysis (HRA) and of identifying the data required. The report collects both the contributions offered by the members of the Expert Task Force and the findings of the extensive discussions that took place during the meeting. Refs, figs and tabs

  10. Modelling efforts needed to advance herpes simplex virus (HSV) vaccine development: Key findings from the World Health Organization Consultation on HSV Vaccine Impact Modelling.

    Science.gov (United States)

    Gottlieb, Sami L; Giersing, Birgitte; Boily, Marie-Claude; Chesson, Harrell; Looker, Katharine J; Schiffer, Joshua; Spicknall, Ian; Hutubessy, Raymond; Broutet, Nathalie

    2017-06-21

    Development of a vaccine against herpes simplex virus (HSV) is an important goal for global sexual and reproductive health. In order to more precisely define the health and economic burden of HSV infection and the theoretical impact and cost-effectiveness of an HSV vaccine, in 2015 the World Health Organization convened an expert consultation meeting on HSV vaccine impact modelling. The experts reviewed existing model-based estimates and dynamic models of HSV infection to outline critical future modelling needs to inform development of a comprehensive business case and preferred product characteristics for an HSV vaccine. This article summarizes key findings and discussions from the meeting on modelling needs related to HSV burden, costs, and vaccine impact, essential data needs to carry out those models, and important model components and parameters. Copyright © 2017. Published by Elsevier Ltd.

  11. A systematic review and meta-analysis of the effort-reward imbalance model of workplace stress with indicators of immune function.

    Science.gov (United States)

    Eddy, Pennie; Heckenberg, Rachael; Wertheim, Eleanor H; Kent, Stephen; Wright, Bradley J

    2016-12-01

    Despite considerable research into associations between the effort-reward imbalance (ERI) model and various health outcomes over the past 20 years, the underlying mechanisms responsible for the association remain unclear. Recently, ERI investigations have examined associations with immune sub-systems (e.g., leukocytes, cytokines and immunoglobulins). Synthesis of the amalgamated research evidence will bring clarity to this field of enquiry. We conducted a meta-analysis and reviewed the associations of ERI and over-commitment (OC) in the workplace with immunity. Electronic databases were searched with the phrase 'effort reward imbalance', which initially yielded 319 studies, leading to 57 full text studies being screened. Seven studies that met inclusion criteria were combined using mixed and random effects models. Greater ERI was associated with lower immunity (r=-0.09, CI -0.14, -0.05, p<0.001). Sub-group analyses revealed the effect with mucosal immunity was stronger (r=-0.33, CI -0.47 to -0.18) than trends between both cytokine (r=-0.04, CI -0.07, -0.01) and leukocyte sub-groups (r=-0.02, CI -0.04, 0.01) respectively (k=7, N=9952). Over-commitment was also associated with lower immunity (r=-0.05, CI -0.09, 0.01, p=0.014); subgroup (leukocytes, cytokines, mucosal immunity) associations, however, were homogeneous (Q=1.83, df=2, p=0.400, k=6, N=2358). Greater ERI and OC were both associated with lower immunity. The association between mucosal immunity and ERI was stronger than the cytokine and leukocyte sub-groups. OC moderated the relationship between ERI and immunity. Copyright © 2016 Elsevier Inc. All rights reserved.
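
    A sketch of pooling study-level correlations with Fisher's z transform and inverse-variance weights; the fixed-effect form is shown for brevity, whereas the paper used mixed and random effects models, and the r and n values below are invented rather than the seven included studies.

```python
import numpy as np

r = np.array([-0.12, -0.05, -0.20, -0.08, -0.03, -0.15, -0.09])   # per-study correlations (invented)
n = np.array([ 300,   850,   120,  2200,  4100,   260,  2100 ])   # per-study sample sizes (invented)

z = np.arctanh(r)                 # Fisher z transform of each correlation
w = n - 3                         # inverse of var(z) = 1/(n - 3)
z_pooled = np.sum(w * z) / np.sum(w)
se = np.sqrt(1 / np.sum(w))
ci = np.tanh([z_pooled - 1.96 * se, z_pooled + 1.96 * se])

print(f"pooled r = {np.tanh(z_pooled):.3f}, 95% CI = [{ci[0]:.3f}, {ci[1]:.3f}]")
```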

  12. Aerosol-Radiation-Cloud Interactions in the South-East Atlantic: Model-Relevant Observations and the Beneficiary Modeling Efforts in the Realm of the EVS-2 Project ORACLES

    Science.gov (United States)

    Redemann, Jens

    2018-01-01

    Globally, aerosols remain a major contributor to uncertainties in assessments of anthropogenically-induced changes to the Earth climate system, despite concerted efforts using satellite and suborbital observations and increasingly sophisticated models. The quantification of direct and indirect aerosol radiative effects, as well as cloud adjustments thereto, even at regional scales, continues to elude our capabilities. Some of our limitations are due to insufficient sampling and accuracy of the relevant observables, under an appropriate range of conditions to provide useful constraints for modeling efforts at various climate scales. In this talk, I will describe (1) the efforts of our group at NASA Ames to develop new airborne instrumentation to address some of the data insufficiencies mentioned above; (2) the efforts by the EVS-2 ORACLES project to address aerosol-cloud-climate interactions in the SE Atlantic and (3) time permitting, recent results from a synergistic use of A-Train aerosol data to test climate model simulations of present-day direct radiative effects in some of the AEROCOM phase II global climate models.

  13. Reproductive effort in viscous populations

    NARCIS (Netherlands)

    Pen, Ido

    Here I study a kin selection model of reproductive effort, the allocation of resources to fecundity versus survival, in a patch-structured population. Breeding females remain in the same patch for life. Offspring have costly, partial long-distance dispersal and compete for breeding sites, which

  14. Validation of Power Requirement Model for Active Loudspeakers

    DEFF Research Database (Denmark)

    Schneider, Henrik; Madsen, Anders Normann; Bjerregaard, Ruben

    2015-01-01

    The actual power requirement of an active loudspeaker during playback of music has not received much attention in the literature. This is probably because no single and simple solution exists and because a complete system knowledge from input voltage to output sound pressure level is required...

  15. Closed loop models for analyzing engineering requirements for simulators

    Science.gov (United States)

    Baron, S.; Muralidharan, R.; Kleinman, D.

    1980-01-01

    A closed-loop analytic model incorporating a model of the human pilot (namely, the optimal control model) was developed to allow certain simulation design tradeoffs to be evaluated quantitatively. This model was applied to a realistic flight control problem. The resulting model is used to analyze both overall simulation effects and the effects of individual elements. The results show that, as compared to an ideal continuous simulation, the discrete simulation can result in significant performance and/or workload penalties.

  16. Simulation Modeling Requirements for Loss-of-Control Accident Prevention of Turboprop Transport Aircraft

    Science.gov (United States)

    Crider, Dennis; Foster, John V.

    2012-01-01

    In-flight loss of control remains the leading contributor to aviation accident fatalities, with stall upsets being the leading causal factor. The February 12, 2009, Colgan Air, Inc., Continental Express flight 3407 accident outside Buffalo, New York, brought this issue to the forefront of public consciousness and resulted in recommendations from the National Transportation Safety Board to conduct training that incorporates fully developed stalls and to develop simulator standards to support such training. In 2010, Congress responded to this accident with Public Law 111-216 (Section 208), which mandates full stall training for Part 121 flight operations. Efforts are currently in progress to develop recommendations on implementation of stall training for airline pilots. The International Committee on Aviation Training in Extended Envelopes (ICATEE) is currently defining simulator fidelity standards that will be necessary for effective stall training. These recommendations will apply to all civil transport aircraft including straight-wing turboprop aircraft. Government-funded research over the previous decade provides a strong foundation for stall/post-stall simulation for swept-wing, conventional tail jets to respond to this mandate, but turboprops present additional and unique modeling challenges. First among these challenges is the effect of power, which can provide enhanced flow attachment behind the propellers. Furthermore, turboprops tend to operate for longer periods in an environment more susceptible to ice. As a result, there have been a significant number of turboprop accidents as a result of the early (lower angle of attack) stalls in icing. The vulnerability of turboprop configurations to icing has led to studies on ice accumulation and the resulting effects on flight behavior. Piloted simulations of these effects have highlighted the important training needs for recognition and mitigation of icing effects, including the reduction of stall margins

  17. Required Collaborative Work in Online Courses: A Predictive Modeling Approach

    Science.gov (United States)

    Smith, Marlene A.; Kellogg, Deborah L.

    2015-01-01

    This article describes a predictive model that assesses whether a student will have greater perceived learning in group assignments or in individual work. The model produces correct classifications 87.5% of the time. The research is notable in that it is the first in the education literature to adopt a predictive modeling methodology using data…
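
    An illustrative predictive-modelling sketch: classify whether a student reports greater perceived learning in group work (1) or individual work (0) from a few predictors. The paper's actual variables and classification method are not specified here, so the features, data, and choice of logistic regression are generic stand-ins.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))     # e.g., prior GPA, hours online, prior team experience (synthetic)
y = (X[:, 2] + 0.5 * X[:, 0] + rng.normal(scale=0.8, size=200) > 0).astype(int)

clf = LogisticRegression()
print("cross-validated classification accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```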

  18. Downscaling the marine modelling effort: Development, application and assessment of a 3D ecosystem model implemented in a small coastal area

    Science.gov (United States)

    Kolovoyiannis, V. N.; Tsirtsis, G. E.

    2013-07-01

    The present study deals with the development, application and evaluation of a modelling tool, implemented along with a field sampling program, in a limited coastal area in the Northeast Aegean. The aim was to study, understand and quantify physical circulation and water column ecological processes in a high resolution simulation of a past annual cycle. The marine ecosystem model consists of a three dimensional hydrodynamic component suitable for coastal areas (Princeton Ocean Model) coupled to a simple ecological model of five variables, namely, phytoplankton, nitrate, ammonia, phosphate and dissolved organic carbon concentrations. The ecological parameters (e.g. half saturation constants and maximum uptake rates for nutrients) were calibrated using a specially developed automated procedure. Model errors were evaluated using qualitative, graphic techniques and were quantified with a number of goodness-of-fit measures. Regarding physical variables, the goodness-of-fit of model to field data varied from fairly to quite good. Indicatively, the cost function, expressed as mean value per sampling station, ranged from 0.15 to 0.23 for temperature and 0.81 to 3.70 for current speed. The annual cycle of phytoplankton biomass was simulated with sufficient accuracy (e.g. mean cost function ranging from 0.49 to 2.67), partly attributed to the adequate reproduction of the dynamics of growth limiting nutrients, nitrate, ammonia and the main limiting nutrient, phosphate, whose mean cost function ranged from 0.97 to 1.88. Model results and field data provided insight to physical processes such as the development of a wind-driven, coastal jet type of surface alongshore flow with a subsurface countercurrent flowing towards opposite direction and the formation of rotational flows in the embayments of the coastline when the offshore coastal current speed approaches values of about 0.1 m/s. The percentage of field measurements where the N:P ratio was found over 16:1 varied between
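
    A sketch of a cost-function style goodness-of-fit measure of the kind commonly used for coastal ecosystem models (model-data mismatch scaled by the variability of the observations); the data arrays are invented, and the exact formulation used in the study may differ.

```python
import numpy as np

def cost_function(model, obs):
    """Mean absolute model-observation difference divided by the std of the observations."""
    obs = np.asarray(obs, dtype=float)
    model = np.asarray(model, dtype=float)
    return np.mean(np.abs(model - obs)) / np.std(obs)

obs_po4   = [0.18, 0.22, 0.31, 0.12, 0.09, 0.25]   # hypothetical phosphate observations (uM)
model_po4 = [0.20, 0.19, 0.27, 0.15, 0.13, 0.22]   # corresponding model output (hypothetical)

# Lower values indicate better agreement between model and observations.
print(f"cost function = {cost_function(model_po4, obs_po4):.2f}")
```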

  19. User Requirements Model for University Timetable Management System

    OpenAIRE

    Ahmad Althunibat; Mohammad I. Muhairat

    2016-01-01

    Automated timetables are used to schedule courses, lectures and rooms in universities by considering some constraints. Inconvenient and ineffective timetables often waste time and money. Therefore, it is important to investigate the requirements and potential needs of users. Thus, eliciting user requirements of University Timetable Management System (TMS) and their implication becomes an important process for the implementation of TMS. On this note, this paper seeks to propose a m...

  20. The Modeling of Factors That Influence Coast Guard Manpower Requirements

    Science.gov (United States)

    2014-12-01

    ... Requirements Determination process. ... allowances. To help clarify the process, Phase II has guiding principles and core assumptions that direct the Phase. Three of the four guiding principles are ... the analyst is determining for the first time what manpower is required. The second notable guiding principle is "MRD analysts shall identify and ...

  1. Bioenergy crop models: Descriptions, data requirements and future challenges

    Energy Technology Data Exchange (ETDEWEB)

    Nair, S. Surendran [University of Tennessee, Knoxville (UTK); Kang, Shujiang [ORNL; Zhang, Xuesong [Pacific Northwest National Laboratory (PNNL); Miguez, Fernando [Iowa State University; Izaurralde, Dr. R. Cesar [Pacific Northwest National Laboratory (PNNL); Post, Wilfred M [ORNL; Dietze, Michael [University of Illinois, Urbana-Champaign; Lynd, L. [Dartmouth College; Wullschleger, Stan D [ORNL

    2012-01-01

    Field studies that address the production of lignocellulosic biomass as a source of renewable energy provide critical data for the development of bioenergy crop models. A literature survey revealed that 14 models have been used for simulating bioenergy crops including herbaceous and woody bioenergy crops, and for crassulacean acid metabolism (CAM) crops. These models simulate field-scale production of biomass for switchgrass (ALMANAC, EPIC, and Agro-BGC), miscanthus (MISCANFOR, MISCANMOD, and WIMOVAC), sugarcane (APSIM, AUSCANE, and CANEGRO), and poplar and willow (SECRETS and 3PG). Two models are adaptations of dynamic global vegetation models and simulate biomass yields of miscanthus and sugarcane at regional scales (Agro-IBIS and LPJmL). Although it lacks the complexity of other bioenergy crop models, the environmental productivity index (EPI) is the only model used to estimate biomass production of CAM (Agave and Opuntia) plants. Except for the EPI model, all models include representations of leaf area dynamics, phenology, radiation interception and utilization, biomass production, and partitioning of biomass to roots and shoots. A few models simulate soil water, nutrient, and carbon cycle dynamics, making them especially useful for assessing the environmental consequences (e.g., erosion and nutrient losses) associated with the large-scale deployment of bioenergy crops. The rapid increase in use of models for energy crop simulation is encouraging; however, detailed information on the influence of climate, soils, and crop management practices on biomass production is scarce. Thus considerable work remains regarding the parameterization and validation of process-based models for bioenergy crops; generation and distribution of high-quality field data for model development and validation; and implementation of an integrated framework for efficient, high-resolution simulations of biomass production for use in planning sustainable bioenergy systems.

  2. Modelling Imperfect Product Line Requirements with Fuzzy Feature Diagrams

    NARCIS (Netherlands)

    Noppen, J.A.R.; van den Broek, P.M.; Weston, Nathan; Rashid, Awais

    In this article, we identify that partial, vague and conflicting information can severely limit the effectiveness of approaches that derive feature trees from textual requirement specifications. We examine the impact such imperfect information has on feature tree extraction and we propose the use of

  3. REQUIREMENTS PATTERNS FOR FORMAL CONTRACTS IN ARCHITECTURAL ANALYSIS AND DESIGN LANGUAGE (AADL) MODELS

    Science.gov (United States)

    2017-04-17

    The transition relation only admits traces of Lpatt if and only if the variable pass is invariant ... also allows requirements to compose more easily. For the remainder of this report we refer to this effort as the Contract Requirements Patterns (CRP) ... The constraints are over fresh variables: run, timer, recc, and pass; they are shown in Figure 4. The constraints only restrict the values of the fresh ...

  4. From requirement document to formal modelling and decomposition of control systems

    OpenAIRE

    Yeganefard, Sanaz

    2014-01-01

    Formal modelling of control systems can help with identifying missing requirements and design flaws before implementing them. However, modelling using formal languages can be challenging and time consuming. Therefore intermediate steps may be required to simplify the transition from informal requirements to a formal model. In this work we firstly provide a four-stage approach for structuring and formalising requirements of a control system. This approach is based on monitored, controlled, mode...

  5. How the dual process model of human cognition can inform efforts to de-implement ineffective and harmful clinical practices: A preliminary model of unlearning and substitution.

    Science.gov (United States)

    Helfrich, Christian D; Rose, Adam J; Hartmann, Christine W; van Bodegom-Vos, Leti; Graham, Ian D; Wood, Suzanne J; Majerczyk, Barbara R; Good, Chester B; Pogach, Leonard M; Ball, Sherry L; Au, David H; Aron, David C

    2018-02-01

    One way to understand medical overuse at the clinician level is in terms of clinical decision-making processes that are normally adaptive but become maladaptive. In psychology, dual process models of cognition propose 2 decision-making processes. Reflective cognition is a conscious process of evaluating options based on some combination of utility, risk, capabilities, and/or social influences. Automatic cognition is a largely unconscious process occurring in response to environmental or emotive cues based on previously learned, ingrained heuristics. De-implementation strategies directed at clinicians may be conceptualized as corresponding to cognition: (1) a process of unlearning based on reflective cognition and (2) a process of substitution based on automatic cognition. We define unlearning as a process in which clinicians consciously change their knowledge, beliefs, and intentions about an ineffective practice and alter their behaviour accordingly. Unlearning has been described as "the questioning of established knowledge, habits, beliefs and assumptions as a prerequisite to identifying inappropriate or obsolete knowledge underpinning and/or embedded in existing practices and routines." We hypothesize that as an unintended consequence of unlearning strategies clinicians may experience "reactance," ie, feel their professional prerogative is being violated and, consequently, increase their commitment to the ineffective practice. We define substitution as replacing the ineffective practice with one or more alternatives. A substitute is a specific alternative action or decision that either precludes the ineffective practice or makes it less likely to occur. Both approaches may work independently, eg, a substitute could displace an ineffective practice without changing clinicians' knowledge, and unlearning could occur even if no alternative exists. For some clinical practice, unlearning and substitution strategies may be most effectively used together. By taking into

  6. The Benefit of Ambiguity in Understanding Goals in Requirements Modelling

    DEFF Research Database (Denmark)

    Paay, Jeni; Pedell, Sonja; Sterling, Leon

    2011-01-01

    of their research is to create technologies that support more flexible and meaningful social interactions, by combining best practice in software engineering with ethnographic techniques to model complex social interactions from their socially oriented life for the purposes of building rich socio ... ambiguity in the process of elicitation and analysis through the use of empirically informed quality goals attached to functional goals. The authors demonstrate the benefit of articulating a quality goal without turning it into a functional goal. Their study shows that quality goals kept at a high level of abstraction, ambiguous and open for conversations through the modelling process add richness to goal models, and communicate quality attributes of the interaction being modelled to the design phase, where this ambiguity is regarded as a resource for design.

  7. Requirements and Problems in Parallel Model Development at DWD

    Directory of Open Access Journals (Sweden)

    Ulrich Schättler

    2000-01-01

    Nearly 30 years after introducing the first computer model for weather forecasting, the Deutscher Wetterdienst (DWD) is developing the 4th generation of its numerical weather prediction (NWP) system. It consists of a global grid point model (GME) based on a triangular grid and a non-hydrostatic Lokal Modell (LM). The operational demand for running this new system is immense and can only be met by parallel computers. From the experience gained in developing earlier NWP models, several new problems had to be taken into account during the design phase of the system. Most important were portability (including efficiency of the programs on several computer architectures) and ease of code maintainability. Also the organization and administration of the work done by developers from different teams and institutions is more complex than it used to be. This paper describes the models and gives some performance results. The modular approach used for the design of the LM is explained and the effects on the development are discussed.

  8. Three Tier Unified Process Model for Requirement Negotiations and Stakeholder Collaborations

    Science.gov (United States)

    Niazi, Muhammad Ashraf Khan; Abbas, Muhammad; Shahzad, Muhammad

    2012-11-01

    This research paper is focused on carrying out a pragmatic qualitative analysis of various models and approaches to requirements negotiation (a sub-process of the requirements management plan, which is an output of scope management's collect requirements process) and studies stakeholder collaboration methodologies (i.e. from within the communication management knowledge area). The experiential analysis encompasses two tiers; the first tier refers to the weighted scoring model, while the second tier focuses on the development of SWOT matrices, on the basis of the findings of the weighted scoring model, for selecting an appropriate requirements negotiation model. Finally the results are simulated with the help of statistical pie charts. On the basis of the simulated results for the prevalent models and approaches to negotiation, a unified approach for requirements negotiation and stakeholder collaboration is proposed, in which the collaboration methodologies are embedded into the selected requirements negotiation model as internal parameters of the proposed process, alongside some external required parameters such as MBTI, opportunity analysis, etc.
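
    Purely as an illustration of the first-tier idea described above (the abstract does not give the actual criteria, weights, candidates, or scores, so every name and number below is hypothetical), a weighted scoring pass over candidate negotiation models might look like the following sketch; the top-ranked candidate would then feed the second-tier SWOT analysis.

        # Hypothetical first-tier weighted scoring for requirements-negotiation models.
        # Criteria, weights, and scores are illustrative only, not from the paper.

        criteria_weights = {            # weights sum to 1.0
            "stakeholder_coverage": 0.4,
            "tool_support":         0.3,
            "effort_required":      0.3,
        }

        # Candidate models scored 1-5 against each criterion (made-up values).
        candidates = {
            "WinWin":     {"stakeholder_coverage": 5, "tool_support": 4, "effort_required": 3},
            "EasyWinWin": {"stakeholder_coverage": 4, "tool_support": 5, "effort_required": 4},
            "AHP-based":  {"stakeholder_coverage": 3, "tool_support": 3, "effort_required": 2},
        }

        def weighted_score(scores, weights):
            """Sum of criterion score times criterion weight."""
            return sum(scores[c] * w for c, w in weights.items())

        ranking = sorted(candidates.items(),
                         key=lambda kv: weighted_score(kv[1], criteria_weights),
                         reverse=True)

        for name, scores in ranking:
            print(f"{name:12s} {weighted_score(scores, criteria_weights):.2f}")
        # The highest-ranked model would be carried into the SWOT matrices.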

  9. Mathematical Formulation Requirements and Specifications for the Process Models

    International Nuclear Information System (INIS)

    Steefel, C.; Moulton, D.; Pau, G.; Lipnikov, K.; Meza, J.; Lichtner, P.; Wolery, T.; Bacon, D.; Spycher, N.; Bell, J.; Moridis, G.; Yabusaki, S.; Sonnenthal, E.; Zyvoloski, G.; Andre, B.; Zheng, L.; Davis, J.

    2010-01-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) is intended to be a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM program is aimed at addressing critical EM program needs to better understand and quantify flow and contaminant transport behavior in complex geological systems. It will also address the long-term performance of engineered components including cementitious materials in nuclear waste disposal facilities, in order to reduce uncertainties and risks associated with DOE EM's environmental cleanup and closure activities. Building upon national capabilities developed from decades of Research and Development in subsurface geosciences, computational and computer science, modeling and applied mathematics, and environmental remediation, the ASCEM initiative will develop an integrated, open-source, high-performance computer modeling system for multiphase, multicomponent, multiscale subsurface flow and contaminant transport. This integrated modeling system will incorporate capabilities for predicting releases from various waste forms, identifying exposure pathways and performing dose calculations, and conducting systematic uncertainty quantification. The ASCEM approach will be demonstrated on selected sites, and then applied to support the next generation of performance assessments of nuclear waste disposal and facility decommissioning across the EM complex. The Multi-Process High Performance Computing (HPC) Simulator is one of three thrust areas in ASCEM. The other two are the Platform and Integrated Toolsets (dubbed the Platform) and Site Applications. The primary objective of the HPC Simulator is to provide a flexible and extensible computational engine to simulate the coupled processes and flow scenarios described by the conceptual models developed using the ASCEM Platform. The graded and iterative approach to assessments naturally

  10. Modelling production of field crops and its requirements

    NARCIS (Netherlands)

    Wit, de C.T.; Keulen, van H.

    1987-01-01

    Simulation models are being developed that enable quantitative estimates of the growth and production of the main agricultural crops under a wide range of weather and soil conditions. For this purpose, several hierarchically ordered production situations are distinguished in such a way that the

  11. Predicting Flu Season Requirements: An Undergraduate Modeling Project

    Science.gov (United States)

    Kramlich, Gary R., II; Braunstein Fierson, Janet L.; Wright, J. Adam

    2010-01-01

    This project was designed to be used in a freshman calculus class whose students had already been introduced to logistic functions and basic data modeling techniques. It need not be limited to such an audience, however; it has also been implemented in a topics in mathematics class for college upperclassmen. Originally intended to be presented in…
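
    Since the project builds on logistic functions and basic data modeling, a minimal curve-fitting sketch may help orient the reader; the weekly case counts and parameter values below are invented and are not the project's data.

        # Fit a logistic curve N(t) = L / (1 + exp(-k*(t - t0))) to hypothetical
        # cumulative flu-case counts, then read off a projected season total.
        import numpy as np
        from scipy.optimize import curve_fit

        def logistic(t, L, k, t0):
            return L / (1.0 + np.exp(-k * (t - t0)))

        weeks = np.arange(1, 13)                                # weeks into the season
        cases = np.array([12, 20, 35, 60, 98, 150, 210, 270,    # made-up cumulative counts
                          320, 355, 375, 385], dtype=float)

        # Initial guesses: carrying capacity near the last count, midpoint mid-season.
        (L, k, t0), _ = curve_fit(logistic, weeks, cases, p0=[400.0, 0.5, 6.0])

        print(f"estimated season total L = {L:.0f}, growth rate k = {k:.2f}, midpoint t0 = {t0:.1f}")
        print(f"projected cumulative cases at week 16: {logistic(16, L, k, t0):.0f}")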

  12. Illinois highway materials sustainability efforts of 2015.

    Science.gov (United States)

    2016-08-01

    This report provides a summary of the sustainability efforts of the Illinois Department of Transportation (IDOT) in recycling : reclaimed materials in highway construction during calendar year 2015. This report meets the requirements of Illinois Publ...

  13. Illinois highway materials sustainability efforts of 2016.

    Science.gov (United States)

    2017-07-04

    This report provides a summary of the sustainability efforts of the Illinois Department of Transportation (IDOT) in recycling : reclaimed materials in highway construction during calendar year 2016. This report meets the requirements of Illinois Publ...

  14. Illinois highway materials sustainability efforts of 2013.

    Science.gov (United States)

    2014-08-01

    This report presents the sustainability efforts of the Illinois Department of Transportation (IDOT) in : recycling and reclaiming materials for use in highway construction. This report meets the requirements of : Illinois Public Act 097-0314 by docum...

  15. Illinois highway materials sustainability efforts of 2014.

    Science.gov (United States)

    2015-08-01

    This report presents the 2014 sustainability efforts of the Illinois Department of Transportation (IDOT) in : recycling reclaimed materials in highway construction. This report meets the requirements of Illinois : Public Act 097-0314 by documenting I...

  16. Modelling of capital requirements in the energy sector: capital market access. Final memorandum

    Energy Technology Data Exchange (ETDEWEB)

    1978-04-01

    Formal modelling techniques for analyzing the capital requirements of energy industries have been surveyed at DOE. A survey has been undertaken of a number of models which forecast energy-sector capital requirements or which detail the interactions of the energy sector and the economy. Models are identified which can be useful as prototypes for some portion of DOE's modelling needs. The models are examined to determine any useful data bases which could serve as inputs to an original DOE model. A selected group of models which can provide the stated capabilities is examined in more detail. The data sources used by these models are covered and a catalog of the relevant data bases is provided. The models covered are: capital markets and capital availability models (Fossil 1, Bankers Trust Co., DRI Macro Model); models of physical capital requirements (Bechtel Supply Planning Model, ICF Oil and Gas Model and Coal Model, Stanford Research Institute National Energy Model); macroeconomic forecasting models with input-output analysis capabilities (Wharton Annual Long-Term Forecasting Model, Brookhaven/University of Illinois Model, Hudson-Jorgenson/Brookhaven Model); utility models (MIT Regional Electricity Model-Baughman Joskow, Teknekron Electric Utility Simulation Model); and others (DRI Energy Model, DRI/Zimmerman Coal Model, and Oak Ridge Residential Energy Use Model).

  17. Mathematical Formulation Requirements and Specifications for the Process Models

    Energy Technology Data Exchange (ETDEWEB)

    Steefel, C.; Moulton, D.; Pau, G.; Lipnikov, K.; Meza, J.; Lichtner, P.; Wolery, T.; Bacon, D.; Spycher, N.; Bell, J.; Moridis, G.; Yabusaki, S.; Sonnenthal, E.; Zyvoloski, G.; Andre, B.; Zheng, L.; Davis, J.

    2010-11-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) is intended to be a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM program is aimed at addressing critical EM program needs to better understand and quantify flow and contaminant transport behavior in complex geological systems. It will also address the long-term performance of engineered components including cementitious materials in nuclear waste disposal facilities, in order to reduce uncertainties and risks associated with DOE EM's environmental cleanup and closure activities. Building upon national capabilities developed from decades of Research and Development in subsurface geosciences, computational and computer science, modeling and applied mathematics, and environmental remediation, the ASCEM initiative will develop an integrated, open-source, high-performance computer modeling system for multiphase, multicomponent, multiscale subsurface flow and contaminant transport. This integrated modeling system will incorporate capabilities for predicting releases from various waste forms, identifying exposure pathways and performing dose calculations, and conducting systematic uncertainty quantification. The ASCEM approach will be demonstrated on selected sites, and then applied to support the next generation of performance assessments of nuclear waste disposal and facility decommissioning across the EM complex. The Multi-Process High Performance Computing (HPC) Simulator is one of three thrust areas in ASCEM. The other two are the Platform and Integrated Toolsets (dubbed the Platform) and Site Applications. The primary objective of the HPC Simulator is to provide a flexible and extensible computational engine to simulate the coupled processes and flow scenarios described by the conceptual models developed using the ASCEM Platform. The graded and iterative approach to assessments

  18. Requirements for Logical Models for Value-Added Tax Legislation

    DEFF Research Database (Denmark)

    Nielsen, Morten Ib; Simonsen, Jakob Grue; Larsen, Ken Friis

    Enterprise resource planning (ERP) systems are ubiquitous in commercial enterprises of all sizes and invariably need to account for the notion of value-added tax (VAT). The legal and technical difficulties in handling VAT are exacerbated by spanning a broad and chaotic spectrum of intricate country ... of the Danish VAT law in Web Ontology Language (OWL) and in Configit Product Modeling Language (CPML).

  19. Single High Fidelity Geometric Data Sets for LCM - Model Requirements

    Science.gov (United States)

    2006-11-01

    triangles (.raw) to the native triangular facet file (.facet). The software vendors recommend the use of McNeel and Associates' Rhinoceros 3D for all...surface modeling and export. Rhinoceros has the capability and precision to create highly detailed 3D surface geometry suitable for radar cross section... white before ending up at blue as the temperature increases [27]. IR radiation was discovered in 1800 but its application is still limited in

  20. Performance Requirements Modeling and Assessment for Active Power Ancillary Services

    DEFF Research Database (Denmark)

    Bondy, Daniel Esteban Morales; Thavlov, Anders; Tougaard, Janus Bundsgaard Mosbæk

    2017-01-01

    system operation, a reliable service delivery is required, yet it may not be appropriate to apply conventional performance requirements to new technologies and methods. The service performance requirements and assessment methods therefore need to be generalized and standardized in order to include future ancillary service sources. This paper develops a modeling method for ancillary services performance requirements, including performance and verification indices. The use of the modeling method and the indices is exemplified in two case studies.

  1. "Open Access" Requires Clarification: Medical Journal Publication Models Evolve.

    Science.gov (United States)

    Lubowitz, James H; Brand, Jefferson C; Rossi, Michael J; Provencher, Matthew T

    2017-03-01

    While Arthroscopy journal is a traditional subscription model journal, our companion journal Arthroscopy Techniques is "open access." We used to believe open access simply meant online and free of charge. However, while open-access journals are free to readers, in 2017 authors must make a greater sacrifice in the form of an article-processing charge (APC). Again, while this does not apply to Arthroscopy, the APC will apply to Arthroscopy Techniques. Copyright © 2016 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  2. Formal Requirements Modeling with Executable Use Cases and Coloured Petri Nets

    OpenAIRE

    Jørgensen, Jens Bæk; Tjell, Simon; Fernandes, Joao Miguel

    2009-01-01

    This paper presents executable use cases (EUCs), which constitute a model-based approach to requirements engineering. EUCs may be used as a supplement to model-driven development (MDD) and can describe and link user-level requirements and more technical software specifications. In MDD, user-level requirements are not always explicitly described, since usually it is sufficient that one provides a specification, or platform-independent model, of the software that is to be developed. Th...

  3. Perceived distributed effort in team ball sports.

    Science.gov (United States)

    Beniscelli, Violeta; Tenenbaum, Gershon; Schinke, Robert Joel; Torregrosa, Miquel

    2014-01-01

    In this study, we explored the multifaceted concept of perceived mental and physical effort in team sport contexts where athletes must invest individual and shared efforts to reach a common goal. Semi-structured interviews were conducted with a convenience sample of 15 Catalan professional coaches (3 women and 12 men, 3 each from the following sports: volleyball, basketball, handball, soccer, and water polo) to gain their views of three perceived effort-related dimensions: physical, psychological, and tactical. From a theoretical thematic analysis, it was found that the perception of effort is closely related to how effort is distributed within the team. Moreover, coaches viewed physical effort in relation to the frequency and intensity of the players' involvement in the game. They identified psychological effort in situations where players pay attention to proper cues, and manage emotions under difficult circumstances. Tactical effort addressed the decision-making process of players and how they fulfilled their roles while taking into account the actions of their teammates and opponents. Based on these findings, a model of perceived distributed effort was developed, which delineates the elements that compose each of the aforementioned dimensions. Implications of perceived distributed effort in team coordination and shared mental models are discussed.

  4. Requirements for modeling airborne microbial contamination in space stations

    Science.gov (United States)

    Van Houdt, Rob; Kokkonen, Eero; Lehtimäki, Matti; Pasanen, Pertti; Leys, Natalie; Kulmala, Ilpo

    2018-03-01

    Exposure to bioaerosols is one of the facets that affect indoor air quality, especially for people living in densely populated or confined habitats, and is associated to a wide range of health effects. Good indoor air quality is thus vital and a prerequisite for fully confined environments such as space habitats. Bioaerosols and microbial contamination in these confined space stations can have significant health impacts, considering the unique prevailing conditions and constraints of such habitats. Therefore, biocontamination in space stations is strictly monitored and controlled to ensure crew and mission safety. However, efficient bioaerosol control measures rely on solid understanding and knowledge on how these bioaerosols are created and dispersed, and which factors affect the survivability of the associated microorganisms. Here we review the current knowledge gained from relevant studies in this wide and multidisciplinary area of bioaerosol dispersion modeling and biological indoor air quality control, specifically taking into account the specific space conditions.

  5. Voluntary versus Enforced Team Effort

    Directory of Open Access Journals (Sweden)

    Claudia Keser

    2011-08-01

    We present a model where each of two players chooses between remuneration based on either private or team effort. Although at least one of the players has the equilibrium strategy of choosing private remuneration, we frequently observe both players choosing team remuneration in a series of laboratory experiments. This allows for high cooperation payoffs but also provides individual free-riding incentives. Due to significant cooperation, we observe that, in team remuneration, participants make higher profits than in private remuneration. We also observe that, when participants are not given the option of private remuneration, they cooperate significantly less.

  6. Breckinridge Project, initial effort

    Energy Technology Data Exchange (ETDEWEB)

    None

    1982-01-01

    The project cogeneration plant supplies electric power, process steam and treated boiler feedwater for use by the project plants. The plant consists of multiple turbine generators and steam generators connected to a common main steam header. The major plant systems which are required to produce steam, electrical power and treated feedwater are discussed individually. The systems are: steam, steam generator, steam generator fuel, condensate and feedwater deaeration, condensate and blowdown collection, cooling water, boiler feedwater treatment, coal handling, ash handling (fly ash and bottom ash), electrical, and control system. The plant description is based on the Phase Zero design basis established for Plant 31 in July of 1980 and the steam/condensate balance as presented on Drawing 31-E-B-1. Updating of steam requirements as more refined process information becomes available has generated some changes in the steam balance. Boiler operation with these updated requirements is reflected on Drawing 31-D-B-1A. The major impact of updating has been that less 600 psig steam generated within the process units requires more extraction steam from the turbine generators to close the 600 psig steam balance. Since the 900 psig steam generation from the boilers was fixed at 1,200,000 lb/hr, the additional extraction steam required to close the 600 psig steam balance decreased the quantity of electrical power available from the turbine generators. In the next phase of engineering work, the production of 600 psig steam will be augmented by increasing convection bank steam generation in the Plant 3 fired heaters by 140,000 to 150,000 lb/hr. This modification will allow full rated power generation from the turbine generators.

  7. Resource requirements, impacts, and potential constraints associated with various energy futures. Annual report. [Energy Supply Planning Model

    Energy Technology Data Exchange (ETDEWEB)

    Gallagher, J.M.; Barany, R.; Paskert, P.F.; Zimmerman, R.G.J.

    1977-03-01

    This is the first annual report describing an effort by Bechtel Corporation to adapt and apply the Energy Supply Planning Model (ESPM) in support of the systems analysis activities of the United States Energy Research and Development Administration, Office of the Assistant Administrator for Planning, Analysis and Evaluation (ERDA/APAE). The primary emphasis of this program is the identification of resource requirements and of the impacts and potential constraints associated with various future energy options for this country. Accomplishments in 1976 include model application and analysis of energy scenarios derived from the 1976 update of the ERDA National Plan and the "1976 National Energy Outlook" of the Federal Energy Administration; analysis of the availability of engineers, manual manpower, and selected materials and equipment commodities; addition of aluminum, carbon steel, and alloy steel materials requirements to the model data base; the addition of several kinds of energy facilities to the modeling system; and refinement of various aspects of the model and data base. This program shows the need for a clear statement of national energy policy.

  8. On Early Conflict Identification by Requirements Modeling of Energy System Control Structures

    DEFF Research Database (Denmark)

    Heussen, Kai; Gehrke, Oliver; Niemann, Hans Henrik

    2015-01-01

    Control systems are purposeful systems involving goal-oriented information processing (cyber) and technical (physical) structures. Requirements modeling formalizes fundamental concepts and relations of a system architecture at a high-level design stage and can be used to identify potential design issues early. For requirements formulation of control structures, cyber and physical aspects need to be jointly represented to express interdependencies, check for consistency and discover potentially conflicting requirements. Early identification of potential conflicts may prevent larger problems ... modeling for early requirements checking using a suitable modeling language, and illustrates how this approach enables the identification of several classes of controller conflict.

  9. A semi-automated approach for generating natural language requirements documents based on business process models

    NARCIS (Netherlands)

    Aysolmaz, Banu; Leopold, Henrik; Reijers, Hajo A.; Demirörs, Onur

    2018-01-01

    Context: The analysis of requirements for business-related software systems is often supported by using business process models. However, the final requirements are typically still specified in natural language. This means that the knowledge captured in process models must be consistently
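
    The abstract stops short of describing the generation mechanism itself. Purely to illustrate the general idea of deriving natural-language requirement sentences from process-model elements (this is not the authors' actual approach, and all roles and activity labels below are invented), a template-based sketch could look like this:

        # Generate draft requirement sentences from hypothetical process-model activities.
        # Each activity carries a performing role and an action label.
        activities = [
            {"role": "Clerk",   "action": "register the incoming order"},
            {"role": "System",  "action": "validate the customer's credit limit"},
            {"role": "Manager", "action": "approve orders above the threshold"},
        ]

        def to_requirement(idx, role, action):
            # The "System" role gets a direct obligation; human roles get an enabling clause.
            if role.lower() == "system":
                return f"R{idx}: The system shall {action}."
            return f"R{idx}: The system shall enable the {role.lower()} to {action}."

        for i, activity in enumerate(activities, start=1):
            print(to_requirement(i, activity["role"], activity["action"]))
        # A real approach would also keep trace links from each sentence back to its
        # source model element so that text and model stay consistent.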

  10. Requirements-level semantics and model checking of object-oriented statecharts

    NARCIS (Netherlands)

    Eshuis, H.; Jansen, D.N.; Wieringa, Roelf J.

    2002-01-01

    In this paper we define a requirements-level execution semantics for object-oriented statecharts and show how properties of a system specified by these statecharts can be model checked using tool support for model checkers. Our execution semantics is requirements-level because it uses the perfect

  11. Swedish nuclear waste efforts

    Energy Technology Data Exchange (ETDEWEB)

    Rydberg, J.

    1981-09-01

    After the introduction of a law prohibiting the start-up of any new nuclear power plant until the utility had shown that the waste produced by the plant could be taken care of in an absolutely safe way, the Swedish nuclear utilities in December 1976 embarked on the Nuclear Fuel Safety Project, which in November 1977 presented a first report, Handling of Spent Nuclear Fuel and Final Storage of Vitrified Waste (KBS-I), and in November 1978 a second report, Handling and Final Storage of Unreprocessed Spent Nuclear Fuel (KBS II). These summary reports were supported by 120 technical reports prepared by 450 experts. The project engaged 70 private and governmental institutions at a total cost of US $15 million. The KBS-I and KBS-II reports are summarized in this document, as are also continued waste research efforts carried out by KBS, SKBF, PRAV, ASEA and other Swedish organizations. The KBS reports describe all steps (except reprocessing) in handling chain from removal from a reactor of spent fuel elements until their radioactive waste products are finally disposed of, in canisters, in an underground granite depository. The KBS concept relies on engineered multibarrier systems in combination with final storage in thoroughly investigated stable geologic formations. This report also briefly describes other activities carried out by the nuclear industry, namely, the construction of a central storage facility for spent fuel elements (to be in operation by 1985), a repository for reactor waste (to be in operation by 1988), and an intermediate storage facility for vitrified high-level waste (to be in operation by 1990). The R and D activities are updated to September 1981.

  12. Worldwide effort against smoking.

    Science.gov (United States)

    1986-07-01

    The 39th World Health Assembly, which met in May 1986, recognized the escalating health problem of smoking-related diseases and affirmed that tobacco smoking and its use in other forms are incompatible with the attainment of "Health for All by the Year 2000." If properly implemented, antismoking campaigns can decrease the prevalence of smoking. Nations as a whole must work toward changing smoking habits, and governments must support these efforts by officially stating their stand against smoking. Over 60 countries have introduced legislation affecting smoking. The variety of policies range from adopting a health education program designed to increase peoples' awareness of its dangers to increasing taxes to deter smoking by increasing tobacco prices. Each country must adopt an antismoking campaign which works most effectively within the cultural parameters of the society. Other smoking policies include: printed warnings on cigarette packages; health messages via radio, television, mobile teams, pamphlets, health workers, clinic walls, and newspapers; prohibition of smoking in public areas and transportation; prohibition of all advertisement of cigarettes and tobacco; and the establishment of upper limits of tar and nicotine content in cigarettes. The tobacco industry spends about $2000 million annually on worldwide advertising. According to the World Health Organization (WHO), controlling this overabundance of tobacco advertisements is a major priority in preventing the spread of smoking. Cigarette and tobacco advertising can be controlled to varying degrees, e.g., over a dozen countries have enacted a total ban on advertising on television or radio, a mandatory health warning must accompany advertisements in other countries, and tobacco companies often are prohibited from sponsoring sports events. Imposing a substantial tax on cigarettes is one of the most effective means to deter smoking. However, raising taxes and banning advertisements is not enough because

  13. Meta-Model and UML Profile for Requirements Management of Software and Embedded Systems

    Directory of Open Access Journals (Sweden)

    Arpinen Tero

    2011-01-01

    Software and embedded system companies today encounter problems related to requirements management tool integration, incorrect tool usage, and lack of traceability. This is due to tools having no clear meta-model and no semantics for communicating requirements between different stakeholders. This paper presents a comprehensive meta-model for requirements management. The focus is on the software and embedded system domains. The goal is to define generic requirements management domain concepts and abstract interfaces between requirements management and system development. This leads to a portable requirements management meta-model which can be adapted to various system modeling languages. The created meta-model is prototyped by translating it into a UML profile. The profile is imported into a UML tool which is used for rapid evaluation of the meta-model concepts in practice. The developed profile is associated with a proof-of-concept report generator tool that automatically produces up-to-date documentation from the models in the form of web pages. The profile is then used to create an example model of an embedded system requirements specification.

  14. Towards a Concerted Effort

    DEFF Research Database (Denmark)

    Johansen, Mette-Louise; Mouritsen, Tina; Montgomery, Edith

    2006-01-01

    This book contains a method model for the prevention of youth crime in Danish municipalities. The method model consists of instructions for conducting processual network meetings between traumatized refugee parents and the professional specialists working with their children on an intermunicipal basis. The book provides recommendations for organizing and implementing well-defined network meeting flows as well as methods for achieving systemic meeting management. The network-oriented approach emphasizes involvement of the parents, knowledge-sharing between specialist groups and dialogue and division of responsibilities between specialists and parents. The book is based on a method development project carried out in Karlebo municipality involving refugee families and welfare staff representatives in the municipality, the health system, and the police. The project was carried out with financial ...

  15. Breckinridge Project, initial effort

    Energy Technology Data Exchange (ETDEWEB)

    None

    1982-01-01

    Report IV, Volume 7 provides descriptions, data, and drawings pertaining to the Oxygen Plant (Plant 15) and Naphtha Hydrotreating and Reforming (Plant 18). The Oxygen Plant (Plant 15) utilizes low-pressure air separation to manufacture the oxygen required in Gasification and Purification (Plant 12). The Oxygen Plant also supplies nitrogen as needed by the H-COAL process. Naphtha Hydrotreating and Reforming (Plant 18) upgrades the raw H-COAL naphtha. The following information is provided for both plants described in this volume: a description of the plant's process design, including the utility balance, catalysts and chemicals usage, and process flow diagrams; an equipment list including item numbers and descriptions; data sheets and sketches for major plant components (Oxygen Plant only); and pertinent engineering drawings. An appendix contains: an overall site plan showing the locations of all plants; and the symbols and legend for the piping and instrument diagrams included in this volume.

  16. Programming effort analysis of the ELLPACK language

    Science.gov (United States)

    Rice, J. R.

    1978-01-01

    ELLPACK is a problem statement language and system for elliptic partial differential equations which is implemented by a FORTRAN preprocessor. ELLPACK's principal purpose is as a tool for the performance evaluation of software. However, it is used here as an example with which to study the programming effort required for problem solving. It is obvious that problem statement languages can reduce programming effort tremendously; the goal is to quantify this somewhat. This is done by analyzing the lengths and effort (as measured by Halstead's software science technique) of various approaches to solving these problems.
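
    As a reminder of the metric mentioned above, Halstead's software-science effort is derived from operator and operand counts; the sketch below uses the textbook formulation with invented counts, and is not specific to ELLPACK or to the study's actual measurements.

        import math

        def halstead_effort(n1, n2, N1, N2):
            """Halstead metrics from distinct/total operator and operand counts.

            n1, n2 -- number of distinct operators / distinct operands
            N1, N2 -- total occurrences of operators / operands
            """
            vocabulary = n1 + n2
            length = N1 + N2
            volume = length * math.log2(vocabulary)   # program "size" in bits
            difficulty = (n1 / 2.0) * (N2 / n2)       # proneness to error
            effort = difficulty * volume              # estimated mental effort
            return volume, difficulty, effort

        # Invented counts for two solutions of the same problem: a concise
        # problem-statement version vs. a longer hand-coded program.
        for label, counts in [("problem-statement version", (10, 15, 40, 55)),
                              ("hand-coded version",        (25, 60, 400, 520))]:
            V, D, E = halstead_effort(*counts)
            print(f"{label:28s} volume={V:8.1f} difficulty={D:6.1f} effort={E:10.1f}")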

  17. Breckinridge Project, initial effort

    Energy Technology Data Exchange (ETDEWEB)

    None

    1982-01-01

    Report IV, Volume 6 provides descriptions, data, and drawings pertaining to Gasification and Purification (Plant 12). Gasification and Purification (Plant 12) produces makeup hydrogen for H-COAL Preheating and Reaction (Plant 3), and produces a medium Btu fuel gas for consumption in fired heaters. The following information is included: a description of the plant's process design, including the utility balance, catalysts and chemicals usage, and a process flow diagram; an equipment list, including item numbers and descriptions; data sheets and sketches for major plant components; and pertinent engineering drawings. An appendix contains: an overall site plan showing the locations of all plants; and the symbols and legend for the piping and instrument diagrams included in this volume. Gasification and Purification (Plant 12) utilizes process technology from three licensors: gasification of vacuum bottoms using the Texaco process, shift conversion using the Haldor Topsoe process, and purification of fuel gas and shifted gas using the Allied Chemical Selexol process. This licensed technology is proprietary in nature. As a result, this volume does not contain full disclosure of these processes although a maximum of information has been presented consistent with the confidentiality requirements. Where data appears incomplete in this volume, it is due to the above described limitations. Full data concerning this plant are available for DOE review at the Houston offices of Bechtel Petroleum, Inc.

  18. Linking nitrogen deposition to nitrate concentrations in groundwater below nature areas : modelling approach and data requirements

    NARCIS (Netherlands)

    Bonten, L.T.C.; Mol-Dijkstra, J.P.; Wieggers, H.J.J.; Vries, de W.; Pul, van W.A.J.; Hoek, van den K.W.

    2009-01-01

    This study determines the most suitable model and required model improvements to link atmospheric deposition of nitrogen and other elements in the Netherlands to measurements of nitrogen and other elements in the upper groundwater. The deterministic model SMARTml was found to be the most suitable

  19. Dopamine, Effort-Based Choice, and Behavioral Economics: Basic and Translational Research.

    Science.gov (United States)

    Salamone, John D; Correa, Merce; Yang, Jen-Hau; Rotolo, Renee; Presby, Rose

    2018-01-01

    Operant behavior is not only regulated by factors related to the quality or quantity of reinforcement, but also by the work requirements inherent in performing instrumental actions. Moreover, organisms often make effort-related decisions involving economic choices such as cost/benefit analyses. Effort-based decision making is studied using behavioral procedures that offer choices between high-effort options leading to relatively preferred reinforcers vs. low effort/low reward choices. Several neural systems, including the mesolimbic dopamine (DA) system and other brain circuits, are involved in regulating effort-related aspects of motivation. Considerable evidence indicates that mesolimbic DA transmission exerts a bi-directional control over exertion of effort on instrumental behavior tasks. Interference with DA transmission produces a low-effort bias in animals tested on effort-based choice tasks, while increasing DA transmission with drugs such as DA transport blockers tends to enhance selection of high-effort options. The results from these pharmacology studies are corroborated by the findings from recent articles using optogenetic, chemogenetic and physiological techniques. In addition to providing important information about the neural regulation of motivated behavior, effort-based choice tasks are useful for developing animal models of some of the motivational symptoms that are seen in people with various psychiatric and neurological disorders (e.g., depression, schizophrenia, Parkinson's disease). Studies of effort-based decision making may ultimately contribute to the development of novel drug treatments for motivational dysfunction.

  20. Egg origin determination efforts

    International Nuclear Information System (INIS)

    Horvath, A.; Futo, I.; Vodila, G.; Palcsu, L.

    2012-01-01

    whites outside this interval, foreign origin can be assumed, since the isotope ratio of the drinking water samples covers natural waters characteristic for Hungary. As a conclusion, the same applies to eggs as well. The three foreign egg samples can be separated from the Hungarian ones based on their δ18O and δD values; however, differences in the shifts compared to their own drinking water samples mask the region-specific information of the drinking water. From the micro elemental composition of the egg shells it can be stated that the identification of the samples can be performed with a precision of 97.1%; therefore differences in the elemental composition are large enough to characterise the origin of the eggs by elemental analysis of the egg shell. From the market protection point of view it is recommended to compare the elemental composition data of the shell of a supposedly foreign egg with an information database established for each production plant in Hungary. As the information stamp of the mislabelled eggs shows the code of a Hungarian production plant, such an egg would be comparable with the real eggs originating from that plant. In this way, foreign eggs may be filtered out. Of course, to verify this method, further investigations are required.

  1. Developing a Predictive Model for Unscheduled Maintenance Requirements on United States Air Force Installations

    National Research Council Canada - National Science Library

    Kovich, Matthew D; Norton, J. D

    2008-01-01

    .... This paper develops one such method by using linear regression and time series analysis to develop a predictive model to forecast future year man-hour and funding requirements for unscheduled maintenance...
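
    As a toy illustration of the regression component only (the thesis data, model form, and installation specifics are not reproduced here, and the yearly man-hour figures below are invented), a linear trend fit and one-year-ahead forecast could be sketched as:

        # Fit a linear trend to hypothetical yearly unscheduled-maintenance man-hours
        # and forecast the next fiscal year.
        import numpy as np

        years = np.array([2003, 2004, 2005, 2006, 2007])
        man_hours = np.array([41000, 43500, 42800, 45200, 46900], dtype=float)

        slope, intercept = np.polyfit(years, man_hours, deg=1)   # least-squares line
        forecast_year = 2008
        forecast = slope * forecast_year + intercept

        print(f"trend: {slope:.0f} man-hours per year")
        print(f"forecast for FY{forecast_year}: {forecast:.0f} man-hours")
        # A fuller treatment would add time-series structure and a funding model.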

  2. Personal recommender systems for learners in lifelong learning: requirements, techniques and model

    NARCIS (Netherlands)

    Drachsler, Hendrik; Hummel, Hans; Koper, Rob

    2007-01-01

    Drachsler, H., Hummel, H. G. K., & Koper, R. (2008). Personal recommender systems for learners in lifelong learning: requirements, techniques and model. International Journal of Learning Technology, 3(4), 404-423.

  3. Towards a framework for improving goal-oriented requirement models quality

    OpenAIRE

    Cares, Carlos; Franch Gutiérrez, Javier

    2009-01-01

    Goal-orientation is a widespread and useful approach to Requirements Engineering. However, quality assessment frameworks focused on goal-oriented processes are either limited or remain on the theoretical side. Requirements quality initiatives range from simple metrics applicable to requirements documents, to general-purpose quality frameworks that include syntactic, semantic and pragmatic concerns. In some recent works, we have proposed a metrics framework for goal-oriented models, b...

  4. A manpower training requirements model for new weapons systems, with applications to the infantry fighting vehicle

    OpenAIRE

    Kenehan, Douglas J.

    1981-01-01

    Approved for public release; distribution is unlimited This thesis documents the methodology and parameters used in designing a manpower training requirements model for new weapons systems. This model provides manpower planners with the capability of testing alternative fielding policies and adjusting model parameters to improve the use of limited personnel resources. Use of the model is illustrated in a detailed analysis of the planned introduction of the Infantry Fighting Vehicle into t...

  5. Fuel requirements for experimental devices in MTR reactors. A perturbation model for reactor core analysis

    International Nuclear Information System (INIS)

    Beeckmans de West-Meerbeeck, A.

    1991-01-01

    Irradiation in neutron-absorbing devices, requiring high fast neutron fluxes in the core or high thermal fluxes in the reflector and flux traps, leads to higher density fuel and larger core dimensions. A perturbation model of the reactor core helps to estimate the fuel requirements. (orig.)

  6. Model-Based Requirements Analysis for Reactive Systems with UML Sequence Diagrams and Coloured Petri Nets

    DEFF Research Database (Denmark)

    Tjell, Simon; Lassen, Kristian Bisgaard

    2008-01-01

    In this paper, we describe a formal foundation for a specialized approach to automatically checking traces against real-time requirements. The traces are obtained from simulation of Coloured Petri Net (CPN) models of reactive systems. The real-time requirements are expressed in terms of a derivat...

  7. The Nuremberg Code subverts human health and safety by requiring animal modeling

    OpenAIRE

    Greek, Ray; Pippus, Annalea; Hansen, Lawrence A

    2012-01-01

    Background: The requirement that animals be used in research and testing in order to protect humans was formalized in the Nuremberg Code and subsequent national and international laws, codes, and declarations. Discussion: We review the history of these requirements and contrast what was known via science about animal models then with what is known now. We further analyze the predictive...

  8. State of the Art : Integrated Management of Requirements in Model-Based Software Engineering

    OpenAIRE

    Thörn, Christer

    2006-01-01

    This report describes the background and future of research concerning integrated management of requirements in model-based software engineering. The focus is on describing the relevant topics and existing theoretical backgrounds that form the basis for the research. The report describes the fundamental difficulties of requirements engineering for software projects, and proposes that the results and methods of models in software engineering can help leverage those problems. Taking inspiration...

  9. A Hybrid Parallel Execution Model for Logic Based Requirement Specifications (Invited Paper

    Directory of Open Access Journals (Sweden)

    Jeffrey J. P. Tsai

    1999-05-01

    It is well known that undiscovered errors in a requirements specification are extremely expensive to fix when discovered in the software maintenance phase. Errors in the requirements phase can be reduced through the validation and verification of the requirements specification. Many logic-based requirements specification languages have been developed to achieve these goals. However, the execution of and reasoning over a logic-based requirements specification can be very slow. An effective way to improve performance is to execute and reason over the logic-based requirements specification in parallel. In this paper, we present a hybrid model to facilitate the parallel execution of a logic-based requirements specification language. A data dependency analysis technique is first applied to the logic-based specification to find all the mode combinations that exist within a specification clause. This mode information is used to support a novel hybrid parallel execution model, which combines both top-down and bottom-up evaluation strategies. This new execution model can find the failure in the deepest node of the search tree at an early stage of the evaluation; it can thus reduce the total number of nodes searched in the tree, the total number of processes that need to be generated, and the total number of communication channels needed in the search process. A simulator has been implemented to analyze the execution behavior of the new model. Experiments show significant improvement based on several criteria.

  10. How do feelings influence effort? An empirical study of entrepreneurs' affect and venture effort.

    Science.gov (United States)

    Foo, Maw-Der; Uy, Marilyn A; Baron, Robert A

    2009-07-01

    How do feelings influence the effort of entrepreneurs? To obtain data on this issue, the authors implemented experience sampling methodology in which 46 entrepreneurs used cell phones to provide reports on their affect, future temporal focus, and venture effort twice daily for 24 days. Drawing on the affect-as-information theory, the study found that entrepreneurs' negative affect directly predicts entrepreneurs' effort toward tasks that are required immediately. Results were consistent for within-day and next-day time lags. Extending the theory, the study found that positive affect predicts venture effort beyond what is immediately required and that this relationship is mediated by future temporal focus. The mediating effects were significant only for next-day outcomes. Implications of findings on the nature of the affect-effort relationship for different time lags are discussed.

  11. Modeling the Impact of Simulated Educational Interventions on the Use and Abuse of Pharmaceutical Opioids in the United States: A Report on Initial Efforts

    Science.gov (United States)

    Wakeland, Wayne; Nielsen, Alexandra; Schmidt, Teresa D.; McCarty, Dennis; Webster, Lynn R.; Fitzgerald, John; Haddox, J. David

    2013-01-01

    Three educational interventions were simulated in a system dynamics model of the medical use, trafficking, and nonmedical use of pharmaceutical opioids. The study relied on secondary data obtained in the literature for the period of 1995 to 2008 as well as expert panel recommendations regarding model parameters and structure. The behavior of the…

  12. Estimating required information size by quantifying diversity in random-effects model meta-analyses

    DEFF Research Database (Denmark)

    Wetterslev, Jørn; Thorlund, Kristian; Brok, Jesper

    2009-01-01

    BACKGROUND: There is increasing awareness that meta-analyses require a sufficiently large information size to detect or reject an anticipated intervention effect. The required information size in a meta-analysis may be calculated from an anticipated a priori intervention effect or from an intervention effect suggested by trials with low risk of bias. METHODS: Information size calculations need to consider the total model variance in a meta-analysis to control type I and type II errors. Here, we derive an adjusting factor for the required information size under any random-effects model meta ... -trial variability and a sampling error estimate considering the required information size. D2 is different from the intuitively obvious adjusting factor based on the common quantification of heterogeneity, the inconsistency (I2), which may underestimate the required information size. Thus, D2 and I2 are compared...
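
    Reading the abstract at face value, diversity appears to be the relative variance reduction when moving from the random-effects to the fixed-effect pooled estimate, with the required information size scaled up accordingly. The sketch below encodes only that reading, with invented variances; the exact definitions and any additional terms should be taken from the cited paper itself.

        # Hedged sketch of a diversity-based adjustment of the required information size.
        # Assumed reading: D2 = (V_random - V_fixed) / V_random, and the heterogeneity-
        # adjusted information size is IS / (1 - D2). All numbers below are invented.

        def diversity_d2(var_random, var_fixed):
            """Relative variance reduction going from random-effects to fixed-effect."""
            return (var_random - var_fixed) / var_random

        def adjusted_information_size(is_unadjusted, d2):
            """Scale the unadjusted required information size by 1 / (1 - D2)."""
            return is_unadjusted / (1.0 - d2)

        var_random = 0.020   # variance of pooled effect, random-effects model (made up)
        var_fixed = 0.008    # variance of pooled effect, fixed-effect model (made up)
        is_fixed = 2000      # required information size ignoring heterogeneity (made up)

        d2 = diversity_d2(var_random, var_fixed)
        print(f"D2 = {d2:.2f}")                                                  # 0.60 here
        print(f"adjusted information size = {adjusted_information_size(is_fixed, d2):.0f}")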

  13. A Functional Model of Quality Assurance for Psychiatric Hospitals and Corresponding Staffing Requirements.

    Science.gov (United States)

    Kamis-Gould, Edna; And Others

    1991-01-01

    A model for quality assurance (QA) in psychiatric hospitals is described. Its functions (general QA, utilization review, clinical records, evaluation, management information systems, risk management, and infection control), subfunctions, and corresponding staffing requirements are reviewed. This model was designed to foster standardization in QA…

  14. Capital Regulation, Liquidity Requirements and Taxation in a Dynamic Model of Banking

    NARCIS (Netherlands)

    Di Nicolo, G.; Gamba, A.; Lucchetta, M.

    2011-01-01

    This paper formulates a dynamic model of a bank exposed to both credit and liquidity risk, which can resolve financial distress in three costly forms: fire sales, bond issuance and equity issuance. We use the model to analyze the impact of capital regulation, liquidity requirements and taxation on

  15. Capital Regulation, Liquidity Requirements and Taxation in a Dynamic Model of Banking

    NARCIS (Netherlands)

    Di Nicolo, G.; Gamba, A.; Lucchetta, M.

    2011-01-01

    This paper formulates a dynamic model of a bank exposed to both credit and liquidity risk, which can resolve financial distress in three costly forms: fire sales, bond issuance and equity issuance. We use the model to analyze the impact of capital regulation, liquidity requirements and taxation on

  16. Knowledge Based Characterization of Cross-Models Constraints to Check Design and Modeling Requirements

    Science.gov (United States)

    Simonn Zayas, David; Monceaux, Anne; Ait-Ameur, Yamine

    2011-08-01

    Nowadays, the complexity of systems frequently implies that different engineering teams handle various descriptive models. With each team having a variety of expertise backgrounds, domain knowledge and modeling practices, the heterogeneity of the models themselves is a logical consequence. Therefore, even if the models are individually well managed, their diversity becomes a problem when engineers need to share their models to perform overall validations. One way of reducing this heterogeneity is to take into consideration the implicit knowledge which is not contained in the models but is essential to understanding them. In a first stage of our research, we defined and implemented an approach recommending the formalization of implicit knowledge to enrich models in order to ease cross-model checks. Nevertheless, to fill the gap between the specification of the system and the validation of a cross-model constraint, in this paper we suggest giving values to some relevant characteristics to reinforce the approach.

  17. School Trips: Are They Worth the Effort?

    Science.gov (United States)

    Johnston, Robert

    2015-01-01

    Even the most basic of school trips will require booking places, arranging transport, writing to parents, collecting payments, planning activities, producing worksheets and, of course, endless risk assessments. It always leaves teachers wondering: "is it really worth all this effort?" Robert Johnston believes that every teacher should…

  18. Input data requirements for performance modelling and monitoring of photovoltaic plants

    DEFF Research Database (Denmark)

    Gavriluta, Anamaria Florina; Spataru, Sergiu; Sera, Dezso

    2018-01-01

    This work investigates the input data requirements in the context of performance modeling of thin-film photovoltaic (PV) systems. The analysis focuses on the PVWatts performance model, well suited for on-line performance monitoring of PV strings due to its low number of parameters and high accuracy. The work aims to identify the minimum amount of input data required for parameterizing an accurate model of the PV plant. The analysis was carried out for both amorphous silicon (a-Si) and cadmium telluride (CdTe), using crystalline silicon (c-Si) as a base for comparison. In the studied cases...
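
    For orientation, a PVWatts-style DC power estimate needs little more than plane-of-array irradiance, cell temperature, a nameplate rating, and a temperature coefficient. The minimal sketch below uses typical placeholder values rather than the parameters identified in the study.

        # Minimal PVWatts-like DC power estimate for one time step.
        # P_dc = (G_poa / G_ref) * P_dc0 * (1 + gamma * (T_cell - T_ref))

        G_REF = 1000.0    # reference irradiance, W/m^2
        T_REF = 25.0      # reference cell temperature, deg C

        def pvwatts_dc(g_poa, t_cell, p_dc0, gamma):
            """DC power from plane-of-array irradiance and cell temperature.

            g_poa  -- plane-of-array irradiance, W/m^2
            t_cell -- cell temperature, deg C
            p_dc0  -- nameplate DC rating at reference conditions, W
            gamma  -- power temperature coefficient, 1/deg C (negative for most modules)
            """
            return (g_poa / G_REF) * p_dc0 * (1.0 + gamma * (t_cell - T_REF))

        # Placeholder string: 3 kW nameplate, thin-film-like temperature coefficient.
        print(pvwatts_dc(g_poa=850.0, t_cell=42.0, p_dc0=3000.0, gamma=-0.0025))
        # Comparing such model output with measured power is the basis for on-line
        # performance monitoring of a string.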

  19. The Nuremberg Code subverts human health and safety by requiring animal modeling

    Directory of Open Access Journals (Sweden)

    Greek Ray

    2012-07-01

    Background: The requirement that animals be used in research and testing in order to protect humans was formalized in the Nuremberg Code and subsequent national and international laws, codes, and declarations. Discussion: We review the history of these requirements and contrast what was known via science about animal models then with what is known now. We further analyze the predictive value of animal models when used as test subjects for human response to drugs and disease. We explore the use of animals for models in toxicity testing as an example of the problem with using animal models. Summary: We conclude that the requirements for animal testing found in the Nuremberg Code were based on scientifically outdated principles, compromised by people with a vested interest in animal experimentation, serve no useful function, increase the cost of drug development, and prevent otherwise safe and efficacious drugs and therapies from being implemented.

  20. The Nuremberg Code subverts human health and safety by requiring animal modeling.

    Science.gov (United States)

    Greek, Ray; Pippus, Annalea; Hansen, Lawrence A

    2012-07-08

    The requirement that animals be used in research and testing in order to protect humans was formalized in the Nuremberg Code and subsequent national and international laws, codes, and declarations. We review the history of these requirements and contrast what was known via science about animal models then with what is known now. We further analyze the predictive value of animal models when used as test subjects for human response to drugs and disease. We explore the use of animals for models in toxicity testing as an example of the problem with using animal models. We conclude that the requirements for animal testing found in the Nuremberg Code were based on scientifically outdated principles, compromised by people with a vested interest in animal experimentation, serve no useful function, increase the cost of drug development, and prevent otherwise safe and efficacious drugs and therapies from being implemented.

  1. Workshop on Functional Requirements for the Modeling of Fate and Transport of Waterborne CBRN Materials

    Energy Technology Data Exchange (ETDEWEB)

    Giles, GE

    2005-02-03

    The purpose of this Workshop on "Functional Requirements for the Modeling of Fate and Transport of Waterborne CBRN Materials" was to solicit functional requirements for tools that help Incident Managers plan for and deal with the consequences of industrial or terrorist releases of materials into the nation's waterways and public water utilities. Twenty representatives attended and several made presentations. Several hours of discussions elicited a set of requirements. These requirements were summarized in a form for the attendees to vote on their highest priority requirements. These votes were used to determine the prioritized requirements that are reported in this paper and can be used to direct future developments.

  2. Model-Based Requirements Management in Gear Systems Design Based On Graph-Based Design Languages

    Directory of Open Access Journals (Sweden)

    Kevin Holder

    2017-10-01

    For several decades, a wide-spread consensus concerning the enormous importance of an in-depth clarification of the specifications of a product has been observed. A weak clarification of specifications is repeatedly listed as a main cause for the failure of product development projects. Requirements, which can be defined as the purpose, goals, constraints, and criteria associated with a product development project, play a central role in the clarification of specifications. The collection of activities which ensure that requirements are identified, documented, maintained, communicated, and traced throughout the life cycle of a system, product, or service can be referred to as "requirements engineering". These activities can be supported by a collection and combination of strategies, methods, and tools which are appropriate for the clarification of specifications. Numerous publications describe the strategy and the components of requirements management. Furthermore, recent research investigates its industrial application. Simultaneously, promising developments of graph-based design languages for a holistic digital representation of the product life cycle are presented. Current developments realize graph-based languages by means of the diagrams of the Unified Modelling Language (UML), and allow the automatic generation and evaluation of multiple product variants. The research presented in this paper seeks to combine the advantages of a conscious requirements management process and graph-based design languages. Consequently, the main objective of this paper is the investigation of a model-based integration of requirements in a product development process by means of graph-based design languages. The research method is based on an in-depth analysis of an exemplary industrial product development, a gear system for so-called "Electrical Multiple Units" (EMU). Important requirements were abstracted from a gear system

  3. Update on the Status of the On-Going Range Dependent Low Frequency Active Sonar Model Benchmarking Effort : From Cambridge to Kos [abstract

    NARCIS (Netherlands)

    Zampolli, M.; Ainslie, M.A.

    2011-01-01

    In April 2010, a symposium in Memory of David Weston was held at Clare College in Cambridge (UK). International researchers from academia and research laboratories met to discuss two sets of test problems for sonar performance models, one aimed at understanding mammal echolocation sonar („Problem

  4. Creating the finite element models of car seats with passive head restraints to meet the requirements of passive safety

    Directory of Open Access Journals (Sweden)

    D. Yu. Solopov

    2014-01-01

Full Text Available Designing car seats with modern finite-element-based software (CAE) complexes can significantly increase the efficiency of the design process. The design process is complicated by the fact that, at present, there are no available techniques focused on this sort of task. This article shows the features of creating finite element models (FEM) of car seats at three levels of complexity. It assesses the passive safety ensured by the developed seat models with passive head restraints according to the requirements of UNECE Regulation No. 25, and the accuracy of the calculation results compared with those of full-scale experiments. This work is part of a developed technique which allows effective development of car seat designs with both passive and active head restraints, meeting the requirements of passive safety. The calculations and experiments established that, when assessed by the UNECE Regulation No. 25 technique, the "rough" FEMs (the 1st and 2nd levels) can be considered rational, in terms of both the effort needed for their creation and solution and the errors of the results, and it is expedient to use them for preliminary and multiple calculations. Detailed models (the 3rd level) provide the greatest accuracy (the relative error is 10% for accelerations and 11% for displacements), while, in comparison with calculations, the relative error for the head-restraint model decreases by only 5% for accelerations and 9% for displacements. The materials presented in the article are used both in research activities and in training students at the Chair of Wheel Vehicles of the Scientific and Educational Complex "Special Mechanical Engineering" of Bauman Moscow State Technical University.

  5. Advanced materials characterization and modeling using synchrotron, neutron, TEM, and novel micro-mechanical techniques - A European effort to accelerate fusion materials development

    DEFF Research Database (Denmark)

    Linsmeier, Ch.; Fu, C.-C.; Kaprolat, A.

    2013-01-01

For the realization of fusion as an energy source, the development of suitable materials is one of the most critical issues. The required material properties are in many aspects unique compared to the existing solutions, particularly the need for necessary resistance to irradiation with neutrons having energies up to 14 MeV. In addition to withstanding the effects of neutrons, the mechanical stability of structural materials has to be maintained up to high temperatures. Plasma-exposed materials must be compatible with the fusion plasma, both with regard to the generation of impurities injected ... as testing under neutron flux-induced conditions. For the realization of a DEMO power plant, the materials solutions must be available in time. The European initiative FEMaS-CA – Fusion Energy Materials Science – Coordination Action – aims at accelerating materials development by integrating advanced ...

  6. Learning Environment and Student Effort

    Science.gov (United States)

    Hopland, Arnt O.; Nyhus, Ole Henning

    2016-01-01

    Purpose: The purpose of this paper is to explore the relationship between satisfaction with learning environment and student effort, both in class and with homework assignments. Design/methodology/approach: The authors use data from a nationwide and compulsory survey to analyze the relationship between learning environment and student effort. The…

  7. Multidisciplinary Efforts Driving Translational Theranostics

    Science.gov (United States)

    Hu, Tony Y.

    2014-01-01

    This themed issue summarizes significant efforts aimed at using “biological language” to discern between “friends” and “foes” in the context of theranostics for true clinical application. It is expected that the success of theranostics depends on multidisciplinary efforts, combined to expedite our understanding of host responses to “customized” theranostic agents and formulating individualized therapies. PMID:25285169

  8. Genetic programming as alternative for predicting development effort of individual software projects.

    Directory of Open Access Journals (Sweden)

    Arturo Chavoya

    Full Text Available Statistical and genetic programming techniques have been used to predict the software development effort of large software projects. In this paper, a genetic programming model was used for predicting the effort required in individually developed projects. Accuracy obtained from a genetic programming model was compared against one generated from the application of a statistical regression model. A sample of 219 projects developed by 71 practitioners was used for generating the two models, whereas another sample of 130 projects developed by 38 practitioners was used for validating them. The models used two kinds of lines of code as well as programming language experience as independent variables. Accuracy results from the model obtained with genetic programming suggest that it could be used to predict the software development effort of individual projects when these projects have been developed in a disciplined manner within a development-controlled environment.
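    To make the comparison concrete, the sketch below fits the kind of statistical-regression baseline described above on invented project data (new and modified lines of code plus language experience) and scores it with the MMRE and PRED(25) accuracy criteria commonly used in effort-estimation studies. All variable names and numbers are illustrative assumptions, not the authors' dataset, and the genetic-programming side of the comparison is not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic project data: new and modified lines of code plus programming-language
# experience (years) -- stand-ins for the independent variables named in the abstract.
n = 200
new_loc = rng.lognormal(mean=5.0, sigma=0.6, size=n)
mod_loc = rng.lognormal(mean=4.0, sigma=0.6, size=n)
experience = rng.integers(1, 15, size=n)
effort = 0.05 * new_loc + 0.03 * mod_loc - 2.0 * experience + 60 \
         + rng.normal(0, 10, size=n)          # effort in person-hours (invented)

X = np.column_stack([new_loc, mod_loc, experience])
split = 140                                    # train / validation split
reg = LinearRegression().fit(X[:split], effort[:split])
pred = reg.predict(X[split:])
actual = effort[split:]

# Accuracy criteria often reported when comparing effort-prediction models.
mre = np.abs(actual - pred) / np.abs(actual)   # magnitude of relative error
print(f"MMRE     = {mre.mean():.3f}")
print(f"PRED(25) = {(mre <= 0.25).mean():.3f}")  # fraction predicted within 25%
```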

  9. Requirements for psychological models to support design: Towards ecological task analysis

    Science.gov (United States)

    Kirlik, Alex

    1991-01-01

    Cognitive engineering is largely concerned with creating environmental designs to support skillful and effective human activity. A set of necessary conditions are proposed for psychological models capable of supporting this enterprise. An analysis of the psychological nature of the design product is used to identify a set of constraints that models must meet if they can usefully guide design. It is concluded that cognitive engineering requires models with resources for describing the integrated human-environment system, and that these models must be capable of describing the activities underlying fluent and effective interaction. These features are required in order to be able to predict the cognitive activity that will be required given various design concepts, and to design systems that promote the acquisition of fluent, skilled behavior. These necessary conditions suggest that an ecological approach can provide valuable resources for psychological modeling to support design. Relying heavily on concepts from Brunswik's and Gibson's ecological theories, ecological task analysis is proposed as a framework in which to predict the types of cognitive activity required to achieve productive behavior, and to suggest how interfaces can be manipulated to alleviate certain types of cognitive demands. The framework is described in terms, and illustrated with an example from the previous research on modeling skilled human-environment interaction.

  10. Effort to Accelerate MBSE Adoption and Usage at JSC

    Science.gov (United States)

    Wang, Lui; Izygon, Michel; Okron, Shira; Garner, Larry; Wagner, Howard

    2016-01-01

This paper describes the authors' experience in adopting Model Based System Engineering (MBSE) at the NASA/Johnson Space Center (JSC). Since 2009, NASA/JSC has been applying MBSE using the Systems Modeling Language (SysML) to a number of advanced projects. Models integrate views of the system from multiple perspectives, capturing the system design information for multiple stakeholders. This method has allowed engineers to better control changes, improve traceability from requirements to design and manage the numerous interactions between components. As the project progresses, the models become the official source of information and are used by multiple stakeholders. Three major types of challenges that hamper the adoption of the MBSE technology are described. These challenges are addressed by a multipronged approach that includes educating the main stakeholders, implementing an organizational infrastructure that supports the adoption effort, defining a set of modeling guidelines to help engineers in their modeling effort, providing a toolset that supports the generation of valuable products, and providing a library of reusable models. JSC project case studies are presented to illustrate how the proposed approach has been successfully applied.

  11. Respiratory effort from the photoplethysmogram.

    Science.gov (United States)

    Addison, Paul S

    2017-03-01

The potential for a simple, non-invasive measure of respiratory effort based on the pulse oximeter signal - the photoplethysmogram or 'pleth' - was investigated in a pilot study. Several parameters were developed based on a variety of manifestations of respiratory effort in the signal, including modulation changes in amplitude, baseline, frequency and pulse transit times, as well as distinct baseline signal shifts. Thirteen candidate parameters were investigated using data from healthy volunteers. Each volunteer underwent a series of controlled respiratory effort maneuvers at various set flow resistances and respiratory rates. Six oximeter probes were tested at various body sites. In all, over three thousand pleth-based effort-airway pressure (EP) curves were generated across the various airway constrictions, respiratory efforts, respiratory rates, subjects, probe sites, and the candidate parameters considered. Regression analysis was performed to determine the existence of positive monotonic relationships between the respiratory effort parameters and resulting airway pressures. Six of the candidate parameters investigated exhibited a distinct positive relationship (p ...); these included one derived using a pulse oximeter probe and an ECG (P2E-Effort), another using two pulse oximeter probes placed at different peripheral body sites (P2-Effort), and baseline shifts in heart rate (BL-HR-Effort). In conclusion, a clear monotonic relationship was found between several pleth-based parameters and imposed respiratory loadings at the mouth across a range of respiratory rates and flow constrictions. The results suggest that the pleth may provide a measure of changing upper airway dynamics indicative of the effort to breathe. Copyright © 2017 The Author. Published by Elsevier Ltd. All rights reserved.

  12. Definition of common support equipment and space station interface requirements for IOC model technology experiments

    Science.gov (United States)

    Russell, Richard A.; Waiss, Richard D.

    1988-01-01

    A study was conducted to identify the common support equipment and Space Station interface requirements for the IOC (initial operating capabilities) model technology experiments. In particular, each principal investigator for the proposed model technology experiment was contacted and visited for technical understanding and support for the generation of the detailed technical backup data required for completion of this study. Based on the data generated, a strong case can be made for a dedicated technology experiment command and control work station consisting of a command keyboard, cathode ray tube, data processing and storage, and an alert/annunciator panel located in the pressurized laboratory.

  13. Fractured rock modeling in the National Waste Terminal Storage Program: a review of requirements and status

    Energy Technology Data Exchange (ETDEWEB)

    St. John, C.; Krug, A.; Key, S.; Monsees, J.

    1983-05-01

    Generalized computer codes capable of forming the basis for numerical models of fractured rock masses are being used within the NWTS program. Little additional development of these codes is considered justifiable, except in the area of representation of discrete fractures. On the other hand, model preparation requires definition of medium-specific constitutive descriptions and site characteristics and is therefore legitimately conducted by each of the media-oriented projects within the National Waste Terminal Storage program. However, it is essential that a uniform approach to the role of numerical modeling be adopted, including agreement upon the contribution of modeling to the design and licensing process and the need for, and means of, model qualification for particular purposes. This report discusses the role of numerical modeling, reviews the capabilities of several computer codes that are being used to support design or performance assessment, and proposes a framework for future numerical modeling activities within the NWTS program.

  14. Fractured rock modeling in the National Waste Terminal Storage Program: a review of requirements and status

    International Nuclear Information System (INIS)

    St John, C.; Krug, A.; Key, S.; Monsees, J.

    1983-05-01

    Generalized computer codes capable of forming the basis for numerical models of fractured rock masses are being used within the NWTS program. Little additional development of these codes is considered justifiable, except in the area of representation of discrete fractures. On the other hand, model preparation requires definition of medium-specific constitutive descriptions and site characteristics and is therefore legitimately conducted by each of the media-oriented projects within the National Waste Terminal Storage program. However, it is essential that a uniform approach to the role of numerical modeling be adopted, including agreement upon the contribution of modeling to the design and licensing process and the need for, and means of, model qualification for particular purposes. This report discusses the role of numerical modeling, reviews the capabilities of several computer codes that are being used to support design or performance assessment, and proposes a framework for future numerical modeling activities within the NWTS program

  15. From Effort to Value: Preschool Children's Alternative to Effort Justification.

    Science.gov (United States)

    Benozio, Avi; Diesendruck, Gil

    2015-09-01

    In the current studies, we addressed the development of effort-based object valuation. Four- and 6-year-olds invested either great or little effort in order to obtain attractive or unattractive rewards. Children were allowed to allocate these rewards to an unfamiliar recipient (dictator game). Investing great effort to obtain attractive rewards (a consonant situation) led 6-year-olds, but not 4-year-olds, to enhance the value of the rewards and thus distribute fewer of them to others. After investing effort to attain unattractive rewards (a dissonant situation), 6-year-olds cognitively reduced the dissonance between effort and reward quality by reappraising the value of the rewards and thus distributing fewer of them. In contrast, 4-year-olds reduced the dissonance behaviorally by discarding the rewards. These findings provide evidence for the emergence of an effort-value link and underline possible mechanisms underlying the primacy of cognitive versus behavioral solutions to dissonance reduction. © The Author(s) 2015.

  16. Modeling and Simulation for Exploring Human-Robot Team Interaction Requirements

    Energy Technology Data Exchange (ETDEWEB)

    Dudenhoeffer, Donald Dean; Bruemmer, David Jonathon; Davis, Midge Lee

    2001-12-01

Small-sized and micro-robots will soon be available for deployment in large-scale forces. Consequently, the ability of a human operator to coordinate and interact with large-scale robotic forces is of great interest. This paper describes the ways in which modeling and simulation have been used to explore new possibilities for human-robot interaction. The paper also discusses how these explorations have fed implementation of a unified set of command and control concepts for robotic force deployment. Modeling and simulation can play a major role in fielding robot teams in actual missions. While live testing is preferred, limitations in terms of technology, cost, and time often prohibit extensive experimentation with physical multi-robot systems. Simulation provides insight, focuses efforts, eliminates large areas of the possible solution space, and increases the quality of actual testing.

  17. Conclusions of the workshop on the ATLAS requirements on shower models

    CERN Document Server

    Bosman, M; Efthymiopoulos, I; Froidevaux, D; Gianotti, F; Kiryunin, A E; Knobloch, J; Loch, P; Osculati, B; Perini, L; Sala, P R; Seman, M

    1999-01-01

The workshop addressed the question of shower models/packages and related issues needed for the simulation of ATLAS physics and test beam data. Part of the material discussed during the workshop is reviewed in this note. Results presented on the comparison between ATLAS test beam data and Monte Carlo prediction of various shower models are briefly summarized. The requirements put forward by the various detector communities and first attempts to quantify them are being reviewed.

  18. System Design Description and Requirements for Modeling the Off-Gas Systems for Fuel Recycling Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Daryl R. Haefner; Jack D. Law; Troy J. Tranter

    2010-08-01

    This document provides descriptions of the off-gases evolved during spent nuclear fuel processing and the systems used to capture the gases of concern. Two reprocessing techniques are discussed, namely aqueous separations and electrochemical (pyrochemical) processing. The unit operations associated with each process are described in enough detail so that computer models to mimic their behavior can be developed. The document also lists the general requirements for the desired computer models.

  19. The dynamic system of parental work of care for children with special health care needs: a conceptual model to guide quality improvement efforts.

    Science.gov (United States)

    Hexem, Kari R; Bosk, Abigail M; Feudtner, Chris

    2011-10-25

    The work of care for parents of children with complex special health care needs may be increasing, while excessive work demands may erode the quality of care. We sought to summarize knowledge and develop a general conceptual model of the work of care. Systematic review of peer-reviewed journal articles that focused on parents of children with special health care needs and addressed factors related to the physical and emotional work of providing care for these children. From the large pool of eligible articles, we selected articles in a randomized sequence, using qualitative techniques to identify the conceptual components of the work of care and their relationship to the family system. The work of care for a child with special health care needs occurs within a dynamic system that comprises 5 core components: (1) performance of tasks such as monitoring symptoms or administering treatments, (2) the occurrence of various events and the pursuit of valued outcomes regarding the child's physical health, the parent's mental health, or other attributes of the child or family, (3) operating with available resources and within certain constraints (4) over the passage of time, (5) while mentally representing or depicting the ever-changing situation and detecting possible problems and opportunities. These components interact, some with simple cause-effect relationships and others with more complex interdependencies. The work of care affecting the health of children with special health care needs and their families can best be understood, studied, and managed as a multilevel complex system.

  20. The pharmacology of effort-related choice behavior: Dopamine, depression, and individual differences.

    Science.gov (United States)

    Salamone, John D; Correa, Merce; Yohn, Samantha; Lopez Cruz, Laura; San Miguel, Noemi; Alatorre, Luisa

    2016-06-01

    This review paper is focused upon the involvement of mesolimbic dopamine (DA) and related brain systems in effort-based processes. Interference with DA transmission affects instrumental behavior in a manner that interacts with the response requirements of the task, such that rats with impaired DA transmission show a heightened sensitivity to ratio requirements. Impaired DA transmission also affects effort-related choice behavior, which is assessed by tasks that offer a choice between a preferred reinforcer that has a high work requirement vs. less preferred reinforcer that can be obtained with minimal effort. Rats and mice with impaired DA transmission reallocate instrumental behavior away from food-reinforced tasks with high response costs, and show increased selection of low reinforcement/low cost options. Tests of effort-related choice have been developed into models of pathological symptoms of motivation that are seen in disorders such as depression and schizophrenia. These models are being employed to explore the effects of conditions associated with various psychopathologies, and to assess drugs for their potential utility as treatments for effort-related symptoms. Studies of the pharmacology of effort-based choice may contribute to the development of treatments for symptoms such as psychomotor slowing, fatigue or anergia, which are seen in depression and other disorders. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. The dynamic system of parental work of care for children with special health care needs: A conceptual model to guide quality improvement efforts

    Directory of Open Access Journals (Sweden)

    Hexem Kari R

    2011-10-01

Full Text Available Abstract Background The work of care for parents of children with complex special health care needs may be increasing, while excessive work demands may erode the quality of care. We sought to summarize knowledge and develop a general conceptual model of the work of care. Methods Systematic review of peer-reviewed journal articles that focused on parents of children with special health care needs and addressed factors related to the physical and emotional work of providing care for these children. From the large pool of eligible articles, we selected articles in a randomized sequence, using qualitative techniques to identify the conceptual components of the work of care and their relationship to the family system. Results The work of care for a child with special health care needs occurs within a dynamic system that comprises 5 core components: (1) performance of tasks such as monitoring symptoms or administering treatments, (2) the occurrence of various events and the pursuit of valued outcomes regarding the child's physical health, the parent's mental health, or other attributes of the child or family, (3) operating with available resources and within certain constraints, (4) over the passage of time, (5) while mentally representing or depicting the ever-changing situation and detecting possible problems and opportunities. These components interact, some with simple cause-effect relationships and others with more complex interdependencies. Conclusions The work of care affecting the health of children with special health care needs and their families can best be understood, studied, and managed as a multilevel complex system.

  2. Reproductive effort decreases antibody responsiveness

    NARCIS (Netherlands)

    Deerenberg, Charlotte; Arpanius, Victor; Daan, Serge; Bos, Nicolaas

    1997-01-01

    The prevalence and intensity of parasitic infection often increases in animals when they are reproducing. This may be a consequence of increased rates of parasite transmission due to reproductive effort. Alternatively, endocrine changes associated with reproduction can lead to immunosuppression.

  3. EA Shuttle Document Retention Effort

    Science.gov (United States)

    Wagner, Howard A.

    2010-01-01

    This slide presentation reviews the effort of code EA at Johnson Space Center (JSC) to identify and acquire databases and documents from the space shuttle program that are adjudged important for retention after the retirement of the space shuttle.

  4. 3D Core Model for simulation of nuclear power plants: Simulation requirements, model features, and validation

    International Nuclear Information System (INIS)

    Zerbino, H.

    1999-01-01

In 1994-1996, Thomson Training and Simulation (TT and S) carried out the D50 Project, which involved the design and construction of optimized replica simulators for one Dutch and three German Nuclear Power Plants. It was recognized early on that the faithful reproduction of the Siemens reactor control and protection systems would impose extremely stringent demands on the simulation models, particularly the Core physics and the RCS thermohydraulics. The quality of the models, and their thorough validation, were thus essential. The present paper describes the main features of the fully 3D Core model implemented by TT and S, and its extensive validation campaign, which was defined in extremely positive collaboration with the Customer and the Core Data suppliers. (author)

  5. Quantification of Dynamic Model Validation Metrics Using Uncertainty Propagation from Requirements

    Science.gov (United States)

    Brown, Andrew M.; Peck, Jeffrey A.; Stewart, Eric C.

    2018-01-01

    The Space Launch System, NASA's new large launch vehicle for long range space exploration, is presently in the final design and construction phases, with the first launch scheduled for 2019. A dynamic model of the system has been created and is critical for calculation of interface loads and natural frequencies and mode shapes for guidance, navigation, and control (GNC). Because of the program and schedule constraints, a single modal test of the SLS will be performed while bolted down to the Mobile Launch Pad just before the first launch. A Monte Carlo and optimization scheme will be performed to create thousands of possible models based on given dispersions in model properties and to determine which model best fits the natural frequencies and mode shapes from modal test. However, the question still remains as to whether this model is acceptable for the loads and GNC requirements. An uncertainty propagation and quantification (UP and UQ) technique to develop a quantitative set of validation metrics that is based on the flight requirements has therefore been developed and is discussed in this paper. There has been considerable research on UQ and UP and validation in the literature, but very little on propagating the uncertainties from requirements, so most validation metrics are "rules-of-thumb;" this research seeks to come up with more reason-based metrics. One of the main assumptions used to achieve this task is that the uncertainty in the modeling of the fixed boundary condition is accurate, so therefore that same uncertainty can be used in propagating the fixed-test configuration to the free-free actual configuration. The second main technique applied here is the usage of the limit-state formulation to quantify the final probabilistic parameters and to compare them with the requirements. These techniques are explored with a simple lumped spring-mass system and a simplified SLS model. When completed, it is anticipated that this requirements-based validation
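    Read as a limit-state problem, the approach can be illustrated on a toy structure: propagate assumed dispersions in stiffness and mass to the first natural frequency, then estimate the probability that the model-to-test frequency error exceeds a requirement-derived tolerance. The two-DOF system, the ±5% dispersions, and the 3% tolerance below are placeholders for the sketch, not SLS program values.

```python
import numpy as np

rng = np.random.default_rng(1)

def first_frequency(k1, k2, m1, m2):
    """Lowest natural frequency (Hz) of a 2-DOF chain: wall--k1--m1--k2--m2."""
    K = np.array([[k1 + k2, -k2], [-k2, k2]])
    M = np.diag([m1, m2])
    eigvals = np.linalg.eigvals(np.linalg.solve(M, K))
    return np.sqrt(eigvals.real.min()) / (2 * np.pi)

# Nominal model properties and an assumed measured frequency (illustrative only).
k1n, k2n, m1n, m2n = 5.0e6, 3.0e6, 120.0, 80.0       # N/m, kg
f_test = first_frequency(k1n, k2n, m1n, m2n) * 1.01  # pretend 1% test offset

# Monte Carlo propagation of assumed +/-5% (1-sigma) property dispersions.
n = 20000
k1 = k1n * (1 + 0.05 * rng.standard_normal(n))
k2 = k2n * (1 + 0.05 * rng.standard_normal(n))
m1 = m1n * (1 + 0.05 * rng.standard_normal(n))
m2 = m2n * (1 + 0.05 * rng.standard_normal(n))
f_model = np.array([first_frequency(a, b, c, d) for a, b, c, d in zip(k1, k2, m1, m2)])

# Limit state: requirement-derived tolerance of 3% on the frequency error.
tol = 0.03
g = tol - np.abs(f_model - f_test) / f_test           # g < 0 means requirement violated
print(f"Nominal model frequency: {first_frequency(k1n, k2n, m1n, m2n):.2f} Hz")
print(f"P(frequency error > {tol:.0%}) = {(g < 0).mean():.3f}")
```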

  6. Nash Equilibria in Shared Effort Games

    NARCIS (Netherlands)

    Polevoy, G.; Trajanovski, S.; De Weerdt, M.M.

    2014-01-01

Shared effort games model people's contributions to projects and the sharing of the obtained profits. These games generalize both public projects, like writing for Wikipedia, where everybody shares the resulting benefits, and all-pay auctions, such as contests and political campaigns, where only the winner

  7. Net benefits of wildfire prevention education efforts

    Science.gov (United States)

    Jeffrey P. Prestemon; David T. Butry; Karen L. Abt; Ronda. Sutphen

    2010-01-01

    Wildfire prevention education efforts involve a variety of methods, including airing public service announcements, distributing brochures, and making presentations, which are intended to reduce the occurrence of certain kinds of wildfires. A Poisson model of preventable Florida wildfires from 2002 to 2007 by fire management region was developed. Controlling for...
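    Count models of this kind are usually fit as Poisson generalized linear models with a log link. The snippet below shows that setup on made-up regional counts with an invented prevention-effort covariate; it is a generic illustration, not the authors' specification or data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)

# Synthetic panel: yearly preventable-wildfire counts per region, with made-up
# "prevention effort" and drought covariates (scaled, purely illustrative).
n = 120
effort_index = rng.uniform(0.0, 2.0, size=n)
drought_index = rng.uniform(0.0, 1.0, size=n)
true_rate = np.exp(2.0 - 0.6 * effort_index + 1.1 * drought_index)
fires = rng.poisson(true_rate)

X = sm.add_constant(np.column_stack([effort_index, drought_index]))
result = sm.GLM(fires, X, family=sm.families.Poisson()).fit()
print(result.params)          # log-rate coefficients
print(np.exp(result.params))  # rate ratios per unit change in each covariate
```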

  8. A Conceptual Model and Process for Client-driven Agile Requirements Prioritization

    NARCIS (Netherlands)

    Racheva, Z.; Daneva, Maia; Herrmann, Andrea; Wieringa, Roelf J.

    Continuous customer-centric requirements reprioritization is essential in successfully performing agile software development. Yet, in the agile RE literature, very little is known about how agile reprioritization happens in practice. Generic conceptual models about this process are missing, which in

  9. INTEGRATED DATA CAPTURING REQUIREMENTS FOR 3D SEMANTIC MODELLING OF CULTURAL HERITAGE: THE INCEPTION PROTOCOL

    Directory of Open Access Journals (Sweden)

    R. Di Giulio

    2017-02-01

In order to face these challenges and to start solving the issue of the large amount of captured data and time-consuming processes in the production of 3D digital models, an Optimized Data Acquisition Protocol (DAP) has been set up. The purpose is to guide the processes of digitization of cultural heritage, respecting needs, requirements and specificities of cultural assets.

  10. Handling non-functional requirements in model-driven development: an ongoing industrial survey

    NARCIS (Netherlands)

    Ameller, David; Franch, Xavier; Gómez, Cristina; Araújo, João; Berntsson Svensson, Richard; Biffle, Stefan; Cabot, Jordi; Cortelessa, Vittorio; Daneva, Maia; Méndez Fernández, Daniel; Moreira, Ana; Muccini, Henry; Vallecillo, Antonio; Wimmer, Manuel; Amaral, Vasco; Brunelière, Hugo; Burgueño, Loli; Goulão, Miguel; Schätz, Bernard; Teufl, Sabine

    2015-01-01

    Model-Driven Development (MDD) is no longer a novel development paradigm. It has become mature from a research perspective and recent studies show its adoption in industry. Still, some issues remain a challenge. Among them, we are interested in the treatment of non-functional requirements (NFRs) in

  11. Projected irrigation requirements for upland crops using soil moisture model under climate change in South Korea

    Science.gov (United States)

An increase in abnormal climate change patterns and unsustainable irrigation in uplands cause drought and affect agricultural water security, crop productivity, and price fluctuations. In this study, we developed a soil moisture model to project irrigation requirements (IR) for upland crops under climate change.

  12. Teaching Tip: Using Rapid Game Prototyping for Exploring Requirements Discovery and Modeling

    Science.gov (United States)

    Dalal, Nikunj

    2012-01-01

    We describe the use of rapid game prototyping as a pedagogic technique to experientially explore and learn requirements discovery, modeling, and specification in systems analysis and design courses. Students have a natural interest in gaming that transcends age, gender, and background. Rapid digital game creation is used to build computer games…

  13. Understanding the relationship between Kano model's customer satisfaction scores and self-stated requirements importance.

    Science.gov (United States)

    Mkpojiogu, Emmanuel O C; Hashim, Nor Laily

    2016-01-01

Customer satisfaction is the result of product quality and viability. The perceived satisfaction of users/customers with a software product cannot be neglected, especially in today's competitive market environment, as it drives customer loyalty and promotes high profitability and return on investment. It is therefore worth understanding the importance of requirements as it relates to the satisfaction of users/customers when those requirements are met. It is necessary to know the relationship between customer satisfaction when requirements are met (or dissatisfaction when they are unmet) and the importance of such requirements. Many studies have examined customer satisfaction in connection with the importance of requirements, but the relationship between the customer satisfaction scores (coefficients) of the Kano model and users'/customers' self-stated requirements importance has not been sufficiently explored. In this study, an attempt is made to unravel the underlying relationship between the Kano model's customer satisfaction indexes and users'/customers' self-reported requirements importance. The results of the study indicate some interesting associations between the considered variables. These bivariate associations reveal that the customer satisfaction index (SI) and average satisfaction coefficient (ASC), and the customer dissatisfaction index (DI) and average satisfaction coefficient (ASC), are highly correlated (r = 96%), and thus ASC can be used in place of either SI or DI in representing customer satisfaction scores. Also, these Kano model customer satisfaction variables (SI, DI, and ASC) are each associated with self-stated requirements importance (IMP). Further analysis indicates that the value customers or users place on requirements that are met, or on features that are incorporated into a product, influences the level of satisfaction such customers derive from the product.
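    For readers unfamiliar with how such scores are produced, the sketch below computes satisfaction and dissatisfaction coefficients from Kano survey classification counts using the commonly cited formulas; the requirement names and counts are invented, and the "average satisfaction coefficient" shown is one plausible definition that may differ from the one used in the study.

```python
# Kano-style satisfaction/dissatisfaction coefficients from survey counts.
# A, O, M, I = counts of Attractive, One-dimensional, Must-be, Indifferent
# classifications for one requirement (invented numbers for illustration).
def kano_coefficients(A, O, M, I):
    total = A + O + M + I
    si = (A + O) / total        # satisfaction index: gain if implemented
    di = -(O + M) / total       # dissatisfaction index: loss if missing
    asc = (si + abs(di)) / 2    # one possible "average satisfaction coefficient"
    return si, di, asc

requirements = {
    "fast search":   (34, 40, 10, 16),
    "offline mode":  (55, 15,  5, 25),
    "audit logging": ( 8, 25, 50, 17),
}
for name, counts in requirements.items():
    si, di, asc = kano_coefficients(*counts)
    print(f"{name:13s}  SI={si:.2f}  DI={di:.2f}  ASC={asc:.2f}")
```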

  14. How Many Model Evaluations Are Required To Predict The AEP Of A Wind Power Plant?

    DEFF Research Database (Denmark)

    Murcia Leon, Juan Pablo; Réthoré, Pierre-Elouan; Natarajan, Anand

    2015-01-01

Wind farm flow models have advanced considerably with the use of large eddy simulations (LES) and Reynolds averaged Navier-Stokes (RANS) computations. The main limitation of these techniques is their high computational time requirements, which makes their use for wind farm annual energy production (AEP) predictions expensive. The objective of the present paper is to minimize the number of model evaluations required to capture the wind power plant's AEP using stationary wind farm flow models. Polynomial chaos techniques are proposed based on arbitrary Weibull distributed wind speed and von Mises distributed wind direction. The correlation between wind direction and wind speed is captured by defining Weibull parameters as functions of wind direction. In order to evaluate the accuracy of these methods the expectation and variance of the wind farm power distributions are compared against ...

  15. The realisation of legal protection requirements with the aid of models of nuclear facilities

    International Nuclear Information System (INIS)

    Wildberg, D.W.; Herrmann, H.J.

    1978-08-01

In the Federal Republic of Germany, the model-based planning, construction and operation of nuclear facilities is still in its initial stages. Based on a few examples, the authors show that, with the atomic energy legislation and with the laws in the conventional sector, the legislator enacted requirements at a relatively early stage for the protection of the individual person in the facility and for the population at large in the vicinity of the facility. However, in the realization of these protection requirements there are still problems, and these are often very basic in nature. The best solution here seems to be to tackle the problems with the help of models. This would permit subjects like serviceability, testability, use of external personnel, spatial distribution of redundancies, rescue of injured persons, fire protection measures, physical protection and the dismantling of facilities, which are multifarious in nature and have overlapping requirements, to be presented and discussed in greater depth and detail. The positive aspects of the use of models are presented, and the advantages and disadvantages of models are discussed in detail. Finally, the variety of models which can be used during the different phases of a nuclear facility is discussed, and some remarks are made regarding the costs of models. One section of the report deals with examples of the practical use of models: models have proved themselves in the past in the construction of refineries and chemical plants, and have successfully demonstrated their suitability in the field of nuclear technology. Such examples need not be limited to those from the Federal Republic of Germany. (orig.)

  16. On meeting capital requirements with a chance-constrained optimization model.

    Science.gov (United States)

    Atta Mills, Ebenezer Fiifi Emire; Yu, Bo; Gu, Lanlan

    2016-01-01

This paper deals with a capital to risk asset ratio chance-constrained optimization model in the presence of loans, treasury bills, fixed assets and non-interest earning assets. To model the dynamics of loans, we introduce a modified CreditMetrics approach. This leads to the development of a deterministic convex counterpart of the capital to risk asset ratio chance constraint. We analyze our model under the worst-case scenario, i.e., loan default. The theoretical model is analyzed by applying numerical procedures, in order to provide valuable insights from a financial perspective. Our results suggest that our capital to risk asset ratio chance-constrained optimization model guarantees that banks meet the capital requirements of Basel III with a likelihood of 95% irrespective of changes in the future market value of assets.
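    The key modelling step, replacing the probabilistic capital-adequacy constraint with a deterministic convex one, can be illustrated with the textbook normal-approximation form: require the expected capital ratio minus z_0.95 standard deviations to stay above the regulatory floor. The portfolio numbers below are invented, and the paper itself derives the constraint from a modified CreditMetrics model rather than a plain normal assumption.

```python
import numpy as np
from scipy.stats import norm

# Invented two-asset loan portfolio (illustrative numbers only).
weights = np.array([0.6, 0.4])           # allocation of the loan book
mean_ratio = np.array([0.115, 0.135])    # expected capital-to-RWA contribution
cov = np.array([[0.0004, 0.0001],
                [0.0001, 0.0009]])       # covariance of those contributions

alpha = 0.95                             # required confidence level
floor = 0.105                            # Basel-style minimum ratio (illustrative)

mu = weights @ mean_ratio
sigma = np.sqrt(weights @ cov @ weights)
z = norm.ppf(alpha)

# Deterministic equivalent of P(ratio >= floor) >= alpha under normality:
#   mu - z * sigma >= floor
feasible = mu - z * sigma >= floor
print(f"mean ratio = {mu:.4f}, sigma = {sigma:.4f}, "
      f"worst-case({alpha:.0%}) = {mu - z * sigma:.4f}, feasible = {feasible}")
```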

  17. Requirements for UML and OWL Integration Tool for User Data Consistency Modeling and Testing

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard; Oleshchuk, V. A.

    2003-01-01

The amount of data available on the Internet is continuously increasing; consequently, there is a growing need for tools that help to analyse the data. Testing of consistency among data received from different sources is made difficult by the number of different languages and schemas being used. ... In this paper we analyze requirements for a tool that supports integration of UML models and ontologies written in languages like the W3C Web Ontology Language (OWL). The tool can be used in the following way: after loading two legacy models into the tool, the tool user connects them by inserting modeling ...; an important part of this technique is attaching OCL expressions to special boolean class attributes that we call consistency attributes. The resulting integration model can be used for automatic consistency testing of two instances of the legacy models by automatically instantiating the whole integration ...

  18. Maximum effort in the minimum-effort game

    Czech Academy of Sciences Publication Activity Database

    Engelmann, Dirk; Normann, H.-T.

    2010-01-01

Vol. 13, No. 3 (2010), pp. 249-259 ISSN 1386-4157 Institutional research plan: CEZ:AV0Z70850503 Keywords: minimum-effort game * coordination game * experiments * social capital Subject RIV: AH - Economics Impact factor: 1.868, year: 2010

  19. How Many Model Evaluations Are Required To Predict The AEP Of A Wind Power Plant?

    International Nuclear Information System (INIS)

    Murcia, J P; Réthoré, P E; Natarajan, A; Sørensen, J D

    2015-01-01

Wind farm flow models have advanced considerably with the use of large eddy simulations (LES) and Reynolds averaged Navier-Stokes (RANS) computations. The main limitation of these techniques is their high computational time requirements, which makes their use for wind farm annual energy production (AEP) predictions expensive. The objective of the present paper is to minimize the number of model evaluations required to capture the wind power plant's AEP using stationary wind farm flow models. Polynomial chaos techniques are proposed based on arbitrary Weibull distributed wind speed and von Mises distributed wind direction. The correlation between wind direction and wind speed is captured by defining Weibull parameters as functions of wind direction. In order to evaluate the accuracy of these methods the expectation and variance of the wind farm power distributions are compared against the traditional binning method with trapezoidal and Simpson's integration rules. The wind farm flow model used in this study is the semi-empirical wake model developed by Larsen [1]. Three test cases are studied: a single turbine, a simple and a real offshore wind power plant. A reduced number of model evaluations for a general wind power plant is proposed based on the convergence of the present method for each case. (paper)
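    The paper's goal, capturing the AEP statistics with far fewer flow-model evaluations than brute-force binning, can be illustrated on a single turbine with Weibull-distributed wind speed: evaluating a power curve at a handful of Gauss–Legendre nodes mapped through the Weibull quantile function approximates the expectation obtained from fine binning. This is only a simplified stand-in for the polynomial-chaos machinery; the power curve, Weibull parameters, and the neglect of wind direction and wakes are assumptions of the sketch.

```python
import numpy as np

# Weibull wind-speed distribution (illustrative site parameters).
k_shape, A_scale = 2.0, 9.0
weibull_ppf = lambda u: A_scale * (-np.log(1.0 - u)) ** (1.0 / k_shape)

def power_curve(v):
    """Simplified turbine power curve in MW (cut-in 3, rated 12, cut-out 25 m/s)."""
    p = np.clip(((v - 3.0) / (12.0 - 3.0)) ** 3, 0.0, 1.0) * 3.0
    return np.where((v < 3.0) | (v > 25.0), 0.0, p)

# Reference: fine binning of the wind-speed distribution (many model calls).
u_fine = (np.arange(2000) + 0.5) / 2000
p_ref = power_curve(weibull_ppf(u_fine)).mean()

# Reduced evaluations: Gauss-Legendre nodes mapped to (0,1), then to wind speed.
nodes, weights = np.polynomial.legendre.leggauss(8)   # only 8 model calls
u_gl = 0.5 * (nodes + 1.0)
p_gl = 0.5 * np.sum(weights * power_curve(weibull_ppf(u_gl)))

hours = 8760
print(f"AEP (2000 evaluations): {p_ref * hours:,.0f} MWh")
print(f"AEP (   8 evaluations): {p_gl * hours:,.0f} MWh")
```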

  20. A New Rapid Simplified Model for Urban Rainstorm Inundation with Low Data Requirements

    Directory of Open Access Journals (Sweden)

    Ji Shen

    2016-11-01

Full Text Available This paper proposes a new rapid simplified inundation model (NRSIM) for flood inundation caused by rainstorms in an urban setting that can simulate the urban rainstorm inundation extent and depth in a data-scarce area. Drainage basins delineated from a floodplain map according to the distribution of the inundation sources serve as the calculation cells of NRSIM. To reduce data requirements and computational costs of the model, the internal topography of each calculation cell is simplified to a circular cone, and a mass conservation equation based on a volume spreading algorithm is established to simulate the interior water filling process. Moreover, an improved D8 algorithm is outlined for the simulation of water spilling between different cells. The performance of NRSIM is evaluated by comparing the simulated results with those from a traditional rapid flood spreading model (TRFSM) for various resolutions of digital elevation model (DEM) data. The results are as follows: (1) given high-resolution DEM data input, the TRFSM model has better performance in terms of precision than NRSIM; (2) the results from TRFSM are seriously affected by the decrease in DEM data resolution, whereas those from NRSIM are not; and (3) NRSIM always requires less computational time than TRFSM. Apparently, compared with the complex hydrodynamic or traditional rapid flood spreading model, NRSIM has much better applicability and cost-efficiency in real-time urban inundation forecasting for data-sparse areas.
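    The cell-to-cell spilling step builds on a D8-style flow-direction rule: each cell routes water toward the steepest downslope of its eight neighbours. A bare-bones version of the classic D8 rule on a tiny DEM grid is sketched below; the paper's improved D8 algorithm adds refinements that are not reproduced here.

```python
import numpy as np

def d8_flow_direction(dem, cellsize=1.0):
    """Classic D8: index (0-7) of the steepest-descent neighbour, -1 for pits."""
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
    rows, cols = dem.shape
    direction = np.full((rows, cols), -1, dtype=int)
    for r in range(rows):
        for c in range(cols):
            best_slope, best_dir = 0.0, -1
            for k, (dr, dc) in enumerate(offsets):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    dist = cellsize * np.hypot(dr, dc)
                    slope = (dem[r, c] - dem[rr, cc]) / dist
                    if slope > best_slope:
                        best_slope, best_dir = slope, k
            direction[r, c] = best_dir
    return direction

dem = np.array([[9.0, 8.0, 7.0],
                [8.0, 6.0, 5.0],
                [7.0, 5.0, 3.0]])
print(d8_flow_direction(dem))   # pit cells (no lower neighbour) get -1
```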

  1. ASME Code Efforts Supporting HTGRs

    Energy Technology Data Exchange (ETDEWEB)

    D.K. Morton

    2010-09-01

In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  2. ASME Code Efforts Supporting HTGRs

    Energy Technology Data Exchange (ETDEWEB)

    D.K. Morton

    2011-09-01

In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  3. ASME Code Efforts Supporting HTGRs

    Energy Technology Data Exchange (ETDEWEB)

    D.K. Morton

    2012-09-01

In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  4. Vocal effort and voice handicap among teachers.

    Science.gov (United States)

    Sampaio, Márcio Cardoso; dos Reis, Eduardo José Farias Borges; Carvalho, Fernando Martins; Porto, Lauro Antonio; Araújo, Tânia Maria

    2012-11-01

    The relationship between voice handicap and professional vocal effort was investigated among teachers in a cross-sectional study of census nature on 4496 teachers within the public elementary education network in Salvador, Bahia, Brazil. Voice handicap (the outcome of interest) was evaluated using the Voice Handicap Index 10. The main exposure, the lifetime vocal effort index, was obtained as the product of the number of years working as a teacher multiplied by the mean weekly working hours. The prevalence of voice handicap was 28.8% among teachers with high professional vocal effort and 21.3% among those with acceptable vocal effort, thus yielding a crude prevalence ratio (PR) of 1.36 (95% confidence interval [CI]=1.14-1.61). In the final logistic model, the prevalence of voice handicap was statistically associated with the professional vocal effort index (PR=1.47; 95% CI=1.19-1.82), adjusted according to sex, microphone availability in the classroom, excessive noise, pressure from the school management, heartburn, and rhinitis. Copyright © 2012 The Voice Foundation. Published by Mosby, Inc. All rights reserved.

  5. Effort problem of chemical pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Okrajni, J.; Ciesla, M.; Mutwil, K. [Silesian Technical University, Katowice (Poland)

    1998-12-31

The paper addresses the problem of assessing the technical state of chemical pipelines working under mechanical and thermal loading. The effort of the pipelines after a long operating period has been analysed. Material, geometrical and loading conditions of the crack initiation and crack growth process in the chosen object have been discussed. Areas of maximal effort have been determined. The material structure changes after the long operating period have been described. Mechanisms of crack initiation and crack growth in the pipeline elements have been analysed, and mutual relations between the chemical and mechanical influences have been shown. (orig.) 16 refs.

  6. What model resolution is required in climatological downscaling over complex terrain?

    Science.gov (United States)

    El-Samra, Renalda; Bou-Zeid, Elie; El-Fadel, Mutasem

    2018-05-01

This study presents results from the Weather Research and Forecasting (WRF) model applied for climatological downscaling simulations over highly complex terrain along the Eastern Mediterranean. We sequentially downscale general circulation model results, for a mild and wet year (2003) and a hot and dry year (2010), to three local horizontal resolutions of 9, 3 and 1 km. Simulated near-surface hydrometeorological variables are compared at different time scales against data from an observational network over the study area comprising rain gauges, anemometers, and thermometers. The overall performance of WRF at 1 and 3 km horizontal resolution was satisfactory, with significant improvement over the 9 km downscaling simulation. The total yearly precipitation from WRF's 1 km and 3 km domains exhibited ... Wind speeds, on the other hand, are generally overestimated for all model resolutions, in comparison with observational data, particularly on the coast (up to 50%) compared to inland stations (up to 40%). The findings therefore indicate that a 3 km resolution is sufficient for the downscaling, especially as it would allow more years and scenarios to be investigated compared to the higher 1 km resolution at the same computational effort. In addition, the results provide a quantitative measure of the potential errors for various hydrometeorological variables.

  7. Evaluation of risk impact of changes to surveillance requirements addressing model and parameter uncertainties

    International Nuclear Information System (INIS)

    Martorell, S.; Villamizar, M.; Martón, I.; Villanueva, J.F.; Carlos, S.; Sánchez, A.I.

    2014-01-01

This paper presents a three-step approach for the evaluation of the risk impact of changes to Surveillance Requirements, based on the use of Probabilistic Risk Assessment and addressing identification, treatment and analysis of model and parameter uncertainties in an integrated manner. The paper also includes an application example that focuses on the evaluation of the risk impact of a Surveillance Frequency change for the Reactor Protection System of a Nuclear Power Plant using a level 1 Probabilistic Risk Assessment. Surveillance Requirements are part of the Technical Specifications that are included in the Licensing Basis for operation of Nuclear Power Plants. Surveillance Requirements aim at limiting the risk of undetected downtimes of safety-related equipment by imposing equipment operability checks, which consist of testing equipment operational parameters with an established Surveillance Frequency and Test Strategy.

  8. Patient-centered care requires a patient-oriented workflow model.

    Science.gov (United States)

    Ozkaynak, Mustafa; Brennan, Patricia Flatley; Hanauer, David A; Johnson, Sharon; Aarts, Jos; Zheng, Kai; Haque, Saira N

    2013-06-01

    Effective design of health information technology (HIT) for patient-centered care requires consideration of workflow from the patient's perspective, termed 'patient-oriented workflow.' This approach organizes the building blocks of work around the patients who are moving through the care system. Patient-oriented workflow complements the more familiar clinician-oriented workflow approaches, and offers several advantages, including the ability to capture simultaneous, cooperative work, which is essential in care delivery. Patient-oriented workflow models can also provide an understanding of healthcare work taking place in various formal and informal health settings in an integrated manner. We present two cases demonstrating the potential value of patient-oriented workflow models. Significant theoretical, methodological, and practical challenges must be met to ensure adoption of patient-oriented workflow models. Patient-oriented workflow models define meaningful system boundaries and can lead to HIT implementations that are more consistent with cooperative work and its emergent features.

  9. Estimates of methionine and sulfur amino acid requirements for laying hens using different models

    Directory of Open Access Journals (Sweden)

    AA Saki

    2012-09-01

Full Text Available This experiment was conducted to evaluate the effects of dietary methionine (Met) content on the performance of white commercial laying hens and to determine Met and total sulfur amino acid (TSAA) requirements. These requirements were estimated using three statistical models (broken-line regression, exponential and second-order equations) to evaluate their ability to determine amino acid requirements. A total of 216 laying hens (23 wks of age) was used in a completely randomized design (CRD) with six treatments with four replicates of nine birds each. The basal diet contained 15.25% crude protein, 2830.16 kcal/kg ME and 0.24% Met. Synthetic DL-Met was added to the deficient (basal) diet in 0.05% increments to make the other five experimental diets (0.29, 0.34, 0.39, 0.44 and 0.49% Met). Increasing the Met level from 0.24 to 0.34% significantly increased egg production, egg weight, egg mass, egg content, and feed intake and decreased the feed conversion ratio (p<0.05). However, further Met increases, from 0.34 to 0.49%, no longer influenced these parameters. Of the three models, the broken-line regression model presented better estimates of AA requirements. Based on broken-line equations, the average Met and TSAA requirements of the laying hens were 0.31 and 0.60% (245.50 and 469.25 mg/hen/day) from 22 to 36 wks of age, respectively.
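    The broken-line (segmented) regression used here fits a response that rises linearly with intake up to a breakpoint and then plateaus, with the breakpoint read off as the estimated requirement. The sketch below fits that two-segment model to invented dose-response numbers with a generic curve-fitting routine; the data and starting values are placeholders, not the trial's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def broken_line(x, plateau, slope, breakpoint):
    """Ascending broken-line: rises until the breakpoint, flat afterwards."""
    return plateau + slope * np.minimum(x - breakpoint, 0.0)

# Invented dietary-methionine levels (%) and egg-mass responses (g/hen/day).
met = np.array([0.24, 0.29, 0.34, 0.39, 0.44, 0.49])
egg_mass = np.array([44.1, 49.8, 53.0, 53.3, 53.1, 53.4])

popt, _ = curve_fit(broken_line, met, egg_mass, p0=[53.0, 80.0, 0.33])
plateau, slope, breakpoint = popt
print(f"Estimated requirement (breakpoint): {breakpoint:.3f} % of diet")
print(f"Plateau response: {plateau:.1f} g/hen/day")
```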

  10. Dependencies among Architectural Views Got from Software Requirements Based on a Formal Model

    Directory of Open Access Journals (Sweden)

    Osis Janis

    2014-12-01

Full Text Available A system architect has software requirements and some unspecified knowledge about a problem domain (e.g., an enterprise) as source information for assessing and evaluating possible solutions and for reaching the target point, a preliminary software design. The deciding factor is the architect's experience and expertise in the problem domain ("AS-IS"). The proposed approach is intended to assist a system architect in making an appropriate decision on the solution ("TO-BE"). It is based on a formal mathematical model, the Topological Functioning Model (TFM). Compliant TFMs can be transformed into software architectural views. The paper demonstrates and discusses tracing dependency links from the requirements to and between the architectural views.

  11. Computer-Based Procedures for Field Workers in Nuclear Power Plants: Development of a Model of Procedure Usage and Identification of Requirements

    Energy Technology Data Exchange (ETDEWEB)

    Katya Le Blanc; Johanna Oxstrand

    2012-04-01

The nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however, most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned from these previous efforts, we are now exploring a less familiar application of computer-based procedures: field procedures, i.e., procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field workers. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how to best design the computer-based procedures to do so. This paper describes the development of a Model of Procedure Use and the qualitative study on which the model is based. The study was conducted in collaboration with four nuclear utilities and five research institutes. During the qualitative study and the model development, requirements for computer-based procedures were identified.

  12. Seamless Requirements

    OpenAIRE

    Naumchev, Alexandr; Meyer, Bertrand

    2017-01-01

    Popular notations for functional requirements specifications frequently ignore developers' needs, target specific development models, or require translation of requirements into tests for verification; the results can give out-of-sync or downright incompatible artifacts. Seamless Requirements, a new approach to specifying functional requirements, contributes to developers' understanding of requirements and to software quality regardless of the process, while the process itself becomes lighter...

  13. High resolution weather data for urban hydrological modelling and impact assessment, ICT requirements and future challenges

    Science.gov (United States)

    ten Veldhuis, Marie-claire; van Riemsdijk, Birna

    2013-04-01

    Hydrological analysis of urban catchments requires high resolution rainfall and catchment information because of the small size of these catchments, the high spatial variability of the urban fabric, fast runoff processes and related short response times. Rainfall information available from traditional radar and rain gauge networks does not meet the relevant scales of urban hydrology. A new type of weather radar, based on X-band frequency and equipped with Doppler and dual polarimetry capabilities, promises to provide more accurate rainfall estimates at the spatial and temporal scales that are required for urban hydrological analysis. Recently, the RAINGAIN project was started to analyse the applicability of this new type of radar in the context of urban hydrological modelling. In this project, meteorologists and hydrologists work closely together in several stages of urban hydrological analysis: from the acquisition procedure of novel and high-end radar products to data acquisition and processing, rainfall data retrieval, hydrological event analysis and forecasting. The project comprises four pilot locations with various characteristics of weather radar equipment, ground stations, urban hydrological systems, modelling approaches and requirements. Access to data processing and modelling software is handled in different ways in the pilots, depending on ownership and user context. Sharing of data and software among pilots and with the outside world is an ongoing topic of discussion. The availability of high resolution weather data augments requirements with respect to the resolution of hydrological models and input data. This has led to the development of fully distributed hydrological models, the implementation of which remains limited by the unavailability of hydrological input data. On the other hand, if models are to be used in flood forecasting, hydrological models need to be computationally efficient to enable fast responses to extreme event conditions. This

  14. ZNJPrice/Earnings Ratio Model through Dividend Yield and Required Yield Above Expected Inflation

    Directory of Open Access Journals (Sweden)

    Emil Mihalina

    2010-07-01

    Full Text Available Price/earnings ratio is the most popular and most widespread evaluation model used to assess relative capital asset value on financial markets. In functional terms, company earnings in the very long term can be described with high significance. Empirically, it is visible from long-term statistics that the demanded (required) yield on capital markets has a certain regularity. Thus, investors first require a yield above the stable inflation rate and then a dividend yield and a capital increase caused by the growth of earnings that influence the price, with the assumption that the P/E ratio is stable. By combining the Gordon model for current dividend value, the model of market capitalization of earnings (price/earnings ratio) and bearing in mind the influence of the general price levels on company earnings, it is possible to adjust the price/earnings ratio by deriving a function of the required yield on capital markets measured by a market index through dividend yield and inflation rate above the stable inflation rate increased by profit growth. The S&P 500 index, for example, has in the last 100 years grown by exactly the inflation rate above the stable inflation rate increased by profit growth. The comparison of two series of price/earnings ratios, a modelled one and an average 7-year ratio, shows a notable correlation in the movement of the two series of variables, with a three-year deviation. Therefore, it could be hypothesized that three years of the expected inflation level, dividend yield and profit growth rate of the market index are discounted in the current market prices. The conclusion is that, at the present time, the relationship between the adjusted average price/earnings ratio and its effect on the market index on one hand and the modelled price/earnings ratio on the other can clearly show the expected dynamics and course in the following period.
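    For readers who want to see the arithmetic behind such an adjusted ratio, the toy calculation below applies the classical Gordon-growth relation that the article builds on (P/E = payout ratio / (required return minus growth)); all input numbers, and the helper name modelled_pe, are assumptions for illustration, not values from the article.

        # Toy Gordon-growth P/E; every input below is an assumed example value.
        def modelled_pe(payout_ratio, required_return, growth):
            # Gordon model: P = D1 / (k - g); dividing both sides by earnings E
            # gives P/E = payout_ratio / (k - g).
            return payout_ratio / (required_return - growth)

        yield_above_inflation = 0.02   # required yield above stable inflation (assumed)
        dividend_yield = 0.03          # assumed
        profit_growth = 0.04           # assumed
        required_return = yield_above_inflation + dividend_yield + profit_growth

        print("Modelled P/E: %.1f" % modelled_pe(0.5, required_return, profit_growth))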

  15. A critical review of the data requirements for fluid flow models through fractured rock

    International Nuclear Information System (INIS)

    Priest, S.D.

    1986-01-01

    The report is a comprehensive critical review of the data requirements for ten models of fluid flow through fractured rock, developed in Europe and North America. The first part of the report contains a detailed review of rock discontinuities and how their important geometrical properties can be quantified. This is followed by a brief summary of the fundamental principles in the analysis of fluid flow through two-dimensional discontinuity networks and an explanation of a new approach to the incorporation of variability and uncertainty into geotechnical models. The report also contains a review of the geological and geotechnical properties of anhydrite and granite. Of the ten fluid flow models reviewed, only three offer a realistic fracture network model for which it is feasible to obtain the input data. Although some of the other models have some valuable or novel features, there is a tendency to concentrate on the simulation of contaminant transport processes, at the expense of providing a realistic fracture network model. Only two of the models reviewed, neither of them developed in Europe, have seriously addressed the problem of analysing fluid flow in three-dimensional networks. (author)

  16. Minimal Requirements for Primary HIV Latency Models Based on a Systematic Review.

    Science.gov (United States)

    Bonczkowski, Pawel; De Scheerder, Marie-Angélique; De Spiegelaere, Ward; Vandekerckhove, Linos

    2016-01-01

    Due to the scarcity of HIV-1 latently infected cells in patients, in vitro primary latency models are now commonly used to study the HIV-1 reservoir. To this end, a number of experimental systems have been developed. Most of these models differ based on the nature of the primary CD4+ T-cell type, the used HIV strains, activation methods, and latency assessment strategies. Despite these differences, most models share some common characteristics. Here, we provide a systematic review covering the primary HIV latency models that have been used to date with the aim to compare these models and identify minimal requirements for such experiments. A systematic search on PubMed and Web of Science databases generated a short list of 17 unique publications that propose new in vitro latency models. Based on the described methods, we propose and discuss a generalized workflow, visualizing all the necessary steps to perform such an in vitro study, with the key choices and validation steps that need to be made; from cell type selection until the model readout.

  17. The Role of Dispersion in Radionuclide Transport - Data and Modeling Requirements: Revision No. 1

    Energy Technology Data Exchange (ETDEWEB)

    Stoller-Navarro Joint Venture

    2004-02-01

    This document is the collaborative effort of the members of an ad hoc subcommittee of the Underground Test Area Project Technical Working Group. This subcommittee was to answer questions and concerns raised by the Nevada Division of Environmental Protection to the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office, regarding Pahute Mesa Corrective Action Units (CAUs) 101 and 102. The document attempts to synthesize the combined comments made by each member of this subcommittee into insights into the role of dispersion in radionuclide transport data and modeling. Dispersion is one of many processes that control the concentration of radionuclides in groundwater beneath the Nevada Test Site where CAUs 101 and 102 are located. In order to understand the role of dispersion in radionuclide transport, there is a critical need for CAU- or site-specific data related to transport parameters, which are currently lacking, particularly in the case of Western and Central Pahute Mesa. The purpose of this technical basis document is to: (1) define dispersion and its role in contaminant transport, (2) present a synopsis of field-scale dispersion measurements, (3) provide a literature review of theories to explain field-scale dispersion, (4) suggest approaches to account for dispersion in CAU-scale radionuclide modeling, and (5) determine whether additional dispersion measurements should be made at this time.

  18. Our Electron Model vindicates Schrödinger's Incomplete Results and Require Restatement of Heisenberg's Uncertainty Principle

    Science.gov (United States)

    McLeod, David; McLeod, Roger

    2008-04-01

    The electron model used in our other joint paper here requires revision of some foundational physics. That electron model followed from comparing the experimentally proved results of human vision models using spatial Fourier transformations, SFTs, of pincushion and Hermann grids. Visual systems detect "negative" electric field values for darker so-called "illusory" diagonals that are physical consequences of the lens SFT of the Hermann grid, distinguishing this from light "illusory" diagonals. This indicates that oppositely directed vectors of the separate illusions are discretely observable, constituting another foundational fault in quantum mechanics, QM. The SFT of human vision is merely the scaled SFT of QM. Reciprocal space results of wavelength and momentum mimic reciprocal relationships between space variable x and spatial frequency variable p, by the experiment mentioned. Nobel laureate physicist von Békésy, physiology of hearing, 1961, performed pressure input Rect x inputs that the brain always reports as truncated Sinc p, showing again that the brain is an adjunct built by sight, preserves sign sense of EMF vectors, and is hard wired as an inverse SFT. These require vindication of Schrödinger's actual, but incomplete, wave model of the electron as having physical extent over the wave, and question Heisenberg's uncertainty proposal.

  19. Simulation of fertilizer requirement for irrigated wheat in Eastern India using the QUEFTS model

    Directory of Open Access Journals (Sweden)

    Debtanu Maiti

    2006-01-01

    Full Text Available Crop modeling can provide us with information about fertilizer dose to achieve the target yield, crop conditions, etc. Due to conventional and imbalanced fertilizer application, nutrient use efficiency in wheat is low. Estimation of fertilizer requirements based on quantitative approaches can assist in improving yields and nutrient use efficiency. Field experiments were conducted at 20 sites in eastern India (Nadia district of West Bengal) to assess the soil supply, requirement, and internal efficiency of N, P, K, and Zn in wheat. The data were used to calibrate the QUEFTS (Quantitative Evaluation of the Fertility of Tropical Soils) model for site-specific, balanced fertilizer recommendations. The parameters of maximum accumulation (a) and maximum dilution (d) in wheat were calculated for N (35, 100), P (129, 738), K (17, 56), and Zn (21502, 140244). Grain yield of wheat showed statistically significant correlation with N (R2 = 0.937**), P (R2 = 0.901**), and K uptake (R2 = 0.801**). The NPK ratio to produce 1 tonne grain yield of wheat was calculated to be 4.9:1.0:8.9. The relationships between chemical properties and nutrient-supplying capacity of soils were also established. The model was validated using the data from four other experiments. Observed yields with different amounts of N, P, K, and Zn were in good agreement with the predicted values, suggesting that the validated QUEFTS model can be used for site-specific nutrient management of wheat.
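    The sketch below is a deliberately simplified nutrient-balance calculation, not the full QUEFTS model: it converts a target grain yield into uptake and subtracts an assumed indigenous soil supply, dividing by an assumed fertilizer recovery fraction. The absolute per-tonne uptakes are placeholders chosen only to respect the reported 4.9:1.0:8.9 N:P:K ratio, and the soil supply and recovery values are likewise illustrative assumptions.

        # Simplified balance sketch (not QUEFTS); all numeric inputs are
        # illustrative placeholders, not calibrated values from the study.
        # Per-tonne uptakes are scaled to match the reported 4.9:1.0:8.9 ratio.
        UPTAKE_KG_PER_TONNE = {"N": 24.5, "P": 5.0, "K": 44.5}

        def fertilizer_requirement(target_yield_t_ha, soil_supply_kg_ha, recovery):
            """Return kg/ha of each fertilizer nutrient for a target grain yield."""
            need = {}
            for nutrient, uptake in UPTAKE_KG_PER_TONNE.items():
                demand = uptake * target_yield_t_ha
                deficit = max(0.0, demand - soil_supply_kg_ha[nutrient])
                need[nutrient] = round(deficit / recovery[nutrient], 1)
            return need

        print(fertilizer_requirement(
            target_yield_t_ha=4.0,
            soil_supply_kg_ha={"N": 60.0, "P": 10.0, "K": 80.0},  # assumed
            recovery={"N": 0.5, "P": 0.25, "K": 0.6}))            # assumed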

  20. Modeling orbital relative motion to enable formation design from application requirements

    Science.gov (United States)

    Fasano, Giancarmine; D'Errico, Marco

    2009-11-01

    While trajectory design for single satellite Earth observation missions is usually performed by means of analytical and relatively simple models of orbital dynamics including the main perturbations for the considered cases, most literature on formation flying dynamics is devoted to control issues rather than mission design. This work aims at bridging the gap between mission requirements and relative dynamics in multi-platform missions by means of an analytical model that describes relative motion for satellites moving on near circular low Earth orbits. The development is based on the orbital parameters approach and both the cases of close and large formations are taken into account. Secular Earth oblateness effects are included in the derivation. Modeling accuracy, when compared to a nonlinear model with two body and J2 forces, is shown to be of the order of 0.1% of relative coordinates for timescales of hundreds of orbits. An example of formation design is briefly described shaping a two-satellite formation on the basis of geometric requirements for synthetic aperture radar interferometry.
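    As a point of reference only, the sketch below evaluates the classical Clohessy-Wiltshire (Hill) closed-form solution for relative motion on near-circular orbits; it is a standard textbook model, not the orbital-element formulation with secular J2 effects developed in the paper, and the chief orbit and initial offsets are arbitrary example values.

        # Classical Clohessy-Wiltshire closed-form relative motion (no J2);
        # chief orbit and initial conditions below are arbitrary examples.
        import numpy as np

        MU_EARTH = 3.986004418e14  # m^3 s^-2

        def cw_position(t, n, x0, y0, z0, vx0, vy0, vz0):
            """Relative position (radial x, along-track y, cross-track z) at time t."""
            s, c = np.sin(n * t), np.cos(n * t)
            x = (4 - 3 * c) * x0 + vx0 / n * s + 2 * vy0 / n * (1 - c)
            y = 6 * (s - n * t) * x0 + y0 - 2 * vx0 / n * (1 - c) + vy0 / n * (4 * s - 3 * n * t)
            z = z0 * c + vz0 / n * s
            return x, y, z

        a = 7000e3                      # chief semi-major axis, m (assumed)
        n = np.sqrt(MU_EARTH / a ** 3)  # mean motion, rad/s
        # Choosing vy0 = -2*n*x0 yields a drift-free (closed) relative ellipse.
        print(cw_position(3000.0, n, 100.0, 0.0, 50.0, 0.0, -2 * n * 100.0, 0.0))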

  1. Antidepressant-like Effects of Electroconvulsive Seizures Require Adult Neurogenesis in a Neuroendocrine Model of Depression.

    Science.gov (United States)

    Schloesser, Robert J; Orvoen, Sophie; Jimenez, Dennisse V; Hardy, Nicholas F; Maynard, Kristen R; Sukumar, Mahima; Manji, Husseini K; Gardier, Alain M; David, Denis J; Martinowich, Keri

    2015-01-01

    Neurogenesis continues throughout life in the hippocampal dentate gyrus. Chronic treatment with monoaminergic antidepressant drugs stimulates hippocampal neurogenesis, and new neurons are required for some antidepressant-like behaviors. Electroconvulsive seizures (ECS), a laboratory model of electroconvulsive therapy (ECT), robustly stimulate hippocampal neurogenesis. ECS requires newborn neurons to improve behavioral deficits in a mouse neuroendocrine model of depression. We utilized immunohistochemistry for doublecortin (DCX), a marker of migrating neuroblasts, to assess the impact of Sham or ECS treatments (1 treatment per day, 7 treatments over 15 days) on hippocampal neurogenesis in animals receiving 6 weeks of either vehicle or chronic corticosterone (CORT) treatment in the drinking water. We conducted tests of anxiety- and depressive-like behavior to investigate the ability of ECS to reverse CORT-induced behavioral deficits. We also determined whether adult neurons are required for the effects of ECS. For these studies we utilized a pharmacogenetic model (hGFAPtk) to conditionally ablate adult born neurons. We then evaluated behavioral indices of depression after Sham or ECS treatments in CORT-treated wild-type animals and CORT-treated animals lacking neurogenesis. ECS is able to rescue CORT-induced behavioral deficits in indices of anxiety- and depressive-like behavior. ECS increases both the number and dendritic complexity of adult-born migrating neuroblasts. The ability of ECS to promote antidepressant-like behavior is blocked in mice lacking adult neurogenesis. ECS ameliorates a number of anxiety- and depressive-like behaviors caused by chronic exposure to CORT. ECS requires intact hippocampal neurogenesis for its efficacy in these behavioral indices. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. Expansion of the Kano model to identify relevant customer segments and functional requirements

    DEFF Research Database (Denmark)

    Atlason, Reynir Smari; Stefansson, Arnaldur Smari; Wietz, Miriam

    2017-01-01

    The Kano model of customer satisfaction has been widely used to analyse perceived needs of customers. The model provides product developers valuable information about if, and then how much, a given functional requirement (FR) will impact customer satisfaction if implemented within a product, system or a service. A current limitation of the Kano model is that it does not allow developers to visualise which combined sets of FRs would provide the highest satisfaction between different customer segments. In this paper, a stepwise method to address this particular shortcoming is presented. First, ... more than one combined customer segment. It further shows which segments provide the highest possibility for high satisfaction of combined sets of FRs. We demonstrate the usefulness of this approach in a case study involving customers’ preference for outdoor sports equipment.
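    Purely as a toy illustration of ranking combined FR sets across customer segments (not the paper's stepwise method), the snippet below scores each candidate FR bundle by the satisfaction of its worst-served segment; the segment names, FR names and scores are invented.

        # Toy ranking of functional-requirement (FR) bundles across segments;
        # all names and numbers are invented.
        from itertools import combinations

        SEGMENT_SCORES = {
            "hikers":    {"waterproof": 0.9, "lightweight": 0.7, "extra_pockets": 0.2},
            "climbers":  {"waterproof": 0.4, "lightweight": 0.9, "extra_pockets": 0.3},
            "commuters": {"waterproof": 0.8, "lightweight": 0.3, "extra_pockets": 0.7},
        }

        def bundle_score(bundle):
            """Score an FR set by the summed satisfaction of its worst-served segment."""
            return min(sum(scores[fr] for fr in bundle) for scores in SEGMENT_SCORES.values())

        frs = list(next(iter(SEGMENT_SCORES.values())))
        best = max((set(c) for c in combinations(frs, 2)), key=bundle_score)
        print(best)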

  3. Evaluation of olive flowering at low latitude sites in Argentina using a chilling requirement model

    Energy Technology Data Exchange (ETDEWEB)

    Aybar, V.E.; Melo-Abreu, J.P. de; Searles, P.S.; Matias, A.G.; Del Rio, C.; Caballero, C. M.; Rousseaux, M.C.

    2015-07-01

    Olive production has expanded significantly from the Mediterranean Basin into the New World over the last two decades. In some cases, cultivars of European origin have been introduced at a large commercial scale with little previous evaluation of potential productivity. The objective of this study was to evaluate whether a temperature-driven simulation model developed in the Mediterranean Basin to predict normal flowering occurrence and flowering date using cultivar-specific thermal requirements was suitable for the low latitude areas of Northwest Argentina. The model was validated at eight sites over several years and a wide elevation range (350–1200 m above mean sea level) for three cultivars (‘Arbequina’, ‘Frantoio’, ‘Leccino’) with potentially different chilling requirements. In ‘Arbequina’, normal flowering was observed at almost all sites and in all years, while normal flowering events in ‘Frantoio’ and ‘Leccino’ were uncommon. The model successfully predicted if flowering would be normal in 92% and 83% of the cases in ‘Arbequina’ and ‘Frantoio’, respectively, but was somewhat less successful in ‘Leccino’ (61%). When flowering occurred, the predicted flowering date was within ± 7 days of the observed date in 71% of the cases. Overall, the model results indicate that cultivar-specific simulation models may be used as an approximate tool to predict whether individual cultivars will be successful in new growing areas. In Northwest Argentina, the model could be used to identify cultivars to replace ‘Frantoio’ and ‘Leccino’ and to simulate global warming scenarios. (Author)
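    The snippet below is a generic chill-hours check of the kind such temperature-driven flowering models rest on; the 7.2 degC threshold, the per-cultivar chill-hour requirements and the synthetic temperature series are all assumptions for illustration and do not reproduce the model evaluated in the study.

        # Generic chill-hours accumulation; thresholds, requirements and data
        # are illustrative assumptions only.
        import random

        def chill_hours(hourly_temps_c, threshold_c=7.2):
            """Count hours at or below the chilling threshold."""
            return sum(1 for t in hourly_temps_c if t <= threshold_c)

        CHILL_REQUIREMENT_H = {"Arbequina": 300, "Frantoio": 600, "Leccino": 550}  # assumed

        def normal_flowering_expected(cultivar, hourly_temps_c):
            return chill_hours(hourly_temps_c) >= CHILL_REQUIREMENT_H[cultivar]

        random.seed(1)
        winter = [random.uniform(2.0, 18.0) for _ in range(24 * 120)]  # synthetic winter
        for cultivar in CHILL_REQUIREMENT_H:
            print(cultivar, normal_flowering_expected(cultivar, winter))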

  4. Application and project portfolio valuation using enterprise architecture and business requirements modelling

    Science.gov (United States)

    Quartel, Dick; Steen, Maarten W. A.; Lankhorst, Marc M.

    2012-05-01

    This article describes an architecture-based approach to IT valuation. This approach offers organisations an instrument to valuate their application and project portfolios and to make well-balanced decisions about IT investments. The value of a software application is assessed in terms of its contribution to a selection of business goals. Based on such assessments, the value of different applications can be compared, and requirements for innovation, development, maintenance and phasing out can be identified. IT projects are proposed to realise the requirements. The value of each project is assessed in terms of the value it adds to one or more applications. This value can be obtained by relating the 'as-is' application portfolio to the 'to-be' portfolio that is being proposed by the project portfolio. In this way, projects can be ranked according to their added value, given a certain selection of business goals. The approach uses ArchiMate to model the relationship between software applications, business processes, services and products. In addition, two language extensions are used to model the relationship of these elements to business goals and requirements and to projects and project portfolios. The approach is illustrated using the portfolio method of Bedell and has been implemented in BiZZdesign Architect.

  5. Safety Culture: A Requirement for New Business Models — Lessons Learned from Other High Risk Industries

    International Nuclear Information System (INIS)

    Kecklund, L.

    2016-01-01

    Technical development and changes on global markets affect all high-risk industries, creating opportunities as well as risks related to the achievement of safety and business goals. Changes in legal and regulatory frameworks as well as in market demands create a need for major changes. Several high-risk industries are facing a situation where they have to develop new business models. Within the transportation domain, e.g., aviation and railways, there is a growing concern related to how the new business models may affect safety. New business models in aviation and railways include extensive use of outsourcing and subcontractors to reduce costs, resulting in, e.g., negative changes in working conditions, work hours, employment conditions and high turnover rates. The energy sector also faces pressures to create new business models for the transition to renewable energy production to comply with new legal and regulatory requirements and to make best use of new reactor designs. In addition, large-scale phase-out and decommissioning of nuclear facilities have to be managed by the nuclear industry. Some negative effects of new business models have already arisen within the transportation domain, e.g., the negative effects of extensive outsourcing and subcontractor use. In the railway domain, the infrastructure manager is required by European and national regulations to assure that all subcontractors are working according to the requirements in the infrastructure manager's SMS (Safety Management System). More than ten levels of subcontractors can be working in a major infrastructure project, making the system highly complex and thus difficult to control. In the aviation domain, tightly coupled interacting computer networks supplying airport services, as well as air traffic control, are managed and maintained by several different companies, creating numerous interfaces which must be managed by the SMS. There are examples where a business model with several low

  6. Modelling regional variability of irrigation requirements due to climate change in Northern Germany.

    Science.gov (United States)

    Riediger, Jan; Breckling, Broder; Svoboda, Nikolai; Schröder, Winfried

    2016-01-15

    The question of whether global climate change invalidates the efficiency of established land use practice cannot be answered without systemic considerations on a region-specific basis. In this context, plant water availability and irrigation requirements, respectively, were investigated in Northern Germany. The regions under investigation--Diepholz, Uelzen, Fläming and Oder-Spree--represent a climatic gradient with increasing continentality from West to East. Besides regional climatic variation and climate change, soil conditions and crop management differ on the regional scale. In the model regions, temporal seasonal droughts influence crop success already today, but on different levels of intensity depending mainly on climate conditions. By linking soil water holding capacities, crop management data and calculations of evapotranspiration and precipitation from the climate change scenario RCP 8.5, irrigation requirements for maintaining crop productivity were estimated for the years 1991 to 2070. Results suggest that water requirements for crop irrigation are likely to increase, with considerable regional variation. For some of the regions, irrigation requirements might increase to such an extent that the established regional agricultural practice might be hard to retain. Where water availability is limited, agricultural practice, like management and the cultivated crop spectrum, has to be changed to deal with the new challenges. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Modeling traceability information and functionality requirement in export-oriented tilapia chain.

    Science.gov (United States)

    Zhang, Xiaoshuan; Feng, Jianying; Xu, Mark; Hu, Jinyou

    2011-05-01

    Tilapia has been named as the 'food fish of the 21st century' and has become the most important farmed fish. China is the world leader in tilapia production and export. Identifying information and functional requirements is critical in developing an efficient traceability system because traceability has become a fundamental prerequisite for exporting aquaculture products. This paper examines the export-oriented tilapia chains and information flow in the chains, and identifies the key actors, information requirements and information-capturing points. Unified Modeling Language (UML) technology is adopted to describe the information and functionality requirement for chain traceability. The barriers of traceability system adoption are also identified. The results show that the traceability data consist of four categories that must be recorded by each link in the chain. The functionality requirement is classified into four categories from the fundamental information record to decisive quality control; the top three barriers to the traceability system adoption are: high costs of implementing the system, lack of experienced and professional staff; and low level of government involvement and support. Copyright © 2011 Society of Chemical Industry.

  8. Estimating Software Effort Hours for Major Defense Acquisition Programs

    Science.gov (United States)

    Wallshein, Corinne C.

    2010-01-01

    Software Cost Estimation (SCE) uses labor hours or effort required to conceptualize, develop, integrate, test, field, or maintain program components. Department of Defense (DoD) SCE can use initial software data parameters to project effort hours for large, software-intensive programs for contractors reporting the top levels of process maturity,…

  9. Mental Effort in Binary Categorization Aided by Binary Cues

    Science.gov (United States)

    Botzer, Assaf; Meyer, Joachim; Parmet, Yisrael

    2013-01-01

    Binary cueing systems assist in many tasks, often alerting people about potential hazards (such as alarms and alerts). We investigate whether cues, besides possibly improving decision accuracy, also affect the effort users invest in tasks and whether the required effort in tasks affects the responses to cues. We developed a novel experimental tool…

  10. Technical support document for proposed revision of the model energy code thermal envelope requirements

    Energy Technology Data Exchange (ETDEWEB)

    Conner, C.C.; Lucas, R.G.

    1993-02-01

    This report documents the development of the proposed revision of the Council of American Building Officials' (CABO) 1993 supplement to the 1992 Model Energy Code (MEC) (referred to as the 1993 MEC) building thermal envelope requirements for single-family and low-rise multifamily residences. The goal of this analysis was to develop revised guidelines based on an objective methodology that determined the most cost-effective (least total life-cycle cost [LCC]) combination of energy conservation measures (ECMs) for residences in different locations. The ECMs with the lowest LCC were used as a basis for proposing revised MEC maximum U_o-value (thermal transmittance) curves in the MEC format. The changes proposed here affect the requirements for "group R" residences. The group R residences are detached one- and two-family dwellings (referred to as single-family) and all other residential buildings three stories or less (referred to as multifamily).

  12. Classical Causal Models for Bell and Kochen-Specker Inequality Violations Require Fine-Tuning

    Directory of Open Access Journals (Sweden)

    Eric G. Cavalcanti

    2018-04-01

    Full Text Available Nonlocality and contextuality are at the root of conceptual puzzles in quantum mechanics, and they are key resources for quantum advantage in information-processing tasks. Bell nonlocality is best understood as the incompatibility between quantum correlations and the classical theory of causality, applied to relativistic causal structure. Contextuality, on the other hand, is on a more controversial foundation. In this work, I provide a common conceptual ground between nonlocality and contextuality as violations of classical causality. First, I show that Bell inequalities can be derived solely from the assumptions of no signaling and no fine-tuning of the causal model. This removes two extra assumptions from a recent result from Wood and Spekkens and, remarkably, does not require any assumption related to independence of measurement settings—unlike all other derivations of Bell inequalities. I then introduce a formalism to represent contextuality scenarios within causal models and show that all classical causal models for violations of a Kochen-Specker inequality require fine-tuning. Thus, the quantum violation of classical causality goes beyond the case of spacelike-separated systems and already manifests in scenarios involving single systems.

  13. Model of assessment of requirements of privacy, security and quality of service for mobile medical applications

    Directory of Open Access Journals (Sweden)

    Edward Paul Guillen Pinto

    2017-08-01

    Full Text Available Introduction: The development of mobile technologies has facilitated the creation of mHealth applications, which are considered key tools for safe, quality care of patients from remote populations and populations lacking infrastructure for the provision of health services. The article presents a proposal for an evaluation model that makes it possible to determine weaknesses and vulnerabilities at the level of security and quality of service (QoS) in mHealth applications. Objective: To outline an analysis model that supports decision making concerning the use and production of safe applications, minimizing the impact and the probability of occurrence of computer security risks. Materials and methods: The research is descriptive, detailing the characteristics that mobile health applications must have in order to achieve an optimum level of safety. The methodology uses the rules that regulate applications and combines them with security analysis techniques, using the risk characterization proposed by the Open Web Application Security Project (OWASP) and the QoS requirements of the International Telecommunication Union (ITU). Results: An effective analysis of actual current applications was obtained, showing their weaknesses and the aspects to be corrected to comply with appropriate security parameters. Conclusions: The model makes it possible to evaluate the safety and quality of service (QoS) requirements of mobile health applications, and it can be used to evaluate current applications or to generate the criteria before deployment.

  14. Model of an aquaponic system for minimised water, energy and nitrogen requirements.

    Science.gov (United States)

    Reyes Lastiri, D; Slinkert, T; Cappon, H J; Baganz, D; Staaks, G; Keesman, K J

    2016-01-01

    Water and nutrient savings can be established by coupling water streams between interacting processes. Wastewater from production processes contains nutrients like nitrogen (N), which can and should be recycled in order to meet future regulatory discharge demands. Optimisation of interacting water systems is a complex task. An effective way of understanding, analysing and optimising such systems is by applying mathematical models. The present modelling work aims at supporting the design of a nearly emission-free aquaculture and hydroponic system (aquaponics), thus contributing to sustainable production and to food security for the 21st century. Based on the model, a system that couples 40 m(3) fish tanks and a hydroponic system of 1,000 m(2) can produce 5 tons of tilapia and 75 tons of tomato yearly. The system requires energy to condense and recover evaporated water, for lighting and heating, adding up to 1.3 GJ/m(2) every year. In the suggested configuration, the fish can provide about 26% of the N required in a plant cycle. A coupling strategy that sends water from the fish to the plants in amounts proportional to the fish feed input, reduces the standard deviation of the NO3(-) level in the fish cycle by 35%.
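    A back-of-the-envelope sketch of the nitrogen coupling described above is given below; the feed protein content, the nitrogen fraction of protein, the share excreted to the water and the plant nitrogen demand are all assumed values, not the study's calibrated coefficients.

        # Hypothetical N mass balance between fish and plant cycles; every
        # coefficient below is an assumption for illustration.
        def nitrogen_from_fish(feed_kg_per_day, protein_fraction=0.35,
                               n_in_protein=0.16, excreted_to_water=0.5):
            """Dissolved N (kg/day) made available to the plant cycle."""
            return feed_kg_per_day * protein_fraction * n_in_protein * excreted_to_water

        def coverage_of_plant_demand(feed_kg_per_day, plant_n_demand_kg_per_day):
            return nitrogen_from_fish(feed_kg_per_day) / plant_n_demand_kg_per_day

        # Example: 40 kg feed/day against an assumed plant demand of 4.3 kg N/day.
        print("Share of plant N demand covered: %.0f%%"
              % (100.0 * coverage_of_plant_demand(40.0, 4.3)))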

  15. Existing and Required Modeling Capabilities for Evaluating ATM Systems and Concepts

    Science.gov (United States)

    Odoni, Amedeo R.; Bowman, Jeremy; Delahaye, Daniel; Deyst, John J.; Feron, Eric; Hansman, R. John; Khan, Kashif; Kuchar, James K.; Pujet, Nicolas; Simpson, Robert W.

    1997-01-01

    ATM systems throughout the world are entering a period of major transition and change. The combination of important technological developments and of the globalization of the air transportation industry has necessitated a reexamination of some of the fundamental premises of existing Air Traffic Management (ATM) concepts. New ATM concepts have to be examined, concepts that may place more emphasis on: strategic traffic management; planning and control; partial decentralization of decision-making; and added reliance on the aircraft to carry out strategic ATM plans, with ground controllers confined primarily to a monitoring and supervisory role. 'Free Flight' is a case in point. In order to study, evaluate and validate such new concepts, the ATM community will have to rely heavily on models and computer-based tools/utilities, covering a wide range of issues and metrics related to safety, capacity and efficiency. The state of the art in such modeling support is adequate in some respects, but clearly deficient in others. It is the objective of this study to assist in: (1) assessing the strengths and weaknesses of existing fast-time models and tools for the study of ATM systems and concepts and (2) identifying and prioritizing the requirements for the development of additional modeling capabilities in the near future. A three-stage process has been followed to this purpose: 1. Through the analysis of two case studies involving future ATM system scenarios, as well as through expert assessment, modeling capabilities and supporting tools needed for testing and validating future ATM systems and concepts were identified and described. 2. Existing fast-time ATM models and support tools were reviewed and assessed with regard to the degree to which they offer the capabilities identified under Step 1. 3. The findings of 1 and 2 were combined to draw conclusions about (1) the best capabilities currently existing, (2) the types of concept testing and validation that can be carried

  16. Modelling of radon control and air cleaning requirements in underground uranium mines

    International Nuclear Information System (INIS)

    El Fawal, M.; Gadalla, A.

    2014-01-01

    As a part of a comprehensive study concerned with controlling workplace short-lived radon daughter concentrations in underground uranium mines to safe levels, a computer program has been developed and verified to calculate ventilation parameters, e.g., local pressures, flow rates and radon daughter concentration levels. The computer program is composed of two parts, one part for mine ventilation and the other part for radon daughter level calculations. This program has been validated in an actual case study to calculate the radon concentration levels, pressures and flow rates required to maintain acceptable levels of radon concentration at each point of the mine. The required fan static pressure and the approximate energy consumption were also estimated. The results of the calculations have been evaluated and compared with a similar investigation. It was found that the calculated values are in good agreement with the corresponding values obtained using the REDES standard ventilation modelling software. The developed computer model can be used as an available tool to help in the evaluation of ventilation systems proposed by the mining authority, to assist the uranium mining industry in maintaining the health and safety of the workers underground while efficiently achieving economic production targets. It could also be used for regulatory inspection and radiation protection assessments of workers in underground mining. Also, using this model, one can effectively design, assess and manage underground mine ventilation systems. Values of radon decay product concentrations in working-level units, pressure drops, flow rates required to reach the acceptable radon concentration relative to the recommended levels at different extraction points in the mine, and fan static pressure can be estimated; these quantities are not available using other software. (author)
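    For orientation only, a minimal steady-state box-model sketch (not the program described above): with a radon source E (Bq/s), a well-mixed volume V (m3), fresh-air flow Q (m3/s) and the radon decay constant, the equilibrium concentration is roughly C = E / (Q + lambda*V), which can be inverted for the airflow needed to stay below a limit. All numbers in the example are illustrative.

        # Simple steady-state radon box model; the source, volume and limit
        # below are illustrative numbers only.
        LAMBDA_RN222 = 2.1e-6  # radon-222 decay constant, 1/s

        def equilibrium_concentration(source_bq_s, flow_m3_s, volume_m3):
            """Steady-state radon concentration (Bq/m3) in a well-mixed volume."""
            return source_bq_s / (flow_m3_s + LAMBDA_RN222 * volume_m3)

        def required_flow(source_bq_s, limit_bq_m3, volume_m3):
            """Fresh-air flow (m3/s) needed to keep the concentration at the limit."""
            return max(0.0, source_bq_s / limit_bq_m3 - LAMBDA_RN222 * volume_m3)

        print("C = %.0f Bq/m3" % equilibrium_concentration(500.0, 0.5, 2000.0))
        print("Q required = %.2f m3/s" % required_flow(500.0, 800.0, 2000.0))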

  17. Vitamin D Signaling in the Bovine Immune System: A Model for Understanding Human Vitamin D Requirements

    Directory of Open Access Journals (Sweden)

    Corwin D. Nelson

    2012-03-01

    Full Text Available The endocrine physiology of vitamin D in cattle has been rigorously investigated and has yielded information on vitamin D requirements, endocrine function in health and disease, general metabolism, and maintenance of calcium homeostasis in cattle. These results are relevant to human vitamin D endocrinology. The current debate regarding vitamin D requirements is centered on the requirements for proper intracrine and paracrine vitamin D signaling. Studies in adult and young cattle can provide valuable insight for understanding vitamin D requirements as they relate to innate and adaptive immune responses during infectious disease. In cattle, toll-like receptor recognition activates intracrine and paracrine vitamin D signaling mechanism in the immune system that regulates innate and adaptive immune responses in the presence of adequate 25-hydroxyvitamin D. Furthermore, experiments with mastitis in dairy cattle have provided in vivo evidence for the intracrine vitamin D signaling mechanism in macrophages as well as vitamin D mediated suppression of infection. Epidemiological evidence indicates that circulating concentrations above 32 ng/mL of 25-hydroxyvitamin D are necessary for optimal vitamin D signaling in the immune system, but experimental evidence is lacking for that value. Experiments in cattle can provide that evidence as circulating 25-hydroxyvitamin D concentrations can be experimentally manipulated within ranges that are normal for humans and cattle. Additionally, young and adult cattle can be experimentally infected with bacteria and viruses associated with significant diseases in both cattle and humans. Utilizing the bovine model to further delineate the immunomodulatory role of vitamin D will provide potentially valuable insights into the vitamin D requirements of both humans and cattle, especially as they relate to immune response capacity and infectious disease resistance.

  18. Meeting Human Reliability Requirements through Human Factors Design, Testing, and Modeling

    Energy Technology Data Exchange (ETDEWEB)

    R. L. Boring

    2007-06-01

    In the design of novel systems, it is important for the human factors engineer to work in parallel with the human reliability analyst to arrive at the safest achievable design that meets design team safety goals and certification or regulatory requirements. This paper introduces the System Development Safety Triptych, a checklist of considerations for the interplay of human factors and human reliability through design, testing, and modeling in product development. This paper also explores three phases of safe system development, corresponding to the conception, design, and implementation of a system.

  19. Verification of voltage/ frequency requirement for emergency diesel generator in nuclear power plant using dynamic modeling

    International Nuclear Information System (INIS)

    Hur, J.S.; Roh, M.S.

    2013-01-01

    Full-text: One major cause of plant shutdown is the loss of electrical power. The study examines the coping actions against station blackout, including the emergency diesel generator and the sequential loading of safety systems, and verifies that the emergency diesel generator meets its requirements, especially the voltage and frequency criteria, using a modeling tool. This paper also considers changes to the sequencing time and load capacity, but only to determine the electrical design margin. However, any revision of the load list must be verified by safety analysis. From this study, it is found that the new load calculation is a key factor in EDG localization and in increasing in-house capability. (author)

  20. Relation Between Listening Effort and Speech Intelligibility in Noise.

    Science.gov (United States)

    Krueger, Melanie; Schulte, Michael; Zokoll, Melanie A; Wagener, Kirsten C; Meis, Markus; Brand, Thomas; Holube, Inga

    2017-10-12

    Subjective ratings of listening effort might be applicable to estimate hearing difficulties at positive signal-to-noise ratios (SNRs) at which speech intelligibility scores are near 100%. Hence, ratings of listening effort were compared with speech intelligibility scores at different SNRs, and the benefit of hearing aids was evaluated. Two groups of listeners, 1 with normal hearing and 1 with hearing impairment, performed adaptive speech intelligibility and adaptive listening effort tests (Adaptive Categorical Listening Effort Scaling; Krueger, Schulte, Brand, & Holube, 2017) with sentences of the Oldenburg Sentence Test (Wagener, Brand, & Kollmeier, 1999a, 1999b; Wagener, Kühnel, & Kollmeier, 1999) in 4 different maskers. Model functions were fitted to the data to estimate the speech reception threshold and listening effort ratings for extreme effort and no effort. Listeners with hearing impairment showed higher rated listening effort compared with listeners with normal hearing. For listeners with hearing impairment, the rating extreme effort, which corresponds to negative SNRs, was more correlated to the speech reception threshold than the rating no effort, which corresponds to positive SNRs. A benefit of hearing aids on speech intelligibility was only verifiable at negative SNRs, whereas the effect on listening effort showed high individual differences mainly at positive SNRs. The adaptive procedure for rating subjective listening effort yields information beyond using speech intelligibility to estimate hearing difficulties and to evaluate hearing aids.
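    As a small sketch of the kind of model-function fitting mentioned above, the code below fits a logistic psychometric function of SNR to intelligibility scores and reads off the speech reception threshold; the data points and starting values are invented, and this is not the adaptive scaling procedure used in the study.

        # Fit a logistic psychometric function to invented intelligibility data
        # and estimate the speech reception threshold (SNR at 50% intelligibility).
        import numpy as np
        from scipy.optimize import curve_fit

        def psychometric(snr_db, srt_db, slope):
            return 1.0 / (1.0 + np.exp(-slope * (snr_db - srt_db)))

        snr = np.array([-12.0, -9.0, -6.0, -3.0, 0.0, 3.0])
        intelligibility = np.array([0.05, 0.20, 0.55, 0.85, 0.97, 1.00])  # hypothetical

        (srt, slope), _ = curve_fit(psychometric, snr, intelligibility, p0=[-6.0, 0.5])
        print("Estimated SRT: %.1f dB SNR (slope %.2f per dB)" % (srt, slope))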

  1. Greater effort increases perceived value in an invertebrate.

    Science.gov (United States)

    Czaczkes, Tomer J; Brandstetter, Birgit; di Stefano, Isabella; Heinze, Jürgen

    2018-03-05

    Expending effort is generally considered to be undesirable. However, both humans and vertebrates will work for a reward they could also get for free. Moreover, cues associated with high-effort rewards are preferred to low-effort associated cues. Many explanations for these counterintuitive findings have been suggested, including cognitive dissonance (self-justification) or a greater contrast in state (e.g., energy or frustration level) before and after an effort-linked reward. Here, we test whether effort expenditure also increases perceived value in ants, using both classical cue-association methods and pheromone deposition, which correlates with perceived value. In 2 separate experimental setups, we show that pheromone deposition is higher toward the reward that requires more effort: 47% more pheromone deposition was performed for rewards reached via a vertical runway (high effort) compared with ones reached via a horizontal runway (low effort), and deposition rates were 28% higher on rough (high effort) versus smooth (low effort) runways. Using traditional cue-association methods, 63% of ants trained on different surface roughness, and 70% of ants trained on different runway elevations, preferred the high-effort related cues on a Y maze. Finally, pheromone deposition to feeders requiring memorization of one path bifurcation was up to 29% higher than to an identical feeder requiring no learning. Our results suggest that effort affects value perception in ants. This effect may stem from a cognitive process, which monitors the change in a generalized hedonic state before and after reward. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  2. Mathematically modelling the power requirement for a vertical shaft mowing machine

    Directory of Open Access Journals (Sweden)

    Jorge Simón Pérez de Corcho Fuentes

    2008-09-01

    Full Text Available This work describes a mathematical model for determining the power demand for a vertical shaft mowing machine, particularly taking into account the influence of speed on cutting power, which is different from that of other models of mowers. The influence of the apparatus’ rotation and translation speeds was simulated in determining power demand. The results showed that no changes in cutting power were produced by varying the knives’ angular speed (if translation speed was constant), while cutting power increased if translation speed was increased. Variations in angular speed, however, influenced other parameters determining total power demand. Determining this vertical shaft mower’s cutting pattern led to obtaining good crop stubble quality at the mower’s lower rotation speed, hence reducing total energy requirements.
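    A hypothetical decomposition in the spirit of such power models is sketched below, purely to illustrate why cutting power tracks translation speed (via the processed crop flow) while rotor speed mainly feeds an idle/loss term; the working width, crop density, specific cutting energy and loss coefficient are invented.

        # Illustrative mower power model; all coefficients are invented.
        def power_demand_kw(translation_speed_m_s, rotor_speed_rad_s,
                            working_width_m=1.6, crop_kg_per_m2=0.4,
                            cutting_energy_kj_per_kg=3.0, idle_loss_coeff=0.0002):
            crop_flow_kg_s = translation_speed_m_s * working_width_m * crop_kg_per_m2
            p_cut = cutting_energy_kj_per_kg * crop_flow_kg_s   # kW, follows forward speed
            p_idle = idle_loss_coeff * rotor_speed_rad_s ** 2   # kW, friction and air losses
            return p_cut + p_idle

        for v in (1.0, 1.5, 2.0):
            print("v = %.1f m/s -> %.2f kW" % (v, power_demand_kw(v, rotor_speed_rad_s=150.0)))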

  3. Fast tracking ICT infrastructure requirements and design, based on Enterprise Reference Architecture and matching Reference Models

    DEFF Research Database (Denmark)

    Bernus, Peter; Baltrusch, Rob; Vesterager, Johan

    2002-01-01

    The Globemen Consortium has developed the virtual enterprise reference architecture and methodology (VERAM), based on GERAM, and developed reference models for virtual enterprise management and joint mission delivery. The planned virtual enterprise capability includes the areas of sales and marketing, global engineering, and customer relationship management. The reference models are the basis for the development of ICT infrastructure requirements. These in turn can be used for ICT infrastructure specification (sometimes referred to as 'ICT architecture'). Part of the ICT architecture is industry-wide, part of it is industry-specific and a part is specific to the domains of the joint activity that characterises the given Virtual Enterprise Network at hand. The article advocates a step-by-step approach to building virtual enterprise capability....

  4. The USEtox story: A survey of model developer visions and user requirements

    DEFF Research Database (Denmark)

    Westh, Torbjørn Bochsen; Hauschild, Michael Zwicky; Birkved, Morten

    2015-01-01

    ... we analyzed user expectations and experiences and compared them with the developers’ visions. Methods: We applied qualitative and quantitative data collection methods including an online questionnaire, semistructured user and developer interviews, and review of scientific literature. Questionnaire and interview results were analyzed in an actor-network perspective in order to understand user needs and to compare these with the developers’ visions. Requirement engineering methods, more specifically function tree, system context, and activity diagrams, were iteratively applied and structured to develop ... into LCA software and methods, (4) improve update/testing procedures, (5) strengthen communication between developers and users, and (6) extend model scope. By generalizing our recommendations to guide scientific model development in a broader context, we emphasize the need to acknowledge different levels of user ...

  5. Modeling regulatory policies associated with offshore structure removal requirements in the Gulf of Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Kaiser, Mark J. [Center for Energy Studies, Louisiana State University, Energy Coast and Environment Building, Baton Rouge, LA (United States)

    2008-07-15

    Federal regulations require that a lease in the Outer Continental Shelf of the Gulf of Mexico be cleared of all structures within one year after production on the lease ceases, but in recent years, the Minerals Management Service has begun to encourage operators to remove idle (non-producing) structures on producing leases that are no longer "economically viable". At the end of 2003, there were 2175 producing structures, 898 idle (non-producing) structures, and 440 auxiliary (never-producing) structures on 1356 active leases; and 329 idle structures and 65 auxiliary structures on 273 inactive leases. The purpose of this paper is to model the impact of alternative regulatory policies on the removal trends of structures and the inventory of idle iron, and to provide first-order estimates of the cost of each regulatory option. A description of the modeling framework and implementation results is presented. (author)

  6. A new model for evaluating maintenance energy requirements in dogs: allometric equation from 319 pet dogs.

    Science.gov (United States)

    Divol, Guilhem; Priymenko, Nathalie

    2017-01-01

    Reports concerning maintenance energy requirements (MER) in dogs are common but most of the data cover laboratory or utility dogs. This study establishes those of healthy adult pet dogs and the factors which cause these energy requirements to vary. Within the framework of a nutrition teaching exercise, each student followed a pet from his entourage and gathered accurate records of its feeding habits. Data have been restricted to healthy adult dogs with an ideal body weight (BW) which did not vary more than 5% during the study period. A total of 319 eligible records were analysed using multiple linear regression. Variation factors such as ownership, breed, sex and neutered status, bedding location, temperament and feeding habits were then analysed individually using a non-parametric model. Two models result from this study, one excluding age (r² = 0.813) and a more accurate one which takes into consideration the age in years (r² = 0.816). The second model was assessed with the main variation factors and shows that: MER (kcal/day) = k1 × k2 × k3 × k4 × k5 × 128 × BW^0.740 × age^−0.050 (r² = 0.836), with k1 the effect of the breed, k2 the effect of sex and neutered status, k3 the effect of bedding location, k4 the effect of temperament and k5 the effect of the type of feed. The resulting model is very similar to the recommendations made by the National Research Council (2006) but a greater accuracy was obtained using age raised to a negative power, as demonstrated in human nutrition.
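    The fitted equation transcribes directly into a small helper, shown below; the five multiplicative k-factors are study-specific adjustment coefficients that are not listed in the abstract, so they default to 1 here and any other values would be assumptions.

        # Allometric MER equation from the abstract; the k-factors default to 1
        # because the breed/sex/bedding/temperament/feed coefficients are not given.
        def mer_kcal_per_day(body_weight_kg, age_years, k_breed=1.0, k_sex=1.0,
                             k_bedding=1.0, k_temperament=1.0, k_feed=1.0):
            return (k_breed * k_sex * k_bedding * k_temperament * k_feed
                    * 128.0 * body_weight_kg ** 0.740 * age_years ** -0.050)

        # Example: a 20 kg, 4-year-old dog with all adjustment factors left at 1.
        print("%.0f kcal/day" % mer_kcal_per_day(20.0, 4.0))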

  7. Analysis Efforts Supporting NSTX Upgrades

    International Nuclear Information System (INIS)

    Zhang, H.; Titus, P.; Rogoff, P.; Zolfaghari, A.; Mangra, D.; Smith, M.

    2010-01-01

    The National Spherical Torus Experiment (NSTX) is a low aspect ratio, spherical torus (ST) configuration device which is located at Princeton Plasma Physics Laboratory (PPPL). This device is presently being upgraded to enhance its physics by doubling the TF field to 1 Tesla and increasing the plasma current to 2 Mega-amperes. The upgrades include a replacement of the centerstack and the addition of a second neutral beam. The upgrade analyses have two missions. The first is to support the design of new components, principally the centerstack; the second is to qualify existing NSTX components for higher loads, which will increase by a factor of four. Cost efficiency was a design goal for new equipment qualification and for reanalysis of the existing components. Showing that older components can sustain the increased loads has been a challenging effort in which designs had to be developed that would limit loading on weaker components and would minimize the extent of modifications needed. Two areas representing this effort have been chosen to be described in more detail: analysis of the current distribution in the new TF inner legs, and, second, analysis of the out-of-plane support of the existing TF outer legs.

  8. APS Education and Diversity Efforts

    Science.gov (United States)

    Prestridge, Katherine; Hodapp, Theodore

    2015-11-01

    American Physical Society (APS) has a wide range of education and diversity programs and activities, including programs that improve physics education, increase diversity, provide outreach to the public, and impact public policy. We present the latest programs spearheaded by the Committee on the Status of Women in Physics (CSWP), with highlights from other diversity and education efforts. The CSWP is working to increase the fraction of women in physics, understand and implement solutions for gender-specific issues, enhance professional development opportunities for women in physics, and remedy issues that impact gender inequality in physics. The Conferences for Undergraduate Women in Physics, Professional Skills Development Workshops, and our new Professional Skills program for students and postdocs are all working towards meeting these goals. The CSWP also has site visit and conversation visit programs, where department chairs request that the APS assess the climate for women in their departments or facilitate climate discussions. APS also has two significant programs to increase participation by underrepresented minorities (URM). The newest program, the APS National Mentoring Community, is working to provide mentoring to URM undergraduates, and the APS Bridge Program is an established effort that is dramatically increasing the number of URM PhDs in physics.

  9. Infrastructure requirement of knowledge management system model of statistical learning tool (SLT) for education community

    Science.gov (United States)

    Abdullah, Rusli; Samah, Bahaman Abu; Bolong, Jusang; D'Silva, Jeffrey Lawrence; Shaffril, Hayrol Azril Mohamed

    2014-09-01

    Today, teaching and learning (T&L) using technology as a tool is becoming more important, especially in the field of statistics as a part of the subject matter in the higher education system environment. Even though there are many types of statistical learning tool (SLT) technology which can be used to support and enhance the T&L environment, there is a lack of a common standard knowledge management portal for guidance, especially in relation to the infrastructure requirements of SLT in servicing the community of users (CoU), such as educators, students and other parties who are interested in using this technology as a tool for their T&L. Therefore, there is a need for a common standard infrastructure requirement of a knowledge portal to help the CoU in managing statistical knowledge: acquiring, storing, disseminating and applying the statistical knowledge for their specific purposes. Furthermore, by having this infrastructure requirement of a knowledge portal model of SLT as guidance in promoting knowledge of best practice among the CoU, it can also enhance the quality and productivity of their work towards excellence of statistical knowledge application in the education system environment.

  10. [In-depth interviews and the Kano model to determine user requirements in a burns unit].

    Science.gov (United States)

    González-Revaldería, J; Holguín-Holgado, P; Lumbreras-Marín, E; Núñez-López, G

    To determine the healthcare requirements of patients in a Burns Unit, using qualitative techniques such as in-depth personal interviews and Kano's methodology. Qualitative methodology using in-depth personal interviews (12 patients), Kano's conceptual model, and the SERVQHOS questionnaire (24 patients). All patients had been hospitalised in the last 12 months in the Burns Unit. Using Kano's methodology, service attributes were grouped by affinity diagrams and classified as follows: must-be, attractive (unexpected, great satisfaction), and one-dimensional (linked to the degree of functionality of the service). The outcomes were compared with those obtained with the SERVQHOS questionnaire. From the analysis of the in-depth interviews, 11 requirements were obtained, referring to hotel aspects, information, the need for a closer staff relationship, and organisational aspects. The attributes classified as must-be were free television and automatic TV disconnection at midnight. Those classified as attractive were: an individual room for more privacy, information about dressing change times in order to avoid anxiety, and additional staff for in-patients. The results were complementary to those obtained with the SERVQHOS questionnaire. In-depth personal interviews provide extra knowledge about patient requirements, complementing the information obtained with questionnaires. With this methodology, more active patient participation is achieved and the companion's opinion is also taken into account. Copyright © 2016 SECA. Published by Elsevier España, S.L.U. All rights reserved.
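
    The Kano classification in this study was derived from interview material grouped with affinity diagrams; the sketch below instead illustrates the conventional survey-style Kano tabulation with a partial evaluation table, and the attribute and respondent answers are invented for illustration only.

```python
# Minimal sketch of Kano classification from paired functional/dysfunctional answers.
# The evaluation table is a partial version of the standard Kano table; the attribute
# and the responses are hypothetical, not data from the study.
KANO_TABLE = {
    ("like", "dislike"): "One-dimensional",
    ("like", "live-with"): "Attractive",
    ("like", "neutral"): "Attractive",
    ("like", "must-be"): "Attractive",
    ("must-be", "dislike"): "Must-be",
    ("neutral", "dislike"): "Must-be",
    ("live-with", "dislike"): "Must-be",
    ("like", "like"): "Questionable",
    ("dislike", "like"): "Reverse",
}

def classify(functional: str, dysfunctional: str) -> str:
    """Map one respondent's answer pair to a Kano category (default: Indifferent)."""
    return KANO_TABLE.get((functional, dysfunctional), "Indifferent")

# Hypothetical responses for one attribute, e.g. "individual room for more privacy"
responses = [("like", "neutral"), ("like", "live-with"), ("like", "must-be")]
votes = [classify(f, d) for f, d in responses]
category = max(set(votes), key=votes.count)   # most frequent category wins
print(category)                               # -> Attractive
```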

  11. Model-based Assessment for Balancing Privacy Requirements and Operational Capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Knirsch, Fabian [Salzburg Univ. (Austria); Engel, Dominik [Salzburg Univ. (Austria); Frincu, Marc [Univ. of Southern California, Los Angeles, CA (United States); Prasanna, Viktor [Univ. of Southern California, Los Angeles, CA (United States)

    2015-02-17

    The smart grid changes the way energy is produced and distributed. In addition, both energy and information are exchanged bidirectionally among the participating parties. Therefore, heterogeneous systems have to cooperate effectively in order to achieve a common high-level use case, such as smart metering for billing or demand response for load curtailment. Furthermore, a substantial amount of personal data is often needed to achieve that goal. Capturing and processing personal data in the smart grid increases customer concerns about privacy; in addition, certain statutory and operational requirements regarding privacy-aware data processing and storage have to be met. An increase in privacy constraints, however, often limits the operational capabilities of the system. In this paper, we present an approach that automates the process of finding an optimal balance between privacy requirements and operational requirements in a smart grid use case and application scenario. This is achieved by formally describing use cases in an abstract model and by finding an algorithm that determines the optimum balance by forward mapping privacy and operational impacts. For this optimal balancing algorithm, both a numeric approximation and, if feasible, an analytic assessment are presented and investigated. The system is evaluated by applying the tool to a real-world use case from the University of Southern California (USC) microgrid.
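
    The abstract model and balancing algorithm themselves are not reproduced here; the sketch below only conveys the general idea of a numeric search for a privacy/operations trade-off, using made-up impact functions of a single, hypothetical data-granularity parameter.

```python
# Illustrative numeric approximation of a privacy/operations balance (not the
# authors' algorithm): grid-search the granularity level that maximizes
# operational value minus privacy cost. Both impact functions are assumed.
import numpy as np

granularity = np.linspace(0.0, 1.0, 101)   # 0 = fully aggregated, 1 = finest detail

def privacy_impact(g):
    return g ** 2                          # assumed: privacy cost grows with granularity

def operational_value(g):
    return 1.0 - np.exp(-4.0 * g)          # assumed: operational benefit saturates

utility = operational_value(granularity) - privacy_impact(granularity)
g_opt = granularity[np.argmax(utility)]
print(f"balanced granularity ~ {g_opt:.2f}")
```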

  12. Integrated Data Capturing Requirements for 3d Semantic Modelling of Cultural Heritage: the Inception Protocol

    Science.gov (United States)

    Di Giulio, R.; Maietti, F.; Piaia, E.; Medici, M.; Ferrari, F.; Turillazzi, B.

    2017-02-01

    The generation of high-quality 3D models can still be very time-consuming and expensive, and the outcome of digital reconstructions is frequently provided in formats that are not interoperable and therefore cannot be easily accessed. This challenge is even more crucial for complex architectures and large heritage sites, which involve a large amount of data to be acquired, managed and enriched by metadata. In this framework, the ongoing EU-funded project INCEPTION - Inclusive Cultural Heritage in Europe through 3D semantic modelling - proposes a workflow aimed at the achievement of efficient 3D digitization methods, post-processing tools for enriched semantic modelling, and web-based solutions and applications to ensure wide access for experts and non-experts. In order to face these challenges and to start solving the issue of the large amount of captured data and the time-consuming processes in the production of 3D digital models, an Optimized Data Acquisition Protocol (DAP) has been set up. Its purpose is to guide the processes of digitization of cultural heritage, respecting the needs, requirements and specificities of cultural assets.

  13. Corporate Data Network (CDN) data requirements task. Enterprise Model. Volume 1

    International Nuclear Information System (INIS)

    1985-11-01

    The NRC has initiated a multi-year program to centralize its information processing in a Corporate Data Network (CDN). The new information processing environment will include shared databases, telecommunications, office automation tools, and state-of-the-art software. Touche Ross and Company was contracted to perform a general data requirements analysis for shared databases and to develop a preliminary plan for implementation of the CDN concept. The Enterprise Model (Vol. 1) provided the NRC with agency-wide information requirements in the form of data entities and organizational demand patterns as the basis for clustering the entities into logical groups. The Data Dictionary (Vol. 2) provided the NRC with definitions and example attributes and properties for each entity. The Data Model (Vol. 3) defined logical databases and entity relationships within and between databases. The Preliminary Strategic Data Plan (Vol. 4) prioritized the development of databases and included a workplan and approach for implementation of the shared database component of the Corporate Data Network.

  14. THE 3C COOPERATION MODEL APPLIED TO THE CLASSICAL REQUIREMENT ANALYSIS

    Directory of Open Access Journals (Sweden)

    Vagner Luiz Gava

    2012-08-01

    Full Text Available Aspects related to the users' cooperative work are not considered in the traditional approach of software engineering, since the user is viewed independently of his/her workplace environment or group, with the individual model generalized to the study of the collective behavior of all users. This work proposes a software requirements process to address issues involving cooperative work in information systems that provide distributed coordination of the users' actions, where the communication among them occurs indirectly through the data entered while using the software. To achieve this goal, this research uses ergonomics, the 3C cooperation model, awareness and software engineering concepts. Action-research is used as the research methodology, applied in three cycles during the development of a corporate workflow system in a technological research company. This article discusses the third cycle, which corresponds to the process that deals with the refinement of the cooperative work requirements with the software in actual use in the workplace, where the inclusion of a computer system changes the users' workplace from face-to-face interaction to interaction mediated by the software. The results showed that a higher degree of users' awareness of their own activities and of other system users contributes to a decrease in their errors and in the inappropriate use of the system.

  15. ANALYSIS OF RAINFALL DATA TO ESTIMATE RAIN CONTRIBUTION TOWARDS CROP WATER REQUIREMENT USING CROPWAT MODEL

    Directory of Open Access Journals (Sweden)

    Tahir Saeed Laghari

    2014-12-01

    Full Text Available A study was carried out to analyse rainfall data in order to estimate its contribution towards crop water requirements. Rainfall and climatic data were collected from meteorological stations in the Faisalabad region - the C.P. UAF rain gauge (A), AARI (B), CAA (C) and WAPDA (D) - and reserved for cross-validation. The test station's (A) rainfall data were subjected to the double mass curve technique to check their consistency with respect to the other rainfall stations (B, C and D) in the area. The results of the double mass curve technique confirmed the consistency of the gauge station of interest, because there was no break in the curve. This consistent data set was then used to determine effective rainfall. The reference evapotranspiration (ETo) was established using the Penman-Monteith method within the CROPWAT model, and its relationship with other parameters such as sunshine hours, wind speed, maximum and minimum temperature, rainfall and humidity was determined. It was found that ETo is higher from April to September, due to the increase in temperature, and lower in the remaining months. The data were then placed in the model to obtain the crop water and irrigation requirements of representative crops (wheat and maize) from the district. We estimated that rainfall can contribute 7.5% of the actual irrigation requirement per year for wheat and 15.5% for maize, so that 92.5% and 84.5% of the requirement must be met by irrigation for wheat and maize, respectively.
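
    As a worked illustration of the double mass curve check referred to above, the sketch below accumulates totals from a test gauge against the cumulative mean of reference gauges and fits one straight line; the rainfall figures are invented, and in practice the check looks for a persistent break in slope rather than a single R² value.

```python
# Double mass curve consistency check with made-up annual rainfall totals (mm)
# for a test gauge (A) and three reference gauges (B, C, D).
import numpy as np

rain_A = np.array([620, 580, 710, 650, 690, 600, 640])            # test station
rain_ref = np.array([[600, 640, 610], [560, 590, 570], [700, 720, 690],
                     [640, 660, 650], [680, 700, 690], [590, 610, 600],
                     [630, 650, 640]])                              # stations B, C, D

cum_A = np.cumsum(rain_A)                      # cumulative rainfall at the test gauge
cum_ref = np.cumsum(rain_ref.mean(axis=1))     # cumulative mean of the reference gauges

# A single straight-line fit; a marked change in slope along the curve would indicate
# an inconsistency in the test record (here R^2 is used as a crude summary).
slope, intercept = np.polyfit(cum_ref, cum_A, 1)
r2 = np.corrcoef(cum_ref, cum_A)[0, 1] ** 2
print(f"slope = {slope:.3f}, R^2 = {r2:.4f}")
```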

  16. Bioenergetics model for estimating food requirements of female Pacific walruses (Odobenus rosmarus divergens)

    Science.gov (United States)

    Noren, S.R.; Udevitz, M.S.; Jay, C.V.

    2012-01-01

    Pacific walruses Odobenus rosmarus divergens use sea ice as a platform for resting, nursing, and accessing extensive benthic foraging grounds. The extent of summer sea ice in the Chukchi Sea has decreased substantially in recent decades, causing walruses to alter habitat use and activity patterns, which could affect their energy requirements. We developed a bioenergetics model to estimate the caloric demand of female walruses, accounting for maintenance, growth, activity (active in-water and hauled-out resting), molt, and reproductive costs. Estimates for non-reproductive females 0–12 yr old (65–810 kg) ranged from 16,359 to 68,960 kcal/d (74–257 kcal/d per kg) for years with readily available sea ice, for which we assumed animals spent 83% of their time in water. This translated into the energy content of 3,200–5,960 clams per day, equivalent to 7–8% and 9–14% of body mass per day for 5–12 and 2–4 yr olds, respectively. Estimated consumption rates of 12 yr old females were minimally affected by pregnancy, but lactation had a large impact, increasing consumption rates to 15% of body mass per day. Increasing the proportion of time in water to 93%, as might happen if walruses were required to spend more time foraging during ice-free periods, increased daily caloric demand by 6–7% for non-lactating females. We provide the first bioenergetics-based estimates of energy requirements for walruses and a first step towards establishing bioenergetic linkages between demography and prey requirements that can ultimately be used in predicting this population's response to environmental change.
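
    The structure of such a budget is additive; the sketch below shows only that arithmetic, with every component value, the body mass and the per-clam energy content chosen arbitrarily rather than taken from the published model.

```python
# Additive bioenergetics budget of the general kind described above.
# All numbers below are hypothetical placeholders, not values from the study.

def daily_caloric_demand(maintenance, growth, activity, molt=0.0, reproduction=0.0):
    """Daily energy requirement (kcal/d) as the sum of cost components."""
    return maintenance + growth + activity + molt + reproduction

body_mass_kg = 400.0                                    # hypothetical female walrus
demand = daily_caloric_demand(maintenance=30_000, growth=5_000, activity=12_000)
kcal_per_clam = 10.0                                    # assumed energy content per clam
print(f"{demand / body_mass_kg:.0f} kcal/kg/d, about {demand / kcal_per_clam:.0f} clams/d")
```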

  17. Diverse Secreted Effectors Are Required for Salmonella Persistence in a Mouse Infection Model

    Energy Technology Data Exchange (ETDEWEB)

    Kidwai, Afshan S.; Mushamiri, Ivy T.; Niemann, George; Brown, Roslyn N.; Adkins, Joshua N.; Heffron, Fred

    2013-08-12

    Salmonella enterica serovar Typhimurium causes typhoid-like disease in mice and is a model of typhoid fever in humans. One of the hallmarks of typhoid is persistence, the ability of the bacteria to survive in the host weeks after infection. Virulence factors called effectors facilitate this process by direct transfer to the cytoplasm of infected cells thereby subverting cellular processes. Secretion of effectors to the cell cytoplasm takes place through multiple routes, including two separate type III secretion (T3SS) apparati as well as outer membrane vesicles. The two T3SS are encoded on separate pathogenicity islands, SPI-1 and -2, with SPI-1 more strongly associated with the intestinal phase of infection, and SPI-2 with the systemic phase. Both T3SS are required for persistence, but the effectors required have not been systematically evaluated. In this study, mutations in 48 described effectors were tested for persistence. We replaced each effector with a specific DNA barcode sequence by allelic exchange and co-infected with a wild-type reference to calculate the ratio of wild-type parent to mutant at different times after infection. The competitive index (CI) was determined by quantitative PCR in which primers that correspond to the barcode were used for amplification. Mutations in all but seven effectors reduced persistence demonstrating that most effectors were required. One exception was CigR, a recently discovered effector that is widely conserved throughout enteric bacteria. Deletion of cigR increased lethality, suggesting that it may be an anti-virulence factor. The fact that almost all Salmonella effectors are required for persistence argues against redundant functions. This is different from effector repertoires in other intracellular pathogens such as Legionella.
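
    The competitive index described above reduces to a ratio of ratios between the barcoded mutant and the wild-type reference; the sketch below shows that calculation with invented qPCR copy numbers (the values and the interpretation threshold are illustrative only).

```python
# Competitive index (CI) from barcode qPCR quantities: the mutant/wild-type ratio
# recovered from the host, normalized by the ratio in the inoculum. Values are invented.

def competitive_index(mut_out, wt_out, mut_in, wt_in):
    """CI = (mutant/wild-type recovered) / (mutant/wild-type inoculated)."""
    return (mut_out / wt_out) / (mut_in / wt_in)

ci = competitive_index(mut_out=2.0e4, wt_out=8.0e5, mut_in=5.0e5, wt_in=5.0e5)
print(f"CI = {ci:.3f}")   # CI well below 1 suggests the deleted effector aids persistence
```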

  18. Diverse secreted effectors are required for Salmonella persistence in a mouse infection model.

    Directory of Open Access Journals (Sweden)

    Afshan S Kidwai

    Full Text Available Salmonella enterica serovar Typhimurium causes typhoid-like disease in mice and is a model of typhoid fever in humans. One of the hallmarks of typhoid is persistence, the ability of the bacteria to survive in the host weeks after infection. Virulence factors called effectors facilitate this process by direct transfer to the cytoplasm of infected cells thereby subverting cellular processes. Secretion of effectors to the cell cytoplasm takes place through multiple routes, including two separate type III secretion (T3SS) apparati as well as outer membrane vesicles. The two T3SS are encoded on separate pathogenicity islands, SPI-1 and -2, with SPI-1 more strongly associated with the intestinal phase of infection, and SPI-2 with the systemic phase. Both T3SS are required for persistence, but the effectors required have not been systematically evaluated. In this study, mutations in 48 described effectors were tested for persistence. We replaced each effector with a specific DNA barcode sequence by allelic exchange and co-infected with a wild-type reference to calculate the ratio of wild-type parent to mutant at different times after infection. The competitive index (CI) was determined by quantitative PCR in which primers that correspond to the barcode were used for amplification. Mutations in all but seven effectors reduced persistence demonstrating that most effectors were required. One exception was CigR, a recently discovered effector that is widely conserved throughout enteric bacteria. Deletion of cigR increased lethality, suggesting that it may be an anti-virulence factor. The fact that almost all Salmonella effectors are required for persistence argues against redundant functions. This is different from effector repertoires in other intracellular pathogens such as Legionella.

  19. Termination of prehospital resuscitative efforts

    DEFF Research Database (Denmark)

    Mikkelsen, Søren; Schaffalitzky de Muckadell, Caroline; Binderup, Lars Grassmé

    2017-01-01

    BACKGROUND: Discussions on ethical aspects of life-and-death decisions within the hospital are often made in plenary. The prehospital physician, however, may be faced with ethical dilemmas in life-and-death decisions when time-critical decisions to initiate or refrain from resuscitative efforts...... need to be taken without the possibility to discuss matters with colleagues. Little is known whether these considerations regarding ethical issues in crucial life-and-death decisions are documented prehospitally. This is a review of the ethical considerations documented in the prehospital medical...... The medical records with possible documentation of ethical issues were independently reviewed by two philosophers in order to identify explicit ethical or philosophical considerations pertaining to the decision to resuscitate or not. RESULTS: In total, 1275 patients were either declared dead at the scene......

  20. Summary report of a seminar on geosphere modelling requirements of deep disposal of low and intermediate level radioactive wastes

    International Nuclear Information System (INIS)

    Piper, D.; Paige, R.W.; Broyd, T.W.

    1989-02-01

    A seminar on the geosphere modelling requirements of deep disposal of low and intermediate level radioactive wastes was organised by WS Atkins Engineering Sciences as part of Her Majesty's Inspectorate of Pollution's Radioactive Waste Assessment Programme. The objectives of the seminar were to review geosphere modelling capabilities and prioritise, if possible, any requirements for model development. Summaries of the presentations and subsequent discussions are given in this report. (author)

  1. Dynamic Computational Model of Symptomatic Bacteremia to Inform Bacterial Separation Treatment Requirements.

    Directory of Open Access Journals (Sweden)

    Sinead E Miller

    Full Text Available The rise of multi-drug resistance has decreased the effectiveness of antibiotics, which has led to increased mortality rates associated with symptomatic bacteremia, or bacterial sepsis. To combat decreasing antibiotic effectiveness, extracorporeal bacterial separation approaches have been proposed to capture and separate bacteria from blood. However, bacteremia is dynamic and involves host-pathogen interactions across various anatomical sites. We developed a mathematical model that quantitatively describes the kinetics of pathogenesis and progression of symptomatic bacteremia under various conditions, including bacterial separation therapy, to better understand disease mechanisms and quantitatively assess the biological impact of bacterial separation therapy. Model validity was tested against experimental data from published studies. This is the first multi-compartment model of symptomatic bacteremia in mammals that includes extracorporeal bacterial separation and antibiotic treatment, separately and in combination. The addition of an extracorporeal bacterial separation circuit reduced the predicted time of total bacteria clearance from the blood of an immunocompromised rodent by 49%, compared to antibiotic treatment alone. Implementation of bacterial separation therapy resulted in predicted multi-drug resistant bacterial clearance from the blood of a human in 97% less time than antibiotic treatment alone. The model also proposes a quantitative correlation between time-dependent bacterial load among tissues and bacteremia severity, analogous to the well-known 'area under the curve' for characterization of drug efficacy. The engineering-based mathematical model developed may be useful for informing the design of extracorporeal bacterial separation devices. This work enables the quantitative identification of the characteristics required of an extracorporeal bacteria separation device to provide biological benefit. These devices will potentially
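
    The published model is a multi-compartment system fitted to experimental data; the toy sketch below conveys only the basic comparison it enables, treating blood as a single compartment with assumed growth, antibiotic-kill and device-clearance rates.

```python
# Toy one-compartment sketch (not the published multi-compartment model): blood
# bacterial load under antibiotic killing, with and without an extracorporeal
# separation circuit. Every rate constant below is an assumed illustrative value.
import numpy as np
from scipy.integrate import solve_ivp

r = 0.5                      # assumed net bacterial growth rate in blood (1/h)
k_abx = 0.6                  # assumed first-order antibiotic kill rate (1/h)
Q, V, eff = 60.0, 20.0, 0.8  # circuit flow (mL/h), blood volume (mL), capture efficiency
k_sep = eff * Q / V          # first-order removal by the separation device (1/h)

def dBdt(t, B, with_device):
    removal = k_abx + (k_sep if with_device else 0.0)
    return (r - removal) * B

for with_device in (False, True):
    sol = solve_ivp(dBdt, (0, 200), [1e6], args=(with_device,), dense_output=True)
    times = np.linspace(0, 200, 2001)
    cleared = times[sol.sol(times)[0] < 1.0]
    t_clear = round(float(cleared[0]), 1) if cleared.size else None
    print(f"device={with_device}: predicted clearance time ~ {t_clear} h")
```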

  2. Estimating Irrigation Water Requirements using MODIS Vegetation Indices and Inverse Biophysical Modeling

    Science.gov (United States)

    Imhoff, Marc L.; Bounoua, Lahouari; Harriss, Robert; Wells, Gordon; Glantz, Michael; Dukhovny, Victor A.; Orlovsky, Leah

    2007-01-01

    An inverse process approach using satellite-driven (MODIS) biophysical modeling was used to quantitatively assess water resource demand in semi-arid and arid agricultural lands by comparing the carbon and water flux modeled under both equilibrium (in balance with prevailing climate) and non-equilibrium (irrigated) conditions. Since satellite observations of irrigated areas show higher leaf area indices (LAI) than is supportable by local precipitation, we postulate that the degree to which irrigated lands vary from equilibrium conditions is related to the amount of irrigation water used. For an observation year we used MODIS vegetation indices, local climate data, and the SiB2 photosynthesis-conductance model to examine the relationship between climate and the water stress function for a given grid-cell and observed leaf area. To estimate the minimum amount of supplemental water required for an observed cell, we added enough precipitation to the prevailing climatology at each time step to minimize the water stress function and bring the soil to field capacity. The experiment was conducted on irrigated lands on the U.S. Mexico border and Central Asia and compared to estimates of irrigation water used.
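
    A drastically simplified, bucket-model version of the inverse idea described above is sketched below: at each time step just enough water is added to return the soil to field capacity, and the total added water is read as the supplemental (irrigation) demand. The storage capacity, rainfall and evapotranspiration series are invented stand-ins for the SiB2-based calculation.

```python
# Simplified bucket-model illustration of the inverse approach: add the water needed
# to refill the soil to field capacity at every step and sum it. All values invented.
field_capacity = 150.0                      # mm of plant-available soil water
soil = field_capacity
rain = [2, 0, 5, 0, 0, 1, 0, 3, 0, 0]       # mm/day, prevailing climatology
et   = [6, 7, 6, 8, 7, 6, 7, 8, 7, 6]       # mm/day, modeled evapotranspiration

supplemental = 0.0
for p, e in zip(rain, et):
    soil += p - e                           # water balance under climatology alone
    deficit = max(0.0, field_capacity - soil)
    supplemental += deficit                 # irrigation needed to refill to capacity
    soil += deficit
print(f"estimated supplemental water ~ {supplemental:.0f} mm over the period")
```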

  3. Cognitive dissonance in children: justification of effort or contrast?

    Science.gov (United States)

    Alessandri, Jérôme; Darcheville, Jean-Claude; Zentall, Thomas R

    2008-06-01

    Justification of effort is a form of cognitive dissonance in which the subjective value of an outcome is directly related to the effort that went into obtaining it. However, it is likely that in social contexts (such as the requirements for joining a group) an inference can be made (perhaps incorrectly) that an outcome that requires greater effort to obtain in fact has greater value. Here we present evidence that a cognitive dissonance effect can be found in children under conditions that offer better control for the social value of the outcome. This effect is quite similar to contrast effects that recently have been studied in animals. We suggest that contrast between the effort required to obtain the outcome and the outcome itself provides a more parsimonious account of this phenomenon and perhaps other related cognitive dissonance phenomena as well. Research will be needed to identify cognitive dissonance processes that are different from contrast effects of this kind.

  4. Coercion and polio eradication efforts in Moradabad.

    Science.gov (United States)

    Rentmeester, Christy A; Dasgupta, Rajib; Feemster, Kristen A; Packard, Randall M

    2014-01-01

    We introduce the problem of vaccine coercion as reported in Moradabad, India. We offer commentary and critical analysis on ethical complexities at the intersection of global public health and regional political strife and relate them to broader vaccine goals. We draw upon a historical example from malaria vaccine efforts, focusing specifically on ethical and health justice issues expressed through the use of coercion in vaccine administration. We suggest how coercion is indicative of failed leadership in public health and consider community-based collaborations as models for cultivating local investment and trust in vaccination campaigns and for success in global public health initiatives.

  5. Assessment for Improvement: Two Models for Assessing a Large Quantitative Reasoning Requirement

    Directory of Open Access Journals (Sweden)

    Mary C. Wright

    2015-03-01

    Full Text Available We present two models for assessment of a large and diverse quantitative reasoning (QR) requirement at the University of Michigan. These approaches address two key challenges in assessment: (1) dissemination of findings for curricular improvement and (2) resource constraints associated with measurement of large programs. Approaches we present for data collection include convergent validation of self-report surveys, as well as use of mixed methods and learning analytics. Strategies we present for dissemination of findings include meetings with instructors to share data and best practices, sharing of results through social media, and use of easily accessible dashboards. These assessment approaches may be of particular interest to universities with large numbers of students engaging in a QR experience, projects that involve multiple courses with diverse instructional goals, or those who wish to promote evidence-based curricular improvement.

  6. Assessment of sustainable yield and optimum fishing effort for the ...

    African Journals Online (AJOL)

    The tilapia (Oreochromis niloticus, L. 1758) stock of Lake Hawassa, Ethiopia, was assessed to estimate sustainable yield (MSY) and optimum fishing effort (fopt) using length-based analytical models (Jones' cohort analysis and Thompson and Bell). Pertinent data (length, weight, catch, effort, etc.) were collected on a daily ...

  7. Optimal growth of Lactobacillus casei in a Cheddar cheese ripening model system requires exogenous fatty acids.

    Science.gov (United States)

    Tan, W S; Budinich, M F; Ward, R; Broadbent, J R; Steele, J L

    2012-04-01

    Flavor development in ripening Cheddar cheese depends on complex microbial and biochemical processes that are difficult to study in natural cheese. Thus, our group has developed Cheddar cheese extract (CCE) as a model system to study these processes. In previous work, we found that CCE supported growth of Lactobacillus casei, one of the most prominent nonstarter lactic acid bacteria (NSLAB) species found in ripening Cheddar cheese, to a final cell density of 10^8 cfu/mL at 37°C. However, when similar growth experiments were performed at 8°C in CCE derived from 4-mo-old cheese (4mCCE), the final cell densities obtained were only about 10^6 cfu/mL, which is at the lower end of the range of the NSLAB population expected in ripening Cheddar cheese. Here, we report that addition of Tween 80 to CCE resulted in a significant increase in the final cell density of L. casei during growth at 8°C and produced concomitant changes in cytoplasmic membrane fatty acid (CMFA) composition. Although the effect was not as dramatic, addition of milk fat or a monoacylglycerol (MAG) mixture based on the MAG profile of milk fat to 4mCCE also led to an increased final cell density of L. casei in CCE at 8°C and changes in CMFA composition. These observations suggest that optimal growth of L. casei in CCE at low temperature requires supplementation with a source of fatty acids (FA). We hypothesize that L. casei incorporates environmental FA into its CMFA, thereby reducing its energy requirement for growth. The exogenous FA may then be modified or supplemented with FA from de novo synthesis to arrive at a CMFA composition that yields the functionality (i.e., viscosity) required for growth in specific conditions. Additional studies utilizing the CCE model to investigate microbial contributions to cheese ripening should be conducted in CCE supplemented with 1% milk fat. Copyright © 2012 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  8. Model Based User's Access Requirement Analysis of E-Governance Systems

    Science.gov (United States)

    Saha, Shilpi; Jeon, Seung-Hwan; Robles, Rosslin John; Kim, Tai-Hoon; Bandyopadhyay, Samir Kumar

    The strategic and contemporary importance of e-governance has been recognized across the world. In India too, various ministries of the Govt. of India and State Governments have taken e-governance initiatives to provide e-services to citizens and the businesses they serve. To achieve the mission objectives and make such e-governance initiatives successful, it is necessary to improve the trust and confidence of the stakeholders. It is assumed that the delivery of government services will share the same public network information that is being used in the community at large. In particular, the Internet will be the principal means by which public access to government and government services will be achieved. To provide the security measures, the main aim is to identify the users' access requirements for the stakeholders and then to analyse them according to the models of Nath's approach. Based on this analysis, the Govt. can also set security standards based on the e-governance models. Thus there will be fewer human errors and less bias. This analysis leads to the security architecture of the specific G2C application.

  9. A clinically relevant model of osteoinduction: a process requiring calcium phosphate and BMP/Wnt signalling.

    Science.gov (United States)

    Eyckmans, J; Roberts, S J; Schrooten, J; Luyten, F P

    2010-06-01

    In this study, we investigated a clinically relevant model of in vivo ectopic bone formation utilizing human periosteum derived cells (HPDCs) seeded in a Collagraft carrier and explored the mechanisms by which this process is driven. Bone formation occurred after eight weeks when a minimum of one million HPDCs was loaded on Collagraft carriers and implanted subcutaneously in NMRI nu/nu mice. De novo bone matrix, mainly secreted by the HPDCs, was found juxta-proximal of the calcium phosphate (CaP) granules suggesting that CaP may have triggered the 'osteoinductive program'. Indeed, removal of the CaP granules by ethylenediaminetetraacetic acid decalcification prior to cell seeding and implantation resulted in loss of bone formation. In addition, inhibition of endogenous bone morphogenetic protein and Wnt signalling by overexpression of the secreted antagonists Noggin and Frzb, respectively, also abrogated osteoinduction. Proliferation of the engrafted HPDCs was strongly reduced in the decalcified scaffolds or when seeded with adenovirus-Noggin/Frzb transduced HPDCs indicating that cell division of the engrafted HPDCs is required for the direct bone formation cascade. These data suggest that this model of bone formation is similar to that observed during physiological intramembranous bone development and may be of importance when investigating tissue engineering strategies.

  10. The Design of Effective ICT-Supported Learning Activities: Exemplary Models, Changing Requirements, and New Possibilities

    Directory of Open Access Journals (Sweden)

    Cameron Richards

    2005-01-01

    Full Text Available Despite the imperatives of policy and rhetoric about their integration in formal education, Information and Communication Technologies (ICTs are often used as an "add-on" in many classrooms and in many lesson plans. Nevertheless, many teachers find that interesting and well-planned tasks, projects, and resources provide a key to harnessing the educational potential of digital resources, Internet communications and interactive multimedia to engage the interest, interaction, and knowledge construction of young learners. To the extent that such approaches go beyond and transform traditional "transmission" models of teaching and formal lesson planning, this paper investigates the changing requirements and new possibilities represented by the challenge of integrating ICTs in education in a way which at the same time connects more effectively with both the specific contents of the curriculum and the various stages and elements of the learning process. Case studies from teacher education foundation courses provide an exemplary focus of inquiry in order to better link relevant new theories or models of learning with practice, to build upon related learner-centered strategies for integrating ICT resources and tools, and to incorporate interdependent functions of learning as information access, communication, and applied interactions. As one possible strategy in this direction, the concept of an "ICT-supported learning activity" suggests the need for teachers to approach this increasing challenge more as "designers" of effective and integrated learning rather than mere "transmitters" of skills or information through an add-on use of ICTs.

  11. Economic growth, biodiversity loss and conservation effort.

    Science.gov (United States)

    Dietz, Simon; Adger, W Neil

    2003-05-01

    This paper investigates the relationship between economic growth, biodiversity loss and efforts to conserve biodiversity using a combination of panel and cross-section data. If economic growth is a cause of biodiversity loss through habitat transformation and other means, then we would expect an inverse relationship. But if higher levels of income are associated with increasing real demand for biodiversity conservation, then investment to protect remaining diversity should grow and the rate of biodiversity loss should slow with growth. Initially, economic growth and biodiversity loss are examined within the framework of the environmental Kuznets hypothesis. Biodiversity is represented by predicted species richness, generated for tropical terrestrial biodiversity using a species-area relationship. The environmental Kuznets hypothesis is investigated with reference to comparison of fixed and random effects models to allow the relationship to vary for each country. It is concluded that an environmental Kuznets curve between income and rates of loss of habitat and species does not exist in this case. The role of conservation effort in addressing environmental problems is examined through state protection of land and the regulation of trade in endangered species, two important means of biodiversity conservation. This analysis shows that the extent of government environmental policy increases with economic development. We argue that, although the data are problematic, the implications of these models are that conservation effort can only ever result in a partial deceleration of biodiversity decline, partly because protected areas serve multiple functions and are not necessarily designated to protect biodiversity. Nevertheless, institutional and policy response components of the income-biodiversity relationship are important but are not well captured through cross-country regression analysis.

  12. Change Impact Analysis for SysML Requirements Models based on Semantics of Trace Relations

    NARCIS (Netherlands)

    ten Hove, David; Göknil, Arda; Ivanov, Ivan; van den Berg, Klaas; de Goede, Koos; Oldevik, J.; Olsen, G. K.; Neple, T.; Kolovos, D.

    2009-01-01

    Change impact analysis is one of the applications of requirements traceability in the software engineering community. In this paper, we focus on requirements and requirements relations from a traceability perspective. We provide formal definitions of the requirements relations in SysML for change impact

  13. Analysis and Design Environment for Large Scale System Models and Collaborative Model Development Project

    Data.gov (United States)

    National Aeronautics and Space Administration — As NASA modeling efforts grow more complex and more distributed among many working groups, new tools and technologies are required to integrate their efforts...

  14. Analysis and Design Environment for Large Scale System Models and Collaborative Model Development, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — As NASA modeling efforts grow more complex and more distributed among many working groups, new tools and technologies are required to integrate their efforts...

  15. Control and Effort Costs Influence the Motivational Consequences of Choice

    Directory of Open Access Journals (Sweden)

    Holly Sullivan-Toole

    2017-05-01

    Full Text Available The act of making a choice, apart from any outcomes the choice may yield, has, paradoxically, been linked to both the enhancement and the detriment of intrinsic motivation. Research has implicated two factors in potentially mediating these contradictory effects: the personal control conferred by a choice and the costs associated with a choice. Across four experiments, utilizing a physical effort task disguised as a simple video game, we systematically varied costs across two levels of physical effort requirements (Low-Requirement, High-Requirement) and control over effort costs across three levels of choice (Free-Choice, Restricted-Choice, and No-Choice) to disambiguate how these factors affect the motivational consequences of choosing within an effortful task. Together, our results indicated that, in the face of effort requirements, illusory control alone may not sufficiently enhance perceptions of personal control to boost intrinsic motivation; rather, the experience of actual control may be necessary to overcome effort costs and elevate performance. Additionally, we demonstrated that conditions of illusory control, while otherwise unmotivating, can, through association with the experience of free-choice, be transformed to have a positive effect on motivation.

  16. Planning intensive care unit design using computer simulation modeling: optimizing integration of clinical, operational, and architectural requirements.

    Science.gov (United States)

    OʼHara, Susan

    2014-01-01

    Nurses have increasingly been regarded as critical members of the planning team as architects recognize their knowledge and value. But the nurses' role as knowledge experts can be expanded to leading efforts to integrate the clinical, operational, and architectural expertise through simulation modeling. Simulation modeling allows for the optimal merge of multifactorial data to understand the current state of the intensive care unit and predict future states. Nurses can champion the simulation modeling process and reap the benefits of a cost-effective way to test new designs, processes, staffing models, and future programming trends prior to implementation. Simulation modeling is an evidence-based planning approach, a standard, for integrating the sciences with real client data, to offer solutions for improving patient care.

  17. A qualitative readiness-requirements assessment model for enterprise big-data infrastructure investment

    Science.gov (United States)

    Olama, Mohammed M.; McNair, Allen W.; Sukumar, Sreenivas R.; Nutaro, James J.

    2014-05-01

    In the last three decades, there has been an exponential growth in the area of information technology providing the information processing needs of data-driven businesses in government, science, and private industry in the form of capturing, staging, integrating, conveying, analyzing, and transferring data that will help knowledge workers and decision makers make sound business decisions. Data integration across enterprise warehouses is one of the most challenging steps in the big data analytics strategy. Several levels of data integration have been identified across enterprise warehouses: data accessibility, common data platform, and consolidated data model. Each level of integration has its own set of complexities that requires a certain amount of time, budget, and resources to implement. Such levels of integration are designed to address the technical challenges inherent in consolidating the disparate data sources. In this paper, we present a methodology based on industry best practices to measure the readiness of an organization and its data sets against the different levels of data integration. We introduce a new Integration Level Model (ILM) tool, which is used for quantifying an organization and data system's readiness to share data at a certain level of data integration. It is based largely on the established and accepted framework provided in the Data Management Association's Data Management Body of Knowledge (DAMA-DMBOK). It comprises several key data management functions and supporting activities, together with several environmental elements that describe and apply to each function. The proposed model scores the maturity of a system's data governance processes and provides a pragmatic methodology for evaluating integration risks. The higher the computed scores, the better managed the source data system and the greater the likelihood that the data system can be brought in at a higher level of integration.
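
    The ILM instrument itself is not reproduced in the abstract; the sketch below only illustrates the general shape of a weighted maturity score mapped to an integration level, with function names, weights, ratings and thresholds all invented for illustration.

```python
# Hypothetical weighted readiness score in the spirit of an integration-level model.
# Function names, weights, ratings (1-5) and level thresholds are illustrative only.
functions = {
    "data governance":     (0.25, 3),   # (weight, maturity rating)
    "data architecture":   (0.20, 2),
    "data quality":        (0.20, 4),
    "metadata management": (0.15, 2),
    "security & privacy":  (0.20, 3),
}

score = sum(w * rating for w, rating in functions.values())
readiness = score / (5 * sum(w for w, _ in functions.values()))
level = ("data accessibility" if readiness < 0.5
         else "common data platform" if readiness < 0.8
         else "consolidated data model")
print(f"readiness = {readiness:.2f} -> suggested integration level: {level}")
```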

  18. 1996 Design effort for IFMIF HEBT

    International Nuclear Information System (INIS)

    Blind, B.

    1997-01-01

    The paper details the 1996 design effort for the IFMIF HEBT. Following a brief overview, it lists the primary requirements for the beam at the target, describes the design approach and design tools used, introduces the beamline modules, gives the results achieved with the design at this stage, points out possible improvements and gives the names and computer locations of the TRACE3-D and PARMILA files that sum up the design work. The design does not fully meet specifications with regard to the flatness of the distribution at the target. With further work, including, if necessary, some backup options, the flatness specifications may be realized. It is not proposed that the specifications, namely flatness to ±5% and higher-intensity ridges that are no more than 15% above average, be changed at this time. The design also does not meet the requirement that the modules of all beamlines should operate at the same settings. However, the goal of using identical components and operational procedures has been met and only minor retuning is needed to produce very similar beam distributions from all beamlines. Significant further work is required in the following areas: TRACE3-D designs and PARMILA runs must be made for the beams coming from accelerators No. 3 and No. 4. Transport of 30-MeV and 35-MeV beams to the targets and beam dump must be studied. Comprehensive error studies must be made. These must result in tolerance specifications and may require design iterations. Detailed interfacing with target-spot instrumentation is required. This instrumentation must be able to check all aspects of the specifications.

  19. Aminoglycoside Concentrations Required for Synergy with Carbapenems against Pseudomonas aeruginosa Determined via Mechanistic Studies and Modeling.

    Science.gov (United States)

    Yadav, Rajbharan; Bulitta, Jürgen B; Schneider, Elena K; Shin, Beom Soo; Velkov, Tony; Nation, Roger L; Landersdorfer, Cornelia B

    2017-12-01

    This study aimed to systematically identify the aminoglycoside concentrations required for synergy with a carbapenem and characterize the permeabilizing effect of aminoglycosides on the outer membrane of Pseudomonas aeruginosa. Monotherapies and combinations of four aminoglycosides and three carbapenems were studied for activity against P. aeruginosa strain AH298-GFP in 48-h static-concentration time-kill studies (SCTK) (inoculum: 10^7.6 CFU/ml). The outer membrane-permeabilizing effect of tobramycin alone and in combination with imipenem was characterized via electron microscopy, confocal imaging, and the nitrocefin assay. A mechanism-based model (MBM) was developed to simultaneously describe the time course of bacterial killing and prevention of regrowth by imipenem combined with each of the four aminoglycosides. Notably, 0.25 mg/liter of tobramycin, which was inactive in monotherapy, achieved synergy (i.e., ≥2-log10 more killing than the most active monotherapy at 24 h) combined with imipenem. Electron micrographs, confocal image analyses, and the nitrocefin uptake data showed distinct outer membrane damage by tobramycin, which was more extensive for the combination with imipenem. The MBM indicated that aminoglycosides enhanced the imipenem target site concentration up to 4.27-fold. Tobramycin was the most potent aminoglycoside to permeabilize the outer membrane; tobramycin (0.216 mg/liter), gentamicin (0.739 mg/liter), amikacin (1.70 mg/liter), or streptomycin (5.19 mg/liter) was required for half-maximal permeabilization. In summary, our SCTK, mechanistic studies and MBM indicated that tobramycin was highly synergistic and displayed the maximum outer membrane disruption potential among the tested aminoglycosides. These findings support the optimization of highly promising antibiotic combination dosage regimens for critically ill patients. Copyright © 2017 American Society for Microbiology.
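
    The published mechanism-based model is considerably richer; the sketch below only recasts the reported half-maximal permeabilization concentrations as a simple saturable concentration-effect curve, with the Hill coefficient and maximal effect assumed rather than taken from the paper.

```python
# Simple saturable (Emax-type) permeabilization curve using the half-maximal
# concentrations quoted above; a Hill coefficient of 1 and 100% maximum are assumed.
def fraction_permeabilized(conc, ec50, hill=1.0):
    """Fraction of maximal outer-membrane permeabilization at concentration conc (mg/L)."""
    return conc**hill / (ec50**hill + conc**hill)

ec50 = {"tobramycin": 0.216, "gentamicin": 0.739, "amikacin": 1.70, "streptomycin": 5.19}
for drug, c50 in ec50.items():
    print(drug, round(fraction_permeabilized(0.25, c50), 2))   # effect at 0.25 mg/liter
```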

  20. Review of data requirements for groundwater flow and solute transport modelling and the ability of site investigation methods to meet these requirements

    International Nuclear Information System (INIS)

    McEwen, T.J.; Chapman, N.A.; Robinson, P.C.

    1990-08-01

    This report describes the data requirements for the codes that may be used in the modelling of groundwater flow and radionuclide transport during the assessment of a Nirex site for the deep disposal of low and intermediate level radioactive waste and also the site investigation methods that exist to supply the data for these codes. The data requirements for eight codes are reviewed, with most emphasis on three of the more significant codes, VANDAL, NAMMU and CHEMTARD. The largest part of the report describes and discusses the site investigation techniques and each technique is considered in terms of its ability to provide the data necessary to characterise the geological and hydrogeological environment around a potential repository. (author)

  1. Shell Inspection History and Current CMM Inspection Efforts

    Energy Technology Data Exchange (ETDEWEB)

    Montano, Joshua Daniel [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-26

    The following report provides a review of past and current CMM Shell Inspection efforts. Calibration of the Sheffield rotary contour gauge has expired and the primary inspector, Matthew Naranjo, has retired. Efforts within the Inspection team are transitioning from maintaining and training new inspectors on Sheffield to off-the-shelf CMM technology. Although inspection of a shell has many requirements, the scope of the data presented in this report focuses on the inner contour, outer contour, radial wall thickness and mass comparisons.

  2. Predictive and Stochastic Approach for Software Effort Estimation

    OpenAIRE

    Srinivasa Rao T.; Hari CH.V.M.K.; Prasad Reddy P.V.G.D

    2013-01-01

    Software cost estimation is the process of predicting the amount of time (effort) required to build a software system. The primary reason for cost estimation is to enable the client or the developer to perform a cost-benefit analysis. Effort estimates are expressed in person-months, which can be translated into actual dollar cost. The accuracy of the estimate depends on the amount of accurate information available about the final product. Specification with uncertainty represents a r...
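
    The record does not spell out the proposed predictive and stochastic model, so the sketch below falls back on the classic basic COCOMO relation as a baseline illustration of turning size into person-months and dollars; the project size and cost rate are hypothetical.

```python
# Baseline illustration only (not the paper's model): basic COCOMO for an "organic"
# project, converting estimated size (KLOC) into effort (person-months) and cost.
def basic_cocomo(kloc, a=2.4, b=1.05):
    """Effort in person-months for an organic-mode project of `kloc` thousand lines."""
    return a * kloc ** b

effort_pm = basic_cocomo(32)        # hypothetical 32 KLOC system
cost = effort_pm * 9_000            # assumed fully loaded cost per person-month (USD)
print(f"effort ~ {effort_pm:.1f} person-months, cost ~ ${cost:,.0f}")
```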

  3. Measuring the Cognitive Effort of Literal Translation Processes

    DEFF Research Database (Denmark)

    Schaeffer, Moritz; Carl, Michael

    2014-01-01

    It has been claimed that human translators rely on some sort of literal translation equivalences to produce translations and to check their validity. More effort would be required if translations are less literal. However, to our knowledge, there is no established metric to measure and quantify...... this claim. This paper attempts to bridge this gap by introducing a metric for measuring literality of translations and assesses the effort that is observed when translators produce translations which deviate from the introduced literality definition....

  4. Requirement for Serratia marcescens Cytolysin in a Murine Model of Hemorrhagic Pneumonia

    Science.gov (United States)

    González-Juarbe, Norberto; Mares, Chris A.; Hinojosa, Cecilia A.; Medina, Jorge L.; Cantwell, Angelene; Dube, Peter H.; Bergman, Molly A.

    2014-01-01

    Serratia marcescens, a member of the carbapenem-resistant Enterobacteriaceae, is an important emerging pathogen that causes a wide variety of nosocomial infections, spreads rapidly within hospitals, and has a systemic mortality rate of ≤41%. Despite multiple clinical descriptions of S. marcescens nosocomial pneumonia, little is known regarding the mechanisms of bacterial pathogenesis and the host immune response. To address this gap, we developed an oropharyngeal aspiration model of lethal and sublethal S. marcescens pneumonia in BALB/c mice and extensively characterized the latter. Lethal challenge (>4.0 × 10^6 CFU) was characterized by fulminant hemorrhagic pneumonia with rapid loss of lung function and death. Mice challenged with a sublethal dose (<2.0 × 10^6 CFU) rapidly lost weight, had diminished lung compliance, experienced lung hemorrhage, and responded to the infection with extensive neutrophil infiltration and histopathological changes in tissue architecture. Neutrophil extracellular trap formation and the expression of inflammatory cytokines occurred early after infection. Mice depleted of neutrophils were exquisitely susceptible to an otherwise nonlethal inoculum, thereby demonstrating the requirement for neutrophils in host protection. Mutation of the genes encoding the cytolysin ShlA and its transporter ShlB resulted in attenuated S. marcescens strains that failed to cause profound weight loss, extended illness, hemorrhage, and prolonged lung pathology in mice. This study describes a model of S. marcescens pneumonia that mimics known clinical features of human illness, identifies neutrophils and the toxin ShlA as key factors important for defense and infection, respectively, and provides a solid foundation for future studies of novel therapeutics for this important opportunistic pathogen. PMID:25422267

  5. Utilization of a mental health collaborative care model among patients who require interpreter services.

    Science.gov (United States)

    Njeru, Jane W; DeJesus, Ramona S; St Sauver, Jennifer; Rutten, Lila J; Jacobson, Debra J; Wilson, Patrick; Wieland, Mark L

    2016-01-01

    Immigrants and refugees to the United States have a higher prevalence of depression compared to the general population and are less likely to receive adequate mental health services and treatment. Those with limited English proficiency (LEP) are at an even higher risk of inadequate mental health care. Collaborative care management (CCM) models for depression are effective in achieving treatment goals among a wide range of patient populations, including patients with LEP. The purpose of this study was to assess the utilization of a statewide initiative that uses CCM for depression management, among patients with LEP in a large primary care practice. This was a retrospective cohort study of patients with depression in a large primary care practice in Minnesota. Patients who met criteria for enrollment into the CCM [with a provider-generated diagnosis of depression or dysthymia in the electronic medical records, and a Patient Health Questionnaire-9 (PHQ-9) score ≥10]. Patient-identified need for interpreter services was used as a proxy for LEP. Rates of enrollment into the DIAMOND (Depression Improvement Across Minnesota, Offering A New Direction) program, a statewide initiative that uses CCM for depression management were measured. These rates were compared between eligible patients who require interpreter services versus patients who do not. Of the 7561 patients who met criteria for enrollment into the DIAMOND program during the study interval, 3511 were enrolled. Only 18.2 % of the eligible patients with LEP were enrolled into DIAMOND compared with the 47.2 % of the eligible English proficient patients. This finding persisted after adjustment for differences in age, gender and depression severity scores (adjusted OR [95 % confidence interval] = 0.43 [0.23, 0.81]). Within primary care practices, tailored interventions are needed, including those that address cultural competence and language navigation, to improve the utilization of this effective model among

  6. Semantics of trace relations in requirements models for consistency checking and inferencing

    NARCIS (Netherlands)

    Göknil, Arda; Ivanov, Ivan; van den Berg, Klaas; Veldhuis, Jan-Willem

    2009-01-01

    Requirements traceability is the ability to relate requirements back to stakeholders and forward to corresponding design artifacts, code, and test cases. Although considerable research has been devoted to relating requirements in both forward and backward directions, less attention has been paid to

  7. ON THE REQUIREMENTS FOR REALISTIC MODELING OF NEUTRINO TRANSPORT IN SIMULATIONS OF CORE-COLLAPSE SUPERNOVAE

    International Nuclear Information System (INIS)

    Lentz, Eric J.; Mezzacappa, Anthony; Hix, W. Raphael; Messer, O. E. Bronson; Liebendörfer, Matthias; Bruenn, Stephen W.

    2012-01-01

    We have conducted a series of numerical experiments with the spherically symmetric, general relativistic, neutrino radiation hydrodynamics code AGILE-BOLTZTRAN to examine the effects of several approximations used in multidimensional core-collapse supernova simulations. Our code permits us to examine the effects of these approximations quantitatively by removing, or substituting for, the pieces of supernova physics of interest. These approximations include: (1) using Newtonian versus general relativistic gravity, hydrodynamics, and transport; (2) using a reduced set of weak interactions, including the omission of non-isoenergetic neutrino scattering, versus the current state-of-the-art; and (3) omitting the velocity-dependent terms, or observer corrections, from the neutrino Boltzmann kinetic equation. We demonstrate that each of these changes has noticeable effects on the outcomes of our simulations. Of these, we find that the omission of observer corrections is particularly detrimental to the potential for neutrino-driven explosions and exhibits a failure to conserve lepton number. Finally, we discuss the impact of these results on our understanding of current, and the requirements for future, multidimensional models.

  8. Assessing moderated mediation in linear models requires fewer confounding assumptions than assessing mediation.

    Science.gov (United States)

    Loeys, Tom; Talloen, Wouter; Goubert, Liesbet; Moerkerke, Beatrijs; Vansteelandt, Stijn

    2016-11-01

    It is well known from the mediation analysis literature that the identification of direct and indirect effects relies on strong assumptions of no unmeasured confounding. Even in randomized studies the mediator may still be correlated with unobserved prognostic variables that affect the outcome, in which case the mediator's role in the causal process may not be inferred without bias. In the behavioural and social science literature very little attention has been given so far to the causal assumptions required for moderated mediation analysis. In this paper we focus on the index for moderated mediation, which measures by how much the mediated effect is larger or smaller for varying levels of the moderator. We show that in linear models this index can be estimated without bias in the presence of unmeasured common causes of the moderator, mediator and outcome under certain conditions. Importantly, one can thus use the test for moderated mediation to support evidence for mediation under less stringent confounding conditions. We illustrate our findings with data from a randomized experiment assessing the impact of being primed with social deception upon observer responses to others' pain, and from an observational study of individuals who ended a romantic relationship assessing the effect of attachment anxiety during the relationship on mental distress 2 years after the break-up. © 2016 The British Psychological Society.
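
    In a linear model with the a-path moderated, the index of moderated mediation is simply the product of the X-by-W interaction coefficient in the mediator model and the mediator coefficient in the outcome model; the sketch below estimates it on simulated data (variable names and effect sizes are invented).

```python
# Index of moderated mediation in a linear model, estimated on simulated data.
# True index here is 0.3 (X*W -> M) times 0.5 (M -> Y) = 0.15.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
X = rng.binomial(1, 0.5, n)                                  # e.g., randomized prime
W = rng.normal(size=n)                                       # moderator
M = 0.4 * X + 0.2 * W + 0.3 * X * W + rng.normal(size=n)     # mediator
Y = 0.5 * M + 0.1 * X + rng.normal(size=n)                   # outcome

med = sm.OLS(M, sm.add_constant(np.column_stack([X, W, X * W]))).fit()
out = sm.OLS(Y, sm.add_constant(np.column_stack([X, W, X * W, M]))).fit()

a3 = med.params[3]      # coefficient of the X*W interaction (a-path moderation)
b1 = out.params[4]      # coefficient of M in the outcome model (b-path)
print(f"index of moderated mediation ~ {a3 * b1:.3f}")
```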

  9. NASA Software Engineering Benchmarking Effort

    Science.gov (United States)

    Godfrey, Sally; Rarick, Heather

    2012-01-01

    Benchmarking was very interesting and provided a wealth of information: (1) we did see potential solutions to some of our "top 10" issues; (2) we have an assessment of where NASA stands in relation to other aerospace/defense groups. We formed new contacts and potential collaborations: (1) several organizations sent us examples of their templates and processes; (2) many of the organizations were interested in future collaboration: sharing of training, metrics, Capability Maturity Model Integration (CMMI) appraisers, instructors, etc. We received feedback from some of our contractors/partners: (1) desires to participate in our training and provide feedback on procedures; (2) a welcomed opportunity to provide feedback on working with NASA.

  10. Does the digital age require new models of democracy? Lasswell's policy scientist of democracy vs. liquid democracy

    NARCIS (Netherlands)

    Jelena Gregorius

    2015-01-01

    This essay provides a debate about Lasswell's policy scientist of democracy (PSOD, 1948) in comparison to the model of liquid democracy (21st century), based on the question of whether the digital age requires new models of democracy. The PSOD of Lasswell, a disciplinary persona, is in favour of an elitist

  11. Application of a hydrodynamic and sediment transport model for guidance of response efforts related to the Deepwater Horizon oil spill in the Northern Gulf of Mexico along the coast of Alabama and Florida

    Science.gov (United States)

    Plant, Nathaniel G.; Long, Joseph W.; Dalyander, P. Soupy; Thompson, David M.; Raabe, Ellen A.

    2013-01-01

    U.S. Geological Survey (USGS) scientists have provided a model-based assessment of transport and deposition of residual Deepwater Horizon oil along the shoreline within the northern Gulf of Mexico in the form of mixtures of sand and weathered oil, known as surface residual balls (SRBs). The results of this USGS research, in combination with results from other components of the overall study, will inform operational decisionmaking. The results will provide guidance for response activities and data collection needs during future oil spills. In May 2012 the U.S. Coast Guard, acting as the Deepwater Horizon Federal on-scene coordinator, chartered an operational science advisory team to provide a science-based review of data collected and to conduct additional directed studies and sampling. The goal was to characterize typical shoreline profiles and morphology in the northern Gulf of Mexico to identify likely sources of residual oil and to evaluate mechanisms whereby reoiling phenomena may be occurring (for example, burial and exhumation and alongshore transport). A steering committee cochaired by British Petroleum Corporation (BP) and the National Oceanic and Atmospheric Administration (NOAA) is overseeing the project and includes State on-scene coordinators from four States (Alabama, Florida, Louisiana, and Mississippi), trustees of the U.S. Department of the Interior (DOI), and representatives from the U.S. Coast Guard. This report presents the results of hydrodynamic and sediment transport models and developed techniques for analyzing potential SRB movement and burial and exhumation along the coastline of Alabama and Florida. Results from these modeling efforts are being used to explain the complexity of reoiling in the nearshore environment and to broaden consideration of the different scenarios and difficulties that are being faced in identifying and removing residual oil. For instance, modeling results suggest that larger SRBs are not, under the most commonly

  12. Automata learning algorithms and processes for providing more complete systems requirements specification by scenario generation, CSP-based syntax-oriented model construction, and R2D2C system requirements transformation

    Science.gov (United States)

    Hinchey, Michael G. (Inventor); Margaria, Tiziana (Inventor); Rash, James L. (Inventor); Rouff, Christopher A. (Inventor); Steffen, Bernard (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which, in some embodiments, automata learning algorithms and techniques are implemented to generate a more complete set of scenarios for requirements-based programming. More specifically, a CSP-based, syntax-oriented model construction, which requires the support of a theorem prover, is complemented by model extrapolation via automata learning. This may support the systematic completion of the requirements, which are by nature partial and focused on the most prominent scenarios. It may generalize requirement skeletons by extrapolation and may indicate, by way of automatically generated traces, where the requirement specification is too loose and additional information is required.
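
    As a rough illustration of the first, scenario-gathering step, the Python sketch below folds a handful of invented requirement scenarios (event traces) into a prefix-tree acceptor; an actual automata-learning approach such as the one described here would go further and generalize the tree by state merging and by queries resolved with a theorem prover or the user.

        def build_prefix_tree(traces):
            # Fold event traces into a deterministic prefix-tree acceptor:
            # (state, event) -> next state, with state 0 as the initial state.
            transitions = {}
            next_state = 1
            for trace in traces:
                state = 0
                for event in trace:
                    key = (state, event)
                    if key not in transitions:
                        transitions[key] = next_state
                        next_state += 1
                    state = transitions[key]
            return transitions

        # Hypothetical requirement scenarios; all event names are invented.
        scenarios = [
            ("request", "validate", "grant"),
            ("request", "validate", "deny"),
            ("request", "timeout"),
        ]
        for (state, event), target in sorted(build_prefix_tree(scenarios).items()):
            print(f"q{state} --{event}--> q{target}")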

  13. Is My Effort Worth It?

    DEFF Research Database (Denmark)

    Liu, Fei; Xiao, Bo; Lim, Eric T. K.

    2016-01-01

    Inefficiencies associated with online information search are becoming increasingly prevalent in digital environments due to a surge in Consumer Generated Content (CGC). Despite growing scholarly interest in investigating information search behavior and practical demands to optimize users’ search...... experience, there is a paucity of studies that investigate the impact of search features on search outcomes. We therefore draw on Information Foraging Theory (IFT) to disentangle the dual role of search cost in shaping the utility of information search. We also extend the Information Seeking Model...... by advancing a typology of information search tactics, each incurring distinctive search cost. Furthermore, two types of search tasks were adapted from prior research to explore how search tactics differ between goal-oriented and exploratory conditions. Our hypotheses were validated via an online experiment...

  14. ICRP new recommendations. Committee 2's efforts

    International Nuclear Information System (INIS)

    Eckerman, K.F.

    2007-01-01

    The International Commission on Radiological Protection (ICRP) may release new primary radiation protection recommendation in 2007. Committee 2 has underway reviews of the dosimetric and biokinetic models and associated data used in calculating dose coefficients for intakes of radionuclides and exposures to external radiation fields. This paper outlines the work plans of Committee 2 during the current term, 2005-2009, in anticipation of the new primary recommendations. The two task groups of Committee 2 responsible for the computations of dose coefficients, INDOS and DOCAL, are reviewing the models and data used in the computations. INDOS is reviewing the lung model and the biokinetic models that describe the behavior of the radionuclides in the body. DOCAL is reviewing its computational formulations with the objective of harmonizing the formulation with those of nuclear medicine, and developing new computational phantoms representing the adult male and female reference individuals of ICRP Publication 89. In addition, DOCAL will issue a publication on nuclear decay data to replace ICRP Publication 38. While the current efforts are focused on updating the dose coefficients for occupational intakes of radionuclides plans are being formulated to address dose coefficients for external radiation fields which include consideration of high energy fields associated with accelerators and space travel and the updating of dose coefficients for members of the public. (author)

  15. Modeling Multi-Reservoir Hydropower Systems in the Sierra Nevada with Environmental Requirements and Climate Warming

    Science.gov (United States)

    Rheinheimer, David Emmanuel

    Hydropower systems and other river regulation often harm instream ecosystems, partly by altering the natural flow and temperature regimes that ecosystems have historically depended on. These effects are compounded at regional scales. As hydropower and ecosystems are increasingly valued globally, due to the growing value placed on clean energy and native species as well as new threats from climate warming, it is important to understand how climate warming might affect these systems, to identify tradeoffs between different water uses for different climate conditions, and to identify promising water management solutions. This research uses traditional simulation and optimization to explore these issues in California's upper west slope Sierra Nevada mountains. The Sierra Nevada provides most of the water for California's vast water supply system, supporting high-elevation hydropower generation, ecosystems, recreation, and some local municipal and agricultural water supply along the way. However, regional climate warming is expected to reduce snowmelt and shift runoff to earlier in the year, affecting all water uses. This dissertation begins by reviewing important literature related to the broader motivations of this study, including river regulation, freshwater conservation, and climate change. It then describes three substantial studies. First, a weekly time step water resources management model spanning the Feather River watershed in the north to the Kern River watershed in the south is developed. The model, which uses the Water Evaluation And Planning System (WEAP), includes reservoirs, run-of-river hydropower, variable head hydropower, water supply demand, and instream flow requirements. The model is applied with a runoff dataset that considers regional air temperature increases of 0, 2, 4 and 6 °C to represent historical, near-term, mid-term and far-term (end-of-century) warming. Most major hydropower turbine flows are simulated well. Reservoir storage is also

  16. Reduction of predictive uncertainty in estimating irrigation water requirement through multi-model ensembles and ensemble averaging

    Science.gov (United States)

    Multsch, S.; Exbrayat, J.-F.; Kirby, M.; Viney, N. R.; Frede, H.-G.; Breuer, L.

    2015-04-01

    Irrigation agriculture plays an increasingly important role in food supply. Many evapotranspiration models are used today to estimate the water demand for irrigation. They consider different stages of crop growth by empirical crop coefficients to adapt evapotranspiration throughout the vegetation period. We investigate the importance of the model structural versus model parametric uncertainty for irrigation simulations by considering six evapotranspiration models and five crop coefficient sets to estimate irrigation water requirements for growing wheat in the Murray-Darling Basin, Australia. The study is carried out using the spatial decision support system SPARE:WATER. We find that structural model uncertainty among reference ET is far more important than model parametric uncertainty introduced by crop coefficients. These crop coefficients are used to estimate irrigation water requirement following the single crop coefficient approach. Using the reliability ensemble averaging (REA) technique, we are able to reduce the overall predictive model uncertainty by more than 10%. The exceedance probability curve of irrigation water requirements shows that a certain threshold, e.g. an irrigation water limit due to water right of 400 mm, would be less frequently exceeded in case of the REA ensemble average (45%) in comparison to the equally weighted ensemble average (66%). We conclude that multi-model ensemble predictions and sophisticated model averaging techniques are helpful in predicting irrigation demand and provide relevant information for decision making.
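
    A minimal Python sketch of the weighting idea, with synthetic numbers standing in for the SPARE:WATER outputs: each ensemble member is weighted by its agreement with a reference estimate and with the ensemble consensus (a simplified stand-in for the full REA procedure), and the exceedance probability of a 400 mm irrigation limit is compared between the equally weighted and the reliability-weighted averages.

        import numpy as np

        # Hypothetical irrigation water requirement (mm) from 6 ET models over
        # 1000 spatial cells, plus a reference estimate for a historical period.
        rng = np.random.default_rng(0)
        iwr = rng.normal(loc=380, scale=60, size=(6, 1000))   # models x cells
        reference = rng.normal(loc=370, scale=50, size=1000)

        # Performance factor: inverse absolute bias against the reference.
        bias = np.abs(iwr.mean(axis=1) - reference.mean())
        perf = 1.0 / np.maximum(bias, 1e-6)

        # Convergence factor: inverse distance from the ensemble mean.
        dist = np.abs(iwr.mean(axis=1) - iwr.mean())
        conv = 1.0 / np.maximum(dist, 1e-6)

        # REA-style weights combine both factors (simple normalised product).
        weights = perf * conv
        weights /= weights.sum()

        equal_avg = iwr.mean(axis=0)                  # equally weighted ensemble
        rea_avg = np.tensordot(weights, iwr, axes=1)  # reliability-weighted ensemble

        threshold = 400.0  # e.g. an irrigation water limit due to a water right
        print("P(IWR > 400 mm), equal weights:", (equal_avg > threshold).mean())
        print("P(IWR > 400 mm), REA weights:  ", (rea_avg > threshold).mean())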

  17. Contributions to the Science Modeling Requirements Document; Earth Limb & Auroral Backgrounds

    National Research Council Canada - National Science Library

    Meng, C

    1990-01-01

    .... In addition, the targets' interactions with the atmosphere will provide detection and discrimination methods that can be evaluated only by using models incorporating realistic atmospheric models...

  18. Putting User Stories First: Experiences Adapting the Legacy Data Models and Information Architecture at NASA JPL's PO.DAAC to Accommodate the New Information Lifecycle Required by SWOT

    Science.gov (United States)

    McGibbney, L. J.; Hausman, J.; Laurencelle, J. C.; Toaz, R., Jr.; McAuley, J.; Freeborn, D. J.; Stoner, C.

    2016-12-01

    The Surface Water & Ocean Topography (SWOT) mission brings together two communities focused on a better understanding of the world's oceans and its terrestrial surface waters. U.S. and French oceanographers and hydrologists and international partners have joined forces to develop this new space mission. At NASA JPL's PO.DAAC, the team is currently engaged in the gathering of SWOT User Stories (access patterns, metadata requirements, primary and value added product requirements, data access protocols, etc.) to better inform the adaptive planning of what will be known as the next generation PO.DAAC Information Architecture (IA). The IA effort acknowledges that missions such as SWOT (and NISAR) have little or no precedent in terms of data volume, hot and cold storage, archival, analysis, existing system engineering complexities, etc. and that the only way we can better understand the projected impacts of such requirements is to interface directly with the User Community. Additionally, it also acknowledges that collective learning has taken place to understand certain limitations in the existing data models (DM) underlying the existing PO.DAAC Data Management and Archival System. This work documents an evolutionary, use case based, standards driven approach to adapting the legacy DM and accompanying knowledge representation infrastructure at NASA JPL's PO.DAAC to address forthcoming DAAC mission requirements presented by missions such as SWOT. Some of the topics covered in this evolution include, but are not limited to: How we are leveraging lessons learned from the development of existing DM (such as that generated for SMAP) in an attempt to map them to SWOT. What is the governance model for the SWOT IA? What are the `governing' entities? What is the hierarchy of the `governed entities'? How are elements grouped? How is the design-working group formed? How is model independence maintained and what choices/requirements do we have for the implementation language? The use of

  19. A Semi-Automated Approach for the Co-Refinement of Requirements and Architecture Models

    OpenAIRE

    Blouin, Dominique; Barkowski, Matthias; Schneider, Melanie; Giese, Holger; Dyck, Johannes; Borde, Etienne; Tamzalit, Dalila; Noppen, Joost

    2017-01-01

    Requirements and architecture specifications are strongly related as the second provides a solution to a problem stated by the first. This coupling is typically realized by traceability links and maintaining such links becomes extremely difficult as both requirements and architecture specifications frequently evolve, and in particular when the architecture is refined providing an increasing level of details. In such case, not only the traceability must evolve but the requirements must be refi...

  20. Perception of effort in Exercise Science: Definition, measurement and perspectives.

    Science.gov (United States)

    Pageaux, Benjamin

    2016-11-01

    Perception of effort, also known as perceived exertion or sense of effort, can be described as a cognitive feeling of work associated with voluntary actions. The aim of the present review is to provide an overview of what is perception of effort in Exercise Science. Due to the addition of sensations other than effort in its definition, the neurophysiology of perceived exertion remains poorly understood. As humans have the ability to dissociate effort from other sensations related to physical exercise, the need to use a narrower definition is emphasised. Consequently, a definition and some brief guidelines for its measurement are provided. Finally, an overview of the models present in the literature aiming to explain its neurophysiology, and some perspectives for future research are offered.

  1. Aerodynamic and acoustic features of vocal effort.

    Science.gov (United States)

    Rosenthal, Allison L; Lowell, Soren Y; Colton, Raymond H

    2014-03-01

    The purpose of this study was to determine the aerodynamic and acoustic features of speech produced at comfortable, maximal and minimal levels of vocal effort. Prospective, quasi-experimental research design. Eighteen healthy participants with normal voice were included in this study. After task training, participants produced repeated syllable combinations at comfortable, maximal and minimal levels of vocal effort. A pneumotachometer and vented (Rothenberg) mask were used to record aerodynamic data, with simultaneous recording of the acoustic signal for subsequent analysis. Aerodynamic measures of subglottal pressure, translaryngeal airflow, maximum flow declination rate (MFDR), and laryngeal resistance were analyzed, along with acoustic measures of cepstral peak prominence (CPP) and its standard deviation (SD). Participants produced significantly greater subglottal pressure, translaryngeal airflow, and MFDR during maximal effort speech as compared with comfortable vocal effort. When producing speech at minimal vocal effort, participants lowered subglottal pressure, MFDR, and laryngeal resistance. Acoustic changes associated with changes in vocal effort included significantly higher CPP during maximal effort speech and significantly lower CPP SD during minimal effort speech, when each was compared with comfortable effort. For healthy speakers without voice disorders, subglottal pressure, translaryngeal airflow, and MFDR may be important factors that contribute to an increased sense of vocal effort. Changes in the cepstral signal also occur under conditions of increased or decreased vocal effort relative to comfortable effort. Copyright © 2014 The Voice Foundation. Published by Mosby, Inc. All rights reserved.
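
    For readers unfamiliar with the acoustic measure, the Python/NumPy sketch below computes a rough cepstral peak prominence for a single voiced frame; it is a simplified illustration, not the exact algorithm, windowing or settings used in the study.

        import numpy as np

        def cepstral_peak_prominence(frame, fs, f0_range=(60.0, 300.0)):
            # Real cepstrum of the windowed frame.
            spectrum = np.fft.rfft(frame * np.hanning(len(frame)))
            log_mag = 20.0 * np.log10(np.abs(spectrum) + 1e-12)
            cepstrum = np.fft.irfft(log_mag)
            quefrency = np.arange(len(cepstrum)) / fs

            # Cepstral peak within the band of plausible pitch periods.
            lo, hi = 1.0 / f0_range[1], 1.0 / f0_range[0]
            band = (quefrency >= lo) & (quefrency <= hi)
            peak_idx = np.flatnonzero(band)[np.argmax(cepstrum[band])]

            # CPP = peak height above a linear regression (trend) line.
            slope, intercept = np.polyfit(quefrency[band], cepstrum[band], 1)
            return cepstrum[peak_idx] - (slope * quefrency[peak_idx] + intercept)

        fs = 16000
        t = np.arange(0, 0.04, 1.0 / fs)
        frame = np.sign(np.sin(2 * np.pi * 120 * t))  # crude voiced-like test signal
        print("CPP (arbitrary units):", cepstral_peak_prominence(frame, fs))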

  2. Data Requirements and Modeling for Gas Hydrate-Related Mixtures and a Comparison of Two Association Models

    DEFF Research Database (Denmark)

    Liang, Xiaodong; Aloupis, Georgios; Kontogeorgis, Georgios M.

    2017-01-01

    used association models in the chemical and petroleum industries. The CPA model is extensively used in flow assurance, in which the gas hydrate formation is one of the central topics. Experimental data play a vital role in validating models and obtaining model parameters. In this work, we will compare...

  3. Transaction-based building controls framework, Volume 2: Platform descriptive model and requirements

    Energy Technology Data Exchange (ETDEWEB)

    Akyol, Bora A. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Haack, Jereme N. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Carpenter, Brandon J. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Katipamula, Srinivas [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Lutes, Robert G. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Hernandez, George [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States)

    2015-07-31

    Transaction-based Building Controls (TBC) offer a control systems platform that provides an agent execution environment that meets the growing requirements for security, resource utilization, and reliability. This report outlines the requirements for a platform to meet these needs and describes an illustrative/exemplary implementation.

  4. Devising a Structural Equation Model of Relationships between Preservice Teachers' Time and Study Environment Management, Effort Regulation, Self-Efficacy, Control of Learning Beliefs, and Metacognitive Self-Regulation

    Science.gov (United States)

    Sen, Senol; Yilmaz, Ayhan

    2016-01-01

    The objective of this study is to analyze the relationship between preservice teachers' time and study environment management, effort regulation, self-efficacy beliefs, control of learning beliefs and metacognitive self-regulation. This study also investigates the direct and indirect effects of metacognitive self-regulation on time and study…

  5. A model building code article on fallout shelters with recommendations for inclusion of requirements for fallout shelter construction in four national model building codes.

    Science.gov (United States)

    American Inst. of Architects, Washington, DC.

    A model building code for fallout shelters was drawn up for inclusion in four national model building codes. Discussion is given of fallout shelters with respect to (1) nuclear radiation, (2) national policies, and (3) community planning. Fallout shelter requirements for shielding, space, ventilation, construction, and services such as electrical…

  6. Single-ended prediction of listening effort using deep neural networks.

    Science.gov (United States)

    Huber, Rainer; Krüger, Melanie; Meyer, Bernd T

    2018-03-01

    The effort required to listen to and understand noisy speech is an important factor in the evaluation of noise reduction schemes. This paper introduces a model for Listening Effort prediction from Acoustic Parameters (LEAP). The model is based on methods from automatic speech recognition, specifically on performance measures that quantify the degradation of phoneme posteriorgrams produced by a deep neural net: Noise or artifacts introduced by speech enhancement often result in a temporal smearing of phoneme representations, which is measured by comparison of phoneme vectors. This procedure does not require a priori knowledge about the processed speech, and is therefore single-ended. The proposed model was evaluated using three datasets of noisy speech signals with listening effort ratings obtained from normal hearing and hearing impaired subjects. The prediction quality was compared to several baseline models such as the ITU-T standard P.563 for single-ended speech quality assessment, the American National Standard ANIQUE+ for single-ended speech quality assessment, and a single-ended SNR estimator. In all three datasets, the proposed new model achieved clearly better prediction accuracies than the baseline models; correlations with subjective ratings were above 0.9. So far, the model is trained on the specific noise types used in the evaluation. Future work will be concerned with overcoming this limitation by training the model on a variety of different noise types in a multi-condition way in order to make it generalize to unknown noise types. Copyright © 2018 Elsevier B.V. All rights reserved.
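
    The degradation measure can be pictured with a small sketch: given phoneme posteriorgrams (frames x classes) from a DNN acoustic model, compare posterior vectors across increasing time lags and average the divergence. The choice of symmetric KL divergence and the synthetic posteriorgrams below are assumptions made for illustration, not the published LEAP formulation.

        import numpy as np

        def mean_temporal_distance(posteriors, max_lag=25):
            # Average divergence between posterior vectors separated by 1..max_lag
            # frames; smeared (noisy) posteriorgrams tend to score lower.
            eps = 1e-10
            p = np.clip(posteriors, eps, 1.0)
            p /= p.sum(axis=1, keepdims=True)
            scores = []
            for lag in range(1, max_lag + 1):
                a, b = p[:-lag], p[lag:]
                skl = np.sum((a - b) * (np.log(a) - np.log(b)), axis=1)
                scores.append(skl.mean())
            return float(np.mean(scores))

        # Hypothetical posteriorgrams: 500 frames x 40 phoneme classes.
        rng = np.random.default_rng(1)
        clean = rng.dirichlet(np.full(40, 0.1), size=500)                       # sharp
        noisy = 0.5 * clean + 0.5 * rng.dirichlet(np.full(40, 1.0), size=500)   # blurred

        print("clean:", mean_temporal_distance(clean))
        print("noisy:", mean_temporal_distance(noisy))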

  7. Model-driven requirements engineering (MDRE) for real-time ultra-wide instantaneous bandwidth signal simulation

    Science.gov (United States)

    Chang, Daniel Y.; Rowe, Neil C.

    2013-05-01

    While conducting cutting-edge research in a specific domain, we realize that (1) requirements clarity and correctness are crucial to our success [1], (2) hardware is hard to change, so most work is in software requirements development, coding and testing [2], (3) requirements are constantly changing, so that configurability, reusability, scalability, adaptability, modularity and testability are important non-functional attributes [3], (4) cross-domain knowledge is necessary for complex systems [4], and (5) if our research is successful, the results could be applied to other domains with similar problems. In this paper, we propose to use model-driven requirements engineering (MDRE) to model and guide our requirements/development, since models are easy to understand, execute, and modify. The domain for our research is Electronic Warfare (EW) real-time ultra-wide instantaneous bandwidth (IBW) signal simulation. The proposed four MDRE models are (1) Switch-and-Filter architecture, (2) multiple parallel data bit streams alignment, (3) post-ADC and pre-DAC bits re-mapping, and (4) Discrete Fourier Transform (DFT) filter bank. This research is unique since the instantaneous bandwidth we are dealing with is in the gigahertz range instead of the conventional megahertz range.
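
    Of the four models, the DFT filter bank is the easiest to sketch in a few lines. The Python example below channelizes a narrow test signal with a windowed sliding DFT; the channel count, hop size and window are illustrative only, and a real gigahertz-bandwidth design would use a polyphase implementation in hardware.

        import numpy as np

        def dft_filter_bank(x, num_channels=8, hop=8):
            # Split x into num_channels complex sub-bands with a windowed
            # sliding DFT (one DFT bin per channel); returns time x channel.
            win = np.hanning(num_channels)
            frames = []
            for start in range(0, len(x) - num_channels + 1, hop):
                frames.append(np.fft.fft(x[start:start + num_channels] * win))
            return np.array(frames)

        fs = 1000.0
        t = np.arange(0, 1.0, 1.0 / fs)
        x = np.sin(2 * np.pi * 60 * t) + 0.5 * np.sin(2 * np.pi * 250 * t)
        bands = dft_filter_bank(x)
        print("channel energies:", np.round(np.mean(np.abs(bands) ** 2, axis=0), 3))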

  8. OPTIMIZATION OF ATM AND BRANCH CASH OPERATIONS USING AN INTEGRATED CASH REQUIREMENT FORECASTING AND CASH OPTIMIZATION MODEL

    Directory of Open Access Journals (Sweden)

    Canser BİLİR

    2018-04-01

    In this study, an integrated cash requirement forecasting and cash inventory optimization model is implemented in both the branch and automated teller machine (ATM) networks of a mid-sized bank in Turkey to optimize the bank's cash supply chain. The implemented model's objective is to minimize the idle cash levels at both branches and ATMs without decreasing the customer service level (CSL) by providing the correct amount of cash at the correct location and time. To the best of our knowledge, the model is the first integrated model in the literature to be applied to both ATMs and branches simultaneously. The results demonstrated that the integrated model dramatically decreased the idle cash levels at both branches and ATMs without degrading the availability of cash and hence customer satisfaction. An in-depth analysis of the results also indicated that the results were more remarkable for branches. The results also demonstrated that the utilization of various seasonal indices plays a very critical role in the forecasting of cash requirements for a bank. Another unique feature of the study is that the model is the first to include the recycling feature of ATMs. The results demonstrated that as a result of the inclusion of the deliberate seasonal indices in the forecasting model, the integrated cash optimization models can be used to estimate the cash requirements of recycling ATMs.
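
    As a toy illustration of the role of seasonal indices, the sketch below estimates day-of-week indices from synthetic withdrawal history and turns them into a replenishment forecast with a simple safety stock; the figures, the planning horizon and the service-level factor are invented and unrelated to the bank studied here.

        import numpy as np

        # Hypothetical daily ATM withdrawals for 8 weeks with an assumed
        # Monday..Sunday seasonality pattern.
        rng = np.random.default_rng(2)
        pattern = np.array([1.0, 0.9, 0.95, 1.05, 1.4, 1.6, 1.1])
        history = 50_000 * np.tile(pattern, 8) * rng.normal(1.0, 0.05, 56)

        level = history.mean()
        seasonal_index = history.reshape(8, 7).mean(axis=0) / level

        # Forecast next week's requirement and add a crude safety stock
        # (z = 1.65 for roughly a 95% service level).
        forecast = level * seasonal_index
        safety = 1.65 * history.reshape(8, 7).std(axis=0)
        print("replenishment plan:", np.round(forecast + safety, 0))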

  9. Minimum required number of specimen records to develop accurate species distribution models

    NARCIS (Netherlands)

    Proosdij, van A.S.J.; Sosef, M.S.M.; Wieringa, J.J.; Raes, N.

    2016-01-01

    Species distribution models (SDMs) are widely used to predict the occurrence of species. Because SDMs generally use presence-only data, validation of the predicted distribution and assessing model accuracy is challenging. Model performance depends on both sample size and species’ prevalence, being

  10. Minimum required number of specimen records to develop accurate species distribution models

    NARCIS (Netherlands)

    Proosdij, van A.S.J.; Sosef, M.S.M.; Wieringa, Jan; Raes, N.

    2015-01-01

    Species Distribution Models (SDMs) are widely used to predict the occurrence of species. Because SDMs generally use presence-only data, validation of the predicted distribution and assessing model accuracy is challenging. Model performance depends on both sample size and species’ prevalence, being

  11. Integrating Requirements Engineering, Modeling, and Verification Technologies into Software and Systems Engineering

    National Research Council Canada - National Science Library

    Broy, Manfred; Leucker, Martin

    2007-01-01

    The objective of this project is the development of an integrated suite of technologies focusing on end-to-end software development supporting requirements analysis, design, implementation, and verification...

  12. Towards requirements elicitation in service-oriented business networks using value and goal modelling

    NARCIS (Netherlands)

    Mantovaneli Pessoa, Rodrigo; van Sinderen, Marten J.; Quartel, Dick; Shishkov, Boris; Cordeiro, J.; Ranchordas, A.

    2009-01-01

    Due to the contemporary trends towards increased focus on core competences and outsourcing of non-core activities, enterprises are forming strategic alliances and building business networks. This often requires cross enterprise interoperability and integration of their information systems, leading

  13. Investigating the Nature of Relationship between Software Size and Development Effort

    OpenAIRE

    Bajwa, Sohaib-Shahid

    2008-01-01

    Software effort estimation still remains a challenging and debatable research area. Most software effort estimation models take software size as the base input. Among others, the Constructive Cost Model (COCOMO II) is a widely known effort estimation model. It uses Source Lines of Code (SLOC) as the software size to estimate effort. However, many problems arise while using SLOC as a size measure due to its late availability in the software life cycle. Therefore, a lot of research has b...
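
    For context, the post-architecture COCOMO II effort equation fits in a few lines. The constants below are the published COCOMO II.2000 nominal calibration, while the scale-factor and effort-multiplier ratings are hypothetical.

        # Effort (person-months) = A * Size^E * product(EM), with E = B + 0.01 * sum(SF).
        A, B = 2.94, 0.91  # COCOMO II.2000 nominal calibration

        def cocomo_ii_effort(ksloc, scale_factors, effort_multipliers):
            exponent = B + 0.01 * sum(scale_factors)
            product_em = 1.0
            for em in effort_multipliers:
                product_em *= em
            return A * ksloc ** exponent * product_em

        # Example: a 50 KSLOC project with made-up ratings.
        sf = [3.72, 3.04, 4.24, 3.29, 4.68]   # five scale factors (hypothetical)
        em = [1.10, 0.90, 1.00, 1.15]         # a few effort multipliers (hypothetical)
        print(f"Estimated effort: {cocomo_ii_effort(50, sf, em):.1f} person-months")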

  14. Projected Irrigation Requirement Under Climate Change in Korean Peninsula by Apply Global Hydrologic Model to Local Scale.

    Science.gov (United States)

    Yang, B.; Lee, D. K.

    2016-12-01

    Understanding the spatial distribution of irrigation requirement is critically important for agricultural water management. However, many studies considering future agricultural water management in Korea have assessed irrigation requirement at the watershed or administrative district scale and have not accounted for its spatial distribution. Lumped hydrologic models have typically been used in Korea for simulating watershed-scale irrigation requirement, whereas a distributed hydrologic model can simulate the spatial distribution grid by grid. To overcome this shortcoming, we applied a grid-based global hydrologic model (H08) at the local scale to estimate the spatial distribution of future irrigation requirement over the Korean Peninsula. Korea is one of the world's most densely populated countries, with high production and demand of rice, which requires higher soil moisture than other crops. Moreover, most of the precipitation is concentrated in a particular season that does not coincide with the crop growth season. This precipitation pattern makes the management of agricultural water, which accounts for approximately 60% of total water usage, a critical issue in Korea. Furthermore, under future climate change, precipitation is predicted to become more concentrated, which will necessitate changes to future water management plans. In order to apply the global hydrological model at the local scale, we selected appropriate major crops under the social and local climate conditions in Korea to estimate cropping area and yield, and revised the cropping area map for greater accuracy. As a result, the estimated future irrigation requirement varies under each projection, but decreases slightly in most cases. The simulations reveal that evapotranspiration increases slightly while effective precipitation also increases, balancing the irrigation requirement. These findings suggest practical guidelines for decision makers in further agricultural water management planning, including the future development of water supply plans to resolve water scarcity.
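
    The core bookkeeping behind an irrigation requirement estimate can be sketched with the single-crop-coefficient approach: crop evapotranspiration minus effective precipitation, floored at zero. The numbers and the runoff fraction below are illustrative; H08 itself resolves soil moisture and irrigation demand grid by grid rather than with this shortcut.

        import numpy as np

        def net_irrigation_requirement(et0, kc, precip, runoff_fraction=0.2):
            # Crop evapotranspiration minus crude effective rainfall, floored at 0 (mm).
            etc = kc * et0
            p_eff = (1.0 - runoff_fraction) * precip
            return np.maximum(etc - p_eff, 0.0)

        # Hypothetical monthly values (mm) for a rice season.
        et0 = np.array([120, 135, 150, 140, 110])
        kc = np.array([1.05, 1.10, 1.20, 1.05, 0.90])
        precip = np.array([180, 220, 60, 40, 90])
        nir = net_irrigation_requirement(et0, kc, precip)
        print("Monthly NIR (mm):", nir, " Seasonal NIR (mm):", nir.sum())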

  15. Economic response to harvest and effort control in fishery

    DEFF Research Database (Denmark)

    Hoff, Ayoe; Frost, Hans

    for fisheries management. The report outlines bio-economic models, which are designed to shed light on the efficiency of different management tools in terms of quota or effort restrictions given the objectives of the Common Fisheries Policy about sustainable and economic viable fisheries. The report addresses...... the complexities of biological and economic interaction in a multispecies, multifleet framework and outlines consistent mathematical models....

  16. Contemporary and emerging efforts on material degradation

    International Nuclear Information System (INIS)

    Andresen, Peter L.

    2011-01-01

    After decades of surprises and lost capacity in commercial nuclear power plants, there are major efforts throughout the world to address materials degradation proactively, although to date there have been vastly more paper studies than R and D. Proactivity requires sufficient knowledge and extrapolation/projection of prior and yet-to-be-observed degradation in reactors. This is complex because relatively subtle variations of conditions that gave rise to historical problems can reduce or enhance their incidence, and of course simple evolution vs. time leads to increased incidence. Problems not yet observed are even more difficult to assess, and extremes of opinion range from a vast array of imaginative potential degradation mechanisms to the view that if a problem has not yet surfaced in plant, then it won't be a future problem. Environmentally assisted cracking in high temperature water has been extensively studied. But it is sufficiently complex, involving dozens of important parameters, that important issues continue to emerge as careful studies have been performed. This paper summarizes a number of emerging issues, and highlights the need for improvements in experimental sophistication and for deeper probing into the nature and importance of these emerging issues. With sophisticated laboratory measurements that can reproduce plant conditions and degradation, it is reasonable to conclude that lab data can act as a preview of future field experience. (author)

  17. Requirements Engineering

    CERN Document Server

    Hull, Elizabeth; Dick, Jeremy

    2011-01-01

    Written for those who want to develop their knowledge of requirements engineering process, whether practitioners or students.Using the latest research and driven by practical experience from industry, Requirements Engineering gives useful hints to practitioners on how to write and structure requirements. It explains the importance of Systems Engineering and the creation of effective solutions to problems. It describes the underlying representations used in system modeling and introduces the UML2, and considers the relationship between requirements and modeling. Covering a generic multi-layer r

  18. Teaching Case: IS Security Requirements Identification from Conceptual Models in Systems Analysis and Design: The Fun & Fitness, Inc. Case

    Science.gov (United States)

    Spears, Janine L.; Parrish, James L., Jr.

    2013-01-01

    This teaching case introduces students to a relatively simple approach to identifying and documenting security requirements within conceptual models that are commonly taught in systems analysis and design courses. An introduction to information security is provided, followed by a classroom example of a fictitious company, "Fun &…

  19. 40 CFR Table 6 to Subpart Bbbb of... - Model Rule-Requirements for Validating Continuous Emission Monitoring Systems (CEMS)

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 6 2010-07-01 2010-07-01 false Model Rule-Requirements for Validating Continuous Emission Monitoring Systems (CEMS) 6 Table 6 to Subpart BBBB of Part 60 Protection of Environment... levels Use the following methods in appendix A of this part to measure oxygen (or carbon dioxide) 1...

  20. A Census of Statistics Requirements at U.S. Journalism Programs and a Model for a "Statistics for Journalism" Course

    Science.gov (United States)

    Martin, Justin D.

    2017-01-01

    This essay presents data from a census of statistics requirements and offerings at all 4-year journalism programs in the United States (N = 369) and proposes a model of a potential course in statistics for journalism majors. The author proposes that three philosophies underlie a statistics course for journalism students. Such a course should (a)…

  1. 45 CFR 310.10 - What are the functional requirements for the Model Tribal IV-D System?

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 2 2010-10-01 2010-10-01 false What are the functional requirements for the Model Tribal IV-D System? 310.10 Section 310.10 Public Welfare Regulations Relating to Public Welfare OFFICE OF CHILD SUPPORT ENFORCEMENT (CHILD SUPPORT ENFORCEMENT PROGRAM), ADMINISTRATION FOR CHILDREN AND FAMILIES, DEPARTMENT OF HEALTH AND HUMAN...

  2. A model for the design of computer integrated manufacturing systems: Identification of information requirements of decision makers

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1990-01-01

    and compatibility of data bases. It is, however, a question whether traditional models of work process or task procedures are suited for design of advanced information systems such as integrated manufacturing systems. Modern technology and the rapid succession of designs, materials and processes require flexible...... are developed to support production planning and control processes as they are found in the present organizations. In this case, the result has been the evolution of "islands of automation" and in the CIM literature, integration is widely discussed in terms of standardization of communication protocols...... should rather aim at creating a resource envelope within which people can adapt their work strategies to the current requirements and personal preferences without losing support from the system. This requirement implies that for design purposes, models of procedures and processes should be replaced...

  3. Reducing the computational requirements for simulating tunnel fires by combining multiscale modelling and multiple processor calculation

    DEFF Research Database (Denmark)

    Vermesi, Izabella; Rein, Guillermo; Colella, Francesco

    2017-01-01

    in FDS version 6.0, a widely used fire-specific, open source CFD software. Furthermore, it compares the reduction in simulation time given by multiscale modelling with the one given by the use of multiple processor calculation. This was done using a 1200m long tunnel with a rectangular cross...... processor calculation (97% faster when using a single mesh and multiscale modelling; only 46% faster when using the full tunnel and multiple meshes). In summary, it was found that multiscale modelling with FDS v.6.0 is feasible, and the combination of multiple meshes and multiscale modelling was established...

  4. Validation Of Developed Materials Requirement Planning MRP Integrated Flow System Model Of Ims For Piemf

    Directory of Open Access Journals (Sweden)

    T.T Amachree

    2017-08-01

    MRP was developed as the most significant inventory management strategy that correlates strongly with PIEMF. The results of the test case of the MRP-based integrated flow system model, as shown in table 6, indicate that the model is effective and valid for PIEMF at a 95% confidence interval, with an F-value of 3.121 and a p-value (sig.) of 0.034. The model provides an abstract representation and timely understanding of the subject matter and a true indication of the state of IMS for PIEMF. The flow system model will serve as a veritable decision support system of inventory management for PIEMF.

  5. Models and Tabu Search Metaheuristics for Service Network Design with Asset-Balance Requirements

    DEFF Research Database (Denmark)

    Pedersen, Michael Berliner; Crainic, T.G.; Madsen, Oli B.G.

    2009-01-01

    This paper focuses on a generic model for service network design, which includes asset positioning and utilization through constraints on asset availability at terminals. We denote these relations as "design-balance constraints" and focus on the design-balanced capacitated multicommodity network...... design model, a generalization of the capacitated multicommodity network design model generally used in service network design applications. Both arc-and cycle-based formulations for the new model are presented. The paper also proposes a tabu search metaheuristic framework for the arc-based formulation...

  6. On the required complexity of vehicle dynamic models for use in simulation-based highway design.

    Science.gov (United States)

    Brown, Alexander; Brennan, Sean

    2014-06-01

    This paper presents the results of a comprehensive project whose goal is to identify roadway design practices that maximize the margin of safety between the friction supply and friction demand. This study is motivated by the concern for increased accident rates on curves with steep downgrades, geometries that contain features that interact in all three dimensions - planar curves, grade, and superelevation. This complexity makes the prediction of vehicle skidding quite difficult, particularly for simple simulation models that have historically been used for road geometry design guidance. To obtain estimates of friction margin, this study considers a range of vehicle models, including: a point-mass model used by the American Association of State Highway Transportation Officials (AASHTO) design policy, a steady-state "bicycle model" formulation that considers only per-axle forces, a transient formulation of the bicycle model commonly used in vehicle stability control systems, and finally, a full multi-body simulation (CarSim and TruckSim) regularly used in the automotive industry for high-fidelity vehicle behavior prediction. The presence of skidding--the friction demand exceeding supply--was calculated for each model considering a wide range of vehicles and road situations. The results indicate that the most complicated vehicle models are generally unnecessary for predicting skidding events. However, there are specific maneuvers, namely braking events within lane changes and curves, which consistently predict the worst-case friction margins across all models. This suggests that any vehicle model used for roadway safety analysis should include the effects of combined cornering and braking. The point-mass model typically used by highway design professionals may not be appropriate to predict vehicle behavior on high-speed curves during braking in low-friction situations. However, engineers can use the results of this study to help select the appropriate vehicle dynamic
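
    The point-mass view that the study starts from reduces to a few lines of arithmetic: lateral friction demand from the curve formula, a longitudinal braking demand, and a comparison against the available supply. Combining the two demands vectorially and the example numbers are simplifying assumptions for illustration.

        import numpy as np

        def friction_margin(speed_kph, radius_m, superelevation, decel_g, supply):
            # Point-mass side friction demand on a superelevated curve, combined
            # with a braking demand (in g) and compared with the friction supply.
            g = 9.81
            v = speed_kph / 3.6
            f_lat = v ** 2 / (g * radius_m) - superelevation
            demand = np.hypot(f_lat, decel_g)
            return supply - demand

        # Example: 100 km/h, 400 m radius, 6% superelevation, 0.3 g braking,
        # wet-pavement friction supply of 0.5.
        print(f"friction margin: {friction_margin(100, 400, 0.06, 0.3, 0.5):.3f}")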

  7. Modelling biomechanical requirements of a rider for different horse-riding techniques at trot

    NARCIS (Netherlands)

    Cocq, de P.; Muller, M.; Clayton, H.M.; Leeuwen, van J.L.

    2013-01-01

    The simplest model possible for bouncing systems consists of a point mass bouncing passively on a mass-less spring without viscous losses. This type of spring–mass model has been used to describe the stance period of symmetric running gaits. In this study, we investigated the interaction between
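
    The baseline model described here is easy to reproduce numerically: a point mass in ballistic flight that compresses a massless, lossless spring whenever it drops below the spring's rest length. The parameter values in the sketch are illustrative and not fitted to a horse or rider.

        import numpy as np

        def simulate_bounce(m=80.0, k=20000.0, l0=1.0, z0=1.05, steps=20000, dt=1e-4):
            # Semi-implicit Euler integration of a passive spring-mass bouncer:
            # the spring force acts only while the mass is below the rest length l0.
            g = 9.81
            z, v = z0, 0.0
            heights = []
            for _ in range(steps):
                spring = k * (l0 - z) if z < l0 else 0.0
                v += (spring / m - g) * dt
                z += v * dt
                heights.append(z)
            return np.array(heights)

        traj = simulate_bounce()
        print("max spring compression (m):", round(float(1.0 - traj.min()), 3))
        print("flight apex height (m):    ", round(float(traj.max()), 3))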

  8. Process-based models are required to manage ecological systems in a changing world

    Science.gov (United States)

    K. Cuddington; M.-J. Fortin; L.R. Gerber; A. Hastings; A. Liebhold; M. OConnor; C. Ray

    2013-01-01

    Several modeling approaches can be used to guide management decisions. However, some approaches are better fitted than others to address the problem of prediction under global change. Process-based models, which are based on a theoretical understanding of relevant ecological processes, provide a useful framework to incorporate specific responses to altered...

  9. Disease models of chronic inflammatory airway disease : applications and requirements for clinical trials

    NARCIS (Netherlands)

    Diamant, Zuzana; Clarke, Graham W.; Pieterse, Herman; Gispert, Juan

    Purpose of reviewThis review will discuss methodologies and applicability of key inflammatory models of respiratory disease in proof of concept or proof of efficacy clinical studies. In close relationship with these models, induced sputum and inflammatory cell counts will be addressed for

  10. 76 FR 36870 - Special Conditions: Gulfstream Model GVI Airplane; Design Roll Maneuver Requirement for...

    Science.gov (United States)

    2011-06-23

    ... issue a finding of regulatory adequacy pursuant to section 611 of Public Law 92-574, the ``Noise Control... Requirement for Electronic Flight Controls AGENCY: Federal Aviation Administration (FAA), DOT. ACTION: Final... airplane will have a novel or unusual design feature associated with an electronic flight control system...

  11. A Curriculum Model: Engineering Design Graphics Course Updates Based on Industrial and Academic Institution Requirements

    Science.gov (United States)

    Meznarich, R. A.; Shava, R. C.; Lightner, S. L.

    2009-01-01

    Engineering design graphics courses taught in colleges or universities should provide and equip students preparing for employment with the basic occupational graphics skill competences required by engineering and technology disciplines. Academic institutions should introduce and include topics that cover the newer and more efficient graphics…

  12. The effects of sleep loss on capacity and effort

    Directory of Open Access Journals (Sweden)

    Mindy Engle-Friedman

    2014-12-01

    Sleep loss appears to affect the capacity for performance and access to energetic resources. This paper reviews research examining the physical substrates referred to as resource capacity, the role of sleep in protecting that capacity and the reaction of the system as it attempts to respond with effort to overcome the limitations on capacity caused by sleep loss. Effort is the extent to which an organism will exert itself beyond basic levels of functioning or attempt alternative strategies to maintain performance. The purpose of this review is to bring together research across sleep disciplines to clarify the substrates that constitute and influence capacity for performance, consider how the loss of sleep influences access to those resources, examine cortical, physiological, perceptual, behavioral and subjective effort responses and consider how these responses reflect a system reacting to changes in the resource environment. When sleep deprived, the ability to perform tasks that require additional energy is impaired and the ability of the system to overcome the deficiencies caused by sleep loss is limited. Taking on tasks that require effort, including school work, meal preparation, and pulling off the road to nap when driving drowsy, appears to be more challenging during sleep loss. Sleep loss impacts the effort-related choices we make and those choices may influence our health and safety.

  13. The influence of music on mental effort and driving performance.

    Science.gov (United States)

    Ünal, Ayça Berfu; Steg, Linda; Epstude, Kai

    2012-09-01

    The current research examined the influence of loud music on driving performance, and whether mental effort mediated this effect. Participants (N=69) drove in a driving simulator either with or without listening to music. In order to test whether music would have similar effects on driving performance in different situations, we manipulated the simulated traffic environment such that the driving context consisted of both complex and monotonous driving situations. In addition, we systematically kept track of drivers' mental load by making the participants verbally report their mental effort at certain moments while driving. We found that listening to music increased mental effort while driving, irrespective of the driving situation being complex or monotonous, providing support to the general assumption that music can be a distracting auditory stimulus while driving. However, drivers who listened to music performed as well as the drivers who did not listen to music, indicating that music did not impair their driving performance. Importantly, the increases in mental effort while listening to music pointed out that drivers try to regulate their mental effort as a cognitive compensatory strategy to deal with task demands. Interestingly, we observed significant improvements in driving performance in two of the driving situations. It seems like mental effort might mediate the effect of music on driving performance in situations requiring sustained attention. Other process variables, such as arousal and boredom, should also be incorporated into study designs in order to reveal more about how music affects driving. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. Time preferences, study effort, and academic performance

    NARCIS (Netherlands)

    Non, J.A.; Tempelaar, D.T.

    2014-01-01

    We analyze the relation between time preferences, study effort, and academic performance among first-year Business and Economics students. Time preferences are measured by stated preferences for an immediate payment over larger delayed payments. Data on study efforts are derived from an electronic

  15. The Role of Effort Justification in Psychotherapy.

    Science.gov (United States)

    Axsom, Danny; Cooper, Joel

    The possible influence of cognitive dissonance in psychotherapy was examined by conceptualizing therapy as an effort justification process. It was predicted that freely choosing to undergo a highly effortful procedure would aid in positive therapeutic change. Subjects (N=52) participated in a weight-reduction experiment in which the degree of…

  16. Listening Effort With Cochlear Implant Simulations

    NARCIS (Netherlands)

    Pals, Carina; Sarampalis, Anastasios; Başkent, Deniz

    2013-01-01

    Purpose: Fitting a cochlear implant (CI) for optimal speech perception does not necessarily optimize listening effort. This study aimed to show that listening effort may change between CI processing conditions for which speech intelligibility remains constant. Method: Nineteen normal-hearing

  17. The Effect of Age on Listening Effort

    Science.gov (United States)

    Degeest, Sofie; Keppler, Hannah; Corthals, Paul

    2015-01-01

    Purpose: The objective of this study was to investigate the effect of age on listening effort. Method: A dual-task paradigm was used to evaluate listening effort in different conditions of background noise. Sixty adults ranging in age from 20 to 77 years were included. A primary speech-recognition task and a secondary memory task were performed…

  18. Effort and Selection Effects of Incentive Contracts

    NARCIS (Netherlands)

    Bouwens, J.F.M.G.; van Lent, L.A.G.M.

    2003-01-01

    We show that the improved effort of employees associated with incentive contracts depends on the properties of the performance measures used in the contract. We also find that the power of incentives in the contract is only indirectly related to any improved employee effort. High powered incentive

  19. Assessment of Required Accuracy of Digital Elevation Data for Hydrologic Modeling

    Science.gov (United States)

    Kenward, T.; Lettenmaier, D. P.

    1997-01-01

    The effect of vertical accuracy of Digital Elevation Models (DEMs) on hydrologic models is evaluated by comparing three DEMs and resulting hydrologic model predictions applied to a 7.2 sq km USDA-ARS watershed at Mahantango Creek, PA. The high resolution (5 m) DEM was resampled to a 30 m resolution using a method that constrained the spatial structure of the elevations to be comparable with the USGS and SIR-C DEMs. The resulting 30 m DEM was used as the reference product for subsequent comparisons. Spatial fields of directly derived quantities, such as elevation differences, slope, and contributing area, were compared to the reference product, as were hydrologic model output fields derived using each of the three DEMs at the common 30 m spatial resolution.

  20. Modeling, Simulation, and Analysis for State and Local Emergency Planning and Response. Operational Requirements Document

    Science.gov (United States)

    2009-01-01

    quickly can users teach themselves to use the model and its outputs? Efficient to use. Once users have learned to use the model, how fast can they... The following are examples: Common Alerting Protocol, Emergency Data Exchange Language Resource Messaging, Hospital Availability Exchange. The... videoconferencing or other technology, implementers need to know what they must consider when planning and implementing the solution. Guidance in the

  1. Enhanced Viability in Organizations: An Approach to Expanding the Requirements of the Viable System Model

    OpenAIRE

    Elezi, Fatos; Schmidt, Michael; Tommelein, Iris D.; Lindemann, U.

    2014-01-01

    The Viable System Model (VSM) is a functional model of organizational structures whose implementation results in a presumably viable system. Despite scientific arguments and successful implementations in organizations, the VSM is still not widely used in practice. A reason may be that the VSM as defined by Beer addresses only the structural domain of the control (management) system in an organization and appears to miss some prerequisites for viability. Aiming for a more comprehensive approac...

  2. Low-effort thought promotes political conservatism.

    Science.gov (United States)

    Eidelman, Scott; Crandall, Christian S; Goodman, Jeffrey A; Blanchar, John C

    2012-06-01

    The authors test the hypothesis that low-effort thought promotes political conservatism. In Study 1, alcohol intoxication was measured among bar patrons; as blood alcohol level increased, so did political conservatism (controlling for sex, education, and political identification). In Study 2, participants under cognitive load reported more conservative attitudes than their no-load counterparts. In Study 3, time pressure increased participants' endorsement of conservative terms. In Study 4, participants considering political terms in a cursory manner endorsed conservative terms more than those asked to cogitate; an indicator of effortful thought (recognition memory) partially mediated the relationship between processing effort and conservatism. Together these data suggest that political conservatism may be a process consequence of low-effort thought; when effortful, deliberate thought is disengaged, endorsement of conservative ideology increases.

  3. Improving the Enterprise Requirements and Acquisition Model’s Developmental Test and Evaluation Process Fidelity

    Science.gov (United States)

    2014-03-27

    Inc, 2005. Banks, Jerry and Randall R. Gibson. “Don’t Simulate When…10 Rules for Determining when Simulation is Not Appropriate,” IIE Solutions ...capability sooner rather than a perfect solution later. Unfortunately, differing opinions on what deficiencies require additional schedule to address...there are still deficiencies. The DAMS supports pushing a less capable product to the warfighter in less time than providing the 100% solution in a

  4. Estimates of nutritional requirements and use of Small Ruminant Nutrition System model for hair sheep in semiarid conditions

    Directory of Open Access Journals (Sweden)

    Alessandra Pinto de Oliveira

    2014-09-01

    The objective was to determine the efficiency of utilization of metabolizable energy for maintenance (km) and weight gain (kf) and the dietary requirements of total digestible nutrients (TDN) and metabolizable protein (MP), as well as to evaluate the Small Ruminant Nutrition System (SRNS) model for predicting the dry matter intake (DMI) and the average daily gain (ADG) of Santa Ines lambs fed diets containing different levels of metabolizable energy (ME). Thirty-five non-castrated lambs, with an initial body weight (BW) of 14.77 ± 1.26 kg at approximately two months of age, were used. At the beginning of the experiment, five animals were slaughtered to serve as a reference for the estimation of the empty body weight (EBW) and initial body composition of the 30 remaining animals, which were distributed in a randomized block design with five treatments (1.13, 1.40, 1.73, 2.22 and 2.60 Mcal/kg DM) and six repetitions. The requirement of metabolizable energy for maintenance was 78.53 kcal/kg EBW0.75/day, with a utilization efficiency of 66%. The average efficiency of metabolizable energy utilization for weight gain was 48%. The dietary requirements of TDN and MP increased with the increase in BW and ADG of the animals. The SRNS model underestimated the DMI and ADG of the animals by 6.2% and 24.6%, respectively. We conclude that the values of km and kf are consistent with those observed in several studies with lambs raised in the tropics. The dietary requirements of TDN and MP of Santa Ines lambs for different BW and ADG are approximately 42% and 24%, respectively, lower than those suggested by the American system of evaluation of food and nutrient requirements of small ruminants. The SRNS model was sensitive in predicting the DMI of Santa Ines lambs; however, for the ADG variable, more studies are needed, since the model underestimated the response of the animals in this study.

  5. Generation of a convalescent model of virulent Francisella tularensis infection for assessment of host requirements for survival of tularemia.

    Directory of Open Access Journals (Sweden)

    Deborah D Crane

    Francisella tularensis is a facultative intracellular bacterium and the causative agent of tularemia. Development of novel vaccines and therapeutics for tularemia has been hampered by the lack of understanding of which immune components are required to survive infection. Defining these requirements for protection against virulent F. tularensis, such as strain SchuS4, has been difficult since experimentally infected animals typically die within 5 days after exposure to as few as 10 bacteria. Such a short mean time to death typically precludes development, and therefore assessment, of immune responses directed against virulent F. tularensis. To enable identification of the components of the immune system that are required for survival of virulent F. tularensis, we developed a convalescent model of tularemia in C57Bl/6 mice using low dose antibiotic therapy in which the host immune response is ultimately responsible for clearance of the bacterium. Using this model we demonstrate that αβTCR(+) cells, γδTCR(+) cells, and B cells are necessary to survive primary SchuS4 infection. Analysis of mice deficient in specific soluble mediators shows that IL-12p40 and IL-12p35 are essential for survival of SchuS4 infection. We also show that IFN-γ is required for survival of SchuS4 infection since mice lacking IFN-γR succumb to disease during the course of antibiotic therapy. Finally, we found that both CD4(+) and CD8(+) cells are the primary producers of IFN-γ and that γδTCR(+) cells and NK cells make a minimal contribution toward production of this cytokine throughout infection. Together these data provide a novel model that identifies key cells and cytokines required for survival or exacerbation of infection with virulent F. tularensis and provides evidence that this model will be a useful tool for better understanding the dynamics of tularemia infection.

  6. Disease models of chronic inflammatory airway disease: applications and requirements for clinical trials.

    Science.gov (United States)

    Diamant, Zuzana; Clarke, Graham W; Pieterse, Herman; Gispert, Juan

    2014-01-01

    This review will discuss methodologies and applicability of key inflammatory models of respiratory disease in proof of concept or proof of efficacy clinical studies. In close relationship with these models, induced sputum and inflammatory cell counts will be addressed for phenotype-directed drug development. Additionally, important regulatory aspects regarding noninvestigational medicinal products used in bronchial challenges or clinical inflammatory models of respiratory disease will be highlighted. The recognition of an ever increasing number of phenotypes and endotypes within conditions such as asthma and chronic obstructive pulmonary disease urges phenotyping of study populations already in early clinical phases of drug development. Apart from the choice of a relevant disease model, recent studies show that especially targeted therapies need to be tested in well defined disease subsets for adequate efficacy assessment. Noninvasive biomarkers, especially sputum inflammatory cell counts, aid phenotyping and are useful outcome measures for novel, targeted therapies. Disease phenotyping becomes increasingly important for efficient and cost-effective drug development and subsequent disease management. Inflammatory models of respiratory disease combined with sputum biomarkers are important tools in this approach.

  7. Modelling habitat requirements of white-clawed crayfish (Austropotamobius pallipes) using support vector machines

    Directory of Open Access Journals (Sweden)

    Favaro L.

    2011-07-01

    Full Text Available The white-clawed crayfish's habitat has been profoundly modified in Piedmont (NW Italy) due to environmental changes caused by human impact. Consequently, native populations have decreased markedly. In this research project, support vector machines were tested as possible tools for evaluating the ecological factors that determine the presence of white-clawed crayfish. A system of 175 sites was investigated, 98 of which recorded the presence of Austropotamobius pallipes. At each site 27 physical-chemical, environmental and climatic variables were measured according to their importance to A. pallipes. Various feature selection methods were employed. These yielded three subsets of variables that helped build three different types of models: (1) models with no variable selection; (2) models built by applying Goldberg's genetic algorithm for variable selection; (3) models built by using a combination of four supervised-filter evaluators for variable selection. These different model types helped us realise how important it is to select the right features in order to build support vector machines that perform as well as possible. In addition, support vector machines have a high potential for predicting indigenous crayfish occurrence, according to our findings. Therefore, they are valuable tools for freshwater management, tools that may prove to be much more promising than traditional and other machine-learning techniques.
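    As a hedged illustration of the general approach described (species presence/absence modelled from environmental variables with a support vector machine after feature selection), here is a minimal scikit-learn sketch. The synthetic data and the particular feature selector are placeholder assumptions, not the authors' pipeline; only the 175 sites and 27 variables echo the study description.

```python
# Minimal sketch of presence/absence modelling with an SVM after feature
# selection, in the spirit of the study; the data and the selector choice are
# illustrative assumptions, not the original pipeline.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(175, 27))       # 175 sites x 27 variables (as in the study)
y = rng.integers(0, 2, size=175)     # synthetic presence/absence labels

model = make_pipeline(
    StandardScaler(),
    SelectKBest(f_classif, k=10),    # keep the 10 most informative variables
    SVC(kernel="rbf", C=1.0, gamma="scale"),
)
scores = cross_val_score(model, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```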

  8. Using VCL as an Aspect-Oriented Approach to Requirements Modelling

    Science.gov (United States)

    Amálio, Nuno; Kelsen, Pierre; Ma, Qin; Glodt, Christian

    Software systems are becoming larger and more complex. By tackling the modularisation of crosscutting concerns, aspect orientation draws attention to modularity as a means to address the problems of scalability, complexity and evolution in software systems development. Aspect-oriented modelling (AOM) applies aspect-orientation to the construction of models. Most existing AOM approaches are designed without a formal semantics, and use multi-view partial descriptions of behaviour. This paper presents an AOM approach based on the Visual Contract Language (VCL): a visual language for abstract and precise modelling, designed with a formal semantics, and comprising a novel approach to visual behavioural modelling based on design by contract where behavioural descriptions are total. By applying VCL to a large case study of a car-crash crisis management system, the paper demonstrates how modularity of VCL's constructs, at different levels of granularity, help to tackle complexity. In particular, it shows how VCL's package construct and its associated composition mechanisms are key in supporting separation of concerns, coarse-grained problem decomposition and aspect-orientation. The case study's modelling solution has a clear and well-defined modular structure; the backbone of this structure is a collection of packages encapsulating local solutions to concerns.

  9. Data Requirements and Modeling for Gas Hydrate-Related Mixtures and a Comparison of Two Association Models

    DEFF Research Database (Denmark)

    Liang, Xiaodong; Aloupis, Georgios; Kontogeorgis, Georgios M.

    2017-01-01

    the performance of the CPA and sPC-SAFT EOS for modeling the fluid-phase equilibria of gas hydrate-related systems and will try to explore how the models can help in suggesting experimental measurements. These systems contain water, hydrocarbon (alkane or aromatic), and either methanol or monoethylene glycol...... parameter sets have been chosen for the sPC-SAFT EOS for a fair comparison. The comparisons are made for pure fluid properties, vapor liquid-equilibria, and liquid liquid equilibria of binary and ternary mixtures as well as vapor liquid liquid equilibria of quaternary mixtures. The results show, from...

  10. Model complexities and requirements for multimodal transport network design : Assessment of classical, state-of-the-practice, and state-of-the-research models

    NARCIS (Netherlands)

    Van Eck, G.; Brands, T.; Wismans, L.J.J.; Pel, A.J.; Van Nes, R.

    2014-01-01

    In the aim for a more sustainable transport system, governments try to stimulate multimodal trip making by facilitating smooth transfers between modes. The assessment of related multimodal policy measures requires transport models that are capable of handling the complex nature of multimodality.

  11. Model complexities and requirements for multimodal transport network design: assessment of classical, state-of-the-practice, and state-of-the-research models

    NARCIS (Netherlands)

    van Eck, G.; Brands, Ties; Wismans, Luc Johannes Josephus; Pel, A.J.; van Nes, R.

    2014-01-01

    In the aim for a more sustainable transport system, governments try to stimulate multimodal trip making by facilitating smooth transfers between modes. The assessment of related multimodal policy measures requires transport models that are capable of handling the complex nature of multimodality.

  12. Quasi-monte carlo simulation and variance reduction techniques substantially reduce computational requirements of patient-level simulation models: An application to a discrete event simulation model

    NARCIS (Netherlands)

    Treur, M.; Postma, M.

    2014-01-01

    Objectives: Patient-level simulation models provide increased flexibility to overcome the limitations of cohort-based approaches in health-economic analysis. However, computational requirements of reaching convergence is a notorious barrier. The objective was to assess the impact of using

  13. Practitioner's knowledge representation a pathway to improve software effort estimation

    CERN Document Server

    Mendes, Emilia

    2014-01-01

    The main goal of this book is to help organizations improve their effort estimates and effort estimation processes by providing a step-by-step methodology that takes them through the creation and validation of models that are based on their own knowledge and experience. Such models, once validated, can then be used to obtain predictions, carry out risk analyses, enhance their estimation processes for new projects and generally advance them as learning organizations.Emilia Mendes presents the Expert-Based Knowledge Engineering of Bayesian Networks (EKEBNs) methodology, which she has used and adapted during the course of several industry collaborations with different companies world-wide over more than 6 years. The book itself consists of two major parts: first, the methodology's foundations in knowledge management, effort estimation (with special emphasis on the intricacies of software and Web development) and Bayesian networks are detailed; then six industry case studies are presented which illustrate the pra...

  14. Vegetation-specific model parameters are not required for estimating gross primary production

    Czech Academy of Sciences Publication Activity Database

    Yuan, W.; Cai, W.; Liu, S.; Dong, W.; Chen, J.; Altaf Arain, M.; Blanken, P. D.; Cescatti, A.; Wohlfahrt, G.; Georgiadis, T.; Genesio, L.; Gianelle, D.; Grelle, A.; Kiely, G.; Knohl, A.; Liu, D.; Marek, Michal V.; Merbold, L.; Montagnani, L.; Panferov, O.; Peltoniemi, M.; Rambal, S.; Raschi, A.; Varlagin, A.; Xia, J.

    2014-01-01

    Vol. 292, 24 Nov 2014 (2014), pp. 1-10 ISSN 0304-3800 Institutional support: RVO:67179843 Keywords: light use efficiency * gross primary production * model parameters Subject RIV: EH - Ecology, Behaviour Impact factor: 2.321, year: 2014

  15. Telematic Requirements for Emergency and Disaster Response derived from Enterprise Models

    NARCIS (Netherlands)

    Widya, I.A.; Vierhout, P.A.M.; Vierhout, P.A.M.; Jones, Valerie M.; Bults, Richard G.A.; van Halteren, Aart; Peuscher, J.; Konstantas, D.; Istepanian, R.S.H.; Laxminarayan, S.; Pattichis, C.S.

    2006-01-01

    One of the prime objectives in disaster response management is to achieve full control of the situation as rapidly as possible. Coordination and communication facilities therefore play an essential role in managing disasters. This chapter discusses Enterprise Models that capture the invariant

  16. A UML Profile Oriented to the Requirements Modeling in Intelligent Tutoring Systems Projects

    OpenAIRE

    Guedes , Gilleanes Thorwald Araujo; Vicari , Rosa Maria

    2010-01-01

    International audience; This paper describes a proposal for the creation of a UML profile oriented to the intelligent tutoring systems project. In this paper we shall describe the proposed profile as well as its application into the modeling of the AMEA intelligent tutoring system.

  17. Towards security requirements: Iconicity as a feature of an informal modeling language

    NARCIS (Netherlands)

    Vasenev, Alexandr; Ionita, Dan; Zoppi, Tomasso; Ceccarelli, Andrea; Wieringa, Roelf J.

    2017-01-01

    Self-adaptive systems need to be designed with respect to threats within their operating conditions. Identifying such threats during the design phase can benefit from the involvement of stakeholders. Using a system model, the stakeholders, who may neither be IT experts nor security experts, can

  18. Inpo/industry job and task analysis efforts

    International Nuclear Information System (INIS)

    Wigley, W.W.

    1985-01-01

    One of the goals of INPO is to develop and coordinate industrywide programs to improve the education, training and qualification of nuclear utility personnel. To accomplish this goal, INPO's Training and Education Division: conducts periodic evaluations of industry training programs; provides assistance to the industry in developing training programs; manages the accreditation of utility training programs. These efforts are aimed at satisfying the need for training programs for nuclear utility personnel to be performance-based. Performance-based means that training programs provide an incumbent with the skills and knowledge required to safely perform the job. One of the ways that INPO has provided assistance to the industry is through the industrywide job and task analysis effort. I will discuss the job analysis and task analysis processes, the current status of JTA efforts, JTA products and JTA lessons learned

  19. Job Satisfaction, Effort, and Performance: A Reasoned Action Perspective

    Directory of Open Access Journals (Sweden)

    Icek Ajzen

    2011-12-01

    Full Text Available In this article the author takes issue with the recurrent reliance on job satisfaction to explain job-related effort and performance. The disappointing findings in this tradition are explained by a lack of compatibility between job satisfaction, a very broad attitude, and the more specific effort and performance criteria. Moreover, attempts to apply the expectancy-value model of attitude to explore the determinants of effort and performance suffer from reliance on unrepresentative sets of beliefs about the likely consequences of these behaviors. The theory of planned behavior (Ajzen, 1991, 2012), with its emphasis on the proximal antecedents of job effort and performance, is offered as an alternative. According to the theory, intentions to exert effort and to attain a certain performance level are determined by attitudes, subjective norms, and perceptions of control in relation to these behaviors; and these variables, in turn, are a function of readily accessible beliefs about the likely outcomes of effort and performance, about the normative expectations of important others, and about factors that facilitate or hinder effective performance.

  20. Requirements analysis and data model design for the development of vertical ERP solutions for the ceramic industry

    International Nuclear Information System (INIS)

    Oltra-Badenes, R. F.; Gil-Gomez, H.; Bellver-Lopez, R.; Asensio-Cuenta, S.

    2013-01-01

    Currently, existing information systems, and specifically ERP systems, cannot adequately support the management of companies manufacturing ceramic tiles because, among other reasons, they do not account for the existence of tone, size and quality within the same product. This feature, caused by the lack of homogeneity of the product (LHP), generates various problems in managing the product through the different business processes, such as stock management, order management, production management, etc. Thus, it is necessary to develop an ERP solution that is able to manage the ceramic product adequately, including tone, size and quality. In this paper we analyze the requirements of the ceramic sector in terms of product identification, and propose a data model to meet these requirements. The model serves as a basic guide for the development of vertical ERP solutions tailored to the ceramic industry. (Author)

  1. Different effort constructs and effort-reward imbalance: effects on employee well-being in ancillary health care workers.

    Science.gov (United States)

    van Vegchel, N; de Jonge, J; Meijer, T; Hamers, J P

    2001-04-01

    The present study investigates the relationship between Effort-Reward Imbalance (ERI) and employee well-being, using three different concepts of effort (i.e. psychological demands, physical demands and emotional demands). The ERI model was used as a theoretical framework, indicating that work stress is related to high efforts (i.e. job demands) and low occupational rewards (e.g. money, esteem and security/career opportunities). The ERI model also predicts that, in overcommitted workers, the effects of ERI on employee well-being are stronger than in their less committed counterparts. A cross-sectional survey among 167 ancillary health care workers of two nursing homes was conducted. Multiple univariate logistic regression analyses were used to test the relationship between ERI and employee well-being. Results of the logistic regression analyses showed that employees with both high (psychological, physical and emotional) efforts and low rewards had higher risks of psychosomatic health complaints, physical health symptoms and job dissatisfaction (odds ratios (ORs) ranged from 5.09 to 18.55). Moreover, employees who reported both high efforts and high rewards had elevated risks of physical symptoms and exhaustion (ORs ranged from 6.17 to 9.39). No support was found for the hypothesis on the moderating effect of overcommitment. The results show some support for the ERI model; ancillary health care workers with a high effort/low reward imbalance had elevated risks of poor employee well-being. In addition, the results show that the combination of high efforts and high rewards is important for employee well-being. Finally, some practical implications are discussed to combat work stress in health care work.
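    The analysis described, univariate logistic regressions yielding odds ratios for the high-effort/low-reward group, can be sketched in a few lines. The data below are simulated and the variable names are hypothetical; only the sample size of 167 echoes the abstract.

```python
# Sketch of a univariate logistic regression producing an odds ratio for an
# effort-reward imbalance indicator; the data are simulated, not the study's.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 167                                   # sample size reported in the abstract
eri = rng.integers(0, 2, size=n)          # 1 = high effort / low reward (hypothetical)
# simulate a health complaint that is more likely under imbalance
p = np.where(eri == 1, 0.6, 0.25)
complaint = rng.binomial(1, p)

X = sm.add_constant(eri.astype(float))
fit = sm.Logit(complaint, X).fit(disp=False)
odds_ratio = np.exp(fit.params[1])
print(f"odds ratio for ERI: {odds_ratio:.2f}")
```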

  2. Does the digital age require new models of democracy?: lasswell's policy scientist of democracy vs. Liquid democracy

    OpenAIRE

    Gregorius, Jelena

    2015-01-01

    This essay provides a debate about Lasswell's policy scientist of democracy (PSOD, 1948) in comparison with the model of liquid democracy (21st century), based on the question of whether the digital age requires new models of democracy. The PSOD of Lasswell, a disciplinary persona, is in favour of an elitist approach to democracy including elite decision-making, as well as the values of wealth and power. Liquid democracy, on the other hand, emerged from the notion that the Internet provides a vast amoun...

  3. Wrist extension strength required for power grip: a study using a radial nerve block model.

    Science.gov (United States)

    Suzuki, T; Kunishi, T; Kakizaki, J; Iwakura, N; Takahashi, J; Kuniyoshi, K

    2012-06-01

    The aim of this study was to investigate the correlation of wrist extension strength (WES) and grip strength (GS) using a radial nerve block, and to determine the WES required to prevent the "wrist flexion phenomenon" (antagonistic WES) when making a fist. We tested 14 arms in seven healthy males. WES and GS were measured before blocking as standard WES and standard GS. All participants then had radial nerve blocks with mepivacaine hydrochloride. During the recovery process from radial nerve blockade, WES and GS were recorded every 5 minutes. There was a very strong correlation between WES and GS (p < 0.0001). The mean antagonistic WES was 51% of standard WES, and the mean GS, recorded at the same time, was 66% of standard GS.

  4. EU grid computing effort takes on malaria

    CERN Multimedia

    Lawrence, Stacy

    2006-01-01

    Malaria is the world's most common parasitic infection, affecting more than 500 million people annually and killing more than 1 million. In order to help combat malaria, CERN has launched a grid computing effort (1 page)

  5. Anticipated emotions and effort allocation in weight goal striving

    NARCIS (Netherlands)

    Nelissen, R.M.A.; de Vet, H.C.W.; Zeelenberg, M.

    2011-01-01

    Objective. This study aimed to investigate the influence of anticipated emotions on preventive health behaviour if specified at the level of behavioural outcomes. Consistent with predictions from a recently developed model of goal pursuit, we hypothesized that the impact of emotions on effort levels

  6. Components of Effortful Control and Their Relations to Children's Shyness

    Science.gov (United States)

    Eggum-Wilkens, Natalie D.; Reichenberg, Ray E.; Eisenberg, Nancy; Spinrad, Tracy L.

    2016-01-01

    Relations between children's (n = 213) mother-reported effortful control components (attention focusing, attention shifting, inhibitory control at 42 months; activational control at 72 months) and mother-reported shyness trajectories across 42, 54, 72, and 84 months of age were examined. In growth models, shyness decreased. Inhibitory control and…

  7. A technique for estimating maximum harvesting effort in a stochastic ...

    Indian Academy of Sciences (India)

    Unknown

    realistic environmental variability the maximum harvesting effort is less than what is estimated in the deterministic model. This method also enables us to find out the safe regions in the parametric space for which the chance of extinction of the species is minimized. A real life fishery problem has been considered to obtain.

  8. Worker Morale and Effort : Is the Relationship Causal?

    NARCIS (Netherlands)

    Hassink, W.H.J.; Fernandez, Roberto M.

    2015-01-01

    We investigate a unique setting which enables us to distinguish between two theories of work performance. A standard labor supply framework implies a negative effect of the nonpecuniary cost of work on the employee’s effort. In contrast, a model of worker morale that is consistent with a widely used

  9. Preference for rewards that follow greater effort and greater delay.

    Science.gov (United States)

    Alessandri, Jérôme; Darcheville, Jean-Claude; Delevoye-Turrell, Yvonne; Zentall, Thomas R

    2008-11-01

    Humans prefer (conditioned) rewards that follow greater effort (Aronson & Mills, 1959). This phenomenon can be interpreted as evidence for cognitive dissonance (or as justification of effort) but may also result from (1) the contrast between the relatively greater effort and the signal for reinforcement or (2) the delay reduction signaled by the conditioned reinforcer. In the present study, we examined the effect of prior force and prior time to produce stimuli associated with equal reinforcement. As expected, pressing with greater force or for a longer time was less preferred than pressing with less force or for a shorter time. However, participants preferred the conditioned reinforcer that followed greater force and more time. Furthermore, participants preferred a long duration with no force requirement over a shorter duration with a high force requirement and, consistent with the contrast account but not with the delay reduction account, they preferred the conditioned stimulus that followed the less preferred, shorter duration, high-force event. Within-trial contrast provides a more parsimonious account than justification of effort, and a more complete account than delay reduction.

  10. Using process control concepts to model conditions required for sudden-onset occupational injuries.

    Science.gov (United States)

    Keyserling, W Monroe; Smith, Gordon S

    2007-07-01

    Sudden-onset injury results from a momentary energy exchange between an agent and host, producing immediately discernible tissue damage. These injuries are common in both occupational and nonoccupational settings; typical causes include falls, mechanical contact/crushing, exposure to temperature extremes, and exposure to electrical current. We review epidemiologic and engineering approaches to injury prevention and propose a process control model for describing risk-of-exposure to injury agents during the Pre-event phase of sudden-onset injury. Process control is a proactive approach to quality engineering that is based on the premise of preventing defective products from being manufactured in the first place, instead of relying on reactive inspections to detect defects at the end of the manufacturing process. Principles of process control can be applied by occupational health and safety professionals to prevent workplace injury. The proposed model describes how work activities (process inputs) cause risk-of-exposure to injury agents to fluctuate over the course of a work shift. Risk-of-exposure is a complex function with many input factors including: the nature/magnitude of hazards, the presence and effectiveness of engineering controls, safety climate, management attitudes and practices, the surrounding work environment, the physical and mental states of the worker, and the quality and quantity of supervision and training. Injury can occur only when this function crosses a certain threshold and the host is exposed to injurious energy via physical contact. Certain factors that contribute to risk-of-exposure are stable for extended time periods (weeks, months, years), whereas other factors are transient (durations of minutes or seconds). The model extends classical work by Haddon and others, provides preliminary insights to designing epidemiologic studies and developing fault-tolerant work systems, and illustrates how interdisciplinary approaches can improve our

  11. Singularity free N-body simulations called 'Dynamic Universe Model' don't require dark matter

    Science.gov (United States)

    Naga Parameswara Gupta, Satyavarapu

    The calculations of the Dynamic Universe Model can be successfully applied to finding the trajectories of the Pioneer satellites (the Pioneer anomaly) and the New Horizons satellite going to Pluto. No dark matter is assumed within the solar system radius. The effect on the masses around the Sun appears as though there were an extra gravitational pull toward the Sun. The model solves the dynamics of extra-solar planets like Planet X and satellites like Pioneer and New Horizons, giving 3-position, 3-velocity and 3-acceleration for their masses, considering the complex situation of multiple planets, stars, galaxy parts, the galaxy centre and other galaxies, using simple Newtonian physics. It has already successfully solved the problem of the missing mass in galaxies observed in galaxy circular velocity curves. Singularity-free Newtonian N-body simulations: historically, King Oscar II of Sweden announced a prize for a solution of the N-body problem, with advice given by Gösta Mittag-Leffler, in 1887. He announced: 'Given a system of arbitrarily many mass points that attract each other according to Newton's law, under the assumption that no two points ever collide, try to find a representation of the coordinates of each point as a series in a variable that is some known function of time and for all of whose values the series converges uniformly.' [This is taken from Wikipedia]. The announced deadline at that time was 1st June 1888. After that deadline, on 21st January 1889, the great mathematician Poincaré claimed the prize. Later he himself sent a telegram to the journal Acta Mathematica to stop printing the special issue after finding the error in his solution. Yet for such a man of science, reputation was more important than money. [Ref: book 'Celestial mechanics: the waltz of the planets' by Alessandra Celletti, Ettore Perozzi, page 27]. He realized that he had been wrong in his general stability result! But to this day nobody has solved that problem or claimed that prize. Later solutions, given by many people, all resulted in singularities and collisions of masses

  12. SEBAL Model Using to Estimate Irrigation Water Efficiency & Water Requirement of Alfalfa Crop

    Science.gov (United States)

    Zeyliger, Anatoly; Ermolaeva, Olga

    2013-04-01

    The sustainability of irrigation is a complex and comprehensive undertaking, requiring attention to much more than hydraulics, chemistry, and agronomy. A special combination of human, environmental, and economic factors exists in each irrigated region and must be recognized and evaluated. A way to evaluate the efficiency of irrigation water use for crop production is to consider the so-called crop-water production functions, which express the relation between the yield of a crop and the quantity of water applied to it or consumed by it. The term has been used in a somewhat ambiguous way. Some authors have defined the crop-water production function as the relation between yield and the total amount of water applied, whereas others have defined it as the relation between yield and seasonal evapotranspiration (ET). In the case of high efficiency of irrigation water use, the volume of water applied is less than the potential evapotranspiration (PET); then, assuming no significant change of soil moisture storage from the beginning of the growing season to its end, the volume of water may be roughly equal to ET. In the other case of low efficiency of irrigation water use, the volume of water applied exceeds PET; the excess of the volume of water applied over PET must then go either to augmenting soil moisture storage (end-of-season moisture being greater than start-of-season soil moisture) or to runoff and/or deep percolation beyond the root zone. In the presented contribution, some results of a case study estimating biomass and leaf area index (LAI) of irrigated alfalfa with the SEBAL algorithm are discussed. The field study was conducted with the aim of comparing the ground biomass of alfalfa at several irrigated fields (provided by an agricultural farm) in the Saratov and Volgograd Regions of Russia. The study was conducted during the vegetation period of 2012, from April till September. All the operations, from importing the data to calculation of the output data, were carried out by the eLEAF company and uploaded to the Fieldlook web

  13. A new model to produce infectious hepatitis C virus without the replication requirement.

    Directory of Open Access Journals (Sweden)

    Miriam Triyatni

    2011-04-01

    Full Text Available Numerous constraints significantly hamper the experimental study of hepatitis C virus (HCV). Robust replication in cell culture occurs with only a few strains, and is invariably accompanied by adaptive mutations that impair in vivo infectivity/replication. This problem complicates the production and study of authentic HCV, including the most prevalent and clinically important genotype 1 (subtypes 1a and 1b). Here we describe a novel cell culture approach to generate infectious HCV virions without the HCV replication requirement and the associated cell-adaptive mutations. The system is based on our finding that the intracellular environment generated by a West-Nile virus (WNV) subgenomic replicon rendered a mammalian cell line permissive for assembly and release of infectious HCV particles, wherein the HCV RNA with correct 5' and 3' termini was produced in the cytoplasm by a plasmid-driven dual bacteriophage RNA polymerase-based transcription/amplification system. The released particles preferentially contained the HCV-based RNA compared to the WNV subgenomic RNA. Several variations of this system are described with different HCV-based RNAs: (i) HCV bicistronic particles (HCVbp) containing RNA encoding the HCV structural genes upstream of a cell-adapted subgenomic replicon, (ii) HCV reporter particles (HCVrp) containing RNA encoding the bacteriophage SP6 RNA polymerase in place of HCV nonstructural genes, and (iii) HCV wild-type particles (HCVwt) containing unmodified RNA genomes of diverse genotypes (1a, strain H77; 1b, strain Con1; 2a, strain JFH-1). Infectivity was assessed based on the signals generated by the HCV RNA molecules introduced into the cytoplasm of target cells upon virus entry, i.e. HCV RNA replication and protein production for HCVbp in Huh-7.5 cells as well as for HCVwt in HepG2-CD81 cells and human liver slices, and SP6 RNA polymerase-driven firefly luciferase for HCVrp in target cells displaying candidate HCV surface receptors. HCV

  14. A new model to produce infectious hepatitis C virus without the replication requirement.

    Science.gov (United States)

    Triyatni, Miriam; Berger, Edward A; Saunier, Bertrand

    2011-04-01

    Numerous constraints significantly hamper the experimental study of hepatitis C virus (HCV). Robust replication in cell culture occurs with only a few strains, and is invariably accompanied by adaptive mutations that impair in vivo infectivity/replication. This problem complicates the production and study of authentic HCV, including the most prevalent and clinically important genotype 1 (subtypes 1a and 1b). Here we describe a novel cell culture approach to generate infectious HCV virions without the HCV replication requirement and the associated cell-adaptive mutations. The system is based on our finding that the intracellular environment generated by a West-Nile virus (WNV) subgenomic replicon rendered a mammalian cell line permissive for assembly and release of infectious HCV particles, wherein the HCV RNA with correct 5' and 3' termini was produced in the cytoplasm by a plasmid-driven dual bacteriophage RNA polymerase-based transcription/amplification system. The released particles preferentially contained the HCV-based RNA compared to the WNV subgenomic RNA. Several variations of this system are described with different HCV-based RNAs: (i) HCV bicistronic particles (HCVbp) containing RNA encoding the HCV structural genes upstream of a cell-adapted subgenomic replicon, (ii) HCV reporter particles (HCVrp) containing RNA encoding the bacteriophage SP6 RNA polymerase in place of HCV nonstructural genes, and (iii) HCV wild-type particles (HCVwt) containing unmodified RNA genomes of diverse genotypes (1a, strain H77; 1b, strain Con1; 2a, strain JFH-1). Infectivity was assessed based on the signals generated by the HCV RNA molecules introduced into the cytoplasm of target cells upon virus entry, i.e. HCV RNA replication and protein production for HCVbp in Huh-7.5 cells as well as for HCVwt in HepG2-CD81 cells and human liver slices, and SP6 RNA polymerase-driven firefly luciferase for HCVrp in target cells displaying candidate HCV surface receptors. HCV infectivity

  15. Structural Relations among Negative Affect, Mate Value, and Mating Effort

    Directory of Open Access Journals (Sweden)

    Beth Randi Kirsner

    2009-07-01

    Full Text Available We compared the ability of models based on evolutionary economic theory and Life History (LH) Theory to explain relations among self-reported negative affect, mate value, and mating effort. Method: Two hundred thirty-eight undergraduates provided multiple measures of these latent constructs, permitting us to test a priori predictions based on Kirsner, Figueredo, and Jacobs (2003). We compared the fit of the initial model to the fit of five alternative theory-driven models using nested model comparisons of structural equation models. Rejecting less parsimonious and less explanatory models eliminated the original model. Two equally parsimonious models explained the data pattern well. The first, based on evolutionary economic theory, specified that Negative Affect increases both Personal Mate Value and Mating Effort via the direct effects specified in the original model. The second, based on LH Theory, specified that Negative Affect, Personal Mate Value, and Mating Effort relate spuriously through a common latent construct, the LH Factor. The primary limitation of the present study is generalizability. We used self-reports taken from a young, university-based sample that included a spectrum of affective states. We cannot know how well these models generalize to an older population or to actual behavior. Both models predict the presence of a rich pattern of mate acquisition and retention behaviors, including an alarming set of behavioral tactics often not considered or targeted during treatment. Moreover, each model suggests a unique set of problems that may arise after an effective intervention. We describe several ways to distinguish these models empirically.

  16. A calibrated advection-aridity evaporation model requiring no humidity data

    Science.gov (United States)

    Crago, Richard D.; Qualls, Russell J.; Feller, Meghan

    2010-09-01

    A modified advection-aridity model was tested with 24 h averaged data from the First ISLSCP (International Satellite Land Surface Climatology Project) Field Experiment-87 and the Cooperative Atmospheric-Surface Exchange Study (CASES-97) experiments. Two modifications were made. First, the need for humidity measurements in the "drying power" term of the equation was circumvented by assuming that the daily average specific humidity for a 24 h day is equal to the specific humidity at the minimum temperature during the day. When compared to measured latent heat fluxes, this modification resulted in no deterioration of advection-aridity estimates compared to versions using measured humidity. A second modification followed previous researchers by formulating the drying power in the Penman equation using Monin-Obukhov similarity (MOS) theory. However, this version calibrated kB^-1 (≡ ln(zo/zov)) by setting the Priestley-Taylor evapotranspiration rate equal (on average) to the Penman evapotranspiration on several moist days and solving for the value of kB^-1 as the only unknown. The calibration involved no latent heat flux measurements. The results suggest that the advection-aridity model performs modestly better in the calibrated MOS version than with Penman's original wind function. Further investigation is recommended because MOS theory accounts for varying momentum roughness lengths, and the calibration was not done under ideal (well-watered or nearly saturated) conditions, which deteriorated the results somewhat with the CASES data set.
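    The first modification, replacing measured humidity with the saturation specific humidity at the daily minimum temperature, can be illustrated with a standard Tetens-type formula. The constants below are the usual textbook values and the example inputs are hypothetical, not data from FIFE-87 or CASES-97.

```python
# Sketch of the humidity substitution used by the modified advection-aridity
# model: daily-average specific humidity approximated by the saturation
# specific humidity at the daily minimum temperature.  Tetens-type constants
# are standard textbook values; the example inputs are hypothetical.
import math

def saturation_vapor_pressure(t_celsius: float) -> float:
    """Saturation vapour pressure in kPa (Tetens formula)."""
    return 0.6108 * math.exp(17.27 * t_celsius / (t_celsius + 237.3))

def specific_humidity(e_kpa: float, pressure_kpa: float = 101.3) -> float:
    """Specific humidity (kg/kg) from vapour pressure and air pressure."""
    return 0.622 * e_kpa / (pressure_kpa - 0.378 * e_kpa)

t_min = 12.0  # daily minimum air temperature, deg C (hypothetical)
q_est = specific_humidity(saturation_vapor_pressure(t_min))
print(f"estimated daily-average specific humidity: {q_est:.4f} kg/kg")
```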

  17. Sample Size Requirements for Estimation of Item Parameters in the Multidimensional Graded Response Model

    Directory of Open Access Journals (Sweden)

    Shengyu eJiang

    2016-02-01

    Full Text Available Likert-type rating scales, in which a respondent chooses a response from an ordered set of response options, are used to measure a wide variety of psychological, educational, and medical outcome variables. The most appropriate item response theory model for analyzing and scoring these instruments when they provide scores on multiple scales is the multidimensional graded response model (MGRM). A simulation study was conducted to investigate the variables that might affect item parameter recovery for the MGRM. Data were generated based on different sample sizes, test lengths, and scale intercorrelations. Parameter estimates were obtained through the flexMIRT software. The quality of parameter recovery was assessed by the correlation between true and estimated parameters as well as bias and root-mean-square error. Results indicated that for the vast majority of cases studied a sample size of N = 500 provided accurate parameter estimates, except for tests with 240 items, when 1,000 examinees were necessary to obtain accurate parameter estimates. Increasing sample size beyond N = 1,000 did not increase the accuracy of MGRM parameter estimates.
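    The recovery metrics described (correlation between true and estimated parameters, bias, and root-mean-square error) reduce to a few lines. The arrays below are synthetic placeholders, not output from the item calibration software.

```python
# Recovery metrics of the kind used in this simulation study: correlation,
# bias, and root-mean-square error between true and estimated parameters.
# The arrays are synthetic placeholders, not actual calibration output.
import numpy as np

rng = np.random.default_rng(42)
true_params = rng.normal(0.0, 1.0, size=60)               # e.g. item parameters
estimated = true_params + rng.normal(0.0, 0.1, size=60)   # noisy estimates

correlation = np.corrcoef(true_params, estimated)[0, 1]
bias = np.mean(estimated - true_params)
rmse = np.sqrt(np.mean((estimated - true_params) ** 2))
print(f"r = {correlation:.3f}, bias = {bias:.3f}, RMSE = {rmse:.3f}")
```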

  18. Requirement for Serratia marcescens cytolysin in a murine model of hemorrhagic pneumonia.

    Science.gov (United States)

    González-Juarbe, Norberto; Mares, Chris A; Hinojosa, Cecilia A; Medina, Jorge L; Cantwell, Angelene; Dube, Peter H; Orihuela, Carlos J; Bergman, Molly A

    2015-02-01

    Serratia marcescens, a member of the carbapenem-resistant Enterobacteriaceae, is an important emerging pathogen that causes a wide variety of nosocomial infections, spreads rapidly within hospitals, and has a systemic mortality rate of ≤41%. Despite multiple clinical descriptions of S. marcescens nosocomial pneumonia, little is known regarding the mechanisms of bacterial pathogenesis and the host immune response. To address this gap, we developed an oropharyngeal aspiration model of lethal and sublethal S. marcescens pneumonia in BALB/c mice and extensively characterized the latter. Lethal challenge (>4.0 × 10^6 CFU) was characterized by fulminant hemorrhagic pneumonia with rapid loss of lung function and death. Mice challenged with a sublethal dose (marcescens strains that failed to cause profound weight loss, extended illness, hemorrhage, and prolonged lung pathology in mice. This study describes a model of S. marcescens pneumonia that mimics known clinical features of human illness, identifies neutrophils and the toxin ShlA as key factors important for defense and infection, respectively, and provides a solid foundation for future studies of novel therapeutics for this important opportunistic pathogen. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  19. Solid waste initiative Macro Material Flow Modeling conceptual description and requirements

    Energy Technology Data Exchange (ETDEWEB)

    Holter, G.M.; Stapp, D.C.

    1993-01-01

    This report describes a Macro Material Flow Modeling (MMFM) concept and approach that are being adopted to develop a predictive modeling capability that can be used as the basis for evaluating potential impacts from various solid waste management system configurations and operating scenarios, as well as the impacts of various policies on solid waste quantities and compositions. This capability, as part of a broader Solid Waste Initiative at Pacific Northwest Laboratory, is intended to provide an increased understanding of solid waste as a disposal, energy, and resource problem on a national and global scale, particularly over the long term. The results of this increased understanding will eventually have an impact on a variety of US federal government activities, as well as on the activities of other entities. This increased understanding will also help provide the basis for subsequent activities under the Solid Waste Initiative. The report describes current solid waste management practices and their context, defines questions of interest relating to these practices, and proposes an approach that could be employed to analyze these practices and possible alternatives to them. A preliminary review, analysis, and summary of available data to support this approach are also provided.

  20. Development of an estimation model for the evaluation of the energy requirement of dilute acid pretreatments of biomass

    Science.gov (United States)

    Mafe, Oluwakemi A.T.; Davies, Scott M.; Hancock, John; Du, Chenyu

    2015-01-01

    This study aims to develop a mathematical model to evaluate the energy required by pretreatment processes used in the production of second generation ethanol. A dilute acid pretreatment process reported by National Renewable Energy Laboratory (NREL) was selected as an example for the model's development. The energy demand of the pretreatment process was evaluated by considering the change of internal energy of the substances, the reaction energy, the heat lost and the work done to/by the system based on a number of simplifying assumptions. Sensitivity analyses were performed on the solid loading rate, temperature, acid concentration and water evaporation rate. The results from the sensitivity analyses established that the solids loading rate had the most significant impact on the energy demand. The model was then verified with data from the NREL benchmark process. Application of this model on other dilute acid pretreatment processes reported in the literature illustrated that although similar sugar yields were reported by several studies, the energy required by the different pretreatments varied significantly. PMID:26109752
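    The energy balance the model is built on (internal energy change of the substances plus reaction energy, heat losses, and work terms) can be sketched very roughly as below; the heat capacities, loss fraction, and operating temperatures are generic placeholder values, not NREL benchmark figures. The example also illustrates the abstract's main sensitivity finding: higher solids loading means less water to heat per unit of solids and therefore a lower energy demand.

```python
# Very rough sketch of a pretreatment energy balance of the kind described:
# energy to heat the slurry to reaction temperature plus an assumed heat-loss
# fraction.  All numbers are generic placeholders, not NREL benchmark data.

CP_WATER = 4.18            # kJ/(kg*K)
CP_BIOMASS = 1.5           # kJ/(kg*K), assumed dry-solids heat capacity
HEAT_LOSS_FRACTION = 0.10  # assumed fraction of input heat lost to surroundings

def pretreatment_heat(solids_kg: float, solid_loading: float,
                      t_in: float, t_reaction: float) -> float:
    """Heat demand (kJ) to bring a slurry from t_in to t_reaction."""
    water_kg = solids_kg * (1.0 - solid_loading) / solid_loading
    sensible = (solids_kg * CP_BIOMASS + water_kg * CP_WATER) * (t_reaction - t_in)
    return sensible * (1.0 + HEAT_LOSS_FRACTION)

# higher solid loading -> less water to heat -> lower energy demand per kg solids
for loading in (0.10, 0.20, 0.30):
    q = pretreatment_heat(1000.0, loading, 25.0, 160.0)
    print(f"solid loading {loading:.0%}: {q / 1000:.0f} MJ per tonne of solids")
```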

  1. Development of an estimation model for the evaluation of the energy requirement of dilute acid pretreatments of biomass.

    Science.gov (United States)

    Mafe, Oluwakemi A T; Davies, Scott M; Hancock, John; Du, Chenyu

    2015-01-01

    This study aims to develop a mathematical model to evaluate the energy required by pretreatment processes used in the production of second generation ethanol. A dilute acid pretreatment process reported by National Renewable Energy Laboratory (NREL) was selected as an example for the model's development. The energy demand of the pretreatment process was evaluated by considering the change of internal energy of the substances, the reaction energy, the heat lost and the work done to/by the system based on a number of simplifying assumptions. Sensitivity analyses were performed on the solid loading rate, temperature, acid concentration and water evaporation rate. The results from the sensitivity analyses established that the solids loading rate had the most significant impact on the energy demand. The model was then verified with data from the NREL benchmark process. Application of this model on other dilute acid pretreatment processes reported in the literature illustrated that although similar sugar yields were reported by several studies, the energy required by the different pretreatments varied significantly.

  2. Radiation risk models for all solid cancers other than those types of cancer requiring individual assessments after a nuclear accident

    International Nuclear Information System (INIS)

    Walsh, Linda; Zhang, Wei

    2016-01-01

    In the assessment of health risks after nuclear accidents, some health consequences require special attention. For example, in their 2013 report on health risk assessment after the Fukushima nuclear accident, the World Health Organisation (WHO) panel of experts considered risks of breast cancer, thyroid cancer and leukaemia. For these specific cancer types, use was made of already published excess relative risk (ERR) and excess absolute risk (EAR) models for radiation-related cancer incidence fitted to the epidemiological data from the Japanese A-bomb Life Span Study (LSS). However, it was also considered important to assess all other types of solid cancer together and the WHO, in their above-mentioned report, stated "No model to calculate the risk for all other solid cancer excluding breast and thyroid cancer risks is available from the LSS data". Applying the LSS models for all solid cancers along with the models for the specific sites means that some cancers have an overlap in the risk evaluations. Thus, calculating the total solid cancer risk plus the breast cancer risk plus the thyroid cancer risk can overestimate the total risk by several per cent. Therefore, the purpose of this paper was to publish the required models for all other solid cancers, i.e. all solid cancers other than those types of cancer requiring special attention after a nuclear accident. The new models presented here have been fitted to the same LSS data set from which the risks provided by the WHO were derived. Although it is known already that the EAR and ERR effect modifications by sex are statistically significant for the outcome "all solid cancer", it is shown here that sex modification is not statistically significant for the outcome "all solid cancer other than thyroid and breast cancer". It is also shown here that the sex-averaged solid cancer risks with and without the sex modification are very similar once breast and
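    The double-counting issue described (summing an all-solid-cancer risk with separate breast and thyroid risks) is simple arithmetic; the excess-risk numbers below are invented purely to show the size of the effect, not risks derived from the LSS models.

```python
# Toy arithmetic illustrating the overlap problem described in the abstract;
# the excess-risk numbers are invented for illustration only.
all_solid = 100.0   # excess cases per 10,000 (hypothetical), already includes breast/thyroid
breast = 4.0        # hypothetical breast component
thyroid = 2.0       # hypothetical thyroid component

naive_total = all_solid + breast + thyroid                          # double counts breast/thyroid
correct_total = (all_solid - breast - thyroid) + breast + thyroid   # equals all_solid

overestimate = 100.0 * (naive_total - correct_total) / correct_total
print(f"naive total overestimates by {overestimate:.0f}%")  # 6% with these numbers
```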

  3. Radiation risk models for all solid cancers other than those types of cancer requiring individual assessments after a nuclear accident

    Energy Technology Data Exchange (ETDEWEB)

    Walsh, Linda [Federal Office for Radiation Protection, Department "Radiation Protection and Health", Oberschleissheim (Germany); University of Zurich, Medical Physics Group, Institute of Physics, Zurich (Switzerland)]; Zhang, Wei [Public Health England, Centre for Radiation, Chemical and Environmental Hazards, Oxford (United Kingdom)]

    2016-03-15

    In the assessment of health risks after nuclear accidents, some health consequences require special attention. For example, in their 2013 report on health risk assessment after the Fukushima nuclear accident, the World Health Organisation (WHO) panel of experts considered risks of breast cancer, thyroid cancer and leukaemia. For these specific cancer types, use was made of already published excess relative risk (ERR) and excess absolute risk (EAR) models for radiation-related cancer incidence fitted to the epidemiological data from the Japanese A-bomb Life Span Study (LSS). However, it was also considered important to assess all other types of solid cancer together and the WHO, in their above-mentioned report, stated "No model to calculate the risk for all other solid cancer excluding breast and thyroid cancer risks is available from the LSS data". Applying the LSS models for all solid cancers along with the models for the specific sites means that some cancers have an overlap in the risk evaluations. Thus, calculating the total solid cancer risk plus the breast cancer risk plus the thyroid cancer risk can overestimate the total risk by several per cent. Therefore, the purpose of this paper was to publish the required models for all other solid cancers, i.e. all solid cancers other than those types of cancer requiring special attention after a nuclear accident. The new models presented here have been fitted to the same LSS data set from which the risks provided by the WHO were derived. Although it is known already that the EAR and ERR effect modifications by sex are statistically significant for the outcome "all solid cancer", it is shown here that sex modification is not statistically significant for the outcome "all solid cancer other than thyroid and breast cancer". It is also shown here that the sex-averaged solid cancer risks with and without the sex modification are very similar once breast and

  4. Radiation risk models for all solid cancers other than those types of cancer requiring individual assessments after a nuclear accident.

    Science.gov (United States)

    Walsh, Linda; Zhang, Wei

    2016-03-01

    In the assessment of health risks after nuclear accidents, some health consequences require special attention. For example, in their 2013 report on health risk assessment after the Fukushima nuclear accident, the World Health Organisation (WHO) panel of experts considered risks of breast cancer, thyroid cancer and leukaemia. For these specific cancer types, use was made of already published excess relative risk (ERR) and excess absolute risk (EAR) models for radiation-related cancer incidence fitted to the epidemiological data from the Japanese A-bomb Life Span Study (LSS). However, it was also considered important to assess all other types of solid cancer together and the WHO, in their above-mentioned report, stated "No model to calculate the risk for all other solid cancer excluding breast and thyroid cancer risks is available from the LSS data". Applying the LSS models for all solid cancers along with the models for the specific sites means that some cancers have an overlap in the risk evaluations. Thus, calculating the total solid cancer risk plus the breast cancer risk plus the thyroid cancer risk can overestimate the total risk by several per cent. Therefore, the purpose of this paper was to publish the required models for all other solid cancers, i.e. all solid cancers other than those types of cancer requiring special attention after a nuclear accident. The new models presented here have been fitted to the same LSS data set from which the risks provided by the WHO were derived. Although it is known already that the EAR and ERR effect modifications by sex are statistically significant for the outcome "all solid cancer", it is shown here that sex modification is not statistically significant for the outcome "all solid cancer other than thyroid and breast cancer". It is also shown here that the sex-averaged solid cancer risks with and without the sex modification are very similar once breast and thyroid cancers are factored out. Some other notable model

  5. Finite Element Models Development of Car Seats With Passive Head Restraints to Study Their Meeting Requirements for EURO NCAP

    Directory of Open Access Journals (Sweden)

    D. Yu. Solopov

    2014-01-01

    Full Text Available When performing calculations to evaluate the passive safety of car seats by computer modelling methods, it is desirable to use finite element models (FEM) that provide the greatest accuracy of calculation results. It is also expedient to use FEMs that can be computed in a short time to give preliminary results quickly. The paper describes the features of evaluating the passive safety provided by the developed FEMs of seats with passive head restraints according to the requirements of EURO NCAP. Besides, the accuracy of the calculated results provided by the developed FEMs was evaluated. The accuracy evaluation was accomplished in relation to the results obtained by specialists of an organization conducting similar research (LSTC). This work was performed within the framework of a technique which allows car seat designs, both with passive and with active head restraints, meeting the requirements for passive safety to be developed effectively. From the calculations and experiments it was found that, when evaluating by the EURO NCAP technique, the "rough" FEMs (the 1st and 2nd levels) can be considered rational ones (in terms of the labour costs of their creation and problem solving, as well as of the resulting errors), and it is expedient to use them for preliminary and multivariate calculations. Detailed models (the 3rd level) provide the greatest accuracy (the greatest accuracy is reached for the evaluated impact at 16 km/h speed under the "moderate impact" loading conditions; the relative error of full head acceleration is 12%). In the evaluation by EURO NCAP using the NIC criterion, it can be concluded that the seat models of the 2nd level (467 936 elements) and the 3rd level (1 255 358 elements) meet the passive safety requirements according to EURO NCAP under "light", "moderate", and "heavy" impacts. In the evaluation by EURO NCAP for preliminary and multivariate calculations a model of the middle level (consisting of 467

  6. Developing a phenomenological model of the proton trajectory within a heterogeneous medium required for proton imaging.

    Science.gov (United States)

    Fekete, Charles-Antoine Collins; Doolan, Paul; Dias, Marta F; Beaulieu, Luc; Seco, Joao

    2015-07-07

    To develop an accurate phenomenological model of the cubic spline path estimate of the proton path, accounting for the initial proton energy and the water equivalent thickness (WET) traversed. Monte Carlo (MC) simulations were used to calculate the path of protons crossing various WETs (10-30 cm) of different materials (LN300, water and CB2-50% CaCO3) for a range of initial energies (180-330 MeV). For each MC trajectory, cubic spline trajectories (CST) were constructed based on the entrance and exit information of the protons and compared with the MC using the root mean square (RMS) metric. The CST path depends on the direction vector magnitudes |P_0,1|. First, |P_0,1| is set to the proton path length (with factors Λ^norm_0,1 = 1.0). Then, two optimal factors Λ^opt_0,1 are introduced into |P_0,1|. The factors are varied to minimize the RMS difference with the MC paths for every configuration. A set of Λ^opt_0,1 factors, as a function of the WET/water equivalent path length (WEPL) ratio, that minimizes the RMS is presented. MTF analysis was then performed on proton radiographs of a line-pair phantom reconstructed using the CST trajectories. Λ^opt_0,1 was fitted to the WET/WEPL ratio using a quadratic function (Y = A + BX^2, where A = 1.01, 0.99 and B = 0.43, -0.46 respectively for Λ^opt_0 and Λ^opt_1). The RMS deviation calculated along the path, between the CST and the MC, increases with the WET. The increase is larger when using Λ^norm_0,1 than Λ^opt_0,1 (difference of 5.0% with WET/WEPL = 0.66). For 230/330 MeV protons, the MTF10% was found to increase by 40/16% respectively for a thin phantom (15 cm) when using the Λ^opt_0,1 model compared to the Λ^norm_0,1 model. Calculation times for Λ^opt_0,1 are scaled down compared to MLP and RMS deviations are similar within standard deviation. Based on the results of this study, using CST with the Λ^opt_0,1 factors reduces the RMS deviation and increases the spatial resolution when reconstructing proton
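    The quadratic fit reported for the optimal direction-vector factors (Y = A + BX^2 with A = 1.01, 0.99 and B = 0.43, -0.46) can be written directly; only the example WET/WEPL ratio is an arbitrary input for illustration.

```python
# Optimal cubic-spline direction-vector factors from the quadratic fit reported
# in the abstract: Y = A + B * X**2, with X the WET/WEPL ratio.
# The example ratio is an arbitrary illustration.

def lambda_opt(wet_over_wepl: float) -> tuple[float, float]:
    """Return (lambda_opt_0, lambda_opt_1) for a given WET/WEPL ratio."""
    x2 = wet_over_wepl ** 2
    lam0 = 1.01 + 0.43 * x2
    lam1 = 0.99 - 0.46 * x2
    return lam0, lam1

lam0, lam1 = lambda_opt(0.66)   # ratio quoted in the abstract's RMS comparison
print(f"lambda_0 = {lam0:.3f}, lambda_1 = {lam1:.3f}")
```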

  7. A recapitulative three-dimensional model of breast carcinoma requires perfusion for multi-week growth

    Directory of Open Access Journals (Sweden)

    Kayla F Goliwas

    2016-07-01

    Full Text Available Breast carcinomas are complex, three-dimensional tissues composed of cancer epithelial cells and stromal components, including fibroblasts and extracellular matrix. In vitro models that more faithfully recapitulate this dimensionality and stromal microenvironment should more accurately elucidate the processes driving carcinogenesis, tumor progression, and therapeutic response. Herein, novel in vitro breast carcinoma surrogates, distinguished by a relevant dimensionality and stromal microenvironment, are described and characterized. A perfusion bioreactor system was used to deliver medium to surrogates containing engineered microchannels and the effects of perfusion, medium composition, and the method of cell incorporation and density of initial cell seeding on the growth and morphology of surrogates were assessed. Perfused surrogates demonstrated significantly greater cell density and proliferation and were more histologically recapitulative of human breast carcinoma than surrogates maintained without perfusion. Although other parameters of the surrogate system, such as medium composition and cell seeding density, affected cell growth, perfusion was the most influential parameter.

  8. Covenant model of corporate compliance. "Corporate integrity" program meets mission, not just legal, requirements.

    Science.gov (United States)

    Tuohey, J F

    1998-01-01

    Catholic healthcare should establish comprehensive compliance strategies, beyond following Medicare reimbursement laws, that reflect mission and ethics. A covenant model of business ethics--rather than a self-interest emphasis on contracts--can help organizations develop a creed to focus on obligations and trust in their relationships. The corporate integrity program (CIP) of Mercy Health System Oklahoma promotes its mission and interests, educates and motivates its employees, provides assurance of systemwide commitment, and enforces CIP policies and procedures. Mercy's creed, based on its mission statement and core values, articulates responsibilities regarding patients and providers, business partners, society and the environment, and internal relationships. The CIP is carried out through an integrated network of committees, advocacy teams, and an expanded institutional review board. Two documents set standards for how Mercy conducts external affairs and clarify employee codes of conduct.

  9. Different effort constructs and effort-reward imbalance: Effects on employee well-being in ancillary health care workers

    NARCIS (Netherlands)

    Vegchel, N. van; Jonge, J. de; Meijer, T.; Hamers, J.P.H.

    2001-01-01

    Aims of the study. The present study investigates the relationship between Effort–Reward Imbalance (ERI) and employee well-being, using three different concepts of effort (i.e. psychological demands, physical demands and emotional demands). Background. The ERI model had been used as a theoretical

  10. A Comprehensive Energy Analysis and Related Carbon Footprint of Dairy Farms, Part 2: Investigation and Modeling of Indirect Energy Requirements

    Directory of Open Access Journals (Sweden)

    Giuseppe Todde

    2018-02-01

    Full Text Available Dairy cattle farms are continuously developing more intensive systems of management, which require higher utilization of durable and non-durable inputs. These inputs are responsible for significant direct and indirect fossil energy requirements, which are associated with considerable CO2 emissions. This study focused on investigating the indirect energy requirements of 285 conventional dairy farms and the related carbon footprint. A detailed analysis of the indirect energy inputs related to farm buildings, machinery and agricultural inputs was carried out. A partial life cycle assessment approach was applied to evaluate indirect energy inputs and the carbon footprint of farms over a period of one harvest year. The investigation highlights the weight of agricultural inputs, which represent more than 80% of the total indirect energy requirements. The analyses also underline that the assumption of similar indirect energy requirements and related carbon emissions among dairy farms is incorrect, especially across different farm sizes and milk production levels. In addition, a mathematical model to estimate the indirect energy requirements of dairy farms has been developed to provide an instrument allowing researchers to assess the energy incorporated into farm machinery, agricultural inputs and buildings. Combining the results of this two-part series, the total energy demand (expressed in GJ per farm) is mostly due to agricultural inputs and fuel consumption, which have the largest share of the annual requirements for each milk yield class. Direct and indirect energy requirements increased from small-sized farms to larger ones, from 1302 to 5109 GJ·y−1. However, the related carbon dioxide emissions expressed per 100 kg of milk showed a negative trend going from the <5000 class to the >9000 kg milk yield class, where larger farms were able to

  11. Optimization search effort over the control landscapes for open quantum systems with Kraus-map evolution

    International Nuclear Information System (INIS)

    Oza, Anand; Pechen, Alexander; Beltrani, Vincent; Moore, Katharine; Rabitz, Herschel; Dominy, Jason

    2009-01-01

    A quantum control landscape is defined as the expectation value of a target observable Θ as a function of the control variables. In this work, control landscapes for open quantum systems governed by Kraus map evolution are analyzed. Kraus maps are used as the controls transforming an initial density matrix ρ_i into a final density matrix to maximize the expectation value of the observable Θ. The absence of suboptimal local maxima for the relevant control landscapes is numerically illustrated. The dependence of the optimization search effort is analyzed in terms of the dimension of the system N, the initial state ρ_i and the target observable Θ. It is found that if the number of nonzero eigenvalues in ρ_i remains constant, the search effort does not exhibit any significant dependence on N. If ρ_i has no zero eigenvalues, then the computational complexity and the required search effort rise with N. The dimension of the top manifold (i.e., the set of Kraus operators that maximizes the objective) is found to positively correlate with the optimization search efficiency. Under the assumption of full controllability, incoherent control modeled by Kraus maps is found to be more efficient in reaching the same value of the objective than coherent control modeled by unitary maps. Numerical simulations are also performed for control landscapes with linear constraints on the available Kraus maps, and suboptimal maxima are not revealed for these landscapes.
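
    As a concrete illustration of the Kraus-map formalism in the abstract, the sketch below applies a simple amplitude-damping Kraus map to a qubit density matrix and evaluates the objective Tr(Θρ). The channel, initial state and observable are toy choices, not the systems optimized in the paper.

```python
import numpy as np

def apply_kraus(rho, kraus_ops):
    """Evolve a density matrix under a Kraus map: rho -> sum_i K_i rho K_i^dagger."""
    return sum(K @ rho @ K.conj().T for K in kraus_ops)

def expectation(theta, rho):
    """Objective J = Tr(Theta rho), the quantity maximized over Kraus maps."""
    return np.real(np.trace(theta @ rho))

# Toy example: a single qubit under an amplitude-damping Kraus map with strength g.
g = 0.3
K0 = np.array([[1, 0], [0, np.sqrt(1 - g)]], dtype=complex)
K1 = np.array([[0, np.sqrt(g)], [0, 0]], dtype=complex)

rho_i = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)   # initial state |+><+|
theta = np.array([[1, 0], [0, 0]], dtype=complex)           # target observable |0><0|

rho_f = apply_kraus(rho_i, [K0, K1])
print(expectation(theta, rho_i), expectation(theta, rho_f))
```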

  12. A Quantitative bgl Operon Model for E. coli Requires BglF Conformational Change for Sugar Transport

    Science.gov (United States)

    Chopra, Paras; Bender, Andreas

    The bgl operon is responsible for the metabolism of β-glucoside sugars such as salicin or arbutin in E. coli. Its regulatory system involves both positive and negative feedback mechanisms and it can be assumed to be more complex than that of the more closely studied lac and trp operons. We have developed a quantitative model for the regulation of the bgl operon which is subject to in silico experiments investigating its behavior under different hypothetical conditions. Upon administration of 5 mM salicin as an inducer our model shows 80-fold induction, which compares well with the 60-fold induction measured experimentally. Under practical conditions 5-10 mM inducer is employed, which is in line with the minimum inducer concentration of 1 mM required by our model. The necessity of BglF conformational change for sugar transport has been hypothesized previously, and in line with those hypotheses our model shows only minor induction if conformational change is not allowed. Overall, this first quantitative model for the bgl operon gives reasonable predictions that are close to experimental results (where measured). It will be further refined as values of the parameters are determined experimentally. The model was developed in Systems Biology Markup Language (SBML) and it is available from the authors and from the Biomodels repository [www.ebi.ac.uk/biomodels].

  13. Risk assessment of agricultural water requirement based on a multi-model ensemble framework, southwest of Iran

    Science.gov (United States)

    Zamani, Reza; Akhond-Ali, Ali-Mohammad; Roozbahani, Abbas; Fattahi, Rouhollah

    2017-08-01

    Water shortage and climate change are the most important issues for sustainable agricultural and water resources development. Given the importance of water availability in crop production, the present study focused on risk assessment of the climate change impact on agricultural water requirement in the southwest of Iran, under two emission scenarios (A2 and B1) for the future period (2025-2054). A multi-model ensemble framework based on the mean observed temperature-precipitation (MOTP) method and a combined probabilistic approach of the Long Ashton Research Station Weather Generator (LARS-WG) and change factor (CF) was used for downscaling, to manage the uncertainty of the outputs of 14 general circulation models (GCMs). The results showed increasing temperature in all months and irregular changes in precipitation (either increasing or decreasing) in the future period. In addition, the calculated annual net water requirement for all crops affected by climate change indicated an increase of between 4 and 10%. Furthermore, an increase is also expected in the required water demand volume. The largest and smallest expected increases in water demand volume are about 13 and 5% for the A2 and B1 scenarios, respectively. Considering these results and the limited water resources in the study area, water resources planning is crucial to reduce the negative effects of climate change. Therefore, adaptation scenarios related to crop pattern and water consumption should be taken into account.
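
    The change factor (CF) step mentioned in the abstract is, in its generic form, an additive shift for temperature and a multiplicative scaling for precipitation applied to an observed baseline. The sketch below illustrates that generic procedure with made-up monthly factors; it does not reproduce the paper's MOTP/LARS-WG ensemble.

```python
import numpy as np

def change_factor_temperature(obs_daily_t, gcm_baseline_monthly, gcm_future_monthly, months):
    """Additive change factor: shift each observed day by the GCM monthly temperature change."""
    delta = gcm_future_monthly - gcm_baseline_monthly          # one value per calendar month
    return obs_daily_t + delta[months - 1]

def change_factor_precipitation(obs_daily_p, gcm_baseline_monthly, gcm_future_monthly, months):
    """Multiplicative change factor: scale each observed day by the GCM monthly precipitation ratio."""
    ratio = gcm_future_monthly / gcm_baseline_monthly
    return obs_daily_p * ratio[months - 1]

# Toy example with synthetic monthly factors and a short daily record (months 4-12 padded)
months = np.array([1, 1, 2, 2, 3, 3])                     # calendar month of each day
obs_t = np.array([10.0, 11.0, 14.0, 15.0, 18.0, 19.0])    # observed daily temperature, degC
obs_p = np.array([2.0, 0.0, 5.0, 1.0, 0.0, 3.0])          # observed daily precipitation, mm

gcm_t_base = np.array([9.5, 13.5, 17.5] + [0.0] * 9)
gcm_t_fut = np.array([11.0, 15.5, 19.0] + [0.0] * 9)
gcm_p_base = np.array([60.0, 70.0, 80.0] + [1.0] * 9)
gcm_p_fut = np.array([55.0, 75.0, 70.0] + [1.0] * 9)

print(change_factor_temperature(obs_t, gcm_t_base, gcm_t_fut, months))
print(change_factor_precipitation(obs_p, gcm_p_base, gcm_p_fut, months))
```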

  14. Separate valuation subsystems for delay and effort decision costs.

    Science.gov (United States)

    Prévost, Charlotte; Pessiglione, Mathias; Météreau, Elise; Cléry-Melin, Marie-Laure; Dreher, Jean-Claude

    2010-10-20

    Decision making consists of choosing among available options on the basis of a valuation of their potential costs and benefits. Most theoretical models of decision making in behavioral economics, psychology, and computer science propose that the desirability of outcomes expected from alternative options can be quantified by utility functions. These utility functions allow a decision maker to assign subjective values to each option under consideration by weighting the likely benefits and costs resulting from an action and to select the one with the highest subjective value. Here, we used model-based neuroimaging to test whether the human brain uses separate valuation systems for rewards (erotic stimuli) associated with different types of costs, namely, delay and effort. We show that humans devalue rewards associated with physical effort in a strikingly similar fashion to the way they devalue rewards associated with delays, and that a single computational model derived from economics theory can account for the behavior observed in both delay discounting and effort discounting. However, our neuroimaging data reveal that the human brain uses distinct valuation subsystems for different types of costs, tracking delayed rewards and future energetic expenses in opposite fashion. The ventral striatum and the ventromedial prefrontal cortex represent the increasing subjective value of delayed rewards, whereas a distinct network, composed of the anterior cingulate cortex and the anterior insula, represents the decreasing value of the effortful option, coding the expected expense of energy. Together, these data demonstrate that the valuation processes underlying different types of costs can be fractionated at the cerebral level.
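
    The single economics-derived model referred to in the abstract is commonly a hyperbolic discounting function. The sketch below assumes that form, SV = R/(1 + k·cost), applied to both delay and effort costs with a softmax choice rule; parameter values are illustrative, not the fitted ones from the study.

```python
import numpy as np

def subjective_value(reward, cost, k):
    """Hyperbolic discounting: SV = R / (1 + k * cost), with cost either a delay or an effort level."""
    return reward / (1.0 + k * cost)

def choice_prob(sv_costly, sv_immediate, beta=3.0):
    """Softmax choice rule between the costly (delayed/effortful) option and the immediate one."""
    return 1.0 / (1.0 + np.exp(-beta * (sv_costly - sv_immediate)))

delays = np.array([0, 2, 4, 8, 16])      # arbitrary delay units
efforts = np.array([0, 1, 2, 3, 4])      # arbitrary effort levels
print(subjective_value(10.0, delays, k=0.2))     # value falls with delay
print(subjective_value(10.0, efforts, k=0.5))    # value falls with effort
print(choice_prob(subjective_value(10.0, 4, k=0.2), 5.0))
```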

  15. Review of current severe accident management approaches in Europe and identification of related modelling requirements for the computer code ASTEC V2.1

    Energy Technology Data Exchange (ETDEWEB)

    Hermsmeyer, S. [European Commission JRC, Petten (Netherlands). Inst. for Energy and Transport; Herranz, L.E.; Iglesias, R. [CIEMAT, Madrid (Spain); and others

    2015-07-15

    The severe accident at the Fukushima-Daiichi nuclear power plant (NPP) has led to a worldwide review of nuclear safety approaches and is bringing a refocussing of R and D in the field. To support these efforts several new Euratom FP7 projects have been launched. The CESAM project focuses on the improvement of the ASTEC computer code. ASTEC is jointly developed by IRSN and GRS and is considered as the European reference code for Severe Accident Analyses since it capitalizes knowledge from the extensive European R and D in the field. The project aims at the code's enhancement and extension for use in Severe Accident Management (SAM) analysis of the NPPs of Generation II-III presently under operation or foreseen in the near future in Europe, spent fuel pools included. The work reported here is concerned with the importance, for the further development of the code, of SAM strategies to be simulated. To this end, SAM strategies applied in the EU have been compiled. This compilation is mainly based on the public information made available in the frame of the EU 'stress tests' for NPPs and has been complemented by information provided by the different CESAM partners. The context of SAM is explained and the strategies are presented. The modelling capabilities for the simulation of these strategies in the current production version 2.0 of ASTEC are discussed. Furthermore, the requirements for the next version of ASTEC V2.1 that is supported in the CESAM project are highlighted. They are a necessary complement to the list of code improvements that is drawn from consolidating new fields of application, like SFP and BWR model enhancements, and from new experimental results on severe accident phenomena.

  16. Data and modelling requirements for CO2 inversions using high-frequency data

    International Nuclear Information System (INIS)

    Law, R.M.; Rayner, P.J.; Steele, L.P.; Enting, I.G.

    2003-01-01

    We explore the future possibilities for CO2 source estimation from atmospheric concentration data by performing synthetic data experiments. Synthetic data are used to test seasonal CO2 inversions using high-frequency data. Monthly CO2 sources over the Australian region are calculated for inversions with data at 4-hourly frequency and averaged over 1 d, 2.5 d, 5 d, 12.17 d and 1 month. The inversion quality, as determined by bias and uncertainty, is degraded when averaging over longer periods. This shows the value of the strong but relatively short-lived signals present in high-frequency records that are removed in averaged and particularly filtered records. Sensitivity tests are performed in which the synthetic data are 'corrupted' to simulate systematic measurement errors such as intercalibration differences or to simulate transport modelling errors. The inversion is also used to estimate the effect of calibration offsets between sites. We find that at short data-averaging periods the inversion is reasonably robust to measurement-type errors. For transport-type errors, the best results are achieved for synoptic (2-5 d) timescales. Overall the tests indicate that improved source estimates should be possible by incorporating continuous measurements into CO2 inversions.
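
    A minimal synthetic-data experiment in the spirit of the abstract can be written as a Gaussian Bayesian synthesis inversion. The sketch below generates synthetic observations from a toy linear "transport" operator and recovers the sources with their posterior uncertainty; dimensions and error levels are placeholders, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: 3 monthly source components observed through a linear transport operator H.
n_src, n_obs = 3, 40
H = rng.normal(size=(n_obs, n_src))          # stand-in for transport-model sensitivities
s_true = np.array([1.0, -0.5, 0.3])          # "true" sources used to generate synthetic data
R = 0.2 ** 2 * np.eye(n_obs)                 # data (measurement + model) error covariance
P = 1.0 ** 2 * np.eye(n_src)                 # prior source error covariance
s_prior = np.zeros(n_src)

d = H @ s_true + rng.normal(scale=0.2, size=n_obs)   # synthetic concentration data

# Gaussian Bayesian synthesis inversion
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
s_hat = s_prior + K @ (d - H @ s_prior)
P_post = (np.eye(n_src) - K @ H) @ P

print("estimated sources:", s_hat)
print("posterior source uncertainty:", np.sqrt(np.diag(P_post)))
```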

  17. Computational approaches and metrics required for formulating biologically realistic nanomaterial pharmacokinetic models

    International Nuclear Information System (INIS)

    Riviere, Jim E; Scoglio, Caterina; Sahneh, Faryad D; Monteiro-Riviere, Nancy A

    2013-01-01

    The field of nanomaterial pharmacokinetics is in its infancy, with major advances largely restricted by a lack of biologically relevant metrics, fundamental differences between particles and small molecules of organic chemicals and drugs relative to the biological processes involved in disposition, a scarcity of sufficiently rich and characterized in vivo data, and a lack of computational approaches for relating nanomaterial properties to biological endpoints. A central concept that links nanomaterial properties to biological disposition, in addition to their colloidal properties, is the tendency to form a biocorona, which modulates biological interactions including cellular uptake and biodistribution. Pharmacokinetic models must take this crucial process into consideration to accurately predict in vivo disposition, especially when extrapolating from laboratory animals to humans, since allometric principles may not be applicable. The dynamics of corona formation is thereby a crucial process governing the rate and extent of biodisposition. The challenge will be to develop a quantitative metric that characterizes a nanoparticle's surface adsorption forces that are important for predicting biocorona dynamics. The integrative quantitative approaches discussed in this paper for the dynamics of corona formation must be developed before realistic engineered nanomaterial risk assessment can be accomplished. (paper)

  18. Core competency requirements among extension workers in peninsular Malaysia: Use of Borich's needs assessment model.

    Science.gov (United States)

    Umar, Sulaiman; Man, Norsida; Nawi, Nolila Mohd; Latif, Ismail Abd; Samah, Bahaman Abu

    2017-06-01

    The study described the perceived importance of, and proficiency in, core agricultural extension competencies among extension workers in Peninsular Malaysia, and evaluated the resultant competency deficits. Borich's Needs Assessment Model was used to achieve the objectives of the study. A sample of 298 respondents was randomly selected and interviewed using a pre-tested structured questionnaire. Thirty-three core competency items were assessed. Instrument validity and reliability were ensured. The cross-sectional data obtained were analysed using SPSS for descriptive statistics, including the mean weighted discrepancy score (MWDS). Results of the study showed that on a scale of 5, the most important core extension competency items according to respondents' perception were: "Making good use of information and communication technologies/access and use of web-based resources" (M=4.86, SD=0.23); "Conducting needs assessments" (M=4.84, SD=0.16); "Organizing extension campaigns" (M=4.82, SD=0.47) and "Managing groups and teamwork" (M=4.81, SD=0.76). In terms of proficiency, the highest competency identified by the respondents was "Conducting farm and home visits" (M=3.62, SD=0.82), followed by "Conducting meetings effectively" (M=3.19, SD=0.72); "Conducting focus group discussions" (M=3.16, SD=0.32) and "Conducting community forums" (M=3.13, SD=0.64). The discrepancies implying competency deficits were widest in "Acquiring and allocating resources" (MWDS=12.67); use of information and communication technologies (ICTs) and web-based resources in agricultural extension (MWDS=12.59); and report writing and sharing the results and impacts (MWDS=11.92). It is recommended that any intervention aimed at developing the capacity of extension workers in Peninsular Malaysia should prioritize these core competency items in accordance with the deficits established in this study. Copyright © 2017 Elsevier Ltd. All rights reserved.
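
    The mean weighted discrepancy score (MWDS) used in the abstract follows Borich's formula: each respondent's importance-minus-proficiency gap is weighted by the item's mean importance, and the weighted gaps are averaged over respondents. A minimal sketch with made-up ratings:

```python
import numpy as np

def mwds(importance, proficiency):
    """Borich mean weighted discrepancy score for one competency item.

    importance, proficiency : per-respondent ratings (e.g., on a 1-5 scale)
    WDS_r = (I_r - P_r) * mean(I); MWDS = mean over respondents of WDS_r.
    """
    importance = np.asarray(importance, dtype=float)
    proficiency = np.asarray(proficiency, dtype=float)
    return np.mean((importance - proficiency) * importance.mean())

# Toy ratings from 6 respondents for one competency item
importance = [5, 5, 4, 5, 4, 5]
proficiency = [3, 2, 3, 3, 2, 3]
print(round(mwds(importance, proficiency), 2))   # larger values = larger training need
```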

  19. Comparison of listing strategies for allosensitized heart transplant candidates requiring transplant at high urgency: a decision model analysis.

    Science.gov (United States)

    Feingold, B; Webber, S A; Bryce, C L; Park, S Y; Tomko, H E; Comer, D M; Mahle, W T; Smith, K J

    2015-02-01

    Allosensitized children who require a negative prospective crossmatch have a high risk of death awaiting heart transplantation. Accepting the first suitable organ offer, regardless of the possibility of a positive crossmatch, would improve waitlist outcomes but it is unclear whether it would result in improved survival at all times after listing, including posttransplant. We created a Markov decision model to compare survival after listing with a requirement for a negative prospective donor cell crossmatch (WAIT) versus acceptance of the first suitable offer (TAKE). Model parameters were derived from registry data on status 1A (highest urgency) pediatric heart transplant listings. We assumed no possibility of a positive crossmatch in the WAIT strategy and a base-case probability of a positive crossmatch in the TAKE strategy of 47%, as estimated from cohort data. Under base-case assumptions, TAKE showed an incremental survival benefit of 1.4 years over WAIT. In multiple sensitivity analyses, including variation of the probability of a positive crossmatch from 10% to 100%, TAKE was consistently favored. While model input data were less well suited to comparing survival when awaiting transplantation across a negative virtual crossmatch, our analysis suggests that taking the first suitable organ offer under these circumstances is also favored. © Copyright 2015 The American Society of Transplantation and the American Society of Transplant Surgeons.
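
    A Markov cohort model of the WAIT versus TAKE strategies can be sketched as a monthly state-transition loop over waiting, post-transplant and dead states. The transition probabilities below are illustrative placeholders, not the registry-derived parameters of the study.

```python
def expected_survival(p_offer, p_wait_death, p_post_tx_death, months=120):
    """Monthly Markov cohort model; returns expected survival in months over the horizon."""
    wait, post_tx, alive_months = 1.0, 0.0, 0.0
    for _ in range(months):
        alive_months += wait + post_tx
        new_post_tx = wait * p_offer + post_tx * (1.0 - p_post_tx_death)
        new_wait = wait * (1.0 - p_offer) * (1.0 - p_wait_death)
        wait, post_tx = new_wait, new_post_tx
    return alive_months

# Illustrative monthly probabilities (not the study's estimates)
p_offer, p_wait_death, p_pos = 0.10, 0.03, 0.47
death_neg_xmatch, death_pos_xmatch = 0.004, 0.008   # monthly post-transplant mortality

# WAIT: only negative-crossmatch offers are accepted (lower offer rate, lower post-tx risk)
wait_years = expected_survival(p_offer * (1 - p_pos), p_wait_death, death_neg_xmatch) / 12
# TAKE: every suitable offer accepted; post-tx mortality blended over the crossmatch outcome
take_years = expected_survival(p_offer, p_wait_death,
                               p_pos * death_pos_xmatch + (1 - p_pos) * death_neg_xmatch) / 12
print(f"WAIT: {wait_years:.1f} years, TAKE: {take_years:.1f} years")
```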

  20. In-core power sharing and fuel requirement study for a decommissioning Boiling Water Reactor using the linear reactivity model

    International Nuclear Information System (INIS)

    Chen, Chung-Yuan; Tung, Wu-Hsiung; Yaur, Shung-Jung; Kuo, Weng-Sheng

    2014-01-01

    Highlights: • Linear reactivity model (LRM) was modified and applied to a Boiling Water Reactor. • The power sharing and fuel requirement study of the last cycle and the two cycles before decommissioning was implemented. • The loading pattern design concept for the cycles before decommissioning is carried out. - Abstract: A study of in-core power sharing and fuel requirement for a decommissioning BWR (Boiling Water Reactor) was carried out using the linear reactivity model (LRM). The power sharing of each fuel batch was taken as an independent variable, and the related parameters were set and modified to simulate actual cases. Optimizations of the last cycle and of the two cycles before decommissioning were both implemented; in the last-one-cycle optimization, a single cycle optimization was carried out with different upper limits of fuel batch power, whereas, in the two-cycle optimization, two cycles were optimized with different cycle lengths, along with two different optimization approaches, namely the simultaneous optimization of two cycles (MO) and two successive single-cycle optimizations (SO). The results of the last-one-cycle optimization show that it is better to increase the fresh fuel power and decrease the thrice-burnt fuel power as much as possible. They also show that relaxing the power limit is beneficial for the fresh fuel requirement, which will be reduced under a lower power limit. On the other hand, the results of the last-two-cycle (cycles N-1 and N) optimization show that MO is better than SO, and that the power of the fresh fuel batch should be decreased in cycle N-1 to save its energy for the next cycle. The results of the single-cycle optimization are found to be the same as those for cycle N of the multi-cycle optimization. Besides that, under the same total energy requirement for the two cycles, a long-short cycle length design can save more fresh fuel.
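
    The generic linear reactivity model assumes each batch's reactivity falls linearly with burnup and ends the cycle when the power-weighted core reactivity reaches zero. The sketch below implements that textbook form with batch power-sharing fractions as inputs; it is not the authors' modified BWR formulation, and the numbers are illustrative.

```python
import numpy as np

def lrm_cycle_burnup(rho_0, a_slope, power_fractions):
    """Linear reactivity model (LRM) sketch for an n-batch core.

    Batch reactivity is taken as rho_b(B) = rho_0 - A * B. power_fractions f_b are the
    in-core power-sharing fractions (fresh fuel first, summing to 1), assumed identical
    in every cycle, so per cycle batch b picks up a burnup of n * f_b * Bc, where Bc is
    the core-average cycle burnup. The cycle ends when the power-weighted core
    reactivity reaches zero: sum_b f_b * (rho_0 - A * n * Bc * cumsum(f)_b) = 0.
    Returns the core-average cycle burnup and the discharge burnup of the oldest batch.
    """
    f = np.asarray(power_fractions, dtype=float)
    f = f / f.sum()
    n = len(f)
    cum = np.cumsum(f)
    bc = rho_0 / (a_slope * n * np.sum(f * cum))
    discharge = n * bc * cum[-1]
    return bc, discharge

# Example: 4-batch core, rho_0 = 0.25, A = 0.01 per GWd/t, power biased toward fresh fuel
bc, bd = lrm_cycle_burnup(0.25, 0.01, [0.30, 0.27, 0.23, 0.20])
print(f"cycle burnup ~ {bc:.1f} GWd/t, discharge burnup ~ {bd:.1f} GWd/t")
```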

  1. Reduction of sample size requirements by bilateral versus unilateral research designs in animal models for cartilage tissue engineering.

    Science.gov (United States)

    Orth, Patrick; Zurakowski, David; Alini, Mauro; Cucchiarini, Magali; Madry, Henning

    2013-11-01

    Advanced tissue engineering approaches for articular cartilage repair in the knee joint rely on translational animal models. In these investigations, cartilage defects may be established either in one joint (unilateral design) or in both joints of the same animal (bilateral design). We hypothesized that a lower intraindividual variability following the bilateral strategy would reduce the number of required joints. Standardized osteochondral defects were created in the trochlear groove of 18 rabbits. In 12 animals, defects were produced unilaterally (unilateral design; n=12 defects), while defects were created bilaterally in 6 animals (bilateral design; n=12 defects). After 3 weeks, osteochondral repair was evaluated histologically applying an established grading system. Based on intra- and interindividual variabilities, required sample sizes for the detection of discrete differences in the histological score were determined for both study designs (α=0.05, β=0.20). Coefficients of variation (%CV) of the total histological score values were 1.9-fold increased following the unilateral design when compared with the bilateral approach (26 versus 14%CV). The resulting numbers of joints needed to treat were always higher for the unilateral design, resulting in an up to 3.9-fold increase in the required number of experimental animals. This effect was most pronounced for the detection of small effect sizes and for large standard deviations. The data underline the possible benefit of bilateral study designs for the decrease of sample size requirements for certain investigations in articular cartilage research. These findings might also be transferred to other scoring systems, defect types, or translational animal models in the field of cartilage tissue engineering.
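
    The link between the reported coefficients of variation and the required number of joints can be sketched with a standard two-sample normal-approximation sample-size formula. The mean score and detectable effect size below are assumed values for illustration, not those of the study.

```python
from math import ceil
from statistics import NormalDist

def joints_per_group(cv_percent, mean_score, effect_size, alpha=0.05, power=0.80):
    """Two-sample normal-approximation sample size (joints per group).

    cv_percent : coefficient of variation of the histological score (%)
    mean_score : mean total score, so sd = cv/100 * mean_score
    effect_size : absolute score difference to detect
    """
    sd = cv_percent / 100.0 * mean_score
    z = NormalDist().inv_cdf(1 - alpha / 2) + NormalDist().inv_cdf(power)
    return ceil(2 * (z * sd / effect_size) ** 2)

# Illustration with the reported ~26% CV (unilateral) vs ~14% CV (bilateral)
for cv, label in [(26, "unilateral"), (14, "bilateral")]:
    print(label, joints_per_group(cv, mean_score=20.0, effect_size=3.0))
```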

  2. Good Faith, Bad Faith? Making an Effort in Dispute Resolution

    Directory of Open Access Journals (Sweden)

    Tania Sourdin

    2013-12-01

    Full Text Available The behaviour of those engaged in negotiation and Alternative Dispute Resolution (ADR) processes that are undertaken or required before or after litigation is increasingly the subject of legislative regulation. Recent case law has also more clearly articulated the characteristics of good faith as well as other standards such as 'genuine effort' and explored to a limited extent the behavioural indicators and approaches that could be used to determine the meaning and scope of these types of concepts. Arguably, the growth in mandatory (rather than voluntary) ADR may require the articulation of clearer conduct obligations, as ADR participants may be disinclined to negotiate or may be relatively unsophisticated or unaware of their negotiation behaviour. This article explores the development of conduct obligations and notes that whilst the requirements need to be linked to the circumstances of each dispute, there are some clear differences in terms of how these requirements are more generally interpreted by lawyers and others.

  3. Effort, success, and nonuse determine arm choice.

    Science.gov (United States)

    Schweighofer, Nicolas; Xiao, Yupeng; Kim, Sujin; Yoshioka, Toshinori; Gordon, James; Osu, Rieko

    2015-07-01

    How do humans choose one arm or the other to reach single targets in front of the body? Current theories of reward-driven decision making predict that choice results from a comparison of "action values," which are the expected rewards for possible actions in a given state. In addition, current theories of motor control predict that in planning arm movements, humans minimize an expected motor cost that balances motor effort and endpoint accuracy. Here, we test the hypotheses that arm choice is determined by comparison of action values comprising expected effort and expected task success for each arm, as well as a handedness bias. Right-handed subjects, in either a large or small target condition, were first instructed to use each hand in turn to shoot through an array of targets and then to choose either hand to shoot through the same targets. Effort was estimated via inverse kinematics and dynamics. A mixed-effects logistic-regression analysis showed that, as predicted, both expected effort and expected success predicted choice, as did arm use in the preceding trial. Finally, individual parameter estimation showed that the handedness bias correlated with the mean difference between right- and left-arm success, leading to overall lower use of the left arm. We discuss our results in light of arm nonuse in individuals post-stroke. Copyright © 2015 the American Physiological Society.

  4. Philanthropies Add Weight to "i3" Effort

    Science.gov (United States)

    Robelen, Erik W.; McNeil, Michele

    2010-01-01

    The author reports on a new effort by 12 major education philanthropies that aims to dovetail with the Education Department's "i3" agenda, raising complex issues. The decision by a dozen major education grantmakers to team up on an initiative designed to dovetail with the federal "Investing in Innovation" grant competition is being seen by…

  5. Behavior Contracts: A Home School Cooperative Effort.

    Science.gov (United States)

    Stitely, Rose Patton

    1978-01-01

    This paper focuses on behavior contracts at school which attempt to promote a home school cooperative effort. The contract is drawn up at school, and classroom teachers award points for appropriate school behaviors; parents in turn reward the student if the report is good. (DS)

  6. Has Malaysia's antidrug effort been effective?

    Science.gov (United States)

    Scorzelli, J F

    1992-01-01

    It is a common belief that a massive effort in law enforcement, preventive education and rehabilitation will result in the elimination of a country's drug problem. Based on this premise, Malaysia in 1983 implemented such a multifaceted anti-drug strategy, and the results of a 1987 study by the author suggested that Malaysia's effort had begun to contribute to a steady decrease in the number of identified drug abusers. Although the number of drug-addicted individuals declined, the country's recidivism rates were still high. Because of this high relapse rate, Malaysia expanded its rehabilitation effort and developed a community transition program. In order to determine the impact of these changes on the country's battle against drug abuse, a follow-up study was conducted in 1990. The results of this study did not clearly demonstrate that the Malaysian effort had been successful in eliminating the problem of drug abuse, and raised some questions concerning the effectiveness of the country's drug treatment programs.

  7. Student Effort, Consistency, and Online Performance

    Science.gov (United States)

    Patron, Hilde; Lopez, Salvador

    2011-01-01

    This paper examines how student effort, consistency, motivation, and marginal learning, influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas…

  8. Coordinating the United States Interagency Partnering Effort

    Science.gov (United States)

    2013-03-01

    operations now will mean throwing away hard-fought gains, and expose the United States to new risks from across the globalising effort and how they present themselves. OIP will leverage technology and training to provide the best information and personnel for interagency

  9. Hydrogen economy: a little bit more effort

    International Nuclear Information System (INIS)

    Pauron, M.

    2008-01-01

    In a few years, the use of hydrogen in the economy has become a credible possibility. Today, billions of euros are invested in the hydrogen industry, which is strengthened by technological advances in fuel cell development and by an increasing optimism. However, additional research efforts and more financing will be necessary to make the dream of a hydrogen-based economy a reality.

  10. Motivational climate, behaviour regulation and perceived effort in soccer athletes

    Directory of Open Access Journals (Sweden)

    Diogo Monteiro

    2014-12-01

    Full Text Available The purpose of this study was to test the integration of two motivational theoretical models (self-determination theory and achievement goal theory) to analyze the impact of the motivational climate on the regulation of motivation and on athletes' effort perception. A total of 460 athletes participated in the study (male football players at both regional and national level, in the beginner, youth, junior and senior categories, aged 17.42 ± 4.37 years). The quality of the structural equation model was examined by the Chi-square value and some complementary model fit indices. The results support the model fit (S-Bχ²= 288.84, df= 147, p< 0.001, S-Bχ²/df= 1.96, SRMR= 0.049, NNFI= 0.912, CFI= 0.924, RMSEA= 0.046, 90% CI RMSEA= 0.038−0.054), suggesting that a task-oriented motivational climate has a significant positive effect on autonomous motivation, which in turn has a significant positive effect on athletes' effort perception. On the other hand, an ego-oriented climate had a positive effect on controlled motivation, which in turn had a negative effect on athletes' effort perception, although not significant.

  11. On Deriving Requirements for the Surface Mass Balance forcing of a Greenland Ice Sheet Model using Uncertainty Analyses

    Science.gov (United States)

    Schlegel, N.; Larour, E. Y.; Box, J. E.

    2015-12-01

    During July of 2012, the percentage of the Greenland surface exposed to melt was the largest in recorded history. And, even though evidence of increased melt rates had been captured by remote sensing observations throughout the last decade, this particular event took the community by surprise. How Greenland ice flow will respond to such an event or to increased frequencies of extreme melt events in the future is unclear, as it requires detailed comprehension of Greenland surface climate and the ice sheet's sensitivity to associated uncertainties. With established uncertainty quantification (UQ) tools embedded within the Ice Sheet System Model (ISSM), we conduct decadal-scale forward modeling experiments to 1) quantify the spatial resolution needed to effectively force surface mass balance (SMB) in various regions of the ice sheet and 2) determine the dynamic response of Greenland outlet glaciers to variations in SMB. First, we perform sensitivity analyses to determine how perturbations in SMB affect model output; results allow us to investigate the locations where variations most significantly affect ice flow, and on what spatial scales. Next, we apply Monte-Carlo style sampling analyses to determine how errors in SMB propagate through the model as uncertainties in estimates of Greenland ice discharge and regional mass balance. This work is performed at the California Institute of Technology's Jet Propulsion Laboratory under a contract with the National Aeronautics and Space Administration's Cryosphere Program.
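
    A minimal Monte-Carlo style propagation in the spirit of the second experiment can be sketched by sampling SMB forcing errors and pushing them through a mass budget. The basin values and the 15% forcing error below are invented stand-ins, and the static budget replaces an actual ISSM forward run.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy regional setup: annual SMB and ice discharge (Gt/yr) for a few drainage basins.
smb_mean = np.array([120.0, 80.0, 150.0, 60.0])          # hypothetical basin-mean SMB
smb_sigma = 0.15 * smb_mean                              # assumed 15% 1-sigma forcing error
discharge = np.array([110.0, 90.0, 130.0, 70.0])         # hypothetical basin discharge

n_samples = 10_000
smb_samples = rng.normal(smb_mean, smb_sigma, size=(n_samples, smb_mean.size))

# Propagate the sampled SMB forcing to basin and total mass balance (a stand-in for
# running the ice-sheet model itself on every sample).
basin_mb = smb_samples - discharge
total_mb = basin_mb.sum(axis=1)

print("basin mass-balance spread (Gt/yr):", basin_mb.std(axis=0).round(1))
print("total mass-balance mean +/- 1 sigma (Gt/yr):",
      round(total_mb.mean(), 1), "+/-", round(total_mb.std(), 1))
```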

  12. Serotonin transporter is not required for the development of severe pulmonary hypertension in the Sugen hypoxia rat model.

    Science.gov (United States)

    de Raaf, Michiel Alexander; Kroeze, Yvet; Middelman, Anthonieke; de Man, Frances S; de Jong, Helma; Vonk-Noordegraaf, Anton; de Korte, Chris; Voelkel, Norbert F; Homberg, Judith; Bogaard, Harm Jan

    2015-11-15

    Increased serotonin serum levels have been proposed to play a key role in pulmonary arterial hypertension (PAH) by regulating vessel tone and vascular smooth muscle cell proliferation. An intact serotonin system, which critically depends on a normal function of the serotonin transporter (SERT), is required for the development of experimental pulmonary hypertension in rodents exposed to hypoxia or monocrotaline. While these animal models resemble human PAH only with respect to vascular media remodeling, we hypothesized that SERT is likewise required for the presence of lumen-obliterating intima remodeling, a hallmark of human PAH reproduced in the Sugen hypoxia (SuHx) rat model of severe angioproliferative pulmonary hypertension. Therefore, SERT wild-type (WT) and knockout (KO) rats were exposed to the SuHx protocol. SERT KO rats, while completely lacking SERT, were hemodynamically indistinguishable from WT rats. After exposure to SuHx, similar degrees of severe angioproliferative pulmonary hypertension and right ventricular hypertrophy developed in WT and KO rats (right ventricular systolic pressure 60 vs. 55 mmHg, intima thickness 38 vs. 30%, respectively). In conclusion, despite its implicated importance in PAH, SERT does not play an essential role in the pathogenesis of severe angioobliterative pulmonary hypertension in rats exposed to SuHx. Copyright © 2015 the American Physiological Society.

  13. Sensitivity and requirement of improvements of four soybean crop simulation models for climate change studies in Southern Brazil

    Science.gov (United States)

    Battisti, R.; Sentelhas, P. C.; Boote, K. J.

    2017-12-01

    Crop growth models have many uncertainties that affect the yield response to climate change. Based on that, the aim of this study was to evaluate the sensitivity of crop models to systematic changes in climate for simulating soybean attainable yield in Southern Brazil. Four crop models were used to simulate yields: AQUACROP, MONICA, DSSAT, and APSIM, as well as their ensemble. The simulations were performed considering changes of air temperature (0, +1.5, +3.0, +4.5, and +6.0 °C), [CO2] (380, 480, 580, 680, and 780 ppm), rainfall (-30, -15, 0, +15, and +30%), and solar radiation (-15, 0, +15%), applied to daily values. The baseline climate was from 1961 to 2014, totalling 53 crop seasons. The crop models simulated a reduction of attainable yield with temperature increase, reaching 2000 kg ha-1 for the ensemble at +6 °C, mainly due to a shorter crop cycle. For rainfall, the rate of yield loss under decreased rainfall was larger than the rate of yield gain under increased rainfall. The crop models increased yield variability when solar radiation was changed from -15 to +15%, whereas the [CO2] rise resulted in yield gains, following an asymptotic response, with a mean increase of 31% from 380 to 680 ppm. The models used require further attention to improvements in the optimal and maximum cardinal temperatures for development rate; runoff, water infiltration, deep drainage, and the dynamics of root growth; photosynthesis parameters related to soil water availability; and the energy balance of the soil-plant system to define leaf temperature under elevated CO2.

  14. Motor effort alters changes of mind in sensorimotor decision making.

    Directory of Open Access Journals (Sweden)

    Diana Burk

    Full Text Available After committing to an action, a decision-maker can change their mind to revise the action. Such changes of mind can even occur when the stream of information that led to the action is curtailed at movement onset. This is explained by the time delays in sensory processing and motor planning, which lead to a component at the end of the sensory stream that can only be processed after initiation. Such post-initiation processing can explain the pattern of changes of mind by positing an accumulation of additional evidence to a criterion level, termed the change-of-mind bound. Here we test the hypothesis that the physical effort associated with the movement required to change one's mind affects the level of the change-of-mind bound and the time for post-initiation deliberation. We varied the effort required to change from one choice target to another in a reaching movement by varying the geometry of the choice targets or by applying a force field between the targets. We show that there is a reduction in the frequency of changes of mind when the separation of the choice targets would require a larger excursion of the hand from the initial to the opposite choice. The reduction is best explained by an increase in the evidence required for changes of mind and a reduced time period of integration after the initial decision. Thus the criterion for revising an initial choice is sensitive to energetic costs.
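
    The post-initiation accumulation described in the abstract can be sketched as a bounded accumulator that keeps integrating pipeline evidence after the first commitment and reverses only if it reaches a change-of-mind bound, taken here to grow with the effort of switching targets. Drift, bounds and pipeline length are illustrative, not the fitted model.

```python
import numpy as np

def simulate_trial(drift, init_bound, com_bound, pipeline_steps, noise=1.0, rng=None):
    """Bounded evidence accumulation with post-initiation processing.

    The decision variable accumulates to +/- init_bound to trigger the initial choice.
    Evidence still in the sensorimotor pipeline is then processed for `pipeline_steps`
    more steps; a change of mind occurs if the accumulator reaches the opposite
    change-of-mind bound (com_bound), assumed to grow with the motor effort of switching.
    """
    if rng is None:
        rng = np.random.default_rng()
    x, choice = 0.0, None
    while choice is None:                                   # pre-initiation accumulation
        x += drift + noise * rng.normal()
        if abs(x) >= init_bound:
            choice = int(np.sign(x))
    for _ in range(pipeline_steps):                         # post-initiation processing
        x += drift + noise * rng.normal()
        if np.sign(x) != choice and abs(x) >= com_bound:
            return choice, -choice                          # changed mind
    return choice, choice

rng = np.random.default_rng(1)
for effort, com_bound in [("low effort", 1.5), ("high effort", 3.0)]:
    trials = [simulate_trial(0.05, 1.0, com_bound, 30, rng=rng) for _ in range(5000)]
    com_rate = np.mean([first != final for first, final in trials])
    print(effort, f"change-of-mind rate = {com_rate:.3f}")
```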

  15. A Simple Scheme for Modeling Irrigation Water Requirements at the Regional Scale Applied to an Alpine River Catchment

    Directory of Open Access Journals (Sweden)

    Pascalle C. Smith

    2012-11-01

    Full Text Available This paper presents a simple approach for estimating the spatial and temporal variability of seasonal net irrigation water requirement (IWR) at the catchment scale, based on gridded land use, soil and daily weather data at 500 × 500 m resolution. In this approach, IWR is expressed as a bounded, linear function of the atmospheric water budget, whereby the latter is defined as the difference between seasonal precipitation and reference evapotranspiration. To account for the effects of soil and crop properties on the soil water balance, the coefficients of the linear relation are expressed as a function of the soil water holding capacity and the so-called crop coefficient. The 12 parameters defining the relation were estimated with good coefficients of determination from a systematic analysis of simulations performed at daily time step with a FAO-type point-scale model for five climatically contrasted sites around the River Rhone and for combinations of six crop and ten soil types. The simple scheme was found to reproduce well the results obtained with the daily model at six additional verification sites. We applied the simple scheme to the assessment of irrigation requirements in the whole Swiss Rhone catchment. The results suggest seasonal requirements of 32 × 10⁶ m³ per year on average over 1981–2009, half of which at altitudes above 1500 m. They also disclose a positive trend in the intensity of extreme events over the study period, with an estimated total IWR of 55 × 10⁶ m³ in 2009, and indicate a 45% increase in water demand of grasslands during the 2003 European heat wave in the driest area of the studied catchment. In view of its simplicity, the approach can be extended to other applications, including assessments of the impacts of climate and land-use change.
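
    The bounded linear relation described in the abstract can be sketched as IWR = max(0, intercept + slope·(P − ET0)), with the intercept tied to the crop coefficient and the soil water holding capacity. The coefficient values below are placeholders, not the 12 fitted parameters of the paper.

```python
def net_irrigation_requirement(precip, et0, kc, awc, a1=-0.6, b_awc=0.15, b_kc=80.0):
    """Bounded linear estimate of seasonal net irrigation water requirement (mm).

    The seasonal atmospheric water budget is W = P - ET0 (mm). IWR is modeled as a
    linear function of W whose intercept depends on the soil water holding capacity
    (awc, mm) and the crop coefficient (kc), and is bounded below by zero.
    All coefficients here are illustrative placeholders.
    """
    water_budget = precip - et0
    intercept = b_kc * kc - b_awc * awc      # demanding crops raise, deeper soils lower IWR
    iwr = intercept + a1 * water_budget      # drier seasons (more negative W) increase IWR
    return max(0.0, iwr)

# Toy example: a dry season (P = 300 mm, ET0 = 550 mm) for grassland on a shallow soil
print(round(net_irrigation_requirement(precip=300, et0=550, kc=0.9, awc=60), 1), "mm")
```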

  16. SeReM2--a meta-model for the structured definition of quality requirements for electronic health record services.

    Science.gov (United States)

    Hoerbst, Alexander; Hackl, Werner; Ammenwerth, Elske

    2010-01-01

    Quality assurance is a major task with regard to Electronic Health Records (EHR). Currently there are only a few approaches explicitly dealing with the quality of EHR services as a whole. The objective of this paper is to introduce a new meta-model to structure and describe quality requirements of EHRs. This approach should support the transnational quality certification of EHR services. The model was developed based on interviews with 24 experts and a systematic literature search, and comprises a service model and a requirements model. The service model represents the structure of a service, whereas the requirements model can be used to assign specific predefined aims and requirements to a service. The new model differs from existing approaches as it accounts for modern software architectures and the special attributes of EHRs.

  17. Predicting Software Test Effort in Iterative Development Using a Dynamic Bayesian Network

    OpenAIRE

    Torkar, Richard; Awan, Nasir Majeed; Alvi, Adnan Khadem; Afzal, Wasif

    2010-01-01

    Projects following iterative software development methodologies must still be managed in such a way as to maximize quality and minimize costs. However, there are indications that predicting test effort in iterative development is challenging and currently there seem to be no models for test effort prediction. This paper introduces and validates a dynamic Bayesian network for predicting test effort in iterative software development. The proposed model is validated by the use of data from two indu...

  18. Not all effort is equal: the role of the anterior cingulate cortex in different forms of effort-reward decisions

    Science.gov (United States)

    Holec, Victoria; Pirot, Heather L.; Euston, David R.

    2014-01-01

    The rat anterior cingulate cortex (ACC) mediates effort-based decision making when the task requires the physical effort of climbing a ramp. Normal rats will readily climb a barrier leading to high reward whereas rats with ACC lesions will opt instead for an easily obtained small reward. The present study explored whether the role of ACC in cost-benefit decisions extends beyond climbing by testing its role in ramp climbing as well as two novel cost-benefit decision tasks, one involving the physical effort of lifting weights and the other the emotional cost of overcoming fear (i.e., “courage”). As expected, rats with extensive ACC lesions tested on a ramp-climbing task were less likely to choose a high-reward/high-effort arm than sham controls. However, during the first few trials, lesioned rats were as likely as controls to initially turn into the high-reward arm (HRA) but far less likely to actually climb the barrier, suggesting that the role of the ACC is not in deciding which course of action to pursue, but rather in maintaining a course of action in the face of countervailing forces. In the effort-reward decision task involving weight lifting, some lesion animals behaved like controls while others avoided the HRA. However, the results were not statistically significant and a follow-up study using incremental increasing effort failed to show any difference between lesion and control groups. The results suggest that the ACC is not needed for effort-reward decisions involving weight lifting but may affect motor abilities. Finally, a courage task explored the willingness of rats to overcome the fear of crossing an open, exposed arm to obtain a high reward. Both sham and ACC-lesioned animals exhibited equal tendencies to enter the open arm. However, whereas sham animals gradually improved on the task, ACC-lesioned rats did not. Taken together, the results suggest that the role of the ACC in effort-reward decisions may be limited to certain tasks. PMID:24478659

  19. Not all effort is equal: the role of the anterior cingulate cortex in different forms of effort-reward decisions

    Directory of Open Access Journals (Sweden)

    Victoria eHolec

    2014-01-01

    Full Text Available The rat anterior cingulate cortex (ACC) mediates effort-based decision making when the task requires the physical effort of climbing a ramp. Normal rats will readily climb a barrier leading to high reward whereas rats with ACC lesions will opt instead for an easily obtained small reward. The present study explored whether the role of ACC in cost-benefit decisions extends beyond climbing by testing its role in ramp climbing as well as two novel cost-benefit decision tasks, one involving the physical effort of lifting weights and the other the emotional cost of overcoming fear (i.e., courage). As expected, rats with extensive ACC lesions tested on a ramp-climbing task were less likely to choose a high-reward/high-effort arm than sham controls. However, during the first few trials, lesioned rats were as likely as controls to initially turn into the high-reward arm but far less likely to actually climb the barrier, suggesting that the role of the ACC is not in deciding which course of action to pursue, but rather in maintaining a course of action in the face of countervailing forces. In the effort-reward decision task involving weight lifting, some lesion animals behaved like controls while others avoided the high reward arm. However, the results were not statistically significant and a follow-up study using incremental increasing effort failed to show any difference between lesion and control groups. The results suggest that the ACC is not needed for effort-reward decisions involving weight lifting but may affect motor abilities. Finally, a courage task explored the willingness of rats to overcome the fear of crossing an open, exposed arm to obtain a high reward. Both sham and ACC-lesioned animals exhibited equal tendencies to enter the open arm. However, whereas sham animals gradually improved on the task, ACC-lesioned rats did not. Taken together, the results suggest that the role of the ACC in effort-reward decisions may be limited to certain

  20. Heart rate variability related to effort at work.

    Science.gov (United States)

    Uusitalo, Arja; Mets, Terhi; Martinmäki, Kaisu; Mauno, Saija; Kinnunen, Ulla; Rusko, Heikki

    2011-11-01

    Changes in autonomic nervous system function have been related to work stress-induced increases in cardiovascular morbidity and mortality. Our purpose was to examine whether various heart rate variability (HRV) measures and new HRV-based relaxation measures are related to self-reported chronic work stress and daily emotions. The relaxation measures are based on neural network modelling of individual baseline heart rate and HRV information. Nineteen healthy hospital workers were studied on two work days within the same work period. Daytime, work time and night time heart rate, as well as physical activity, were recorded. An effort-reward imbalance (ERI) questionnaire was used to assess chronic work stress. The emotions of stress, irritation and satisfaction were assessed six times during both days. Seventeen subjects had an ERI ratio over 1, indicating imbalance between effort and reward, that is, chronic work stress. Of the daily emotions, satisfaction was the predominant emotion. The daytime relaxation percentage was higher on Day 2 than on Day 1 (4 ± 6% vs. 2 ± 3%, p work time relaxation on both days. Chronic work stress correlated with the vagal activity index of HRV. However, effort at work had many HRV correlates: the higher the work effort, the lower the daytime HRV and relaxation time. Emotions at work were also correlated with work time (stress and satisfaction) and night time (irritation) HRV. These results indicate that daily emotions at work and chronic work stress, especially effort, are associated with cardiac autonomic function. Neural network modelling of individual heart rate and HRV information may provide additional information in stress research in field conditions. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.