WorldWideScience

Sample records for modeling efforts required

  1. Cognitive Effort Requirements in Recall, Recognition, and Lexical Decision

    Science.gov (United States)

    1985-05-01

...that the amount of integrative processing required for an item is a function of its preexisting or baseline familiarity level. Low frequency words... (Lockhart, Craik, & Jacoby, 1976). In the present study, increased effort, and possibly increased distinctiveness, does not influence hit rates, which are... ...ing of items. Second, a lexical decision task, which does not require elaborative processing, leads to an overall poor level of recall. Furthermore...

  2. Manage changes in the requirements definition through a collaborative effort

    CSIR Research Space (South Africa)

    Joseph-Malherbe, S

    2009-08-01

    Full Text Available to the engineering effort. A history of changes made to the data repository should be kept throughout the SE process. The software development community refers to such a history of changes as version control. This will enable the systems engineer to generate a list... of changes at any point during the development process, showing the time of the change and by whom the change was introduced. By sharing the change history with stakeholders, they can see how the model evolved and what the rationale was for each change...

  3. Towards an Early Software Effort Estimation Based on Functional and Non-Functional Requirements

    Science.gov (United States)

    Kassab, Mohamed; Daneva, Maya; Ormandjieva, Olga

    The increased awareness of the non-functional requirements as a key to software project and product success makes explicit the need to include them in any software project effort estimation activity. However, the existing approaches to defining size-based effort relationships still pay insufficient attention to this need. This paper presents a flexible, yet systematic approach to the early requirements-based effort estimation, based on Non-Functional Requirements ontology. It complementarily uses one standard functional size measurement model and a linear regression technique. We report on a case study which illustrates the application of our solution approach in context and also helps evaluate our experiences in using it.
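The size-based effort relationship the abstract refers to can be sketched as a simple linear regression of historical effort on functional size, with a separate multiplier standing in for non-functional overhead. Everything below (the data, the `nfr_factor` parameter, the function names) is a hypothetical illustration, not the model from the paper.

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Hypothetical historical projects: (functional size, person-hours).
sizes = [100, 150, 200, 250, 300]
efforts = [520, 760, 1010, 1240, 1500]

a, b = fit_linear(sizes, efforts)

def estimate_effort(size, nfr_factor=1.0):
    """Predict effort; nfr_factor > 1 inflates for NFR-heavy projects."""
    return (a + b * size) * nfr_factor

print(round(estimate_effort(180), 1))                  # 908.4
print(round(estimate_effort(180, nfr_factor=1.2), 1))  # 1090.1
```

The point of the sketch is only the shape of the approach: fit the size-effort relation on functional size, then adjust for non-functional requirements separately rather than folding them into the size measure.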

  4. Identification of efforts required for continued safe operation of KANUPP

    International Nuclear Information System (INIS)

    Ghafoor, M.A.; Hashmi, J.A.; Siddiqui, Z.H.

    1991-01-01

Kanupp, the first commercial CANDU PHWR, rated at 137 MWe, was built on a turnkey basis by the Canadian General Electric Company for the Pakistan Atomic Energy Commission, and went operational in October 1972 near Karachi. It has operated since then with a lifetime average availability factor of 51.5% and a capacity factor of 25%. In 1976, Kanupp lost the technical support of its original vendors due to the Canadian embargo on the export of nuclear technology. Simultaneously, the world experienced the most explosive development and advancement in electronic and computer technology, accelerating the obsolescence of such equipment and systems installed in Kanupp. Replacement and upgrading of obsolete computers, control and instrumentation was thus the first major set of efforts recognized as essential for continued safe operation. On the other hand, Kanupp was able to cope with the normal maintenance of its process, mechanical and electrical equipment until the late 1980s. But now many of these components are reaching the end of their useful life and developing chronic problems due to ageing, which can only be solved by complete replacement. This is much more difficult for custom-made nuclear process equipment, e.g. the reactor internals and the fuelling machine. Public awareness and international concern about nuclear safety have increased significantly since the TMI and Chernobyl events. The corresponding realization of the critical role of human factors and the importance of operational experience feedback has helped Kanupp by opening international channels of communication, including renewed cooperation on CANDU technology. The safety standards and criteria for CANDU as well as other NPPs have matured and evolved gradually over the past two decades. First, Kanupp has to ensure that its present ageing-induced equipment problems are resolved to satisfy the original safety requirements and public risk targets, which are still internationally acceptable. But as a policy, we...

  5. Identification of efforts required for continued safe operation of KANUPP

    Energy Technology Data Exchange (ETDEWEB)

    Ghafoor, M A; Hashmi, J A; Siddiqui, Z H [Karachi Nuclear Power Plant, Karachi (Pakistan)

    1991-04-01

Kanupp, the first commercial CANDU PHWR, rated at 137 MWe, was built on a turnkey basis by the Canadian General Electric Company for the Pakistan Atomic Energy Commission, and went operational in October 1972 near Karachi. It has operated since then with a lifetime average availability factor of 51.5% and a capacity factor of 25%. In 1976, Kanupp lost the technical support of its original vendors due to the Canadian embargo on the export of nuclear technology. Simultaneously, the world experienced the most explosive development and advancement in electronic and computer technology, accelerating the obsolescence of such equipment and systems installed in Kanupp. Replacement and upgrading of obsolete computers, control and instrumentation was thus the first major set of efforts recognized as essential for continued safe operation. On the other hand, Kanupp was able to cope with the normal maintenance of its process, mechanical and electrical equipment until the late 1980s. But now many of these components are reaching the end of their useful life and developing chronic problems due to ageing, which can only be solved by complete replacement. This is much more difficult for custom-made nuclear process equipment, e.g. the reactor internals and the fuelling machine. Public awareness and international concern about nuclear safety have increased significantly since the TMI and Chernobyl events. The corresponding realization of the critical role of human factors and the importance of operational experience feedback has helped Kanupp by opening international channels of communication, including renewed cooperation on CANDU technology. The safety standards and criteria for CANDU as well as other NPPs have matured and evolved gradually over the past two decades. First, Kanupp has to ensure that its present ageing-induced equipment problems are resolved to satisfy the original safety requirements and public risk targets, which are still internationally acceptable. But as a policy, we...

  6. Hierarchy, Dominance, and Deliberation: Egalitarian Values Require Mental Effort.

    Science.gov (United States)

    Van Berkel, Laura; Crandall, Christian S; Eidelman, Scott; Blanchar, John C

    2015-09-01

Hierarchy and dominance are ubiquitous. Because social hierarchy is learned early and highly rehearsed, the value of hierarchy enjoys relative ease over competing egalitarian values. In six studies, we interfere with deliberate thinking and measure endorsement of hierarchy and egalitarianism. In Study 1, bar patrons' blood alcohol content was correlated with hierarchy preference. In Study 2, cognitive load increased the authority/hierarchy moral foundation. In Study 3, low-effort thought instructions increased hierarchy endorsement and reduced equality endorsement. In Study 4, ego depletion increased hierarchy endorsement and caused a trend toward reduced equality endorsement. In Study 5, low-effort thought instructions increased endorsement of hierarchical attitudes among those with a sense of low personal power. In Study 6, participants thinking quickly allocated more resources to high-status groups. Across five operationalizations of impaired deliberative thought, hierarchy endorsement increased and egalitarianism receded. These data suggest hierarchy may persist in part because it has a psychological advantage. © 2015 by the Society for Personality and Social Psychology, Inc.

  7. ERP services effort estimation strategies based on early requirements

    NARCIS (Netherlands)

    Erasmus, I.P.; Daneva, Maia; Kalenborg, Axel; Trapp, Marcus

    2015-01-01

ERP clients and vendors necessarily estimate their project interventions at a very early stage, before the full requirements for an ERP solution are known and often before a contract is finalized between a vendor/consulting company and a client. ERP project estimation at the stage of early...

  8. Air Quality Science and Regulatory Efforts Require Geostationary Satellite Measurements

    Science.gov (United States)

    Pickering, Kenneth E.; Allen, D. J.; Stehr, J. W.

    2006-01-01

Air quality scientists and regulatory agencies would benefit from the high spatial and temporal resolution trace gas and aerosol data that could be provided by instruments on a geostationary platform. More detailed time-resolved data from a geostationary platform could be used in tracking regional transport and in evaluating mesoscale air quality model performance in terms of photochemical evolution throughout the day. The diurnal cycle of photochemical pollutants is currently missing from the data provided by the current generation of atmospheric chemistry satellites which provide only one measurement per day. Often peak surface ozone mixing ratios are reached much earlier in the day during major regional pollution episodes than during local episodes due to downward mixing of ozone that had been transported above the boundary layer overnight. The regional air quality models often do not simulate this downward mixing well enough and underestimate surface ozone in regional episodes. Having high time-resolution geostationary data will make it possible to determine the magnitude of this lower- and mid-tropospheric transport that contributes to peak eight-hour average ozone and 24-hour average PM2.5 concentrations. We will show ozone and PM2.5 episodes from the CMAQ model and suggest ways in which geostationary satellite data would improve air quality forecasting. Current regulatory modeling is typically being performed at 12 km horizontal resolution. State and regional air quality regulators in regions with complex topography and/or land-sea breezes are anxious to move to 4-km or finer resolution simulations. Geostationary data at these or finer resolutions will be useful in evaluating such models.

  9. Efforts and models of education for parents

    DEFF Research Database (Denmark)

    Jensen, Niels Rosendal

    2010-01-01

The article reviews models of parent education used primarily in Denmark and situates these models within broader perspectives on the education system and the current discourse on holding parents responsible. Publication date: March 2010.

  10. Standardizing economic analysis in prevention will require substantial effort.

    Science.gov (United States)

    Guyll, Max

    2014-12-01

    It is exceedingly difficult to compare results of economic analyses across studies due to variations in assumptions, methodology, and outcome measures, a fact which surely decreases the impact and usefulness of prevention-related economic research. Therefore, Crowley et al. (Prevention Science, 2013) are precisely correct in their call for increased standardization and have usefully highlighted the issues that must be addressed. However, having made the need clear, the questions become what form the solution should take, and how should it be implemented. The present discussion outlines the rudiments of a comprehensive framework for promoting standardized methodology in the estimation of economic outcomes, as encouraged by Crowley et al. In short, a single, standard, reference case approach should be clearly articulated, and all economic research should be encouraged to apply that standard approach, with results from compliant analyses being reported in a central archive. Properly done, the process would increase the ability of those without specialized training to contribute to the body of economic research pertaining to prevention, and the most difficult tasks of predicting and monetizing distal outcomes would be readily completed through predetermined models. These recommendations might be viewed as somewhat forcible, insomuch as they advocate for prescribing the details of a standard methodology and establishing a means of verifying compliance. However, it is unclear that the best practices proposed by Crowley et al. will be widely adopted in the absence of a strong and determined approach.

  11. 29 CFR 1620.16 - Jobs requiring equal effort in performance.

    Science.gov (United States)

    2010-07-01

29 CFR § 1620.16 (Equal Pay Act), Jobs requiring equal effort in performance. (a) In general. The jobs to which... Suppose, however, that men and women are working side by side on a line assembling parts. Suppose further that one...

  12. Towards an Early Software Effort Estimation Based on Functional and Non-Functional Requirements

    NARCIS (Netherlands)

    Kassab, M.; Daneva, Maia; Ormanjieva, Olga; Abran, A.; Braungarten, R.; Dumke, R.; Cuadrado-Gallego, J.; Brunekreef, J.

    2009-01-01

    The increased awareness of the non-functional requirements as a key to software project and product success makes explicit the need to include them in any software project effort estimation activity. However, the existing approaches to defining size-based effort relationships still pay insufficient

  13. City Logistics Modeling Efforts : Trends and Gaps - A Review

    NARCIS (Netherlands)

    Anand, N.R.; Quak, H.J.; Van Duin, J.H.R.; Tavasszy, L.A.

    2012-01-01

    In this paper, we present a review of city logistics modeling efforts reported in the literature for urban freight analysis. The review framework takes into account the diversity and complexity found in the present-day city logistics practice. Next, it covers the different aspects in the modeling

  14. V and V Efforts of Auroral Precipitation Models: Preliminary Results

    Science.gov (United States)

    Zheng, Yihua; Kuznetsova, Masha; Rastaetter, Lutz; Hesse, Michael

    2011-01-01

Auroral precipitation models have been valuable both in terms of space weather applications and space science research. Yet very limited testing has been performed regarding model performance. A variety of auroral models are available, including empirical models that are parameterized by geomagnetic indices or upstream solar wind conditions, nowcasting models that are based on satellite observations, or those derived from physics-based, coupled global models. In this presentation, we will show our preliminary results regarding V&V efforts of some of the models.

  15. Efforts - Final technical report on task 4. Physical modelling validation

    DEFF Research Database (Denmark)

    Andreasen, Jan Lasson; Olsson, David Dam; Christensen, T. W.

The present report documents the work carried out at DTU in Task 4, Physical modelling-validation, of the Brite/Euram project No. BE96-3340, contract No. BRPR-CT97-0398, with the title Enhanced Framework for forging design using reliable three-dimensional simulation (EFFORTS). The report...

  16. Incorporating Responsiveness to Marketing Efforts in Brand Choice Modeling

    Directory of Open Access Journals (Sweden)

    Dennis Fok

    2014-02-01

    Full Text Available We put forward a brand choice model with unobserved heterogeneity that concerns responsiveness to marketing efforts. We introduce two latent segments of households. The first segment is assumed to respond to marketing efforts, while households in the second segment do not do so. Whether a specific household is a member of the first or the second segment at a specific purchase occasion is described by household-specific characteristics and characteristics concerning buying behavior. Households may switch between the two responsiveness states over time. When comparing the performance of our model with alternative choice models that account for various forms of heterogeneity for three different datasets, we find better face validity for our parameters. Our model also forecasts better.
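A minimal sketch of the two-segment idea described above: each household's choice probabilities are a mixture of a multinomial logit that includes marketing effects (the responsive segment) and one that ignores them (the non-responsive segment). All utilities, effect sizes, and the segment probability below are hypothetical, not the authors' estimated model.

```python
import math

def mnl_probs(utilities):
    """Multinomial logit choice probabilities (softmax over utilities)."""
    ex = [math.exp(u) for u in utilities]
    total = sum(ex)
    return [e / total for e in ex]

def mixture_choice_probs(base_utils, marketing_effects, p_responsive):
    """Mixture over two latent segments: responsive households see the
    marketing effects added to brand utilities; non-responsive ones do not."""
    responsive = mnl_probs([b + m for b, m in zip(base_utils, marketing_effects)])
    nonresponsive = mnl_probs(base_utils)
    return [p_responsive * r + (1 - p_responsive) * n
            for r, n in zip(responsive, nonresponsive)]

# Three brands; brand 0 is on promotion (positive marketing effect).
probs = mixture_choice_probs(
    base_utils=[0.2, 0.0, -0.1],
    marketing_effects=[0.8, 0.0, 0.0],
    p_responsive=0.6,
)
print([round(p, 3) for p in probs])  # [0.509, 0.258, 0.233]
```

In the paper's richer setup, `p_responsive` would itself be modeled as a function of household characteristics and allowed to switch over purchase occasions; the fixed scalar here is only to keep the mixture structure visible.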

  17. Routine inspection effort required for verification of a nuclear material production cutoff convention

    International Nuclear Information System (INIS)

    Fishbone, L.G.; Sanborn, J.

    1994-12-01

Preliminary estimates of the inspection effort to verify a Nuclear Material Cutoff Convention are presented. The estimates are based on (1) a database of about 650 facilities in a total of eight states, i.e., the five nuclear-weapons states and three "threshold" states; (2) typical figures for inspection requirements for specific facility types derived from IAEA experience, where applicable; and (3) alternative estimates of inspection effort in cutoff options where full IAEA safeguards are not stipulated. Considerable uncertainty must be attached to the effort estimates. About 50-60% of the effort for each option is attributable to 16 large-scale reprocessing plants assumed to be in operation in the eight states; it is likely that some of these will be shut down by the time the convention enters into force. Another important question, involving about one third of the overall effort, is whether Euratom inspections in France and the U.K. could obviate the need for full-scale IAEA inspections at these facilities. Finally, the database does not yet contain many small-scale and military-related facilities. The results are therefore not presented as predictions but as the consequences of alternative assumptions. Despite the preliminary nature of the estimates, it is clear that a broad application of NPT-like safeguards to the eight states would require dramatic increases in the IAEA's safeguards budget. It is also clear that the major component of the increased inspection effort would occur at large reprocessing plants (and associated plutonium facilities). Therefore, significantly bounding the increased effort requires limiting the inspection effort at these facility types.
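The kind of bottom-up estimate the abstract describes can be illustrated with a toy aggregation over a facility database. Every facility type, count, and per-facility inspection figure below is hypothetical, chosen only so that reprocessing plants dominate the total in roughly the way the study reports; none of it is the study's actual data.

```python
# Hypothetical facility database: (facility_type, number_of_facilities,
# annual inspection effort per facility, in person-days).
facilities = [
    ("reprocessing_plant", 16, 500),
    ("enrichment_plant",   10, 150),
    ("power_reactor",     400,  10),
    ("research_lab",      200,   5),
]

def total_effort(db):
    """Total annual inspection effort in person-days."""
    return sum(count * per_facility for _, count, per_facility in db)

def effort_share(db, facility_type):
    """Fraction of total effort attributable to one facility type."""
    subset = sum(c * p for t, c, p in db if t == facility_type)
    return subset / total_effort(db)

print(total_effort(facilities))                                 # 14500
print(round(effort_share(facilities, "reprocessing_plant"), 2)) # 0.55
```

The share computation makes the study's bounding argument concrete: because one facility type carries over half the total, capping per-facility effort there changes the overall budget far more than any change to the numerous small facilities.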

  18. An opportunity cost model of subjective effort and task performance

    Science.gov (United States)

    Kurzban, Robert; Duckworth, Angela; Kable, Joseph W.; Myers, Justus

    2013-01-01

    Why does performing certain tasks cause the aversive experience of mental effort and concomitant deterioration in task performance? One explanation posits a physical resource that is depleted over time. We propose an alternate explanation that centers on mental representations of the costs and benefits associated with task performance. Specifically, certain computational mechanisms, especially those associated with executive function, can be deployed for only a limited number of simultaneous tasks at any given moment. Consequently, the deployment of these computational mechanisms carries an opportunity cost – that is, the next-best use to which these systems might be put. We argue that the phenomenology of effort can be understood as the felt output of these cost/benefit computations. In turn, the subjective experience of effort motivates reduced deployment of these computational mechanisms in the service of the present task. These opportunity cost representations, then, together with other cost/benefit calculations, determine effort expended and, everything else equal, result in performance reductions. In making our case for this position, we review alternate explanations both for the phenomenology of effort associated with these tasks and for performance reductions over time. Likewise, we review the broad range of relevant empirical results from across subdisciplines, especially psychology and neuroscience. We hope that our proposal will help to build links among the diverse fields that have been addressing similar questions from different perspectives, and we emphasize ways in which alternate models might be empirically distinguished. PMID:24304775

  19. Modeling to Mars: a NASA Model Based Systems Engineering Pathfinder Effort

    Science.gov (United States)

    Phojanamongkolkij, Nipa; Lee, Kristopher A.; Miller, Scott T.; Vorndran, Kenneth A.; Vaden, Karl R.; Ross, Eric P.; Powell, Bobby C.; Moses, Robert W.

    2017-01-01

    The NASA Engineering Safety Center (NESC) Systems Engineering (SE) Technical Discipline Team (TDT) initiated the Model Based Systems Engineering (MBSE) Pathfinder effort in FY16. The goals and objectives of the MBSE Pathfinder include developing and advancing MBSE capability across NASA, applying MBSE to real NASA issues, and capturing issues and opportunities surrounding MBSE. The Pathfinder effort consisted of four teams, with each team addressing a particular focus area. This paper focuses on Pathfinder team 1 with the focus area of architectures and mission campaigns. These efforts covered the timeframe of February 2016 through September 2016. The team was comprised of eight team members from seven NASA Centers (Glenn Research Center, Langley Research Center, Ames Research Center, Goddard Space Flight Center IV&V Facility, Johnson Space Center, Marshall Space Flight Center, and Stennis Space Center). Collectively, the team had varying levels of knowledge, skills and expertise in systems engineering and MBSE. The team applied their existing and newly acquired system modeling knowledge and expertise to develop modeling products for a campaign (Program) of crew and cargo missions (Projects) to establish a human presence on Mars utilizing In-Situ Resource Utilization (ISRU). Pathfinder team 1 developed a subset of modeling products that are required for a Program System Requirement Review (SRR)/System Design Review (SDR) and Project Mission Concept Review (MCR)/SRR as defined in NASA Procedural Requirements. Additionally, Team 1 was able to perform and demonstrate some trades and constraint analyses. At the end of these efforts, over twenty lessons learned and recommended next steps have been identified.

  20. Requirements for guidelines systems: implementation challenges and lessons from existing software-engineering efforts.

    Science.gov (United States)

    Shah, Hemant; Allard, Raymond D; Enberg, Robert; Krishnan, Ganesh; Williams, Patricia; Nadkarni, Prakash M

    2012-03-09

A large body of work in the clinical guidelines field has identified requirements for guideline systems, but there are formidable challenges in translating such requirements into production-quality systems that can be used in routine patient care. Detailed analysis of requirements from an implementation perspective can be useful in helping define sub-requirements to the point where they are implementable. Further, additional requirements emerge as a result of such analysis. During such an analysis, study of examples of existing software-engineering efforts in non-biomedical fields can provide useful signposts to the implementer of a clinical guideline system. In addition to requirements described by guideline-system authors, comparative reviews of such systems, and publications discussing information needs for guideline systems and clinical decision support systems in general, we have incorporated additional requirements related to production-system robustness and functionality from publications in the business workflow domain, while also drawing on our own experience in the development of the Proteus guideline system (http://proteme.org). The sub-requirements are discussed by conveniently grouping them into the categories used by the review of Isern and Moreno 2008. We cite previous work under each category, provide sub-requirements under each category, and provide examples of similar work in software-engineering efforts that have addressed a similar problem in a non-biomedical context. When analyzing requirements from the implementation viewpoint, knowledge of successes and failures in related software-engineering efforts can guide implementers in the choice of effective design and development strategies.

  1. Maintenance personnel performance simulation (MAPPS) model: overview and evaluation efforts

    International Nuclear Information System (INIS)

    Knee, H.E.; Haas, P.M.; Siegel, A.I.; Bartter, W.D.; Wolf, J.J.; Ryan, T.G.

    1984-01-01

The development of the MAPPS model has been completed and the model is currently undergoing evaluation. These efforts are addressing a number of identified issues concerning practicality, acceptability, usefulness, and validity. Preliminary analysis of the evaluation data collected so far indicates that MAPPS will provide comprehensive and reliable data for PRA purposes and for a number of other applications. The MAPPS computer simulation model provides the user with a sophisticated tool for gaining insights into tasks performed by NPP maintenance personnel. Its wide variety of input parameters and output data makes it extremely flexible for application to a number of diverse problems. With the demonstration of favorable model evaluation results, the MAPPS model will represent a valuable source of NPP maintainer reliability data and provide PRA studies with a source of data on maintainers that has not previously existed.

  2. Synergies Between Grace and Regional Atmospheric Modeling Efforts

    Science.gov (United States)

    Kusche, J.; Springer, A.; Ohlwein, C.; Hartung, K.; Longuevergne, L.; Kollet, S. J.; Keune, J.; Dobslaw, H.; Forootan, E.; Eicker, A.

    2014-12-01

In the meteorological community, efforts converge towards the implementation of high-resolution... precipitation, evapotranspiration and runoff data, confirming that the model performs favorably at representing observations. We show that after GRACE-derived bias correction, basin-average hydrological conditions prior to 2002 can be reconstructed better than before. Next, comparing GRACE with CLM forced by EURO-CORDEX simulations allows identifying processes needing improvement in the model. Finally, we compare COSMO-EU atmospheric pressure, a proxy for mass corrections in satellite gravimetry, with ERA-Interim over Europe at timescales shorter/longer than 1 month, and spatial scales below/above the ERA resolution. We find differences between the regional and global models more pronounced at high frequencies, with magnitudes at sub-grid and larger scales corresponding to 1-3 hPa (1-3 cm EWH); relevant for the assessment of post-GRACE concepts.

  3. Linking effort and fishing mortality in a mixed fisheries model

    DEFF Research Database (Denmark)

    Thøgersen, Thomas Talund; Hoff, Ayoe; Frost, Hans Staby

    2012-01-01

...in fish stocks has led to overcapacity in many fisheries, leading to incentives for overfishing. Recent research has shown that the allocation of effort among fleets can play an important role in mitigating overfishing when the targeting covers a range of species (multi-species, i.e., so-called mixed... fisheries), while simultaneously optimising the overall economic performance of the fleets. The so-called FcubEcon model, in particular, has elucidated both the biologically and economically optimal method for allocating catches, and thus effort, between fishing fleets, while ensuring that the quotas...

  4. Perception that "everything requires a lot of effort": transcultural SCL-25 item validation.

    Science.gov (United States)

    Moreau, Nicolas; Hassan, Ghayda; Rousseau, Cécile; Chenguiti, Khalid

    2009-09-01

    This brief report illustrates how the migration context can affect specific item validity of mental health measures. The SCL-25 was administered to 432 recently settled immigrants (220 Haitian and 212 Arabs). We performed descriptive analyses, as well as Infit and Outfit statistics analyses using WINSTEPS Rasch Measurement Software based on Item Response Theory. The participants' comments about the item You feel everything requires a lot of effort in the SCL-25 were also qualitatively analyzed. Results revealed that the item You feel everything requires a lot of effort is an outlier and does not adjust in an expected and valid fashion with its cluster items, as it is over-endorsed by Haitian and Arab healthy participants. Our study thus shows that, in transcultural mental health research, the cultural and migratory contexts may interact and significantly influence the meaning of some symptom items and consequently, the validity of symptom scales.
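The Infit and Outfit statistics the authors computed (via WINSTEPS) have a simple closed form for dichotomous Rasch items, which is enough to see how an over-endorsed item shows up as misfit. The sketch below uses the standard mean-square formulas with made-up responses and model probabilities, not the study's data.

```python
def rasch_fit(observed, expected):
    """Infit and outfit mean-square statistics for dichotomous responses.
    expected[i] is the Rasch model probability of endorsing the item;
    for a dichotomous response the residual variance is p * (1 - p)."""
    variances = [p * (1 - p) for p in expected]
    sq_resid = [(o - e) ** 2 for o, e in zip(observed, expected)]
    # Outfit: unweighted mean of squared standardized residuals,
    # sensitive to unexpected responses far from a person's level.
    outfit = sum(r / v for r, v in zip(sq_resid, variances)) / len(observed)
    # Infit: information-weighted; total squared residual over total variance.
    infit = sum(sq_resid) / sum(variances)
    return infit, outfit

# Made-up responses and model-expected endorsement probabilities.
infit, outfit = rasch_fit(observed=[1, 0, 1], expected=[0.7, 0.3, 0.9])
# Values near 1.0 indicate good fit; values well above 1.0 flag misfit,
# e.g. an item over-endorsed by respondents the model expects to decline it.
```

Note the asymmetry the report relies on: an item endorsed by otherwise healthy respondents produces large standardized residuals, which inflate Outfit in particular.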

  5. Army Training: Efforts to Adjust Training Requirements Should Consider the Use of Virtual Training Devices

    Science.gov (United States)

    2016-08-01

...Needed to More Fully Assess Simulation-Based Efforts, GAO-13-698 (Washington, D.C.: Aug. 22, 2013). S. Rep. No. 114-49 and H. Rep. No. 114-102. Both... adapt to conditions, tactics, and even methods of conflict that may be impossible to accurately predict. Chemical, biological, radiological... all tasks required of drivers, such as dismounting during an operation to conduct maintenance on a vehicle...

  6. Using a cloud to replenish parched groundwater modeling efforts.

    Science.gov (United States)

    Hunt, Randall J; Luchette, Joseph; Schreuder, Willem A; Rumbaugh, James O; Doherty, John; Tonkin, Matthew J; Rumbaugh, Douglas B

    2010-01-01

    Groundwater models can be improved by introduction of additional parameter flexibility and simultaneous use of soft-knowledge. However, these sophisticated approaches have high computational requirements. Cloud computing provides unprecedented access to computing power via the Internet to facilitate the use of these techniques. A modeler can create, launch, and terminate "virtual" computers as needed, paying by the hour, and save machine images for future use. Such cost-effective and flexible computing power empowers groundwater modelers to routinely perform model calibration and uncertainty analysis in ways not previously possible.

  7. Using a cloud to replenish parched groundwater modeling efforts

    Science.gov (United States)

    Hunt, Randall J.; Luchette, Joseph; Schreuder, Willem A.; Rumbaugh, James O.; Doherty, John; Tonkin, Matthew J.; Rumbaugh, Douglas B.

    2010-01-01

    Groundwater models can be improved by introduction of additional parameter flexibility and simultaneous use of soft-knowledge. However, these sophisticated approaches have high computational requirements. Cloud computing provides unprecedented access to computing power via the Internet to facilitate the use of these techniques. A modeler can create, launch, and terminate “virtual” computers as needed, paying by the hour, and save machine images for future use. Such cost-effective and flexible computing power empowers groundwater modelers to routinely perform model calibration and uncertainty analysis in ways not previously possible.

  8. Characterization of infiltration rates from landfills: supporting groundwater modeling efforts.

    Science.gov (United States)

    Moo-Young, Horace; Johnson, Barnes; Johnson, Ann; Carson, David; Lew, Christine; Liu, Salley; Hancocks, Katherine

    2004-01-01

The purpose of this paper is to review the literature to characterize infiltration rates from landfill liners to support groundwater modeling efforts. The focus of this investigation was on collecting studies that describe the performance of liners 'as installed' or 'as operated'. This document reviews the state of the science and practice on the infiltration rate through compacted clay liners (CCL) for 149 sites and geosynthetic clay liners (GCL) for 1 site. In addition, it reviews the leakage rate through geomembrane (GM) liners and composite liners for 259 sites. For compacted clay liners, there was limited information on infiltration rates (only 9 sites reported them), so it was difficult to develop a national distribution. The field hydraulic conductivities for natural clay liners range from 1 x 10^-9 cm/s to 1 x 10^-4 cm/s, with an average of 6.5 x 10^-8 cm/s. There was limited information on geosynthetic clay liners. For composite-lined and geomembrane systems, the leak detection system flow rates were utilized. The average monthly flow rate for composite liners ranged from 0 to 32 lphd (liters per hectare per day) for geomembrane-and-GCL systems and from 0 to 1410 lphd for geomembrane-and-CCL systems. The increased infiltration for the geomembrane-and-CCL system may be attributed to consolidation water from the clay.
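Because the field hydraulic conductivities cited above span five orders of magnitude, the choice of summary statistic matters: an arithmetic mean is dominated by the leakiest liners, while a geometric mean is usually the more representative central value for log-spread data like conductivity. The values below are hypothetical, chosen only to span a similar 10^-9 to 10^-4 cm/s range; they are not the review's dataset.

```python
import math

# Hypothetical CCL field hydraulic conductivities (cm/s), spanning the
# 1e-9 to 1e-4 range reported in the review.
ks = [1e-9, 5e-9, 1e-8, 6.5e-8, 1e-7, 1e-6, 1e-5, 1e-4]

arith_mean = sum(ks) / len(ks)
geo_mean = math.exp(sum(math.log(k) for k in ks) / len(ks))

print(f"{arith_mean:.2e}")  # pulled up by the two leakiest liners
print(f"{geo_mean:.2e}")    # closer to the typical order of magnitude
```

On data like this the two means differ by roughly two orders of magnitude, which is worth keeping in mind when a single "average" conductivity is fed into a groundwater model.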

  9. Requirements model generation to support requirements elicitation: The Secure Tropos experience

    NARCIS (Netherlands)

    Kiyavitskaya, N.; Zannone, N.

    2008-01-01

    In recent years several efforts have been devoted by researchers in the Requirements Engineering community to the development of methodologies for supporting designers during requirements elicitation, modeling, and analysis. However, these methodologies often lack tool support to facilitate their

  10. Lessons learned from HRA and human-system modeling efforts

    International Nuclear Information System (INIS)

    Hallbert, B.P.

    1993-01-01

    Human-system modeling is not unique to the field of Human Reliability Analysis (HRA). Since human factors professionals first began their explorations of human activities, they have done so with the concept of "system" in mind. Though the two - human and system - are distinct, they can be properly understood only in terms of each other: the system provides a context in which goals and objectives for work are defined, and the human plays either a pre-defined or ad hoc role in meeting these goals. In this sense, every intervention which attempts to evaluate or improve upon some system parameter requires that an understanding of human-system interactions be developed. It is too often the case, however, that somewhere between the inception of a system and its implementation, the human-system relationships are overlooked, misunderstood, or inadequately framed. This results in mismatches between the demands and capabilities of human operators, systems which are difficult to operate, and the obvious end product: human error. The lessons learned from human-system modeling provide a valuable feedback mechanism to the process of HRA and the technologies which employ this form of modeling.

  11. Report Summarizing the Effort Required to Initiate Welding of Irradiated Materials within the Welding Cubicle

    Energy Technology Data Exchange (ETDEWEB)

    Frederick, Greg [Electric Power Research Institute (EPRI), Palo Alto, CA (United States); Sutton, Benjamin J. [Electric Power Research Institute (EPRI), Palo Alto, CA (United States); Tatman, Jonathan K. [Electric Power Research Institute (EPRI), Palo Alto, CA (United States); Vance, Mark Christopher [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Smith, Allen W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Clark, Scarlett R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Feng, Zhili [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Miller, Roger G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Chen, Jian [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Tang, Wei [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hu, Xunxiang [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Gibson, Brian T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-06-01

    The advanced welding facility within a hot cell at the Radiochemical Engineering Development Center of Oak Ridge National Laboratory (ORNL), which has been jointly funded by the U.S. Department of Energy (DOE), Office of Nuclear Energy, Light Water Reactor Sustainability Program and the Electric Power Research Institute, Long Term Operations Program and the Welding and Repair Technology Center, is in the final phase of development. Research and development activities in this facility will involve direct testing of advanced welding technologies on irradiated materials in order to address the primary technical challenge of helium induced cracking that can arise when conventional fusion welding techniques are utilized on neutron irradiated stainless steels and nickel-base alloys. This report details the effort that has been required since the beginning of fiscal year 2017 to initiate welding research and development activities on irradiated materials within the hot cell cubicle, which houses welding sub-systems that include laser beam welding (LBW) and friction stir welding (FSW) and provides material containment within the hot cell.

  12. Hybrid discrete choice models: Gained insights versus increasing effort

    International Nuclear Information System (INIS)

    Mariel, Petr; Meyerhoff, Jürgen

    2016-01-01

    Hybrid choice models expand the standard models in discrete choice modelling by incorporating psychological factors as latent variables. They could therefore provide further insights into choice processes and underlying taste heterogeneity, but the costs of estimating these models often increase significantly. This paper aims at comparing the results from a hybrid choice model and a classical random parameter logit. The point of departure for this analysis is whether researchers and practitioners should add hybrid choice models to their suite of routinely estimated models. Our comparison reveals, in line with the few prior studies, that hybrid models gain in efficiency through the inclusion of additional information. Which of the two approaches to use, however, depends on the objective of the analysis. If disentangling preference heterogeneity is most important, the hybrid model seems preferable. If the focus is on predictive power, a standard random parameter logit model might be the better choice. Finally, we give recommendations for an adequate use of hybrid choice models based on known principles of elementary scientific inference. - Highlights: • The paper compares performance of a Hybrid Choice Model (HCM) and a classical Random Parameter Logit (RPL) model. • The HCM indeed provides insights regarding preference heterogeneity not gained from the RPL. • The RPL has similar predictive power as the HCM in our data. • The costs of estimating HCM seem to be justified when learning more on taste heterogeneity is a major study objective.

  13. Hybrid discrete choice models: Gained insights versus increasing effort

    Energy Technology Data Exchange (ETDEWEB)

    Mariel, Petr, E-mail: petr.mariel@ehu.es [UPV/EHU, Economía Aplicada III, Avda. Lehendakari Aguire, 83, 48015 Bilbao (Spain); Meyerhoff, Jürgen [Institute for Landscape Architecture and Environmental Planning, Technical University of Berlin, D-10623 Berlin, Germany and The Kiel Institute for the World Economy, Duesternbrooker Weg 120, 24105 Kiel (Germany)

    2016-10-15

    Hybrid choice models expand the standard models in discrete choice modelling by incorporating psychological factors as latent variables. They could therefore provide further insights into choice processes and underlying taste heterogeneity, but the costs of estimating these models often increase significantly. This paper aims at comparing the results from a hybrid choice model and a classical random parameter logit. The point of departure for this analysis is whether researchers and practitioners should add hybrid choice models to their suite of routinely estimated models. Our comparison reveals, in line with the few prior studies, that hybrid models gain in efficiency through the inclusion of additional information. Which of the two approaches to use, however, depends on the objective of the analysis. If disentangling preference heterogeneity is most important, the hybrid model seems preferable. If the focus is on predictive power, a standard random parameter logit model might be the better choice. Finally, we give recommendations for an adequate use of hybrid choice models based on known principles of elementary scientific inference. - Highlights: • The paper compares performance of a Hybrid Choice Model (HCM) and a classical Random Parameter Logit (RPL) model. • The HCM indeed provides insights regarding preference heterogeneity not gained from the RPL. • The RPL has similar predictive power as the HCM in our data. • The costs of estimating HCM seem to be justified when learning more on taste heterogeneity is a major study objective.
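The random parameter logit against which the hybrid model is compared has no closed-form choice probability; it is typically computed by averaging logit probabilities over draws of the random coefficient. A minimal sketch on synthetic inputs, assuming a normal taste distribution (not the authors' specification or data):

```python
import numpy as np

# Simulated choice probability for a random parameter logit (mixed logit):
# the taste coefficient beta varies across individuals, so the logit
# probability is averaged over R draws from its assumed distribution.
rng = np.random.default_rng(0)

def rpl_choice_prob(x_alts, beta_mean, beta_sd, n_draws=5000):
    betas = rng.normal(beta_mean, beta_sd, size=n_draws)     # taste draws
    utils = np.outer(betas, x_alts)                          # (R, J) utilities
    expu = np.exp(utils - utils.max(axis=1, keepdims=True))  # stable softmax
    probs = expu / expu.sum(axis=1, keepdims=True)
    return probs.mean(axis=0)                                # average over draws

# Three alternatives with attribute levels 1, 2, 3 and beta ~ N(0.5, 0.8)
p = rpl_choice_prob(np.array([1.0, 2.0, 3.0]), 0.5, 0.8)
```

The hybrid choice model adds a structural layer relating latent psychological factors to the betas, which is where the extra estimation cost discussed in the abstract arises.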

  14. Requirements Modeling with Agent Programming

    Science.gov (United States)

    Dasgupta, Aniruddha; Krishna, Aneesh; Ghose, Aditya K.

    Agent-oriented conceptual modeling notations are highly effective in representing requirements from an intentional stance and answering questions such as what goals exist, how key actors depend on each other, and what alternatives must be considered. In this chapter, we review an approach to executing i* models by translating them into a set of interacting agents implemented in the CASO language, and we suggest how reasoning can be performed with the requirements modeled (both functional and non-functional) using i* models. We also incorporate deliberation into the agent design. This allows us to benefit from the complementary representational capabilities of the two frameworks.

  15. An experience report on ERP effort estimation driven by quality requirements

    NARCIS (Netherlands)

    Erasmus, Pierre; Daneva, Maya; Schockert, Sixten

    2015-01-01

    Producing useful and accurate project effort estimates is highly dependent on the proper definition of the project scope. In the ERP service industry, the scope of an ERP service project is determined by desired needs which are driven by certain quality attributes that the client expects to be

  16. Efforts and Models of Education for Parents: the Danish Approach

    Directory of Open Access Journals (Sweden)

    Rosendal Jensen, Niels

    2009-12-01

    to underline that Danish welfare policy has been changing rather radically. The classic model understood welfare as social assurance and/or social distribution, based on social solidarity. The modern model treats welfare as social service and/or social investment. This means that citizens' roles are changing: from user and/or citizen to consumer and/or investor. The Danish state, in line with decisions taken by the government, is investing in a national future shaped by global competition. The new models of welfare, "service" and "investment", imply severe changes in hitherto known concepts of family life, the relationship between parents and children, etc. As an example, the investment model points to a new implementation of the relationship between social rights and the rights of freedom. The service model has demonstrated the weakness that access to qualified services in the fields of health or education is becoming more and more dependent on private purchasing power. The weakness of the investment model is that it represents a sort of "the winner takes it all", since a political majority is enabled to set agendas in societal fields formerly protected by the tripartite power and the citizens' rights of freedom. The outcome of the Danish development seems to be the establishment of a politically governed public service industry which, on the one hand, is capable of competing on market conditions and, on the other, can be governed by contracts. This represents a new form of close linking of politics, economy and professional work. Attempts at controlling education, pedagogy and thereby the population are not a recent invention. In European history we could easily point to several such experiments. The real news is the linking of political priorities and the exercise of public activities through economic incentives. By defining visible goals for public servants, by introducing measurement of achievements and

  17. Nuclear Hybrid Energy Systems FY16 Modeling Efforts at ORNL

    Energy Technology Data Exchange (ETDEWEB)

    Cetiner, Sacit M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Greenwood, Michael Scott [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Harrison, Thomas J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Qualls, A. L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Guler Yigitoglu, Askin [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Fugate, David W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-01

    A nuclear hybrid system uses a nuclear reactor as the basic power generation unit. The power generated by the nuclear reactor is utilized by one or more power customers as either thermal power, electrical power, or both. In general, a nuclear hybrid system will couple the nuclear reactor to at least one thermal power user in addition to the power conversion system. The definition and architecture of a particular nuclear hybrid system is flexible depending on local market needs and opportunities. For example, locations in need of potable water may be best served by coupling a desalination plant to the nuclear system. Similarly, an area near oil refineries may have a need for emission-free hydrogen production. A nuclear hybrid system expands the nuclear power plant from its more familiar central power station role by diversifying its immediately and directly connected customer base. The definition, design, analysis, and optimization work currently performed with respect to nuclear hybrid systems represents the work of three national laboratories. Idaho National Laboratory (INL) is the lead lab, working with Argonne National Laboratory (ANL) and Oak Ridge National Laboratory. Each laboratory is providing modeling and simulation expertise for the integration of the hybrid system.

  18. APPLYING TEACHING-LEARNING TO ARTIFICIAL BEE COLONY FOR PARAMETER OPTIMIZATION OF SOFTWARE EFFORT ESTIMATION MODEL

    Directory of Open Access Journals (Sweden)

    THANH TUNG KHUAT

    2017-05-01

    Full Text Available Artificial Bee Colony, inspired by the foraging behaviour of honey bees, is a novel meta-heuristic optimization algorithm in the community of swarm intelligence algorithms. Nevertheless, it is still insufficient in speed of convergence and quality of solutions. This paper proposes an approach to tackle these downsides by combining the positive aspects of Teaching-Learning-based optimization and Artificial Bee Colony. The performance of the proposed method is assessed on the software effort estimation problem, a complex and important issue in project management. Software developers often carry out effort estimation in the early stages of the software development life cycle to derive the required cost and schedule for a project. There are a large number of methods for effort estimation, among which COCOMO II is one of the most widely used models. However, this model has some restrictions because its parameters have not been optimized. In this work, therefore, we present an approach to overcome this limitation of the COCOMO II model. The experiments were conducted on the NASA software project dataset, and the obtained results indicated that the improved parameters provided better estimation capabilities than the original COCOMO II model.
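The COCOMO II post-architecture effort equation that such parameter-optimization methods tune can be sketched as follows. The constants A = 2.94 and B = 0.91 are the published nominal calibrations; the all-nominal scale-factor and multiplier values below are illustrative inputs, not values from the paper.

```python
import math

# COCOMO II post-architecture effort equation:
#   PM = A * Size^E * product(EM_i),  with  E = B + 0.01 * sum(SF_j)
# A and B are the nominal COCOMO II.2000 calibrations; these are the kinds
# of constants a meta-heuristic (e.g. the ABC variant in the paper) would
# re-fit against local project data.

def cocomo2_effort(ksloc, scale_factors, effort_multipliers, A=2.94, B=0.91):
    E = B + 0.01 * sum(scale_factors)
    em_product = math.prod(effort_multipliers)
    return A * ksloc**E * em_product        # effort in person-months

# 100 KSLOC project with all-nominal scale factors and 17 nominal multipliers
nominal_sf = [3.72, 3.04, 4.24, 3.29, 4.68]   # PREC, FLEX, RESL, TEAM, PMAT
pm = cocomo2_effort(100, nominal_sf, [1.0] * 17)
```

Because the exponent E exceeds 1 at nominal ratings, effort grows faster than linearly with size, which is why miscalibrated parameters distort large-project estimates most.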

  19. Competing probabilistic models for catch-effort relationships in wildlife censuses

    Energy Technology Data Exchange (ETDEWEB)

    Skalski, J.R.; Robson, D.S.; Matsuzaki, C.L.

    1983-01-01

    Two probabilistic models are presented for describing the chance that an animal is captured during a wildlife census, as a function of trapping effort. The models in turn are used to propose relationships between sampling intensity and catch-per-unit-effort (C.P.U.E.) that were field tested on small mammal populations. Capture data suggest a model of diminishing C.P.U.E. with increasing levels of trapping intensity. The catch-effort model is used to illustrate optimization procedures in the design of mark-recapture experiments for censusing wild populations. 14 references, 2 tables.
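One common functional form consistent with the diminishing-C.P.U.E. behaviour described above is an exponential capture probability in effort; this is a generic sketch with assumed parameter values, not a reproduction of the paper's two specific models.

```python
import math

# Capture probability p(f) = 1 - exp(-k*f) at trapping effort f: expected
# catch rises with effort, but catch-per-unit-effort (CPUE) falls, matching
# the "diminishing C.P.U.E." pattern in the abstract.

def expected_catch(population, k, effort):
    return population * (1.0 - math.exp(-k * effort))

def cpue(population, k, effort):
    return expected_catch(population, k, effort) / effort

# Assumed: 200 animals, capture coefficient k = 0.01 per trap-night
low_effort_cpue = cpue(200, 0.01, 50)
high_effort_cpue = cpue(200, 0.01, 500)
```

Plotting CPUE against effort under this model yields the declining curve that makes intermediate sampling intensities optimal in mark-recapture design.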

  20. AN ENHANCED MODEL TO ESTIMATE EFFORT, PERFORMANCE AND COST OF THE SOFTWARE PROJECTS

    Directory of Open Access Journals (Sweden)

    M. Pauline

    2013-04-01

    Full Text Available The authors have proposed a model that first captures the fundamentals of software metrics in phase 1, consisting of three primitive primary software engineering metrics: person-months (PM), function points (FP), and lines of code (LOC). Phase 2 consists of the proposed function point, which is obtained by grouping the adjustment factors to simplify the process of adjustment and to ensure more consistency in the adjustments. In the proposed method, fuzzy logic is used for quantifying the quality of requirements, which is added as one of the adjustment factors; thus a fuzzy-based approach for the Enhanced General System Characteristics to estimate the effort of software projects using productivity has been obtained. Phase 3 takes the calculated function point and gives it as input to the static single-variable models (i.e., to Intermediate COCOMO and COCOMO II) for cost estimation. The authors have tailored the cost factors in Intermediate COCOMO, and both cost and scale factors in COCOMO II, to suit the individual development environment, which is very important for the accuracy of the cost estimates. The software performance indicators (project duration, schedule predictability, requirements completion ratio and post-release defect density) are also measured for the software projects in this work. A comparative study of effort, performance measurement and cost estimation of software projects is done between the existing model and the authors' proposed work. Thus, this work analyzes the interactional process through which the estimation tasks were collectively accomplished.
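The standard IFPUG function point adjustment that the proposed model extends is worth making explicit; the fuzzy requirements-quality factor the paper adds is not reproduced here.

```python
# Standard function point adjustment:
#   FP = UFP * (0.65 + 0.01 * TDI)
# where TDI is the total degree of influence of the 14 general system
# characteristics (GSCs), each rated 0-5. The paper's extension adds a
# fuzzy-rated requirements-quality factor to this set of adjustments.

def adjusted_function_points(ufp, gsc_ratings):
    assert len(gsc_ratings) == 14 and all(0 <= g <= 5 for g in gsc_ratings)
    tdi = sum(gsc_ratings)
    vaf = 0.65 + 0.01 * tdi   # value adjustment factor, ranges 0.65 to 1.35
    return ufp * vaf

fp = adjusted_function_points(300, [3] * 14)   # all-average ratings
```

With all-average ratings the adjustment factor is 1.07, so 300 unadjusted function points become 321 adjusted ones; the adjustment can move the count by ±35% at the extremes.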

  1. Systematic Identification of Stakeholders for Engagement with Systems Modeling Efforts in the Snohomish Basin, Washington, USA

    Science.gov (United States)

    Even as stakeholder engagement in systems dynamic modeling efforts is increasingly promoted, the mechanisms for identifying which stakeholders should be included are rarely documented. Accordingly, for an Environmental Protection Agency’s Triple Value Simulation (3VS) mode...

  2. Does the incremental shuttle walk test require maximal effort in young obese women?

    Directory of Open Access Journals (Sweden)

    S.P. Jürgensen

    2016-01-01

    Full Text Available Obesity is a chronic disease with a multifaceted treatment approach that includes nutritional counseling, structured exercise training, and increased daily physical activity. Increased body mass elicits higher cardiovascular, ventilatory and metabolic demands to varying degrees during exercise. With functional capacity assessment, this variability can be evaluated so individualized guidance for exercise training and daily physical activity can be provided. The aim of the present study was to compare cardiovascular, ventilatory and metabolic responses obtained during a symptom-limited cardiopulmonary exercise test (CPX) on a treadmill to responses obtained by the incremental shuttle walk test (ISWT) in obese women, and to propose a peak oxygen consumption (VO2) prediction equation using variables obtained during the ISWT. Forty obese women (BMI ≥30 kg/m2) performed one treadmill CPX and two ISWTs. Heart rate (HR), arterial blood pressure (ABP) and perceived exertion by the Borg scale were measured at rest, during each stage of the exercise protocol, and throughout the recovery period. The predicted maximal heart rate (HRmax) was calculated (210 − age in years) (16) and compared to the HR response during the CPX. Peak VO2 obtained during CPX correlated significantly (P<0.05) with ISWT peak VO2 (r=0.79) as well as ISWT distance (r=0.65). The predictive model for CPX peak VO2, using age and ISWT distance, explained 67% of the variability. The current study indicates the ISWT may be used to predict aerobic capacity in obese women when CPX is not a viable option.
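The abstract proposes predicting peak VO2 from age and ISWT distance by regression; the sketch below fits such a two-predictor linear model by least squares on synthetic data, since the study's actual coefficients are not given in the abstract.

```python
import numpy as np

# Fit VO2 = b0 + b1*age + b2*distance by ordinary least squares, mirroring
# the abstract's model (age + ISWT distance explained 67% of the variance).
# All data below are synthetic and the true coefficients are assumptions.
rng = np.random.default_rng(1)
age = rng.uniform(20, 45, 40)                    # years, n = 40 as in the study
distance = rng.uniform(200, 700, 40)             # ISWT distance, metres
vo2 = 25 - 0.15 * age + 0.02 * distance + rng.normal(0, 1, 40)

X = np.column_stack([np.ones_like(age), age, distance])   # design matrix
coef, *_ = np.linalg.lstsq(X, vo2, rcond=None)            # [b0, b1, b2]
```

The fitted signs (negative on age, positive on walking distance) are what one would expect physiologically, which is a quick sanity check on any such prediction equation.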

  3. Supercomputer and cluster performance modeling and analysis efforts:2004-2006.

    Energy Technology Data Exchange (ETDEWEB)

    Sturtevant, Judith E.; Ganti, Anand; Meyer, Harold (Hal) Edward; Stevenson, Joel O.; Benner, Robert E., Jr.; Goudy, Susan Phelps; Doerfler, Douglas W.; Domino, Stefan Paul; Taylor, Mark A.; Malins, Robert Joseph; Scott, Ryan T.; Barnette, Daniel Wayne; Rajan, Mahesh; Ang, James Alfred; Black, Amalia Rebecca; Laub, Thomas William; Vaughan, Courtenay Thomas; Franke, Brian Claude

    2007-02-01

    This report describes efforts by the Performance Modeling and Analysis Team to investigate performance characteristics of Sandia's engineering and scientific applications on the ASC capability and advanced architecture supercomputers, and Sandia's capacity Linux clusters. Efforts to model various aspects of these computers are also discussed. The goals of these efforts are to quantify and compare Sandia's supercomputer and cluster performance characteristics; to reveal strengths and weaknesses in such systems; and to predict performance characteristics of, and provide guidelines for, future acquisitions and follow-on systems. Described herein are the results obtained from running benchmarks and applications to extract performance characteristics and comparisons, as well as modeling efforts, obtained during the time period 2004-2006. The format of the report, with hypertext links to numerous additional documents, purposefully minimizes the document size needed to disseminate the extensive results from our research.

  4. A least-effort principle based model for heterogeneous pedestrian flow considering overtaking behavior

    Science.gov (United States)

    Liu, Chi; Ye, Rui; Lian, Liping; Song, Weiguo; Zhang, Jun; Lo, Siuming

    2018-05-01

    In the context of global aging, how to design traffic facilities for populations with different age compositions is of great importance. For this purpose, we propose a model based on the least-effort principle to simulate heterogeneous pedestrian flow. In the model, each pedestrian is represented by a three-disc-shaped agent. We add a new parameter to capture pedestrians' preference for avoiding overly quick changes in their direction of movement. The model is validated against numerous experimental data on unidirectional pedestrian flow. In addition, we investigate the influence of corridor width and the velocity distribution of crowds on unidirectional heterogeneous pedestrian flow. The simulation results show that widening corridors can increase the specific flow for a crowd composed of two kinds of pedestrians with significantly different free velocities. Moreover, compared with a uniform crowd, a crowd composed of pedestrians with great mobility differences requires a wider corridor to attain the same traffic efficiency. This study could provide a better understanding of heterogeneous pedestrian flow, and the quantified outcomes could be applied in traffic facility design.

  5. Testing effort dependent software reliability model for imperfect debugging process considering both detection and correction

    International Nuclear Information System (INIS)

    Peng, R.; Li, Y.F.; Zhang, W.J.; Hu, Q.P.

    2014-01-01

    This paper studies the fault detection process (FDP) and fault correction process (FCP) with the incorporation of testing effort function and imperfect debugging. In order to ensure high reliability, it is essential for software to undergo a testing phase, during which faults can be detected and corrected by debuggers. The testing resource allocation during this phase, which is usually depicted by the testing effort function, considerably influences not only the fault detection rate but also the time to correct a detected fault. In addition, testing is usually far from perfect such that new faults may be introduced. In this paper, we first show how to incorporate testing effort function and fault introduction into FDP and then develop FCP as delayed FDP with a correction effort. Various specific paired FDP and FCP models are obtained based on different assumptions of fault introduction and correction effort. An illustrative example is presented. The optimal release policy under different criteria is also discussed
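The mean value function of an NHPP reliability growth model with a testing-effort function, the family this paper builds on, can be sketched as follows. The logistic effort function and all parameter values are illustrative assumptions; the paper's specific paired detection/correction models are not reproduced.

```python
import math

# A common NHPP form with a testing-effort function (TEF):
#   m(t) = a * (1 - exp(-b * W(t)))
# where a is the total fault content, b the detection rate per unit effort,
# and W(t) the cumulative testing effort consumed by time t. A logistic TEF
# is one standard choice for W(t).

def logistic_effort(t, N=100.0, alpha=0.2, A=20.0):
    """Cumulative testing effort W(t); saturates at total effort N."""
    return N / (1.0 + A * math.exp(-alpha * t))

def expected_faults_detected(t, a=150.0, b=0.05):
    return a * (1.0 - math.exp(-b * logistic_effort(t)))

detected = [expected_faults_detected(t) for t in (0, 10, 20, 40)]
```

Because detection depends on cumulative effort rather than calendar time, reallocating testing resources shifts the whole curve, which is the lever the paper's release-policy analysis exploits.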

  6. Machine Learning Approach for Software Reliability Growth Modeling with Infinite Testing Effort Function

    Directory of Open Access Journals (Sweden)

    Subburaj Ramasamy

    2017-01-01

    Full Text Available Reliability is one of the quantifiable software quality attributes. Software Reliability Growth Models (SRGMs are used to assess the reliability achieved at different times of testing. Traditional time-based SRGMs may not be accurate enough in all situations where test effort varies with time. To overcome this lacuna, test effort was used instead of time in SRGMs. In the past, finite test effort functions were proposed, which may not be realistic as, at infinite testing time, test effort will be infinite. Hence in this paper, we propose an infinite test effort function in conjunction with a classical Nonhomogeneous Poisson Process (NHPP model. We use Artificial Neural Network (ANN for training the proposed model with software failure data. Here it is possible to get a large set of weights for the same model to describe the past failure data equally well. We use machine learning approach to select the appropriate set of weights for the model which will describe both the past and the future data well. We compare the performance of the proposed model with existing model using practical software failure data sets. The proposed log-power TEF based SRGM describes all types of failure data equally well and also improves the accuracy of parameter estimation more than existing TEF and can be used for software release time determination as well.

  7. [Psychosocial factors at work and cardiovascular diseases: contribution of the Effort-Reward Imbalance model].

    Science.gov (United States)

    Niedhammer, I; Siegrist, J

    1998-11-01

    The effect of psychosocial factors at work on health, especially cardiovascular health, has given rise to growing concern in occupational epidemiology over the last few years. Two theoretical models, Karasek's model and the Effort-Reward Imbalance model, have been developed to evaluate psychosocial factors at work within specific conceptual frameworks in an attempt to take into account the serious methodological difficulties inherent in the evaluation of such factors. Karasek's model, the most widely used model, measures three factors: psychological demands, decision latitude and social support at work. Many studies have shown the predictive effects of these factors on cardiovascular diseases independently of well-known cardiovascular risk factors. More recently, the Effort-Reward Imbalance model takes into account the role of individual coping characteristics which was neglected in the Karasek model. The effort-reward imbalance model focuses on the reciprocity of exchange in occupational life where high-cost/low-gain conditions are considered particularly stressful. Three dimensions of rewards are distinguished: money, esteem and gratifications in terms of promotion prospects and job security. Some studies already support that high-effort/low reward-conditions are predictive of cardiovascular diseases.
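The Effort-Reward Imbalance model described above is commonly operationalized as a single ratio. A small sketch follows; the 6-item effort / 11-item reward scale lengths are an assumption (a widely used questionnaire version), as the abstract gives no item counts.

```python
# Effort-Reward Imbalance (ERI) ratio, as commonly computed:
#   ERI = effort_score / (reward_score * c)
# where c corrects for the unequal number of effort and reward items.
# Values above 1 indicate the high-cost/low-gain conditions the model
# treats as particularly stressful. Item counts below are assumptions.

def eri_ratio(effort_score, reward_score, n_effort_items=6, n_reward_items=11):
    c = n_effort_items / n_reward_items
    return effort_score / (reward_score * c)

balanced = eri_ratio(12, 22)    # effort and reward in proportion
strained = eri_ratio(24, 22)    # double the effort at the same reward
```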

  8. Estimation of inspection effort

    International Nuclear Information System (INIS)

    Mullen, M.F.; Wincek, M.A.

    1979-06-01

    An overview of IAEA inspection activities is presented, and the problem of evaluating the effectiveness of an inspection is discussed. Two models are described - an effort model and an effectiveness model. The effort model breaks the IAEA's inspection effort into components; the amount of effort required for each component is estimated; and the total effort is determined by summing the effort for each component. The effectiveness model quantifies the effectiveness of inspections in terms of probabilities of detection and quantities of material to be detected, if diverted over a specific period. The method is applied to a 200 metric ton per year low-enriched uranium fuel fabrication facility. A description of the model plant is presented, a safeguards approach is outlined, and sampling plans are calculated. The required inspection effort is estimated and the results are compared to IAEA estimates. Some other applications of the method are discussed briefly. Examples are presented which demonstrate how the method might be useful in formulating guidelines for inspection planning and in establishing technical criteria for safeguards implementation
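The "probabilities of detection" in the effectiveness model are often computed with attribute sampling; the sketch below shows that generic calculation, not the paper's specific model for the fuel fabrication facility.

```python
from math import comb

# Attribute-sampling detection probability used in safeguards-style
# effectiveness models: if d of N items have been diverted/falsified and
# the inspector verifies a simple random sample of n items, then
#   P(detect) = 1 - C(N-d, n) / C(N, n)
# i.e. one minus the chance the sample misses every diverted item.

def detection_probability(N, d, n):
    if n > N - d:          # sample cannot fit within the untouched items
        return 1.0
    return 1.0 - comb(N - d, n) / comb(N, n)

# Assumed: 200 items in inventory, 10 diverted, 30 verified
p = detection_probability(N=200, d=10, n=30)
```

Inverting this relationship (fixing a target detection probability and solving for n) is how sampling plans like those mentioned in the abstract translate inspection goals into inspection effort.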

  9. Obligatory Effort [Hishtadlut] as an Explanatory Model: A Critique of Reproductive Choice and Control.

    Science.gov (United States)

    Teman, Elly; Ivry, Tsipy; Goren, Heela

    2016-06-01

    Studies on reproductive technologies often examine women's reproductive lives in terms of choice and control. Drawing on 48 accounts of procreative experiences of religiously devout Jewish women in Israel and the US, we examine their attitudes, understandings and experiences of pregnancy, reproductive technologies and prenatal testing. We suggest that the concept of hishtadlut-"obligatory effort"-works as an explanatory model that organizes Haredi women's reproductive careers and their negotiations of reproductive technologies. As an elastic category with negotiable and dynamic boundaries, hishtadlut gives ultra-orthodox Jewish women room for effort without the assumption of control; it allows them to exercise discretion in relation to medical issues without framing their efforts in terms of individual choice. Haredi women hold themselves responsible for making their obligatory effort and not for pregnancy outcomes. We suggest that an alternative paradigm to autonomous choice and control emerges from cosmological orders where reproductive duties constitute "obligatory choices."

  10. STUDY OF INSTRUCTIONAL MODELS AND SYNTAX AS AN EFFORT FOR DEVELOPING ‘OIDDE’ INSTRUCTIONAL MODEL

    Directory of Open Access Journals (Sweden)

    Atok Miftachul Hudha

    2016-07-01

    Full Text Available The 21st century requires the availability of human resources with seven skills or competences (Maftuh, 2016), namely: 1) critical thinking and problem-solving skills, 2) creativity and innovation, 3) ethical behaviour, 4) flexibility and quick adaptation, 5) competence in ICT and literacy, 6) interpersonal and collaborative capabilities, and 7) social skills and cross-cultural interaction. One of these 21st-century competences, behaving ethically, should be established and developed through learning that includes the study of ethics, because ethical behaviour cannot be created and possessed as-is by humans, but must develop through problem solving, especially the solving of ethical dilemmas within ethical problems or the problematics of ethics. The fundamental problem, if ethical behaviour competence is to be achieved through learning, is that teachers have not yet found the right learning model for implementing learning associated with ethical values as expected in character education (Hudha et al., 2014a, 2014b, 2014c). Therefore, a feasible (valid, practical and effective) learning model is needed so that ethics learning, to form human resources that behave ethically, can be realized. Thus, it is necessary to analyze and modify the learning steps (syntax) of existing learning models in order to develop a new learning-model syntax. One feasible, practical and effective candidate arises from the analysis and modification of the syntax of social learning models and behavioural systems learning models (Joyce and Weil, 1980; Joyce et al., 2009), as well as the syntax of the Tri Prakoro learning model (Akbar, 2013). The modified syntax generates the 'OIDDE' learning model, an acronym for orientation, identify, discussion, decision, and engage in behavior.

  11. Time and Effort Required by Persons with Spinal Cord Injury to Learn to Use a Powered Exoskeleton for Assisted Walking.

    Science.gov (United States)

    Kozlowski, Allan J; Bryce, Thomas N; Dijkers, Marcel P

    2015-01-01

    Powered exoskeletons have been demonstrated to be safe for persons with spinal cord injury (SCI), but little is known about how users learn to manage these devices. To quantify the time and effort required by persons with SCI to learn to use an exoskeleton for assisted walking. A convenience sample was enrolled to learn to use the first-generation Ekso powered exoskeleton to walk. Participants were given up to 24 weekly sessions of instruction. Data were collected on assistance level, walking distance and speed, heart rate, perceived exertion, and adverse events. Time and effort were quantified by the number of sessions required for participants to stand up, walk for 30 minutes, and sit down, initially with minimal and subsequently with contact-guard assistance. Of 22 enrolled participants, 9 screen-failed and 7 had complete data. All 7 of these were men; 2 had tetraplegia and 5 had motor-complete injuries. Of these, 5 participants could stand, walk, and sit with contact-guard or close-supervision assistance, and 2 required minimal to moderate assistance. Walk times ranged from 28 to 94 minutes, with average speeds ranging from 0.11 to 0.21 m/s. For all participants, heart rate changes and reported perceived exertion were consistent with light to moderate exercise. This study provides preliminary evidence that persons with neurological weakness due to SCI can learn to walk with little or no assistance and light to somewhat hard perceived exertion using a powered exoskeleton. Persons with different severities of injury, including those with motor-complete C7 tetraplegia and motor-incomplete C4 tetraplegia, may be able to learn to use this device.

  12. Competition for marine space: modelling the Baltic Sea fisheries and effort displacement under spatial restrictions

    DEFF Research Database (Denmark)

    Bastardie, Francois; Nielsen, J. Rasmus; Eigaard, Ole Ritzau

    2015-01-01

    …(DISPLACE model) to combine stochastic variations in spatial fishing activities with harvested resource dynamics in scenario projections. The assessment computes economic and stock status indicators by modelling the activity of Danish, Swedish, and German vessels (>12 m) in the international western Baltic Sea commercial fishery, together with the underlying size-based distribution dynamics of the main fishery resources of sprat, herring, and cod. The outcomes of alternative scenarios for spatial effort displacement are exemplified by evaluating the fishers' abilities to adapt to spatial plans under various constraints. Interlinked spatial, technical, and biological dynamics of vessels and stocks in the scenarios result in stable profits, which compensate for the additional costs from effort displacement and release pressure on the fish stocks. The effort is further redirected away from sensitive…

  13. Effort dynamics in a fisheries bioeconomic model: A vessel level approach through Game Theory

    Directory of Open Access Journals (Sweden)

    Gorka Merino

    2007-09-01

    Full Text Available Red shrimp, Aristeus antennatus (Risso, 1816), is one of the most important resources for the bottom-trawl fleets in the northwestern Mediterranean, in terms of both landings and economic value. A simple bioeconomic model introducing Game Theory for the prediction of effort dynamics at the vessel level is proposed. The game is played by the twelve vessels exploiting red shrimp in Blanes. Within the game, two solutions are computed: non-cooperation and cooperation. The first is proposed as a realistic method for the prediction of individual effort strategies, and the second is used to illustrate the potential profitability of the analysed fishery. The effort strategy for each vessel is the number of fishing days per year, and their objective is profit maximisation: individual profits for the non-cooperative solution and total profits for the cooperative one. In the present analysis, strategic conflicts arise from the differences between vessels in technical efficiency (catchability coefficient) and economic efficiency (defined here). The ten-year and 1000-iteration stochastic simulations performed for the two effort solutions show that the best strategy from both an economic and a conservationist perspective is homogeneous effort cooperation. However, the results under non-cooperation are more similar to the observed data on effort strategies and landings.
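
    The qualitative result described above can be reproduced with a stripped-down Gordon-Schaefer effort game. The sketch below is not the paper's model: it assumes identical vessels (the paper's per-vessel differences in catchability and cost are omitted) and purely illustrative parameter values, but it shows why cooperation restrains effort and raises fleet profit relative to the non-cooperative Nash solution.

```python
# Symmetric Gordon-Schaefer effort game (illustrative parameters, not the
# paper's calibration; the twelve vessels are assumed identical here).
N = 12          # vessels exploiting the stock
P = 10.0        # price per kg of landings
Q = 0.001       # catchability per fishing day
K = 1e6         # carrying capacity (kg)
R = 0.5         # intrinsic growth rate
C = 500.0       # cost per fishing day

def total_profit(total_effort):
    """Fleet profit at equilibrium biomass for a given total effort (days/year)."""
    biomass = K * (1.0 - Q * total_effort / R)
    return P * Q * total_effort * biomass - C * total_effort

# Closed-form totals from the first-order conditions of each regime:
T_nash = R * (P * Q * K - C) / (P * Q * Q * K) * N / (N + 1)  # non-cooperative
T_coop = R * (P * Q * K - C) / (2 * P * Q * Q * K)            # joint maximum

print(f"days/vessel: Nash {T_nash / N:.1f}, cooperative {T_coop / N:.1f}")
print(f"fleet profit: Nash {total_profit(T_nash):,.0f}, "
      f"cooperative {total_profit(T_coop):,.0f}")
```

    In this symmetric toy, the non-cooperative fleet exerts nearly twice the cooperative effort for a fraction of the profit; the heterogeneity in efficiency between vessels is what makes the paper's actual game strategically interesting.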

  14. Investigation of Psychological Health and Migraine Headaches Among Personnel According to Effort-Reward Imbalance Model

    Directory of Open Access Journals (Sweden)

    Z. Darami

    2012-05-01

    Full Text Available Background and aims: The relationship between physical-mental health, migraine headaches, and stress, especially job stress, is known. Many factors can create job stress in work settings. The factor that has gained much attention recently is the inequality (imbalance) between employees' effort and the reward they gain. The aim of the current study was to investigate the validity of the effort-reward imbalance model and to examine the relation of this model with migraine headaches and psychological well-being among subjects in balance and imbalance groups. Methods: Participants were 180 personnel of an oil distribution company located in Isfahan city, and the instruments used were the General Health Questionnaire (Goldberg & Hillier), the Social Re-adjustment Rating Scale (Holmes & Rahe), the Ahvaz Migraine Questionnaire (Najariyan), and the Effort-Reward Imbalance Scale (Van Vegchel et al.). Results: The results of exploratory and confirmatory factor analysis for investigating the construct validity of the effort-reward imbalance model showed that in both analyses the two-factor model was confirmed. Moreover, findings indicate that the balance group was in better psychological (p<0.01) and physical (migraine) (p<0.05) status compared to the imbalance group. These findings indicate the importance of justice in providing appropriate reward relative to personnel performance for their health. Conclusion: Application of these findings can improve Iranian industrial personnel health in both physical and psychological aspects.

  15. Hindsight Bias Doesn't Always Come Easy: Causal Models, Cognitive Effort, and Creeping Determinism

    Science.gov (United States)

    Nestler, Steffen; Blank, Hartmut; von Collani, Gernot

    2008-01-01

    Creeping determinism, a form of hindsight bias, refers to people's hindsight perceptions of events as being determined or inevitable. This article proposes, on the basis of a causal-model theory of creeping determinism, that the underlying processes are effortful, and hence creeping determinism should disappear when individuals lack the cognitive…

  16. A Covariance Structure Model Test of Antecedents of Adolescent Alcohol Misuse and a Prevention Effort.

    Science.gov (United States)

    Dielman, T. E.; And Others

    1989-01-01

    Questionnaires were administered to 4,157 junior high school students to determine levels of alcohol misuse, exposure to peer use and misuse of alcohol, susceptibility to peer pressure, internal health locus of control, and self-esteem. A conceptual model of antecedents of adolescent alcohol misuse and the effectiveness of a prevention effort was…

  17. Computational Models of Anterior Cingulate Cortex: At the Crossroads between Prediction and Effort

    Directory of Open Access Journals (Sweden)

    Eliana Vassena

    2017-06-01

    Full Text Available In the last two decades the anterior cingulate cortex (ACC) has become one of the most investigated areas of the brain. Extensive neuroimaging evidence suggests countless functions for this region, ranging from conflict and error coding, to social cognition, pain, and effortful control. In response to this burgeoning amount of data, a proliferation of computational models has tried to characterize the neurocognitive architecture of ACC. Early seminal models provided a computational explanation for a relatively circumscribed set of empirical findings, mainly accounting for EEG and fMRI evidence. More recent models have focused on ACC's contribution to effortful control. In parallel to these developments, several proposals attempted to explain within a single computational framework a wider variety of empirical findings that span different cognitive processes and experimental modalities. Here we critically evaluate these modeling attempts, highlighting the continued need to reconcile the array of disparate ACC observations within a coherent, unifying framework.

  18. Effort-reward imbalance and organisational injustice among aged nurses: a moderated mediation model.

    Science.gov (United States)

    Topa, Gabriela; Guglielmi, Dina; Depolo, Marco

    2016-09-01

    To test the effort-reward imbalance model among older nurses, expanding it to include the moderation of overcommitment and age in the stress-health complaints relationship, mediated by organisational injustice. The theoretical framework included the effort-reward imbalance, the uncertainty management and the socio-emotional selectivity models. Employing a two-wave design, the participants were 255 nurses aged 45 years and over, recruited from four large hospitals in Spain (Madrid and Basque Country). The direct effect of imbalance on health complaints was supported: it was significant when overcommitment was low but not when it was high. Organisational injustice mediated the influence of effort-reward imbalance on health complaints. The conditional effect of the mediation of organisational injustice was significant in three of the overcommitment/age conditions but it weakened, becoming non-significant, when the level of overcommitment was low and age was high. The study tested the model in nursing populations and expanded it to the settings of occupational health and safety at work. The results of this study highlight the importance of effort-reward imbalance and organisational justice for creating healthy work environments. © 2016 John Wiley & Sons Ltd.

  19. Integrating multiple distribution models to guide conservation efforts of an endangered toad

    Science.gov (United States)

    Treglia, Michael L.; Fisher, Robert N.; Fitzgerald, Lee A.

    2015-01-01

    Species distribution models are used for numerous purposes such as predicting changes in species’ ranges and identifying biodiversity hotspots. Although implications of distribution models for conservation are often implicit, few studies use these tools explicitly to inform conservation efforts. Herein, we illustrate how multiple distribution models developed using distinct sets of environmental variables can be integrated to aid in identifying sites for use in conservation. We focus on the endangered arroyo toad (Anaxyrus californicus), which relies on open, sandy streams and surrounding floodplains in southern California, USA, and northern Baja California, Mexico. Declines of the species are largely attributed to habitat degradation associated with vegetation encroachment, invasive predators, and altered hydrologic regimes. We had three main goals: 1) develop a model of potential habitat for arroyo toads, based on long-term environmental variables and all available locality data; 2) develop a model of the species’ current habitat by incorporating recent remotely-sensed variables and only using recent locality data; and 3) integrate results of both models to identify sites that may be employed in conservation efforts. We used a machine learning technique, Random Forests, to develop the models, focused on riparian zones in southern California. We identified 14.37% and 10.50% of our study area as potential and current habitat for the arroyo toad, respectively. Generally, inclusion of remotely-sensed variables reduced modeled suitability of sites, thus many areas modeled as potential habitat were not modeled as current habitat. We propose such sites could be made suitable for arroyo toads through active management, increasing current habitat by up to 67.02%. Our general approach can be employed to guide conservation efforts of virtually any species with sufficient data necessary to develop appropriate distribution models.
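
    The integration step described above can be illustrated with a hedged numpy sketch: two synthetic suitability surfaces stand in for the study's Random Forests outputs (the grids, the 0.7 cutoff, and the scaling are all invented), and cells scoring high on the long-term model but low on the current one are flagged as candidates for active management.

```python
import numpy as np

# Synthetic stand-ins for two habitat-suitability surfaces on a 100x100 grid.
rng = np.random.default_rng(0)
potential = rng.random((100, 100))                       # long-term model output
current = potential * rng.uniform(0.3, 1.0, (100, 100))  # degraded by construction

THRESH = 0.7  # assumed suitability cutoff

potential_habitat = potential >= THRESH
current_habitat = current >= THRESH

# Suitable in the long term but not at present: candidates for management
# (vegetation removal, restored hydrology, etc.).
restorable = potential_habitat & ~current_habitat

frac_potential = potential_habitat.mean()
frac_current = current_habitat.mean()
possible_gain = restorable.sum() / max(current_habitat.sum(), 1)

print(f"potential {frac_potential:.1%}, current {frac_current:.1%}, "
      f"gain over current habitat {possible_gain:.1%}")
```

    The printed "gain over current habitat" plays the role of the study's 67.02% figure: the increase achievable if all restorable cells were made suitable.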

  20. The Effect of the Demand Control and Effort Reward Imbalance Models on the Academic Burnout of Korean Adolescents

    Science.gov (United States)

    Lee, Jayoung; Puig, Ana; Lee, Sang Min

    2012-01-01

    The purpose of this study was to examine the effects of the Demand Control Model (DCM) and the Effort Reward Imbalance Model (ERIM) on academic burnout for Korean students. Specifically, this study identified the effects of the predictor variables based on DCM and ERIM (i.e., demand, control, effort, reward, Demand Control Ratio, Effort Reward…

  1. Terry Turbopump Analytical Modeling Efforts in Fiscal Year 2016 - Progress Report.

    Energy Technology Data Exchange (ETDEWEB)

    Osborn, Douglas; Ross, Kyle; Cardoni, Jeffrey N

    2018-04-01

    This document details the Fiscal Year 2016 modeling efforts to define the true operating limitations (margins) of the Terry turbopump systems used in the nuclear industry for Milestone 3 (full-scale component experiments) and Milestone 4 (Terry turbopump basic science experiments) experiments. The overall multinational-sponsored program creates the technical basis to: (1) reduce and defer additional utility costs, (2) simplify plant operations, and (3) provide a better understanding of the true margin which could reduce overall risk of operations.

  2. The Effort Paradox: Effort Is Both Costly and Valued.

    Science.gov (United States)

    Inzlicht, Michael; Shenhav, Amitai; Olivola, Christopher Y

    2018-04-01

    According to prominent models in cognitive psychology, neuroscience, and economics, effort (be it physical or mental) is costly: when given a choice, humans and non-human animals alike tend to avoid effort. Here, we suggest that the opposite is also true and review extensive evidence that effort can also add value. Not only can the same outcomes be more rewarding if we apply more (not less) effort, sometimes we select options precisely because they require effort. Given the increasing recognition of effort's role in motivation, cognitive control, and value-based decision-making, considering this neglected side of effort will not only improve formal computational models, but also provide clues about how to promote sustained mental effort across time. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. Fundamental Drop Dynamics and Mass Transfer Experiments to Support Solvent Extraction Modeling Efforts

    International Nuclear Information System (INIS)

    Christensen, Kristi; Rutledge, Veronica; Garn, Troy

    2011-01-01

    In support of the Nuclear Energy Advanced Modeling Simulation Safeguards and Separations (NEAMS SafeSep) program, the Idaho National Laboratory (INL) worked in collaboration with Los Alamos National Laboratory (LANL) to further a modeling effort designed to predict mass transfer behavior for selected metal species between individual dispersed drops and a continuous phase in a two-phase liquid-liquid extraction (LLE) system. The purpose of the model is to understand the fundamental processes of mass transfer that occur at the drop interface. This fundamental understanding can be extended to support modeling of larger LLE equipment such as mixer settlers, pulse columns, and centrifugal contactors. The work performed at the INL involved gathering the necessary experimental data to support the modeling effort. A custom experimental apparatus was designed and built for performing drop contact experiments to measure mass transfer coefficients as a function of contact time. A high-speed digital camera was used in conjunction with the apparatus to measure size, shape, and velocity of the drops. In addition to drop data, the physical properties of the experimental fluids were measured to be used as input data for the model. Physical property measurements included density, viscosity, surface tension, and interfacial tension. Additionally, self-diffusion coefficients for the selected metal species in each experimental solution were measured, and the distribution coefficient for the metal partitioning between phases was determined. At the completion of this work, the INL has determined the mass transfer coefficient and a velocity profile for drops rising by buoyancy through a continuous medium under a specific set of experimental conditions. Additionally, a complete set of experimentally determined fluid properties has been obtained. All data will be provided to LANL to support the modeling effort.

  4. Data requirements for integrated near field models

    International Nuclear Information System (INIS)

    Wilems, R.E.; Pearson, F.J. Jr.; Faust, C.R.; Brecher, A.

    1981-01-01

    The coupled nature of the various processes in the near field require that integrated models be employed to assess long term performance of the waste package and repository. The nature of the integrated near field models being compiled under the SCEPTER program are discussed. The interfaces between these near field models and far field models are described. Finally, near field data requirements are outlined in sufficient detail to indicate overall programmatic guidance for data gathering activities

  5. A model of reward- and effort-based optimal decision making and motor control.

    Directory of Open Access Journals (Sweden)

    Lionel Rigoux

    Full Text Available Costs (e.g., energetic expenditure) and benefits (e.g., food) are central determinants of behavior. In ecology and economics, they are combined to form a utility function which is maximized to guide choices. This principle is widely used in neuroscience as a normative model of decision and action, but current versions of this model fail to consider how decisions are actually converted into actions (i.e., the formation of trajectories). Here, we describe an approach where decision making and motor control are optimal, iterative processes derived from the maximization of the discounted, weighted difference between expected rewards and foreseeable motor efforts. The model accounts for decision making in cost/benefit situations, and detailed characteristics of control and goal tracking in realistic motor tasks. As a normative construction, the model is relevant to address the neural bases and pathological aspects of decision making and motor control.
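
    The maximization the abstract describes can be illustrated with a toy utility: temporally discounted reward minus a foreseeable motor-effort cost, optimized over movement duration. The functional forms and constants below are illustrative assumptions, not the authors' equations.

```python
import numpy as np

# Toy utility of a movement of duration t: reward is discounted exponentially
# with time, while effort (for a fixed amplitude) grows with speed and so is
# taken to fall off as 1/t^2. All constants are invented for illustration.
REWARD = 10.0   # value of reaching the goal
GAMMA = 0.5     # temporal discount rate
ALPHA = 2.0     # effort scaling

def utility(t):
    return REWARD * np.exp(-GAMMA * t) - ALPHA / t**2

ts = np.linspace(0.1, 5.0, 1000)     # candidate movement durations
best_t = ts[np.argmax(utility(ts))]  # the duration an optimal mover selects
print(f"optimal movement duration ~ {best_t:.2f} (arbitrary units)")
```

    Very fast movements are penalized by effort, very slow ones by discounting; the optimum sits in between, which is the intuition behind reward- and effort-based control of movement vigor.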

  6. Quantitative Analysis of the Security of Software-Defined Network Controller Using Threat/Effort Model

    Directory of Open Access Journals (Sweden)

    Zehui Wu

    2017-01-01

    Full Text Available The SDN-based controller, which is responsible for the configuration and management of the network, is the core of Software-Defined Networks. Current methods, which focus on the security mechanism, use qualitative analysis to estimate the security of controllers, frequently leading to inaccurate results. In this paper, we employ a quantitative approach to overcome this shortcoming. Based on an analysis of the controller threat model, we give formal models of the APIs, the protocol interfaces, and the data items of the controller, and further provide our Threat/Effort quantitative calculation model. With the help of the Threat/Effort model, we are able to compare not only the security of different versions of the same controller but also that of different kinds of controllers, providing a basis for controller selection and secure development. We evaluated our approach on four widely used SDN-based controllers: POX, OpenDaylight, Floodlight, and Ryu. The tests, whose outcomes are similar to those of traditional qualitative analysis, demonstrate that our approach yields specific security values for different controllers and produces more accurate results.

  7. Modeling and Evaluating Pilot Performance in NextGen: Review of and Recommendations Regarding Pilot Modeling Efforts, Architectures, and Validation Studies

    Science.gov (United States)

    Wickens, Christopher; Sebok, Angelia; Keller, John; Peters, Steve; Small, Ronald; Hutchins, Shaun; Algarin, Liana; Gore, Brian Francis; Hooey, Becky Lee; Foyle, David C.

    2013-01-01

    NextGen operations are associated with a variety of changes to the national airspace system (NAS) including changes to the allocation of roles and responsibilities among operators and automation, the use of new technologies and automation, additional information presented on the flight deck, and the entire concept of operations (ConOps). In the transition to NextGen airspace, aviation and air operations designers need to consider the implications of design or system changes on human performance and the potential for error. To ensure continued safety of the NAS, it will be necessary for researchers to evaluate design concepts and potential NextGen scenarios well before implementation. One approach for such evaluations is through human performance modeling. Human performance models (HPMs) provide effective tools for predicting and evaluating operator performance in systems. HPMs offer significant advantages over empirical, human-in-the-loop testing in that (1) they allow detailed analyses of systems that have not yet been built, (2) they offer great flexibility for extensive data collection, (3) they do not require experimental participants, and thus can offer cost and time savings. HPMs differ in their ability to predict performance and safety with NextGen procedures, equipment and ConOps. Models also vary in terms of how they approach human performance (e.g., some focus on cognitive processing, others focus on discrete tasks performed by a human, while others consider perceptual processes), and in terms of their associated validation efforts. The objectives of this research effort were to support the Federal Aviation Administration (FAA) in identifying HPMs that are appropriate for predicting pilot performance in NextGen operations, to provide guidance on how to evaluate the quality of different models, and to identify gaps in pilot performance modeling research, that could guide future research opportunities. This research effort is intended to help the FAA

  8. Simulation and Modeling Efforts to Support Decision Making in Healthcare Supply Chain Management

    Directory of Open Access Journals (Sweden)

    Eman AbuKhousa

    2014-01-01

    Full Text Available Recently, most healthcare organizations have focused their attention on reducing the cost of their supply chain management (SCM) by improving the efficiency of the decision-making processes involved. The availability of products through healthcare SCM is often a matter of life or death to the patient; therefore, trial-and-error approaches are not an option in this environment. Simulation and modeling (SM) has been presented as an alternative approach for supply chain managers in healthcare organizations to test solutions and to support decision-making processes associated with various SCM problems. This paper presents and analyzes past SM efforts to support decision making in healthcare SCM and identifies the key challenges associated with healthcare SCM modeling. We also present and discuss emerging technologies to meet these challenges.

  9. Simulation and modeling efforts to support decision making in healthcare supply chain management.

    Science.gov (United States)

    AbuKhousa, Eman; Al-Jaroodi, Jameela; Lazarova-Molnar, Sanja; Mohamed, Nader

    2014-01-01

    Recently, most healthcare organizations have focused their attention on reducing the cost of their supply chain management (SCM) by improving the efficiency of the decision-making processes involved. The availability of products through healthcare SCM is often a matter of life or death to the patient; therefore, trial-and-error approaches are not an option in this environment. Simulation and modeling (SM) has been presented as an alternative approach for supply chain managers in healthcare organizations to test solutions and to support decision-making processes associated with various SCM problems. This paper presents and analyzes past SM efforts to support decision making in healthcare SCM and identifies the key challenges associated with healthcare SCM modeling. We also present and discuss emerging technologies to meet these challenges.

  10. A model to estimate cost-savings in diabetic foot ulcer prevention efforts.

    Science.gov (United States)

    Barshes, Neal R; Saedi, Samira; Wrobel, James; Kougias, Panos; Kundakcioglu, O Erhun; Armstrong, David G

    2017-04-01

    Sustained efforts at preventing diabetic foot ulcers (DFUs) and subsequent leg amputations are sporadic in most health care systems despite the high costs associated with such complications. We sought to estimate effectiveness targets at which cost-savings (i.e., improved health outcomes at decreased total costs) might occur. A Markov model with probabilistic sensitivity analyses was used to simulate the five-year survival, incidence of foot complications, and total health care costs in a hypothetical population of 100,000 people with diabetes. Clinical event and cost estimates were obtained from previously published trials and studies. A population without previous DFU but with 17% neuropathy and 11% peripheral artery disease (PAD) prevalence was assumed. Primary prevention (PP) was defined as reducing initial DFU incidence. PP was more than 90% likely to provide cost-savings when annual prevention costs are less than $50/person and/or annual DFU incidence is reduced by at least 25%. Efforts directed at patients with diabetes who were at moderate or high risk for DFUs were very likely to provide cost-savings if DFU incidence was decreased by at least 10% and/or the cost was less than $150 per person per year. Low-cost DFU primary prevention efforts producing even small decreases in DFU incidence may provide the best opportunity for cost-savings, especially if focused on patients with neuropathy and/or PAD. Mobile phone-based reminders, self-identification of risk factors (e.g., the Ipswich touch test), and written brochures may be among such low-cost interventions that should be investigated for their cost-savings potential. Published by Elsevier Inc.
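
    A deterministic back-of-envelope version of the cost-savings question can be written directly from the thresholds quoted above. Apart from the $50/person/year cost and 25% risk-reduction thresholds, every input below (baseline incidence, episode cost, horizon) is a placeholder assumption, not one of the study's calibrated Markov inputs.

```python
# Does a prevention program that cuts DFU incidence pay for itself?
# Illustrative cohort arithmetic, not the paper's probabilistic Markov model.
POPULATION = 100_000
ANNUAL_DFU_RISK = 0.02     # baseline incidence, assumed
COST_PER_DFU = 30_000.0    # assumed average episode cost (USD)
PREVENTION_COST = 50.0     # per person per year (the paper's threshold)
RISK_REDUCTION = 0.25      # 25% relative reduction (the paper's threshold)
YEARS = 5

def total_cost(prevention: bool) -> float:
    """Total 5-year cost of DFU episodes plus any prevention spending."""
    risk = ANNUAL_DFU_RISK * ((1 - RISK_REDUCTION) if prevention else 1.0)
    cost = 0.0
    for _ in range(YEARS):
        cost += POPULATION * risk * COST_PER_DFU
        if prevention:
            cost += POPULATION * PREVENTION_COST
    return cost

baseline = total_cost(False)
with_prev = total_cost(True)
print(f"baseline ${baseline:,.0f} vs prevention ${with_prev:,.0f}; "
      f"savings ${baseline - with_prev:,.0f}")
```

    Under these placeholder inputs the program saves money; the study's contribution is establishing at what cost/effectiveness combinations that conclusion survives probabilistic sensitivity analysis.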

  11. A Requirements Analysis Model Based on QFD

    Institute of Scientific and Technical Information of China (English)

    TANG Zhi-wei; Nelson K.H.Tang

    2004-01-01

    The enterprise resource planning (ERP) system has emerged to offer an integrated IT solution, and more and more enterprises are adopting it and regarding it as an important innovation. However, there is already evidence of high failure risk in ERP project implementation; one major reason is poor analysis of the requirements for system implementation. In this paper, the importance of requirements analysis for ERP project implementation is highlighted, and a requirements analysis model applying quality function deployment (QFD) is presented, which supports requirements analysis for ERP projects.

  12. The effort-reward imbalance work-stress model and daytime salivary cortisol and dehydroepiandrosterone (DHEA) among Japanese women.

    Science.gov (United States)

    Ota, Atsuhiko; Mase, Junji; Howteerakul, Nopporn; Rajatanun, Thitipat; Suwannapong, Nawarat; Yatsuya, Hiroshi; Ono, Yuichiro

    2014-09-17

    We examined the influence of work-related effort-reward imbalance and overcommitment to work (OC), as derived from Siegrist's Effort-Reward Imbalance (ERI) model, on the hypothalamic-pituitary-adrenocortical (HPA) axis. We hypothesized that, among healthy workers, both cortisol and dehydroepiandrosterone (DHEA) secretion would be increased by effort-reward imbalance and OC and, as a result, cortisol-to-DHEA ratio (C/D ratio) would not differ by effort-reward imbalance or OC. The subjects were 115 healthy female nursery school teachers. Salivary cortisol, DHEA, and C/D ratio were used as indexes of HPA activity. Mixed-model analyses of variance revealed that neither the interaction between the ERI model indicators (i.e., effort, reward, effort-to-reward ratio, and OC) and the series of measurement times (9:00, 12:00, and 15:00) nor the main effect of the ERI model indicators was significant for daytime salivary cortisol, DHEA, or C/D ratio. Multiple linear regression analyses indicated that none of the ERI model indicators was significantly associated with area under the curve of daytime salivary cortisol, DHEA, or C/D ratio. We found that effort, reward, effort-reward imbalance, and OC had little influence on daytime variation patterns, levels, or amounts of salivary HPA-axis-related hormones. Thus, our hypotheses were not supported.
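
    Two quantities implied by this abstract are straightforward to compute: Siegrist's effort-to-reward ratio (here with the commonly used item-count correction factor, which may differ from the paper's exact scoring) and a trapezoid-rule area under the curve for the three daytime samples. The questionnaire scores and cortisol values below are invented.

```python
# Effort-to-reward ratio with the usual correction factor k for unequal item
# counts (6 effort items vs. 11 reward items in the standard ERI scales), and
# a trapezoid-rule AUC "with respect to ground" over the sampling times.
def effort_reward_ratio(effort_score, reward_score,
                        n_effort_items=6, n_reward_items=11):
    k = n_effort_items / n_reward_items
    return effort_score / (reward_score * k)

def auc_ground(times_h, values):
    total = 0.0
    for i in range(1, len(values)):
        total += (values[i] + values[i - 1]) / 2.0 * (times_h[i] - times_h[i - 1])
    return total

cortisol = [12.0, 8.0, 5.0]  # nmol/L at 9:00, 12:00, and 15:00 (made up)
print(effort_reward_ratio(18, 44))        # a ratio > 1 would indicate imbalance
print(auc_ground([9, 12, 15], cortisol))  # total daytime cortisol output
```

    The same AUC computation applies to DHEA, and the cortisol-to-DHEA ratio can then be formed point-wise or from the two AUCs.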

  13. Modeling and Testing Legacy Data Consistency Requirements

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard

    2003-01-01

    An increasing number of data sources are available on the Internet, many of which offer semantically overlapping data, but based on different schemas, or models. While it is often of interest to integrate such data sources, the lack of consistency among them makes this integration difficult. This paper addresses the need for new techniques that enable the modeling and consistency checking of legacy data sources. Specifically, the paper contributes to the development of a framework that enables consistency testing of data coming from different types of data sources. The vehicle is UML and its accompanying XMI. The paper presents techniques for modeling consistency requirements using OCL and other UML modeling elements: it studies how models that describe the required consistencies among instances of legacy models can be designed in standard UML tools that support XMI. The paper also considers…

  14. A Collaborative Effort Between Caribbean States for Tsunami Numerical Modeling: Case Study CaribeWave15

    Science.gov (United States)

    Chacón-Barrantes, Silvia; López-Venegas, Alberto; Sánchez-Escobar, Rónald; Luque-Vergara, Néstor

    2018-04-01

    Historical records have shown that tsunamis have affected the Caribbean region in the past. However infrequent, recent studies have demonstrated that they pose a latent hazard for countries within this basin. The Hazard Assessment Working Group of the ICG/CARIBE-EWS (Intergovernmental Coordination Group of the Early Warning System for Tsunamis and Other Coastal Threats for the Caribbean Sea and Adjacent Regions) of IOC/UNESCO has a modeling subgroup, which seeks to develop a modeling platform to assess the effects of possible tsunami sources within the basin. The CaribeWave tsunami exercise is carried out annually in the Caribbean region to increase awareness and test the tsunami preparedness of countries within the basin. In this study we present results of tsunami inundation using the CaribeWave15 exercise scenario for four selected locations within the Caribbean basin (Colombia, Costa Rica, Panamá, and Puerto Rico), performed by tsunami modeling researchers from those countries. The purpose of this study was to provide the states with additional results for the exercise. The results obtained here were compared to the co-seismic deformation and tsunami heights within the basin (energy plots) provided for the exercise, to assess the performance of the decision support tools distributed by PTWC (Pacific Tsunami Warning Center), the tsunami service provider for the Caribbean basin. However, comparison of coastal tsunami heights was not possible due to inconsistencies between the provided fault parameters and the modeling results within the provided exercise products. Still, the modeling performed here allowed us to analyze tsunami characteristics at the mentioned states from sources within the North Panamá Deformed Belt. The occurrence of a tsunami in the Caribbean may affect several countries, because many of them share coastal zones in this basin. Therefore, collaborative efforts similar to the one presented in this study, particularly between neighboring…

  15. Index of Effort: An Analytical Model for Evaluating and Re-Directing Student Recruitment Activities for a Local Community College.

    Science.gov (United States)

    Landini, Albert J.

    This index of effort is proposed as a means by which those in charge of student recruitment activities at community colleges can be sure that their efforts are being directed toward all of the appropriate population. The index is an analytical model based on the concept of socio-economic profiles, using small area 1970 census data, and is the…

  16. Effects of fishing effort allocation scenarios on energy efficiency and profitability: an individual-based model applied to Danish fisheries

    DEFF Research Database (Denmark)

    Bastardie, Francois; Nielsen, J. Rasmus; Andersen, Bo Sølgaard

    2010-01-01

    to the harbour, and (C) allocating effort towards optimising the expected area-specific profit per trip. The model is informed by data from each Danish fishing vessel >15 m after coupling its high resolution spatial and temporal effort data (VMS) with data from logbook landing declarations, sales slips, vessel...... effort allocation has actually been sub-optimal because increased profits from decreased fuel consumption and larger landings could have been obtained by applying a different spatial effort allocation. Based on recent advances in VMS and logbooks data analyses, this paper contributes to improve...

  17. Habitat models to assist plant protection efforts in Shenandoah National Park, Virginia, USA

    Science.gov (United States)

    Van Manen, F.T.; Young, J.A.; Thatcher, C.A.; Cass, W.B.; Ulrey, C.

    2005-01-01

    During 2002, the National Park Service initiated a demonstration project to develop science-based law enforcement strategies for the protection of at-risk natural resources, including American ginseng (Panax quinquefolius L.), bloodroot (Sanguinaria canadensis L.), and black cohosh (Cimicifuga racemosa (L.) Nutt. [syn. Actaea racemosa L.]). Harvest pressure on these species is increasing because of the growing herbal remedy market. We developed habitat models for Shenandoah National Park and the northern portion of the Blue Ridge Parkway to determine the distribution of favorable habitats of these three plant species and to demonstrate the use of that information to support plant protection activities. We compiled locations for the three plant species to delineate favorable habitats with a geographic information system (GIS). We mapped potential habitat quality for each species by calculating a multivariate statistic, Mahalanobis distance, based on GIS layers that characterized the topography, land cover, and geology of the plant locations (10-m resolution). We tested model performance with an independent dataset of plant locations, which indicated a significant relationship between Mahalanobis distance values and species occurrence. We also generated null models by examining the distribution of the Mahalanobis distance values had plants been distributed randomly. For all species, the habitat models performed markedly better than their respective null models. We used our models to direct field searches to the most favorable habitats, resulting in a sizeable number of new plant locations (82 ginseng, 73 bloodroot, and 139 black cohosh locations). The odds of finding new plant locations based on the habitat models were 4.5 (black cohosh) to 12.3 (American ginseng) times greater than random searches; thus, the habitat models can be used to improve the efficiency of plant protection efforts (e.g., marking of plants, law enforcement activities).
The field searches also
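
The Mahalanobis-distance habitat scoring described above can be sketched in a few lines (a minimal illustration on synthetic covariates; the function name and array layout are ours, not the study's):

```python
import numpy as np

def mahalanobis_habitat_map(known_sites, candidate_cells):
    """Score candidate grid cells by squared Mahalanobis distance to the
    multivariate mean of environmental conditions at known plant sites.

    known_sites:     (n, k) array of environmental covariates
                     (e.g. elevation, slope, geology class) at confirmed locations.
    candidate_cells: (m, k) array of the same covariates for each grid cell.
    Lower distance = more favorable (more "typical") habitat.
    """
    mu = known_sites.mean(axis=0)
    cov = np.cov(known_sites, rowvar=False)
    cov_inv = np.linalg.inv(cov)
    diff = candidate_cells - mu
    # Squared Mahalanobis distance for each candidate cell
    return np.einsum("ij,jk,ik->i", diff, cov_inv, diff)
```

Cells with the smallest distance are the most similar to known sites and would be prioritized for field searches.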

  18. Modeling the Movement of Homicide by Type to Inform Public Health Prevention Efforts.

    Science.gov (United States)

    Zeoli, April M; Grady, Sue; Pizarro, Jesenia M; Melde, Chris

    2015-10-01

    We modeled the spatiotemporal movement of hotspot clusters of homicide by motive in Newark, New Jersey, to investigate whether different homicide types have different patterns of clustering and movement. We obtained homicide data from the Newark Police Department Homicide Unit's investigative files from 1997 through 2007 (n = 560). We geocoded the address at which each homicide victim was found and recorded the date of and the motive for the homicide. We used cluster detection software to model the spatiotemporal movement of statistically significant homicide clusters by motive, using census tract and month of occurrence as the spatial and temporal units of analysis. Gang-motivated homicides showed evidence of clustering and diffusion through Newark. Additionally, gang-motivated homicide clusters overlapped to a degree with revenge and drug-motivated homicide clusters. Escalating dispute and nonintimate familial homicides clustered; however, there was no evidence of diffusion. Intimate partner and robbery homicides did not cluster. By tracking how homicide types diffuse through communities and determining which places have ongoing or emerging homicide problems by type, we can better inform the deployment of prevention and intervention efforts.

  19. A fuzzy model for exploiting customer requirements

    Directory of Open Access Journals (Sweden)

    Zahra Javadirad

    2016-09-01

    Full Text Available Nowadays, quality function deployment (QFD) is one of the total quality management tools, in which customers’ views and requirements are captured and, using various techniques, translated into improved production requirements and operations. The QFD department, after identification and analysis of the competitors, collects customer feedback to meet customers’ demands for the products relative to the competitors. In this study, a comprehensive model for assessing the importance of customer requirements in the products or services of an organization is proposed. The proposed study uses linguistic variables, as a more comprehensive approach, to increase the precision of the evaluations. The importance of these requirements specifies the strengths and weaknesses of the organization in meeting the requirements relative to competitors. The results of these experiments show that the proposed method performs better than the other methods.
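
The linguistic-variable idea can be illustrated with triangular fuzzy numbers (a common convention in fuzzy QFD; the term set and membership values below are illustrative, not taken from the paper):

```python
# Linguistic terms mapped to triangular fuzzy numbers (l, m, u) on [0, 1].
TERMS = {
    "very low":  (0.0, 0.0, 0.25),
    "low":       (0.0, 0.25, 0.5),
    "medium":    (0.25, 0.5, 0.75),
    "high":      (0.5, 0.75, 1.0),
    "very high": (0.75, 1.0, 1.0),
}

def fuzzy_importance(ratings):
    """Aggregate linguistic ratings of one customer requirement.

    ratings: list of linguistic terms given by different customers.
    Returns a crisp importance score: the averaged triangular fuzzy
    number is defuzzified by its centroid (l + m + u) / 3.
    """
    l = sum(TERMS[r][0] for r in ratings) / len(ratings)
    m = sum(TERMS[r][1] for r in ratings) / len(ratings)
    u = sum(TERMS[r][2] for r in ratings) / len(ratings)
    return (l + m + u) / 3.0
```

Requirements can then be ranked by this score to expose the organization's strengths and weaknesses relative to competitors.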

  20. Economic effort management in multispecies fisheries: the FcubEcon model

    DEFF Research Database (Denmark)

    Hoff, Ayoe; Frost, Hans; Ulrich, Clara

    2010-01-01

    in the development of management tools based on fleets, fisheries, and areas, rather than on unit fish stocks. A natural consequence of this has been to consider effort rather than quota management, a final effort decision being based on fleet-harvest potential and fish-stock-preservation considerations. Effort...... allocation between fleets should not be based on biological considerations alone, but also on the economic behaviour of fishers, because fisheries management has a significant impact on human behaviour as well as on ecosystem development. The FcubEcon management framework for effort allocation between fleets......

  1. Work Stress and Altered Biomarkers: A Synthesis of Findings Based on the Effort-Reward Imbalance Model.

    Science.gov (United States)

    Siegrist, Johannes; Li, Jian

    2017-11-10

    While epidemiological studies provide statistical evidence on associations of exposures such as stressful work with elevated risks of stress-related disorders (e.g., coronary heart disease or depression), additional information on biological pathways and biomarkers underlying these associations is required. In this contribution, we summarize the current state of the art on research findings linking stressful work, in terms of an established theoretical model-effort-reward imbalance-with a broad range of biomarkers. Based on structured electronic literature search and recent available systematic reviews, our synthesis of findings indicates that associations of work stress with heart rate variability, altered blood lipids, and risk of metabolic syndrome are rather consistent and robust. Significant relationships with blood pressure, heart rate, altered immune function and inflammation, cortisol release, and haemostatic biomarkers were also observed, but due to conflicting findings additional data will be needed to reach a firm conclusion. This narrative review of empirical evidence supports the argument that the biomarkers under study can act as mediators of epidemiologically established associations of work stress, as measured by effort-reward imbalance, with incident stress-related disorders.

  2. Two models at work : A study of interactions and specificity in relation to the Demand-Control Model and the Effort-Reward Imbalance Model

    NARCIS (Netherlands)

    Vegchel, N.

    2005-01-01

    To investigate the relation between work and employee health, several work stress models, e.g., the Demand-Control (DC) Model and the Effort-Reward Imbalance (ERI) Model, have been developed. Although these models focus on job demands and job resources, relatively little attention has been devoted

  3. Cutting Edge PBPK Models and Analyses: Providing the Basis for Future Modeling Efforts and Bridges to Emerging Toxicology Paradigms

    Directory of Open Access Journals (Sweden)

    Jane C. Caldwell

    2012-01-01

    Full Text Available Physiologically based pharmacokinetic (PBPK) models are used for predictions of internal or target dose from environmental and pharmacologic chemical exposures. Their use in human risk assessment depends on the nature of the databases (animal or human) used to develop and test them, and includes extrapolations across species and experimental paradigms, and determination of variability of response within human populations. Integration of state-of-the-science PBPK modeling with emerging computational toxicology models is critical for extrapolation between in vitro exposures, in vivo physiologic exposure, whole-organism responses, and long-term health outcomes. This special issue contains papers that can provide the basis for future modeling efforts and provide bridges to emerging toxicology paradigms. In this overview paper, we present an overview of the field and an introduction to these papers, including discussions of model development, best practices, risk-assessment applications of PBPK models, and limitations and bridges of modeling approaches for future applications. Specifically, the issues addressed include: (a) increased understanding of human variability of pharmacokinetics and pharmacodynamics in the population, (b) exploration of mode-of-action (MOA) hypotheses, (c) application of biological modeling in the risk assessment of individual chemicals and chemical mixtures, and (d) identification and discussion of uncertainties in the modeling process.
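
As a toy illustration of the kinetics such models build on, a one-compartment IV-bolus model (far simpler than a true multi-compartment PBPK model; the function name and all parameter values here are illustrative) can be written as:

```python
import numpy as np

def one_compartment_conc(times, dose, v_d, k_e):
    """Plasma concentration after an IV bolus in a one-compartment model:
    C(t) = (dose / v_d) * exp(-k_e * t),
    where v_d is the volume of distribution and k_e the elimination rate.

    A deliberately minimal stand-in for the multi-compartment, flow-limited
    PBPK models discussed above.
    """
    times = np.asarray(times, dtype=float)
    return (dose / v_d) * np.exp(-k_e * times)
```

A full PBPK model replaces this single compartment with coupled mass balances for perfused tissues (liver, fat, brain, etc.) linked by blood flows and partition coefficients.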

  4. The complex nature of mixed farming systems requires multidimensional actions supported by integrative research and development efforts

    DEFF Research Database (Denmark)

    González-García, E; Gourdine, J L; Alexandre, G

    2012-01-01

    the requirement for a change in research strategies and initiatives through the development of a complex but necessary multi-/inter-/trans-disciplinary teamwork spirit. We stress as essential the collaboration and active participation of local and regional actors, stakeholders and end-users in the identification...

  5. Impact of habitat diversity on the sampling effort required for the assessment of river fish communities and IBI

    NARCIS (Netherlands)

    Van Liefferinge, C.; Simoens, I.; Vogt, C.; Cox, T.J.S.; Breine, J.; Ercken, D.; Goethals, P.; Belpaire, C.; Meire, P.

    2010-01-01

    The spatial variation in the fish communities of four small Belgian rivers with variable habitat diversity was investigated by electric fishing to define the minimum sampling distance required for optimal fish stock assessment and determination of the Index of Biotic Integrity. This study shows that

  6. Upending the social ecological model to guide health promotion efforts toward policy and environmental change.

    Science.gov (United States)

    Golden, Shelley D; McLeroy, Kenneth R; Green, Lawrence W; Earp, Jo Anne L; Lieberman, Lisa D

    2015-04-01

    Efforts to change policies and the environments in which people live, work, and play have gained increasing attention over the past several decades. Yet health promotion frameworks that illustrate the complex processes that produce health-enhancing structural changes are limited. Building on the experiences of health educators, community activists, and community-based researchers described in this supplement and elsewhere, as well as several political, social, and behavioral science theories, we propose a new framework to organize our thinking about producing policy, environmental, and other structural changes. We build on the social ecological model, a framework widely employed in public health research and practice, by turning it inside out, placing health-related and other social policies and environments at the center, and conceptualizing the ways in which individuals, their social networks, and organized groups produce a community context that fosters healthy policy and environmental development. We conclude by describing how health promotion practitioners and researchers can foster structural change by (1) conveying the health and social relevance of policy and environmental change initiatives, (2) building partnerships to support them, and (3) promoting more equitable distributions of the resources necessary for people to meet their daily needs, control their lives, and freely participate in the public sphere. © 2015 Society for Public Health Education.

  7. Modeling requirements for in situ vitrification

    International Nuclear Information System (INIS)

    MacKinnon, R.J.; Mecham, D.C.; Hagrman, D.L.; Johnson, R.W.; Murray, P.E.; Slater, C.E.; Marwil, E.S.; Weaver, R.A.; Argyle, M.D.

    1991-11-01

    This document outlines the requirements for the model being developed at the INEL which will provide analytical support for the ISV technology assessment program. The model includes representations of the electric potential field, thermal transport with melting, gas and particulate release, vapor migration, off-gas combustion and process chemistry. The modeling objectives are to (1) help determine the safety of the process by assessing the air and surrounding soil radionuclide and chemical pollution hazards, the nuclear criticality hazard, and the explosion and fire hazards, (2) help determine the suitability of the ISV process for stabilizing the buried wastes involved, and (3) help design laboratory and field tests and interpret results therefrom

  8. Early efforts in modeling the incubation period of infectious diseases with an acute course of illness

    Directory of Open Access Journals (Sweden)

    Nishiura Hiroshi

    2007-05-01

    Full Text Available The incubation period of infectious diseases, the time from infection with a microorganism to onset of disease, is directly relevant to prevention and control. Since explicit models of the incubation period enhance our understanding of the spread of disease, previous classic studies were revisited, focusing on the modeling methods employed and paying particular attention to relatively unknown historical efforts. The earliest study on the incubation period of pandemic influenza was published in 1919, providing estimates of the incubation period of Spanish flu using the daily incidence on ships departing from several ports in Australia. Although the study explicitly dealt with an unknown time of exposure, the assumed periods of exposure, which had an equal probability of infection, were too long, and thus, likely resulted in slight underestimates of the incubation period. After the suggestion that the incubation period follows lognormal distribution, Japanese epidemiologists extended this assumption to estimates of the time of exposure during a point source outbreak. Although the reason why the incubation period of acute infectious diseases tends to reveal a right-skewed distribution has been explored several times, the validity of the lognormal assumption is yet to be fully clarified. At present, various different distributions are assumed, and the lack of validity in assuming lognormal distribution is particularly apparent in the case of slowly progressing diseases. The present paper indicates that (1) analysis using well-defined short periods of exposure with appropriate statistical methods is critical when the exact time of exposure is unknown, and (2) when assuming a specific distribution for the incubation period, comparisons using different distributions are needed in addition to estimations using different datasets, analyses of the determinants of incubation period, and an understanding of the underlying disease mechanisms.

  9. Early efforts in modeling the incubation period of infectious diseases with an acute course of illness.

    Science.gov (United States)

    Nishiura, Hiroshi

    2007-05-11

    The incubation period of infectious diseases, the time from infection with a microorganism to onset of disease, is directly relevant to prevention and control. Since explicit models of the incubation period enhance our understanding of the spread of disease, previous classic studies were revisited, focusing on the modeling methods employed and paying particular attention to relatively unknown historical efforts. The earliest study on the incubation period of pandemic influenza was published in 1919, providing estimates of the incubation period of Spanish flu using the daily incidence on ships departing from several ports in Australia. Although the study explicitly dealt with an unknown time of exposure, the assumed periods of exposure, which had an equal probability of infection, were too long, and thus, likely resulted in slight underestimates of the incubation period. After the suggestion that the incubation period follows lognormal distribution, Japanese epidemiologists extended this assumption to estimates of the time of exposure during a point source outbreak. Although the reason why the incubation period of acute infectious diseases tends to reveal a right-skewed distribution has been explored several times, the validity of the lognormal assumption is yet to be fully clarified. At present, various different distributions are assumed, and the lack of validity in assuming lognormal distribution is particularly apparent in the case of slowly progressing diseases. The present paper indicates that (1) analysis using well-defined short periods of exposure with appropriate statistical methods is critical when the exact time of exposure is unknown, and (2) when assuming a specific distribution for the incubation period, comparisons using different distributions are needed in addition to estimations using different datasets, analyses of the determinants of incubation period, and an understanding of the underlying disease mechanisms.
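
The classical lognormal fit described above can be sketched as follows (a minimal illustration assuming a point-source outbreak with a short, known exposure window; the function is ours, not the paper's):

```python
import numpy as np

def fit_lognormal_incubation(days):
    """Fit a lognormal distribution to observed incubation periods.

    days: incubation times (days from exposure to symptom onset).
    Returns (median, dispersion): the geometric mean and geometric
    standard deviation, the parameters classically reported for
    incubation-period distributions.
    """
    logs = np.log(np.asarray(days, dtype=float))
    mu, sigma = logs.mean(), logs.std(ddof=1)
    return float(np.exp(mu)), float(np.exp(sigma))
```

When the exposure time is itself uncertain, the log-transform step must be replaced by methods that integrate over the exposure window, which is exactly the complication the early studies wrestled with.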

  10. Overview of past, ongoing and future efforts of the integrated modeling of global change for Northern Eurasia

    Science.gov (United States)

    Monier, Erwan; Kicklighter, David; Sokolov, Andrei; Zhuang, Qianlai; Melillo, Jerry; Reilly, John

    2016-04-01

    Northern Eurasia is both a major player in the global carbon budget (it includes roughly 70% of the Earth's boreal forest and more than two-thirds of the Earth's permafrost) and a region that has experienced dramatic climate change (increase in temperature, growing season length, floods and droughts) over the past century. Northern Eurasia has also undergone significant land-use change, both driven by human activity (including deforestation, expansion of agricultural lands and urbanization) and natural disturbances (such as wildfires and insect outbreaks). These large environmental and socioeconomic impacts have major implications for the carbon cycle in the region. Northern Eurasia is made up of a diverse set of ecosystems that range from tundra to forests, with significant areas of croplands and pastures as well as deserts, with major urban areas. As such, it represents a complex system with substantial challenges for the modeling community. In this presentation, we provide an overview of past, ongoing and possible future efforts of the integrated modeling of global change for Northern Eurasia. We review the variety of existing modeling approaches to investigate specific components of Earth system dynamics in the region. While there are a limited number of studies that try to integrate various aspects of the Earth system (through scale, teleconnections or processes), we point out that there are few systematic analyses of the various feedbacks within the Earth system (between components, regions or scale). As a result, there is a lack of knowledge of the relative importance of such feedbacks, and it is unclear how policy relevant current studies are that fail to account for these feedbacks. We review the role of Earth system models, and their advantages/limitations compared to detailed single component models. 
We further introduce the human activity system (global trade, economic models, demographic model and so on), the need for coupled human/earth system models

  11. Overview 2004 of NASA Stirling-Convertor CFD-Model Development and Regenerator R&D Efforts

    Science.gov (United States)

    Tew, Roy C.; Dyson, Rodger W.; Wilson, Scott D.; Demko, Rikako

    2005-01-01

    This paper reports on 2004 accomplishments in three areas: development of a Stirling-convertor CFD model at NASA GRC and via a NASA grant; a Stirling regenerator-research effort conducted via a NASA grant (a follow-on to an earlier DOE contract); and a regenerator-microfabrication contract for development of a "next-generation Stirling regenerator." Cleveland State University is the lead organization for all three grant/contractual efforts, with the University of Minnesota and Gedeor Associates as subcontractors. The Stirling Technology Co. and Sunpower, Inc. are involved in all three efforts, either as funded or unfunded participants. International Mezzo Technologies of Baton Rouge, LA, is the regenerator fabricator for the regenerator-microfabrication contract. Results of the efforts in these three areas are summarized.

  12. Predicting Motivation: Computational Models of PFC Can Explain Neural Coding of Motivation and Effort-based Decision-making in Health and Disease.

    Science.gov (United States)

    Vassena, Eliana; Deraeve, James; Alexander, William H

    2017-10-01

    Human behavior is strongly driven by the pursuit of rewards. In daily life, however, benefits mostly come at a cost, often requiring that effort be exerted to obtain potential benefits. Medial PFC (MPFC) and dorsolateral PFC (DLPFC) are frequently implicated in the expectation of effortful control, showing increased activity as a function of predicted task difficulty. Such activity partially overlaps with expectation of reward and has been observed both during decision-making and during task preparation. Recently, novel computational frameworks have been developed to explain activity in these regions during cognitive control, based on the principle of prediction and prediction error (predicted response-outcome [PRO] model [Alexander, W. H., & Brown, J. W. Medial prefrontal cortex as an action-outcome predictor. Nature Neuroscience, 14, 1338-1344, 2011], hierarchical error representation [HER] model [Alexander, W. H., & Brown, J. W. Hierarchical error representation: A computational model of anterior cingulate and dorsolateral prefrontal cortex. Neural Computation, 27, 2354-2410, 2015]). Despite the broad explanatory power of these models, it is not clear whether they can also accommodate effects related to the expectation of effort observed in MPFC and DLPFC. Here, we propose a translation of these computational frameworks to the domain of effort-based behavior. First, we discuss how the PRO model, based on prediction error, can explain effort-related activity in MPFC, by reframing effort-based behavior in a predictive context. We propose that MPFC activity reflects monitoring of motivationally relevant variables (such as effort and reward), by coding expectations and discrepancies from such expectations. Moreover, we derive behavioral and neural model-based predictions for healthy controls and clinical populations with impairments of motivation. Second, we illustrate the possible translation to effort-based behavior of the HER model, an extended version of PRO
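
The prediction-error principle underlying the PRO model can be reduced to a schematic delta-rule update (a deliberate simplification for illustration, not the published implementation):

```python
def update_prediction(v, outcome, alpha=0.1):
    """One prediction-error update in the spirit of the PRO framework:
    the predictor moves toward the observed outcome by a fraction
    alpha of the discrepancy. Returns the new prediction and the error.
    """
    error = outcome - v
    return v + alpha * error, error

def learn_effort_cost(outcomes, alpha=0.1):
    """Track a running prediction of a motivational variable (e.g.
    effort cost or reward) across trials; large errors early in
    learning mimic the elevated MPFC signals for unexpected outcomes.
    """
    v, errors = 0.0, []
    for o in outcomes:
        v, e = update_prediction(v, o, alpha)
        errors.append(e)
    return v, errors
```

In the full PRO model the predicted quantities are vectors of response-outcome conjunctions rather than a single scalar, but the monitoring logic (expectation, discrepancy, update) is the same.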

  13. The complex nature of mixed farming systems requires multidimensional actions supported by integrative research and development efforts.

    Science.gov (United States)

    González-García, E; Gourdine, J L; Alexandre, G; Archimède, H; Vaarst, M

    2012-05-01

    Mixed farming systems (MFS) have demonstrated some success by focusing on the use of integrative and holistic mechanisms, and rationally building on and using the natural and local resource base without exhausting it, while enhancing biodiversity, optimizing complementarities between crops and animal systems and finally increasing opportunities in rural livelihoods. Focusing our analysis and discussion on field experiences and empirical knowledge in the Caribbean islands, this paper discusses the opportunities for a change needed in current MFS research-development philosophy. The importance of shifting from fragile/specialized production systems to MFS under current global conditions is argued with an emphasis on the case of Small Islands Developing States (SIDS) and the Caribbean. Particular vulnerable characteristics as well as the potential and constraints of SIDS and their agricultural sectors are described, while revealing the opportunities for the 'richness' of the natural and local resources to support authentic and less dependent production system strategies. Examples are provided of the use of natural grasses, legumes, crop residues and agro-industrial by-products. We analyse the requirement for a change in research strategies and initiatives through the development of a complex but necessary multi-/inter-/trans-disciplinary teamwork spirit. We stress as essential the collaboration and active participation of local and regional actors, stakeholders and end-users in the identification of research priorities, as well as the generation, exchange and dissemination of knowledge and technology innovations, while strengthening the leadership roles in the conduct of integrative and participative research and development projects.

  14. Modelling human resource requirements for the nuclear industry in Europe

    Energy Technology Data Exchange (ETDEWEB)

    Roelofs, Ferry [Nuclear Research and Consultancy Group (NRG) (Netherlands); Flore, Massimo; Estorff, Ulrik von [Joint Research Center (JRC) (Netherlands)

    2017-11-15

    The European Human Resource Observatory for Nuclear (EHRO-N) provides the European Commission with essential data related to supply and demand for nuclear experts in the EU-28 and the enlargement and integration countries based on bottom-up information from the nuclear industry. The objective is to assess how the supply of experts for the nuclear industry responds to the needs for the same experts for present and future nuclear projects in the region. Complementary to the bottom-up approach taken by the EHRO-N team at JRC, a top-down modelling approach has been taken in a collaboration with NRG in the Netherlands. This top-down modelling approach focuses on the human resource requirements for operation, construction, decommissioning, and efforts for long term operation of nuclear power plants. This paper describes the top-down methodology, the model input, the main assumptions, and the results of the analyses.

  15. Modelling human resource requirements for the nuclear industry in Europe

    International Nuclear Information System (INIS)

    Roelofs, Ferry; Flore, Massimo; Estorff, Ulrik von

    2017-01-01

    The European Human Resource Observatory for Nuclear (EHRO-N) provides the European Commission with essential data related to supply and demand for nuclear experts in the EU-28 and the enlargement and integration countries based on bottom-up information from the nuclear industry. The objective is to assess how the supply of experts for the nuclear industry responds to the needs for the same experts for present and future nuclear projects in the region. Complementary to the bottom-up approach taken by the EHRO-N team at JRC, a top-down modelling approach has been taken in a collaboration with NRG in the Netherlands. This top-down modelling approach focuses on the human resource requirements for operation, construction, decommissioning, and efforts for long term operation of nuclear power plants. This paper describes the top-down methodology, the model input, the main assumptions, and the results of the analyses.
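
A top-down staffing model of this kind can be sketched as a sum of per-phase requirements over a fleet (a minimal illustration; the phase names and FTE figures below are invented, not EHRO-N/NRG data):

```python
# Illustrative staffing needs per reactor, in full-time equivalents.
FTE_PER_PHASE = {
    "construction": 2000,
    "operation": 600,
    "lto_upgrade": 150,        # long-term-operation refurbishment
    "decommissioning": 300,
}

def workforce_demand(fleet, year):
    """Total nuclear-workforce demand for a given year.

    fleet: list of plants, each a dict whose 'phase_by_year' maps a
    year to that plant's lifecycle phase in that year (or omits years
    in which the plant does not exist).
    """
    total = 0
    for plant in fleet:
        phase = plant["phase_by_year"].get(year)
        if phase is not None:
            total += FTE_PER_PHASE[phase]
    return total
```

Comparing such demand curves against projected graduate supply is what allows a top-down model to flag future expert shortages.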

  16. Evaluation of an ARPS-based canopy flow modeling system for use in future operational smoke prediction efforts

    Science.gov (United States)

    M. T. Kiefer; S. Zhong; W. E. Heilman; J. J. Charney; X. Bian

    2013-01-01

    Efforts to develop a canopy flow modeling system based on the Advanced Regional Prediction System (ARPS) model are discussed. The standard version of ARPS is modified to account for the effect of drag forces on mean and turbulent flow through a vegetation canopy, via production and sink terms in the momentum and subgrid-scale turbulent kinetic energy (TKE) equations....
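
A commonly used form of such canopy drag terms (a Shaw-and-Schumann-type parameterization; the actual ARPS implementation may differ in detail) is:

```latex
% Momentum sink from canopy drag, added to each velocity component u_i
F_{d,i} = -\,C_d\, a(z)\, |U|\, u_i
% Corresponding source/sink term in the subgrid TKE equation:
% wake production minus accelerated dissipation of TKE within the canopy
S_e = C_d\, a(z)\left(\beta_p\, |U|^3 - \beta_d\, |U|\, e\right)
```

Here \(a(z)\) is the leaf area density, \(C_d\) a drag coefficient, \(|U|\) the wind speed, \(e\) the subgrid TKE, and \(\beta_p,\ \beta_d\) wake-production and enhanced-dissipation coefficients.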

  17. A neuronal model of a global workspace in effortful cognitive tasks.

    Science.gov (United States)

    Dehaene, S; Kerszberg, M; Changeux, J P

    1998-11-24

    A minimal hypothesis is proposed concerning the brain processes underlying effortful tasks. It distinguishes two main computational spaces: a unique global workspace composed of distributed and heavily interconnected neurons with long-range axons, and a set of specialized and modular perceptual, motor, memory, evaluative, and attentional processors. Workspace neurons are mobilized in effortful tasks for which the specialized processors do not suffice. They selectively mobilize or suppress, through descending connections, the contribution of specific processor neurons. In the course of task performance, workspace neurons become spontaneously coactivated, forming discrete though variable spatio-temporal patterns subject to modulation by vigilance signals and to selection by reward signals. A computer simulation of the Stroop task shows workspace activation to increase during acquisition of a novel task, effortful execution, and after errors. We outline predictions for spatio-temporal activation patterns during brain imaging, particularly about the contribution of dorsolateral prefrontal cortex and anterior cingulate to the workspace.

  18. DISPLACE: a dynamic, individual-based model for spatial fishing planning and effort displacement: Integrating underlying fish population models

    DEFF Research Database (Denmark)

    Bastardie, Francois; Nielsen, J. Rasmus; Miethe, Tanja

    or to the alteration of individual fishing patterns. We demonstrate that integrating the spatial activity of vessels and local fish stock abundance dynamics allow for interactions and more realistic predictions of fishermen behaviour, revenues and stock abundance......We previously developed a spatially explicit, individual-based model (IBM) evaluating the bio-economic efficiency of fishing vessel movements between regions according to the catching and targeting of different species based on the most recent high resolution spatial fishery data. The main purpose...... was to test the effects of alternative fishing effort allocation scenarios related to fuel consumption, energy efficiency (value per litre of fuel), sustainable fish stock harvesting, and profitability of the fisheries. The assumption here was constant underlying resource availability. Now, an advanced...

  19. One State's Systems Change Efforts to Reduce Child Care Expulsion: Taking the Pyramid Model to Scale

    Science.gov (United States)

    Vinh, Megan; Strain, Phil; Davidon, Sarah; Smith, Barbara J.

    2016-01-01

    This article describes the efforts funded by the state of Colorado to address unacceptably high rates of expulsion from child care. Based on the results of a 2006 survey, the state of Colorado launched two complementary policy initiatives in 2009 to impact expulsion rates and to improve the use of evidence-based practices related to challenging…

  20. Bodily Effort Enhances Learning and Metacognition: Investigating the Relation Between Physical Effort and Cognition Using Dual-Process Models of Embodiment.

    Science.gov (United States)

    Skulmowski, Alexander; Rey, Günter Daniel

    2017-01-01

    Recent embodiment research revealed that cognitive processes can be influenced by bodily cues. Some of these cues were found to elicit disparate effects on cognition. For instance, weight sensations can inhibit problem-solving performance, but were shown to increase judgments regarding recall probability (judgments of learning; JOLs) in memory tasks. We investigated the effects of physical effort on learning and metacognition by conducting two studies in which we varied whether a backpack was worn or not while 20 nouns were to be learned. Participants entered a JOL for each word and completed a recall test. Experiment 1 ( N = 18) revealed that exerting physical effort by wearing a backpack led to higher JOLs for easy nouns, without a notable effect on difficult nouns. Participants who wore a backpack reached higher recall scores. Therefore, physical effort may act as a form of desirable difficulty during learning. In Experiment 2 ( N = 30), the influence of physical effort on JOL s and learning disappeared when more difficult nouns were to be learned, implying that a high cognitive load may diminish bodily effects. These findings suggest that physical effort mainly influences superficial modes of thought and raise doubts concerning the explanatory power of metaphor-centered accounts of embodiment for higher-level cognition.

  1. Motorized Activity on Legacy Seismic Lines: A Predictive Modeling Approach to Prioritize Restoration Efforts.

    Science.gov (United States)

    Hornseth, M L; Pigeon, K E; MacNearney, D; Larsen, T A; Stenhouse, G; Cranston, J; Finnegan, L

    2018-05-11

    Natural regeneration of seismic lines, cleared for hydrocarbon exploration, is slow and often hindered by vegetation damage, soil compaction, and motorized human activity. There is an extensive network of seismic lines in western Canada which is known to impact forest ecosystems, and seismic lines have been linked to declines in woodland caribou (Rangifer tarandus caribou). Seismic line restoration is costly, but necessary for caribou conservation to reduce cumulative disturbance. Understanding where motorized activity may be impeding regeneration of seismic lines will aid in prioritizing restoration. Our study area in west-central Alberta encompassed five caribou ranges where restoration is required under federal species-at-risk recovery strategies; hence, prioritizing seismic lines for restoration is of immediate conservation value. To understand patterns of motorized activity on seismic lines, we evaluated five a priori hypotheses using a predictive modeling framework and Geographic Information System variables across three landscapes in the foothills and northern boreal regions of Alberta. In the northern boreal landscape, motorized activity was most common in dry areas with a large industrial footprint. In highly disturbed areas of the foothills, motorized activity on seismic lines increased with low vegetation heights, relatively dry soils, and distance from forest cutblocks, while in less disturbed areas of the foothills, motorized activity on seismic lines decreased with seismic line density, slope steepness, and white-tailed deer abundance, and increased with distance from roads. We generated predictive maps of high motorized activity, identifying 21,777 km of seismic lines where active restoration could expedite forest regeneration.
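
    A predictive model of this kind ultimately assigns each seismic line a probability of motorized activity from GIS covariates. The sketch below is purely illustrative, with invented logistic coefficients (not estimated from the study's data), showing how such a score would respond to vegetation height, slope, and distance to roads:

```python
# Hedged sketch: a logistic score for motorized activity on a seismic line.
# Coefficients and covariates are hypothetical, chosen only to mirror the
# qualitative directions reported above (activity decreases with taller
# regrowth and steeper slopes, and with distance from roads).
import math

def p_motorized(veg_height_m, slope_deg, dist_to_road_km, coef):
    """Probability of motorized activity from a linear predictor + logistic link."""
    z = (coef["intercept"]
         + coef["veg_height"] * veg_height_m
         + coef["slope"] * slope_deg
         + coef["dist_road"] * dist_to_road_km)
    return 1 / (1 + math.exp(-z))

# Invented coefficients (negative signs encode the reported directions).
coef = {"intercept": 1.0, "veg_height": -1.2, "slope": -0.15, "dist_road": -0.4}

low_regrowth = p_motorized(0.5, 2, 1.0, coef)   # short vegetation, near a road
tall_regrowth = p_motorized(3.0, 2, 1.0, coef)  # tall regrowth, same location
print(round(low_regrowth, 3), round(tall_regrowth, 3))
```

In a restoration-prioritization workflow, lines with high predicted probability would be candidates for active treatment rather than being left to regenerate naturally.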

  2. Hybrid supply chain model for material requirement planning under financial constraints: A case study

    Science.gov (United States)

    Curci, Vita; Dassisti, Michele; Josefa, Mula Bru; Manuel, Díaz Madroñero

    2014-10-01

    Supply chain models (SCM) can potentially integrate different aspects of decision support for enterprise management tasks. The aim of this paper is to propose a hybrid mathematical programming model for the optimization of production requirements resources planning. The preliminary model was conceived bottom-up from a real industrial case and is oriented towards maximizing cash flow. Despite the intense computational effort required to converge to a solution, the optimization produced good results for the objective function.
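
    As a toy illustration of the kind of decision such a model supports (all figures invented), the sketch below allocates production under a working-capital limit using a simple greedy rule on cash flow per unit of cash tied up; the paper's actual model is a full mathematical program, not a greedy heuristic:

```python
# Hypothetical data: product -> (cash flow per unit, cash required per unit,
# maximum demand). The greedy rule fills the best cash-flow-to-cash-required
# ratio first, subject to the cash (working capital) constraint.
products = {
    "P1": (40.0, 100.0, 50),
    "P2": (25.0, 50.0, 80),
    "P3": (10.0, 40.0, 100),
}
cash_available = 6000.0

plan = {}
remaining = cash_available
for name, (cf, cash, demand) in sorted(
        products.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True):
    qty = min(demand, int(remaining // cash))  # units affordable with remaining cash
    plan[name] = qty
    remaining -= qty * cash

total_cash_flow = sum(products[n][0] * q for n, q in plan.items())
print(plan, total_cash_flow)
```

A greedy rule is not guaranteed optimal in general; the hybrid programming model in the paper exists precisely to handle such constraints rigorously.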

  3. Health Promotion Efforts as Predictors of Physical Activity in Schools: An Application of the Diffusion of Innovations Model

    Science.gov (United States)

    Glowacki, Elizabeth M.; Centeio, Erin E.; Van Dongen, Daniel J.; Carson, Russell L.; Castelli, Darla M.

    2016-01-01

    Background: Implementing a comprehensive school physical activity program (CSPAP) effectively addresses public health issues by providing opportunities for physical activity (PA). Grounded in the Diffusion of Innovations model, the purpose of this study was to identify how health promotion efforts facilitate opportunities for PA. Methods: Physical…

  4. Specification of advanced safety modeling requirements (Rev. 0)

    International Nuclear Information System (INIS)

    Fanning, T. H.; Tautges, T. J.

    2008-01-01

    The U.S. Department of Energy's Global Nuclear Energy Partnership has led to renewed interest in liquid-metal-cooled fast reactors for the purpose of closing the nuclear fuel cycle and making more efficient use of future repository capacity. However, the U.S. has not designed or constructed a fast reactor in nearly 30 years. Accurate, high-fidelity, whole-plant dynamics safety simulations will play a crucial role by providing confidence that component and system designs will satisfy established design limits and safety margins under a wide variety of operational, design basis, and beyond-design-basis transient conditions. Current modeling capabilities for fast reactor safety analyses have resulted from several hundred person-years of code development effort supported by experimental validation. The broad spectrum of mechanistic and phenomenological models that have been developed represents an enormous amount of institutional knowledge that needs to be maintained. Complicating this, the existing code architectures for safety modeling evolved from programming practices of the 1970s. This has led to monolithic applications with interdependent data models, which require significant knowledge of the complexities of the entire code in order for each component to be maintained. In order to develop an advanced fast reactor safety modeling capability, the limitations of the existing code architecture must be overcome while preserving the capabilities that already exist. To accomplish this, a set of advanced safety modeling requirements is defined, based on modern programming practices, that focuses on modular development within a flexible coupling framework. An approach for integrating the existing capabilities of the SAS4A/SASSYS-1 fast reactor safety analysis code into the SHARP framework is provided in order to preserve existing capabilities while providing a smooth transition to advanced modeling capabilities. In doing this, the advanced fast reactor safety models will ...

  5. Effortful echolalia.

    Science.gov (United States)

    Hadano, K; Nakamura, H; Hamanaka, T

    1998-02-01

    We report three cases of effortful echolalia in patients with cerebral infarction. The clinical picture of the speech disturbance corresponds to Type 1 Transcortical Motor Aphasia (TCMA; Goldstein, 1915). The patients always spoke nonfluently, with loss of speech initiative, dysarthria, dysprosody, agrammatism, and increased effort, and were unable to repeat sentences longer than four or six words. In conversation, they first repeated a few words spoken to them, and then produced self-initiated speech. The initial repetition, as well as the subsequent self-initiated speech, which were realized equally laboriously, can be regarded as mitigated echolalia (Pick, 1924). The patients were always aware of their own echolalia and tried to control it, without success. These cases demonstrate that neither the ability to repeat nor fluent speech is always necessary for echolalia. The possibility that a lesion in the left medial frontal lobe, including the supplementary motor area, plays an important role in effortful echolalia is discussed.

  6. Economic effort management in multispecies fisheries: the FcubEcon model

    DEFF Research Database (Denmark)

    Hoff, Ayoe; Frost, Hans; Ulrich, Clara

    2010-01-01

    Applying single-species assessment and quotas in multispecies fisheries can lead to overfishing or quota underutilization, because advice can be conflicting when different stocks are caught within the same fishery. During the past decade, increased focus on this issue has resulted in the development ... optimal manner, in both effort-management and single-quota management settings.

  7. The Effort-reward Imbalance work-stress model and daytime salivary cortisol and dehydroepiandrosterone (DHEA) among Japanese women

    Science.gov (United States)

    Ota, Atsuhiko; Mase, Junji; Howteerakul, Nopporn; Rajatanun, Thitipat; Suwannapong, Nawarat; Yatsuya, Hiroshi; Ono, Yuichiro

    2014-01-01

    We examined the influence of work-related effort–reward imbalance and overcommitment to work (OC), as derived from Siegrist's Effort–Reward Imbalance (ERI) model, on the hypothalamic–pituitary–adrenocortical (HPA) axis. We hypothesized that, among healthy workers, both cortisol and dehydroepiandrosterone (DHEA) secretion would be increased by effort–reward imbalance and OC and, as a result, cortisol-to-DHEA ratio (C/D ratio) would not differ by effort–reward imbalance or OC. The subjects were 115 healthy female nursery school teachers. Salivary cortisol, DHEA, and C/D ratio were used as indexes of HPA activity. Mixed-model analyses of variance revealed that neither the interaction between the ERI model indicators (i.e., effort, reward, effort-to-reward ratio, and OC) and the series of measurement times (9:00, 12:00, and 15:00) nor the main effect of the ERI model indicators was significant for daytime salivary cortisol, DHEA, or C/D ratio. Multiple linear regression analyses indicated that none of the ERI model indicators was significantly associated with area under the curve of daytime salivary cortisol, DHEA, or C/D ratio. We found that effort, reward, effort–reward imbalance, and OC had little influence on daytime variation patterns, levels, or amounts of salivary HPA-axis-related hormones. Thus, our hypotheses were not supported. PMID:25228138
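
    The two HPA-axis indexes used above can be illustrated with a short computation. This is a hedged sketch on hypothetical salivary values (the study's actual data are not reproduced here): the area under the curve of daytime cortisol across the 9:00, 12:00, and 15:00 samples via the trapezoidal rule, and the cortisol-to-DHEA (C/D) ratio at each time point.

```python
# Hypothetical single-subject values; units and magnitudes are illustrative only.

def auc_trapezoid(times_h, values):
    """Area under the curve with respect to ground (trapezoidal rule)."""
    return sum((values[i] + values[i + 1]) / 2 * (times_h[i + 1] - times_h[i])
               for i in range(len(times_h) - 1))

times = [9.0, 12.0, 15.0]    # the three daytime measurement times (hours)
cortisol = [8.4, 5.1, 3.6]   # hypothetical salivary cortisol, nmol/L
dhea = [0.62, 0.48, 0.41]    # hypothetical salivary DHEA, nmol/L

auc_cortisol = auc_trapezoid(times, cortisol)          # summary of daytime output
cd_ratios = [c / d for c, d in zip(cortisol, dhea)]    # C/D ratio per time point
print(round(auc_cortisol, 2))
print([round(r, 2) for r in cd_ratios])
```

The study's mixed-model and regression analyses then test whether ERI indicators predict such indexes across subjects; none did.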

  8. Reviewing the effort-reward imbalance model: drawing up the balance of 45 empirical studies

    NARCIS (Netherlands)

    Vegchel, van N.; Jonge, de J.; Bosma, H.; Schaufeli, W.B.

    2005-01-01

    The present paper provides a review of 45 studies on the Effort–Reward Imbalance (ERI) Model published from 1986 to 2003 (inclusive). In 1986, the ERI Model was introduced by Siegrist et al. (Biological and Psychological Factors in Cardiovascular Disease, Springer, Berlin, 1986, pp. 104–126; Social

  9. LMDzT-INCA dust forecast model developments and associated validation efforts

    International Nuclear Information System (INIS)

    Schulz, M; Cozic, A; Szopa, S

    2009-01-01

    The nudged-atmosphere global climate model LMDzT-INCA is used to forecast global dust fields. Evaluation is undertaken retrospectively for the forecast results of the year 2006. For this purpose, AERONET/Photons sites in Northern Africa and on the Arabian Peninsula are chosen, where aerosol optical depth is dominated by dust. Despite its coarse resolution, the model captures 48% of the day-to-day dust variability near Dakar on the initial day of the forecast; on weekly and monthly scales, the model captures 62% and 68% of the variability, respectively. Correlation coefficients between daily AOD values observed and modelled at Dakar decrease from 0.69 for the initial forecast day to 0.59 two days ahead and 0.41 five days ahead. If the model is required to issue a warning whenever aerosol optical depth exceeds 0.5, and no warning otherwise, it was wrong in 29% of the cases for day 0, 32% for day 2, and 35% for day 5. A reanalysis run with archived ECMWF winds is only slightly better correlated (r=0.71) but was in error in 25% of the cases. Both the improved simulation of monthly versus daily variability and the deterioration of the forecast with lead time can be explained by the model's failure to simulate the exact timing of dust events.
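
    The verification described above reduces to two simple statistics. As a hedged illustration on synthetic AOD series (the values below are invented, not the Dakar observations), this sketch computes the daily correlation coefficient and the fraction of days on which a warning decision at the 0.5 AOD threshold disagrees between forecast and observation:

```python
# Synthetic observed and forecast daily AOD series (hypothetical values).

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def warning_error_rate(obs, fc, threshold=0.5):
    """Fraction of days where the forecast warning decision is wrong."""
    wrong = sum((o > threshold) != (f > threshold) for o, f in zip(obs, fc))
    return wrong / len(obs)

obs = [0.31, 0.62, 0.55, 0.18, 0.74, 0.46, 0.58, 0.22]
fc  = [0.28, 0.51, 0.44, 0.25, 0.69, 0.52, 0.61, 0.19]

print(round(pearson_r(obs, fc), 2))
print(warning_error_rate(obs, fc))
```

The "wrong in X% of cases" figures in the abstract are exactly this kind of binary-decision error rate, evaluated per forecast lead time.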

  10. Evaluation of Thin Plate Hydrodynamic Stability through a Combined Numerical Modeling and Experimental Effort

    Energy Technology Data Exchange (ETDEWEB)

    Tentner, A. [Argonne National Lab. (ANL), Argonne, IL (United States); Bojanowski, C. [Argonne National Lab. (ANL), Argonne, IL (United States); Feldman, E. [Argonne National Lab. (ANL), Argonne, IL (United States); Wilson, E. [Argonne National Lab. (ANL), Argonne, IL (United States); Solbrekken, G [Univ. of Missouri, Columbia, MO (United States); Jesse, C. [Univ. of Missouri, Columbia, MO (United States); Kennedy, J. [Univ. of Missouri, Columbia, MO (United States); Rivers, J. [Univ. of Missouri, Columbia, MO (United States); Schnieders, G. [Univ. of Missouri, Columbia, MO (United States)

    2017-05-01

    An experimental and computational effort was undertaken in order to evaluate the capability of fluid-structure interaction (FSI) simulation tools to describe the deflection, due to hydrodynamic forces, of a Missouri University Research Reactor (MURR) fuel element plate redesigned for conversion to low-enriched uranium (LEU) fuel. Experiments involving both flat plates and curved plates were conducted in a water flow test loop located at the University of Missouri (MU), at conditions and geometries that can be related to the MURR LEU fuel element. A wider channel gap on one side of the test plate, and a narrower one on the other, represent the differences that could be encountered in a MURR element due to allowed fabrication variability. The difference in the channel gaps leads to a pressure differential across the plate, and hence to plate deflection. The induced plate deflection was measured at specified locations using a laser measurement technique. High-fidelity 3-D simulations of the experiments were performed at MU using the computational fluid dynamics code STAR-CCM+ coupled with the structural mechanics code ABAQUS. Independent simulations of the experiments were performed at Argonne National Laboratory (ANL) using the STAR-CCM+ code and its built-in structural mechanics solver. The simulation results obtained at MU and ANL were compared with the corresponding measured plate deflections.

  11. Gamified Requirements Engineering: Model and Experimentation

    NARCIS (Netherlands)

    Lombriser, Philipp; Dalpiaz, Fabiano; Lucassen, Garm; Brinkkemper, Sjaak

    2016-01-01

    [Context & Motivation] Engaging stakeholders in requirements engineering (RE) influences the quality of the requirements and ultimately of the system to-be. Unfortunately, stakeholder engagement is often insufficient, leading to too few, low-quality requirements. [Question/problem] We aim to

  12. Modeling Impact and Cost-Effectiveness of Increased Efforts to Attract Voluntary Medical Male Circumcision Clients Ages 20-29 in Zimbabwe.

    Directory of Open Access Journals (Sweden)

    Katharine Kripke

    Zimbabwe aims to increase circumcision coverage to 80% among 13- to 29-year-olds. However, implementation data suggest that high coverage among men ages 20 and older may not be achievable without efforts specifically targeted at these men, incurring additional costs per circumcision. Scale-up scenarios were created based on trends in implementation data in Zimbabwe, and the cost-effectiveness of increasing efforts to recruit clients ages 20-29 was examined.

    Zimbabwe voluntary medical male circumcision (VMMC) program data were used to project trends in male circumcision coverage by age into the future. The projection informed a base scenario in which, by 2018, the country achieves 80% circumcision coverage among males ages 10-19 and lower levels of coverage among men above age 20. The Zimbabwe DMPPT 2.0 model was used to project costs and impacts, assuming a US$109 VMMC unit cost in the base scenario and a 3% discount rate. Two other scenarios assumed that the program could increase coverage among clients ages 20-29, with a corresponding increase in unit cost for these age groups.

    When circumcision coverage among men ages 20-29 is increased compared with a base scenario reflecting current implementation trends, fewer VMMCs are required to avert one infection. If more than 50% additional effort (reflected as multiplying the unit cost by >1.5) is required to double the increase in coverage among this age group compared with the base scenario, the cost per HIV infection averted is higher than in the base scenario.

    Although increased investment in recruiting VMMC clients ages 20-29 may lead to greater overall impact if recruitment efforts are successful, it may also lead to lower cost-effectiveness, depending on the cost of increasing recruitment. Programs should measure the relationship between increased effort and increased ability to attract this age group.
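
    The trade-off described above can be sketched in a few lines. All quantities below are hypothetical except the US$109 base unit cost and the 1.5x effort multiplier mentioned in the abstract; the point is only that cost per infection averted rises when the extra recruitment cost outgrows the gain in infections averted:

```python
# Hedged back-of-envelope sketch of the cost-effectiveness comparison.
# Circumcision counts and infections averted are invented placeholders.

def cost_per_infection_averted(n_vmmc, unit_cost, infections_averted):
    """Total program cost divided by HIV infections averted."""
    return n_vmmc * unit_cost / infections_averted

# Base scenario: US$109 per VMMC (from the abstract), hypothetical impact.
base = cost_per_infection_averted(100_000, 109, 1_000)

# Targeted scenario: 1.5x unit cost for extra recruitment effort (from the
# abstract), with a hypothetical, modest gain in infections averted.
targeted = cost_per_infection_averted(100_000, 109 * 1.5, 1_300)

print(base, round(targeted, 2))
```

With these placeholder numbers the targeted scenario averts more infections yet costs more per infection averted, which is precisely the outcome the abstract warns about when added effort exceeds the efficiency gain.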

  13. Are current health behavioral change models helpful in guiding prevention of weight gain efforts?

    Science.gov (United States)

    Baranowski, Tom; Cullen, Karen W; Nicklas, Theresa; Thompson, Deborah; Baranowski, Janice

    2003-10-01

    Effective procedures are needed to prevent the substantial increases in adiposity that have been occurring among children and adults. Behavioral change may occur as a result of changes in variables that mediate interventions. These mediating variables have typically come from the theories or models used to understand behavior. Seven categories of theories and models are reviewed to define the concepts and to identify the motivational mechanism(s), the resources that a person needs for change, the processes by which behavioral change is likely to occur, and the procedures necessary to promote change. Although each model has something to offer obesity prevention, the early promise can be achieved only with substantial additional research in which these models are applied to diet and physical activity in regard to obesity. The most promising avenues for such research seem to be using the latest variants of the Theory of Planned Behavior and Social Ecology. Synergy may be achieved by taking the most promising concepts from each model and integrating them for use with specific populations. Biology-based steps in an eating or physical activity event are identified, and research issues are suggested to integrate behavioral and biological approaches to understanding eating and physical activity behaviors. Social marketing procedures have much to offer in terms of organizing and strategizing behavioral change programs to incorporate these theoretical ideas. More research is needed to assess the true potential for these models to contribute to our understanding of obesity-related diet and physical activity practices, and in turn, to obesity prevention.

  14. RECONSTRUCTION OF PENSION FUND PERFORMANCE MODEL AS AN EFFORT TO WORTHY PENSION FUND GOVERNANCE

    Directory of Open Access Journals (Sweden)

    Apriyanto Gaguk

    2017-08-01

    This study aims to reconstruct the performance assessment model for pension funds by modifying the Baldrige Assessment method, adjusted to the conditions at Dana Pensiun A (Pension Fund A), in order to realize Good Pension Fund Governance. The study uses a case study design and was conducted at Dana Pensiun A. The informants included the employer, the supervisory board, pension fund management, active and passive pension fund participants, and the financial services authority as regulator. The result of this research is the construction of a comprehensive and profound pension fund performance assessment model that attends to aspects of growth and fair distribution. The model includes the parameters of leadership; strategic planning; stakeholder focus; measurement, analysis, and knowledge management; workforce focus; standard operating procedure focus; results; and just and fair distribution of wealth and power.

  15. The European Integrated Tokamak Modelling (ITM) effort: achievements and first physics results

    International Nuclear Information System (INIS)

    Falchetto, G.L.; Nardon, E.; Artaud, J.F.; Basiuk, V.; Huynh, Ph.; Imbeaux, F.; Coster, D.; Scott, B.D.; Coelho, R.; Alves, L.L.; Bizarro, João P.S.; Ferreira, J.; Figueiredo, A.; Figini, L.; Nowak, S.; Farina, D.; Kalupin, D.; Boulbe, C.; Faugeras, B.; Dinklage, A.

    2014-01-01

    A selection of achievements and first physics results are presented of the European Integrated Tokamak Modelling Task Force (EFDA ITM-TF) simulation framework, which aims to provide a standardized platform and an integrated modelling suite of validated numerical codes for the simulation and prediction of a complete plasma discharge of an arbitrary tokamak. The framework developed by the ITM-TF, based on a generic data structure including both simulated and experimental data, allows for the development of sophisticated integrated simulations (workflows) for physics application. The equilibrium reconstruction and linear magnetohydrodynamic (MHD) stability simulation chain was applied, in particular, to the analysis of the edge MHD stability of ASDEX Upgrade type-I ELMy H-mode discharges and ITER hybrid scenario, demonstrating the stabilizing effect of an increased Shafranov shift on edge modes. Interpretive simulations of a JET hybrid discharge were performed with two electromagnetic turbulence codes within ITM infrastructure showing the signature of trapped-electron assisted ITG turbulence. A successful benchmark among five EC beam/ray-tracing codes was performed in the ITM framework for an ITER inductive scenario for different launching conditions from the equatorial and upper launcher, showing good agreement of the computed absorbed power and driven current. Selected achievements and scientific workflow applications targeting key modelling topics and physics problems are also presented, showing the current status of the ITM-TF modelling suite. (paper)

  16. Percent Effort vs. Fee-for-Service: A Comparison of Models for Statistical Collaboration

    Science.gov (United States)

    Ittenbach, Richard F.; DeAngelis, Francis W.

    2012-01-01

    Many statisticians are uncomfortable with discussions about the financial implications of their work. Those who are comfortable may not fully understand the policies and procedures underlying the financial operations of the department. The purpose of the present paper is twofold: first, to describe two predominant models of compensation used by…

  17. A coupled modelling effort to study the fate of contaminated sediments downstream of the Coles Hill deposit, Virginia, USA

    Directory of Open Access Journals (Sweden)

    C. F. Castro-Bolinaga

    2015-03-01

    This paper presents the preliminary results of a coupled modelling effort to study the fate of tailings (a radioactive waste by-product) downstream of the Coles Hill uranium deposit located in Virginia, USA. The overall modelling process comprises a one-dimensional hydraulic model to qualitatively characterize the sediment transport process under severe flooding conditions downstream of the potential mining site, a two-dimensional ANSYS Fluent model to simulate the release of tailings from a containment cell located partially above the local ground surface into the nearby streams, and a one-dimensional finite-volume sediment transport model to examine the propagation of a tailings sediment pulse in the river network located downstream. The findings of this investigation aim to assist in estimating the potential impacts that tailings would have if they were transported into the rivers and reservoirs located downstream of the Coles Hill deposit that serve as municipal drinking-water supplies.

  18. From requirements to Java in a snap model-driven requirements engineering in practice

    CERN Document Server

    Smialek, Michal

    2015-01-01

    This book provides a coherent methodology for Model-Driven Requirements Engineering which stresses the systematic treatment of requirements within the realm of modelling and model transformations. The underlying basic assumption is that detailed requirements models are used as first-class artefacts playing a direct role in constructing software. To this end, the book presents the Requirements Specification Language (RSL) that allows precision and formality, which eventually permits automation of the process of turning requirements into a working system by applying model transformations and co

  19. Modeling uncertainty in requirements engineering decision support

    Science.gov (United States)

    Feather, Martin S.; Maynard-Zhang, Pedrito; Kiper, James D.

    2005-01-01

    One inherent characteristic of requirements engineering is a lack of certainty during this early phase of a project. Nevertheless, decisions about requirements must be made in spite of this uncertainty. Here we describe the context in which we are exploring this, along with some initial work to support the elicitation of uncertain requirements and to combine such information from multiple stakeholders.

  20. Exploring Spatiotemporal Trends in Commercial Fishing Effort of an Abalone Fishing Zone: A GIS-Based Hotspot Model

    Science.gov (United States)

    Jalali, M. Ali; Ierodiaconou, Daniel; Gorfine, Harry; Monk, Jacquomo; Rattray, Alex

    2015-01-01

    Assessing patterns of fisheries activity at a scale related to resource exploitation has received particular attention in recent times. However, acquiring data about the distribution and spatiotemporal allocation of catch and fishing effort in small-scale benthic fisheries remains challenging. Here, we used GIS-based spatio-statistical models to investigate the footprint of commercial diving events on blacklip abalone (Haliotis rubra) stocks along the south-west coast of Victoria, Australia from 2008 to 2011. Using abalone catch data matched with GPS location, we found catch per unit of fishing effort (CPUE) was not uniformly distributed across the study area in space or time. Spatial autocorrelation and hotspot analysis revealed significant spatiotemporal clusters of CPUE (with distance thresholds of hundreds of meters) among years, indicating the presence of CPUE hotspots focused on specific reefs. Cumulative hotspot maps indicated that certain reef complexes were consistently targeted across years but with varying intensity, though often only a relatively small proportion of the full reef extent was targeted. Integrating CPUE with remotely-sensed light detection and ranging (LiDAR) derived bathymetry data using a generalized additive mixed model corroborated that fishing pressure primarily coincided with shallow, rugose, and complex components of reef structures. This study demonstrates that a geospatial approach is efficient in detecting patterns and trends in commercial fishing effort and its association with seafloor characteristics. PMID:25992800
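
    As a hedged illustration of the CPUE-hotspot idea (a simplified stand-in, not the Getis-Ord Gi* hotspot statistic typically used in such studies), the sketch below grids hypothetical catch and effort records, computes CPUE per cell, and flags cells more than one standard deviation above the mean:

```python
# All diving records are invented; grid cells stand in for reef locations.
from collections import defaultdict
from statistics import mean, stdev

records = [  # (grid_cell, catch_kg, dive_hours) -- hypothetical
    ("A1", 120, 4), ("A1", 90, 3), ("A2", 30, 5),
    ("B1", 25, 4), ("B2", 200, 5), ("B2", 180, 4),
]

catch = defaultdict(float)
effort = defaultdict(float)
for cell, c, e in records:
    catch[cell] += c
    effort[cell] += e

# CPUE = total catch / total effort per cell.
cpue = {cell: catch[cell] / effort[cell] for cell in catch}

# Flag cells whose CPUE exceeds the mean by one standard deviation.
mu, sd = mean(cpue.values()), stdev(cpue.values())
hotspots = sorted(cell for cell, v in cpue.items() if v > mu + sd)
print(cpue)
print(hotspots)
```

A real analysis would weight neighbouring cells (as Gi* does) rather than treat each cell independently; the sketch only shows how CPUE concentration on specific cells becomes visible once catch is normalized by effort.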

  2. Combined observational and modeling efforts of aerosol-cloud-precipitation interactions over Southeast Asia

    Science.gov (United States)

    Loftus, Adrian; Tsay, Si-Chee; Nguyen, Xuan Anh

    2016-04-01

    Low-level stratocumulus (Sc) clouds cover more of the Earth's surface than any other cloud type rendering them critical for Earth's energy balance, primarily via reflection of solar radiation, as well as their role in the global hydrological cycle. Stratocumuli are particularly sensitive to changes in aerosol loading on both microphysical and macrophysical scales, yet the complex feedbacks involved in aerosol-cloud-precipitation interactions remain poorly understood. Moreover, research on these clouds has largely been confined to marine environments, with far fewer studies over land where major sources of anthropogenic aerosols exist. The aerosol burden over Southeast Asia (SEA) in boreal spring, attributed to biomass burning (BB), exhibits highly consistent spatiotemporal distribution patterns, with major variability due to changes in aerosol loading mediated by processes ranging from large-scale climate factors to diurnal meteorological events. Downwind from source regions, the transported BB aerosols often overlap with low-level Sc cloud decks associated with the development of the region's pre-monsoon system, providing a unique, natural laboratory for further exploring their complex micro- and macro-scale relationships. Compared to other locations worldwide, studies of springtime biomass-burning aerosols and the predominately Sc cloud systems over SEA and their ensuing interactions are underrepresented in scientific literature. Measurements of aerosol and cloud properties, whether ground-based or from satellites, generally lack information on microphysical processes; thus cloud-resolving models are often employed to simulate the underlying physical processes in aerosol-cloud-precipitation interactions. The Goddard Cumulus Ensemble (GCE) cloud model has recently been enhanced with a triple-moment (3M) bulk microphysics scheme as well as the Regional Atmospheric Modeling System (RAMS) version 6 aerosol module. Because the aerosol burden not only affects cloud

  3. Artificial Neural Networks for Reducing Computational Effort in Active Truncated Model Testing of Mooring Lines

    DEFF Research Database (Denmark)

    Christiansen, Niels Hørbye; Voie, Per Erlend Torbergsen; Høgsberg, Jan Becker

    2015-01-01

    simultaneously, this method is very demanding in terms of numerical efficiency and computational power. Therefore, this method has not yet proved to be feasible. It has recently been shown how a hybrid method combining classical numerical models and artificial neural networks (ANN) can provide a dramatic...... prior to the experiment, and with a properly trained ANN it is no problem to obtain accurate simulations much faster than real time, without any need for large computational capacity. The present study demonstrates how this hybrid method can be applied to the active truncated experiments, yielding a system...
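
The hybrid scheme described above can be sketched in miniature: an inexpensive surrogate is fitted offline to a full-depth numerical mooring-line model and then evaluated online, faster than real time, on truncated-test measurements. A linear least-squares fit on hand-picked features stands in here for the paper's trained ANN; the stand-in model, variable names, and numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def full_depth_model(top_tension, top_angle):
    """Stand-in for the expensive full-depth numerical mooring-line model."""
    return 0.8 * top_tension + 12.0 * np.sin(top_angle) + 0.05 * top_tension * top_angle

# Offline phase: sample the numerical model to build training data.
T = rng.uniform(100.0, 500.0, 400)   # top tension [kN] (invented range)
A = rng.uniform(0.0, 0.5, 400)       # top angle [rad] (invented range)
y = full_depth_model(T, A)

def features(t, a):
    # Hand-picked basis functions playing the role of the ANN's learned mapping.
    return np.column_stack([np.ones_like(t), t, np.sin(a), t * a])

coef, *_ = np.linalg.lstsq(features(T, A), y, rcond=None)

# Online phase: evaluate the cheap surrogate on a new truncated-test measurement.
t_new, a_new = np.array([250.0]), np.array([0.3])
pred = features(t_new, a_new) @ coef
truth = full_depth_model(t_new, a_new)
```

Because the surrogate is trained once, before the experiment, the online evaluation is a single matrix product, which is what makes the hybrid approach feasible in real time.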

  4. Efforts toward validation of a hydrogeological model of the Asse area

    International Nuclear Information System (INIS)

    Fein, E.; Klarr, K.; von Stempel, C.

    1995-01-01

    The Asse anticline (8 x 3 km) near Braunschweig (Germany) is part of the Subhercynian Basin and is characterized by a NW-SE orientation. In 1965, the GSF Research Center for Environment and Health acquired the former Asse salt mine on behalf of the FRG in order to carry out research and development work with a view to the safe disposal of radioactive waste. To assess long-term safety and predict groundwater flow and radionuclide transport, an experimental program was carried out to validate hydrogeological models of the overburden of the Asse salt mine and to provide these with data. Five deep boreholes, from 700 to 2250 m, and four shallow geological exploration boreholes were drilled in the Asse area. Moreover, 19 piezometers and 27 exploration boreholes were sunk to perform pumping and tracer tests and yearly borehole loggings. In the end, about 50 boreholes and wells, 25 measuring weirs and about 70 creeks, drainages and springs were available to collect hydrological data and water samples. The different experiments and their evaluations as well as different hydrogeological models are presented and discussed. (J.S.). 9 refs., 7 figs

  5. Connecting Requirements to Architecture and Analysis via Model-Based Systems Engineering

    Science.gov (United States)

    Cole, Bjorn F.; Jenkins, J. Steven

    2015-01-01

    In traditional systems engineering practice, architecture, concept development, and requirements development are related but still separate activities. Concepts for operation, key technical approaches, and related proofs of concept are developed. These inform the formulation of an architecture at multiple levels, starting with the overall system composition and functionality and progressing into more detail. As this formulation is done, a parallel activity develops a set of English statements that constrain solutions. These requirements are often called "shall statements" since they are formulated to use "shall." The separation of requirements from design is exacerbated by well-meaning tools like the Dynamic Object-Oriented Requirements System (DOORS) that have remained separated from engineering design tools. With the Europa Clipper project, efforts are being made to change the requirements development approach from a separate activity to one intimately embedded in the formulation effort. This paper presents a modeling approach and related tooling to generate English requirement statements from constraints embedded in architecture definition.
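
As a toy illustration of the core idea, generating "shall" statements from constraints held in an architecture model, the following sketch renders a requirement string from a constraint object. This is not the Europa Clipper tooling; the class, field names, and values are invented.

```python
from dataclasses import dataclass

@dataclass
class Constraint:
    subject: str       # architectural element the constraint binds
    property: str      # constrained property of that element
    relation: str      # e.g. "no more than", "at least"
    value: float
    unit: str

def to_shall_statement(c: Constraint) -> str:
    """Render a modeled constraint as an English requirement statement."""
    return f"The {c.subject} shall have a {c.property} of {c.relation} {c.value} {c.unit}."

req = to_shall_statement(
    Constraint("flight system", "total mass", "no more than", 3241.0, "kg")
)
```

Because the statement is derived from the model rather than hand-written, it cannot drift out of sync with the architecture definition it came from.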

  6. A goal-oriented requirements modelling language for enterprise architecture

    NARCIS (Netherlands)

    Quartel, Dick; Engelsman, W.; Jonkers, Henk; van Sinderen, Marten J.

    2009-01-01

    Methods for enterprise architecture, such as TOGAF, acknowledge the importance of requirements engineering in the development of enterprise architectures. Modelling support is needed to specify, document, communicate and reason about goals and requirements. Current modelling techniques for

  7. An effort allocation model considering different budgetary constraint on fault detection process and fault correction process

    Directory of Open Access Journals (Sweden)

    Vijay Kumar

    2016-01-01

    Full Text Available Fault detection process (FDP) and fault correction process (FCP) are important phases of the software development life cycle (SDLC). It is essential for software to undergo a testing phase, during which faults are detected and corrected. The main goal of this article is to allocate the testing resources in an optimal manner to minimize the cost during the testing phase using FDP and FCP under a dynamic environment. In this paper, we first assume there is a time lag between fault detection and fault correction. Thus, removal of a fault is performed after the fault is detected. In addition, the detection process and correction process are taken to be independent simultaneous activities with different budgetary constraints. A structured optimal policy based on optimal control theory is proposed for software managers to optimize the allocation of the limited resources with the reliability criteria. Furthermore, the release policy for the proposed model is also discussed. A numerical example is given in support of the theoretical results.
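
The flavor of the allocation problem can be shown with a deliberately simplified numerical sketch, not the paper's optimal-control solution: detected faults follow an exponential software-reliability growth curve in detection effort, corrected faults cannot exceed detected faults, and a fixed budget is split between the two processes by grid search. All parameter values are invented.

```python
import math

A, B_D, B_C = 100.0, 0.04, 0.03   # total faults; detection/correction efficiency (invented)
BUDGET = 120.0                     # total testing effort to split between FDP and FCP

def corrected(w_d, w_c):
    """Faults corrected given detection effort w_d and correction effort w_c."""
    detected = A * (1.0 - math.exp(-B_D * w_d))
    # Correction is capped by what has been detected (the time-lag assumption).
    return min(detected, A * (1.0 - math.exp(-B_C * w_c)))

# Exhaustive search over integer splits of the budget.
best_corrected, best_detection_effort = max(
    ((corrected(w, BUDGET - w), w) for w in range(0, 121)),
    key=lambda pair: pair[0],
)
```

Putting everything into either process alone corrects nothing, so the optimum is an interior split, which is the qualitative point of treating detection and correction as coupled but separately funded activities.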

  8. Community effort endorsing multiscale modelling, multiscale data science and multiscale computing for systems medicine.

    Science.gov (United States)

    Zanin, Massimiliano; Chorbev, Ivan; Stres, Blaz; Stalidzans, Egils; Vera, Julio; Tieri, Paolo; Castiglione, Filippo; Groen, Derek; Zheng, Huiru; Baumbach, Jan; Schmid, Johannes A; Basilio, José; Klimek, Peter; Debeljak, Nataša; Rozman, Damjana; Schmidt, Harald H H W

    2017-12-05

    Systems medicine holds many promises, but has so far provided only a limited number of proofs of principle. To address this roadblock, possible barriers and challenges of translating systems medicine into clinical practice need to be identified and addressed. The members of the European Cooperation in Science and Technology (COST) Action CA15120 Open Multiscale Systems Medicine (OpenMultiMed) wish to engage the scientific community of systems medicine and multiscale modelling, data science and computing, to provide their feedback in a structured manner. This will result in follow-up white papers and open access resources to accelerate the clinical translation of systems medicine. © The Author 2017. Published by Oxford University Press.

  9. Specification of advanced safety modeling requirements (Rev. 0).

    Energy Technology Data Exchange (ETDEWEB)

    Fanning, T. H.; Tautges, T. J.

    2008-06-30

    The U.S. Department of Energy's Global Nuclear Energy Partnership has led to renewed interest in liquid-metal-cooled fast reactors for the purpose of closing the nuclear fuel cycle and making more efficient use of future repository capacity. However, the U.S. has not designed or constructed a fast reactor in nearly 30 years. Accurate, high-fidelity, whole-plant dynamics safety simulations will play a crucial role by providing confidence that component and system designs will satisfy established design limits and safety margins under a wide variety of operational, design basis, and beyond design basis transient conditions. Current modeling capabilities for fast reactor safety analyses have resulted from several hundred person-years of code development effort supported by experimental validation. The broad spectrum of mechanistic and phenomenological models that have been developed represents an enormous amount of institutional knowledge that needs to be maintained. Complicating this, the existing code architectures for safety modeling evolved from programming practices of the 1970s. This has led to monolithic applications with interdependent data models which require significant knowledge of the complexities of the entire code in order for each component to be maintained. In order to develop an advanced fast reactor safety modeling capability, the limitations of the existing code architecture must be overcome while preserving the capabilities that already exist. To accomplish this, a set of advanced safety modeling requirements is defined, based on modern programming practices, that focuses on modular development within a flexible coupling framework. An approach for integrating the existing capabilities of the SAS4A/SASSYS-1 fast reactor safety analysis code into the SHARP framework is provided in order to preserve existing capabilities while providing a smooth transition to advanced modeling capabilities. In doing this, the advanced fast reactor safety models

  10. Job characteristics and safety climate: the role of effort-reward and demand-control-support models.

    Science.gov (United States)

    Phipps, Denham L; Malley, Christine; Ashcroft, Darren M

    2012-07-01

    While safety climate is widely recognized as a key influence on organizational safety, there remain questions about the nature of its antecedents. One potential influence on safety climate is job characteristics (that is, psychosocial features of the work environment). This study investigated the relationship between two job characteristics models--demand-control-support (Karasek & Theorell, 1990) and effort-reward imbalance (Siegrist, 1996)--and safety climate. A survey was conducted with a random sample of 860 British retail pharmacists, using the job contents questionnaire (JCQ), effort-reward imbalance indicator (ERI) and a measure of safety climate in pharmacies. Multivariate data analyses found that: (a) both models contributed to the prediction of safety climate ratings, with the demand-control-support model making the largest contribution; (b) there were some interactions between demand, control and support from the JCQ in the prediction of safety climate scores. The latter finding suggests the presence of "active learning" with respect to safety improvement in high demand, high control settings. The findings provide further insight into the ways in which job characteristics relate to safety, both individually and at an aggregated level.
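
The kind of moderated regression the study reports, safety climate regressed on demand, control and support plus an interaction term, can be sketched on synthetic data. The data, coefficients, and sample size are invented for illustration; only the model form (main effects plus a demand x control product term, fitted by ordinary least squares) reflects the analysis described.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 860  # matches the survey's sample size, but the data are synthetic

demand, control, support = rng.normal(size=(3, n))

# Synthetic "true" relationship, including a demand x control interaction.
climate = 3.0 - 0.4 * demand + 0.5 * control + 0.3 * support + 0.2 * demand * control

# Design matrix: intercept, main effects, interaction term.
X = np.column_stack([np.ones(n), demand, control, support, demand * control])
beta, *_ = np.linalg.lstsq(X, climate, rcond=None)
```

With noise-free synthetic data the fit recovers the generating coefficients exactly; a positive product-term coefficient here plays the role of the "active learning" interaction the authors interpret in high-demand, high-control settings.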

  11. Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition

    Science.gov (United States)

    Ewing, Anthony; Adams, Charles

    2004-01-01

    Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficient to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.
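
One ingredient of the model-validation process named above, model sensitivity analysis, can be sketched with central finite differences. The "sail model" here is a toy flat-plate thrust formula with invented inputs, not a real sail code; only the sensitivity-estimation pattern is the point.

```python
P0 = 4.563e-6  # approximate solar radiation pressure at 1 AU [N/m^2]

def sail_thrust(area_m2, reflectivity):
    """Toy flat-plate sail thrust model (illustrative, not a validated code)."""
    return P0 * area_m2 * (1.0 + reflectivity)

def sensitivity(f, args, i, h=1e-6):
    """Central-difference partial derivative of f with respect to argument i."""
    up = list(args); up[i] += h
    dn = list(args); dn[i] -= h
    return (f(*up) - f(*dn)) / (2.0 * h)

args = [1600.0, 0.9]  # a 40 m x 40 m sail with reflectivity 0.9 (invented)
dT_dA = sensitivity(sail_thrust, args, 0)  # thrust sensitivity to sail area
dT_dr = sensitivity(sail_thrust, args, 1)  # thrust sensitivity to reflectivity
```

Ranking such sensitivities tells the validation program which model inputs must be pinned down by test measurements and which can tolerate loose characterization.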

  12. Comparison of catch per unit effort among four minnow trap models in the three-spined stickleback (Gasterosteus aculeatus) fishery.

    Science.gov (United States)

    Budria, Alexandre; DeFaveri, Jacquelin; Merilä, Juha

    2015-12-21

    Minnow traps are commonly used in the stickleback (Gasterosteidae) fishery, but the potential differences in catch per unit effort (CPUE) among different minnow trap models are little studied. We compared the CPUE of four different minnow trap models in field experiments conducted with three-spined sticklebacks (Gasterosteus aculeatus). Marked (up to 26-fold) differences in median CPUE among different trap models were observed. Metallic uncoated traps yielded the largest CPUE (2.8 fish/h), followed by metallic black nylon-coated traps (1.3 fish/h). Collapsible canvas traps yielded substantially lower CPUEs (black: 0.7 fish/h; red: 0.1 fish/h) than the metallic traps. Laboratory trials further revealed significant differences in escape probabilities among the different trap models. While the differences in escape probability can explain at least part of the differences in CPUE among the trap models (e.g. high escape rate and low CPUE in red canvas traps), discrepancies between model-specific CPUEs and escape rates suggest that variation in entrance rate also contributes to the differences in CPUE. In general, and in accordance with earlier data on nine-spined stickleback (Pungitius pungitius) trapping, the results suggest that uncoated metallic (Gee-type) traps are superior to the other commonly used minnow trap models in stickleback fisheries.
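
CPUE as used in the study is simply catch divided by soak effort, with trap models compared by their median CPUE. A minimal sketch with invented per-trap counts (not the paper's data):

```python
from statistics import median

def cpue(fish_caught, soak_hours):
    """Catch per unit effort: fish per trap-hour."""
    return fish_caught / soak_hours

# Invented (catch, soak-hours) pairs for two hypothetical trap models.
uncoated_metal = [cpue(c, h) for c, h in [(14, 5), (11, 4), (30, 10)]]
red_canvas = [cpue(c, h) for c, h in [(1, 5), (0, 4), (2, 10)]]

ratio = median(uncoated_metal) / median(red_canvas)  # fold-difference in median CPUE
```

The median (rather than the mean) is the robust summary used above because trap-level catches are typically skewed by occasional large hauls.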

  13. DECISION MAKING MODELING OF CONCRETE REQUIREMENTS

    Directory of Open Access Journals (Sweden)

    Suhartono Irawan

    2001-01-01

    Full Text Available This paper presents the results of an experimental evaluation between predicted and actual concrete strength. The scope of the evaluation is the optimisation of the cement content for different concrete grades as a result of bringing the target mean value of test cubes closer to the required characteristic strength value by reducing the standard deviation. Keywords: concrete mix design, acceptance control, optimisation, cement content.

  14. Applying the effort-reward imbalance model to household and family work: a population-based study of German mothers

    Directory of Open Access Journals (Sweden)

    Sperlich Stefanie

    2012-01-01

    Full Text Available Abstract Background This paper reports on results of a newly developed questionnaire for the assessment of effort-reward imbalance (ERI) in unpaid household and family work. Methods: Using a cross-sectional population-based survey of German mothers (n = 3129), the dimensional structure of the theoretical ERI model was validated by means of Confirmatory Factor Analysis (CFA). Analyses of Variance were computed to examine relationships between ERI and social factors and health outcomes. Results CFA revealed good psychometric properties indicating that the subscale 'effort' is based on one latent factor and the subscale 'reward' is composed of four dimensions: 'intrinsic value of family and household work', 'societal esteem', 'recognition from the partner', and 'affection from the child(ren)'. About 19.3% of mothers perceived a lack of reciprocity and 23.8% showed high rates of overcommitment in terms of inability to withdraw from household and family obligations. Socially disadvantaged mothers were at higher risk of ERI, in particular with respect to the perception of low societal esteem. Gender inequality in the division of household and family work and work-family conflict accounted most for ERI in household and family work. Analogous to ERI in paid work, we could demonstrate that ERI affects self-rated health, somatic complaints, mental health and, to some extent, hypertension. Conclusions The newly developed questionnaire demonstrates satisfactory validity and promising results for extending the ERI model to household and family work.

  15. Applying the effort-reward imbalance model to household and family work: a population-based study of German mothers.

    Science.gov (United States)

    Sperlich, Stefanie; Peter, Richard; Geyer, Siegfried

    2012-01-06

    This paper reports on results of a newly developed questionnaire for the assessment of effort-reward imbalance (ERI) in unpaid household and family work. Using a cross-sectional population-based survey of German mothers (n = 3129) the dimensional structure of the theoretical ERI model was validated by means of Confirmatory Factor Analysis (CFA). Analyses of Variance were computed to examine relationships between ERI and social factors and health outcomes. CFA revealed good psychometric properties indicating that the subscale 'effort' is based on one latent factor and the subscale 'reward' is composed of four dimensions: 'intrinsic value of family and household work', 'societal esteem', 'recognition from the partner', and 'affection from the child(ren)'. About 19.3% of mothers perceived a lack of reciprocity and 23.8% showed high rates of overcommitment in terms of inability to withdraw from household and family obligations. Socially disadvantaged mothers were at higher risk of ERI, in particular with respect to the perception of low societal esteem. Gender inequality in the division of household and family work and work-family conflict accounted most for ERI in household and family work. Analogous to ERI in paid work, we could demonstrate that ERI affects self-rated health, somatic complaints, mental health and, to some extent, hypertension. The newly developed questionnaire demonstrates satisfactory validity and promising results for extending the ERI model to household and family work.
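
In Siegrist's ERI framework the imbalance itself is conventionally quantified as an effort-to-reward ratio, with a correction factor for the unequal numbers of effort and reward items; a ratio above 1 indicates a perceived lack of reciprocity. The scores and item counts below are invented for illustration.

```python
def eri_ratio(effort_score, reward_score, n_effort_items, n_reward_items):
    """Effort-reward imbalance ratio with item-count correction (Siegrist)."""
    correction = n_effort_items / n_reward_items
    return effort_score / (reward_score * correction)

# Hypothetical respondent: 6 effort items summing to 18, 11 reward items summing to 33.
balanced = eri_ratio(18, 33, 6, 11)     # exactly reciprocal effort and reward
imbalanced = eri_ratio(24, 33, 6, 11)   # higher effort at the same reward
```

The same ratio logic carries over directly to the household-and-family adaptation of the questionnaire, with the reward score summed over its four dimensions.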

  16. Experimental and theoretical requirements for fuel modelling

    International Nuclear Information System (INIS)

    Gatesoupe, J.P.

    1979-01-01

    From a scientific point of view it may be considered that any event in the life of a fuel pin under irradiation should be perfectly well understood and foreseen. From that deterministic point of view, the whole behaviour of the pin may be analysed and dismantled, with a specific function for every component part and each component part related to one basic phenomenon which can be independently studied on pure physical grounds. When extracted from the code structure, the subroutine is studied for itself by specialists who try to keep as close as possible to the physics involved in the phenomenon; that often leads to an impressive luxury of detail and a subsequent need for many unavailable input data. It might seem more secure to follow that approach, since it tries to be firmly based on theoretical grounds; one might think so if the phenomenological situation in the pin were less complex than it is. The codes would not be adequate for off-normal operating conditions, since for accidental transient conditions the key phenomena would not be the same as for steady-state or slow transient conditions. The orientation given to fuel modelling is based on our two main technological constraints, which are: no fuel melting; no cladding failure; no excessive cladding deformation. In this context, the only relevant models are those which have a significant influence on the maximum temperatures in the fuel or on the cladding damage; hence the selection between key models and irrelevant models which is done next. A rather pragmatic view is kept on codification, with a special focus on a few determinant aspects of fuel behaviour and no attention to models which are nothing but decorative. Fuel modelling is merely considered as a link to experimental knowledge; it serves as a guide for further improvements in fuel design and as such happens to be quite useful. On this basis the main gaps in the knowledge of fuel behaviour are described. These mainly concern: thermal transfer through

  17. Spreadsheet Decision Support Model for Training Exercise Material Requirements Planning

    National Research Council Canada - National Science Library

    Tringali, Arthur

    1997-01-01

    ... associated with military training exercises. The model combines the business practice of Material Requirements Planning and the commercial spreadsheet software capabilities of Lotus 1-2-3 to calculate the requirements for food, consumable...
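
The Material Requirements Planning logic that the spreadsheet model automates reduces, at its core, to period-by-period netting: gross requirements less projected on-hand stock give the net requirement to be ordered. A minimal sketch with invented exercise quantities (lot sizing, lead-time offsetting and the Lotus 1-2-3 layout are omitted):

```python
def mrp_net(gross_by_period, on_hand):
    """Per-period net requirements: shortfall after consuming on-hand stock."""
    planned = []
    for gross in gross_by_period:
        net = max(0, gross - on_hand)       # order only what stock cannot cover
        planned.append(net)
        on_hand = max(0, on_hand - gross)   # leftover stock carried forward
    return planned

# Three exercise periods of demand for one consumable, starting with 70 on hand.
orders = mrp_net([40, 60, 30], on_hand=70)
```

Each item (food, fuel, ammunition, and so on) gets its own netting pass, which is exactly the repetitive calculation a spreadsheet handles well.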

  18. Extending enterprise architecture modelling with business goals and requirements

    NARCIS (Netherlands)

    Engelsman, W.; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten J.

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling

  19. Extending enterprise architecture modelling with business goals and requirements

    Science.gov (United States)

    Engelsman, Wilco; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten

    2011-02-01

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling techniques for EA focus on the products, services, processes and applications of an enterprise. In addition, techniques may be provided to describe structured requirements lists and use cases. Little support is available however for modelling the underlying motivation of EAs in terms of stakeholder concerns and the high-level goals that address these concerns. This article describes a language that supports the modelling of this motivation. The definition of the language is based on existing work on high-level goal and requirements modelling and is aligned with an existing standard for enterprise modelling: the ArchiMate language. Furthermore, the article illustrates how EA can benefit from analysis techniques from the requirements engineering domain.

  20. Systems and context modeling approach to requirements analysis

    Science.gov (United States)

    Ahuja, Amrit; Muralikrishna, G.; Patwari, Puneet; Subhrojyoti, C.; Swaminathan, N.; Vin, Harrick

    2014-08-01

    Ensuring completeness and correctness of the requirements for a complex system such as the SKA is challenging. Current system engineering practice includes developing a stakeholder needs definition, a concept of operations, and defining system requirements in terms of use cases and requirements statements. We present a method that enhances this current practice into a collection of system models with mutual consistency relationships. These include stakeholder goals, needs definition and system-of-interest models, together with a context model that participates in the consistency relationships among these models. We illustrate this approach by using it to analyze the SKA system requirements.

  1. Customer requirement modeling and mapping of numerical control machine

    Directory of Open Access Journals (Sweden)

    Zhongqi Sheng

    2015-10-01

    Full Text Available In order to better obtain information about customer requirements and develop products meeting them, it is necessary to systematically analyze and handle customer requirements. This article takes the product service system of a numerical control machine as its research object and studies customer requirement modeling and mapping oriented toward configuration design. It introduces the concept of the requirement unit, expounds the customer requirement decomposition rules, and establishes a customer requirement model; it builds the house of quality using quality function deployment and confirms the weights of the technical features of the product and service; it explores the relevance rules between data using rough set theory, establishes a rule database, and solves for the target values of the technical features of the product. Using an economical turning center series numerical control machine as an example, it verifies the rationality of the proposed customer requirement model.
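
The house-of-quality step mentioned above weights each technical feature by the importance-weighted sum of its relationship strengths with the customer requirements, commonly on a 9/3/1 scale. The importances and relationship matrix below are invented, not the paper's machine-tool data.

```python
# Customer requirement importances (invented, e.g. on a 1-5 scale).
customer_importance = [5, 3, 4]

# Relationship matrix: rows are customer requirements, columns are technical
# features; entries use the conventional 9 (strong) / 3 (moderate) / 1 (weak) scale.
relationship = [
    [9, 3, 0],
    [1, 9, 3],
    [0, 3, 9],
]

# Technical feature weight = sum over requirements of importance x relationship.
tech_weights = [
    sum(imp * row[j] for imp, row in zip(customer_importance, relationship))
    for j in range(len(relationship[0]))
]
```

The resulting weights rank the technical features for configuration design; downstream steps (such as the rough-set rule mining in the article) then set target values for the highest-weighted features.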

  2. Defining Requirements and Applying Information Modeling for Protecting Enterprise Assets

    Science.gov (United States)

    Fortier, Stephen C.; Volk, Jennifer H.

    The advent of terrorist threats has heightened local, regional, and national governments' interest in emergency response and disaster preparedness. The threat of natural disasters also challenges emergency responders to act swiftly and in a coordinated fashion. When a disaster occurs, an ad hoc coalition of pre-planned groups usually forms to respond to the incident. History has shown that these “systems of systems” do not interoperate very well. Communications between fire, police and rescue components either do not work or are inefficient. Government agencies, non-governmental organizations (NGOs), and private industry use a wide array of software platforms for managing data about emergency conditions, resources and response activities. Most of these are stand-alone systems with very limited capability for data sharing with other agencies or other levels of government. Information technology advances have facilitated the movement towards an integrated and coordinated approach to emergency management. Other communication mechanisms, such as video teleconferencing, digital television and radio broadcasting, are being utilized to combat the challenges of emergency information exchange. Recent disasters, such as Hurricane Katrina and the tsunami in Indonesia, have illuminated the weaknesses in emergency response. This paper will discuss the need for defining requirements for components of ad hoc coalitions which are formed to respond to disasters. A goal of our effort was to develop a proof of concept showing that applying information modeling to the business processes used to protect an enterprise and mitigate its potential losses was feasible. These activities would be modeled both pre- and post-incident.

  3. Using cognitive modeling for requirements engineering in anesthesiology

    NARCIS (Netherlands)

    Pott, C; le Feber, J

    2005-01-01

    Cognitive modeling is a complexity reducing method to describe significant cognitive processes under a specified research focus. Here, a cognitive process model for decision making in anesthesiology is presented and applied in requirements engineering. Three decision making situations of

  4. Predicting flow through low-permeability, partially saturated, fractured rock: A review of modeling and experimental efforts at Yucca Mountain

    International Nuclear Information System (INIS)

    Eaton, R.R.; Bixler, N.E.; Glass, R.J.

    1989-01-01

    Current interest in storing high-level nuclear waste in underground repositories has resulted in an increased effort to understand the physics of water flow through low-permeability rock. The US Department of Energy is investigating a prospective repository site located in volcanic ash (tuff) hundreds of meters above the water table at Yucca Mountain, Nevada. Consequently, mathematical models and experimental procedures are being developed to provide a better understanding of the hydrology of this low-permeability, partially saturated, fractured rock. Modeling water flow in the vadose zone in soils and in relatively permeable rocks such as sandstone has received considerable attention for many years. The treatment of flow (including nonisothermal conditions) through materials such as the Yucca Mountain tuffs, however, has not received the same level of attention, primarily because it is outside the domain of agricultural and petroleum technology. This paper reviews the status of modeling and experimentation currently being used to understand and predict water flow at the proposed repository site. Several research needs emphasized by the review are outlined. The extremely nonlinear hydraulic properties of these tuffs, in combination with their heterogeneous nature, make this a challenging and unique problem from a computational and experimental viewpoint. 101 refs., 14 figs., 1 tab
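
The "extremely nonlinear hydraulic properties" at issue are commonly parameterized in unsaturated-flow codes with the van Genuchten-Mualem relations, in which effective saturation and relative permeability both fall off steeply with suction. The sketch below uses that standard parameterization with illustrative parameter values, not actual Yucca Mountain tuff properties.

```python
import math

def effective_saturation(h, alpha, n):
    """van Genuchten retention curve; h is suction head (positive)."""
    m = 1.0 - 1.0 / n
    return (1.0 + (alpha * h) ** n) ** (-m)

def relative_permeability(se, n):
    """Mualem relative permeability from effective saturation."""
    m = 1.0 - 1.0 / n
    return math.sqrt(se) * (1.0 - (1.0 - se ** (1.0 / m)) ** m) ** 2

# Illustrative parameters: alpha [1/m] and n are material fitting constants.
se_wet = effective_saturation(0.1, alpha=1.5, n=2.0)    # low suction (wet)
se_dry = effective_saturation(10.0, alpha=1.5, n=2.0)   # high suction (dry)
kr_wet = relative_permeability(se_wet, 2.0)
kr_dry = relative_permeability(se_dry, 2.0)
```

The many-orders-of-magnitude drop in relative permeability between wet and dry states is precisely what makes these problems numerically stiff and experimentally demanding.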

  5. Capabilities and requirements for modelling radionuclide transport in the geosphere

    International Nuclear Information System (INIS)

    Paige, R.W.; Piper, D.

    1989-02-01

    This report gives an overview of geosphere flow and transport models suitable for use by the Department of the Environment in the performance assessment of radioactive waste disposal sites. An outline methodology for geosphere modelling is proposed, consisting of a number of different types of model. A brief description of each of the component models is given, indicating the purpose of the model, the processes being modelled and the methodologies adopted. Areas requiring development are noted. (author)

  6. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    Requirements validation is a critical task in any engineering project. The confrontation of stakeholders with static requirements models is not enough, since stakeholders with non-computer science education are not able to discover all the inter-dependencies between the elicited requirements. Eve...... requirements, where the system to be built must explicitly support the interaction between people within a pervasive cooperative workflow execution. A case study from a real project is used to illustrate the proposed approach....

  7. Associations of Extrinsic and Intrinsic Components of Work Stress with Health: A Systematic Review of Evidence on the Effort-Reward Imbalance Model.

    Science.gov (United States)

    Siegrist, Johannes; Li, Jian

    2016-04-19

    Mainstream psychological stress theory claims that it is important to include information on people's ways of coping with work stress when assessing the impact of stressful psychosocial work environments on health. Yet, some widely used respective theoretical models focus exclusively on extrinsic factors. The model of effort-reward imbalance (ERI) differs from them as it explicitly combines information on extrinsic and intrinsic factors in studying workers' health. As a growing number of studies used the ERI model in recent past, we conducted a systematic review of available evidence, with a special focus on the distinct contribution of its intrinsic component, the coping pattern "over-commitment", towards explaining health. Moreover, we explore whether the interaction of intrinsic and extrinsic components exceeds the size of effects on health attributable to single components. Results based on 51 reports document an independent explanatory role of "over-commitment" in explaining workers' health in a majority of studies. However, support in favour of the interaction hypothesis is limited and requires further exploration. In conclusion, the findings of this review support the usefulness of a work stress model that combines extrinsic and intrinsic components in terms of scientific explanation and of designing more comprehensive worksite stress prevention programs.

  8. Associations of Extrinsic and Intrinsic Components of Work Stress with Health: A Systematic Review of Evidence on the Effort-Reward Imbalance Model

    Directory of Open Access Journals (Sweden)

    Johannes Siegrist

    2016-04-01

    Full Text Available Mainstream psychological stress theory claims that it is important to include information on people’s ways of coping with work stress when assessing the impact of stressful psychosocial work environments on health. Yet, some widely used respective theoretical models focus exclusively on extrinsic factors. The model of effort-reward imbalance (ERI differs from them as it explicitly combines information on extrinsic and intrinsic factors in studying workers’ health. As a growing number of studies used the ERI model in recent past, we conducted a systematic review of available evidence, with a special focus on the distinct contribution of its intrinsic component, the coping pattern “over-commitment”, towards explaining health. Moreover, we explore whether the interaction of intrinsic and extrinsic components exceeds the size of effects on health attributable to single components. Results based on 51 reports document an independent explanatory role of “over-commitment” in explaining workers’ health in a majority of studies. However, support in favour of the interaction hypothesis is limited and requires further exploration. In conclusion, the findings of this review support the usefulness of a work stress model that combines extrinsic and intrinsic components in terms of scientific explanation and of designing more comprehensive worksite stress prevention programs.

  9. Associations of Extrinsic and Intrinsic Components of Work Stress with Health: A Systematic Review of Evidence on the Effort-Reward Imbalance Model

    Science.gov (United States)

    Siegrist, Johannes; Li, Jian

    2016-01-01

    Mainstream psychological stress theory claims that it is important to include information on people’s ways of coping with work stress when assessing the impact of stressful psychosocial work environments on health. Yet, some widely used respective theoretical models focus exclusively on extrinsic factors. The model of effort-reward imbalance (ERI) differs from them as it explicitly combines information on extrinsic and intrinsic factors in studying workers’ health. As a growing number of studies used the ERI model in the recent past, we conducted a systematic review of available evidence, with a special focus on the distinct contribution of its intrinsic component, the coping pattern “over-commitment”, towards explaining health. Moreover, we explore whether the interaction of intrinsic and extrinsic components exceeds the size of effects on health attributable to single components. Results based on 51 reports document an independent explanatory role of “over-commitment” in explaining workers’ health in a majority of studies. However, support in favour of the interaction hypothesis is limited and requires further exploration. In conclusion, the findings of this review support the usefulness of a work stress model that combines extrinsic and intrinsic components in terms of scientific explanation and of designing more comprehensive worksite stress prevention programs. PMID:27104548

  10. Improved Traceability of a Small Satellite Mission Concept to Requirements Using Model Based System Engineering

    Science.gov (United States)

    Reil, Robin L.

    2014-01-01

    Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve the "traditional" document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g. time and cost saved, increased product quality). This paper presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to demonstrate the completeness and consistency of the requirements to the mission concept. Anecdotal information and process-duration metrics are presented for both the MBSE and original DBSE design efforts of SporeSat.

  11. Improved Traceability of Mission Concept to Requirements Using Model Based Systems Engineering

    Science.gov (United States)

    Reil, Robin

    2014-01-01

    Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve the traditional document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g. time and cost saved, increased product quality). This thesis presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to analyze the completeness and consistency of the requirements to the mission concept. Overall experience and methodology are presented for both the MBSE and original DBSE design efforts of SporeSat.

  12. The Rayleigh curve as a model for effort distribution over the life of medium scale software systems. M.S. Thesis - Maryland Univ.

    Science.gov (United States)

    Picasso, G. O.; Basili, V. R.

    1982-01-01

    It is noted that previous investigations into the applicability of the Rayleigh curve model to medium scale software development efforts have met with mixed results. The results of these investigations are confirmed by analyses of runs and smoothing. The reasons for the model's failure are found in the subcycle effort data. There are four contributing factors: uniqueness of the environment studied, the influence of holidays, varying management techniques, and differences in the data studied.
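For context, the Rayleigh curve in this model family (the Norden-Putnam staffing model) gives instantaneous effort E(t) = 2*K*a*t*exp(-a*t^2) and cumulative effort K*(1 - exp(-a*t^2)); a minimal sketch, with parameter values that are illustrative rather than taken from the thesis:

```python
import math

def rayleigh_effort(t, K, a):
    """Instantaneous effort (staffing rate) at time t for the
    Norden-Putnam Rayleigh model: E(t) = 2*K*a*t*exp(-a*t^2),
    where K is total life-cycle effort and a sets the peak location."""
    return 2.0 * K * a * t * math.exp(-a * t * t)

def cumulative_effort(t, K, a):
    """Cumulative effort expended through time t: K * (1 - exp(-a*t^2))."""
    return K * (1.0 - math.exp(-a * t * t))

# Illustrative 100 person-month project; effort peaks at t = 1/sqrt(2a) = 5 months
K, a = 100.0, 0.02
print(round(cumulative_effort(10.0, K, a), 1))  # person-months spent by month 10: 86.5
```

Comparing such a fitted curve against the recorded subcycle effort data is what the analyses of runs and smoothing mentioned above would operate on.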

  13. Parenting and the Development of Effortful Control from Early Childhood to Early Adolescence: A Transactional Developmental Model

    Science.gov (United States)

    Capaldi, Deborah M.; Kerr, David C. R.; Bertrand, Maria; Pears, Katherine C.; Owen, Lee

    2016-01-01

    Poor effortful control is a key temperamental factor underlying behavioral problems. The bidirectional association of child effortful control with both positive parenting and negative discipline was examined from ages approximately 3 to 13–14 years, involving 5 time points, and using data from parents and children in the Oregon Youth Study-Three Generational Study (N = 318 children from 150 families). Based on a dynamic developmental systems approach, it was hypothesized that there would be concurrent associations between parenting and child effortful control and bidirectional effects across time from each aspect of parenting to effortful control and from effortful control to each aspect of parenting. It was also hypothesized that associations would be more robust in early childhood, from ages 3 to 7 years, and would diminish as indicated by significantly weaker effects at the older ages, 11–12 to 13–14 years. Longitudinal feedback or mediated effects were also tested. Findings supported (a) stability in each construct over multiple developmental periods; (b) concurrent associations, which were significantly weaker at the older ages; (c) bidirectional effects, consistent with the interpretation that at younger ages children’s effortful control influenced parenting, whereas at older child ages, parenting influenced effortful control; and (d) a transactional effect, such that maternal parenting in late childhood was a mechanism explaining children’s development of effortful control from midchildhood to early adolescence. PMID:27427809

  14. GENERAL REQUIREMENTS FOR SIMULATION MODELS IN WASTE MANAGEMENT

    International Nuclear Information System (INIS)

    Miller, Ian; Kossik, Rick; Voss, Charlie

    2003-01-01

    Most waste management activities are decided upon and carried out in a public or semi-public arena, typically involving the waste management organization, one or more regulators, and often other stakeholders and members of the public. In these environments, simulation modeling can be a powerful tool in reaching a consensus on the best path forward, but only if the models that are developed are understood and accepted by all of the parties involved. These requirements for understanding and acceptance of the models constrain the appropriate software and model development procedures that are employed. This paper discusses requirements for both simulation software and for the models that are developed using the software. Requirements for the software include transparency, accessibility, flexibility, extensibility, quality assurance, ability to do discrete and/or continuous simulation, and efficiency. Requirements for the models that are developed include traceability, transparency, credibility/validity, and quality control. The paper discusses these requirements with specific reference to the requirements for performance assessment models that are used for predicting the long-term safety of waste disposal facilities, such as the proposed Yucca Mountain repository.

  15. Spreadsheet Decision Support Model for Training Exercise Material Requirements Planning

    National Research Council Canada - National Science Library

    Tringali, Arthur

    1997-01-01

    This thesis focuses on developing a spreadsheet decision support model that can be used by combat engineer platoon and company commanders in determining the material requirements and estimated costs...
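The material-requirements-and-cost logic such a spreadsheet model embodies can be sketched in a few lines; item names, quantities, and unit costs below are invented for illustration, not taken from the thesis:

```python
# Minimal sketch of spreadsheet-style material requirements planning:
# net each item's exercise demand against on-hand stock, then cost the order.
def net_requirements(gross, on_hand):
    """Net requirement per item: demand not covered by current stock."""
    return {item: max(qty - on_hand.get(item, 0), 0) for item, qty in gross.items()}

def order_cost(net, unit_cost):
    """Estimated cost of procuring the net requirements."""
    return sum(qty * unit_cost[item] for item, qty in net.items())

gross = {"concertina_wire_roll": 40, "sandbag": 500}    # training-exercise demand
on_hand = {"concertina_wire_roll": 15, "sandbag": 650}  # current stock
costs = {"concertina_wire_roll": 62.0, "sandbag": 0.35} # unit costs, dollars

net = net_requirements(gross, on_hand)
print(net, order_cost(net, costs))
```

A commander could read the two outputs directly as the shortfall list and the estimated procurement cost for the exercise.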

  16. Commonsense Psychology and the Functional Requirements of Cognitive Models

    National Research Council Canada - National Science Library

    Gordon, Andrew S

    2005-01-01

    In this paper we argue that previous models of cognitive abilities (e.g. memory, analogy) have been constructed to satisfy functional requirements of implicit commonsense psychological theories held by researchers and nonresearchers alike...

  17. A Compositional Knowledge Level Process Model of Requirements Engineering

    NARCIS (Netherlands)

    Herlea, D.E.; Jonker, C.M.; Treur, J.; Wijngaards, W.C.A.

    2002-01-01

    In current literature few detailed process models for Requirements Engineering are presented: usually high-level activities are distinguished, without a more precise specification of each activity. In this paper the process of Requirements Engineering has been analyzed using knowledge-level

  18. Requirements Modeling with the Aspect-oriented User Requirements Notation (AoURN): A Case Study

    Science.gov (United States)

    Mussbacher, Gunter; Amyot, Daniel; Araújo, João; Moreira, Ana

    The User Requirements Notation (URN) is a recent ITU-T standard that supports requirements engineering activities. The Aspect-oriented URN (AoURN) adds aspect-oriented concepts to URN, creating a unified framework that allows for scenario-based, goal-oriented, and aspect-oriented modeling. AoURN is applied to the car crash crisis management system (CCCMS), modeling its functional and non-functional requirements (NFRs). AoURN generally models all use cases, NFRs, and stakeholders as individual concerns and provides general guidelines for concern identification. AoURN handles interactions between concerns, capturing their dependencies and conflicts as well as the resolutions. We present a qualitative comparison of aspect-oriented techniques for scenario-based and goal-oriented requirements engineering. An evaluation carried out based on the metrics adapted from literature and a task-based evaluation suggest that AoURN models are more scalable than URN models and exhibit better modularity, reusability, and maintainability.

  19. Irrigation Requirement Estimation Using Vegetation Indices and Inverse Biophysical Modeling

    Science.gov (United States)

    Bounoua, Lahouari; Imhoff, Marc L.; Franks, Shannon

    2010-01-01

    We explore an inverse biophysical modeling process forced by satellite and climatological data to quantify irrigation requirements in semi-arid agricultural areas. We constrain the carbon and water cycles modeled under both equilibrium (balance between vegetation and climate) and non-equilibrium (water added through irrigation) conditions. We postulate that the degree to which irrigated dry lands vary from equilibrium climate conditions is related to the amount of irrigation. The amount of water required over and above precipitation is considered as an irrigation requirement. For July, results show that spray irrigation resulted in an additional amount of water of 1.3 mm per occurrence with a frequency of 24.6 hours. In contrast, the drip irrigation required only 0.6 mm every 45.6 hours, or 46% of that simulated by the spray irrigation. The modeled estimates account for 87% of the total reported irrigation water use when soil salinity is not important, and 66% in saline lands.
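As a quick arithmetic check of the figures quoted above: the 46% matches the per-occurrence amounts (0.6 mm / 1.3 mm), and the per-occurrence values can also be converted to average daily rates for a like-for-like comparison. This is our reading of the abstract, not a computation from the paper:

```python
# Figures from the abstract: mm per irrigation occurrence and
# hours between occurrences, for spray vs. drip irrigation (July).
spray_mm, spray_hours = 1.3, 24.6
drip_mm, drip_hours = 0.6, 45.6

# Per-occurrence ratio quoted in the abstract (~46%)
ratio = drip_mm / spray_mm

# Equivalent average application rates in mm/day
spray_daily = spray_mm * 24.0 / spray_hours
drip_daily = drip_mm * 24.0 / drip_hours
print(round(ratio, 2), round(spray_daily, 2), round(drip_daily, 2))
```

On a daily basis the drip schedule applies roughly a quarter of the spray schedule's water, an even larger saving than the per-occurrence 46% suggests.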

  20. Validation of Power Requirement Model for Active Loudspeakers

    DEFF Research Database (Denmark)

    Schneider, Henrik; Madsen, Anders Normann; Bjerregaard, Ruben

    2015-01-01

    ... There are, however, many advantages that could be harvested from such knowledge, such as size, cost, and efficiency improvements. In this paper, a recently proposed power requirement model for active loudspeakers is experimentally validated, and the model is expanded to include the closed and vented type enclosures...

  1. Business Process Simulation: Requirements for Business and Resource Models

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2015-07-01

    Full Text Available The purpose of Business Process Model and Notation (BPMN) is to provide an easily understandable graphical representation of business processes. Thus BPMN is widely used and applied in various areas, one of them being business process simulation. This paper addresses some BPMN model based business process simulation problems. The paper formulates requirements for business process and resource models that enable their use for business process simulation.

  2. Reuse-centric Requirements Analysis with Task Models, Scenarios, and Critical Parameters

    Directory of Open Access Journals (Sweden)

    Cyril Montabert

    2007-02-01

    Full Text Available This paper outlines a requirements-analysis process that unites task models, scenarios, and critical parameters to exploit and generate reusable knowledge at the requirements phase. Through the deployment of a critical-parameter-based approach to task modeling, the process yields an integrative and formalized model derived from scenarios that can be used for requirements characterization. Furthermore, not only can this entity serve as an interface to a knowledge repository relying on a critical-parameter-based taxonomy to support reuse, but its characterization in terms of critical parameters also allows the model to constitute a broader reuse solution. We discuss our vision for a user-centric and reuse-centric approach to requirements analysis, present previous efforts associated with this line of work, and describe the revisions made to extend the reuse potential and effectiveness of a previous iteration of a requirements tool implementing this process. Finally, the paper describes the sequence and nature of the activities involved in the conduct of our proposed requirements-analysis technique, concluding by previewing ongoing work that will explore the feasibility of designers using our approach.

  3. Models of Human Information Requirements: "When Reasonable Aiding Systems Disagree"

    Science.gov (United States)

    Corker, Kevin; Pisanich, Gregory; Shafto, Michael (Technical Monitor)

    1994-01-01

    Aircraft flight management and Air Traffic Control (ATC) automation are under development to maximize the economy of flight and to increase the capacity of the terminal area airspace while maintaining levels of flight safety equal to or better than current system performance. These goals are being realized by the introduction of flight management automation aiding and operations support systems on the flight deck and by new developments of ATC aiding systems that seek to optimize scheduling of aircraft while potentially reducing required separation and accounting for weather and wake vortex turbulence. Aiding systems on both the flight deck and the ground operate through algorithmic functions on models of the aircraft and of the airspace. These models may differ from each other as a result of variations in their models of the immediate environment. The resultant flight operations or ATC commands may differ in their response requirements (e.g. different preferred descent speeds or descent initiation points). The human operators in the system must then interact with the automation to reconcile differences and resolve conflicts. We have developed a model of human performance including cognitive functions (decision-making, rule-based reasoning, procedural interruption recovery and forgetting) that supports analysis of the information requirements for resolution of flight aiding and ATC conflicts. The model represents multiple individuals in the flight crew and in ATC. The model is supported in simulation on a Silicon Graphics' workstation using Allegro Lisp. Design guidelines for aviation automation aiding systems have been developed using the model's specification of information and team procedural requirements. Empirical data on flight deck operations from full-mission flight simulation are provided to support the model's predictions. The paper describes the model, its development and implementation, the simulation test of the model predictions, and the empirical

  4. Requirements on mechanistic NPP models used in CSS for diagnostics and predictions

    International Nuclear Information System (INIS)

    Juslin, K.

    1996-01-01

    Mechanistic models have been used successfully for several years for operators' support in electric power dispatching centres. Some models of limited scope have already been in use at nuclear power plants. It is considered that advanced mechanistic models, in combination with present computer technology, could preferably be used in Computerized Support Systems (CSS) for the assistance of Nuclear Power Plant (NPP) operators. Requirements with respect to accuracy, validity range, speed, flexibility, and level of detail on the models used for such purposes are discussed. Quality Assurance, Verification and Validation efforts are considered. A long-term commitment in the field of mechanistic modelling and real-time simulation is considered the key to successful implementations. The Advanced PROcess Simulation (APROS) code system and simulation environment developed at the Technical Research Centre of Finland (VTT) is intended also for CSS applications in NPP control rooms. (author). 4 refs

  5. Modeling Requirements for Simulating the Effects of Extreme Acts of Terrorism: A White Paper

    Energy Technology Data Exchange (ETDEWEB)

    Allen, M.; Hiebert-Dodd, K.; Marozas, D.; Paananen, O.; Pryor, R.J.; Reinert, R.K.

    1998-10-01

    This white paper presents the initial requirements for developing a new computer model for simulating the effects of extreme acts of terrorism in the United States. General characteristics of the model are proposed and the level of effort to prepare a complete written description of the model, prior to coding, is detailed. The model would simulate the decision processes and interactions of complex U. S. systems engaged in responding to and recovering from four types of terrorist incidents. The incident scenarios span the space of extreme acts of terrorism that have the potential to affect not only the impacted area, but also the entire nation. The model would be useful to decision-makers in assessing and analyzing the vulnerability of the nation's complex infrastructures, in prioritizing resources to reduce risk, and in planning strategies for immediate response and for subsequent recovery from terrorist incidents.

  6. A proposal for a coordinated effort for the determination of brainwide neuroanatomical connectivity in model organisms at a mesoscopic scale.

    Directory of Open Access Journals (Sweden)

    Jason W Bohland

    2009-03-01

    Full Text Available In this era of complete genomes, our knowledge of neuroanatomical circuitry remains surprisingly sparse. Such knowledge is critical, however, for both basic and clinical research into brain function. Here we advocate for a concerted effort to fill this gap, through systematic, experimental mapping of neural circuits at a mesoscopic scale of resolution suitable for comprehensive, brainwide coverage, using injections of tracers or viral vectors. We detail the scientific and medical rationale and briefly review existing knowledge and experimental techniques. We define a set of desiderata, including brainwide coverage; validated and extensible experimental techniques suitable for standardization and automation; centralized, open-access data repository; compatibility with existing resources; and tractability with current informatics technology. We discuss a hypothetical but tractable plan for mouse, additional efforts for the macaque, and technique development for human. We estimate that the mouse connectivity project could be completed within five years with a comparatively modest budget.

  7. Formal Requirements Modeling for Reactive Systems with Coloured Petri Nets

    DEFF Research Database (Denmark)

    Tjell, Simon

    This dissertation presents the contributions of seven publications all concerned with the application of Coloured Petri Nets (CPN) to requirements modeling for reactive systems. The publications are introduced along with relevant background material and related work, and their contributions...... to take into concern that the behavior of human actors is less likely to be predictable than the behavior of e.g. mechanical components.   In the second approach, the CPN model is parameterized and utilizes a generic and reusable CPN module operating as an SD interpreter. In addition to distinguishing...... and events. A tool is presented that allows automated validation of the structure of CPN models with respect to the guidelines. Next, three publications on integrating Jackson's Problem Frames with CPN requirements models are presented: The first publication introduces a method for systematically structuring...

  8. Model-based human reliability analysis: prospects and requirements

    International Nuclear Information System (INIS)

    Mosleh, A.; Chang, Y.H.

    2004-01-01

    Major limitations of the conventional methods for human reliability analysis (HRA), particularly those developed for operator response analysis in probabilistic safety assessments (PSA) of nuclear power plants, are summarized as a motivation for the need and a basis for developing requirements for the next generation HRA methods. It is argued that a model-based approach that provides explicit cognitive causal links between operator behaviors and directly or indirectly measurable causal factors should be at the core of the advanced methods. An example of such a causal model is briefly reviewed; due to its complexity and input requirements, it can currently be implemented only in a dynamic PSA environment. The computer simulation code developed for this purpose is also described briefly, together with current limitations in the models, data, and the computer implementation.

  9. Critical Business Requirements Model and Metrics for Intranet ROI

    OpenAIRE

    Luqi; Jacoby, Grant A.

    2005-01-01

    Journal of Electronic Commerce Research, Vol. 6, No. 1, pp. 1-30. This research provides the first theoretical model, the Intranet Efficiency and Effectiveness Model (IEEM), to measure intranet overall value contributions based on a corporation’s critical business requirements by applying a balanced baseline of metrics and conversion ratios linked to key business processes of knowledge workers, IT managers and business decision makers -- in effect, closing the gap of understanding...

  10. NASA Standard for Models and Simulations: Philosophy and Requirements Overview

    Science.gov (United States)

    Blattnig, Steve R.; Luckring, James M.; Morrison, Joseph H.; Sylvester, Andre J.; Tripathi, Ram K.; Zang, Thomas A.

    2013-01-01

    Following the Columbia Accident Investigation Board report, the NASA Administrator chartered an executive team (known as the Diaz Team) to identify those CAIB report elements with NASA-wide applicability and to develop corrective measures to address each element. One such measure was the development of a standard for the development, documentation, and operation of models and simulations. This report describes the philosophy and requirements overview of the resulting NASA Standard for Models and Simulations.

  11. Generic skills requirements (KSA model) towards future mechanical ...

    African Journals Online (AJOL)

    ... Statistics and Discriminant Analysis (DA) as required to achieve the objective of the study. This study will guide all future engineers, especially in the field of Mechanical Engineering in Malaysia to penetrate the job market according to the current market needs. Keywords: generic skills; KSA model; mechanical engineers; ...

  12. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  13. Models of protein and amino acid requirements for cattle

    Directory of Open Access Journals (Sweden)

    Luis Orlindo Tedeschi

    2015-03-01

    Full Text Available Protein supply and requirements by ruminants have been studied for more than a century. These studies led to the accumulation of a large body of scientific information about the digestion and metabolism of protein by ruminants, as well as the characterization of dietary protein in order to maximize animal performance. During the 1980s and 1990s, when computers became more accessible and powerful, scientists began to conceptualize and develop mathematical nutrition models, and to program them into computers to assist with ration balancing and formulation for domesticated ruminants, specifically dairy and beef cattle. The most commonly known nutrition models developed during this period were those of the National Research Council (NRC) in the United States, the Agricultural Research Council (ARC) in the United Kingdom, the Institut National de la Recherche Agronomique (INRA) in France, and the Commonwealth Scientific and Industrial Research Organization (CSIRO) in Australia. Others were derivative works from these models with different degrees of modification in the supply or requirement calculations, and in the modeling nature (e.g., static or dynamic, mechanistic or deterministic). Circa the 1990s, most models adopted the metabolizable protein (MP) system over the crude protein (CP) and digestible CP systems to estimate the supply of MP, and the factorial system to calculate the MP required by the animal. The MP system included two portions of protein (i.e., the rumen-undegraded dietary CP (RUP) and the contributions of microbial CP (MCP)) as the main sources of MP for the animal. Some models would explicitly account for the impact of dry matter intake (DMI) on the MP required for maintenance (MPm; e.g., the Cornell Net Carbohydrate and Protein System (CNCPS) and the Dutch DVE/OEB system), while others would simply account for scurf, urinary, metabolic fecal, and endogenous contributions independently of DMI. All models included milk yield and its components in estimating MP required for lactation

  14. Ethnicity, Effort, Self-Efficacy, Worry, and Statistics Achievement in Malaysia: A Construct Validation of the State-Trait Motivation Model

    Science.gov (United States)

    Awang-Hashim, Rosa; O'Neil, Harold F., Jr.; Hocevar, Dennis

    2002-01-01

    The relations between motivational constructs, effort, self-efficacy and worry, and statistics achievement were investigated in a sample of 360 undergraduates in Malaysia. Both trait (cross-situational) and state (task-specific) measures of each construct were used to test a mediational trait → state → performance (TSP) model. As hypothesized,…

  15. Required experimental accuracy to select between supersymmetrical models

    Science.gov (United States)

    Grellscheid, David

    2004-03-01

    We will present a method to decide a priori whether various supersymmetrical scenarios can be distinguished based on sparticle mass data alone. For each model, a scan over all free SUSY breaking parameters reveals the extent of that model's physically allowed region of sparticle-mass-space. Based on the geometrical configuration of these regions in mass-space, it is possible to obtain an estimate of the required accuracy of future sparticle mass measurements to distinguish between the models. We will illustrate this algorithm with an example. This talk is based on work done in collaboration with B C Allanach (LAPTH, Annecy) and F Quevedo (DAMTP, Cambridge).
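The geometric criterion described above (comparing models' allowed regions in sparticle-mass-space against measurement accuracy) can be illustrated with a toy computation; the mass points and the uncertainty value below are invented for illustration, whereas the actual analysis scans the full SUSY-breaking parameter space of each model:

```python
def min_separation(region_a, region_b):
    """Smallest Euclidean distance between sampled mass vectors of two
    models' physically allowed regions (brute force over the samples)."""
    return min(
        sum((x - y) ** 2 for x, y in zip(p, q)) ** 0.5
        for p in region_a for q in region_b
    )

# Toy sampled (gluino mass, neutralino mass) points in GeV for two
# hypothetical models' allowed regions.
model_a = [(600.0, 100.0), (650.0, 120.0)]
model_b = [(700.0, 160.0), (720.0, 180.0)]

sigma = 20.0  # assumed combined mass-measurement uncertainty (GeV)
sep = min_separation(model_a, model_b)
print(sep > sigma)  # True: the regions are separated beyond the uncertainty
```

If the minimum separation exceeds the expected measurement uncertainty, mass data alone suffice to distinguish the scenarios; overlapping regions would require additional observables.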

  16. Environmental challenges and opportunities of the evolving North American electricity market : European electricity generating facilities: an overview of European regulatory requirements and standardization efforts

    International Nuclear Information System (INIS)

    Nichols, L.

    2002-06-01

    Several factors are affecting power generating facilities, such as the opening of both electricity and gas markets, and the pressure applied on generators and governments to ensure a steady energy supply for consumers. An additional factor is the pressure for the closing of nuclear power facilities. European siting and emissions requirements for coal-fired and natural gas generating facilities were presented in this background paper. In addition, the author provided an overview of the standardization process in place in Europe. The European Union and its functioning were briefly described, as well as a listing of relevant organizations. The current trends were examined. The document first introduced the European Union, and the next section dealt with Regulatory regime: the internal energy market. The third section examined the issue of Regulatory regime: generation and environmental regulations. Section four presented environmental management systems, followed by a section on standardization. Section six discussed European organizations involved in electricity issues, while the following section dealt with European commission programs. The last section briefly looked at the trends in the electricity sector, broaching topics such as compliance, electricity generation, and emissions trading. 52 refs., 2 tabs

  17. Cognition and procedure representational requirements for predictive human performance models

    Science.gov (United States)

    Corker, K.

    1992-01-01

    Models and modeling environments for human performance are becoming significant contributors to early system design and analysis procedures. Issues of levels of automation, physical environment, informational environment, and manning requirements are being addressed by such man/machine analysis systems. The research reported here investigates the close interaction between models of human cognition and models that describe procedural performance. We describe a methodology for the decomposition of aircrew procedures that supports interaction with models of cognition on the basis of procedures observed; that serves to identify cockpit/avionics information sources and crew information requirements; and that provides the structure to support methods for function allocation among crew and aiding systems. Our approach is to develop an object-oriented, modular, executable software representation of the aircrew, the aircraft, and the procedures necessary to satisfy flight-phase goals. We then encode, in a time-based language, taxonomies of the conceptual, relational, and procedural constraints among the cockpit avionics and control system and the aircrew. We have designed and implemented a goals/procedures hierarchic representation sufficient to describe procedural flow in the cockpit. We then execute the procedural representation in simulation software and calculate the values of the flight instruments, aircraft state variables and crew resources using the constraints available from the relationship taxonomies. The system provides a flexible, extensible, manipulable, and executable representation of aircrew and procedures that is generally applicable to crew/procedure task analysis. The representation supports developed methods of intent inference, and is extensible to include issues of information requirements and functional allocation. We are attempting to link the procedural representation to models of cognitive functions to establish several intent inference methods
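A goals/procedures hierarchic representation of the kind described could, in minimal form, look like the following sketch; the class names, fields, and the flight-phase fragment are illustrative assumptions, not the paper's actual software:

```python
from dataclasses import dataclass, field

@dataclass
class Procedure:
    name: str
    duration_s: float  # nominal execution time in seconds

@dataclass
class Goal:
    name: str
    subgoals: list = field(default_factory=list)    # child Goal objects
    procedures: list = field(default_factory=list)  # Procedures satisfying this goal

    def total_duration(self):
        """Sum nominal procedure durations over the whole goal tree."""
        return (sum(p.duration_s for p in self.procedures)
                + sum(g.total_duration() for g in self.subgoals))

# Toy flight-phase fragment: an approach goal with a descent subgoal
descent = Goal("initiate_descent",
               procedures=[Procedure("set_target_altitude", 4.0),
                           Procedure("engage_vnav", 2.5)])
approach = Goal("approach", subgoals=[descent],
                procedures=[Procedure("brief_approach", 30.0)])
print(approach.total_duration())  # 36.5
```

Walking such a tree during simulation is one way the procedural flow and time-based constraints described above could be executed and queried.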

  18. Information Models, Data Requirements, and Agile Data Curation

    Science.gov (United States)

    Hughes, John S.; Crichton, Dan; Ritschel, Bernd; Hardman, Sean; Joyner, Ron

    2015-04-01

    The Planetary Data System's next generation system, PDS4, is an example of the successful use of an ontology-based Information Model (IM) to drive the development and operations of a data system. In traditional systems engineering, requirements or statements about what is necessary for the system are collected and analyzed for input into the design stage of systems development. With the advent of big data, the requirements associated with data have begun to dominate, and an ontology-based information model can be used to provide a formalized and rigorous set of data requirements. These requirements address not only the usual issues of data quantity, quality, and disposition but also data representation, integrity, provenance, context, and semantics. In addition, the use of these data requirements during system development has many characteristics of Agile Curation as proposed by Young et al. [Taking Another Look at the Data Management Life Cycle: Deconstruction, Agile, and Community, AGU 2014], namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. For example, customers can be satisfied through early and continuous delivery of system software and services that are configured directly from the information model. This presentation will describe the PDS4 architecture and its three principal parts: the ontology-based Information Model (IM), the federated registries and repositories, and the REST-based service layer for search, retrieval, and distribution. The development of the IM will be highlighted with special emphasis on knowledge acquisition, the impact of the IM on development and operations, and the use of shared ontologies at multiple governance levels to promote system interoperability and data correlation.

  19. Reactor core modeling practice: Operational requirements, model characteristics, and model validation

    International Nuclear Information System (INIS)

    Zerbino, H.

    1997-01-01

    The physical models implemented in power plant simulators have greatly increased in performance and complexity in recent years. This process has been enabled by the ever increasing computing power available at affordable prices. This paper describes this process from several angles: first, the operational requirements that are most critical from the point of view of model performance, both for normal and off-normal operating conditions; second, core model characteristics in the light of the solutions implemented by Thomson Training and Simulation (TT and S) in several full-scope simulators recently built and delivered for Dutch, German, and French nuclear power plants; and finally, the model validation procedures, which are of course an integral part of model development and which are becoming more and more severe as performance expectations increase. As a conclusion, it may be asserted that in the core modeling field, as in other areas, the general improvement in the quality of simulation codes has resulted in a fairly rapid convergence towards mainstream engineering-grade calculations. This is a remarkable performance in view of the stringent real-time requirements which the simulation codes must satisfy as well as the extremely wide range of operating conditions that they are called upon to cover with good accuracy. (author)

  20. Cultural context in the effort to improve oral health among Alaska Native people: the dental health aide therapist model.

    Science.gov (United States)

    Wetterhall, Scott; Burrus, Barri; Shugars, Daniel; Bader, James

    2011-10-01

    The Alaska Native people in rural Alaska face serious challenges in obtaining dental care. Itinerant care models have failed to meet their needs for more than 50 years. The dental health aide therapist (DHAT) model, which entails training midlevel care providers to perform limited restorative, surgical, and preventive procedures, was adopted to address some of the limitations of the itinerant model. We used quantitative and qualitative methods to assess residents' satisfaction with the model and the role of DHATs in the cultural context in which they operate. Our findings suggest that the DHAT model can provide much-needed access to urgent care and is beneficial from a comprehensive cultural perspective.

  1. A Stage-Structured Prey-Predator Fishery Model In The Presence Of Toxicity With Taxation As A Control Parameter of Harvesting Effort

    Directory of Open Access Journals (Sweden)

    Sumit Kaur Bhatia

    2017-08-01

    Full Text Available In this paper we have considered a stage-structured fishery model in the presence of toxicity, where stocks are diminishing due to the current excessive use of fishing effort, with devastating consequences. The purpose of this study is to propose a bio-economic mathematical model that introduces taxes on the profit per unit biomass of the harvested fish of each species, with the intention of controlling fishing effort in the presence of toxicity. We obtained both boundary and interior equilibrium points, along with the conditions ensuring their validity. Local stability of the interior equilibrium point has been established by the trace-determinant criterion, and global stability has been analyzed through a suitable Lyapunov function. We have also obtained the optimal harvesting policy with the help of Pontryagin's maximum principle. Lastly, numerical simulation has been carried out with the help of MATLAB, and thus the results of the formulated model have been established.
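
    As a toy illustration of the taxation mechanism (not the paper's stage-structured, toxicity-dependent equations), a single-species open-access harvest model shows how a tax on per-unit revenue damps effort growth and leaves a larger standing stock at equilibrium. All parameter values below are hypothetical.

```python
# Single-species sketch: logistic prey dynamics with harvest, and effort
# that grows in proportion to after-tax profit (open-access dynamics).
# Interior equilibrium stock is x* = c / ((1 - tau) * p * q), so a higher
# tax tau implies a larger equilibrium stock.

def simulate(r=1.0, K=100.0, q=0.05, p=2.0, c=1.0, tau=0.3,
             lam=0.1, x0=50.0, E0=5.0, dt=0.01, steps=20000):
    """Forward-Euler integration of stock x and effort E."""
    x, E = x0, E0
    for _ in range(steps):
        dx = r * x * (1 - x / K) - q * E * x          # growth minus harvest
        dE = lam * ((1 - tau) * p * q * x - c) * E    # effort follows after-tax profit
        x = max(x + dt * dx, 0.0)
        E = max(E + dt * dE, 0.0)
    return x, E

x_taxed, E_taxed = simulate(tau=0.3)
x_free, E_free = simulate(tau=0.0)
print(x_taxed, x_free)  # taxed stock settles higher than the untaxed stock
```

    With these illustrative parameters the untaxed stock converges near x* = 10 while the taxed stock converges near x* ≈ 14.3, which is the qualitative effect taxation is meant to achieve.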

  2. The Benefit of Ambiguity in Understanding Goals in Requirements Modelling

    DEFF Research Database (Denmark)

    Paay, Jeni; Pedell, Sonja; Sterling, Leon

    2011-01-01

    This paper examines the benefit of ambiguity in describing goals in requirements modelling for the design of socio-technical systems using concepts from Agent-Oriented Software Engineering (AOSE) and ethnographic and cultural probe methods from Human Computer Interaction (HCI). The authors’ aim...... ambiguity in the process of elicitation and analysis through the use of empirically informed quality goals attached to functional goals. The authors demonstrate the benefit of articulating a quality goal without turning it into a functional goal. Their study shows that quality goals kept at a high level...... of abstraction, ambiguous and open for conversations through the modelling process add richness to goal models, and communicate quality attributes of the interaction being modelled to the design phase, where this ambiguity is regarded as a resource for design....

  3. Modelling requirements for future assessments based on FEP analysis

    International Nuclear Information System (INIS)

    Locke, J.; Bailey, L.

    1998-01-01

    This report forms part of a suite of documents describing the Nirex model development programme. The programme is designed to provide a clear audit trail from the identification of significant features, events and processes (FEPs) to the models and modelling processes employed within a detailed safety assessment. A scenario approach to performance assessment has been adopted. It is proposed that potential evolutions of a deep geological radioactive waste repository can be represented by a base scenario and a number of variant scenarios. The base scenario is chosen to be broad-ranging and to represent the natural evolution of the repository system and its surrounding environment. The base scenario is defined to include all those FEPs that are certain to occur and those which are judged likely to occur for a significant period of the assessment timescale. The structuring of FEPs on a Master Directed Diagram (MDD) provides a systematic framework for identifying those FEPs that form part of the natural evolution of the system and those which may define alternative potential evolutions of the repository system. In order to construct a description of the base scenario, FEPs have been grouped into a series of conceptual models. Conceptual models are groups of FEPs, identified from the MDD, representing a specific component or process within the disposal system. It has been found appropriate to define conceptual models in terms of the three main components of the disposal system: the repository engineered system, the surrounding geosphere and the biosphere. For each of these components, conceptual models provide a description of the relevant subsystem in terms of its initial characteristics, subsequent evolution and the processes affecting radionuclide transport for the groundwater and gas pathways. The aim of this document is to present the methodology that has been developed for deriving modelling requirements and to illustrate the application of the methodology by

  4. Requirements for High Level Models Supporting Design Space Exploration in Model-based Systems Engineering

    OpenAIRE

    Haveman, Steven P.; Bonnema, G. Maarten

    2013-01-01

    Most formal models are used in detailed design and focus on a single domain. Few approaches exist that can effectively tie these lower level models to a high level system model during design space exploration. This complicates the validation of high level system requirements during detailed design. In this paper, we define requirements for a high level model that is firstly driven by key systems engineering challenges present in industry and secondly connects to several formal and d...

  5. A community effort towards a knowledge-base and mathematical model of the human pathogen Salmonella Typhimurium LT2

    Directory of Open Access Journals (Sweden)

    Shin Sook-Il

    2011-01-01

    Full Text Available Abstract Background Metabolic reconstructions (MRs) are common denominators in systems biology and represent biochemical, genetic, and genomic (BiGG) knowledge-bases for target organisms by capturing currently available information in a consistent, structured manner. Salmonella enterica subspecies I serovar Typhimurium is a human pathogen, causes various diseases and its increasing antibiotic resistance poses a public health problem. Results Here, we describe a community-driven effort, in which more than 20 experts in S. Typhimurium biology and systems biology collaborated to reconcile and expand the S. Typhimurium BiGG knowledge-base. The consensus MR was obtained starting from two independently developed MRs for S. Typhimurium. Key results of this reconstruction jamboree include i) development and implementation of a community-based workflow for MR annotation and reconciliation; ii) incorporation of thermodynamic information; and iii) use of the consensus MR to identify potential multi-target drug therapy approaches. Conclusion Taken together, with the growing number of parallel MRs a structured, community-driven approach will be necessary to maximize quality while increasing adoption of MRs in experimental design and interpretation.

  6. A community effort towards a knowledge-base and mathematical model of the human pathogen Salmonella Typhimurium LT2.

    Science.gov (United States)

    Thiele, Ines; Hyduke, Daniel R; Steeb, Benjamin; Fankam, Guy; Allen, Douglas K; Bazzani, Susanna; Charusanti, Pep; Chen, Feng-Chi; Fleming, Ronan M T; Hsiung, Chao A; De Keersmaecker, Sigrid C J; Liao, Yu-Chieh; Marchal, Kathleen; Mo, Monica L; Özdemir, Emre; Raghunathan, Anu; Reed, Jennifer L; Shin, Sook-il; Sigurbjörnsdóttir, Sara; Steinmann, Jonas; Sudarsan, Suresh; Swainston, Neil; Thijs, Inge M; Zengler, Karsten; Palsson, Bernhard O; Adkins, Joshua N; Bumann, Dirk

    2011-01-18

    Metabolic reconstructions (MRs) are common denominators in systems biology and represent biochemical, genetic, and genomic (BiGG) knowledge-bases for target organisms by capturing currently available information in a consistent, structured manner. Salmonella enterica subspecies I serovar Typhimurium is a human pathogen, causes various diseases and its increasing antibiotic resistance poses a public health problem. Here, we describe a community-driven effort, in which more than 20 experts in S. Typhimurium biology and systems biology collaborated to reconcile and expand the S. Typhimurium BiGG knowledge-base. The consensus MR was obtained starting from two independently developed MRs for S. Typhimurium. Key results of this reconstruction jamboree include i) development and implementation of a community-based workflow for MR annotation and reconciliation; ii) incorporation of thermodynamic information; and iii) use of the consensus MR to identify potential multi-target drug therapy approaches. Taken together, with the growing number of parallel MRs a structured, community-driven approach will be necessary to maximize quality while increasing adoption of MRs in experimental design and interpretation.

  7. A community effort towards a knowledge-base and mathematical model of the human pathogen Salmonella Typhimurium LT2

    Energy Technology Data Exchange (ETDEWEB)

    Thiele, Ines; Hyduke, Daniel R.; Steeb, Benjamin; Fankam, Guy; Allen, Douglas K.; Bazzani, Susanna; Charusanti, Pep; Chen, Feng-Chi; Fleming, Ronan MT; Hsiung, Chao A.; De Keersmaecker, Sigrid CJ; Liao, Yu-Chieh; Marchal, Kathleen; Mo, Monica L.; Özdemir, Emre; Raghunathan, Anu; Reed, Jennifer L.; Shin, Sook-Il; Sigurbjörnsdóttir, Sara; Steinmann, Jonas; Sudarsan, Suresh; Swainston, Neil; Thijs, Inge M.; Zengler, Karsten; Palsson, Bernhard O.; Adkins, Joshua N.; Bumann, Dirk

    2011-01-01

    Metabolic reconstructions (MRs) are common denominators in systems biology and represent biochemical, genetic, and genomic (BiGG) knowledge-bases for target organisms by capturing currently available information in a consistent, structured manner. Salmonella enterica subspecies I serovar Typhimurium is a human pathogen, causes various diseases and its increasing antibiotic resistance poses a public health problem. Here, we describe a community-driven effort, in which more than 20 experts in S. Typhimurium biology and systems biology collaborated to reconcile and expand the S. Typhimurium BiGG knowledge-base. The consensus MR was obtained starting from two independently developed MRs for S. Typhimurium. Key results of this reconstruction jamboree include i) development and implementation of a community-based workflow for MR annotation and reconciliation; ii) incorporation of thermodynamic information; and iii) use of the consensus MR to identify potential multi-target drug therapy approaches. Finally, taken together, with the growing number of parallel MRs a structured, community-driven approach will be necessary to maximize quality while increasing adoption of MRs in experimental design and interpretation.

  8. [Psychosocial stress and disease risks in occupational life. Results of international studies on the demand-control and the effort-reward imbalance models].

    Science.gov (United States)

    Siegrist, J; Dragano, N

    2008-03-01

    Given the far-reaching changes of modern working life, psychosocial stress at work has received increased attention. Its influence on stress-related disease risks is analysed with the help of standardised measurements based on theoretical models. Two such models have gained special prominence in recent years, the demand-control model and the effort-reward imbalance model. The former model places its emphasis on a distinct combination of job characteristics, whereas the latter model's focus is on the imbalance between efforts spent and rewards received in return. The predictive power of these models with respect to coronary or cardiovascular disease and depression was tested in a number of prospective epidemiological investigations. In summary, twofold elevated disease risks are observed. Effects on cardiovascular disease are particularly pronounced among men, whereas no gender differences are observed for depression. Additional evidence derived from experimental and ambulatory monitoring studies supplements this body of findings. Current scientific evidence justifies an increased awareness and assessment of these newly discovered occupational risks, in particular by occupational health professionals. Moreover, structural and interpersonal measures of stress prevention and health promotion at work are warranted, with special emphasis on gender differences.

  9. Breckinridge Project, initial effort

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    1982-09-01

    Report III, Volume 1 contains those specifications numbered A through J, as follows: General Specifications (A); Specifications for Pressure Vessels (C); Specifications for Tanks (D); Specifications for Exchangers (E); Specifications for Fired Heaters (F); Specifications for Pumps and Drivers (G); and Specifications for Instrumentation (J). The standard specifications of Bechtel Petroleum Incorporated have been amended as necessary to reflect the specific requirements of the Breckinridge Project, and the more stringent specifications of Ashland Synthetic Fuels, Inc. These standard specifications are available to the Initial Effort (Phase Zero) work performed by all contractors and subcontractors. Report III, Volume 1 also contains the unique specifications prepared for Plants 8, 15, and 27. These specifications will be substantially reviewed during Phase I of the project, and modified as necessary for use during the engineering, procurement, and construction of this project.

  10. Improving fishing effort descriptors: Modelling engine power and gear-size relations of five European trawl fleets

    DEFF Research Database (Denmark)

    Eigaard, Ole Ritzau; Rihan, Dominic; Graham, Norman

    2011-01-01

    Based on information from an international inventory of gears currently deployed by trawlers in five European countries, the relationship between vessel engine power and trawl size is quantified for different trawl types, trawling techniques and target species. Using multiplicative modelling it i...
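
    The multiplicative (power-law) relation between engine power and gear size can be fitted by ordinary least squares on the log-log scale. The data below are synthetic, generated from a known power law so the fit is exactly recoverable; they are not the inventory used in the study.

```python
# Fit the multiplicative relation  gear_size = a * kW**b  by linear
# regression of log(gear_size) on log(kW).

import math

def fit_power_law(power_kw, gear_size):
    """Return (a, b) minimising squared error of log(gear) = log(a) + b*log(kW)."""
    xs = [math.log(p) for p in power_kw]
    ys = [math.log(g) for g in gear_size]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b

# Hypothetical vessels: engine power (kW) vs. a gear-size measure (m),
# generated from gear = 5 * kW**0.5.
kw = [100, 200, 400, 800]
gear = [5 * p ** 0.5 for p in kw]
a, b = fit_power_law(kw, gear)
print(round(a, 3), round(b, 3))  # 5.0 0.5
```

    On real inventory data the residual scatter around the fitted line is what distinguishes trawl types, techniques, and target species.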

  11. Modeling the effects of promotional efforts on aggregate pharmaceutical demand : What we know and challenges for the future

    NARCIS (Netherlands)

    Wieringa, J.E.; Osinga, E.C.; Conde, E.R.; Leeflang, P.S.H.; Stern, P.; Ding, M.; Eliashberg, J.; Stremersch, S.

    2014-01-01

    Pharmaceutical marketing is becoming an important area of research in its own right, as evidenced by the steady increase in relevant papers published in the major marketing journals in recent years. These papers utilize different modeling techniques and types of data. In this chapter we focus on

  12. Baseline requirements of the proposed action for the Transportation Management Division routing models

    International Nuclear Information System (INIS)

    Johnson, P.E.; Joy, D.S.

    1995-02-01

    The potential impacts associated with the transportation of hazardous materials are important to shippers, carriers, and the general public. This is particularly true for shipments of radioactive material. The shippers are primarily concerned with safety, security, efficiency, and equipment requirements. The carriers are concerned with the potential impact that radioactive shipments may have on their operations--particularly if such materials are involved in an accident. The general public has also expressed concerns regarding the safety of transporting radioactive and other hazardous materials through their communities. Because transportation routes are a central concern in hazardous material transport, the prediction of likely routes is the first step toward resolution of these issues. In response to these routing needs, several models have been developed over the past fifteen years at Oak Ridge National Laboratory (ORNL). The HIGHWAY routing model is used to predict routes for truck transportation, the INTERLINE routing model is used to predict both rail and barge routes, and the AIRPORT locator model is used to determine airports with specified criteria near a specific location. As part of the ongoing improvement of the US Department of Energy's (DOE) Environmental Management Transportation Management Division's (EM-261) computer systems and development efforts, a Baseline Requirements Assessment Session on the HIGHWAY, INTERLINE, and AIRPORT models was held at ORNL on April 27, 1994. The purpose of this meeting was to discuss the existing capabilities of the models and data bases and to review enhancements of the models and data bases to expand their usefulness. The results of the Baseline Requirements Assessment Session will be discussed in this report. The discussions pertaining to the different models are contained in separate sections.
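
    At their core, routing models of this kind search a transportation network for a minimum-impedance path. The HIGHWAY and INTERLINE implementations are far richer (population weighting, road class, modal constraints), but the underlying search can be sketched with a textbook Dijkstra algorithm over a hypothetical network.

```python
# Dijkstra shortest-path search over a toy network; edge weights stand in
# for a generic impedance (distance, population exposure, road class, ...).

import heapq

def shortest_path(graph, src, dst):
    """graph: {node: [(neighbor, cost), ...]}; returns (cost, path)."""
    queue = [(0.0, src, [src])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, w in graph.get(node, []):
            if nbr not in seen:
                heapq.heappush(queue, (cost + w, nbr, path + [nbr]))
    return float("inf"), []

# Hypothetical highway segments with impedances.
net = {"A": [("B", 4), ("C", 2)],
       "C": [("B", 1), ("D", 7)],
       "B": [("D", 3)]}
cost, route = shortest_path(net, "A", "D")
print(cost, route)  # 6.0 ['A', 'C', 'B', 'D']
```

    Route *prediction* then amounts to choosing an impedance function that reproduces the routes carriers actually take, which is where most of the modeling effort in systems like HIGHWAY goes.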

  13. Life sciences research in space: The requirement for animal models

    Science.gov (United States)

    Fuller, C. A.; Philips, R. W.; Ballard, R. W.

    1987-01-01

    Use of animals in NASA space programs is reviewed. Animals are needed because life science experimentation frequently requires long-term controlled exposure to environments, statistical validation, invasive instrumentation or biological tissue sampling, tissue destruction, exposure to dangerous or unknown agents, or sacrifice of the subject. The availability and use of human subjects inflight is complicated by the multiple needs and demands upon crew time. Because only living organisms can sense, integrate and respond to the environment around them, the sole use of tissue culture and computer models is insufficient for understanding the influence of the space environment on intact organisms. Equipment for spaceborne experiments with animals is described.

  14. Effort rights-based management

    DEFF Research Database (Denmark)

    Squires, Dale; Maunder, Mark; Allen, Robin

    2017-01-01

    Effort rights-based fisheries management (RBM) is less widely used than catch rights, whether for groups or individuals. Because RBM on catch or effort necessarily requires a total allowable catch (TAC) or total allowable effort (TAE), RBM is discussed in conjunction with issues in assessing fish...... populations and providing TACs or TAEs. Both approaches have advantages and disadvantages, and there are trade-offs between the two approaches. In a narrow economic sense, catch rights are superior because of the type of incentives created, but once the costs of research to improve stock assessments...
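
    The coupling between effort rights and stock assessment noted in the abstract can be made concrete with the standard catch equation C = qEB: translating a TAC into a TAE requires estimates of catchability q and biomass B, so assessment error propagates directly into the effort cap. All numbers below are hypothetical.

```python
# Back-of-the-envelope link between a catch cap (TAC) and an effort cap
# (TAE) under the standard catch equation C = q * E * B.

def tae_from_tac(tac, q, biomass):
    """Effort cap implied by a catch cap, given catchability and biomass."""
    return tac / (q * biomass)

q = 0.0004        # catchability per unit effort (hypothetical)
B = 500_000.0     # assessed biomass, tonnes (hypothetical)
tac = 40_000.0    # total allowable catch, tonnes (hypothetical)

tae = tae_from_tac(tac, q, B)  # ~200 effort units
# If the assessment overstated biomass by 25%, fishing the full TAE would
# land less than the TAC (and an understated assessment would overshoot it):
realized_catch = q * tae * (B / 1.25)
print(tae, realized_catch)
```

    This is one reason the abstract discusses RBM together with stock assessment: the quality of a TAE depends on the same assessments as a TAC, plus an estimate of catchability.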

  15. Experiments in anodic film effects during electrorefining of scrap U-10Mo fuels in support of modeling efforts

    Energy Technology Data Exchange (ETDEWEB)

    Van Kleeck, M. [School of Nuclear Engineering, Purdue University, West Lafayette, IN 47907 (United States); Chemical Sciences and Engineering Division, Argonne National Laboratory, Argonne, IL 60439 (United States); Willit, J.; Williamson, M.A. [Chemical Sciences and Engineering Division, Argonne National Laboratory, Argonne, IL 60439 (United States); Fentiman, A.W. [School of Nuclear Engineering, Purdue University, West Lafayette, IN 47907 (United States)

    2013-07-01

    A monolithic uranium molybdenum alloy clad in zirconium has been proposed as a low enriched uranium (LEU) fuel option for research and test reactors, as part of the Reduced Enrichment for Research and Test Reactors program. Scrap from the fuel's manufacture will contain a significant portion of recoverable LEU. Pyroprocessing has been identified as an option to perform this recovery. A model of a pyroprocessing recovery procedure has been developed to assist in refining the LEU recovery process and designing the facility. Corrosion theory and a two-mechanism transport model were implemented on a MATLAB platform to perform the modeling. In developing this model, improved anodic behavior prediction became necessary, since a dense uranium-rich salt film was observed at the anode surface during electrorefining experiments. Experiments were conducted on uranium metal to determine the film's character and the conditions under which it forms. The electrorefiner salt used in all the experiments was eutectic LiCl/KCl containing UCl{sub 3}. The anodic film material was analyzed with ICP-OES to determine its composition. Both cyclic voltammetry and potentiodynamic scans were conducted at operating temperatures between 475 and 575 °C to interrogate the electrochemical behavior of the uranium. The results show that an anodic film was produced on the uranium electrode. The film initially passivated the surface of the uranium on the working electrode. At high overpotentials, after a trans-passive region, the current observed was nearly equal to the current observed at the initial active level. Analytical results support the presence of K{sub 2}UCl{sub 6} at the uranium surface, within the error of the analytical method.

  16. An effort to improve track and intensity prediction of tropical cyclones through vortex initialization in NCUM-global model

    Science.gov (United States)

    Singh, Vivek; Routray, A.; Mallick, Swapan; George, John P.; Rajagopal, E. N.

    2016-05-01

    Tropical cyclones (TCs) have a strong impact on the socio-economic conditions of countries like India, Bangladesh and Myanmar owing to their devastating power. This brings in the need for a precise forecasting system to predict the tracks and intensities of TCs accurately, well in advance. However, this has been a great challenge for the major operational meteorological centers over the years. The genesis of TCs over the data-sparse warm Tropical Ocean adds further difficulty. Weak and misplaced vortices at the initial time are one of the prime sources of track and intensity errors in Numerical Weather Prediction (NWP) models. Many previous studies have reported that the forecast skill of TC track and intensity improved due to the assimilation of satellite data along with vortex initialization (VI). Keeping this in mind, an attempt has been made to investigate the impact of vortex initialization on the simulation of a TC using the UK Met Office global model operational at NCMRWF (NCUM). This assessment is carried out for the case of the extremely severe cyclonic storm "Chapala", which occurred over the Arabian Sea (AS) from 28 October to 3 November 2015. Two numerical experiments, viz. Vort-GTS (assimilation of GTS observations with VI) and Vort-RAD (same as Vort-GTS with assimilation of satellite data), are carried out. This vortex initialization study in the NCUM model is the first of its type over the North Indian Ocean (NIO). The model simulation of the TC is carried out with five different initial conditions through 24-hour cycles for both experiments. The results indicate that vortex initialization with assimilation of satellite data has a positive impact on the track and intensity forecast, landfall time and position error of the TCs.

  17. Betting on change: Tenet deal with Vanguard shows it's primed to try ACO effort, new payment model.

    Science.gov (United States)

    Kutscher, Beth

    2013-07-01

    Tenet Healthcare Corp.'s acquisition of Vanguard Health Systems is a sign the investor-owned chain is willing to take a chance on alternative payment models such as accountable care organizations. There's no certainty that ACOs will deliver improvements in quality or cost savings, but Vanguard Vice Chairman Keith Pitts, left, says his system's Pioneer ACO in Detroit has already achieved some cost savings.

  18. Modeling of the global carbon cycle - isotopic data requirements

    International Nuclear Information System (INIS)

    Ciais, P.

    1994-01-01

    Isotopes are powerful tools to constrain carbon cycle models. For example, combining the CO2 and 13C budgets allows calculation of the net carbon fluxes between atmosphere, ocean, and biosphere. Observations of natural and bomb-produced radiocarbon allow estimation of gross carbon exchange fluxes between different reservoirs and deduction of the time scales of carbon overturning in important reservoirs. 18O in CO2 is potentially a tool for deconvolving C fluxes within the land biosphere (assimilation vs. respiration). The scope of this article is to identify gaps in our present knowledge about isotopes in the light of their use as constraints on the global carbon cycle. In the following we present a list of some future data requirements for carbon cycle models. (authors)
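
    The CO2/13C "double deconvolution" alluded to above reduces, in its simplest form, to two linear equations in the two unknown net fluxes. The sketch below shows only that algebraic skeleton, with hypothetical numbers; a real deconvolution includes flux-weighted isotopic disequilibrium terms and time-varying signatures.

```python
# Partition a total net flux into ocean and land components using a second,
# isotope-weighted constraint (illustrative signatures in permil).

def partition_fluxes(total, iso_total, d_ocean, d_land):
    """Solve the 2x2 system
         F_ocean + F_land = total
         d_ocean*F_ocean + d_land*F_land = iso_total
       for the net ocean and land fluxes."""
    f_ocean = (iso_total - d_land * total) / (d_ocean - d_land)
    return f_ocean, total - f_ocean

# Hypothetical residual uptake (GtC/yr) after subtracting fossil emissions
# from the observed atmospheric growth, plus its isotope-weighted analogue.
f_ocean, f_land = partition_fluxes(total=3.0, iso_total=-22.0,
                                   d_ocean=-2.0, d_land=-18.0)
print(f_ocean, f_land)  # 2.0 1.0
```

    The separation works because ocean uptake fractionates 13C only weakly while photosynthesis discriminates strongly, so the two constraints are linearly independent.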

  19. A Utility-Based Model for Determining Functional Requirement Target Values (Model Penentuan Nilai Target Functional Requirement Berbasis Utilitas)

    Directory of Open Access Journals (Sweden)

    Cucuk Nur Rosyidi

    2012-01-01

    Full Text Available In a product design and development process, a designer faces the problem of deciding functional requirement (FR) target values. That decision is made under risk, since it is conducted in the early design phase using incomplete information. A utility function can be used to reflect the decision maker's attitude towards risk in making such a decision. In this research, we develop a utility-based model to determine FR target values using a quadratic utility function and information from Quality Function Deployment (QFD). A pencil design is used as a numerical example, with a quadratic utility function for each FR. The model can be applied for balancing customer and designer interests in determining FR target values.
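
    A minimal sketch of the idea, assuming (this is not the paper's exact formulation) that both customer and designer preferences over an FR are expressed as quadratic utilities and combined with QFD-style importance weights. The FR, its candidate values, and all parameters are hypothetical.

```python
# Score candidate FR target values by a weighted sum of quadratic utilities,
# trading off the customer-preferred value against the designer's preference.

def quadratic_utility(x, ideal, tolerance):
    """1 at the ideal value, 0 at ideal +/- tolerance, negative beyond."""
    return 1.0 - ((x - ideal) / tolerance) ** 2

def score(target, customer_ideal, customer_tol,
          designer_ideal, designer_tol,
          w_customer=0.6, w_designer=0.4):
    """Weighted combined utility of a candidate FR target value."""
    return (w_customer * quadratic_utility(target, customer_ideal, customer_tol)
            + w_designer * quadratic_utility(target, designer_ideal, designer_tol))

# Hypothetical FR (e.g. a pencil hardness index): the customer prefers 7,
# the cost-driven designer prefers 5; pick the best target on a grid.
candidates = [5.0, 5.5, 6.0, 6.5, 7.0]
best = max(candidates, key=lambda t: score(t, 7.0, 3.0, 5.0, 3.0))
print(best)  # 6.0
```

    With quadratic utilities the weighted optimum lies between the two ideals, pulled toward whichever side carries more weight; here the continuous optimum is 6.2, and 6.0 is the closest grid candidate.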

  20. Requirements for high level models supporting design space exploration in model-based systems engineering

    NARCIS (Netherlands)

    Haveman, Steven; Bonnema, Gerrit Maarten

    2013-01-01

    Most formal models are used in detailed design and focus on a single domain. Few approaches exist that can effectively tie these lower level models to a high level system model during design space exploration. This complicates the validation of high level system requirements during

  1. A GENERALIZATION OF TRADITIONAL KANO MODEL FOR CUSTOMER REQUIREMENTS ANALYSIS

    Directory of Open Access Journals (Sweden)

    Renáta Turisová

    2015-07-01

    Full Text Available Purpose: The theory of attractiveness determines the relationship between the technically achieved and the customer-perceived quality of product attributes. The most frequently used approach in the theory of attractiveness is the implementation of Kano's model. There exist many generalizations of that model which take into consideration various aspects and approaches focused on understanding customer preferences and identifying customer priorities for a product being sold. The aim of this article is to outline another possible generalization of Kano's model. Methodology/Approach: The traditional Kano's model captures the nonlinear relationship between the achieved attributes of quality and customer requirements. The individual attributes of quality are divided into three main categories: must-be, one-dimensional and attractive quality, and into two side categories: indifferent and reverse quality. A well-selling product has to contain the must-be attributes. It should contain as many one-dimensional attributes as possible. If it also contains supplementary attractive attributes, the attractiveness of the entire product, from the viewpoint of the customer, rises sharply and nonlinearly, which has a direct positive impact on the decision of a potential customer when purchasing the product. In this article, we show that the inclusion of individual quality attributes of a product in the mentioned categories depends, among other things, on the life-cycle costs of the product and on its price on the market. Findings: In practice, we often encounter the inclusion of products in different price categories: lower, middle and upper class. For a certain type of product the category is either directly declared by the producer (especially in the automotive industry) or is determined by the customer by means of an assessment of available market prices. Different customer expectations can be assigned to each of those groups of products
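
    For reference, the classification step that any Kano-style analysis builds on can be sketched with the standard Kano evaluation table (the paper's price-dependent generalization is not reproduced here): an attribute is categorized from a customer's answers to the functional ("how do you feel if it is present?") and dysfunctional ("...if it is absent?") questions.

```python
# Standard Kano evaluation table: rows are the functional answer, columns
# the dysfunctional answer.
# A=attractive, O=one-dimensional, M=must-be, I=indifferent,
# R=reverse, Q=questionable.

ANSWERS = ["like", "must-be", "neutral", "live-with", "dislike"]

TABLE = [
    # like  must-be  neutral  live-with  dislike
    ["Q",   "A",     "A",     "A",       "O"],   # like
    ["R",   "I",     "I",     "I",       "M"],   # must-be
    ["R",   "I",     "I",     "I",       "M"],   # neutral
    ["R",   "I",     "I",     "I",       "M"],   # live-with
    ["R",   "R",     "R",     "R",       "Q"],   # dislike
]

def kano_category(functional, dysfunctional):
    """Look up the Kano category for one respondent's answer pair."""
    return TABLE[ANSWERS.index(functional)][ANSWERS.index(dysfunctional)]

print(kano_category("like", "dislike"))    # O: one-dimensional
print(kano_category("like", "neutral"))    # A: attractive
print(kano_category("neutral", "dislike")) # M: must-be
```

    In practice each attribute is classified by the modal category over many respondents; the generalization the paper proposes would additionally condition this classification on the product's price class.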

  2. Bayesian Analysis for Risk Assessment of Selected Medical Events in Support of the Integrated Medical Model Effort

    Science.gov (United States)

    Gilkey, Kelly M.; Myers, Jerry G.; McRae, Michael P.; Griffin, Elise A.; Kallrui, Aditya S.

    2012-01-01

    The Exploration Medical Capability project is creating a catalog of risk assessments using the Integrated Medical Model (IMM). The IMM is a software-based system intended to assist mission planners in preparing for spaceflight missions by helping them to make informed decisions about medical preparations and supplies needed for combating and treating various medical events using Probabilistic Risk Assessment. The objective is to use statistical analyses to inform the IMM decision tool with estimated probabilities of medical events occurring during an exploration mission. Because data regarding astronaut health are limited, Bayesian statistical analysis is used. Bayesian inference combines prior knowledge, such as data from the general U.S. population, the U.S. Submarine Force, or the analog astronaut population located at the NASA Johnson Space Center, with observed data for the medical condition of interest. The posterior results reflect the best evidence for specific medical events occurring in flight. Bayes theorem provides a formal mechanism for combining available observed data with data from similar studies to support the quantification process. The IMM team performed Bayesian updates on the following medical events: angina, appendicitis, atrial fibrillation, atrial flutter, dental abscess, dental caries, dental periodontal disease, gallstone disease, herpes zoster, renal stones, seizure, and stroke.
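The Bayesian updating described above can be illustrated with the conjugate beta-binomial case: a Beta prior built from counts in an analog population is combined with observed event counts to yield a posterior event probability. All numbers below are invented for illustration and are not IMM data:

```python
# Conjugate beta-binomial update: a Beta(a, b) prior (pseudo-counts of events
# and non-events) combined with observed binomial data.
def beta_binomial_update(a: float, b: float, events: int, trials: int):
    """Return posterior Beta parameters and the posterior mean event probability."""
    a_post = a + events
    b_post = b + (trials - events)
    return a_post, b_post, a_post / (a_post + b_post)

# Hypothetical prior: 2 events in 100 person-years from an analog population,
# updated with 1 observed event in 50 person-years of flight-relevant data.
a_post, b_post, p_mean = beta_binomial_update(2.0, 98.0, events=1, trials=50)
print(a_post, b_post, p_mean)  # prints 3.0 147.0 0.02
```

The posterior mean (0.02 per person-year here) is the kind of per-condition estimate that feeds the IMM decision tool.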

  3. Toward a Rational and Mechanistic Account of Mental Effort.

    Science.gov (United States)

    Shenhav, Amitai; Musslick, Sebastian; Lieder, Falk; Kool, Wouter; Griffiths, Thomas L; Cohen, Jonathan D; Botvinick, Matthew M

    2017-07-25

    In spite of its familiar phenomenology, the mechanistic basis for mental effort remains poorly understood. Although most researchers agree that mental effort is aversive and stems from limitations in our capacity to exercise cognitive control, it is unclear what gives rise to those limitations and why they result in an experience of control as costly. The presence of these control costs also raises further questions regarding how best to allocate mental effort to minimize those costs and maximize the attendant benefits. This review explores recent advances in computational modeling and empirical research aimed at addressing these questions at the level of psychological process and neural mechanism, examining both the limitations to mental effort exertion and how we manage those limited cognitive resources. We conclude by identifying remaining challenges for theoretical accounts of mental effort as well as possible applications of the available findings to understanding the causes of and potential solutions for apparent failures to exert the mental effort required of us.

  4. Modeling the minimum enzymatic requirements for optimal cellulose conversion

    International Nuclear Information System (INIS)

    Den Haan, R; Van Zyl, W H; Van Zyl, J M; Harms, T M

    2013-01-01

    Hydrolysis of cellulose is achieved by the synergistic action of endoglucanases, exoglucanases and β-glucosidases. Most cellulolytic microorganisms produce a varied array of these enzymes and the relative roles of the components are not easily defined or quantified. In this study we have used partially purified cellulases produced heterologously in the yeast Saccharomyces cerevisiae to increase our understanding of the roles of some of these components. CBH1 (Cel7), CBH2 (Cel6) and EG2 (Cel5) were separately produced in recombinant yeast strains, allowing their isolation free of any contaminating cellulolytic activity. Binary and ternary mixtures of the enzymes at loadings ranging between 3 and 100 mg g −1 Avicel allowed us to illustrate the relative roles of the enzymes and their levels of synergy. A mathematical model was created to simulate the interactions of these enzymes on crystalline cellulose, under both isolated and synergistic conditions. Laboratory results from the various mixtures at a range of loadings of recombinant enzymes allowed refinement of the mathematical model. The model can further be used to predict the optimal synergistic mixes of the enzymes. This information can subsequently be applied to help to determine the minimum protein requirement for complete hydrolysis of cellulose. Such knowledge will be greatly informative for the design of better enzymatic cocktails or processing organisms for the conversion of cellulosic biomass to commodity products. (letter)
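A common way to quantify the synergy this abstract refers to is the degree of synergy: the yield of an enzyme mixture divided by the sum of the yields of its components acting alone at the same total loading. A hedged sketch with invented yields, not the study's measurements:

```python
def degree_of_synergy(mixture_yield: float, individual_yields: list) -> float:
    """Yield of the mixture divided by the summed yields of the components
    acting alone at the same total loading; values > 1 indicate synergy."""
    return mixture_yield / sum(individual_yields)

# CBH1, CBH2 and EG2 acting alone on Avicel (hypothetical sugar yields):
alone = [1.2, 0.9, 0.6]
# The ternary mixture at the same total protein loading (hypothetical):
ds = degree_of_synergy(4.5, alone)
print(round(ds, 2))  # > 1 means the components release more sugar together
```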

  5. Glass Property Models and Constraints for Estimating the Glass to be Produced at Hanford by Implementing Current Advanced Glass Formulation Efforts

    Energy Technology Data Exchange (ETDEWEB)

    Vienna, John D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kim, Dong-Sang [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Skorski, Daniel C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Matyas, Josef [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-07-01

    Recent glass formulation and melter testing data suggest that significant increases in waste loading in HLW and LAW glasses are possible over current system planning estimates. The data (although limited in some cases) were evaluated to determine a set of constraints and models that could be used to estimate the maximum loading of specific waste compositions in glass. It is recommended that these models and constraints be used to estimate the likely HLW and LAW glass volumes that would result if the current glass formulation studies are successfully completed. It is recognized that some of the models are preliminary in nature and will change in the coming years. In addition, the models do not currently address the prediction uncertainties that would be needed before they could be used in plant operations. The models and constraints are only meant to give an indication of rough glass volumes and are not intended to be used in plant operation or waste form qualification activities. A current research program is in place to develop the data, models, and uncertainty descriptions for that purpose. A fundamental tenet underlying the research reported in this document is to be less conservative than previous studies when developing constraints for estimating the glass to be produced by implementing current advanced glass formulation efforts. The less conservative approach documented herein should allow estimation of the glass masses that may be realized if the current efforts in advanced glass formulations are completed over the coming years and are as successful as early indications suggest. Because of this approach, there is an unquantifiable uncertainty in the ultimate glass volume projections, due to model prediction uncertainties, that must be considered along with other system uncertainties such as the waste compositions and amounts to be immobilized, the split factors between LAW and HLW, etc.

  6. Dopamine, behavioral economics, and effort

    Directory of Open Access Journals (Sweden)

    John D Salamone

    2009-09-01

    Full Text Available Abstract. There are numerous problems with the hypothesis that brain dopamine (DA systems, particularly in the nucleus accumbens, directly mediate the rewarding or primary motivational characteristics of natural stimuli such as food. Research and theory related to the functions of mesolimbic DA are undergoing a substantial conceptual restructuring, with the traditional emphasis on hedonia and primary reward yielding to other concepts and lines of inquiry. The present review is focused upon the involvement of nucleus accumbens DA in behavioral activation and effort-related processes. Viewed from the framework of behavioral economics, the effects of accumbens DA depletions and antagonism on food-reinforced behavior are highly dependent upon the work requirements of the instrumental task, and DA depleted rats are more sensitive to increases in response costs (i.e., ratio requirements. Moreover, interference with accumbens DA transmission exerts a powerful influence over effort-related choice behavior. Rats with accumbens DA depletions or antagonism reallocate their instrumental behavior away from food-reinforced tasks that have high response requirements, and instead these rats select a less-effortful type of food-seeking behavior. Nucleus accumbens DA and adenosine interact in the regulation of effort-related functions, and other brain structures (anterior cingulate cortex, amygdala, ventral pallidum also are involved. Studies of the brain systems regulating effort-based processes may have implications for understanding drug abuse, as well as energy-related disorders such as psychomotor slowing, fatigue or anergia in depression and other neurological disorders.

  7. An Imbalance of Approach and Effortful Control Predicts Externalizing Problems: Support for Extending the Dual-Systems Model into Early Childhood.

    Science.gov (United States)

    Jonas, Katherine; Kochanska, Grazyna

    2018-01-25

    Although the association between deficits in effortful control and later externalizing behavior is well established, many researchers (Nigg Journal of Child Psychology and Psychiatry, 47(3-4), 395-422, 2006; Steinberg Developmental Review, 28(1), 78-106, 2008) have hypothesized this association is actually the product of the imbalance of dual systems, or two underlying traits: approach and self-regulation. Very little research, however, has deployed a statistically robust strategy to examine that compelling model; further, no research has done so using behavioral measures, particularly in longitudinal studies. We examined the imbalance of approach and self-regulation (effortful control, EC) as predicting externalizing problems. Latent trait models of approach and EC were derived from behavioral measures collected from 102 children in a community sample at 25, 38, 52, and 67 months (2 to 5 ½ years), and used to predict externalizing behaviors, modeled as a latent trait derived from parent-reported measures at 80, 100, 123, and 147 months (6 ½ to 12 years). The imbalance hypothesis was supported: Children with an imbalance of approach and EC had more externalizing behavior problems in middle childhood and early preadolescence, relative to children with equal levels of the two traits.

  8. Research requirements for a unified approach to modelling chemical effects associated with radioactive waste disposal

    International Nuclear Information System (INIS)

    Krol, A.A.; Read, D.

    1986-09-01

    This report contains the results of a review of the current modelling, laboratory experiments and field experiments being conducted in the United Kingdom to aid understanding and improve prediction of the effects of chemistry on the disposal of radioactive wastes. The aim has been to summarise present work and derive a structure for future research effort that would support the use of probabilistic risk assessment (PRA) methods for the disposal of radioactive wastes. The review was conducted through a combination of letters and personal visits, and preliminary results were reported to a plenary meeting of participants held in April 1986. Following this meeting, copies of the report were circulated to participants at draft stage, so that the finalised report can be taken to represent, as far as possible, a consensus of opinion on research requirements. (author)

  9. A 2nd generation static model of greenhouse energy requirements (horticern) : a comparison with dynamic models

    CERN Document Server

    Jolliet, O; Munday, G L

    1989-01-01

    Optimisation of a greenhouse and its components requires a suitable model permitting precise determination of its energy requirements. Existing static models are simple but lack precision; dynamic models, though more precise, are unsuitable for use over long periods and difficult to handle in practice. A theoretical study and measurements from the CERN trial greenhouse have allowed the development of a new static model named "HORTICERN", which is precise, easy to use for predicting energy consumption, and takes into account the effects of solar energy, wind and radiative loss to the sky. This paper compares the HORTICERN model with the dynamic models of Bot, Takakura, Van Bavel and Gembloux, and demonstrates that its precision is comparable: differences are on average less than 5%, and it is independent of the type of greenhouse (e.g. single or double glazing, Hortiplus, etc.) and of climate. The HORTICERN method has been developed for PC use and is proving to be a powerful tool for greenhouse optimisation by research work...
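A static model of this kind reduces, at its core, to a steady-state energy balance: conductive loss minus usable solar gain. The sketch below illustrates the idea only; the coefficients, inputs and structure are assumptions for illustration, not the published HORTICERN formulation:

```python
# Minimal static energy-balance sketch in the spirit of static greenhouse
# models. All numbers and the transmissivity default are illustrative.
def heating_requirement(u_value, area, t_inside, t_outside, solar_gain, tau=0.7):
    """Hourly heating need (W): conductive loss minus usable solar gain,
    floored at zero (excess solar is vented, not stored)."""
    loss = u_value * area * (t_inside - t_outside)  # W; U in W/m2K, area in m2
    gain = tau * solar_gain * area                  # W; solar_gain in W/m2
    return max(0.0, loss - gain)

# Double-glazed house on a cold, overcast hour (hypothetical inputs):
demand = heating_requirement(u_value=3.0, area=500.0, t_inside=18.0,
                             t_outside=2.0, solar_gain=20.0)
print(demand)
```

A full static model would add the wind dependence of the U-value and the radiative loss to the sky that the abstract mentions; the balance structure stays the same.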

  10. The Modeling of Factors That Influence Coast Guard Manpower Requirements

    Science.gov (United States)

    2014-12-01

    …applications, and common data warehouses needed to fully develop an effective and efficient manpower requirements engineering and management program. …manpower requirements determination ensures a ready force, and safe and effective mission execution. Shortage or excess of manpower is the catalyst… (Factors That Influence Coast Guard Manpower Requirements, thesis by Kara M. Lavin, December 2014; thesis advisor: Ronald E. Giachetti)

  11. Models and data requirements for human reliability analysis

    International Nuclear Information System (INIS)

    1989-03-01

    It has been widely recognised for many years that the safety of the nuclear power generation depends heavily on the human factors related to plant operation. This has been confirmed by the accidents at Three Mile Island and Chernobyl. Both these cases revealed how human actions can defeat engineered safeguards and the need for special operator training to cover the possibility of unexpected plant conditions. The importance of the human factor also stands out in the analysis of abnormal events and insights from probabilistic safety assessments (PSA's), which reveal a large proportion of cases having their origin in faulty operator performance. A consultants' meeting, organized jointly by the International Atomic Energy Agency (IAEA) and the International Institute for Applied Systems Analysis (IIASA) was held at IIASA in Laxenburg, Austria, December 7-11, 1987, with the aim of reviewing existing models used in Probabilistic Safety Assessment (PSA) for Human Reliability Analysis (HRA) and of identifying the data required. The report collects both the contributions offered by the members of the Expert Task Force and the findings of the extensive discussions that took place during the meeting. Refs, figs and tabs

  12. Building beef cow nutritional programs with the 1996 NRC beef cattle requirements model.

    Science.gov (United States)

    Lardy, G P; Adams, D C; Klopfenstein, T J; Patterson, H H

    2004-01-01

    Designing a sound cow-calf nutritional program requires knowledge of nutrient requirements, diet quality, and intake. Effectively using the NRC (1996) beef cattle requirements model (1996NRC) also requires knowledge of dietary degradable intake protein (DIP) and microbial efficiency. Objectives of this paper are to 1) describe a framework in which 1996NRC-applicable data can be generated, 2) describe seasonal changes in nutrients on native range, 3) use the 1996NRC to predict nutrient balance for cattle grazing these forages, and 4) make recommendations for using the 1996NRC for forage-fed cattle. Extrusa samples were collected over 2 yr on native upland range and subirrigated meadow in the Nebraska Sandhills. Samples were analyzed for CP, in vitro OM digestibility (IVOMD), and DIP. Regression equations to predict nutrients were developed from these data. The 1996NRC was used to predict nutrient balances based on the dietary nutrient analyses. Recommendations for model users were also developed. On subirrigated meadow, CP and IVOMD increased rapidly during March and April. On native range, CP and IVOMD increased from April through June but decreased rapidly from August through September. Degradable intake protein (DM basis) followed trends similar to CP for both native range and subirrigated meadow. Predicted nutrient balances for spring- and summer-calving cows agreed with reported values in the literature, provided that IVOMD values were converted to DE before use in the model (1.07 x IVOMD - 8.13). When the IVOMD-to-DE conversion was not used, the model gave unrealistically high NE(m) balances. To effectively use the 1996NRC to estimate protein requirements, users should focus on three key estimates: DIP, microbial efficiency, and TDN intake. Consequently, efforts should be focused on adequately describing seasonal changes in forage nutrient content. In order to increase use of the 1996NRC, research is needed in the following areas: 1) cost-effective and
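The abstract's key practical detail, converting IVOMD to DE before feeding diet quality into the 1996NRC model, can be written directly from the reported regression. Units are assumed to be percent, and the sample forage values are illustrative:

```python
def ivomd_to_de(ivomd_pct: float) -> float:
    """Digestible energy (%) from in vitro OM digestibility (%), using the
    regression reported in the abstract: DE = 1.07 x IVOMD - 8.13."""
    return 1.07 * ivomd_pct - 8.13

# High-quality spring meadow forage vs. lower-quality dormant-season range
# (hypothetical IVOMD values, % of DM):
for ivomd in (70.0, 50.0):
    print(round(ivomd_to_de(ivomd), 2))
```

Skipping this conversion is exactly the failure mode the authors flag: feeding raw IVOMD into the model yields unrealistically high NE(m) balances.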

  13. Effort in Multitasking: Local and Global Assessment of Effort.

    Science.gov (United States)

    Kiesel, Andrea; Dignath, David

    2017-01-01

    When performing multiple tasks in succession, self-organization of task order might be superior to externally controlled task schedules, because self-organization allows optimizing processing modes, thus reducing switch costs, and it increases commitment to task goals. However, self-organization is an additional executive control process that is not required if task order is externally specified, and as such it is considered time-consuming and effortful. To compare self-organized and externally controlled task scheduling, we suggest assessing global subjective and objective measures of effort in addition to local performance measures. In our new experimental approach, we combined characteristics of dual-tasking and task-switching settings and compared local and global measures of effort in a condition with free choice of task sequence and a condition with a cued task sequence. In a multitasking environment, participants chose the task order while the task requirement of the not-yet-performed task remained the same. This task preview allowed participants to work on the previously non-chosen items in parallel and resulted in faster responses and fewer errors in task-switch trials than in task-repetition trials. The free-choice group profited more from this task preview than the cued group when considering local performance measures. Nevertheless, the free-choice group invested more effort than the cued group when considering global measures. Thus, self-organization in task scheduling seems to be effortful even in conditions in which it is beneficial for task processing. In a second experiment, we reduced the possibility of task preview for the not-yet-performed tasks in order to hinder efficient self-organization. Here, neither local nor global measures revealed substantial differences between the free-choice and the cued task sequence condition. Based on the results of both experiments, we suggest that global assessment of effort in addition to

  14. Modelling efforts needed to advance herpes simplex virus (HSV) vaccine development: Key findings from the World Health Organization Consultation on HSV Vaccine Impact Modelling.

    Science.gov (United States)

    Gottlieb, Sami L; Giersing, Birgitte; Boily, Marie-Claude; Chesson, Harrell; Looker, Katharine J; Schiffer, Joshua; Spicknall, Ian; Hutubessy, Raymond; Broutet, Nathalie

    2017-06-21

    Development of a vaccine against herpes simplex virus (HSV) is an important goal for global sexual and reproductive health. In order to more precisely define the health and economic burden of HSV infection and the theoretical impact and cost-effectiveness of an HSV vaccine, in 2015 the World Health Organization convened an expert consultation meeting on HSV vaccine impact modelling. The experts reviewed existing model-based estimates and dynamic models of HSV infection to outline critical future modelling needs to inform development of a comprehensive business case and preferred product characteristics for an HSV vaccine. This article summarizes key findings and discussions from the meeting on modelling needs related to HSV burden, costs, and vaccine impact, essential data needs to carry out those models, and important model components and parameters. Copyright © 2017. Published by Elsevier Ltd.

  15. Department of Defense Enterprise Requirements and Acquisition Model

    Science.gov (United States)

    2011-06-01

    Figure 11: ExtendSim Icons. …collected through a series of interviews with space requirements and acquisition personnel from the AFSPC Requirements directorate (AFSPC/A5), the Under… …of the many ExtendSim® icons are described and illustrated in Figure 11. The "Event/Activity" icon is implemented with a time duration allowing a…

  16. Reproductive effort in viscous populations

    NARCIS (Netherlands)

    Pen, Ido

    Here I study a kin selection model of reproductive effort, the allocation of resources to fecundity versus survival, in a patch-structured population. Breeding females remain in the same patch for life. Offspring have costly, partial long-distance dispersal and compete for breeding sites, which

  17. Aerosol-Radiation-Cloud Interactions in the South-East Atlantic: Model-Relevant Observations and the Beneficiary Modeling Efforts in the Realm of the EVS-2 Project ORACLES

    Science.gov (United States)

    Redemann, Jens

    2018-01-01

    Globally, aerosols remain a major contributor to uncertainties in assessments of anthropogenically-induced changes to the Earth climate system, despite concerted efforts using satellite and suborbital observations and increasingly sophisticated models. The quantification of direct and indirect aerosol radiative effects, as well as cloud adjustments thereto, even at regional scales, continues to elude our capabilities. Some of our limitations are due to insufficient sampling and accuracy of the relevant observables, under an appropriate range of conditions to provide useful constraints for modeling efforts at various climate scales. In this talk, I will describe (1) the efforts of our group at NASA Ames to develop new airborne instrumentation to address some of the data insufficiencies mentioned above; (2) the efforts by the EVS-2 ORACLES project to address aerosol-cloud-climate interactions in the SE Atlantic and (3) time permitting, recent results from a synergistic use of A-Train aerosol data to test climate model simulations of present-day direct radiative effects in some of the AEROCOM phase II global climate models.

  18. Simulation Modeling Requirements for Loss-of-Control Accident Prevention of Turboprop Transport Aircraft

    Science.gov (United States)

    Crider, Dennis; Foster, John V.

    2012-01-01

    In-flight loss of control remains the leading contributor to aviation accident fatalities, with stall upsets being the leading causal factor. The February 12, 2009, Colgan Air, Inc. Continental Express flight 3407 accident outside Buffalo, New York, brought this issue to the forefront of public consciousness and resulted in recommendations from the National Transportation Safety Board to conduct training that incorporates fully developed stalls and to develop simulator standards to support such training. In 2010, Congress responded to this accident with Public Law 111-216 (Section 208), which mandates full stall training for Part 121 flight operations. Efforts are currently in progress to develop recommendations on the implementation of stall training for airline pilots. The International Committee on Aviation Training in Extended Envelopes (ICATEE) is currently defining the simulator fidelity standards that will be necessary for effective stall training. These recommendations will apply to all civil transport aircraft, including straight-wing turboprop aircraft. Government-funded research over the previous decade provides a strong foundation for stall/post-stall simulation of swept-wing, conventional-tail jets to respond to this mandate, but turboprops present additional and unique modeling challenges. First among these is the effect of power, which can provide enhanced flow attachment behind the propellers. Furthermore, turboprops tend to operate for longer periods in an environment more susceptible to icing. As a result, there have been a significant number of turboprop accidents caused by early (lower angle of attack) stalls in icing. The vulnerability of turboprop configurations to icing has led to studies of ice accumulation and its effects on flight behavior. Piloted simulations of these effects have highlighted important training needs for the recognition and mitigation of icing effects, including the reduction of stall margins

  19. User Requirements Model for University Timetable Management System

    OpenAIRE

    Ahmad Althunibat; Mohammad I. Muhairat

    2016-01-01

    Automated timetables are used to schedule courses, lectures and rooms in universities by considering some constraints. Inconvenient and ineffective timetables often waste time and money. Therefore, it is important to investigate the requirements and potential needs of users. Thus, eliciting user requirements of University Timetable Management System (TMS) and their implication becomes an important process for the implementation of TMS. On this note, this paper seeks to propose a m...

  20. Job stress and mental health of permanent and fixed-term workers measured by effort-reward imbalance model, depressive complaints, and clinic utilization.

    Science.gov (United States)

    Inoue, Mariko; Tsurugano, Shinobu; Yano, Eiji

    2011-01-01

    The number of workers with precarious employment has increased globally; however, few studies have used validated measures to investigate the relationship of job status to stress and mental health. Thus, we conducted a study to compare the differential job stress experienced by permanent and fixed-term workers using an effort-reward imbalance (ERI) model questionnaire and by evaluating depressive complaints and clinic utilization. Subjects were permanent or fixed-term male workers at a Japanese research institute (n=756). Baseline data on job stress and depressive complaints were collected in 2007. We followed the same population over a 1-year period to assess their utilization of the company clinic for mental health concerns. The ERI ratio was higher among permanent workers than among fixed-term workers. More permanent workers presented with more than two depressive complaints, the standard used for the diagnosis of depression. ERI scores indicated that the effort component of permanent work was associated with distress, whereas distress in fixed-term work was related to job promotion and job insecurity. Moreover, over the one-year follow-up period, fixed-term workers visited the on-site clinic for mental health concerns 4.04 times more often than permanent workers, even after adjusting for age, lifestyle, ERI, and depressive complaints. These contrasting findings reflect the differential workloads and working conditions encountered by permanent and fixed-term workers. The occupational setting, in which workers of different employment statuses were intermingled, may have contributed to the high number of mental health-related issues experienced by workers of different employment statuses.
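The ERI ratio mentioned above is commonly computed, following Siegrist's formulation, as effort divided by reward multiplied by a correction factor for the unequal numbers of effort and reward items on the questionnaire. A sketch with illustrative item counts and scores, not the study's instrument:

```python
# ERI = effort / (reward * c), with c = n_effort_items / n_reward_items
# correcting for the different numbers of items on the two scales.
def eri_ratio(effort_score: float, reward_score: float,
              n_effort_items: int = 6, n_reward_items: int = 11) -> float:
    c = n_effort_items / n_reward_items
    return effort_score / (reward_score * c)

# A worker scoring 18 on effort and 44 on reward (hypothetical scores):
print(round(eri_ratio(18, 44), 2))  # values > 1 indicate high-cost/low-gain work
```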

  1. Bioenergy crop models: Descriptions, data requirements and future challenges

    Energy Technology Data Exchange (ETDEWEB)

    Nair, S. Surendran [University of Tennessee, Knoxville (UTK); Kang, Shujiang [ORNL; Zhang, Xuesong [Pacific Northwest National Laboratory (PNNL); Miguez, Fernando [Iowa State University; Izaurralde, Dr. R. Cesar [Pacific Northwest National Laboratory (PNNL); Post, Wilfred M [ORNL; Dietze, Michael [University of Illinois, Urbana-Champaign; Lynd, L. [Dartmouth College; Wullschleger, Stan D [ORNL

    2012-01-01

    Field studies that address the production of lignocellulosic biomass as a source of renewable energy provide critical data for the development of bioenergy crop models. A literature survey revealed that 14 models have been used for simulating bioenergy crops including herbaceous and woody bioenergy crops, and for crassulacean acid metabolism (CAM) crops. These models simulate field-scale production of biomass for switchgrass (ALMANAC, EPIC, and Agro-BGC), miscanthus (MISCANFOR, MISCANMOD, and WIMOVAC), sugarcane (APSIM, AUSCANE, and CANEGRO), and poplar and willow (SECRETS and 3PG). Two models are adaptations of dynamic global vegetation models and simulate biomass yields of miscanthus and sugarcane at regional scales (Agro-IBIS and LPJmL). Although it lacks the complexity of other bioenergy crop models, the environmental productivity index (EPI) is the only model used to estimate biomass production of CAM (Agave and Opuntia) plants. Except for the EPI model, all models include representations of leaf area dynamics, phenology, radiation interception and utilization, biomass production, and partitioning of biomass to roots and shoots. A few models simulate soil water, nutrient, and carbon cycle dynamics, making them especially useful for assessing the environmental consequences (e.g., erosion and nutrient losses) associated with the large-scale deployment of bioenergy crops. The rapid increase in use of models for energy crop simulation is encouraging; however, detailed information on the influence of climate, soils, and crop management practices on biomass production is scarce. Thus considerable work remains regarding the parameterization and validation of process-based models for bioenergy crops; generation and distribution of high-quality field data for model development and validation; and implementation of an integrated framework for efficient, high-resolution simulations of biomass production for use in planning sustainable bioenergy systems.

  2. Commonsense Psychology and the Functional Requirements of Cognitive Models

    National Research Council Canada - National Science Library

    Gordon, Andrew S

    2005-01-01

    .... Rather than working to avoid the influence of commonsense psychology in cognitive modeling research, we propose to capitalize on progress in developing formal theories of commonsense psychology...

  3. Requirements for data integration platforms in biomedical research networks: a reference model.

    Science.gov (United States)

    Ganzinger, Matthias; Knaup, Petra

    2015-01-01

    Biomedical research networks need to integrate research data among their members and with external partners. To support such data sharing activities, an adequate information technology infrastructure is necessary. To facilitate the establishment of such an infrastructure, we developed a reference model for the requirements. The reference model consists of five reference goals and 15 reference requirements. Using the Unified Modeling Language, the goals and requirements are set into relation to each other. In addition, all goals and requirements are described textually in tables. This reference model can be used by research networks as a basis for a resource efficient acquisition of their project specific requirements. Furthermore, a concrete instance of the reference model is described for a research network on liver cancer. The reference model is transferred into a requirements model of the specific network. Based on this concrete requirements model, a service-oriented information technology architecture is derived and also described in this paper.

  4. Modelling and Simulation for Requirements Engineering and Options Analysis

    Science.gov (United States)

    2010-05-01

    …should be performed to work successfully in the domain; and process-based techniques model the processes that occur in the work domain. There is a crisp …acad/sed/sedres/dm/erg/cwa (DRDC Toronto CR 2010-049). 23. Can the current technique for developing simulation models for assessments…

  5. Downscaling the marine modelling effort: Development, application and assessment of a 3D ecosystem model implemented in a small coastal area

    Science.gov (United States)

    Kolovoyiannis, V. N.; Tsirtsis, G. E.

    2013-07-01

The present study deals with the development, application and evaluation of a modelling tool, implemented along with a field sampling program, in a limited coastal area in the Northeast Aegean. The aim was to study, understand and quantify physical circulation and water column ecological processes in a high resolution simulation of a past annual cycle. The marine ecosystem model consists of a three dimensional hydrodynamic component suitable for coastal areas (Princeton Ocean Model) coupled to a simple ecological model of five variables, namely, phytoplankton, nitrate, ammonia, phosphate and dissolved organic carbon concentrations. The ecological parameters (e.g. half saturation constants and maximum uptake rates for nutrients) were calibrated using a specially developed automated procedure. Model errors were evaluated using qualitative, graphic techniques and were quantified with a number of goodness-of-fit measures. Regarding physical variables, the goodness-of-fit of model to field data varied from fair to quite good. Indicatively, the cost function, expressed as mean value per sampling station, ranged from 0.15 to 0.23 for temperature and 0.81 to 3.70 for current speed. The annual cycle of phytoplankton biomass was simulated with sufficient accuracy (e.g. mean cost function ranging from 0.49 to 2.67), partly attributed to the adequate reproduction of the dynamics of the growth-limiting nutrients, nitrate, ammonia and the main limiting nutrient, phosphate, whose mean cost function ranged from 0.97 to 1.88. Model results and field data provided insight into physical processes such as the development of a wind-driven, coastal jet type of surface alongshore flow with a subsurface countercurrent flowing in the opposite direction, and the formation of rotational flows in the embayments of the coastline when the offshore coastal current speed approaches values of about 0.1 m/s. The percentage of field measurements where the N:P ratio was found over 16:1 varied between
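The "cost function" cited in this abstract as a goodness-of-fit measure is commonly defined in marine model validation as the mean absolute model-data deviation normalised by the standard deviation of the observations; a minimal sketch under that assumption (the paper may use a variant):

```python
import statistics

def cost_function(model, obs):
    """Goodness-of-fit: mean |model - obs| normalised by the standard
    deviation of the observations. By the convention assumed here, values
    below 1 indicate good agreement, 1-2 reasonable, above 3 poor."""
    dev = [abs(m - o) for m, o in zip(model, obs)]
    return (sum(dev) / len(dev)) / statistics.pstdev(obs)

# Toy example: simulated vs. observed temperature at one sampling station
simulated = [14.8, 15.1, 16.0, 17.2]
observed = [15.0, 15.0, 16.4, 17.0]
cf = cost_function(simulated, observed)  # well below 1: good agreement
```

Computed per station and per variable, this yields exactly the kind of per-station mean values quoted above (e.g. 0.15-0.23 for temperature).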

  6. Single High Fidelity Geometric Data Sets for LCM - Model Requirements

    Science.gov (United States)

    2006-11-01

    designed specifically to withstand severe underwater explosion (UNDEX) loading caused by the detonation of weapons such as bombs, missiles, mines and... Explosions ( BLEVEs ): The energy from a BLEVE is from a sudden change of phase of stored material. Tanks of liquids immersed in pool fires BLEVE when the...2.10.3 Summary of Data Requirements ....................................................... 46 2.11 Underwater Explosion

  7. Requirements engineering for trust management: Model, methodology, and reasoning

    NARCIS (Netherlands)

    Giorgini, P.; Massacci, F.; Mylopoulos, J.; Zannone, N.

    2006-01-01

    A number of recent proposals aim to incorporate security engineering into mainstream software engineering. Yet, capturing trust and security requirements at an organizational level, as opposed to an IT system level, and mapping these into security and trust management policies is still an open

  8. Managing salinity in Upper Colorado River Basin streams: Selecting catchments for sediment control efforts using watershed characteristics and random forests models

    Science.gov (United States)

    Tillman, Fred; Anning, David W.; Heilman, Julian A.; Buto, Susan G.; Miller, Matthew P.

    2018-01-01

Elevated concentrations of dissolved solids (salinity), including calcium, sodium, sulfate, and chloride, among others, in the Colorado River cause substantial problems for its water users. Previous efforts to reduce dissolved solids in upper Colorado River basin (UCRB) streams often focused on reducing suspended-sediment transport to streams, but few studies have investigated the relationship between suspended sediment and salinity, or evaluated which watershed characteristics might be associated with this relationship. Are there catchment properties that may help in identifying areas where control of suspended sediment will also reduce salinity transport to streams? A random forests classification analysis was performed on topographic, climate, land cover, geology, rock chemistry, soil, and hydrologic information in 163 UCRB catchments. Two random forests models were developed in this study: one for exploring stream and catchment characteristics associated with stream sites where dissolved solids increase with increasing suspended-sediment concentration, and the other for predicting where these sites are located in unmonitored reaches. Results of variable importance from the exploratory random forests models indicate that no simple source, geochemical process, or transport mechanism can easily explain the relationship between dissolved solids and suspended sediment concentrations at UCRB monitoring sites. Among the most important watershed characteristics in both models were measures of soil hydraulic conductivity, soil erodibility, minimum catchment elevation, catchment area, and the silt component of soil in the catchment. Predictions at key locations in the basin were combined with observations from selected monitoring sites, and presented in map form to give a complete understanding of where catchment sediment control practices would also benefit control of dissolved solids in streams.
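The predictive workflow described here (train a classifier on catchment attributes, then label unmonitored reaches) can be illustrated with a toy random forest built from bootstrap-aggregated decision stumps. This is a deliberate simplification of the full method, and the three feature columns are hypothetical stand-ins for the attributes listed above:

```python
import random

def train_stump(X, y):
    """Fit the single best (feature, threshold) split by training accuracy."""
    best = None
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            preds = [1 if row[f] > t else 0 for row in X]
            acc = sum(p == yy for p, yy in zip(preds, y)) / len(y)
            for flip in (False, True):          # also consider the inverted rule
                a = 1 - acc if flip else acc
                if best is None or a > best[0]:
                    best = (a, f, t, flip)
    _, f, t, flip = best
    return lambda row: (0 if row[f] > t else 1) if flip else (1 if row[f] > t else 0)

def random_forest(X, y, n_trees=25, seed=0):
    """Bagging: each stump sees a bootstrap resample; predict by majority vote."""
    rng = random.Random(seed)
    trees = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        trees.append(train_stump([X[i] for i in idx], [y[i] for i in idx]))
    return lambda row: int(sum(t(row) for t in trees) > n_trees / 2)

# Hypothetical attributes: [soil erodibility, silt fraction, min elevation (km)]
X = [[0.1, 0.2, 2.1], [0.5, 0.6, 1.2], [0.4, 0.7, 1.0],
     [0.2, 0.1, 2.4], [0.6, 0.5, 0.9], [0.1, 0.3, 2.0]]
y = [0, 1, 1, 0, 1, 0]   # 1 = dissolved solids rise with suspended sediment
model = random_forest(X, y)
```

A production analysis would instead use full decision trees with per-split feature subsampling (e.g. scikit-learn's `RandomForestClassifier`), whose permutation-based variable importances correspond to the importance rankings reported in the abstract.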

  9. A Meta-model for the Assessment of Non-Functional Requirement Size

    NARCIS (Netherlands)

    Kassab, M.; Daneva, Maia; Ormandjieva, O.; Mirandola, R.

    2008-01-01

    Non-functional requirements (NFRs) pose unique challenges in estimating the effort it would take to implement them. This is mainly because of their unique nature; NFRs are subjective, relative, interactive and tending to have a broad impact on the system as a whole. Nevertheless, it is crucial, when

  10. Conducting requirements analyses for research using routinely collected health data: a model driven approach.

    Science.gov (United States)

    de Lusignan, Simon; Cashman, Josephine; Poh, Norman; Michalakidis, Georgios; Mason, Aaron; Desombre, Terry; Krause, Paul

    2012-01-01

Medical research increasingly requires the linkage of data from different sources. Conducting a requirements analysis for a new application is an established part of software engineering, but rarely reported in the biomedical literature; and no generic approaches have been published as to how to link heterogeneous health data. Literature review was followed by a consensus process to define how requirements for research using multiple data sources might be modeled. We have developed a requirements analysis: i-ScheDULEs. The first components of the modeling process are indexing and creating a rich picture of the research study. Secondly, we developed a series of reference models of progressive complexity: data flow diagrams (DFD) to define data requirements; unified modeling language (UML) use case diagrams to capture study-specific and governance requirements; and finally, business process models, using business process modeling notation (BPMN). These requirements and their associated models should become part of research study protocols.

  11. Four Reference Models for Transparency Requirements in Information Systems

    OpenAIRE

    Hosseini, Mahmoud; Shahri, Alimohammad; Phalp, Keith T.; Ali, Ra

    2017-01-01

Transparency is a key emerging requirement in modern businesses and their information systems. Transparency refers to the information which flows amongst stakeholders for the purpose of informed decision-making and taking the right action. Transparency is generally associated with positive connotations such as trust and accountability. However, it has been shown that it could have adverse effects such as information overload and reduced objectivity of decisions. This calls for systematic app...

  12. Mathematical Formulation Requirements and Specifications for the Process Models

    International Nuclear Information System (INIS)

    Steefel, C.; Moulton, D.; Pau, G.; Lipnikov, K.; Meza, J.; Lichtner, P.; Wolery, T.; Bacon, D.; Spycher, N.; Bell, J.; Moridis, G.; Yabusaki, S.; Sonnenthal, E.; Zyvoloski, G.; Andre, B.; Zheng, L.; Davis, J.

    2010-01-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) is intended to be a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM program is aimed at addressing critical EM program needs to better understand and quantify flow and contaminant transport behavior in complex geological systems. It will also address the long-term performance of engineered components including cementitious materials in nuclear waste disposal facilities, in order to reduce uncertainties and risks associated with DOE EM's environmental cleanup and closure activities. Building upon national capabilities developed from decades of Research and Development in subsurface geosciences, computational and computer science, modeling and applied mathematics, and environmental remediation, the ASCEM initiative will develop an integrated, open-source, high-performance computer modeling system for multiphase, multicomponent, multiscale subsurface flow and contaminant transport. This integrated modeling system will incorporate capabilities for predicting releases from various waste forms, identifying exposure pathways and performing dose calculations, and conducting systematic uncertainty quantification. The ASCEM approach will be demonstrated on selected sites, and then applied to support the next generation of performance assessments of nuclear waste disposal and facility decommissioning across the EM complex. The Multi-Process High Performance Computing (HPC) Simulator is one of three thrust areas in ASCEM. The other two are the Platform and Integrated Toolsets (dubbed the Platform) and Site Applications. The primary objective of the HPC Simulator is to provide a flexible and extensible computational engine to simulate the coupled processes and flow scenarios described by the conceptual models developed using the ASCEM Platform. 
The graded and iterative approach to assessments naturally

  13. Three Tier Unified Process Model for Requirement Negotiations and Stakeholder Collaborations

    Science.gov (United States)

    Niazi, Muhammad Ashraf Khan; Abbas, Muhammad; Shahzad, Muhammad

    2012-11-01

This research paper focuses on a pragmatic qualitative analysis of various models and approaches to requirements negotiation (a sub-process of the requirements management plan, which is an output of scope management's collect requirements process) and studies stakeholder collaboration methodologies (i.e. from within the communication management knowledge area). The experiential analysis encompasses two tiers: the first tier refers to the weighted scoring model, while the second tier focuses on the development of SWOT matrices, on the basis of the findings of the weighted scoring model, for selecting an appropriate requirements negotiation model. Finally, the results are simulated with the help of statistical pie charts. On the basis of the simulated results of prevalent models and approaches to negotiation, a unified approach for requirements negotiation and stakeholder collaboration is proposed, where the collaboration methodologies are embedded into the selected requirements negotiation model as internal parameters of the proposed process, alongside some external required parameters like MBTI, opportunity analysis, etc.

  14. How the dual process model of human cognition can inform efforts to de-implement ineffective and harmful clinical practices: A preliminary model of unlearning and substitution.

    Science.gov (United States)

    Helfrich, Christian D; Rose, Adam J; Hartmann, Christine W; van Bodegom-Vos, Leti; Graham, Ian D; Wood, Suzanne J; Majerczyk, Barbara R; Good, Chester B; Pogach, Leonard M; Ball, Sherry L; Au, David H; Aron, David C

    2018-02-01

    One way to understand medical overuse at the clinician level is in terms of clinical decision-making processes that are normally adaptive but become maladaptive. In psychology, dual process models of cognition propose 2 decision-making processes. Reflective cognition is a conscious process of evaluating options based on some combination of utility, risk, capabilities, and/or social influences. Automatic cognition is a largely unconscious process occurring in response to environmental or emotive cues based on previously learned, ingrained heuristics. De-implementation strategies directed at clinicians may be conceptualized as corresponding to cognition: (1) a process of unlearning based on reflective cognition and (2) a process of substitution based on automatic cognition. We define unlearning as a process in which clinicians consciously change their knowledge, beliefs, and intentions about an ineffective practice and alter their behaviour accordingly. Unlearning has been described as "the questioning of established knowledge, habits, beliefs and assumptions as a prerequisite to identifying inappropriate or obsolete knowledge underpinning and/or embedded in existing practices and routines." We hypothesize that as an unintended consequence of unlearning strategies clinicians may experience "reactance," ie, feel their professional prerogative is being violated and, consequently, increase their commitment to the ineffective practice. We define substitution as replacing the ineffective practice with one or more alternatives. A substitute is a specific alternative action or decision that either precludes the ineffective practice or makes it less likely to occur. Both approaches may work independently, eg, a substitute could displace an ineffective practice without changing clinicians' knowledge, and unlearning could occur even if no alternative exists. For some clinical practice, unlearning and substitution strategies may be most effectively used together. 
By taking into

  15. Mathematical Formulation Requirements and Specifications for the Process Models

    Energy Technology Data Exchange (ETDEWEB)

    Steefel, C.; Moulton, D.; Pau, G.; Lipnikov, K.; Meza, J.; Lichtner, P.; Wolery, T.; Bacon, D.; Spycher, N.; Bell, J.; Moridis, G.; Yabusaki, S.; Sonnenthal, E.; Zyvoloski, G.; Andre, B.; Zheng, L.; Davis, J.

    2010-11-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) is intended to be a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM program is aimed at addressing critical EM program needs to better understand and quantify flow and contaminant transport behavior in complex geological systems. It will also address the long-term performance of engineered components including cementitious materials in nuclear waste disposal facilities, in order to reduce uncertainties and risks associated with DOE EM's environmental cleanup and closure activities. Building upon national capabilities developed from decades of Research and Development in subsurface geosciences, computational and computer science, modeling and applied mathematics, and environmental remediation, the ASCEM initiative will develop an integrated, open-source, high-performance computer modeling system for multiphase, multicomponent, multiscale subsurface flow and contaminant transport. This integrated modeling system will incorporate capabilities for predicting releases from various waste forms, identifying exposure pathways and performing dose calculations, and conducting systematic uncertainty quantification. The ASCEM approach will be demonstrated on selected sites, and then applied to support the next generation of performance assessments of nuclear waste disposal and facility decommissioning across the EM complex. The Multi-Process High Performance Computing (HPC) Simulator is one of three thrust areas in ASCEM. The other two are the Platform and Integrated Toolsets (dubbed the Platform) and Site Applications. The primary objective of the HPC Simulator is to provide a flexible and extensible computational engine to simulate the coupled processes and flow scenarios described by the conceptual models developed using the ASCEM Platform. The graded and iterative approach to assessments

  16. Functional requirements of a mathematical model of the heart.

    Science.gov (United States)

    Palladino, Joseph L; Noordergraaf, Abraham

    2009-01-01

Functional descriptions of the heart, especially the left ventricle, are often based on the measured variables pressure and ventricular outflow, embodied as a time-varying elastance. The fundamental difficulty of describing the mechanical properties of the heart with a time-varying elastance function that is set a priori is described. As an alternative, a new functional model of the heart is presented, which characterizes the ventricle's contractile state with parameters, rather than variables. Each chamber is treated as a pressure generator that is time and volume dependent. The heart's complex dynamics develop from a single equation based on the formation and relaxation of crossbridge bonds. This equation permits the calculation of ventricular elastance via E_v = ∂p_v/∂V_v. This heart model is defined independently from load properties, and ventricular elastance is dynamic and reflects changing numbers of crossbridge bonds. In this paper, the functionality of this new heart model is presented via computed work loops that demonstrate the Frank-Starling mechanism and the effects of preload, the effects of afterload, inotropic changes, and varied heart rate, as well as the interdependence of these effects. Results suggest the origin of the equivalent of Hill's force-velocity relation in the ventricle.
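The elastance relation E_v = ∂p_v/∂V_v used in this abstract can be illustrated numerically: given any pressure generator p(V, t), elastance follows by differentiating with respect to volume at fixed time. A minimal sketch with a hypothetical toy pressure function (the actual model derives p from crossbridge-bond dynamics, not this form):

```python
import math

def ventricular_pressure(V, t, a=2.0, b=0.1):
    """Hypothetical toy pressure generator p(V, t): pressure depends on both
    volume and an activation waveform over one beat (t in [0, 1]). This is a
    stand-in for the paper's crossbridge-bond equation, not the actual model."""
    activation = math.sin(math.pi * t) ** 2 if 0.0 <= t <= 1.0 else 0.0
    return a * activation * V + b * V ** 2

def elastance(V, t, dV=1e-6):
    """E_v = dp_v/dV_v at fixed time t, via a central finite difference."""
    return (ventricular_pressure(V + dV, t)
            - ventricular_pressure(V - dV, t)) / (2 * dV)

E_peak = elastance(V=100.0, t=0.5)  # peak activation: a + 2*b*V = 2 + 20 = 22
E_rest = elastance(V=100.0, t=0.0)  # no activation: only 2*b*V = 20 remains
```

Because E emerges from p(V, t) rather than being imposed, it varies with activation state, which is the key contrast with a time-varying elastance that is set a priori.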

  17. Modelling of capital requirements in the energy sector: capital market access. Final memorandum

    Energy Technology Data Exchange (ETDEWEB)

    1978-04-01

Formal modelling of the capital requirements of energy industries has been performed at DOE. A survey has been undertaken of a number of models which forecast energy-sector capital requirements or which detail the interactions of the energy sector and the economy. Models are identified which can be useful as prototypes for some portion of DOE's modelling needs. The models are examined to determine any useful data bases which could serve as inputs to an original DOE model. A selected group of models that comply with the stated capabilities is examined. The data sources being used by these models are covered and a catalog of the relevant data bases is provided. The models covered are: capital markets and capital availability models (Fossil 1, Bankers Trust Co., DRI Macro Model); models of physical capital requirements (Bechtel Supply Planning Model, ICF Oil and Gas Model and Coal Model, Stanford Research Institute National Energy Model); macroeconomic forecasting models with input-output analysis capabilities (Wharton Annual Long-Term Forecasting Model, Brookhaven/University of Illinois Model, Hudson-Jorgenson/Brookhaven Model); utility models (MIT Regional Electricity Model-Baughman Joskow, Teknekron Electric Utility Simulation Model); and others (DRI Energy Model, DRI/Zimmerman Coal Model, and Oak Ridge Residential Energy Use Model).

  18. "Open Access" Requires Clarification: Medical Journal Publication Models Evolve.

    Science.gov (United States)

    Lubowitz, James H; Brand, Jefferson C; Rossi, Michael J; Provencher, Matthew T

    2017-03-01

    While Arthroscopy journal is a traditional subscription model journal, our companion journal Arthroscopy Techniques is "open access." We used to believe open access simply meant online and free of charge. However, while open-access journals are free to readers, in 2017 authors must make a greater sacrifice in the form of an article-processing charge (APC). Again, while this does not apply to Arthroscopy, the APC will apply to Arthroscopy Techniques. Copyright © 2016 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  19. Illinois highway materials sustainability efforts of 2015.

    Science.gov (United States)

    2016-08-01

    This report provides a summary of the sustainability efforts of the Illinois Department of Transportation (IDOT) in recycling : reclaimed materials in highway construction during calendar year 2015. This report meets the requirements of Illinois Publ...

  20. Illinois highway materials sustainability efforts of 2014.

    Science.gov (United States)

    2015-08-01

    This report presents the 2014 sustainability efforts of the Illinois Department of Transportation (IDOT) in : recycling reclaimed materials in highway construction. This report meets the requirements of Illinois : Public Act 097-0314 by documenting I...

  1. Illinois highway materials sustainability efforts of 2016.

    Science.gov (United States)

    2017-07-04

    This report provides a summary of the sustainability efforts of the Illinois Department of Transportation (IDOT) in recycling : reclaimed materials in highway construction during calendar year 2016. This report meets the requirements of Illinois Publ...

  2. Illinois highway materials sustainability efforts of 2013.

    Science.gov (United States)

    2014-08-01

    This report presents the sustainability efforts of the Illinois Department of Transportation (IDOT) in : recycling and reclaiming materials for use in highway construction. This report meets the requirements of : Illinois Public Act 097-0314 by docum...

  3. Incentive Design and Mis-Allocated Effort

    OpenAIRE

    Schnedler, Wendelin

    2013-01-01

    Incentives often distort behavior: they induce agents to exert effort but this effort is not employed optimally. This paper proposes a theory of incentive design allowing for such distorted behavior. At the heart of the theory is a trade-off between getting the agent to exert effort and ensuring that this effort is used well. The theory covers various moral-hazard models, ranging from traditional single-task to multi-task models. It also provides -for the first time- a formalization and proof...

  4. Requirements for modeling airborne microbial contamination in space stations

    Science.gov (United States)

    Van Houdt, Rob; Kokkonen, Eero; Lehtimäki, Matti; Pasanen, Pertti; Leys, Natalie; Kulmala, Ilpo

    2018-03-01

    Exposure to bioaerosols is one of the facets that affect indoor air quality, especially for people living in densely populated or confined habitats, and is associated to a wide range of health effects. Good indoor air quality is thus vital and a prerequisite for fully confined environments such as space habitats. Bioaerosols and microbial contamination in these confined space stations can have significant health impacts, considering the unique prevailing conditions and constraints of such habitats. Therefore, biocontamination in space stations is strictly monitored and controlled to ensure crew and mission safety. However, efficient bioaerosol control measures rely on solid understanding and knowledge on how these bioaerosols are created and dispersed, and which factors affect the survivability of the associated microorganisms. Here we review the current knowledge gained from relevant studies in this wide and multidisciplinary area of bioaerosol dispersion modeling and biological indoor air quality control, specifically taking into account the specific space conditions.

  5. [Requirements imposed on model objects in microevolutionary investigations].

    Science.gov (United States)

    Mina, M V

    2015-01-01

Extrapolation of results of investigations of a model object is justified only within the limits of a set of objects that share essential properties with the model object. Which properties are essential depends on the aim of a study. Similarity of objects that emerged in the process of their independent evolution does not prove similarity of the ways and mechanisms of their evolution. If the objects differ in their essential properties, then extrapolation of the results of investigation of one object to another is risky, because it may lead to wrong decisions and, moreover, to the loss of interest in alternative hypotheses. The positions formulated above are considered with reference to species flocks of fishes, large African Barbus in particular.

  6. Quantitative and Functional Requirements for Bioluminescent Cancer Models.

    Science.gov (United States)

    Feys, Lynn; Descamps, Benedicte; Vanhove, Christian; Vermeulen, Stefan; Vandesompele, J O; Vanderheyden, Katrien; Messens, Kathy; Bracke, Marc; De Wever, Olivier

    2016-01-01

Bioluminescent cancer models are widely used but detailed quantification of the luciferase signal and functional comparison with a non-transfected control cell line are generally lacking. In the present study, we provide quantitative and functional tests for luciferase-transfected cells. We quantified the luciferase expression in BLM and HCT8/E11 transfected cancer cells, and examined the effect of long-term luciferin exposure. The present study also investigated functional differences between parental and transfected cancer cells. Our results showed that quantification of different single-cell-derived populations is superior with droplet digital polymerase chain reaction. Quantification of luciferase protein level and luciferase bioluminescent activity is only useful when there is a significant difference in copy number. Continuous exposure of cell cultures to luciferin leads to inhibitory effects on mitochondrial activity, cell growth and bioluminescence. These inhibitory effects correlate with luciferase copy number. Cell culture and mouse xenograft assays showed no significant functional differences between luciferase-transfected and parental cells. Luciferase-transfected cells should be validated by quantitative and functional assays before starting large-scale experiments. Copyright © 2016 International Institute of Anticancer Research (Dr. John G. Delinassios), All rights reserved.

  7. Literality and Cognitive Effort

    DEFF Research Database (Denmark)

    Lacruz, Isabel; Carl, Michael; Yamada, Masaru

    2018-01-01

We introduce a notion of pause-word ratio computed using ranges of pause lengths rather than lower cutoffs for pause lengths. Standard pause-word ratios are indicators of cognitive effort during different translation modalities. The pause range version allows for the study of how different types...... remoteness. We use data from the CRITT TPR database, comparing translation and post-editing from English to Japanese and from English to Spanish, and study the interaction of pause-word ratio for short pauses ranging between 300 and 500 ms with syntactic remoteness, measured by the CrossS feature, semantic...... remoteness, measured by HTra, and syntactic and semantic remoteness, measured by Literality....

  8. Mapping telemedicine efforts

    DEFF Research Database (Denmark)

    Kierkegaard, Patrick

    2015-01-01

Objectives: The aim of this study is to survey telemedicine services currently in operation across Denmark. The study specifically seeks to answer the following questions: What initiatives are deployed within the different regions? What are the motivations behind the projects? What technologies are being utilized? What medical disciplines are being addressed using telemedicine systems? Methods: All data was surveyed from the "Telemedicinsk Landkort", a newly created database designed to provide a comprehensive and systematic overview of all telemedicine technologies in Denmark. Results: The results of this study suggest that a growing number of telemedicine initiatives are currently in operation across Denmark, but that considerable variations existed in terms of regional efforts, as the number of operational telemedicine projects varied from region to region. Conclusions: The results...

  9. Towards a Concerted Effort

    DEFF Research Database (Denmark)

    Johansen, Mette-Louise; Mouritsen, Tina; Montgomery, Edith

    2006-01-01

    This book contains a method model for the prevention of youth crime in Danish municipalities. The method model consists of instructions for conducting processual network meetings between traumatized refugee parents and the professional specialists working with their children on an intermunicipal...

  10. Voluntary versus Enforced Team Effort

    Directory of Open Access Journals (Sweden)

    Claudia Keser

    2011-08-01

We present a model where each of two players chooses between remuneration based on either private or team effort. Although at least one of the players has the equilibrium strategy of choosing private remuneration, we frequently observe both players choosing team remuneration in a series of laboratory experiments. This allows for high cooperation payoffs but also provides individual free-riding incentives. Due to significant cooperation, we observe that, in team remuneration, participants make higher profits than in private remuneration. We also observe that, when participants are not given the option of private remuneration, they cooperate significantly less.
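The strategic structure described (private remuneration as the equilibrium choice, yet team remuneration paying more when both cooperate) is a prisoner's-dilemma-like game. A minimal sketch with hypothetical payoffs, not the paper's parameterisation:

```python
# Hypothetical payoffs (row player, column player) over remuneration choices.
# Both on team pay can cooperate for 5 each; private pay guarantees 3;
# a lone team-pay player is exploited (1 vs 6).
PAYOFFS = {
    ("team", "team"): (5, 5),
    ("team", "private"): (1, 6),
    ("private", "team"): (6, 1),
    ("private", "private"): (3, 3),
}

def nash_equilibria(payoffs):
    """Pure-strategy Nash equilibria: profiles where no player gains by a
    unilateral deviation."""
    strategies = ("team", "private")
    eqs = []
    for s1 in strategies:
        for s2 in strategies:
            u1, u2 = payoffs[(s1, s2)]
            if (all(payoffs[(d, s2)][0] <= u1 for d in strategies)
                    and all(payoffs[(s1, d)][1] <= u2 for d in strategies)):
                eqs.append((s1, s2))
    return eqs
```

Under these assumed payoffs the unique pure equilibrium is (private, private), even though (team, team) pays both players more, mirroring the tension between equilibrium prediction and the observed cooperation in the experiments.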

  11. Breckinridge Project, initial effort

    Energy Technology Data Exchange (ETDEWEB)

    None

    1982-01-01

    The project cogeneration plant supplies electric power, process steam and treated boiler feedwater for use by the project plants. The plant consists of multiple turbine generators and steam generators connected to a common main steam header. The major plant systems which are required to produce steam, electrical power and treated feedwater are discussed individually. The systems are: steam, steam generator, steam generator fuel, condensate and feedwater deaeration, condensate and blowdown collection, cooling water, boiler feedwater treatment, coal handling, ash handling (fly ash and bottom ash), electrical, and control system. The plant description is based on the Phase Zero design basis established for Plant 31 in July of 1980 and the steam/condensate balance as presented on Drawing 31-E-B-1. Updating of steam requirements as more refined process information becomes available has generated some changes in the steam balance. Boiler operation with these updated requirements is reflected on Drawing 31-D-B-1A. The major impact of updating has been that less 600 psig steam generated within the process units requires more extraction steam from the turbine generators to close the 600 psig steam balance. Since the 900 psig steam generation from the boilers was fixed at 1,200,000 lb/hr, the additional extraction steam required to close the 600 psig steam balance decreased the quantity of electrical power available from the turbine generators. In the next phase of engineering work, the production of 600 psig steam will be augmented by increasing convection bank steam generation in the Plant 3 fired heaters by 140,000 to 150,000 lb/hr. This modification will allow full rated power generation from the turbine generators.

  12. On Early Conflict Identification by Requirements Modeling of Energy System Control Structures

    DEFF Research Database (Denmark)

    Heussen, Kai; Gehrke, Oliver; Niemann, Hans Henrik

    2015-01-01

    issues early. For requirements formulation of control structures, cyber and physical aspects need to be jointly represented to express interdependencies, check for consistency and discover potentially conflicting requirements. Early identification of potential conflicts may prevent larger problems...... at later design stages. However, languages employed for requirements modeling today do not offer the expressiveness necessary to represent control purposes in relation to domain level interactions and therefore miss several types of interdependencies. This paper introduces the idea of control structure...... modeling for early requirements checking using a suitable modeling language, and illustrates how this approach enables the identification of several classes of controller conflict....

  13. Requirements-level semantics and model checking of object-oriented statecharts

    NARCIS (Netherlands)

    Eshuis, H.; Jansen, D.N.; Wieringa, Roelf J.

    2002-01-01

    In this paper we define a requirements-level execution semantics for object-oriented statecharts and show how properties of a system specified by these statecharts can be model checked using tool support for model checkers. Our execution semantics is requirements-level because it uses the perfect

  14. Swedish nuclear waste efforts

    International Nuclear Information System (INIS)

    Rydberg, J.

    1981-09-01

    After the introduction of a law prohibiting the start-up of any new nuclear power plant until the utility had shown that the waste produced by the plant could be taken care of in an absolutely safe way, the Swedish nuclear utilities in December 1976 embarked on the Nuclear Fuel Safety Project, which in November 1977 presented a first report, Handling of Spent Nuclear Fuel and Final Storage of Vitrified Waste (KBS-I), and in November 1978 a second report, Handling and Final Storage of Unreprocessed Spent Nuclear Fuel (KBS-II). These summary reports were supported by 120 technical reports prepared by 450 experts. The project engaged 70 private and governmental institutions at a total cost of US $15 million. The KBS-I and KBS-II reports are summarized in this document, as are the continued waste research efforts carried out by KBS, SKBF, PRAV, ASEA and other Swedish organizations. The KBS reports describe all steps (except reprocessing) in the handling chain, from the removal of spent fuel elements from a reactor until their radioactive waste products are finally disposed of, in canisters, in an underground granite depository. The KBS concept relies on engineered multibarrier systems in combination with final storage in thoroughly investigated stable geologic formations. This report also briefly describes other activities carried out by the nuclear industry, namely, the construction of a central storage facility for spent fuel elements (to be in operation by 1985), a repository for reactor waste (to be in operation by 1988), and an intermediate storage facility for vitrified high-level waste (to be in operation by 1990). The R and D activities are updated to September 1981.

  15. Worldwide effort against smoking.

    Science.gov (United States)

    1986-07-01

    The 39th World Health Assembly, which met in May 1986, recognized the escalating health problem of smoking-related diseases and affirmed that tobacco smoking and its use in other forms are incompatible with the attainment of "Health for All by the Year 2000." If properly implemented, antismoking campaigns can decrease the prevalence of smoking. Nations as a whole must work toward changing smoking habits, and governments must support these efforts by officially stating their stand against smoking. Over 60 countries have introduced legislation affecting smoking. Policies range from adopting health education programs designed to increase people's awareness of its dangers to raising taxes to deter smoking through higher tobacco prices. Each country must adopt an antismoking campaign which works most effectively within the cultural parameters of the society. Other smoking policies include: printed warnings on cigarette packages; health messages via radio, television, mobile teams, pamphlets, health workers, clinic walls, and newspapers; prohibition of smoking in public areas and transportation; prohibition of all advertisement of cigarettes and tobacco; and the establishment of upper limits of tar and nicotine content in cigarettes. The tobacco industry spends about $2000 million annually on worldwide advertising. According to the World Health Organization (WHO), controlling this overabundance of tobacco advertisements is a major priority in preventing the spread of smoking. Cigarette and tobacco advertising can be controlled to varying degrees, e.g., over a dozen countries have enacted a total ban on advertising on television or radio, a mandatory health warning must accompany advertisements in other countries, and tobacco companies often are prohibited from sponsoring sports events. Imposing a substantial tax on cigarettes is one of the most effective means to deter smoking. However, raising taxes and banning advertisements is not enough because

  16. Input data requirements for performance modelling and monitoring of photovoltaic plants

    DEFF Research Database (Denmark)

    Gavriluta, Anamaria Florina; Spataru, Sergiu; Sera, Dezso

    2018-01-01

    This work investigates the input data requirements in the context of performance modeling of thin-film photovoltaic (PV) systems. The analysis focuses on the PVWatts performance model, well suited for on-line performance monitoring of PV strings, due to its low number of parameters and high......, modelling the performance of the PV modules at high irradiances requires a dataset of only a few hundred samples in order to obtain a power estimation accuracy of ~1-2%....
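The low parameter count that makes PVWatts attractive for on-line monitoring is visible in its DC power equation: irradiance scaling plus a linear cell-temperature correction. A minimal sketch; the 5 kW rating and the operating point below are assumed example values, and the temperature coefficient is a typical default rather than a value from this record.

```python
def pvwatts_dc(g_poa, t_cell, pdc0, gamma_pdc=-0.0035, t_ref=25.0):
    """PVWatts DC power model: plane-of-array irradiance g_poa (W/m^2) scaled
    to the 1000 W/m^2 rating pdc0 (W), with a linear temperature coefficient
    gamma_pdc (1/degC) applied about the reference cell temperature t_ref."""
    return (g_poa / 1000.0) * pdc0 * (1.0 + gamma_pdc * (t_cell - t_ref))

# A string rated at 5 kW DC, at 800 W/m^2 and 40 degC cell temperature:
p = pvwatts_dc(800.0, 40.0, 5000.0)
```

Because only `pdc0` and `gamma_pdc` must be identified from data, a few hundred high-irradiance samples can suffice, which is the point the abstract makes.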

  17. Termination of prehospital resuscitative efforts

    DEFF Research Database (Denmark)

    Mikkelsen, Søren; Schaffalitzky de Muckadell, Caroline; Binderup, Lars Grassmé

    2017-01-01

    -and-death decision-making in the patient's medical records is required. We suggest that a template be implemented in the prehospital medical records describing the basis for any ethical decisions. This template should contain information regarding the persons involved in the deliberations and notes on ethical......BACKGROUND: Discussions on ethical aspects of life-and-death decisions within the hospital are often made in plenary. The prehospital physician, however, may be faced with ethical dilemmas in life-and-death decisions when time-critical decisions to initiate or refrain from resuscitative efforts...... need to be taken without the possibility to discuss matters with colleagues. Little is known whether these considerations regarding ethical issues in crucial life-and-death decisions are documented prehospitally. This is a review of the ethical considerations documented in the prehospital medical...

  18. Breckinridge Project, initial effort

    Energy Technology Data Exchange (ETDEWEB)

    None

    1982-01-01

    Report IV, Volume 7 provides descriptions, data, and drawings pertaining to the Oxygen Plant (Plant 15) and Naphtha Hydrotreating and Reforming (Plant 18). The Oxygen Plant (Plant 15) utilizes low-pressure air separation to manufacture the oxygen required in Gasification and Purification (Plant 12). The Oxygen Plant also supplies nitrogen as needed by the H-COAL process. Naphtha Hydrotreating and Reforming (Plant 18) upgrades the raw H-COAL naphtha. The following information is provided for both plants described in this volume: a description of the plant's process design, including the utility balance, catalysts and chemicals usage, and process flow diagrams; an equipment list including item numbers and descriptions; data sheets and sketches for major plant components (Oxygen Plant only); and pertinent engineering drawings. An appendix contains: an overall site plan showing the locations of all plants; and the symbols and legend for the piping and instrument diagrams included in this volume.

  19. Programming effort analysis of the ELLPACK language

    Science.gov (United States)

    Rice, J. R.

    1978-01-01

    ELLPACK is a problem statement language and system for elliptic partial differential equations which is implemented by a FORTRAN preprocessor. ELLPACK's principal purpose is as a tool for the performance evaluation of software. However, it is used here as an example with which to study the programming effort required for problem solving. It is obvious that problem statement languages can reduce programming effort tremendously; the goal is to quantify this somewhat. This is done by analyzing the lengths and effort (as measured by Halstead's software science technique) of various approaches to solving these problems.
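The Halstead software-science effort measure mentioned in the abstract is computed from operator and operand counts. A short sketch; the counts in the example are invented for illustration.

```python
import math

def halstead_effort(eta1, eta2, n1, n2):
    """Halstead software science: eta1/eta2 are distinct operators/operands,
    n1/n2 their total occurrences. Effort E = D * V, where volume
    V = N * log2(eta) and difficulty D = (eta1 / 2) * (n2 / eta2)."""
    vocabulary = eta1 + eta2          # eta
    length = n1 + n2                  # N
    volume = length * math.log2(vocabulary)
    difficulty = (eta1 / 2.0) * (n2 / eta2)
    return difficulty * volume

# e.g. 10 distinct operators, 15 distinct operands, 40 + 60 total occurrences:
e = halstead_effort(10, 15, 40, 60)
```

Comparing this measure across an ELLPACK statement and a hand-written FORTRAN solution of the same problem is the kind of quantification the record describes.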

  20. Dopamine and Effort-Based Decision Making

    Directory of Open Access Journals (Sweden)

    Irma Triasih Kurniawan

    2011-06-01

    Full Text Available Motivational theories of choice focus on the influence of goal values and strength of reinforcement to explain behavior. By contrast relatively little is known concerning how the cost of an action, such as effort expended, contributes to a decision to act. Effort-based decision making addresses how we make an action choice based on an integration of action and goal values. Here we review behavioral and neurobiological data regarding the representation of effort as action cost, and how this impacts on decision making. Although organisms expend effort to obtain a desired reward there is a striking sensitivity to the amount of effort required, such that the net preference for an action decreases as effort cost increases. We discuss the contribution of the neurotransmitter dopamine (DA) towards overcoming response costs and in enhancing an animal’s motivation towards effortful actions. We also consider the contribution of brain structures, including the basal ganglia (BG) and anterior cingulate cortex (ACC), in the internal generation of action involving a translation of reward expectation into effortful action.
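The sensitivity to effort cost described in this abstract is often formalized as effort discounting. A minimal linear-discounting sketch; the cost parameter k and the option values are illustrative, not taken from the reviewed studies.

```python
def net_value(reward, effort, k=0.5):
    """Linear effort discounting: subjective value falls as effort cost rises.
    The cost parameter k (assumed here) scales how steeply effort devalues
    the reward."""
    return reward - k * effort

options = {"high_effort": (10.0, 8.0), "low_effort": (4.0, 1.0)}

# With shallow discounting the high-effort/high-reward option wins:
values = {name: net_value(r, e) for name, (r, e) in options.items()}
choice = max(values, key=values.get)

# A steeper cost parameter (as when DA transmission is reduced) reverses
# the preference toward the low-effort option:
steep = {name: net_value(r, e, k=1.0) for name, (r, e) in options.items()}
choice_steep = max(steep, key=steep.get)
```

The reversal from `choice` to `choice_steep` mirrors the low-effort bias reported when dopamine transmission is interfered with.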

  1. Breckinridge Project, initial effort

    Energy Technology Data Exchange (ETDEWEB)

    None

    1982-01-01

    Report IV, Volume 6 provides descriptions, data, and drawings pertaining to Gasification and Purification (Plant 12). Gasification and Purification (Plant 12) produces makeup hydrogen for H-COAL Preheating and Reaction (Plant 3), and produces a medium Btu fuel gas for consumption in fired heaters. The following information is included: a description of the plant's process design, including the utility balance, catalysts and chemicals usage, and a process flow diagram; an equipment list, including item numbers and descriptions; data sheets and sketches for major plant components; and pertinent engineering drawings. An appendix contains: an overall site plan showing the locations of all plants; and the symbols and legend for the piping and instrument diagrams included in this volume. Gasification and Purification (Plant 12) utilizes process technology from three licensors: gasification of vacuum bottoms using the Texaco process, shift conversion using the Haldor Topsoe process, and purification of fuel gas and shifted gas using the Allied Chemical Selexol process. This licensed technology is proprietary in nature. As a result, this volume does not contain full disclosure of these processes although a maximum of information has been presented consistent with the confidentiality requirements. Where data appears incomplete in this volume, it is due to the above described limitations. Full data concerning this plant are available for DOE review at the Houston offices of Bechtel Petroleum, Inc.

  2. Personal recommender systems for learners in lifelong learning: requirements, techniques and model

    NARCIS (Netherlands)

    Drachsler, Hendrik; Hummel, Hans; Koper, Rob

    2007-01-01

    Drachsler, H., Hummel, H. G. K., & Koper, R. (2008). Personal recommender systems for learners in lifelong learning: requirements, techniques and model. International Journal of Learning Technology, 3(4), 404-423.

  3. Developing a Predictive Model for Unscheduled Maintenance Requirements on United States Air Force Installations

    National Research Council Canada - National Science Library

    Kovich, Matthew D; Norton, J. D

    2008-01-01

    .... This paper develops one such method, using linear regression and time-series analysis to build a predictive model that forecasts future-year man-hour and funding requirements for unscheduled maintenance...
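The regression component of such a forecast can be illustrated with a closed-form least-squares trend fit. The annual man-hour figures below are hypothetical, invented only to show the mechanics.

```python
def linear_trend(xs, ys):
    """Ordinary least squares fit y = a + b*x, via the closed-form
    covariance/variance solution."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Hypothetical annual unscheduled-maintenance man-hours (thousands):
years = list(range(2000, 2008))
man_hours = [10.1, 10.8, 11.2, 12.0, 12.4, 13.1, 13.9, 14.4]

a, b = linear_trend(years, man_hours)
forecast_2009 = a + b * 2009
```

A time-series treatment as in the paper would add seasonal or autoregressive terms, but the extrapolated trend is the core of the man-hour projection.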

  4. A modeling ontology for integrating vulnerabilities into security requirements conceptual foundations

    NARCIS (Netherlands)

    Elahi, G.; Yu, E.; Zannone, N.; Laender, A.H.F.; Castano, S.; Dayal, U.; Casati, F.; Palazzo Moreira de Oliveira, J.

    2009-01-01

    Vulnerabilities are weaknesses in the requirements, design, and implementation, which attackers exploit to compromise the system. This paper proposes a vulnerability-centric modeling ontology, which aims to integrate empirical knowledge of vulnerabilities into the system development process. In

  5. Sizing and scaling requirements of a large-scale physical model for code validation

    International Nuclear Information System (INIS)

    Khaleel, R.; Legore, T.

    1990-01-01

    Model validation is an important consideration in application of a code for performance assessment and therefore in assessing the long-term behavior of the engineered and natural barriers of a geologic repository. Scaling considerations relevant to porous media flow are reviewed. An analysis approach is presented for determining the sizing requirements of a large-scale, hydrology physical model. The physical model will be used to validate performance assessment codes that evaluate the long-term behavior of the repository isolation system. Numerical simulation results for sizing requirements are presented for a porous medium model in which the media properties are spatially uncorrelated

  6. Model-Based Requirements Analysis for Reactive Systems with UML Sequence Diagrams and Coloured Petri Nets

    DEFF Research Database (Denmark)

    Tjell, Simon; Lassen, Kristian Bisgaard

    2008-01-01

    In this paper, we describe a formal foundation for a specialized approach to automatically checking traces against real-time requirements. The traces are obtained from simulation of Coloured Petri Net (CPN) models of reactive systems. The real-time requirements are expressed in terms of a derivat...

  7. Fuel requirements for experimental devices in MTR reactors. A perturbation model for reactor core analysis

    International Nuclear Information System (INIS)

    Beeckmans de West-Meerbeeck, A.

    1991-01-01

    Irradiation in neutron-absorbing devices, requiring high fast neutron fluxes in the core or high thermal fluxes in the reflector and flux traps, leads to higher-density fuel and larger core dimensions. A perturbation model of the reactor core helps to estimate the fuel requirements. (orig.)

  8. State of the Art : Integrated Management of Requirements in Model-Based Software Engineering

    OpenAIRE

    Thörn, Christer

    2006-01-01

    This report describes the background and future of research concerning integrated management of requirements in model-based software engineering. The focus is on describing the relevant topics and existing theoretical backgrounds that form the basis for the research. The report describes the fundamental difficulties of requirements engineering for software projects, and proposes that the results and methods of models in software engineering can help alleviate those problems. Taking inspiration...

  9. A Hybrid Parallel Execution Model for Logic Based Requirement Specifications (Invited Paper)

    Directory of Open Access Journals (Sweden)

    Jeffrey J. P. Tsai

    1999-05-01

    Full Text Available It is well known that undiscovered errors in a requirements specification are extremely expensive to fix when discovered in the software maintenance phase. Errors in the requirements phase can be reduced through validation and verification of the requirements specification. Many logic-based requirements specification languages have been developed to achieve these goals. However, executing and reasoning over a logic-based requirements specification can be very slow. An effective way to improve performance is to execute and reason over the specification in parallel. In this paper, we present a hybrid model to facilitate the parallel execution of a logic-based requirements specification language. A logic-based specification is first analyzed by a data dependency technique that finds all the mode combinations that exist within a specification clause. This mode information is used to support a novel hybrid parallel execution model, which combines both top-down and bottom-up evaluation strategies. The new execution model can find the failure in the deepest node of the search tree at an early stage of the evaluation, thus reducing the total number of nodes searched in the tree, the total number of processes that need to be generated, and the total number of communication channels needed in the search process. A simulator has been implemented to analyze the execution behavior of the new model. Experiments show significant improvement based on several criteria.

  10. Dopamine, Effort-Based Choice, and Behavioral Economics: Basic and Translational Research.

    Science.gov (United States)

    Salamone, John D; Correa, Merce; Yang, Jen-Hau; Rotolo, Renee; Presby, Rose

    2018-01-01

    Operant behavior is not only regulated by factors related to the quality or quantity of reinforcement, but also by the work requirements inherent in performing instrumental actions. Moreover, organisms often make effort-related decisions involving economic choices such as cost/benefit analyses. Effort-based decision making is studied using behavioral procedures that offer choices between high-effort options leading to relatively preferred reinforcers vs. low effort/low reward choices. Several neural systems, including the mesolimbic dopamine (DA) system and other brain circuits, are involved in regulating effort-related aspects of motivation. Considerable evidence indicates that mesolimbic DA transmission exerts a bi-directional control over exertion of effort on instrumental behavior tasks. Interference with DA transmission produces a low-effort bias in animals tested on effort-based choice tasks, while increasing DA transmission with drugs such as DA transport blockers tends to enhance selection of high-effort options. The results from these pharmacology studies are corroborated by the findings from recent articles using optogenetic, chemogenetic and physiological techniques. In addition to providing important information about the neural regulation of motivated behavior, effort-based choice tasks are useful for developing animal models of some of the motivational symptoms that are seen in people with various psychiatric and neurological disorders (e.g., depression, schizophrenia, Parkinson's disease). Studies of effort-based decision making may ultimately contribute to the development of novel drug treatments for motivational dysfunction.

  11. Dopamine, Effort-Based Choice, and Behavioral Economics: Basic and Translational Research

    Directory of Open Access Journals (Sweden)

    John D. Salamone

    2018-03-01

    Full Text Available Operant behavior is not only regulated by factors related to the quality or quantity of reinforcement, but also by the work requirements inherent in performing instrumental actions. Moreover, organisms often make effort-related decisions involving economic choices such as cost/benefit analyses. Effort-based decision making is studied using behavioral procedures that offer choices between high-effort options leading to relatively preferred reinforcers vs. low effort/low reward choices. Several neural systems, including the mesolimbic dopamine (DA) system and other brain circuits, are involved in regulating effort-related aspects of motivation. Considerable evidence indicates that mesolimbic DA transmission exerts a bi-directional control over exertion of effort on instrumental behavior tasks. Interference with DA transmission produces a low-effort bias in animals tested on effort-based choice tasks, while increasing DA transmission with drugs such as DA transport blockers tends to enhance selection of high-effort options. The results from these pharmacology studies are corroborated by the findings from recent articles using optogenetic, chemogenetic and physiological techniques. In addition to providing important information about the neural regulation of motivated behavior, effort-based choice tasks are useful for developing animal models of some of the motivational symptoms that are seen in people with various psychiatric and neurological disorders (e.g., depression, schizophrenia, Parkinson’s disease). Studies of effort-based decision making may ultimately contribute to the development of novel drug treatments for motivational dysfunction.

  12. Egg origin determination efforts

    International Nuclear Information System (INIS)

    Horvath, A.; Futo, I.; Vodila, G.; Palcsu, L.

    2012-01-01

    whites outside this interval, foreign origin can be assumed, since the isotope ratios of the drinking water samples cover the natural waters characteristic of Hungary. As a conclusion, the same applies to eggs as well. The three foreign egg samples can be separated from the Hungarian ones based on their δ18O and δD values; however, differences in the shifts compared to their own drinking water samples mask the region-specific information of the drinking water. From the micro-elemental composition of the egg shells it can be stated that identification of the samples can be performed with a precision of 97.1%; the differences in elemental composition are therefore large enough to characterise the origin of the eggs by elemental analysis of the egg shell. From the market-protection point of view, it is recommended to compare the elemental composition of the shell of a supposedly foreign egg with an information database established for each production plant in Hungary. As the information stamp of the mislabelled eggs carries the code of a Hungarian production plant, it would be comparable with genuine eggs originating from that plant. In this way, foreign eggs may be filtered out. Of course, to verify this method, further investigations are required.

  13. Vocal effort modulates the motor planning of short speech structures

    Science.gov (United States)

    Taitz, Alan; Shalom, Diego E.; Trevisan, Marcos A.

    2018-05-01

    Speech requires programming the sequence of vocal gestures that produce the sounds of words. Here we explored the timing of this program by asking our participants to pronounce, as quickly as possible, a sequence of consonant-consonant-vowel (CCV) structures appearing on screen. We measured the delay between visual presentation and voice onset. In the case of plosive consonants, produced by sharp and well-defined movements of the vocal tract, we found that delays are positively correlated with the duration of the transition between consonants. We then used a battery of statistical tests and mathematical vocal models to show that delays reflect the motor planning of CCVs and that transitions are proxy indicators of the vocal effort needed to produce them. These results support the view that the effort required to produce the sequence of movements of a vocal gesture modulates the onset of the motor plan.

  14. Modeling of requirement specification for safety critical real time computer system using formal mathematical specifications

    International Nuclear Information System (INIS)

    Sankar, Bindu; Sasidhar Rao, B.; Ilango Sambasivam, S.; Swaminathan, P.

    2002-01-01

    Full text: Real time computer systems are increasingly used for safety critical supervision and control of nuclear reactors. Typical application areas are supervision of the reactor core against coolant flow blockage, supervision of clad hot spots, supervision of undesirable power excursions, power control and control logic for fuel handling systems. The most frequent cause of faults in safety critical real time computer systems is traced to fuzziness in requirement specification. To ensure the specified safety, it is necessary to model the requirement specification of safety critical real time computer systems using formal mathematical methods. Modeling eliminates the fuzziness in the requirement specification and also helps to prepare the verification and validation schemes. Test data can be easily designed from the model of the requirement specification. Z and B are popular languages used for modeling the requirement specification. A typical safety critical real time computer system for supervising the reactor core of the prototype fast breeder reactor (PFBR) against flow blockage is taken as a case study. Modeling techniques and the actual model are explained in detail. The advantages of modeling for ensuring the safety are summarized
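A requirement like the flow-blockage supervision described in this record can be stated as a checkable predicate from which test data follow directly, which is the benefit of formal modeling the abstract claims. The threshold below is invented for illustration and is not from the PFBR specification.

```python
# Toy illustration of a formally specifiable supervision rule (core
# flow-blockage protection). The trip fraction is an ASSUMED value.

FLOW_TRIP_FRACTION = 0.85  # trip if channel flow drops below 85% of nominal

def must_trip(nominal_flow, measured_flow):
    """Requirement as a checkable predicate: the system shall trip whenever
    measured flow falls below the trip fraction of nominal flow."""
    return measured_flow < FLOW_TRIP_FRACTION * nominal_flow

# Test cases derived directly from the predicate, as the abstract suggests:
assert must_trip(100.0, 80.0) is True
assert must_trip(100.0, 95.0) is False
```

In a language such as Z or B the same rule would be written as a schema invariant; the executable predicate here plays the role of that model for test-data generation.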

  15. Visual cues and listening effort: individual variability.

    Science.gov (United States)

    Picou, Erin M; Ricketts, Todd A; Hornsby, Benjamin W Y

    2011-10-01

    To investigate the effect of visual cues on listening effort as well as whether predictive variables such as working memory capacity (WMC) and lipreading ability affect the magnitude of listening effort. Twenty participants with normal hearing were tested using a paired-associates recall task in 2 conditions (quiet and noise) and 2 presentation modalities (audio only [AO] and auditory-visual [AV]). Signal-to-noise ratios were adjusted to provide matched speech recognition across audio-only and AV noise conditions. Also measured were subjective perceptions of listening effort and 2 predictive variables: (a) lipreading ability and (b) WMC. Objective and subjective results indicated that listening effort increased in the presence of noise, but on average the addition of visual cues did not significantly affect the magnitude of listening effort. Although there was substantial individual variability, on average participants who were better lipreaders or had larger WMCs demonstrated reduced listening effort in noise in AV conditions. Overall, the results support the hypothesis that integrating auditory and visual cues requires cognitive resources in some participants. The data indicate that low lipreading ability or low WMC is associated with relatively effortful integration of auditory and visual information in noise.

  16. Capital Regulation, Liquidity Requirements and Taxation in a Dynamic Model of Banking

    NARCIS (Netherlands)

    Di Nicolo, G.; Gamba, A.; Lucchetta, M.

    2011-01-01

    This paper formulates a dynamic model of a bank exposed to both credit and liquidity risk, which can resolve financial distress in three costly forms: fire sales, bond issuance and equity issuance. We use the model to analyze the impact of capital regulation, liquidity requirements and taxation on

  17. A Functional Model of Quality Assurance for Psychiatric Hospitals and Corresponding Staffing Requirements.

    Science.gov (United States)

    Kamis-Gould, Edna; And Others

    1991-01-01

    A model for quality assurance (QA) in psychiatric hospitals is described. Its functions (general QA, utilization review, clinical records, evaluation, management information systems, risk management, and infection control), subfunctions, and corresponding staffing requirements are reviewed. This model was designed to foster standardization in QA…

  18. Capital Regulation, Liquidity Requirements and Taxation in a Dynamic Model of Banking

    NARCIS (Netherlands)

    Di Nicolo, G.; Gamba, A.; Lucchetta, M.

    2011-01-01

    This paper formulates a dynamic model of a bank exposed to both credit and liquidity risk, which can resolve financial distress in three costly forms: fire sales, bond issuance and equity issuance. We use the model to analyze the impact of capital regulation, liquidity requirements and taxation on

  19. The Nuremberg Code subverts human health and safety by requiring animal modeling

    Directory of Open Access Journals (Sweden)

    Greek Ray

    2012-07-01

    Full Text Available Background: The requirement that animals be used in research and testing in order to protect humans was formalized in the Nuremberg Code and subsequent national and international laws, codes, and declarations. Discussion: We review the history of these requirements and contrast what was known via science about animal models then with what is known now. We further analyze the predictive value of animal models when used as test subjects for human response to drugs and disease. We explore the use of animals as models in toxicity testing as an example of the problems with using animal models. Summary: We conclude that the requirements for animal testing found in the Nuremberg Code were based on scientifically outdated principles, compromised by people with a vested interest in animal experimentation, serve no useful function, increase the cost of drug development, and prevent otherwise safe and efficacious drugs and therapies from being implemented.

  20. Spontaneous breathing during lung-protective ventilation in an experimental acute lung injury model: high transpulmonary pressure associated with strong spontaneous breathing effort may worsen lung injury.

    Science.gov (United States)

    Yoshida, Takeshi; Uchiyama, Akinori; Matsuura, Nariaki; Mashimo, Takashi; Fujino, Yuji

    2012-05-01

    We investigated whether potentially injurious transpulmonary pressure could be generated by strong spontaneous breathing and exacerbate lung injury even when plateau pressure is limited. Four experimental groups combined low or moderate tidal volume ventilation with weak or strong spontaneous breathing effort. Inspiratory pressure for low tidal volume ventilation was set at 10 cm H2O and tidal volume at 6 mL/kg; for moderate tidal volume ventilation, the values were 20 cm H2O and 7-9 mL/kg. The groups were: low tidal volume ventilation + spontaneous breathing (weak), low tidal volume ventilation + spontaneous breathing (strong), moderate tidal volume ventilation + spontaneous breathing (weak), and moderate tidal volume ventilation + spontaneous breathing (strong). Each group had the same positive end-expiratory pressure of 8 cm H2O. Respiratory variables were measured every 60 mins. Distribution of lung aeration and alveolar collapse were histologically evaluated. Low tidal volume ventilation + spontaneous breathing (strong) showed the most favorable oxygenation and compliance of the respiratory system, and the best lung aeration. By contrast, in moderate tidal volume ventilation + spontaneous breathing (strong), the greatest atelectasis with numerous neutrophils was observed. Although the settings were chosen to limit plateau pressure in moderate tidal volume ventilation + spontaneous breathing (strong), transpulmonary pressure rose >33 cm H2O. Both minute ventilation and respiratory rate were higher in the strong spontaneous breathing groups. Even when plateau pressure is limited during mechanical ventilation, transpulmonary pressure and tidal volume should be strictly controlled to prevent further lung injury.

  1. Model-Based Requirements Management in Gear Systems Design Based On Graph-Based Design Languages

    Directory of Open Access Journals (Sweden)

    Kevin Holder

    2017-10-01

    Full Text Available For several decades, a widespread consensus concerning the enormous importance of an in-depth clarification of the specifications of a product has been observed. A weak clarification of specifications is repeatedly listed as a main cause for the failure of product development projects. Requirements, which can be defined as the purpose, goals, constraints, and criteria associated with a product development project, play a central role in the clarification of specifications. The collection of activities which ensure that requirements are identified, documented, maintained, communicated, and traced throughout the life cycle of a system, product, or service can be referred to as “requirements engineering”. These activities can be supported by a collection and combination of strategies, methods, and tools which are appropriate for the clarification of specifications. Numerous publications describe the strategy and the components of requirements management. Furthermore, recent research investigates its industrial application. Simultaneously, promising developments of graph-based design languages for a holistic digital representation of the product life cycle are presented. Current developments realize graph-based languages by the diagrams of the Unified Modelling Language (UML), and allow the automatic generation and evaluation of multiple product variants. The research presented in this paper seeks to present a method in order to combine the advantages of a conscious requirements management process and graph-based design languages. Consequently, the main objective of this paper is the investigation of a model-based integration of requirements in a product development process by means of graph-based design languages. The research method is based on an in-depth analysis of an exemplary industrial product development, a gear system for so-called “Electrical Multiple Units” (EMU). Important requirements were abstracted from a gear system

  2. Cognitive effort: A neuroeconomic approach

    Science.gov (United States)

    Braver, Todd S.

    2015-01-01

    Cognitive effort has been implicated in numerous theories regarding normal and aberrant behavior and the physiological response to engagement with demanding tasks. Yet, despite broad interest, no unifying, operational definition of cognitive effort itself has been proposed. Here, we argue that the most intuitive and epistemologically valuable treatment is in terms of effort-based decision-making, and advocate a neuroeconomics-focused research strategy. We first outline psychological and neuroscientific theories of cognitive effort. Then we describe the benefits of a neuroeconomic research strategy, highlighting how it affords greater inferential traction than do traditional markers of cognitive effort, including self-reports and physiologic markers of autonomic arousal. Finally, we sketch a future series of studies that can leverage the full potential of the neuroeconomic approach toward understanding the cognitive and neural mechanisms that give rise to phenomenal, subjective cognitive effort. PMID:25673005

  3. A hierarchical modeling methodology for the definition and selection of requirements

    Science.gov (United States)

    Dufresne, Stephane

    This dissertation describes the development of a requirements analysis methodology that takes into account the concept of operations and the hierarchical decomposition of aerospace systems. At the core of the methodology, the Analytic Network Process (ANP) is used to ensure the traceability between the qualitative and quantitative information present in the hierarchical model. The proposed methodology is applied to the requirements definition of a hurricane tracker Unmanned Aerial Vehicle. Three research objectives are identified in this work: (1) improve the requirements mapping process by matching the stakeholder expectations with the concept of operations, systems and available resources; (2) reduce the epistemic uncertainty surrounding the requirements and requirements mapping; and (3) improve the requirements down-selection process by taking into account the level of importance of the criteria and the available resources. Several challenges are associated with the identification and definition of requirements. The complexity of the system implies that a large number of requirements are needed to define the system. These requirements are defined early in the conceptual design, where the level of knowledge is relatively low and the level of uncertainty is large. The proposed methodology intends to increase the level of knowledge and reduce the level of uncertainty by guiding the design team through a structured process. To address these challenges, a new methodology is created to flow down the requirements from the stakeholder expectations to the systems alternatives. A taxonomy of requirements is created to classify the information gathered during the problem definition. Subsequently, the operational and systems functions and measures of effectiveness are integrated into a hierarchical model to allow the traceability of the information. Monte Carlo methods are used to evaluate the variations of the hierarchical model elements and consequently reduce the
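The Monte Carlo treatment of weight uncertainty described in this record can be illustrated with a minimal sketch. The criteria weights, alternative scores, and lognormal dispersion below are hypothetical placeholders for illustration, not values from the dissertation:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical example: 3 requirement criteria weighted against 2 system
# alternatives (values are illustrative, not from the dissertation).
base_weights = np.array([0.5, 0.3, 0.2])       # criteria importance
scores = np.array([[0.9, 0.4, 0.6],            # alternative A per criterion
                   [0.6, 0.8, 0.7]])           # alternative B per criterion

n_trials = 10_000
wins_a = 0
for _ in range(n_trials):
    # Perturb the weights to represent epistemic uncertainty, then renormalize.
    w = base_weights * rng.lognormal(mean=0.0, sigma=0.2, size=3)
    w /= w.sum()
    overall = scores @ w                        # weighted score per alternative
    wins_a += overall[0] > overall[1]

print(f"Alternative A preferred in {wins_a / n_trials:.1%} of trials")
```

Rather than a single deterministic ranking, the down-selection is expressed as a probability that one alternative outranks another under the assumed weight dispersion.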

  4. IDENTIFYING OPERATIONAL REQUIREMENTS TO SELECT SUITABLE DECISION MODELS FOR A PUBLIC SECTOR EPROCUREMENT DECISION SUPPORT SYSTEM

    Directory of Open Access Journals (Sweden)

    Mohamed Adil

    2014-10-01

    Public sector procurement should be a transparent and fair process. Strict legal requirements are enforced on public sector procurement to make it a standardised process. To make fair decisions on selecting suppliers, a practical method which adheres to legal requirements is important. The research that is the base for this paper aimed at identifying a suitable Multi-Criteria Decision Analysis (MCDA) method for the specific legal and functional needs of the Maldivian public sector. To identify such operational requirements, a set of focus group interviews was conducted in the Maldives with public officials responsible for procurement decision making. Based on the operational requirements identified through the focus groups, a criteria-based evaluation of published MCDA methods is done to identify those suitable for e-procurement decision making. This paper describes the identification of the operational requirements and the results of the evaluation to select suitable decision models for the Maldivian context.

  5. Multidisciplinary Efforts Driving Translational Theranostics

    Science.gov (United States)

    Hu, Tony Y.

    2014-01-01

    This themed issue summarizes significant efforts aimed at using “biological language” to discern between “friends” and “foes” in the context of theranostics for true clinical application. It is expected that the success of theranostics depends on multidisciplinary efforts, combined to expedite our understanding of host responses to “customized” theranostic agents and formulating individualized therapies. PMID:25285169

  6. Learning Environment and Student Effort

    Science.gov (United States)

    Hopland, Arnt O.; Nyhus, Ole Henning

    2016-01-01

    Purpose: The purpose of this paper is to explore the relationship between satisfaction with learning environment and student effort, both in class and with homework assignments. Design/methodology/approach: The authors use data from a nationwide and compulsory survey to analyze the relationship between learning environment and student effort. The…

  7. Definition of common support equipment and space station interface requirements for IOC model technology experiments

    Science.gov (United States)

    Russell, Richard A.; Waiss, Richard D.

    1988-01-01

    A study was conducted to identify the common support equipment and Space Station interface requirements for the IOC (initial operating capabilities) model technology experiments. In particular, each principal investigator for the proposed model technology experiment was contacted and visited for technical understanding and support for the generation of the detailed technical backup data required for completion of this study. Based on the data generated, a strong case can be made for a dedicated technology experiment command and control work station consisting of a command keyboard, cathode ray tube, data processing and storage, and an alert/annunciator panel located in the pressurized laboratory.

  8. The effects of savings on reservation wages and search effort

    NARCIS (Netherlands)

    Lammers, M.

    2014-01-01

    This paper discusses the interrelations among wealth, reservation wages and search effort. A theoretical job search model predicts wealth to affect reservation wages positively, and search effort negatively. Subsequently, reduced form equations for reservation wages and search intensity take these

  9. Fractured rock modeling in the National Waste Terminal Storage Program: a review of requirements and status

    International Nuclear Information System (INIS)

    St John, C.; Krug, A.; Key, S.; Monsees, J.

    1983-05-01

    Generalized computer codes capable of forming the basis for numerical models of fractured rock masses are being used within the NWTS program. Little additional development of these codes is considered justifiable, except in the area of representation of discrete fractures. On the other hand, model preparation requires definition of medium-specific constitutive descriptions and site characteristics and is therefore legitimately conducted by each of the media-oriented projects within the National Waste Terminal Storage program. However, it is essential that a uniform approach to the role of numerical modeling be adopted, including agreement upon the contribution of modeling to the design and licensing process and the need for, and means of, model qualification for particular purposes. This report discusses the role of numerical modeling, reviews the capabilities of several computer codes that are being used to support design or performance assessment, and proposes a framework for future numerical modeling activities within the NWTS program

  10. Data requirements of GREAT-ER: Modelling and validation using LAS in four UK catchments

    International Nuclear Information System (INIS)

    Price, Oliver R.; Munday, Dawn K.; Whelan, Mick J.; Holt, Martin S.; Fox, Katharine K.; Morris, Gerard; Young, Andrew R.

    2009-01-01

    Higher-tier environmental risk assessments on 'down-the-drain' chemicals in river networks can be conducted using models such as GREAT-ER (Geography-referenced Regional Exposure Assessment Tool for European Rivers). It is important that these models are evaluated and that their sensitivities to input variables are understood. This study had two primary objectives: to evaluate GREAT-ER model performance, comparing modelled predictions for LAS (linear alkylbenzene sulphonate) with measured concentrations for four rivers in the UK, and to investigate model sensitivity to input variables. We demonstrate that the GREAT-ER model is very sensitive to variability in river discharges. However, it is insensitive to the form of the distributions used to describe chemical usage and removal rate in sewage treatment plants (STPs). It is concluded that more effort should be directed towards improving empirical estimates of effluent load and reducing uncertainty associated with usage and removal rates in STPs. Simulations could be improved by incorporating the effect of river depth on dissipation rates. - Validation of GREAT-ER.
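The strong sensitivity to river discharge reported here can be illustrated with a minimal steady-state dilution sketch. This is not the GREAT-ER implementation, and the usage, removal, and flow figures are invented for illustration:

```python
# Minimal down-the-drain dilution sketch: predicted concentration downstream
# of a single STP discharging into a river (all input values hypothetical).
def pec_ug_per_l(usage_g_per_day, stp_removal, river_flow_m3_per_s):
    """Predicted environmental concentration (ug/L) below one STP outfall."""
    load_ug_per_day = usage_g_per_day * 1e6 * (1.0 - stp_removal)
    flow_l_per_day = river_flow_m3_per_s * 1000.0 * 86_400.0
    return load_ug_per_day / flow_l_per_day

base = pec_ug_per_l(5000.0, 0.99, 2.0)           # nominal conditions
low_flow = pec_ug_per_l(5000.0, 0.99, 0.5)       # drought: quarter the flow
less_removal = pec_ug_per_l(5000.0, 0.98, 2.0)   # slightly lower STP removal

print(f"base: {base:.3f} ug/L, low flow: {low_flow:.3f}, "
      f"lower removal: {less_removal:.3f}")
```

With these numbers, quartering the river flow quadruples the predicted concentration, since concentration scales inversely with discharge; river flow typically varies over a far wider range than STP removal rates, which is consistent with the reported dominance of discharge variability.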

  11. Data requirements of GREAT-ER: Modelling and validation using LAS in four UK catchments

    Energy Technology Data Exchange (ETDEWEB)

    Price, Oliver R., E-mail: oliver.price@unilever.co [Safety and Environmental Assurance Centre, Unilever, Colworth Science Park, Sharnbrook, Bedfordshire MK44 1LQ (United Kingdom); Munday, Dawn K. [Safety and Environmental Assurance Centre, Unilever, Colworth Science Park, Sharnbrook, Bedfordshire MK44 1LQ (United Kingdom); Whelan, Mick J. [Department of Natural Resources, School of Applied Sciences, Cranfield University, College Road, Cranfield, Bedfordshire MK43 0AL (United Kingdom); Holt, Martin S. [ECETOC, Ave van Nieuwenhuyse 4, Box 6, B-1160 Brussels (Belgium); Fox, Katharine K. [85 Park Road West, Birkenhead, Merseyside CH43 8SQ (United Kingdom); Morris, Gerard [Environment Agency, Phoenix House, Global Avenue, Leeds LS11 8PG (United Kingdom); Young, Andrew R. [Wallingford HydroSolutions Ltd, Maclean building, Crowmarsh Gifford, Wallingford, Oxon OX10 8BB (United Kingdom)

    2009-10-15

    Higher-tier environmental risk assessments on 'down-the-drain' chemicals in river networks can be conducted using models such as GREAT-ER (Geography-referenced Regional Exposure Assessment Tool for European Rivers). It is important that these models are evaluated and that their sensitivities to input variables are understood. This study had two primary objectives: to evaluate GREAT-ER model performance, comparing modelled predictions for LAS (linear alkylbenzene sulphonate) with measured concentrations for four rivers in the UK, and to investigate model sensitivity to input variables. We demonstrate that the GREAT-ER model is very sensitive to variability in river discharges. However, it is insensitive to the form of the distributions used to describe chemical usage and removal rate in sewage treatment plants (STPs). It is concluded that more effort should be directed towards improving empirical estimates of effluent load and reducing uncertainty associated with usage and removal rates in STPs. Simulations could be improved by incorporating the effect of river depth on dissipation rates. - Validation of GREAT-ER.

  12. Modeling and Simulation for Exploring Human-Robot Team Interaction Requirements

    Energy Technology Data Exchange (ETDEWEB)

    Dudenhoeffer, Donald Dean; Bruemmer, David Jonathon; Davis, Midge Lee

    2001-12-01

    Small-sized and micro-robots will soon be available for deployment in large-scale forces. Consequently, the ability of a human operator to coordinate and interact with largescale robotic forces is of great interest. This paper describes the ways in which modeling and simulation have been used to explore new possibilities for human-robot interaction. The paper also discusses how these explorations have fed implementation of a unified set of command and control concepts for robotic force deployment. Modeling and simulation can play a major role in fielding robot teams in actual missions. While live testing is preferred, limitations in terms of technology, cost, and time often prohibit extensive experimentation with physical multi-robot systems. Simulation provides insight, focuses efforts, eliminates large areas of the possible solution space, and increases the quality of actual testing.

  13. Respiratory effort from the photoplethysmogram.

    Science.gov (United States)

    Addison, Paul S

    2017-03-01

    The potential for a simple, non-invasive measure of respiratory effort based on the pulse oximeter signal - the photoplethysmogram or 'pleth' - was investigated in a pilot study. Several parameters were developed based on a variety of manifestations of respiratory effort in the signal, including modulation changes in amplitude, baseline, frequency and pulse transit times, as well as distinct baseline signal shifts. Thirteen candidate parameters were investigated using data from healthy volunteers. Each volunteer underwent a series of controlled respiratory effort maneuvers at various set flow resistances and respiratory rates. Six oximeter probes were tested at various body sites. In all, over three thousand pleth-based effort-airway pressure (EP) curves were generated across the various airway constrictions, respiratory efforts, respiratory rates, subjects, probe sites, and the candidate parameters considered. Regression analysis was performed to determine the existence of positive monotonic relationships between the respiratory effort parameters and resulting airway pressures. Six of the candidate parameters investigated exhibited a distinct positive relationship (p < 0.05). These included one parameter computed using a pulse oximeter probe and an ECG (P2E-Effort), another using two pulse oximeter probes placed at different peripheral body sites (P2-Effort), and baseline shifts in heart rate (BL-HR-Effort). In conclusion, a clear monotonic relationship was found between several pleth-based parameters and imposed respiratory loadings at the mouth across a range of respiratory rates and flow constrictions. The results suggest that the pleth may provide a measure of changing upper airway dynamics indicative of the effort to breathe. Copyright © 2017 The Author. Published by Elsevier Ltd. All rights reserved.

  14. Modeling requirements for full-scope reactor simulators of fission-product transport during severe accidents

    International Nuclear Information System (INIS)

    Ellison, P.G.; Monson, P.R.; Mitchell, H.A.

    1990-01-01

    This paper describes the needs and requirements to properly and efficiently model fission product transport on full-scope reactor simulators. Current LWR simulators can be easily adapted to model severe accident phenomena and the transport of radionuclides. Once adapted, these simulators can be used as a training tool during operator training exercises for severe accident guidelines, for training on containment venting procedures, or as a training tool during site-wide emergency training exercises.

  15. System Design Description and Requirements for Modeling the Off-Gas Systems for Fuel Recycling Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Daryl R. Haefner; Jack D. Law; Troy J. Tranter

    2010-08-01

    This document provides descriptions of the off-gases evolved during spent nuclear fuel processing and the systems used to capture the gases of concern. Two reprocessing techniques are discussed, namely aqueous separations and electrochemical (pyrochemical) processing. The unit operations associated with each process are described in enough detail so that computer models to mimic their behavior can be developed. The document also lists the general requirements for the desired computer models.

  16. A funding model for health visiting: baseline requirements--part 1.

    Science.gov (United States)

    Cowley, Sarah

    2007-11-01

    A funding model proposed in two papers will outline the health visiting resource, including team skill mix, required to deliver the recommended approach of 'progressive universalism,' taking account of health inequalities, best evidence and impact on outcomes that might be anticipated. The model has been discussed as far as possible across the professional networks of both the Community Practitioners' and Health Visitors' Association (CPHVA) and United Kingdom Public Health Association (UKPHA), and is a consensus statement agreed by all who have participated.

  17. Defining climate modeling user needs: which data are actually required to support impact analysis and adaptation policy development?

    Science.gov (United States)

    Swart, R. J.; Pagé, C.

    2010-12-01

    Until recently, the policy applications of Earth System Models in general and climate models in particular were focusing mainly on the potential future changes in the global and regional climate and attribution of observed changes to anthropogenic activities. Is climate change real? And if so, why do we have to worry about it? Following the broad acceptance of the reality of the risks by the majority of governments, particularly after the publication of IPCC’s 4th Assessment Report, and the increasing number of observations of changes in ecological and socio-economic systems that are consistent with the observed climatic changes, governments, companies and other societal groups have started to evaluate their own vulnerability in more detail and to develop adaptation and mitigation strategies. After an early focus on the most vulnerable developing countries, recently, an increasing number of industrialized countries have embarked on the design of adaptation and mitigation plans, or on studies to evaluate the level of climate resilience of their development plans and projects. Which climate data are actually required to effectively support these activities? This paper reports on the efforts of the IS-ENES project, the infrastructure project of the European Network for Earth System Modeling, to address this question. How do we define user needs, and can the existing gap between the climate modeling and impact research communities be bridged in support of the ENES long-term strategy? In contrast to the climate modeling community, which has a relatively long history of collaboration facilitated by a relatively uniform subject matter, commonly agreed definitions of key terminology and some level of harmonization of methods, the climate change impacts research community is very diverse and fragmented, using a wide variety of data sources, methods and tools. An additional complicating factor is that researchers working on adaptation usually closely collaborate with non

  18. 3D Core Model for simulation of nuclear power plants: Simulation requirements, model features, and validation

    International Nuclear Information System (INIS)

    Zerbino, H.

    1999-01-01

    In 1994-1996, Thomson Training and Simulation (TT and S) carried out the D50 Project, which involved the design and construction of optimized replica simulators for one Dutch and three German Nuclear Power Plants. It was recognized early on that the faithful reproduction of the Siemens reactor control and protection systems would impose extremely stringent demands on the simulation models, particularly the Core physics and the RCS thermohydraulics. The quality of the models, and their thorough validation, were thus essential. The present paper describes the main features of the fully 3D Core model implemented by TT and S, and its extensive validation campaign, which was defined in extremely positive collaboration with the Customer and the Core Data suppliers. (author)

  19. Quantification of Dynamic Model Validation Metrics Using Uncertainty Propagation from Requirements

    Science.gov (United States)

    Brown, Andrew M.; Peck, Jeffrey A.; Stewart, Eric C.

    2018-01-01

    The Space Launch System, NASA's new large launch vehicle for long range space exploration, is presently in the final design and construction phases, with the first launch scheduled for 2019. A dynamic model of the system has been created and is critical for calculation of interface loads and natural frequencies and mode shapes for guidance, navigation, and control (GNC). Because of the program and schedule constraints, a single modal test of the SLS will be performed while bolted down to the Mobile Launch Pad just before the first launch. A Monte Carlo and optimization scheme will be performed to create thousands of possible models based on given dispersions in model properties and to determine which model best fits the natural frequencies and mode shapes from modal test. However, the question still remains as to whether this model is acceptable for the loads and GNC requirements. An uncertainty propagation and quantification (UP and UQ) technique to develop a quantitative set of validation metrics that is based on the flight requirements has therefore been developed and is discussed in this paper. There has been considerable research on UQ and UP and validation in the literature, but very little on propagating the uncertainties from requirements, so most validation metrics are "rules-of-thumb;" this research seeks to come up with more reason-based metrics. One of the main assumptions used to achieve this task is that the uncertainty in the modeling of the fixed boundary condition is accurate, so therefore that same uncertainty can be used in propagating the fixed-test configuration to the free-free actual configuration. The second main technique applied here is the usage of the limit-state formulation to quantify the final probabilistic parameters and to compare them with the requirements. These techniques are explored with a simple lumped spring-mass system and a simplified SLS model. When completed, it is anticipated that this requirements-based validation
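The dispersion-and-requirement check described in this record can be sketched for the simple lumped spring-mass case the paper mentions. All numbers below (stiffness, mass, dispersions, frequency band) are hypothetical illustrations, not SLS properties:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative single spring-mass mode (values hypothetical, not SLS data).
k_nom = 1.0e6   # stiffness, N/m
m_nom = 250.0   # mass, kg
n = 100_000

# Disperse stiffness and mass to represent modeling uncertainty.
k = k_nom * rng.normal(1.0, 0.10, n)
m = m_nom * rng.normal(1.0, 0.05, n)
f = np.sqrt(k / m) / (2.0 * np.pi)   # natural frequency, Hz

# Limit-state check: probability the dispersed frequency stays inside a
# hypothetical requirement band, rather than a rule-of-thumb margin.
f_lo, f_hi = 9.0, 11.0
p_ok = np.mean((f > f_lo) & (f < f_hi))
print(f"nominal f = {np.sqrt(k_nom / m_nom) / (2.0 * np.pi):.2f} Hz, "
      f"P(in band) = {p_ok:.3f}")
```

The limit-state idea is visible in the last step: the validation metric is a probability of satisfying the requirement, propagated from the assumed property dispersions, instead of a fixed frequency tolerance.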

  20. Geochemical modelling of CO2-water-rock interactions for carbon storage : data requirements and outputs

    International Nuclear Information System (INIS)

    Kirste, D.

    2008-01-01

    A geochemical model was used to predict the short-term and long-term behaviour of carbon dioxide (CO 2 ), formation water, and reservoir mineralogy at a carbon sequestration site. Data requirements for the geochemical model included detailed mineral petrography; formation water chemistry; thermodynamic and kinetic data for mineral phases; and rock and reservoir physical characteristics. The model was used to determine the types of outputs expected for potential CO 2 storage sites and natural analogues. Reaction path modelling was conducted to determine the total reactivity or CO 2 storage capability of the rock by applying static equilibrium and kinetic simulations. Potential product phases were identified using the modelling technique, which also enabled the identification of the chemical evolution of the system. Results of the modelling study demonstrated that changes in porosity and permeability over time should be considered during the site selection process.

  1. Projected irrigation requirements for upland crops using soil moisture model under climate change in South Korea

    Science.gov (United States)

    An increase in abnormal climate change patterns and unsustainable irrigation in uplands cause drought and affect agricultural water security, crop productivity, and price fluctuations. In this study, we developed a soil moisture model to project irrigation requirements (IR) for upland crops under cl...

  2. Non-formal techniques for requirements elicitation, modeling, and early assessment for services

    NARCIS (Netherlands)

    van der Veer, Gerrit C.; Vyas, Dhaval; Dittmar, A.; Forbig, P.

    2011-01-01

    Designing systems for multiple stakeholders requires frequent collaboration with multiple stakeholders from the start. In many cases at least some stakeholders lack a professional habit of formal modeling. We report observations from two case studies of stakeholder involvement in early design where

  3. An evaluation model for the definition of regulatory requirements on spent fuel pool cooling systems

    International Nuclear Information System (INIS)

    Izquierdo, J.M.

    1979-01-01

    A calculation model is presented for establishing regulatory requirements in the SFPCS System. The major design factors, regulatory and design limits and key parameters are discussed. A regulatory position for internal use is proposed. Finally, associated problems and experience are presented. (author)

  4. A Proposal to Elicit Usability Requirements within a Model-Driven Development Environment

    NARCIS (Netherlands)

    Isela Ormeno, Y; Panach, I; Condori-Fernandez, O.N.; Pastor, O.

    2014-01-01

    Nowadays there are sound Model-Driven Development (MDD) methods that deal with functional requirements, but in general, usability is not considered from the early stages of the development. Analysts that work with MDD implement usability features manually once the code has been generated. This

  5. Teaching Tip: Using Rapid Game Prototyping for Exploring Requirements Discovery and Modeling

    Science.gov (United States)

    Dalal, Nikunj

    2012-01-01

    We describe the use of rapid game prototyping as a pedagogic technique to experientially explore and learn requirements discovery, modeling, and specification in systems analysis and design courses. Students have a natural interest in gaming that transcends age, gender, and background. Rapid digital game creation is used to build computer games…

  6. INTEGRATED DATA CAPTURING REQUIREMENTS FOR 3D SEMANTIC MODELLING OF CULTURAL HERITAGE: THE INCEPTION PROTOCOL

    Directory of Open Access Journals (Sweden)

    R. Di Giulio

    2017-02-01

    In order to face these challenges and to start solving the issue of the large amount of captured data and time-consuming processes in the production of 3D digital models, an Optimized Data Acquisition Protocol (DAP) has been set up. The purpose is to guide the processes of digitization of cultural heritage, respecting the needs, requirements and specificities of cultural assets.

  7. Understanding the relationship between Kano model's customer satisfaction scores and self-stated requirements importance.

    Science.gov (United States)

    Mkpojiogu, Emmanuel O C; Hashim, Nor Laily

    2016-01-01

    Customer satisfaction is the result of product quality and viability. The perceived satisfaction of users/customers with a software product cannot be neglected, especially in today's competitive market environment, as it drives the loyalty of customers and promotes high profitability and return on investment. Therefore, understanding the importance of requirements as it is associated with the satisfaction of users/customers when their requirements are met is worth considering. It is necessary to know the relationship between customer satisfaction when requirements are met (or dissatisfaction when requirements are unmet) and the importance of such requirements. Many works have been carried out on customer satisfaction in connection with the importance of requirements, but the relationship between the customer satisfaction scores (coefficients) of the Kano model and users'/customers' self-stated requirements importance has not been sufficiently explored. In this study, an attempt is made to unravel the underlying relationship existing between the Kano model's customer satisfaction indexes and users'/customers' self-stated requirements importance. The results of the study indicate some interesting associations between these considered variables. These bivariate associations reveal that the customer satisfaction index (SI) and the average satisfaction coefficient (ASC), and likewise the customer dissatisfaction index (DI) and the average satisfaction coefficient (ASC), are highly correlated (r = 96%), and thus the ASC can be used in place of either the SI or the DI in representing customer satisfaction scores. Also, these Kano model customer satisfaction variables (SI, DI, and ASC) are each associated with self-stated requirements importance (IMP). Further analysis indicates that the value customers or users place on requirements that are met, or on features that are incorporated into a product, influences the level of satisfaction such customers derive from the product. The
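The kind of bivariate association reported in this record can be reproduced in miniature. The index values below are invented, and the ASC is assumed here to be the average of the satisfaction and dissatisfaction coefficient magnitudes, purely as an illustrative construction:

```python
import numpy as np

# Hypothetical Kano indices for six requirements (illustrative values only).
si = np.array([0.85, 0.60, 0.72, 0.40, 0.55, 0.90])        # satisfaction index
di = np.array([-0.80, -0.55, -0.70, -0.35, -0.50, -0.88])  # dissatisfaction index

# Assumed definition for illustration: ASC as the mean of the two magnitudes.
asc = (si + np.abs(di)) / 2.0

# Pearson correlation of each index with the combined coefficient.
r_si_asc = np.corrcoef(si, asc)[0, 1]
r_di_asc = np.corrcoef(np.abs(di), asc)[0, 1]
print(f"r(SI, ASC) = {r_si_asc:.2f}, r(|DI|, ASC) = {r_di_asc:.2f}")
```

When SI and |DI| track each other closely across requirements, their average correlates almost perfectly with either one, which is the situation in which a single combined coefficient can stand in for both.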

  8. The pharmacology of effort-related choice behavior: Dopamine, depression, and individual differences.

    Science.gov (United States)

    Salamone, John D; Correa, Merce; Yohn, Samantha; Lopez Cruz, Laura; San Miguel, Noemi; Alatorre, Luisa

    2016-06-01

    This review paper is focused upon the involvement of mesolimbic dopamine (DA) and related brain systems in effort-based processes. Interference with DA transmission affects instrumental behavior in a manner that interacts with the response requirements of the task, such that rats with impaired DA transmission show a heightened sensitivity to ratio requirements. Impaired DA transmission also affects effort-related choice behavior, which is assessed by tasks that offer a choice between a preferred reinforcer that has a high work requirement vs. a less preferred reinforcer that can be obtained with minimal effort. Rats and mice with impaired DA transmission reallocate instrumental behavior away from food-reinforced tasks with high response costs, and show increased selection of low reinforcement/low cost options. Tests of effort-related choice have been developed into models of pathological symptoms of motivation that are seen in disorders such as depression and schizophrenia. These models are being employed to explore the effects of conditions associated with various psychopathologies, and to assess drugs for their potential utility as treatments for effort-related symptoms. Studies of the pharmacology of effort-based choice may contribute to the development of treatments for symptoms such as psychomotor slowing, fatigue or anergia, which are seen in depression and other disorders. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Pandemic Influenza: Domestic Preparedness Efforts

    National Research Council Canada - National Science Library

    Lister, Sarah A

    2005-01-01

    .... Though influenza pandemics occur with some regularity, and the United States has been involved in specific planning efforts since the early 1990s, the H5N1 situation has created a sense of urgency...

  10. The dynamic system of parental work of care for children with special health care needs: A conceptual model to guide quality improvement efforts

    Directory of Open Access Journals (Sweden)

    Hexem Kari R

    2011-10-01

    Background: The work of care for parents of children with complex special health care needs may be increasing, while excessive work demands may erode the quality of care. We sought to summarize knowledge and develop a general conceptual model of the work of care. Methods: Systematic review of peer-reviewed journal articles that focused on parents of children with special health care needs and addressed factors related to the physical and emotional work of providing care for these children. From the large pool of eligible articles, we selected articles in a randomized sequence, using qualitative techniques to identify the conceptual components of the work of care and their relationship to the family system. Results: The work of care for a child with special health care needs occurs within a dynamic system that comprises 5 core components: (1) performance of tasks such as monitoring symptoms or administering treatments, (2) the occurrence of various events and the pursuit of valued outcomes regarding the child's physical health, the parent's mental health, or other attributes of the child or family, (3) operating with available resources and within certain constraints, (4) over the passage of time, (5) while mentally representing or depicting the ever-changing situation and detecting possible problems and opportunities. These components interact, some with simple cause-effect relationships and others with more complex interdependencies. Conclusions: The work of care affecting the health of children with special health care needs and their families can best be understood, studied, and managed as a multilevel complex system.

  11. Effort Estimation in BPMS Migration

    OpenAIRE

    Drews, Christopher; Lantow, Birger

    2018-01-01

    Usually Business Process Management Systems (BPMS) are highly integrated in the IT of organizations and are at the core of their business. Thus, migrating from one BPMS solution to another is not a common task. However, there are forces that are pushing organizations to perform this step, e.g. maintenance costs of legacy BPMS or the need for additional functionality. Before the actual migration, the risk and the effort must be evaluated. This work provides a framework for effort estimation re...

  12. Net benefits of wildfire prevention education efforts

    Science.gov (United States)

    Jeffrey P. Prestemon; David T. Butry; Karen L. Abt; Ronda Sutphen

    2010-01-01

    Wildfire prevention education efforts involve a variety of methods, including airing public service announcements, distributing brochures, and making presentations, which are intended to reduce the occurrence of certain kinds of wildfires. A Poisson model of preventable Florida wildfires from 2002 to 2007 by fire management region was developed. Controlling for...

  13. Oil Well Blowout 3D computational modeling: review of methodology and environmental requirements

    Directory of Open Access Journals (Sweden)

    Pedro Mello Paiva

    2016-12-01

Full Text Available This literature review aims to present the different methodologies used in the three-dimensional modeling of the dispersion of hydrocarbons originating from an oil well blowout. It presents the concepts of coastal environmental sensitivity and vulnerability, their importance for prioritizing the most vulnerable areas in case of contingency, and the relevant legislation. We also discuss some limitations of the methodology currently used in environmental studies of oil drift, which assumes a simplified spill at the surface, even in the well blowout scenario. Efforts to better understand the behavior of oil and gas in the water column and three-dimensional modeling of the trajectory gained strength after the Deepwater Horizon spill in 2010 in the Gulf of Mexico. The data collected and the observations made during the accident were widely used for adjustment of the models, incorporating various factors related to hydrodynamic forcing and weathering processes to which the hydrocarbons are subjected during subsurface leaks. The difficulties prove to be even more challenging in the case of blowouts in deep waters, where the uncertainties are still larger. The studies addressed different variables to make adjustments of oil and gas dispersion models along the upward trajectory. Factors that exert strong influences include: speed of the subsurface currents; gas separation from the main plume; hydrate formation; dissolution of oil and gas droplets; variations in droplet diameter; intrusion of the droplets at intermediate depths; biodegradation; and appropriate parametrization of the density, salinity and temperature profiles of the water column.

  14. How Many Model Evaluations Are Required To Predict The AEP Of A Wind Power Plant?

    DEFF Research Database (Denmark)

    Murcia Leon, Juan Pablo; Réthoré, Pierre-Elouan; Natarajan, Anand

    2015-01-01

(AEP) predictions expensive. The objective of the present paper is to minimize the number of model evaluations required to capture the wind power plant's AEP using stationary wind farm flow models. Polynomial chaos techniques are proposed based on arbitrary Weibull distributed wind speed and von Mises distributed wind direction. The correlation between wind direction and wind speed is captured by defining Weibull parameters as functions of wind direction. In order to evaluate the accuracy of these methods the expectation and variance of the wind farm power distributions are compared against the traditional binning method with trapezoidal and Simpson's integration rules. The wind farm flow model used in this study is the semi-empirical wake model developed by Larsen [1]. Three test cases are studied: a single turbine, a simple and a real offshore wind power plant. A reduced number of model...

  15. On meeting capital requirements with a chance-constrained optimization model.

    Science.gov (United States)

    Atta Mills, Ebenezer Fiifi Emire; Yu, Bo; Gu, Lanlan

    2016-01-01

This paper deals with a capital to risk asset ratio chance-constrained optimization model in the presence of loans, treasury bills, fixed assets and non-interest-earning assets. To model the dynamics of loans, we introduce a modified CreditMetrics approach. This leads to the development of a deterministic convex counterpart of the capital to risk asset ratio chance constraint. We analyze our model under the worst-case scenario, i.e., loan default. The theoretical model is analyzed by applying numerical procedures, in order to derive valuable insights from a financial outlook. Our results suggest that our capital to risk asset ratio chance-constrained optimization model guarantees that banks meet the capital requirements of Basel III with a likelihood of 95% irrespective of changes in the future market value of assets.
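The deterministic convex counterpart of a capital-to-risk-asset-ratio chance constraint can be illustrated with a minimal sketch. Assuming, purely for illustration, that the future capital value is normally distributed (the paper's modified CreditMetrics dynamics are richer), the constraint P(capital/RWA >= theta) >= 1 - eps reduces to a deterministic inequality via the Gaussian quantile. All figures below are hypothetical, not from the cited paper.

```python
from scipy.stats import norm

def meets_capital_requirement(mu, sigma, rwa, theta=0.105, eps=0.05):
    # Chance constraint P(V / rwa >= theta) >= 1 - eps with V ~ N(mu, sigma^2)
    # is equivalent to the deterministic inequality
    #   mu - z * sigma >= theta * rwa,  where z = norm.ppf(1 - eps).
    z = norm.ppf(1 - eps)
    return bool(mu - z * sigma >= theta * rwa)

# Example: expected capital of 13 against 100 risk-weighted assets, sd 1.2
print(meets_capital_requirement(mu=13.0, sigma=1.2, rwa=100.0))  # True
```

With a 95% confidence level (eps = 0.05) the quantile z is about 1.645, so the bank must hold roughly 1.645 standard deviations of buffer above the hard ratio; the 10.5% ratio here is an illustrative Basel III style figure.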

  16. Requirements for UML and OWL Integration Tool for User Data Consistency Modeling and Testing

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard; Oleshchuk, V. A.

    2003-01-01

The amount of data available on the Internet is continuously increasing, and consequently there is a growing need for tools that help to analyse the data. Testing of consistency among data received from different sources is made difficult by the number of different languages and schemas being used. In this paper we analyze requirements for a tool that supports integration of UML models and ontologies written in languages like the W3C Web Ontology Language (OWL). The tool can be used in the following way: after loading two legacy models into the tool, the tool user connects them by inserting modeling... An important part of this technique is the attaching of OCL expressions to special boolean class attributes that we call consistency attributes. The resulting integration model can be used for automatic consistency testing of two instances of the legacy models by automatically instantiating the whole integration...

  17. Knowledge representation requirements for model sharing between model-based reasoning and simulation in process flow domains

    Science.gov (United States)

    Throop, David R.

    1992-01-01

    The paper examines the requirements for the reuse of computational models employed in model-based reasoning (MBR) to support automated inference about mechanisms. Areas in which the theory of MBR is not yet completely adequate for using the information that simulations can yield are identified, and recent work in these areas is reviewed. It is argued that using MBR along with simulations forces the use of specific fault models. Fault models are used so that a particular fault can be instantiated into the model and run. This in turn implies that the component specification language needs to be capable of encoding any fault that might need to be sensed or diagnosed. It also means that the simulation code must anticipate all these faults at the component level.

  18. How Many Model Evaluations Are Required To Predict The AEP Of A Wind Power Plant?

    International Nuclear Information System (INIS)

    Murcia, J P; Réthoré, P E; Natarajan, A; Sørensen, J D

    2015-01-01

Wind farm flow models have advanced considerably with the use of large eddy simulations (LES) and Reynolds averaged Navier-Stokes (RANS) computations. The main limitation of these techniques is their high computational time requirements, which makes their use for wind farm annual energy production (AEP) predictions expensive. The objective of the present paper is to minimize the number of model evaluations required to capture the wind power plant's AEP using stationary wind farm flow models. Polynomial chaos techniques are proposed based on arbitrary Weibull distributed wind speed and von Mises distributed wind direction. The correlation between wind direction and wind speed is captured by defining Weibull parameters as functions of wind direction. In order to evaluate the accuracy of these methods the expectation and variance of the wind farm power distributions are compared against the traditional binning method with trapezoidal and Simpson's integration rules. The wind farm flow model used in this study is the semi-empirical wake model developed by Larsen [1]. Three test cases are studied: a single turbine, a simple and a real offshore wind power plant. A reduced number of model evaluations for a general wind power plant is proposed based on the convergence of the present method for each case. (paper)
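The "traditional binning" baseline against which the polynomial chaos approach is compared can be sketched as numerical integration of a power curve against the Weibull wind-speed density. The power curve, Weibull parameters, and bin resolution below are illustrative assumptions, not values from the paper, and wind-direction dependence is omitted for brevity.

```python
import numpy as np
from scipy.stats import weibull_min
from scipy.integrate import trapezoid, simpson

# Toy power curve in MW: cubic ramp between cut-in and rated speed,
# rated power up to cut-out (illustrative, not from the cited study).
def power_curve(u, rated=2.0, u_in=4.0, u_rated=12.0, u_out=25.0):
    p = np.where((u >= u_in) & (u < u_rated),
                 rated * ((u - u_in) / (u_rated - u_in)) ** 3, 0.0)
    return np.where((u >= u_rated) & (u <= u_out), rated, p)

u = np.linspace(0.0, 30.0, 301)               # wind-speed bins
f = weibull_min.pdf(u, c=2.0, scale=9.0)      # Weibull: shape k=2, scale A=9 m/s
integrand = power_curve(u) * f                # P(u) * f(u)

ep_trap = trapezoid(integrand, u)             # E[P] via trapezoidal rule
ep_simp = simpson(integrand, x=u)             # E[P] via Simpson's rule
aep_trap = ep_trap * 8760.0                   # MWh per year
print(round(ep_trap, 3), round(ep_simp, 3))
```

Every extra wind-direction sector multiplies the number of flow-model evaluations in this binning scheme, which is precisely the cost the paper's polynomial chaos expansion is designed to reduce.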

  19. A New Rapid Simplified Model for Urban Rainstorm Inundation with Low Data Requirements

    Directory of Open Access Journals (Sweden)

    Ji Shen

    2016-11-01

Full Text Available This paper proposes a new rapid simplified inundation model (NRSIM) for flood inundation caused by rainstorms in an urban setting that can simulate the urban rainstorm inundation extent and depth in a data-scarce area. Drainage basins delineated from a floodplain map according to the distribution of the inundation sources serve as the calculation cells of NRSIM. To reduce data requirements and computational costs of the model, the internal topography of each calculation cell is simplified to a circular cone, and a mass conservation equation based on a volume spreading algorithm is established to simulate the interior water filling process. Moreover, an improved D8 algorithm is outlined for the simulation of water spilling between different cells. The performance of NRSIM is evaluated by comparing the simulated results with those from a traditional rapid flood spreading model (TRFSM) for various resolutions of digital elevation model (DEM) data. The results are as follows: (1) given high-resolution DEM data input, the TRFSM model has better performance in terms of precision than NRSIM; (2) the results from TRFSM are seriously affected by the decrease in DEM data resolution, whereas those from NRSIM are not; and (3) NRSIM always requires less computational time than TRFSM. Apparently, compared with the complex hydrodynamic or traditional rapid flood spreading model, NRSIM has much better applicability and cost-efficiency in real-time urban inundation forecasting for data-sparse areas.
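The cone simplification and mass-conservation step can be sketched as follows. Assuming each cell's interior is an inverted circular cone whose water-surface radius grows linearly with depth (a hypothetical parameterization for illustration; the paper's exact geometry may differ), stored volume and water depth are related in closed form, so the interior filling process needs no hydrodynamic solve.

```python
import math

# Inverted-cone cell: at depth h the water surface radius is r = h / side_slope,
# so the stored volume is V = (pi / 3) * r^2 * h = (pi / 3) * h^3 / side_slope^2.
def depth_from_volume(volume_m3, side_slope=0.02):
    # Invert V(h) for h: h = (3 V s^2 / pi)^(1/3)
    return (3.0 * volume_m3 * side_slope ** 2 / math.pi) ** (1.0 / 3.0)

def volume_from_depth(h, side_slope=0.02):
    return math.pi / 3.0 * h ** 3 / side_slope ** 2

h = depth_from_volume(5000.0)  # depth after 5000 m^3 of runoff enters the cell
print(round(h, 3))
```

Once the depth implied by the stored volume exceeds a cell's rim elevation, the surplus volume would be routed to downslope neighbours, which is where the improved D8 spilling algorithm mentioned in the abstract comes in.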

  20. Maximum effort in the minimum-effort game

    Czech Academy of Sciences Publication Activity Database

    Engelmann, Dirk; Normann, H.-T.

    2010-01-01

    Roč. 13, č. 3 (2010), s. 249-259 ISSN 1386-4157 Institutional research plan: CEZ:AV0Z70850503 Keywords : minimum-effort game * coordination game * experiments * social capital Subject RIV: AH - Economics Impact factor: 1.868, year: 2010

  1. Patient-centered care requires a patient-oriented workflow model.

    Science.gov (United States)

    Ozkaynak, Mustafa; Brennan, Patricia Flatley; Hanauer, David A; Johnson, Sharon; Aarts, Jos; Zheng, Kai; Haque, Saira N

    2013-06-01

    Effective design of health information technology (HIT) for patient-centered care requires consideration of workflow from the patient's perspective, termed 'patient-oriented workflow.' This approach organizes the building blocks of work around the patients who are moving through the care system. Patient-oriented workflow complements the more familiar clinician-oriented workflow approaches, and offers several advantages, including the ability to capture simultaneous, cooperative work, which is essential in care delivery. Patient-oriented workflow models can also provide an understanding of healthcare work taking place in various formal and informal health settings in an integrated manner. We present two cases demonstrating the potential value of patient-oriented workflow models. Significant theoretical, methodological, and practical challenges must be met to ensure adoption of patient-oriented workflow models. Patient-oriented workflow models define meaningful system boundaries and can lead to HIT implementations that are more consistent with cooperative work and its emergent features.

  2. Dependencies among Architectural Views Got from Software Requirements Based on a Formal Model

    Directory of Open Access Journals (Sweden)

    Osis Janis

    2014-12-01

Full Text Available A system architect has software requirements and some unspecified knowledge about a problem domain (e.g., an enterprise) as source information for assessment and evaluation of possible solutions and for reaching the target point, a preliminary software design. The deciding factor is the architect's experience and expertise in the problem domain (“AS-IS”). The proposed approach is dedicated to assisting a system architect in making an appropriate decision on the solution (“TO-BE”). It is based on a formal mathematical model, the Topological Functioning Model (TFM). Compliant TFMs can be transformed into software architectural views. The paper demonstrates and discusses tracing dependency links from the requirements to and between the architectural views.

  3. Fraud Risk Modelling: Requirements Elicitation in the Case of Telecom Services

    DEFF Research Database (Denmark)

    Yesuf, Ahmed; Wolos, Lars Peter; Rannenberg, Kai

    2017-01-01

Telecom providers are losing tremendous amounts of money due to fraud risks posed to Telecom services and products. Currently, they mainly focus on fraud detection approaches to reduce the impact of fraud risks against their services. However, fraud prevention approaches should also be investigated in order to further reduce fraud risks and improve the revenue of Telecom providers. Fraud risk modelling is a fraud prevention approach that aims at identifying potential fraud risks, estimating the damage and setting up preventive mechanisms before the fraud risks lead to actual losses. In this paper, we highlight the important requirements for a usable and context-aware fraud risk modelling approach for Telecom services. To do so, we have conducted two workshops with experts from a Telecom provider and experts from multi-disciplinary areas. In order to show and document the requirements, we...

  4. ASME Code Efforts Supporting HTGRs

    Energy Technology Data Exchange (ETDEWEB)

    D.K. Morton

    2012-09-01

In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  5. ASME Code Efforts Supporting HTGRs

    Energy Technology Data Exchange (ETDEWEB)

    D.K. Morton

    2011-09-01

In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  6. Effort Estimation in BPMS Migration

    Directory of Open Access Journals (Sweden)

    Christopher Drews

    2018-04-01

Full Text Available Usually Business Process Management Systems (BPMS) are highly integrated in the IT of organizations and are at the core of their business. Thus, migrating from one BPMS solution to another is not a common task. However, there are forces that are pushing organizations to perform this step, e.g. maintenance costs of legacy BPMS or the need for additional functionality. Before the actual migration, the risk and the effort must be evaluated. This work provides a framework for effort estimation regarding the technical aspects of BPMS migration. The framework provides questions for BPMS comparison and an effort evaluation schema. The applicability of the framework is evaluated based on a simplified BPMS migration scenario.

  7. REQUIREMENTS FOR SYSTEMS DEVELOPMENT LIFE CYCLE MODELS FOR LARGE-SCALE DEFENSE SYSTEMS

    Directory of Open Access Journals (Sweden)

    Kadir Alpaslan DEMIR

    2015-10-01

Full Text Available Large-scale defense system projects are strategic for maintaining and increasing the national defense capability. Therefore, governments spend billions of dollars in the acquisition and development of large-scale defense systems. The scale of defense systems is always increasing and the costs to build them are skyrocketing. Today, defense systems are software intensive and they are either a system of systems or a part of it. Historically, the project performances observed in the development of these systems have been significantly poor when compared to other types of projects. It is obvious that the currently used systems development life cycle models are insufficient to address today's challenges of building these systems. Using a systems development life cycle model that is specifically designed for large-scale defense system developments and is effective in dealing with today's and near-future challenges will help to improve project performances. The first step in the development of a large-scale defense systems development life cycle model is the identification of requirements for such a model. This paper contributes to the body of literature in the field by providing a set of requirements for system development life cycle models for large-scale defense systems. Furthermore, a research agenda is proposed.

  8. Effort problem of chemical pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Okrajni, J.; Ciesla, M.; Mutwil, K. [Silesian Technical University, Katowice (Poland)

    1998-12-31

The paper addresses the assessment of the technical state of chemical pipelines working under mechanical and thermal loading. The effort of the pipelines after a long operating period has been analysed. The material, geometrical and loading conditions of the crack initiation and crack growth process in the chosen object are discussed. Areas of maximal effort have been determined. The changes in material structure after the long operating period are described. Mechanisms of crack initiation and crack growth in the pipeline elements have been analysed, and the mutual relations between chemical and mechanical influences are shown. (orig.) 16 refs.

  9. Dopamine antagonism decreases willingness to expend physical, but not cognitive, effort: a comparison of two rodent cost/benefit decision-making tasks.

    Science.gov (United States)

    Hosking, Jay G; Floresco, Stan B; Winstanley, Catharine A

    2015-03-01

    Successful decision making often requires weighing a given option's costs against its associated benefits, an ability that appears perturbed in virtually every severe mental illness. Animal models of such cost/benefit decision making overwhelmingly implicate mesolimbic dopamine in our willingness to exert effort for a larger reward. Until recently, however, animal models have invariably manipulated the degree of physical effort, whereas human studies of effort have primarily relied on cognitive costs. Dopamine's relationship to cognitive effort has not been directly examined, nor has the relationship between individuals' willingness to expend mental versus physical effort. It is therefore unclear whether willingness to work hard in one domain corresponds to willingness in the other. Here we utilize a rat cognitive effort task (rCET), wherein animals can choose to allocate greater visuospatial attention for a greater reward, and a previously established physical effort-discounting task (EDT) to examine dopaminergic and noradrenergic contributions to effort. The dopamine antagonists eticlopride and SCH23390 each decreased willingness to exert physical effort on the EDT; these drugs had no effect on willingness to exert mental effort for the rCET. Preference for the high effort option correlated across the two tasks, although this effect was transient. These results suggest that dopamine is only minimally involved in cost/benefit decision making with cognitive effort costs. The constructs of mental and physical effort may therefore comprise overlapping, but distinct, circuitry, and therapeutic interventions that prove efficacious in one effort domain may not be beneficial in another.

  10. Computer-Based Procedures for Field Workers in Nuclear Power Plants: Development of a Model of Procedure Usage and Identification of Requirements

    Energy Technology Data Exchange (ETDEWEB)

    Katya Le Blanc; Johanna Oxstrand

    2012-04-01

The nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned from these previous efforts we are now exploring a less familiar application for computer-based procedures - field procedures, i.e. procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field workers. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how to best design the computer-based procedures to do so. This paper describes the development of a Model of Procedure Use and the qualitative study on which the model is based. The study was conducted in collaboration with four nuclear utilities and five research institutes. During the qualitative study and the model development, requirements for computer-based procedures were identified.

  11. High resolution weather data for urban hydrological modelling and impact assessment, ICT requirements and future challenges

    Science.gov (United States)

    ten Veldhuis, Marie-claire; van Riemsdijk, Birna

    2013-04-01

Hydrological analysis of urban catchments requires high resolution rainfall and catchment information because of the small size of these catchments, high spatial variability of the urban fabric, fast runoff processes and related short response times. Rainfall information available from traditional radar and rain gauge networks does not meet the relevant scales of urban hydrology. A new type of weather radar, based on X-band frequency and equipped with Doppler and dual polarimetry capabilities, promises to provide more accurate rainfall estimates at the spatial and temporal scales that are required for urban hydrological analysis. Recently, the RAINGAIN project was started to analyse the applicability of this new type of radar in the context of urban hydrological modelling. In this project, meteorologists and hydrologists work closely together in several stages of urban hydrological analysis: from the acquisition procedure of novel and high-end radar products to data acquisition and processing, rainfall data retrieval, hydrological event analysis and forecasting. The project comprises four pilot locations with various characteristics of weather radar equipment, ground stations, urban hydrological systems, modelling approaches and requirements. Access to data processing and modelling software is handled in different ways in the pilots, depending on ownership and user context. Sharing of data and software among pilots and with the outside world is an ongoing topic of discussion. The availability of high resolution weather data augments requirements with respect to the resolution of hydrological models and input data. This has led to the development of fully distributed hydrological models, the implementation of which remains limited by the unavailability of hydrological input data. On the other hand, if models are to be used in flood forecasting, hydrological models need to be computationally efficient to enable fast responses to extreme event conditions.

  12. Experiments with data assimilation in comprehensive air quality models: Impacts on model predictions and observation requirements (Invited)

    Science.gov (United States)

    Mathur, R.

    2009-12-01

Emerging regional scale atmospheric simulation models must address the increasing complexity arising from new model applications that treat multi-pollutant interactions. Sophisticated air quality modeling systems are needed to develop effective abatement strategies that focus on simultaneously controlling multiple criteria pollutants, as well as for use in providing short-term air quality forecasts. In recent years the application of such models has continuously been extended to address atmospheric pollution phenomena from local to hemispheric spatial scales over time scales ranging from episodic to annual. The need to represent interactions between physical and chemical atmospheric processes occurring at these disparate spatial and temporal scales requires the use of observation data beyond traditional in-situ networks so that the model simulations can be reasonably constrained. Preliminary applications of assimilation of remote sensing and aloft observations within a comprehensive regional scale atmospheric chemistry-transport modeling system will be presented: (1) A methodology is developed to assimilate MODIS aerosol optical depths in the model to represent the impacts of long-range transport associated with the summer 2004 Alaskan fires on surface-level regional fine particulate matter (PM2.5) concentrations across the Eastern U.S. The episodic impact of this pollution transport event on PM2.5 concentrations over the eastern U.S. during mid-July 2004 is quantified through the complementary use of the model with remotely-sensed, aloft, and surface measurements; (2) Simple nudging experiments with limited aloft measurements are performed to identify uncertainties in model representations of physical processes and assess the potential use of such measurements in improving the predictive capability of atmospheric chemistry-transport models. The results from these early applications will be discussed in context of uncertainties in the model and in the remote sensing

  13. Price/Earnings Ratio Model through Dividend Yield and Required Yield Above Expected Inflation

    Directory of Open Access Journals (Sweden)

    Emil Mihalina

    2010-07-01

Full Text Available The price/earnings ratio is the most popular and most widespread valuation model used to assess relative capital asset value on financial markets. In functional terms, company earnings over the very long term can be described with high significance. Empirically, long-term statistics show that the demanded (required) yield on capital markets has a certain regularity. Thus, investors first require a yield above the stable inflation rate, and then a dividend yield and a capital increase caused by the growth of earnings that influence the price, under the assumption that the P/E ratio is stable. By combining the Gordon model for current dividend value with the model of market capitalization of earnings (the price/earnings ratio), and bearing in mind the influence of general price levels on company earnings, it is possible to adjust the price/earnings ratio by deriving a function of the required yield on capital markets, measured by a market index, through dividend yield and the inflation rate above the stable inflation rate, increased by profit growth. The S&P 500 index, for example, has in the last 100 years grown by exactly the inflation rate above the stable inflation rate increased by profit growth. The comparison of two series of price/earnings ratios, a modelled one and an average 7-year ratio, shows a notable correlation in the movement of the two series, with a three-year deviation. Therefore, it could be hypothesized that three years of the expected inflation level, dividend yield and profit growth rate of the market index are discounted in current market prices. The conclusion is that, at the present time, the relationship between the adjusted average price/earnings ratio and its effect on the market index on the one hand, and the modelled price/earnings ratio on the other, can clearly show the expected dynamics and course in the following period.
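The combination of the Gordon dividend model with earnings capitalization described above reduces to a one-line formula: with payout ratio b, required yield r, and growth g, the justified P/E is b / (r - g). A back-of-the-envelope sketch with made-up rates (the decomposition of the required yield follows the abstract's logic, but every number is illustrative):

```python
def gordon_pe(payout_ratio, required_yield, growth):
    # Gordon model: P0 = D1 / (r - g) with D1 = payout * E1,
    # hence P0 / E1 = payout / (r - g). Requires r > g.
    assert required_yield > growth, "Gordon model needs r > g"
    return payout_ratio / (required_yield - growth)

# Required yield decomposed as in the abstract: stable inflation, a margin
# above it, and dividend yield (all rates hypothetical).
stable_inflation = 0.02
margin_above_inflation = 0.01
dividend_yield = 0.03
growth = 0.03
required_yield = stable_inflation + margin_above_inflation + dividend_yield

pe = gordon_pe(payout_ratio=0.5, required_yield=required_yield, growth=growth)
print(round(pe, 1))  # 0.5 / (0.06 - 0.03) -> 16.7
```

The modelled ratio is then compared, as the abstract describes, against a smoothed historical P/E series to read off the expected market dynamics.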

  14. Modelo de requisitos para sistemas embebidos: Model of requirements for embedded systems

    Directory of Open Access Journals (Sweden)

    Liliana González Palacio

    2008-07-01

Full Text Available In this paper, a model of requirements for supporting the construction of embedded systems is presented. Currently, the requirements engineering methodologies proposed for this domain do not provide continuity in their development process, since they have a strong orientation toward the design stage and a weaker emphasis on the analysis stage. Furthermore, such methodologies offer guidelines for treating requirements after they have been obtained, but they do not propose tools, such as a model of requirements, for obtaining them. This work is part of a research project whose objective is to propose a requirements engineering (RE) methodology for the analysis of embedded systems (ES). The proposed model of requirements and its use are illustrated through an application case consisting of obtaining the requirements for a movement-sensing system, embedded in a home alarm system.

  15. A critical review of the data requirements for fluid flow models through fractured rock

    International Nuclear Information System (INIS)

    Priest, S.D.

    1986-01-01

    The report is a comprehensive critical review of the data requirements for ten models of fluid flow through fractured rock, developed in Europe and North America. The first part of the report contains a detailed review of rock discontinuities and how their important geometrical properties can be quantified. This is followed by a brief summary of the fundamental principles in the analysis of fluid flow through two-dimensional discontinuity networks and an explanation of a new approach to the incorporation of variability and uncertainty into geotechnical models. The report also contains a review of the geological and geotechnical properties of anhydrite and granite. Of the ten fluid flow models reviewed, only three offer a realistic fracture network model for which it is feasible to obtain the input data. Although some of the other models have some valuable or novel features, there is a tendency to concentrate on the simulation of contaminant transport processes, at the expense of providing a realistic fracture network model. Only two of the models reviewed, neither of them developed in Europe, have seriously addressed the problem of analysing fluid flow in three-dimensional networks. (author)

  16. Phase transitions in least-effort communications

    International Nuclear Information System (INIS)

    Prokopenko, Mikhail; Ay, Nihat; Obst, Oliver; Polani, Daniel

    2010-01-01

    We critically examine a model that attempts to explain the emergence of power laws (e.g., Zipf's law) in human language. The model is based on the principle of least effort in communications—specifically, the overall effort is balanced between the speaker effort and listener effort, with some trade-off. It has been shown that an information-theoretic interpretation of this principle is sufficiently rich to explain the emergence of Zipf's law in the vicinity of the transition between referentially useless systems (one signal for all referable objects) and indexical reference systems (one signal per object). The phase transition is defined in the space of communication accuracy (information content) expressed in terms of the trade-off parameter. Our study explicitly solves the continuous optimization problem, subsuming a recent, more specific result obtained within a discrete space. The obtained results contrast Zipf's law found by heuristic search (that attained only local minima) in the vicinity of the transition between referentially useless systems and indexical reference systems, with an inverse-factorial (sub-logarithmic) law found at the transition that corresponds to global minima. The inverse-factorial law is observed to be the most representative frequency distribution among optimal solutions
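The speaker-listener trade-off described above can be sketched with a brute-force search over tiny signal-object matrices. This is a toy illustration only: the cost Ω(λ) = λ·H(R|S) + (1−λ)·H(S) (hearer effort vs. speaker effort) and the 3×3 system size are assumptions made for the sketch, not the continuous optimization solved in the paper.

```python
import itertools
import math

def entropy(ps):
    """Shannon entropy (bits) of a probability vector, ignoring zeros."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

def cost(matrix, lam, n_obj):
    """Combined effort for a signal-object matrix.

    matrix[i][j] == 1 means signal i can refer to object j. Objects are
    equiprobable; the speaker chooses uniformly among the signals linked
    to the intended object. lam weighs hearer effort H(R|S), (1 - lam)
    weighs speaker effort H(S).
    """
    n_sig = len(matrix)
    joint = [[0.0] * n_obj for _ in range(n_sig)]   # p(s, r)
    for j in range(n_obj):
        linked = [i for i in range(n_sig) if matrix[i][j]]
        if not linked:                              # every object must be referable
            return None
        for i in linked:
            joint[i][j] = 1.0 / (n_obj * len(linked))
    p_s = [sum(row) for row in joint]
    h_s = entropy(p_s)                              # speaker effort H(S)
    h_sr = entropy([p for row in joint for p in row])
    h_r_given_s = h_sr - h_s                        # hearer effort H(R|S)
    return lam * h_r_given_s + (1 - lam) * h_s

def best_matrix(lam, n_sig=3, n_obj=3):
    """Exhaustively search all 2^(n_sig*n_obj) signal-object matrices."""
    best, best_c = None, float("inf")
    for bits in itertools.product([0, 1], repeat=n_sig * n_obj):
        m = [list(bits[i * n_obj:(i + 1) * n_obj]) for i in range(n_sig)]
        c = cost(m, lam, n_obj)
        if c is not None and c < best_c:
            best, best_c = m, c
    return best, best_c
```

At λ close to 0 the search returns a referentially useless system (one signal for everything, H(S) = 0); at λ close to 1 it returns an indexical one-to-one mapping (H(R|S) = 0), mirroring the two phases discussed above.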

  17. The Norwegian Noark Model Requirements for EDRMS in the context of open government and access to governmental information

    Directory of Open Access Journals (Sweden)

    Olav Hagen Sataslåtten

    2017-11-01

    Full Text Available This article analyses the relationship between the Norwegian Noark Standard and the concepts of Open Government and Freedom of Information. Noark is the Norwegian model requirements for Electronic Documents and Records Management Systems (EDRMS. It was introduced in 1984, making it not only the world’s first model requirement for EDRMS, but also, through the introduction of versions from Noark 1 to the present Noark 5, internationally the model requirement with the longest continuation of implementation.

  18. Expansion of the Kano model to identify relevant customer segments and functional requirements

    DEFF Research Database (Denmark)

    Atlason, Reynir Smari; Stefansson, Arnaldur Smari; Wietz, Miriam

    2017-01-01

The Kano model of customer satisfaction has been widely used to analyse perceived needs of customers. The model provides product developers valuable information about if, and then how much, a given functional requirement (FR) will impact customer satisfaction if implemented within a product, system or a service. A current limitation of the Kano model is that it does not allow developers to visualise which combined sets of FRs would provide the highest satisfaction between different customer segments. In this paper, a stepwise method to address this particular shortcoming is presented. First, a traditional Kano analysis is conducted for the different segments of interest. Second, for each FR, relationship functions are integrated between x=0 and x=1. Third, integrals are inserted into a combination matrix crossing segments and FRs, where FRs with the highest sum across the chosen segments…
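The three steps can be sketched numerically. The relationship functions below (attractive, one-dimensional, must-be) are hypothetical normalized forms chosen for illustration, as are the segment classifications; the paper's own curves would come from its Kano survey data.

```python
import numpy as np

# Hypothetical normalized Kano relationship functions f(x): x is the degree
# of fulfilment of a functional requirement (0..1), f(x) the satisfaction.
kano_curves = {
    "attractive":      lambda x: np.expm1(x) / np.expm1(1.0),            # pays off late
    "one-dimensional": lambda x: x,                                      # linear payoff
    "must-be":         lambda x: -np.expm1(-5.0 * x) / -np.expm1(-5.0),  # saturates early
}

# Hypothetical classification of three FRs for two customer segments.
segments = {
    "students":      {"FR1": "attractive", "FR2": "must-be",         "FR3": "one-dimensional"},
    "professionals": {"FR1": "must-be",    "FR2": "one-dimensional", "FR3": "attractive"},
}

x = np.linspace(0.0, 1.0, 1001)

def integral(kind):
    """Step 2: integrate the relationship function between x=0 and x=1."""
    y = kano_curves[kind](x)
    return float(np.sum((y[1:] + y[:-1]) * 0.5 * (x[1] - x[0])))  # trapezoid rule

# Step 3: combination matrix crossing segments and FRs, then sum per FR.
frs = ["FR1", "FR2", "FR3"]
matrix = np.array([[integral(cls[fr]) for fr in frs] for cls in segments.values()])
scores = matrix.sum(axis=0)
ranking = [frs[i] for i in np.argsort(scores)[::-1]]  # FRs by combined satisfaction
```

With these assumed curves, an FR that is "must-be" for one segment and "one-dimensional" for the other accumulates the largest integral sum, so it would be prioritised across both segments.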

  19. Evaluation of olive flowering at low latitude sites in Argentina using a chilling requirement model

    Energy Technology Data Exchange (ETDEWEB)

    Aybar, V.E.; Melo-Abreu, J.P. de; Searles, P.S.; Matias, A.G.; Del Rio, C.; Caballero, C. M.; Rousseaux, M.C.

    2015-07-01

    Olive production has expanded significantly from the Mediterranean Basin into the New World over the last two decades. In some cases, cultivars of European origin have been introduced at a large commercial scale with little previous evaluation of potential productivity. The objective of this study was to evaluate whether a temperature-driven simulation model developed in the Mediterranean Basin to predict normal flowering occurrence and flowering date using cultivar-specific thermal requirements was suitable for the low latitude areas of Northwest Argentina. The model was validated at eight sites over several years and a wide elevation range (350–1200 m above mean sea level) for three cultivars (‘Arbequina’, ‘Frantoio’, ‘Leccino’) with potentially different chilling requirements. In ‘Arbequina’, normal flowering was observed at almost all sites and in all years, while normal flowering events in ‘Frantoio’ and ‘Leccino’ were uncommon. The model successfully predicted if flowering would be normal in 92% and 83% of the cases in ‘Arbequina’ and ‘Frantoio’, respectively, but was somewhat less successful in ‘Leccino’ (61%). When flowering occurred, the predicted flowering date was within ± 7 days of the observed date in 71% of the cases. Overall, the model results indicate that cultivar-specific simulation models may be used as an approximate tool to predict whether individual cultivars will be successful in new growing areas. In Northwest Argentina, the model could be used to identify cultivars to replace ‘Frantoio’ and ‘Leccino’ and to simulate global warming scenarios. (Author)
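A minimal sketch of the chilling-requirement idea: accumulate chill units from hourly temperatures and flag normal flowering when a cultivar-specific threshold is reached. The effective temperature band and the per-cultivar requirements below are illustrative placeholders, not the calibrated parameters of the published model.

```python
# Toy chilling-requirement check. Thresholds and requirements are invented
# for illustration; the published model uses calibrated, cultivar-specific
# thermal parameters.
EFFECTIVE_BAND = (0.0, 7.2)   # degC band in which chilling accumulates (assumed)

CHILL_REQUIREMENT = {"Arbequina": 300, "Frantoio": 600, "Leccino": 550}  # chill units (assumed)

def chill_units(hourly_temps):
    """One chill unit per hour spent inside the effective band."""
    lo, hi = EFFECTIVE_BAND
    return sum(1 for t in hourly_temps if lo <= t <= hi)

def normal_flowering(hourly_temps, cultivar):
    """Predict normal flowering when the cultivar's requirement is met."""
    return chill_units(hourly_temps) >= CHILL_REQUIREMENT[cultivar]
```

Under these assumed numbers, a mild low-latitude winter can satisfy a low-chill cultivar such as 'Arbequina' while leaving a high-chill cultivar such as 'Frantoio' unsatisfied, which is the pattern the study reports.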

  20. Prospects and requirements for an operational modelling unit in flood crisis situations

    Directory of Open Access Journals (Sweden)

    Anders Katharina

    2016-01-01

Full Text Available Dike failure events pose severe flood crisis situations for areas in the hinterland of dikes. In recent decades the importance of being prepared for dike breaches has been increasingly recognized. However, the pre-assessment of inundation resulting from dike breaches is possible only on the basis of scenarios, which might not reflect the situation of a real event. This paper presents a setup and workflow that allow dike breach-induced inundation to be modelled operationally, i.e. when an event is imminent or occurring. A comprehensive system setup of an operational modelling unit has been developed and implemented in the frame of a federal project in Saxony-Anhalt, Germany. The modelling unit setup comprises a powerful flood modelling methodology and elaborated operational guidelines for crisis situations. Nevertheless, it is of fundamental importance that the modelling unit be instated prior to flood events as a permanent system. Moreover, the unit needs to be fully integrated into flood crisis management. If these crucial requirements are met, a modelling unit is capable of fundamentally supporting flood management with operational prognoses of adequate quality even in the limited timeframe of crisis situations.

  1. Effort levels of the partners in networked manufacturing

    Science.gov (United States)

    Chai, G. R.; Cai, Z.; Su, Y. N.; Zong, S. L.; Zhai, G. Y.; Jia, J. H.

    2017-08-01

Compared with the traditional manufacturing mode, can networked manufacturing improve the effort levels of the partners? What factors affect the effort level of the partners? How can the partners be encouraged to improve their effort levels? To answer these questions, we introduce a network effect coefficient to build an effort level model of the partners in networked manufacturing. The results show that (1) as the network effect in networked manufacturing increases, the actual effort level can exceed the ideal level of traditional manufacturing; (2) profit allocation based on the marginal contribution rate helps improve the effort levels of the partners in networked manufacturing; (3) partners in networked manufacturing who wish to receive a larger distribution ratio must make a higher effort, and enterprises with insufficient effort should be removed from the networked manufacturing arrangement.

  2. Minimization of required model runs in the Random Mixing approach to inverse groundwater flow and transport modeling

    Science.gov (United States)

    Hoerning, Sebastian; Bardossy, Andras; du Plessis, Jaco

    2017-04-01

    Most geostatistical inverse groundwater flow and transport modelling approaches utilize a numerical solver to minimize the discrepancy between observed and simulated hydraulic heads and/or hydraulic concentration values. The optimization procedure often requires many model runs, which for complex models lead to long run times. Random Mixing is a promising new geostatistical technique for inverse modelling. The method is an extension of the gradual deformation approach. It works by finding a field which preserves the covariance structure and maintains observed hydraulic conductivities. This field is perturbed by mixing it with new fields that fulfill the homogeneous conditions. This mixing is expressed as an optimization problem which aims to minimize the difference between the observed and simulated hydraulic heads and/or concentration values. To preserve the spatial structure, the mixing weights must lie on the unit hyper-sphere. We present a modification to the Random Mixing algorithm which significantly reduces the number of model runs required. The approach involves taking n equally spaced points on the unit circle as weights for mixing conditional random fields. Each of these mixtures provides a solution to the forward model at the conditioning locations. For each of the locations the solutions are then interpolated around the circle to provide solutions for additional mixing weights at very low computational cost. The interpolated solutions are used to search for a mixture which maximally reduces the objective function. This is in contrast to other approaches which evaluate the objective function for the n mixtures and then interpolate the obtained values. Keeping the mixture on the unit circle makes it easy to generate equidistant sampling points in the space; however, this means that only two fields are mixed at a time. Once the optimal mixture for two fields has been found, they are combined to form the input to the next iteration of the algorithm. This
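The unit-circle trick can be sketched with a linear toy forward model standing in for the groundwater solver (an assumption for brevity; the real forward model is a numerical flow and transport simulation). Note that the simulated solutions at the observation points, not the objective values, are interpolated around the circle:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the forward model: "heads" at 4 observation points are a
# linear map of a 50-cell conductivity field (the real solver is nonlinear).
A = rng.normal(size=(4, 50))
def forward(field):
    return A @ field

d_obs = forward(rng.normal(size=50))   # synthetic observations

z1 = rng.normal(size=50)               # two fields honouring the covariance model
z2 = rng.normal(size=50)

# 1) Run the forward model only at n equally spaced angles on the unit circle;
#    weights (cos t, sin t) keep the mixture on the unit hyper-sphere.
n = 8
coarse = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
runs = np.array([forward(np.cos(t) * z1 + np.sin(t) * z2) for t in coarse])

# 2) Interpolate the *solutions* to a fine set of angles (cheap), then
#    evaluate the objective on the interpolated solutions.
fine = np.linspace(0.0, 2 * np.pi, 2000, endpoint=False)
interp = np.array([
    np.interp(fine, coarse, runs[:, k], period=2 * np.pi) for k in range(runs.shape[1])
]).T
objective = np.sum((interp - d_obs) ** 2, axis=1)
t_best = fine[np.argmin(objective)]
z_best = np.cos(t_best) * z1 + np.sin(t_best) * z2   # input to the next iteration
```

Only n forward runs are spent per iteration, yet the objective is searched over 2000 candidate mixtures; the optimal mixture then becomes one of the two fields in the next iteration, as the abstract describes.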

  3. Modeling traceability information and functionality requirement in export-oriented tilapia chain.

    Science.gov (United States)

    Zhang, Xiaoshuan; Feng, Jianying; Xu, Mark; Hu, Jinyou

    2011-05-01

    Tilapia has been named as the 'food fish of the 21st century' and has become the most important farmed fish. China is the world leader in tilapia production and export. Identifying information and functional requirements is critical in developing an efficient traceability system because traceability has become a fundamental prerequisite for exporting aquaculture products. This paper examines the export-oriented tilapia chains and information flow in the chains, and identifies the key actors, information requirements and information-capturing points. Unified Modeling Language (UML) technology is adopted to describe the information and functionality requirement for chain traceability. The barriers of traceability system adoption are also identified. The results show that the traceability data consist of four categories that must be recorded by each link in the chain. The functionality requirement is classified into four categories from the fundamental information record to decisive quality control; the top three barriers to the traceability system adoption are: high costs of implementing the system, lack of experienced and professional staff; and low level of government involvement and support. Copyright © 2011 Society of Chemical Industry.

  4. Safety Culture: A Requirement for New Business Models — Lessons Learned from Other High Risk Industries

    International Nuclear Information System (INIS)

    Kecklund, L.

    2016-01-01

    Technical development and changes on global markets affects all high risk industries creating opportunities as well as risks related to the achievement of safety and business goals. Changes in legal and regulatory frameworks as well as in market demands create a need for major changes. Several high risk industries are facing a situation where they have to develop new business models. Within the transportation domain, e.g., aviation and railways, there is a growing concern related to how the new business models may affects safety issues. New business models in aviation and railways include extensive use of outsourcing and subcontractors to reduce costs resulting in, e.g., negative changes in working conditions, work hours, employment conditions and high turnover rates. The energy sector also faces pressures to create new business models for transition to renewable energy production to comply with new legal and regulatory requirements and to make best use of new reactor designs. In addition, large scale phase out and decommissioning of nuclear facilities have to be managed by the nuclear industry. Some negative effects of new business models have already arisen within the transportation domain, e.g., the negative effects of extensive outsourcing and subcontractor use. In the railway domain the infrastructure manager is required by European and national regulations to assure that all subcontractors are working according to the requirements in the infrastructure managers SMS (Safety Management System). More than ten levels of subcontracts can be working in a major infrastructure project making the system highly complex and thus difficult to control. In the aviation domain, tightly coupled interacting computer networks supplying airport services, as well as air traffic control, are managed and maintained by several different companies creating numerous interfaces which must be managed by the SMS. There are examples where a business model with several low

  5. Proper interpretation of dissolved nitrous oxide isotopes, production pathways, and emissions requires a modelling approach.

    Science.gov (United States)

    Thuss, Simon J; Venkiteswaran, Jason J; Schiff, Sherry L

    2014-01-01

Stable isotopes (δ15N and δ18O) of the greenhouse gas N2O provide information about the sources and processes leading to N2O production and emission from aquatic ecosystems to the atmosphere. In turn, this describes the fate of nitrogen in the aquatic environment since N2O is an obligate intermediate of denitrification and can be a by-product of nitrification. However, due to exchange with the atmosphere, the δ values at typical concentrations in aquatic ecosystems differ significantly from both the source of N2O and the N2O emitted to the atmosphere. A dynamic model, SIDNO, was developed to explore the relationship between the isotopic ratios of N2O, N2O source, and the emitted N2O. If the N2O production rate or isotopic ratios vary, then the N2O concentration and isotopic ratios may vary or be constant, not necessarily concomitantly, depending on the synchronicity of production rate and source isotopic ratios. Thus prima facie interpretation of patterns in dissolved N2O concentrations and isotopic ratios is difficult. The dynamic model may be used to correctly interpret diel field data and allows for the estimation of the gas exchange coefficient, N2O production rate, and the production-weighted δ values of the N2O source in aquatic ecosystems. Combining field data with these modelling efforts allows this critical piece of nitrogen cycling and N2O flux to the atmosphere to be assessed.
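The core point, that dissolved δ values sit between the source and the atmospheric equilibrium value, can be sketched with a minimal mass balance in the spirit of a model like SIDNO. The equations dC/dt = P + k(C_eq − C) and d(Cδ)/dt = P·δ_source + k(C_eq·δ_eq − C·δ) are a deliberate simplification (isotopic fractionation during gas exchange is ignored), and every parameter value is invented for illustration:

```python
# Minimal two-pool sketch: concentration C and delta-weighted concentration
# C*delta evolve under production P and air-water gas exchange k.
# Fractionation during gas exchange is ignored; values are illustrative only.
dt = 0.01                 # d, time step (forward Euler)
k = 5.0                   # 1/d, gas-exchange coefficient (assumed)
P = 2.0                   # nmol/L/d, N2O production rate (assumed)
c_eq, d_eq = 10.0, 6.0    # atmospheric-equilibrium concentration and delta (assumed)
d_src = -20.0             # production-weighted source delta, permil (assumed)

c, d = c_eq, d_eq         # start at atmospheric equilibrium
for _ in range(10000):    # integrate ~100 d, far past steady state
    cd = c * d
    dc = P + k * (c_eq - c)
    dcd = P * d_src + k * (c_eq * d_eq - cd)
    c += dc * dt
    cd += dcd * dt
    d = cd / c
```

At steady state the dissolved δ (5‰ with these numbers) matches neither the source (−20‰) nor the equilibrium value (6‰), which is why prima facie interpretation of dissolved isotope data fails and a dynamic model is needed.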

  6. Summary of process research analysis efforts

    Science.gov (United States)

    Burger, D. R.

    1985-01-01

A summary of solar-cell process research analysis efforts is presented. Process design and cell design are interactive efforts in which technology from integrated-circuit processes and other processes is blended. The primary factors that control cell efficiency are: (1) the bulk parameters of the available sheet material, (2) the retention and enhancement of these bulk parameters, and (3) the cell design and the cost to produce versus the finished cell's performance. The process sequences need to be tailored to be compatible with the sheet form, the cell shape, and the processing equipment. New process options that require further evaluation and utilization are lasers, robotics, thermal-pulse techniques, and new materials. There are numerous process control techniques that can be adapted and used to improve product uniformity and reduce costs. Two factors that can lead to longer-life modules are the use of solar cell diffusion barriers and improved encapsulation.

  7. Model of an aquaponic system for minimised water, energy and nitrogen requirements.

    Science.gov (United States)

    Reyes Lastiri, D; Slinkert, T; Cappon, H J; Baganz, D; Staaks, G; Keesman, K J

    2016-01-01

Water and nutrient savings can be established by coupling water streams between interacting processes. Wastewater from production processes contains nutrients like nitrogen (N), which can and should be recycled in order to meet future regulatory discharge demands. Optimisation of interacting water systems is a complex task. An effective way of understanding, analysing and optimising such systems is by applying mathematical models. The present modelling work aims at supporting the design of a nearly emission-free aquaculture and hydroponic system (aquaponics), thus contributing to sustainable production and to food security for the 21st century. Based on the model, a system that couples 40 m³ fish tanks and a hydroponic system of 1,000 m² can produce 5 tons of tilapia and 75 tons of tomato yearly. The system requires energy to condense and recover evaporated water, for lighting and heating, adding up to 1.3 GJ/m² every year. In the suggested configuration, the fish can provide about 26% of the N required in a plant cycle. A coupling strategy that sends water from the fish to the plants in amounts proportional to the fish feed input reduces the standard deviation of the NO₃⁻ level in the fish cycle by 35%.
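The roughly 26% nitrogen coverage quoted above is the outcome of a mass balance. A back-of-envelope version of such a budget is sketched below; every coefficient is an invented placeholder (the paper's figure follows from its own calibrated fish-growth and plant-uptake submodels), chosen here only so the arithmetic lands near the reported value:

```python
# Back-of-envelope nitrogen coupling between fish and plant cycles.
# All coefficients are illustrative assumptions, not the paper's values.
feed_per_day_kg = 60.0            # fish feed input (assumed)
n_frac_feed = 0.055               # N mass fraction of the feed (assumed)
excreted_dissolved = 0.40         # fraction of feed N excreted as dissolved N (assumed)

plant_n_demand_kg_per_day = 5.0   # N demand of the hydroponic cycle (assumed)

n_from_fish = feed_per_day_kg * n_frac_feed * excreted_dissolved   # kg N/d
coverage = n_from_fish / plant_n_demand_kg_per_day
print(f"Fish supply {coverage:.0%} of plant N demand")
```

The remaining N demand must be met by mineral fertiliser dosing in the plant cycle, which is why the coupling is only "nearly" emission-free on the nutrient side.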

  8. Classical Causal Models for Bell and Kochen-Specker Inequality Violations Require Fine-Tuning

    Directory of Open Access Journals (Sweden)

    Eric G. Cavalcanti

    2018-04-01

    Full Text Available Nonlocality and contextuality are at the root of conceptual puzzles in quantum mechanics, and they are key resources for quantum advantage in information-processing tasks. Bell nonlocality is best understood as the incompatibility between quantum correlations and the classical theory of causality, applied to relativistic causal structure. Contextuality, on the other hand, is on a more controversial foundation. In this work, I provide a common conceptual ground between nonlocality and contextuality as violations of classical causality. First, I show that Bell inequalities can be derived solely from the assumptions of no signaling and no fine-tuning of the causal model. This removes two extra assumptions from a recent result from Wood and Spekkens and, remarkably, does not require any assumption related to independence of measurement settings—unlike all other derivations of Bell inequalities. I then introduce a formalism to represent contextuality scenarios within causal models and show that all classical causal models for violations of a Kochen-Specker inequality require fine-tuning. Thus, the quantum violation of classical causality goes beyond the case of spacelike-separated systems and already manifests in scenarios involving single systems.
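The classical bound that such causal models must respect can be checked directly for the CHSH scenario: enumerating every deterministic local strategy shows that the classical maximum is 2, while quantum mechanics reaches 2√2 ≈ 2.83. This standard enumeration is illustrative background for the incompatibility the abstract discusses, not the paper's fine-tuning formalism:

```python
import itertools

# Deterministic local strategies: Alice outputs a(x), Bob outputs b(y),
# each in {-1, +1}, for settings x, y in {0, 1}.
# CHSH value: S = E(0,0) + E(0,1) + E(1,0) - E(1,1), with E(x,y) = a_x * b_y.
best = max(
    a0 * b0 + a0 * b1 + a1 * b0 - a1 * b1
    for a0, a1, b0, b1 in itertools.product([-1, 1], repeat=4)
)
print(best)   # 2: the bound obeyed by all deterministic local (classical causal) models
```

Since any local hidden-variable model is a mixture of these 16 deterministic strategies, no classical causal model without fine-tuning can exceed S = 2, whereas quantum correlations violate this bound.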

  9. Minimum requirements for predictive pore-network modeling of solute transport in micromodels

    Science.gov (United States)

    Mehmani, Yashar; Tchelepi, Hamdi A.

    2017-10-01

    Pore-scale models are now an integral part of analyzing fluid dynamics in porous materials (e.g., rocks, soils, fuel cells). Pore network models (PNM) are particularly attractive due to their computational efficiency. However, quantitative predictions with PNM have not always been successful. We focus on single-phase transport of a passive tracer under advection-dominated regimes and compare PNM with high-fidelity direct numerical simulations (DNS) for a range of micromodel heterogeneities. We identify the minimum requirements for predictive PNM of transport. They are: (a) flow-based network extraction, i.e., discretizing the pore space based on the underlying velocity field, (b) a Lagrangian (particle tracking) simulation framework, and (c) accurate transfer of particles from one pore throat to the next. We develop novel network extraction and particle tracking PNM methods that meet these requirements. Moreover, we show that certain established PNM practices in the literature can result in first-order errors in modeling advection-dominated transport. They include: all Eulerian PNMs, networks extracted based on geometric metrics only, and flux-based nodal transfer probabilities. Preliminary results for a 3D sphere pack are also presented. The simulation inputs for this work are made public to serve as a benchmark for the research community.
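Requirement (b), a Lagrangian particle-tracking framework, can be sketched on a toy network. The transfer rule below is the simple flux-weighted choice, which the abstract identifies as a potential source of first-order error in advection-dominated transport; it is shown here only as the baseline that the proposed methods improve upon. The network topology, flow rates and travel times are invented:

```python
import random

random.seed(1)

# Tiny pore network: node -> list of (downstream node, throat flow rate, travel time).
network = {
    "inlet":  [("p1", 3.0, 1.0), ("p2", 1.0, 2.5)],
    "p1":     [("outlet", 3.0, 1.2)],
    "p2":     [("outlet", 1.0, 3.0)],
    "outlet": [],
}

def track(node="inlet"):
    """Transit time of one particle using flux-weighted transfer probabilities
    (the baseline rule; the paper argues for more accurate throat-to-throat
    transfer in advection-dominated regimes)."""
    t = 0.0
    while network[node]:
        nodes, rates, times = zip(*network[node])
        i = random.choices(range(len(nodes)), weights=rates)[0]
        t += times[i]
        node = nodes[i]
    return t

arrivals = sorted(track() for _ in range(2000))   # breakthrough-curve samples
```

With these numbers, three quarters of the particles take the fast path (transit time 2.2) and one quarter the slow path (5.5), giving a two-step breakthrough curve.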

  10. Assessing Requirements Quality through Requirements Coverage

    Science.gov (United States)

    Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt

    2008-01-01

In model-based development, the development effort is centered around a formal description of the proposed software system: the model. This model is derived from some high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. This shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, i.e., determining that the model accurately captures the customer's high-level requirements, has received little attention, and the sufficiency of the validation activities has been largely determined through ad-hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software

  11. Estimating required information size by quantifying diversity in random-effects model meta-analyses

    DEFF Research Database (Denmark)

    Wetterslev, Jørn; Thorlund, Kristian; Brok, Jesper

    2009-01-01

an intervention effect suggested by trials with low risk of bias. METHODS: Information size calculations need to consider the total model variance in a meta-analysis to control type I and type II errors. Here, we derive an adjusting factor for the required information size under any random-effects model meta-analysis. RESULTS: We devise a measure of diversity (D2) in a meta-analysis, which is the relative variance reduction when the meta-analysis model is changed from a random-effects into a fixed-effect model. D2 is the percentage that the between-trial variability constitutes of the sum of the between… and interpreted using several simulations and clinical examples. In addition, we show mathematically that diversity is equal to or greater than inconsistency, that is D2 ≥ I2, for all meta-analyses. CONCLUSION: We conclude that D2 seems a better alternative than I2 to consider model variation in any random…
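The definition of D2 as a relative variance reduction translates directly into code. The sketch below uses the standard DerSimonian-Laird estimate of the between-trial variance τ² (an assumption about the estimator; the paper derives D2 generally):

```python
import math

def meta_diversity(effects, ses):
    """DerSimonian-Laird random-effects meta-analysis, returning (D2, I2).

    D2: relative variance reduction of the pooled estimate when moving from
    the random-effects model to the fixed-effect model.
    I2: the usual Q-based inconsistency.
    """
    w = [1.0 / se**2 for se in ses]                 # fixed-effect weights
    sw = sum(w)
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sw
    q = sum(wi * (e - pooled)**2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sw - sum(wi**2 for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)                   # DL between-trial variance
    v_fixed = 1.0 / sw                              # var. of fixed-effect estimate
    v_random = 1.0 / sum(1.0 / (se**2 + tau2) for se in ses)  # var. of RE estimate
    d2 = (v_random - v_fixed) / v_random
    i2 = max(0.0, (q - df) / q) if q > 0 else 0.0
    return d2, i2
```

On heterogeneous data the function returns D2 ≥ I2, consistent with the inequality stated in the abstract; the required information size is then inflated by 1/(1 − D2) rather than 1/(1 − I2).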

  12. Modelling of radon control and air cleaning requirements in underground uranium mines

    International Nuclear Information System (INIS)

    El Fawal, M.; Gadalla, A.

    2014-01-01

As part of a comprehensive study on controlling workplace short-lived radon daughter concentrations in underground uranium mines to safe levels, a computer program has been developed and verified to calculate ventilation parameters, e.g. local pressures, flow rates and radon daughter concentration levels. The program is composed of two parts: one for mine ventilation and the other for radon daughter level calculations. The program has been validated in an actual case study to calculate the radon concentration levels, pressures and flow rates required to maintain acceptable radon concentrations at each point of the mine. The required fan static pressure and the approximate energy consumption were also estimated. The results of the calculations have been evaluated and compared with a similar investigation. It was found that the calculated values are in good agreement with the corresponding values obtained using the 'REDES' standard ventilation modelling software. The developed computer model can be used as a tool in the evaluation of ventilation systems proposed by the mining authority, assisting the uranium mining industry in maintaining the health and safety of underground workers while efficiently achieving economic production targets. It could also be used for regulatory inspection and radiation protection assessments of workers in underground mining, and for the effective design, assessment and management of underground mine ventilation systems. With this model, values of radon decay product concentrations in working level units, pressure drops, flow rates required to reach acceptable radon concentrations relative to the recommended levels at different extraction points in the mine, and the fan static pressure can all be estimated; these are not available from other software. (author)
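The basic sizing step behind such a program is a steady-state dilution balance: fresh air must carry radon away at the rate it emanates, and fan power follows from flow and static pressure. The calculation below is a generic sketch with invented numbers, not output from the program described above:

```python
# Steady-state dilution sizing for mine ventilation (illustrative values only).
emission = 2.0e5      # Bq/s, radon emanation into the workings (assumed)
c_target = 1000.0     # Bq/m3, target radon concentration in the workings (assumed)
c_intake = 20.0       # Bq/m3, radon already present in the intake air (assumed)

# Mass balance at steady state: emission = q * (c_target - c_intake)
q = emission / (c_target - c_intake)   # m3/s of fresh air required

dp = 1500.0                            # Pa, fan static pressure (assumed)
eta = 0.65                             # combined fan/motor efficiency (assumed)
power = q * dp / eta                   # W, approximate fan power draw
```

With these numbers the mine needs roughly 204 m³/s of fresh air and about 470 kW of fan power, showing how directly the radon limit drives ventilation energy consumption.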

  13. Meeting Human Reliability Requirements through Human Factors Design, Testing, and Modeling

    Energy Technology Data Exchange (ETDEWEB)

    R. L. Boring

    2007-06-01

    In the design of novel systems, it is important for the human factors engineer to work in parallel with the human reliability analyst to arrive at the safest achievable design that meets design team safety goals and certification or regulatory requirements. This paper introduces the System Development Safety Triptych, a checklist of considerations for the interplay of human factors and human reliability through design, testing, and modeling in product development. This paper also explores three phases of safe system development, corresponding to the conception, design, and implementation of a system.

  14. Verification of voltage/ frequency requirement for emergency diesel generator in nuclear power plant using dynamic modeling

    International Nuclear Information System (INIS)

    Hur, J.S.; Roh, M.S.

    2013-01-01

Full-text: One major cause of plant shutdown is the loss of electrical power. This study aims to understand the coping actions against station blackout, including the emergency diesel generator and the sequential loading of safety systems, and to ensure that the emergency diesel generator meets its requirements, especially the voltage and frequency criteria, using a modeling tool. This paper also considers changes to the sequencing time and load capacity, but only for finding the electrical design margin; any revision of the load list must be verified by safety analysis. From this study, it was found that a new load calculation is a key factor in EDG localization and in increasing in-house capability. (author)

  15. Model-independent requirements to the source of positrons in the galactic centre

    International Nuclear Information System (INIS)

    Aharonyan, F.A.

    1986-01-01

The main requirements on a positron source in the galactic centre, following from the observational data over a wide range of electromagnetic wavelengths, are formulated. The most probable mechanism providing a positron production efficiency of 10% is pair production in photon-photon collisions. This mechanism can be realized (a) in a thermal e⁺e⁻ pair-dominated weakly relativistic plasma and (b) during the development of a nonthermal electromagnetic cascade initiated by relativistic particles in the field of X-rays. Gamma-astronomical observations in the region Eγ ≥ 10¹¹ eV can be crucial for the choice of the model

  16. Mathematically modelling the power requirement for a vertical shaft mowing machine

    Directory of Open Access Journals (Sweden)

    Jorge Simón Pérez de Corcho Fuentes

    2008-09-01

Full Text Available This work describes a mathematical model for determining the power demand of a vertical shaft mowing machine, particularly taking into account the influence of speed on cutting power, which differs from that of other mower models. The influence of the machine's rotation and translation speeds on power demand was simulated. The results showed that no changes in cutting power were produced by varying the knives' angular speed (if translation speed was constant), while cutting power increased as translation speed increased. Variations in angular speed, however, influenced other parameters determining total power demand. Determining this vertical shaft mower's cutting pattern led to good crop stubble quality at the mower's lower rotation speed, hence reducing total energy requirements.
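A hypothetical power decomposition reproduces the reported behaviour: if cutting power scales with the area processed per unit time (cutting width × forward speed), it is independent of the rotor's angular speed at constant translation speed, while idle losses still grow with angular speed. All coefficients below are invented, not the paper's fitted model:

```python
# Hypothetical power decomposition for a vertical-shaft mower.
# Coefficients are illustrative placeholders, not fitted values.
def power_demand(v, omega, width=1.6, e_cut=900.0, k_idle=0.8, f_roll=400.0):
    """Total power (W) for forward speed v (m/s) and rotor speed omega (rad/s)."""
    p_cut = e_cut * width * v       # cutting: specific energy * area rate (no omega!)
    p_idle = k_idle * omega**2      # rotor friction and air drag
    p_trans = f_roll * v            # rolling resistance of the machine
    return p_cut + p_idle + p_trans
```

Doubling ω raises only the idle term, while doubling v raises the cutting and translation terms proportionally, matching the simulation result quoted in the abstract.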

  17. Required spatial resolution of hydrological models to evaluate urban flood resilience measures

    Science.gov (United States)

    Gires, A.; Giangola-Murzyn, A.; Tchiguirinskaia, I.; Schertzer, D.; Lovejoy, S.

    2012-04-01

During a flood in an urban area, several non-linear processes (rainfall, surface runoff, sewer flow, and sub-surface flow) interact. Fully distributed hydrological models are a useful tool to better understand these complex interactions between natural processes and the man-built environment. Developing an efficient model is a first step towards improving the understanding of flood resilience in urban areas. Given that the previously mentioned underlying physical phenomena exhibit different relevant scales, determining the required spatial resolution of such a model is a tricky but necessary issue. For instance, such a model should be able to properly represent large-scale effects of local-scale flood resilience measures such as stop logs. The model should also be as simple as possible without being simplistic. In this paper we test two types of model. First, we use an operational semi-distributed model over a 3400 ha peri-urban area located in Seine-Saint-Denis (North-East of Paris). In this model, the area is divided into sub-catchments of average size 17 ha that are considered homogeneous, and only the sewer discharge is modelled. The rainfall data, whose resolution is 1 km in space and 5 min in time, come from the C-band radar of Trappes, located in the West of Paris, and operated by Météo-France. It was shown that the spatial resolution of both the model and the rainfall field did not make it possible to fully grasp the small-scale rainfall variability. To achieve this, first an ensemble of realistic rainfall fields downscaled to a resolution of 100 m is generated with the help of multifractal space-time cascades whose characteristic exponents are estimated on the available radar data. Second, the corresponding ensemble of sewer hydrographs is simulated by inputting each rainfall realization to the model. It appears that the probability distribution of the simulated peak flow exhibits a power-law behaviour. This indicates that there is a great uncertainty associated with small scale

  18. Model-based Assessment for Balancing Privacy Requirements and Operational Capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Knirsch, Fabian [Salzburg Univ. (Austria); Engel, Dominik [Salzburg Univ. (Austria); Frincu, Marc [Univ. of Southern California, Los Angeles, CA (United States); Prasanna, Viktor [Univ. of Southern California, Los Angeles, CA (United States)

    2015-02-17

    The smart grid changes the way energy is produced and distributed. In addition, both energy and information are exchanged bidirectionally among participating parties. Heterogeneous systems therefore have to cooperate effectively in order to achieve a common high-level use case, such as smart metering for billing or demand response for load curtailment. Furthermore, a substantial amount of personal data is often needed to achieve that goal. Capturing and processing personal data in the smart grid increases customer concerns about privacy; in addition, certain statutory and operational requirements regarding privacy-aware data processing and storage have to be met. An increase in privacy constraints, however, often limits the operational capabilities of the system. In this paper, we present an approach that automates the process of finding an optimal balance between privacy requirements and operational requirements in a smart grid use case and application scenario. This is achieved by formally describing use cases in an abstract model and by finding an algorithm that determines the optimum balance by forward mapping privacy and operational impacts. For this optimal balancing algorithm, both a numeric approximation and, where feasible, an analytic assessment are presented and investigated. The system is evaluated by applying the tool to a real-world use case from the University of Southern California (USC) microgrid.
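    A numeric approximation of this kind can be sketched as a simple grid search over a normalized privacy level; the `privacy` and `operational` score functions below are illustrative stand-ins for the paper's forward-mapped impacts, not its actual model:

```python
def balance(privacy, operational, n=1000):
    """Grid-search the privacy level x in [0, 1] that maximizes the
    smaller of two normalized utility scores -- a generic numeric
    stand-in for an optimal-balancing step between privacy and
    operational requirements."""
    best_x, best_v = 0.0, float("-inf")
    for i in range(n + 1):
        x = i / n
        v = min(privacy(x), operational(x))
        if v > best_v:
            best_x, best_v = x, v
    return best_x, best_v
```

    With a rising privacy score and a falling operational score, the optimum sits where the two curves cross.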

  19. [In-depth interviews and the Kano model to determine user requirements in a burns unit].

    Science.gov (United States)

    González-Revaldería, J; Holguín-Holgado, P; Lumbreras-Marín, E; Núñez-López, G

    To determine the healthcare requirements of patients in a Burns Unit, using qualitative techniques, such as in-depth personal interviews and Kano's methodology. Qualitative methodology using in-depth personal interviews (12 patients), Kano's conceptual model, and the SERVQHOS questionnaire (24 patients). All patients had been hospitalised in the last 12 months in the Burns Unit. Using Kano's methodology, service attributes were grouped by affinity diagrams, and classified as follows: must-be, attractive (unexpected, great satisfaction), and one-dimensional (linked to the degree of functionality of the service). The outcomes were compared with those obtained with the SERVQHOS questionnaire. From the analysis of in-depth interviews, 11 requirements were obtained, referring to hotel aspects, information, the need for a closer staff relationship, and organisational aspects. The attributes classified as must-be were free television and automatic TV disconnection at midnight. Those classified as attractive were: an individual room for more privacy, information about dressing change times in order to avoid anxiety, and additional staff for in-patients. The results were complementary to those obtained with the SERVQHOS questionnaire. In-depth personal interviews provide extra knowledge about patient requirements, complementing the information obtained with questionnaires. With this methodology, more active patient participation is achieved and the companion's opinion is also taken into account. Copyright © 2016 SECA. Publicado por Elsevier España, S.L.U. All rights reserved.

  20. Infrastructure requirement of knowledge management system model of statistical learning tool (SLT) for education community

    Science.gov (United States)

    Abdullah, Rusli; Samah, Bahaman Abu; Bolong, Jusang; D'Silva, Jeffrey Lawrence; Shaffril, Hayrol Azril Mohamed

    2014-09-01

    Today, teaching and learning (T&L) using technology as a tool is becoming more important, especially in the field of statistics as a part of the subject matter in the higher education system environment. Even though there are many types of statistical learning tool (SLT) technology that can be used to support and enhance the T&L environment, there is a lack of a common standard knowledge management system serving as a knowledge portal for guidance, especially in relation to the infrastructure requirements of an SLT serving its community of users (CoU), such as educators, students and other parties interested in applying this technology as a tool for their T&L. Therefore, there is a need for a common standard infrastructure requirement of a knowledge portal to help the CoU manage statistical knowledge in acquiring, storing, disseminating and applying it for their specific purposes. Furthermore, by having this infrastructure requirement of a knowledge portal model of SLT as guidance in promoting knowledge of best practice among the CoU, the quality and productivity of their work can also be enhanced towards excellence of statistical knowledge application in the education system environment.

  1. Multipartite Entanglement Detection with Minimal Effort

    Science.gov (United States)

    Knips, Lukas; Schwemmer, Christian; Klein, Nico; Wieśniak, Marcin; Weinfurter, Harald

    2016-11-01

    Certifying entanglement of a multipartite state is generally considered a demanding task. Since an N-qubit state is parametrized by 4^N - 1 real numbers, one might naively expect that the measurement effort of generic entanglement detection also scales exponentially with N. Here, we introduce a general scheme to construct efficient witnesses requiring a constant number of measurements independent of the number of qubits for states like, e.g., Greenberger-Horne-Zeilinger states, cluster states, and Dicke states. For four qubits, we apply this novel method to experimental realizations of the aforementioned states and prove genuine four-partite entanglement with two measurement settings only.
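    For context, the textbook fidelity-based GHZ witness evaluates <W> = 1/2 - F, with F the overlap with the ideal GHZ state; a negative value certifies genuine multipartite entanglement. The sketch below evaluates it directly on a pure state vector and does not reproduce the paper's constant-number-of-settings measurement scheme:

```python
import math

def ghz_witness_value(state, n_qubits):
    """Fidelity-based witness for the N-qubit GHZ state on a pure state
    vector: <W> = 1/2 - |<GHZ|psi>|^2.  Negative => genuine multipartite
    entanglement; non-negative for all biseparable states."""
    dim = 2 ** n_qubits
    assert len(state) == dim
    # |GHZ> = (|0...0> + |1...1>) / sqrt(2); overlap uses only two amplitudes
    amp = (state[0] + state[dim - 1]) / math.sqrt(2.0)
    return 0.5 - abs(amp) ** 2
```

    The ideal four-qubit GHZ state gives -0.5, while a product state such as |0000> gives 0, as required of a witness.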

  2. Corporate Data Network (CDN) data requirements task. Enterprise Model. Volume 1

    International Nuclear Information System (INIS)

    1985-11-01

    The NRC has initiated a multi-year program to centralize its information processing in a Corporate Data Network (CDN). The new information processing environment will include shared databases, telecommunications, office automation tools, and state-of-the-art software. Touche Ross and Company was contracted to perform a general data requirements analysis for shared databases and to develop a preliminary plan for implementation of the CDN concept. The Enterprise Model (Vol. 1) provided the NRC with agency-wide information requirements in the form of data entities and organizational demand patterns as the basis for clustering the entities into logical groups. The Data Dictionary (Vol. 2) provided the NRC with definitions and example attributes and properties for each entity. The Data Model (Vol. 3) defined logical databases and entity relationships within and between databases. The Preliminary Strategic Data Plan (Vol. 4) prioritized the development of databases and included a workplan and approach for implementation of the shared database component of the Corporate Data Network.

  3. THE 3C COOPERATION MODEL APPLIED TO THE CLASSICAL REQUIREMENT ANALYSIS

    Directory of Open Access Journals (Sweden)

    Vagner Luiz Gava

    2012-08-01

    Full Text Available Aspects related to users' cooperative work are not considered in the traditional approach of software engineering, since the user is viewed independently of his/her workplace environment or group, with the individual model generalized to the study of the collective behavior of all users. This work proposes a software requirements process to address issues involving cooperative work in information systems that provide distributed coordination of the users' actions, where communication among them occurs indirectly through the data entered while using the software. To achieve this goal, this research uses ergonomics, the 3C cooperation model, awareness and software engineering concepts. Action-research is used as a research methodology, applied in three cycles during the development of a corporate workflow system in a technological research company. This article discusses the third cycle, which corresponds to the process that deals with the refinement of the cooperative work requirements with the software in actual use in the workplace, where the inclusion of a computer system changes the users' workplace from face-to-face interaction to interaction mediated by the software. The results showed that a higher degree of users' awareness of their own activities and of other system users contributes to a decrease in their errors and in inappropriate use of the system.

  4. The Streptococcus sanguinis competence regulon is not required for infective endocarditis virulence in a rabbit model.

    Directory of Open Access Journals (Sweden)

    Jill E Callahan

    Full Text Available Streptococcus sanguinis is an important component of dental plaque and a leading cause of infective endocarditis. Genetic competence in S. sanguinis requires a quorum sensing system encoded by the early comCDE genes, as well as late genes controlled by the alternative sigma factor, ComX. Previous studies of Streptococcus pneumoniae and Streptococcus mutans have identified functions for the >100-gene com regulon in addition to DNA uptake, including virulence. We investigated this possibility in S. sanguinis. Strains deleted for the comCDE or comX master regulatory genes were created. Using a rabbit endocarditis model in conjunction with a variety of virulence assays, we determined that both mutants possessed infectivity equivalent to that of a virulent control strain, and that measures of disease were similar in rabbits infected with each strain. These results suggest that the com regulon is not required for S. sanguinis infective endocarditis virulence in this model. We propose that the different roles of the S. sanguinis, S. pneumoniae, and S. mutans com regulons in virulence can be understood in relation to the pathogenic mechanisms employed by each species.

  5. The Streptococcus sanguinis competence regulon is not required for infective endocarditis virulence in a rabbit model.

    Science.gov (United States)

    Callahan, Jill E; Munro, Cindy L; Kitten, Todd

    2011-01-01

    Streptococcus sanguinis is an important component of dental plaque and a leading cause of infective endocarditis. Genetic competence in S. sanguinis requires a quorum sensing system encoded by the early comCDE genes, as well as late genes controlled by the alternative sigma factor, ComX. Previous studies of Streptococcus pneumoniae and Streptococcus mutans have identified functions for the >100-gene com regulon in addition to DNA uptake, including virulence. We investigated this possibility in S. sanguinis. Strains deleted for the comCDE or comX master regulatory genes were created. Using a rabbit endocarditis model in conjunction with a variety of virulence assays, we determined that both mutants possessed infectivity equivalent to that of a virulent control strain, and that measures of disease were similar in rabbits infected with each strain. These results suggest that the com regulon is not required for S. sanguinis infective endocarditis virulence in this model. We propose that the different roles of the S. sanguinis, S. pneumoniae, and S. mutans com regulons in virulence can be understood in relation to the pathogenic mechanisms employed by each species.

  6. Greater effort increases perceived value in an invertebrate.

    Science.gov (United States)

    Czaczkes, Tomer J; Brandstetter, Birgit; di Stefano, Isabella; Heinze, Jürgen

    2018-05-01

    Expending effort is generally considered to be undesirable. However, both humans and vertebrates will work for a reward they could also get for free. Moreover, cues associated with high-effort rewards are preferred to low-effort associated cues. Many explanations for these counterintuitive findings have been suggested, including cognitive dissonance (self-justification) or a greater contrast in state (e.g., energy or frustration level) before and after an effort-linked reward. Here, we test whether effort expenditure also increases perceived value in ants, using both classical cue-association methods and pheromone deposition, which correlates with perceived value. In 2 separate experimental setups, we show that pheromone deposition is higher toward the reward that requires more effort: 47% more pheromone deposition was performed for rewards reached via a vertical runway (high effort) compared with ones reached via a horizontal runway (low effort), and deposition rates were 28% higher on rough (high effort) versus smooth (low effort) runways. Using traditional cue-association methods, 63% of ants trained on different surface roughness, and 70% of ants trained on different runway elevations, preferred the high-effort related cues on a Y maze. Finally, pheromone deposition to feeders requiring memorization of one path bifurcation was up to 29% higher than to an identical feeder requiring no learning. Our results suggest that effort affects value perception in ants. This effect may stem from a cognitive process, which monitors the change in a generalized hedonic state before and after reward. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  7. Bioenergetics model for estimating food requirements of female Pacific walruses (Odobenus rosmarus divergens)

    Science.gov (United States)

    Noren, S.R.; Udevitz, M.S.; Jay, C.V.

    2012-01-01

    Pacific walruses Odobenus rosmarus divergens use sea ice as a platform for resting, nursing, and accessing extensive benthic foraging grounds. The extent of summer sea ice in the Chukchi Sea has decreased substantially in recent decades, causing walruses to alter habitat use and activity patterns which could affect their energy requirements. We developed a bioenergetics model to estimate caloric demand of female walruses, accounting for maintenance, growth, activity (active in-water and hauled-out resting), molt, and reproductive costs. Estimates for non-reproductive females 0–12 yr old (65−810 kg) ranged from 16359 to 68960 kcal d−1 (74−257 kcal d−1 kg−1) for years with readily available sea ice for which we assumed animals spent 83% of their time in water. This translated into the energy content of 3200–5960 clams per day, equivalent to 7–8% and 14–9% of body mass per day for 5–12 and 2–4 yr olds, respectively. Estimated consumption rates of 12 yr old females were minimally affected by pregnancy, but lactation had a large impact, increasing consumption rates to 15% of body mass per day. Increasing the proportion of time in water to 93%, as might happen if walruses were required to spend more time foraging during ice-free periods, increased daily caloric demand by 6–7% for non-lactating females. We provide the first bioenergetics-based estimates of energy requirements for walruses and a first step towards establishing bioenergetic linkages between demography and prey requirements that can ultimately be used in predicting this population’s response to environmental change.
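    The final translation from caloric demand to prey mass can be sketched as below; the prey energy density used in the example is an assumed illustrative value, not a parameter from the study:

```python
def food_requirement(kcal_per_day, body_mass_kg, prey_kcal_per_kg):
    """Translate a daily caloric demand into prey mass (kg/day) and the
    equivalent percentage of body mass consumed per day.  The caller
    supplies the prey energy density; values here are illustrative."""
    prey_kg = kcal_per_day / prey_kcal_per_kg
    percent_body_mass = 100.0 * prey_kg / body_mass_kg
    return prey_kg, percent_body_mass
```

    This is the arithmetic underlying statements such as "equivalent to 7-8% of body mass per day": demand divided by prey energy density, scaled by body mass.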

  8. Diverse Secreted Effectors Are Required for Salmonella Persistence in a Mouse Infection Model

    Energy Technology Data Exchange (ETDEWEB)

    Kidwai, Afshan S.; Mushamiri, Ivy T.; Niemann, George; Brown, Roslyn N.; Adkins, Joshua N.; Heffron, Fred

    2013-08-12

    Salmonella enterica serovar Typhimurium causes typhoid-like disease in mice and is a model of typhoid fever in humans. One of the hallmarks of typhoid is persistence, the ability of the bacteria to survive in the host weeks after infection. Virulence factors called effectors facilitate this process by direct transfer to the cytoplasm of infected cells, thereby subverting cellular processes. Secretion of effectors to the cell cytoplasm takes place through multiple routes, including two separate type III secretion (T3SS) apparatuses as well as outer membrane vesicles. The two T3SS are encoded on separate pathogenicity islands, SPI-1 and -2, with SPI-1 more strongly associated with the intestinal phase of infection, and SPI-2 with the systemic phase. Both T3SS are required for persistence, but the effectors required have not been systematically evaluated. In this study, mutations in 48 described effectors were tested for persistence. We replaced each effector with a specific DNA barcode sequence by allelic exchange and co-infected with a wild-type reference to calculate the ratio of wild-type parent to mutant at different times after infection. The competitive index (CI) was determined by quantitative PCR in which primers that correspond to the barcode were used for amplification. Mutations in all but seven effectors reduced persistence, demonstrating that most effectors were required. One exception was CigR, a recently discovered effector that is widely conserved throughout enteric bacteria. Deletion of cigR increased lethality, suggesting that it may be an anti-virulence factor. The fact that almost all Salmonella effectors are required for persistence argues against redundant functions. This is different from effector repertoires in other intracellular pathogens such as Legionella.
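    The competitive index computed from such co-infection data is a standard ratio-of-ratios; a minimal sketch (function name and example counts are illustrative, with qPCR quantities standing in for colony counts):

```python
def competitive_index(mutant_out, wildtype_out, mutant_in, wildtype_in):
    """Competitive index from a mixed infection: the mutant/wild-type
    ratio recovered from the host, normalized by the same ratio in the
    inoculum.  CI < 1 indicates the mutant is attenuated relative to
    the wild-type parent; CI ~ 1 indicates no fitness defect."""
    return (mutant_out / wildtype_out) / (mutant_in / wildtype_in)
```

    For example, a mutant recovered at half the wild-type level from a 1:1 inoculum yields CI = 0.5, i.e., a persistence defect.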

  9. Modelling elderly cardiac patients decision making using Cognitive Work Analysis: identifying requirements for patient decision aids.

    Science.gov (United States)

    Dhukaram, Anandhi Vivekanandan; Baber, Chris

    2015-06-01

    Patients make various healthcare decisions on a daily basis. Such day-to-day decision making can have significant consequences for their own health, treatment, care, and costs. While decision aids (DAs) provide effective support in enhancing patients' decision making, to date there have been few studies examining patients' decision making processes or exploring how the understanding of such decision processes can aid in extracting requirements for the design of DAs. This paper applies Cognitive Work Analysis (CWA) to analyse patients' decision making in order to inform requirements for supporting self-care decision making. This study uses focus groups to elicit information from elderly cardiovascular disease (CVD) patients concerning a range of decision situations they face on a daily basis. Specifically, the focus groups addressed issues related to CVD decision making in terms of medication compliance, pain, diet and exercise. The results of these focus groups are used to develop high level views using CWA. The CWA framework decomposes the complex decision making problem to inform three approaches to DA design: one design based on high level requirements; one based on a normative model of decision making for patients; and the third based on a range of heuristics that patients seem to use. CWA helps in extracting and synthesising decision making from different perspectives: decision processes, work organisation, patient competencies and strategies used in decision making. As decision making can be influenced by human behaviour such as skills, rules and knowledge, it is argued that patients require support for different types of decision making. This paper also provides insights for designers in using the CWA framework for the design of effective DAs to support patients in self-management. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  10. Business Process Modelling is an Essential Part of a Requirements Analysis. Contribution of EFMI Primary Care Working Group.

    Science.gov (United States)

    de Lusignan, S; Krause, P; Michalakidis, G; Vicente, M Tristan; Thompson, S; McGilchrist, M; Sullivan, F; van Royen, P; Agreus, L; Desombre, T; Taweel, A; Delaney, B

    2012-01-01

    To perform a requirements analysis of the barriers to conducting research linking primary care, genetic and cancer data. We extended our initial data-centric approach to include socio-cultural and business requirements. We created reference models of core data requirements common to most studies using unified modelling language (UML), dataflow diagrams (DFD) and business process modelling notation (BPMN). We conducted a stakeholder analysis and constructed DFD and UML diagrams for use cases based on simulated research studies. We used research output as a sensitivity analysis. Differences between the reference model and use cases identified study-specific data requirements. The stakeholder analysis identified: tensions, changes in specification, some indifference from data providers, and enthusiastic informaticians urging inclusion of socio-cultural context. We identified requirements to collect information at three levels: micro (data items, which need to be semantically interoperable), meso (the medical record and data extraction), and macro (the health system and socio-cultural issues). BPMN clarified complex business requirements among data providers and vendors, and additional geographical requirements for patients to be represented in both linked datasets. High quality research output was the norm for most repositories. Reference models provide high-level schemata of the core data requirements. However, business requirements modelling identifies stakeholder issues and what needs to be addressed to enable participation.

  11. Analysis Efforts Supporting NSTX Upgrades

    International Nuclear Information System (INIS)

    Zhang, H.; Titus, P.; Rogoff, P.; Zolfaghari, A.; Mangra, D.; Smith, M.

    2010-01-01

    The National Spherical Torus Experiment (NSTX) is a low aspect ratio, spherical torus (ST) configuration device located at the Princeton Plasma Physics Laboratory (PPPL). This device is presently being upgraded to enhance its physics by doubling the TF field to 1 Tesla and increasing the plasma current to 2 Mega-amperes. The upgrades include a replacement of the centerstack and the addition of a second neutral beam. The upgrade analyses have two missions. The first is to support design of new components, principally the centerstack; the second is to qualify existing NSTX components for higher loads, which will increase by a factor of four. Cost efficiency was a design goal for new equipment qualification and reanalysis of the existing components. Showing that older components can sustain the increased loads has been a challenging effort, in which designs had to be developed that would limit loading on weaker components and would minimize the extent of modifications needed. Two areas representing this effort have been chosen for more detailed description: analysis of the current distribution in the new TF inner legs, and analysis of the out-of-plane support of the existing TF outer legs.

  12. APS Education and Diversity Efforts

    Science.gov (United States)

    Prestridge, Katherine; Hodapp, Theodore

    2015-11-01

    American Physical Society (APS) has a wide range of education and diversity programs and activities, including programs that improve physics education, increase diversity, provide outreach to the public, and impact public policy. We present the latest programs spearheaded by the Committee on the Status of Women in Physics (CSWP), with highlights from other diversity and education efforts. The CSWP is working to increase the fraction of women in physics, understand and implement solutions for gender-specific issues, enhance professional development opportunities for women in physics, and remedy issues that impact gender inequality in physics. The Conferences for Undergraduate Women in Physics, Professional Skills Development Workshops, and our new Professional Skills program for students and postdocs are all working towards meeting these goals. The CSWP also has site visit and conversation visit programs, where department chairs request that the APS assess the climate for women in their departments or facilitate climate discussions. APS also has two significant programs to increase participation by underrepresented minorities (URM). The newest program, the APS National Mentoring Community, is working to provide mentoring to URM undergraduates, and the APS Bridge Program is an established effort that is dramatically increasing the number of URM PhDs in physics.

  13. Effort, anhedonia, and function in schizophrenia: reduced effort allocation predicts amotivation and functional impairment.

    Science.gov (United States)

    Barch, Deanna M; Treadway, Michael T; Schoen, Nathan

    2014-05-01

    One of the most debilitating aspects of schizophrenia is an apparent lack of interest in or ability to exert effort for rewards. Such "negative symptoms" may prevent individuals from obtaining potentially beneficial outcomes in educational, occupational, or social domains. In animal models, dopamine abnormalities decrease willingness to work for rewards, implicating dopamine (DA) function as a candidate substrate for negative symptoms given that schizophrenia involves dysregulation of the dopamine system. We used the effort-expenditure for rewards task (EEfRT) to assess the degree to which individuals with schizophrenia were willing to exert increased effort for either larger magnitude rewards or for rewards that were more probable. Fifty-nine individuals with schizophrenia and 39 demographically similar controls performed the EEfRT task, which involves making choices between "easy" and "hard" tasks to earn potential rewards. Individuals with schizophrenia showed less of an increase in effort allocation as either reward magnitude or probability increased. In controls, the frequency of choosing the hard task in high reward magnitude and probability conditions was negatively correlated with depression severity and anhedonia. In schizophrenia, fewer hard task choices were associated with more severe negative symptoms and worse community and work function as assessed by a caretaker. Consistent with patterns of disrupted dopamine functioning observed in animal models of schizophrenia, these results suggest that one mechanism contributing to impaired function and motivational drive in schizophrenia may be a reduced allocation of greater effort for higher magnitude or higher probability rewards.
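    Effort allocation of this kind is often formalized as an expected-value-minus-effort-cost choice model; the sketch below is an illustrative softmax version (function name, parameter values, and the cost term are assumptions, not quantities fitted to this study's data):

```python
import math

def p_hard_choice(reward_hard, prob, effort_cost, reward_easy=1.0, beta=1.0):
    """Probability of choosing the hard (high-effort) option over the
    easy one: the difference in expected value, with an effort cost
    subtracted from the hard option, passed through a logistic with
    inverse temperature beta."""
    ev_hard = prob * reward_hard - effort_cost
    ev_easy = prob * reward_easy
    return 1.0 / (1.0 + math.exp(-beta * (ev_hard - ev_easy)))
```

    In such a model, a higher subjective effort cost flattens the increase in hard-task choices with reward magnitude or probability, mirroring the pattern reported for patients.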

  14. Dynamic Computational Model of Symptomatic Bacteremia to Inform Bacterial Separation Treatment Requirements.

    Directory of Open Access Journals (Sweden)

    Sinead E Miller

    Full Text Available The rise of multi-drug resistance has decreased the effectiveness of antibiotics, which has led to increased mortality rates associated with symptomatic bacteremia, or bacterial sepsis. To combat decreasing antibiotic effectiveness, extracorporeal bacterial separation approaches have been proposed to capture and separate bacteria from blood. However, bacteremia is dynamic and involves host-pathogen interactions across various anatomical sites. We developed a mathematical model that quantitatively describes the kinetics of pathogenesis and progression of symptomatic bacteremia under various conditions, including bacterial separation therapy, to better understand disease mechanisms and quantitatively assess the biological impact of bacterial separation therapy. Model validity was tested against experimental data from published studies. This is the first multi-compartment model of symptomatic bacteremia in mammals that includes extracorporeal bacterial separation and antibiotic treatment, separately and in combination. The addition of an extracorporeal bacterial separation circuit reduced the predicted time of total bacteria clearance from the blood of an immunocompromised rodent by 49%, compared to antibiotic treatment alone. Implementation of bacterial separation therapy resulted in predicted multi-drug resistant bacterial clearance from the blood of a human in 97% less time than antibiotic treatment alone. The model also proposes a quantitative correlation between time-dependent bacterial load among tissues and bacteremia severity, analogous to the well-known 'area under the curve' for characterization of drug efficacy. The engineering-based mathematical model developed may be useful for informing the design of extracorporeal bacterial separation devices. This work enables the quantitative identification of the characteristics required of an extracorporeal bacteria separation device to provide biological benefit. 
These devices will potentially
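    The kinetics of the blood compartment in such a model can be sketched as a first-order balance of growth against antibiotic killing and extracorporeal separation clearance, integrated here by forward Euler; all rate constants are illustrative assumptions, not the paper's fitted values, and the real model has multiple compartments:

```python
def simulate_blood_bacteria(b0, growth, kill_abx, kill_sep, dt=0.01, t_end=24.0):
    """One-compartment sketch of blood bacterial load B(t):
    dB/dt = (growth - kill_abx - kill_sep) * B, integrated by forward
    Euler.  Rates are per hour; kill_sep = 0 disables the separation
    circuit.  Returns a list of (t, B) samples."""
    b, t, series = float(b0), 0.0, []
    while t <= t_end + 1e-12:
        series.append((t, b))
        b = max(b + (growth - kill_abx - kill_sep) * b * dt, 0.0)
        t += dt
    return series
```

    Adding a separation clearance term steepens the decline of the blood load, which is the qualitative effect behind the reported reduction in time to clearance.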

  15. Summary report of a seminar on geosphere modelling requirements of deep disposal of low and intermediate level radioactive wastes

    International Nuclear Information System (INIS)

    Piper, D.; Paige, R.W.; Broyd, T.W.

    1989-02-01

    A seminar on the geosphere modelling requirements of deep disposal of low and intermediate level radioactive wastes was organised by WS Atkins Engineering Sciences as part of Her Majesty's Inspectorate of Pollution's Radioactive Waste Assessment Programme. The objectives of the seminar were to review geosphere modelling capabilities and prioritise, if possible, any requirements for model development. Summaries of the presentations and subsequent discussions are given in this report. (author)

  16. A decision-making framework to model environmental flow requirements in oasis areas using Bayesian networks

    Science.gov (United States)

    Xue, Jie; Gui, Dongwei; Zhao, Ying; Lei, Jiaqiang; Zeng, Fanjiang; Feng, Xinlong; Mao, Donglei; Shareef, Muhammad

    2016-09-01

    The competition for water resources between agricultural and natural oasis ecosystems has become an increasingly serious problem in oasis areas worldwide. Recently, the intensive extension of oasis farmland has led to excessive exploitation of water discharge, and consequently has resulted in a lack of water supply in natural oasis. To coordinate the conflicts, this paper provides a decision-making framework for modeling environmental flows in oasis areas using Bayesian networks (BNs). Three components are included in the framework: (1) assessment of agricultural economic loss due to meeting environmental flow requirements; (2) decision-making analysis using BNs; and (3) environmental flow decision-making under different water management scenarios. The decision-making criterion is determined based on intersection point analysis between the probability of large-level total agro-economic loss and the ratio of total to maximum agro-economic output by satisfying environmental flows. An application in the Qira oasis area of the Tarim Basin, Northwest China indicates that BNs can model environmental flow decision-making associated with agricultural economic loss effectively, as a powerful tool to coordinate water-use conflicts. In the case study, the environmental flow requirement is determined as 50.24%, 49.71% and 48.73% of the natural river flow in wet, normal and dry years, respectively. Without further agricultural economic loss, 1.93%, 0.66% and 0.43% of more river discharge can be allocated to eco-environmental water demands under the combined strategy in wet, normal and dry years, respectively. This work provides a valuable reference for environmental flow decision-making in any oasis area worldwide.
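    The intersection-point step of the decision criterion can be sketched as a root search for the crossing of two monotone curves; the curves passed in below are illustrative stand-ins for the probability of large-level agro-economic loss and the output ratio, not the study's fitted relationships:

```python
def intersection_point(rising, falling, lo=0.0, hi=1.0, tol=1e-6):
    """Bisection for the crossing point of a rising curve and a falling
    curve on [lo, hi].  Used here as a generic sketch of an
    intersection-point decision criterion over, e.g., an environmental
    flow fraction."""
    def gap(x):
        return rising(x) - falling(x)
    assert gap(lo) < 0.0 < gap(hi), "curves must cross on [lo, hi]"
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if gap(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

    The returned abscissa is the flow fraction at which the two criteria balance, analogous to the 48-50% environmental flow requirements reported for the case study.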

  17. Cognitive dissonance in children: justification of effort or contrast?

    Science.gov (United States)

    Alessandri, Jérôme; Darcheville, Jean-Claude; Zentall, Thomas R

    2008-06-01

    Justification of effort is a form of cognitive dissonance in which the subjective value of an outcome is directly related to the effort that went into obtaining it. However, it is likely that in social contexts (such as the requirements for joining a group) an inference can be made (perhaps incorrectly) that an outcome that requires greater effort to obtain in fact has greater value. Here we present evidence that a cognitive dissonance effect can be found in children under conditions that offer better control for the social value of the outcome. This effect is quite similar to contrast effects that recently have been studied in animals. We suggest that contrast between the effort required to obtain the outcome and the outcome itself provides a more parsimonious account of this phenomenon and perhaps other related cognitive dissonance phenomena as well. Research will be needed to identify cognitive dissonance processes that are different from contrast effects of this kind.

  18. The Design of Effective ICT-Supported Learning Activities: Exemplary Models, Changing Requirements, and New Possibilities

    Directory of Open Access Journals (Sweden)

    Cameron Richards

    2005-01-01

    Full Text Available Despite the imperatives of policy and rhetoric about their integration in formal education, Information and Communication Technologies (ICTs) are often used as an "add-on" in many classrooms and in many lesson plans. Nevertheless, many teachers find that interesting and well-planned tasks, projects, and resources provide a key to harnessing the educational potential of digital resources, Internet communications and interactive multimedia to engage the interest, interaction, and knowledge construction of young learners. To the extent that such approaches go beyond and transform traditional "transmission" models of teaching and formal lesson planning, this paper investigates the changing requirements and new possibilities represented by the challenge of integrating ICTs in education in a way which at the same time connects more effectively with both the specific contents of the curriculum and the various stages and elements of the learning process. Case studies from teacher education foundation courses provide an exemplary focus of inquiry in order to better link relevant new theories or models of learning with practice, to build upon related learner-centered strategies for integrating ICT resources and tools, and to incorporate the interdependent functions of learning as information access, communication, and applied interactions. As one possible strategy in this direction, the concept of an "ICT-supported learning activity" suggests the need for teachers to approach this increasing challenge more as "designers" of effective and integrated learning rather than mere "transmitters" of skills or information through an add-on use of ICTs.

  19. Model Based User's Access Requirement Analysis of E-Governance Systems

    Science.gov (United States)

    Saha, Shilpi; Jeon, Seung-Hwan; Robles, Rosslin John; Kim, Tai-Hoon; Bandyopadhyay, Samir Kumar

    The strategic and contemporary importance of e-governance has been recognized across the world. In India too, various ministries of the Government of India and State Governments have taken e-governance initiatives to provide e-services to citizens and the businesses they serve. To achieve the mission objectives and make such e-governance initiatives successful, it is necessary to improve the trust and confidence of the stakeholders. It is assumed that the delivery of government services will share the same public network information that is being used in the community at large. In particular, the Internet will be the principal means by which public access to government and government services will be achieved. The main aim in providing security measures is therefore to identify the stakeholders' access requirements and then to analyse them according to the models of Nath's approach. Based on this analysis, the Govt. can also set security standards based on the e-governance models. Thus there will be fewer human errors and less bias. This analysis leads to the security architecture of the specific G2C application.

  20. Estimation of total Effort and Effort Elapsed in Each Step of Software Development Using Optimal Bayesian Belief Network

    Directory of Open Access Journals (Sweden)

    Fatemeh Zare Baghiabad

    2017-09-01

    Full Text Available The difficulty of accurately estimating the effort needed for software development makes software effort estimation a challenging issue. Besides estimating the total effort, determining the effort elapsed in each software development step is very important, because any mistake in enterprise resource planning can lead to project failure. In this paper, a Bayesian belief network is proposed based on effective components and the software development process. In this model, feedback loops are considered between development steps, with return rates that differ for each project. The different return rates help determine the percentage of effort elapsed in each software development step distinctly. Moreover, the error measurement resulting from the optimized effort estimation and the optimal coefficients for modifying the model are sought. A comparison between the proposed model and other models showed that the model can estimate the total effort with high accuracy (a marginal error of about 0.114) and can estimate the effort elapsed in each software development step.
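    The abstract reports accuracy as a marginal error of about 0.114. A standard way to score effort-estimation models of this kind is the magnitude of relative error (MRE) per project and its mean (MMRE); the sketch below uses invented person-month figures, and the metric choice is our assumption, not a detail taken from the paper.

```python
# Hypothetical worked example of a common error measure for effort
# estimation: magnitude of relative error (MRE) per project and its
# mean over projects (MMRE). All figures are invented for illustration.

def mre(actual, predicted):
    """Relative size of the estimation error for one project."""
    return abs(actual - predicted) / actual

def mmre(pairs):
    """Mean MRE over (actual, predicted) effort pairs."""
    return sum(mre(a, p) for a, p in pairs) / len(pairs)

# (actual, predicted) effort in person-months, invented
projects = [(120.0, 110.0), (80.0, 92.0), (200.0, 185.0)]
error = mmre(projects)
```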

  1. Quantifying commercial catch and effort of monkfish Lophius ...

    African Journals Online (AJOL)

    Catch-per-unit-effort (cpue) data of vessels targeting monkfish and sole (the two ... analysed using two different methods to construct indices of abundance. ... in Namibia to all tail-weight classes is not appropriate for the current fishery and needs ... Keywords: catch per unit effort, Generalized Linear Model, Lophius vaillanti, ...
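    The record is truncated, but the core quantity is simple: catch-per-unit-effort is catch divided by effort, and a nominal (unstandardized) abundance index rescales it to a base year. The study itself standardizes cpue with Generalized Linear Models; the sketch below shows only the nominal index, with invented data.

```python
# Nominal cpue abundance index: cpue = catch / effort per year,
# scaled to the first year. Data are invented for illustration;
# a real analysis would standardize cpue with a GLM.

records = [
    # (year, catch in tonnes, effort in vessel-days)
    (2001, 950.0, 3100.0),
    (2002, 870.0, 3300.0),
    (2003, 910.0, 2900.0),
]

cpue = {year: catch / effort for year, catch, effort in records}
base = cpue[2001]
index = {year: round(v / base, 3) for year, v in cpue.items()}
```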

  2. A qualitative readiness-requirements assessment model for enterprise big-data infrastructure investment

    Science.gov (United States)

    Olama, Mohammed M.; McNair, Allen W.; Sukumar, Sreenivas R.; Nutaro, James J.

    2014-05-01

    In the last three decades, there has been an exponential growth in the area of information technology providing the information processing needs of data-driven businesses in government, science, and private industry in the form of capturing, staging, integrating, conveying, analyzing, and transferring data that will help knowledge workers and decision makers make sound business decisions. Data integration across enterprise warehouses is one of the most challenging steps in the big data analytics strategy. Several levels of data integration have been identified across enterprise warehouses: data accessibility, common data platform, and consolidated data model. Each level of integration has its own set of complexities that requires a certain amount of time, budget, and resources to implement. Such levels of integration are designed to address the technical challenges inherent in consolidating the disparate data sources. In this paper, we present a methodology based on industry best practices to measure the readiness of an organization and its data sets against the different levels of data integration. We introduce a new Integration Level Model (ILM) tool, which is used for quantifying an organization's and data system's readiness to share data at a certain level of data integration. It is based largely on the established and accepted framework provided in the Data Management Association's Data Management Body of Knowledge (DAMA-DMBOK). It comprises several key data management functions and supporting activities, together with several environmental elements that describe and apply to each function. The proposed model scores the maturity of a system's data governance processes and provides a pragmatic methodology for evaluating integration risks. The higher the computed scores, the better managed the source data system and the greater the likelihood that the data system can be brought in at a higher level of integration.
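    A minimal sketch of the kind of scoring the ILM performs, assuming (our invention, not the paper's actual rubric) that each data-management function receives a 0-5 maturity score and a weight, and that readiness is the weighted mean.

```python
# Hypothetical maturity-scoring scheme in the spirit of the ILM:
# each data-management function gets a 0-5 maturity score and a weight;
# readiness is the weighted mean. Function names and weights are
# invented for illustration, not taken from the paper.

functions = {
    # function: (weight, maturity score 0-5)
    "data governance":     (0.30, 4),
    "data architecture":   (0.25, 3),
    "data quality":        (0.25, 2),
    "metadata management": (0.20, 3),
}

def readiness(scores):
    """Weighted mean of the maturity scores."""
    total_w = sum(w for w, _ in scores.values())
    return sum(w * s for w, s in scores.values()) / total_w

score = readiness(functions)
```

    A higher score would, in the paper's terms, suggest the source system can be brought in at a higher level of integration.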

  3. Modeling the Non-functional Requirements in the Context of Usability, Performance, Safety and Security

    OpenAIRE

    Sadiq, Mazhar

    2007-01-01

    Requirement engineering is the most significant part of the software development life cycle. Until now great emphasis has been put on the maturity of the functional requirements. But with the passage of time it reveals that the success of software development does not only pertain to the functional requirements rather non-functional requirements should also be taken into consideration. Among the non-functional requirements usability, performance, safety and security are considered important. ...

  4. [Limitation of the therapeutic effort].

    Science.gov (United States)

    Herreros, B; Palacios, G; Pacho, E

    2012-03-01

    The limitation of the therapeutic effort (LTE) consists in not applying extraordinary or disproportionate measures for therapeutic purposes to a patient with a poor life prognosis and/or poor quality of life. There are two types: not initiating certain measures, and withdrawing them once they have been established. A decision on LTE should be based on rigorous criteria, so we make the following proposal. First, it is necessary to know the most relevant details of the case in order to make a decision: the preferences of the patient, the preferences of the family when pertinent, the prognosis (severity), the quality of life, and the distribution of limited resources. Afterwards, the decision should be made. In this phase, participatory deliberation should be established to clarify the goal of the intervention. Finally, if it is decided to perform an LTE, it should be decided how to do it. Disproportionate procedures that are useless for the designed therapeutic objective should not be initiated, and should be withdrawn if already established. When it has been decided to treat a condition (interim measures), the treatment should be maintained. This complex phase may need stratification of the measures. Finally, the necessary palliative measures should be established. Copyright © 2011 Elsevier España, S.L. All rights reserved.

  5. Planning intensive care unit design using computer simulation modeling: optimizing integration of clinical, operational, and architectural requirements.

    Science.gov (United States)

    O'Hara, Susan

    2014-01-01

    Nurses have increasingly been regarded as critical members of the planning team as architects recognize their knowledge and value. But the nurses' role as knowledge experts can be expanded to leading efforts to integrate the clinical, operational, and architectural expertise through simulation modeling. Simulation modeling allows for the optimal merge of multifactorial data to understand the current state of the intensive care unit and predict future states. Nurses can champion the simulation modeling process and reap the benefits of a cost-effective way to test new designs, processes, staffing models, and future programming trends prior to implementation. Simulation modeling is an evidence-based planning approach, a standard, for integrating the sciences with real client data, to offer solutions for improving patient care.

  6. Aminoglycoside Concentrations Required for Synergy with Carbapenems against Pseudomonas aeruginosa Determined via Mechanistic Studies and Modeling.

    Science.gov (United States)

    Yadav, Rajbharan; Bulitta, Jürgen B; Schneider, Elena K; Shin, Beom Soo; Velkov, Tony; Nation, Roger L; Landersdorfer, Cornelia B

    2017-12-01

    This study aimed to systematically identify the aminoglycoside concentrations required for synergy with a carbapenem and characterize the permeabilizing effect of aminoglycosides on the outer membrane of Pseudomonas aeruginosa. Monotherapies and combinations of four aminoglycosides and three carbapenems were studied for activity against P. aeruginosa strain AH298-GFP in 48-h static-concentration time-kill studies (SCTK) (inoculum: 10^7.6 CFU/ml). The outer membrane-permeabilizing effect of tobramycin alone and in combination with imipenem was characterized via electron microscopy, confocal imaging, and the nitrocefin assay. A mechanism-based model (MBM) was developed to simultaneously describe the time course of bacterial killing and prevention of regrowth by imipenem combined with each of the four aminoglycosides. Notably, 0.25 mg/liter of tobramycin, which was inactive in monotherapy, achieved synergy (i.e., ≥2-log10 more killing than the most active monotherapy at 24 h) when combined with imipenem. Electron micrographs, confocal image analyses, and the nitrocefin uptake data showed distinct outer membrane damage by tobramycin, which was more extensive for the combination with imipenem. The MBM indicated that aminoglycosides enhanced the imipenem target site concentration up to 4.27-fold. Tobramycin was the most potent aminoglycoside in permeabilizing the outer membrane; tobramycin (0.216 mg/liter), gentamicin (0.739 mg/liter), amikacin (1.70 mg/liter), or streptomycin (5.19 mg/liter) was required for half-maximal permeabilization. In summary, our SCTK, mechanistic studies and MBM indicated that tobramycin was highly synergistic and displayed the maximum outer membrane disruption potential among the tested aminoglycosides. These findings support the optimization of highly promising antibiotic combination dosage regimens for critically ill patients. Copyright © 2017 American Society for Microbiology.

  7. Mental and physical effort affect vigilance differently

    NARCIS (Netherlands)

    Smit, A.S.; Eling, P.A.T.M.; Hopman, M.T.E.; Coenen, A.M.L.

    2005-01-01

    Both physical and mental effort are thought to affect vigilance. Mental effort is known for its vigilance declining effects, but the effects of physical effort are less clear. This study investigated whether these two forms of effort affect the EEG and subjective alertness differently. Participants

  9. JREM: An Approach for Formalising Models in the Requirements Phase with JSON and NoSQL Databases

    OpenAIRE

    Aitana Alonso-Nogueira; Helia Estévez-Fernández; Isaías García

    2017-01-01

    This paper presents an approach to reducing some of the current flaws in the requirements phase of the software development process. It takes the software requirements of an application, builds a conceptual model of them and formalizes it within JSON documents. This formal model is lodged in a document-oriented NoSQL database, MongoDB, because of its advantages in flexibility and efficiency. In addition, this paper underlines the contributions of the detailed approach a...
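    A small sketch of the central idea, formalising one requirement as a JSON document: the field names here are illustrative, not the actual JREM schema, and the round-trip uses Python's standard json module rather than MongoDB itself.

```python
# Sketch of formalising a single software requirement as a JSON document
# of the kind that could be stored in a document-oriented store such as
# MongoDB. Field names are illustrative, not the JREM schema.
import json

requirement = {
    "id": "REQ-001",
    "type": "functional",
    "statement": "The system shall let a registered user reset a password.",
    "priority": "high",
    "traces_to": ["UC-04", "TEST-017"],   # hypothetical trace links
}

# Serialise to a JSON document and read it back, as a driver would
# before inserting into / after querying a document store.
doc = json.dumps(requirement, sort_keys=True)
restored = json.loads(doc)
```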

  10. Economic growth, biodiversity loss and conservation effort.

    Science.gov (United States)

    Dietz, Simon; Adger, W Neil

    2003-05-01

    This paper investigates the relationship between economic growth, biodiversity loss and efforts to conserve biodiversity using a combination of panel and cross section data. If economic growth is a cause of biodiversity loss through habitat transformation and other means, then we would expect an inverse relationship. But if higher levels of income are associated with increasing real demand for biodiversity conservation, then investment to protect remaining diversity should grow and the rate of biodiversity loss should slow with growth. Initially, economic growth and biodiversity loss are examined within the framework of the environmental Kuznets hypothesis. Biodiversity is represented by predicted species richness, generated for tropical terrestrial biodiversity using a species-area relationship. The environmental Kuznets hypothesis is investigated with reference to a comparison of fixed and random effects models to allow the relationship to vary for each country. It is concluded that an environmental Kuznets curve between income and rates of loss of habitat and species does not exist in this case. The role of conservation effort in addressing environmental problems is examined through state protection of land and the regulation of trade in endangered species, two important means of biodiversity conservation. This analysis shows that the extent of government environmental policy increases with economic development. We argue that, although the data are problematic, the implication of these models is that conservation effort can only ever result in a partial deceleration of biodiversity decline, partly because protected areas serve multiple functions and are not necessarily designated to protect biodiversity. Nevertheless, the institutional and policy response components of the income-biodiversity relationship are important but are not well captured through cross-country regression analysis.

  11. Control and Effort Costs Influence the Motivational Consequences of Choice

    Directory of Open Access Journals (Sweden)

    Holly Sullivan-Toole

    2017-05-01

    Full Text Available The act of making a choice, apart from any outcomes the choice may yield, has, paradoxically, been linked to both the enhancement and the detriment of intrinsic motivation. Research has implicated two factors in potentially mediating these contradictory effects: the personal control conferred by a choice and the costs associated with a choice. Across four experiments, utilizing a physical effort task disguised as a simple video game, we systematically varied costs across two levels of physical effort requirements (Low-Requirement, High-Requirement) and control over effort costs across three levels of choice (Free-Choice, Restricted-Choice, and No-Choice) to disambiguate how these factors affect the motivational consequences of choosing within an effortful task. Together, our results indicated that, in the face of effort requirements, illusory control alone may not sufficiently enhance perceptions of personal control to boost intrinsic motivation; rather, the experience of actual control may be necessary to overcome effort costs and elevate performance. Additionally, we demonstrated that conditions of illusory control, while otherwise unmotivating, can through association with the experience of free-choice, be transformed to have a positive effect on motivation.

  12. Improving groundwater management in rural India using simple modeling tools with minimal data requirements

    Science.gov (United States)

    Moysey, S. M.; Oblinger, J. A.; Ravindranath, R.; Guha, C.

    2008-12-01

    shortly after the start of the monsoon and villager water use is small compared to the other fluxes. Groundwater fluxes were accounted for by conceptualizing the contributing areas upstream and downstream of the reservoir as one-dimensional flow tubes. This description of the flow system allows for the definition of physically-based parameters, making the model useful for investigating WHS infiltration under a variety of management scenarios. To address concerns regarding the uniqueness of the model parameters, 10,000 independent model calibrations were performed using randomly selected starting parameters. Based on this Monte Carlo analysis, it was found that the mean volume of water contributed by the WHS to infiltration over the study period (Sept.-Dec., 2007) was 48.1×10³ m³, with a 95% confidence interval of 43.7-53.7×10³ m³. This volume represents 17-21% of the total natural groundwater recharge contributed by the entire watershed, which was determined independently using a surface water balance. Despite the fact that the model is easy to use and requires minimal data, the results obtained provide a powerful quantitative starting point for managing groundwater withdrawals in the dry season.
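    The Monte Carlo summary reported above (mean and empirical 95% confidence interval over 10,000 random-start calibrations) can be sketched as follows; the Gaussian "calibrated volumes" are synthetic stand-ins for the study's actual calibration results.

```python
# Sketch of summarising repeated model calibrations: take the calibrated
# infiltration volumes from many random-start runs and report the mean
# and an empirical 95% confidence interval. The synthetic values below
# stand in for the 10,000 real calibration runs.
import random
import statistics

random.seed(42)
# synthetic calibrated volumes (10^3 m^3), centred near the reported mean
volumes = [random.gauss(48.1, 2.5) for _ in range(10_000)]

volumes.sort()
mean = statistics.fmean(volumes)
lo = volumes[int(0.025 * len(volumes))]   # 2.5th percentile
hi = volumes[int(0.975 * len(volumes))]   # 97.5th percentile
```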

  13. Review of data requirements for groundwater flow and solute transport modelling and the ability of site investigation methods to meet these requirements

    International Nuclear Information System (INIS)

    McEwen, T.J.; Chapman, N.A.; Robinson, P.C.

    1990-08-01

    This report describes the data requirements for the codes that may be used in the modelling of groundwater flow and radionuclide transport during the assessment of a Nirex site for the deep disposal of low and intermediate level radioactive waste and also the site investigation methods that exist to supply the data for these codes. The data requirements for eight codes are reviewed, with most emphasis on three of the more significant codes, VANDAL, NAMMU and CHEMTARD. The largest part of the report describes and discusses the site investigation techniques and each technique is considered in terms of its ability to provide the data necessary to characterise the geological and hydrogeological environment around a potential repository. (author)

  14. Analysis and Design Environment for Large Scale System Models and Collaborative Model Development, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — As NASA modeling efforts grow more complex and more distributed among many working groups, new tools and technologies are required to integrate their efforts...

  15. SOFTWARE EFFORT PREDICTION: AN EMPIRICAL EVALUATION OF METHODS TO TREAT MISSING VALUES WITH RAPIDMINER ®

    OpenAIRE

    OLGA FEDOTOVA; GLADYS CASTILLO; LEONOR TEIXEIRA; HELENA ALVELOS

    2011-01-01

    Missing values are a common problem in data analysis in all areas, and software engineering is not an exception. In particular, missing data is a widespread phenomenon observed during the elaboration of effort prediction models (EPMs) required for budget, time and functionality planning. The current work presents the results of a study carried out in a Portuguese medium-sized software development organization in order to obtain a formal method for EPM elicitation in development processes. Thi...
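    As a point of reference for the methods compared in the study, the simplest baseline treatment of missing values is mean imputation; the sketch below is a generic illustration with invented data, not one of the RapidMiner operators evaluated.

```python
# Minimal sketch of the simplest treatment of missing values in effort
# data: mean imputation of a numeric feature. Data are invented.

def impute_mean(values):
    """Replace None entries with the mean of the observed entries."""
    observed = [v for v in values if v is not None]
    mean = sum(observed) / len(observed)
    return [mean if v is None else v for v in values]

size_kloc = [12.0, None, 20.0, 16.0, None]   # invented feature with gaps
filled = impute_mean(size_kloc)
```

    More careful treatments (the subject of the study) model the missingness mechanism rather than assuming the observed mean is representative.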

  16. FEM-model of the Naesliden Mine: requirements and limitations at the outset of the project. [Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Krauland, N.

    1980-05-15

    The design of any instrument depends entirely on the application planned for the instrument. This also applies to the FEM-representation of the Naesliden Mine. With reference to the aims of the project, the requirements for the model are outlined with regard to: (1) simulation of the mining process; (2) modelling with special reference to the aims of the project; and (3) comparison of FEM results with in situ observations to determine the validity of the model. The proposed model is two-dimensional and incorporates joint elements to simulate the weak alteration zone between the orebody and the sidewall rock. The remainder of the model exhibits linear elastic behaviour. This model is evaluated with respect to the given requirements. The limitations of the chosen model are outlined.

  17. Model for peace support operations: an overview of the ICT and interoperability requirements

    CSIR Research Space (South Africa)

    Leenen, L

    2009-03-01

    Full Text Available requires a reciprocal interdependence among these various elements, and this necessitates complex coordination and a great demand for ongoing and accurate communication (Chisholm 1986). Higher technological complexity requires higher levels... interoperability requirements thereof. Such methods, when fully developed, give the military planner the ability to rapidly assess the requirements as circumstances change. From interviews with SANDF staff (Ross 2007), we gathered that the SANDF planning...

  18. Semantics of trace relations in requirements models for consistency checking and inferencing

    NARCIS (Netherlands)

    Göknil, Arda; Ivanov, Ivan; van den Berg, Klaas; Veldhuis, Jan-Willem

    2009-01-01

    Requirements traceability is the ability to relate requirements back to stakeholders and forward to corresponding design artifacts, code, and test cases. Although considerable research has been devoted to relating requirements in both forward and backward directions, less attention has been paid to

  19. 1996 Design effort for IFMIF HEBT

    International Nuclear Information System (INIS)

    Blind, B.

    1997-01-01

    The paper details the 1996 design effort for the IFMIF HEBT. Following a brief overview, it lists the primary requirements for the beam at the target, describes the design approach and design tools used, introduces the beamline modules, gives the results achieved with the design at this stage, points out possible improvements, and gives the names and computer locations of the TRACE3-D and PARMILA files that sum up the design work. The design does not yet fully meet specifications with regard to the flatness of the distribution at the target. With further work, including if necessary some backup options, the flatness specifications may be realized. It is not proposed at this time that the specifications, namely flatness to ±5% and higher-intensity ridges no more than 15% above average, be changed. The design also does not meet the requirement that the modules of all beamlines operate at the same settings. However, the goal of using identical components and operational procedures has been met, and only minor retuning is needed to produce very similar beam distributions from all beamlines. Significant further work is required in the following areas: TRACE3-D designs and PARMILA runs must be made for the beams coming from accelerators No. 3 and No. 4. Transport of 30-MeV and 35-MeV beams to the targets and beam dump must be studied. Comprehensive error studies must be made; these must result in tolerance specifications and may require design iterations. Detailed interfacing with target-spot instrumentation is required; this instrumentation must be able to check all aspects of the specifications.

  20. Directed-energy process technology efforts

    Science.gov (United States)

    Alexander, P.

    1985-01-01

    A summary of directed-energy process technology for solar cells was presented. This technology is defined as directing energy or mass to specific areas on solar cells to produce a desired effect in contrast to exposing a cell to a thermal or mass flow environment. Some of these second generation processing techniques are: ion implantation; microwave-enhanced chemical vapor deposition; rapid thermal processing; and the use of lasers for cutting, assisting in metallization, assisting in deposition, and drive-in of liquid dopants. Advantages of directed energy techniques are: surface heating resulting in the bulk of the cell material being cooler and unchanged; better process control yields; better junction profiles, junction depths, and metal sintering; lower energy consumption during processing and smaller factory space requirements. These advantages should result in higher-efficiency cells at lower costs. The results of the numerous contracted efforts were presented as well as the application potentials of these new technologies.

  1. ON THE REQUIREMENTS FOR REALISTIC MODELING OF NEUTRINO TRANSPORT IN SIMULATIONS OF CORE-COLLAPSE SUPERNOVAE

    Energy Technology Data Exchange (ETDEWEB)

    Lentz, Eric J. [Department of Physics and Astronomy, University of Tennessee, Knoxville, TN 37996-1200 (United States); Mezzacappa, Anthony; Hix, W. Raphael [Physics Division, Oak Ridge National Laboratory, P.O. Box 2008, Oak Ridge, TN 37831-6354 (United States); Messer, O. E. Bronson [Computer Science and Mathematics Division, Oak Ridge National Laboratory, P.O. Box 2008, Oak Ridge, TN 37831-6164 (United States); Liebendoerfer, Matthias [Department of Physics, University of Basel, Klingelbergstrasse 82, CH-4056 Basel (Switzerland); Bruenn, Stephen W., E-mail: elentz@utk.edu, E-mail: mezzacappaa@ornl.gov [Department of Physics, Florida Atlantic University, 777 Glades Road, Boca Raton, FL 33431-0991 (United States)

    2012-03-01

    We have conducted a series of numerical experiments with the spherically symmetric, general relativistic, neutrino radiation hydrodynamics code AGILE-BOLTZTRAN to examine the effects of several approximations used in multidimensional core-collapse supernova simulations. Our code permits us to examine the effects of these approximations quantitatively by removing, or substituting for, the pieces of supernova physics of interest. These approximations include: (1) using Newtonian versus general relativistic gravity, hydrodynamics, and transport; (2) using a reduced set of weak interactions, including the omission of non-isoenergetic neutrino scattering, versus the current state-of-the-art; and (3) omitting the velocity-dependent terms, or observer corrections, from the neutrino Boltzmann kinetic equation. We demonstrate that each of these changes has noticeable effects on the outcomes of our simulations. Of these, we find that the omission of observer corrections is particularly detrimental to the potential for neutrino-driven explosions and exhibits a failure to conserve lepton number. Finally, we discuss the impact of these results on our understanding of current, and the requirements for future, multidimensional models.

  2. Progesterone production requires activation of caspase-3 in preovulatory granulosa cells in a serum starvation model.

    Science.gov (United States)

    An, Li-Sha; Yuan, Xiao-Hua; Hu, Ying; Shi, Zi-Yun; Liu, Xiao-Qin; Qin, Li; Wu, Gui-Qing; Han, Wei; Wang, Ya-Qin; Ma, Xu

    2012-11-01

    Granulosa cells proliferate, differentiate, and undergo apoptosis throughout follicular development. Previous studies have demonstrated that stimulation of progesterone production is accompanied by caspase-3 activation. Moreover, we previously reported that arsenic enhanced caspase-3 activity coupled with progesterone production. Inhibition of caspase-3 activity can significantly inhibit progesterone production induced by arsenic or follicle-stimulating hormone (FSH). Here, we report that serum starvation induces caspase-3 activation coupled with augmentation of progesterone production. Serum starvation also increased the levels of cytochrome P450 cholesterol side chain cleavage enzyme (P450scc) and steroidogenic acute regulatory (StAR) protein, both of which may contribute to progesterone synthesis in preovulatory granulosa cells. Inhibition of caspase-3 activity resulted in a decrease in progesterone production. Deactivation of caspase-3 activity by a caspase-3-specific inhibitor also resulted in decreases in P450scc and StAR expression, which may partly account for the observed decrease in progesterone production. Our study demonstrates for the first time that caspase-3 activation is required for progesterone production in preovulatory granulosa cells in a serum starvation model. Inhibition of caspase-3 activity can result in decreased expression of the steroidogenic proteins P450scc and StAR. Our work provides further details on the relationship between caspase-3 activation and steroidogenesis and indicates that caspase-3 plays a critical role in progesterone production by granulosa cells. Copyright © 2012 Elsevier Inc. All rights reserved.

  3. Assessing moderated mediation in linear models requires fewer confounding assumptions than assessing mediation.

    Science.gov (United States)

    Loeys, Tom; Talloen, Wouter; Goubert, Liesbet; Moerkerke, Beatrijs; Vansteelandt, Stijn

    2016-11-01

    It is well known from the mediation analysis literature that the identification of direct and indirect effects relies on strong assumptions of no unmeasured confounding. Even in randomized studies the mediator may still be correlated with unobserved prognostic variables that affect the outcome, in which case the mediator's role in the causal process may not be inferred without bias. In the behavioural and social science literature very little attention has been given so far to the causal assumptions required for moderated mediation analysis. In this paper we focus on the index of moderated mediation, which measures by how much the mediated effect is larger or smaller for varying levels of the moderator. We show that in linear models this index can be estimated without bias in the presence of unmeasured common causes of the moderator, mediator and outcome under certain conditions. Importantly, one can thus use the test for moderated mediation to support evidence for mediation under less stringent confounding conditions. We illustrate our findings with data from a randomized experiment assessing the impact of being primed with social deception upon observer responses to others' pain, and from an observational study of individuals who ended a romantic relationship assessing the effect of attachment anxiety during the relationship on mental distress 2 years after the break-up. © 2016 The British Psychological Society.
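    The index of moderated mediation discussed above has a closed form in linear models: with mediator model M = i1 + a1*X + a2*W + a3*X*W and outcome model Y = i2 + c*X + b*M, the indirect effect at moderator value w is (a1 + a3*w)*b, and the index is a3*b. The coefficients below are invented for illustration.

```python
# Worked sketch of the index of moderated mediation in linear models,
# using invented path coefficients (not estimates from the paper's data).
# Mediator model:  M = i1 + a1*X + a2*W + a3*X*W
# Outcome model:   Y = i2 + c*X  + b*M
# Indirect effect at moderator value w: (a1 + a3*w) * b
# Index of moderated mediation:         a3 * b

a1, a3, b = 0.50, 0.20, 0.40   # hypothetical coefficient estimates

def indirect(w):
    """Conditional indirect (mediated) effect at moderator value w."""
    return (a1 + a3 * w) * b

index = a3 * b   # change in the mediated effect per unit of the moderator
```

    The index equals the difference in indirect effects between moderator values one unit apart, which is what the test for moderated mediation examines.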

  4. Immune cells are required for cutaneous ulceration in a swine model of chancroid.

    Science.gov (United States)

    San Mateo, L R; Toffer, K L; Orndorff, P E; Kawula, T H

    1999-09-01

    Cutaneous lesions of the human sexually transmitted genital ulcer disease chancroid are characterized by the presence of intraepidermal pustules, keratinocyte cytopathology, and epidermal and dermal erosion. These lesions are replete with neutrophils, macrophages, and CD4(+) T cells and contain very low numbers of cells of Haemophilus ducreyi, the bacterial agent of chancroid. We examined lesion formation by H. ducreyi in a pig model by using cyclophosphamide (CPA)-induced immune cell deficiency to distinguish between host and bacterial contributions to chancroid ulcer formation. Histologic presentation of H. ducreyi-induced lesions in CPA-treated pigs differed from ulcers that developed in immune-competent animals in that pustules did not form and surface epithelia remained intact. However, these lesions had significant suprabasal keratinocyte cytotoxicity. These results demonstrate that the host immune response was required for chancroid ulceration, while bacterial products were at least partially responsible for the keratinocyte cytopathology associated with chancroid lesions in the pig. The low numbers of H. ducreyi present in lesions in humans and immune-competent pigs have prevented localization of these organisms within skin. However, H. ducreyi organisms were readily visualized in lesion biopsies from infected CPA-treated pigs by immunoelectron microscopy. These bacteria were extracellular and associated with necrotic host cells in the epidermis and dermis. The relative abundance of H. ducreyi in inoculated CPA-treated pig skin suggests control of bacterial replication by host immune cells during natural human infection.

  5. ON THE REQUIREMENTS FOR REALISTIC MODELING OF NEUTRINO TRANSPORT IN SIMULATIONS OF CORE-COLLAPSE SUPERNOVAE

    International Nuclear Information System (INIS)

    Lentz, Eric J.; Mezzacappa, Anthony; Hix, W. Raphael; Messer, O. E. Bronson; Liebendörfer, Matthias; Bruenn, Stephen W.

    2012-01-01

    We have conducted a series of numerical experiments with the spherically symmetric, general relativistic, neutrino radiation hydrodynamics code AGILE-BOLTZTRAN to examine the effects of several approximations used in multidimensional core-collapse supernova simulations. Our code permits us to examine the effects of these approximations quantitatively by removing, or substituting for, the pieces of supernova physics of interest. These approximations include: (1) using Newtonian versus general relativistic gravity, hydrodynamics, and transport; (2) using a reduced set of weak interactions, including the omission of non-isoenergetic neutrino scattering, versus the current state-of-the-art; and (3) omitting the velocity-dependent terms, or observer corrections, from the neutrino Boltzmann kinetic equation. We demonstrate that each of these changes has noticeable effects on the outcomes of our simulations. Of these, we find that the omission of observer corrections is particularly detrimental to the potential for neutrino-driven explosions and exhibits a failure to conserve lepton number. Finally, we discuss the impact of these results on our understanding of current, and the requirements for future, multidimensional models.

  6. Optimal Effort in Consumer Choice : Theory and Experimental Evidence for Binary Choice

    NARCIS (Netherlands)

    Conlon, B.J.; Dellaert, B.G.C.; van Soest, A.H.O.

    2001-01-01

    This paper develops a theoretical model of optimal effort in consumer choice. The model extends previous consumer choice models in that the consumer not only chooses a product, but also decides how much effort to apply to a given choice problem. The model yields a unique optimal level of effort, which

  7. Does the digital age require new models of democracy? : lasswell's policy scientist of democracy vs. Liquid democracy

    NARCIS (Netherlands)

    Jelena Gregorius

    2015-01-01

    This essay provides a debate about Lasswell’s policy scientist of democracy (PSOD, 1948) in comparison to the model of liquid democracy (21st century) based on the question if the digital age requires new models of democracy. The PSOD of Lasswell, a disciplinary persona, is in favour of an elitist

  8. Shell Inspection History and Current CMM Inspection Efforts

    Energy Technology Data Exchange (ETDEWEB)

    Montano, Joshua Daniel [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-26

    The following report provides a review of past and current CMM Shell Inspection efforts. Calibration of the Sheffield rotary contour gauge has expired and the primary inspector, Matthew Naranjo, has retired. Efforts within the Inspection team are transitioning from maintaining and training new inspectors on Sheffield to off-the-shelf CMM technology. Although inspection of a shell has many requirements, the scope of the data presented in this report focuses on the inner contour, outer contour, radial wall thickness and mass comparisons.

  9. Distinct effects of apathy and dopamine on effort-based decision-making in Parkinson's disease.

    Science.gov (United States)

    Le Heron, Campbell; Plant, Olivia; Manohar, Sanjay; Ang, Yuen-Siang; Jackson, Matthew; Lennox, Graham; Hu, Michele T; Husain, Masud

    2018-05-01

    Effort-based decision-making is a cognitive process crucial to normal motivated behaviour. Apathy is a common and disabling complication of Parkinson's disease, but its aetiology remains unclear. Intriguingly, the neural substrates associated with apathy also subserve effort-based decision-making in animal models and humans. Furthermore, the dopaminergic system plays a core role in motivating effortful behaviour for reward, and its dysfunction has been proposed to play a crucial role in the aetiology of apathy in Parkinson's disease. We hypothesized that disrupted effort-based decision-making underlies the syndrome of apathy in Parkinson's disease, and that this disruption may be modulated by the dopaminergic system. An effort-based decision-making task was administered to 39 patients with Parkinson's disease, with and without clinical apathy, ON and OFF their normal dopaminergic medications across two separate sessions, as well as 32 healthy age- and gender-matched controls. On a trial-by-trial basis, participants decided whether to accept or reject offers of monetary reward in return for exerting different levels of physical effort via handheld, individually calibrated dynamometers. Effort and reward were manipulated independently, such that offers spanned the full range of effort/reward combinations. Apathy was assessed using the Lille apathy rating scale. Motor effects of the dopamine manipulation were assessed using the Unified Parkinson's Disease Rating Scale part three motor score. The primary outcome variable was choice (accept/decline offer) analysed using a hierarchical generalized linear mixed effects model, and the vigour of squeeze (Newtons exerted above required force). Both apathy and dopamine depletion were associated with reduced acceptance of offers. However, these effects were driven by dissociable patterns of responding. While apathy was characterized by increased rejection of predominantly low reward offers, dopamine increased responding to

  10. Automata learning algorithms and processes for providing more complete systems requirements specification by scenario generation, CSP-based syntax-oriented model construction, and R2D2C system requirements transformation

    Science.gov (United States)

    Hinchey, Michael G. (Inventor); Margaria, Tiziana (Inventor); Rash, James L. (Inventor); Rouff, Christopher A. (Inventor); Steffen, Bernard (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which, in some embodiments, automata learning algorithms and techniques are implemented to generate a more complete set of scenarios for requirements-based programming. More specifically, a CSP-based, syntax-oriented model construction, which requires the support of a theorem prover, is complemented by model extrapolation via automata learning. This may support the systematic completion of requirements that are partial in nature, providing focus on the most prominent scenarios. It may generalize requirement skeletons by extrapolation and may indicate, by way of automatically generated traces, where the requirement specification is too loose and additional information is required.
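    The trace-based extrapolation described above can be illustrated by the usual starting point of passive automata learning: folding a set of scenario traces into a prefix-tree acceptor, whose missing transitions mark where the requirement specification is still incomplete. The event names and representation below are invented for illustration; this is not the patented R2D2C/CSP machinery itself.

```python
# Minimal prefix-tree acceptor (PTA) built from scenario traces -- the usual
# starting point for passive automata learning. Event names are illustrative.

def build_pta(traces):
    """Return (transitions, accepting) for a prefix-tree acceptor."""
    transitions = {}          # (state, symbol) -> state
    accepting = set()
    next_state = 1            # state 0 is the initial state
    for trace in traces:
        state = 0
        for symbol in trace:
            key = (state, symbol)
            if key not in transitions:
                transitions[key] = next_state
                next_state += 1
            state = transitions[key]
        accepting.add(state)  # each full scenario ends in an accepting state
    return transitions, accepting

def accepts(transitions, accepting, trace):
    state = 0
    for symbol in trace:
        if (state, symbol) not in transitions:
            return False      # gap: scenario not covered by the requirements yet
        state = transitions[(state, symbol)]
    return state in accepting

scenarios = [["init", "send", "ack"], ["init", "timeout", "retry", "ack"]]
T, F = build_pta(scenarios)
print(accepts(T, F, ["init", "send", "ack"]))      # covered scenario
print(accepts(T, F, ["init", "send", "timeout"]))  # gap: more elicitation needed
```

    A learning algorithm would then generalize this tree (e.g., by state merging), which is where the extrapolation beyond the given scenarios comes from.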

  11. Modeling Multi-Reservoir Hydropower Systems in the Sierra Nevada with Environmental Requirements and Climate Warming

    Science.gov (United States)

    Rheinheimer, David Emmanuel

    Hydropower systems and other river regulation often harm instream ecosystems, partly by altering the natural flow and temperature regimes that ecosystems have historically depended on. These effects are compounded at regional scales. As hydropower and ecosystems are increasingly valued globally, owing to the growing value placed on clean energy and native species as well as new threats from climate warming, it is important to understand how climate warming might affect these systems, to identify tradeoffs between different water uses for different climate conditions, and to identify promising water management solutions. This research uses traditional simulation and optimization to explore these issues in California's upper west slope Sierra Nevada mountains. The Sierra Nevada provides most of the water for California's vast water supply system, supporting high-elevation hydropower generation, ecosystems, recreation, and some local municipal and agricultural water supply along the way. However, regional climate warming is expected to reduce snowmelt and shift runoff to earlier in the year, affecting all water uses. This dissertation begins by reviewing important literature related to the broader motivations of this study, including river regulation, freshwater conservation, and climate change. It then describes three substantial studies. First, a weekly time step water resources management model spanning the Feather River watershed in the north to the Kern River watershed in the south is developed. The model, which uses the Water Evaluation And Planning System (WEAP), includes reservoirs, run-of-river hydropower, variable head hydropower, water supply demand, and instream flow requirements. The model is applied with a runoff dataset that considers regional air temperature increases of 0, 2, 4 and 6 °C to represent historical, near-term, mid-term and far-term (end-of-century) warming. Most major hydropower turbine flows are simulated well. Reservoir storage is also
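    At its core, the kind of weekly management model described here is a reservoir mass balance in which the instream flow requirement is served before hydropower and supply releases. A toy sketch under made-up numbers (arbitrary volume units per week; not Sierra Nevada data and not the WEAP implementation):

```python
# Toy weekly reservoir mass balance with a minimum instream flow requirement.
# All numbers are invented for illustration.

def simulate(inflows, capacity=100.0, storage=50.0, demand=8.0, min_flow=3.0):
    """Serve the environmental (instream) flow first, then turbine/demand."""
    storages, releases = [], []
    for q in inflows:
        storage += q
        release = min_flow                           # environmental requirement first
        available = storage - release
        release += min(max(available, 0.0), demand)  # then hydropower/demand release
        storage -= release
        if storage > capacity:                       # spill anything above capacity
            release += storage - capacity
            storage = capacity
        storages.append(storage)
        releases.append(release)
    return storages, releases

weekly_inflow = [12, 10, 6, 4, 3, 2, 2, 5]           # made-up snowmelt recession
s, r = simulate(weekly_inflow)
print(round(s[-1], 1))                               # storage left after the recession
```

    Earlier, smaller snowmelt under warming would shift the inflow sequence and tighten the tradeoff between storage, hydropower releases, and the instream requirement.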

  12. Knowing requires data

    Science.gov (United States)

    Naranjo, Ramon C.

    2017-01-01

    Groundwater-flow models are often calibrated using a limited number of observations relative to the unknown inputs required for the model. This is especially true for models that simulate groundwater surface-water interactions. In this case, subsurface temperature sensors can be an efficient means of collecting long-term data that capture the transient nature of physical processes such as seepage losses. A continuous and spatially dense network of diverse observation data can be used to improve knowledge of important physical drivers and to conceptualize and calibrate variably saturated groundwater flow models. An example is presented in which the results of such an analysis were used to help guide irrigation districts and water-management decisions on costly upgrades to conveyance systems to improve water usage, farm productivity, and restoration efforts to improve downstream water quality and ecosystems.

  13. NASA Software Engineering Benchmarking Effort

    Science.gov (United States)

    Godfrey, Sally; Rarick, Heather

    2012-01-01

    Benchmarking was very interesting and provided a wealth of information: (1) we did see potential solutions to some of our "top 10" issues; (2) we gained an assessment of where NASA stands in relation to other aerospace/defense groups. We formed new contacts and potential collaborations: (1) several organizations sent us examples of their templates and processes; (2) many of the organizations were interested in future collaboration, such as sharing of training, metrics, Capability Maturity Model Integration (CMMI) appraisers, instructors, etc. We received feedback from some of our contractors/partners: (1) desires to participate in our training and to provide feedback on procedures; (2) they welcomed the opportunity to provide feedback on working with NASA.

  14. Is My Effort Worth It?

    DEFF Research Database (Denmark)

    Liu, Fei; Xiao, Bo; Lim, Eric T. K.

    2016-01-01

    experience, there is a paucity of studies that investigate the impact of search features on search outcomes. We therefore draw on Information Foraging Theory (IFT) to disentangle the dual role of search cost in shaping the utility of information search. We also extend the Information Seeking Model...... in which Amazon Mechanical Turk (AMT) participants were recruited and tasked to perform search tasks on custom-made online review websites. By analyzing the behavioral data generated in the experimental process, we discover that search cost reduces the expected search utility while improving the yield...

  15. Prefrontal Cortical Inactivations Decrease Willingness to Expend Cognitive Effort on a Rodent Cost/Benefit Decision-Making Task.

    Science.gov (United States)

    Hosking, Jay G; Cocker, Paul J; Winstanley, Catharine A

    2016-04-01

    Personal success often necessitates expending greater effort for greater reward but, equally important, also requires judicious use of our limited cognitive resources (e.g., attention). Previous animal models have shown that the prelimbic (PL) and infralimbic (IL) regions of the prefrontal cortex (PFC) are not involved in (physical) effort-based choice, whereas human studies have demonstrated PFC contributions to (mental) effort. Here, we utilize the rat Cognitive Effort Task (rCET) to probe the PFC's role in effort-based decision making. In the rCET, animals can choose either an easy trial, where the attentional demand is low but the reward (sugar) is small, or a difficult trial on which both the attentional demand and reward are greater. Temporary inactivation of PL and IL decreased all animals' willingness to expend mental effort and increased animals' distractibility; PL inactivations more substantially affected performance (i.e., attention), whereas IL inactivations increased motor impulsivity. These data imply that the PFC contributes to attentional resources, and when these resources are diminished, animals shift their choice (via other brain regions) accordingly. Thus, one novel therapeutic approach to deficits in effort expenditure may be to focus on the resources that such decision making requires, rather than the decision-making process per se. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  16. Dissociable contributions of anterior cingulate cortex and basolateral amygdala on a rodent cost/benefit decision-making task of cognitive effort.

    Science.gov (United States)

    Hosking, Jay G; Cocker, Paul J; Winstanley, Catharine A

    2014-06-01

    Personal success often requires the choice to expend greater effort for larger rewards, and deficits in such effortful decision making accompany a number of illnesses including depression, schizophrenia, and attention-deficit/hyperactivity disorder. Animal models have implicated brain regions such as the basolateral amygdala (BLA) and anterior cingulate cortex (ACC) in physical effort-based choice, but disentangling the unique contributions of these two regions has proven difficult, and effort demands in industrialized society are predominantly cognitive in nature. Here we utilize the rodent cognitive effort task (rCET), a modification of the five-choice serial reaction-time task, wherein animals can choose to expend greater visuospatial attention to obtain larger sucrose rewards. Temporary inactivation (via baclofen-muscimol) of BLA and ACC showed dissociable effects: BLA inactivation caused hard-working rats to 'slack off' and 'slacker' rats to work harder, whereas ACC inactivation caused all animals to reduce willingness to expend mental effort. Furthermore, BLA inactivation increased the time needed to make choices, whereas ACC inactivation increased motor impulsivity. These data illuminate unique contributions of BLA and ACC to effort-based decision making, and imply overlapping yet distinct circuitry for cognitive vs physical effort. Our understanding of effortful decision making may therefore require expanding our models beyond purely physical costs.

  17. Application of a hydrodynamic and sediment transport model for guidance of response efforts related to the Deepwater Horizon oil spill in the Northern Gulf of Mexico along the coast of Alabama and Florida

    Science.gov (United States)

    Plant, Nathaniel G.; Long, Joseph W.; Dalyander, P. Soupy; Thompson, David M.; Raabe, Ellen A.

    2013-01-01

    U.S. Geological Survey (USGS) scientists have provided a model-based assessment of transport and deposition of residual Deepwater Horizon oil along the shoreline within the northern Gulf of Mexico in the form of mixtures of sand and weathered oil, known as surface residual balls (SRBs). The results of this USGS research, in combination with results from other components of the overall study, will inform operational decisionmaking. The results will provide guidance for response activities and data collection needs during future oil spills. In May 2012 the U.S. Coast Guard, acting as the Deepwater Horizon Federal on-scene coordinator, chartered an operational science advisory team to provide a science-based review of data collected and to conduct additional directed studies and sampling. The goal was to characterize typical shoreline profiles and morphology in the northern Gulf of Mexico to identify likely sources of residual oil and to evaluate mechanisms whereby reoiling phenomena may be occurring (for example, burial and exhumation and alongshore transport). A steering committee cochaired by British Petroleum Corporation (BP) and the National Oceanic and Atmospheric Administration (NOAA) is overseeing the project and includes State on-scene coordinators from four States (Alabama, Florida, Louisiana, and Mississippi), trustees of the U.S. Department of the Interior (DOI), and representatives from the U.S. Coast Guard. This report presents the results of hydrodynamic and sediment transport models and developed techniques for analyzing potential SRB movement and burial and exhumation along the coastline of Alabama and Florida. Results from these modeling efforts are being used to explain the complexity of reoiling in the nearshore environment and to broaden consideration of the different scenarios and difficulties that are being faced in identifying and removing residual oil. For instance, modeling results suggest that larger SRBs are not, under the most commonly

  18. Putting User Stories First: Experiences Adapting the Legacy Data Models and Information Architecture at NASA JPL's PO.DAAC to Accommodate the New Information Lifecycle Required by SWOT

    Science.gov (United States)

    McGibbney, L. J.; Hausman, J.; Laurencelle, J. C.; Toaz, R., Jr.; McAuley, J.; Freeborn, D. J.; Stoner, C.

    2016-12-01

    The Surface Water & Ocean Topography (SWOT) mission brings together two communities focused on a better understanding of the world's oceans and its terrestrial surface waters. U.S. and French oceanographers and hydrologists and international partners have joined forces to develop this new space mission. At NASA JPL's PO.DAAC, the team is currently engaged in the gathering of SWOT User Stories (access patterns, metadata requirements, primary and value added product requirements, data access protocols, etc.) to better inform the adaptive planning of what will be known as the next generation PO.DAAC Information Architecture (IA). The IA effort acknowledges that missions such as SWOT (and NISAR) have few or no precedent in terms of data volume, hot and cold storage, archival, analysis, existing system engineering complexities, etc. and that the only way we can better understand the projected impacts of such requirements is to interface directly with the User Community. Additionally, it also acknowledges that collective learning has taken place to understand certain limitations in the existing data models (DM) underlying the existing PO.DAAC Data Management and Archival System. This work documents an evolutionary, use case based, standards driven approach to adapting the legacy DM and accompanying knowledge representation infrastructure at NASA JPL's PO.DAAC to address forthcoming DAAC mission requirements presented by missions such as SWOT. Some of the topics covered in this evolution include, but are not limited to: How we are leveraging lessons learned from the development of existing DM (such as that generated for SMAP) in an attempt to map them to SWOT. What is the governance model for the SWOT IA? What are the `governing' entities? What is the hierarchy of the `governed entities'? How are elements grouped? How is the design-working group formed? How is model independence maintained and what choices/requirements do we have for the implementation language?
The use of

  19. ICRP new recommendations. Committee 2's efforts

    International Nuclear Information System (INIS)

    Eckerman, K.F.

    2007-01-01

    The International Commission on Radiological Protection (ICRP) may release new primary radiation protection recommendations in 2007. Committee 2 has underway reviews of the dosimetric and biokinetic models and associated data used in calculating dose coefficients for intakes of radionuclides and exposures to external radiation fields. This paper outlines the work plans of Committee 2 during the current term, 2005-2009, in anticipation of the new primary recommendations. The two task groups of Committee 2 responsible for the computations of dose coefficients, INDOS and DOCAL, are reviewing the models and data used in the computations. INDOS is reviewing the lung model and the biokinetic models that describe the behavior of the radionuclides in the body. DOCAL is reviewing its computational formulations with the objective of harmonizing the formulation with those of nuclear medicine, and developing new computational phantoms representing the adult male and female reference individuals of ICRP Publication 89. In addition, DOCAL will issue a publication on nuclear decay data to replace ICRP Publication 38. While the current efforts are focused on updating the dose coefficients for occupational intakes of radionuclides, plans are being formulated to address dose coefficients for external radiation fields, which include consideration of high-energy fields associated with accelerators and space travel, and the updating of dose coefficients for members of the public. (author)

  20. Measuring collections effort improves cash performance.

    Science.gov (United States)

    Shutts, Joe

    2009-09-01

    Having a satisfied work force can lead to an improved collections effort. Hiring the right people and training them ensures employee engagement. Measuring collections effort and offering incentives is key to revenue cycle success.

  1. Transaction-based building controls framework, Volume 2: Platform descriptive model and requirements

    Energy Technology Data Exchange (ETDEWEB)

    Akyol, Bora A. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Haack, Jereme N. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Carpenter, Brandon J. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Katipamula, Srinivas [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Lutes, Robert G. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Hernandez, George [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States)

    2015-07-31

    Transaction-based Building Controls (TBC) offer a control systems platform that provides an agent execution environment that meets the growing requirements for security, resource utilization, and reliability. This report outlines the requirements for a platform to meet these needs and describes an illustrative/exemplary implementation.

  2. Perception of effort in Exercise Science: Definition, measurement and perspectives.

    Science.gov (United States)

    Pageaux, Benjamin

    2016-11-01

    Perception of effort, also known as perceived exertion or sense of effort, can be described as a cognitive feeling of work associated with voluntary actions. The aim of the present review is to provide an overview of what perception of effort is in Exercise Science. Due to the addition of sensations other than effort in its definition, the neurophysiology of perceived exertion remains poorly understood. As humans have the ability to dissociate effort from other sensations related to physical exercise, the need to use a narrower definition is emphasised. Consequently, a definition and some brief guidelines for its measurement are provided. Finally, an overview of the models present in the literature aiming to explain its neurophysiology, and some perspectives for future research are offered.

  3. The Influence of Mark-Recapture Sampling Effort on Estimates of Rock Lobster Survival.

    Directory of Open Access Journals (Sweden)

    Ziya Kordjazi

    Full Text Available Five annual capture-mark-recapture surveys on Jasus edwardsii were used to evaluate the effect of sample size and fishing effort on the precision of estimated survival probability. Datasets of different numbers of individual lobsters (ranging from 200 to 1,000 lobsters) were created by random subsampling from each annual survey. This process of random subsampling was also used to create 12 datasets of different levels of effort based on three levels of the number of traps (15, 30 and 50 traps per day) and four levels of the number of sampling-days (2, 4, 6 and 7 days). The most parsimonious Cormack-Jolly-Seber (CJS) model for estimating survival probability shifted from a constant model towards sex-dependent models with increasing sample size and effort. A sample of 500 lobsters or 50 traps used on four consecutive sampling-days was required for obtaining precise survival estimations for males and females, separately. Reduced sampling effort of 30 traps over four sampling days was sufficient if a survival estimate for both sexes combined was sufficient for management of the fishery.

  4. The Influence of Mark-Recapture Sampling Effort on Estimates of Rock Lobster Survival

    Science.gov (United States)

    Kordjazi, Ziya; Frusher, Stewart; Buxton, Colin; Gardner, Caleb; Bird, Tomas

    2016-01-01

    Five annual capture-mark-recapture surveys on Jasus edwardsii were used to evaluate the effect of sample size and fishing effort on the precision of estimated survival probability. Datasets of different numbers of individual lobsters (ranging from 200 to 1,000 lobsters) were created by random subsampling from each annual survey. This process of random subsampling was also used to create 12 datasets of different levels of effort based on three levels of the number of traps (15, 30 and 50 traps per day) and four levels of the number of sampling-days (2, 4, 6 and 7 days). The most parsimonious Cormack-Jolly-Seber (CJS) model for estimating survival probability shifted from a constant model towards sex-dependent models with increasing sample size and effort. A sample of 500 lobsters or 50 traps used on four consecutive sampling-days was required for obtaining precise survival estimations for males and females, separately. Reduced sampling effort of 30 traps over four sampling days was sufficient if a survival estimate for both sexes combined was sufficient for management of the fishery. PMID:26990561
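    The reduced datasets described above come from simple random subsampling of individual capture histories (and, analogously, of trap-days). A minimal sketch of the individual-level subsampling; the capture-history matrix here is simulated, not the Jasus edwardsii data, and fitting the CJS model itself would be done with dedicated software such as Program MARK:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative capture-history matrix: rows = individual lobsters, columns =
# 5 annual surveys, 1 = captured that year. Simulated stand-in for the real
# tagging data; the 0.3 capture probability is an arbitrary choice.
full = rng.binomial(1, 0.3, size=(1000, 5))

def subsample_individuals(histories, n, rng):
    """Random subset of n individuals, as used to build reduced-effort datasets."""
    idx = rng.choice(len(histories), size=n, replace=False)
    return histories[idx]

reduced = {n: subsample_individuals(full, n, rng) for n in (200, 500, 1000)}
for n, sub in reduced.items():
    print(n, sub.shape)
```

    Refitting the CJS model to each reduced dataset then shows how estimate precision, and the most parsimonious model structure, change with sampling effort.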

  5. Model-driven requirements engineering (MDRE) for real-time ultra-wide instantaneous bandwidth signal simulation

    Science.gov (United States)

    Chang, Daniel Y.; Rowe, Neil C.

    2013-05-01

    While conducting cutting-edge research in a specific domain, we realize that (1) requirements clarity and correctness are crucial to our success [1], (2) hardware is hard to change, so most work is in software requirements development, coding and testing [2], (3) requirements are constantly changing, so that configurability, reusability, scalability, adaptability, modularity and testability are important non-functional attributes [3], (4) cross-domain knowledge is necessary for complex systems [4], and (5) if our research is successful, the results could be applied to other domains with similar problems. In this paper, we propose to use model-driven requirements engineering (MDRE) to model and guide our requirements/development, since models are easy to understand, execute, and modify. The domain for our research is Electronic Warfare (EW) real-time ultra-wide instantaneous bandwidth (IBW) signal simulation. The proposed four MDRE models are (1) Switch-and-Filter architecture, (2) multiple parallel data bit streams alignment, (3) post-ADC and pre-DAC bits re-mapping, and (4) Discrete Fourier Transform (DFT) filter bank. This research is unique since the instantaneous bandwidth we are dealing with is in the gigahertz range instead of the conventional megahertz range.
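    Of the four proposed models, the DFT filter bank is the most self-contained to illustrate: consecutive sample blocks are transformed so that each DFT bin becomes one subband of the wide instantaneous bandwidth. A minimal, critically sampled sketch; a real EW channelizer would add windowing/polyphase filtering, and the sample rate and channel count here are illustrative, not the gigahertz-range system from the paper:

```python
import numpy as np

def dft_channelize(x, k):
    """Split x into consecutive length-k blocks and DFT each block, so that
    DFT bin m becomes the subband centred at m * fs / k. This is a critically
    sampled, window-less channelizer, for illustration only."""
    x = np.asarray(x, dtype=complex)
    n = (len(x) // k) * k
    blocks = x[:n].reshape(-1, k)
    return np.fft.fft(blocks, axis=1) / k

fs, k = 1024.0, 8                       # illustrative sample rate, channel count
t = np.arange(2048) / fs
tone = np.exp(2j * np.pi * 384.0 * t)   # tone at 384 Hz; channel spacing is 128 Hz
subbands = dft_channelize(tone, k)
strongest = int(np.argmax(np.abs(subbands).mean(axis=0)))
print(strongest)                        # 384 Hz / 128 Hz per channel -> bin 3
```

    Each output column is a decimated subband stream, which is what makes parallel, lower-rate processing of an ultra-wide input band possible.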

  6. A MODEL BUILDING CODE ARTICLE ON FALLOUT SHELTERS WITH RECOMMENDATIONS FOR INCLUSION OF REQUIREMENTS FOR FALLOUT SHELTER CONSTRUCTION IN FOUR NATIONAL MODEL BUILDING CODES.

    Science.gov (United States)

    American Inst. of Architects, Washington, DC.

    A model building code article on fallout shelters was drawn up for inclusion in four national model building codes. Discussion is given of fallout shelters with respect to (1) nuclear radiation, (2) national policies, and (3) community planning. Fallout shelter requirements for shielding, space, ventilation, construction, and services such as electrical…

  7. Towards requirements elicitation in service-oriented business networks using value and goal modelling

    NARCIS (Netherlands)

    Mantovaneli Pessoa, Rodrigo; van Sinderen, Marten J.; Quartel, Dick; Shishkov, Boris; Cordeiro, J.; Ranchordas, A.

    2009-01-01

    Due to the contemporary trends towards increased focus on core competences and outsourcing of non-core activities, enterprises are forming strategic alliances and building business networks. This often requires cross enterprise interoperability and integration of their information systems, leading

  8. High Resolution Modeling of Coastal Inundation: User Requirements and Current Practice, A Navy Perspective

    National Research Council Canada - National Science Library

    Blain, Cheryl Ann; Preller, Ruth H

    2007-01-01

    The impact of coastal flooding and inundation on Navy operational missions and the existing Navy requirements for resolution and accuracy relevant to coastal inundation are presented. High resolution (less than 500 m...

  9. Adopting adequate leaching requirement for practical response models of basil to salinity

    Science.gov (United States)

    Babazadeh, Hossein; Tabrizi, Mahdi Sarai; Darvishi, Hossein Hassanpour

    2016-07-01

    Several mathematical models are being used for assessing plant response to salinity of the root zone. The objectives of this study included quantifying the yield threshold value of basil plants with respect to irrigation water salinity and investigating the possibility of using irrigation water salinity instead of saturated extract salinity in the available mathematical models for estimating yield. To achieve the above objectives, an extensive greenhouse experiment was conducted with 13 irrigation water salinity levels, namely 1.175 dS m-1 (control treatment) and 1.8 to 10 dS m-1. The results indicated that, among these models, the modified discount model (one of the best-known statistics-based root water uptake models) produced more accurate results in simulating the basil yield reduction function from irrigation water salinities. Overall, the statistical model of Steppuhn et al. (the modified discount model) and the math-empirical model of van Genuchten and Hoffman provided the best results. In general, all of the statistical models produced very similar results, and their results were better than those of the math-empirical models. It was also concluded that if enough leaching was present, there was no significant difference between models based on soil saturated extract salinity and models using irrigation water salinity.
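    The two best-performing response functions named above share a sigmoid form relating relative yield to salinity. A short sketch with illustrative parameters; the C50 and shape values below are assumptions, not the fitted basil coefficients from the study, and the exp(s·C50) exponent is one common statement of the modified discount function:

```python
import numpy as np

# Salinity-response forms of the type compared above. C is salinity in dS/m;
# C50 is the salinity at which relative yield halves. All parameter values
# here are illustrative assumptions.

def van_genuchten_hoffman(c, c50=5.0, p=3.0):
    """Relative yield Yr = 1 / (1 + (C/C50)**p)."""
    return 1.0 / (1.0 + (c / c50) ** p)

def modified_discount(c, c50=5.0, s=0.3):
    """Modified discount form with exponent exp(s * C50) (assumed statement)."""
    return 1.0 / (1.0 + (c / c50) ** np.exp(s * c50))

salinity = np.array([1.175, 3.0, 5.0, 8.0])   # irrigation-water salinities (dS/m)
yr = van_genuchten_hoffman(salinity)
print([round(float(v), 2) for v in yr])
```

    Fitting either form to yields measured against irrigation water salinity, rather than saturated extract salinity, is exactly the substitution the study evaluates.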

  10. OPTIMIZATION OF ATM AND BRANCH CASH OPERATIONS USING AN INTEGRATED CASH REQUIREMENT FORECASTING AND CASH OPTIMIZATION MODEL

    Directory of Open Access Journals (Sweden)

    Canser BİLİR

    2018-04-01

    Full Text Available In this study, an integrated cash requirement forecasting and cash inventory optimization model is implemented in both the branch and automated teller machine (ATM) networks of a mid-sized bank in Turkey to optimize the bank’s cash supply chain. The implemented model’s objective is to minimize the idle cash levels at both branches and ATMs without decreasing the customer service level (CSL) by providing the correct amount of cash at the correct location and time. To the best of our knowledge, the model is the first integrated model in the literature to be applied to both ATMs and branches simultaneously. The results demonstrated that the integrated model dramatically decreased the idle cash levels at both branches and ATMs without degrading the availability of cash and hence customer satisfaction. An in-depth analysis of the results also indicated that the results were more remarkable for branches. The results also demonstrated that the utilization of various seasonal indices plays a very critical role in the forecasting of cash requirements for a bank. Another unique feature of the study is that the model is the first to include the recycling feature of ATMs. The results demonstrated that, with the inclusion of deliberately chosen seasonal indices in the forecasting model, the integrated cash optimization models can be used to estimate the cash requirements of recycling ATMs.
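    A minimal sketch of the multiplicative seasonal-index forecasting the abstract emphasizes; the demand series and weekly seasonality below are hypothetical, and a production model would also include a trend component:

```python
from statistics import mean

def seasonal_indices(history, season_len):
    """Multiplicative seasonal indices: the average demand at each
    position in the cycle, divided by the overall average demand."""
    overall = mean(history)
    return [mean(history[i::season_len]) / overall for i in range(season_len)]

def forecast(history, season_len, horizon):
    """Naive seasonal forecast: overall average scaled by the index
    of the corresponding future position in the cycle."""
    idx = seasonal_indices(history, season_len)
    base = mean(history)
    n = len(history)
    return [base * idx[(n + h) % season_len] for h in range(horizon)]

# Hypothetical weekly ATM withdrawal pattern (7-day seasonality), in 1000s:
demand = [120, 100, 95, 110, 150, 180, 90] * 4
print(forecast(demand, 7, 3))
```

    Because the toy history repeats exactly, the forecast reproduces the weekly pattern; with real data the indices smooth out day-to-day noise instead.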

  11. A Parsimonious Model of the Rabbit Action Potential Elucidates the Minimal Physiological Requirements for Alternans and Spiral Wave Breakup.

    Science.gov (United States)

    Gray, Richard A; Pathmanathan, Pras

    2016-10-01

    Elucidating the underlying mechanisms of fatal cardiac arrhythmias requires a tight integration of electrophysiological experiments, models, and theory. Existing models of the transmembrane action potential (AP) are complex (resulting in over-parameterization) and varied (leading to dissimilar predictions). Thus, simpler models are needed to elucidate the "minimal physiological requirements" to reproduce significant observable phenomena using as few parameters as possible. Moreover, models have been derived from experimental studies in a variety of species under a range of environmental conditions (for example, all existing rabbit AP models incorporate a formulation of the rapid sodium current, INa, based on 30-year-old data from chick embryo cell aggregates). Here we develop a simple "parsimonious" rabbit AP model that is mathematically identifiable (i.e., not over-parameterized) by combining a novel Hodgkin-Huxley formulation of INa with a phenomenological model of repolarization similar to the voltage-dependent, time-independent rectifying outward potassium current (IK). The model was calibrated using the following experimental data sets measured from the same species (rabbit) under physiological conditions: dynamic current-voltage (I-V) relationships during the AP upstroke; rapid recovery of AP excitability during the relative refractory period; and steady-state INa inactivation via voltage clamp. Simulations reproduced several important "emergent" phenomena including cellular alternans at rates > 250 bpm as observed in rabbit myocytes, reentrant spiral waves as observed on the surface of the rabbit heart, and spiral wave breakup. Model variants were studied that elucidated the minimal requirements for alternans and spiral wave breakup, namely the kinetics of INa inactivation and the non-linear rectification of IK. The simplicity of the model, and the fact that its parameters have physiological meaning, make it ideal for engendering generalizable mechanistic
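    A generic Hodgkin-Huxley-style gating sketch of the kind of INa formulation described: gates relax toward a voltage-dependent steady state, and INa = gNa·m³·h·(V − ENa). All parameter values below are illustrative placeholders, not the calibrated rabbit values from the paper:

```python
import math

def x_inf(v, v_half, k):
    """Steady-state (in)activation: Boltzmann sigmoid of membrane voltage."""
    return 1.0 / (1.0 + math.exp((v_half - v) / k))

def step_gate(x, v, v_half, k, tau, dt):
    """One forward-Euler step of dx/dt = (x_inf(V) - x) / tau."""
    return x + dt * (x_inf(v, v_half, k) - x) / tau

# Illustrative (not fitted) parameters for a fast sodium current:
g_na, e_na = 12.0, 65.0      # mS/cm^2, mV
m, h = 0.0, 1.0              # activation and inactivation gates at rest
dt = 0.01                    # ms
v = 0.0                      # voltage clamp step from rest to 0 mV
for _ in range(100):         # simulate 1 ms of the clamp step
    m = step_gate(m, v, -35.0, 6.0, 0.1, dt)   # fast activation (tau 0.1 ms)
    h = step_gate(h, v, -60.0, -6.0, 1.0, dt)  # slower inactivation (tau 1 ms)
i_na = g_na * m**3 * h * (v - e_na)
print(f"m={m:.3f} h={h:.3f} INa={i_na:.1f} uA/cm^2")
```

    The separation of time scales between activation and inactivation is exactly the kinetic feature the paper identifies as critical for alternans and spiral wave breakup.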

  12. Devising a Structural Equation Model of Relationships between Preservice Teachers' Time and Study Environment Management, Effort Regulation, Self-Efficacy, Control of Learning Beliefs, and Metacognitive Self-Regulation

    Science.gov (United States)

    Sen, Senol; Yilmaz, Ayhan

    2016-01-01

    The objective of this study is to analyze the relationship between preservice teachers' time and study environment management, effort regulation, self-efficacy beliefs, control of learning beliefs and metacognitive self-regulation. This study also investigates the direct and indirect effects of metacognitive self-regulation on time and study…

  13. Investigating the Nature of Relationship between Software Size and Development Effort

    OpenAIRE

    Bajwa, Sohaib-Shahid

    2008-01-01

    Software effort estimation remains a challenging and debatable research area. Most software effort estimation models take software size as the base input. Among others, the Constructive Cost Model (COCOMO II) is a widely known effort estimation model. It uses Source Lines of Code (SLOC) as the software size to estimate effort. However, many problems arise while using SLOC as a size measure due to its late availability in the software life cycle. Therefore, a lot of research has b...
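    The size-to-effort relationship COCOMO II embodies can be sketched as a nominal power law, effort = A · KSLOC^B, scaled by effort multipliers. The coefficient and exponent defaults below are illustrative placeholders, not calibrated values:

```python
def cocomo_effort(ksloc, a=2.94, b=1.10, effort_multipliers=()):
    """Nominal COCOMO II-style effort in person-months:
    PM = A * KSLOC**B * product(effort multipliers).
    A and B here are illustrative defaults, not calibrated values;
    in COCOMO II the exponent is derived from project scale factors."""
    pm = a * ksloc ** b
    for em in effort_multipliers:
        pm *= em
    return pm

# 50 KSLOC project at nominal ratings vs. one with a 1.2x complexity driver:
print(f"{cocomo_effort(50):.1f} person-months")
print(f"{cocomo_effort(50, effort_multipliers=(1.2,)):.1f} person-months")
```

    The exponent B > 1 captures diseconomies of scale: doubling size more than doubles estimated effort, which is one reason late availability of SLOC matters so much.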

  14. Estimation of the Required Modeling Depth for the Simulation of Cable Switching in a Cable-based Network

    DEFF Research Database (Denmark)

    Silva, Filipe Faria Da; Bak, Claus Leth; Balle Holst, Per

    2012-01-01

    . If the area is too large, the simulation requires a long period of time and numerical problems are more likely to exist. This paper proposes a method that can be used to estimate the depth of the modeling area using the grid layout, which can be obtained directly from a PSS/E file, or equivalent...

  15. A Census of Statistics Requirements at U.S. Journalism Programs and a Model for a "Statistics for Journalism" Course

    Science.gov (United States)

    Martin, Justin D.

    2017-01-01

    This essay presents data from a census of statistics requirements and offerings at all 4-year journalism programs in the United States (N = 369) and proposes a model of a potential course in statistics for journalism majors. The author proposes that three philosophies underlie a statistics course for journalism students. Such a course should (a)…

  16. Teaching Case: IS Security Requirements Identification from Conceptual Models in Systems Analysis and Design: The Fun & Fitness, Inc. Case

    Science.gov (United States)

    Spears, Janine L.; Parrish, James L., Jr.

    2013-01-01

    This teaching case introduces students to a relatively simple approach to identifying and documenting security requirements within conceptual models that are commonly taught in systems analysis and design courses. An introduction to information security is provided, followed by a classroom example of a fictitious company, "Fun &…

  17. On the required complexity of vehicle dynamic models for use in simulation-based highway design.

    Science.gov (United States)

    Brown, Alexander; Brennan, Sean

    2014-06-01

    This paper presents the results of a comprehensive project whose goal is to identify roadway design practices that maximize the margin of safety between the friction supply and friction demand. This study is motivated by the concern for increased accident rates on curves with steep downgrades, geometries that contain features that interact in all three dimensions - planar curves, grade, and superelevation. This complexity makes the prediction of vehicle skidding quite difficult, particularly for simple simulation models that have historically been used for road geometry design guidance. To obtain estimates of friction margin, this study considers a range of vehicle models, including: a point-mass model used by the American Association of State Highway Transportation Officials (AASHTO) design policy, a steady-state "bicycle model" formulation that considers only per-axle forces, a transient formulation of the bicycle model commonly used in vehicle stability control systems, and finally, a full multi-body simulation (CarSim and TruckSim) regularly used in the automotive industry for high-fidelity vehicle behavior prediction. The presence of skidding--the friction demand exceeding supply--was calculated for each model considering a wide range of vehicles and road situations. The results indicate that the most complicated vehicle models are generally unnecessary for predicting skidding events. However, there are specific maneuvers, namely braking events within lane changes and curves, which consistently predict the worst-case friction margins across all models. This suggests that any vehicle model used for roadway safety analysis should include the effects of combined cornering and braking. The point-mass model typically used by highway design professionals may not be appropriate to predict vehicle behavior on high-speed curves during braking in low-friction situations. However, engineers can use the results of this study to help select the appropriate vehicle dynamic
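    The AASHTO point-mass model referenced above is commonly written (in metric units) as f = V²/(127R) − e, with V in km/h, R in metres, and e the superelevation rate as a decimal; the friction margin is then supply minus demand. A minimal sketch, with hypothetical wet-pavement numbers:

```python
def side_friction_demand(speed_kmh, radius_m, superelevation):
    """Point-mass model: lateral friction demanded on a horizontal curve,
    f = V**2 / (127 * R) - e   (V in km/h, R in m, e as a decimal)."""
    return speed_kmh ** 2 / (127.0 * radius_m) - superelevation

def friction_margin(supply, speed_kmh, radius_m, superelevation):
    """Positive margin means friction supply exceeds demand (no skidding)."""
    return supply - side_friction_demand(speed_kmh, radius_m, superelevation)

# Hypothetical wet-pavement check: 100 km/h on a 400 m curve with 6% superelevation,
# assuming an available side friction supply of 0.30:
demand = side_friction_demand(100, 400, 0.06)
print(f"demand={demand:.3f}, margin={friction_margin(0.30, 100, 400, 0.06):.3f}")
```

    Note what the paper criticizes: this formulation has no braking term at all, so it cannot represent the combined cornering-and-braking maneuvers that produced the worst-case margins.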

  18. Economic response to harvest and effort control in fishery

    DEFF Research Database (Denmark)

    Hoff, Ayoe; Frost, Hans

    for fisheries management. The report outlines bio-economic models, which are designed to shed light on the efficiency of different management tools in terms of quota or effort restrictions given the objectives of the Common Fisheries Policy about sustainable and economic viable fisheries. The report addresses...... the complexities of biological and economic interaction in a multispecies, multifleet framework and outlines consistent mathematical models....

  19. 76 FR 25229 - Special Conditions: Gulfstream Aerospace LP (GALP) Model G250 Airplane, Dynamic Test Requirements...

    Science.gov (United States)

    2011-05-04

    ... Memorandum ``Side-Facing Seats on Transport Category Airplanes'' and draft Issue Paper ``Dynamic Test...; Special Conditions No. 25-425-SC] Special Conditions: Gulfstream Aerospace LP (GALP) Model G250 Airplane... are issued for the Gulfstream Aerospace LP (GALP) model G250 airplane. This airplane will have a novel...

  20. Process-based models are required to manage ecological systems in a changing world

    Science.gov (United States)

    K. Cuddington; M.-J. Fortin; L.R. Gerber; A. Hastings; A. Liebhold; M. OConnor; C. Ray

    2013-01-01

    Several modeling approaches can be used to guide management decisions. However, some approaches are better fitted than others to address the problem of prediction under global change. Process-based models, which are based on a theoretical understanding of relevant ecological processes, provide a useful framework to incorporate specific responses to altered...

  1. Disease models of chronic inflammatory airway disease : applications and requirements for clinical trials

    NARCIS (Netherlands)

    Diamant, Zuzana; Clarke, Graham W.; Pieterse, Herman; Gispert, Juan

    Purpose of reviewThis review will discuss methodologies and applicability of key inflammatory models of respiratory disease in proof of concept or proof of efficacy clinical studies. In close relationship with these models, induced sputum and inflammatory cell counts will be addressed for

  2. 76 FR 36870 - Special Conditions: Gulfstream Model GVI Airplane; Design Roll Maneuver Requirement for...

    Science.gov (United States)

    2011-06-23

    ... issue a finding of regulatory adequacy pursuant to section 611 of Public Law 92-574, the ``Noise Control... Requirement for Electronic Flight Controls AGENCY: Federal Aviation Administration (FAA), DOT. ACTION: Final... airplane will have a novel or unusual design feature associated with an electronic flight control system...

  3. Security requirements engineering : the SI* modeling language and the Secure Tropos methodology

    NARCIS (Netherlands)

    Massacci, F.; Mylopoulos, J.; Zannone, N.; Ras, Z.W.; Tsay, L.-S.

    2010-01-01

    Security Requirements Engineering is an emerging field which lies at the crossroads of Security and Software Engineering. Much research has focused on this field in recent years, spurred by the realization that security must be dealt with in the earliest phases of the software development process as

  4. A Curriculum Model: Engineering Design Graphics Course Updates Based on Industrial and Academic Institution Requirements

    Science.gov (United States)

    Meznarich, R. A.; Shava, R. C.; Lightner, S. L.

    2009-01-01

    Engineering design graphics courses taught in colleges or universities should provide and equip students preparing for employment with the basic occupational graphics skill competences required by engineering and technology disciplines. Academic institutions should introduce and include topics that cover the newer and more efficient graphics…

  5. Telematic Requirements for a Mobile and Wireless Healthcare System derived from Enterprise Models

    NARCIS (Netherlands)

    Widya, I.A.; van Halteren, Aart; Jones, Valerie M.; Bults, Richard G.A.; Konstantas, D.; Vierhout, P.A.M.; Peuscher, J.; Jevtic, D.; Mikuc, M.

    A challenge of current innovation in healthcare processes is to improve the time to treatment. This paper addresses the benefits of telematic services and mobile & wireless devices such as vital sign sensors and head mounted cameras for healthcare processes. It explores the telematic requirements

  6. Pocket money and child effort at school

    OpenAIRE

    François-Charles Wolff; Christine Barnet-Verzat

    2008-01-01

    In this paper, we study the relationship between the provision of parental pocket and the level of effort undertaken by the child at school. Under altruism, an increased amount of parental transfer should reduce the child's effort. Our empirical analysis is based on a French data set including about 1,400 parent-child pairs. We find that children do not undertake less effort when their parents are more generous.

  7. Getting Grip on Security Requirements Elicitation by Structuring and Reusing Security Requirements Sources

    Directory of Open Access Journals (Sweden)

    Christian Schmitt

    2015-07-01

    Full Text Available This paper presents a model for structuring and reusing security requirements sources. The model serves as blueprint for the development of an organization-specific repository, which provides relevant security requirements sources, such as security information and knowledge sources and relevant compliance obligations, in a structured and reusable form. The resulting repository is intended to be used by development teams during the elicitation and analysis of security requirements with the goal to understand the security problem space, incorporate all relevant requirements sources, and to avoid unnecessary effort for identifying, understanding, and correlating applicable security requirements sources on a project-wise basis. We start with an overview and categorization of important security requirements sources, followed by the description of the generic model. To demonstrate the applicability and benefits of the model, the instantiation approach and details of the resulting repository of security requirements sources are presented.

  8. Three-dimensional pulmonary model using rapid-prototyping in patient with lung cancer requiring segmentectomy.

    Science.gov (United States)

    Akiba, Tadashi; Nakada, Takeo; Inagaki, Takuya

    2014-01-01

    Thoracoscopic segmentectomy of the lung is sometimes adopted for lung cancer, but a problem with segmentectomy is the variable anatomy. Recently, we have been exploring the impact of three-dimensional models created with a rapid-prototyping technique. Such models are useful for decision making, surgical planning, and intraoperative orientation in the surgical treatment of patients with lung cancer undergoing pulmonary segmentectomy. These newly created models allowed us to clearly identify the surgical margin and the intersegmental plane, vessels, and bronchi related to the cancer in the posterior segment. To the best of our knowledge, there are few reports to date describing such a pulmonary model.

  9. The USEtox story: A survey of model developer visions and user requirements

    DEFF Research Database (Denmark)

    Westh, Torbjørn Bochsen; Hauschild, Michael Zwicky; Birkved, Morten

    2015-01-01

    into LCA software and methods, (4) improve update/testing procedures, (5) strengthen communication between developers and users, and (6) extend model scope. By generalizing our recommendations to guide scientific model development in a broader context, we emphasize to acknowledge different levels of user...... expertise to integrate sound revision and update procedures and to facilitate modularity, data import/export, and incorporation into relevant software and databases during model design and development. Our fully documented approach can inspire performing similar surveys on other LCA-related tools...

  10. TumorML: Concept and requirements of an in silico cancer modelling markup language.

    Science.gov (United States)

    Johnson, David; Cooper, Jonathan; McKeever, Steve

    2011-01-01

    This paper describes the initial groundwork carried out as part of the European Commission-funded Transatlantic Tumor Model Repositories project, to develop a new markup language for computational cancer modelling, TumorML. In this paper we describe the motivations for such a language, arguing that current state-of-the-art biomodelling languages are not suited to the cancer modelling domain. We go on to describe the work that needs to be done to develop TumorML, the conceptual design, and a description of what existing markup languages will be used to compose the language specification.

  11. Fast tracking ICT infrastructure requirements and design, based on Enterprise Reference Architecture and matching Reference Models

    DEFF Research Database (Denmark)

    Bernus, Peter; Baltrusch, Rob; Vesterager, Johan

    2002-01-01

    The Globemen Consortium has developed the virtual enterprise reference architecture and methodology (VERAM), based on GERAM and developed reference models for virtual enterprise management and joint mission delivery. The planned virtual enterprise capability includes the areas of sales...

  12. Models and Tabu Search Metaheuristics for Service Network Design with Asset-Balance Requirements

    DEFF Research Database (Denmark)

    Pedersen, Michael Berliner; Crainic, T.G.; Madsen, Oli B.G.

    2009-01-01

    This paper focuses on a generic model for service network design, which includes asset positioning and utilization through constraints on asset availability at terminals. We denote these relations as "design-balance constraints" and focus on the design-balanced capacitated multicommodity network...... design model, a generalization of the capacitated multicommodity network design model generally used in service network design applications. Both arc-and cycle-based formulations for the new model are presented. The paper also proposes a tabu search metaheuristic framework for the arc-based formulation....... Results on a wide range of network design problem instances from the literature indicate the proposed method behaves very well in terms of computational efficiency and solution quality....

  13. Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking

    Science.gov (United States)

    Turgeon, Gregory; Price, Petra

    2010-01-01

    A feasibility study was performed on a representative aerospace system to determine the following: (1) the benefits and limitations to using SCADE , a commercially available tool for model checking, in comparison to using a proprietary tool that was studied previously [1] and (2) metrics for performing the model checking and for assessing the findings. This study was performed independently of the development task by a group unfamiliar with the system, providing a fresh, external perspective free from development bias.

  14. Nothing Else Matters: Model-Agnostic Explanations By Identifying Prediction Invariance

    OpenAIRE

    Ribeiro, Marco Tulio; Singh, Sameer; Guestrin, Carlos

    2016-01-01

    At the core of interpretable machine learning is the question of whether humans are able to make accurate predictions about a model's behavior. Assumed in this question are three properties of the interpretable output: coverage, precision, and effort. Coverage refers to how often humans think they can predict the model's behavior, precision to how accurate humans are in those predictions, and effort is either the up-front effort required in interpreting the model, or the effort required to ma...

  15. REQUIREMENTS PATTERNS FOR FORMAL CONTRACTS IN ARCHITECTURAL ANALYSIS AND DESIGN LANGUAGE (AADL) MODELS

    Science.gov (United States)

    2017-04-17

    ... because of simultaneous failures in two of the aircraft's braking systems. The architecture of the primary Braking System Control Unit (BSCU) is ... a component of the overall Flight Control System (FCS) that compares the measured state of an aircraft (position, speed, and attitude) to the ... Keywords: Cyberphysical Systems, Formal Methods, Requirements Patterns, AADL, Assume Guarantee Reasoning Environment

  16. The Streptococcus sanguinis Competence Regulon Is Not Required for Infective Endocarditis Virulence in a Rabbit Model

    OpenAIRE

    Callahan, Jill E.; Munro, Cindy L.; Kitten, Todd

    2011-01-01

    Streptococcus sanguinis is an important component of dental plaque and a leading cause of infective endocarditis. Genetic competence in S. sanguinis requires a quorum sensing system encoded by the early comCDE genes, as well as late genes controlled by the alternative sigma factor, ComX. Previous studies of Streptococcus pneumoniae and Streptococcus mutans have identified functions for the >100-gene com regulon in addition to DNA uptake, including virulence. We investigated this possibility i...

  17. Divided attention and mental effort after severe traumatic brain injury.

    Science.gov (United States)

    Azouvi, Philippe; Couillet, Josette; Leclercq, Michel; Martin, Yves; Asloun, Sybille; Rousseaux, Marc

    2004-01-01

    The aim of this study was to assess dual-task performance in TBI patients, under different experimental conditions, with or without explicit emphasis on one of two tasks. Results were compared with measurement of the subjective mental effort required to perform each task. Forty-three severe TBI patients at the subacute or chronic phase performed two tasks under single- and dual-task conditions: (a) random generation; (b) a visual go/no-go reaction time task. Three dual-task conditions were given, requiring them either to consider both tasks as equally important or to focus preferentially on one of them. Patients were compared to matched controls. Subjective mental effort was rated on a visual analogue scale. TBI patients showed a disproportionate increase in reaction time in the go/no-go task under the dual-task condition. However, they were just as able as controls to adapt performance to the specific instructions about the task to be emphasised. Patients reported significantly higher subjective mental effort, but the variation of mental effort according to task condition was similar to that of controls. These results suggest that the divided attention deficit of TBI patients is related to a reduction in available processing resources rather than an impairment of strategic processes responsible for attentional allocation and switching. The higher level of subjective mental effort may explain why TBI patients frequently complain of mental fatigue, although this subjective complaint seems to be relatively independent of cognitive impairment.

  18. 49 CFR Appendix A to Part 26 - Guidance Concerning Good Faith Efforts

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 1 2010-10-01 2010-10-01 false Guidance Concerning Good Faith Efforts A Appendix... A to Part 26—Guidance Concerning Good Faith Efforts I. When, as a recipient, you establish a... good faith efforts to meet the goal. The bidder can meet this requirement in either of two ways. First...

  19. The influence of music on mental effort and driving performance.

    Science.gov (United States)

    Ünal, Ayça Berfu; Steg, Linda; Epstude, Kai

    2012-09-01

    The current research examined the influence of loud music on driving performance, and whether mental effort mediated this effect. Participants (N=69) drove in a driving simulator either with or without listening to music. In order to test whether music would have similar effects on driving performance in different situations, we manipulated the simulated traffic environment such that the driving context consisted of both complex and monotonous driving situations. In addition, we systematically kept track of drivers' mental load by making the participants verbally report their mental effort at certain moments while driving. We found that listening to music increased mental effort while driving, irrespective of the driving situation being complex or monotonous, providing support to the general assumption that music can be a distracting auditory stimulus while driving. However, drivers who listened to music performed as well as the drivers who did not listen to music, indicating that music did not impair their driving performance. Importantly, the increases in mental effort while listening to music pointed out that drivers try to regulate their mental effort as a cognitive compensatory strategy to deal with task demands. Interestingly, we observed significant improvements in driving performance in two of the driving situations. It seems that mental effort might mediate the effect of music on driving performance in situations requiring sustained attention. Other process variables, such as arousal and boredom, should also be incorporated into study designs in order to reveal more about how music affects driving. Copyright © 2012 Elsevier Ltd. All rights reserved.

  20. Estimates of nutritional requirements and use of Small Ruminant Nutrition System model for hair sheep in semiarid conditions

    Directory of Open Access Journals (Sweden)

    Alessandra Pinto de Oliveira

    2014-09-01

    Full Text Available The objective was to determine the efficiency of utilization of metabolizable energy for maintenance (km) and weight gain (kf) and the dietary requirements of total digestible nutrients (TDN) and metabolizable protein (MP), as well as to evaluate the Small Ruminant Nutrition System (SRNS) model for predicting the dry matter intake (DMI) and average daily gain (ADG) of Santa Ines lambs fed diets containing different levels of metabolizable energy (ME). Thirty-five non-castrated lambs with an initial body weight (BW) of 14.77 ± 1.26 kg, approximately two months old, were used. At the beginning of the experiment, five animals were slaughtered to serve as a reference for estimating the empty body weight (EBW) and initial body composition of the 30 remaining animals, which were distributed in a randomized block design with five treatments (1.13, 1.40, 1.73, 2.22, and 2.60 Mcal/kg DM) and six replicates. The metabolizable energy requirement for maintenance was 78.53 kcal/kg EBW0.75/day, with a utilization efficiency of 66%. The average efficiency of metabolizable energy utilization for weight gain was 48%. The dietary requirements of TDN and MP increased with the BW and ADG of the animals. The SRNS model underestimated the DMI and ADG of the animals by 6.2% and 24.6%, respectively. We conclude that the values of km and kf are consistent with those observed in several studies of lambs raised in the tropics. The dietary TDN and MP requirements of Santa Ines lambs for different BW and ADG are approximately 42% and 24% lower, respectively, than those suggested by the American system for evaluating feeds and nutrient requirements of small ruminants. The SRNS model predicted DMI in Santa Ines lambs satisfactorily; however, for ADG more studies are needed, since the model underestimated the response of the animals in this study.
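    The metabolic-weight scaling reported above (78.53 kcal/kg EBW^0.75/day) can be applied directly; the 20 kg example weight below is hypothetical, chosen only to illustrate the calculation:

```python
def maintenance_me(ebw_kg, coeff=78.53):
    """Daily maintenance metabolizable energy (kcal/day) scaled to
    metabolic empty body weight: ME_m = coeff * EBW**0.75.
    coeff defaults to the 78.53 kcal/kg EBW^0.75/day reported above."""
    return coeff * ebw_kg ** 0.75

# e.g., a hypothetical lamb of 20 kg empty body weight:
print(f"{maintenance_me(20):.0f} kcal/day")
```

    The 0.75 exponent is the standard metabolic body size scaling, which is why requirements grow more slowly than body weight itself.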

  1. Generation of a convalescent model of virulent Francisella tularensis infection for assessment of host requirements for survival of tularemia.

    Directory of Open Access Journals (Sweden)

    Deborah D Crane

    Full Text Available Francisella tularensis is a facultative intracellular bacterium and the causative agent of tularemia. Development of novel vaccines and therapeutics for tularemia has been hampered by the lack of understanding of which immune components are required to survive infection. Defining these requirements for protection against virulent F. tularensis, such as strain SchuS4, has been difficult since experimentally infected animals typically die within 5 days after exposure to as few as 10 bacteria. Such a short mean time to death typically precludes development, and therefore assessment, of immune responses directed against virulent F. tularensis. To enable identification of the components of the immune system that are required for survival of virulent F. tularensis, we developed a convalescent model of tularemia in C57Bl/6 mice using low-dose antibiotic therapy in which the host immune response is ultimately responsible for clearance of the bacterium. Using this model we demonstrate that αβTCR(+) cells, γδTCR(+) cells, and B cells are necessary to survive primary SchuS4 infection. Analysis of mice deficient in specific soluble mediators shows that IL-12p40 and IL-12p35 are essential for survival of SchuS4 infection. We also show that IFN-γ is required for survival of SchuS4 infection since mice lacking IFN-γR succumb to disease during the course of antibiotic therapy. Finally, we found that both CD4(+) and CD8(+) cells are the primary producers of IFN-γ and that γδTCR(+) cells and NK cells make a minimal contribution toward production of this cytokine throughout infection. Together these data provide a novel model that identifies key cells and cytokines required for survival or exacerbation of infection with virulent F. tularensis and provides evidence that this model will be a useful tool for better understanding the dynamics of tularemia infection.

  2. Food and energy choices for India: a programming model with partial endogenous energy requirements.

    Science.gov (United States)

    Parikh, K S; Srinivasan, T N

    1980-09-01

    This paper presents a mathematical model for all matter-energy processing subsystems at the level of the society, specifically India. It explores India's choices in the food and energy sectors over the coming decades. Alternative land intensive, irrigation energy intensive, and fertilizer intensive techniques of food production are identified using a nonlinear programming model. The land saved is devoted to growing firewood. The optimum combination of railway (steam, diesel, and electric traction) and road (automobiles, diesel trucks, and diesel and gasoline buses) transport is determined. For the oil sector, two alternative sources of supply of crude oil and petroleum products are included, namely, domestic production and imports. The optimum choice is determined through a linear programming model. While the model is basically a static one, designed to determine the optimal choice for the target year of 2000-2001, certain intertemporal detail is incorporated for electricity generation. The model minimizes the costs of meeting the needs for food, transport in terms of passenger kilometers and goods per ton per kilometer, energy needs for domestic cooking and lighting, and the energy needs of the rest of the economy.

  3. Time preferences, study effort, and academic performance

    NARCIS (Netherlands)

    Non, J.A.; Tempelaar, D.T.

    2014-01-01

    We analyze the relation between time preferences, study effort, and academic performance among first-year Business and Economics students. Time preferences are measured by stated preferences for an immediate payment over larger delayed payments. Data on study efforts are derived from an electronic

  4. Interests, Effort, Achievement and Vocational Preference.

    Science.gov (United States)

    Sjoberg, L.

    1984-01-01

    Relationships between interest in natural sciences and technology and perceived ability, success, and invested effort were studied in Swedish secondary school students. Interests were accounted for by logical orientation and practical value. Interests and grades were strongly correlated, but correlations between interests and effort and vocational…

  5. Listening Effort With Cochlear Implant Simulations

    NARCIS (Netherlands)

    Pals, Carina; Sarampalis, Anastasios; Başkent, Deniz

    2013-01-01

    Purpose: Fitting a cochlear implant (CI) for optimal speech perception does not necessarily optimize listening effort. This study aimed to show that listening effort may change between CI processing conditions for which speech intelligibility remains constant. Method: Nineteen normal-hearing

  6. Effort and Selection Effects of Incentive Contracts

    NARCIS (Netherlands)

    Bouwens, J.F.M.G.; van Lent, L.A.G.M.

    2003-01-01

    We show that the improved effort of employees associated with incentive contracts depends on the properties of the performance measures used in the contract. We also find that the power of incentives in the contract is only indirectly related to any improved employee effort. High powered incentive

  7. The Effect of Age on Listening Effort

    Science.gov (United States)

    Degeest, Sofie; Keppler, Hannah; Corthals, Paul

    2015-01-01

    Purpose: The objective of this study was to investigate the effect of age on listening effort. Method: A dual-task paradigm was used to evaluate listening effort in different conditions of background noise. Sixty adults ranging in age from 20 to 77 years were included. A primary speech-recognition task and a secondary memory task were performed…

  8. Low-effort thought promotes political conservatism.

    Science.gov (United States)

    Eidelman, Scott; Crandall, Christian S; Goodman, Jeffrey A; Blanchar, John C

    2012-06-01

    The authors test the hypothesis that low-effort thought promotes political conservatism. In Study 1, alcohol intoxication was measured among bar patrons; as blood alcohol level increased, so did political conservatism (controlling for sex, education, and political identification). In Study 2, participants under cognitive load reported more conservative attitudes than their no-load counterparts. In Study 3, time pressure increased participants' endorsement of conservative terms. In Study 4, participants considering political terms in a cursory manner endorsed conservative terms more than those asked to cogitate; an indicator of effortful thought (recognition memory) partially mediated the relationship between processing effort and conservatism. Together these data suggest that political conservatism may be a process consequence of low-effort thought; when effortful, deliberate thought is disengaged, endorsement of conservative ideology increases.

  9. Data Requirements and Modeling for Gas Hydrate-Related Mixtures and a Comparison of Two Association Models

    DEFF Research Database (Denmark)

    Liang, Xiaodong; Aloupis, Georgios; Kontogeorgis, Georgios M.

    2017-01-01

    the performance of the CPA and sPC-SAFT EOS for modeling the fluid-phase equilibria of gas hydrate-related systems and will try to explore how the models can help in suggesting experimental measurements. These systems contain water, hydrocarbon (alkane or aromatic), and either methanol or monoethylene glycol...... parameter sets have been chosen for the sPC-SAFT EOS for a fair comparison. The comparisons are made for pure fluid properties, vapor liquid-equilibria, and liquid liquid equilibria of binary and ternary mixtures as well as vapor liquid liquid equilibria of quaternary mixtures. The results show, from...

  10. Reducing the computational requirements for simulating tunnel fires by combining multiscale modelling and multiple processor calculation

    DEFF Research Database (Denmark)

    Vermesi, Izabella; Rein, Guillermo; Colella, Francesco

    2017-01-01

    Multiscale modelling of tunnel fires that uses a coupled 3D (fire area) and 1D (the rest of the tunnel) model is seen as the solution to the numerical problem of the large domains associated with long tunnels. The present study demonstrates the feasibility of the implementation of this method...... in FDS version 6.0, a widely used fire-specific, open source CFD software. Furthermore, it compares the reduction in simulation time given by multiscale modelling with the one given by the use of multiple processor calculation. This was done using a 1200m long tunnel with a rectangular cross......-section as a demonstration case. The multiscale implementation consisted of placing a 30MW fire in the centre of a 400m long 3D domain, along with two 400m long 1D ducts on each side of it, that were again bounded by two nodes each. A fixed volume flow was defined in the upstream duct and the two models were coupled...

  11. Integrating Behavioral-Motive and Experiential-Requirement Perspectives on Psychological Needs: A Two Process Model

    Science.gov (United States)

    Sheldon, Kennon M.

    2011-01-01

    Psychological need theories offer much explanatory potential for behavioral scientists, but there is considerable disagreement and confusion about what needs are and how they work. A 2-process model of psychological needs is outlined, viewing needs as evolved functional systems that provide both (a) innate psychosocial motives that tend to impel…

  12. Vegetation-specific model parameters are not required for estimating gross primary production

    Czech Academy of Sciences Publication Activity Database

    Yuan, W.; Cai, W.; Liu, S.; Dong, W.; Chen, J.; Altaf Arain, M.; Blanken, P. D.; Cescatti, A.; Wohlfahrt, G.; Georgiadis, T.; Genesio, L.; Gianelle, D.; Grelle, A.; Kiely, G.; Knohl, A.; Liu, D.; Marek, Michal V.; Merbold, L.; Montagnani, L.; Panferov, O.; Peltoniemi, M.; Rambal, S.; Raschi, A.; Varlagin, A.; Xia, J.

    2014-01-01

    Roč. 292, NOV 24 2014 (2014), s. 1-10 ISSN 0304-3800 Institutional support: RVO:67179843 Keywords : light use efficiency * gross primary production * model parameters Subject RIV: EH - Ecology, Behaviour Impact factor: 2.321, year: 2014

  13. Towards security requirements: Iconicity as a feature of an informal modeling language

    NARCIS (Netherlands)

    Vasenev, Alexandr; Ionita, Dan; Zoppi, Tomasso; Ceccarelli, Andrea; Wieringa, Roelf J.

    2017-01-01

    Self-adaptive systems need to be designed with respect to threats within their operating conditions. Identifying such threats during the design phase can benefit from the involvement of stakeholders. Using a system model, the stakeholders, who may neither be IT experts nor security experts, can

  14. Telematic Requirements for Emergency and Disaster Response derived from Enterprise Models

    NARCIS (Netherlands)

    Widya, I.A.; Vierhout, P.A.M.; Vierhout, P.A.M.; Jones, Valerie M.; Bults, Richard G.A.; van Halteren, Aart; Peuscher, J.; Konstantas, D.; Istepanian, R.S.H.; Laxminarayan, S.; Pattichis, C.S.

    2006-01-01

    One of the prime objectives in disaster response management is to achieve full control of the situation as rapidly as possible. Coordination and communication facilities therefore play an essential role in managing disasters. This chapter discusses Enterprise Models that capture the invariant

  15. Structure and data requirements of an end-use model for residential ...

    African Journals Online (AJOL)

    driniev

    2004-07-03

    Jul 3, 2004 ... 2 Department of Civil and Urban Engineering, Rand Afrikaans University, PO Box 524, Auckland Park 2006, ... One such approach is end-use modelling, which has a ... chemistry and are not easily removed after being dissolved into the ..... with a rainfall of 12 mm/month for month m the same applies to all.

  16. Quasi-monte carlo simulation and variance reduction techniques substantially reduce computational requirements of patient-level simulation models: An application to a discrete event simulation model

    NARCIS (Netherlands)

    Treur, M.; Postma, M.

    2014-01-01

    Objectives: Patient-level simulation models provide increased flexibility to overcome the limitations of cohort-based approaches in health-economic analysis. However, the computational requirements of reaching convergence are a notorious barrier. The objective was to assess the impact of using
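    The abstract is truncated, but the technique it names is standard: quasi-Monte Carlo replaces pseudo-random draws with low-discrepancy points so that fewer simulated patients are needed for a given accuracy. A minimal sketch (a toy integrand, not a health-economic model) using a base-2 van der Corput sequence:

```python
import random

def van_der_corput(i, base=2):
    """i-th point of the base-b van der Corput low-discrepancy sequence."""
    x, denom = 0.0, 1.0
    while i:
        i, digit = divmod(i, base)
        denom *= base
        x += digit / denom
    return x

# Toy integrand with known integral: the integral of x^2 over [0, 1] is 1/3.
f = lambda x: x * x
n = 1024

qmc_est = sum(f(van_der_corput(i)) for i in range(n)) / n  # low-discrepancy
rng = random.Random(0)
mc_est = sum(f(rng.random()) for _ in range(n)) / n        # pseudo-random

print(abs(qmc_est - 1 / 3), abs(mc_est - 1 / 3))
```

With the same number of points, the quasi-random error here is typically an order of magnitude smaller than the pseudo-random one, which is the effect exploited to cut the run count of patient-level simulations.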

  17. Proposing a Holistic Model for Formulating the Security Requirements of e-learning based on Stakeholders’ Point of Veiw

    Directory of Open Access Journals (Sweden)

    Abouzar Arabsorkhi Mishabi

    2016-03-01

    Full Text Available The development of e-learning applications and services in the context of information and communication networks, beside the qualitative and quantitative improvement in the scope and range of services they provide, has increased the variety of threats that emerge from these networks and the telecommunications infrastructure. This kind of issue has made effective and accurate analysis of security issues necessary for managers and decision makers. Accordingly, in this study, using the findings of other studies in the field of e-learning security and applying meta-synthesis, we attempted to define a holistic model for the classification and organization of security requirements: a structure that defines the origin of the security requirements of e-learning and serves as a reference for formulating security requirements in this area.

  18. Standardization efforts of digital pathology in Europe.

    Science.gov (United States)

    Rojo, Marcial García; Daniel, Christel; Schrader, Thomas

    2012-01-01

    EURO-TELEPATH is a European COST Action IC0604. It started in 2007 and will end in November 2011. Its main objectives are evaluating and validating the common technological framework and communication standards required to access, transmit, and manage digital medical records by pathologists and other medical specialties in a networked environment. Working Group 1, "Business Modelling in Pathology," has designed main pathology processes - Frozen Study, Formalin Fixed Specimen Study, Telepathology, Cytology, and Autopsy - using Business Process Modelling Notation (BPMN). Working Group 2 has been dedicated to promoting the application of informatics standards in pathology, collaborating with Integrating Healthcare Enterprise (IHE), Digital Imaging and Communications in Medicine (DICOM), Health Level Seven (HL7), and other standardization bodies. Health terminology standardization research has become a topic of great interest. Future research work should focus on standardizing automatic image analysis and tissue microarrays imaging.

  19. Simulation of temporal and spatial distribution of required irrigation water by crop models and the pan evaporation coefficient method

    Science.gov (United States)

    Yang, Yan-min; Yang, Yonghui; Han, Shu-min; Hu, Yu-kun

    2009-07-01

    Hebei Plain is the most important agricultural belt in North China. Intensive irrigation, low and uneven precipitation have led to severe water shortage on the plain. This study is an attempt to resolve this crucial issue of water shortage for sustainable agricultural production and water resources management. The paper models distributed regional irrigation requirement for a range of cultivated crops on the plain. Classic crop models like DSSAT- wheat/maize and COTTON2K are used in combination with pan-evaporation coefficient method to estimate water requirements for wheat, corn, cotton, fruit-trees and vegetables. The approach is more accurate than the static approach adopted in previous studies. This is because the combination use of crop models and pan-evaporation coefficient method dynamically accounts for irrigation requirement at different growth stages of crops, agronomic practices, and field and climatic conditions. The simulation results show increasing Required Irrigation Amount (RIA) with time. RIA ranges from 5.08×109 m3 to 14.42×109 m3 for the period 1986~2006, with an annual average of 10.6×109 m3. Percent average water use by wheat, fruit trees, vegetable, corn and cotton is 41%, 12%, 12%, 11%, 7% and 17% respectively. RIA for April and May (the period with the highest irrigation water use) is 1.78×109 m3 and 2.41×109 m3 respectively. The counties in the piedmont regions of Mount Taihang have high RIA while the central and eastern regions/counties have low irrigation requirement.
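    The pan-evaporation coefficient method named in the abstract is conventionally formulated as ETo = Kp × Epan and ETc = Kc × ETo, with the irrigation requirement as the shortfall after effective rainfall. A minimal sketch under that standard formulation (the Kp, Kc, and rainfall values below are hypothetical, not the paper's):

```python
def required_irrigation(epan_mm, kp, kc, effective_rain_mm):
    """Monthly required irrigation amount (mm) via the pan-evaporation
    coefficient method (standard FAO-style formulation, not the paper's code):
        ETo = Kp * Epan,  ETc = Kc * ETo,
        RIA = max(ETc - effective rainfall, 0)."""
    eto = kp * epan_mm          # reference evapotranspiration
    etc = kc * eto              # crop evapotranspiration
    return max(etc - effective_rain_mm, 0.0)

# Hypothetical month for winter wheat: 180 mm pan evaporation,
# Kp = 0.7, Kc = 1.05, 20 mm effective rainfall.
print(required_irrigation(180.0, 0.7, 1.05, 20.0))  # ≈ 112.3 mm
```

In the study this per-crop, per-month estimate is driven by crop-model growth stages (which set Kc) and then summed over area to obtain the regional RIA.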

  20. Requirements analysis and data model design for the development of vertical ERP solutions for the ceramic industry

    International Nuclear Information System (INIS)

    Oltra-Badenes, R. F.; Gil-Gomez, H.; Bellver-Lopez, R.; Asensio-Cuenta, S.

    2013-01-01

    Currently, existing information systems, and specifically ERPs, cannot adequately support the management of ceramic tile manufacturing companies because, among other reasons, they do not contemplate the existence of tone, size, and quality within the same product. This feature, caused by the lack of homogeneity of the product (LHP), generates various problems in managing the product through the different business processes, such as stock management, order management, production management, etc. Thus, it is necessary to develop an ERP solution that is able to manage the ceramic product adequately, including tone, size, and quality. In this paper we analyze the requirements of the ceramic sector in terms of product identification and propose a data model to meet these requirements. The model serves as a basic guide for the development of vertical ERP solutions tailored to the ceramic industry. (Author)

  1. What model resolution is required in climatological downscaling over complex terrain?

    Science.gov (United States)

    El-Samra, Renalda; Bou-Zeid, Elie; El-Fadel, Mutasem

    2018-05-01

    This study presents results from the Weather Research and Forecasting (WRF) model applied for climatological downscaling simulations over highly complex terrain along the Eastern Mediterranean. We sequentially downscale general circulation model results, for a mild and wet year (2003) and a hot and dry year (2010), to three local horizontal resolutions of 9, 3 and 1 km. Simulated near-surface hydrometeorological variables are compared at different time scales against data from an observational network over the study area comprising rain gauges, anemometers, and thermometers. The overall performance of WRF at 1 and 3 km horizontal resolution was satisfactory, with significant improvement over the 9 km downscaling simulation. The total yearly precipitation from WRF's 1 km and 3 km domains provided a quantitative measure of the potential errors for various hydrometeorological variables.

  2. Improving the Enterprise Requirements and Acquisition Model’s Developmental Test and Evaluation Process Fidelity

    Science.gov (United States)

    2014-03-27

    program office. Many of the individuals were identified through snowball sampling . SME Discussion Summary The SME discussions brought to light...adequate sample sizes are collected. With computer modeling and simulation, time is less of a limiting factor as it can be manipulated. Data representing...The DAMS suffers from diverse problems of which only a very small sample were discussed here. However, as diverse as the problems encountered were, a

  3. Oil Well Blowout 3D computational modeling: review of methodology and environmental requirements

    OpenAIRE

    Pedro Mello Paiva; Alexandre Nunes Barreto; Jader Lugon Junior; Leticia Ferraço de Campos

    2016-01-01

    This literature review aims to present the different methodologies used in the three-dimensional modeling of the hydrocarbons dispersion originated from an oil well blowout. It presents the concepts of coastal environmental sensitivity and vulnerability, their importance for prioritizing the most vulnerable areas in case of contingency, and the relevant legislation. We also discuss some limitations about the methodology currently used in environmental studies of oil drift, which considers sim...

  4. Summary of NR Program Prometheus Efforts

    International Nuclear Information System (INIS)

    J Ashcroft; C Eshelman

    2006-01-01

    The Naval Reactors Program led work on the development of a reactor plant system for the Prometheus space reactor program. The work centered on a 200 kWe electric reactor plant with a 15-20 year mission applicable to nuclear electric propulsion (NEP). After a review of all reactor and energy conversion alternatives, a direct gas Brayton reactor plant was selected for further development. The work performed subsequent to this selection included preliminary nuclear reactor and reactor plant design, development of instrumentation and control techniques, modeling reactor plant operational features, development and testing of core and plant material options, and development of an overall project plan. Prior to restructuring of the program, substantial progress had been made on defining reference plant operating conditions, defining reactor mechanical, thermal and nuclear performance, understanding the capabilities and uncertainties provided by material alternatives, and planning non-nuclear and nuclear system testing. The mission requirements for the envisioned NEP missions cannot be accommodated with existing reactor technologies. Therefore concurrent design, development and testing would be needed to deliver a functional reactor system. Fuel and material performance beyond the current state of the art is needed. There is very little national infrastructure available for fast reactor nuclear testing and associated materials development and testing. Surface mission requirements may be different enough to warrant different reactor design approaches and development of a generic multi-purpose reactor requires substantial sacrifice in performance capability for each mission

  5. New model for predicting energy requirements of children during catch-up growth developed using doubly labeled water

    Energy Technology Data Exchange (ETDEWEB)

    Fjeld, C R; Schoeller, D A; Brown, K H

    1989-05-01

    Energy partitioned to maintenance plus activity, tissue synthesis, and storage was measured in 41 children in early recovery (W/L (wt/length) less than 5th percentile) from severe protein-energy malnutrition and in late recovery (W/L = 25th percentile) to determine energy requirements during catch-up growth. Metabolizable energy intake was measured by bomb calorimetry and metabolic collections. Energy expended (means +/- SD) for maintenance and activity estimated by the doubly labeled water method was 97 +/- 12 kcal/kg FFM (fat-free mass) in early recovery and 98 +/- 12 kcal/kg FFM in late recovery (p greater than 0.5). Energy stored was 5-6 kcal/g of wt gain. Tissue synthesis increased energy expenditure by 1 +/- 0.7 kcal/g gain in both early and late recovery. From these data a mathematical model was developed to predict energy requirements for children during catch-up growth as a function of initial body composition and rate and composition of wt gain. The model for predicting metabolizable energy requirements is (98 × FFM) + A × (11.1B + 2.2C) kcal/kg·d, where FFM is fat-free mass expressed as a percentage of body wt, A is wt gain (g/kg·d), and B and C are percentage of wt gain/100 as fat and FFM, respectively. The model was tested retrospectively in separate studies of malnourished children.
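    The stated model can be written directly in code. A minimal sketch, assuming (consistent with the 98 kcal/kg FFM maintenance figure) that FFM, B, and C are entered as fractions rather than raw percentages; the example inputs are hypothetical:

```python
def catch_up_energy_requirement(ffm_fraction, gain_g_per_kg_day,
                                fat_fraction_of_gain, ffm_fraction_of_gain):
    """Metabolizable energy requirement (kcal/kg/day) during catch-up
    growth, per the model stated in the abstract:
        (98 x FFM) + A x (11.1 B + 2.2 C)
    where FFM, B, C are fractions (percentage / 100) and A is weight
    gain in g/kg/day."""
    maintenance = 98 * ffm_fraction
    tissue = gain_g_per_kg_day * (11.1 * fat_fraction_of_gain
                                  + 2.2 * ffm_fraction_of_gain)
    return maintenance + tissue

# Hypothetical child: FFM is 85% of body weight, gaining 10 g/kg/day,
# with 30% of the gain as fat and 70% as fat-free mass.
print(catch_up_energy_requirement(0.85, 10, 0.30, 0.70))  # ≈ 132.0 kcal/kg/day
```

The two terms separate the cost of maintenance plus activity (proportional to fat-free mass) from the cost of tissue synthesis and storage (proportional to the rate and composition of weight gain), mirroring the partition measured in the study.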

  6. Practitioner's knowledge representation a pathway to improve software effort estimation

    CERN Document Server

    Mendes, Emilia

    2014-01-01

    The main goal of this book is to help organizations improve their effort estimates and effort estimation processes by providing a step-by-step methodology that takes them through the creation and validation of models that are based on their own knowledge and experience. Such models, once validated, can then be used to obtain predictions, carry out risk analyses, enhance their estimation processes for new projects and generally advance them as learning organizations.Emilia Mendes presents the Expert-Based Knowledge Engineering of Bayesian Networks (EKEBNs) methodology, which she has used and adapted during the course of several industry collaborations with different companies world-wide over more than 6 years. The book itself consists of two major parts: first, the methodology's foundations in knowledge management, effort estimation (with special emphasis on the intricacies of software and Web development) and Bayesian networks are detailed; then six industry case studies are presented which illustrate the pra...

  7. Literacy And Reward: Teachers’ Effort To Build Children Reading Habit

    Directory of Open Access Journals (Sweden)

    Helena Arisandi Komang Widia

    2018-01-01

    Full Text Available Children at early levels of primary school require appropriate guidance in their initial reading skill. They need to be trained in how reading can become an enjoyable routine activity. This study aimed at describing teachers' effort to build children's reading habit. This study employed a qualitative descriptive design and was conducted at North Bali Bilingual School, Bali. The data were collected through observations and interviews. The findings of the study showed that there were several activities conducted by the teacher as efforts to build children's reading habit. In terms of building students' reading habit, the teacher used (1) points written in the Reading Rocket Chart (PRRC), (2) chips (white, yellow, green) for appreciating good behaviour in reading and using English, (3) certificates, (4) class rewards, and (5) free play time. With these efforts, it is evident that the students' literacy improves and they exhibited great enthusiasm in their reading and studying literacy in the classroom.

  8. Inpo/industry job and task analysis efforts

    International Nuclear Information System (INIS)

    Wigley, W.W.

    1985-01-01

    One of the goals of INPO is to develop and coordinate industrywide programs to improve the education, training and qualification of nuclear utility personnel. To accomplish this goal, INPO's Training and Education Division: conducts periodic evaluations of industry training programs; provides assistance to the industry in developing training programs; manages the accreditation of utility training programs. These efforts are aimed at satisfying the need for training programs for nuclear utility personnel to be performance-based. Performance-based means that training programs provide an incumbent with the skills and knowledge required to safely perform the job. One of the ways that INPO has provided assistance to the industry is through the industrywide job and task analysis effort. I will discuss the job analysis and task analysis processes, the current status of JTA efforts, JTA products and JTA lessons learned

  9. Job Satisfaction, Effort, and Performance: A Reasoned Action Perspective

    Directory of Open Access Journals (Sweden)

    Icek Ajzen

    2011-12-01

    Full Text Available In this article the author takes issue with the recurrent reliance on job satisfaction to explain job-related effort and performance. The disappointing findings in this tradition are explained by a lack of compatibility between job satisfaction (a very broad attitude) and the more specific effort and performance criteria. Moreover, attempts to apply the expectancy-value model of attitude to explore the determinants of effort and performance suffer from reliance on unrepresentative sets of beliefs about the likely consequences of these behaviors. The theory of planned behavior (Ajzen, 1991, 2012), with its emphasis on the proximal antecedents of job effort and performance, is offered as an alternative. According to the theory, intentions to exert effort and to attain a certain performance level are determined by attitudes, subjective norms, and perceptions of control in relation to these behaviors; and these variables, in turn, are a function of readily accessible beliefs about the likely outcomes of effort and performance, about the normative expectations of important others, and about factors that facilitate or hinder effective performance.

  10. Water requirements of short rotation poplar coppice: Experimental and modelling analyses across Europe

    Czech Academy of Sciences Publication Activity Database

    Fischer, Milan; Zenone, T.; Trnka, Miroslav; Orság, Matěj; Montagnani, L.; Ward, E. J.; Tripathi, Abishek; Hlavinka, Petr; Seufert, G.; Žalud, Zdeněk; King, J.; Ceulemans, R.

    2018-01-01

    Roč. 250, MAR (2018), s. 343-360 ISSN 0168-1923 R&D Projects: GA MŠk(CZ) LO1415 Institutional support: RVO:86652079 Keywords : energy-balance closure * dual crop coefficient * radiation use efficiency * simulate yield response * below-ground carbon * vs. 2nd rotation * flux data * biomass production * forest model * stand-scale * Bioenergy * Bowen ratio and energy balance * Crop coefficient * Eddy covariance * Evapotranspiration * Water balance Subject RIV: GC - Agronomy OBOR OECD: Agriculture Impact factor: 3.887, year: 2016

  11. SEBAL Model Using to Estimate Irrigation Water Efficiency & Water Requirement of Alfalfa Crop

    Science.gov (United States)

    Zeyliger, Anatoly; Ermolaeva, Olga

    2013-04-01

    The sustainability of irrigation is a complex and comprehensive undertaking, requiring attention to much more than hydraulics, chemistry, and agronomy. A special combination of human, environmental, and economic factors exists in each irrigated region and must be recognized and evaluated. One way to evaluate the efficiency of irrigation water use for crop production is to consider the so-called crop-water production functions, which express the relation between the yield of a crop and the quantity of water applied to it or consumed by it. The term has been used in a somewhat ambiguous way: some authors have defined crop-water production functions as the relation between yield and the total amount of water applied, whereas others have defined them as the relation between yield and seasonal evapotranspiration (ET). In the case of high irrigation water use efficiency, the volume of water applied is less than the potential evapotranspiration (PET); then, assuming no significant change in soil moisture storage from the beginning of the growing season to its end, the volume of water applied may be roughly equal to ET. In the case of low irrigation water use efficiency, the volume of water applied exceeds PET; the excess of the volume applied over PET must then go either to augmenting soil moisture storage (end-of-season moisture being greater than start-of-season soil moisture) or to runoff and/or deep percolation beyond the root zone. In the presented contribution, some results of a case study estimating biomass and leaf area index (LAI) for irrigated alfalfa using the SEBAL algorithm are discussed. The field study was conducted with the aim of comparing the ground biomass of alfalfa at several irrigated fields (provided by an agricultural farm) in the Saratov and Volgograd Regions of Russia. The study was conducted during the vegetation period of 2012, from April till September. All the operations, from importing the data to calculation of the output data, were carried out by the eLEAF company and uploaded to the Fieldlook web

  12. Requirement for Serratia marcescens cytolysin in a murine model of hemorrhagic pneumonia.

    Science.gov (United States)

    González-Juarbe, Norberto; Mares, Chris A; Hinojosa, Cecilia A; Medina, Jorge L; Cantwell, Angelene; Dube, Peter H; Orihuela, Carlos J; Bergman, Molly A

    2015-02-01

    Serratia marcescens, a member of the carbapenem-resistant Enterobacteriaceae, is an important emerging pathogen that causes a wide variety of nosocomial infections, spreads rapidly within hospitals, and has a systemic mortality rate of up to 41%. Despite multiple clinical descriptions of S. marcescens nosocomial pneumonia, little is known regarding the mechanisms of bacterial pathogenesis and the host immune response. To address this gap, we developed an oropharyngeal aspiration model of lethal and sublethal S. marcescens pneumonia in BALB/c mice and extensively characterized the latter. Lethal challenge (>4.0 × 10^6 CFU) was characterized by fulminant hemorrhagic pneumonia with rapid loss of lung function and death. Mice challenged with a sublethal dose (marcescens strains that failed to cause profound weight loss, extended illness, hemorrhage, and prolonged lung pathology in mice. This study describes a model of S. marcescens pneumonia that mimics known clinical features of human illness, identifies neutrophils and the toxin ShlA as key factors important for defense and infection, respectively, and provides a solid foundation for future studies of novel therapeutics for this important opportunistic pathogen. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  13. EU grid computing effort takes on malaria

    CERN Multimedia

    Lawrence, Stacy

    2006-01-01

    Malaria is the world's most common parasitic infection, affecting more than 500 million people annually and killing more than 1 million. In order to help combat malaria, CERN has launched a grid computing effort (1 page)

  14. Many multicenter trials had few events per center, requiring analysis via random-effects models or GEEs.

    Science.gov (United States)

    Kahan, Brennan C; Harhay, Michael O

    2015-12-01

    Adjustment for center in multicenter trials is recommended when there are between-center differences or when randomization has been stratified by center. However, common methods of analysis (such as fixed-effects, Mantel-Haenszel, or stratified Cox models) often require a large number of patients or events per center to perform well. We reviewed 206 multicenter randomized trials published in four general medical journals to assess the average number of patients and events per center and determine whether appropriate methods of analysis were used in trials with few patients or events per center. The median number of events per center/treatment arm combination for trials using a binary or survival outcome was 3 (interquartile range, 1-10). Sixteen percent of trials had less than 1 event per center/treatment combination, 50% fewer than 3, and 63% fewer than 5. Of the trials which adjusted for center using a method of analysis which requires a large number of events per center, 6% had less than 1 event per center-treatment combination, 25% fewer than 3, and 50% fewer than 5. Methods of analysis that allow for few events per center, such as random-effects models or generalized estimating equations (GEEs), were rarely used. Many multicenter trials contain few events per center. Adjustment for center using random-effects models or GEE with model-based (non-robust) standard errors may be beneficial in these scenarios. Copyright © 2015 Elsevier Inc. All rights reserved.
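    Of the stratified methods the review names, the Mantel-Haenszel pooled odds ratio is the simplest to state: it combines the per-center 2x2 tables into one weighted estimate. An illustrative sketch over hypothetical center strata with few events each (not the review's data):

```python
def mantel_haenszel_or(strata):
    """Mantel-Haenszel pooled odds ratio across strata (e.g. centers).
    Each stratum is a 2x2 table (a, b, c, d):
      a = events on treatment,  b = non-events on treatment,
      c = events on control,    d = non-events on control."""
    num = den = 0.0
    for a, b, c, d in strata:
        n = a + b + c + d
        num += a * d / n
        den += b * c / n
    return num / den

# Hypothetical 3-center trial with only a handful of events per center:
centers = [(2, 18, 1, 19), (1, 24, 2, 23), (3, 17, 1, 19)]
print(mantel_haenszel_or(centers))  # ≈ 1.545
```

Random-effects models and GEEs, which the review recommends for sparse-event settings, need a dedicated statistics library (e.g. `statsmodels` or R's `lme4`/`geepack`) rather than a few lines of arithmetic, which is part of why they are used less often in practice.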

  15. Empirical Models for Power and Energy Requirements II : A Powered Implement Operation in Serdang Sandy Clay Loam, Malaysia

    Directory of Open Access Journals (Sweden)

    A. F. Kheiralla

    2017-12-01

    Full Text Available Power and energy requirements were measured with an instrumented tractor for rotary tilling in Serdang sandy clay loam soil.  The effects of travel speed and rotor speed upon the measured data were investigated.  Power model from orthogonal regression analysis was formulated based on linear and quadratic functions of travel speed and bite length.  Fuel consumption model from regression analysis was formulated based on linear tractor PTO power as well as linear equivalent tractor PTO power.  Fuel consumption rates predicted by ASAE D497.3 were found to be 25% to 28% overestimates of the values predicted by the model developed.  However, fuel consumption rates reported by OECD Tractor Test were found to be 1% to 9% lower than the fuel consumption rates predicted by the model developed.  A comparison of power and energy requirements for both powered and draught implements showed that the disk harrow was the most energy efficient implement in terms of fuel consumption and specific energy followed by the rotary tiller, disk plough and mouldboard.  Finally, average PTO power, fuel consumption, wheel slip, wheel power and specific energy for a powered implement are presented.
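    The fuel-consumption model described, linear in tractor PTO power, is an ordinary least-squares fit. A minimal sketch with hypothetical PTO-power/fuel-rate pairs (illustrative numbers, not the paper's measurements):

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y = a + b*x (single predictor)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Hypothetical calibration data: PTO power (kW) vs fuel rate (L/h).
pto_kw = [10.0, 20.0, 30.0, 40.0]
fuel_lph = [4.0, 7.0, 10.0, 13.0]
a, b = fit_linear(pto_kw, fuel_lph)
print(f"fuel ≈ {a:.2f} + {b:.2f} * PTO_power")  # fuel ≈ 1.00 + 0.30 * PTO_power
```

The same fit applied to equivalent PTO power yields the second model variant the abstract mentions; comparing its predictions against ASAE D497.3 and OECD Tractor Test values gives the reported over/underestimates.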

  16. A recapitulative three-dimensional model of breast carcinoma requires perfusion for multi-week growth

    Directory of Open Access Journals (Sweden)

    Kayla F Goliwas

    2016-07-01

    Full Text Available Breast carcinomas are complex, three-dimensional tissues composed of cancer epithelial cells and stromal components, including fibroblasts and extracellular matrix. In vitro models that more faithfully recapitulate this dimensionality and stromal microenvironment should more accurately elucidate the processes driving carcinogenesis, tumor progression, and therapeutic response. Herein, novel in vitro breast carcinoma surrogates, distinguished by a relevant dimensionality and stromal microenvironment, are described and characterized. A perfusion bioreactor system was used to deliver medium to surrogates containing engineered microchannels and the effects of perfusion, medium composition, and the method of cell incorporation and density of initial cell seeding on the growth and morphology of surrogates were assessed. Perfused surrogates demonstrated significantly greater cell density and proliferation and were more histologically recapitulative of human breast carcinoma than surrogates maintained without perfusion. Although other parameters of the surrogate system, such as medium composition and cell seeding density, affected cell growth, perfusion was the most influential parameter.

  17. Covenant model of corporate compliance. "Corporate integrity" program meets mission, not just legal, requirements.

    Science.gov (United States)

    Tuohey, J F

    1998-01-01

    Catholic healthcare should establish comprehensive compliance strategies, beyond following Medicare reimbursement laws, that reflect mission and ethics. A covenant model of business ethics--rather than a self-interest emphasis on contracts--can help organizations develop a creed to focus on obligations and trust in their relationships. The corporate integrity program (CIP) of Mercy Health System Oklahoma promotes its mission and interests, educates and motivates its employees, provides assurance of systemwide commitment, and enforces CIP policies and procedures. Mercy's creed, based on its mission statement and core values, articulates responsibilities regarding patients and providers, business partners, society and the environment, and internal relationships. The CIP is carried out through an integrated network of committees, advocacy teams, and an expanded institutional review board. Two documents set standards for how Mercy conducts external affairs and clarify employee codes of conduct.

  18. Community Digital Library Requirements for the Southern California Earthquake Center Community Modeling Environment (SCEC/CME)

    Science.gov (United States)

    Moore, R.; Faerman, M.; Minster, J.; Day, S. M.; Ely, G.

    2003-12-01

    A community digital library provides support for ingestion, organization, description, preservation, and access of digital entities. The technologies that traditionally provide these capabilities are digital libraries (ingestion, organization, description), persistent archives (preservation), and data grids (access). We present a design for the SCEC community digital library that incorporates aspects of all three systems. Multiple groups have created integrated environments that sustain large-scale scientific data collections. By examining these projects, the following stages of implementation can be identified:
    - Definition of semantic terms to associate with relevant information. This includes definition of uniform content descriptors to describe physical quantities relevant to the scientific discipline, and creation of concept spaces to define how the uniform content descriptors are logically related.
    - Organization of digital entities into logical collections that make it simple to browse and manage related material.
    - Definition of services that are used to access and manipulate material in the collection.
    - Creation of a preservation environment for the long-term management of the collection.
    Each community is faced with heterogeneity that is introduced when data are distributed across multiple sites, when multiple sets of collection semantics are used, or when multiple scientific sub-disciplines are federated. We will present the relevant standards that simplify the implementation of the SCEC community library, the resource requirements for different types of data sets that drive the implementation, and the digital library processes that the SCEC community library will support.
    The SCEC community library can be viewed as the set of processing steps that are required to build the appropriate SCEC reference data sets (SCEC approved encoding format, SCEC approved descriptive metadata, SCEC approved collection organization, and SCEC managed storage).
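A minimal sketch of the collection-organization stage described above, assuming a hypothetical record layout (none of these field names come from SCEC): each digital entity carries uniform content descriptors and belongs to a logical collection that a library service can group for browsing.

```python
from dataclasses import dataclass, field

# Illustrative only: a digital entity with an identifier, a logical collection,
# and a dictionary of uniform content descriptors (physical quantity, units, ...).
@dataclass
class DigitalEntity:
    identifier: str
    collection: str
    descriptors: dict = field(default_factory=dict)

library = [
    DigitalEntity("scec:run-001", "ground-motion",
                  {"quantity": "peak velocity", "units": "m/s"}),
    DigitalEntity("scec:run-002", "ground-motion",
                  {"quantity": "peak acceleration", "units": "m/s^2"}),
]

# Group entities by logical collection, as a browsing service might.
by_collection = {}
for e in library:
    by_collection.setdefault(e.collection, []).append(e.identifier)

print(by_collection["ground-motion"])   # -> ['scec:run-001', 'scec:run-002']
```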

  19. Radiation risk models for all solid cancers other than those types of cancer requiring individual assessments after a nuclear accident

    Energy Technology Data Exchange (ETDEWEB)

    Walsh, Linda [Federal Office for Radiation Protection, Department "Radiation Protection and Health", Oberschleissheim (Germany); University of Zurich, Medical Physics Group, Institute of Physics, Zurich (Switzerland); Zhang, Wei [Public Health England, Centre for Radiation, Chemical and Environmental Hazards, Oxford (United Kingdom)

    2016-03-15

    In the assessment of health risks after nuclear accidents, some health consequences require special attention. For example, in their 2013 report on health risk assessment after the Fukushima nuclear accident, the World Health Organisation (WHO) panel of experts considered risks of breast cancer, thyroid cancer and leukaemia. For these specific cancer types, use was made of already published excess relative risk (ERR) and excess absolute risk (EAR) models for radiation-related cancer incidence fitted to the epidemiological data from the Japanese A-bomb Life Span Study (LSS). However, it was also considered important to assess all other types of solid cancer together and the WHO, in their above-mentioned report, stated "No model to calculate the risk for all other solid cancer excluding breast and thyroid cancer risks is available from the LSS data". Applying the LSS models for all solid cancers along with the models for the specific sites means that some cancers have an overlap in the risk evaluations. Thus, calculating the total solid cancer risk plus the breast cancer risk plus the thyroid cancer risk can overestimate the total risk by several per cent. Therefore, the purpose of this paper was to publish the required models for all other solid cancers, i.e. all solid cancers other than those types of cancer requiring special attention after a nuclear accident. The new models presented here have been fitted to the same LSS data set from which the risks provided by the WHO were derived. Although it is known already that the EAR and ERR effect modifications by sex are statistically significant for the outcome "all solid cancer", it is shown here that sex modification is not statistically significant for the outcome "all solid cancer other than thyroid and breast cancer". It is also shown here that the sex-averaged solid cancer risks with and without the sex modification are very similar once breast and thyroid cancers are factored out.
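The double-counting issue the abstract describes can be illustrated with invented numbers (these are not fitted LSS risks): adding site-specific breast and thyroid risks on top of a model for all solid cancers counts those two sites twice, whereas adding them to an "all other solid cancers" model does not.

```python
# Hypothetical excess-risk figures (illustrative only, not fitted values),
# e.g. excess cases per 10,000 persons.
err_all_solid = 100.0    # model for ALL solid cancers (includes breast, thyroid)
err_breast = 4.0         # site-specific breast cancer model
err_thyroid = 2.0        # site-specific thyroid cancer model
err_other_solid = 94.0   # model for solid cancers EXCLUDING breast and thyroid

naive_total = err_all_solid + err_breast + err_thyroid      # double counts two sites
correct_total = err_other_solid + err_breast + err_thyroid  # no overlap

overestimate_pct = 100 * (naive_total - correct_total) / correct_total
print(round(overestimate_pct, 1))   # -> 6.0
```

With these illustrative figures the naive sum overstates the total risk by 6%, consistent with the abstract's "several per cent" scale of overestimation.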

  20. Radiation risk models for all solid cancers other than those types of cancer requiring individual assessments after a nuclear accident

    International Nuclear Information System (INIS)

    Walsh, Linda; Zhang, Wei

    2016-01-01

    In the assessment of health risks after nuclear accidents, some health consequences require special attention. For example, in their 2013 report on health risk assessment after the Fukushima nuclear accident, the World Health Organisation (WHO) panel of experts considered risks of breast cancer, thyroid cancer and leukaemia. For these specific cancer types, use was made of already published excess relative risk (ERR) and excess absolute risk (EAR) models for radiation-related cancer incidence fitted to the epidemiological data from the Japanese A-bomb Life Span Study (LSS). However, it was also considered important to assess all other types of solid cancer together and the WHO, in their above-mentioned report, stated "No model to calculate the risk for all other solid cancer excluding breast and thyroid cancer risks is available from the LSS data". Applying the LSS models for all solid cancers along with the models for the specific sites means that some cancers have an overlap in the risk evaluations. Thus, calculating the total solid cancer risk plus the breast cancer risk plus the thyroid cancer risk can overestimate the total risk by several per cent. Therefore, the purpose of this paper was to publish the required models for all other solid cancers, i.e. all solid cancers other than those types of cancer requiring special attention after a nuclear accident. The new models presented here have been fitted to the same LSS data set from which the risks provided by the WHO were derived. Although it is known already that the EAR and ERR effect modifications by sex are statistically significant for the outcome "all solid cancer", it is shown here that sex modification is not statistically significant for the outcome "all solid cancer other than thyroid and breast cancer". It is also shown here that the sex-averaged solid cancer risks with and without the sex modification are very similar once breast and thyroid cancers are factored out. 
Some other notable model