WorldWideScience

Sample records for modeling efforts required

  1. The minimum effort required to eradicate infections in models with backward bifurcation

    NARCIS (Netherlands)

    Safan, M.; Heesterbeek, J.A.P.; Dietz, K.

    2006-01-01

    We study an epidemiological model which assumes that the susceptibility after a primary infection is r times the susceptibility before a primary infection. For r = 0 (r = 1) this is the SIR (SIS) model. For r > 1 + (μ/α) this model shows backward bifurcations, where μ is the death rate and α is the
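
A minimal numerical sketch of the model class described above, with assumed equations: recovered individuals are r times as susceptible as naive ones, so r = 0 recovers the SIR model and r = 1 the SIS model. Parameter names (beta, alpha, mu) follow the abstract; all values are illustrative.

```python
# Reinfection model (assumed formulation): susceptibility after a primary
# infection is r times the susceptibility before it. r = 0 -> SIR,
# r = 1 -> SIS. Forward-Euler integration with illustrative parameters.

def simulate(r, beta=3.0, alpha=1.0, mu=0.02, steps=20000, dt=0.01):
    S, I, R = 0.99, 0.01, 0.0          # susceptible, infected, recovered
    for _ in range(steps):
        new_inf = beta * S * I         # primary infections
        re_inf = r * beta * R * I      # reinfections of recovered hosts
        dS = mu - mu * S - new_inf
        dI = new_inf + re_inf - (alpha + mu) * I
        dR = alpha * I - re_inf - mu * R
        S, I, R = S + dt * dS, I + dt * dI, R + dt * dR
    return S, I, R

S, I, R = simulate(r=0.0)   # SIR limit; settles at a low endemic level
```

The total population S + I + R stays at 1 because births (mu) balance deaths, which is a quick sanity check on any implementation of this family of models.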

  2. Examining Requirements Change Rework Effort: A Study

    CERN Document Server

    Chua, Bee Bee; 10.5121/ijsea.2010.1304

    2010-01-01

    Although software managers are generally good at estimating new projects, their experience of scheduling rework tends to be poor. Inconsistent or incorrect effort estimation can increase the risk that the completion time for a project will be problematic. Continually altering schedules during software maintenance is a daunting task. Our proposed framework, validated in a case study, confirms that the variables resulting from requirements changes suffer from a number of problems, e.g., the coding used, end-user involvement and user documentation. Our results clearly show a significant impact on rework effort from unexpected errors that correlate with 1) weak characteristics and attributes in the program's source lines of code, especially in data declarations and data statements, 2) lack of communication between developers and users on the effects of a change, and 3) unavailability of user documentation. To keep rework effort under control, new criteria in change request forms...

  3. Analysis of Empirical Software Effort Estimation Models

    CERN Document Server

    Basha, Saleem

    2010-01-01

    Reliable effort estimation remains an ongoing challenge for software engineers. Accurate effort estimation is the state of the art of software engineering; it is the preliminary phase between the client and the business enterprise. The relationship between the client and the enterprise begins with the estimation of the software, and the client's credibility with the enterprise increases with accurate estimation. Effort estimation often requires generalizing from a small number of historical projects, and generalization from such limited experience is an inherently under-constrained problem. Accurate estimation is a complex process because it amounts to software effort prediction and, as the term indicates, a prediction never becomes an actual value. This work follows the basics of the empirical software effort estimation models. The goal of this paper is to study empirical software effort estimation. The primary conclusion is that no single technique is best for all sit...
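
One concrete instance of the empirical model family surveyed above is a COCOMO-style power law, effort = a * size**b, fitted by least squares in log space. The (KLOC, person-month) pairs below are invented for illustration; they are not from the paper's data.

```python
import math

# Fit a COCOMO-style power law, effort = a * size**b, to invented
# (KLOC, person-months) project data via ordinary least squares in
# log space: log(effort) = log(a) + b * log(size).

projects = [(10, 24), (25, 70), (50, 160), (100, 380)]

xs = [math.log(size) for size, _ in projects]
ys = [math.log(effort) for _, effort in projects]
n = len(projects)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
     / sum((x - xbar) ** 2 for x in xs))
a = math.exp(ybar - b * xbar)

def estimate(kloc):
    """Predicted effort in person-months for a project of `kloc` KLOC."""
    return a * kloc ** b
```

For the toy data, b comes out a little above 1, i.e. effort grows slightly faster than linearly with size, the diseconomy-of-scale regime such empirical models typically assume.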

  4. Prosocial apathy for helping others when effort is required.

    Science.gov (United States)

    Lockwood, Patricia L; Hamonet, Mathilde; Zhang, Samuel H; Ratnavel, Anya; Salmony, Florentine U; Husain, Masud; Apps, Matthew A J

    2017-07-01

    Prosocial acts - those that are costly to ourselves but benefit others - are a central component of human co-existence [1-3]. While the financial and moral costs of prosocial behaviours are well understood [4-6], everyday prosocial acts do not typically come at such costs. Instead, they require effort. Here, using computational modelling of an effort-based task we show that people are prosocially apathetic. They are less willing to choose to initiate highly effortful acts that benefit others compared to benefitting themselves. Moreover, even when choosing to initiate effortful prosocial acts, people show superficiality, exerting less force into actions that benefit others than themselves. These findings replicated, were present when the other was anonymous or not, and when choices were made to earn rewards or avoid losses. Importantly, the least prosocially motivated people had higher subclinical levels of psychopathy and social apathy. Thus, although people sometimes 'help out', they are less motivated to benefit others and sometimes 'superficially prosocial', which may characterise everyday prosociality and its disruption in social disorders.

  5. Manage changes in the requirements definition through a collaborative effort

    CSIR Research Space (South Africa)

    Joseph-Malherbe, S

    2009-08-01

    Full Text Available Updating or changing the requirements statement during the systems engineering process may adversely impact project parameters such as sequence, dependencies, effort, and duration of tasks, usually with an increase in development time and cost...

  6. 34 CFR 361.62 - Maintenance of effort requirements.

    Science.gov (United States)

    2010-07-01

    ... PROGRAM Financing of State Vocational Rehabilitation Programs § 361.62 Maintenance of effort requirements... provides for the construction of a facility for community rehabilitation program purposes, the amount of... for the construction of a facility for community rehabilitation program purposes or the...

  7. Statistical Modeling Efforts for Headspace Gas

    Energy Technology Data Exchange (ETDEWEB)

    Weaver, Brian Phillip [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-03-17

    The purpose of this document is to describe the statistical modeling effort for gas concentrations in WIPP storage containers. The concentration (in ppm) of CO2 in the headspace volume of standard waste box (SWB) 68685 is shown. A Bayesian approach and an adaptive Metropolis-Hastings algorithm were used.
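
As a hedged illustration of the sampling machinery mentioned above, here is a plain (non-adaptive) random-walk Metropolis-Hastings sampler for a single parameter, e.g. a mean log-concentration. The target below is a toy standard-normal log-posterior, not the actual WIPP headspace model or its adaptive algorithm.

```python
import math
import random

# Random-walk Metropolis-Hastings on a toy standard-normal log-posterior.
# A symmetric Gaussian proposal means the acceptance ratio reduces to the
# posterior-density ratio.

def log_post(theta):
    return -0.5 * theta * theta        # log N(0, 1), up to a constant

def metropolis(n=20000, step=1.0, seed=1):
    rng = random.Random(seed)
    theta, samples = 0.0, []
    for _ in range(n):
        prop = theta + rng.gauss(0.0, step)            # symmetric proposal
        if math.log(rng.random()) < log_post(prop) - log_post(theta):
            theta = prop                               # accept; else keep theta
        samples.append(theta)
    return samples

draws = metropolis()
posterior_mean = sum(draws) / len(draws)   # close to 0 for this target
```

An adaptive variant, as used in the report, would additionally tune `step` on the fly from the observed acceptance rate.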

  8. Hierarchy, Dominance, and Deliberation: Egalitarian Values Require Mental Effort.

    Science.gov (United States)

    Van Berkel, Laura; Crandall, Christian S; Eidelman, Scott; Blanchar, John C

    2015-09-01

    Hierarchy and dominance are ubiquitous. Because social hierarchy is early learned and highly rehearsed, the value of hierarchy enjoys relative ease over competing egalitarian values. In six studies, we interfere with deliberate thinking and measure endorsement of hierarchy and egalitarianism. In Study 1, bar patrons' blood alcohol content was correlated with hierarchy preference. In Study 2, cognitive load increased the authority/hierarchy moral foundation. In Study 3, low-effort thought instructions increased hierarchy endorsement and reduced equality endorsement. In Study 4, ego depletion increased hierarchy endorsement and caused a trend toward reduced equality endorsement. In Study 5, low-effort thought instructions increased endorsement of hierarchical attitudes among those with a sense of low personal power. In Study 6, participants thinking quickly allocated more resources to high-status groups. Across five operationalizations of impaired deliberative thought, hierarchy endorsement increased and egalitarianism receded. These data suggest hierarchy may persist in part because it has a psychological advantage. © 2015 by the Society for Personality and Social Psychology, Inc.

  9. Efforts and models of education for parents

    DEFF Research Database (Denmark)

    Jensen, Niels Rosendal

    2010-01-01

    The article reviews models of parent education that are primarily used in Denmark. It embeds the models in some broader perspectives on the education system and the current discourse on holding parents responsible. Publication date: March 2010...

  10. Examining the antecedents of challenge and threat states: the influence of perceived required effort and support availability.

    Science.gov (United States)

    Moore, Lee J; Vine, Samuel J; Wilson, Mark R; Freeman, Paul

    2014-08-01

    To date, limited research has explicitly examined the antecedents of challenge and threat states proposed by the biopsychosocial model. Thus, the aim of the present study was to examine the influence of perceived required effort and support availability on demand/resource evaluations, challenge and threat states, and motor performance. A 2 (required effort; high, low)×2 (support availability; available, not available) between-subjects design was used with one hundred and twenty participants randomly assigned to one of four experimental conditions. Participants received instructions designed to manipulate perceptions of required effort and support availability before demand/resource evaluations and cardiovascular responses were assessed. Participants then performed the novel motor task (laparoscopic surgery) while performance was recorded. Participants in the low perceived required effort condition evaluated the task as more of a challenge (i.e., resources outweighed demands), exhibited a cardiovascular response more indicative of a challenge state (i.e., higher cardiac output and lower total peripheral resistance), and performed the task better (i.e., quicker completion time) than those in the high perceived required effort condition. However, perceptions of support availability had no significant impact on participants' demand/resource evaluations, cardiovascular responses, or performance. Furthermore, there was no significant interaction effect between perceptions of required effort and support availability. The findings suggest that interventions aimed at promoting a challenge state should include instructions that help individuals perceive that the task is not difficult and requires little physical and mental effort to perform effectively.

  11. Performance Analysis of Software Effort Estimation Models Using Neural Networks

    Directory of Open Access Journals (Sweden)

    P.Latha

    2013-08-01

    Full Text Available Software effort estimation involves estimating the effort required to develop software. Cost overruns and schedule overruns occur in software development due to wrong estimates made during its initial stages. Proper estimation is essential for the successful completion of software development. Many estimation techniques are available, among which neural-network-based techniques play a prominent role. The back-propagation network is the most widely used architecture. The Elman neural network, a recurrent network, can be used on par with the back-propagation network. For a good predictor system, the difference between estimated effort and actual effort should be as low as possible. Data from historic NASA projects are used for training and testing. The experimental results confirm that the back-propagation algorithm is more efficient than the Elman neural network.
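
The back-propagation approach evaluated above can be sketched as a one-hidden-layer sigmoid network trained by stochastic gradient descent on a made-up, normalised (size, effort) dataset. Architecture, data and hyperparameters are illustrative assumptions, not the paper's NASA setup.

```python
import math
import random

# One-hidden-layer network trained by back-propagation (SGD on squared
# error). Inputs and targets are normalised to [0, 1]; the data are toy.

random.seed(0)
data = [(0.1, 0.15), (0.3, 0.35), (0.5, 0.55), (0.9, 0.85)]
H = 4                                   # hidden units
w1 = [random.uniform(-1, 1) for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x):
    h = [sigmoid(w1[i] * x + b1[i]) for i in range(H)]
    return h, sum(w2[i] * h[i] for i in range(H)) + b2

def mse():
    return sum((forward(x)[1] - y) ** 2 for x, y in data) / len(data)

lr, before = 0.3, mse()
for _ in range(5000):
    for x, y in data:
        h, out = forward(x)
        err = out - y                   # gradient of 0.5 * err**2 w.r.t. out
        for i in range(H):
            grad_pre = err * w2[i] * h[i] * (1 - h[i])
            w2[i] -= lr * err * h[i]
            w1[i] -= lr * grad_pre * x
            b1[i] -= lr * grad_pre
        b2 -= lr * err
after = mse()                           # far below `before` after training
```

An Elman network, as compared in the paper, would add a context layer feeding the previous hidden state back into the hidden layer; the training loop is otherwise similar.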

  12. Towards an Early Software Effort Estimation Based on Functional and Non-Functional Requirements

    NARCIS (Netherlands)

    Kassab, M.; Daneva, Maia; Ormanjieva, Olga; Abran, A.; Braungarten, R.; Dumke, R.; Cuadrado-Gallego, J.; Brunekreef, J.

    2009-01-01

    The increased awareness of the non-functional requirements as a key to software project and product success makes explicit the need to include them in any software project effort estimation activity. However, the existing approaches to defining size-based effort relationships still pay insufficient

  13. Electrophysiological correlates of listening effort: neurodynamical modeling and measurement.

    Science.gov (United States)

    Strauss, Daniel J; Corona-Strauss, Farah I; Trenado, Carlos; Bernarding, Corinna; Reith, Wolfgang; Latzel, Matthias; Froehlich, Matthias

    2010-06-01

    An increased listening effort represents a major problem for humans with hearing impairment. Neurodiagnostic methods for an objective estimation of listening effort might support hearing instrument fitting procedures. However, the cognitive neurodynamics of listening effort is far from being understood, and its neural correlates have not yet been identified. In this paper we analyze the cognitive neurodynamics of listening effort by using methods of forward neurophysical modeling and time-scale electroencephalographic neurodiagnostics. In particular, we present a forward neurophysical model for auditory late responses (ALRs) as large-scale correlates of listening effort. Here, endogenously driven top-down projections related to listening effort are mapped to corticothalamic feedback pathways previously analyzed for the neurodynamics of selective attention. We show that this model represents well the time-scale phase stability analysis of experimental electroencephalographic data from auditory discrimination paradigms. We conclude that the proposed neurophysical and neuropsychological framework is appropriate for the analysis of listening effort and might help to develop objective electroencephalographic methods for its estimation in the future.

  14. The Mental Effort-Reward Imbalances Model and Its Implications for Behaviour Management

    Science.gov (United States)

    Poulton, Alison; Whale, Samina; Robinson, Joanne

    2016-01-01

    Attention deficit hyperactivity disorder (ADHD) is frequently associated with oppositional defiant disorder (ODD). The Mental Effort Reward Imbalances Model (MERIM) explains this observational association as follows: in ADHD a disproportionate level of mental effort is required for sustaining concentration for achievement; in ODD the subjective…

  15. City Logistics Modeling Efforts: Trends and Gaps - A Review

    NARCIS (Netherlands)

    Anand, N.R.; Quak, H.J.; Van Duin, J.H.R.; Tavasszy, L.A.

    2012-01-01

    In this paper, we present a review of city logistics modeling efforts reported in the literature for urban freight analysis. The review framework takes into account the diversity and complexity found in the present-day city logistics practice. Next, it covers the different aspects in the modeling se

  16. 29 CFR 1620.16 - Jobs requiring equal effort in performance.

    Science.gov (United States)

    2010-07-01

    ... factors which cause mental fatigue and stress, as well as those which alleviate fatigue, are to be... portion of her time to performing fill-in work requiring greater dexterity—such as rearranging displays of spices or other small items. The difference in kind of effort required of the employees does not appear...

  17. Testing agile requirements models

    Institute of Scientific and Technical Information of China (English)

    BOTASCHANJAN Jewgenij; PISTER Markus; RUMPE Bernhard

    2004-01-01

    This paper discusses a model-based approach to validating software requirements in agile development processes by simulation and, in particular, automated testing. The use of models as the central development artifact needs to be added to the portfolio of software engineering techniques to further increase the efficiency and flexibility of development, beginning as early as the requirements definition phase. Testing requirements is one of the most important techniques for giving feedback and increasing the quality of the result. Therefore, the testing of artifacts should be introduced as early as possible, even in the requirements definition phase.

  18. Efforts - Final technical report on task 4. Physical modelling validation

    DEFF Research Database (Denmark)

    Andreasen, Jan Lasson; Olsson, David Dam; Christensen, T. W.

    The present report documents the work carried out in Task 4 at DTU, Physical modelling validation, on the Brite/Euram project No. BE96-3340, contract No. BRPR-CT97-0398, with the title Enhanced Framework for forging design using reliable three-dimensional simulation (EFFORTS). The report...

  19. 34 CFR 403.182 - What is the maintenance of fiscal effort requirement?

    Science.gov (United States)

    2010-07-01

    ... EDUCATION PROGRAM What Financial Conditions Must Be Met by a State? § 403.182 What is the maintenance of... 34 Education 3 2010-07-01 2010-07-01 false What is the maintenance of fiscal effort requirement? 403.182 Section 403.182 Education Regulations of the Offices of the Department of Education...

  20. Adaptive Reward Pursuit: How Effort Requirements Affect Unconscious Reward Responses and Conscious Reward Decisions

    NARCIS (Netherlands)

    Bijleveld, E.H.; Custers, R.; Aarts, H.A.G.

    2012-01-01

    When in pursuit of rewards, humans weigh the value of potential rewards against the amount of effort that is required to attain them. Although previous research has generally conceptualized this process as a deliberate calculation, recent work suggests that rudimentary mechanisms operating without conscious intervention play an important role...

  1. Adaptive Reward Pursuit: How Effort Requirements Affect Unconscious Reward Responses and Conscious Reward Decisions

    Science.gov (United States)

    Bijleveld, Erik; Custers, Ruud; Aarts, Henk

    2012-01-01

    When in pursuit of rewards, humans weigh the value of potential rewards against the amount of effort that is required to attain them. Although previous research has generally conceptualized this process as a deliberate calculation, recent work suggests that rudimentary mechanisms--operating without conscious intervention--play an important role as…

  2. Requirements for guidelines systems: implementation challenges and lessons from existing software-engineering efforts.

    Science.gov (United States)

    Shah, Hemant; Allard, Raymond D; Enberg, Robert; Krishnan, Ganesh; Williams, Patricia; Nadkarni, Prakash M

    2012-03-09

    A large body of work in the clinical guidelines field has identified requirements for guideline systems, but there are formidable challenges in translating such requirements into production-quality systems that can be used in routine patient care. Detailed analysis of requirements from an implementation perspective can be useful in helping define sub-requirements to the point where they are implementable. Further, additional requirements emerge as a result of such analysis. During such an analysis, study of examples of existing software-engineering efforts in non-biomedical fields can provide useful signposts to the implementer of a clinical guideline system. In addition to requirements described by guideline-system authors, comparative reviews of such systems, and publications discussing information needs for guideline systems and clinical decision support systems in general, we have incorporated additional requirements related to production-system robustness and functionality from publications in the business workflow domain, as well as drawing on our own experience in the development of the Proteus guideline system (http://proteme.org). The sub-requirements are discussed by conveniently grouping them into the categories used by the review of Isern and Moreno (2008). We cite previous work under each category and then provide sub-requirements under each category, and provide examples of similar software-engineering efforts that have addressed a similar problem in a non-biomedical context. When analyzing requirements from the implementation viewpoint, knowledge of successes and failures in related software-engineering efforts can guide implementers in the choice of effective design and development strategies.

  3. How backyard poultry flocks influence the effort required to curtail avian influenza epidemics in commercial poultry flocks.

    Science.gov (United States)

    Smith, G; Dunipace, S

    2011-06-01

    This paper summarizes the evidence that the contribution of backyard poultry flocks to the on-going transmission dynamics of an avian influenza epidemic in commercial flocks is modest at best. Nevertheless, while disease control strategies need not involve the backyard flocks, an analysis of the contribution of each element of the next generation matrix to the basic reproduction number indicates that models that ignore the contribution of backyard flocks in estimating the effort required of strategies focused on one host type (e.g. commercial flocks only) necessarily underestimate the level of effort, to an extent that may matter to policy makers.
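
The next-generation-matrix argument can be made concrete with invented numbers: R0 is the dominant eigenvalue of a matrix K whose entry K[i][j] is the expected number of type-i flocks infected by one type-j flock, and dropping the backyard entries lowers the estimated R0 and hence the apparent control effort.

```python
# Two host types: 0 = commercial, 1 = backyard. The transmission values
# in K are invented for illustration, not estimates from the paper.

def dominant_eigenvalue(K):
    (a, b), (c, d) = K
    tr, det = a + d, a * d - b * c
    return (tr + (tr * tr - 4 * det) ** 0.5) / 2

K_full = [[1.8, 0.3],
          [0.4, 0.5]]
K_commercial_only = [[1.8, 0.0],
                     [0.0, 0.0]]        # backyard contribution ignored

R0_full = dominant_eigenvalue(K_full)
R0_reduced = dominant_eigenvalue(K_commercial_only)
# R0_full > R0_reduced: ignoring backyard flocks underestimates R0, and
# with it the control effort (roughly a fraction 1 - 1/R0 of transmission
# must be prevented) needed to curtail the epidemic.
```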

  4. Modeling to Mars: a NASA Model Based Systems Engineering Pathfinder Effort

    Science.gov (United States)

    Phojanamongkolkij, Nipa; Lee, Kristopher A.; Miller, Scott T.; Vorndran, Kenneth A.; Vaden, Karl R.; Ross, Eric P.; Powell, Bobby C.; Moses, Robert W.

    2017-01-01

    The NASA Engineering Safety Center (NESC) Systems Engineering (SE) Technical Discipline Team (TDT) initiated the Model Based Systems Engineering (MBSE) Pathfinder effort in FY16. The goals and objectives of the MBSE Pathfinder include developing and advancing MBSE capability across NASA, applying MBSE to real NASA issues, and capturing issues and opportunities surrounding MBSE. The Pathfinder effort consisted of four teams, with each team addressing a particular focus area. This paper focuses on Pathfinder team 1 with the focus area of architectures and mission campaigns. These efforts covered the timeframe of February 2016 through September 2016. The team was comprised of eight team members from seven NASA Centers (Glenn Research Center, Langley Research Center, Ames Research Center, Goddard Space Flight Center IV&V Facility, Johnson Space Center, Marshall Space Flight Center, and Stennis Space Center). Collectively, the team had varying levels of knowledge, skills and expertise in systems engineering and MBSE. The team applied their existing and newly acquired system modeling knowledge and expertise to develop modeling products for a campaign (Program) of crew and cargo missions (Projects) to establish a human presence on Mars utilizing In-Situ Resource Utilization (ISRU). Pathfinder team 1 developed a subset of modeling products that are required for a Program System Requirement Review (SRR)/System Design Review (SDR) and Project Mission Concept Review (MCR)/SRR as defined in NASA Procedural Requirements. Additionally, Team 1 was able to perform and demonstrate some trades and constraint analyses. At the end of these efforts, over twenty lessons learned and recommended next steps have been identified.

  5. How backyard poultry flocks influence the effort required to curtail avian influenza epidemics in commercial poultry flocks

    OpenAIRE

    2011-01-01

    This paper summarizes the evidence that the contribution of backyard poultry flocks to the on-going transmission dynamics of an avian influenza epidemic in commercial flocks is modest at best. Nevertheless, while disease control strategies need not involve the backyard flocks, an analysis of the contribution of each element of the next generation matrix to the basic reproduction number indicates that models that ignore the contribution of backyard flocks in estimating the effort required of...

  6. Linking effort and fishing mortality in a mixed fisheries model

    DEFF Research Database (Denmark)

    Thøgersen, Thomas Talund; Hoff, Ayoe; Frost, Hans Staby

    2012-01-01

    in fish stocks has led to overcapacity in many fisheries, leading to incentives for overfishing. Recent research has shown that the allocation of effort among fleets can play an important role in mitigating overfishing when the targeting covers a range of species (multi-species—i.e., so-called mixed...... fisheries), while simultaneously optimising the overall economic performance of the fleets. The so-called FcubEcon model, in particular, has elucidated both the biologically and economically optimal method for allocating catches—and thus effort—between fishing fleets, while ensuring that the quotas...

  7. Moving the mountain: analysis of the effort required to transform comparative anatomy into computable anatomy.

    Science.gov (United States)

    Dahdul, Wasila; Dececchi, T Alexander; Ibrahim, Nizar; Lapp, Hilmar; Mabee, Paula

    2015-01-01

    The diverse phenotypes of living organisms have been described for centuries, and though they may be digitized, they are not readily available in a computable form. Using over 100 morphological studies, the Phenoscape project has demonstrated that by annotating characters with community ontology terms, links between novel species anatomy and the genes that may underlie them can be made. But given the enormity of the legacy literature, how can this largely unexploited wealth of descriptive data be rendered amenable to large-scale computation? To identify the bottlenecks, we quantified the time involved in the major aspects of phenotype curation as we annotated characters from the vertebrate phylogenetic systematics literature. This involves attaching fully computable logical expressions consisting of ontology terms to the descriptions in character-by-taxon matrices. The workflow consists of: (i) data preparation, (ii) phenotype annotation, (iii) ontology development and (iv) curation team discussions and software development feedback. Our results showed that the completion of this work required two person-years by a team of two post-docs, a lead data curator, and students. Manual data preparation required close to 13% of the effort. This part in particular could be reduced substantially with better community data practices, such as depositing fully populated matrices in public repositories. Phenotype annotation required ∼40% of the effort. We are working to make this more efficient with Natural Language Processing tools. Ontology development (40%), however, remains a highly manual task requiring domain (anatomical) expertise and use of specialized software. The large overhead required for data preparation and ontology development contributed to a low annotation rate of approximately two characters per hour, compared with 14 characters per hour when activity was restricted to character annotation. Unlocking the potential of the vast stores of morphological

  8. Increased effort requirements and risk sensitivity: a comparison of delay and magnitude manipulations.

    Science.gov (United States)

    Kirshenbaum, Ari P.; Szalda-Petree, Allen D.; Haddad, Nabil F.

    2003-03-31

    Reward magnitude and delay to reward were independently manipulated in two separate experiments examining risk-sensitive choice in rats. A dual-running wheel apparatus was used and the tangential force resistance required to displace both wheels was low (50g) for half of the subjects, and high (120g) for the remaining subjects. Concurrent FI30-s and FI60-s schedules delivered equivalent amounts of food reward per unit time (i.e. 5 and 10 pellets of food, respectively), and these conditions served as the baseline treatment for all subjects. Variability, either in reward magnitude or delay, was introduced on the long-delay (60s) schedule during the second phase. All subjects were returned to the baseline condition in the third phase, and variability was introduced on the short-delay (30s) interval schedule during phase four. The subjects were again returned to the baseline condition in the fifth and final phase, ultimately yielding a five-phase ABACA design. Original baseline performance was characterized by a slight short-delay interval preference, and this pattern of performance was recovered with each subsequent presentation of the baseline condition. Overall, the data obtained from the reward magnitude and delay-to-reward manipulations were indistinguishable; subjects experiencing low-response effort requirement behaved in a risk-indifferent manner and subjects experiencing high-response effort requirement preferred the variable schedule. Implications for the daily energy budget rule on risk-sensitive foraging are discussed in light of these findings.

  9. Finding the right balance between groundwater model complexity and experimental effort via Bayesian model selection

    Science.gov (United States)

    Schöniger, Anneli; Illman, Walter A.; Wöhling, Thomas; Nowak, Wolfgang

    2015-12-01

    Groundwater modelers face the challenge of how to assign representative parameter values to the studied aquifer. Several approaches are available to parameterize spatial heterogeneity in aquifer parameters. They differ in their conceptualization and complexity, ranging from homogeneous models to heterogeneous random fields. While it is common practice to invest more effort into data collection for models with a finer resolution of heterogeneities, there is a lack of guidance on how much data is required to justify a certain level of model complexity. In this study, we propose to use concepts related to Bayesian model selection to identify this balance. We demonstrate our approach on the characterization of a heterogeneous aquifer via hydraulic tomography in a sandbox experiment (Illman et al., 2010). We consider four increasingly complex parameterizations of hydraulic conductivity: (1) Effective homogeneous medium, (2) geology-based zonation, (3) interpolation by pilot points, and (4) geostatistical random fields. First, we investigate the shift in justified complexity with increasing amount of available data by constructing a model confusion matrix. This matrix indicates the maximum level of complexity that can be justified given a specific experimental setup. Second, we determine which parameterization is most adequate given the observed drawdown data. Third, we test how the different parameterizations perform in a validation setup. The results of our test case indicate that aquifer characterization via hydraulic tomography does not necessarily require (or justify) a geostatistical description. Instead, a zonation-based model might be a more robust choice, but only if the zonation is geologically adequate.
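
A toy version of the question above, with BIC as a rough stand-in for the Bayesian model comparison: does a two-zone parameterization earn its extra mean parameter over a homogeneous model? The observations are invented two-level "drawdown" values, not the sandbox data.

```python
import math

# Compare a homogeneous (one-mean) model against a two-zone (two-mean)
# model of the same observations using BIC (lower is better).

obs = [2.1, 1.9, 2.0, 2.2, 3.9, 4.1, 4.0, 3.8]

def bic(groups):
    """BIC of a model with one mean parameter per group of indices."""
    k, n, rss = len(groups), len(obs), 0.0
    for g in groups:
        mean = sum(obs[i] for i in g) / len(g)
        rss += sum((obs[i] - mean) ** 2 for i in g)
    sigma2 = rss / n                     # maximum-likelihood noise variance
    loglik = -0.5 * n * (math.log(2 * math.pi * sigma2) + 1)
    return k * math.log(n) - 2 * loglik

homogeneous = bic([list(range(8))])
zoned = bic([[0, 1, 2, 3], [4, 5, 6, 7]])
# With clearly two-level data, the zoned model attains the lower BIC;
# with noisier data, the extra parameter would not be justified.
```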

  10. Recent efforts to model human diseases in vivo in Drosophila.

    Science.gov (United States)

    Pfleger, Cathie M; Reiter, Lawrence T

    2008-01-01

    Upon completion of sequencing the Drosophila genome, it was estimated that 61% of human disease-associated genes had sequence homologs in flies, and in some diseases such as cancer, the number was as high as 68%. We now know that as many as 75% of the genes associated with genetic disease have counterparts in Drosophila. Using better tools for mutation detection, association studies and whole genome analysis the number of human genes associated with genetic disease is steadily increasing. These detection efforts are outpacing the ability to assign function and understand the underlying cause of the disease at the molecular level. Drosophila models can therefore advance human disease research in a number of ways by: establishing the normal role of these gene products during development, elucidating the mechanism underlying disease pathology, and even identifying candidate therapeutic agents for the treatment of human disease. At the 49th Annual Drosophila Research Conference in San Diego this year, a number of labs presented their exciting findings on Drosophila models of human disease in both platform presentations and poster sessions. Here we can only briefly review some of these developments, and we apologize that we do not have the time or space to review all of the findings presented which use Drosophila to understand human disease etiology.

  11. Global Efforts in the Development of Vaccines for Tuberculosis: Requirements for Improved Vaccines Against Mycobacterium tuberculosis.

    Science.gov (United States)

    Méndez-Samperio, P

    2016-10-01

    Currently, more than 9.0 million people develop acute pulmonary tuberculosis (TB) each year, and about 1.5 million people worldwide die from this infection. Thus, developing vaccines to prevent active TB disease remains a priority. This article discusses recent progress in the development of new vaccines against TB and focusses on the main requirements for improved vaccines against Mycobacterium tuberculosis (M. tb). Over the last two decades, significant progress has been made in TB vaccine development, and some TB vaccine candidates have now completed a phase III clinical trial. The potential public health benefits of these vaccines are attainable, but realizing them will require much more effort, including new global governance investment in this research.

  12. A Study on System Availability Vs System Administration Efforts with Mathematical Models

    Institute of Scientific and Technical Information of China (English)

    郑建德

    2003-01-01

    Two mathematical models are developed in this paper to study the effectiveness of system administration efforts in improving system availability. They are based on the assumption that a computer system in operation passes through a transitional state before it is brought down by hardware or software problems; with intensified system administration effort, it is possible to discover and fix the problems in time to bring the system back to the normal state before it goes down. A Markov chain is used to simulate the transitions between system states. The conclusion is that increasing system administration effort may be a cost-effective way to meet requirements for moderate improvements in system availability, but higher demands still have to be met by advanced technologies.
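    The transition structure described above can be sketched as a small continuous-time Markov chain. This is an illustrative reconstruction, not the paper's actual model: the three states, the transition rates, and all numbers below are hypothetical.

```python
import numpy as np

# Hypothetical three-state availability model: normal -> transitional (rate lam),
# transitional -> normal (rate mu_t, admin catches the problem in time),
# transitional -> down (rate fail), down -> normal (rate mu_d, full repair).
# Raising mu_t models intensified system administration effort.

def steady_state(lam, mu_t, fail, mu_d):
    """Return long-run probabilities [P(normal), P(transitional), P(down)]."""
    Q = np.array([
        [-lam,        lam,            0.0],
        [ mu_t, -(mu_t + fail),      fail],
        [ mu_d,        0.0,        -mu_d],
    ])
    # Solve pi @ Q = 0 subject to sum(pi) = 1 via least squares
    A = np.vstack([Q.T, np.ones(3)])
    b = np.array([0.0, 0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

low_effort = steady_state(lam=0.1, mu_t=0.5, fail=0.5, mu_d=0.2)
high_effort = steady_state(lam=0.1, mu_t=2.0, fail=0.5, mu_d=0.2)
# Availability = 1 - P(down); more admin effort (larger mu_t) raises it
print(1 - low_effort[2], 1 - high_effort[2])
```

    With these hypothetical rates, quadrupling the repair rate out of the transitional state raises steady-state availability, matching the paper's qualitative conclusion.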

  13. Army Training: Efforts to Adjust Training Requirements Should Consider the Use of Virtual Training Devices

    Science.gov (United States)

    2016-08-01

    ...training needs contributed to program cost increases significant enough to require the program to be re-categorized in terms of its size and scope. ... training strategies, the Army risks missing opportunities to increase usage of the devices during training. View GAO-16-636. For more information, contact ... training uses computer models and simulations to exercise command and staff functions. Gaming is the use of technology employing commercial or government...

  14. Linking effort and fishing mortality in a mixed fisheries model

    DEFF Research Database (Denmark)

    Thøgersen, Thomas Talund; Hoff, Ayoe; Frost, Hans Staby

    2012-01-01

    in fish stocks has led to overcapacity in many fisheries, leading to incentives for overfishing. Recent research has shown that the allocation of effort among fleets can play an important role in mitigating overfishing when the targeting covers a range of species (multi-species—i.e., so-called mixed...

  15. Suggestion Program and Model Installation Program - Duplication of Effort.

    Science.gov (United States)

    1988-04-01

    REPORT NUMBER 88-26... TITLE: Suggestion Program and Model Installation Program - Duplication of Effort. AUTHOR(S): Major Donald ... Trowbridge, USAF. FACULTY ADVISOR: Major Steve L. Hansen, CSC... [front-matter and table-of-contents residue omitted: MIP Evaluation Process, p. 13; Figure 3, USAF MIP Growth, p. 17] EXECUTIVE SUMMARY: ...the study centers on program processes for submitting and evaluating proposals. The Suggestion Program and MIP processes are similar in that they both

  16. Renewable energy data requirements: A review of user opinions and data collection efforts

    Energy Technology Data Exchange (ETDEWEB)

    Stevenson, G.G.

    1991-11-01

    Interest in the contribution of renewable energy to US energy supply is growing. This interest stems from environmental and energy security concerns and the desire to develop domestic resources. In order to plan for the use of renewable energy, data are essential to a variety of users both inside and outside the government. The purpose of this study is to identify priorities and requirements for gathering different types of renewable energy data. Results of this study are to be used by the US Department of Energy, Energy Information Administration (EIA), in planning and evaluating its ongoing and future renewable energy information programs. The types of renewable energy addressed in this study include biomass (wood, agricultural residues, and crops grown for energy), municipal solid waste, geothermal energy, solar energy, and wind. To assess the relative importance of different types of information, we reviewed existing renewable energy data collection efforts and asked the opinions of renewable energy data users. Individuals in government, private industry, research organizations, industry trade associations, and public interest research groups were contacted and questioned about particular renewable energy data items. An analysis of their responses provides the basis for the conclusions in this report. The types of information about which we asked each respondent included: resource stock and flow information; quantities of energy inputs (e.g., wood) and outputs (e.g., electricity, heat); energy input and output costs and prices; numbers, location, and production capacities of energy conversion facilities; quantities and costs of energy conversion equipment; and quantities of pollutant emissions from energy conversion. 5 refs., 25 tabs.

  17. A polynomial model of patient-specific breathing effort during controlled mechanical ventilation.

    Science.gov (United States)

    Redmond, Daniel P; Docherty, Paul D; Yeong Shiong Chiew; Chase, J Geoffrey

    2015-08-01

    Patient breathing efforts occurring during controlled ventilation cause perturbations in pressure data, which lead to erroneous parameter estimation in conventional models of respiratory mechanics. A polynomial model of patient effort can be used to capture breath-specific effort and the underlying lung condition. Iterative multiple linear regression is used to identify the model from clinical volume-controlled data. The polynomial model has lower fitting error and more stable estimates of respiratory elastance and resistance in the presence of patient effort than the conventional single-compartment model. However, the polynomial model can converge to poor parameter estimates when patient efforts occur very early in the breath or last a long time. The model of patient effort can provide clinical benefit by providing accurate respiratory mechanics estimation and monitoring of breath-to-breath patient effort, which clinicians can use to guide treatment.
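    A minimal sketch of the idea, assuming the conventional single-compartment relation P = E*V + R*Q + P0 augmented with a polynomial effort term. The data, polynomial order, and coefficients below are synthetic, and this is not the authors' implementation (which uses an iterative regression on clinical data).

```python
import numpy as np

# Augmented single-compartment model: P = E*V + R*Q + P0 + sum(a_k * t^k).
# All coefficients are identified jointly by linear least squares.

def fit_polynomial_effort(t, P, V, Q, order=2):
    cols = [V, Q, np.ones_like(t)] + [t**k for k in range(1, order + 1)]
    A = np.column_stack(cols)
    theta, *_ = np.linalg.lstsq(A, P, rcond=None)
    E, R, P0 = theta[0], theta[1], theta[2]
    return E, R, P0, theta[3:]  # last entries are the effort polynomial

# Synthetic breath with varying flow (hypothetical mechanics values)
t = np.linspace(0, 1, 100)
Q = 0.3 + 0.4 * np.sqrt(t)            # flow, L/s
V = np.cumsum(Q) * (t[1] - t[0])      # volume by numerical integration, L
true_E, true_R, true_P0 = 25.0, 10.0, 5.0
effort = -2.0 * t + 2.0 * t**2        # simulated patient-effort perturbation
P = true_E * V + true_R * Q + true_P0 + effort

E, R, P0, a = fit_polynomial_effort(t, P, V, Q, order=2)
print(round(E, 2), round(R, 2), round(P0, 2))
```

    On this exactly-specified synthetic breath the regression recovers elastance, resistance, and the effort coefficients; with real data the fit is only approximate.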

  18. Modeling Requirements for Cohort and Register IT.

    Science.gov (United States)

    Stäubert, Sebastian; Weber, Ulrike; Michalik, Claudia; Dress, Jochen; Ngouongo, Sylvie; Stausberg, Jürgen; Winter, Alfred

    2016-01-01

    The project KoRegIT (funded by TMF e.V.) aimed to develop a generic catalog of requirements for research networks such as cohort studies and registers (KoReg). The catalog supports such research networks in building up and managing their organizational and IT infrastructure. The aim was to make transparent the complex relationships between requirements, which are described in use cases in a given text catalog; by analyzing and modeling the requirements, a better understanding and optimization of the catalog are intended. There are two subgoals: a) to investigate one cohort study and two registers and to model the current state of their IT infrastructure; b) to analyze the current-state models and to find simplifications within the generic catalog. Processing the generic catalog was performed by means of text extraction, conceptualization, and concept mapping. Then methods of enterprise architecture planning (EAP) are used to model the extracted information. For objective a), questionnaires were developed utilizing the model. They were used for semi-structured interviews, whose results were evaluated via qualitative content analysis; afterwards the current state was modeled. Objective b) was accomplished by model analysis. A given generic text catalog of requirements was transferred into a model. As the result of objective a), current-state models of one existing cohort study and two registers were created and analyzed. An optimized model called the KoReg-reference-model is the result of objective b). It is possible to use methods of EAP to model requirements. This enables a better overview of the partly connected requirements by means of visualization. The model-based approach also enables the analysis and comparison of the empirical data from the current-state models. Information managers could reduce the effort of planning the IT infrastructure by utilizing the KoReg-reference-model. Modeling the current state and the generation of reports from the model, which could be used as

  19. Revised Use Case Point (Re-UCP) Model for Software Effort Estimation

    Directory of Open Access Journals (Sweden)

    Mudasir Manzoor Kirmani

    2015-03-01

    At present, the most challenging issue the software development industry encounters is inefficient management of software development budget projections. This problem has put modern software development companies in a situation where they are dealing with improper requirements engineering, ambiguous resource elicitation, and uncertain cost and effort estimation. An indispensable countermeasure for any software development company is to subject the whole development process to a proper and efficient estimation process, in which estimates of all resources can be made well in advance in order to check whether the conceived project is feasible and within the available resources. The basic building block in any object-oriented design is the Use Case diagram, which is prepared in the early stages of design after the requirements are clearly understood. Use Case diagrams are considered useful for approximating estimates for software development projects. This research work gives a detailed overview of the Re-UCP (Revised Use Case Point) method of effort estimation for software projects. The Re-UCP method is a modified approach based on the UCP method of effort estimation. In this study, 14 projects were subjected to effort estimation using the Re-UCP method, and the results were compared with the UCP and e-UCP models. The comparison of the 14 projects shows that Re-UCP significantly outperformed the existing UCP and e-UCP effort estimation techniques.
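    For context, the baseline Use Case Point calculation that Re-UCP revises can be sketched as follows, using Karner's published weights (5/10/15 for simple/average/complex use cases, 1/2/3 for actors, 20 person-hours per UCP). The project counts and factor values below are hypothetical.

```python
# Standard (Karner) Use Case Points sketch; not the Re-UCP revision itself.

def use_case_points(simple_uc, average_uc, complex_uc,
                    simple_actors, average_actors, complex_actors,
                    tcf=1.0, ecf=1.0):
    uucw = 5 * simple_uc + 10 * average_uc + 15 * complex_uc        # use-case weight
    uaw = 1 * simple_actors + 2 * average_actors + 3 * complex_actors  # actor weight
    return (uucw + uaw) * tcf * ecf  # adjusted UCP

# Hypothetical project: 5 simple, 10 average, 3 complex use cases;
# 2 simple, 2 average, 1 complex actor; technical/environmental factors assumed
ucp = use_case_points(5, 10, 3, 2, 2, 1, tcf=0.95, ecf=1.05)
effort_hours = ucp * 20  # Karner's 20 person-hours per UCP
print(round(ucp, 2), round(effort_hours, 1))
```

    Re-UCP and e-UCP modify the weights and adjustment factors of this baseline; the structure of the calculation stays the same.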

  20. A comparison between the effort-reward imbalance and demand control models.

    Science.gov (United States)

    Ostry, Aleck S; Kelly, Shona; Demers, Paul A; Mustard, Cameron; Hertzman, Clyde

    2003-02-27

    To compare the predictive validity of the demand/control and effort-reward imbalance models, alone and in combination, for self-reported health status and the self-reported presence of any chronic disease condition. Self-reports of psychosocial work conditions were obtained in a sample of sawmill workers using the demand/control and effort-reward imbalance models. The relative predictive validity of task-level control was compared with that of effort-reward imbalance, and the predictive validity of a model combining task-level control with effort-reward imbalance was determined. Logistic regression was used for all models. The demand/control and effort-reward imbalance models independently predicted poor self-reported health status. The effort-reward imbalance model predicted the presence of a chronic disease while the demand/control model did not. A model combining effort-reward imbalance and task-level control was a better predictor of self-reported health status and any chronic condition than either model alone. Effort-reward imbalance modeled with intrinsic effort had marginally better predictive validity than when modeled with extrinsic effort only. Future work should explore the combined effects of these two models of psychosocial stress at work on health more thoroughly.

  1. An experience report on ERP effort estimation driven by quality requirements

    NARCIS (Netherlands)

    Erasmus, Pierre; Daneva, Maya; Schockert, Sixten

    2015-01-01

    Producing useful and accurate project effort estimates is highly dependable on the proper definition of the project scope. In the ERP service industry, the scope of an ERP service project is determined by desired needs which are driven by certain quality attributes that the client expects to be

  3. Colloids and Radionuclide Transport: A Field, Experimental and Modeling Effort

    Science.gov (United States)

    Zhao, P.; Zavarin, M.; Sylwester, E. E.; Allen, P. G.; Williams, R. W.; Kersting, A. B.

    2002-05-01

    Batch sorption/desorption experiments with natural inorganic colloids (clinoptilolite; colloid particle size 171 ± 25 nm) were conducted in synthetic groundwater (similar to J-13, the Yucca Mountain standard) over a pH range from 4 to 10 and an initial plutonium concentration of 10^-9 M. The results show that Pu(IV) sorption takes place within an hour, while the rate of Pu(V) sorption onto the colloids is much slower and mineral dependent. The kinetic results from the batch sorption/desorption experiments, coupled with the redox kinetics of plutonium in solution, will be used in geochemical modeling of Pu surface complexation to colloids and reactive transport. (This work was performed under the auspices of the U.S. Department of Energy by the University of California, Lawrence Livermore National Laboratory, under contract No. W-7405-Eng-48.)

  4. Efforts and Models of Education for Parents: the Danish Approach

    Directory of Open Access Journals (Sweden)

    Rosendal Jensen, Niels

    2009-12-01

    to underline that Danish welfare policy has been changing rather radically. The classic model understood welfare as social assurance and/or social distribution, based on social solidarity. The modern model treats welfare as social service and/or social investment. This means that citizens are changing role, from user and/or citizen to consumer and/or investor. The Danish state, in line with decisions taken by the government, is investing in a national future shaped by global competition. The new models of welfare, "service" and "investment", imply severe changes in hitherto known concepts of family life, the relationship between parents and children, etc. As an example, the investment model points to a new implementation of the relationship between social rights and the rights of freedom. The service model has demonstrated the weakness that access to qualified services in the fields of health or education is becoming more and more dependent on private purchasing power. The weakness of the investment model is that it represents a sort of "the winner takes it all", since a political majority is enabled to set agendas in societal fields formerly protected by tripartite power and the citizens' rights of freedom. The outcome of the Danish development seems to be the establishment of a politically governed public service industry which, on the one hand, is capable of competing on market conditions and, on the other, can be governed by contracts. This represents a new form of close linking of politics, economy, and professional work. Attempts at controlling education, pedagogy, and thereby the population are not a recent invention; in European history we could easily point to several such experiments. The real news is the linking of political priorities to the exercise of public activities through economic incentives: defining visible goals for public servants, introducing measurement of achievements and

  5. APPLYING TEACHING-LEARNING TO ARTIFICIAL BEE COLONY FOR PARAMETER OPTIMIZATION OF SOFTWARE EFFORT ESTIMATION MODEL

    Directory of Open Access Journals (Sweden)

    THANH TUNG KHUAT

    2017-05-01

    Artificial Bee Colony, inspired by the foraging behaviour of honey bees, is a novel meta-heuristic optimization algorithm in the community of swarm intelligence algorithms. Nevertheless, it still falls short in speed of convergence and quality of solutions. This paper proposes an approach to tackle these downsides by combining the positive aspects of Teaching-Learning based optimization and Artificial Bee Colony. The performance of the proposed method is assessed on the software effort estimation problem, a complex and important issue in project management. Software developers often carry out effort estimation in the early stages of the software development life cycle to derive the required cost and schedule for a project. There are a large number of methods for effort estimation, of which COCOMO II is one of the most widely used models. However, this model has some limitations because its parameters have not been optimized. In this work, therefore, we present an approach to overcome this limitation of the COCOMO II model. The experiments were conducted on a NASA software project dataset, and the obtained results indicated that the improved parameters provided better estimation capabilities compared to the original COCOMO II model.
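    The COCOMO II post-architecture effort equation whose parameters such a search tunes can be sketched as below. The A and B constants are the published COCOMO II.2000 calibration; the scale-factor and multiplier values are hypothetical inputs, not the optimized ones from the paper.

```python
# COCOMO II post-architecture effort sketch:
#   PM = A * KSLOC^E * product(EM_i),  E = B + 0.01 * sum(SF_j)
# A meta-heuristic such as the paper's TLBO/ABC hybrid would search over
# A, B (and multiplier values) to minimize estimation error on a dataset.

def cocomo2_effort(ksloc, scale_factors, effort_multipliers, A=2.94, B=0.91):
    E = B + 0.01 * sum(scale_factors)   # exponent from the five scale factors
    pm = A * ksloc ** E                 # nominal person-months
    for em in effort_multipliers:       # apply effort multipliers
        pm *= em
    return pm

# Hypothetical 50 KSLOC project with nominal-ish ratings
sf = [3.72, 3.04, 4.24, 3.29, 4.68]
em = [1.0, 0.87, 1.17]
pm = cocomo2_effort(50, sf, em)
print(round(pm, 1))
```

    An optimizer would treat A, B (and possibly each multiplier) as decision variables and score candidates by, e.g., mean magnitude of relative error against actual project efforts.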

  6. Evaluation of Arroyo Channel Restoration Efforts using Hydrological Modeling: Rancho San Bernardino, Sonora, MX

    Science.gov (United States)

    Jemison, N. E.; DeLong, S.; Henderson, W. M.; Adams, J.

    2012-12-01

    In the drylands of the southwestern U.S. and northwestern Mexico, historical river channel incision (arroyo cutting) has led to the destruction of riparian ecological systems and ciénega wetlands in many locations. Along Silver Creek on the Arizona-Sonora border, the Cuenca Los Ojos Foundation has been installing rock gabions and concrete and earthen berms with the goal of slowing flash floods, raising groundwater levels, and refilling arroyo channels with sediment in an area that changed from a broad, perennially wet ciénega to a narrow sand- and gravel-dominated arroyo channel with an average depth of ~6 m. The engineering efforts aim to restore desert wetlands, regrow riparian vegetation, and promote sediment deposition along the arroyo floor. Hydrological modeling allows us to predict how rare flood events interact with the restoration efforts and may guide future approaches to dryland ecological restoration. This modeling is complemented by detailed topographic surveying and the use of streamflow sensors to monitor hydrological processes in the restoration project. We evaluate the inundation associated with modeled 10-, 50-, 100-, 500-, and 1,000-year floods through the study area using the FLO-2D and HEC-RAS modeling environments in order to evaluate the possibility of returning surface inundation to the former ciénega surface. According to HEC-RAS model predictions, given the current channel configuration, it would require a 500-year flood to overtop the channel banks and reinundate the ciénega (now terrace) surface, though the 100-year flood may lead to limited terrace surface inundation. Based on our models, 10-year floods were ~2 m from overtopping the arroyo walls, 50-year floods came ~1.5 m from overtopping, 100-year floods were ~1.2 m from overtopping, and 500- and 1,000-year floods at least partially inundated the ciénega surface. The current topography of Silver Creek does not allow for frequent flooding of the former ciénega; model predictions

  7. Nuclear Hybrid Energy Systems FY16 Modeling Efforts at ORNL

    Energy Technology Data Exchange (ETDEWEB)

    Cetiner, Sacit M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Greenwood, Michael Scott [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Harrison, Thomas J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Qualls, A. L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Guler Yigitoglu, Askin [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Fugate, David W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-01

    A nuclear hybrid system uses a nuclear reactor as the basic power generation unit. The power generated by the nuclear reactor is utilized by one or more power customers as thermal power, electrical power, or both. In general, a nuclear hybrid system couples the nuclear reactor to at least one thermal power user in addition to the power conversion system. The definition and architecture of a particular nuclear hybrid system are flexible, depending on local market needs and opportunities. For example, locations in need of potable water may be best served by coupling a desalination plant to the nuclear system. Similarly, an area near oil refineries may have a need for emission-free hydrogen production. A nuclear hybrid system expands the nuclear power plant from its more familiar central-station role by diversifying its immediately and directly connected customer base. The definition, design, analysis, and optimization work currently performed with respect to nuclear hybrid systems represents the work of three national laboratories. Idaho National Laboratory (INL) is the lead laboratory, working with Argonne National Laboratory (ANL) and Oak Ridge National Laboratory (ORNL). Each laboratory provides modeling and simulation expertise for the integration of the hybrid system.

  8. Agent Based Multiviews Requirements Model

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Based on current research in viewpoint-oriented requirements engineering and intelligent agents, we present the concept of the viewpoint agent and its abstract model, built on a meta-language for multi-view requirements engineering. This provides a basis for consistency checking and integration of requirements from different viewpoints; at the same time, this checking and integration can be realized automatically by virtue of an intelligent agent's autonomy, proactiveness, and social ability. Finally, we illustrate the practical application of the model with a case study of a data flow diagram.

  9. Hospital outbreak control requires joint efforts from hospital management, microbiology and infection control.

    Science.gov (United States)

    Ransjö, U; Lytsy, B; Melhus, A; Aspevall, O; Artinger, C; Eriksson, B-M; Günther, G; Hambraeus, A

    2010-09-01

    An outbreak of multidrug-resistant Klebsiella pneumoniae producing the extended-spectrum beta-lactamase CTX-M15 affected 247 mainly elderly patients in more than 30 wards of a 1000-bed Swedish teaching hospital between May 2005 and August 2007. A manual search of the hospital administrative records for possible contacts between cases in wards and outpatient settings revealed a complex chain of transmission. Faecal screening identified twice as many cases as cultures from clinical samples. Transmission occurred by direct and indirect patient-to-patient contact, facilitated by patient overcrowding. Interventions included formation of a steering group with economic power, increased bed numbers, better compliance with alcohol hand disinfection and the hospital dress code, better hand hygiene for patients, and improved cleaning. The cost of the interventions was estimated at €3 million. Special infection control policies were not necessary, but resources were needed to make existing policies possible to follow, and for educational efforts to improve compliance. Copyright 2010 The Hospital Infection Society. Published by Elsevier Ltd. All rights reserved.

  10. AN ENHANCED MODEL TO ESTIMATE EFFORT, PERFORMANCE AND COST OF THE SOFTWARE PROJECTS

    Directory of Open Access Journals (Sweden)

    M. Pauline

    2013-04-01

    The authors have proposed a model that first captures the fundamentals of software metrics in Phase 1, consisting of three primitive primary software engineering metrics: person-months (PM), function points (FP), and lines of code (LOC). Phase 2 consists of the proposed function point, which is obtained by grouping the adjustment factors to simplify the adjustment process and to ensure more consistency in the adjustments. In the proposed method, fuzzy logic is used to quantify the quality of requirements, which is added as one of the adjustment factors; thus a fuzzy-based approach using the Enhanced General System Characteristics to estimate effort of software projects via productivity has been obtained. Phase 3 takes the calculated function point and gives it as input to static single-variable models (the Intermediate COCOMO and COCOMO II) for cost estimation. The authors have tailored the cost factors in Intermediate COCOMO, and both cost and scale factors in COCOMO II, to suit the individual development environment, which is very important for the accuracy of the cost estimates. The software performance indicators (project duration, schedule predictability, requirements completion ratio, and post-release defect density) are also measured for the software projects in this work. A comparative study of effort, performance measurement, and cost estimation of software projects is made between the existing model and the authors' proposed work. Thus this work analyzes the interactional process through which the estimation tasks were collectively accomplished.
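    As background, the standard IFPUG-style adjustment that the authors regroup can be sketched as follows: the unadjusted function point count is scaled by a Value Adjustment Factor built from fourteen general system characteristics rated 0-5. The ratings below are hypothetical, and this omits the paper's fuzzy-logic extension.

```python
# Classic function-point adjustment sketch (IFPUG-style VAF), the baseline
# that the proposed grouped/fuzzy adjustment modifies.

def adjusted_function_points(unadjusted_fp, gsc_ratings):
    """gsc_ratings: fourteen general system characteristics, each rated 0-5."""
    assert len(gsc_ratings) == 14 and all(0 <= g <= 5 for g in gsc_ratings)
    vaf = 0.65 + 0.01 * sum(gsc_ratings)   # VAF ranges from 0.65 to 1.35
    return unadjusted_fp * vaf

# Hypothetical project: 120 unadjusted FP, all characteristics rated "average"
fp = adjusted_function_points(120, [3] * 14)
print(fp)
```

    The adjusted count then feeds a model such as COCOMO, as in the paper's Phase 3, after conversion to an equivalent size measure.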

  11. Echoes from the past: a healthy Baltic Sea requires more effort.

    Science.gov (United States)

    Kotilainen, Aarno T; Arppe, Laura; Dobosz, Slawomir; Jansen, Eystein; Kabel, Karoline; Karhu, Juha; Kotilainen, Mia M; Kuijpers, Antoon; Lougheed, Bryan C; Meier, H E Markus; Moros, Matthias; Neumann, Thomas; Porsche, Christian; Poulsen, Niels; Rasmussen, Peter; Ribeiro, Sofia; Risebrobakken, Bjørg; Ryabchuk, Daria; Schimanke, Semjon; Snowball, Ian; Spiridonov, Mikhail; Virtasalo, Joonas J; Weckström, Kaarina; Witkowski, Andrzej; Zhamoida, Vladimir

    2014-02-01

    Integrated sediment multiproxy studies and modeling were used to reconstruct past changes in the Baltic Sea ecosystem. Reconstructions of natural changes in the Baltic Sea ecosystem over the past 6000 years suggest that forecasted climate warming might enhance the environmental problems of the Baltic Sea. Integrated modeling and sediment proxy studies reveal increased sea surface temperatures and expanded seafloor anoxia (in deep basins) during earlier natural warm climate phases, such as the Medieval Climate Anomaly. Under future IPCC scenarios of global warming, bottom-water conditions in the Baltic Sea are unlikely to improve. Thus, the measures already designed to produce a healthier Baltic Sea are insufficient in the long term. The interactions between climate change and anthropogenic impacts on the Baltic Sea should be considered in management, in the implementation of policy strategies on Baltic Sea environmental issues, and in adaptation to future climate change.

  12. Digital Avionics Information System (DAIS): Training Requirements Analysis Model (TRAMOD).

    Science.gov (United States)

    Czuchry, Andrew J.; And Others

    The training requirements analysis model (TRAMOD) described in this report represents an important portion of the larger effort called the Digital Avionics Information System (DAIS) Life Cycle Cost (LCC) Study. TRAMOD is the second of three models that comprise an LCC impact modeling system for use in the early stages of system development. As…

  13. Does the incremental shuttle walk test require maximal effort in young obese women?

    Directory of Open Access Journals (Sweden)

    S.P. Jürgensen

    2016-01-01

    Obesity is a chronic disease with a multifaceted treatment approach that includes nutritional counseling, structured exercise training, and increased daily physical activity. Increased body mass elicits higher cardiovascular, ventilatory and metabolic demands to varying degrees during exercise. With functional capacity assessment, this variability can be evaluated so that individualized guidance for exercise training and daily physical activity can be provided. The aim of the present study was to compare cardiovascular, ventilatory and metabolic responses obtained during a symptom-limited cardiopulmonary exercise test (CPX) on a treadmill to responses obtained by the incremental shuttle walk test (ISWT) in obese women, and to propose a peak oxygen consumption (VO2) prediction equation using variables obtained during the ISWT. Forty obese women (BMI ≥30 kg/m²) performed one treadmill CPX and two ISWTs. Heart rate (HR), arterial blood pressure (ABP) and perceived exertion by the Borg scale were measured at rest, during each stage of the exercise protocol, and throughout the recovery period. The predicted maximal heart rate (HRmax) was calculated (210 - age in years) (16) and compared to the HR response during the CPX. Peak VO2 obtained during CPX correlated significantly (P<0.05) with ISWT peak VO2 (r=0.79) as well as with ISWT distance (r=0.65). The predictive model for CPX peak VO2, using age and ISWT distance, explained 67% of the variability. The current study indicates the ISWT may be used to predict aerobic capacity in obese women when CPX is not a viable option.
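    The form of such a prediction model can be sketched as an ordinary least-squares fit of peak VO2 on age and ISWT distance. The data below are synthetic and the fitted coefficients are hypothetical, not the published equation.

```python
import numpy as np

# Sketch of a two-predictor linear model: VO2peak ~ b0 + b1*age + b2*distance.
# Synthetic data only; the true study equation and coefficients differ.

def fit_vo2_model(age, distance, vo2):
    X = np.column_stack([np.ones_like(age), age, distance])
    beta, *_ = np.linalg.lstsq(X, vo2, rcond=None)
    pred = X @ beta
    ss_res = np.sum((vo2 - pred) ** 2)
    ss_tot = np.sum((vo2 - vo2.mean()) ** 2)
    return beta, 1 - ss_res / ss_tot   # coefficients and R^2

rng = np.random.default_rng(0)
age = rng.uniform(20, 45, 40)          # years, 40 hypothetical subjects
distance = rng.uniform(200, 700, 40)   # ISWT distance, m
vo2 = 25 - 0.1 * age + 0.02 * distance + rng.normal(0, 1, 40)
beta, r2 = fit_vo2_model(age, distance, vo2)
# beta holds [intercept, age coefficient, distance coefficient]
```

    In the study, the analogous regression on age and ISWT distance explained 67% of the variance in CPX peak VO2.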

  14. [Psychometric properties of the French version of the Effort-Reward Imbalance model].

    Science.gov (United States)

    Niedhammer, I; Siegrist, J; Landre, M F; Goldberg, M; Leclerc, A

    2000-10-01

    Two main models are currently used to evaluate psychosocial factors at work: the Job Strain model developed by Karasek and the Effort-Reward Imbalance model. A French version of the first model has been validated for the dimensions of psychological demands and decision latitude. As regards the second, which evaluates three dimensions (extrinsic effort, reward, and intrinsic effort), there are several versions in different languages, but until recently there was no validated French version. The objective of this study was to explore the psychometric properties of the French version of the Effort-Reward Imbalance model in terms of internal consistency, factorial validity, and discriminant validity. The present study was based on the GAZEL cohort and included the 10,174 subjects who were working at the French national electric and gas company (EDF-GDF) and answered the questionnaire in 1998. A French version of the Effort-Reward Imbalance questionnaire, obtained by a standard forward/backward translation procedure, was included in this questionnaire. Internal consistency was satisfactory for the three scales of extrinsic effort, reward, and intrinsic effort: Cronbach's alpha coefficients higher than 0.7 were observed. A one-factor solution was retained for the factor analysis of the extrinsic effort scale. A three-factor solution was retained for the factor analysis of reward, and these dimensions were interpreted. The factor analysis of intrinsic effort did not support the expected four-dimension structure. The analysis of discriminant validity displayed significant associations between measures of Effort-Reward Imbalance and the variables of sex, age, education level, and occupational grade. This study is the first supporting satisfactory psychometric properties of the French version of the Effort-Reward Imbalance model. However, the factorial validity of intrinsic effort could be questioned. Furthermore, as most previous studies were based on male samples
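    The internal-consistency statistic reported above (Cronbach's alpha) can be computed as below; the six-item data are synthetic, not the GAZEL responses.

```python
import numpy as np

# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances)/variance of total).
# Values above ~0.7 are conventionally taken as satisfactory consistency.

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of scale item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Synthetic scale: six items driven by one latent factor plus noise
rng = np.random.default_rng(1)
latent = rng.normal(size=(200, 1))
items = latent + 0.5 * rng.normal(size=(200, 6))
alpha = cronbach_alpha(items)
print(round(alpha, 2))
```

    With strongly correlated items, alpha approaches 1; uncorrelated items drive it toward 0, which is why the study's alpha > 0.7 results support the scales' internal consistency.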

  15. Supercomputer and cluster performance modeling and analysis efforts:2004-2006.

    Energy Technology Data Exchange (ETDEWEB)

    Sturtevant, Judith E.; Ganti, Anand; Meyer, Harold (Hal) Edward; Stevenson, Joel O.; Benner, Robert E., Jr.; Goudy, Susan Phelps; Doerfler, Douglas W.; Domino, Stefan Paul; Taylor, Mark A.; Malins, Robert Joseph; Scott, Ryan T.; Barnette, Daniel Wayne; Rajan, Mahesh; Ang, James Alfred; Black, Amalia Rebecca; Laub, Thomas William; Vaughan, Courtenay Thomas; Franke, Brian Claude

    2007-02-01

    This report describes efforts by the Performance Modeling and Analysis Team to investigate performance characteristics of Sandia's engineering and scientific applications on the ASC capability and advanced architecture supercomputers, and Sandia's capacity Linux clusters. Efforts to model various aspects of these computers are also discussed. The goals of these efforts are to quantify and compare Sandia's supercomputer and cluster performance characteristics; to reveal strengths and weaknesses in such systems; and to predict performance characteristics of, and provide guidelines for, future acquisitions and follow-on systems. Described herein are the results obtained from running benchmarks and applications to extract performance characteristics and comparisons, as well as modeling efforts, obtained during the time period 2004-2006. The format of the report, with hypertext links to numerous additional documents, purposefully minimizes the document size needed to disseminate the extensive results from our research.

  16. Adaptive Effort Investment in Cognitive and Physical Tasks: A Neurocomputational Model

    Directory of Open Access Journals (Sweden)

    Tom Verguts

    2015-03-01

    Despite its importance in everyday life, the computational nature of effort investment remains poorly understood. We propose an effort model obtained from optimality considerations, and a neurocomputational approximation to the optimal model. Both are couched in the framework of reinforcement learning. It is shown that choosing when or when not to exert effort can be adaptively learned, depending on rewards, costs, and task difficulty. In the neurocomputational model, the limbic loop comprising anterior cingulate cortex and ventral striatum in the basal ganglia allocates effort to cortical stimulus-action pathways whenever this is valuable. We demonstrate that the model approximates optimality. Next, we consider two hallmark effects from the cognitive control literature, namely proportion congruency and sequential congruency effects. It is shown that the model exerts both proactive and reactive cognitive control. Then, we simulate two physical effort tasks. In line with empirical work, impairing the model's dopaminergic pathway leads to apathetic behavior. Thus, we conceptually unify the exertion of cognitive and physical effort, studied across a variety of literatures (e.g., motivation and cognitive control) and animal species.

  17. Overview 2004 of NASA-Stirling Convertor CFD Model Development and Regenerator R and D Efforts

    Science.gov (United States)

    Tew, Roy C.; Dyson, Rodger W.; Wilson, Scott D.; Demko, Rikako

    2004-01-01

    This paper reports on accomplishments in 2004 in (1) development of Stirling-convertor CFD models at NASA Glenn and via a NASA grant, (2) a Stirling regenerator-research effort being conducted via a NASA grant (a follow-on effort to an earlier DOE contract), and (3) a regenerator-microfabrication contract for development of a "next-generation Stirling regenerator." Cleveland State University is the lead organization for all three grant/contractual efforts, with the University of Minnesota and Gedeon Associates as subcontractors. Also, the Stirling Technology Company and Sunpower, Inc. are both involved in all three efforts, either as funded or unfunded participants. International Mezzo Technologies of Baton Rouge, Louisiana is the regenerator fabricator for the regenerator-microfabrication contract. Results of the efforts in these three areas are summarized.

  18. Evolving Software Effort Estimation Models Using Multigene Symbolic Regression Genetic Programming

    Directory of Open Access Journals (Sweden)

    Sultan Aljahdali

    2013-12-01

    Software has played an essential role in engineering, economic development, stock market growth, and military applications. A mature software industry counts on highly predictive software effort estimation models. Correct estimation of software effort leads to correct estimation of budget and development time; it also allows companies to develop an appropriate time plan for marketing campaigns. Obtaining these estimates has become a great challenge due to the increasing number of attributes that affect the software development life cycle. Software cost estimation models should be able to provide sufficient confidence in their prediction capabilities. Recently, Computational Intelligence (CI) paradigms were explored to handle the software effort estimation problem with promising results. In this paper we evolve two new models for software effort estimation using Multigene Symbolic Regression Genetic Programming (GP). One model utilizes the Source Lines Of Code (SLOC) as input variable to estimate the Effort (E), while the second model utilizes the Inputs, Outputs, Files, and User Inquiries to estimate the Function Points (FP). The proposed GP models show better estimation capabilities compared to other models reported in the literature. The validation results are accepted based on the Albrecht data set.
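    The second model's inputs correspond to Albrecht-style function point counting. As a hypothetical illustration (the paper's evolved GP expressions are not reproduced here, and average-complexity weights are assumed), an unadjusted FP count could be computed as:

```python
# Assumed average-complexity Albrecht weights: inputs x4, outputs x5,
# files x10, inquiries x4. These are illustrative, not taken from the paper.
WEIGHTS = {"inputs": 4, "outputs": 5, "files": 10, "inquiries": 4}

def unadjusted_fp(counts):
    # Multiply each counted element type by its assumed complexity weight.
    return sum(WEIGHTS[k] * counts.get(k, 0) for k in WEIGHTS)

print(unadjusted_fp({"inputs": 10, "outputs": 7, "files": 3, "inquiries": 5}))  # → 125
```

The GP model would then map such FP counts (or SLOC, for the first model) to effort through an evolved symbolic expression.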

  19. Time and Effort Required by Persons with Spinal Cord Injury to Learn to Use a Powered Exoskeleton for Assisted Walking.

    Science.gov (United States)

    Kozlowski, Allan J; Bryce, Thomas N; Dijkers, Marcel P

    2015-01-01

    Powered exoskeletons have been demonstrated as being safe for persons with spinal cord injury (SCI), but little is known about how users learn to manage these devices. To quantify the time and effort required by persons with SCI to learn to use an exoskeleton for assisted walking. A convenience sample was enrolled to learn to use the first-generation Ekso powered exoskeleton to walk. Participants were given up to 24 weekly sessions of instruction. Data were collected on assistance level, walking distance and speed, heart rate, perceived exertion, and adverse events. Time and effort were quantified by the number of sessions required for participants to stand up, walk for 30 minutes, and sit down, initially with minimal and subsequently with contact guard assistance. Of 22 enrolled participants, 9 screen-failed, and 7 had complete data. All of these 7 were men; 2 had tetraplegia and 5 had motor-complete injuries. Of these, 5 participants could stand, walk, and sit with contact guard or close supervision assistance, and 2 required minimal to moderate assistance. Walk times ranged from 28 to 94 minutes with average speeds ranging from 0.11 to 0.21 m/s. For all participants, heart rate changes and reported perceived exertion were consistent with light to moderate exercise. This study provides preliminary evidence that persons with neurological weakness due to SCI can learn to walk with little or no assistance and light to somewhat hard perceived exertion using a powered exoskeleton. Persons with different severities of injury, including those with motor complete C7 tetraplegia and motor incomplete C4 tetraplegia, may be able to learn to use this device.

  20. The intentionality model and language acquisition: engagement, effort, and the essential tension in development.

    Science.gov (United States)

    Bloom, L; Tinker, E

    2001-01-01

    The purpose of the longitudinal research reported in this Monograph was to examine language acquisition in the second year of life in the context of developments in cognition, affect, and social connectedness. The theoretical focus for the research is on the agency of the child and the importance of the child's intentionality for explaining development, rather than on language as an independent object. The model of development for the research is a Model of Intentionality with two components: the engagement in a world of persons and objects that motivates acquiring a language, and the effort that is required to express and articulate increasingly discrepant and elaborate intentional state representations. The fundamental assumption in the model is that the driving force for acquiring language is in the essential tension between engagement and effort for linguistic, emotional, and physical actions of interpretation and expression. Results of lag sequential analyses are reported to show how different behaviors--words, sentences, emotional expressions, conversational interactions, and constructing thematic relations between objects in play--converged, both in the stream of children's actions in everyday events, in real time, and in developmental time between the emergence of words at about 13 months and the transition to simple sentences at about 2 years of age. Patterns of deviation from baseline rates of the different behaviors show that child emotional expression, child speech, and mother speech clearly influence each other, and the mutual influences between them are different at times of either emergence or achievement in both language and object play. The three conclusions that follow from the results of the research are that (a) expression and interpretation are the acts of performance in which language is learned, which means that performance counts for explaining language acquisition; (b) language is not an independent object but is acquired by a child in

  1. Simple capture-recapture models permitting unequal catchability and variable sampling effort.

    Science.gov (United States)

    Agresti, A

    1994-06-01

    We consider two capture-recapture models that imply that the logit of the probability of capture is an additive function of an animal catchability parameter and a parameter reflecting the sampling effort. The models are special cases of the Rasch model, and satisfy the property of quasi-symmetry. One model is log-linear and the other is a latent class model. For the log-linear model, point and interval estimates of the population size are easily obtained using standard software, such as GLIM.
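    The additive logit structure described above can be sketched directly; the log-linear GLIM fit itself is not reproduced, and all parameter values below are purely illustrative:

```python
import math

def p_capture(alpha, beta):
    # Model assumption: logit(p) = alpha (animal catchability) + beta (sampling effort)
    return 1.0 / (1.0 + math.exp(-(alpha + beta)))

def prob_never_caught(alpha, efforts):
    # Probability that an animal with catchability alpha escapes every sample
    prod = 1.0
    for beta in efforts:
        prod *= 1.0 - p_capture(alpha, beta)
    return prod

def n_hat(n_observed, alpha, efforts):
    # Horvitz-Thompson-style population estimate for a homogeneous group:
    # each of the n_observed distinct animals appeared with probability
    # 1 - prob_never_caught.
    return n_observed / (1.0 - prob_never_caught(alpha, efforts))

print(n_hat(60, 0.0, [0.0, 0.0]))  # two samples with p = 0.5 each → 80.0
```

In the actual models, the alpha and beta parameters are estimated jointly from the observed capture histories rather than assumed.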

  2. Reviewing the effort-reward imbalance model: drawing up the balance of 45 empirical studies.

    Science.gov (United States)

    van Vegchel, Natasja; de Jonge, Jan; Bosma, Hans; Schaufeli, Wilmar

    2005-03-01

    The present paper provides a review of 45 studies on the Effort-Reward Imbalance (ERI) Model published from 1986 to 2003 (inclusive). In 1986, the ERI Model was introduced by Siegrist et al. (Biological and Psychological Factors in Cardiovascular Disease, Springer, Berlin, 1986, pp. 104-126; Social Science & Medicine 22 (1986) 247). The central tenet of the ERI Model is that an imbalance between (high) efforts and (low) rewards leads to (sustained) strain reactions. Besides efforts and rewards, overcommitment (i.e., a personality characteristic) is a crucial aspect of the model. Essentially, the ERI Model contains three main assumptions, which could be labeled as (1) the extrinsic ERI hypothesis: high efforts in combination with low rewards increase the risk of poor health, (2) the intrinsic overcommitment hypothesis: a high level of overcommitment may increase the risk of poor health, and (3) the interaction hypothesis: employees reporting an extrinsic ERI and a high level of overcommitment have an even higher risk of poor health. The review showed that the extrinsic ERI hypothesis has gained considerable empirical support. Results for overcommitment remain inconsistent, and the moderating effect of overcommitment on the relation between ERI and employee health has been scarcely examined. Based on these review results, suggestions for future research are proposed.
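    The effort-reward imbalance at the core of the model is conventionally operationalized as a ratio. A minimal sketch, assuming the item counts of the common ERI questionnaire version (6 effort items, 11 reward items), which are not stated in this abstract:

```python
def eri_ratio(effort_score, reward_score, n_effort_items=6, n_reward_items=11):
    # c corrects for the unequal number of effort and reward items,
    # so that balanced answers yield a ratio of 1.
    c = n_effort_items / n_reward_items
    return effort_score / (reward_score * c)

# A ratio above 1 indicates high effort relative to reward (extrinsic ERI).
print(eri_ratio(18, 33))  # → 1.0 (balanced)
```

Studies testing the extrinsic ERI hypothesis typically dichotomize this ratio at 1 or analyze it continuously.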

  3. Effort dynamics in a fisheries bioeconomic model: A vessel level approach through Game Theory

    Directory of Open Access Journals (Sweden)

    Gorka Merino

    2007-09-01

    Red shrimp, Aristeus antennatus (Risso, 1816), is one of the most important resources for the bottom-trawl fleets in the northwestern Mediterranean, in terms of both landings and economic value. A simple bioeconomic model introducing Game Theory for the prediction of effort dynamics at vessel level is proposed. The game is played by the twelve vessels exploiting red shrimp in Blanes. Within the game, two solutions are examined: non-cooperation and cooperation. The first is proposed as a realistic method for the prediction of individual effort strategies and the second is used to illustrate the potential profitability of the analysed fishery. The effort strategy for each vessel is the number of fishing days per year and their objective is profit maximisation: individual profits for the non-cooperative solution and total profits for the cooperative one. In the present analysis, strategic conflicts arise from the differences between vessels in technical efficiency (catchability coefficient) and economic efficiency (defined here). The ten-year and 1000-iteration stochastic simulations performed for the two effort solutions show that the best strategy from both an economic and a conservationist perspective is homogeneous effort cooperation. However, the results under non-cooperation are more similar to the observed data on effort strategies and landings.

  4. VRS Model: A Model for Estimation of Efforts and Time Duration in Development of IVR Software System

    Directory of Open Access Journals (Sweden)

    Devesh Kumar Srivastava

    2012-01-01

    Accurate software effort estimates are critical for developers, team leaders, and project managers. Underestimating the costs may result in management approving proposed systems which then exceed their budgets, with underdeveloped functions and poor quality, and fail to complete on time. Various models have been derived by calculating the effort of a large number of completed software projects from various organizations and applications to explore how project sizes map into project effort. However, the prediction accuracy of these models still needs improvement. New techniques and models to estimate software size, effort, and cost appear constantly, yet accuracy still falls short of company norms and standards. A BPO company takes over a business process of another company; software that handles incoming customer calls, queries, solutions, and services is known as IVR (Interactive Voice Response) software. In this paper the author has proposed a model named "VRS Model" to estimate accurate effort and schedule for IVR software applications. This model will be helpful for project managers, developers, and customers to estimate accurate effort and schedule of IVR projects only.

  5. A technique for estimating maximum harvesting effort in a stochastic fishery model

    Indian Academy of Sciences (India)

    Ram Rup Sarkar; J Chattopadhayay

    2003-06-01

    Exploitation of biological resources and the harvest of population species are commonly practiced in fisheries, forestry and wildlife management. Estimation of maximum harvesting effort has a great impact on the economics of fisheries and other bio-resources. The present paper deals with the problem of a bioeconomic fishery model under environmental variability. A technique for finding the maximum harvesting effort in a fluctuating environment has been developed in a two-species competitive system, which shows that under realistic environmental variability the maximum harvesting effort is less than what is estimated in the deterministic model. This method also enables us to find out the safe regions in the parametric space for which the chance of extinction of the species is minimized. A real life fishery problem has been considered to obtain the inaccessible parameters of the system in a systematic way. Such studies may help resource managers to get an idea for controlling the system.
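    The deterministic benchmark against which such stochastic estimates are compared can be sketched with the classical single-species Schaefer surplus-production model (an assumption for illustration; the paper's two-species stochastic formulation is not reproduced here):

```python
def equilibrium_yield(E, r, K, q):
    # Schaefer model: dN/dt = r*N*(1 - N/K) - q*E*N.
    # At equilibrium N* = K*(1 - q*E/r), giving sustainable yield
    # Y(E) = q*E*K*(1 - q*E/r).
    return q * E * K * (1.0 - q * E / r)

def deterministic_max_effort(r, q):
    # Y(E) is a downward parabola in E, maximized at E_MSY = r / (2*q).
    return r / (2.0 * q)

E_msy = deterministic_max_effort(r=0.8, q=0.01)
print(E_msy)                                    # → 40.0
print(equilibrium_yield(E_msy, 0.8, 1000.0, 0.01))  # → 200.0
```

The paper's result can be read against this benchmark: environmental variability pushes the safe maximum effort below the deterministic E_MSY.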

  6. Automata networks model for alignment and least effort on vocabulary formation

    CERN Document Server

    Vera, Javier; Goles, Eric

    2015-01-01

    Can artificial communities of agents develop language with scaling relations close to the Zipf law? As a preliminary answer to this question, we propose an Automata Networks model of the formation of a vocabulary on a population of individuals, under two in principle opposite strategies: the alignment and the least effort principle. Within the previous account to the emergence of linguistic conventions (specially, the Naming Game), we focus on modeling speaker and hearer efforts as actions over their vocabularies and we study the impact of these actions on the formation of a shared language. The numerical simulations are essentially based on an energy function, that measures the amount of local agreement between the vocabularies. The results suggests that on one dimensional lattices the best strategy to the formation of shared languages is the one that minimizes the efforts of speakers on communicative tasks.

  7. Commonalities in WEPP and WEPS and efforts towards a single erosion process model

    NARCIS (Netherlands)

    Visser, S.M.; Flanagan, D.C.

    2004-01-01

    Since the late 1980's, the Agricultural Research Service (ARS) of the United States Department of Agriculture (USDA) has been developing process-based erosion models to predict water erosion and wind erosion. During much of that time, the development efforts of the Water Erosion Prediction Project (WEPP)


  9. The Effect of the Demand Control and Effort Reward Imbalance Models on the Academic Burnout of Korean Adolescents

    Science.gov (United States)

    Lee, Jayoung; Puig, Ana; Lee, Sang Min

    2012-01-01

    The purpose of this study was to examine the effects of the Demand Control Model (DCM) and the Effort Reward Imbalance Model (ERIM) on academic burnout for Korean students. Specifically, this study identified the effects of the predictor variables based on DCM and ERIM (i.e., demand, control, effort, reward, Demand Control Ratio, Effort Reward…

  10. Incorporating phosphorus cycling into global modeling efforts: a worthwhile, tractable endeavor

    Science.gov (United States)

    Reed, Sasha C.; Yang, Xiaojuan; Thornton, Peter E.

    2015-01-01

    Myriad field, laboratory, and modeling studies show that nutrient availability plays a fundamental role in regulating CO2 exchange between the Earth's biosphere and atmosphere, and in determining how carbon pools and fluxes respond to climatic change. Accordingly, global models that incorporate coupled climate–carbon cycle feedbacks made a significant advance with the introduction of a prognostic nitrogen cycle. Here we propose that incorporating phosphorus cycling represents an important next step in coupled climate–carbon cycling model development, particularly for lowland tropical forests where phosphorus availability is often presumed to limit primary production. We highlight challenges to including phosphorus in modeling efforts and provide suggestions for how to move forward.

  11. Radiation Belt and Plasma Model Requirements

    Science.gov (United States)

    Barth, Janet L.

    2005-01-01

    Contents include the following: Radiation belt and plasma model environment. Environment hazards for systems and humans. Need for new models. How models are used. Model requirements. How can space weather community help?

  12. Model Calibration Efforts for the International Space Station's Solar Array Mast

    Science.gov (United States)

    Elliott, Kenny B.; Horta, Lucas G.; Templeton, Justin D.; Knight, Norman F., Jr.

    2012-01-01

    The International Space Station (ISS) relies on sixteen solar-voltaic blankets to provide electrical power to the station. Each pair of blankets is supported by a deployable boom called the Folding Articulated Square Truss Mast (FAST Mast). At certain ISS attitudes, the solar arrays can be positioned in such a way that shadowing of either one or three longerons causes an unexpected asymmetric thermal loading that, if unchecked, can exceed the operational stability limits of the mast. Work in this paper documents part of an independent NASA Engineering and Safety Center effort to assess the existing operational limits. Because of the complexity of the system, the problem is being worked using a building-block progression from components (longerons), to units (single or multiple bays), to assembly (full mast). The paper presents results from efforts to calibrate the longeron components. The work includes experimental testing of two types of longerons (straight and tapered), development of Finite Element (FE) models, development of parameter uncertainty models, and the establishment of a calibration and validation process to demonstrate adequacy of the models. Models in the context of this paper refer to both FE models and probabilistic parameter models. Results from model calibration of the straight longerons show that the model is capable of predicting the mean load, axial strain, and bending strain. For validation, parameter values obtained from calibration of straight longerons are used to validate experimental results for the tapered longerons.

  13. Software Project Effort Estimation Based on Multiple Parametric Models Generated Through Data Clustering

    Institute of Scientific and Technical Information of China (English)

    Juan J. Cuadrado Gallego; Daniel Rodríguez; Miguel Ángel Sicilia; Miguel Garre Rubio; Angel García Crespo

    2007-01-01

    Parametric software effort estimation models usually consist of only a single mathematical relationship. With the advent of software repositories containing data from heterogeneous projects, these types of models suffer from poor adjustment and predictive accuracy. One possible way to alleviate this problem is the use of a set of mathematical equations obtained through dividing the historical project datasets according to different parameters into subdatasets called partitions. In turn, partitions are divided into clusters that serve as a basis for more accurate models. In this paper, we describe the process, tool, and results of such an approach through a case study using a publicly available repository, ISBSG. Results suggest the adequacy of the technique as an extension of existing single-expression models, without making the estimation process much more complex than one that uses a single estimation model. A tool to support the process is also presented.

  14. Competition for marine space: modelling the Baltic Sea fisheries and effort displacement under spatial restrictions

    DEFF Research Database (Denmark)

    Bastardie, Francois; Nielsen, J. Rasmus; Eigaard, Ole Ritzau

    2015-01-01

    … to fishery and from vessel to vessel. The impact assessment of new spatial plans involving fisheries should be based on quantitative bioeconomic analyses that take into account individual vessel decisions, and trade-offs in cross-sector conflicting interests. We use a vessel-oriented decision-support tool (the DISPLACE model) … various constraints. Interlinked spatial, technical, and biological dynamics of vessels and stocks in the scenarios result in stable profits, which compensate for the additional costs from effort displacement and release pressure on the fish stocks. The effort is further redirected away from sensitive benthic habitats, enhancing the positive ecological effects. The energy efficiency of some of the vessels, however, is strongly reduced with the new zonation, and some of the vessels suffer decreased profits. The DISPLACE model serves as a spatially explicit bioeconomic benchmark tool for management

  15. Quantitative Analysis of the Security of Software-Defined Network Controller Using Threat/Effort Model

    Directory of Open Access Journals (Sweden)

    Zehui Wu

    2017-01-01

    The SDN-based controller, which is responsible for the configuration and management of the network, is the core of Software-Defined Networks. Current methods, which focus on secure mechanisms, use qualitative analysis to estimate the security of controllers, frequently leading to inaccurate results. In this paper, we employ a quantitative approach to overcome this shortcoming. Based on an analysis of the controller threat model, we give formal modeling results for the APIs, the protocol interfaces, and the data items of the controller, and further provide our Threat/Effort quantitative calculation model. With the help of the Threat/Effort model, we are able to compare not only the security of different versions of the same controller but also different kinds of controllers, providing a basis for controller selection and secure development. We evaluated our approach on four widely used SDN-based controllers: POX, OpenDaylight, Floodlight, and Ryu. The test, which shows outcomes similar to those of traditional qualitative analysis, demonstrates that our approach can obtain specific security values for different controllers and presents more accurate results.

  16. Modeling and Evaluating Pilot Performance in NextGen: Review of and Recommendations Regarding Pilot Modeling Efforts, Architectures, and Validation Studies

    Science.gov (United States)

    Wickens, Christopher; Sebok, Angelia; Keller, John; Peters, Steve; Small, Ronald; Hutchins, Shaun; Algarin, Liana; Gore, Brian Francis; Hooey, Becky Lee; Foyle, David C.

    2013-01-01

    NextGen operations are associated with a variety of changes to the national airspace system (NAS) including changes to the allocation of roles and responsibilities among operators and automation, the use of new technologies and automation, additional information presented on the flight deck, and the entire concept of operations (ConOps). In the transition to NextGen airspace, aviation and air operations designers need to consider the implications of design or system changes on human performance and the potential for error. To ensure continued safety of the NAS, it will be necessary for researchers to evaluate design concepts and potential NextGen scenarios well before implementation. One approach for such evaluations is through human performance modeling. Human performance models (HPMs) provide effective tools for predicting and evaluating operator performance in systems. HPMs offer significant advantages over empirical, human-in-the-loop testing in that (1) they allow detailed analyses of systems that have not yet been built, (2) they offer great flexibility for extensive data collection, (3) they do not require experimental participants, and thus can offer cost and time savings. HPMs differ in their ability to predict performance and safety with NextGen procedures, equipment and ConOps. Models also vary in terms of how they approach human performance (e.g., some focus on cognitive processing, others focus on discrete tasks performed by a human, while others consider perceptual processes), and in terms of their associated validation efforts. The objectives of this research effort were to support the Federal Aviation Administration (FAA) in identifying HPMs that are appropriate for predicting pilot performance in NextGen operations, to provide guidance on how to evaluate the quality of different models, and to identify gaps in pilot performance modeling research, that could guide future research opportunities. This research effort is intended to help the FAA

  17. A Framework for Modelling Software Requirements

    Directory of Open Access Journals (Sweden)

    Dhirendra Pandey

    2011-05-01

    Requirements engineering plays an important role in producing quality software products. In recent years, several requirements frameworks have been designed to provide an end-to-end solution for the system development life cycle. Textual requirements specifications are difficult to learn, design, understand, review, and maintain, whereas pictorial modelling is widely recognized as an effective requirements analysis tool. In this paper, we present a requirements modelling framework together with an analysis of modern requirements modelling techniques. We also discuss various domains of requirements engineering with the help of modelling elements such as semantic maps of business concepts, lifecycles of business objects, business processes, business rules, system context diagrams, use cases and their scenarios, constraints, and user interface prototypes. The proposed framework is illustrated with the case study of an inventory management system.

  18. Measuring Effortful Control Using the Children's Behavior Questionnaire-Very Short Form: Modeling Matters.

    Science.gov (United States)

    Backer-Grøndahl, Agathe; Nærde, Ane; Ulleberg, Pål; Janson, Harald

    2016-01-01

    Effortful control (EC) is an important concept in the research on self-regulation in children. We tested 2 alternative factor models of EC as measured by the Children's Behavior Questionnaire-Very Short Form (CBQ-VSF; Putnam & Rothbart, 2006 ) in a large sample of preschoolers (N = 1,007): 1 lower order and 1 hierarchical second-order structure. Additionally, convergent and predictive validity of EC as measured by the CBQ-VSF were investigated. The results supported a hierarchical model. Moderate convergent validity of the second-order latent EC factor was found in that it correlated with compliance and observed EC tasks. Both CBQ-VSF EC measures were also negatively correlated with child physical aggression. The results have implications for the measurement, modeling, and interpretation of EC applying the CBQ.

  19. The NASA-Langley Wake Vortex Modelling Effort in Support of an Operational Aircraft Spacing System

    Science.gov (United States)

    Proctor, Fred H.

    1998-01-01

    Two numerical modelling efforts, one using a large eddy simulation model and the other a numerical weather prediction model, are underway in support of NASA's Terminal Area Productivity program. The large-eddy simulation model (LES) has a meteorological framework and permits the interaction of wake vortices with environments characterized by crosswind shear, stratification, humidity, and atmospheric turbulence. Results from the numerical simulations are being used to assist in the development of algorithms for an operational wake-vortex aircraft spacing system. A mesoscale weather forecast model is being adapted for providing operational forecasts of winds, temperature, and turbulence parameters to be used in the terminal area. This paper describes the goals and modelling approach, as well as achievements obtained to date. Simulation results will be presented from the LES model for both two and three dimensions. The 2-D model is found to be generally valid for studying wake vortex transport, while the 3-D approach is necessary for realistic treatment of decay via interaction of wake vortices and atmospheric boundary layer turbulence. Meteorology is shown to have an important effect on vortex transport and decay. Presented are results showing that wake vortex transport is unaffected by uniform fog or rain, but wake vortex transport can be strongly affected by nonlinear vertical change in the ambient crosswind. Both simulation and observations show that atmospheric vortices decay from the outside with minimal expansion of the core. Vortex decay and the onset of three-dimensional instabilities are found to be enhanced by the presence of ambient turbulence.

  20. Incorporating S-shaped testing-effort functions into NHPP software reliability model with imperfect debugging

    Institute of Scientific and Technical Information of China (English)

    Qiuying Li; Haifeng Li; Minyan Lu

    2015-01-01

    Testing-effort (TE) and imperfect debugging (ID) in the reliability modeling process may further improve the fitting and prediction results of software reliability growth models (SRGMs). To describe the S-shaped varying trend of the TE increasing rate more accurately, first, two S-shaped testing-effort functions (TEFs), i.e., the delayed S-shaped TEF (DS-TEF) and the inflected S-shaped TEF (IS-TEF), are proposed. Then these two TEFs are incorporated into various types (exponential-type, delayed S-shaped and inflected S-shaped) of non-homogeneous Poisson process (NHPP) SRGMs with two forms of ID, respectively, for obtaining a series of new NHPP SRGMs which consider S-shaped TEFs as well as ID. Finally these new SRGMs and several comparison NHPP SRGMs are applied to four real failure data-sets for investigating the fitting and prediction power of these new SRGMs. The experimental results show that: (i) the proposed IS-TEF is more suitable and flexible for describing the consumption of TE than the previous TEFs; (ii) incorporating TEFs into the inflected S-shaped NHPP SRGM may be more effective and appropriate compared with the exponential-type and the delayed S-shaped NHPP SRGMs; (iii) the inflected S-shaped NHPP SRGM considering both the IS-TEF and ID yields more accurate fitting and prediction results than the other comparison NHPP SRGMs.
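    The two TEF names in the abstract follow standard shapes in the SRGM literature. The sketch below uses the common textbook formulas for cumulative testing effort, which may differ in parameterization from the paper's own definitions; all parameter values are illustrative:

```python
import math

def delayed_s_tef(t, alpha, beta):
    # Delayed S-shaped cumulative TE: W(t) = alpha * (1 - (1 + beta*t) * exp(-beta*t)),
    # where alpha is the total expected testing effort.
    return alpha * (1.0 - (1.0 + beta * t) * math.exp(-beta * t))

def inflected_s_tef(t, alpha, beta, psi):
    # Inflected S-shaped cumulative TE: W(t) = alpha * (1 - exp(-beta*t)) / (1 + psi*exp(-beta*t)),
    # where psi controls the inflection point.
    return alpha * (1.0 - math.exp(-beta * t)) / (1.0 + psi * math.exp(-beta * t))
```

In an NHPP SRGM, such a W(t) replaces calendar time in the mean value function, so that detected faults grow with consumed testing effort rather than with elapsed time.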

  1. Differences in Perceived Mental Effort Required and Discomfort during a Working Memory Task between Individuals At-risk And Not At-risk for ADHD

    Science.gov (United States)

    Hsu, Chia-Fen; Eastwood, John D.; Toplak, Maggie E.

    2017-01-01

    Objective: The avoidance of mental effort is a symptom criterion for Attention-Deficit/Hyperactivity Disorder (ADHD), but the experience of mental effort has received relatively little attention in the empirical study of individuals at-risk for ADHD. We explored a novel method to assess the experience of effort and discomfort during a working memory task in a sample of young adults at-risk and not at-risk for ADHD. Method: A sample of 235 undergraduate students (mean age = 21.02 years; 86 males) was included in this study. Based on an ADHD screener (ASRS), 136 participants met criteria for the ADHD-risk group and 99 were in the non-ADHD-risk group. Results: Individuals at-risk for ADHD reported higher mental effort and discomfort than individuals not at-risk for ADHD, even when performance on the working memory task was comparable or statistically controlled. Mental effort required and discomfort were more strongly correlated for at-risk compared to not at-risk participants. Individuals at-risk for ADHD displayed a stronger correlation between mental effort required and actual accuracy, but individuals not at-risk for ADHD displayed a stronger association between perceived accuracy and actual accuracy for the hardest experimental conditions. The most intense moment of effort required predicted retrospective discomfort ratings of the task in the ADHD-risk group, but not in the non-risk group. Conclusion: The subjective experience of in-the-moment mental effort is an important and viable construct that should be more carefully defined and measured. In particular, the experience of effort required (or how taxing a task is) differentiated between individuals at-risk and individuals not at-risk for ADHD in the present study. Whereas previous ADHD research has explored effort exerted, the present work demonstrated that investigating the experience of being mentally taxed might provide a productive line of investigation that could be used to advance our understanding of the

  2. An Extended Analysis of Requirements Traceability Model

    Institute of Scientific and Technical Information of China (English)

    Jiang Dandong(蒋丹东); Zhang Shensheng; Chen Lu

    2004-01-01

    A new extended meta model of traceability is presented. Then, a formalized fine-grained model of traceability is described. Some major issues about this model, including trace units, requirements and relations within the model, are further analyzed. Finally, a case study that comes from a key project of 863 Program is given.

  3. Simulation and Modeling Efforts to Support Decision Making in Healthcare Supply Chain Management

    Directory of Open Access Journals (Sweden)

    Eman AbuKhousa

    2014-01-01

    Recently, most healthcare organizations have focused their attention on reducing the cost of their supply chain management (SCM) by improving the efficiency of the associated decision-making processes. The availability of products through a healthcare supply chain is often a matter of life or death to the patient; therefore, trial-and-error approaches are not an option in this environment. Simulation and modeling (SM) has been presented as an alternative approach for supply chain managers in healthcare organizations to test solutions and to support decision-making processes associated with various SCM problems. This paper presents and analyzes past SM efforts to support decision making in healthcare SCM and identifies the key challenges associated with healthcare SCM modeling. We also present and discuss emerging technologies to meet these challenges.

  4. Simulation and modeling efforts to support decision making in healthcare supply chain management.

    Science.gov (United States)

    AbuKhousa, Eman; Al-Jaroodi, Jameela; Lazarova-Molnar, Sanja; Mohamed, Nader

    2014-01-01

    Recently, most healthcare organizations have focused their attention on reducing the cost of their supply chain management (SCM) by improving the efficiency of the associated decision-making processes. The availability of products through a healthcare supply chain is often a matter of life or death to the patient; therefore, trial-and-error approaches are not an option in this environment. Simulation and modeling (SM) has been presented as an alternative approach for supply chain managers in healthcare organizations to test solutions and to support decision-making processes associated with various SCM problems. This paper presents and analyzes past SM efforts to support decision making in healthcare SCM and identifies the key challenges associated with healthcare SCM modeling. We also present and discuss emerging technologies to meet these challenges.

  5. An effort for developing a seamless transport modeling and remote sensing system for air pollutants

    Science.gov (United States)

    Nakajima, T.; Goto, D.; Dai, T.; Misawa, S.; Uchida, J.; Schutgens, N.; Hashimoto, M.; Oikawa, E.; Takenaka, H.; Tsuruta, H.; Inoue, T.; Higurashi, A.

    2015-12-01

    Wide areas of the globe, such as the Asian region, still suffer from large emissions of air pollutants, which cause serious impacts on the Earth's climate and on the public health of the area. The launch of an international initiative, the Climate and Clean Air Coalition (CCAC), is one example of efforts to ease these difficulties by reducing Short-Lived Climate Pollutants (SLCPs), i.e., black carbon aerosol, methane, and other short-lived atmospheric materials that heat the Earth's system, alongside long-lived greenhouse gas mitigation. Impact evaluation of the air pollutants, however, carries large uncertainties. We introduce a recent effort of the MEXT/SALSA and MOEJ/S-12 projects to develop a seamless transport model for atmospheric constituents, NICAM-Chem, that is flexible enough to cover global to regional scales via the NICAM nonhydrostatic dynamic core, coupled with the SPRINTARS aerosol model and the CHASER atmospheric chemistry model, and supporting three computational grid systems, i.e., quasi-homogeneous grids, stretched grids, and diamond grids. A local ensemble transform Kalman filter/smoother built on this modeling system was successfully applied to data from MODIS, AERONET, and CALIPSO for global assimilation/inversion, and to surface SPM and SO2 air pollution monitoring networks for assimilation over Japan. My talk will be extended to discuss the effective use of satellite remote sensing of aerosols with the Cloud and Aerosol Imager (CAI) on board the GOSAT satellite and the Advanced Himawari Imager (AHI) on board the new third-generation geostationary satellite, Himawari-8. The CAI has a near-ultraviolet channel at 380 nm with 500 m spatial resolution, and the AHI provides high-frequency measurements every 10 minutes. These capabilities are very effective for accurate aerosol remote sensing over land, so a combination with the developed aerosol assimilation system is promising.

  6. Long-term dynamics simulation: Modeling requirements

    Energy Technology Data Exchange (ETDEWEB)

    Morched, A.S.; Kar, P.K.; Rogers, G.J.; Morison, G.K. (Ontario Hydro, Toronto, ON (Canada))

    1989-12-01

    This report details the required performance and modelling capabilities of a computer program intended for the study of the long-term dynamics of power systems. Following a general introduction outlining the need for long-term dynamic studies, the modelling requirements for the conduct of such studies are discussed in detail. Particular emphasis is placed on models for system elements not normally modelled in power system stability programs, which will have a significant impact in the long-term time frame of minutes to hours following the initiating disturbance. The report concludes with a discussion of the special computational and programming requirements for a long-term stability program. 43 refs., 36 figs.

  7. Behavioral modeling of human choices reveals dissociable effects of physical effort and temporal delay on reward devaluation.

    Science.gov (United States)

    Klein-Flügge, Miriam C; Kennerley, Steven W; Saraiva, Ana C; Penny, Will D; Bestmann, Sven

    2015-03-01

    There has been considerable interest from the fields of biology, economics, psychology, and ecology about how decision costs decrease the value of rewarding outcomes. For example, formal descriptions of how reward value changes with increasing temporal delays allow for quantifying individual decision preferences, as in animal species populating different habitats, or normal and clinical human populations. Strikingly, it remains largely unclear how humans evaluate rewards when these are tied to energetic costs, despite the surge of interest in the neural basis of effort-guided decision-making and the prevalence of disorders showing a diminished willingness to exert effort (e.g., depression). One common assumption is that effort discounts reward in a similar way to delay. Here we challenge this assumption by formally comparing competing hypotheses about effort and delay discounting. We used a design specifically optimized to compare discounting behavior for both effort and delay over a wide range of decision costs (Experiment 1). We then additionally characterized the profile of effort discounting free of model assumptions (Experiment 2). Contrary to previous reports, in both experiments effort costs devalued reward in a manner opposite to delay, with small devaluations for lower efforts, and progressively larger devaluations for higher effort-levels (concave shape). Bayesian model comparison confirmed that delay-choices were best predicted by a hyperbolic model, with the largest reward devaluations occurring at shorter delays. In contrast, an altogether different relationship was observed for effort-choices, which were best described by a model of inverse sigmoidal shape that is initially concave. Our results provide a novel characterization of human effort discounting behavior and its first dissociation from delay discounting. This enables accurate modelling of cost-benefit decisions, a prerequisite for the investigation of the neural underpinnings of effort
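    The delay model the authors confirm is the standard hyperbolic discount, V = A/(1 + kD); for effort they report an inverse-sigmoidal, initially concave devaluation. The sketch below contrasts the two shapes using a simple logistic form for effort devaluation, an illustrative stand-in rather than the authors' exact parameterization (parameters `k` and `e0` are assumptions).

```python
import math

def delay_discount(amount, delay, k):
    """Hyperbolic delay discounting: the steepest reward devaluation
    occurs at the shortest delays, then flattens out."""
    return amount / (1.0 + k * delay)

def effort_discount(amount, effort, k, e0):
    """Inverse-sigmoid effort discounting (illustrative logistic form):
    small devaluations at low effort levels, progressively larger
    devaluations as effort rises toward e0 (initially concave)."""
    return amount * (1.0 - 1.0 / (1.0 + math.exp(-k * (effort - e0))))

# Contrast: delay costs bite early, effort costs bite late.
early_delay_drop = delay_discount(10, 1, 0.1) - delay_discount(10, 2, 0.1)
late_delay_drop = delay_discount(10, 10, 0.1) - delay_discount(10, 11, 0.1)
early_effort_drop = effort_discount(10, 1, 1.0, 5.0) - effort_discount(10, 2, 1.0, 5.0)
late_effort_drop = effort_discount(10, 4, 1.0, 5.0) - effort_discount(10, 5, 1.0, 5.0)
```

Fitting both candidate forms to choice data and comparing them (as the paper does with Bayesian model comparison) is what dissociates the two cost types: the delay curve loses most value immediately, whereas the effort curve accelerates.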

  8. The effort-reward imbalance work-stress model and daytime salivary cortisol and dehydroepiandrosterone (DHEA) among Japanese women.

    Science.gov (United States)

    Ota, Atsuhiko; Mase, Junji; Howteerakul, Nopporn; Rajatanun, Thitipat; Suwannapong, Nawarat; Yatsuya, Hiroshi; Ono, Yuichiro

    2014-09-17

    We examined the influence of work-related effort-reward imbalance and overcommitment to work (OC), as derived from Siegrist's Effort-Reward Imbalance (ERI) model, on the hypothalamic-pituitary-adrenocortical (HPA) axis. We hypothesized that, among healthy workers, both cortisol and dehydroepiandrosterone (DHEA) secretion would be increased by effort-reward imbalance and OC and, as a result, cortisol-to-DHEA ratio (C/D ratio) would not differ by effort-reward imbalance or OC. The subjects were 115 healthy female nursery school teachers. Salivary cortisol, DHEA, and C/D ratio were used as indexes of HPA activity. Mixed-model analyses of variance revealed that neither the interaction between the ERI model indicators (i.e., effort, reward, effort-to-reward ratio, and OC) and the series of measurement times (9:00, 12:00, and 15:00) nor the main effect of the ERI model indicators was significant for daytime salivary cortisol, DHEA, or C/D ratio. Multiple linear regression analyses indicated that none of the ERI model indicators was significantly associated with area under the curve of daytime salivary cortisol, DHEA, or C/D ratio. We found that effort, reward, effort-reward imbalance, and OC had little influence on daytime variation patterns, levels, or amounts of salivary HPA-axis-related hormones. Thus, our hypotheses were not supported.
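    The "area under the curve" summary used in the regression analyses above is conventionally computed with the trapezoidal rule over the sampling times (here 9:00, 12:00, and 15:00). A minimal sketch, with illustrative cortisol values rather than data from the study:

```python
def auc_ground(times_h, levels):
    """Area under the curve with respect to ground (trapezoidal rule),
    a standard summary of repeated salivary hormone measurements:
    sum of trapezoid areas between consecutive sampling times."""
    return sum((t2 - t1) * (y1 + y2) / 2.0
               for (t1, y1), (t2, y2) in zip(zip(times_h, levels),
                                             zip(times_h[1:], levels[1:])))

# Three sampling times (9:00, 12:00, 15:00) as in the study;
# cortisol levels are illustrative (nmol/L), falling through the day.
auc = auc_ground([9.0, 12.0, 15.0], [12.0, 8.0, 5.0])
```

The same helper applies unchanged to DHEA or to the cortisol-to-DHEA ratio; only the `levels` series changes.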

  9. Requirements for clinical information modelling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Kalra, Dipak

    2015-07-01

    This study proposes consensus requirements for clinical information modelling tools that can support modelling tasks in medium/large scale institutions. Rather than identify which functionalities are currently available in existing tools, the study has focused on functionalities that should be covered in order to provide guidance about how to evolve the existing tools. After identifying a set of 56 requirements for clinical information modelling tools based on a literature review and interviews with experts, a classical Delphi study methodology was applied to conduct a two-round survey in order to classify them as essential or recommended. Essential requirements are those that must be met by any tool that claims to be suitable for clinical information modelling; should a certified list of tools one day exist, any tool that does not meet the essential criteria would be excluded. Recommended requirements are more advanced requirements that may be met by tools offering a superior product or that are only needed in certain modelling situations. According to the answers provided by 57 experts from 14 different countries, there was a sufficiently high level of agreement for the study to identify 20 essential and 21 recommended requirements for these tools. It is expected that this list of identified requirements will guide developers on the inclusion of new basic and advanced functionalities that have strong support from end users. The list could also guide regulators in identifying requirements that could be demanded of tools adopted within their institutions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  10. Child Care: State Efforts To Enforce Safety and Health Requirements. United States General Accounting Office Report to Congressional Requesters.

    Science.gov (United States)

    Fagnoni, Cynthia M.

    Although states must certify that they have requirements to protect the health and safety of children in child care in order to receive Child Care and Development Block Grant funds, neither the scope nor stringency of these requirements has been stipulated. At the request of Congressional members, this report identifies the most critical…

  11. Mental Effort and Perceptions of TV and Books: A Dutch Replication Study Based on Salomon's Model of Learning.

    Science.gov (United States)

    Beentjes, Hans W. J.

    This comparison of students' learning from reading books and from watching television uses Gavriel Salomon's model of learning effects, which is based on the amount of mental effort invested (AIME) in a medium as determining how deeply the information from that medium is processed. Mental effort, in turn, is predicted to depend on two perceptions…

  12. Habitat models to assist plant protection efforts in Shenandoah National Park, Virginia, USA

    Science.gov (United States)

    Van Manen, F.T.; Young, J.A.; Thatcher, C.A.; Cass, W.B.; Ulrey, C.

    2005-01-01

    During 2002, the National Park Service initiated a demonstration project to develop science-based law enforcement strategies for the protection of at-risk natural resources, including American ginseng (Panax quinquefolius L.), bloodroot (Sanguinaria canadensis L.), and black cohosh (Cimicifuga racemosa (L.) Nutt. [syn. Actaea racemosa L.]). Harvest pressure on these species is increasing because of the growing herbal remedy market. We developed habitat models for Shenandoah National Park and the northern portion of the Blue Ridge Parkway to determine the distribution of favorable habitats of these three plant species and to demonstrate the use of that information to support plant protection activities. We compiled locations for the three plant species to delineate favorable habitats with a geographic information system (GIS). We mapped potential habitat quality for each species by calculating a multivariate statistic, Mahalanobis distance, based on GIS layers that characterized the topography, land cover, and geology of the plant locations (10-m resolution). We tested model performance with an independent dataset of plant locations, which indicated a significant relationship between Mahalanobis distance values and species occurrence. We also generated null models by examining the distribution of the Mahalanobis distance values had plants been distributed randomly. For all species, the habitat models performed markedly better than their respective null models. We used our models to direct field searches to the most favorable habitats, resulting in a sizeable number of new plant locations (82 ginseng, 73 bloodroot, and 139 black cohosh locations). The odds of finding new plant locations based on the habitat models were 4.5 (black cohosh) to 12.3 (American ginseng) times greater than random searches; thus, the habitat models can be used to improve the efficiency of plant protection efforts (e.g., marking of plants, law enforcement activities).
The field searches also
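    The habitat-favorability statistic above measures how far a map cell's environmental conditions lie from the multivariate mean of known plant locations, scaled by their covariance. A toy two-variable sketch (the study used 10-m GIS layers of topography, land cover, and geology; the variables and values here are illustrative):

```python
def mahalanobis2(x, mean, cov):
    """Squared Mahalanobis distance for a 2-variable habitat vector.

    Low distance = conditions close to the multivariate mean of known
    plant locations, i.e., more favorable habitat.
    """
    dx, dy = x[0] - mean[0], x[1] - mean[1]
    (a, b), (c, d) = cov
    det = a * d - b * c
    # Inverse of the 2x2 covariance matrix
    inv = ((d / det, -b / det), (-c / det, a / det))
    return (dx * (inv[0][0] * dx + inv[0][1] * dy)
            + dy * (inv[1][0] * dx + inv[1][1] * dy))

# Mean and covariance estimated from known plant locations
# (e.g., elevation in m, slope in degrees -- illustrative values).
mean = (650.0, 12.0)
cov = ((2500.0, 30.0), (30.0, 16.0))
best = mahalanobis2((650.0, 12.0), mean, cov)   # at the mean: distance 0
worse = mahalanobis2((900.0, 30.0), mean, cov)  # atypical conditions
```

Ranking cells by this distance and searching the lowest-distance cells first is what drove the improved search odds reported above; the null model corresponds to computing the same distances for randomly placed points.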

  13. A Requirements Analysis Model Based on QFD

    Institute of Scientific and Technical Information of China (English)

    TANG Zhi-wei; Nelson K.H.Tang

    2004-01-01

    The enterprise resource planning (ERP) system has emerged to offer an integrated IT solution, and more and more enterprises are adopting it, regarding it as an important innovation. However, there is already evidence of high failure risk in ERP project implementation; one major reason is poor analysis of the requirements for system implementation. In this paper, the importance of requirements analysis for ERP project implementation is highlighted, and a requirements analysis model applying quality function deployment (QFD) is presented, which supports the conduct of requirements analysis for ERP projects.

  14. The complex nature of mixed farming systems requires multidimensional actions supported by integrative research and development efforts

    DEFF Research Database (Denmark)

    González-García, E; Gourdine, J L; Alexandre, G

    2012-01-01

    …the requirement for a change in research strategies and initiatives through the development of a complex but necessary multi-/inter-/trans-disciplinary teamwork spirit. We stress as essential the collaboration and active participation of local and regional actors, stakeholders and end-users in the identification…

  15. Two models at work : A study of interactions and specificity in relation to the Demand-Control Model and the Effort-Reward Imbalance Model

    NARCIS (Netherlands)

    Vegchel, N.

    2005-01-01

    To investigate the relation between work and employee health, several work stress models, e.g., the Demand-Control (DC) Model and the Effort-Reward Imbalance (ERI) Model, have been developed. Although these models focus on job demands and job resources, relatively little attention has been devoted

  16. Two models at work : A study of interactions and specificity in relation to the Demand-Control Model and the Effort-Reward Imbalance Model

    NARCIS (Netherlands)

    Vegchel, N.

    2005-01-01

    To investigate the relation between work and employee health, several work stress models, e.g., the Demand-Control (DC) Model and the Effort-Reward Imbalance (ERI) Model, have been developed. Although these models focus on job demands and job resources, relatively little attention has been devoted to

  17. Modeling and Testing Legacy Data Consistency Requirements

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard

    2003-01-01

    An increasing number of data sources are available on the Internet, many of which offer semantically overlapping data, but based on different schemas, or models. While it is often of interest to integrate such data sources, the lack of consistency among them makes this integration difficult. This paper addresses the need for new techniques that enable the modeling and consistency checking for legacy data sources. Specifically, the paper contributes to the development of a framework that enables consistency testing of data coming from different types of data sources. The vehicle is UML and its accompanying XMI. The paper presents techniques for modeling consistency requirements using OCL and other UML modeling elements: it studies how models that describe the required consistencies among instances of legacy models can be designed in standard UML tools that support XMI. The paper also considers…

  18. Necessity and Requirements of a Collaborative Effort to Develop a Large Wind Turbine Blade Test Facility in North America

    Energy Technology Data Exchange (ETDEWEB)

    Cotrell, J.; Musial, W.; Hughes, S.

    2006-05-01

    The wind power industry in North America has an immediate need for larger blade test facilities to ensure the survival of the industry. Blade testing is necessary to meet certification and investor requirements and is critical to achieving the reliability and blade life needed for the wind turbine industry to succeed. The U.S. Department of Energy's (DOE's) Wind Program is exploring options for collaborating with government, private, or academic entities in a partnership to build larger blade test facilities in North America capable of testing blades up to at least 70 m in length. The National Renewable Energy Laboratory (NREL) prepared this report for DOE to describe the immediate need to pursue larger blade test facilities in North America, categorize the numerous prospective partners for a North American collaboration, and document the requirements for a North American test facility.

  19. Economic effort management in multispecies fisheries: the FcubEcon model

    DEFF Research Database (Denmark)

    Hoff, Ayoe; Frost, Hans; Ulrich, Clara

    2010-01-01

    …in the development of management tools based on fleets, fisheries, and areas, rather than on unit fish stocks. A natural consequence of this has been to consider effort rather than quota management, a final effort decision being based on fleet-harvest potential and fish-stock-preservation considerations. Effort … allocation between fleets should not be based on biological considerations alone, but also on the economic behaviour of fishers, because fisheries management has a significant impact on human behaviour as well as on ecosystem development. The FcubEcon management framework for effort allocation between fleets … optimal manner, in both effort-management and single-quota management settings. Applying single-species assessment and quotas in multispecies fisheries can lead to overfishing or quota underutilization, because advice can be conflicting when different stocks are caught within the same fishery. During…

  20. Upending the social ecological model to guide health promotion efforts toward policy and environmental change.

    Science.gov (United States)

    Golden, Shelley D; McLeroy, Kenneth R; Green, Lawrence W; Earp, Jo Anne L; Lieberman, Lisa D

    2015-04-01

    Efforts to change policies and the environments in which people live, work, and play have gained increasing attention over the past several decades. Yet health promotion frameworks that illustrate the complex processes that produce health-enhancing structural changes are limited. Building on the experiences of health educators, community activists, and community-based researchers described in this supplement and elsewhere, as well as several political, social, and behavioral science theories, we propose a new framework to organize our thinking about producing policy, environmental, and other structural changes. We build on the social ecological model, a framework widely employed in public health research and practice, by turning it inside out, placing health-related and other social policies and environments at the center, and conceptualizing the ways in which individuals, their social networks, and organized groups produce a community context that fosters healthy policy and environmental development. We conclude by describing how health promotion practitioners and researchers can foster structural change by (1) conveying the health and social relevance of policy and environmental change initiatives, (2) building partnerships to support them, and (3) promoting more equitable distributions of the resources necessary for people to meet their daily needs, control their lives, and freely participate in the public sphere.

  1. Overview 2004 of NASA Stirling-Convertor CFD-Model Development and Regenerator R&D Efforts

    Science.gov (United States)

    Tew, Roy C.; Dyson, Rodger W.; Wilson, Scott D.; Demko, Rikako

    2005-01-01

    This paper reports on accomplishments in 2004 in three areas: development of a Stirling-convertor CFD model at NASA GRC and via a NASA grant; a Stirling regenerator-research effort conducted via a NASA grant (a follow-on to an earlier DOE contract); and a regenerator-microfabrication contract for development of a "next-generation Stirling regenerator." Cleveland State University is the lead organization for all three grant/contractual efforts, with the University of Minnesota and Gedeor Associates as subcontractors. The Stirling Technology Co. and Sunpower, Inc. are also involved in all three efforts, either as funded or unfunded participants. International Mezzo Technologies of Baton Rouge, LA, is the regenerator fabricator for the regenerator-microfabrication contract. Results of the efforts in these three areas are summarized.

  2. Early efforts in modeling the incubation period of infectious diseases with an acute course of illness

    Directory of Open Access Journals (Sweden)

    Nishiura Hiroshi

    2007-05-01

    The incubation period of infectious diseases, the time from infection with a microorganism to onset of disease, is directly relevant to prevention and control. Since explicit models of the incubation period enhance our understanding of the spread of disease, previous classic studies were revisited, focusing on the modeling methods employed and paying particular attention to relatively unknown historical efforts. The earliest study on the incubation period of pandemic influenza was published in 1919, providing estimates of the incubation period of Spanish flu using the daily incidence on ships departing from several ports in Australia. Although the study explicitly dealt with an unknown time of exposure, the assumed periods of exposure, which had an equal probability of infection, were too long, and thus likely resulted in slight underestimates of the incubation period. After the suggestion that the incubation period follows a lognormal distribution, Japanese epidemiologists extended this assumption to estimates of the time of exposure during a point source outbreak. Although the reason why the incubation period of acute infectious diseases tends to reveal a right-skewed distribution has been explored several times, the validity of the lognormal assumption is yet to be fully clarified. At present, various different distributions are assumed, and the lack of validity in assuming a lognormal distribution is particularly apparent in the case of slowly progressing diseases. The present paper indicates that (1) analysis using well-defined short periods of exposure with appropriate statistical methods is critical when the exact time of exposure is unknown, and (2) when assuming a specific distribution for the incubation period, comparisons using different distributions are needed in addition to estimations using different datasets, analyses of the determinants of incubation period, and an understanding of the underlying disease mechanisms.
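    The lognormal assumption discussed above is convenient to fit: taking logs of the observed incubation periods and estimating the mean and standard deviation of the logs gives the maximum-likelihood lognormal fit, with exp(mean) as the median incubation period. A minimal sketch on illustrative (not historical) data:

```python
import math
import statistics

def fit_lognormal(incubation_days):
    """Fit a lognormal distribution to observed incubation periods:
    log-transform, then estimate mean/SD of the logs (the MLE for a
    lognormal).  Returns the median incubation period (days) and the
    multiplicative dispersion factor exp(sigma)."""
    logs = [math.log(d) for d in incubation_days]
    mu = statistics.fmean(logs)
    sigma = statistics.pstdev(logs)
    return math.exp(mu), math.exp(sigma)

# Illustrative observations (days), right-skewed as is typical of
# acute infections such as influenza.
median, disp = fit_lognormal([1.5, 2.0, 2.0, 2.5, 3.0, 3.0, 4.0, 6.0])
```

As the paper cautions, the same data should also be fitted with competing distributions (gamma, Weibull) and the fits compared, since the lognormal form is an assumption, not a law.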

  3. Developing a primary care research agenda through collaborative efforts - a proposed "6E" model.

    Science.gov (United States)

    Tan, Ngiap Chuan; Ng, Chirk Jenn; Rosemary, Mitchell; Wahid, Khan; Goh, Lee Gan

    2014-01-01

    Primary care research is at a crossroads in the South Pacific. A steering committee comprising a member of the WONCA Asia Pacific Regional (APR) council and the President of the Fiji College of General Practitioners garnered sponsorship from the Fiji Ministry of Health, WONCA APR, and pharmaceutical agencies to organize the event in October 2013. This paper describes the processes needed to set up a national primary care research agenda through the collaborative efforts of local stakeholders and external facilitators, using a test case in the South Pacific. The setting was a 2-day primary care research workshop in Fiji. The steering committee invited a team of three external facilitators from the Asia-Pacific region to organize and operationalize the workshop. The eventual participants were 3 external facilitators, 6 local facilitators, and 29 local primary care physicians, academics, and local medical leaders from Fiji and the South Pacific Islands. Pre-workshop and main workshop programs were drawn up by the external facilitators, using participants' input of research topics relating to their local clinical issues of interest. Course notes were prepared and distributed before the workshop. In the workshop, proposed research topics were shortlisted by group discussion and consensus. Study designs were proposed, scrutinized, and adopted for further research development. The facilitators reviewed the processes in setting the research agenda after the workshop and conceived the proposed 6E model. These processes can be grouped for easy reference, comprising the pre-workshop stages of "entreat", "enlist", "engage", and the workshop stages of "educe", "empower", and "encapsulate". The 6E model to establish a research agenda is conceptually logical. Its feasibility can be further tested through its application in other situations where research agenda setting is the critical step to improve the quality of primary care.

  4. A fuzzy model for exploiting customer requirements

    Directory of Open Access Journals (Sweden)

    Zahra Javadirad

    2016-09-01

    Nowadays, quality function deployment (QFD) is one of the total quality management tools, in which customers' views and requirements are captured and various techniques are used to improve production requirements and operations. The QFD department, after identification and analysis of the competitors, gathers customer feedback to meet customers' demands for the products relative to those competitors. In this study, a comprehensive model for assessing the importance of customer requirements for an organization's products or services is proposed. The proposed study uses linguistic variables, as a more comprehensive approach, to increase the precision of the expressed evaluations. The importance of these requirements indicates the strengths and weaknesses of the organization in meeting the requirements relative to competitors. The results of these experiments show that the proposed method performs better than the other methods.

  5. Overview of past, ongoing and future efforts of the integrated modeling of global change for Northern Eurasia

    Science.gov (United States)

    Monier, Erwan; Kicklighter, David; Sokolov, Andrei; Zhuang, Qianlai; Melillo, Jerry; Reilly, John

    2016-04-01

    Northern Eurasia is both a major player in the global carbon budget (it includes roughly 70% of the Earth's boreal forest and more than two-thirds of the Earth's permafrost) and a region that has experienced dramatic climate change (increases in temperature, growing season length, floods, and droughts) over the past century. Northern Eurasia has also undergone significant land-use change, driven both by human activity (including deforestation, expansion of agricultural lands, and urbanization) and by natural disturbances (such as wildfires and insect outbreaks). These large environmental and socioeconomic impacts have major implications for the carbon cycle in the region. Northern Eurasia is made up of a diverse set of ecosystems that range from tundra to forests, with significant areas of croplands and pastures, as well as deserts and major urban areas. As such, it represents a complex system with substantial challenges for the modeling community. In this presentation, we provide an overview of past, ongoing, and possible future efforts in the integrated modeling of global change for Northern Eurasia. We review the variety of existing modeling approaches used to investigate specific components of Earth system dynamics in the region. While a limited number of studies try to integrate various aspects of the Earth system (through scale, teleconnections, or processes), there are few systematic analyses of the various feedbacks within the Earth system (between components, regions, or scales). As a result, there is a lack of knowledge of the relative importance of such feedbacks, and it is unclear how policy-relevant current studies that fail to account for these feedbacks can be. We review the role of Earth system models, and their advantages and limitations compared with detailed single-component models.
We further introduce the human activity system (global trade, economic models, demographic models and so on) and the need for coupled human/Earth system models.

  6. Dynamic modeling efforts for system interface studies for nuclear hydrogen production.

    Energy Technology Data Exchange (ETDEWEB)

    Vilim, R. B.; Nuclear Engineering Division

    2007-08-15

System interface studies require not only identifying economically optimal equipment configurations, which involves studying mainly full-power steady-state operation, but also assessing the operability of a design during load change and startup and assessing safety-related behavior during upset conditions. The latter task is performed with a dynamic simulation code. This report reviews the requirements of such a code. It considers the types of transients that will need to be simulated, the phenomena that will be present, the models best suited to representing those phenomena, and the type of numerical solution scheme for solving the models to obtain the dynamic response of the combined nuclear-hydrogen plant. Useful insight into plant transient behavior can be obtained, before running a dynamics code, from simple methods that take into account component time constants and energy capacitances. Methods are described for determining reactor stability, plant startup time, temperature response during load change, and the response to a reactor trip. Some preliminary results are presented.
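
The "simple methods" based on component time constants and energy capacitances amount to lumped-parameter estimates of the form τ = mc_p/UA with first-order relaxation. A minimal sketch, using purely illustrative component data (the mass, heat capacity, and UA value below are hypothetical, not taken from the report):

```python
# Back-of-envelope transient response from a lumped thermal capacitance:
# tau = (m * cp) / (UA); T(t) relaxes exponentially toward the new setpoint.
import math

def first_order_response(T0, T_final, tau, t):
    """Temperature at time t for a single lumped component."""
    return T_final + (T0 - T_final) * math.exp(-t / tau)

# Illustrative (hypothetical) component data: 5000 kg of material,
# cp = 500 J/(kg K), overall heat transfer UA = 25 kW/K.
tau = (5000 * 500) / 25e3          # -> 100 s time constant
T = first_order_response(T0=300.0, T_final=350.0, tau=tau, t=3 * tau)
print(round(tau), round(T, 1))     # after ~3 time constants, ~95% of the step
```

Estimates like this indicate which components dominate the plant response before any detailed simulation is run.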

  7. Effects of fishing effort allocation scenarios on energy efficiency and profitability: an individual-based model applied to Danish fisheries

    DEFF Research Database (Denmark)

    Bastardie, Francois; Nielsen, J. Rasmus; Andersen, Bo Sølgaard

    2010-01-01

… efficiency (quantity of fish caught per litre of fuel used) and profitability are factors that we simulated in developing a spatially explicit individual-based model (IBM) for fishing vessel movements. The observed spatial and seasonal patterns of fishing effort for each fishing activity are evaluated … to the harbour, and (C) allocating effort towards optimising the expected area-specific profit per trip. The model is informed by data from each Danish fishing vessel >15 m after coupling its high-resolution spatial and temporal effort data (VMS) with data from logbook landing declarations, sales slips, vessel engine specifications, and fish and fuel prices. The outcomes of scenarios A and B indicate a trade-off between fuel savings and energy-efficiency improvements when effort is displaced closer to the harbour, compared with reductions in total landing amounts and profit. Scenario C indicates that historic …

  8. User Requirements and Domain Model Engineering

    NARCIS (Netherlands)

    Specht, Marcus; Glahn, Christian

    2006-01-01

    Specht, M., & Glahn, C. (2006). User requirements and domain model engineering. Presentation at International Workshop in Learning Networks for Lifelong Competence Development. March, 30-31, 2006. Sofia, Bulgaria: TENCompetence Conference. Retrieved June 30th, 2006, from http://dspace.learningnetwor

  10. Multiparametric modeling of the ineffective efforts in assisted ventilation within an ICU.

    Science.gov (United States)

    Chouvarda, I G; Babalis, D; Papaioannou, V; Maglaveras, N; Georgopoulos, D

    2016-03-01

In the context of assisted ventilation in the ICU, it is of vital importance to maintain tight synchronization between the patient's attempt to breathe and the assisted ventilation event, so that the patient receives the ventilation support requested. In this work, experimental equipment is employed that allows unobtrusive and continuous monitoring of multiple relevant bioparameters, which are meant to guide medical professionals in adapting the treatment and fine-tuning the ventilation. However, synchronization failures of different origins (neurological, mechanical, ventilation settings) may occur; these vary among patients and over the course of monitoring a single patient, and their timely recognition is challenging even for experts. The dynamics and complex causal relations among the bioparameters and ventilation synchronization are not well studied. The purpose of this work is to elaborate a methodology for modeling ventilation synchronization failures based on the evolution of the monitored bioparameters. Principal component analysis is employed to transform the measurements into a small number of features and to investigate repeating patterns and clusters within them. Using these features, nonlinear prediction models based on support vector machine regression are explored, in terms of how much past knowledge is required and how far into the future predictions can be made. The proposed model shows good correlation (over 0.74) with the actual outputs, constituting an encouraging step toward understanding the dynamic phenomena of ICU ventilation.
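
The two-stage pipeline described (dimensionality reduction, then regression on the reduced features) can be sketched on synthetic data. Plain ridge regression stands in here for the paper's support-vector-regression stage to keep the sketch dependency-free, and all data shapes and coefficients are invented for illustration:

```python
# Sketch: reduce multi-parameter monitoring data with PCA, then fit a
# regression on the leading components to predict a synchronization index.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))            # 200 epochs x 10 bioparameters
X[:, :2] *= 3.0                           # two channels carry most variance
y = X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.normal(size=200)

# PCA via SVD on the centered data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3                                      # keep 3 principal components
Z = Xc @ Vt[:k].T                          # scores = projected features

# Ridge regression on the PCA scores (stand-in for the SVR stage).
lam = 1e-2
w = np.linalg.solve(Z.T @ Z + lam * np.eye(k), Z.T @ y)
pred = Z @ w
corr = np.corrcoef(pred, y)[0, 1]
print(round(corr, 2))
```

The same skeleton applies whatever regressor replaces the ridge step; only the fit on `Z` changes.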

  11. Modeling requirements for in situ vitrification

    Energy Technology Data Exchange (ETDEWEB)

    MacKinnon, R.J.; Mecham, D.C.; Hagrman, D.L.; Johnson, R.W.; Murray, P.E.; Slater, C.E.; Marwil, E.S.; Weaver, R.A.; Argyle, M.D.

    1991-11-01

    This document outlines the requirements for the model being developed at the INEL which will provide analytical support for the ISV technology assessment program. The model includes representations of the electric potential field, thermal transport with melting, gas and particulate release, vapor migration, off-gas combustion and process chemistry. The modeling objectives are to (1) help determine the safety of the process by assessing the air and surrounding soil radionuclide and chemical pollution hazards, the nuclear criticality hazard, and the explosion and fire hazards, (2) help determine the suitability of the ISV process for stabilizing the buried wastes involved, and (3) help design laboratory and field tests and interpret results therefrom.

  12. Application of the limited strength model of self-regulation to understanding exercise effort, planning and adherence.

    Science.gov (United States)

    Martin Ginis, Kathleen A; Bray, Steven R

    2010-12-01

    The limited strength model posits that self-regulatory strength is a finite, renewable resource that is drained when people attempt to regulate their emotions, thoughts or behaviours. The purpose of this study was to determine whether self-regulatory depletion can explain lapses in exercise effort, planning and adherence. In a lab-based experiment, participants exposed to a self-regulatory depletion manipulation generated lower levels of work during a 10 min bicycling task, and planned to exert less effort during an upcoming exercise bout, compared with control participants. The magnitude of reduction in planned exercise effort predicted exercise adherence over a subsequent 8-week period. Together, these results suggest that self-regulatory depletion can influence exercise effort, planning and decision-making and that the depletion of self-regulatory resources can explain episodes of exercise non-adherence both in the lab and in everyday life.

  13. Mental effort

    NARCIS (Netherlands)

    Kirschner, Paul A.; Kirschner, Femke

    2013-01-01

    Kirschner, P. A., & Kirschner, F. (2012). Mental effort. In N. Seel (Ed.), Encyclopedia of the sciences of learning, Volume 5 (pp. 2182-2184). New York, NY: Springer. doi:10.1007/978-1-4419-1428-6_226

  14. One State's Systems Change Efforts to Reduce Child Care Expulsion: Taking the Pyramid Model to Scale

    Science.gov (United States)

    Vinh, Megan; Strain, Phil; Davidon, Sarah; Smith, Barbara J.

    2016-01-01

    This article describes the efforts funded by the state of Colorado to address unacceptably high rates of expulsion from child care. Based on the results of a 2006 survey, the state of Colorado launched two complementary policy initiatives in 2009 to impact expulsion rates and to improve the use of evidence-based practices related to challenging…

  15. Modeling Psychological Empowerment among Youth Involved in Local Tobacco Control Efforts

    Science.gov (United States)

    Holden, Debra J.; Evans, W. Douglas; Hinnant, Laurie W.; Messeri, Peter

    2005-01-01

    The American Legacy Foundation funded 13 state health departments for their Statewide Youth Movement Against Tobacco Use in September 2000. Its goal was to create statewide tobacco control initiatives implemented with youth leadership. The underlying theory behind these initiatives was that tobacco control efforts can best be accomplished by…

  16. Understanding requirements via natural language information modeling

    Energy Technology Data Exchange (ETDEWEB)

    Sharp, J.K.; Becker, S.D.

    1993-07-01

    Information system requirements that are expressed as simple English sentences provide a clear understanding of what is needed between system specifiers, administrators, users, and developers of information systems. The approach used to develop the requirements is the Natural-language Information Analysis Methodology (NIAM). NIAM allows the processes, events, and business rules to be modeled using natural language. The natural language presentation enables the people who deal with the business issues that are to be supported by the information system to describe exactly the system requirements that designers and developers will implement. Computer prattle is completely eliminated from the requirements discussion. An example is presented that is based upon a section of a DOE Order involving nuclear materials management. Where possible, the section is analyzed to specify the process(es) to be done, the event(s) that start the process, and the business rules that are to be followed during the process. Examples, including constraints, are developed. The presentation steps through the modeling process and shows where the section of the DOE Order needs clarification, extensions or interpretations that could provide a more complete and accurate specification.

  17. Effortful echolalia.

    Science.gov (United States)

    Hadano, K; Nakamura, H; Hamanaka, T

    1998-02-01

We report three cases of effortful echolalia in patients with cerebral infarction. The clinical picture of the speech disturbance corresponds to Type 1 Transcortical Motor Aphasia (TCMA; Goldstein, 1915). The patients always spoke nonfluently, with loss of speech initiative, dysarthria, dysprosody, agrammatism, and increased effort, and were unable to repeat sentences longer than four to six words. In conversation, they first repeated a few words spoken to them and then produced self-initiated speech. The initial repetition, as well as the subsequent self-initiated speech, both realized equally laboriously, can be regarded as mitigated echolalia (Pick, 1924). The patients were always aware of their own echolalia and tried, without effect, to control it. These cases demonstrate that neither the ability to repeat nor fluent speech is always necessary for echolalia. The possibility that a lesion in the left medial frontal lobe, including the supplementary motor area, plays an important role in effortful echolalia is discussed.

  18. Health Promotion Efforts as Predictors of Physical Activity in Schools: An Application of the Diffusion of Innovations Model

    Science.gov (United States)

    Glowacki, Elizabeth M.; Centeio, Erin E.; Van Dongen, Daniel J.; Carson, Russell L.; Castelli, Darla M.

    2016-01-01

    Background: Implementing a comprehensive school physical activity program (CSPAP) effectively addresses public health issues by providing opportunities for physical activity (PA). Grounded in the Diffusion of Innovations model, the purpose of this study was to identify how health promotion efforts facilitate opportunities for PA. Methods: Physical…

  19. Modelling detectability of kiore (Rattus exulans) on Aguiguan, Mariana Islands, to inform possible eradication and monitoring efforts

    Science.gov (United States)

    Adams, A.A.Y.; Stanford, J.W.; Wiewel, A.S.; Rodda, G.H.

    2011-01-01

    Estimating the detection probability of introduced organisms during the pre-monitoring phase of an eradication effort can be extremely helpful in informing eradication and post-eradication monitoring efforts, but this step is rarely taken. We used data collected during 11 nights of mark-recapture sampling on Aguiguan, Mariana Islands, to estimate introduced kiore (Rattus exulans Peale) density and detection probability, and evaluated factors affecting detectability to help inform possible eradication efforts. Modelling of 62 captures of 48 individuals resulted in a model-averaged density estimate of 55 kiore/ha. Kiore detection probability was best explained by a model allowing neophobia to diminish linearly (i.e. capture probability increased linearly) until occasion 7, with additive effects of sex and cumulative rainfall over the prior 48 hours. Detection probability increased with increasing rainfall and females were up to three times more likely than males to be trapped. In this paper, we illustrate the type of information that can be obtained by modelling mark-recapture data collected during pre-eradication monitoring and discuss the potential of using these data to inform eradication and posteradication monitoring efforts. ?? New Zealand Ecological Society.
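
The detection-probability structure reported (capture probability rising linearly until occasion 7 as neophobia wanes, with additive sex and rainfall effects) can be sketched on the logit scale. The coefficients below are illustrative placeholders, not the fitted values from the study:

```python
# Sketch of a mark-recapture detection model: logit-linear in a neophobia
# term capped at occasion 7, sex, and 48 h cumulative rainfall.
import math

def capture_prob(occasion, is_female, rain_mm,
                 b0=-2.0, b_occ=0.25, b_sex=1.0, b_rain=0.02):
    eta = (b0
           + b_occ * min(occasion, 7)   # neophobia wanes until occasion 7
           + b_sex * is_female          # females trapped more often
           + b_rain * rain_mm)          # wetter nights -> higher detection
    return 1.0 / (1.0 + math.exp(-eta))

p_male_night1 = capture_prob(1, 0, 0.0)
p_female_night8 = capture_prob(8, 1, 10.0)
print(round(p_male_night1, 3), round(p_female_night8, 3))
```

Structures like this let a pre-eradication survey predict how many trapping nights are needed before detection probability stabilizes.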

  20. Hybrid supply chain model for material requirement planning under financial constraints: A case study

    Science.gov (United States)

    Curci, Vita; Dassisti, Michele; Josefa, Mula Bru; Manuel, Díaz Madroñero

    2014-10-01

Supply chain models (SCM) can potentially integrate different aspects of decision support for enterprise management tasks. The aim of this paper is to propose a hybrid mathematical programming model for the optimization of production requirements resource planning. The preliminary model was conceived bottom-up from a real industrial case, oriented to maximizing cash flow. Despite the intense computational effort required to converge to a solution, the optimization gave good results for the objective function.

  1. Millimeter wave satellite communication studies. Results of the 1981 propagation modeling effort

    Science.gov (United States)

    Stutzman, W. L.; Tsolakis, A.; Dishman, W. K.

    1982-12-01

    Theoretical modeling associated with rain effects on millimeter wave propagation is detailed. Three areas of work are discussed. A simple model for prediction of rain attenuation is developed and evaluated. A method for computing scattering from single rain drops is presented. A complete multiple scattering model is described which permits accurate calculation of the effects on dual polarized signals passing through rain.
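
Simple rain-attenuation prediction models of the kind evaluated here typically take the power-law specific-attenuation form γ = kR^α (dB/km), integrated over the path. A minimal sketch, with k and α as illustrative placeholders rather than the paper's fitted values (in practice they depend on frequency and polarization):

```python
# Minimal rain-attenuation sketch: specific attenuation gamma = k * R**alpha
# in dB/km, total path attenuation A = gamma * L. Coefficients are
# illustrative, not the paper's values.
def rain_attenuation_db(rain_rate_mm_h, path_km, k=0.187, alpha=1.021):
    specific = k * rain_rate_mm_h ** alpha   # dB per km
    return specific * path_km

A = rain_attenuation_db(rain_rate_mm_h=25.0, path_km=5.0)
print(round(A, 2))
```

More complete models replace the fixed path length with an effective length that accounts for the non-uniform rain cell along the slant path.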

  2. Economic effort management in multispecies fisheries: the FcubEcon model

    DEFF Research Database (Denmark)

    Hoff, Ayoe; Frost, Hans; Ulrich, Clara

    2010-01-01

Applying single-species assessment and quotas in multispecies fisheries can lead to overfishing or quota underutilization, because advice can be conflicting when different stocks are caught within the same fishery. During the past decade, increased focus on this issue has resulted in the development … optimal manner, in both effort-management and single-quota management settings.

  3. Artificial Neural Networks for Reducing Computational Effort in Active Truncated Model Testing of Mooring Lines

    DEFF Research Database (Denmark)

    Christiansen, Niels Hørbye; Voie, Per Erlend Torbergsen; Høgsberg, Jan Becker

    2015-01-01

… is by active truncated models. In these models only the very top part of the system is represented by a physical model, whereas the behavior of the part below the truncation is calculated by numerical models and accounted for in the physical model by active actuators applying relevant forces to the physical model. Hence, in principle it is possible to achieve reliable experimental data for much larger water depths than the actual depth of the test basin would suggest. However, since the computations must be faster than real time, as the numerical simulations and the physical experiment run simultaneously, this method is very demanding in terms of numerical efficiency and computational power. Therefore, this method has not yet proved to be feasible. It has recently been shown how a hybrid method combining classical numerical models and artificial neural networks (ANN) can provide a dramatic …
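
The surrogate idea, a small ANN trained to reproduce the slow numerical model of the truncated line so that forces can be produced faster than real time, can be sketched as follows. The "expensive" force curve, network size, and training setup are all synthetic assumptions, not the study's actual models:

```python
# Train a tiny feed-forward net to mimic an (assumed) expensive
# truncated-line force model; evaluation of the net is then cheap.
import numpy as np

rng = np.random.default_rng(1)

def line_force(x):                 # stand-in for the slow numerical model
    return np.tanh(2.0 * x) + 0.1 * x**3

X = rng.uniform(-1, 1, size=(256, 1))
Y = line_force(X)

# One hidden tanh layer, trained by plain full-batch gradient descent.
W1 = rng.normal(scale=0.5, size=(1, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(5000):
    H = np.tanh(X @ W1 + b1)
    P = H @ W2 + b2
    G = 2 * (P - Y) / len(X)       # gradient of mean squared error wrt P
    W2 -= lr * H.T @ G; b2 -= lr * G.sum(0)
    GH = (G @ W2.T) * (1 - H**2)   # backprop through tanh
    W1 -= lr * X.T @ GH; b1 -= lr * GH.sum(0)

rmse = float(np.sqrt(np.mean((P - Y) ** 2)))
print(round(rmse, 3))
```

Once trained offline, evaluating the network is a handful of matrix products, which is what makes the real-time coupling to the physical model plausible.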

  4. Markov Modeling of Component Fault Growth Over A Derived Domain of Feasible Output Control Effort Modifications

    Data.gov (United States)

    National Aeronautics and Space Administration — This paper introduces a novel Markov process formulation of stochastic fault growth modeling, in order to facilitate the development and analysis of...

  5. Modeling the impact of restoration efforts on phosphorus loading and transport through Everglades National Park, FL, USA.

    Science.gov (United States)

    Long, Stephanie A; Tachiev, Georgio I; Fennema, Robert; Cook, Amy M; Sukop, Michael C; Miralles-Wilhelm, Fernando

    2015-07-01

    Ecosystems of Florida Everglades are highly sensitive to phosphorus loading. Future restoration efforts, which focus on restoring Everglades water flows, may pose a threat to the health of these ecosystems. To determine the fate and transport of total phosphorus and evaluate proposed Everglades restoration, a water quality model has been developed using the hydrodynamic results from the M3ENP (Mike Marsh Model of Everglades National Park)--a physically-based hydrological numerical model which uses MIKE SHE/MIKE 11 software. Using advection-dispersion with reactive transport for the model, parameters were optimized and phosphorus loading in the overland water column was modeled with good accuracy (60%). The calibrated M3ENP-AD model was then modified to include future bridge construction and canal water level changes, which have shown to increase flows into ENP. These bridge additions increased total dissolved phosphorus (TP) load downstream in Shark Slough and decreased TP load in downstream Taylor Slough. However, there was a general decrease in TP concentration and TP mass per area over the entire model domain. The M3ENP-AD model has determined the mechanisms for TP transport and quantified the impacts of ENP restoration efforts on the spatial-temporal distribution of phosphorus transport. This tool can be used to guide future Everglades restoration decisions.
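
The advection-dispersion-with-reaction transport underlying the M3ENP-AD model can be illustrated with a minimal 1-D explicit scheme for dc/dt = -v dc/dx + D d²c/dx² - kc. The grid, velocity, dispersion, and uptake values below are illustrative, not calibrated Everglades parameters:

```python
# 1-D advection-dispersion-reaction: upwind advection, central diffusion,
# first-order uptake, explicit time stepping (stable for these parameters).
import numpy as np

nx, dx, dt = 100, 10.0, 50.0          # grid cells [m], time step [s]
v, D, k = 0.01, 0.5, 1e-6             # velocity, dispersion, uptake rate
c = np.zeros(nx); c[0] = 1.0          # constant-concentration inflow

for _ in range(1000):
    adv = -v * (c[1:-1] - c[:-2]) / dx                  # upwind advection
    dif = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2    # dispersion
    c[1:-1] += dt * (adv + dif - k * c[1:-1])
    c[0], c[-1] = 1.0, c[-2]          # fixed inflow, zero-gradient outflow

print(round(float(c[10]), 3), round(float(c[50]), 3))
```

The profile decreases monotonically downstream: the concentration front has passed cell 10 but is still crossing cell 50, with first-order uptake slowly eroding the plateau behind it.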

  6. A multidisciplinary effort to assign realistic source parameters to models of volcanic ash-cloud transport and dispersion during eruptions

    Science.gov (United States)

    Mastin, L.G.; Guffanti, M.; Servranckx, R.; Webley, P.; Barsotti, S.; Dean, K.; Durant, A.; Ewert, J.W.; Neri, A.; Rose, William I.; Schneider, D.; Siebert, L.; Stunder, B.; Swanson, G.; Tupper, A.; Volentik, A.; Waythomas, C.F.

    2009-01-01

During volcanic eruptions, volcanic ash transport and dispersion models (VATDs) are used to forecast the location and movement of ash clouds over hours to days in order to define hazards to aircraft and to communities downwind. Those models use input parameters, called "eruption source parameters", such as plume height H, mass eruption rate Ṁ, duration D, and the mass fraction m63 of erupted debris finer than about 4φ or 63 µm, which can remain in the cloud for many hours or days. Observational constraints on the value of such parameters are frequently unavailable in the first minutes or hours after an eruption is detected. Moreover, observed plume height may change during an eruption, requiring rapid assignment of new parameters. This paper reports on a group effort to improve the accuracy of source parameters used by VATDs in the early hours of an eruption. We do so by first compiling a list of eruptions for which these parameters are well constrained, and then using these data to review and update previously studied parameter relationships. We find that the existing scatter in plots of H versus Ṁ yields an uncertainty within the 50% confidence interval of plus or minus a factor of four in eruption rate for a given plume height. This scatter is not clearly attributable to biases in measurement techniques or to well-recognized processes such as elutriation from pyroclastic flows. Sparse data on total grain-size distribution suggest that the mass fraction of fine debris m63 could vary by nearly two orders of magnitude between small basaltic eruptions (~0.01) and large silicic ones (>0.5). We classify eleven eruption types: four types each for different sizes of silicic and mafic eruptions; submarine eruptions; "brief" or Vulcanian eruptions; and eruptions that generate co-ignimbrite or co-pyroclastic flow plumes. For each eruption type we assign source parameters. We then assign a characteristic eruption type to each of the world's ~1500 …
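
The H-versus-eruption-rate relationship discussed above is commonly summarized by a power-law fit of the form H = aV^b (H = plume height in km, V = volumetric eruption rate in m³/s dense-rock equivalent). The sketch below inverts such a fit to recover a mass eruption rate from an observed plume height, assuming the representative coefficients a = 2.0, b = 0.241 and a DRE density of 2500 kg/m³; the factor-of-four scatter noted in the abstract applies to any rate recovered this way:

```python
# Invert an assumed power-law plume-height fit, H = a * V**b, to get a
# mass eruption rate in kg/s from an observed plume height in km.
def eruption_rate_from_height(H_km, a=2.0, b=0.241, dre_density=2500.0):
    V = (H_km / a) ** (1.0 / b)       # volumetric rate, m^3/s DRE
    return V * dre_density            # mass eruption rate, kg/s

M = eruption_rate_from_height(10.0)   # a 10 km plume
print(f"{M:.3g} kg/s, factor-of-4 band: {M/4:.3g}-{M*4:.3g} kg/s")
```

Quoting the uncertainty band alongside the point estimate, as above, is exactly the practice the abstract's scatter analysis motivates.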

  7. The European Integrated Tokamak Modelling (ITM) effort: achievements and first physics results

    NARCIS (Netherlands)

    G.L. Falchetto,; Coster, D.; Coelho, R.; Scott, B. D.; Figini, L.; Kalupin, D.; Nardon, E.; Nowak, S.; L.L. Alves,; Artaud, J. F.; Basiuk, V.; João P.S. Bizarro,; C. Boulbe,; Dinklage, A.; Farina, D.; B. Faugeras,; Ferreira, J.; Figueiredo, A.; Huynh, P.; Imbeaux, F.; Ivanova-Stanik, I.; Jonsson, T.; H.-J. Klingshirn,; Konz, C.; Kus, A.; Marushchenko, N. B.; Pereverzev, G.; M. Owsiak,; Poli, E.; Peysson, Y.; R. Reimer,; Signoret, J.; Sauter, O.; Stankiewicz, R.; Strand, P.; Voitsekhovitch, I.; Westerhof, E.; T. Zok,; Zwingmann, W.; ITM-TF contributors,; ASDEX Upgrade team,; JET-EFDA Contributors,

    2014-01-01

    A selection of achievements and first physics results are presented of the European Integrated Tokamak Modelling Task Force (EFDA ITM-TF) simulation framework, which aims to provide a standardized platform and an integrated modelling suite of validated numerical codes for the simulation and

  8. Evaluation of Thin Plate Hydrodynamic Stability through a Combined Numerical Modeling and Experimental Effort

    Energy Technology Data Exchange (ETDEWEB)

    Tentner, A. [Argonne National Lab. (ANL), Argonne, IL (United States); Bojanowski, C. [Argonne National Lab. (ANL), Argonne, IL (United States); Feldman, E. [Argonne National Lab. (ANL), Argonne, IL (United States); Wilson, E. [Argonne National Lab. (ANL), Argonne, IL (United States); Solbrekken, G [Univ. of Missouri, Columbia, MO (United States); Jesse, C. [Univ. of Missouri, Columbia, MO (United States); Kennedy, J. [Univ. of Missouri, Columbia, MO (United States); Rivers, J. [Univ. of Missouri, Columbia, MO (United States); Schnieders, G. [Univ. of Missouri, Columbia, MO (United States)

    2017-05-01

An experimental and computational effort was undertaken to evaluate the capability of fluid-structure interaction (FSI) simulation tools to describe the deflection, due to hydrodynamic forces, of a Missouri University Research Reactor (MURR) fuel element plate redesigned for conversion to low-enriched uranium (LEU) fuel. Experiments involving both flat plates and curved plates were conducted in a water flow test loop located at the University of Missouri (MU), at conditions and geometries that can be related to the MURR LEU fuel element. A wider channel gap on one side of the test plate and a narrower gap on the other represent the differences that could be encountered in a MURR element due to allowed fabrication variability. The difference in the channel gaps produces a pressure differential across the plate, which in turn deflects it. The plate deflection induced by this pressure differential was measured at specified locations using a laser measurement technique. High-fidelity 3-D simulations of the experiments were performed at MU using the computational fluid dynamics code STAR-CCM+ coupled with the structural mechanics code ABAQUS. Independent simulations of the experiments were performed at Argonne National Laboratory (ANL) using the STAR-CCM+ code and its built-in structural mechanics solver. The simulation results obtained at MU and ANL were compared with the corresponding measured plate deflections.

  9. Examining Mutual Elements of the Job Strain Model and the Effort--Reward Imbalance Model among Special Education Staff in the USA

    Science.gov (United States)

    Shyman, Eric

    2011-01-01

    Two theories of occupational stress are often cited as being most supported by research: the job strain model (JSM) and the effort--reward imbalance model (ERIM). In order to investigate the applicability of mutual theoretical elements of both models to special education in the USA, a sample of 100 special education paraeducators in public and…

  10. Finding a balance between accuracy and computational effort for modeling biomineralization

    Science.gov (United States)

    Hommel, Johannes; Ebigbo, Anozie; Gerlach, Robin; Cunningham, Alfred B.; Helmig, Rainer; Class, Holger

    2016-04-01

    One of the key issues of underground gas storage is the long-term security of the storage site. Amongst the different storage mechanisms, cap-rock integrity is crucial for preventing leakage of the stored gas due to buoyancy into shallower aquifers or, ultimately, the atmosphere. This leakage would reduce the efficiency of underground gas storage and pose a threat to the environment. Ureolysis-driven, microbially induced calcite precipitation (MICP) is one of the technologies in the focus of current research aiming at mitigation of potential leakage by sealing high-permeability zones in cap rocks. Previously, a numerical model, capable of simulating two-phase multi-component reactive transport, including the most important processes necessary to describe MICP, was developed and validated against experiments in Ebigbo et al. [2012]. The microbial ureolysis kinetics implemented in the model was improved based on new experimental findings and the model was recalibrated using improved experimental data in Hommel et al. [2015]. This increased the ability of the model to predict laboratory experiments while simplifying some of the reaction rates. However, the complexity of the model is still high which leads to high computation times even for relatively small domains. The high computation time prohibits the use of the model for the design of field-scale applications of MICP. Various approaches to reduce the computational time are possible, e.g. using optimized numerical schemes or simplified engineering models. Optimized numerical schemes have the advantage of conserving the detailed equations, as they save computation time by an improved solution strategy. Simplified models are more an engineering approach, since they neglect processes of minor impact and focus on the processes which have the most influence on the model results. 
This also allows for investigating the influence of a particular process on the overall MICP, increasing insight into the interactions among processes.

  11. Integrated modelling requires mass collaboration (Invited)

    Science.gov (United States)

    Moore, R. V.

    2009-12-01

The need for sustainable solutions to the world’s problems is self-evident; the challenge is to anticipate where, in the environment, economy or society, the proposed solution will have negative consequences. If we failed to realise that the switch to biofuels would have the seemingly obvious result of reduced food production, how much harder will it be to predict the likely impact of policies whose impacts may be more subtle? It has been clear for a long time that models and data will be important tools for assessing the impact of events and the measures for their mitigation. They are an effective way of encapsulating knowledge of a process and using it for prediction. However, most models represent a single process or a small group of processes. The sustainability challenges that face us now require not just the prediction of a single process but the prediction of how many interacting processes will respond in given circumstances. These processes will not be confined to a single discipline but will often straddle many. For example, the question, “What will be the impact on river water quality of the medical plans for managing a ‘flu pandemic and could they cause a further health hazard?” spans medical planning, the absorption of drugs by the body, the spread of disease, the hydraulic and chemical processes in sewers and sewage treatment works, and river water quality. This question nicely reflects the present state of the art. We have models of the processes, and standards such as the Open Modelling Interface (the OpenMI) allow them to be linked together and to datasets. We can therefore answer the question, but with the important proviso that we thought to ask it. The next and greater challenge is to deal with the open question, “What are the implications of the medical plans for managing a ‘flu pandemic?”. This implies a system that can make connections that may well not have occurred to us and then evaluate their probable impact. The final touch will be to …

  12. Markov Modeling of Component Fault Growth Over a Derived Domain of Feasible Output Control Effort Modifications

    Science.gov (United States)

    2012-09-01


  13. Ideals, activities, dissonance, and processing: a conceptual model to guide educators' efforts to stimulate student reflection.

    Science.gov (United States)

    Thompson, Britta M; Teal, Cayla R; Rogers, John C; Paterniti, Debora A; Haidet, Paul

    2010-05-01

    Medical schools are increasingly incorporating opportunities for reflection into their curricula. However, little is known about the cognitive and/or emotional processes that occur when learners participate in activities designed to promote reflection. The purpose of this study was to identify and elucidate those processes. In 2008, the authors analyzed qualitative data from focus groups that were originally conducted to evaluate an educational activity designed to promote reflection. These data afforded the opportunity to explore the processes of reflection in detail. Transcripts (94 pages, single-spaced) from four focus groups were analyzed using a narrative framework. The authors spent approximately 40 hours in group and 240 hours in individual coding activities. The authors developed a conceptual model of five major elements in students' reflective processes: the educational activity, the presence or absence of cognitive or emotional dissonance, and two methods of processing dissonance (preservation or reconciliation). The model also incorporates the relationship between the student's internal ideal of what a doctor is or does and the student's perception of the teacher's ideal of what a doctor is or does. The model further identifies points at which educators may be able to influence the processes of reflection and the development of professional ideals. Students' cognitive and emotional processes have important effects on the success of educational activities intended to stimulate reflection. Although additional research is needed, this model-which incorporates ideals, activities, dissonance, and processing-can guide educators as they plan and implement such activities.

  14. Dynamic material flow modeling: an effort to calibrate and validate aluminum stocks and flows in Austria.

    Science.gov (United States)

    Buchner, Hanno; Laner, David; Rechberger, Helmut; Fellner, Johann

    2015-05-01

A calibrated and validated dynamic material flow model of Austrian aluminum (Al) stocks and flows between 1964 and 2012 was developed. Calibration and extensive plausibility testing were performed to illustrate how the quality of dynamic material flow analysis can be improved by considering independent bottom-up estimates. According to the model, total Austrian in-use Al stocks reached a level of 360 kg/capita in 2012, with buildings (45%) and transport applications (32%) being the major in-use stocks. Old scrap generation (including export of end-of-life vehicles) amounted to 12.5 kg/capita in 2012 and was still increasing, while Al final demand has remained rather constant at around 25 kg/capita in the past few years. The application of global sensitivity analysis showed that only small parts of the total variance of old scrap generation could be explained by the variation of single parameters, emphasizing the need for comprehensive sensitivity analysis tools accounting for interactions between parameters and time-delay effects in dynamic material flow models. Overall, it was possible to generate a detailed understanding of the evolution of Al stocks and flows in Austria, including plausibility evaluations of the results. Such models constitute a reliable basis for evaluating future recycling potentials, in particular with respect to application-specific qualities of current and future national Al scrap generation and utilization.
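The record above describes an inflow-driven dynamic material flow model. As a generic illustration (not the authors' calibrated Austrian model), in-use stock can be computed by tracking each year's inflow cohort against a normal lifetime distribution; the function names and parameter values below are hypothetical:

```python
import math

def normal_cdf(x, mu, sigma):
    """CDF of a normal distribution, via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def dynamic_stock(inflow, life_mean, life_sd):
    """Inflow-driven dynamic stock model: each cohort leaves the in-use
    stock according to a normal lifetime distribution; outflow follows
    from the mass balance stock[t] = stock[t-1] + inflow[t] - outflow[t]."""
    n = len(inflow)
    stock, outflow = [0.0] * n, [0.0] * n
    for t in range(n):
        stock[t] = sum(inflow[c] * (1.0 - normal_cdf(t - c, life_mean, life_sd))
                       for c in range(t + 1))
        outflow[t] = inflow[t] - (stock[t] - (stock[t - 1] if t > 0 else 0.0))
    return stock, outflow
```

With a constant inflow, the modeled stock grows toward saturation while old scrap (the outflow) lags by roughly the mean lifetime, mirroring the stock build-up and rising old scrap generation the abstract reports.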

  15. MCNP6 and DRiFT modeling efforts for the NEUANCE/DANCE detector array

    Energy Technology Data Exchange (ETDEWEB)

    Pinilla, Maria Isabel [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-30

    This report seeks to study and benchmark code predictions against experimental data; determine parameters to match MCNP-simulated detector response functions to experimental stilbene measurements; add stilbene processing capabilities to DRiFT; and improve NEUANCE detector array modeling and analysis using new MCNP6 and DRiFT features.

  16. Controls over Ocean Mesopelagic Interior Carbon Storage (COMICS): fieldwork, synthesis and modelling efforts

    Directory of Open Access Journals (Sweden)

    Richard John Sanders

    2016-08-01

The ocean’s biological carbon pump plays a central role in regulating atmospheric CO2 levels. In particular, the depth at which sinking organic carbon is broken down and respired in the mesopelagic zone is critical, with deeper remineralisation resulting in greater carbon storage. Until recently, however, a balanced budget of the supply and consumption of organic carbon in the mesopelagic had not been constructed in any region of the ocean, and the processes controlling organic carbon turnover are still poorly understood. Large-scale data syntheses suggest that a wide range of factors can influence remineralisation depth, including upper-ocean ecological interactions, and interior dissolved oxygen concentration and temperature. However, these analyses do not provide a mechanistic understanding of remineralisation, which increases the challenge of appropriately modelling mesopelagic carbon dynamics. In light of this, the UK Natural Environment Research Council has funded a programme with this mechanistic understanding as its aim, spanning targeted fieldwork right through to implementation of a new parameterisation for mesopelagic remineralisation within an IPCC-class global biogeochemical model. The Controls over Ocean Mesopelagic Interior Carbon Storage (COMICS) programme will deliver new insights into the processes of carbon cycling in the mesopelagic zone and how these influence ocean carbon storage. Here we outline the programme’s rationale, its goals, and its planned fieldwork and modelling activities, with the aim of stimulating international collaboration.
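Remineralisation depth is often summarised by a power-law attenuation of sinking flux; a minimal sketch using the classic Martin curve (a standard parameterisation, not the new one COMICS aims to develop) shows how the exponent b controls how much sinking carbon survives to depth:

```python
def martin_flux(f_ref, z, z_ref=100.0, b=0.858):
    """Martin curve: particulate organic carbon flux at depth z (m),
    attenuated as a power law below the reference depth z_ref.
    b = 0.858 is the canonical open-ocean composite value."""
    return f_ref * (z / z_ref) ** (-b)

# A smaller exponent b means deeper remineralisation and hence greater
# carbon storage: compare the flux surviving to 1000 m.
shallow_remin = martin_flux(10.0, 1000.0, b=0.858)  # stronger attenuation
deep_remin = martin_flux(10.0, 1000.0, b=0.4)       # weaker attenuation
```

Here `deep_remin > shallow_remin`: weakening the attenuation exponent delivers several times more carbon past 1000 m, which is why the depth of remineralisation matters so much for ocean carbon storage.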

  17. Gamified Requirements Engineering: Model and Experimentation

    NARCIS (Netherlands)

    Lombriser, Philipp; Dalpiaz, Fabiano; Lucassen, Garm; Brinkkemper, Sjaak

    2016-01-01

    [Context & Motivation] Engaging stakeholders in requirements engineering (RE) influences the quality of the requirements and ultimately of the system to-be. Unfortunately, stakeholder engagement is often insufficient, leading to too few, low-quality requirements. [Question/problem] We aim to

  20. Connecting Requirements to Architecture and Analysis via Model-Based Systems Engineering

    Science.gov (United States)

    Cole, Bjorn F.; Jenkins, J. Steven

    2015-01-01

In traditional systems engineering practice, architecture, concept development, and requirements development are related but still separate activities. Concepts for operation, key technical approaches, and related proofs of concept are developed. These inform the formulation of an architecture at multiple levels, starting with the overall system composition and functionality and progressing into more detail. As this formulation is done, a parallel activity develops a set of English statements that constrain solutions. These requirements are often called "shall statements" since they are formulated to use "shall." The separation of requirements from design is exacerbated by well-meaning tools like the Dynamic Object-Oriented Requirements System (DOORS) that remain separate from engineering design tools. With the Europa Clipper project, efforts are being made to change the requirements development approach from a separate activity to one intimately embedded in the formulation effort. This paper presents a modeling approach and related tooling to generate English requirement statements from constraints embedded in architecture definition.
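The idea of generating "shall statements" from constraints held in an architecture model can be sketched as follows; the constraint schema and wording templates are illustrative stand-ins, not the Europa Clipper tooling:

```python
def to_shall_statement(c):
    """Render a machine-held constraint record as an English requirement.
    The dict schema here is a hypothetical stand-in for a constraint
    embedded in an architecture model."""
    templates = {
        "<=": "shall not exceed",
        ">=": "shall be at least",
        "==": "shall be",
    }
    return (f"The {c['subject']} {c['property']} "
            f"{templates[c['op']]} {c['value']} {c['unit']}.")

constraint = {"subject": "propulsion module", "property": "dry mass",
              "op": "<=", "value": 250, "unit": "kg"}
# to_shall_statement(constraint)
# → "The propulsion module dry mass shall not exceed 250 kg."
```

Because the statement is derived from the model rather than maintained in a separate requirements database, the English text can never drift out of sync with the architecture, which is the point the abstract makes.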

  2. Combined observational and modeling efforts of aerosol-cloud-precipitation interactions over Southeast Asia

    Science.gov (United States)

    Loftus, Adrian; Tsay, Si-Chee; Nguyen, Xuan Anh

    2016-04-01

Low-level stratocumulus (Sc) clouds cover more of the Earth's surface than any other cloud type rendering them critical for Earth's energy balance, primarily via reflection of solar radiation, as well as their role in the global hydrological cycle. Stratocumuli are particularly sensitive to changes in aerosol loading on both microphysical and macrophysical scales, yet the complex feedbacks involved in aerosol-cloud-precipitation interactions remain poorly understood. Moreover, research on these clouds has largely been confined to marine environments, with far fewer studies over land where major sources of anthropogenic aerosols exist. The aerosol burden over Southeast Asia (SEA) in boreal spring, attributed to biomass burning (BB), exhibits highly consistent spatiotemporal distribution patterns, with major variability due to changes in aerosol loading mediated by processes ranging from large-scale climate factors to diurnal meteorological events. Downwind from source regions, the transported BB aerosols often overlap with low-level Sc cloud decks associated with the development of the region's pre-monsoon system, providing a unique, natural laboratory for further exploring their complex micro- and macro-scale relationships. Compared to other locations worldwide, studies of springtime biomass-burning aerosols and the predominantly Sc cloud systems over SEA and their ensuing interactions are underrepresented in scientific literature. Measurements of aerosol and cloud properties, whether ground-based or from satellites, generally lack information on microphysical processes; thus cloud-resolving models are often employed to simulate the underlying physical processes in aerosol-cloud-precipitation interactions. The Goddard Cumulus Ensemble (GCE) cloud model has recently been enhanced with a triple-moment (3M) bulk microphysics scheme as well as the Regional Atmospheric Modeling System (RAMS) version 6 aerosol module. Because the aerosol burden not only affects cloud

  3. Ultraviolet Interstellar Linear Polarization: Initial Modeling Efforts for the Astro-2 WUPPE Data

    Science.gov (United States)

    Wolff, M. J.; Anderson, C. M.; Clayton, Geoff; Kim, S.-H.; Martin, P. G.

    1996-05-01

Prior to the flight of the Wisconsin Ultraviolet Photo Polarimeter Experiment (WUPPE) on Astro-2, studies of ultraviolet (UV) interstellar linear polarization have generally categorized the wavelength dependence in two ways: that which agrees with an extrapolation of the Serkowski Law into the UV and that which has a polarization greater than the extrapolation (see Clayton et al. 1995 and references within). Only one object (HD 197770) had been reported to deviate from either of these behaviors. It is important to note that earlier work has been limited in scope primarily by the amount of data available (14 published sightlines). However, with the flight of Astro-2, WUPPE has tripled the number of UV interstellar polarization observations (Anderson et al. 1995, 1996). These new data will provide a significant improvement to our ability to test interstellar dust grain models and study the effects of sightline environments. We present the modeling results for several WUPPE (Astro-2) sightlines, including two which clearly depart from the previously mentioned categorizations: HD 147933 and HD 197770. In addition to "classical" grain modeling techniques (series solution, Effective Medium Theory), we also employ the Maximum Entropy Method and the Discrete Dipole Approximation. WUPPE is supported by NASA contract NAS 5-26777. Anderson, C.M., Weitenbach, A.J., & Code, A.D. 1995, Proceedings of the Conference on Polarimetry in the Interstellar Medium, eds. Roberge & Whittet, Troy, NY, June 1995. Anderson, C.M. et al. 1996, ApJ, submitted. Clayton, G. C. et al. 1995, ApJ, 445, 947
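For context, the Serkowski Law referred to above has a standard closed form; this small sketch (with K = 1.15, the commonly quoted constant, used as an illustrative value) gives the extrapolation against which the UV measurements are compared:

```python
import math

def serkowski(wl, p_max, wl_max, K=1.15):
    """Serkowski law for interstellar linear polarization as a function
    of wavelength wl (same units as wl_max, e.g. microns): polarization
    peaks at wl_max and falls off on either side."""
    return p_max * math.exp(-K * math.log(wl_max / wl) ** 2)
```

The curve peaks at `wl_max` and declines into the UV; sightlines whose measured UV polarization exceeds this extrapolated value fall into the second category described in the abstract.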

  4. An effort allocation model considering different budgetary constraint on fault detection process and fault correction process

    Directory of Open Access Journals (Sweden)

    Vijay Kumar

    2016-01-01

Fault detection process (FDP) and fault correction process (FCP) are important phases of the software development life cycle (SDLC). It is essential for software to undergo a testing phase, during which faults are detected and corrected. The main goal of this article is to allocate the testing resources in an optimal manner to minimize the cost during the testing phase using FDP and FCP under a dynamic environment. In this paper, we first assume there is a time lag between fault detection and fault correction. Thus, removal of a fault is performed after the fault is detected. In addition, the detection process and correction process are taken to be independent simultaneous activities with different budgetary constraints. A structured optimal policy based on optimal control theory is proposed for software managers to optimize the allocation of the limited resources with the reliability criteria. Furthermore, the release policy for the proposed model is also discussed. A numerical example is given in support of the theoretical results.
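A common way to formalise the detection-correction time lag (a generic software reliability growth sketch, not the authors' exact optimal-control formulation, with illustrative parameter values) treats correction as detection delayed by a constant lag:

```python
import math

def detected(t, a=100.0, b=0.3):
    """Expected faults detected by time t under an exponential SRGM:
    a = total latent faults, b = detection rate (illustrative values)."""
    return a * (1.0 - math.exp(-b * t))

def corrected(t, lag=2.0, a=100.0, b=0.3):
    """Expected faults corrected by time t: the detection curve shifted
    by a constant lag between detecting a fault and removing it."""
    return detected(t - lag, a, b) if t > lag else 0.0
```

At any time the corrected count trails the detected count by the lag, which is why the paper budgets the two processes separately rather than treating them as one.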

  5. From requirements to Java in a snap model-driven requirements engineering in practice

    CERN Document Server

    Smialek, Michal

    2015-01-01

    This book provides a coherent methodology for Model-Driven Requirements Engineering which stresses the systematic treatment of requirements within the realm of modelling and model transformations. The underlying basic assumption is that detailed requirements models are used as first-class artefacts playing a direct role in constructing software. To this end, the book presents the Requirements Specification Language (RSL) that allows precision and formality, which eventually permits automation of the process of turning requirements into a working system by applying model transformations and co

  6. Latent Tuberculosis: Models, Computational efforts and the Pathogen's regulatory mechanisms during dormancy

    Directory of Open Access Journals (Sweden)

Gesham Magombedze

    2013-08-01

Latent tuberculosis is a clinical syndrome that occurs after an individual has been exposed to the Mycobacterium tuberculosis (Mtb) bacillus, the infection has been established and an immune response has been generated to control the pathogen and force it into a quiescent state. Mtb can exit this quiescent state, where it is unresponsive to treatment and elusive to the immune response, and enter a rapidly replicating state, hence causing infection reactivation. How the pathogen causes a persistent infection remains a grey area, and it is unclear whether the organism will be in a slow replicating state or a dormant non-replicating state. The ability of the pathogen to adapt to changing host immune response mechanisms, in which it is exposed to hypoxia, low pH, nitric oxide (NO), nutrient starvation and several other anti-microbial effectors, is associated with a high metabolic plasticity that enables it to metabolise under these different conditions. Adaptive gene regulatory mechanisms are thought to coordinate how the pathogen changes its metabolic pathways through mechanisms that sense changes in oxygen tension and other stress factors, hence stimulating the pathogen to make necessary adjustments to ensure survival. Here, we review studies that give insights into latency/dormancy regulatory mechanisms that enable infection persistence and pathogen adaptation to different stress conditions. We highlight what mathematical and computational models can do and what they should do to enhance our current understanding of TB latency.

  7. Encoding of visual-spatial information in working memory requires more cerebral efforts than retrieval: Evidence from an EEG and virtual reality study.

    Science.gov (United States)

    Jaiswal, N; Ray, W; Slobounov, S

    2010-08-01

Visual-spatial working memory tasks can be decomposed into encoding and retrieval phases. It was hypothesized that encoding of visual-spatial information is cognitively more challenging than retrieval. This was tested by combining electroencephalography with a virtual reality paradigm to observe the modulation in EEG activity. EEG power analysis results demonstrated an increase in theta activity during encoding in comparison to retrieval, whereas alpha activity was significantly higher for retrieval in comparison to encoding. We found that encoding required more cerebral effort than retrieval. Further, as seen in fMRI studies, we observed an encoding/retrieval flip in that encoding and retrieval differentially activated similar neural substrates. Results obtained from sLORETA identified cortical sources in the inferior frontal gyrus, which is a part of the dorsolateral prefrontal cortex (DLPFC), during encoding, whereas the inferior parietal lobe and precuneus cortical sources were identified during retrieval. We further tie our results into studies examining the default network, which have shown that increased activation in the DLPFC occurs in response to increased cerebral challenge, while posterior parietal areas show activation during baseline or internal processing tasks. We conclude that encoding of visual-spatial information via a VR navigation task is more cerebrally challenging than retrieval.
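The theta/alpha comparison reported above rests on band-power estimates. A minimal, self-contained sketch (direct DFT, with the conventional band edges of roughly 4-8 Hz for theta and 8-13 Hz for alpha) computes the power in a frequency band:

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Total power in [f_lo, f_hi) Hz via a direct DFT. O(n^2), fine for
    short demo signals; a real EEG pipeline would use an FFT with
    windowing (e.g. Welch's method) instead."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if f_lo <= f < f_hi:
            re = sum(signal[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
            im = sum(signal[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
            power += (re * re + im * im) / n ** 2
    return power
```

A pure 6 Hz oscillation shows up in the theta band and contributes nothing to alpha, which is the kind of contrast the study's power analysis quantifies between encoding and retrieval.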

  8. Three-dimensional cell culture models for anticancer drug screening: Worth the effort?

    Science.gov (United States)

    Verjans, Eddy-Tim; Doijen, Jordi; Luyten, Walter; Landuyt, Bart; Schoofs, Liliane

    2017-06-15

High attrition of new oncology drug candidates in clinical trials is partially caused by the poor predictive capacity of artificial monolayer cell culture assays early in drug discovery. Monolayer assays do not take the natural three-dimensional (3D) microenvironment of cells into account. As a result, false positive compounds often enter clinical trials, leading to high dropout rates and a waste of time and money. Over the past 2 decades, tissue engineers and cell biologists have developed a broad range of 3D in vitro culturing tools that better represent in vivo cell biology. These tools preserve the 3D architecture of cells and can be used to predict toxicity of and resistance against antitumor agents. Recent progress in tissue engineering further improves 3D models by taking into account the tumor microenvironment, which is important for metastatic progression and vascularization. However, the widespread implementation of 3D cell cultures into cell-based research programs has been limited by various factors, including their cost and reproducibility. In addition, different 3D cell culture techniques often produce spheroids of different size and shape, which can strongly influence drug efficacy and toxicity. Hence, it is imperative to morphometrically characterize multicellular spheroids to avoid generalizations among different spheroid types. Standardized 3D culturing procedures could further reduce data variability and enhance biological relevance. Here, we critically evaluate the benefits and challenges inherent to growing cells in 3D, along with an overview of the techniques used to form spheroids. This is done with a specific focus on antitumor drug screening.

  9. Putting the Hydrology Back in Water Resources: Recent Efforts to Improve Representation of Physical Hydrology in Water Resources Planning and Operations Models

    Science.gov (United States)

    Ferguson, I. M.; Parker, N.; Draper, A.; Dogrul, E. C.; Condon, L. E.

    2012-12-01

Water resources planners and managers rely on a broad range of data analysis and modeling tools. Data analysis, statistical models, and physical hydrology models are used to estimate water supply, while systems-based planning and operations models are used to simulate system operation with respect to competing objectives—e.g., water supply vs. flood control vs. in-stream flows—under physical and regulatory constraints. In general, physical hydrology models neglect water operations, while planning and operations models lack a physically based representation of hydrologic processes. Accurate assessment of climate change impacts on water resources requires modeling tools that integrate physical hydrology and water resources operations. This presentation will discuss recent efforts to improve representation of physical hydrology in water resources planning and operations models, focusing on key challenges, trade-offs between various approaches, and implications for climate change risk assessment and adaptation studies. Discussion will focus on recent model development by the US Bureau of Reclamation, California Department of Water Resources, and collaborators for the Sacramento-San Joaquin watershed in California.

  10. Modeling uncertainty in requirements engineering decision support

    Science.gov (United States)

    Feather, Martin S.; Maynard-Zhang, Pedrito; Kiper, James D.

    2005-01-01

One inherent characteristic of requirements engineering is a lack of certainty during this early phase of a project. Nevertheless, decisions about requirements must be made in spite of this uncertainty. Here we describe the context in which we are exploring this, and some initial work to support elicitation of uncertain requirements, and to deal with the combination of such information from multiple stakeholders.

  12. A goal-oriented requirements modelling language for enterprise architecture

    NARCIS (Netherlands)

Quartel, Dick; Engelsman, Wilco; Jonkers, Henk; van Sinderen, Marten

    2009-01-01

Methods for enterprise architecture, such as TOGAF, acknowledge the importance of requirements engineering in the development of enterprise architectures. Modelling support is needed to specify, document, communicate and reason about goals and requirements. Current modelling techniques for enterprise architecture focus on the products, services, processes and applications of an enterprise.

  14. Specification of advanced safety modeling requirements (Rev. 0).

    Energy Technology Data Exchange (ETDEWEB)

    Fanning, T. H.; Tautges, T. J.

    2008-06-30

The U.S. Department of Energy's Global Nuclear Energy Partnership has led to renewed interest in liquid-metal-cooled fast reactors for the purpose of closing the nuclear fuel cycle and making more efficient use of future repository capacity. However, the U.S. has not designed or constructed a fast reactor in nearly 30 years. Accurate, high-fidelity, whole-plant dynamics safety simulations will play a crucial role by providing confidence that component and system designs will satisfy established design limits and safety margins under a wide variety of operational, design basis, and beyond design basis transient conditions. Current modeling capabilities for fast reactor safety analyses have resulted from several hundred person-years of code development effort supported by experimental validation. The broad spectrum of mechanistic and phenomenological models that have been developed represents an enormous amount of institutional knowledge that needs to be maintained. Complicating this, the existing code architectures for safety modeling evolved from programming practices of the 1970s. This has led to monolithic applications with interdependent data models which require significant knowledge of the complexities of the entire code in order for each component to be maintained. In order to develop an advanced fast reactor safety modeling capability, the limitations of the existing code architecture must be overcome while preserving the capabilities that already exist. To accomplish this, a set of advanced safety modeling requirements is defined, based on modern programming practices, that focuses on modular development within a flexible coupling framework. An approach for integrating the existing capabilities of the SAS4A/SASSYS-1 fast reactor safety analysis code into the SHARP framework is provided in order to preserve existing capabilities while providing a smooth transition to advanced modeling capabilities. In doing this, the advanced fast reactor safety models

  15. Applying the effort-reward imbalance model to household and family work: a population-based study of German mothers

    Directory of Open Access Journals (Sweden)

Sperlich, Stefanie

    2012-01-01

Background: This paper reports on results of a newly developed questionnaire for the assessment of effort-reward imbalance (ERI) in unpaid household and family work. Methods: Using a cross-sectional population-based survey of German mothers (n = 3129), the dimensional structure of the theoretical ERI model was validated by means of Confirmatory Factor Analysis (CFA). Analyses of Variance were computed to examine relationships between ERI and social factors and health outcomes. Results: CFA revealed good psychometric properties, indicating that the subscale 'effort' is based on one latent factor and the subscale 'reward' is composed of four dimensions: 'intrinsic value of family and household work', 'societal esteem', 'recognition from the partner', and 'affection from the child(ren)'. About 19.3% of mothers perceived lack of reciprocity and 23.8% showed high rates of overcommitment in terms of inability to withdraw from household and family obligations. Socially disadvantaged mothers were at higher risk of ERI, in particular with respect to the perception of low societal esteem. Gender inequality in the division of household and family work and work-family conflict accounted most for ERI in household and family work. Analogous to ERI in paid work, we could demonstrate that ERI affects self-rated health, somatic complaints, mental health and, to some extent, hypertension. Conclusions: The newly developed questionnaire demonstrates satisfactory validity and promising results for extending the ERI model to household and family work.
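The core quantity in ERI research is the effort-reward ratio. A minimal sketch of the usual Siegrist-style scoring (item counts and scores below are illustrative, not the questionnaire described in the record) is:

```python
def eri_ratio(effort_items, reward_items):
    """Effort-reward ratio. The correction factor c adjusts for unequal
    numbers of effort and reward items; a ratio above 1 indicates
    effort outweighing reward, i.e. lack of reciprocity."""
    c = len(effort_items) / len(reward_items)
    return sum(effort_items) / (c * sum(reward_items))
```

For example, identical effort and reward scores give a ratio of 1.0 (balance), while doubled effort scores give 2.0; the "19.3% of mothers perceived lack of reciprocity" in the abstract corresponds to ratios on the effort-heavy side of such a threshold.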

  16. DECISION MAKING MODELING OF CONCRETE REQUIREMENTS

    Directory of Open Access Journals (Sweden)

    Suhartono Irawan

    2001-01-01

This paper presents the results of an experimental evaluation between predicted and actual concrete strength. The scope of the evaluation is the optimisation of the cement content for different concrete grades as a result of bringing the target mean value of test cubes closer to the required characteristic strength value by reducing the standard deviation. Keywords: concrete mix design, acceptance control, optimisation, cement content.

  17. A MODEL FOR ALIGNING SOFTWARE PROJECTS REQUIREMENTS WITH PROJECT TEAM MEMBERS REQUIREMENTS

    Directory of Open Access Journals (Sweden)

    Robert Hans

    2013-02-01

The fast-paced, dynamic environment within which information and communication technology (ICT) projects are run, as well as ICT professionals’ constantly changing requirements, presents a challenge for project managers in terms of aligning projects’ requirements with project team members’ requirements. This research paper proposes that if projects’ requirements are properly aligned with team members’ requirements, this will result in a balanced decision approach. Moreover, such an alignment will result in the realization of employees’ needs as well as meeting the project’s needs. This paper presents a Project’s requirements and project Team members’ requirements (PrTr) alignment model and argues that a balanced decision which meets both software project’s requirements and team members’ requirements can be achieved through the application of the PrTr alignment model.

  18. Supporting requirements model evolution throughout the system life-cycle

    OpenAIRE

    Ernst, Neil; Mylopoulos, John; Yu, Yijun; Ngyuen, Tien T.

    2008-01-01

Requirements models are essential not just during system implementation, but also to manage system changes post-implementation. Such models should be supported by a requirements model management framework that allows users to create, manage and evolve models of domains, requirements, code and other design-time artifacts along with traceability links between their elements. We propose a comprehensive framework which delineates the operations and elements necessary, and then describes a tool imp...

  19. Effort rights-based management

    DEFF Research Database (Denmark)

    Squires, Dale; Maunder, Mark; Allen, Robin

    2017-01-01

    Effort rights-based fisheries management (RBM) is less widely used than catch rights, whether for groups or individuals. Because RBM on catch or effort necessarily requires a total allowable catch (TAC) or total allowable effort (TAE), RBM is discussed in conjunction with issues in assessing fish...

  20. New Experimental Models of Diabetic Nephropathy in Mice Models of Type 2 Diabetes: Efforts to Replicate Human Nephropathy

    Directory of Open Access Journals (Sweden)

    María José Soler

    2012-01-01

Diabetic nephropathy (DN) is the leading cause of end-stage renal disease. The use of experimental models of DN has provided valuable information regarding many aspects of DN, including pathophysiology, progression, implicated genes, and new therapeutic strategies. A large number of mouse models of diabetes have been identified and their kidney disease has been characterized to varying degrees. Most experimental models of type 2 DN are helpful in studying early stages of DN, but these models have not been able to reproduce the characteristic features of more advanced DN in humans such as nodules in the glomerular tuft or glomerulosclerosis. The generation of new experimental models of DN created by crossing, knockdown, or knockin of genes continues to provide improved tools for studying DN. These models provide an opportunity to search for new mechanisms involving the development of DN, but their shortcomings should be recognized as well. Moreover, it is important to recognize that the genetic background has a substantial effect on the susceptibility to diabetes and kidney disease development in the various models of diabetes.

  1. Requirements model for an e-Health awareness portal

    Science.gov (United States)

    Hussain, Azham; Mkpojiogu, Emmanuel O. C.; Nawi, Mohd Nasrun M.

    2016-08-01

Requirements engineering is at the heart and foundation of the software engineering process. Poor quality requirements inevitably lead to poor quality software solutions. Also, poor requirement modeling is tantamount to designing a poor quality product. Thus, quality-assured requirements development goes hand in hand with usable products, giving the software product the quality it demands. In light of the foregoing, the requirements for an e-Ebola Awareness Portal were modeled with careful attention given to these software engineering concerns. The requirements for the e-Health Awareness Portal are modeled as a contribution to the fight against Ebola and help in the fulfillment of the United Nations' Millennium Development Goal No. 6. In this study, requirements were modeled using the UML 2.0 modeling technique.

  2. Extending enterprise architecture modelling with business goals and requirements

    NARCIS (Netherlands)

    Engelsman, Wilco; Quartel, Dick; Jonkers, Henk; Sinderen, van Marten

    2011-01-01

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling techniques for EA focus on the products, services, processes and applications of an enterprise.

  3. DISPLACE: a dynamic, individual-based model for spatial fishing planning and effort displacement: Integrating underlying fish population models

    DEFF Research Database (Denmark)

    Bastardie, Francois; Nielsen, J. Rasmus; Miethe, Tanja

    We previously developed a spatially explicit, individual-based model (IBM) evaluating the bio-economic efficiency of fishing vessel movements between regions according to the catching and targeting of different species, based on the most recent high-resolution spatial fishery data. The main purpose… version couples the vessel model to selected size-based population models and considers the underlying resource dynamics in the distribution and density patterns of the targeted stocks for the cases of Danish and German vessels harvesting the North Sea and Baltic fish stocks. The stochastic fishing… by vessels on the fish stocks, with resulting fishing mortality, and the vessels' economic consequences are evaluated at high spatial and seasonal disaggregation levels by simulating different individual choices of vessel speed, fishing grounds and ports. All tested scenarios led to increased overall energy…

  4. Extending enterprise architecture modelling with business goals and requirements

    Science.gov (United States)

    Engelsman, Wilco; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten

    2011-02-01

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling techniques for EA focus on the products, services, processes and applications of an enterprise. In addition, techniques may be provided to describe structured requirements lists and use cases. Little support is available however for modelling the underlying motivation of EAs in terms of stakeholder concerns and the high-level goals that address these concerns. This article describes a language that supports the modelling of this motivation. The definition of the language is based on existing work on high-level goal and requirements modelling and is aligned with an existing standard for enterprise modelling: the ArchiMate language. Furthermore, the article illustrates how EA can benefit from analysis techniques from the requirements engineering domain.

  5. Defining Requirements and Applying Information Modeling for Protecting Enterprise Assets

    Science.gov (United States)

    Fortier, Stephen C.; Volk, Jennifer H.

    The advent of terrorist threats has heightened local, regional, and national governments' interest in emergency response and disaster preparedness. The threat of natural disasters also challenges emergency responders to act swiftly and in a coordinated fashion. When a disaster occurs, an ad hoc coalition of pre-planned groups usually forms to respond to the incident. History has shown that these “system of systems” do not interoperate very well. Communications between fire, police and rescue components either do not work or are inefficient. Government agencies, non-governmental organizations (NGOs), and private industry use a wide array of software platforms for managing data about emergency conditions, resources and response activities. Most of these are stand-alone systems with very limited capability for data sharing with other agencies or other levels of government. Information technology advances have facilitated the movement towards an integrated and coordinated approach to emergency management. Other communication mechanisms, such as video teleconferencing, digital television and radio broadcasting, are being utilized to combat the challenges of emergency information exchange. Recent disasters, such as Hurricane Katrina and the tsunami in Indonesia, have illuminated the weaknesses in emergency response. This paper will discuss the need for defining requirements for components of ad hoc coalitions which are formed to respond to disasters. A goal of our effort was to develop a proof of concept that applying information modeling to the business processes used to protect and mitigate potential loss of an enterprise was feasible. These activities would be modeled both pre- and post-incident.

  6. The Development of the Concepts of Effort and Ability, Perception of Academic Attainment, and the Understanding that Difficult Tasks Require More Ability.

    Science.gov (United States)

    Nicholls, John G.

    1978-01-01

    Selected cognitive developments presumed to mediate the development of achievement motivation are described. Age trends for four causal schemes involving the concepts of effort and ability from 5 to 13 years of age are presented. Developments related to ability, task difficulty, and incentive value are also described. (Author/JMB)

  7. Mixing Formal and Informal Model Elements for Tracing Requirements

    DEFF Research Database (Denmark)

    Jastram, Michael; Hallerstede, Stefan; Ladenberger, Lukas

    2011-01-01

    Tracing between informal requirements and formal models is challenging. A method for such tracing should permit to deal efficiently with changes to both the requirements and the model. A particular challenge is posed by the persisting interplay of formal and informal elements. In this paper, we… a system for traceability with a state-based formal method that supports refinement. We do not require all specification elements to be modelled formally and support incremental incorporation of new specification elements into the formal model. Refinement is used to deal with larger amounts of requirements…

  8. Associations of Extrinsic and Intrinsic Components of Work Stress with Health: A Systematic Review of Evidence on the Effort-Reward Imbalance Model.

    Science.gov (United States)

    Siegrist, Johannes; Li, Jian

    2016-04-19

    Mainstream psychological stress theory claims that it is important to include information on people's ways of coping with work stress when assessing the impact of stressful psychosocial work environments on health. Yet, some widely used theoretical models focus exclusively on extrinsic factors. The model of effort-reward imbalance (ERI) differs from them as it explicitly combines information on extrinsic and intrinsic factors in studying workers' health. As a growing number of studies have used the ERI model in the recent past, we conducted a systematic review of available evidence, with a special focus on the distinct contribution of its intrinsic component, the coping pattern "over-commitment", towards explaining health. Moreover, we explore whether the interaction of intrinsic and extrinsic components exceeds the size of effects on health attributable to single components. Results based on 51 reports document an independent explanatory role of "over-commitment" in explaining workers' health in a majority of studies. However, support in favour of the interaction hypothesis is limited and requires further exploration. In conclusion, the findings of this review support the usefulness of a work stress model that combines extrinsic and intrinsic components in terms of scientific explanation and of designing more comprehensive worksite stress prevention programs.

  9. Using cognitive modeling for requirements engineering in anesthesiology

    NARCIS (Netherlands)

    Pott, C; le Feber, J

    2005-01-01

    Cognitive modeling is a complexity reducing method to describe significant cognitive processes under a specified research focus. Here, a cognitive process model for decision making in anesthesiology is presented and applied in requirements engineering. Three decision making situations of

  10. Child Care: State Efforts to Enforce Safety and Health Requirements. Report to the Honorable Sander M. Levin, House of Representatives. GAO-04-786

    Science.gov (United States)

    Shaul, Marnie S.

    2004-01-01

    The federal government requires states that receive funds from the Child Care and Development Fund to establish basic health and safety requirements. The federal government also requires states receiving federal funds for child care to have procedures in place to ensure that providers being paid with grant dollars comply with the applicable safety…

  11. Short-term dispersal of Fukushima-derived radionuclides off Japan: modeling efforts and model-data intercomparison

    Directory of Open Access Journals (Sweden)

    I. I. Rypina

    2013-07-01

    Full Text Available The Great East Japan Earthquake and tsunami that caused a loss of power at the Fukushima nuclear power plants (FNPP) resulted in emission of radioactive isotopes into the atmosphere and the ocean. In June of 2011, an international survey measuring a variety of radionuclide isotopes, including 137Cs, was conducted in surface and subsurface waters off Japan. This paper presents the results of numerical simulations specifically aimed at interpreting these observations and investigating the spread of Fukushima-derived radionuclides off the coast of Japan and into the greater Pacific Ocean. Together, the simulations and observations allow us to study the dominant mechanisms governing this process, and to estimate the total amount of radionuclides in discharged coolant waters and atmospheric airborne radionuclide fallout. The numerical simulations are based on two different ocean circulation models, one inferred from AVISO altimetry and NCEP/NCAR reanalysis wind stress, and the second generated numerically by the NCOM model. Our simulations determine that > 95% of 137Cs remaining in the water within ~600 km of Fukushima, Japan in mid-June 2011 was due to the direct oceanic discharge. The estimated strength of the oceanic source is 16.2 ± 1.6 PBq, based on minimizing the model-data mismatch. We cannot make an accurate estimate for the atmospheric source strength since most of the fallout cesium had left the survey area by mid-June. The model explained several key features of the observed 137Cs distribution. First, the absence of 137Cs at the southernmost stations is attributed to the Kuroshio Current acting as a transport barrier against the southward progression of 137Cs. Second, the largest 137Cs concentrations were associated with a semi-permanent eddy that entrained 137Cs-rich waters, collecting and stirring them around the eddy perimeter. Finally, the intermediate 137Cs concentrations at the westernmost stations are attributed to younger, and…
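    The record above reports a source strength of 16.2 ± 1.6 PBq obtained by minimizing the model-data mismatch. As a hypothetical illustration of that estimation idea (not the authors' code or data), the sketch below scales a unit-source model field to synthetic observations by least squares; all names and values are invented for the example.

    ```python
    import numpy as np

    def estimate_source_strength(model_unit, observed):
        """Return the Q minimizing ||Q * model_unit - observed||^2.

        Assumes the simulated concentration field scales linearly in the
        source strength Q, so the best fit is ordinary least squares
        through the origin."""
        model_unit = np.asarray(model_unit, dtype=float)
        observed = np.asarray(observed, dtype=float)
        return float(model_unit @ observed / (model_unit @ model_unit))

    # Synthetic example: true Q = 16.2 (arbitrary units) plus observation noise.
    rng = np.random.default_rng(0)
    unit_field = rng.uniform(0.1, 1.0, size=50)        # model response to Q = 1
    obs = 16.2 * unit_field + rng.normal(0, 0.5, size=50)
    q_hat = estimate_source_strength(unit_field, obs)  # close to 16.2
    ```

    In the real study the "model field" comes from the two ocean circulation simulations and the observations from the June 2011 survey; the sketch only conveys the fitting step.
    
    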

  12. Short-term dispersal of Fukushima-derived radionuclides off Japan: modeling efforts and model-data intercomparison

    Directory of Open Access Journals (Sweden)

    I. I. Rypina

    2013-01-01

    Full Text Available The March of 2011 earthquake and tsunami that caused a loss of power at the Fukushima nuclear power plants (FNPP) resulted in emission of radioactive isotopes into the atmosphere and the ocean. In June of 2011, an international survey of various radionuclide isotopes, including 137Cs, was conducted in surface and subsurface waters off Japan. This paper presents the results of numerical simulations aimed at interpreting these observations, investigating the spread of Fukushima-derived radionuclides off the coast of Japan and into the greater Pacific Ocean, studying the dominant mechanisms governing this process, as well as estimating the total amount of radionuclides in discharged coolant waters and atmospheric airborne radionuclide fallout. The numerical simulations are based on two different ocean circulation models, one inferred from AVISO altimetry and NCEP/NCAR reanalysis wind stress, and the second generated numerically by the NCOM model. Our simulations determine that >95% of 137Cs remaining in the water within ~600 km of Fukushima, Japan in mid-June 2011 was due to the direct oceanic discharge. The estimated strength of the oceanic source is 16.2 ± 1.6 PBq, based on minimizing the model-data mismatch. We cannot make an accurate estimate for the atmospheric source strength since most of the fallout cesium would have moved out of the survey area by mid-June. The model explained several features of the observed 137Cs distribution. First, the absence of 137Cs at the southernmost stations is attributed to the Kuroshio Current acting as a transport barrier against the southward progression of 137Cs. Second, the largest 137Cs concentrations were associated with a semi-permanent eddy that entrained 137Cs-rich waters, collecting and stirring them around the eddy perimeter. Finally, the intermediate 137Cs concentrations at the westernmost stations were attributed…

  13. Software Requirements Specification Verifiable Fuel Cycle Simulation (VISION) Model

    Energy Technology Data Exchange (ETDEWEB)

    D. E. Shropshire; W. H. West

    2005-11-01

    The purpose of this Software Requirements Specification (SRS) is to define the top-level requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). This simulation model is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies.

  14. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    …with simple unified modelling language (UML) requirements models, it is not easy for the development team to get confidence on the stakeholders' requirements validation. This paper describes an approach, based on the construction of executable interactive prototypes, to support the validation of workflow…

  15. Improved Traceability of a Small Satellite Mission Concept to Requirements Using Model Based System Engineering

    Science.gov (United States)

    Reil, Robin L.

    2014-01-01

    Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve the "traditional" document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g. time and cost saved, increased product quality). This paper presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to demonstrate the completeness and consistency of the requirements to the mission concept. Anecdotal information and process-duration metrics are presented for both the MBSE and original DBSE design efforts of SporeSat.
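    The active dependency relationships described in this record enable automated completeness and consistency checks. The toy sketch below (not the SporeSat model; all element and requirement names are hypothetical) shows the kind of check such traceability makes possible: flagging requirements that trace to nothing and concept elements covered by no requirement.

    ```python
    # Hypothetical mission-concept elements and requirement-to-element traces.
    concept_elements = {"payload", "bus", "comms", "ground-segment"}

    traces = {
        "REQ-1": {"payload"},
        "REQ-2": {"bus", "comms"},
        "REQ-3": set(),            # orphan: traces to no concept element
    }

    # Completeness: every requirement should trace to at least one element.
    orphan_requirements = sorted(r for r, elems in traces.items() if not elems)

    # Coverage: every concept element should be addressed by some requirement.
    uncovered_elements = sorted(concept_elements - set().union(*traces.values()))
    ```

    Running the checks on this toy data flags `REQ-3` as an orphan and `ground-segment` as uncovered; in an MBSE tool the same queries run against the live model rather than hand-maintained tables.
    
    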

  16. Improved Traceability of Mission Concept to Requirements Using Model Based Systems Engineering

    Science.gov (United States)

    Reil, Robin

    2014-01-01

    Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve the traditional document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g. time and cost saved, increased product quality). This thesis presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to analyze the completeness and consistency of the requirements to the mission concept. Overall experience and methodology are presented for both the MBSE and original DBSE design efforts of SporeSat.

  17. Co-effect of Demand-control-support Model and Effort-reward Imbalance Model on Depression Risk Estimation in Humans:Findings from Henan Province of China

    Institute of Scientific and Technical Information of China (English)

    YU Shan Fa; NAKATA Akinori; GU Gui Zhen; SWANSON Naomi G; ZHOU Wen Hui; HE Li Hua; WANG Sheng

    2013-01-01

    Objective To investigate the joint effect of the Demand-control-support (DCS) model and the Effort-reward Imbalance (ERI) model on the risk estimation of depression in humans, in comparison with the effects when the models are used separately. Methods A total of 3 632 males and 1 706 females from 13 factories and companies in Henan province were recruited in this cross-sectional study. Perceived job stress was evaluated with the Job Content Questionnaire and Effort-Reward Imbalance Questionnaire (Chinese version). Depressive symptoms were assessed by using the Center for Epidemiological Studies Depression Scale (CES-D). Results DC (demands/job control ratio) and ERI were shown to be independently associated with depressive symptoms. The outcomes for low social support and overcommitment were similar. High DC and low social support (SS), high ERI and high overcommitment, and high DC and high ERI posed greater risks of depressive symptoms than each of them did alone. The ERI model and the SS model seem to be effective in estimating the risk of depressive symptoms when used separately. Conclusion The DC ratio performed better when it was used in combination with low SS. The effect on physical demands was better than on psychological demands. The combination of the DCS and ERI models could improve the risk estimation of depressive symptoms in humans.
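    The two indices combined in this record are both simple ratios. The sketch below is an illustrative computation only (not the study's scoring algorithm): the demand/control ratio of the DCS model and the effort/reward ratio of the ERI model, plus a combined high-risk flag. The cutoffs and the support threshold are hypothetical placeholders, not values from the paper.

    ```python
    def dc_ratio(demands, control):
        """Job strain index: psychological demands over decision latitude (control)."""
        return demands / control

    def eri_ratio(effort, reward, k=1.0):
        """Effort-reward imbalance ratio; k is a correction factor for
        unequal numbers of effort and reward items (here left at 1)."""
        return effort / (reward * k)

    def combined_high_risk(demands, control, effort, reward, support,
                           support_cutoff=2.0):
        """Hypothetical combined profile: high strain AND imbalance AND
        low social support, mirroring the joint exposures examined above."""
        return (dc_ratio(demands, control) > 1.0
                and eri_ratio(effort, reward) > 1.0
                and support < support_cutoff)
    ```

    With scores `demands=4, control=2, effort=3, reward=2, support=1` the flag is raised; reversing the ratios clears it. Real scoring would use the questionnaires' published item weights and cutpoints.
    
    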

  18. Process Model for Defining Space Sensing and Situational Awareness Requirements

    Science.gov (United States)

    2006-04-01

    process model for defining systems for space sensing and space situational awareness is presented. The paper concentrates on eight steps for determining the requirements to include: decision maker needs, system requirements, exploitation methods and vulnerabilities, critical capabilities, and identify attack scenarios. Utilization of the USAF anti-tamper (AT) implementation process as a process model departure point for the space sensing and situational awareness (SSSA...is presented. The AT implementation process model , as an

  19. [Measuring psychosocial stress at work in Spanish hospital's personnel. Psychometric properties of the Spanish version of Effort-Reward Imbalance model].

    Science.gov (United States)

    Macías Robles, María Dolores; Fernández-López, Juan Antonio; Hernández-Mejía, Radhamés; Cueto-Espinar, Antonio; Rancaño, Iván; Siegrist, Johannes

    2003-05-10

    Two main models are currently used to evaluate the psychosocial factors at work: the Demand-Control (or job strain) model developed by Karasek and the Effort-Reward Imbalance model, developed by Siegrist. A Spanish version of the first model has been validated, yet so far no validated Spanish version of the second model is available. The objective of this study was to explore the psychometric properties of the Spanish version of the Effort-Reward Imbalance model in terms of internal consistency, factorial validity, and discriminant validity. A cross-sectional study on a representative sample of 298 workers of the Spanish public hospital San Agustin in Asturias was performed. The Spanish version of the Effort-Reward Imbalance Questionnaire (23 items) was obtained by a standard forward/backward translation procedure, and the information was gathered by a self-administered application. Exploratory factor analyses were performed to test the dimensional structure of the theoretical model. Cronbach's alpha coefficient was calculated to estimate the internal consistency reliability. Information on discriminant validity is given for sex, age and education. Differences were calculated with the t-test for two independent samples or ANOVA, respectively. Internal consistency was satisfactory for the two scales (reward and intrinsic effort), and Cronbach's alpha coefficients higher than 0.80 were observed. The internal consistency for the scale of extrinsic effort was lower (alpha = 0.63). A three-factor solution was retained for the factor analysis of reward as expected, and these dimensions were interpreted as a) esteem, b) job promotion and salary and c) job instability. A one-factor solution was retained for the factor analysis of intrinsic effort. The factor analysis of the scale of extrinsic effort did not support the expected one-dimension structure. The analysis of discriminant validity displayed significant associations between measures of Effort-Reward Imbalance and the…
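    The internal-consistency figures quoted in this record (alpha above 0.80 for two scales, 0.63 for extrinsic effort) come from Cronbach's alpha. As a minimal sketch of that statistic, assuming nothing about the study's data (the matrix below is synthetic), alpha is computed from the item variances and the variance of the summed score:

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for a scale.

        items: 2-D array, rows = respondents, columns = scale items."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        sum_item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - sum_item_vars / total_var)

    # Perfectly parallel items give alpha = 1; uncorrelated items drive it down.
    base = np.arange(10, dtype=float)
    parallel = np.column_stack([base, base, base])
    alpha = cronbach_alpha(parallel)   # exactly 1.0 for identical columns
    ```

    For the 23-item ERI questionnaire one would apply the function per subscale (effort, reward, overcommitment) rather than to the whole instrument.
    
    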

  20. Requirements for Logical Models for Value-Added Tax Legislation

    DEFF Research Database (Denmark)

    Nielsen, Morten Ib; Simonsen, Jakob Grue; Larsen, Ken Friis

    …-specific needs. Currently, these difficulties are handled in most major ERP systems by customising and localising the native code of the ERP systems for each specific country and industry. We propose an alternative that uses logical modeling of VAT legislation. The potential benefit is to eventually transform… such a model automatically into programs that essentially will replace customisation and localisation by configuration, by changing parameters in the model. In particular, we: (1) identify a number of requirements for such modeling, including requirements for the underlying logic; (2) model salient parts…

  1. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    Requirements validation is a critical task in any engineering project. The confrontation of stakeholders with static requirements models is not enough, since stakeholders with non-computer science education are not able to discover all the inter-dependencies between the elicited requirements. Even with simple unified modelling language (UML) requirements models, it is not easy for the development team to get confidence on the stakeholders' requirements validation. This paper describes an approach, based on the construction of executable interactive prototypes, to support the validation of workflow requirements, where the system to be built must explicitly support the interaction between people within a pervasive cooperative workflow execution. A case study from a real project is used to illustrate the proposed approach.

  2. Unremarked or Unperformed? Systematic Review on Reporting of Validation Efforts of Health Economic Decision Models in Seasonal Influenza and Early Breast Cancer

    NARCIS (Netherlands)

    de Boer, Pieter T.; Frederix, G.W.J.; Feenstra, Talitha L.; Vemer, Pepijn

    2016-01-01

    BACKGROUND: Transparent reporting of validation efforts of health economic models gives stakeholders better insight into the credibility of model outcomes. In this study we reviewed recently published studies on seasonal influenza and early breast cancer in order to gain insight into the reporting of…

  5. GENERAL REQUIREMENTS FOR SIMULATION MODELS IN WASTE MANAGEMENT

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Ian; Kossik, Rick; Voss, Charlie

    2003-02-27

    Most waste management activities are decided upon and carried out in a public or semi-public arena, typically involving the waste management organization, one or more regulators, and often other stakeholders and members of the public. In these environments, simulation modeling can be a powerful tool in reaching a consensus on the best path forward, but only if the models that are developed are understood and accepted by all of the parties involved. These requirements for understanding and acceptance of the models constrain the appropriate software and model development procedures that are employed. This paper discusses requirements for both simulation software and for the models that are developed using the software. Requirements for the software include transparency, accessibility, flexibility, extensibility, quality assurance, ability to do discrete and/or continuous simulation, and efficiency. Requirements for the models that are developed include traceability, transparency, credibility/validity, and quality control. The paper discusses these requirements with specific reference to the requirements for performance assessment models that are used for predicting the long-term safety of waste disposal facilities, such as the proposed Yucca Mountain repository.

  6. Requirements engineering for cross-sectional information chain models.

    Science.gov (United States)

    Hübner, U; Cruel, E; Gök, M; Garthaus, M; Zimansky, M; Remmers, H; Rienhoff, O

    2012-01-01

    Despite the wealth of literature on requirements engineering, little is known about engineering very generic, innovative and emerging requirements, such as those for cross-sectional information chains. The IKM health project aims at building information chain reference models for the care of patients with chronic wounds, cancer-related pain and back pain. Our question therefore was how to appropriately capture information and process requirements that are both generally applicable and practically useful. To this end, we started with recommendations from clinical guidelines and put them up for discussion in Delphi surveys and expert interviews. Despite the heterogeneity we encountered in all three methods, it was possible to obtain requirements suitable for building reference models. We evaluated three modelling languages and then chose to write the models in UML (class and activity diagrams). On the basis of the current project results, the pros and cons of our approach are discussed.

  7. Inferring Requirement Goals from Model Implementing in UML

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    UML is used widely in many software development processes. However, it does not make requirement goals explicit. Here is a method tending to establish the semantic relationship between requirement goals and UML models. Before the method is introduced, some relevant concepts are described.

  8. Personality traits of the five-factor model are associated with effort-reward imbalance at work: a population-based study.

    Science.gov (United States)

    Törnroos, Maria; Hintsanen, Mirka; Hintsa, Taina; Jokela, Markus; Pulkki-Råback, Laura; Kivimäki, Mika; Hutri-Kähönen, Nina; Keltikangas-Järvinen, Liisa

    2012-07-01

    This study examined the association between personality traits and work stress. The sample comprised 757 women and 613 men (aged 30 to 45 years in 2007) participating in the Young Finns study. Personality was assessed with the NEO-FFI questionnaire and work stress according to Siegrist's effort-reward imbalance (ERI) model. High neuroticism, low extraversion, and low agreeableness were associated with high ERI. Low conscientiousness was associated with high ERI in men. No association was found between openness and ERI. High neuroticism, high extraversion, and low agreeableness were associated with high effort and low neuroticism, high extraversion, and high agreeableness with high rewards. High conscientiousness was associated with high effort, and in women, with high rewards. High openness was associated with high effort. This study suggests that personality traits may predispose to and protect from work stress.

  9. Validation of Power Requirement Model for Active Loudspeakers

    DEFF Research Database (Denmark)

    Schneider, Henrik; Madsen, Anders Normann; Bjerregaard, Ruben

    2015-01-01

    The actual power requirement of an active loudspeaker during playback of music has not received much attention in the literature. This is probably because no single and simple solution exists, and because complete system knowledge from input voltage to output sound pressure level is required. There are, however, many advantages that could be harvested from such knowledge, like size, cost and efficiency improvements. In this paper a recently proposed power requirement model for active loudspeakers is experimentally validated and the model is expanded to include the closed and vented type enclosures…

  10. Efforts to Address the Aging Academic Workforce: Assessing Progress through a Three-Stage Model of Institutional Change

    Science.gov (United States)

    Kaskie, Brian; Walker, Mark; Andersson, Matthew

    2017-01-01

    The aging of the academic workforce is becoming more relevant to policy discussions in higher education. Yet there has been no formal, large-scale analysis of institutional efforts to develop policies and programs for aging employees. We fielded a representative survey of human resource specialists at 187 colleges and universities across the…

  11. Towards a Formalized Ontology-Based Requirements Model

    Institute of Scientific and Technical Information of China (English)

    JIANG Dan-dong; ZHANG Shen-sheng; WANG Ying-lin

    2005-01-01

    The goal of this paper is to take a further step towards an ontological approach for representing requirements information. The motivation for ontologies is discussed, and the definitions of ontology and requirements ontology are given. A collection of informal terms covering four subject areas is then presented, along with the formalization process of the ontology. The underlying meta-ontology is determined, and the formalized requirements ontology is analyzed. This formal ontology is built to serve as a basis for the requirements model. Finally, the implementation of the software system is given.

  12. A transformation approach for collaboration based requirement models

    CERN Document Server

    Harbouche, Ahmed; Mokhtari, Aicha

    2012-01-01

    Distributed software engineering is widely recognized as a complex task. Among the inherent complexities is the process of obtaining a system design from its global requirement specification. This paper deals with such a transformation process and suggests an approach to derive the behavior of a given system's components, in the form of distributed Finite State Machines, from the global system requirements, in the form of an augmented UML Activity Diagrams notation. The process of the suggested approach is summarized in three steps: the definition of the appropriate source Meta-Model (requirements Meta-Model), the definition of the target Design Meta-Model and the definition of the rules to govern the transformation during the derivation process. The derivation process transforms the global system requirements described as UML activity diagrams (extended with collaborations) to system roles' behaviors represented as UML finite state machines. The approach is implemented using the Atlas Transformation Language (ATL).

  13. Irrigation Requirement Estimation Using Vegetation Indices and Inverse Biophysical Modeling

    Science.gov (United States)

    Bounoua, Lahouari; Imhoff, Marc L.; Franks, Shannon

    2010-01-01

    We explore an inverse biophysical modeling process forced by satellite and climatological data to quantify irrigation requirements in semi-arid agricultural areas. We constrain the carbon and water cycles modeled under both equilibrium, balance between vegetation and climate, and non-equilibrium, water added through irrigation. We postulate that the degree to which irrigated dry lands vary from equilibrium climate conditions is related to the amount of irrigation. The amount of water required over and above precipitation is considered as an irrigation requirement. For July, results show that spray irrigation resulted in an additional amount of water of 1.3 mm per occurrence with a frequency of 24.6 hours. In contrast, the drip irrigation required only 0.6 mm every 45.6 hours or 46% of that simulated by the spray irrigation. The modeled estimates account for 87% of the total reported irrigation water use, when soil salinity is not important and 66% in saline lands.
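    The reported figures can be cross-checked with simple unit arithmetic by converting each per-event depth and event interval into an average daily application rate. The helper below is purely illustrative and not part of the paper's inverse model.

```python
def daily_irrigation_rate(mm_per_event, hours_between_events):
    """Average irrigation depth applied per day (mm/day), given the depth
    delivered per event and the mean interval between events."""
    return mm_per_event * 24.0 / hours_between_events

spray = daily_irrigation_rate(1.3, 24.6)  # ~1.27 mm/day
drip = daily_irrigation_rate(0.6, 45.6)   # ~0.32 mm/day
```

    Note that the quoted 46% compares per-event depths (0.6/1.3); on a daily basis drip delivers roughly a quarter of the spray rate.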

  14. Innovative Product Design Based on Customer Requirement Weight Calculation Model

    Institute of Scientific and Technical Information of China (English)

    Chen-Guang Guo; Yong-Xian Liu; Shou-Ming Hou; Wei Wang

    2010-01-01

    In the processes of product innovation and design, it is important for the designers to find and capture customer's focus through customer requirement weight calculation and ranking. Based on the fuzzy set theory and Euclidean space distance, this paper puts forward a method for customer requirement weight calculation called Euclidean space distances weighting ranking method. This method is used in the fuzzy analytic hierarchy process that satisfies the additive consistent fuzzy matrix. A model for the weight calculation steps is constructed; meanwhile, a product innovation design module on the basis of the customer requirement weight calculation model is developed. Finally, combined with the instance of titanium sponge production, the customer requirement weight calculation model is validated. By the innovation design module, the structure of the titanium sponge reactor has been improved and made innovative.
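    The abstract does not reproduce the weighting formula itself, but the general idea of distance-based requirement weighting can be sketched as follows; the inverse-distance scheme and the rating vectors are assumptions for illustration, not the paper's method.

```python
import math

def euclidean_weights(requirement_scores, ideal):
    """Toy distance-based weighting: each requirement is rated by several
    evaluators; requirements whose rating vectors lie closer to the ideal
    vector receive larger weights (inverse distance, normalized to sum 1)."""
    inv = [1.0 / (1e-9 + math.dist(scores, ideal))
           for scores in requirement_scores]
    total = sum(inv)
    return [v / total for v in inv]

# Three requirements rated by two evaluators on a 1-5 scale.
w = euclidean_weights([[5, 4], [3, 2], [4, 4]], ideal=[5, 5])
```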

  15. A modelling framework for predicting the optimal balance between control and surveillance effort in the local eradication of tuberculosis in New Zealand wildlife.

    Science.gov (United States)

    Gormley, Andrew M; Holland, E Penelope; Barron, Mandy C; Anderson, Dean P; Nugent, Graham

    2016-03-01

    Bovine tuberculosis (TB) impacts livestock farming in New Zealand, where the introduced marsupial brushtail possum (Trichosurus vulpecula) is the wildlife maintenance host for Mycobacterium bovis. New Zealand has implemented a campaign to control TB using a co-ordinated programme of livestock diagnostic testing and large-scale culling of possums, with the long-term aim of TB eradication. For management of the disease in wildlife, methods that can optimise the balance between control and surveillance effort will facilitate the objective of eradication on a fixed or limited budget. We modelled and compared management options to optimise the balance between the two activities necessary to achieve and verify eradication of TB from New Zealand wildlife: the number of lethal population control operations required to halt the M. bovis infection cycle in possums, and the subsequent surveillance effort needed to confidently declare TB freedom post-control. The approach considered the costs of control and surveillance, as well as the potential costs of re-control resulting from false declaration of TB freedom. The required years of surveillance decreased with increasing numbers of possum lethal control operations but the overall time to declare TB freedom depended on additional factors, such as the probability of freedom from disease after control and the probability of success of mop-up control, i.e. retroactive culling following detection of persistent disease in the residual possum population. The total expected cost was also dependent on a number of factors, many of which had wide cost ranges, suggesting that an optimal strategy is unlikely to be singular and fixed, but will likely vary for each different area being considered. Our approach provides a simple framework that considers the known and potential costs of possum control and TB surveillance, enabling managers to optimise the balance between these two activities to achieve and prove eradication of a wildlife
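    The control-versus-surveillance trade-off described above can be expressed as a simple expected-cost objective; the function and all numbers below are hypothetical placeholders meant only to show the shape of the optimization, not values from the study.

```python
def expected_total_cost(n_control_ops, cost_per_op, surveillance_years,
                        cost_per_surveillance_year, p_false_freedom,
                        cost_of_recontrol):
    """Expected cost of one strategy: up-front control, then surveillance,
    plus the risk-weighted cost of re-control after a false declaration
    of disease freedom."""
    return (n_control_ops * cost_per_op
            + surveillance_years * cost_per_surveillance_year
            + p_false_freedom * cost_of_recontrol)

# Sweep candidate strategies: more control ops shorten surveillance and
# reduce the false-freedom risk, at higher up-front cost (toy numbers).
costs = [expected_total_cost(n, 100.0, 10 - n, 20.0, 0.5 / n, 500.0)
         for n in range(1, 6)]
best = min(costs)
```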

  16. Evaluation of Foreign Exchange Risk Capital Requirement Models

    Directory of Open Access Journals (Sweden)

    Ricardo S. Maia Clemente

    2005-12-01

    Full Text Available This paper examines capital requirement for financial institutions in order to cover market risk stemming from exposure to foreign currencies. The models examined belong to two groups according to the approach involved: standardized and internal models. In the first group, we study the Basel model and the model adopted by the Brazilian legislation. In the second group, we consider the models based on the concept of value at risk (VaR). We analyze the single and the double-window historical model, the exponential smoothing model (EWMA) and a hybrid approach that combines features of both models. The results suggest that the Basel model is inadequate for the Brazilian market, exhibiting a large number of exceptions. The model of the Brazilian legislation has no exceptions, though generating higher capital requirements than other internal models based on VaR. In general, VaR-based models perform better and result in less capital allocation than the standardized approach model applied in Brazil.
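    The exponential smoothing (EWMA) internal model follows the familiar recursion for volatility; λ = 0.94 and the VaR multiplier below are common conventions (e.g., RiskMetrics), not necessarily the parameters used in the paper.

```python
def ewma_volatility(returns, lam=0.94):
    """EWMA variance recursion var_t = lam*var_{t-1} + (1-lam)*r_t^2,
    seeded with the first squared return; returns the volatility."""
    var = returns[0] ** 2
    for r in returns[1:]:
        var = lam * var + (1 - lam) * r ** 2
    return var ** 0.5

def value_at_risk(exposure, sigma, z=2.33):
    """Parametric one-day VaR for a single FX exposure at ~99% confidence."""
    return exposure * z * sigma

sigma = ewma_volatility([0.012, -0.008, 0.010, -0.015, 0.009])
var_99 = value_at_risk(1_000_000.0, sigma)
```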

  17. The Benefit of Ambiguity in Understanding Goals in Requirements Modelling

    DEFF Research Database (Denmark)

    Paay, Jeni; Pedell, Sonja; Sterling, Leon

    2011-01-01

    This paper examines the benefit of ambiguity in describing goals in requirements modelling for the design of socio-technical systems, using concepts from Agent-Oriented Software Engineering (AOSE) and ethnographic and cultural probe methods from Human Computer Interaction (HCI). The aim of the authors’ research is to create technologies that support more flexible and meaningful social interactions, by combining best practice in software engineering with ethnographic techniques to model complex social interactions from their socially oriented life for the purposes of building rich socio… They take a holistic approach to eliciting, analyzing, and modelling socially-oriented requirements by combining a particular form of ethnographic technique, cultural probes, with Agent Oriented Software Engineering notations to model these requirements. This paper focuses on examining the value of maintaining…

  18. Understanding the requirements imposed by programming model middleware on a common communication subsystem.

    Energy Technology Data Exchange (ETDEWEB)

    Buntinas, D.; Gropp, W.

    2005-12-13

    In high-performance parallel computing, most programming-model middleware libraries and runtime systems use a communication subsystem to abstract the lower-level network layer. The functionality required of a communication subsystem depends largely on the programming model implemented by the middleware. In order to maximize performance, middleware libraries and runtime systems typically implement their own communication subsystems that are specially tuned for the middleware, rather than use an existing communication subsystem. This situation leads to duplicated effort and prevents different middleware libraries from being used by the same application in hybrid programming models. In this paper we describe features required by various middleware libraries as well as some desirable features that would make it easier to port a middleware library to the communication subsystem and allow the middleware to make use of high-performance features provided by some networking layers. We show that none of the communication subsystems that we evaluate support all of the features.

  19. A proposal for a coordinated effort for the determination of brainwide neuroanatomical connectivity in model organisms at a mesoscopic scale.

    Directory of Open Access Journals (Sweden)

    Jason W Bohland

    2009-03-01

    Full Text Available In this era of complete genomes, our knowledge of neuroanatomical circuitry remains surprisingly sparse. Such knowledge is critical, however, for both basic and clinical research into brain function. Here we advocate for a concerted effort to fill this gap, through systematic, experimental mapping of neural circuits at a mesoscopic scale of resolution suitable for comprehensive, brainwide coverage, using injections of tracers or viral vectors. We detail the scientific and medical rationale and briefly review existing knowledge and experimental techniques. We define a set of desiderata, including brainwide coverage; validated and extensible experimental techniques suitable for standardization and automation; centralized, open-access data repository; compatibility with existing resources; and tractability with current informatics technology. We discuss a hypothetical but tractable plan for mouse, additional efforts for the macaque, and technique development for human. We estimate that the mouse connectivity project could be completed within five years with a comparatively modest budget.

  20. Business Process Simulation: Requirements for Business and Resource Models

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2015-07-01

    Full Text Available The purpose of Business Process Model and Notation (BPMN) is to provide an easily understandable graphical representation of business processes, and BPMN is therefore widely applied in various areas, one of them being business process simulation. This paper addresses some BPMN model based business process simulation problems and formulates requirements for business process and resource models that enable their use for business process simulation.

  1. Applying a Theory-Driven Framework to Guide Quality Improvement Efforts in Nursing Homes: The LOCK Model.

    Science.gov (United States)

    Mills, Whitney L; Pimentel, Camilla B; Palmer, Jennifer A; Snow, A Lynn; Wewiorski, Nancy J; Allen, Rebecca S; Hartmann, Christine W

    2017-06-23

    Implementing quality improvement (QI) programs in nursing homes continues to encounter significant challenges, despite recognized need. QI approaches provide nursing home staff with opportunities to collaborate on developing and testing strategies for improving care delivery. We present a theory-driven and user-friendly adaptable framework and facilitation package to overcome existing challenges and guide QI efforts in nursing homes. The framework is grounded in the foundational concepts of strengths-based learning, observation, relationship-based teams, efficiency, and organizational learning. We adapted these concepts to QI in the nursing home setting, creating the "LOCK" framework. The LOCK framework is currently being disseminated across the Veterans Health Administration. The LOCK framework has five tenets: (a) Look for the bright spots, (b) Observe, (c) Collaborate in huddles, (d) Keep it bite-sized, and (e) facilitation. Each tenet is described. We also present a case study documenting how a fictional nursing home can implement the LOCK framework as part of a QI effort to improve engagement between staff and residents. The case study describes sample observations, processes, and outcomes. We also discuss practical applications for nursing home staff, the adaptability of LOCK for different QI projects, the specific role of facilitation, and lessons learned. The proposed framework complements national efforts to improve quality of care and quality of life for nursing home residents and may be valuable across long-term care settings and QI project types.

  2. Using the Internet in Middle Schools: A Model for Success. A Collaborative Effort between Los Alamos National Laboratory (LANL) and Los Alamos Middle School (LAMS).

    Science.gov (United States)

    Addessio, Barbara K.; And Others

    Los Alamos National Laboratory (LANL) developed a model for school networking using Los Alamos Middle School as a testbed. The project was a collaborative effort between the school and the laboratory. The school secured administrative funding for hardware and software; and LANL provided the network architecture, installation, consulting, and…

  3. Modeling Requirements for Simulating the Effects of Extreme Acts of Terrorism: A White Paper

    Energy Technology Data Exchange (ETDEWEB)

    Allen, M.; Hiebert-Dodd, K.; Marozas, D.; Paananen, O.; Pryor, R.J.; Reinert, R.K.

    1998-10-01

    This white paper presents the initial requirements for developing a new computer model for simulating the effects of extreme acts of terrorism in the United States. General characteristics of the model are proposed and the level of effort to prepare a complete written description of the model, prior to coding, is detailed. The model would simulate the decision processes and interactions of complex U. S. systems engaged in responding to and recovering from four types of terrorist incidents. The incident scenarios span the space of extreme acts of terrorism that have the potential to affect not only the impacted area, but also the entire nation. The model would be useful to decision-makers in assessing and analyzing the vulnerability of the nation's complex infrastructures, in prioritizing resources to reduce risk, and in planning strategies for immediate response and for subsequent recovery from terrorist incidents.

  4. A TRANSFORMATION APPROACH FOR COLLABORATION BASED REQUIREMENT MODELS

    Directory of Open Access Journals (Sweden)

    Ahmed Harbouche

    2012-02-01

    Full Text Available Distributed software engineering is widely recognized as a complex task. Among the inherent complexities is the process of obtaining a system design from its global requirement specification. This paper deals with such a transformation process and suggests an approach to derive the behavior of a given system's components, in the form of distributed Finite State Machines, from the global system requirements, in the form of an augmented UML Activity Diagrams notation. The process of the suggested approach is summarized in three steps: the definition of the appropriate source Meta-Model (requirements Meta-Model), the definition of the target Design Meta-Model and the definition of the rules to govern the transformation during the derivation process. The derivation process transforms the global system requirements described as UML activity diagrams (extended with collaborations) to system roles' behaviors represented as UML finite state machines. The approach is implemented using the Atlas Transformation Language (ATL).

  5. Parental involvement, child effort, and the development of immigrant boys' and girls' reading and mathematics skills: A latent difference score growth model.

    Science.gov (United States)

    Moon, Ui Jeong; Hofferth, Sandra L

    2016-04-01

    Gender differences in elementary school performance among immigrant children have not yet been well documented. This study examined how differences in parental involvement, child effort, and family characteristics and resources contribute to immigrant boys' and girls' academic achievement from kindergarten through 5th grade. The sample was drawn from the Early Childhood Longitudinal Study-Kindergarten cohort. Using a latent difference score growth model, this study found that parents' involvement at home benefited boys' reading and mathematics skills throughout all early elementary school years, but did not have the same benefit for girls. For both boys and girls, child effort in reading appears to be strongly linked to better reading and mathematics skills at kindergarten and to subsequent improvement between grades. The positive associations of parental involvement and child's effort with test scores were greater during earlier years than during later years for boys, whereas there was no difference in the association over time for girls.

  6. NVC Based Model for Selecting Effective Requirement Elicitation Technique

    Directory of Open Access Journals (Sweden)

    Md. Rizwan Beg

    2012-10-01

    Full Text Available The Requirement Engineering process starts from the gathering of requirements, i.e., requirements elicitation. Requirements elicitation (RE) is the base building block for a software project and has a very high impact on the subsequent design and build phases as well. Failure to accurately capture system requirements is a major factor in the failure of most software projects. Due to the criticality and impact of this phase, it is very important to perform requirements elicitation in no less than a perfect manner. One of the most difficult jobs for the elicitor is to select an appropriate technique for eliciting the requirements. Interviewing and interacting with stakeholders during the elicitation process is a communication-intensive activity involving verbal and non-verbal communication (NVC). The elicitor should give emphasis to non-verbal communication along with verbal communication so that requirements are recorded more efficiently and effectively. In this paper we propose a model in which stakeholders are classified by observing non-verbal communication, and we use this classification as a base for elicitation technique selection. We also propose an efficient plan for requirements elicitation which intends to overcome the constraints faced by the elicitor.

  7. Formal Requirements Modeling for Reactive Systems with Coloured Petri Nets

    DEFF Research Database (Denmark)

    Tjell, Simon

    This dissertation presents the contributions of seven publications, all concerned with the application of Coloured Petri Nets (CPN) to requirements modeling for reactive systems. The publications are introduced along with relevant background material and related work, and their contributions… interface composed of recognizable artifacts and activities. The presentation of the three publications related to Use Cases is followed by the presentation of a publication formalizing some of the guidelines applied for structuring the CPN requirements models, namely the guidelines that make it possible… activity. The traces are automatically recorded during execution of the model. The second publication presents a formally specified framework for automating a large part of the tasks related to integrating Problem Frames with CPN. The framework is specified in VDM++, and allows the modeler to automatically…

  8. NASA Standard for Models and Simulations: Philosophy and Requirements Overview

    Science.gov (United States)

    Blattnig, Steve R.; Luckring, James M.; Morrison, Joseph H.; Sylvester, Andre J.; Tripathi, Ram K.; Zang, Thomas A.

    2013-01-01

    Following the Columbia Accident Investigation Board report, the NASA Administrator chartered an executive team (known as the Diaz Team) to identify those CAIB report elements with NASA-wide applicability and to develop corrective measures to address each element. One such measure was the development of a standard for the development, documentation, and operation of models and simulations. This report describes the philosophy and requirements overview of the resulting NASA Standard for Models and Simulations.

  9. Single High Fidelity Geometric Data Sets for LCM - Model Requirements

    Science.gov (United States)

    2006-11-01

    [Abstract garbled by extraction; recoverable fragments follow.] Material name (for example, an HY80 steel) plus additional material requirements (heat treatment, etc.); creation of a more detailed description of the data; Figure 2.22, Typical Stress-Strain Curve for Steel (adapted from Ref 59). The primary structural materials are steel, aluminum and composites. The structural components that make up a global FEA model drive the fidelity of the model.

  10. Requirements for next generation global flood inundation models

    Science.gov (United States)

    Bates, P. D.; Neal, J. C.; Smith, A.; Sampson, C. C.

    2016-12-01

    In this paper we review the current status of global hydrodynamic models for flood inundation prediction and highlight recent successes and current limitations. Building on this analysis we then go on to consider what is required to develop the next generation of such schemes and show that to achieve this a number of fundamental science problems will need to be overcome. New data sets and new types of analysis will be required, and we show that these will only partially be met by currently planned satellite missions and data collection initiatives. A particular example is the quality of available global Digital Elevation data. The current best data set for flood modelling, SRTM, is only available at a relatively modest 30m resolution, contains pixel-to-pixel noise of 6m and is corrupted by surface artefacts. Creative processing techniques have sought to address these issues with some success, but fundamentally the quality of the available global terrain data limits flood modelling and needs to be overcome. Similar arguments can be made for many other elements of global hydrodynamic models including their bathymetry data, boundary conditions, flood defence information and model validation data. We therefore systematically review each component of global flood models and document whether planned new technology will solve current limitations and, if not, what exactly will be required to do so.

  11. Methane release from the East Siberian Arctic Shelf: The role of subsea permafrost and other controlling factors as inferred from decadal observational and modeling efforts

    Science.gov (United States)

    Shakhova, N. E.

    2015-12-01

    Sustained methane (CH4) release from thawing Arctic permafrost to the atmosphere may be a positive, major feedback to climate warming. East Siberian Arctic Shelf (ESAS) atmospheric CH4 venting was reported as on par with flux from Arctic tundra. Unlike release when ancient carbon in thawed on-land permafrost is mobilized, ESAS CH4 release is not determined by modern methanogenesis. Pre-formed CH4 largely stems from seabed deposits. Our investigation, including observational studies using hydrological, biogeochemical, geophysical, geo-electrical, microbiological, and isotopic methods, and modeling efforts to assess current subsea permafrost state and the ESAS' contribution to the regional CH4 budget, has clarified processes driving ESAS CH4 emissions. Subsea permafrost state is a major emission determinant; rates vary by 3-5 orders of magnitude. Outer ESAS CH4 emission rates, where subsea permafrost is predicted to be degraded due to long submergence by seawater, in places are similar to near-shore rates, where deep/open taliks can form due to combined heating effects of seawater, river runoff, geothermal flux, and pre-existing thermokarst. Progressive subsea permafrost thawing and decreasing ice extent could significantly increase ESAS CH4 emissions. Subsea permafrost drilling results reveal modern degradation rates of recently submerged subsea permafrost, contradicting previous hypotheses that thousands of years are required to form escape paths for permafrost-preserved gas. We used a decadal observational ESAS water column and atmospheric boundary layer (ABL) data set to define the minimum source strength required to explain observed seasonally-increased ABL CH4 concentration. Modeling results agree with estimates from in-situ sonar data. In <10 m shallow water ≤72% of CH4 remains in surfacing bubbles. Dissolved CH4 fate largely depends on 3 factors: dissolved CH4 water column turnover time, water column stability against vertical mixing, and turbulent diffusion and…

  12. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  14. Models of protein and amino acid requirements for cattle

    Directory of Open Access Journals (Sweden)

    Luis Orlindo Tedeschi

    2015-03-01

    Full Text Available Protein supply and requirements by ruminants have been studied for more than a century. These studies led to the accumulation of a large body of scientific information about the digestion and metabolism of protein by ruminants, as well as the characterization of dietary protein in order to maximize animal performance. During the 1980s and 1990s, when computers became more accessible and powerful, scientists began to conceptualize and develop mathematical nutrition models, and to program them into computers to assist with ration balancing and formulation for domesticated ruminants, specifically dairy and beef cattle. The most commonly known nutrition models developed during this period were those of the National Research Council (NRC) in the United States, the Agricultural Research Council (ARC) in the United Kingdom, the Institut National de la Recherche Agronomique (INRA) in France, and the Commonwealth Scientific and Industrial Research Organization (CSIRO) in Australia. Others were derivative works from these models with different degrees of modification in the supply or requirement calculations, and in the modeling nature (e.g., static or dynamic, mechanistic, or deterministic). Circa the 1990s, most models adopted the metabolizable protein (MP) system over the crude protein (CP) and digestible CP systems to estimate the supply of MP, and the factorial system to calculate the MP required by the animal. The MP system included two portions of protein (i.e., the rumen-undegraded dietary CP, RUP, and the contributions of microbial CP, MCP) as the main sources of MP for the animal. Some models would explicitly account for the impact of dry matter intake (DMI) on the MP required for maintenance (MPm; e.g., the Cornell Net Carbohydrate and Protein System, CNCPS, and the Dutch system, DVE/OEB), while others would simply account for scurf, urinary, metabolic fecal, and endogenous contributions independently of DMI.
All models included milk yield and its components in estimating MP required for lactation
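    The factorial MP calculation described above (a maintenance term plus production terms) has a simple arithmetic skeleton; all coefficients in this sketch are placeholders for illustration and are not taken from the NRC, ARC, INRA, or CSIRO systems.

```python
def metabolizable_protein_required(body_weight_kg, milk_kg_per_day,
                                   milk_protein_fraction=0.033,
                                   efficiency_lactation=0.67,
                                   maintenance_coeff=3.8):
    """Factorial-style MP requirement (g/day): a maintenance term scaling
    with metabolic body weight (BW^0.75) plus a lactation term equal to
    milk protein output divided by a conversion efficiency."""
    mp_maintenance = maintenance_coeff * body_weight_kg ** 0.75
    mp_lactation = (milk_kg_per_day * milk_protein_fraction * 1000.0
                    / efficiency_lactation)
    return mp_maintenance + mp_lactation

# A 600 kg cow producing 30 kg/day of milk, under these toy coefficients.
mp = metabolizable_protein_required(600.0, 30.0)
```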

  15. Modeling requirements for in situ vitrification. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    MacKinnon, R.J.; Mecham, D.C.; Hagrman, D.L.; Johnson, R.W.; Murray, P.E.; Slater, C.E.; Marwil, E.S.; Weaver, R.A.; Argyle, M.D.

    1991-11-01

    This document outlines the requirements for the model being developed at the INEL which will provide analytical support for the ISV technology assessment program. The model includes representations of the electric potential field, thermal transport with melting, gas and particulate release, vapor migration, off-gas combustion and process chemistry. The modeling objectives are to (1) help determine the safety of the process by assessing the air and surrounding soil radionuclide and chemical pollution hazards, the nuclear criticality hazard, and the explosion and fire hazards, (2) help determine the suitability of the ISV process for stabilizing the buried wastes involved, and (3) help design laboratory and field tests and interpret results therefrom.

  16. Required experimental accuracy to select between supersymmetrical models

    Indian Academy of Sciences (India)

    David Grellscheid

    2004-03-01

    We will present a method to decide a priori whether various supersymmetrical scenarios can be distinguished based on sparticle mass data alone. For each model, a scan over all free SUSY-breaking parameters reveals the extent of that model's physically allowed region of sparticle-mass space. Based on the geometrical configuration of these regions in mass space, it is possible to obtain an estimate of the accuracy of future sparticle mass measurements required to distinguish between the models. We will illustrate this algorithm with an example. This talk is based on work done in collaboration with B C Allanach (LAPTH, Annecy) and F Quevedo (DAMTP, Cambridge).

  17. Thermodynamic models for bounding pressurant mass requirements of cryogenic tanks

    Science.gov (United States)

    Vandresar, Neil T.; Haberbusch, Mark S.

    1994-01-01

    Thermodynamic models have been formulated to predict lower and upper bounds for the mass of pressurant gas required to pressurize a cryogenic tank and then expel liquid from the tank. Limiting conditions are based on either thermal equilibrium or zero energy exchange between the pressurant gas and the initial tank contents. The models are independent of gravity level and allow specification of autogenous or non-condensable pressurants. Partial liquid fill levels may be specified for initial and final conditions. Model predictions are shown to successfully bound results from limited normal-gravity tests with condensable and non-condensable pressurant gases. Representative maximum collapse factor maps are presented for liquid hydrogen to show the effects of initial and final fill level on the range of pressurant gas requirements. Maximum collapse factors occur for partial expulsions with large final liquid fill fractions.
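
    A minimal ideal-gas estimate shows how such bounds arise: the pressurant mass needed to fill the expelled volume follows from pV = nRT evaluated at a bounding temperature. The pressure, temperatures, and choice of gaseous hydrogen below are illustrative assumptions, not values from the tests.

```python
R_UNIVERSAL = 8.314  # J/(mol*K)

def pressurant_mass(pressure_pa, volume_m3, molar_mass_kg_mol, temp_k):
    """Ideal-gas mass of pressurant occupying `volume_m3` at the given
    pressure and temperature: m = p * V * M / (R * T)."""
    return pressure_pa * volume_m3 * molar_mass_kg_mol / (R_UNIVERSAL * temp_k)

# Illustrative case: expel 1 m^3 of liquid hydrogen at 300 kPa using
# gaseous hydrogen (M = 2.016e-3 kg/mol).
m_lower = pressurant_mass(3.0e5, 1.0, 2.016e-3, 300.0)  # gas stays at inlet temp
m_upper = pressurant_mass(3.0e5, 1.0, 2.016e-3, 20.3)   # gas cools to LH2 temp
collapse_factor = m_upper / m_lower                      # equals T_warm / T_cold
```

    The ratio of the two bounds (the collapse factor) reduces to the ratio of the bounding temperatures, which is why collapse factors are largest when warm pressurant meets a deeply cryogenic liquid.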

  18. Model Waveform Accuracy Requirements for the $\\chi^2$ Discriminator

    CERN Document Server

    Lindblom, Lee

    2016-01-01

    This paper derives accuracy standards for model gravitational waveforms required to ensure proper use of the $\\chi^2$ discriminator test in gravitational wave (GW) data analysis. These standards are different from previously established requirements for detection and waveform parameter measurement based on signal-to-noise optimization. We present convenient formulae both for evaluating and interpreting the contribution of model errors to measured $\\chi^2$ values. Motivated by these formulae, we also present an enhanced, complexified variant of the standard $\\chi^2$ statistic used in GW searches. While our results are not directly relevant to current searches (which use the $\\chi^2$ test only to veto signal candidates with extremely high $\\chi^2$ values), they could be useful in future GW searches and as figures of merit for model gravitational waveforms.

  19. A commuting generation model requiring only aggregated data

    CERN Document Server

    Lenormand, Maxime; Gargiulo, Floriana

    2011-01-01

    We recently proposed, in (Gargiulo et al., 2011), an innovative stochastic model with only one parameter to calibrate. It reproduces the complete commuting network by an iterative process that stochastically chooses, for each commuter living in a municipality of a region, a workplace in the region. The choice considers the job offer in each municipality of the region and the distance to all possible destinations. The model is quite effective if the region is sufficiently autonomous in terms of job offers. However, calibrating the model or verifying this autonomy requires data or expertise that are not necessarily available, and the region may simply not be autonomous. In the present work, we overcome these limitations by extending the commuters' job-search area beyond the region and by changing the form of the deterrence function. We also found a law to calibrate the improved model that does not require data.
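
    The iterative workplace-choice step can be sketched as follows. The exponential deterrence function, job counts, and distances are stand-in assumptions for illustration; the paper calibrates the actual deterrence function form.

```python
import math
import random

def assign_workplaces(commuters, jobs, dist, beta, seed=0):
    """For each commuter, stochastically draw a work municipality with
    probability proportional to the job offer there times a deterrence
    function of distance (generic exp(-beta * d) assumed here).
    Returns origin-destination flow counts."""
    rng = random.Random(seed)
    towns = list(jobs)
    flows = {}
    for home, n in commuters.items():
        weights = [jobs[t] * math.exp(-beta * dist[home][t]) for t in towns]
        for work in rng.choices(towns, weights=weights, k=n):
            flows[(home, work)] = flows.get((home, work), 0) + 1
    return flows

# Toy region: two municipalities, 100 commuters living in "A".
jobs = {"A": 500, "B": 200}
dist = {"A": {"A": 0.0, "B": 10.0}}
flows = assign_workplaces({"A": 100}, jobs, dist, beta=0.05)
```

    Extending the job-search base beyond the region, as the paper does, amounts to adding outside municipalities to `jobs` and `dist` before drawing destinations.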

  20. Breckinridge Project, initial effort

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    1982-09-01

    Report III, Volume 1 contains those specifications numbered A through J, as follows: General Specifications (A); Specifications for Pressure Vessels (C); Specifications for Tanks (D); Specifications for Exchangers (E); Specifications for Fired Heaters (F); Specifications for Pumps and Drivers (G); and Specifications for Instrumentation (J). The standard specifications of Bechtel Petroleum Incorporated have been amended as necessary to reflect the specific requirements of the Breckinridge Project, and the more stringent specifications of Ashland Synthetic Fuels, Inc. These standard specifications are available to the Initial Effort (Phase Zero) work performed by all contractors and subcontractors. Report III, Volume 1 also contains the unique specifications prepared for Plants 8, 15, and 27. These specifications will be substantially reviewed during Phase I of the project, and modified as necessary for use during the engineering, procurement, and construction of this project.

  1. Cognition and procedure representational requirements for predictive human performance models

    Science.gov (United States)

    Corker, K.

    1992-01-01

    Models and modeling environments for human performance are becoming significant contributors to early system design and analysis procedures. Issues of levels of automation, physical environment, informational environment, and manning requirements are being addressed by such man/machine analysis systems. The research reported here investigates the close interaction between models of human cognition and models that described procedural performance. We describe a methodology for the decomposition of aircrew procedures that supports interaction with models of cognition on the basis of procedures observed; that serves to identify cockpit/avionics information sources and crew information requirements; and that provides the structure to support methods for function allocation among crew and aiding systems. Our approach is to develop an object-oriented, modular, executable software representation of the aircrew, the aircraft, and the procedures necessary to satisfy flight-phase goals. We then encode in a time-based language, taxonomies of the conceptual, relational, and procedural constraints among the cockpit avionics and control system and the aircrew. We have designed and implemented a goals/procedures hierarchic representation sufficient to describe procedural flow in the cockpit. We then execute the procedural representation in simulation software and calculate the values of the flight instruments, aircraft state variables and crew resources using the constraints available from the relationship taxonomies. The system provides a flexible, extensible, manipulative and executable representation of aircrew and procedures that is generally applicable to crew/procedure task-analysis. The representation supports developed methods of intent inference, and is extensible to include issues of information requirements and functional allocation. We are attempting to link the procedural representation to models of cognitive functions to establish several intent inference methods

  2. Mathematical Modeling of Programmatic Requirements for Yaws Eradication

    Science.gov (United States)

    Mitjà, Oriol; Fitzpatrick, Christopher; Asiedu, Kingsley; Solomon, Anthony W.; Mabey, David C.W.; Funk, Sebastian

    2017-01-01

    Yaws is targeted for eradication by 2020. The mainstay of the eradication strategy is mass treatment followed by case finding. Modeling has been used to inform programmatic requirements for other neglected tropical diseases and could provide insights into yaws eradication. We developed a model of yaws transmission varying the coverage and number of rounds of treatment. The estimated number of cases arising from an index case (basic reproduction number [R0]) ranged from 1.08 to 3.32. To have 80% probability of achieving eradication, 8 rounds of treatment with 80% coverage were required at low estimates of R0 (1.45). This requirement increased to 95% at high estimates of R0 (2.47). Extending the treatment interval to 12 months increased requirements at all estimates of R0. At high estimates of R0 with 12 monthly rounds of treatment, no combination of variables achieved eradication. Models should be used to guide the scale-up of yaws eradication. PMID:27983500
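
    The interplay of coverage, number of rounds, and R0 described above can be illustrated with a toy calculation. This is not the authors' transmission model: the starting prevalence, elimination threshold, and the assumption that prevalence rebounds by a factor of sqrt(R0) between rounds are all illustrative.

```python
def rounds_to_eradicate(r0, coverage, start=0.1, threshold=1e-6,
                        max_rounds=100):
    """Toy sketch: each mass-treatment round cures a fraction `coverage`
    of cases; between rounds prevalence rebounds by an illustrative
    factor sqrt(r0). Returns the number of rounds needed to push
    prevalence below `threshold`, or None if it never gets there."""
    growth = r0 ** 0.5
    p = start
    for n in range(1, max_rounds + 1):
        p *= 1.0 - coverage          # mass treatment round
        if p < threshold:
            return n
        p = min(1.0, p * growth)     # resurgence before the next round
    return None

print(rounds_to_eradicate(1.45, 0.80))  # low-R0 scenario
print(rounds_to_eradicate(2.47, 0.80))  # high-R0 scenario
```

    Raising the coverage, or shortening the treatment interval so the rebound factor shrinks, reduces the number of rounds required, mirroring the qualitative pattern reported above.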

  3. The Benefit of Ambiguity in Understanding Goals in Requirements Modelling

    DEFF Research Database (Denmark)

    Paay, Jeni; Pedell, Sonja; Sterling, Leon

    2011-01-01

    This paper examines the benefit of ambiguity in describing goals in requirements modelling for the design of socio-technical systems, using concepts from Agent-Oriented Software Engineering (AOSE) and ethnographic and cultural probe methods from Human Computer Interaction (HCI). The authors' aim… …of their research is to create technologies that support more flexible and meaningful social interactions, by combining best practice in software engineering with ethnographic techniques to model complex social interactions from their socially oriented life for the purposes of building rich socio… …of abstraction, ambiguous and open for conversations through the modelling process, add richness to goal models and communicate quality attributes of the interaction being modelled to the design phase, where this ambiguity is regarded as a resource for design.

  4. Information Models, Data Requirements, and Agile Data Curation

    Science.gov (United States)

    Hughes, John S.; Crichton, Dan; Ritschel, Bernd; Hardman, Sean; Joyner, Ron

    2015-04-01

    The Planetary Data System's next generation system, PDS4, is an example of the successful use of an ontology-based Information Model (IM) to drive the development and operations of a data system. In traditional systems engineering, requirements or statements about what is necessary for the system are collected and analyzed for input into the design stage of systems development. With the advent of big data, the requirements associated with data have begun to dominate, and an ontology-based information model can be used to provide a formalized and rigorous set of data requirements. These requirements address not only the usual issues of data quantity, quality, and disposition but also data representation, integrity, provenance, context, and semantics. In addition, the use of these data requirements during systems development has many characteristics of Agile Curation as proposed by Young et al. [Taking Another Look at the Data Management Life Cycle: Deconstruction, Agile, and Community, AGU 2014], namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. For example, customers can be satisfied through early and continuous delivery of system software and services that are configured directly from the information model. This presentation will describe the PDS4 architecture and its three principal parts: the ontology-based Information Model (IM), the federated registries and repositories, and the REST-based service layer for search, retrieval, and distribution. The development of the IM will be highlighted with special emphasis on knowledge acquisition, the impact of the IM on development and operations, and the use of shared ontologies at multiple governance levels to promote system interoperability and data correlation.

  5. A community effort towards a knowledge-base and mathematical model of the human pathogen Salmonella Typhimurium LT2

    Directory of Open Access Journals (Sweden)

    Shin Sook-Il

    2011-01-01

    Full Text Available Abstract Background Metabolic reconstructions (MRs) are common denominators in systems biology and represent biochemical, genetic, and genomic (BiGG) knowledge-bases for target organisms by capturing currently available information in a consistent, structured manner. Salmonella enterica subspecies I serovar Typhimurium is a human pathogen that causes various diseases, and its increasing antibiotic resistance poses a public health problem. Results Here, we describe a community-driven effort in which more than 20 experts in S. Typhimurium biology and systems biology collaborated to reconcile and expand the S. Typhimurium BiGG knowledge-base. The consensus MR was obtained starting from two independently developed MRs for S. Typhimurium. Key results of this reconstruction jamboree include (i) development and implementation of a community-based workflow for MR annotation and reconciliation; (ii) incorporation of thermodynamic information; and (iii) use of the consensus MR to identify potential multi-target drug therapy approaches. Conclusion Taken together, with the growing number of parallel MRs, a structured, community-driven approach will be necessary to maximize quality while increasing adoption of MRs in experimental design and interpretation.

  6. The Investigation of a Model of Consumers' Responses to Brand Equity Based on Marketing Mix Efforts, Corporate Image and Brand Equity Relation (Case Study: Butane Company)

    Directory of Open Access Journals (Sweden)

    Ahmad Sardari

    2014-07-01

    Full Text Available To survive in today's markets, companies should focus on competitive advantages and on achieving greater consumer satisfaction in order to increase sales and market share. One useful tool that makes a company less vulnerable to competitors' market activities and that encourages repeat purchasing and loyalty is brand equity. The purpose of this paper is to investigate consumers' responses to marketing-mix efforts and the relation between corporate image and brand equity, using the Kim & Hyun (2011) and Buil & Martínez (2013) models. This research is applied in terms of its goal and descriptive-survey in terms of data collection. Hypotheses were tested using structural equation modeling (SEM), in LISREL and PLS software, on data from consumers of Butane Corporation products in Tehran. Findings corroborate the positive impact of brand equity on consumers' responses. The hypothesis tests show that marketing-mix efforts positively impact brand equity and that corporate image plays a significant role in creating brand equity for Butane. Company managers should therefore give special attention to distribution-system growth, after-sales service development, pricing, and promotion when allocating investment across marketing-mix efforts.

  7. Operation TOMODACHI: A Model for American Disaster Response Efforts and the Collective use of Military Forces Abroad

    Science.gov (United States)

    2012-01-01

    …overwhelmed electrical distribution systems in the plant and raised the water temperature, exposing radioactive material to the air. The process heat… …DoD forces in the Pacific worked conjunctively with their subordinate agency, US Forces Japan, to build a separate command structure dedicated to… …aircrews were eventually optimized to meet the very specific airspeed and altitude requirements of the infrared thermography equipment obtaining…

  8. Improving fishing effort descriptors: Modelling engine power and gear-size relations of five European trawl fleets

    DEFF Research Database (Denmark)

    Eigaard, Ole Ritzau; Rihan, Dominic; Graham, Norman

    2011-01-01

    Based on information from an international inventory of gears currently deployed by trawlers in five European countries, the relationship between vessel engine power and trawl size is quantified for different trawl types, trawling techniques and target species. Using multiplicative modelling...

  9. A Model for Forecasting Enlisted Student IA Billet Requirements

    Science.gov (United States)

    2016-03-01

    …were promised and had at least one course failure. Training times: student execution depends on TTT, which includes under-instruction (UI) time and… …Cleared for Public Release. A Model for Forecasting Enlisted Student IA Billet Requirements, Steven W. Belcher with David L. Reese… …and Kletus S. Lawler, March 2016. Copyright © 2016 CNA. This document contains the best opinion of CNA at the time of issue. It does…

  10. On data requirements for calibration of integrated models for urban water systems.

    Science.gov (United States)

    Langeveld, Jeroen; Nopens, Ingmar; Schilperoort, Remy; Benedetti, Lorenzo; de Klein, Jeroen; Amerlinck, Youri; Weijers, Stefan

    2013-01-01

    Modeling of integrated urban water systems (IUWS) has seen rapid development in recent years. Models and software are available that describe the process dynamics in sewers, wastewater treatment plants (WWTPs) and receiving water systems, as well as at the interfaces between the submodels. Successful applications of integrated modeling are, however, relatively scarce. One of the reasons for this is the lack of high-quality monitoring data with the spatial and temporal resolution and accuracy required to calibrate and validate the integrated models, even though the state of the art of monitoring itself is no longer the limiting factor. This paper discusses the efforts required to meet the data demands associated with integrated modeling and describes the methods applied to validate the monitoring data and to use submodels as software sensors providing the necessary input for other submodels. The main conclusion of the paper is that state-of-the-art monitoring is in principle sufficient to provide the data necessary to calibrate integrated models, but practical limitations resulting in incomplete data-sets hamper widespread application. In order to overcome these difficulties, the redundancy of future monitoring networks should be increased and, at the same time, data handling (including data validation, mining and assimilation) should receive much more attention.

  11. A longitudinal multilevel model analysis of the within-person and between-person effect of effortful engagement and academic self-efficacy on academic performance.

    Science.gov (United States)

    Galla, Brian M; Wood, Jeffrey J; Tsukayama, Eli; Har, Kim; Chiu, Angela W; Langer, David A

    2014-06-01

    Using data from an accelerated longitudinal study, we examined the within-person and between-person effects of effortful engagement and academic self-efficacy on academic performance across students (N=135) in elementary school. Teachers assessed participants' effortful engagement and participants rated their academic self-efficacy once per year for 3 years. Academic performance was assessed through standardized test scores in reading and math. Multilevel models indicated that within-person change in Effortful Engagement and Academic Self-Efficacy scores significantly predicted concomitant within-person change in reading test scores, B=2.71, p=.043, Pseudo-R2=.02 and B=4.72, p=.005, Pseudo-R2=.04, respectively. Participants with higher between-person levels of Effortful Engagement had higher initial reading test scores, B=10.03, p=.001, Pseudo-R2=.09, and math test scores, B=11.20. Participants with higher between-person Academic Self-Efficacy showed a faster rate of increase in math test scores across elementary school, B=10.21, p=.036, Pseudo-R2=.25. At the between-person level, Effortful Engagement mediated the association between Academic Self-Efficacy and both reading and math test scores, although no support was found for mediation at the within-person level. Collectively, results suggest that trait-level psychological factors can vary meaningfully within school-aged children and that both within-person change and between-person individual differences in these traits have important consequences for academic performance.
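
    The within-/between-person split used above rests on person-mean centering, which can be sketched directly; the repeated-measure scores below are hypothetical.

```python
from statistics import mean

def within_between(scores_by_person):
    """Decompose each person's repeated measures into a between-person
    component (that person's own mean) and within-person deviations
    from it -- the centering step behind separating the two effects
    in a longitudinal multilevel model."""
    out = {}
    for person, xs in scores_by_person.items():
        m = mean(xs)
        out[person] = {"between": m, "within": [x - m for x in xs]}
    return out

# Three yearly engagement ratings for two hypothetical students.
decomp = within_between({"s1": [3.0, 3.5, 4.0], "s2": [2.0, 2.0, 2.6]})
```

    The between-person values then predict level differences across students, while the within-person deviations predict year-to-year change within a student.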

  12. Modeling the effects of promotional efforts on aggregate pharmaceutical demand : What we know and challenges for the future

    NARCIS (Netherlands)

    Wieringa, J.E.; Osinga, E.C.; Conde, E.R.; Leeflang, P.S.H.; Stern, P.; Ding, M.; Eliashberg, J.; Stremersch, S.

    2014-01-01

    Pharmaceutical marketing is becoming an important area of research in its own right, as evidenced by the steady increase in relevant papers published in the major marketing journals in recent years. These papers utilize different modeling techniques and types of data. In this chapter we focus on

  13. Efforts to Increase Students Reading Interest on Educational Reference Through Classical Guidance and Counseling Experiential Learning Model

    Directory of Open Access Journals (Sweden)

    Tatik Sutarti

    2017-03-01

    Full Text Available The objective of the research is to improve students' reading interest in educational references through a classical guidance and counseling experiential learning model. The research was carried out at STKIP Pacitan in the second semester of the 2016/2017 academic year. The subjects were 20 fourth-semester students of STKIP Pacitan. The method was Classroom Action Research (CAR). The data were processed in three stages: data reduction, data presentation, and conclusion drawing or verification. The research found that the classical guidance and counseling experiential learning model gave students the opportunity to share their ideas about problems of reading interest, which were then solved together through critical thinking.

  14. CFD Modelling and Experimental Testing of Thermal Calcination of Kaolinite Rich Clay Particles - An Effort towards Green Concrete

    DEFF Research Database (Denmark)

    Gebremariam, Abraham Teklay

    Cement industry is one of the major industrial emitters of greenhouse gases, generating 5-7% of total anthropogenic CO2 emissions. Consequently, the use of supplementary cementitious materials (SCMs) to replace part of the CO2-intensive cement clinker is an attractive way to mitigate CO2 emissions… …from the cement industry. SCMs based on industrial byproducts such as fly ash and slag are subject to availability problems, yet clays are among the most ubiquitous materials in the earth's crust, so properly calcined clays are a very promising SCM candidate for producing green cements. Calcination… …PROcess Modeling System) software, which was suspended during the project due to an adjustment made by the project consortium. The model results from both the C++ and gPROMS implementations show good agreement. Various experiments have been performed to derive key kinetic data and to collect data from a gas suspension…

  15. An effort to improve track and intensity prediction of tropical cyclones through vortex initialization in NCUM-global model

    Science.gov (United States)

    Singh, Vivek; Routray, A.; Mallick, Swapan; George, John P.; Rajagopal, E. N.

    2016-05-01

    Tropical cyclones (TCs) have a strong impact on the socio-economic conditions of countries such as India, Bangladesh and Myanmar owing to their devastating power. This creates the need for a precise forecasting system that can predict TC tracks and intensities accurately, well in advance; however, this has been a great challenge for major operational meteorological centers over the years. The genesis of TCs over the data-sparse warm tropical ocean adds further difficulty. Weak and misplaced vortices at the initial time are one of the prime sources of track and intensity errors in numerical weather prediction (NWP) models. Many previous studies have reported that the forecast skill for TC track and intensity improved due to the assimilation of satellite data along with vortex initialization (VI). Keeping this in mind, an attempt has been made to investigate the impact of vortex initialization on TC simulation using the UK Met Office global model operational at NCMRWF (NCUM). The assessment is carried out for the case of the extremely severe cyclonic storm "Chapala", which occurred over the Arabian Sea (AS) from 28 October to 3 November 2015. Two numerical experiments, Vort-GTS (assimilation of GTS observations with VI) and Vort-RAD (same as Vort-GTS with assimilation of satellite data), are carried out. This vortex initialization study in the NCUM model is the first of its type over the North Indian Ocean (NIO). The model simulation of the TC is carried out from five different initial conditions through 24-hour cycles for both experiments. The results indicate that vortex initialization with assimilation of satellite data has a positive impact on the track and intensity forecasts and on the landfall time and position errors of the TC.

  16. Dopamine, behavioral economics, and effort

    Directory of Open Access Journals (Sweden)

    John D Salamone

    2009-09-01

    Full Text Available Abstract. There are numerous problems with the hypothesis that brain dopamine (DA) systems, particularly in the nucleus accumbens, directly mediate the rewarding or primary motivational characteristics of natural stimuli such as food. Research and theory related to the functions of mesolimbic DA are undergoing a substantial conceptual restructuring, with the traditional emphasis on hedonia and primary reward yielding to other concepts and lines of inquiry. The present review is focused upon the involvement of nucleus accumbens DA in behavioral activation and effort-related processes. Viewed from the framework of behavioral economics, the effects of accumbens DA depletions and antagonism on food-reinforced behavior are highly dependent upon the work requirements of the instrumental task, and DA depleted rats are more sensitive to increases in response costs (i.e., ratio requirements). Moreover, interference with accumbens DA transmission exerts a powerful influence over effort-related choice behavior. Rats with accumbens DA depletions or antagonism reallocate their instrumental behavior away from food-reinforced tasks that have high response requirements, and instead these rats select a less-effortful type of food-seeking behavior. Nucleus accumbens DA and adenosine interact in the regulation of effort-related functions, and other brain structures (anterior cingulate cortex, amygdala, ventral pallidum) also are involved. Studies of the brain systems regulating effort-based processes may have implications for understanding drug abuse, as well as energy-related disorders such as psychomotor slowing, fatigue or anergia in depression and other neurological disorders.

  17. Dopamine, behavioral economics, and effort.

    Science.gov (United States)

    Salamone, John D; Correa, Merce; Farrar, Andrew M; Nunes, Eric J; Pardo, Marta

    2009-01-01

    There are numerous problems with the hypothesis that brain dopamine (DA) systems, particularly in the nucleus accumbens, directly mediate the rewarding or primary motivational characteristics of natural stimuli such as food. Research and theory related to the functions of mesolimbic DA are undergoing a substantial conceptual restructuring, with the traditional emphasis on hedonia and primary reward yielding to other concepts and lines of inquiry. The present review is focused upon the involvement of nucleus accumbens DA in behavioral activation and effort-related processes. Viewed from the framework of behavioral economics, the effects of accumbens DA depletions and antagonism on food-reinforced behavior are highly dependent upon the work requirements of the instrumental task, and DA depleted rats are more sensitive to increases in response costs (i.e., ratio requirements). Moreover, interference with accumbens DA transmission exerts a powerful influence over effort-related choice behavior. Rats with accumbens DA depletions or antagonism reallocate their instrumental behavior away from food-reinforced tasks that have high response requirements, and instead these rats select a less-effortful type of food-seeking behavior. Nucleus accumbens DA and adenosine interact in the regulation of effort-related functions, and other brain structures (anterior cingulate cortex, amygdala, ventral pallidum) also are involved. Studies of the brain systems regulating effort-based processes may have implications for understanding drug abuse, as well as energy-related disorders such as psychomotor slowing, fatigue or anergia in depression and other neurological disorders.

  18. Baseline requirements of the proposed action for the Transportation Management Division routing models

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, P.E.; Joy, D.S.

    1995-02-01

    The potential impacts associated with the transportation of hazardous materials are important to shippers, carriers, and the general public. This is particularly true for shipments of radioactive material. The shippers are primarily concerned with safety, security, efficiency, and equipment requirements. The carriers are concerned with the potential impact that radioactive shipments may have on their operations, particularly if such materials are involved in an accident. The general public has also expressed concerns regarding the safety of transporting radioactive and other hazardous materials through their communities. Because transportation routes are a central concern in hazardous material transport, the prediction of likely routes is the first step toward resolution of these issues. In response to these routing needs, several models have been developed over the past fifteen years at Oak Ridge National Laboratory (ORNL). The HIGHWAY routing model is used to predict routes for truck transportation, the INTERLINE routing model is used to predict both rail and barge routes, and the AIRPORT locator model is used to find airports meeting specified criteria near a given location. As part of the ongoing improvement of the US Department of Energy's (DOE) Environmental Management Transportation Management Division's (EM-261) computer systems and development efforts, a Baseline Requirements Assessment Session on the HIGHWAY, INTERLINE, and AIRPORT models was held at ORNL on April 27, 1994. The purpose of this meeting was to discuss the existing capabilities of the models and data bases and to review enhancements of the models and data bases to expand their usefulness. The results of the Baseline Requirements Assessment Session are discussed in this report. The discussions pertaining to the different models are contained in separate sections.
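
    At their core, route-prediction models compute preferred paths over a weighted transport network. A minimal sketch of that computation is a shortest-path search; the network below is hypothetical, and models such as HIGHWAY and INTERLINE weigh many additional routing criteria.

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra shortest path over a weighted network.
    `graph` maps node -> list of (neighbor, edge_weight)."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:                       # reconstruct the route
            path = [node]
            while path[-1] in prev:
                path.append(prev[path[-1]])
            return d, path[::-1]
        if d > dist.get(node, float("inf")):   # stale queue entry
            continue
        for nxt, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                prev[nxt] = node
                heapq.heappush(pq, (nd, nxt))
    return float("inf"), []

# Hypothetical network: nodes are terminals, weights are route costs.
graph = {"A": [("B", 1.0), ("C", 4.0)],
         "B": [("C", 1.0), ("D", 6.0)],
         "C": [("D", 1.0)]}
cost, route = shortest_route(graph, "A", "D")
```

    Swapping edge weights (distance, time, population exposure) changes which route is "likely" without changing the search itself, which is how a single routing engine can serve different assessment criteria.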

  19. Optimal effort investment for overcoming the weakest point: new insights from a computational model of neuromuscular adaptation.

    Science.gov (United States)

    Arandjelović, Ognjen

    2011-08-01

    The occurrence of so-called sticking points in a lift is pervasive in weight training practice. Biomechanically complex exercises often exhibit multi-modal variation of effective force exerted against the load as a function of the elevation and velocity of the load. This results in a variety of possible loci for the occurrence of sticking points and makes the problem of designing the optimal training strategy to overcome them challenging. In this article a case founded on theoretical grounds is made against a purely empirical method. It is argued that the nature of the problem considered and the wide range of variables involved limit the generality of conclusions which can be drawn from experimental studies alone. Instead an alternative is described, whereby a recently proposed mathematical model of neuromuscular adaptation is employed in a series of computer simulations. These are used to examine quantitatively the effects of differently targeted partial range of motion (ROM) training approaches. Counter-intuitively and in contrast to common training practices, the key novel insight inferred from the obtained results is that in some cases the most effective approach for improving performance in an exercise with a sticking point at a particular point in the ROM is to improve force production capability at a different and possibly remote position in the lift. In the context of the employed model, this result is explained by changes in the neuromuscular and biomechanical environment for force production.

  20. SMV model-based safety analysis of software requirements

    Energy Technology Data Exchange (ETDEWEB)

    Koh, Kwang Yong [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of); Seong, Poong Hyun [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of)], E-mail: phseong@kaist.ac.kr

    2009-02-15

    Fault tree analysis (FTA) is one of the most frequently applied safety analysis techniques when developing safety-critical industrial systems such as software-based emergency shutdown systems of nuclear power plants and has been used for safety analysis of software requirements in the nuclear industry. However, the conventional method for safety analysis of software requirements has several problems in terms of correctness and efficiency; the fault tree generated from natural language specifications may contain flaws or errors while the manual work of safety verification is very labor-intensive and time-consuming. In this paper, we propose a new approach to resolve problems of the conventional method; we generate a fault tree from a symbolic model verifier (SMV) model, not from natural language specifications, and verify safety properties automatically, not manually, by a model checker SMV. To demonstrate the feasibility of this approach, we applied it to shutdown system 2 (SDS2) of Wolsong nuclear power plant (NPP). In spite of subtle ambiguities present in the approach, the results of this case study demonstrate its overall feasibility and effectiveness.
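
    The safety-verification question behind this approach can be illustrated with a toy explicit-state reachability check. SMV itself verifies properties symbolically over a formal model; the shutdown-logic states below are hypothetical and serve only to convey the idea.

```python
from collections import deque

def unsafe_reachable(initial, transitions, is_unsafe):
    """Explicit-state breadth-first search over a finite transition
    system: returns True if some reachable state violates the safety
    property -- the core question a model checker answers."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        if is_unsafe(state):
            return True
        for nxt in transitions(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return False

# Toy shutdown logic (hypothetical): state = (pressure, tripped).
# Pressure may rise by one each step; the trip latches at pressure >= 3.
def step(state):
    pressure, tripped = state
    if tripped:
        return []                      # latched: no further evolution
    return [(pressure + 1, pressure + 1 >= 3)]

# Safety property: the system is never untripped at pressure >= 4.
violation = unsafe_reachable((0, False), step, lambda s: s[0] >= 4 and not s[1])
```

    A model checker performs exactly this exploration, but symbolically and exhaustively, and produces a counterexample trace when the property fails instead of a bare True.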

  1. Experimental development based on mapping rule between requirements analysis model and web framework specific design model.

    Science.gov (United States)

    Okuda, Hirotaka; Ogata, Shinpei; Matsuura, Saeko

    2013-12-01

    Model Driven Development is a promising approach for developing high quality software systems. We have proposed a method of model-driven requirements analysis using the Unified Modeling Language (UML). The main feature of our method is to automatically generate a Web user interface prototype from the UML requirements analysis model, so that the validity of input/output data for each page, and of page transitions within the system, can be confirmed by directly operating the prototype. We propose a mapping rule that defines design information independent of any particular web application framework implementation, based on the requirements analysis model, in order to improve traceability from the validated requirements analysis model to the final product. This paper discusses the result of applying our method to the development of a Group Work Support System that is currently running in our department.

  2. User requirements for hydrological models with remote sensing input

    Energy Technology Data Exchange (ETDEWEB)

    Kolberg, Sjur

    1997-10-01

    Monitoring the seasonal snow cover is important for several purposes. This report describes user requirements for hydrological models utilizing remotely sensed snow data. The information was mainly provided by operational users through a questionnaire. The report is primarily intended as a basis for other work packages within the Snow Tools project, which aim at developing new remote sensing products for use in hydrological models. The HBV model is the only model mentioned by users in the questionnaire. It is widely used in Northern Scandinavia and Finland for hydroelectric power production, flood forecasting and general monitoring of water resources. The current implementation of HBV does not use remotely sensed data, although even the present implementation could benefit from such data. Beyond this, several improvements can be made to hydrological models to exploit remotely sensed snow data; the most important are a distributed model version, a more physical treatment of the snow depletion curve, and a way to combine data from several sources. 1 ref.
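
    A minimal sketch of the suggested depletion-curve improvement (the curve shape, parameters, and blending weight are invented, not HBV's): snow-covered area (SCA) shrinks gradually as the pack melts, and a satellite SCA observation can be inverted and blended into the modelled snow water equivalent (SWE).

```python
import math

def sca_from_swe(swe, swe_max):
    """Fractional snow-covered area for a grid cell's mean SWE (mm)."""
    if swe <= 0.0:
        return 0.0
    return min(1.0, math.tanh(2.0 * swe / swe_max))

def swe_from_sca(sca, swe_max):
    """Invert the depletion curve (only unique while sca < 1)."""
    return 0.5 * swe_max * math.atanh(min(sca, 0.999))

def assimilate(swe_model, sca_obs, swe_max, weight=0.5):
    """Blend modelled SWE with a satellite-derived SWE estimate."""
    return (1.0 - weight) * swe_model + weight * swe_from_sca(sca_obs, swe_max)
```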

  3. Life sciences research in space: The requirement for animal models

    Science.gov (United States)

    Fuller, C. A.; Philips, R. W.; Ballard, R. W.

    1987-01-01

    Use of animals in NASA space programs is reviewed. Animals are needed because life science experimentation frequently requires long-term controlled exposure to environments, statistical validation, invasive instrumentation or biological tissue sampling, tissue destruction, exposure to dangerous or unknown agents, or sacrifice of the subject. The availability and use of human subjects inflight is complicated by the multiple needs and demands upon crew time. Because only living organisms can sense, integrate and respond to the environment around them, the sole use of tissue culture and computer models is insufficient for understanding the influence of the space environment on intact organisms. Equipment for spaceborne experiments with animals is described.

  4. Bayesian Analysis for Risk Assessment of Selected Medical Events in Support of the Integrated Medical Model Effort

    Science.gov (United States)

    Gilkey, Kelly M.; Myers, Jerry G.; McRae, Michael P.; Griffin, Elise A.; Kallrui, Aditya S.

    2012-01-01

    The Exploration Medical Capability project is creating a catalog of risk assessments using the Integrated Medical Model (IMM). The IMM is a software-based system intended to assist mission planners in preparing for spaceflight missions by helping them to make informed decisions about medical preparations and supplies needed for combating and treating various medical events using Probabilistic Risk Assessment. The objective is to use statistical analyses to inform the IMM decision tool with estimated probabilities of medical events occurring during an exploration mission. Because data regarding astronaut health are limited, Bayesian statistical analysis is used. Bayesian inference combines prior knowledge, such as data from the general U.S. population, the U.S. Submarine Force, or the analog astronaut population located at the NASA Johnson Space Center, with observed data for the medical condition of interest. The posterior results reflect the best evidence for specific medical events occurring in flight. Bayes' theorem provides a formal mechanism for combining available observed data with data from similar studies to support the quantification process. The IMM team performed Bayesian updates on the following medical events: angina, appendicitis, atrial fibrillation, atrial flutter, dental abscess, dental caries, dental periodontal disease, gallstone disease, herpes zoster, renal stones, seizure, and stroke.
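
    The update itself is a one-line conjugate computation; the sketch below uses invented counts, not IMM data, with a Beta prior standing in for the analog-population evidence:

```python
def beta_update(a, b, events, trials):
    """Beta(a, b) prior -> posterior after `events` in `trials` (conjugacy)."""
    return a + events, b + (trials - events)

def beta_mean(a, b):
    return a / (a + b)

# Hypothetical: analog population suggests ~2% annual event probability,
# encoded as Beta(2, 98); then 1 event is observed in 150 astronaut-years.
a0, b0 = 2.0, 98.0
a1, b1 = beta_update(a0, b0, events=1, trials=150)
posterior_mean = beta_mean(a1, b1)
```

    With these numbers the posterior mean (3/250 = 1.2%) lies between the 2% prior mean and the ~0.7% observed rate, pulled down by the comparatively event-free observation period.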

  5. Glass Property Models and Constraints for Estimating the Glass to be Produced at Hanford by Implementing Current Advanced Glass Formulation Efforts

    Energy Technology Data Exchange (ETDEWEB)

    Vienna, John D.; Kim, Dong-Sang; Skorski, Daniel C.; Matyas, Josef

    2013-07-31

    Recent glass formulation and melter testing data have suggested that significant increases in waste loading in HLW and LAW glasses are possible over current system planning estimates. The data (although limited in some cases) were evaluated to determine a set of constraints and models that could be used to estimate the maximum loading of specific waste compositions in glass. It is recommended that these models and constraints be used to estimate the likely HLW and LAW glass volumes that would result if the current glass formulation studies are successfully completed. It is recognized that some of the models are preliminary in nature and will change in the coming years. In addition, the models do not currently address the prediction uncertainties that would be needed before they could be used in plant operations. The models and constraints are only meant to give an indication of rough glass volumes and are not intended for use in plant operation or waste form qualification activities; a research program is in place to develop the data, models, and uncertainty descriptions for those purposes. A fundamental tenet underlying the research reported in this document is to be less conservative than previous studies when developing constraints for estimating the glass to be produced by implementing current advanced glass formulation efforts. The less conservative approach documented herein should allow estimation of the glass masses that may be realized if the current efforts in advanced glass formulations are completed over the coming years and are as successful as early indications suggest. Because of this approach, there is an unquantifiable uncertainty in the ultimate glass volume projections, due to model prediction uncertainties, that has to be considered along with other system uncertainties such as the waste compositions and amounts to be immobilized, the split factors between LAW and HLW, etc.
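
    The estimation logic reduces to finding the largest waste loading that still satisfies every property constraint; the sketch below uses invented linear constraint forms and limits, not the report's validated property models:

```python
def constraints_ok(w):
    """Hypothetical property constraints as functions of waste loading w."""
    viscosity = 6.0 - 4.0 * w           # must stay >= 2 (arbitrary units)
    crystal_fraction = 0.05 + 0.4 * w   # must stay <= 0.2
    return viscosity >= 2.0 and crystal_fraction <= 0.2

def max_loading(ok, lo=0.0, hi=1.0, tol=1e-6):
    """Bisect for the largest w with ok(w) true, assuming ok is monotone."""
    if not ok(lo):
        return None
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if ok(mid):
            lo = mid
        else:
            hi = mid
    return lo

w_max = max_loading(constraints_ok)  # here the crystal-fraction limit binds
```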

  7. Modeling of Testability Requirement Based on Generalized Stochastic Petri Nets

    Institute of Scientific and Technical Information of China (English)

    SU Yong-ding; QIU Jing; LIU Guan-jun; QIAN Yan-ling

    2009-01-01

    Testability design is an effective way to realize fault detection and isolation. An important step is to determine testability figures of merit (TFOM). Firstly, several factors that influence TFOMs are analyzed, such as the processes of system operation, maintenance and support, and fault detection and isolation. Secondly, a testability requirement analysis model is built based on a generalized stochastic Petri net (GSPN). Then the system's reachable states are analyzed on the basis of the model: a Markov chain isomorphic with the Petri net is constructed, a state transition matrix is created, and the system's steady state probabilities are obtained. The relationship between steady state availability and testability parameters can then be revealed and reasoned about. Finally, an example shows that the proposed method can determine TFOMs, such as the fault detection rate and fault isolation rate, effectively and reasonably.
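
    The steady-state step can be sketched numerically (the three-state chain and its transition probabilities are invented, not derived from an actual GSPN):

```python
def steady_state(P, n_iter=10_000):
    """Power iteration for the stationary distribution of row-stochastic P."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(n_iter):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# States: 0 = up, 1 = fault detected (under repair), 2 = fault undetected.
P = [
    [0.95, 0.04, 0.01],  # up: mostly stays up; most faults are detected
    [0.50, 0.50, 0.00],  # detected faults get repaired
    [0.10, 0.20, 0.70],  # undetected faults linger until eventually found
]
pi = steady_state(P)
availability = pi[0]  # steady-state probability of the "up" state
```

    Raising the fault detection rate moves probability mass from the lingering undetected-fault state into the quickly repaired detected-fault state, which raises the steady-state availability pi[0].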

  8. A Utility-Based Model for Determining Functional Requirement Target Values (Model Penentuan Nilai Target Functional Requirement Berbasis Utilitas)

    Directory of Open Access Journals (Sweden)

    Cucuk Nur Rosyidi

    2012-01-01

    In a product design and development process, a designer faces the problem of deciding functional requirement (FR) target values. That decision is made under risk, since it is taken in the early design phase using incomplete information. A utility function can be used to reflect the decision maker's attitude towards risk in making such a decision. In this research, we develop a utility-based model to determine FR target values using a quadratic utility function and information from Quality Function Deployment (QFD). A pencil design is used as a numerical example, with a quadratic utility function for each FR. The model can be applied to balance customer and designer interests in determining FR target values.
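
    The core computation is small; the sketch below assumes a concave quadratic utility per FR with invented coefficients and a feasible range taken from QFD:

```python
def fr_target(a, b, c, lo, hi):
    """Maximize the quadratic utility a + b*x + c*x**2 (c < 0) over [lo, hi]."""
    x_star = -b / (2.0 * c)  # unconstrained peak of the concave parabola
    return min(max(x_star, lo), hi)

# Hypothetical FR with a feasible range [1, 10] derived from QFD:
target = fr_target(a=0.0, b=3.0, c=-0.25, lo=1.0, hi=10.0)  # -> 6.0
```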

  9. Requirements for high level models supporting design space exploration in model-based systems engineering

    NARCIS (Netherlands)

    Haveman, Steven; Bonnema, Gerrit Maarten

    2013-01-01

    Most formal models are used in detailed design and focus on a single domain. Few approaches exist that can effectively tie these lower level models to a high level system model during design space exploration. This complicates the validation of high level system requirements during

  10. A GENERALIZATION OF TRADITIONAL KANO MODEL FOR CUSTOMER REQUIREMENTS ANALYSIS

    Directory of Open Access Journals (Sweden)

    Renáta Turisová

    2015-07-01

    Purpose: The theory of attractiveness determines the relationship between the technically achieved and the customer-perceived quality of product attributes. The most frequently used approach in the theory of attractiveness is the implementation of Kano's model. There exist many generalizations of that model which take into consideration various aspects and approaches focused on understanding customer preferences and identifying a customer's priorities for a product on sale. The aim of this article is to outline another possible generalization of Kano's model. Methodology/Approach: The traditional Kano model captures the nonlinear relationship between the achieved quality attributes and customer requirements. The individual quality attributes are divided into three main categories: must-be, one-dimensional and attractive quality, and into two side categories: indifferent and reverse quality. A well-selling product has to contain the must-be attributes and should contain as many one-dimensional attributes as possible. If it also offers supplementary attractive attributes, the attractiveness of the entire product, from the viewpoint of the customer, rises sharply and nonlinearly, which has a direct positive impact on a potential customer's decision to purchase the product. In this article, we show that the assignment of a product's individual quality attributes to the mentioned categories depends, among other things, on the life-cycle costs of the product and on its market price. Findings: In practice, products are often placed into different price categories: lower, middle and upper class. For certain types of products the category is either directly declared by the producer (especially in the automotive industry) or is determined by the customer through an assessment of available market prices. Different customer expectations can be assigned to each of those product groups.
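
    The base model being generalized can be made concrete with the standard Kano evaluation table (this sketch covers only the classical categories; the article's price-category extension is not modelled):

```python
# Canonical five answers to the functional/dysfunctional question pair.
ANSWERS = ["like", "must-be", "neutral", "live-with", "dislike"]

# Rows: answer to the functional question; columns: dysfunctional question.
# A=attractive, O=one-dimensional, M=must-be, I=indifferent, R=reverse,
# Q=questionable (contradictory answers).
TABLE = [
    # like  must-be neutral live-with dislike   <- dysfunctional answer
    ["Q",   "A",    "A",    "A",      "O"],     # functional: like
    ["R",   "I",    "I",    "I",      "M"],     # functional: must-be
    ["R",   "I",    "I",    "I",      "M"],     # functional: neutral
    ["R",   "I",    "I",    "I",      "M"],     # functional: live-with
    ["R",   "R",    "R",    "R",      "Q"],     # functional: dislike
]

def kano_class(functional, dysfunctional):
    """Classify one quality attribute from a customer's two answers."""
    return TABLE[ANSWERS.index(functional)][ANSWERS.index(dysfunctional)]
```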

  11. Neurocomputational mechanisms underlying subjective valuation of effort costs

    Science.gov (United States)

    Giehl, Kathrin; Sillence, Annie

    2017-01-01

    In everyday life, we have to decide whether it is worth exerting effort to obtain rewards. Effort can be experienced in different domains, with some tasks requiring significant cognitive demand and others being more physically effortful. The motivation to exert effort for reward is highly subjective and varies considerably across the different domains of behaviour. However, very little is known about the computational or neural basis of how different effort costs are subjectively weighed against rewards. Is there a common, domain-general system of brain areas that evaluates all costs and benefits? Here, we used computational modelling and functional magnetic resonance imaging (fMRI) to examine the mechanisms underlying value processing in both the cognitive and physical domains. Participants were trained on two novel tasks that parametrically varied either cognitive or physical effort. During fMRI, participants indicated their preferences between a fixed low-effort/low-reward option and a variable higher-effort/higher-reward offer for each effort domain. Critically, reward devaluation by both cognitive and physical effort was subserved by a common network of areas, including the dorsomedial and dorsolateral prefrontal cortex, the intraparietal sulcus, and the anterior insula. Activity within these domain-general areas also covaried negatively with reward and positively with effort, suggesting an integration of these parameters within these areas. Additionally, the amygdala appeared to play a unique, domain-specific role in processing the value of rewards associated with cognitive effort. These results are the first to reveal the neurocomputational mechanisms underlying subjective cost–benefit valuation across different domains of effort and provide insight into the multidimensional nature of motivation. PMID:28234892
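
    A common way to formalize such valuation, used here as an illustrative sketch with invented parameters rather than the study's fitted model, is parabolic effort discounting combined with a softmax choice rule:

```python
import math

def subjective_value(reward, effort, k):
    """Parabolic effort discounting: SV = reward - k * effort**2."""
    return reward - k * effort ** 2

def p_choose_offer(offer, baseline, k, beta=2.0):
    """Softmax probability of accepting the higher-effort/higher-reward offer."""
    dv = subjective_value(*offer, k=k) - subjective_value(*baseline, k=k)
    return 1.0 / (1.0 + math.exp(-beta * dv))

baseline = (1.0, 0.2)  # fixed low-effort/low-reward option: (reward, effort)
p_tolerant = p_choose_offer((3.0, 0.8), baseline, k=1.0)  # shallow discounting
p_averse = p_choose_offer((3.0, 0.8), baseline, k=5.0)    # steep discounting
```

    Fitting the discount parameter k per participant (and per effort domain) is what makes the motivation to exert effort quantifiably subjective.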

  12. MODSARE-V: Validation of Dependability and Safety Critical Software Components with Model Based Requirements

    Science.gov (United States)

    Silveira, Daniel T. de M. M.; Schoofs, Tobias; Alana Salazar, Elena; Rodriguez Rodriguez, Ana Isabel; Devic, Marie-Odile

    2010-08-01

    The wide use of RAMS methods and techniques [1] (e.g. SFMECA, SFTA, HAZOP, HA...) in critical software development has resulted in the specification of new software requirements, design constraints and other issues such as mandatory coding rules. Given the large variety of RAMS requirements and techniques, different types of Verification and Validation (V&V) [14] are spread over the phases of the software engineering process. As a result, the V&V process becomes complex, and the cost and time required for a complete and consistent V&V process increase. By introducing a model-based approach to facilitate the RAMS requirements definition process, the time and effort for V&V may be reduced. MODSARE-V demonstrates the feasibility of this concept through case studies applied to ground and on-board software space projects with critical functions/components. This paper describes the approach adopted in MODSARE-V to realize the concept as a prototype and summarizes the results and conclusions reached after applying the prototype to the case studies.

  13. Finite element model approach of a cylindrical lithium ion battery cell with a focus on minimization of the computational effort and short circuit prediction

    Science.gov (United States)

    Raffler, Marco; Sevarin, Alessio; Ellersdorfer, Christian; Heindl, Simon F.; Breitfuss, Christoph; Sinz, Wolfgang

    2017-08-01

    In this research, a parameterized beam-element-based mechanical modeling approach for cylindrical lithium ion batteries is developed. With the goal to use the cell model in entire vehicle crash simulations, focus of development is on minimizing the computational effort whilst simultaneously obtaining accurate mechanical behavior. The cylindrical cell shape is approximated by radial beams connected to each other in circumferential and longitudinal directions. The discrete beam formulation is used to define an anisotropic material behavior. An 18650 lithium ion cell model constructed in LS-Dyna is used to show the high degree of parameterization of the approach. A criterion which considers the positive pole deformation and the radial deformation of the cell is developed for short circuit prediction during simulation. An abuse testing program, consisting of radial crush, axial crush, and penetration is performed to evaluate the mechanical properties and internal short circuit behavior of a commercially available 18650 lithium cell. Additional 3-point-bending tests are performed to verify the approach objectively. By reducing the number of strength-related elements to 1600, a fast and accurate cell model can be created. Compared to typical cell models in technical literature, simulation time of a single cell load case can be reduced by approx. 90%.

  14. Reproductive effort in viscous populations

    NARCIS (Netherlands)

    Pen, Ido

    2000-01-01

    Here I study a kin selection model of reproductive effort, the allocation of resources to fecundity versus survival, in a patch-structured population. Breeding females remain in the same patch for life. Offspring have costly, partial long-distance dispersal and compete for breeding sites, which beco

  15. LSST camera heat requirements using CFD and thermal seeing modeling

    Science.gov (United States)

    Sebag, Jacques; Vogiatzis, Konstantinos

    2010-07-01

    The LSST camera is located above the LSST primary/tertiary mirror and in front of the secondary mirror in the shadow of its central obscuration. Due to this position within the optical path, heat released from the camera has a potential impact on the seeing degradation that is larger than traditionally estimated for Cassegrain or Nasmyth telescope configurations. This paper presents the results of thermal seeing modeling combined with Computational Fluid Dynamics (CFD) analyses to define the thermal requirements on the LSST camera. Camera power output fluxes are applied to the CFD model as boundary conditions to calculate the steady-state temperature distribution on the camera and the air inside the enclosure. Using a previously presented post-processing analysis to calculate the optical seeing based on the mechanical turbulence and temperature variations along the optical path, the optical performance resulting from the seeing is determined. The CFD simulations are repeated for different wind speeds and orientations to identify the worst case scenario and generate an estimate of seeing contribution as a function of camera-air temperature difference. Finally, after comparing with the corresponding error budget term, a maximum allowable temperature for the camera is selected.
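
    The final inversion step can be sketched as follows (the power-law form is only a plausible fit shape, and the coefficient and budget values are invented, not LSST numbers):

```python
def seeing_arcsec(dT, c=0.10, exponent=1.2):
    """Thermal seeing contribution for a camera running dT (deg C) above ambient."""
    return c * max(dT, 0.0) ** exponent

def max_allowable_dT(budget_arcsec, c=0.10, exponent=1.2):
    """Invert the fitted seeing law at the error-budget allocation."""
    return (budget_arcsec / c) ** (1.0 / exponent)
```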

  16. Research on Computer Aided Innovation Model of Weapon Equipment Requirement Demonstration

    Science.gov (United States)

    Li, Yong; Guo, Qisheng; Wang, Rui; Li, Liang

    Firstly, in order to overcome the shortcomings of using either AD or TRIZ alone, and to solve the problems currently existing in weapon equipment requirement demonstration, this paper constructs a method system for weapon equipment requirement demonstration combining QFD, AD, TRIZ and FA. Then we construct a CAI model framework for weapon equipment requirement demonstration, which includes a requirement decomposition model, a requirement mapping model and a requirement plan optimization model. Finally, we construct the computer aided innovation model of weapon equipment requirement demonstration and develop CAI software for equipment requirement demonstration.

  17. Requirements-Driven Deployment: Customizing the Requirements Model for the Host Environment

    NARCIS (Netherlands)

    Ali, Raian; Dalpiaz, Fabiano; Giorgini, Paolo

    2014-01-01

    Deployment is a main development phase which configures a software to be ready for use in a certain environment. The ultimate goal of deployment is to enable users to achieve their requirements while using the deployed software. However, requirements are not uniform and differ between deployment env

  18. A 2nd generation static model of greenhouse energy requirements (horticern) : a comparison with dynamic models

    CERN Document Server

    Jolliet, O; Munday, G L

    1989-01-01

    Optimisation of a greenhouse and its components requires a suitable model permitting precise determination of its energy requirements. Existing static models are simple but lack precision; dynamic models, though more precise, are unsuitable for use over long periods and difficult to handle in practice. A theoretical study and measurements from the CERN trial greenhouse have allowed the development of a new static model named "HORTICERN", precise and easy to use for predicting energy consumption, which takes into account the effects of solar energy, wind and radiative loss to the sky. This paper compares the HORTICERN model with the dynamic models of Bot, Takakura, Van Bavel and Gembloux, and demonstrates that its precision is comparable: differences are on average less than 5%, independent of the type of greenhouse (e.g. single or double glazing, Hortiplus, etc.) and climate. The HORTICERN method has been developed for PC use and is proving to be a powerful tool for greenhouse optimisation by research work...
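
    The flavour of such a static model can be shown with a minimal energy balance (all coefficients are illustrative; HORTICERN's calibrated wind and sky-radiation terms are not reproduced here):

```python
def heating_demand(u_value, area, t_in, t_out, solar, tau=0.7, eta=0.6):
    """Heating power (W): envelope loss minus utilized solar gain, floored at 0."""
    loss = u_value * area * (t_in - t_out)  # U in W/(m^2 K), area in m^2
    gain = tau * eta * solar * area         # solar flux in W/m^2
    return max(0.0, loss - gain)

# Single vs double glazing (hypothetical U-values), same night-time conditions:
q_single = heating_demand(6.0, 200.0, 18.0, 2.0, solar=0.0)
q_double = heating_demand(3.0, 200.0, 18.0, 2.0, solar=0.0)
```

    Summing such balances over hourly or daily climate records is what lets a static model estimate seasonal consumption without integrating a full dynamic model.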

  19. After the Tournament: Outcomes and Effort Provision

    OpenAIRE

    McGee, Andrew; McGee, Peter

    2013-01-01

    In modeling the incentive effects of competitions among employees for promotions or financial rewards, economists have largely ignored the effects of competition on effort provision once the competition is finished. In a laboratory experiment, we examine how competition outcomes affect the provision of post-competition effort. We find that subjects who lose arbitrarily decided competitions choose lower subsequent effort levels than subjects who lose competitions decided by their effort choices. ...

  20. Modelling efforts needed to advance herpes simplex virus (HSV) vaccine development: Key findings from the World Health Organization Consultation on HSV Vaccine Impact Modelling.

    Science.gov (United States)

    Gottlieb, Sami L; Giersing, Birgitte; Boily, Marie-Claude; Chesson, Harrell; Looker, Katharine J; Schiffer, Joshua; Spicknall, Ian; Hutubessy, Raymond; Broutet, Nathalie

    2017-06-21

    Development of a vaccine against herpes simplex virus (HSV) is an important goal for global sexual and reproductive health. In order to more precisely define the health and economic burden of HSV infection and the theoretical impact and cost-effectiveness of an HSV vaccine, in 2015 the World Health Organization convened an expert consultation meeting on HSV vaccine impact modelling. The experts reviewed existing model-based estimates and dynamic models of HSV infection to outline critical future modelling needs to inform development of a comprehensive business case and preferred product characteristics for an HSV vaccine. This article summarizes key findings and discussions from the meeting on modelling needs related to HSV burden, costs, and vaccine impact, essential data needs to carry out those models, and important model components and parameters. Copyright © 2017. Published by Elsevier Ltd.

  1. Hanford Soil Inventory Model (SIM) Rev. 1 Software Documentation – Requirements, Design, and Limitations

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Brett C.; Corbin, Rob A.; Anderson, Michael J.; Kincaid, Charles T.

    2006-09-25

    The objective of this document is to support the simulation results reported by Corbin et al. (2005) by documenting the requirements, conceptual model, simulation methodology, testing, and quality assurance associated with the Hanford Soil Inventory Model (SIM). There is no conventional software life-cycle documentation associated with the Hanford SIM because of the research and development nature of the project; because of the extensive use of commercial off-the-shelf software products, there was little actual software development as part of this application. This document is meant to provide historical context and technical support for Corbin et al. (2005), which is a significant revision and update of an earlier product, Simpson et al. (2001). The SIM application computed waste discharges composed of 75 analytes at 377 waste sites (liquid disposals, unplanned releases, and tank farm leaks) over an operational period of approximately 50 years. The development and application of SIM was an effort to develop a probabilistic approach to estimating comprehensive, mass-balance-based contaminant inventories for the Hanford Site post-closure setting. A computer model capable of calculating inventories and the associated uncertainties as a function of time was identified to address the needs of the Remediation and Closure Science (RCS) Project.
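
    The probabilistic mass-balance idea can be sketched with a toy Monte Carlo (the site parameters and distributions below are invented, not SIM inputs):

```python
import random
import statistics

def inventory_samples(sites, n=20_000, seed=42):
    """Monte Carlo samples of total analyte mass (kg) discharged over all sites."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n):
        total = 0.0
        for vol_mean, vol_sd, conc_mean, conc_sd in sites:
            vol = max(0.0, rng.gauss(vol_mean, vol_sd))     # m^3 discharged
            conc = max(0.0, rng.gauss(conc_mean, conc_sd))  # kg/m^3
            total += vol * conc
        totals.append(total)
    return totals

# Two hypothetical sites: (volume mean, volume sd, conc. mean, conc. sd)
sites = [(1000.0, 100.0, 0.02, 0.005), (500.0, 200.0, 0.05, 0.01)]
totals = inventory_samples(sites)
mean_mass = statistics.fmean(totals)
qs = statistics.quantiles(totals, n=20)  # 5% steps
p05, p95 = qs[0], qs[-1]
```

    Reporting the interval (p05, p95) alongside the mean is what distinguishes this inventory estimate from a single deterministic mass balance.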

  2. Simulation Modeling Requirements for Loss-of-Control Accident Prevention of Turboprop Transport Aircraft

    Science.gov (United States)

    Crider, Dennis; Foster, John V.

    2012-01-01

    In-flight loss of control remains the leading contributor to aviation accident fatalities, with stall upsets being the leading causal factor. The February 12, 2009, Colgan Air, Inc., Continental Express flight 3407 accident outside Buffalo, New York, brought this issue to the forefront of public consciousness and resulted in recommendations from the National Transportation Safety Board to conduct training that incorporates fully developed stalls and to develop simulator standards to support such training. In 2010, Congress responded to this accident with Public Law 111-216 (Section 208), which mandates full stall training for Part 121 flight operations. Efforts are currently in progress to develop recommendations on implementation of stall training for airline pilots. The International Committee on Aviation Training in Extended Envelopes (ICATEE) is currently defining simulator fidelity standards that will be necessary for effective stall training. These recommendations will apply to all civil transport aircraft, including straight-wing turboprop aircraft. Government-funded research over the previous decade provides a strong foundation for stall/post-stall simulation of swept-wing, conventional-tail jets to respond to this mandate, but turboprops present additional and unique modeling challenges. First among these is the effect of power, which can provide enhanced flow attachment behind the propellers. Furthermore, turboprops tend to operate for longer periods in an environment more susceptible to icing. As a result, there have been a significant number of turboprop accidents caused by early (lower angle of attack) stalls in icing. The vulnerability of turboprop configurations to icing has led to studies of ice accumulation and the resulting effects on flight behavior. Piloted simulations of these effects have highlighted important training needs for recognition and mitigation of icing effects, including the reduction of stall margins.

  3. Closed loop models for analyzing engineering requirements for simulators

    Science.gov (United States)

    Baron, S.; Muralidharan, R.; Kleinman, D.

    1980-01-01

    A closed-loop analytic model incorporating a model for the human pilot (namely, the optimal control model) was developed to allow certain simulation design tradeoffs to be evaluated quantitatively. This model was applied to a realistic flight control problem. The resulting model is used to analyze both overall simulation effects and the effects of individual elements. The results show that, compared to an ideal continuous simulation, a discrete simulation can result in significant performance and/or workload penalties.

  4. Required Collaborative Work in Online Courses: A Predictive Modeling Approach

    Science.gov (United States)

    Smith, Marlene A.; Kellogg, Deborah L.

    2015-01-01

    This article describes a predictive model that assesses whether a student will have greater perceived learning in group assignments or in individual work. The model produces correct classifications 87.5% of the time. The research is notable in that it is the first in the education literature to adopt a predictive modeling methodology using data…

  5. Bioenergy crop models: Descriptions, data requirements and future challenges

    Energy Technology Data Exchange (ETDEWEB)

    Nair, S. Surendran [University of Tennessee, Knoxville (UTK); Kang, Shujiang [ORNL; Zhang, Xuesong [Pacific Northwest National Laboratory (PNNL); Miguez, Fernando [Iowa State University; Izaurralde, Dr. R. Cesar [Pacific Northwest National Laboratory (PNNL); Post, Wilfred M [ORNL; Dietze, Michael [University of Illinois, Urbana-Champaign; Lynd, L. [Dartmouth College; Wullschleger, Stan D [ORNL

    2012-01-01

    Field studies that address the production of lignocellulosic biomass as a source of renewable energy provide critical data for the development of bioenergy crop models. A literature survey revealed that 14 models have been used for simulating bioenergy crops including herbaceous and woody bioenergy crops, and for crassulacean acid metabolism (CAM) crops. These models simulate field-scale production of biomass for switchgrass (ALMANAC, EPIC, and Agro-BGC), miscanthus (MISCANFOR, MISCANMOD, and WIMOVAC), sugarcane (APSIM, AUSCANE, and CANEGRO), and poplar and willow (SECRETS and 3PG). Two models are adaptations of dynamic global vegetation models and simulate biomass yields of miscanthus and sugarcane at regional scales (Agro-IBIS and LPJmL). Although it lacks the complexity of other bioenergy crop models, the environmental productivity index (EPI) is the only model used to estimate biomass production of CAM (Agave and Opuntia) plants. Except for the EPI model, all models include representations of leaf area dynamics, phenology, radiation interception and utilization, biomass production, and partitioning of biomass to roots and shoots. A few models simulate soil water, nutrient, and carbon cycle dynamics, making them especially useful for assessing the environmental consequences (e.g., erosion and nutrient losses) associated with the large-scale deployment of bioenergy crops. The rapid increase in use of models for energy crop simulation is encouraging; however, detailed information on the influence of climate, soils, and crop management practices on biomass production is scarce. Thus considerable work remains regarding the parameterization and validation of process-based models for bioenergy crops; generation and distribution of high-quality field data for model development and validation; and implementation of an integrated framework for efficient, high-resolution simulations of biomass production for use in planning sustainable bioenergy systems.

  6. Achieving a System Operational Availability Requirement (ASOAR) Model

    Science.gov (United States)

    1992-07-01

    ASOAR requires only system and end item level input data, not Line Replaceable Unit (LRU) input data. ASOAR usage provides concepts for major logistics...the Corp/Theater ADP Service Center II (CTASC II) to a system operational availability goal. The CTASC II system configuration had many redundant types

  7. Navy superconductivity efforts

    Science.gov (United States)

    Gubser, D. U.

    1990-04-01

    Both the new high temperature superconductors (HTS) and the low temperature superconductors (LTS) are important components of Navy's total plan to integrate superconductivity into field operational systems. Fundamental research is an important component of the total Navy program and focuses on the HTS materials. Power applications (ship propulsion, etc.) use LTS materials while space applications (MMW electronics, etc.) use HTS materials. The Space Experiment being conducted at NRL will involve space flight testing of HTS devices built by industry and will demonstrate the ability to engineer and space qualify these devices for systems use. Another important component of the Navy's effort is the development of Superconducting Quantum Interference Device (SQUID) magnetometers. This program will use LTS materials initially, but plans to implement HTS materials as soon as possible. Hybrid HTS/LTS systems are probable in many applications. A review of the status of the Navy's HTS materials research is given as well as an update on the Navy's development efforts in superconductivity, with particular emphasis on the related SDIO sponsored program on HTS applications.

  9. Effort, Wages, and the International Division of Labor

    OpenAIRE

    Edward E. Leamer

    1999-01-01

    This paper embeds variable effort into a traditional multi-sector model. Effort enters a production function like total-factor-productivity and on the assumption that effort doesn't affect capital depreciation, the capital-cost savings from high effort operations are passed on to workers. The labor market thus offers a set of contracts with higher wages compensating for higher effort. Among the implications of the model are: The capital savings from effort are greatest in the capital-intensiv...

  10. Perceived distributed effort in team ball sports.

    Science.gov (United States)

    Beniscelli, Violeta; Tenenbaum, Gershon; Schinke, Robert Joel; Torregrosa, Miquel

    2014-01-01

    In this study, we explored the multifaceted concept of perceived mental and physical effort in team sport contexts where athletes must invest individual and shared efforts to reach a common goal. Semi-structured interviews were conducted with a convenience sample of 15 Catalan professional coaches (3 women and 12 men, 3 each from the following sports: volleyball, basketball, handball, soccer, and water polo) to gain their views of three perceived effort-related dimensions: physical, psychological, and tactical. From a theoretical thematic analysis, it was found that the perception of effort is closely related to how effort is distributed within the team. Moreover, coaches viewed physical effort in relation to the frequency and intensity of the players' involvement in the game. They identified psychological effort in situations where players pay attention to proper cues, and manage emotions under difficult circumstances. Tactical effort addressed the decision-making process of players and how they fulfilled their roles while taking into account the actions of their teammates and opponents. Based on these findings, a model of perceived distributed effort was developed, which delineates the elements that compose each of the aforementioned dimensions. Implications of perceived distributed effort in team coordination and shared mental models are discussed.

  11. Work experiences among nurses and physicians in the beginning of their professional careers - analyses using the effort-reward imbalance model.

    Science.gov (United States)

    Birgit, Enberg; Gunnevi, Sundelin; Ann, Öhman

    2013-03-01

    The aim of the study was to scrutinise how nurses and physicians, employed by the county councils in Sweden, assess their work environment in terms of effort and reward at the start of their career. The aim was also to estimate associations between work satisfaction and the potential outcomes from the effort-reward imbalance (ERI) questionnaire. The study group, 198 nurses and 242 physicians who graduated in 1999, is a subsample drawn from a national cross-sectional survey. Data were collected in the third year after graduation among the nurses and in the fourth year after graduation among registered physicians. The effort-reward imbalance questionnaire, together with a question on work satisfaction, was used to evaluate psychosocial factors at work. The results reveal that nurses scored higher on effort, lower on reward and experienced higher effort-reward imbalance, compared with physicians. Women scored higher on work-related overcommitment (WOC) compared with men. Among the physicians, logistic regression analysis revealed a statistically significant association between WOC and ERI, sex, effort and reward. Logistic regression analysis also revealed a statistically significant association between WOC and ERI and between WOC and effort among the nurses. Dissatisfaction with work was significantly higher among those who scored worst on all three ERI subscales (effort, reward and WOC) and also among those with the highest ERI ratios compared with the other respondents. In conclusion, to prevent future work-related health problems and work dissatisfaction among nurses and physicians in the beginning of their professional careers, signs of poor psychosocial working conditions have to be taken seriously. In future work-related stress research among healthcare personnel, gender-specific aspects of working conditions must be further highlighted to develop more gender-sensitive analyses.
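
    The ERI ratio referred to above is conventionally computed as effort divided by reward, with a correction factor for the unequal number of effort and reward items on the questionnaire. The sketch below illustrates that arithmetic; the item counts and respondent scores are hypothetical, not data from this study.

```python
# Effort-reward imbalance (ERI) ratio sketch: e / (r * c), where the
# correction factor c = (number of effort items) / (number of reward
# items) adjusts for unequal scale lengths. Scores are hypothetical.
def eri_ratio(effort_score, reward_score, n_effort_items, n_reward_items):
    c = n_effort_items / n_reward_items
    return effort_score / (reward_score * c)

# Hypothetical respondent on a 6-item effort / 11-item reward scale:
ratio = eri_ratio(effort_score=18, reward_score=44,
                  n_effort_items=6, n_reward_items=11)
print(round(ratio, 3))  # ratios above 1 indicate effort exceeding reward
```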

  12. Cassini launch contingency effort

    Science.gov (United States)

    Chang, Yale; O'Neil, John M.; McGrath, Brian E.; Heyler, Gene A.; Brenza, Pete T.

    2002-01-01

    On 15 October 1997 at 4:43 AM EDT, the Cassini spacecraft was successfully launched on a Titan IVB/Centaur on a mission to explore the Saturnian system. It carried three Radioisotope Thermoelectric Generators (RTGs) and 117 Light Weight Radioisotope Heater Units (LWRHUs). As part of the joint National Aeronautics and Space Administration (NASA)/U.S. Department of Energy (DoE) safety effort, a contingency plan was prepared to address the unlikely events of an accidental suborbital reentry or out-of-orbital reentry. The objective of the plan was to develop procedures to predict, within hours, the Earth impact footprints (EIFs) for the nuclear heat sources released during the atmospheric reentry. The footprint predictions would be used in subsequent notification and recovery efforts. As part of a multi-agency team, The Johns Hopkins University Applied Physics Laboratory (JHU/APL) had the responsibility to predict the EIFs of the heat sources after a reentry, given the heat sources' release conditions from the main spacecraft. (No ablation burn-through of the heat sources' aeroshells was expected, as a result of earlier testing.) JHU/APL's other role was to predict the time of reentry from a potential orbital decay. The tools used were a three degree-of-freedom trajectory code, a database of aerodynamic coefficients for the heat sources, secure links to obtain tracking data, and a high fidelity special perturbation orbit integrator code to predict time of spacecraft reentry from orbital decay. In the weeks and days prior to launch, all the codes and procedures were exercised. Notional EIFs were derived from hypothetical reentry conditions. EIFs predicted by JHU/APL were compared to those by JPL and US SPACECOM, and were found to be in good agreement. The reentry time from orbital decay for a booster rocket for the Russian Progress M-36 freighter, a cargo ship for the Mir space station, was predicted to within 5 minutes more than two hours before reentry. For the

  13. Three Tier Unified Process Model for Requirement Negotiations and Stakeholder Collaborations

    Science.gov (United States)

    Niazi, Muhammad Ashraf Khan; Abbas, Muhammad; Shahzad, Muhammad

    2012-11-01

    This research paper is focused towards carrying out a pragmatic qualitative analysis of various models and approaches of requirements negotiations (a sub process of the requirements management plan, which is an output of scope management's collect requirements process) and studies stakeholder collaboration methodologies (i.e. from within the communication management knowledge area). The experiential analysis encompasses two tiers; the first tier refers to the weighted scoring model while the second tier focuses on development of SWOT matrices on the basis of findings of the weighted scoring model for selecting an appropriate requirements negotiation model. Finally the results are simulated with the help of statistical pie charts. On the basis of simulated results of prevalent models and approaches of negotiations, a unified approach for requirements negotiations and stakeholder collaborations is proposed where the collaboration methodologies are embedded into the selected requirements negotiation model as internal parameters of the proposed process, alongside some external required parameters such as MBTI, opportunity analysis, etc.
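
    A weighted scoring model of the kind used in the first tier can be sketched as a weighted sum over criteria. The candidate negotiation models, criteria, weights, and scores below are hypothetical placeholders, not the paper's actual evaluation data.

```python
# Weighted scoring sketch: each candidate's total is sum(weight * score)
# over the criteria; the highest total is selected. All numbers are
# hypothetical placeholders, not the paper's data.
criteria_weights = {"stakeholder_fit": 0.4, "tool_support": 0.25,
                    "scalability": 0.2, "learning_curve": 0.15}

candidates = {
    "Win-Win":    {"stakeholder_fit": 4, "tool_support": 3,
                   "scalability": 3, "learning_curve": 2},
    "EasyWinWin": {"stakeholder_fit": 5, "tool_support": 4,
                   "scalability": 3, "learning_curve": 3},
}

def weighted_score(scores, weights):
    return sum(weights[c] * scores[c] for c in weights)

totals = {name: weighted_score(s, criteria_weights)
          for name, s in candidates.items()}
best = max(totals, key=totals.get)
print(best, round(totals[best], 2))
```

    The second-tier SWOT analysis would then be built only for the top-scoring candidates from this step.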

  14. Mapping telemedicine efforts

    DEFF Research Database (Denmark)

    Kierkegaard, Patrick

    2015-01-01

    Objectives: The aim of this study is to survey telemedicine services currently in operation across Denmark. The study specifically seeks to answer the following questions: What initiatives are deployed within the different regions? What are the motivations behind the projects? What technologies are being utilized? What medical disciplines are being addressed using telemedicine systems? Methods: All data was surveyed from the "Telemedicinsk Landkort", a newly created database designed to provide a comprehensive and systematic overview of all telemedicine technologies in Denmark. Results: The results of this study suggest that a growing number of telemedicine initiatives are currently in operation across Denmark but that considerable variations existed in terms of regional efforts, as the number of operational telemedicine projects varied from region to region. Conclusions: The results...

  15. Requirements for competence modelling in professional learning: experience from the water sector

    OpenAIRE

    Éva Rátky; Stracke, Christian M.; Charalampos Thanopoulos; Cleo Sgouropoulou

    2010-01-01

    Competence Models have proved to be critical instruments for human resources management and development, and are determinant both for the labour market (employers), in the selection of employees, and for training providers, in the enhancement of vocational training opportunities. The concept of competence modeling is still under development, and considerable efforts are focused on the creation of new Competence Models and their application to a broad range of professional learning sectors. The sc...

  16. Requirements and Problems in Parallel Model Development at DWD

    Directory of Open Access Journals (Sweden)

    Ulrich Schättler

    2000-01-01

    Full Text Available Nearly 30 years after introducing the first computer model for weather forecasting, the Deutscher Wetterdienst (DWD) is developing the 4th generation of its numerical weather prediction (NWP) system. It consists of a global grid point model (GME) based on a triangular grid and a non-hydrostatic Lokal Modell (LM). The operational demand for running this new system is immense and can only be met by parallel computers. From the experience gained in developing earlier NWP models, several new problems had to be taken into account during the design phase of the system. Most important were portability (including efficiency of the programs on several computer architectures) and ease of code maintainability. Also, the organization and administration of the work done by developers from different teams and institutions is more complex than it used to be. This paper describes the models and gives some performance results. The modular approach used for the design of the LM is explained and the effects on the development are discussed.

  17. Modelling and Simulation for Requirements Engineering and Options Analysis

    Science.gov (United States)

    2010-05-01

    Defence, 2010. © Her Majesty the Queen in Right of Canada, as represented by the Minister of National Defence, 2010. DRDC Toronto CR 2010...externalize their mental model of the assumed solution for critique and correction by others, and whether or not this would assist in ensuring that

  18. Predicting Flu Season Requirements: An Undergraduate Modeling Project

    Science.gov (United States)

    Kramlich, Gary R., II; Braunstein Fierson, Janet L.; Wright, J. Adam

    2010-01-01

    This project was designed to be used in a freshman calculus class whose students had already been introduced to logistic functions and basic data modeling techniques. It need not be limited to such an audience, however; it has also been implemented in a topics in mathematics class for college upperclassmen. Originally intended to be presented in…

  19. Thermal Modeling and Feedback Requirements for LIFE Neutronic Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Seifried, J E

    2009-07-15

    An initial study is performed to determine how temperature considerations affect LIFE neutronic simulations. Among other figures of merit, the isotopic mass accumulation, thermal power, tritium breeding, and criticality are analyzed. Possible fidelities of thermal modeling and degrees of coupling are explored. Lessons learned from switching and modifying nuclear datasets are communicated.

  20. Voluntary versus Enforced Team Effort

    Directory of Open Access Journals (Sweden)

    Claudia Keser

    2011-08-01

    Full Text Available We present a model where each of two players chooses between remuneration based on either private or team effort. Although at least one of the players has the equilibrium strategy to choose private remuneration, we frequently observe both players to choose team remuneration in a series of laboratory experiments. This allows for high cooperation payoffs but also provides individual free-riding incentives. Due to significant cooperation, we observe that, in team remuneration, participants make higher profits than in private remuneration. We also observe that, when participants are not given the option of private remuneration, they cooperate significantly less.

  1. Mathematical Formulation Requirements and Specifications for the Process Models

    Energy Technology Data Exchange (ETDEWEB)

    Steefel, C.; Moulton, D.; Pau, G.; Lipnikov, K.; Meza, J.; Lichtner, P.; Wolery, T.; Bacon, D.; Spycher, N.; Bell, J.; Moridis, G.; Yabusaki, S.; Sonnenthal, E.; Zyvoloski, G.; Andre, B.; Zheng, L.; Davis, J.

    2010-11-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) is intended to be a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM program is aimed at addressing critical EM program needs to better understand and quantify flow and contaminant transport behavior in complex geological systems. It will also address the long-term performance of engineered components including cementitious materials in nuclear waste disposal facilities, in order to reduce uncertainties and risks associated with DOE EM's environmental cleanup and closure activities. Building upon national capabilities developed from decades of Research and Development in subsurface geosciences, computational and computer science, modeling and applied mathematics, and environmental remediation, the ASCEM initiative will develop an integrated, open-source, high-performance computer modeling system for multiphase, multicomponent, multiscale subsurface flow and contaminant transport. This integrated modeling system will incorporate capabilities for predicting releases from various waste forms, identifying exposure pathways and performing dose calculations, and conducting systematic uncertainty quantification. The ASCEM approach will be demonstrated on selected sites, and then applied to support the next generation of performance assessments of nuclear waste disposal and facility decommissioning across the EM complex. The Multi-Process High Performance Computing (HPC) Simulator is one of three thrust areas in ASCEM. The other two are the Platform and Integrated Toolsets (dubbed the Platform) and Site Applications. The primary objective of the HPC Simulator is to provide a flexible and extensible computational engine to simulate the coupled processes and flow scenarios described by the conceptual models developed using the ASCEM Platform. The graded and iterative approach to assessments

  2. Breckinridge Project, initial effort

    Energy Technology Data Exchange (ETDEWEB)

    None

    1982-01-01

    The project cogeneration plant supplies electric power, process steam and treated boiler feedwater for use by the project plants. The plant consists of multiple turbine generators and steam generators connected to a common main steam header. The major plant systems which are required to produce steam, electrical power and treated feedwater are discussed individually. The systems are: steam, steam generator, steam generator fuel, condensate and feedwater deaeration, condensate and blowdown collection, cooling water, boiler feedwater treatment, coal handling, ash handling (fly ash and bottom ash), electrical, and control system. The plant description is based on the Phase Zero design basis established for Plant 31 in July of 1980 and the steam/condensate balance as presented on Drawing 31-E-B-1. Updating of steam requirements as more refined process information becomes available has generated some changes in the steam balance. Boiler operation with these updated requirements is reflected on Drawing 31-D-B-1A. The major impact of updating has been that less 600 psig steam generated within the process units requires more extraction steam from the turbine generators to close the 600 psig steam balance. Since the 900 psig steam generation from the boilers was fixed at 1,200,000 lb/hr, the additional extraction steam required to close the 600 psig steam balance decreased the quantity of electrical power available from the turbine generators. In the next phase of engineering work, the production of 600 psig steam will be augmented by increasing convection bank steam generation in the Plant 3 fired heaters by 140,000 to 150,000 lb/hr. This modification will allow full rated power generation from the turbine generators.

  3. "Open Access" Requires Clarification: Medical Journal Publication Models Evolve.

    Science.gov (United States)

    Lubowitz, James H; Brand, Jefferson C; Rossi, Michael J; Provencher, Matthew T

    2017-03-01

    While Arthroscopy journal is a traditional subscription model journal, our companion journal Arthroscopy Techniques is "open access." We used to believe open access simply meant online and free of charge. However, while open-access journals are free to readers, in 2017 authors must make a greater sacrifice in the form of an article-processing charge (APC). Again, while this does not apply to Arthroscopy, the APC will apply to Arthroscopy Techniques.

  4. Requirements for Logical Models for Value-Added Tax Legislation

    DEFF Research Database (Denmark)

    Nielsen, Morten Ib; Simonsen, Jakob Grue; Larsen, Ken Friis

    Enterprise resource planning (ERP) systems are ubiquitous in commercial enterprises of all sizes and invariably need to account for the notion of value-added tax (VAT). The legal and technical difficulties in handling VAT are exacerbated by spanning a broad and chaotic spectrum of intricate country-specific needs. Currently, these difficulties are handled in most major ERP systems by customising and localising the native code of the ERP systems for each specific country and industry. We propose an alternative that uses logical modeling of VAT legislation. The potential benefit is to eventually transform...

  5. Great efforts required: Solar industry in Chile. Chile's solar industry is small but rapidly expanding; Noch viel zu tun. Die Solarbranche in Chile ist klein, aber sie entwickelt sich in raschem Tempo

    Energy Technology Data Exchange (ETDEWEB)

    Rosell, Alejandro Diego

    2010-03-15

    The photovoltaics industry in Chile has so far hardly been noticed. A small number of manufacturers and installers of lamps and isolated photovoltaic systems who started a regional sector are working on opening a market for solar technology. After five hard years, the business is now facing better prospects. The Chilean government, too, is now making efforts to attract foreign investors. (orig.)

  6. Swedish nuclear waste efforts

    Energy Technology Data Exchange (ETDEWEB)

    Rydberg, J.

    1981-09-01

    After the introduction of a law prohibiting the start-up of any new nuclear power plant until the utility had shown that the waste produced by the plant could be taken care of in an absolutely safe way, the Swedish nuclear utilities in December 1976 embarked on the Nuclear Fuel Safety Project, which in November 1977 presented a first report, Handling of Spent Nuclear Fuel and Final Storage of Vitrified Waste (KBS-I), and in November 1978 a second report, Handling and Final Storage of Unreprocessed Spent Nuclear Fuel (KBS-II). These summary reports were supported by 120 technical reports prepared by 450 experts. The project engaged 70 private and governmental institutions at a total cost of US $15 million. The KBS-I and KBS-II reports are summarized in this document, as are the continued waste research efforts carried out by KBS, SKBF, PRAV, ASEA and other Swedish organizations. The KBS reports describe all steps (except reprocessing) in the handling chain, from removal of spent fuel elements from a reactor until their radioactive waste products are finally disposed of, in canisters, in an underground granite depository. The KBS concept relies on engineered multibarrier systems in combination with final storage in thoroughly investigated stable geologic formations. This report also briefly describes other activities carried out by the nuclear industry, namely, the construction of a central storage facility for spent fuel elements (to be in operation by 1985), a repository for reactor waste (to be in operation by 1988), and an intermediate storage facility for vitrified high-level waste (to be in operation by 1990). The R and D activities are updated to September 1981.

  7. Using a Model of Team Collaboration to Investigate Inter-Organizational Collaboration During the Relief Effort of the January 2010 Haiti Earthquake

    Science.gov (United States)

    2011-06-01

    whack, it’ll be tough. TKS SM 2. This question of security and the rumors of security and the racism behind the idea of security has been our major...of team collaboration is not likely at this point of diminishing returns. As such, further validation efforts are warranted. The March 2011 Japan

  8. Is Effort Praise Motivational? The Role of Beliefs in the Effort-Ability Relationship

    Science.gov (United States)

    Lam, Shui-fong; Yim, Pui-shan; Ng, Yee-lam

    2008-01-01

    In two studies, we investigated how beliefs in the effort-ability relationship moderated the effects of effort praise on student motivation. Study 1 showed that the more the participants believed that effort and ability were related positively (the positive rule) versus related negatively (the inverse rule), the more they would have positive…

  9. Towards a Concerted Effort

    DEFF Research Database (Denmark)

    Johansen, Mette-Louise; Mouritsen, Tina; Montgomery, Edith

    2006-01-01

    This book contains a method model for the prevention of youth crime in Danish municipalities. The method model consists of instructions for conducting processual network meetings between traumatized refugee parents and the professional specialists working with their children on an intermunicipal...... and division of responsibilities between specialists and parents. The book is based on a method development project carried out in Karlebo municipality involving refugee families and welfare staff representatives in the municipality, the health system, and the police. The project was carried out with financial...

  10. Studies on Models, Patterns and Requirements of Digestible Amino Acids for Layers by Nitrogen Metabolism

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    The nitrogen (N) metabolic experiments were made to estimate separately the amino acid requirements of 43-48-week-old layers for maintenance and for protein accretion, and to establish models to estimate digestible amino acid requirements. The regression relationship of nitrogen retention vs amino acid intake was estimated for each amino acid by giving, at rates of N intake of 0.91, 0.52, 0.15 and 0.007 g.kg-1 body-weight (W0.75) per d, semi-synthetic diets each made specifically deficient in one amino acid. From the regression coefficients, it was calculated that, for the accretion of 1 g protein, the dietary digestible amino acid requirements were (mg) Thr 63.1, Val 100.4, Met 39.9, Ile 88.6, Leu 114.3, Phe 63.2, Lys 87.0, His 20.5, Arg 87.9, Trp 21.4, Met+Cys 77.6, and Phe+Tyr 114.3. Daily amino acid requirements for N equilibrium were estimated to be (mg.kg-1 W0.75 per day) Thr 50.6, Val 74.7, Met 30.3, Ile 66.7, Leu 81.4, Phe 44.8, Lys 60.5, His 14.7, Arg 73.9, Trp 17.3, Met+Cys 58.6, and Phe+Tyr 83.9. The dietary digestible amino acid patterns for protein accretion and N equilibrium were also proposed. The models for estimating digestible amino acid requirements for the different productions were developed.
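
    The regression step described above can be sketched as follows: nitrogen retention is regressed on amino acid intake, the intake at zero retention (the x-intercept) estimates the requirement for N equilibrium, and the slope scales intake per unit of protein accreted. The data points below are synthetic illustrations, not the study's measurements.

```python
# Sketch of the regression used to derive amino acid requirements:
# N retention (y) regressed on amino acid intake (x). The x-intercept
# estimates the intake needed for N equilibrium. Data are synthetic.
import numpy as np

intake = np.array([10.0, 30.0, 60.0, 90.0])      # mg/kg W^0.75 per day
retention = np.array([-15.0, -5.0, 10.0, 25.0])  # N retention

slope, intercept = np.polyfit(intake, retention, 1)
requirement = -intercept / slope   # intake giving zero N retention
print(round(requirement, 1))       # mg/kg W^0.75 per day
```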

  11. Dopamine and Effort-Based Decision Making

    Directory of Open Access Journals (Sweden)

    Irma Triasih Kurniawan

    2011-06-01

    Full Text Available Motivational theories of choice focus on the influence of goal values and strength of reinforcement to explain behavior. By contrast relatively little is known concerning how the cost of an action, such as effort expended, contributes to a decision to act. Effort-based decision making addresses how we make an action choice based on an integration of action and goal values. Here we review behavioral and neurobiological data regarding the representation of effort as action cost, and how this impacts on decision making. Although organisms expend effort to obtain a desired reward there is a striking sensitivity to the amount of effort required, such that the net preference for an action decreases as effort cost increases. We discuss the contribution of the neurotransmitter dopamine (DA) towards overcoming response costs and in enhancing an animal's motivation towards effortful actions. We also consider the contribution of brain structures, including the basal ganglia (BG) and anterior cingulate cortex (ACC), in the internal generation of action involving a translation of reward expectation into effortful action.
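
    The sensitivity to effort cost described above is often formalized with a discounting function in which subjective value falls as required effort rises. The hyperbolic form and the discounting parameter below are illustrative assumptions, not a specific model endorsed by the review.

```python
# Effort-discounting sketch: the subjective value of a reward falls as
# the effort required to obtain it rises. The hyperbolic form and the
# k parameter are illustrative assumptions, not the review's model.
def subjective_value(reward, effort, k=0.25):
    return reward / (1.0 + k * effort)

# Choice between a high-reward/high-effort option and a
# low-reward/low-effort option (hypothetical values):
hard = subjective_value(reward=10.0, effort=8.0)   # 10 / 3
easy = subjective_value(reward=4.0, effort=1.0)    # 4 / 1.25
print("choose hard" if hard > easy else "choose easy")
```

    With these numbers the larger reward still wins despite its effort cost; raising k (greater effort sensitivity) would flip the choice, mirroring the behavioral pattern described in the abstract.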

  12. Requirements for competence modelling in professional learning: experience from the water sector

    Directory of Open Access Journals (Sweden)

    Éva Rátky

    2010-11-01

    Full Text Available Competence Models have proved to be critical instruments for human resources management and development, and are determinant both for the labour market (employers), in the selection of employees, and for training providers, in the enhancement of vocational training opportunities. The concept of competence modeling is still under development, and considerable efforts are focused on the creation of new Competence Models and their application to a broad range of professional learning sectors. The scope of this inquiry is to contribute to this research field by setting the basis for the design and development of a Competence Model for the Water Sector.

  13. Breckinridge Project, initial effort

    Energy Technology Data Exchange (ETDEWEB)

    None

    1982-01-01

    Report IV, Volume 7 provides descriptions, data, and drawings pertaining to the Oxygen Plant (Plant 15) and Naphtha Hydrotreating and Reforming (Plant 18). The Oxygen Plant (Plant 15) utilizes low-pressure air separation to manufacture the oxygen required in Gasification and Purification (Plant 12). The Oxygen Plant also supplies nitrogen as needed by the H-COAL process. Naphtha Hydrotreating and Reforming (Plant 18) upgrades the raw H-COAL naphtha. The following information is provided for both plants described in this volume: a description of the plant's process design, including the utility balance, catalysts and chemicals usage, and process flow diagrams; an equipment list including item numbers and descriptions; data sheets and sketches for major plant components (Oxygen Plant only); and pertinent engineering drawings. An appendix contains: an overall site plan showing the locations of all plants; and the symbols and legend for the piping and instrument diagrams included in this volume.

  14. Forecasting Model of Coal Requirement Quantity Based on Grey System Theory

    Institute of Scientific and Technical Information of China (English)

    孙继湖

    2001-01-01

    The generally used methods of forecasting coal requirement quantity include the analogy method, the outside-push method and the cause-effect analysis method. However, the precision of forecasting results obtained with these methods is low. This paper uses grey system theory and sets up a grey forecasting model GM(1,3) for coal requirement quantity. The forecasting result for the Chinese coal requirement quantity coincides with the actual values, which shows that the model is reliable. Finally, this model is used to forecast the Chinese coal requirement quantity for the next ten years.
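
    The abstract applies the multivariable grey model GM(1,3); as a simpler illustration of grey forecasting mechanics, the univariate GM(1,1) can be sketched as below. The input series is synthetic, not actual coal-demand data, and GM(1,3) additionally incorporates two related driver series.

```python
# Univariate grey model GM(1,1) sketch. The abstract's GM(1,3) adds two
# related input series, but the core mechanics (accumulation, adjacent
# means, least-squares whitening) are the same. Data are synthetic.
import numpy as np

def gm11_forecast(x0, steps=1):
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                        # accumulated series (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])             # adjacent means of x1
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    n = len(x0)
    k = np.arange(1, n + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a   # whitened response
    x0_hat = np.diff(np.concatenate([[x0[0]], x1_hat])) # de-accumulate
    return x0_hat[-steps:]                    # forecasts beyond the data

demand = [100.0, 110.0, 121.0, 133.1, 146.41]  # synthetic, ~10% growth
print(round(gm11_forecast(demand, steps=1)[0], 1))
```

    On a near-exponential series like this, GM(1,1) recovers the growth pattern closely, which is why grey models suit short, smoothly trending series such as annual demand figures.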

  15. Digital Avionics Information System (DAIS): Training Requirements Analysis Model Users Guide. Final Report.

    Science.gov (United States)

    Czuchry, Andrew J.; And Others

    This user's guide describes the functions, logical operations and subroutines, input data requirements, and available outputs of the Training Requirements Analysis Model (TRAMOD), a computerized analytical life cycle cost modeling system for use in the early stages of system design. Operable in a stand-alone mode, TRAMOD can be used for the…

  16. Breckinridge Project, initial effort

    Energy Technology Data Exchange (ETDEWEB)

    None

    1982-01-01

    Report IV, Volume 6 provides descriptions, data, and drawings pertaining to Gasification and Purification (Plant 12). Gasification and Purification (Plant 12) produces makeup hydrogen for H-COAL Preheating and Reaction (Plant 3), and produces a medium Btu fuel gas for consumption in fired heaters. The following information is included: a description of the plant's process design, including the utility balance, catalysts and chemicals usage, and a process flow diagram; an equipment list, including item numbers and descriptions; data sheets and sketches for major plant components; and pertinent engineering drawings. An appendix contains: an overall site plan showing the locations of all plants; and the symbols and legend for the piping and instrument diagrams included in this volume. Gasification and Purification (Plant 12) utilizes process technology from three licensors: gasification of vacuum bottoms using the Texaco process, shift conversion using the Haldor Topsoe process, and purification of fuel gas and shifted gas using the Allied Chemical Selexol process. This licensed technology is proprietary in nature. As a result, this volume does not contain full disclosure of these processes although a maximum of information has been presented consistent with the confidentiality requirements. Where data appears incomplete in this volume, it is due to the above described limitations. Full data concerning this plant are available for DOE review at the Houston offices of Bechtel Petroleum, Inc.

  17. Meta-Model and UML Profile for Requirements Management of Software and Embedded Systems

    Directory of Open Access Journals (Sweden)

    Arpinen Tero

    2011-01-01

    Full Text Available Software and embedded system companies today encounter problems related to requirements management tool integration, incorrect tool usage, and lack of traceability. This is due to utilized tools with no clear meta-model and semantics to communicate requirements between different stakeholders. This paper presents a comprehensive meta-model for requirements management. The focus is on the software and embedded system domains. The goal is to define generic requirements management domain concepts and abstract interfaces between requirements management and system development. This leads to a portable requirements management meta-model which can be adapted to various system modeling languages. The created meta-model is prototyped by translating it into a UML profile. The profile is imported into a UML tool which is used for rapid evaluation of meta-model concepts in practice. The developed profile is associated with a proof-of-concept report generator tool that automatically produces up-to-date documentation from the models in the form of web pages. The profile is then used to create an example model of an embedded system requirement specification.

  18. Software Development Effort Estimation Techniques: A Review

    Directory of Open Access Journals (Sweden)

    Rshma Chawla

    2014-09-01

    Full Text Available The most important activity in the software project management process is the estimation of software development effort. The literature offers many algorithmic cost estimation models such as Boehm's COCOMO, Albrecht's Function Point Analysis, Putnam's SLIM, ESTIMACS, and soft-computing-based techniques, but each model has its own advantages and disadvantages in predicting development cost and effort. This is because the project data available in the initial stages of the development process is often incomplete, inconsistent and vague. Accurate effort estimation in the software project management process is a major challenge. This paper is a systematic review of classic and contemporary literature on software effort estimation. A systematic search is done across data sources to understand the issues and research problems in the effort estimation problem domain.
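Boehm's COCOMO, named above, is the simplest of these algorithmic models to sketch. Basic COCOMO estimates effort in person-months as a power law of size in KLOC, with published coefficients per development mode; the 32 KLOC project below is an arbitrary example.

```python
# Basic COCOMO coefficients (Boehm, 1981): effort in person-months from KLOC.
COCOMO_MODES = {
    "organic":       (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (3.6, 1.20),
}

def basic_cocomo_effort(kloc, mode="organic"):
    """Effort (person-months) = a * KLOC ** b for the chosen project mode."""
    a, b = COCOMO_MODES[mode]
    return a * kloc ** b

print(round(basic_cocomo_effort(32, "organic"), 1))   # → 91.3 person-months
```

The superlinear exponent is what encodes diseconomies of scale: doubling size more than doubles estimated effort, and embedded-mode projects grow fastest.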

  19. Improving the Enterprise Requirements and Acquisition Model’s Developmental Test and Evaluation Process Fidelity

    Science.gov (United States)

    2014-03-27

    by Leach and Searle (2010), Montgomery (2011), and Baldus and others (2013). Below is a summary of their research efforts. Table 1: Overview of ERAM... Leach and Searle 2010 ERAM 1.1 ExtendSim Updates by the Aerospace Design Team and served as new baseline model ERAM 1.2 ExtendSim Implemented... complex relationships within DAMS and to assist in supporting acquisition reform. DT&E Silver Bullet The most substantial improvement from a

  20. Model-Based Requirements Analysis for Reactive Systems with UML Sequence Diagrams and Coloured Petri Nets

    DEFF Research Database (Denmark)

    Tjell, Simon; Lassen, Kristian Bisgaard

    2008-01-01

    In this paper, we describe a formal foundation for a specialized approach to automatically checking traces against real-time requirements. The traces are obtained from simulation of Coloured Petri Net (CPN) models of reactive systems. The real-time requirements are expressed in terms of a derivative of UML 2.0 high-level Sequence Diagrams. The automated requirement checking is part of a bigger tool framework in which VDM++ is applied to automatically generate initial CPN models based on Problem Diagrams. These models are manually enhanced to provide behavioral descriptions of the environment…

  1. Model requirements for decision support under uncertainty in data scarce dynamic deltas

    NARCIS (Netherlands)

    Haasnoot, Marjolijn; van Deursen, W.P.A.; Kwakkel, J. H.; Middelkoop, H.

    2016-01-01

    There is a long tradition of model-based decision support in water management. The consideration of deep uncertainty, however, changes the requirements imposed on models. In the face of deep uncertainty, models are used to explore many uncertainties and the decision space across multiple outcomes o

  2. Requirements for UML and OWL Integration Tool for User Data Consistency Modeling and Testing

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard; Oleshchuk, V. A.

    2003-01-01

    In this paper we analyze requirements for a tool that supports integration of UML models and ontologies written in languages like the W3C Web Ontology Language (OWL). The tool can be used in the following way: after loading two legacy models into the tool, the tool user connects them by inserting modeling…

  3. School Trips: Are They Worth the Effort?

    Science.gov (United States)

    Johnston, Robert

    2015-01-01

    Even the most basic of school trips will require booking places, arranging transport, writing to parents, collecting payments, planning activities, producing worksheets and, of course, endless risk assessments. It always leaves teachers wondering: "is it really worth all this effort?" Robert Johnston believes that every teacher should…

  4. Modeling and verifying SoS performance requirements of C4ISR systems

    Institute of Scientific and Technical Information of China (English)

    Yudong Qi; Zhixue Wang; Qingchao Dong; Hongyue He

    2015-01-01

    System-of-systems (SoS) engineering involves a complex process of refining high-level SoS requirements into more detailed system requirements and assessing the extent to which the performances of to-be systems may satisfy SoS capability objectives. The key issue is how to model such requirements so as to automate the process of analysis and assessment. This paper suggests a meta-model that defines both functional and non-functional features of SoS requirements for command, control, communication, computer, intelligence, surveillance and reconnaissance (C4ISR) systems. A domain-specific modeling language is defined by extending unified modeling language (UML) constructs of class and association with fuzzy theory in order to model the fuzzy concepts of performance requirements. An efficiency evaluation function is introduced, based on Bézier curves, to predict the effectiveness of systems. An algorithm is presented to transform domain models in fuzzy UML into a requirements ontology in description logic (DL) so that requirements verification can be automated with a popular DL reasoner such as Pellet.

  5. Requirements Evolution Processes Modeling

    Institute of Scientific and Technical Information of China (English)

    张国生

    2012-01-01

    Requirements tasks, requirements activities, requirements engineering processes and the requirements engineering process system are formally defined. Requirements tasks are measured with information entropy; requirements activities, requirements engineering processes and the requirements engineering process system are measured with joint entropy. From the point of view of requirements engineering processes, the microcosmic evolution of iteration and feedback in the requirements engineering processes is modeled with condition-event nets. From the point of view of systems engineering, the macro evolution of the whole software requirements engineering process system is modeled with dissipative structure theory.
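The entropy measures mentioned can be sketched directly from Shannon's definitions: a single task is scored by the entropy of its outcome distribution, and a set of interacting activities by the joint entropy of their joint distribution. The distributions below are hypothetical placeholders.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(X) = -sum p*log2(p) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # 0*log(0) is taken as 0
    return -np.sum(p * np.log2(p))

def joint_entropy(pxy):
    """Joint entropy H(X, Y) computed over a joint probability table."""
    return entropy(np.asarray(pxy, dtype=float).ravel())

# Hypothetical distribution over the possible outcomes of one requirements task
print(entropy([0.5, 0.25, 0.25]))                     # → 1.5 (bits)
# Hypothetical joint table over two interacting requirements activities
print(joint_entropy([[0.25, 0.25], [0.25, 0.25]]))    # → 2.0 (bits)
```

Higher entropy corresponds to greater uncertainty in a task or activity, which is the quantity the evolution model tracks.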

  6. Performance Requirements Modeling andAssessment for Active Power Ancillary Services

    DEFF Research Database (Denmark)

    Bondy, Daniel Esteban Morales; Thavlov, Anders; Tougaard, Janus Bundsgaard Mosbæk

    2017-01-01

    New sources of ancillary services are expected in the power system. For large and conventional generation units the dynamic response is well understood and detailed individual measurement is feasible, which factors into the straightforward performance requirements applied today. For secure power system operation, a reliable service delivery is required, yet it may not be appropriate to apply conventional performance requirements to new technologies and methods. The service performance requirements and assessment methods therefore need to be generalized and standardized in order to include future ancillary service sources. This paper develops a modeling method for ancillary services performance requirements, including performance and verification indices. The use of the modeling method and the indices is exemplified in two case studies.

  7. Fraud Risk Modelling: Requirements Elicitation in the Case of Telecom Services

    DEFF Research Database (Denmark)

    Yesuf, Ahmed; Wolos, Lars Peter; Rannenberg, Kai

    2017-01-01

    In this paper, we highlight the important requirements for a usable and context-aware fraud risk modelling approach for Telecom services. To do so, we have conducted two workshops with experts from a Telecom provider and experts from multi-disciplinary areas. In order to show and document the requirements, we…

  8. The Nuremberg Code subverts human health and safety by requiring animal modeling

    OpenAIRE

    Greek Ray; Pippus Annalea; Hansen Lawrence A

    2012-01-01

    Abstract Background The requirement that animals be used in research and testing in order to protect humans was formalized in the Nuremberg Code and subsequent national and international laws, codes, and declarations. Discussion We review the history of these requirements and contrast what was known via science about animal models then with what is known now. We further analyze the predictive...

  9. A Hybrid Parallel Execution Model for Logic Based Requirement Specifications (Invited Paper)

    Directory of Open Access Journals (Sweden)

    Jeffrey J. P. Tsai

    1999-05-01

    Full Text Available It is well known that undiscovered errors in a requirements specification are extremely expensive to fix when discovered in the software maintenance phase. Errors in the requirements phase can be reduced through validation and verification of the requirements specification. Many logic-based requirements specification languages have been developed to achieve these goals. However, the execution and reasoning of a logic-based requirements specification can be very slow. An effective way to improve their performance is to execute and reason about the logic-based requirements specification in parallel. In this paper, we present a hybrid model to facilitate the parallel execution of a logic-based requirements specification language. A logic-based specification is first processed by a data dependency analysis technique which can find all the mode combinations that exist within a specification clause. This mode information is used to support a novel hybrid parallel execution model, which combines both top-down and bottom-up evaluation strategies. The new execution model can find the failure in the deepest node of the search tree at an early stage of the evaluation, and can thus reduce the total number of nodes searched in the tree, the total number of processes that need to be generated, and the total number of communication channels needed in the search process. A simulator has been implemented to analyze the execution behavior of the new model. Experiments show significant improvement based on several criteria.

  10. Virtual Community Life Cycle: a Model to Develop Systems with Fluid Requirements

    OpenAIRE

    El Morr, Christo; Maret, Pierre de; Rioux, Marcia; Dinca-Panaitescu, Mihaela; Subercaze, Julien

    2011-01-01

    This paper reports the results of an investigation into the life cycle model needed to develop information systems for groups of people with fluid requirements. For this purpose, we developed a modified spiral model and applied it to the analysis, design and implementation of a virtual community for a group of researchers and organizations that collaborated in a research project and had changing system requirements. The virtual knowledge community was dedicated to supporting mobilization and dissemi…

  11. Modeling the Impact of Simulated Educational Interventions on the Use and Abuse of Pharmaceutical Opioids in the United States: A Report on Initial Efforts

    Science.gov (United States)

    Wakeland, Wayne; Nielsen, Alexandra; Schmidt, Teresa D.; McCarty, Dennis; Webster, Lynn R.; Fitzgerald, John; Haddox, J. David

    2013-01-01

    Three educational interventions were simulated in a system dynamics model of the medical use, trafficking, and nonmedical use of pharmaceutical opioids. The study relied on secondary data obtained in the literature for the period of 1995 to 2008 as well as expert panel recommendations regarding model parameters and structure. The behavior of the…

  12. Genetic programming as alternative for predicting development effort of individual software projects.

    Directory of Open Access Journals (Sweden)

    Arturo Chavoya

    Full Text Available Statistical and genetic programming techniques have been used to predict the software development effort of large software projects. In this paper, a genetic programming model was used for predicting the effort required in individually developed projects. Accuracy obtained from a genetic programming model was compared against one generated from the application of a statistical regression model. A sample of 219 projects developed by 71 practitioners was used for generating the two models, whereas another sample of 130 projects developed by 38 practitioners was used for validating them. The models used two kinds of lines of code as well as programming language experience as independent variables. Accuracy results from the model obtained with genetic programming suggest that it could be used to predict the software development effort of individual projects when these projects have been developed in a disciplined manner within a development-controlled environment.
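The statistical regression baseline the record compares against can be sketched as an ordinary least-squares fit. The project data below are synthetic stand-ins for the two lines-of-code measures and the experience variable, and the coefficients are invented; MMRE (mean magnitude of relative error) is a common accuracy criterion in the effort estimation literature.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical project data: new LOC, modified LOC, years of language experience.
new_loc = rng.uniform(50, 500, 60)
mod_loc = rng.uniform(10, 200, 60)
exper   = rng.uniform(1, 10, 60)
# Synthetic "true" effort (person-hours) with noise, for illustration only.
effort  = 0.05 * new_loc + 0.03 * mod_loc - 1.5 * exper + 25 + rng.normal(0, 2, 60)

# Multiple linear regression via least squares (intercept in the last column).
X = np.column_stack([new_loc, mod_loc, exper, np.ones_like(effort)])
coef, *_ = np.linalg.lstsq(X, effort, rcond=None)
pred = X @ coef
mmre = np.mean(np.abs((effort - pred) / effort))   # mean magnitude of relative error
print(coef.round(3), round(mmre, 3))
```

A genetic programming model would be evaluated on the same held-out projects with the same MMRE-style criteria, which is how the two approaches are compared in studies like this one.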

  13. A Review of Equation of State Models, Chemical Equilibrium Calculations and CERV Code Requirements for SHS Detonation Modelling

    Science.gov (United States)

    2009-10-01

    Beattie-Bridgeman ... Virial expansion ... The above equations are suitable for moderate pressures and are usually based on either empirical constants... CR 2010-013, October 2009 ... Defence R&D Canada. A Review of Equation of State Models, Chemical Equilibrium Calculations and CERV Code Requirements for SHS Detonation

  14. Learning Environment and Student Effort

    Science.gov (United States)

    Hopland, Arnt O.; Nyhus, Ole Henning

    2016-01-01

    Purpose: The purpose of this paper is to explore the relationship between satisfaction with learning environment and student effort, both in class and with homework assignments. Design/methodology/approach: The authors use data from a nationwide and compulsory survey to analyze the relationship between learning environment and student effort. The…

  15. Structural model requirements to describe microbial inactivation during a mild heat treatment.

    Science.gov (United States)

    Geeraerd, A H; Herremans, C H; Van Impe, J F

    2000-09-10

    The classical concept of D and z values, established for sterilisation processes, is unable to deal with the typical non-loglinear behaviour of survivor curves occurring during the mild heat treatment of sous vide or cook-chill food products. Structural model requirements are formulated, immediately eliminating some candidate model types. Promising modelling approaches are thoroughly analysed and, if applicable, adapted to the specific needs: two models developed by Casolari (1988), the inactivation model of Sapru et al. (1992), the model of Whiting (1993), the Baranyi and Roberts growth model (1994), the model of Chiruta et al. (1997), the model of Daughtry et al. (1997) and the model of Xiong et al. (1999). A range of experimental data for Bacillus cereus, Yersinia enterocolitica, Escherichia coli O157:H7, Listeria monocytogenes and Lactobacillus sake is used to illustrate the different models' performances. Moreover, a novel modelling approach is developed, fulfilling all formulated structural model requirements and based on a careful analysis of literature knowledge of the shoulder and tailing phenomena. Although a thorough insight into the occurrence of shoulders and tails is still lacking from a biochemical point of view, this newly developed model incorporates the possibility of a straightforward interpretation within this framework.
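A survivor curve with both a shoulder and a tail, in the spirit of the model family reviewed here, can be written in closed form: log-linear inactivation delayed by a shoulder and levelling off at a resistant subpopulation (this sketch follows the Geeraerd-type parameterization; the kinetic parameter values are illustrative, not taken from the paper).

```python
import numpy as np

def survivors_log10(t, n0=1e7, n_res=1e2, k_max=0.5, shoulder=5.0):
    """log10 survivor curve with a shoulder of length `shoulder` (min),
    first-order inactivation at rate k_max (1/min), and a tail at n_res."""
    cc0 = np.exp(k_max * shoulder) - 1.0      # initial protective-component level
    decay = np.exp(-k_max * np.asarray(t, dtype=float))
    n = (n0 - n_res) * decay * (1.0 + cc0) / (1.0 + cc0 * decay) + n_res
    return np.log10(n)

# Shoulder up to ~5 min, then ~0.22 log/min decline, tailing at 2 log CFU
print(survivors_log10([0, 5, 10, 20, 40]).round(2))
```

With n_res → 0 and shoulder → 0 the expression collapses to the classical log-linear D-value model, which is why structural requirements like these can be checked as limiting cases.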

  16. Capital Regulation, Liquidity Requirements and Taxation in a Dynamic Model of Banking

    NARCIS (Netherlands)

    Di Nicolo, G.; Gamba, A.; Lucchetta, M.

    2011-01-01

    This paper formulates a dynamic model of a bank exposed to both credit and liquidity risk, which can resolve financial distress in three costly forms: fire sales, bond issuance and equity issuance. We use the model to analyze the impact of capital regulation, liquidity requirements and taxation on ba

  18. The pharmacology of effort-related choice behavior: Dopamine, depression, and individual differences.

    Science.gov (United States)

    Salamone, John D; Correa, Merce; Yohn, Samantha; Lopez Cruz, Laura; San Miguel, Noemi; Alatorre, Luisa

    2016-06-01

    This review paper is focused upon the involvement of mesolimbic dopamine (DA) and related brain systems in effort-based processes. Interference with DA transmission affects instrumental behavior in a manner that interacts with the response requirements of the task, such that rats with impaired DA transmission show a heightened sensitivity to ratio requirements. Impaired DA transmission also affects effort-related choice behavior, which is assessed by tasks that offer a choice between a preferred reinforcer that has a high work requirement vs. a less preferred reinforcer that can be obtained with minimal effort. Rats and mice with impaired DA transmission reallocate instrumental behavior away from food-reinforced tasks with high response costs, and show increased selection of low reinforcement/low cost options. Tests of effort-related choice have been developed into models of pathological symptoms of motivation that are seen in disorders such as depression and schizophrenia. These models are being employed to explore the effects of conditions associated with various psychopathologies, and to assess drugs for their potential utility as treatments for effort-related symptoms. Studies of the pharmacology of effort-based choice may contribute to the development of treatments for symptoms such as psychomotor slowing, fatigue or anergia, which are seen in depression and other disorders.

  19. An EMG-assisted model calibration technique that does not require MVCs.

    Science.gov (United States)

    Dufour, Jonathan S; Marras, William S; Knapik, Gregory G

    2013-06-01

    As personalized biologically-assisted models of the spine have evolved, the normalization of raw electromyographic (EMG) signals has become increasingly important. The traditional method of normalizing myoelectric signals, relative to measured maximum voluntary contractions (MVCs), is susceptible to error and is problematic for evaluating symptomatic low back pain (LBP) patients. Additionally, efforts to circumvent MVCs have not been validated during complex free-dynamic exertions. Therefore, the objective of this study was to develop an MVC-independent biologically-assisted model calibration technique that overcomes the limitations of previous normalization efforts, and to validate this technique over a variety of complex free-dynamic conditions including symmetrical and asymmetrical lifting. The newly developed technique (non-MVC) eliminates the need to collect MVCs by combining gain (maximum strength per unit area) and MVC into a single muscle property (gain ratio) that can be determined during model calibration. Ten subjects (five male, five female) were evaluated to compare gain ratio prediction variability, spinal load predictions, and model fidelity between the new non-MVC and established MVC-based model calibration techniques. The new non-MVC model calibration technique demonstrated gain ratio prediction variability at least as low as that of the MVC-based technique, similar spinal load predictions, and similar model fidelity, indicating that it is a valid alternative to traditional MVC-based EMG normalization. Spinal loading for individuals who are unwilling or unable to produce reliable MVCs can now be evaluated. In particular, this technique will be valuable for evaluating symptomatic LBP patients, which may provide significant insight into the underlying nature of the LBP disorder.

  20. Workshop on Functional Requirements for the Modeling of Fate and Transport of Waterborne CBRN Materials

    Energy Technology Data Exchange (ETDEWEB)

    Giles, GE

    2005-02-03

    The purpose of this Workshop on "Functional Requirements for the Modeling of Fate and Transport of Waterborne CBRN Materials" was to solicit functional requirements for tools that help Incident Managers plan for and deal with the consequences of industrial or terrorist releases of materials into the nation's waterways and public water utilities. Twenty representatives attended and several made presentations. Several hours of discussions elicited a set of requirements, which were summarized in a form for the attendees to vote on their highest-priority requirements. These votes were used to determine the prioritized requirements that are reported in this paper and can be used to direct future developments.

  1. Wastewater treatment models in teaching and training: the mismatch between education and requirements for jobs.

    Science.gov (United States)

    Hug, Thomas; Benedetti, Lorenzo; Hall, Eric R; Johnson, Bruce R; Morgenroth, Eberhard; Nopens, Ingmar; Rieger, Leiv; Shaw, Andrew; Vanrolleghem, Peter A

    2009-01-01

    As mathematical modeling of wastewater treatment plants has become more common in research and consultancy, a mismatch between education and requirements for model-related jobs has developed. There seems to be a shortage of skilled people, both in terms of quantity and in quality. In order to address this problem, this paper provides a framework to outline different types of model-related jobs, assess the required skills for these jobs and characterize different types of education that modelers obtain "in school" as well as "on the job". It is important to consider that education of modelers does not mainly happen in university courses and that the variety of model related jobs goes far beyond use for process design by consulting companies. To resolve the mismatch, the current connection between requirements for different jobs and the various types of education has to be assessed for different geographical regions and professional environments. This allows the evaluation and improvement of important educational paths, considering quality assurance and future developments. Moreover, conclusions from a workshop involving practitioners and academics from North America and Europe are presented. The participants stressed the importance of non-technical skills and recommended strengthening the role of realistic modeling experience in university training. However, this paper suggests that all providers of modeling education and support, not only universities, but also software suppliers, professional associations and companies performing modeling tasks are called to assess and strengthen their role in training and support of professional modelers.

  2. The Nuremberg Code subverts human health and safety by requiring animal modeling

    Science.gov (United States)

    2012-01-01

    Background The requirement that animals be used in research and testing in order to protect humans was formalized in the Nuremberg Code and subsequent national and international laws, codes, and declarations. Discussion We review the history of these requirements and contrast what was known via science about animal models then with what is known now. We further analyze the predictive value of animal models when used as test subjects for human response to drugs and disease. We explore the use of animals for models in toxicity testing as an example of the problem with using animal models. Summary We conclude that the requirements for animal testing found in the Nuremberg Code were based on scientifically outdated principles, compromised by people with a vested interest in animal experimentation, serve no useful function, increase the cost of drug development, and prevent otherwise safe and efficacious drugs and therapies from being implemented. PMID:22769234

  3. The Nuremberg Code subverts human health and safety by requiring animal modeling

    Directory of Open Access Journals (Sweden)

    Greek Ray

    2012-07-01

    Full Text Available Abstract Background The requirement that animals be used in research and testing in order to protect humans was formalized in the Nuremberg Code and subsequent national and international laws, codes, and declarations. Discussion We review the history of these requirements and contrast what was known via science about animal models then with what is known now. We further analyze the predictive value of animal models when used as test subjects for human response to drugs and disease. We explore the use of animals for models in toxicity testing as an example of the problem with using animal models. Summary We conclude that the requirements for animal testing found in the Nuremberg Code were based on scientifically outdated principles, compromised by people with a vested interest in animal experimentation, serve no useful function, increase the cost of drug development, and prevent otherwise safe and efficacious drugs and therapies from being implemented.

  4. Update on the Status of the On-Going Range Dependent Low Frequency Active Sonar Model Benchmarking Effort: From Cambridge to Kos [abstract]

    NARCIS (Netherlands)

    Zampolli, M.; Ainslie, M.A.

    2011-01-01

    In April 2010, a symposium in Memory of David Weston was held at Clare College in Cambridge (UK). International researchers from academia and research laboratories met to discuss two sets of test problems for sonar performance models, one aimed at understanding mammal echolocation sonar ("Problem AI

  5. Creating the finite element models of car seats with passive head restraints to meet the requirements of passive safety

    Directory of Open Access Journals (Sweden)

    D. Yu. Solopov

    2014-01-01

    Full Text Available Designing car seats with modern CAE software based on the finite element method can significantly increase the efficiency of the design process. The process is complicated by the fact that, at present, no available techniques are focused on this sort of task. This article shows the features of creating finite element models (FEM) of car seats at three levels of complexity. It assesses the passive safety ensured by the developed seat models with passive head restraints according to the requirements of UNECE Regulation No. 25, and the accuracy of the calculation results compared with those of full-scale experiments. This work is part of a developed technique which allows effective development of car seat designs with both passive and active head restraints that meet passive safety requirements. The calculations and experiments established that, under assessment by the UNECE Regulation No. 25 technique, the "rough" FEMs (the 1st and 2nd levels) can be considered rational, both in the effort needed for their creation and task solution and in the errors of the results, and it is expedient to use them for preliminary and repeated calculations. Detailed models (the 3rd level) provide the greatest accuracy (the relative error is 10% for accelerations and 11% for displacements), while, in comparison with calculations, the relative error for a model of the head restraint alone decreases by 5% for accelerations and 9% for displacements. The materials presented in the article are used both in research activities and in training students at the Chair of Wheel Vehicles of the Scientific and Educational Complex "Special Mechanical Engineering" of Bauman Moscow State Technical University.

  6. Advanced materials characterization and modeling using synchrotron, neutron, TEM, and novel micro-mechanical techniques - A European effort to accelerate fusion materials development

    DEFF Research Database (Denmark)

    Linsmeier, Ch.; Fu, C.-C.; Kaprolat, A.

    2013-01-01

    For the realization of fusion as an energy source, the development of suitable materials is one of the most critical issues. The required material properties are in many aspects unique compared to the existing solutions, particularly the need for resistance to irradiation with neutrons having energies up to 14 MeV. In addition to withstanding the effects of neutrons, the mechanical stability of structural materials has to be maintained up to high temperatures. Plasma-exposed materials must be compatible with the fusion plasma, both with regard to the generation of impurities injected … as testing under neutron flux-induced conditions. For the realization of a DEMO power plant, the materials solutions must be available in time. The European initiative FEMaS-CA – Fusion Energy Materials Science – Coordination Action – aims at accelerating materials development by integrating advanced…

  7. Rent seeking with efforts and bids

    NARCIS (Netherlands)

    Haan, M.A.; Schoonbeek, L.

    2003-01-01

    We introduce bids in a rent-seeking contest. Players compete for a prize. Apart from exerting lobbying efforts, they also submit a bid which is payable only if they win the prize. We show that our model has a unique Nash equilibrium in pure strategies, in which each active player submits the same bi
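The baseline for such contests is the symmetric Tullock model. As a hedged illustration (this is the textbook contest without the bid stage the paper adds, and the prize value and player counts below are made up), the unique symmetric pure-strategy equilibrium effort can be computed directly:

```python
def tullock_effort(n, prize):
    """Symmetric equilibrium effort in a standard n-player Tullock contest
    where player i wins with probability x_i / sum(x): x* = (n-1)V/n^2."""
    return (n - 1) * prize / n ** 2

def expected_payoff(n, prize):
    # win probability 1/n times the prize, minus own effort
    return prize / n - tullock_effort(n, prize)
```

With two players and a prize of 100, each exerts 25 and earns an expected 25; effort per player falls as the contest gets more crowded.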

  8. Net benefits of wildfire prevention education efforts

    Science.gov (United States)

    Jeffrey P. Prestemon; David T. Butry; Karen L. Abt; Ronda. Sutphen

    2010-01-01

    Wildfire prevention education efforts involve a variety of methods, including airing public service announcements, distributing brochures, and making presentations, which are intended to reduce the occurrence of certain kinds of wildfires. A Poisson model of preventable Florida wildfires from 2002 to 2007 by fire management region was developed. Controlling for...
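A minimal sketch of the kind of count model the abstract describes: a Poisson probability mass function, plus a log-linear link from prevention effort to the expected number of preventable ignitions. The coefficients are illustrative placeholders, not estimates from the study:

```python
import math

def poisson_pmf(k, lam):
    """Probability of observing exactly k ignitions given mean rate lam."""
    return lam ** k * math.exp(-lam) / math.factorial(k)

def expected_fires(effort, beta0=2.0, beta1=-0.3):
    """Log-linear Poisson mean: log(lambda) = beta0 + beta1 * effort.
    beta0 and beta1 are hypothetical values, not fitted coefficients."""
    return math.exp(beta0 + beta1 * effort)
```

Under this link, each unit of prevention effort scales the expected count multiplicatively by exp(beta1), which is how such models quantify the benefit of education campaigns.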

  9. Nash Equilibria in Shared Effort Games

    NARCIS (Netherlands)

    Polevoy, G.; Trajanovski, S.; De Weerdt, M.M.

    2014-01-01

    Shared effort games model people's contribution to projects and sharing the obtained profits. Those games generalize both public projects like writing for Wikipedia, where everybody shares the resulting benefits, and all-pay auctions such as contests and political campaigns, where only the winner ob

  10. An emergency dispatch model considering the urgency of the requirement for reliefs in different disaster areas

    Directory of Open Access Journals (Sweden)

    Liu Sheng

    2015-11-01

    Full Text Available Abstract: Purpose: Frequent sudden-onset disasters, which have threatened the survival of humans and the development of society, force the public to pay increasing attention to emergency management. A challenging task in the process of emergency management is the emergency dispatch of reliefs. An emergency dispatch model considering the urgency of the requirement for reliefs in different disaster areas is proposed in this paper to dispatch reliefs reasonably and reduce the effect of sudden-onset disasters. Design/methodology/approach: Firstly, a quantitative assessment of the urgency of the requirement for reliefs in different disaster areas is performed using an evaluation method, proposed in this paper, based on Fuzzy Comprehensive Evaluation and improved Evidence Reasoning. Then, based on the quantitative results, an emergency dispatch model is proposed that aims to minimize the response time, the distribution cost and the unsatisfied rate of the requirement for reliefs, reflecting the requests of disaster areas under emergency: the urgency of requirement, the economy of distribution and the equity of allocation. Finally, the Genetic Algorithm is improved based on an adaptive crossover and mutation probability function to solve the emergency dispatch model. Findings and Originality/value: A case in which the Y hydraulic power enterprise carries out emergency dispatch of reliefs under continuous sudden-onset heavy rain is given to illustrate the availability of the proposed emergency dispatch model. The results show that the model meets the distribution priority requirement of the disaster area with the higher urgency, so that reliefs are supplied more timely. Research limitations/implications: The emergency dispatch model for large-scale sudden-onset disasters is complex. The quantity of reliefs that a disaster area requires and the running time of vehicles are viewed as available information, and the problem
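The adaptive crossover/mutation idea mentioned in the abstract can be sketched as a toy genetic algorithm allocating a fixed relief supply across areas weighted by urgency. All data, rate formulas and parameters below are illustrative assumptions, not the paper's model:

```python
import random

random.seed(42)                 # deterministic for the illustration

URGENCY = [0.9, 0.6, 0.3]       # hypothetical urgency weights per disaster area
DEMAND = [40, 30, 30]           # hypothetical relief demand per area
SUPPLY = 80                     # total reliefs available (less than total demand)

def cost(alloc):
    # weighted unmet demand: more urgent areas are penalised more heavily
    return sum(u * max(d - a, 0) for u, d, a in zip(URGENCY, DEMAND, alloc))

def random_alloc():
    # random non-negative split of SUPPLY across the areas
    cuts = sorted(random.randint(0, SUPPLY) for _ in range(len(DEMAND) - 1))
    return [b - a for a, b in zip([0] + cuts, cuts + [SUPPLY])]

def adaptive_rates(f, f_avg, f_min):
    # adaptive probabilities: better-than-average parents (low cost) get
    # gentler operators, the rest get aggressive exploration
    if f <= f_avg:
        span = (f_avg - f_min) or 1.0
        frac = (f - f_min) / span
        return 0.5 + 0.4 * frac, 0.05 + 0.15 * frac   # (p_crossover, p_mutation)
    return 0.9, 0.2

def evolve(pop, gens=150):
    for _ in range(gens):
        fits = [cost(p) for p in pop]
        f_avg, f_min = sum(fits) / len(fits), min(fits)
        nxt = [min(pop, key=cost)[:]]                  # elitism keeps the best
        while len(nxt) < len(pop):
            a, b = random.sample(pop, 2)
            pc, pm = adaptive_rates(min(cost(a), cost(b)), f_avg, f_min)
            if random.random() < pc:                   # averaging crossover
                child = [(x + y) // 2 for x, y in zip(a, b)]
            else:
                child = a[:]
            if random.random() < pm:                   # move one unit between areas
                i, j = random.sample(range(len(child)), 2)
                moved = min(1, child[i])
                child[i] -= moved
                child[j] += moved
            nxt.append(child)
        pop = nxt
    return min(pop, key=cost)
```

Because elitism carries the incumbent best forward, the best cost is non-increasing across generations; the allocation converges toward serving the most urgent areas first.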

  11. Modeling of Car-Following Required Safe Distance Based on Molecular Dynamics

    OpenAIRE

    Dayi Qu; Xiufeng Chen; Wansan Yang; Xiaohua Bian

    2014-01-01

    In car-following procedure, some distances are reserved between the vehicles, through which drivers can avoid collisions with vehicles before and after them in the same lane and keep a reasonable clearance with lateral vehicles. This paper investigates characters of vehicle operating safety in car following state based on required safe distance. To tackle this problem, we probe into required safe distance and car-following model using molecular dynamics, covering longitudinal and lateral safe...
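For orientation, the classical kinematic formulation of a required safe following distance (reaction-time travel, plus the difference in braking distances, plus a standstill margin) can be written as a short function. This is the textbook formula, not the molecular-dynamics model developed in the paper, and all parameter defaults are assumptions:

```python
def required_safe_distance(v_follow, v_lead, t_react=1.0,
                           a_follow=6.0, a_lead=6.0, margin=2.0):
    """Kinematic car-following safe distance in metres.
    Speeds in m/s, decelerations in m/s^2. Reaction travel of the follower
    plus its braking distance, minus the leader's braking distance, plus a
    standstill margin; never less than the margin itself."""
    gap = (v_follow * t_react
           + v_follow ** 2 / (2 * a_follow)
           - v_lead ** 2 / (2 * a_lead)
           + margin)
    return max(gap, margin)
```

At equal speeds the braking terms cancel and the requirement reduces to reaction travel plus the margin; a faster leader shrinks the required gap.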

  12. Computer software requirements specification for the world model light duty utility arm system

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, J.E.

    1996-02-01

    This Computer Software Requirements Specification defines the software requirements for the world model of the Light Duty Utility Arm (LDUA) System. It is intended to guide the design of the application software, to serve as a basis for assessing that design, and to establish what is to be tested in the finished application software product. (The LDUA deploys end effectors into underground storage tanks by means of a robotic arm on the end of a telescoping mast.)

  13. IDENTIFYING OPERATIONAL REQUIREMENTS TO SELECT SUITABLE DECISION MODELS FOR A PUBLIC SECTOR EPROCUREMENT DECISION SUPPORT SYSTEM

    Directory of Open Access Journals (Sweden)

    Mohamed Adil

    2014-10-01

    Full Text Available Public sector procurement should be a transparent and fair process. Strict legal requirements are enforced on public sector procurement to make it a standardised process. To make fair decisions in selecting suppliers, a practical method that adheres to the legal requirements is important. The research on which this paper is based aimed to identify a suitable Multi-Criteria Decision Analysis (MCDA) method for the specific legal and functional needs of the Maldivian public sector. To identify these operational requirements, a set of focus group interviews was conducted in the Maldives with public officials responsible for procurement decision making. Based on the operational requirements identified through the focus groups, a criteria-based evaluation of published MCDA methods was performed to identify those suitable for e-procurement decision making. This paper describes the identification of the operational requirements and the results of the evaluation to select suitable decision models for the Maldivian context.

  14. Postconcussive complaints, cognition, symptom attribution and effort among veterans.

    Science.gov (United States)

    Larson, Eric B; Kondiles, Bethany R; Starr, Christine R; Zollman, Felise S

    2013-01-01

    The etiology of postconcussive symptoms is not clearly understood. Development of etiological models of those symptoms will be helpful for accurate diagnosis and for planning effective treatment. Such a model should characterize the role of subject characteristics (education, premorbid intelligence), social psychological factors and symptom validity. Toward that end, the present study examined the association of postconcussive complaints and cognitive performance with symptom attribution and level of effort on testing. In a sample of 155 veterans, attribution to concussion was associated with endorsement of more severe postconcussive complaints, after controlling for the effects of other factors such as subject characteristics. Similarly, effort was associated with cognitive performance after controlling for the effects of these other factors. The present findings are consistent with previous reports that illness perception and effort on testing are associated with postconcussive complaints. This supports previous recommendations to routinely educate all concussion patients immediately after injury to reduce distorted perceptions and related persistent complaints. Finally, these findings highlight a need for routine assessment of patients' perception of their injury to identify cases that may require psychotherapy to address any misattributions that develop.

  15. Monitoring, Operational Manager Efforts and Inventory Policy

    OpenAIRE

    Alfaro, J A; Tribó, J. (Josep)

    2003-01-01

    Operations managers are becoming more important in modern corporations. They not only take care of firms' inventory management but are also involved in firms' strategic decisions. Within this setting, we ask what consequences this new role has for inventory policy. To do so, we develop a model in which a firm's Operations Manager can devote some effort to non-inventory-related activities. These efforts, although non-verifiable, may be known with a c...

  16. On Early Conflict Identification by Requirements Modeling of Energy System Control Structures

    DEFF Research Database (Denmark)

    Heussen, Kai; Gehrke, Oliver; Niemann, Hans Henrik

    2015-01-01

    Control systems are purposeful systems involving goal-oriented information processing (cyber) and technical (physical) structures. Requirements modeling formalizes fundamental concepts and relations of a system architecture at a high-level design stage and can be used to identify potential design...... at later design stages. However, languages employed for requirements modeling today do not offer the expressiveness necessary to represent control purposes in relation to domain level interactions and therefore miss several types of interdependencies. This paper introduces the idea of control structure...

  17. Modeling and Simulation for Exploring Human-Robot Team Interaction Requirements

    Energy Technology Data Exchange (ETDEWEB)

    Dudenhoeffer, Donald Dean; Bruemmer, David Jonathon; Davis, Midge Lee

    2001-12-01

    Small-sized and micro-robots will soon be available for deployment in large-scale forces. Consequently, the ability of a human operator to coordinate and interact with large-scale robotic forces is of great interest. This paper describes the ways in which modeling and simulation have been used to explore new possibilities for human-robot interaction. The paper also discusses how these explorations have fed the implementation of a unified set of command and control concepts for robotic force deployment. Modeling and simulation can play a major role in fielding robot teams in actual missions. While live testing is preferred, limitations in terms of technology, cost, and time often prohibit extensive experimentation with physical multi-robot systems. Simulation provides insight, focuses efforts, eliminates large areas of the possible solution space, and increases the quality of actual testing.

  18. Data requirements of GREAT-ER: Modelling and validation using LAS in four UK catchments

    Energy Technology Data Exchange (ETDEWEB)

    Price, Oliver R., E-mail: oliver.price@unilever.co [Safety and Environmental Assurance Centre, Unilever, Colworth Science Park, Sharnbrook, Bedfordshire MK44 1LQ (United Kingdom); Munday, Dawn K. [Safety and Environmental Assurance Centre, Unilever, Colworth Science Park, Sharnbrook, Bedfordshire MK44 1LQ (United Kingdom); Whelan, Mick J. [Department of Natural Resources, School of Applied Sciences, Cranfield University, College Road, Cranfield, Bedfordshire MK43 0AL (United Kingdom); Holt, Martin S. [ECETOC, Ave van Nieuwenhuyse 4, Box 6, B-1160 Brussels (Belgium); Fox, Katharine K. [85 Park Road West, Birkenhead, Merseyside CH43 8SQ (United Kingdom); Morris, Gerard [Environment Agency, Phoenix House, Global Avenue, Leeds LS11 8PG (United Kingdom); Young, Andrew R. [Wallingford HydroSolutions Ltd, Maclean building, Crowmarsh Gifford, Wallingford, Oxon OX10 8BB (United Kingdom)

    2009-10-15

    Higher-tier environmental risk assessments on 'down-the-drain' chemicals in river networks can be conducted using models such as GREAT-ER (Geography-referenced Regional Exposure Assessment Tool for European Rivers). It is important these models are evaluated and their sensitivities to input variables understood. This study had two primary objectives: evaluate GREAT-ER model performance, comparing simulated modelled predictions for LAS (linear alkylbenzene sulphonate) with measured concentrations, for four rivers in the UK, and investigate model sensitivity to input variables. We demonstrate that the GREAT-ER model is very sensitive to variability in river discharges. However it is insensitive to the form of distributions used to describe chemical usage and removal rate in sewage treatment plants (STPs). It is concluded that more effort should be directed towards improving empirical estimates of effluent load and reducing uncertainty associated with usage and removal rates in STPs. Simulations could be improved by incorporating the effect of river depth on dissipation rates. - Validation of GREAT-ER.
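The strong sensitivity to river discharge reported here is easy to see in the screening-level dilution equation that underlies tools of this kind: the predicted concentration is the emitted load divided by the river flow. A minimal sketch (not GREAT-ER itself; all inputs are hypothetical):

```python
def pec_ug_per_l(usage_g_per_cap_day, population, stp_removal, flow_m3_per_s):
    """Screening-level predicted environmental concentration (ug/L) for a
    down-the-drain chemical below a single STP discharge. Simplified
    dilution equation, not the GREAT-ER model; inputs are hypothetical."""
    load_g_per_day = usage_g_per_cap_day * population * (1.0 - stp_removal)
    flow_l_per_day = flow_m3_per_s * 1000.0 * 86400.0
    return load_g_per_day * 1e6 / flow_l_per_day   # convert g to ug
```

Because flow enters the denominator directly, halving the discharge doubles the predicted concentration, whereas usage and removal enter only through the load term, consistent with the sensitivity pattern the study reports.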

  19. Automatically multi-paradigm requirements modeling and analyzing: An ontology-based approach

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    There are several purposes for modeling and analyzing the problem domain before starting the software requirements analysis. First, it focuses attention on the problem domain, so that the domain users can be involved easily. Secondly, a comprehensive description of the problem domain helps in obtaining a comprehensive software requirements model. This paper proposes an ontology-based approach for modeling the problem domain. It interacts with the domain users using terminology that they can understand and guides them to provide the relevant information. A multiple-paradigm analysis approach, based on the description of the problem domain, is also presented. Three criteria are proposed: the rationality of the organization structure, the achievability of the organization goals, and the feasibility of the organization process. The results of the analysis can be used as feedback for guiding the domain users to provide further information on the problem domain. The models of the problem domain can serve as documentation for the pre-requirements analysis phase. They will also be the basis for further software requirements modeling.

  20. Optimal Work Effort and Monitoring Cost

    Directory of Open Access Journals (Sweden)

    Tamara Todorova

    2012-12-01

    Full Text Available Using a simple job market equilibrium model we study the relationship between work effort and monitoring by firms. Some other determinants of work effort investigated include the educational level of the worker, the minimum or start-up salary as well as the economic conjuncture. As common logic dictates, optimal work effort increases with the amount of monitoring done by the employer. Quite contrary to common logic, though, we find that at the optimum employers observe and control good workers much more stringently and meticulously than poor workers. This is because under profit maximization most of the employer’s profit and surplus result from good workers and he risks losing a large amount of profit by not observing those. Managers monitor strictly more productive workers, fast learners and those starting at a higher autonomous level of monitoring, as those contribute more substantially to the firm’s profit.

  1. Fractured rock modeling in the National Waste Terminal Storage Program: a review of requirements and status

    Energy Technology Data Exchange (ETDEWEB)

    St. John, C.; Krug, A.; Key, S.; Monsees, J.

    1983-05-01

    Generalized computer codes capable of forming the basis for numerical models of fractured rock masses are being used within the NWTS program. Little additional development of these codes is considered justifiable, except in the area of representation of discrete fractures. On the other hand, model preparation requires definition of medium-specific constitutive descriptions and site characteristics and is therefore legitimately conducted by each of the media-oriented projects within the National Waste Terminal Storage program. However, it is essential that a uniform approach to the role of numerical modeling be adopted, including agreement upon the contribution of modeling to the design and licensing process and the need for, and means of, model qualification for particular purposes. This report discusses the role of numerical modeling, reviews the capabilities of several computer codes that are being used to support design or performance assessment, and proposes a framework for future numerical modeling activities within the NWTS program.

  2. Risk Reduction of an Invasive Insect by Targeting Surveillance Efforts with the Assistance of a Phenology Model and International Maritime Shipping Routes and Schedules.

    Science.gov (United States)

    Gray, David R

    2016-05-01

    Reducing the risk of introduction to North America of the invasive Asian gypsy moth (Lymantria dispar asiatica Vnukovskij and L. d. japonica [Motschulsky]) on international maritime vessels involves two tactics: (1) vessels that wish to arrive in Canada or the United States and have visited any Asian port that is subject to regulation during designated times must obtain a predeparture inspection certificate from an approved entity; and (2) vessels with a certificate may be subjected to an additional inspection upon arrival. A decision support tool is described here with which the allocation of inspection resources at North American ports can be partitioned among multiple vessels according to estimates of the potential onboard Asian gypsy moth population and estimates of the onboard larval emergence pattern. The decision support tool assumes that port inspection is uniformly imperfect at the Asian ports and that each visit to a regulated port has potential for the vessel to be contaminated with gypsy moth egg masses. The decision support tool uses a multigenerational phenology model to estimate the potential onboard population of egg masses by calculating the temporal intersection between the dates of port visits to regulated ports and the simulated oviposition pattern in each port. The phenological development of the onboard population is simulated each day of the vessel log until the vessel arrives at the port being protected from introduction. Multiple independent simulations are used to create a probability distribution of the size and timing of larval emergence.
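The multigenerational phenology model itself is beyond a short sketch, but the degree-day bookkeeping such models rest on can be illustrated. The base temperature and hatch threshold below are illustrative placeholders, not the gypsy moth parameters used in the cited decision support tool:

```python
def degree_days(daily_mean_temps, base=7.0):
    """Cumulative growing degree-days above a base temperature,
    one running total per day."""
    total, series = 0.0, []
    for t in daily_mean_temps:
        total += max(t - base, 0.0)
        series.append(total)
    return series

def emergence_day(daily_mean_temps, threshold=150.0, base=7.0):
    """First day (0-indexed) on which the degree-day sum reaches the
    hypothetical hatch threshold; None if it is never reached."""
    for day, dd in enumerate(degree_days(daily_mean_temps, base)):
        if dd >= threshold:
            return day
    return None
```

Driving such an accumulator with port-by-port temperatures along a vessel's log is what lets a tool estimate whether onboard egg masses could produce emerging larvae by the arrival date.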

  3. Vocal effort and voice handicap among teachers.

    Science.gov (United States)

    Sampaio, Márcio Cardoso; dos Reis, Eduardo José Farias Borges; Carvalho, Fernando Martins; Porto, Lauro Antonio; Araújo, Tânia Maria

    2012-11-01

    The relationship between voice handicap and professional vocal effort was investigated among teachers in a cross-sectional study of census nature on 4496 teachers within the public elementary education network in Salvador, Bahia, Brazil. Voice handicap (the outcome of interest) was evaluated using the Voice Handicap Index 10. The main exposure, the lifetime vocal effort index, was obtained as the product of the number of years working as a teacher multiplied by the mean weekly working hours. The prevalence of voice handicap was 28.8% among teachers with high professional vocal effort and 21.3% among those with acceptable vocal effort, thus yielding a crude prevalence ratio (PR) of 1.36 (95% confidence interval [CI]=1.14-1.61). In the final logistic model, the prevalence of voice handicap was statistically associated with the professional vocal effort index (PR=1.47; 95% CI=1.19-1.82), adjusted according to sex, microphone availability in the classroom, excessive noise, pressure from the school management, heartburn, and rhinitis.
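The exposure index and the crude prevalence ratio reported in the abstract are simple arithmetic, sketched here with the reported prevalences (the small discrepancy from the published PR of 1.36 comes from rounding of the prevalences themselves):

```python
def vocal_effort_index(years_teaching, mean_weekly_hours):
    # lifetime index as defined in the abstract: years x mean weekly hours
    return years_teaching * mean_weekly_hours

def prevalence_ratio(p_exposed, p_unexposed):
    """Crude prevalence ratio between exposure groups (same units for both)."""
    return p_exposed / p_unexposed

# crude PR from the reported prevalences: 28.8% vs 21.3%
pr = prevalence_ratio(28.8, 21.3)
```

For example, twenty years of teaching at a mean of 30 weekly hours gives an index of 600 hour-years.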

  4. The dynamic system of parental work of care for children with special health care needs: A conceptual model to guide quality improvement efforts

    Directory of Open Access Journals (Sweden)

    Hexem Kari R

    2011-10-01

    Full Text Available Abstract Background The work of care for parents of children with complex special health care needs may be increasing, while excessive work demands may erode the quality of care. We sought to summarize knowledge and develop a general conceptual model of the work of care. Methods Systematic review of peer-reviewed journal articles that focused on parents of children with special health care needs and addressed factors related to the physical and emotional work of providing care for these children. From the large pool of eligible articles, we selected articles in a randomized sequence, using qualitative techniques to identify the conceptual components of the work of care and their relationship to the family system. Results The work of care for a child with special health care needs occurs within a dynamic system that comprises 5 core components: (1) performance of tasks such as monitoring symptoms or administering treatments, (2) the occurrence of various events and the pursuit of valued outcomes regarding the child's physical health, the parent's mental health, or other attributes of the child or family, (3) operating with available resources and within certain constraints, (4) over the passage of time, (5) while mentally representing or depicting the ever-changing situation and detecting possible problems and opportunities. These components interact, some with simple cause-effect relationships and others with more complex interdependencies. Conclusions The work of care affecting the health of children with special health care needs and their families can best be understood, studied, and managed as a multilevel complex system.

  5. Evaluating HIV prevention efforts using semiparametric regression models: results from a large cohort of women participating in an HIV prevention trial from KwaZulu-Natal, South Africa

    Directory of Open Access Journals (Sweden)

    Gita Ramjee

    2013-11-01

    Full Text Available Objective: To describe and quantify the differences in risk behaviours, HIV prevalence and incidence rates by birth cohorts among a group of women in Durban, South Africa. Methods: Cross-sectional and prospective cohort analyses were conducted for women who consented to be screened and enrolled in an HIV prevention trial. Demographic and sexual behaviours were described by five-year birth cohorts. Semiparametric regression models were used to investigate the bivariate associations between these factors and the birth cohorts. HIV seroconversion rates were also estimated by birth cohorts. Results: The prevalence of HIV-1 infection at the screening visit was lowest (20.0%) among the oldest cohorts (born before 1960), while the highest prevalence was observed among those born between 1975 and 1979. Level of education increased across the birth cohorts, while the median age at first sexual experience declined among those born after 1975 compared to those born before 1975. Only 33.03% of the oldest group reported ever using a condom while engaging in vaginal sex, compared to 73.68% in the youngest group; however, HIV and other sexually transmitted infection (STI) incidence rates were significantly higher among younger women compared to older women. Conclusions: These findings clearly suggest that demographic and sexual risk behaviours are differentially related to the birth cohorts. Significantly high HIV and STI incidence rates were observed among the younger group. Although the level of education increased, early age at sexual debut was more common among the younger group. The continuing increase in HIV and STI incidence rates among the later cohorts suggests that the future trajectory of the epidemic will be dependent on the infection patterns in younger birth cohorts.

  6. Using a DSGE Model to Assess the Macroeconomic Effects of Reserve Requirements in Brazil

    OpenAIRE

    Waldyr Dutra Areosa; Christiano Arrigoni Coelho

    2013-01-01

    The goal of this paper is to present how a Dynamic General Equilibrium Model (DSGE) can be used by policy makers in the qualitative and quantitative evaluation of the macroeconomics impacts of two monetary policy instruments: (i) short term interest rate and (ii) reserve requirements ratio. In our model, this last instrument affects the leverage of banks that have to deal with agency problems in order to raise funds from depositors. We estimated a modified version of Gertler and Karadi (2011)...

  7. A Formal Method to Model Early Requirement of Multi-Agent System

    Institute of Scientific and Technical Information of China (English)

    MAO Xin-jun; YU Eric

    2004-01-01

    A formal specification language iFL based on i* framework is presented in this paper to formally specify and analyze the early requirement of multi-agent system. It is a branching temporal logic which defines the concepts and models in i* framework in a rigorous way. The method to transform the i* models to iFL formal specification is also put forward.

  8. System Design Description and Requirements for Modeling the Off-Gas Systems for Fuel Recycling Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Daryl R. Haefner; Jack D. Law; Troy J. Tranter

    2010-08-01

    This document provides descriptions of the off-gases evolved during spent nuclear fuel processing and the systems used to capture the gases of concern. Two reprocessing techniques are discussed, namely aqueous separations and electrochemical (pyrochemical) processing. The unit operations associated with each process are described in enough detail so that computer models to mimic their behavior can be developed. The document also lists the general requirements for the desired computer models.

  10. ASME Code Efforts Supporting HTGRs

    Energy Technology Data Exchange (ETDEWEB)

    D.K. Morton

    2011-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  11. ASME Code Efforts Supporting HTGRs

    Energy Technology Data Exchange (ETDEWEB)

    D.K. Morton

    2010-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  12. ASME Code Efforts Supporting HTGRs

    Energy Technology Data Exchange (ETDEWEB)

    D.K. Morton

    2012-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  13. De novo actin polymerization is required for model Hirano body formation in Dictyostelium

    Directory of Open Access Journals (Sweden)

    Yun Dong

    2016-06-01

    Full Text Available Hirano bodies are eosinophilic, actin-rich inclusions found in autopsied brains in numerous neurodegenerative diseases. The mechanism of Hirano body formation is unknown. Mass spectrometry analysis was performed to identify proteins from partially purified model Hirano bodies from Dictyostelium. This analysis identified proteins primarily belonging to ribosomes, proteasomes, mitochondria and the cytoskeleton. Profilin, Arp2/3 and WASH, identified by mass spectrometry, were found to colocalise with model Hirano bodies. Due to their roles in actin regulation, we selected these proteins for further investigation. Inhibition of the Arp2/3 complex by CK666 prevented formation of model Hirano bodies. Since Arp2/3 activation occurs via the WASH or WAVE complex, we next investigated how these proteins affect Hirano body formation. Whereas model Hirano bodies could form in WASH-deficient cells, they failed to form in cells lacking HSPC300, a member of the WAVE complex. We identified other proteins required for Hirano body formation, including profilin and VASP, an actin nucleation factor. In the case of VASP, both its G- and F-actin binding domains were required for model Hirano body formation. Collectively, our results indicate that de novo actin polymerization is required to form model Hirano bodies.

  14. Breaking wheat yield barriers requires integrated efforts in developing countries

    Institute of Scientific and Technical Information of China (English)

    Saeed Rauf; Maria Zaharieva; Marilyn L Warburton; ZHANG Ping-zhi; Abdullah M AL-Sadi; Farghama Khalil; Marcin Kozak; Sultan A Tariq

    2015-01-01

    Most yield progress obtained through the so-called "Green Revolution", particularly in the irrigated areas of Asia, has reached a limit, and major resistance genes are quickly overcome by the appearance of new strains of disease-causing organisms. New plant stresses due to a changing environment are difficult to breed for as quickly as the changes occur. There is consequently a continual need for new research programs and breeding strategies aimed at improving yield potential, abiotic stress tolerance and resistance to new, major pests and diseases. Recent advances in plant breeding encompass novel methods of expanding genetic variability and selecting for recombinants, including the development of synthetic hexaploid, hybrid and transgenic wheats. In addition, the use of molecular approaches such as quantitative trait locus (QTL) and association mapping may increase the possibility of directly selecting positive chromosomal regions linked with natural variation for grain yield and stress resistance. The present article reviews the potential contribution of these new approaches and tools to the improvement of wheat yield in farmers' fields, with a special emphasis on the Asian countries, which are major wheat producers and contain the highest concentration of resource-poor wheat farmers.

  15. Breaking wheat yield barriers requires integrated efforts in developing countries

    Science.gov (United States)

    Most yield progress obtained through the so called “green revolution”, particularly in the irrigated areas of Asia, has reached a limit, and major resistance genes are quickly overcome by the appearance of new strains of disease causing organisms. New plant stresses due to a changing environment are...

  16. Effort problem of chemical pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Okrajni, J.; Ciesla, M.; Mutwil, K. [Silesian Technical University, Katowice (Poland)

    1998-12-31

    The paper addresses the assessment of the technical state of chemical pipelines working under mechanical and thermal loading. The effort of the pipelines after a long operating period has been analysed. The material, geometrical and loading conditions of the crack initiation and crack growth process in the chosen object are discussed. Areas of maximal effort have been determined. The changes in material structure after the long operating period are described. Mechanisms of crack initiation and crack growth in the pipeline elements have been analysed, and the mutual relations between chemical and mechanical influences are shown. (orig.) 16 refs.

  17. Understanding the relationship between Kano model's customer satisfaction scores and self-stated requirements importance.

    Science.gov (United States)

    Mkpojiogu, Emmanuel O C; Hashim, Nor Laily

    2016-01-01

    Customer satisfaction is the result of product quality and viability. The perceived satisfaction of users/customers with a software product cannot be neglected, especially in today's competitive market environment, as it drives customer loyalty and promotes high profitability and return on investment. Understanding the importance of requirements, as it is associated with the satisfaction of users/customers when their requirements are met, is therefore worth considering. It is necessary to know the relationship between customer satisfaction when requirements are met (or dissatisfaction when they are unmet) and the importance of such requirements. Many studies have examined customer satisfaction in connection with the importance of requirements, but the relationship between the customer satisfaction scores (coefficients) of the Kano model and users'/customers' self-stated requirements importance has not been sufficiently explored. In this study, an attempt is made to unravel the underlying relationship between the Kano model's customer satisfaction indexes and users'/customers' self-stated requirements importance. The results of the study indicate some interesting associations between the considered variables. These bivariate associations reveal that the customer satisfaction index (SI) and the average satisfaction coefficient (ASC), and the customer dissatisfaction index (DI) and the average satisfaction coefficient (ASC), are highly correlated (r = 96 %), and thus ASC can be used in place of either SI or DI in representing customer satisfaction scores. Also, these Kano model customer satisfaction variables (SI, DI, and ASC) are each associated with self-stated requirements importance (IMP). Further analysis indicates that the value customers or users place on requirements that are met, or on features that are incorporated into a product, influences the level of satisfaction such customers derive from the product.
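    As a rough illustration of the bivariate analysis described above, the sketch below computes Pearson correlations between hypothetical Kano scores. The numbers are invented, and ASC is assumed here to be the mean of SI and |DI| (an assumption for illustration, not necessarily the study's definition).

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-requirement Kano scores (not taken from the study).
si = [0.82, 0.61, 0.45, 0.70, 0.33]            # satisfaction index
di = [-0.55, -0.72, -0.40, -0.65, -0.20]       # dissatisfaction index
asc = [(s + abs(d)) / 2 for s, d in zip(si, di)]  # assumed ASC definition

print(round(pearson_r(si, asc), 3))
print(round(pearson_r([abs(d) for d in di], asc), 3))
```

    With real survey data, a correlation of this kind near 1 is what would justify substituting ASC for SI or DI, as the abstract reports.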

  18. INTEGRATED DATA CAPTURING REQUIREMENTS FOR 3D SEMANTIC MODELLING OF CULTURAL HERITAGE: THE INCEPTION PROTOCOL

    Directory of Open Access Journals (Sweden)

    R. Di Giulio

    2017-02-01

    In order to face these challenges and to start solving the issue of the large amount of captured data and time-consuming processes in the production of 3D digital models, an Optimized Data Acquisition Protocol (DAP) has been set up. The purpose is to guide the processes of digitization of cultural heritage, respecting the needs, requirements and specificities of cultural assets.

  19. How Many Model Evaluations Are Required To Predict The AEP Of A Wind Power Plant?

    DEFF Research Database (Denmark)

    Murcia Leon, Juan Pablo; Réthoré, Pierre-Elouan; Natarajan, Anand;

    2015-01-01

    Wind farm flow models have advanced considerably with the use of large eddy simulations (LES) and Reynolds averaged Navier-Stokes (RANS) computations. The main limitation of these techniques is their high computational time requirements; which makes their use for wind farm annual energy production...

  20. Non-formal techniques for requirements elicitation, modeling, and early assessment for services

    NARCIS (Netherlands)

    van der Veer, Gerrit C.; Vyas, Dhaval; Dittmar, A.; Forbig, P.

    2011-01-01

    Designing systems for multiple stakeholders requires frequent collaboration with multiple stakeholders from the start. In many cases at least some stakeholders lack a professional habit of formal modeling. We report observations from two case studies of stakeholder involvement in early design where

  1. The AstroHDF Effort

    NARCIS (Netherlands)

    J. Masters; A. Alexov; M. Folk; R. Hanisch; G. Heber; M. Wise

    2011-01-01

    Here we update the astronomy community on our effort to deal with the demands of ever-increasing astronomical data size and complexity, using the Hierarchical Data Format, version 5 (HDF5) format (Wise et al. 2011). NRAO, LOFAR and VAO have joined forces with The HDF Group to write an NSF grant, req

  2. Neural Network based Software Effort Estimation: A Survey

    OpenAIRE

    Muhammad Waseem Khan; Imran Qureshi

    2014-01-01

    Software effort estimation is used to estimate how many resources and how many hours are required to develop a software project. Accurate and reliable prediction is the key to the success of a project. There are numerous mechanisms in software effort estimation, but accurate prediction is still a challenge for researchers and software project managers. In this paper, the use of Neural Network techniques for Software Effort Estimation is discussed and evaluated on the basis of MMRE and Predi...
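    The MMRE criterion mentioned in the abstract is straightforward to compute; the sketch below (with invented effort values) shows MMRE alongside the related PRED(0.25) measure commonly reported with it.

```python
def mmre(actual, predicted):
    """Mean Magnitude of Relative Error over paired effort values."""
    return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)

def pred(actual, predicted, level=0.25):
    """PRED(l): fraction of estimates whose relative error is within `level`."""
    hits = sum(1 for a, p in zip(actual, predicted) if abs(a - p) / a <= level)
    return hits / len(actual)

actual = [120, 80, 200, 150]      # hypothetical person-hours
predicted = [100, 90, 210, 120]   # a model's estimates for the same projects

print(round(mmre(actual, predicted), 3))   # → 0.135
print(pred(actual, predicted))             # → 1.0
```

    Lower MMRE and higher PRED values indicate a better estimator, which is how competing neural-network models are typically ranked.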

  3. Modeling the Imprecise Relationship of Goals for Agent-Oriented Requirements Engineering

    Institute of Scientific and Technical Information of China (English)

    SHAO Kun; LIU Zongtian

    2004-01-01

    Agent concepts have been used in a number of recent approaches to requirements engineering (RE), such as KAOS (Knowledge acquisition in automated specification), i* and GBRAM (Goal-based requirements analysis method). The modeling languages used in those approaches only permit precise and unambiguous modeling of system properties and behavior. However, some system problems, particularly those drawn from the agent-oriented problem domain, may be difficult to model in crisp or precise terms. There are several reasons for this. On one hand, a lack of information may produce uncertainty about the class to which an object belongs. If we have enough information or if we are considering sufficient attributes, we should be able to make a precise categorization. On the other hand, uncertainty may also arise from natural imprecision in the requirements descriptions themselves, such as descriptions of soft goals and uncertain concepts. In the second case, classification into precise classes may be impossible, not because we do not have enough information, but because the classes themselves are not naturally discrete. In this paper, we start with a discussion of uncertainty in agent-oriented requirements engineering. Then we propose to handle the uncertainty using fuzzy sets. Finally we refine this proposal to integrate a fuzzy version of Z with the KAOS method. This integration is illustrated on the example of the mine pump. In the conclusion, we compare the advantages of our approach with those of the classical KAOS approach.

  4. A new model to predict acute kidney injury requiring renal replacement therapy after cardiac surgery

    Science.gov (United States)

    Pannu, Neesh; Graham, Michelle; Klarenbach, Scott; Meyer, Steven; Kieser, Teresa; Hemmelgarn, Brenda; Ye, Feng; James, Matthew

    2016-01-01

    Background: Acute kidney injury after cardiac surgery is associated with adverse in-hospital and long-term outcomes. Novel risk factors for acute kidney injury have been identified, but it is unknown whether their incorporation into risk models substantially improves prediction of postoperative acute kidney injury requiring renal replacement therapy. Methods: We developed and validated a risk prediction model for acute kidney injury requiring renal replacement therapy within 14 days after cardiac surgery. We used demographic, and preoperative clinical and laboratory data from 2 independent cohorts of adults who underwent cardiac surgery (excluding transplantation) between Jan. 1, 2004, and Mar. 31, 2009. We developed the risk prediction model using multivariable logistic regression and compared it with existing models based on the C statistic, Hosmer–Lemeshow goodness-of-fit test and Net Reclassification Improvement index. Results: We identified 8 independent predictors of acute kidney injury requiring renal replacement therapy in the derivation model (adjusted odds ratio, 95% confidence interval [CI]): congestive heart failure (3.03, 2.00–4.58), Canadian Cardiovascular Society angina class III or higher (1.66, 1.15–2.40), diabetes mellitus (1.61, 1.12–2.31), baseline estimated glomerular filtration rate (0.96, 0.95–0.97), increasing hemoglobin concentration (0.85, 0.77–0.93), proteinuria (1.65, 1.07–2.54), coronary artery bypass graft (CABG) plus valve surgery (v. CABG only, 1.25, 0.64–2.43), other cardiac procedure (v. CABG only, 3.11, 2.12–4.58) and emergent status for surgery booking (4.63, 2.61–8.21). The 8-variable risk prediction model had excellent performance characteristics in the validation cohort (C statistic 0.83, 95% CI 0.79–0.86). The net reclassification improvement with the prediction model was 13.9% (p < 0.001) compared with the best existing risk prediction model (Cleveland Clinic Score). Interpretation: We have developed
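    The reported odds ratios can be turned into a logistic score by taking their logarithms. The sketch below does exactly that, with two loudly flagged assumptions: the intercept is a placeholder (the abstract does not report one), and continuous predictors (eGFR, hemoglobin) are treated as entering per raw unit without centering, which is a simplification.

```python
import math

# Adjusted odds ratios quoted in the abstract; beta = ln(OR).
log_or = {
    "chf": math.log(3.03),            # congestive heart failure
    "ccs_class_ge3": math.log(1.66),  # CCS angina class III or higher
    "diabetes": math.log(1.61),
    "egfr_per_unit": math.log(0.96),  # per unit of baseline eGFR
    "hgb_per_unit": math.log(0.85),   # per unit of hemoglobin
    "proteinuria": math.log(1.65),
    "emergent": math.log(4.63),       # emergent surgery booking
}
INTERCEPT = -4.0  # hypothetical baseline log-odds, NOT from the paper

def predicted_risk(patient):
    """Logistic-model probability of AKI requiring renal replacement therapy."""
    z = INTERCEPT + sum(log_or[k] * v for k, v in patient.items())
    return 1.0 / (1.0 + math.exp(-z))

patient = {"chf": 1, "diabetes": 1, "egfr_per_unit": 45, "hgb_per_unit": 11,
           "proteinuria": 1, "ccs_class_ge3": 0, "emergent": 0}
print(round(predicted_risk(patient), 4))
```

    The structure, a linear score in log-odds passed through a sigmoid, is what the multivariable logistic regression in the abstract estimates; only the fitted intercept and exact variable coding differ.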

  5. Deriving required model structures to predict global wildfire burned area from multiple satellite and climate observations

    Science.gov (United States)

    Forkel, Matthias; Dorigo, Wouter; Lasslop, Gitta; Teubner, Irene; Chuvieco, Emilio; Thonicke, Kirsten

    2017-04-01

    Vegetation fires have important effects on human infrastructures and ecosystems, and affect atmospheric composition and the climate system. Consequently, it is necessary to accurately represent fire dynamics in global vegetation models to realistically represent the role of fires in the Earth system. However, it is unclear which model structures are required in global vegetation/fire models to represent fire activity at regional to global scales. Here we aim to identify required structural components and necessary complexities of global vegetation/fire models to predict spatial-temporal dynamics of burned area. For this purpose, we developed the SOFIA (satellite observations for fire activity) modelling approach to predict burned area from several satellite and climate datasets. A large ensemble of SOFIA models was generated and each model was optimized against observed burned area data. Models that account for a suppression of fire activity at wet conditions result in the highest performances in predicting burned area. Models that include vegetation optical depth data from microwave satellite observations reach higher performances in predicting burned area than models that do not include this dataset. Vegetation optical depth is a proxy for vegetation biomass, density and water content and thus indicates a strong control of vegetation states and dynamics on fire activity. We further compared the best performing SOFIA models with the global process-oriented vegetation/fire model JSBACH-SPITFIRE, and with the GFED and Fire_CCI burned area datasets. SOFIA models outperform JSBACH-SPITFIRE in predicting regional variabilities of burned area. We further applied the best SOFIA model to identify controlling factors for burned area. The results indicate that fire activity is controlled by regionally diverse and complex interactions of human, vegetation and climate factors. Our results demonstrate that the use of multiple observational datasets on climate, hydrological
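    A SOFIA-style statistical model of the kind described, fuel availability damped by a wet-condition suppression term, might be sketched as follows. The predictor names, functional forms and coefficients are assumptions for illustration, not taken from the paper.

```python
import math

def burned_fraction(vod, soil_moisture, a=2.0, b=0.5, c=12.0, m50=0.25):
    """Illustrative burned-area fraction: fuel availability saturating in
    vegetation optical depth (VOD), multiplied by a logistic suppression
    factor that shuts fire off under wet soil-moisture conditions."""
    fuel = 1.0 - math.exp(-a * vod)
    dryness = 1.0 / (1.0 + math.exp(c * (soil_moisture - m50)))
    return b * fuel * dryness

# Same vegetation state, dry vs wet conditions (hypothetical values).
print(round(burned_fraction(0.6, 0.10), 4))
print(round(burned_fraction(0.6, 0.40), 4))
```

    The key structural feature the abstract highlights, suppression of fire activity at wet conditions, shows up here as the logistic dryness factor; ensembles of such models with different predictor sets are what get optimized against observed burned area.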

  6. Computer-Based Procedures for Field Workers in Nuclear Power Plants: Development of a Model of Procedure Usage and Identification of Requirements

    Energy Technology Data Exchange (ETDEWEB)

    Katya Le Blanc; Johanna Oxstrand

    2012-04-01

    The nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however, most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned from these previous efforts, we are now exploring a less familiar application for computer-based procedures - field procedures, i.e. procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field workers. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how best to design the computer-based procedures to do so. This paper describes the development of a Model of Procedure Use and the qualitative study on which the model is based. The study was conducted in collaboration with four nuclear utilities and five research institutes. During the qualitative study and the model development, requirements for computer-based procedures were identified.

  7. A mathematical model for metabolic tradeoffs, minimal requirements, and evolutionary transitions. (Invited)

    Science.gov (United States)

    Kempes, C.; Hoehler, T. M.; Follows, M. J.; Dutkiewicz, S.

    2013-12-01

    Understanding the minimal energy requirements for life is a difficult challenge because of the great variety of processes required for life. Our approach is to discover general trends applicable to diverse species in order to understand the average constraints faced by life. We then leverage these trends to predict minimal requirements for life. We have focused on broad trends in metabolism, growth, basic bioenergetics, and overall genomic structure and composition. We have developed a simple mathematical model of metabolic partitioning which is able to capture the growth of both single cells and populations of cells for diverse organisms spanning the three domains of life. This model also anticipates the observed interspecific trends in population growth rate and predicts the observed minimum size of a bacterium. Our model connects evolutionary limitations and transitions, including minimal life, to energetic constraints imposed by body architecture and the metabolism of a given species. This model can also be connected to genomic variation across species in order to describe the tradeoffs associated with various genes and their functionality. This forms the basis for a theory of the possibility space for minimal physiological function given evolutionary tradeoffs, general metabolic and biological architecture, and the energetic limitations of the environment.
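    A minimal metabolic-partitioning growth model in the spirit of the abstract splits incoming metabolism between growth and maintenance: B0·m^(3/4) = Em·(dm/dt) + Bm·m, where growth stops when maintenance consumes all metabolism. The parameter values below are illustrative, not the authors' fitted values.

```python
# Partitioning parameters (illustrative): B0 scales metabolic supply,
# Bm is maintenance cost per unit mass, Em is the unit cost of new mass.
B0, Bm, Em = 1.0, 0.2, 10.0

def dm_dt(m):
    """Growth rate: metabolism minus maintenance, divided by growth cost."""
    return (B0 * m ** 0.75 - Bm * m) / Em

def grow(m0, dt=0.01, steps=200000):
    """Forward-Euler integration of the growth ODE from initial mass m0."""
    m = m0
    for _ in range(steps):
        m += dt * dm_dt(m)
    return m

m_max = (B0 / Bm) ** 4  # asymptotic mass: maintenance equals supply
print(round(grow(1.0), 2), m_max)
```

    The closed-form ceiling m_max = (B0/Bm)^4 is the kind of architectural constraint the abstract refers to when connecting bioenergetics to minimal and maximal viable sizes.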

  8. A New Rapid Simplified Model for Urban Rainstorm Inundation with Low Data Requirements

    Directory of Open Access Journals (Sweden)

    Ji Shen

    2016-11-01

    Full Text Available This paper proposes a new rapid simplified inundation model (NRSIM) for flood inundation caused by rainstorms in an urban setting that can simulate the urban rainstorm inundation extent and depth in a data-scarce area. Drainage basins delineated from a floodplain map according to the distribution of the inundation sources serve as the calculation cells of NRSIM. To reduce the data requirements and computational costs of the model, the internal topography of each calculation cell is simplified to a circular cone, and a mass conservation equation based on a volume spreading algorithm is established to simulate the interior water filling process. Moreover, an improved D8 algorithm is outlined for the simulation of water spilling between different cells. The performance of NRSIM is evaluated by comparing the simulated results with those from a traditional rapid flood spreading model (TRFSM) for various resolutions of digital elevation model (DEM) data. The results are as follows: (1) given high-resolution DEM data input, the TRFSM model has better performance in terms of precision than NRSIM; (2) the results from TRFSM are seriously affected by the decrease in DEM data resolution, whereas those from NRSIM are not; and (3) NRSIM always requires less computational time than TRFSM. Apparently, compared with the complex hydrodynamic or traditional rapid flood spreading model, NRSIM has much better applicability and cost-efficiency in real-time urban inundation forecasting for data-sparse areas.
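    The circular-cone simplification makes the volume-depth relation of a calculation cell closed-form: for an inverted cone whose wall rises with side slope s, a stored volume V gives depth h = (3·V·s²/π)^(1/3). The sketch below inverts that relation; the side slope is an assumed parameter, not a value from the paper.

```python
import math

def depth_in_cone_cell(volume, side_slope):
    """Water depth when `volume` fills an inverted circular cone whose wall
    rises `side_slope` vertical units per horizontal unit (a stand-in for
    NRSIM's simplified cell topography)."""
    return (3.0 * volume * side_slope ** 2 / math.pi) ** (1.0 / 3.0)

def volume_in_cone_cell(depth, side_slope):
    """Inverse relation: cone volume V = pi * r^2 * h / 3 with r = h / s."""
    r = depth / side_slope
    return math.pi * r * r * depth / 3.0

# Hypothetical cell: 1000 m^3 of runoff, 2% side slope.
print(round(depth_in_cone_cell(1000.0, 0.02), 3))
```

    A mass-conservation update per cell then reduces to adding inflow volumes and re-evaluating this one formula, which is what keeps the model's data and compute requirements low.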

  9. Cognitive Effort in Modality Retrieval by Young and Older Adults.

    Science.gov (United States)

    Mellinger, Jeanne C.; And Others

    Recent studies of contextual attributes thought to be automatic have reported deficits among the elderly, raising the question of whether automatic memory processing does require some effortful attention and if so, whether such effort is needed during encoding, storage, or retrieval. This study used a secondary task methodology to examine these…

  10. The Telemetry Agile Manufacturing Effort

    Energy Technology Data Exchange (ETDEWEB)

    Brown, K.D.

    1995-01-01

    The Telemetry Agile Manufacturing Effort (TAME) is an agile enterprising demonstration sponsored by the US Department of Energy (DOE). The project experimented with new approaches to product realization and assessed their impacts on performance, cost, flow time, and agility. The purpose of the project was to design the electrical and mechanical features of an integrated telemetry processor, establish the manufacturing processes, and produce an initial production lot of two to six units. This paper outlines the major methodologies utilized by the TAME, describes the accomplishments that can be attributed to each methodology, and finally, examines the lessons learned and explores the opportunities for improvement associated with the overall effort. The areas for improvement are discussed relative to an ideal vision of the future for agile enterprises. By the end of the experiment, the TAME reduced production flow time by approximately 50% and life cycle cost by more than 30%. Product performance was improved compared with conventional DOE production approaches.

  11. Modeling of Car-Following Required Safe Distance Based on Molecular Dynamics

    Directory of Open Access Journals (Sweden)

    Dayi Qu

    2014-01-01

    Full Text Available In the car-following process, certain distances are reserved between vehicles, through which drivers can avoid collisions with the vehicles ahead of and behind them in the same lane and keep a reasonable clearance from vehicles in adjacent lanes. This paper investigates the characteristics of vehicle operating safety in the car-following state based on required safe distance. To tackle this problem, we probe into required safe distance and a car-following model based on molecular dynamics, covering longitudinal and lateral safe distance. The model was developed and implemented to describe the relationship between longitudinal safe distance and lateral safe distance under the condition where the leader keeps uniform deceleration. The results obtained herein are deemed valuable for car-following theory and microscopic traffic simulation.
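    The abstract does not reproduce its formulas, so as a stand-in the sketch below uses a classical braking-based longitudinal safe-distance expression: reaction distance, plus the difference between follower and leader braking distances, plus a standstill margin. All parameter values are illustrative.

```python
def required_safe_distance(v_f, v_l, a_f, a_l, t_react, d_min=2.0):
    """Longitudinal safe gap for a follower at speed v_f (m/s) behind a
    leader at v_l, with maximum decelerations a_f, a_l (m/s^2), driver
    reaction time t_react (s) and a standstill margin d_min (m)."""
    gap = v_f * t_react + v_f ** 2 / (2 * a_f) - v_l ** 2 / (2 * a_l) + d_min
    return max(gap, d_min)

# Follower at 72 km/h closing on a leader at 54 km/h, equal braking ability.
print(round(required_safe_distance(20.0, 15.0, 6.0, 6.0, 1.0), 2))  # → 36.58
```

    When both vehicles travel at the same speed with equal braking ability, the braking terms cancel and the gap reduces to reaction distance plus margin, which matches the intuition behind reserved following distances.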

  12. Required levels of catalysis for emergence of autocatalytic sets in models of chemical reaction systems.

    Science.gov (United States)

    Hordijk, Wim; Kauffman, Stuart A; Steel, Mike

    2011-01-01

    The formation of a self-sustaining autocatalytic chemical network is a necessary but not sufficient condition for the origin of life. The question of whether such a network could form "by chance" within a sufficiently complex suite of molecules and reactions is one that we have investigated for a simple chemical reaction model based on polymer ligation and cleavage. In this paper, we extend this work in several further directions. In particular, we investigate in more detail the levels of catalysis required for a self-sustaining autocatalytic network to form. We study the size of chemical networks within which we might expect to find such an autocatalytic subset, and we extend the theoretical and computational analyses to models in which catalysis requires template matching.
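    The "level of catalysis" bookkeeping in such models can be sketched as follows (illustrative, not the authors' code): each molecule independently catalyzes each reaction with probability p, so f = p * n_reactions is the expected number of reactions catalyzed per molecule, and a given reaction is catalyzed by at least one molecule with probability 1 − (1 − p)^n.

```python
def prob_reaction_catalyzed(n_molecules, p):
    """Probability that at least one of `n_molecules` molecules catalyzes a
    given reaction, each doing so independently with probability p."""
    return 1.0 - (1.0 - p) ** n_molecules

# Hypothetical system size and catalysis level.
n_molecules, n_reactions = 1000, 5000
f = 1.5                       # expected reactions catalyzed per molecule
p = f / n_reactions           # per (molecule, reaction) catalysis probability

print(round(prob_reaction_catalyzed(n_molecules, p), 4))
```

    Sweeping f while testing whether a self-sustaining autocatalytic subset emerges is the kind of computational experiment the paper extends, including to variants where catalysis requires template matching.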

  13. Required Levels of Catalysis for Emergence of Autocatalytic Sets in Models of Chemical Reaction Systems

    Directory of Open Access Journals (Sweden)

    Wim Hordijk

    2011-05-01

    Full Text Available The formation of a self-sustaining autocatalytic chemical network is a necessary but not sufficient condition for the origin of life. The question of whether such a network could form “by chance” within a sufficiently complex suite of molecules and reactions is one that we have investigated for a simple chemical reaction model based on polymer ligation and cleavage. In this paper, we extend this work in several further directions. In particular, we investigate in more detail the levels of catalysis required for a self-sustaining autocatalytic network to form. We study the size of chemical networks within which we might expect to find such an autocatalytic subset, and we extend the theoretical and computational analyses to models in which catalysis requires template matching.

  14. Modeling and verifying Web services driven by requirements: An ontology-based approach

    Institute of Scientific and Technical Information of China (English)

    HOU Lishan; JIN Zhi; WU Budan

    2006-01-01

    Automatic discovery and composition of Web services is an important research area in Web service technology, in which the specification of Web services is a key issue. This paper presents a Web service capability description framework based on the environment ontology. This framework depicts Web service capability in two aspects: the operable environment and the environment changes resulting from behaviors of the Web service. On the basis of the framework, a requirement-driven Web service composition model has been constructed. The paper brings forward the formalization of Web service interactions with the π calculus, and an automatic mechanism converting the conceptual capability description into a formal process expression has been built. This kind of formal specification assists in verifying whether the composite Web service model matches the requirement.

  15. Dependencies among Architectural Views Got from Software Requirements Based on a Formal Model

    Directory of Open Access Journals (Sweden)

    Osis Janis

    2014-12-01

    Full Text Available A system architect has software requirements and some unspecified knowledge about a problem domain (e.g., an enterprise) as source information for assessment and evaluation of possible solutions and for reaching the target point, a preliminary software design. The deciding factor is the architect's experience and expertise in the problem domain (“AS-IS”). The proposed approach is dedicated to assisting a system architect in making an appropriate decision on the solution (“TO-BE”). It is based on a formal mathematical model, the Topological Functioning Model (TFM). Compliant TFMs can be transformed into software architectural views. The paper demonstrates and discusses tracing dependency links from the requirements to and between the architectural views.

  16. A case study in modeling company policy documents as a source of requirements

    Energy Technology Data Exchange (ETDEWEB)

    CRUMPTON,KATHLEEN MARIE; GONZALES,REGINA M.; TRAUTH,SHARON L.

    2000-04-11

    This paper describes an approach that was developed to produce structured models that graphically reflect the requirements contained within a text document. The document used in this research is a draft policy document governing business in a research and development environment. In this paper, the authors present a basic understanding of why this approach is needed, the techniques developed, lessons learned during modeling and analysis, and recommendations for future investigation. The modeling method applied on the policy document was developed as an extension to entity relationship (ER) diagrams, which built in some structural information typically associated with object-oriented techniques. This approach afforded some structure as an analysis tool, while remaining flexible enough to be used with the text document. It provided a visual representation that allowed further analysis and layering of the model to be done.

  17. Expansion of the Kano model to identify relevant customer segments and functional requirements

    DEFF Research Database (Denmark)

    Atlason, Reynir Smari; Stefansson, Arnaldur Smari; Wietz, Miriam

    2017-01-01

    The Kano model of customer satisfaction has been widely used to analyse perceived needs of customers. The model provides product developers valuable information about if, and then how much, a given functional requirement (FR) will impact customer satisfaction if implemented within a product, system or a service. A current limitation of the Kano model is that it does not allow developers to visualise which combined sets of FRs would provide the highest satisfaction between different customer segments. In this paper, a stepwise method to address this particular shortcoming is presented. First, ... are identified. At last, the functions of the chosen segments with the smallest interval define the FRs appealing to the biggest target group. The proposed extension to the model should assist product developers within various fields to more effectively evaluate which FRs should be implemented when considering...

  18. Requirements for tolerances in a CAM-I generalized, solid geometric modeling system

    Energy Technology Data Exchange (ETDEWEB)

    Easterday, R.J.

    1980-01-01

    For a geometric modeling system to support computer-assisted manufacturing, it is necessary that dimensioning and tolerancing information be available in computer-readable form. The requirements of a tolerancing scheme within a geometric modeling system are discussed; they include structure sufficient to characterize the tolerance specifications currently in use by industry, means to associate tolerance structures to the boundary representation, means to create and edit information in the tolerance structures, means to extract information from the data base, and functions to check for completeness and validity of the tolerances. 1 figure, 8 tables. (RWR)

  19. Reduction of wafer-edge overlay errors using advanced correction models, optimized for minimal metrology requirements

    Science.gov (United States)

    Kim, Min-Suk; Won, Hwa-Yeon; Jeong, Jong-Mun; Böcker, Paul; Vergaij-Huizer, Lydia; Kupers, Michiel; Jovanović, Milenko; Sochal, Inez; Ryan, Kevin; Sun, Kyu-Tae; Lim, Young-Wan; Byun, Jin-Moo; Kim, Gwang-Gon; Suh, Jung-Joon

    2016-03-01

    In order to optimize yield in DRAM semiconductor manufacturing for 2x nodes and beyond, the (processing induced) overlay fingerprint towards the edge of the wafer needs to be reduced. Traditionally, this is achieved by acquiring denser overlay metrology at the edge of the wafer, to feed field-by-field corrections. Although field-by-field corrections can be effective in reducing localized overlay errors, the requirement for dense metrology to determine the corrections can become a limiting factor due to a significant increase of metrology time and cost. In this study, a more cost-effective solution has been found in extending the regular correction model with an edge-specific component. This new overlay correction model can be driven by an optimized, sparser sampling especially at the wafer edge area, and also allows for a reduction of noise propagation. Lithography correction potential has been maximized, with significantly less metrology needs. Evaluations have been performed, demonstrating the benefit of edge models in terms of on-product overlay performance, as well as cell based overlay performance based on metrology-to-cell matching improvements. Performance can be increased compared to POR modeling and sampling, which can contribute to (overlay based) yield improvement. Based on advanced modeling including edge components, metrology requirements have been optimized, enabling integrated metrology which drives down overall metrology fab footprint and lithography cycle time.
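    One way to picture an edge-extended correction model is an ordinary least-squares fit of a low-order radial polynomial plus an edge-localized term. Everything below (the edge radius, the coefficients, the synthetic "metrology") is an assumption for illustration, not the production model from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic overlay metrology: smooth radial fingerprint plus a steep
# edge-localized component active only in the outer 10 mm of a 150 mm radius.
r = rng.uniform(0.0, 150.0, 400)              # sample radius, mm
edge = np.maximum(r - 140.0, 0.0)             # assumed edge-term basis function
overlay = 0.5 + 0.01 * r + 0.8 * edge + rng.normal(0.0, 0.3, r.size)

# Design matrix [1, r, r^2, edge]; ordinary least squares recovers both the
# smooth fingerprint and the edge coefficient from sparse sampling.
A = np.column_stack([np.ones_like(r), r, r * r, edge])
coef, *_ = np.linalg.lstsq(A, overlay, rcond=None)
residual = overlay - A @ coef
print(coef.round(3), float(residual.std().round(3)))
```

    Because the edge component is a single extra basis function rather than per-field corrections, it can be driven by far sparser edge metrology, which is the cost argument the abstract makes.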

  20. REQUIREMENTS FOR SYSTEMS DEVELOPMENT LIFE CYCLE MODELS FOR LARGE-SCALE DEFENSE SYSTEMS

    Directory of Open Access Journals (Sweden)

    Kadir Alpaslan DEMIR

    2015-10-01

    Full Text Available Large-scale defense system projects are strategic for maintaining and increasing the national defense capability. Therefore, governments spend billions of dollars in the acquisition and development of large-scale defense systems. The scale of defense systems is always increasing and the costs to build them are skyrocketing. Today, defense systems are software intensive and they are either a system of systems or a part of one. Historically, the project performances observed in the development of these systems have been significantly poor when compared to other types of projects. It is obvious that the currently used systems development life cycle models are insufficient to address today’s challenges of building these systems. Using a systems development life cycle model that is specifically designed for large-scale defense system developments and is effective in dealing with today’s and near-future challenges will help to improve project performances. The first step in the development of a large-scale defense systems development life cycle model is the identification of requirements for such a model. This paper contributes to the body of literature in the field by providing a set of requirements for system development life cycle models for large-scale defense systems. Furthermore, a research agenda is proposed.

  1. Assessing Requirements Quality through Requirements Coverage

    Science.gov (United States)

    Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt

    2008-01-01

    In model-based development, the development effort is centered around a formal description of the proposed software system: the model. This model is derived from some high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. This shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, determining that the model accurately captures the customer's high-level requirements, has received little attention, and the sufficiency of the validation activities has been largely determined through ad-hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software

  2. Multipartite Entanglement Detection with Minimal Effort

    Science.gov (United States)

    Knips, Lukas; Schwemmer, Christian; Klein, Nico; Wieśniak, Marcin; Weinfurter, Harald

    2016-11-01

    Certifying entanglement of a multipartite state is generally considered a demanding task. Since an N-qubit state is parametrized by 4^N − 1 real numbers, one might naively expect that the measurement effort of generic entanglement detection also scales exponentially with N. Here, we introduce a general scheme to construct efficient witnesses requiring a constant number of measurements independent of the number of qubits for states like, e.g., Greenberger-Horne-Zeilinger states, cluster states, and Dicke states. For four qubits, we apply this novel method to experimental realizations of the aforementioned states and prove genuine four-partite entanglement with two measurement settings only.
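    For context, the textbook projector-based witness illustrates why a negative expectation value certifies entanglement. Note that this is the standard construction, not the paper's measurement-efficient scheme, and the four-qubit choice is only for illustration.

```python
import numpy as np

# Illustrative sketch, not the paper's construction: the standard
# projector-based witness W = I/2 - |GHZ><GHZ| certifies genuine
# multipartite entanglement whenever Tr(W rho) < 0.

N = 4                                   # number of qubits (illustrative)
dim = 2 ** N
ghz = np.zeros(dim)
ghz[0] = ghz[-1] = 1 / np.sqrt(2)       # (|0...0> + |1...1>) / sqrt(2)

W = np.eye(dim) / 2 - np.outer(ghz, ghz)

rho_ghz = np.outer(ghz, ghz)            # ideal GHZ state
rho_mix = np.eye(dim) / dim             # maximally mixed (separable) state

val_ghz = np.trace(W @ rho_ghz)         # -0.5: entanglement detected
val_mix = np.trace(W @ rho_mix)         # 0.4375: no detection
print(val_ghz, val_mix)
```

    Evaluating such a witness naively still requires many measurement settings; the paper's contribution is a decomposition into a constant number of local settings, independent of N.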

  3. Modelo de requisitos para sistemas embebidos: Model of requirements for embedded systems

    Directory of Open Access Journals (Sweden)

    Liliana González Palacio

    2008-07-01

    Full Text Available In this paper, a model of requirements for supporting the construction of embedded systems is presented. Currently, the Requirements Engineering methodologies proposed for this domain do not provide continuity in the development process, since they have a strong orientation toward the design stage and a weaker emphasis on the analysis stage. Furthermore, such methodologies provide guidelines for treating requirements after they have been obtained, but they do not propose tools, such as a model of requirements, for obtaining them. This work is part of a research project whose objective is to propose a Requirements Engineering methodology for embedded systems analysis. The proposed model of requirements and its use are illustrated through an application case consisting of obtaining the requirements for a movement-sensing system embedded in a home alarm system.

  4. High resolution weather data for urban hydrological modelling and impact assessment, ICT requirements and future challenges

    Science.gov (United States)

    ten Veldhuis, Marie-claire; van Riemsdijk, Birna

    2013-04-01

    Hydrological analysis of urban catchments requires high-resolution rainfall and catchment information because of the small size of these catchments, the high spatial variability of the urban fabric, fast runoff processes and the related short response times. Rainfall information available from traditional radar and rain gauge networks does not meet the relevant scales of urban hydrology. A new type of weather radar, based on X-band frequency and equipped with Doppler and dual-polarimetry capabilities, promises to provide more accurate rainfall estimates at the spatial and temporal scales that are required for urban hydrological analysis. Recently, the RAINGAIN project was started to analyse the applicability of this new type of radar in the context of urban hydrological modelling. In this project, meteorologists and hydrologists work closely together in several stages of urban hydrological analysis: from the acquisition procedure of novel and high-end radar products to data acquisition and processing, rainfall data retrieval, hydrological event analysis and forecasting. The project comprises four pilot locations with various characteristics of weather radar equipment, ground stations, urban hydrological systems, modelling approaches and requirements. Access to data processing and modelling software is handled in different ways in the pilots, depending on ownership and user context. Sharing of data and software among pilots and with the outside world is an ongoing topic of discussion. The availability of high-resolution weather data augments requirements with respect to the resolution of hydrological models and input data. This has led to the development of fully distributed hydrological models, the implementation of which remains limited by the unavailability of hydrological input data. On the other hand, if models are to be used in flood forecasting, hydrological models need to be computationally efficient to enable fast responses to extreme event conditions.

  5. DEPENDABLE PRIVACY REQUIREMENTS BY AGILE MODELED LAYERED SECURITY ARCHITECTURES – WEB SERVICES CASE STUDY

    Directory of Open Access Journals (Sweden)

    M.Upendra Kumar

    2011-07-01

    Full Text Available Software Engineering covers the definition of processes, techniques and models suitable for its environment to guarantee the quality of results. An important design artifact in any software development project is the software architecture. A primary goal of the architecture is to capture the architectural design decisions, an important part of which consists of architectural design rules. In a Model-Driven Architecture (MDA) context, the design of the system architecture is captured in the models of the system. MDA is a layered approach for modeling the architectural design rules and uses design patterns to improve the quality of the software system. To add security to the software system, security patterns are introduced that offer security at the architectural level. Moreover, agile software development methods are used to build secure systems. There are different methods defined in agile development, such as extreme programming (XP), scrum, feature-driven development (FDD) and test-driven development (TDD). Agile processing includes the phases of agile analysis, agile design and agile testing. These phases are defined in the layers of MDA to provide security at the modeling level, which ensures that addressing security at the system architecture stage will improve the requirements for that system. Agile modeled layered security architectures increase the dependability of the architecture in terms of privacy requirements. We validate this with a case study of the dependability of privacy of Web Services Security Architectures, which helps in building secure service-oriented security architecture. A major part of this paper is devoted to modeling architectural design rules using MDA, so that architects and developers can rely on automatic enforcement on the detailed design, and the rules remain easy for both of them to understand and use.

  6. Price/Earnings Ratio Model through Dividend Yield and Required Yield Above Expected Inflation

    Directory of Open Access Journals (Sweden)

    Emil Mihalina

    2010-07-01

    Full Text Available The price/earnings ratio is the most popular and most widespread evaluation model used to assess relative capital asset value on financial markets. In functional terms, company earnings in the very long term can be described with high significance. Empirically, long-term statistics show that the demanded (required) yield on capital markets exhibits a certain regularity. Thus, investors first require a yield above the stable inflation rate, and then a dividend yield and a capital increase caused by the growth of earnings that influence the price, under the assumption that the P/E ratio is stable. By combining the Gordon model for current dividend value with the model of market capitalization of earnings (the price/earnings ratio), and bearing in mind the influence of general price levels on company earnings, it is possible to adjust the price/earnings ratio by deriving a function of the required yield on capital markets, measured by a market index, through dividend yield and the inflation rate above the stable inflation rate increased by profit growth. The S&P 500 index, for example, has in the last 100 years grown by exactly the inflation rate above the stable inflation rate increased by profit growth. The comparison of two series of price/earnings ratios, a modelled one and an average 7-year ratio, shows a notable correlation in the movement of the two series of variables, with a three-year deviation. Therefore, it could be hypothesized that three years of the expected inflation level, dividend yield and profit growth rate of the market index are discounted in current market prices. The conclusion is that, at the present time, the relationship between the adjusted average price/earnings ratio and its effect on the market index on one hand, and the modelled price/earnings ratio on the other, can clearly show the expected dynamics and course in the following period.
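    The modelled P/E the article derives can be sketched with the Gordon growth model, building the required yield up from a stable inflation rate plus a premium. All numbers below are illustrative assumptions, not the article's estimates.

```python
# Hedged sketch of the article's idea: a modelled P/E from the Gordon
# growth model, with the required yield built from stable inflation plus
# a premium. All parameter values are illustrative assumptions.

def modelled_pe(payout_ratio, required_yield, growth):
    """Forward P/E implied by the Gordon model: P0 / E1 = payout / (r - g)."""
    assert required_yield > growth, "Gordon model requires r > g"
    return payout_ratio / (required_yield - growth)

stable_inflation = 0.03   # long-run stable inflation rate (assumed)
premium          = 0.04   # yield demanded above stable inflation (assumed)
growth           = 0.05   # long-run earnings growth (assumed)

r  = stable_inflation + premium        # required yield on the market index
pe = modelled_pe(0.50, r, growth)      # 0.50 / (0.07 - 0.05) = 25.0
print(pe)
```

    The sensitivity is the point of the exercise: a small change in the spread between the required yield and earnings growth moves the modelled P/E substantially.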

  7. A model predicting fluindione dose requirement in elderly inpatients including genotypes, body weight, and amiodarone.

    Science.gov (United States)

    Moreau, Caroline; Pautas, Eric; Duverlie, Charlotte; Berndt, Celia; Andro, Marion; Mahé, Isabelle; Emmerich, Joseph; Lacut, Karine; Le Gal, Grégoire; Peyron, Isabelle; Gouin-Thibault, Isabelle; Golmard, Jean-Louis; Loriot, Marie-Anne; Siguret, Virginie

    2014-04-01

    Indandione VKAs have been widely used for decades, especially in Eastern Europe and France. Contrary to coumarin VKAs, the relative contribution of individual factors to the indandione-VKA response is poorly known. In the present multicentre study, we sought to develop and validate a model including genetic and non-genetic factors to predict the daily fluindione dose requirement in elderly patients, in whom VKA dosing is challenging. We prospectively recorded clinical and therapeutic data in 230 Caucasian inpatients with a mean age of 85 ± 6 years, who had reached international normalized ratio stabilisation (range 2.0-3.0) on fluindione. In the derivation cohort (n=156), we analysed 13 polymorphisms in seven genes potentially involved in the pharmacological effect or vitamin-K cycle (VKORC1, CYP4F2, EPHX1) and fluindione metabolism/transport (CYP2C9, CYP2C19, CYP3A5, ABCB1). We built a regression model incorporating non-genetic and genetic data and evaluated the model's performance in a separate cohort (n=74). Body weight, amiodarone intake, and VKORC1, CYP4F2 and ABCB1 genotypes were retained in the final model, accounting for 31.5% of dose variability. No influence of CYP2C9 was observed. Our final model showed good performance: in 83.3% of the validation cohort patients, the dose was accurately predicted within 5 mg, i.e., the usual step used for adjusting fluindione dosage. In conclusion, in addition to body weight and amiodarone intake, pharmacogenetic factors (VKORC1, CYP4F2, ABCB1) related to the pharmacodynamic effect and transport of fluindione significantly influenced the dose requirement in elderly patients, while CYP2C9 did not. Studies are required to know whether fluindione could be an alternative VKA in carriers of polymorphic CYP2C9 alleles who are hypersensitive to coumarins.

  8. The Role of Dispersion in Radionuclide Transport - Data and Modeling Requirements: Revision No. 1

    Energy Technology Data Exchange (ETDEWEB)

    Stoller-Navarro Joint Venture

    2004-02-01

    This document is the collaborative effort of the members of an ad hoc subcommittee of the Underground Test Area Project Technical Working Group. This subcommittee was formed to answer questions and concerns raised by the Nevada Division of Environmental Protection to the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office, regarding Pahute Mesa Corrective Action Units (CAUs) 101 and 102. The document synthesizes the comments made by the members of this subcommittee into insights into the role of dispersion in radionuclide transport data and modeling. Dispersion is one of many processes that control the concentration of radionuclides in groundwater beneath the Nevada Test Site, where CAUs 101 and 102 are located. In order to understand the role of dispersion in radionuclide transport, there is a critical need for CAU- or site-specific data related to transport parameters, which are currently lacking, particularly in the case of Western and Central Pahute Mesa. The purpose of this technical basis document is to: (1) define dispersion and its role in contaminant transport, (2) present a synopsis of field-scale dispersion measurements, (3) provide a literature review of theories to explain field-scale dispersion, (4) suggest approaches to account for dispersion in CAU-scale radionuclide modeling, and (5) determine whether additional dispersion measurements should be made at this time.

  9. Mere effort and stereotype threat performance effects.

    Science.gov (United States)

    Jamieson, Jeremy P; Harkins, Stephen G

    2007-10-01

    Although the fact that stereotype threat impacts performance is well established, the underlying process(es) is(are) not clear. Recently, T. Schmader and M. Johns (2003) argued for a working memory interference account, which proposes that performance suffers because cognitive resources are expended on processing information associated with negative stereotypes. The antisaccade task provides a vehicle to test this account because optimal performance requires working memory resources to inhibit the tendency to look at an irrelevant, peripheral cue (the prepotent response) and to generate volitional saccades to the target. If stereotype threat occupies working memory resources, then the ability to inhibit the prepotent response and to launch volitional saccades will be impaired, and performance will suffer. In contrast, S. Harkins's (2006) mere effort account argues that stereotype threat participants are motivated to perform well, which potentiates the prepotent response, but also leads to efforts to counter this tendency if participants recognize that the response is incorrect, know the correct response, and have the opportunity to make it. Results from 4 experiments support the mere effort but not the working memory interference account.

  10. Model requirements for estimating and reporting soil C stock changes in national greenhouse gas inventories

    Science.gov (United States)

    Didion, Markus; Blujdea, Viorel; Grassi, Giacomo; Hernández, Laura; Jandl, Robert; Kriiska, Kaie; Lehtonen, Aleksi; Saint-André, Laurent

    2016-04-01

    Globally, soils are the largest terrestrial store of carbon (C), and small changes may contribute significantly to the global C balance. Due to the potential implications for climate change, accurate and consistent estimates of large-scale C fluxes are important, as recognized, for example, in international agreements such as the United Nations Framework Convention on Climate Change (UNFCCC). Under the UNFCCC, and also under the Kyoto Protocol, C balances must be reported annually. Most measurement-based soil inventories are currently not able to detect annual changes in soil C stocks consistently across space and representatively at national scales. The use of models to obtain relevant estimates is considered an appropriate alternative under the UNFCCC and the Kyoto Protocol. Several soil carbon models have been developed, but few are suitable for consistent application across larger scales. Consistency is often limited by the lack of input data for models, which can result in biased estimates; thus, the reporting criterion of accuracy (i.e., emission and removal estimates are systematically neither over nor under true emissions or removals) may not be met. Based on a qualitative assessment of the ability to meet the criteria established for GHG reporting under the UNFCCC, including accuracy, consistency, comparability, completeness, and transparency, we assessed the suitability of commonly used simulation models for estimating annual C stock changes in mineral soil in European forests. Among the six simulation models discussed, we found a clear trend toward models providing quantitatively precise site-specific estimates, which may lead to biased estimates across space. To meet the reporting needs of national GHG inventories, we conclude that there is a need for models producing qualitatively realistic results in a transparent and comparable manner.
Based on the application of one model along a gradient from Boreal forests in Finland to Mediterranean forests

  11. Requirements on catchment modelling for an optimized reservoir operation in water deficient regions

    Science.gov (United States)

    Froebrich, J.; Kirkby, M. J.; Reder, C.

    2002-12-01

    To provide long-term water security in water deficient regions, the interaction of erosion, pollutant emission, the impact of irrigation areas, the characteristics of ephemeral streams and the resulting water quality in reservoirs must be considered in water management plans. In many semiarid regions, reservoirs are the only source of water, the indispensable element required for human existence. By the year 2000, the world had built more than 45,000 large dams and many more small ones. In these reservoirs, water quality and quantity are affected both by climate change and by catchment land use. Results of past projects indicate that the specific control of reservoirs can lead to a significant improvement of water quality, but reservoirs have already transformed the quantity and quality of surface waters in a remarkable manner. Reservoirs, with their distinct behaviour as reactors, could therefore be considered key elements in semiarid and arid catchments, linking and transforming rivers and channels. Effective practical operation schemes require a thorough knowledge of spatial and temporal variation in water quality and quantity, and simulation models can be used to support the identification of the most effective management potentials at catchment scale. We discuss here the particular requirements for water quality modelling at catchment scale in semiarid and arid regions. Results of reservoir water quality modelling are presented. The potential of catchment models like the PESERA model is demonstrated. Knowledge gaps, such as the consideration of ephemeral streams in catchment models, are addressed and fresh problem-solving strategies are introduced. Erosion models like PESERA can provide important information on sediment transport, thus describing the carrier potential for organic matter, heavy metals and pesticides from terrestrial areas into the water courses. The new EU research project tempQsim will improve understanding of how the organic matter is transformed in river beds

  12. APS Education and Diversity Efforts

    Science.gov (United States)

    Prestridge, Katherine; Hodapp, Theodore

    2015-11-01

    The American Physical Society (APS) has a wide range of education and diversity programs and activities, including programs that improve physics education, increase diversity, provide outreach to the public, and impact public policy. We present the latest programs spearheaded by the Committee on the Status of Women in Physics (CSWP), with highlights from other diversity and education efforts. The CSWP is working to increase the fraction of women in physics, understand and implement solutions for gender-specific issues, enhance professional development opportunities for women in physics, and remedy issues that impact gender inequality in physics. The Conferences for Undergraduate Women in Physics, Professional Skills Development Workshops, and our new Professional Skills program for students and postdocs are all working towards meeting these goals. The CSWP also has site visit and conversation visit programs, where department chairs request that the APS assess the climate for women in their departments or facilitate climate discussions. APS also has two significant programs to increase participation by underrepresented minorities (URM). The newest program, the APS National Mentoring Community, is working to provide mentoring to URM undergraduates, and the APS Bridge Program is an established effort that is dramatically increasing the number of URM PhDs in physics.

  13. A prospective overview of the essential requirements in molecular modeling for nanomedicine design.

    Science.gov (United States)

    Kumar, Pradeep; Khan, Riaz A; Choonara, Yahya E; Pillay, Viness

    2013-05-01

    Nanotechnology has presented many new challenges and opportunities in the area of nanomedicine design. The issues related to nanoconjugation, nanosystem-mediated targeted drug delivery, transitional stability of nanovehicles, the integrity of drug transport, drug-delivery mechanisms and chemical structural design require a pre-estimated and determined course of assumptive actions with property and characteristic estimations for optimal nanomedicine design. Molecular modeling in nanomedicine encompasses these pre-estimations and predictions of pertinent design data via interactive computographic software. Recently, an increasing amount of research has been reported where specialized software is being developed and employed in an attempt to bridge the gap between drug discovery, materials science and biology. This review provides an assimilative and concise incursion into the current and future strategies of molecular-modeling applications in nanomedicine design and aims to describe the utilization of molecular models and theoretical-chemistry computographic techniques for expansive nanomedicine design and development.

  14. A Prognostic Model for One-year Mortality in Patients Requiring Prolonged Mechanical Ventilation

    Science.gov (United States)

    Carson, Shannon S.; Garrett, Joanne; Hanson, Laura C.; Lanier, Joyce; Govert, Joe; Brake, Mary C.; Landucci, Dante L.; Cox, Christopher E.; Carey, Timothy S.

    2009-01-01

    Objective: A measure that identifies patients who are at high risk of mortality after prolonged ventilation will help physicians communicate prognosis to patients or surrogate decision-makers. Our objective was to develop and validate a prognostic model for 1-year mortality in patients ventilated for 21 days or more. Design: Prospective cohort study. Setting: University-based tertiary care hospital. Patients: 300 consecutive medical, surgical, and trauma patients requiring mechanical ventilation for at least 21 days were prospectively enrolled. Measurements and Main Results: Predictive variables were measured on day 21 of ventilation for the first 200 patients and entered into logistic regression models with 1-year and 3-month mortality as outcomes. Final models were validated using data from 100 subsequent patients. One-year mortality was 51% in the development set and 58% in the validation set. Independent predictors of mortality included requirement for vasopressors, hemodialysis, platelet count ≤150 × 10^9/L, and age ≥50. Areas under the ROC curve for the development model and validation model were 0.82 (SE 0.03) and 0.82 (SE 0.05), respectively. The model had sensitivity of 0.42 (SE 0.12) and specificity of 0.99 (SE 0.01) for identifying patients who had ≥90% risk of death at 1 year. Observed mortality was highly consistent with both 3- and 12-month predicted mortality. These four predictive variables can be used in a simple prognostic score that clearly identifies low-risk patients (no risk factors, 15% mortality) and high-risk patients (3 or 4 risk factors, 97% mortality). Conclusions: Simple clinical variables measured on day 21 of mechanical ventilation can identify patients at highest and lowest risk of death from prolonged ventilation. PMID:18552692
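    The four-variable score lends itself to a direct sketch. The thresholds and the 15%/97% mortality figures come from the abstract above; the function names and the intermediate category label are invented, and the published scoring rules may differ in detail.

```python
# Minimal sketch of the four-variable day-21 prognostic score described
# above. Thresholds and mortality figures are from the abstract; function
# names and the intermediate category are invented for illustration.

def prolonged_ventilation_score(vasopressors, hemodialysis,
                                platelets_1e9_per_L, age_years):
    """Count of risk factors on day 21 of mechanical ventilation."""
    return sum([
        vasopressors,
        hemodialysis,
        platelets_1e9_per_L <= 150,
        age_years >= 50,
    ])

def risk_category(score):
    # Abstract: 0 factors -> ~15% 1-year mortality; 3-4 factors -> ~97%.
    if score == 0:
        return "low (~15% 1-year mortality)"
    if score >= 3:
        return "high (~97% 1-year mortality)"
    return "intermediate"

s = prolonged_ventilation_score(True, False, 120, 72)  # 3 risk factors
print(s, risk_category(s))
```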

  15. Evaluation of olive flowering at low latitude sites in Argentina using a chilling requirement model

    Energy Technology Data Exchange (ETDEWEB)

    Aybar, V.E.; Melo-Abreu, J.P. de; Searles, P.S.; Matias, A.G.; Del Rio, C.; Caballero, C. M.; Rousseaux, M.C.

    2015-07-01

    Olive production has expanded significantly from the Mediterranean Basin into the New World over the last two decades. In some cases, cultivars of European origin have been introduced at a large commercial scale with little previous evaluation of potential productivity. The objective of this study was to evaluate whether a temperature-driven simulation model developed in the Mediterranean Basin to predict normal flowering occurrence and flowering date using cultivar-specific thermal requirements was suitable for the low-latitude areas of Northwest Argentina. The model was validated at eight sites over several years and a wide elevation range (350–1200 m above mean sea level) for three cultivars (‘Arbequina’, ‘Frantoio’, ‘Leccino’) with potentially different chilling requirements. In ‘Arbequina’, normal flowering was observed at almost all sites and in all years, while normal flowering events in ‘Frantoio’ and ‘Leccino’ were uncommon. The model successfully predicted whether flowering would be normal in 92% and 83% of the cases for ‘Arbequina’ and ‘Frantoio’, respectively, but was somewhat less successful for ‘Leccino’ (61%). When flowering occurred, the predicted flowering date was within ±7 days of the observed date in 71% of the cases. Overall, the model results indicate that cultivar-specific simulation models may be used as an approximate tool to predict whether individual cultivars will be successful in new growing areas. In Northwest Argentina, the model could be used to identify cultivars to replace ‘Frantoio’ and ‘Leccino’ and to simulate global warming scenarios. (Author)
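    A chilling-requirement check of this general kind can be sketched as a chill-unit accumulator. The threshold and the cultivar requirements below are invented placeholders, not the study's calibrated values; only the qualitative pattern (lower requirement for ‘Arbequina’) follows the abstract.

```python
# Toy sketch of a chilling-requirement check: accumulate chill units below
# a threshold temperature and flag normal flowering only if a cultivar-
# specific requirement is met. The 7.2 C threshold and the requirements
# are invented placeholders, not the study's calibrated values.

def chill_units(hourly_temps_c, threshold_c=7.2):
    """Count hours at or below the chilling threshold."""
    return sum(1 for t in hourly_temps_c if t <= threshold_c)

cultivar_requirement = {"Arbequina": 300, "Frantoio": 600, "Leccino": 600}

winter_temps = [5.0] * 400 + [12.0] * 200   # 400 chill hours accumulated
cu = chill_units(winter_temps)

for cultivar, req in cultivar_requirement.items():
    status = "normal flowering expected" if cu >= req else "chilling unmet"
    print(cultivar, status)
```

    With these placeholder numbers the accumulator reproduces the paper's qualitative finding: the low-requirement cultivar flowers normally while the high-requirement cultivars do not.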

  16. Application and project portfolio valuation using enterprise architecture and business requirements modelling

    Science.gov (United States)

    Quartel, Dick; Steen, Maarten W. A.; Lankhorst, Marc M.

    2012-05-01

    This article describes an architecture-based approach to IT valuation. This approach offers organisations an instrument to valuate their application and project portfolios and to make well-balanced decisions about IT investments. The value of a software application is assessed in terms of its contribution to a selection of business goals. Based on such assessments, the value of different applications can be compared, and requirements for innovation, development, maintenance and phasing out can be identified. IT projects are proposed to realise the requirements. The value of each project is assessed in terms of the value it adds to one or more applications. This value can be obtained by relating the 'as-is' application portfolio to the 'to-be' portfolio that is being proposed by the project portfolio. In this way, projects can be ranked according to their added value, given a certain selection of business goals. The approach uses ArchiMate to model the relationship between software applications, business processes, services and products. In addition, two language extensions are used to model the relationship of these elements to business goals and requirements and to projects and project portfolios. The approach is illustrated using the portfolio method of Bedell and has been implemented in BiZZdesign Architect.

  17. Job Satisfaction, Effort, and Performance: A Reasoned Action Perspective

    OpenAIRE

    Icek Ajzen

    2011-01-01

    In this article the author takes issue with the recurrent reliance on job satisfaction to explain job-related effort and performance. The disappointing findings in this tradition are explained by a lack of compatibility between job satisfaction, a very broad attitude, and the more specific effort and performance criteria. Moreover, attempts to apply the expectancy-value model of attitude to explore the determinants of effort and performance suffer from reliance on unrepresentative sets of bel...

  18. Modelling regional variability of irrigation requirements due to climate change in Northern Germany.

    Science.gov (United States)

    Riediger, Jan; Breckling, Broder; Svoboda, Nikolai; Schröder, Winfried

    2016-01-15

    The question of whether global climate change invalidates the efficiency of established land use practice cannot be answered without systemic considerations on a region-specific basis. In this context, plant water availability and irrigation requirements, respectively, were investigated in Northern Germany. The regions under investigation (Diepholz, Uelzen, Fläming and Oder-Spree) represent a climatic gradient with increasing continentality from West to East. Besides regional climatic variation and climate change, soil conditions and crop management differ on the regional scale. In the model regions, seasonal droughts already influence crop success today, but at different levels of intensity, depending mainly on climate conditions. By linking soil water holding capacities, crop management data and calculations of evapotranspiration and precipitation from the climate change scenario RCP 8.5, irrigation requirements for maintaining crop productivity were estimated for the years 1991 to 2070. Results suggest that the water requirement for crop irrigation is likely to increase, with considerable regional variation. For some of the regions, irrigation requirements might increase to such an extent that the established regional agricultural practice might be hard to retain. Where water availability is limited, agricultural practice, such as management and the cultivated crop spectrum, will have to change to deal with the new challenges.

  19. Modeling Crop Water Requirement at Regional Scales in the Context of Integrated Hydrology

    Science.gov (United States)

    Dogrul, E. C.; Kadir, T.; Brush, C. F.; Chung, F. I.

    2009-12-01

    In developed watersheds, the stresses on surface and subsurface water resources are generally created by groundwater pumping and stream flow diversions to satisfy agricultural and urban water requirements. The application of pumping and diversion to meet these requirements also affects the surface and subsurface water system through recharge of the aquifer and surface runoff back into the streams. The agricultural crop water requirement is a function of climate, soil and land surface physical properties, as well as land use management practices, which are spatially distributed and evolve in time. In almost all modeling studies, pumping and diversions are specified as predefined stresses and are not included in the simulation as an integral and dynamic component of the hydrologic cycle that depends on other hydrologic components. To address this issue, the California Department of Water Resources has been developing a new root zone module that can either be used as a stand-alone modeling tool or be linked to other stream and aquifer modeling tools. The tool, named Integrated Water Flow Model Demand Calculator (IDC), computes crop water requirements under user-specified climatic, land-use and irrigation management settings at regional scales, and routes the precipitation and irrigation water through the root zone using physically based methods. In calculating the crop water requirement, IDC uses an irrigation-scheduling type approach where irrigation is triggered when the soil moisture falls below a user-specified level. Water demands for managed wetlands, urban areas, and agricultural crops, including rice, can either be computed by IDC or specified by the user, depending on the requirements and available data for the modeling project. For areas covered with native vegetation, water demand is not computed and only precipitation is routed through the root zone. Many irrigational practices such as irrigation for leaching, re-use of irrigation return flow, flooding and
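    The irrigation-scheduling idea (trigger irrigation when soil moisture drops below a user-specified level) can be sketched as a simple bucket model. Everything here, from the parameter values to the forcing series, is made up for illustration and is not IDC's actual formulation.

```python
# Illustrative soil-moisture bucket with an irrigation-scheduling trigger,
# loosely following the approach described above: irrigation is applied
# when moisture falls below a user-specified fraction of capacity.
# All parameter values and the ET/precipitation series are invented.

def simulate(capacity_mm, trigger_frac, target_frac, et_mm, precip_mm):
    """Return the irrigation depth applied at each time step."""
    moisture = capacity_mm * target_frac
    applied = []
    for et, p in zip(et_mm, precip_mm):
        moisture = min(capacity_mm, moisture + p) - et   # add rain, remove ET
        moisture = max(0.0, moisture)
        irrigation = 0.0
        if moisture < capacity_mm * trigger_frac:        # scheduling trigger
            irrigation = capacity_mm * target_frac - moisture
            moisture += irrigation                       # refill to target
        applied.append(irrigation)
    return applied

irr = simulate(capacity_mm=100, trigger_frac=0.4, target_frac=0.8,
               et_mm=[6, 6, 6, 6, 6, 6, 6, 6],
               precip_mm=[0, 2, 0, 0, 5, 0, 0, 0])
print(sum(irr))   # total irrigation requirement over the period
```

    In an integrated model this computed demand would then feed back into pumping and diversion, which is the dynamic coupling the abstract argues for.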

  20. Cognitive dissonance in children: justification of effort or contrast?

    Science.gov (United States)

    Alessandri, Jérôme; Darcheville, Jean-Claude; Zentall, Thomas R

    2008-06-01

    Justification of effort is a form of cognitive dissonance in which the subjective value of an outcome is directly related to the effort that went into obtaining it. However, it is likely that in social contexts (such as the requirements for joining a group) an inference can be made (perhaps incorrectly) that an outcome that requires greater effort to obtain in fact has greater value. Here we present evidence that a cognitive dissonance effect can be found in children under conditions that offer better control for the social value of the outcome. This effect is quite similar to contrast effects that recently have been studied in animals. We suggest that contrast between the effort required to obtain the outcome and the outcome itself provides a more parsimonious account of this phenomenon and perhaps other related cognitive dissonance phenomena as well. Research will be needed to identify cognitive dissonance processes that are different from contrast effects of this kind.

  1. Technical support document for proposed revision of the model energy code thermal envelope requirements

    Energy Technology Data Exchange (ETDEWEB)

    Conner, C.C.; Lucas, R.G.

    1993-02-01

    This report documents the development of the proposed revision of the Council of American Building Officials' (CABO) 1993 supplement to the 1992 Model Energy Code (MEC) (referred to as the 1993 MEC) building thermal envelope requirements for single-family and low-rise multifamily residences. The goal of this analysis was to develop revised guidelines based on an objective methodology that determined the most cost-effective (least total life-cycle cost [LCC]) combination of energy conservation measures (ECMs) for residences in different locations. The ECMs with the lowest LCC were used as a basis for proposing revised MEC maximum U_o-value (thermal transmittance) curves in the MEC format. The changes proposed here affect the requirements for "group R" residences. The group R residences are detached one- and two-family dwellings (referred to as single-family) and all other residential buildings three stories or less (referred to as multifamily).

  3. Model of an aquaponic system for minimised water, energy and nitrogen requirements.

    Science.gov (United States)

    Reyes Lastiri, D; Slinkert, T; Cappon, H J; Baganz, D; Staaks, G; Keesman, K J

    2016-01-01

    Water and nutrient savings can be established by coupling water streams between interacting processes. Wastewater from production processes contains nutrients like nitrogen (N), which can and should be recycled in order to meet future regulatory discharge demands. Optimisation of interacting water systems is a complex task. An effective way of understanding, analysing and optimising such systems is by applying mathematical models. The present modelling work aims at supporting the design of a nearly emission-free aquaculture and hydroponic system (aquaponics), thus contributing to sustainable production and to food security for the 21st century. Based on the model, a system that couples 40 m³ fish tanks and a hydroponic system of 1,000 m² can produce 5 tons of tilapia and 75 tons of tomato yearly. The system requires energy to condense and recover evaporated water, and for lighting and heating, adding up to 1.3 GJ/m² every year. In the suggested configuration, the fish can provide about 26% of the N required in a plant cycle. A coupling strategy that sends water from the fish to the plants in amounts proportional to the fish feed input reduces the standard deviation of the NO₃⁻ level in the fish cycle by 35%.
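The feed-proportional coupling strategy and the 26% nitrogen figure from the abstract can be expressed as two small relations. The coefficient `k` below is purely illustrative and not a value from the paper:

```python
# Sketch of the coupling rule described in the abstract: water routed from
# the fish cycle to the hydroponic cycle scales with the daily feed input.
# The proportionality constant k is hypothetical.

def coupling_flow(feed_kg_per_day, k=0.5):
    """m3 of fish-cycle water routed to the plants per day (hypothetical k)."""
    return k * feed_kg_per_day

# The abstract reports that fish effluent supplies about 26% of the nitrogen
# a plant cycle requires; the remainder must come from mineral fertiliser.
def fertiliser_n(plant_n_demand_kg, fish_fraction=0.26):
    """Mineral N (kg) still needed after the fish-supplied share."""
    return plant_n_demand_kg * (1.0 - fish_fraction)
```

With these assumptions, a plant cycle demanding 100 kg of N would still require roughly 74 kg from fertiliser.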

  4. Minimum requirements for predictive pore-network modeling of solute transport in micromodels

    Science.gov (United States)

    Mehmani, Yashar; Tchelepi, Hamdi A.

    2017-10-01

    Pore-scale models are now an integral part of analyzing fluid dynamics in porous materials (e.g., rocks, soils, fuel cells). Pore network models (PNM) are particularly attractive due to their computational efficiency. However, quantitative predictions with PNM have not always been successful. We focus on single-phase transport of a passive tracer under advection-dominated regimes and compare PNM with high-fidelity direct numerical simulations (DNS) for a range of micromodel heterogeneities. We identify the minimum requirements for predictive PNM of transport. They are: (a) flow-based network extraction, i.e., discretizing the pore space based on the underlying velocity field, (b) a Lagrangian (particle tracking) simulation framework, and (c) accurate transfer of particles from one pore throat to the next. We develop novel network extraction and particle tracking PNM methods that meet these requirements. Moreover, we show that certain established PNM practices in the literature can result in first-order errors in modeling advection-dominated transport. They include: all Eulerian PNMs, networks extracted based on geometric metrics only, and flux-based nodal transfer probabilities. Preliminary results for a 3D sphere pack are also presented. The simulation inputs for this work are made public to serve as a benchmark for the research community.

  5. Termination of prehospital resuscitative efforts

    DEFF Research Database (Denmark)

    Mikkelsen, Søren; Schaffalitzky de Muckadell, Caroline; Binderup, Lars Grassmé

    2017-01-01

    BACKGROUND: Discussions on ethical aspects of life-and-death decisions within the hospital are often made in plenary. The prehospital physician, however, may be faced with ethical dilemmas in life-and-death decisions when time-critical decisions to initiate or refrain from resuscitative efforts need to be taken without the possibility to discuss matters with colleagues. Little is known about whether these considerations regarding ethical issues in crucial life-and-death decisions are documented prehospitally. This is a review of the ethical considerations documented in the prehospital medical records. … The medical records with possible documentation of ethical issues were independently reviewed by two philosophers in order to identify explicit ethical or philosophical considerations pertaining to the decision to resuscitate or not. RESULTS: In total, 1275 patients were either declared dead at the scene …

  6. Vitamin D Signaling in the Bovine Immune System: A Model for Understanding Human Vitamin D Requirements

    Directory of Open Access Journals (Sweden)

    Corwin D. Nelson

    2012-03-01

    The endocrine physiology of vitamin D in cattle has been rigorously investigated and has yielded information on vitamin D requirements, endocrine function in health and disease, general metabolism, and maintenance of calcium homeostasis in cattle. These results are relevant to human vitamin D endocrinology. The current debate regarding vitamin D requirements is centered on the requirements for proper intracrine and paracrine vitamin D signaling. Studies in adult and young cattle can provide valuable insight for understanding vitamin D requirements as they relate to innate and adaptive immune responses during infectious disease. In cattle, toll-like receptor recognition activates intracrine and paracrine vitamin D signaling mechanisms in the immune system that regulate innate and adaptive immune responses in the presence of adequate 25-hydroxyvitamin D. Furthermore, experiments with mastitis in dairy cattle have provided in vivo evidence for the intracrine vitamin D signaling mechanism in macrophages as well as vitamin D-mediated suppression of infection. Epidemiological evidence indicates that circulating concentrations above 32 ng/mL of 25-hydroxyvitamin D are necessary for optimal vitamin D signaling in the immune system, but experimental evidence is lacking for that value. Experiments in cattle can provide that evidence, as circulating 25-hydroxyvitamin D concentrations can be experimentally manipulated within ranges that are normal for humans and cattle. Additionally, young and adult cattle can be experimentally infected with bacteria and viruses associated with significant diseases in both cattle and humans. Utilizing the bovine model to further delineate the immunomodulatory role of vitamin D will provide potentially valuable insights into the vitamin D requirements of both humans and cattle, especially as they relate to immune response capacity and infectious disease resistance.

  8. Impaired effort allocation in patients with schizophrenia.

    Science.gov (United States)

    Treadway, Michael T; Peterman, Joel S; Zald, David H; Park, Sohee

    2015-02-01

    A hallmark of negative symptoms in schizophrenia is reduced motivation and goal-directed behavior. While preclinical models suggest that blunted striatal dopamine levels can produce such reductions, this mechanism is inconsistent with evidence for enhanced striatal dopamine levels in schizophrenia. In seeking to reconcile this discrepancy, one possibility is that negative symptoms reflect a failure of striatal motivational systems to mobilize appropriately in response to reward-related information. In the present study, we used a laboratory effort-based decision-making task in a sample of patients with schizophrenia and healthy controls to examine allocation of effort in exchange for varying levels of monetary reward. We found that patients and controls did not differ in the overall amount of effort expenditure, but patients made significantly less optimal choices in terms of maximizing rewards. These results provide further evidence for a selective deficit in the ability of schizophrenia patients to utilize environmental cues to guide reward-seeking behavior.

  9. Estimating required information size by quantifying diversity in random-effects model meta-analyses

    DEFF Research Database (Denmark)

    Wetterslev, Jørn; Thorlund, Kristian; Brok, Jesper;

    2009-01-01

    an intervention effect suggested by trials with low risk of bias. METHODS: Information size calculations need to consider the total model variance in a meta-analysis to control type I and type II errors. Here, we derive an adjusting factor for the required information size under any random-effects model meta-analysis. RESULTS: We devise a measure of diversity (D2) in a meta-analysis, which is the relative variance reduction when the meta-analysis model is changed from a random-effects into a fixed-effect model. D2 is the percentage that the between-trial variability constitutes of the sum of the between-trial variability and the sampling error … and interpreted using several simulations and clinical examples. In addition we show mathematically that diversity is equal to or greater than inconsistency, that is D2 ≥ I2, for all meta-analyses. CONCLUSION: We conclude that D2 seems a better alternative than I2 to consider model variation in any random-effects model meta-analysis …
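Both quantities can be computed directly from trial-level effect estimates and their variances. The sketch below uses the DerSimonian-Laird estimator for the between-trial variance, a standard choice assumed here for illustration rather than necessarily the authors' exact method:

```python
# Compute inconsistency (I2) and diversity (D2) for a meta-analysis.
# I2 = (Q - df)/Q from Cochran's Q; D2 is the relative reduction in the
# variance of the pooled estimate when moving from the random-effects to the
# fixed-effect model. tau2 is the DerSimonian-Laird between-trial variance
# (an assumption for this sketch).

def i2_and_d2(effects, variances):
    k = len(effects)
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    pooled = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - pooled) ** 2 for wi, yi in zip(w, effects))
    i2 = max(0.0, (q - (k - 1)) / q) if q > 0 else 0.0     # inconsistency
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                     # DL between-trial var
    w_star = [1.0 / (v + tau2) for v in variances]         # random-effects weights
    v_fe, v_re = 1.0 / sum(w), 1.0 / sum(w_star)
    d2 = (v_re - v_fe) / v_re                              # diversity
    return i2, d2
```

On any set of heterogeneous trials the result satisfies the abstract's inequality D2 ≥ I2, and both collapse to zero when the trials are perfectly homogeneous.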

  10. Existing and Required Modeling Capabilities for Evaluating ATM Systems and Concepts

    Science.gov (United States)

    Odoni, Amedeo R.; Bowman, Jeremy; Delahaye, Daniel; Deyst, John J.; Feron, Eric; Hansman, R. John; Khan, Kashif; Kuchar, James K.; Pujet, Nicolas; Simpson, Robert W.

    1997-01-01

    ATM systems throughout the world are entering a period of major transition and change. The combination of important technological developments and of the globalization of the air transportation industry has necessitated a reexamination of some of the fundamental premises of existing Air Traffic Management (ATM) concepts. New ATM concepts have to be examined, concepts that may place more emphasis on: strategic traffic management; planning and control; partial decentralization of decision-making; and added reliance on the aircraft to carry out strategic ATM plans, with ground controllers confined primarily to a monitoring and supervisory role. 'Free Flight' is a case in point. In order to study, evaluate and validate such new concepts, the ATM community will have to rely heavily on models and computer-based tools/utilities, covering a wide range of issues and metrics related to safety, capacity and efficiency. The state of the art in such modeling support is adequate in some respects, but clearly deficient in others. It is the objective of this study to assist in: (1) assessing the strengths and weaknesses of existing fast-time models and tools for the study of ATM systems and concepts and (2) identifying and prioritizing the requirements for the development of additional modeling capabilities in the near future. A three-stage process has been followed to this purpose: 1. Through the analysis of two case studies involving future ATM system scenarios, as well as through expert assessment, modeling capabilities and supporting tools needed for testing and validating future ATM systems and concepts were identified and described. 2. Existing fast-time ATM models and support tools were reviewed and assessed with regard to the degree to which they offer the capabilities identified under Step 1. 3. The findings of 1 and 2 were combined to draw conclusions about (1) the best capabilities currently existing, (2) the types of concept testing and validation that can be carried

  11. Meeting Human Reliability Requirements through Human Factors Design, Testing, and Modeling

    Energy Technology Data Exchange (ETDEWEB)

    R. L. Boring

    2007-06-01

    In the design of novel systems, it is important for the human factors engineer to work in parallel with the human reliability analyst to arrive at the safest achievable design that meets design team safety goals and certification or regulatory requirements. This paper introduces the System Development Safety Triptych, a checklist of considerations for the interplay of human factors and human reliability through design, testing, and modeling in product development. This paper also explores three phases of safe system development, corresponding to the conception, design, and implementation of a system.

  12. Current efforts in medical education to incorporate national health priorities.

    Science.gov (United States)

    Nair, Manisha; Fellmeth, Gracia

    2017-08-03

    As a reflection on the Edinburgh Declaration, this conceptual synthesis presents six important challenges in relation to the role of medical education in meeting current national health priorities. This paper presents a conceptual synthesis of current efforts in medical education to incorporate national health priorities as a reflection on how the field has evolved since the Edinburgh Declaration. Considering that health needs vary from country to country, our paper focuses on three broad and cross-cutting themes: health equity, health systems strengthening, and changing patterns of disease. Considering the complexity of this topic, we conducted a targeted search to broadly sample and critically review the literature in two phases. Phase 1: within each theme, we assessed the current challenges in the field of medical education to meet the health priority. Phase 2: a search for various strategies in undergraduate and postgraduate education that have been tested in an effort to address the identified challenges. We conducted a qualitative synthesis of the literature followed by mapping of the identified challenges within each of the three themes with targeted efforts. We identified six important challenges: (i) mismatch between the need for generalist models of health care and medical education curricula's specialist focus; (ii) attitudes of health care providers contributing to disparities in health care; (iii) the lack of a universal approach in preparing medical students for 21st century health systems; (iv) the inability of medical education to keep up with the abundance of new health care technologies; (v) a mismatch between educational requirements for integrated care and poorly integrated, specialised health care systems; and (vi) development of a globally interdependent education system to meet global health challenges. Examples of efforts being made to address these challenges are offered. Although strategies for combatting these challenges exist, the

  13. Emission inventories and modeling requirements for the development of air quality plans. Application to Madrid (Spain).

    Science.gov (United States)

    Borge, Rafael; Lumbreras, Julio; Pérez, Javier; de la Paz, David; Vedrenne, Michel; de Andrés, Juan Manuel; Rodríguez, Ma Encarnación

    2014-01-01

    Modeling is an essential tool for the development of atmospheric emission abatement measures and air quality plans. Most often these plans are related to urban environments with high emission density and population exposure. However, air quality modeling in urban areas is a rather challenging task. As environmental standards become more stringent (e.g. European Directive 2008/50/EC), more reliable and sophisticated modeling tools are needed to simulate measures and plans that may effectively tackle air quality exceedances, common in large urban areas across Europe, particularly for NO₂. This also implies that emission inventories must satisfy a number of conditions, such as consistency across the spatial scales involved in the analysis, consistency with the emission inventories used for regulatory purposes, and versatility to match the requirements of different air quality and emission projection models. This study reports the modeling activities carried out in Madrid (Spain), highlighting the atmospheric emission inventory development and preparation as an illustrative example of the combination of models and data needed to develop a consistent air quality plan at the urban level. These included a series of source apportionment studies to define contributions from international, national, regional and local sources in order to understand to what extent local authorities can enforce meaningful abatement measures. Moreover, source apportionment studies were conducted to define contributions from different sectors and to understand the maximum feasible air quality improvement that can be achieved by reducing emissions from those sectors, thus targeting emission reduction policies at the most relevant activities. Finally, an emission scenario reflecting the effect of such policies was developed and the associated air quality was modeled.

  14. Modeling regulatory policies associated with offshore structure removal requirements in the Gulf of Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Kaiser, Mark J. [Center for Energy Studies, Louisiana State University, Energy Coast and Environment Building, Baton Rouge, LA (United States)

    2008-07-15

    Federal regulations require that a lease in the Outer Continental Shelf of the Gulf of Mexico be cleared of all structures within one year after production on the lease ceases, but in recent years the Minerals Management Service has begun to encourage operators to remove idle (non-producing) structures on producing leases that are no longer "economically viable". At the end of 2003, there were 2175 producing structures, 898 idle (non-producing) structures, and 440 auxiliary (never-producing) structures on 1356 active leases; and 329 idle structures and 65 auxiliary structures on 273 inactive leases. The purpose of this paper is to model the impact of alternative regulatory policies on the removal trends of structures and the inventory of idle iron, and to provide first-order estimates of the cost of each regulatory option. A description of the modeling framework and implementation results is presented. (author)

  15. Feedforward consequences of isometric contractions: effort and ventilation.

    Science.gov (United States)

    Luu, Billy L; Smith, Janette L; Martin, Peter G; McBain, Rachel A; Taylor, Janet L; Butler, Jane E

    2016-08-01

    The onset of voluntary muscle contractions causes rapid increases in ventilation and is accompanied by a sensation of effort. Both the ventilatory response and perception of effort are proportional to contraction intensity, but these behaviors have been generalized from contractions of a single muscle group. Our aim was to determine how these relationships are affected by simultaneous contractions of multiple muscle groups. We examined the ventilatory response and perceived effort of contraction during separate and simultaneous isometric contractions of the contralateral elbow flexors and of an ipsilateral elbow flexor and knee extensor. Subjects made 10-sec contractions at 25, 50, and 100% of maximum during normocapnia and hypercapnia. For simultaneous contractions, both muscle groups were activated at the same intensities. Ventilation was measured continuously and subjects rated the effort required to produce each contraction. As expected, ventilation and perceived effort increased proportionally with contraction intensity during individual contractions. However, during simultaneous contractions, neither ventilation nor effort reflected the combined muscle output. Rather, the ventilatory response was similar to when contractions were performed separately, and effort ratings showed a small but significant increase for simultaneous contractions. Hypercapnia at rest doubled baseline ventilation, but did not affect the difference in perceived effort between separate and simultaneous contractions. The ventilatory response and the sense of effort at the onset of muscle activity are not related to the total output of the motor pathways, or the working muscles, but arise from cortical regions upstream from the motor cortex.

  16. Infrastructure requirement of knowledge management system model of statistical learning tool (SLT) for education community

    Science.gov (United States)

    Abdullah, Rusli; Samah, Bahaman Abu; Bolong, Jusang; D'Silva, Jeffrey Lawrence; Shaffril, Hayrol Azril Mohamed

    2014-09-01

    Today, teaching and learning (T&L) using technology as a tool is becoming more important, especially in the field of statistics as a part of the subject matter in the higher education environment. Even though there are many types of statistical learning tool (SLT) technology that can be used to support and enhance the T&L environment, there is a lack of a common standard knowledge management portal to provide guidance, especially in relation to the infrastructure requirements of SLT in serving the community of users (CoU), such as educators, students and other parties who are interested in using this technology as a tool for their T&L. Therefore, there is a need for a common standard infrastructure requirement of a knowledge portal to help the CoU manage statistical knowledge in acquiring, storing, disseminating and applying the statistical knowledge for their specific purposes. Furthermore, by having this infrastructure requirement of a knowledge portal model of SLT as guidance in promoting knowledge of best practice among the CoU, it can also enhance the quality and productivity of their work towards excellence of statistical knowledge application in the education system environment.

  17. Model-based Assessment for Balancing Privacy Requirements and Operational Capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Knirsch, Fabian [Salzburg Univ. (Austria); Engel, Dominik [Salzburg Univ. (Austria); Frincu, Marc [Univ. of Southern California, Los Angeles, CA (United States); Prasanna, Viktor [Univ. of Southern California, Los Angeles, CA (United States)

    2015-02-17

    The smart grid changes the way energy is produced and distributed. In addition, both energy and information are exchanged bidirectionally among participating parties. Therefore, heterogeneous systems have to cooperate effectively in order to achieve a common high-level use case, such as smart metering for billing or demand response for load curtailment. Furthermore, a substantial amount of personal data is often needed for achieving that goal. Capturing and processing personal data in the smart grid increases customer concerns about privacy; in addition, certain statutory and operational requirements regarding privacy-aware data processing and storage have to be met. An increase of privacy constraints, however, often limits the operational capabilities of the system. In this paper, we present an approach that automates the process of finding an optimal balance between privacy requirements and operational requirements in a smart grid use case and application scenario. This is achieved by formally describing use cases in an abstract model and by finding an algorithm that determines the optimum balance by forward mapping privacy and operational impacts. For this optimal balancing algorithm, both a numeric approximation and, if feasible, an analytic assessment are presented and investigated. The system is evaluated by applying the tool to a real-world use case from the University of Southern California (USC) microgrid.
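The balancing idea, mapping each candidate configuration to a privacy impact and an operational impact and then searching for the best admissible trade-off, can be sketched as a grid search. The impact functions below (metering-interval example) are hypothetical stand-ins, not the paper's actual mappings:

```python
# Sketch: pick a smart-meter reporting interval that maximises operational
# value subject to a privacy-risk ceiling. Both mappings are illustrative
# assumptions, not the model from the paper.

def privacy_impact(interval_min):
    # Hypothetical: finer-grained reporting means higher privacy risk.
    return 60.0 / interval_min

def operational_value(interval_min):
    # Hypothetical: finer-grained reporting improves load forecasting.
    return 1.0 / (1.0 + 0.1 * interval_min)

def best_balance(candidates, max_privacy_risk):
    """Most operationally valuable candidate that still meets the privacy cap."""
    feasible = [c for c in candidates if privacy_impact(c) <= max_privacy_risk]
    return max(feasible, key=operational_value)
```

With candidate intervals of 1, 5, 15, 30 and 60 minutes and a privacy-risk cap of 10, the search settles on 15-minute reporting: the finest granularity whose privacy impact stays within the cap.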

  18. [In-depth interviews and the Kano model to determine user requirements in a burns unit].

    Science.gov (United States)

    González-Revaldería, J; Holguín-Holgado, P; Lumbreras-Marín, E; Núñez-López, G

    To determine the healthcare requirements of patients in a Burns Unit, using qualitative techniques such as in-depth personal interviews and Kano's methodology. Qualitative methodology using in-depth personal interviews (12 patients), Kano's conceptual model, and the SERVQHOS questionnaire (24 patients). All patients had been hospitalised in the last 12 months in the Burns Unit. Using Kano's methodology, service attributes were grouped by affinity diagrams and classified as follows: must-be, attractive (unexpected, great satisfaction), and one-dimensional (linked to the degree of functionality of the service). The outcomes were compared with those obtained with the SERVQHOS questionnaire. From the analysis of the in-depth interviews, 11 requirements were obtained, referring to amenity ("hotel") aspects, information, the need for a closer staff relationship, and organisational aspects. The attributes classified as must-be were free television and automatic TV disconnection at midnight. Those classified as attractive were: an individual room for more privacy, information about dressing-change times in order to avoid anxiety, and additional staff for in-patients. The results were complementary to those obtained with the SERVQHOS questionnaire. In-depth personal interviews provide extra knowledge about patient requirements, complementing the information obtained with questionnaires. With this methodology, more active patient participation is achieved and the companion's opinion is also taken into account.
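Kano's classification works from paired answers to a functional question ("How would you feel if the unit offered X?") and a dysfunctional question ("How would you feel if it did not?"). A sketch of the standard Kano evaluation table follows; the questionnaire wording and answer labels are the conventional ones, shown here for illustration:

```python
# Standard Kano evaluation table. The (functional, dysfunctional) answer pair
# maps to a category: A = attractive, O = one-dimensional, M = must-be,
# I = indifferent, R = reverse, Q = questionable.

ANSWERS = ["like", "must-be", "neutral", "live-with", "dislike"]

TABLE = {  # rows: functional answer; columns: dysfunctional answer (ANSWERS order)
    "like":      ["Q", "A", "A", "A", "O"],
    "must-be":   ["R", "I", "I", "I", "M"],
    "neutral":   ["R", "I", "I", "I", "M"],
    "live-with": ["R", "I", "I", "I", "M"],
    "dislike":   ["R", "R", "R", "R", "Q"],
}

def kano_category(functional, dysfunctional):
    """Classify one attribute from one respondent's answer pair."""
    return TABLE[functional][ANSWERS.index(dysfunctional)]
```

In the study's terms, a must-be attribute such as free television corresponds to respondents who merely expect its presence but dislike its absence, whereas an attractive attribute (an individual room) delights when present and is tolerated when absent.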

  19. Business Process Modelling is an Essential Part of a Requirements Analysis. Contribution of EFMI Primary Care Working Group.

    Science.gov (United States)

    de Lusignan, S; Krause, P; Michalakidis, G; Vicente, M Tristan; Thompson, S; McGilchrist, M; Sullivan, F; van Royen, P; Agreus, L; Desombre, T; Taweel, A; Delaney, B

    2012-01-01

    To perform a requirements analysis of the barriers to conducting research linking primary care, genetic and cancer data. We extended our initial data-centric approach to include socio-cultural and business requirements. We created reference models of core data requirements common to most studies using unified modelling language (UML), dataflow diagrams (DFD) and business process modelling notation (BPMN). We conducted a stakeholder analysis and constructed DFD and UML diagrams for use cases based on simulated research studies. We used research output as a sensitivity analysis. Differences between the reference model and use cases identified study-specific data requirements. The stakeholder analysis identified: tensions, changes in specification, some indifference from data providers, and enthusiastic informaticians urging inclusion of socio-cultural context. We identified requirements to collect information at three levels: micro (data items, which need to be semantically interoperable), meso (the medical record and data extraction), and macro (the health system and socio-cultural issues). BPMN clarified complex business requirements among data providers and vendors, and additional geographical requirements for patients to be represented in both linked datasets. High-quality research output was the norm for most repositories. Reference models provide high-level schemata of the core data requirements. However, business requirements modelling identifies stakeholder issues and what needs to be addressed to enable participation.

  20. Integrating behavioral-motive and experiential-requirement perspectives on psychological needs: a two process model.

    Science.gov (United States)

    Sheldon, Kennon M

    2011-10-01

    Psychological need theories offer much explanatory potential for behavioral scientists, but there is considerable disagreement and confusion about what needs are and how they work. A 2-process model of psychological needs is outlined, viewing needs as evolved functional systems that provide both (a) innate psychosocial motives that tend to impel adaptive behavior and (b) innate experiential requirements that when met reinforce adaptive behavior and promote mental health. The literature is reviewed to find support for 8 hypotheses derived from this model: that certain basic psychosocial motives are present at birth; that successful enactment of these motives supports the functioning and wellness of all humans; that individual differences in these motives develop in childhood; that these strong motive dispositions tend to produce the satisfying experiences they seek; that motive dispositions do not moderate the effect of motive-corresponding need satisfaction on well-being but do moderate the effect of assigned goal-type on rated self-concordance for those goals; that need dissatisfaction and need satisfaction correspond to the separable behavioral-motive and experiential-reward aspects of needs; and that motives and needs can become decoupled when chronic dissatisfaction of particular requirements warps or depresses the corresponding motives, such that the adaptive process fails in its function. Implications for self-determination theory and motive disposition theory are considered.

  1. Control and Effort Costs Influence the Motivational Consequences of Choice

    Science.gov (United States)

    Sullivan-Toole, Holly; Richey, John A.; Tricomi, Elizabeth

    2017-01-01

    The act of making a choice, apart from any outcomes the choice may yield, has, paradoxically, been linked to both the enhancement and the detriment of intrinsic motivation. Research has implicated two factors in potentially mediating these contradictory effects: the personal control conferred by a choice and the costs associated with a choice. Across four experiments, utilizing a physical effort task disguised as a simple video game, we systematically varied costs across two levels of physical effort requirements (Low-Requirement, High-Requirement) and control over effort costs across three levels of choice (Free-Choice, Restricted-Choice, and No-Choice) to disambiguate how these factors affect the motivational consequences of choosing within an effortful task. Together, our results indicated that, in the face of effort requirements, illusory control alone may not sufficiently enhance perceptions of personal control to boost intrinsic motivation; rather, the experience of actual control may be necessary to overcome effort costs and elevate performance. Additionally, we demonstrated that conditions of illusory control, while otherwise unmotivating, can, through association with the experience of free choice, be transformed to have a positive effect on motivation. PMID:28515705

  2. Integrated Data Capturing Requirements for 3d Semantic Modelling of Cultural Heritage: the Inception Protocol

    Science.gov (United States)

    Di Giulio, R.; Maietti, F.; Piaia, E.; Medici, M.; Ferrari, F.; Turillazzi, B.

    2017-02-01

    The generation of high quality 3D models can still be very time-consuming and expensive, and the outcome of digital reconstructions is frequently provided in formats that are not interoperable, and therefore cannot be easily accessed. This challenge is even more crucial for complex architectures and large heritage sites, which involve a large amount of data to be acquired, managed and enriched by metadata. In this framework, the ongoing EU-funded project INCEPTION - Inclusive Cultural Heritage in Europe through 3D semantic modelling - proposes a workflow aimed at the achievement of efficient 3D digitization methods, post-processing tools for enriched semantic modelling, and web-based solutions and applications to ensure wide access for experts and non-experts. In order to face these challenges and to start solving the issue of the large amount of captured data and time-consuming processes in the production of 3D digital models, an Optimized Data Acquisition Protocol (DAP) has been set up. The purpose is to guide the processes of digitization of cultural heritage, respecting the needs, requirements and specificities of cultural assets.

  3. Predictive Modeling of Tacrolimus Dose Requirement Based on High-Throughput Genetic Screening.

    Science.gov (United States)

    Damon, C; Luck, M; Toullec, L; Etienne, I; Buchler, M; Hurault de Ligny, B; Choukroun, G; Thierry, A; Vigneau, C; Moulin, B; Heng, A-E; Subra, J-F; Legendre, C; Monnot, A; Yartseva, A; Bateson, M; Laurent-Puig, P; Anglicheau, D; Beaune, P; Loriot, M A; Thervet, E; Pallet, N

    2017-04-01

    Any biochemical reaction underlying drug metabolism depends on individual gene-drug interactions and on groups of genes interacting together. Based on a high-throughput genetic approach, we sought to identify a set of covariant single-nucleotide polymorphisms predictive of interindividual tacrolimus (Tac) dose requirement variability. Tac blood concentrations (Tac C0) of 229 kidney transplant recipients were repeatedly monitored after transplantation over 3 mo. Given the high dimension of the genomic data in comparison to the low number of observations and the high multicollinearity among the variables (gene variants), we developed an original predictive approach that integrates an ensemble variable-selection strategy, to reinforce the stability of the variable-selection process, and multivariate modeling. Our predictive models explained up to 70% of total variability in Tac C0 per dose with a maximum of 44 gene variants (p-value <0.001 with a permutation test). These models included molecular networks of drug metabolism with oxidoreductase activities and the multidrug-resistant ABCC8 transporter, which was found in the most stringent model. Finally, we identified an intronic variant of the gene encoding SLC28A3, a drug transporter, as a key gene involved in Tac metabolism, and we confirmed it in an independent validation cohort. © 2016 The American Society of Transplantation and the American Society of Transplant Surgeons.

  4. Mental and physical effort affect vigilance differently

    NARCIS (Netherlands)

    Smit, A.S.; Eling, P.A.T.M.; Hopman, M.T.E.; Coenen, A.M.L.

    2005-01-01

    Both physical and mental effort are thought to affect vigilance. Mental effort is known for its vigilance declining effects, but the effects of physical effort are less clear. This study investigated whether these two forms of effort affect the EEG and subjective alertness differently. Participants

  6. Bioenergetics model for estimating food requirements of female Pacific walruses (Odobenus rosmarus divergens)

    Science.gov (United States)

    Noren, S.R.; Udevitz, M.S.; Jay, C.V.

    2012-01-01

    Pacific walruses Odobenus rosmarus divergens use sea ice as a platform for resting, nursing, and accessing extensive benthic foraging grounds. The extent of summer sea ice in the Chukchi Sea has decreased substantially in recent decades, causing walruses to alter habitat use and activity patterns, which could affect their energy requirements. We developed a bioenergetics model to estimate the caloric demand of female walruses, accounting for maintenance, growth, activity (active in-water and hauled-out resting), molt, and reproductive costs. Estimates for non-reproductive females 0–12 yr old (65–810 kg) ranged from 16,359 to 68,960 kcal d⁻¹ (74–257 kcal d⁻¹ kg⁻¹) for years with readily available sea ice, for which we assumed animals spent 83% of their time in water. This translated into the energy content of 3200–5960 clams per day, equivalent to 7–8% and 14–9% of body mass per day for 5–12 and 2–4 yr olds, respectively. Estimated consumption rates of 12 yr old females were minimally affected by pregnancy, but lactation had a large impact, increasing consumption rates to 15% of body mass per day. Increasing the proportion of time in water to 93%, as might happen if walruses were required to spend more time foraging during ice-free periods, increased daily caloric demand by 6–7% for non-lactating females. We provide the first bioenergetics-based estimates of energy requirements for walruses and a first step towards establishing bioenergetic linkages between demography and prey requirements that can ultimately be used in predicting this population's response to environmental change.
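
    The bookkeeping behind such a bioenergetics estimate, a time-weighted sum of in-water and hauled-out metabolic costs scaled by body mass, can be sketched as follows (all rate constants are hypothetical placeholders, not the study's fitted values, and growth, molt and reproduction terms are folded into a single multiplier):

```python
# Illustrative sketch of the daily-energy bookkeeping described in the record.
def daily_energy_kcal(mass_kg: float,
                      frac_in_water: float,
                      active_rate: float = 120.0,    # kcal kg^-1 d^-1, assumed
                      resting_rate: float = 60.0,    # kcal kg^-1 d^-1, assumed
                      lactation_factor: float = 1.0) -> float:
    """Time-weighted daily caloric demand for a female walrus."""
    frac_hauled_out = 1.0 - frac_in_water
    base = mass_kg * (frac_in_water * active_rate + frac_hauled_out * resting_rate)
    return base * lactation_factor

# More time in water (83% -> 93%) raises demand, as the record reports.
e83 = daily_energy_kcal(600, 0.83)
e93 = daily_energy_kcal(600, 0.93)
print(round((e93 / e83 - 1) * 100, 1))  # percent increase -> 5.5
```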

  7. Diverse secreted effectors are required for Salmonella persistence in a mouse infection model.

    Directory of Open Access Journals (Sweden)

    Afshan S Kidwai

    Salmonella enterica serovar Typhimurium causes typhoid-like disease in mice and is a model of typhoid fever in humans. One of the hallmarks of typhoid is persistence, the ability of the bacteria to survive in the host weeks after infection. Virulence factors called effectors facilitate this process by direct transfer to the cytoplasm of infected cells, thereby subverting cellular processes. Secretion of effectors to the cell cytoplasm takes place through multiple routes, including two separate type III secretion (T3SS) apparati as well as outer membrane vesicles. The two T3SS are encoded on separate pathogenicity islands, SPI-1 and -2, with SPI-1 more strongly associated with the intestinal phase of infection, and SPI-2 with the systemic phase. Both T3SS are required for persistence, but the effectors required have not been systematically evaluated. In this study, mutations in 48 described effectors were tested for persistence. We replaced each effector with a specific DNA barcode sequence by allelic exchange and co-infected with a wild-type reference to calculate the ratio of wild-type parent to mutant at different times after infection. The competitive index (CI) was determined by quantitative PCR in which primers that correspond to the barcode were used for amplification. Mutations in all but seven effectors reduced persistence, demonstrating that most effectors were required. One exception was CigR, a recently discovered effector that is widely conserved throughout enteric bacteria. Deletion of cigR increased lethality, suggesting that it may be an anti-virulence factor. The fact that almost all Salmonella effectors are required for persistence argues against redundant functions. This is different from effector repertoires in other intracellular pathogens such as Legionella.

  8. Diverse Secreted Effectors Are Required for Salmonella Persistence in a Mouse Infection Model

    Energy Technology Data Exchange (ETDEWEB)

    Kidwai, Afshan S.; Mushamiri, Ivy T.; Niemann, George; Brown, Roslyn N.; Adkins, Joshua N.; Heffron, Fred

    2013-08-12

    Salmonella enterica serovar Typhimurium causes typhoid-like disease in mice and is a model of typhoid fever in humans. One of the hallmarks of typhoid is persistence, the ability of the bacteria to survive in the host weeks after infection. Virulence factors called effectors facilitate this process by direct transfer to the cytoplasm of infected cells thereby subverting cellular processes. Secretion of effectors to the cell cytoplasm takes place through multiple routes, including two separate type III secretion (T3SS) apparati as well as outer membrane vesicles. The two T3SS are encoded on separate pathogenicity islands, SPI-1 and -2, with SPI-1 more strongly associated with the intestinal phase of infection, and SPI-2 with the systemic phase. Both T3SS are required for persistence, but the effectors required have not been systematically evaluated. In this study, mutations in 48 described effectors were tested for persistence. We replaced each effector with a specific DNA barcode sequence by allelic exchange and co-infected with a wild-type reference to calculate the ratio of wild-type parent to mutant at different times after infection. The competitive index (CI) was determined by quantitative PCR in which primers that correspond to the barcode were used for amplification. Mutations in all but seven effectors reduced persistence demonstrating that most effectors were required. One exception was CigR, a recently discovered effector that is widely conserved throughout enteric bacteria. Deletion of cigR increased lethality, suggesting that it may be an anti-virulence factor. The fact that almost all Salmonella effectors are required for persistence argues against redundant functions. This is different from effector repertoires in other intracellular pathogens such as Legionella.
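
    The competitive index described in this record is a simple ratio of ratios; a minimal sketch, with abundances standing in for the barcode qPCR quantities:

```python
# CI = (mutant / wild-type at output) / (mutant / wild-type at input).
# CI < 1 means the mutant persists less well than the wild-type parent.
def competitive_index(mut_in: float, wt_in: float,
                      mut_out: float, wt_out: float) -> float:
    return (mut_out / wt_out) / (mut_in / wt_in)

# A mutant that falls from parity (1:1 input) to 1:5 of the wild type after
# infection has CI = 0.2, i.e. reduced persistence.
ci = competitive_index(1.0, 1.0, 1.0, 5.0)
print(ci)  # -> 0.2
```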

  9. Analysis and Design Environment for Large Scale System Models and Collaborative Model Development Project

    Data.gov (United States)

    National Aeronautics and Space Administration — As NASA modeling efforts grow more complex and more distributed among many working groups, new tools and technologies are required to integrate their efforts...

  10. Economic growth, biodiversity loss and conservation effort.

    Science.gov (United States)

    Dietz, Simon; Adger, W Neil

    2003-05-01

    This paper investigates the relationship between economic growth, biodiversity loss and efforts to conserve biodiversity using a combination of panel and cross section data. If economic growth is a cause of biodiversity loss through habitat transformation and other means, then we would expect an inverse relationship. But if higher levels of income are associated with increasing real demand for biodiversity conservation, then investment to protect remaining diversity should grow and the rate of biodiversity loss should slow with growth. Initially, economic growth and biodiversity loss are examined within the framework of the environmental Kuznets hypothesis. Biodiversity is represented by predicted species richness, generated for tropical terrestrial biodiversity using a species-area relationship. The environmental Kuznets hypothesis is investigated with reference to comparison of fixed and random effects models to allow the relationship to vary for each country. It is concluded that an environmental Kuznets curve between income and rates of loss of habitat and species does not exist in this case. The role of conservation effort in addressing environmental problems is examined through state protection of land and the regulation of trade in endangered species, two important means of biodiversity conservation. This analysis shows that the extent of government environmental policy increases with economic development. We argue that, although the data are problematic, the implication of these models is that conservation effort can only ever result in a partial deceleration of biodiversity decline, partly because protected areas serve multiple functions and are not necessarily designated to protect biodiversity. Nevertheless, institutional and policy response components of the income-biodiversity relationship are important but are not well captured through cross-country regression analysis.
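
    The "predicted species richness generated using a species-area relationship" mentioned above conventionally takes the power-law form S = cA^z; a sketch with illustrative constants (c and z below are not the values fitted in the paper):

```python
# Species-area relationship: richness S grows sublinearly with habitat area A.
def species_richness(area_km2: float, c: float = 20.0, z: float = 0.25) -> float:
    return c * area_km2 ** z

# Halving habitat area does not halve richness: with z = 0.25 it removes
# only about 16% of species, which is why habitat loss translates into a
# decelerated (but still real) rate of species loss.
loss = 1 - species_richness(50.0) / species_richness(100.0)
print(round(loss * 100, 1))  # -> 15.9
```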

  11. Search, Effort, and Locus of Control

    OpenAIRE

    McGee, Andrew; McGee, Peter

    2011-01-01

    We test the hypothesis that locus of control – one's perception of control over events in life – influences search by affecting beliefs about the efficacy of search effort in a laboratory experiment. We find that reservation offers and effort are increasing in the belief that one's efforts influence outcomes when subjects exert effort without knowing how effort influences the generation of offers but are unrelated to locus of control beliefs when subjects are informed about the relationship b...

  12. Optimal Effort in Consumer Choice : Theory and Experimental Evidence for Binary Choice

    NARCIS (Netherlands)

    Conlon, B.J.; Dellaert, B.G.C.; van Soest, A.H.O.

    2001-01-01

    This paper develops a theoretical model of optimal effort in consumer choice. The model extends previous consumer choice models in that the consumer not only chooses a product, but also decides how much effort to apply to a given choice problem. The model yields a unique optimal level of effort, which

  13. Modelling the semivariograms and cross-semivariograms required in downscaling cokriging by numerical convolution deconvolution

    Science.gov (United States)

    Pardo-Igúzquiza, Eulogio; Atkinson, Peter M.

    2007-10-01

    A practical problem of interest in remote sensing is to increase the spatial resolution of a coarse spatial resolution image by fusing the information of that image with another fine spatial resolution image (from the same sensor or from sensors on different satellites). Thus, the problem is how to introduce spatial 'detail' into a coarse spatial resolution image (decrease the pixel size) such that it is coherent with the spectral information of the image. Cokriging provides a geostatistical solution to the problem and has several interesting advantages: it is a sound statistical method, being unbiased and minimizing a prediction variance (cf. ad hoc procedures); it takes into account the effect of pixel size, the autocorrelation in each image, and the cross-correlation between images; it may be extended to incorporate extra information from other sources; and it provides an estimation of the uncertainty of the final predictions. When formulating the cokriging system, semivariograms and cross-semivariograms (or covariances and cross-covariances) appear, some of which cannot be estimated from data directly. Cross-semivariograms between different variables as well as cross-semivariograms between different supports for the same variable are required. The problem is solved by using linear systems theory, in which any variable for any pixel size is seen as the output of a linear system when the input is the same variable on a point support. In remote-sensing applications, the linear system is specified by the point-spread function (or impulse response) of the sensor. Linear systems theory provides the theoretical relations between the different semivariograms and cross-semivariograms. Overall, one must ensure that the whole set of covariances and cross-covariances is positive-definite, and models must be estimated for non-observed semivariograms and cross-semivariograms. The models must also be realistic, taking into account, for example, the parabolic behaviour
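
    The numerical regularization step this record alludes to, deriving a block-support (pixel) semivariogram from a point-support model by averaging over pairs of discretization points, can be sketched in one dimension (the exponential point model and the discretization density are assumptions for illustration, not the authors' choices):

```python
import math

def gamma_point(h: float, sill: float = 1.0, rng: float = 10.0) -> float:
    """Point-support exponential semivariogram model (assumed form)."""
    return sill * (1.0 - math.exp(-3.0 * h / rng))

def block_points(x0: float, size: float, n: int = 4):
    """Discretize a 1-D block (pixel) of given size into n points."""
    step = size / n
    return [x0 + (i + 0.5) * step for i in range(n)]

def mean_gamma(pts_a, pts_b):
    """Average point semivariogram over all point pairs of two blocks."""
    return sum(gamma_point(abs(a - b)) for a in pts_a for b in pts_b) \
        / (len(pts_a) * len(pts_b))

def gamma_block(h: float, size: float) -> float:
    """Regularized semivariogram between two blocks separated by lag h:
    between-block mean minus within-block mean (gamma-bar formulation)."""
    v = block_points(0.0, size)
    vh = block_points(h, size)
    return mean_gamma(v, vh) - mean_gamma(v, v)

# Regularization smooths the variogram: the block-support value at a given
# lag is lower than the point-support value.
print(gamma_block(5.0, 2.0) < gamma_point(5.0))  # -> True
```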

  14. Directed-energy process technology efforts

    Science.gov (United States)

    Alexander, P.

    1985-06-01

    A summary of directed-energy process technology for solar cells was presented. This technology is defined as directing energy or mass to specific areas on solar cells to produce a desired effect in contrast to exposing a cell to a thermal or mass flow environment. Some of these second generation processing techniques are: ion implantation; microwave-enhanced chemical vapor deposition; rapid thermal processing; and the use of lasers for cutting, assisting in metallization, assisting in deposition, and drive-in of liquid dopants. Advantages of directed energy techniques are: surface heating resulting in the bulk of the cell material being cooler and unchanged; better process control yields; better junction profiles, junction depths, and metal sintering; lower energy consumption during processing and smaller factory space requirements. These advantages should result in higher-efficiency cells at lower costs. The results of the numerous contracted efforts were presented as well as the application potentials of these new technologies.

  15. Estimating Irrigation Water Requirements using MODIS Vegetation Indices and Inverse Biophysical Modeling

    Science.gov (United States)

    Imhoff, Marc L.; Bounoua, Lahouari; Harriss, Robert; Wells, Gordon; Glantz, Michael; Dukhovny, Victor A.; Orlovsky, Leah

    2007-01-01

    An inverse process approach using satellite-driven (MODIS) biophysical modeling was used to quantitatively assess water resource demand in semi-arid and arid agricultural lands by comparing the carbon and water flux modeled under both equilibrium (in balance with prevailing climate) and non-equilibrium (irrigated) conditions. Since satellite observations of irrigated areas show higher leaf area indices (LAI) than is supportable by local precipitation, we postulate that the degree to which irrigated lands vary from equilibrium conditions is related to the amount of irrigation water used. For an observation year we used MODIS vegetation indices, local climate data, and the SiB2 photosynthesis-conductance model to examine the relationship between climate and the water stress function for a given grid-cell and observed leaf area. To estimate the minimum amount of supplemental water required for an observed cell, we added enough precipitation to the prevailing climatology at each time step to minimize the water stress function and bring the soil to field capacity. The experiment was conducted on irrigated lands on the U.S. Mexico border and Central Asia and compared to estimates of irrigation water used.
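
    The core inverse step, adding just enough water at each time step to keep the soil at field capacity and taking the running sum as the minimum supplemental requirement, can be caricatured with a simple bucket model (the study used the full SiB2 photosynthesis-conductance model; everything below, including the numbers, is illustrative only):

```python
# Toy soil-water bucket: rain fills it, evapotranspiration (ET) drains it,
# and "irrigation" tops it back up to field capacity at every step.
def supplemental_water(precip_mm, et_demand_mm, field_capacity_mm=100.0):
    soil = field_capacity_mm
    supplement = 0.0
    for rain, et in zip(precip_mm, et_demand_mm):
        soil = min(soil + rain - et, field_capacity_mm)
        if soil < field_capacity_mm:           # water stress would develop
            supplement += field_capacity_mm - soil
            soil = field_capacity_mm           # irrigate back to capacity
    return supplement

# Arid climate: ET greatly exceeds rainfall, so most of the demand must be
# met by irrigation.
print(supplemental_water([2, 0, 5, 1], [8, 8, 8, 8]))  # -> 24.0
```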

  17. Stochastic modeling to identify requirements for centralized monitoring of distributed wastewater treatment.

    Science.gov (United States)

    Hug, T; Maurer, M

    2012-01-01

    Distributed (decentralized) wastewater treatment can, in many situations, be a valuable alternative to a centralized sewer network and wastewater treatment plant. However, its acceptance depends critically on whether the same overall treatment performance can be achieved without on-site staff, and on whether that performance can be measured. In this paper we argue and illustrate that the system performance depends not only on the design performance and reliability of the individual treatment units, but also significantly on the monitoring scheme, i.e. on the reliability of the process information. For this purpose, we present a simple model of a fleet of identical treatment units. Their performance depends on four stochastic variables: the reliability of the treatment unit, the response time for the repair of failed units, the reliability of on-line sensors, and the frequency of routine inspections. The simulated scenarios show a significant difference between the true performance and the observations by the sensors and inspections. The results also illustrate the trade-off between investing in reactor and sensor technology and in human interventions in order to achieve a certain target performance. Modeling can quantify such effects and thereby support the identification of requirements for the centralized monitoring of distributed treatment units. The model approach is generic and can be extended and applied to various distributed wastewater treatment technologies and contexts.
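
    The four stochastic variables listed above can be combined in a toy Monte Carlo that contrasts true fleet performance with what imperfect sensors report (all rates are hypothetical placeholders, and routine inspections are omitted for brevity, so undetected failures simply persist):

```python
import random

def simulate(n_units=100, days=365, p_fail=0.01, repair_days=7,
             p_sensor_detect=0.8, seed=1):
    """Return (true, observed) fraction of unit-days with working treatment."""
    rng = random.Random(seed)
    down_until = [0] * n_units          # day on which each unit is repaired
    true_up = observed_up = 0
    for day in range(days):
        for u in range(n_units):
            up = day >= down_until[u]
            if up and rng.random() < p_fail:
                up = False
                if rng.random() < p_sensor_detect:
                    # detected failure: repaired after the response time
                    down_until[u] = day + repair_days
                else:
                    # missed by the sensor: stays down (no inspections here)
                    down_until[u] = days
            true_up += up
            # a down unit may still be reported "OK" by a faulty sensor
            observed_up += up or (rng.random() > p_sensor_detect)
    total = n_units * days
    return true_up / total, observed_up / total

true_perf, observed_perf = simulate()
# The monitoring scheme over-reports performance, the gap the record describes.
print(observed_perf >= true_perf)  # -> True
```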

  18. Fast tracking ICT infrastructure requirements and design, based on Enterprise Reference Architecture and matching Reference Models

    DEFF Research Database (Denmark)

    Bernus, Peter; Baltrusch, Rob; Vesterager, Johan;

    2002-01-01

    The Globemen Consortium has developed the virtual enterprise reference architecture and methodology (VERAM), based on GERAM, and developed reference models for virtual enterprise management and joint mission delivery. The planned virtual enterprise capability includes the areas of sales and marketing, global engineering, and customer relationship management. The reference models are the basis for the development of ICT infrastructure requirements. These in turn can be used for ICT infrastructure specification (sometimes referred to as 'ICT architecture'). Part of the ICT architecture is industry-wide, part of it is industry-specific, and a part is specific to the domains of the joint activity that characterises the given Virtual Enterprise Network at hand. The article advocates a step-by-step approach to building virtual enterprise capability.

  19. Requirement for estrogen receptor alpha in a mouse model for human papillomavirus-associated cervical cancer.

    Science.gov (United States)

    Chung, Sang-Hyuk; Wiedmeyer, Kerri; Shai, Anny; Korach, Kenneth S; Lambert, Paul F

    2008-12-01

    The majority of human cervical cancers are associated with the high-risk human papillomaviruses (HPV), which encode the potent E6 and E7 oncogenes. On prolonged treatment with physiologic levels of exogenous estrogen, K14E7 transgenic mice expressing HPV-16 E7 oncoprotein in their squamous epithelia succumb to uterine cervical cancer. Furthermore, prolonged withdrawal of exogenous estrogen results in complete or partial regression of tumors in this mouse model. In the current study, we investigated whether estrogen receptor alpha (ERalpha) is required for the development of cervical cancer in K14E7 transgenic mice. We show that exogenous estrogen fails to promote either dysplasia or cervical cancer in K14E7/ERalpha-/- mice despite the continued presence of the presumed cervical cancer precursor cell type, reserve cells, and evidence for E7 expression therein. We also observed that cervical cancers in our mouse models are strictly associated with atypical squamous metaplasia (ASM), which is believed to be the precursor for cervical cancer in women. Consistently, E7 and exogenous estrogen failed to promote ASM in the absence of ERalpha. We conclude that ERalpha plays a crucial role at an early stage of cervical carcinogenesis in this mouse model.

  20. Shell Inspection History and Current CMM Inspection Efforts

    Energy Technology Data Exchange (ETDEWEB)

    Montano, Joshua Daniel [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-26

    The following report provides a review of past and current CMM Shell Inspection efforts. Calibration of the Sheffield rotary contour gauge has expired and the primary inspector, Matthew Naranjo, has retired. Efforts within the Inspection team are transitioning from maintaining and training new inspectors on Sheffield to off-the-shelf CMM technology. Although inspection of a shell has many requirements, the scope of the data presented in this report focuses on the inner contour, outer contour, radial wall thickness and mass comparisons.

  1. A decision-making framework to model environmental flow requirements in oasis areas using Bayesian networks

    Science.gov (United States)

    Xue, Jie; Gui, Dongwei; Zhao, Ying; Lei, Jiaqiang; Zeng, Fanjiang; Feng, Xinlong; Mao, Donglei; Shareef, Muhammad

    2016-09-01

    The competition for water resources between agricultural and natural oasis ecosystems has become an increasingly serious problem in oasis areas worldwide. Recently, the intensive extension of oasis farmland has led to excessive exploitation of water discharge, and consequently has resulted in a lack of water supply in natural oasis. To coordinate the conflicts, this paper provides a decision-making framework for modeling environmental flows in oasis areas using Bayesian networks (BNs). Three components are included in the framework: (1) assessment of agricultural economic loss due to meeting environmental flow requirements; (2) decision-making analysis using BNs; and (3) environmental flow decision-making under different water management scenarios. The decision-making criterion is determined based on intersection point analysis between the probability of large-level total agro-economic loss and the ratio of total to maximum agro-economic output by satisfying environmental flows. An application in the Qira oasis area of the Tarim Basin, Northwest China indicates that BNs can model environmental flow decision-making associated with agricultural economic loss effectively, as a powerful tool to coordinate water-use conflicts. In the case study, the environmental flow requirement is determined as 50.24%, 49.71% and 48.73% of the natural river flow in wet, normal and dry years, respectively. Without further agricultural economic loss, 1.93%, 0.66% and 0.43% of more river discharge can be allocated to eco-environmental water demands under the combined strategy in wet, normal and dry years, respectively. This work provides a valuable reference for environmental flow decision-making in any oasis area worldwide.

  2. User Requirements from the Climate Modelling Community for Next Generation Global Products from Land Cover CCI Project

    Science.gov (United States)

    Kooistra, Lammert; van Groenestijn, Annemarie; Kalogirou, Vasileios; Arino, Olivier; Herold, Martin

    2011-01-01

    Land Cover has been selected as one of 11 Essential Climate Variables to be elaborated during the first phase of the ESA Climate Change Initiative (2010-2013). In the first stage of the Land Cover CCI project, a user requirements analysis was carried out, on the basis of which the detailed specifications of a global land cover product can be defined that match the requirements of the Global Climate Observing System (GCOS) and the climate modelling community. As part of the requirements analysis, a user consultation mechanism was set up to actively involve different climate modelling groups, by sending out surveys to different types of users within the climate modelling community and the broader land cover data user community. The evolution of requirements from current models to future new modelling approaches was specifically taken into account. In addition, requirements from the GCOS Implementation Plans of 2004 and 2010 and associated strategic earth observation documents for land cover were assessed, and a detailed literature review was carried out. The outcome of the user requirements assessment shows that although the range of requirements coming from the climate modelling community is broad, there is a good match between the requirements coming from different user groups and the broader requirements derived from GCOS, CMUG and other relevant international panels. More specific requirements highlight that future land cover datasets should be both stable and have a dynamic component; deal with the consistency in relationships between land cover classes and land surface parameters; provide flexibility to serve different scales and purposes; and provide transparency of product quality. As a next step within the Land Cover CCI project, the outcome of this user requirements analysis will be used as input for the product specification of the next generation of global land cover datasets.

  3. Change Impact Analysis for SysML Requirements Models based on Semantics of Trace Relations

    NARCIS (Netherlands)

    Hove, ten David; Göknil, Arda; Kurtev, Ivan; Berg, van den Klaas; Goede, de Koos; Oldevik, J.; Olsen, G. K.; Neple, T.; Kolovos, D.

    2009-01-01

    Change impact analysis is one of the applications of requirements traceability in the software engineering community. In this paper, we focus on requirements and requirements relations from a traceability perspective. We provide formal definitions of the requirements relations in SysML for change impact a

  4. Planning intensive care unit design using computer simulation modeling: optimizing integration of clinical, operational, and architectural requirements.

    Science.gov (United States)

    OʼHara, Susan

    2014-01-01

    Nurses have increasingly been regarded as critical members of the planning team as architects recognize their knowledge and value. But the nurses' role as knowledge experts can be expanded to leading efforts to integrate clinical, operational, and architectural expertise through simulation modeling. Simulation modeling allows for the optimal merging of multifactorial data to understand the current state of the intensive care unit and predict future states. Nurses can champion the simulation modeling process and reap the benefits of a cost-effective way to test new designs, processes, staffing models, and future programming trends prior to implementation. Simulation modeling is an evidence-based planning approach, a standard for integrating the sciences with real client data to offer solutions for improving patient care.
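    As a hedged illustration of the kind of question such simulation modeling answers (for example, how many beds keep patient diversions acceptably rare), the sketch below is a toy occupancy model; the arrival and stay parameters are invented for illustration, not taken from the study.

```python
import random

def simulate_icu(beds, arrivals_per_day, mean_stay_days, days, seed=42):
    """Toy discrete-time simulation of ICU bed occupancy.

    Patients arrive roughly as a Poisson process (approximated as one
    Bernoulli trial per hour) and occupy a bed for an exponentially
    distributed stay; arrivals that find no free bed are counted as
    diverted.
    """
    rng = random.Random(seed)
    hours = days * 24
    p_arrival = arrivals_per_day / 24.0   # expected arrivals per hour
    discharge_times = []                  # occupied beds, by free-up hour
    diverted = admitted = 0
    for hour in range(hours):
        # release beds whose stay has ended
        discharge_times = [t for t in discharge_times if t > hour]
        if rng.random() < p_arrival:
            if len(discharge_times) < beds:
                stay = rng.expovariate(1.0 / (mean_stay_days * 24))
                discharge_times.append(hour + stay)
                admitted += 1
            else:
                diverted += 1
    total = admitted + diverted
    return admitted, diverted, (diverted / total if total else 0.0)

# hypothetical scenario: 10 beds, 6 arrivals/day, 3-day mean stay
admitted, diverted, diversion_rate = simulate_icu(10, 6, 3, days=60)
```

    Re-running the same scenario with different bed counts or staffing-driven stay lengths is the "test before you build" benefit the abstract describes.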

  5. The Design of Effective ICT-Supported Learning Activities: Exemplary Models, Changing Requirements, and New Possibilities

    Directory of Open Access Journals (Sweden)

    Cameron Richards

    2005-01-01

    Full Text Available Despite the imperatives of policy and rhetoric about their integration in formal education, Information and Communication Technologies (ICTs) are often used as an "add-on" in many classrooms and in many lesson plans. Nevertheless, many teachers find that interesting and well-planned tasks, projects, and resources provide a key to harnessing the educational potential of digital resources, Internet communications and interactive multimedia to engage the interest, interaction, and knowledge construction of young learners. To the extent that such approaches go beyond and transform traditional "transmission" models of teaching and formal lesson planning, this paper investigates the changing requirements and new possibilities represented by the challenge of integrating ICTs in education in a way which at the same time connects more effectively with both the specific contents of the curriculum and the various stages and elements of the learning process. Case studies from teacher education foundation courses provide an exemplary focus of inquiry in order to better link relevant new theories or models of learning with practice, to build upon related learner-centered strategies for integrating ICT resources and tools, and to incorporate interdependent functions of learning as information access, communication, and applied interactions. As one possible strategy in this direction, the concept of an "ICT-supported learning activity" suggests the need for teachers to approach this increasing challenge more as "designers" of effective and integrated learning rather than mere "transmitters" of skills or information through an add-on use of ICTs.

  6. Model Based User's Access Requirement Analysis of E-Governance Systems

    Science.gov (United States)

    Saha, Shilpi; Jeon, Seung-Hwan; Robles, Rosslin John; Kim, Tai-Hoon; Bandyopadhyay, Samir Kumar

    The strategic and contemporary importance of e-governance has been recognized across the world. In India too, various ministries of the Government of India and State Governments have taken e-governance initiatives to provide e-services to citizens and the businesses they serve. To achieve the mission objectives and make such e-governance initiatives successful, it is necessary to improve the trust and confidence of the stakeholders. It is assumed that the delivery of government services will share the same public network information that is being used in the community at large. In particular, the Internet will be the principal means by which public access to government and government services will be achieved. To provide security measures, the main aim is to identify the users' access requirements for the stakeholders, which are then analyzed according to the models of Nath's approach. Based on this analysis, the Government can also set security standards based on the e-governance models. Thus there will be fewer human errors and less bias. This analysis leads to the security architecture of the specific G2C application.

  7. NASA Software Engineering Benchmarking Effort

    Science.gov (United States)

    Godfrey, Sally; Rarick, Heather

    2012-01-01

    Benchmarking was very interesting and provided a wealth of information: (1) we saw potential solutions to some of our "top 10" issues, and (2) we gained an assessment of where NASA stands in relation to other aerospace/defense groups. We formed new contacts and potential collaborations: (1) several organizations sent us examples of their templates and processes, and (2) many of the organizations were interested in future collaboration, such as sharing of training, metrics, Capability Maturity Model Integration (CMMI) appraisers, instructors, etc. We also received feedback from some of our contractors/partners: (1) desires to participate in our training and provide feedback on procedures, and (2) a welcomed opportunity to provide feedback on working with NASA.

  8. Geosynchronous platform definition study. Volume 4, Part 2: Traffic analysis and system requirements for the new traffic model

    Science.gov (United States)

    1973-01-01

    A condensed summary of the traffic analyses and systems requirements for the new traffic model is presented. The results of each study activity are explained, key analyses are described, and important results are highlighted.

  9. A qualitative readiness-requirements assessment model for enterprise big-data infrastructure investment

    Science.gov (United States)

    Olama, Mohammed M.; McNair, Allen W.; Sukumar, Sreenivas R.; Nutaro, James J.

    2014-05-01

    In the last three decades, there has been an exponential growth in the area of information technology providing the information processing needs of data-driven businesses in government, science, and private industry in the form of capturing, staging, integrating, conveying, analyzing, and transferring data that will help knowledge workers and decision makers make sound business decisions. Data integration across enterprise warehouses is one of the most challenging steps in the big data analytics strategy. Several levels of data integration have been identified across enterprise warehouses: data accessibility, common data platform, and consolidated data model. Each level of integration has its own set of complexities that requires a certain amount of time, budget, and resources to implement. Such levels of integration are designed to address the technical challenges inherent in consolidating the disparate data sources. In this paper, we present a methodology based on industry best practices to measure the readiness of an organization and its data sets against the different levels of data integration. We introduce a new Integration Level Model (ILM) tool, which is used for quantifying an organization and data system's readiness to share data at a certain level of data integration. It is based largely on the established and accepted framework provided in the Data Management Association (DAMA-DMBOK). It comprises several key data management functions and supporting activities, together with several environmental elements that describe and apply to each function. The proposed model scores the maturity of a system's data governance processes and provides a pragmatic methodology for evaluating integration risks. The higher the computed scores, the better managed the source data system and the greater the likelihood that the data system can be brought in at a higher level of integration.
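    The record does not publish the ILM's actual functions, weights, or thresholds. As a minimal sketch of how such a weighted readiness scorecard could work, the example below invents every function name, weight, and threshold purely for illustration (the function areas are loosely inspired by DAMA-DMBOK).

```python
# Hypothetical readiness scorecard in the spirit of the ILM tool:
# each data-management function is rated 1-5 and weighted; the total
# suggests the highest integration level a source system is ready for.
WEIGHTS = {
    "data_governance": 0.30,
    "data_quality": 0.25,
    "metadata_management": 0.20,
    "data_security": 0.15,
    "data_architecture": 0.10,
}

LEVEL_THRESHOLDS = [            # (minimum score, integration level)
    (4.0, "consolidated data model"),
    (3.0, "common data platform"),
    (0.0, "data accessibility"),
]

def readiness(ratings):
    """Weighted maturity score (1-5) and the suggested integration level."""
    score = sum(WEIGHTS[k] * ratings[k] for k in WEIGHTS)
    for threshold, level in LEVEL_THRESHOLDS:
        if score >= threshold:
            return round(score, 2), level

score, level = readiness({
    "data_governance": 4, "data_quality": 3, "metadata_management": 3,
    "data_security": 4, "data_architecture": 2,
})
```

    The higher-scores-mean-higher-integration relationship mirrors the abstract's claim; the specific cut-offs here are placeholders.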

  10. A Qualitative Readiness-Requirements Assessment Model for Enterprise Big-Data Infrastructure Investment

    Energy Technology Data Exchange (ETDEWEB)

    Olama, Mohammed M [ORNL; McNair, Wade [ORNL; Sukumar, Sreenivas R [ORNL; Nutaro, James J [ORNL

    2014-01-01

    In the last three decades, there has been an exponential growth in the area of information technology providing the information processing needs of data-driven businesses in government, science, and private industry in the form of capturing, staging, integrating, conveying, analyzing, and transferring data that will help knowledge workers and decision makers make sound business decisions. Data integration across enterprise warehouses is one of the most challenging steps in the big data analytics strategy. Several levels of data integration have been identified across enterprise warehouses: data accessibility, common data platform, and consolidated data model. Each level of integration has its own set of complexities that requires a certain amount of time, budget, and resources to implement. Such levels of integration are designed to address the technical challenges inherent in consolidating the disparate data sources. In this paper, we present a methodology based on industry best practices to measure the readiness of an organization and its data sets against the different levels of data integration. We introduce a new Integration Level Model (ILM) tool, which is used for quantifying an organization and data system's readiness to share data at a certain level of data integration. It is based largely on the established and accepted framework provided in the Data Management Association (DAMA-DMBOK). It comprises several key data management functions and supporting activities, together with several environmental elements that describe and apply to each function. The proposed model scores the maturity of a system's data governance processes and provides a pragmatic methodology for evaluating integration risks. The higher the computed scores, the better managed the source data system and the greater the likelihood that the data system can be brought in at a higher level of integration.

  11. 40 CFR Table 6 to Subpart Bbbb of... - Model Rule-Requirements for Validating Continuous Emission Monitoring Systems (CEMS)

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 6 2010-07-01 2010-07-01 false Model Rule-Requirements for Validating Continuous Emission Monitoring Systems (CEMS) 6 Table 6 to Subpart BBBB of Part 60 Protection of Environment...—Requirements for Validating Continuous Emission Monitoring Systems (CEMS) For the following continuous...

  12. Knowing requires data

    Science.gov (United States)

    Naranjo, Ramon C.

    2017-01-01

    Groundwater-flow models are often calibrated using a limited number of observations relative to the unknown inputs required for the model. This is especially true for models that simulate groundwater/surface-water interactions. In this case, subsurface temperature sensors can be an efficient means for collecting long-term data that capture the transient nature of physical processes such as seepage losses. A continuous and spatially dense network of diverse observation data can be used to improve knowledge of important physical drivers and to conceptualize and calibrate variably saturated groundwater-flow models. An example is presented in which the results of such an analysis were used to help guide irrigation districts and water-management decisions on costly upgrades to conveyance systems to improve water usage, farm productivity, and restoration efforts to improve downstream water quality and ecosystems.

  13. Data sharing requirements of supply - And logistics innovations - Towards a maturity model

    NARCIS (Netherlands)

    Hofman, W.J.

    2016-01-01

    Supply- and logistics innovations require data from different, heterogeneous sources. Supply chain resilience, for instance, requires visibility of goods flows and data on planned infrastructure maintenance and unforeseen accidents or incidents that may cause delays. Technically, there are different wa

  14. Improving groundwater management in rural India using simple modeling tools with minimal data requirements

    Science.gov (United States)

    Moysey, S. M.; Oblinger, J. A.; Ravindranath, R.; Guha, C.

    2008-12-01

    shortly after the start of the monsoon and villager water use is small compared to the other fluxes. Groundwater fluxes were accounted for by conceptualizing the contributing areas upstream and downstream of the reservoir as one-dimensional flow tubes. This description of the flow system allows for the definition of physically-based parameters making the model useful for investigating WHS infiltration under a variety of management scenarios. To address concerns regarding the uniqueness of the model parameters, 10,000 independent model calibrations were performed using randomly selected starting parameters. Based on this Monte Carlo analysis, it was found that the mean volume of water contributed by the WHS to infiltration over the study period (Sept.-Dec., 2007) was 48.1×10³ m³ with a 95% confidence interval of 43.7-53.7×10³ m³. This volume represents 17-21% of the total natural groundwater recharge contributed by the entire watershed, which was determined independently using a surface water balance. Despite the fact that the model is easy to use and requires minimal data, the results obtained provide a powerful quantitative starting point for managing groundwater withdrawals in the dry season.
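    The Monte Carlo check described above, repeating the calibration from random starting parameters and summarizing the spread of the estimate as a mean and 95% interval, can be sketched generically. The stand-in "calibration" below simply draws from a distribution chosen for illustration; it is not the study's water-balance model.

```python
import random
import statistics

def percentile(sorted_vals, q):
    """Linear-interpolated percentile of a pre-sorted list (0 <= q <= 1)."""
    idx = q * (len(sorted_vals) - 1)
    lo, hi = int(idx), min(int(idx) + 1, len(sorted_vals) - 1)
    frac = idx - lo
    return sorted_vals[lo] * (1 - frac) + sorted_vals[hi] * frac

def monte_carlo_ci(run_calibration, n=10_000, seed=1):
    """Repeat a calibration from random starts and summarize the spread of
    the estimated quantity as a mean and a 95% percentile interval."""
    rng = random.Random(seed)
    estimates = sorted(run_calibration(rng) for _ in range(n))
    return (statistics.mean(estimates),
            percentile(estimates, 0.025),
            percentile(estimates, 0.975))

# toy stand-in for one calibration run: an estimated seepage volume (10^3 m^3)
mean_vol, ci_lo, ci_hi = monte_carlo_ci(lambda rng: rng.gauss(48.1, 2.5))
```

    In the real study each `run_calibration` would be a full model calibration from a random initial parameter set; the percentile interval then expresses how much the final estimate depends on the starting point.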

  15. PROTEINURIA AND ACUTE PHYSICAL EFFORT

    Directory of Open Access Journals (Sweden)

    Radu M.D.

    2015-08-01

    Full Text Available It is well known that intense exercise leads to increased urinary excretion of protein, a phenomenon encountered both in experimental models in laboratory animals and in amateur and professional athletes, although the mechanism of this proteinuria is still unclear. Proteinuria is an important marker of the physiological integrity of the excretory system, and post-exercise proteinuria reflects the effect of physical exercise on renal function. During physical exertion, blood flow is directed mainly towards the skeletal muscles at work, to the detriment of many organs that are subjected to transient ischemia. Ischemic reperfusion is an important source of activation and generation of oxygen free radicals in the organs that passively support exercise. The biochemical and functional effects induced by oxygen free radicals play an important role in urinary protein excretion. Therefore, exercise induces oxidative stress not only in working skeletal muscle. The experimental study quantifies the biochemical and functional adaptation of the kidney to a single workout. The experimental results suggest that functional alterations of nephron membranes, due to the actions of oxygen free radicals, are a cause of post-exercise proteinuria in laboratory animals.

  16. Strength of Intentional Effort Enhances the Sense of Agency

    Directory of Open Access Journals (Sweden)

    Rin Minohara

    2016-08-01

    Full Text Available Sense of agency refers to the feeling of controlling one's own actions, and the experience of controlling external events with one's actions. The present study examined the effect of strength of intentional effort on the sense of agency. We manipulated the strength of intentional effort using three types of buttons that differed in the amount of force required to depress them. We used a self-attribution task as an explicit measure of the sense of agency. The results indicate that strength of intentional effort enhanced self-attribution when action-effect congruency was unreliable. We conclude that intentional effort importantly affects the integration of multiple cues in explicit judgments of agency when the causal relationship between action and effect is unreliable.

  17. REQUIREMENTS PATTERNS FOR FORMAL CONTRACTS IN ARCHITECTURAL ANALYSIS AND DESIGN LANGUAGE (AADL) MODELS

    Science.gov (United States)

    2017-04-17

    Requirements are often used to specify the behavior of complex cyber-physical systems. The process of transforming these requirements to a formal... Keywords: Cyber-physical Systems, Formal Methods, Requirements Patterns, AADL, Assume Guarantee Reasoning Environment. Approved for Public Release; Distribution Unlimited. Modern aircraft are complex cyber-physical systems with safety...

  18. Application of a hydrodynamic and sediment transport model for guidance of response efforts related to the Deepwater Horizon oil spill in the Northern Gulf of Mexico along the coast of Alabama and Florida

    Science.gov (United States)

    Plant, Nathaniel G.; Long, Joseph W.; Dalyander, P. Soupy; Thompson, David M.; Raabe, Ellen A.

    2013-01-01

    U.S. Geological Survey (USGS) scientists have provided a model-based assessment of transport and deposition of residual Deepwater Horizon oil along the shoreline within the northern Gulf of Mexico in the form of mixtures of sand and weathered oil, known as surface residual balls (SRBs). The results of this USGS research, in combination with results from other components of the overall study, will inform operational decisionmaking. The results will provide guidance for response activities and data collection needs during future oil spills. In May 2012 the U.S. Coast Guard, acting as the Deepwater Horizon Federal on-scene coordinator, chartered an operational science advisory team to provide a science-based review of data collected and to conduct additional directed studies and sampling. The goal was to characterize typical shoreline profiles and morphology in the northern Gulf of Mexico to identify likely sources of residual oil and to evaluate mechanisms whereby reoiling phenomena may be occurring (for example, burial and exhumation and alongshore transport). A steering committee cochaired by British Petroleum Corporation (BP) and the National Oceanic and Atmospheric Administration (NOAA) is overseeing the project and includes State on-scene coordinators from four States (Alabama, Florida, Louisiana, and Mississippi), trustees of the U.S. Department of the Interior (DOI), and representatives from the U.S. Coast Guard. This report presents the results of hydrodynamic and sediment transport models and developed techniques for analyzing potential SRB movement and burial and exhumation along the coastline of Alabama and Florida. Results from these modeling efforts are being used to explain the complexity of reoiling in the nearshore environment and to broaden consideration of the different scenarios and difficulties that are being faced in identifying and removing residual oil. For instance, modeling results suggest that larger SRBs are not, under the most commonly

  19. Automata learning algorithms and processes for providing more complete systems requirements specification by scenario generation, CSP-based syntax-oriented model construction, and R2D2C system requirements transformation

    Science.gov (United States)

    Hinchey, Michael G. (Inventor); Margaria, Tiziana (Inventor); Rash, James L. (Inventor); Rouff, Christopher A. (Inventor); Steffen, Bernard (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments, automata learning algorithms and techniques are implemented to generate a more complete set of scenarios for requirements based programming. More specifically, a CSP-based, syntax-oriented model construction, which requires the support of a theorem prover, is complemented by model extrapolation, via automata learning. This may support the systematic completion of the requirements, the nature of the requirement being partial, which provides focus on the most prominent scenarios. This may generalize requirement skeletons by extrapolation and may indicate by way of automatically generated traces where the requirement specification is too loose and additional information is required.

  20. A case study of regional catchment water quality modelling to identify pollution control requirements.

    Science.gov (United States)

    Crabtree, B; Seward, A J; Thompson, L

    2006-01-01

    There are four ecologically important river catchments that contain candidate Special Areas of Conservation (cSACs) under the Habitats Directive in the Lake District National Park located in the North of England. These are the rivers Ehen, Kent, Derwent and Eden. For each cSAC, there are defined ecological criteria that include water quality targets to protect the designated species. Stretches of the riverine cSACs in each catchment are failing to meet these and other water quality targets. The Environment Agency commissioned a study of each catchment to provide the underpinning scientific knowledge to allow it to deliver its statutory obligations under the Habitats Directive. SIMCAT river water quality models were produced and used to predict the water quality impacts resulting from a number of water quality planning scenarios aimed at achieving full compliance with the Habitats Directive and other national and EEC water quality targets. The results indicated that further controls on effluent discharges will allow the majority of targets to be met but other sources of pollution will also need to be controlled. The outcome of the study also recognised that water quality improvements alone will not necessarily produce the required improvement to the ecological interest features in each cSAC.

  1. Real-time total system error estimation: Modeling and application in required navigation performance

    Institute of Scientific and Technical Information of China (English)

    Fu Li; Zhang Jun; Li Rui

    2014-01-01

    In required navigation performance (RNP), total system error (TSE) is estimated to provide a timely warning in the presence of an excessive error. In this paper, by analyzing the underlying formation mechanism, the TSE estimation is modeled as the estimation fusion of a fixed bias and a Gaussian random variable. To address the challenge of the high computational load induced by the accurate numerical method, two efficient methods are proposed for real-time application, called the circle tangent ellipse method (CTEM) and the line tangent ellipse method (LTEM), respectively. Compared with the accurate numerical method and the traditional scalar quantity summation method (SQSM), the computational load and accuracy of these four methods are extensively analyzed. The theoretical and experimental results both show that the computing time of the LTEM is approximately equal to that of the SQSM, while it is only about 1/30 and 1/6 of that of the numerical method and the CTEM, respectively. Moreover, the estimation result of the LTEM parallels that of the numerical method, but is more accurate than those of the SQSM and the CTEM. It is illustrated that the LTEM is quite appropriate for real-time TSE estimation in RNP applications.
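    The abstract models TSE as a fixed bias plus a Gaussian random variable. The paper's CTEM/LTEM geometric constructions are not reproduced here; as a hedged scalar illustration of that underlying error model only, one can compute the probability that such an error breaches an RNP containment limit (all numbers below are hypothetical).

```python
import math

def norm_sf(x):
    """Survival function of the standard normal: P(Z > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def tse_exceedance_prob(bias, sigma, limit):
    """P(|TSE| > limit) when TSE = bias + N(0, sigma^2), i.e. the chance
    that total system error breaches the alert limit."""
    return norm_sf((limit - bias) / sigma) + norm_sf((limit + bias) / sigma)

# hypothetical RNP-1 scenario: 1 NM limit, 0.3 NM fixed bias, 0.25 NM noise
p_alert = tse_exceedance_prob(bias=0.3, sigma=0.25, limit=1.0)
```

    A larger fixed bias shifts the error distribution toward the limit, so the exceedance probability grows; the two-sided sum reflects breaches on either side of track.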

  2. Protective antiviral antibody responses in a mouse model of influenza virus infection require TACI

    Science.gov (United States)

    Wolf, Amaya I.; Mozdzanowska, Krystyna; J. Quinn, William; Metzgar, Michele; Williams, Katie L.; Caton, Andrew J.; Meffre, Eric; Bram, Richard J.; Erickson, Loren D.; Allman, David; Cancro, Michael P.; Erikson, Jan

    2011-01-01

    Antiviral Abs, for example those produced in response to influenza virus infection, are critical for virus neutralization and defense against secondary infection. While the half-life of Abs is short, Ab titers can last a lifetime due to a subset of the Ab-secreting cells (ASCs) that is long lived. However, the mechanisms governing ASC longevity are poorly understood. Here, we have identified a critical role for extrinsic cytokine signals in the survival of respiratory tract ASCs in a mouse model of influenza infection. Irradiation of mice at various time points after influenza virus infection markedly diminished numbers of lung ASCs, suggesting that they are short-lived and require extrinsic factors in order to persist. Neutralization of the TNF superfamily cytokines B lymphocyte stimulator (BLyS; also known as BAFF) and a proliferation-inducing ligand (APRIL) reduced numbers of antiviral ASCs in the lungs and bone marrow, whereas ASCs in the spleen and lung-draining lymph node were surprisingly unaffected. Mice deficient in transmembrane activator and calcium-modulator and cyclophilin ligand interactor (TACI), a receptor for BLyS and APRIL, mounted an initial antiviral B cell response similar to that generated in WT mice but failed to sustain protective Ab titers in the airways and serum, leading to increased susceptibility to secondary viral challenge. These studies highlight the importance of TACI signaling for the maintenance of ASCs and protection against influenza virus infection. PMID:21881204

  3. Assessing moderated mediation in linear models requires fewer confounding assumptions than assessing mediation.

    Science.gov (United States)

    Loeys, Tom; Talloen, Wouter; Goubert, Liesbet; Moerkerke, Beatrijs; Vansteelandt, Stijn

    2016-11-01

    It is well known from the mediation analysis literature that the identification of direct and indirect effects relies on strong assumptions of no unmeasured confounding. Even in randomized studies the mediator may still be correlated with unobserved prognostic variables that affect the outcome, in which case the mediator's role in the causal process may not be inferred without bias. In the behavioural and social science literature very little attention has been given so far to the causal assumptions required for moderated mediation analysis. In this paper we focus on the index for moderated mediation, which measures by how much the mediated effect is larger or smaller for varying levels of the moderator. We show that in linear models this index can be estimated without bias in the presence of unmeasured common causes of the moderator, mediator and outcome under certain conditions. Importantly, one can thus use the test for moderated mediation to support evidence for mediation under less stringent confounding conditions. We illustrate our findings with data from a randomized experiment assessing the impact of being primed with social deception upon observer responses to others' pain, and from an observational study of individuals who ended a romantic relationship assessing the effect of attachment anxiety during the relationship on mental distress 2 years after the break-up.
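    In linear models, the index of moderated mediation is the product of the moderated a-path (the exposure-by-moderator interaction coefficient in the mediator regression) and the b-path (the mediator coefficient in the outcome regression). A minimal simulation sketch, with the data-generating model and every coefficient chosen purely for illustration:

```python
import numpy as np

# Simulated randomized study: X exposure, W moderator, M mediator, Y outcome.
rng = np.random.default_rng(0)
n = 5000
X = rng.normal(size=n)                            # randomized exposure
W = rng.normal(size=n)                            # moderator
M = 0.5 * X + 0.4 * X * W + rng.normal(size=n)    # mediator model (a1=0.5, a3=0.4)
Y = 0.3 * X + 0.6 * M + rng.normal(size=n)        # outcome model (c'=0.3, b=0.6)

def ols(y, *cols):
    """Least-squares coefficients of y on an intercept plus the given columns."""
    design = np.column_stack([np.ones(len(y)), *cols])
    return np.linalg.lstsq(design, y, rcond=None)[0]

a = ols(M, X, W, X * W)    # a[3] estimates a3, the moderated a-path
b = ols(Y, X, M)           # b[2] estimates b, the mediator-outcome path
index_of_moderated_mediation = a[3] * b[2]        # true value: 0.4 * 0.6 = 0.24
```

    The paper's point is that this product can remain unbiased under weaker confounding conditions than the indirect effect itself; the simulation only illustrates how the index is computed, not that robustness claim.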

  4. MMP-10 is required for efficient muscle regeneration in mouse models of injury and muscular dystrophy.

    Science.gov (United States)

    Bobadilla, Míriam; Sáinz, Neira; Rodriguez, José Antonio; Abizanda, Gloria; Orbe, Josune; de Martino, Alba; García Verdugo, José Manuel; Páramo, José A; Prósper, Felipe; Pérez-Ruiz, Ana

    2014-02-01

    Matrix metalloproteinases (MMPs), a family of endopeptidases that are involved in the degradation of extracellular matrix components, have been implicated in skeletal muscle regeneration. Among the MMPs, MMP-2 and MMP-9 are upregulated in Duchenne muscular dystrophy (DMD), a fatal X-linked muscle disorder. However, inhibition or overexpression of specific MMPs in a mouse model of DMD (mdx) has yielded mixed results regarding disease progression, depending on the MMP studied. Here, we have examined the role of MMP-10 in muscle regeneration during injury and muscular dystrophy. We found that skeletal muscle increases MMP-10 protein expression in response to damage (notexin) or disease (mdx mice), suggesting its role in muscle regeneration. In addition, we found that MMP-10-deficient muscles displayed impaired recruitment of endothelial cells, reduced levels of extracellular matrix proteins, diminished collagen deposition, and decreased fiber size, which collectively contributed to delayed muscle regeneration after injury. Also, MMP-10 knockout in mdx mice led to a deteriorated dystrophic phenotype. Moreover, MMP-10 mRNA silencing in injured muscles (wild-type and mdx) reduced muscle regeneration, while addition of recombinant human MMP-10 accelerated muscle repair, suggesting that MMP-10 is required for efficient muscle regeneration. Furthermore, our data suggest that MMP-10-mediated muscle repair is associated with VEGF/Akt signaling. Thus, our findings indicate that MMP-10 is critical for skeletal muscle maintenance and regeneration during injury and disease.

  5. A review method for UML requirements analysis model employing system-side prototyping.

    Science.gov (United States)

    Ogata, Shinpei; Matsuura, Saeko

    2013-12-01

    User interface prototyping is an effective method for users to validate the requirements defined by analysts at an early stage of software development. However, a user interface prototype offers weak support for analysts to verify the consistency of specifications about internal aspects of a system such as business logic. As a result, inconsistencies cause significant rework costs, because they often make it impossible for developers to realize the system based on the specifications. For verifying such consistency, functional prototyping is an effective method for analysts, but it is costly and requires more detailed specifications. In this paper, we propose a review method by which analysts can efficiently verify the consistency among several different kinds of UML diagrams by employing system-side prototyping without a detailed model. The system-side prototype system does not implement any business logic, but visualizes the results of integrating the UML diagrams as Web pages. The usefulness of our proposal was evaluated by applying it to the development of a Library Management System (LMS) for a laboratory. This development was conducted by a group. As a result, our proposal was useful for discovering serious inconsistencies caused by misunderstandings among the members of the group.

  6. Security Architectures for Model Driven Web Requirements – Financial Application Case Study

    Directory of Open Access Journals (Sweden)

    A.V.Krishna Prasad

    2010-07-01

    Full Text Available MDA with executable UML offers an approach that embodies all the key ingredients of the process for developing dependable systems, by offering: a uniform strategy for preserving investment in existing models built using unsupported tools, by automatically migrating them to profiled UML models for subsequent maintenance and development using state-of-the-art UML tools; a clean separation of application behavior from the platform-specific implementation using technologies such as Integrated Modular Avionics (IMA), allowing the full potential of IMA to be realized in a consistent and dependable way; a semantically well-defined formalism that can be used as a basis for modular certification of safety-related systems; and the ability to generate not only the components of the target system, but also components of the development tool chain, providing scope for model translation and offering "executable specifications" that can be tested early and mapped reliably onto the target, leading to greater levels of dependability. MDA is a new approach for most organizations, and therefore carries additional training and learning-curve costs; in addition, the availability of production-quality code generators is currently limited. MDA requires developers to work at a more abstract level than code; although experience shows that most do not have any difficulty making the adjustment, there will be some who find this change of emphasis difficult to achieve. Building upon the initial success of MDA deployment so far, work is now proceeding on the enhancement of Ada code mapping rules to cover the entire xUML formalism. Work is also underway to develop a generic "adapter/router" component to provide a standard way to interface re-engineered xUML components with pre-existing components. These techniques are now being applied to another avionics system in the same organization, in response to the customer's need for a faster and cheaper upgrade

  7. Grey Prediction Based Software Stage-Effort Estimation

    Institute of Scientific and Technical Information of China (English)

    WANG Yong; SONG Qinbao; SHEN Junyi

    2007-01-01

    Software stage-effort estimation can be used to dynamically adjust a software project's schedule and thus help finish the project on budget. This paper presents a grey Verhulst model based method for stage-effort estimation during the software development process; a bias-correction technique is used to improve the estimation accuracy. The proposed method was evaluated on a large-scale industrial software engineering database. The results are very encouraging and indicate that the method has considerable potential.
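The grey Verhulst model named in this record is well defined in the grey-systems literature. The following is a minimal sketch of fitting it to a stage-effort series and forecasting further stages; it is a generic illustration, not the paper's implementation, and it omits the bias-correction step the paper applies. Function and variable names are my own.

```python
import numpy as np

def grey_verhulst(x0, horizon=1):
    """Fit a grey Verhulst model to a raw stage-effort series x0 and
    forecast `horizon` further stages.

    Returns (fitted, forecast): the in-sample reconstruction of x0 and
    the out-of-sample forecasts.
    """
    x0 = np.asarray(x0, dtype=float)
    n = len(x0)
    x1 = np.cumsum(x0)                     # 1-AGO (accumulated) series
    z1 = 0.5 * (x1[1:] + x1[:-1])          # background values
    # Least-squares estimate of the coefficients a, b in the grey
    # Verhulst equation  x0(k) + a*z1(k) = b*z1(k)^2
    B = np.column_stack([-z1, z1 ** 2])
    Y = x0[1:]
    (a, b), *_ = np.linalg.lstsq(B, Y, rcond=None)
    # Time-response function of the Verhulst equation
    k = np.arange(n + horizon)
    x1_hat = a * x1[0] / (b * x1[0] + (a - b * x1[0]) * np.exp(a * k))
    x0_hat = np.diff(x1_hat, prepend=0.0)  # inverse AGO
    return x0_hat[:n], x0_hat[n:]
```

Typical usage would fit the model to the effort recorded in completed stages (e.g. a bell-shaped sequence of per-stage hours) and read the forecast as the expected effort of upcoming stages.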

  8. Putting User Stories First: Experiences Adapting the Legacy Data Models and Information Architecture at NASA JPL's PO.DAAC to Accommodate the New Information Lifecycle Required by SWOT

    Science.gov (United States)

    McGibbney, L. J.; Hausman, J.; Laurencelle, J. C.; Toaz, R., Jr.; McAuley, J.; Freeborn, D. J.; Stoner, C.

    2016-12-01

    The Surface Water & Ocean Topography (SWOT) mission brings together two communities focused on a better understanding of the world's oceans and its terrestrial surface waters. U.S. and French oceanographers and hydrologists and international partners have joined forces to develop this new space mission. At NASA JPL's PO.DAAC, the team is currently engaged in the gathering of SWOT User Stories (access patterns, metadata requirements, primary and value-added product requirements, data access protocols, etc.) to better inform the adaptive planning of what will be known as the next generation PO.DAAC Information Architecture (IA). The IA effort acknowledges that missions such as SWOT (and NISAR) have little or no precedent in terms of data volume, hot and cold storage, archival, analysis, existing system engineering complexities, etc., and that the only way we can better understand the projected impacts of such requirements is to interface directly with the user community. Additionally, it also acknowledges that collective learning has taken place to understand certain limitations in the existing data models (DM) underlying the existing PO.DAAC Data Management and Archival System. This work documents an evolutionary, use-case-based, standards-driven approach to adapting the legacy DM and accompanying knowledge representation infrastructure at NASA JPL's PO.DAAC to address forthcoming DAAC mission requirements presented by missions such as SWOT. Some of the topics covered in this evolution include, but are not limited to: How we are leveraging lessons learned from the development of existing DM (such as that generated for SMAP) in an attempt to map them to SWOT. What is the governance model for the SWOT IA? What are the `governing' entities? What is the hierarchy of the `governed entities'? How are elements grouped? How is the design working group formed? How is model independence maintained and what choices/requirements do we have for the implementation language?
The use of

  9. RBANS embedded measures of suboptimal effort in dementia: effort scale has a lower failure rate than the effort index.

    Science.gov (United States)

    Burton, Rachel L; Enright, Joe; O'Connell, Megan E; Lanting, Shawnda; Morgan, Debra

    2015-02-01

    The importance of evaluating effort in neuropsychological assessments has been widely acknowledged, but measuring effort in the context of dementia remains challenging due to the impact of dementia severity on effort measure scores. Two embedded measures have been developed for the repeatable battery for the assessment of neuropsychological status (RBANS; Randolph, C., Tierney, M. C., Mohr, E., & Chase, T. N. (1998). The repeatable battery for the assessment of neuropsychological status (RBANS): Preliminary clinical validity. Journal of Clinical and Experimental Neuropsychology, 20 (3), 310-319): the Effort Index (EI; Silverberg, N. D., Wertheimer, J. C., & Fichtenberg, N. L. (2007). An effort index for the repeatable battery for the assessment of neuropsychological status (RBANS). Clinical Neuropsychologist, 21 (5), 841-854) and the Effort Scale (ES; Novitski, J., Steele, S., Karantzoulis, S., & Randolph, C. (2012). The repeatable battery for the assessment of neuropsychological status effort scale. Archives of Clinical Neuropsychology, 27 (2), 190-195). We explored failure rates on these effort measures in a non-litigating mixed dementia sample (N = 145). Failure rate on the EI was high (48%) and associated with dementia severity. In contrast, failure on the ES was 14% but differed based on type of dementia. ES failure was low (4%) when dementia was due to Alzheimer disease (AD), but high (31%) for non-AD dementias. These data raise concerns about use of the RBANS embedded effort measures in dementia evaluations.

  10. Effort-reward imbalance and depression in Japanese medical residents.

    Science.gov (United States)

    Sakata, Yumi; Wada, Koji; Tsutsumi, Akizumi; Ishikawa, Hiroyasu; Aratake, Yutaka; Watanabe, Mayumi; Katoh, Noritada; Aizawa, Yoshiharu; Tanaka, Katsutoshi

    2008-01-01

    Effort-reward imbalance is an important psychosocial factor related to poor health among employees. However, few studies have evaluated effort-reward imbalance among medical residents. The present study was done to determine the association between psychosocial factors at work, as defined by the effort-reward imbalance model, and depression among Japanese medical residents. We distributed a questionnaire to 227 medical residents at 16 teaching hospitals in Japan at the end of August 2005. We asked participants to answer questions covering demographic information, depressive symptoms, effort-reward imbalance, over-commitment, and social support. Depression was evaluated using the Japanese version of the Center for Epidemiologic Studies-Depression (CES-D) scale. Effort-reward imbalance and over-commitment were assessed by the Effort-Reward Imbalance (ERI) questionnaire developed by Siegrist. Social support was determined on a visual analog scale. Logistic regression analysis was performed to determine the associations between effort-reward imbalance and depressive symptoms. Depressive symptoms were found in 35 (29.2%) 1st-year residents and 21 (27.6%) 2nd-year residents. An effort-reward ratio >1 (OR, 8.83; 95% CI, 2.87-27.12) and a low social support score (OR, 2.77; 95% CI, 1.36-5.64) were associated with depressive symptoms among medical residents. Effort-reward imbalance was independently related to depression among Japanese medical residents. The present study suggests that a balance between effort and reward at work is important for medical residents' mental health.
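The effort-reward ratio and the odds ratios with 95% confidence intervals reported in this record follow standard epidemiological arithmetic, which can be sketched as below. The item counts and the 2x2-table layout are illustrative assumptions, not values taken from the study.

```python
import math

def siegrist_er_ratio(effort_sum, reward_sum, n_effort_items=6, n_reward_items=11):
    """Effort-reward ratio e / (r * c), where c corrects for unequal numbers
    of effort and reward items. The default item counts are illustrative."""
    c = n_effort_items / n_reward_items
    return effort_sum / (reward_sum * c)

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and (approximate) 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    odds_ratio = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(odds_ratio) - z * se)
    hi = math.exp(math.log(odds_ratio) + z * se)
    return odds_ratio, (lo, hi)
```

A ratio above 1 marks an imbalance of high effort against low reward, which is the exposure entered into the logistic regression described above.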

  11. Geosynchronous platform definition study. Volume 4, Part 1: Traffic analysis and system requirements for the baseline traffic model

    Science.gov (United States)

    1973-01-01

    The traffic analyses and system requirements data generated in the study resulted in the development of two traffic models: the baseline traffic model and the new traffic model. The baseline traffic model provides traceability between the numbers and types of geosynchronous missions considered in the study and the entire spectrum of missions foreseen in the total national space program. The information presented pertaining to the baseline traffic model includes: (1) definition of the baseline traffic model, including identification of specific geosynchronous missions and their payload delivery schedules through 1990; (2) satellite location criteria, including the resulting distribution of the satellite population; (3) geosynchronous orbit saturation analyses, including the effects of satellite physical proximity and potential electromagnetic interference; and (4) platform system requirements analyses, including satellite and mission equipment descriptions, the options and limitations in grouping satellites, and on-orbit servicing criteria (both remotely controlled and man-attended).

  12. Measuring the Cognitive Effort of Literal Translation Processes

    DEFF Research Database (Denmark)

    Schaeffer, Moritz

    2014-01-01

    It has been claimed that human translators rely on some sort of literal translation equivalences to produce translations and to check their validity. More effort would be required if translations are less literal. However, to our knowledge, there is no established metric to measure and quantify t...

  13. Characterization and Validation of Requirements Management Measures Using Correlation and Regression Model

    Directory of Open Access Journals (Sweden)

    S. Arun Kumar

    2011-04-01

    Full Text Available Requirements engineering is one of the challenging and crucial phases in the development of software products. A literature survey finds that poor project management and requirements management activities are among the key reasons for the failure of software projects. This paper mainly addresses: 1. Formulating a mixed organization structure of both traditional and agile approaches, and applying KM practices to both approaches in order to address requirements issues such as missing and inconsistent requirements and to improve project management activities in a global software development environment. 2. Proposing requirements metrics to measure and manage the software process during the development of information systems. The major contribution of this paper is well-founded methods to manage the project and effective requirements management metrics to measure changing requirements, while giving particular attention to requirements engineering issues such as completeness and consistency. Two hypotheses have been formulated, tested through statistical techniques, and validated.

  14. 7-Years of Using Distributed Temperature Sensing (DTS) to assess river restoration efforts : synergies of high-resolution observation and modeling on the Middle Fork of the John Day River

    Science.gov (United States)

    Hall, A.; Diabat, M.

    2014-12-01

    Temperature is a key factor for salmonid health and is an important restoration metric on the Middle Fork of the John Day River, northeast Oregon. The longest undammed tributary to the Columbia, the headwaters of the Middle Fork are crucial to juvenile rearing of steelhead and of spring and summer Chinook. In the past century the river has been altered by dredge mining, overgrazing, logging activities, and irrigation, resulting in bank erosion, low effective shade, and channelization. These factors decreased fish habitat and led to increased stream temperature maxima. Restoration has focused on restoring fish habitat, creating thermal refugia, and planting native vegetation. The most recently completed restoration project diverted the flow from the dredged, straightened channel into the historic, meandering stream channel. Over the past seven years, Oregon State University researchers (Tara O'Donnell, 2012; Julie Huff, 2009) have been involved in a planned 10-year stream temperature monitoring study to assess maximum temperatures during low-flow summer months. The use of fiber optics through distributed temperature sensing (DTS) made it possible to record high-resolution temperature data at both temporal and spatial scales; these data are used to assess the efficacy of restoration efforts on the reach. Furthermore, DTS provided temperature data that reveal subtle hydrologic processes such as groundwater or hyporheic inflows and quantify their effect on the stream. Current research has focused on large-scale DTS installations on the Middle Fork of the John Day River on the Oxbow, Forrest, and the upstream Galena ("RPB") conservation properties. In the summers of 2013 and 2014, 16 km of river were monitored. Our study compares temperatures before and after the restoration project and provides essential guidance for future restoration projects. Direct comparisons coupled with deterministic modeling using HeatSource assist in better understanding the

  15. Economic response to harvest and effort control in fishery

    DEFF Research Database (Denmark)

    Hoff, Ayoe; Frost, Hans

    for fisheries management. The report outlines bio-economic models, which are designed to shed light on the efficiency of different management tools in terms of quota or effort restrictions given the objectives of the Common Fisheries Policy about sustainable and economically viable fisheries. The report addresses...... the complexities of biological and economic interaction in a multispecies, multifleet framework and outlines consistent mathematical models....

  16. A geospatial evaluation of Aedes vigilax larval control efforts across a coastal wetland, Northern Territory, Australia.

    Science.gov (United States)

    Kurucz, N; Whelan, P I; Carter, J M; Jacups, S P

    2009-12-01

    Adjacent to the northern suburbs of Darwin is a coastal wetland that contains important larval habitats for Aedes vigilax (Skuse), the northern salt marsh mosquito. This species is a vector for Ross River virus and Barmah Forest virus, as well as an appreciable human pest. In order to improve aerial larval control efforts, we sought to identify the most important vegetation categories and climatic/seasonal aspects associated with control operations in these wetlands. By using a generalized linear model to compare aerial control for each vegetation category, we found that Schoenoplectus/mangrove areas require the greatest amount of control for tide-only events (30.1%), and also extensive control when tide and rain events coincide (18.2%). Our results further indicate that tide-affected reticulate vegetation, indicated by the marsh grasses Sporobolus virginicus and Xerochloa imberbis, requires extensive control for Ae. vigilax larvae after rain-only events (44.7%) and after coinciding tide and rain events (38.0%). The analyses of vector control efforts by month indicated that September to January, with a peak in November and December, required the most control. A companion paper identifies the vegetation categories most associated with Aedes vigilax larval population densities in the coastal wetland. To maximize the efficiency of aerial salt marsh mosquito control operations in northern Australia, aerial control efforts should concentrate on the vegetation categories with high larval densities between September and January.
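A generalized linear model of the kind this record describes (a binary control outcome against a categorical vegetation predictor) can be fit with iteratively reweighted least squares. The sketch below is a generic logistic-GLM illustration under an assumed data layout, not the authors' analysis code.

```python
import numpy as np

def glm_logistic(X, y, iters=25):
    """Logistic-regression GLM fit by iteratively reweighted least
    squares (Fisher scoring). X is the design matrix (e.g. intercept
    plus vegetation-category indicators), y is 0/1 control outcome."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        eta = X @ beta
        mu = 1.0 / (1.0 + np.exp(-eta))               # mean response
        W = mu * (1.0 - mu)                           # IRLS weights
        z = eta + (y - mu) / np.maximum(W, 1e-12)     # working response
        WX = X * W[:, None]
        beta = np.linalg.solve(X.T @ WX, X.T @ (W * z))
    return beta
```

With an intercept and one category indicator, the fitted probabilities recover each category's empirical control proportion, which is what makes per-category comparisons like those in the abstract possible.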

  17. Devising a Structural Equation Model of Relationships between Preservice Teachers' Time and Study Environment Management, Effort Regulation, Self-Efficacy, Control of Learning Beliefs, and Metacognitive Self-Regulation

    Science.gov (United States)

    Sen, Senol; Yilmaz, Ayhan

    2016-01-01

    The objective of this study is to analyze the relationship between preservice teachers' time and study environment management, effort regulation, self-efficacy beliefs, control of learning beliefs and metacognitive self-regulation. This study also investigates the direct and indirect effects of metacognitive self-regulation on time and study…

  18. Transaction-based building controls framework, Volume 2: Platform descriptive model and requirements

    Energy Technology Data Exchange (ETDEWEB)

    Akyol, Bora A. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Haack, Jereme N. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Carpenter, Brandon J. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Katipamula, Srinivas [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Lutes, Robert G. [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Hernandez, George [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States)

    2015-07-31

    Transaction-based Building Controls (TBC) offer a control-systems platform that provides an agent execution environment meeting the growing requirements for security, resource utilization, and reliability. This report outlines the requirements for a platform to meet these needs and describes an exemplary implementation.

  19. Productive and Ineffective Efforts: How Student Effort in High School Mathematics Relates to College Calculus Success

    Science.gov (United States)

    Barnett, M.D.; Sonnert, G.; Sadler, P.M.

    2014-01-01

    Relativizing the popular belief that student effort is the key to success, this article finds that effort in the most advanced mathematics course in US high schools is not consistently associated with college calculus performance. We distinguish two types of student effort: productive and ineffective efforts. Whereas the former carries the…


  1. The influence of music on mental effort and driving performance.

    Science.gov (United States)

    Ünal, Ayça Berfu; Steg, Linda; Epstude, Kai

    2012-09-01

    The current research examined the influence of loud music on driving performance, and whether mental effort mediated this effect. Participants (N=69) drove in a driving simulator either with or without listening to music. In order to test whether music would have similar effects on driving performance in different situations, we manipulated the simulated traffic environment such that the driving context consisted of both complex and monotonous driving situations. In addition, we systematically kept track of drivers' mental load by making the participants verbally report their mental effort at certain moments while driving. We found that listening to music increased mental effort while driving, irrespective of the driving situation being complex or monotonous, providing support to the general assumption that music can be a distracting auditory stimulus while driving. However, drivers who listened to music performed as well as the drivers who did not listen to music, indicating that music did not impair their driving performance. Importantly, the increases in mental effort while listening to music pointed out that drivers try to regulate their mental effort as a cognitive compensatory strategy to deal with task demands. Interestingly, we observed significant improvements in driving performance in two of the driving situations. It seems that mental effort might mediate the effect of music on driving performance in situations requiring sustained attention. Other process variables, such as arousal and boredom, should also be incorporated into study designs to reveal more about how music affects driving. Copyright © 2012 Elsevier Ltd. All rights reserved.

  2. Getting Grip on Security Requirements Elicitation by Structuring and Reusing Security Requirements Sources

    Directory of Open Access Journals (Sweden)

    Christian Schmitt

    2015-07-01

    Full Text Available This paper presents a model for structuring and reusing security requirements sources. The model serves as a blueprint for the development of an organization-specific repository, which provides relevant security requirements sources, such as security information and knowledge sources and relevant compliance obligations, in a structured and reusable form. The resulting repository is intended to be used by development teams during the elicitation and analysis of security requirements, with the goal of understanding the security problem space, incorporating all relevant requirements sources, and avoiding unnecessary effort for identifying, understanding, and correlating applicable security requirements sources on a per-project basis. We start with an overview and categorization of important security requirements sources, followed by the description of the generic model. To demonstrate the applicability and benefits of the model, the instantiation approach and details of the resulting repository of security requirements sources are presented.
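One plausible shape for such a repository is a catalogue of sources tagged by category and compliance obligation, queryable per project. All class and field names below are illustrative assumptions, not structures taken from the paper.

```python
from dataclasses import dataclass, field

@dataclass
class SecuritySource:
    """One reusable security requirements source."""
    name: str
    category: str                      # e.g. "standard", "regulation", "knowledge base"
    obligations: list = field(default_factory=list)

class SourceRepository:
    """Organization-specific repository of security requirements sources."""
    def __init__(self):
        self._sources = []

    def add(self, source):
        self._sources.append(source)

    def by_category(self, category):
        return [s for s in self._sources if s.category == category]

    def obligations_for(self, *categories):
        """Collect all compliance obligations relevant to the given categories."""
        return {o for c in categories
                  for s in self.by_category(c)
                  for o in s.obligations}
```

A development team would populate the repository once, then query it during elicitation rather than re-identifying applicable sources for each new project.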

  3. Building on the CMIP5 effort to prepare next steps : integrate community related effort in the every day workflow to lower the data distribution and data management burden

    Science.gov (United States)

    Denvil, Sébastien; Bhardwaj, Ashish; Morgan, Mark; Mancip, Martial; Brockmann, Patrick

    2010-05-01

    The Pierre Simon Laplace Institute (IPSL), like many other climate modeling groups, is involved in the development of a comprehensive Earth System Model (ESM) to study the interactions between chemical, physical, and biological processes. This work entails the coupling of different components (land, ocean, atmosphere, chemistry, etc.) and requires an execution environment platform that can tackle the entire range of interdependent model configurations. Furthermore, the ever-increasing number of simulations executed against model configurations within scientific computing centres is generating a huge volume of data and metadata that must be made available to researchers, modelers, students, and general users. Each user group has a different set of information demands related to climate simulation data and metadata, and thus fulfilling the requirements of the entire community is highly challenging. This talk will focus upon the strategy adopted by IPSL to simultaneously fulfill the needs of the community and to lower the data distribution and data management burdens placed upon the climate modeling group by the growing interest in climate simulation data and information. To achieve these objectives we decided to integrate the efforts of international and European projects such as Earth System Grid, METAFOR, and IS-ENES within our execution environment platform. We will present the emerging workflow that will be in place to run CMIP5 simulations, and that we will extend to manage the "every day" simulations that are not solely intended for participation in a large model intercomparison project such as CMIP5.

  4. Improved Inventory Models for the United States Coast Guard Requirements Determination Process

    Science.gov (United States)

    1993-10-01

    Presutti and Trepp present two versions of a multi-item, supply-availability, safety-level model. They used the method of Lagrange multipliers to solve for k_i, the safety-level factor for item i. The Presutti and Trepp models address units backordered. To convert their unit models to requisition models ... requisition size. Response-time modeling: in their paper, Presutti and Trepp also gave two versions of a multi-item, response-time, safety-level model.
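The Lagrange-multiplier construction this record alludes to can be illustrated generically: minimizing expected backorders subject to a safety-stock budget yields the first-order condition 1 - Phi(k_i) = theta * c_i for each item, and the multiplier theta is then chosen to meet the budget. The sketch below solves that generic model by bisection on theta; it is an illustration of this style of model under assumed notation, not the exact Presutti-Trepp formulation.

```python
from statistics import NormalDist

def safety_factors(costs, sigmas, budget):
    """Choose safety-level factors k_i minimizing expected backorders
    subject to sum(c_i * sigma_i * k_i) <= budget.

    Optimality condition: 1 - Phi(k_i) = theta * c_i, i.e.
    k_i = Phi^-1(1 - theta * c_i); theta is found by bisection.
    """
    nd = NormalDist()

    def ks(theta):
        # Clip the probability into (0, 1) so inv_cdf stays defined.
        return [nd.inv_cdf(max(1e-12, min(1 - 1e-12, 1 - theta * c)))
                for c in costs]

    def spend(theta):
        return sum(c * s * k for c, s, k in zip(costs, sigmas, ks(theta)))

    lo_t, hi_t = 1e-9, 1.0 / max(costs)   # bracket the multiplier
    for _ in range(100):                  # bisection on theta
        mid = 0.5 * (lo_t + hi_t)
        if spend(mid) > budget:
            lo_t = mid                    # spending too much: raise theta
        else:
            hi_t = mid
    return ks(hi_t)
```

As the condition implies, cheaper items receive higher safety factors: their marginal backorder reduction costs less per dollar of investment.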

  5. Time preferences, study effort, and academic performance

    NARCIS (Netherlands)

    Non, J.A.; Tempelaar, D.T.

    2014-01-01

    We analyze the relation between time preferences, study effort, and academic performance among first-year Business and Economics students. Time preferences are measured by stated preferences for an immediate payment over larger delayed payments. Data on study efforts are derived from an electronic l

  6. Visual Cues and Listening Effort: Individual Variability

    Science.gov (United States)

    Picou, Erin M.; Ricketts, Todd A; Hornsby, Benjamin W. Y.

    2011-01-01

    Purpose: To investigate the effect of visual cues on listening effort as well as whether predictive variables such as working memory capacity (WMC) and lipreading ability affect the magnitude of listening effort. Method: Twenty participants with normal hearing were tested using a paired-associates recall task in 2 conditions (quiet and noise) and…

  7. Endogenous Effort Norms in Hierarchical Firms

    NARCIS (Netherlands)

    J. Tichem (Jan)

    2013-01-01

    This paper studies how a three-layer hierarchical firm (principal-supervisor-agent) optimally creates effort norms for its employees. The key assumption is that effort norms are affected by the example of superiors. In equilibrium, norms are eroded as one moves down

  8. The Effect of Age on Listening Effort

    Science.gov (United States)

    Degeest, Sofie; Keppler, Hannah; Corthals, Paul

    2015-01-01

    Purpose: The objective of this study was to investigate the effect of age on listening effort. Method: A dual-task paradigm was used to evaluate listening effort in different conditions of background noise. Sixty adults ranging in age from 20 to 77 years were included. A primary speech-recognition task and a secondary memory task were performed…

  9. Listening Effort With Cochlear Implant Simulations

    NARCIS (Netherlands)

    Pals, Carina; Sarampalis, Anastasios; Başkent, Deniz

    2013-01-01

    Purpose: Fitting a cochlear implant (CI) for optimal speech perception does not necessarily optimize listening effort. This study aimed to show that listening effort may change between CI processing conditions for which speech intelligibility remains constant. Method: Nineteen normal-hearing partici

  10. Listening Effort with Cochlear Implant Simulations

    Science.gov (United States)

    Pals, Carina; Sarampalis, Anastasios; Baskent, Deniz

    2013-01-01

    Purpose: Fitting a cochlear implant (CI) for optimal speech perception does not necessarily optimize listening effort. This study aimed to show that listening effort may change between CI processing conditions for which speech intelligibility remains constant. Method: Nineteen normal-hearing participants listened to CI simulations with varying…

  11. Minimum required number of specimen records to develop accurate species distribution models

    NARCIS (Netherlands)

    Proosdij, van A.S.J.; Sosef, M.S.M.; Wieringa, J.J.; Raes, N.

    2016-01-01

    Species distribution models (SDMs) are widely used to predict the occurrence of species. Because SDMs generally use presence-only data, validation of the predicted distribution and assessing model accuracy is challenging. Model performance depends on both sample size and species’ prevalence, being t

  12. Minimum required number of specimen records to develop accurate species distribution models

    NARCIS (Netherlands)

    Proosdij, van A.S.J.; Sosef, M.S.M.; Wieringa, Jan; Raes, N.

    2015-01-01

    Species Distribution Models (SDMs) are widely used to predict the occurrence of species. Because SDMs generally use presence-only data, validation of the predicted distribution and assessing model accuracy is challenging. Model performance depends on both sample size and species’ prevalence, being


  14. Tsunami Defense Efforts at Samcheok Port, Korea

    Science.gov (United States)

    Cho, Y. S.

    2016-02-01

    Tsunamis, mainly triggered by impulsive undersea motions, are long waves and can propagate a long distance. Thus, they can cause huge casualties not only in neighboring countries but also in distant countries. Recently, several devastating tsunamis have occurred around the Pacific Ocean rim. Among them, the Great East Japan tsunami of March 11, 2011 is probably recorded as one of the most destructive tsunamis of the last several decades. The tsunami killed more than 20,000 people (including missing people) and caused property damage of approximately 300 billion USD. The eastern coast of the Korean Peninsula has historically been attacked by unexpected tsunami events. These tsunamis were generated by undersea earthquakes off the west coast of Japan. For example, the Central East Sea Tsunami of May 26, 1983 killed 3 people and caused serious property damage at Samcheok Port on the eastern coast of Korea. Thus, a defense plan against unexpected tsunami strikes is an essential task for the port authority to protect human lives and port facilities. In this study, a master plan for tsunami defense at Samcheok Port is introduced. A tsunami hazard map is also made by employing both propagation and inundation models. Detailed defense efforts are described, including the procedure for developing the tsunami hazard map. Keywords: tsunami, hazard map, run-up height, emergency action plan

  15. Teaching Case: IS Security Requirements Identification from Conceptual Models in Systems Analysis and Design: The Fun & Fitness, Inc. Case

    Science.gov (United States)

    Spears, Janine L.; Parrish, James L., Jr.

    2013-01-01

    This teaching case introduces students to a relatively simple approach to identifying and documenting security requirements within conceptual models that are commonly taught in systems analysis and design courses. An introduction to information security is provided, followed by a classroom example of a fictitious company, "Fun &…

  16. 40 CFR Table 4 to Subpart Ffff of... - Model Rule-Requirements for Continuous Emission Monitoring Systems (CEMS)

    Science.gov (United States)

    2010-07-01

    ... Emission Monitoring Systems (CEMS) 4 Table 4 to Subpart FFFF of Part 60 Protection of Environment...—Model Rule—Requirements for Continuous Emission Monitoring Systems (CEMS) As stated in § 60.3039, you... CEMS Use the following performance specifications (P.S.) in appendix B of this part for your CEMS...


  18. Job Satisfaction, Effort, and Performance: A Reasoned Action Perspective

    Directory of Open Access Journals (Sweden)

    Icek Ajzen

    2011-12-01

    Full Text Available In this article the author takes issue with the recurrent reliance on job satisfaction to explain job-related effort and performance. The disappointing findings in this tradition are explained by a lack of compatibility between job satisfaction, a very broad attitude, and the more specific effort and performance criteria. Moreover, attempts to apply the expectancy-value model of attitude to explore the determinants of effort and performance suffer from reliance on unrepresentative sets of beliefs about the likely consequences of these behaviors. The theory of planned behavior (Ajzen, 1991, 2012), with its emphasis on the proximal antecedents of job effort and performance, is offered as an alternative. According to the theory, intentions to exert effort and to attain a certain performance level are determined by attitudes, subjective norms, and perceptions of control in relation to these behaviors; and these variables, in turn, are a function of readily accessible beliefs about the likely outcomes of effort and performance, about the normative expectations of important others, and about factors that facilitate or hinder effective performance.

  19. Practitioner's knowledge representation a pathway to improve software effort estimation

    CERN Document Server

    Mendes, Emilia

    2014-01-01

    The main goal of this book is to help organizations improve their effort estimates and effort estimation processes by providing a step-by-step methodology that takes them through the creation and validation of models that are based on their own knowledge and experience. Such models, once validated, can then be used to obtain predictions, carry out risk analyses, enhance their estimation processes for new projects and generally advance them as learning organizations.Emilia Mendes presents the Expert-Based Knowledge Engineering of Bayesian Networks (EKEBNs) methodology, which she has used and adapted during the course of several industry collaborations with different companies world-wide over more than 6 years. The book itself consists of two major parts: first, the methodology's foundations in knowledge management, effort estimation (with special emphasis on the intricacies of software and Web development) and Bayesian networks are detailed; then six industry case studies are presented which illustrate the pra...

  20. Contributions to the Science Modeling Requirements Document; Earth Limb & Auroral Backgrounds

    Science.gov (United States)

    2007-11-02

    composition information. All models give these parameters as functions of altitude. Depending on its sophistication, a model may also report these... magnetospheric forcing (Huguenin et al., 1989; Wohlers et al., 1989). Malkmus et al. (1989) constructed a limb clutter model for the middle ultraviolet (0.2... taken once per orbit (~100... [Table 4-5a, Auroral Alert Summary: item, description, schedule, access, preliminary report & forecast]