WorldWideScience

Sample records for models requires extensive

  1. Core competency requirements among extension workers in peninsular Malaysia: Use of Borich's needs assessment model.

    Science.gov (United States)

    Umar, Sulaiman; Man, Norsida; Nawi, Nolila Mohd; Latif, Ismail Abd; Samah, Bahaman Abu

    2017-06-01

    The study described the perceived importance of, and proficiency in, core agricultural extension competencies among extension workers in Peninsular Malaysia, and evaluated the resultant deficits in those competencies. Borich's Needs Assessment Model was used to achieve the objectives of the study. A sample of 298 respondents was randomly selected and interviewed using a pre-tested structured questionnaire. Thirty-three core competency items were assessed, and instrument validity and reliability were ensured. The cross-sectional data obtained were analysed using SPSS for descriptive statistics, including the mean weighted discrepancy score (MWDS). Results of the study showed that, on a scale of 5, the most important core extension competency items according to respondents' perception were: "Making good use of information and communication technologies/access and use of web-based resources" (M=4.86, SD=0.23); "Conducting needs assessments" (M=4.84, SD=0.16); "Organizing extension campaigns" (M=4.82, SD=0.47); and "Managing groups and teamwork" (M=4.81, SD=0.76). In terms of proficiency, the highest competency identified by the respondents was "Conducting farm and home visits" (M=3.62, SD=0.82), followed by "Conducting meetings effectively" (M=3.19, SD=0.72); "Conducting focus group discussions" (M=3.16, SD=0.32); and "Conducting community forums" (M=3.13, SD=0.64). The discrepancies implying competency deficits were widest in "Acquiring and allocating resources" (MWDS=12.67); use of information and communication technologies (ICTs) and web-based resources in agricultural extension (MWDS=12.59); and report writing and sharing the results and impacts (MWDS=11.92). It is recommended that any intervention aimed at developing the capacity of extension workers in Peninsular Malaysia should prioritize these core competency items in accordance with the deficits established in this study. Copyright © 2017 Elsevier Ltd. All rights reserved.
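
    The MWDS values quoted above follow Borich's usual recipe: each respondent's importance-minus-proficiency gap for an item is weighted by that item's mean importance, and the weighted gaps are averaged. A minimal sketch of that calculation in Python, using synthetic ratings rather than data from the study (function and variable names are ours):

    ```python
    import numpy as np

    def borich_mwds(importance, proficiency):
        """Mean weighted discrepancy score (MWDS) for one competency item.

        importance, proficiency: per-respondent ratings on the same scale
        (e.g. 1-5). A larger MWDS indicates a wider competency deficit.
        """
        importance = np.asarray(importance, dtype=float)
        proficiency = np.asarray(proficiency, dtype=float)
        discrepancy = importance - proficiency       # per-respondent gap
        weighted = discrepancy * importance.mean()   # weight by mean importance
        return weighted.mean()

    # Illustrative ratings for one item from five respondents
    imp = [5, 5, 4, 5, 4]
    prof = [3, 2, 3, 2, 3]
    print(round(borich_mwds(imp, prof), 2))  # -> 9.2
    ```

    Ranking all 33 items by this score yields the kind of ordering reported above, with the largest scores marking the priority training needs.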

  2. Wrist extension strength required for power grip: a study using a radial nerve block model.

    Science.gov (United States)

    Suzuki, T; Kunishi, T; Kakizaki, J; Iwakura, N; Takahashi, J; Kuniyoshi, K

    2012-06-01

    The aim of this study was to investigate the correlation of wrist extension strength (WES) and grip strength (GS) using a radial nerve block, and to determine the WES required to prevent the "wrist flexion phenomenon" (antagonistic WES) when making a fist. We tested 14 arms in seven healthy males. WES and GS were measured before blocking as standard WES and standard GS. All participants then had radial nerve blocks with mepivacaine hydrochloride. During the recovery process from radial nerve blockade, WES and GS were recorded every 5 minutes. There was a very strong correlation between WES and GS (p < 0.0001). The mean antagonistic WES was 51% of standard WES, and the mean GS, recorded at the same time, was 66% of standard GS.

  3. Extension agents' technical knowledge requirements for effective ...

    African Journals Online (AJOL)

    Technical knowledge requirements of extension agents were investigated in this study. Data for the study were collected with the aid of a structured questionnaire administered to 78 respondents. It was found that respondents were mainly males, were married, were in the middle age category, had BSc/HND, made ...

  4. Requirement Generation For The Habitation Extension Module

    Science.gov (United States)

    Hempsell, M.

    As part of a debate within the United Kingdom regarding its policy of avoiding projects involving human space flight, a study design was produced to explore the implications of a late entry as a full partner in the International Space Station (ISS). This objective generates many diverse requirements from the two primary stakeholders, the existing ISS partners and the United Kingdom itself. It was found that a Soyuz/Fregat-launched Habitation Extension Module with a logistic supply capability could meet all these requirements. It is unusual for a system to successfully meet such a wide range of requirements, but in this case the ability to scope the requirements in a single objective, and the flexibility inherent in the wide design space created by the many options, have made it possible.

  5. Extensions of the Standard Model

    CERN Document Server

    Zwirner, Fabio

    1996-01-01

    Rapporteur talk at the International Europhysics Conference on High Energy Physics, Brussels (Belgium), July 27-August 2, 1995. This talk begins with a brief general introduction to the extensions of the Standard Model, reviewing the ideology of effective field theories and its practical implications. The central part deals with candidate extensions near the Fermi scale, focusing on some phenomenological aspects of the Minimal Supersymmetric Standard Model. The final part discusses some possible low-energy implications of further extensions near the Planck scale, namely superstring theories.

  6. Extensions of the standard model

    International Nuclear Information System (INIS)

    Ramond, P.

    1983-01-01

    In these lectures we focus on several issues that arise in theoretical extensions of the standard model. First we describe the kinds of fermions that can be added to the standard model without affecting known phenomenology. We focus in particular on three types: the vector-like completion of the existing fermions, as would be predicted by a Kaluza-Klein type theory, which we find cannot be realistically achieved without some chiral symmetry; fermions which are vector-like by themselves, such as appear in supersymmetric extensions; and finally anomaly-free chiral sets of fermions. We note that a chiral symmetry, such as the Peccei-Quinn symmetry, can be used to produce a vector-like theory which, at scales less than M_W, appears to be chiral. Next, we turn to the analysis of the second hierarchy problem, which arises in Grand Unified extensions of the standard model and plays a crucial role in proton decay of supersymmetric extensions. We review the known mechanisms for avoiding this problem and present a new one which seems to lead to the (family) triplication of the gauge group. Finally, this being a summer school, we present a list of homework problems. 44 references

  7. Competency Modeling in Extension Education: Integrating an Academic Extension Education Model with an Extension Human Resource Management Model

    Science.gov (United States)

    Scheer, Scott D.; Cochran, Graham R.; Harder, Amy; Place, Nick T.

    2011-01-01

    The purpose of this study was to compare and contrast an academic extension education model with an Extension human resource management model. The academic model of 19 competencies was similar across the 22 competencies of the Extension human resource management model. There were seven unique competencies for the human resource management model.…

  8. Control and Modeling of Extensible Continuum Robots

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this research is to develop fundamental control theory, dynamic modeling, and control technology for extensible continuum robotic manipulators. These...

  9. A glacier runoff extension to the Precipitation Runoff Modeling System

    Science.gov (United States)

    A. E. Van Beusekom; R. J. Viger

    2016-01-01

    A module to simulate glacier runoff, PRMSglacier, was added to PRMS (Precipitation Runoff Modeling System), a distributed-parameter, physical-process hydrological simulation code. The extension does not require extensive on-glacier measurements or computational expense but still relies on physical principles over empirical relations as much as is feasible while...

  10. An extensible analysable system model

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, Rene Rydhof

    2008-01-01

    Analysing real-world systems for vulnerabilities with respect to security and safety threats is a difficult undertaking, not least due to a lack of availability of formalisations for those systems. While both formalisations and analyses can be found for artificial systems such as software…, this does not hold for real physical systems. Approaches such as threat modelling try to target the formalisation of the real-world domain, but still are far from the rigid techniques available in security research. Many currently available approaches to assurance of critical infrastructure security… are based on (quite successful) ad-hoc techniques. We believe they can be significantly improved beyond the state of the art by pairing them with static analysis techniques. In this paper we present an approach to both formalising those real-world systems, as well as providing an underlying semantics, which…

  11. Economic modeling for life extension decision making

    International Nuclear Information System (INIS)

    Farber, M.A.; Harrison, D.L.; Carlson, D.D.

    1986-01-01

    This paper presents a methodology for the economic and financial analysis of nuclear plant life extension under uncertainty and demonstrates its use in a case analysis. While the economic and financial evaluation of life extension does not require new analytical tools, such studies should be based on the following three premises. First, the methodology should examine effects at the level of the company or utility system, because the most important economic implications of life extension relate to the altered generation system expansion plan. Second, it should focus on the implications of uncertainty in order to understand the factors that most affect life extension benefits and identify risk management efforts. Third, the methodology should address multiple objectives, at a minimum, both economic and financial objectives

  12. Extending the Agricultural Extension Model. Preliminary Draft.

    Science.gov (United States)

    Rogers, Everett M.; And Others

    The purposes of this report are: to describe the main elements of the U.S. agricultural extension model and its effects on the agricultural revolution; to analyze attempts to extend this model to non-agricultural technology and/or to less developed countries; and to draw general conclusions about the diffusion of technological innovations, with…

  13. Symmetric Functional Model for Extensions of Hermitian

    CERN Document Server

    Ryzhov, V

    2006-01-01

    This paper offers the functional model of a class of non-selfadjoint extensions of a Hermitian operator with equal deficiency indices. The explicit form of dilation of a dissipative extension is offered and the symmetric form of the Sz.-Nagy-Foiaş model as developed by B. Pavlov is constructed. A variant of the functional model for a general non-selfadjoint non-dissipative extension is formulated. We illustrate the theory by two examples: singular perturbations of the Laplace operator in L_2(R^3) by a finite number of point interactions, and the Schrödinger operator on the half axis (0, ∞) in the Weyl limit circle case at infinity.

  14. Mathematical model of subscriber extension line

    OpenAIRE

    Petříková, Iva; Diviš, Zdeněk; Tesař, Zdeněk

    2012-01-01

    The paper focuses on measurement properties of metallic subscriber extension lines to build a regression mathematical model for a symmetric pair cable. The regression model is compared with an analytical model based on a theoretical description of transfer parameters for this type of line. The output of the paper should demonstrate the impact of electromagnetic interference on the symmetric pair. The paper also describes the method to identify the interference sources and ...

  15. Applications and extensions of degradation modeling

    International Nuclear Information System (INIS)

    Hsu, F.; Subudhi, M.; Samanta, P.K.; Vesely, W.E.

    1991-01-01

    Component degradation modeling, being developed to understand the aging process, can have many applications with potential advantages. Previous work has focused on developing the basic concepts and mathematical development of a simple degradation model. Using this simple model, times of degradation and failure occurrences were analyzed for standby components to detect indications of aging and to infer the effectiveness of maintenance in preventing age-related degradations from transforming into failures. Degradation modeling approaches can have broader applications in aging studies, and in this paper we discuss some of the extensions and applications of degradation modeling. The application and extension of degradation modeling approaches presented in this paper cover two aspects: (1) application to a continuously operating component, and (2) extension of the approach to analyze the degradation-failure rate relationship. The application of the modeling approach to a continuously operating component (namely, air compressors) shows the usefulness of this approach in studying aging effects and the role of maintenance in this type of component. In this case, aging effects in air compressors are demonstrated by the increase in both the degradation and failure rates, and the faster increase in the failure rate compared to the degradation rate shows the ineffectiveness of the existing maintenance practices. The degradation-failure rate relationship was analyzed using data from residual heat removal system pumps. A simple linear model with a time lag between these two parameters was studied. The application in this case showed a time lag of 2 years for degradations to affect failure occurrences. 2 refs.
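
    The degradation-failure analysis summarized above amounts to fitting a lagged linear relation between the degradation rate and the failure rate and selecting the lag that fits best; the 2-year figure is the lag identified for the residual heat removal pump data. A rough sketch of that kind of fit on synthetic yearly rates (all names and numbers below are illustrative, not taken from the report):

    ```python
    import numpy as np

    def best_lag_linear_fit(degradation_rate, failure_rate, max_lag=5):
        """Fit failure_rate[t] ~ a + b * degradation_rate[t - lag] for each
        candidate lag and return the lag (in sample steps) with the smallest
        least-squares residual, along with the fitted coefficients."""
        best = None
        for lag in range(max_lag + 1):
            x = degradation_rate[: len(degradation_rate) - lag]
            y = failure_rate[lag:]
            A = np.vstack([np.ones_like(x), x]).T
            coef, res, *_ = np.linalg.lstsq(A, y, rcond=None)
            sse = float(res[0]) if res.size else 0.0
            if best is None or sse < best[1]:
                best = (lag, sse, coef)
        lag, _, (a, b) = best
        return lag, a, b

    # Synthetic yearly data in which failures respond to degradations 2 years later
    years = np.arange(12)
    deg = 0.5 + 0.05 * years**2
    fail = 0.2 + 0.8 * (0.5 + 0.05 * (years - 2) ** 2)
    print(best_lag_linear_fit(deg, fail)[0])  # -> 2
    ```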

  16. Applications and extensions of degradation modeling

    Energy Technology Data Exchange (ETDEWEB)

    Hsu, F.; Subudhi, M.; Samanta, P.K. [Brookhaven National Lab., Upton, NY (United States); Vesely, W.E. [Science Applications International Corp., Columbus, OH (United States)

    1991-12-31

    Component degradation modeling, being developed to understand the aging process, can have many applications with potential advantages. Previous work has focused on developing the basic concepts and mathematical development of a simple degradation model. Using this simple model, times of degradation and failure occurrences were analyzed for standby components to detect indications of aging and to infer the effectiveness of maintenance in preventing age-related degradations from transforming into failures. Degradation modeling approaches can have broader applications in aging studies, and in this paper we discuss some of the extensions and applications of degradation modeling. The application and extension of degradation modeling approaches presented in this paper cover two aspects: (1) application to a continuously operating component, and (2) extension of the approach to analyze the degradation-failure rate relationship. The application of the modeling approach to a continuously operating component (namely, air compressors) shows the usefulness of this approach in studying aging effects and the role of maintenance in this type of component. In this case, aging effects in air compressors are demonstrated by the increase in both the degradation and failure rates, and the faster increase in the failure rate compared to the degradation rate shows the ineffectiveness of the existing maintenance practices. The degradation-failure rate relationship was analyzed using data from residual heat removal system pumps. A simple linear model with a time lag between these two parameters was studied. The application in this case showed a time lag of 2 years for degradations to affect failure occurrences. 2 refs.

  17. Applications and extensions of degradation modeling

    Energy Technology Data Exchange (ETDEWEB)

    Hsu, F.; Subudhi, M.; Samanta, P.K. (Brookhaven National Lab., Upton, NY (United States)); Vesely, W.E. (Science Applications International Corp., Columbus, OH (United States))

    1991-01-01

    Component degradation modeling, being developed to understand the aging process, can have many applications with potential advantages. Previous work has focused on developing the basic concepts and mathematical development of a simple degradation model. Using this simple model, times of degradation and failure occurrences were analyzed for standby components to detect indications of aging and to infer the effectiveness of maintenance in preventing age-related degradations from transforming into failures. Degradation modeling approaches can have broader applications in aging studies, and in this paper we discuss some of the extensions and applications of degradation modeling. The application and extension of degradation modeling approaches presented in this paper cover two aspects: (1) application to a continuously operating component, and (2) extension of the approach to analyze the degradation-failure rate relationship. The application of the modeling approach to a continuously operating component (namely, air compressors) shows the usefulness of this approach in studying aging effects and the role of maintenance in this type of component. In this case, aging effects in air compressors are demonstrated by the increase in both the degradation and failure rates, and the faster increase in the failure rate compared to the degradation rate shows the ineffectiveness of the existing maintenance practices. The degradation-failure rate relationship was analyzed using data from residual heat removal system pumps. A simple linear model with a time lag between these two parameters was studied. The application in this case showed a time lag of 2 years for degradations to affect failure occurrences. 2 refs.

  18. Extension of association models to complex chemicals

    DEFF Research Database (Denmark)

    Avlund, Ane Søgaard

    Summary of "Extension of association models to complex chemicals", Ph.D. thesis by Ane Søgaard Avlund. The subject of this thesis is the application of SAFT-type equations of state (EoS). Accurate and predictive thermodynamic models are important in many industries including the petroleum industry… … do not account for steric self-hindrance for tree-like structures. An important practical problem is how to obtain optimal and consistent parameters. Moreover, multifunctional associating molecules represent a special challenge. In this work two equations of state using the SAFT theory for association are used: CPA and sPC-SAFT. Phase equilibrium and monomer fraction calculations with sPC-SAFT for methanol are used in the thesis to illustrate the importance of parameter estimation when using SAFT. Different parameter sets give similar pure component vapor pressure and liquid density results, whereas very…

  19. Complex singlet extension of the standard model

    International Nuclear Information System (INIS)

    Barger, V.; Langacker, P.; McCaskey, M.; Ramsey-Musolf, M.; Shaughnessy, G.

    2009-01-01

    We analyze a simple extension of the standard model (SM) obtained by adding a complex singlet to the scalar sector (cxSM). We show that the cxSM can contain one or two viable cold dark matter candidates and analyze the conditions on the parameters of the scalar potential that yield the observed relic density. When the cxSM potential contains a global U(1) symmetry that is both softly and spontaneously broken, it contains both a viable dark matter candidate and the ingredients necessary for a strong first order electroweak phase transition as needed for electroweak baryogenesis. We also study the implications of the model for discovery of a Higgs boson at the Large Hadron Collider

  20. Sequence modelling and an extensible data model for genomic database

    Energy Technology Data Exchange (ETDEWEB)

    Li, Peter Wei-Der [California Univ., San Francisco, CA (United States); Univ. of California, Berkeley, CA (United States)

    1992-01-01

    The Human Genome Project (HGP) plans to sequence the human genome by the beginning of the next century. It will generate DNA sequences of more than 10 billion bases and complex marker sequences (maps) of more than 100 million markers. All of this information will be stored in database management systems (DBMSs). However, existing data models do not have the abstraction mechanism for modelling sequences, and existing DBMSs do not have operations for complex sequences. This work addresses the problem of sequence modelling in the context of the HGP and the more general problem of an extensible object data model that can incorporate the sequence model as well as existing and future data constructs and operators. First, we proposed a general sequence model that is application and implementation independent. This model is used to capture the sequence information found in the HGP at the conceptual level. In addition, abstract and biological sequence operators are defined for manipulating the modelled sequences. Second, we combined many features of semantic and object-oriented data models into an extensible framework, which we called the "Extensible Object Model", to address the need for a modelling framework for incorporating the sequence data model with other types of data constructs and operators. This framework is based on the conceptual separation between constructors and constraints. We then used this modelling framework to integrate the constructs for the conceptual sequence model. The Extensible Object Model is also defined with a graphical representation, which is useful as a tool for database designers. Finally, we defined a query language to support this model and implemented the query processor to demonstrate the feasibility of the extensible framework and the usefulness of the conceptual sequence model.

  1. Sequence modelling and an extensible data model for genomic database

    Energy Technology Data Exchange (ETDEWEB)

    Li, Peter Wei-Der (California Univ., San Francisco, CA (United States) Lawrence Berkeley Lab., CA (United States))

    1992-01-01

    The Human Genome Project (HGP) plans to sequence the human genome by the beginning of the next century. It will generate DNA sequences of more than 10 billion bases and complex marker sequences (maps) of more than 100 million markers. All of this information will be stored in database management systems (DBMSs). However, existing data models do not have the abstraction mechanism for modelling sequences, and existing DBMSs do not have operations for complex sequences. This work addresses the problem of sequence modelling in the context of the HGP and the more general problem of an extensible object data model that can incorporate the sequence model as well as existing and future data constructs and operators. First, we proposed a general sequence model that is application and implementation independent. This model is used to capture the sequence information found in the HGP at the conceptual level. In addition, abstract and biological sequence operators are defined for manipulating the modelled sequences. Second, we combined many features of semantic and object-oriented data models into an extensible framework, which we called the "Extensible Object Model", to address the need for a modelling framework for incorporating the sequence data model with other types of data constructs and operators. This framework is based on the conceptual separation between constructors and constraints. We then used this modelling framework to integrate the constructs for the conceptual sequence model. The Extensible Object Model is also defined with a graphical representation, which is useful as a tool for database designers. Finally, we defined a query language to support this model and implemented the query processor to demonstrate the feasibility of the extensible framework and the usefulness of the conceptual sequence model.

  2. Reality Check: OK Extension Helps Teachers Meet Financial Education Requirements

    Science.gov (United States)

    St. Pierre, Eileen; Simpson, Mickey; Moffat, Susan; Cothren, Phillis

    2011-01-01

    According to the Jump$tart Coalition, Oklahoma is one of 24 states to adopt financial education requirements for students (Jump$tart Coalition, 2010). The Passport to Financial Literacy Act of 2007, Oklahoma House Bill 1476, requires Oklahoma students in grades 7 through 12 to fulfill established financial literacy requirements to graduate with a…

  3. 75 FR 9953 - Definition and Requirements for a Nationally Recognized Testing Laboratory (NRTL); Extension of...

    Science.gov (United States)

    2010-03-04

    ...] Definition and Requirements for a Nationally Recognized Testing Laboratory (NRTL); Extension of the Office of Management and Budget's (OMB) Approval of Information Collection (Paperwork) Requirements AGENCY... its Regulation on the Definition and Requirements for a Nationally Recognized Testing Laboratory (29...

  4. Agriflection: A Learning Model for Agricultural Extension in South Africa

    Science.gov (United States)

    Worth, S. H.

    2006-01-01

    Prosperity--continuous and sustainable wealth creation--is an elusive goal in South African smallholder agriculture. This paper suggests that agricultural extension can facilitate realising this objective if an appropriate approach to extension can be developed. To develop such an approach requires that the definition of extension and the…

  5. Mediating Informal Care Online: Findings from an Extensive Requirements Analysis

    Directory of Open Access Journals (Sweden)

    Christiane Moser

    2015-05-01

    Organizing and satisfying the increasing demand for social and informal care for older adults is an important topic. We aim at building a peer-to-peer exchange platform that empowers older adults to benefit from receiving support for daily activities and reciprocally offering support to others. In situated interviews and within a survey we investigated the requirements and needs of 246 older adults with mild impairments. Additionally, we conducted an interpretative role analysis of older adults' collaborative care processes (i.e., support exchange practices) in order to identify social roles and understand the inherent expectations towards the execution of support. We describe our target group in the form of personas and different social roles, as well as user requirements for establishing a successful peer-to-peer collaboration. We also consider our findings from the perspective of social capital theory, which allows us to describe in our requirements how relationships provide valuable social resources (i.e., social capital) for informal and social care.

  6. Testing Extension Services through AKAP Models

    Science.gov (United States)

    De Rosa, Marcello; Bartoli, Luca; La Rocca, Giuseppe

    2014-01-01

    Purpose: The aim of the paper is to analyse the attitude of Italian farms in gaining access to agricultural extension services (AES). Design/methodology/approach: The ways Italian farms use AES are described through the AKAP (Awareness, Knowledge, Adoption, Product) sequence. This article investigated the AKAP sequence by submitting a…

  7. Extensions to the Joshua GDMS to support environmental science and analysis data handling requirements

    International Nuclear Information System (INIS)

    Suich, J.E.; Honeck, H.C.

    1978-01-01

    For the past ten years, a generalized data management system (GDMS) called JOSHUA has been in use at the Savannah River Laboratory. Originally designed and implemented to support nuclear reactor physics and safety computational applications, the system is now also supporting environmental science modeling and impact assessment. Extensions to the original system are being developed to meet new data handling requirements, which include more general owner-member record relationships occurring in geographically encoded data sets, unstructured (relational) inquiry capability, cartographic analysis and display, and offsite data exchange. This paper discusses the need for these capabilities, places them in perspective as generic scientific data management activities, and presents the planned context-free extensions to the basic JOSHUA GDMS.

  8. Extensions to the Joshua GDMS to support environmental science and analysis data handling requirements

    International Nuclear Information System (INIS)

    Suich, J.E.; Honeck, H.C.

    1977-01-01

    For the past ten years, a generalized data management system (GDMS) called JOSHUA has been in use at the Savannah River Laboratory. Originally designed and implemented to support nuclear reactor physics and safety computational applications, the system is now also supporting environmental science modeling and impact assessment. Extensions to the original system are being developed to meet new data handling requirements, which include more general owner-member record relationships occurring in geographically encoded data sets, unstructured (relational) inquiry capability, cartographic analysis and display, and offsite data exchange. This paper discusses the need for these capabilities, places them in perspective as generic scientific data management activities, and presents the planned context-free extensions to the basic JOSHUA GDMS

  9. Java Architecture for Detect and Avoid Extensibility and Modeling

    Science.gov (United States)

    Santiago, Confesor; Mueller, Eric Richard; Johnson, Marcus A.; Abramson, Michael; Snow, James William

    2015-01-01

    Unmanned aircraft will be equipped with a detect-and-avoid (DAA) system that enables them to comply with the requirement to "see and avoid" other aircraft, an important layer in the overall set of procedural, strategic and tactical separation methods designed to prevent mid-air collisions. This paper describes a capability called Java Architecture for Detect and Avoid Extensibility and Modeling (JADEM), developed to prototype and help evaluate various DAA technological requirements by providing a flexible and extensible software platform that models all major detect-and-avoid functions. Figure 1 illustrates JADEM's architecture. The surveillance module can be actual equipment on the unmanned aircraft or simulators that model the process by which on-board sensors detect other aircraft and provide track data to the traffic display. The track evaluation function evaluates each detected aircraft and decides whether to provide an alert to the pilot and its severity. Guidance is a combination of intruder track information, alerting, and avoidance/advisory algorithms behind the tools shown on the traffic display to aid the pilot in determining a maneuver to avoid a loss of well clear. All these functions are designed with a common interface and configurable implementation, which is critical in exploring DAA requirements. To date, JADEM has been utilized in three computer simulations of the National Airspace System, three pilot-in-the-loop experiments using a total of 37 professional UAS pilots, and two flight tests using NASA's Predator-B unmanned aircraft, named Ikhana. The data collected have directly informed the quantitative separation standard for "well clear", the safety case, requirements development, and the operational environment for the DAA minimum operational performance standards. This work was performed by the Separation Assurance/Sense and Avoid Interoperability team under NASA's UAS Integration in the NAS project.
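
    JADEM itself is a Java framework and the abstract does not give its class names, so the sketch below is only an illustration, in Python and with hypothetical names, of the plug-in structure it describes: surveillance, track evaluation/alerting and guidance sitting behind a common interface so that real equipment, simulators, or alternative algorithms can be swapped in.

    ```python
    from abc import ABC, abstractmethod
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Track:
        intruder_id: str
        range_m: float
        closure_rate_mps: float

    class Surveillance(ABC):
        @abstractmethod
        def detect(self) -> List[Track]:
            """Return tracks from real sensors or from a sensor simulator."""

    class Alerting(ABC):
        @abstractmethod
        def evaluate(self, track: Track) -> int:
            """Return an alert severity level (0 = no alert)."""

    class Guidance(ABC):
        @abstractmethod
        def advise(self, track: Track, severity: int) -> str:
            """Return a maneuver advisory for the traffic display."""

    def daa_cycle(sensor: Surveillance, alerting: Alerting, guidance: Guidance):
        """One detect-and-avoid cycle with interchangeable implementations."""
        for track in sensor.detect():
            severity = alerting.evaluate(track)
            if severity > 0:
                print(guidance.advise(track, severity))
    ```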

  10. A glacier runoff extension to the Precipitation Runoff Modeling System

    Science.gov (United States)

    Van Beusekom, Ashley; Viger, Roland

    2016-01-01

    A module to simulate glacier runoff, PRMSglacier, was added to PRMS (Precipitation Runoff Modeling System), a distributed-parameter, physical-process hydrological simulation code. The extension does not require extensive on-glacier measurements or computational expense but still relies on physical principles over empirical relations as much as is feasible while maintaining model usability. PRMSglacier is validated on two basins in Alaska, the Wolverine and Gulkana Glacier basins, which have been studied since 1966 and have a substantial amount of data with which to test model performance over a long period of time covering a wide range of climatic and hydrologic conditions. When error in field measurements is considered, the Nash-Sutcliffe efficiencies of streamflow are 0.87 and 0.86, the absolute bias fractions of the winter mass balance simulations are 0.10 and 0.08, and the absolute bias fractions of the summer mass balances are 0.01 and 0.03, all computed over 42 years for the Wolverine and Gulkana Glacier basins, respectively. Without taking into account measurement error, the values are still within the range achieved by the more computationally expensive codes tested over shorter time periods.
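
    For reference, the two skill scores quoted above can be computed as in the generic sketch below (the bias-fraction convention shown is one common choice and may differ in detail from the paper's); the streamflow values are synthetic, not the Wolverine or Gulkana data.

    ```python
    import numpy as np

    def nash_sutcliffe(observed, simulated):
        """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations."""
        observed, simulated = np.asarray(observed, float), np.asarray(simulated, float)
        return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
            (observed - observed.mean()) ** 2)

    def bias_fraction(observed, simulated):
        """Absolute bias as a fraction of the mean observed magnitude."""
        observed, simulated = np.asarray(observed, float), np.asarray(simulated, float)
        return abs(simulated.mean() - observed.mean()) / np.abs(observed).mean()

    obs = np.array([3.0, 5.2, 8.1, 12.4, 9.3, 6.0, 4.1])  # synthetic flows
    sim = np.array([2.8, 5.6, 7.9, 11.8, 9.9, 6.3, 4.0])
    print(round(nash_sutcliffe(obs, sim), 2), round(bias_fraction(obs, sim), 3))
    ```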

  11. An Extensible Model and Analysis Framework

    Science.gov (United States)

    2010-11-01

    …of a pre-existing, open-source modeling and analysis framework known as Ptolemy II (http://ptolemy.org). The University of California, Berkeley … worked with the Air Force Research Laboratory, Rome Research Site, on adapting Ptolemy II for modeling and simulation of large-scale dynamics of Political… capabilities were prototyped in Ptolemy II and delivered via version control and software releases. Each of these capabilities specifically supports one or…

  12. Kidnapping model: an extension of Selten's game.

    Science.gov (United States)

    Iqbal, Azhar; Masson, Virginie; Abbott, Derek

    2017-12-01

    Selten's game is a kidnapping model where the probability of capturing the kidnapper is independent of whether the hostage has been released or executed. Most often, in view of the elevated sensitivities involved, authorities put greater effort and resources into capturing the kidnapper if the hostage has been executed, in contrast with the case when a ransom is paid to secure the hostage's release. In this paper, we study the asymmetric game when the probability of capturing the kidnapper depends on whether the hostage has been executed or not and find a new uniquely determined perfect equilibrium point in Selten's game.

  13. Requirements for Medical Modeling Languages

    Science.gov (United States)

    van der Maas, Arnoud A.F.; Ter Hofstede, Arthur H.M.; Ten Hoopen, A. Johannes

    2001-01-01

    Objective: The development of tailor-made domain-specific modeling languages is sometimes desirable in medical informatics. Naturally, the development of such languages should be guided. The purpose of this article is to introduce a set of requirements for such languages and show their application in analyzing and comparing existing modeling languages. Design: The requirements arise from the practical experience of the authors and others in the development of modeling languages in both general informatics and medical informatics. The requirements initially emerged from the analysis of information modeling techniques. The requirements are designed to be orthogonal, i.e., one requirement can be violated without violation of the others. Results: The proposed requirements for any modeling language are that it be “formal” with regard to syntax and semantics, “conceptual,” “expressive,” “comprehensible,” “suitable,” and “executable.” The requirements are illustrated using both the medical logic modules of the Arden Syntax as a running example and selected examples from other modeling languages. Conclusion: Activity diagrams of the Unified Modeling Language, task structures for work flows, and Petri nets are discussed with regard to the list of requirements, and various tradeoffs are thus made explicit. It is concluded that this set of requirements has the potential to play a vital role in both the evaluation of existing domain-specific languages and the development of new ones. PMID:11230383

  14. Cellular potts models multiscale extensions and biological applications

    CERN Document Server

    Scianna, Marco

    2013-01-01

    A flexible, cell-level, and lattice-based technique, the cellular Potts model accurately describes the phenomenological mechanisms involved in many biological processes. Cellular Potts Models: Multiscale Extensions and Biological Applications gives an interdisciplinary, accessible treatment of these models, from the original methodologies to the latest developments. The book first explains the biophysical bases, main merits, and limitations of the cellular Potts model. It then proposes several innovative extensions, focusing on ways to integrate and interface the basic cellular Potts model at the mesoscopic scale with approaches that accurately model microscopic dynamics. These extensions are designed to create a nested and hybrid environment, where the evolution of a biological system is realistically driven by the constant interplay and flux of information between the different levels of description. Through several biological examples, the authors demonstrate a qualitative and quantitative agreement with t...

  15. A model for the dynamics of extensible semiflexible polymers

    NARCIS (Netherlands)

    Barkema, G.T.; van Leeuwen, J.M.J.

    2012-01-01

    We present a model for semiflexible polymers in Hamiltonian formulation which interpolates between a Rouse chain and worm-like chain. Both models are realized as limits for the parameters. The model parameters can also be chosen to match the experimental force-extension curve for double-stranded

  16. Requirements for effective modelling strategies.

    NARCIS (Netherlands)

    Gaunt, J.L.; Riley, J.; Stein, A.; Penning de Vries, F.W.T.

    1997-01-01

    As a result of a recent BBSRC-funded workshop between soil scientists, modellers, statisticians and others to discuss issues relating to the derivation of complex environmental models, a set of modelling guidelines is presented and the required associated research areas are discussed.

  17. Farmworkers' Irrigation Schools: An Extension Model for Hispanic Farm Laborers.

    Science.gov (United States)

    Youmans, David; And Others

    1982-01-01

    Describes a model for Hispanic farm laborer irrigation schools that was developed, implemented, and evaluated by cooperative extension personnel. Success of the approach was due to attention to critical elements in the model, which is applicable to other adult basic education programs. (JOW)

  18. 78 FR 66670 - Housing Counseling Program: New Certification Requirements; Extension of Public Comment Period

    Science.gov (United States)

    2013-11-06

    ... URBAN DEVELOPMENT 24 CFR Part 214 Housing Counseling Program: New Certification Requirements; Extension... Housing Counseling Program regulations for the purpose of implementing the Dodd-Frank Wall Street Reform and Consumer Protection Act amendments to the housing counseling statute. This document announces that...

  19. Sequential Sampling Models in Cognitive Neuroscience: Advantages, Applications, and Extensions.

    Science.gov (United States)

    Forstmann, B U; Ratcliff, R; Wagenmakers, E-J

    2016-01-01

    Sequential sampling models assume that people make speeded decisions by gradually accumulating noisy information until a threshold of evidence is reached. In cognitive science, one such model--the diffusion decision model--is now regularly used to decompose task performance into underlying processes such as the quality of information processing, response caution, and a priori bias. In the cognitive neurosciences, the diffusion decision model has recently been adopted as a quantitative tool to study the neural basis of decision making under time pressure. We present a selective overview of several recent applications and extensions of the diffusion decision model in the cognitive neurosciences.
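
    The accumulate-to-threshold idea is easy to make concrete. Below is a minimal, generic simulation of single diffusion-decision-model trials (textbook parameterization; the parameter values are illustrative and not tied to any particular study): evidence drifts and diffuses from a starting point until it hits either boundary, and the crossing time plus a non-decision time gives the reaction time.

    ```python
    import numpy as np

    def simulate_ddm(drift, threshold, bias=0.5, noise=1.0, dt=0.001,
                     non_decision=0.3, rng=None):
        """Simulate one diffusion-decision-model trial.

        Evidence starts at bias * threshold and accumulates drift * dt plus
        Gaussian noise each step until it reaches 0 or the threshold.
        Returns (choice, reaction_time)."""
        rng = rng or np.random.default_rng()
        x, t = bias * threshold, 0.0
        while 0.0 < x < threshold:
            x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
            t += dt
        return (1 if x >= threshold else 0), t + non_decision

    rng = np.random.default_rng(1)
    trials = [simulate_ddm(drift=1.5, threshold=1.0, rng=rng) for _ in range(1000)]
    print(f"accuracy={np.mean([c for c, _ in trials]):.2f}, "
          f"mean RT={np.mean([rt for _, rt in trials]):.2f}s")
    ```

    Fitting the drift, threshold, bias and non-decision parameters to observed choice and reaction-time distributions is what lets the model decompose task performance into processing quality, response caution and a priori bias.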

  20. Real gauge singlet scalar extension of the Standard Model: A ...

    Indian Academy of Sciences (India)

    2013-03-05

    The simplest extension of the Standard Model (SM) is considered, in which a real SM gauge singlet scalar S with an additional discrete symmetry Z2 is introduced to the SM. This additional scalar can be a viable candidate for cold dark matter (CDM), since the stability of S is achieved by the application of Z2 ...

  1. Formal Requirements Modeling for Simulation-Based Verification

    OpenAIRE

    Otter, Martin; Thuy, Nguyen; Bouskela, Daniel; Buffoni, Lena; Elmqvist, Hilding; Fritzson, Peter; Garro, Alfredo; Jardin, Audrey; Olsson, Hans; Payelleville, Maxime; Schamai, Wladimir; Thomas, Eric; Tundis, Andrea

    2015-01-01

    This paper describes a proposal on how to model formal requirements in Modelica for simulation-based verification. The approach is implemented in the open-source Modelica_Requirements library. It requires extensions to the Modelica language that have been prototypically implemented in the Dymola and OpenModelica software. The design of the library is based on the FOrmal Requirement Modeling Language (FORM-L) defined by EDF, and on industrial use cases from EDF and Dassault Aviation. It uses...

  2. Analyzing the Required Professional Qualification for Agricultural Extension Experts in Operational Level in the Mazandaran Province

    Directory of Open Access Journals (Sweden)

    Amir Ahmadpour

    2015-08-01

    Extension experts who play an active role at the operational level are required to have certain indispensable competencies to enable them to provide the rural community with high-quality, applicable and important educational programs. Accordingly, the study sought to analyze the components of the professional qualifications of agricultural extension experts at the operational level. This is a descriptive survey study. The statistical population (agricultural extension experts at the operational level) comprised 290 persons; proportional stratified sampling based on the Krejcie-Morgan table was applied and 165 subjects were selected. The data collection tool was a researcher-made questionnaire; its content validity was approved by agricultural extension experts, and the KMO coefficient and Bartlett's test were used (KMO = 0.737). The data analysis showed that seven extracted factors (research, technical-professional, teaching, managerial, personality, communication and virtual technology factors) explain 63.691% of the total variance of the professional competencies of agricultural extension experts at the operational level in the province. The findings indicate that needs assessment, planning and evaluation based on scientific research methods, and the implementation of in-service training workshops for experts, are necessary. Particular attention should be paid by the Agriculture Organization to improving agents' skills in the cultivation of a variety of crops and in working with software and agricultural applications.

  3. Conformal Extensions of the Standard Model with Veltman Conditions

    DEFF Research Database (Denmark)

    Antipin, Oleg; Mojaza, Matin; Sannino, Francesco

    2014-01-01

    … the Higgs is predicted to have the experimental value of the mass, equal to 126 GeV. This model also predicts the existence of one more standard model singlet scalar boson with a mass of 541 GeV and the Higgs self-coupling to emerge radiatively. We study several other PNC examples that generally predict… a somewhat smaller mass of the Higgs to the perturbative order at which we have investigated them. Our results can be a useful guide when building extensions of the standard model featuring fundamental scalars.

  4. Nuclear EMC effect in non-extensive statistical model

    Energy Technology Data Exchange (ETDEWEB)

    Trevisan, Luis A. [Departamento de Matematica e Estatistica, Universidade Estadual de Ponta Grossa, 84010-790, Ponta Grossa, PR (Brazil); Mirez, Carlos [ICET, Universidade Federal dos Vales do Jequitinhonha e Mucuri - UFVJM, Campus do Mucuri, Rua do Cruzeiro 01, Jardim Sao Paulo, 39803-371, Teofilo Otoni, MG (Brazil)

    2013-05-06

    In the present work, we attempt to describe the nuclear EMC effect by using the proton structure functions obtained from the non-extensive statistical quark model. We recall that this model has three fundamental variables: the temperature T, the radius, and the Tsallis parameter q. By combining different small changes, good agreement with the experimental data may be obtained. Another interesting point of the model is that it allows a phenomenological interpretation, for instance, keeping q constant and changing the radius and the temperature, or changing the radius and q and keeping the temperature.

  5. Hard-sphere displacive model of extension twinning in magnesium

    OpenAIRE

    Cayron, Cyril

    2017-01-01

    A crystallographic displacive model is proposed for the extension twins in magnesium. The atomic displacements are established, and the homogeneous lattice distortion is analytically expressed as a continuous angular-distortive matrix that becomes a shear when the distortion is complete. The calculations prove that a volume change of 3% occurs for the intermediate states. The twinning plane, even if untilted and restored when the distortion is complete, is not fully invariant during the trans...

  6. Asymptotically Safe Standard Model Extensions arXiv

    CERN Document Server

    Pelaggi, Giulio Maria; Salvio, Alberto; Sannino, Francesco; Smirnov, Juri; Strumia, Alessandro

    We consider theories with a large number NF of charged fermions and compute the renormalisation group equations for the gauge, Yukawa and quartic couplings resummed at leading order in NF. We construct extensions of the Standard Model where SU(2) and/or SU(3) are asymptotically safe. When the same procedure is applied to the Abelian U(1) factor, we find that the Higgs quartic cannot be made asymptotically safe and stay perturbative at the same time.

  7. Thinning factor distributions viewed through numerical models of continental extension

    Science.gov (United States)

    Svartman Dias, Anna Eliza; Hayman, Nicholas W.; Lavier, Luc L.

    2016-12-01

    A long-standing question surrounding rifted margins concerns how the observed fault-restored extension in the upper crust is usually less than that calculated from subsidence models or from crustal thickness estimates, the so-called "extension discrepancy." Here we revisit this issue drawing on recently completed numerical results. We extract thinning profiles from four end-member geodynamic model rifts with varying width and asymmetry and propose tectonic models that best explain those results. We then relate the spatial and temporal evolution of upper to lower crustal thinning, or crustal depth-dependent thinning (DDT), and crustal thinning to mantle thinning, or lithospheric DDT, which are difficult to achieve in natural systems due to the lack of observations that constrain thinning at different stages between prerift extension and lithospheric breakup. Our results support the hypothesis that crustal DDT cannot be the main cause of the extension discrepancy, which may be overestimated because of the difficulty in recognizing distributed deformation, and polyphase and detachment faulting in seismic data. More importantly, the results support that lithospheric DDT is likely to dominate at specific stages of rift evolution because crustal and mantle thinning distributions are not always spatially coincident and at times are not even balanced by an equal magnitude of thinning in two dimensions. Moreover, either pure or simple shear models can apply at various points of time and space depending on the type of rift. Both DDT and pure/simple shear variations across space and time can result in observed complex fault geometries, uplift/subsidence, and thermal histories.

  8. Does the extension of the nuclear power plant operation require the agreement of the Federal Council?

    International Nuclear Information System (INIS)

    Witt, Siegfried de

    2010-01-01

    It is hotly disputed whether an extension of nuclear power plant operation makes sense in terms of energy policy, and whether it impedes or accelerates the restructuring of the power supply. Depending on the political interests involved, the constitutional question is also exploited of whether a law increasing the residual electricity volumes in Supplement 3 of the Atomic Energy Act requires the consent of the Federal Council. The answer to this question should not be determined by one's personal opinion on nuclear power; it should be examined and decided on the basis of constitutional requirements. This is done in this contribution, taking into account expert opinions and statements.

  9. Extension of the Navy aerosol model to coastal areas

    NARCIS (Netherlands)

    Piazzola, J.; Eijk, A.M.J. van; Leeuw, G. de

    2000-01-01

    The performance assessment of electro-optical (EO) systems with propagation prediction codes requires accurate atmospheric models. The model that is most frequently used for the prediction of aerosols and their effect on extinction in the marine atmosphere is the U.S. Navy aerosol model (NAM)

  10. Modelling Dynamic Topologies via Extensions of VDM-RT

    DEFF Research Database (Denmark)

    Nielsen, Claus Ballegård

    Only a few formal methods include descriptions of the network topology that the modelled system is deployed onto. In VDM Real-Time (VDM-RT) this has been enabled for distributed systems that have a static structure. However, when modelling dynamic systems this fixed topology becomes an issue: systems with highly distributed and alternating relationships cannot be expressed correctly in a static model. This document describes how VDM-RT can be extended with new language constructs to enable the description of dynamic reconfiguration of the network topology during the runtime execution of a model. The extension is developed on the basis of a case study involving a dynamic system with a constantly changing system topology. Starting from the case study, a model is developed that uses the static version of VDM-RT in order to reveal the limitations of the language. The case study...

  11. Conflict Resolution for Product Performance Requirements Based on Propagation Analysis in the Extension Theory

    Directory of Open Access Journals (Sweden)

    Yanwei Zhao

    2014-01-01

    Traditional product data mining methods focus mainly on static data. Performance requirements are generally met, as far as possible, by finding suitable cases and changing their structures. However, once one requirement is satisfied by the changed structure, the effects on the other requirements are not examined through correlation analysis; that is, design conflicts are not identified and resolved. An approach to resolving such conflict problems is proposed based on propagation analysis in Extension Theory. Firstly, the extension distance is improved to better evaluate the similarity among cases, and a case retrieval method is developed. Secondly, the transformations that can be made to the selected cases are formulated by understanding the nature of the conflicts among the different performance requirements, which leads to an extension transformation strategy for coordinating conflicts using propagation analysis. Thirdly, the effects and levels of propagation are determined by analysing the performance values before and after the transformations, and thus a coordination strategy for co-existing conflicts among multiple performance requirements is developed. The method has been implemented in a working prototype system for supporting decision-making, and it has been demonstrated to be feasible and effective by resolving the conflicts of noise, exhaust, weight and intake pressure in screw air compressor performance design.
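
    For orientation, the classical extension distance of a value x from an interval [a, b] is |x - (a+b)/2| - (b-a)/2: negative inside the interval, zero at the endpoints, positive outside. The paper improves on this form, but a sketch of how the classical distance can drive case retrieval against required performance ranges (the attributes and numbers below are hypothetical) looks like this:

    ```python
    def extension_distance(x, a, b):
        """Classical extension distance of x from [a, b] (textbook form;
        the paper uses an improved variant)."""
        return abs(x - (a + b) / 2.0) - (b - a) / 2.0

    def retrieve(cases, requirement_ranges):
        """Rank stored cases by total extension distance of their attributes
        from the required ranges; smaller totals mean better matches."""
        def score(attrs):
            return sum(extension_distance(attrs[k], lo, hi)
                       for k, (lo, hi) in requirement_ranges.items())
        return sorted(cases.items(), key=lambda kv: score(kv[1]))

    # Hypothetical compressor cases and performance requirements
    cases = {"c1": {"noise_db": 72, "weight_kg": 310},
             "c2": {"noise_db": 68, "weight_kg": 350}}
    reqs = {"noise_db": (60, 70), "weight_kg": (280, 330)}
    print(retrieve(cases, reqs)[0][0])  # best-matching case first -> c1
    ```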

  12. The Standard-Model Extension and Gravitational Tests

    Directory of Open Access Journals (Sweden)

    Jay D. Tasson

    2016-10-01

    The Standard-Model Extension (SME) provides a comprehensive effective field-theory framework for the study of CPT and Lorentz symmetry. This work reviews the structure and philosophy of the SME and provides some intuitive examples of symmetry violation. The results of recent gravitational tests performed within the SME are summarized, including analysis of results from the Laser Interferometer Gravitational-Wave Observatory (LIGO), sensitivities achieved in short-range gravity experiments, constraints from cosmic-ray data, and results achieved by studying planetary ephemerides. Some proposals and ongoing efforts will also be considered, including gravimeter tests, tests of the Weak Equivalence Principle, and antimatter experiments. Our review of the above topics is augmented by several original extensions of the relevant work. We present new examples of symmetry violation in the SME and use the cosmic-ray analysis to place first-ever constraints on 81 additional operators.

  13. Precision calculations in supersymmetric extensions of the Standard Model

    International Nuclear Information System (INIS)

    Slavich, P.

    2013-01-01

    This dissertation is organized as follows: in the next chapter I will summarize the structure of the supersymmetric extensions of the standard model (SM), namely the MSSM (Minimal Supersymmetric Standard Model) and the NMSSM (Next-to-Minimal Supersymmetric Standard Model), provide a brief overview of different patterns of SUSY (supersymmetry) breaking, and discuss some issues on the renormalization of the input parameters that are common to all calculations of higher-order corrections in SUSY models. In chapter 3 I will review and describe computations on the production of MSSM Higgs bosons in gluon fusion. In chapter 4 I will review results on the radiative corrections to the Higgs boson masses in the NMSSM. In chapter 5 I will review the calculation of BR(B → X_s γ) in the MSSM with Minimal Flavor Violation (MFV). Finally, in chapter 6 I will briefly summarize the outlook of my future research. (author)

  14. Modeling Brand Extension as a Real Option: How Expectation, Competition and Financial Constraints Drive the Timing of Extensions

    NARCIS (Netherlands)

    L.H. Pattikawa

    2006-01-01

    Despite their strategic importance, firms' motivations to extend brands have received only modest attention from marketing scholars. We use multiple-event duration models to examine the timing of launching brand extensions. We provide a theoretical framework of brand extensions based on

  15. Top quark and Higgs physics in standard model extensions

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, Patrick Jose

    2012-05-25

    In this thesis we have studied several extensions of the SM and their implications for the strength and structure of the tbW vertex, for the production and decays of pseudoscalar and heavy Higgs scalars at the LHC, and for the effects that models with a fourth generation have on electroweak precision observables. Apart from the SM with a fourth generation of chiral fermions, the extensions we studied all feature an extended electroweak symmetry breaking (EWSB) sector. In the case of the type-II 2HDM and the MSSM, the extended EWSB sector consists of elementary Higgs fields. In the case of Topcolor-assisted Technicolor (TC2), which is a model of dynamical EWSB, the scalar and pseudoscalar fields are composite. By scanning over the phenomenologically and theoretically allowed regions of the respective parameter spaces, we determined the largest possible cross sections σ(pp → φ → VV'), with VV' ∈ {W+W-, ZZ, γγ, Zγ}, for both the heavy scalar and pseudoscalar states in the above models. We found that non-SUSY models with an extended Higgs sector and only three generations, namely the type-II 2HDM and TC2, still allow for observable pseudoscalar cross sections σ(pp → A → VV') at the LHC, in particular for the final states W+W- and γγ. In the MSSM, the discovery of the pseudoscalar A through its decays into electroweak gauge bosons is very unlikely. However, scalar cross sections σ(pp → H → W+W-) can still be of observable size at the LHC in large parts of the MSSM parameter space. SM extensions with an extended EWSB sector and four chiral generations are strongly disfavoured; direct Higgs boson searches exclude large parts of the parameter space and it is challenging to bring such an extension into accordance with electroweak precision data. On the other hand, models with additional vector-like quarks and an extended Higgs sector are still viable. The SM with four chiral generations is (still) not

  16. Modeling Effects of Axial Extension on Arterial Growth and Remodeling

    Science.gov (United States)

    Valentín, A.; Humphrey, J.D.

    2013-01-01

    Diverse mechanical perturbations elicit arterial growth and remodeling responses that appear to optimize structure and function so as to achieve mechanical homeostasis. For example, it is well known that functional adaptations to sustained changes in transmural pressure and blood flow primarily affect wall thickness and caliber to restore circumferential and wall shear stresses toward normal. More recently, however, it has been shown that changes in axial extension similarly prompt dramatic cell and matrix reorganization and turnover, resulting in marked changes in unloaded geometry and mechanical behavior that presumably restore axial stress toward normal. Because of the inability to infer axial stress from in vivo measurements, simulations are needed to examine this hypothesis and to guide the design of future experiments. In this paper, we show that a constrained mixture model predicts salient features of observed responses to step increases in axial extension, including marked increases in fibrous constituent production, leading to a compensatory lengthening that restores original mechanical behavior. Because axial extension can be modified via diverse surgical procedures, including bypass operations and exploited in tissue regeneration research, there is a need for increased attention to this important aspect of arterial biomechanics and mechanobiology. PMID:19649667

  17. Neutron electric dipole moment and extension of the standard model

    International Nuclear Information System (INIS)

    Oshimo, Noriyuki

    2001-01-01

    A nonvanishing value for the electric dipole moment (EDM) of the neutron is a prominent signature for CP violation. The EDM induced by the Kobayashi-Maskawa mechanism of the standard model (SM) has a small magnitude and its detection will be very difficult. However, since baryon asymmetry of the universe cannot be accounted for by the SM, there should exist some other source of CP violation, which may generate a large magnitude for the EDM. One of the most hopeful candidates for physics beyond the SM is the supersymmetric standard model, which contains such sources of CP violation. This model suggests that the EDM has a magnitude not much smaller than the present experimental bounds. Progress in measuring the EDM provides very interesting information about extension of the SM. (author)

  18. An Extension of the Burridge-Knopoff Model for Friction

    Directory of Open Access Journals (Sweden)

    Veturia Chiroiu

    2015-09-01

    Full Text Available The paper presents an extension of the Burridge-Knopoff (BK) model with an additional kinetic equation for the friction force, in order to reproduce both the velocity-weakening friction between the tire and the road and the increase of static friction with time when the car is not moving. The BK model was initially proposed to investigate statistical properties of earthquakes. In this model the sliding force decreases monotonously from a reference value, and the static friction can have negative values to prevent back sliding. The stability of the system is affected, and the sliding regime at small sliding velocities and large stiffness cannot be reproduced. The extended BK model assures the stability of the sliding/stationary-sliding diagram and correctly reproduces the stability diagram for sliding friction under various loading conditions.

  19. A right colonic volvulus requiring extensive colectomy in an infant with trisomy 13

    Directory of Open Access Journals (Sweden)

    Kazuto Suda

    2015-12-01

    Full Text Available Colonic volvulus is a rare surgical emergency condition in children. Only approximately 40 children with cecal volvulus have been reported in the English literature in the past 50 years. Among these, right colonic volvulus involving the long segment from the ileal end to the transverse colon, as in our case, has been limited to a few reports. Neurodevelopmental delay and a history of chronic constipation have been reported as common associated disorders. This is the first report of a case of right colonic volvulus in an infant with trisomy 13 who required extensive colectomy during an emergency laparotomy.

  20. Theory of spin Hall effect: extension of the Drude model.

    Science.gov (United States)

    Chudnovsky, Eugene M

    2007-11-16

    An extension of the Drude model is proposed that accounts for the spin and spin-orbit interaction of charge carriers. Spin currents appear due to the combined action of the external electric field, crystal field, and scattering of charge carriers. An expression for the spin Hall conductivity that is independent of the scattering mechanism is derived for metals and semiconductors. In cubic metals, the spin Hall conductivity σ_s and the charge conductivity σ_c are related through σ_s = [2πℏ/(3mc²)]σ_c², with m being the bare electron mass. The theoretically computed value is in agreement with experiment.
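
    The conductivity relation in this record is garbled by text extraction; the following LaTeX form is a hedged reconstruction (assuming the lost symbol is the reduced Planck constant ℏ, which is dimensionally consistent in Gaussian units), not a verbatim quote of the paper:

      % Relation between spin Hall and charge conductivities in cubic metals,
      % reconstructed from the garbled abstract. Hedged: the hbar in the
      % prefactor is inferred, since sigma_c^2 * hbar/(m c^2) carries the same
      % dimensions as sigma_c in Gaussian units.
      \sigma_s \;=\; \frac{2\pi\hbar}{3\,m\,c^{2}}\,\sigma_c^{2}
      % with m the bare electron mass, sigma_s the spin Hall conductivity and
      % sigma_c the ordinary charge conductivity.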

  1. 75 FR 53988 - Notice Regarding the Requirement To Use eXtensible Business Reporting Language Format To Make...

    Science.gov (United States)

    2010-09-02

    ... eXtensible Business Reporting Language Format To Make Publicly Available the Information Required... Internet Web site in eXtensible Business Reporting Language (``XBRL'') format.\\7\\ The rule provides that in... that NRSROs make the information required under Rule 17g-2(d)(3) available on its corporate website in...

  2. Some extensions in continuous models for immunological correlates of protection.

    Science.gov (United States)

    Dunning, Andrew J; Kensler, Jennifer; Coudeville, Laurent; Bailleux, Fabrice

    2015-12-28

    A scaled logit model has previously been proposed to quantify the relationship between an immunological assay and protection from disease, and has been applied in a number of settings. The probability of disease was modelled as a function of the probability of exposure, which was assumed to be fixed, and of protection, which was assumed to increase smoothly with the value of the assay. Some extensions are here investigated. Alternative functions to represent the protection curve are explored, applications to case-cohort designs are evaluated, and approaches to variance estimation compared. The steepness of the protection curve must sometimes be bounded to achieve convergence and methods for doing so are outlined. Criteria for evaluating the fit of models are proposed and approaches to assessing the utility of results suggested. Models are evaluated by application to sixteen datasets from vaccine clinical trials. Alternative protection curve functions improved model evaluation criteria for every dataset. Standard errors based on the observed information were found to be unreliable; bootstrap estimates of precision were to be preferred. In most instances, case-cohort designs resulted in little loss of precision. Some results achieved suggested measures for utility. The original scaled logit model can be improved upon. Evaluation criteria permit well-fitting models and useful results to be identified. The proposed methods provide a comprehensive set of tools for quantifying the relationship between immunological assays and protection from disease.
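
    For orientation, the scaled logit model referred to above is commonly written in the following form (a hedged sketch; the exact parameterization used in the original and extended models may differ):

      % Scaled logit model (sketch): probability of disease for assay value t.
      % lambda = probability of exposure (assumed fixed); the logistic factor is the
      % residual susceptibility, so protection pi(t) = 1 - 1/(1 + exp(alpha + beta t))
      % rises smoothly with the assay value when beta > 0.
      P(\text{disease}\mid t) \;=\; \frac{\lambda}{1 + \exp(\alpha + \beta t)},
      \qquad 0 < \lambda \le 1,\;\; \beta > 0 .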

  3. Early universe cosmology. In supersymmetric extensions of the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Baumann, Jochen Peter

    2012-03-19

    In this thesis we investigate possible connections between cosmological inflation and leptogenesis on the one side and particle physics on the other side. We work in supersymmetric extensions of the Standard Model. A key role is played by the right-handed sneutrino, the superpartner of the right-handed neutrino involved in the type I seesaw mechanism. We study a combined model of inflation and non-thermal leptogenesis that is a simple extension of the Minimal Supersymmetric Standard Model (MSSM) with conserved R-parity, where we add three right-handed neutrino superfields. The inflaton direction is given by the imaginary components of the corresponding scalar component fields, which are protected from the supergravity (SUGRA) η-problem by a shift symmetry in the Kaehler potential. We discuss the model first in a globally supersymmetric (SUSY) and then in a supergravity context and compute the inflationary predictions of the model. We also study reheating and non-thermal leptogenesis in this model. A numerical simulation shows that shortly after the waterfall phase transition that ends inflation, the universe is dominated by right-handed sneutrinos and their out-of-equilibrium decay can produce the desired matter-antimatter asymmetry. Using a simplified time-averaged description, we derive analytical expressions for the model predictions. Combining the results from inflation and leptogenesis allows us to constrain the allowed parameter space from two different directions, with implications for low energy neutrino physics. As a second thread of investigation, we discuss a generalisation of the inflationary model discussed above to include gauge non-singlet fields as inflatons. This is motivated by the fact that in left-right symmetric, supersymmetric Grand Unified Theories (SUSY GUTs), like SUSY Pati-Salam unification or SUSY SO(10) GUTs, the right-handed (s)neutrino is an indispensable ingredient and does not have to be put in by hand as in the MSSM. We discuss

  4. An age-structured extension to the vectorial capacity model.

    Directory of Open Access Journals (Sweden)

    Vasiliy N Novoseltsev

    Full Text Available Vectorial capacity and the basic reproductive number (R₀) have been instrumental in structuring thinking about vector-borne pathogen transmission and how best to prevent the diseases they cause. One of the more important simplifying assumptions of these models is age-independent vector mortality. A growing body of evidence indicates that insect vectors exhibit age-dependent mortality, which can have strong and varied effects on pathogen transmission dynamics and strategies for disease prevention. Based on survival analysis we derived new equations for vectorial capacity and R₀ that are valid for any pattern of age-dependent (or age-independent) vector mortality and explore the behavior of the models across various mortality patterns. The framework we present (1) lays the groundwork for an extension and refinement of the vectorial capacity paradigm by introducing an age-structured extension to the model, (2) encourages further research on the actuarial dynamics of vectors in particular and the relationship of vector mortality to pathogen transmission in general, and (3) provides a detailed quantitative basis for understanding the relative impact of reductions in vector longevity compared to other vector-borne disease prevention strategies. Accounting for age-dependent vector mortality in estimates of vectorial capacity and R₀ was most important when (1) vector densities are relatively low and the pattern of mortality can determine whether pathogen transmission will persist, i.e., determines whether R₀ is above or below 1, (2) vector population growth rate is relatively low and there are complex interactions between birth and death that differ fundamentally from birth-death relationships with age-independent mortality, and (3) the vector exhibits complex patterns of age-dependent mortality and R₀ ∼ 1. A limiting factor in the construction and evaluation of new age-dependent mortality models is the paucity of data characterizing vector mortality
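
    For context, the classical vectorial capacity under age-independent daily survival p, and one hedged way of generalizing its survival factor to an arbitrary survivorship function s(x) as the age-structured extension suggests (assuming infection at emergence and an extrinsic incubation period of n days), can be written as:

      % Classical vectorial capacity with age-independent daily survival p:
      %   m = vector density per host, a = daily human biting rate,
      %   n = extrinsic incubation period (days).
      C \;=\; \frac{m\,a^{2}\,p^{n}}{-\ln p}
      % The survival factor satisfies p^n/(-\ln p) = \int_n^\infty p^{x}\,dx,
      % i.e. the expected number of infectious days lived beyond the incubation
      % period. A hedged age-structured generalization (infection at emergence)
      % replaces it with the analogous integral of a general survivorship s(x):
      C \;=\; m\,a^{2}\int_{n}^{\infty} s(x)\,dx .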

  5. No Evidence for Extensions to the Standard Cosmological Model

    Science.gov (United States)

    Heavens, Alan; Fantaye, Yabebal; Sellentin, Elena; Eggers, Hans; Hosenie, Zafiirah; Kroon, Steve; Mootoovaloo, Arrykrishna

    2017-09-01

    We compute the Bayesian evidence for models considered in the main analysis of Planck cosmic microwave background data. By utilizing carefully defined nearest-neighbor distances in parameter space, we reuse the Monte Carlo Markov chains already produced for parameter inference to compute Bayes factors B for many different model-data set combinations. The standard 6-parameter flat cold dark matter model with a cosmological constant (ΛCDM) is favored over all other models considered, with curvature being mildly favored only when cosmic microwave background lensing is not included. Many alternative models are strongly disfavored by the data, including primordial correlated isocurvature models (ln B = -7.8), nonzero scalar-to-tensor ratio (ln B = -4.3), running of the spectral index (ln B = -4.7), curvature (ln B = -3.6), nonstandard numbers of neutrinos (ln B = -3.1), nonstandard neutrino masses (ln B = -3.2), nonstandard lensing potential (ln B = -4.6), evolving dark energy (ln B = -3.2), sterile neutrinos (ln B = -6.9), and extra sterile neutrinos with a nonzero scalar-to-tensor ratio (ln B = -10.8). Other models are less strongly disfavored with respect to flat ΛCDM. As with all analyses based on Bayesian evidence, the final numbers depend on the widths of the parameter priors. We adopt the priors used in the Planck analysis, while performing a prior sensitivity analysis. Our quantitative conclusion is that extensions beyond the standard cosmological model are disfavored by Planck data. Only when newer Hubble constant measurements are included does ΛCDM become disfavored, and only mildly, compared with a dynamical dark energy model (ln B ≈ +2).
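
    For reference, the Bayes factor quoted throughout this abstract compares the Bayesian evidences (marginal likelihoods) of an extended model M1 against the baseline model M0; in standard notation:

      % Bayes factor between an extension M1 and the baseline model M0, given data d:
      B \;=\; \frac{Z_{1}}{Z_{0}}
        \;=\; \frac{\int \mathcal{L}(d\mid\theta_{1},M_{1})\,\pi(\theta_{1})\,d\theta_{1}}
                   {\int \mathcal{L}(d\mid\theta_{0},M_{0})\,\pi(\theta_{0})\,d\theta_{0}}
      % ln B < 0 means the data favor the baseline. On commonly used Jeffreys-type
      % scales, |ln B| of roughly 2.5-5 is usually termed "strong" and above 5
      % "decisive" evidence, although the exact wording varies between authors.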

  6. Electroweak baryogenesis in extensions of the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Fromme, L.

    2006-07-07

    We investigate the generation of the baryon asymmetry in two extensions of the Standard Model; these are the φ⁶ and the two-Higgs-doublet model. Analyzing the thermal potential in the presence of CP violation, we find a strong first order phase transition for a wide range of parameters in both models. We compute the relevant bubble wall properties which then enter the transport equations. In non-supersymmetric models electroweak baryogenesis is dominated by top transport, which we treat in the WKB approximation. We calculate the CP-violating source terms starting from the Dirac equation. We show how to resolve discrepancies between this treatment and the computation in the Schwinger-Keldysh formalism. Furthermore, we keep inelastic scatterings of quarks and W bosons at a finite rate, which considerably affects the amount of the generated baryon asymmetry depending on the bubble wall velocity. In addition, we improve the transport equations by novel source terms which are generated by CP-conserving perturbations in the plasma. It turns out that their effect is relatively small. Both models under consideration predict a baryon to entropy ratio close to the observed value for a large part of the parameter space without being in conflict with constraints on electric dipole moments. (orig.)

  7. Extensions to the energy system GMM model: An overview

    Energy Technology Data Exchange (ETDEWEB)

    Barreto, L.; Kypreos, S

    2006-09-15

    This report describes recent extensions to the energy-systems GMM (Global Multiregional MARKAL) model undertaken by the Energy Economics Group (EEG) of the Paul Scherrer Institute (PSI) in Switzerland (hereon referred to as PSI-EEG) in the context of the SAPIENTIA project sponsored by the European Commission (DG Research) and the Swiss National Centre for Competence in Research on Climate (NCCR-Climate). GMM is a multi-regional 'bottom-up' energy-systems optimization model that endogenizes technology learning. The model has been developed and is used at PSI-EEG. The main extensions undertaken here concern the incorporation of a clusters approach to technology learning, the introduction of an improved representation of the transportation sector with emphasis on the passenger sub-sector and the implementation of marginal abatement curves for CH₄ and N₂O, two main non-CO₂ greenhouse gases. Also, a linear representation of the atmospheric concentration of CO₂, CH₄ and N₂O has been included. Other changes are related to the inclusion of additional technologies for production of synthetic fuels (hydrogen and Fischer-Tropsch liquids) and the inclusion of CO₂ capture in fossil-based and biomass-based hydrogen production. Several of the developments described here follow the work of Turton and Barreto (2004, 2006) for the ERIS model at the Environmentally Compatible Energy Strategies (ECS) Program of IIASA. The remainder of this report is organized as follows. Section 2 describes the basic structure of the GMM model, the main assumptions for the scenario developed and the basic approach to endogenize technology learning in the model and examine the effects of R+D and D+D programs. Section 3 discusses the implementation of technology clusters and describes the key components chosen here. Section 4 presents the improvements to the transportation sector with emphasis on the passenger car subsector. Section 5 briefly
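
    The endogenous technology learning mentioned here is conventionally represented by a one-factor learning (experience) curve; the following is a generic sketch of that relationship, not the specific GMM formulation, which is not reproduced in this record:

      % One-factor learning curve: specific investment cost SC falls as cumulative
      % installed capacity CC grows (SC_0, CC_0 are values at the starting point).
      SC(CC) \;=\; SC_{0}\left(\frac{CC}{CC_{0}}\right)^{-b},
      \qquad \text{learning rate } LR \;=\; 1 - 2^{-b}
      % i.e. each doubling of cumulative capacity reduces the specific cost by LR.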

  8. Extensions to the energy system GMM model: An overview

    International Nuclear Information System (INIS)

    Barreto, L.; Kypreos, S.

    2006-09-01

    This report describes recent extensions to the energy-systems GMM (Global Multiregional MARKAL) model undertaken by the Energy Economics Group (EEG) of the Paul Scherrer Institute (PSI) in Switzerland (hereon referred to as PSI-EEG) in the context of the SAPIENTIA project sponsored by the European Commission (DG Research) and the Swiss National Centre for Competence in Research on Climate (NCCR-Climate). GMM is a multi-regional 'bottom-up' energy-systems optimization model that endogenizes technology learning. The model has been developed and is used at PSI-EEG. The main extensions undertaken here concern the incorporation of a clusters approach to technology learning, the introduction of an improved representation of the transportation sector with emphasis on the passenger sub-sector and the implementation of marginal abatement curves for CH₄ and N₂O, two main non-CO₂ greenhouse gases. Also, a linear representation of the atmospheric concentration of CO₂, CH₄ and N₂O has been included. Other changes are related to the inclusion of additional technologies for production of synthetic fuels (hydrogen and Fischer-Tropsch liquids) and the inclusion of CO₂ capture in fossil-based and biomass-based hydrogen production. Several of the developments described here follow the work of Turton and Barreto (2004, 2006) for the ERIS model at the Environmentally Compatible Energy Strategies (ECS) Program of IIASA. The remainder of this report is organized as follows. Section 2 describes the basic structure of the GMM model, the main assumptions for the scenario developed and the basic approach to endogenize technology learning in the model and examine the effects of R+D and D+D programs. Section 3 discusses the implementation of technology clusters and describes the key components chosen here. Section 4 presents the improvements to the transportation sector with emphasis on the passenger car subsector. Section 5 briefly describes the new technologies for synthetic fuel

  9. Consumer Attitude Towards Brand Extensions : An Integrative model and research propositions

    OpenAIRE

    Czellar, Sandor

    2002-01-01

    The paper proposes an integrative model of the antecedents and consequences of brand extension attitude based on the dominant cognitive paradigm. The four key processes of the model are: (1) the perception of fit, (2) the formation of primary attitudes towards the extension, (3) the link between extension attitude and marketplace behaviour and (4) the reciprocal effect of brand extension attitude on parent brand/extension category attitude. Moderator and control variables of these processes ...

  10. A business model framework for product life extension

    NARCIS (Netherlands)

    Den Hollander, M.C.; Bakker, C.A.

    2012-01-01

    Product life extension is an increase in the utilization period of products. Design research on product life extension strategies has so far mainly focused on technical aspects of products, like ‘prevention engineering’ or ‘design for repair, maintenance and upgradability’, and on individual

  11. Validation and extension of the reward-mountain model.

    Science.gov (United States)

    Breton, Yannick-André; Mullett, Ada; Conover, Kent; Shizgal, Peter

    2013-01-01

    The reward-mountain model relates the vigor of reward seeking to the strength and cost of reward. Application of this model provides information about the stage of processing at which manipulations such as drug administration, lesions, deprivation states, and optogenetic interventions act to alter reward seeking. The model has been updated by incorporation of new information about frequency following in the directly stimulated neurons responsible for brain stimulation reward and about the function that maps objective opportunity costs into subjective ones. The behavioral methods for applying the model have been updated and improved as well. To assess the impact of these changes, two related predictions of the model that were supported by earlier work have been retested: (1) altering the duration of rewarding brain stimulation should change the pulse frequency required to produce a reward of half-maximal intensity, and (2) this manipulation should not change the opportunity cost at which half-maximal performance is directed at earning a maximally intense reward. Prediction 1 was supported in all six subjects, but prediction 2 was supported in only three. The latter finding is interpreted to reflect recruitment, at some stimulation sites, of a heterogeneous reward substrate comprising dual, parallel circuits that integrate the stimulation-induced neural signals.

  12. Extension of local front reconstruction method with controlled coalescence model

    Science.gov (United States)

    Rajkotwala, A. H.; Mirsandi, H.; Peters, E. A. J. F.; Baltussen, M. W.; van der Geld, C. W. M.; Kuerten, J. G. M.; Kuipers, J. A. M.

    2018-02-01

    The physics of droplet collisions involves a wide range of length scales. This poses a challenge to accurately simulate such flows with standard fixed grid methods due to their inability to resolve all relevant scales with an affordable number of computational grid cells. A solution is to couple a fixed grid method with subgrid models that account for microscale effects. In this paper, we improved and extended the Local Front Reconstruction Method (LFRM) with a film drainage model of Zang and Law [Phys. Fluids 23, 042102 (2011)]. The new framework is first validated by (near) head-on collision of two equal tetradecane droplets using experimental film drainage times. When the experimental film drainage times are used, the LFRM is better at predicting the droplet collisions, especially at high velocity, in comparison with other fixed grid methods (i.e., the front tracking method and the coupled level set and volume of fluid method). When the film drainage model is invoked, the method shows a good qualitative match with experiments, but a quantitative correspondence of the predicted film drainage time with the experimental drainage time is not obtained, indicating that further development of the film drainage model is required. However, it can be safely concluded that the LFRM coupled with film drainage models is much better at predicting the collision dynamics than the traditional methods.

  13. Development of procedural requirements for life extension of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Sang Hoon; Son, Moon Kyu [Korea Association for Nuclear Technology, Taejon (Korea, Republic of); Jeong, Ji Hwan [Baekseok College, Cheonan (Korea, Republic of); Chang, Keun Sun [Sunmoon Univ., Asan (Korea, Republic of); Ham, Chul Hoon [The Catholic University of Korea, Seoul (Korea, Republic of); Chang, Soon Hong [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    2002-03-15

    Technical issues relevant to life extension of NPP were investigated. The GALL report, domestic PSR and periodic inspection rules were reviewed. Technical issues appearing in the safety evaluation reports related to license renewal of Calvert Cliffs 1 and 2 and Oconee 1, 2 and 3 NPPs were reviewed. Preliminary study on PSA usage in NPP life extension assessment was performed and further works were suggested. The environment of rules and regulations was analyzed from the viewpoint of plant life extension. Two alternatives are suggested to revise the current domestic nuclear acts.

  14. Development of procedural requirements for life extension of nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Sang Hoon; Son, Moon Kyu; Jeong, Ji Hwan; Chang, Keun Sun; Ham, Chul Hoon; Chang, Soon Hong

    2002-03-01

    Technical issues relevant to life extension of NPP were investigated. The GALL report, domestic PSR and periodic inspection rules were reviewed. Technical issues appearing in the safety evaluation reports related to license renewal of Calvert Cliffs 1 and 2 and Oconee 1, 2 and 3 NPPs were reviewed. Preliminary study on PSA usage in NPP life extension assessment was performed and further works were suggested. The environment of rules and regulations was analyzed from the viewpoint of plant life extension. Two alternatives are suggested to revise the current domestic nuclear acts

  15. 76 FR 52693 - Proposed Extension of the Approval of Information Collection Requirements

    Science.gov (United States)

    2011-08-23

    ... Examinee, Work Experience and Career Exploration (WECEP) Regulations, 29 CFR 570.35a. A copy of the... Review: Extension. Agency: Wage and Hour Division. Titles: Work Experience and Career Exploration...

  16. An Innovative Model to Estimate Fracture Extensions. | Adeniji ...

    African Journals Online (AJOL)

    Hydraulic fracturing is a well intervention program designed to create fracture(s) within a reservoir system and, hopefully, extend the volumes of these fractures to facilitate improved recovery of in-situ fluid(s). This paper presents mathematical equations in dimensionless forms to rapidly estimate the fracture extension and ...

  17. Bringing the DuPont Profitability Model to Extension

    Science.gov (United States)

    Roucan-Kane, Maud; Wolfskill, L. A.; Boehlje, Michael D.; Gray, Allan W.

    2013-01-01

    This article discusses a financial training program used by Deere and Company for almost 10 years. The objective is to describe the program and to discuss a pre-test/post-test methodology to test the effectiveness of a program for possible duplication by Extension. Results show that participants significantly improved from the pre-test to the…

  18. Model 9975 Life Extension Package 1 - Final Report

    International Nuclear Information System (INIS)

    Daugherty, W.

    2011-01-01

    Life extension package LE1 (9975-03382) was instrumented and subjected to a temperature/humidity environment that bounds KAMS package storage conditions for 92 weeks. During this time, the maximum fiberboard temperature was ∼180 F, and was established by a combination of internal heat (12 watts) and external heat (∼142 F). The relative humidity external to the package was maintained at 80 %RH. This package was removed from test in November 2010 after several degraded conditions were observed during a periodic examination. These conditions included degraded fiberboard (easily broken, bottom layer stuck to the drum), corrosion of the drum, and separation of the air shield from the upper fiberboard assembly. Several tests and parameters were used to characterize the package components. Results from these tests generally indicate agreement between this full-scale shipping package and small-scale laboratory tests on fiberboard and O-ring samples. These areas of agreement include the rate of fiberboard weight loss, change in fiberboard thermal conductivity, fiberboard compression strength, and O-ring compression set. In addition, this package provides an example of the extent to which moisture within the fiberboard can redistribute in the presence of a temperature gradient such as might be created by a 12 watt internal heat load. Much of the moisture near the fiberboard ID surface migrated towards the OD surface, but there was not a significant axial moisture gradient during most of the test duration. Only during the last inspection period (i.e. after 92 weeks exposure during the second phase) did enough moisture migrate to the bottom fiberboard layers to cause saturation. A side effect of moisture migration is the leaching of soluble compounds from the fiberboard. In particular, the corrosion observed on the drum appears related primarily to the leaching and concentration of chlorides. In most locations, this attack appears to be general corrosion, with shallow

  19. A repository based on a dynamically extensible data model supporting multidisciplinary research in neuroscience.

    Science.gov (United States)

    Corradi, Luca; Porro, Ivan; Schenone, Andrea; Momeni, Parastoo; Ferrari, Raffaele; Nobili, Flavio; Ferrara, Michela; Arnulfo, Gabriele; Fato, Marco M

    2012-10-08

    Robust, extensible and distributed databases integrating clinical, imaging and molecular data represent a substantial challenge for modern neuroscience. It is even more difficult to provide extensible software environments able to effectively target the rapidly changing data requirements and structures of research experiments. There is an increasing request from the neuroscience community for software tools addressing technical challenges about: (i) supporting researchers in the medical field to carry out data analysis using integrated bioinformatics services and tools; (ii) handling multimodal/multiscale data and metadata, enabling the injection of several different data types according to structured schemas; (iii) providing high extensibility, in order to address different requirements deriving from a large variety of applications simply through a user runtime configuration. A dynamically extensible data structure supporting collaborative multidisciplinary research projects in neuroscience has been defined and implemented. We have considered extensibility issues from two different points of view. First, the improvement of data flexibility has been taken into account. This has been done through the development of a methodology for the dynamic creation and use of data types and related metadata, based on the definition of a "meta" data model. This way, users are not constrained to a set of predefined data and the model can be easily extensible and applicable to different contexts. Second, users have been enabled to easily customize and extend the experimental procedures in order to track each step of acquisition or analysis. This has been achieved through a process-event data structure, a multipurpose taxonomic schema composed of two generic main objects: events and processes. Then, a repository has been built based on such data model and structure, and deployed on distributed resources thanks to a Grid-based approach. Finally, data integration aspects have been
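
    A minimal sketch of the process-event idea described above, generic Process and Event objects whose data and metadata types are defined at runtime rather than hard-coded, might look like this in Python (illustrative names and fields only; this is not the repository's actual schema):

      # Illustrative sketch of a dynamically extensible process-event data model.
      # Names and fields are hypothetical, not the schema of the cited repository.
      from dataclasses import dataclass, field
      from datetime import datetime
      from typing import Any, Dict, List

      @dataclass
      class Event:
          """A single acquisition or analysis step, typed at runtime."""
          event_type: str                                          # e.g. "EEG_acquisition", user-defined
          timestamp: datetime
          data: Dict[str, Any] = field(default_factory=dict)       # payload follows a user schema
          metadata: Dict[str, Any] = field(default_factory=dict)   # descriptive attributes

      @dataclass
      class Process:
          """An ordered chain of events describing one experimental procedure."""
          name: str
          subject_id: str
          events: List[Event] = field(default_factory=list)

          def add_event(self, event: Event) -> None:
              self.events.append(event)

      # Usage: new data types need no code changes, only a new event_type and payload.
      session = Process(name="resting_state_protocol", subject_id="SUBJ-001")
      session.add_event(Event("EEG_acquisition", datetime.now(),
                              data={"sampling_rate_hz": 512, "channels": 64},
                              metadata={"operator": "lab_tech_1"}))
      session.add_event(Event("spectral_analysis", datetime.now(),
                              data={"band": "alpha", "mean_power": 12.3},
                              metadata={"software": "custom_pipeline"}))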

  20. A repository based on a dynamically extensible data model supporting multidisciplinary research in neuroscience

    Directory of Open Access Journals (Sweden)

    Corradi Luca

    2012-10-01

    Full Text Available Abstract Background Robust, extensible and distributed databases integrating clinical, imaging and molecular data represent a substantial challenge for modern neuroscience. It is even more difficult to provide extensible software environments able to effectively target the rapidly changing data requirements and structures of research experiments. There is an increasing request from the neuroscience community for software tools addressing technical challenges about: (i) supporting researchers in the medical field to carry out data analysis using integrated bioinformatics services and tools; (ii) handling multimodal/multiscale data and metadata, enabling the injection of several different data types according to structured schemas; (iii) providing high extensibility, in order to address different requirements deriving from a large variety of applications simply through a user runtime configuration. Methods A dynamically extensible data structure supporting collaborative multidisciplinary research projects in neuroscience has been defined and implemented. We have considered extensibility issues from two different points of view. First, the improvement of data flexibility has been taken into account. This has been done through the development of a methodology for the dynamic creation and use of data types and related metadata, based on the definition of a “meta” data model. This way, users are not constrained to a set of predefined data and the model can be easily extensible and applicable to different contexts. Second, users have been enabled to easily customize and extend the experimental procedures in order to track each step of acquisition or analysis. This has been achieved through a process-event data structure, a multipurpose taxonomic schema composed of two generic main objects: events and processes. Then, a repository has been built based on such data model and structure, and deployed on distributed resources thanks to a Grid-based approach

  1. The Extension Model of Sustainable Management of Industrial Enterprises

    Directory of Open Access Journals (Sweden)

    Aleksandr N. Kuzminov

    2016-09-01

    Full Text Available The paper discusses the problem of assessing the sustainability of an industrial enterprise in the context of a balanced use of limited resources. As a conceptual approach, a «3S» model is adopted that considers the enterprise as a space allowing different approaches to the sustainable development of enterprises to be combined at all stages of the life cycle. It is shown that, in assessing the stability of each element of this space, it is methodologically advisable to rely on system constraints describing the boundary condition of temporary equilibrium; achieving this equilibrium over a local period synergistically produces a certain balance in the allocation of the enterprise's limited resources. Financial stability, reflecting the nominally effective use of resources at all stages of the life cycle, can serve as one of the proxy indicators for a rapid assessment of the interaction of all subsystems, with the proposed system of coenosis restrictions as the optimality criterion. This statement of the problem made it possible to formulate the basic requirements for the criteria-based apparatus for diagnosing possible states and to justify the use of coenosis sustainability as a synthetic approach that allows self-organizing systems to be described mathematically in their dynamics within the limits of survival. An algorithm for the mathematical and statistical evaluation of the states of financial resources, reflecting the degree of stability of the company as a consumer of scarce resources over time, is offered.

  2. PLEXFIN a computer model for the economic assessment of nuclear power plant life extension. User's manual

    International Nuclear Information System (INIS)

    2007-01-01

    The IAEA developed PLEXFIN, a computer model analysis tool aimed at assisting decision makers in the assessment of the economic viability of a nuclear power plant life/licence extension. This user's manual was produced to facilitate the application of the PLEXFIN computer model. It is widely accepted in the industry that the operational life of a nuclear power plant is not limited to a pre-determined number of years, sometimes established on non-technical grounds, but by the capability of the plant to comply with the nuclear safety and technical requirements in a cost effective manner. The decision to extend the licence/life of a nuclear power plant involves a number of political, technical and economic issues. The economic viability is a cornerstone of the decision-making process. In a liberalized electricity market, the economic justification of a nuclear power plant life/licence extension decision requires a more complex evaluation. This user's manual was elaborated in the framework of the IAEA's programmes on Continuous process improvement of NPP operating performance, and on Models for analysis and capacity building for sustainable energy development, with the support of four consultants meetings
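
    The economic-viability question addressed by such a tool ultimately reduces to a discounted cash-flow comparison; the following is a generic net-present-value sketch of a life-extension decision in Python (all figures are hypothetical assumptions and this is not the PLEXFIN methodology):

      # Generic NPV sketch for a plant life-extension decision (illustrative only;
      # every figure below is a hypothetical assumption, not PLEXFIN data).
      def npv(cash_flows, rate):
          """Discount a list of yearly cash flows (years 1..N) to present value."""
          return sum(cf / (1.0 + rate) ** year for year, cf in enumerate(cash_flows, start=1))

      refurbishment_cost = 600e6        # one-off investment at decision time (USD), assumed
      extension_years = 20
      annual_generation_mwh = 8.0e6     # roughly a 1 GW plant at high capacity factor, assumed
      electricity_price = 55.0          # USD/MWh, assumed
      annual_om_and_fuel = 300e6        # USD/year, assumed
      discount_rate = 0.07

      annual_margin = annual_generation_mwh * electricity_price - annual_om_and_fuel
      npv_extension = -refurbishment_cost + npv([annual_margin] * extension_years, discount_rate)

      print(f"NPV of life extension: {npv_extension / 1e6:.0f} MUSD")
      # A positive NPV relative to the best alternative (e.g. decommissioning plus
      # replacement capacity) would indicate the extension is economically viable.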

  3. Meta-requirements that Model Change

    OpenAIRE

    Gouri Prakash

    2010-01-01

    One of the common problems encountered in software engineering is addressing and responding to the changing nature of requirements. While several approaches have been devised to address this issue, ranging from instilling resistance to changing requirements in order to mitigate impact to project schedules, to developing an agile mindset towards requirements, the approach discussed in this paper is one of conceptualizing the delta in requirement and modeling it, in order t...

  4. An evaluation of information sources and requirements for nuclear plant-aging research with life-extension implications

    International Nuclear Information System (INIS)

    Jacobs, P.T.

    1986-01-01

    Information requirements for plant-aging and life-extension research are discussed. Various information sources that have been used in plant-aging studies and reliability assessments are described. Data-base searches and analyses were performed for a specific system using several data bases and plant sources. Comments are provided on the results using the various information sources

  5. 75 FR 36444 - Proposed Extension of the Approval of Information Collection Requirements

    Science.gov (United States)

    2010-06-25

    ... be provided in the desired format, reporting burden (time and financial resources) is minimized... DEPARTMENT OF LABOR Wage and Hour Division Proposed Extension of the Approval of Information... comment on proposed and/or continuing collections of information in accordance with the Paperwork...

  6. 76 FR 48181 - Proposed Extension of the Approval of Information Collection Requirements

    Science.gov (United States)

    2011-08-08

    ... be provided in a desired format, reporting burden (time and financial resources) is minimized... DEPARTMENT OF LABOR Wage and Hour Division Proposed Extension of the Approval of Information... comment on proposed and/or continuing collections of information in accordance with the Paperwork...

  7. 76 FR 15348 - Proposed Extension of the Approval of Information Collection Requirements

    Science.gov (United States)

    2011-03-21

    ... be provided in the desired format, reporting burden (time and financial resources) is minimized... DEPARTMENT OF LABOR Wage and Hour Division Proposed Extension of the Approval of Information... comment on proposed and/or continuing collections of information in accordance with the Paperwork...

  8. 76 FR 60086 - Proposed Extension of the Approval of Information Collection Requirements

    Science.gov (United States)

    2011-09-28

    ... be provided in a desired format, reporting burden (time and financial resources) is minimized... DEPARTMENT OF LABOR Wage and Hour Division Proposed Extension of the Approval of Information... comment on proposed and/or continuing collections of information in accordance with the Paperwork...

  9. 8 CFR 214.1 - Requirements for admission, extension, and maintenance of status.

    Science.gov (United States)

    2010-01-01

    ... maintenance of status. 214.1 Section 214.1 Aliens and Nationality DEPARTMENT OF HOMELAND SECURITY IMMIGRATION... Security. The passport of an alien applying for extension of stay must be valid at the time of application... biometric identifiers, may constitute a failure of the alien to maintain the terms of his or her...

  10. Extensions and Applications of the Cox-Aalen Survival Model

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    2003-01-01

    Aalen additive risk model; competing risk; counting processes; Cox model; cumulative incidence function; goodness of fit; prediction of survival probability; time-varying effects

  11. An extension of clarke's model with stochastic amplitude flip processes

    KAUST Repository

    Hoel, Hakon

    2014-07-01

    Stochastic modeling is an essential tool for studying statistical properties of wireless channels. In multipath fading channel (MFC) models, the signal reception is modeled by a sum of wave path contributions, and Clarke's model is an important example of such which has been widely accepted in many wireless applications. However, since Clarke's model is temporally deterministic, Feng and Field noted that it does not model real wireless channels with time-varying randomness well. Here, we extend Clarke's model to a novel time-varying stochastic MFC model with scatterers randomly flipping on and off. Statistical properties of the MFC model are analyzed and shown to fit well with real signal measurements, and a limit Gaussian process is derived from the model when the number of active wave paths tends to infinity. A second focus of this work is a comparison study of the error and computational cost of generating signal realizations from the MFC model and from its limit Gaussian process. By rigorous analysis and numerical studies, we show that in many settings, signal realizations are generated more efficiently by Gaussian process algorithms than by the MFC model's algorithm. Numerical examples that strengthen these observations are also presented. © 2014 IEEE.
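
    As background, the baseline Clarke sum-of-sinusoids construction of a multipath fading envelope can be sketched in Python as follows (a minimal illustration of the classical model only, not the stochastic amplitude-flip extension proposed in the paper):

      # Minimal sum-of-sinusoids sketch of Clarke's multipath fading channel model.
      # This shows only the classical, temporally deterministic baseline that the
      # cited paper extends with random on/off flipping of scatterers.
      import numpy as np

      def clarke_envelope(t, n_paths=64, f_doppler=50.0, rng=None):
          """Complex baseband envelope as a sum of n_paths plane-wave contributions."""
          rng = np.random.default_rng(rng)
          alpha = rng.uniform(0.0, 2.0 * np.pi, n_paths)   # angles of arrival
          phi = rng.uniform(0.0, 2.0 * np.pi, n_paths)     # random path phases
          # Each path contributes a Doppler-shifted sinusoid with equal amplitude.
          phases = 2.0 * np.pi * f_doppler * np.cos(alpha)[None, :] * t[:, None] + phi[None, :]
          return np.exp(1j * phases).sum(axis=1) / np.sqrt(n_paths)

      t = np.linspace(0.0, 1.0, 10_000)
      g = clarke_envelope(t, rng=42)
      print("mean power ~", np.mean(np.abs(g) ** 2))   # close to 1; |g| is Rayleigh-like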

  12. Lifespan extension by cranberry supplementation partially requires SOD2 and is life stage independent.

    Science.gov (United States)

    Sun, Yaning; Yolitz, Jason; Alberico, Thomas; Sun, Xiaoping; Zou, Sige

    2014-02-01

    Many nutraceuticals and pharmaceuticals have been shown to promote healthspan and lifespan. However, the mechanisms underlying the beneficial effects of prolongevity interventions and the time points at which interventions should be implemented to achieve beneficial effects are not well characterized. We have previously shown that a cranberry-containing nutraceutical can promote lifespan in worms and flies and delay age-related functional decline of pancreatic cells in rats. Here we investigated the mechanism underlying lifespan extension induced by cranberry and the effects of short-term or life stage-specific interventions with cranberry on lifespan in Drosophila. We found that lifespan extension induced by cranberry was associated with reduced phosphorylation of ERK, a component of oxidative stress response MAPK signaling, and slightly increased phosphorylation of AKT, a component of insulin-like signaling. Lifespan extension was also associated with a reduced level of 4-hydroxynonenal protein adducts, a biomarker of lipid oxidation. Moreover, lifespan extension induced by cranberry was partially suppressed by knockdown of SOD2, a major mitochondrial superoxide scavenger. Furthermore, cranberry supplementation was administered in three life stages of adult flies, health span (3-30 days), transition span (31-60 days) and senescence span (61 days to the end when all flies died). Cranberry supplementation during any of these life stages extended the remaining lifespan relative to the non-supplemented and life stage-matched controls. These findings suggest that cranberry supplementation is sufficient to promote longevity when implemented during any life stage, likely through reducing oxidative damage. Published by Elsevier Inc.

  13. Caloric Restriction-Induced Extension of Chronological Lifespan Requires Intact Respiration in Budding Yeast

    OpenAIRE

    Kwon, Young-Yon; Lee, Sung-Keun; Lee, Cheol-Koo

    2017-01-01

    Caloric restriction (CR) has been shown to extend lifespan and prevent cellular senescence in various species ranging from yeast to humans. Many effects of CR may contribute to extend lifespan. Specifically, CR prevents oxidative damage from reactive oxygen species (ROS) by enhancing mitochondrial function. In this study, we characterized 33 single electron transport chain (ETC) gene-deletion strains to identify CR-induced chronological lifespan (CLS) extension mechanisms. Interestingly, defe...

  14. Caloric Restriction-Induced Extension of Chronological Lifespan Requires Intact Respiration in Budding Yeast.

    Science.gov (United States)

    Kwon, Young-Yon; Lee, Sung-Keun; Lee, Cheol-Koo

    2017-04-01

    Caloric restriction (CR) has been shown to extend lifespan and prevent cellular senescence in various species ranging from yeast to humans. Many effects of CR may contribute to extend lifespan. Specifically, CR prevents oxidative damage from reactive oxygen species (ROS) by enhancing mitochondrial function. In this study, we characterized 33 single electron transport chain (ETC) gene-deletion strains to identify CR-induced chronological lifespan (CLS) extension mechanisms. Interestingly, defects in 17 of these 33 ETC gene-deleted strains showed loss of both respiratory function and CR-induced CLS extension. On the contrary, the other 16 respiration-capable mutants showed increased CLS upon CR along with increased mitochondrial membrane potential (MMP) and intracellular adenosine triphosphate (ATP) levels, with decreased mitochondrial superoxide generation. We measured the same parameters in the 17 non-respiratory mutants upon CR. CR simultaneously increased MMP and mitochondrial superoxide generation without altering intracellular ATP levels. In conclusion, respiration is essential for CLS extension by CR and is important for balancing MMP, ROS, and ATP levels.

  15. Requirements for clinical information modelling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Kalra, Dipak

    2015-07-01

    This study proposes consensus requirements for clinical information modelling tools that can support modelling tasks in medium/large scale institutions. Rather than identify which functionalities are currently available in existing tools, the study has focused on functionalities that should be covered in order to provide guidance about how to evolve the existing tools. After identifying a set of 56 requirements for clinical information modelling tools based on a literature review and interviews with experts, a classical Delphi study methodology was applied to conduct a two-round survey in order to classify them as essential or recommended. Essential requirements are those that must be met by any tool that claims to be suitable for clinical information modelling; if a certified tools list is one day established, any tool that does not meet the essential criteria would be excluded. Recommended requirements are those more advanced requirements that may be met by tools offering a superior product or only needed in certain modelling situations. According to the answers provided by 57 experts from 14 different countries, we found a high level of agreement, enabling the study to identify 20 essential and 21 recommended requirements for these tools. It is expected that this list of identified requirements will guide developers on the inclusion of new basic and advanced functionalities that have strong support by end users. This list could also guide regulators in order to identify requirements that could be demanded of tools adopted within their institutions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  16. Economic Impact of e-Velanmai Model of Extension Service

    Directory of Open Access Journals (Sweden)

    Duraisamy Prabha

    2016-07-01

    Full Text Available A study was carried out to assess the impact of the e-Velanmai project, an ICT (Information and Communication Technology) enabled extension service implemented by Tamil Nadu Agricultural University, in three districts, viz., Coimbatore, Tirupur and Villupuram of Tamil Nadu state, with 180 farmer respondents (90 e-Velanmai beneficiaries and 90 non-beneficiaries). Partial budgeting analysis revealed that, with respect to yield, the yield value of the beneficiary respondents was higher than that of the non-beneficiaries. As a result, the net gain for the beneficiaries was Rs. 28,481 per acre. With respect to the constraints faced by beneficiaries, an overwhelming percentage (94.40 %) of the beneficiary respondents expressed that they faced no constraints, while a small percentage indicated that there was no direct contact with TNAU scientists (5.50 %) and no follow-up visit by Field Coordinators after giving advice (1.10 %).

  17. No evidence for extensions to the standard cosmological model

    CSIR Research Space (South Africa)

    Heavens, A

    2017-09-01

    Full Text Available is not included. Many alternative models are strongly disfavoured by the data, including primordial correlated isocurvature models (ln B = -7.8), non-zero scalar-to-tensor ratio (ln B = -4.3), running of the spectral index (ln B = -4.7), curvature (ln B = -3.6), non...

  18. Review of Model Predictions for Extensive Air Showers

    Science.gov (United States)

    Pierog, Tanguy

    In detailed air shower simulations, the uncertainty in the prediction of shower observables for different primary particles and energies is currently dominated by differences between hadronic interaction models. With the results of the first run of the LHC, the difference between post-LHC model predictions has been reduced to the same level as the experimental uncertainties of cosmic ray experiments. At the same time, new types of air shower observables, like the muon production depth, have been measured, adding new constraints on hadronic models. Currently no model is able to consistently reproduce all mass composition measurements possible with, for instance, the Pierre Auger Observatory. We review the current model predictions for various particle production observables and their link with air shower observables and discuss possible future improvements.

  19. 76 FR 28242 - Proposed Extension of the Approval of Information Collection Requirements

    Science.gov (United States)

    2011-05-16

    ... desired format, reporting burden (time and financial resources) is minimized, collection instruments are... Management and Budget (OMB) approval of the Information Collection: Regulations 29 CFR part 547, Requirements of a ``Bona Fide Thrift or Savings Plan'' and Regulations 29 CFR part 549, Requirements of a ``Bona...

  20. An extensive comparison of species-abundance distribution models.

    Science.gov (United States)

    Baldridge, Elita; Harris, David J; Xiao, Xiao; White, Ethan P

    2016-01-01

    A number of different models have been proposed as descriptions of the species-abundance distribution (SAD). Most evaluations of these models use only one or two models, focus on only a single ecosystem or taxonomic group, or fail to use appropriate statistical methods. We use likelihood and AIC to compare the fit of four of the most widely used models to data on over 16,000 communities from a diverse array of taxonomic groups and ecosystems. Across all datasets combined the log-series, Poisson lognormal, and negative binomial all yield similar overall fits to the data. Therefore, when correcting for differences in the number of parameters the log-series generally provides the best fit to data. Within individual datasets some other distributions performed nearly as well as the log-series even after correcting for the number of parameters. The Zipf distribution is generally a poor characterization of the SAD.
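
    The likelihood/AIC comparison described above can be illustrated with a small Python sketch that fits two one-parameter SAD candidates (the log-series and the Zipf distribution) to a vector of species abundances by maximum likelihood and compares their AIC values (illustrative code with made-up abundances, not the authors' analysis pipeline):

      # Sketch: fit two candidate species-abundance distributions (log-series, Zipf)
      # by maximum likelihood and compare them with AIC. Illustrative only; the
      # abundance vector is invented and this is not the cited study's code.
      import numpy as np
      from scipy import stats, optimize

      abundances = np.array([84, 40, 21, 12, 9, 7, 5, 4, 3, 3, 2, 2, 1, 1, 1, 1, 1, 1])

      def aic(neg_loglik, n_params):
          return 2.0 * n_params + 2.0 * neg_loglik

      # Log-series: single shape parameter p in (0, 1).
      nll_logser = lambda p: -np.sum(stats.logser.logpmf(abundances, p))
      fit_ls = optimize.minimize_scalar(nll_logser, bounds=(1e-6, 1 - 1e-6), method="bounded")

      # Zipf: single exponent a > 1.
      nll_zipf = lambda a: -np.sum(stats.zipf.logpmf(abundances, a))
      fit_zf = optimize.minimize_scalar(nll_zipf, bounds=(1.0001, 10.0), method="bounded")

      print(f"log-series: p = {fit_ls.x:.4f}, AIC = {aic(fit_ls.fun, 1):.1f}")
      print(f"Zipf:       a = {fit_zf.x:.4f}, AIC = {aic(fit_zf.fun, 1):.1f}")
      # The model with the lower AIC is preferred; across many empirical communities
      # the study reports that the log-series generally wins such comparisons.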

  1. An extensive comparison of species-abundance distribution models

    Directory of Open Access Journals (Sweden)

    Elita Baldridge

    2016-12-01

    Full Text Available A number of different models have been proposed as descriptions of the species-abundance distribution (SAD). Most evaluations of these models use only one or two models, focus on only a single ecosystem or taxonomic group, or fail to use appropriate statistical methods. We use likelihood and AIC to compare the fit of four of the most widely used models to data on over 16,000 communities from a diverse array of taxonomic groups and ecosystems. Across all datasets combined the log-series, Poisson lognormal, and negative binomial all yield similar overall fits to the data. Therefore, when correcting for differences in the number of parameters the log-series generally provides the best fit to data. Within individual datasets some other distributions performed nearly as well as the log-series even after correcting for the number of parameters. The Zipf distribution is generally a poor characterization of the SAD.

  2. A nonextensive statistical model for the nucleon structure function

    Energy Technology Data Exchange (ETDEWEB)

    Trevisan, Luis A. [Departamento de Matematica e Estatistica, Universidade Estadual de Ponta Grossa, 84010-790, Ponta Grossa, PR (Brazil); Mirez, Carlos [Instituto de Ciencia, Engenharia e Tecnologia - ICET, Universidade Federal dos Vales do Jequitinhonha e Mucuri - UFVJM, Campus do Mucuri, Rua do Cruzeiro 01, Jardim Sao Paulo, 39803-371, Teofilo Otoni, Minas Gerais (Brazil)

    2013-03-25

    We studied an application of nonextensive thermodynamics to describe the structure function of the nucleon, in a model where the usual Fermi-Dirac and Bose-Einstein energy distributions were replaced by the equivalent functions of q-statistics. The parameters of the model are given by an effective temperature T, the q parameter (from Tsallis statistics), and two chemical potentials given by the corresponding up (u) and down (d) quark normalization in the nucleon.

  3. Some Extensions of the Nerlove-Press Model.

    Science.gov (United States)

    1980-10-01


  4. 78 FR 58267 - Notice of Request for Extension of Approval of an Information Collection; Requirements for...

    Science.gov (United States)

    2013-09-23

    ... Ramirez, Senior Staff Veterinarian, NCIE, APHIS, 4700 River Road Unit 40, Riverdale, MD 20737; (301) 851... carry the USDA seal and be endorsed by an authorized APHIS veterinarian. In addition, APHIS requires...

  5. A generic simulation cell method for developing extensible, efficient and readable parallel computational models

    Science.gov (United States)

    Honkonen, I.

    2015-03-01

    I present a method for developing extensible and modular computational models without sacrificing serial or parallel performance or source code readability. By using a generic simulation cell method I show that it is possible to combine several distinct computational models to run in the same computational grid without requiring modification of existing code. This is an advantage for the development and testing of, e.g., geoscientific software as each submodel can be developed and tested independently and subsequently used without modification in a more complex coupled program. An implementation of the generic simulation cell method presented here, generic simulation cell class (gensimcell), also includes support for parallel programming by allowing model developers to select which simulation variables of, e.g., a domain-decomposed model to transfer between processes via a Message Passing Interface (MPI) library. This allows the communication strategy of a program to be formalized by explicitly stating which variables must be transferred between processes for the correct functionality of each submodel and the entire program. The generic simulation cell class requires a C++ compiler that supports a version of the language standardized in 2011 (C++11). The code is available at https://github.com/nasailja/gensimcell for everyone to use, study, modify and redistribute; those who do are kindly requested to acknowledge and cite this work.
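
    The central idea, a generic cell that aggregates submodel variables and lets developers flag which of them must be exchanged between processes, can be sketched conceptually in Python as follows (an illustration of the concept only; the actual gensimcell is a C++11 template class and its interface differs):

      # Conceptual sketch of a generic simulation cell whose variables declare
      # whether they take part in inter-process transfers. Illustration only; the
      # actual gensimcell library is a C++11 template class with a different API.
      from dataclasses import dataclass, field
      from typing import Any, Dict

      @dataclass
      class Variable:
          value: Any
          transfer: bool = False        # True: must be sent to neighboring processes

      @dataclass
      class SimulationCell:
          variables: Dict[str, Variable] = field(default_factory=dict)

          def pack_for_transfer(self) -> Dict[str, Any]:
              """Collect only the variables each submodel has flagged for exchange."""
              return {name: var.value for name, var in self.variables.items() if var.transfer}

      # Two submodels share one cell without knowing about each other's variables.
      cell = SimulationCell()
      cell.variables["mhd_density"] = Variable(1.2e-21, transfer=True)       # needed by neighbors
      cell.variables["mhd_momentum"] = Variable((0.0, 0.0, 1e-15), transfer=True)
      cell.variables["particle_count"] = Variable(128, transfer=False)       # local bookkeeping only

      print(cell.pack_for_transfer())   # only the flagged MHD variables are packed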

  6. A renormalizable extension of the NJL-model

    International Nuclear Information System (INIS)

    Langfeld, K.; Kettner, C.; Reinhardt, H.

    1996-01-01

    The Nambu-Jona-Lasinio model is supplemented by the quark interaction generated by the one-gluon exchange. The employed gluon propagator exhibits the correct large-momentum behavior of QCD, whereas the Landau pole at low energies is screened. The emerging constituent quark model is one-loop renormalizable and interpolates between the phenomenologically successful Nambu-Jona-Lasinio model (modified by a transversal projector) at low energies and perturbative QCD at high momenta. Consequently, the momentum dependence of the quark self-energy at high energy coincides with the prediction from perturbative QCD. The chiral phase transition is studied in dependence on the low-energy four-quark interaction strength in the Dyson-Schwinger equation approach. The critical exponents of the quark self-energy and the quark condensate are obtained. The latter exponent deviates from the NJL-result. Pion properties are addressed by means of the Bethe-Salpeter equation. The validity of the Gell-Mann-Oakes-Renner relation is verified. Finally, we study the conditions under which the Nambu-Jona-Lasinio model is a decent approximation to our renormalizable theory as well as the shortcoming of the NJL-model due to its inherent non-renormalizability. (orig.)
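
    For reference, the two-flavour NJL interaction that serves as the low-energy starting point of such models is commonly written as (conventions vary between papers)

    $$ \mathcal{L}_{\mathrm{NJL}} = \bar{q}\,(i\gamma^{\mu}\partial_{\mu} - m_0)\,q + G\left[(\bar{q}q)^2 + (\bar{q}\,i\gamma_5\vec{\tau}\,q)^2\right], $$

    which the work described above supplements with a one-gluon-exchange-type current-current interaction whose propagator has the correct large-momentum behaviour of QCD.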

  7. Biosocial models of adolescent problem behavior: extension to panel design.

    Science.gov (United States)

    Drigotas, S M; Udry, J R

    1993-01-01

    We extended the biosocial model of problem behavior tested by Udry (1990) to a panel design, following a sample of over one hundred boys in adolescence for three years. We found the expected results for sociological variables, but weaker effects for testosterone than Udry found on cross-sectional data. Using panel models with lagged hormone effects, we identified relationships between Time-1 testosterone and problem behavior one year or more later. The relationship between testosterone and problem behavior was not present for subsequent measures of testosterone, either in cross-section or with time-lagged models. Therefore we cannot interpret the results as showing testosterone effects on problem behavior. Rather it appears that testosterone level in early adolescence is a marker for a more general growth trajectory of early development.

  8. THE MISHKIN TEST: AN ANALYSIS OF MODEL EXTENSIONS

    Directory of Open Access Journals (Sweden)

    Diana MURESAN

    2015-04-01

    Full Text Available This paper reviews empirical research that applies the Mishkin test to examine the existence of the accruals anomaly using alternative approaches. The Mishkin test is a test used in macro-econometrics for the rational expectations hypothesis, which tests for market efficiency. Starting with Sloan (1996), the model has been applied to the accruals anomaly literature. Since Sloan (1996), the model has undergone various improvements and has been the subject of many debates in the literature regarding its efficacy. Nevertheless, the current evidence strengthens the pervasiveness of the model. The analysis of the extended studies on the Mishkin test highlights that adding additional variables enhances the results, providing insightful information about the occurrence of the accruals anomaly.
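
    For readers unfamiliar with the procedure, in the form popularized by Sloan (1996) the Mishkin test jointly estimates a forecasting equation and a pricing equation, for example

    $$ E_{t+1} = \alpha_0 + \alpha_1\,ACC_t + \alpha_2\,CF_t + \varepsilon_{t+1}, $$
    $$ R_{t+1} = \beta\left(E_{t+1} - \alpha_0^{*} - \alpha_1^{*}ACC_t - \alpha_2^{*}CF_t\right) + u_{t+1}, $$

    where E denotes earnings, ACC accruals, CF cash flows and R abnormal returns. Rational pricing (market efficiency) requires α*_i = α_i; the system is typically estimated jointly by iterative nonlinear least squares and the rationality constraint is evaluated with a likelihood-ratio test. The extended studies reviewed above add further explanatory variables to both equations; the exact specification varies across papers.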

  9. An extension of the multiple-trapping model

    International Nuclear Information System (INIS)

    Shkilev, V. P.

    2012-01-01

    The hopping charge transport in disordered semiconductors is considered. Using the concept of the transport energy level, macroscopic equations are derived that extend a multiple-trapping model to the case of semiconductors with both energy and spatial disorders. It is shown that, although both types of disorder can cause dispersive transport, the frequency dependence of conductivity is determined exclusively by the spatial disorder.
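
    For context, a common textbook form of the classical multiple-trapping equations that such work generalizes (one spatial dimension, traps at energies E_i) is

    $$ \frac{\partial \rho_f}{\partial t} + \sum_i \frac{\partial \rho_i}{\partial t} = \frac{\partial}{\partial x}\left(D\,\frac{\partial \rho_f}{\partial x} - \mu F\,\rho_f\right), \qquad \frac{\partial \rho_i}{\partial t} = \omega_i\,\rho_f - \nu_0\,e^{-E_i/k_B T}\,\rho_i, $$

    where ρ_f is the density of free carriers, ρ_i the density trapped at level E_i, F the electric field, ω_i the capture rate and the last term describes thermally activated release; the extension discussed in the abstract adds spatial disorder through the transport-energy-level concept.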

  10. Extensions of Scott's Graph Model and Kleene's Second Algebra

    NARCIS (Netherlands)

    van Oosten, J.; Voorneveld, Niels

    We use a way to extend partial combinatory algebras (pcas) by forcing them to represent certain functions. In the case of Scott’s Graph Model, equality is computable relative to the complement function. However, the converse is not true. This creates a hierarchy of pcas which relates to similar

  11. Quark matter revisited with non-extensive MIT bag model

    Science.gov (United States)

    Cardoso, Pedro H. G.; Nunes da Silva, Tiago; Deppman, Airton; Menezes, Débora P.

    2017-10-01

    In this work we revisit the MIT bag model to describe quark matter within both the usual Fermi-Dirac and the Tsallis statistics. We verify the effects of the non-additivity of the latter by analysing two different pictures: the first order phase transition of the QCD phase diagram and stellar matter properties. While the QCD phase diagram is visually affected by the Tsallis statistics, the resulting effects on quark star macroscopic properties are barely noticed.

  12. Quark matter revisited with non-extensive MIT bag model

    Energy Technology Data Exchange (ETDEWEB)

    Cardoso, Pedro H.G.; Nunes da Silva, Tiago; Menezes, Debora P. [Universidade Federal de Santa Catarina, Departamento de Fisica, CFM, Florianopolis (Brazil); Deppman, Airton [Instituto de Fisica da Universidade de Sao Paulo, Sao Paulo (Brazil)

    2017-10-15

    In this work we revisit the MIT bag model to describe quark matter within both the usual Fermi-Dirac and the Tsallis statistics. We verify the effects of the non-additivity of the latter by analysing two different pictures: the first order phase transition of the QCD phase diagram and stellar matter properties. While the QCD phase diagram is visually affected by the Tsallis statistics, the resulting effects on quark star macroscopic properties are barely noticed. (orig.)

  13. Turbulence Modeling of Flows with Extensive Crossflow Separation

    Directory of Open Access Journals (Sweden)

    Argyris G. Panaras

    2015-07-01

    Full Text Available The reasons for the difficulty in accurately simulating strong 3-D shock wave/turbulent boundary layer interactions (SBLIs) and high-alpha flows with classical turbulence models are investigated. These flows are characterized by the appearance of strong crossflow separation. In view of recent additional evidence, a previously published flow analysis, which attributes the poor performance of classical turbulence models to the observed laminarization of the separation domain, is reexamined. According to this analysis, the longitudinal vortices into which the separated boundary layer rolls up in this type of separated flow transfer external inviscid air into the part of the separation adjacent to the wall, decreasing its turbulence. It is demonstrated that linear models based on the Boussinesq equation provide solutions of moderate accuracy, while non-linear ones and others that consider the particular structure of the flow are more effective. Published and new Reynolds Averaged Navier–Stokes (RANS) simulations are reviewed, as well as results from a recent Large Eddy Simulation (LES) study, which indicate that in calculations of sufficient accuracy the turbulent kinetic energy of the reverse flow inside the separation vortices is very low, i.e., the flow is almost laminar there.

  14. 76 FR 4946 - Proposed Extension of the Approval of Information Collection Requirements

    Science.gov (United States)

    2011-01-27

    ... that requested data can be provided in the desired format, reporting burden (time and financial... Information Collection Requirements AGENCY: Wage and Hour Division, Department of Labor. ACTION: Notice... with an opportunity to comment on proposed and/or continuing collections of information in accordance...

  15. 78 FR 16299 - Proposed Extension of the Approval of Information Collection Requirements

    Science.gov (United States)

    2013-03-14

    ... that requested data can be provided in the desired format, reporting burden (time and financial... Information Collection Requirements AGENCY: Wage and Hour Division, Department of Labor. ACTION: Notice... with an opportunity to comment on proposed and/or continuing collections of information in accordance...

  16. 75 FR 57296 - Proposed Extension of the Approval of Information Collection Requirements

    Science.gov (United States)

    2010-09-20

    ... units earned, if paid on a piece work basis; (3) Number of hours worked; (4) Total pay period earnings... also required to provide an itemized written statement of this information to each migrant and seasonal... of such records is to retain them for a period of three years. Respondents must also make and keep...

  17. 8 CFR 214.2 - Special requirements for admission, extension, and maintenance of status.

    Science.gov (United States)

    2010-01-01

    ... performed; the salary offered; and verification that the dependent possesses the qualifications for the... pursuant to the North American Free Trade Agreement (NAFTA). A citizen of Canada or Mexico seeking..., with respect to Mexico, are those requirements which were in effect at the time of entry into force of...

  18. 77 FR 67026 - Proposed Extension of the Approval of Information Collection Requirements

    Science.gov (United States)

    2012-11-08

    ... Department of Labor, to laborers and mechanics on most federally financed or assisted construction projects... employment of laborers or mechanics. The requirements of this information collection consist of: (1) Reports...): $3996. Total Burden Costs (operation/maintenance): $54,732. Dated: October 31, 2012. Mary Ziegler...

  19. 77 FR 74225 - Proposed Extension of the Approval of Information Collection Requirements

    Science.gov (United States)

    2012-12-13

    ... format, reporting burden (time and financial resources) is minimized, collection instruments are clearly... for these records. The regulations set forth reporting requirements that include a Work Study Program... Regulations (WSP) Regulations 29 CFR Section 570.35b. A copy of the proposed information request can be...

  20. Anisotropic extension of Finch and Skea stellar model

    Science.gov (United States)

    Sharma, Ranjan; Das, Shyam; Thirukkanesh, S.

    2017-12-01

    In this paper, the spacetime geometry of Finch and Skea [Class. Quantum Gravity 6:467, 1989] has been utilized to obtain closed-form solutions for a spherically symmetric anisotropic matter distribution. By examining its physical admissibility, we have shown that the class of solutions can be used as viable models for observed pulsars. In particular, a specific class of solutions can be used as an `anisotropic switch' to examine the impact of anisotropy on the gross physical properties of a stellar configuration. Accordingly, the mass-radius relationship has been analyzed.

  1. A fuzzy model for exploiting customer requirements

    Directory of Open Access Journals (Sweden)

    Zahra Javadirad

    2016-09-01

    Full Text Available Nowadays, Quality function deployment (QFD) is one of the total quality management tools, in which customers’ views and requirements are captured and, using various techniques, translated into improved production requirements and operations. The QFD department, after identifying and analysing the competitors, collects customers’ feedback so that the products meet customer demands relative to the competitors. In this study, a comprehensive model for assessing the importance of the customer requirements in the products or services of an organization is proposed. The proposed study uses linguistic variables, as a more comprehensive approach, to increase the precision of the expressed evaluations. The importance of these requirements specifies the strengths and weaknesses of the organization in meeting the requirements relative to competitors. The results of these experiments show that the proposed method performs better than the other methods.
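
    The sketch below illustrates the kind of computation described above: linguistic importance ratings are mapped to triangular fuzzy numbers, aggregated over customers and defuzzified into crisp importance weights. The linguistic scale, the aggregation rule and the requirement names are assumptions for illustration, not the paper's exact model.

```python
# Triangular fuzzy numbers (low, mode, high) for each linguistic term.
SCALE = {
    "very low": (0.0, 0.0, 0.25),
    "low": (0.0, 0.25, 0.5),
    "medium": (0.25, 0.5, 0.75),
    "high": (0.5, 0.75, 1.0),
    "very high": (0.75, 1.0, 1.0),
}

def aggregate(ratings):
    """Average the customers' fuzzy ratings component-wise."""
    lows, modes, highs = zip(*(SCALE[r] for r in ratings))
    n = len(ratings)
    return (sum(lows) / n, sum(modes) / n, sum(highs) / n)

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number."""
    low, mode, high = tfn
    return (low + mode + high) / 3.0

requirements = {
    "easy to clean": ["high", "very high", "medium"],
    "low noise": ["medium", "low", "high"],
}
weights = {req: defuzzify(aggregate(r)) for req, r in requirements.items()}
print(weights)  # crisp importance of each customer requirement
```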

  2. Twin support vector machines models, extensions and applications

    CERN Document Server

    Jayadeva; Chandra, Suresh

    2017-01-01

    This book provides a systematic and focused study of the various aspects of twin support vector machines (TWSVM) and related developments for classification and regression. In addition to presenting most of the basic models of TWSVM and twin support vector regression (TWSVR) available in the literature, it also discusses the important and challenging applications of this new machine learning methodology. A chapter on “Additional Topics” has been included to discuss kernel optimization and support tensor machine topics, which are comparatively new but have great potential in applications. It is primarily written for graduate students and researchers in the area of machine learning and related topics in computer science, mathematics, electrical engineering, management science and finance.
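
    For readers new to the topic, the linear TWSVM determines two nonparallel hyperplanes, one per class, by solving two smaller quadratic programs of roughly the following form (this is a simplified statement and the notation varies slightly across the literature):

    $$ \min_{w_1, b_1, \xi}\ \tfrac{1}{2}\|A w_1 + e_1 b_1\|^2 + c_1\, e_2^{\top}\xi \quad \text{s.t.}\quad -(B w_1 + e_2 b_1) + \xi \ge e_2,\ \ \xi \ge 0, $$
    $$ \min_{w_2, b_2, \eta}\ \tfrac{1}{2}\|B w_2 + e_2 b_2\|^2 + c_2\, e_1^{\top}\eta \quad \text{s.t.}\quad (A w_2 + e_1 b_2) + \eta \ge e_1,\ \ \eta \ge 0, $$

    where A and B stack the samples of the two classes and e_1, e_2 are vectors of ones; a new point is assigned to the class whose hyperplane it lies closer to.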

  3. Extensible 3D (X3D) Earth Technical Requirements Workshop Summary Report

    Science.gov (United States)

    2007-08-01

    ... schemes • They already have technologies of choice, economic imperatives and business models • Viva la difference • Some commercial approaches may ... display from simulation in a city. 8. Community-provided object authoring: provide an easy art path for users to create content ...

  4. Tests of local Lorentz invariance violation of gravity in the standard model extension with pulsars.

    Science.gov (United States)

    Shao, Lijing

    2014-03-21

    The standard model extension is an effective field theory introducing all possible Lorentz-violating (LV) operators to the standard model and general relativity (GR). In the pure-gravity sector of the minimal standard model extension, nine coefficients describe dominant observable deviations from GR. We systematically implemented 27 tests from 13 pulsar systems to tightly constrain eight linear combinations of these coefficients with extensive Monte Carlo simulations. This constitutes the first detailed and systematic test of the pure-gravity sector of the minimal standard model extension with state-of-the-art pulsar observations. No deviation from GR was detected. The limits on the LV coefficients are expressed in the canonical Sun-centered celestial-equatorial frame for the convenience of further studies. They are all improved over existing limits by significant factors of tens to hundreds. As a consequence, Einstein's equivalence principle is verified substantially further by pulsar experiments in terms of local Lorentz invariance in gravity.

  5. Softened Gravity and the Extension of the Standard Model up to Infinite Energy

    CERN Document Server

    Giudice, Gian F; Salvio, Alberto; Strumia, Alessandro

    2015-01-01

    Attempts to solve naturalness by having the weak scale as the only breaking of classical scale invariance have to deal with two severe difficulties: gravity and the absence of Landau poles. We show that solutions to the first problem require premature modifications of gravity at scales no larger than $10^{11}$ GeV, while the second problem calls for many new particles at the weak scale. To build models that fulfil these properties, we classify 4-dimensional Quantum Field Theories that satisfy Total Asymptotic Freedom (TAF): the theory holds up to infinite energy, where all coupling constants flow to zero. We develop a technique to identify such theories and determine their low-energy predictions. Since the Standard Model turns out to be asymptotically free only under the unphysical conditions $g_1 = 0$, $M_t = 186$ GeV, $M_\tau = 0$, $M_h = 163$ GeV, we explore some of its weak-scale extensions that satisfy the requirements for TAF.

  6. Modeling requirements for in situ vitrification

    International Nuclear Information System (INIS)

    MacKinnon, R.J.; Mecham, D.C.; Hagrman, D.L.; Johnson, R.W.; Murray, P.E.; Slater, C.E.; Marwil, E.S.; Weaver, R.A.; Argyle, M.D.

    1991-11-01

    This document outlines the requirements for the model being developed at the INEL which will provide analytical support for the ISV technology assessment program. The model includes representations of the electric potential field, thermal transport with melting, gas and particulate release, vapor migration, off-gas combustion and process chemistry. The modeling objectives are to (1) help determine the safety of the process by assessing the air and surrounding soil radionuclide and chemical pollution hazards, the nuclear criticality hazard, and the explosion and fire hazards, (2) help determine the suitability of the ISV process for stabilizing the buried wastes involved, and (3) help design laboratory and field tests and interpret results therefrom

  7. A digital repository with an extensible data model for biobanking and genomic analysis management

    Science.gov (United States)

    2014-01-01

    Motivation: Molecular biology laboratories require extensive metadata to improve data collection and analysis. The heterogeneity of the collected metadata grows as research is evolving into international multi-disciplinary collaborations and increasing data sharing among institutions. A single standard is not feasible, and it becomes crucial to develop digital repositories with flexible and extensible data models, as in the case of modern integrated biobank management. Results: We developed a novel data model in JSON format to describe heterogeneous data in a generic biomedical science scenario. The model is built on two hierarchical entities: processes and events, roughly corresponding to research studies and analysis steps within a single study. A number of sequential events can be grouped in a process, building up a hierarchical structure to track patient and sample history. Each event can produce new data. Data are described by a set of user-defined metadata, and may have one or more associated files. We integrated the model in a web-based digital repository with a data grid storage to manage large data sets located in geographically distinct areas. We built a graphical interface that allows authorized users to define new data types dynamically, according to their requirements. Operators compose queries on metadata fields using a flexible search interface and run them on the database and on the grid. We applied the digital repository to the integrated management of samples, patients and medical history in the BIT-Gaslini biobank. The platform currently manages 1800 samples from over 900 patients. Microarray data from 150 analyses are stored on the grid storage and replicated on two physical resources for preservation. The system is equipped with data integration capabilities with other biobanks for worldwide information sharing. Conclusions: Our data model enables users to continuously define flexible, ad hoc, and loosely structured metadata, for information
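
    The sketch below shows what such a hierarchical process/event document might look like. The field names and values are illustrative assumptions, not the repository's actual schema.

```python
# Illustrative process/event document in JSON style (hypothetical field names).
import json

process = {
    "type": "process",
    "description": "Gene expression study for patient P-001",
    "events": [
        {
            "type": "event",
            "description": "Blood sample collection",
            "metadata": {"tissue": "blood", "volume_ml": 5},
            "files": [],
        },
        {
            "type": "event",
            "description": "Microarray analysis",
            "metadata": {"platform": "user-defined", "operator": "lab-01"},
            "files": ["arrays/P-001_expr.cel"],
        },
    ],
}
print(json.dumps(process, indent=2))
```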

  8. Solar Storm GIC Forecasting: Solar Shield Extension Development of the End-User Forecasting System Requirements

    Science.gov (United States)

    Pulkkinen, A.; Mahmood, S.; Ngwira, C.; Balch, C.; Lordan, R.; Fugate, D.; Jacobs, W.; Honkonen, I.

    2015-01-01

    A NASA Goddard Space Flight Center Heliophysics Science Division-led team that includes NOAA Space Weather Prediction Center, the Catholic University of America, Electric Power Research Institute (EPRI), and Electric Research and Management, Inc., recently partnered with the Department of Homeland Security (DHS) Science and Technology Directorate (S&T) to better understand the impact of Geomagnetically Induced Currents (GIC) on the electric power industry. This effort builds on a previous NASA-sponsored Applied Sciences Program for predicting GIC, known as Solar Shield. The focus of the new DHS S&T funded effort is to revise and extend the existing Solar Shield system to enhance its forecasting capability and provide tailored, timely, actionable information for electric utility decision makers. To enhance the forecasting capabilities of the new Solar Shield, a key undertaking is to extend the prediction system coverage across Contiguous United States (CONUS), as the previous version was only applicable to high latitudes. The team also leverages the latest enhancements in space weather modeling capacity residing at Community Coordinated Modeling Center to increase the Technological Readiness Level, or Applications Readiness Level of the system http://www.nasa.gov/sites/default/files/files/ExpandedARLDefinitions4813.pdf.

  9. First Order Electroweak Phase Transition from (Non)Conformal Extensions of the Standard Model

    DEFF Research Database (Denmark)

    Sannino, Francesco; Virkajärvi, Jussi

    2015-01-01

    We analyse and compare the finite-temperature electroweak phase transition properties of classically (non)conformal extensions of the Standard Model. In the classically conformal scenarios the breaking of the electroweak symmetry is generated radiatively. The models feature new scalars coupled ... the associated models are testable at the upcoming Large Hadron Collider Run 2 experiments.

  10. Spatial Double Generalized Beta Regression Models: Extensions and Application to Study Quality of Education in Colombia

    Science.gov (United States)

    Cepeda-Cuervo, Edilberto; Núñez-Antón, Vicente

    2013-01-01

    In this article, a proposed Bayesian extension of the generalized beta spatial regression models is applied to the analysis of the quality of education in Colombia. We briefly revise the beta distribution and describe the joint modeling approach for the mean and dispersion parameters in the spatial regression models' setting. Finally, we motivate…

  11. TAF-4 is required for the life extension of isp-1, clk-1 and tpk-1 Mit mutants.

    Science.gov (United States)

    Khan, Maruf H; Ligon, Melissa; Hussey, Lauren R; Hufnal, Bryce; Farber, Robert; Munkácsy, Erin; Rodriguez, Amanda; Dillow, Andy; Kahlig, Erynn; Rea, Shane L

    2013-10-01

    While numerous life-extending manipulations have been discovered in the nematode Caenorhabditis elegans, one that remains most enigmatic is disruption of oxidative phosphorylation. In order to unravel how such an ostensibly deleterious manipulation can extend lifespan, we sought to identify the ensemble of nuclear transcription factors that are activated in response to defective mitochondrial electron transport chain (ETC) function. Using a feeding RNAi approach, we targeted over 400 transcription factors and identified 15 that, when reduced in function, reproducibly and differentially altered the development, stress response, and/or fecundity of isp-1(qm150) Mit mutants relative to wild-type animals. Seven of these transcription factors--AHA-1, CEH-18, HIF-1, JUN-1, NHR-27, NHR-49 and the CREB homolog-1 (CRH-1)-interacting protein TAF-4--were also essential for isp-1 life extension. When we tested the involvement of these seven transcription factors in the life extension of two other Mit mutants, namely clk-1(qm30) and tpk-1(qm162), TAF-4 and HIF-1 were consistently required. Our findings suggest that the Mit phenotype is under the control of multiple transcriptional responses, and that TAF-4 and HIF-1 may be part of a general signaling axis that specifies Mit mutant life extension.

  12. A new interpolation method for gridded extensive variables with application in Lagrangian transport and dispersion models

    Science.gov (United States)

    Hittmeir, Sabine; Philipp, Anne; Seibert, Petra

    2017-04-01

    In discretised form, an extensive variable usually represents an integral over a 3-dimensional (x,y,z) grid cell. In the case of vertical fluxes, gridded values represent integrals over a horizontal (x,y) grid face. In meteorological models, fluxes (precipitation, turbulent fluxes, etc.) are usually written out as temporally integrated values, thus effectively forming 3D (x,y,t) integrals. Lagrangian transport models require interpolation of all relevant variables towards the location in 4D space of each of the computational particles. Trivial interpolation algorithms usually implicitly assume the integral value to be a point value valid at the grid centre. If the integral value were reconstructed from the interpolated point values, it would in general not be correct. If nonlinear interpolation methods are used, non-negativity cannot easily be ensured. This problem became obvious with respect to the interpolation of precipitation for the calculation of wet deposition in FLEXPART (http://flexpart.eu), which uses ECMWF model output or other gridded input data. The presently implemented method consists of a special preprocessing in the input preparation software and subsequent linear interpolation in the model. The interpolated values are positive but the criterion of cell-wise conservation of the integral property is violated; it is also not very accurate as it smoothes the field. A new interpolation algorithm was developed which introduces additional supporting grid points in each time interval, with linear interpolation applied between them in FLEXPART. It preserves the integral precipitation in each time interval, guarantees the continuity of the time series, and maintains non-negativity. The function values of the remapping algorithm at these subgrid points constitute the degrees of freedom, which can be prescribed in various ways. Combining the advantages of different approaches leads to a final algorithm respecting all the required conditions. To
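
    As a simplified illustration of the underlying idea (not the published FLEXPART algorithm, which additionally enforces non-negativity and further conditions), the sketch below inserts one supporting point per interval and chooses its value so that the piecewise-linear reconstruction reproduces each interval's accumulated value exactly.

```python
import numpy as np

def conservative_subgrid(P, dt):
    """Subgrid times/values whose piecewise-linear (trapezoidal) integral over
    each original interval equals the accumulated value P[i] of that interval."""
    m = np.asarray(P, dtype=float) / dt            # interval means
    n = len(m)
    # Endpoint values: continuous across interval boundaries.
    e = np.empty(n + 1)
    e[0], e[-1] = m[0], m[-1]
    e[1:-1] = 0.5 * (m[:-1] + m[1:])
    # Midpoint values chosen so each interval's trapezoidal integral is P[i].
    c = 2.0 * m - 0.5 * (e[:-1] + e[1:])
    # Interleave endpoints and midpoints: t0, t0+dt/2, t1, t1+dt/2, ...
    times = np.empty(2 * n + 1)
    values = np.empty(2 * n + 1)
    times[0::2] = np.arange(n + 1) * dt
    times[1::2] = (np.arange(n) + 0.5) * dt
    values[0::2] = e
    values[1::2] = c
    return times, values

dt = 1.0
P = [1.0, 3.0, 2.0, 1.0]     # accumulated precipitation per interval (mm)
t, v = conservative_subgrid(P, dt)
# Check: the trapezoidal integral over each original interval reproduces P[i].
print([round(0.25 * dt * (v[2*i] + 2*v[2*i+1] + v[2*i+2]), 6) for i in range(len(P))])
```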

  13. C. elegans lifespan extension by osmotic stress requires FUdR, base excision repair, FOXO, and sirtuins.

    Science.gov (United States)

    Anderson, Edward N; Corkins, Mark E; Li, Jia-Cheng; Singh, Komudi; Parsons, Sadé; Tucey, Tim M; Sorkaç, Altar; Huang, Huiyan; Dimitriadi, Maria; Sinclair, David A; Hart, Anne C

    2016-03-01

    Moderate stress can increase lifespan by hormesis, a beneficial low-level induction of stress response pathways. 5'-fluorodeoxyuridine (FUdR) is commonly used to sterilize Caenorhabditis elegans in aging experiments. However, FUdR alters lifespan in some genotypes and induces resistance to thermal and proteotoxic stress. We report that hypertonic stress in combination with FUdR treatment or inhibition of the FUdR target thymidylate synthase, TYMS-1, extends C. elegans lifespan by up to 30%. By contrast, in the absence of FUdR, hypertonic stress decreases lifespan. Adaptation to hypertonic stress requires diminished Notch signaling and loss of Notch co-ligands leads to lifespan extension only in combination with FUdR. Either FUdR treatment or TYMS-1 loss induced resistance to acute hypertonic stress, anoxia, and thermal stress. FUdR treatment increased expression of DAF-16 FOXO and the osmolyte biosynthesis enzyme GPDH-1. FUdR-induced hypertonic stress resistance was partially dependent on sirtuins and base excision repair (BER) pathways, while FUdR-induced lifespan extension under hypertonic stress conditions requires DAF-16, BER, and sirtuin function. Combined, these results demonstrate that FUdR, through inhibition of TYMS-1, activates stress response pathways in somatic tissues to confer hormetic resistance to acute and chronic stress. C. elegans lifespan studies using FUdR may need re-interpretation in light of this work. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  14. Some simple extensions of the standard model and muon member violation

    International Nuclear Information System (INIS)

    Ng, J.N.

    1994-05-01

    A brief discussion of a systematic study of simple particle extensions to the standard model (SM) is given. The effects of such extensions in lepton number violation processes such as μ → e γ, 3e and μ -e conversion nuclei is given. It is found that μ → e γ and μ -e conversion offer the best opportunities for the discovery of this kind of new physics. (author)

  15. Evaluation of Online Video Usage and Learning Satisfaction: An Extension of the Technology Acceptance Model

    Science.gov (United States)

    Nagy, Judit T.

    2018-01-01

    The aim of the study was to examine the determining factors of students' video usage and their learning satisfaction relating to the supplementary application of educational videos, accessible in a Moodle environment in a Business Mathematics Course. The research model is based on the extension of "Technology Acceptance Model" (TAM), in…

  16. A factorization model for the generalized Friedrichs extension in a Pontryagin space

    NARCIS (Netherlands)

    Derkach, Vladimir; Hassi, Seppo; de Snoo, Henk; Forster, KH; Jonas, P; Langer, H

    2006-01-01

    An operator model for the generalized Friedrichs extension in the Pontryagin space setting is presented. The model is based on a factorization of the associated Weyl function (or Q-function) and it carries the information on the asymptotic behavior of the Weyl function at z = infinity.

  17. Hidden Markov models for sequence analysis: extension and analysis of the basic method

    DEFF Research Database (Denmark)

    Hughey, Richard; Krogh, Anders Stærmose

    1996-01-01

    ...expectation-maximization training procedure is relatively straightforward. In this paper, we review the mathematical extensions and heuristics that move the method from the theoretical to the practical. Then, we experimentally analyze the effectiveness of model regularization, dynamic model modification, and optimization strategies...

  18. Modeling and Application Domain Extension of CityGML in UML

    NARCIS (Netherlands)

    Van den Brink, L.; Stoter, J.E.; Zlatanova, S.

    2012-01-01

    This paper presents key aspects of the development of a Dutch 3D standard IMGeo as a CityGML ADE. The new ADE is modeled using UML class diagrams. However the OGC CityGML specification does not provide clear rules on modeling an ADE in UML. This paper describes how the extension was built, which

  19. An Object-Oriented Model for Extensible Concurrent Systems: the Composition-Filters Approach

    NARCIS (Netherlands)

    Bergmans, Lodewijk; Aksit, Mehmet; Wakita, K.; Wakita, Ken; Yonezawa, Akinori

    1992-01-01

    Applying the object-oriented paradigm for the development of large and complex software systems offers several advantages, of which increased extensibility and reusability are the most prominent ones. The object-oriented model is also quite suitable for modeling concurrent systems. However, it

  20. Conditions for vacuum stability in an S{sub 3} extension of the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Beltran, O Felix [Fac. de Cs. de la Electronica, BUAP, Apdo. Postal 542, Puebla, Pue. 72570 (Mexico); Mondragon, M [Instituto de Fisica, Universidad Nacional Autonoma de Mexico, Apdo. Postal 20-364, Mexico, D.F. 01000 (Mexico); RodrIguez-Jauregui, E, E-mail: ezequiel.rodriguez@correo.fisica.uson.m [Departamento de Fisica, UNISON, Apdo. Postal 1626, Hermosillo, Son. 83000 (Mexico)

    2009-06-01

    In this work we study the Higgs sector in the minimal S{sub 3} extension of the Standard Model. The S{sub 3} extended Standard Model, which has three Higgs doublet fields that belong to the three-dimensional reducible representation of the permutation group S{sub 3}, naturally features new phenomena: there are several Higgs bosons, charged, neutral and pseudoscalar ones, and more than one potential minimum. We analyzed the stability of the minimal S{sub 3} invariant extension of the Higgs potential and show that at tree-level, the potential minimum preserving electric charge and CP symmetries, when it exists, is the global one.

  1. Tracking Maneuvering Group Target with Extension Predicted and Best Model Augmentation Method Adapted

    Directory of Open Access Journals (Sweden)

    Linhai Gan

    2017-01-01

    Full Text Available The random matrix (RM) method is widely applied for group target tracking. The assumption in the conventional RM method that the group extension remains invariant is not valid, as the orientation of the group varies rapidly while it is maneuvering; thus, a new approach with group extension prediction is derived here. To match the group maneuvering, a best model augmentation (BMA) method is introduced. The existing BMA method uses a fixed basic model set, which may lead to poor performance when it cannot ensure coverage of the true motion modes. Here, a maneuvering group target tracking algorithm is proposed, in which the group extension prediction and the BMA adaptation are exploited. The performance of the proposed algorithm is illustrated by simulation.

  2. Exercise effects in a virtual type 1 diabetes patient: Using stochastic differential equations for model extension

    DEFF Research Database (Denmark)

    Duun-Henriksen, Anne Katrine; Schmidt, S.; Nørgaard, K.

    2013-01-01

    ... extension incorporating exercise effects on insulin and glucose dynamics. Our model is constructed as a stochastic state space model consisting of a set of stochastic differential equations (SDEs). In a stochastic state space model, the residual error is split into random measurement error ... physical activity. Exercise constitutes a substantial challenge to closed-loop control of T1D. The effects are many and depend on intensity and duration and may be delayed by several hours. In this study, we use a model for the glucoregulatory system based on the minimal model and a previously published ...
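
    In general form, the stochastic state space (grey-box) formulation referred to here adds a diffusion term to the ordinary differential equations of the deterministic model,

    $$ dx_t = f(x_t, u_t, t)\,dt + \sigma\,d\omega_t, \qquad y_k = h(x_{t_k}) + e_k, $$

    where the Wiener-process term σ dω_t represents system (model) noise, e_k the random measurement error, and u_t the inputs (e.g. insulin, carbohydrates, exercise). Splitting the residual error into these two components is what distinguishes SDE-based models from purely deterministic ODE models.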

  3. Evaluation of runner cone extension to dampen pressure pulsations in a Francis model turbine

    Science.gov (United States)

    Gogstad, Peter Joachim; Dahlhaug, Ole Gunnar

    2016-11-01

    Today's energy market has a high demand for flexibility due to the introduction of other intermittent renewables such as wind and solar. To ensure a steady power supply, hydro turbines are often forced to operate more at part load conditions. Originally, turbines were built for steady operation around the best efficiency point. The demand for flexibility, combined with old designs, has led to an increase in turbines having problems with hydrodynamic instabilities such as pressure pulsations. Different methods have been investigated to mitigate pressure pulsations. Air injection shows a significant reduction of pressure pulsation amplitudes. However, installation of air injection requires extra piping and a compressor. Investigation of other methods such as shaft extension shows promising results for some operational points, but may significantly reduce the efficiency of the turbine at other operational points. The installation of an extension of the runner cone was investigated at NTNU by Vekve in 2004. This has resulted in a cylindrical extension at Litjfossen Power Plant in Norway, where the bolt suffered mechanical failure. This indicates high-amplitude pressure pulsations in the draft tube centre. The high pressure pulsation amplitudes are believed to be related to high tangential velocity in the draft tube. The mentioned runner cone extension has further been developed into a freely rotating extension. The objective is to reduce the tangential velocity in the draft tube and thereby the pressure pulsation amplitudes.

  4. Investigating Students' Perceptions on Laptop Initiative in Higher Education: An Extension of the Technology Acceptance Model

    Science.gov (United States)

    Elwood, Susan; Changchit, Chuleeporn; Cutshall, Robert

    2006-01-01

    Purpose: This study aims to examine students' perceptions and their acceptance towards implementing a laptop program. Design/methodology/approach: Extensive research has been carried out on the technology acceptance model (TAM) to better understand the behavioral intention of individuals to accept and use technology. Therefore, the TAM was adopted…

  5. Recent extensions of the residence time distribution concept: unsteady state conditions and hydrodynamic model developments

    Directory of Open Access Journals (Sweden)

    Claudel S.

    2000-01-01

    Full Text Available Two recent extensions of the residence time distribution concept are developed. The first one concerns the use of this method under transient conditions, a concept treated theoretically but rarely confirmed by relevant experiments. In the present work, two experimental set-ups have been used to verify some limits of the concept. The second extension is devoted to the development of hydrodynamic models. Up to now, the hydrodynamics of the process are either determined by simple models (mixing cells in series, plug flow reactor with axial dispersion) or by the complex calculation of the velocity profile obtained via the Navier-Stokes equations. An alternative is to develop a hydrodynamic model by use of a complex network of interconnected elementary reactors. Such models should be simple enough to be derived easily and sufficiently complex to give a good representation of the behavior of the process.

  6. Singlet extensions of the standard model at LHC Run 2: benchmarks and comparison with the NMSSM

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Raul [Centro de Física Teórica e Computacional, Faculdade de Ciências,Universidade de Lisboa, Campo Grande, Edifício C8 1749-016 Lisboa (Portugal); Departamento de Física da Universidade de Aveiro,Campus de Santiago, 3810-183 Aveiro (Portugal); Mühlleitner, Margarete [Institute for Theoretical Physics, Karlsruhe Institute of Technology,76128 Karlsruhe (Germany); Sampaio, Marco O.P. [Departamento de Física da Universidade de Aveiro,Campus de Santiago, 3810-183 Aveiro (Portugal); CIDMA - Center for Research Development in Mathematics and Applications,Campus de Santiago, 3810-183 Aveiro (Portugal); Santos, Rui [Centro de Física Teórica e Computacional, Faculdade de Ciências,Universidade de Lisboa, Campo Grande, Edifício C8 1749-016 Lisboa (Portugal); ISEL - Instituto Superior de Engenharia de Lisboa,Instituto Politécnico de Lisboa, 1959-007 Lisboa (Portugal)

    2016-06-07

    The Complex singlet extension of the Standard Model (CxSM) is the simplest extension that provides scenarios for Higgs pair production with different masses. The model has two interesting phases: the dark matter phase, with a Standard Model-like Higgs boson, a new scalar and a dark matter candidate; and the broken phase, with all three neutral scalars mixing. In the latter phase Higgs decays into a pair of two different Higgs bosons are possible. In this study we analyse Higgs-to-Higgs decays in the framework of singlet extensions of the Standard Model (SM), with focus on the CxSM. After demonstrating that scenarios with large rates for such chain decays are possible we perform a comparison between the NMSSM and the CxSM. We find that, based on Higgs-to-Higgs decays, the only possibility to distinguish the two models at the LHC run 2 is through final states with two different scalars. This conclusion builds a strong case for searches for final states with two different scalars at the LHC run 2. Finally, we propose a set of benchmark points for the real and complex singlet extensions to be tested at the LHC run 2. They have been chosen such that the discovery prospects of the involved scalars are maximised and they fulfil the dark matter constraints. Furthermore, for some of the points the theory is stable up to high energy scales. For the computation of the decay widths and branching ratios we developed the Fortran code sHDECAY, which is based on the implementation of the real and complex singlet extensions of the SM in HDECAY.

  7. The Exponential Model for the Spectrum of a Time Series: Extensions and Applications

    DEFF Research Database (Denmark)

    Proietti, Tommaso; Luati, Alessandra

    The exponential model for the spectrum of a time series and its fractional extensions are based on the Fourier series expansion of the logarithm of the spectral density. The coefficients of the expansion form the cepstrum of the time series. After deriving the cepstrum of important classes of time...... to the log-spectrum. We then propose two extensions. The first deals with replacing the logarithmic link with a more general Box-Cox link, which encompasses also the identity and the inverse links: this enables nesting alternative spectral estimation methods (autoregressive, exponential, etc.) under the same...
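
    In one common parameterization (Bloomfield's EXP model), the spectral density is written as

    $$ f(\omega) = \frac{\sigma^2}{2\pi}\,\exp\!\left(2\sum_{k=1}^{p} c_k \cos(k\omega)\right), $$

    so that the log-spectrum is a truncated Fourier (cosine) series whose coefficients c_k are the cepstral coefficients. The Box-Cox extension mentioned above replaces the logarithmic link by a power transform that also covers the identity and inverse links.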

  8. Towards semantically sensitive text clustering: a feature space modeling technology based on dimension extension.

    Science.gov (United States)

    Liu, Yuanchao; Liu, Ming; Wang, Xin

    2015-01-01

    The objective of text clustering is to divide document collections into clusters based on the similarity between documents. In this paper, an extension-based feature modeling approach towards semantically sensitive text clustering is proposed along with the corresponding feature space construction and similarity computation method. By combining the similarity in traditional feature space and that in extension space, the adverse effects of the complexity and diversity of natural language can be addressed and clustering semantic sensitivity can be improved correspondingly. The generated clusters can be organized using different granularities. The experimental evaluations on well-known clustering algorithms and datasets have verified the effectiveness of our approach.
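
    A hedged sketch of the combined-similarity idea: cosine similarity in the original term space is blended with similarity computed over "extension" dimensions, here built from a hand-picked table of related terms. The expansion table, the blending weight and the use of scikit-learn are assumptions for illustration only.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical extension table mapping terms to broader/related concepts.
EXTENSIONS = {"car": "vehicle", "automobile": "vehicle", "physician": "doctor"}

def extend(doc):
    """Append the extension terms of every word found in the document."""
    words = doc.lower().split()
    return " ".join(words + [EXTENSIONS[w] for w in words if w in EXTENSIONS])

docs = ["The car broke down", "An automobile repair shop", "The physician left"]
alpha = 0.6  # weight of the original feature space

base = TfidfVectorizer().fit_transform(docs)
ext = TfidfVectorizer().fit_transform([extend(d) for d in docs])
combined = alpha * cosine_similarity(base) + (1 - alpha) * cosine_similarity(ext)
print(combined.round(2))  # feed this matrix into any clustering algorithm
```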

  9. Towards semantically sensitive text clustering: a feature space modeling technology based on dimension extension.

    Directory of Open Access Journals (Sweden)

    Yuanchao Liu

    Full Text Available The objective of text clustering is to divide document collections into clusters based on the similarity between documents. In this paper, an extension-based feature modeling approach towards semantically sensitive text clustering is proposed along with the corresponding feature space construction and similarity computation method. By combining the similarity in traditional feature space and that in extension space, the adverse effects of the complexity and diversity of natural language can be addressed and clustering semantic sensitivity can be improved correspondingly. The generated clusters can be organized using different granularities. The experimental evaluations on well-known clustering algorithms and datasets have verified the effectiveness of our approach.

  10. Extension of internationalisation models drivers and processes for the globalisation of product development

    DEFF Research Database (Denmark)

    Søndergaard, Erik Stefan; Oehmen, Josef; Ahmed-Kristensen, Saeema

    2016-01-01

    This paper develops an extension to established production- and supply chain management focused internationalisation models. It applies explorative case studies in Danish and Chinese engineering firms to discover how the globalisation process of product development differs from Danish and Chinese...... of product development and collaborative distributed development beyond sourcing, sales and production elements. The paper then provides propositions for how to further develop the suggested model, and how western companies can learn from the Chinese approaches, and globalise their product development...

  11. Predicting failure response of spot welded joints using recent extensions to the Gurson model

    DEFF Research Database (Denmark)

    Nielsen, Kim Lau

    2010-01-01

    The plug failure modes of resistance spot welded shear-lap and cross-tension test specimens are studied, using recent extensions to the Gurson model. A comparison of the predicted mechanical response is presented when using either: (i) the Gurson-Tvergaard-Needleman model (GTN-model), (ii) the shear-modified GTN-model by Nahshon and Hutchinson that also describes damage development at low triaxiality (NH-model) or (iii) the Gologanu-Leblond-Devaux model (GLD-model) accounting for non-spherical void growth. The failure responses predicted by the various models are discussed in relation to their approximate description of the nucleation, growth and coalescence of microvoids. Using the void shape factor of the GLD-model, a simple approach for approximating void nucleation by either particle fracture or particle-matrix decohesion is applied and a study of the subsequent void shape evolution

  12. Integrated modelling requires mass collaboration (Invited)

    Science.gov (United States)

    Moore, R. V.

    2009-12-01

    The need for sustainable solutions to the world’s problems is self evident; the challenge is to anticipate where, in the environment, economy or society, the proposed solution will have negative consequences. If we failed to realise that the switch to biofuels would have the seemingly obvious result of reduced food production, how much harder will it be to predict the likely impact of policies whose impacts may be more subtle? It has been clear for a long time that models and data will be important tools for assessing the impact of events and the measures for their mitigation. They are an effective way of encapsulating knowledge of a process and using it for prediction. However, most models represent a single or small group of processes. The sustainability challenges that face us now require not just the prediction of a single process but the prediction of how many interacting processes will respond in given circumstances. These processes will not be confined to a single discipline but will often straddle many. For example, the question, “What will be the impact on river water quality of the medical plans for managing a ‘flu pandemic and could they cause a further health hazard?” spans medical planning, the absorption of drugs by the body, the spread of disease, the hydraulic and chemical processes in sewers and sewage treatment works and river water quality. This question nicely reflects the present state of the art. We have models of the processes and standards, such as the Open Modelling Interface (the OpenMI), allow them to be linked together and to datasets. We can therefore answer the question but with the important proviso that we thought to ask it. The next and greater challenge is to deal with the open question, “What are the implications of the medical plans for managing a ‘flu pandemic?”. This implies a system that can make connections that may well not have occurred to us and then evaluate their probable impact. The final touch will be to

  13. Baryogenesis in the two doublet and inert singlet extension of the Standard Model

    DEFF Research Database (Denmark)

    Alanne, Tommi; Kainulainen, Kimmo; Tuominen, Kimmo

    2016-01-01

    We investigate an extension of the Standard Model containing two Higgs doublets and a singlet scalar field (2HDSM). We show that the model can have a strongly first-order phase transition and give rise to the observed baryon asymmetry of the Universe, consistent with all experimental constraints... with the nucleation temperature well below the critical temperature, Tn < Tc, which can significantly alter the usual phase-transition pattern in 2HD models with Tn ≈ Tc. Furthermore, the singlet field can be the dark matter particle. However, in models with a strong first-order transition its abundance is typically but a thousandth of the observed dark matter abundance.

  14. Extensions to the time lag models for practical application to rocket engine stability design

    Science.gov (United States)

    Casiano, Matthew J.

    The combustion instability problem in liquid-propellant rocket engines (LREs) has remained a tremendous challenge since their discovery in the 1930s. Improvements are usually made in solving the combustion instability problem primarily using computational fluid dynamics (CFD) and also by testing demonstrator engines. Another approach is to use analytical models. Analytical models can be used such that design, redesign, or improvement of an engine system is feasible in a relatively short period of time. Improvements to the analytical models can greatly aid in design efforts. A thorough literature review is first conducted on liquid-propellant rocket engine (LRE) throttling. Throttling is usually studied in terms of vehicle descent or ballistic missile control however there are many other cases where throttling is important. It was found that combustion instabilities are one of a few major issues that occur during deep throttling (other major issues are heat transfer concerns, performance loss, and pump dynamics). In the past and again recently, gas injected into liquid propellants has shown to be a viable solution to throttle engines and to eliminate some forms of combustion instability. This review uncovered a clever solution that was used to eliminate a chug instability in the Common Extensible Cryogenic Engine (CECE), a modified RL10 engine. A separate review was also conducted on classic time lag combustion instability models. Several new stability models are developed by incorporating important features to the classic and contemporary models, which are commonly used in the aerospace rocket industry. The first two models are extensions of the original Crocco and Cheng concentrated combustion model with feed system contributions. A third new model is an extension to the Wenzel and Szuch double-time lag model also with feed system contributions. The first new model incorporates the appropriate injector acoustic boundary condition which is neglected in contemporary
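
    For reference, the classic Crocco sensitive-time-lag (n-τ) model that underlies these extensions relates the burning-rate perturbation to the pressure history through the interaction index n and the sensitive time lag τ,

    $$ \frac{\dot m_b'(t)}{\bar{\dot m}_b} = n\left[\frac{p'(t)}{\bar p} - \frac{p'(t-\tau)}{\bar p}\right], $$

    and the extensions described above add feed-system dynamics and injector acoustic boundary conditions on top of this combustion response.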

  15. Mixed-Integer Linear Programming Models for Teaching Assistant Assignment and Extensions

    Directory of Open Access Journals (Sweden)

    Xiaobo Qu

    2017-01-01

    Full Text Available In this paper, we develop mixed-integer linear programming models for assigning the most appropriate teaching assistants to the tutorials in a department. The objective is to maximize the number of tutorials that are taught by the most suitable teaching assistants, accounting for the fact that different teaching assistants have different capabilities and each teaching assistant’s teaching load cannot exceed a maximum value. Moreover, with optimization models, the teaching load allocation, a time-consuming process, does not need to be carried out in a manual manner. We have further presented a number of extensions that capture more practical considerations. Extensive numerical experiments show that the optimization models can be solved by an off-the-shelf solver and used by departments in universities.
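
    A simplified sketch of this kind of assignment model, written with PuLP, is given below. The tutorials, teaching assistants, suitability scores and load limits are invented data, and the formulation omits the practical extensions discussed in the paper.

```python
from pulp import LpProblem, LpMaximize, LpVariable, lpSum, LpBinary

tutorials = ["T1", "T2", "T3", "T4"]
tas = ["Ann", "Bob"]
suitability = {("T1", "Ann"): 3, ("T1", "Bob"): 1, ("T2", "Ann"): 2,
               ("T2", "Bob"): 3, ("T3", "Ann"): 1, ("T3", "Bob"): 2,
               ("T4", "Ann"): 3, ("T4", "Bob"): 3}
max_load = {"Ann": 2, "Bob": 2}

prob = LpProblem("ta_assignment", LpMaximize)
x = LpVariable.dicts("assign", suitability.keys(), cat=LpBinary)

# Objective: total suitability of the chosen assignments.
prob += lpSum(suitability[k] * x[k] for k in suitability)
# Each tutorial gets exactly one teaching assistant.
for t in tutorials:
    prob += lpSum(x[(t, a)] for a in tas) == 1
# No teaching assistant exceeds the maximum teaching load.
for a in tas:
    prob += lpSum(x[(t, a)] for t in tutorials) <= max_load[a]

prob.solve()
print([(t, a) for (t, a) in suitability if x[(t, a)].value() == 1])
```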

  16. Phase transition and gravitational wave phenomenology of scalar conformal extensions of the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Marzola, Luca; Racioppi, Antonio; Vaskonen, Ville [National Institute of Chemical Physics and Biophysics, Tallinn (Estonia)

    2017-07-15

    Thermal corrections in classically conformal models typically induce a strong first-order electroweak phase transition, thereby resulting in a stochastic gravitational background that could be detectable at gravitational wave observatories. After reviewing the basics of classically conformal scenarios, in this paper we investigate the phase transition dynamics in a thermal environment and the related gravitational wave phenomenology within the framework of scalar conformal extensions of the Standard Model. We find that minimal extensions involving only one additional scalar field struggle to reproduce the correct phase transition dynamics once thermal corrections are accounted for. Next-to-minimal models, instead, yield the desired electroweak symmetry breaking and typically result in a very strong gravitational wave signal. (orig.)

  17. Modelling in support of decision-making for South African extensive beef farmers

    Directory of Open Access Journals (Sweden)

    D.H. Meyer

    2003-12-01

    Full Text Available In this study it is shown that it is possible to build a decision support system for the use of South African extensive beef farmers. Initially models for the key variables which affect extensive beef farmers are developed. These key variables include rainfall, beef, veal and weaner prices and the condition of the veld. This last key variable is monitored using the voluntary lick intake of the cattle and is modelled in terms of rainfall and stocking intensity. Particular attention is paid to the interrelationships between the key variables and to the distribution of modelling errors. The next stage of the study concerns the use of these models as a decision-support tool for extensive beef farmers. It is shown that Monte Carlo simulations and dynamic programming analyses can use these models to suggest how gross margins can be increased. At the same time these methods can be used to monitor the effect of management decisions on mean lick intake and, hence, the effect of these decisions on the condition of the veld. In particular the decisions of "what stocking intensity", "what cattle system", "when to sell" and "when to make a change" are addressed.

  18. How sedimentation affects rift segment interaction during oblique extension: a 4D analogue modelling study

    Science.gov (United States)

    Zwaan, Frank; Schreurs, Guido; Adam, Jürgen

    2017-04-01

    During the early stages of rifting, rift segments may form along non-continuous and/or offset pre-existing weaknesses. It is important to understand how these initial rift segments interact and connect to form a continuous rift system. Previous modelling of rift interaction structures has shown the dominant influence of oblique extension, promoting rift segment linkage (e.g. Zwaan et al., 2016) and eventual continent break-up (Brune et al., 2012). However, these studies did not incorporate sedimentation, which can have important implications for rift evolution (e.g. Bialas and Buck, 2009). Here we present a series of analogue model experiments investigating the influence of sedimentation on rift interaction structures under oblique extension conditions. Our set-up involves a base of compressed foam and plexiglass that forces distributed extension in the overlying analogue materials when the model sidewalls move apart. A sand layer simulates the brittle upper crust and a viscous sand/silicone mixture the ductile lower crust. One of the underlying base plates can move laterally, allowing oblique extension. Right-stepping offset and disconnected lines of silicone (seeds) on top of the basal viscous layer serve as inherited structures, since the strong sand cover there is locally thinner. We apply syn-rift sediments by filling in the developing rift and transfer zone basins with sand at fixed time steps. Models are run either with sedimentation or without to allow comparison. The first results suggest that the gross structures are similar with or without sedimentation. As seen by Zwaan et al. (2016), dextral oblique extension promotes rift linkage because rift propagation aligns itself perpendicular to the extension direction. This causes the rift segments to grow towards each other and to establish a continuous rift structure. However, the structures within the rift segments show quite different behaviour when sedimentation is applied. The extra sediment loading in the rift basin

  19. Higgs Boson Properties in the Standard Model and its Supersymmetric Extensions

    CERN Document Server

    Ellis, Jonathan Richard; Zwirner, F; Ellis, John; Ridolfi, Giovanni; Zwirner, Fabio

    2007-01-01

    We review the realization of the Brout-Englert-Higgs mechanism in the electroweak theory and describe the experimental and theoretical constraints on the mass of the single Higgs boson expected in the minimal Standard Model. We also discuss the couplings of this Higgs boson and its possible decay modes as functions of its unknown mass. We then review the structure of the Higgs sector in the minimal supersymmetric extension of the Standard Model (MSSM), noting the importance of loop corrections to the masses of its five physical Higgs bosons. Finally, we discuss some non-minimal models.

  20. Higgs boson properties in the Standard Model and its supersymmetric extensions

    International Nuclear Information System (INIS)

    Ellis, J.; Ridolfi, G.; Zwirner, F.

    2007-01-01

    We review the realization of the Brout-Englert-Higgs mechanism in the electroweak theory and describe the experimental and theoretical constraints on the mass of the single Higgs boson expected in the minimal Standard Model. We also discuss the couplings of this Higgs boson and its possible decay modes as functions of its unknown mass. We then review the structure of the Higgs sector in the minimal supersymmetric extension of the Standard Model (MSSM), noting the importance of loop corrections to the masses of its 5 physical Higgs bosons. Finally, we discuss some non-minimal models. (authors)

  1. Power Extension Package (PEP) system definition extension, orbital service module systems analysis study. Volume 7: PEP logistics and training plan requirements

    Science.gov (United States)

    1979-01-01

    Recommendations for logistics activities and logistics planning are presented based on the assumption that a system prime contractor will perform logistics functions to support all program hardware and will implement a logistics system to include the planning and provision of products and services to assure cost effective coverage of the following: maintainability; maintenance; spares and supply support; fuels; pressurants and fluids; operations and maintenance documentation training; preservation, packaging and packing; transportation and handling; storage; and logistics management information reporting. The training courses, manpower, materials, and training aids required will be identified and implemented in a training program.

  2. Extension Staffing Models to Serve 4-H Clientele in Changing Times

    Directory of Open Access Journals (Sweden)

    Donna R. Gillespie

    2010-03-01

    Full Text Available In response to budget cuts in 2002, 4-H staffing models were restructured. The response by University of Idaho Extension was intended to continue meeting the needs of Idaho’s citizens with fewer UI Extension faculty. This staffing reorganization led to the formation of the District III 4-H Team who united to bring stronger 4-H programs to south central Idaho and expand programs to underserved audiences. Information from surveys and interviews over the past seven years reflects the effectiveness, challenges and successes of the District III 4-H Team. In Making the Best Better: 4-H Staffing Patterns and Trends in the Largest Professional Network in the Nation (2007, author Kirk A. Astroth notes a nationwide change in 4-H leadership at the county level from 4-H faculty to program assistants or coordinators. The information gathered in our research may help other states determine staffing models to meet the needs of clientele in these changing times.

  3. An Extensible NetLogo Model for Visualizing Message Routing Protocols

    Science.gov (United States)

    2017-08-01

    Extensible NetLogo Model for Visualizing Message Routing Protocols, by Robert P Winkler and Somiya Metu, Computational and Information Sciences ... ranging from fields as diverse as games to the hard sciences to the social sciences to computer-generated art. NetLogo represents the world as a set of

  4. Development of an ArcGIS extension to model urban climate factors

    OpenAIRE

    Burghardt, René

    2015-01-01

    The possibility to develop automatically running models which can capture some of the most important factors driving the urban climate would be very useful for many planning aspects. With the help of these modulated climate data, the creation of the typically used “Urban Climate Maps” (UCM) will be accelerated and facilitated. This work describes the development of a special ArcGIS software extension, along with two support databases to achieve this functionality. At the present time, lacking...

  5. From requirements to Java in a snap model-driven requirements engineering in practice

    CERN Document Server

    Smialek, Michal

    2015-01-01

    This book provides a coherent methodology for Model-Driven Requirements Engineering which stresses the systematic treatment of requirements within the realm of modelling and model transformations. The underlying basic assumption is that detailed requirements models are used as first-class artefacts playing a direct role in constructing software. To this end, the book presents the Requirements Specification Language (RSL) that allows precision and formality, which eventually permits automation of the process of turning requirements into a working system by applying model transformations and co

  6. Modeling uncertainty in requirements engineering decision support

    Science.gov (United States)

    Feather, Martin S.; Maynard-Zhang, Pedrito; Kiper, James D.

    2005-01-01

    One inherent characteristic of requirements engineering is a lack of certainty during this early phase of a project. Nevertheless, decisions about requirements must be made in spite of this uncertainty. Here we describe the context in which we are exploring this, and some initial work to support elicitation of uncertain requirements and to deal with the combination of such information from multiple stakeholders.

  7. Extracting the properties of dark matter particles in minimal extensions of the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Maira Dutra Vasconcelos dos; Santos, Antonio Carlos Oliveira; Silva, Paulo Sergio Rodrigues da; Pires, Carlos Antonio de Sousa; Siqueira, Clarissa [Universidade Federal da Paraiba (UFPB), Joao Pessoa, PB (Brazil); Queiroz, Farinaldo da Silva [University of California (United States)

    2013-07-01

    Full text: Nature has provided striking evidence for physics beyond the Standard Model, namely dark matter. Observations coming from a variety of sources point to the existence of a non-baryonic matter that accounts for roughly 27% of the total abundance of the universe and is composed of neutral, massive, stable and weakly interacting particles. Since the Standard Model has no candidate that fulfills all these properties, we must extend it. There are many interesting proposals in the literature that have a good dark matter candidate. Essentially, all of them invoke an extended scalar or gauge sector. Here we aim to extract information about the underlying beyond-Standard-Model theory able to address dark matter and many other theoretical puzzles through minimal extensions of the standard model. The minimality perspective is a worthwhile approach because we can focus on the dark side of many particle physics models. We will carry on our investigation in a pedagogic way. Firstly, we will add a neutral fermion, which is our dark matter candidate, and one neutral scalar, both being singlets under the Standard Model gauge group. In this model we compute the abundance of our dark matter candidate and the scattering cross sections off nuclei in order to compare our results with current direct detection data. Secondly, we add a charged scalar field, which is predicted in many standard model extensions, to the first model and investigate the role of this scalar in our results. Lastly, we add a Z' boson to the latter model, and study how our results are affected, with the purpose of, further on, exploring the complementarity between direct detection and collider physics regarding the search for this boson. Thus, we will be able to extract precise information about the beyond-Standard-Model theory and the properties of the dark matter particles. (author)
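
    For orientation, the relic-abundance computation referred to above rests on the standard freeze-out Boltzmann equation; the sketch below is the generic, model-independent form (the cross section and the order-of-magnitude relic estimate are standard textbook quantities, not values from this work):

    ```latex
    % Freeze-out of a weakly interacting dark matter candidate with number density n
    % H is the Hubble rate and n_eq the equilibrium number density
    \frac{dn}{dt} + 3 H n = -\langle \sigma v \rangle \left( n^2 - n_{\mathrm{eq}}^2 \right),
    \qquad
    \Omega_{\mathrm{DM}} h^2 \sim \frac{3 \times 10^{-27}\ \mathrm{cm}^3\,\mathrm{s}^{-1}}{\langle \sigma v \rangle}.
    ```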

  8. A goal-oriented requirements modelling language for enterprise architecture

    NARCIS (Netherlands)

    Quartel, Dick; Engelsman, W.; Jonkers, Henk; van Sinderen, Marten J.

    2009-01-01

    Methods for enterprise architecture, such as TOGAF, acknowledge the importance of requirements engineering in the development of enterprise architectures. Modelling support is needed to specify, document, communicate and reason about goals and requirements. Current modelling techniques for

  9. Complex PT-symmetric extensions of the nonlinear ultra-short light pulse model

    Science.gov (United States)

    Yan, Zhenya

    2012-11-01

    The short pulse equation u_{xt} = u + \frac{1}{2}(u^2 u_x)_x is PT symmetric; it arises in nonlinear optics for the ultra-short pulse case. We present a family of new complex PT-symmetric extensions of the short pulse equation, i[(iu_x)^{\sigma}]_t = au + bu^m + ic[u^n (iu_x)^{\epsilon}]_x (\sigma, \epsilon, a, b, c, m, n \in \mathbb{R}), based on the complex PT-symmetric extension principle. Some properties of these equations with some chosen parameters are studied, including the Hamiltonian structures and exact solutions such as solitary wave solutions, doubly periodic wave solutions and compacton solutions. Our results may be useful to understand complex PT-symmetric nonlinear physical models. This article is part of a special issue of Journal of Physics A: Mathematical and Theoretical devoted to ‘Quantum physics with non-Hermitian operators’.

  10. Comparison and Extension of Existing 3D Propagation Models with Real-World Effects Based on Ray-tracing

    DEFF Research Database (Denmark)

    Kifle, Dereje W.; Gimenez, Lucas Chavarria; Wegmann, Bernhard

    2014-01-01

    The next generation of cellular network deployment is heterogeneous and temporally changing in order to follow coverage and capacity needs. Active Antenna Systems allow fast deployment changes through cell shaping and tilt adaptation, which have to be controlled in a self-organized manner. However, such automated and flexible network operation requires a Self Organizing Network algorithm based on network performance parameters partly derived from radio measurements. Appropriate radio propagation models are needed not only for network planning tools but also for simulative lab tests of the developed Self Organizing Network algorithm controlling the flexible deployment changes enabled by Active Antenna Systems. In this paper, an extension of the existing 3D propagation model is proposed in order to incorporate propagation condition variation effects, not considered so far, by changing...

  11. Progress Report 2008: A Scalable and Extensible Earth System Model for Climate Change Science

    Energy Technology Data Exchange (ETDEWEB)

    Drake, John B [ORNL; Worley, Patrick H [ORNL; Hoffman, Forrest M [ORNL; Jones, Phil [Los Alamos National Laboratory (LANL)

    2009-01-01

    This project employs multi-disciplinary teams to accelerate development of the Community Climate System Model (CCSM), based at the National Center for Atmospheric Research (NCAR). A consortium of eight Department of Energy (DOE) National Laboratories collaborates with NCAR and the NASA Global Modeling and Assimilation Office (GMAO). The laboratories are Argonne (ANL), Brookhaven (BNL), Los Alamos (LANL), Lawrence Berkeley (LBNL), Lawrence Livermore (LLNL), Oak Ridge (ORNL), Pacific Northwest (PNNL) and Sandia (SNL). The work plan focuses on scalability for petascale computation and extensibility to a more comprehensive earth system model. Our stated goal is to support the DOE mission in climate change research by helping ... to determine the range of possible climate changes over the 21st century and beyond through simulations using a more accurate climate system model that includes the full range of human and natural climate feedbacks with increased realism and spatial resolution.

  12. Relational Data Modelling of Textual Corpora: The Skaldic Project and its Extensions

    DEFF Research Database (Denmark)

    Wills, Tarrin Jon

    2015-01-01

    Skaldic poetry is a highly complex textual phenomenon both in terms of the intricacy of the poetry and its contextual environment. Extensible Markup Language (XML) applications such as that of the Text Encoding Initiative provide a means of semantic representation of some of these complexities. XML, however, has limitations in representing semantic relationships that do not conform to the tree model. This article presents the relational data model as a way of representing the structure of skaldic texts and their contextual environment. The relational data model raises both problems and possibilities for this type of project. The main problem addressed here is the representation of the syntagmatic structures of texts in a data model that is not intrinsically ordered. The advantages are also explored, including networked data editing and management, quantitative linguistic analysis, dynamic representation...

  13. Bivariate Extension of the Quadrature Method of Moments for Modeling Simultaneous Coagulation and Sintering of Particle Populations.

    Science.gov (United States)

    Wright, Douglas L.; McGraw, Robert; Rosner, Daniel E.

    2001-04-15

    We extend the application of moment methods to multivariate suspended particle population problems: those for which size alone is insufficient to specify the state of a particle in the population. Specifically, a bivariate extension of the quadrature method of moments (QMOM) (R. McGraw, Aerosol Sci. Technol. 27, 255 (1997)) is presented for efficiently modeling the dynamics of a population of inorganic nanoparticles undergoing simultaneous coagulation and particle sintering. Continuum regime calculations are presented for the Koch-Friedlander-Tandon-Rosner model, which includes coagulation by Brownian diffusion (evaluated for particle fractal dimensions, D(f), in the range 1.8-3) and simultaneous sintering of the resulting aggregates (P. Tandon and D. E. Rosner, J. Colloid Interface Sci. 213, 273 (1999)). For evaluation purposes, and to demonstrate the computational efficiency of the bivariate QMOM, benchmark calculations are carried out using a high-resolution discrete method to evolve the particle distribution function n(nu, a) for short to intermediate times (where nu and a are particle volume and surface area, respectively). Time evolution of a selected set of 36 low-order mixed moments is obtained by integration of the full bivariate distribution and compared with the corresponding moments obtained directly using two different extensions of the QMOM. With the more extensive treatment, errors of less than 1% are obtained over substantial aerosol evolution, while requiring only a few minutes (rather than days) of CPU time. Longer time QMOM simulations lend support to the earlier finding of a self-preserving limit for the dimensionless joint (nu, a) particle distribution function under simultaneous coagulation and sintering (Tandon and Rosner, 1999; D. E. Rosner and S. Yu, AIChE J., 47 (2001)). We demonstrate that, even in the bivariate case, it is possible to use the QMOM to rapidly model the approach to asymptotic behavior, allowing an immediate assessment of
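
    For readers unfamiliar with the method, the bivariate quadrature closure referred to above takes the standard QMOM form; the sketch below uses a generic N-node quadrature, not the specific 36-moment set tracked in the study:

    ```latex
    % Mixed moments of the bivariate number density n(\nu, a) (\nu: volume, a: surface area)
    % and their N-node quadrature approximation with weights w_i and abscissas (\nu_i, a_i)
    M_{k,l} = \int_0^{\infty}\!\!\int_0^{\infty} \nu^{k} a^{l}\, n(\nu, a)\, d\nu\, da
    \;\approx\; \sum_{i=1}^{N} w_i\, \nu_i^{k}\, a_i^{l}.
    ```

    The weights and abscissas are updated so that a selected set of low-order mixed moments is reproduced exactly, which is what allows the dynamics to be tracked without resolving the full distribution.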

  14. Model 9975 Life Extension Test Package 3 - Interim Report - January 2017

    Energy Technology Data Exchange (ETDEWEB)

    Daugherty, W. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-01-31

    Life extension package LE3 (9975-03203) has been instrumented and subjected to an elevated temperature environment for approximately 8 years. During this time, the cane fiberboard has been maintained at a maximum temperature of ~160 - 165 °F, which was established by a combination of internal (19 watts) and external heat sources. Several tests and parameters were used to characterize the package components. Results from these tests generally indicate agreement between this full-scale shipping package and small-scale laboratory tests on fiberboard samples, including the degradation models based on the laboratory tests. These areas of agreement include the rate of change of fiberboard weight, dimensions and density, and change in fiberboard thermal conductivity. Corrosion of the lead shield occurred at a high rate during the first several weeks of aging, but dropped significantly after most of the moisture in the fiberboard migrated away from the lead shield. Dimensional measurements of the lead shield indicate that no significant creep deformation has occurred. This is consistent with literature data that predict a very small creep deformation for the time at temperature experienced by this package. The SCV O-rings were verified to remain leak-tight after ~5 years aging at an average temperature of ~170 °F. This package provides an example of the extent to which moisture within a typical fiberboard assembly can redistribute in the presence of a temperature gradient such as might be created by a 19 watt internal heat load. The majority of water within the fiberboard migrated to the bottom layers of fiberboard, with approximately 2 kg of water (2 liters) eventually escaping from the package. Two conditions have developed that are not consistent with package certification requirements. The axial gap at the top of the package increased to a maximum value of 1.549 inches, exceeding the 1 inch criterion. In addition, staining and/or corrosion have formed in a few spots

  15. REQUIREMENTS PARTICLE NETWORKS: AN APPROACH TO FORMAL SOFTWARE FUNCTIONAL REQUIREMENTS MODELLING

    OpenAIRE

    Wiwat Vatanawood; Wanchai Rivepiboon

    2001-01-01

    In this paper, an approach to software functional requirements modelling using requirements particle networks is presented. In our approach, a set of requirements particles is defined as an essential tool to construct a visual model of the software functional requirements specification during the software analysis phase, and the relevant formal specification is systematically generated without requiring experience in writing formal specifications. A number of algorithms are presented to perform these for...

  16. Building traceable Event-B models from requirements

    OpenAIRE

    Alkhammash, Eman; Butler, Michael; Fathabadi, Asieh Salehi; Cîrstea, Corina

    2015-01-01

    Abstract Bridging the gap between informal requirements and formal specifications is a key challenge in systems engineering. Constructing appropriate abstractions in formal models requires skill and managing the complexity of the relationships between requirements and formal models can be difficult. In this paper we present an approach that aims to address the twin challenges of finding appropriate abstractions and managing traceability between requirements and models. Our approach is based o...

  17. The reading efficiency model: an extension of the componential model of reading.

    Science.gov (United States)

    Høien-Tengesdal, Ingjerd; Høien, Torleiv

    2012-01-01

    The purpose of the present study was twofold: First, the authors investigated whether an extended version of the componential model of reading (CMR; Model 2), including decoding rate and oral vocabulary comprehension, accounted for more of the variance in reading comprehension than the commonly used measures of the cognitive factors in the CMR. Second, the authors investigated the fit of a new model, titled the reading efficiency model (REM), which deviates from earlier models in how reading is defined. In the study, 780 Norwegian students from Grades 6 and 10 were recruited. Here, hierarchical regression analyses showed that the extended model did not account for more of the variance in reading comprehension than the traditional CMR model (Model 1). In the second part of the study the authors used structural equation modeling (SEM) to explore the REM. The results showed that the REM explained an overall larger amount of variance in reading ability, compared to Model 1 and Model 2. This is probably due to the new definition of reading applied in the REM. The authors believe their model more fully reflects students' differentiated reading skills by including reading fluency in the definition of reading.

  18. UNIRAM modeling for increased nuclear-plant availability and life extension

    International Nuclear Information System (INIS)

    O'Mara, R.L.

    1988-01-01

    At the start of a nuclear-power plant's design life of 40 years, most parts of the plant are effectively brand new, but some subcomponents have already experienced significant wear and aging effects. In short, the spectrum of where each component is in its life cycle at any time is quite broad, and this makes the prediction of the future availability of the plant a complex issue. Predictive models that account for the differential effects of aging, wear, and functional failure on the plant are desirable as a means to represent this complex behavior. This paper addresses the task of using a computer model to account for the relationships between components, systems, and plant availability, in the context of current and future needs, including eventual life extension. The computer model is based on the Electric Power Research Institute's (EPRI) code, UNIRAM, which has a large and growing user base among utilities

  19. Extension of the PMV model to non-air-conditioned building in warm climates

    DEFF Research Database (Denmark)

    Fanger, Povl Ole; Toftum, Jørn

    2002-01-01

    The PMV model agrees well with high-quality field studies in buildings with HVAC systems, situated in cold, temperate and warm climates, studied during both summer and winter. In non-air-conditioned buildings in warm climates, occupants may sense the warmth as being less severe than the PMV predicts. The main reason is low expectations, but a metabolic rate that is estimated too high can also contribute to explaining the difference. An extension of the PMV model that includes an expectancy factor is introduced for use in non-air-conditioned buildings in warm climates. The extended PMV model agrees well with quality field studies in non-air-conditioned buildings on three continents.
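
    The extension referred to above rescales the conventional PMV with an expectancy factor; schematically (the numerical range of e quoted here is the one commonly associated with non-air-conditioned buildings in warm climates and should be read as indicative, not as this paper's exact calibration):

    ```latex
    % Extended PMV with expectancy factor e
    % e = 1 for air-conditioned buildings; roughly 0.5 <= e < 1 for non-air-conditioned
    % buildings in warm climates, depending on how common air conditioning is in the region
    PMV_{e} = e \cdot PMV .
    ```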

  20. Flaxion: a minimal extension to solve puzzles in the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Ema, Yohei [Department of Physics,The University of Tokyo, Tokyo 133-0033 (Japan); Hamaguchi, Koichi; Moroi, Takeo; Nakayama, Kazunori [Department of Physics,The University of Tokyo, Tokyo 133-0033 (Japan); Kavli Institute for the Physics and Mathematics of the Universe (Kavli IPMU),University of Tokyo, Kashiwa 277-8583 (Japan)

    2017-01-23

    We propose a minimal extension of the standard model which includes only one additional complex scalar field, the flavon, with a flavor-dependent global U(1) symmetry. It not only explains the hierarchical flavor structure in the quark and lepton sectors (including the neutrino sector), but also solves the strong CP problem by identifying the CP-odd component of the flavon as the QCD axion, which we call the flaxion. Furthermore, the flaxion model solves the cosmological puzzles of the standard model, i.e., the origin of dark matter, the baryon asymmetry of the universe, and inflation. We show that the radial component of the flavon can play the role of the inflaton without isocurvature or domain wall problems. The dark matter abundance can be explained by flaxion coherent oscillation, while the baryon asymmetry of the universe is generated through leptogenesis.

  1. Limit sets for natural extensions of Schelling’s segregation model

    Science.gov (United States)

    Singh, Abhinav; Vainchtein, Dmitri; Weiss, Howard

    2011-07-01

    Thomas Schelling developed an influential demographic model that illustrated how, even with relatively mild assumptions on each individual's nearest-neighbor preferences, an integrated city would likely unravel into a segregated city, even if all individuals prefer integration. Individuals in Schelling's model cities are divided into two groups of equal number, and each individual is "happy" or "unhappy" when the number of similar neighbors crosses a simple threshold. In this manuscript we consider natural extensions of Schelling's original model that allow the two groups to have different sizes and allow different notions of happiness of an individual. We observe that differences in the aggregation patterns of majority and minority groups are highly sensitive to the happiness threshold; for a low threshold, the differences are small, and when the threshold is raised, striking new patterns emerge. We also observe that when individuals strongly prefer to live in integrated neighborhoods, the final states exhibit a new tessellated-like structure.
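
    A minimal sketch of the kind of agent-based dynamics described above, with unequal group sizes and a tunable happiness threshold, is given below; the grid size, densities, periodic boundary and random-relocation rule are illustrative assumptions, not the authors' exact specification.

    ```python
    import random

    # Illustrative parameters (not the authors' exact settings)
    SIZE = 50                      # grid is SIZE x SIZE with periodic boundaries
    FRAC_A, FRAC_B = 0.45, 0.35    # unequal group sizes; remaining cells stay vacant
    THRESHOLD = 0.5                # "happy" if >= THRESHOLD of occupied neighbours are similar
    STEPS = 200_000                # number of proposed relocations

    random.seed(0)
    cells = ['A'] * int(SIZE * SIZE * FRAC_A) + ['B'] * int(SIZE * SIZE * FRAC_B)
    cells += [None] * (SIZE * SIZE - len(cells))
    random.shuffle(cells)
    grid = [cells[i * SIZE:(i + 1) * SIZE] for i in range(SIZE)]

    def neighbours(r, c):
        """Yield the 8 surrounding cells, wrapping around the grid edges."""
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if dr or dc:
                    yield grid[(r + dr) % SIZE][(c + dc) % SIZE]

    def happy(r, c):
        """Happy if the similar fraction of occupied neighbours meets the threshold."""
        me = grid[r][c]
        occupied = [n for n in neighbours(r, c) if n is not None]
        if not occupied:
            return True
        return sum(n == me for n in occupied) / len(occupied) >= THRESHOLD

    for _ in range(STEPS):
        r, c = random.randrange(SIZE), random.randrange(SIZE)
        if grid[r][c] is None or happy(r, c):
            continue
        vr, vc = random.randrange(SIZE), random.randrange(SIZE)
        if grid[vr][vc] is None:    # unhappy agent moves to a random vacant cell
            grid[vr][vc], grid[r][c] = grid[r][c], None

    unhappy = sum(1 for r in range(SIZE) for c in range(SIZE)
                  if grid[r][c] is not None and not happy(r, c))
    print(f"unhappy agents after {STEPS} proposed moves: {unhappy}")
    ```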

  2. Constraints on abelian extensions of the Standard Model from two-loop vacuum stability and U(1)_{B-L}

    Science.gov (United States)

    Corianò, Claudio; Rose, Luigi Delle; Marzo, Carlo

    2016-02-01

    We present a renormalization group study of the scalar potential in a minimal U(1)_{B-L} extension of the Standard Model involving one extra heavier Higgs and three heavy right-handed neutrinos with family universal B-L charge assignments. We implement a type-I seesaw for the masses of the light neutrinos of the Standard Model. In particular, compared to a previous study, we perform a two-loop extension of the evolution, showing that two-loop effects are essential for the study of the stability of the scalar potential up to the Planck scale. The analysis includes the contribution of the kinetic mixing between the two abelian gauge groups, which is radiatively generated by the evolution, and the one-loop matching conditions at the electroweak scale. By requiring the stability of the potential up to the Planck mass, significant constraints on the masses of the heavy neutrinos, on the gauge couplings and the mixing in the Higgs sector are identified.
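
    For orientation, the tree-level boundedness-from-below conditions that such an analysis enforces along the renormalisation-group flow take the standard copositivity form for a doublet-plus-singlet potential; this is a generic sketch, not the paper's full two-loop result:

    ```latex
    % Quartic part of a generic SM-doublet (H) plus extra scalar (\chi) potential
    V \supset \lambda_1 (H^\dagger H)^2 + \lambda_2 (\chi^\dagger \chi)^2
            + \lambda_3 (H^\dagger H)(\chi^\dagger \chi),
    % Tree-level stability conditions, imposed at every scale \mu up to the Planck mass
    \lambda_1(\mu) > 0, \qquad \lambda_2(\mu) > 0, \qquad
    \lambda_3(\mu) + 2\sqrt{\lambda_1(\mu)\,\lambda_2(\mu)} > 0 .
    ```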

  3. Viscous relaxation as a prerequisite for tectonic resurfacing on Ganymede: Insights from numerical models of lithospheric extension

    Science.gov (United States)

    Bland, Michael T.; McKinnon, William B.

    2018-01-01

    Ganymede’s bright terrain formed during a near-global resurfacing event (or events) that produced both heavily tectonized and relatively smooth terrains. The mechanism(s) by which resurfacing occurred on Ganymede (e.g., cryovolcanic or tectonic), and the relationship between the older, dark and the younger, bright terrain are fundamental to understanding the geological evolution of the satellite. Using a two-dimensional numerical model of lithospheric extension that has previously been used to successfully simulate surface deformation consistent with grooved terrain morphologies, we investigate whether large-amplitude preexisting topography can be resurfaced (erased) by extension (i.e., tectonic resurfacing). Using synthetically produced initial topography, we show that when the total relief of the initial topography is larger than 25–50 m, periodic groove-like structures fail to form. Instead, extension is localized in a few individual, isolated troughs. These results pose a challenge to the tectonic resurfacing hypothesis. We further investigate the effects of preexisting topography by performing suites of simulations initialized with topography derived from digital terrain models of Ganymede’s surface. These include dark terrain, fresh (relatively deep) impact craters, smooth bright terrain, and a viscously relaxed impact crater. The simulations using dark terrain and fresh impact craters are consistent with our simulations using synthetic topography: periodic groove-like deformation fails to form. In contrast, when simulations were initialized with bright smooth terrain topography, groove-like deformation results from a wide variety of heat flow and surface temperature conditions. Similarly, when a viscously relaxed impact crater was used, groove-like structures were able to form during extension. These results suggest that tectonic resurfacing may require that the amplitude of the initial topography be reduced before extension begins. We emphasize that

  4. Genome-wide selection by mixed model ridge regression and extensions based on geostatistical models.

    Science.gov (United States)

    Schulz-Streeck, Torben; Piepho, Hans-Peter

    2010-03-31

    The success of genome-wide selection (GS) approaches will depend crucially on the availability of efficient and easy-to-use computational tools. Therefore, approaches that can be implemented using mixed models hold particular promise and deserve detailed study. A particular class of mixed models suitable for GS is given by geostatistical mixed models, when genetic distance is treated analogously to spatial distance in geostatistics. We consider various spatial mixed models for use in GS. The analyses presented for the QTL-MAS 2009 dataset pay particular attention to the modelling of residual errors as well as of polygenetic effects. It is shown that geostatistical models are viable alternatives to ridge regression, one of the common approaches to GS. Correlations between genome-wide estimated breeding values and true breeding values were between 0.879 and 0.889. In the example considered, we did not find a large effect of the residual error variance modelling, largely because error variances were very small. A variance components model reflecting the pedigree of the crosses did not provide an improved fit. We conclude that geostatistical models deserve further study as a tool for GS that is easily implemented in a mixed model package.
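
    As a point of reference for the ridge-regression benchmark mentioned above, marker effects can be obtained in closed form; the sketch below runs on simulated data, and the dimensions and shrinkage parameter are arbitrary illustrative choices rather than values from the QTL-MAS 2009 analysis.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_ind, n_markers = 200, 1000                       # illustrative dimensions
    X = rng.integers(0, 3, size=(n_ind, n_markers)).astype(float)   # 0/1/2 genotype codes
    X -= X.mean(axis=0)                                # centre marker covariates
    true_effects = rng.normal(0.0, 0.05, n_markers)
    y = X @ true_effects + rng.normal(0.0, 1.0, n_ind)

    lam = 100.0   # shrinkage parameter (error-to-marker variance ratio in the mixed-model view)
    # Ridge / RR-BLUP solution: beta_hat = (X'X + lam * I)^(-1) X'y
    beta_hat = np.linalg.solve(X.T @ X + lam * np.eye(n_markers), X.T @ y)
    gebv = X @ beta_hat                                # genome-wide estimated breeding values

    print("correlation with true breeding values:",
          round(float(np.corrcoef(gebv, X @ true_effects)[0, 1]), 3))
    ```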

  5. Mixing Formal and Informal Model Elements for Tracing Requirements

    DEFF Research Database (Denmark)

    Jastram, Michael; Hallerstede, Stefan; Ladenberger, Lukas

    2011-01-01

    Tracing between informal requirements and formal models is challenging. A method for such tracing should make it possible to deal efficiently with changes to both the requirements and the model. A particular challenge is posed by the persisting interplay of formal and informal elements. In this paper, we describe an incremental approach to requirements validation and systems modelling. Formal modelling facilitates a high degree of automation: it serves for validation and traceability. The foundation of our approach is requirements structured according to the WRSPM reference model. We provide a system for traceability with a state-based formal method that supports refinement. We do not require all specification elements to be modelled formally and support incremental incorporation of new specification elements into the formal model. Refinement is used to deal with larger amounts of requirements...

  6. Models and data requirements for human reliability analysis

    International Nuclear Information System (INIS)

    1989-03-01

    It has been widely recognised for many years that the safety of nuclear power generation depends heavily on the human factors related to plant operation. This has been confirmed by the accidents at Three Mile Island and Chernobyl. Both these cases revealed how human actions can defeat engineered safeguards and the need for special operator training to cover the possibility of unexpected plant conditions. The importance of the human factor also stands out in the analysis of abnormal events and insights from probabilistic safety assessments (PSAs), which reveal a large proportion of cases having their origin in faulty operator performance. A consultants' meeting, organized jointly by the International Atomic Energy Agency (IAEA) and the International Institute for Applied Systems Analysis (IIASA), was held at IIASA in Laxenburg, Austria, December 7-11, 1987, with the aim of reviewing existing models used in Probabilistic Safety Assessment (PSA) for Human Reliability Analysis (HRA) and of identifying the data required. The report collects both the contributions offered by the members of the Expert Task Force and the findings of the extensive discussions that took place during the meeting. Refs, figs and tabs

  7. 76 FR 77850 - Permit-Required Confined Spaces; Extension of the Office of Management and Budget's (OMB...

    Science.gov (United States)

    2011-12-14

    ... section. Section 1910.146(k)(2)(iii) requires that the employer train affected employees in basic first... rescue team or service who holds a current certification in first aid and CPR is available. Section 1910...) requires that the employer certify that the training required by paragraphs (g)(1) through (g)(3) has been...

  8. An Idealized Modeling Study of the Gulf-Stream and Kuroshio Extension Systems

    Science.gov (United States)

    Primeau, F. W.; Newman, D.

    2004-05-01

    A shallow-water model is used to study the dynamics of the mid-latitude wind-driven ocean circulation. A bifurcation analysis of the steady-state equilibrium solutions is presented. The analysis is in terms of several control parameters: some that control the wind-stress pattern and others that control the dissipation parameterization. Of the parameters that control the wind stress, one controls the tilt of the zero-curl line, another controls the relative intensity of the vorticity input in the subtropical and sub-polar gyres, and a third controls the overall intensity of the wind stress. We identify parameter ranges for which multiple equilibria with elongated and contracted western boundary current extensions exist. We also present time-dependent solutions with low-frequency variability associated with transitions between the elongated and contracted modes of circulation. The modeled variability is conjectured to correspond to similar elongation-contraction patterns of variability observed in the Kuroshio and Gulf-Stream extension systems from satellite observations.
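
    The model class referred to above is the wind-driven shallow-water system; a generic reduced-gravity form with wind forcing and simple dissipation is sketched below from standard theory, and is not the authors' exact configuration:

    ```latex
    % Reduced-gravity shallow-water equations for the upper-layer velocity u and thickness h
    % f: Coriolis parameter, g': reduced gravity, \tau: wind stress, r: linear drag, A_H: lateral viscosity
    \frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}
      + f\,\hat{\mathbf{z}} \times \mathbf{u}
      = -\,g' \nabla h + \frac{\boldsymbol{\tau}}{\rho_0 h} - r\,\mathbf{u} + A_H \nabla^2 \mathbf{u},
    \qquad
    \frac{\partial h}{\partial t} + \nabla\cdot(h\,\mathbf{u}) = 0 .
    ```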

  9. Spreadsheet Decision Support Model for Training Exercise Material Requirements Planning

    National Research Council Canada - National Science Library

    Tringali, Arthur

    1997-01-01

    ... associated with military training exercises. The model combines the business practice of Material Requirements Planning and the commercial spreadsheet software capabilities of Lotus 1-2-3 to calculate the requirements for food, consumable...

  10. Extending enterprise architecture modelling with business goals and requirements

    NARCIS (Netherlands)

    Engelsman, W.; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten J.

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling

  11. Plant condition assessments as a requirement before major investment in life extension for a CANDU nuclear power plant

    International Nuclear Information System (INIS)

    Aubray, Marc

    2002-01-01

    Full text: Since extending the life of a CANDU-6 reactor beyond its original design life requires the replacement of reactor components (380 pressure and calandria tubes), a major investment will have to be made. After a preliminary technical and economic feasibility study, Hydro-Quebec, owner of the Gentilly-2 NPP, has decided to perform a more detailed assessment to: 1. Get assurance that it is technically and economically viable to extend Gentilly-2 for another 20 years beyond the original design life; 2. Identify the detailed work to be done during the refurbishment period planned in 2008-2009; 3. Define the overall cost and the general schedule of the refurbishment phase; 4. Ensure an adequate licensing strategy to restart after refurbishment; 5. Complete all the Environmental Impact Studies required to obtain the government authorizations. The business case to support the refurbishment of Gentilly-2 has to take into consideration the reactor core components, which will be the major work to be completed during refurbishment. In summary, the following main components will have to be changed or refreshed: the pressure and calandria tubes and the feeders (partial replacement only) (ageing mechanisms); the control computers (obsolescence); the condenser tubes (tube plugging); the turbine control and electric governor (obsolescence). An extensive campaign is under way to assess the 'health' of the station systems, structures and components (SSC). Two processes have been used for this assessment: Plant Life Management (PLIM) Studies for approximately 10 critical SSC or families of SSC; Condition Assessment Studies for other SSC with a lower impact on plant production or safety. The PLIM Studies are done on SSCs that were judged critical because they are not replaceable (Reactor Building, Calandria), or because their failure could have a significant impact on safety or production (electrical motors, major pumps, heat exchangers and pressure

  12. Modeling quorum sensing trade-offs between bacterial cell density and system extension from open boundaries

    Science.gov (United States)

    Marenda, Mattia; Zanardo, Marina; Trovato, Antonio; Seno, Flavio; Squartini, Andrea

    2016-12-01

    Bacterial communities undergo collective behavioural switches upon producing and sensing diffusible signal molecules; a mechanism referred to as Quorum Sensing (QS). For example, biofilm organic matrices are built concertedly by bacteria in several environments. The scope of QS in bacterial ecology has been debated for over 20 years. Different perspectives counterpose the role of a density reporter for populations to that of a local environment diffusivity probe for individual cells. Here we devise a model system where tubes of different heights contain matrix-embedded producers and sensors. These tubes allow non-limiting signal diffusion from one open end, thereby showing that population spatial extension away from an open boundary can be a main critical factor in QS. Experimental data, successfully recapitulated by a comprehensive mathematical model, demonstrate how tube height can overtake the role of producer density in triggering sensor activation. The biotic degradation of the signal is found to play a major role and to be species-specific and entirely feedback-independent.
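
    A one-dimensional caricature of the signal balance in such a tube, with production along the column, biotic degradation, and loss through the single open end, can be written as the schematic reaction-diffusion sketch below; it is not the authors' full model, and the symbols are introduced here for illustration only:

    ```latex
    % Signal concentration c(x,t) in a tube of height L, open only at x = 0
    % D: diffusivity, p: per-cell production rate, \rho(x): producer density, k: degradation rate
    \frac{\partial c}{\partial t} = D\,\frac{\partial^2 c}{\partial x^2} + p\,\rho(x) - k\,c,
    \qquad c(0, t) = 0, \qquad \left.\frac{\partial c}{\partial x}\right|_{x = L} = 0 .
    ```

    In this caricature, taller tubes (larger L) retain more signal for the same producer density, which is the qualitative trade-off the experiments probe.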

  13. Search for neutral Higgs bosons in the minimal supersymmetric extension of the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Akers, R.; Alexander, G.; Allison, J.; Anderson, K.J.; Arcelli, S.; Asai, S.; Astbury, A.; Axen, D.; Azuelos, G.; Ball, A.H.; Barberio, E.; Barlow, R.J.; Bartoldus, R.; Batley, J.R.; Beaudoin, G.; Beck, A.; Beck, G.A.; Becker, J.; Beeston, C.; Behnke, T.; Bell, K.W.; Bella, G.; Bentkowski, P.; Bentvelsen, S.; Berlich, P.; Bethke, S.; Biebel, O.; Bloodworth, I.J.; Bock, P.; Bosch, H.M.; Boutemeur, M.; Braibant, S.; Bright-Thomas, P.; Brown, R.M.; Buijs, A.; Burckhart, H.J.; Burgard, C.; Capiluppi, P.; Carnegie, R.K.; Carter, A.A.; Carter, J.R.; Chang, C.Y.; Charlesworth, C.; Charlton, D.G.; Chu, S.L.; Clarke, P.E.L.; Clayton, J.C.; Clowes, S.G.; Cohen, I.; Conboy, J.E.; Coupland, M.; Cuffiani, M.; Dado, S.; Dallapiccola, C.; Darling, C.; Dallavalle, G.M.; Jong, S. de; Deng, H.; Dittmar, M.; Dixit, M.S.; Couto e Silva, E. do; Duchovni, E.; Duboscq, J.E.; Duckeck, G.; Duerdoth, I.P.; Dunwoody, U.C.; Elcombe, P.A.; Estabrooks, P.G.; Etzion, E.; Evans, H.G.; Fabbri, F.; Fabbro, B.; Fanti, M; OPAL Collaboration

    1994-09-01

    A search for the neutral Higgs bosons h^0 and A^0, predicted by the Minimal Supersymmetric Extension of the Standard Model (MSSM), has been performed by the OPAL Collaboration at LEP. The analysis was based on approximately 75 pb^-1 of data taken at centre-of-mass energies in the vicinity of the Z^0 resonance. No Higgs boson signals have been detected. Using, in addition, an upper limit on the contribution of non-Standard Model processes to the Z^0 boson width, almost the entire MSSM parameter space that can be reached at present LEP energies has been excluded. In particular, at the 95% confidence level, our results imply that m_h^0
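
    For orientation, the tree-level MSSM neutral Higgs sector that such searches constrain obeys the well-known mass relations below (standard textbook results, quoted only to set the context; radiative corrections raise m_h substantially above the tree-level bound):

    ```latex
    % Tree-level MSSM neutral Higgs masses in terms of m_A, m_Z and tan(beta)
    m_{h,H}^2 = \tfrac{1}{2}\left( m_A^2 + m_Z^2
      \mp \sqrt{\left(m_A^2 + m_Z^2\right)^2 - 4\, m_A^2\, m_Z^2 \cos^2 2\beta} \right),
    \qquad
    m_h \le m_Z \left| \cos 2\beta \right| \;\; \text{(tree level)} .
    ```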

  14. Simple thermodynamic model of the extension of solid solution of Cu-Mo alloys processed by mechanical alloying

    Energy Technology Data Exchange (ETDEWEB)

    Aguilar, C., E-mail: claudio.aguilar@usm.cl [Departamento de Ingenieria Metalurgica y de Materiales, Universidad Tecnica Federico Santa Maria, Avenida Espana 1680, Valparaiso (Chile); Guzman, D. [Departamento de Metalurgia, Facultad de Ingenieria, Universidad de Atacama, Av. Copayapu 485, Copiapo (Chile); Rojas, P.A. [Escuela de Ingenieria Mecanica, Facultad de Ingenieria, Pontificia Universidad Catolica de Valparaiso, Av. Los Carrera 01567, Quilpue (Chile); Ordonez, Stella [Departamento de Ingenieria Metalurgica, Facultad de Ingenieria, Universidad de Santiago de Chile, Av. L. Bernardo O' Higgins 3363, Santiago (Chile); Rios, R. [Instituto de Materiales y Procesos Termomecanicos, Facultad de Ciencias de la Ingenieria, Universidad Austral de Chile, General Lagos 2086, Valdivia (Chile)

    2011-08-15

    Highlights: → Extension of solid solution in Cu-Mo systems achieved by mechanical alloying. → Simple thermodynamic model to explain extension of solid solution of Mo in Cu. → Model gives results that are consistent with the solubility limit extension reported in other works. - Abstract: The objective of this work is to propose a simple thermodynamic model to explain the increase in the solubility limit of powders of the Cu-Mo system, or other binary systems, processed by mechanical alloying. In the regular solution model, the effects of crystalline defects such as dislocations and grain boundaries produced during milling were introduced. The model gives results that are consistent with the solubility limit extension reported in other works for the Cu-Cr, Cu-Nb and Cu-Fe systems processed by mechanical alloying.
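
    For orientation, a regular-solution description of the kind referred to above adds the stored energy of milling-induced defects to the usual mixing free energy; the expression below is a generic sketch under that assumption, not the authors' exact parameterisation:

    ```latex
    % Regular-solution mixing free energy with extra terms for milling-induced defects
    % \Omega: interaction parameter; the defect terms shift the free-energy balance and
    % thereby extend the apparent solubility limit
    \Delta G_{\mathrm{mix}} = \Omega\, x_{\mathrm{Cu}}\, x_{\mathrm{Mo}}
      + RT \left( x_{\mathrm{Cu}} \ln x_{\mathrm{Cu}} + x_{\mathrm{Mo}} \ln x_{\mathrm{Mo}} \right)
      + \Delta G_{\mathrm{dislocations}} + \Delta G_{\mathrm{grain\ boundaries}} .
    ```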

  15. Use, misuse and extensions of "ideal gas" models of animal encounter.

    Science.gov (United States)

    Hutchinson, John M C; Waser, Peter M

    2007-08-01

    Biologists have repeatedly rediscovered classical models from physics predicting collision rates in an ideal gas. These models, and their two-dimensional analogues, have been used to predict rates and durations of encounters among animals or social groups that move randomly and independently, given population density, velocity, and distance at which an encounter occurs. They have helped to separate cases of mixed-species association based on behavioural attraction from those that simply reflect high population densities, and to detect cases of attraction or avoidance among conspecifics. They have been used to estimate the impact of population density, speeds of movement and size on rates of encounter between members of the opposite sex, between gametes, between predators and prey, and between observers and the individuals that they are counting. One limitation of published models has been that they predict rates of encounter, but give no means of determining whether observations differ significantly from predictions. Another uncertainty is the robustness of the predictions when animal movements deviate from the model's assumptions in specific, biologically relevant ways. Here, we review applications of the ideal gas model, derive extensions of the model to cover some more realistic movement patterns, correct several errors that have arisen in the literature, and show how to generate confidence limits for expected rates of encounter among independently moving individuals. We illustrate these results using data from mangabey monkeys originally used along with the ideal gas model to argue that groups avoid each other. Although agent-based simulations provide a more flexible alternative approach, the ideal gas model remains both a valuable null model and a useful, less onerous, approximation to biological reality.
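
    The prediction at the core of the two-dimensional "ideal gas" model reviewed above is the classical encounter-rate formula; the form below is the standard kinetic-theory sketch, with the mean relative speed left as a modelling choice:

    ```latex
    % Encounter rate \lambda experienced by a focal individual in two dimensions
    % \rho: density of other individuals, d: distance at which an encounter is scored,
    % \bar{v}_{rel}: mean relative speed (equal to 4v/\pi if all individuals move at speed v
    % in independent, uniformly random directions)
    \lambda = 2\, d\, \rho\, \bar{v}_{\mathrm{rel}} .
    ```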

  16. An Extension of a Fuzzy Reputation Agent Trust Model (AFRAS) in the ART Testbed

    OpenAIRE

    Carbó, Javier; Molina, José M.

    2010-01-01

    With the introduction of web services, users require an automated way of determining their reliability and even their matching to personal and subjective preferences. Therefore, trust modelling of web services, managed in an autonomous way by intelligent agents, is a challenging and relevant issue. Due to the dynamic and distributed nature of web services, recommendations of web services from third parties may also play an important role to build and update automated trust models. In this con...

  17. Extending enterprise architecture modelling with business goals and requirements

    Science.gov (United States)

    Engelsman, Wilco; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten

    2011-02-01

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling techniques for EA focus on the products, services, processes and applications of an enterprise. In addition, techniques may be provided to describe structured requirements lists and use cases. Little support is available however for modelling the underlying motivation of EAs in terms of stakeholder concerns and the high-level goals that address these concerns. This article describes a language that supports the modelling of this motivation. The definition of the language is based on existing work on high-level goal and requirements modelling and is aligned with an existing standard for enterprise modelling: the ArchiMate language. Furthermore, the article illustrates how EA can benefit from analysis techniques from the requirements engineering domain.

  18. Extensive cardinal parameter model to predict growth of pseudomonads in salt-reduced lightly preserved seafood

    DEFF Research Database (Denmark)

    Martinez Rios, Veronica; Dalgaard, Paw

    The aim of this study was to develop an extensive predictive model that allows growth of psychrotolerant pseudomonads to be predicted in brined and marinated seafood with a range of different organic acids. The new model was developed by expanding an existing cardinal parameter-type model for growth of pseudomonads in dairy products and including terms for temperature, pH, aw/NaCl, lactic- and sorbic acids (Martinez-Rios et al., Int. J. Food Microbiol. 216, 110-120, 2016). MIC-values for acetic-, benzoic- and citric acids were determined in broth, and terms modelling their antimicrobial effect were added to the model. The new and expanded ... reformulation as shown here for brined shrimps at 8°C, pH of 5.8 and water phase organic acid concentrations of 3000 ppm (citric), 1200 ppm (benzoic) and 500 ppm (sorbic). When the water phase salt concentration in this product is reduced from 3% to 1%, growth of psychrotolerant pseudomonads changes from none...

  19. Extension of the time-average model to Candu refueling schemes involving reshuffling

    International Nuclear Information System (INIS)

    Rouben, Benjamin; Nichita, Eleodor

    2008-01-01

    Candu reactors consist of a horizontal non-pressurized heavy-water-filled vessel penetrated axially by fuel channels, each containing twelve 50-cm-long fuel bundles cooled by pressurized heavy water. Candu reactors are refueled on-line and, as a consequence, the core flux and power distributions change continuously. For design purposes, a 'time-average' model was developed in the 1970's to calculate the average over time of the flux and power distribution and to study the effects of different refueling schemes. The original time-average model only allows treatment of simple push-through refueling schemes whereby fresh fuel is inserted at one end of the channel and irradiated fuel is removed from the other end. With the advent of advanced fuel cycles and new Candu designs, novel refueling schemes may be considered, such as reshuffling discharged fuel from some channels into other channels, to achieve better overall discharge burnup. Such reshuffling schemes cannot be handled by the original time-average model. This paper presents an extension of the time-average model to allow for the treatment of refueling schemes with reshuffling. Equations for the extended model are presented, together with sample results for a simple demonstration case. (authors)

  20. Modeling when people quit: Bayesian censored geometric models with hierarchical and latent-mixture extensions.

    Science.gov (United States)

    Okada, Kensuke; Vandekerckhove, Joachim; Lee, Michael D

    2018-02-01

    People often interact with environments that can provide only a finite number of items as resources. Eventually a book contains no more chapters, there are no more albums available from a band, and every Pokémon has been caught. When interacting with these sorts of environments, people either actively choose to quit collecting new items, or they are forced to quit when the items are exhausted. Modeling the distribution of how many items people collect before they quit involves untangling these two possibilities. We propose that censored geometric models are a useful basic technique for modeling the quitting distribution, and show how, by implementing these models in a hierarchical and latent-mixture framework through Bayesian methods, they can be extended to capture the additional features of specific situations. We demonstrate this approach by developing and testing a series of models in two case studies involving real-world data. One case study deals with people choosing jokes from a recommender system, and the other deals with people completing items in a personality survey.
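
    A minimal form of the censored geometric model described above, without the hierarchical or latent-mixture layers, can be written as follows; the symbols θ (per-item quitting probability) and N (number of available items) are generic notation introduced here for illustration:

    ```latex
    % X: number of items collected; quitting is voluntary for k < N and forced at k = N
    P(X = k \mid \theta, N) =
    \begin{cases}
      (1 - \theta)^{k-1}\, \theta, & k = 1, \dots, N - 1 \quad \text{(chose to quit)} \\[4pt]
      (1 - \theta)^{N-1},          & k = N \quad \text{(items exhausted)}
    \end{cases}
    ```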

  1. The steric gate amino acid tyrosine 112 is required for efficient mismatched-primer extension by human DNA polymerase kappa.

    Science.gov (United States)

    Niimi, Naoko; Sassa, Akira; Katafuchi, Atsushi; Grúz, Petr; Fujimoto, Hirofumi; Bonala, Radha-Rani; Johnson, Francis; Ohta, Toshihiro; Nohmi, Takehiko

    2009-05-26

    Human DNA is continuously damaged by exogenous and endogenous genotoxic insults. To counteract DNA damage and ensure the completion of DNA replication, cells possess specialized DNA polymerases (Pols) that bypass a variety of DNA lesions. Human DNA polymerase kappa (hPolkappa) is a member of the Y-family of DNA Pols and a direct counterpart of DinB in Escherichia coli. hPolkappa is characterized by its ability to bypass several DNA adducts [e.g., benzo[a]pyrene diolepoxide-N(2)-deoxyguanine (BPDE-N(2)-dG) and thymine glycol] and efficiently extend primers with mismatches at the termini. hPolkappa is structurally distinct from E. coli DinB in that it possesses an approximately 100-amino acid extension at the N-terminus. Here, we report that tyrosine 112 (Y112), the steric gate amino acid of hPolkappa, which distinguishes dNTPs from rNTPs by sensing the 2'-hydroxy group of incoming nucleotides, plays a crucial role in extension reactions with mismatched primer termini. When Y112 was replaced with alanine, the amino acid change severely reduced the catalytic constant, i.e., k(cat), of the extending mismatched primers and lowered the efficiency, i.e., k(cat)/K(m), of this process by approximately 400-fold compared with that of the wild-type enzyme. In contrast, the amino acid replacement did not reduce the insertion efficiency of dCMP opposite BPDE-N(2)-dG in template DNA, nor did it affect the ability of hPolkappa to bind strongly to template-primer DNA with BPDE-N(2)-dG/dCMP. We conclude that the steric gate of hPolkappa is a major fidelity factor that regulates extension reactions from mismatched primer termini.
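
    The kinetic quantities quoted above follow the usual Michaelis-Menten definitions; they are restated schematically below, with the approximately 400-fold figure taken from the abstract rather than re-derived:

    ```latex
    % Michaelis-Menten rate and catalytic (specificity) efficiency
    v = \frac{k_{\mathrm{cat}}\,[E]_0\,[S]}{K_m + [S]},
    \qquad
    \text{efficiency} = \frac{k_{\mathrm{cat}}}{K_m},
    \qquad
    \frac{(k_{\mathrm{cat}}/K_m)_{\mathrm{wild\ type}}}{(k_{\mathrm{cat}}/K_m)_{\mathrm{Y112A}}}
      \approx 400 \;\; \text{(mismatched-primer extension)} .
    ```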

  2. A Simple Mathematical Model for Standard Model of Elementary Particles and Extension Thereof

    Science.gov (United States)

    Sinha, Ashok

    2016-03-01

    An algebraically (and geometrically) simple model representing the masses of the elementary particles in terms of the interaction (strong, weak, electromagnetic) constants is developed, including the Higgs bosons. The predicted Higgs boson mass is identical to that discovered by LHC experimental programs; while possibility of additional Higgs bosons (and their masses) is indicated. The model can be analyzed to explain and resolve many puzzles of particle physics and cosmology including the neutrino masses and mixing; origin of the proton mass and the mass-difference between the proton and the neutron; the big bang and cosmological Inflation; the Hubble expansion; etc. A novel interpretation of the model in terms of quaternion and rotation in the six-dimensional space of the elementary particle interaction-space - or, equivalently, in six-dimensional spacetime - is presented. Interrelations among particle masses are derived theoretically. A new approach for defining the interaction parameters leading to an elegant and symmetrical diagram is delineated. Generalization of the model to include supersymmetry is illustrated without recourse to complex mathematical formulation and free from any ambiguity. This Abstract represents some results of the Author's Independent Theoretical Research in Particle Physics, with possible connection to the Superstring Theory. However, only very elementary mathematics and physics is used in my presentation.

  3. Meaningful Use in Chronic Care: Improved Diabetes Outcomes Using a Primary Care Extension Center Model.

    Science.gov (United States)

    Cykert, Samuel; Lefebvre, Ann; Bacon, Thomas; Newton, Warren

    The effect of practice facilitation that provides onsite quality improvement (QI) and electronic health record (EHR) coaching on chronic care outcomes is unclear. This study evaluates the effectiveness of such a program, similar to an agricultural extension center model, that provides these services. Through the Health Information Technology for Economic and Clinical Health (HITECH) portion of the American Recovery and Reinvestment Act, the North Carolina Area Health Education Centers program became the Regional Extension Center for Health Information Technology (REC) for North Carolina. The REC program provides onsite technical assistance to help small primary care practices achieve meaningful use of certified EHRs. While pursuing meaningful use functionality, practices were also offered complementary onsite advice regarding QI issues. We followed the first 50 primary care practices that utilized both EHR and QI advice targeting diabetes care. The achievement of meaningful use of certified EHRs and performance of QI with onsite practice facilitation showed an absolute improvement of 19% in the proportion of patients who achieved excellent diabetes control, while the proportion with poorly controlled diabetes (hemoglobin A1c above 9%) fell steeply in these practices. No control group was available for comparison. Practice facilitation that provided EHR and QI coaching support showed important improvements in diabetes outcomes in practices that achieved meaningful use of their EHR systems. This approach holds promise as a way to help small primary care practices achieve excellent patient outcomes. ©2016 by the North Carolina Institute of Medicine and The Duke Endowment. All rights reserved.

  4. Using cognitive modeling for requirements engineering in anesthesiology

    NARCIS (Netherlands)

    Pott, C; le Feber, J

    2005-01-01

    Cognitive modeling is a complexity reducing method to describe significant cognitive processes under a specified research focus. Here, a cognitive process model for decision making in anesthesiology is presented and applied in requirements engineering. Three decision making situations of

  5. Modeling Domain Variability in Requirements Engineering with Contexts

    Science.gov (United States)

    Lapouchnian, Alexei; Mylopoulos, John

    Various characteristics of the problem domain define the context in which the system is to operate and thus impact heavily on its requirements. However, most requirements specifications do not consider contextual properties and few modeling notations explicitly specify how domain variability affects the requirements. In this paper, we propose an approach for using contexts to model domain variability in goal models. We discuss the modeling of contexts, the specification of their effects on system goals, and the analysis of goal models with contextual variability. The approach is illustrated with a case study.

  6. On generalized Yang-Mills theories and extensions of the standard model in Clifford (tensorial) spaces

    International Nuclear Information System (INIS)

    Castro, Carlos

    2006-01-01

    We construct the Clifford-space tensorial-gauge field generalizations of Yang-Mills theories and the Standard Model that allow the prediction of new particles (bosons, fermions) and tensor-gauge fields of higher spins in the 10 TeV regime. We proceed with a detailed discussion of the unique D4-D5-E6-E7-E8 model of Smith, based on the underlying Clifford algebraic structures in D = 8, which furnishes all the properties of the Standard Model and Gravity in four dimensions at low energies. A generalization and extension of Smith's model to the full Clifford space is presented when we write explicitly all the terms of the extended Clifford-space Lagrangian. We conclude by explaining the relevance of multiple foldings of D = 8 dimensions related to the modulo 8 periodicity of the real Clifford algebras and display the interplay among Clifford, Division, Jordan, and Exceptional algebras, within the context of D = 26, 27, 28 dimensions, corresponding to bosonic string, M and F theory, respectively, advanced earlier by Smith. To finalize, we describe explicitly how the E8 x E8 Yang-Mills theory can be obtained from a Gauge Theory based on the Clifford (16) group

  7. New viable region of an inert Higgs doublet dark matter model with scotogenic extension

    Science.gov (United States)

    Borah, Debasish; Gupta, Aritra

    2017-12-01

    We explore the intermediate dark matter mass regime of the inert Higgs doublet model, approximately between 400 and 550 GeV, which is allowed by latest constraints from direct and indirect detection experiments, but the thermal relic abundance remains suppressed. We extend the model by three copies of right-handed neutrinos, odd under the built-in Z2 symmetry of the model. This discrete Z2 symmetry of the model allows these right-handed neutrinos to couple to the usual lepton doublets through the inert Higgs doublet allowing the possibility of radiative neutrino mass in the scotogenic fashion. Apart from generating nonzero neutrino mass, such an extension can also revive the intermediate dark matter mass regime. The late decay of the lightest right-handed neutrino to dark matter makes it possible for the usual thermally underabundant dark matter in this intermediate mass regime to satisfy the correct relic abundance limit. The revival of this wide intermediate mass range can have relevance not only for direct and indirect search experiments but also for neutrino experiments as the long lifetime of the lightest right-handed neutrino also results in almost vanishing lightest neutrino mass.

  8. A cell kinetic model of granulopoiesis under radiation exposure: Extension from rodents to canines and humans

    International Nuclear Information System (INIS)

    Hu, S.; Cucinotta, F. A.

    2011-01-01

    As significant ionising radiation exposure will occur during prolonged space travel in the future, it is essential to understand its adverse effects on the radiosensitive organ systems that are important for the immediate survival of humans, e.g. the haematopoietic system. In this paper, a bio-mathematical model of granulopoiesis is used to analyse the granulocyte changes seen in the blood of mammals under acute and continuous radiation exposure. This is one of a set of haematopoietic models that have been successfully utilised to simulate and interpret the experimental data of acute and chronic radiation on rodents. Extension to canine and human systems indicates that the results of the model are consistent with the cumulative experimental and empirical data from various sources, implying the potential to integrate them into one unified model system to monitor the haematopoietic response of various species under irradiation. The suppression of a space traveller's granulocyte level under the chronic stress of low-dose irradiation, as well as the granulopoietic response when encountering a historically large solar particle event, is also discussed. (authors)

  9. Extensive and systematic rewiring of histone post-translational modifications in cancer model systems.

    Science.gov (United States)

    Noberini, Roberta; Osti, Daniela; Miccolo, Claudia; Richichi, Cristina; Lupia, Michela; Corleone, Giacomo; Hong, Sung-Pil; Colombo, Piergiuseppe; Pollo, Bianca; Fornasari, Lorenzo; Pruneri, Giancarlo; Magnani, Luca; Cavallaro, Ugo; Chiocca, Susanna; Minucci, Saverio; Pelicci, Giuliana; Bonaldi, Tiziana

    2018-03-29

    Histone post-translational modifications (PTMs) generate a complex combinatorial code that regulates gene expression and nuclear functions, and whose deregulation has been documented in different types of cancers. Therefore, the availability of relevant culture models that can be manipulated and that retain the epigenetic features of the tissue of origin is absolutely crucial for studying the epigenetic mechanisms underlying cancer and testing epigenetic drugs. In this study, we took advantage of quantitative mass spectrometry to comprehensively profile histone PTMs in patient tumor tissues, primary cultures and cell lines from three representative tumor models, breast cancer, glioblastoma and ovarian cancer, revealing an extensive and systematic rewiring of histone marks in cell culture conditions, which includes a decrease of H3K27me2/me3, H3K79me1/me2 and H3K9ac/K14ac, and an increase of H3K36me1/me2. While some changes occur in short-term primary cultures, most of them are instead time-dependent and appear only in long-term cultures. Remarkably, such changes mostly revert in cell line- and primary cell-derived in vivo xenograft models. Taken together, these results support the use of xenografts as the most representative models of in vivo epigenetic processes, suggesting caution when using cultured cells, in particular cell lines and long-term primary cultures, for epigenetic investigations.

  10. Extension of the master sintering curve for constant heating rate modeling

    Science.gov (United States)

    McCoy, Tammy Michelle

    The purpose of this work is to extend the functionality of the Master Sintering Curve (MSC) such that it can be used as a practical tool for predicting sintering schemes that combine both a constant heating rate and an isothermal hold. Rather than just being able to predict a final density for the object of interest, the extension to the MSC will actually be able to model a sintering run from start to finish. Because the Johnson model does not incorporate this capability, the work presented is an extension of what has already been shown in literature to be a valuable resource in many sintering situations. A predicted sintering curve that incorporates a combination of constant heating rate and an isothermal hold is more indicative of what is found in real-life sintering operations. This research offers the possibility of predicting the sintering schedule for a material, thereby having advanced information about the extent of sintering, the time schedule for sintering, and the sintering temperature with a high degree of accuracy and repeatability. The research conducted in this thesis focuses on the development of a working model for predicting the sintering schedules of several stabilized zirconia powders having the compositions YSZ (HSY8), 10Sc1CeSZ, 10Sc1YSZ, and 11ScSZ1A. The compositions of the four powders are first verified using x-ray diffraction (XRD) and the particle size and surface area are verified using a particle size analyzer and BET analysis, respectively. The sintering studies were conducted on powder compacts using a double pushrod dilatometer. Density measurements are obtained both geometrically and using the Archimedes method. Each of the four powders is pressed into ¼" diameter pellets using a manual press with no additives, such as a binder or lubricant. Using a double push-rod dilatometer, shrinkage data for the pellets is obtained over several different heating rates. The shrinkage data is then converted to reflect the change in relative
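
    For orientation, the MSC approach condenses an arbitrary time-temperature schedule into a single "work of sintering" integral, Θ(t, T(t)) = ∫ (1/T) exp(−Q/RT) dt, against which relative density is plotted. The sketch below (a minimal illustration with an assumed activation energy, heating rate and hold, not values from this work) accumulates Θ over a combined ramp-and-hold schedule of the kind the extension targets.

```python
import numpy as np

R = 8.314  # universal gas constant, J/(mol K)

def msc_work_integral(times, temps, Q):
    """Accumulate Theta(t) = integral of (1/T) exp(-Q/(R*T)) dt by the trapezoid rule."""
    integrand = np.exp(-Q / (R * temps)) / temps
    increments = 0.5 * (integrand[1:] + integrand[:-1]) * np.diff(times)
    return np.concatenate(([0.0], np.cumsum(increments)))

# Illustrative schedule: 10 K/min ramp from 300 K to 1700 K, then a 2 h isothermal hold.
beta = 10.0 / 60.0                           # heating rate, K/s
t_ramp = (1700.0 - 300.0) / beta
t = np.linspace(0.0, t_ramp + 2 * 3600.0, 20001)
T = np.where(t <= t_ramp, 300.0 + beta * t, 1700.0)

Q_assumed = 600e3                            # apparent activation energy, J/mol (illustrative)
theta = msc_work_integral(t, T, Q_assumed)
print(f"log10(Theta) at end of ramp : {np.log10(theta[t <= t_ramp][-1]):.2f}")
print(f"log10(Theta) after the hold : {np.log10(theta[-1]):.2f}")
# Relative density would then be read from the sigmoidal rho(log10 Theta) master
# curve fitted to the dilatometer shrinkage data for each powder.
```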

  11. Carbon-14 dynamics in rice: an extension of the ORYZA2000 model

    Energy Technology Data Exchange (ETDEWEB)

    Galeriu, D.; Melintescu, A. ['Horia Hulubei' National Institute for Physics and Nuclear Engineering, Life and Environmental Physics Department, 30 Reactorului St., POB MG-6, Bucharest-Magurele (Romania)

    2014-03-15

    Carbon-14 (¹⁴C) is a radionuclide of major interest in nuclear power production. The Fukushima accident changed the public attitude on the use of nuclear energy all over the world. In terms of nuclear safety, the need of quality-assured radiological models was emphasized by many international organizations, and for models used by decision-makers (i.e. regulatory environmental models and radiological models), a moderate conservatism, transparency, relative simplicity and user friendliness are required. Because the interaction between crops and the environment is complex and regulated by many feedback mechanisms, however, these requirements are difficult to accomplish. The present study makes a step forward regarding the development of a robust model dealing with food contamination after a short-term accidental emission and considers a single crop species, rice (Oryza sativa), one of the most widely used rice species. Old and more recent experimental data regarding the carbon dynamics in rice plants are reviewed, and a well-established crop growth model, ORYZA2000, is used and adapted in order to assess the dynamics of ¹⁴C in rice after a short-term exposure to ¹⁴CO₂. Here, the model is used to investigate the role of the genotype, management and weather on the concentration of radiocarbon at harvest. (orig.)

  12. Extension of Small-Scale Postharvest Horticulture Technologies—A Model Training and Services Center

    Directory of Open Access Journals (Sweden)

    Lisa Kitinoja

    2015-07-01

    Full Text Available A pilot Postharvest Training and Services Center (PTSC) was launched in October 2012 in Arusha, Tanzania as part of a United States Agency for International Development (USAID) funded project. The five key components of the PTSC are (1) training of postharvest trainers, (2) postharvest training and demonstrations for local small-scale clientele, (3) adaptive research, (4) postharvest services, and (5) retail sales of postharvest tools and supplies. During the years of 2011–2012, a one-year e-learning program was provided to 36 young horticultural professionals from seven Sub-Saharan African countries. These postharvest specialists went on to train more than 13,000 local farmers, extension workers, food processors, and marketers in their home countries in the year following completion of their course. Evaluators found that these specialists had trained an additional 9300 people by November 2014. When asked about adoption by their local trainees, 79% reported examples of their trainees using improved postharvest practices. From 2012–2013, the project supported 30 multi-day training programs, and the evaluation found that many of the improved practices being promoted were adopted by the trainees and led to increased earnings. Three PTSC components still require attention. Research activities initiated during the project are incomplete, and successful sales of postharvest goods and services will require commitment and improved partnering.

  13. Extension and Higher Education Service-Learning: Toward a Community Development Service-Learning Model

    Science.gov (United States)

    Stoecker, Randy

    2014-01-01

    This article explores how on-the-ground Extension educators interface with higher education service-learning. Most service-learning in Extension has focused on precollege youth and 4-H. When we look at higher education service-learning and Extension in Wisconsin, we see that there is not as much connection as might be expected. County-based…

  14. Abelian embedding formulation of the Stueckelberg model and its power-counting renormalizable extension

    International Nuclear Information System (INIS)

    Quadri, Andrea

    2006-01-01

    We elucidate the geometry of the polynomial formulation of the non-Abelian Stueckelberg mechanism. We show that a natural off-shell nilpotent Becchi-Rouet-Stora-Tyutin (BRST) differential exists, allowing one to implement the constraint on the σ field by means of BRST techniques. This is achieved by extending the ghost sector by an additional U(1) factor (Abelian embedding). An important consequence is that a further BRST-invariant but not gauge-invariant mass term can be written for the non-Abelian gauge fields. Like all versions of the Stueckelberg theory, the Abelian embedding formulation yields a non-power-counting renormalizable theory in D=4. We then derive its natural power-counting renormalizable extension and show that the physical spectrum contains a physical massive scalar particle. Physical unitarity is also established. This model implements the spontaneous symmetry breaking in the Abelian embedding formalism

  15. Lepton flavour violation in a minimal S₃-invariant extension of the Standard Model

    Energy Technology Data Exchange (ETDEWEB)

    Mondragon, A.; Mondragon, M.; Peinado, E. [IFUNAM, A.P. 20-364, 01000 Mexico, D.F. (Mexico)

    2007-12-15

    After a brief review of some relevant results on lepton masses and mixings that had previously been derived in the framework of a minimal S₃-invariant extension of the Standard Model, we derive explicit analytical expressions for the matrices of the Yukawa couplings and compute the branching ratios of some selected flavour-changing neutral current processes as functions of the masses of the charged leptons and the neutral Higgs bosons. We find that the S₃ × Z₂ flavour symmetry and the strong mass hierarchy of the charged leptons strongly suppress the FCNC processes in the leptonic sector, well below the present experimental upper bounds by many orders of magnitude. (Author)

  16. LHC benchmark scenarios for the real Higgs singlet extension of the standard model

    International Nuclear Information System (INIS)

    Robens, Tania; Stefaniak, Tim

    2016-01-01

    We present benchmark scenarios for searches for an additional Higgs state in the real Higgs singlet extension of the Standard Model in Run 2 of the LHC. The scenarios are selected such that they fulfill all relevant current theoretical and experimental constraints, but can potentially be discovered at the current LHC run. We take into account the results presented in earlier work and update the experimental constraints from relevant LHC Higgs searches and signal rate measurements. The benchmark scenarios are given separately for the low-mass and high-mass region, i.e. the mass range where the additional Higgs state is lighter or heavier than the discovered Higgs state at around 125 GeV. They have also been presented in the framework of the LHC Higgs Cross Section Working Group. (orig.)

  17. Personalized predictive modeling for patients with Alzheimer's disease using an extension of Sullivan's life table model.

    Science.gov (United States)

    Stallard, Eric; Kinosian, Bruce; Stern, Yaakov

    2017-09-20

    Alzheimer's disease (AD) progression varies substantially among patients, hindering calculation of residual total life expectancy (TLE) and its decomposition into disability-free life expectancy (DFLE) and disabled life expectancy (DLE) for individual patients with AD. The objective of the present study was to assess the accuracy of a new synthesis of Sullivan's life table (SLT) and longitudinal Grade of Membership (L-GoM) models that estimates individualized TLEs, DFLEs, and DLEs for patients with AD. If sufficiently accurate, such information could enhance the quality of important decisions in AD treatment and patient care. We estimated a new SLT/L-GoM model of the natural history of AD over 10 years in the Predictors 2 Study cohort: N = 229 with 6 fixed and 73 time-varying covariates over 21 examinations covering 11 measurement domains including cognitive, functional, behavioral, psychiatric, and other symptoms/signs. Total remaining life expectancy was censored at 10 years. Disability was defined as need for full-time care (FTC), the outcome most strongly associated with AD progression. All parameters were estimated via weighted maximum likelihood using data-dependent weights designed to ensure that the estimates of the prognostic subtypes were of high quality. Goodness of fit was tested/confirmed for survival and FTC disability for five relatively homogeneous subgroups defined to cover the range of patient outcomes over the 21 examinations. The substantial heterogeneity in initial patient presentation and AD progression was captured using three clinically meaningful prognostic subtypes and one terminal subtype exhibiting highly differentiated symptom severity on 7 of the 11 measurement domains. Comparisons of the observed and estimated survival and FTC disability probabilities demonstrated that the estimates were accurate for all five subgroups, supporting their use in AD life expectancy calculations. Mean 10-year TLE differed widely across subgroups

  18. Software Requirements Specification Verifiable Fuel Cycle Simulation (VISION) Model

    International Nuclear Information System (INIS)

    D. E. Shropshire; W. H. West

    2005-01-01

    The purpose of this Software Requirements Specification (SRS) is to define the top-level requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). This simulation model is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies

  19. Capabilities and requirements for modelling radionuclide transport in the geosphere

    International Nuclear Information System (INIS)

    Paige, R.W.; Piper, D.

    1989-02-01

    This report gives an overview of geosphere flow and transport models suitable for use by the Department of the Environment in the performance assessment of radioactive waste disposal sites. An outline methodology for geosphere modelling is proposed, consisting of a number of different types of model. A brief description of each of the component models is given, indicating the purpose of the model, the processes being modelled and the methodologies adopted. Areas requiring development are noted. (author)

  20. Heritage House Maintenance Using 3d City Model Application Domain Extension Approach

    Science.gov (United States)

    Mohd, Z. H.; Ujang, U.; Liat Choon, T.

    2017-11-01

    Heritage houses are part of the architectural heritage of Malaysia and are highly valued. The Department of Heritage has made many efforts to preserve these houses, such as monitoring their damage problems. Damage to a heritage house may be caused by wood decay, roof leakage and exfoliation of the walls. One of the initiatives for maintaining and documenting heritage houses is through three-dimensional (3D) technology. 3D city models are now widely used by researchers for management and analysis. CityGML is a standard commonly used to exchange, store and manage the geometric and semantic information of virtual 3D city models. It also represents 3D models at multiple scales in five levels of detail (LoDs), each of which serves distinct functions. An Application Domain Extension (ADE) of CityGML was recently introduced and can be used for monitoring damage problems and recording the number of inhabitants of a house.

  1. HERITAGE HOUSE MAINTENANCE USING 3D CITY MODEL APPLICATION DOMAIN EXTENSION APPROACH

    Directory of Open Access Journals (Sweden)

    Z. H. Mohd

    2017-11-01

    Full Text Available Heritage houses are part of the architectural heritage of Malaysia and are highly valued. The Department of Heritage has made many efforts to preserve these houses, such as monitoring their damage problems. Damage to a heritage house may be caused by wood decay, roof leakage and exfoliation of the walls. One of the initiatives for maintaining and documenting heritage houses is through three-dimensional (3D) technology. 3D city models are now widely used by researchers for management and analysis. CityGML is a standard commonly used to exchange, store and manage the geometric and semantic information of virtual 3D city models. It also represents 3D models at multiple scales in five levels of detail (LoDs), each of which serves distinct functions. An Application Domain Extension (ADE) of CityGML was recently introduced and can be used for monitoring damage problems and recording the number of inhabitants of a house.

  2. Extensive Numerical Study and Circuitry Implementation of the Watt Governor Model

    Science.gov (United States)

    Marcondes, D. W. C.; Comassetto, G. F.; Pedro, B. G.; Vieira, J. C. C.; Hoff, A.; Prebianca, F.; Manchein, C.; Albuquerque, H. A.

    In this work we carry out an extensive numerical study of a Watt centrifugal governor system model, and we also implement an electronic circuit by analog computation to solve the model experimentally. Our numerical results show the existence of self-organized stable periodic structures (SPSs) in parameter-space plots of the largest Lyapunov exponent and of isospikes of the time series of the Watt governor system model. A peculiar hierarchical organization and period-adding bifurcation cascade of the SPSs are observed, and this self-organized cascade accumulates on a periodic boundary. It is also shown that the periods of these structures organize themselves according to the solutions of Diophantine equations. In addition, an experimental setup is implemented by a circuitry analogy of mechanical systems, using an analog computing technique, to characterize the robustness of our numerical results. After applying active control of chaos in the experiment, the effect of intrinsic experimental noise was minimized, such that the experimental results agree remarkably well with our numerical findings. Another notable result is the application of the analog computing technique to perform experimental circuitry analysis of real mechanical problems.
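
    As a rough illustration of the machinery behind such parameter-space scans, the sketch below integrates one commonly quoted dimensionless Watt governor-engine model (an assumed form, not necessarily the exact equations of this study) and estimates the largest Lyapunov exponent with the standard two-trajectory (Benettin) renormalisation; all parameter values are arbitrary.

```python
import numpy as np

def watt_rhs(x, eps, alpha, F):
    # Assumed dimensionless governor-engine equations:
    # phi = arm angle, psi = its rate, Omega = engine angular speed.
    phi, psi, Omega = x
    return np.array([psi,
                     Omega**2 * np.sin(phi) * np.cos(phi) - np.sin(phi) - eps * psi,
                     alpha * (np.cos(phi) - F)])

def rk4_step(x, dt, *args):
    k1 = watt_rhs(x, *args)
    k2 = watt_rhs(x + 0.5 * dt * k1, *args)
    k3 = watt_rhs(x + 0.5 * dt * k2, *args)
    k4 = watt_rhs(x + dt * k3, *args)
    return x + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

def largest_lyapunov(eps, alpha, F, dt=0.01, n_steps=100_000, d0=1e-8):
    """Benettin two-trajectory estimate of the largest Lyapunov exponent."""
    x = np.array([0.5, 0.0, 1.0])
    y = x + np.array([d0, 0.0, 0.0])
    s = 0.0
    for _ in range(n_steps):
        x = rk4_step(x, dt, eps, alpha, F)
        y = rk4_step(y, dt, eps, alpha, F)
        d = np.linalg.norm(y - x)
        s += np.log(d / d0)
        y = x + (y - x) * (d0 / d)          # renormalise the separation
    return s / (n_steps * dt)

# A positive exponent would indicate chaos at this (arbitrary) parameter point.
print(largest_lyapunov(eps=0.7, alpha=0.7, F=0.3))
```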

  3. Top quark electric dipole moment in a minimal supersymmetric standard model extension with vectorlike multiplets

    International Nuclear Information System (INIS)

    Ibrahim, Tarek; Nath, Pran

    2010-01-01

    The electric dipole moment (EDM) of the top quark is calculated in a model with a vector-like multiplet which mixes with the third generation in an extension of the minimal supersymmetric standard model. Such mixings allow for new CP-violating phases. Including these new CP phases, the EDM of the top in this class of models is computed. The top EDM arises from loops involving the exchange of the W, the Z, as well as from exchanges involving the charginos, the neutralinos, the gluino, and the vector-like multiplet and their superpartners. The analysis of the EDM of the top is more complicated than for the light quarks because the mass of the external fermion, in this case the top quark mass, cannot be ignored relative to the masses inside the loops. A numerical analysis is presented and it is shown that the top EDM could be close to 10⁻¹⁹ e cm, consistent with the current limits on the EDM of the electron, the neutron and on atomic EDMs. A top EDM of size 10⁻¹⁹ e cm could be accessible in collider experiments such as the International Linear Collider.

  4. Constraints effects in swollen particulate composites with hyperelastic polymer matrix of finite extensibility modeled by FEM

    Science.gov (United States)

    Šomvársky, Ján; Dušek, Karel; Dušková-Smrčková, Miroslava

    2014-03-01

    The class of particulate composites with cross-linked hyperelastic polymer matrix and non-deformable filler particles represents many important biopolymer and engineering materials. At application conditions, the matrix is either in the swollen state, or the swollen state is utilized for matrix characterization. In this contribution, a numerical model for simulation of equilibrium stress-strain and swelling behavior of this composite material was developed based on finite element method using COMSOL Multiphysics® software. In the constitutive equations (Gibbs energy), the elastic contribution is based on statistical-mechanical model of a network composed of freely jointed chains of finite extensibility and polymer-solvent mixing term is derived from the Flory-Huggins lattice model. A perfect adhesion of matrix-to-particle is assumed. The adhesion of matrix to stiff surface generates stress and degree-of-swelling fields in the composite. The existence of these fields determines the mechanical and swelling properties of the composite. Spatial distribution of filler particles in the composite plays an important role.
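
    For orientation, the mixing part of such a Gibbs energy is typically of Flory-Huggins form, and in the Gaussian-chain limit the swelling equilibrium of an unconstrained network reduces to the classical Flory-Rehner condition shown below. The model described above instead uses a finite-extensibility (inverse-Langevin) elastic term and solves the particle-constrained problem by FEM, so these textbook expressions are only the limiting case for comparison.

```latex
% Flory-Huggins mixing term (network treated as a single molecule) and the
% classical Gaussian-chain Flory-Rehner swelling equilibrium.
% phi_p: polymer volume fraction, chi: interaction parameter,
% nu_e: effective network chain density, V_s: solvent molar volume.
\frac{\Delta G_{\mathrm{mix}}}{RT} = n_s \ln(1-\phi_p) + \chi\, n_s \phi_p ,
\qquad
\ln(1-\phi_p) + \phi_p + \chi \phi_p^{2}
  + \nu_e V_s \left( \phi_p^{1/3} - \tfrac{\phi_p}{2} \right) = 0 .
```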

  5. Novel extension of the trap model for electrons in liquid hydrocarbons

    International Nuclear Information System (INIS)

    Jamal, M.A.; Watt, D.E.

    1981-01-01

    A novel extension of the trap model of electron mobilities in liquid hydrocarbons is described. The new model assumes: (a) two main types of electron trap exist in liquid hydrocarbons, one deep and the other shallow; (b) these traps are the same in all liquid alkanes. The difference in electron mobilities in different alkanes is accounted for by the difference in the frequency of electron trapping in each state. The probability of trapping in each state has been evaluated from the known structures of the normal alkanes. Electron mobilities in normal alkanes (C₃–C₁₀) show a very good correlation with the probability of trapping in deep traps, suggesting that the C-C bonds are the main energy sinks for the electron. A mathematical formula which expresses the electron mobility in terms of the probability of trapping in deep traps has been derived from the Arrhenius relationship between electron mobilities and the probability of trapping. The model has been extended to branched alkanes, and the relatively high electron mobilities in globular alkanes have been explained by the fact that each branch provides some degree of screening of the skeleton structure of the molecule, resulting in a reduction of the probability of electron interaction with the molecular skeleton. (author)

  6. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    Requirements validation is a critical task in any engineering project. The confrontation of stakeholders with static requirements models is not enough, since stakeholders with non-computer science education are not able to discover all the inter-dependencies between the elicited requirements. Even...... requirements, where the system to be built must explicitly support the interaction between people within a pervasive cooperative workflow execution. A case study from a real project is used to illustrate the proposed approach....

  7. Identifying the determinants of South Africa’s extensive and intensive trade margins: A gravity model approach

    Directory of Open Access Journals (Sweden)

    Marianne Matthee

    2017-03-01

    Full Text Available Background: The significance of the paper is twofold. Firstly, it adds to the small but growing body of literature focusing on the decomposition of South Africa's export growth. Secondly, it identifies the determinants of the intensive and extensive margins of South Africa's exports – a topic that (as far as the authors are concerned) has not been explored before. Aim: This paper aims to investigate a wide range of market access determinants that affect South Africa's export growth along the intensive and extensive margins. Setting: Export diversification has been identified as one of the critical pillars of South Africa's much-hoped-for economic revival. Although recent years have seen the country's export product mix evolving, there is still insufficient diversification into new markets with high value-added products. This is putting a damper on export performance as a whole and, in turn, hindering South Africa's economic growth. Methods: A Heckman selection gravity model is applied using highly disaggregated data. The first stage of the process revealed the factors affecting the probability of South Africa exporting to a particular destination (extensive margin). The second stage, which modelled trade flows, revealed the variables that affect export volumes (intensive margin). Results: The results showed that South Africa's export product mix is relatively varied, but the number of export markets is limited. In terms of the extensive margin (or the probability of exporting), economic variables such as the importing country's GDP and population have a positive impact on firms' decision to export. Other factors affecting the extensive margin are distance to the market (negative impact), cultural or language fit (positive impact), presence of a South African embassy abroad (positive impact), existing free trade agreement with the Southern African Development Community (positive impact) and trade regulations and costs (negative impact).
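
    A minimal sketch of the two-stage Heckman selection logic used in such gravity estimations is given below, run on simulated data with illustrative variable names (this is the generic estimator, not the authors' specification): a probit for the decision to export to a destination (extensive margin), followed by OLS on positive flows augmented with the inverse Mills ratio (intensive margin).

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 500

# Hypothetical destination-market data; variable names are illustrative only.
log_gdp = rng.normal(25.0, 1.5, n)
log_dist = rng.normal(8.8, 0.6, n)
common_lang = rng.integers(0, 2, n)

# Simulated selection (extensive margin) and outcome (intensive margin) processes.
latent = 0.8 * (log_gdp - 25) - 1.0 * (log_dist - 8.8) + 0.5 * common_lang + rng.normal(0, 1, n)
exported = (latent > 0).astype(int)
log_exports = 1.2 * (log_gdp - 25) - 0.9 * (log_dist - 8.8) + 0.4 * common_lang + rng.normal(0, 0.5, n)

df = pd.DataFrame(dict(log_gdp=log_gdp, log_dist=log_dist, common_lang=common_lang,
                       exported=exported, log_exports=log_exports))

# Stage 1 (extensive margin): probit for the probability of exporting to a destination.
X1 = sm.add_constant(df[["log_gdp", "log_dist", "common_lang"]])
probit = sm.Probit(df["exported"], X1).fit(disp=0)
xb = probit.fittedvalues                    # linear index
df["imr"] = norm.pdf(xb) / norm.cdf(xb)     # inverse Mills ratio

# Stage 2 (intensive margin): OLS on positive flows with the IMR as a selection correction.
pos = df[df["exported"] == 1]
X2 = sm.add_constant(pos[["log_gdp", "log_dist", "common_lang", "imr"]])
ols = sm.OLS(pos["log_exports"], X2).fit()
print(ols.params.round(3))
```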

  8. The effectiveness of agrobusiness technical training and education model for the field agricultural extension officers

    Directory of Open Access Journals (Sweden)

    Kristiyo Sumarwono

    2017-07-01

    Full Text Available The study aimed to: (1) find the most effective agrobusiness technical training and education model for the Field Agricultural Extension Officers to be implemented; and (2) identify the knowledge level, the highest agrobusiness skills and the strongest self-confidence that might be achieved by the participants through the implemented training and education patterns. The study was conducted by means of the experiment method, with the regular pattern of training and education program as the control and the mentoring pattern of training and education program as the treatment. The three patterns of training and education programs served as the independent variables while the knowledge, the skills and the self-confidence served as the dependent variables. The study was conducted in three locations, namely: the Institution of Agricultural Human Resources Development in the Province of Yogyakarta Special Region (Balai Pengembangan Sumber Daya Manusia Pertanian Daerah Istimewa Yogyakarta – BPSMP DIY); the Institution of Agricultural Human Resources Empowerment (Balai Pemberdayaan Sumber Daya Manusia Pertanian – BPSDMTAN Soropadan Temanggung Provinsi Jawa Tengah) in Soropadan, Temanggung, the Province of Central Java; and the Institution of Training and Education in Semarang, the Province of Central Java (Badan Pendidikan dan Pelatihan Semarang Provinsi Jawa Tengah). The study was conducted with all of the participants who attended the agrobusiness technical training and education program and, therefore, all of the participants became the subjects of the study. The study was conducted from October 2013 until March 2014. The results of the study showed that: (1) there had not been any significant difference in the knowledge and the skills of the participants who attended the regular pattern in training and education programs and those who attended the mentoring pattern in training and education programs; (2) the regular pattern in training and education programs

  9. Multi-period natural gas market modeling Applications, stochastic extensions and solution approaches

    Science.gov (United States)

    Egging, Rudolf Gerardus

    shorter solution times relative to solving the extensive-forms. Larger problems, up to 117,481 variables, were solved in extensive-form, but not when applying BD due to numerical issues. It is discussed how BD could significantly reduce the solution time of large-scale stochastic models, but various challenges remain and more research is needed to assess the potential of Benders decomposition for solving large-scale stochastic MCP.

  10. Multi-Period Natural Gas Market Modeling. Applications, Stochastic Extensions and Solution Approaches

    International Nuclear Information System (INIS)

    Egging, R.G.

    2010-11-01

    shorter solution times relative to solving the extensive-forms. Larger problems, up to 117,481 variables, were solved in extensive-form, but not when applying BD due to numerical issues. It is discussed how BD could significantly reduce the solution time of large-scale stochastic models, but various challenges remain and more research is needed to assess the potential of Benders decomposition for solving large-scale stochastic MCP.

  11. A Scalable and Extensible Earth System Model for Climate Change Science

    Energy Technology Data Exchange (ETDEWEB)

    Gent, Peter; Lamarque, Jean-Francois; Conley, Andrew; Vertenstein, Mariana; Craig, Anthony

    2013-02-13

    The objective of this award was to build a scalable and extensible Earth System Model that can be used to study climate change science. That objective has been achieved with the public release of the Community Earth System Model, version 1 (CESM1). In particular, the development of the CESM1 atmospheric chemistry component was substantially funded by this award, as was the development of the significantly improved coupler component. The CESM1 allows new climate change science in areas such as future air quality in very large cities, the effects of recovery of the southern hemisphere ozone hole, and effects of runoff from ice melt in the Greenland and Antarctic ice sheets. Results from a whole series of future climate projections using the CESM1 are also freely available via the web from the CMIP5 archive at the Lawrence Livermore National Laboratory. Many research papers using these results have now been published, and will form part of the 5th Assessment Report of the United Nations Intergovernmental Panel on Climate Change, which is to be published late in 2013.

  12. Dimensional reduction of the CPT-even electromagnetic sector of the standard model extension

    Science.gov (United States)

    Casana, Rodolfo; Carvalho, Eduardo S.; Ferreira, Manoel M., Jr.

    2011-08-01

    The CPT-even Abelian gauge sector of the standard model extension is represented by the Maxwell term supplemented by (KF)μνρσFμνFρσ, where the Lorentz-violating background tensor, (KF)μνρσ, possesses the symmetries of the Riemann tensor. In the present work, we examine the planar version of this theory, obtained by means of a typical dimensional reduction procedure to (1+2) dimensions. The resulting planar electrodynamics is composed of a gauge sector containing six Lorentz-violating coefficients, a scalar field endowed with a noncanonical kinetic term, and a coupling term that links the scalar and gauge sectors. The dispersion relation is exactly determined, revealing that the six parameters related to the pure electromagnetic sector do not yield birefringence at any order. In this model, the birefringence may appear only as a second order effect associated with the coupling tensor linking the gauge and scalar sectors. The equations of motion are written and solved in the stationary regime. The Lorentz-violating parameters do not alter the asymptotic behavior of the fields but induce an angular dependence not observed in the Maxwell planar theory.

  13. Requirements Traceability and Transformation Conformance in Model-Driven Development

    NARCIS (Netherlands)

    Andrade Almeida, João; van Eck, Pascal; Iacob, Maria Eugenia

    2006-01-01

    The variety of design artefacts (models) produced in a model-driven design process results in an intricate relationship between requirements and the various models. This paper proposes a methodological framework that simplifies management of this relationship. This framework is a basis for tracing

  14. Extension of a Kolmogorov Atmospheric Turbulence Model for Time-Based Simulation Implementation

    Science.gov (United States)

    McMinn, John D.

    1997-01-01

    The development of any super/hypersonic aircraft requires the interaction of a wide variety of technical disciplines to maximize vehicle performance. For flight and engine control system design and development on this class of vehicle, realistic mathematical simulation models of atmospheric turbulence, including winds and the varying thermodynamic properties of the atmosphere, are needed. A model which has been tentatively selected by a government/industry group of flight and engine/inlet controls representatives working on the High Speed Civil Transport is one based on the Kolmogorov spectrum function. This report compares the Dryden and Kolmogorov turbulence forms, and describes enhancements that add functionality to the selected Kolmogorov model. These added features are: an altitude variation of the eddy dissipation rate based on Dryden data, the mapping of the eddy dissipation rate database onto a regular latitude and longitude grid, a method to account for flight at large vehicle attitude angles, and a procedure for transitioning smoothly across turbulence segments.
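
    For context, the practical difference between the two spectral forms is their high-frequency behaviour: the standard Dryden longitudinal PSD rolls off as Ω⁻², while a Kolmogorov-type spectrum follows the inertial-range Ω^(-5/3) law scaled by the eddy dissipation rate ε. The sketch below compares the two; all numerical values (length scale, dissipation rate, prefactor) are illustrative and not taken from the report.

```python
import numpy as np

sigma_u = 1.0            # r.m.s. gust velocity, m/s (illustrative)
L_u = 762.0              # turbulence length scale, m (illustrative)
Omega = np.logspace(-5, 0, 200)   # spatial frequency, rad/m

# Standard Dryden longitudinal PSD.
phi_dryden = sigma_u**2 * (2.0 * L_u / np.pi) / (1.0 + (L_u * Omega)**2)

# Inertial-range Kolmogorov behaviour: PSD proportional to eps^(2/3) * Omega^(-5/3).
eps = 1e-4               # eddy dissipation rate, m^2/s^3 (illustrative)
C = 0.15                 # illustrative constant absorbing the spectral prefactor
phi_kolmogorov = C * eps**(2.0 / 3.0) * Omega**(-5.0 / 3.0)

# Asymptotic log-log slopes: about -2 for Dryden, -5/3 for the Kolmogorov form.
slope_dryden = np.polyfit(np.log(Omega[-50:]), np.log(phi_dryden[-50:]), 1)[0]
slope_kolmo = np.polyfit(np.log(Omega[-50:]), np.log(phi_kolmogorov[-50:]), 1)[0]
print(f"asymptotic slopes: Dryden {slope_dryden:.2f}, Kolmogorov {slope_kolmo:.2f}")
```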

  15. GAIA - a generalizable, extensible structure for integrating games, models and social networking to support decision makers

    Science.gov (United States)

    Paxton, L. J.; Schaefer, R. K.; Nix, M.; Fountain, G. H.; Weiss, M.; Swartz, W. H.; Parker, C. L.; MacDonald, L.; Ihde, A. G.; Simpkins, S.; GAIA Team

    2011-12-01

    In this paper we describe the application of a proven methodology for modeling the complex social and economic interactions embodied in real-world decision making to water scarcity and water resources. We have developed a generalizable, extensible facility we call "GAIA" - Global Assimilation of Information for Action - and applied it to different problem sets. We describe the use of the "Green Country Model" and other gaming/simulation tools to address the impacts of climate and climate disruption issues at the intersection of science, economics, policy, and society. There is a long history in the Defense community of using what are known as strategic simulations or "wargames" to model the complex interactions between the environment, people, resources, infrastructure and the economy in a competitive environment. We describe in this paper, work that we have done on understanding how this heritage can be repurposed to help us explore how the complex interplay between climate disruption and our socio/political and economic structures will affect our future. Our focus here is on a fundamental and growing issue - water and water availability. We consider water and the role of "virtual water" in the system. Various "actors" are included in the simulations. While these simulations cannot definitively predict what will happen, they do illuminate non-linear feedbacks between, for example, treaty agreement, the environment, the economy, and the government. These simulations can be focused on the global, regional, or local environment. We note that these simulations are not "zero sum" games - there need not be a winner and a loser. They are, however, competitive influence games: they represent the tools that a nation, state, faction or group has at its disposal to influence policy (diplomacy), finances, industry (economy), infrastructure, information, etc to achieve their particular goals. As in the real world the problem is competitive - not everyone shares the same

  16. A new conceptual model of coral biomineralisation: hypoxia as the physiological driver of skeletal extension

    Directory of Open Access Journals (Sweden)

    S. Wooldridge

    2013-05-01

    Full Text Available That coral skeletons are built of aragonite crystals with taxonomy-linked ultrastructure has been well understood since the 19th century. Yet, the way by which corals control this crystallization process remains an unsolved question. Here, I outline a new conceptual model of coral biomineralisation that endeavours to relate known skeletal features with homeostatic functions beyond traditional growth (structural) determinants. In particular, I propose that the dominant physiological driver of skeletal extension is night-time hypoxia, which is exacerbated by the respiratory oxygen demands of the coral's algal symbionts (= zooxanthellae). The model thus provides a new narrative to explain the high growth rate of symbiotic corals, by equating skeletal deposition with the "work-rate" of the coral host needed to maintain a stable and beneficial symbiosis. In this way, coral skeletons are interpreted as a continuous (long-run) recording unit of the stability and functioning of the coral–algae endosymbiosis. After providing supportive evidence for the model across multiple scales of observation, I use coral core data from the Great Barrier Reef (Australia) to highlight the disturbed nature of the symbiosis in recent decades, but suggest that its onset is consistent with a trajectory that has been followed since at least the start of the 1900s. In concluding, I outline how the proposed capacity of cnidarians (which include modern reef corals) to overcome the metabolic limitation of hypoxia via skeletogenesis also provides a new hypothesis to explain the sudden appearance in the fossil record of calcified skeletons at the Precambrian–Cambrian transition – and the ensuing rapid appearance of most major animal phyla.

  17. KiDS-450: testing extensions to the standard cosmological model

    Science.gov (United States)

    Joudaki, Shahab; Mead, Alexander; Blake, Chris; Choi, Ami; de Jong, Jelte; Erben, Thomas; Fenech Conti, Ian; Herbonnet, Ricardo; Heymans, Catherine; Hildebrandt, Hendrik; Hoekstra, Henk; Joachimi, Benjamin; Klaes, Dominik; Köhlinger, Fabian; Kuijken, Konrad; McFarland, John; Miller, Lance; Schneider, Peter; Viola, Massimo

    2017-10-01

    We test extensions to the standard cosmological model with weak gravitational lensing tomography using 450 deg² of imaging data from the Kilo Degree Survey (KiDS). In these extended cosmologies, which include massive neutrinos, non-zero curvature, evolving dark energy, modified gravity and running of the scalar spectral index, we also examine the discordance between KiDS and cosmic microwave background (CMB) measurements from Planck. The discordance between the two data sets is largely unaffected by a more conservative treatment of the lensing systematics and the removal of angular scales most sensitive to non-linear physics. The only extended cosmology that simultaneously alleviates the discordance with Planck and is at least moderately favoured by the data includes evolving dark energy with a time-dependent equation of state (in the form of the w0-wa parametrization). In this model, the respective S8 = σ8√(Ωm/0.3) constraints agree at the 1σ level, and there is 'substantial concordance' between the KiDS and Planck data sets when accounting for the full parameter space. Moreover, the Planck constraint on the Hubble constant is wider than in Λ cold dark matter (ΛCDM) and in agreement with the Riess et al. (2016) direct measurement of H0. The dark energy model is moderately favoured as compared to ΛCDM when combining the KiDS and Planck measurements, and marginalized constraints in the w0-wa plane are discrepant with a cosmological constant at the 3σ level. KiDS further constrains the sum of neutrino masses to less than 4.0 eV (95% CL), finds no preference for time- or scale-dependent modifications to the metric potentials, and is consistent with flatness and no running of the spectral index.
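
    Purely as a numerical illustration of the quoted parameter combination (values chosen for arithmetic convenience, not the published constraints):

```latex
S_8 \equiv \sigma_8 \sqrt{\Omega_m / 0.3}, \qquad
\sigma_8 = 0.75,\ \Omega_m = 0.25
\;\Rightarrow\; S_8 = 0.75 \sqrt{0.833} \approx 0.68 .
```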

  18. Commonsense Psychology and the Functional Requirements of Cognitive Models

    National Research Council Canada - National Science Library

    Gordon, Andrew S

    2005-01-01

    In this paper we argue that previous models of cognitive abilities (e.g. memory, analogy) have been constructed to satisfy functional requirements of implicit commonsense psychological theories held by researchers and nonresearchers alike...

  19. Requirements engineering for cross-sectional information chain models.

    Science.gov (United States)

    Hübner, U; Cruel, E; Gök, M; Garthaus, M; Zimansky, M; Remmers, H; Rienhoff, O

    2012-01-01

    Despite the wealth of literature on requirements engineering, little is known about engineering very generic, innovative and emerging requirements, such as those for cross-sectional information chains. The IKM health project aims at building information chain reference models for the care of patients with chronic wounds, cancer-related pain and back pain. Our question therefore was how to appropriately capture information and process requirements that are both generally applicable and practically useful. To this end, we started with recommendations from clinical guidelines and put them up for discussion in Delphi surveys and expert interviews. Despite the heterogeneity we encountered in all three methods, it was possible to obtain requirements suitable for building reference models. We evaluated three modelling languages and then chose to write the models in UML (class and activity diagrams). On the basis of the current project results, the pros and cons of our approach are discussed.

  20. Spreadsheet Decision Support Model for Training Exercise Material Requirements Planning

    National Research Council Canada - National Science Library

    Tringali, Arthur

    1997-01-01

    This thesis focuses on developing a spreadsheet decision support model that can be used by combat engineer platoon and company commanders in determining the material requirements and estimated costs...

  1. Hybriding CMMI and requirement engineering maturity and capability models

    OpenAIRE

    Buglione, Luigi; Hauck, Jean Carlo R.; Gresse von Wangenheim, Christiane; Mc Caffery, Fergal

    2012-01-01

    Estimation represents one of the most critical processes for any project and it is highly dependent on the quality of requirements elicitation and management. Therefore, the management of requirements should be prioritised in any process improvement program, because the less precise the requirements gathering, analysis and sizing, the greater the error in terms of time and cost estimation. Maturity and Capability Models (MCM) represent a good tool for assessing the status of ...

  2. Generic skills requirements (KSA model) towards future mechanical ...

    African Journals Online (AJOL)

    Generic Skills are a basic requirement that engineers need to master in all areas of engineering. This study was conducted throughout peninsular Malaysia, involving small, medium and heavy industries, using the KSA Model. The objectives of this study are to examine the level of requirement of Generic Skills that need to be ...

  3. Extension of the quantum-kinetic model to lunar and Mars return physics

    International Nuclear Information System (INIS)

    Liechty, D. S.; Lewis, M. J.

    2014-01-01

    The ability to compute rarefied, ionized hypersonic flows is becoming more important as missions such as Earth reentry, landing high-mass payloads on Mars, and the exploration of the outer planets and their satellites are being considered. A recently introduced molecular-level chemistry model, the quantum-kinetic, or Q-K, model that predicts reaction rates for gases in thermal equilibrium and non-equilibrium using only kinetic theory and fundamental molecular properties, is extended in the current work to include electronic energy level transitions and reactions involving charged particles. Like the Q-K procedures for neutral species chemical reactions, these new models are phenomenological procedures that aim to reproduce the reaction/transition rates but do not necessarily capture the exact physics. These engineering models are necessarily efficient due to the requirement to compute billions of simulated collisions in direct simulation Monte Carlo (DSMC) simulations. The new models are shown to generally agree within the spread of reported transition and reaction rates from the literature for near equilibrium conditions

  4. SEA extension of a F. E. model to predict total engine noise

    Science.gov (United States)

    Stimpson, G.; Lalor, N.

    Automotive engine noise has been the subject of much research and development in recent years, mainly due to the pressures of legislation. Most of this research has been concentrated on the design of the cylinder block, since this is where the vibration originates. However, on many engines the light covers (i.e. timing gear cover, rocker cover and sump) are the predominant sources of structurally radiated noise, and usually a 2 to 3 decibel (dBA) reduction can be achieved by quietening them. Because of its inherent stiffness, the block casting vibrates with quite simple (low order) mode shapes even at the top end of the acoustically important 300 Hz to 3000 Hz frequency band. Thus, relatively coarse mesh Finite Element (FE) models are adequate for noise prediction. In contrast to this, many light covers have a high modal density in their predominant noise radiating region, making finite element techniques difficult to apply. The block, cylinder head and bearing caps assembly can also be considered as a subsystem of a Statistical Energy Analysis (SEA) model. Thus the vibration energy calculated by the FE model can be fed into the SEA model of the complete engine - which can include ancillary equipment (starter motor, alternator, exhaust system etc.), if required. This paper describes how such a SEA model is constructed and how it can be used to evaluate noise reduction strategies.
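
    To make the FE-to-SEA handoff concrete, the sketch below solves the standard steady-state SEA power balance for a toy two-subsystem case (for example, a block feeding a cover); the frequency, loss factors and input power are illustrative only.

```python
import numpy as np

# Minimal two-subsystem SEA power balance; all values are illustrative.
omega = 2 * np.pi * 1000.0        # band centre frequency, rad/s
eta = np.array([0.01, 0.02])      # damping loss factors of subsystems 1 and 2
eta12, eta21 = 5e-3, 1e-3         # coupling loss factors 1->2 and 2->1
P_in = np.array([1.0, 0.0])       # input power (W): all excitation enters subsystem 1

# Power balance: P_i = omega * (eta_i * E_i + sum_j (eta_ij * E_i - eta_ji * E_j))
A = omega * np.array([[eta[0] + eta12, -eta21],
                      [-eta12,         eta[1] + eta21]])
E = np.linalg.solve(A, P_in)      # band-averaged subsystem energies
print("subsystem energies (J):", E)

# In the hybrid scheme described above, the block energy could instead be
# prescribed from the FE results and the balance solved for the cover alone;
# radiated noise would then follow from the cover energy and a radiation efficiency.
```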

  5. Higher derivative extensions of 3d Chern-Simons models: conservation laws and stability

    Energy Technology Data Exchange (ETDEWEB)

    Kaparulin, D.S.; Karataeva, I.Yu.; Lyakhovich, S.L. [Tomsk State University, Physics Faculty, Tomsk (Russian Federation)

    2015-11-15

    We consider the class of higher derivative 3d vector field models with the field equation operator being a polynomial of the Chern-Simons operator. For the nth-order theory of this type, we provide a general recipe for constructing n-parameter family of conserved second rank tensors. The family includes the canonical energy-momentum tensor, which is unbounded, while there are bounded conserved tensors that provide classical stability of the system for certain combinations of the parameters in the Lagrangian. We also demonstrate the examples of consistent interactions which are compatible with the requirement of stability. (orig.)

  6. Climate-driven range extension of Amphistegina (protista, foraminiferida): models of current and predicted future ranges.

    Science.gov (United States)

    Langer, Martin R; Weinmann, Anna E; Lötters, Stefan; Bernhard, Joan M; Rödder, Dennis

    2013-01-01

    Species-range expansions are a predicted and realized consequence of global climate change. Climate warming and the poleward widening of the tropical belt have induced range shifts in a variety of marine and terrestrial species. Range expansions may have broad implications on native biota and ecosystem functioning as shifting species may perturb recipient communities. Larger symbiont-bearing foraminifera constitute ubiquitous and prominent components of shallow water ecosystems, and range shifts of these important protists are likely to trigger changes in ecosystem functioning. We have used historical and newly acquired occurrence records to compute current range shifts of Amphistegina spp., a larger symbiont-bearing foraminifera, along the eastern coastline of Africa and compare them to analogous range shifts currently observed in the Mediterranean Sea. The study provides new evidence that amphisteginid foraminifera are rapidly progressing southwestward, closely approaching Port Edward (South Africa) at 31°S. To project future species distributions, we applied a species distribution model (SDM) based on ecological niche constraints of current distribution ranges. Our model indicates that further warming is likely to cause a continued range extension, and predicts dispersal along nearly the entire southeastern coast of Africa. The average rates of amphisteginid range shift were computed between 8 and 2.7 km year(-1), and are projected to lead to a total southward range expansion of 267 km, or 2.4° latitude, in the year 2100. Our results corroborate findings from the fossil record that some larger symbiont-bearing foraminifera cope well with rising water temperatures and are beneficiaries of global climate change.

  7. Climate-driven range extension of Amphistegina (protista, foraminiferida: models of current and predicted future ranges.

    Directory of Open Access Journals (Sweden)

    Martin R Langer

    Full Text Available Species-range expansions are a predicted and realized consequence of global climate change. Climate warming and the poleward widening of the tropical belt have induced range shifts in a variety of marine and terrestrial species. Range expansions may have broad implications on native biota and ecosystem functioning as shifting species may perturb recipient communities. Larger symbiont-bearing foraminifera constitute ubiquitous and prominent components of shallow water ecosystems, and range shifts of these important protists are likely to trigger changes in ecosystem functioning. We have used historical and newly acquired occurrence records to compute current range shifts of Amphistegina spp., a larger symbiont-bearing foraminifera, along the eastern coastline of Africa and compare them to analogous range shifts currently observed in the Mediterranean Sea. The study provides new evidence that amphisteginid foraminifera are rapidly progressing southwestward, closely approaching Port Edward (South Africa) at 31°S. To project future species distributions, we applied a species distribution model (SDM) based on ecological niche constraints of current distribution ranges. Our model indicates that further warming is likely to cause a continued range extension, and predicts dispersal along nearly the entire southeastern coast of Africa. The average rates of amphisteginid range shift were computed between 8 and 2.7 km year(-1), and are projected to lead to a total southward range expansion of 267 km, or 2.4° latitude, in the year 2100. Our results corroborate findings from the fossil record that some larger symbiont-bearing foraminifera cope well with rising water temperatures and are beneficiaries of global climate change.

  8. Irrigation Requirement Estimation Using Vegetation Indices and Inverse Biophysical Modeling

    Science.gov (United States)

    Bounoua, Lahouari; Imhoff, Marc L.; Franks, Shannon

    2010-01-01

    We explore an inverse biophysical modeling process forced by satellite and climatological data to quantify irrigation requirements in semi-arid agricultural areas. We constrain the carbon and water cycles modeled under both equilibrium, balance between vegetation and climate, and non-equilibrium, water added through irrigation. We postulate that the degree to which irrigated dry lands vary from equilibrium climate conditions is related to the amount of irrigation. The amount of water required over and above precipitation is considered as an irrigation requirement. For July, results show that spray irrigation resulted in an additional amount of water of 1.3 mm per occurrence with a frequency of 24.6 hours. In contrast, the drip irrigation required only 0.6 mm every 45.6 hours or 46% of that simulated by the spray irrigation. The modeled estimates account for 87% of the total reported irrigation water use, when soil salinity is not important and 66% in saline lands.

  9. Correction to the crack extension direction in numerical modelling of mixed mode crack paths

    DEFF Research Database (Denmark)

    Lucht, Tore; Aliabadi, M.H.

    2007-01-01

    In order to avoid the introduction of an error when a local crack-growth criterion is used in an incremental crack growth formulation, each straight crack extension would have to be infinitesimal or have its direction corrected. In this paper a new procedure to correct the crack extension direction is proposed in connection with crack growth analyzed by the Dual Boundary Element Method (DBEM). The proposed correction procedure and a reference correction procedure already described in the literature are evaluated by solving two different computational crack growth examples. In the two examples it is found that analyses of the crack paths performed with the proposed crack correction procedure using big increments of crack extension are in excellent agreement with analyses of the crack paths performed by using very small increments of crack extension. Furthermore, it is shown that the reference

  10. Modelling dynamic liquid-gas systems: Extensions to the volume-of-fluid solver

    CSIR Research Space (South Africa)

    Heyns, Johan A

    2013-06-01

    Full Text Available onboard air- and spacecraft or liquid natural gas on tankers. As part of the development three extensions are considered: Firstly, a revised surface capturing formulation is proposed; secondly, a new weakly compressible volume-of-fluid formulation...

  11. Extensions to modeling aerobic carbon degradation using combined respirometric-titrimetric measurements in view of activated sludge model calibration.

    Science.gov (United States)

    Sin, Gürkan; Vanrolleghem, Peter A

    2007-08-01

    Recently a model was introduced to interpret the respirometric (OUR) -titrimetric (Hp) data obtained from aerobic oxidation of different carbon sources in view of calibration of Activated Sludge Model No.1 (ASM1). The model requires, among others, the carbon dioxide transfer rate (CTR) to be relatively constant during aerobic experiments. As CTR is an inherently nonlinear process, this assumption may not hold for certain experimental conditions. Hence, we extended the model to describe the nonlinear CTR behavior. A simple calibration procedure of the CO2 model was developed only using titrimetric data. The identifiable parameter subset of this model when using titrimetric data only contained the first equilibrium constant of the CO2 dissociation, pK1, the initial aqueous CO2 concentration, C(Tinit) and the nitrogen content of biomass, i(NBM). The extended model was then successfully applied to interpret typical data obtained from respirometric-titrimetric measurements with a nonlinear CO2 stripping process. The parameter estimation results using titrimetric data were consistent with the results estimated using respirometric data (OUR) alone or combined OUR and Hp data, thereby supporting the validity of the dynamic CO2 model and its calibration approach. The increased range of applicability and accurate utilization of the titrimetric data are expected to contribute particularly to the improvement of calibration of ASM models using batch experiments.
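
    The nonlinearity comes from the carbonate system itself. As a minimal illustration (generic equilibrium chemistry, not the calibrated model of the paper), the sketch below splits dissolved inorganic carbon between CO2(aq) and bicarbonate using only the first dissociation constant pK1, one of the identifiable parameters mentioned above.

```python
def co2_speciation(c_total, pH, pK1=6.35):
    """Split total dissolved inorganic carbon (mol/L) between CO2(aq) and HCO3-
    using only the first dissociation, a common simplification near neutral pH."""
    ratio = 10.0 ** (pH - pK1)       # [HCO3-]/[CO2] from K1 = [H+][HCO3-]/[CO2]
    co2 = c_total / (1.0 + ratio)
    hco3 = c_total - co2
    return co2, hco3

# Illustrative numbers, not calibrated values from the paper.
co2, hco3 = co2_speciation(c_total=5e-3, pH=7.8, pK1=6.35)
print(f"CO2(aq) = {co2 * 1e3:.3f} mM, HCO3- = {hco3 * 1e3:.3f} mM")
# Stripping CO2 pulls this equilibrium towards CO2(aq) and consumes protons, which is
# why a non-constant CO2 transfer rate shows up in the titrimetric (Hp) signal.
```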

  12. Genomic selection using regularized linear regression models: ridge regression, lasso, elastic net and their extensions.

    Science.gov (United States)

    Ogutu, Joseph O; Schulz-Streeck, Torben; Piepho, Hans-Peter

    2012-05-21

    Genomic selection (GS) is emerging as an efficient and cost-effective method for estimating breeding values using molecular markers distributed over the entire genome. In essence, it involves estimating the simultaneous effects of all genes or chromosomal segments and combining the estimates to predict the total genomic breeding value (GEBV). Accurate prediction of GEBVs is a central and recurring challenge in plant and animal breeding. The existence of a bewildering array of approaches for predicting breeding values using markers underscores the importance of identifying approaches able to efficiently and accurately predict breeding values. Here, we comparatively evaluate the predictive performance of six regularized linear regression methods-- ridge regression, ridge regression BLUP, lasso, adaptive lasso, elastic net and adaptive elastic net-- for predicting GEBV using dense SNP markers. We predicted GEBVs for a quantitative trait using a dataset on 3000 progenies of 20 sires and 200 dams and an accompanying genome consisting of five chromosomes with 9990 biallelic SNP-marker loci simulated for the QTL-MAS 2011 workshop. We applied all the six methods that use penalty-based (regularization) shrinkage to handle datasets with far more predictors than observations. The lasso, elastic net and their adaptive extensions further possess the desirable property that they simultaneously select relevant predictive markers and optimally estimate their effects. The regression models were trained with a subset of 2000 phenotyped and genotyped individuals and used to predict GEBVs for the remaining 1000 progenies without phenotypes. Predictive accuracy was assessed using the root mean squared error, the Pearson correlation between predicted GEBVs and (1) the true genomic value (TGV), (2) the true breeding value (TBV) and (3) the simulated phenotypic values based on fivefold cross-validation (CV). The elastic net, lasso, adaptive lasso and the adaptive elastic net all had
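
    A minimal sketch of the compared approach using scikit-learn's penalized regressions (not the authors' code); the dataset here is simulated and much smaller than the QTL-MAS 2011 data, and all tuning values are illustrative.

```python
# Illustrative sketch (not the study's code): penalized regression for
# genomic prediction on a small simulated SNP dataset.
import numpy as np
from sklearn.linear_model import Ridge, Lasso, ElasticNet

rng = np.random.default_rng(0)
n_train, n_test, n_snps = 1000, 500, 2000            # much smaller than the QTL-MAS data
X = rng.integers(0, 3, size=(n_train + n_test, n_snps)).astype(float)  # SNP codes 0/1/2
effects = np.zeros(n_snps)
effects[rng.choice(n_snps, 50, replace=False)] = rng.normal(0.0, 1.0, 50)  # sparse QTL
tgv = X @ effects                                     # "true genomic value"
y = tgv + rng.normal(0.0, 5.0, len(tgv))              # simulated phenotypes

models = {"ridge": Ridge(alpha=100.0),
          "lasso": Lasso(alpha=0.05, max_iter=5000),
          "elastic net": ElasticNet(alpha=0.05, l1_ratio=0.5, max_iter=5000)}
for name, model in models.items():
    model.fit(X[:n_train], y[:n_train])               # train on phenotyped individuals
    gebv = model.predict(X[n_train:])                 # predict GEBVs for the rest
    acc = np.corrcoef(gebv, tgv[n_train:])[0, 1]      # accuracy vs true genomic value
    print(f"{name}: prediction accuracy = {acc:.3f}")
```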

  13. Methods for Automating Analysis of Glacier Morphology for Regional Modelling: Centerlines, Extensions, and Elevation Bands

    Science.gov (United States)

    Viger, R. J.; Van Beusekom, A. E.

    2016-12-01

    The treatment of glaciers in modeling requires information about their shape and extent. This presentation discusses new methods and their application in a new glacier-capable variant of the USGS PRMS model, a physically-based, spatially distributed daily time-step model designed to simulate the runoff and evolution of glaciers through time. In addition to developing parameters describing PRMS land surfaces (hydrologic response units, HRUs), several of the analyses and products are likely of interest to the cryospheric science community in general. The first method is a (fully automated) variation of logic previously presented in the literature for definition of the glacier centerline. Given that the surface of a glacier might be convex, using traditional topographic analyses based on a DEM to trace a path down the glacier is not reliable. Instead, a path is derived based on a cost function. Although only a single path is presented in our results, the method can be easily modified to delineate a branched network of centerlines for each glacier. The second method extends the glacier terminus downslope by an arbitrary distance, according to local surface topography. This product can be used to explore possible, if unlikely, scenarios under which glacier area grows. More usefully, this method can be used to approximate glacier extents from previous years without needing historical imagery. The final method presents an approach for segmenting the glacier into altitude-based HRUs. Successful integration of this information with traditional approaches for discretizing the non-glacierized portions of a basin requires several additional steps. These include synthesizing the glacier centerline network with one developed with a traditional DEM analysis, ensuring that flow can be routed under and beyond glaciers to a basin outlet. Results are presented based on analysis of the Copper River Basin, Alaska.

  14. Extension of the survival dimensionality reduction algorithm to detect epistasis in competing risks models (SDR-CR).

    Science.gov (United States)

    Beretta, Lorenzo; Santaniello, Alessandro

    2013-02-01

    The discovery and the description of the genetic background of common human diseases is hampered by their complexity and dynamic behavior. Appropriate bioinformatic tools are needed to account for all the facets of complex diseases, and to this end we recently described the survival dimensionality reduction (SDR) algorithm in the effort to model gene-gene interactions in the context of survival analysis. When one event precludes the occurrence of another event under investigation in the 'competing risk model', survival algorithms require particular adjustment to avoid the risk of reporting wrong or biased conclusions. The SDR algorithm was modified to incorporate the cumulative incidence function as well as an adapted version of the Brier score for mutually exclusive outcomes, to better search for epistatic models in the competing risk setting. The applicability of the new SDR algorithm (SDR-CR) was evaluated using synthetic lifetime epistatic datasets with competing risks and on a dataset of scleroderma patients. The SDR-CR algorithm retains satisfactory power to detect the causative variants in simulated datasets under different scenarios of sample size and degrees of type I or type II censoring. In the real-world dataset, SDR-CR was capable of detecting a significant interaction between the IL-1α C-889T and the IL-1β C-511T single-nucleotide polymorphisms to predict the occurrence of restrictive lung disease vs. isolated pulmonary hypertension. We provide a useful extension of the SDR algorithm to analyze epistatic interactions in the competing risk setting that may be of use to unveil the genetic background of complex human diseases. http://sourceforge.net/projects/sdrproject/files/. Copyright © 2012 Elsevier Inc. All rights reserved.
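
    A hedged sketch of the cumulative incidence function (CIF) idea underlying SDR-CR: with competing events, the probability of each event type by time t is estimated jointly rather than by treating the competing event as censoring; the simulated event-time rates below are illustrative only.

```python
# Hedged sketch of the cumulative incidence function (CIF) for two competing
# events; the exponential event-time rates are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
n = 5000
t_event = rng.exponential(10.0, n)       # latent time to the event of interest
t_compete = rng.exponential(15.0, n)     # latent time to the competing event
time = np.minimum(t_event, t_compete)    # observed time = whichever happens first
cause = np.where(t_event <= t_compete, 1, 2)

def cif(t, k):
    """Empirical cumulative incidence of cause k by time t (no censoring)."""
    return np.mean((time <= t) & (cause == k))

for t in (5, 10, 20):
    print(t, round(cif(t, 1), 3), round(cif(t, 2), 3))  # the two CIFs sum to P(any event by t)
```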

  15. Business Process Simulation: Requirements for Business and Resource Models

    OpenAIRE

    Audrius Rima; Olegas Vasilecas

    2015-01-01

    The purpose of Business Process Model and Notation (BPMN) is to provide an easily understandable graphical representation of business processes. Thus BPMN is widely used and applied in various areas, one of them being business process simulation. This paper addresses some BPMN model based business process simulation problems. The paper formulates requirements for business process and resource models, enabling their use for business process simulation.

  16. Business Process Simulation: Requirements for Business and Resource Models

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2015-07-01

    Full Text Available The purpose of Business Process Model and Notation (BPMN) is to provide an easily understandable graphical representation of business processes. Thus BPMN is widely used and applied in various areas, one of them being business process simulation. This paper addresses some BPMN model based business process simulation problems. The paper formulates requirements for business process and resource models, enabling their use for business process simulation.

  17. Extension, sedimentation and diapirism: understanding evolution of diapiric structures in the Central High Atlas using analogue modelling

    Science.gov (United States)

    Moragas, Mar; Vergés, Jaume; Nalpas, Thierry; Saura, Eduard; Diego Martín-Martín, Juan; Messager, Grégoire; Hunt, David William

    2017-04-01

    Analogue modelling has proven to be an essential tool for the study and analysis of the mechanisms involved in tectonic processes. Applied to salt tectonics, analogue modelling has been used to understand the mechanisms that trigger the onset of diapirs and the evolution of diapiric structures and minibasins. Analogue modelling has also been applied to analyse the impact of the progradation of sedimentary systems above a ductile layer, representing the source of diapirs. However, these models did not consider ongoing tectonic processes during progradation. To analyse how extension and sedimentary progradation influence the formation of diapiric structures and their geometries, we present models composed of a mild extension phase followed by a post-extension period. Each model includes a particular sedimentary pattern: homogeneous sedimentation during extension and post-extension, homogeneous sedimentation during extension followed by prograding sedimentation during post-extension, and prograding sedimentation during both extension and post-extension. Proximal high sedimentation rates enhance the mobilization of ductile material towards growing diapirs, resulting in well-developed passive diapirs. Diapirs from the distal domain of the model with post-extension progradation show silicone extrusions, which are caused by the decreased sedimentation rate associated with the progradation. By contrast, reduced sedimentation in the distal part of the model with syn- and post-extension progradation (3.5 times smaller than in the proximal domain) causes a limited migration of the silicone and hampers the transition from reactive diapirs to active and passive diapirs. These models show that the ratio between diapir growth and sedimentation rate, the time of the onset of the progradation and the relative thickness of the sedimentary cover beneath the prograding system have a clear impact on the final diapiric geometries. Additionally, we present two models with increasing amounts of

  18. Vacuum stability in U(1)′ extensions of the Standard Model with TeV scale right handed neutrinos

    Directory of Open Access Journals (Sweden)

    Claudio Corianò

    2014-11-01

    Full Text Available We investigate a minimal U(1)′ extension of the Standard Model with one extra complex scalar and generic gauge charge assignments. We use a type-I seesaw mechanism with three heavy right handed neutrinos to illustrate the constraints on the charges, on their mass and on the mixing angle of the two scalars, derived by requiring the vacuum stability of the scalar potential. We focus our study on a scenario which could be accessible at the LHC, by selecting a vacuum expectation value of the extra Higgs in the TeV range and determining the constraints that emerge in the parameter space. To illustrate the generality of the approach, specific gauge choices corresponding to U(1)B−L, U(1)R and U(1)χ are separately analyzed. Our results are based on a modified expression of one of the β functions of the quartic couplings of the scalar potential compared to the previous literature. This is due to a change in the coefficient of the Yukawa term of the right handed neutrinos. In contrast to previous analyses, we show that this coupling may destabilize the vacuum.

  19. Regional admixture mapping and structured association testing: conceptual unification and an extensible general linear model.

    Directory of Open Access Journals (Sweden)

    David T Redden

    2006-08-01

    Full Text Available Individual genetic admixture estimates, determined both across the genome and at specific genomic regions, have been proposed for use in identifying specific genomic regions harboring loci influencing phenotypes in regional admixture mapping (RAM). Estimates of individual ancestry can be used in structured association tests (SAT) to reduce confounding induced by various forms of population substructure. Although presented as two distinct approaches, we provide a conceptual framework in which both RAM and SAT are special cases of a more general linear model. We clarify which variables are sufficient to condition upon in order to prevent spurious associations and also provide a simple closed form "semiparametric" method of evaluating the reliability of individual admixture estimates. An estimate of the reliability of individual admixture estimates is required to make an inherent errors-in-variables problem tractable. Casting RAM and SAT methods as a general linear model offers enormous flexibility enabling application to a rich set of phenotypes, populations, covariates, and situations, including interaction terms and multilocus models. This approach should allow far wider use of RAM and SAT, often using standard software, in addressing admixture as either a confounder of association studies or a tool for finding loci influencing complex phenotypes in species as diverse as plants, humans, and nonhuman animals.

  20. A Java simulator of Rescorla and Wagner's prediction error model and configural cue extensions.

    Science.gov (United States)

    Alonso, Eduardo; Mondragón, Esther; Fernández, Alberto

    2012-10-01

    In this paper we present the "R&W Simulator" (version 3.0), a Java simulator of Rescorla and Wagner's prediction error model of learning. It is able to run whole experimental designs, and compute and display the associative values of elemental and compound stimuli simultaneously, as well as use extra configural cues in generating compound values; it also permits changing the US parameters across phases. The simulator produces both numerical and graphical outputs, and includes functionality to export the results to a data processor spreadsheet. It is user-friendly, and built with a graphical interface designed to allow neuroscience researchers to input the data in their own "language". It is a cross-platform simulator, so it does not require any special equipment, operating system or support program, and does not need installation. The "R&W Simulator" (version 3.0) is available free. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
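
    The prediction-error rule the simulator implements can be sketched in a few lines; this is not the simulator's Java code, and the parameter values below are illustrative.

```python
# Minimal sketch of the Rescorla-Wagner rule: dV = alpha * beta * (lambda - V_total),
# applied to every cue present on a trial. Parameter values are illustrative.
def rescorla_wagner(trials, alpha=0.3, beta=1.0, lam=1.0):
    """trials: list of (set_of_present_cues, us_present) pairs; returns cue -> V."""
    V = {}
    for cues, us_present in trials:
        v_total = sum(V.get(c, 0.0) for c in cues)        # summed prediction from present cues
        error = (lam if us_present else 0.0) - v_total    # prediction error
        for c in cues:
            V[c] = V.get(c, 0.0) + alpha * beta * error
    return V

# Acquisition to cue A, then compound conditioning with B (blocking-style design):
training = [({"A"}, True)] * 20 + [({"A", "B"}, True)] * 20
print(rescorla_wagner(training))   # B acquires little associative strength
```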

  1. Building a Narrative Based Requirements Engineering Mediation Model

    Science.gov (United States)

    Ma, Nan; Hall, Tracy; Barker, Trevor

    This paper presents a narrative-based Requirements Engineering (RE) mediation model to help RE practitioners to effectively identify, define, and resolve conflicts of interest, goals, and requirements. Within the SPI community, there is a common belief that social, human, and organizational issues significantly impact the effectiveness of software process improvement in general and the requirements engineering process in particular. Conflicts among different stakeholders are an important human and social issue that needs more research attention in the SPI and RE community. By drawing on the conflict resolution literature and IS literature, we argue that conflict resolution in RE is a mediated process, in which a requirements engineer can act as a mediator among different stakeholders. To address socio-psychological aspects of conflict in RE and SPI, Winslade and Monk's (2000) narrative mediation model is introduced, justified, and translated into the context of RE.

  2. Modelling Security Requirements Through Extending Scrum Agile Development Framework

    OpenAIRE

    Alotaibi, Minahi

    2016-01-01

    Security is today considered as a basic foundation in software development and therefore, the modelling and implementation of security requirements is an essential part of the production of secure software systems. Information technology organisations are moving towards agile development methods in order to satisfy customers' changing requirements in light of accelerated evolution and time restrictions with their competitors in software production. Security engineering is considered difficult...

  3. Perturbative extension of the standard model with a 125 GeV Higgs and Magnetic Dark Matter

    DEFF Research Database (Denmark)

    Dissauer, Karin; Frandsen, Mads Toudal; Hapola, Tuomas

    2012-01-01

    among several direct dark matter search experiments. We further constrain the parameters of the underlying theory using results from the Large Hadron Collider. The extension can accommodate the recently observed properties of the Higgs-like state and leads to interesting predictions. Finally, we show that the model's collider phenomenology and constraints nicely complement the ones coming from dark matter searches.

  4. The CLAIR model: Extension of Brodmann areas based on brain oscillations and connectivity.

    Science.gov (United States)

    Başar, Erol; Düzgün, Aysel

    2016-05-01

    Since the beginning of the last century, the localization of brain function has been represented by Brodmann areas, maps of the anatomic organization of the brain. They are used to broadly represent cortical structures with their given sensory-cognitive functions. In recent decades, the analysis of brain oscillations has become important in the correlation of brain functions. Moreover, spectral connectivity can provide further information on the dynamic connectivity between various structures. In addition, brain responses are dynamic in nature and structural localization is almost impossible, according to Luria (1966). Therefore, brain functions are very difficult to localize; hence, a combined analysis of oscillation and event-related coherences is required. In this study, a model termed as "CLAIR" is described to enrich and possibly replace the concept of the Brodmann areas. A CLAIR model with optimum function may take several years to develop, but this study sets out to lay its foundation. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  5. Kinetic modeling and determination of reaction constants of Alzheimer's beta-amyloid fibril extension and dissociation using surface plasmon resonance.

    Science.gov (United States)

    Hasegawa, Kazuhiro; Ono, Kenjiro; Yamada, Masahito; Naiki, Hironobu

    2002-11-19

    To establish the kinetic model of the extension and dissociation of beta-amyloid fibrils (f(A)beta) in vitro, we analyzed these reactions using a surface plasmon resonance (SPR) biosensor. Sonicated f(A)beta were immobilized on the surface of the SPR sensor chip as seeds. The SPR signal increased linearly as a function of time after amyloid beta-peptides (Abeta) were injected into the f(A)beta-immobilized chips. The extension of f(A)beta was confirmed by atomic force microscopy. When flow cells were washed with running buffer, the SPR signal decreased with time after the extension reaction. The curve fitting resolved the dissociation reaction into the fast exponential and slow linear decay phases. Kinetic analysis of the effect of Abeta/f(A)beta concentrations on the reaction rate indicated that both the extension reaction and the slow linear phase of the dissociation were consistent with a first-order kinetic model; i.e., the extension/dissociation reactions proceed via consecutive association/dissociation of Abeta onto/from the end of existing fibrils. On the basis of this model, the critical monomer concentration ([M](e)) and the equilibrium association constant (K) were calculated, for the first time, to be 20 nM and 5 x 10(7) M(-1), respectively. Alternatively, [M](e) was directly measured as 200 nM, which may represent the equilibrium between the extension reaction and the fast phase of the dissociation. The SPR biosensor is a useful quantitative tool for the kinetic and thermodynamic study of the molecular mechanisms of f(A)beta formation in vitro.
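
    A hedged numeric sketch of the first-order kinetic model: net growth per fibril end is k_on[M] − k_off, so extension stops at the critical monomer concentration [M]e = k_off/k_on = 1/K. The K value is taken from the abstract; k_off is an assumed, purely illustrative rate constant.

```python
# Hedged numeric sketch of the first-order extension/dissociation kinetics.
K = 5e7            # equilibrium association constant, 1/M (value from the abstract)
M_e = 1.0 / K      # critical monomer concentration, M (= 20 nM)
k_off = 2e-4       # assumed dissociation rate constant, 1/s (illustrative only)
k_on = k_off * K   # association rate constant, 1/(M*s), implied by K = k_on / k_off

def net_growth_rate(monomer_conc_M):
    """Net monomer additions per fibril end per second."""
    return k_on * monomer_conc_M - k_off

for c in (5e-9, 2e-8, 1e-7):   # below, at, and above the critical concentration
    print(f"[M] = {c:.0e} M -> net rate = {net_growth_rate(c):+.2e} /s")
print(f"critical monomer concentration [M]e = {M_e:.0e} M")
```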

  6. Extension of the Representativeness of the Traumatic Brain Injury Model Systems National Database: 2001 to 2010

    Science.gov (United States)

    Cuthbert, Jeffrey P; Corrigan, John D.; Whiteneck, Gale G.; Harrison-Felix, Cynthia; Graham, James E.; Bell, Jeneita M.; Coronado, Victor G.

    2017-01-01

    Objective To extend the representativeness analyses of the Traumatic Brain Injury Model Systems National Database (TBIMS-NDB) completed by Corrigan and colleagues, for individuals aged 16 years and older admitted for acute, inpatient rehabilitation in the United States with a primary diagnosis of traumatic brain injury (TBI), by comparing this dataset to national data for patients admitted to inpatient rehabilitation with identical inclusion criteria, using 3 additional years of data and 2 new demographic variables. Design Secondary analysis of existing datasets; extension of previously published analyses. Setting Acute inpatient rehabilitation facilities. Participants Patients 16 years of age and older with a primary rehabilitation diagnosis of TBI; US TBI rehabilitation population n = 156,447; TBIMS-NDB population n = 7373. Interventions None. Main Outcome Measures Demographics, functional status, and hospital length of stay. Results The TBIMS-NDB was largely representative of patients 16 years and older admitted for rehabilitation in the U.S. with a primary diagnosis of TBI on or after October 1, 2001 and discharged as of December 31, 2010. The results of the extended analyses were similar to those reported by Corrigan and colleagues. Age accounted for the largest difference between the samples, with the TBIMS-NDB including a smaller proportion of patients aged 65 and older as compared to all those admitted for rehabilitation with a primary diagnosis of TBI in the United States. After partitioning each dataset at age 65, most distributional differences between samples were markedly reduced; however, differences in pre-injury vocational status (employed) and rehabilitation lengths of stay between 1 and 9 days remained robust. The subsample of patients aged 64 and younger was found to differ only slightly on all remaining variables, while those aged 65 and older were found to have meaningful differences on insurance type and age distribution

  7. Forecasting inter-urban transport demand for a logistics company: A combined grey–periodic extension model with remnant correction

    Directory of Open Access Journals (Sweden)

    Donghui Wang

    2015-12-01

    Full Text Available Accurately predicting short-term transport demand for an individual logistics company involved in a competitive market is critical for making short-term operation decisions. This article proposes a combined grey–periodic extension model with remnant correction to forecast the short-term inter-urban transport demand of a logistics company involved in a nationwide competitive market, showing changes in trend and seasonal fluctuations with irregular periods different from the macroeconomic cycle. A basic grey–periodic extension model of an additive pattern, namely, the main combination model, is first constructed to fit the changing trends and the featured seasonal fluctuation periods. In order to improve prediction accuracy and model adaptability, the grey model is repeatedly applied to fit the remnant tail time series of the main combination model until the prediction accuracy is satisfactory. The modelling approach is applied to a logistics company engaged in a nationwide less-than-truckload road transportation business in China. The results demonstrate that the proposed modelling approach produces good forecasting results and goodness of fit, also showing good model adaptability to the analysed object in a changing macro environment. This fact makes this modelling approach an option to analyse the short-term transportation demand of an individual logistics company.
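
    For orientation, a minimal sketch of the basic GM(1,1) grey model that such combined approaches build on; the periodic-extension terms and remnant correction described in the abstract are omitted, and the demand series below is invented.

```python
# Minimal GM(1,1) sketch; the demand series is invented for illustration.
import numpy as np

def gm11_forecast(x, n_ahead=3):
    """Fit GM(1,1) to a positive series x and forecast n_ahead further values."""
    x = np.asarray(x, dtype=float)
    x1 = np.cumsum(x)                                   # accumulated generating operation
    z = 0.5 * (x1[1:] + x1[:-1])                        # background (mean) values
    B = np.column_stack([-z, np.ones(len(z))])
    a, b = np.linalg.lstsq(B, x[1:], rcond=None)[0]     # development / control coefficients
    k = np.arange(1, len(x) + n_ahead)
    x1_hat = (x[0] - b / a) * np.exp(-a * k) + b / a    # time response of the cumulative series
    x_hat = np.diff(np.concatenate([[x[0]], x1_hat]))   # restore by first differencing
    return x_hat[len(x) - 1:]                           # values beyond the fitted sample

monthly_tonnage = [820, 860, 910, 965, 1010, 1080]      # illustrative demand series
print(gm11_forecast(monthly_tonnage, n_ahead=3))
```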

  8. Model-based human reliability analysis: prospects and requirements

    International Nuclear Information System (INIS)

    Mosleh, A.; Chang, Y.H.

    2004-01-01

    Major limitations of the conventional methods for human reliability analysis (HRA), particularly those developed for operator response analysis in probabilistic safety assessments (PSA) of nuclear power plants, are summarized as a motivation for the need and a basis for developing requirements for the next generation HRA methods. It is argued that a model-based approach that provides explicit cognitive causal links between operator behaviors and directly or indirectly measurable causal factors should be at the core of the advanced methods. An example of such a causal model is briefly reviewed; owing to its complexity and input requirements, it can currently be implemented only in a dynamic PSA environment. The computer simulation code developed for this purpose is also described briefly, together with current limitations in the models, data, and the computer implementation

  9. Memoirs of a Browser: A Cross-browser Detection Model for Privacy-breaching Extensions

    NARCIS (Netherlands)

    Giuffrida, C.; Ortolani, S.; Crispo, B.

    2012-01-01

    Web browsers are undoubtedly one of the most popular user applications. This is even more evident in recent times, with Google introducing a platform where the browser is the only application provided to the user. With their modular and extensible architecture, modern browsers are also an appealing

  10. Endogenous model state and parameter estimation from an extensive batch experiment

    NARCIS (Netherlands)

    Keesman, K.J.; Spanjers, H.

    2000-01-01

    In this paper an extensive batch experiment of endogenous process behavior in an aerobic biodegradation process is presented. From these experimental data, comprising measurements of MLVSS (mixed liquor volatile suspended solids) and respiration rate, in a first step the states and unknown

  11. NASA Standard for Models and Simulations: Philosophy and Requirements Overview

    Science.gov (United States)

    Blattnig, Steve R.; Luckring, James M.; Morrison, Joseph H.; Sylvester, Andre J.; Tripathi, Ram K.; Zang, Thomas A.

    2013-01-01

    Following the Columbia Accident Investigation Board report, the NASA Administrator chartered an executive team (known as the Diaz Team) to identify those CAIB report elements with NASA-wide applicability and to develop corrective measures to address each element. One such measure was the development of a standard for the development, documentation, and operation of models and simulations. This report describes the philosophy and requirements overview of the resulting NASA Standard for Models and Simulations.

  12. Critical Business Requirements Model and Metrics for Intranet ROI

    OpenAIRE

    Luqi; Jacoby, Grant A.

    2005-01-01

    Journal of Electronic Commerce Research, Vol. 6, No. 1, pp. 1-30. This research provides the first theoretical model, the Intranet Efficiency and Effectiveness Model (IEEM), to measure intranet overall value contributions based on a corporation’s critical business requirements by applying a balanced baseline of metrics and conversion ratios linked to key business processes of knowledge workers, IT managers and business decision makers -- in effect, closing the gap of understanding...

  13. Coupling R and PHREEQC: an interactive and extensible environment for efficient programming of geochemical models

    Science.gov (United States)

    De Lucia, Marco; Kühn, Michael

    2013-04-01

    manipulations and visualization in a powerful high level language, and benefiting from an enormous amount of third-party open source R extensions. The possibility to rapidly prototype complex algorithms involving geochemical modelling is in our opinion a huge advantage. A demonstration is given by the successful evaluation of a strategy to reduce the CPU-time needed to perform reactive transport simulations in a sequential coupling scheme. The idea is the "reduction" of the number of actual chemical simulations to perform at every time step, by searching for "duplicates" of each chemical simulation in the grid: such a comparison typically involves a huge number of elements (one chemical simulation per grid element per time step) and a quite large number of variables (concentrations and mineral abundances). However, through the straightforward implementation of the prototype algorithm through the R/PHREEQC interface, we found that the scan is extremely cost-effective in terms of CPU-time and typically allows a substantial speedup for simulations starting from a homogeneous or zone-homogeneous state. This speedup can even greatly exceed that of parallelization in some favorable but not infrequent cases. This feature should therefore be implemented in reactive transport simulators. References [1] Parkhurst D, Appelo C (1999) Users guide to PHREEQC (version 2). Tech. rep, U.S. Geological Survey. [2] Beyer C, Li D, De Lucia M, Kühn M, Bauer S (2012): Modelling CO2-induced fluid-rock interactions in the Altensalzwedel gas reservoir. Part II: coupled reactive transport simulation. Environ. Earth Sci., 67, 2, 573-588. [3] R Core Team (2012) R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. ISBN 3-900051-07-0, URL http://www.R-project.org/. [4] Kühn M, Münch U (2012) CLEAN: CO2 Large-Scale Enhanced Gas Recovery. GEOTECHNOLOGIEN Science Report No. 19. Series: Advanced Technologies in Earth Sciences, 199 p, ISBN 978-3-642-31676-0.
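
    A hedged sketch of the "duplicate scan" idea described above (not the authors' R/PHREEQC code): before calling the geochemical solver for each grid element, identical chemical states are grouped so that each distinct state is solved only once per time step.

```python
# Hedged sketch of the duplicate scan; `solver` stands in for a call to PHREEQC.
def run_chemistry_step(states, solver, ndigits=10):
    """states: iterable of concentration tuples; returns one result per input state."""
    cache = {}
    results = []
    for state in states:
        key = tuple(round(v, ndigits) for v in state)   # tolerate floating-point noise
        if key not in cache:
            cache[key] = solver(state)                  # the expensive geochemical call
        results.append(cache[key])
    return results

dummy_solver = lambda s: tuple(v * 0.99 for v in s)     # placeholder for the real solver
grid = [(1.0e-3, 2.5e-4)] * 10000                       # homogeneous initial grid state
print(len(run_chemistry_step(grid, dummy_solver)))      # 10000 results from a single solver call
```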

  14. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  15. Models of protein and amino acid requirements for cattle

    Directory of Open Access Journals (Sweden)

    Luis Orlindo Tedeschi

    2015-03-01

    Full Text Available Protein supply and requirements by ruminants have been studied for more than a century. These studies led to the accumulation of a large body of scientific information about digestion and metabolism of protein by ruminants as well as the characterization of the dietary protein in order to maximize animal performance. During the 1980s and 1990s, when computers became more accessible and powerful, scientists began to conceptualize and develop mathematical nutrition models, and to program them into computers to assist with ration balancing and formulation for domesticated ruminants, specifically dairy and beef cattle. The most commonly known nutrition models developed during this period were those of the National Research Council (NRC) in the United States, the Agricultural Research Council (ARC) in the United Kingdom, the Institut National de la Recherche Agronomique (INRA) in France, and the Commonwealth Scientific and Industrial Research Organization (CSIRO) in Australia. Others were derivative works from these models with different degrees of modifications in the supply or requirement calculations, and the modeling nature (e.g., static or dynamic, mechanistic, or deterministic). Circa the 1990s, most models adopted the metabolizable protein (MP) system over the crude protein (CP) and digestible CP systems to estimate the supply of MP and the factorial system to calculate the MP required by the animal. The MP system included two portions of protein (i.e., the rumen-undegraded dietary CP - RUP - and the contributions of microbial CP - MCP) as the main sources of MP for the animal. Some models would explicitly account for the impact of dry matter intake (DMI) on the MP required for maintenance (MPm; e.g., the Cornell Net Carbohydrate and Protein System - CNCPS, the Dutch system - DVE/OEB), while others would simply account for scurf, urinary, metabolic fecal, and endogenous contributions independently of DMI. All models included milk yield and its components in estimating MP required for lactation

  16. WTC deafness Kyoto (dfk): a rat model for extensive investigations of Kcnq1 functions.

    Science.gov (United States)

    Gohma, Hiroshi; Kuramoto, Takashi; Kuwamura, Mitsuru; Okajima, Ryoko; Tanimoto, Noriaki; Yamasaki, Ken-ichi; Nakanishi, Satoshi; Kitada, Kazuhiro; Makiyama, Takeru; Akao, Masaharu; Kita, Toru; Sasa, Masashi; Serikawa, Tadao

    2006-02-14

    KCNQ1 forms K+ channels by assembly with regulatory subunit KCNE proteins and plays a key role in the K+ homeostasis in a variety of tissues. In the heart, KCNQ1 is coassembled with KCNE1 to produce a cardiac delayed rectifier K+ current. In the inner ear, the KCNQ1/KCNE1 complex maintains the high concentration of K+ in the endolymph. In the stomach, KCNQ1 is coassembled with KCNE2 to form the K+ efflux channel that is essential for gastric acid secretion. In the colon and small intestine, KCNQ1 is coassembled with KCNE3 to play an important role in transepithelial cAMP-stimulated Cl- secretion. For further understanding of Kcnq1 function in vivo, an animal model has been required. Here we report the identification of a coisogenic Kcnq1 mutant rat, named deafness Kyoto (dfk), and the characterization of its phenotypes. WTC-dfk rats carried an intragenic deletion at the Kcnq1 gene and showed impaired gain of weight, deafness, and imbalance resulting from the marked reduction of endolymph, prolonged QT interval in the electrocardiogram (ECG), and gastric achlorhydria associated with hypertrophic gastric mucosa. Surprisingly, WTC-dfk rats showed hypertension, which suggested that Kcnq1 might be involved in the regulation of blood pressure. These findings suggest that WTC-dfk rats could represent a powerful tool for studying the physiological functions of KCNQ1 and for the establishment of new therapeutic procedures for Kcnq1-related diseases.

  17. New prediction model for probe specificity in an allele-specific extension reaction for haplotype-specific extraction (HSE) of Y chromosome mixtures.

    Directory of Open Access Journals (Sweden)

    Jessica Rothe

    Full Text Available Allele-specific extension reactions (ASERs) use 3' terminus-specific primers for the selective extension of completely annealed matches by polymerase. The ability of the polymerase to extend non-specific 3' terminal mismatches leads to a failure of the reaction, a process that is only partly understood and predictable, and often requires time-consuming assay design. In our studies we investigated haplotype-specific extraction (HSE) for the separation of male DNA mixtures. HSE is an ASER and provides the ability to distinguish between diploid chromosomes from one or more individuals. Here, we show that the success of HSE and allele-specific extension depends strongly on the concentration difference between complete match and 3' terminal mismatch. Using the oligonucleotide-modeling platform Visual Omp, we demonstrated the dependency of the discrimination power of the polymerase on match- and mismatch-target hybridization for different probe lengths. Therefore, the probe specificity in HSE could be predicted by performing a relative comparison of different probe designs with their simulated differences between the duplex concentrations of target-probe match and mismatches. We tested this new model for probe design in more than 300 HSE reactions with 137 different probes and obtained a concordance of 88%.

  18. Atmospheric disturbance modelling requirements for flying qualities applications

    Science.gov (United States)

    Moorhouse, D. J.

    1978-01-01

    Flying qualities are defined as those airplane characteristics which govern the ease or precision with which the pilot can accomplish the mission. Some atmospheric disturbance modelling requirements for aircraft flying qualities applications are reviewed. It is concluded that some simplifications are justified in identifying the primary influence on aircraft response and pilot control. It is recommended that a universal environmental model be developed, which could form the reference for different applications. This model should include the latest information on winds, turbulence, gusts, visibility, icing and precipitation. A chosen model would be kept by a national agency and updated regularly by feedback from users. A user manual is believed to be an essential part of such a model.

  19. An extension of the talbot-ogden hydrology model to an affine multi-dimensional moisture content domain

    KAUST Repository

    Yu, Han

    2013-09-01

    The Talbot-Ogden hydrology model provides a fast, mass conservative method to compute infiltration in unsaturated soils. As a replacement for a model based on the Richards equation, it separates the groundwater movement into infiltration and redistribution for every time step. The typical feature making this method fast is the discretization of the moisture content domain rather than the spatial one. The Talbot-Ogden model rapidly determines only how well groundwater and aquifers are recharged. Hence, it differs from models based on advanced reservoir modeling, which are uniformly far more expensive computationally since they instead determine where the water moves in space, a completely different and more complex problem. Following the pore-size distribution curves of many soils, this paper extends the one-dimensional moisture content domain into a two-dimensional one by keeping the vertical spatial axis. The proposed extension can describe any pore-size or porosity distribution as an important soil feature. Based on this extension, infiltration and redistribution are restudied. The unconditional conservation of mass in the Talbot-Ogden model is inherited in this extended model. A numerical example is given for the extended model.
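
    A very schematic sketch of the moisture-content discretization idea only: the moisture-content axis is split into bins, each bin tracks its own wetting-front depth, and infiltrated mass is accounted for as sum(d_theta * depth). The front-velocity expression below is a simple Green-Ampt-like placeholder, not the published Talbot-Ogden formula, and all soil parameters are invented.

```python
# Schematic sketch only: moisture-content bins with per-bin wetting-front depths.
# The front velocity is a Green-Ampt-like placeholder, not the Talbot-Ogden formula.
import numpy as np

theta_d, theta_s = 0.08, 0.40                 # residual and saturated moisture content (assumed)
n_bins = 32
theta = np.linspace(theta_d, theta_s, n_bins + 1)[1:]   # moisture-content bins
d_theta = (theta_s - theta_d) / n_bins
depth = np.zeros(n_bins)                      # wetting-front depth per bin (m)

K_s, G = 2.5e-6, 0.15                         # saturated conductivity (m/s), capillary drive (m)
def K(t):                                     # illustrative unsaturated conductivity curve
    return K_s * ((t - theta_d) / (theta_s - theta_d)) ** 4

dt = 60.0                                     # time step (s)
for _ in range(60):                           # one hour of ponded infiltration
    v = K(theta) / (theta - theta_d) * (1.0 + G / np.maximum(depth, 1e-3))
    depth += v * dt

print("infiltrated water depth (m):", float(np.sum(d_theta * depth)))
```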

  20. Information Models, Data Requirements, and Agile Data Curation

    Science.gov (United States)

    Hughes, John S.; Crichton, Dan; Ritschel, Bernd; Hardman, Sean; Joyner, Ron

    2015-04-01

    The Planetary Data System's next generation system, PDS4, is an example of the successful use of an ontology-based Information Model (IM) to drive the development and operations of a data system. In traditional systems engineering, requirements or statements about what is necessary for the system are collected and analyzed for input into the design stage of systems development. With the advent of big data, the requirements associated with data have begun to dominate, and an ontology-based information model can be used to provide a formalized and rigorous set of data requirements. These requirements address not only the usual issues of data quantity, quality, and disposition but also data representation, integrity, provenance, context, and semantics. In addition, the use of these data requirements during system development has many characteristics of Agile Curation as proposed by Young et al. [Taking Another Look at the Data Management Life Cycle: Deconstruction, Agile, and Community, AGU 2014], namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. For example, customers can be satisfied through early and continuous delivery of system software and services that are configured directly from the information model. This presentation will describe the PDS4 architecture and its three principal parts: the ontology-based Information Model (IM), the federated registries and repositories, and the REST-based service layer for search, retrieval, and distribution. The development of the IM will be highlighted with special emphasis on knowledge acquisition, the impact of the IM on development and operations, and the use of shared ontologies at multiple governance levels to promote system interoperability and data correlation.

  1. Mathematical Formulation Requirements and Specifications for the Process Models

    International Nuclear Information System (INIS)

    Steefel, C.; Moulton, D.; Pau, G.; Lipnikov, K.; Meza, J.; Lichtner, P.; Wolery, T.; Bacon, D.; Spycher, N.; Bell, J.; Moridis, G.; Yabusaki, S.; Sonnenthal, E.; Zyvoloski, G.; Andre, B.; Zheng, L.; Davis, J.

    2010-01-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) is intended to be a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM program is aimed at addressing critical EM program needs to better understand and quantify flow and contaminant transport behavior in complex geological systems. It will also address the long-term performance of engineered components including cementitious materials in nuclear waste disposal facilities, in order to reduce uncertainties and risks associated with DOE EM's environmental cleanup and closure activities. Building upon national capabilities developed from decades of Research and Development in subsurface geosciences, computational and computer science, modeling and applied mathematics, and environmental remediation, the ASCEM initiative will develop an integrated, open-source, high-performance computer modeling system for multiphase, multicomponent, multiscale subsurface flow and contaminant transport. This integrated modeling system will incorporate capabilities for predicting releases from various waste forms, identifying exposure pathways and performing dose calculations, and conducting systematic uncertainty quantification. The ASCEM approach will be demonstrated on selected sites, and then applied to support the next generation of performance assessments of nuclear waste disposal and facility decommissioning across the EM complex. The Multi-Process High Performance Computing (HPC) Simulator is one of three thrust areas in ASCEM. The other two are the Platform and Integrated Toolsets (dubbed the Platform) and Site Applications. The primary objective of the HPC Simulator is to provide a flexible and extensible computational engine to simulate the coupled processes and flow scenarios described by the conceptual models developed using the ASCEM Platform. The graded and iterative approach to assessments naturally

  2. An Extension of a Parallel-Distributed Processing Framework of Reading Aloud in Japanese: Human Nonword Reading Accuracy Does Not Require a Sequential Mechanism

    Science.gov (United States)

    Ikeda, Kenji; Ueno, Taiji; Ito, Yuichi; Kitagami, Shinji; Kawaguchi, Jun

    2017-01-01

    Humans can pronounce a nonword (e.g., rint). Some researchers have interpreted this behavior as requiring a sequential mechanism by which a grapheme-phoneme correspondence rule is applied to each grapheme in turn. However, several parallel-distributed processing (PDP) models in English have simulated human nonword reading accuracy without a…

  3. Extension of landscape-based population viability models to ecoregional scales for conservation planning

    Science.gov (United States)

    Thomas W. Bonnot; Frank R. III Thompson; Joshua Millspaugh

    2011-01-01

    Landscape-based population models are potentially valuable tools in facilitating conservation planning and actions at large scales. However, such models have rarely been applied at ecoregional scales. We extended landscape-based population models to ecoregional scales for three species of concern in the Central Hardwoods Bird Conservation Region and compared model...

  4. The Effects of Different Aspects of Tourism Services on Travelers' Quality of Life: Model Validation, Refinement, and Extension

    OpenAIRE

    Neal, Janet Davis

    2000-01-01

    Numerous satisfaction studies have been conducted in both tourism and marketing which have examined various aspects of travelers and/or consumers. Quality of life satisfaction studies look beyond the types of satisfaction experiences that endure for only a short time to those that "spill over" into individuals' life domains ...

  5. A Generative Probabilistic Model and Discriminative Extensions for Brain Lesion Segmentation - With Application to Tumor and Stroke

    DEFF Research Database (Denmark)

    Menze, Bjoern H.; Van Leemput, Koen; Lashkari, Danial

    2016-01-01

    We introduce a generative probabilistic model for segmentation of brain lesions in multi-dimensional images that generalizes the EM segmenter, a common approach for modelling brain images using Gaussian mixtures and a probabilistic tissue atlas that employs expectation-maximization (EM)... jointly from the image data. It delineates lesion areas individually in each channel, allowing for differences in lesion appearance across modalities, an important feature of many brain tumor imaging sequences. We also propose discriminative model extensions to map the output of the generative model... patient scans, and multimodal brain images of patients with acute and subacute ischemic stroke. We find the generative model that has been designed for tumor lesions to generalize well to stroke images, and the extended discriminative-discriminative model to be one of the top ranking methods in the BRATS...

  6. Effect Displays in R for Multinomial and Proportional-Odds Logit Models: Extensions to the effects Package

    Directory of Open Access Journals (Sweden)

    John Fox

    2009-10-01

    Full Text Available Based on recent work by Fox and Andersen (2006), this paper describes substantial extensions to the effects package for R to construct effect displays for multinomial and proportional-odds logit models. The package previously was limited to linear and generalized linear models. Effect displays are tabular and graphical representations of terms — typically high-order terms — in a statistical model. For polytomous logit models, effect displays depict fitted category probabilities under the model, and can include point-wise confidence envelopes for the effects. The construction of effect displays by functions in the effects package is essentially automatic. The package provides several kinds of displays for polytomous logit models.
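
    A rough Python analogue of an effect display for a multinomial logit (the R effects package itself is not used here, and the simulated data and coefficient values are illustrative): fit the model, vary one predictor over a grid while holding the other at its mean, and inspect fitted category probabilities.

```python
# Rough Python analogue of an effect display for a multinomial logit using statsmodels.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 1000
x1, x2 = rng.normal(size=n), rng.normal(size=n)
lin = np.column_stack([np.zeros(n), 1.5 * x1 - x2, -x1 + 0.5 * x2])   # 3 response categories
p = np.exp(lin) / np.exp(lin).sum(axis=1, keepdims=True)
y = np.array([rng.choice(3, p=pi) for pi in p])

X = sm.add_constant(np.column_stack([x1, x2]))
fit = sm.MNLogit(y, X).fit(disp=False)

grid = np.linspace(-2, 2, 5)                          # values of x1 for the display
X_display = sm.add_constant(
    np.column_stack([grid, np.full_like(grid, x2.mean())]), has_constant='add')
print(fit.predict(X_display))                         # fitted category probabilities per x1 value
```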

  7. Palo Podrido: Model for Extensive Delignification of Wood by Ganoderma applanatum.

    Science.gov (United States)

    Dill, I; Kraepelin, G

    1986-12-01

    Chemical and micromorphological analysis revealed that South Chilean "palo podrido" results from a white-rot fungus that causes highly selective and extensive delignification. Palo podrido samples from 10 different hardwood trunks (Eucryphia cordifolia, Drimys winteri, and Nothofagus dombeyi) decayed by Ganoderma applanatum were analyzed. Of 14 samples, 11 had extremely low Klason lignin values, ranging from 6.1 to 0.4% (dry weight). The most remarkable and unusual feature was that delignification and defibration were not restricted to small pockets but extended throughout large areas in the interior of trunks subjected to undisturbed rotting over long periods of time. Comparative analysis of water content, swelling capacity, and lignin content led to the conclusion that besides lignin degradation, suppression of the cellulolytic activity of the rotting organisms plays a decisive role. Among various nutrients added to a palo podrido sample (3% residual Klason lignin), the nitrogen source was the only one leading to almost complete cellulose degradation. We suggest that the extremely low nitrogen content (0.037 to 0.073% [dry weight]) of the investigated wood species was the primary cause for the extensive delignification as well as the concomitant suppression of cellulose breakdown. The low temperatures, high humidity, and microaerobic conditions maintained within the decaying trunks are discussed as additional ecological factors favoring delignification in South Chilean rain forests.

  8. Palo Podrido: Model for Extensive Delignification of Wood by Ganoderma applanatum

    Science.gov (United States)

    Dill, Ingrid; Kraepelin, Gunda

    1986-01-01

    Chemical and micromorphological analysis revealed that South Chilean “palo podrido” results from a white-rot fungus that causes highly selective and extensive delignification. Palo podrido samples from 10 different hardwood trunks (Eucryphia cordifolia, Drimys winteri, and Nothofagus dombeyi) decayed by Ganoderma applanatum were analyzed. Of 14 samples, 11 had extremely low Klason lignin values, ranging from 6.1 to 0.4% (dry weight). The most remarkable and unusual feature was that delignification and defibration were not restricted to small pockets but extended throughout large areas in the interior of trunks subjected to undisturbed rotting over long periods of time. Comparative analysis of water content, swelling capacity, and lignin content led to the conclusion that besides lignin degradation, suppression of the cellulolytic activity of the rotting organisms plays a decisive role. Among various nutrients added to a palo podrido sample (3% residual Klason lignin), the nitrogen source was the only one leading to almost complete cellulose degradation. We suggest that the extremely low nitrogen content (0.037 to 0.073% [dry weight]) of the investigated wood species was the primary cause for the extensive delignification as well as the concomitant suppression of cellulose breakdown. The low temperatures, high humidity, and microaerobic conditions maintained within the decaying trunks are discussed as additional ecological factors favoring delignification in South Chilean rain forests. Images PMID:16347235

  9. Towards an extensible core model for Digital Rights Management in VDM

    DEFF Research Database (Denmark)

    Lauritsen, Rasmus Winther; Lorenzen, Lasse

    2012-01-01

    In this article two views on DRM are presented and modelled in VDM. The contribution from this modelling process is two-fold. A set of properties that are of interest while designing DRM systems are presented. Then, the two models are compared to elaborate our understanding of DRM elements and th...

  10. Modelling human resource requirements for the nuclear industry in Europe

    Energy Technology Data Exchange (ETDEWEB)

    Roelofs, Ferry [Nuclear Research and Consultancy Group (NRG) (Netherlands); Flore, Massimo; Estorff, Ulrik von [Joint Research Center (JRC) (Netherlands)

    2017-11-15

    The European Human Resource Observatory for Nuclear (EHRO-N) provides the European Commission with essential data related to supply and demand for nuclear experts in the EU-28 and the enlargement and integration countries based on bottom-up information from the nuclear industry. The objective is to assess how the supply of experts for the nuclear industry responds to the needs for the same experts for present and future nuclear projects in the region. Complementary to the bottom-up approach taken by the EHRO-N team at JRC, a top-down modelling approach has been taken in a collaboration with NRG in the Netherlands. This top-down modelling approach focuses on the human resource requirements for operation, construction, decommissioning, and efforts for long term operation of nuclear power plants. This paper describes the top-down methodology, the model input, the main assumptions, and the results of the analyses.
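
    A hedged sketch of what a top-down human-resource estimate can look like: multiply the number of plants in each lifecycle phase by an assumed staffing figure per plant. The figures below are invented for illustration, not EHRO-N/NRG numbers.

```python
# Hedged sketch of a top-down staffing estimate; per-plant figures are invented.
STAFF_PER_PLANT = {"construction": 2000, "operation": 700, "decommissioning": 300}

def staffing_demand(fleet):
    """fleet: dict mapping lifecycle phase -> number of plants in that phase."""
    return sum(STAFF_PER_PLANT[phase] * n for phase, n in fleet.items())

# Example fleet: 4 units under construction, 100 operating, 10 being decommissioned.
print(staffing_demand({"construction": 4, "operation": 100, "decommissioning": 10}))
```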

  11. Formal Requirements Modeling for Reactive Systems with Coloured Petri Nets

    DEFF Research Database (Denmark)

    Tjell, Simon

    ... This is important because it represents the identification of what is being designed (the reactive system), and what is given and being made assumptions about (the environment). The representation of the environment is further partitioned to distinguish human actors from non-human actors. This allows the modeler... An approach to addressing the problem of validating formal requirements models through interactive graphical animations is presented. Executable Use Cases (EUCs) provide a framework for integrating three tiers of descriptions of specifications and environment assumptions: the lower tier is an informal description... to distinguish the modeling artifacts describing the environment from those describing the specifications for a reactive system. The formalization allows for clear identification of interfaces between interacting domains, where the interaction takes place through an abstraction of possibly parameterized states...

  12. Mathematical Formulation Requirements and Specifications for the Process Models

    Energy Technology Data Exchange (ETDEWEB)

    Steefel, C.; Moulton, D.; Pau, G.; Lipnikov, K.; Meza, J.; Lichtner, P.; Wolery, T.; Bacon, D.; Spycher, N.; Bell, J.; Moridis, G.; Yabusaki, S.; Sonnenthal, E.; Zyvoloski, G.; Andre, B.; Zheng, L.; Davis, J.

    2010-11-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) is intended to be a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM program is aimed at addressing critical EM program needs to better understand and quantify flow and contaminant transport behavior in complex geological systems. It will also address the long-term performance of engineered components including cementitious materials in nuclear waste disposal facilities, in order to reduce uncertainties and risks associated with DOE EM's environmental cleanup and closure activities. Building upon national capabilities developed from decades of Research and Development in subsurface geosciences, computational and computer science, modeling and applied mathematics, and environmental remediation, the ASCEM initiative will develop an integrated, open-source, high-performance computer modeling system for multiphase, multicomponent, multiscale subsurface flow and contaminant transport. This integrated modeling system will incorporate capabilities for predicting releases from various waste forms, identifying exposure pathways and performing dose calculations, and conducting systematic uncertainty quantification. The ASCEM approach will be demonstrated on selected sites, and then applied to support the next generation of performance assessments of nuclear waste disposal and facility decommissioning across the EM complex. The Multi-Process High Performance Computing (HPC) Simulator is one of three thrust areas in ASCEM. The other two are the Platform and Integrated Toolsets (dubbed the Platform) and Site Applications. The primary objective of the HPC Simulator is to provide a flexible and extensible computational engine to simulate the coupled processes and flow scenarios described by the conceptual models developed using the ASCEM Platform. The graded and iterative approach to assessments

  13. Extension of the application of Conway-Maxwell-Poisson models: analyzing traffic crash data exhibiting underdispersion.

    Science.gov (United States)

    Lord, Dominique; Geedipally, Srinivas Reddy; Guikema, Seth D

    2010-08-01

    The objective of this article is to evaluate the performance of the COM-Poisson GLM for analyzing crash data exhibiting underdispersion (when conditional on the mean). The COM-Poisson distribution, originally developed in 1962, has recently been reintroduced by statisticians for analyzing count data subjected to either over- or underdispersion. Over the last year, the COM-Poisson GLM has been evaluated in the context of crash data analysis and it has been shown that the model performs as well as the Poisson-gamma model for crash data exhibiting overdispersion. To accomplish the objective of this study, several COM-Poisson models were estimated using crash data collected at 162 railway-highway crossings in South Korea between 1998 and 2002. This data set has been shown to exhibit underdispersion when models linking crash data to various explanatory variables are estimated. The modeling results were compared to those produced from the Poisson and gamma probability models documented in a previous published study. The results of this research show that the COM-Poisson GLM can handle crash data when the modeling output shows signs of underdispersion. Finally, they also show that the model proposed in this study provides better statistical performance than the gamma probability and the traditional Poisson models, at least for this data set.
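
    For reference, the Conway-Maxwell-Poisson pmf is P(Y = y) ∝ λ^y / (y!)^ν, with ν > 1 producing underdispersion; a small numeric sketch (not the authors' GLM code, with illustrative λ and ν values) is given below.

```python
# Numeric sketch of the COM-Poisson distribution (log scale for stability);
# lambda and nu values are illustrative.
import math

def com_poisson_pmf(y, lam, nu, max_terms=150):
    """P(Y = y) = lam^y / (y!)^nu / Z(lam, nu), truncating Z at max_terms."""
    log_terms = [j * math.log(lam) - nu * math.lgamma(j + 1) for j in range(max_terms)]
    m = max(log_terms)
    log_z = m + math.log(sum(math.exp(t - m) for t in log_terms))
    return math.exp(y * math.log(lam) - nu * math.lgamma(y + 1) - log_z)

def mean_var(lam, nu, max_terms=150):
    probs = [com_poisson_pmf(y, lam, nu, max_terms) for y in range(max_terms)]
    mean = sum(y * p for y, p in enumerate(probs))
    var = sum((y - mean) ** 2 * p for y, p in enumerate(probs))
    return mean, var

for nu in (0.5, 1.0, 2.0):                 # over-, equi-, and under-dispersed cases
    m, v = mean_var(lam=3.0, nu=nu)
    print(f"nu = {nu}: mean = {m:.2f}, variance = {v:.2f}, var/mean = {v / m:.2f}")
```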

  14. Advanced Models and Controls for Prediction and Extension of Battery Lifetime (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Smith, K.; Wood, E.; Santhanagopalan, S.; Kim, G.; Pesaran, A.

    2014-02-01

    Predictive models of capacity and power fade must consider a multiplicity of degradation modes experienced by Li-ion batteries in the automotive environment. Lacking accurate models and tests, lifetime uncertainty must presently be absorbed by overdesign and excess warranty costs. To reduce these costs and extend life, degradation models are under development that predict lifetime more accurately and with less test data. The lifetime models provide engineering feedback for cell, pack and system designs and are being incorporated into real-time control strategies.

  15. Experimental development based on mapping rule between requirements analysis model and web framework specific design model.

    Science.gov (United States)

    Okuda, Hirotaka; Ogata, Shinpei; Matsuura, Saeko

    2013-12-01

    Model Driven Development is a promising approach to develop high quality software systems. We have proposed a method of model-driven requirements analysis using Unified Modeling Language (UML). The main feature of our method is to automatically generate a Web user interface prototype from the UML requirements analysis model so that we can confirm the validity of input/output data for each page and page transition on the system by directly operating the prototype. We propose a mapping rule in which design information independent of each web application framework implementation is defined based on the requirements analysis model, so as to improve the traceability to the final product from the valid requirements analysis model. This paper discusses the result of applying our method to the development of a Group Work Support System that is currently running in our department.

  16. The dual pathway model of overeating. Replication and extension with actual food consumption

    NARCIS (Netherlands)

    Ouwens, M.A.; Strien, T. van; Leeuwe, J.F.J. van; Staak, C.P.F. van der

    2009-01-01

    van Strien et al. [van Strien, T., Engels, R. C. M. E., van Leeuwe, J., Snoek, H. M. (2005). The Stice model of overeating: tests in clinical and non-clinical samples. Appetite, 45, 205–213] extended the negative affect pathway of Stice's dual pathway model of overeating Stice [Stice, E. (1994).

  17. The dual pathway model of overeating. Replication and extension with actual food consumption

    NARCIS (Netherlands)

    Ouwens, Machteld A; van Strien, T; Leeuwe, J.F.J.; van der Staak, C P F

    van Strien et al. [van Strien, T., Engels, R. C. M. E., van Leeuwe, J., Snoek, H. M. (2005). The Stice model of overeating: tests in clinical and non-clinical samples. Appetite, 45, 205-213] extended the negative affect pathway of Stice's dual pathway model of overeating Stice [Stice, E. (1994).

  18. The Job Demands-Resources Model in China: Validation and Extension

    NARCIS (Netherlands)

    Hu, Q.

    2014-01-01

    The Job Demands-Resources (JD-R) Model assumes that employee health and well-being result from the interplay between job demands and job resources. Based on its open, heuristic nature, the JD-R model can be applied to various occupational settings, irrespective of the particular demands and resources

  19. Motivation Monitoring and Assessment Extension for Input-Process-Outcome Game Model

    Science.gov (United States)

    Ghergulescu, Ioana; Muntean, Cristina Hava

    2014-01-01

    This article proposes a Motivation Assessment-oriented Input-Process-Outcome Game Model (MotIPO), which extends the Input-Process-Outcome game model with game-centred and player-centred motivation assessments performed right from the beginning of the game-play. A feasibility case-study involving 67 participants playing an educational game and…

  20. Multidimensional Extension of Multiple Indicators Multiple Causes Models to Detect DIF

    Science.gov (United States)

    Lee, Soo; Bulut, Okan; Suh, Youngsuk

    2017-01-01

    A number of studies have found multiple indicators multiple causes (MIMIC) models to be an effective tool in detecting uniform differential item functioning (DIF) for individual items and item bundles. A recently developed MIMIC-interaction model is capable of detecting both uniform and nonuniform DIF in the unidimensional item response theory…

  1. Extension of a biochemical model for the generalized stoichiometry of electron transport limited C3 photosynthesis

    NARCIS (Netherlands)

    Yin, X.; Oijen, van M.; Schapendonk, A.H.C.M.

    2004-01-01

    The widely used steady-state model of Farquhar et al. (Planta 149: 78-90, 1980) for C-3 photosynthesis was developed on the basis of linear whole-chain (non-cyclic) electron transport. In this model, calculation of the RuBP-regeneration limited CO2-assimilation rate depends on whether it is

  2. Modeling goals and functions of control and safety systems - theoretical foundations and extensions of MFM

    International Nuclear Information System (INIS)

    Lind, M.

    2005-10-01

    Multilevel Flow Modeling (MFM) has proven to be an effective modeling tool for reasoning about plant failure and control strategies and is currently exploited for operator support in diagnosis and on-line alarm analysis. Previous MFM research was focussed on representing goals and functions of process plants which generate, transform and distribute mass and energy. However, only a limited consideration has been given to the problems of modeling the control systems. Control functions are indispensable for operating any industrial plant. But modeling of control system functions has proven to be a more challenging problem than modeling functions of energy and mass processes. The problems were discussed by Lind and tentative solutions have been proposed, but these had not been investigated in depth until recently, partly due to the lack of an appropriate theoretical foundation. The purposes of the present report are to show that such a theoretical foundation for modeling goals and functions of control systems can be built from concepts and theories of action developed by Von Wright, and to show how the theoretical foundation can be used to extend MFM with concepts for modeling control systems. The theoretical foundations have been presented in detail elsewhere by the present author without the particular focus on modeling control actions and MFM adopted here. (au)

  3. Extension of Petri Nets by Aspects to Apply the Model Driven Architecture Approach

    NARCIS (Netherlands)

    Roubtsova, E.E.; Aksit, Mehmet

    2005-01-01

    Within MDA models are usually created in the UML. However, one may prefer to use different notations such as Petri-nets, for example, for modelling concurrency and synchronization properties of systems. This paper claims that techniques that are adopted within the context of MDA can also be

  4. PSpice Modeling Platform for SiC Power MOSFET Modules with Extensive Experimental Validation

    DEFF Research Database (Denmark)

    Ceccarelli, Lorenzo; Iannuzzo, Francesco; Nawaz, Muhammad

    2016-01-01

    The aim of this work is to present a PSpice implementation for a well-established and compact physics-based SiC MOSFET model, including a fast, experimental-based parameter extraction procedure in a MATLAB GUI environment. The model, originally meant for single-die devices, has been used...

  5. Modeling goals and functions of control and safety systems -theoretical foundations and extensions of MFM

    Energy Technology Data Exchange (ETDEWEB)

    Lind, M. [Oersted - DTU, Kgs. Lyngby (Denmark)

    2005-10-01

    Multilevel Flow Modeling (MFM) has proven to be an effective modeling tool for reasoning about plant failure and control strategies and is currently exploited for operator support in diagnosis and on-line alarm analysis. Previous MFM research was focussed on representing goals and functions of process plants which generate, transform and distribute mass and energy. However, only a limited consideration has been given to the problems of modeling the control systems. Control functions are indispensable for operating any industrial plant. But modeling of control system functions has proven to be a more challenging problem than modeling functions of energy and mass processes. The problems were discussed by Lind and tentative solutions have been proposed, but these had not been investigated in depth until recently, partly due to the lack of an appropriate theoretical foundation. The purposes of the present report are to show that such a theoretical foundation for modeling goals and functions of control systems can be built from concepts and theories of action developed by Von Wright, and to show how the theoretical foundation can be used to extend MFM with concepts for modeling control systems. The theoretical foundations have been presented in detail elsewhere by the present author without the particular focus on modeling control actions and MFM adopted here. (au)

  6. Non-residential water demand model validated with extensive measurements and surveys

    NARCIS (Netherlands)

    Pieterse-Quirijns, I.; Blokker, E.J.M.; van der Blom, E.C.; Vreeburg, J.H.G.

    2013-01-01

    Existing Dutch guidelines for the design of the drinking water and hot water system of nonresidential buildings are based on outdated assumptions on peak water demand or on unfounded assumptions on hot water demand. They generally overestimate peak demand values required for the design of an

  7. mfpa: Extension of mfp using the ACD covariate transformation for enhanced parametric multivariable modeling.

    Science.gov (United States)

    Royston, Patrick; Sauerbrei, Willi

    2016-01-01

    In a recent article, Royston (2015, Stata Journal 15: 275-291) introduced the approximate cumulative distribution (acd) transformation of a continuous covariate x as a route toward modeling a sigmoid relationship between x and an outcome variable. In this article, we extend the approach to multivariable modeling by modifying the standard Stata program mfp. The result is a new program, mfpa, that has all the features of mfp plus the ability to fit a new model for user-selected covariates that we call fp1(p1, p2). The fp1(p1, p2) model comprises the best-fitting combination of a dimension-one fractional polynomial (fp1) function of x and an fp1 function of acd(x). We describe a new model-selection algorithm called function-selection procedure with acd transformation, which uses significance testing to attempt to simplify an fp1(p1, p2) model to a submodel, an fp1 or linear model in x or in acd(x). The function-selection procedure with acd transformation is related in concept to the fsp (fp function-selection procedure), which is an integral part of mfp and which is used to simplify a dimension-two (fp2) function. We describe the mfpa command and give univariable and multivariable examples with real data to demonstrate its use.
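
    To make the fractional-polynomial idea concrete (a generic FP1 sketch, not the Stata mfpa implementation), an FP1 function of a positive covariate x is b0 + b1*x^p, with the power p chosen from the conventional set {-2, -1, -0.5, 0, 0.5, 1, 2, 3}, where p = 0 denotes log(x). The sketch below selects the best-fitting FP1 power by ordinary least squares; the simulated data and the use of statsmodels are assumptions.

        import numpy as np
        import statsmodels.api as sm

        FP1_POWERS = [-2, -1, -0.5, 0, 0.5, 1, 2, 3]   # conventional FP1 power set

        def fp1_transform(x, p):
            # FP1 transformation of a positive covariate; p == 0 means log(x)
            x = np.asarray(x, dtype=float)
            return np.log(x) if p == 0 else x ** p

        def best_fp1(x, y):
            # fit y ~ fp1(x) for every candidate power and keep the smallest residual SS
            best = None
            for p in FP1_POWERS:
                res = sm.OLS(y, sm.add_constant(fp1_transform(x, p))).fit()
                if best is None or res.ssr < best[1]:
                    best = (p, res.ssr, res)
            return best

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            x = rng.uniform(0.5, 5.0, 300)
            y = 1.0 + 2.0 * np.sqrt(x) + rng.normal(scale=0.3, size=x.size)   # true power 0.5
            p, ssr, res = best_fp1(x, y)
            print("selected FP1 power:", p)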

  8. Review and Extension of Suitability Assessment Indicators of Weather Model Output for Analyzing Decentralized Energy Systems

    Directory of Open Access Journals (Sweden)

    Hans Schermeyer

    2015-12-01

    Full Text Available Electricity from renewable energy sources (RES-E) is gaining more and more influence in traditional energy and electricity markets in Europe and around the world. When modeling RES-E feed-in on a high temporal and spatial resolution, energy systems analysts frequently use data generated by numerical weather models as input since there is no spatially inclusive and comprehensive measurement data available. However, the suitability of such model data depends on the research questions at hand and should be inspected individually. This paper focuses on new methodologies to carry out a performance evaluation of solar irradiation data provided by a numerical weather model when investigating photovoltaic feed-in and effects on the electricity grid. Suitable approaches of time series analysis are identified from the literature and applied to both model and measurement data. The findings and limits of these approaches are illustrated and a new set of validation indicators is presented. These novel indicators complement the assessment by measuring relevant key figures in energy systems analysis: e.g., gradients in energy supply, maximum values and volatility. Thus, the results of this paper contribute to the scientific community of energy systems analysts and researchers who aim at modeling RES-E feed-in on a high temporal and spatial resolution using weather model data.
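
    As a loose illustration of the kind of indicators described (a generic sketch with invented example series, not the paper's indicator set or its irradiation data), standard error metrics can be complemented with measures of supply gradients (ramps), maxima and volatility when comparing a weather-model time series against measurements:

        import numpy as np

        def suitability_indicators(model, measured):
            # a few illustrative indicators for an RES-E feed-in time series
            err = model - measured
            ramps_m, ramps_o = np.diff(model), np.diff(measured)
            return {
                "rmse": np.sqrt(np.mean(err ** 2)),
                "bias": np.mean(err),
                "max_ratio": np.max(model) / np.max(measured),
                "mean_abs_ramp_model": np.mean(np.abs(ramps_m)),
                "mean_abs_ramp_measured": np.mean(np.abs(ramps_o)),
                "volatility_model": np.std(ramps_m),       # volatility as ramp spread
                "volatility_measured": np.std(ramps_o),
            }

        if __name__ == "__main__":
            t = np.linspace(0, 2 * np.pi, 96)               # one day at 15-minute resolution
            measured = np.clip(np.sin(t), 0, None)          # idealized irradiation profile
            model = measured + np.random.default_rng(4).normal(scale=0.05, size=t.size)
            for name, value in suitability_indicators(model, measured).items():
                print(f"{name:26s} {value: .4f}")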

  9. Optimized bit extraction using distortion modeling in the scalable extension of H.264/AVC.

    Science.gov (United States)

    Maani, Ehsan; Katsaggelos, Aggelos K

    2009-09-01

    The newly adopted scalable extension of H.264/AVC video coding standard (SVC) demonstrates significant improvements in coding efficiency in addition to an increased degree of supported scalability relative to the scalable profiles of prior video coding standards. Due to the complicated hierarchical prediction structure of the SVC and the concept of key pictures, content-aware rate adaptation of SVC bit streams to intermediate bit rates is a nontrivial task. The concept of quality layers has been introduced in the design of the SVC to allow for fast content-aware prioritized rate adaptation. However, existing quality layer assignment methods are suboptimal and do not consider all network abstraction layer (NAL) units from different layers for the optimization. In this paper, we first propose a technique to accurately and efficiently estimate the quality degradation resulting from discarding an arbitrary number of NAL units from multiple layers of a bitstream by properly taking drift into account. Then, we utilize this distortion estimation technique to assign quality layers to NAL units for a more efficient extraction. Experimental results show that a significant gain can be achieved by the proposed scheme.

  10. Efficient occupancy model-fitting for extensive citizen-science data

    Science.gov (United States)

    Morgan, Byron J. T.; Freeman, Stephen N.; Ridout, Martin S.; Brereton, Tom M.; Fox, Richard; Powney, Gary D.; Roy, David B.

    2017-01-01

    Appropriate large-scale citizen-science data present important new opportunities for biodiversity modelling, due in part to the wide spatial coverage of information. Recently proposed occupancy modelling approaches naturally incorporate random effects in order to account for annual variation in the composition of sites surveyed. In turn this leads to Bayesian analysis and model fitting, which are typically extremely time consuming. Motivated by presence-only records of occurrence from the UK Butterflies for the New Millennium data base, we present an alternative approach, in which site variation is described in a standard way through logistic regression on relevant environmental covariates. This allows efficient occupancy model-fitting using classical inference, which is easily achieved using standard computers. This is especially important when models need to be fitted each year, typically for many different species, as with British butterflies for example. Using both real and simulated data we demonstrate that the two approaches, with and without random effects, can result in similar conclusions regarding trends. There are many advantages to classical model-fitting, including the ability to compare a range of alternative models, identify appropriate covariates and assess model fit, using standard tools of maximum likelihood. In addition, modelling in terms of covariates provides opportunities for understanding the ecological processes that are in operation. We show that there is even greater potential; the classical approach allows us to construct regional indices simply, which indicate how changes in occupancy typically vary over a species’ range. In addition we are also able to construct dynamic occupancy maps, which provide a novel, modern tool for examining temporal changes in species distribution. These new developments may be applied to a wide range of taxa, and are valuable at a time of climate change. They also have the potential to motivate citizen
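
    For readers unfamiliar with the machinery, the classical (non-Bayesian) single-season occupancy likelihood combines a site-level occupancy probability psi_i, modelled here by logistic regression on covariates as in the paper, with a per-visit detection probability p: a site with at least one detection contributes psi_i * prod p^y (1-p)^(1-y), while an all-zero history contributes psi_i (1-p)^K + (1 - psi_i). The sketch below maximizes this likelihood with SciPy; the single covariate, constant detection probability and simulated detection histories are assumptions, not the butterfly analysis itself.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import expit

        def occupancy_negloglik(params, X, Y):
            # X: (sites, covariates incl. intercept); Y: (sites, visits) 0/1 detections
            beta, logit_p = params[:-1], params[-1]
            psi = expit(X @ beta)                    # occupancy probability per site
            p = expit(logit_p)                       # detection probability per visit
            k = Y.shape[1]
            det = Y.sum(axis=1)
            cond = p ** det * (1 - p) ** (k - det)   # P(history | occupied)
            lik = psi * cond + (1 - psi) * (det == 0)
            return -np.sum(np.log(lik))

        if __name__ == "__main__":
            rng = np.random.default_rng(2)
            n, k = 500, 4
            X = np.column_stack([np.ones(n), rng.normal(size=n)])
            z = rng.random(n) < expit(X @ np.array([0.2, 1.0]))   # latent occupancy
            Y = (rng.random((n, k)) < 0.4) * z[:, None]           # detections only where occupied
            fit = minimize(occupancy_negloglik, np.zeros(X.shape[1] + 1), args=(X, Y))
            print("occupancy coefficients and logit detection prob.:", fit.x)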

  11. Specification of advanced safety modeling requirements (Rev. 0).

    Energy Technology Data Exchange (ETDEWEB)

    Fanning, T. H.; Tautges, T. J.

    2008-06-30

    The U.S. Department of Energy's Global Nuclear Energy Partnership has led to renewed interest in liquid-metal-cooled fast reactors for the purpose of closing the nuclear fuel cycle and making more efficient use of future repository capacity. However, the U.S. has not designed or constructed a fast reactor in nearly 30 years. Accurate, high-fidelity, whole-plant dynamics safety simulations will play a crucial role by providing confidence that component and system designs will satisfy established design limits and safety margins under a wide variety of operational, design basis, and beyond design basis transient conditions. Current modeling capabilities for fast reactor safety analyses have resulted from several hundred person-years of code development effort supported by experimental validation. The broad spectrum of mechanistic and phenomenological models that have been developed represent an enormous amount of institutional knowledge that needs to be maintained. Complicating this, the existing code architectures for safety modeling evolved from programming practices of the 1970s. This has led to monolithic applications with interdependent data models which require significant knowledge of the complexities of the entire code in order for each component to be maintained. In order to develop an advanced fast reactor safety modeling capability, the limitations of the existing code architecture must be overcome while preserving the capabilities that already exist. To accomplish this, a set of advanced safety modeling requirements is defined, based on modern programming practices, that focuses on modular development within a flexible coupling framework. An approach for integrating the existing capabilities of the SAS4A/SASSYS-1 fast reactor safety analysis code into the SHARP framework is provided in order to preserve existing capabilities while providing a smooth transition to advanced modeling capabilities. In doing this, the advanced fast reactor safety models

  12. The polarized structure function of the nucleons with a non-extensive statistical quark model

    Energy Technology Data Exchange (ETDEWEB)

    Trevisan, Luis A. [Departamento de Matematica e Estatistica, Universidade Estadual de Ponta Grossa, 84010-790, Ponta Grossa, PR (Brazil); Mirez, Carlos [Instituto de Ciencia, Engenharia e Tecnologia - ICET, Universidade Federal dos Vales do Jequitinhonha e Mucuri - UFVJM, Campus do Mucuri, Rua do Cruzeiro 01, Jardim Sao Paulo, 39803-371, Teofilo Otoni, Minas Gerais (Brazil)

    2013-05-06

    We studied an application of nonextensive thermodynamics to describe the polarized structure function of the nucleon, in a model where the usual Fermi-Dirac and Bose-Einstein energy distributions, often used in statistical models, were replaced by their q-statistical equivalents. The parameters of the model are given by an effective temperature T, the q parameter (from Tsallis statistics), and the chemical potentials given by the corresponding up (u) and down (d) quark normalization in the nucleon and by Δu and Δd of the polarized functions.
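
    For context, a sketch of the standard Tsallis construction (not necessarily the authors' exact parameterization): the q-statistical analogue replaces the exponential in the Fermi-Dirac and Bose-Einstein occupation numbers by the q-exponential, recovering the usual distributions in the limit q -> 1. In LaTeX notation:

        e_q(x) = \bigl[ 1 + (1-q)\,x \bigr]^{\frac{1}{1-q}}, \qquad \lim_{q \to 1} e_q(x) = e^{x},

        n^{(q)}_{\mathrm{FD}}(E) = \frac{1}{e_q\!\left(\frac{E-\mu}{T}\right) + 1}, \qquad
        n^{(q)}_{\mathrm{BE}}(E) = \frac{1}{e_q\!\left(\frac{E-\mu}{T}\right) - 1},

    where T is the effective temperature and μ the chemical potential of the corresponding parton species (with Δu and Δd entering through the polarized counterparts).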

  13. Towards an effective health interventions design: an extension of the health belief model.

    Science.gov (United States)

    Orji, Rita; Vassileva, Julita; Mandryk, Regan

    2012-01-01

    Recent years have witnessed a continuous increase in lifestyle related health challenges around the world. As a result, researchers and health practitioners have focused on promoting healthy behavior using various behavior change interventions. The designs of most of these interventions are informed by health behavior models and theories adapted from various disciplines. Several health behavior theories have been used to inform health intervention designs, such as the Theory of Planned Behavior, the Transtheoretical Model, and the Health Belief Model (HBM). The Health Belief Model, developed in the 1950s to investigate why people fail to undertake preventive health measures, remains one of the most widely employed theories of health behavior. However, the effectiveness of this model is limited, the first limitation being its low predictive capacity (R2). In this paper, (1) we extended the Health Belief Model by introducing four new variables: Self-identity, Perceived Importance, Consideration of Future Consequences, and Concern for Appearance as possible determinants of healthy behavior. (2) We exhaustively explored the relationships/interactions between the HBM variables and their effect sizes. (3) We tested the validity of both our proposed extended model and the original HBM on healthy eating behavior. Finally, we compared the predictive capacity of the original HBM and our extended model. To achieve the objective of this paper, we conducted a quantitative study of 576 participants' eating behavior. Data for this study were collected over a period of one year (from August 2011 to August 2012). The questionnaire consisted of validated scales assessing the HBM determinants - perceived benefit, barrier, susceptibility, severity, cue to action, and self-efficacy - using 7-point Likert scales. We also assessed other health determinants such as consideration of future consequences, self-identity, concern for appearance and perceived importance. To analyse our data, we employed

  14. An extension of the technology acceptance model for business intelligence systems: project management maturity perspective

    Directory of Open Access Journals (Sweden)

    Mirjana Pejić Bach

    2017-01-01

    Full Text Available Business intelligence systems (BISs) refer to a wide range of technologies and applications useful for retrieving and analyzing large amounts of information with the goal of generating knowledge useful for making effective business decisions. In order to investigate the adoption of BISs in companies, we propose a model based on the technology acceptance model (TAM) that is expanded with variables representing the concept of project management maturity (PMM). A survey of US companies was conducted with the chief information officer (CIO) as the main informant. A structural equation model was developed in order to test the research model. Results indicate that the TAM expanded with the notion of PMM is useful in increasing understanding of BIS adoption in companies.

  15. Dynamic energy conservation model REDUCE. Extension with experience curves, energy efficiency indicators and user's guide

    International Nuclear Information System (INIS)

    Uyterlinde, M.A.; Rijkers, F.A.M.

    1999-12-01

    The main objective of the energy conservation model REDUCE (Reduction of Energy Demand by Utilization of Conservation of Energy) is the evaluation of the effectiveness of economical, financial, institutional, and regulatory measures for improving the rational use of energy in end-use sectors. This report presents the results of additional model development activities, partly based on the first experiences in a previous project. Energy efficiency indicators have been added as an extra tool for output analysis in REDUCE. The methodology is described and some examples are given. The model has been extended with a method for modelling the effects of technical development on production costs, by means of an experience curve. Finally, the report provides a user's guide, describing in more detail the input data specification as well as all menus and buttons. 19 refs
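
    To illustrate the experience-curve mechanism mentioned above (a generic sketch, not the REDUCE implementation), unit cost is commonly modelled as C(x) = C1 * x^b, with x the cumulative production, C1 the cost of the first unit and b = log2(progress ratio), so that each doubling of cumulative production multiplies cost by the progress ratio. The parameter values below are assumptions.

        import numpy as np

        def experience_curve(cum_production, first_unit_cost, progress_ratio):
            # each doubling of cumulative production multiplies unit cost by
            # progress_ratio (e.g. 0.8 means a 20% cost reduction per doubling)
            b = np.log2(progress_ratio)           # learning exponent, negative for ratios < 1
            return first_unit_cost * np.power(cum_production, b)

        if __name__ == "__main__":
            x = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
            print(experience_curve(x, first_unit_cost=100.0, progress_ratio=0.8))
            # -> 100, 80, 64, 51.2, 40.96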

  16. Extension of a generalized state-vector model of radiation carcinogenesis to consideration of dose rate

    International Nuclear Information System (INIS)

    Crawford-Brown, D.J.; Hofmann, W.

    1993-01-01

    Mathematical models for radiation carcinogenesis typically employ transition rates that either are a function of the dose to specific cells or are purely empirical constructs unrelated to biophysical theory. These functions either ignore or do not explicitly model interactions between the fates of cells in a community. This paper extends a model of mitosis, cell transformation, promotion, and progression to cases in which interacting cellular communities are irradiated at specified dose rates. The model predicts that lower dose rates are less effective at producing cancer when irradiation is by X- or gamma rays but are generally more effective in instances of irradiation by alpha particles up to a dose rate in excess of 0.01 Gy/day. The resulting predictions are compared with existing experimental data. 39 refs., 9 figs., 1 tab

  17. Extensions of D-optimal Minimal Designs for Symmetric Mixture Models

    OpenAIRE

    Li, Yanyan; Raghavarao, Damaraju; Chervoneva, Inna

    2016-01-01

    The purpose of mixture experiments is to explore the optimum blends of mixture components, which will provide desirable response characteristics in finished products. D-optimal minimal designs have been considered for a variety of mixture models, including Scheffé's linear, quadratic, and cubic models. Usually, these D-optimal designs are minimally supported since they have just as many design points as the number of parameters. Thus, they lack the degrees of freedom to perform the Lack of Fit tests...

  18. Extension of the SAEM algorithm for nonlinear mixed models with two levels of random effects

    OpenAIRE

    Panhard, Xavière; Samson, Adeline

    2008-01-01

    This article focuses on parameter estimation of multilevel nonlinear mixed-effects models (MNLMEMs). These models are used to analyze data presenting multiple hierarchical levels of grouping (cluster data, clinical trials with several observation periods, ...). The variability of the individual parameters of the regression function is thus decomposed as a between-subject variability and higher levels of variability (for example within-subject variability). We propose maximum likelihood estimation...

  19. Cpl6: The New Extensible, High-Performance Parallel Coupler forthe Community Climate System Model

    Energy Technology Data Exchange (ETDEWEB)

    Craig, Anthony P.; Jacob, Robert L.; Kauffman, Brain; Bettge,Tom; Larson, Jay; Ong, Everest; Ding, Chris; He, Yun

    2005-03-24

    Coupled climate models are large, multiphysics applications designed to simulate the Earth's climate and predict the response of the climate to any changes in the forcing or boundary conditions. The Community Climate System Model (CCSM) is a widely used state-of-the-art climate model that has released several versions to the climate community over the past ten years. Like many climate models, CCSM employs a coupler, a functional unit that coordinates the exchange of data between parts of the climate system such as the atmosphere and ocean. This paper describes the new coupler, cpl6, contained in the latest version of CCSM, CCSM3. Cpl6 introduces distributed-memory parallelism to the coupler, a class library for important coupler functions, and a standardized interface for component models. Cpl6 is implemented entirely in Fortran90 and uses the Model Coupling Toolkit as the base for most of its classes. Cpl6 gives improved performance over previous versions and scales well on multiple platforms.

  20. A Utility-Based Model for Determining Functional Requirement Target Values (Model Penentuan Nilai Target Functional Requirement Berbasis Utilitas)

    Directory of Open Access Journals (Sweden)

    Cucuk Nur Rosyidi

    2012-01-01

    Full Text Available In a product design and development process, a designer faces the problem of deciding functional requirement (FR) target values. That decision is made under risk since it is taken in the early design phase using incomplete information. A utility function can be used to reflect the decision maker's attitude towards risk in making such a decision. In this research, we develop a utility-based model to determine FR target values using a quadratic utility function and information from Quality Function Deployment (QFD). A pencil design is used as a numerical example, with a quadratic utility function for each FR. The model can be applied to balance customer and designer interests in determining FR target values.

  1. Requirements traceability in model-driven development: Applying model and transformation conformance

    NARCIS (Netherlands)

    Andrade Almeida, João; Iacob, Maria Eugenia; van Eck, Pascal

    The variety of design artifacts (models) produced in a model-driven design process results in an intricate relationship between requirements and the various models. This paper proposes a methodological framework that simplifies management of this relationship, which helps in assessing the quality of

  2. An iterative and targeted sampling design informed by habitat suitability models for detecting focal plant species over extensive areas.

    Science.gov (United States)

    Wang, Ophelia; Zachmann, Luke J; Sesnie, Steven E; Olsson, Aaryn D; Dickson, Brett G

    2014-01-01

    Prioritizing areas for management of non-native invasive plants is critical, as invasive plants can negatively impact plant community structure. Extensive and multi-jurisdictional inventories are essential to prioritize actions aimed at mitigating the impact of invasions and changes in disturbance regimes. However, previous work devoted little effort to devising sampling methods sufficient to assess the scope of multi-jurisdictional invasion over extensive areas. Here we describe a large-scale sampling design that used species occurrence data, habitat suitability models, and iterative and targeted sampling efforts to sample five species and satisfy two key management objectives: 1) detecting non-native invasive plants across previously unsampled gradients, and 2) characterizing the distribution of non-native invasive plants at landscape to regional scales. Habitat suitability models of five species were based on occurrence records and predictor variables derived from topography, precipitation, and remotely sensed data. We stratified and established field sampling locations according to predicted habitat suitability and phenological, substrate, and logistical constraints. Across previously unvisited areas, we detected at least one of our focal species on 77% of plots. In turn, we used detections from 2011 to improve habitat suitability models and sampling efforts in 2012, as well as additional spatial constraints to increase detections. These modifications resulted in a 96% detection rate at plots. The range of habitat suitability values that identified highly and less suitable habitats and their environmental conditions corresponded to field detections with mixed levels of agreement. Our study demonstrated that an iterative and targeted sampling framework can address sampling bias, reduce time costs, and increase detections. Other studies can extend the sampling framework to develop methods in other ecosystems to provide detection data. The sampling methods

  3. Predicting Corporate Financial Distress in Sri Lanka: An Extension to Z-Score Model

    Directory of Open Access Journals (Sweden)

    K.G.M. Nanayakkara

    2015-03-01

    Full Text Available The main purpose of this study is to develop a better financial distress prediction model for Sri Lankan companies using the Z-score model. Fourteen variables were selected, consisting of accounting, cash flow and market based variables. Multivariate Discriminant Analysis (MDA) was used as the analytical technique, and a stepwise method was used to select the variables with the best discriminating power, applied to a dataset of sixty-seven matched pairs of failed and non-failed quoted public companies over the period 2002 to 2011. The final models were validated using the cross-validation method. The results indicate that a model with four predictors of earnings before interest and taxes, cash flow from operations to total debts, retained earnings to total assets, and firm size achieved a classification accuracy of 85.8% one year prior to distress with a very low type I error. Moreover, the model correctly classified 79.9% and 69.4% of the cases two and three years prior to distress, respectively. The study further revealed that companies with negative cutoff values fall into the distress zone while companies with positive cutoff values fall into the safety area. Hence, the study concluded that companies with cutoff values close to zero should be considered for actions mitigating financial distress, based not only on accounting information but also on cash flow and market data.

  4. Safety Culture: A Requirement for New Business Models — Lessons Learned from Other High Risk Industries

    International Nuclear Information System (INIS)

    Kecklund, L.

    2016-01-01

    Technical development and changes on global markets affect all high risk industries, creating opportunities as well as risks related to the achievement of safety and business goals. Changes in legal and regulatory frameworks as well as in market demands create a need for major changes. Several high risk industries are facing a situation where they have to develop new business models. Within the transportation domain, e.g., aviation and railways, there is a growing concern related to how the new business models may affect safety. New business models in aviation and railways include extensive use of outsourcing and subcontractors to reduce costs, resulting in, e.g., negative changes in working conditions, work hours, employment conditions and high turnover rates. The energy sector also faces pressures to create new business models for the transition to renewable energy production, to comply with new legal and regulatory requirements and to make best use of new reactor designs. In addition, large scale phase out and decommissioning of nuclear facilities have to be managed by the nuclear industry. Some negative effects of new business models have already arisen within the transportation domain, e.g., the negative effects of extensive outsourcing and subcontractor use. In the railway domain the infrastructure manager is required by European and national regulations to assure that all subcontractors are working according to the requirements in the infrastructure manager's SMS (Safety Management System). More than ten levels of subcontracting can be working in a major infrastructure project, making the system highly complex and thus difficult to control. In the aviation domain, tightly coupled interacting computer networks supplying airport services, as well as air traffic control, are managed and maintained by several different companies, creating numerous interfaces which must be managed by the SMS. There are examples where a business model with several low

  5. A generative probabilistic model and discriminative extensions for brain lesion segmentation – with application to tumor and stroke

    Science.gov (United States)

    Menze, Bjoern H.; Van Leemput, Koen; Lashkari, Danial; Riklin-Raviv, Tammy; Geremia, Ezequiel; Alberts, Esther; Gruber, Philipp; Wegener, Susanne; Weber, Marc-André; Székely, Gabor; Ayache, Nicholas; Golland, Polina

    2016-01-01

    We introduce a generative probabilistic model for segmentation of brain lesions in multi-dimensional images that generalizes the EM segmenter, a common approach for modelling brain images using Gaussian mixtures and a probabilistic tissue atlas that employs expectation-maximization (EM) to estimate the label map for a new image. Our model augments the probabilistic atlas of the healthy tissues with a latent atlas of the lesion. We derive an estimation algorithm with closed-form EM update equations. The method extracts a latent atlas prior distribution and the lesion posterior distributions jointly from the image data. It delineates lesion areas individually in each channel, allowing for differences in lesion appearance across modalities, an important feature of many brain tumor imaging sequences. We also propose discriminative model extensions to map the output of the generative model to arbitrary labels with semantic and biological meaning, such as “tumor core” or “fluid-filled structure”, but without a one-to-one correspondence to the hypo- or hyper-intense lesion areas identified by the generative model. We test the approach in two image sets: the publicly available BRATS set of glioma patient scans, and multimodal brain images of patients with acute and subacute ischemic stroke. We find the generative model that has been designed for tumor lesions to generalize well to stroke images, and the generative-discriminative model to be one of the top ranking methods in the BRATS evaluation. PMID:26599702

  6. Extensions of D-optimal Minimal Designs for Symmetric Mixture Models.

    Science.gov (United States)

    Li, Yanyan; Raghavarao, Damaraju; Chervoneva, Inna

    2017-01-01

    The purpose of mixture experiments is to explore the optimum blends of mixture components, which will provide desirable response characteristics in finished products. D-optimal minimal designs have been considered for a variety of mixture models, including Scheffé's linear, quadratic, and cubic models. Usually, these D-optimal designs are minimally supported since they have just as many design points as the number of parameters. Thus, they lack the degrees of freedom to perform the Lack of Fit tests. Also, the majority of the design points in D-optimal minimal designs are on the boundary: vertices, edges, or faces of the design simplex. In this paper, a new strategy for adding multiple interior points for symmetric mixture models is proposed. We compare the proposed designs with Cornell's (1986) two ten-point designs for the Lack of Fit test by simulations.
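
    As a concrete illustration of the criterion used throughout this line of work (a generic sketch, not the authors' construction), candidate designs are compared through det(X'X) of the model matrix X: a larger determinant means a smaller generalized variance of the coefficient estimates. The sketch below evaluates the three-component Scheffé quadratic model on a minimally supported design and on the same design augmented with an interior point; the candidate points are assumptions.

        import numpy as np

        def scheffe_quadratic_row(x):
            # model row for the 3-component Scheffe quadratic model:
            # x1, x2, x3, x1*x2, x1*x3, x2*x3 (no intercept since x1 + x2 + x3 = 1)
            x1, x2, x3 = x
            return [x1, x2, x3, x1 * x2, x1 * x3, x2 * x3]

        def log_d_criterion(design):
            # log det(X'X) of a candidate mixture design (higher is better)
            X = np.array([scheffe_quadratic_row(p) for p in design])
            sign, logdet = np.linalg.slogdet(X.T @ X)
            return logdet if sign > 0 else -np.inf

        if __name__ == "__main__":
            minimal = [(1, 0, 0), (0, 1, 0), (0, 0, 1),
                       (0.5, 0.5, 0), (0.5, 0, 0.5), (0, 0.5, 0.5)]   # 6 points, 6 parameters
            augmented = minimal + [(1 / 3, 1 / 3, 1 / 3)]             # extra interior point
            print("log det, minimal  :", log_d_criterion(minimal))
            print("log det, augmented:", log_d_criterion(augmented))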

  7. NASA Common Research Model Test Envelope Extension With Active Sting Damping at NTF

    Science.gov (United States)

    Rivers, Melissa B.; Balakrishna, S.

    2014-01-01

    The NASA Common Research Model (CRM) high Reynolds number transonic wind tunnel testing program was established to generate an experimental database for applied Computational Fluid Dynamics (CFD) validation studies. During transonic wind tunnel tests, the CRM encounters large sting vibrations when the angle of attack approaches the second pitching moment break, which can sometimes become divergent. CRM transonic test data analysis suggests that sting divergent oscillations are related to negative net sting damping episodes associated with flow separation instability. The National Transonic Facility (NTF) has been addressing remedies to extend polar testing up to and beyond the second pitching moment break point of the test articles using an active piezoceramic damper system for both ambient and cryogenic temperatures. This paper reviews CRM test results to gain understanding of sting dynamics with a simple model describing the mechanics of a sting-model system and presents the performance of the damper under cryogenic conditions.

  8. Extension of an anisotropic creep model to general high temperature deformation of a single crystal superalloy

    International Nuclear Information System (INIS)

    Pan, L.M.; Ghosh, R.N.; McLean, M.

    1993-01-01

    A physics based model has been developed that accounts for the principal features of anisotropic creep deformation of single crystal superalloys. The present paper extends this model to simulate other types of high temperature deformation under strain controlled test conditions, such as stress relaxation and tension tests at constant strain rate in single crystals subject to axial loading along an arbitrary crystal direction. The approach is applied to the SRR99 single crystal superalloy where a model parameter database is available, determined via analysis of a database of constant stress creep curves. A software package has been generated to simulate the deformation behaviour under complex stress-strain conditions taking into account anisotropic elasticity. (orig.)

  9. Extension of the M-D model for treating stress drops in salt

    International Nuclear Information System (INIS)

    Munson, D.E.; DeVries, K.L.; Fossum, A.F.; Callahan, G.D.

    1993-01-01

    Development of the multimechanism deformation (M-D) constitutive model for steady state creep, which incorporates irreversible workhardening and recovery transient strains, was motivated by the need to predict very long term closures in underground rooms for radioactive waste repositories in salt. The multimechanism deformation model for the creep deformation of salt is extended to treat the response of salt to imposed stress drops. Stress drop tests produce a very distinctive behavior where both reversible elastic strain and reversible time dependent strain occur. These transient strains are negative compared to the positive transient strains produced by the normal creep workhardening and recovery processes. A simple micromechanical evolutionary process is defined to account for the accumulation of these reversible strains, and their subsequent release with decreases in stress. A number of experimental stress drop tests for various stress drop magnitudes and temperatures are adequately simulated with the model

  10. Exercise effects in a virtual type 1 diabetes patient: Using stochastic differential equations for model extension

    DEFF Research Database (Denmark)

    Duun-Henriksen, Anne Katrine; Schmidt, S.; Nørgaard, K.

    2013-01-01

    The use of virtual patients for in silico testing of control algorithms for an artificial pancreas is growing. It is an easy, fast and low-cost alternative to pre-clinical testing. To simulate the everyday life of a type 1 diabetes (T1D) patient a simulator must be able to take into account...... physical activity. Exercise constitutes a substantial challenge to closed-loop control of T1D. The effects are many and depend on intensity and duration and may be delayed by several hours. In this study, we use a model for the glucoregulatory system based on the minimal model and a previously published...... on clinical data from a study including exercise bouts of 20 minutes performed on 12 T1D patients treated with continuous subcutaneous insulin infusion. The predictive abilities of the model are investigated. In conclusion, this study illustrates the advantages of using SDEs in the development of an extended...

  11. Extension of Murray's law using a non-Newtonian model of blood flow

    Directory of Open Access Journals (Sweden)

    Bonjour Jocelyn

    2009-05-01

    Full Text Available Background: So far, none of the existing methods on Murray's law deal with the non-Newtonian behavior of blood flow, although the non-Newtonian approach to blood flow modelling looks more accurate. Modeling: In the present paper, Murray's law, which is applicable to an arterial bifurcation, is generalized to a non-Newtonian blood flow model (power-law model). When the vessel size reaches the capillary limitation, blood can be modeled using a non-Newtonian constitutive equation. Two different constraints are assumed in addition to the pumping power: the volume constraint or the surface constraint (related to the internal surface of the vessel). For the sake of generality, the relationships are given for an arbitrary number of daughter vessels. It is shown that for a cost function including the volume constraint, classical Murray's law remains valid (i.e., ΣR^c = constant with c = 3) and is independent of n, the dimensionless index in the viscosity equation; R being the radius of the vessel. On the contrary, for a cost function including the surface constraint, different values of c may be calculated depending on the value of n. Results: We find that c varies for blood from 2.42 to 3 depending on the constraint and the fluid properties. For the Newtonian model, the surface constraint leads to c = 2.5. The cost function (based on the surface constraint) can be related to entropy generation by dividing it by the temperature. Conclusion: It is demonstrated that the entropy generated in all the daughter vessels is greater than the entropy generated in the parent vessel. Furthermore, it is shown that the difference in entropy generation between the parent and daughter vessels is smaller for a non-Newtonian fluid than for a Newtonian fluid.
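
    A small numerical illustration of the generalized law discussed above (an illustrative sketch, not the authors' derivation): the cost-optimal relation R_parent^c = sum of R_daughter^c gives, for a symmetric bifurcation into n daughters, R_daughter = R_parent / n^(1/c), with c = 3 for the classical law, about 2.5 for the Newtonian surface constraint and 2.42-3 for the power-law fluid according to the abstract.

        def symmetric_daughter_radius(parent_radius, n_daughters=2, c=3.0):
            # daughter radius under R_parent**c = n_daughters * R_daughter**c
            return parent_radius / n_daughters ** (1.0 / c)

        if __name__ == "__main__":
            for c in (3.0, 2.5, 2.42):   # classical, Newtonian surface constraint, lower bound quoted above
                ratio = symmetric_daughter_radius(1.0, 2, c)
                print(f"c = {c:4.2f}:  R_daughter / R_parent = {ratio:.3f}")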

  12. Supersymmetric U(1)Y' ⊗ U(1)B-L extension of the Standard Model

    Science.gov (United States)

    Montero, J. C.; Pleitez, V.; Sánchez-Vega, B. L.; Rodriguez, M. C.

    2017-06-01

    We build a supersymmetric version with SU(3)C ⊗ SU(2)L ⊗ U(1)Y' ⊗ U(1)B-L gauge symmetry, where Y' is a new charge and B and L are the usual baryonic and leptonic numbers. The model has three right-handed neutrinos with identical B - L charges, and can accommodate all fermion masses at the tree level. In particular, the type I seesaw mechanism is implemented for the generation of the active neutrino masses. We obtain the mass spectra of all sectors and for the scalar one we also give the flat directions allowed by the model.

  13. Effective description of general extensions of the Standard Model: the complete tree-level dictionary

    Science.gov (United States)

    de Blas, J.; Criado, J. C.; Pérez-Victoria, M.; Santiago, J.

    2018-03-01

    We compute all the tree-level contributions to the Wilson coefficients of the dimension-six Standard-Model effective theory in ultraviolet completions with general scalar, spinor and vector field content and arbitrary interactions. No assumption about the renormalizability of the high-energy theory is made. This provides a complete ultraviolet/infrared dictionary at the classical level, which can be used to study the low-energy implications of any model of interest, and also to look for explicit completions consistent with low-energy data.

  14. Stability of the electroweak ground state in the Standard Model and its extensions

    International Nuclear Information System (INIS)

    Di Luzio, Luca; Isidori, Gino; Ridolfi, Giovanni

    2016-01-01

    We review the formalism by which the tunnelling probability of an unstable ground state can be computed in quantum field theory, with special reference to the Standard Model of electroweak interactions. We describe in some detail the approximations implicitly adopted in such calculation. Particular attention is devoted to the role of scale invariance, and to the different implications of scale-invariance violations due to quantum effects and possible new degrees of freedom. We show that new interactions characterized by a new energy scale, close to the Planck mass, do not invalidate the main conclusions about the stability of the Standard Model ground state derived in absence of such terms.

  15. Stability of the electroweak ground state in the Standard Model and its extensions

    Energy Technology Data Exchange (ETDEWEB)

    Di Luzio, Luca, E-mail: diluzio@ge.infn.it [Dipartimento di Fisica, Università di Genova and INFN, Sezione di Genova, Via Dodecaneso 33, I-16146 Genova (Italy); Isidori, Gino [Department of Physics, University of Zürich, Winterthurerstrasse 190, CH-8057 Zürich (Switzerland); Ridolfi, Giovanni [Dipartimento di Fisica, Università di Genova and INFN, Sezione di Genova, Via Dodecaneso 33, I-16146 Genova (Italy)

    2016-02-10

    We review the formalism by which the tunnelling probability of an unstable ground state can be computed in quantum field theory, with special reference to the Standard Model of electroweak interactions. We describe in some detail the approximations implicitly adopted in such calculation. Particular attention is devoted to the role of scale invariance, and to the different implications of scale-invariance violations due to quantum effects and possible new degrees of freedom. We show that new interactions characterized by a new energy scale, close to the Planck mass, do not invalidate the main conclusions about the stability of the Standard Model ground state derived in absence of such terms.

  16. Stability of the electroweak ground state in the Standard Model and its extensions

    Directory of Open Access Journals (Sweden)

    Luca Di Luzio

    2016-02-01

    Full Text Available We review the formalism by which the tunnelling probability of an unstable ground state can be computed in quantum field theory, with special reference to the Standard Model of electroweak interactions. We describe in some detail the approximations implicitly adopted in such calculation. Particular attention is devoted to the role of scale invariance, and to the different implications of scale-invariance violations due to quantum effects and possible new degrees of freedom. We show that new interactions characterized by a new energy scale, close to the Planck mass, do not invalidate the main conclusions about the stability of the Standard Model ground state derived in absence of such terms.

  17. A theoretical and empirical evaluation and extension of the Todaro migration model.

    Science.gov (United States)

    Salvatore, D

    1981-11-01

    "This paper postulates that it is theoretically and empirically preferable to base internal labor migration on the relative difference in rural-urban real income streams and rates of unemployment, taken as separate and independent variables, rather than on the difference in the expected real income streams as postulated by the very influential and often quoted Todaro model. The paper goes on to specify several important ways of extending the resulting migration model and improving its empirical performance." The analysis is based on Italian data. excerpt

  18. Extension of the engineering treatment model (ETM) to bending configurations under plane stress

    International Nuclear Information System (INIS)

    Schwalbe, K.H.

    1987-01-01

    The application of the previously introduced Engineering Treatment Model to bending configurations involves the prediction of the crack tip opening displacement, the load line displacement, and the J-integral. Using these, failure situations can be predicted. All predictions except one are in good agreement with experimental results (Al2024-FC, Al2024-T351, steel 35NiCrMo16, pressure vessel steel 20MnMoNi55) or finite element calculations. The reason for the exception is clear and will be utilised for an improvement of the model. With 26 figs., 2 tabs [de]

  19. A GENERALIZATION OF TRADITIONAL KANO MODEL FOR CUSTOMER REQUIREMENTS ANALYSIS

    Directory of Open Access Journals (Sweden)

    Renáta Turisová

    2015-07-01

    Full Text Available Purpose: The theory of attractiveness determines the relationship between the technically achieved and the customer-perceived quality of product attributes. The most frequently used approach in the theory of attractiveness is the implementation of Kano's model. Many generalizations of that model exist which take into consideration various aspects and approaches focused on understanding customer preferences and identifying their priorities for a product. The aim of this article is to outline another possible generalization of Kano's model. Methodology/Approach: The traditional Kano model captures the nonlinear relationship between the achieved attributes of quality and customer requirements. The individual attributes of quality are divided into three main categories: must-be, one-dimensional and attractive quality, and into two side categories: indifferent and reverse quality. A well-selling product has to contain the must-be attributes. It should contain as many one-dimensional attributes as possible. If supplementary attractive attributes are also present, the attractiveness of the entire product, from the customer's viewpoint, rises sharply and nonlinearly, which has a direct positive impact on a potential customer's purchasing decision. In this article, we show that the assignment of individual quality attributes of a product to the mentioned categories depends, among other things, on the life-cycle costs of the product and on its market price. Findings: In practice, products are often placed into different price categories: lower, middle and upper class. For certain types of products the category is either directly declared by the producer (especially in the automotive industry) or is determined by the customer by assessing available market prices. To each of those groups of products, different customer expectations can be assigned

  20. Automated main-chain model building by template matching and iterative fragment extension

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.

    2003-01-01

    A method for automated macromolecular main-chain model building is described. An algorithm for the automated macromolecular model building of polypeptide backbones is described. The procedure is hierarchical. In the initial stages, many overlapping polypeptide fragments are built. In subsequent stages, the fragments are extended and then connected. Identification of the locations of helical and β-strand regions is carried out by FFT-based template matching. Fragment libraries of helices and β-strands from refined protein structures are then positioned at the potential locations of helices and strands and the longest segments that fit the electron-density map are chosen. The helices and strands are then extended using fragment libraries consisting of sequences three amino acids long derived from refined protein structures. The resulting segments of polypeptide chain are then connected by choosing those which overlap at two or more Cα positions. The fully automated procedure has been implemented in RESOLVE and is capable of model building at resolutions as low as 3.5 Å. The algorithm is useful for building a preliminary main-chain model that can serve as a basis for refinement and side-chain addition.
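
    The FFT-based template matching step can be illustrated generically (a textbook one-dimensional cross-correlation sketch, not the RESOLVE code; the signal and template are invented): the correlation of a map with a template is the inverse FFT of the map's FFT multiplied by the conjugate FFT of the zero-padded template, and peaks in the correlation mark candidate template locations.

        import numpy as np

        def fft_cross_correlation(signal, template):
            # circular cross-correlation of signal with template via the FFT
            padded = np.zeros(len(signal))
            padded[:len(template)] = template            # zero-pad to the signal length
            return np.real(np.fft.ifft(np.fft.fft(signal) * np.conj(np.fft.fft(padded))))

        if __name__ == "__main__":
            rng = np.random.default_rng(3)
            template = np.array([0.0, 1.0, 2.0, 1.0, 0.0])   # purely illustrative bump
            signal = rng.normal(scale=0.1, size=200)
            signal[120:125] += template                      # hide the template at index 120
            corr = fft_cross_correlation(signal, template)
            print("best match at index:", int(np.argmax(corr)))   # expected near 120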

  1. An Extension of IRT-Based Equating to the Dichotomous Testlet Response Theory Model

    Science.gov (United States)

    Tao, Wei; Cao, Yi

    2016-01-01

    Current procedures for equating number-correct scores using traditional item response theory (IRT) methods assume local independence. However, when tests are constructed using testlets, one concern is the violation of the local item independence assumption. The testlet response theory (TRT) model is one way to accommodate local item dependence.…

  2. BPS sectors of the Skyrme model and their non-BPS extensions

    Science.gov (United States)

    Adam, C.; Foster, D.; Krusch, S.; Wereszczynski, A.

    2018-02-01

    Two recently found coupled Bogomol'nyi-Prasad-Sommerfield (BPS) submodels of the Skyrme model are further analyzed. First, we provide a geometrical formulation of the submodels in terms of the eigenvalues of the strain tensor. Second, we study their thermodynamical properties and show that the mean-field equations of state coincide at high pressure and read p = ρ̄/3. We also provide evidence that matter described by the first BPS submodel has some similarity with a Bose-Einstein condensate. Moreover, we show that extending the second submodel to a non-BPS model by including certain additional terms of the full Skyrme model does not spoil the respective ansatz, leading to an ordinary differential equation for the profile of the Skyrmion, for any value of the topological charge. This allows for an almost analytical description of the properties of Skyrmions in this model. In particular, we analytically study the breaking and restoration of the BPS property. Finally, we provide an explanation of the success of the rational map ansatz.

  3. Effectiveness of the farmer-to-farmer extension model in increasing ...

    African Journals Online (AJOL)

    The effectiveness of the model was found to depend on facilitators in terms of: a) their socio-economic closeness to the beneficiaries; b) their multiple community roles which boosted communication networks; c) their role in enhanced information flow among individuals of similar social status; d) better interaction and ...

  4. User Acceptance of YouTube for Procedural Learning: An Extension of the Technology Acceptance Model

    Science.gov (United States)

    Lee, Doo Young; Lehto, Mark R.

    2013-01-01

    The present study was framed using the Technology Acceptance Model (TAM) to identify determinants affecting behavioral intention to use YouTube. Most importantly, this research emphasizes the motives for using YouTube, which is notable given its extrinsic task goal of being used for procedural learning tasks. Our conceptual framework included two…

  5. HSP v2: Haptic Signal Processing with Extensions for Physical Modeling

    DEFF Research Database (Denmark)

    Overholt, Daniel; Kontogeorgakopoulos, Alexandros; Berdahl, Edgar

    2010-01-01

    The Haptic Signal Processing (HSP) platform aims to enable musicians to easily design and perform with digital haptic musical instruments [1]. In this paper, we present some new objects introduced in version v2 for modeling of musical dynamical systems such as resonators and vibrating strings...

  6. Knowledge management: from ignorance to truth or the dikw model extension

    Directory of Open Access Journals (Sweden)

    Sturza Alexei

    2013-01-01

    Full Text Available Knowledge management may be treated both narrowly and broadly. The understanding of the nature of knowledge, including its classification, is important for improving knowledge management. Six forms of knowledge are distinguished: ignorance, data, information, knowledge, wisdom and truth (an extended and original version of the DIKW model).

  7. KNOWLEDGE MANAGEMENT: FROM IGNORANCE TO TRUTH OR THE DIKW MODEL EXTENSION

    Directory of Open Access Journals (Sweden)

    Sturza Alexei

    2013-01-01

    Full Text Available Knowledge management may be treated both narrowly and broadly. The understanding of the nature of knowledge, including its classification, is important for improving knowledge management. Six forms of knowledge are distinguished: ignorance, data, information, knowledge, wisdom and truth (an extended and original version of the DIKW model).

  8. Modeling Views for Semantic Web Using eXtensible Semantic (XSemantic) Nets

    NARCIS (Netherlands)

    Rajugan, R.; Chang, E.; Feng, L.; Dillon, T.; meersman, R; Tari, Z; herrero, p; Méndez, G.; Cavedon, L.; Martin, D.; Hinze, A.; Buchanan, G.

    2005-01-01

    The emergence of the Semantic Web (SW) and the related technologies promises to make the web a meaningful experience. Yet, high-level modeling, design and querying techniques prove to be a challenging task for organizations that are hoping to utilize the SW paradigm for their industrial applications, which…

  9. Extension of the Hapke bidirectional reflectance model to retrieve soil water content

    Directory of Open Access Journals (Sweden)

    G.-J. Yang

    2011-07-01

    Full Text Available Soil moisture links the hydrologic cycle and the energy budget of land surfaces by regulating latent heat fluxes. An accurate assessment of the spatial and temporal variation of soil moisture is important to the study of surface biogeophysical processes. Although remote sensing has proven to be one of the most powerful tools for obtaining land surface parameters, no effective methodology yet exists for in situ soil moisture measurement based on a Bidirectional Reflectance Distribution Function (BRDF) model, such as the Hapke model. To retrieve and analyze soil moisture, this study applied the soil water parametric (SWAP)-Hapke model, which introduces the equivalent water thickness of soil, to ground multi-angular and hyperspectral observations, coupled with Powell-Ant Colony Algorithm inversion methods. The inverted soil moisture data resulting from our method coincided with in situ measurements (R2 = 0.867, RMSE = 0.813) based on three selected bands (672 nm, 866 nm, 2209 nm). This proved that the extended Hapke model can be used to estimate soil moisture with high accuracy based on field multi-angle and multispectral remote sensing data.
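
    The retrieval step amounts to fitting a forward reflectance model to the observed band reflectances and checking the fit against in situ data. The sketch below is illustrative only: the forward model, its coefficients and the starting values are placeholders standing in for the SWAP-Hapke equations, not the authors' model.

        import numpy as np
        from scipy.optimize import least_squares

        BANDS = [672, 866, 2209]                     # nm, the three bands used in the study

        def modeled_reflectance(swc, band):
            # Placeholder forward model: reflectance decreases with soil water
            # content; the coefficients below are purely illustrative.
            a = {672: 0.30, 866: 0.35, 2209: 0.40}[band]
            b = {672: 0.8, 866: 1.0, 2209: 2.0}[band]
            return a * np.exp(-b * swc)

        def retrieve_swc(observed):
            """Invert soil water content from reflectances observed in the three bands."""
            def residual(x):
                model = np.array([modeled_reflectance(x[0], b) for b in BANDS])
                return model - np.asarray(observed)
            return least_squares(residual, x0=[0.2], bounds=(0.0, 0.6)).x[0]

        def goodness_of_fit(retrieved, measured):
            """R^2 and RMSE between retrieved and in situ soil moisture."""
            retrieved, measured = np.asarray(retrieved), np.asarray(measured)
            ss_res = np.sum((measured - retrieved) ** 2)
            ss_tot = np.sum((measured - measured.mean()) ** 2)
            return 1 - ss_res / ss_tot, np.sqrt(ss_res / len(measured))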

  10. Consumer Decision-Making Styles Extension to Trust-Based Product Comparison Site Usage Model

    Directory of Open Access Journals (Sweden)

    Radoslaw Macik

    2016-09-01

    Full Text Available The paper describes an implementation of the extended consumer decision-making styles concept in explaining consumer choices made in a product comparison site environment, in the context of a trust-based information technology acceptance model. Previous research proved that the trust-based acceptance model is useful in explaining purchase intention and anticipated satisfaction in the product comparison site environment, as an example of online decision shopping aids. Trust in such aids is important in explaining their usage by consumers. The connections between consumer decision-making styles, usage of product and seller opinions, cognitive and affective trust toward the online product comparison site, as well as choice outcomes (purchase intention and brand choice) are explored through structural equation models using a PLS-SEM approach, on a sample of 461 young consumers. The research confirmed the validity of the research model in explaining product comparison site usage; some consumer decision-making styles influenced consumers' choices and purchase intention, and usage of product and seller reviews partially mediated these relationships.

  11. Phenomenology of U(1)F extension of inert-doublet model with exotic scalars and leptons

    Science.gov (United States)

    Dhargyal, Lobsang

    2018-02-01

    In this work we extend the inert-doublet model (IDM) by adding a new U(1)F gauge symmetry, under which a Z2-even scalar (φ2) and the Z2-odd right-handed components of two exotic charged leptons (F_{eR}, F_{μR}) are charged. We also add one Z2-even real scalar (φ1), as well as one complex scalar (φ), three neutral Majorana right-handed fermions (N1, N2, N3), the left-handed components of the exotic charged leptons (F_{eL}, F_{μL}) and F_{τ}, which are all odd under Z2; none of these are charged under U(1)F. With these new particles added to the IDM, we obtain a model with two scalar DM candidates, which together can explain the present DM relic density as well as the muon (g-2) anomaly simultaneously. In this model the neutrino masses are generated at the one-loop level. One of the most peculiar features of this model is that the non-trivial solution to the axial gauge anomaly-free conditions leads to the prediction of a stable, very heavy partner of the electron (F_e), whose present collider limit (13 TeV LHC) on its mass is around m_{Fe} ≥ a few TeV.

  12. Inferring Causalities in Landscape Genetics: An Extension of Wright's Causal Modeling to Distance Matrices.

    Science.gov (United States)

    Fourtune, Lisa; Prunier, Jérôme G; Paz-Vinas, Ivan; Loot, Géraldine; Veyssière, Charlotte; Blanchet, Simon

    2018-04-01

    Identifying landscape features that affect functional connectivity among populations is a major challenge in fundamental and applied sciences. Landscape genetics combines landscape and genetic data to address this issue, with the main objective of disentangling direct and indirect relationships among an intricate set of variables. Causal modeling has strong potential to address the complex nature of landscape genetic data sets. However, this statistical approach was not initially developed to address the pairwise distance matrices commonly used in landscape genetics. Here, we aimed to extend the applicability of two causal modeling methods-that is, maximum-likelihood path analysis and the directional separation test-by developing statistical approaches aimed at handling distance matrices and improving functional connectivity inference. Using simulations, we showed that these approaches greatly improved the robustness of the absolute (using a frequentist approach) and relative (using an information-theoretic approach) fits of the tested models. We used an empirical data set combining genetic information on a freshwater fish species (Gobio occitaniae) and detailed landscape descriptors to demonstrate the usefulness of causal modeling to identify functional connectivity in wild populations. Specifically, we demonstrated how direct and indirect relationships involving altitude, temperature, and oxygen concentration influenced within- and between-population genetic diversity of G. occitaniae.

  13. Statistical and Probabilistic Extensions to Ground Operations' Discrete Event Simulation Modeling

    Science.gov (United States)

    Trocine, Linda; Cummings, Nicholas H.; Bazzana, Ashley M.; Rychlik, Nathan; LeCroy, Kenneth L.; Cates, Grant R.

    2010-01-01

    NASA's human exploration initiatives will invest in technologies, public/private partnerships, and infrastructure, paving the way for the expansion of human civilization into the solar system and beyond. As it has been for the past half century, the Kennedy Space Center will be the embarkation point for humankind's journey into the cosmos. Functioning as a next generation space launch complex, Kennedy's launch pads, integration facilities, processing areas, and launch and recovery ranges will bustle with the activities of the world's space transportation providers. In developing this complex, KSC teams work through the potential operational scenarios: conducting trade studies, planning and budgeting for expensive and limited resources, and simulating alternative operational schemes. Numerous tools, among them discrete event simulation (DES), were matured during the Constellation Program to conduct such analyses with the purpose of optimizing the launch complex for maximum efficiency, safety, and flexibility while minimizing life cycle costs. Discrete event simulation is a computer-based modeling technique for complex and dynamic systems where the state of the system changes at discrete points in time and whose inputs may include random variables. DES is used to assess timelines and throughput, and to support operability studies and contingency analyses. It is applicable to any space launch campaign and informs decision-makers of the effects of varying numbers of expensive resources and the impact of off-nominal scenarios on measures of performance. In order to develop representative DES models, methods were adopted, exploited, or created to extend traditional uses of DES. The Delphi method was adopted and utilized for task duration estimation. DES software was exploited for probabilistic event variation. A roll-up process was developed and used to reuse models and model elements in other, less detailed models. The DES team continues to innovate and expand…

  14. Differences Awareness Model to Fundament Long Term Strategy Extension in a New Emerging Market

    Directory of Open Access Journals (Sweden)

    Elena Mădălina ŞERBAN

    2011-12-01

    Full Text Available Globalization has opened the opportunity for multinational companies to grow their businesses in emerging markets, and for local giants from developing countries to step into western markets. The objective of this paper is to propose a model for analyzing the differences between operating businesses in emerging markets and in developed markets. The model is intended to be applied in the strategic process, mainly by multinationals or local giants intending to develop long-term businesses outside their traditional geographies. Studies on emerging markets have identified two main axes to be considered when assessing the attractiveness of these markets: the business dynamics both inside and outside the company, and the institutional environment, as a facilitator of efficient encounters between sellers and buyers.

  15. Searching for extensions to the standard model in rare kaon decays

    International Nuclear Information System (INIS)

    Sanders, G.H.

    1989-01-01

    Small effects that are beyond the current standard models of physics are often signatures for new physics, revealing fields and mass scales far removed from contemporary experimental capabilities. This perspective motivates sensitive searches for rare decays of the kaon. The current status of these searches is reviewed, new results are presented, and progress in the near future is discussed. Opportunities for exciting physics research at a hadron facility are noted. 5 refs., 8 figs., 1 tab

  16. On the extension of multi-phase models to sub-residual saturations

    International Nuclear Information System (INIS)

    Lingineni, S.; Chen, Y.T.; Boehm, R.F.

    1995-01-01

    This paper focuses on the limitations of applying multi-phase flow and transport models to simulate the hydrothermal processes occurring when the liquid saturation falls below residual levels. A typical scenario of a heat-generating high-level waste package emplaced in a backfilled drift of a waste repository is presented. The hydrothermal conditions in the vicinity of the waste package as well as in the far field are determined using multi-phase, non-isothermal codes such as TOUGH2 and FEHM. As the waste package temperature increases, heat-pipe effects are created and water is driven away from the package into colder regions where it condenses. The variations in the liquid saturations close to the waste package are determined using these models with capillary pressure-saturation relationships extended to the sub-residual regime. The predictions indicate that even at elevated temperatures, the waste package surroundings are not completely dry. However, if transport-based modeling is used to represent liquid saturation variations in the sub-residual regime, then completely dry conditions are predicted within the backfill for extended periods of time. The relative humidity conditions near the waste package are also found to be sensitive to the representation of the capillary pressure-saturation relationship used for the sub-residual regime. An experimental investigation was carried out to study the variations in liquid saturations and relative humidity conditions in sub-residual regimes. Experimental results indicated that extended multi-phase models without interphase transport cannot predict dry-out conditions, and the simulations underpredict the humidity conditions near the waste package.

  17. The broken-pair model for nuclei and its extension with quadrupole vibrations

    International Nuclear Information System (INIS)

    Hofstra, P.

    1979-01-01

    The author presents calculations of low-energy properties of nuclei with an odd number of particles. These are described in the broken-pair approximation, where it is assumed that all but three particles occur as ordered Cooper pairs; the unpaired (one or three) particles are called quasiparticles. A model is developed that is intended to describe odd nuclei with two open shells in terms of both single-particle and collective degrees of freedom. (Auth.)

  18. An Extension of the Miller Equilibrium Model into the X-Point Region

    Science.gov (United States)

    Hill, M. D.; King, R. W.; Stacey, W. M.

    2017-10-01

    The Miller equilibrium model has been extended to better model the flux surfaces in the outer region of the plasma and scrape-off layer, including the poloidally non-uniform flux surface expansion that occurs in the X-point region(s) of diverted tokamaks. Equations for elongation and triangularity are modified to include a poloidally varying component and grad-r, which is used in the calculation of the poloidal magnetic field, is rederived. Initial results suggest that strong quantitative agreement with experimental flux surface reconstructions and strong qualitative agreement with poloidal magnetic fields can be obtained using this model. Applications are discussed. A major new application is the automatic generation of the computation mesh in the plasma edge, scrape-off layer, plenum and divertor regions for use in the GTNEUT neutral particle transport code, enabling this powerful analysis code to be routinely run in experimental analyses. Work supported by US DOE under DE-FC02-04ER54698.
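
    For reference, the standard Miller parameterization that this work generalizes describes each flux surface by $R(r,\theta) = R_0(r) + r\cos\big(\theta + \arcsin[\delta(r)]\,\sin\theta\big)$ and $Z(r,\theta) = \kappa(r)\, r\sin\theta$, with elongation $\kappa$ and triangularity $\delta$ constant on a surface. The extension described in the abstract amounts to letting these shaping parameters acquire a poloidal dependence, $\kappa(r) \to \kappa(r,\theta)$ and $\delta(r) \to \delta(r,\theta)$, and rederiving $|\nabla r|$ accordingly; the exact form of that poloidal variation is the authors' and is not reproduced here.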

  19. Application and project portfolio valuation using enterprise architecture and business requirements modelling

    Science.gov (United States)

    Quartel, Dick; Steen, Maarten W. A.; Lankhorst, Marc M.

    2012-05-01

    This article describes an architecture-based approach to IT valuation. This approach offers organisations an instrument to valuate their application and project portfolios and to make well-balanced decisions about IT investments. The value of a software application is assessed in terms of its contribution to a selection of business goals. Based on such assessments, the value of different applications can be compared, and requirements for innovation, development, maintenance and phasing out can be identified. IT projects are proposed to realise the requirements. The value of each project is assessed in terms of the value it adds to one or more applications. This value can be obtained by relating the 'as-is' application portfolio to the 'to-be' portfolio that is being proposed by the project portfolio. In this way, projects can be ranked according to their added value, given a certain selection of business goals. The approach uses ArchiMate to model the relationship between software applications, business processes, services and products. In addition, two language extensions are used to model the relationship of these elements to business goals and requirements and to projects and project portfolios. The approach is illustrated using the portfolio method of Bedell and has been implemented in BiZZdesign Architect.

  20. Modeling the minimum enzymatic requirements for optimal cellulose conversion

    International Nuclear Information System (INIS)

    Den Haan, R; Van Zyl, W H; Van Zyl, J M; Harms, T M

    2013-01-01

    Hydrolysis of cellulose is achieved by the synergistic action of endoglucanases, exoglucanases and β-glucosidases. Most cellulolytic microorganisms produce a varied array of these enzymes and the relative roles of the components are not easily defined or quantified. In this study we have used partially purified cellulases produced heterologously in the yeast Saccharomyces cerevisiae to increase our understanding of the roles of some of these components. CBH1 (Cel7), CBH2 (Cel6) and EG2 (Cel5) were separately produced in recombinant yeast strains, allowing their isolation free of any contaminating cellulolytic activity. Binary and ternary mixtures of the enzymes at loadings ranging between 3 and 100 mg g⁻¹ Avicel allowed us to illustrate the relative roles of the enzymes and their levels of synergy. A mathematical model was created to simulate the interactions of these enzymes on crystalline cellulose, under both isolated and synergistic conditions. Laboratory results from the various mixtures at a range of loadings of recombinant enzymes allowed refinement of the mathematical model. The model can further be used to predict the optimal synergistic mixes of the enzymes. This information can subsequently be applied to help to determine the minimum protein requirement for complete hydrolysis of cellulose. Such knowledge will be greatly informative for the design of better enzymatic cocktails or processing organisms for the conversion of cellulosic biomass to commodity products. (letter)

  1. Dual Processing Model for Medical Decision-Making: An Extension to Diagnostic Testing.

    Directory of Open Access Journals (Sweden)

    Athanasios Tsalatsanis

    Full Text Available Dual Processing Theories (DPT) assume that human cognition is governed by two distinct types of processes typically referred to as type 1 (intuitive) and type 2 (deliberative). Based on DPT we have derived a Dual Processing Model (DPM) to describe and explain therapeutic medical decision-making. The DPM model indicates that doctors decide to treat when treatment benefits outweigh its harms, which occurs when the probability of the disease is greater than the so called "threshold probability" at which treatment benefits are equal to treatment harms. Here we extend our work to include a wider class of decision problems that involve diagnostic testing. We illustrate applicability of the proposed model in a typical clinical scenario considering the management of a patient with prostate cancer. To that end, we calculate and compare two types of decision-thresholds: one that adheres to expected utility theory (EUT) and the second according to DPM. Our results showed that the decisions to administer a diagnostic test could be better explained using the DPM threshold. This is because such decisions depend on objective evidence of test/treatment benefits and harms as well as type 1 cognition of benefits and harms, which are not considered under EUT. Given that type 1 processes are unique to each decision-maker, this means that the DPM threshold will vary among different individuals. We also showed that when type 1 processes exclusively dominate decisions, ordering a diagnostic test does not affect a decision; the decision is based on the assessment of benefits and harms of treatment. These findings could explain variations in the treatment and diagnostic patterns documented in today's clinical practice.

  2. Dual Processing Model for Medical Decision-Making: An Extension to Diagnostic Testing.

    Science.gov (United States)

    Tsalatsanis, Athanasios; Hozo, Iztok; Kumar, Ambuj; Djulbegovic, Benjamin

    2015-01-01

    Dual Processing Theories (DPT) assume that human cognition is governed by two distinct types of processes typically referred to as type 1 (intuitive) and type 2 (deliberative). Based on DPT we have derived a Dual Processing Model (DPM) to describe and explain therapeutic medical decision-making. The DPM model indicates that doctors decide to treat when treatment benefits outweigh its harms, which occurs when the probability of the disease is greater than the so called "threshold probability" at which treatment benefits are equal to treatment harms. Here we extend our work to include a wider class of decision problems that involve diagnostic testing. We illustrate applicability of the proposed model in a typical clinical scenario considering the management of a patient with prostate cancer. To that end, we calculate and compare two types of decision-thresholds: one that adheres to expected utility theory (EUT) and the second according to DPM. Our results showed that the decisions to administer a diagnostic test could be better explained using the DPM threshold. This is because such decisions depend on objective evidence of test/treatment benefits and harms as well as type 1 cognition of benefits and harms, which are not considered under EUT. Given that type 1 processes are unique to each decision-maker, this means that the DPM threshold will vary among different individuals. We also showed that when type 1 processes exclusively dominate decisions, ordering a diagnostic test does not affect a decision; the decision is based on the assessment of benefits and harms of treatment. These findings could explain variations in the treatment and diagnostic patterns documented in today's clinical practice.
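
    The "threshold probability" referred to above has a simple closed form under expected utility theory: treatment is preferred once the expected benefit of treating the diseased, $p\,B$, exceeds the expected harm of treating the non-diseased, $(1-p)\,H$, so the action threshold is $p_t = H/(B+H)$, where $B$ is the net benefit of treatment given disease and $H$ the net harm of treatment given no disease. (The DPM threshold modifies this expression with type 1 weights on benefits and harms, which the abstract does not specify, so only the EUT form is shown here.)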

  3. Strong dynamics in a classically scale invariant extension of the standard model with a flat potential

    Science.gov (United States)

    Haba, Naoyuki; Yamada, Toshifumi

    2017-06-01

    We investigate the scenario where the standard model is extended with classical scale invariance, which is broken by chiral symmetry breaking and confinement in a new strongly coupled gauge theory that resembles QCD. The standard model Higgs field emerges as a result of the mixing of a scalar meson in the new strong dynamics and a massless elementary scalar field. The mass and scalar decay constant of that scalar meson, which are generated dynamically in the new gauge theory, give rise to the Higgs field mass term, automatically possessing the correct negative sign by the bosonic seesaw mechanism. Using analogy with QCD, we evaluate the dynamical scale of the new gauge theory and further make quantitative predictions for light pseudo-Nambu-Goldstone bosons associated with the spontaneous breaking of axial symmetry along chiral symmetry breaking in the new gauge theory. A prominent consequence of the scenario is that there should be a standard model gauge singlet pseudo-Nambu-Goldstone boson with mass below 220 GeV, which couples to two electroweak gauge bosons through the Wess-Zumino-Witten term, whose strength is thus determined by the dynamical scale of the new gauge theory. Other pseudo-Nambu-Goldstone bosons, charged under the electroweak gauge groups, also appear. Concerning the theoretical aspects, it is shown that the scalar quartic coupling can vanish at the Planck scale with the top quark pole mass as large as 172.5 GeV, realizing the flatland scenario without being in tension with the current experimental data.

  4. Extension of semi-analytical Erbium-doped fiber amplifier model to self-saturation regime

    DEFF Research Database (Denmark)

    Nissov, Morten

    1997-01-01

    We show in this paper that the analytical erbium-doped fiber amplifier (EDFA) model presented by Jopson and Saleh in 1991 can be extended to the self-saturation regime, making it capable of simulating all practical EDFAs. We demonstrate very good agreement between measurements and simulations based on measured fiber data for both high-gain EDFAs (small-signal gain of 35 dB) and isolator EDFAs (small-signal gain of 43 dB), with deviations… We show that an Intel Pentium 66 based computer can calculate gain and noise…

  5. Extension of the A-UNIFAC model to mixtures of cross- and self-associating compounds

    OpenAIRE

    Ferreira, Olga; Macedo, Maria E.; Bottini, Susana B.

    2005-01-01

    http://apps.isiknowledge.com/full_record.do?product=UA&search_mode=GeneralSearch&qid=4&SID=V21Di6PajaHLPoM3@AJ&page=1&doc=1&colname=WOS In the present work an extended UNIFAC group contribution model is used to calculate activity coefficients in solutions containing alcohols, water, carboxylic acids, esters, alkanes and aromatic hydrocarbons. The limiting expressions for the association contribution to the activity coefficients at infinite dilution are presented and discussed. A new ...

  6. Extensive EIS characterization of commercially available lithium polymer battery cell for performance modelling

    DEFF Research Database (Denmark)

    Stanciu, Tiberiu; Stroe, Daniel Loan; Teodorescu, Remus

    2015-01-01

    Electrochemical Impedance Spectroscopy (EIS) has become a popular analytical technique for research and development of battery cells' chemistries, due to the established, high precision computer controlled equipment that is capable of direct, on-line monitoring of performance parameters...... on the performance of a commercially available 53 Ah Lithium polymer battery cell, manufactured by Kokam Co. Ltd., is investigated in laboratory experiments, at its beginning of life, by means of EIS. A data fitting algorithm was used to obtain the parameter values for the proposed equivalent electrical circuit......, which was further selected for the development of an accurate EIS based performance model for the chosen Li-ion battery cell....
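
    The equivalent-circuit fitting step mentioned above can be sketched as follows. The circuit topology (an ohmic resistance in series with two parallel RC elements), the starting values and the bounds are illustrative assumptions, not the circuit identified in the study:

        import numpy as np
        from scipy.optimize import least_squares

        def circuit_impedance(params, freq):
            """Impedance of R0 + (R1 || C1) + (R2 || C2), an assumed topology."""
            r0, r1, c1, r2, c2 = params
            w = 2 * np.pi * freq
            return r0 + r1 / (1 + 1j * w * r1 * c1) + r2 / (1 + 1j * w * r2 * c2)

        def fit_circuit(freq, z_measured, x0=(1e-3, 1e-3, 1.0, 2e-3, 10.0)):
            """Least-squares fit on stacked real and imaginary residuals."""
            def residuals(p):
                dz = circuit_impedance(p, freq) - z_measured
                return np.concatenate([dz.real, dz.imag])
            return least_squares(residuals, x0, bounds=(0, np.inf)).x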

  7. Two-Dimensional Numerical Modeling of Intracontinental Extension: A Case Study Of the Baikal Rift Formation

    DEFF Research Database (Denmark)

    Yang, H.; Chemia, Zurab; Artemieva, Irina

    The Baikal Rift zone (BRZ) is a narrow (10 km) active intra-continental basin, located at the boundary between the Amurian and Eurasian Plates. Although the BRZ is one of the major tectonically active rift zones in the world and it has been a subject of numerous geological...... on topography, basin depth, the structure of the crust, lithosphere thickness, and the location of major tectonic faults. Our goal is to determine the physical models that reproduce reasonably well the observed deformation patterns of the BRZ. We perform a systematic analysis of the parameter space in order...

  8. An Improved Analytical Model of the Local Interstellar Magnetic Field: The Extension to Compressibility

    Energy Technology Data Exchange (ETDEWEB)

    Kleimann, Jens; Fichtner, Horst [Ruhr-Universität Bochum, Fakultät für Physik und Astronomie, Institut für Theoretische Physik IV, Bochum (Germany); Röken, Christian, E-mail: jk@tp4.rub.de, E-mail: hf@tp4.rub.de, E-mail: christian.roeken@mathematik.uni-regensburg.de [Universität Regensburg, Fakultät für Mathematik, Regensburg (Germany)

    2017-03-20

    A previously published analytical magnetohydrodynamic model for the local interstellar magnetic field in the vicinity of the heliopause (Röken et al. 2015) is extended from incompressible to compressible, yet predominantly subsonic flow, considering both isothermal and adiabatic equations of state. Exact expressions and suitable approximations for the density and the flow velocity are derived and discussed. In addition to the stationary induction equation, these expressions also satisfy the momentum balance equation along stream lines. The practical usefulness of the corresponding, still exact, analytical magnetic field solution is assessed by comparing it quantitatively to results from a fully self-consistent magnetohydrodynamic simulation of the interstellar magnetic field draping around the heliopause.

  9. Some general remarks on hyperplasticity modelling and its extension to partially saturated soils

    Science.gov (United States)

    Lei, Xiaoqin; Wong, Henry; Fabbri, Antonin; Bui, Tuan Anh; Limam, Ali

    2016-06-01

    The essential ideas and equations of classic plasticity and hyperplasticity are successively recalled and compared, in order to highlight their differences and complementarities. The former is based on the mathematical framework proposed by Hill (The mathematical theory of plasticity. Oxford University Press, Oxford, 1950), whereas the latter is founded on the orthogonality hypothesis of Ziegler (An introduction to thermomechanics. Elsevier, North-Holland, 1983). The main drawback of classic plasticity is the possibility of violating the second principle of thermodynamics, while the relative ease to conjecture the yield function in order to approach experimental results is its main advantage. By opposition, the a priori satisfaction of thermodynamic principles constitutes the chief advantage of hyperplasticity theory. Noteworthy is also the fact that this latter approach allows a finer energy partition; in particular, the existence of frozen energy emerges as a natural consequence from its theoretical formulation. On the other hand, the relative difficulty to conjecture an efficient dissipation function to produce accurate predictions is its main drawback. The two theories are thus better viewed as two complementary approaches. Following this comparative study, a methodology to extend the hyperplasticity approach initially developed for dry or saturated materials to the case of partially saturated materials, accounting for interface energies and suction effects, is developed. A particular example based on the yield function of modified Cam-Clay model is then presented. It is shown that the approach developed leads to a model consistent with other existing works.
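
    In generic terms (the symbols below follow the standard two-potential notation, not the specific expressions of the works discussed), the hyperplastic structure referred to above derives all constitutive behaviour from a free energy $\psi(\varepsilon,\alpha)$ and a dissipation potential $d(\alpha,\dot{\alpha}) \ge 0$: the stress is $\sigma = \partial\psi/\partial\varepsilon$, the generalized stress is $\bar{\chi} = -\partial\psi/\partial\alpha$, the dissipative generalized stress is $\chi = \partial d/\partial\dot{\alpha}$, and Ziegler's orthogonality hypothesis identifies $\chi = \bar{\chi}$. Since $\chi\,\dot{\alpha} = d \ge 0$ for a rate-independent (first-order homogeneous) $d$, the second law is satisfied by construction, and the yield surface is recovered as the degenerate Legendre transform of $d$ with respect to $\dot{\alpha}$.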

  10. A model for a correlated random walk based on the ordered extension of pseudopodia.

    Directory of Open Access Journals (Sweden)

    Peter J M Van Haastert

    Full Text Available Cell migration in the absence of external cues is well described by a correlated random walk. Most single cells move by extending protrusions called pseudopodia. To deduce how cells walk, we have analyzed the formation of pseudopodia by Dictyostelium cells. We have observed that the formation of pseudopodia is highly ordered with two types of pseudopodia: First, de novo formation of pseudopodia at random positions on the cell body, and therefore in random directions. Second, pseudopod splitting near the tip of the current pseudopod in alternating right/left directions, leading to a persistent zig-zag trajectory. Here we analyzed the probability frequency distributions of the angles between pseudopodia and used this information to design a stochastic model for cell movement. Monte Carlo simulations show that the critical elements are the ratio of persistent splitting pseudopodia relative to random de novo pseudopodia, the Left/Right alternation, the angle between pseudopodia and the variance of this angle. Experiments confirm predictions of the model, showing reduced persistence in mutants that are defective in pseudopod splitting and in mutants with an irregular cell surface.
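
    A minimal Monte Carlo sketch of the walk just described is given below; the step length, splitting probability and angle statistics are illustrative placeholders rather than the measured Dictyostelium values:

        import numpy as np

        rng = np.random.default_rng(0)

        def simulate_walk(n_steps=1000, p_split=0.7, split_angle=55.0,
                          angle_sd=15.0, step=1.0):
            """Correlated random walk from alternating-split and de novo pseudopodia."""
            pos = np.zeros((n_steps + 1, 2))
            heading = rng.uniform(0, 2 * np.pi)
            side = 1                                  # next split goes right (+1) or left (-1)
            for i in range(n_steps):
                if rng.random() < p_split:            # splitting pseudopod: zig-zag persistence
                    heading += side * np.deg2rad(rng.normal(split_angle, angle_sd))
                    side = -side                      # alternate right/left
                else:                                 # de novo pseudopod: random direction
                    heading = rng.uniform(0, 2 * np.pi)
                pos[i + 1] = pos[i] + step * np.array([np.cos(heading), np.sin(heading)])
            return pos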

  11. Hippocampal adaptive response following extensive neuronal loss in an inducible transgenic mouse model.

    Directory of Open Access Journals (Sweden)

    Kristoffer Myczek

    Full Text Available Neuronal loss is a common component of a variety of neurodegenerative disorders (including Alzheimer's, Parkinson's, and Huntington's disease) and brain traumas (stroke, epilepsy, and traumatic brain injury). One brain region that commonly exhibits neuronal loss in several neurodegenerative disorders is the hippocampus, an area of the brain critical for the formation and retrieval of memories. Long-lasting and sometimes unrecoverable deficits caused by neuronal loss present a unique challenge for clinicians and for researchers who attempt to model these traumas in animals. Can these deficits be recovered, and if so, is the brain capable of regeneration following neuronal loss? To address this significant question, we utilized the innovative CaM/Tet-DTA mouse model that selectively induces neuronal ablation. We found that we are able to inflict a consistent and significant lesion to the hippocampus, resulting in hippocampally-dependent behavioral deficits and a long-lasting upregulation in neurogenesis, suggesting that this process might be a critical part of hippocampal recovery. In addition, we provide novel evidence of angiogenic and vasculature changes following hippocampal neuronal loss in CaM/Tet-DTA mice. We posit that angiogenesis may be an important factor that promotes neurogenic upregulation following hippocampal neuronal loss, and both factors, angiogenesis and neurogenesis, can contribute to the adaptive response of the brain for behavioral recovery.

  12. A Mesoscopic Analytical Model to Predict the Onset of Wrinkling in Plain Woven Preforms under Bias Extension Shear Deformation

    Directory of Open Access Journals (Sweden)

    Abbas Hosseini

    2017-10-01

    Full Text Available A mesoscopic analytical model of wrinkling of Plain-Woven Composite Preforms (PWCPs) under the bias extension test is presented, based on a new instability analysis. The analysis is aimed at facilitating a better understanding of the nature of wrinkle formation in woven fabrics caused by large in-plane shear, while accounting for the effect of fabric and process parameters on the onset of wrinkling. To this end, the mechanism of wrinkle formation in PWCPs at the mesoscale is simplified, and an equivalent structure composed of bars and different types of springs is proposed, mimicking the behavior of a representative PWCP element in the post-locking state. The parameters of this equivalent structure are derived from geometric and mechanical characteristics of the PWCP. The principle of minimum total potential energy is employed to formulate the model, and experimental validation is carried out to demonstrate the effectiveness of the derived wrinkling prediction equation.
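
    Stated generically (these are not the authors' specific expressions), such an energy-based instability criterion takes the form: the equilibrium configuration $q$ of the bar-and-spring cell makes the total potential energy $\Pi(q) = U_{\mathrm{int}}(q) - W_{\mathrm{ext}}(q)$ stationary, $\delta\Pi = 0$, and out-of-plane wrinkling becomes possible once the flat (in-plane) configuration ceases to be a minimum, i.e. when the second variation of $\Pi$ with respect to the out-of-plane degree of freedom first vanishes.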

  13. The Assessment of Comprehensive Vulnerability of Chemical Industrial Park Based on Entropy Method and Matter-element Extension Model

    Directory of Open Access Journals (Sweden)

    Yan Jingyi

    2016-01-01

    Full Text Available The paper studies the connotative meaning of, and evaluation methods and models for, the vulnerability of chemical industrial parks, based on an in-depth analysis of relevant research results in China and abroad. It summarizes the features of menacing vulnerability and structural vulnerability and sets out detailed influence factors such as personnel vulnerability, infrastructural vulnerability, environmental vulnerability and the vulnerability arising from safety management failures. Using a vulnerability scoping diagram, 21 evaluation indexes and an index system for the vulnerability evaluation of chemical industrial parks are established. Comprehensive weights are calculated with the entropy method and combined with a matter-element extension model to make the quantitative evaluation, which is then applied successfully to the evaluation of a chemical industrial park. This method provides new ideas and ways for enhancing the overall safety of chemical industrial parks.
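
    The entropy-weight step follows a standard recipe, sketched below; the index matrix and its normalization are illustrative assumptions, and the subsequent matter-element extension step is not shown:

        import numpy as np

        def entropy_weights(X):
            """Objective index weights from the entropy method.

            X: (n_samples, m_indexes) matrix of non-negative index values."""
            X = np.asarray(X, dtype=float)
            P = X / X.sum(axis=0)                      # proportion of each sample per index
            n = X.shape[0]
            with np.errstate(divide="ignore", invalid="ignore"):
                plogp = np.where(P > 0, P * np.log(P), 0.0)
            e = -plogp.sum(axis=0) / np.log(n)         # information entropy of each index
            d = 1.0 - e                                # degree of diversification
            return d / d.sum()                         # normalized comprehensive weights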

  14. Extension of non-linear beam models with deformable cross sections

    Science.gov (United States)

    Sokolov, I.; Krylov, S.; Harari, I.

    2015-12-01

    Geometrically exact beam theory is extended to allow distortion of the cross section. We present an appropriate set of cross-section basis functions and provide physical insight to the cross-sectional distortion from linear elastostatics. The beam formulation in terms of material (back-rotated) beam internal force resultants and work-conjugate kinematic quantities emerges naturally from the material description of virtual work of constrained finite elasticity. The inclusion of cross-sectional deformation allows straightforward application of three-dimensional constitutive laws in the beam formulation. Beam counterparts of applied loads are expressed in terms of the original three-dimensional data. Special attention is paid to the treatment of the applied stress, keeping in mind applications such as hydrogel actuators under environmental stimuli or devices made of electroactive polymers. Numerical comparisons show the ability of the beam model to reproduce finite elasticity results with good efficiency.

  15. Landau-Zener extension of the Tavis-Cummings model: structure of the solution

    Science.gov (United States)

    Sun, Chen; Sinitsyn, Nikolai

    We explore the recently discovered solution of the driven Tavis-Cummings model (DTCM). It describes interaction of arbitrary number of two-level systems with a bosonic mode that has linearly time-dependent frequency. We derive compact and tractable expressions for transition probabilities in terms of the well known special functions. In the new form, our formulas are suitable for fast numerical calculations and analytical approximations. As an application, we obtain the semiclassical limit of the exact solution and compare it to prior approximations. We also reveal connection between DTCM and q-deformed binomial statistics. Under the auspices of the National Nuclear Security Administration of the U.S. Department of Energy at Los Alamos National Laboratory under Contract No. DE-AC52-06NA25396. Authors also thank the support from the LDRD program at LANL.

  16. Language model: Extension to solve inconsistency, incompleteness, and short query in cultural heritage collection

    Science.gov (United States)

    Tan, Kian Lam; Lim, Chen Kim

    2017-10-01

    With the explosive growth of online information such as email messages, news articles, and scientific literature, many institutions and museums are converting their cultural collections from physical data to digital format. However, this conversion has resulted in issues of inconsistency and incompleteness, while the use of inaccurate keywords leads to the short-query problem. Most of the time, the inconsistency and incompleteness are caused by aggregation faults in annotating a document itself, while the short-query problem is caused by naive users who lack prior knowledge and experience in the cultural heritage domain. In this paper, we present an approach to solve the problems of inconsistency, incompleteness and short queries by incorporating a Term Similarity Matrix into the Language Model. Our approach is tested on the Cultural Heritage in CLEF (CHiC) collection, which consists of short queries and documents. The results show that the proposed approach is effective and improves the accuracy of retrieval.
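
    One way to picture the combination described above is a query-likelihood language model in which each query term is scored not only by its own document count but also by similarity-weighted counts of related terms. The sketch below is illustrative: the smoothing scheme and the way the Term Similarity Matrix enters the estimator are assumptions, not the paper's exact formulation.

        import math
        from collections import Counter

        def score(query_terms, doc_terms, sim, collection_terms, mu=2000):
            """Dirichlet-smoothed query likelihood with term-similarity expansion.

            sim: dict mapping (query_term, doc_term) -> similarity in [0, 1]."""
            doc_tf = Counter(doc_terms)
            coll_tf = Counter(collection_terms)
            doc_len, coll_len = len(doc_terms), len(collection_terms)
            log_p = 0.0
            for q in query_terms:
                # expanded count: exact matches plus similarity-weighted related terms
                tf = doc_tf.get(q, 0) + sum(sim.get((q, t), 0.0) * c
                                            for t, c in doc_tf.items() if t != q)
                p_coll = (coll_tf.get(q, 0) + 1) / (coll_len + 1)   # add-one to avoid log(0)
                log_p += math.log((tf + mu * p_coll) / (doc_len + mu))
            return log_p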

  17. Fuzzy sets as extension of probabilistic models for evaluating human reliability

    International Nuclear Information System (INIS)

    Przybylski, F.

    1996-11-01

    On the basis of a survey of established quantification methodologies for evaluating human reliability, a new computerized methodology was developed in which user uncertainties are considered separately. In this quantification method FURTHER (FUzzy Sets Related To Human Error Rate Prediction), user uncertainties are quantified separately from model and data uncertainties. As tools, fuzzy sets are applied which, however, stay hidden from the method's user. In the quantification process the user only chooses an action pattern, performance shaping factors and natural language expressions. The acknowledged method HEART (Human Error Assessment and Reduction Technique) serves as the foundation of the fuzzy set approach FURTHER. By means of this method, the selection of a basic task in connection with its basic error probability, the decision how correct the basic task's selection is, the selection of a performance shaping factor, and the decision how correct the selection and how important the performance shaping factor is, were identified as aspects of fuzzification. This fuzzification is made on the basis of data collection and information from the literature as well as estimation by competent persons. To verify the amount of additional information gained by the use of fuzzy sets, a benchmark session was carried out. In this benchmark twelve actions were assessed by five test persons. For the same degree of detail in the action modelling process, the bandwidths of the interpersonal evaluations decrease in FURTHER in comparison with HEART. The uncertainties of the single results could not be reduced up to now. The benchmark sessions conducted so far showed plausible results. Further testing of the fuzzy set approach using better confirmed fuzzy sets can only be achieved in future practical application. Adequate procedures, however, are provided. (orig.) [de]
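
    For context, the HEART quantification that FURTHER builds on computes a human error probability (HEP) from a generic task probability and multiplicative error-producing conditions (EPCs). The sketch below shows the crisp formula and one simple way to propagate a triangular fuzzy assessed proportion of affect (APOA); the fuzzification scheme and the numbers in the usage comment are illustrative, not FURTHER's actual membership functions or data.

        def heart_hep(generic_hep, epcs):
            """Crisp HEART: HEP = GEP * prod((EPC_i - 1) * APOA_i + 1).

            epcs: list of (max_effect, apoa) pairs."""
            hep = generic_hep
            for max_effect, apoa in epcs:
                hep *= (max_effect - 1.0) * apoa + 1.0
            return min(hep, 1.0)

        def heart_hep_fuzzy(generic_hep, epcs_fuzzy):
            """Same formula with each APOA given as a triangular fuzzy number (low, mode, high).

            Because the HEP is monotone in every APOA, the resulting triangular HEP
            is obtained by evaluating the crisp formula at the three vertices."""
            return tuple(heart_hep(generic_hep, [(m, a[i]) for m, a in epcs_fuzzy])
                         for i in range(3))

        # Illustrative usage: a task with generic HEP 0.003 and two EPCs.
        # crisp = heart_hep(0.003, [(17, 0.4), (3, 0.6)])
        # fuzzy = heart_hep_fuzzy(0.003, [(17, (0.3, 0.4, 0.5)), (3, (0.5, 0.6, 0.7))])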

  18. Electroweak and Flavor Physics in Extensions of the Standard Model with Large Extra Dimensions

    CERN Document Server

    Delgado, A.; Quiros, M.

    2000-01-01

    We study the implications of extra dimensions of size $R\\sim 1/TeV$ on electroweak and flavor physics due to the presence of Kaluza-Klein excitations of the SM gauge-bosons. We consider several scenarios with the SM fermions either living in the bulk or being localized at different points of an extra dimension. Global fits to electroweak observables provide lower bounds on 1/R, which are generically in the 2-5 TeV range. We find, however, certain models where the fit to electroweak observables is better than in the SM, because of an improvement in the prediction to the weak charge Q_W. We also consider the case of softly-broken supersymmetric theories and we find new non-decoupling effects that put new constraints on 1/R. If quarks of different families live in different points of the extra dimension, we find that the Kaluza-Klein modes of the SM gluons generate (at tree level) dangerous flavor and CP-violating interactions. The lower bounds on 1/R can increase in this case up to 5000 TeV, disfavoring these s...

  19. The Song Remains the Same: A Replication and Extension of the MUSIC Model

    Science.gov (United States)

    Rentfrow, Peter J.; Goldberg, Lewis R.; Stillwell, David J.; Kosinski, Michal; Gosling, Samuel D.; Levitin, Daniel J.

    2012-01-01

    There is overwhelming anecdotal and empirical evidence for individual differences in musical preferences. However, little is known about what drives those preferences. Are people drawn to particular musical genres (e.g., rap, jazz) or to certain musical properties (e.g., lively, loud)? Recent findings suggest that musical preferences can be conceptualized in terms of five orthogonal dimensions: Mellow, Unpretentious, Sophisticated, Intense, and Contemporary (conveniently, MUSIC). The aim of the present research is to replicate and extend that work by empirically examining the hypothesis that musical preferences are based on preferences for particular musical properties and psychological attributes as opposed to musical genres. Findings from Study 1 replicated the five-factor MUSIC structure using musical excerpts from a variety of genres and subgenres and revealed musical attributes that differentiate each factor. Results from Studies 2 and 3 show that the MUSIC structure is recoverable using musical pieces from only the jazz and rock genres, respectively. Taken together, the current work provides strong evidence that preferences for music are determined by specific musical attributes and that the MUSIC model is a robust framework for conceptualizing and measuring such preferences. PMID:24825945

  20. The Song Remains the Same: A Replication and Extension of the MUSIC Model.

    Science.gov (United States)

    Rentfrow, Peter J; Goldberg, Lewis R; Stillwell, David J; Kosinski, Michal; Gosling, Samuel D; Levitin, Daniel J

    2012-12-01

    There is overwhelming anecdotal and empirical evidence for individual differences in musical preferences. However, little is known about what drives those preferences. Are people drawn to particular musical genres (e.g., rap, jazz) or to certain musical properties (e.g., lively, loud)? Recent findings suggest that musical preferences can be conceptualized in terms of five orthogonal dimensions: Mellow, Unpretentious, Sophisticated, Intense, and Contemporary (conveniently, MUSIC). The aim of the present research is to replicate and extend that work by empirically examining the hypothesis that musical preferences are based on preferences for particular musical properties and psychological attributes as opposed to musical genres. Findings from Study 1 replicated the five-factor MUSIC structure using musical excerpts from a variety of genres and subgenres and revealed musical attributes that differentiate each factor. Results from Studies 2 and 3 show that the MUSIC structure is recoverable using musical pieces from only the jazz and rock genres, respectively. Taken together, the current work provides strong evidence that preferences for music are determined by specific musical attributes and that the MUSIC model is a robust framework for conceptualizing and measuring such preferences.

  1. pypk - A Python extension module to handle chemical kinetics in plasma physics modeling

    Directory of Open Access Journals (Sweden)

    2008-06-01

    Full Text Available PLASMAKIN is a package to handle physical and chemical data used in plasma physics modeling and to compute gas-phase and gas-surface kinetics data: particle production and loss rates, photon emission spectra and energy exchange rates. A large number of species properties and reaction types are supported, namely: gas or electron temperature dependent collision rate coefficients, vibrational and cascade levels, evaluation of branching ratios, superelastic and other reverse processes, three-body collisions, radiation imprisonment and photoelectric emission. Support of non-standard rate coefficient functions can be handled by a user-supplied shared library.

    The main block of the PLASMAKIN package is a Fortran module that can be included in a user's program or compiled as a shared library, libpk. pypk is a new addition to the package and provides access to libpk from Python programs. It is built on top of the ctypes foreign function library module and is prepared to work with several Fortran compilers. However, pypk is more than a wrapper and provides its own classes and functions taking advantage of Python language characteristics. Integration with Python tools allows substantial productivity gains in program development and insight into plasma physics problems.
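
    A minimal sketch of what a ctypes binding of this kind looks like is given below; the routine name, its signature and the library path are hypothetical placeholders, not pypk's actual API:

        import ctypes

        # Load the shared library built from the Fortran module (path is hypothetical).
        _libpk = ctypes.CDLL("./libpk.so")

        # Declare the signature of a hypothetical rate-coefficient routine:
        #   double pk_rate_coefficient(int reaction_id, double gas_temperature)
        _libpk.pk_rate_coefficient.argtypes = [ctypes.c_int, ctypes.c_double]
        _libpk.pk_rate_coefficient.restype = ctypes.c_double

        def rate_coefficient(reaction_id: int, temperature: float) -> float:
            """Thin Python wrapper around the Fortran routine exposed by libpk."""
            return _libpk.pk_rate_coefficient(reaction_id, temperature)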

  2. The other Higgses, at resonance, in the Lee-Wick extension of the Standard Model

    CERN Document Server

    Figy, Terrance

    2011-01-01

    Within the framework of the Lee Wick Standard Model (LWSM) we investigate Higgs pair production $gg \\to h_0 h_0$, $gg \\to h_0 \\tilde p_0$ and top pair production $gg \\to \\bar tt$ at the Large Hadron Collider (LHC), where the neutral particles from the Higgs sector ($h_0$, $\\tilde h_0$ and $\\tilde p_0$) appear as possible resonant intermediate states. We investigate the signal $gg \\to h_0 h_0 \\to \\bar b b \\gamma \\gamma$ and we find that the LW Higgs, depending on its mass-range, can be seen not long after the LHC upgrade in 2012. More precisely this happens when the new LW Higgs states are below the top pair threshold. In $gg \\to \\bar tt$ the LW states, due to the wrong-sign propagator and negative width, lead to a dip-peak structure instead of the usual peak-dip structure which gives a characteristic signal especially for low-lying LW Higgs states. We comment on the LWSM and the forward-backward asymmetry in view of the measurement at the TeVatron. Furthermore, we present a technique which reduces the hyperbo...

  3. Analytical Model for Estimating Terrestrial Cosmic Ray Fluxes Nearly Anytime and Anywhere in the World: Extension of PARMA/EXPACS.

    Directory of Open Access Journals (Sweden)

    Tatsuhiko Sato

    Full Text Available By extending our previously established model, here we present a new model called "PHITS-based Analytical Radiation Model in the Atmosphere (PARMA) version 3.0," which can instantaneously estimate terrestrial cosmic ray fluxes of neutrons, protons, ions with charge up to 28 (Ni), muons, electrons, positrons, and photons nearly anytime and anywhere in the Earth's atmosphere. The model comprises numerous analytical functions with parameters whose numerical values were fitted to reproduce the results of the extensive air shower (EAS) simulation performed by the Particle and Heavy Ion Transport code System (PHITS). The accuracy of the EAS simulation was well verified using various experimental data, while that of PARMA3.0 was confirmed by the high R2 values of the fit. The models to be used for estimating radiation doses due to cosmic ray exposure, cosmic ray induced ionization rates, and count rates of neutron monitors were validated by investigating their capability to reproduce those quantities measured under various conditions. PARMA3.0 is available freely and is easy to use, as implemented in an open-access software program EXcel-based Program for Calculating Atmospheric Cosmic ray Spectrum (EXPACS). Because of these features, the new version of PARMA/EXPACS can be an important tool in various research fields such as geosciences, cosmic ray physics, and radiation research.

  4. Analytical Model for Estimating Terrestrial Cosmic Ray Fluxes Nearly Anytime and Anywhere in the World: Extension of PARMA/EXPACS.

    Science.gov (United States)

    Sato, Tatsuhiko

    2015-01-01

    By extending our previously established model, here we present a new model called "PHITS-based Analytical Radiation Model in the Atmosphere (PARMA) version 3.0," which can instantaneously estimate terrestrial cosmic ray fluxes of neutrons, protons, ions with charge up to 28 (Ni), muons, electrons, positrons, and photons nearly anytime and anywhere in the Earth's atmosphere. The model comprises numerous analytical functions with parameters whose numerical values were fitted to reproduce the results of the extensive air shower (EAS) simulation performed by Particle and Heavy Ion Transport code System (PHITS). The accuracy of the EAS simulation was well verified using various experimental data, while that of PARMA3.0 was confirmed by the high R2 values of the fit. The models to be used for estimating radiation doses due to cosmic ray exposure, cosmic ray induced ionization rates, and count rates of neutron monitors were validated by investigating their capability to reproduce those quantities measured under various conditions. PARMA3.0 is available freely and is easy to use, as implemented in an open-access software program EXcel-based Program for Calculating Atmospheric Cosmic ray Spectrum (EXPACS). Because of these features, the new version of PARMA/EXPACS can be an important tool in various research fields such as geosciences, cosmic ray physics, and radiation research.

  5. Modeling Turkey National 2D Geo-Data Model as a CityGML Application Domain Extension in UML

    Directory of Open Access Journals (Sweden)

    Serpil Ates Aydar

    2016-12-01

    Full Text Available This paper presents the generation of the 3D national geo-data model of Turkey, which is compatible with the international OGC CityGML Encoding Standard. We prepare an ADE named CityGML-TRKBIS, produced by extending the existing thematic modules of CityGML according to TRKBIS needs. All thematic data groups in the TRKBIS geo-data model have been remodeled in order to generate the national large-scale 3D geo-data model for Turkey. Within the concept of a reference model-driven framework for modelling CityGML ADEs in UML, the first step is a conceptual mapping between CityGML and the national information model. To test all stages of the framework for Urban GIS Turkey, the Building data theme is mapped to the CityGML Building thematic model. An attempt is made to map all classes, attributes, code lists and code list values of TRKBIS to the related CityGML concepts. New classes, attributes and code list values to be added to the CityGML Building model are determined based on this conceptual mapping. Finally, the new model for CityGML-TRKBIS.BI is established in UML by subclassing the related CityGML classes. This study provides new insights into 3D applications in Turkey. The generated 3D geo-data model for the building thematic class will be used as a common exchange format that meets 2D, 2.5D and 3D implementation needs at the national level.

  6. Data Requirements and Modeling for Gas Hydrate-Related Mixtures and a Comparison of Two Association Models

    DEFF Research Database (Denmark)

    Liang, Xiaodong; Aloupis, Georgios; Kontogeorgis, Georgios M.

    2017-01-01

    used association models in the chemical and petroleum industries. The CPA model is extensively used in flow assurance, in which the gas hydrate formation is one of the central topics. Experimental data play a vital role in validating models and obtaining model parameters. In this work, we will compare...

  7. Research on Computer Aided Innovation Model of Weapon Equipment Requirement Demonstration

    Science.gov (United States)

    Li, Yong; Guo, Qisheng; Wang, Rui; Li, Liang

    Firstly, in order to overcome the shortcomings of using AD or TRIZ alone and to solve the problems currently existing in weapon equipment requirement demonstration, the paper constructs a method system for weapon equipment requirement demonstration combining QFD, AD, TRIZ and FA. Then, we construct a CAI model framework for weapon equipment requirement demonstration, which includes a requirement decomposition model, a requirement mapping model and a requirement plan optimization model. Finally, we construct the computer-aided innovation model of weapon equipment requirement demonstration and develop CAI software for equipment requirement demonstration.

  8. A 2nd generation static model of greenhouse energy requirements (horticern) : a comparison with dynamic models

    CERN Document Server

    Jolliet, O; Munday, G L

    1989-01-01

    Optimisation of a greenhouse and its components requires a suitable model permitting precise determination of its energy requirements. Existing static models are simple but lack precision; dynamic models, though more precise, are unsuitable for use over long periods and difficult to handle in practice. A theoretical study and measurements from the CERN trial greenhouse have allowed the development of a new static model named "HORTICERN", precise and easy to use for predicting energy consumption, which takes into account the effects of solar energy, wind and radiative loss to the sky. This paper compares the HORTICERN model with the dynamic models of Bot, Takakura, Van Bavel and Gembloux, and demonstrates that its precision is comparable; differences are on average less than 5%, and it is independent of the type of greenhouse (e.g. single or double glazing, Hortiplus, etc.) and climate. The HORTICERN method has been developed for PC use and is proving to be a powerful tool for greenhouse optimisation by research work...
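
    A static energy balance of the general kind described here can be sketched as follows; the structure of the balance and every coefficient below are illustrative assumptions, not the HORTICERN equations:

        def static_energy_requirement(hours, cover_area, floor_area,
                                      t_inside, t_outside, t_sky,
                                      wind_speed, solar_gain_w_m2,
                                      u_value=6.0, wind_coeff=0.3,
                                      emissivity=0.85, solar_efficiency=0.6):
            """Heating requirement (kWh) over a period from a simple static balance."""
            sigma = 5.67e-8                               # Stefan-Boltzmann constant, W m-2 K-4
            u_eff = u_value + wind_coeff * wind_speed     # wind increases the loss coefficient
            q_conduction = u_eff * cover_area * (t_inside - t_outside)
            q_sky = emissivity * sigma * cover_area * ((t_inside + 273.15) ** 4
                                                       - (t_sky + 273.15) ** 4)
            q_solar = solar_efficiency * solar_gain_w_m2 * floor_area
            q_net = max(q_conduction + q_sky - q_solar, 0.0)   # W; surplus gain is not banked
            return q_net * hours / 1000.0                      # kWh over the period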

  9. 3D Core Model for simulation of nuclear power plants: Simulation requirements, model features, and validation

    International Nuclear Information System (INIS)

    Zerbino, H.

    1999-01-01

    In 1994-1996, Thomson Training and Simulation (TT and S) carried out the D50 Project, which involved the design and construction of optimized replica simulators for one Dutch and three German nuclear power plants. It was recognized early on that the faithful reproduction of the Siemens reactor control and protection systems would impose extremely stringent demands on the simulation models, particularly the core physics and the RCS thermohydraulics. The quality of the models, and their thorough validation, were thus essential. The present paper describes the main features of the fully 3D core model implemented by TT and S, and its extensive validation campaign, which was defined in extremely positive collaboration with the customer and the core data suppliers. (author)

  10. Effects of Lorentz violation through the γe → Wνe process in the Standard Model extension

    International Nuclear Information System (INIS)

    Aranda, J I; Ramírez-Zavaleta, F; Tututi, E S; Rosete, D A; Tlachino, F J; Toscano, J J

    2014-01-01

    Physics beyond the Fermi scale could show up through deviations of the gauge couplings predicted by the electroweak Yang–Mills sector. This possibility is explored in the context of the International Linear Collider through the helicity amplitudes for the γe → Wν_e reaction to which the trilinear WWγ coupling contributes. The new physics effects on this vertex are parametrized in a model-independent fashion through an effective electroweak Yang–Mills sector, which is constructed by considering two essentially different sources of new physics. In one scenario, Lorentz violation will be considered exclusively as the source of new physics effects. This type of new physics is considered in an extension of the Standard Model (SM) that is known as the SM extension (SME), which is an effective field theory that contemplates CPT and Lorentz violation in a model-independent fashion. Any source of new physics that respects the Lorentz symmetry will be considered within the general context of the well-known conventional effective SM (CESM) extension. Both the SME and CESM descriptions include gauge invariant operators of dimension higher than 4, which, in general, transform as Lorentz tensors of rank higher than zero. In the former theory, observer Lorentz invariants are constructed by contracting these operators with constant Lorentz tensors, whereas in the latter the corresponding Lorentz invariant interactions are obtained contracting such operators with products of the metric tensor. In this work, we focus on a dimension 6 Lorentz 2-tensor, O_αβ, which arises from an effective SU(2)_L Yang–Mills sector. Contributions to the WWγ coupling arising from dimension 4 operators are ignored since they are strongly constrained. When these operators are contracted with a constant antisymmetric background tensor, b_αβ, the corresponding observer invariant belongs to the SME, whereas if they are contracted with the metric tensor, g_αβ, an effective interaction in

  11. Model Application Multiattribute (M.A.U in decision-making processes for departmental charges in Education Technical University of Babahoyo, Quevedo Extension

    Directory of Open Access Journals (Sweden)

    Gonzalo Arturo Peñafiel Nivela

    2017-05-01

    Full Text Available This paper examines the application of the multiattribute utility (M.A.U.) model to the decision-making processes of departmental heads at the Technical University of Babahoyo, Quevedo Extension, and describes the main features and structure of alternative solutions that reflect the needs of the institution. Within higher education institutions, evaluative processes require department heads to provide specific, evidence-based solutions, forcing the leader to make immediate decisions in context; the need for higher hierarchical approval delays decision making and entails losses for the institution. Consolidating this quantitative tool makes it possible to analyse the institution's real needs (OUTPUT), i.e. the weights required to address the problem, against the profiles of the candidates or bids under consideration (INPUT), which are then contrasted for a logical selection and application. The methodology combines hypothetical-deductive reasoning with quantitative analysis of the data generated in each semester's recruitment process for academic programmes, which must justify the hiring of staff at the university and is shared with the University Community Council.
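
    As a minimal, hedged sketch of the weighted multi-attribute evaluation the record describes, the following Python snippet ranks hypothetical candidates against institutional weights. Attribute names, weights and scores are invented placeholders, not values from the study.

    ```python
    # Minimal multi-attribute utility (MAU) sketch; all data below are hypothetical.

    def mau_score(weights, scores):
        """Aggregate normalized attribute scores into a single utility value."""
        total_weight = sum(weights.values())
        return sum(weights[a] * scores[a] for a in weights) / total_weight

    # Institutional needs (OUTPUT): weights for each decision attribute.
    weights = {"teaching_experience": 0.4, "research_record": 0.3, "cost": 0.3}

    # Candidate profiles (INPUT): scores on a common 0-1 scale.
    candidates = {
        "candidate_A": {"teaching_experience": 0.8, "research_record": 0.6, "cost": 0.5},
        "candidate_B": {"teaching_experience": 0.6, "research_record": 0.9, "cost": 0.7},
    }

    ranking = sorted(candidates, key=lambda c: mau_score(weights, candidates[c]), reverse=True)
    print(ranking)  # highest-utility candidate first
    ```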

  12. Modeling Nitrous Oxide Production during Biological Nitrogen Removal via Nitrification and Denitrification: Extensions to the General ASM Models

    DEFF Research Database (Denmark)

    Ni, Bing-Jie; Ruscalleda, Maël; Pellicer i Nàcher, Carles

    2011-01-01

    Nitrous oxide (N2O) can be formed during biological nitrogen (N) removal processes. In this work, a mathematical model is developed that describes N2O production and consumption during activated sludge nitrification and denitrification. The well-known ASM process models are extended to capture N2O...

  13. Evidence for Sub-Chandrasekhar Mass Type Ia Supernovae from an Extensive Survey of Radiative Transfer Models

    Science.gov (United States)

    Goldstein, Daniel A.; Kasen, Daniel

    2018-01-01

    There are two classes of viable progenitors for normal Type Ia supernovae (SNe Ia): systems in which a white dwarf explodes at the Chandrasekhar mass (M_Ch), and systems in which a white dwarf explodes below the Chandrasekhar mass (sub-M_Ch). It is not clear which of these channels is dominant; observations and light-curve modeling have provided evidence for both. Here we use an extensive grid of 4500 time-dependent, multiwavelength radiation transport simulations to show that the sub-M_Ch model can reproduce the entirety of the width–luminosity relation, while the M_Ch model can only produce the brighter events (0.8 …), using models that vary the mass, kinetic energy, and compositional structure of the ejecta, thereby realizing a broad range of possible outcomes of white dwarf explosions. We provide fitting functions based on our large grid of detailed simulations that map observable properties of SNe Ia, such as peak brightness and light-curve width, to physical parameters such as ^56Ni and total ejected mass. These can be used to estimate the physical properties of observed SNe Ia.

  14. A New Approach to the Modeling and Analysis of Fracture through Extension of Continuum Mechanics to the Nanoscale

    KAUST Repository

    Sendova, T.

    2010-02-15

    In this paper we focus on the analysis of the partial differential equations arising from a new approach to modeling brittle fracture based on an extension of continuum mechanics to the nanoscale. It is shown that ascribing constant surface tension to the fracture surfaces and using the appropriate crack surface boundary condition given by the jump momentum balance leads to a sharp crack opening profile at the crack tip but predicts logarithmically singular crack tip stress. However, a modified model, where the surface excess property is responsive to the curvature of the fracture surfaces, yields bounded stresses and a cusp-like opening profile at the crack tip. Further, two possible fracture criteria in the context of the new theory are discussed. The first is an energy-based crack growth condition, while the second employs the finite crack tip stress the model predicts. The classical notion of energy release rate is based upon the singular solution, whereas for the modeling approach adopted here, a notion analogous to the energy release rate arises through a different mechanism associated with the rate of working of the surface excess properties at the crack tip. © The Author(s), 2010.

  15. Modeling nitrous oxide production during biological nitrogen removal via nitrification and denitrification: extensions to the general ASM models.

    Science.gov (United States)

    Ni, Bing-Jie; Ruscalleda, Maël; Pellicer-Nàcher, Carles; Smets, Barth F

    2011-09-15

    Nitrous oxide (N2O) can be formed during biological nitrogen (N) removal processes. In this work, a mathematical model is developed that describes N2O production and consumption during activated sludge nitrification and denitrification. The well-known ASM process models are extended to capture N2O dynamics during both nitrification and denitrification in biological N removal. Six additional processes and three additional reactants, all involved in known biochemical reactions, have been added. The validity and applicability of the model is demonstrated by comparing simulations with experimental data on N2O production from four different mixed culture nitrification and denitrification reactor study reports. Modeling results confirm that hydroxylamine oxidation by ammonium oxidizers (AOB) occurs 10 times slower when NO2- participates as final electron acceptor compared to the oxic pathway. Among the four denitrification steps, the last one (N2O reduction to N2) seems to be inhibited first when O2 is present. Overall, N2O production can account for 0.1-25% of the consumed N in different nitrification and denitrification systems, which can be well simulated by the proposed model. In conclusion, we provide a modeling structure, which adequately captures N2O dynamics in autotrophic nitrification and heterotrophic denitrification driven biological N removal processes and which can form the basis for ongoing refinements.
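
    To make the kind of extension concrete, here is a toy ODE sketch of N2O production and consumption during denitrification, with O2 inhibiting the final reduction step as the abstract notes. Rate constants and the inhibition coefficient are illustrative assumptions, not parameters of the published model.

    ```python
    # Toy N2O production/consumption ODE, loosely inspired by the extended-ASM idea above.
    from scipy.integrate import solve_ivp

    k_no2, k_n2o, K_I_O2 = 2.0, 1.5, 0.1   # 1/d, 1/d, mg O2/L (assumed values)

    def rates(t, y, o2):
        no2, n2o, n2 = y
        r_prod = k_no2 * no2                           # NO2- reduced to N2O
        r_cons = k_n2o * n2o * K_I_O2 / (K_I_O2 + o2)  # N2O reduced to N2, inhibited by O2
        return [-r_prod, r_prod - r_cons, r_cons]

    sol = solve_ivp(rates, (0, 5), y0=[20.0, 0.0, 0.0], args=(0.5,))
    print(sol.y[1][-1])  # residual N2O after 5 days at 0.5 mg/L dissolved O2
    ```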

  16. Probing Minimal 5D Extensions of the Standard Model From LEP to an $e^{+} e^{-}$ Linear Collider

    CERN Document Server

    Mück, A; Rückl, R; Mück, Alexander; Pilaftsis, Apostolos; R\\"uckl, Reinhold

    2004-01-01

    We derive new improved constraints on the compactification scale of minimal 5-dimensional (5D) extensions of the Standard Model (SM) from electroweak and LEP2 data and estimate the reach of an e^+e^- linear collider such as TESLA. Our analysis is performed within the framework of non-universal 5D models, where some of the gauge and Higgs fields propagate in the extra dimension, while all fermions are localized on a S^1/Z_2 orbifold fixed point. Carrying out simultaneous multi-parameter fits of the compactification scale and the SM parameters to the data, we obtain lower bounds on this scale in the range between 4 and 6 TeV. These fits also yield the correlation of the compactification scale with the SM Higgs mass. Investigating the prospects at TESLA, we show that the so-called GigaZ option has the potential to improve these bounds by about a factor 2 in almost all 5D models. Furthermore, at the center of mass energy of 800 GeV and with an integrated luminosity of 10^3 fb^-1, linear collider experiments can p...

  17. A multi-compartment model for slow bronchial clearance of insoluble particles - Extension of the ICRP human respiratory tract models

    International Nuclear Information System (INIS)

    Sturm, R.; Hofmann, W.

    2006-01-01

    To incorporate the various mechanisms that are presently assumed to be responsible for the experimentally observed slow bronchial clearance into the HRTM, a multi-compartment model was developed to simulate the clearance of insoluble particles in the tracheobronchial tree of the human lung. The new model considers specific mass transfer paths that may play an important role for slow bronchial clearance. These include the accumulation of particulate mass in the peri-ciliary sol layer, phagocytosis of stored particles by airway macrophages and uptake of deposited mass by epithelial cells. Besides the gel layer representing fast mucociliary clearance, all cellular and non-cellular units involved in the slow clearance process are described by respective compartments that are connected by specific transfer rates. The gastrointestinal tract and lymph nodes are included into the model as final accumulation compartments, to which mass is transferred via the airway route and the transepithelial path. Predicted retention curves correspond well with previously published data. (authors)
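
    As a generic illustration of how such a compartment model is typically set up numerically, the sketch below builds a first-order transfer-rate matrix and integrates it. The compartment list and rate values are assumptions chosen to mirror the paths named above, not the published HRTM-extension parameters.

    ```python
    # Generic linear compartment-model sketch (illustrative rates, not published values).
    import numpy as np
    from scipy.integrate import solve_ivp

    # Compartments: 0 gel layer (fast mucociliary), 1 peri-ciliary sol layer,
    # 2 airway macrophages, 3 epithelium, 4 GI tract, 5 lymph nodes.
    k = {  # first-order transfer rates, 1/d (assumed)
        (0, 4): 10.0,   # gel layer -> GI tract (fast clearance)
        (1, 0): 0.2,    # sol layer -> gel layer
        (1, 2): 0.05,   # sol layer -> macrophages
        (2, 0): 0.03,   # macrophages -> gel layer
        (3, 5): 0.01,   # epithelium -> lymph nodes (transepithelial path)
    }

    n = 6
    A = np.zeros((n, n))
    for (src, dst), rate in k.items():
        A[src, src] -= rate
        A[dst, src] += rate

    y0 = np.array([0.6, 0.3, 0.0, 0.1, 0.0, 0.0])  # initial deposited fractions (assumed)
    sol = solve_ivp(lambda t, y: A @ y, (0, 30), y0, t_eval=[1, 7, 30])
    print(sol.y[:, -1])  # retained/accumulated fractions after 30 days
    ```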

  18. Requirement for Serratia marcescens Cytolysin in a Murine Model of Hemorrhagic Pneumonia

    Science.gov (United States)

    González-Juarbe, Norberto; Mares, Chris A.; Hinojosa, Cecilia A.; Medina, Jorge L.; Cantwell, Angelene; Dube, Peter H.; Bergman, Molly A.

    2014-01-01

    Serratia marcescens, a member of the carbapenem-resistant Enterobacteriaceae, is an important emerging pathogen that causes a wide variety of nosocomial infections, spreads rapidly within hospitals, and has a systemic mortality rate of up to 41%. Despite multiple clinical descriptions of S. marcescens nosocomial pneumonia, little is known regarding the mechanisms of bacterial pathogenesis and the host immune response. To address this gap, we developed an oropharyngeal aspiration model of lethal and sublethal S. marcescens pneumonia in BALB/c mice and extensively characterized the latter. Lethal challenge (>4.0 × 10^6 CFU) was characterized by fulminate hemorrhagic pneumonia with rapid loss of lung function and death. Mice challenged with a sublethal dose (<2.0 × 10^6 CFU) rapidly lost weight, had diminished lung compliance, experienced lung hemorrhage, and responded to the infection with extensive neutrophil infiltration and histopathological changes in tissue architecture. Neutrophil extracellular trap formation and the expression of inflammatory cytokines occurred early after infection. Mice depleted of neutrophils were exquisitely susceptible to an otherwise nonlethal inoculum, thereby demonstrating the requirement for neutrophils in host protection. Mutation of the genes encoding the cytolysin ShlA and its transporter ShlB resulted in attenuated S. marcescens strains that failed to cause profound weight loss, extended illness, hemorrhage, and prolonged lung pathology in mice. This study describes a model of S. marcescens pneumonia that mimics known clinical features of human illness, identifies neutrophils and the toxin ShlA as key factors important for defense and infection, respectively, and provides a solid foundation for future studies of novel therapeutics for this important opportunistic pathogen. PMID:25422267

  19. Myrigalone A Inhibits Lepidium sativum Seed Germination by Interference with Gibberellin Metabolism and Apoplastic Superoxide Production Required for Embryo Extension Growth and Endosperm Rupture

    Czech Academy of Sciences Publication Activity Database

    Oracz, K.; Voegele, A.; Tarkowská, Danuše; Jacquemoud, D.; Turečková, Veronika; Urbanová, Terezie; Strnad, Miroslav; Sliwinska, E.; Leubner-Metzger, G.

    2012-01-01

    Roč. 53, č. 1 (2012), s. 81-95 ISSN 0032-0781 R&D Projects: GA AV ČR KAN200380801; GA MŠk ED0007/01/01; GA ČR GD522/08/H003 Keywords : Embryo cell extension growth * Endoreduplication * Endosperm rupture * Gibberellin metabolism * Lepidium sativum * Myrica gale * Phytotoxicity * Reactive oxygen species Subject RIV: EF - Botanics Impact factor: 4.134, year: 2012

  20. Dark revelations of the [SU(3)]3 and [SU(3)]4 gauge extensions of the standard model

    Science.gov (United States)

    Kownacki, Corey; Ma, Ernest; Pollard, Nicholas; Popov, Oleg; Zakeri, Mohammadreza

    2018-02-01

    Two theoretically well-motivated gauge extensions of the standard model are SU(3)_C × SU(3)_L × SU(3)_R and SU(3)_q × SU(3)_L × SU(3)_l × SU(3)_R, where SU(3)_q is the same as SU(3)_C and SU(3)_l is its color leptonic counterpart. Each has three variations, according to how SU(3)_R is broken. It is shown here for the first time that a built-in dark U(1)_D gauge symmetry exists in all six versions. However, the corresponding symmetry breaking pattern does not reduce properly to that of the standard model, unless an additional Z2′ symmetry is defined, so that U(1)_D × Z2′ is broken to Z2 dark parity. The available dark matter candidates in each case include fermions, scalars, as well as vector gauge bosons. This work points to the possible unity of matter with dark matter, the origin of which may not be ad hoc.

  1. Dark revelations of the [SU(3)]3 and [SU(3)]4 gauge extensions of the standard model

    Directory of Open Access Journals (Sweden)

    Corey Kownacki

    2018-02-01

    Full Text Available Two theoretically well-motivated gauge extensions of the standard model are SU(3)_C × SU(3)_L × SU(3)_R and SU(3)_q × SU(3)_L × SU(3)_l × SU(3)_R, where SU(3)_q is the same as SU(3)_C and SU(3)_l is its color leptonic counterpart. Each has three variations, according to how SU(3)_R is broken. It is shown here for the first time that a built-in dark U(1)_D gauge symmetry exists in all six versions. However, the corresponding symmetry breaking pattern does not reduce properly to that of the standard model, unless an additional Z2′ symmetry is defined, so that U(1)_D × Z2′ is broken to Z2 dark parity. The available dark matter candidates in each case include fermions, scalars, as well as vector gauge bosons. This work points to the possible unity of matter with dark matter, the origin of which may not be ad hoc.

  2. Validation of Power Requirement Model for Active Loudspeakers

    DEFF Research Database (Denmark)

    Schneider, Henrik; Madsen, Anders Normann; Bjerregaard, Ruben

    2015-01-01

    The actual power requirement of an active loudspeaker during playback of music has not received much attention in the literature. This is probably because no single and simple solution exists and because a complete system knowledge from input voltage to output sound pressure level is required...

  3. Closed loop models for analyzing engineering requirements for simulators

    Science.gov (United States)

    Baron, S.; Muralidharan, R.; Kleinman, D.

    1980-01-01

    A closed-loop analytic model incorporating a model of the human pilot (namely, the optimal control model) was developed to allow certain simulation design tradeoffs to be evaluated quantitatively. This model was applied to a realistic flight control problem. The resulting model is used to analyze both overall simulation effects and the effects of individual elements. The results show that, as compared to an ideal continuous simulation, the discrete simulation can result in significant performance and/or workload penalties.

  4. Existing and Required Modeling Capabilities for Evaluating ATM Systems and Concepts

    Science.gov (United States)

    Odoni, Amedeo R.; Bowman, Jeremy; Delahaye, Daniel; Deyst, John J.; Feron, Eric; Hansman, R. John; Khan, Kashif; Kuchar, James K.; Pujet, Nicolas; Simpson, Robert W.

    1997-01-01

    ATM systems throughout the world are entering a period of major transition and change. The combination of important technological developments and of the globalization of the air transportation industry has necessitated a reexamination of some of the fundamental premises of existing Air Traffic Management (ATM) concepts. New ATM concepts have to be examined, concepts that may place more emphasis on: strategic traffic management; planning and control; partial decentralization of decision-making; and added reliance on the aircraft to carry out strategic ATM plans, with ground controllers confined primarily to a monitoring and supervisory role. 'Free Flight' is a case in point. In order to study, evaluate and validate such new concepts, the ATM community will have to rely heavily on models and computer-based tools/utilities, covering a wide range of issues and metrics related to safety, capacity and efficiency. The state of the art in such modeling support is adequate in some respects, but clearly deficient in others. It is the objective of this study to assist in: (1) assessing the strengths and weaknesses of existing fast-time models and tools for the study of ATM systems and concepts and (2) identifying and prioritizing the requirements for the development of additional modeling capabilities in the near future. A three-stage process has been followed to this purpose: 1. Through the analysis of two case studies involving future ATM system scenarios, as well as through expert assessment, modeling capabilities and supporting tools needed for testing and validating future ATM systems and concepts were identified and described. 2. Existing fast-time ATM models and support tools were reviewed and assessed with regard to the degree to which they offer the capabilities identified under Step 1. 3 . The findings of 1 and 2 were combined to draw conclusions about (1) the best capabilities currently existing, (2) the types of concept testing and validation that can be carried

  5. Is there a relationship between health care models and their performance assessment? The results of an extensive review

    Directory of Open Access Journals (Sweden)

    Ferruccio Pelone

    2008-06-01

    Full Text Available

    Background: Health system performance is a multi-dimensional concept related to the achievement of several objectives such as effectiveness, efficiency and equity. The aim of this study is to investigate the relationship between models of health care systems (Beveridge, Bismarck and voluntary health insurance and performance frameworks available in the scientific literature.

    Methods: An extensive literature search in several electronic databases was carried out. According to a preliminary classification of performance domains and dimensions, we analysed, among the selected articles, the relationship between domains/dimensions and the three main models of health care systems.

    Results: From 540 references found, 17 papers were considered relevant for the purposes of this research. A total of 39 frameworks were identified: 41% referred to the "Beveridge" model, 10% to the "Bismarck" model, 23% to the "Voluntary health insurance" model and 26% to "Umbrella organizations" (e.g. OECD). The domains of effectiveness and responsiveness were covered by all of the frameworks, while fewer covered equity and efficiency. The most frequent dimensions in all the models were effectiveness and technical efficiency, but relevant differences exist among the healthcare system models regarding the dimensions of performance considered.

    Conclusions: Although the need of

  6. Bioenergetics model for estimating food requirements of female Pacific walruses (Odobenus rosmarus divergens)

    Science.gov (United States)

    Noren, S.R.; Udevitz, M.S.; Jay, C.V.

    2012-01-01

    Pacific walruses Odobenus rosmarus divergens use sea ice as a platform for resting, nursing, and accessing extensive benthic foraging grounds. The extent of summer sea ice in the Chukchi Sea has decreased substantially in recent decades, causing walruses to alter habitat use and activity patterns which could affect their energy requirements. We developed a bioenergetics model to estimate caloric demand of female walruses, accounting for maintenance, growth, activity (active in-water and hauled-out resting), molt, and reproductive costs. Estimates for non-reproductive females 0–12 yr old (65−810 kg) ranged from 16359 to 68960 kcal d−1 (74−257 kcal d−1 kg−1) for years with readily available sea ice for which we assumed animals spent 83% of their time in water. This translated into the energy content of 3200–5960 clams per day, equivalent to 7–8% and 14–9% of body mass per day for 5–12 and 2–4 yr olds, respectively. Estimated consumption rates of 12 yr old females were minimally affected by pregnancy, but lactation had a large impact, increasing consumption rates to 15% of body mass per day. Increasing the proportion of time in water to 93%, as might happen if walruses were required to spend more time foraging during ice-free periods, increased daily caloric demand by 6–7% for non-lactating females. We provide the first bioenergetics-based estimates of energy requirements for walruses and a first step towards establishing bioenergetic linkages between demography and prey requirements that can ultimately be used in predicting this population’s response to environmental change.
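
    The structure of such a bioenergetics calculation can be sketched as a simple sum of cost terms scaled by an assimilation efficiency, as below. The multipliers, allometric form and efficiency are placeholder assumptions for illustration only; they are not the published model's values.

    ```python
    # Schematic bioenergetics sketch in the spirit of the model described above.
    # All coefficients are placeholder assumptions, not published values.

    def daily_energy_requirement(mass_kg, frac_time_in_water=0.83,
                                 lactating=False, pregnant=False):
        """Indicative gross energy demand (kcal/day) for a female walrus."""
        bmr = 70.0 * mass_kg ** 0.75            # Kleiber-type basal metabolism (kcal/d)
        cost_in_water = 2.5 * bmr               # assumed multiplier while active in water
        cost_hauled_out = 1.2 * bmr             # assumed multiplier while resting on ice
        maintenance = (frac_time_in_water * cost_in_water
                       + (1.0 - frac_time_in_water) * cost_hauled_out)
        growth = 0.05 * maintenance             # assumed allocation to growth and molt
        reproduction = 0.0
        if pregnant:
            reproduction += 0.1 * maintenance   # assumed pregnancy increment
        if lactating:
            reproduction += 0.6 * maintenance   # lactation dominates, per the abstract
        assimilation_efficiency = 0.8           # assumed fraction of ingested energy retained
        return (maintenance + growth + reproduction) / assimilation_efficiency

    print(daily_energy_requirement(810.0, frac_time_in_water=0.93, lactating=True))
    ```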

  7. Extensive Investigations on Bio-Inspired Trust and Reputation Model over Hops Coefficient Factor in Distributed Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Vinod Kumar Verma

    2014-08-01

    Full Text Available Resource utilization requires substantial consideration when a trust and reputation model is to be deployed within a wireless sensor network (WSN). In this evaluation, our attention is focused on the effect of hops coefficient factor estimation on a WSN running the bio-inspired trust and reputation model (BTRM). We present a state-of-the-art, system-level evaluation of the accuracy and path length of sensor node operations for current and average scenarios. Additionally, we emphasize the energy consumption evaluation for the static, dynamic and oscillatory modes of the BTRM-WSN model. The performance of the hops coefficient factor for our proposed framework is evaluated via analytic bounds and numerical simulations.

  8. CLUSTER MODEL FOR EXTENSIVE GIANT TIGER SHRIMP (Penaeus monodon Fab. TO PREVENT TRANSMISSION OF WHITE SPOT SYNDROME VIRUS

    Directory of Open Access Journals (Sweden)

    Arief Taslihan

    2015-06-01

    Full Text Available White spot syndrome virus (WSSV) has become epidemic in Indonesia and is affecting shrimp aquaculture in terms of its production. White spot syndrome virus is transmitted from one pond to other ponds through crustaceans, including planktonic copepods acting as carriers for WSSV, and through water from affected shrimp ponds. A cluster model, consisting of shrimp grow-out ponds surrounded by non-shrimp ponds serving as a biosecurity measure, has been developed. The model aims to prevent white spot virus transmission in extensive giant tiger shrimp ponds. The study was conducted at two sites in Demak District, Central Java Province. As the treatment, a cluster consisting of three shrimp ponds at site I and two shrimp ponds at site II was established, each surrounded by buffer ponds rearing only finfish. As the control, five extensive shrimp grow-out ponds at site I and three at site II, which neither applied biosecurity nor were surrounded by non-shrimp buffer ponds, were considered control ponds. The results showed that the cluster of shrimp ponds surrounded by non-shrimp ponds allowed shrimp to be held in the grow-out ponds significantly longer (DOC 105.6±4.5 days) than the controls, which were harvested at 60.9±16.0 days owing to WSSV outbreaks. The survival rate in trial ponds was 77.6±3.6%, significantly higher than that of the controls at 22.6±15.8%. Total production in treatment ponds was 425.1±146.6 kg/ha, significantly higher than the 54.5±47.6 kg/ha produced by the controls. Implementation of Better Management Practices (BMP) by arranging shrimp ponds in clusters surrounded by non-shrimp ponds proved effective in preventing WSSV transmission from traditional shrimp ponds in the surrounding area.

  9. Beef Species Symposium: an assessment of the 1996 Beef NRC: metabolizable protein supply and demand and effectiveness of model performance prediction of beef females within extensive grazing systems.

    Science.gov (United States)

    Waterman, R C; Caton, J S; Löest, C A; Petersen, M K; Roberts, A J

    2014-07-01

    Interannual variation of forage quantity and quality driven by precipitation events influence beef livestock production systems within the Southern and Northern Plains and Pacific West, which combined represent 60% (approximately 17.5 million) of the total beef cows in the United States. The beef cattle requirements published by the NRC are an important tool and excellent resource for both professionals and producers to use when implementing feeding practices and nutritional programs within the various production systems. The objectives of this paper include evaluation of the 1996 Beef NRC model in terms of effectiveness in predicting extensive range beef cow performance within arid and semiarid environments using available data sets, identifying model inefficiencies that could be refined to improve the precision of predicting protein supply and demand for range beef cows, and last, providing recommendations for future areas of research. An important addition to the current Beef NRC model would be to allow users to provide region-specific forage characteristics and the ability to describe supplement composition, amount, and delivery frequency. Beef NRC models would then need to be modified to account for the N recycling that occurs throughout a supplementation interval and the impact that this would have on microbial efficiency and microbial protein supply. The Beef NRC should also consider the role of ruminal and postruminal supply and demand of specific limiting AA. Additional considerations should include the partitioning effects of nitrogenous compounds under different physiological production stages (e.g., lactation, pregnancy, and periods of BW loss). The intent of information provided is to aid revision of the Beef NRC by providing supporting material for changes and identifying gaps in existing scientific literature where future research is needed to enhance the predictive precision and application of the Beef NRC models.

  10. Conformal complex singlet extension of the Standard Model: scenario for dark matter and a second Higgs boson

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Zhi-Wei; Steele, T.G. [Department of Physics and Engineering Physics, University of Saskatchewan,116 Science Place, Saskatoon, SK, S7N 5E2 (Canada); Hanif, T. [Department of Theoretical Physics, University of Dhaka,Dhaka-1000 (Bangladesh); Mann, R.B. [Department of Physics, University of Waterloo,Waterloo, ON, N2L 3G1 (Canada)

    2016-08-09

    We consider a conformal complex singlet extension of the Standard Model with a Higgs portal interaction. The global U(1) symmetry of the complex singlet can be either broken or unbroken and we study each scenario. In the unbroken case, the global U(1) symmetry protects the complex singlet from decaying, leading to an ideal cold dark matter candidate with approximately 100 GeV mass along with a significant proportion of thermal relic dark matter abundance. In the broken case, we have developed a renormalization-scale optimization technique to significantly narrow the parameter space and in some situations, provide unique predictions for all the model’s couplings and masses. We have found there exists a second Higgs boson with a mass of approximately 550 GeV that mixes with the known 125 GeV Higgs with a large mixing angle sin θ≈0.47 consistent with current experimental limits. The imaginary part of the complex singlet in the broken case could provide axion dark matter for a wide range of models. Upon including interactions of the complex scalar with an additional vector-like fermion, we explore the possibility of a diphoton excess in both the unbroken and the broken cases. In the unbroken case, the model can provide a natural explanation for diphoton excess if extra terms are introduced providing extra contributions to the singlet mass. In the broken case, we find a set of coupling solutions that yield a second Higgs boson of mass 720 GeV and an 830 GeV extra vector-like fermion F, which is able to address the 750 GeV LHC diphoton excess. We also provide criteria to determine the symmetry breaking pattern in both the Higgs and hidden sectors.

  11. Required Collaborative Work in Online Courses: A Predictive Modeling Approach

    Science.gov (United States)

    Smith, Marlene A.; Kellogg, Deborah L.

    2015-01-01

    This article describes a predictive model that assesses whether a student will have greater perceived learning in group assignments or in individual work. The model produces correct classifications 87.5% of the time. The research is notable in that it is the first in the education literature to adopt a predictive modeling methodology using data…

  12. User Requirements Model for University Timetable Management System

    OpenAIRE

    Ahmad Althunibat; Mohammad I. Muhairat

    2016-01-01

    Automated timetables are used to schedule courses, lectures and rooms in universities by considering some constraints. Inconvenient and ineffective timetables often waste time and money. Therefore, it is important to investigate the requirements and potential needs of users. Thus, eliciting user requirements of University Timetable Management System (TMS) and their implication becomes an important process for the implementation of TMS. On this note, this paper seeks to propose a m...

  13. The Modeling of Factors That Influence Coast Guard Manpower Requirements

    Science.gov (United States)

    2014-12-01

    Requirements Determination process. ...allowances. To help clarify the process, Phase II has guiding principles and core assumptions that direct the Phase. Three of the four guiding principles are... The analyst is determining for the first time what manpower is required. The second notable guiding principle is “MRD analysts shall identify and

  14. Validation and Extension of the Prolonged Mechanical Ventilation Prognostic Model (ProVent) Score for Predicting 1-Year Mortality after Prolonged Mechanical Ventilation.

    Science.gov (United States)

    Udeh, Chiedozie I; Hadder, Brent; Udeh, Belinda L

    2015-12-01

    Prognostic models can inform management decisions for patients requiring prolonged mechanical ventilation. The Prolonged Mechanical Ventilation Prognostic model (ProVent) score was developed to predict 1-year mortality in these patients. External evaluation of such models is needed before they are adopted for routine use. The goal was to perform an independent external validation of the modified ProVent score and assess for spectrum extension at 14 days of mechanical ventilation. This was a retrospective cohort analysis of patients who received prolonged mechanical ventilation at the University of Iowa Hospitals. Patients who received 14 or more days of mechanical ventilation were identified from a database. Manual review of their medical records was performed to abstract relevant data including the four model variables at Days 14 and 21 of mechanical ventilation. Vital status at 1 year was checked in the medical records or the social security death index. Logistic regressions examined the associations between the different variables and mortality. Model performance at 14 to 20 days and 21+ days was assessed for discrimination by calculating the area under the receiver operating characteristic curve, and calibration was assessed using the Hosmer-Lemeshow goodness-of-fit test. A total of 180 patients (21+ d) and 218 patients (14-20 d) were included. Overall, 75% were surgical patients. One-year mortality was 51% for 21+ days and 32% for 14 to 20 days of mechanical ventilation. Age greater than 65 years was the strongest predictor of mortality at 1 year in all cohorts. There was no significant difference between predicted and observed mortality rates for patients stratified by ProVent score. There was near-perfect specificity for mortality in the groups with higher ProVent scores. Areas under the curve were 0.69 and 0.75 for the 21+ days and the 14 to 20 days cohorts respectively. P values for the Hosmer-Lemeshow statistics were 0.24 for 21+ days and 0.22 for 14 to
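
    The discrimination and calibration checks mentioned above (area under the ROC curve, Hosmer-Lemeshow goodness of fit) can be sketched as follows on synthetic data. This does not reproduce the ProVent variables or the study cohort; the score distribution and risk mapping are invented for illustration.

    ```python
    # Hedged sketch of discrimination/calibration assessment on synthetic data.
    import numpy as np
    from sklearn.metrics import roc_auc_score
    from scipy.stats import chi2

    rng = np.random.default_rng(0)
    score = rng.integers(0, 6, size=200)          # hypothetical 0-5 risk score
    p_true = 0.1 + 0.15 * score                   # assumed true 1-year mortality risk per point
    died = rng.binomial(1, p_true)

    print("AUC:", roc_auc_score(died, score))     # discrimination

    # Simple Hosmer-Lemeshow-style check: observed vs expected deaths per score group.
    hl = 0.0
    groups = np.unique(score)
    for s in groups:
        mask = score == s
        obs, exp, n = died[mask].sum(), p_true[mask].sum(), mask.sum()
        hl += (obs - exp) ** 2 / (exp * (1 - exp / n) + 1e-9)
    print("HL statistic:", hl, "p =", chi2.sf(hl, df=len(groups) - 2))
    ```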

  15. The New FARM Program: A Model for Supporting Diverse Emerging Farmers and Early-Career Extension Professionals

    Science.gov (United States)

    Sirrine, J. R.; Eschbach, Cheryl L.; Lizotte, Erin; Rothwell, N. L.

    2016-01-01

    As early-career Extension educators challenged by societal, structural, agricultural, and fiscal trends, we designed a multiyear educational program to support the diverse needs of emerging specialty crop producers in northwest Michigan. This article presents outcomes of that program. We explore how Extension professionals can develop impactful…

  16. Bioenergy crop models: Descriptions, data requirements and future challenges

    Energy Technology Data Exchange (ETDEWEB)

    Nair, S. Surendran [University of Tennessee, Knoxville (UTK); Kang, Shujiang [ORNL; Zhang, Xuesong [Pacific Northwest National Laboratory (PNNL); Miguez, Fernando [Iowa State University; Izaurralde, Dr. R. Cesar [Pacific Northwest National Laboratory (PNNL); Post, Wilfred M [ORNL; Dietze, Michael [University of Illinois, Urbana-Champaign; Lynd, L. [Dartmouth College; Wullschleger, Stan D [ORNL

    2012-01-01

    Field studies that address the production of lignocellulosic biomass as a source of renewable energy provide critical data for the development of bioenergy crop models. A literature survey revealed that 14 models have been used for simulating bioenergy crops including herbaceous and woody bioenergy crops, and for crassulacean acid metabolism (CAM) crops. These models simulate field-scale production of biomass for switchgrass (ALMANAC, EPIC, and Agro-BGC), miscanthus (MISCANFOR, MISCANMOD, and WIMOVAC), sugarcane (APSIM, AUSCANE, and CANEGRO), and poplar and willow (SECRETS and 3PG). Two models are adaptations of dynamic global vegetation models and simulate biomass yields of miscanthus and sugarcane at regional scales (Agro-IBIS and LPJmL). Although it lacks the complexity of other bioenergy crop models, the environmental productivity index (EPI) is the only model used to estimate biomass production of CAM (Agave and Opuntia) plants. Except for the EPI model, all models include representations of leaf area dynamics, phenology, radiation interception and utilization, biomass production, and partitioning of biomass to roots and shoots. A few models simulate soil water, nutrient, and carbon cycle dynamics, making them especially useful for assessing the environmental consequences (e.g., erosion and nutrient losses) associated with the large-scale deployment of bioenergy crops. The rapid increase in use of models for energy crop simulation is encouraging; however, detailed information on the influence of climate, soils, and crop management practices on biomass production is scarce. Thus considerable work remains regarding the parameterization and validation of process-based models for bioenergy crops; generation and distribution of high-quality field data for model development and validation; and implementation of an integrated framework for efficient, high-resolution simulations of biomass production for use in planning sustainable bioenergy systems.

  17. Rossby and drift wave turbulence and zonal flows: The Charney-Hasegawa-Mima model and its extensions

    Science.gov (United States)

    Connaughton, Colm; Nazarenko, Sergey; Quinn, Brenda

    2015-12-01

    A detailed study of the Charney-Hasegawa-Mima model and its extensions is presented. These simple nonlinear partial differential equations suggested for both Rossby waves in the atmosphere and drift waves in a magnetically-confined plasma, exhibit some remarkable and nontrivial properties, which in their qualitative form, survive in more realistic and complicated models. As such, they form a conceptual basis for understanding the turbulence and zonal flow dynamics in real plasma and geophysical systems. Two idealised scenarios of generation of zonal flows by small-scale turbulence are explored: a modulational instability and turbulent cascades. A detailed study of the generation of zonal flows by the modulational instability reveals that the dynamics of this zonal flow generation mechanism differ widely depending on the initial degree of nonlinearity. The jets in the strongly nonlinear case further roll up into vortex streets and saturate, while for the weaker nonlinearities, the growth of the unstable mode reverses and the system oscillates between a dominant jet, which is slightly inclined to the zonal direction, and a dominant primary wave. A numerical proof is provided for the extra invariant in Rossby and drift wave turbulence-zonostrophy. While the theoretical derivations of this invariant stem from the wave kinetic equation which assumes weak wave amplitudes, it is shown to be relatively well-conserved for higher nonlinearities also. Together with the energy and enstrophy, these three invariants cascade into anisotropic sectors in the k-space as predicted by the Fjørtoft argument. The cascades are characterised by the zonostrophy pushing the energy to the zonal scales. A small scale instability forcing applied to the model has demonstrated the well-known drift wave-zonal flow feedback loop. The drift wave turbulence is generated from this primary instability. The zonal flows are then excited by either one of the generation mechanisms, extracting energy from
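
    For reference, the Charney-Hasegawa-Mima equation discussed above is commonly written as the following single scalar equation (up to sign and normalization conventions, which differ between the geophysical and plasma literatures):

    ```latex
    \frac{\partial}{\partial t}\left(\nabla^{2}\psi - F\psi\right)
      + \beta\,\frac{\partial \psi}{\partial x}
      + J\!\left(\psi,\nabla^{2}\psi\right) = 0,
    \qquad
    J(a,b) = \frac{\partial a}{\partial x}\frac{\partial b}{\partial y}
           - \frac{\partial a}{\partial y}\frac{\partial b}{\partial x},
    ```

    where ψ is the stream function (or electrostatic potential), F = 1/ρ² with ρ the deformation (or drift) radius, β the gradient of the Coriolis parameter (or of the background plasma density), and J the Jacobian nonlinearity.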

  18. Is procrastination a vulnerability factor for hypertension and cardiovascular disease? Testing an extension of the procrastination-health model.

    Science.gov (United States)

    Sirois, Fuschia M

    2015-06-01

    Personality is an important epidemiological factor for understanding health outcomes. This study investigated the associations of trait procrastination with hypertension and cardiovascular disease (HT/CVD) and maladaptive coping by testing an extension of the procrastination-health model among individuals with and without HT/CVD. Individuals with self-reported HT/CVD (N = 182) and healthy controls (N = 564), from a community sample, completed an online survey including measures of personality, coping, and health outcomes. Logistic regression analysis controlling for demographic and higher order personality factors found that older age, lower education level and higher procrastination scores were associated with HT/CVD. Moderated mediation analyses with bootstrapping revealed that procrastination was more strongly associated with maladaptive coping behaviours in participants with HT/CVD than the healthy controls, and the indirect effects on stress through maladaptive coping were larger for the HT/CVD sample. Results suggest procrastination is a vulnerability factor for poor adjustment to and management of HT/CVD.

  19. Neutrino masses, mixings, and FCNC’s in an S3 flavor symmetric extension of the standard model

    International Nuclear Information System (INIS)

    Mondragón, A.; Mondragón, M.; Peinado, E.

    2011-01-01

    By introducing three Higgs fields that are SU(2) doublets and a flavor permutational symmetry, S3, in the theory, we extend the concepts of flavor and generations to the Higgs sector and formulate a Minimal S3-Invariant Extension of the Standard Model. The mass matrices of the neutrinos and charged leptons are re-parameterized in terms of their eigenvalues, then the neutrino mixing matrix, V_PMNS, is computed and exact, explicit analytical expressions for the neutrino mixing angles as functions of the masses of neutrinos and charged leptons are obtained, in excellent agreement with the latest experimental data. We also compute the branching ratios of some selected flavor-changing neutral current (FCNC) processes, as well as the contribution of the exchange of neutral flavor-changing scalars to the anomaly of the magnetic moment of the muon, as functions of the masses of the charged leptons and the neutral Higgs bosons. We find that the S3 × Z2 flavor symmetry and the strong mass hierarchy of the charged leptons strongly suppress the FCNC processes in the leptonic sector, well below the present experimental bounds by many orders of magnitude. The contribution of FCNCs to the anomaly of the muon's magnetic moment is small, but not negligible.

  20. The effect of sodium hypochlorite concentration and irrigation needle extension on biofilm removal from a simulated root canal model.

    Science.gov (United States)

    Mohmmed, Saifalarab A; Vianna, Morgana E; Penny, Matthew R; Hilton, Stephen T; Knowles, Jonathan C

    2017-12-01

    To investigate the effect of sodium hypochlorite concentration and needle extension on the removal of Enterococcus faecalis biofilm, sixty root canal models were 3D printed. Biofilms were grown on the apical 3 mm of the canal for 10 days. Irrigation for 60 s with 9 mL of either 5.25% or 2.5% NaOCl or water was performed using a needle inserted either 3 or 2 mm from the canal terminus; the canals were then imaged using fluorescence microscopy, and residual biofilm percentages were calculated using imaging software. The data were analysed using analysis of covariance and two-sample t-tests. A significance level of 0.05 was used throughout. Residual biofilm was less using 5.25% than with 2.5% NaOCl. Statistically significant biofilm removal was evident with the needle placed closer to the canal terminus. A greater reduction of available chlorine and pH was noted as the concentration increased. One-minute irrigation was not sufficient for complete biofilm removal. © 2017 Australian Society of Endodontology Inc.
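
    The two-sample comparison named above can be sketched as follows; the residual-biofilm percentages are invented placeholders, not the study's data.

    ```python
    # Minimal two-sample t-test sketch with made-up residual-biofilm percentages.
    from scipy import stats

    residual_525 = [12.1, 9.8, 14.3, 11.0, 10.5]   # % residual biofilm, 5.25% NaOCl (hypothetical)
    residual_250 = [18.9, 22.4, 17.6, 20.1, 19.3]  # % residual biofilm, 2.5% NaOCl (hypothetical)

    t_stat, p_value = stats.ttest_ind(residual_525, residual_250)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # compare against the 0.05 significance level
    ```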

  1. Viable dark matter via radiative symmetry breaking in a scalar singlet Higgs portal extension of the standard model.

    Science.gov (United States)

    Steele, T G; Wang, Zhi-Wei; Contreras, D; Mann, R B

    2014-05-02

    We consider the generation of dark matter mass via radiative electroweak symmetry breaking in an extension of the conformal standard model containing a singlet scalar field with a Higgs portal interaction. Generating the mass from a sequential process of radiative electroweak symmetry breaking followed by a conventional Higgs mechanism can account for less than 35% of the cosmological dark matter abundance for dark matter mass M_s > 80 GeV. However, in a dynamical approach where both Higgs and scalar singlet masses are generated via radiative electroweak symmetry breaking, we obtain much higher levels of dark matter abundance. At one-loop level we find abundances of 10%-100% for dark matter masses above 106 GeV. The dynamical approach also predicts a small scalar-singlet self-coupling, providing a natural explanation for the astrophysical observations that place upper bounds on dark matter self-interaction. The predictions in all three approaches are within the M_s > 80 GeV detection region of the next generation XENON experiment.

  2. A finite element evaluation of mechanical function for 3 distal extension partial dental prosthesis designs with a 3-dimensional nonlinear method for modeling soft tissue.

    Science.gov (United States)

    Nakamura, Yoshinori; Kanbara, Ryo; Ochiai, Kent T; Tanaka, Yoshinobu

    2014-10-01

    The mechanical evaluation of the function of partial removable dental prostheses with 3-dimensional finite element modeling requires the accurate assessment and incorporation of soft tissue behavior. The differential behaviors of the residual ridge mucosa and periodontal ligament tissues have been shown to exhibit nonlinear displacement. The mathematic incorporation of known values simulating nonlinear soft tissue behavior has not been investigated previously via 3-dimensional finite element modeling evaluation to demonstrate the effect of prosthesis design on the supporting tissues. The purpose of this comparative study was to evaluate the functional differences of 3 different partial removable dental prosthesis designs with 3-dimensional finite element analysis modeling and a simulated patient model incorporating known viscoelastic, nonlinear soft tissue properties. Three different designs of distal extension removable partial dental prostheses were analyzed. The stress distributions to the supporting abutments and soft tissue displacements of the designs tested were calculated and mechanically compared. Among the 3 dental designs evaluated, the RPI prosthesis demonstrated the lowest stress concentrations on the tissue supporting the tooth abutment and also provided wide mucosa-borne areas of support, thereby demonstrating a mechanical advantage and efficacy over the other designs evaluated. The data and results obtained from this study confirmed that the functional behavior of partial dental prostheses with supporting abutments and soft tissues are consistent with the conventional theories of design and clinical experience. The validity and usefulness of this testing method for future applications and testing protocols are shown. Copyright © 2014 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  3. Oceanotron server for marine in-situ observations : a thematic data model implementation as a basis for the extensibility

    Science.gov (United States)

    Loubrieu, T.; Donnart, J. C.; Bregent, S.; Blower, J.; Griffith, G.

    2012-04-01

    Oceanotron (https://forge.ifremer.fr/plugins/mediawiki/wiki/oceanotron/index.php/Accueil) is an open-source data server dedicated to the dissemination of marine in-situ observations. For its extensibility it relies on an ocean business data model. IFREMER hosts the CORIOLIS marine in-situ data centre (http://www.coriolis.eu.org) and, as the French NODC (National Oceanographic Data Centre, http://www.ifremer.fr/sismer/index_UK.htm), some other in-situ observation databases. As such, IFREMER participates in numerous ocean data management projects. IFREMER wished to capitalize on its thematic data management expertise in a dedicated data dissemination server called Oceanotron. The development of the server, coordinated by IFREMER, started in 2010. Given the diversity of data repository formats (RDBMS, netCDF, ODV, MEDATLAS, ...) and the temperamental nature of the standard interoperability interface profiles (OGC/WMS, OGC/WFS, OGC/SOS, OpenDAP, …), the architecture of the software relies on an ocean business data model dedicated to marine in-situ observation features. The ocean business data model builds on the CSML conceptual modelling (http://csml.badc.rl.ac.uk/) and UNIDATA Common Data Model (http://www.unidata.ucar.edu/software/netcdf-java/CDM/) work and focuses on the most common marine observation features: vertical profiles, point series, trajectories and points. The ocean business data model has been implemented in Java and can be used as an API. The Oceanotron server orchestrates different types of modules handling the ocean business data model objects: StorageUnits, which read specific data repository formats (netCDF/OceanSites, netCDF/ARGO, ...); TransformationUnits, which apply useful ocean-business-related transformations to the features (for example, conversion of vertical coordinates from pressure in decibars to meters below the sea surface); and FrontDesks, which receive external requests and send results for interoperable protocols (OpenDAP, WMS, ...). These
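
    A conceptual sketch of the three module roles described above is given below in Python (the actual server is written in Java); the class and method names are illustrative assumptions, not the Oceanotron API.

    ```python
    # Conceptual sketch of storage -> transformation -> front-desk orchestration.
    from typing import Iterable, Protocol

    class Feature:
        """Stand-in for an ocean business data model object (profile, point series, ...)."""
        def __init__(self, kind: str, payload: dict):
            self.kind, self.payload = kind, payload

    class StorageUnit(Protocol):
        def read(self, query: dict) -> Iterable[Feature]: ...

    class TransformationUnit(Protocol):
        def apply(self, feature: Feature) -> Feature: ...

    class FrontDesk(Protocol):
        def respond(self, features: Iterable[Feature]) -> str: ...

    def serve(request: dict, storage: StorageUnit,
              transforms: list, front: FrontDesk) -> str:
        """Orchestrate one request: read features, transform them, format the response."""
        features = list(storage.read(request))
        for t in transforms:
            features = [t.apply(f) for f in features]
        return front.respond(features)
    ```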

  4. Modelling Imperfect Product Line Requirements with Fuzzy Feature Diagrams

    NARCIS (Netherlands)

    Noppen, J.A.R.; van den Broek, P.M.; Weston, Nathan; Rashid, Awais

    In this article, we identify that partial, vague and conflicting information can severely limit the effectiveness of approaches that derive feature trees from textual requirement specifications. We examine the impact such imperfect information has on feature tree extraction and we propose the use of

  5. From requirement document to formal modelling and decomposition of control systems

    OpenAIRE

    Yeganefard, Sanaz

    2014-01-01

    Formal modelling of control systems can help with identifying missing requirements and design flaws before implementing them. However, modelling using formal languages can be challenging and time consuming. Therefore, intermediate steps may be required to simplify the transition from informal requirements to a formal model. In this work we firstly provide a four-stage approach for structuring and formalising requirements of a control system. This approach is based on monitored, controlled, mode...

  6. Towards the development of a 3D digital city model as a real extension of public urban spaces

    DEFF Research Database (Denmark)

    Tournay, Bruno

    that begin in the analogue world can be transferred to the digital world, and how events in the digital world require similar events in the analogue world to succeed. The need to define the development of ICT in the urban regeneration process as a goal. Business as usual can be deadly in efforts to develop...... new approaches to communication and participation. Who controls the Electronic Neighbourhood? Just as in the analogue world, control of central places in the digital world is power.   Finally, based on the experience gained in relation to the project, the paper will outline some guidelines for better......; it only serves as a tool in the analogue world. The model is a passive picture for contemplation.   Another way of looking at a digital 3D model is to see it not as a virtual model of reality but as a real model that must fulfil real functions and to design it as a space of transition between the local...

  7. The Benefit of Ambiguity in Understanding Goals in Requirements Modelling

    DEFF Research Database (Denmark)

    Paay, Jeni; Pedell, Sonja; Sterling, Leon

    2011-01-01

    of their research is to create technologies that support more flexible and meaningful social interactions, by combining best practice in software engineering with ethnographic techniques to model complex social interactions from their socially oriented life for the purposes of building rich socio...... ambiguity in the process of elicitation and analysis through the use of empirically informed quality goals attached to functional goals. The authors demonstrate the benefit of articulating a quality goal without turning it into a functional goal. Their study shows that quality goals kept at a high level...... of abstraction, ambiguous and open for conversations through the modelling process add richness to goal models, and communicate quality attributes of the interaction being modelled to the design phase, where this ambiguity is regarded as a resource for design....

  8. Requirements and Problems in Parallel Model Development at DWD

    Directory of Open Access Journals (Sweden)

    Ulrich Schättler

    2000-01-01

    Full Text Available Nearly 30 years after introducing the first computer model for weather forecasting, the Deutscher Wetterdienst (DWD) is developing the 4th generation of its numerical weather prediction (NWP) system. It consists of a global grid point model (GME) based on a triangular grid and a non-hydrostatic Lokal Modell (LM). The operational demand for running this new system is immense and can only be met by parallel computers. From the experience gained in developing earlier NWP models, several new problems had to be taken into account during the design phase of the system. Most important were portability (including efficiency of the programs on several computer architectures) and ease of code maintainability. Also, the organization and administration of the work done by developers from different teams and institutions is more complex than it used to be. This paper describes the models and gives some performance results. The modular approach used for the design of the LM is explained and the effects on the development are discussed.

  9. Three Tier Unified Process Model for Requirement Negotiations and Stakeholder Collaborations

    Science.gov (United States)

    Niazi, Muhammad Ashraf Khan; Abbas, Muhammad; Shahzad, Muhammad

    2012-11-01

    This research paper is focused on carrying out a pragmatic qualitative analysis of various models and approaches to requirements negotiation (a sub-process of the requirements management plan, which is an output of scope management's collect requirements process) and studies stakeholder collaboration methodologies (i.e., from within the communication management knowledge area). The experiential analysis encompasses two tiers: the first tier refers to the weighted scoring model, while the second tier focuses on the development of SWOT matrices on the basis of the findings of the weighted scoring model for selecting an appropriate requirements negotiation model. Finally, the results are simulated with the help of statistical pie charts. On the basis of the simulated results for prevalent models and approaches to negotiation, a unified approach for requirements negotiation and stakeholder collaboration is proposed, where the collaboration methodologies are embedded into the selected requirements negotiation model as internal parameters of the proposed process, alongside some external required parameters like MBTI, opportunity analysis, etc.

  10. Android Access Control Extension

    Directory of Open Access Journals (Sweden)

    Anton Baláž

    2015-12-01

    Full Text Available The main objective of this work is to analyze and extend the security model of mobile devices running Android OS. The provided security extension is a Linux kernel security module that allows the system administrator to restrict a program's capabilities with per-program profiles. Profiles can allow capabilities like network access, raw socket access, and the permission to read, write, or execute files on matching paths. The module supplements the traditional Android capability access control model by providing mandatory access control (MAC) based on path. This extension increases the security of access to system objects in a device and allows creating security sandboxes per application.
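
    The per-program, path-based profile idea can be illustrated with the following user-space Python sketch; the real extension enforces this inside the kernel, and the profile structure and pattern syntax below are assumptions for demonstration only.

    ```python
    # Illustrative path-based mandatory access check against a per-program profile.
    from fnmatch import fnmatch

    profiles = {  # hypothetical profile store
        "com.example.app": {
            "network": True,
            "raw_socket": False,
            "paths": {"/data/data/com.example.app/*": {"read", "write"},
                      "/sdcard/Download/*": {"read"}},
        }
    }

    def allowed(program: str, operation: str, path: str) -> bool:
        """Deny unless the program's profile explicitly grants the operation on the path."""
        profile = profiles.get(program)
        if profile is None:
            return False
        for pattern, ops in profile["paths"].items():
            if fnmatch(path, pattern) and operation in ops:
                return True
        return False

    print(allowed("com.example.app", "write", "/sdcard/Download/report.pdf"))  # False
    print(allowed("com.example.app", "read", "/sdcard/Download/report.pdf"))   # True
    ```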

  11. Modelling production of field crops and its requirements

    NARCIS (Netherlands)

    Wit, de C.T.; Keulen, van H.

    1987-01-01

    Simulation models are being developed that enable quantitative estimates of the growth and production of the main agricultural crops under a wide range of weather and soil conditions. For this purpose, several hierarchically ordered production situations are distinguished in such a way that the

  12. Predicting Flu Season Requirements: An Undergraduate Modeling Project

    Science.gov (United States)

    Kramlich, Gary R., II; Braunstein Fierson, Janet L.; Wright, J. Adam

    2010-01-01

    This project was designed to be used in a freshman calculus class whose students had already been introduced to logistic functions and basic data modeling techniques. It need not be limited to such an audience, however; it has also been implemented in a topics in mathematics class for college upperclassmen. Originally intended to be presented in…

  13. Modelling of capital requirements in the energy sector: capital market access. Final memorandum

    Energy Technology Data Exchange (ETDEWEB)

    1978-04-01

    Formal modelling techniques for analyzing the capital requirements of energy industries have been performed at DOE. A survey has been undertaken of a number of models which forecast energy-sector capital requirements or which detail the interactions of the energy sector and the economy. Models are identified which can be useful as prototypes for some portion of DOE's modelling needs. The models are examined to determine any useful data bases which could serve as inputs to an original DOE model. A selected group of models are examined which can comply with the stated capabilities. The data sources being used by these models are covered and a catalog of the relevant data bases is provided. The models covered are: capital markets and capital availability models (Fossil 1, Bankers Trust Co., DRI Macro Model); models of physical capital requirements (Bechtel Supply Planning Model, ICF Oil and Gas Model and Coal Model, Stanford Research Institute National Energy Model); macroeconomic forecasting models with input-output analysis capabilities (Wharton Annual Long-Term Forecasting Model, Brookhaven/University of Illinois Model, Hudson-Jorgenson/Brookhaven Model); utility models (MIT Regional Electricity Model-Baughman Joskow, Teknekron Electric Utility Simulation Model); and others (DRI Energy Model, DRI/Zimmerman Coal Model, and Oak Ridge Residential Energy Use Model).

  14. StreamFlow 1.0: an extension to the spatially distributed snow model Alpine3D for hydrological modelling and deterministic stream temperature prediction

    Science.gov (United States)

    Gallice, Aurélien; Bavay, Mathias; Brauchli, Tristan; Comola, Francesco; Lehning, Michael; Huwald, Hendrik

    2016-12-01

    Climate change is expected to strongly impact the hydrological and thermal regimes of Alpine rivers within the coming decades. In this context, the development of hydrological models accounting for the specific dynamics of Alpine catchments appears as one of the promising approaches to reduce our uncertainty about future mountain hydrology. This paper describes the improvements brought to StreamFlow, an existing model for hydrological and stream temperature prediction built as an external extension to the physically based snow model Alpine3D. StreamFlow's source code has been entirely written anew, taking advantage of object-oriented programming to significantly improve its structure and ease the implementation of future developments. The source code is now publicly available online, along with a complete documentation. A special emphasis has been put on modularity during the re-implementation of StreamFlow, so that many model aspects can be represented using different alternatives. For example, several options are now available to model the advection of water within the stream. This allows for an easy and fast comparison between different approaches and helps in defining more reliable uncertainty estimates of the model forecasts. In particular, a case study in a Swiss Alpine catchment reveals that the stream temperature predictions are particularly sensitive to the approach used to model the temperature of subsurface flow, a fact which has been poorly reported in the literature to date. Based on the case study, StreamFlow is shown to reproduce hourly mean discharge with a Nash-Sutcliffe efficiency (NSE) of 0.82 and hourly mean temperature with an NSE of 0.78.
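
    For reference, the Nash-Sutcliffe efficiency quoted above compares model predictions against the mean of the observations. The sketch below is a minimal illustration with made-up data, not output from StreamFlow.

```python
# Minimal Nash-Sutcliffe efficiency (NSE) sketch; the arrays are made-up examples.
import numpy as np

def nse(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2); 1.0 is a perfect fit."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

obs = np.array([2.1, 2.4, 3.0, 5.2, 4.1, 3.3])   # e.g. hourly mean discharge
sim = np.array([2.0, 2.6, 3.1, 4.8, 4.3, 3.0])
print(f"NSE = {nse(obs, sim):.2f}")
```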

  15. Requirements for Logical Models for Value-Added Tax Legislation

    DEFF Research Database (Denmark)

    Nielsen, Morten Ib; Simonsen, Jakob Grue; Larsen, Ken Friis

    Enterprise resource planning (ERP) systems are ubiquitous in commercial enterprises of all sizes and invariably need to account for the notion of value-added tax (VAT). The legal and technical difficulties in handling VAT are exacerbated by spanning a broad and chaotic spectrum of intricate country...... of the Danish VAT law in Web Ontology Language (OWL) and in Configit Product Modeling Language (CPML)....

  16. Single High Fidelity Geometric Data Sets for LCM - Model Requirements

    Science.gov (United States)

    2006-11-01

    triangles (.raw) to the native triangular facet file (.facet). The software vendors recommend the use of McNeil and Associates’ Rhinoceros 3D for all...surface modeling and export. Rhinoceros has the capability and precision to create highly detailed 3D surface geometry suitable for radar cross section... white before ending up at blue as the temperature increases [27]. IR radiation was discovered in 1800 but its application is still limited in

  17. Arabidopsis ECERIFERUM2 Is a Component of the Fatty Acid Elongation Machinery Required for Fatty Acid Extension to Exceptional Lengths

    Science.gov (United States)

    Haslam, Tegan M.; Mañas-Fernández, Aurora; Zhao, Lifang; Kunst, Ljerka

    2012-01-01

    Primary aerial surfaces of land plants are coated by a lipidic cuticle, which forms a barrier against transpirational water loss and protects the plant from diverse stresses. Four enzymes of a fatty acid elongase complex are required for the synthesis of very-long-chain fatty acid (VLCFA) precursors of cuticular waxes. Fatty acid elongase substrate specificity is determined by a condensing enzyme that catalyzes the first reaction carried out by the complex. In Arabidopsis (Arabidopsis thaliana), characterized condensing enzymes involved in wax synthesis can only elongate VLCFAs up to 28 carbons (C28) in length, despite the predominance of C29 to C31 monomers in Arabidopsis stem wax. This suggests additional proteins are required for elongation beyond C28. The wax-deficient mutant eceriferum2 (cer2) lacks waxes longer than C28, implying that CER2, a putative BAHD acyltransferase, is required for C28 elongation. Here, we characterize the cer2 mutant and demonstrate that green fluorescent protein-tagged CER2 localizes to the endoplasmic reticulum, the site of VLCFA biosynthesis. We use site-directed mutagenesis to show that the classification of CER2 as a BAHD acyltransferase based on sequence homology does not fit with CER2 catalytic activity. Finally, we provide evidence for the function of CER2 in C28 elongation by an assay in yeast (Saccharomyces cerevisiae). PMID:22930748

  18. Aminoglycoside Concentrations Required for Synergy with Carbapenems against Pseudomonas aeruginosa Determined via Mechanistic Studies and Modeling.

    Science.gov (United States)

    Yadav, Rajbharan; Bulitta, Jürgen B; Schneider, Elena K; Shin, Beom Soo; Velkov, Tony; Nation, Roger L; Landersdorfer, Cornelia B

    2017-12-01

    This study aimed to systematically identify the aminoglycoside concentrations required for synergy with a carbapenem and characterize the permeabilizing effect of aminoglycosides on the outer membrane of Pseudomonas aeruginosa. Monotherapies and combinations of four aminoglycosides and three carbapenems were studied for activity against P. aeruginosa strain AH298-GFP in 48-h static-concentration time-kill studies (SCTK) (inoculum: 10^7.6 CFU/ml). The outer membrane-permeabilizing effect of tobramycin alone and in combination with imipenem was characterized via electron microscopy, confocal imaging, and the nitrocefin assay. A mechanism-based model (MBM) was developed to simultaneously describe the time course of bacterial killing and prevention of regrowth by imipenem combined with each of the four aminoglycosides. Notably, 0.25 mg/liter of tobramycin, which was inactive in monotherapy, achieved synergy (i.e., ≥2-log10 more killing than the most active monotherapy at 24 h) when combined with imipenem. Electron micrographs, confocal image analyses, and the nitrocefin uptake data showed distinct outer membrane damage by tobramycin, which was more extensive for the combination with imipenem. The MBM indicated that aminoglycosides enhanced the imipenem target site concentration up to 4.27-fold. Tobramycin was the most potent aminoglycoside to permeabilize the outer membrane; tobramycin (0.216 mg/liter), gentamicin (0.739 mg/liter), amikacin (1.70 mg/liter), or streptomycin (5.19 mg/liter) was required for half-maximal permeabilization. In summary, our SCTK, mechanistic studies and MBM indicated that tobramycin was highly synergistic and displayed the maximum outer membrane disruption potential among the tested aminoglycosides. These findings support the optimization of highly promising antibiotic combination dosage regimens for critically ill patients. Copyright © 2017 American Society for Microbiology.
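
    The synergy criterion used above (at least 2 log10 more killing than the most active monotherapy at the same time point) can be expressed directly on log10 CFU/ml counts. The sketch below is an illustrative check with invented viable counts, not data from the study.

```python
# Illustrative check of the synergy criterion: the combination kills >= 2 log10 CFU/ml
# more than the most active monotherapy at 24 h. All counts are invented examples.
import math

def is_synergistic(cfu_mono, cfu_combo, threshold_log10=2.0):
    """cfu_mono maps regimen name -> CFU/ml at 24 h; cfu_combo is the combination count."""
    best_mono_log10 = min(math.log10(v) for v in cfu_mono.values())
    return best_mono_log10 - math.log10(cfu_combo) >= threshold_log10

mono = {"tobramycin 0.25 mg/L": 4.0e7, "imipenem": 2.5e6}  # hypothetical 24-h counts
combo = 1.0e4                                              # hypothetical combination count
print(is_synergistic(mono, combo))  # True: ~2.4 log10 below the best monotherapy
```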

  19. Performance Requirements Modeling and Assessment for Active Power Ancillary Services

    DEFF Research Database (Denmark)

    Bondy, Daniel Esteban Morales; Thavlov, Anders; Tougaard, Janus Bundsgaard Mosbæk

    2017-01-01

    system operation, a reliable service delivery is required, yet it may not be appropriate to apply conventional performance requirements to new technologies and methods. The service performance requirements and assessment methods therefore need to be generalized and standardized in order to include future...... ancillary service sources. This paper develops a modeling method for ancillary services performance requirements, including performance and verification indices. The use of the modeling method and the indices is exemplified in two case studies....

  20. A New Extension of the Binomial Error Model for Responses to Items of Varying Difficulty in Educational Testing and Attitude Surveys.

    Directory of Open Access Journals (Sweden)

    James A Wiley

    Full Text Available We put forward a new item response model which is an extension of the binomial error model first introduced by Keats and Lord. Like the binomial error model, the basic latent variable can be interpreted as a probability of responding in a certain way to an arbitrarily specified item. For a set of dichotomous items, this model gives predictions that are similar to other single-parameter IRT models (such as the Rasch model) but has certain advantages in more complex cases. The first is that in specifying a flexible two-parameter Beta distribution for the latent variable, it is easy to formulate models for randomized experiments in which there is no reason to believe that either the latent variable or its distribution varies over randomly composed experimental groups. Second, the elementary response function is such that extensions to more complex cases (e.g., polychotomous responses, unfolding scales) are straightforward. Third, the probability metric of the latent trait allows tractable extensions to cover a wide variety of stochastic response processes.
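
    Because the latent variable here is a response probability with a two-parameter Beta distribution, the implied marginal distribution of the number-correct score on n exchangeable dichotomous items is beta-binomial. The sketch below is a generic illustration of that distribution (not the authors' code); the test length and Beta parameters are invented.

```python
# Generic beta-binomial sketch: latent success probability p ~ Beta(a, b),
# observed number-correct score X | p ~ Binomial(n, p). Illustration only.
from math import comb
import numpy as np
from scipy.special import betaln

def beta_binomial_pmf(k, n, a, b):
    """P(X = k) = C(n, k) * B(k + a, n - k + b) / B(a, b)."""
    return comb(n, k) * np.exp(betaln(k + a, n - k + b) - betaln(a, b))

n, a, b = 10, 2.0, 3.0  # hypothetical test length and Beta parameters
pmf = [beta_binomial_pmf(k, n, a, b) for k in range(n + 1)]
print([round(p, 3) for p in pmf], "sum =", round(sum(pmf), 6))  # sums to 1
```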

  1. "Open Access" Requires Clarification: Medical Journal Publication Models Evolve.

    Science.gov (United States)

    Lubowitz, James H; Brand, Jefferson C; Rossi, Michael J; Provencher, Matthew T

    2017-03-01

    While Arthroscopy journal is a traditional subscription model journal, our companion journal Arthroscopy Techniques is "open access." We used to believe open access simply meant online and free of charge. However, while open-access journals are free to readers, in 2017 authors must make a greater sacrifice in the form of an article-processing charge (APC). Again, while this does not apply to Arthroscopy, the APC will apply to Arthroscopy Techniques. Copyright © 2016 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  2. Formal Requirements Modeling with Executable Use Cases and Coloured Petri Nets

    OpenAIRE

    Jørgensen, Jens Bæk; Tjell, Simon; Fernandes, Joao Miguel

    2009-01-01

    This paper presents executable use cases (EUCs), which constitute a model-based approach to requirements engineering. EUCs may be used as a supplement to model-driven development (MDD) and can describe and link user-level requirements and more technical software specifications. In MDD, user-level requirements are not always explicitly described, since usually it is sufficient that one provides a specification, or platform-independent model, of the software that is to be developed. Th...

  3. Continuous forearc extension following the 2010 Maule megathrust earthquake: InSAR and seismic observations and modelling

    Science.gov (United States)

    Bie, L.; Rietbrock, A.; Agurto-Detzel, H.

    2017-12-01

    The forearc region in subduction zones deforms in response to relative movement on the plate interface throughout the earthquake cycle. Megathrust earthquakes may alter the stress field in the forearc areas from compression to extension, resulting in normal faulting earthquakes. Recent cases include the 2011 Iwaki sequence following the Tohoku-Oki earthquake in Japan, and the 2010 Pichilemu sequence after the Maule earthquake in central Chile. Given the closeness of these normal fault events to residential areas, and their shallow depth, they may pose equivalent, if not higher, seismic risk in comparison to earthquakes on the megathrust. Here, we focus on the 2010 Pichilemu sequence following the Mw 8.8 Maule earthquake in central Chile, where the Nazca Plate subducts beneath the South American Plate. Previous studies have clearly delineated the Pichilemu normal fault structure. However, it is not clear whether the Pichilemu events fully released the extensional stress exerted by the Maule mainshock, or whether the forearc area is still controlled by extensional stress. A 3-month displacement time series, constructed from radar satellite images, clearly shows continuous aseismic deformation along the Pichilemu fault. Kinematic inversion reveals peak afterslip of 25 cm at shallow depth, equivalent to a Mw 5.4 earthquake. We identified a Mw 5.3 earthquake 2 months after the Pichilemu sequence from both geodetic and seismic observations. Nonlinear inversion of the geodetic data suggests that this event ruptured a normal fault conjugate to the Pichilemu fault, at a depth of 4.5 km, consistent with the result obtained from independent moment tensor inversion. We relocated aftershocks in the Pichilemu area using relative arrival times and a 3D velocity model. The spatial correlation between geodetic deformation and aftershocks reveals three additional areas which may have experienced aseismic slip at depth. Both geodetic displacement and aftershock distribution show a conjugated L

  4. Requirements for modeling airborne microbial contamination in space stations

    Science.gov (United States)

    Van Houdt, Rob; Kokkonen, Eero; Lehtimäki, Matti; Pasanen, Pertti; Leys, Natalie; Kulmala, Ilpo

    2018-03-01

    Exposure to bioaerosols is one of the facets that affect indoor air quality, especially for people living in densely populated or confined habitats, and is associated with a wide range of health effects. Good indoor air quality is thus vital and a prerequisite for fully confined environments such as space habitats. Bioaerosols and microbial contamination in these confined space stations can have significant health impacts, considering the unique prevailing conditions and constraints of such habitats. Therefore, biocontamination in space stations is strictly monitored and controlled to ensure crew and mission safety. However, efficient bioaerosol control measures rely on a solid understanding and knowledge of how these bioaerosols are created and dispersed, and which factors affect the survivability of the associated microorganisms. Here we review the current knowledge gained from relevant studies in this wide and multidisciplinary area of bioaerosol dispersion modeling and biological indoor air quality control, specifically taking into account space conditions.

  5. SWAT (Student Weekend Arborist Team): A Model for Land Grant Institutions and Cooperative Extension Systems to Conduct Street Tree Inventories

    Science.gov (United States)

    Cowett, F.D.; Bassuk, N.L.

    2012-01-01

    SWAT (Student Weekend Arborist Team) is a program affiliated with Cornell University and Extension founded to conduct street tree inventories in New York State communities with 10,000 residents or fewer, a group of communities underserved in community forestry planning. Between 2002 and 2010, SWAT conducted 40 inventories, and data from these…

  6. The Extension of Quality Function Deployment Based on 2-Tuple Linguistic Representation Model for Product Design under Multigranularity Linguistic Environment

    Directory of Open Access Journals (Sweden)

    Ming Li

    2012-01-01

    Full Text Available Quality function deployment (QFD) is a customer-driven approach for product design and development. A QFD analysis process includes a series of subprocesses, such as determination of the importance of customer requirements (CRs), the correlation among engineering characteristics (ECs), and the relationship between CRs and ECs. Usually, more than one group of decision makers is involved in the subprocesses to make the decision. In most decision making problems, they often provide their evaluation information in linguistic form. Moreover, because of different knowledge, background, and discrimination ability, decision makers may express their linguistic preferences in multigranularity linguistic information. Therefore, an effective approach to deal with the multi-granularity linguistic information in the QFD analysis process is highly needed. In this study, the QFD methodology is extended with the 2-tuple linguistic representation model under a multi-granularity linguistic environment. The extended QFD methodology can cope with multi-granularity linguistic evaluation information and avoid the loss of information. The applicability of the proposed approach is demonstrated with a numerical example.
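
    As background on the representation used above, the 2-tuple linguistic model (as commonly formulated by Herrera and Martínez) converts a symbolic aggregation result β in [0, g] into a pair (s_i, α) of a linguistic term and a symbolic translation, avoiding rounding losses. The sketch below is a generic illustration; the term set and input values are invented and not taken from the paper.

```python
# Illustrative sketch of the 2-tuple linguistic representation:
# Delta(beta) = (s_i, alpha) with i = round(beta) and alpha = beta - i in [-0.5, 0.5).
# Term set and input values are invented for illustration.

TERMS = ["none", "very low", "low", "medium", "high", "very high", "perfect"]  # s_0 .. s_6

def to_two_tuple(beta):
    """Convert an aggregation value beta in [0, g] to (term, symbolic translation)."""
    i = int(round(beta))
    return TERMS[i], beta - i

def from_two_tuple(term, alpha):
    """Inverse translation back to a numerical value in [0, g]."""
    return TERMS.index(term) + alpha

# Example: aggregate three assessments already unified on the same 7-term scale.
betas = [4.0, 5.0, 3.2]
avg = sum(betas) / len(betas)      # symbolic aggregation by arithmetic mean
print(to_two_tuple(avg))           # ('high', ~0.07): roughly "high", slightly above
```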

  7. On Early Conflict Identification by Requirements Modeling of Energy System Control Structures

    DEFF Research Database (Denmark)

    Heussen, Kai; Gehrke, Oliver; Niemann, Hans Henrik

    2015-01-01

    Control systems are purposeful systems involving goal-oriented information processing (cyber) and technical (physical) structures. Requirements modeling formalizes fundamental concepts and relations of a system architecture at a high-level design stage and can be used to identify potential design...... issues early. For requirements formulation of control structures, cyber and physical aspects need to be jointly represented to express interdependencies, check for consistency and discover potentially conflicting requirements. Early identification of potential conflicts may prevent larger problems...... modeling for early requirements checking using a suitable modeling language, and illustrates how this approach enables the identification of several classes of controller conflict....

  8. A semi-automated approach for generating natural language requirements documents based on business process models

    NARCIS (Netherlands)

    Aysolmaz, Banu; Leopold, Henrik; Reijers, Hajo A.; Demirörs, Onur

    2018-01-01

    Context: The analysis of requirements for business-related software systems is often supported by using business process models. However, the final requirements are typically still specified in natural language. This means that the knowledge captured in process models must be consistently

  9. Requirements-level semantics and model checking of object-oriented statecharts

    NARCIS (Netherlands)

    Eshuis, H.; Jansen, D.N.; Wieringa, Roelf J.

    2002-01-01

    In this paper we define a requirements-level execution semantics for object-oriented statecharts and show how properties of a system specified by these statecharts can be model checked using tool support for model checkers. Our execution semantics is requirements-level because it uses the perfect

  10. Employing 2D Forward Modeling of Gravity and Magnetic Data to Further Constrain the Magnitude of Extension Recorded by the Caetano Caldera, Nevada

    Science.gov (United States)

    Ritzinger, B. T.; Glen, J. M. G.; Athens, N. D.; Denton, K. M.; Bouligand, C.

    2015-12-01

    Regionally continuous Cenozoic rocks in the Basin and Range that predate the onset of major mid-Miocene extension provide valuable insight into the sequence of faulting and magnitude of extension. An exceptional example of this is the Caetano caldera, located in north-central Nevada, which formed during the eruption of the Caetano Tuff at the Eocene-Oligocene transition. The caldera and associated deposits, as well as conformable caldera-filling sedimentary and volcanic units, allow for the reconstruction of post-Oligocene extensional faulting. Extensive mapping and geochronologic, geochemical and paleomagnetic analyses have been conducted over the last decade to help further constrain the eruptive and extensional history of the Caetano caldera and associated deposits. Gravity and magnetic data, which highlight contrasts in density and magnetic properties (susceptibility and remanence), respectively, are useful for mapping and modeling structural and lithic discontinuities. By combining existing gravity and aeromagnetic data with newly collected high-resolution gravity data, we are performing detailed potential field modeling to better characterize the subsurface within and surrounding the caldera. Modeling is constrained by published geologic maps and cross sections and by new rock properties for these units determined from oriented drill core and hand samples collected from outcrops that span all of the major rock units in the study area. These models will enable us to better map the margins of the caldera and more accurately determine subsurface lithic boundaries and complex fault geometries, as well as aid in refining estimates of the magnitude of extension across the caldera. This work highlights the value of combining geologic and geophysical data to build an integrated structural model to help characterize the subsurface and better constrain the extensional tectonic history of this part of the Great Basin.

  11. Teaching as an Intervention: Evaluating the AIAI-FTFD Teaching Model and 9 Skills of Communication In an Extension Learning Environment

    Directory of Open Access Journals (Sweden)

    Victor W. Harris

    2016-02-01

    Full Text Available Extension educators are continually seeking ways to make instruction more effective and engaging. This study evaluated the Attention, Interact, Apply, and Invite – Fact, Think, Feel, Do (AIAI-FTFD) Start-to-Finish Teaching Model for human service educators in an ongoing Extension educational program to determine the effectiveness of this model in implementing the concept of "teaching as an intervention" in Extension educational programming. Specifically, the study assessed the cognitive, emotional, and behavioral (intent to change) learning outcomes generated by using the AIAI-FTFD teaching model while completing the 9 Important Communication Skills for Every Relationship (9 Skills) program. A self-reported quantitative evaluation design was utilized to assess key objectives in the sample (n = 152). Noticeable and clearly evident effect sizes were found in perceived knowledge gain and perceived confidence gain in the ability to implement the skills covered in the training. Subsequent discussion focuses on how the AIAI-FTFD Start-to-Finish Teaching Model can facilitate change and learning in educational settings.

  12. Selection Ideal Coal Suppliers of Thermal Power Plants Using the Matter-Element Extension Model with Integrated Empowerment Method for Sustainability

    Directory of Open Access Journals (Sweden)

    Zhongfu Tan

    2014-01-01

    Full Text Available In order to reduce thermal power generation cost and improve its market competitiveness, and considering fuel quality, cost, creditworthiness, and sustainable development capacity factors, this paper established an evaluation system for coal supplier selection of thermal power plants and put forward coal supplier selection strategies based on integrated empowering and ideal matter-element extension models. On the one hand, the integrated empowering model can overcome the limitations of subjective and objective methods of determining weights and better balance subjective and objective information. On the other hand, since the evaluation results of the traditional matter-element extension model may fall into the same class and yield only a partial ordering, the idealistic matter-element extension model is constructed to overcome this shortcoming. It selects the ideal positive and negative matter-elements classical field, uses the closeness degree to replace the traditional maximum degree of membership criterion, and calculates the positive or negative distance between the matter-element to be evaluated and the ideal matter-element; it can then obtain the full ordering of the evaluation schemes. Simulated and compared with the TOPSIS method, Romania selection method, and PROMETHEE method, the numerical example results show that the method put forward by this paper is effective and reliable.

  13. Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition

    Science.gov (United States)

    Ewing, Anthony; Adams, Charles

    2004-01-01

    Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficient to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.

  14. Meta-Model and UML Profile for Requirements Management of Software and Embedded Systems

    Directory of Open Access Journals (Sweden)

    Arpinen Tero

    2011-01-01

    Full Text Available Software and embedded system companies today encounter problems related to requirements management tool integration, incorrect tool usage, and lack of traceability. This is due to the use of tools with no clear meta-model and no semantics for communicating requirements between different stakeholders. This paper presents a comprehensive meta-model for requirements management. The focus is on the software and embedded system domains. The goal is to define generic requirements management domain concepts and abstract interfaces between requirements management and system development. This leads to a portable requirements management meta-model which can be adapted to various system modeling languages. The created meta-model is prototyped by translating it into a UML profile. The profile is imported into a UML tool which is used for rapid evaluation of meta-model concepts in practice. The developed profile is associated with a proof-of-concept report generator tool that automatically produces up-to-date documentation from the models in the form of web pages. The profile is then adopted to create an example model of an embedded system requirements specification.

  15. Feasibility and extension of universal quantitative models for moisture content determination in beta-lactam powder injections by near-infrared spectroscopy.

    Science.gov (United States)

    Zhang, Xue-Bo; Feng, Yan-Chun; Hu, Chang-Qin

    2008-12-23

    In the present work, we investigated the feasibility of universal calibration models for moisture content determination in a more complicated product system of powder injections, to simulate the process of building universal models for drug preparations with the same INN (International Nonproprietary Name) from diverse formulations and sources. We also extended the applicability of the universal model by model updating and calibration transfer. Firstly, a quantitative moisture content model for ceftriaxone sodium for injection was developed; the results show that a calibration model established for the products of some manufacturers is also applicable to the products of others. Then, we further constructed a multiplex calibration model for seven cephalosporins for injection ranging from 0.40 to 9.90%, yielding RMSECV and RMSEP of 0.283 and 0.261, respectively. However, this multiplex model could not accurately predict samples of another cephalosporin (ceftezole sodium) and one penicillin (penicillin G procaine) for injection. With regard to such limits and the extension of universal models, two solutions are proposed: model updating (MU) and calibration transfer. Overall, model updating is a robust method for the analytical problem under consideration. When timely model updating is impractical, the piecewise direct standardization (PDS) algorithm is more desirable and was applied to transfer the calibration model between different powder injections. Both solutions have proven effective in extending the applicability of the original universal models to newly emerging products.
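
    The RMSECV/RMSEP figures quoted above are root-mean-square errors of cross-validation and prediction, respectively. A minimal sketch of the RMSEP computation is given below; the reference and predicted moisture values are made-up examples, not data from the study.

```python
# Minimal RMSEP sketch: root-mean-square error between reference moisture contents
# (e.g. from Karl Fischer titration) and NIR-predicted values. Values are made up (%).
import numpy as np

def rmsep(reference, predicted):
    reference = np.asarray(reference, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return float(np.sqrt(np.mean((reference - predicted) ** 2)))

ref  = np.array([0.8, 1.5, 2.3, 4.0, 6.1, 9.2])
pred = np.array([1.0, 1.3, 2.6, 3.8, 6.4, 9.0])
print(f"RMSEP = {rmsep(ref, pred):.3f} %")
```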

  16. Extension of the segment-based Wilson and NRTL models for correlation of excess molar enthalpies of polymer solutions

    International Nuclear Information System (INIS)

    Sadeghi, Rahmat

    2005-01-01

    The polymer Wilson model and the polymer NRTL model have been extended for the representation of the excess enthalpy of multicomponent polymer solutions. The applicability of the obtained equations in correlating the excess enthalpies of polymer solutions has been examined. It is found that both models are suitable for representing the published excess enthalpy data for the tested polymer solutions.

  17. The effectiveness and limitations of fuel modeling using the fire and fuels extension to the Forest Vegetation Simulator

    Science.gov (United States)

    Erin K. Noonan-Wright; Nicole M. Vaillant; Alicia L. Reiner

    2014-01-01

    Fuel treatment effectiveness is often evaluated with fire behavior modeling systems that use fuel models to generate fire behavior outputs. How surface fuels are assigned, either using one of the 53 stylized fuel models or developing custom fuel models, can affect predicted fire behavior. We collected surface and canopy fuels data before and 1, 2, 5, and 8 years after...

  18. Linking nitrogen deposition to nitrate concentrations in groundwater below nature areas : modelling approach and data requirements

    NARCIS (Netherlands)

    Bonten, L.T.C.; Mol-Dijkstra, J.P.; Wieggers, H.J.J.; Vries, de W.; Pul, van W.A.J.; Hoek, van den K.W.

    2009-01-01

    This study determines the most suitable model and required model improvements to link atmospheric deposition of nitrogen and other elements in the Netherlands to measurements of nitrogen and other elements in the upper groundwater. The deterministic model SMARTml was found to be the most suitable

  19. On the Potential of Functional Modeling Extensions to the CIM for Means-Ends Representation and Reasoning

    DEFF Research Database (Denmark)

    Heussen, Kai; Kullmann, Daniel

    2010-01-01

    Engineering is the art of making complicated things work. There are few things an engineer can’t do. Explaining his work to a computer may be one of them. This paper introduces Functional Modeling with Multilevel Flow Models as an information modeling approach that explicitly relates the functions...

  20. Segment-Specific Adhesion as a Driver of Convergent Extension

    Science.gov (United States)

    Vroomans, Renske M. A.; Hogeweg, Paulien; ten Tusscher, Kirsten H. W. J.

    2015-01-01

    Convergent extension, the simultaneous extension and narrowing of tissues, is a crucial event in the formation of the main body axis during embryonic development. It involves processes on multiple scales: the sub-cellular, cellular and tissue level, which interact via explicit or intrinsic feedback mechanisms. Computational modelling studies play an important role in unravelling the multiscale feedbacks underlying convergent extension. Convergent extension usually operates in tissue which has been patterned or is currently being patterned into distinct domains of gene expression. How such tissue patterns are maintained during the large scale tissue movements of convergent extension has thus far not been investigated. Intriguingly, experimental data indicate that in certain cases these tissue patterns may drive convergent extension rather than requiring safeguarding against convergent extension. Here we use a 2D Cellular Potts Model (CPM) of a tissue prepatterned into segments, to show that convergent extension tends to disrupt this pre-existing segmental pattern. However, when cells preferentially adhere to cells of the same segment type, segment integrity is maintained without any reduction in tissue extension. Strikingly, we demonstrate that this segment-specific adhesion is by itself sufficient to drive convergent extension. Convergent extension is enhanced when we endow our in silico cells with persistence of motion, which in vivo would naturally follow from cytoskeletal dynamics. Finally, we extend our model to confirm the generality of our results. We demonstrate a similar effect of differential adhesion on convergent extension in tissues that can only extend in a single direction (as often occurs due to the inertia of the head region of the embryo), and in tissues prepatterned into a sequence of domains resulting in two opposing adhesive gradients, rather than alternating segments. PMID:25706823
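
    For readers unfamiliar with the Cellular Potts Model used above, its standard energy function combines type-dependent adhesion energies with an area (volume) constraint; segment-specific adhesion enters through the dependence of J on the cell types. A common textbook form is sketched below; the exact terms and parameter choices of the cited study may differ.

```latex
% Standard CPM Hamiltonian (sketch). J depends on the cell types tau (here, segment
% identities), which is how segment-specific adhesion is encoded; the second term
% constrains each cell's area a_sigma toward its target area A_sigma.
H = \sum_{(i,j)\ \mathrm{neighbours}} J\big(\tau(\sigma_i),\tau(\sigma_j)\big)\,
    \big(1-\delta_{\sigma_i,\sigma_j}\big)
  \;+\; \lambda \sum_{\sigma} \big(a_\sigma - A_\sigma\big)^2
```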

  1. Determining the Most Efficient Supplier Considering Imprecise Data in Data Envelopment Analysis (DEA), an extension of Toloo and Nalchigar's model

    Directory of Open Access Journals (Sweden)

    Morteza Rahmani

    2017-03-01

    Full Text Available Supplier selection in the supply chain, as a multi-criteria decision making problem (containing both qualitative and quantitative criteria), is one of the main factors in a successful supply chain. To this purpose, Toloo and Nalchigar (2011) proposed an integrated data envelopment analysis (DEA) model to find the most efficient (best) supplier by considering imprecise data. In this paper, it will be shown that their model randomly selects an efficient supplier as the most efficient, and therefore their model cannot find the most efficient supplier correctly. We also explain some other problems in this model and propose a modified model to resolve the drawbacks. The proposed model in this paper finds the most efficient supplier considering imprecise data by solving only one mixed integer linear programming problem. In addition, a new algorithm is proposed for determining and ranking other efficient suppliers. The efficiency of the proposed approach is illustrated by considering imprecise data for 18 suppliers.
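
    For context, the efficiency notion underlying DEA-based supplier selection is usually the CCR ratio model, in which each supplier (decision-making unit) is assessed with the input and output weights most favourable to itself. A generic formulation is sketched below for illustration only; the integrated mixed-integer model of the paper adds further constraints and imprecise-data handling not shown here.

```latex
% Generic CCR ratio model for DMU_o (illustrative; not the paper's full MILP).
% x_{ij}, y_{rj} are inputs and outputs of supplier j; u, v are decision weights.
\max_{u,v}\;\; \theta_o=\frac{\sum_{r} u_r\,y_{ro}}{\sum_{i} v_i\,x_{io}}
\quad\text{s.t.}\quad
\frac{\sum_{r} u_r\,y_{rj}}{\sum_{i} v_i\,x_{ij}}\le 1\;\;\forall j,
\qquad u_r,\,v_i\ge \varepsilon>0
```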

  2. Radiation Effects Investigations Based on Atmospheric Radiation Model (ATMORAD) Considering GEANT4 Simulations of Extensive Air Showers and Solar Modulation Potential.

    Science.gov (United States)

    Hubert, Guillaume; Cheminet, Adrien

    2015-07-01

    The natural radiative atmospheric environment is composed of secondary cosmic rays produced when primary cosmic rays hit the atmosphere. Understanding atmospheric radiation and its dynamics is essential for evaluating single event effects, so that radiation risks in aviation and the space environment (space weather) can be assessed. In this article, we present an atmospheric radiation model, named ATMORAD (Atmospheric Radiation), which is based on GEANT4 simulations of extensive air showers according to primary spectra that depend only on the solar modulation potential (force-field approximation). Based on neutron spectrometry, the solar modulation potential can be deduced using neutron spectrometer measurements and ATMORAD. Some comparisons between our methodology and standard approaches or measurements are also discussed. This work demonstrates the potential for using simulations of extensive air showers and neutron spectroscopy to monitor solar activity.
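
    In the force-field approximation referred to above, the modulated differential intensity at kinetic energy E per nucleon is commonly written in terms of the local interstellar spectrum and the modulation potential φ. A standard form is sketched below for orientation; the exact notation and parametrization used in ATMORAD may differ.

```latex
% Force-field approximation (standard form, for illustration):
% Phi = (Z e / A) * phi, with E0 the proton rest energy (~0.938 GeV).
J(E,\phi) \;=\; J_{\mathrm{LIS}}(E+\Phi)\,
\frac{E\,(E+2E_0)}{(E+\Phi)\,(E+\Phi+2E_0)},
\qquad \Phi=\frac{Ze}{A}\,\phi
```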

  3. Developing a Predictive Model for Unscheduled Maintenance Requirements on United States Air Force Installations

    National Research Council Canada - National Science Library

    Kovich, Matthew D; Norton, J. D

    2008-01-01

    .... This paper develops one such method by using linear regression and time series analysis to develop a predictive model to forecast future year man-hour and funding requirements for unscheduled maintenance...

  4. Personal recommender systems for learners in lifelong learning: requirements, techniques and model

    NARCIS (Netherlands)

    Drachsler, Hendrik; Hummel, Hans; Koper, Rob

    2007-01-01

    Drachsler, H., Hummel, H. G. K., & Koper, R. (2008). Personal recommender systems for learners in lifelong learning: requirements, techniques and model. International Journal of Learning Technology, 3(4), 404-423.

  5. Towards a framework for improving goal-oriented requirement models quality

    OpenAIRE

    Cares, Carlos; Franch Gutiérrez, Javier

    2009-01-01

    Goal-orientation is a widespread and useful approach to Requirements Engineering. However, quality assessment frameworks focused on goal-oriented processes are either limited or remain on the theoretical side. Requirements quality initiatives range from simple metrics applicable to requirements documents, to general-purpose quality frameworks that include syntactic, semantic and pragmatic concerns. In some recent works, we have proposed a metrics framework for goal-oriented models, b...

  6. Development and validation of extensive growth and growth boundary models for psychrotolerant pseudomonads in seafood, meat and vegetable products

    DEFF Research Database (Denmark)

    Martinez Rios, Veronica; Dalgaard, Paw

    ., Int. J. Food Microbiol. 216, 110-120, 2016). MIC-values for acetic, benzoic and citric acids were determined in broth and terms modelling their antimicrobial effect were added to the model. Cardinal parameter values for CO2 and aw were obtained from the literature. The new model included 9 environmental...... literature data. Performance of the new expanded model was equally good for seafood and meat products, and the importance of including the effect of acetic, benzoic, citric acids and CO2 in order to accurately predict growth of psychrotolerant pseudomonads was clearly demonstrated, e.g. for brined shrimps

  7. Development of a model to compute the extension of life supporting zones for Earth-like exoplanets.

    Science.gov (United States)

    Neubauer, David; Vrtala, Aron; Leitner, Johannes J; Firneis, Maria G; Hitzenberger, Regina

    2011-12-01

    A radiative convective model to calculate the width and the location of the life supporting zone (LSZ) for different, alternative solvents (i.e. other than water) is presented. This model can be applied to the atmospheres of the terrestrial planets in the solar system as well as (hypothetical, Earth-like) terrestrial exoplanets. Cloud droplet formation and growth are investigated using a cloud parcel model. Clouds can be incorporated into the radiative transfer calculations. Test runs for Earth, Mars and Titan show a good agreement of model results with observations.

  8. A manpower training requirements model for new weapons systems, with applications to the infantry fighting vehicle

    OpenAIRE

    Kenehan, Douglas J.

    1981-01-01

    Approved for public release; distribution is unlimited This thesis documents the methodology and parameters used in designing a manpower training requirements model for new weapons systems. This model provides manpower planners with the capability of testing alternative fielding policies and adjusting model parameters to improve the use of limited personnel resources. Use of the model is illustrated in a detailed analysis of the planned introduction of the Infantry Fighting Vehicle into t...

  9. Fuel requirements for experimental devices in MTR reactors. A perturbation model for reactor core analysis

    International Nuclear Information System (INIS)

    Beeckmans de West-Meerbeeck, A.

    1991-01-01

    Irradiation in neutron absorbing devices, requiring high fast neutron fluxes in the core or high thermal fluxes in the reflector and flux traps, leads to higher density fuel and larger core dimensions. A perturbation model of the reactor core helps to estimate the fuel requirements. (orig.)

  10. Model-Based Requirements Analysis for Reactive Systems with UML Sequence Diagrams and Coloured Petri Nets

    DEFF Research Database (Denmark)

    Tjell, Simon; Lassen, Kristian Bisgaard

    2008-01-01

    In this paper, we describe a formal foundation for a specialized approach to automatically checking traces against real-time requirements. The traces are obtained from simulation of Coloured Petri Net (CPN) models of reactive systems. The real-time requirements are expressed in terms of a derivat...

  11. The Nuremberg Code subverts human health and safety by requiring animal modeling

    OpenAIRE

    Greek, Ray; Pippus, Annalea; Hansen, Lawrence A

    2012-01-01

    Background: The requirement that animals be used in research and testing in order to protect humans was formalized in the Nuremberg Code and subsequent national and international laws, codes, and declarations. Discussion: We review the history of these requirements and contrast what was known via science about animal models then with what is known now. We further analyze the predictive...

  12. State of the Art : Integrated Management of Requirements in Model-Based Software Engineering

    OpenAIRE

    Thörn, Christer

    2006-01-01

    This report describes the background and future of research concerning integrated management of requirements in model-based software engineering. The focus is on describing the relevant topics and existing theoretical backgrounds that form the basis for the research. The report describes the fundamental difficulties of requirements engineering for software projects, and proposes that the results and methods of models in software engineering can help leverage those problems. Taking inspiration...

  13. A Hybrid Parallel Execution Model for Logic Based Requirement Specifications (Invited Paper)

    Directory of Open Access Journals (Sweden)

    Jeffrey J. P. Tsai

    1999-05-01

    Full Text Available It is well known that undiscovered errors in a requirements specification are extremely expensive to fix when discovered in the software maintenance phase. Errors in the requirements phase can be reduced through the validation and verification of the requirements specification. Many logic-based requirements specification languages have been developed to achieve these goals. However, the execution and reasoning of a logic-based requirements specification can be very slow. An effective way to improve their performance is to execute and reason over the logic-based requirements specification in parallel. In this paper, we present a hybrid model to facilitate the parallel execution of a logic-based requirements specification language. A data dependency analysis technique is first applied to a logic-based specification to find all the mode combinations that exist within a specification clause. This mode information is used to support a novel hybrid parallel execution model, which combines both top-down and bottom-up evaluation strategies. This new execution model can find the failure in the deepest node of the search tree at an early stage of the evaluation, and can thus reduce the total number of nodes searched in the tree, the total number of processes that need to be generated, and the total number of communication channels needed in the search process. A simulator has been implemented to analyze the execution behavior of the new model. Experiments show significant improvement based on several criteria.

  14. Continuous multivariate exponential extension

    International Nuclear Information System (INIS)

    Block, H.W.

    1975-01-01

    The Freund-Weinman multivariate exponential extension is generalized to the case of nonidentically distributed marginal distributions. A fatal shock model is given for the resulting distribution. Results in the bivariate case and the concept of constant multivariate hazard rate lead to a continuous distribution related to the multivariate exponential distribution (MVE) of Marshall and Olkin. This distribution is shown to be a special case of the extended Freund-Weinman distribution. A generalization of the bivariate model of Proschan and Sullo leads to a distribution which contains both the extended Freund-Weinman distribution and the MVE

  15. An extension of Box-Jenkins transfer/noise models for spatial interpolation of groundwater head series

    NARCIS (Netherlands)

    Geer, F.C. van; Zuur, A.F.

    1997-01-01

    This paper advocates an approach to extend single-output Box-Jenkins transfer/noise models for several groundwater head series to a multiple-output transfer/noise model. The approach links several groundwater head series and enables a spatial interpolation in terms of time series analysis. Our

  16. Predicting User Acceptance of Collaborative Technologies: An Extension of the Technology Acceptance Model for E-Learning

    Science.gov (United States)

    Cheung, Ronnie; Vogel, Doug

    2013-01-01

    Collaborative technologies support group work in project-based environments. In this study, we enhance the technology acceptance model to explain the factors that influence the acceptance of Google Applications for collaborative learning. The enhanced model was empirically evaluated using survey data collected from 136 students enrolled in a…

  17. Kinetics of autothermal thermophilic aerobic digestion - application and extension of Activated Sludge Model No 1 at thermophilic temperatures.

    Science.gov (United States)

    Kovács, R; Miháltz, P; Csikor, Zs

    2007-01-01

    The application of an ASM1-based mathematical model for the modeling of autothermal thermophilic aerobic digestion is demonstrated. Based on former experimental results, the original ASM1 was extended with the activation of facultative thermophiles from the feed sludge, and a new component, the thermophilic biomass, was introduced. The resulting model was calibrated in the temperature range of 20-60 degrees C. The temperature dependence of the growth and decay rates in the model is given in terms of the slightly modified Arrhenius and Topiwala-Sinclair equations. The capabilities of the calibrated model in realistic ATAD scenarios are demonstrated with a focus on the autothermal properties of ATAD systems under different conditions.
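
    As orientation for the temperature dependence mentioned above, the Arrhenius-type correction used in activated sludge practice is usually an exponential factor relative to a reference temperature; a generic form is sketched below. The paper's modified Arrhenius and Topiwala-Sinclair expressions contain additional terms not shown here.

```latex
% Generic ASM-style Arrhenius temperature correction (illustration only):
% theta is an empirical temperature coefficient and T_ref a reference temperature (e.g. 20 C).
\mu_{\max}(T) \;=\; \mu_{\max}(T_{\mathrm{ref}})\,
\exp\!\big[\theta\,(T - T_{\mathrm{ref}})\big]
```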

  18. SU-F-J-138: An Extension of PCA-Based Respiratory Deformation Modeling Via Multi-Linear Decomposition

    Energy Technology Data Exchange (ETDEWEB)

    Iliopoulos, AS; Sun, X [Duke University, Durham, North Carolina (United States); Pitsianis, N [Aristotle University of Thessaloniki (Greece); Duke University, Durham, North Carolina (United States); Yin, FF; Ren, L

    2016-06-15

    Purpose: To address and lift the limited degree of freedom (DoF) of globally bilinear motion components such as those based on principal components analysis (PCA), for encoding and modeling volumetric deformation motion. Methods: We provide a systematic approach to obtaining a multi-linear decomposition (MLD) and associated motion model from deformation vector field (DVF) data. We had previously introduced MLD for capturing multi-way relationships between DVF variables, without being restricted by the bilinear component format of PCA-based models. PCA-based modeling is commonly used for encoding patient-specific deformation as per planning 4D-CT images, and aiding on-board motion estimation during radiotherapy. However, the bilinear space-time decomposition inherently limits the DoF of such models by the small number of respiratory phases. While this limit is not reached in model studies using analytical or digital phantoms with low-rank motion, it compromises modeling power in the presence of relative motion, asymmetries and hysteresis, etc, which are often observed in patient data. Specifically, a low-DoF model will spuriously couple incoherent motion components, compromising its adaptability to on-board deformation changes. By the multi-linear format of extracted motion components, MLD-based models can encode higher-DoF deformation structure. Results: We conduct mathematical and experimental comparisons between PCA- and MLD-based models. A set of temporally-sampled analytical trajectories provides a synthetic, high-rank DVF; trajectories correspond to respiratory and cardiac motion factors, including different relative frequencies and spatial variations. Additionally, a digital XCAT phantom is used to simulate a lung lesion deforming incoherently with respect to the body, which adheres to a simple respiratory trend. In both cases, coupling of incoherent motion components due to a low model DoF is clearly demonstrated. Conclusion: Multi-linear decomposition can

  19. Specialty Care Access Network-Extension of Community Healthcare Outcomes Model Program for Liver Disease Improves Specialty Care Access.

    Science.gov (United States)

    Glass, Lisa M; Waljee, Akbar K; McCurdy, Heather; Su, Grace L; Sales, Anne

    2017-12-01

    To improve subspecialty access, VA Ann Arbor Healthcare System (VAAAHS) implemented the first Specialty Care Access Network (SCAN)-Extension of Community Healthcare Outcomes (ECHO) in chronic liver disease. SCAN-ECHO Liver links primary care providers (PCPs) to hepatologists via secure video-teleconferencing. We aim to describe characteristics of participants (PCPs) and patients (clinical question and diagnosis) in SCAN-ECHO Liver. This is a prospective study of the VAAAHS SCAN-ECHO Liver (June 10, 2011-March 31, 2015). This evaluation was carried out as a non-research activity under the guidance furnished by VHA Handbook 1058.05. It was approved through the Medicine Service at VAAAHS as noted in the attestation document, which serves as documentation of approved non-research, quality improvement activities in VHA. In total, 106 PCPs from 23 sites participated. A total of 155 SCAN-ECHO sessions discussed 519 new and 49 return patients. 29.4% of Liver Clinic requests were completed in SCAN-ECHO Liver. SCAN-ECHO Liver consults were completed an average of 10 days sooner than in conventional clinic. The potential travel saving was 250 miles round-trip per patient (median 255, IQR 142-316). SCAN-ECHO Liver provided specialty care with increased efficiency and convenience for chronic liver disease patients. One in three Liver Clinic consults was diverted to SCAN-ECHO Liver, reducing consult completion time by 20%.

  20. Extensions of the Rosner-Colditz breast cancer prediction model to include older women and type-specific predicted risk.

    Science.gov (United States)

    Glynn, Robert J; Colditz, Graham A; Tamimi, Rulla M; Chen, Wendy Y; Hankinson, Susan E; Willett, Walter W; Rosner, Bernard

    2017-08-01

    A breast cancer risk prediction rule previously developed by Rosner and Colditz has reasonable predictive ability. We developed a re-fitted version of this model, based on more than twice as many cases now including women up to age 85, and further extended it to a model that distinguished risk factor prediction of tumors with different estrogen/progesterone receptor status. We compared the calibration and discriminatory ability of the original, the re-fitted, and the type-specific models. Evaluation used data from the Nurses' Health Study during the period 1980-2008, when 4384 incident invasive breast cancers occurred over 1.5 million person-years. Model development used two-thirds of study subjects and validation used one-third. Predicted risks in the validation sample from the original and re-fitted models were highly correlated (ρ = 0.93), but several parameters, notably those related to use of menopausal hormone therapy and age, had different estimates. The re-fitted model was well-calibrated and had an overall C-statistic of 0.65. The extended, type-specific model identified several risk factors with varying associations with occurrence of tumors of different receptor status. However, this extended model relative to the prediction of any breast cancer did not meaningfully reclassify women who developed breast cancer to higher risk categories, nor women remaining cancer free to lower risk categories. The re-fitted Rosner-Colditz model has applicability to risk prediction in women up to age 85, and its discrimination is not improved by consideration of varying associations across tumor subtypes.

  1. Disentangling early language development: modeling lexical and grammatical acquisition using an extension of case-study methodology.

    Science.gov (United States)

    Robinson, B F; Mervis, C B

    1998-03-01

    The early lexical and grammatical development of 1 male child is examined with growth curves and dynamic-systems modeling procedures. Lexical development described a pattern of logistic growth (R2 = .98). Lexical and plural development shared the following characteristics: plural growth began only after a threshold was reached in vocabulary size, and lexical growth slowed as plural growth increased. As plural use reached full mastery, lexical growth again began to increase. It was hypothesized that a precursor model (P. van Geert, 1991) would fit these data. Subsequent testing indicated that the precursor model, modified to incorporate brief yet intensive plural growth, provided a suitable fit. The value of the modified precursor model for the explication of processes implicated in language development is discussed.
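
    The logistic growth pattern reported for vocabulary size is conventionally written as below; the parameter names are generic and do not correspond to the fitted values of the study.

```latex
% Logistic growth curve (generic form): K is the carrying capacity (asymptotic
% vocabulary size), r the growth rate and t_0 the time of the inflection point.
V(t) \;=\; \frac{K}{1 + e^{-r\,(t - t_0)}}
```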

  2. Extension of the Eddy Dissipation Concept for Improved Low-Cost Turbulence-Chemistry Interaction Modeling, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The one CFD modeling area that has remained the most challenging, yet most critical to the success of integrated propulsion system simulations, is turbulence...

  3. Experimental and computational models of neurite extension at a choice point in response to controlled diffusive gradients

    Science.gov (United States)

    Catig, G. C.; Figueroa, S.; Moore, M. J.

    2015-08-01

    Objective. Axons are guided toward desired targets through a series of choice points that they navigate by sensing cues in the cellular environment. A better understanding of how microenvironmental factors influence neurite growth during development can inform strategies to address nerve injury. Therefore, there is a need for biomimetic models to systematically investigate the influence of guidance cues at such choice points. Approach. We ran an adapted in silico biased turning axon growth model under the influence of nerve growth factor (NGF) and compared the results to corresponding in vitro experiments. We examined if growth simulations were predictive of neurite population behavior at a choice point. We used a biphasic micropatterned hydrogel system consisting of an outer cell restrictive mold that enclosed a bifurcated cell permissive region and placed a well near a bifurcating end to allow proteins to diffuse and form a gradient. Experimental diffusion profiles in these constructs were used to validate a diffusion computational model that utilized experimentally measured diffusion coefficients in hydrogels. The computational diffusion model was then used to establish defined soluble gradients within the permissive region of the hydrogels and maintain the profiles in physiological ranges for an extended period of time. Computational diffusion profiles informed the neurite growth model, which was compared with neurite growth experiments in the bifurcating hydrogel constructs. Main results. Results indicated that when applied to the constrained choice point geometry, the biased turning model predicted experimental behavior closely. Results for both simulated and in vitro neurite growth studies showed a significant chemoattractive response toward the bifurcated end containing an NGF gradient compared to the control, though some neurites were found in the end with no NGF gradient. Significance. The integrated model of neurite growth we describe will allow

  4. Modeling of free and confined turbulent natural gas flames using an extension of CFX-F3D

    Energy Technology Data Exchange (ETDEWEB)

    Roekaerts, D. [Shell Research and Technology Centre, Amsterdam (Netherlands); Hsu, A.

    1997-12-31

    A general form of the fast chemistry / assumed shape probability density function model for turbulent gaseous diffusion flames has been implemented in a new combination of computer programs consisting of the commercial code CFX-F3D (formerly CFDS-FLOW3D) and the program FLAME, developed at Delft University of Technology. A mixedness-reactedness model with two independent variables (mixture fraction and reaction progress variable) has also been implemented. The main strength of the new program is that it combines the advantages of a general purpose commercial CFD code (applicable to arbitrarily shaped domains, wide range of solvers) with the advantages of special purpose combustion subroutines (more detail in modeling of chemistry and of turbulence-chemistry interaction, flexibility). The new combination of programs has been validated by application to the prediction of the properties of a lab-scale turbulent natural gas diffusion flame for which detailed measurements are available. The mixedness-reactedness model has been applied to the case of a confined natural gas diffusion flame at globally rich conditions. In contrast with fast chemistry models, the mixedness-reactedness model can be used to predict the amount of methane at the end of the reactor vessel ('methane slip') as a function of operating conditions. (author)
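
    The assumed-shape PDF approach can be made concrete with the common choice of a beta PDF in mixture fraction: given a cell's mean and variance of the mixture fraction, any state relation is averaged over that PDF. The sketch below shows this generic construction with an invented state relation and numbers; it is not the CFX-F3D/FLAME implementation.

```python
# Generic presumed beta-PDF averaging over mixture fraction (illustrative only).
import numpy as np
from scipy.stats import beta
from scipy.integrate import quad

def beta_pdf_average(phi, f_mean, f_var):
    """Average phi(f) over a beta PDF with the given mean and variance of f."""
    gamma = f_mean * (1.0 - f_mean) / max(f_var, 1e-12) - 1.0
    a, b = f_mean * gamma, (1.0 - f_mean) * gamma
    value, _ = quad(lambda f: phi(f) * beta.pdf(f, a, b), 0.0, 1.0, limit=200)
    return value

# Crude flame-sheet temperature profile peaking at a stoichiometric mixture
# fraction f_st (hypothetical numbers, standing in for a real state relation).
f_st = 0.055
T = lambda f: 300.0 + 1700.0 * (f / f_st if f < f_st else (1.0 - f) / (1.0 - f_st))

print(beta_pdf_average(T, f_mean=0.06, f_var=0.002))  # turbulence-averaged T
print(T(0.06))                                        # value without averaging
```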

  5. Estimating required information size by quantifying diversity in random-effects model meta-analyses

    DEFF Research Database (Denmark)

    Wetterslev, Jørn; Thorlund, Kristian; Brok, Jesper

    2009-01-01

    BACKGROUND: There is increasing awareness that meta-analyses require a sufficiently large information size to detect or reject an anticipated intervention effect. The required information size in a meta-analysis may be calculated from an anticipated a priori intervention effect or from...... an intervention effect suggested by trials with low-risk of bias. METHODS: Information size calculations need to consider the total model variance in a meta-analysis to control type I and type II errors. Here, we derive an adjusting factor for the required information size under any random-effects model meta......-trial variability and a sampling error estimate considering the required information size. D2 is different from the intuitively obvious adjusting factor based on the common quantification of heterogeneity, the inconsistency (I2), which may underestimate the required information size. Thus, D2 and I2 are compared...
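
    A rough numerical illustration using standard meta-analysis formulas, not code from the paper; treating D2 as the relative variance reduction when moving from the random-effects to the fixed-effect pooled estimate, and 1/(1 - D2) as the adjustment factor for the required information size, is an assumption based on the usual presentation of these quantities.

```python
# Standard inverse-variance meta-analysis quantities (hypothetical trial data):
# fixed- and random-effects pooled variances, I2, a diversity-style D2, and the
# corresponding adjustment of a required information size.
import numpy as np

effects = np.array([0.10, 0.35, -0.05, 0.25, 0.40])   # hypothetical trial log-ORs
variances = np.array([0.04, 0.02, 0.05, 0.03, 0.01])  # their within-trial variances

w = 1.0 / variances
mu_fixed = np.sum(w * effects) / np.sum(w)
Q = np.sum(w * (effects - mu_fixed) ** 2)
k = len(effects)
I2 = max(0.0, (Q - (k - 1)) / Q)                       # inconsistency

# DerSimonian-Laird between-trial variance and random-effects weights
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
w_r = 1.0 / (variances + tau2)

V_fixed = 1.0 / np.sum(w)      # variance of the fixed-effect pooled estimate
V_random = 1.0 / np.sum(w_r)   # variance of the random-effects pooled estimate
D2 = (V_random - V_fixed) / V_random

ris_no_heterogeneity = 2000    # hypothetical information size ignoring heterogeneity
print(f"I2={I2:.2f}, D2={D2:.2f}, adjusted IS={ris_no_heterogeneity / (1 - D2):.0f}")
```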

  6. A Functional Model of Quality Assurance for Psychiatric Hospitals and Corresponding Staffing Requirements.

    Science.gov (United States)

    Kamis-Gould, Edna; And Others

    1991-01-01

    A model for quality assurance (QA) in psychiatric hospitals is described. Its functions (general QA, utilization review, clinical records, evaluation, management information systems, risk management, and infection control), subfunctions, and corresponding staffing requirements are reviewed. This model was designed to foster standardization in QA…

  7. Capital Regulation, Liquidity Requirements and Taxation in a Dynamic Model of Banking

    NARCIS (Netherlands)

    Di Nicolo, G.; Gamba, A.; Lucchetta, M.

    2011-01-01

    This paper formulates a dynamic model of a bank exposed to both credit and liquidity risk, which can resolve financial distress in three costly forms: fire sales, bond issuance and equity issuance. We use the model to analyze the impact of capital regulation, liquidity requirements and taxation on

  8. Capital Regulation, Liquidity Requirements and Taxation in a Dynamic Model of Banking

    NARCIS (Netherlands)

    Di Nicolo, G.; Gamba, A.; Lucchetta, M.

    2011-01-01

    This paper formulates a dynamic model of a bank exposed to both credit and liquidity risk, which can resolve financial distress in three costly forms: fire sales, bond issuance and equity issuance. We use the model to analyze the impact of capital regulation, liquidity requirements and taxation on

  9. Knowledge Based Characterization of Cross-Models Constraints to Check Design and Modeling Requirements

    Science.gov (United States)

    Simonn Zayas, David; Monceaux, Anne; Ait-Ameur, Yamine

    2011-08-01

    Nowadays, the complexity of systems frequently implies different engineering teams handling various descriptive models. Because each team has its own expertise background, domain knowledge and modeling practices, heterogeneity of the models themselves is a logical consequence. Therefore, even when models are individually well managed, their diversity becomes a problem when engineers need to share their models to perform overall validations. One way of reducing this heterogeneity is to take into consideration the implicit knowledge which is not contained in the models but is essential to understanding them. In a first stage of our research, we defined and implemented an approach recommending the formalization of implicit knowledge to enrich models in order to ease cross-model checks. Nevertheless, to fill the gap between the specification of the system and the validation of a cross-model constraint, in this paper we suggest giving values to some relevant characteristics to reinforce the approach.
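
    A toy illustration of the idea (all names and values hypothetical): once the implicit unit conventions of two models are made explicit as metadata, a cross-model constraint can be checked automatically.

```python
# Two models describe the same quantity with different implicit conventions;
# explicit unit metadata lets a simple cross-model constraint be checked.
UNIT_TO_KG = {"kg": 1.0, "g": 1e-3, "t": 1e3}

thermal_model = {"component": "battery", "mass": 12.5, "unit": "kg"}
structural_model = {"component": "battery", "mass": 12500.0, "unit": "g"}

def masses_consistent(m1, m2, rel_tol=1e-6):
    """Cross-model constraint: both models must agree on the component's mass."""
    kg1 = m1["mass"] * UNIT_TO_KG[m1["unit"]]
    kg2 = m2["mass"] * UNIT_TO_KG[m2["unit"]]
    return abs(kg1 - kg2) <= rel_tol * max(kg1, kg2)

assert masses_consistent(thermal_model, structural_model)
```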

  10. Dirac Triplet Extension of the MSSM

    CERN Document Server

    Alvarado, C.; Martin, A.; Ostdiek, B.

    2015-08-13

    In this paper we explore extensions of the Minimal Supersymmetric Standard Model involving two $SU(2)_L$ triplet chiral superfields that share a superpotential Dirac mass yet only one of which couples to the Higgs fields. This choice is motivated by recent work using two singlet superfields with the same superpotential requirements. We find that, as in the singlet case, the Higgs mass in the triplet extension can easily be raised to $125\,\text{GeV}$ without introducing large fine-tuning. For triplets that carry hypercharge, the regions of least fine-tuning are characterized by small contributions to the $\mathcal T$ parameter, and light stop squarks, $m_{\tilde t_1} \sim 300-450\,\text{GeV}$; the latter is a result of the $\tan\beta$ dependence of the triplet contribution to the Higgs mass. Despite such light stop masses, these models are viable provided the stop-electroweakino spectrum is sufficiently compressed.

  11. Input data requirements for performance modelling and monitoring of photovoltaic plants

    DEFF Research Database (Denmark)

    Gavriluta, Anamaria Florina; Spataru, Sergiu; Sera, Dezso

    2018-01-01

    This work investigates the input data requirements in the context of performance modeling of thin-film photovoltaic (PV) systems. The analysis focuses on the PVWatts performance model, well suited for on-line performance monitoring of PV strings, due to its low number of parameters and high...... accuracy. The work aims to identify the minimum amount of input data required for parameterizing an accurate model of the PV plant. The analysis was carried out for both amorphous silicon (a-Si) and cadmium telluride (CdTe), using crystalline silicon (c-Si) as a base for comparison. In the studied cases...
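
    For context, a simplified PVWatts-style DC power relation is sketched below (parameter values are illustrative; the full model and its parameterization are documented by NREL): output scales with plane-of-array irradiance and changes linearly with cell temperature relative to a 25 degC reference.

```python
# Simplified PVWatts-style DC power estimate (illustrative parameter values).
def pvwatts_dc(g_poa, t_cell, p_dc0=4000.0, gamma_pdc=-0.002):
    """g_poa: plane-of-array irradiance in W/m2; t_cell: cell temperature in degC;
    p_dc0: nameplate DC rating in W; gamma_pdc: power temperature coefficient per
    degC (module-technology dependent, e.g. different for a-Si, CdTe and c-Si)."""
    return p_dc0 * (g_poa / 1000.0) * (1.0 + gamma_pdc * (t_cell - 25.0))

print(pvwatts_dc(g_poa=800.0, t_cell=45.0))   # ~3072 W for this hypothetical string
```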

  12. Recent extensions and use of the statistical model code EMPIRE-II - version: 2.17 Millesimo

    International Nuclear Information System (INIS)

    Herman, M.

    2003-01-01

    These lecture notes describe new features of the modular code EMPIRE-2.17, designed to perform comprehensive calculations of nuclear reactions using a variety of nuclear reaction models. Compared to version 2.13, the current release has been extended by including the coupled-channels mechanism, the exciton model, a Monte Carlo approach to preequilibrium emission, the use of microscopic level densities, width fluctuation corrections, detailed calculation of the recoil spectra, and powerful plotting capabilities provided by the ZVView package. The second part of this lecture concentrates on the use of the code in practical calculations, with emphasis on the aspects relevant to nuclear data evaluation. In particular, adjusting model parameters is discussed in detail. (author)

  13. Sociologists in Extension

    Science.gov (United States)

    Christenson, James A.; And Others

    1977-01-01

    The article describes the work activities of the extension sociologist, the relative advantage and disadvantage of extension roles in relation to teaching/research roles, and the relevance of sociological training and research for extension work. (NQ)

  14. Use of continuous lactose fermentation for ethanol production by Kluveromyces marxianus for verification and extension of a biochemically structured model

    DEFF Research Database (Denmark)

    Sansonetti, S.; Hobley, Timothy John; Curcio, S.

    2013-01-01

    A biochemically structured model has been developed to describe the continuous fermentation of lactose to ethanol by Kluveromyces marxianus and allowed metabolic coefficients to be determined. Anaerobic lactose-limited chemostat fermentations at different dilution rates (0.02 – 0.35 h-1) were...... performed. Species specific rates of consumption/formation, as well as yield coefficients were determined. Ethanol yield (0.655 C-mol ethanol*C-mol lactose-1) was as high as 98 % of theoretical. The modeling procedure allowed calculation of maintenance coefficients for lactose consumption and ethanol...
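
    A standard way to extract such coefficients from chemostat data (a textbook Pirt-type analysis with invented numbers, not necessarily the structured model of the paper): at steady state the specific growth rate equals the dilution rate D, so a linear fit of the specific lactose uptake rate against D gives the maximum yield (inverse slope) and the maintenance coefficient (intercept).

```python
# Pirt-type maintenance analysis of chemostat data: q_s = D / Y_max + m_s.
import numpy as np

# Hypothetical steady-state data: dilution rate D (1/h) and specific lactose
# uptake rate q_s (C-mol lactose per C-mol biomass per h).
D = np.array([0.05, 0.10, 0.15, 0.20, 0.25, 0.30])
q_s = np.array([0.16, 0.27, 0.38, 0.48, 0.59, 0.70])

slope, intercept = np.polyfit(D, q_s, 1)
Y_max = 1.0 / slope   # maximum biomass yield, C-mol biomass per C-mol lactose
m_s = intercept       # maintenance consumption, C-mol lactose/(C-mol biomass*h)
print(f"Y_max ~ {Y_max:.2f} C-mol/C-mol, m_s ~ {m_s:.3f} C-mol/(C-mol h)")
```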

  15. arXiv Effective description of general extensions of the Standard Model: the complete tree-level dictionary

    CERN Document Server

    de Blas, J.; Perez-Victoria, M.; Santiago, J.

    2018-03-19

    We compute all the tree-level contributions to the Wilson coefficients of the dimension-six Standard-Model effective theory in ultraviolet completions with general scalar, spinor and vector field content and arbitrary interactions. No assumption about the renormalizability of the high-energy theory is made. This provides a complete ultraviolet/infrared dictionary at the classical level, which can be used to study the low-energy implications of any model of interest, and also to look for explicit completions consistent with low-energy data.

  16. The Nuremberg Code subverts human health and safety by requiring animal modeling

    Directory of Open Access Journals (Sweden)

    Greek, Ray

    2012-07-01

    Background: The requirement that animals be used in research and testing in order to protect humans was formalized in the Nuremberg Code and subsequent national and international laws, codes, and declarations. Discussion: We review the history of these requirements and contrast what was known via science about animal models then with what is known now. We further analyze the predictive value of animal models when used as test subjects for human response to drugs and disease. We explore the use of animals for models in toxicity testing as an example of the problem with using animal models. Summary: We conclude that the requirements for animal testing found in the Nuremberg Code were based on scientifically outdated principles, compromised by people with a vested interest in animal experimentation, serve no useful function, increase the cost of drug development, and prevent otherwise safe and efficacious drugs and therapies from being implemented.

  17. The Nuremberg Code subverts human health and safety by requiring animal modeling.

    Science.gov (United States)

    Greek, Ray; Pippus, Annalea; Hansen, Lawrence A

    2012-07-08

    The requirement that animals be used in research and testing in order to protect humans was formalized in the Nuremberg Code and subsequent national and international laws, codes, and declarations. We review the history of these requirements and contrast what was known via science about animal models then with what is known now. We further analyze the predictive value of animal models when used as test subjects for human response to drugs and disease. We explore the use of animals for models in toxicity testing as an example of the problem with using animal models. We conclude that the requirements for animal testing found in the Nuremberg Code were based on scientifically outdated principles, compromised by people with a vested interest in animal experimentation, serve no useful function, increase the cost of drug development, and prevent otherwise safe and efficacious drugs and therapies from being implemented.

  18. Workshop on Functional Requirements for the Modeling of Fate and Transport of Waterborne CBRN Materials

    Energy Technology Data Exchange (ETDEWEB)

    Giles, GE

    2005-02-03

    The purpose of this Workshop on "Functional Requirements for the Modeling of Fate and Transport of Waterborne CBRN Materials" was to solicit functional requirements for tools that help Incident Managers plan for and deal with the consequences of industrial or terrorist releases of materials into the nation's waterways and public water utilities. Twenty representatives attended and several made presentations. Several hours of discussion elicited a set of requirements, which were summarized in a form that allowed attendees to vote on their highest priorities. These votes were used to determine the prioritized requirements reported in this paper, which can be used to direct future development.

  19. Connecting Requirements to Architecture and Analysis via Model-Based Systems Engineering

    Science.gov (United States)

    Cole, Bjorn F.; Jenkins, J. Steven

    2015-01-01

    In traditional systems engineering practice, architecture, concept development, and requirements development are related but still separate activities. Concepts for operation, key technical approaches, and related proofs of concept are developed. These inform the formulation of an architecture at multiple levels, starting with the overall system composition and functionality and progressing into more detail. As this formulation is done, a parallel activity develops a set of English statements that constrain solutions. These requirements are often called "shall statements" since they are formulated to use "shall." The separation of requirements from design is exacerbated by well-meaning tools like the Dynamic Object-Oriented Requirements System (DOORS) that remain separate from engineering design tools. With the Europa Clipper project, efforts are being made to change the requirements development approach from a separate activity to one intimately embedded in the formulation effort. This paper presents a modeling approach and related tooling to generate English requirement statements from constraints embedded in architecture definition.
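
    As a toy illustration of generating requirement text from model content (hypothetical structure and names; not the Europa Clipper or DOORS tooling), a constraint captured as data in an architecture model can be rendered as an English "shall statement":

```python
# A constraint held in the architecture model is rendered as a "shall statement".
from dataclasses import dataclass

@dataclass
class Constraint:
    subject: str      # element the requirement applies to
    quantity: str     # constrained property
    relation: str     # e.g. "no more than", "at least"
    value: float
    unit: str

def to_shall_statement(c: Constraint) -> str:
    return (f"The {c.subject} shall have a {c.quantity} of "
            f"{c.relation} {c.value:g} {c.unit}.")

mass_budget = Constraint("flight system", "dry mass", "no more than", 2500, "kg")
print(to_shall_statement(mass_budget))
# -> The flight system shall have a dry mass of no more than 2500 kg.
```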

  20. Non-monotonic modelling from initial requirements: a proposal and comparison with monotonic modelling methods

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wupper, H.; Wieringa, Roelf J.

    2008-01-01

    Researchers make a significant effort to develop new modelling languages and tools. However, they spend less effort developing methods for constructing models using these languages and tools. We are developing a method for building an embedded system model for formal verification. Our method