WorldWideScience

Sample records for modeling approaches involving

  1. Service creation: a model-based approach

    NARCIS (Netherlands)

    Quartel, Dick; van Sinderen, Marten J.; Ferreira Pires, Luis

    1999-01-01

    This paper presents a model-based approach to support service creation. In this approach, services are assumed to be created from (available) software components. The creation process may involve multiple design steps in which the requested service is repeatedly decomposed into more detailed

  2. Neuropsychiatric Systemic Lupus Erythematosus Involvement: Towards a Tailored Approach to Our Patients?

    Science.gov (United States)

    Faria, Raquel; Gonçalves, João; Dias, Rita

    2017-01-30

    Neuropsychiatric involvement in systemic lupus erythematosus (NPSLE) is a complex condition that remains poorly understood, and includes heterogeneous manifestations involving both the central and peripheral nervous system, with disabling effects. There are several models to improve NPSLE diagnosis when a neurological syndrome is present. In the last couple of years, the growing knowledge of the role of cytokines and antibodies in NPSLE, as well as the development of new functional imaging techniques, has brought some insights into the physiopathology of the disease, but their validation for clinical use remains undetermined. Furthermore, besides the classic clinical approach, a new tool for screening the 19 NPSLE syndromes has also been developed. Regarding NPSLE therapeutics, there is still no evidence-based treatment approach, but some data support the safety of biological medication when classic treatment fails. Despite the tendency to reclassify SLE patients in clinical and immunological subsets, we hope that these data will inspire medical professionals to approach NPSLE in a manner more tailored to the individual patient.

  3. Evaluation of a Blog Based Parent Involvement Approach by Parents

    Science.gov (United States)

    Ozcinar, Zehra; Ekizoglu, Nihat

    2013-01-01

    Despite the well-known benefits of parent involvement in children's education, research clearly shows that it is difficult to effectively involve parents. This study aims to capture parents' views of a Blog Based Parent Involvement Approach (BPIA) designed to secure parent involvement in education by strengthening school-parent communication. Data…

  4. A hybrid agent-based approach for modeling microbiological systems.

    Science.gov (United States)

    Guo, Zaiyi; Sloot, Peter M A; Tay, Joc Cing

    2008-11-21

    Models for systems biology commonly adopt Differential Equations or Agent-Based modeling approaches for simulating the processes as a whole. Models based on differential equations presuppose phenomenological intracellular behavioral mechanisms, while models based on the Multi-Agent approach often use directly translated, and quantitatively less precise, if-then logical rule constructs. We propose an extendible systems model based on a hybrid agent-based approach where biological cells are modeled as individuals (agents) while molecules are represented by quantities. This hybridization in entity representation entails a combined modeling strategy with agent-based behavioral rules and differential equations, thereby balancing the requirements of extendible model granularity with computational tractability. We demonstrate the efficacy of this approach with models of chemotaxis involving an assay of 10^3 cells and 1.2×10^6 molecules. The model produces cell migration patterns that are comparable to laboratory observations.
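
    A minimal sketch of the hybridization idea described above, not the authors' actual chemotaxis model: cells are discrete agents following a simple if-then movement rule, while the chemoattractant is a continuous quantity per grid site updated by a rate equation. All grid sizes, rates, and variable names are illustrative assumptions.

    ```python
    import random

    # Hybrid sketch: discrete cell agents with rule-based movement on a 1-D grid,
    # plus a continuous attractant field updated by an Euler step of a rate equation.
    N_SITES = 50          # lattice sites
    N_CELLS = 100         # cell agents
    DECAY = 0.01          # first-order decay rate of the attractant
    CONSUMPTION = 0.005   # attractant consumed per cell per step
    DT = 1.0              # time step

    # Continuous quantity: attractant concentration, highest at the right end.
    attractant = [i / (N_SITES - 1) for i in range(N_SITES)]
    # Discrete agents: each cell is simply a position on the grid.
    cells = [random.randrange(N_SITES) for _ in range(N_CELLS)]

    def step():
        global attractant, cells
        # Agent rule: move one site towards the neighbouring site with more attractant.
        new_positions = []
        for pos in cells:
            left = attractant[pos - 1] if pos > 0 else -1.0
            right = attractant[pos + 1] if pos < N_SITES - 1 else -1.0
            if right > attractant[pos] and right >= left:
                pos += 1
            elif left > attractant[pos]:
                pos -= 1
            new_positions.append(pos)
        cells = new_positions
        # Quantity update: dA/dt = -DECAY*A - CONSUMPTION*(cells at site), explicit Euler.
        counts = [0] * N_SITES
        for pos in cells:
            counts[pos] += 1
        attractant = [max(0.0, a + DT * (-DECAY * a - CONSUMPTION * c))
                      for a, c in zip(attractant, counts)]

    for _ in range(200):
        step()
    print("mean cell position after 200 steps:", sum(cells) / len(cells))
    ```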

  5. Liver involvement in Gaucher disease - Review and clinical approach.

    Science.gov (United States)

    Adar, Tomer; Ilan, Yaron; Elstein, Deborah; Zimran, Ari

    2018-02-01

    Gaucher disease (GD), one of the most prevalent lysosomal storage diseases, is associated with glucocerebroside accumulation in cells of the monocyte-macrophage system in various organs, including the liver. Evaluating and managing liver disease in patients with Gaucher disease may be challenging. While hepatic involvement is common in Gaucher disease, its severity and clinical significance span a wide spectrum, ranging from sub-clinical involvement to liver cirrhosis with its associated complications, including portal hypertension. Apart from liver involvement in Gaucher disease, patients with GD may also suffer from other comorbidities involving the liver. That Gaucher disease itself can mimic hepatic lesions, affect laboratory tests used to characterize liver disease, and may be associated with non-cirrhotic portal hypertension complicates the diagnostic approach even more. Better understanding of liver involvement in Gaucher disease can spare patients unnecessary invasive testing, and assist physicians in decision making when evaluating patients with Gaucher disease suspected of significant liver disease. This review describes the various clinical manifestations and laboratory and imaging abnormalities that may be encountered when following patients with Gaucher disease for liver involvement. The mechanisms of liver disease are discussed, as well as the possible hepato-protective effect of glucocerebroside and the diagnostic and treatment approaches. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Model validation: a systemic and systematic approach

    International Nuclear Information System (INIS)

    Sheng, G.; Elzas, M.S.; Cronhjort, B.T.

    1993-01-01

    The term 'validation' is used ubiquitously in association with the modelling activities of numerous disciplines, including the social, political, natural, and physical sciences, and engineering. There is, however, a wide range of definitions which give rise to very different interpretations of what activities the process involves. Analyses of results from the present large international effort in modelling radioactive waste disposal systems illustrate the urgent need to develop a common approach to model validation. Some possible explanations are offered to account for the present state of affairs. The methodology developed treats model validation and code verification in a systematic fashion. In fact, this approach may be regarded as a comprehensive framework to assess the adequacy of any simulation study. (author)

  7. Visualization of a City Sustainability Index (CSI): Towards Transdisciplinary Approaches Involving Multiple Stakeholders

    Directory of Open Access Journals (Sweden)

    Koichiro Mori

    2015-09-01

    We have developed a visualized 3-D model of a City Sustainability Index (CSI) based on our original concept of city sustainability, in which a sustainable city is defined as one that maximizes socio-economic benefits while meeting constraint conditions of the environment and socio-economic equity on a permanent basis. The CSI is based on constraint and maximization indicators. Constraint indicators assess whether a city meets the necessary minimum conditions for city sustainability. Maximization indicators measure the benefits that a city generates in socio-economic aspects. When used in the policy-making process, the choice of constraint indicators should be implemented using a top-down approach. In contrast, a bottom-up approach is more suitable for defining maximization indicators because this technique involves multiple stakeholders (in a transdisciplinary approach). Using different materials of various colors, shapes, and sizes, we designed and constructed the visualized physical model of the CSI to help people evaluate and compare the performance of different cities in terms of sustainability. The visualized model of the CSI can convey complicated information in a simple and straightforward manner to diverse stakeholders so that the sustainability analysis can be understood intuitively by ordinary citizens as well as experts. Thus, the CSI model helps stakeholders to develop critical thinking about city sustainability and enables policymakers to make informed decisions for sustainability through a transdisciplinary approach.

  8. Towards a Semantic E-Learning Theory by Using a Modelling Approach

    Science.gov (United States)

    Yli-Luoma, Pertti V. J.; Naeve, Ambjorn

    2006-01-01

    In the present study, a semantic perspective on e-learning theory is advanced and a modelling approach is used. This modelling approach towards the new learning theory is based on the four SECI phases of knowledge conversion: Socialisation, Externalisation, Combination and Internalisation, introduced by Nonaka in 1994, and involving two levels of…

  9. Customer involvement in greening the supply chain: an interpretive structural modeling methodology

    Science.gov (United States)

    Kumar, Sanjay; Luthra, Sunil; Haleem, Abid

    2013-04-01

    The role of customers in green supply chain management needs to be identified and recognized as an important research area. This paper is an attempt to explore the involvement aspect of customers towards greening of the supply chain (SC). An empirical research approach has been used to collect primary data to rank different variables for effective customer involvement in green concept implementation in the SC. An interpretive structural-based model has been presented, and variables have been classified using matrice d'impacts croisés multiplication appliquée à un classement (MICMAC) analysis. Contextual relationships among variables have been established using experts' opinions. The research may help practicing managers to understand the interaction among variables affecting customer involvement. Further, this understanding may be helpful in framing the policies and strategies to green the SC. Analyzing the interaction among variables for effective customer involvement in greening the SC to develop the structural model in the Indian perspective is an effort towards promoting environmental consciousness.
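
    For context, the structural-model step of an interpretive structural modeling (ISM) exercise of the kind described above boils down to a transitive closure of an expert-supplied influence matrix followed by a level partition. The sketch below illustrates that mechanic on a hypothetical 4-variable matrix; it is not the paper's data or variable set.

    ```python
    # Minimal ISM-style sketch (assumed, not from the paper): experts supply a binary
    # "variable i influences variable j" matrix; the reachability matrix is its
    # transitive closure, from which the levels of the structural model are read off.
    influence = [  # hypothetical 4-variable adjacency matrix from expert opinion
        [1, 1, 0, 0],
        [0, 1, 1, 0],
        [0, 0, 1, 1],
        [0, 0, 0, 1],
    ]

    def reachability(m):
        n = len(m)
        r = [row[:] for row in m]
        for k in range(n):          # Warshall's algorithm: transitive closure
            for i in range(n):
                for j in range(n):
                    r[i][j] = r[i][j] or (r[i][k] and r[k][j])
        return r

    def partition_levels(r):
        """Peel off variables whose reachability set is contained in their
        antecedent set (the standard ISM level-partition criterion)."""
        n = len(r)
        remaining, levels = set(range(n)), []
        while remaining:
            level = []
            for i in remaining:
                reach = {j for j in remaining if r[i][j]}
                ante = {j for j in remaining if r[j][i]}
                if reach <= ante:
                    level.append(i)
            levels.append(level)
            remaining -= set(level)
        return levels

    r = reachability(influence)
    print("levels (top of hierarchy first):", partition_levels(r))
    ```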

  10. A Bayesian approach for quantification of model uncertainty

    International Nuclear Information System (INIS)

    Park, Inseok; Amarchinta, Hemanth K.; Grandhi, Ramana V.

    2010-01-01

    In most engineering problems, more than one model can be created to represent an engineering system's behavior. Uncertainty is inevitably involved in selecting the best model from among the models that are possible. Uncertainty in model selection cannot be ignored, especially when the differences between the predictions of competing models are significant. In this research, a methodology is proposed to quantify model uncertainty using measured differences between experimental data and model outcomes under a Bayesian statistical framework. The adjustment factor approach is used to propagate model uncertainty into prediction of a system response. A nonlinear vibration system is used to demonstrate the processes for implementing the adjustment factor approach. Finally, the methodology is applied to assess the engineering benefits of a laser peening process, and a confidence band for residual stresses is established to indicate the reliability of the model prediction.
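
    The sketch below illustrates the general adjustment-factor idea under simple assumptions (Gaussian measurement likelihood, equal model priors, an additive adjustment relative to the best model). The numbers and model names are made up and do not come from the nonlinear vibration or laser peening cases.

    ```python
    import math

    # Hedged sketch of a Bayesian adjustment-factor combination with illustrative
    # numbers. Posterior model probabilities come from Gaussian likelihoods of the
    # measured response under each model; the additive adjustment factor shifts the
    # best model's prediction towards the probability-weighted average of all models.
    measured = 10.0                    # hypothetical experimental observation
    sigma = 1.0                        # assumed measurement standard deviation
    model_predictions = {"model_A": 9.2, "model_B": 10.6, "model_C": 11.5}
    prior = {name: 1.0 / 3 for name in model_predictions}   # equal prior probabilities

    def gaussian_likelihood(y_obs, y_model, s):
        return math.exp(-0.5 * ((y_obs - y_model) / s) ** 2) / (s * math.sqrt(2 * math.pi))

    # Posterior model probabilities (Bayes' rule, normalised over the model set).
    unnorm = {m: prior[m] * gaussian_likelihood(measured, y, sigma)
              for m, y in model_predictions.items()}
    z = sum(unnorm.values())
    posterior = {m: w / z for m, w in unnorm.items()}

    # Probability-weighted prediction and the additive adjustment factor relative
    # to the highest-posterior ("best") model.
    weighted = sum(posterior[m] * y for m, y in model_predictions.items())
    best = max(posterior, key=posterior.get)
    adjustment = weighted - model_predictions[best]

    print("posterior:", {m: round(p, 3) for m, p in posterior.items()})
    print("best model:", best, " adjusted prediction:", round(weighted, 3),
          " additive adjustment factor:", round(adjustment, 3))
    ```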

  11. Integrated Transport Planning Framework Involving Combined Utility Regret Approach

    DEFF Research Database (Denmark)

    Wang, Yang; Monzon, Andres; Di Ciommo, Floridea

    2014-01-01

    Sustainable transport planning requires an integrated approach involving strategic planning, impact analysis, and multicriteria evaluation. This study aimed at relaxing the utility-based decision-making assumption by newly embedding anticipated-regret and combined utility regret decision mechanisms in a framework for integrated transport planning. The framework consisted of a two-round Delphi survey, an integrated land use and transport model for Madrid, and multicriteria analysis. Results show that (a) the regret-based ranking has a similar mean but larger variance than the utility-based ranking does, (b) the least-regret scenario forms a compromise between the desired and the expected scenarios, (c) the least-regret scenario can lead to higher user benefits in the short term and lower user benefits in the long term, and (d) the utility-based, the regret-based, and the combined utility- and regret...
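
    To make the utility-based versus regret-based contrast concrete, the sketch below ranks hypothetical planning alternatives first by expected utility and then by minimax anticipated regret. The utilities, scenarios, and probabilities are invented for illustration and are not the Madrid case-study data.

    ```python
    # Illustrative comparison of utility-based and regret-based rankings
    # (hypothetical payoffs, not the study's results).
    utilities = {                      # utility of each planning alternative per scenario
        "alt_1": {"s1": 60, "s2": 40, "s3": 55},
        "alt_2": {"s1": 50, "s2": 50, "s3": 50},
        "alt_3": {"s1": 70, "s2": 20, "s3": 65},
    }
    scenario_prob = {"s1": 0.4, "s2": 0.3, "s3": 0.3}

    # Expected-utility ranking (higher is better).
    expected = {a: sum(scenario_prob[s] * u for s, u in us.items())
                for a, us in utilities.items()}

    # Regret of an alternative in a scenario = best utility in that scenario minus its own.
    best_per_scenario = {s: max(us[s] for us in utilities.values()) for s in scenario_prob}
    max_regret = {a: max(best_per_scenario[s] - us[s] for s in scenario_prob)
                  for a, us in utilities.items()}

    print("utility ranking :", sorted(expected, key=expected.get, reverse=True))
    print("regret ranking  :", sorted(max_regret, key=max_regret.get))   # lower regret first
    ```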

  12. Banking Crisis Early Warning Model based on a Bayesian Model Averaging Approach

    Directory of Open Access Journals (Sweden)

    Taha Zaghdoudi

    2016-08-01

    The succession of banking crises, most of which have resulted in huge economic and financial losses, has prompted several authors to study their determinants. These authors constructed early warning models to prevent their occurrence. It is in this vein that our study takes its inspiration. In particular, we have developed a warning model of banking crises based on a Bayesian approach. The results of this approach have allowed us to identify the involvement of the decline in bank profitability, the deterioration of the competitiveness of traditional intermediation, banking concentration, and higher real interest rates in triggering banking crises.
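
    A common way to implement Bayesian model averaging for such an early-warning exercise is to fit logit models on all predictor subsets, weight them by BIC-approximated posterior probabilities, and report posterior inclusion probabilities. The sketch below does this on synthetic data with placeholder indicator names; it is not the paper's dataset or exact specification.

    ```python
    import itertools
    import numpy as np
    import statsmodels.api as sm

    # BIC-based Bayesian model averaging sketch on synthetic crisis data.
    rng = np.random.default_rng(0)
    n = 400
    X = rng.normal(size=(n, 3))                    # placeholder indicators
    true_logit = 1.2 * X[:, 0] - 0.8 * X[:, 2] - 0.5
    y = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))
    names = ["profitability_decline", "concentration", "real_rate"]

    bics, models = [], []
    for k in range(1, len(names) + 1):
        for subset in itertools.combinations(range(len(names)), k):
            Xs = sm.add_constant(X[:, list(subset)])
            res = sm.Logit(y, Xs).fit(disp=0)
            models.append(subset)
            bics.append(-2 * res.llf + Xs.shape[1] * np.log(n))

    bics = np.array(bics)
    weights = np.exp(-0.5 * (bics - bics.min()))   # BIC-approximated posterior weights
    weights /= weights.sum()

    # Posterior inclusion probability: total weight of models containing each indicator.
    for j, name in enumerate(names):
        pip = sum(w for w, s in zip(weights, models) if j in s)
        print(f"posterior inclusion probability of {name}: {pip:.2f}")
    ```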

  13. A survey on computational intelligence approaches for predictive modeling in prostate cancer

    OpenAIRE

    Cosma, G; Brown, D; Archer, M; Khan, M; Pockley, AG

    2017-01-01

    Predictive modeling in medicine involves the development of computational models which are capable of analysing large amounts of data in order to predict healthcare outcomes for individual patients. Computational intelligence approaches are suitable when the data to be modelled are too complex for conventional statistical techniques to process quickly and efficiently. These advanced approaches are based on mathematical models that have been especially developed for dealing with the uncertainty an...

  14. A dual model approach to ground water recovery trench design

    International Nuclear Information System (INIS)

    Clodfelter, C.L.; Crouch, M.S.

    1992-01-01

    The design of trenches for contaminated ground water recovery must consider several variables. This paper presents a dual-model approach for effectively recovering contaminated ground water migrating toward a trench by advection. The approach involves an analytical model to determine the vertical influence of the trench and a numerical flow model to determine the capture zone within the trench and the surrounding aquifer. The analytical model is utilized by varying trench dimensions and head values to design a trench which meets the remediation criteria. The numerical flow model is utilized to select the type of backfill and location of sumps within the trench. The dual-model approach can be used to design a recovery trench which effectively captures advective migration of contaminants in the vertical and horizontal planes

  15. An Integrated Approach for the Numerical Modelling of the Spray Forming Process

    DEFF Research Database (Denmark)

    Hattel, Jesper; Thorborg, Jesper; Pryds, Nini

    2003-01-01

    In this paper, an integrated approach for modelling the entire spray forming process is presented. The basis for the analysis is a recently developed model which extends previous studies and includes the interaction between an array of droplets and the enveloping gas. The formulation of the deposition ... is in fact the summation of "local" droplet size distributions along the r-axis. Furthermore, the deposition model proposed in the paper involves both the sticking efficiency of the droplets to the substrate as well as a geometrical model involving the effects of shadowing for the production of billet...

  16. A Blended Learning Approach to Teaching Project Management: A Model for Active Participation and Involvement: Insights from Norway

    Directory of Open Access Journals (Sweden)

    Bassam A. Hussein

    2015-04-01

    The paper demonstrates and evaluates the effectiveness of a blended learning approach to create a meaningful learning environment. We use the term blended learning approach in this paper to refer to the use of multiple or hybrid instructional methods that emphasize the role of learners as contributors to the learning process rather than recipients of learning. Contribution to learning is attained by using in-class gaming as a pathway that ensures active involvement of learners. Using a blended learning approach is important in order to be able to address different learning styles of the target group. The approach was also important in order to be able to demonstrate different types of challenges, issues and competences needed in project management. Student evaluations of the course confirmed that the use of multiple learning methods and, in particular, in-class gaming was beneficial and contributed to a meaningful learning experience.

  17. Model correction factor method for reliability problems involving integrals of non-Gaussian random fields

    DEFF Research Database (Denmark)

    Franchin, P.; Ditlevsen, Ove Dalager; Kiureghian, Armen Der

    2002-01-01

    The model correction factor method (MCFM) is used in conjunction with the first-order reliability method (FORM) to solve structural reliability problems involving integrals of non-Gaussian random fields. The approach replaces the limit-state function with an idealized one, in which the integrals ...

  18. Multirule Based Diagnostic Approach for the Fog Predictions Using WRF Modelling Tool

    Directory of Open Access Journals (Sweden)

    Swagata Payra

    2014-01-01

    The prediction of fog onset remains difficult despite the progress in numerical weather prediction. It is a complex process and requires adequate representation of the local perturbations in weather prediction models. It mainly depends upon microphysical and mesoscale processes that act within the boundary layer. This study utilizes a multirule based diagnostic (MRD) approach using postprocessing of the model simulations for fog predictions. The empiricism involved in this approach is mainly to bridge the gap between mesoscale and microscale variables, which are related to the mechanism of fog formation. Fog occurrence is a common phenomenon during the winter season over Delhi, India, with the passage of western disturbances across the northwestern part of the country accompanied by a significant amount of moisture. This study implements the above cited approach for the prediction of occurrences of fog and its onset time over Delhi. For this purpose, a high resolution weather research and forecasting (WRF) model is used for fog simulations. The study involves depiction of model validation and postprocessing of the model simulations for the MRD approach and its subsequent application to fog predictions. Through this approach the model identified foggy and nonfoggy days successfully 94% of the time. Further, the onset of fog events is well captured within an accuracy of 30–90 minutes. This study demonstrates that the multirule based postprocessing approach is a useful and highly promising tool in improving the fog predictions.
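
    The sketch below shows the general shape of a multirule postprocessing diagnostic: threshold rules applied to post-processed near-surface model variables, all of which must fire before fog is flagged, plus a scan for onset time. The variables and thresholds are generic placeholders, not the calibrated rules of the cited WRF study.

    ```python
    from dataclasses import dataclass

    # Illustrative multirule fog diagnostic applied to post-processed model output.
    @dataclass
    class HourlyOutput:
        hour: int
        rh2m: float          # 2 m relative humidity (%)
        wind10m: float       # 10 m wind speed (m/s)
        t_minus_td: float    # temperature minus dew point (K)

    def fog_flag(rec: HourlyOutput) -> bool:
        rules = [
            rec.rh2m >= 95.0,        # near-saturated surface layer
            rec.wind10m <= 3.0,      # light winds, weak mixing
            rec.t_minus_td <= 1.0,   # small dew-point depression
        ]
        return all(rules)            # fog diagnosed only when every rule fires

    def onset_hour(series):
        """Return the first forecast hour at which the multirule diagnostic flags fog."""
        for rec in series:
            if fog_flag(rec):
                return rec.hour
        return None

    forecast = [HourlyOutput(18, 82, 4.1, 3.2),
                HourlyOutput(21, 93, 2.6, 1.4),
                HourlyOutput(23, 97, 1.8, 0.6)]
    print("diagnosed fog onset at hour:", onset_hour(forecast))
    ```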

  19. A modelling approach for improved implementation of information technology in manufacturing systems

    DEFF Research Database (Denmark)

    Larsen, Michael Holm; Langer, Gilad; Kirkby, Lars Phillip

    2000-01-01

    The paper presents a modelling approach, which is based on the multiple view perspective of Soft Systems Methodology and an encapsulation of these perspectives into an object-orientated model. The approach provides a structured procedure for putting theoretical abstractions of a new production concept into practice. The paper demonstrates the use of the approach in a practical case, which involves modelling of the shop floor activities and control system at the aluminium parts production at a Danish manufacturer of state-of-the-art audio-video equipment and telephones.

  20. Application of declarative modeling approaches for external events

    International Nuclear Information System (INIS)

    Anoba, R.C.

    2005-01-01

    Probabilistic Safety Assessments (PSAs) are increasingly being used as a tool for supporting the acceptability of design, procurement, construction, operation, and maintenance activities at Nuclear Power Plants. Since the issuance of Generic Letter 88-20 and subsequent IPE/IPEEE assessments, the NRC has issued several Regulatory Guides such as RG 1.174 to describe the use of PSA in risk-informed regulation activities. Most PSAs have the capability to address internal events including internal floods. As more demands are placed on using the PSA to support risk-informed applications, there has been a growing need to integrate other external events (Seismic, Fire, etc.) into the logic models. Most external events involve spatial dependencies and usually impact the logic models at the component level. Therefore, manual insertion of external event impacts into a complex integrated fault tree model may be too cumbersome for routine uses of the PSA. Within the past year, a declarative modeling approach has been developed to automate the injection of external events into the PSA. The intent of this paper is to introduce the concept of declarative modeling in the context of external event applications. A declarative modeling approach involves the definition of rules for injection of external event impacts into the fault tree logic. A software tool such as EPRI's XInit program can be used to interpret the pre-defined rules and automatically inject external event elements into the PSA. The injection process can easily be repeated, as required, to address plant changes, sensitivity issues, changes in boundary conditions, etc. External event elements may include fire initiating events, seismic initiating events, seismic fragilities, fire-induced hot short events, special human failure events, etc. This approach has been applied at a number of US nuclear power plants including a nuclear power plant in Romania. (authors)
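
    A toy illustration of the declarative idea, under assumed names (this is not the EPRI XInit tool or its rule syntax): each rule declares which basic events a hazard fails, and a small interpreter rewrites the affected basic events into OR gates so the external event propagates through the existing fault-tree logic without manual editing.

    ```python
    # Minimal sketch of declarative injection of external-event impacts into
    # fault-tree logic. Gate and hazard names are illustrative placeholders.
    fault_tree = {
        "TOP":           ("OR",    ["PUMP_A_FAILS", "VALVE_B_FAILS"]),
        "PUMP_A_FAILS":  ("BASIC", []),
        "VALVE_B_FAILS": ("BASIC", []),
    }

    rules = [  # declarative rules: hazard initiating event -> impacted basic events
        {"hazard": "FIRE_ZONE_3",  "affects": ["PUMP_A_FAILS"]},
        {"hazard": "SEISMIC_FRAG", "affects": ["PUMP_A_FAILS", "VALVE_B_FAILS"]},
    ]

    def inject(tree, rules):
        tree = dict(tree)
        for rule in rules:
            tree.setdefault(rule["hazard"], ("BASIC", []))
            for event in rule["affects"]:
                kind, inputs = tree[event]
                if kind == "BASIC":
                    wrapped = event + "_RANDOM"            # keep the original random failure
                    tree[wrapped] = ("BASIC", [])
                    tree[event] = ("OR", [wrapped, rule["hazard"]])
                else:                                      # already an OR gate: append the hazard
                    tree[event] = (kind, inputs + [rule["hazard"]])
        return tree

    for gate, (kind, inputs) in inject(fault_tree, rules).items():
        print(f"{gate:15s} {kind:5s} {inputs}")
    ```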

  1. Eutrophication Modeling Using Variable Chlorophyll Approach

    International Nuclear Information System (INIS)

    Abdolabadi, H.; Sarang, A.; Ardestani, M.; Mahjoobi, E.

    2016-01-01

    In this study, eutrophication was investigated in Lake Ontario to identify the interactions among effective drivers. The complexity of this phenomenon was modeled using a system dynamics approach based on a consideration of constant and variable stoichiometric ratios. The system dynamics approach is a powerful tool for developing object-oriented models to simulate complex phenomena that involve feedback effects. Utilizing stoichiometric ratios is a method for converting the concentrations of state variables. During the physical segmentation of the model, Lake Ontario was divided into two layers, i.e., the epilimnion and hypolimnion, and differential equations were developed for each layer. The model structure included 16 state variables related to phytoplankton, herbivorous zooplankton, carnivorous zooplankton, ammonium, nitrate, dissolved phosphorus, and particulate and dissolved carbon in the epilimnion and hypolimnion over a time horizon of one year. Several verification tests, a Nash-Sutcliffe coefficient close to 1 (0.98), a data correlation coefficient of 0.98, and low standard errors (0.96) indicated that the model performs well. The results revealed that there were significant differences in the concentrations of the state variables in constant and variable stoichiometry simulations. Consequently, the consideration of variable stoichiometric ratios in algae and nutrient concentration simulations may be applied in future modeling studies to enhance the accuracy of the results and reduce the likelihood of inefficient control policies.
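
    To illustrate the structure of such a two-layer model, the sketch below integrates a drastically reduced epilimnion/hypolimnion nutrient-phytoplankton balance with a variable chlorophyll-to-carbon ratio. All rate constants and the stoichiometric ratio are illustrative assumptions, and the state vector is far smaller than the 16-variable model described in the abstract.

    ```python
    # Minimal two-layer phytoplankton/phosphorus sketch with variable stoichiometry.
    DT = 0.1            # time step (days)
    GROWTH = 0.5        # maximum phytoplankton growth rate (1/day)
    HALF_SAT = 5.0      # half-saturation constant for dissolved P (ug/L)
    SETTLING = 0.05     # settling of phytoplankton from epilimnion to hypolimnion (1/day)
    MINERAL = 0.03      # remineralisation of settled biomass in the hypolimnion (1/day)
    P_PER_C = 1.0       # ug P consumed/released per mg C (illustrative stoichiometric ratio)

    state = {"P_epi": 20.0, "P_hypo": 30.0,    # dissolved phosphorus (ug/L)
             "A_epi": 2.0,  "A_hypo": 0.5}     # phytoplankton biomass (mg C/L)

    def chl_to_c(p):
        """Variable stoichiometry: chlorophyll per unit carbon rises with nutrient status."""
        return 0.01 + 0.03 * p / (HALF_SAT + p)

    def step(s):
        growth = GROWTH * s["P_epi"] / (HALF_SAT + s["P_epi"]) * s["A_epi"]   # mg C/L/day
        settle = SETTLING * s["A_epi"]
        remin = MINERAL * s["A_hypo"]
        return {
            "P_epi":  max(0.0, s["P_epi"] + DT * (-P_PER_C * growth)),
            "A_epi":  s["A_epi"] + DT * (growth - settle),
            "A_hypo": s["A_hypo"] + DT * (settle - remin),
            "P_hypo": s["P_hypo"] + DT * (P_PER_C * remin),
        }

    for _ in range(3650):   # one year of dynamics at DT = 0.1 day
        state = step(state)
    print("epilimnion chlorophyll (mg/L):", round(state["A_epi"] * chl_to_c(state["P_epi"]), 4))
    ```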

  2. Setting conservation management thresholds using a novel participatory modeling approach.

    Science.gov (United States)

    Addison, P F E; de Bie, K; Rumpff, L

    2015-10-01

    We devised a participatory modeling approach for setting management thresholds that show when management intervention is required to address undesirable ecosystem changes. This approach was designed to be used when management thresholds: must be set for environmental indicators in the face of multiple competing objectives; need to incorporate scientific understanding and value judgments; and will be set by participants with limited modeling experience. We applied our approach to a case study where management thresholds were set for a mat-forming brown alga, Hormosira banksii, in a protected area management context. Participants, including management staff and scientists, were involved in a workshop to test the approach, and set management thresholds to address the threat of trampling by visitors to an intertidal rocky reef. The approach involved trading off the environmental objective, to maintain the condition of intertidal reef communities, with social and economic objectives to ensure management intervention was cost-effective. Ecological scenarios, developed using scenario planning, were a key feature that provided the foundation for where to set management thresholds. The scenarios developed represented declines in percent cover of H. banksii that may occur under increased threatening processes. Participants defined 4 discrete management alternatives to address the threat of trampling and estimated the effect of these alternatives on the objectives under each ecological scenario. A weighted additive model was used to aggregate participants' consequence estimates. Model outputs (decision scores) clearly expressed uncertainty, which can be considered by decision makers and used to inform where to set management thresholds. This approach encourages a proactive form of conservation, where management thresholds and associated actions are defined a priori for ecological indicators, rather than reacting to unexpected ecosystem changes in the future.
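
    The weighted additive aggregation mentioned above can be illustrated as follows. The objectives, weights, and normalised consequence scores are hypothetical placeholders, not the workshop elicitation results.

    ```python
    # Hypothetical weighted additive aggregation of consequence estimates for one
    # ecological scenario (e.g. a decline in H. banksii cover).
    weights = {"ecological_condition": 0.5, "management_cost": 0.3, "visitor_access": 0.2}

    # Normalised consequence scores (0 = worst, 1 = best) per management alternative.
    consequences = {
        "no_action":         {"ecological_condition": 0.2, "management_cost": 1.0, "visitor_access": 1.0},
        "education_signage": {"ecological_condition": 0.5, "management_cost": 0.8, "visitor_access": 0.9},
        "partial_closure":   {"ecological_condition": 0.8, "management_cost": 0.5, "visitor_access": 0.4},
        "full_closure":      {"ecological_condition": 1.0, "management_cost": 0.3, "visitor_access": 0.0},
    }

    def decision_score(scores, weights):
        # Weighted additive (linear value) model: sum of weight * score over objectives.
        return sum(weights[obj] * scores[obj] for obj in weights)

    ranked = sorted(consequences, key=lambda a: decision_score(consequences[a], weights), reverse=True)
    for alt in ranked:
        print(f"{alt:18s} score = {decision_score(consequences[alt], weights):.2f}")
    ```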

  3. Approaches to child protection case management for cases involving people with disabilities.

    Science.gov (United States)

    Lightfoot, Elizabeth B; LaLiberte, Traci L

    2006-04-01

    This exploratory study examines the delivery of child protection services by county child protection agencies involving cases with a family member with a disability. Telephone surveys were conducted with the directors or their designees of 89% of the child protection agencies in a Midwestern state. Respondents were asked about the policies and/or procedures for approaching cases involving a person with a disability and the barriers and strengths agencies have in serving people with disabilities. Only 6.7% of respondents reported their agency had a written policy related to serving persons with a disability. There were 18 different approaches to serving clients with a disability within child protection, with the most common being informally teaming for information, dual case assignment, and teaming with an outside consultant. Five counties had specialty workers who were experts in both child protection and disability. Barriers reported varied between rural and non-rural counties, with the most important barriers being lack of resources, lack of knowledge regarding disabilities, systems conflicts, and rural issues, such as lack of providers and lack of transportation. Strengths included accessing and coordinating services, individualizing services, good collaboration and creativity. While few county agencies had any written policies, both formal and informal collaboration is happening at the individual level. The lack of standardization in providing services indicates a need for more attention to issues regarding disability within child protection, including more training for workers, the development of models of collaborative case management and the removal of systemic barriers.

  4. Modelling the fathering role: Experience in the family of origin and father involvement

    Directory of Open Access Journals (Sweden)

    Mihić Ivana

    2012-01-01

    The study presented in this paper deals with the effects of experiences with father in the family of origin on the fathering role in the family of procreation. The results of the studies so far point to great importance of such experiences in parental role modelling, while recent approaches have suggested the concept of introjected notion or an internal working model of the fathering role as the way to operationalise the transgenerational transfer. The study included 247 two-parent couple families whose oldest child attended preschool education. Fathers provided information on self-assessed involvement via the Inventory of father involvement, while both fathers and mothers gave information on introjected experiences from the family of origin via the inventory Presence of the father in the family of origin. It was shown that father’s experiences from the family of origin had significant direct effects on his involvement in child-care. Very important experiences were those of negative emotional exchange, physical closeness and availability of the father, as well as beliefs about the importance of the father as a parent. Although maternal experiences from the family of origin did not contribute significantly to father involvement, shared beliefs about father’s importance as a parent in the parenting alliance had an effect on greater involvement in child-care. The data provide confirmation of the hypotheses on modelling of the fathering role, but also open the issue of the factor of intergenerational maintenance of traditional forms of father involvement in families in Serbia.

  5. Modeling healthcare authorization and claim submissions using the openEHR dual-model approach

    Science.gov (United States)

    2011-01-01

    Background The TISS standard is a set of mandatory forms and electronic messages for healthcare authorization and claim submissions among healthcare plans and providers in Brazil. It is not based on formal models as the new generation of health informatics standards suggests. The objective of this paper is to model the TISS in terms of the openEHR archetype-based approach and integrate it into a patient-centered EHR architecture. Methods Three approaches were adopted to model TISS. In the first approach, a set of archetypes was designed using ENTRY subclasses. In the second one, a set of archetypes was designed using exclusively ADMIN_ENTRY and CLUSTERs as their root classes. In the third approach, the openEHR ADMIN_ENTRY is extended with classes designed for authorization and claim submissions, and an ISM_TRANSITION attribute is added to the COMPOSITION class. Another set of archetypes was designed based on this model. For all three approaches, templates were designed to represent the TISS forms. Results The archetypes based on the openEHR RM (Reference Model) can represent all TISS data structures. The extended model adds subclasses and an attribute to the COMPOSITION class to represent information on authorization and claim submissions. The archetypes based on all three approaches have similar structures, although rooted in different classes. The extended openEHR RM model is more semantically aligned with the concepts involved in a claim submission, but may disrupt interoperability with other systems and the current tools must be adapted to deal with it. Conclusions Modeling the TISS standard by means of the openEHR approach makes it aligned with ISO recommendations and provides a solid foundation on which the TISS can evolve. Although there are few administrative archetypes available, the openEHR RM is expressive enough to represent the TISS standard. This paper focuses on the TISS but its results may be extended to other billing processes. A complete

  6. Assessing the economic impact of paternal involvement: a comparison of the generalized linear model versus decision analysis trees.

    Science.gov (United States)

    Salihu, Hamisu M; Salemi, Jason L; Nash, Michelle C; Chandler, Kristen; Mbah, Alfred K; Alio, Amina P

    2014-08-01

    Lack of paternal involvement has been shown to be associated with adverse pregnancy outcomes, including infant morbidity and mortality, but the impact on health care costs is unknown. Various methodological approaches have been used in cost minimization and cost effectiveness analyses and it remains unclear how cost estimates vary according to the analytic strategy adopted. We illustrate a methodological comparison of decision analysis modeling and generalized linear modeling (GLM) techniques using a case study that assesses the cost-effectiveness of potential father involvement interventions. We conducted a 12-year retrospective cohort study using a statewide enhanced maternal-infant database that contains both clinical and nonclinical information. A missing name for the father on the infant's birth certificate was used as a proxy for lack of paternal involvement, the main exposure of this study. Using decision analysis modeling and GLM, we compared all infant inpatient hospitalization costs over the first year of life. Costs were calculated from hospital charges using department-level cost-to-charge ratios and were adjusted for inflation. In our cohort of 2,243,891 infants, 9.2% had a father uninvolved during pregnancy. Lack of paternal involvement was associated with higher rates of preterm birth, small-for-gestational age, and infant morbidity and mortality. Both analytic approaches estimate significantly higher per-infant costs for father uninvolved pregnancies (decision analysis model: $1,827, GLM: $1,139). This paper provides sufficient evidence that healthcare costs could be significantly reduced through enhanced father involvement during pregnancy, and buttresses the call for a national program to involve fathers in antenatal care.

  7. Modeling flow in fractured medium. Uncertainty analysis with stochastic continuum approach

    International Nuclear Information System (INIS)

    Niemi, A.

    1994-01-01

    For modeling groundwater flow in formation-scale fractured media, no general method exists for scaling the highly heterogeneous hydraulic conductivity data to model parameters. The deterministic approach is limited in representing the heterogeneity of a medium and the application of fracture network models has both conceptual and practical limitations as far as site-scale studies are concerned. The study investigates the applicability of stochastic continuum modeling at the scale of data support. No scaling of the field data is involved, and the original variability is preserved throughout the modeling. Contributions of various aspects to the total uncertainty in the modeling prediction can also be determined with this approach. Data from five crystalline rock sites in Finland are analyzed. (107 refs., 63 figs., 7 tabs.)

  8. Impact of resilience and job involvement on turnover intention of new graduate nurses using structural equation modeling.

    Science.gov (United States)

    Yu, Mi; Lee, Haeyoung

    2018-03-06

    Nurses' turnover intention is not just a result of their maladjustment to the field; it is an organizational issue. This study aimed to construct a structural model to verify the effects of new graduate nurses' work environment satisfaction, emotional labor, and burnout on their turnover intention, with consideration of resilience and job involvement, and to test the adequacy of the developed model. A cross-sectional study and a structural equation modelling approach were used. A nationwide survey was conducted of 371 new nurses who were working in hospitals for ≤18 months between July and October, 2014. The final model accounted for 40% of the variance in turnover intention. Emotional labor and burnout had a significant positive direct effect and an indirect effect on nurses' turnover intention. Resilience had a positive direct effect on job involvement. Job involvement had a negative direct effect on turnover intention. Resilience and job involvement mediated the effect of work environment satisfaction, emotional labor, and burnout on turnover intention. It is important to strengthen new graduate nurses' resilience in order to increase their job involvement and to reduce their turnover intention. © 2018 Japan Academy of Nursing Science.

  9. Modeling healthcare authorization and claim submissions using the openEHR dual-model approach

    Directory of Open Access Journals (Sweden)

    Freire Sergio M

    2011-10-01

    Background The TISS standard is a set of mandatory forms and electronic messages for healthcare authorization and claim submissions among healthcare plans and providers in Brazil. It is not based on formal models as the new generation of health informatics standards suggests. The objective of this paper is to model the TISS in terms of the openEHR archetype-based approach and integrate it into a patient-centered EHR architecture. Methods Three approaches were adopted to model TISS. In the first approach, a set of archetypes was designed using ENTRY subclasses. In the second one, a set of archetypes was designed using exclusively ADMIN_ENTRY and CLUSTERs as their root classes. In the third approach, the openEHR ADMIN_ENTRY is extended with classes designed for authorization and claim submissions, and an ISM_TRANSITION attribute is added to the COMPOSITION class. Another set of archetypes was designed based on this model. For all three approaches, templates were designed to represent the TISS forms. Results The archetypes based on the openEHR RM (Reference Model) can represent all TISS data structures. The extended model adds subclasses and an attribute to the COMPOSITION class to represent information on authorization and claim submissions. The archetypes based on all three approaches have similar structures, although rooted in different classes. The extended openEHR RM model is more semantically aligned with the concepts involved in a claim submission, but may disrupt interoperability with other systems and the current tools must be adapted to deal with it. Conclusions Modeling the TISS standard by means of the openEHR approach makes it aligned with ISO recommendations and provides a solid foundation on which the TISS can evolve. Although there are few administrative archetypes available, the openEHR RM is expressive enough to represent the TISS standard. This paper focuses on the TISS but its results may be extended to other billing

  10. Bullying Prevention and the Parent Involvement Model

    Science.gov (United States)

    Kolbert, Jered B.; Schultz, Danielle; Crothers, Laura M.

    2014-01-01

    A recent meta-analysis of bullying prevention programs provides support for social-ecological theory, in which parent involvement addressing child bullying behaviors is seen as important in preventing school-based bullying. The purpose of this manuscript is to suggest how Epstein and colleagues' parent involvement model can be used as a…

  11. An Effect of the Environmental Pollution via Mathematical Model Involving the Mittag-Leffler Function

    Directory of Open Access Journals (Sweden)

    Anjali Goswami

    2017-08-01

    Under existing conditions, estimation of the effect of pollution on the environment is a big challenge for all of us. In this study we develop a new approach to estimate the effect of pollution on the environment via a mathematical model which involves the generalized Mittag-Leffler function of one variable $E_{\alpha_{2},\delta_{1};\alpha_{3},\delta_{2}}^{\gamma_{1},\alpha_{1}}(z)$, which we introduce here.
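
    For background (not taken from the paper), the classical one- and two-parameter Mittag-Leffler functions, of which the multi-parameter form above is a further generalization whose exact definition is given in the paper, are

    $$E_{\alpha}(z)=\sum_{k=0}^{\infty}\frac{z^{k}}{\Gamma(\alpha k+1)},\qquad E_{\alpha,\beta}(z)=\sum_{k=0}^{\infty}\frac{z^{k}}{\Gamma(\alpha k+\beta)},\qquad \alpha,\beta>0.$$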

  12. Modeling interdisciplinary activities involving Mathematics

    DEFF Research Database (Denmark)

    Iversen, Steffen Møllegaard

    2006-01-01

    In this paper a didactical model is presented. The goal of the model is to work as a didactical tool, or conceptual frame, for developing, carrying through and evaluating interdisciplinary activities involving the subject of mathematics and philosophy in the high schools. Through the terms of Horizontal Intertwining, Vertical Structuring and Horizontal Propagation, the model consists of three phases, each considering different aspects of the nature of interdisciplinary activities. The theoretical modelling is inspired by work which focuses on the students' abilities for concept formation in expanded domains (Michelsen, 2001, 2005a, 2005b). Furthermore, the theoretical description rests on a series of qualitative interviews with teachers from the Danish high school (grades 9-11) conducted recently. The special case of concrete interdisciplinary activities between mathematics and philosophy is also...

  13. Nuclear physics for applications. A model approach

    International Nuclear Information System (INIS)

    Prussin, S.G.

    2007-01-01

    Written by a researcher and teacher with experience at top institutes in the US and Europe, this textbook provides advanced undergraduates minoring in physics with working knowledge of the principles of nuclear physics. Simplifying models and approaches reveal the essence of the principles involved, with the mathematical and quantum mechanical background integrated in the text where it is needed and not relegated to the appendices. The practicality of the book is enhanced by numerous end-of-chapter problems and solutions available on the Wiley homepage. (orig.)

  14. Learning models of activities involving interacting objects

    DEFF Research Database (Denmark)

    Manfredotti, Cristina; Pedersen, Kim Steenstrup; Hamilton, Howard J.

    2013-01-01

    We propose the LEMAIO multi-layer framework, which makes use of hierarchical abstraction to learn models for activities involving multiple interacting objects from time sequences of data concerning the individual objects. Experiments in the sea navigation domain yielded learned models that were t...

  15. A Hybrid Artificial Reputation Model Involving Interaction Trust, Witness Information and the Trust Model to Calculate the Trust Value of Service Providers

    Directory of Open Access Journals (Sweden)

    Gurdeep Singh Ransi

    2014-02-01

    Agent interaction in a community, such as the online buyer-seller scenario, is often uncertain: when an agent comes in contact with other agents, they initially know nothing about each other. Currently, many reputation models have been developed that help service consumers select better service providers. Reputation models also help agents to make a decision on whom they should trust and transact with in the future. These reputation models are either built on interaction trust, which involves direct experience as a source of information, or they are built upon witness information, also known as word-of-mouth, which involves the reports provided by others. Neither the interaction trust nor the witness information models alone succeed in such uncertain interactions. In this paper we propose a hybrid reputation model involving both interaction trust and witness information to address the shortcomings of existing reputation models when taken separately. A sample simulation is built to set up buyer-seller services and uncertain interactions. Experiments reveal that the hybrid approach leads to better selection of trustworthy agents, where consumers select more reputable service providers, eventually helping consumers obtain more gains. Furthermore, the trust model developed is used in calculating trust values of service providers.
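
    One simple way to blend the two information sources is shown below: direct interaction trust and witness reputation are combined, with the weight on direct experience growing as an agent accumulates its own transactions. The blending rule and constants are illustrative assumptions, not the paper's exact formula.

    ```python
    # Hedged sketch of a hybrid trust estimate combining interaction trust and
    # witness (word-of-mouth) information.
    def interaction_trust(outcomes):
        """Fraction of the agent's own transactions with this provider that succeeded."""
        return sum(outcomes) / len(outcomes) if outcomes else 0.5   # 0.5 = ignorance prior

    def witness_reputation(reports):
        """Average rating reported by other consumers (each in [0, 1])."""
        return sum(reports) / len(reports) if reports else 0.5

    def hybrid_trust(outcomes, reports, confidence_scale=10):
        # Reliance on direct experience rises with the number of own interactions.
        w_direct = len(outcomes) / (len(outcomes) + confidence_scale)
        return w_direct * interaction_trust(outcomes) + (1 - w_direct) * witness_reputation(reports)

    own_history = [1, 1, 0, 1]                   # 1 = satisfactory transaction
    others_reports = [0.9, 0.4, 0.8, 0.7, 0.6]   # ratings from witness agents
    print("hybrid trust value:", round(hybrid_trust(own_history, others_reports), 3))
    ```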

  16. Agent-based modeling: a new approach for theory building in social psychology.

    Science.gov (United States)

    Smith, Eliot R; Conrey, Frederica R

    2007-02-01

    Most social and psychological phenomena occur not as the result of isolated decisions by individuals but rather as the result of repeated interactions between multiple individuals over time. Yet the theory-building and modeling techniques most commonly used in social psychology are less than ideal for understanding such dynamic and interactive processes. This article describes an alternative approach to theory building, agent-based modeling (ABM), which involves simulation of large numbers of autonomous agents that interact with each other and with a simulated environment and the observation of emergent patterns from their interactions. The authors believe that the ABM approach is better able than prevailing approaches in the field, variable-based modeling (VBM) techniques such as causal modeling, to capture types of complex, dynamic, interactive processes so important in the social world. The article elaborates several important contrasts between ABM and VBM and offers specific recommendations for learning more and applying the ABM approach.

  17. Compilation of information on uncertainties involved in deposition modeling

    International Nuclear Information System (INIS)

    Lewellen, W.S.; Varma, A.K.; Sheng, Y.P.

    1985-04-01

    The current generation of dispersion models contains very simple parameterizations of deposition processes. The analysis here looks at the physical mechanisms governing these processes in an attempt to see if more valid parameterizations are available and what level of uncertainty is involved in either these simple parameterizations or any more advanced parameterization. The report is composed of three parts. The first, on dry deposition model sensitivity, provides an estimate of the uncertainty existing in current estimates of the deposition velocity due to uncertainties in independent variables such as meteorological stability, particle size, surface chemical reactivity and canopy structure. The range of uncertainty estimated for an appropriate dry deposition velocity for a plume generated by a nuclear power plant accident is three orders of magnitude. The second part discusses the uncertainties involved in precipitation scavenging rates for effluents resulting from a nuclear reactor accident. The conclusion is that major uncertainties are involved both as a result of the natural variability of the atmospheric precipitation process and due to our incomplete understanding of the underlying process. The third part involves a review of the important problems associated with modeling the interaction between the atmosphere and a forest. It gives an indication of the magnitude of the problem involved in modeling dry deposition in such environments. Separate analytics have been done for each section and are contained in the EDB

  18. Laboratory approaches of nuclear reactions involved in primordial and stellar nucleosynthesis

    International Nuclear Information System (INIS)

    Rolfs, C.; California Inst. of Tech., Pasadena

    1986-01-01

    Laboratory-based studies of primordial and stellar nucleosynthesis are reviewed, with emphasis on the nuclear reactions induced by charged particles. The analytical approach used to investigate nuclear reactions associated with stellar reactions is described, as well as the experimental details and procedures used to investigate nuclear reactions induced by charged particles. The present knowledge of some of the key reactions involved in primordial nucleosynthesis is discussed, along with the progress and problems of nuclear reactions involved in the hydrogen and helium burning phases of a star. Finally, a description is given of new experimental techniques which might be useful for future experiments in the field of nuclear astrophysics. (U.K.)

  19. A new approach for developing adjoint models

    Science.gov (United States)

    Farrell, P. E.; Funke, S. W.

    2011-12-01

    Many data assimilation algorithms rely on the availability of gradients of misfit functionals, which can be efficiently computed with adjoint models. However, the development of an adjoint model for a complex geophysical code is generally very difficult. Algorithmic differentiation (AD, also called automatic differentiation) offers one strategy for simplifying this task: it takes the abstraction that a model is a sequence of primitive instructions, each of which may be differentiated in turn. While extremely successful, this low-level abstraction runs into time-consuming difficulties when applied to the whole codebase of a model, such as differentiating through linear solves, model I/O, calls to external libraries, language features that are unsupported by the AD tool, and the use of multiple programming languages. While these difficulties can be overcome, it requires a large amount of technical expertise and an intimate familiarity with both the AD tool and the model. An alternative to applying the AD tool to the whole codebase is to assemble the discrete adjoint equations and use these to compute the necessary gradients. With this approach, the AD tool must be applied to the nonlinear assembly operators, which are typically small, self-contained units of the codebase. The disadvantage of this approach is that the assembly of the discrete adjoint equations is still very difficult to perform correctly, especially for complex multiphysics models that perform temporal integration; as it stands, this approach is as difficult and time-consuming as applying AD to the whole model. In this work, we have developed a library which greatly simplifies and automates the alternate approach of assembling the discrete adjoint equations. We propose a complementary, higher-level abstraction to that of AD: that a model is a sequence of linear solves. The developer annotates model source code with library calls that build a 'tape' of the operators involved and their dependencies, and
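
    The abstraction described above, a model as a sequence of linear solves recorded on a tape, can be illustrated with a toy example: each annotated forward step records its operator and right-hand-side matrix, and the adjoint pass replays the tape in reverse with transposed operators to obtain the gradient of a functional of the final state. The names and structure are illustrative only and are not the authors' library or its API.

    ```python
    import numpy as np

    # Illustrative "tape of linear solves": each forward step x_{k+1} solves
    # A_k x_{k+1} = B_k x_k. For J = g^T x_N, the adjoint pass gives dJ/dx_0.
    tape = []  # each record: (A, B) for one forward linear solve

    def forward_step(A, B, x):
        tape.append((A, B))
        return np.linalg.solve(A, B @ x)

    def adjoint_gradient(g):
        """Propagate the adjoint variable backwards through the recorded solves."""
        lam = g.copy()
        for A, B in reversed(tape):
            # Forward: x_next = A^{-1} B x   =>   adjoint: lam <- B^T A^{-T} lam
            lam = B.T @ np.linalg.solve(A.T, lam)
        return lam

    rng = np.random.default_rng(1)
    x0 = rng.normal(size=3)
    A1, B1 = np.eye(3) * 2.0, rng.normal(size=(3, 3))
    A2, B2 = np.eye(3) * 1.5, rng.normal(size=(3, 3))

    x1 = forward_step(A1, B1, x0)
    x2 = forward_step(A2, B2, x1)
    g = np.array([1.0, 0.0, 0.0])               # J = first component of the final state

    grad = adjoint_gradient(g)                   # gradient dJ/dx0 via the adjoint pass
    fd = np.zeros(3)                             # finite-difference check of the gradient
    for i in range(3):
        xp = x0.copy(); xp[i] += 1e-6
        fd[i] = (np.linalg.solve(A2, B2 @ np.linalg.solve(A1, B1 @ xp))[0] - x2[0]) / 1e-6
    print("adjoint gradient :", np.round(grad, 6))
    print("finite difference:", np.round(fd, 6))
    ```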

  20. Five challenges for stochastic epidemic models involving global transmission

    Directory of Open Access Journals (Sweden)

    Tom Britton

    2015-03-01

    The most basic stochastic epidemic models are those involving global transmission, meaning that infection rates depend only on the type and state of the individuals involved, and not on their location in the population. Simple as they are, there are still several open problems for such models. For example, when will such an epidemic go extinct and with what probability (questions whose answers depend on the population being fixed, changing, or growing)? How can a model be defined that explains the sometimes observed scenario of frequent mid-sized epidemic outbreaks? How can evolution of the infectious agent's transmission rates be modelled and fitted to data in a robust way?
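
    The standard Markovian SIR model with global transmission, the class of models these open problems refer to, can be simulated exactly with a Gillespie algorithm: because transmission is global, the event rates depend only on the counts S and I. The parameters below are illustrative; the fraction of minor outbreaks gives a rough empirical view of the extinction probability mentioned above.

    ```python
    import random

    # Minimal Gillespie simulation of the stochastic SIR epidemic with global transmission.
    def sir_outbreak(n=1000, i0=5, beta=1.5, gamma=1.0, rng=random.Random(42)):
        s, i, t = n - i0, i0, 0.0
        while i > 0:
            infection_rate = beta * s * i / n        # global mixing: rate depends on counts only
            recovery_rate = gamma * i
            total = infection_rate + recovery_rate
            t += rng.expovariate(total)              # waiting time to the next event
            if rng.random() < infection_rate / total:
                s -= 1; i += 1                       # infection event
            else:
                i -= 1                               # recovery event
        return n - s, t                              # final size, extinction time

    sizes = [sir_outbreak(rng=random.Random(seed))[0] for seed in range(200)]
    minor = sum(1 for z in sizes if z < 50)
    print(f"minor outbreaks (<50 cases): {minor}/200, mean final size: {sum(sizes)/len(sizes):.1f}")
    ```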

  1. Place Branding and Citizen Involvement: Participatory Approach to Building and Managing City Brands

    Directory of Open Access Journals (Sweden)

    Hereźniak Marta

    2017-06-01

    This article examines the role of citizens in the process of building and managing city brands. A multidisciplinary approach is applied to explain the multifaceted nature of territorial brands and citizen involvement. To this end, theoretical concepts from marketing and corporate branding, public management, and human geography are applied. By conceptualising place branding as a public policy and a governance process, and drawing from the concept of participatory place branding, the author discusses a variety of methods and instruments used to involve citizens. Special attention is given to the importance of modern technologies for effective citizen involvement.

  2. Gang Involvement among Immigrant and Refugee Youth: A Developmental Ecological Systems Approach

    Science.gov (United States)

    Goodrum, Nada M.; Chan, Wing Yi; Latzman, Robert D.

    2015-01-01

    Immigrant and refugee youth are at elevated risk for joining gangs, which, in turn, is associated with a host of maladaptive outcomes. Previous literature on risk and protective factors for immigrant and refugee youth gang involvement has been inconclusive. Applying a developmental ecological systems approach, this study investigated contextual…

  3. A new approach to hazardous materials transportation risk analysis: decision modeling to identify critical variables.

    Science.gov (United States)

    Clark, Renee M; Besterfield-Sacre, Mary E

    2009-03-01

    We take a novel approach to analyzing hazardous materials transportation risk in this research. Previous studies analyzed this risk from an operations research (OR) or quantitative risk assessment (QRA) perspective by minimizing or calculating risk along a transport route. Further, even though the majority of incidents occur when containers are unloaded, the research has not focused on transportation-related activities, including container loading and unloading. In this work, we developed a decision model of a hazardous materials release during unloading using actual data and an exploratory data modeling approach. Previous studies have had a theoretical perspective in terms of identifying and advancing the key variables related to this risk, and there has not been a focus on probability and statistics-based approaches for doing this. Our decision model empirically identifies the critical variables using an exploratory methodology for a large, highly categorical database involving latent class analysis (LCA), loglinear modeling, and Bayesian networking. Our model identified the most influential variables and countermeasures for two consequences of a hazmat incident, dollar loss and release quantity, and is one of the first models to do this. The most influential variables were found to be related to the failure of the container. In addition to analyzing hazmat risk, our methodology can be used to develop data-driven models for strategic decision making in other domains involving risk.

  4. Modeling human learning involved in car driving

    NARCIS (Netherlands)

    Wewerinke, P.H.

    1994-01-01

    In this paper, car driving is considered at the level of human tracking and maneuvering in the context of other traffic. A model analysis revealed the most salient features determining driving performance and safety. Learning car driving is modelled based on a system theoretical approach and based

  5. Developing a conceptual model for the application of patient and public involvement in the healthcare system in Iran.

    Science.gov (United States)

    Azmal, Mohammad; Sari, Ali Akbari; Foroushani, Abbas Rahimi; Ahmadi, Batoul

    2016-06-01

    Patient and public involvement means engaging patients, providers, community representatives, and the public in healthcare planning and decision-making. The purpose of this study was to develop a model for the application of patient and public involvement in decision making in the Iranian healthcare system. A mixed qualitative-quantitative approach was used to develop a conceptual model. Thirty-three key informants were purposively recruited in the qualitative stage, and 420 people (patients and their companions) were included in a protocol study that was implemented in five steps: 1) Identifying antecedents, consequences, and variables associated with patient and public involvement in healthcare decision making through a comprehensive literature review; 2) Determining the main variables in the context of Iran's health system using conceptual framework analysis; 3) Prioritizing and weighting variables by Shannon entropy; 4) Designing and validating a tool for patient and public involvement in healthcare decision making; and 5) Providing a conceptual model of patient and public involvement in planning and developing healthcare using structural equation modeling. We used various software programs, including SPSS (17), MAXQDA (10), EXCEL, and LISREL. Content analysis, Shannon entropy, and descriptive and analytic statistics were used to analyze the data. In this study, seven antecedent variables, five dimensions of involvement, and six consequences were identified. These variables were used to design a valid tool. A logical model was derived that explained the logical relationships between antecedent and consequent variables and the dimensions of patient and public involvement as well. Given the specific context of the political, social, and innovative environments in Iran, it was necessary to design a model that would be compatible with these features. It can improve the quality of care and promote patient and public satisfaction with healthcare and
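
    Step 3 mentions weighting variables by Shannon entropy. The standard entropy weight method is sketched below on a hypothetical score matrix: criteria whose scores vary more across items carry more information and receive larger weights. The matrix values are placeholders, not the study's survey data.

    ```python
    import math

    # Shannon entropy weighting sketch (hypothetical scores).
    # rows = items (e.g. involvement variables), columns = criteria scores
    matrix = [
        [4, 3, 5],
        [2, 4, 4],
        [5, 2, 3],
        [3, 5, 2],
    ]

    m, n = len(matrix), len(matrix[0])
    entropies = []
    for j in range(n):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [x / total for x in col]
        # Normalised Shannon entropy of criterion j (0 = fully concentrated, 1 = uniform).
        entropies.append(-sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(m))

    divergence = [1 - e for e in entropies]       # degree of diversification per criterion
    weights = [d / sum(divergence) for d in divergence]
    print("entropy weights:", [round(w, 3) for w in weights])
    ```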

  6. TRIF - an intermediate approach to environmental tritium modelling

    International Nuclear Information System (INIS)

    Higgins, N.A.

    1997-01-01

    The movement of tritium through the environment, from an initial atmospheric release to selected end points in the food chain, involves a series of closely coupled and complex processes which are, consequently, difficult to model. TRIF (tritium transfer into food) provides a semi-empirical approach to this transport problem, which can be adjusted to bridge the gap between simple steady state approximations and a fully coupled model of tritium dispersion and migration (Higgins et al., 1996). TRIF provides a time-dependent description of the behaviour of tritium in the form of tritium gas (HT) and tritiated water (HTO) as it enters and moves through the food chain into pasture, crops and animals. This includes a representation of the production and movement of organically bound tritium (OBT). (Author)

  7. The Missing Stakeholder Group: Why Patients Should be Involved in Health Economic Modelling.

    Science.gov (United States)

    van Voorn, George A K; Vemer, Pepijn; Hamerlijnck, Dominique; Ramos, Isaac Corro; Teunissen, Geertruida J; Al, Maiwenn; Feenstra, Talitha L

    2016-04-01

    Evaluations of healthcare interventions, e.g. new drugs or other new treatment strategies, commonly include a cost-effectiveness analysis (CEA) that is based on the application of health economic (HE) models. As end users, patients are important stakeholders regarding the outcomes of CEAs, yet their knowledge of HE model development and application, or their involvement therein, is absent. This paper considers possible benefits and risks of patient involvement in HE model development and application for modellers and patients. An exploratory review of the literature has been performed on stakeholder-involved modelling in various disciplines. In addition, Dutch patient experts have been interviewed about their experience in, and opinion about, the application of HE models. Patients have little to no knowledge of HE models and are seldom involved in HE model development and application. Benefits of becoming involved would include a greater understanding and possible acceptance by patients of HE model application, improved model validation, and a more direct infusion of patient expertise. Risks would include patient bias and increased costs of modelling. Patient involvement in HE modelling seems to carry several benefits as well as risks. We claim that the benefits may outweigh the risks and that patients should become involved.

  8. Modeling of scale-dependent bacterial growth by chemical kinetics approach.

    Science.gov (United States)

    Martínez, Haydee; Sánchez, Joaquín; Cruz, José-Manuel; Ayala, Guadalupe; Rivera, Marco; Buhse, Thomas

    2014-01-01

    We applied the so-called chemical kinetics approach to complex bacterial growth patterns that were dependent on the liquid-surface-area-to-volume ratio (SA/V) of the bacterial cultures. The kinetic modeling was based on current experimental knowledge in terms of autocatalytic bacterial growth, its inhibition by the metabolite CO2, and the relief of inhibition through the physical escape of the inhibitor. The model quantitatively reproduces kinetic data of SA/V-dependent bacterial growth and can discriminate between differences in the growth dynamics of enteropathogenic E. coli, E. coli JM83, and Salmonella typhimurium on one hand and Vibrio cholerae on the other hand. Furthermore, the data fitting procedures allowed predictions about the velocities of the involved key processes and the potential behavior in an open-flow bacterial chemostat, revealing an oscillatory approach to the stationary states.

  9. Modeling of Scale-Dependent Bacterial Growth by Chemical Kinetics Approach

    Directory of Open Access Journals (Sweden)

    Haydee Martínez

    2014-01-01

    Full Text Available We applied the so-called chemical kinetics approach to complex bacterial growth patterns that were dependent on the liquid-surface-area-to-volume ratio (SA/V) of the bacterial cultures. The kinetic modeling was based on current experimental knowledge in terms of autocatalytic bacterial growth, its inhibition by the metabolite CO2, and the relief of inhibition through the physical escape of the inhibitor. The model quantitatively reproduces kinetic data of SA/V-dependent bacterial growth and can discriminate between differences in the growth dynamics of enteropathogenic E. coli, E. coli JM83, and Salmonella typhimurium on one hand and Vibrio cholerae on the other hand. Furthermore, the data fitting procedures allowed predictions about the velocities of the involved key processes and the potential behavior in an open-flow bacterial chemostat, revealing an oscillatory approach to the stationary states.
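
    A minimal sketch of the kind of kinetic scheme described above: autocatalytic (here logistic) growth of biomass, production of a dissolved inhibitor (CO2) alongside growth, inhibition of growth by that metabolite, and first-order escape of the inhibitor at a rate that scales with SA/V. The rate laws and constants are illustrative assumptions, not the fitted equations or parameters of the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

def growth_model(t, y, mu, K, k_prod, k_inh, k_escape, sa_v):
    """Toy SA/V-dependent growth: biomass B and dissolved inhibitor I."""
    B, I = y
    growth = mu * B * (1.0 - B / K) / (1.0 + k_inh * I)  # growth slowed by inhibitor
    dB = growth
    dI = k_prod * growth - k_escape * sa_v * I           # inhibitor escape scales with SA/V
    return [dB, dI]

for sa_v in (0.2, 1.0, 5.0):  # low to high surface-area-to-volume ratio
    sol = solve_ivp(growth_model, (0, 24), [0.01, 0.0],
                    args=(0.8, 1.0, 1.0, 5.0, 0.3, sa_v))
    print(f"SA/V = {sa_v}: biomass after 24 h = {sol.y[0, -1]:.3f}")
```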

  10. Optogenetic approaches to evaluate striatal function in animal models of Parkinson disease.

    Science.gov (United States)

    Parker, Krystal L; Kim, Youngcho; Alberico, Stephanie L; Emmons, Eric B; Narayanan, Nandakumar S

    2016-03-01

    Optogenetics refers to the ability to control cells that have been genetically modified to express light-sensitive ion channels. The introduction of optogenetic approaches has facilitated the dissection of neural circuits. Optogenetics allows for the precise stimulation and inhibition of specific sets of neurons and their projections with fine temporal specificity. These techniques are ideally suited to investigating neural circuitry underlying motor and cognitive dysfunction in animal models of human disease. Here, we focus on how optogenetics has been used over the last decade to probe striatal circuits that are involved in Parkinson disease, a neurodegenerative condition involving motor and cognitive abnormalities resulting from degeneration of midbrain dopaminergic neurons. The precise mechanisms underlying the striatal contribution to both cognitive and motor dysfunction in Parkinson disease are unknown. Although optogenetic approaches are somewhat removed from clinical use, insight from these studies can help identify novel therapeutic targets and may inspire new treatments for Parkinson disease. Elucidating how neuronal and behavioral functions are influenced and potentially rescued by optogenetic manipulation in animal models could prove to be translatable to humans. These insights can be used to guide future brain-stimulation approaches for motor and cognitive abnormalities in Parkinson disease and other neuropsychiatric diseases.

  11. The Multi-Scale Model Approach to Thermohydrology at Yucca Mountain

    International Nuclear Information System (INIS)

    Glascoe, L; Buscheck, T A; Gansemer, J; Sun, Y

    2002-01-01

    The Multi-Scale Thermo-Hydrologic (MSTH) process model is a modeling abstraction of the thermal hydrology (TH) of the potential Yucca Mountain repository at multiple spatial scales. The MSTH model as described herein was used for the Supplemental Science and Performance Analyses (BSC, 2001) and is documented in detail in CRWMS M and O (2000) and Glascoe et al. (2002). The model has been validated to a nested grid model in Buscheck et al. (In Review). The MSTH approach is necessary for modeling thermal hydrology at Yucca Mountain for two reasons: (1) varying levels of detail are necessary at different spatial scales to capture important TH processes and (2) a fully-coupled TH model of the repository which includes the necessary spatial detail is computationally prohibitive. The MSTH model consists of six ''submodels'' which are combined in a manner to reduce the complexity of modeling where appropriate. The coupling of these models allows for appropriate consideration of mountain-scale thermal hydrology along with the thermal hydrology of drift-scale discrete waste packages of varying heat load. Two stages are involved in the MSTH approach, first, the execution of submodels, and second, the assembly of submodels using the Multi-scale Thermohydrology Abstraction Code (MSTHAC). MSTHAC assembles the submodels in a five-step process culminating in the TH model output of discrete waste packages including a mountain-scale influence

  12. Modeling of problems of projection: A non-countercyclic approach

    Directory of Open Access Journals (Sweden)

    Jason Ginsburg

    2016-06-01

    Full Text Available This paper describes a computational implementation of the recent Problems of Projection (POP) approach to the study of language (Chomsky 2013; 2015). While adopting the basic proposals of POP, notably with respect to how labeling occurs, we (a) attempt to formalize the basic proposals of POP, and (b) develop new proposals that overcome some problems with POP that arise with respect to cyclicity, labeling, and wh-movement operations. We show how this approach accounts for simple declarative sentences, ECM constructions, and constructions that involve long-distance movement of a wh-phrase (including the that-trace effect). We implemented these proposals with a computer model that automatically constructs step-by-step derivations of target sentences, thus making it possible to verify that these proposals work.

  13. Post-closure biosphere assessment modelling: comparison of complex and more stylised approaches.

    Science.gov (United States)

    Walke, Russell C; Kirchner, Gerald; Xu, Shulan; Dverstorp, Björn

    2015-10-01

    Geological disposal facilities are the preferred option for high-level radioactive waste, due to their potential to provide isolation from the surface environment (biosphere) on very long timescales. Assessments need to strike a balance between stylised models and more complex approaches that draw more extensively on site-specific information. This paper explores the relative merits of complex versus more stylised biosphere models in the context of a site-specific assessment. The more complex biosphere modelling approach was developed by the Swedish Nuclear Fuel and Waste Management Co (SKB) for the Forsmark candidate site for a spent nuclear fuel repository in Sweden. SKB's approach is built on a landscape development model, whereby radionuclide releases to distinct hydrological basins/sub-catchments (termed 'objects') are represented as they evolve through land rise and climate change. Each of seventeen of these objects is represented with more than 80 site-specific parameters, with about 22 that are time-dependent and result in over 5000 input values per object. The more stylised biosphere models developed for this study represent releases to individual ecosystems without environmental change and include the most plausible transport processes. In the context of regulatory review of the landscape modelling approach adopted in the SR-Site assessment in Sweden, the more stylised representation has helped to build understanding in the more complex modelling approaches by providing bounding results, checking the reasonableness of the more complex modelling, highlighting uncertainties introduced through conceptual assumptions and helping to quantify the conservatisms involved. The more stylised biosphere models are also shown capable of reproducing the results of more complex approaches. A major recommendation is that biosphere assessments need to justify the degree of complexity in modelling approaches as well as simplifying and conservative assumptions. In light of

  14. A complex systems approach to constructing better models for managing financial markets and the economy

    Science.gov (United States)

    Farmer, J. Doyne; Gallegati, M.; Hommes, C.; Kirman, A.; Ormerod, P.; Cincotti, S.; Sanchez, A.; Helbing, D.

    2012-11-01

    We outline a vision for an ambitious program to understand the economy and financial markets as a complex evolving system of coupled networks of interacting agents. This is a completely different vision from that currently used in most economic models. This view implies new challenges and opportunities for policy and managing economic crises. The dynamics of such models inherently involve sudden and sometimes dramatic changes of state. Further, the tools and approaches we use emphasize the analysis of crises rather than of calm periods. In this they respond directly to the calls of Governors Bernanke and Trichet for new approaches to macroeconomic modelling.

  15. Structuring a Multiproduct Sales Quota-Bonus Plan for a Heterogeneous Sales Force: A Practical Model-Based Approach

    OpenAIRE

    Murali K. Mantrala; Prabhakant Sinha; Andris A. Zoltners

    1994-01-01

    This paper presents an agency theoretic model-based approach that assists sales managers in determining the profit-maximizing structure of a common multiproduct sales quota-bonus plan for a geographically specialized heterogeneous sales force operating in a repetitive buying environment. This approach involves estimating each salesperson's utility function for income and effort and using these models to predict individual sales achievements and the associated aggregate profit for the firm und...

  16. [New approaches in pharmacology: numerical modelling and simulation].

    Science.gov (United States)

    Boissel, Jean-Pierre; Cucherat, Michel; Nony, Patrice; Dronne, Marie-Aimée; Kassaï, Behrouz; Chabaud, Sylvie

    2005-01-01

    The complexity of pathophysiological mechanisms is beyond the capabilities of traditional approaches. Many of the decision-making problems in public health, such as initiating mass screening, are complex. Progress in genomics and proteomics, and the resulting extraordinary increase in knowledge with regard to interactions between gene expression, the environment and behaviour, the customisation of risk factors and the need to combine therapies that individually have minimal though well documented efficacy, has led doctors to raise new questions: how to optimise choice and the application of therapeutic strategies at the individual rather than the group level, while taking into account all the available evidence? This is essentially a problem of complexity with dimensions similar to the previous ones: multiple parameters with nonlinear relationships between them, varying time scales that cannot be ignored etc. Numerical modelling and simulation (in silico investigations) have the potential to meet these challenges. Such approaches are considered in drug innovation and development. They require a multidisciplinary approach, and this will involve modification of the way research in pharmacology is conducted.

  17. Practical modeling approaches for geological storage of carbon dioxide.

    Science.gov (United States)

    Celia, Michael A; Nordbotten, Jan M

    2009-01-01

    The relentless increase of anthropogenic carbon dioxide emissions and the associated concerns about climate change have motivated new ideas about carbon-constrained energy production. One technological approach to control carbon dioxide emissions is carbon capture and storage, or CCS. The underlying idea of CCS is to capture the carbon before it is emitted to the atmosphere and store it somewhere other than the atmosphere. Currently, the most attractive option for large-scale storage is in deep geological formations, including deep saline aquifers. Many physical and chemical processes can affect the fate of the injected CO2, with the overall mathematical description of the complete system becoming very complex. Our approach to the problem has been to reduce complexity as much as possible, so that we can focus on the few truly important questions about the injected CO2, most of which involve leakage out of the injection formation. Toward this end, we have established a set of simplifying assumptions that allow us to derive simplified models, which can be solved numerically or, for the most simplified cases, analytically. These simplified models allow calculation of solutions to large-scale injection and leakage problems in ways that traditional multicomponent multiphase simulators cannot. Such simplified models provide important tools for system analysis, screening calculations, and overall risk-assessment calculations. We believe this is a practical and important approach to model geological storage of carbon dioxide. It also serves as an example of how complex systems can be simplified while retaining the essential physics of the problem.

  18. A novel approach to multihazard modeling and simulation.

    Science.gov (United States)

    Smith, Silas W; Portelli, Ian; Narzisi, Giuseppe; Nelson, Lewis S; Menges, Fabian; Rekow, E Dianne; Mincer, Joshua S; Mishra, Bhubaneswar; Goldfrank, Lewis R

    2009-06-01

    To develop and apply a novel modeling approach to support medical and public health disaster planning and response using a sarin release scenario in a metropolitan environment. An agent-based disaster simulation model was developed incorporating the principles of dose response, surge response, and psychosocial characteristics superimposed on topographically accurate geographic information system architecture. The modeling scenarios involved passive and active releases of sarin in multiple transportation hubs in a metropolitan city. Parameters evaluated included emergency medical services, hospital surge capacity (including implementation of disaster plan), and behavioral and psychosocial characteristics of the victims. In passive sarin release scenarios of 5 to 15 L, mortality increased nonlinearly from 0.13% to 8.69%, reaching 55.4% with active dispersion, reflecting higher initial doses. Cumulative mortality rates from releases in 1 to 3 major transportation hubs similarly increased nonlinearly as a function of dose and systemic stress. The increase in mortality rate was most pronounced in the 80% to 100% emergency department occupancy range, analogous to the previously observed queuing phenomenon. Effective implementation of hospital disaster plans decreased mortality and injury severity. Decreasing ambulance response time and increasing available responding units reduced mortality among potentially salvageable patients. Adverse psychosocial characteristics (excess worry and low compliance) increased demands on health care resources. Transfer to alternative urban sites was possible. An agent-based modeling approach provides a mechanism to assess complex individual and systemwide effects in rare events.

  19. Extension of the direct statistical approach to a volume parameter model (non-integer splitting)

    International Nuclear Information System (INIS)

    Burn, K.W.

    1990-01-01

    The Direct Statistical Approach is a rigorous mathematical derivation of the second moment for surface splitting and Russian Roulette games attached to the Monte Carlo modelling of fixed source particle transport. It has been extended to a volume parameter model (involving non-integer ''expected value'' splitting), and then to a cell model. The cell model gives second moment and time functions that have a closed form. This suggests the possibility of two different methods of solution of the optimum splitting/Russian Roulette parameters. (author)
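
    The splitting and Russian roulette games to which the Direct Statistical Approach applies can be illustrated with the usual weight-preserving rules, including the non-integer "expected value" splitting mentioned above. The sketch below is a generic implementation of those games at a surface or cell boundary, not the DSA second-moment derivation itself; the importance ratio in the example is hypothetical.

```python
import random

def split_or_roulette(weight, importance_ratio, rng=random):
    """Apply expected-value splitting / Russian roulette at a boundary crossing.

    importance_ratio = I_new / I_old. Returns a list of surviving particle
    weights; the expected total weight always equals the input weight.
    """
    if importance_ratio >= 1.0:
        # Non-integer splitting: produce n or n+1 copies so that the mean
        # number of copies equals the (generally non-integer) ratio.
        n = int(importance_ratio)
        frac = importance_ratio - n
        copies = n + (1 if rng.random() < frac else 0)
        return [weight / importance_ratio] * copies
    # Russian roulette: survive with probability equal to the ratio.
    if rng.random() < importance_ratio:
        return [weight / importance_ratio]
    return []

# Hypothetical example: crossing into a region 2.5 times more important
print(split_or_roulette(1.0, 2.5))
```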

  20. Global energy modeling - A biophysical approach

    Energy Technology Data Exchange (ETDEWEB)

    Dale, Michael

    2010-09-15

    This paper contrasts the standard economic approach to energy modelling with energy models using a biophysical approach. Neither of these approaches includes changing energy-returns-on-investment (EROI) due to declining resource quality or the capital intensive nature of renewable energy sources. Both of these factors will become increasingly important in the future. An extension to the biophysical approach is outlined which encompasses a dynamic EROI function that explicitly incorporates technological learning. The model is used to explore several scenarios of long-term future energy supply especially concerning the global transition to renewable energy sources in the quest for a sustainable energy system.
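
    The dynamic EROI function with technological learning outlined above can be sketched as a simple experience curve in which the energy cost of delivering energy falls with cumulative installed capacity, raising EROI towards a ceiling. The functional form, learning rate and ceiling below are illustrative assumptions, not the parameterization used in the paper.

```python
import math

def eroi_with_learning(cumulative_capacity, eroi_initial=5.0, eroi_max=20.0,
                       learning_rate=0.2, capacity_0=1.0):
    """Experience-curve EROI: the energy cost per unit of delivered energy falls
    by `learning_rate` for every doubling of cumulative capacity, raising EROI
    from eroi_initial towards an assumed ceiling eroi_max."""
    b = -math.log2(1.0 - learning_rate)                  # experience-curve exponent
    decline = (cumulative_capacity / capacity_0) ** (-b)
    energy_cost = (1.0 / eroi_initial - 1.0 / eroi_max) * decline
    return 1.0 / (energy_cost + 1.0 / eroi_max)

for capacity in (1, 2, 8, 64, 512):                      # arbitrary capacity units
    print(capacity, round(eroi_with_learning(capacity), 2))
```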

  1. Scaling Consumers' Purchase Involvement: A New Approach

    Directory of Open Access Journals (Sweden)

    Jörg Kraigher-Krainer

    2012-06-01

    Full Text Available A two-dimensional scale, called ECID Scale, is presented in this paper. The scale is based on a comprehensive model and captures the two antecedent factors of purchase-related involvement, namely whether motivation is intrinsic or extrinsic and whether risk is perceived as low or high. The procedure of scale development and item selection is described. The scale turns out to perform well in terms of validity, reliability, and objectivity despite the use of a small set of items – four each – allowing for simultaneous measurements of up to ten purchases per respondent. The procedure of administering the scale is described so that it can now easily be applied by both scholars and practitioners. Finally, managerial implications of data received from its application which provide insights into possible strategic marketing conclusions are discussed.

  2. Comparisons of Multilevel Modeling and Structural Equation Modeling Approaches to Actor-Partner Interdependence Model.

    Science.gov (United States)

    Hong, Sehee; Kim, Soyoung

    2018-01-01

    There are basically two modeling approaches applicable to analyzing an actor-partner interdependence model: the multilevel modeling (hierarchical linear model) and the structural equation modeling. This article explains how to use these two models in analyzing an actor-partner interdependence model and how these two approaches work differently. As an empirical example, marital conflict data were used to analyze an actor-partner interdependence model. The multilevel modeling and the structural equation modeling produced virtually identical estimates for a basic model. However, the structural equation modeling approach allowed more realistic assumptions on measurement errors and factor loadings, rendering better model fit indices.

  3. A Multi-Model Approach for System Diagnosis

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Poulsen, Niels Kjølstad; Bækgaard, Mikkel Ask Buur

    2007-01-01

    A multi-model approach for system diagnosis is presented in this paper. The relation with fault diagnosis as well as performance validation is considered. The approach is based on testing a number of pre-described models and finding which one is the best. It is based on an active approach, i.e. an auxiliary input to the system is applied. The multi-model approach is applied on a wind turbine system.
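
    A minimal sketch of the multi-model idea under assumed first-order dynamics: an auxiliary excitation input is applied, the measured response is compared with the predictions of a bank of pre-described models, and the best-fitting model indicates the current condition. The candidate models, fault labels and noise level are hypothetical and are not taken from the paper.

```python
import numpy as np

def simulate(gain, u):
    """Toy first-order response y[k+1] = 0.9*y[k] + gain*u[k] (assumed dynamics)."""
    y = np.zeros(len(u) + 1)
    for k, uk in enumerate(u):
        y[k + 1] = 0.9 * y[k] + gain * uk
    return y[1:]

# Bank of pre-described models: nominal behaviour and two assumed fault conditions
model_bank = {"nominal": 1.0, "actuator degraded": 0.5, "actuator stuck": 0.0}

rng = np.random.default_rng(0)
u_aux = np.sin(0.3 * np.arange(50))                      # auxiliary excitation input
y_meas = simulate(0.5, u_aux) + 0.02 * rng.standard_normal(50)  # "measured" response

residuals = {name: np.sum((y_meas - simulate(g, u_aux)) ** 2)
             for name, g in model_bank.items()}
print(min(residuals, key=residuals.get))                 # -> "actuator degraded"
```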

  4. Modelling Approach In Islamic Architectural Designs

    Directory of Open Access Journals (Sweden)

    Suhaimi Salleh

    2014-06-01

    Full Text Available Architectural designs contribute as one of the main factors that should be considered in minimizing negative impacts in planning and structural development in buildings such as in mosques. In this paper, the ergonomics perspective is revisited which hence focuses on the conditional factors involving organisational, psychological, social and population as a whole. This paper tries to highlight the functional and architectural integration with aesthetic elements in the form of decorative and ornamental outlay as well as incorporating the building structure such as walls, domes and gates. This paper further focuses on the mathematical aspects of the architectural designs such as polar equations and the golden ratio. These designs are modelled into mathematical equations of various forms, while the golden ratio in mosques is verified using two techniques, namely, the geometric construction and the numerical method. The exemplary designs are taken from the Sabah Bandaraya Mosque in Likas, Kota Kinabalu and the Sarawak State Mosque in Kuching, while the Universiti Malaysia Sabah Mosque is used for the Golden Ratio. Results show that Islamic architectural buildings and designs have long had mathematical concepts and techniques underlying their foundation; hence, a modelling approach is needed to rejuvenate these Islamic designs.

  5. A survey on control schemes for distributed solar collector fields. Part I: Modeling and basic control approaches

    Energy Technology Data Exchange (ETDEWEB)

    Camacho, E.F.; Rubio, F.R. [Universidad de Sevilla, Escuela Superior de Ingenieros, Departamento de Ingenieria de Sistemas y Automatica, Camino de Los Descubrimientos s/n, E-41092, Sevilla (Spain); Berenguel, M. [Universidad de Almeria, Departamento de Lenguajes y Computacion, Area de Ingenieria de Sistemas y Automatica, Carretera Sacramento s/n, E-04120 La Canada, Almeria (Spain); Valenzuela, L. [Plataforma Solar de Almeria - CIEMAT, Carretera Senes s/n, P.O. Box 22, E-04200 Tabernas, Almeria (Spain)

    2007-10-15

    This article presents a survey of the different automatic control techniques that have been applied to control the outlet temperature of solar plants with distributed collectors during the last 25 years. Different aspects of the control problem involved in this kind of plants are treated, from modeling and simulation approaches to the different basic control schemes developed and successfully applied in real solar plants. A classification of the modeling and control approaches is used to explain the main features of each strategy. (author)

  6. New business models for electric cars-A holistic approach

    International Nuclear Information System (INIS)

    Kley, Fabian; Lerch, Christian; Dallinger, David

    2011-01-01

    Climate change and global resource shortages have led to rethinking traditional individual mobility services based on combustion engines. As the consequence of technological improvements, the first electric vehicles are now being introduced and greater market penetration can be expected. But any wider implementation of battery-powered electrical propulsion systems in the future will give rise to new challenges for both the traditional automotive industry and other new players, e.g. battery manufacturers, the power supply industry and other service providers. Different application cases of electric vehicles are currently being discussed, which means that numerous business models could emerge, leading to new shares in value creation and involving new players. Consequently, individual stakeholders are uncertain about which business models are really effective with regard to targeting a profitable overall concept. Therefore, this paper aims to define a holistic approach to developing business models for electric mobility, which analyzes the system as a whole on the one hand and provides decision support for affected enterprises on the other. To do so, the basic elements of electric mobility are considered and topical approaches to business models for various stakeholders are discussed. The paper concludes by presenting a systemic instrument for business models based on morphological methods. - Highlights: → We present a systemic instrument to analyze business models for electric vehicles. → Provides decision support for enterprises dealing with electric vehicle innovations. → Combines business aspects of the triad of vehicle concepts, infrastructure and system integration. → In the market, activities in all domains have been initiated, but often with undefined or unclear structures.

  7. A modified dynamic evolving neural-fuzzy approach to modeling customer satisfaction for affective design.

    Science.gov (United States)

    Kwong, C K; Fung, K Y; Jiang, Huimin; Chan, K Y; Siu, Kin Wai Michael

    2013-01-01

    Affective design is an important aspect of product development to achieve a competitive edge in the marketplace. A neural-fuzzy network approach has been attempted recently to model customer satisfaction for affective design and it has been proved to be an effective one to deal with the fuzziness and non-linearity of the modeling as well as generate explicit customer satisfaction models. However, such an approach to modeling customer satisfaction has two limitations. First, it is not suitable for the modeling problems which involve a large number of inputs. Second, it cannot adapt to new data sets, given that its structure is fixed once it has been developed. In this paper, a modified dynamic evolving neural-fuzzy approach is proposed to address the above mentioned limitations. A case study on the affective design of mobile phones was conducted to illustrate the effectiveness of the proposed methodology. Validation tests were conducted and the test results indicated that: (1) the conventional Adaptive Neuro-Fuzzy Inference System (ANFIS) failed to run due to a large number of inputs; (2) the proposed dynamic neural-fuzzy model outperforms the subtractive clustering-based ANFIS model and fuzzy c-means clustering-based ANFIS model in terms of their modeling accuracy and computational effort.

  8. A Modified Dynamic Evolving Neural-Fuzzy Approach to Modeling Customer Satisfaction for Affective Design

    Directory of Open Access Journals (Sweden)

    C. K. Kwong

    2013-01-01

    Full Text Available Affective design is an important aspect of product development to achieve a competitive edge in the marketplace. A neural-fuzzy network approach has been attempted recently to model customer satisfaction for affective design and it has been proved to be an effective one to deal with the fuzziness and non-linearity of the modeling as well as generate explicit customer satisfaction models. However, such an approach to modeling customer satisfaction has two limitations. First, it is not suitable for the modeling problems which involve a large number of inputs. Second, it cannot adapt to new data sets, given that its structure is fixed once it has been developed. In this paper, a modified dynamic evolving neural-fuzzy approach is proposed to address the above mentioned limitations. A case study on the affective design of mobile phones was conducted to illustrate the effectiveness of the proposed methodology. Validation tests were conducted and the test results indicated that: (1) the conventional Adaptive Neuro-Fuzzy Inference System (ANFIS) failed to run due to a large number of inputs; (2) the proposed dynamic neural-fuzzy model outperforms the subtractive clustering-based ANFIS model and fuzzy c-means clustering-based ANFIS model in terms of their modeling accuracy and computational effort.

  9. Linear mixed-effects modeling approach to FMRI group analysis.

    Science.gov (United States)

    Chen, Gang; Saad, Ziad S; Britton, Jennifer C; Pine, Daniel S; Cox, Robert W

    2013-06-01

    Conventional group analysis is usually performed with Student-type t-test, regression, or standard AN(C)OVA in which the variance-covariance matrix is presumed to have a simple structure. Some correction approaches are adopted when assumptions about the covariance structure are violated. However, as experiments are designed with different degrees of sophistication, these traditional methods can become cumbersome, or even be unable to handle the situation at hand. For example, most current FMRI software packages have difficulty analyzing the following scenarios at group level: (1) taking within-subject variability into account when there are effect estimates from multiple runs or sessions; (2) continuous explanatory variables (covariates) modeling in the presence of a within-subject (repeated measures) factor, multiple subject-grouping (between-subjects) factors, or the mixture of both; (3) subject-specific adjustments in covariate modeling; (4) group analysis with estimation of hemodynamic response (HDR) function by multiple basis functions; (5) various cases of missing data in longitudinal studies; and (6) group studies involving family members or twins. Here we present a linear mixed-effects modeling (LME) methodology that extends the conventional group analysis approach to analyze many complicated cases, including the six prototypes delineated above, whose analyses would be otherwise either difficult or unfeasible under traditional frameworks such as AN(C)OVA and general linear model (GLM). In addition, the strength of the LME framework lies in its flexibility to model and estimate the variance-covariance structures for both random effects and residuals. The intraclass correlation (ICC) values can be easily obtained with an LME model with crossed random effects, even in the presence of confounding fixed effects. The simulations of one prototypical scenario indicate that the LME modeling keeps a balance between the control for false positives and the sensitivity
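
    As an illustration of one of the points above (obtaining ICC values from an LME model), the sketch below fits a random-intercept model with statsmodels on simulated data and computes the intraclass correlation from the estimated variance components. The single random intercept is a simplification of the crossed-random-effects formulation described in the paper, and the data are synthetic.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_subj, n_rep = 20, 8
subj = np.repeat(np.arange(n_subj), n_rep)
subj_effect = rng.normal(0.0, 1.0, n_subj)[subj]         # between-subject variability
y = 2.0 + subj_effect + rng.normal(0.0, 0.5, n_subj * n_rep)
df = pd.DataFrame({"y": y, "subject": subj})

# Random-intercept linear mixed-effects model: y ~ 1 + (1 | subject)
fit = smf.mixedlm("y ~ 1", df, groups=df["subject"]).fit()
var_between = fit.cov_re.iloc[0, 0]                      # random-intercept variance
var_within = fit.scale                                   # residual variance
icc = var_between / (var_between + var_within)
print(f"ICC = {icc:.2f}")                                # close to 1.0 / (1.0 + 0.25) = 0.8
```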

  10. The balance space approach to multicriteria decision making—involving the decision maker

    OpenAIRE

    Ehrgott, M.

    2002-01-01

    The balance space approach (introduced by Galperin in 1990) provides a new view on multicriteria optimization. Looking at deviations from global optimality of the different objectives, balance points and balance numbers are defined when either different or equal deviations for each objective are allowed. Apportioned balance numbers allow the specification of proportions among the deviations. Through this concept the decision maker can be involved in the decision process. In this paper we prov...

  11. A Model Based Approach to Increase the Part Accuracy in Robot Based Incremental Sheet Metal Forming

    International Nuclear Information System (INIS)

    Meier, Horst; Laurischkat, Roman; Zhu Junhong

    2011-01-01

    One main influence on the dimensional accuracy in robot based incremental sheet metal forming results from the compliance of the involved robot structures. Compared to conventional machine tools, the low stiffness of the robot's kinematic structure results in a significant deviation from the planned tool path and therefore in a shape of insufficient quality. To predict and compensate for these deviations offline, a model based approach has been developed, consisting of a finite element approach to simulate the sheet forming and a multi body system modeling the compliant robot structure. This paper describes the implementation and experimental verification of the multi body system model and its included compensation method.

  12. Patient involvement in mental health care: culture, communication and caution.

    Science.gov (United States)

    Tse, Samson; Tang, Jessica; Kan, Alice

    2015-02-01

    Patient or service user involvement in mental health services (MHS) is a hallmark of the recovery approach. In this viewpoint article, we review Tambuyzer et al.'s paper 'Patient involvement in mental health care: One size does not fit all' in order to express our opinion of their work. We also suggest specific actions that may enhance the implementation of patient involvement in MHS. We make three main points about Tambuyzer et al.'s model. First, the cultural dimension of patient involvement seems underemphasized in the model. Second, the model might be improved if the increasing role of communications technology in patient involvement is taken into consideration. Third, it is important to acknowledge that the process of patient involvement is not linear, and participation is not a homogeneous experience. We suggest that the model be expanded and that further work be carried out on the implementation of patient involvement in MHS. © 2012 John Wiley & Sons Ltd.

  13. Regional-scale brine migration along vertical pathways due to CO2 injection - Part 1: The participatory modeling approach

    Science.gov (United States)

    Scheer, Dirk; Konrad, Wilfried; Class, Holger; Kissinger, Alexander; Knopf, Stefan; Noack, Vera

    2017-06-01

    Saltwater intrusion into potential drinking water aquifers due to the injection of CO2 into deep saline aquifers is one of the potential hazards associated with the geological storage of CO2. Thus, in a site selection process, models for predicting the fate of the displaced brine are required, for example, for a risk assessment or the optimization of pressure management concepts. From the very beginning, this research on brine migration aimed at involving expert and stakeholder knowledge and assessment in simulating the impacts of injecting CO2 into deep saline aquifers by means of a participatory modeling process. The involvement exercise made use of two approaches. First, guideline-based interviews were carried out, aiming at eliciting expert and stakeholder knowledge and assessments of geological structures and mechanisms affecting CO2-induced brine migration. Second, a stakeholder workshop including the World Café format yielded evaluations and judgments of the numerical modeling approach, scenario selection, and preliminary simulation results. The participatory modeling approach gained several results covering brine migration in general, the geological model sketch, scenario development, and the review of the preliminary simulation results. These results were included in revised versions of both the geological model and the numerical model, helping to improve the analysis of regional-scale brine migration along vertical pathways due to CO2 injection.

  14. Parental Involvement in the Musical Education of Violin Students: Suzuki and "Traditional" Approaches Compared

    Science.gov (United States)

    Bugeja, Clare

    2009-01-01

    This article investigates parental involvement in the musical education of violin students and the changing role of the parents' across the learning process. Two contexts were compared, one emphasising the Suzuki methodology and the other a "traditional" approach. Students learning "traditionally" are typically taught note reading from the…

  15. Combining engineering and data-driven approaches: Development of a generic fire risk model facilitating calibration

    DEFF Research Database (Denmark)

    De Sanctis, G.; Fischer, K.; Kohler, J.

    2014-01-01

    Fire risk models support decision making for engineering problems under the consistent consideration of the associated uncertainties. Empirical approaches can be used for cost-benefit studies when enough data about the decision problem are available, but often the empirical approaches are not detailed enough. Engineering risk models, on the other hand, may be detailed but typically involve assumptions that may result in a biased risk assessment and make a cost-benefit study problematic. In two related papers it is shown how engineering and data-driven modeling can be combined by developing a generic risk model that is calibrated to observed fire loss data. Generic risk models assess the risk of buildings based on specific risk indicators and support risk assessment at a portfolio level. After an introduction to the principles of generic risk assessment, the focus of the present paper...

  16. SLS Navigation Model-Based Design Approach

    Science.gov (United States)

    Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas

    2018-01-01

    The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the Navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of overlap between the requirements development and management process, and the design and analysis process by efficiently combining the control (i.e. the requirement) and the design mechanisms. The design mechanism is the representation of the component behavior and performance in design and analysis tools. The focus in the early design process shifts from the development and

  17. Electromagnetic forward modelling for realistic Earth models using unstructured tetrahedral meshes and a meshfree approach

    Science.gov (United States)

    Farquharson, C.; Long, J.; Lu, X.; Lelievre, P. G.

    2017-12-01

    Real-life geology is complex, and so, even when allowing for the diffusive, low resolution nature of geophysical electromagnetic methods, we need Earth models that can accurately represent this complexity when modelling and inverting electromagnetic data. This is particularly the case for the scales, detail and conductivity contrasts involved in mineral and hydrocarbon exploration and development, but also for the larger scale of lithospheric studies. Unstructured tetrahedral meshes provide a flexible means of discretizing a general, arbitrary Earth model. This is important when wanting to integrate a geophysical Earth model with a geological Earth model parameterized in terms of surfaces. Finite-element and finite-volume methods can be derived for computing the electric and magnetic fields in a model parameterized using an unstructured tetrahedral mesh. A number of such variants have been proposed and have proven successful. However, the efficiency and accuracy of these methods can be affected by the "quality" of the tetrahedral discretization, that is, how many of the tetrahedral cells in the mesh are long, narrow and pointy. This is particularly the case if one wants to use an iterative technique to solve the resulting linear system of equations. One approach to deal with this issue is to develop sophisticated model and mesh building and manipulation capabilities in order to ensure that any mesh built from geological information is of sufficient quality for the electromagnetic modelling. Another approach is to investigate other methods of synthesizing the electromagnetic fields. One such example is a "meshfree" approach in which the electromagnetic fields are synthesized using a mesh that is distinct from the mesh used to parameterize the Earth model. There are then two meshes, one describing the Earth model and one used for the numerical mathematics of computing the fields. This means that there are no longer any quality requirements on the model mesh, which

  18. Dry deposition models for radionuclides dispersed in air: a new approach for deposition velocity evaluation schema

    Science.gov (United States)

    Giardina, M.; Buffa, P.; Cervone, A.; De Rosa, F.; Lombardo, C.; Casamirra, M.

    2017-11-01

    In the framework of a National Research Program funded by the Italian Minister of Economic Development, the Department of Energy, Information Engineering and Mathematical Models (DEIM) of Palermo University and ENEA Research Centre of Bologna, Italy are performing several research activities to study physical models and mathematical approaches aimed at investigating dry deposition mechanisms of radioactive pollutants. On the basis of such studies, a new approach to evaluate the dry deposition velocity for particles is proposed. Comparisons with some literature experimental data show that the proposed dry deposition scheme can capture the main phenomena involved in the dry deposition process successfully.
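
    For orientation, a widely used resistance-analogy parameterization of the particle dry deposition velocity is sketched below. It is included as background on the kind of scheme being evaluated and is not the new evaluation schema proposed by the authors; the resistance and settling values in the example are hypothetical.

```python
def deposition_velocity(r_a, r_b, v_s):
    """Particle dry deposition velocity from the resistance analogy:
    v_d = v_s + 1 / (r_a + r_b + r_a * r_b * v_s), where r_a is the aerodynamic
    resistance [s/m], r_b the quasi-laminar sublayer resistance [s/m], and
    v_s the gravitational settling velocity [m/s]."""
    return v_s + 1.0 / (r_a + r_b + r_a * r_b * v_s)

# Hypothetical values for a ~1 micrometre particle over a grassland surface
print(f"v_d = {deposition_velocity(r_a=50.0, r_b=300.0, v_s=3e-5):.4f} m/s")
```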

  19. Family members' involvement in psychiatric care: experiences of the healthcare professionals' approach and feeling of alienation.

    Science.gov (United States)

    Ewertzon, M; Lützén, K; Svensson, E; Andershed, B

    2010-06-01

    The involvement of family members in psychiatric care is important for the recovery of persons with psychotic disorders and subsequently reduces the burden on the family. Earlier qualitative studies suggest that the participation of family members can be limited by how they experience the professionals' approach, which suggests a connection to the concept of alienation. Thus, the aim of this study was to investigate, in a national sample, family members' experiences of the psychiatric healthcare professionals' approach. Data were collected by the Family Involvement and Alienation Questionnaire. The median level and quartiles were used to describe the distributions, and data were analysed with non-parametric statistical methods. Seventy family members of persons receiving psychiatric care participated in the study. The results indicate that a majority of the participants reported experiencing a negative approach from the professionals, indicating a lack of confirmation and cooperation. The results also indicate that a majority of the participants felt powerlessness and social isolation in the care being provided, indicating feelings of alienation. A significant but weak association was found between the family members' experiences of the professionals' approach and their feelings of alienation.

  20. Involving people with learning disabilities in nurse education: towards an inclusive approach.

    Science.gov (United States)

    Bollard, Martin; Lahiff, John; Parkes, Neville

    2012-02-01

    There is limited evidence that explores how to effectively include people with learning disabilities in nurse education in the U.K. The majority of reported work relates to mental health nursing and social work training (Morgan and Jones, 2009). This paper specifically reports on the processes and activities undertaken by the authors with people with learning disabilities in the development of a new BSc learning disability nursing programme, a specific branch of nursing in the U.K. In doing so, findings and discussion from two separate projects involving students and people with learning disabilities will be integrated into the paper. EPICURE (Engagement, Processing, Interpretation, Critique, Usefulness, Relevance and Ethics) (Stige et al., 2009) is adopted as a qualitative framework throughout the paper to evaluate the reported work that took place between September 2006 and October 2010. Suggestions are therefore made regarding the benefits and challenges of striving towards an inclusive approach to user involvement in nurse education, with particular reference to learning disability. The work presented in the paper demonstrates how through careful involvement of this population, deeper learning opportunities for all nursing students can be created. Copyright © 2011 Elsevier Ltd. All rights reserved.

  1. Environmental Management Model for Road Maintenance Operation Involving Community Participation

    Science.gov (United States)

    Triyono, A. R. H.; Setyawan, A.; Sobriyah; Setiono, P.

    2017-07-01

    Public expectations in Central Java are very high regarding demand fulfillment, especially for road infrastructure, as reflected in the number of complaints and community expectations expressed via Twitter, Short Message Service (SMS), e-mail and public reports in various media. The Highways Department of Central Java province therefore requires a model of environmental management for routine road maintenance that involves the community, so that roads remain representative and can serve road users safely and comfortably. This study used a survey method with SEM and SWOT analysis, with latent independent variables (X), namely: Community Participation in the regulation, development, construction and supervision of roads (PSM); Public behavior in the utilization of the road (PMJ); Provincial Road Service (PJP); Safety on the Provincial Road (KJP); Integrated Management System (SMT); and a latent dependent variable (Y), routine maintenance of the provincial road that is integrated with the environmental management system and involves the participation of the community (MML). The results showed that the implementation of routine road maintenance in Central Java province does not yet apply environmental management involving the community; therefore, an environmental management model was developed, with the results: H1: Community Participation (PSM) has a positive influence on the Environmental Management Model (MML); H2: Public Behavior in Road Utilization (PMJ) has a positive influence on the Environmental Management Model (MML); H3: Provincial Road Service (PJP) has a positive influence on the Environmental Management Model (MML); H4: Safety on the Provincial Road (KJP) has a positive influence on the Environmental Management Model (MML); H5: the Integrated Management System (SMT) has a positive influence on the Environmental Management Model (MML). From the analysis, a model formulation was obtained describing the relationship/influence of the independent variables PSM, PMJ, PJP, KJP, and SMT on the dependent variable

  2. Surrogate model approach for improving the performance of reactive transport simulations

    Science.gov (United States)

    Jatnieks, Janis; De Lucia, Marco; Sips, Mike; Dransch, Doris

    2016-04-01

    Reactive transport models can serve a large number of important geoscientific applications involving underground resources in industry and scientific research. It is common for simulation of reactive transport to consist of at least two coupled simulation models. First is a hydrodynamics simulator that is responsible for simulating the flow of groundwaters and transport of solutes. Hydrodynamics simulators are well established technology and can be very efficient. When hydrodynamics simulations are performed without coupled geochemistry, their spatial geometries can span millions of elements even when running on desktop workstations. Second is a geochemical simulation model that is coupled to the hydrodynamics simulator. Geochemical simulation models are much more computationally costly. This is a problem that makes reactive transport simulations spanning millions of spatial elements very difficult to achieve. To address this problem we propose to replace the coupled geochemical simulation model with a surrogate model. A surrogate is a statistical model created to include only the necessary subset of simulator complexity for a particular scenario. To demonstrate the viability of such an approach we tested it on a popular reactive transport benchmark problem that involves 1D Calcite transport. This is a published benchmark problem (Kolditz, 2012) for simulation models and for this reason we use it to test the surrogate model approach. To do this we tried a number of statistical models available through the caret and DiceEval packages for R, to be used as surrogate models. These were trained on a randomly sampled subset of the input-output data from the geochemical simulation model used in the original reactive transport simulation. For validation we use the surrogate model to predict the simulator output using the part of the sampled input data that was not used for training the statistical model. For this scenario we find that the multivariate adaptive regression splines
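
    A minimal sketch of the surrogate idea in Python (the study itself used the R packages caret and DiceEval; this is not its code): sample the expensive geochemical simulator, train a cheap statistical model on part of the input-output pairs, and validate its predictions on the held-out part. The `expensive_geochemistry` function is a stand-in for the real simulator, and the regressor choice is an assumption.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

def expensive_geochemistry(x):
    """Stand-in for the coupled geochemical simulator: an arbitrary smooth
    nonlinear response to two input variables."""
    return np.sin(3.0 * x[:, 0]) * np.exp(-x[:, 1]) + 0.1 * x[:, 1]

rng = np.random.default_rng(42)
X = rng.uniform(0.0, 1.0, size=(2000, 2))   # randomly sampled simulator inputs
y = expensive_geochemistry(X)               # corresponding simulator outputs

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)
surrogate = RandomForestRegressor(n_estimators=200, random_state=0)
surrogate.fit(X_train, y_train)             # train on a subset of simulator runs

# Validate on the input-output pairs that were not used for training
print("held-out R^2:", round(r2_score(y_test, surrogate.predict(X_test)), 3))
```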

  3. Evolutionary modeling-based approach for model errors correction

    Directory of Open Access Journals (Sweden)

    S. Q. Wan

    2012-08-01

    Full Text Available The inverse problem of using the information in historical data to estimate model errors is one of the frontier research topics in science. In this study, we investigate such a problem using the classic Lorenz (1963) equation as a prediction model and the Lorenz equation with a periodic evolutionary function as an accurate representation of reality to generate "observational data."

    On the basis of the intelligent features of evolutionary modeling (EM), including self-organization, self-adaptation and self-learning, the dynamic information contained in the historical data can be identified and extracted by the computer automatically. Thereby, a new approach is proposed to estimate model errors based on EM in the present paper. Numerical tests demonstrate the ability of the new approach to correct model structural errors. In fact, it can actualize the combination of statistics and dynamics to a certain extent.
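
    The twin-experiment setup described above can be sketched as follows: the classic Lorenz (1963) system serves as the imperfect prediction model, the same system with an added periodic term plays the role of "reality" and generates the observational data, and the difference between the two trajectories is the model error that an evolutionary-modeling scheme would try to identify. The form and amplitude of the periodic term and the integration settings are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp

SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0

def lorenz(t, v):
    """Classic Lorenz (1963) system used as the prediction model."""
    x, y, z = v
    return [SIGMA * (y - x), x * (RHO - z) - y, x * y - BETA * z]

def lorenz_truth(t, v, amp=2.0, omega=0.5):
    """'Reality': the Lorenz system with an assumed periodic evolutionary term."""
    dx, dy, dz = lorenz(t, v)
    return [dx + amp * np.sin(omega * t), dy, dz]

t_eval = np.linspace(0.0, 5.0, 500)
v0 = [1.0, 1.0, 1.0]
obs = solve_ivp(lorenz_truth, (0.0, 5.0), v0, t_eval=t_eval)   # "observational data"
pred = solve_ivp(lorenz, (0.0, 5.0), v0, t_eval=t_eval)        # imperfect model run

model_error = obs.y - pred.y     # the signal a model-error estimation scheme targets
print("RMS model error per component:", np.sqrt((model_error ** 2).mean(axis=1)))
```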

  4. HEDR modeling approach

    International Nuclear Information System (INIS)

    Shipler, D.B.; Napier, B.A.

    1992-07-01

    This report details the conceptual approaches to be used in calculating radiation doses to individuals throughout the various periods of operations at the Hanford Site. The report considers the major environmental transport pathways--atmospheric, surface water, and ground water--and projects an appropriate modeling technique for each. The modeling sequence chosen for each pathway depends on the available data on doses, the degree of confidence justified by such existing data, and the level of sophistication deemed appropriate for the particular pathway and time period being considered

  5. Modeling drug- and chemical- induced hepatotoxicity with systems biology approaches

    Directory of Open Access Journals (Sweden)

    Sudin eBhattacharya

    2012-12-01

    Full Text Available We provide an overview of computational systems biology approaches as applied to the study of chemical- and drug-induced toxicity. The concept of 'toxicity pathways' is described in the context of the 2007 US National Academies of Science report, Toxicity Testing in the 21st Century: A Vision and A Strategy. Pathway mapping and modeling based on network biology concepts are a key component of the vision laid out in this report for a more biologically-based analysis of dose-response behavior and the safety of chemicals and drugs. We focus on toxicity of the liver (hepatotoxicity) – a complex phenotypic response with contributions from a number of different cell types and biological processes. We describe three case studies of complementary multi-scale computational modeling approaches to understand perturbation of toxicity pathways in the human liver as a result of exposure to environmental contaminants and specific drugs. One approach involves development of a spatial, multicellular virtual tissue model of the liver lobule that combines molecular circuits in individual hepatocytes with cell-cell interactions and blood-mediated transport of toxicants through hepatic sinusoids, to enable quantitative, mechanistic prediction of hepatic dose-response for activation of the AhR toxicity pathway. Simultaneously, methods are being developed to extract quantitative maps of intracellular signaling and transcriptional regulatory networks perturbed by environmental contaminants, using a combination of gene expression and genome-wide protein-DNA interaction data. A predictive physiological model (DILIsymTM) to understand drug-induced liver injury (DILI), the most common adverse event leading to termination of clinical development programs and regulatory actions on drugs, is also described. The model initially focuses on reactive metabolite-induced DILI in response to administration of acetaminophen, and spans multiple biological scales.

  6. DISCRETIZATION APPROACH USING RAY-TESTING MODEL IN PARTING LINE AND PARTING SURFACE GENERATION

    Institute of Scientific and Technical Information of China (English)

    HAN Jianwen; JIAN Bin; YAN Guangrong; LEI Yi

    2007-01-01

    Surface classification, 3D parting line and parting surface generation, and demoldability analysis, which help select the optimal parting direction and optimal parting line, are involved in automatic cavity design based on the ray-testing model. A new ray-testing approach is presented to classify the part surfaces into core/cavity surfaces and undercut surfaces by automatically identifying the visibility of surfaces. A simple, direct and efficient algorithm to identify surface visibility is developed. The algorithm is robust and adapted to rather complicated geometry, so it is valuable in computer-aided mold design systems. To validate the efficiency of the approach, an experimental program is implemented. Case studies show that the approach is practical and valuable in automatic parting line and parting surface generation.
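
    A much-simplified sketch of the visibility idea behind ray-testing: for each facet of a triangulated part, a ray is cast from the facet centroid along the candidate parting direction; a facet that faces that direction and whose ray is unobstructed is classified as a cavity-side surface (and symmetrically for the core side along the opposite direction), while a facet visible from neither direction is an undercut. The brute-force ray-triangle test and the assumption of consistently oriented outward normals are illustrative simplifications, not the algorithm of the paper.

```python
import numpy as np

def ray_hits_triangle(origin, direction, tri, eps=1e-9):
    """Moeller-Trumbore ray/triangle test; True if the ray hits the triangle in front."""
    v0, v1, v2 = tri
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direction, e2)
    det = e1.dot(p)
    if abs(det) < eps:                       # ray parallel to the triangle plane
        return False
    inv = 1.0 / det
    s = origin - v0
    u = s.dot(p) * inv
    if u < 0.0 or u > 1.0:
        return False
    q = np.cross(s, e1)
    v = direction.dot(q) * inv
    if v < 0.0 or u + v > 1.0:
        return False
    return e2.dot(q) * inv > eps             # positive hit distance along the ray

def classify_facets(triangles, parting_dir):
    """Label each facet 'cavity', 'core' or 'undercut' by visibility along +/- parting_dir.
    Assumes consistent vertex winding so the cross product gives outward normals."""
    parting_dir = parting_dir / np.linalg.norm(parting_dir)
    labels = []
    for i, tri in enumerate(triangles):
        centroid = tri.mean(axis=0)
        normal = np.cross(tri[1] - tri[0], tri[2] - tri[0])
        for d, label in ((parting_dir, "cavity"), (-parting_dir, "core")):
            facing = normal.dot(d) > 0.0
            blocked = any(ray_hits_triangle(centroid + 1e-6 * d, d, other)
                          for j, other in enumerate(triangles) if j != i)
            if facing and not blocked:
                labels.append(label)
                break
        else:
            labels.append("undercut")
    return labels

# Trivial demonstration: one facet facing +z, one facing -z, parting direction +z
tris = [np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0]], float),   # outward normal +z
        np.array([[0, 0, 0], [0, 1, 0], [1, 0, 0]], float)]   # outward normal -z
print(classify_facets(tris, np.array([0.0, 0.0, 1.0])))       # ['cavity', 'core']
```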

  7. A semi-analytical refrigeration cycle modelling approach for a heat pump hot water heater

    Science.gov (United States)

    Panaras, G.; Mathioulakis, E.; Belessiotis, V.

    2018-04-01

    The use of heat pump systems in applications like the production of hot water or space heating makes the modelling of the involved processes important, both for evaluating the performance of existing systems and for design purposes. The proposed semi-analytical model offers the opportunity to estimate the performance of a heat pump system producing hot water without using detailed geometrical data or any performance data. This is important, as for many commercial systems the type and characteristics of the involved subcomponents can hardly be detected, thus not allowing the implementation of more analytical approaches or the exploitation of the manufacturers' catalogue performance data. The analysis copes with the issues related to the development of the models of the subcomponents involved in the studied system. Issues not discussed thoroughly in the existing literature, such as the refrigerant mass inventory in the case an accumulator is present, are examined effectively.

  8. Addressing dependability by applying an approach for model-based risk assessment

    International Nuclear Information System (INIS)

    Gran, Bjorn Axel; Fredriksen, Rune; Thunem, Atoosa P.-J.

    2007-01-01

    This paper describes how an approach for model-based risk assessment (MBRA) can be applied for addressing different dependability factors in a critical application. Dependability factors, such as availability, reliability, safety and security, are important when assessing the dependability degree of total systems involving digital instrumentation and control (I and C) sub-systems. In order to identify risk sources, their roles with regard to intentional system aspects such as system functions, component behaviours and intercommunications must be clarified. Traditional risk assessment is based on fault or risk models of the system. In contrast to this, MBRA utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tried out within the telemedicine and e-commerce areas, and provided through a series of seven trials a sound basis for risk assessments. In this paper the results from the CORAS project are presented, and it is discussed how the approach for applying MBRA meets the needs of a risk-informed Man-Technology-Organization (MTO) model, and how the methodology can be applied as part of trust case development.

  9. Addressing dependability by applying an approach for model-based risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjorn Axel [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: bjorn.axel.gran@hrp.no; Fredriksen, Rune [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: rune.fredriksen@hrp.no; Thunem, Atoosa P.-J. [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: atoosa.p-j.thunem@hrp.no

    2007-11-15

    This paper describes how an approach for model-based risk assessment (MBRA) can be applied for addressing different dependability factors in a critical application. Dependability factors, such as availability, reliability, safety and security, are important when assessing the dependability degree of total systems involving digital instrumentation and control (I and C) sub-systems. In order to identify risk sources, their roles with regard to intentional system aspects such as system functions, component behaviours and intercommunications must be clarified. Traditional risk assessment is based on fault or risk models of the system. In contrast to this, MBRA utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tried out within the telemedicine and e-commerce areas, and provided through a series of seven trials a sound basis for risk assessments. In this paper the results from the CORAS project are presented, and it is discussed how the approach for applying MBRA meets the needs of a risk-informed Man-Technology-Organization (MTO) model, and how the methodology can be applied as part of trust case development.

  10. A Statistical Approach For Modeling Tropical Cyclones. Synthetic Hurricanes Generator Model

    Energy Technology Data Exchange (ETDEWEB)

    Pasqualini, Donatella [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-11

    This manuscript briefly describes a statistical approach to generate synthetic tropical cyclone tracks to be used in risk evaluations. The Synthetic Hurricane Generator (SynHurG) model allows modeling hurricane risk in the United States, supporting decision makers and implementations of adaptation strategies to extreme weather. In the literature there are mainly two approaches to model hurricane hazard for risk prediction: deterministic-statistical approaches, where the storm key physical parameters are calculated using complex physical climate models and the tracks are usually determined statistically from historical data; and statistical approaches, where both variables and tracks are estimated stochastically using historical records. SynHurG falls in the second category, adopting a pure stochastic approach.
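
    A toy illustration of the purely stochastic idea is sketched below: tracks are generated as a random walk from a genesis point. This is not the SynHurG model; the genesis location and motion statistics are hypothetical placeholders.

```python
# Illustrative stochastic track generator (a toy random-walk sketch, not the
# SynHurG model); genesis points and motion statistics are hypothetical.
import numpy as np

rng = np.random.default_rng(42)

def synthetic_track(n_steps=30, genesis=(15.0, -45.0),
                    mean_step=(0.6, -1.2), step_sd=(0.3, 0.4)):
    """Generate one synthetic track as (lat, lon) positions at fixed time steps."""
    lat, lon = genesis
    track = [(lat, lon)]
    for _ in range(n_steps):
        dlat = rng.normal(mean_step[0], step_sd[0])   # poleward drift
        dlon = rng.normal(mean_step[1], step_sd[1])   # westward drift
        lat, lon = lat + dlat, lon + dlon
        track.append((lat, lon))
    return np.array(track)

tracks = [synthetic_track() for _ in range(100)]  # a small synthetic catalogue
print(tracks[0][:3])
```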

  11. Teacher Training in Family Involvement: An Interpersonal Approach.

    Science.gov (United States)

    Coleman, Mick; Wallinga, Charlotte

    2000-01-01

    Discusses ways to develop family-school-community involvement, based on an early childhood teacher training course in family involvement. Discusses strategies for using Maslow's Hierarchy of Needs to facilitate family involvement interactions, and using student teachers' experiences for structuring reflective thought about family involvement…

  12. Application of various FLD modelling approaches

    Science.gov (United States)

    Banabic, D.; Aretz, H.; Paraianu, L.; Jurco, P.

    2005-07-01

    This paper focuses on a comparison between different modelling approaches to predict the forming limit diagram (FLD) for sheet metal forming under a linear strain path using the recently introduced orthotropic yield criterion BBC2003 (Banabic D et al 2005 Int. J. Plasticity 21 493-512). The FLD models considered here are a finite element based approach, the well known Marciniak-Kuczynski model, the modified maximum force criterion according to Hora et al (1996 Proc. Numisheet'96 Conf. (Dearborn/Michigan) pp 252-6), Swift's diffuse necking approach (Swift H W 1952 J. Mech. Phys. Solids 1 1-18) and Hill's classical localized necking approach (Hill R 1952 J. Mech. Phys. Solids 1 19-30). The FLD of an AA5182-O aluminium sheet alloy has been determined experimentally in order to quantify the predictive capabilities of the models mentioned above.

  13. Parents as Role Models: Parental Behavior Affects Adolescents' Plans for Work Involvement

    Science.gov (United States)

    Wiese, Bettina S.; Freund, Alexandra M.

    2011-01-01

    This study (N = 520 high-school students) investigates the influence of parental work involvement on adolescents' own plans regarding their future work involvement. As expected, adolescents' perceptions of parental work behavior affected their plans for own work involvement. Same-sex parents served as main role models for the adolescents' own…

  14. A Unified Approach to Modeling and Programming

    DEFF Research Database (Denmark)

    Madsen, Ole Lehrmann; Møller-Pedersen, Birger

    2010-01-01

    SIMULA was a language for modeling and programming and provided a unified approach to modeling and programming in contrast to methodologies based on structured analysis and design. The current development seems to be going in the direction of separation of modeling and programming. The goal of this paper is to go back to the future and get inspiration from SIMULA and propose a unified approach. In addition to reintroducing the contributions of SIMULA and the Scandinavian approach to object-oriented programming, we do this by discussing a number of issues in modeling and programming and argue why we...

  15. On the modelling of microsegregation in steels involving thermodynamic databases

    International Nuclear Information System (INIS)

    You, D; Bernhard, C; Michelic, S; Wieser, G; Presoly, P

    2016-01-01

    A microsegregation model based on Ohnaka's model and involving a thermodynamic database is proposed. In the model, the thermodynamic database is applied for the equilibrium calculations. Multicomponent alloy effects on partition coefficients and equilibrium temperatures are accounted for. Microsegregation and partition coefficients calculated using different databases exhibit significant differences. The segregated concentrations predicted using the optimized database are in good agreement with the measured inter-dendritic concentrations. (paper)
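
    For orientation, a minimal Scheil-type solute-redistribution sketch with a back-diffusion parameter, in the spirit of Ohnaka-type models, is given below. It is illustrative only: the partition coefficient is held constant and there is no thermodynamic-database coupling as in the paper. With beta = 0 the expression reduces to the Scheil equation, and with beta = 1 to the lever rule.

```python
# Illustrative microsegregation sketch with a back-diffusion parameter beta
# (constant partition coefficient k; no thermodynamic database coupling).
import numpy as np

def liquid_concentration(fs, c0=0.5, k=0.2, beta=0.3):
    """Liquid concentration C_L as a function of solid fraction fs."""
    fs = np.asarray(fs, dtype=float)
    expo = (k - 1.0) / (1.0 - beta * k)
    return c0 * (1.0 - (1.0 - beta * k) * fs) ** expo

k = 0.2
for fs in np.linspace(0.0, 0.95, 5):
    cl = float(liquid_concentration(fs, k=k))
    print(f"fs = {fs:.2f}   C_L = {cl:.3f}   C_S* = {k * cl:.3f}")
```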

  16. Setting up recovery clinics and promoting service user involvement.

    Science.gov (United States)

    John, Thomas

    2017-06-22

    Service user involvement in mental health has gained considerable momentum. Evidence from the literature suggests that it remains largely theoretical rather than being put into practice. The current nature of acute inpatient mental health units creates various challenges for nurses in putting this concept into practice. Recovery clinics were introduced to bridge this gap and to promote service user involvement practice within the current care delivery model at Kent and Medway NHS and Social Care Partnership Trust. It has shaped new ways of working for nurses, with a person-centred approach as its philosophy. Service users and nurses were involved in implementing a needs-led and bottom-up initiative using Kotter's change model. Initial results suggest that it has been successful in meeting its objectives, as evidenced by increased meaningful interactions and involvement in care by service users and carers. The clinics have gained wide recognition and have highlighted a need for further research into care delivery models to promote service user involvement in these units.

  17. The necessary burden of involving stakeholders in agent-based modelling for education and decision-making

    Science.gov (United States)

    Bommel, P.; Bautista Solís, P.; Leclerc, G.

    2016-12-01

    We implemented a participatory process with water stakeholders for improving resilience to drought at the watershed scale, and for reducing water pollution disputes, in drought-prone northwestern Costa Rica. The purpose is to facilitate co-management in a rural watershed impacted by recurrent droughts related to ENSO. The process involved designing "ContaMiCuenca", a hybrid agent-based model where users can specify the decisions of their agents. We followed a Companion Modeling approach (www.commod.org) and organized 10 workshops that included research techniques such as participatory diagnostics, actor-resources-interaction and UML diagrams, multi-agent model design, and interactive simulation sessions. We collectively assessed the main water issues in the watershed, prioritized their importance, defined the objectives of the process, and pilot-tested ContaMiCuenca for environmental education with adults and children. Simulation sessions resulted in debates about the need to improve the model accuracy, arguably more relevant for decision-making. This helped identify sensible knowledge gaps in groundwater pollution and aquifer dynamics that need to be addressed in order to improve our collective learning. Significant mismatches among participants' expectations, objectives, and agendas considerably slowed down the participatory process. The main issue may originate in participants expecting technical solutions from a positivist science, as constantly promoted in the region by dole-out initiatives, which is incompatible with the constructivist stance of participatory modellers. This requires much closer interaction of community members with modellers, which may be hard to attain in the current research practice and institutional context. Nevertheless, overcoming these constraints is necessary for a true involvement of water stakeholders to achieve community-based decisions that facilitate integrated water management. Our findings provide significant guidance for

  18. Technical note: Comparison of methane ebullition modelling approaches used in terrestrial wetland models

    Science.gov (United States)

    Peltola, Olli; Raivonen, Maarit; Li, Xuefei; Vesala, Timo

    2018-02-01

    Emission via bubbling, i.e. ebullition, is one of the main methane (CH4) emission pathways from wetlands to the atmosphere. Direct measurement of gas bubble formation, growth and release in the peat-water matrix is challenging and in consequence these processes are relatively unknown and are coarsely represented in current wetland CH4 emission models. In this study we aimed to evaluate three ebullition modelling approaches and their effect on model performance. This was achieved by implementing the three approaches in one process-based CH4 emission model. All the approaches were based on some kind of threshold: either a CH4 pore water concentration (ECT), pressure (EPT) or free-phase gas volume (EBG) threshold. The model was run using 4 years of data from a boreal sedge fen and the results were compared with eddy covariance measurements of CH4 fluxes. Modelled annual CH4 emissions were largely unaffected by the different ebullition modelling approaches; however, temporal variability in CH4 emissions varied by an order of magnitude between the approaches. Hence the ebullition modelling approach drives the temporal variability in modelled CH4 emissions and therefore significantly impacts, for instance, high-frequency (daily scale) model comparison and calibration against measurements. The modelling approach based on the most recent knowledge of the ebullition process (volume threshold, EBG) agreed the best with the measured fluxes (R2 = 0.63) and hence produced the most reasonable results, although there was a scale mismatch between the measurements (ecosystem scale with heterogeneous ebullition locations) and model results (single horizontally homogeneous peat column). The approach should be favoured over the two other more widely used ebullition modelling approaches and researchers are encouraged to implement it into their CH4 emission models.
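
    The three generic threshold ideas can be sketched as simple trigger functions, as below. This is illustrative only, not the authors' process-based CH4 model, and the threshold values and state variables are hypothetical placeholders.

```python
# Illustrative threshold-based ebullition triggers (a sketch of the three
# generic approaches compared in the study, not the authors' full CH4 model).
def ebullition_concentration(ch4_conc, conc_threshold=500.0):
    """ECT: release the excess once pore-water CH4 exceeds a concentration threshold."""
    excess = max(0.0, ch4_conc - conc_threshold)
    return excess, ch4_conc - excess

def ebullition_pressure(total_gas_pressure, hydrostatic_pressure, fraction=0.1):
    """EPT: release a fixed fraction of gas once total gas pressure exceeds ambient."""
    return fraction if total_gas_pressure > hydrostatic_pressure else 0.0

def ebullition_volume(gas_volume_fraction, volume_threshold=0.1):
    """EBG: release free-phase gas above a volumetric threshold of the pore space."""
    return max(0.0, gas_volume_fraction - volume_threshold)

print(ebullition_concentration(620.0))    # (120.0, 500.0)
print(ebullition_pressure(115.0, 110.0))  # 0.1
print(ebullition_volume(0.14))            # approximately 0.04
```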

  19. Time-dependent evolution of rock slopes by a multi-modelling approach

    Science.gov (United States)

    Bozzano, F.; Della Seta, M.; Martino, S.

    2016-06-01

    This paper presents a multi-modelling approach that incorporates contributions from morpho-evolutionary modelling, detailed engineering-geological modelling and time-dependent stress-strain numerical modelling to analyse the rheological evolution of a river valley slope over approximately 10² kyr. The slope is located in a transient, tectonically active landscape in southwestern Tyrrhenian Calabria (Italy), where gravitational processes drive failures in rock slopes. Constraints on the valley profile development were provided by a morpho-evolutionary model based on the correlation of marine and river strath terraces. Rock mass classes were identified through geomechanical parameters that were derived from engineering-geological surveys and outputs of a multi-sensor slope monitoring system. The rock mass classes were associated to lithotechnical units to obtain a high-resolution engineering-geological model along a cross section of the valley. Time-dependent stress-strain numerical modelling reproduced the main morpho-evolutionary stages of the valley slopes. The findings demonstrate that a complex combination of eustatism, uplift and Mass Rock Creep (MRC) deformations can lead to first-time failures of rock slopes when unstable conditions are encountered up to the generation of stress-controlled shear zones. The multi-modelling approach enabled us to determine that such complex combinations may have been sufficient for the first-time failure of the S. Giovanni slope at approximately 140 ka (MIS 7), even without invoking any trigger. Conversely, further reactivations of the landslide must be related to triggers such as earthquakes, rainfall and anthropogenic activities. This failure involved a portion of the slope where a plasticity zone resulted from mass rock creep that evolved with a maximum strain rate of 40% per thousand years, after the formation of a river strath terrace. This study demonstrates that the multi-modelling approach presented herein is a useful

  20. Reflections on Practical Approaches to Involving Children and Young People in the Data Analysis Process

    Science.gov (United States)

    Coad, Jane; Evans, Ruth

    2008-01-01

    This article reflects on key methodological issues emerging from children and young people's involvement in data analysis processes. We outline a pragmatic framework illustrating different approaches to engaging children, using two case studies of children's experiences of participating in data analysis. The article highlights methods of…

  1. System Behavior Models: A Survey of Approaches

    Science.gov (United States)

    2016-06-01

    A spiral model was chosen for researching and structuring this thesis, shown in Figure 1. This approach allowed multiple iterations of source material, applications and refining through iteration. The research is limited to a literature review…

  2. Set-Theoretic Approach to Maturity Models

    DEFF Research Database (Denmark)

    Lasrado, Lester Allan

    Despite being widely accepted and applied, maturity models in Information Systems (IS) have been criticized for the lack of theoretical grounding, methodological rigor, empirical validations, and ignorance of multiple and non-linear paths to maturity. This PhD thesis focuses on addressing...... these criticisms by incorporating recent developments in configuration theory, in particular application of set-theoretic approaches. The aim is to show the potential of employing a set-theoretic approach for maturity model research and empirically demonstrating equifinal paths to maturity. Specifically...... methodological guidelines consisting of detailed procedures to systematically apply set-theoretic approaches for maturity model research and provides demonstrations of its application on three datasets. The thesis is a collection of six research papers that are written in a sequential manner. The first paper...

  3. Complement Involvement in Periodontitis: Molecular Mechanisms and Rational Therapeutic Approaches.

    Science.gov (United States)

    Hajishengallis, George; Maekawa, Tomoki; Abe, Toshiharu; Hajishengallis, Evlambia; Lambris, John D

    2015-01-01

    The complement system is a network of interacting fluid-phase and cell surface-associated molecules that trigger, amplify, and regulate immune and inflammatory signaling pathways. Dysregulation of this finely balanced network can destabilize host-microbe homeostasis and cause inflammatory tissue damage. Evidence from clinical and animal model-based studies suggests that complement is implicated in the pathogenesis of periodontitis, a polymicrobial community-induced chronic inflammatory disease that destroys the tooth-supporting tissues. This review discusses molecular mechanisms of complement involvement in the dysbiotic transformation of the periodontal microbiome and the resulting destructive inflammation, culminating in loss of periodontal bone support. These mechanistic studies have additionally identified potential therapeutic targets. In this regard, interventional studies in preclinical models have provided proof-of-concept for using complement inhibitors for the treatment of human periodontitis.

  4. Improving Baseline Model Assumptions: Evaluating the Impacts of Typical Methodological Approaches in Watershed Models

    Science.gov (United States)

    Muenich, R. L.; Kalcic, M. M.; Teshager, A. D.; Long, C. M.; Wang, Y. C.; Scavia, D.

    2017-12-01

    Thanks to the availability of open-source software, online tutorials, and advanced software capabilities, watershed modeling has expanded its user base and applications significantly in the past thirty years. Even complicated models like the Soil and Water Assessment Tool (SWAT) are being used and documented in hundreds of peer-reviewed publications each year, and are likely applied even more widely in practice. These models can help improve our understanding of present, past, and future conditions, or analyze important "what-if" management scenarios. However, baseline data and methods are often adopted and applied without rigorous testing. In multiple collaborative projects, we have evaluated the influence of some of these common approaches on model results. Specifically, we examined the impacts of baseline data and assumptions involved in manure application, combined sewer overflows, and climate data incorporation across multiple watersheds in the Western Lake Erie Basin. In these efforts, we seek to understand the impact of using typical modeling data and assumptions versus improved data and enhanced assumptions on model outcomes and, ultimately, study conclusions. We provide guidance for modelers as they adopt and apply data and models for their specific study region. While it is difficult to quantitatively assess the full uncertainty surrounding model input data and assumptions, recognizing the impacts of model input choices is important when considering actions at both the field and watershed scales.

  5. Sport Education as a Curriculum Approach to Student Learning of Invasion Games: Effects on Game Performance and Game Involvement.

    Science.gov (United States)

    Farias, Cláudio; Valério, Carla; Mesquita, Isabel

    2018-03-01

    The teaching and learning of games and sport-based activities has historically been the dominant form of the physical education curricula. With an interest in providing students with meaningful and culturally situated sporting experiences, Sport Education is probably the most implemented and researched pedagogical model worldwide. However, although there is considerable evidence that the model as a curriculum approach can benefit the development of social goals and healthy sport behaviors, not a single study to date has examined students' game-play development beyond participation in single and isolated teaching units. Therefore, the purpose of this study was to examine students' development of Game Performance and Game Involvement during participation in three consecutive Sport Education seasons of invasion games. The participants were an experienced physical education teacher and one seventh-grade class totaling 26 students (10 girls and 16 boys). Using the Game Performance Assessment Instrument (Oslin et al., 1998), pre-test to post-test measures of students' Game Performance and Game Involvement were collected during their participation in basketball (20 lessons), handball (16 lessons), and football (18 lessons) units. Inter-group differences and pre-test to post-test improvements within each season were analyzed through 2 (time) x group (sport) repeated measures ANOVA tests. Significant pre-test to post-test improvements in Game Performance and Game Involvement were found in the second (handball) and third (football) seasons, but not in the first season (basketball). Students' Game Performance and Involvement scores in handball and football were significantly higher than their scores while playing basketball. The opportunity for an extended engagement in game-play activities and prolonged membership of students in the same teams throughout three consecutive seasons of Sport Education were key to the outcomes found. The specific configurations of the game

  6. Challenges and opportunities for integrating lake ecosystem modelling approaches

    Science.gov (United States)

    Mooij, Wolf M.; Trolle, Dennis; Jeppesen, Erik; Arhonditsis, George; Belolipetsky, Pavel V.; Chitamwebwa, Deonatus B.R.; Degermendzhy, Andrey G.; DeAngelis, Donald L.; Domis, Lisette N. De Senerpont; Downing, Andrea S.; Elliott, J. Alex; Ruberto, Carlos Ruberto; Gaedke, Ursula; Genova, Svetlana N.; Gulati, Ramesh D.; Hakanson, Lars; Hamilton, David P.; Hipsey, Matthew R.; Hoen, Jochem 't; Hulsmann, Stephan; Los, F. Hans; Makler-Pick, Vardit; Petzoldt, Thomas; Prokopkin, Igor G.; Rinke, Karsten; Schep, Sebastiaan A.; Tominaga, Koji; Van Dam, Anne A.; Van Nes, Egbert H.; Wells, Scott A.; Janse, Jan H.

    2010-01-01

    A large number and wide variety of lake ecosystem models have been developed and published during the past four decades. We identify two challenges for making further progress in this field. One such challenge is to avoid developing more models largely following the concept of others ('reinventing the wheel'). The other challenge is to avoid focusing on only one type of model, while ignoring new and diverse approaches that have become available ('having tunnel vision'). In this paper, we aim at improving the awareness of existing models and knowledge of concurrent approaches in lake ecosystem modelling, without covering all possible model tools and avenues. First, we present a broad variety of modelling approaches. To illustrate these approaches, we give brief descriptions of rather arbitrarily selected sets of specific models. We deal with static models (steady state and regression models), complex dynamic models (CAEDYM, CE-QUAL-W2, Delft 3D-ECO, LakeMab, LakeWeb, MyLake, PCLake, PROTECH, SALMO), structurally dynamic models and minimal dynamic models. We also discuss a group of approaches that could all be classified as individual based: super-individual models (Piscator, Charisma), physiologically structured models, stage-structured models and trait-based models. We briefly mention genetic algorithms, neural networks, Kalman filters and fuzzy logic. Thereafter, we zoom in, as an in-depth example, on the multi-decadal development and application of the lake ecosystem model PCLake and related models (PCLake Metamodel, Lake Shira Model, IPH-TRIM3D-PCLake). In the discussion, we argue that while the historical development of each approach and model is understandable given its 'leading principle', there are many opportunities for combining approaches. We take the point of view that a single 'right' approach does not exist and should not be strived for. Instead, multiple modelling approaches, applied concurrently to a given problem, can help develop an integrative

  7. Models of galaxies - The modal approach

    International Nuclear Information System (INIS)

    Lin, C.C.; Lowe, S.A.

    1990-01-01

    The general viability of the modal approach to the spiral structure in normal spirals and the barlike structure in certain barred spirals is discussed. The usefulness of the modal approach in the construction of models of such galaxies is examined, emphasizing the adoption of a model appropriate to observational data for both the spiral structure of a galaxy and its basic mass distribution. 44 refs

  8. A phasor approach analysis of multiphoton FLIM measurements of three-dimensional cell culture models

    Science.gov (United States)

    Lakner, P. H.; Möller, Y.; Olayioye, M. A.; Brucker, S. Y.; Schenke-Layland, K.; Monaghan, M. G.

    2016-03-01

    Fluorescence lifetime imaging microscopy (FLIM) is a useful approach to obtain information regarding the endogenous fluorophores present in biological samples. The concise evaluation of FLIM data requires the use of robust mathematical algorithms. In this study, we developed a user-friendly phasor approach for analyzing FLIM data and applied this method to three-dimensional (3D) Caco-2 models of polarized epithelial luminal cysts in a supporting extracellular matrix environment. These Caco-2 based models were treated with epidermal growth factor (EGF) to stimulate proliferation, in order to determine whether FLIM could detect such a change in cell behavior. Autofluorescence from nicotinamide adenine dinucleotide (phosphate) (NAD(P)H) in luminal Caco-2 cysts was stimulated by 2-photon laser excitation. Using a phasor approach, the lifetimes of the fluorophores involved and their contributions were calculated with fewer initial assumptions than with multiexponential decay fitting. The phasor approach simplified FLIM data analysis, making it an interesting tool for non-experts in numerical data analysis. We observed that increased proliferation stimulated by EGF led to a significant shift in fluorescence lifetime and a significant alteration of the phasor data shape. Our data demonstrate that multiphoton FLIM analysis with the phasor approach is a suitable method for the non-invasive analysis of 3D in vitro cell culture models, qualifying this method for monitoring basic cellular features and the effect of external factors.
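
    A minimal sketch of the phasor transform for a uniformly sampled time-domain decay is given below; it is illustrative and does not reproduce the study's analysis pipeline.

```python
# Minimal phasor-transform sketch for a time-domain FLIM decay. For a decay
# I(t) the phasor coordinates at angular frequency w are
#   g = sum(I * cos(w t)) / sum(I),   s = sum(I * sin(w t)) / sum(I),
# and a single-exponential decay with lifetime tau lies on the universal
# semicircle at g = 1/(1 + (w tau)^2), s = w tau/(1 + (w tau)^2).
import numpy as np

def phasor(time, decay, rep_rate_hz=80e6):
    """Return (g, s) phasor coordinates of a uniformly sampled decay curve."""
    w = 2.0 * np.pi * rep_rate_hz
    decay = np.asarray(decay, dtype=float)
    g = np.sum(decay * np.cos(w * time)) / np.sum(decay)
    s = np.sum(decay * np.sin(w * time)) / np.sum(decay)
    return g, s

# Synthetic single-exponential decay (tau = 2 ns) over one laser period.
tau = 2e-9
t = np.linspace(0.0, 12.5e-9, 256, endpoint=False)
g, s = phasor(t, np.exp(-t / tau))
wt = 2.0 * np.pi * 80e6 * tau
print(g, s)                              # numerical phasor
print(1/(1 + wt**2), wt/(1 + wt**2))     # analytical semicircle point
```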

  9. Building spatio-temporal database model based on ontological approach using relational database environment

    International Nuclear Information System (INIS)

    Mahmood, N.; Burney, S.M.A.

    2017-01-01

    Everything in this world is bounded by space and time. Our daily activities are closely linked to other objects in our vicinity; our current location, the time (past, present and future) and the events through which we move as objects therefore strongly affect our activities. Ontology development and its integration with databases are vital for a true understanding of complex systems involving both spatial and temporal dimensions. In this paper we propose a conceptual framework for building a spatio-temporal database model based on an ontological approach. We use the relational data model for modelling spatio-temporal data content and present our methodology with spatio-temporal ontological aspects and their transformation into a spatio-temporal database model. We illustrate the implementation of our conceptual model through a case study of a cultivated land parcel used for agriculture, to exhibit the spatio-temporal behaviour of agricultural land and related entities. Moreover, the framework provides a generic approach for designing spatio-temporal databases based on ontology. The proposed model is capable of capturing the ontological and, to some extent, epistemological commitments, building a spatio-temporal ontology and transforming it into a spatio-temporal data model. Finally, we highlight existing and future research challenges. (author)
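
    As a purely illustrative sketch of how such spatio-temporal content can sit in a relational table, the snippet below stores time-stamped states of a land parcel; the schema is hypothetical and does not reproduce the paper's model or its ontology mapping.

```python
# Hypothetical relational spatio-temporal table for a land-parcel case study
# (illustrative only). Uses an in-memory SQLite database for self-containment.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE land_parcel_state (
        parcel_id   TEXT,
        crop        TEXT,
        geometry    TEXT,   -- e.g. WKT polygon for the spatial extent
        valid_from  TEXT,   -- valid time: when this state began in reality
        valid_to    TEXT    -- valid time: when it ended (NULL if current)
    )
""")
conn.execute(
    "INSERT INTO land_parcel_state VALUES (?, ?, ?, ?, ?)",
    ("P-001", "wheat", "POLYGON((0 0, 0 1, 1 1, 1 0, 0 0))", "2016-10-01", None),
)
for row in conn.execute("SELECT * FROM land_parcel_state"):
    print(row)
```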

  10. Evaluation of alternative model-data fusion approaches in water balance estimation across Australia

    Science.gov (United States)

    van Dijk, A. I. J. M.; Renzullo, L. J.

    2009-04-01

    Australia's national agencies are developing a continental modelling system to provide a range of water information services. It will include rolling water balance estimation to underpin national water accounts, water resources assessments that interpret current water resources availability and trends in a historical context, and water resources predictions coupled to climate and weather forecasting. The nation-wide coverage, currency, accuracy, and consistency required mean that remote sensing will need to play an important role along with in-situ observations. Different approaches to blending models and observations can be considered. Integration of on-ground and remote sensing data into land surface models in atmospheric applications often involves state updating through model-data assimilation techniques. By comparison, retrospective water balance estimation and hydrological scenario modelling to date has mostly relied on static parameter fitting against observations and has made little use of earth observation. The model-data fusion approach most appropriate for a continental water balance estimation system will need to consider the trade-off between computational overhead and the accuracy gains achieved when using more sophisticated synthesis techniques and additional observations. This trade-off was investigated using a landscape hydrological model and satellite-based estimates of soil moisture and vegetation properties for several gauged test catchments in southeast Australia.
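
    The contrast drawn above between static parameter fitting and state updating can be illustrated with a minimal, scalar Kalman-style analysis step; the numbers are hypothetical and this is not the agencies' continental system.

```python
# Minimal sketch of state updating: blend a modelled soil-moisture state with
# a remotely sensed observation (scalar Kalman-style analysis; values are
# hypothetical placeholders).
def update_state(model_estimate, observation, model_var, obs_var):
    gain = model_var / (model_var + obs_var)          # Kalman gain (scalar case)
    analysis = model_estimate + gain * (observation - model_estimate)
    analysis_var = (1.0 - gain) * model_var
    return analysis, analysis_var

print(update_state(model_estimate=0.25, observation=0.31,
                   model_var=0.004, obs_var=0.002))
```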

  11. Why involve families in acute mental healthcare? A collaborative conceptual review.

    Science.gov (United States)

    Dirik, Aysegul; Sandhu, Sima; Giacco, Domenico; Barrett, Katherine; Bennison, Gerry; Collinson, Sue; Priebe, Stefan

    2017-09-27

    Family involvement is strongly recommended in clinical guidelines but suffers from poor implementation. To explore this topic at a conceptual level, a multidisciplinary review team including academics, clinicians and individuals with lived experience undertook a review to explore the theoretical background of family involvement models in acute mental health treatment and how this relates to their delivery. A conceptual review was undertaken, including a systematic search and narrative synthesis. Included family models were mapped onto the most commonly referenced underlying theories: the diathesis-stress model, systems theories and postmodern theories of mental health. Common components of the models were summarised and compared. Lastly, a thematic analysis was undertaken to explore the role of patients and families in the delivery of the approaches. General adult acute mental health treatment. Six distinct family involvement models were identified: Calgary Family Assessment and Intervention Models, ERIC (Equipe Rapide d'Intervention de Crise), Family Psychoeducation Models, Family Systems Approach, Open Dialogue and the Somerset Model. Findings indicated that despite wide variation in the theoretical models underlying family involvement models, there were many commonalities in their components, such as a focus on communication, language use and joint decision-making. Thematic analysis of the role of patients and families identified several issues for implementation. This included potential harms that could emerge during delivery of the models, such as imposing linear 'patient-carer' relationships and the risk of perceived coercion. We conclude that future staff training may benefit from discussing the chosen family involvement model within the context of other theories of mental health. This may help to clarify the underlying purpose of family involvement and address the diverse needs and world views of patients, families and professionals in acute settings.

  12. Multichannel approach to the Glauber model for heavy-ion collisions

    International Nuclear Information System (INIS)

    Lenzi, S.M.; Zardi, F.; Vitturi, A.

    1990-01-01

    A formalism is developed in order to describe, within the Glauber model, the scattering processes between heavy ions in situations involving several coupled channels. The approach is based on a suitable truncation of the number of nuclear states which can be excited at each microscopic nucleon-nucleon collision. The set of coupled equations for the S-matrix elements of the conventional reaction theory is replaced by simple matrix relations, only involving the nucleon-nucleon scattering amplitude and the nuclear densities and transition densities. This method avoids the difficulties arising from the combinatorial aspects of the multiple scattering theories, the slow convergence of the series, and the problems of center-of-mass correlations. We discuss some specific examples of multichannel collisions where the multiple-scattering series can be summed to give analytic expressions for the scattering amplitude. We finally explicate the formalism for the perturbative treatment of mutual excitation and charge-exchange processes

  13. Minimization of required model runs in the Random Mixing approach to inverse groundwater flow and transport modeling

    Science.gov (United States)

    Hoerning, Sebastian; Bardossy, Andras; du Plessis, Jaco

    2017-04-01

    Most geostatistical inverse groundwater flow and transport modelling approaches utilize a numerical solver to minimize the discrepancy between observed and simulated hydraulic heads and/or hydraulic concentration values. The optimization procedure often requires many model runs, which for complex models lead to long run times. Random Mixing is a promising new geostatistical technique for inverse modelling. The method is an extension of the gradual deformation approach. It works by finding a field which preserves the covariance structure and maintains observed hydraulic conductivities. This field is perturbed by mixing it with new fields that fulfill the homogeneous conditions. This mixing is expressed as an optimization problem which aims to minimize the difference between the observed and simulated hydraulic heads and/or concentration values. To preserve the spatial structure, the mixing weights must lie on the unit hyper-sphere. We present a modification to the Random Mixing algorithm which significantly reduces the number of model runs required. The approach involves taking n equally spaced points on the unit circle as weights for mixing conditional random fields. Each of these mixtures provides a solution to the forward model at the conditioning locations. For each of the locations the solutions are then interpolated around the circle to provide solutions for additional mixing weights at very low computational cost. The interpolated solutions are used to search for a mixture which maximally reduces the objective function. This is in contrast to other approaches which evaluate the objective function for the n mixtures and then interpolate the obtained values. Keeping the mixture on the unit circle makes it easy to generate equidistant sampling points in the space; however, this means that only two fields are mixed at a time. Once the optimal mixture for two fields has been found, they are combined to form the input to the next iteration of the algorithm. This
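
    A minimal sketch of the unit-circle mixing idea is given below. The forward model, observations and fields are hypothetical stand-ins rather than the paper's groundwater model; the point is only that cos/sin weights keep the mixture on the unit circle and that the objective can be scanned over equally spaced mixing weights.

```python
# Sketch of mixing two random fields with weights on the unit circle (the
# gradual-deformation idea underlying Random Mixing); forward model, fields
# and observations below are hypothetical stand-ins.
import numpy as np

rng = np.random.default_rng(0)

def mix(field_a, field_b, alpha):
    """cos/sin mixing preserves the (Gaussian) covariance structure."""
    return np.cos(alpha) * field_a + np.sin(alpha) * field_b

def objective(field, observed, forward_model):
    simulated = forward_model(field)
    return np.sum((simulated - observed) ** 2)

field_a, field_b = rng.standard_normal(100), rng.standard_normal(100)
observed = np.full(5, 0.3)
forward = lambda f: f[:5]  # pretend the first 5 cells are the observation points

# Evaluate the objective at n equally spaced mixing weights on the unit circle
# and keep the best mixture as input to the next iteration.
alphas = np.linspace(0.0, 2.0 * np.pi, 16, endpoint=False)
scores = [objective(mix(field_a, field_b, a), observed, forward) for a in alphas]
best = alphas[int(np.argmin(scores))]
next_field = mix(field_a, field_b, best)
print(f"best alpha = {best:.2f}, objective = {min(scores):.3f}")
```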

  14. Evaporator modeling - A hybrid approach

    International Nuclear Information System (INIS)

    Ding Xudong; Cai Wenjian; Jia Lei; Wen Changyun

    2009-01-01

    In this paper, a hybrid modeling approach is proposed to model two-phase flow evaporators. The main procedures of hybrid modeling include: (1) formulating the fundamental governing equations of the process based on energy and material balances and thermodynamic principles; (2) selecting input/output (I/O) variables responsible for the system performance which can be measured and controlled; (3) representing those variables that exist in the original equations but are not measurable as simple functions of the selected I/Os or as constants; (4) obtaining a single equation which correlates system inputs and outputs; and (5) identifying unknown parameters by linear or nonlinear least-squares methods. The method takes advantage of both physical and empirical modeling approaches and can accurately predict performance over a wide operating range and in real time, which can significantly reduce the computational burden and increase the prediction accuracy. The model is verified with experimental data taken from a testing system. The testing results show that the proposed model can accurately predict the performance of the real-time operating evaporator with a maximum error of ±8%. The developed models will have wide applications in operational optimization, performance assessment, fault detection and diagnosis.
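
    Step (5) can be illustrated with a generic nonlinear least-squares fit, as sketched below; the correlation form and the data are hypothetical placeholders, not the evaporator model developed in the paper.

```python
# Minimal sketch of parameter identification by nonlinear least squares.
# The model form and data are hypothetical placeholders.
import numpy as np
from scipy.optimize import curve_fit

def capacity_model(x, a, b, c):
    """Hypothetical correlation: cooling capacity vs. mass flow and superheat."""
    m_dot, superheat = x
    return a * m_dot**b * np.exp(-c * superheat)

# Synthetic "measurements" generated from known parameters plus noise.
rng = np.random.default_rng(1)
m_dot = rng.uniform(0.02, 0.10, 50)
superheat = rng.uniform(2.0, 12.0, 50)
y = capacity_model((m_dot, superheat), 150.0, 0.8, 0.03)
y += rng.normal(0.0, 0.05 * y.std(), y.size)

params, _ = curve_fit(capacity_model, (m_dot, superheat), y, p0=[100.0, 1.0, 0.01])
print("identified parameters:", params)
```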

  15. Analysis student self efficacy in terms of using Discovery Learning model with SAVI approach

    Science.gov (United States)

    Sahara, Rifki; Mardiyana, S., Dewi Retno Sari

    2017-12-01

    Students are often unable to demonstrate academic achievement that matches their abilities. One reason is that they often feel unsure that they are capable of completing the tasks assigned to them. For students, such beliefs are essential. This type of belief is called self-efficacy. Self-efficacy is not something brought about by birth or a permanent quality of an individual, but the result of cognitive processes, meaning that one's self-efficacy can be stimulated through learning activities. Self-efficacy can be developed and enhanced by a learning model that stimulates students to build confidence in their capabilities. One such model is the Discovery Learning model with the SAVI approach, a learning model that involves the active participation of students in exploring and discovering their own knowledge and using it in problem solving, utilizing all the sensory devices they have. This naturalistic qualitative research aims to analyze student self-efficacy in terms of the use of the Discovery Learning model with the SAVI approach. The subjects of this study are 30 students, with a focus on eight students who have high, medium, and low self-efficacy, obtained through a purposive sampling technique. The data analysis used three stages: data reduction, data display, and drawing conclusions. Based on the results, it was concluded that the self-efficacy dimension that appeared most dominantly in learning with the Discovery Learning model with the SAVI approach was the magnitude dimension.

  16. Deep Appearance Models: A Deep Boltzmann Machine Approach for Face Modeling

    OpenAIRE

    Duong, Chi Nhan; Luu, Khoa; Quach, Kha Gia; Bui, Tien D.

    2016-01-01

    The "interpretation through synthesis" approach to analyze face images, particularly Active Appearance Models (AAMs) method, has become one of the most successful face modeling approaches over the last two decades. AAM models have ability to represent face images through synthesis using a controllable parameterized Principal Component Analysis (PCA) model. However, the accuracy and robustness of the synthesized faces of AAM are highly depended on the training sets and inherently on the genera...

  17. A Bayesian approach to model uncertainty

    International Nuclear Information System (INIS)

    Buslik, A.

    1994-01-01

    A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given
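
    A minimal sketch of this setting is given below: with a finite set of alternative models, posterior model probabilities follow from Bayes' rule and a quantity of interest can be averaged over models. The likelihood and prior values are hypothetical placeholders, not taken from the paper.

```python
# Bayesian treatment of model uncertainty for a finite set of alternative
# models: P(M_i | D) is proportional to P(D | M_i) * P(M_i). Values below are
# hypothetical placeholders.
import numpy as np

prior = np.array([0.5, 0.3, 0.2])          # prior probabilities of 3 models
likelihood = np.array([1e-3, 4e-3, 5e-4])  # marginal likelihoods P(D | M_i)

posterior = prior * likelihood
posterior /= posterior.sum()
print("posterior model probabilities:", np.round(posterior, 3))

# A quantity of interest can then be averaged over models:
# E[q | D] = sum_i P(M_i | D) * E[q | D, M_i].
q_per_model = np.array([0.10, 0.15, 0.30])  # hypothetical per-model estimates
print("model-averaged estimate:", float(posterior @ q_per_model))
```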

  18. User involvement in the design of human-computer interactions: some similarities and differences between design approaches

    NARCIS (Netherlands)

    Bekker, M.M.; Long, J.B.

    1998-01-01

    This paper presents a general review of user involvement in the design of human-computer interactions, as advocated by a selection of different approaches to design. The selection comprises User-Centred Design, Participatory Design, Socio-Technical Design, Soft Systems Methodology, and Joint

  19. An integrated approach to consumer representation and involvement in a multicentre randomized controlled trial.

    Science.gov (United States)

    Langston, Anne L; McCallum, Marilyn; Campbell, Marion K; Robertson, Clare; Ralston, Stuart H

    2005-01-01

    Although consumer involvement in individual studies is often limited, their involvement in guiding health research is generally considered to be beneficial. This paper outlines our experiences of an integrated relationship between the organisers of a clinical trial and a consumer organisation. The PRISM trial is a UK multicentre, randomized controlled trial comparing treatment strategies for Paget's disease of the bone. The National Association for the Relief of Paget's Disease (NARPD) is the only UK support group for sufferers of Paget's disease and has worked closely with the PRISM team from the outset. NARPD involvement is integral to the conduct of the trial and specific roles have included: peer review; trial steering committee membership; provision of advice to participants; and promotion of the trial amongst Paget's disease patients. The integrated relationship has yielded benefits to both the trial and the consumer organisation. The benefits for the trial have included: recruitment of participants via NARPD contacts; well-informed participants; unsolicited patient advocacy of the trial; and interested and pro-active collaborators. For the NARPD and Paget's disease sufferers, benefits have included: increased awareness of Paget's disease; increased access to relevant health research; increased awareness of the NARPD services; and wider transfer of diagnosis and management knowledge to/from health care professionals. Our experience has shown that an integrated approach between a trial team and a consumer organisation is worthwhile. Adoption of such an approach in other trials may yield significant improvements in recruitment and quality of participant information flow. There are, however, resource implications for both parties.

  20. Models of user involvement in the mental health context: intentions and implementation challenges.

    Science.gov (United States)

    Storm, Marianne; Edwards, Adrian

    2013-09-01

    Patient-centered care, shared decision-making, patient participation and the recovery model are models of care which incorporate user involvement and patients' perspectives on their treatment and care. The aims of this paper are to examine these different care models and their association with user involvement in the mental health context and discuss some of the challenges associated with their implementation. The sources used are health policy documents and published literature and research on patient-centered care, shared decision-making, patient participation and recovery. The policy documents advocate that mental health services should be oriented towards patients' or users' needs, participation and involvement. These policies also emphasize recovery and integration of people with mental disorders in the community. However, these collaborative care models have generally been subject to limited empirical research about effectiveness. There are also challenges to implementation of the models in inpatient care. What evidence there is indicates tensions between patients' and providers' perspectives on treatment and care. There are issues related to risk and the person's capacity for user involvement, and concerns about what role patients themselves wish to play in decision-making. Lack of competence and awareness among providers are further issues. Further work on training, evaluation and implementation is needed to ensure that inpatient mental health services are adapting user oriented care models at all levels of services.

  1. Fires involving radioactive materials : transference model; operative recommendations

    International Nuclear Information System (INIS)

    Rodriguez, C.E.; Puntarulo, L.J.; Canibano, J.A.

    1988-01-01

    In all aspects related to the nuclear activity, the occurrence of an explosion, fire or burst type accident, with or without victims, is directly related to the characteristics of the site. The present work analyses the different parameters involved, describing a transference model and recommendations for evaluation and control of the radiological risk for firemen. Special emphasis is placed on the measurement of the variables existing in this kind of operations

  2. Unpacking buyer-seller differences in valuation from experience: A cognitive modeling approach.

    Science.gov (United States)

    Pachur, Thorsten; Scheibehenne, Benjamin

    2017-12-01

    People often indicate a higher price for an object when they own it (i.e., as sellers) than when they do not (i.e., as buyers)-a phenomenon known as the endowment effect. We develop a cognitive modeling approach to formalize, disentangle, and compare alternative psychological accounts (e.g., loss aversion, loss attention, strategic misrepresentation) of such buyer-seller differences in pricing decisions of monetary lotteries. To also be able to test possible buyer-seller differences in memory and learning, we study pricing decisions from experience, obtained with the sampling paradigm, where people learn about a lottery's payoff distribution from sequential sampling. We first formalize different accounts as models within three computational frameworks (reinforcement learning, instance-based learning theory, and cumulative prospect theory), and then fit the models to empirical selling and buying prices. In Study 1 (a reanalysis of published data with hypothetical decisions), models assuming buyer-seller differences in response bias (implementing a strategic-misrepresentation account) performed best; models assuming buyer-seller differences in choice sensitivity or memory (implementing a loss-attention account) generally fared worst. In a new experiment involving incentivized decisions (Study 2), models assuming buyer-seller differences in both outcome sensitivity (as proposed by a loss-aversion account) and response bias performed best. In both Study 1 and 2, the models implemented in cumulative prospect theory performed best. Model recovery studies validated our cognitive modeling approach, showing that the models can be distinguished rather well. In summary, our analysis supports a loss-aversion account of the endowment effect, but also reveals a substantial contribution of simple response bias.
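
    The flavour of such a cumulative-prospect-theory-style account, with a buyer/seller difference implemented as a response bias, is sketched below; the parameter values are common textbook estimates used as hypothetical placeholders, and this is not the paper's fitted model.

```python
# Illustrative CPT-style valuation of a two-outcome lottery, with the
# buyer/seller difference implemented as an additive response bias on the
# stated price. Parameter values are hypothetical placeholders.
def value(x, alpha=0.88, lam=2.25):
    """Value function: concave for gains, convex and steeper for losses."""
    return x**alpha if x >= 0 else -lam * (-x)**alpha

def weight(p, gamma=0.61):
    """One-parameter inverse-S probability weighting function."""
    return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

def stated_price(outcomes, probs, role, alpha=0.88, bias=2.0):
    """Subjective lottery valuation plus a role-specific response bias."""
    v = sum(weight(p) * value(x, alpha=alpha) for x, p in zip(outcomes, probs))
    certainty_equivalent = v ** (1.0 / alpha)   # invert the gain-domain value function
    return certainty_equivalent + (bias if role == "seller" else -bias)

lottery = ([32.0, 0.0], [0.5, 0.5])
print("buyer price :", round(stated_price(*lottery, role="buyer"), 2))
print("seller price:", round(stated_price(*lottery, role="seller"), 2))
```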

  3. A methodology proposal for collaborative business process elaboration using a model-driven approach

    Science.gov (United States)

    Mu, Wenxin; Bénaben, Frédérick; Pingaud, Hervé

    2015-05-01

    Business process management (BPM) principles are commonly used to improve processes within an organisation. But they can equally be applied to supporting the design of an Information System (IS). In a collaborative situation involving several partners, this type of BPM approach may be useful to support the design of a Mediation Information System (MIS), which would ensure interoperability between the partners' ISs (which are assumed to be service oriented). To achieve this objective, the first main task is to build a collaborative business process cartography. The aim of this article is to present a method for bringing together collaborative information and elaborating collaborative business processes from the information gathered (by using a collaborative situation framework, an organisational model, an informational model, a functional model and a metamodel and by using model transformation rules).

  4. Involving older people in a multi-centre randomised trial of a complex intervention in pre-hospital emergency care: implementation of a collaborative model.

    Science.gov (United States)

    Koniotou, Marina; Evans, Bridie Angela; Chatters, Robin; Fothergill, Rachael; Garnsworthy, Christopher; Gaze, Sarah; Halter, Mary; Mason, Suzanne; Peconi, Julie; Porter, Alison; Siriwardena, A Niroshan; Toghill, Alun; Snooks, Helen

    2015-07-10

    Health services research is expected to involve service users as active partners in the research process, but few examples report how this has been achieved in practice in trials. We implemented a model to involve service users in a multi-centre randomised controlled trial in pre-hospital emergency care. We used the generic Standard Operating Procedure (SOP) from our Clinical Trials Unit (CTU) as the basis for creating a model to fit the context and population of the SAFER 2 trial. In our model, we planned to involve service users at all stages in the trial through decision-making forums at 3 levels: 1) strategic; 2) site (e.g. Wales; London; East Midlands); 3) local. We linked with charities and community groups to recruit people with experience of our study population. We collected notes of meetings alongside other documentary evidence such as attendance records and study documentation to track how we implemented our model. We involved service users at strategic, site and local level. We also added additional strategic level forums (Task and Finish Groups and Writing Days) where we included service users. Service user involvement varied in frequency and type across meetings, research stages and locations but stabilised and increased as the trial progressed. Involving service users in the SAFER 2 trial showed how it is feasible and achievable for patients, carers and potential patients sharing the demographic characteristics of our study population to collaborate in a multi-centre trial at the level which suited their health, location, skills and expertise. A standard model of involvement can be tailored by adopting a flexible approach to take account of the context and complexities of a multi-site trial. Current Controlled Trials ISRCTN60481756. Registered: 13 March 2009.

  5. Current approaches to gene regulatory network modelling

    Directory of Open Access Journals (Sweden)

    Brazma Alvis

    2007-09-01

    Full Text Available Many different approaches have been developed to model and simulate gene regulatory networks. We proposed the following categories for gene regulatory network models: network parts lists, network topology models, network control logic models, and dynamic models. Here we will describe some examples for each of these categories. We will study the topology of gene regulatory networks in yeast in more detail, comparing a direct network derived from transcription factor binding data and an indirect network derived from genome-wide expression data in mutants. Regarding the network dynamics we briefly describe discrete and continuous approaches to network modelling, then describe a hybrid model called Finite State Linear Model and demonstrate that some simple network dynamics can be simulated in this model.
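
    One simple way to realise the "discrete approaches" mentioned above is a Boolean network with synchronous updates, as sketched below; the genes and rules are hypothetical, not taken from the paper.

```python
# Minimal Boolean-network sketch of gene regulatory dynamics (hypothetical
# genes and rules; synchronous updates).
def step(state):
    """Synchronous update of a toy 3-gene network."""
    a, b, c = state["A"], state["B"], state["C"]
    return {
        "A": not c,          # C represses A
        "B": a,              # A activates B
        "C": a and b,        # A and B jointly activate C
    }

state = {"A": True, "B": False, "C": False}
for t in range(6):
    print(t, state)
    state = step(state)
```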

  6. Sport Education as a Curriculum Approach to Student Learning of Invasion Games: Effects on Game Performance and Game Involvement

    Science.gov (United States)

    Farias, Cláudio; Valério, Carla; Mesquita, Isabel

    2018-01-01

    The teaching and learning of games and sport-based activities has historically been the dominant form of the physical education curricula. With an interest in providing students with meaningful and culturally situated sporting experiences, Sport Education is probably the most implemented and researched pedagogical model worldwide. However, although there is considerable evidence that the model as a curriculum approach can benefit the development of social goals and healthy sport behaviors, not a single study to date has examined students’ game-play development beyond participation in single and isolated teaching units. Therefore, the purpose of this study was to examine students’ development of Game Performance and Game Involvement during participation in three consecutive Sport Education seasons of invasion games. The participants were an experienced physical education teacher and one seventh-grade class totaling 26 students (10 girls and 16 boys). Using the Game Performance Assessment Instrument (Oslin et al., 1998), pre-test to post-test measures of students’ Game Performance and Game Involvement were collected during their participation in basketball (20 lessons), handball (16 lessons), and football (18 lessons) units. Inter-group differences and pre-test to post-test improvements within each season were analyzed through 2 (time) x group (sport) repeated measures ANOVA tests. Significant pre-test to post-test improvements in Game Performance and Game Involvement were found in the second (handball) and third (football) seasons, but not in the first season (basketball). Students’ Game Performance and Involvement scores in handball and football were significantly higher than their scores while playing basketball. The opportunity for an extended engagement in game-play activities and prolonged membership of students in the same teams throughout three consecutive seasons of Sport Education were key to the outcomes found. The specific configurations of

  7. Modelling chloride penetration in concrete using electrical voltage and current approaches

    Directory of Open Access Journals (Sweden)

    Juan Lizarazo-Marriaga

    2011-03-01

    Full Text Available This paper reports a research programme aimed at giving a better understanding of the phenomena involved in chloride penetration into cement-based materials. The general approach used was to solve the Nernst-Planck equation numerically for two physical ideal states that define the possible conditions under which chlorides will move through concrete. These conditions are named in this paper as voltage control and current control. For each condition, experiments and simulations were carried out in order to establish the importance of electrical variables such as voltage and current in modelling chloride transport in concrete. The results of experiments and simulations showed that if those electrical variables are included as key parameters in the modelling of chloride penetration through concrete, a better understanding of this complex phenomenon can be obtained.
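
    For reference, the one-dimensional Nernst-Planck flux solved numerically in such migration models is commonly written as follows (a standard textbook form, stated here for orientation rather than quoted from the paper):

```latex
J_i = -D_i \frac{\partial c_i}{\partial x} - \frac{z_i F}{R T}\, D_i\, c_i\, \frac{\partial \phi}{\partial x}
```

    Here J_i is the flux of ionic species i, D_i its diffusion coefficient, c_i its concentration, z_i its valence, F the Faraday constant, R the gas constant, T the absolute temperature and phi the electrical potential; broadly, the voltage-control and current-control conditions correspond to different ways of prescribing the electrical boundary condition.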

  8. Introducing an Intervention Model for Fostering Affective Involvement with Persons Who Are Congenitally Deafblind

    Science.gov (United States)

    Martens, Marga A. W.; Janssen, Marleen J.; Ruijssenaars, Wied A. J. J. M.; Riksen-Walraven, J. Marianne

    2014-01-01

    The article presented here introduces the Intervention Model for Affective Involvement (IMAI), which was designed to train staff members (for example, teachers, caregivers, support workers) to foster affective involvement during interaction and communication with persons who have congenital deaf-blindness. The model is theoretically underpinned,…

  9. Modeling of modification experiments involving neutral-gas release

    International Nuclear Information System (INIS)

    Bernhardt, P.A.

    1983-01-01

    Many experiments involve the injection of neutral gases into the upper atmosphere. Examples are critical velocity experiments, MHD wave generation, ionospheric hole production, plasma striation formation, and ion tracing. Many of these experiments are discussed in other sessions of the Active Experiments Conference. This paper limits its discussion to: (1) the modeling of the neutral gas dynamics after injection, (2) subsequent formation of ionosphere holes, and (3) use of such holes as experimental tools

  10. Neural Network Control of CSTR for Reversible Reaction Using Reference Model Approach

    Directory of Open Access Journals (Sweden)

    Duncan ALOKO

    2007-01-01

    Full Text Available In this work, non-linear control of a CSTR for a reversible reaction is carried out using a Neural Network as the design tool. The Model Reference approach is used to design the ANN controller. The idea is to have a control system that is able to achieve an improvement in the level of conversion, track set-point changes and reject load disturbances. A PID control scheme is used as a benchmark to study the performance of the controller. The comparison shows that the ANN controller outperforms PID control in the extreme range of non-linearity. This paper represents a preliminary effort to design a simplified neural network control scheme for a class of non-linear processes. Future work will involve further investigation of the effectiveness of this approach for real industrial chemical processes
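
    The neural-network controller itself is too involved for a short example, but the PID benchmark mentioned in the abstract can be sketched as follows for a hypothetical first-order reversible reaction A <-> B in a CSTR, with the feed flow rate as the manipulated variable. All kinetic constants, reactor data and PID gains are assumed for illustration.

        import numpy as np

        # PID benchmark sketch for a CSTR with a reversible reaction A <-> B (all parameters assumed).
        kf, kr = 0.8, 0.3        # forward / reverse rate constants, 1/min
        V, Ca_in = 100.0, 1.0    # reactor volume (L) and feed concentration (mol/L)
        Kp, Ki, Kd = 400.0, 40.0, 0.0   # PID gains acting on the feed flow rate (assumed)

        def cstr_rhs(Ca, Cb, q):
            """Mole balances for A and B; q is the volumetric feed flow rate (L/min)."""
            r = kf * Ca - kr * Cb
            dCa = q / V * (Ca_in - Ca) - r
            dCb = -q / V * Cb + r
            return dCa, dCb

        dt, t_end = 0.05, 60.0
        Ca, Cb, q = 1.0, 0.0, 50.0
        integral, prev_err = 0.0, 0.0
        setpoint = 0.6           # desired conversion of A

        for k in range(int(t_end / dt)):
            conversion = 1.0 - Ca / Ca_in
            err = setpoint - conversion
            integral += err * dt
            deriv = (err - prev_err) / dt
            # lower feed flow -> longer residence time -> higher conversion
            q = float(np.clip(50.0 - (Kp * err + Ki * integral + Kd * deriv), 5.0, 200.0))
            prev_err = err
            dCa, dCb = cstr_rhs(Ca, Cb, q)
            Ca, Cb = Ca + dt * dCa, Cb + dt * dCb

        print(f"final conversion: {1.0 - Ca / Ca_in:.3f} (setpoint {setpoint})")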

  11. Tuneable resolution as a systems biology approach for multi-scale, multi-compartment computational models.

    Science.gov (United States)

    Kirschner, Denise E; Hunt, C Anthony; Marino, Simeone; Fallahi-Sichani, Mohammad; Linderman, Jennifer J

    2014-01-01

    The use of multi-scale mathematical and computational models to study complex biological processes is becoming increasingly productive. Multi-scale models span a range of spatial and/or temporal scales and can encompass multi-compartment (e.g., multi-organ) models. Modeling advances are enabling virtual experiments to explore and answer questions that are problematic to address in the wet-lab. Wet-lab experimental technologies now allow scientists to observe, measure, record, and analyze experiments focusing on different system aspects at a variety of biological scales. We need the technical ability to mirror that same flexibility in virtual experiments using multi-scale models. Here we present a new approach, tuneable resolution, which can begin providing that flexibility. Tuneable resolution involves fine- or coarse-graining existing multi-scale models at the user's discretion, allowing adjustment of the level of resolution specific to a question, an experiment, or a scale of interest. Tuneable resolution expands options for revising and validating mechanistic multi-scale models, can extend the longevity of multi-scale models, and may increase computational efficiency. The tuneable resolution approach can be applied to many model types, including differential equation, agent-based, and hybrid models. We demonstrate our tuneable resolution ideas with examples relevant to infectious disease modeling, illustrating key principles at work. © 2014 The Authors. WIREs Systems Biology and Medicine published by Wiley Periodicals, Inc.
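
    A toy illustration of the tuneable-resolution idea (not the authors' implementation) is sketched below: the same infection-like module can be evaluated either as a coarse-grained deterministic update or as a fine-grained per-individual stochastic rule, behind a single interface, so the resolution can be chosen per question. Rates and population sizes are assumed.

        import random

        # Toy tuneable-resolution sketch: one infection module, two interchangeable resolutions.

        def step_coarse(S, I, beta=0.3, gamma=0.1, dt=1.0):
            """Coarse-grained (deterministic, ODE-like) update of an SIS-type module."""
            N = S + I
            new_inf = beta * S * I / N * dt
            new_rec = gamma * I * dt
            return S - new_inf + new_rec, I + new_inf - new_rec

        def step_fine(S, I, beta=0.3, gamma=0.1, dt=1.0):
            """Fine-grained (stochastic, per-individual) update of the same module."""
            N = S + I
            new_inf = sum(random.random() < beta * I / N * dt for _ in range(int(S)))
            new_rec = sum(random.random() < gamma * dt for _ in range(int(I)))
            return S - new_inf + new_rec, I + new_inf - new_rec

        def simulate(resolution="coarse", steps=100):
            step = step_coarse if resolution == "coarse" else step_fine
            S, I = 990, 10
            for _ in range(steps):
                S, I = step(S, I)
            return S, I

        print("coarse:", simulate("coarse"))
        print("fine:  ", simulate("fine"))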

  12. Risk prediction model: Statistical and artificial neural network approach

    Science.gov (United States)

    Paiman, Nuur Azreen; Hariri, Azian; Masood, Ibrahim

    2017-04-01

    Prediction models are increasingly gaining popularity and have been used in numerous areas of study to complement and support clinical reasoning and decision making. The adoption of such models assists physicians' decision making and individuals' behaviour, and consequently improves individual outcomes and the cost-effectiveness of care. The objective of this paper is to review articles related to risk prediction models in order to understand the suitable approaches to, and the development and validation process of, a risk prediction model. A qualitative review of the aims, methods and significant main outcomes of nineteen published articles that developed risk prediction models in numerous fields was carried out. This paper also reviews how researchers develop and validate risk prediction models based on statistical and artificial neural network approaches. From the review, some methodological recommendations for developing and validating prediction models are highlighted. According to the studies reviewed, artificial neural network approaches to developing prediction models were more accurate than statistical approaches. However, only limited published literature currently discusses which approach is more accurate for risk prediction model development.
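
    The contrast between the two approaches can be illustrated with a short scikit-learn sketch on synthetic data (no clinical data are used, and the variables are not those of the reviewed studies): a logistic regression stands in for the statistical approach and a small multilayer perceptron for the artificial neural network approach, compared by discrimination on a held-out set.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.model_selection import train_test_split
        from sklearn.linear_model import LogisticRegression
        from sklearn.neural_network import MLPClassifier
        from sklearn.metrics import roc_auc_score

        # Synthetic stand-in for a clinical risk data set (no real data are used here).
        X, y = make_classification(n_samples=2000, n_features=12, n_informative=6,
                                   weights=[0.8, 0.2], random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

        # Statistical approach: logistic regression risk model.
        logit = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

        # Artificial neural network approach: a small multilayer perceptron.
        ann = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0).fit(X_tr, y_tr)

        for name, model in [("logistic regression", logit), ("neural network", ann)]:
            auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
            print(f"{name}: validation AUC = {auc:.3f}")   # discrimination on held-out data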

  13. Effects of deceptive packaging and product involvement on purchase intention: an elaboration likelihood model perspective.

    Science.gov (United States)

    Lammers, H B

    2000-04-01

    From an Elaboration Likelihood Model perspective, it was hypothesized that postexposure awareness of deceptive packaging claims would have a greater negative effect on purchase-intention scores for consumers with low rather than high involvement with a product (n = 40). Undergraduates who were classified as either highly or lowly involved with M&Ms (ns = 20 and 20) examined either a deceptive or non-deceptive package design for M&Ms candy and were subsequently informed of the deception employed in the packaging before finally rating their intention to purchase. As anticipated, deceived subjects who were low in involvement rated intention to purchase lower than their highly involved peers. Overall, the results attest to the robustness of the model and suggest that the model has implications beyond advertising effects and into packaging effects.

  14. Sensitivity analysis approaches applied to systems biology models.

    Science.gov (United States)

    Zi, Z

    2011-11-01

    With the rising application of systems biology, sensitivity analysis methods have been widely applied to study biological systems, including metabolic networks, signalling pathways and genetic circuits. Sensitivity analysis can provide valuable insights about how robust the biological responses are with respect to changes in biological parameters and which model inputs are the key factors that affect the model outputs. In addition, sensitivity analysis is valuable for guiding experimental analysis, model reduction and parameter estimation. Local and global sensitivity analysis approaches are the two types of sensitivity analysis that are commonly applied in systems biology. Local sensitivity analysis is a classic method that studies the impact of small perturbations on the model outputs. On the other hand, global sensitivity analysis approaches have been applied to understand how the model outputs are affected by large variations of the model input parameters. In this review, the author introduces the basic concepts of sensitivity analysis approaches applied to systems biology models. Moreover, the author discusses the advantages and disadvantages of different sensitivity analysis methods, how to choose a proper sensitivity analysis approach, the available sensitivity analysis tools for systems biology models and the caveats in the interpretation of sensitivity analysis results.
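
    The two families of methods can be illustrated on a toy model with a few lines of NumPy (this is a generic sketch, not a model from the review): a finite-difference local sensitivity around a nominal parameter set, and first-order Sobol indices estimated with a simple pick-freeze (Saltelli/Jansen-type) scheme.

        import numpy as np

        rng = np.random.default_rng(1)

        def model(x):
            """Toy model standing in for a signalling-pathway output (illustrative only)."""
            k1, k2, k3 = x[..., 0], x[..., 1], x[..., 2]
            return k1 * np.sin(k2) + 0.3 * k3 ** 2

        # Local sensitivity: finite-difference derivatives around a nominal parameter set.
        x0 = np.array([1.0, 1.0, 1.0])
        eps = 1e-6
        local = [(model(x0 + eps * np.eye(3)[i]) - model(x0)) / eps for i in range(3)]
        print("local sensitivities:", np.round(local, 3))

        # Global sensitivity: first-order Sobol indices via a pick-freeze estimator.
        N = 20000
        A = rng.uniform(0.5, 1.5, (N, 3))
        B = rng.uniform(0.5, 1.5, (N, 3))
        fA, fB = model(A), model(B)
        varY = np.var(np.concatenate([fA, fB]))
        for i in range(3):
            ABi = A.copy()
            ABi[:, i] = B[:, i]                       # freeze all columns except parameter i
            Si = np.mean(fB * (model(ABi) - fA)) / varY
            print(f"first-order Sobol index S{i+1} = {Si:.3f}")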

  15. Strengthening stakeholder involvement in health workforce governance: why we need to talk about power.

    Science.gov (United States)

    Kuhlmann, Ellen; Burau, Viola

    2018-01-01

    There is now widespread agreement on the benefits of an integrated, people-centred health workforce, but the implementation of new models is difficult. We argue that we need to think about stakeholders and power, if we want to ensure change in the health workforce. We discuss these issues from a governance perspective and suggest a critical approach to stakeholder involvement as an indicator of good governance. Three models of involving stakeholders in health workforce governance can be identified: corporatist professional involvement either in a continental European model of conservative corporatism or in a Nordic model of public corporatism; managerialist and market-centred involvement of professions as organizational agents; and a more inclusive, network-based involvement of plural professional experts at different levels of governance. The power relations embedded in these models of stakeholder involvement have different effects on capacity building for an integrated health workforce.

  16. Omics Approach to Identify Factors Involved in Brassica Disease Resistance.

    Science.gov (United States)

    Francisco, Marta; Soengas, Pilar; Velasco, Pablo; Bhadauria, Vijai; Cartea, Maria E; Rodríguez, Victor M

    2016-01-01

    Understanding plants' defense mechanisms and their responses to biotic stresses is of fundamental importance for the development of resistant crop varieties and a more productive agriculture. The Brassica genus comprises a large variety of economically important species and cultivars used as vegetables, oilseeds, forage and ornamentals. Damage caused by pathogen attack negatively affects various aspects of plant growth, development, and crop productivity. Over the last few decades, advances in plant physiology, genetics, and molecular biology have greatly improved our understanding of plant responses to biotic stress conditions. In this regard, various 'omics' technologies enable qualitative and quantitative monitoring of the abundance of various biological molecules in a high-throughput manner, and thus allow determination of their variation between different biological states on a genomic scale. In this review, we describe advances in 'omics' tools (genomics, transcriptomics, proteomics and metabolomics) in view of the conventional and modern approaches being used to elucidate the molecular mechanisms that underlie Brassica disease resistance.

  17. Modeling economic costs of disasters and recovery involving positive effects of reconstruction: analysis using a dynamic CGE model

    Science.gov (United States)

    Xie, W.; Li, N.; Wu, J.-D.; Hao, X.-L.

    2013-11-01

    Disaster damages have negative effects on the economy, whereas reconstruction investments have positive effects. The aim of this study is to model the economic costs of disasters and recovery involving the positive effects of reconstruction activities. A computable general equilibrium (CGE) model is a promising approach because it can incorporate these two kinds of shocks into a unified framework and avoid the double-counting problem. In order to factor both shocks into the CGE model, direct loss is set as the amount of capital stock removed on the supply side of the economy; a portion of investments restores the capital stock in each period; and an investment-driven dynamic model is formulated based on the available reconstruction data, with the rest of the country's saving set as an endogenous variable. The 2008 Wenchuan Earthquake is selected as a case study to illustrate the model, and three scenarios are constructed: S0 (no disaster occurs), S1 (disaster occurs with reconstruction investment) and S2 (disaster occurs without reconstruction investment). S0 is taken as business as usual, and the differences between S1 and S0 and between S2 and S0 can be interpreted as economic losses including and excluding reconstruction, respectively. Output from S1 is found to be closer to the real data than that from S2, and the economic loss estimated under S2 is roughly twice that under S1. The gap in economic aggregate between S1 and S0 is reduced to 3% in 2011, a level that should take another four years to achieve under S2.

  18. A Research Framework for Understanding the Practical Impact of Family Involvement in the Juvenile Justice System: The Juvenile Justice Family Involvement Model.

    Science.gov (United States)

    Walker, Sarah Cusworth; Bishop, Asia S; Pullmann, Michael D; Bauer, Grace

    2015-12-01

    Family involvement is recognized as a critical element of service planning for children's mental health, welfare and education. For the juvenile justice system, however, parents' roles in this system are complex due to youths' legal rights, public safety, a process which can legally position parents as plaintiffs, and a historical legacy of blaming parents for youth indiscretions. Three recent national surveys of juvenile justice-involved parents reveal that the current paradigm elicits feelings of stress, shame and distrust among parents and is likely leading to worse outcomes for youth, families and communities. While research on the impact of family involvement in the justice system is starting to emerge, the field currently has no organizing framework to guide a research agenda, interpret outcomes or translate findings for practitioners. We propose a research framework for family involvement that is informed by a comprehensive review and content analysis of current, published arguments for family involvement in juvenile justice along with a synthesis of family involvement efforts in other child-serving systems. In this model, family involvement is presented as an ascending, ordinal concept beginning with (1) exclusion, and moving toward climates characterized by (2) information-giving, (3) information-eliciting and (4) full, decision-making partnerships. Specific examples of how courts and facilities might align with these levels are described. Further, the model makes predictions for how involvement will impact outcomes at multiple levels with applications for other child-serving systems.

  19. Estimation in a multiplicative mixed model involving a genetic relationship matrix

    Directory of Open Access Journals (Sweden)

    Eccleston John A

    2009-04-01

    Full Text Available Abstract Genetic models partitioning additive and non-additive genetic effects for populations tested in replicated multi-environment trials (METs in a plant breeding program have recently been presented in the literature. For these data, the variance model involves the direct product of a large numerator relationship matrix A, and a complex structure for the genotype by environment interaction effects, generally of a factor analytic (FA form. With MET data, we expect a high correlation in genotype rankings between environments, leading to non-positive definite covariance matrices. Estimation methods for reduced rank models have been derived for the FA formulation with independent genotypes, and we employ these estimation methods for the more complex case involving the numerator relationship matrix. We examine the performance of differing genetic models for MET data with an embedded pedigree structure, and consider the magnitude of the non-additive variance. The capacity of existing software packages to fit these complex models is largely due to the use of the sparse matrix methodology and the average information algorithm. Here, we present an extension to the standard formulation necessary for estimation with a factor analytic structure across multiple environments.

  20. A dynamic performance model for redox-flow batteries involving soluble species

    International Nuclear Information System (INIS)

    Shah, A.A.; Watt-Smith, M.J.; Walsh, F.C.

    2008-01-01

    A transient modelling framework for a vanadium redox-flow battery (RFB) is developed and experiments covering a range of vanadium concentration and electrolyte flow rate are conducted. The two-dimensional model is based on a comprehensive description of mass, charge and momentum transport and conservation, and is combined with a global kinetic model for reactions involving vanadium species. The model is validated against the experimental data and is used to study the effects of variations in concentration, electrolyte flow rate and electrode porosity. Extensions to the model and future work are suggested
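
    The paper's model is a two-dimensional transport model; the sketch below is only a lumped (0-D) charge balance with a Nernst-type open-circuit voltage during constant-current charging, to illustrate the kind of quantities such a model tracks. Every parameter value is assumed.

        import numpy as np

        # Lumped (0-D) sketch of a vanadium redox-flow battery during constant-current charge.
        # This is far simpler than the 2-D transport model in the paper; all numbers are assumed.
        F = 96485.0                 # Faraday constant, C/mol
        V_tank = 1.0e-3             # electrolyte volume per half-cell, m^3 (1 L)
        c_total = 1500.0            # total vanadium concentration, mol/m^3
        I = 10.0                    # charging current, A
        E0 = 1.26                   # formal cell potential, V
        R, T = 8.314, 298.0

        dt, soc, t = 1.0, 0.05      # time step (s), initial state of charge, elapsed time
        t = 0.0
        while soc < 0.95:
            soc += I * dt / (F * c_total * V_tank)   # Faradaic conversion in both half-cells
            t += dt

        ocv = E0 + 2 * R * T / F * np.log(soc / (1.0 - soc))   # Nernst-type open-circuit voltage
        v_cell = ocv + I * 0.005                                # plus an assumed 5 mOhm ohmic drop
        print(f"time to 95% SOC: {t/3600:.2f} h, cell voltage at end of charge: {v_cell:.3f} V")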

  1. On a model-based approach to radiation protection

    International Nuclear Information System (INIS)

    Waligorski, M.P.R.

    2002-01-01

    There is a preoccupation with linearity and absorbed dose as the basic quantifiers of radiation hazard. An alternative is the fluence approach, whereby radiation hazard may be evaluated, at least in principle, via an appropriate action cross section. In order to compare these approaches, it may be useful to discuss them as quantitative descriptors of survival and transformation-like endpoints in cell cultures in vitro - a system thought to be relevant to modelling radiation hazard. If absorbed dose is used to quantify these biological endpoints, then non-linear dose-effect relations have to be described, and, e.g. after doses of densely ionising radiation, dose-correction factors as high as 20 are required. In the fluence approach only exponential effect-fluence relationships can be readily described. Neither approach alone exhausts the scope of experimentally observed dependencies of effect on dose or fluence. Two-component models, incorporating a suitable mixture of the two approaches, are required. An example of such a model is the cellular track structure theory developed by Katz over thirty years ago. The practical consequences of modelling radiation hazard using this mixed two-component approach are discussed. (author)

  2. Mathematical Modeling Approaches in Plant Metabolomics.

    Science.gov (United States)

    Fürtauer, Lisa; Weiszmann, Jakob; Weckwerth, Wolfram; Nägele, Thomas

    2018-01-01

    The experimental analysis of a plant metabolome typically results in a comprehensive and multidimensional data set. To interpret metabolomics data in the context of biochemical regulation and environmental fluctuation, various approaches of mathematical modeling have been developed and have proven useful. In this chapter, a general introduction to mathematical modeling is presented and discussed in context of plant metabolism. A particular focus is laid on the suitability of mathematical approaches to functionally integrate plant metabolomics data in a metabolic network and combine it with other biochemical or physiological parameters.

  3. Aircraft operational reliability—A model-based approach and a case study

    International Nuclear Information System (INIS)

    Tiassou, Kossi; Kanoun, Karama; Kaâniche, Mohamed; Seguin, Christel; Papadopoulos, Chris

    2013-01-01

    The success of an aircraft mission is subject to the fulfillment of some operational requirements before and during each flight. As these requirements depend essentially on the aircraft system components and the mission profile, the effects of failures can be very severe if they are not anticipated. Hence, one should be able to assess the aircraft operational reliability with regard to its missions in order to be able to cope with failures. We address aircraft operational reliability modeling to support maintenance planning during the mission achievement. We develop a modeling approach, based on a meta-model that is used as a basis: (i) to structure the information needed to assess aircraft operational reliability and (ii) to build a stochastic model that can be tuned dynamically, in order to take into account the aircraft system operational state, a mission profile and the maintenance facilities available at the flight stop locations involved in the mission. The aim is to enable operational reliability assessment online. A case study, based on an aircraft subsystem, is considered for illustration using the Stochastic Activity Networks (SANs) formalism

  4. A conceptual model of people's approach to sanitation

    International Nuclear Information System (INIS)

    Avvannavar, Santosh M.; Mani, Monto

    2008-01-01

    Sanitation is a term primarily used to characterize the safe and sound handling (and disposal) of human excreta - or simply, people's approach to taking care of their (unavoidable) primal urge. According to the recent Human Development Report 2006, global access to proper sanitation stands at approximately 58%, with 37% being a conservative estimate for both South Asia and Sub-Saharan Africa. Various multi-million dollar sanitation programmes the world over have had little success, often due to an inadequate understanding of people's sanitation approach. The sanitation approach includes the perceptions, feelings and practices involved in satisficing the primal need to defecate and urinate (and their disposal). This paper presents a structure for understanding the nature of the psycho-socio-economic influences that determine a society's approach to sanitation. Societies across the globe have evolved imbibing diverse influences attributed to the local environment, religion, cultural practices, war, etc. While a civilization's living environment reflects these influences in its built-environment characteristics, the influences are often deep-rooted and can be traced to the way community members satisfice their need to defecate and urinate (sanitation approach). The objective of this paper is to trace the various approaches that diverse societies/civilizations across the world have had towards sanitation over time, and to present a structure to articulate and understand the determining factors. Sanitation also involves other domestic (solid and liquid) waste disposal, but in the context of this paper the scope of sanitation has been restricted to human excreta alone. The structure presented and discussed in this paper would be useful in understanding a community better in terms of providing appropriate sanitation. It is hoped that this structure will be considered as a basis for further refinement and detailed research into each of the factors determining people's sanitation approach

  5. Developing a physiologically based approach for modeling plutonium decorporation therapy with DTPA.

    Science.gov (United States)

    Kastl, Manuel; Giussani, Augusto; Blanchardon, Eric; Breustedt, Bastian; Fritsch, Paul; Hoeschen, Christoph; Lopez, Maria Antonia

    2014-11-01

    To develop a physiologically based compartmental approach for modeling plutonium decorporation therapy with the chelating agent diethylenetriaminepentaacetic acid (Ca-DTPA/Zn-DTPA). Model calculations were performed using the software package SAAM II (©The Epsilon Group, Charlottesville, Virginia, USA). The Luciani/Polig compartmental model, with an age-dependent description of the bone recycling processes, was used for the biokinetics of plutonium. The Luciani/Polig model was slightly modified in order to account for the speciation of plutonium in blood and for the different affinities of the chemical species present for DTPA. The introduction of two separate blood compartments, describing low-molecular-weight complexes of plutonium (Pu-LW) and transferrin-bound plutonium (Pu-Tf), respectively, and one additional compartment describing plutonium in the interstitial fluids was performed successfully. The next step of the work is the modeling of the chelation process, coupling the physiologically modified structure with the biokinetic model for DTPA. Results of animal studies performed under controlled conditions will enable a better understanding of the principles of the mechanisms involved.
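
    The modified blood/interstitial structure described above can be caricatured as a small system of linear compartmental ODEs. The sketch below uses hypothetical transfer rate constants (not the Luciani/Polig values) and omits the skeleton, liver and the DTPA chelation step entirely.

        import numpy as np

        # Minimal compartmental sketch of the modified blood/interstitial structure.
        # The rate constants (per day) are hypothetical placeholders, not the Luciani/Polig values.
        k_lw_to_tf  = 2.0    # binding of low-molecular-weight Pu to transferrin
        k_tf_to_isf = 0.5    # transfer of Pu-Tf to the interstitial fluids
        k_isf_to_lw = 0.3    # return from the interstitial fluids to the Pu-LW pool
        k_lw_excr   = 0.8    # urinary excretion from the Pu-LW pool

        def rhs(y):
            lw, tf, isf = y
            d_lw  = -(k_lw_to_tf + k_lw_excr) * lw + k_isf_to_lw * isf
            d_tf  = k_lw_to_tf * lw - k_tf_to_isf * tf
            d_isf = k_tf_to_isf * tf - k_isf_to_lw * isf
            return np.array([d_lw, d_tf, d_isf])

        y = np.array([1.0, 0.0, 0.0])      # unit intake enters as the low-molecular-weight species
        dt = 0.01                          # days
        for step in range(int(30 / dt)):   # 30-day simulation, simple Euler integration
            y = y + dt * rhs(y)

        print("retained fractions after 30 d (Pu-LW, Pu-Tf, interstitial):", np.round(y, 4))
        print("cumulatively excreted:", round(1.0 - y.sum(), 4))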

  6. Meta-analysis a structural equation modeling approach

    CERN Document Server

    Cheung, Mike W-L

    2015-01-01

    Presents a novel approach to conducting meta-analysis using structural equation modeling. Structural equation modeling (SEM) and meta-analysis are two powerful statistical methods in the educational, social, behavioral, and medical sciences. They are often treated as two unrelated topics in the literature. This book presents a unified framework on analyzing meta-analytic data within the SEM framework, and illustrates how to conduct meta-analysis using the metaSEM package in the R statistical environment. Meta-Analysis: A Structural Equation Modeling Approach begins by introducing the impo

  7. Does Business Model Affect CSR Involvement? A Survey of Polish Manufacturing and Service Companies

    Directory of Open Access Journals (Sweden)

    Marzanna Katarzyna Witek-Hajduk

    2016-02-01

    Full Text Available The study explores links between types of business models used by companies and their involvement in CSR. As the main part of our conceptual framework we used a business model taxonomy developed by Dudzik and Witek-Hajduk, which identifies five types of models: traditionalists, market players, contractors, distributors, and integrators. From shared characteristics of the business model profiles, we proposed that market players and integrators will show significantly higher levels of involvement in CSR than the three other classes of companies. Among other things, both market players and integrators relied strongly on building own brand value and fostering harmonious supply channel relations, which served as a rationale for our hypothesis. The data for the study were obtained through a combined CATI and CAWI survey on a group of 385 managers of medium and large enterprises. The sample was representative for the three Polish industries of chemical manufacturing, food production, and retailing. Statistical methods included confirmatory factor analysis and one-way ANOVA with contrasts and post hoc tests. The findings supported our hypothesis, showing that market players and integrators were indeed more engaged in CSR than other groups of firms. This may suggest that managers in control of these companies could bolster the integrity of their business models by increasing CSR involvement. Another important contribution of the study was to propose and validate a versatile scale for assessing CSR involvement, which showed measurement invariance for all involved industries.

  8. Impact of consumers' health beliefs, involvement and risk perception of fish consumption

    DEFF Research Database (Denmark)

    Pieniak, Zuzanna; Verbeke, Wim; Scholderer, Joachim

    2008-01-01

    Purpose - To investigate the impact of consumers' health beliefs, involvement, and risk perception on fish consumption in five European countries. Design/methodology/approach - Cross-sectional data were collected through the SEAFOODplus pan-European consumer survey (n=4,786) with samples representative for age and region in Belgium, the Netherlands, Denmark, Spain and Poland. Structural equation modeling (LISREL) was used in order to simultaneously estimate the strength and direction of all relationships in our model. Findings - Our model contributes to a better understanding of factors influencing fish consumption. Health involvement is found to be an indirect driver, whilst interest in healthy eating emerges as a direct driver of fish consumption behaviour. On the contrary, risk perception has a negative impact on fish consumption. Research limitations/implications - Further research using survey questionnaires could

  9. Computational and Game-Theoretic Approaches for Modeling Bounded Rationality

    NARCIS (Netherlands)

    L. Waltman (Ludo)

    2011-01-01

    textabstractThis thesis studies various computational and game-theoretic approaches to economic modeling. Unlike traditional approaches to economic modeling, the approaches studied in this thesis do not rely on the assumption that economic agents behave in a fully rational way. Instead, economic

  10. A new Markov-chain-related statistical approach for modelling synthetic wind power time series

    International Nuclear Information System (INIS)

    Pesch, T; Hake, J F; Schröders, S; Allelein, H J

    2015-01-01

    The integration of rising shares of volatile wind power in the generation mix is a major challenge for the future energy system. To address the uncertainties involved in wind power generation, models analysing and simulating the stochastic nature of this energy source are becoming increasingly important. One statistical approach that has been frequently used in the literature is the Markov chain approach. Recently, the method was identified as being of limited use for generating wind time series with time steps shorter than 15–40 min as it is not capable of reproducing the autocorrelation characteristics accurately. This paper presents a new Markov-chain-related statistical approach that is capable of solving this problem by introducing a variable second lag. Furthermore, additional features are presented that allow for the further adjustment of the generated synthetic time series. The influences of the model parameter settings are examined by meaningful parameter variations. The suitability of the approach is demonstrated by an application analysis with the example of the wind feed-in in Germany. It shows that—in contrast to conventional Markov chain approaches—the generated synthetic time series do not systematically underestimate the required storage capacity to balance wind power fluctuation. (paper)
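
    The core idea of conditioning the next state on both the current state and a second lag can be sketched in a few lines of NumPy. The series below is synthetic, the binning is coarse and the second lag is fixed rather than variable, so this is only an illustration of the mechanism, not the calibrated procedure of the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        # Illustrative observed wind power series (normalised 0-1); stands in for measured feed-in data.
        t = np.arange(20000)
        observed = np.clip(0.5 + 0.3 * np.sin(2 * np.pi * t / 800) + 0.15 * rng.standard_normal(t.size), 0, 1)

        n_states = 10
        states = np.minimum((observed * n_states).astype(int), n_states - 1)

        # Estimate transition counts conditioned on the current state AND a second lag.
        lag2 = 4   # the "second lag" of the approach; fixed here for simplicity
        counts = np.ones((n_states, n_states, n_states))          # Laplace smoothing
        for i in range(lag2, states.size - 1):
            counts[states[i], states[i - lag2], states[i + 1]] += 1
        P = counts / counts.sum(axis=2, keepdims=True)

        # Generate a synthetic series by sampling from the conditional transition distributions.
        synth = list(states[:lag2 + 1])
        for i in range(lag2, 20000 - 1):
            probs = P[synth[i], synth[i - lag2]]
            synth.append(rng.choice(n_states, p=probs))
        synth = (np.array(synth) + rng.random(len(synth))) / n_states   # de-discretise within each bin

        print("autocorrelation at lag 1 (observed vs synthetic):",
              round(np.corrcoef(observed[:-1], observed[1:])[0, 1], 3),
              round(np.corrcoef(synth[:-1], synth[1:])[0, 1], 3))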

  11. Thermodynamic consistency of viscoplastic material models involving external variable rates in the evolution equations for the internal variables

    International Nuclear Information System (INIS)

    Malmberg, T.

    1993-09-01

    The objective of this study is to derive and investigate thermodynamic restrictions for a particular class of internal variable models. Their evolution equations consist of two contributions: the usual irreversible part, depending only on the present state, and a reversible but path dependent part, linear in the rates of the external variables (evolution equations of ''mixed type''). In the first instance the thermodynamic analysis is based on the classical Clausius-Duhem entropy inequality and the Coleman-Noll argument. The analysis is restricted to infinitesimal strains and rotations. The results are specialized and transferred to a general class of elastic-viscoplastic material models. Subsequently, they are applied to several viscoplastic models of ''mixed type'', proposed or discussed in the literature (Robinson et al., Krempl et al., Freed et al.), and it is shown that some of these models are thermodynamically inconsistent. The study is closed with the evaluation of the extended Clausius-Duhem entropy inequality (concept of Mueller) where the entropy flux is governed by an assumed constitutive equation in its own right; also the constraining balance equations are explicitly accounted for by the method of Lagrange multipliers (Liu's approach). This analysis is done for a viscoplastic material model with evolution equations of the ''mixed type''. It is shown that this approach is much more involved than the evaluation of the classical Clausius-Duhem entropy inequality with the Coleman-Noll argument. (orig.) [de

  12. A piecewise modeling approach for climate sensitivity studies: Tests with a shallow-water model

    Science.gov (United States)

    Shao, Aimei; Qiu, Chongjian; Niu, Guo-Yue

    2015-10-01

    In model-based climate sensitivity studies, model errors may grow during continuous long-term integrations in both the "reference" and "perturbed" states and hence the climate sensitivity (defined as the difference between the two states). To reduce the errors, we propose a piecewise modeling approach that splits the continuous long-term simulation into subintervals of sequential short-term simulations, and updates the modeled states through re-initialization at the end of each subinterval. In the re-initialization processes, this approach updates the reference state with analysis data and updates the perturbed states with the sum of analysis data and the difference between the perturbed and the reference states, thereby improving the credibility of the modeled climate sensitivity. We conducted a series of experiments with a shallow-water model to evaluate the advantages of the piecewise approach over the conventional continuous modeling approach. We then investigated the impacts of analysis data error and subinterval length used in the piecewise approach on the simulations of the reference and perturbed states as well as the resulting climate sensitivity. The experiments show that the piecewise approach reduces the errors produced by the conventional continuous modeling approach, more effectively when the analysis data error becomes smaller and the subinterval length is shorter. In addition, we employed a nudging assimilation technique to solve possible spin-up problems caused by re-initializations by using analysis data that contain inconsistent errors between mass and velocity. The nudging technique can effectively diminish the spin-up problem, resulting in a higher modeling skill.
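
    The re-initialisation rule described above (reference state reset to the analysis; perturbed state reset to the analysis plus the perturbed-minus-reference difference) can be sketched with a toy one-variable model. The model, analysis-error level and subinterval length below are placeholders, not the shallow-water setup of the paper.

        import numpy as np

        # Sketch of the piecewise re-initialisation rule; `step_model` and `get_analysis`
        # are placeholders for the real model and analysis data.

        def step_model(state, forcing=0.0):
            """Toy one-variable model with a drifting bias standing in for model error."""
            return 0.95 * state + forcing + 0.02       # the +0.02 term plays the role of model error

        def get_analysis(true_state):
            """Analysis data: the true state plus a small analysis error (assumed)."""
            return true_state + np.random.normal(0.0, 0.01)

        n_subintervals, steps_per_sub = 20, 50
        truth = ref = pert = 0.0
        forcing_perturbation = 0.05                    # the imposed 'perturbed' scenario

        for sub in range(n_subintervals):
            for _ in range(steps_per_sub):
                truth = 0.95 * truth                   # error-free reference evolution
                ref = step_model(ref)
                pert = step_model(pert, forcing_perturbation)
            analysis = get_analysis(truth)
            sensitivity = pert - ref                   # climate sensitivity carried across the restart
            ref = analysis                             # reference state reset to the analysis
            pert = analysis + sensitivity              # perturbed state reset to analysis + difference

        print("piecewise estimate of sensitivity:", round(pert - ref, 4))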

  13. A Partial Least Square Approach for Modeling Gene-gene and Gene-environment Interactions When Multiple Markers Are Genotyped

    Science.gov (United States)

    Wang, Tao; Ho, Gloria; Ye, Kenny; Strickler, Howard; Elston, Robert C.

    2008-01-01

    Genetic association studies achieve an unprecedented level of resolution in mapping disease genes by genotyping dense SNPs in a gene region. Meanwhile, these studies require new powerful statistical tools that can optimally handle a large amount of information provided by genotype data. A question that arises is how to model interactions between two genes. Simply modeling all possible interactions between the SNPs in two gene regions is not desirable because a greatly increased number of degrees of freedom can be involved in the test statistic. We introduce an approach to reduce the genotype dimension in modeling interactions. The genotype compression of this approach is built upon the information on both the trait and the cross-locus gametic disequilibrium between SNPs in two interacting genes, in such a way as to parsimoniously model the interactions without loss of useful information in the process of dimension reduction. As a result, it improves power to detect association in the presence of gene-gene interactions. This approach can be similarly applied for modeling gene-environment interactions. We compare this method with other approaches: the corresponding test without modeling any interaction, that based on a saturated interaction model, that based on principal component analysis, and that based on Tukey’s 1-df model. Our simulations suggest that this new approach has superior power to that of the other methods. In an application to endometrial cancer case-control data from the Women’s Health Initiative (WHI), this approach detected AKT1 and AKT2 as being significantly associated with endometrial cancer susceptibility by taking into account their interactions with BMI. PMID:18615621

  14. A partial least-square approach for modeling gene-gene and gene-environment interactions when multiple markers are genotyped.

    Science.gov (United States)

    Wang, Tao; Ho, Gloria; Ye, Kenny; Strickler, Howard; Elston, Robert C

    2009-01-01

    Genetic association studies achieve an unprecedented level of resolution in mapping disease genes by genotyping dense single nucleotype polymorphisms (SNPs) in a gene region. Meanwhile, these studies require new powerful statistical tools that can optimally handle a large amount of information provided by genotype data. A question that arises is how to model interactions between two genes. Simply modeling all possible interactions between the SNPs in two gene regions is not desirable because a greatly increased number of degrees of freedom can be involved in the test statistic. We introduce an approach to reduce the genotype dimension in modeling interactions. The genotype compression of this approach is built upon the information on both the trait and the cross-locus gametic disequilibrium between SNPs in two interacting genes, in such a way as to parsimoniously model the interactions without loss of useful information in the process of dimension reduction. As a result, it improves power to detect association in the presence of gene-gene interactions. This approach can be similarly applied for modeling gene-environment interactions. We compare this method with other approaches, the corresponding test without modeling any interaction, that based on a saturated interaction model, that based on principal component analysis, and that based on Tukey's one-degree-of-freedom model. Our simulations suggest that this new approach has superior power to that of the other methods. In an application to endometrial cancer case-control data from the Women's Health Initiative, this approach detected AKT1 and AKT2 as being significantly associated with endometrial cancer susceptibility by taking into account their interactions with body mass index.
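
    The dimension-reduction idea behind both records can be sketched generically with scikit-learn: compress the SNPs of each gene region into a small number of trait-guided components (here via PLS regression, used only as a stand-in for the authors' exact construction) and then model the cross-gene interaction on the compressed scores. The genotypes are synthetic, not the WHI data, and in practice the compression step should be nested inside the cross-validation.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(2)

        # Synthetic genotypes (0/1/2 coding) for SNPs in two gene regions; purely illustrative.
        n, p1, p2 = 1000, 8, 6
        G1 = rng.integers(0, 3, (n, p1)).astype(float)
        G2 = rng.integers(0, 3, (n, p2)).astype(float)
        logit = 0.4 * G1[:, 0] * G2[:, 1] - 0.5            # trait depends on a cross-gene interaction
        y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

        # Compress each gene region to one trait-guided component (PLS step).
        c1 = PLSRegression(n_components=1).fit(G1, y).transform(G1).ravel()
        c2 = PLSRegression(n_components=1).fit(G2, y).transform(G2).ravel()

        # Model the gene-gene interaction on the compressed scores:
        # 3 degrees of freedom instead of p1*p2 SNP-by-SNP interaction terms.
        X = np.column_stack([c1, c2, c1 * c2])
        score = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()
        print(f"cross-validated accuracy of the compressed interaction model: {score:.3f}")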

  15. Patient involvement in Danish health care

    DEFF Research Database (Denmark)

    Vrangbaek, Karsten

    2015-01-01

    PURPOSE: The purpose of this paper is to investigate different types of patient involvement in Denmark, and to discuss the potential implications of pursuing several strategies for patient involvement simultaneously. DESIGN/METHODOLOGY/APPROACH: The paper presents a preliminary framework for analysis of patient involvement in health care. This framework is used to analyze key governance features of patient involvement in Denmark based on previous research papers and reports describing patient involvement in Danish health care. FINDINGS: Patient involvement is important in Denmark ... be identified when pursuing the strategies at the same time. RESEARCH LIMITATIONS/IMPLICATIONS: Because of the chosen research approach, the research results may lack generalizability. Therefore, researchers are encouraged to test the proposed framework further. PRACTICAL IMPLICATIONS: The paper includes

  16. A Discrete Monetary Economic Growth Model with the MIU Approach

    Directory of Open Access Journals (Sweden)

    Wei-Bin Zhang

    2008-01-01

    Full Text Available This paper proposes an alternative approach to economic growth with money. The production side is the same as the Solow model, the Ramsey model, and the Tobin model. But we deal with behavior of consumers differently from the traditional approaches. The model is influenced by the money-in-the-utility (MIU approach in monetary economics. It provides a mechanism of endogenous saving which the Solow model lacks and avoids the assumption of adding up utility over a period of time upon which the Ramsey approach is based.

  17. A Modified Approach in Modeling and Calculation of Contact Characteristics of Rough Surfaces

    Directory of Open Access Journals (Sweden)

    J.A. Abdo

    2005-12-01

    Full Text Available A mathematical formulation for the contact of rough surfaces is presented. The derivation of the contact model is facilitated through the definition of plastic asperities that are assumed to be embedded at a critical depth within the actual surface asperities. The surface asperities are assumed to deform elastically whereas the plastic asperities experience only plastic deformation. The deformation of plastic asperities is made to obey the law of conservation of volume. It is believed that the proposed model is advantageous since (a) it provides a more accurate account of the elastic-plastic behavior of surfaces in contact and (b) it is applicable to model formulations that involve asperity shoulder-to-shoulder contact. Comparison of numerical results for estimating true contact area and contact force using the proposed model and the earlier methods suggests that the proposed approach provides a more realistic prediction of elastic-plastic contact behavior.
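
    For contrast with the proposed elastic-plastic formulation, the sketch below implements the classical purely elastic (Greenwood-Williamson-type) baseline, in which true contact area and load follow from integrating Hertzian single-asperity results over a Gaussian summit-height distribution. It is not the authors' model, and all surface parameters are assumed.

        import numpy as np

        # Classical elastic (Greenwood-Williamson-type) baseline for rough-surface contact.
        E_star = 110e9          # effective elastic modulus, Pa (assumed)
        R = 50e-6               # asperity summit radius, m (assumed)
        eta = 4e9               # areal density of asperities, 1/m^2 (assumed)
        sigma = 1e-6            # standard deviation of summit heights, m (assumed)
        A_nominal = 1e-4        # nominal contact area, m^2

        z = np.linspace(-4 * sigma, 4 * sigma, 4001)
        phi = np.exp(-z**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))   # Gaussian height pdf

        def contact(separation):
            """True contact area and load for a given mean-plane separation (Hertzian asperities)."""
            w = np.clip(z - separation, 0.0, None)                 # asperity compression
            area = eta * A_nominal * np.trapz(np.pi * R * w * phi, z)
            load = eta * A_nominal * np.trapz((4.0 / 3.0) * E_star * np.sqrt(R) * w**1.5 * phi, z)
            return area, load

        for d in (3.0e-6, 2.0e-6, 1.0e-6):                         # decreasing separation -> more contact
            a, p = contact(d)
            print(f"separation {d*1e6:.1f} um: true area ratio {a/A_nominal:.4f}, load {p:.1f} N")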

  18. An Application of Bayesian Approach in Modeling Risk of Death in an Intensive Care Unit.

    Science.gov (United States)

    Wong, Rowena Syn Yin; Ismail, Noor Azina

    2016-01-01

    There are not many studies that attempt to model intensive care unit (ICU) risk of death in developing countries, especially in South East Asia. The aim of this study was to propose and describe the application of a Bayesian approach in modeling in-ICU deaths in a Malaysian ICU. This was a prospective study in a mixed medical-surgical ICU in a multidisciplinary tertiary referral hospital in Malaysia. Data collection included variables that were defined in the Acute Physiology and Chronic Health Evaluation IV (APACHE IV) model. A Bayesian Markov chain Monte Carlo (MCMC) simulation approach was applied in the development of four multivariate logistic regression predictive models for the ICU, where the main outcome measure was in-ICU mortality risk. The performance of the models was assessed through overall model fit, discrimination and calibration measures. Results from the Bayesian models were also compared against results obtained using the frequentist maximum likelihood method. The study involved 1,286 consecutive ICU admissions between January 1, 2009 and June 30, 2010, of which 1,111 met the inclusion criteria. Patients who were admitted to the ICU were generally younger, predominantly male, with a low co-morbidity load and mostly under mechanical ventilation. The overall in-ICU mortality rate was 18.5% and the overall mean Acute Physiology Score (APS) was 68.5. All four models exhibited good discrimination, with area under the receiver operating characteristic curve (AUC) values of approximately 0.8. Calibration was acceptable (Hosmer-Lemeshow p-values > 0.05) for all models, except for model M3. Model M1 was identified as the model with the best overall performance in this study. Four prediction models were proposed, of which the best model was chosen based on its overall performance in this study. This study has also demonstrated the promising potential of the Bayesian MCMC approach as an alternative in the analysis and modeling of in-ICU mortality outcomes.
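
    The flavour of the Bayesian MCMC approach can be conveyed with a minimal random-walk Metropolis sampler for a logistic regression on synthetic data. The covariates are invented (they are not the APACHE IV variables) and the sampler is a generic choice, not necessarily the scheme used in the study.

        import numpy as np

        rng = np.random.default_rng(3)

        # Synthetic stand-in for ICU data: two covariates, neither the APACHE IV variables
        # nor the study's data.
        n = 800
        X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
        beta_true = np.array([-1.5, 0.9, 0.4])
        y = (rng.random(n) < 1 / (1 + np.exp(-X @ beta_true))).astype(float)

        def log_posterior(beta):
            """Bernoulli likelihood with a vague Normal(0, 10^2) prior on each coefficient."""
            eta = X @ beta
            loglik = np.sum(y * eta - np.log1p(np.exp(eta)))
            logprior = -0.5 * np.sum(beta**2) / 100.0
            return loglik + logprior

        # Random-walk Metropolis sampler (a simple MCMC scheme, used here only for illustration).
        beta = np.zeros(3)
        samples, lp = [], log_posterior(beta)
        for it in range(20000):
            prop = beta + 0.05 * rng.normal(size=3)
            lp_prop = log_posterior(prop)
            if np.log(rng.random()) < lp_prop - lp:
                beta, lp = prop, lp_prop
            if it >= 5000:                      # discard burn-in
                samples.append(beta)

        samples = np.array(samples)
        print("posterior means:", np.round(samples.mean(axis=0), 2), "| true:", beta_true)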

  19. Nonperturbative approach to the attractive Hubbard model

    International Nuclear Information System (INIS)

    Allen, S.; Tremblay, A.-M. S.

    2001-01-01

    A nonperturbative approach to the single-band attractive Hubbard model is presented in the general context of functional-derivative approaches to many-body theories. As in previous work on the repulsive model, the first step is based on a local-field-type ansatz, on the enforcement of the Pauli principle and on a number of crucial sum rules. The Mermin-Wagner theorem in two dimensions is automatically satisfied. At this level, two-particle self-consistency has been achieved. In the second step of the approximation, an improved expression for the self-energy is obtained by using the results of the first step in an exact expression for the self-energy, where the high- and low-frequency behaviors appear separately. The result is a cooperon-like formula. The required vertex corrections are included in this self-energy expression, as required by the absence of a Migdal theorem for this problem. Other approaches to the attractive Hubbard model are critically compared. Physical consequences of the present approach and agreement with Monte Carlo simulations are demonstrated in the accompanying paper (following this one)

  20. Systems biology integration of proteomic data in rodent models of depression reveals involvement of the immune response and glutamatergic signaling.

    Science.gov (United States)

    Carboni, Lucia; Nguyen, Thanh-Phuong; Caberlotto, Laura

    2016-12-01

    The pathophysiological basis of major depression is incompletely understood. Recently, numerous proteomic studies have been performed in rodent models of depression to investigate the molecular underpinnings of depressive-like behaviours with an unbiased approach. The objective of the study is to integrate the results of these proteomic studies in depression models to shed light on the most relevant molecular pathways involved in the disease. Network analysis is performed integrating pre-existing proteomic data from rodent models of depression. The IntAct mouse database and the HPRD are used as reference protein-protein interaction databases. The functionality analyses of the networks are then performed by testing overrepresented GO biological process terms and pathways. Functional enrichment analyses of the networks revealed an association with molecular processes related to depression in humans, such as those involved in the immune response. Pathways impacted by clinically effective antidepressants are modulated, including glutamatergic signaling and neurotrophic responses. Moreover, dysregulation of proteins regulating energy metabolism and circadian rhythms is implicated. The comparison with protein pathways modulated in depressive patients revealed significant overlap. This systems biology study supports the notion that animal models can contribute to research into the biology and therapeutics of depression. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Fractal approach to computer-analytical modelling of tree crown

    International Nuclear Information System (INIS)

    Berezovskaya, F.S.; Karev, G.P.; Kisliuk, O.F.; Khlebopros, R.G.; Tcelniker, Yu.L.

    1993-09-01

    In this paper we discuss three approaches to the modeling of tree crown development. These approaches are experimental (i.e. regression-based), theoretical (i.e. analytical) and simulation (i.e. computer) modeling. The common assumption of these is that a tree can be regarded as a fractal object, i.e. a collection of self-similar objects that combines the properties of two- and three-dimensional bodies. We show that a fractal measure of the crown can be used as the link between mathematical models of crown growth and light propagation through the canopy. The computer approach gives the possibility to visualize crown development and to calibrate the model on experimental data. In the paper the different stages of the above-mentioned approaches are described. The experimental data for spruce, the description of the computer system for modeling and a variant of the computer model are presented. (author). 9 refs, 4 figs

  2. Coarse-Grained Model for Water Involving a Virtual Site.

    Science.gov (United States)

    Deng, Mingsen; Shen, Hujun

    2016-02-04

    In this work, we propose a new coarse-grained (CG) model for water by combining the features of two popular CG water models (the BMW and MARTINI models) and by adopting a topology similar to that of the TIP4P water model. In this CG model, a CG unit, representing four real water molecules, consists of a virtual site, two positively charged particles, and a van der Waals (vdW) interaction center. A distance constraint is applied to the bonds formed between the vdW interaction center and the positively charged particles. The virtual site, which carries a negative charge, is determined by the locations of the two positively charged particles and the vdW interaction center. For the new CG model of water, we coined the name "CAVS" (charge is attached to a virtual site) due to the involvement of the virtual site. After being tested in molecular dynamics (MD) simulations of bulk water at various time steps, under different temperatures and in different salt (NaCl) concentrations, the CAVS model offers encouraging predictions for some bulk properties of water (such as density, dielectric constant, etc.) when compared to experimental ones.

  3. Stakeholder involvement in establishing a milk quality sub-index in dairy cow breeding goals: a Delphi approach.

    Science.gov (United States)

    Henchion, M; McCarthy, M; Resconi, V C; Berry, D P; McParland, S

    2016-05-01

    The relative weightings on traits within breeding goals are generally determined by bio-economic models or profit functions. While such methods have generally delivered profitability gains to producers, and are being expanded to consider non-market values, current approaches generally do not consider the numerous and diverse stakeholders that affect, or are affected by, such tools. Based on the principles of respondent anonymity, iteration, controlled feedback and statistical aggregation of feedback, a Delphi study was undertaken to gauge stakeholder opinion on the importance of detailed milk quality traits within an overall dairy breeding goal for profit, with the aim of assessing its suitability as a complementary, participatory approach to defining breeding goals. The questionnaires used over two survey rounds asked stakeholders: (a) their opinion on incorporating an explicit sub-index for milk quality into a national breeding goal; (b) the importance they would assign to a pre-determined list of milk quality traits and (c) the (relative) weighting they would give such a milk quality sub-index. Results from the survey highlighted a good degree of consensus among stakeholders on the issues raised. Similarly, revelation of the underlying assumptions and knowledge used by stakeholders to make their judgements illustrated their ability to consider a range of perspectives when evaluating traits, and to reconsider their answers based on the responses and rationales given by others, which demonstrated social learning. Finally, while the relative importance assigned by stakeholders in the Delphi survey (4% to 10%) and the results of calculations based on selection index theory of the relative emphasis that should be placed on milk quality to halt any deterioration (16%) are broadly in line, the difference indicates the benefit of considering more than one approach to determining breeding goals. This study thus illustrates the role of the Delphi technique, as a complementary

  4. Hong Kong Chinese daughters' intergenerational caregiving obligations: a cultural model approach.

    Science.gov (United States)

    Holroyd, E

    2001-11-01

    This paper, based on a study carried out in Hong Kong, outlines the caregiving obligations of Hong Kong Chinese daughters towards their frail elderly parents. A cultural model approach drawn from cognitive anthropology is taken to focus on how Chinese caregiving daughters develop a sense of what is right and emotionally fulfilling and acquire the motivation to care for their parents. An ethnographic approach was used in the study and techniques included guided and open-ended interviews and non-participatory observations. A total of 20 co-residential caregiving daughters were interviewed in their homes on average twice over the course of one year. All interviews were conducted in Cantonese. Although the sample was small, daughters' accounts are structured by reference to cultural models and this structure provides the common basis for generalisability of results. Concepts of Confucian antecedents, reciprocity and personhood and other modern ideas of filial duty are explored. Conclusions are drawn about the shifting rights and obligations of Chinese caregiving daughters within the contemporary urban realities of Hong Kong. The findings of this study have relevance for the development of welfare policy for older Chinese persons and the chronically ill, and to all services involving women. The findings will also serve to inform family caregiver education programs.

  5. A moving approach for the Vector Hysteron Model

    Energy Technology Data Exchange (ETDEWEB)

    Cardelli, E. [Department of Engineering, University of Perugia, Via G. Duranti 93, 06125 Perugia (Italy); Faba, A., E-mail: antonio.faba@unipg.it [Department of Engineering, University of Perugia, Via G. Duranti 93, 06125 Perugia (Italy); Laudani, A. [Department of Engineering, Roma Tre University, Via V. Volterra 62, 00146 Rome (Italy); Quondam Antonio, S. [Department of Engineering, University of Perugia, Via G. Duranti 93, 06125 Perugia (Italy); Riganti Fulginei, F.; Salvini, A. [Department of Engineering, Roma Tre University, Via V. Volterra 62, 00146 Rome (Italy)

    2016-04-01

    A moving approach for the VHM (Vector Hysteron Model) is described here, to reconstruct both the scalar and rotational magnetization of electrical steels with weak anisotropy, such as non-oriented grain silicon steel. The hysteron distribution is postulated to be a function of the magnetization state of the material, in order to overcome the practical limitation of the congruency property of the standard VHM approach. By using this formulation and a suitable accommodation procedure, the results obtained indicate that the model is accurate, in particular in reproducing the experimental behavior approaching the saturation region, allowing a real improvement with respect to the previous approach.

  6. A Declarative Design Approach to Modeling Traditional and Non-Traditional Space Systems

    Science.gov (United States)

    Hoag, Lucy M.

    The space system design process is known to be laborious, complex, and computationally demanding. It is highly multi-disciplinary, involving several interdependent subsystems that must be both highly optimized and reliable due to the high cost of launch. Satellites must also be capable of operating in harsh and unpredictable environments, so integrating high-fidelity analysis is important. To address each of these concerns, a holistic design approach is necessary. However, while the sophistication of space systems has evolved significantly in the last 60 years, improvements in the design process have been comparatively stagnant. Space systems continue to be designed using a procedural, subsystem-by-subsystem approach. This method is inadequate since it generally requires extensive iteration and limited or heuristic-based search, which can be slow, labor-intensive, and inaccurate. The use of a declarative design approach can potentially address these inadequacies. In the declarative programming style, the focus of a problem is placed on what the objective is, and not necessarily how it should be achieved. In the context of design, this entails knowledge expressed as a declaration of statements that are true about the desired artifact instead of explicit instructions on how to implement it. A well-known technique is through constraint-based reasoning, where a design problem is represented as a network of rules and constraints that are reasoned across by a solver to dynamically discover the optimal candidate(s). This enables implicit instantiation of the tradespace and allows for automatic generation of all feasible design candidates. As such, this approach also appears to be well-suited to modeling adaptable space systems, which generally have large tradespaces and possess configurations that are not well-known a priori. This research applied a declarative design approach to holistic satellite design and to tradespace exploration for adaptable space systems. The
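
    The constraint-based reasoning idea can be sketched with a brute-force enumerator over a tiny discrete tradespace: the design is stated declaratively as a set of constraints, and every feasible candidate is discovered automatically rather than assembled subsystem by subsystem. The subsystem options, parametric mass model and limits below are invented for illustration.

        from itertools import product

        # Declarative-style sketch: state WHAT a satellite design must satisfy, then let a
        # (brute-force) solver discover every feasible candidate. All numbers are invented.
        options = {
            "solar_array_W": [200, 400, 800],
            "battery_Wh":    [100, 300, 600],
            "downlink_Mbps": [2, 10, 50],
            "propellant_kg": [5, 20, 60],
        }

        payload_power_W = 250
        mass_budget_kg = 180

        def mass_kg(d):
            """Crude parametric mass model (assumed coefficients)."""
            return 0.15 * d["solar_array_W"] + 0.08 * d["battery_Wh"] + 1.2 * d["downlink_Mbps"] + d["propellant_kg"]

        constraints = [
            lambda d: d["solar_array_W"] >= payload_power_W + 100,   # power margin
            lambda d: d["battery_Wh"] >= 0.5 * d["solar_array_W"],   # eclipse energy storage
            lambda d: d["downlink_Mbps"] >= 10,                      # science data return
            lambda d: mass_kg(d) <= mass_budget_kg,                  # launch mass budget
        ]

        feasible = [dict(zip(options, combo)) for combo in product(*options.values())
                    if all(c(dict(zip(options, combo))) for c in constraints)]

        best = min(feasible, key=mass_kg)                            # pick the lightest feasible design
        print(f"{len(feasible)} feasible designs out of {3**4}; lightest: {best} ({mass_kg(best):.1f} kg)")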

  7. Personalization of models with many model parameters: an efficient sensitivity analysis approach.

    Science.gov (United States)

    Donders, W P; Huberts, W; van de Vosse, F N; Delhaas, T

    2015-10-01

    Uncertainty quantification and global sensitivity analysis are indispensable for patient-specific applications of models that enhance diagnosis or aid decision-making. Variance-based sensitivity analysis methods, which apportion each fraction of the output uncertainty (variance) to the effects of individual input parameters or their interactions, are considered the gold standard. The variance portions are called the Sobol sensitivity indices and can be estimated by a Monte Carlo (MC) approach (e.g., Saltelli's method [1]) or by employing a metamodel (e.g., the (generalized) polynomial chaos expansion (gPCE) [2, 3]). All these methods require a large number of model evaluations when estimating the Sobol sensitivity indices for models with many parameters [4]. To reduce the computational cost, we introduce a two-step approach. In the first step, a subset of important parameters is identified for each output of interest using the screening method of Morris [5]. In the second step, a quantitative variance-based sensitivity analysis is performed using gPCE. Efficient sampling strategies are introduced to minimize the number of model runs required to obtain the sensitivity indices for models considering multiple outputs. The approach is tested using a model that was developed for predicting post-operative flows after creation of a vascular access for renal failure patients. We compare the sensitivity indices obtained with the novel two-step approach with those obtained from a reference analysis that applies Saltelli's MC method. The two-step approach was found to yield accurate estimates of the sensitivity indices at two orders of magnitude lower computational cost. Copyright © 2015 John Wiley & Sons, Ltd.
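
    The first (screening) step can be illustrated with a minimal NumPy implementation of Morris elementary effects on a toy six-parameter model; the vascular access model itself and the gPCE step are not reproduced here, and all settings are illustrative.

        import numpy as np

        rng = np.random.default_rng(4)

        def model(x):
            """Toy model with 6 parameters, only a few of which matter (stands in for a full model)."""
            return 3.0 * x[0] + 0.1 * x[1] + 2.0 * x[2] ** 2 + 0.01 * x[3] + 0.05 * x[4] * x[5]

        k, r, delta = 6, 30, 0.2          # number of parameters, trajectories, step size (unit hypercube)
        mu_star = np.zeros(k)

        for _ in range(r):                 # one elementary effect per parameter per trajectory
            x = rng.uniform(0, 1 - delta, k)
            y0 = model(x)
            for i in rng.permutation(k):   # perturb parameters one at a time, in random order
                x_new = x.copy()
                x_new[i] += delta
                mu_star[i] += abs(model(x_new) - y0) / delta
                x, y0 = x_new, model(x_new)

        mu_star /= r
        ranking = np.argsort(mu_star)[::-1]
        print("mean absolute elementary effects:", np.round(mu_star, 3))
        print("parameters ranked by importance:", ranking)   # the subset kept for the quantitative step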

  8. Stochastic approaches to inflation model building

    International Nuclear Information System (INIS)

    Ramirez, Erandy; Liddle, Andrew R.

    2005-01-01

    While inflation gives an appealing explanation of observed cosmological data, there are a wide range of different inflation models, providing differing predictions for the initial perturbations. Typically models are motivated either by fundamental physics considerations or by simplicity. An alternative is to generate large numbers of models via a random generation process, such as the flow equations approach. The flow equations approach is known to predict a definite structure to the observational predictions. In this paper, we first demonstrate a more efficient implementation of the flow equations exploiting an analytic solution found by Liddle (2003). We then consider alternative stochastic methods of generating large numbers of inflation models, with the aim of testing whether the structures generated by the flow equations are robust. We find that while typically there remains some concentration of points in the observable plane under the different methods, there is significant variation in the predictions amongst the methods considered

  9. Application of a random effects negative binomial model to examine tram-involved crash frequency on route sections in Melbourne, Australia.

    Science.gov (United States)

    Naznin, Farhana; Currie, Graham; Logan, David; Sarvi, Majid

    2016-07-01

    Safety is a key concern in the design, operation and development of light rail systems including trams or streetcars as they impose crash risks on road users in terms of crash frequency and severity. The aim of this study is to identify key traffic, transit and route factors that influence tram-involved crash frequencies along tram route sections in Melbourne. A random effects negative binomial (RENB) regression model was developed to analyze crash frequency data obtained from Yarra Trams, the tram operator in Melbourne. The RENB modelling approach can account for spatial and temporal variations within observation groups in panel count data structures by assuming that group specific effects are randomly distributed across locations. The results identify many significant factors affecting tram-involved crash frequency including tram service frequency (2.71), tram stop spacing (-0.42), tram route section length (0.31), tram signal priority (-0.25), general traffic volume (0.18), tram lane priority (-0.15) and ratio of platform tram stops (-0.09). Findings provide useful insights on route section level tram-involved crashes in an urban tram or streetcar operating environment. The method described represents a useful planning tool for transit agencies hoping to improve safety performance. Copyright © 2016 Elsevier Ltd. All rights reserved.
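    A hedged sketch of the core regression only, using a plain negative binomial GLM on synthetic route-section data; the random-effects (RENB) panel structure of the paper is not reproduced, and all variables and coefficients are invented.

```python
# Sketch: negative binomial count model for crash frequency on synthetic data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_obs = 300
service_freq = rng.uniform(5, 30, n_obs)       # trams per hour (invented)
stop_spacing = rng.uniform(0.2, 0.8, n_obs)    # km (invented)
section_len  = rng.uniform(0.5, 3.0, n_obs)    # km (invented)

# Synthetic "true" log-linear relationship for the expected crash count
mu = np.exp(-1.0 + 0.05 * service_freq - 0.8 * stop_spacing + 0.3 * section_len)
y = rng.negative_binomial(n=2.0, p=2.0 / (2.0 + mu))   # overdispersed counts

X = sm.add_constant(np.column_stack([service_freq, stop_spacing, section_len]))
model = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=0.5))
result = model.fit()
print(result.summary())
```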

  10. Hydrologic Model Development and Calibration: Contrasting a Single- and Multi-Objective Approach for Comparing Model Performance

    Science.gov (United States)

    Asadzadeh, M.; Maclean, A.; Tolson, B. A.; Burn, D. H.

    2009-05-01

    Hydrologic model calibration aims to find a set of parameters that adequately simulates observations of watershed behavior, such as streamflow, or a state variable, such as snow water equivalent (SWE). There are different metrics for evaluating calibration effectiveness that involve quantifying prediction errors, such as the Nash-Sutcliffe (NS) coefficient and bias evaluated for the entire calibration period, on a seasonal basis, for low flows, or for high flows. Many of these metrics are conflicting such that the set of parameters that maximizes the high flow NS differs from the set of parameters that maximizes the low flow NS. Conflicting objectives are very likely when different calibration objectives are based on different fluxes and/or state variables (e.g., NS based on streamflow versus SWE). One of the most popular ways to balance different metrics is to aggregate them based on their importance and find the set of parameters that optimizes a weighted sum of the efficiency metrics. Comparing alternative hydrologic models (e.g., assessing model improvement when a process or more detail is added to the model) based on the aggregated objective might be misleading since it represents one point on the tradeoff of desired error metrics. To derive a more comprehensive model comparison, we solved a bi-objective calibration problem to estimate the tradeoff between two error metrics for each model. Although this approach is computationally more expensive than the aggregation approach, it results in a better understanding of the effectiveness of selected models at each level of every error metric and therefore provides a better rationale for judging relative model quality. The two alternative models used in this study are two MESH hydrologic models (version 1.2) of the Wolf Creek Research basin that differ in their watershed spatial discretization (a single Grouped Response Unit, GRU, versus multiple GRUs). The MESH model, currently under development by Environment
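    A small illustration of the bi-objective idea on a toy recession model: parameter sets are sampled, two conflicting NSE metrics (high-flow and low-flow) are computed, and the non-dominated (Pareto) set is kept instead of optimizing a single weighted sum. The model, data and sampling scheme are placeholders, not the MESH setup used in the study.

```python
# Sketch: estimate the tradeoff (Pareto front) between two error metrics.
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 10, 200)
obs = 5.0 * np.exp(-0.4 * t) + 0.5 + rng.normal(0.0, 0.2, t.size)  # synthetic "observed" flow

def simulate(k, base):
    return 5.0 * np.exp(-k * t) + base

def nse(sim, obs, mask):
    err = np.sum((sim[mask] - obs[mask]) ** 2)
    var = np.sum((obs[mask] - obs[mask].mean()) ** 2)
    return 1.0 - err / var

high = obs > np.median(obs)                  # "high flow" part of the record
low = ~high

params = rng.uniform([0.1, 0.0], [1.0, 1.0], size=(500, 2))
scores = np.array([[nse(simulate(k, b), obs, high),
                    nse(simulate(k, b), obs, low)] for k, b in params])

# A parameter set is Pareto-optimal if no other set is at least as good in
# both objectives and strictly better in one (maximisation).
pareto = []
for i, s in enumerate(scores):
    dominated = np.any(np.all(scores >= s, axis=1) & np.any(scores > s, axis=1))
    if not dominated:
        pareto.append(i)
print("Pareto-optimal parameter sets:\n", params[pareto])
```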

  11. Across frequency processes involved in auditory detection of coloration

    DEFF Research Database (Denmark)

    Buchholz, Jörg; Kerketsos, P

    2008-01-01

    When an early wall reflection is added to a direct sound, a spectral modulation is introduced to the signal's power spectrum. This spectral modulation typically produces an auditory sensation of coloration or pitch. Throughout this study, auditory spectral-integration effects involved in coloration detection are investigated. Coloration detection thresholds were therefore measured as a function of reflection delay and stimulus bandwidth. In order to investigate the involved auditory mechanisms, an auditory model was employed that was conceptually similar to the peripheral weighting model [Yost, JASA...]. The filterbank was designed to approximate auditory filter-shapes measured by Oxenham and Shera [JARO, 2003, 541-554], derived from forward masking data. The results of the present study demonstrate that a “purely” spectrum-based model approach can successfully describe auditory coloration detection even at high...

  12. A Conceptual Modeling Approach for OLAP Personalization

    Science.gov (United States)

    Garrigós, Irene; Pardillo, Jesús; Mazón, Jose-Norberto; Trujillo, Juan

    Data warehouses rely on multidimensional models in order to provide decision makers with appropriate structures to intuitively analyze data with OLAP technologies. However, data warehouses may be potentially large and multidimensional structures become increasingly complex to be understood at a glance. Even if a departmental data warehouse (also known as data mart) is used, these structures would be also too complex. As a consequence, acquiring the required information is more costly than expected and decision makers using OLAP tools may get frustrated. In this context, current approaches for data warehouse design are focused on deriving a unique OLAP schema for all analysts from their previously stated information requirements, which is not enough to lighten the complexity of the decision making process. To overcome this drawback, we argue for personalizing multidimensional models for OLAP technologies according to the continuously changing user characteristics, context, requirements and behaviour. In this paper, we present a novel approach to personalizing OLAP systems at the conceptual level based on the underlying multidimensional model of the data warehouse, a user model and a set of personalization rules. The great advantage of our approach is that a personalized OLAP schema is provided for each decision maker contributing to better satisfy their specific analysis needs. Finally, we show the applicability of our approach through a sample scenario based on our CASE tool for data warehouse development.

  13. A BEHAVIORAL-APPROACH TO LINEAR EXACT MODELING

    NARCIS (Netherlands)

    ANTOULAS, AC; WILLEMS, JC

    1993-01-01

    The behavioral approach to system theory provides a parameter-free framework for the study of the general problem of linear exact modeling and recursive modeling. The main contribution of this paper is the solution of the (continuous-time) polynomial-exponential time series modeling problem. Both

  14. A computational modeling approach for the characterization of mechanical properties of 3D alginate tissue scaffolds.

    Science.gov (United States)

    Nair, K; Yan, K C; Sun, W

    2008-01-01

    Scaffold guided tissue engineering is an innovative approach wherein cells are seeded onto biocompatible and biodegradable materials to form 3-dimensional (3D) constructs that, when implanted in the body, facilitate the regeneration of tissue. Tissue scaffolds act as artificial extracellular matrix providing the environment conducive for tissue growth. Characterization of scaffold properties is necessary to better understand the underlying processes involved in controlling cell behavior and formation of functional tissue. We report a computational modeling approach to characterize mechanical properties of 3D gel-like biomaterial, specifically, a 3D alginate scaffold encapsulated with cells. Alginate's inherent nonlinearity and variations arising from minute changes in its concentration and viscosity make experimental evaluation of its mechanical properties a challenging and time-consuming task. We developed an in silico model to determine the stress-strain relationship of alginate based scaffolds from experimental data. In particular, we compared the Ogden hyperelastic model to other hyperelastic material models and determined that this model was the most suitable to characterize the nonlinear behavior of alginate. We further propose a mathematical model that represents the alginate material constants in the Ogden model as a function of concentration and viscosity. This study demonstrates the model capability to predict mechanical properties of 3D alginate scaffolds.
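    A minimal sketch of fitting a one-term incompressible Ogden model to uniaxial stress-stretch data with nonlinear least squares; the data points and material constants are synthetic stand-ins for the alginate measurements.

```python
# Sketch: one-term Ogden fit to synthetic uniaxial data.
import numpy as np
from scipy.optimize import curve_fit

def ogden_uniaxial(lam, mu, alpha):
    # Nominal (engineering) stress for a one-term Ogden model under uniaxial
    # tension, assuming incompressibility:
    # P = mu * (lam**(alpha-1) - lam**(-alpha/2 - 1))
    return mu * (lam ** (alpha - 1.0) - lam ** (-alpha / 2.0 - 1.0))

# Synthetic "experimental" data generated with mu = 12 kPa, alpha = 3.5 + noise
rng = np.random.default_rng(3)
lam = np.linspace(1.0, 1.6, 25)
stress = ogden_uniaxial(lam, 12.0, 3.5) + rng.normal(0.0, 0.2, lam.size)

(mu_fit, alpha_fit), _ = curve_fit(ogden_uniaxial, lam, stress, p0=[1.0, 2.0])
print(f"fitted mu = {mu_fit:.2f} kPa, alpha = {alpha_fit:.2f}")
```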

  15. Service quality benchmarking via a novel approach based on fuzzy ELECTRE III and IPA: an empirical case involving the Italian public healthcare context.

    Science.gov (United States)

    La Fata, Concetta Manuela; Lupo, Toni; Piazza, Tommaso

    2017-11-21

    A novel fuzzy-based approach which combines ELECTRE III along with the Importance-Performance Analysis (IPA) is proposed in the present work to comparatively evaluate the service quality in the public healthcare context. Specifically, ELECTRE III is firstly considered to compare the service performance of examined hospitals in a noncompensatory manner. Afterwards, IPA is employed to support the service quality management to point out improvement needs and their priorities. The proposed approach also incorporates features of the Fuzzy Set Theory so as to address the possible uncertainty, subjectivity and vagueness of involved experts in evaluating the service quality. The model is applied to five major Sicilian public hospitals, and strengths and criticalities of the delivered service are finally highlighted and discussed. Although several approaches combining multi-criteria methods have already been proposed in the literature to evaluate the service performance in the healthcare field, to the best of the authors' knowledge the present work represents the first attempt at comparing service performance of alternatives in a noncompensatory manner in the investigated context.
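    An illustrative sketch of the IPA step only: attributes are placed into the four importance-performance quadrants relative to the grand means. The attribute names and scores are invented, and the fuzzy ELECTRE III ranking stage is not reproduced here.

```python
# Sketch: Importance-Performance Analysis quadrant classification.
attributes = {
    # name: (importance, performance), both on an assumed 1-9 scale
    "waiting time":        (8.2, 4.1),
    "staff courtesy":      (7.5, 7.9),
    "ward cleanliness":    (6.0, 7.2),
    "signage/information": (4.5, 3.8),
}

imp_mean = sum(v[0] for v in attributes.values()) / len(attributes)
perf_mean = sum(v[1] for v in attributes.values()) / len(attributes)

for name, (imp, perf) in attributes.items():
    if imp >= imp_mean and perf < perf_mean:
        quadrant = "Concentrate here (improvement priority)"
    elif imp >= imp_mean and perf >= perf_mean:
        quadrant = "Keep up the good work"
    elif imp < imp_mean and perf >= perf_mean:
        quadrant = "Possible overkill"
    else:
        quadrant = "Low priority"
    print(f"{name:22s} -> {quadrant}")
```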

  16. METHODOLOGICAL APPROACHES FOR MODELING THE RURAL SETTLEMENT DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Gorbenkova Elena Vladimirovna

    2017-10-01

    Full Text Available Subject: the paper describes the research results on validation of a rural settlement developmental model. The basic methods and approaches for solving the problem of assessment of the urban and rural settlement development efficiency are considered. Research objectives: determination of methodological approaches to modeling and creating a model for the development of rural settlements. Materials and methods: domestic and foreign experience in modeling the territorial development of urban and rural settlements and settlement structures was generalized. The motivation for using the Pentagon-model for solving similar problems was demonstrated. Based on a systematic analysis of existing development models of urban and rural settlements as well as the authors-developed method for assessing the level of agro-towns development, the systems/factors that are necessary for a rural settlement sustainable development are identified. Results: we created the rural development model which consists of five major systems that include critical factors essential for achieving a sustainable development of a settlement system: ecological system, economic system, administrative system, anthropogenic (physical system and social system (supra-structure. The methodological approaches for creating an evaluation model of rural settlements development were revealed; the basic motivating factors that provide interrelations of systems were determined; the critical factors for each subsystem were identified and substantiated. Such an approach was justified by the composition of tasks for territorial planning of the local and state administration levels. The feasibility of applying the basic Pentagon-model, which was successfully used for solving the analogous problems of sustainable development, was shown. Conclusions: the resulting model can be used for identifying and substantiating the critical factors for rural sustainable development and also become the basis of

  17. A Dynamic Intelligent Decision Approach to Dependency Modeling of Project Tasks in Complex Engineering System Optimization

    Directory of Open Access Journals (Sweden)

    Tinggui Chen

    2013-01-01

    Full Text Available Complex engineering system optimization usually involves multiple projects or tasks. On the one hand, dependency modeling among projects or tasks highlights structures in systems and their environments which can help to understand the implications of connectivity on different aspects of system performance and also assist in designing, optimizing, and maintaining complex systems. On the other hand, multiple projects or tasks are either happening at the same time or scheduled into a sequence in order to use common resources. In this paper, we propose a dynamic intelligent decision approach to dependency modeling of project tasks in complex engineering system optimization. The approach takes this decision process as a two-stage decision-making problem. In the first stage, a task clustering approach based on modularization is proposed so as to find out a suitable decomposition scheme for a large-scale project. In the second stage, according to the decomposition result, a discrete artificial bee colony (ABC algorithm inspired by the intelligent foraging behavior of honeybees is developed for the resource constrained multiproject scheduling problem. Finally, a certain case from an engineering design of a chemical processing system is utilized to help to understand the proposed approach.

  18. Statistical aspects of carbon fiber risk assessment modeling. [fire accidents involving aircraft

    Science.gov (United States)

    Gross, D.; Miller, D. R.; Soland, R. M.

    1980-01-01

    The probabilistic and statistical aspects of the carbon fiber risk assessment modeling of fire accidents involving commercial aircraft are examined. Three major sources of uncertainty in the modeling effort are identified. These are: (1) imprecise knowledge in establishing the model; (2) parameter estimation; and (3) Monte Carlo sampling error. All three sources of uncertainty are treated and statistical procedures are utilized and/or developed to control them wherever possible.
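    A short sketch of how the third source of uncertainty (Monte Carlo sampling error) can be quantified and controlled, using an invented limit-state function: the estimated probability is reported with its binomial standard error, which shrinks as 1/sqrt(n).

```python
# Sketch: tracking Monte Carlo sampling error alongside a probability estimate.
import numpy as np

rng = np.random.default_rng(4)
n = 200_000

# Toy "damage" model: failure when a lognormal exposure exceeds a threshold.
exposure = rng.lognormal(mean=0.0, sigma=1.0, size=n)
failures = exposure > 8.0

p_hat = failures.mean()
std_err = np.sqrt(p_hat * (1.0 - p_hat) / n)       # binomial standard error
print(f"P(failure) ~= {p_hat:.5f} +/- {1.96 * std_err:.5f} (95% CI)")
# Increasing n shrinks the sampling error as 1/sqrt(n); the other two sources
# of uncertainty (model form, parameter estimates) need separate treatment.
```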

  19. Towards a 3d Spatial Urban Energy Modelling Approach

    Science.gov (United States)

    Bahu, J.-M.; Koch, A.; Kremers, E.; Murshed, S. M.

    2013-09-01

    Today's needs to reduce the environmental impact of energy use impose dramatic changes for energy infrastructure and existing demand patterns (e.g. buildings) corresponding to their specific context. In addition, future energy systems are expected to integrate a considerable share of fluctuating power sources and equally a high share of distributed generation of electricity. Energy system models capable of describing such future systems and allowing the simulation of the impact of these developments thus require a spatial representation in order to reflect the local context and the boundary conditions. This paper describes two recent research approaches developed at EIFER in the fields of (a) geo-localised simulation of heat energy demand in cities based on 3D morphological data and (b) spatially explicit Agent-Based Models (ABM) for the simulation of smart grids. 3D city models were used to assess solar potential and heat energy demand of residential buildings, which enables cities to target the building refurbishment potentials. Distributed energy systems require innovative modelling techniques where individual components are represented and can interact. With this approach, several smart grid demonstrators were simulated, where heterogeneous models are spatially represented. Coupling 3D geodata with energy system ABMs holds different advantages for both approaches. On one hand, energy system models can be enhanced with high resolution data from 3D city models and their semantic relations. Furthermore, they allow for spatial analysis and visualisation of the results, with emphasis on spatial and structural correlations among the different layers (e.g. infrastructure, buildings, administrative zones) to provide an integrated approach. On the other hand, 3D models can benefit from more detailed system description of energy infrastructure, representing dynamic phenomena and high resolution models for energy use at component level. The proposed modelling strategies

  20. Introducing an Intervention Model for Fostering Affective Involvement with Persons Who Are Congenitally Deafblind

    NARCIS (Netherlands)

    Martens, M.A.W.; Janssen, M.J.; Ruijssenaars, A.J.J.M.; Riksen-Walraven, J.M.A.

    2014-01-01

    The article presented here introduces the Intervention Model for Affective Involvement (IMAI), which was designed to train staff members (for example, teachers, caregivers, support workers) to foster affective involvement during interaction and communication with persons who have congenital

  1. A dynamic approach for the impact of a toxic gas dispersion hazard considering human behaviour and dispersion modelling.

    Science.gov (United States)

    Lovreglio, Ruggiero; Ronchi, Enrico; Maragkos, Georgios; Beji, Tarek; Merci, Bart

    2016-11-15

    The release of toxic gases due to natural/industrial accidents or terrorist attacks in populated areas can have tragic consequences. To prevent and evaluate the effects of these disasters different approaches and modelling tools have been introduced in the literature. These instruments are valuable tools for risk managers doing risk assessment of threatened areas. Despite the significant improvements in hazard assessment in case of toxic gas dispersion, these analyses do not generally include the impact of human behaviour and people movement during emergencies. This work aims at providing an approach which considers both modelling of gas dispersion and evacuation movement in order to improve the accuracy of risk assessment for disasters involving toxic gases. The approach is applied to a hypothetical scenario including a ship releasing Nitrogen dioxide (NO2) on a crowd attending a music festival. The difference between the results obtained with existing static methods (people do not move) and a dynamic approach (people move away from the danger) which considers people movement with different degrees of sophistication (either a simple linear path or more complex behavioural modelling) is discussed. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. A website evaluation model by integration of previous evaluation models using a quantitative approach

    Directory of Open Access Journals (Sweden)

    Ali Moeini

    2015-01-01

    Full Text Available Given the growth of e-commerce, websites play an essential role in business success. Therefore, many authors have offered website evaluation models since 1995. However, the multiplicity and diversity of evaluation models make it difficult to integrate them into a single comprehensive model. In this paper, a quantitative method has been used to integrate previous models into a comprehensive model that is compatible with them. In this approach, the researcher's judgment plays no role in the integration of models, and the new model takes its validity from 93 previous models and the systematic quantitative approach.

  3. Development of generalized boiling transition model applicable for wide variety of fuel bundle geometries. Basic strategy and numerical approaches

    International Nuclear Information System (INIS)

    Ninokata, Hisashi; Sadatomi, Michio; Okawa, Tomio

    2003-01-01

    In order to establish a key technology to realize advanced BWR fuel designs, a three-year project on advanced subchannel analysis code development was started in 2002. The project focused on the five dominant factors involved in the boiling transition process in the fuel bundles: (1) inter-subchannel exchanges, (2) influences of obstacles, (3) dryout of liquid film, (4) transition of two-phase flow regimes and (5) deposition of droplets. It has been recognized that present physical models or constitutive equations in subchannel formulations need to be improved so that they include geometrical effects in the fuel bundle design more mechanistically and universally. Through reviewing the literature and existing experimental results, underlying elementary processes and geometrical factors that are indispensable for improving subchannel codes were identified. A basic strategy that combines numerical and experimental approaches was proposed, aiming at the establishment of mechanistic models for the five dominant factors. In this paper, the present status of methodologies for detailed two-phase flow studies is summarized. According to the spatial scales of the elementary processes of interest, proper numerical approaches were selected. For some promising numerical approaches, preliminary calculations were performed to assess their applicability to the investigation of elementary processes involved in the boiling transition. (author)

  4. Generating Collaborative Systems for Digital Libraries: a Model-Driven Approach

    Directory of Open Access Journals (Sweden)

    Alessio Malizia

    2010-12-01

    Full Text Available The design and development of a digital library involves different stakeholders, such as: information architects, librarians, and domain experts, who need to agree on a common language to describe, discuss, and negotiate the services the library has to offer. To this end, high-level, language-neutral models have to be devised. Metamodeling techniques favor the definition of domain-specific visual languages through which stakeholders can share their views and directly manipulate representations of the domain entities. This paper describes CRADLE (Cooperative-Relational Approach to Digital Library Environments), a metamodel-based framework and visual language for the definition of notions and services related to the development of digital libraries. A collection of tools allows the automatic generation of several services, defined with the CRADLE visual language, and of the graphical user interfaces providing access to them for the final user. The effectiveness of the approach is illustrated by presenting digital libraries generated with CRADLE, while the CRADLE environment has been evaluated by using the cognitive dimensions framework.

  5. An algebraic approach to modeling in software engineering

    International Nuclear Information System (INIS)

    Loegel, C.J.; Ravishankar, C.V.

    1993-09-01

    Our work couples the formalism of universal algebras with the engineering techniques of mathematical modeling to develop a new approach to the software engineering process. Our purpose in using this combination is twofold. First, abstract data types and their specification using universal algebras can be considered a common point between the practical requirements of software engineering and the formal specification of software systems. Second, mathematical modeling principles provide us with a means for effectively analyzing real-world systems. We first use modeling techniques to analyze a system and then represent the analysis using universal algebras. The rest of the software engineering process exploits properties of universal algebras that preserve the structure of our original model. This paper describes our software engineering process and our experience using it on both research and commercial systems. We need a new approach because current software engineering practices often deliver software that is difficult to develop and maintain. Formal software engineering approaches use universal algebras to describe ''computer science'' objects like abstract data types, but in practice software errors are often caused because ''real-world'' objects are improperly modeled. There is a large semantic gap between the customer's objects and abstract data types. In contrast, mathematical modeling uses engineering techniques to construct valid models for real-world systems, but these models are often implemented in an ad hoc manner. A combination of the best features of both approaches would enable software engineering to formally specify and develop software systems that better model real systems. Software engineering, like mathematical modeling, should concern itself first and foremost with understanding a real system and its behavior under given circumstances, and then with expressing this knowledge in an executable form

  6. Simulating mesoscale coastal evolution for decadal coastal management: A new framework integrating multiple, complementary modelling approaches

    Science.gov (United States)

    van Maanen, Barend; Nicholls, Robert J.; French, Jon R.; Barkwith, Andrew; Bonaldo, Davide; Burningham, Helene; Brad Murray, A.; Payo, Andres; Sutherland, James; Thornhill, Gillian; Townend, Ian H.; van der Wegen, Mick; Walkden, Mike J. A.

    2016-03-01

    Coastal and shoreline management increasingly needs to consider morphological change occurring at decadal to centennial timescales, especially that related to climate change and sea-level rise. This requires the development of morphological models operating at a mesoscale, defined by time and length scales of the order 10¹ to 10² years and 10¹ to 10² km. So-called 'reduced complexity' models that represent critical processes at scales not much smaller than the primary scale of interest, and are regulated by capturing the critical feedbacks that govern landform behaviour, are proving effective as a means of exploring emergent coastal behaviour at a landscape scale. Such models tend to be computationally efficient and are thus easily applied within a probabilistic framework. At the same time, reductionist models, built upon a more detailed description of hydrodynamic and sediment transport processes, are capable of application at increasingly broad spatial and temporal scales. More qualitative modelling approaches are also emerging that can guide the development and deployment of quantitative models, and these can be supplemented by varied data-driven modelling approaches that can achieve new explanatory insights from observational datasets. Such disparate approaches have hitherto been pursued largely in isolation by mutually exclusive modelling communities. Brought together, they have the potential to facilitate a step change in our ability to simulate the evolution of coastal morphology at scales that are most relevant to managing erosion and flood risk. Here, we advocate and outline a new integrated modelling framework that deploys coupled mesoscale reduced complexity models, reductionist coastal area models, data-driven approaches, and qualitative conceptual models. Integration of these heterogeneous approaches gives rise to model compositions that can potentially resolve decadal- to centennial-scale behaviour of diverse coupled open coast, estuary and inner

  7. Assessing a Top-Down Modeling Approach for Seasonal Scale Snow Sensitivity

    Science.gov (United States)

    Luce, C. H.; Lute, A.

    2017-12-01

    Mechanistic snow models are commonly applied to assess changes to snowpacks in a warming climate. Such assessments involve a number of assumptions about details of weather at daily to sub-seasonal time scales. Models of season-scale behavior can provide contrast for evaluating behavior at time scales more in concordance with climate warming projections. Such top-down models, however, involve a degree of empiricism, with attendant caveats about the potential of a changing climate to affect calibrated relationships. We estimated the sensitivity of snowpacks from 497 Snowpack Telemetry (SNOTEL) stations in the western U.S. based on differences in climate between stations (spatial analog). We examined the sensitivity of April 1 snow water equivalent (SWE) and mean snow residence time (SRT) to variations in Nov-Mar precipitation and average Nov-Mar temperature using multivariate local-fit regressions. We tested the modeling approach using a leave-one-out cross-validation as well as targeted two-fold non-random cross-validations contrasting, for example, warm vs. cold years, dry vs. wet years, and north vs. south stations. Nash-Sutcliffe Efficiency (NSE) values for the validations were strong for April 1 SWE, ranging from 0.71 to 0.90, and still reasonable, but weaker, for SRT, in the range of 0.64 to 0.81. From these ranges, we exclude validations where the training data do not represent the range of target data. A likely reason for differences in validation between the two metrics is that the SWE model reflects the influence of conservation of mass while using temperature as an indicator of the season-scale energy balance; in contrast, SRT depends more strongly on the energy balance aspects of the problem. Model forms with lower numbers of parameters generally validated better than more complex model forms, with the caveat that pseudoreplication could encourage selection of more complex models when validation contrasts were weak. Overall, the split sample validations
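    A hedged sketch of the validation idea: a distance-weighted local linear fit of April 1 SWE against winter temperature and precipitation, scored by leave-one-out Nash-Sutcliffe Efficiency. The "stations" here are synthetic and the weighting scheme is a simple stand-in for the multivariate local-fit regressions used in the study.

```python
# Sketch: leave-one-out cross-validation of a local-fit SWE regression.
import numpy as np

rng = np.random.default_rng(5)
n = 200
temp = rng.uniform(-10, 4, n)              # Nov-Mar mean temperature (deg C)
precip = rng.uniform(200, 1200, n)         # Nov-Mar precipitation (mm)
melt_factor = 1.0 / (1.0 + np.exp(0.6 * temp))          # warmer -> less SWE
swe = np.maximum(0.0, 0.7 * precip * melt_factor + rng.normal(0.0, 40.0, n))

X = np.column_stack([np.ones(n), temp, precip])

def local_fit_predict(i, bandwidth=2.5):
    """Predict station i from all other stations with Gaussian distance weights."""
    keep = np.arange(n) != i
    d = np.sqrt(((temp[keep] - temp[i]) / 5.0) ** 2 +
                ((precip[keep] - precip[i]) / 300.0) ** 2)
    sw = np.sqrt(np.exp(-(d / bandwidth) ** 2))          # sqrt of the weights
    beta = np.linalg.lstsq(X[keep] * sw[:, None], swe[keep] * sw, rcond=None)[0]
    return X[i] @ beta

pred = np.array([local_fit_predict(i) for i in range(n)])
nse = 1.0 - np.sum((pred - swe) ** 2) / np.sum((swe - swe.mean()) ** 2)
print(f"leave-one-out NSE for April 1 SWE: {nse:.2f}")
```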

  8. Atomistic approach for modeling metal-semiconductor interfaces

    DEFF Research Database (Denmark)

    Stradi, Daniele; Martinez, Umberto; Blom, Anders

    2016-01-01

    We present a general framework for simulating interfaces using an atomistic approach based on density functional theory and non-equilibrium Green's functions. The method includes all the relevant ingredients, such as doping and an accurate value of the semiconductor band gap, required to model realistic metal-semiconductor interfaces, and allows for a direct comparison between theory and experiments via the I–V curve. In particular, it will be demonstrated how doping and bias modify the Schottky barrier, and how finite size models (the slab approach) are unable to describe these interfaces...

  9. Multi-model approach to characterize human handwriting motion.

    Science.gov (United States)

    Chihi, I; Abdelkrim, A; Benrejeb, M

    2016-02-01

    This paper deals with characterization and modelling of human handwriting motion from two forearm muscle activity signals, called electromyography signals (EMG). In this work, an experimental approach was used to record the coordinates of a pen tip moving on the (x, y) plane and EMG signals during the handwriting act. The main purpose is to design a new mathematical model which characterizes this biological process. Based on a multi-model approach, this system was originally developed to generate letters and geometric forms written by different writers. A Recursive Least Squares algorithm is used to estimate the parameters of each sub-model of the multi-model basis. Simulations show good agreement between predicted results and the recorded data.
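    A compact sketch of the Recursive Least Squares step for a single ARX sub-model, with synthetic signals standing in for the pen-tip coordinate and EMG input; the multi-model switching logic of the paper is not shown.

```python
# Sketch: RLS estimation of one ARX sub-model y(k) = a1*y(k-1) + b1*u(k-1) + e(k).
import numpy as np

rng = np.random.default_rng(6)
N = 500
u = rng.normal(size=N)                       # stand-in "EMG" input
y = np.zeros(N)
a1_true, b1_true = 0.85, 0.4
for k in range(1, N):
    y[k] = a1_true * y[k - 1] + b1_true * u[k - 1] + 0.02 * rng.normal()

theta = np.zeros(2)                          # [a1, b1] estimates
P = np.eye(2) * 1000.0                       # large initial covariance
lam = 0.99                                   # forgetting factor

for k in range(1, N):
    phi = np.array([y[k - 1], u[k - 1]])     # regressor vector
    err = y[k] - phi @ theta                 # a priori prediction error
    gain = P @ phi / (lam + phi @ P @ phi)   # RLS gain
    theta = theta + gain * err
    P = (P - np.outer(gain, phi) @ P) / lam

print("estimated [a1, b1]:", theta, " true:", [a1_true, b1_true])
```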

  10. Public interface and waste management planning: An approach for integrating community involvement in waste strategies

    International Nuclear Information System (INIS)

    Xiques, P.J.

    1988-01-01

    Public involvement and information programs have bridged a communication abyss and allowed waste management policy-makers to understand legitimate public concerns. The perception often held by waste generators that technical concerns had greater validity than institutional issues is being altered as managers realize that information failures can halt a program as abruptly as technical ones. The role and level of involvement of the public in establishing waste management policies has changed dramatically over the past decade. Once the domain only of the generators and regulators, effective waste management strategy development must now make early provisions for public and local government involvement. By allowing public decision makers to participate in the initial planning process and maintain involvement throughout the implementation, many institutional barriers can be avoided. In today's climate, such barriers may represent direct costs, such as litigation, or indirect costs, such as delay, deferral, or duplication of work. Government programs have historically enjoyed a degree of insulation from public involvement factors on the basis of national security, defense, or the greater public good. However, such programs are no longer sacrosanct. Today, the cost of cleaning up past environmental impact can leave little or no money to meet present program objectives. Thus failure to get a public consensus before beginning remedial action can have a major impact on the allocation of scarce resources. Specific approaches to integrating the public into the planning phase of waste management will be addressed, including audience identification, issue analysis and tracking, prioritization of concerns, and information tool development

  11. Excellent approach to modeling urban expansion by fuzzy cellular automata: agent base model

    Science.gov (United States)

    Khajavigodellou, Yousef; Alesheikh, Ali A.; Mohammed, Abdulrazak A. S.; Chapi, Kamran

    2014-09-01

    The interaction between humans and their environment is one of the important challenges in the world today. Land use/cover change (LUCC) is a complex process that includes actors and factors at different social and spatial levels. The complexity and dynamics of urban systems make the practice of urban modeling very difficult. With increased computational power and the greater availability of spatial data, micro-simulation methods such as agent-based and cellular automata simulation have been developed by geographers, planners, and scholars, and have shown great potential for representing and simulating the complexity of the dynamic processes involved in urban growth and land use change. This paper presents Fuzzy Cellular Automata, coupled with Geospatial Information Systems and remote sensing, to simulate and predict urban expansion patterns. These FCA-based dynamic spatial urban models provide an improved ability to forecast and assess future urban growth and to create planning scenarios, allowing us to explore the potential impacts of simulations that correspond to urban planning and management policies. The model follows a fuzzy inference guided cellular automata approach: semantic or linguistic knowledge on land use change is expressed as fuzzy rules, based on which fuzzy inference is applied to determine the urban development potential for each pixel. The model integrates an ABM (agent-based model) and FCA (Fuzzy Cellular Automata) to investigate a complex decision-making process and future urban dynamic processes. Based on this model, rapid development and green land protection under the influences of the behaviors and decision modes of regional authority agents, real estate developer agents, resident agents and non-resident agents, and their interactions, have been applied to predict the future development patterns of the Erbil metropolitan region.

  12. Nonlinear Modeling of the PEMFC Based On NNARX Approach

    OpenAIRE

    Shan-Jen Cheng; Te-Jen Chang; Kuang-Hsiung Tan; Shou-Ling Kuo

    2015-01-01

    The Polymer Electrolyte Membrane Fuel Cell (PEMFC) is a time-varying nonlinear dynamic system, and a traditional linear modeling approach struggles to estimate the structure of the PEMFC system correctly. For this reason, this paper presents a nonlinear model of the PEMFC using the Neural Network Auto-regressive model with eXogenous inputs (NNARX) approach. A multilayer perceptron (MLP) network is applied to evaluate the structure of the NNARX model of the PEMFC. The validity and accurac...
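    A hedged sketch of an NNARX structure, assuming synthetic fuel-cell-like data: lagged output and input samples are fed to a small MLP that predicts the next output. Lag orders, network size and the toy dynamics are illustrative choices, not those of the paper.

```python
# Sketch: NNARX-style one-step-ahead predictor built from lagged regressors.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(7)
N = 1500
u = rng.uniform(0.2, 1.0, N)                               # load current (normalised)
y = np.zeros(N)
for k in range(2, N):                                      # toy nonlinear dynamics
    y[k] = 0.6 * y[k - 1] - 0.1 * y[k - 2] + 0.8 * np.tanh(u[k - 1]) + 0.01 * rng.normal()

na, nb = 2, 2                                              # output and input lag orders
rows = range(max(na, nb), N)
X = np.array([[y[k - 1], y[k - 2], u[k - 1], u[k - 2]] for k in rows])
t = np.array([y[k] for k in rows])

split = int(0.7 * len(t))
net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=3000, random_state=0)
net.fit(X[:split], t[:split])
print("one-step-ahead R^2 on validation data:", net.score(X[split:], t[split:]))
```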

  13. Understanding Gulf War Illness: An Integrative Modeling Approach

    Science.gov (United States)

    2017-10-01

    using a novel mathematical model. The computational biology approach will enable the consortium to quickly identify targets of dysfunction and find... [milestone table fragments: develop computer/mathematical paradigms for evaluation of treatment strategies (months 12-30, 50%); develop pilot clinical trials on basis of animal studies (months 24-36, 60%)] ...the goal of testing chemical treatments. The immune and autonomic biomarkers will be tested using a computational modeling approach allowing for a

  14. Canadian Whole-Farm Model Holos - Development, Stakeholder Involvement, and Model Application

    Science.gov (United States)

    Kroebel, R.; Janzen, H.; Beauchemin, K. A.

    2017-12-01

    modelling approach based on the ICBM model. Also under development are sub-models to predict ammonia volatilization and water budgets. Development of Holos is expected to continue, forging an interactive link between ongoing research and the interests of stakeholders in an ever-changing agricultural environment.

  15. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach.

    Directory of Open Access Journals (Sweden)

    Samreen Laghari

    Full Text Available Computer Networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a Complex communication network problem. We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. The conducted experiments demonstrated two important results: Primarily CABC-based modeling approach such as using Agent-based Modeling can be an effective approach to modeling complex problems in the domain of IoT. Secondly, the specific problem of managing the Carbon footprint can be solved using a multiagent system approach.

  16. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach.

    Science.gov (United States)

    Laghari, Samreen; Niazi, Muaz A

    2016-01-01

    Computer Networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a Complex communication network problem. We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. The conducted experiments demonstrated two important results: Primarily CABC-based modeling approach such as using Agent-based Modeling can be an effective approach to modeling complex problems in the domain of IoT. Secondly, the specific problem of managing the Carbon footprint can be solved using a multiagent system approach.

  17. Heat transfer modeling an inductive approach

    CERN Document Server

    Sidebotham, George

    2015-01-01

    This innovative text emphasizes a "less-is-more" approach to modeling complicated systems such as heat transfer by treating them first as "1-node lumped models" that yield simple closed-form solutions. The author develops numerical techniques for students to obtain more detail, but also trains them to use the techniques only when simpler approaches fail. Covering all essential methods offered in traditional texts, but with a different order, Professor Sidebotham stresses inductive thinking and problem solving as well as a constructive understanding of modern, computer-based practice. Readers learn to develop their own code in the context of the material, rather than just how to use packaged software, offering a deeper, intrinsic grasp behind models of heat transfer. Developed from over twenty-five years of lecture notes to teach students of mechanical and chemical engineering at The Cooper Union for the Advancement of Science and Art, the book is ideal for students and practitioners across engineering discipl...

  18. Prevention approaches in a preclinical canine model of Alzheimer’s disease: Benefits and challenges

    Directory of Open Access Journals (Sweden)

    Paulina R. Davis

    2014-03-01

    Full Text Available Aged dogs spontaneously develop many features of human aging and Alzheimer's disease (AD), including cognitive decline and neuropathology. In this review, we discuss age-dependent learning tasks, memory tasks, and functional measures that can be used in aged dogs as sensitive treatment outcome measures. Neuropathology that is linked to cognitive decline is described along with examples of treatment studies that show reduced neuropathology in aging dogs (dietary manipulations, behavioral enrichment, immunotherapy, and statins). Studies in canines show that multi-targeted approaches may be more beneficial than single pathway manipulations (e.g. antioxidants combined with behavioral enrichment). Aging canine studies show good predictive validity for human clinical trial outcomes (e.g. immunotherapy), and several interventions tested in dogs strongly support a prevention approach (e.g. immunotherapy and statins). Further, dogs are ideally suited for prevention studies because the age of onset of cognitive decline and neuropathology strongly supports longitudinal interventions that can be completed within a 3-5 year period. Disadvantages to using the canine model are that studies are lengthy, use labor-intensive comprehensive cognitive testing, and involve costly housing (almost as high as that of nonhuman primates). Overall, however, using the dog as a preclinical model for testing preventive approaches for AD may complement work in rodents and nonhuman primates.

  19. Polynomial Chaos Expansion Approach to Interest Rate Models

    Directory of Open Access Journals (Sweden)

    Luca Di Persio

    2015-01-01

    Full Text Available The Polynomial Chaos Expansion (PCE) technique allows us to recover a finite second-order random variable exploiting suitable linear combinations of orthogonal polynomials which are functions of a given stochastic quantity ξ, hence acting as a kind of random basis. The PCE methodology has been developed as a mathematically rigorous Uncertainty Quantification (UQ) method which aims at providing reliable numerical estimates for some uncertain physical quantities defining the dynamics of certain engineering models and their related simulations. In the present paper, we use the PCE approach in order to analyze some equity and interest rate models. In particular, we take into consideration those models which are based on, for example, the Geometric Brownian Motion, the Vasicek model, and the CIR model. We present theoretical as well as related concrete numerical approximation results considering, without loss of generality, the one-dimensional case. We also provide both an efficiency study and an accuracy study of our approach by comparing its outputs with the ones obtained adopting the Monte Carlo approach, both in its standard and its enhanced version.
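    A minimal numerical illustration of the PCE idea for one of the models mentioned (Geometric Brownian Motion), where the Hermite coefficients of the terminal value are known in closed form, so the PCE mean and variance can be checked against a plain Monte Carlo estimate. Parameter values are arbitrary.

```python
# Sketch: PCE of S_T = S0*exp((r - 0.5*sigma^2)*T + sigma*sqrt(T)*xi), xi ~ N(0,1),
# using probabilists' Hermite polynomials He_n and the identity
# exp(a*xi) = exp(a^2/2) * sum_n (a^n / n!) * He_n(xi).
import numpy as np
from math import factorial, sqrt, exp

S0, r, sigma, T = 100.0, 0.03, 0.2, 1.0
a = sigma * sqrt(T)
drift = exp((r - 0.5 * sigma ** 2) * T)

order = 8
coeffs = np.array([S0 * drift * exp(0.5 * a ** 2) * a ** n / factorial(n)
                   for n in range(order + 1)])

pce_mean = coeffs[0]                                         # E[S_T]
pce_var = sum(coeffs[n] ** 2 * factorial(n) for n in range(1, order + 1))

rng = np.random.default_rng(8)
xi = rng.standard_normal(200_000)
samples = S0 * np.exp((r - 0.5 * sigma ** 2) * T + a * xi)   # plain Monte Carlo

print(f"mean : PCE {pce_mean:.3f} vs MC {samples.mean():.3f}")
print(f"var  : PCE {pce_var:.3f} vs MC {samples.var():.3f}")
```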

  20. An Application of Bayesian Approach in Modeling Risk of Death in an Intensive Care Unit.

    Directory of Open Access Journals (Sweden)

    Rowena Syn Yin Wong

    Full Text Available There are not many studies that attempt to model intensive care unit (ICU) risk of death in developing countries, especially in South East Asia. The aim of this study was to propose and describe application of a Bayesian approach in modeling in-ICU deaths in a Malaysian ICU. This was a prospective study in a mixed medical-surgery ICU in a multidisciplinary tertiary referral hospital in Malaysia. Data collection included variables that were defined in the Acute Physiology and Chronic Health Evaluation IV (APACHE IV) model. A Bayesian Markov Chain Monte Carlo (MCMC) simulation approach was applied in the development of four multivariate logistic regression predictive models for the ICU, where the main outcome measure was in-ICU mortality risk. The performance of the models was assessed through overall model fit, discrimination and calibration measures. Results from the Bayesian models were also compared against results obtained using the frequentist maximum likelihood method. The study involved 1,286 consecutive ICU admissions between January 1, 2009 and June 30, 2010, of which 1,111 met the inclusion criteria. Patients who were admitted to the ICU were generally younger, predominantly male, with low co-morbidity load and mostly under mechanical ventilation. The overall in-ICU mortality rate was 18.5% and the overall mean Acute Physiology Score (APS) was 68.5. All four models exhibited good discrimination, with area under receiver operating characteristic curve (AUC) values of approximately 0.8. Calibration was acceptable (Hosmer-Lemeshow p-values > 0.05) for all models, except for model M3. Model M1 was identified as the model with the best overall performance in this study. Four prediction models were proposed, where the best model was chosen based on its overall performance in this study. This study has also demonstrated the promising potential of the Bayesian MCMC approach as an alternative in the analysis and modeling of in-ICU mortality outcomes.
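    A hedged sketch of the Bayesian ingredient, assuming synthetic data and a hand-rolled random-walk Metropolis sampler for a two-covariate logistic model of in-ICU death; the APACHE IV covariates, convergence diagnostics and calibration checks of the study are not reproduced.

```python
# Sketch: random-walk Metropolis for Bayesian logistic regression on toy data.
import numpy as np

rng = np.random.default_rng(9)
n = 800
aps = rng.normal(68.0, 20.0, n)                  # acute physiology score (synthetic)
age = rng.normal(55.0, 15.0, n)
X = np.column_stack([np.ones(n), (aps - 68.0) / 20.0, (age - 55.0) / 15.0])
true_beta = np.array([-1.6, 0.9, 0.4])
p = 1.0 / (1.0 + np.exp(-X @ true_beta))
y = rng.binomial(1, p)

def log_post(beta):
    eta = X @ beta
    loglik = np.sum(y * eta - np.log1p(np.exp(eta)))      # Bernoulli log-likelihood
    logprior = -0.5 * np.sum(beta ** 2) / 10.0 ** 2        # vague N(0, 10^2) priors
    return loglik + logprior

beta = np.zeros(3)
chain = []
lp = log_post(beta)
for it in range(20_000):
    prop = beta + rng.normal(0.0, 0.05, 3)           # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:         # Metropolis accept/reject
        beta, lp = prop, lp_prop
    chain.append(beta)

chain = np.array(chain)[5000:]                        # discard burn-in
print("posterior means:", chain.mean(axis=0), " true:", true_beta)
```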

  1. A Multivariate Approach to Functional Neuro Modeling

    DEFF Research Database (Denmark)

    Mørch, Niels J.S.

    1998-01-01

    ... provides the basis for a generalization theoretical framework relating model performance to model complexity and dataset size. Briefly summarized the major topics discussed in the thesis include: - An introduction of the representation of functional datasets by pairs of neuronal activity patterns... by the application of linear and more flexible, nonlinear microscopic regression models to a real-world dataset. The dependency of model performance, as quantified by generalization error, on model flexibility and training set size is demonstrated, leading to the important realization that no uniformly optimal model exists. - Model visualization and interpretation techniques. The simplicity of this task for linear models contrasts the difficulties involved when dealing with nonlinear models. Finally, a visualization technique for nonlinear models is proposed. A single observation emerges from the thesis...

  2. Experience of Public Involvement in Canada Presented to the Forum for Stakeholder Confidence

    International Nuclear Information System (INIS)

    Facella, Jo-Ann; Patton, Pat

    2008-01-01

    Pat Patton of NWMO, Canada, summarised the experiences of the organisation's three-year study aimed at identifying a broadly supported approach to managing Canada's nuclear fuel waste. The starting point of the study was the recognition that citizen perception of safety and acceptability are strongly interrelated; therefore, understanding and addressing the social dimension of safety would be critical for finding a socially acceptable RWM approach. An iterative and collaborative dialogue was conducted between specialists and citizens to both identify how safety is to be assessed and to carry out the assessment. First, objectives, values and ethical principles were defined, which formed the basis for the criteria of selecting a preferred RWM approach. The dialogue revealed that adaptability of the management approach to new information and technological advancement is a key requirement. Continuous learning, RD&D, and citizen involvement over the course of implementation were also identified as important components of the management approach. Ms Patton presented an illustrative model for public involvement during the implementation process. According to the model, implementation would be a multi-stage process with a continuous interaction between scientific and technical specialists, potentially affected communities and the implementer. Finally, Ms Patton outlined some key challenges for future dialogues between non-specialists and experts, including the development of tools for involving citizens in increasingly more knowledge-intensive areas and communicating research results which address issues highlighted by citizens

  3. Calibration of environmental radionuclide transfer models using a Bayesian approach with Markov chain Monte Carlo simulations and model comparisons - Calibration of radionuclides transfer models in the environment using a Bayesian approach with Markov chain Monte Carlo simulation and comparison of models

    Energy Technology Data Exchange (ETDEWEB)

    Nicoulaud-Gouin, V.; Giacalone, M.; Gonze, M.A. [Institut de Radioprotection et de Surete Nucleaire-PRP-ENV/SERIS/LM2E (France); Martin-Garin, A.; Garcia-Sanchez, L. [IRSN-PRP-ENV/SERIS/L2BT (France)

    2014-07-01

    , distinguishes instantaneous (Kd1) and first-order kinetics of sorption and desorption processes (λfix, λrem), each having potentially a limited sorption capacity. A Soil-Plant Deposition Model describing the weeds contamination in 137Cs, 134Cs and 131I, with in situ measures in the Fukushima prefecture (Gonze et al. submitted to this conference). This model considers two foliage pools and a root pool, and describes foliar biomass growth with a Verhulst model. One prerequisite for calibration is model identifiability. Here, we showed that there are not unique parameter values corresponding to a data set. However, sharp distributions were found when several data sets were involved. One numerical difficulty of Markov Chains is to check convergence. It was here examined with Raftery and Lewis diagnostic, Gelman and Rubin plots, and simulation trails. Failing to converge may indicate that the model is not adapted to the observations. The Bayes factor was used to decide between competing models, which applies even if they are not nested. For most data series, EK model was preferable to the nested Kd approach. An Empirical Dynamical Model (consisting of two exponential functions) was compared to the Soil-Plant Deposition Model, by distinguishing site-specific parameters and invariant parameters between stations, in order to study the goodness-of-fit of the Soil-Plant Deposition Model. (authors)

  4. Modelling of Dispersed Gas-Liquid Flow using LBGK and LPT Approach

    Science.gov (United States)

    Agarwal, Alankar; Prakash, Akshay; Ravindra, B.

    2017-11-01

    The dynamics of gas bubbles play a significant, if not crucial, role in a large variety of industrial processes that involve reactors. Many of these processes are still not well understood in terms of optimal scale-up strategies. Accurate modeling of bubbles and bubble swarms becomes important for high fidelity bioreactor simulations. This study is part of the development of robust bubble-fluid interaction modules for simulation of industrial-scale reactors. The work presents the simulation of a single bubble rising in a quiescent water tank using current models presented in the literature for bubble-fluid interaction. In this multiphase benchmark problem, the continuous phase (water) is discretized using the Lattice Bhatnagar-Gross-Krook (LBGK) model of the Lattice Boltzmann Method (LBM), while the dispersed gas phase (i.e. the air bubble) is modeled with the Lagrangian particle tracking (LPT) approach. A cheap clipped fourth-order polynomial function is used to model the interaction between the two phases. The model is validated by comparing the simulated terminal velocity of a bubble at varying bubble diameters, and the influence of bubble motion on liquid velocity, with theoretical and previously available experimental data. This work is supported by the Centre for Development of Advanced Computing (C-DAC), Pune, which provided the advanced computational facility PARAM Yuva-II.
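    A minimal sketch of the Lagrangian particle-tracking side in isolation: a single bubble rising under buoyancy with a standard Schiller-Naumann drag closure, integrated explicitly until terminal velocity. The LBGK fluid solver and the polynomial coupling kernel are not reproduced, and the property values are nominal.

```python
# Sketch: explicit LPT integration of one rising bubble in quiescent water.
import numpy as np

rho_l, rho_g = 998.0, 1.2          # water / air density (kg/m^3)
mu_l = 1.0e-3                      # water dynamic viscosity (Pa.s)
d = 1.0e-3                         # bubble diameter (m)
g = 9.81

vol = np.pi * d ** 3 / 6.0
m_g = rho_g * vol
m_added = 0.5 * rho_l * vol        # added mass of the surrounding liquid

v, dt = 0.0, 1.0e-5
for step in range(200_000):
    re = max(rho_l * abs(v) * d / mu_l, 1.0e-12)
    cd = 24.0 / re * (1.0 + 0.15 * re ** 0.687)     # Schiller-Naumann drag law
    f_buoy = (rho_l - rho_g) * vol * g
    f_drag = -0.5 * rho_l * cd * (np.pi * d ** 2 / 4.0) * v * abs(v)
    v += dt * (f_buoy + f_drag) / (m_g + m_added)

print(f"terminal rise velocity ~ {v:.3f} m/s for a {d*1e3:.1f} mm bubble")
```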

  5. A model-driven approach to information security compliance

    Science.gov (United States)

    Correia, Anacleto; Gonçalves, António; Teodoro, M. Filomena

    2017-06-01

    The availability, integrity and confidentiality of information are fundamental to the long-term survival of any organization. Information security is a complex issue that must be holistically approached, combining assets that support corporate systems, in an extended network of business partners, vendors, customers and other stakeholders. This paper addresses the conception and implementation of information security systems, conforming to the ISO/IEC 27000 set of standards, using the model-driven approach. The process begins with the conception of a domain-level model (computation independent model) based on information security vocabulary present in the ISO/IEC 27001 standard. Based on this model, after embedding mandatory rules for attaining ISO/IEC 27001 conformance in the model, a platform-independent model is derived. Finally, a platform-specific model serves as the basis for testing the compliance of information security systems with the ISO/IEC 27000 set of standards.

  6. Advanced language modeling approaches, case study: Expert search

    NARCIS (Netherlands)

    Hiemstra, Djoerd

    2008-01-01

    This tutorial gives a clear and detailed overview of advanced language modeling approaches and tools, including the use of document priors, translation models, relevance models, parsimonious models and expectation maximization training. Expert search will be used as a case study to explain the

  7. Evidence-based policy: implications for nursing and policy involvement.

    Science.gov (United States)

    Hewison, Alistair

    2008-11-01

    Evidence-based policy making is espoused as a central feature of government in the United Kingdom. However, an expectation that this will improve the quality of policy produced and provide a path to increased involvement of nurses in the policy process is misplaced. The purpose of this article is to demonstrate that the emphasis on evidence-based policy is problematic and cannot be regarded as a "new model" of policy making. Also, it could deflect attention from more practical approaches to policy involvement on the part of nurses. Policy development activities, acquisition of skills in policy analysis, and other forms of involvement are needed if nurses are to move along the continuum from policy literacy, through policy acumen, to policy competence. This involves taking a critical stance on the notion of evidence-based policy.

  8. Medical staff involvement in nursing homes: development of a conceptual model and research agenda.

    Science.gov (United States)

    Shield, Renée; Rosenthal, Marsha; Wetle, Terrie; Tyler, Denise; Clark, Melissa; Intrator, Orna

    2014-02-01

    Medical staff (physicians, nurse practitioners, physicians' assistants) involvement in nursing homes (NH) is limited by professional guidelines, government policies, regulations, and reimbursements, creating bureaucratic burden. The conceptual NH Medical Staff Involvement Model, based on our mixed-methods research, applies the Donabedian "structure-process-outcomes" framework to the NH, identifying measures for a coordinated research agenda. Quantitative surveys and qualitative interviews with medical directors, administrators and directors of nursing, other experts, residents and family members were analyzed together with Minimum Data Set, Online Certification and Reporting System, and Medicare Part B claims data related to NH structure, process, and outcomes. NH control of medical staff, or structure, affects medical staff involvement in care processes and is associated with better outcomes (e.g., symptom management, appropriate transitions, satisfaction). The model identifies measures clarifying the impact of NH medical staff involvement on care processes and resident outcomes and has strong potential to inform regulatory policies.

  9. Lightweight approach to model traceability in a CASE tool

    Science.gov (United States)

    Vileiniskis, Tomas; Skersys, Tomas; Pavalkis, Saulius; Butleris, Rimantas; Butkiene, Rita

    2017-07-01

    The term "model-driven" is by no means a new buzzword within the ranks of the system development community. Nevertheless, the ever-increasing complexity of model-driven approaches keeps fueling discussions around this paradigm and pushes researchers to develop new and more effective approaches to system development. With this increasing complexity, model traceability, and model management as a whole, become indispensable activities of the model-driven system development process. The main goal of this paper is to present the conceptual design and implementation of a practical lightweight approach to model traceability in a CASE tool.

  10. Modeling amorphization of tetrahedral structures under local approaches

    International Nuclear Information System (INIS)

    Jesurum, C.E.; Pulim, V.; Berger, B.; Hobbs, L.W.

    1997-01-01

    Many crystalline ceramics can be topologically disordered (amorphized) by disordering radiation events involving high-energy collision cascades or (in some cases) successive single-atom displacements. The authors are interested in both the potential for disorder and the possible aperiodic structures adopted following the disordering event. The potential for disordering is related to connectivity, and among those structures of interest are tetrahedral networks (such as SiO2, SiC and Si3N4) comprising corner-shared tetrahedral units whose connectivities are easily evaluated. In order to study the response of these networks to radiation, the authors have chosen to model their assembly according to the (simple) local rules that each corner obeys in connecting to another tetrahedron; in this way they easily erect large computer models of any crystalline polymorphic form. Amorphous structures can be similarly grown by application of altered rules. They have adopted a simple model of irradiation in which all bonds in the neighborhood of a designated tetrahedron are destroyed, and they reform the bonds in this region according to a set of (possibly different) local rules appropriate to the environmental conditions. When a tetrahedron approaches the boundary of this neighborhood, it undergoes an optimization step in which a spring is inserted between two corners of compatible tetrahedra when they are within a certain distance of one another; component forces are then applied that act to minimize the distance between these corners and minimize the deviation from the rules. The resulting structure is then analyzed for the complete adjacency matrix, irreducible ring statistics, and bond angle distributions.

  11. Modeling of nonlinear responses for reciprocal transducers involving polarization switching

    DEFF Research Database (Denmark)

    Willatzen, Morten; Wang, Linxiang

    2007-01-01

    Nonlinearities and hysteresis effects in a reciprocal PZT transducer are examined by use of a dynamical mathematical model on the basis of phase-transition theory. In particular, we consider the perovskite piezoelectric ceramic in which the polarization process in the material can be modeled...... by Landau theory for the first-order phase transformation, in which each polarization state is associated with a minimum of the Landau free-energy function. Nonlinear constitutive laws are obtained by using thermodynamical equilibrium conditions, and hysteretic behavior of the material can be modeled...... intrinsically. The time-dependent Ginzburg-Landau theory is used in the parameter identification involving hysteresis effects. We use the Chebyshev collocation method in the numerical simulations. The elastic field is assumed to be coupled linearly with other fields, and the nonlinearity is in the E-D coupling...

  12. Methodological challenges involved in compiling the Nahua pharmacopeia.

    Science.gov (United States)

    De Vos, Paula

    2017-06-01

    Recent work in the history of science has questioned the Eurocentric nature of the field and sought to include a more global approach that would serve to displace center-periphery models in favor of approaches that take seriously local knowledge production. Historians of Iberian colonial science have taken up this approach, which involves reliance on indigenous knowledge traditions of the Americas. These traditions present a number of challenges to modern researchers, including availability and reliability of source material, issues of translation and identification, and lack of systematization. This essay explores the challenges that emerged in the author's attempt to compile a pre-contact Nahua pharmacopeia, the reasons for these challenges, and the ways they may - or may not - be overcome.

  13. Modelling dynamics of atmosphere ventilation and industrial city’s air pollution analysis: New approach

    Science.gov (United States)

    Glushkov, A. V.; Khetselius, O. Yu; Agayar, E. V.; Buyadzhi, V. V.; Romanova, A. V.; Mansarliysky, V. F.

    2017-10-01

    We present a new effective approach to the analysis and modelling of natural air ventilation in the atmosphere of an industrial city, based on the Arakawa-Schubert and Glushkov models, modified to calculate the current involvement of the ensemble of clouds, and on advanced mathematical methods for modelling unsteady turbulence in the urban area. For the first time, the methods of a plane complex field and spectral expansion algorithms are applied to calculate the air circulation for the cloud-layer arrays penetrating the territory of the industrial city. We have also taken into account the mechanisms of transformation of cloud-system advection over the territory of the urban area. The results of test computations of the air ventilation characteristics are presented for the city of Odessa. All the above-cited methods and models, together with standard monitoring and management systems, can be considered a basis for a comprehensive “Green City” construction technology.

  14. Modeling of the bacterial mechanism of methicillin-resistance by a systems biology approach.

    Directory of Open Access Journals (Sweden)

    Ida Autiero

    Full Text Available BACKGROUND: A microorganism is a complex biological system able to preserve its functional features against external perturbations, and the ability of living systems to oppose these external perturbations is defined as "robustness". The antibiotic resistance developed by different bacterial strains is a clear example of robustness and of the ability of the bacterial system to acquire a particular functional behaviour in response to environmental changes. In this work we have modeled the whole mechanism essential to methicillin resistance through a systems biology approach. Methicillin is a beta-lactam antibiotic that acts by inhibiting the penicillin-binding proteins (PBPs). These PBPs are involved in the synthesis of peptidoglycans, essential mesh-like polymers that surround cellular enzymes and are crucial for the bacterium's survival. METHODOLOGY: The network of genes, mRNA, proteins and metabolites was created using the CellDesigner program, and the data on molecular interactions are stored in Systems Biology Markup Language (SBML). To simulate the dynamic behaviour of this biochemical network, kinetic equations were associated with each reaction. CONCLUSIONS: Our model simulates the mechanism of the inactivation of PBP by methicillin, as well as the expression of the PBP2a isoform, the regulation of the SCCmec elements (SCC: staphylococcal cassette chromosome) and the synthesis of peptidoglycan by PBP2a. The results obtained by our integrated approach show that the model correctly describes the whole phenomenon of methicillin resistance and is able to respond to external perturbations in the same way as the real cell. Therefore, this model can be useful for developing new therapeutic approaches for methicillin control and for understanding the general mechanism of cellular resistance to some antibiotics.
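    A drastically reduced, hedged toy version of the kind of kinetic network described above is sketched here: the rate constants, the two-enzyme structure and the species names are invented for illustration only, and the actual SBML model is far larger. It shows how kinetic equations attached to reactions (methicillin inactivating PBP, constitutive PBP2a expression, peptidoglycan turnover) can be simulated as ordinary differential equations.

```python
# Toy ODE sketch inspired by the methicillin-resistance network (assumed rate
# constants and a reduced two-enzyme structure, not the published SBML model):
# methicillin irreversibly inactivates native PBP, while PBP2a expressed from
# mecA keeps peptidoglycan synthesis running.
import numpy as np
from scipy.integrate import solve_ivp

k_inact, k_exp, k_syn, k_deg = 0.5, 0.05, 1.0, 0.1   # assumed rate constants

def rhs(t, y, methicillin):
    pbp, pbp2a, pg = y
    d_pbp = -k_inact * methicillin * pbp        # inactivation by the antibiotic
    d_pbp2a = k_exp                             # constitutive PBP2a expression
    d_pg = k_syn * (pbp + pbp2a) - k_deg * pg   # peptidoglycan turnover
    return [d_pbp, d_pbp2a, d_pg]

sol = solve_ivp(rhs, (0.0, 60.0), [1.0, 0.0, 5.0], args=(1.0,))
print(sol.y[:, -1])   # PBP depleted, PBP2a accumulating, peptidoglycan maintained
```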

  15. Smeared crack modelling approach for corrosion-induced concrete damage

    DEFF Research Database (Denmark)

    Thybo, Anna Emilie Anusha; Michel, Alexander; Stang, Henrik

    2017-01-01

    In this paper a smeared crack modelling approach is used to simulate corrosion-induced damage in reinforced concrete. The presented modelling approach utilizes a thermal analogy to mimic the expansive nature of solid corrosion products, while taking into account the penetration of corrosion...... products into the surrounding concrete, non-uniform precipitation of corrosion products, and creep. To demonstrate the applicability of the presented modelling approach, numerical predictions in terms of corrosion-induced deformations as well as formation and propagation of micro- and macrocracks were......-induced damage phenomena in reinforced concrete. Moreover, good agreements were also found between experimental and numerical data for corrosion-induced deformations along the circumference of the reinforcement....

  16. An Alternative Approach to the Extended Drude Model

    Science.gov (United States)

    Gantzler, N. J.; Dordevic, S. V.

    2018-05-01

    The original Drude model, proposed over a hundred years ago, is still used today for the analysis of optical properties of solids. Within this model, both the plasma frequency and quasiparticle scattering rate are constant, which makes the model rather inflexible. In order to circumvent this problem, the so-called extended Drude model was proposed, which allowed for the frequency dependence of both the quasiparticle scattering rate and the effective mass. In this work we will explore an alternative approach to the extended Drude model. Here, one also assumes that the quasiparticle scattering rate is frequency dependent; however, instead of the effective mass, the plasma frequency becomes frequency-dependent. This alternative model is applied to the high Tc superconductor Bi2Sr2CaCu2O8+δ (Bi2212) with Tc = 92 K, and the results are compared and contrasted with the ones obtained from the conventional extended Drude model. The results point to several advantages of this alternative approach to the extended Drude model.
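    The sketch below illustrates the standard extended-Drude inversion in its common Gaussian-units textbook form and the alternative reading in which the renormalisation is folded into an effective plasma frequency; it is not the authors' implementation, and the toy conductivity is only there to exercise the code.

```python
# Hedged sketch of the extended-Drude inversion (Gaussian units, standard
# textbook form; not the authors' exact implementation).  Given a complex
# optical conductivity sigma(w) and a fixed bare plasma frequency wp, the
# conventional analysis extracts a frequency-dependent scattering rate 1/tau(w)
# and mass enhancement m*(w)/m; the alternative route keeps 1/tau(w) but folds
# the renormalisation into an effective plasma frequency instead.
import numpy as np

def extended_drude(omega, sigma, wp):
    inv_sigma = 1.0 / sigma
    scatt_rate = wp**2 / (4.0 * np.pi) * inv_sigma.real              # 1/tau(w)
    mass_ratio = wp**2 / (4.0 * np.pi * omega) * (-inv_sigma).imag   # m*(w)/m
    wp_eff = wp / np.sqrt(mass_ratio)                                # alternative view
    return scatt_rate, mass_ratio, wp_eff

# toy conductivity: a plain Drude response with constant gamma, for checking
omega = np.linspace(10.0, 2000.0, 200)       # e.g. cm^-1
wp, gamma = 10000.0, 100.0
sigma = wp**2 / (4.0 * np.pi) / (gamma - 1j * omega)
rate, mass, wp_eff = extended_drude(omega, sigma, wp)
print(rate[:3], mass[:3])    # recovers ~gamma and ~1 for the plain Drude case
```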

  17. An object-oriented approach to energy-economic modeling

    Energy Technology Data Exchange (ETDEWEB)

    Wise, M.A.; Fox, J.A.; Sands, R.D.

    1993-12-01

    In this paper, the authors discuss their experiences in creating an object-oriented economic model of the U.S. energy and agriculture markets. After a discussion of some central concepts, they provide an overview of the model, focusing on the methodology of designing an object-oriented class hierarchy specification based on standard microeconomic production functions. The evolution of the model from the class definition stage to programming it in C++, a standard object-oriented programming language, is detailed. The authors then discuss the main differences between writing the object-oriented program and a procedure-oriented program of the same model. Finally, they conclude with a discussion of the advantages and limitations of the object-oriented approach based on their experience in building energy-economic models with procedure-oriented approaches and languages.
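    The following is an illustrative sketch only, not the authors' actual C++ class design: it shows the general idea of a class hierarchy in which producing sectors share an interface and concrete producers are parameterised by standard microeconomic production functions such as Cobb-Douglas. All names and numbers are assumptions for illustration.

```python
# Illustrative object-oriented sketch of an energy-economy sector hierarchy
# built around production functions (assumed design, not the paper's C++ code).
from abc import ABC, abstractmethod

class ProductionSector(ABC):
    """Common interface for producing sectors in an energy-economy model."""
    def __init__(self, name):
        self.name = name

    @abstractmethod
    def output(self, inputs):
        """Map a dict of input quantities to economic output."""

class CobbDouglasSector(ProductionSector):
    def __init__(self, name, scale, exponents):
        super().__init__(name)
        self.scale = scale          # total factor productivity
        self.exponents = exponents  # e.g. {"capital": 0.4, "labor": 0.3, "energy": 0.3}

    def output(self, inputs):
        y = self.scale
        for factor, alpha in self.exponents.items():
            y *= inputs[factor] ** alpha
        return y

electricity = CobbDouglasSector("electricity", scale=1.2,
                                exponents={"capital": 0.4, "labor": 0.3, "energy": 0.3})
print(electricity.output({"capital": 100.0, "labor": 50.0, "energy": 80.0}))
```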

  18. Helpful Components Involved in the Cognitive-Experiential Model of Dream Work

    Science.gov (United States)

    Tien, Hsiu-Lan Shelley; Chen, Shuh-Chi; Lin, Chia-Huei

    2009-01-01

    The purpose of the study was to examine the helpful components involved in the Hill's cognitive-experiential dream work model. Participants were 27 volunteer clients from colleges and universities in northern and central parts of Taiwan. Each of the clients received 1-2 sessions of dream interpretations. The cognitive-experiential dream work model…

  19. Multiscale approach to equilibrating model polymer melts

    DEFF Research Database (Denmark)

    Svaneborg, Carsten; Ali Karimi-Varzaneh, Hossein; Hojdis, Nils

    2016-01-01

    We present an effective and simple multiscale method for equilibrating Kremer Grest model polymer melts of varying stiffness. In our approach, we progressively equilibrate the melt structure above the tube scale, inside the tube and finally at the monomeric scale. We make use of models designed...

  20. Collaborative Proposal: Transforming How Climate System Models are Used: A Global, Multi-Resolution Approach

    Energy Technology Data Exchange (ETDEWEB)

    Estep, Donald

    2013-04-15

    Despite the great interest in regional modeling for both weather and climate applications, regional modeling is not yet at the stage that it can be used routinely and effectively for climate modeling of the ocean. The overarching goal of this project is to transform how climate models are used by developing and implementing a robust, efficient, and accurate global approach to regional ocean modeling. To achieve this goal, we will use theoretical and computational means to resolve several basic modeling and algorithmic issues. The first task is to develop techniques for transitioning between parameterized and high-fidelity regional ocean models as the discretization grid transitions from coarse to fine regions. The second task is to develop estimates for the error in scientifically relevant quantities of interest that provide a systematic way to automatically determine where refinement is needed in order to obtain accurate simulations of dynamic and tracer transport in regional ocean models. The third task is to develop efficient, accurate, and robust time-stepping schemes for variable spatial resolution discretizations used in regional ocean models of dynamics and tracer transport. The fourth task is to develop frequency-dependent eddy viscosity finite element and discontinuous Galerkin methods and study their performance and effectiveness for simulation of dynamics and tracer transport in regional ocean models. These four tasks share common difficulties and will be approached using a common computational and mathematical toolbox. This is a multidisciplinary project involving faculty and postdocs from Colorado State University, Florida State University, and Penn State University along with scientists from Los Alamos National Laboratory. The completion of the tasks listed within the discussion of the four sub-projects will go a long way towards meeting our goal of developing superior regional ocean models that will transform how climate system models are used.

  1. An Environmental Management Maturity Model of Construction Programs Using the AHP-Entropy Approach

    Directory of Open Access Journals (Sweden)

    Libiao Bai

    2018-06-01

    Full Text Available The accelerating process of urbanization in China has led to considerable opportunities for the development of construction projects; however, environmental issues have become an important constraint on the implementation of these projects. To quantitatively describe the environmental management capabilities of such projects, this paper proposes a 2-dimensional Environmental Management Maturity Model of Construction Programs (EMMMCP) based on an analysis of existing projects, group management theory and a management maturity model. In this model, a synergetic process was included to compensate for the lack of consideration of synergies in previous studies, and it was involved in the construction of the first dimension, i.e., the environmental management index system. The second dimension, i.e., the maturity level of environmental management, was then constructed by redefining the hierarchical characteristics of construction program (CP) environmental management maturity. Additionally, a mathematical solution to this proposed model was derived via the Analytic Hierarchy Process (AHP)-entropy approach. To verify the effectiveness and feasibility of this proposed model, a computational experiment was conducted; the results show that this approach can not only measure the individual levels of different processes, but also achieve the most important objective of providing a reference for stakeholders when making decisions on the environmental management of construction programs, which indicates that the model is suitable for evaluating the level of environmental management maturity in CPs. To our knowledge, this paper is the first study to evaluate the environmental management maturity levels of CPs, which fills the gap between project program management and environmental management and provides a reference for relevant management personnel to enhance their environmental management capabilities.
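    A hedged numerical sketch of the AHP-entropy weighting idea referenced above follows; the criteria, pairwise judgements, score matrix and the product-form combination rule are invented for illustration and are not the paper's data or exact solution procedure.

```python
# Hedged sketch of AHP-entropy weighting: AHP supplies subjective weights from a
# pairwise-comparison matrix, the entropy method supplies objective weights from
# the score matrix, and the two are combined (illustrative data, not the paper's).
import numpy as np

def ahp_weights(pairwise):
    """Principal-eigenvector weights of a positive reciprocal comparison matrix."""
    vals, vecs = np.linalg.eig(pairwise)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

def entropy_weights(scores):
    """Entropy weights of an (alternatives x criteria) score matrix."""
    p = scores / scores.sum(axis=0)
    k = 1.0 / np.log(scores.shape[0])
    entropy = -k * (p * np.log(np.clip(p, 1e-12, None))).sum(axis=0)
    d = 1.0 - entropy                       # degree of divergence per criterion
    return d / d.sum()

pairwise = np.array([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]])     # 3 criteria
scores = np.array([[0.7, 0.5, 0.9], [0.6, 0.8, 0.4], [0.9, 0.6, 0.7]])
w_ahp, w_ent = ahp_weights(pairwise), entropy_weights(scores)
w_combined = w_ahp * w_ent / (w_ahp * w_ent).sum()               # one common combination rule
print(w_ahp, w_ent, w_combined)
```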

  2. Elements of a pragmatic approach for dealing with bias and uncertainty in experiments through predictions: experiment design and data conditioning; "real space" model validation and conditioning; hierarchical modeling and extrapolative prediction.

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente Jose

    2011-11-01

    This report explores some important considerations in devising a practical and consistent framework and methodology for utilizing experiments and experimental data to support modeling and prediction. A pragmatic and versatile 'Real Space' approach is outlined for confronting experimental and modeling bias and uncertainty to mitigate risk in modeling and prediction. The elements of experiment design and data analysis, data conditioning, model conditioning, model validation, hierarchical modeling, and extrapolative prediction under uncertainty are examined. An appreciation can be gained for the constraints and difficulties at play in devising a viable end-to-end methodology. Rationale is given for the various choices underlying the Real Space end-to-end approach. The approach adopts and refines some elements and constructs from the literature and adds pivotal new elements and constructs. Crucially, the approach reflects a pragmatism and versatility derived from working many industrial-scale problems involving complex physics and constitutive models, steady-state and time-varying nonlinear behavior and boundary conditions, and various types of uncertainty in experiments and models. The framework benefits from a broad exposure to integrated experimental and modeling activities in the areas of heat transfer, solid and structural mechanics, irradiated electronics, and combustion in fluids and solids.

  3. Deformation analysis of polymers composites: rheological model involving time-based fractional derivative

    DEFF Research Database (Denmark)

    Zhou, H. W.; Yi, H. Y.; Mishnaevsky, Leon

    2017-01-01

    A modeling approach to the time-dependent properties of Glass Fiber Reinforced Polymer (GFRP) composites is of special interest for the quantitative description of long-term behavior. An electronic creep machine is employed to investigate the time-dependent deformation of four specimens of dog-bone-shaped GFRP composites at various stress levels. A negative exponent function based on structural changes is introduced to describe the damage evolution of material properties during the creep test. Accordingly, a new creep constitutive equation, referred to as a fractional derivative Maxwell model...... by the fractional derivative Maxwell model proposed in the paper are in good agreement with the experimental data. It is shown that the new creep constitutive model proposed in the paper needs few parameters to represent various time-dependent behaviors.
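    As a minimal numerical sketch of a fractional-derivative Maxwell creep law of the kind discussed above, the snippet below evaluates the creep compliance of a spring in series with a "spring-pot", J(t) = 1/E + t^alpha / (eta * Gamma(1 + alpha)); the parameter values are assumed and the paper's damage-evolution term is not included.

```python
# Creep strain of a fractional Maxwell element under constant stress
# (illustrative parameters only, not the paper's fitted values).
import numpy as np
from scipy.special import gamma

def fractional_maxwell_creep(t, sigma0, E, eta, alpha):
    """Creep strain for a spring (E) in series with a spring-pot (eta, alpha)."""
    return sigma0 * (1.0 / E + t**alpha / (eta * gamma(1.0 + alpha)))

t = np.linspace(0.0, 3600.0, 7)                                   # seconds
strain = fractional_maxwell_creep(t, sigma0=50e6, E=30e9, eta=1e12, alpha=0.3)
print(strain)
```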

  4. Dynamic Involvement of Real World Objects in the IoT: A Consensus-Based Cooperation Approach.

    Science.gov (United States)

    Pilloni, Virginia; Atzori, Luigi; Mallus, Matteo

    2017-03-01

    A significant role in the Internet of Things (IoT) will be taken by mobile and low-cost unstable devices, which autonomously self-organize and introduce highly dynamic and heterogeneous scenarios for the deployment of distributed applications. This requires the devices to cooperate in dynamically finding the suitable combination of their involvement so as to improve the system reliability while following the changes in their status. Focusing on the above scenario, we propose a distributed algorithm for resource allocation that is run by devices that can perform the same task required by the applications, allowing for a flexible and dynamic binding of the requested services with the physical IoT devices. It is based on a consensus approach, which maximizes the lifetime of the groups of nodes involved and ensures the fulfillment of the requested Quality of Information (QoI) requirements. Experiments have been conducted with real devices, showing an improvement of device lifetime of more than 20%, with respect to a uniform distribution of tasks.

  5. A novel approach to modeling and diagnosing the cardiovascular system

    Energy Technology Data Exchange (ETDEWEB)

    Keller, P.E.; Kangas, L.J.; Hashem, S.; Kouzes, R.T. [Pacific Northwest Lab., Richland, WA (United States); Allen, P.A. [Life Link, Richland, WA (United States)

    1995-07-01

    A novel approach to modeling and diagnosing the cardiovascular system is introduced. A model exhibits a subset of the dynamics of the cardiovascular behavior of an individual by using a recurrent artificial neural network. Potentially, a model will be incorporated into a cardiovascular diagnostic system. This approach is unique in that each cardiovascular model is developed from physiological measurements of an individual. Any differences between the modeled variables and the variables of an individual at a given time are used for diagnosis. This approach also exploits sensor fusion to optimize the utilization of biomedical sensors. The advantage of sensor fusion has been demonstrated in applications including control and diagnostics of mechanical and chemical processes.

  6. Modeling of heat transfer in a horizontal heat-generating layer by an effective diffusivity approach

    International Nuclear Information System (INIS)

    Cheung, F.B.; Shiah, S.W.

    1994-01-01

    The concept of effective diffusivity is employed to model various processes of heat transfer in a volumetrically heated fluid layer subjected to different initial and boundary conditions. The approach, which involves the solution of only heat diffusion equations, is found to give rather accurate predictions of the transient response of an initially stagnant fluid layer to a step input of power as well as the developing and decaying nature of the flow following a step change in the internal Rayleigh number from one state of steady convection to another. The approach is also found to be applicable to various flow regions of a heat-generating fluid layer, and is not limited to the case in which the entire layer is in turbulent motion. The simplicity and accuracy of the method are clearly illustrated in the analysis. Validity of the effective diffusivity approach is demonstrated by comparing the predicted results with corresponding experimental data
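    An illustrative one-dimensional sketch of the effective-diffusivity idea is given below: transient conduction in a volumetrically heated layer is advanced with an explicit finite-difference scheme in which turbulent transport is lumped into a single effective diffusivity. The geometry, property values and the effective-diffusivity value are assumptions, not those of the cited analysis.

```python
# 1-D transient heat diffusion in a volumetrically heated layer with an assumed
# effective (turbulence-enhanced) diffusivity, explicit finite differences.
import numpy as np

L, n = 0.1, 51                   # layer depth [m], grid points
alpha_eff = 5e-6                 # effective diffusivity [m^2/s] (assumed)
q_vol, rho_cp = 1e5, 4.0e6       # volumetric heating [W/m^3], rho*cp [J/m^3/K]
dx = L / (n - 1)
dt = 0.4 * dx**2 / alpha_eff     # within the explicit stability limit
T = np.zeros(n)                  # start isothermal, temperatures relative to walls

for _ in range(2000):
    lap = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    T[1:-1] += dt * (alpha_eff * lap + q_vol / rho_cp)
    T[0] = T[-1] = 0.0           # isothermal top and bottom boundaries

print(f"max superheat after {2000 * dt:.0f} s: {T.max():.2f} K")
```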

  7. Development of a Conservative Model Validation Approach for Reliable Analysis

    Science.gov (United States)

    2015-01-01

    CIE 2015, August 2-5, 2015, Boston, Massachusetts, USA [DRAFT] DETC2015-46982, "Development of a Conservative Model Validation Approach for Reliable Analysis". ...obtain a conservative simulation model for reliable design even with limited experimental data. Very little research has taken into account the... In Section 3, the proposed conservative model validation is briefly compared to the conventional model validation approach. Section 4 describes how to account...

  8. Neuroprotective effect of lurasidone via antagonist activities on histamine in a rat model of cranial nerve involvement.

    Science.gov (United States)

    He, Baoming; Yu, Liang; Li, Suping; Xu, Fei; Yang, Lili; Ma, Shuai; Guo, Yi

    2018-04-01

    Cranial nerve involvement frequently involves neuron damage and often leads to psychiatric disorder caused by multiple inducements. Lurasidone is a novel antipsychotic agent approved for the treatment of cranial nerve involvement and a number of mental health conditions in several countries. In the present study, the neuroprotective effect of lurasidone by antagonist activities on histamine was investigated in a rat model of cranial nerve involvement. The antagonist activities of lurasidone on serotonin 5‑HT7, serotonin 5‑HT2A, serotonin 5‑HT1A and serotonin 5‑HT6 were analyzed, and the preclinical therapeutic effects of lurasidone were examined in a rat model of cranial nerve involvement. The safety, maximum tolerated dose (MTD) and preliminary antitumor activity of lurasidone were also assessed in the cranial nerve involvement model. The therapeutic dose of lurasidone was 0.32 mg once daily, administered continuously in 14‑day cycles. The results of the present study found that the preclinical prescriptions induced positive behavioral responses following treatment with lurasidone. The MTD was identified as a once daily administration of 0.32 mg lurasidone. Long‑term treatment with lurasidone for cranial nerve involvement was shown to improve the therapeutic effects and reduce anxiety in the experimental rats. In addition, treatment with lurasidone did not affect body weight. The expression of the language competence protein, Forkhead‑BOX P2, was increased, and the levels of neuroprotective SxIP motif and microtubule end‑binding protein were increased in the hippocampal cells of rats with cranial nerve involvement treated with lurasidone. Lurasidone therapy reinforced memory capability and decreased anxiety. Taken together, lurasidone treatment appeared to protect against language disturbances associated with negative and cognitive impairment in the rat model of cranial nerve involvement, providing a basis for its use in the clinical treatment of

  9. A data-driven modeling approach to identify disease-specific multi-organ networks driving physiological dysregulation.

    Directory of Open Access Journals (Sweden)

    Warren D Anderson

    2017-07-01

    Full Text Available Multiple physiological systems interact throughout the development of a complex disease. Knowledge of the dynamics and connectivity of interactions across physiological systems could facilitate the prevention or mitigation of organ damage underlying complex diseases, many of which are currently refractory to available therapeutics (e.g., hypertension. We studied the regulatory interactions operating within and across organs throughout disease development by integrating in vivo analysis of gene expression dynamics with a reverse engineering approach to infer data-driven dynamic network models of multi-organ gene regulatory influences. We obtained experimental data on the expression of 22 genes across five organs, over a time span that encompassed the development of autonomic nervous system dysfunction and hypertension. We pursued a unique approach for identification of continuous-time models that jointly described the dynamics and structure of multi-organ networks by estimating a sparse subset of ∼12,000 possible gene regulatory interactions. Our analyses revealed that an autonomic dysfunction-specific multi-organ sequence of gene expression activation patterns was associated with a distinct gene regulatory network. We analyzed the model structures for adaptation motifs, and identified disease-specific network motifs involving genes that exhibited aberrant temporal dynamics. Bioinformatic analyses identified disease-specific single nucleotide variants within or near transcription factor binding sites upstream of key genes implicated in maintaining physiological homeostasis. Our approach illustrates a novel framework for investigating the pathogenesis through model-based analysis of multi-organ system dynamics and network properties. Our results yielded novel candidate molecular targets driving the development of cardiovascular disease, metabolic syndrome, and immune dysfunction.

  10. A robust quantitative near infrared modeling approach for blend monitoring.

    Science.gov (United States)

    Mohan, Shikhar; Momose, Wataru; Katz, Jeffrey M; Hossain, Md Nayeem; Velez, Natasha; Drennen, James K; Anderson, Carl A

    2018-01-30

    This study demonstrates a material-sparing Near-Infrared modeling approach for powder blend monitoring. In this new approach, gram-scale powder mixtures are subjected to compression loads to simulate the effect of scale using an Instron universal testing system. Models prepared by the new method development approach (small-scale method) and by a traditional method development (blender-scale method) were compared by simultaneously monitoring a 1 kg batch size blend run. Both models demonstrated similar performance. The small-scale method strategy significantly reduces the total resources expended to develop Near-Infrared calibration models for on-line blend monitoring. Further, this development approach does not require the actual equipment (i.e., blender) to which the method will be applied, only a similar optical interface. Thus, a robust on-line blend monitoring method can be fully developed before any large-scale blending experiment is viable, allowing the blend method to be used during scale-up and blend development trials. Copyright © 2017. Published by Elsevier B.V.

  11. An ontology-based approach for modelling architectural styles

    OpenAIRE

    Pahl, Claus; Giesecke, Simon; Hasselbring, Wilhelm

    2007-01-01

    The conceptual modelling of software architectures is of central importance for the quality of a software system. A rich modelling language is required to integrate the different aspects of architecture modelling, such as architectural styles, structural and behavioural modelling, into a coherent framework. We propose an ontological approach for architectural style modelling based on description logic as an abstract, meta-level modelling instrument. Architect...

  12. An Approach to Enforcing Clark-Wilson Model in Role-based Access Control Model

    Institute of Scientific and Technical Information of China (English)

    LIANG Bin; SHI Wenchang; SUN Yufang; SUN Bo

    2004-01-01

    Using one security model to enforce another is a prospective solution for multi-policy support. In this paper, an approach to enforcing the Clark-Wilson data integrity model within the Role-based access control (RBAC) model is proposed. A highly feasible enforcement construction is presented. In this construction, a direct way to enforce the Clark-Wilson model is provided, the corresponding relations among users, transformation procedures, and constrained data items are strengthened, and the concepts of task and subtask are introduced to enhance the support for least privilege. The proposed approach widens the applicability of RBAC. A theoretical foundation for adopting the Clark-Wilson model in an RBAC system at small cost is offered, meeting the requirements of multi-policy support and policy flexibility.

  13. Pioneering partnerships: Resident involvement from multiple perspectives

    NARCIS (Netherlands)

    Baur, V.E.; Abma, T.A.; Boelsma, F.; Woelders, S.

    2013-01-01

    Resident involvement in residential care homes is a challenge due to shortcomings of consumerist and formal approaches such as resident councils. The PARTNER approach aims to involve residents through collective action to improve their community life and wellbeing. The purpose of this article is to

  14. Defining stakeholder involvement in participatory design processes

    NARCIS (Netherlands)

    Vink, P.; Imada, A.S.; Zink, K.J.

    2008-01-01

    A participatory approach could be used to implement work place or organizational improvements. However, the question is which participants should be involved and how. In this paper the theoretical involvement in different steps of a linear stepwise approach is described and compared with the latest

  15. Developing a quality by design approach to model tablet dissolution testing: an industrial case study.

    Science.gov (United States)

    Yekpe, Ketsia; Abatzoglou, Nicolas; Bataille, Bernard; Gosselin, Ryan; Sharkawi, Tahmer; Simard, Jean-Sébastien; Cournoyer, Antoine

    2017-11-02

    This study applied the concept of Quality by Design (QbD) to tablet dissolution. Its goal was to propose a quality control strategy to model dissolution testing of solid oral dose products according to International Conference on Harmonization guidelines. The methodology involved the following three steps: (1) a risk analysis to identify the material- and process-related parameters impacting the critical quality attributes of dissolution testing, (2) an experimental design to evaluate the influence of design factors (attributes and parameters selected by risk analysis) on dissolution testing, and (3) an investigation of the relationship between design factors and dissolution profiles. Results show that (a) in the case studied, the two parameters impacting dissolution kinetics are active pharmaceutical ingredient particle size distributions and tablet hardness and (b) these two parameters could be monitored with PAT tools to predict dissolution profiles. Moreover, based on the results obtained, modeling dissolution is possible. The practicality and effectiveness of the QbD approach were demonstrated through this industrial case study. Implementing such an approach systematically in industrial pharmaceutical production would reduce the need for tablet dissolution testing.

  16. The Generalised Ecosystem Modelling Approach in Radiological Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Klos, Richard

    2008-03-15

    An independent modelling capability is required by SSI in order to evaluate dose assessments carried out in Sweden by, amongst others, SKB. The main focus is the evaluation of the long-term radiological safety of radioactive waste repositories for both spent fuel and low-level radioactive waste. To meet the requirement for an independent modelling tool for use in biosphere dose assessments, SSI through its modelling team CLIMB commissioned the development of a new model in 2004, a project to produce an integrated model of radionuclides in the landscape. The generalised ecosystem modelling approach (GEMA) is the result. GEMA is a modular system of compartments representing the surface environment. It can be configured, through water and solid material fluxes, to represent local details in the range of ecosystem types found in the past, present and future Swedish landscapes. The approach is generic but fine tuning can be carried out using local details of the surface drainage system. The modular nature of the modelling approach means that GEMA modules can be linked to represent large scale surface drainage features over an extended domain in the landscape. System change can also be managed in GEMA, allowing a flexible and comprehensive model of the evolving landscape to be constructed. Environmental concentrations of radionuclides can be calculated and the GEMA dose pathway model provides a means of evaluating the radiological impact of radionuclide release to the surface environment. This document sets out the philosophy and details of GEMA and illustrates the functioning of the model with a range of examples featuring the recent CLIMB review of SKB's SR-Can assessment

  17. The Generalised Ecosystem Modelling Approach in Radiological Assessment

    International Nuclear Information System (INIS)

    Klos, Richard

    2008-03-01

    An independent modelling capability is required by SSI in order to evaluate dose assessments carried out in Sweden by, amongst others, SKB. The main focus is the evaluation of the long-term radiological safety of radioactive waste repositories for both spent fuel and low-level radioactive waste. To meet the requirement for an independent modelling tool for use in biosphere dose assessments, SSI through its modelling team CLIMB commissioned the development of a new model in 2004, a project to produce an integrated model of radionuclides in the landscape. The generalised ecosystem modelling approach (GEMA) is the result. GEMA is a modular system of compartments representing the surface environment. It can be configured, through water and solid material fluxes, to represent local details in the range of ecosystem types found in the past, present and future Swedish landscapes. The approach is generic but fine tuning can be carried out using local details of the surface drainage system. The modular nature of the modelling approach means that GEMA modules can be linked to represent large scale surface drainage features over an extended domain in the landscape. System change can also be managed in GEMA, allowing a flexible and comprehensive model of the evolving landscape to be constructed. Environmental concentrations of radionuclides can be calculated and the GEMA dose pathway model provides a means of evaluating the radiological impact of radionuclide release to the surface environment. This document sets out the philosophy and details of GEMA and illustrates the functioning of the model with a range of examples featuring the recent CLIMB review of SKB's SR-Can assessment

  18. A conceptual model of people's approach to sanitation

    Energy Technology Data Exchange (ETDEWEB)

    Avvannavar, Santosh M. [Center for Sustainable Technologies, Indian Institute of Science, Bangalore-560012, Karnataka (India)], E-mail: santosh@astra.iisc.ernet; Mani, Monto [Center for Sustainable Technologies, Indian Institute of Science, Bangalore-560012, Karnataka (India)

    2008-02-01

    Sanitation is a term primarily used to characterize the safe and sound handling (and disposal) of human excreta - or simply, people's approach to taking care of their (unavoidable) primal urge. According to the recent Human Development Report 2006, global access to proper sanitation stands at approximately 58%, with 37% being a conservative estimate for both South Asia and Sub-Saharan Africa. Various multi-million dollar sanitation programmes the world over have had little success, often due to inadequate understanding of people's sanitation approach. The sanitation approach includes the perceptions, feelings and practices involved in satisficing the primal need to defecate and urinate (and the associated disposal). This paper presents a structure to understand the nature of the psycho-socio-economic influences that determine a society's approach to sanitation. Societies across the globe have evolved imbibing diverse influences attributed to the local environment, religion, cultural practices, war, etc. While a civilization's living environment reflects these influences in its built-environment characteristics, the influences are often deep-rooted and can be traced to the way community members satisfice their need to defecate and urinate (the sanitation approach). The objective of this paper is to trace the various approaches that diverse societies/civilizations across the world have had towards sanitation over time, and to present a structure to articulate and understand the determining factors. Sanitation also involves other domestic (solid and liquid) waste disposal, but in the context of this paper the scope of sanitation has been restricted to human excreta alone. The structure presented and discussed in this paper would be useful in understanding a community better in terms of providing appropriate sanitation. It is hoped that this structure be considered as a basis for further refinement and detailed research into each of the factors determining people's sanitation

  19. Development of a subway operation incident delay model using accelerated failure time approaches.

    Science.gov (United States)

    Weng, Jinxian; Zheng, Yang; Yan, Xuedong; Meng, Qiang

    2014-12-01

    This study aims to develop a subway operational incident delay model using the parametric accelerated failure time (AFT) approach. Six parametric AFT models, including log-logistic, lognormal and Weibull models with fixed and random parameters, are built based on Hong Kong subway operation incident data from 2005 to 2012. In addition, a Weibull model with gamma heterogeneity is also considered for comparison of model performance. The goodness-of-fit test results show that the log-logistic AFT model with random parameters is most suitable for estimating subway incident delay. The results show that a longer subway operation incident delay is highly correlated with the following factors: power cable failure, signal cable failure, turnout communication disruption and crashes involving a casualty. Vehicle failure has the least impact on the increment of subway operation incident delay. Based on these results, several possible measures, such as the use of short-distance and wireless communication technology (e.g., Wifi and Zigbee), are suggested to shorten the delay caused by subway operation incidents. Finally, the temporal transferability test results show that the developed log-logistic AFT model with random parameters is stable over time. Copyright © 2014 Elsevier Ltd. All rights reserved.
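    A hedged sketch of fitting a fixed-parameter log-logistic AFT model with the lifelines library is shown below; the column names and the toy incident data are made up for illustration, since the Hong Kong data are not reproduced here, and a random-parameters specification would require a different estimation framework.

```python
# Fit a log-logistic accelerated-failure-time model of incident delay with
# lifelines (toy data; column names and values are assumptions for illustration).
import pandas as pd
from lifelines import LogLogisticAFTFitter

df = pd.DataFrame({
    "delay_min":      [12, 45, 8, 90, 30, 22, 60, 15, 75, 18, 40, 10],  # incident delay
    "observed":       [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1],             # all delays observed
    "power_failure":  [0, 1, 0, 1, 0, 0, 1, 0, 1, 0, 1, 0],
    "signal_failure": [1, 0, 0, 1, 1, 0, 0, 0, 1, 1, 0, 0],
})

aft = LogLogisticAFTFitter()
aft.fit(df, duration_col="delay_min", event_col="observed")
aft.print_summary()          # covariate effects on the (log) delay scale
```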

  20. Popularity Modeling for Mobile Apps: A Sequential Approach.

    Science.gov (United States)

    Zhu, Hengshu; Liu, Chuanren; Ge, Yong; Xiong, Hui; Chen, Enhong

    2015-07-01

    The popularity information in App stores, such as chart rankings, user ratings, and user reviews, provides an unprecedented opportunity to understand user experiences with mobile Apps, learn the process of adoption of mobile Apps, and thus enables better mobile App services. While the importance of popularity information is well recognized in the literature, the use of the popularity information for mobile App services is still fragmented and under-explored. To this end, in this paper, we propose a sequential approach based on hidden Markov model (HMM) for modeling the popularity information of mobile Apps toward mobile App services. Specifically, we first propose a popularity based HMM (PHMM) to model the sequences of the heterogeneous popularity observations of mobile Apps. Then, we introduce a bipartite based method to precluster the popularity observations. This can help to learn the parameters and initial values of the PHMM efficiently. Furthermore, we demonstrate that the PHMM is a general model and can be applicable for various mobile App services, such as trend based App recommendation, rating and review spam detection, and ranking fraud detection. Finally, we validate our approach on two real-world data sets collected from the Apple Appstore. Experimental results clearly validate both the effectiveness and efficiency of the proposed popularity modeling approach.
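    The snippet below is a small self-contained sketch of the core HMM computation any such popularity model relies on: the forward algorithm scoring a sequence of discretised popularity observations (e.g. 0 = falling, 1 = stable, 2 = rising chart rank). The transition and emission parameters are toy values, not the learned PHMM from the paper, and the paper's bipartite pre-clustering step is not shown.

```python
# Forward algorithm for a discrete-observation HMM with per-step scaling
# (toy parameters; illustrative of the PHMM's likelihood computation only).
import numpy as np

start = np.array([0.6, 0.4])                       # P(initial hidden state)
trans = np.array([[0.8, 0.2],                      # P(next state | state)
                  [0.3, 0.7]])
emit = np.array([[0.5, 0.4, 0.1],                  # P(observation | state)
                 [0.1, 0.3, 0.6]])

def log_likelihood(obs):
    """Scaled forward recursion; returns log P(observation sequence)."""
    alpha = start * emit[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]
        loglik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return loglik

print(log_likelihood([0, 1, 2, 2, 2, 1]))          # higher = more probable sequence
```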

  1. Modeling gene expression measurement error: a quasi-likelihood approach

    Directory of Open Access Journals (Sweden)

    Strimmer Korbinian

    2003-03-01

    Full Text Available Abstract Background Using suitable error models for gene expression measurements is essential in the statistical analysis of microarray data. However, the true probabilistic model underlying gene expression intensity readings is generally not known. Instead, in currently used approaches some simple parametric model is assumed (usually a transformed normal distribution) or the empirical distribution is estimated. However, both these strategies may not be optimal for gene expression data, as the non-parametric approach ignores known structural information whereas the fully parametric models run the risk of misspecification. A further related problem is the choice of a suitable scale for the model (e.g. observed vs. log-scale). Results Here a simple semi-parametric model for gene expression measurement error is presented. In this approach inference is based on an approximate likelihood function (the extended quasi-likelihood). Only partial knowledge about the unknown true distribution is required to construct this function. In the case of gene expression this information is available in the form of the postulated (e.g. quadratic) variance structure of the data. As the quasi-likelihood behaves (almost) like a proper likelihood, it allows for the estimation of calibration and variance parameters, and it is also straightforward to obtain corresponding approximate confidence intervals. Unlike most other frameworks, it also allows analysis on any preferred scale, i.e. both on the original linear scale as well as on a transformed scale. It can also be employed in regression approaches to model systematic (e.g. array or dye) effects. Conclusions The quasi-likelihood framework provides a simple and versatile approach to analyze gene expression data that does not make any strong distributional assumptions about the underlying error model. For several simulated as well as real data sets it provides a better fit to the data than competing models. In an example it also

  2. A modular approach to numerical human body modeling

    NARCIS (Netherlands)

    Forbes, P.A.; Griotto, G.; Rooij, L. van

    2007-01-01

    The choice of a human body model for a simulated automotive impact scenario must take into account both accurate model response and computational efficiency as key factors. This study presents a "modular numerical human body modeling" approach which allows the creation of a customized human body

  3. The relevance and implications of organizational involvement for serious mental illness populations.

    Science.gov (United States)

    Treichler, Emily B H; Evans, Eric A; Johnson, J Rock; O'Hare, Mary; Spaulding, William D

    2015-07-01

    Consumer involvement has gained greater prominence in serious mental illness (SMI) because of the harmonious forces of new research findings, psychiatric rehabilitation, and the recovery movement. Previously conceived subdomains of consumer involvement include physical involvement, social involvement, and psychological involvement. We posit a fourth subdomain, organizational involvement. We have operationally defined organizational involvement as the involvement of mental health consumers in activities and organizations that are relevant to the mental health aspect of their identities from an individual to a systemic level across arenas relevant to mental health. This study surveyed adults with SMI regarding their current level of organizational involvement along with their preferences and beliefs about organizational involvement. Additionally, a path model was conducted to understand the relationships between domains of consumer involvement. Although participants reported wanting to be involved in identified organizational involvement activities and believing it was important to be involved in these kinds of activities, organizational involvement was low overall. The path model indicated that psychological involvement among other factors influence organizational involvement, which informed our suggestions to improve organizational involvement among people with SMI. Successful implementation must be a thoroughly consumer-centered approach creating meaningful and accessible involvement opportunities. Our study and prior studies indicate that organizational involvement and other subdomains of consumer involvement are key to the health and wellbeing of consumers, and therefore greater priority should be given to interventions aimed at increasing these essential domains. (c) 2015 APA, all rights reserved.

  4. Partnership Selection Involving Mixed Types of Uncertain Preferences

    Directory of Open Access Journals (Sweden)

    Li-Ching Ma

    2013-01-01

    Full Text Available Partnership selection is an important issue in management science. This study proposes a general model based on mixed integer programming and goal-programming analytic hierarchy process (GP-AHP to solve partnership selection problems involving mixed types of uncertain or inconsistent preferences. The proposed approach is designed to deal with crisp, interval, step, fuzzy, or mixed comparison preferences, derive crisp priorities, and improve multiple solution problems. The degree of fulfillment of a decision maker’s preferences is also taken into account. The results show that the proposed approach keeps more solution ratios within the given preferred intervals and yields less deviation. In addition, the proposed approach can treat incomplete preference matrices with flexibility in reducing the number of pairwise comparisons required and can also be conveniently developed into a decision support system.

  5. A devolved model for public involvement in the field of mental health research: case study learning.

    Science.gov (United States)

    Moule, Pam; Davies, Rosie

    2016-12-01

    Patient and public involvement in all aspects of research is espoused and there is a continued interest in understanding its wider impact. Existing investigations have identified both beneficial outcomes and remaining issues. This paper presents the impact of public involvement in one case study led by a mental health charity conducted as part of a larger research project. The case study used a devolved model of working, contracting with service user-led organizations to maximize the benefits of local knowledge on the implementation of personalized budgets, support recruitment and local user-led organizations. To understand the processes and impact of public involvement in a devolved model of working with user-led organizations. Multiple data collection methods were employed throughout 2012. These included interviews with the researchers (n = 10) and research partners (n = 5), observation of two case study meetings and the review of key case study documentation. Analysis was conducted in NVivo10 using a coding framework developed following a literature review. Five key themes emerged from the data; Devolved model, Nature of involvement, Enabling factors, Implementation challenges and Impact. While there were some challenges of implementing the devolved model it is clear that our findings add to the growing understanding of the positive benefits research partners can bring to complex research. A devolved model can support the involvement of user-led organizations in research if there is a clear understanding of the underpinning philosophy and support mechanisms are in place. © 2015 The Authors. Health Expectations Published by John Wiley & Sons Ltd.

  6. A systemic approach to modelling of radiobiological effects

    International Nuclear Information System (INIS)

    Obaturov, G.M.

    1988-01-01

    Basic principles of the systemic approach to modelling of the radiobiological effects at different levels of cell organization have been formulated. The methodology is proposed for theoretical modelling of the effects at these levels

  7. Validating a Model of Motivational Factors Influencing Involvement for Parents of Transition-Age Youth with Disabilities

    Science.gov (United States)

    Hirano, Kara A.; Shanley, Lina; Garbacz, S. Andrew; Rowe, Dawn A.; Lindstrom, Lauren; Leve, Leslie D.

    2018-01-01

    Parent involvement is a predictor of postsecondary education and employment outcomes, but rigorous measures of parent involvement for youth with disabilities are lacking. Hirano, Garbacz, Shanley, and Rowe adapted scales based on Hoover-Dempsey and Sandler model of parent involvement for use with parents of youth with disabilities aged 14 to 23.…

  8. Supporting Active User Involvement in Prototyping

    DEFF Research Database (Denmark)

    Grønbæk, Kaj

    1990-01-01

    The term prototyping has in recent years become a buzzword in both research and practice of system design due to a number of claimed advantages of prototyping techniques over traditional specification techniques. In particular it is often stated that prototyping facilitates the users' involvement...... in the development process. But prototyping does not automatically imply active user involvement! Thus a cooperative prototyping approach aiming at involving users actively and creatively in system design is proposed in this paper. The key point of the approach is to involve users in activities that closely couple...... development of prototypes to early evaluation of prototypes in envisioned use situations. Having users involved in such activities creates new requirements for tool support. Tools that support direct manipulation of prototypes and simulation of behaviour have shown promise for cooperative prototyping...

  9. A Blended Learning Approach to Teaching Project Management: A Model for Active Participation and Involvement--Insights from Norway

    Science.gov (United States)

    Hussein, Bassam A.

    2015-01-01

    The paper demonstrates and evaluates the effectiveness of a blended learning approach to create a meaningful learning environment. We use the term blended learning approach in this paper to refer to the use of multiple or hybrid instructional methods that emphasize the role of learners as contributors to the learning process rather than recipients…

  10. Deducing Electronic Unit Internal Response During a Vibration Test Using a Lumped Parameter Modeling Approach

    Science.gov (United States)

    Van Dyke, Michael B.

    2014-01-01

    During random vibration testing of electronic boxes there is often a desire to know the dynamic response of certain internal printed wiring boards (PWBs) for the purpose of monitoring the response of sensitive hardware or for post-test forensic analysis in support of anomaly investigation. Due to restrictions on internally mounted accelerometers for most flight hardware there is usually no means to empirically observe the internal dynamics of the unit, so one must resort to crude and highly uncertain approximations. One common practice is to apply Miles Equation, which does not account for the coupled response of the board in the chassis, resulting in significant over- or under-prediction. This paper explores the application of simple multiple-degree-of-freedom lumped parameter modeling to predict the coupled random vibration response of the PWBs in their fundamental modes of vibration. A simple tool using this approach could be used during or following a random vibration test to interpret vibration test data from a single external chassis measurement to deduce internal board dynamics by means of a rapid correlation analysis. Such a tool might also be useful in early design stages as a supplemental analysis to a more detailed finite element analysis to quickly prototype and analyze the dynamics of various design iterations. After developing the theoretical basis, a lumped parameter modeling approach is applied to an electronic unit for which both external and internal test vibration response measurements are available for direct comparison. Reasonable correlation of the results demonstrates the potential viability of such an approach. Further development of the preliminary approach presented in this paper will involve correlation with detailed finite element models and additional relevant test data.
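    A hedged sketch of the lumped-parameter idea follows: a two-degree-of-freedom chassis-plus-board model driven by base excitation, from which the board-to-chassis transmissibility can be read off and used to infer internal board response from an external chassis accelerometer. All masses, stiffnesses and damping values are assumed for illustration and are not from the paper.

```python
# 2-DOF lumped-parameter model (chassis + board) under harmonic base excitation;
# computes the board-to-chassis response ratio across frequency (assumed values).
import numpy as np

m1, m2 = 2.0, 0.1            # chassis mass, board modal mass [kg]
k1, k2 = 2.0e6, 2.0e5        # mount and board-mode stiffness [N/m]
c1, c2 = 80.0, 6.0           # viscous damping [N s/m]

M = np.array([[m1, 0.0], [0.0, m2]])
C = np.array([[c1 + c2, -c2], [-c2, c2]])
K = np.array([[k1 + k2, -k2], [-k2, k2]])

freqs = np.linspace(20.0, 2000.0, 2000)
ratio = np.empty_like(freqs)
for i, f in enumerate(freqs):
    w = 2.0 * np.pi * f
    # steady-state harmonic solution of  M x'' + C x' + K x = forcing from base motion
    F = np.array([k1 + 1j * w * c1, 0.0])            # unit base displacement input
    x = np.linalg.solve(K + 1j * w * C - w**2 * M, F)
    ratio[i] = abs(x[1]) / abs(x[0])                  # board / chassis response

print(f"peak board-to-chassis amplification: {ratio.max():.1f}")
```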

  11. A kinetic approach to magnetospheric modeling

    International Nuclear Information System (INIS)

    Whipple, E.C. Jr.

    1979-01-01

    The earth's magnetosphere is caused by the interaction between the flowing solar wind and the earth's magnetic dipole, with the distorted magnetic field in the outer parts of the magnetosphere due to the current systems resulting from this interaction. It is surprising that even the conceptually simple problem of the collisionless interaction of a flowing plasma with a dipole magnetic field has not been solved. A kinetic approach is essential if one is to take into account the dispersion of particles with different energies and pitch angles and the fact that particles on different trajectories have different histories and may come from different sources. Solving the interaction problem involves finding the various types of possible trajectories, populating them with particles appropriately, and then treating the electric and magnetic fields self-consistently with the resulting particle densities and currents. This approach is illustrated by formulating a procedure for solving the collisionless interaction problem on open field lines in the case of a slowly flowing magnetized plasma interacting with a magnetic dipole.

  12. A kinetic approach to magnetospheric modeling

    Science.gov (United States)

    Whipple, E. C., Jr.

    1979-01-01

    The earth's magnetosphere is caused by the interaction between the flowing solar wind and the earth's magnetic dipole, with the distorted magnetic field in the outer parts of the magnetosphere due to the current systems resulting from this interaction. It is surprising that even the conceptually simple problem of the collisionless interaction of a flowing plasma with a dipole magnetic field has not been solved. A kinetic approach is essential if one is to take into account the dispersion of particles with different energies and pitch angles and the fact that particles on different trajectories have different histories and may come from different sources. Solving the interaction problem involves finding the various types of possible trajectories, populating them with particles appropriately, and then treating the electric and magnetic fields self-consistently with the resulting particle densities and currents. This approach is illustrated by formulating a procedure for solving the collisionless interaction problem on open field lines in the case of a slowly flowing magnetized plasma interacting with a magnetic dipole.

  13. Dynamics and control of quadcopter using linear model predictive control approach

    Science.gov (United States)

    Islam, M.; Okasha, M.; Idres, M. M.

    2017-12-01

    This paper investigates the dynamics and control of a quadcopter using the Model Predictive Control (MPC) approach. The dynamic model is of high fidelity and nonlinear, with six degrees of freedom that include disturbances and model uncertainties. The control approach is developed based on MPC to track different reference trajectories, ranging from simple circular paths to complex helical trajectories. In this control technique, a linearized model is derived and the receding horizon method is applied to generate the optimal control sequence. Although MPC is computationally expensive, it is highly effective at dealing with different types of nonlinearities and constraints, such as actuator saturation and model uncertainties. The MPC parameters (control and prediction horizons) are selected by a trial-and-error approach. Several simulation scenarios are performed to examine and evaluate the performance of the proposed control approach in the MATLAB and Simulink environment. Simulation results show that this control approach is highly effective at tracking a given reference trajectory.
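
    To make the receding-horizon idea concrete, the sketch below implements an unconstrained linear MPC for a discretized double integrator (one translational axis, standing in for the linearized quadcopter model). The horizon, weights, sample time and step reference are assumptions for illustration only; the paper's full controller also handles constraints such as actuator saturation.

        import numpy as np

        dt, N = 0.05, 20                       # sample time and prediction horizon (assumed)
        A = np.array([[1.0, dt], [0.0, 1.0]])  # discretized double-integrator model
        B = np.array([[0.5 * dt**2], [dt]])
        Q, R = np.diag([50.0, 1.0]), 0.1       # state and input weights (assumed)

        # Stacked prediction matrices: X = F x0 + G U over the horizon.
        F = np.vstack([np.linalg.matrix_power(A, i + 1) for i in range(N)])
        G = np.zeros((2 * N, N))
        for i in range(N):
            for j in range(i + 1):
                G[2 * i:2 * i + 2, j:j + 1] = np.linalg.matrix_power(A, i - j) @ B

        Qbar = np.kron(np.eye(N), Q)
        Rbar = R * np.eye(N)

        def mpc_step(x0, x_ref):
            """Solve the unconstrained finite-horizon QP in closed form, return the first input."""
            Xref = np.tile(x_ref, N)
            H = G.T @ Qbar @ G + Rbar
            f = G.T @ Qbar @ (F @ x0 - Xref)
            U = np.linalg.solve(H, -f)
            return U[0]

        x = np.array([0.0, 0.0])
        x_ref = np.array([1.0, 0.0])           # step reference in position
        for k in range(100):                   # simulate the closed loop (receding horizon)
            u = mpc_step(x, x_ref)
            x = A @ x + B.flatten() * u
        print(f"final position: {x[0]:.3f} m")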

  14. Model-centric approaches for the development of health information systems.

    Science.gov (United States)

    Tuomainen, Mika; Mykkänen, Juha; Luostarinen, Heli; Pöyhölä, Assi; Paakkanen, Esa

    2007-01-01

    Modeling is used increasingly in healthcare to increase shared knowledge, to improve the processes, and to document the requirements of the solutions related to health information systems (HIS). There are numerous modeling approaches which aim to support these aims, but a careful assessment of their strengths, weaknesses and deficiencies is needed. In this paper, we compare three model-centric approaches in the context of HIS development: the Model-Driven Architecture, Business Process Modeling with BPMN and BPEL and the HL7 Development Framework. The comparison reveals that all these approaches are viable candidates for the development of HIS. However, they have distinct strengths and abstraction levels, they require local and project-specific adaptation and offer varying levels of automation. In addition, illustration of the solutions to the end users must be improved.

  15. Comparing Two Different Approaches to the Modeling of the Common Cause Failures in Fault Trees

    International Nuclear Information System (INIS)

    Vukovic, I.; Mikulicic, V.; Vrbanic, I.

    2002-01-01

    The potential for common cause failures in systems that perform critical functions has been recognized as a very important contributor to the risk associated with operation of nuclear power plants. Consequently, modeling of common cause failures (CCF) in fault trees has become one of the essential elements in any probabilistic safety assessment (PSA). Detailed and realistic representation of CCF potential in the fault tree structure is sometimes a very challenging task, especially in cases where a common cause group involves more than two components. During the last ten years the difficulties associated with this kind of modeling have been overcome to some degree by the development of integral PSA tools with high capabilities. Some of them allow for the definition of CCF groups and their automated expansion in the process of Boolean resolution and generation of minimal cutsets. On the other hand, in PSA models developed and run by more traditional tools, CCF potential had to be modeled in the fault trees explicitly. With explicit CCF modeling, fault trees can grow very large, especially when they involve CCF groups with 3 or more members, which can become an issue for the management of fault trees and basic events with traditional non-integral PSA models. For these reasons various simplifications had to be made. Speaking in terms of an overall PSA model, there are also some other issues that need to be considered, such as maintainability and accessibility of the model. In this paper a comparison is made between the two approaches to CCF modeling. The analysis is based on a full-scope Level 1 PSA model for internal initiating events that had originally been developed with a traditional PSA tool and was later transferred to a new-generation PSA tool with automated CCF modeling capabilities. The related aspects and issues mentioned above are discussed in the paper.

  16. Rapid customization system for 3D-printed splint using programmable modeling technique - a practical approach.

    Science.gov (United States)

    Li, Jianyou; Tanaka, Hiroya

    2018-01-01

    Traditional splinting processes are skill dependent and irreversible, and patient satisfaction levels during rehabilitation are invariably lowered by the heavy structure and poor ventilation of splints. To overcome this drawback, use of 3D-printing technology has been proposed in recent years, and public awareness of it has increased. However, application of 3D-printing technologies is limited by the low CAD proficiency of clinicians as well as unforeseen scan flaws within anatomic models. A programmable modeling tool has been employed to develop a semi-automatic design system for generating a printable splint model. The modeling process was divided into five stages, and the detailed steps involved in construction of the proposed system, as well as automatic thickness calculation, the lattice structure, and the assembly method, have been thoroughly described. The proposed approach allows clinicians to verify the state of the splint model at every stage, thereby facilitating adjustment of input content and/or other parameters to help solve possible modeling issues. A finite element analysis simulation was performed to evaluate the structural strength of generated models. A fit investigation was conducted with fabricated splints and volunteers to assess the wearing experience. Manual modeling steps involved in complex splint designs have been programmed into the proposed automatic system. Clinicians define the splinting region by drawing two curves, thereby obtaining the final model within minutes. The proposed system is capable of automatically patching up minor flaws within the limb model as well as calculating the thickness and lattice density of various splints. Large splints can be divided into three parts for simultaneous multiple printing. This study highlights the advantages, limitations, and possible strategies concerning application of programmable modeling tools in clinical processes, thereby aiding clinicians with lower CAD proficiencies to become adept

  17. Exploring the Influence of Parental Involvement and Socioeconomic Status on Teen Digital Citizenship: A Path Modeling Approach

    Science.gov (United States)

    Wang, Xianhui; Xing, Wanli

    2018-01-01

    One important aspect of digital citizenship, defined as "the norms of appropriate, responsible behavior with regard to technology use," is to reinforce ethical online behavior and discourage risky conduct. The purpose of this study was to examine the effects of parental involvement and socioeconomic status on teens' digital citizenship,…

  18. Managing supplier involvement in new product development: a portfolio approach

    NARCIS (Netherlands)

    Wynstra, Finn; ten Pierick, E.

    2000-01-01

    Supplier involvement in new product development projects has become an increasingly popular method for improving project effectiveness (product costs and quality) and project efficiency (development costs and time). One of the key issues in managing this involvement is determining which type of

  19. Risk Modelling for Passages in Approach Channel

    Directory of Open Access Journals (Sweden)

    Leszek Smolarek

    2013-01-01

    Multivariate statistics, stochastic processes, and simulation methods are used to identify and assess the risk measures. This paper presents the use of generalized linear models and Markov models to study risks to ships along the approach channel. These models, combined with simulation testing, are used to determine the time required for continuous monitoring of endangered objects or the period at which the level of risk should be verified.
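
    A minimal sketch of the Markov-model side of such an analysis is shown below: a three-state chain (safe, degraded, incident) is propagated until the incident probability crosses a verification threshold. The transition probabilities and the threshold are assumed for illustration and are not taken from the paper.

        import numpy as np

        P = np.array([[0.990, 0.009, 0.001],    # safe
                      [0.000, 0.950, 0.050],    # degraded
                      [0.000, 0.000, 1.000]])   # incident (absorbing)
        p = np.array([1.0, 0.0, 0.0])           # start in the safe state
        threshold = 0.05                        # acceptable incident probability (assumed)

        t = 0
        while p[2] < threshold:
            p = p @ P                           # one passage / time step
            t += 1
        print(f"risk level should be verified after {t} passages/time steps")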

  20. Perceptions of ambiguously unpleasant interracial interactions: a structural equation modeling approach.

    Science.gov (United States)

    Marino, Teresa L; Negy, Charles; Hammons, Mary E; McKinney, Cliff; Asberg, Kia

    2007-11-01

    Despite a general consensus in the United States that overtly racist acts are unacceptable, many ambiguous situations in everyday life raise questions of whether racism has influenced a person's behavior in an interracial encounter. The authors of the present study sought to (a) examine simultaneously an array of variables thought to be related to perceived racism and (b) investigate how the contribution of these variables may differ with respect to the asymmetry hypothesis, which suggests that acts of discrimination from a dominant person toward a subordinate person will be viewed as more biased than if the situation were reversed. The authors used a dual structural equation modeling approach. Results indicated that ethnic identity significantly predicted perceived racism. In addition, the extent to which cognitive interpretation style significantly predicted perceived racism depended on the ethnicity of participants involved in the interaction.

  1. Author’s response: A universal approach to modeling visual word recognition and reading: not only possible, but also inevitable.

    Science.gov (United States)

    Frost, Ram

    2012-10-01

    I have argued that orthographic processing cannot be understood and modeled without considering the manner in which orthographic structure represents phonological, semantic, and morphological information in a given writing system. A reading theory, therefore, must be a theory of the interaction of the reader with his/her linguistic environment. This outlines a novel approach to studying and modeling visual word recognition, an approach that focuses on the common cognitive principles involved in processing printed words across different writing systems. These claims were challenged by several commentaries that contested the merits of my general theoretical agenda, the relevance of the evolution of writing systems, and the plausibility of finding commonalities in reading across orthographies. Other commentaries extended the scope of the debate by bringing into the discussion additional perspectives. My response addresses all these issues. By considering the constraints of neurobiology on modeling reading, developmental data, and a large scope of cross-linguistic evidence, I argue that front-end implementations of orthographic processing that do not stem from a comprehensive theory of the complex information conveyed by writing systems do not present a viable approach for understanding reading. The common principles by which writing systems have evolved to represent orthographic, phonological, and semantic information in a language reveal the critical distributional characteristics of orthographic structure that govern reading behavior. Models of reading should thus be learning models, primarily constrained by cross-linguistic developmental evidence that describes how the statistical properties of writing systems shape the characteristics of orthographic processing. When this approach is adopted, a universal model of reading is possible.

  2. An approach to model validation and model-based prediction -- polyurethane foam case study.

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, Kevin J.; Rutherford, Brian Milne

    2003-07-01

    Enhanced software methodology and improved computing hardware have advanced the state of simulation technology to a point where large physics-based codes can be a major contributor in many systems analyses. This shift toward the use of computational methods has brought with it new research challenges in a number of areas including characterization of uncertainty, model validation, and the analysis of computer output. It is these challenges that have motivated the work described in this report. Approaches to and methods for model validation and (model-based) prediction have been developed recently in the engineering, mathematics and statistical literatures. In this report we have provided a fairly detailed account of one approach to model validation and prediction applied to an analysis investigating thermal decomposition of polyurethane foam. A model simulates the evolution of the foam in a high temperature environment as it transforms from a solid to a gas phase. The available modeling and experimental results serve as data for a case study focusing our model validation and prediction developmental efforts on this specific thermal application. We discuss several elements of the 'philosophy' behind the validation and prediction approach: (1) We view the validation process as an activity applying to the use of a specific computational model for a specific application. We do acknowledge, however, that an important part of the overall development of a computational simulation initiative is the feedback provided to model developers and analysts associated with the application. (2) We utilize information obtained for the calibration of model parameters to estimate the parameters and quantify uncertainty in the estimates. We rely, however, on validation data (or data from similar analyses) to measure the variability that contributes to the uncertainty in predictions for specific systems or units (unit-to-unit variability). (3) We perform statistical

  3. Feedback structure based entropy approach for multiple-model estimation

    Institute of Scientific and Technical Information of China (English)

    Shen-tu Han; Xue Anke; Guo Yunfei

    2013-01-01

    The variable-structure multiple-model (VSMM) approach, one of the multiple-model (MM) methods, is a popular and effective approach for handling problems with mode uncertainties. Model sequence set adaptation (MSA) is the key to designing a better VSMM. However, MSA methods in the literature leave considerable room for improvement, both theoretically and practically. To this end, we propose a feedback-structure-based entropy approach that can find the model sequence sets with the smallest size under certain conditions. The filtered data are fed back in real time and can be used by the minimum entropy (ME) based VSMM algorithms, i.e., MEVSMM. Firstly, the full Markov chains are used to achieve optimal solutions. Secondly, the myopic method together with the particle filter (PF) and the challenge match algorithm are also used to achieve sub-optimal solutions, a trade-off between practicability and optimality. The numerical results show that the proposed algorithm provides not only refined model sets but also a good robustness margin and very high accuracy.

  4. Comprehensive personal witness: a model to enlarge missional involvement of the local church

    Directory of Open Access Journals (Sweden)

    Hancke, Frans

    2013-06-01

    In The Split-Level Fellowship, Wesley Baker analysed the role of individual members in the Church. He gave a name to a tragic phenomenon with which Church leaders are familiar. Although true of society in general, it is especially true of the church. Baker called the difference between the committed few and the uninvolved many Factor Beta. This reality triggers the question: why are the majority of Christians in the world not missionally involved through personal witness, and which factors consequently influence personal witness and missional involvement? This article explains how the range of personal witness and missional involvement found in local churches is rooted in certain fundamental factors and conditions which mutually influence each other and ultimately contribute towards forming a certain paradigm. This paradigm acts as the basis from which certain behavioural patterns (witness) will manifest. The factors influencing witness are described as either accelerators or decelerators, and their relativity and mutual relationships are considered. Factors acting as decelerators can severely hamper or even annul witness, while accelerators, on the other hand, can have an immensely positive effect in enlarging the transformational influence of witness. In conclusion, a transformational model is developed through which paradigms can be influenced and eventually changed. This model fulfils a diagnostic and remedial function and will support local churches in enlarging the individual and corporate missional involvement of believers.

  5. Spatial Preference Modelling for equitable infrastructure provision: an application of Sen's Capability Approach

    Science.gov (United States)

    Wismadi, Arif; Zuidgeest, Mark; Brussel, Mark; van Maarseveen, Martin

    2014-01-01

    To determine whether the inclusion of spatial neighbourhood comparison factors in Preference Modelling allows spatial decision support systems (SDSSs) to better address spatial equity, we introduce Spatial Preference Modelling (SPM). To evaluate the effectiveness of this model in addressing equity, various standardisation functions in both Non-Spatial Preference Modelling and SPM are compared. The evaluation involves applying the model to a resource location-allocation problem for transport infrastructure in the Special Province of Yogyakarta in Indonesia. We apply Amartya Sen's Capability Approach to define opportunity to mobility as a non-income indicator. Using the extended Moran's I interpretation for spatial equity, we evaluate the distribution output regarding, first, 'the spatial distribution patterns of priority targeting for allocation' (SPT) and, second, 'the effect of new distribution patterns after location-allocation' (ELA). The Moran's I index of the initial map and its comparison with six patterns for SPT as well as ELA consistently indicates that the SPM is more effective for addressing spatial equity. We conclude that the inclusion of spatial neighbourhood comparison factors in Preference Modelling improves the capability of SDSS to address spatial equity. This study thus proposes a new formal method for SDSS with specific attention on resource location-allocation to address spatial equity.
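
    The extended Moran's I interpretation mentioned above builds on the standard global Moran's I statistic; a minimal sketch of that statistic is given below. The zone values and the neighbourhood weight matrix are purely illustrative assumptions.

        import numpy as np

        x = np.array([0.2, 0.8, 0.3, 0.7])         # e.g. opportunity-to-mobility per zone (assumed)
        W = np.array([[0, 1, 1, 0],
                      [1, 0, 0, 1],
                      [1, 0, 0, 1],
                      [0, 1, 1, 0]], dtype=float)  # binary spatial neighbourhood weights (assumed)

        n = len(x)
        z = x - x.mean()
        moran_I = (n / W.sum()) * (z @ W @ z) / (z @ z)
        print("Moran's I =", round(moran_I, 3))     # >0 clustered, <0 dispersed, ~0 random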

  6. A generalized quarter car modelling approach with frame flexibility ...

    Indian Academy of Sciences (India)

    HUSAIN KANCHWALA

    Department of Mechanical Engineering, Indian Institute of Technology, Kanpur 208016, India. Keywords: quarter-car model; Laplace domain; other wheel effects; reduced order; wheel hop; frame flexibility.

  7. Modeling the Relations among Parental Involvement, School Engagement and Academic Performance of High School Students

    Science.gov (United States)

    Al-Alwan, Ahmed F.

    2014-01-01

    The author proposed a model to explain how parental involvement and school engagement related to academic performance. Participants were 671 9th- and 10th-grade students who completed two scales of "parental involvement" and "school engagement" in their regular classrooms. Results of the path analysis suggested that the…

  8. Degree of multicollinearity and variables involved in linear dependence in additive-dominant models

    Directory of Open Access Journals (Sweden)

    Juliana Petrini

    2012-12-01

    The objective of this work was to assess the degree of multicollinearity and to identify the variables involved in linear dependence relations in additive-dominant models. Data on birth weight (n=141,567), yearling weight (n=58,124), and scrotal circumference (n=20,371) of Montana Tropical composite cattle were used. Diagnosis of multicollinearity was based on the variance inflation factor (VIF) and on the evaluation of the condition indexes and eigenvalues from the correlation matrix among explanatory variables. The first model studied (RM) included the fixed effect of dam age class at calving and the covariates associated with the direct and maternal additive and non-additive effects. The second model (R) included all the effects of the RM model except the maternal additive effects. Multicollinearity was detected in both models for all traits considered, with VIF values of 1.03 - 70.20 for RM and 1.03 - 60.70 for R. Collinearity increased with the increase in the number of variables in the model and the decrease in the number of observations, and it was classified as weak, with condition index values between 10.00 and 26.77. In general, the variables associated with additive and non-additive effects were involved in multicollinearity, partially due to the natural connection between these covariables as fractions of the biological types in breed composition.
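
    The VIF diagnostic used above can be computed by regressing each explanatory variable on the remaining ones and taking VIF_j = 1/(1 - R_j^2). The sketch below does this on toy data with one deliberately collinear column; it is not the cattle data set.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 500
        x1 = rng.normal(size=n)
        x2 = rng.normal(size=n)
        x3 = 0.9 * x1 + 0.1 * rng.normal(size=n)       # nearly collinear with x1 (assumed)
        X = np.column_stack([x1, x2, x3])

        def vif(X, j):
            """Variance inflation factor of column j of X."""
            y = X[:, j]
            Z = np.delete(X, j, axis=1)
            Z = np.column_stack([np.ones(len(Z)), Z])  # add intercept
            beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
            resid = y - Z @ beta
            r2 = 1.0 - resid.var() / y.var()
            return 1.0 / (1.0 - r2)

        for j in range(X.shape[1]):
            print(f"VIF(x{j + 1}) = {vif(X, j):.2f}")   # the collinear column shows a high VIF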

  9. A frequency domain approach for MPC tuning

    NARCIS (Netherlands)

    Özkan, L.; Meijs, J.B.; Backx, A.C.P.M.; Karimi, I.A.; Srinivasan, R.

    2012-01-01

    This paper presents a frequency domain based approach to tune the penalty weights in the model predictive control (MPC) formulation. The two-step tuning method involves the design of a favourite controller taking into account the model-plant mismatch followed by the controller matching. We implement

  10. [Burning mouth syndrome - a joint biopsychosocial approach].

    Science.gov (United States)

    Arpone, Francesca; Combremont, Florian; Weber, Kerstin; Scolozzi, Paolo

    2016-02-10

    Burning mouth syndrome (BMS) is a medical condition that is often refractory to conventional diagnostic and therapeutic methods. Patients suffering from BMS can benefit from a biopsychosocial approach in a joint, medical-psychological consultation model. Such a consultation exists at Geneva University Hospitals, involving collaboration between the maxillo-facial and oral surgery division and the division of liaison psychiatry and crisis intervention, in order to take into account the multiple factors involved in BMS onset and persistence. This article describes the clinical presentation of BMS and presents an integrated approach to treating these patients.

  11. Hypercompetitive Environments: An Agent-based model approach

    Science.gov (United States)

    Dias, Manuel; Araújo, Tanya

    Information technology (IT) environments are characterized by complex changes and rapid evolution. Globalization and the spread of technological innovation have increased the need for new strategic information resources, both for individual firms and for management environments. Improvements in multidisciplinary methods and, particularly, the availability of powerful computational tools are giving researchers an increasing opportunity to investigate management environments in their true complex nature. The adoption of a complex systems approach allows for modeling business strategies from a bottom-up perspective — understood as resulting from repeated and local interaction of economic agents — without disregarding the consequences of the business strategies themselves for the individual behavior of enterprises, the emergence of interaction patterns between firms, and management environments. Agent-based models are the leading approach in this attempt.

  12. Stochastic Boolean networks: An efficient approach to modeling gene regulatory networks

    Directory of Open Access Journals (Sweden)

    Liang Jinghang

    2012-08-01

    Background: Various computational models have been of interest due to their use in the modelling of gene regulatory networks (GRNs). As a logical model, probabilistic Boolean networks (PBNs) consider molecular and genetic noise, so the study of PBNs provides significant insights into the understanding of the dynamics of GRNs. This will ultimately lead to advances in developing therapeutic methods that intervene in the process of disease development and progression. The applications of PBNs, however, are hindered by the complexities involved in the computation of the state transition matrix and the steady-state distribution of a PBN. For a PBN with n genes and N Boolean networks, the complexity of computing the state transition matrix is O(nN·2^(2n)), or O(nN·2^n) for a sparse matrix. Results: This paper presents a novel implementation of PBNs based on the notions of stochastic logic and stochastic computation. This stochastic implementation of a PBN is referred to as a stochastic Boolean network (SBN). An SBN provides an accurate and efficient simulation of a PBN without and with random gene perturbation. The state transition matrix is computed in an SBN with a complexity of O(nL·2^n), where L is a factor related to the stochastic sequence length. Since the minimum sequence length required for a given evaluation accuracy increases approximately polynomially with the number of genes, n, while the number of Boolean networks, N, usually increases exponentially with n, L is typically smaller than N, especially in a network with a large number of genes. Hence, the computational efficiency of an SBN is primarily limited by the number of genes, and not directly by the total possible number of Boolean networks. Furthermore, a time-frame expanded SBN enables an efficient analysis of the steady-state distribution of a PBN. These findings are supported by the simulation results of a simplified p53 network, several randomly generated networks and a
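
    As a toy illustration of the PBN idea (not the authors' SBN implementation), the sketch below simulates a two-gene probabilistic Boolean network with random gene perturbation and estimates its 4x4 state-transition matrix from repeated stochastic runs. The predictor functions, selection probabilities, perturbation rate and sequence length are assumptions.

        import numpy as np

        rng = np.random.default_rng(1)

        # Two candidate Boolean predictor functions per gene, with selection probabilities (assumed).
        funcs_per_gene = [
            [(lambda s: s[1], 0.7), (lambda s: 1 - s[1], 0.3)],            # gene 0
            [(lambda s: s[0] & s[1], 0.6), (lambda s: s[0] | s[1], 0.4)],  # gene 1
        ]
        p_perturb = 0.01                      # random gene-perturbation probability (assumed)

        def step(state):
            nxt = []
            for funcs in funcs_per_gene:
                probs = [p for _, p in funcs]
                f = funcs[rng.choice(len(funcs), p=probs)][0]
                val = f(state)
                if rng.random() < p_perturb:  # random perturbation flips the gene
                    val = 1 - val
                nxt.append(int(val))
            return tuple(nxt)

        # Estimate the 4x4 state-transition matrix from stochastic sequences.
        L = 20000                             # stochastic sequence length (assumed)
        T = np.zeros((4, 4))
        for s0 in range(4):
            state0 = ((s0 >> 1) & 1, s0 & 1)
            for _ in range(L):
                s1 = step(state0)
                T[s0, (s1[0] << 1) | s1[1]] += 1
        T /= L
        print(np.round(T, 3))                 # rows: current state, columns: next state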

  13. Parental Involvement in Elementary Children's Religious Education: A Phenomenological Approach

    Science.gov (United States)

    Bunnell, Peter Wayne

    2016-01-01

    The issue of parental involvement in religious education is an important one for the family, the church, the Christian school, and society. The purpose of this phenomenological study was to describe parents' concepts and practices of involvement in their children's religious education as evangelical Christian parents in Midwestern communities.…

  14. A Review of Accident Modelling Approaches for Complex Critical Sociotechnical Systems

    National Research Council Canada - National Science Library

    Qureshi, Zahid H

    2008-01-01

    .... This report provides a review of key traditional accident modelling approaches and their limitations, and describes new system-theoretic approaches to the modelling and analysis of accidents in safety-critical systems...

  15. Parameter identification and global sensitivity analysis of Xin'anjiang model using meta-modeling approach

    Directory of Open Access Journals (Sweden)

    Xiao-meng Song

    2013-01-01

    Parameter identification, model calibration, and uncertainty quantification are important steps in the model-building process, and are necessary for obtaining credible results and valuable information. Sensitivity analysis of a hydrological model is a key step in model uncertainty quantification; it can identify the dominant parameters, reduce the model calibration uncertainty, and enhance the model optimization efficiency. There are, however, some shortcomings in classical approaches, including the long computation time and high computational cost required to quantitatively assess the sensitivity of a multiple-parameter hydrological model. For this reason, a two-step statistical evaluation framework using global techniques is presented. It is based on (1) a screening method (Morris) for qualitative ranking of parameters, and (2) a variance-based method integrated with a meta-model for quantitative sensitivity analysis, i.e., the Sobol method integrated with the response surface model (RSMSobol). First, the Morris screening method was used to qualitatively identify the parameters' sensitivity, and then ten parameters were selected for quantification of the sensitivity indices. Subsequently, the RSMSobol method was used to quantify the sensitivity, i.e., the first-order and total sensitivity indices based on the response surface model (RSM) were calculated. The RSMSobol method can not only quantify the sensitivity, but also reduce the computational cost, with good accuracy compared to the classical approaches. This approach will be effective and reliable in the global sensitivity analysis of a complex large-scale distributed hydrological model.
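
    A hedged sketch of a variance-based first-order (Sobol) index estimate is given below, using a pick-and-freeze estimator on the Ishigami test function as a cheap stand-in for the response surface model of the hydrological model. The sample size and the surrogate are assumptions; this is not the RSMSobol implementation.

        import numpy as np

        rng = np.random.default_rng(42)

        def surrogate(x):
            """Toy stand-in for the meta-model (Ishigami test function, assumed form)."""
            return np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1]) ** 2 + 0.1 * x[:, 2] ** 4 * np.sin(x[:, 0])

        d, N = 3, 20000
        A = rng.uniform(-np.pi, np.pi, size=(N, d))
        B = rng.uniform(-np.pi, np.pi, size=(N, d))
        fA, fB = surrogate(A), surrogate(B)
        var_total = np.var(np.concatenate([fA, fB]))

        for i in range(d):
            ABi = A.copy()
            ABi[:, i] = B[:, i]                          # vary only factor i between the two matrices
            fABi = surrogate(ABi)
            S1 = np.mean(fB * (fABi - fA)) / var_total   # Saltelli-style first-order estimator
            print(f"first-order Sobol index S{i + 1} ~ {S1:.2f}")
        # For the Ishigami function the analytic first-order indices are roughly 0.31, 0.44 and 0.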

  16. Relationships among Adolescents' Leisure Motivation, Leisure Involvement, and Leisure Satisfaction: A Structural Equation Model

    Science.gov (United States)

    Chen, Ying-Chieh; Li, Ren-Hau; Chen, Sheng-Hwang

    2013-01-01

    The purpose of this cross-sectional study was to test a cause-and-effect model of factors affecting leisure satisfaction among Taiwanese adolescents. A structural equation model was proposed in which the relationships among leisure motivation, leisure involvement, and leisure satisfaction were explored. The study collected data from 701 adolescent…

  17. Adolescents and Music Media: Toward an Involvement-Mediational Model of Consumption and Self-Concept

    Science.gov (United States)

    Kistler, Michelle; Rodgers, Kathleen Boyce; Power, Thomas; Austin, Erica Weintraub; Hill, Laura Griner

    2010-01-01

    Using social cognitive theory and structural regression modeling, we examined pathways between early adolescents' music media consumption, involvement with music media, and 3 domains of self-concept (physical appearance, romantic appeal, and global self-worth; N=124). A mediational model was supported for 2 domains of self-concept. Music media…

  18. Numeric, Agent-based or System dynamics model? Which modeling approach is the best for vast population simulation?

    Science.gov (United States)

    Cimler, Richard; Tomaskova, Hana; Kuhnova, Jitka; Dolezal, Ondrej; Pscheidl, Pavel; Kuca, Kamil

    2018-02-01

    Alzheimer's disease is one of the most common mental illnesses. It is posited that more than 25% of the population is affected by some mental disease during their lifetime. Treatment of each patient draws resources from the economy concerned. Therefore, it is important to quantify the potential economic impact. Agent-based, system dynamics and numerical approaches to dynamic modeling of the population of the European Union and its patients with Alzheimer's disease are presented in this article. Simulations, their characteristics, and the results from different modeling tools are compared. The results of these approaches are compared with EU population growth predictions from Eurostat, the statistical office of the EU. The methodology of model creation is described and all three modeling approaches are compared. The suitability of each modeling approach for population modeling is discussed. In this case study, all three approaches gave results corresponding with the EU population prediction. Moreover, we were able to predict the number of patients with AD and, depending on the modeling method, we were also able to monitor different characteristics of the population.
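
    Purely as an illustration of the numerical (stock-and-flow) style of model compared in the paper, the sketch below integrates two stocks, healthy individuals and Alzheimer's patients, forward in time. Every rate and initial value is an assumption, not an Eurostat or study figure.

        import numpy as np

        years = np.arange(2020, 2081)
        healthy, patients = 5.0e8, 1.0e7          # initial stocks (assumed)
        birth_rate, death_rate = 0.0095, 0.0100   # per-capita per year (assumed)
        incidence, ad_mortality = 0.0020, 0.0800  # AD incidence and patient mortality (assumed)

        H, P = [], []
        for _ in years:
            new_cases = incidence * healthy
            healthy += birth_rate * (healthy + patients) - death_rate * healthy - new_cases
            patients += new_cases - ad_mortality * patients
            H.append(healthy)
            P.append(patients)

        print(f"projected patients in {years[-1]}: {P[-1] / 1e6:.1f} million")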

  19. Prospective memory after moderate-to-severe traumatic brain injury: a multinomial modeling approach.

    Science.gov (United States)

    Pavawalla, Shital P; Schmitter-Edgecombe, Maureen; Smith, Rebekah E

    2012-01-01

    Prospective memory (PM), which can be understood as the processes involved in realizing a delayed intention, is consistently found to be impaired after a traumatic brain injury (TBI). Although PM can be empirically dissociated from retrospective memory, it inherently involves both a prospective component (i.e., remembering that an action needs to be carried out) and retrospective components (i.e., remembering what action needs to be executed and when). This study utilized a multinomial processing tree model to disentangle the prospective (that) and retrospective recognition (when) components underlying PM after moderate-to-severe TBI. Seventeen participants with moderate to severe TBI and 17 age- and education-matched control participants completed an event-based PM task that was embedded within an ongoing computer-based color-matching task. The multinomial processing tree modeling approach revealed a significant group difference in the prospective component, indicating that the control participants allocated greater preparatory attentional resources to the PM task compared to the TBI participants. Participants in the TBI group were also found to be significantly more impaired than controls in the when aspect of the retrospective component. These findings indicated that the TBI participants had greater difficulty allocating the necessary preparatory attentional resources to the PM task and greater difficulty discriminating between PM targets and nontargets during task execution, despite demonstrating intact posttest recall and/or recognition of the PM tasks and targets.

  20. Simple Heuristic Approach to Introduction of the Black-Scholes Model

    Science.gov (United States)

    Yalamova, Rossitsa

    2010-01-01

    A heuristic approach to explaining the Black-Scholes option pricing model in undergraduate classes is described. The approach draws upon the method of protocol analysis to encourage students to "think aloud" so that their mental models can be surfaced. It also relies upon extensive visualizations to communicate relationships that are…
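
    For reference, the quantitative counterpart of the heuristic classroom treatment is the standard Black-Scholes call-price formula; a minimal implementation is sketched below with illustrative inputs.

        from math import log, sqrt, exp, erf

        def norm_cdf(x):
            return 0.5 * (1.0 + erf(x / sqrt(2.0)))

        def bs_call(S, K, T, r, sigma):
            """European call price under the Black-Scholes model."""
            d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
            d2 = d1 - sigma * sqrt(T)
            return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

        # Illustrative inputs: at-the-money call, 1 year to expiry, 5% rate, 20% volatility
        print(round(bs_call(S=100, K=100, T=1.0, r=0.05, sigma=0.20), 2))

    With the assumed inputs the formula gives a call price of roughly 10.45.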

  1. Dopamine selectively remediates 'model-based' reward learning: a computational approach.

    Science.gov (United States)

    Sharp, Madeleine E; Foerde, Karin; Daw, Nathaniel D; Shohamy, Daphna

    2016-02-01

    Patients with loss of dopamine due to Parkinson's disease are impaired at learning from reward. However, it remains unknown precisely which aspect of learning is impaired. In particular, learning from reward, or reinforcement learning, can be driven by two distinct computational processes. One involves habitual stamping-in of stimulus-response associations, hypothesized to arise computationally from 'model-free' learning. The other, 'model-based' learning, involves learning a model of the world that is believed to support goal-directed behaviour. Much work has pointed to a role for dopamine in model-free learning. But recent work suggests model-based learning may also involve dopamine modulation, raising the possibility that model-based learning may contribute to the learning impairment in Parkinson's disease. To directly test this, we used a two-step reward-learning task which dissociates model-free versus model-based learning. We evaluated learning in patients with Parkinson's disease tested ON versus OFF their dopamine replacement medication and in healthy controls. Surprisingly, we found no effect of disease or medication on model-free learning. Instead, we found that patients tested OFF medication showed a marked impairment in model-based learning, and that this impairment was remediated by dopaminergic medication. Moreover, model-based learning was positively correlated with a separate measure of working memory performance, raising the possibility of common neural substrates. Our results suggest that some learning deficits in Parkinson's disease may be related to an inability to pursue reward based on complete representations of the environment.

  2. A Knowledge Model Sharing Based Approach to Privacy-Preserving Data Mining

    OpenAIRE

    Hongwei Tian; Weining Zhang; Shouhuai Xu; Patrick Sharkey

    2012-01-01

    Privacy-preserving data mining (PPDM) is an important problem and is currently studied in three approaches: the cryptographic approach, data publishing, and model publishing. However, each of these approaches has some problems. The cryptographic approach does not protect the privacy of learned knowledge models and may have performance and scalability issues. The data publishing approach, although popular, may suffer from too much utility loss for certain types of data mining applications. The m...

  3. A new approach to Naturalness in SUSY models

    CERN Document Server

    Ghilencea, D M

    2013-01-01

    We review recent results that provide a new approach to the old problem of naturalness in supersymmetric models, without relying on subjective definitions for the fine-tuning associated with fixing the EW scale (to its measured value) in the presence of quantum corrections. The approach can address in a model-independent way many questions related to this problem. The results show that naturalness and its measure (fine-tuning) are an intrinsic part of the likelihood to fit the data that includes the EW scale. One important consequence is that the additional constraint of fixing the EW scale, usually not imposed in the data fits of the models, impacts on their overall likelihood to fit the data (or chi^2/ndf, ndf: number of degrees of freedom). This has negative implications for the viability of currently popular supersymmetric extensions of the Standard Model.

  4. A Set Theoretical Approach to Maturity Models

    DEFF Research Database (Denmark)

    Lasrado, Lester; Vatrapu, Ravi; Andersen, Kim Normann

    2016-01-01

    Maturity Model research in IS has been criticized for the lack of theoretical grounding, methodological rigor, empirical validations, and ignorance of multiple and non-linear paths to maturity. To address these criticisms, this paper proposes a novel set-theoretical approach to maturity models characterized by equifinality, multiple conjunctural causation, and case diversity. We prescribe methodological guidelines consisting of a six-step procedure to systematically apply set theoretic methods to conceptualize, develop, and empirically derive maturity models and provide a demonstration...

  5. An integrated modeling approach to age invariant face recognition

    Science.gov (United States)

    Alvi, Fahad Bashir; Pears, Russel

    2015-03-01

    This research study proposes a novel method for face recognition based on anthropometric features, making use of an integrated approach comprising global and personalized models. The system is aimed at situations where lighting, illumination, and pose variations cause problems in face recognition. A personalized model covers individual aging patterns while a global model captures general aging patterns in the database. We introduced a de-aging factor that de-ages each individual in the database test and training sets. We used the k-nearest-neighbor approach for building the personalized and global models. Regression analysis was applied to build the models. During the test phase, we resort to voting on different features. We used the FG-Net database for checking the results of our technique and achieved a 65 percent rank-1 identification rate.

  6. Variational approach to chiral quark models

    Energy Technology Data Exchange (ETDEWEB)

    Futami, Yasuhiko; Odajima, Yasuhiko; Suzuki, Akira

    1987-03-01

    A variational approach is applied to a chiral quark model to test the validity of the perturbative treatment of the pion-quark interaction based on the chiral symmetry principle. Whether the pion-quark interaction can be regarded as a perturbation is intimately related to the chiral symmetry breaking radius.

  7. Distributed simulation a model driven engineering approach

    CERN Document Server

    Topçu, Okan; Oğuztüzün, Halit; Yilmaz, Levent

    2016-01-01

    Backed by substantive case studies, the novel approach to software engineering for distributed simulation outlined in this text demonstrates the potent synergies between model-driven techniques, simulation, intelligent agents, and computer systems development.

  8. Multilevel Molecular Modeling Approach for a Rational Design of Ionic Current Sensors for Nanofluidics.

    Science.gov (United States)

    Kirch, Alexsandro; de Almeida, James M; Miranda, Caetano R

    2018-05-10

    The complexity displayed by nanofluidic-based systems involves electronic and dynamic aspects occurring across different size and time scales. To properly model this kind of system, we introduced a top-down multilevel approach, combining molecular dynamics (MD) simulations with first-principles electronic transport calculations. The potential of this technique was demonstrated by investigating how water and ionic flow through a (6,6) carbon nanotube (CNT) influences its electronic transport properties. We showed that confinement in the CNT favors charge exchange between the partially hydrated Na, Cl, and Li ions and the nanotube. This leads to a change in the electronic transmittance, allowing cations to be distinguished from anions. Such an ionic trace may provide an indirect measurement of the ionic current, recorded as a sensing output. With this case study, we demonstrate the potential of this top-down multilevel approach for the design of novel nanofluidic devices.

  9. Learning the Task Management Space of an Aircraft Approach Model

    Science.gov (United States)

    Krall, Joseph; Menzies, Tim; Davies, Misty

    2014-01-01

    Validating models of airspace operations is a particular challenge. These models are often aimed at finding and exploring safety violations, and aim to be accurate representations of real-world behavior. However, the rules governing the behavior are quite complex: nonlinear physics, operational modes, human behavior, and stochastic environmental concerns all determine the responses of the system. In this paper, we present a study on aircraft runway approaches as modeled in Georgia Tech's Work Models that Compute (WMC) simulation. We use a new learner, Genetic-Active Learning for Search-Based Software Engineering (GALE) to discover the Pareto frontiers defined by cognitive structures. These cognitive structures organize the prioritization and assignment of tasks of each pilot during approaches. We discuss the benefits of our approach, and also discuss future work necessary to enable uncertainty quantification.

  10. Gray-box modelling approach for description of storage tunnel

    DEFF Research Database (Denmark)

    Harremoës, Poul; Carstensen, Jacob

    1999-01-01

    The dynamics of a storage tunnel is examined using a model based on on-line measured data and a combination of simple deterministic and black-box stochastic elements. This approach, called gray-box modeling, is a promising new methodology for giving an on-line state description of sewer systems... of the water in the overflow structures. The capacity of a pump draining the storage tunnel is estimated for two different rain events, revealing that the pump was malfunctioning during the first rain event. The proposed modeling approach can be used in automated online surveillance and control and implemented...

  11. A study of multidimensional modeling approaches for data warehouse

    Science.gov (United States)

    Yusof, Sharmila Mat; Sidi, Fatimah; Ibrahim, Hamidah; Affendey, Lilly Suriani

    2016-08-01

    A data warehouse system is used to support the process of organizational decision making. Hence, the system must extract and integrate information from heterogeneous data sources in order to uncover relevant knowledge suitable for the decision making process. However, the development of a data warehouse is a difficult and complex process, especially in its conceptual design (multidimensional modeling). Thus, various approaches have been proposed to overcome the difficulty. This study surveys and compares approaches to multidimensional modeling and highlights the issues, trends and solutions proposed to date. The contribution is an overview of the state of the art in multidimensional modeling design.

  12. Modelling efficient innovative work: integration of economic and social psychological approaches

    Directory of Open Access Journals (Sweden)

    Babanova Yulia

    2017-01-01

    The article deals with the relevance of integrating economic and social psychological approaches to enhancing the efficiency of innovation management. The content, features and specifics of the modelling methods within each approach are described, and options for integration are considered. The economic approach consists in the generation of an integrated matrix concept for managing the innovative development of an enterprise in line with the stages of innovative work, and the use of an integrated vector method for evaluating the level of innovative enterprise development. The social psychological approach consists in the development of a system of psychodiagnostic indexes of activity resources within the scope of a psychological innovation audit of enterprise management, and the development of modelling methods for the balance of activity trends. Modelling of the activity resources is based on a system of equations accounting for the interaction type of the psychodiagnostic indexes. Integration of the two approaches includes a methodological level, a level of empirical studies, and modelling methods. Options are suggested for integrating the economic and psychological approaches to analyse the available material and non-material resources of enterprises' innovative work and to forecast an optimal development option based on the implemented modelling methods.

  13. Present developments in reaching an international consensus for a model-based approach to particle beam therapy.

    Science.gov (United States)

    Prayongrat, Anussara; Umegaki, Kikuo; van der Schaaf, Arjen; Koong, Albert C; Lin, Steven H; Whitaker, Thomas; McNutt, Todd; Matsufuji, Naruhiro; Graves, Edward; Mizuta, Masahiko; Ogawa, Kazuhiko; Date, Hiroyuki; Moriwaki, Kensuke; Ito, Yoichi M; Kobashi, Keiji; Dekura, Yasuhiro; Shimizu, Shinichi; Shirato, Hiroki

    2018-03-01

    Particle beam therapy (PBT), including proton and carbon ion therapy, is an emerging innovative treatment for cancer patients. Due to the high cost of and limited access to treatment, meticulous selection of patients who would benefit most from PBT, when compared with standard X-ray therapy (XRT), is necessary. Due to the cost and labor involved in randomized controlled trials, the model-based approach (MBA) is used as an alternative means of establishing scientific evidence in medicine, and it can be improved continuously. Good databases and reasonable models are crucial for the reliability of this approach. The tumor control probability and normal tissue complication probability models are good illustrations of the advantages of PBT, but pre-existing NTCP models have been derived from historical patient treatments from the XRT era. This highlights the necessity of prospectively analyzing specific treatment-related toxicities in order to develop PBT-compatible models. An international consensus has been reached at the Global Institution for Collaborative Research and Education (GI-CoRE) joint symposium, concluding that a systematically developed model is required for model accuracy and performance. Six important steps that need to be observed in these considerations include patient selection, treatment planning, beam delivery, dose verification, response assessment, and data analysis. Advanced technologies in radiotherapy and computer science can be integrated to improve the efficacy of a treatment. Model validation and appropriately defined thresholds in a cost-effectiveness centered manner, together with quality assurance in the treatment planning, have to be achieved prior to clinical implementation.

  14. A finite volume alternate direction implicit approach to modeling selective laser melting

    DEFF Research Database (Denmark)

    Hattel, Jesper Henri; Mohanty, Sankhya

    2013-01-01

    Over the last decade, several studies have attempted to develop thermal models for analyzing the selective laser melting process with a vision to predict thermal stresses, microstructures and resulting mechanical properties of manufactured products. While a holistic model addressing all involved... to accurately simulate the process, are constrained by either the size or scale of the model domain. A second challenging aspect involves the inclusion of non-linear material behavior into the 3D implicit FE models. An alternating direction implicit (ADI) method based on a finite volume (FV) formulation is proposed for modeling single-layer and few-layers selective laser melting processes. The ADI technique is implemented and applied for two cases involving constant material properties and non-linear material behavior. The ADI FV method consumes less time while having comparable accuracy with respect to 3D...
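
    A minimal Peaceman-Rachford ADI sketch for 2-D transient heat diffusion on a uniform grid is given below as a highly simplified stand-in for the ADI finite-volume scheme discussed above; the grid size, time step, diffusivity and the Gaussian hot-spot initial field are assumptions, not values from the study.

        import numpy as np
        from scipy.linalg import solve_banded

        n, h, dt, alpha, steps = 64, 1e-4, 1e-4, 5e-6, 200  # grid points, spacing [m], time step [s], diffusivity [m^2/s]
        r = alpha * dt / (2.0 * h * h)

        # Banded form of (I - r*delta_xx) for interior nodes (Dirichlet boundaries held at 0).
        ab = np.zeros((3, n - 2))
        ab[0, 1:] = -r            # super-diagonal
        ab[1, :] = 1.0 + 2.0 * r  # main diagonal
        ab[2, :-1] = -r           # sub-diagonal

        def half_explicit(T, axis):
            """r times the second difference along `axis` (zero on boundary rows/cols)."""
            d2 = np.zeros_like(T)
            if axis == 0:
                d2[1:-1, :] = T[2:, :] - 2.0 * T[1:-1, :] + T[:-2, :]
            else:
                d2[:, 1:-1] = T[:, 2:] - 2.0 * T[:, 1:-1] + T[:, :-2]
            return r * d2

        x = np.linspace(0.0, (n - 1) * h, n)
        X, Y = np.meshgrid(x, x, indexing="ij")
        T = 1500.0 * np.exp(-((X - x.mean()) ** 2 + (Y - x.mean()) ** 2) / (10 * h) ** 2)  # hot spot
        T[0, :] = T[-1, :] = T[:, 0] = T[:, -1] = 0.0   # Dirichlet boundary

        for _ in range(steps):
            # First half step: implicit in x, explicit in y.
            rhs = T + half_explicit(T, axis=1)
            T_star = T.copy()
            for j in range(1, n - 1):
                T_star[1:-1, j] = solve_banded((1, 1), ab, rhs[1:-1, j])
            # Second half step: implicit in y, explicit in x.
            rhs = T_star + half_explicit(T_star, axis=0)
            for i in range(1, n - 1):
                T[i, 1:-1] = solve_banded((1, 1), ab, rhs[i, 1:-1])

        print(f"peak temperature after {steps} steps: {T.max():.1f}")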

  15. Intelligent Transportation and Evacuation Planning A Modeling-Based Approach

    CERN Document Server

    Naser, Arab

    2012-01-01

    Intelligent Transportation and Evacuation Planning: A Modeling-Based Approach provides a new paradigm for evacuation planning strategies and techniques. Recently, evacuation planning and modeling have increasingly attracted interest among researchers as well as government officials. This interest stems from the recent catastrophic hurricanes and weather-related events that occurred in the southeastern United States (Hurricane Katrina and Rita). The evacuation methods that were in place before and during the hurricanes did not work well and resulted in thousands of deaths. This book offers insights into the methods and techniques that allow for implementing mathematical-based, simulation-based, and integrated optimization and simulation-based engineering approaches for evacuation planning. This book also: Comprehensively discusses the application of mathematical models for evacuation and intelligent transportation modeling Covers advanced methodologies in evacuation modeling and planning Discusses principles a...

  16. Benchmarking novel approaches for modelling species range dynamics.

    Science.gov (United States)

    Zurell, Damaris; Thuiller, Wilfried; Pagel, Jörn; Cabral, Juliano S; Münkemüller, Tamara; Gravel, Dominique; Dullinger, Stefan; Normand, Signe; Schiffers, Katja H; Moore, Kara A; Zimmermann, Niklaus E

    2016-08-01

    Increasing biodiversity loss due to climate change is one of the most vital challenges of the 21st century. To anticipate and mitigate biodiversity loss, models are needed that reliably project species' range dynamics and extinction risks. Recently, several new approaches to model range dynamics have been developed to supplement correlative species distribution models (SDMs), but applications clearly lag behind model development. Indeed, no comparative analysis has been performed to evaluate their performance. Here, we build on process-based, simulated data for benchmarking five range (dynamic) models of varying complexity including classical SDMs, SDMs coupled with simple dispersal or more complex population dynamic models (SDM hybrids), and a hierarchical Bayesian process-based dynamic range model (DRM). We specifically test the effects of demographic and community processes on model predictive performance. Under current climate, DRMs performed best, although only marginally. Under climate change, predictive performance varied considerably, with no clear winners. Yet, all range dynamic models improved predictions under climate change substantially compared to purely correlative SDMs, and the population dynamic models also predicted reasonable extinction risks for most scenarios. When benchmarking data were simulated with more complex demographic and community processes, simple SDM hybrids including only dispersal often proved most reliable. Finally, we found that structural decisions during model building can have great impact on model accuracy, but prior system knowledge on important processes can reduce these uncertainties considerably. Our results reassure the clear merit in using dynamic approaches for modelling species' response to climate change but also emphasize several needs for further model and data improvement. We propose and discuss perspectives for improving range projections through combination of multiple models and for making these approaches

  17. Numerical schemes for the hybrid modeling approach of gas-particle turbulent flows

    International Nuclear Information System (INIS)

    Dorogan, K.

    2012-01-01

    Hybrid Moments/PDF methods have been shown to be well suited for the description of poly-dispersed turbulent two-phase flows in non-equilibrium which are encountered in some industrial situations involving chemical reactions, combustion or sprays. They allow a fine enough physical description of the poly-dispersity, non-linear source terms and convection phenomena to be obtained. However, their approximations are affected by statistical error, which in several situations may be a source of bias. An alternative hybrid Moments-Moments/PDF approach examined in this work consists in coupling the Moments and the PDF descriptions within the description of the dispersed phase itself. This hybrid method could reduce the statistical error and remove the bias. However, such a coupling is not straightforward in practice and requires the development of accurate and stable numerical schemes. The approaches introduced in this work rely on the combined use of up-winding and relaxation-type techniques. They allow stable unsteady approximations to be obtained for a system of partial differential equations containing non-smooth external data which are provided by the PDF part of the model. A comparison of the results obtained using the present method with those of the 'classical' hybrid approach is presented in terms of the numerical errors for a case of a co-current gas-particle wall jet.

  18. City evacuations an interdisciplinary approach

    CERN Document Server

    Binner, Jane; Branicki, Layla; Galla, Tobias; Jones, Nick; King, James; Kolokitha, Magdalini; Smyrnakis, Michalis

    2015-01-01

    Evacuating a city is a complex problem that involves issues of governance, preparedness education, warning, information sharing, population dynamics, resilience and recovery. As natural and anthropogenic threats to cities grow, it is an increasingly pressing problem for policy makers and practitioners. The book is the result of a unique interdisciplinary collaboration between researchers in the physical and social sciences to consider how an interdisciplinary approach can help plan for large-scale evacuations. It draws on perspectives from physics, mathematics, organisation theory, economics, sociology and education. Importantly, it goes beyond disciplinary boundaries and considers how interdisciplinary methods are necessary to approach a complex problem involving human actors and increasingly complex communications and transportation infrastructures. Using real-world case studies and modelling, the book considers new approaches to evacuation dynamics. It addresses questions of complexity, not only ...

  19. Models and impact of patient and public involvement in studies carried out by the Medical Research Council Clinical Trials Unit at University College London: findings from ten case studies.

    Science.gov (United States)

    South, Annabelle; Hanley, Bec; Gafos, Mitzy; Cromarty, Ben; Stephens, Richard; Sturgeon, Kate; Scott, Karen; Cragg, William J; Tweed, Conor D; Teera, Jacqueline; Vale, Claire L

    2016-07-29

    Patient and public involvement (PPI) in studies carried out by the UK Medical Research Council Clinical Trials Unit (MRC CTU) at University College London varies by research type and setting. We developed a series of case studies of PPI to document and share good practice. We used purposive sampling to identify studies representing the scope of research at the MRC CTU and different approaches to PPI. We carried out semi-structured interviews with staff and patient representatives. Interview notes were analysed descriptively to categorise the main aims and motivations for involvement; activities undertaken; their impact on the studies and lessons learned. We conducted 19 interviews about ten case studies, comprising one systematic review, one observational study and 8 randomised controlled trials in HIV and cancer. Studies were either open or completed, with start dates between 2003 and 2011. Interviews took place between March and November 2014 and were updated in summer 2015 where there had been significant developments in the study (i.e. if the study had presented results subsequent to the interview taking place). A wide range of PPI models, including representation on trial committees or management groups, community engagement, one-off task-focused activities, patient research partners and participant involvement had been used. Overall, interviewees felt that PPI had a positive impact, leading to improvements, for example in the research question; study design; communication with potential participants; study recruitment; confidence to carry out or complete a study; interpretation and communication of results; and influence on future research. A range of models of PPI can benefit clinical studies. Researchers should consider different approaches to PPI, based on the desired impact and the people they want to involve. Use of multiple models may increase the potential impacts of PPI in clinical research.

  20. Quasirelativistic quark model in quasipotential approach

    CERN Document Server

    Matveev, V A; Savrin, V I; Sissakian, A N

    2002-01-01

    The interaction of relativistic particles is described within the framework of the quasipotential approach. The presentation is based on the so-called covariant simultaneous formulation of quantum field theory, whereby the theory is considered on a space-like three-dimensional hypersurface in Minkowski space. Special attention is paid to the methods of constructing various quasipotentials as well as to the applications of the quasipotential approach to describing the characteristics of relativistic particle interactions in quark models, namely: the elastic scattering amplitudes of hadrons, the mass spectra and widths of meson decays, and the cross sections of deep inelastic lepton scattering on hadrons

  1. A model-data based systems approach to process intensification

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    . Their developments, however, are largely due to experiment-based trial-and-error approaches and while they do not require validation, they can be time-consuming and resource intensive. Also, one may ask, can a truly new intensified unit operation be obtained in this way? An alternative two-stage approach is to apply...... a model-based synthesis method to systematically generate and evaluate alternatives in the first stage and an experiment-model based validation in the second stage. In this way, the search for alternatives is done very quickly, reliably and systematically over a wide range, while resources are preserved...... for focused validation of only the promising candidates in the second stage. This approach, however, would be limited to intensification based on “known” unit operations, unless the PI process synthesis/design is considered at a lower level of aggregation, namely the phenomena level. That is, the model-based...

  2. Modeling vapor pressures of solvent systems with and without a salt effect: An extension of the LSER approach

    International Nuclear Information System (INIS)

    Senol, Aynur

    2015-01-01

    Highlights: • A new polynomial vapor pressure approach for pure solvents is presented. • Solvation models reproduce the vapor pressure data within a 4% mean error. • A concentration-basis vapor pressure model is also implemented on relevant systems. • The reliability of existing models was analyzed using a log-ratio objective function. - Abstract: A new polynomial vapor pressure approach for pure solvents is presented. The model is incorporated into the LSER (linear solvation energy relation) based solvation model framework and checked for consistency in reproducing experimental vapor pressures of salt-containing solvent systems. The two developed structural forms of the generalized solvation model (Senol, 2013) provide a relatively accurate description of the salting effect on vapor pressure of (solvent + salt) systems. The equilibrium data spanning vapor pressures of eighteen (solvent + salt) and three (solvent (1) + solvent (2) + salt) systems have been used to establish the basis for the model reliability analysis using a log-ratio objective function. The examined vapor pressure relations reproduce the observed data relatively accurately, yielding overall design factors of 1.084, 1.091 and 1.052 for the integrated property-basis solvation model (USMIP), reduced property-basis solvation model and concentration-dependent model, respectively. Both the integrated property-basis and reduced property-basis solvation models were able to satisfactorily simulate the vapor pressure data of a binary solvent mixture involving a salt, yielding an overall mean error of 5.2%

  3. A novel approach of modeling continuous dark hydrogen fermentation.

    Science.gov (United States)

    Alexandropoulou, Maria; Antonopoulou, Georgia; Lyberatos, Gerasimos

    2018-02-01

    In this study a novel modeling approach for describing fermentative hydrogen production in a continuous stirred tank reactor (CSTR) was developed, using the Aquasim modeling platform. This model accounts for the key metabolic reactions taking place in a fermentative hydrogen producing reactor, using fixed stoichiometry but different reaction rates. Biomass yields are determined based on bioenergetics. The model is capable of describing very well the variation in the distribution of metabolic products for a wide range of hydraulic retention times (HRT). The modeling approach is demonstrated using the experimental data obtained from a CSTR, fed with food industry waste (FIW), operating at different HRTs. The kinetic parameters were estimated through fitting to the experimental results. Hydrogen and total biogas production rates were predicted very well by the model, validating the basic assumptions regarding the implicated stoichiometric biochemical reactions and their kinetic rates. Copyright © 2017 Elsevier Ltd. All rights reserved.
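
    As a rough illustration of the kind of kinetic description used in such CSTR models, the sketch below integrates a single Monod-type growth reaction with a fixed hydrogen yield; the parameter values, the single-substrate stoichiometry and the yield coefficients are illustrative assumptions, not those of the Aquasim model in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy chemostat sketch: Monod growth on one substrate with a fixed hydrogen yield.
mu_max, Ks, Y_xs, Y_h2 = 0.3, 2.0, 0.1, 1.2   # 1/h, g/L, g biomass per g substrate, L H2 per g substrate
S_in = 20.0                                    # feed substrate concentration, g/L

def cstr(t, y, D):
    X, S = y
    mu = mu_max * S / (Ks + S)                 # Monod kinetics
    dX = (mu - D) * X
    dS = D * (S_in - S) - mu * X / Y_xs
    return [dX, dS]

D = 1.0 / 12.0                                 # dilution rate for an assumed 12 h HRT
sol = solve_ivp(cstr, (0.0, 200.0), [0.1, S_in], args=(D,))
X_ss, S_ss = sol.y[:, -1]
h2_rate = Y_h2 * D * (S_in - S_ss)             # steady-state H2 production rate, L per L reactor per h
print(X_ss, S_ss, h2_rate)
```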

  4. An interdisciplinary approach for earthquake modelling and forecasting

    Science.gov (United States)

    Han, P.; Zhuang, J.; Hattori, K.; Ogata, Y.

    2016-12-01

    Earthquakes are among the most serious disasters and may cause heavy casualties and economic losses. Especially in the past two decades, huge/mega earthquakes have hit many countries. Effective earthquake forecasting (including time, location, and magnitude) becomes extremely important and urgent. To date, various heuristically derived algorithms have been developed for forecasting earthquakes. Generally, they can be classified into two types: catalog-based approaches and non-catalog-based approaches. Thanks to the rapid development of statistical seismology in the past 30 years, we are now able to evaluate the performances of these earthquake forecast approaches quantitatively. Although a certain amount of precursory information is available in both earthquake catalogs and non-catalog observations, earthquake forecasting is still far from satisfactory. In most cases, the precursory phenomena have been studied individually. An earthquake model that combines self-exciting and mutually exciting elements was developed by Ogata and Utsu from the Hawkes process. The core idea of this combined model is that the status of the event at present is controlled by the event itself (self-exciting) and all the external factors (mutually exciting) in the past. In essence, the conditional intensity function is a time-varying Poisson process with rate λ(t), which is composed of the background rate, the self-exciting term (the information from past seismic events), and the external excitation term (the information from past non-seismic observations). This model shows us a way to integrate the catalog-based forecast and non-catalog-based forecast. Against this background, we are trying to develop a new earthquake forecast model which combines catalog-based and non-catalog-based approaches.
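
    A minimal sketch of such a combined conditional intensity is given below: a constant background rate plus exponentially decaying contributions from past earthquakes (self-exciting, catalog-based) and from past non-seismic precursory observations (mutually exciting, non-catalog-based). The exponential kernels and all parameter values are illustrative assumptions, not the specification of the Ogata-Utsu model.

```python
import numpy as np

def intensity(t, quake_times, covariate_times,
              mu=0.1, alpha=0.5, beta=1.0, gamma=0.3, delta=0.5):
    """Toy conditional intensity lambda(t) = background + self-exciting + external excitation.

    quake_times     : times of past earthquakes (catalog-based term)
    covariate_times : times of past non-seismic precursory observations (non-catalog term)
    Exponential kernels and all parameter values are illustrative assumptions.
    """
    quake_times = np.asarray(quake_times)
    covariate_times = np.asarray(covariate_times)
    self_exciting = alpha * np.sum(np.exp(-beta * (t - quake_times[quake_times < t])))
    external = gamma * np.sum(np.exp(-delta * (t - covariate_times[covariate_times < t])))
    return mu + self_exciting + external

# Example: intensity shortly after two past events and one anomalous non-seismic observation
print(intensity(10.0, quake_times=[2.0, 9.5], covariate_times=[9.0]))
```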

  5. An explanatory model of maths achievement: Perceived parental involvement and academic motivation.

    Science.gov (United States)

    Rodríguez, Susana; Piñeiro, Isabel; Gómez-Taibo, Mª L; Regueiro, Bibiana; Estévez, Iris; Valle, Antonio

    2017-05-01

    Although numerous studies have tried to explain performance in maths, very few have deeply explored the relationship between different variables and how they jointly explain mathematical performance. With a sample of 897 students in 5th and 6th grade of Primary Education and using structural equation modeling (SEM), this study analyzes how the perception of parents' beliefs is related to children's beliefs, their involvement in mathematical tasks and their performance. Perceived parental involvement contributes to children's motivation in mathematics. Direct supervision of students' academic work by parents may increase parents' concerns about the image and rating of their children, but not their children's academic performance. In fact, maths achievement depends directly and positively on the parents' expectations and children's maths self-efficacy, and negatively on the parents' help with tasks and performance goal orientation. Perceived parental involvement contributes to children's motivation in maths essentially by conveying confidence in their abilities and showing interest in their progress and schoolwork.

  6. Constructing a justice model based on Sen's capability approach

    OpenAIRE

    Yüksel, Sevgi

    2008-01-01

    The thesis provides a possible justice model based on Sen's capability approach. For this goal, we first analyze the general structure of a theory of justice, identifying the main variables and issues. Furthermore, based on Sen (2006) and Kolm (1998), we look at 'transcendental' and 'comparative' approaches to justice and concentrate on the sufficiency condition for the comparative approach. Then, taking Rawls' theory of justice as a starting point, we present how Sen's capability approach em...

  7. Biotic interactions in the face of climate change: a comparison of three modelling approaches.

    Directory of Open Access Journals (Sweden)

    Anja Jaeschke

    Full Text Available Climate change is expected to alter biotic interactions, and may lead to temporal and spatial mismatches of interacting species. Although the importance of interactions for climate change risk assessments is increasingly acknowledged in observational and experimental studies, biotic interactions are still rarely incorporated in species distribution models. We assessed the potential impacts of climate change on the obligate interaction between Aeshna viridis and its egg-laying plant Stratiotes aloides in Europe, based on an ensemble modelling technique. We compared three different approaches for incorporating biotic interactions in distribution models: (1) We separately modelled each species based on climatic information, and intersected the future range overlap ('overlap approach'). (2) We modelled the potential future distribution of A. viridis with the projected occurrence probability of S. aloides as a further predictor in addition to climate ('explanatory variable approach'). (3) We calibrated the model of A. viridis in the current range of S. aloides and multiplied the future occurrence probabilities of both species ('reference area approach'). Subsequently, all approaches were compared to a single species model of A. viridis without interactions. All approaches projected a range expansion for A. viridis. Model performance on test data and amount of range gain differed depending on the biotic interaction approach. All interaction approaches yielded lower range gains (up to 667% lower) than the model without interaction. Regarding the contribution of algorithm and approach to the overall uncertainty, the main part of explained variation stems from the modelling algorithm, and only a small part is attributed to the modelling approach. The comparison of the no-interaction model with the three interaction approaches emphasizes the importance of including obligate biotic interactions in projective species distribution modelling. We recommend the use of
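
    The three ways of coupling the two species' projections can be sketched roughly as follows; the grid, occurrence probabilities and presence threshold are made-up placeholders, and the explanatory-variable approach is only indicated in a comment because it would require refitting the A. viridis model.

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical future occurrence probabilities on a common grid, values in [0, 1]
p_dragonfly = rng.random((10, 10))   # A. viridis, climate-only ensemble SDM output
p_plant     = rng.random((10, 10))   # S. aloides, climate-only ensemble SDM output
threshold = 0.5                      # illustrative presence threshold

# (1) Overlap approach: model each species separately and intersect the binary ranges
overlap = (p_dragonfly >= threshold) & (p_plant >= threshold)

# (2) Explanatory-variable approach: p_plant would be added as a further predictor when
#     refitting the A. viridis model; the refit itself is beyond this sketch.

# (3) Reference-area approach: calibrate the A. viridis model within the S. aloides range,
#     then multiply the projected occurrence probabilities of both species
p_joint = p_dragonfly * p_plant

print(overlap.sum(), p_joint.mean())
```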

  8. Top-down approach to unified supergravity models

    International Nuclear Information System (INIS)

    Hempfling, R.

    1994-03-01

    We introduce a new approach for studying unified supergravity models. In this approach all the parameters of the grand unified theory (GUT) are fixed by imposing the corresponding number of low energy observables. This determines the remaining particle spectrum whose dependence on the low energy observables can now be investigated. We also include some SUSY threshold corrections that have previously been neglected. In particular the SUSY threshold corrections to the fermion masses can have a significant impact on the Yukawa coupling unification. (orig.)

  9. A robust Bayesian approach to modeling epistemic uncertainty in common-cause failure models

    International Nuclear Information System (INIS)

    Troffaes, Matthias C.M.; Walter, Gero; Kelly, Dana

    2014-01-01

    In a standard Bayesian approach to the alpha-factor model for common-cause failure, a precise Dirichlet prior distribution models epistemic uncertainty in the alpha-factors. This Dirichlet prior is then updated with observed data to obtain a posterior distribution, which forms the basis for further inferences. In this paper, we adapt the imprecise Dirichlet model of Walley to represent epistemic uncertainty in the alpha-factors. In this approach, epistemic uncertainty is expressed more cautiously via lower and upper expectations for each alpha-factor, along with a learning parameter which determines how quickly the model learns from observed data. For this application, we focus on elicitation of the learning parameter, and find that values in the range of 1 to 10 seem reasonable. The approach is compared with Kelly and Atwood's minimally informative Dirichlet prior for the alpha-factor model, which incorporated precise mean values for the alpha-factors, but which was otherwise quite diffuse. Next, we explore the use of a set of Gamma priors to model epistemic uncertainty in the marginal failure rate, expressed via a lower and upper expectation for this rate, again along with a learning parameter. As zero counts are generally less of an issue here, we find that the choice of this learning parameter is less crucial. Finally, we demonstrate how both epistemic uncertainty models can be combined to arrive at lower and upper expectations for all common-cause failure rates. Thereby, we effectively provide a full sensitivity analysis of common-cause failure rates, properly reflecting epistemic uncertainty of the analyst on all levels of the common-cause failure model
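
    For a single update step, the imprecise Dirichlet idea sketched above amounts to propagating prior lower and upper expectations through the usual Dirichlet-multinomial update. The function below is a minimal sketch under that reading; the counts, prior bounds and the value of the learning parameter s are illustrative assumptions.

```python
def posterior_bounds(n_counts, lower_prior, upper_prior, s=2.0):
    """Posterior lower/upper expectations of alpha-factors under an imprecise Dirichlet model.

    n_counts    : observed counts of common-cause events by multiplicity, e.g. [n1, n2]
    lower_prior : prior lower expectations for each alpha-factor
    upper_prior : prior upper expectations for each alpha-factor
    s           : learning parameter (the paper reports roughly 1 to 10 as reasonable)
    For a prior expectation t, the posterior expectation is (s*t + n) / (s + N).
    """
    n_total = sum(n_counts)
    lower = [(s * lo + n) / (s + n_total) for lo, n in zip(lower_prior, n_counts)]
    upper = [(s * up + n) / (s + n_total) for up, n in zip(upper_prior, n_counts)]
    return lower, upper

# Illustrative two-component example with assumed prior bounds and counts
low, up = posterior_bounds([10, 2], lower_prior=[0.7, 0.05], upper_prior=[0.95, 0.3], s=5.0)
print(low, up)
```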

  10. Tetramer model of leukoemeraldine-emeraldine electrochemistry in the presence of trihalogenoacetic acids. DFT approach.

    Science.gov (United States)

    Barbosa, Nuno Almeida; Grzeszczuk, Maria; Wieczorek, Robert

    2015-01-15

    First results of the application of the DFT computational approach to the reversible electrochemistry of polyaniline are presented. A tetrameric chain was used as the simplest model of the polyaniline polymer species. The system under theoretical investigation involved six tetramer species, two electrons, and two protons, taking part in 14 elementary reactions. Moreover, the tetramer species were interacting with two trihalogenoacetic acid molecules. Trifluoroacetic, trichloroacetic, and tribromoacetic acids were found to impact the redox transformation of polyaniline as shown by cyclic voltammetry. The theoretical approach was considered as a powerful tool for investigating the main factors of importance for the experimental behavior. The DFT method provided molecular structures, interaction energies, and equilibrium energies of all of the tetramer-acid complexes. Differences between the energies of the isolated tetramer species and their complexes with acids are discussed in terms of the elementary reactions, that is, ionization potentials and electron affinities, equilibrium constants, electrode potentials, and reorganization energies. The DFT results indicate a high impact of the acid on the reorganization energy of a particular elementary electron-transfer reaction. The ECEC oxidation path was predicted by the calculations. The model of the reacting system must be extended to octamer species and/or dimeric oligomer species to better approximate the real polymer situation.

  11. Simplified Model for the Population Dynamics Involved in a Malaria Crisis

    International Nuclear Information System (INIS)

    Kenfack-Jiotsa, A.; Fotsa-Ngaffo, F.

    2009-12-01

    We adapt a simple predator-prey model to the populations involved in a malaria crisis. The study considers only the bloodstream inside the human body, excluding the liver. In particular, we look at the dynamics of the malaria parasites ('merozoites') and their interaction with the blood components, more specifically the red blood cells (RBC) and the immune response grouped under the white blood cells (WBC). The stability analysis of the system reveals an important practical direction to investigate regarding the WBC-to-RBC ratio, since it is a fundamental parameter that characterizes the stable regions. The model numerically presents a wide range of possible features of the disease. Even in its simplified form, the model not only recovers well-known results but also predicts possible hidden phenomena and an interesting clinical feature of a malaria crisis. (author)

  12. Unraveling the Mechanisms of Manual Therapy: Modeling an Approach.

    Science.gov (United States)

    Bialosky, Joel E; Beneciuk, Jason M; Bishop, Mark D; Coronado, Rogelio A; Penza, Charles W; Simon, Corey B; George, Steven Z

    2018-01-01

    Synopsis Manual therapy interventions are popular among individual health care providers and their patients; however, systematic reviews do not strongly support their effectiveness. Small treatment effect sizes of manual therapy interventions may result from a "one-size-fits-all" approach to treatment. Mechanistic-based treatment approaches to manual therapy offer an intriguing alternative for identifying patients likely to respond to manual therapy. However, the current lack of knowledge of the mechanisms through which manual therapy interventions inhibit pain limits such an approach. The nature of manual therapy interventions further confounds such an approach, as the related mechanisms are likely a complex interaction of factors related to the patient, the provider, and the environment in which the intervention occurs. Therefore, a model to guide both study design and the interpretation of findings is necessary. We have previously proposed a model suggesting that the mechanical force from a manual therapy intervention results in systemic neurophysiological responses leading to pain inhibition. In this clinical commentary, we provide a narrative appraisal of the model and recommendations to advance the study of manual therapy mechanisms. J Orthop Sports Phys Ther 2018;48(1):8-18. doi:10.2519/jospt.2018.7476.

  13. Hydrological models are mediating models

    Science.gov (United States)

    Babel, L. V.; Karssenberg, D.

    2013-08-01

    Despite the increasing role of models in hydrological research and decision-making processes, only a few accounts of the nature and function of models exist in hydrology. Earlier considerations have traditionally been conducted while making a clear distinction between physically-based and conceptual models. A new philosophical account, primarily based on the fields of physics and economics, transcends classes of models and scientific disciplines by considering models as "mediators" between theory and observations. The core of this approach lies in identifying models as (1) being only partially dependent on theory and observations, (2) integrating non-deductive elements in their construction, and (3) carrying the role of instruments of scientific enquiry about both theory and the world. The applicability of this approach to hydrology is evaluated in the present article. Three widely used hydrological models, each showing a different degree of apparent physicality, are confronted with the main characteristics of the "mediating models" concept. We argue that irrespective of their kind, hydrological models depend on both theory and observations, rather than merely on one of these two domains. Their construction additionally involves a large number of miscellaneous external ingredients, such as past experiences, model objectives, knowledge and preferences of the modeller, as well as hardware and software resources. We show that hydrological models convey the role of instruments in scientific practice by mediating between theory and the world. It results from these considerations that the traditional distinction between physically-based and conceptual models is necessarily too simplistic and refers at best to the stage at which theory and observations are steering model construction. The large variety of ingredients involved in model construction would deserve closer attention, as they are rarely explicitly presented in the peer-reviewed literature. We believe that devoting

  14. A Model-Driven Approach to e-Course Management

    Science.gov (United States)

    Savic, Goran; Segedinac, Milan; Milenkovic, Dušica; Hrin, Tamara; Segedinac, Mirjana

    2018-01-01

    This paper presents research on using a model-driven approach to the development and management of electronic courses. We propose a course management system which stores a course model represented as distinct machine-readable components containing domain knowledge of different course aspects. Based on this formally defined platform-independent…

  15. Modelling the Heat Consumption in District Heating Systems using a Grey-box approach

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg; Madsen, Henrik

    2006-01-01

    identification of an overall model structure followed by data-based modelling, whereby the details of the model are identified. This approach is sometimes called grey-box modelling, but the specific approach used here does not require states to be specified. Overall, the paper demonstrates the power of the grey-box approach. (c) 2005 Elsevier B.V. All rights reserved....

  16. Mars approach for global sensitivity analysis of differential equation models with applications to dynamics of influenza infection.

    Science.gov (United States)

    Lee, Yeonok; Wu, Hulin

    2012-01-01

    Differential equation models are widely used for the study of natural phenomena in many fields. The study usually involves unknown factors such as initial conditions and/or parameters. It is important to investigate the impact of unknown factors (parameters and initial conditions) on model outputs in order to better understand the system the model represents. Apportioning the uncertainty (variation) of output variables of a model according to the input factors is referred to as sensitivity analysis. In this paper, we focus on the global sensitivity analysis of ordinary differential equation (ODE) models over a time period using the multivariate adaptive regression spline (MARS) as a meta-model based on the concept of the variance of conditional expectation (VCE). We suggest evaluating the VCE analytically using the MARS model structure of univariate tensor-product functions, which is more computationally efficient. Our simulation studies show that the MARS model approach performs very well and helps to significantly reduce the computational cost. We present an application example of sensitivity analysis of ODE models for influenza infection to further illustrate the usefulness of the proposed method.
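
    The variance-of-conditional-expectation idea can be sketched with a toy ODE. Below, a cubic polynomial stands in for the MARS meta-model (fitting an actual MARS model would require an external package such as py-earth); the logistic ODE, sample size and parameter ranges are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)
n = 500
r = rng.uniform(0.5, 1.5, n)          # growth rate (uncertain input)
K = rng.uniform(50.0, 150.0, n)       # carrying capacity (uncertain input)

def model_output(r_i, K_i):
    """Final state of a toy logistic ODE, used as the scalar model output."""
    sol = solve_ivp(lambda t, x: r_i * x * (1 - x / K_i), (0.0, 5.0), [1.0])
    return sol.y[0, -1]

y = np.array([model_output(r_i, K_i) for r_i, K_i in zip(r, K)])

# Meta-model of E[Y | r]: a cubic polynomial stands in for the MARS spline basis
coef = np.polyfit(r, y, deg=3)
y_hat = np.polyval(coef, r)

# First-order sensitivity index of r: Var(E[Y | r]) / Var(Y)
S_r = np.var(y_hat) / np.var(y)
print(f"first-order sensitivity of r: {S_r:.2f}")
```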

  17. Patient involvement in hospital architecture

    DEFF Research Database (Denmark)

    Herriott, Richard

    2017-01-01

    the structure of the design process, identification and ranking of stakeholders, the methods of user-involvement and approaches to accessibility. The paper makes recommendations for a change of approach to user-participation in large-scale, long-duration projects. The paper adds new insight on an under...

  18. A comprehensive approach to dark matter studies: exploration of simplified top-philic models

    Energy Technology Data Exchange (ETDEWEB)

    Arina, Chiara; Backović, Mihailo [Centre for Cosmology, Particle Physics and Phenomenology (CP3),Université catholique de Louvain, Chemin du Cyclotron 2, B-1348 Louvain-la-Neuve (Belgium); Conte, Eric [Groupe de Recherche de Physique des Hautes Énergies (GRPHE), Université de Haute-Alsace,IUT Colmar, F-68008 Colmar Cedex (France); Fuks, Benjamin [Sorbonne Universités, UPMC University Paris 06, UMR 7589, LPTHE, F-75005, Paris (France); CNRS, UMR 7589, LPTHE, F-75005, Paris (France); Guo, Jun [State Key Laboratory of Theoretical Physics, Institute of Theoretical Physics,Chinese Academy of Sciences, Beijing 100190 (China); Institut Pluridisciplinaire Hubert Curien/Département Recherches Subatomiques,Université de Strasbourg/CNRS-IN2P3, F-67037 Strasbourg (France); Heisig, Jan [Institute for Theoretical Particle Physics and Cosmology, RWTH Aachen University,Sommerfeldstr. 16, D-52056 Aachen (Germany); Hespel, Benoît [Centre for Cosmology, Particle Physics and Phenomenology (CP3),Université catholique de Louvain, Chemin du Cyclotron 2, B-1348 Louvain-la-Neuve (Belgium); Krämer, Michael [Institute for Theoretical Particle Physics and Cosmology, RWTH Aachen University,Sommerfeldstr. 16, D-52056 Aachen (Germany); Maltoni, Fabio; Martini, Antony [Centre for Cosmology, Particle Physics and Phenomenology (CP3),Université catholique de Louvain, Chemin du Cyclotron 2, B-1348 Louvain-la-Neuve (Belgium); Mawatari, Kentarou [Laboratoire de Physique Subatomique et de Cosmologie, Université Grenoble-Alpes,CNRS/IN2P3, 53 Avenue des Martyrs, F-38026 Grenoble (France); Theoretische Natuurkunde and IIHE/ELEM, Vrije Universiteit Brussel andInternational Solvay Institutes, Pleinlaan 2, B-1050 Brussels (Belgium); Pellen, Mathieu [Universität Würzburg, Institut für Theoretische Physik und Astrophysik,Emil-Hilb-Weg 22, 97074 Würzburg (Germany); Vryonidou, Eleni [Centre for Cosmology, Particle Physics and Phenomenology (CP3),Université catholique de Louvain, Chemin du Cyclotron 2, B-1348 Louvain-la-Neuve (Belgium)

    2016-11-21

    Studies of dark matter lie at the interface of collider physics, astrophysics and cosmology. Constraining models featuring dark matter candidates entails the capability to provide accurate predictions for large sets of observables and compare them to a wide spectrum of data. We present a framework which, starting from a model Lagrangian, allows one to consistently and systematically make predictions, as well as to confront those predictions with a multitude of experimental results. As an application, we consider a class of simplified dark matter models where a scalar mediator couples only to the top quark and a fermionic dark sector (i.e. the simplified top-philic dark matter model). We study in detail the complementarity of relic density, direct/indirect detection and collider searches in constraining the multi-dimensional model parameter space, and efficiently identify regions where individual approaches to dark matter detection provide the most stringent bounds. In the context of collider studies of dark matter, we point out the complementarity of LHC searches in probing different regions of the model parameter space with final states involving top quarks, photons, jets and/or missing energy. Our study of dark matter production at the LHC goes beyond the tree-level approximation and we show examples of how higher-order corrections to dark matter production processes can affect the interpretation of the experimental results.

  19. Fracture network modelling: an integrated approach for realisation of complex fracture network geometries

    International Nuclear Information System (INIS)

    Srivastava, R.M.

    2007-01-01

    they consist of a family of equally likely renditions of fracture geometry, each one honouring the same surface and subsurface constraints. Such probabilistic models are well suited to studying issues involving risk assessment and quantification of uncertainty. This assists the exploration of geo-scientific uncertainty and how the inherent non-uniqueness of DCMs affects confidence in predictions of how the far-field geosphere affects overall safety of the proposed repository. The approach provides models that are systematic and traceable in the sense that all of the data, assumptions and parameter choices are clearly recorded and auditable. While subjective decisions are avoided, the various parameter choices still allow reasoned judgement from structural geology and geomechanics to constrain the model. By providing place-holders for such judgements, this approach moves this type of information from an undocumented constraint to a reviewable parameter choice. The technical consistency of these FNMs, their auditability and their visual and scientific realism all contribute to the presentation of geologic safety arguments that demonstrate good judgement, thereby increasing confidence in the entire modelling effort. (author)

  20. Modeling energy fluxes in heterogeneous landscapes employing a mosaic approach

    Science.gov (United States)

    Klein, Christian; Thieme, Christoph; Priesack, Eckart

    2015-04-01

    Recent studies show that uncertainties in regional and global climate and weather simulations are partly due to inadequate descriptions of the energy flux exchanges between the land surface and the atmosphere. One major shortcoming is the limitation of the grid-cell resolution, which in most models is recommended to be at least about 3x3 km² due to limitations in the model physics. To represent each individual grid cell, most models select one dominant soil type and one dominant land use type. This resolution, however, is often too coarse in regions where the spatial diversity of soil and land use types is high, e.g. in Central Europe. An elegant method to avoid the shortcoming of grid-cell resolution is the so-called mosaic approach. This approach is part of the recently developed ecosystem model framework Expert-N 5.0. The aim of this study was to analyze the impact of the characteristics of two managed fields, planted with winter wheat and potato, on the near-surface soil moisture and on the near-surface energy flux exchanges of the soil-plant-atmosphere interface. The simulated energy fluxes were compared with eddy flux tower measurements between the respective fields at the research farm Scheyern, North-West of Munich, Germany. To perform these simulations, we coupled the ecosystem model Expert-N 5.0 to an analytical footprint model. The coupled model system has the ability to calculate the mixing ratio of the surface energy fluxes at a given point within one grid cell (in this case at the flux tower between the two fields). This approach accounts for the differences of the two soil types, of land use managements, and of canopy properties due to footprint size dynamics. Our preliminary simulation results show that a mosaic approach can improve the modeling and analysis of energy fluxes when the land surface is heterogeneous. In this case, our applied method is a promising approach for extending weather and climate models on the regional and global scales.
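
    The core of the mosaic idea, mixing per-tile fluxes with footprint-derived weights at the tower location, can be sketched as follows; the tile fluxes and weights are illustrative placeholders, not output of Expert-N 5.0 or of the footprint model used in the study.

```python
# Toy mosaic aggregation: each grid cell is split into tiles (soil/land-use patches),
# fluxes are simulated per tile and mixed with footprint-derived weights.
tiles = {
    "winter_wheat": {"latent_heat": 180.0, "sensible_heat": 60.0},   # W m-2 (illustrative)
    "potato":       {"latent_heat": 140.0, "sensible_heat": 90.0},   # W m-2 (illustrative)
}
footprint_weights = {"winter_wheat": 0.65, "potato": 0.35}  # assumed contribution at the flux tower

def mosaic_flux(variable):
    """Footprint-weighted mixing ratio of a surface energy flux at the tower location."""
    return sum(footprint_weights[name] * fluxes[variable] for name, fluxes in tiles.items())

print(mosaic_flux("latent_heat"), mosaic_flux("sensible_heat"))
```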

  1. Using the trans-lamina terminalis route via a pterional approach to resect a retrochiasmatic craniopharyngioma involving the third ventricle.

    Science.gov (United States)

    Weil, Alexander G; Robert, Thomas; Alsaiari, Sultan; Obaid, Sami; Bojanowski, Michel W

    2016-01-01

    Retrochiasmatic craniopharyngiomas involving the anterior third ventricle are challenging to access. Although the pterional approach is a common route for suprasellar lesions, when the craniopharyngioma extends behind the chiasma into the third ventricle, access is even more difficult, and the lamina terminalis may offer a good working window. The translamina terminalis approach provides direct access to the retrochiasmatic portion of the tumor with minimal brain retraction and no manipulation of the visual nerves. In this video, we emphasize the utility of using the lamina terminalis corridor to resect the retrochiasmatic intraventricular portion of a craniopharyngioma. The video can be found here: https://youtu.be/hrLNC0hDKe4 .

  2. Designing water demand management schemes using a socio-technical modelling approach.

    Science.gov (United States)

    Baki, Sotiria; Rozos, Evangelos; Makropoulos, Christos

    2018-05-01

    Although it is now widely acknowledged that urban water systems (UWSs) are complex socio-technical systems and that a shift towards a socio-technical approach is critical in achieving sustainable urban water management, still, more often than not, UWSs are designed using a segmented modelling approach. As such, either the analysis focuses on the description of the purely technical sub-system, without explicitly taking into account the system's dynamic socio-economic processes, or a more interdisciplinary approach is followed, but delivered through relatively coarse models, which often fail to provide a thorough representation of the urban water cycle and hence cannot deliver accurate estimations of the hydrosystem's responses. In this work we propose an integrated modelling approach for the study of the complete socio-technical UWS that also takes into account socio-economic and climatic variability. We have developed an integrated model, which is used to investigate the diffusion of household water conservation technologies and its effects on the UWS, under different socio-economic and climatic scenarios. The integrated model is formed by coupling a System Dynamics model that simulates the water technology adoption process, and the Urban Water Optioneering Tool (UWOT) for the detailed simulation of the urban water cycle. The model and approach are tested and demonstrated in an urban redevelopment area in Athens, Greece under different socio-economic scenarios and policy interventions. It is suggested that the proposed approach can establish quantifiable links between socio-economic change and UWS responses and therefore assist decision makers in designing more effective and resilient long-term strategies for water conservation. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Analysis of enamel development using murine model systems: approaches and limitations.

    Directory of Open Access Journals (Sweden)

    Megan K Pugach

    2014-09-01

    Full Text Available A primary goal of enamel research is to understand and potentially treat or prevent enamel defects related to amelogenesis imperfecta (AI). Rodents are ideal models to assist our understanding of how enamel is formed because they are easily genetically modified, and their continuously erupting incisors display all stages of enamel development and mineralization. While numerous methods have been developed to generate and analyze genetically modified rodent enamel, it is crucial to understand the limitations and challenges associated with these methods in order to draw appropriate conclusions that can be applied translationally, to AI patient care. We have highlighted methods involved in generating and analyzing rodent enamel and potential approaches to overcoming limitations of these methods: (1) generating transgenic, knockout and knockin mouse models, and (2) analyzing rodent enamel mineral density and functional properties (structure, mechanics) of mature enamel. There is a need for a standardized workflow to analyze enamel phenotypes in rodent models so that investigators can compare data from different studies. These methods include analyses of gene and protein expression, developing enamel histology, enamel pigment, degree of mineralization, enamel structure and mechanical properties. Standardization of these methods with regard to stage of enamel development and sample preparation is crucial, and ideally investigators can use correlative and complementary techniques with the understanding that developing mouse enamel is dynamic and complex.

  4. Who Involves Whom?

    Science.gov (United States)

    Ward, Clifford

    1979-01-01

    The author reviews the development of a parents' group at the Bradford Grange School (Manchester, United Kingdom) for ESN (educationally subnormal) children. Problems with the initial parents' group are pointed out, successful approaches are considered, and the importance of parent involvement is stressed. (SBH)

  5. Modelling the dynamics of traits involved in fighting-predators-prey system.

    Science.gov (United States)

    Kooi, B W

    2015-12-01

    We study the dynamics of a predator-prey system where predators fight for captured prey in addition to searching for, handling and digesting the prey. Fighting for prey is modelled by continuous-time hawk-dove game dynamics, where the gain depends on the amount of disputed prey while the cost of fighting is constant per fighting event. The strategy of the predator population is quantified by a trait, namely the proportion of predator individuals playing hawk tactics. The dynamics of the trait is described by two models of adaptation: replicator dynamics (RD) and adaptive dynamics (AD). In the RD approach, a variant individual with an adapted trait value changes the population's strategy, and consequently its trait value, only when its payoff is larger than the population average. In the AD approach, successful replacement of the resident population after invasion by a rare variant population with an adapted trait value is one step in a sequence changing the population's strategy, and hence its trait value. The main aim is to compare the consequences of the two adaptation models. In an equilibrium predator-prey system this leads to convergence to a neutral singular strategy, while in the oscillatory system it leads to a continuous singular strategy in whose endpoint the resident population is not invasible by any variant population. In equilibrium (low prey carrying capacity) the RD and AD approaches give the same results; however, this is not always the case in a periodically oscillating system (high prey carrying capacity), where the trait is density-dependent. For low costs the predator population is monomorphic (only hawks), while for high costs it is dimorphic (hawks and doves). These results illustrate that intra-specific trait dynamics matters in predator-prey dynamics.
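
    For a fixed, externally given gain, the replicator dynamics of the hawk proportion reduces to the textbook hawk-dove equation, sketched below; the constant gain V and cost C are illustrative simplifications, since in the model above the gain depends on the amount of disputed prey and the trait is coupled to the predator-prey dynamics.

```python
import numpy as np
from scipy.integrate import solve_ivp

V, C = 2.0, 3.0   # illustrative gain from disputed prey and cost of a fighting event

def replicator(t, p):
    """Replicator dynamics dp/dt = p (1 - p) (W_hawk - W_dove) for the hawk proportion p."""
    p = p[0]
    w_hawk = p * (V - C) / 2 + (1 - p) * V      # expected payoff of a hawk
    w_dove = (1 - p) * V / 2                     # expected payoff of a dove
    return [p * (1 - p) * (w_hawk - w_dove)]

sol = solve_ivp(replicator, (0.0, 50.0), [0.1])
print(sol.y[0, -1])   # converges towards the mixed strategy p* = V / C when C > V
```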

  6. A Constructive Neural-Network Approach to Modeling Psychological Development

    Science.gov (United States)

    Shultz, Thomas R.

    2012-01-01

    This article reviews a particular computational modeling approach to the study of psychological development--that of constructive neural networks. This approach is applied to a variety of developmental domains and issues, including Piagetian tasks, shift learning, language acquisition, number comparison, habituation of visual attention, concept…

  7. Modular Modelling and Simulation Approach - Applied to Refrigeration Systems

    DEFF Research Database (Denmark)

    Sørensen, Kresten Kjær; Stoustrup, Jakob

    2008-01-01

    This paper presents an approach to modelling and simulation of the thermal dynamics of a refrigeration system, specifically a reefer container. A modular approach is used and the objective is to increase the speed and flexibility of the developed simulation environment. The refrigeration system...

  8. Comparison of two novel approaches to model fibre reinforced concrete

    NARCIS (Netherlands)

    Radtke, F.K.F.; Simone, A.; Sluys, L.J.

    2009-01-01

    We present two approaches to model fibre reinforced concrete. In both approaches, discrete fibre distributions and the behaviour of the fibre-matrix interface are explicitly considered. One approach employs the reaction forces from fibre to matrix while the other is based on the partition of unity

  9. Merits of a Scenario Approach in Dredge Plume Modelling

    DEFF Research Database (Denmark)

    Pedersen, Claus; Chu, Amy Ling Chu; Hjelmager Jensen, Jacob

    2011-01-01

    Dredge plume modelling is a key tool for quantification of potential impacts to inform the EIA process. There are, however, significant uncertainties associated with the modelling at the EIA stage when both dredging methodology and schedule are likely to be a guess at best as the dredging...... contractor would rarely have been appointed. Simulation of a few variations of an assumed full dredge period programme will generally not provide a good representation of the overall environmental risks associated with the programme. An alternative dredge plume modelling strategy that attempts to encapsulate...... uncertainties associated with preliminary dredging programmes by using a scenario-based modelling approach is presented. The approach establishes a set of representative and conservative scenarios for key factors controlling the spill and plume dispersion and simulates all combinations of e.g. dredge, climatic...

  10. An approach to multiscale modelling with graph grammars.

    Science.gov (United States)

    Ong, Yongzhi; Streit, Katarína; Henke, Michael; Kurth, Winfried

    2014-09-01

    Functional-structural plant models (FSPMs) simulate biological processes at different spatial scales. Methods exist for multiscale data representation and modification, but the advantages of using multiple scales in the dynamic aspects of FSPMs remain unclear. Results from multiscale models in various other areas of science that share fundamental modelling issues with FSPMs suggest that potential advantages do exist, and this study therefore aims to introduce an approach to multiscale modelling in FSPMs. A three-part graph data structure and grammar is revisited, and presented with a conceptual framework for multiscale modelling. The framework is used for identifying roles, categorizing and describing scale-to-scale interactions, thus allowing alternative approaches to model development as opposed to correlation-based modelling at a single scale. Reverse information flow (from macro- to micro-scale) is catered for in the framework. The methods are implemented within the programming language XL. Three example models are implemented using the proposed multiscale graph model and framework. The first illustrates the fundamental usage of the graph data structure and grammar, the second uses probabilistic modelling for organs at the fine scale in order to derive crown growth, and the third combines multiscale plant topology with ozone trends and metabolic network simulations in order to model juvenile beech stands under exposure to a toxic trace gas. The graph data structure supports data representation and grammar operations at multiple scales. The results demonstrate that multiscale modelling is a viable method in FSPM and an alternative to correlation-based modelling. Advantages and disadvantages of multiscale modelling are illustrated by comparisons with single-scale implementations, leading to motivations for further research in sensitivity analysis and run-time efficiency for these models.

  11. A fuzzy approach for modelling radionuclide in lake system

    International Nuclear Information System (INIS)

    Desai, H.K.; Christian, R.A.; Banerjee, J.; Patra, A.K.

    2013-01-01

    Radioactive liquid waste is generated during operation and maintenance of Pressurised Heavy Water Reactors (PHWRs). Generally, low-level liquid waste is diluted and then discharged into the nearby water body through the blowdown water discharge line as per standard waste management practice. The effluents from nuclear installations are treated adequately and then released in a controlled manner under strict compliance with discharge criteria. An attempt was made to predict the concentration of 3H released from Kakrapar Atomic Power Station at Ratania Regulator, about 2.5 km away from the discharge point, where human exposure is expected. Scarcity of data and the complex geometry of the lake prompted the use of a heuristic approach. Under these conditions, a fuzzy rule based approach was adopted to develop a model that could predict the 3H concentration at Ratania Regulator. Three hundred data points were generated for developing the fuzzy rules, in which the input parameters were the water flow from the lake and the 3H concentration at the discharge point. The output was the 3H concentration at Ratania Regulator. These data points were generated by multiple regression analysis of the original data. Using the same methodology, a further hundred data points were generated for validation of the model and were compared against the predicted output generated by the fuzzy rule based approach. The root mean square error of the model came out to be 1.95, indicating good agreement between the fuzzy model and the natural ecosystem. -- Highlights: • Uncommon approach (fuzzy rule base) to modelling radionuclide dispersion in a lake. • Predicts 3H released from Kakrapar Atomic Power Station at a point of human exposure. • RMSE of the fuzzy model is 1.95, which means it imitates the natural ecosystem well
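
    A toy version of such a fuzzy rule based predictor is sketched below using triangular membership functions and weighted-average defuzzification; the membership functions, the four rules and their output values are illustrative assumptions, not the rule base developed in the paper.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return float(np.clip(min((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0))

def predict_tritium(flow, conc_discharge):
    """Toy Mamdani-style inference for the 3H concentration at a downstream point.
    Inputs: lake water flow and 3H concentration at the discharge point.
    All membership functions and rule consequents are illustrative assumptions."""
    flow_low, flow_high = tri(flow, 0, 20, 60), tri(flow, 40, 80, 120)
    conc_low, conc_high = tri(conc_discharge, 0, 2, 6), tri(conc_discharge, 4, 8, 12)
    # Rule firing strengths (min as AND) with crisp (singleton) consequents
    rules = [
        (min(flow_high, conc_low),  0.5),   # strong dilution, low input  -> low downstream 3H
        (min(flow_high, conc_high), 2.0),
        (min(flow_low,  conc_low),  1.5),
        (min(flow_low,  conc_high), 6.0),   # weak dilution, high input   -> high downstream 3H
    ]
    weights = np.array([w for w, _ in rules])
    outputs = np.array([z for _, z in rules])
    total = weights.sum()
    return float((weights * outputs).sum() / total) if total > 0 else 0.0

print(predict_tritium(flow=70.0, conc_discharge=3.0))
```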

  12. Does the cognitive reflection test measure cognitive reflection? A mathematical modeling approach.

    Science.gov (United States)

    Campitelli, Guillermo; Gerrans, Paul

    2014-04-01

    We used a mathematical modeling approach, based on a sample of 2,019 participants, to better understand what the cognitive reflection test (CRT; Frederick, Journal of Economic Perspectives, 19, 25-42, 2005) measures. This test, which is typically completed in less than 10 min, contains three problems and aims to measure the ability or disposition to resist reporting the response that first comes to mind. However, since the test contains three mathematically based problems, it is possible that the test only measures mathematical abilities, and not cognitive reflection. We found that the models that included an inhibition parameter (i.e., the probability of inhibiting an intuitive response), as well as a mathematical parameter (i.e., the probability of using an adequate mathematical procedure), fitted the data better than a model that only included a mathematical parameter. We also found that the inhibition parameter in males is best explained by both rational thinking ability and the disposition toward actively open-minded thinking, whereas in females this parameter was better explained by rational thinking only. With these findings, this study contributes to the understanding of the processes involved in solving the CRT, and will be particularly useful for researchers who are considering using this test in their research.
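
    One simple way to read the two-parameter idea is sketched below, where a correct CRT response requires first inhibiting the intuitive answer and then applying an adequate mathematical procedure; this decomposition is an illustrative reading of the modelling idea, not the exact parameterisation estimated by the authors.

```python
def crt_response_probabilities(p_inhibit, p_math):
    """Toy three-outcome model for a single CRT item.

    p_inhibit : probability of inhibiting the intuitive response
    p_math    : probability of applying an adequate mathematical procedure
    The decomposition is an illustrative assumption, not the authors' exact model.
    """
    p_correct   = p_inhibit * p_math              # intuition suppressed, maths succeeds
    p_intuitive = 1.0 - p_inhibit                 # intuitive (lure) answer given
    p_other     = p_inhibit * (1.0 - p_math)      # intuition suppressed, maths fails
    return p_correct, p_intuitive, p_other

print(crt_response_probabilities(p_inhibit=0.6, p_math=0.8))  # (0.48, 0.4, 0.12), sums to 1
```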

  13. Data Analysis A Model Comparison Approach, Second Edition

    CERN Document Server

    Judd, Charles M; Ryan, Carey S

    2008-01-01

    This completely rewritten classic text features many new examples, insights and topics including mediational, categorical, and multilevel models. Substantially reorganized, this edition provides a briefer, more streamlined examination of data analysis. Noted for its model-comparison approach and unified framework based on the general linear model, the book provides readers with a greater understanding of a variety of statistical procedures. This consistent framework, including consistent vocabulary and notation, is used throughout to develop fewer but more powerful model building techniques. T

  14. Validation of an employee satisfaction model: A structural equation model approach

    OpenAIRE

    Ophillia Ledimo; Nico Martins

    2015-01-01

    The purpose of this study was to validate an employee satisfaction model and to determine the relationships between the different dimensions of the concept, using the structural equation modelling approach (SEM). A cross-sectional quantitative survey design was used to collect data from a random sample of (n=759) permanent employees of a parastatal organisation. Data was collected using the Employee Satisfaction Survey (ESS) to measure employee satisfaction dimensions. Following the steps of ...

  15. Data and Dynamics Driven Approaches for Modelling and Forecasting the Red Sea Chlorophyll

    KAUST Repository

    Dreano, Denis

    2017-01-01

    concentration and have practical applications for fisheries operation and harmful algae blooms monitoring. Modelling approaches can be divided between physics-driven (dynamical) approaches, and data-driven (statistical) approaches. Dynamical models are based

  16. High dimensions - a new approach to fermionic lattice models

    International Nuclear Information System (INIS)

    Vollhardt, D.

    1991-01-01

    The limit of high spatial dimensions d, which is well-established in the theory of classical and localized spin models, is shown to be a fruitful approach also to itinerant fermion systems, such as the Hubbard model and the periodic Anderson model. Many investigations, which are prohibitively difficult in finite dimensions, become tractable in d=∞. At the same time, essential features of systems in d=3 and even lower dimensions are very well described by the results obtained in d=∞. A wide range of applications of this new concept (e.g., in perturbation theory, Fermi liquid theory, variational approaches, exact results, etc.) is discussed and the state-of-the-art is reviewed. (orig.)

  17. Bayesian Multi-Energy Computed Tomography reconstruction approaches based on decomposition models

    International Nuclear Information System (INIS)

    Cai, Caifang

    2013-01-01

    Multi-Energy Computed Tomography (MECT) makes it possible to get multiple fractions of basis materials without segmentation. In medical applications, one is the soft-tissue equivalent water fraction and the other is the hard-matter equivalent bone fraction. Practical MECT measurements are usually obtained with polychromatic X-ray beams. Existing reconstruction approaches based on linear forward models that do not account for the beam polychromaticity fail to estimate the correct decomposition fractions and result in Beam-Hardening Artifacts (BHA). The existing BHA correction approaches either need to refer to calibration measurements or suffer from the noise amplification caused by the negative-log pre-processing and the water and bone separation problem. To overcome these problems, statistical DECT reconstruction approaches based on non-linear forward models accounting for the beam polychromaticity show great potential for giving accurate fraction images. This work proposes a full-spectral Bayesian reconstruction approach which allows the reconstruction of high quality fraction images from ordinary polychromatic measurements. This approach is based on a Gaussian noise model with unknown variance assigned directly to the projections without taking the negative log. Referring to Bayesian inferences, the decomposition fractions and observation variance are estimated by using the joint Maximum A Posteriori (MAP) estimation method. Subject to an adaptive prior model assigned to the variance, the joint estimation problem is then simplified into a single estimation problem. It transforms the joint MAP estimation problem into a minimization problem with a non-quadratic cost function. To solve it, the use of a monotone Conjugate Gradient (CG) algorithm with suboptimal descent steps is proposed. The performances of the proposed approach are analyzed with both simulated and experimental data. The results show that the proposed Bayesian approach is robust to noise and materials. It is also

  18. Synthesis of industrial applications of local approach to fracture models

    International Nuclear Information System (INIS)

    Eripret, C.

    1993-03-01

    This report gathers different applications of local approach to fracture models to various industrial configurations, such as nuclear pressure vessel steel, cast duplex stainless steels, or primary circuit welds such as bimetallic welds. As soon as models are developed on the basis of microstructural observations, damage mechanism analyses, and the fracture process, the local approach to fracture proves able to solve problems where classical fracture mechanics concepts fail. Therefore, the local approach appears to be a powerful tool, which complements the standard fracture criteria used in the nuclear industry by exhibiting where and why those classical concepts become invalid. (author). 1 tab., 18 figs., 25 refs

  19. CFD Modeling of Wall Steam Condensation: Two-Phase Flow Approach versus Homogeneous Flow Approach

    International Nuclear Information System (INIS)

    Mimouni, S.; Mechitoua, N.; Foissac, A.; Hassanaly, M.; Ouraou, M.

    2011-01-01

    The present work is focused on the condensation heat transfer that plays a dominant role in many accident scenarios postulated to occur in the containment of nuclear reactors. The study compares a general multiphase approach implemented in NEPTUNE_CFD with a homogeneous model, of widespread use for engineering studies, implemented in Code_Saturne. The model implemented in NEPTUNE_CFD assumes that liquid droplets form along the wall within nucleation sites. Vapor condensation on droplets makes them grow. Once the droplet diameter reaches a critical value, gravitational forces compensate the surface tension force and the droplets then slide over the wall and form a liquid film. This approach allows taking into account simultaneously the mechanical drift between the droplets and the gas, the heat and mass transfer on droplets in the core of the flow, and the condensation/evaporation phenomena on the walls. As concerns the homogeneous approach, the motion of the liquid film due to gravitational forces is neglected, as well as the volume occupied by the liquid. Both condensation models and compressible procedures are validated and compared to experimental data provided by the TOSQAN ISP47 experiment (IRSN Saclay). Computational results compare favorably with experimental data, particularly for the helium and steam volume fractions.

  20. Modelling of volunteer satisfaction and intention to remain in community service: A stepwise approach

    Science.gov (United States)

    Hasan, Hazlin; Wahid, Sharifah Norhuda Syed; Jais, Mohammad; Ridzuan, Arifi

    2017-05-01

    The purpose of this study is to obtain the most significant model of volunteer satisfaction and intention to remain in community service by using a stepwise approach. Currently, Malaysians, young and old, are showing more interest in involving themselves in community service projects, either locally or internationally. This positive movement of serving the needy is somehow being halted by the lack of human and financial resources. Therefore, the trend today sees organizers of such projects depend heavily on voluntary support, as volunteers enable project managers to add and expand the quantity and diversity of services offered without exhausting the minimal budget available. Volunteers are considered a valuable commodity, as the available pool of volunteers may be declining due to various reasons, which include volunteer satisfaction. In tandem with the existing situation, a selected sample of 215 diploma students from one of the public universities in Malaysia, who have been involved in at least one community service project, agreed that everybody should have the intention to volunteer in helping others. The findings revealed that the most significant model obtained contains two factors contributing to the intention to remain in community service: work assignment and organizational support, with work assignment being the most significant factor. Further research on the differences in intention to remain in community service between students' stream and gender will be conducted to contribute to the body of knowledge.

  1. A Comparison of Deterministic and Stochastic Modeling Approaches for Biochemical Reaction Systems: On Fixed Points, Means, and Modes.

    Science.gov (United States)

    Hahl, Sayuri K; Kremling, Andreas

    2016-01-01

    In the mathematical modeling of biochemical reactions, a convenient standard approach is to use ordinary differential equations (ODEs) that follow the law of mass action. However, this deterministic ansatz is based on simplifications; in particular, it neglects noise, which is inherent to biological processes. In contrast, the stochasticity of reactions is captured in detail by the discrete chemical master equation (CME). Therefore, the CME is frequently applied to mesoscopic systems, where copy numbers of involved components are small and random fluctuations are thus significant. Here, we compare those two common modeling approaches, aiming at identifying parallels and discrepancies between deterministic variables and possible stochastic counterparts like the mean or modes of the state space probability distribution. To that end, a mathematically flexible reaction scheme of autoregulatory gene expression is translated into the corresponding ODE and CME formulations. We show that in the thermodynamic limit, deterministic stable fixed points usually correspond well to the modes in the stationary probability distribution. However, this connection might be disrupted in small systems. The discrepancies are characterized and systematically traced back to the magnitude of the stoichiometric coefficients and to the presence of nonlinear reactions. These factors are found to synergistically promote large and highly asymmetric fluctuations. As a consequence, bistable but unimodal, and monostable but bimodal systems can emerge. This clearly challenges the role of ODE modeling in the description of cellular signaling and regulation, where some of the involved components usually occur in low copy numbers. Nevertheless, systems whose bimodality originates from deterministic bistability are found to sustain a more robust separation of the two states compared to bimodal, but monostable systems. In regulatory circuits that require precise coordination, ODE modeling is thus still

  2. BioModels: expanding horizons to include more modelling approaches and formats.

    Science.gov (United States)

    Glont, Mihai; Nguyen, Tung V N; Graesslin, Martin; Hälke, Robert; Ali, Raza; Schramm, Jochen; Wimalaratne, Sarala M; Kothamachu, Varun B; Rodriguez, Nicolas; Swat, Maciej J; Eils, Jurgen; Eils, Roland; Laibe, Camille; Malik-Sheriff, Rahuman S; Chelliah, Vijayalakshmi; Le Novère, Nicolas; Hermjakob, Henning

    2018-01-04

    BioModels serves as a central repository of mathematical models representing biological processes. It offers a platform to make mathematical models easily shareable across the systems modelling community, thereby supporting model reuse. To facilitate hosting a broader range of model formats derived from diverse modelling approaches and tools, a new infrastructure for BioModels has been developed that is available at http://www.ebi.ac.uk/biomodels. This new system allows submitting and sharing of a wide range of models with improved support for formats other than SBML. It also offers a version-control backed environment in which authors and curators can work collaboratively to curate models. This article summarises the features available in the current system and discusses the potential benefit they offer to the users over the previous system. In summary, the new portal broadens the scope of models accepted in BioModels and supports collaborative model curation which is crucial for model reproducibility and sharing. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  3. Object-Oriented Approach to Modeling Units of Pneumatic Systems

    Directory of Open Access Journals (Sweden)

    Yu. V. Kyurdzhiev

    2014-01-01

    Full Text Available The article shows the relevance of object-oriented programming approaches to modeling pneumatic units (PU). Based on the analysis of the calculation schemes of pneumatic system aggregates, two basic objects, namely a flow cavity and a material point, were highlighted. Basic interactions of the objects are defined. Cavity-cavity interaction: exchange of matter and energy with the flows of mass. Cavity-point interaction: force interaction, exchange of energy in the form of work. Point-point interaction: force interaction, elastic interaction, inelastic interaction, and intervals of displacement. The authors have developed mathematical models of the basic objects and interactions. The models and interactions of elements are implemented in object-oriented programming. Mathematical models of the elements of the PU design scheme are implemented in classes derived from the base class. These classes implement the models of flow cavity, piston, diaphragm, short channel, diaphragm opened by a given law, spring, bellows, elastic collision, inelastic collision, friction, PU stages with limited movement, etc. Numerical integration of the differential equations of the mathematical models of the PU design scheme elements is based on the fourth-order Runge-Kutta method. On request, each class performs one tact of integration, i.e. the calculation of one coefficient of the method. The paper presents an integration algorithm for the system of differential equations; an illustrative sketch is given below. All objects of the PU design scheme are placed in a unidirectional class list. An iterator loop initiates the integration tact of all the objects in the list. Every fourth iteration makes a transition to the next step of integration. The calculation process stops when any object raises a shutdown flag. The proposed approach was tested in the calculation of a number of PU designs. Compared with traditional approaches to modeling, the proposed method features easy enhancement, code reuse, and high reliability
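
    A minimal Python sketch of the object-list scheme described above (hypothetical class names and toy dynamics, not the authors' implementation): every design-scheme element derives from a base object that performs one tact of a fourth-order Runge-Kutta step, and an outer loop drives all objects in a shared list through the four tacts per time step.

      # Object-oriented RK4 integration by "tacts" over a list of model elements.
      import numpy as np

      class BaseElement:
          """Base class: holds a state vector and computes one RK4 coefficient per tact."""
          def __init__(self, state):
              self.state = np.asarray(state, dtype=float)
              self._k = []                      # RK4 coefficients accumulated over four tacts
              self._y0 = self.state.copy()
              self.shutdown = False

          def derivative(self, t):
              raise NotImplementedError

          def tact(self, t, dt):
              stage = len(self._k)
              k = self.derivative(t)
              self._k.append(k)
              if stage < 2:                     # prepare state for the next stage
                  self.state = self._y0 + 0.5 * dt * k
              elif stage == 2:
                  self.state = self._y0 + dt * k
              else:                             # fourth tact: complete the RK4 step
                  k1, k2, k3, k4 = self._k
                  self.state = self._y0 + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
                  self._y0 = self.state.copy()
                  self._k = []

      class FlowCavity(BaseElement):
          """Toy cavity: pressure relaxes towards a supply pressure (assumed dynamics)."""
          def derivative(self, t):
              return np.array([-(self.state[0] - 5.0e5)])   # Pa, placeholder constant

      elements = [FlowCavity([1.0e5]), FlowCavity([2.0e5])]
      t, dt = 0.0, 1e-3
      while t < 0.1 and not any(e.shutdown for e in elements):
          for stage in range(4):                # four tacts = one integration step for all objects
              for e in elements:
                  e.tact(t, dt)
          t += dt
      print([e.state[0] for e in elements])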

  4. An effective hierarchical model for the biomolecular covalent bond: an approach integrating artificial chemistry and an actual terrestrial life system.

    Science.gov (United States)

    Oohashi, Tsutomu; Ueno, Osamu; Maekawa, Tadao; Kawai, Norie; Nishina, Emi; Honda, Manabu

    2009-01-01

    Under the AChem paradigm and the programmed self-decomposition (PSD) model, we propose a hierarchical model for the biomolecular covalent bond (HBCB model). This model assumes that terrestrial organisms arrange their biomolecules in a hierarchical structure according to the energy strength of their covalent bonds. It also assumes that they have evolutionarily selected the PSD mechanism of turning biological polymers (BPs) into biological monomers (BMs) as an efficient biomolecular recycling strategy. We have examined the validity and effectiveness of the HBCB model by coordinating two complementary approaches: biological experiments using existent terrestrial life, and simulation experiments using an AChem system. Biological experiments have shown that terrestrial life possesses a PSD mechanism as an endergonic, genetically regulated process and that hydrolysis, which decomposes a BP into BMs, is one of the main processes of such a mechanism. In simulation experiments, we compared different virtual self-decomposition processes. The virtual species in which the self-decomposition process mainly involved covalent bond cleavage from a BP to BMs showed evolutionary superiority over other species in which the self-decomposition process involved cleavage from BP to classes lower than BM. These converging findings strongly support the existence of PSD and the validity and effectiveness of the HBCB model.

  5. Mathematical Modeling in Mathematics Education: Basic Concepts and Approaches

    Science.gov (United States)

    Erbas, Ayhan Kürsat; Kertil, Mahmut; Çetinkaya, Bülent; Çakiroglu, Erdinç; Alacaci, Cengiz; Bas, Sinem

    2014-01-01

    Mathematical modeling and its role in mathematics education have been receiving increasing attention in Turkey, as in many other countries. The growing body of literature on this topic reveals a variety of approaches to mathematical modeling and related concepts, along with differing perspectives on the use of mathematical modeling in teaching and…

  6. Integrating UML, the Q-model and a Multi-Agent Approach in Process Specifications and Behavioural Models of Organisations

    Directory of Open Access Journals (Sweden)

    Raul Savimaa

    2005-08-01

    Full Text Available Efficient estimation and representation of an organisation's behaviour requires specification of business processes and modelling of actors' behaviour. Therefore the existing classical approaches that concentrate only on planned processes are not suitable and an approach that integrates process specifications with behavioural models of actors should be used instead. The present research indicates that a suitable approach should be based on interactive computing. This paper examines the integration of UML diagrams for process specifications, the Q-model specifications for modelling timing criteria of existing and planned processes and a multi-agent approach for simulating non-deterministic behaviour of human actors in an organisation. The corresponding original methodology is introduced and some of its applications as case studies are reviewed.

  7. A Bayesian nonparametric approach to causal inference on quantiles.

    Science.gov (United States)

    Xu, Dandan; Daniels, Michael J; Winterstein, Almut G

    2018-02-25

    We propose a Bayesian nonparametric approach (BNP) for causal inference on quantiles in the presence of many confounders. In particular, we define relevant causal quantities and specify BNP models to avoid bias from restrictive parametric assumptions. We first use Bayesian additive regression trees (BART) to model the propensity score and then construct the distribution of potential outcomes given the propensity score using a Dirichlet process mixture (DPM) of normals model. We thoroughly evaluate the operating characteristics of our approach and compare it to Bayesian and frequentist competitors. We use our approach to answer an important clinical question involving acute kidney injury using electronic health records. © 2018, The International Biometric Society.

  8. Towards Detecting the Crowd Involved in Social Events

    Directory of Open Access Journals (Sweden)

    Wei Huang

    2017-10-01

    Full Text Available Knowing how people interact with urban environments is fundamental for a variety of fields, ranging from transportation to social science. Despite the fact that human mobility patterns have been a major topic of study in recent years, understanding large-scale human behavior when a certain event occurs remains a challenge, due to a lack of either relevant data or suitable approaches. A psychological crowd refers to a group of people who are usually located at different places and show different behaviors, but who are very sensitively driven to take the same action (gather together) by a certain event; this has been theoretically studied by social psychologists since the 19th century. This study aims to propose a computational approach using a machine learning method to model psychological crowds, contributing to a better understanding of human activity patterns under events. Psychological features and the mental unity of the crowd are computed to detect the involved individuals. A national event happening across the USA in April 2015 is analyzed using geotagged tweets as a case study to test our approach. The result shows that 81% of individuals in the crowd can be successfully detected. Through investigating the geospatial pattern of the involved users, not only can the event-related users be identified but also those users unobserved before the event can be uncovered. The proposed approach can effectively represent the psychological features and measure the mental unity of the psychological crowd, which sheds light on the study of large-scale psychological crowds and provides an innovative way to understand human behavior under events.

  9. A Survey of Game Theoretic Approaches to Modelling Decision-Making in Information Warfare Scenarios

    Directory of Open Access Journals (Sweden)

    Kathryn Merrick

    2016-07-01

    Full Text Available Our increasing dependence on information technologies and autonomous systems has escalated international concern for information- and cyber-security in the face of politically, socially and religiously motivated cyber-attacks. Information warfare tactics that interfere with the flow of information can challenge the survival of individuals and groups. It is increasingly important that both humans and machines can make decisions that ensure the trustworthiness of information, communication and autonomous systems. Subsequently, an important research direction is concerned with modelling decision-making processes. One approach to this involves modelling decision-making scenarios as games using game theory. This paper presents a survey of information warfare literature, with the purpose of identifying games that model different types of information warfare operations. Our contribution is a systematic identification and classification of information warfare games, as a basis for modelling decision-making by humans and machines in such scenarios. We also present a taxonomy of games that map to information warfare and cyber crime problems as a precursor to future research on decision-making in such scenarios. We identify and discuss open research questions including the role of behavioural game theory in modelling human decision making and the role of machine decision-making in information warfare scenarios.

  10. A rule-based approach to model checking of UML state machines

    Science.gov (United States)

    Grobelna, Iwona; Grobelny, Michał; Stefanowicz, Łukasz

    2016-12-01

    In the paper a new approach to formal verification of control process specifications expressed by means of UML state machines in version 2.x is proposed. In contrast to other approaches from the literature, we use an abstract and universal rule-based logical model suitable both for model checking (using the nuXmv model checker) and for logical synthesis in the form of rapid prototyping. Hence, a prototype implementation in the hardware description language VHDL can be obtained that fully reflects the primary, already formally verified specification in the form of UML state machines. The presented approach increases the assurance that the implemented system meets the user-defined requirements.

  11. A comprehensive dynamic modeling approach for giant magnetostrictive material actuators

    International Nuclear Information System (INIS)

    Gu, Guo-Ying; Zhu, Li-Min; Li, Zhi; Su, Chun-Yi

    2013-01-01

    In this paper, a comprehensive modeling approach for a giant magnetostrictive material actuator (GMMA) is proposed based on the description of nonlinear electromagnetic behavior, the magnetostrictive effect and frequency response of the mechanical dynamics. It maps the relationships between current and magnetic flux at the electromagnetic part to force and displacement at the mechanical part in a lumped parameter form. Towards this modeling approach, the nonlinear hysteresis effect of the GMMA appearing only in the electrical part is separated from the linear dynamic plant in the mechanical part. Thus, a two-module dynamic model is developed to completely characterize the hysteresis nonlinearity and the dynamic behaviors of the GMMA. The first module is a static hysteresis model to describe the hysteresis nonlinearity, and the cascaded second module is a linear dynamic plant to represent the dynamic behavior. To validate the proposed dynamic model, an experimental platform is established. Then, the linear dynamic part and the nonlinear hysteresis part of the proposed model are identified in sequence. For the linear part, an approach based on axiomatic design theory is adopted. For the nonlinear part, a Prandtl–Ishlinskii model is introduced to describe the hysteresis nonlinearity and a constrained quadratic optimization method is utilized to identify its coefficients. Finally, experimental tests are conducted to demonstrate the effectiveness of the proposed dynamic model and the corresponding identification method. (paper)
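
    A minimal Python sketch of the two-module structure described above (assumed weights, thresholds and plant parameters, not the identified GMMA model): a static Prandtl-Ishlinskii hysteresis operator built from weighted play operators, cascaded with a toy linear second-order plant integrated by simple forward-Euler stepping.

      # Static Prandtl-Ishlinskii hysteresis cascaded with a linear dynamic plant.
      import numpy as np

      def prandtl_ishlinskii(u, thresholds, weights):
          """Static PI hysteresis: weighted superposition of play (backlash) operators."""
          y_play = np.zeros(len(thresholds))
          out = np.zeros_like(u)
          for k, uk in enumerate(u):
              y_play = np.maximum(uk - thresholds, np.minimum(uk + thresholds, y_play))
              out[k] = weights @ y_play
          return out

      def linear_plant(f, dt, wn=2 * np.pi * 50.0, zeta=0.3):
          """Toy mass-spring-damper stage: x'' + 2*zeta*wn*x' + wn^2*x = wn^2*f."""
          x = v = 0.0
          xs = np.zeros_like(f)
          for k, fk in enumerate(f):
              a = wn**2 * (fk - x) - 2 * zeta * wn * v
              v += a * dt
              x += v * dt
              xs[k] = x
          return xs

      dt = 1e-4
      t = np.arange(0, 0.1, dt)
      current = np.sin(2 * np.pi * 20 * t)                      # driving current (A), toy input
      thresholds = np.linspace(0.0, 0.8, 8)                     # assumed play radii
      weights = np.exp(-thresholds)                             # assumed PI weights
      force = prandtl_ishlinskii(current, thresholds, weights)  # hysteretic magnetostrictive force
      displacement = linear_plant(force, dt)                    # cascaded linear dynamics
      print(displacement[-1])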

  12. Bystander Approaches: Empowering Students to Model Ethical Sexual Behavior

    Science.gov (United States)

    Lynch, Annette; Fleming, Wm. Michael

    2005-01-01

    Sexual violence on college campuses is well documented. Prevention education has emerged as an alternative to victim-- and perpetrator--oriented approaches used in the past. One sexual violence prevention education approach focuses on educating and empowering the bystander to become a point of ethical intervention. In this model, bystanders to…

  13. Accurate phenotyping: Reconciling approaches through Bayesian model averaging.

    Directory of Open Access Journals (Sweden)

    Carla Chia-Ming Chen

    Full Text Available Genetic research into complex diseases is frequently hindered by a lack of clear biomarkers for phenotype ascertainment. Phenotypes for such diseases are often identified on the basis of clinically defined criteria; however such criteria may not be suitable for understanding the genetic composition of the diseases. Various statistical approaches have been proposed for phenotype definition; however our previous studies have shown that differences in phenotypes estimated using different approaches have substantial impact on subsequent analyses. Instead of obtaining results based upon a single model, we propose a new method, using Bayesian model averaging to overcome problems associated with phenotype definition. Although Bayesian model averaging has been used in other fields of research, this is the first study that uses Bayesian model averaging to reconcile phenotypes obtained using multiple models. We illustrate the new method by applying it to simulated genetic and phenotypic data for Kofendred personality disorder-an imaginary disease with several sub-types. Two separate statistical methods were used to identify clusters of individuals with distinct phenotypes: latent class analysis and grade of membership. Bayesian model averaging was then used to combine the two clusterings for the purpose of subsequent linkage analyses. We found that causative genetic loci for the disease produced higher LOD scores using model averaging than under either individual model separately. We attribute this improvement to consolidation of the cores of phenotype clusters identified using each individual method.

  14. Modeling cell adhesion and proliferation: a cellular-automata based approach.

    Science.gov (United States)

    Vivas, J; Garzón-Alvarado, D; Cerrolaza, M

    Cell adhesion is a process that involves the interaction between the cell membrane and another surface, either a cell or a substrate. Unlike experimental tests, computer models can simulate processes and study the results of experiments in a shorter time and at lower cost. One of the tools used to simulate biological processes is the cellular automaton, a dynamic system that is discrete in both space and time. This work describes a computer model based on cellular automata for the adhesion and cell proliferation process to predict the behavior of a cell population in suspension and adhered to a substrate. The values of the simulated system were obtained through experimental tests on fibroblast monolayer cultures. The results allow us to estimate the settling time of the cells in culture as well as the adhesion and proliferation times. The change in cell morphology as adhesion over the contact surface progresses was also observed. The formation of the initial link between the cell and the substrate was observed after 100 min, with the cell on the substrate retaining its spherical morphology during the simulation. The cellular automata model developed is, however, a simplified representation of the steps in the adhesion process and the subsequent proliferation. A combined framework of experiment and computational simulation based on cellular automata was proposed to represent fibroblast adhesion on substrates and the macro-scale changes observed in the cell during the adhesion process. The approach proved to be simple and efficient.
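
    A minimal Python sketch of a cellular-automata view of adhesion and proliferation (grid size and probabilities are illustrative assumptions, not the authors' calibrated values): suspended cells may adhere to the substrate at each step, and adhered cells may divide into a free neighbouring site.

      # Toy cellular automaton: suspended cells adhere, adhered cells proliferate.
      import numpy as np

      rng = np.random.default_rng(1)
      EMPTY, SUSPENDED, ADHERED = 0, 1, 2
      grid = np.zeros((30, 30), dtype=int)
      grid[rng.random(grid.shape) < 0.05] = SUSPENDED     # initial cells in suspension

      P_ADHESION, P_DIVISION = 0.2, 0.05                  # per-step probabilities (assumed)

      def step(grid):
          new = grid.copy()
          for i, j in zip(*np.nonzero(grid)):
              if grid[i, j] == SUSPENDED and rng.random() < P_ADHESION:
                  new[i, j] = ADHERED                     # cell settles and adheres
              elif grid[i, j] == ADHERED and rng.random() < P_DIVISION:
                  # proliferate into one free von Neumann neighbour, if any
                  nbrs = [(i + di, j + dj) for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                          if 0 <= i + di < grid.shape[0] and 0 <= j + dj < grid.shape[1]
                          and new[i + di, j + dj] == EMPTY]
                  if nbrs:
                      new[nbrs[rng.integers(len(nbrs))]] = ADHERED
          return new

      for _ in range(100):                                # 100 steps of settling/adhesion/growth
          grid = step(grid)
      print("adhered cells:", int((grid == ADHERED).sum()))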

  15. Consumer input into health care: Time for a new active and comprehensive model of consumer involvement.

    Science.gov (United States)

    Hall, Alix E; Bryant, Jamie; Sanson-Fisher, Rob W; Fradgley, Elizabeth A; Proietto, Anthony M; Roos, Ian

    2018-03-07

    To ensure the provision of patient-centred health care, it is essential that consumers are actively involved in the process of determining and implementing health-care quality improvements. However, common strategies used to involve consumers in quality improvements, such as consumer membership on committees and collection of patient feedback via surveys, are ineffective and have a number of limitations, including: limited representativeness; tokenism; a lack of reliable and valid patient feedback data; infrequent assessment of patient feedback; delays in acquiring feedback; and limited use of collected feedback to drive health-care improvements. We propose a new active model of consumer engagement that aims to overcome these limitations. This model involves the following: (i) the development of a new measure of consumer perceptions; (ii) low-cost and frequent electronic data collection of patient views of quality improvements; (iii) efficient feedback to the health-care decision makers; and (iv) active involvement of consumers that fosters power to influence health system changes. © 2018 The Authors Health Expectations published by John Wiley & Sons Ltd.

  16. A hybrid modeling approach for option pricing

    Science.gov (United States)

    Hajizadeh, Ehsan; Seifi, Abbas

    2011-11-01

    The complexity of option pricing has led many researchers to develop sophisticated models for such purposes. The commonly used Black-Scholes model suffers from a number of limitations. One of these limitations is the assumption that the underlying probability distribution is lognormal, which is controversial. We propose a couple of hybrid models to reduce these limitations and enhance the ability of option pricing. The key input to an option pricing model is volatility. In this paper, we use three popular GARCH-type models for estimating volatility. Then, we develop two non-parametric models based on neural networks and neuro-fuzzy networks to price call options for the S&P 500 index. We compare the results with those of the Black-Scholes model and show that both the neural network and neuro-fuzzy network models outperform the Black-Scholes model. Furthermore, comparing the neural network and neuro-fuzzy approaches, we observe that for at-the-money options the neural network model performs better, and for both in-the-money and out-of-the-money options the neuro-fuzzy model provides better results.
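
    A minimal Python sketch of the hybrid idea described above (illustrative only): a volatility estimate feeds both a Black-Scholes benchmark and a small neural network trained to map option features to prices. Synthetic "market" prices and randomly drawn volatilities stand in for the paper's S&P 500 data and GARCH-type forecasts; the network architecture is an assumption.

      # Black-Scholes benchmark vs. a neural-network pricer on synthetic call prices.
      import numpy as np
      from scipy.stats import norm
      from sklearn.neural_network import MLPRegressor

      def black_scholes_call(S, K, T, r, sigma):
          d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
          d2 = d1 - sigma * np.sqrt(T)
          return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

      rng = np.random.default_rng(0)
      n = 2000
      S, r = 100.0, 0.02
      K = rng.uniform(80, 120, n)
      T = rng.uniform(0.1, 1.0, n)
      sigma = rng.uniform(0.15, 0.35, n)          # stands in for a GARCH volatility forecast
      market = black_scholes_call(S, K, T, r, sigma) * (1 + 0.02 * rng.standard_normal(n))

      X = np.column_stack([K / S, T, sigma])      # features: moneyness, maturity, volatility
      net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
      net.fit(X[:1500], market[:1500])            # train on the first 1500 synthetic quotes

      bs_err = np.abs(black_scholes_call(S, K[1500:], T[1500:], r, sigma[1500:]) - market[1500:])
      nn_err = np.abs(net.predict(X[1500:]) - market[1500:])
      print("mean abs error  BS: %.3f  NN: %.3f" % (bs_err.mean(), nn_err.mean()))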

  17. Minimally invasive approach for lesions involving the frontal sinus

    African Journals Online (AJOL)

    risk of future meningitis. The frontal ... Traditional open surgery for frontal sinus pathology and cerebrospinal fluid (CSF) leaks is complex and involves a ... sinus. The wound is closed in two layers ... He had noted displacement of his right eye.

  18. Modelling of ductile and cleavage fracture by local approach

    International Nuclear Information System (INIS)

    Samal, M.K.; Dutta, B.K.; Kushwaha, H.S.

    2000-08-01

    This report describes the modelling of ductile and cleavage fracture processes by the local approach. It is now well known that the conventional fracture mechanics method based on single-parameter criteria is not adequate to model the fracture processes. This is because of the effect of flaw size and geometry, loading type and loading rate on the fracture resistance behaviour of any structure. Hence, it is questionable to use the same fracture resistance curves as determined from standard tests in the analysis of real-life components in the presence of all the above effects. So, there is a need for a method in which the parameters used for the analysis are true material properties, i.e. independent of geometry and size. One of the solutions to the above problem is the use of local approaches. These approaches have been extensively studied and applied to different materials (including SA33 Gr.6) in this report. Each method has been studied and reported in a separate section. This report has been divided into five sections. Section-I gives a brief review of the fundamentals of the fracture process. Section-II deals with modelling of ductile fracture by locally uncoupled models. In this section, the critical cavity growth parameters of the different models have been determined for the primary heat transport (PHT) piping material of the Indian pressurised heavy water reactor (PHWR). A comparative study has been done among the different models. The dependence of the critical parameters on the stress triaxiality factor has also been studied. It is observed that Rice and Tracey's model is the most suitable one. However, its parameters are not fully independent of the triaxiality factor. For this reason, a modification to Rice and Tracey's model is suggested in Section-III. Section-IV deals with modelling of the ductile fracture process by locally coupled models. Section-V deals with the modelling of the cleavage fracture process by Beremin's model, which is based on Weibull's

  19. Overview of Computer Simulation Modeling Approaches and Methods

    Science.gov (United States)

    Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett

    2005-01-01

    The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...

  20. Repetitive Identification of Structural Systems Using a Nonlinear Model Parameter Refinement Approach

    Directory of Open Access Journals (Sweden)

    Jeng-Wen Lin

    2009-01-01

    Full Text Available This paper proposes a statistical-confidence-interval-based nonlinear model parameter refinement approach for the health monitoring of structural systems subjected to seismic excitations. The developed model refinement approach uses the 95% confidence intervals of the estimated structural parameters to determine their statistical significance in a least-squares regression setting. When a parameter's confidence interval covers the zero value, it is statistically justifiable to truncate that parameter. The remaining parameters repeatedly undergo this parameter sifting process for model refinement until the parameters' statistical significance cannot be improved further. This newly developed model refinement approach is implemented for the series models of multivariable polynomial expansions: the linear, the Taylor series, and the power series model, leading to more accurate identification as well as a more controllable design for system vibration control. Because the statistical-regression-based model refinement approach is intrinsically used to process a "batch" of data and obtain an ensemble-average estimation, such as of the structural stiffness, the Kalman filter and one of its extended versions are introduced into the refined power series model for structural health monitoring.
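
    A minimal Python sketch of the refinement rule described above (synthetic data, ordinary least squares only, not the paper's structural models): fit, compute 95% confidence intervals for each coefficient, truncate every term whose interval covers zero, and repeat until no further term can be dropped.

      # Iterative confidence-interval-based sifting of regression terms.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      n, p = 200, 6
      X = rng.standard_normal((n, p))
      beta_true = np.array([2.0, 0.0, -1.5, 0.0, 0.0, 0.8])     # several inactive terms
      y = X @ beta_true + 0.5 * rng.standard_normal(n)

      active = list(range(p))
      while True:
          Xa = X[:, active]
          beta, *_ = np.linalg.lstsq(Xa, y, rcond=None)
          resid = y - Xa @ beta
          dof = n - len(active)
          sigma2 = resid @ resid / dof
          cov = sigma2 * np.linalg.inv(Xa.T @ Xa)
          half_width = stats.t.ppf(0.975, dof) * np.sqrt(np.diag(cov))
          keep = np.abs(beta) > half_width                      # 95% CI does not cover zero
          if keep.all():
              break
          active = [a for a, k in zip(active, keep) if k]       # sift out insignificant terms

      print("retained terms:", active, "estimates:", np.round(beta, 3))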

  1. A simple approach for the modeling of an ODS steel mechanical behavior in pilgering conditions

    International Nuclear Information System (INIS)

    Vanegas-Márquez, E.; Mocellin, K.; Toualbi, L.; Carlan, Y. de; Logé, R.E.

    2012-01-01

    Highlights: ► The mechanical behavior of an ODS steel is investigated under pilgering conditions. ► Two mechanical tests show different trends, and are described with a simple model. ► Model parameters are identified using one sample, and considering strain range changes. ► The constitutive model involves few parameters but their values are strain path dependent. ► One identified set of parameters would be appropriate for FEM modeling of pilgering. - Abstract: The optimization of the forming of ODS tubes is linked to the choice of an appropriate constitutive model for the metal forming process. In the framework of a unified plastic constitutive theory, the strain-controlled cyclic characteristics of a ferritic ODS steel were analyzed and modeled with two different tests. The first test is a classical tension–compression test, and leads to cyclic softening at low to intermediate strain amplitudes. The second test consists of alternating uniaxial compressions along two perpendicular axes, and is selected based on the similarities with the loading path induced by the Fe–14Cr–1W–Ti ODS cladding tube pilgering process. This second test exhibits cyclic hardening at all tested strain amplitudes. Since variable strain amplitudes prevail in pilgering conditions, the parameters of the considered constitutive law were identified based on a loading sequence including strain amplitude changes. A proposed semi-automated inverse analysis methodology is shown to efficiently provide optimal sets of parameters for the considered loading sequences. When compared to classical approaches, the model involves a reduced number of parameters, while keeping a good ability to capture stress changes induced by strain amplitude changes. Furthermore, the methodology only requires one test, which is an advantage when the amount of available material is limited. As two distinct sets of parameters were identified for the two considered tests, it is recommended to

  2. Inverse modeling approach for evaluation of kinetic parameters of a biofilm reactor using tabu search.

    Science.gov (United States)

    Kumar, B Shiva; Venkateswarlu, Ch

    2014-08-01

    The complex nature of biological reactions in biofilm reactors often poses difficulties in analyzing such reactors experimentally. Mathematical models could be very useful for their design and analysis. However, application of biofilm reactor models to practical problems proves somewhat ineffective due to the lack of knowledge of accurate kinetic models and uncertainty in model parameters. In this work, we propose an inverse modeling approach based on tabu search (TS) to estimate the parameters of kinetic and film thickness models. TS is used to estimate these parameters as a consequence of the validation of the mathematical models of the process with the aid of measured data obtained from an experimental fixed-bed anaerobic biofilm reactor involving the treatment of pharmaceutical industry wastewater. The results evaluated for different modeling configurations of varying degrees of complexity illustrate the effectiveness of TS for accurate estimation of kinetic and film thickness model parameters of the biofilm process. The results show that the two-dimensional mathematical model with Edward kinetics (with its optimum parameters μ_max ρ_s/Y = 24.57, K_s = 1.352 and K_i = 102.36) and a three-parameter film thickness expression (with estimated parameters a = 0.289 × 10^-5, b = 1.55 × 10^-4 and c = 15.2 × 10^-6) better describes the biofilm reactor treating the industry wastewater.
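
    A minimal Python sketch of tabu-search parameter estimation (toy Monod-type rate model and synthetic data, not the paper's biofilm reactor equations): neighbouring parameter sets are generated by random perturbation, recently visited solutions are kept on a tabu list, and the sum of squared errors against measured data drives the search.

      # Tabu search fitting two kinetic parameters to synthetic rate measurements.
      import numpy as np

      rng = np.random.default_rng(0)

      def model(S, mu_max, Ks):
          """Toy substrate-utilisation rate; stands in for the reactor model being fitted."""
          return mu_max * S / (Ks + S)

      S_meas = np.linspace(0.5, 20, 15)
      rate_meas = model(S_meas, 1.8, 3.0) + 0.02 * rng.standard_normal(S_meas.size)

      def sse(theta):
          return np.sum((model(S_meas, *theta) - rate_meas) ** 2)

      current = np.array([1.0, 1.0])
      best, best_cost = current.copy(), sse(current)
      tabu = []                                        # recently visited (rounded) solutions
      for it in range(300):
          candidates = current + 0.2 * rng.standard_normal((20, 2))
          candidates = [c for c in candidates if tuple(np.round(c, 2)) not in tabu]
          if not candidates:
              continue
          current = min(candidates, key=sse)           # best admissible neighbour
          tabu.append(tuple(np.round(current, 2)))
          if len(tabu) > 25:                           # fixed tabu tenure (assumed)
              tabu.pop(0)
          if sse(current) < best_cost:
              best, best_cost = current.copy(), sse(current)

      print("estimated mu_max, Ks:", np.round(best, 3), "SSE:", round(best_cost, 4))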

  3. Mathematical models for therapeutic approaches to control HIV disease transmission

    CERN Document Server

    Roy, Priti Kumar

    2015-01-01

    The book discusses different therapeutic approaches based on different mathematical models to control HIV/AIDS disease transmission. It uses clinical data, collected from different cited sources, to formulate deterministic as well as stochastic mathematical models of HIV/AIDS. It provides complementary approaches, from deterministic and stochastic points of view, to optimal control strategies with perfect drug adherence, and also tries to examine the same issue from different angles, ranging from various mathematical models to computer simulations. The book presents essential methods and techniques for students who are interested in designing epidemiological models of HIV/AIDS. It also guides research scientists working in the periphery of mathematical modeling, helping them to explore a hypothetical method by examining its consequences in the form of a mathematical model and making scientific predictions. The model equations, mathematical analysis and several numerical simulations that are...

  4. Pure transvaginal excision of mesh erosion involving the bladder.

    Science.gov (United States)

    Firoozi, Farzeen; Goldman, Howard B

    2013-06-01

    We present a pure transvaginal approach to the removal of eroded mesh involving the bladder secondary to placement of transvaginal mesh for management of pelvic organ prolapse (POP) using a mesh kit. Although technically challenging, we demonstrate the feasibility of a purely transvaginal approach, avoiding a potentially more morbid transabdominal approach. The video presents the surgical technique of pure transvaginal excision of mesh erosion involving the bladder after mesh placement using a prolapse kit was performed. This video shows that purely transvaginal removal of mesh erosion involving the bladder can be done safely and is feasible.

  5. A security modeling approach for web-service-based business processes

    DEFF Research Database (Denmark)

    Jensen, Meiko; Feja, Sven

    2009-01-01

    The rising need for security in SOA applications requires better support for management of non-functional properties in web-based business processes. Here, the model-driven approach may provide valuable benefits in terms of maintainability and deployment. Apart from modeling the pure functionality of a process, the consideration of security properties at the level of a process model is a promising approach. In this work-in-progress paper we present an extension to the ARIS SOA Architect that is capable of modeling security requirements as a separate security model view. Further we provide a transformation that automatically derives WS-SecurityPolicy-conformant security policies from the process model, which in conjunction with the generated WS-BPEL processes and WSDL documents provides the ability to deploy and run the complete security-enhanced process based on Web Service technology.

  6. Query Language for Location-Based Services: A Model Checking Approach

    Science.gov (United States)

    Hoareau, Christian; Satoh, Ichiro

    We present a model checking approach to the rationale, implementation, and applications of a query language for location-based services. Such query mechanisms are necessary so that users, objects, and/or services can effectively benefit from the location-awareness of their surrounding environment. The underlying data model is founded on a symbolic model of space organized in a tree structure. Once extended to a semantic model for modal logic, we regard location query processing as a model checking problem, and thus define location queries as hybrid logic-based formulas. Our approach is unique among existing research because it explores the connection between location models and query processing in ubiquitous computing systems, relies on a sound theoretical basis, and provides modal logic-based query mechanisms for expressive searches over a decentralized data structure. A prototype implementation is also presented and will be discussed.

  7. Merging Digital Surface Models Implementing Bayesian Approaches

    Science.gov (United States)

    Sadeq, H.; Drummond, J.; Li, Z.

    2016-06-01

    In this research different DSMs from different sources have been merged. The merging is based on a probabilistic model using a Bayesian Approach. The implemented data have been sourced from very high resolution satellite imagery sensors (e.g. WorldView-1 and Pleiades). It is deemed preferable to use a Bayesian Approach when the data obtained from the sensors are limited and it is difficult or very costly to obtain many measurements; the problem of the lack of data can then be solved by introducing a priori estimations of data. To infer the prior data, it is assumed that the roofs of the buildings are smooth, and for that purpose local entropy has been implemented. In addition to the a priori estimations, GNSS RTK measurements have been collected in the field, which are used as check points to assess the quality of the DSMs and to validate the merging result. The model has been applied in the West End of Glasgow, containing different kinds of buildings, such as flat-roofed and hipped-roofed buildings. Both quantitative and qualitative methods have been employed to validate the merged DSM. The validation results have shown that the model was successfully able to improve the quality of the DSMs and to improve some characteristics such as the roof surfaces, which consequently led to better representations. In addition, the developed model has been compared with the well-established Maximum Likelihood model and showed similar quantitative statistical results and better qualitative results. Although the proposed model has been applied on DSMs that were derived from satellite imagery, it can be applied to any other sourced DSMs.
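
    A minimal Python sketch of the core idea of a Bayesian, precision-weighted merge of two DSMs (synthetic grids and assumed per-source variances, not the paper's data or prior): each source height is weighted by the inverse of its variance, and a prior height (here a crude smoothness estimate standing in for the entropy-based roof prior) is folded in as one more observation.

      # Inverse-variance (precision-weighted) fusion of two synthetic DSM grids.
      import numpy as np

      rng = np.random.default_rng(0)
      truth = np.tile(np.linspace(10, 20, 50), (50, 1))            # toy terrain/roof heights (m)

      dsm1 = truth + rng.normal(0.0, 0.5, truth.shape)             # e.g. WorldView-1 derived DSM
      dsm2 = truth + rng.normal(0.0, 1.0, truth.shape)             # e.g. Pleiades derived DSM
      var1, var2 = 0.5**2, 1.0**2                                  # assumed per-source variances

      prior = 0.25 * (np.roll(dsm1, 1, 0) + np.roll(dsm1, -1, 0)   # crude smoothness prior
                      + np.roll(dsm1, 1, 1) + np.roll(dsm1, -1, 1))
      var_prior = 2.0**2                                           # weak prior confidence

      w1, w2, wp = 1 / var1, 1 / var2, 1 / var_prior
      merged = (w1 * dsm1 + w2 * dsm2 + wp * prior) / (w1 + w2 + wp)

      for name, dsm in [("DSM1", dsm1), ("DSM2", dsm2), ("merged", merged)]:
          rmse = np.sqrt(np.mean((dsm - truth) ** 2))
          print(f"{name}: RMSE = {rmse:.3f} m")     # the merged grid should improve on both inputs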

  8. MERGING DIGITAL SURFACE MODELS IMPLEMENTING BAYESIAN APPROACHES

    Directory of Open Access Journals (Sweden)

    H. Sadeq

    2016-06-01

    Full Text Available In this research different DSMs from different sources have been merged. The merging is based on a probabilistic model using a Bayesian Approach. The implemented data have been sourced from very high resolution satellite imagery sensors (e.g. WorldView-1 and Pleiades). It is deemed preferable to use a Bayesian Approach when the data obtained from the sensors are limited and it is difficult or very costly to obtain many measurements; the problem of the lack of data can then be solved by introducing a priori estimations of data. To infer the prior data, it is assumed that the roofs of the buildings are smooth, and for that purpose local entropy has been implemented. In addition to the a priori estimations, GNSS RTK measurements have been collected in the field, which are used as check points to assess the quality of the DSMs and to validate the merging result. The model has been applied in the West End of Glasgow, containing different kinds of buildings, such as flat-roofed and hipped-roofed buildings. Both quantitative and qualitative methods have been employed to validate the merged DSM. The validation results have shown that the model was successfully able to improve the quality of the DSMs and to improve some characteristics such as the roof surfaces, which consequently led to better representations. In addition, the developed model has been compared with the well-established Maximum Likelihood model and showed similar quantitative statistical results and better qualitative results. Although the proposed model has been applied on DSMs that were derived from satellite imagery, it can be applied to any other sourced DSMs.

  9. Box-wing model approach for solar radiation pressure modelling in a multi-GNSS scenario

    Science.gov (United States)

    Tobias, Guillermo; Jesús García, Adrián

    2016-04-01

    The solar radiation pressure force is the largest orbital perturbation after the gravitational effects and the major error source affecting GNSS satellites. A wide range of approaches have been developed over the years for modelling this non-gravitational effect as part of the orbit determination process. These approaches are commonly divided into empirical, semi-analytical and analytical, where their main difference lies in the amount of a-priori physical information about the properties of the satellites (materials and geometry) and their attitude. It has been shown in the past that pre-launch analytical models fail to achieve the desired accuracy, mainly due to difficulties in the extrapolation of the in-orbit optical and thermal properties, the perturbations of the nominal attitude law and the aging of the satellite's surfaces, whereas the accuracy of empirical models strongly depends on the amount of tracking data used for deriving them, and their performance is reduced as the area-to-mass ratio of the GNSS satellites increases, as happens for upcoming constellations such as BeiDou and Galileo. This paper proposes to use a basic box-wing model for Galileo complemented with empirical parameters, based on the limited available information about the Galileo satellites' geometry. The satellite is modelled as a box, representing the satellite bus, and a wing representing the solar panel. The performance of the model will be assessed for the GPS, GLONASS and Galileo constellations. The results of the proposed approach have been analyzed over a one-year period. In order to assess the results, two different SRP models have been used: firstly, the proposed box-wing model and, secondly, the new CODE empirical model, ECOM2. The orbit performances of both models are assessed using Satellite Laser Ranging (SLR) measurements, together with the evaluation of the orbit prediction accuracy. This comparison shows the advantages and disadvantages of
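
    A minimal Python sketch of a box-wing SRP evaluation (illustrative areas, mass and optical coefficients, not calibrated Galileo values): the bus faces and the solar panel are treated as flat plates, and one common flat-plate formulation sums absorbed-plus-diffuse and reflected contributions for every sunlit plate.

      # Toy box-wing solar radiation pressure acceleration in the spacecraft body frame.
      import numpy as np

      P_SUN = 4.56e-6          # solar radiation pressure at 1 AU, N/m^2
      MASS = 700.0             # spacecraft mass, kg (assumed)

      # plate list: (area m^2, unit normal in body frame, specular rho, diffuse delta)
      PLATES = [
          (1.5, np.array([1.0, 0.0, 0.0]), 0.2, 0.3),    # +X bus face
          (1.5, np.array([-1.0, 0.0, 0.0]), 0.2, 0.3),   # -X bus face
          (3.0, np.array([0.0, 0.0, 1.0]), 0.1, 0.1),    # +Z bus face
          (11.0, np.array([0.0, 1.0, 0.0]), 0.1, 0.2),   # solar panel (wing)
      ]

      def srp_acceleration(sun_dir_body):
          """Sum flat-plate SRP accelerations over all sunlit plates (m/s^2)."""
          s = sun_dir_body / np.linalg.norm(sun_dir_body)
          acc = np.zeros(3)
          for area, n, rho, delta in PLATES:
              cos_t = float(n @ s)
              if cos_t <= 0.0:                           # plate not illuminated
                  continue
              along_sun = (1.0 - rho) * s                # absorbed + diffuse, along Sun line
              along_normal = (2.0 * rho * cos_t + 2.0 * delta / 3.0) * n
              acc += -(P_SUN * area * cos_t / MASS) * (along_sun + along_normal)
          return acc

      print(srp_acceleration(np.array([0.3, 0.9, 0.3])))   # toy Sun direction in body frame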

  10. A computational approach to compare regression modelling strategies in prediction research.

    Science.gov (United States)

    Pajouheshnia, Romin; Pestman, Wiebe R; Teerenstra, Steven; Groenwold, Rolf H H

    2016-08-25

    It is often unclear which approach to fit, assess and adjust a model will yield the most accurate prediction model. We present an extension of an approach for comparing modelling strategies in linear regression to the setting of logistic regression and demonstrate its application in clinical prediction research. A framework for comparing logistic regression modelling strategies by their likelihoods was formulated using a wrapper approach. Five different strategies for modelling, including simple shrinkage methods, were compared in four empirical data sets to illustrate the concept of a priori strategy comparison. Simulations were performed in both randomly generated data and empirical data to investigate the influence of data characteristics on strategy performance. We applied the comparison framework in a case study setting. Optimal strategies were selected based on the results of a priori comparisons in a clinical data set and the performance of models built according to each strategy was assessed using the Brier score and calibration plots. The performance of modelling strategies was highly dependent on the characteristics of the development data in both linear and logistic regression settings. A priori comparisons in four empirical data sets found that no strategy consistently outperformed the others. The percentage of times that a model adjustment strategy outperformed a logistic model ranged from 3.9 to 94.9 %, depending on the strategy and data set. However, in our case study setting the a priori selection of optimal methods did not result in detectable improvement in model performance when assessed in an external data set. The performance of prediction modelling strategies is a data-dependent process and can be highly variable between data sets within the same clinical domain. A priori strategy comparison can be used to determine an optimal logistic regression modelling strategy for a given data set before selecting a final modelling approach.

  11. Numerical modeling of hydrodynamics and sediment transport—an integrated approach

    Science.gov (United States)

    Gic-Grusza, Gabriela; Dudkowska, Aleksandra

    2017-10-01

    Point measurement-based estimation of bedload transport in the coastal zone is very difficult. The only way to assess the magnitude and direction of bedload transport in larger areas, particularly those characterized by complex bottom topography and hydrodynamics, is to use a holistic approach. This requires modeling of waves, currents, and the critical bed shear stress and bedload transport magnitude, with a due consideration to the realistic bathymetry and distribution of surface sediment types. Such a holistic approach is presented in this paper which describes modeling of bedload transport in the Gulf of Gdańsk. Extreme storm conditions defined based on 138-year NOAA data were assumed. The SWAN model (Booij et al. 1999) was used to define wind-wave fields, whereas wave-induced currents were calculated using the Kołodko and Gic-Grusza (2015) model, and the magnitude of bedload transport was estimated using the modified Meyer-Peter and Müller (1948) formula. The calculations were performed using a GIS model. The results obtained are innovative. The approach presented appears to be a valuable source of information on bedload transport in the coastal zone.
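
    A minimal Python sketch of the classic Meyer-Peter and Müller (1948) bedload relation at the core of the workflow described above (the paper uses a modified variant; the hydraulic values here are illustrative): the dimensionless Shields stress is compared with a critical value, the excess raised to the 3/2 power gives the dimensionless bedload rate, which is then re-dimensionalised per unit width.

      # Meyer-Peter and Mueller bedload transport rate from bed shear stress.
      import numpy as np

      RHO_W, RHO_S, G = 1000.0, 2650.0, 9.81     # water/sediment density (kg/m^3), gravity

      def mpm_bedload(tau_b, d50, theta_cr=0.047, coeff=8.0):
          """Bedload transport rate per unit width (m^2/s) from bed shear stress tau_b (Pa)."""
          theta = tau_b / ((RHO_S - RHO_W) * G * d50)            # Shields parameter
          excess = np.maximum(theta - theta_cr, 0.0)             # no transport below threshold
          q_star = coeff * excess**1.5                           # dimensionless MPM rate
          return q_star * np.sqrt((RHO_S / RHO_W - 1.0) * G * d50**3)

      tau_b = np.array([0.5, 2.0, 5.0])          # storm-condition bed shear stresses (Pa), toy values
      print(mpm_bedload(tau_b, d50=0.25e-3))     # medium sand, d50 = 0.25 mm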

  12. A multiscale modeling approach for biomolecular systems

    Energy Technology Data Exchange (ETDEWEB)

    Bowling, Alan, E-mail: bowling@uta.edu; Haghshenas-Jaryani, Mahdi, E-mail: mahdi.haghshenasjaryani@mavs.uta.edu [The University of Texas at Arlington, Department of Mechanical and Aerospace Engineering (United States)

    2015-04-15

    This paper presents a new multiscale molecular dynamics model for investigating the effects of external interactions, such as contact and impact, during stepping and docking of motor proteins and other biomolecular systems. The model retains the mass properties, ensuring that the result satisfies Newton's second law. This idea is presented using a simple particle model to facilitate discussion of the rigid body model; however, the particle model does provide insights into particle dynamics at the nanoscale. The resulting three-dimensional model predicts a significant decrease in the effect of the random forces associated with Brownian motion. This conclusion runs contrary to the widely accepted notion that the motor protein's movements are primarily the result of thermal effects. This work focuses on the mechanical aspects of protein locomotion; the effect of ATP hydrolysis is estimated as internal forces acting on the mechanical model. In addition, the proposed model can be numerically integrated in a reasonable amount of time. Herein, the differences between the motion predicted by the old and new modeling approaches are compared using a simplified model of myosin V.

  13. Job involvement of primary healthcare employees: does a service provision model play a role?

    Science.gov (United States)

    Koponen, Anne M; Laamanen, Ritva; Simonsen-Rehn, Nina; Sundell, Jari; Brommels, Mats; Suominen, Sakari

    2010-05-01

    To investigate whether the development of job involvement of primary healthcare (PHC) employees in Southern Municipality (SM), where PHC services were outsourced to an independent non-profit organisation, differed from that in the three comparison municipalities (M1, M2, M3) with municipal service providers. Also, the associations of job involvement with factors describing the psychosocial work environment were investigated. A panel mail survey 2000-02 in Finland (n=369, response rates 73% and 60%). The data were analysed by descriptive statistics and multivariate linear regression analysis. Despite the favourable development in the psychosocial work environment, job involvement decreased most in SM, which faced the biggest organisational changes. Job involvement decreased also in M3, where the psychosocial work environment deteriorated most. Job involvement in 2002 was best predicted by high baseline level of interactional justice and work control, positive change in interactional justice, and higher age. Also other factors, such as organisational stability, seemed to play a role; after controlling for the effect of the psychosocial work characteristics, job involvement was higher in M3 than in SM. Outsourcing of PHC services may decrease job involvement at least during the first years. A particular service provision model is better than the others only if it is superior in providing a favourable and stable psychosocial work environment.

  14. An approach for activity-based DEVS model specification

    DEFF Research Database (Denmark)

    Alshareef, Abdurrahman; Sarjoughian, Hessam S.; Zarrin, Bahram

    2016-01-01

    Creation of DEVS models has been advanced through Model Driven Architecture and its frameworks. The overarching role of the frameworks has been to help develop model specifications in a disciplined fashion. Frameworks can provide intermediary layers between the higher level mathematical models...... and their corresponding software specifications from both structural and behavioral aspects. Unlike structural modeling, developing models to specify behavior of systems is known to be harder and more complex, particularly when operations with non-trivial control schemes are required. In this paper, we propose specifying...... activity-based behavior modeling of parallel DEVS atomic models. We consider UML activities and actions as fundamental units of behavior modeling, especially in the presence of recent advances in the UML 2.5 specifications. We describe in detail how to approach activity modeling with a set of elemental...

  15. Post-closure biosphere assessment modelling: comparison of complex and more stylised approaches

    Energy Technology Data Exchange (ETDEWEB)

    Walke, Russell C. [Quintessa Limited, The Hub, 14 Station Road, Henley-on-Thames (United Kingdom); Kirchner, Gerald [University of Hamburg, ZNF, Beim Schlump 83, 20144 Hamburg (Germany); Xu, Shulan; Dverstorp, Bjoern [Swedish Radiation Safety Authority, SE-171 16 Stockholm (Sweden)

    2014-07-01

    to the biosphere. Some radionuclides do not reach equilibrium within the time frame that the biosphere evolves at the Forsmark site, making associated dose factors sensitive to time scales assumed for biosphere evolution. Comparison of the results generated by both types of model demonstrates that, for areas that evolve from marine, through lakes and mires to terrestrial systems with organic soils, the approach adopted in SKB's model is conservative. However, higher dose factors are possible when potential for long-term irrigation with shallow groundwater is considered. Surveys of groundwater wells in the Forsmark area today show that some shallow groundwater is used to water plants, which demonstrates that small scale irrigation from such sources cannot be ruled out for present-day or warmer climate states. Complex models use more of the available site-specific information and contribute to an understanding of complex process interactions and effects of system heterogeneity. The study shows, however, that simple 'reference' biosphere models enable processes that control potential radionuclide impacts to be identified, taking into account climate variability. They help to build understanding and confidence in more complex modelling approaches, quantify the conservatisms involved and remain a valuable tool for nuclear waste disposal licensing procedures. (authors)

  16. A Composite Modelling Approach to Decision Support by the Use of the CBA-DK Model

    DEFF Research Database (Denmark)

    Barfod, Michael Bruhn; Salling, Kim Bang; Leleur, Steen

    2007-01-01

    This paper presents a decision support system for assessment of transport infrastructure projects. The composite modelling approach, COSIMA, combines a cost-benefit analysis by use of the CBA-DK model with multi-criteria analysis applying the AHP and SMARTER techniques. The modelling uncertaintie...

  17. A Bayesian approach for parameter estimation and prediction using a computationally intensive model

    International Nuclear Information System (INIS)

    Higdon, Dave; McDonnell, Jordan D; Schunck, Nicolas; Sarich, Jason; Wild, Stefan M

    2015-01-01

    Bayesian methods have been successful in quantifying uncertainty in physics-based problems in parameter estimation and prediction. In these cases, physical measurements y are modeled as the best fit of a physics-based model η(θ), where θ denotes the uncertain, best input setting. Hence the statistical model is of the form y=η(θ)+ϵ, where ϵ accounts for measurement, and possibly other, error sources. When nonlinearity is present in η(⋅), the resulting posterior distribution for the unknown parameters in the Bayesian formulation is typically complex and nonstandard, requiring computationally demanding approaches such as Markov chain Monte Carlo (MCMC) to produce multivariate draws from the posterior. Although generally applicable, MCMC requires thousands (or even millions) of evaluations of the physics model η(⋅). This requirement is problematic if the model takes hours or days to evaluate. To overcome this computational bottleneck, we present an approach adapted from Bayesian model calibration. This approach combines output from an ensemble of computational model runs with physical measurements, within a statistical formulation, to carry out inference. A key component of this approach is a statistical response surface, or emulator, estimated from the ensemble of model runs. We demonstrate this approach with a case study in estimating parameters for a density functional theory model, using experimental mass/binding energy measurements from a collection of atomic nuclei. We also demonstrate how this approach produces uncertainties in predictions for recent mass measurements obtained at Argonne National Laboratory. (paper)
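
    A minimal Python sketch of the calibration workflow described above (a toy one-parameter simulator, not the density functional theory application): an ensemble of model runs trains a Gaussian-process emulator, which then stands in for the expensive simulator inside a simple Metropolis sampler targeting the posterior of the uncertain input.

      # GP emulator trained on an ensemble of runs, used inside a Metropolis sampler.
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor

      rng = np.random.default_rng(0)

      def expensive_model(theta):
          """Stand-in for a slow physics code (hours per run in the real setting)."""
          return np.sin(3.0 * theta) + 0.5 * theta

      theta_design = np.linspace(0.0, 2.0, 15).reshape(-1, 1)     # ensemble of design runs
      emulator = GaussianProcessRegressor().fit(theta_design, expensive_model(theta_design[:, 0]))

      theta_true, noise_sd = 1.3, 0.05
      y_obs = expensive_model(theta_true) + noise_sd * rng.normal()

      def log_post(theta):
          if not 0.0 <= theta <= 2.0:                             # uniform prior support (assumed)
              return -np.inf
          pred = emulator.predict(np.array([[theta]]))[0]         # emulator replaces the simulator
          return -0.5 * ((y_obs - pred) / noise_sd) ** 2

      samples, theta = [], 1.0
      lp = log_post(theta)
      for _ in range(5000):                                       # Metropolis on the emulator
          prop = theta + 0.1 * rng.normal()
          lp_prop = log_post(prop)
          if np.log(rng.random()) < lp_prop - lp:
              theta, lp = prop, lp_prop
          samples.append(theta)

      print("posterior mean estimate:", np.mean(samples[1000:]))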

  18. The "Village" model: a consumer-driven approach for aging in place.

    Science.gov (United States)

    Scharlach, Andrew; Graham, Carrie; Lehning, Amanda

    2012-06-01

    This study examines the characteristics of the "Village" model, an innovative consumer-driven approach that aims to promote aging in place through a combination of member supports, service referrals, and consumer engagement. Thirty of 42 fully operational Villages completed 2 surveys. One survey examined Villages' member characteristics, membership types, and fee structures. An additional survey collected information about organizational mission, goals, methods of operation, funding sources, challenges, and older adults' roles. Villages provide a variety of support services designed to help members age in place, meet service needs, and promote health and quality of life. Most Villages operate relatively autonomously, relying primarily on member fees and donations. Village members typically are highly involved in organizational development and oversight and provide services to other members in almost half of the Villages. Members predominantly are aged 65 years or older, White, non-Hispanic, homeowners, and have care needs that are slightly lower than those of the elderly U.S. population overall. Villages are a promising model for addressing service needs among middle-class seniors who seek to age in their own homes and communities. Financial sustainability is apt to be a challenge unless Villages secure more stable sources of funding. Organizational sustainability may be promoted through affiliations with social service agencies and other sources of technical and financial assistance. Future evaluation is needed regarding the impact of Villages on elders' ability to age in place as well as the long-term sustainability of the Village model.

  19. A Bayesian Approach to Model Selection in Hierarchical Mixtures-of-Experts Architectures.

    Science.gov (United States)

    Tanner, Martin A.; Peng, Fengchun; Jacobs, Robert A.

    1997-03-01

    There does not exist a statistical model that shows good performance on all tasks. Consequently, the model selection problem is unavoidable; investigators must decide which model is best at summarizing the data for each task of interest. This article presents an approach to the model selection problem in hierarchical mixtures-of-experts architectures. These architectures combine aspects of generalized linear models with those of finite mixture models in order to perform tasks via a recursive "divide-and-conquer" strategy. Markov chain Monte Carlo methodology is used to estimate the distribution of the architectures' parameters. One part of our approach to model selection attempts to estimate the worth of each component of an architecture so that relatively unused components can be pruned from the architecture's structure. A second part of this approach uses a Bayesian hypothesis testing procedure in order to differentiate inputs that carry useful information from nuisance inputs. Simulation results suggest that the approach presented here adheres to the dictum of Occam's razor; simple architectures that are adequate for summarizing the data are favored over more complex structures. Copyright 1997 Elsevier Science Ltd. All Rights Reserved.

  20. Simulation-based model checking approach to cell fate specification during Caenorhabditis elegans vulval development by hybrid functional Petri net with extension

    Directory of Open Access Journals (Sweden)

    Ueno Kazuko

    2009-04-01

    Background: Model checking approaches were applied to biological pathway validations around 2003. Recently, Fisher et al. demonstrated the importance of the model checking approach by inferring new regulation of signaling crosstalk in C. elegans and confirming the regulation with biological experiments. They took a discrete, state-based approach to explore all possible states of the system underlying vulval precursor cell (VPC) fate specification for desired properties. However, since both discrete and continuous features appear to be an indispensable part of biological processes, it is more appropriate to use quantitative models to capture the dynamics of biological systems. The key motivation of this paper is to establish a quantitative methodology to model and analyze in silico models incorporating the use of the model checking approach. Results: A novel method of modeling and simulating biological systems with the use of the model checking approach is proposed, based on hybrid functional Petri net with extension (HFPNe) as the framework dealing with both discrete and continuous events. Firstly, we construct a quantitative VPC fate model with 1761 components by using HFPNe. Secondly, we apply two major biological fate determination rules – Rule I and Rule II – to the VPC fate model. We then conduct 10,000 simulations for each of 48 sets of different genotypes, investigate variations of cell fate patterns under each genotype, and validate the two rules by comparing three simulation targets consisting of fate patterns obtained from in silico and in vivo experiments. In particular, an evaluation was successfully carried out by using our VPC fate model to investigate one target derived from biological experiments involving hybrid lineage observations. However, hybrid lineages are hard to interpret on a discrete model because a hybrid lineage occurs when the system comes close to certain thresholds, as discussed by Sternberg and Horvitz in

  1. A long-memory model of motor learning in the saccadic system: a regime-switching approach.

    Science.gov (United States)

    Wong, Aaron L; Shelhamer, Mark

    2013-08-01

    Maintenance of movement accuracy relies on motor learning, by which prior errors guide future behavior. One aspect of this learning process involves the accurate generation of predictions of movement outcome. These predictions can, for example, drive anticipatory movements during a predictive-saccade task. Predictive saccades are rapid eye movements made to anticipated future targets based on error information from prior movements. This predictive process exhibits long-memory (fractal) behavior, as suggested by inter-trial fluctuations. Here, we model this learning process using a regime-switching approach, which avoids the computational complexities associated with true long-memory processes. The resulting model demonstrates two fundamental characteristics. First, long-memory behavior can be mimicked by a system possessing no true long-term memory, producing model outputs consistent with human-subjects performance. In contrast, the popular two-state model, which is frequently used in motor learning, cannot replicate these findings. Second, our model suggests that apparent long-term memory arises from the trade-off between correcting for the most recent movement error and maintaining consistent long-term behavior. Thus, the model surprisingly predicts that stronger long-memory behavior correlates to faster learning during adaptation (in which systematic errors drive large behavioral changes); greater apparent long-term memory indicates more effective incorporation of error from the cumulative history across trials.

  2. Modelling Configuration Knowledge in Heterogeneous Product Families

    DEFF Research Database (Denmark)

    Queva, Matthieu Stéphane Benoit; Männistö, Tomi; Ricci, Laurent

    2011-01-01

    Product configuration systems play an important role in the development of Mass Customisation. The configuration of complex product families may nowadays involve multiple design disciplines, e.g. hardware, software and services. In this paper, we present a conceptual approach for modelling...... the variability in such heterogeneous product families. Our approach is based on a framework that aims to cater for the different stakeholders involved in the modelling and management of the product family. The modelling approach is centred around the concepts of views, types and constraints and is illustrated...... by a motivation example. Furthermore, as a proof of concept, a prototype has been implemented for configuring a non-trivial heterogeneous product family....

  3. Evaluation of various modelling approaches in flood routing simulation and flood area mapping

    Science.gov (United States)

    Papaioannou, George; Loukas, Athanasios; Vasiliades, Lampros; Aronica, Giuseppe

    2016-04-01

    Floodplain modelling is an essential process in flood hazard analysis and mapping. The selection of the modelling approach, especially in complex riverine topographies such as urban and suburban areas and in ungauged watersheds, may affect the accuracy of the outcomes in terms of flood depths and flood inundation area. In this study, a sensitivity analysis was implemented using several hydraulic-hydrodynamic modelling approaches (1D, 2D, 1D/2D), and the effect of the modelling approach on flood modelling and flood mapping was investigated. The digital terrain model (DTM) used in this study was generated from Terrestrial Laser Scanning (TLS) point cloud data. The modelling approaches included 1-dimensional hydraulic-hydrodynamic models (1D), 2-dimensional hydraulic-hydrodynamic models (2D) and coupled 1D/2D models. The 1D models used were HECRAS, MIKE11, LISFLOOD and XPSTORM. The 2D models used were MIKE21, MIKE21FM, HECRAS (2D), XPSTORM, LISFLOOD and FLO2d. The coupled 1D/2D models employed were HECRAS (1D/2D), MIKE11/MIKE21 (MIKE FLOOD platform), MIKE11/MIKE21 FM (MIKE FLOOD platform) and XPSTORM (1D/2D). Validation of the flood extent was achieved with the use of 2x2 contingency tables between the simulated and observed flooded areas for an extreme historical flash flood event, with the Critical Success Index used as the skill score. The modelling approaches were also evaluated for simulation time and required computing power. The methodology was implemented in a suburban ungauged watershed of the Xerias river at Volos, Greece. The results of the analysis indicate the necessity of applying sensitivity analysis with the use of different hydraulic-hydrodynamic modelling approaches, especially for areas with complex terrain.

  4. The Role of Student Involvement and Perceptions of Integration in a Causal Model of Student Persistence.

    Science.gov (United States)

    Berger, Joseph B.; Milem, Jeffrey F.

    1999-01-01

    This study refined and applied an integrated model of undergraduate persistence (accounting for both behavioral and perceptual components) to examine first-year retention at a private, highly selective research university. Results suggest that including behaviorally based measures of involvement improves the model's explanatory power concerning…

  5. Experimental Validation of Various Temperature Models for Semi-Physical Tyre Model Approaches

    Science.gov (United States)

    Hackl, Andreas; Scherndl, Christoph; Hirschberg, Wolfgang; Lex, Cornelia

    2017-10-01

    With increasing levels of complexity and automation in the area of automotive engineering, the simulation of safety-relevant Advanced Driver Assistance Systems (ADAS) leads to increasing accuracy demands in the description of tyre contact forces. In recent years, with improvements in tyre simulation, the need to account for tyre temperatures and the resulting changes in tyre characteristics has risen significantly. Therefore, an experimental validation of three different temperature model approaches is carried out, discussed and compared in the scope of this article. To evaluate the range of application of the presented approaches with a view to further implementation in semi-physical tyre models, the main focus lies on the physical parameterisation. Aside from good modelling accuracy, attention is paid to computational time and the complexity of the parameterisation process. To evaluate this process and discuss the results, measurements of a Hoosier racing tyre 6.0 / 18.0 10 LCO C2000 from an industrial flat test bench are used. Finally, the simulation results are compared with the measurement data.

  6. University Physics Students' Use of Models in Explanations of Phenomena Involving Interaction between Metals and Electromagnetic Radiation.

    Science.gov (United States)

    Redfors, Andreas; Ryder, Jim

    2001-01-01

    Examines third year university physics students' use of models when explaining familiar phenomena involving interaction between metals and electromagnetic radiation. Concludes that few students use a single model consistently. (Contains 27 references.) (DDR)

  7. A generalized approach for historical mock-up acquisition and data modelling: Towards historically enriched 3D city models

    Science.gov (United States)

    Hervy, B.; Billen, R.; Laroche, F.; Carré, C.; Servières, M.; Van Ruymbeke, M.; Tourre, V.; Delfosse, V.; Kerouanton, J.-L.

    2012-10-01

    Museums are filled with hidden secrets. One of those secrets lies behind historical mock-ups, whose significance goes far beyond a simple representation of a city. We face the challenge of designing, storing and showing knowledge related to these mock-ups in order to explain their historical value. Over the last few years, several mock-up digitisation projects have been realised. Two of them, Nantes 1900 and Virtual Leodium, propose innovative approaches that present many similarities. This paper presents a framework that goes one step further by analysing their data modelling processes and extracting what could be a generalised approach to building a digital mock-up and the associated knowledge database. Geometry modelling and knowledge modelling influence each other and are conducted in parallel. Our generalised approach gives a global overview of what a data modelling process can be. Our next goal is to apply this global approach to other historical mock-ups, but we also consider applying it to other 3D objects that need to embed semantic data, thereby approaching historically enriched 3D city models.

  8. A review of function modeling: Approaches and applications

    OpenAIRE

    Erden, M.S.; Komoto, H.; Van Beek, T.J.; D'Amelio, V.; Echavarria, E.; Tomiyama, T.

    2008-01-01

    This work is aimed at establishing a common frame and understanding of function modeling (FM) for our ongoing research activities. A comparative review of the literature is performed to grasp the various FM approaches with their commonalities and differences. The relations of FM with the research fields of artificial intelligence, design theory, and maintenance are discussed. In this discussion the goals are to highlight the features of various classical approaches in relation to FM, to delin...

  9. Extracting business vocabularies from business process models: SBVR and BPMN standards-based approach

    Science.gov (United States)

    Skersys, Tomas; Butleris, Rimantas; Kapocius, Kestutis

    2013-10-01

    Approaches for the analysis and specification of business vocabularies and rules are highly relevant topics in both the Business Process Management and Information Systems Development disciplines. However, in common Information Systems Development practice, business modeling activities are still mostly empirical in nature. In this paper, the basic aspects of an approach for the semi-automated extraction of business vocabularies from business process models are presented. The approach is based on the novel business modeling-level OMG standards "Business Process Model and Notation" (BPMN) and "Semantics of Business Vocabulary and Business Rules" (SBVR), thus contributing to OMG's vision of Model-Driven Architecture (MDA) and to model-driven development in general.

  10. A distributed delay approach for modeling delayed outcomes in pharmacokinetics and pharmacodynamics studies.

    Science.gov (United States)

    Hu, Shuhua; Dunlavey, Michael; Guzy, Serge; Teuscher, Nathan

    2018-04-01

    A distributed delay approach was proposed in this paper to model delayed outcomes in pharmacokinetics and pharmacodynamics studies. This approach was shown to be general enough to incorporate a wide array of pharmacokinetic and pharmacodynamic models as special cases including transit compartment models, effect compartment models, typical absorption models (either zero-order or first-order absorption), and a number of atypical (or irregular) absorption models (e.g., parallel first-order, mixed first-order and zero-order, inverse Gaussian, and Weibull absorption models). Real-life examples were given to demonstrate how to implement distributed delays in Phoenix ® NLME™ 8.0, and to numerically show the advantages of the distributed delay approach over the traditional methods.
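
    A minimal sketch of one common special case of the distributed-delay idea: a chain of transit compartments that produces a gamma-distributed delay between drug input and the delayed signal. This is illustrative only and is not the Phoenix NLME implementation described above; all parameter values are assumptions.

```python
# Transit-compartment chain as a gamma-distributed delay (illustrative sketch).
import numpy as np
from scipy.integrate import solve_ivp

n, ktr = 5, 2.0               # number of transit compartments, transit rate (1/h)
dose, ka, ke = 100.0, 1.5, 0.3  # dose (mg), absorption and elimination rates (1/h)

def rhs(t, y):
    gut, central, transit = y[0], y[1], y[2:]
    dgut = -ka * gut
    dcentral = ka * gut - ke * central
    dtransit = np.empty(n)
    dtransit[0] = ktr * (central - transit[0])     # signal enters the chain
    for i in range(1, n):
        dtransit[i] = ktr * (transit[i - 1] - transit[i])
    return np.concatenate(([dgut, dcentral], dtransit))

y0 = np.concatenate(([dose, 0.0], np.zeros(n)))
sol = solve_ivp(rhs, (0.0, 24.0), y0, t_eval=np.linspace(0, 24, 97))

# The last transit compartment is the delayed signal driving the response.
print("peak of delayed signal at t =", sol.t[np.argmax(sol.y[-1])], "h")
```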

  11. Numerical modeling of underground openings behavior with a viscoplastic approach

    International Nuclear Information System (INIS)

    Kleine, A.

    2007-01-01

    Nature is complex and must be approached with humility by engineers seeking to predict the behavior of underground openings. The engineering of industrial projects in underground settings with high economic and social stakes (Alpine mountain crossings, nuclear waste repositories) means striving to gain a better understanding of the behavioral mechanisms of the openings to be designed. This improvement necessarily involves better physical representativeness of the macroscopic mechanisms and the provision of prediction tools suited to the expectations and needs of engineers. The calculation tools developed in this work respond to this concern for satisfying industrial needs and for developing knowledge related to the rheology of geo-materials. These developments led to the proposal of a mechanical constitutive model suited to lightly fissured rocks, which can be treated as continuous media, while specifically incorporating the effect of time. The central question of this thesis is thus the delayed behavior of the rock mass in numerical modeling and its consequences for the design of underground openings. Based on physical reference concepts defined at several scales (macro/meso/micro), the constitutive model is translated into a mathematical formalism so that it can be implemented numerically. The numerical applications presented as illustrations fall mainly within the framework of nuclear waste repository problems. They concern two very different configurations of underground openings: AECL's Canadian underground laboratory, excavated in the Lac du Bonnet granite, and the GMR gallery of the Bure laboratory (Meuse/Haute-Marne), dug in argillaceous rock. In these two cases, the use of the constitutive model highlights the gains in accuracy to be obtained from allowing for delayed behavior when predicting tunnel behavior in the short, medium and long term. (author)

  12. A Low-involvement Choice Model for Consumer Panel Data

    OpenAIRE

    Brugha, Cathal; Turley, Darach

    1987-01-01

    The long overdue surge of interest in low-involvement purchasing in consumer behaviour texts has only begun to gather momentum. It often takes the form of asking whether concepts usually associated with high-involvement purchasing can be applied, albeit in a modified form, to low-involvement purchasing. One such concept is the evoked set, that is, the range of brands deemed acceptable by a consumer in a particular product area. This has characteristically been associated with consumption involving...

  13. A novel approach for modelling complex maintenance systems using discrete event simulation

    International Nuclear Information System (INIS)

    Alrabghi, Abdullah; Tiwari, Ashutosh

    2016-01-01

    Existing approaches for modelling maintenance rely on oversimplified assumptions which prevent them from reflecting the complexity found in industrial systems. In this paper, we propose a novel approach that enables the modelling of non-identical multi-unit systems without restrictive assumptions on the number of units or their maintenance characteristics. Modelling complex interactions between maintenance strategies and their effects on assets in the system is achieved by accessing event queues in Discrete Event Simulation (DES). The approach utilises the wide success DES has achieved in manufacturing by allowing integration with models that are closely related to maintenance such as production and spare parts systems. Additional advantages of using DES include rapid modelling and visual interactive simulation. The proposed approach is demonstrated in a simulation based optimisation study of a published case. The current research is one of the first to optimise maintenance strategies simultaneously with their parameters while considering production dynamics and spare parts management. The findings of this research provide insights for non-conflicting objectives in maintenance systems. In addition, the proposed approach can be used to facilitate the simulation and optimisation of industrial maintenance systems. - Highlights: • This research is one of the first to optimise maintenance strategies simultaneously. • New insights for non-conflicting objectives in maintenance systems. • The approach can be used to optimise industrial maintenance systems.
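
    A toy sketch of a maintenance model in discrete event simulation, assuming the SimPy library is available. The machine names, failure and repair rates, and the simple corrective-maintenance policy are illustrative assumptions, not the published case study or its optimisation setup.

```python
# Corrective maintenance of several machines sharing one technician (sketch).
import random
import simpy

RUN_TIME = 10_000            # simulated hours
MTBF, REPAIR_TIME = 90.0, 8.0

def machine(env, name, technician, downtime):
    while True:
        yield env.timeout(random.expovariate(1.0 / MTBF))   # run until failure
        t_fail = env.now
        with technician.request() as req:                    # queue for repair
            yield req
            yield env.timeout(random.expovariate(1.0 / REPAIR_TIME))
        downtime[name] += env.now - t_fail

random.seed(1)
env = simpy.Environment()
technician = simpy.Resource(env, capacity=1)   # one shared maintenance crew
downtime = {f"M{i}": 0.0 for i in range(3)}
for name in downtime:
    env.process(machine(env, name, technician, downtime))
env.run(until=RUN_TIME)

for name, dt in downtime.items():
    print(f"{name}: availability = {1 - dt / RUN_TIME:.3f}")
```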

  14. A variational approach to chiral quark models

    International Nuclear Information System (INIS)

    Futami, Yasuhiko; Odajima, Yasuhiko; Suzuki, Akira.

    1987-01-01

    A variational approach is applied to a chiral quark model to test the validity of the perturbative treatment of the pion-quark interaction based on the chiral symmetry principle. It is indispensably related to the chiral symmetry breaking radius if the pion-quark interaction can be regarded as a perturbation. (author)

  15. Time series modeling by a regression approach based on a latent process.

    Science.gov (United States)

    Chamroukhi, Faicel; Samé, Allou; Govaert, Gérard; Aknin, Patrice

    2009-01-01

    Time series are used in many domains, including finance, engineering, economics and bioinformatics, generally to represent the change of a measurement over time. Modeling techniques may then be used to give a synthetic representation of such data. A new approach for time series modeling is proposed in this paper. It consists of a regression model incorporating a discrete hidden logistic process that allows different polynomial regression models to be activated smoothly or abruptly. The model parameters are estimated by the maximum likelihood method performed by a dedicated Expectation Maximization (EM) algorithm. The M step of the EM algorithm uses a multi-class Iterative Reweighted Least-Squares (IRLS) algorithm to estimate the hidden process parameters. To evaluate the proposed approach, an experimental study on simulated data and real-world data was performed in comparison with two alternative approaches: a heteroskedastic piecewise regression model using a global optimization algorithm based on dynamic programming, and a Hidden Markov Regression Model whose parameters are estimated by the Baum-Welch algorithm. Finally, in the context of the remote monitoring of components of the French railway infrastructure, and more particularly the switch mechanism, the proposed approach has been applied to modeling and classifying time series representing the condition measurements acquired during switch operations.
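
    A reduced illustration of the underlying idea, assuming NumPy and SciPy: two polynomial regimes blended by a logistic activation in time, fitted here by nonlinear least squares rather than the paper's EM/IRLS estimator. The data, initial guesses and regime forms are assumptions for the sketch.

```python
# Two polynomial regimes with a logistic switch, fitted by least squares.
import numpy as np
from scipy.optimize import curve_fit

def model(t, a0, a1, b0, b1, t0, k):
    w = 1.0 / (1.0 + np.exp(-k * (t - t0)))     # logistic activation of regime 2
    return (1 - w) * (a0 + a1 * t) + w * (b0 + b1 * t)

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 200)
y_true = model(t, 1.0, 0.5, 8.0, -0.3, 5.0, 4.0)
y = y_true + rng.normal(0, 0.2, t.size)

p0 = [0, 0, 0, 0, 5.0, 1.0]                     # rough initial guess
popt, _ = curve_fit(model, t, y, p0=p0, maxfev=20000)
print("estimated switch time t0 =", popt[4])
```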

  16. A discontinuous Galerkin approach for conservative modeling of fully nonlinear and weakly dispersive wave transformations

    Science.gov (United States)

    Sharifian, Mohammad Kazem; Kesserwani, Georges; Hassanzadeh, Yousef

    2018-05-01

    This work extends a robust second-order Runge-Kutta Discontinuous Galerkin (RKDG2) method to solve the fully nonlinear and weakly dispersive flows, within a scope to simultaneously address accuracy, conservativeness, cost-efficiency and practical needs. The mathematical model governing such flows is based on a variant form of the Green-Naghdi (GN) equations decomposed as a hyperbolic shallow water system with an elliptic source term. Practical features of relevance (i.e. conservative modeling over irregular terrain with wetting and drying and local slope limiting) have been restored from an RKDG2 solver to the Nonlinear Shallow Water (NSW) equations, alongside new considerations to integrate elliptic source terms (i.e. via a fourth-order local discretization of the topography) and to enable local capturing of breaking waves (i.e. via adding a detector for switching off the dispersive terms). Numerical results are presented, demonstrating the overall capability of the proposed approach in achieving realistic prediction of nearshore wave processes involving both nonlinearity and dispersion effects within a single model.

  17. A qualitative evaluation approach for energy system modelling frameworks

    DEFF Research Database (Denmark)

    Wiese, Frauke; Hilpert, Simon; Kaldemeyer, Cord

    2018-01-01

    properties define how useful it is in regard to the existing challenges. For energy system models, evaluation methods exist, but we argue that many decisions upon properties are rather made on the model generator or framework level. Thus, this paper presents a qualitative approach to evaluate frameworks...

  18. Systems and context modeling approach to requirements analysis

    Science.gov (United States)

    Ahuja, Amrit; Muralikrishna, G.; Patwari, Puneet; Subhrojyoti, C.; Swaminathan, N.; Vin, Harrick

    2014-08-01

    Ensuring completeness and correctness of the requirements for a complex system such as the SKA is challenging. Current system engineering practice includes developing a stakeholder needs definition, a concept of operations, and defining system requirements in terms of use cases and requirements statements. We present a method that enhances this current practice into a collection of system models with mutual consistency relationships. These include stakeholder goals, needs definition and system-of-interest models, together with a context model that participates in the consistency relationships among these models. We illustrate this approach by using it to analyze the SKA system requirements.

  19. A global sensitivity analysis approach for morphogenesis models

    KAUST Repository

    Boas, Sonja E. M.

    2015-11-21

    Background Morphogenesis is a developmental process in which cells organize into shapes and patterns. Complex, non-linear and multi-factorial models with images as output are commonly used to study morphogenesis. It is difficult to understand the relation between the uncertainty in the input and the output of such ‘black-box’ models, giving rise to the need for sensitivity analysis tools. In this paper, we introduce a workflow for a global sensitivity analysis approach to study the impact of single parameters and the interactions between them on the output of morphogenesis models. Results To demonstrate the workflow, we used a published, well-studied model of vascular morphogenesis. The parameters of this cellular Potts model (CPM) represent cell properties and behaviors that drive the mechanisms of angiogenic sprouting. The global sensitivity analysis correctly identified the dominant parameters in the model, consistent with previous studies. Additionally, the analysis provided information on the relative impact of single parameters and of interactions between them. This is very relevant because interactions of parameters impede the experimental verification of the predicted effect of single parameters. The parameter interactions, although of low impact, provided also new insights in the mechanisms of in silico sprouting. Finally, the analysis indicated that the model could be reduced by one parameter. Conclusions We propose global sensitivity analysis as an alternative approach to study the mechanisms of morphogenesis. Comparison of the ranking of the impact of the model parameters to knowledge derived from experimental data and from manipulation experiments can help to falsify models and to find the operand mechanisms in morphogenesis. The workflow is applicable to all ‘black-box’ models, including high-throughput in vitro models in which output measures are affected by a set of experimental perturbations.

  20. A global sensitivity analysis approach for morphogenesis models.

    Science.gov (United States)

    Boas, Sonja E M; Navarro Jimenez, Maria I; Merks, Roeland M H; Blom, Joke G

    2015-11-21

    Morphogenesis is a developmental process in which cells organize into shapes and patterns. Complex, non-linear and multi-factorial models with images as output are commonly used to study morphogenesis. It is difficult to understand the relation between the uncertainty in the input and the output of such 'black-box' models, giving rise to the need for sensitivity analysis tools. In this paper, we introduce a workflow for a global sensitivity analysis approach to study the impact of single parameters and the interactions between them on the output of morphogenesis models. To demonstrate the workflow, we used a published, well-studied model of vascular morphogenesis. The parameters of this cellular Potts model (CPM) represent cell properties and behaviors that drive the mechanisms of angiogenic sprouting. The global sensitivity analysis correctly identified the dominant parameters in the model, consistent with previous studies. Additionally, the analysis provided information on the relative impact of single parameters and of interactions between them. This is very relevant because interactions of parameters impede the experimental verification of the predicted effect of single parameters. The parameter interactions, although of low impact, provided also new insights in the mechanisms of in silico sprouting. Finally, the analysis indicated that the model could be reduced by one parameter. We propose global sensitivity analysis as an alternative approach to study the mechanisms of morphogenesis. Comparison of the ranking of the impact of the model parameters to knowledge derived from experimental data and from manipulation experiments can help to falsify models and to find the operand mechanisms in morphogenesis. The workflow is applicable to all 'black-box' models, including high-throughput in vitro models in which output measures are affected by a set of experimental perturbations.
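
    A hedged sketch of a variance-based global sensitivity analysis, assuming the SALib package is available. The parameter names and the toy black_box function stand in for the cellular Potts model and its output measure; they are not from the study.

```python
# Sobol sensitivity indices for a toy "black-box" model (illustrative sketch).
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["adhesion", "chemotaxis", "proliferation"],   # illustrative names
    "bounds": [[0.0, 1.0]] * 3,
}

def black_box(x):
    # Stand-in output measure (e.g. number of sprouts); replace with model runs.
    return x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.5 * x[:, 0] * x[:, 2]

X = saltelli.sample(problem, 1024)        # Saltelli sampling design
Y = black_box(X)
Si = sobol.analyze(problem, Y)

for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
    print(f"{name}: first-order = {s1:.2f}, total-order = {st:.2f}")
```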

  1. Vector-model-supported approach in prostate plan optimization

    International Nuclear Information System (INIS)

    Liu, Eva Sau Fan; Wu, Vincent Wing Cheung; Harris, Benjamin; Lehman, Margot; Pryor, David; Chan, Lawrence Wing Chi

    2017-01-01

    Lengthy time consumed in traditional manual plan optimization can limit the use of step-and-shoot intensity-modulated radiotherapy/volumetric-modulated radiotherapy (S&S IMRT/VMAT). A vector model base, retrieving similar radiotherapy cases, was developed with respect to the structural and physiologic features extracted from the Digital Imaging and Communications in Medicine (DICOM) files. Planning parameters were retrieved from the selected similar reference case and applied to the test case to bypass the gradual adjustment of planning parameters. Therefore, the planning time spent on the traditional trial-and-error manual optimization approach in the beginning of optimization could be reduced. Each S&S IMRT/VMAT prostate reference database comprised 100 previously treated cases. Prostate cases were replanned with both traditional optimization and vector-model-supported optimization based on the oncologists' clinical dose prescriptions. A total of 360 plans, which consisted of 30 cases of S&S IMRT, 30 cases of 1-arc VMAT, and 30 cases of 2-arc VMAT plans including first optimization and final optimization with/without vector-model-supported optimization, were compared using the 2-sided t-test and paired Wilcoxon signed rank test, with a significance level of 0.05 and a false discovery rate of less than 0.05. For S&S IMRT, 1-arc VMAT, and 2-arc VMAT prostate plans, there was a significant reduction in the planning time and iteration with vector-model-supported optimization by almost 50%. When the first optimization plans were compared, 2-arc VMAT prostate plans had better plan quality than 1-arc VMAT plans. The volume receiving 35 Gy in the femoral head for 2-arc VMAT plans was reduced with the vector-model-supported optimization compared with the traditional manual optimization approach. Otherwise, the quality of plans from both approaches was comparable. Vector-model-supported optimization was shown to offer much shortened planning time and iteration

  2. Vector-model-supported approach in prostate plan optimization

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Eva Sau Fan [Department of Radiation Oncology, Princess Alexandra Hospital, Brisbane (Australia); Department of Health Technology and Informatics, The Hong Kong Polytechnic University (Hong Kong); Wu, Vincent Wing Cheung [Department of Health Technology and Informatics, The Hong Kong Polytechnic University (Hong Kong); Harris, Benjamin [Department of Radiation Oncology, Princess Alexandra Hospital, Brisbane (Australia); Lehman, Margot; Pryor, David [Department of Radiation Oncology, Princess Alexandra Hospital, Brisbane (Australia); School of Medicine, University of Queensland (Australia); Chan, Lawrence Wing Chi, E-mail: wing.chi.chan@polyu.edu.hk [Department of Health Technology and Informatics, The Hong Kong Polytechnic University (Hong Kong)

    2017-07-01

    Lengthy time consumed in traditional manual plan optimization can limit the use of step-and-shoot intensity-modulated radiotherapy/volumetric-modulated radiotherapy (S&S IMRT/VMAT). A vector model base, retrieving similar radiotherapy cases, was developed with respect to the structural and physiologic features extracted from the Digital Imaging and Communications in Medicine (DICOM) files. Planning parameters were retrieved from the selected similar reference case and applied to the test case to bypass the gradual adjustment of planning parameters. Therefore, the planning time spent on the traditional trial-and-error manual optimization approach in the beginning of optimization could be reduced. Each S&S IMRT/VMAT prostate reference database comprised 100 previously treated cases. Prostate cases were replanned with both traditional optimization and vector-model-supported optimization based on the oncologists' clinical dose prescriptions. A total of 360 plans, which consisted of 30 cases of S&S IMRT, 30 cases of 1-arc VMAT, and 30 cases of 2-arc VMAT plans including first optimization and final optimization with/without vector-model-supported optimization, were compared using the 2-sided t-test and paired Wilcoxon signed rank test, with a significance level of 0.05 and a false discovery rate of less than 0.05. For S&S IMRT, 1-arc VMAT, and 2-arc VMAT prostate plans, there was a significant reduction in the planning time and iteration with vector-model-supported optimization by almost 50%. When the first optimization plans were compared, 2-arc VMAT prostate plans had better plan quality than 1-arc VMAT plans. The volume receiving 35 Gy in the femoral head for 2-arc VMAT plans was reduced with the vector-model-supported optimization compared with the traditional manual optimization approach. Otherwise, the quality of plans from both approaches was comparable. Vector-model-supported optimization was shown to offer much shortened planning time and iteration
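
    An illustrative sketch of the retrieval step described above: represent each previously treated case by a feature vector and pick the most similar reference case by cosine similarity. The feature names and values are hypothetical, not the DICOM-derived features of the study, and the similarity measure is an assumption.

```python
# Nearest-case retrieval by cosine similarity on standardised features (sketch).
import numpy as np

feature_names = ["prostate_vol", "bladder_overlap", "rectum_overlap", "ptv_margin"]
reference_cases = np.array([
    [55.0, 0.12, 0.08, 0.7],
    [42.0, 0.20, 0.15, 0.5],
    [63.0, 0.05, 0.10, 0.8],
])
test_case = np.array([50.0, 0.15, 0.09, 0.6])

def cosine_similarity(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Standardise features so volume does not dominate the similarity score.
mu, sd = reference_cases.mean(axis=0), reference_cases.std(axis=0)
ref_z = (reference_cases - mu) / sd
test_z = (test_case - mu) / sd

scores = [cosine_similarity(r, test_z) for r in ref_z]
best = int(np.argmax(scores))
print("most similar reference case:", best, "score:", scores[best])
```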

  3. Model selection and inference a practical information-theoretic approach

    CERN Document Server

    Burnham, Kenneth P

    1998-01-01

    This book is unique in that it covers the philosophy of model-based data analysis and an omnibus strategy for the analysis of empirical data. The book introduces information theoretic approaches and focuses critical attention on a priori modeling and the selection of a good approximating model that best represents the inference supported by the data. Kullback-Leibler information represents a fundamental quantity in science and is Hirotugu Akaike's basis for model selection. The maximized log-likelihood function can be bias-corrected to provide an estimate of expected, relative Kullback-Leibler information. This leads to Akaike's Information Criterion (AIC) and various extensions; these are relatively simple and easy to use in practice, but little taught in statistics classes and far less understood in the applied sciences than should be the case. The information theoretic approaches provide a unified and rigorous theory, an extension of likelihood theory, an important application of information theory, and are ...
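
    A minimal worked example of AIC-based selection among nested polynomial fits with Gaussian errors, using only the standard formula AIC = 2k - 2 ln L-hat. The synthetic data and candidate models are assumptions for illustration; they are not taken from the book.

```python
# AIC comparison of polynomial models fitted to synthetic data.
import numpy as np

rng = np.random.default_rng(42)
x = np.linspace(-1, 1, 60)
y = 1.0 + 2.0 * x - 1.5 * x**2 + rng.normal(0, 0.3, x.size)

for degree in range(0, 6):
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    n, k = x.size, degree + 2            # coefficients plus error variance
    sigma2 = np.mean(resid**2)           # ML estimate of the error variance
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)
    aic = 2 * k - 2 * loglik
    print(f"degree {degree}: AIC = {aic:.1f}")
```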

  4. Applying the Intervention Model for Fostering Affective Involvement with Persons Who Are Congenitally Deafblind: An Effect Study

    Science.gov (United States)

    Martens, Marga A. W.; Janssen, Marleen J.; Ruijssenaars, Wied A. J. J. M.; Huisman, Mark; Riksen-Walraven, J. Marianne

    2014-01-01

    Introduction: In this study, we applied the Intervention Model for Affective Involvement (IMAI) to four participants who are congenitally deafblind and their 16 communication partners in 3 different settings (school, a daytime activities center, and a group home). We examined whether the intervention increased affective involvement between the…

  5. Novel Harmonic Regularization Approach for Variable Selection in Cox’s Proportional Hazards Model

    Directory of Open Access Journals (Sweden)

    Ge-Jin Chu

    2014-01-01

    Variable selection is an important issue in regression, and a number of variable selection methods have been proposed involving nonconvex penalty functions. In this paper, we investigate a novel harmonic regularization method, which can approximate nonconvex Lq (1/2 < q < 1) penalties, and apply it to variable selection in Cox's proportional hazards model using microarray gene expression data. The harmonic regularization method can be efficiently solved using our proposed direct path seeking approach, which can produce solutions that closely approximate those for the convex loss function and the nonconvex regularization. Simulation results based on artificial datasets and four real microarray gene expression datasets, such as the diffuse large B-cell lymphoma (DLBCL), lung cancer, and AML datasets, show that the harmonic regularization method can be more accurate for variable selection than existing Lasso-type methods.

  6. An integrated approach to permeability modeling using micro-models

    Energy Technology Data Exchange (ETDEWEB)

    Hosseini, A.H.; Leuangthong, O.; Deutsch, C.V. [Society of Petroleum Engineers, Canadian Section, Calgary, AB (Canada)]|[Alberta Univ., Edmonton, AB (Canada)

    2008-10-15

    An important factor in predicting the performance of steam assisted gravity drainage (SAGD) well pairs is the spatial distribution of permeability. Complications that make the inference of a reliable porosity-permeability relationship impossible include the presence of short-scale variability in sand/shale sequences; preferential sampling of core data; and uncertainty in upscaling parameters. Micro-modelling is a simple and effective method for overcoming these complications. This paper proposed a micro-modeling approach to account for sampling bias, small laminated features with high permeability contrast, and uncertainty in upscaling parameters. The paper described the steps and challenges of micro-modeling and discussed the construction of binary mixture geo-blocks; flow simulation and upscaling; extended power law formalism (EPLF); and the application of micro-modeling and EPLF. An extended power-law formalism to account for changes in clean sand permeability as a function of macroscopic shale content was also proposed and tested against flow simulation results. There was close agreement between the model and simulation results. The proposed methodology was also applied to build the porosity-permeability relationship for laminated and brecciated facies of McMurray oil sands. The resulting relationship was in good agreement with the experimental data. 8 refs., 17 figs.

  7. Bridging process-based and empirical approaches to modeling tree growth

    Science.gov (United States)

    Harry T. Valentine; Annikki Makela; Annikki Makela

    2005-01-01

    The gulf between process-based and empirical approaches to modeling tree growth may be bridged, in part, by the use of a common model. To this end, we have formulated a process-based model of tree growth that can be fitted and applied in an empirical mode. The growth model is grounded in pipe model theory and an optimal control model of crown development. Together, the...

  8. An Asset Pricing Approach to Testing General Term Structure Models including Heath-Jarrow-Morton Specifications and Affine Subclasses

    DEFF Research Database (Denmark)

    Christensen, Bent Jesper; van der Wel, Michel

    of the risk premium is associated with the slope factor, and individual risk prices depend on own past values, factor realizations, and past values of other risk prices, and are significantly related to the output gap, consumption, and the equity risk price. The absence of arbitrage opportunities is strongly...... is tested, but in addition to the standard bilinear term in factor loadings and market prices of risk, the relevant mean restriction in the term structure case involves an additional nonlinear (quadratic) term in factor loadings. We estimate our general model using likelihood-based dynamic factor model...... techniques for a variety of volatility factors, and implement the relevant likelihood ratio tests. Our factor model estimates are similar across a general state space implementation and an alternative robust two-step principal components approach. The evidence favors time-varying market prices of risk. Most...

  9. Policy harmonized approach for the EU agricultural sector modelling

    Directory of Open Access Journals (Sweden)

    G. SALPUTRA

    2008-12-01

    The policy harmonized (PH) approach allows for the quantitative assessment of the impact of various elements of EU CAP direct support schemes, where the production effects of direct payments are accounted for through reaction prices formed by the producer price and policy price add-ons. Using the AGMEMOD model, the impacts of two possible EU agricultural policy scenarios on beef production have been analysed – full decoupling with a switch from the historical to the regional Single Payment scheme, or alternatively with re-distribution of country direct payment envelopes via the introduction of an EU-wide flat area payment. The PH approach, by systematizing and harmonizing the management and use of policy data, ensures that projected differential policy impacts arising from changes in common EU policies reflect the likely actual differential impact, as opposed to differences in how "common" policies are implemented within analytical models. In the second section of the paper the AGMEMOD model's structure is explained. The policy harmonized evaluation method is presented in the third section. Results from an application of the PH approach are presented and discussed in the paper's penultimate section, while section 5 concludes.

  10. Initial assessment of a multi-model approach to spring flood forecasting in Sweden

    Science.gov (United States)

    Olsson, J.; Uvo, C. B.; Foster, K.; Yang, W.

    2015-06-01

    Hydropower is a major energy source in Sweden, and proper reservoir management prior to the spring flood onset is crucial for optimal production. This requires useful forecasts of the accumulated discharge in the spring flood period (i.e. the spring-flood volume, SFV). Today's SFV forecasts are generated using a model-based climatological ensemble approach, where time series of precipitation and temperature from historical years are used to force a calibrated and initialised set-up of the HBV model. In this study, a number of new approaches to spring flood forecasting that reflect the latest developments with respect to analysis and modelling on seasonal time scales are presented and evaluated. Three main approaches, represented by specific methods, are evaluated in SFV hindcasts for three main Swedish rivers over a 10-year period with lead times between 0 and 4 months. In the first approach, historically analogue years with respect to the climate in the period preceding the spring flood are identified and used to compose a reduced ensemble. In the second, seasonal meteorological ensemble forecasts are used to drive the HBV model over the spring flood period. In the third approach, statistical relationships between SFV and the large-scale atmospheric circulation are used to build forecast models. None of the new approaches consistently outperform the climatological ensemble approach, but for specific locations and lead times improvements of 20-30 % are found. When combining all forecasts in a weighted multi-model approach, a mean improvement over all locations and lead times of nearly 10 % was indicated. This demonstrates the potential of the approach, and further development and optimisation into an operational system is ongoing.
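
    A minimal sketch of one way to form a weighted multi-model combination: weight each approach by its inverse historical error. The weighting rule, error values and forecast numbers are assumptions for illustration; the study's actual weighting scheme is not specified here.

```python
# Inverse-error weighting of several spring-flood-volume forecasts (sketch).
import numpy as np

# Hindcast errors (e.g. mean absolute error of SFV, in %) of three approaches.
hindcast_mae = np.array([18.0, 22.0, 15.0])
weights = 1.0 / hindcast_mae
weights /= weights.sum()

# Current-year SFV forecasts (million m^3) from the same three approaches.
forecasts = np.array([950.0, 1020.0, 900.0])
print("weighted multi-model SFV forecast:", float(weights @ forecasts))
```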

  11. Model-independent approach for dark matter phenomenology

    Indian Academy of Sciences (India)

    We have studied the phenomenology of dark matter at the ILC and cosmic positron experiments based on model-independent approach. We have found a strong correlation between dark matter signatures at the ILC and those in the indirect detection experiments of dark matter. Once the dark matter is discovered in the ...

  12. Model-independent approach for dark matter phenomenology ...

    Indian Academy of Sciences (India)

    Abstract. We have studied the phenomenology of dark matter at the ILC and cosmic positron experiments based on model-independent approach. We have found a strong correlation between dark matter signatures at the ILC and those in the indirect detection experiments of dark matter. Once the dark matter is discovered ...

  13. A new modelling approach for zooplankton behaviour

    Science.gov (United States)

    Keiyu, A. Y.; Yamazaki, H.; Strickler, J. R.

    We have developed a new simulation technique to model zooplankton behaviour. The approach utilizes neither conventional artificial intelligence nor neural network methods. We have designed an adaptive behaviour network, similar to that of Beer [(1990) Intelligence as an adaptive behaviour: an experiment in computational neuroethology, Academic Press], based on observational studies of zooplankton behaviour. The proposed method is compared with non-"intelligent" models—random walk and correlated walk models—as well as with observed behaviour in a laboratory tank. Although the network is simple, the model exhibits rich behavioural patterns similar to those of live copepods.

  14. Practice-Informed Approaches to Addressing Substance Abuse and Trauma Exposure in Urban Native Families Involved with Child Welfare.

    Science.gov (United States)

    Lucero, Nancy M; Bussey, Marian

    2015-01-01

    Similar to families from other groups, urban-based American Indian and Alaska Native ("Native") family members involved with the child welfare system due to substance abuse issues are also often challenged by untreated trauma exposure. The link between these conditions and the history of genocidal policies aimed at destroying Native family ties, as well as experiences of ongoing discrimination, brings added dimensions for consideration when providing services to these families. Practice-based evidence indicates that the trauma-informed and culturally responsive model developed by the Denver Indian Family Resource Center (DIFRC) shows promise in reducing out-of-home placements and re-referrals in urban Native families with substance abuse and child welfare concerns, while also increasing caregiver capabilities, family safety, and child well-being. This article provides strategies from the DIFRC approach that non-Native caseworkers and supervisors can utilize to create an environment in their own agencies that supports culturally based practice with Native families while incorporating a trauma-informed understanding of the service needs of these families. Casework consistent with this approach demonstrates actions that meet the Active Efforts requirement of the Indian Child Welfare Act (ICWA) as well as sound clinical practice. Intensive and proactive case management designed specifically for families with high levels of service needs is a key strategy when combined with utilizing a caseworker brief screening tool for trauma exposure; training caseworkers to recognize trauma symptoms; making timely referrals to trauma treatment by behavioral health specialists experienced in working with Native clients; and providing a consistent service environment that focuses on client safety and worker trustworthiness. Finally, suggestions are put forth for agencies seeking to enhance their cultural responsiveness and include increasing workers' understanding of cultural values

  15. HEDR modeling approach: Revision 1

    International Nuclear Information System (INIS)

    Shipler, D.B.; Napier, B.A.

    1994-05-01

    This report is a revision of the previous Hanford Environmental Dose Reconstruction (HEDR) Project modeling approach report. This revised report describes the methods used in performing scoping studies and estimating final radiation doses to real and representative individuals who lived in the vicinity of the Hanford Site. The scoping studies and dose estimates pertain to various environmental pathways during various periods of time. The original report discussed the concepts under consideration in 1991. The methods for estimating dose have been refined as understanding of existing data, the scope of pathways, and the magnitudes of dose estimates were evaluated through scoping studies

  16. Crime Modeling using Spatial Regression Approach

    Science.gov (United States)

    Saleh Ahmar, Ansari; Adiatma; Kasim Aidid, M.

    2018-01-01

    Acts of criminality in Indonesia have increased in both variety and quantity every year, including murder, rape, assault, vandalism, theft, fraud, fencing, and other cases that make people feel unsafe. The risk of society being exposed to crime is measured by the number of cases reported to the police; the higher the number of reports, the higher the level of crime in the region. In this research, criminality in South Sulawesi, Indonesia is modelled with society's exposure to the risk of crime as the dependent variable. Modelling is carried out with an areal approach using the Spatial Autoregressive (SAR) and Spatial Error Model (SEM) methods. The independent variables used are population density, the number of poor residents, GDP per capita, unemployment and the human development index (HDI). The spatial regression analysis shows that there are no spatial dependencies, in either the lag or the errors, in South Sulawesi.
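
    A hand-rolled sketch of Moran's I, a quick diagnostic for the kind of spatial dependence the study tests for; the regions, residuals and contiguity matrix below are toy assumptions. Fitting the SAR/SEM models themselves would normally be done with a dedicated spatial-econometrics package.

```python
# Moran's I for checking spatial dependence in residuals (illustrative sketch).
import numpy as np

def morans_i(values, W):
    """values: (n,) attribute or residual vector; W: (n, n) spatial weights."""
    z = values - values.mean()
    W = W / W.sum()                      # normalise weights so they sum to 1
    num = np.sum(W * np.outer(z, z))
    den = np.sum(z**2) / z.size
    return num / den

# Toy example: 5 regions with a simple contiguity (neighbour) matrix.
residuals = np.array([0.3, -0.1, 0.2, -0.4, 0.1])
W = np.array([[0, 1, 0, 0, 1],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [1, 0, 0, 1, 0]], dtype=float)

print("Moran's I =", morans_i(residuals, W))   # values near 0 suggest no dependence
```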

  17. The trilayer approach of teaching physiology, pathophysiology, and pharmacology concepts in a first-year pharmacy course: the TLAT model.

    Science.gov (United States)

    Islam, Mohammed A; Sabnis, Gauri; Farris, Fred

    2017-09-01

    This paper describes the development, implementation, and students' perceptions of a new trilayer approach of teaching (TLAT). The TLAT model involved blending lecture, in-class group activities, and out-of-class assignments on selected content areas and was implemented initially in a first-year integrated pharmacy course. Course contents were delivered either by traditional lectures or by the TLAT. A survey instrument was distributed by SurveyMonkey to determine students' perceptions of the TLAT model. Descriptive statistics were used for data analysis. Students' performance in a total of 225 examination and quiz questions was analyzed to evaluate whether the TLAT model improved students' learning. Students' (n = 98) performance scores for TLAT-based and lecture-based questions were 83.3 ± 10.2 and 79.5 ± 14.0, respectively.

  18. Approach to reliability assessment

    International Nuclear Information System (INIS)

    Green, A.E.; Bourne, A.J.

    1975-01-01

    Experience has shown that reliability assessments can play an important role in the early design and subsequent operation of technological systems where reliability is at a premium. The approaches to and techniques for such assessments, which are outlined in the paper, have been successfully applied in a variety of applications ranging from individual equipment to large and complex systems. The general approach involves the logical and systematic establishment of the purpose, performance requirements and reliability criteria of systems. This is followed by an appraisal of likely system achievement based on an understanding of different types of variational behavior. A fundamental reliability model emerges from the correlation between the appropriate Q and H functions for performance requirement and achievement. This model may cover the complete spectrum of performance behavior in all the system dimensions

  19. Development of flexible process-centric web applications: An integrated model driven approach

    NARCIS (Netherlands)

    Bernardi, M.L.; Cimitile, M.; Di Lucca, G.A.; Maggi, F.M.

    2012-01-01

    In recent years, Model Driven Engineering (MDE) approaches have been proposed and used to develop and evolve WAs. However, the definition of appropriate MDE approaches for the development of flexible process-centric WAs is still limited. In particular, (flexible) workflow models have never been

  20. A Modelling Approach for Improved Implementation of Information Technology in Manufacturing Systems

    DEFF Research Database (Denmark)

    Langer, Gilad; Larsen, Michael Holm; Kirkby, Lars Phillip

    1997-01-01

    The paper presents a modelling approach which is based on the multiple view perspective of Soft Systems Methodology and an encapsulation of these perspectives into an object orientated model. The approach provide a structured procedure for putting theoretical abstractions of a new production conc...

  1. Mechatronics by bond graphs an object-oriented approach to modelling and simulation

    CERN Document Server

    Damić, Vjekoslav

    2015-01-01

    This book presents a computer-aided approach to the design of mechatronic systems. Its subject is integrated modeling and simulation in a visual computer environment. Since the first edition, the simulation software has changed enormously, becoming more user-friendly and easier to use, so a second edition became necessary to take these improvements into account. The modeling is based on a combined top-down and bottom-up system approach. The mathematical models are generated in the form of differential-algebraic equations and solved using numerical and symbolic algebra methods. The integrated approach developed is applied to mechanical, electrical and control systems, multibody dynamics, and continuous systems.

  2. A Simple Approach to Account for Climate Model Interdependence in Multi-Model Ensembles

    Science.gov (United States)

    Herger, N.; Abramowitz, G.; Angelil, O. M.; Knutti, R.; Sanderson, B.

    2016-12-01

    Multi-model ensembles are an indispensable tool for future climate projection and its uncertainty quantification. Ensembles containing multiple climate models generally have increased skill, consistency and reliability. Due to the lack of agreed-on alternatives, most scientists use the equally-weighted multi-model mean as they subscribe to model democracy ("one model, one vote"). Different research groups are known to share sections of code, parameterizations in their model, literature, or even whole model components. Therefore, individual model runs do not represent truly independent estimates. Ignoring this dependence structure might lead to a false model consensus, wrong estimation of uncertainty and effective number of independent models. Here, we present a way to partially address this problem by selecting a subset of CMIP5 model runs so that its climatological mean minimizes the RMSE compared to a given observation product. Due to the cancelling out of errors, regional biases in the ensemble mean are reduced significantly. Using a model-as-truth experiment we demonstrate that those regional biases persist into the future and we are not fitting noise, thus providing improved observationally-constrained projections of the 21st century. The optimally selected ensemble shows significantly higher global mean surface temperature projections than the original ensemble, where all the model runs are considered. Moreover, the spread is decreased well beyond that expected from the decreased ensemble size. Several previous studies have recommended an ensemble selection approach based on performance ranking of the model runs. Here, we show that this approach can perform even worse than randomly selecting ensemble members and can thus be harmful. We suggest that accounting for interdependence in the ensemble selection process is a necessary step for robust projections for use in impact assessments, adaptation and mitigation of climate change.
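
    A toy sketch of the subset-selection idea: choose the k ensemble members whose mean climatology minimises RMSE against an observation field. The member fields and observations below are synthetic assumptions, and the brute-force search is an illustration rather than the study's actual selection procedure.

```python
# Brute-force selection of an ensemble subset minimising RMSE to observations.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n_models, n_grid, k = 8, 500, 3
obs = rng.normal(0, 1, n_grid)                             # "observed" climatology
members = obs + rng.normal(0.3, 0.8, (n_models, n_grid))   # biased model runs

def rmse(a, b):
    return np.sqrt(np.mean((a - b) ** 2))

best = min(itertools.combinations(range(n_models), k),
           key=lambda idx: rmse(members[list(idx)].mean(axis=0), obs))

print("selected members:", best)
print("RMSE of selected-subset mean:", rmse(members[list(best)].mean(axis=0), obs))
print("RMSE of full-ensemble mean  :", rmse(members.mean(axis=0), obs))
```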

  3. A fuzzy-logic-based approach to qualitative safety modelling for marine systems

    International Nuclear Information System (INIS)

    Sii, H.S.; Ruxton, Tom; Wang Jin

    2001-01-01

    Safety assessment based on conventional tools (e.g. probability risk assessment (PRA)) may not be well suited for dealing with systems having a high level of uncertainty, particularly in the feasibility and concept design stages of a maritime or offshore system. By contrast, a safety model using fuzzy logic approach employing fuzzy IF-THEN rules can model the qualitative aspects of human knowledge and reasoning processes without employing precise quantitative analyses. A fuzzy-logic-based approach may be more appropriately used to carry out risk analysis in the initial design stages. This provides a tool for working directly with the linguistic terms commonly used in carrying out safety assessment. This research focuses on the development and representation of linguistic variables to model risk levels subjectively. These variables are then quantified using fuzzy sets. In this paper, the development of a safety model using fuzzy logic approach for modelling various design variables for maritime and offshore safety based decision making in the concept design stage is presented. An example is used to illustrate the proposed approach

  4. Classical Michaelis-Menten and system theory approach to modeling metabolite formation kinetics.

    Science.gov (United States)

    Popović, Jovan

    2004-01-01

    For single drug doses with linear kinetics, techniques based on the compartment approach and on the linear system theory approach are proposed for modeling the formation of a metabolite from the parent drug. Unlike the purpose-specific compartment approach, the linear system approach is characterized by methodical, conceptual and computational uniformity in modeling various linear biomedical systems. Saturation of the metabolic reaction results in nonlinear kinetics according to the Michaelis-Menten equation. The two-compartment open model with Michaelis-Menten elimination kinetics is the theoretical basis when single doses of drug are administered. To simulate data or to fit real data using this model, one must resort to numerical integration. A biomathematical model for multiple-dosage-regimen calculations in nonlinear metabolic systems at steady state and a working example with phenytoin are presented. A high correlation was found between phenytoin steady-state serum levels calculated from individual Km and Vmax values in 15 adult epileptic outpatients and the levels observed at the third adjustment of the phenytoin daily dose (r=0.961, p<0.01).
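
    For a drug with Michaelis-Menten elimination, the standard steady-state relation is Css = Km·R/(Vmax − R), where R is the daily dose rate; inverting it gives the dose needed for a target concentration. The sketch below applies these textbook formulas with assumed Km and Vmax values, not the individual patient estimates reported in the study.

```python
# Sketch of steady-state dose/concentration calculations for a drug with
# Michaelis-Menten elimination (e.g. phenytoin). The Km and Vmax values below
# are illustrative assumptions, not the study's individual patient estimates.

def css_at_dose(dose_rate, vmax, km):
    """Steady-state concentration (mg/L) for a daily dose rate (mg/day).
    Valid only when dose_rate < vmax; otherwise no steady state exists."""
    if dose_rate >= vmax:
        raise ValueError("Daily dose meets or exceeds Vmax: no steady state.")
    return km * dose_rate / (vmax - dose_rate)

def dose_for_target(css_target, vmax, km):
    """Daily dose (mg/day) needed to reach a target steady-state concentration."""
    return vmax * css_target / (km + css_target)

vmax, km = 500.0, 5.0                      # mg/day, mg/L (assumed values)
print(css_at_dose(300.0, vmax, km))        # ~7.5 mg/L at 300 mg/day
print(dose_for_target(15.0, vmax, km))     # ~375 mg/day for a 15 mg/L target
```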

  5. Tornadoes and related damage costs: statistical modeling with a semi-Markov approach

    OpenAIRE

    Corini, Chiara; D'Amico, Guglielmo; Petroni, Filippo; Prattico, Flavio; Manca, Raimondo

    2015-01-01

    We propose a statistical approach to tornado modeling for predicting and simulating occurrences of tornadoes and accumulated cost distributions over a time interval. This is achieved by modeling tornado intensity, measured with the Fujita scale, as a stochastic process. Since the Fujita scale divides tornado intensity into six states, the intensity can be modeled using Markov and semi-Markov models. We demonstrate that the semi-Markov approach is able to reprod...
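
    As a rough illustration of the semi-Markov idea (and not the model fitted in the paper), the sketch below simulates a sequence of Fujita-scale states with an embedded transition matrix and exponentially distributed, state-dependent waiting times, and accumulates an illustrative damage cost; every number in it is an assumption.

```python
# Minimal semi-Markov sketch of tornado intensity on the Fujita scale (F0-F5).
# The transition matrix, waiting-time means and cost table are purely
# illustrative assumptions, not parameters estimated in the cited paper.
import random

states = [0, 1, 2, 3, 4, 5]                       # Fujita scale F0..F5
P = [[0.55, 0.25, 0.10, 0.05, 0.03, 0.02],        # embedded transition probabilities
     [0.40, 0.30, 0.15, 0.08, 0.05, 0.02],
     [0.35, 0.30, 0.18, 0.10, 0.05, 0.02],
     [0.30, 0.30, 0.20, 0.12, 0.06, 0.02],
     [0.30, 0.28, 0.20, 0.12, 0.07, 0.03],
     [0.28, 0.27, 0.20, 0.13, 0.08, 0.04]]
mean_wait = [5, 8, 12, 20, 40, 90]                # mean days to the next event, by current state
cost = [0.01, 0.1, 1, 10, 50, 200]                # damage cost per event (arbitrary units)

def simulate(horizon_days, state=0, seed=1):
    random.seed(seed)
    t, total_cost = 0.0, 0.0
    while True:
        t += random.expovariate(1.0 / mean_wait[state])   # semi-Markov holding time
        if t > horizon_days:
            return total_cost
        state = random.choices(states, weights=P[state])[0]
        total_cost += cost[state]

print(simulate(365.0))   # accumulated cost over one year for one sample path
```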

  6. A novel approach for runoff modelling in ungauged catchments by Catchment Morphing

    Science.gov (United States)

    Zhang, J.; Han, D.

    2017-12-01

    Runoff prediction in ungauged catchments has been one of the major challenges in the past decades. However, due to the tremendous heterogeneity of hydrological catchments, it is difficult to deduce model parameters for ungauged catchments from gauged ones. We propose a novel approach to predicting runoff in ungauged catchments with Catchment Morphing (CM) using a fully distributed model. CM is defined as changing the catchment characteristics (here, area and slope) of a baseline model built for a gauged catchment in order to model ungauged ones. The advantages of CM are: (a) a weaker requirement for similarity between the baseline catchment and the ungauged catchment, (b) a lower demand for available data, and (c) potential applicability to varied catchments. A case study on seven catchments in the UK has been used to demonstrate the proposed scheme. To comprehensively examine the CM approach, distributed rainfall inputs are utilised in the model, and fractal landscapes are used to morph the land surface from the baseline model to the target model. The preliminary results demonstrate the feasibility of the approach, which is promising for runoff simulation in ungauged catchments. Clearly, more work beyond this pilot study is needed to explore and develop this new approach further to maturity by the hydrological community.

  7. A Poisson hierarchical modelling approach to detecting copy number variation in sequence coverage data

    KAUST Repository

    Sepúlveda, Nuno

    2013-02-26

    Background: The advent of next generation sequencing technology has accelerated efforts to map and catalogue copy number variation (CNV) in genomes of important micro-organisms for public health. A typical analysis of the sequence data involves mapping reads onto a reference genome, calculating the respective coverage, and detecting regions with too-low or too-high coverage (deletions and amplifications, respectively). Current CNV detection methods rely on statistical assumptions (e.g., a Poisson model) that may not hold in general, or require fine-tuning the underlying algorithms to detect known hits. We propose a new CNV detection methodology based on two Poisson hierarchical models, the Poisson-Gamma and Poisson-Lognormal, with the advantage of being sufficiently flexible to describe different data patterns, whilst robust against deviations from the often assumed Poisson model. Results: Using sequence coverage data of 7 Plasmodium falciparum malaria genomes (3D7 reference strain, HB3, DD2, 7G8, GB4, OX005, and OX006), we showed that empirical coverage distributions are intrinsically asymmetric and overdispersed in relation to the Poisson model. We also demonstrated a low baseline false positive rate for the proposed methodology using 3D7 resequencing data and simulation. When applied to the non-reference isolate data, our approach detected known CNV hits, including an amplification of the PfMDR1 locus in DD2 and a large deletion in the CLAG3.2 gene in GB4, and putative novel CNV regions. When compared to the recently available FREEC and cn.MOPS approaches, our findings were more concordant with putative hits from the highest quality array data for the 7G8 and GB4 isolates. Conclusions: In summary, the proposed methodology brings an increase in flexibility, robustness, accuracy and statistical rigour to CNV detection using sequence coverage data. © 2013 Sepúlveda et al.; licensee BioMed Central Ltd.
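
    A much-simplified stand-in for the Poisson-Gamma idea is sketched below: coverage per genomic window is treated as negative binomial (the Poisson-Gamma mixture), its parameters are fitted by the method of moments rather than the paper's hierarchical Bayesian machinery, and windows in the extreme tails are flagged as candidate deletions or amplifications. The thresholds and simulated data are assumptions for illustration only.

```python
# Sketch of a Poisson-Gamma (negative binomial) screen for copy number variants:
# fit the genome-wide coverage distribution by the method of moments, then flag
# windows in the extreme tails as candidate deletions/amplifications.
import numpy as np
from scipy.stats import nbinom

def flag_cnv_windows(coverage, alpha=1e-3):
    """coverage: 1-D array of read counts per genomic window."""
    cov = np.asarray(coverage, dtype=float)
    mean, var = cov.mean(), cov.var()
    if var <= mean:                       # no overdispersion: near-Poisson fallback
        var = mean * 1.001
    p = mean / var                        # method-of-moments NB parameters
    n = mean * mean / (var - mean)
    lo = nbinom.ppf(alpha, n, p)          # deletion threshold
    hi = nbinom.ppf(1.0 - alpha, n, p)    # amplification threshold
    return np.where(cov < lo)[0], np.where(cov > hi)[0]

rng = np.random.default_rng(0)
cov = rng.negative_binomial(20, 0.2, size=10_000)   # simulated background coverage
cov[500:520] *= 3                                    # an artificial amplification
deletions, amplifications = flag_cnv_windows(cov)
print(len(deletions), len(amplifications))
```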

  8. A Poisson hierarchical modelling approach to detecting copy number variation in sequence coverage data.

    Science.gov (United States)

    Sepúlveda, Nuno; Campino, Susana G; Assefa, Samuel A; Sutherland, Colin J; Pain, Arnab; Clark, Taane G

    2013-02-26

    The advent of next generation sequencing technology has accelerated efforts to map and catalogue copy number variation (CNV) in genomes of important micro-organisms for public health. A typical analysis of the sequence data involves mapping reads onto a reference genome, calculating the respective coverage, and detecting regions with too-low or too-high coverage (deletions and amplifications, respectively). Current CNV detection methods rely on statistical assumptions (e.g., a Poisson model) that may not hold in general, or require fine-tuning the underlying algorithms to detect known hits. We propose a new CNV detection methodology based on two Poisson hierarchical models, the Poisson-Gamma and Poisson-Lognormal, with the advantage of being sufficiently flexible to describe different data patterns, whilst robust against deviations from the often assumed Poisson model. Using sequence coverage data of 7 Plasmodium falciparum malaria genomes (3D7 reference strain, HB3, DD2, 7G8, GB4, OX005, and OX006), we showed that empirical coverage distributions are intrinsically asymmetric and overdispersed in relation to the Poisson model. We also demonstrated a low baseline false positive rate for the proposed methodology using 3D7 resequencing data and simulation. When applied to the non-reference isolate data, our approach detected known CNV hits, including an amplification of the PfMDR1 locus in DD2 and a large deletion in the CLAG3.2 gene in GB4, and putative novel CNV regions. When compared to the recently available FREEC and cn.MOPS approaches, our findings were more concordant with putative hits from the highest quality array data for the 7G8 and GB4 isolates. In summary, the proposed methodology brings an increase in flexibility, robustness, accuracy and statistical rigour to CNV detection using sequence coverage data.

  9. A Poisson hierarchical modelling approach to detecting copy number variation in sequence coverage data

    KAUST Repository

    Sepúlveda, Nuno; Campino, Susana G; Assefa, Samuel A; Sutherland, Colin J; Pain, Arnab; Clark, Taane G

    2013-01-01

    Background: The advent of next generation sequencing technology has accelerated efforts to map and catalogue copy number variation (CNV) in genomes of important micro-organisms for public health. A typical analysis of the sequence data involves mapping reads onto a reference genome, calculating the respective coverage, and detecting regions with too-low or too-high coverage (deletions and amplifications, respectively). Current CNV detection methods rely on statistical assumptions (e.g., a Poisson model) that may not hold in general, or require fine-tuning the underlying algorithms to detect known hits. We propose a new CNV detection methodology based on two Poisson hierarchical models, the Poisson-Gamma and Poisson-Lognormal, with the advantage of being sufficiently flexible to describe different data patterns, whilst robust against deviations from the often assumed Poisson model. Results: Using sequence coverage data of 7 Plasmodium falciparum malaria genomes (3D7 reference strain, HB3, DD2, 7G8, GB4, OX005, and OX006), we showed that empirical coverage distributions are intrinsically asymmetric and overdispersed in relation to the Poisson model. We also demonstrated a low baseline false positive rate for the proposed methodology using 3D7 resequencing data and simulation. When applied to the non-reference isolate data, our approach detected known CNV hits, including an amplification of the PfMDR1 locus in DD2 and a large deletion in the CLAG3.2 gene in GB4, and putative novel CNV regions. When compared to the recently available FREEC and cn.MOPS approaches, our findings were more concordant with putative hits from the highest quality array data for the 7G8 and GB4 isolates. Conclusions: In summary, the proposed methodology brings an increase in flexibility, robustness, accuracy and statistical rigour to CNV detection using sequence coverage data. © 2013 Sepúlveda et al.; licensee BioMed Central Ltd.

  10. Study on the systematic approach of Markov modeling for dependability analysis of complex fault-tolerant features with voting logics

    International Nuclear Information System (INIS)

    Son, Kwang Seop; Kim, Dong Hoon; Kim, Chang Hwoi; Kang, Hyun Gook

    2016-01-01

    The Markov analysis is a technique for modeling system state transitions and calculating the probability of reaching various system states. While it is a proper tool for modeling complex system designs involving timing, sequencing, repair, redundancy, and fault tolerance, as the complexity or size of the system increases, so does the number of states of interest, leading to difficulty in constructing and solving the Markov model. This paper introduces a systematic Markov modeling approach for analyzing the dependability of a complex fault-tolerant system. The method is based on decomposing the system into independent subsystem sets and on the system-level failure rate and unavailability rate of the decomposed subsystems. A Markov model for the target system is easily constructed using the system-level failure and unavailability rates for the subsystems, which can be treated separately. This approach can decrease the number of states to be considered simultaneously in the target system by building Markov models of the independent subsystems stage by stage, and it results in an exact solution for the Markov model of the whole target system. To apply this method we construct a Markov model for the reactor protection system found in nuclear power plants, a system configured with four identical channels and various fault-tolerant architectures. The results show that the proposed method treats the complex architecture of the system in an efficient manner, using the merits of the Markov model such as time-dependent analysis and sequential process analysis. - Highlights: • A systematic approach of Markov modeling for system dependability analysis is proposed, based on the independent subsystem set, its failure rate and unavailability rate. • As an application example, we construct the Markov model for the digital reactor protection system configured with four identical and independent channels, and various fault-tolerant architectures. • The
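
    The core of the decomposition idea can be illustrated with the smallest possible building blocks: each channel reduced to a two-state (up/down) Markov model whose steady-state unavailability is λ/(λ+μ), and the channels then combined through a voting logic. The failure and repair rates and the 2-out-of-4 logic in the sketch below are assumptions, not the plant data or architectures analyzed in the paper.

```python
# Sketch of the subsystem-decomposition idea: each channel is reduced to a
# two-state Markov model (up/down) whose steady-state unavailability is
# q = lambda / (lambda + mu); channels are then combined through a voting
# logic. Rates and the 2-out-of-4 logic are assumed for illustration.
from math import comb

def channel_unavailability(fail_rate, repair_rate):
    return fail_rate / (fail_rate + repair_rate)

def k_out_of_n_unavailability(q, k, n):
    """System is unavailable when fewer than k of n identical channels are up."""
    return sum(comb(n, up) * (1 - q) ** up * q ** (n - up) for up in range(0, k))

q = channel_unavailability(fail_rate=1e-4, repair_rate=1e-1)   # per-hour rates (assumed)
print(q)                                        # single-channel unavailability
print(k_out_of_n_unavailability(q, k=2, n=4))   # 2-out-of-4 voting system unavailability
```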

  11. Micro-simulation of vehicle conflicts involving right-turn vehicles at signalized intersections based on cellular automata.

    Science.gov (United States)

    Chai, C; Wong, Y D

    2014-02-01

    At intersections, vehicles coming from different directions conflict with each other. Improper geometric design and signal settings at a signalized intersection increase the occurrence of conflicts between road users and reduce the safety level. This study established a cellular automata (CA) model to simulate vehicular interactions involving right-turn vehicles (similar to left-turn vehicles in the US). Through various simulation scenarios for four case cross-intersections, the relationships between conflict occurrences involving right-turn vehicles and traffic volume and right-turn movement control strategies are analyzed. The impacts of traffic volume, permissive right turns compared to red-amber-green (RAG) arrows, shared straight-through and right-turn lanes, as well as signal settings, are estimated from the simulation results. The simulation model is found to provide a reasonable assessment of conflicts, as shown by comparison with an existing simulation approach and with observed accidents. Through the proposed approach, prediction models for the occurrence and severity of vehicle conflicts can be developed for various geometric layouts and traffic control strategies. Copyright © 2013 Elsevier Ltd. All rights reserved.
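
    The sketch below shows only the generic single-lane cellular-automaton update (in the spirit of Nagel-Schreckenberg) that such intersection models build on; the intersection geometry, right-turn phases and conflict detection of the study are not reproduced, and all parameters are illustrative.

```python
# Minimal single-lane cellular automaton update (Nagel-Schreckenberg style),
# shown only to illustrate the CA machinery such a conflict model builds on.
import random

def step(cells, vmax=3, p_slow=0.2):
    """cells: list where -1 is empty and >=0 is a vehicle's current speed."""
    n = len(cells)
    new = [-1] * n
    for i, v in enumerate(cells):
        if v < 0:
            continue
        # distance (in cells) to the next vehicle ahead on the ring road
        gap = next(d for d in range(1, n + 1) if cells[(i + d) % n] >= 0) - 1
        v = min(v + 1, vmax, gap)                   # accelerate, but keep a safe gap
        if v > 0 and random.random() < p_slow:      # random slowdown
            v -= 1
        new[(i + v) % n] = v
    return new

road = [-1] * 30
for pos in (0, 5, 9, 14, 22):                       # five vehicles, initially stopped
    road[pos] = 0
for _ in range(10):
    road = step(road)
print(["." if v < 0 else str(v) for v in road])
```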

  12. What do lay people want to know about the disposal of nuclear waste? A mental model approach to the design and development of an online risk communication.

    Science.gov (United States)

    Skarlatidou, A; Cheng, T; Haklay, M

    2012-09-01

    Public participation requires the involvement of lay people in the decision-making processes of issues that concern them. It is currently practiced in a variety of domains, such as transport and environmental planning. Communicating risks can be a complex task, as there may be significant differences between the risk perceptions of experts and those of lay people. Among the plethora of problems that require public involvement is the site selection of a nuclear waste disposal site in the United Kingdom, which is discussed in this article. Previous ineffective attempts to locate a site provide evidence that the problem has a strong social dimension, and studies ascribe public opposition to a loss of public trust in governmental agencies and decisionmakers, and to a lack of public understanding of nuclear waste issues. Although the mental models approach has been successfully used in the effective communication of such risks as climate change, no attempt has been made to follow a prescriptive mental model approach to develop risk communication messages that inform lay people about nuclear waste disposal. After interviewing 20 lay people and 5 experts, we construct and compare their corresponding mental models to reveal any gaps and misconceptions. The mental models approach is further applied here to identify lay people's requirements regarding what they want to know about nuclear waste, and how this information should be presented so that it is easily understood. This article further describes how the mental models approach was used in the subsequent development of an online information system for the site selection of a nuclear waste repository in the United Kingdom, which is considered essential for the improvement of public understanding and the reestablishment of trust. © 2012 Society for Risk Analysis.

  13. Modeling of phase equilibria with CPA using the homomorph approach

    DEFF Research Database (Denmark)

    Breil, Martin Peter; Tsivintzelis, Ioannis; Kontogeorgis, Georgios

    2011-01-01

    For association models, like CPA and SAFT, a classical approach is often used for estimating pure-compound and mixture parameters. According to this approach, the pure-compound parameters are estimated from vapor pressure and liquid density data. Then, the binary interaction parameters, kij, are ...

  14. An information theory-based approach to modeling the information processing of NPP operators

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Seong, Poong Hyun

    2002-01-01

    This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a given control task. The focus is on i) developing a model of the information processing of NPP operators and ii) quantifying the model. To resolve the problems of previous information-theory-based approaches, i.e. the problems of single-channel approaches, we first develop an information processing model with multiple stages, which contains information flows. The uncertainty of the information is then quantified using Conant's model, an information-theoretic model.

  15. Approaches to modeling landscape-scale drought-induced forest mortality

    Science.gov (United States)

    Gustafson, Eric J.; Shinneman, Douglas

    2015-01-01

    Drought stress is an important cause of tree mortality in forests, and drought-induced disturbance events are projected to become more common in the future due to climate change. Landscape Disturbance and Succession Models (LDSM) are becoming widely used to project climate change impacts on forests, including potential interactions with natural and anthropogenic disturbances, and to explore the efficacy of alternative management actions to mitigate negative consequences of global changes on forests and ecosystem services. Recent studies incorporating drought-mortality effects into LDSMs have projected significant potential changes in forest composition and carbon storage, largely due to differential impacts of drought on tree species and interactions with other disturbance agents. In this chapter, we review how drought affects forest ecosystems and the different ways drought effects have been modeled (both spatially and aspatially) in the past. Building on those efforts, we describe several approaches to modeling drought effects in LDSMs, discuss advantages and shortcomings of each, and include two case studies for illustration. The first approach features the use of empirically derived relationships between measures of drought and the loss of tree biomass to drought-induced mortality. The second uses deterministic rules of species mortality for given drought events to project changes in species composition and forest distribution. A third approach is more mechanistic, simulating growth reductions and death caused by water stress. Because modeling of drought effects in LDSMs is still in its infancy, and because drought is expected to play an increasingly important role in forest health, further development of modeling drought-forest dynamics is urgently needed.

  16. Patient involvement in research programming and implementation: a responsive evaluation of the Dialogue Model for research agenda setting

    NARCIS (Netherlands)

    Abma, T.A.; Pittens, C.A.C.M.; Visse, M.; Elberse, J.E.; Broerse, J.E.W.

    2015-01-01

    Background: The Dialogue Model for research agenda-setting, involving multiple stakeholders including patients, was developed and validated in the Netherlands. However, there is little insight into whether and how patient involvement is sustained during the programming and implementation of research

  17. The Matrix model, a driven state variables approach to non-equilibrium thermodynamics

    NARCIS (Netherlands)

    Jongschaap, R.J.J.

    2001-01-01

    One of the new approaches in non-equilibrium thermodynamics is the so-called matrix model of Jongschaap. In this paper some features of this model are discussed. We indicate the differences with the more common approach based upon internal variables and the more sophisticated Hamiltonian and GENERIC

  18. Constrained-path quantum Monte Carlo approach for non-yrast states within the shell model

    Energy Technology Data Exchange (ETDEWEB)

    Bonnard, J. [INFN, Sezione di Padova, Padova (Italy); LPC Caen, ENSICAEN, Universite de Caen, CNRS/IN2P3, Caen (France); Juillet, O. [LPC Caen, ENSICAEN, Universite de Caen, CNRS/IN2P3, Caen (France)

    2016-04-15

    This paper presents an extension of the constrained-path quantum Monte Carlo approach that allows non-yrast states to be reconstructed, in order to reach the complete spectroscopy of nuclei within the interacting shell model. As in the yrast case studied in a previous work, the formalism involves a variational symmetry-restored wave function assuming two central roles. First, it guides the underlying Brownian motion to improve the efficiency of the sampling. Second, it constrains the stochastic paths according to the phaseless approximation to control the sign or phase problems that usually plague fermionic QMC simulations. Proof-of-principle results in the sd valence space are reported. They prove the ability of the scheme to offer remarkably accurate binding energies for both even- and odd-mass nuclei irrespective of the considered interaction. (orig.)

  19. Phase transition approach to bursting in neuronal cultures: quorum percolation models

    Science.gov (United States)

    Monceau, P.; Renault, R.; Métens, S.; Bottani, S.; Fardet, T.

    2017-10-01

    The Quorum Percolation model has been designed in the context of neurobiology to describe bursts of activity occurring in neuronal cultures from the point of view of statistical physics rather than from a dynamical synchronization approach. It is based upon information propagation on a directed graph with a threshold activation rule; this leads to a phase diagram which exhibits a giant percolation cluster below some critical value mC of the excitability. We describe the main characteristics of the original model and derive extensions according to additional relevant biological features. Firstly, we investigate the effects of excitability variability on the phase diagram and show that the percolation transition can be destroyed by a sufficient amount of such disorder; we stress the weakly averaging character of the order parameter and show that connectivity and excitability can be seen as two overlapping aspects of the same reality. Secondly, we elaborate a discrete-time stochastic model taking into account the decay originating from ionic leakage through the membrane of neurons and from synaptic depression; we give evidence that the decay softens and shifts the transition, and conjecture that decay destroys the transition in the thermodynamic limit. We were able to develop mean-field theories associated with each of the two effects; we discuss the framework of their agreement with Monte Carlo simulations. It turns out that the critical point mC, from which information on the connectivity of the network can be inferred, is affected by each of these additional effects. Lastly, we show how dynamical simulations of bursts with an adaptive exponential integrate-and-fire model can be interpreted in terms of Quorum Percolation. Moreover, the usefulness of the percolation model, including the refinements we investigated, can be extended to many scientific fields involving information propagation, such as the spread of rumors in sociology, ethology, and ecology.
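
    The basic Quorum Percolation activation rule can be stated in a few lines: on a random directed graph, a node becomes active once at least m of its inputs are active, and the rule is iterated to a fixed point. The sketch below implements this rule with an assumed network size, in-degree and threshold purely to illustrate how the final burst size depends on the initial excitation; it includes none of the extensions (disorder, decay) studied in the paper.

```python
# Sketch of the basic Quorum Percolation activation rule: on a random directed
# graph, a neuron fires once at least m of its presynaptic inputs have fired.
# Network size, in-degree and threshold below are illustrative assumptions.
import random

def burst_size(n=2000, k_in=10, m=4, f_init=0.05, seed=0):
    rng = random.Random(seed)
    inputs = [[rng.randrange(n) for _ in range(k_in)] for _ in range(n)]  # presynaptic lists
    active = [rng.random() < f_init for _ in range(n)]                    # initial excitation
    changed = True
    while changed:                     # iterate the threshold rule to a fixed point
        changed = False
        for i in range(n):
            if not active[i] and sum(active[j] for j in inputs[i]) >= m:
                active[i] = True
                changed = True
    return sum(active) / n

for f in (0.02, 0.05, 0.10, 0.20):
    print(f, round(burst_size(f_init=f), 3))   # final active fraction vs. initial excitation
```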

  20. Towards a model-based development approach for wireless sensor-actuator network protocols

    DEFF Research Database (Denmark)

    Kumar S., A. Ajith; Simonsen, Kent Inge

    2014-01-01

    Model-Driven Software Engineering (MDSE) is a promising approach for the development of applications, and has been well adopted in the embedded applications domain in recent years. Wireless Sensor Actuator Networks consisting of resource-constrained hardware and platform-specific operating system...... induced due to manual translations. With the use of formal semantics in the modeling approach, we can further ensure the correctness of the source model by means of verification. Also, with the use of network simulators and formal modeling tools, we obtain a verified and validated model to be used....

  1. Observations involving broadband impedance modelling

    Energy Technology Data Exchange (ETDEWEB)

    Berg, J S [Stanford Linear Accelerator Center, Menlo Park, CA (United States)

    1996-08-01

    Results for single- and multi-bunch instabilities can be significantly affected by the precise model that is used for the broadband impedance. This paper discusses three aspects of broadband impedance modelling. The first is an observation of the effect that a seemingly minor change in an impedance model has on the single-bunch mode coupling threshold. The second is a successful attempt to construct a model for the high-frequency tails of an r.f. cavity. The last is a discussion of requirements for the mathematical form of an impedance which follow from the general properties of impedances. (author)

  2. Observations involving broadband impedance modelling

    International Nuclear Information System (INIS)

    Berg, J.S.

    1995-08-01

    Results for single- and multi-bunch instabilities can be significantly affected by the precise model that is used for the broadband impedance. This paper discusses three aspects of broadband impedance modeling. The first is an observation of the effect that a seemingly minor change in an impedance model has on the single-bunch mode coupling threshold. The second is a successful attempt to construct a model for the high-frequency tails of an r.f. cavity. The last is a discussion of requirements for the mathematical form of an impedance which follow from the general properties of impedances.

  3. Childhood craniopharyngioma: greater hypothalamic involvement before surgery is associated with higher homeostasis model insulin resistance index

    Science.gov (United States)

    Trivin, Christine; Busiah, Kanetee; Mahlaoui, Nizar; Recasens, Christophe; Souberbielle, Jean-Claude; Zerah, Michel; Sainte-Rose, Christian; Brauner, Raja

    2009-01-01

    Background Obesity seems to be linked to the hypothalamic involvement in craniopharyngioma. We evaluated the pre-surgery relationship between the degree of this involvement on magnetic resonance imaging and insulin resistance, as evaluated by the homeostasis model insulin resistance index (HOMA). As insulin-like growth factor 1, leptin, soluble leptin receptor (sOB-R) and ghrelin may also be involved, we compared their plasma concentrations and their link to weight change. Methods 27 children with craniopharyngioma were classified as either grade 0 (n = 7, no hypothalamic involvement), grade 1 (n = 8, compression without involvement), or grade 2 (n = 12, severe involvement). Results Despite having similar body mass indexes (BMI), the grade 2 patients had higher glucose, insulin and HOMA before surgery than the grade 0 patients (P = 0.02). The degree of hypothalamic involvement of the craniopharyngioma before surgery thus seems to determine the degree of insulin resistance, regardless of the BMI. The pre-surgery HOMA values were correlated with the post-surgery weight gain. This suggests that obesity should be prevented by reducing insulin secretion in those cases with hypothalamic involvement. PMID:19341477
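
    The index used in the study is the standard homeostasis model assessment of insulin resistance, HOMA-IR = fasting insulin (µU/mL) × fasting glucose (mmol/L) / 22.5. The sketch below simply evaluates this formula on made-up example values, not on patient data from the paper.

```python
# The homeostasis model assessment index is computed from fasting values as
# HOMA-IR = insulin (microU/mL) x glucose (mmol/L) / 22.5.
# The numbers below are made-up examples, not patient data from the paper.

def homa_ir(fasting_insulin_uU_per_mL, fasting_glucose_mmol_per_L):
    return fasting_insulin_uU_per_mL * fasting_glucose_mmol_per_L / 22.5

print(round(homa_ir(8.0, 4.8), 2))    # ~1.7: modest insulin resistance
print(round(homa_ir(20.0, 5.5), 2))   # ~4.9: markedly insulin resistant
```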

  4. A Reformulated Model of Barriers to Parental Involvement in Education: Comment on Hornby and Lafaele (2011)

    Science.gov (United States)

    Fan, Weihua; Li, Nan; Sandoval, Jaime Robert

    2018-01-01

    In a 2011 article in this journal, Hornby and Lafaele provided a comprehensive model to understand barriers that may adversely impact effectiveness of parental involvement (PI) in education. The proposed explanatory model provides researchers with a new comprehensive and systematic perspective of the phenomenon in question with references from an…

  5. Data and Dynamics Driven Approaches for Modelling and Forecasting the Red Sea Chlorophyll

    KAUST Repository

    Dreano, Denis

    2017-05-31

    Phytoplankton is at the basis of the marine food chain and therefore plays a fundamental role in the ocean ecosystem. However, the large-scale phytoplankton dynamics of the Red Sea are not yet well understood, mainly due to the lack of historical in situ measurements. As a result, our knowledge in this area relies mostly on remotely-sensed observations and large-scale numerical marine ecosystem models. Models are very useful to identify the mechanisms driving the variations in chlorophyll concentration and have practical applications for fisheries operation and the monitoring of harmful algal blooms. Modelling approaches can be divided between physics-driven (dynamical) approaches and data-driven (statistical) approaches. Dynamical models are based on a set of differential equations representing the transfer of energy and matter between different subsets of the biota, whereas statistical models identify relationships between variables based on statistical relations within the available data. The goal of this thesis is to develop, implement and test novel dynamical and statistical modelling approaches for studying and forecasting the variability of chlorophyll concentration in the Red Sea. These new models are evaluated in terms of their ability to efficiently forecast and explain the regional chlorophyll variability. We also propose innovative synergistic strategies to combine data- and physics-driven approaches to further enhance chlorophyll forecasting capabilities and efficiency.

  6. An LES-PBE-PDF approach for modeling particle formation in turbulent reacting flows

    Science.gov (United States)

    Sewerin, Fabian; Rigopoulos, Stelios

    2017-10-01

    Many chemical and environmental processes involve the formation of a polydispersed particulate phase in a turbulent carrier flow. Frequently, the immersed particles are characterized by an intrinsic property such as the particle size, and the distribution of this property across a sample population is taken as an indicator for the quality of the particulate product or its environmental impact. In the present article, we propose a comprehensive model and an efficient numerical solution scheme for predicting the evolution of the property distribution associated with a polydispersed particulate phase forming in a turbulent reacting flow. Here, the particulate phase is described in terms of the particle number density whose evolution in both physical and particle property space is governed by the population balance equation (PBE). Based on the concept of large eddy simulation (LES), we augment the existing LES-transported probability density function (PDF) approach for fluid phase scalars by the particle number density and obtain a modeled evolution equation for the filtered PDF associated with the instantaneous fluid composition and particle property distribution. This LES-PBE-PDF approach allows us to predict the LES-filtered fluid composition and particle property distribution at each spatial location and point in time without any restriction on the chemical or particle formation kinetics. In view of a numerical solution, we apply the method of Eulerian stochastic fields, invoking an explicit adaptive grid technique in order to discretize the stochastic field equation for the number density in particle property space. In this way, sharp moving features of the particle property distribution can be accurately resolved at a significantly reduced computational cost. As a test case, we consider the condensation of an aerosol in a developed turbulent mixing layer. Our investigation not only demonstrates the predictive capabilities of the LES-PBE-PDF model but also

  7. A diagnosis method for physical systems using a multi-modeling approach

    International Nuclear Information System (INIS)

    Thetiot, R.

    2000-01-01

    In this thesis we propose a method for diagnosis problem solving. This method is based on a multi-modeling approach describing both normal and abnormal behavior of a system. This modeling approach makes it possible to represent a system at different abstraction levels (behavioral, functional and teleological). Fundamental knowledge is described using a bond-graph representation. We show that the bond-graph representation can be exploited in order to generate (completely or partially) the functional models. The different models of the multi-modeling approach allow the functional state of a system to be defined at different abstraction levels. We exploit this property to exonerate sub-systems for which the expected behavior is observed. The behavioral and functional descriptions of the remaining sub-systems are exploited hierarchically in a two-step process. In the first step, the abnormal behaviors explaining some observations are identified. In the second step, the remaining unexplained observations are used to generate conflict sets and thus the consistency-based diagnoses. The modeling method and the diagnosis process have been applied to a Reactor Coolant Pump Set. This application illustrates the concepts described in this thesis and shows their potential. (authors)

  8. Designing a Model for Medical Error Prediction in Outpatient Visits According to Organizational Commitment and Job Involvement

    Directory of Open Access Journals (Sweden)

    SM Mirhosseini

    2015-09-01

    Full Text Available Abstract Introduction: A wide range of variables, such as job involvement and organizational commitment, affects medical errors. The joint relationship of these two variables with medical errors during outpatient visits was investigated in order to design a model. Methods: A field study with 114 physicians during outpatient visits established the mean number of medical errors. The Azimi and Allen-Meyer questionnaires were used to measure job involvement and organizational commitment. Physicians were divided into four groups according to job involvement and organizational commitment in two dimensions (Zone 1: high job involvement and high organizational commitment; Zone 2: high job involvement and low organizational commitment; Zone 3: low job involvement and high organizational commitment; Zone 4: low job involvement and low organizational commitment). ANOVA and Scheffé tests were conducted in SPSS 22 to analyse the medical errors in the four zones. A guideline was presented according to the relationship between errors and the two other variables. Results: The mean organizational commitment was 79.50±12.30 and the mean job involvement 12.72±3.66; the mean medical errors were 0.32 in the first group, 0.51 in the second, 0.41 in the third and 0.50 in the last. ANOVA (F=22.20, sig=0.00) and Scheffé tests were significant except between the second and fourth groups. The validity of the model was 73.60%. Conclusion: Applying strategies to boost organizational commitment and job involvement can help to reduce medical errors during outpatient visits. Thus, investigating the factors contributing to organizational commitment and job involvement can be helpful.

  9. Computational model of precision grip in Parkinson’s disease: A Utility based approach

    Directory of Open Access Journals (Sweden)

    Ankur eGupta

    2013-12-01

    Full Text Available We propose a computational model of Precision Grip (PG) performance in normal subjects and Parkinson's Disease (PD) patients. Prior studies on grip force generation in PD patients show an increase in grip force during ON medication and an increase in the variability of the grip force during OFF medication (Fellows et al. 1998; Ingvarsson et al. 1997). Changes in grip force generation in dopamine-deficient PD conditions strongly suggest a contribution of the Basal Ganglia, a deep brain system having a crucial role in translating dopamine signals to decision making. The present approach is to treat the problem of modeling grip force generation as a problem of action selection, which is one of the key functions of the Basal Ganglia. The model consists of two components: 1) the sensory-motor loop component, and 2) the Basal Ganglia component. The sensory-motor loop component converts a reference position and a reference grip force into lift force and grip force profiles, respectively. These two forces cooperate in grip-lifting a load. The sensory-motor loop component also includes a plant model that represents the interaction between the two fingers involved in PG and the object to be lifted. The Basal Ganglia component is modeled using Reinforcement Learning, with the significant difference that the action selection is performed using a utility distribution instead of a purely value-based distribution, thereby incorporating risk-based decision making. The proposed model is able to account for the precision grip results from normal and PD patients accurately (Fellows et al. 1998; Ingvarsson et al. 1997). To our knowledge the model is the first model of precision grip in PD conditions.
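
    The distinctive ingredient described in the abstract is risk-sensitive, utility-based action selection. The sketch below illustrates that ingredient in isolation: each candidate grip-force level has a learned value and an outcome variance, utility trades value against risk, and a softmax picks the action. The sensory-motor loop, the plant model and the full basal ganglia architecture are not reproduced, and all numbers (including the risk-sensitivity parameter) are assumptions.

```python
# Sketch of risk-sensitive (utility-based) action selection: each candidate
# grip-force level has a learned value estimate and an outcome variance, and
# the utility trades value against risk before a softmax choice. All numbers,
# including the risk-sensitivity parameter, are illustrative assumptions.
import math
import random

def softmax_choice(utilities, beta=3.0, rng=random):
    """Sample an action index with probability proportional to exp(beta * utility)."""
    exps = [math.exp(beta * u) for u in utilities]
    r = rng.random() * sum(exps)
    acc = 0.0
    for idx, e in enumerate(exps):
        acc += e
        if r <= acc:
            return idx
    return len(exps) - 1

def utilities(values, variances, risk_sensitivity):
    """Utility = value - kappa * variance (kappa > 0 penalizes risky actions)."""
    return [v - risk_sensitivity * s for v, s in zip(values, variances)]

grip_values = [0.2, 0.6, 0.9, 0.7]        # learned value of four grip-force levels
grip_variances = [0.01, 0.05, 0.30, 0.10] # outcome variability of each level
print(softmax_choice(utilities(grip_values, grip_variances, risk_sensitivity=1.0)))
print(softmax_choice(utilities(grip_values, grip_variances, risk_sensitivity=0.1)))
```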

  10. An approach to ductile fracture resistance modelling in pipeline steels

    Energy Technology Data Exchange (ETDEWEB)

    Pussegoda, L.N.; Fredj, A. [BMT Fleet Technology Ltd., Kanata (Canada)

    2009-07-01

    Ductile fracture resistance studies of high grade steels in the pipeline industry often included analyses of the crack tip opening angle (CTOA) parameter using 3-point bend steel specimens. The CTOA is a function of specimen ligament size in high grade materials. Other resistance measurements may include steady state fracture propagation energy, critical fracture strain, and the adoption of damage mechanisms. Modelling approaches for crack propagation were discussed in this abstract. Tension tests were used to calibrate damage model parameters. Results from the tests were then applied to the crack propagation in a 3-point bend specimen using modern 1980 vintage steels. Limitations and approaches to overcome the difficulties associated with crack propagation modelling were discussed.

  11. The Intersystem Model of Psychotherapy: An Integrated Systems Treatment Approach

    Science.gov (United States)

    Weeks, Gerald R.; Cross, Chad L.

    2004-01-01

    This article introduces the intersystem model of psychotherapy and discusses its utility as a truly integrative and comprehensive approach. The foundation of this conceptually complex approach comes from dialectic metatheory; hence, its derivation requires an understanding of both foundational and integrational constructs. The article provides a…

  12. Advanced computational approaches to biomedical engineering

    CERN Document Server

    Saha, Punam K; Basu, Subhadip

    2014-01-01

    There has been rapid growth in biomedical engineering in recent decades, given advancements in medical imaging and physiological modelling and sensing systems, coupled with immense growth in computational and network technology, analytic approaches, visualization and virtual-reality, man-machine interaction and automation. Biomedical engineering involves applying engineering principles to the medical and biological sciences and it comprises several topics including biomedicine, medical imaging, physiological modelling and sensing, instrumentation, real-time systems, automation and control, sig

  13. Conceptual Model and Numerical Approaches for Unsaturated Zone Flow and Transport

    International Nuclear Information System (INIS)

    H.H. Liu

    2004-01-01

    The purpose of this model report is to document the conceptual and numerical models used for modeling unsaturated zone (UZ) fluid (water and air) flow and solute transport processes. This work was planned in ''Technical Work Plan for: Unsaturated Zone Flow Model and Analysis Report Integration'' (BSC 2004 [DIRS 169654], Sections 1.2.5, 2.1.1, 2.1.2 and 2.2.1). The conceptual and numerical modeling approaches described in this report are mainly used for models of UZ flow and transport in fractured, unsaturated rock under ambient conditions. Developments of these models are documented in the following model reports: (1) UZ Flow Model and Submodels; (2) Radionuclide Transport Models under Ambient Conditions. Conceptual models for flow and transport in unsaturated, fractured media are discussed in terms of their applicability to the UZ at Yucca Mountain. The rationale for selecting the conceptual models used for modeling of UZ flow and transport is documented. Numerical approaches for incorporating these conceptual models are evaluated in terms of their representation of the selected conceptual models and computational efficiency; and the rationales for selecting the numerical approaches used for modeling of UZ flow and transport are discussed. This report also documents activities to validate the active fracture model (AFM) based on experimental observations and theoretical developments. The AFM is a conceptual model that describes the fracture-matrix interaction in the UZ of Yucca Mountain. These validation activities are documented in Section 7 of this report regarding use of an independent line of evidence to provide additional confidence in the use of the AFM in the UZ models. The AFM has been used in UZ flow and transport models under both ambient and thermally disturbed conditions. Developments of these models are documented

  14. The simplified models approach to constraining supersymmetry

    Energy Technology Data Exchange (ETDEWEB)

    Perez, Genessis [Institut fuer Theoretische Physik, Karlsruher Institut fuer Technologie (KIT), Wolfgang-Gaede-Str. 1, 76131 Karlsruhe (Germany); Kulkarni, Suchita [Laboratoire de Physique Subatomique et de Cosmologie, Universite Grenoble Alpes, CNRS IN2P3, 53 Avenue des Martyrs, 38026 Grenoble (France)

    2015-07-01

    The interpretation of the experimental results at the LHC is model dependent, which implies that the searches provide limited constraints on scenarios such as supersymmetry (SUSY). The Simplified Models Spectra (SMS) framework used by the ATLAS and CMS collaborations is useful to overcome this limitation. The SMS framework involves a small number of parameters (all the properties are reduced to the mass spectrum, the production cross section and the branching ratio) and hence is more generic than presenting results in terms of soft parameters. In our work, the SMS framework was used to test the Natural SUSY (NSUSY) scenario. To accomplish this task, two automated tools (SModelS and Fastlim) were used to decompose the NSUSY parameter space in terms of simplified models and to confront the theoretical predictions against the experimental results. The achievements of both tools, as well as their strengths and limitations, are presented for the NSUSY scenario.

  15. Interfacial Fluid Mechanics A Mathematical Modeling Approach

    CERN Document Server

    Ajaev, Vladimir S

    2012-01-01

    Interfacial Fluid Mechanics: A Mathematical Modeling Approach provides an introduction to mathematical models of viscous flow used in rapidly developing fields of microfluidics and microscale heat transfer. The basic physical effects are first introduced in the context of simple configurations and their relative importance in typical microscale applications is discussed. Then, several configurations of importance to microfluidics, most notably thin films/droplets on substrates and confined bubbles, are discussed in detail.  Topics from current research on electrokinetic phenomena, liquid flow near structured solid surfaces, evaporation/condensation, and surfactant phenomena are discussed in the later chapters. This book also:  Discusses mathematical models in the context of actual applications such as electrowetting Includes unique material on fluid flow near structured surfaces and phase change phenomena Shows readers how to solve modeling problems related to microscale multiphase flows Interfacial Fluid Me...

  16. Comparison of two model approaches in the Zambezi river basin with regard to model reliability and identifiability

    Directory of Open Access Journals (Sweden)

    H. C. Winsemius

    2006-01-01

    Full Text Available Variations of water stocks in the upper Zambezi river basin have been determined by 2 different hydrological modelling approaches. The purpose was to provide preliminary terrestrial storage estimates in the upper Zambezi, which will be compared with estimates derived from the Gravity Recovery And Climate Experiment (GRACE) in a future study. The first modelling approach is GIS-based, distributed and conceptual (STREAM). The second approach uses Lumped Elementary Watersheds identified and modelled conceptually (LEW). The STREAM model structure has been assessed using GLUE (Generalized Likelihood Uncertainty Estimation) a posteriori to determine parameter identifiability. The LEW approach could, in addition, be tested for model structure, because computational efforts of LEW are low. Both models are threshold models, where the non-linear behaviour of the Zambezi river basin is explained by a combination of thresholds and linear reservoirs. The models were forced by time series of gauged and interpolated rainfall. Where available, runoff station data was used to calibrate the models. Ungauged watersheds were generally given the same parameter sets as their neighbouring calibrated watersheds. It appeared that the LEW model structure could be improved by applying GLUE iteratively. Eventually, it led to better identifiability of parameters and consequently a better model structure than the STREAM model. Hence, the final model structure obtained better represents the true hydrology. After calibration, both models show a comparable efficiency in representing discharge. However the LEW model shows a far greater storage amplitude than the STREAM model. This emphasizes the storage uncertainty related to hydrological modelling in data-scarce environments such as the Zambezi river basin. It underlines the need and potential for independent observations of terrestrial storage to enhance our understanding and modelling capacity of the hydrological processes. GRACE
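
    GLUE itself is simple to sketch: sample many parameter sets, score each against observations with a likelihood measure (Nash-Sutcliffe efficiency here), keep the "behavioral" sets above a threshold, and read identifiability off the spread of the retained values. The toy single-reservoir model below stands in for STREAM/LEW and, like the threshold and synthetic data, is an assumption made only for illustration.

```python
# Sketch of GLUE-style identifiability analysis: sample parameter sets, score
# each against observations with Nash-Sutcliffe efficiency, retain the
# "behavioral" sets, and inspect how tightly each parameter is constrained.
# The toy linear-reservoir model stands in for STREAM/LEW.
import random

def toy_model(rain, k, s0=0.0):
    """Single linear reservoir: outflow each step is a fraction k of storage."""
    s, q = s0, []
    for r in rain:
        s += r
        out = k * s
        s -= out
        q.append(out)
    return q

def nash_sutcliffe(sim, obs):
    mean_obs = sum(obs) / len(obs)
    num = sum((s - o) ** 2 for s, o in zip(sim, obs))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

random.seed(0)
rain = [random.expovariate(0.5) for _ in range(200)]
obs = toy_model(rain, k=0.3)                           # synthetic "observations"

behavioral = []
for _ in range(2000):
    k = random.uniform(0.01, 0.99)
    if nash_sutcliffe(toy_model(rain, k), obs) > 0.8:  # behavioral threshold
        behavioral.append(k)
print(len(behavioral), min(behavioral), max(behavioral))  # spread => identifiability of k
```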

  17. Involving the Young: The German Approach to Vocational Education

    Science.gov (United States)

    Hirche, Walter

    2012-01-01

    Youth unemployment is a huge challenge for sustainable development in our societies. Statistics offer an alarming picture of the involvement of youth in the labour markets on a global scale. Germany, due to a strong tradition in vocational education and training deeply rooted in its culture, has very low rates of youth unemployment. The so-called…

  18. River Export of Plastic from Land to Sea: A Global Modeling Approach

    Science.gov (United States)

    Siegfried, Max; Gabbert, Silke; Koelmans, Albert A.; Kroeze, Carolien; Löhr, Ansje; Verburg, Charlotte

    2016-04-01

    Plastic is increasingly considered a serious cause of water pollution. It is a threat to aquatic ecosystems, including rivers, coastal waters and oceans. Rivers transport considerable amounts of plastic from land to sea. The quantity and its main sources, however, are not well known. Assessing the amount of macro- and microplastic transport from river to sea is, therefore, important for understanding the dimension and the patterns of plastic pollution of aquatic ecosystems. In addition, it is crucial for assessing short- and long-term impacts caused by plastic pollution. Here we present a global modelling approach to quantify river export of plastic from land to sea. Our approach accounts for different types of plastic, including both macro- and micro-plastics. Moreover, we distinguish point sources and diffuse sources of plastic in rivers. Our modelling approach is inspired by global nutrient models, which include more than 6000 river basins. In this paper, we will present our modelling approach, as well as first model results for micro-plastic pollution in European rivers. Important sources of micro-plastics include personal care products, laundry, household dust and car tyre wear. We combine information on these sources with information on sewage management, and plastic retention during river transport for the largest European rivers. Our modelling approach may help to better understand and prevent water pollution by plastic, and at the same time serves as a 'proof of concept' for future application on a global scale.

  19. Towards a CPN-Based Modelling Approach for Reconciling Verification and Implementation of Protocol Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2013-01-01

    Formal modelling of protocols is often aimed at one specific purpose such as verification or automatically generating an implementation. This leads to models that are useful for one purpose, but not for others. Being able to derive models for verification and implementation from a single model...... is beneficial both in terms of reduced total modelling effort and confidence that the verification results are valid also for the implementation model. In this paper we introduce the concept of a descriptive specification model and an approach based on refining a descriptive model to target both verification...... how this model can be refined to target both verification and implementation....

  20. A GOCE-only global gravity field model by the space-wise approach

    DEFF Research Database (Denmark)

    Migliaccio, Frederica; Reguzzoni, Mirko; Gatti, Andrea

    2011-01-01

    The global gravity field model computed by the spacewise approach is one of three official solutions delivered by ESA from the analysis of the GOCE data. The model consists of a set of spherical harmonic coefficients and the corresponding error covariance matrix. The main idea behind this approach...... the orbit to reduce the noise variance and correlation before gridding the data. In the first release of the space-wise approach, based on a period of about two months, some prior information coming from existing gravity field models entered into the solution especially at low degrees and low orders...... degrees; the second is an internally computed GOCE-only prior model to be used in place of the official quick-look model, thus removing the dependency on EIGEN5C especially in the polar gaps. Once the procedure to obtain a GOCE-only solution has been outlined, a new global gravity field model has been...

  1. An approach for modelling interdependent infrastructures in the context of vulnerability analysis

    International Nuclear Information System (INIS)

    Johansson, Jonas; Hassel, Henrik

    2010-01-01

    Technical infrastructures of the society are becoming more and more interconnected and interdependent, i.e. the function of an infrastructure influences the function of other infrastructures. Disturbances in one infrastructure therefore often traverse to other dependent infrastructures and possibly even back to the infrastructure where the failure originated. It is becoming increasingly important to take these interdependencies into account when assessing the vulnerability of technical infrastructures. In the present paper, an approach for modelling interdependent technical infrastructures is proposed. The modelling approach considers structural properties, as employed in graph theory, as well as functional properties to increase its fidelity and usefulness. By modelling a fictional electrified railway network that consists of five systems and interdependencies between the systems, it is shown how the model can be employed in a vulnerability analysis. The model aims to capture both functional and geographic interdependencies. It is concluded that the proposed modelling approach is promising and suitable in the context of vulnerability analyses of interdependent systems.

  2. An evaluation of gas release modelling approaches as to their applicability in fuel behaviour models

    International Nuclear Information System (INIS)

    Mattila, L.J.; Sairanen, R.T.

    1980-01-01

    The release of fission gas from uranium oxide fuel to the voids in the fuel rod affects the behaviour of LWR fuel rods in many ways, both during normal operating conditions, including anticipated transients, and during off-normal and accident conditions. The current trend towards significantly increased discharge burnup of LWR fuel will increase the importance of fission gas release considerations from both the design and safety viewpoints. In the paper, fission gas release models are classified into 5 categories on the basis of complexity and physical sophistication. For each category, the basic approach common to the models included in the category is described, a few representative models of the category are singled out and briefly commented on in some cases, the advantages and drawbacks of the approach are listed and discussed, and conclusions on the practical feasibility of the approach are drawn. The evaluation is based on both a literature survey and our experience in working with integral fuel behaviour models. The work has included verification efforts, attempts to improve certain features of the codes, and engineering applications. The classification of fission gas release models regarding their applicability in fuel behaviour codes can of course be done only in a coarse manner. The boundaries between the different categories are vague, and a model may well be refined in a way which transfers it to a higher category. Some current trends in fuel behaviour research are discussed which seem to motivate further extensive efforts in fission product release modelling and are certain to affect the prioritizing of the efforts. (author)

  3. Using a consensus approach based on the conservation of inter-residue contacts to rank CAPRI models

    KAUST Repository

    Vangone, Anna

    2013-10-17

    Herein we propose the use of a consensus approach, CONSRANK, for ranking CAPRI models. CONSRANK relies on the conservation of inter-residue contacts in the analyzed ensemble of decoys. Models are ranked according to their ability to match the most frequently observed contacts. We applied CONSRANK to 19 CAPRI protein-protein targets, covering a wide range of prediction difficulty and involved in a variety of biological functions. CONSRANK results are consistently good, both in terms of native-like (NL) solutions ranked in the top positions and of values of the Area Under the receiver operating characteristic Curve (AUC). For targets having a percentage of NL solutions above 3%, an excellent performance is found, with AUC values approaching 1. For the difficult target T46, having only 3.4% NL solutions, the number of NL solutions in the top 5 and top 10 ranked positions is enriched by a factor of 30, and the AUC value is as high as 0.997. AUC values below 0.8 are only found for targets featuring a percentage of NL solutions within 1.1%. Remarkably, a false consensus emerges only in one case, T42, which happens to be an artificial protein, whose assembly details remain uncertain, based on controversial experimental data. We also show that CONSRANK still performs very well on a limited number of models, provided that more than 1 NL solution is included in the ensemble, thus extending its applicability to cases where only a few dozen models are available. © 2013 Wiley Periodicals, Inc.
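
    The consensus idea is easy to state: compute the frequency of every inter-residue contact across the ensemble of models, then score each model by the average frequency of its own contacts and rank by that score. The sketch below assumes the contacts have already been extracted from the structures and uses a toy three-model ensemble; it is an illustration of the principle, not the CONSRANK implementation.

```python
# Sketch of the contact-conservation consensus idea: given each docking model's
# set of inter-residue contacts (assumed precomputed from the structures), score
# every model by the average ensemble-wide frequency of its contacts.
from collections import Counter

def consensus_rank(models):
    """models: dict mapping model name -> set of contacts, e.g. ('A:45', 'B:112')."""
    freq = Counter()
    for contacts in models.values():
        freq.update(contacts)
    n_models = len(models)
    scores = {
        name: sum(freq[c] for c in contacts) / (len(contacts) * n_models) if contacts else 0.0
        for name, contacts in models.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

toy_models = {
    "m1": {("A:45", "B:112"), ("A:46", "B:113"), ("A:50", "B:120")},
    "m2": {("A:45", "B:112"), ("A:46", "B:113"), ("A:80", "B:200")},
    "m3": {("A:10", "B:300"), ("A:11", "B:301"), ("A:12", "B:302")},
}
print(consensus_rank(toy_models))   # m1/m2 share frequent contacts and rank above the outlier m3
```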

  4. A nonparametric approach to forecasting realized volatility

    OpenAIRE

    Adam Clements; Ralf Becker

    2009-01-01

    A well-developed literature exists in relation to modeling and forecasting asset return volatility. Much of this relates to the development of time series models of volatility. This paper proposes an alternative method for forecasting volatility that does not involve such a model. Under this approach a forecast is a weighted average of historical volatility. The greatest weight is given to periods that exhibit the most similar market conditions to the time at which the forecast is being formed...
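
    A minimal version of the idea can be written as a kernel-weighted average: weight each past day by how similar its market state was to today's (here the state is simply the prevailing volatility level) and average the volatilities that followed those days. The Gaussian kernel, bandwidth and single state variable in the sketch below are assumptions, not the specification used in the paper.

```python
# Sketch of the weighted-average idea: tomorrow's volatility forecast is a
# kernel-weighted mean of past realized volatilities, with more weight on days
# whose market state resembled today's. Kernel and bandwidth are assumptions.
import math

def forecast_volatility(realized_vol, bandwidth=0.05):
    """realized_vol: chronological list of daily realized volatilities."""
    state_today = realized_vol[-1]
    weights, targets = [], []
    for t in range(len(realized_vol) - 1):
        similarity = math.exp(-0.5 * ((realized_vol[t] - state_today) / bandwidth) ** 2)
        weights.append(similarity)
        targets.append(realized_vol[t + 1])      # next-day volatility that followed state t
    return sum(w * v for w, v in zip(weights, targets)) / sum(weights)

history = [0.10, 0.12, 0.11, 0.25, 0.22, 0.13, 0.11, 0.12]
print(round(forecast_volatility(history), 4))
```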

  5. Wave Resource Characterization Using an Unstructured Grid Modeling Approach

    Directory of Open Access Journals (Sweden)

    Wei-Cheng Wu

    2018-03-01

    Full Text Available This paper presents a modeling study conducted on the central Oregon coast for wave resource characterization, using the unstructured grid Simulating WAve Nearshore (SWAN) model coupled with a nested grid WAVEWATCH III® (WWIII) model. The flexibility of models with various spatial resolutions and the effects of open boundary conditions simulated by a nested grid WWIII model with different physics packages were evaluated. The model results demonstrate the advantage of the unstructured grid-modeling approach for flexible model resolution and good model skills in simulating the six wave resource parameters recommended by the International Electrotechnical Commission in comparison to the observed data in Year 2009 at National Data Buoy Center Buoy 46050. Notably, spectral analysis indicates that the ST4 physics package improves upon the ST2 physics package’s ability to predict wave power density for large waves, which is important for wave resource assessment, load calculation of devices, and risk management. In addition, bivariate distributions show that the simulated sea state of maximum occurrence with the ST4 physics package matched the observed data better than with the ST2 physics package. This study demonstrated that the unstructured grid wave modeling approach, driven by regional nested grid WWIII outputs along with the ST4 physics package, can efficiently provide accurate wave hindcasts to support wave resource characterization. Our study also suggests that wind effects need to be considered if the dimension of the model domain is greater than approximately 100 km, or O(10² km).

  6. A fuzzy approach for modelling radionuclide in lake system.

    Science.gov (United States)

    Desai, H K; Christian, R A; Banerjee, J; Patra, A K

    2013-10-01

    Radioactive liquid waste is generated during operation and maintenance of Pressurised Heavy Water Reactors (PHWRs). Generally, low-level liquid waste is diluted and then discharged into the nearby water body through the blowdown water discharge line, as per standard waste management practice. The effluents from nuclear installations are treated adequately and then released in a controlled manner under strict compliance with discharge criteria. An attempt was made to predict the concentration of ³H released from Kakrapar Atomic Power Station at Ratania Regulator, about 2.5 km away from the discharge point, where human exposure is expected. Scarcity of data and the complex geometry of the lake prompted the use of a heuristic approach. Under these conditions, a fuzzy rule-based approach was adopted to develop a model that could predict the ³H concentration at Ratania Regulator. Three hundred data points were generated for developing the fuzzy rules, in which the input parameters were the water flow from the lake and the ³H concentration at the discharge point; the output was the ³H concentration at Ratania Regulator. These data points were generated by multiple regression analysis of the original data. Using the same methodology, a further hundred data points were generated for validation of the model and compared against the output predicted by the fuzzy rule-based approach. The root mean square error of the model was 1.95, showing good agreement between the fuzzy model and the natural ecosystem. Copyright © 2013 Elsevier Ltd. All rights reserved.
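
    The abstract does not give the actual rule base, so the sketch below is a deliberately simplified, hypothetical zero-order Sugeno-type system with two inputs (lake flow and discharge-point concentration), invented triangular membership functions and four rules; it only illustrates the fuzzy rule-based prediction mechanism, not the model developed in the paper.

```python
# Highly simplified sketch of a Sugeno-type fuzzy rule system relating lake
# flow and tritium concentration at the discharge point to the concentration
# at a downstream location. Membership breakpoints and rule consequents are
# invented for illustration; the paper's data-derived rules are not shown.
def tri(x, a, b, c):
    """Triangular membership with vertices a <= b <= c (shoulders simplified)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def predict_downstream(flow, conc_discharge):
    # Fuzzify the two inputs (units and breakpoints are illustrative only).
    flow_low, flow_high = tri(flow, 0, 0, 50), tri(flow, 20, 100, 100)
    conc_low, conc_high = tri(conc_discharge, 0, 0, 40), tri(conc_discharge, 20, 80, 80)

    # Four rules; each consequent is a constant downstream concentration.
    rules = [
        (min(flow_low, conc_low), 5.0),     # low flow,  low discharge  -> low
        (min(flow_low, conc_high), 45.0),   # low flow,  high discharge -> high
        (min(flow_high, conc_low), 2.0),    # high flow, low discharge  -> very low
        (min(flow_high, conc_high), 20.0),  # high flow, high discharge -> moderate
    ]
    # Weighted-average (Sugeno) defuzzification.
    num = sum(w * z for w, z in rules)
    den = sum(w for w, _ in rules)
    return num / den if den > 0 else 0.0

print(predict_downstream(flow=30.0, conc_discharge=60.0))
```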

  7. Modeling of correlated data with informative cluster sizes: An evaluation of joint modeling and within-cluster resampling approaches.

    Science.gov (United States)

    Zhang, Bo; Liu, Wei; Zhang, Zhiwei; Qu, Yanping; Chen, Zhen; Albert, Paul S

    2017-08-01

    Joint modeling and within-cluster resampling are two approaches that are used for analyzing correlated data with informative cluster sizes. Motivated by a developmental toxicity study, we examined the performance and validity of these two approaches in testing covariate effects in generalized linear mixed-effects models. We show that the joint modeling approach is robust to the misspecification of cluster size models in terms of Type I and Type II errors when the corresponding covariates are not included in the random effects structure; otherwise, statistical tests may be affected. We also evaluate the performance of the within-cluster resampling procedure and thoroughly investigate its validity in modeling correlated data with informative cluster sizes. We show that within-cluster resampling is a valid alternative to joint modeling for cluster-specific covariates, but it is invalid for time-dependent covariates. The two methods are applied to a developmental toxicity study that investigated the effect of exposure to diethylene glycol dimethyl ether.
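
    For readers unfamiliar with within-cluster resampling, the sketch below shows the basic procedure on synthetic clustered binary data: draw one observation per cluster, fit an ordinary logistic regression, repeat, and average the coefficients. The data-generating model and the omission of the resampling-based variance estimator are simplifications for illustration, not details from the study.

```python
# Sketch of within-cluster resampling for a clustered binary outcome
# (e.g. pups within litters). One observation is drawn per cluster, an
# ordinary logistic regression is fit, and coefficient estimates are
# averaged over resamples. Proper variance estimation is omitted here.
import numpy as np
from sklearn.linear_model import LogisticRegression

def wcr_logistic(X, y, cluster_ids, n_resamples=200, seed=0):
    rng = np.random.default_rng(seed)
    clusters = np.unique(cluster_ids)
    coefs = []
    for _ in range(n_resamples):
        # Draw exactly one observation from each cluster.
        idx = [rng.choice(np.flatnonzero(cluster_ids == c)) for c in clusters]
        model = LogisticRegression().fit(X[idx], y[idx])
        coefs.append(np.r_[model.intercept_, model.coef_.ravel()])
    return np.mean(coefs, axis=0)  # averaged (intercept, slope) estimates

# Toy data: 40 clusters of varying (informative) size with one covariate.
rng = np.random.default_rng(1)
rows, X, y = [], [], []
for c in range(40):
    size = rng.integers(2, 12)
    cluster_effect = rng.normal()
    for _ in range(size):
        x = rng.normal()
        p = 1.0 / (1.0 + np.exp(-(0.5 * x + 0.3 * cluster_effect)))
        rows.append(c)
        X.append([x])
        y.append(rng.random() < p)
X, y, rows = np.array(X), np.array(y, dtype=int), np.array(rows)
print(wcr_logistic(X, y, rows))
```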

  8. Model-Assisted Estimation of Tropical Forest Biomass Change: A Comparison of Approaches

    Directory of Open Access Journals (Sweden)

    Nikolai Knapp

    2018-05-01

    Full Text Available Monitoring of changes in forest biomass requires accurate transfer functions between remote sensing-derived changes in canopy height (ΔH) and the actual changes in aboveground biomass (ΔAGB). Different approaches can be used to accomplish this task: direct approaches link ΔH directly to ΔAGB, while indirect approaches are based on deriving AGB stock estimates for two points in time and calculating the difference. In some studies, direct approaches led to more accurate estimations, while, in others, indirect approaches led to more accurate estimations. It is unknown how each approach performs under different conditions and over the full range of possible changes. Here, we used a forest model (FORMIND) to generate a large dataset (>28,000 ha) of natural and disturbed forest stands over time. Remote sensing of forest height was simulated on these stands to derive canopy height models for each time step. Three approaches for estimating ΔAGB were compared: (i) the direct approach; (ii) the indirect approach; and (iii) an enhanced direct approach (dir+tex), using ΔH in combination with canopy texture. Total prediction accuracies of the three approaches measured as root mean squared errors (RMSE) were RMSEdirect = 18.7 t ha⁻¹, RMSEindirect = 12.6 t ha⁻¹ and RMSEdir+tex = 12.4 t ha⁻¹. Further analyses revealed height-dependent biases in the ΔAGB estimates of the direct approach, which did not occur with the other approaches. Finally, the three approaches were applied on radar-derived (TanDEM-X) canopy height changes on Barro Colorado Island (Panama). The study demonstrates the potential of forest modeling for improving the interpretation of changes observed in remote sensing data and for comparing different methodologies.
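
    The toy sketch below contrasts the direct and indirect transfer functions on synthetic stands (a made-up AGB-height relation, not FORMIND output): the direct route regresses biomass change on height change, while the indirect route fits a stock model at both dates and differences the two predictions.

```python
# Toy sketch of the direct vs. indirect transfer functions compared above.
# The AGB-height relation, noise levels and units are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
h1 = rng.uniform(5, 40, n)                            # canopy height, date 1
h2 = np.clip(h1 + rng.normal(1.0, 4.0, n), 1, None)   # canopy height, date 2

def agb(h):
    """Synthetic stand-level biomass as a function of height (arbitrary units)."""
    return 8.0 * h ** 1.3

agb1 = agb(h1) + rng.normal(0, 20, n)
agb2 = agb(h2) + rng.normal(0, 20, n)
d_agb, d_h = agb2 - agb1, h2 - h1

# Direct approach: fit dAGB ~ dH and predict the change in one step.
b_dir = np.polyfit(d_h, d_agb, 1)
pred_direct = np.polyval(b_dir, d_h)

# Indirect approach: fit AGB ~ H on the pooled stocks, predict both dates,
# and take the difference of the two stock predictions.
b_ind = np.polyfit(np.r_[h1, h2], np.r_[agb1, agb2], 2)
pred_indirect = np.polyval(b_ind, h2) - np.polyval(b_ind, h1)

def rmse(pred):
    return float(np.sqrt(np.mean((pred - d_agb) ** 2)))

print("RMSE direct:  ", round(rmse(pred_direct), 1))
print("RMSE indirect:", round(rmse(pred_indirect), 1))
```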

  9. An Evolutionary Genomic Approach to Identify Genes Involved in Human Birth Timing

    Science.gov (United States)

    Orabona, Guilherme; Morgan, Thomas; Haataja, Ritva; Hallman, Mikko; Puttonen, Hilkka; Menon, Ramkumar; Kuczynski, Edward; Norwitz, Errol; Snegovskikh, Victoria; Palotie, Aarno; Fellman, Vineta; DeFranco, Emily A.; Chaudhari, Bimal P.; McGregor, Tracy L.; McElroy, Jude J.; Oetjens, Matthew T.; Teramo, Kari; Borecki, Ingrid; Fay, Justin; Muglia, Louis

    2011-01-01

    Coordination of fetal maturation with birth timing is essential for mammalian reproduction. In humans, preterm birth is a disorder of profound global health significance. The signals initiating parturition in humans have remained elusive, due to divergence in physiological mechanisms between humans and model organisms typically studied. Because of relatively large human head size and narrow birth canal cross-sectional area compared to other primates, we hypothesized that genes involved in parturition would display accelerated evolution along the human and/or higher primate phylogenetic lineages to decrease the length of gestation and promote delivery of a smaller fetus that transits the birth canal more readily. Further, we tested whether current variation in such accelerated genes contributes to preterm birth risk. Evidence from allometric scaling of gestational age suggests human gestation has been shortened relative to other primates. Consistent with our hypothesis, many genes involved in reproduction show human acceleration in their coding or adjacent noncoding regions. We screened >8,400 SNPs in 150 human accelerated genes in 165 Finnish preterm and 163 control mothers for association with preterm birth. In this cohort, the most significant association was in FSHR, and 8 of the 10 most significant SNPs were in this gene. Further evidence for association of a linkage disequilibrium block of SNPs in FSHR, rs11686474, rs11680730, rs12473870, and rs1247381 was found in African Americans. By considering human acceleration, we identified a novel gene that may be associated with preterm birth, FSHR. We anticipate other human accelerated genes will similarly be associated with preterm birth risk and elucidate essential pathways for human parturition. PMID:21533219

  10. A nationwide modelling approach to decommissioning - 16182

    International Nuclear Information System (INIS)

    Kelly, Bernard; Lowe, Andy; Mort, Paul

    2009-01-01

    In this paper we describe a proposed UK national approach to modelling decommissioning. For the first time, we shall have an insight into optimizing the safety and efficiency of a national decommissioning strategy. To do this we use the General Case Integrated Waste Algorithm (GIA), a universal model of decommissioning nuclear plant, power plant, waste arisings and the associated knowledge capture. The model scales from individual items of plant through cells, groups of cells, buildings, whole sites and then on up to a national scale. We describe the national vision for GIA, which can be broken down into three levels: 1) the capture of the chronological order of activities that an experienced decommissioner would use to decommission any nuclear facility anywhere in the world - this is Level 1 of GIA; 2) the construction of an Operational Research (OR) model based on Level 1 to allow 'what if' scenarios to be tested quickly (Level 2); 3) the construction of a state-of-the-art knowledge capture capability that allows future generations to learn from our current decommissioning experience (Level 3). We show the progress to date in developing GIA in levels 1 and 2. As part of level 1, GIA has assisted in the development of an IMechE professional decommissioning qualification. Furthermore, we describe GIA as the basis of a UK-owned database of decommissioning norms for such things as costs, productivity, durations, etc. From level 2, we report on a pilot study that has successfully tested the basic principles for the OR numerical simulation of the algorithm. We then highlight the advantages of applying the OR modelling approach nationally. In essence, a series of 'what if...' scenarios can be tested that will improve the safety and efficiency of decommissioning. (authors)

  11. A New Approach for Magneto-Static Hysteresis Behavioral Modeling

    DEFF Research Database (Denmark)

    Astorino, Antonio; Swaminathan, Madhavan; Antonini, Giulio

    2016-01-01

    in this paper is based on simple functions, which do not require calculus to be involved, thus assuring a very good efficiency in the algorithm. In addition, the proposed method enables initial magnetization curves, symmetric loops, minor loops, normal curves, and reversal curves of any order to be reproduced......, as demonstrated through the pertinent results provided in this paper. A model example based on the proposed modeling technique is also introduced and used as inductor core, in order to simulate an LR series circuit. Finally, the model ability to emulate hysteretic inductors is proved by the satisfactory agreement...

  12. Choosing an optimal model for failure data analysis by graphical approach

    International Nuclear Information System (INIS)

    Zhang, Tieling; Dwight, Richard

    2013-01-01

    Many models involving combinations of multiple Weibull distributions, modifications of the Weibull distribution, or extensions of its modified forms have been developed to model a given set of failure data. The application of these models to a given data set can be based on plotting the data on Weibull probability paper (WPP). Two or more models may be appropriate for one typical shape of the fitted plot, whereas a single model may be suitable for analyzing several different plot shapes. Hence a problem arises: how to choose an optimal model for a given data set and how to model the data. The motivation of this paper is to address this issue. The paper summarizes the characteristics of Weibull-related models with more than three parameters, including sectional models involving two or three Weibull distributions, the competing risk model and the mixed Weibull model. The models discussed in this paper are appropriate for data whose plots on WPP can be concave, convex, S-shaped or inversely S-shaped. A method for model selection based on the shapes of the fitted plots is then proposed, and the main procedure for parameter estimation of the models is described accordingly. In addition, the applicable range of the data plots on WPP is clearly highlighted from a practical point of view. This is important to note, as mathematical analysis of a model that neglects the applicable range of the model plot will incur discrepancies or large errors in model selection and parameter estimation.
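
    As a small illustration of the WPP idea underlying the model-selection method, the sketch below transforms failure times into Weibull-probability-paper coordinates using Bernard's median-rank approximation; a roughly straight plot suggests a single two-parameter Weibull, while concave, convex or S-shaped plots point to the richer models discussed above. The plotting-position formula is a common convention, not necessarily the one used in the paper.

```python
# Sketch of placing failure data on Weibull probability paper (WPP)
# coordinates: x = ln(t), y = ln(-ln(1 - F)), with F estimated by median
# ranks. The slope of a straight fit approximates the Weibull shape parameter.
import numpy as np

def wpp_coordinates(failure_times):
    t = np.sort(np.asarray(failure_times, dtype=float))
    n = len(t)
    ranks = np.arange(1, n + 1)
    F = (ranks - 0.3) / (n + 0.4)          # Bernard's median-rank approximation
    x = np.log(t)
    y = np.log(-np.log(1.0 - F))
    return x, y

# Toy usage: data drawn from a Weibull distribution should plot as a line
# with slope close to the true shape parameter (here 1.8).
rng = np.random.default_rng(0)
times = 100.0 * rng.weibull(1.8, size=50)
x, y = wpp_coordinates(times)
slope, intercept = np.polyfit(x, y, 1)
print("estimated shape parameter:", round(slope, 2))
```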

  13. Technologies and Approaches to Elucidate and Model the Virulence Program of Salmonella.

    Energy Technology Data Exchange (ETDEWEB)

    McDermott, Jason E.; Yoon, Hyunjin; Nakayasu, Ernesto S.; Metz, Thomas O.; Hyduke, Daniel R.; Kidwai, Afshan S.; Palsson, Bernhard O.; Adkins, Joshua N.; Heffron, Fred

    2011-04-01

    Salmonella is a primary cause of enteric diseases in a variety of animals. During its evolution into a pathogenic bacterium, Salmonella acquired an elaborate regulatory network that responds to multiple environmental stimuli within host animals and integrates them resulting in fine regulation of the virulence program. The coordinated action by this regulatory network involves numerous virulence regulators, necessitating genome-wide profiling analysis to assess and combine efforts from multiple regulons. In this review we discuss recent high-throughput analytic approaches to understand the regulatory network of Salmonella that controls virulence processes. Application of high-throughput analyses have generated a large amount of data and driven development of computational approaches required for data integration. Therefore, we also cover computer-aided network analyses to infer regulatory networks, and demonstrate how genome-scale data can be used to construct regulatory and metabolic systems models of Salmonella pathogenesis. Genes that are coordinately controlled by multiple virulence regulators under infectious conditions are more likely to be important for pathogenesis. Thus, reconstructing the global regulatory network during infection or, at the very least, under conditions that mimic the host cellular environment not only provides a bird’s eye view of Salmonella survival strategy in response to hostile host environments but also serves as an efficient means to identify novel virulence factors that are essential for Salmonella to accomplish systemic infection in the host.

  14. Understanding complex urban systems multidisciplinary approaches to modeling

    CERN Document Server

    Gurr, Jens; Schmidt, J

    2014-01-01

    Understanding Complex Urban Systems takes as its point of departure the insight that the challenges of global urbanization and the complexity of urban systems cannot be understood – let alone ‘managed’ – by sectoral and disciplinary approaches alone. But while there has recently been significant progress in broadening and refining the methodologies for the quantitative modeling of complex urban systems, in deepening the theoretical understanding of cities as complex systems, or in illuminating the implications for urban planning, there is still a lack of well-founded conceptual thinking on the methodological foundations and the strategies of modeling urban complexity across the disciplines. Bringing together experts from the fields of urban and spatial planning, ecology, urban geography, real estate analysis, organizational cybernetics, stochastic optimization, and literary studies, as well as specialists in various systems approaches and in transdisciplinary methodologies of urban analysis, the volum...

  15. CM5: A pre-Swarm magnetic field model based upon the comprehensive modeling approach

    DEFF Research Database (Denmark)

    Sabaka, T.; Olsen, Nils; Tyler, Robert

    2014-01-01

    We have developed a model based upon the very successful Comprehensive Modeling (CM) approach using recent CHAMP, Ørsted, SAC-C and observatory hourly-means data from September 2000 to the end of 2013. This CM, called CM5, was derived from the algorithm that will provide a consistent line of Leve...

  16. A multi-model ensemble approach to seabed mapping

    Science.gov (United States)

    Diesing, Markus; Stephens, David

    2015-06-01

    Seabed habitat mapping based on swath acoustic data and ground-truth samples is an emergent and active marine science discipline. Significant progress could be achieved by transferring techniques and approaches that have been successfully developed and employed in such fields as terrestrial land cover mapping. One such promising approach is the multiple classifier system, which aims at improving classification performance by combining the outputs of several classifiers. Here we present results of a multi-model ensemble applied to multibeam acoustic data covering more than 5000 km² of seabed in the North Sea with the aim to derive accurate spatial predictions of seabed substrate. A suite of six machine learning classifiers (k-Nearest Neighbour, Support Vector Machine, Classification Tree, Random Forest, Neural Network and Naïve Bayes) was trained with ground-truth sample data classified into seabed substrate classes and their prediction accuracy was assessed with an independent set of samples. The three and five best performing models were combined to classifier ensembles. Both ensembles led to increased prediction accuracy as compared to the best performing single classifier. The improvements were however not statistically significant at the 5% level. Although the three-model ensemble did not perform significantly better than its individual component models, we noticed that the five-model ensemble did perform significantly better than three of the five component models. A classifier ensemble might therefore be an effective strategy to improve classification performance. Another advantage is the fact that the agreement in predicted substrate class between the individual models of the ensemble could be used as a measure of confidence. We propose a simple and spatially explicit measure of confidence that is based on model agreement and prediction accuracy.
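
    The sketch below illustrates the multiple-classifier idea on synthetic data (not the multibeam acoustic features): a few of the classifier types named above are combined by majority vote, and per-sample model agreement is used as a simple confidence measure, in the spirit of the measure proposed in the study.

```python
# Sketch of a multi-classifier ensemble with majority voting and a simple
# agreement-based confidence measure, using a subset of the classifier types
# named above on synthetic data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=600, n_features=8, n_classes=3,
                           n_informative=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = [KNeighborsClassifier(), SVC(), RandomForestClassifier(random_state=0)]
preds = np.array([m.fit(X_tr, y_tr).predict(X_te) for m in models])  # (3, n)

# Majority vote per sample; agreement = fraction of models voting for the winner.
ensemble_pred = np.array([np.bincount(col).argmax() for col in preds.T])
agreement = (preds == ensemble_pred).mean(axis=0)

print("ensemble accuracy:   ", (ensemble_pred == y_te).mean().round(3))
print("mean model agreement:", agreement.mean().round(3))
```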

  17. A model approach to assess the long-term trends of indirect photochemistry in lake water. The case of Lake Maggiore (NW Italy).

    Science.gov (United States)

    Minella, Marco; Rogora, Michela; Vione, Davide; Maurino, Valter; Minero, Claudio

    2011-08-15

    A model-based approach is here developed and applied to predict the long-term trends of indirect photochemical processes in the surface layer (5 m water depth) of Lake Maggiore, NW Italy. For this lake, time series of the main parameters of photochemical importance that cover almost two decades are available. As a way to assess the relevant photochemical reactions, the modelled steady-state concentrations of important photogenerated transients (•OH, ³CDOM* and CO₃•⁻) were taken into account. A multivariate analysis approach was adopted to have an overview of the system, to emphasise relationships among chemical, photochemical and seasonal variables, and to highlight annual and long-term trends. Over the considered time period, because of the decrease of the dissolved organic carbon (DOC) content of water and of the increase of alkalinity, a significant increase is predicted for the steady-state concentrations of the radicals •OH and CO₃•⁻. Therefore, the photochemical degradation processes that involve the two radical species would be enhanced. Another issue of potential photochemical importance is related to the winter maxima of nitrate (a photochemical •OH source) and the summer maxima of DOC (•OH sink and ³CDOM* source) in the lake water under consideration. From the combination of sunlight irradiance and chemical composition data, one predicts that the processes involving •OH and CO₃•⁻ would be most important in spring, while the reactions involving ³CDOM* would be most important in summer. Copyright © 2011 Elsevier B.V. All rights reserved.

  18. Validation of Slosh Modeling Approach Using STAR-CCM+

    Science.gov (United States)

    Benson, David J.; Ng, Wanyi

    2018-01-01

    Without an adequate understanding of propellant slosh, the spacecraft attitude control system may be inadequate to control the spacecraft or there may be an unexpected loss of science observation time due to higher slosh settling times. Computational fluid dynamics (CFD) is used to model propellant slosh. STAR-CCM+ is a commercially available CFD code. This paper seeks to validate the CFD modeling approach via a comparison between STAR-CCM+ liquid slosh modeling results and experimentally, empirically, and analytically derived results. The geometries examined are a bare right cylinder tank and a right cylinder with a single ring baffle.

  19. Modified multiblock partial least squares path modeling algorithm with backpropagation neural networks approach

    Science.gov (United States)

    Yuniarto, Budi; Kurniawan, Robert

    2017-03-01

    PLS Path Modeling (PLS-PM) differs from covariance-based SEM in that it uses a variance- or component-based approach; PLS-PM is therefore also known as component-based SEM. Multiblock Partial Least Squares (MBPLS) is a method in PLS regression which can be used in PLS Path Modeling, where it is known as Multiblock PLS Path Modeling (MBPLS-PM). This method uses an iterative procedure in its algorithm. This research aims to modify MBPLS-PM with a backpropagation neural network approach. The result is that the MBPLS-PM algorithm can be modified using the backpropagation neural network approach, replacing the iterative backward and forward steps used to obtain the matrices t and u. With this modification, the model parameters obtained are not significantly different from those obtained by the original MBPLS-PM algorithm.

  20. Anthropomorphic Coding of Speech and Audio: A Model Inversion Approach

    Directory of Open Access Journals (Sweden)

    W. Bastiaan Kleijn

    2005-06-01

    Full Text Available Auditory modeling is a well-established methodology that provides insight into human perception and that facilitates the extraction of signal features that are most relevant to the listener. The aim of this paper is to provide a tutorial on perceptual speech and audio coding using an invertible auditory model. In this approach, the audio signal is converted into an auditory representation using an invertible auditory model. The auditory representation is quantized and coded. Upon decoding, it is then transformed back into the acoustic domain. This transformation converts a complex distortion criterion into a simple one, thus facilitating quantization with low complexity. We briefly review past work on auditory models and describe in more detail the components of our invertible model and its inversion procedure, that is, the method to reconstruct the signal from the output of the auditory model. We summarize attempts to use the auditory representation for low-bit-rate coding. Our approach also allows the exploitation of the inherent redundancy of the human auditory system for the purpose of multiple description (joint source-channel coding).

  1. A modal approach to modeling spatially distributed vibration energy dissipation.

    Energy Technology Data Exchange (ETDEWEB)

    Segalman, Daniel Joseph

    2010-08-01

    The nonlinear behavior of mechanical joints is a confounding element in modeling the dynamic response of structures. Though there has been some progress in recent years in modeling individual joints, modeling the full structure with myriad frictional interfaces has remained an obstinate challenge. A strategy is suggested for structural dynamics modeling that can account for the combined effect of interface friction distributed spatially about the structure. This approach accommodates the following observations: (1) At small to modest amplitudes, the nonlinearity of jointed structures is manifest primarily in the energy dissipation - visible as vibration damping; (2) Correspondingly, measured vibration modes do not change significantly with amplitude; and (3) Significant coupling among the modes does not appear to result at modest amplitudes. The mathematical approach presented here postulates the preservation of linear modes and invests all the nonlinearity in the evolution of the modal coordinates. The constitutive form selected is one that works well in modeling spatially discrete joints. When compared against a mathematical truth model, the distributed dissipation approximation performs well.

  2. Public Involvement in Repository Site Selection for Nuclear Waste: Towards a more Dynamic View in Decision-Making Processes

    Energy Technology Data Exchange (ETDEWEB)

    Kruetli, Pius; Stauffacher, Michael; Flueeler, Thomas; Scholz, Roland W. [ETH Zuerich (Switzerland). Inst. for Human-Environment Systems (HES)]

    2006-09-15

    This paper discusses possibilities of public involvement in radioactive waste management. A general overview of the radioactive waste issue is presented referring to a proposed model of the respective decision-making process. Based on the well known participation ladder by Arnstein, we differentiate various intensities of public involvement. A matrix with public involvement and the decision-making process is introduced and three prototypical patterns are discussed. We conclude that time frame, the level of public involvement and the mission have to be considered as well as techniques and the overarching context - all in all, a systematic and dynamic approach for public involvement is needed.

  3. Public Involvement in Repository Site Selection for Nuclear Waste: Towards a more Dynamic View in Decision-Making Processes

    International Nuclear Information System (INIS)

    Kruetli, Pius; Stauffacher, Michael; Flueeler, Thomas; Scholz, Roland W.

    2006-01-01

    This paper discusses possibilities of public involvement in radioactive waste management. A general overview of the radioactive waste issue is presented referring to a proposed model of the respective decision-making process. Based on the well known participation ladder by Arnstein, we differentiate various intensities of public involvement. A matrix with public involvement and the decision-making process is introduced and three prototypical patterns are discussed. We conclude that time frame, the level of public involvement and the mission have to be considered as well as techniques and the overarching context - all in all, a systematic and dynamic approach for public involvement is needed

  4. Consequence Based Design. An approach for integrating computational collaborative models (Integrated Dynamic Models) in the building design phase

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer

    relies on various advancements in the area of integrated dynamic models. It also relies on the application and test of the approach in practice to evaluate the Consequence based design and the use of integrated dynamic models. As a result, the Consequence based design approach has been applied in five...... and define new ways to implement integrated dynamic models for the following project. In parallel, seven different developments of new methods, tools and algorithms have been performed to support the application of the approach. The developments concern: Decision diagrams – to clarify goals and the ability...... affect the design process and collaboration between building designers and simulationists. Within the limits of applying the approach of Consequence based design to five case studies, followed by documentation based on interviews, surveys and project related documentations derived from internal reports...

  5. Simultaneously uncovering the patterns of brain regions involved in different story reading subprocesses.

    Directory of Open Access Journals (Sweden)

    Leila Wehbe

    Full Text Available Story understanding involves many perceptual and cognitive subprocesses, from perceiving individual words, to parsing sentences, to understanding the relationships among the story characters. We present an integrated computational model of reading that incorporates these and additional subprocesses, simultaneously discovering their fMRI signatures. Our model predicts the fMRI activity associated with reading arbitrary text passages, well enough to distinguish which of two story segments is being read with 74% accuracy. This approach is the first to simultaneously track diverse reading subprocesses during complex story processing and predict the detailed neural representation of diverse story features, ranging from visual word properties to the mention of different story characters and different actions they perform. We construct brain representation maps that replicate many results from a wide range of classical studies that each focus on one aspect of language processing, and offer new insights into which types of information are processed by different areas involved in language processing. Additionally, this approach is promising for studying individual differences: it can be used to create single-subject maps that may potentially be used to measure reading comprehension and diagnose reading disorders.

  6. A machine learning approach to identify clinical trials involving nanodrugs and nanodevices from ClinicalTrials.gov.

    Directory of Open Access Journals (Sweden)

    Diana de la Iglesia

    Full Text Available Clinical Trials (CTs) are essential for bridging the gap between experimental research on new drugs and their clinical application. Just like CTs for traditional drugs and biologics have helped accelerate the translation of biomedical findings into medical practice, CTs for nanodrugs and nanodevices could advance novel nanomaterials as agents for diagnosis and therapy. Although there is publicly available information about nanomedicine-related CTs, the online archiving of this information is carried out without adhering to criteria that discriminate between studies involving nanomaterials or nanotechnology-based processes (nano), and CTs that do not involve nanotechnology (non-nano). Finding out whether nanodrugs and nanodevices were involved in a study from CT summaries alone is a challenging task. At the time of writing, CTs archived in the well-known online registry ClinicalTrials.gov are not easily told apart as to whether they are nano or non-nano CTs-even when performed by domain experts, due to the lack of both a common definition for nanotechnology and of standards for reporting nanomedical experiments and results. We propose a supervised learning approach for classifying CT summaries from ClinicalTrials.gov according to whether they fall into the nano or the non-nano categories. Our method involves several stages: i) extraction and manual annotation of CTs as nano vs. non-nano, ii) pre-processing and automatic classification, and iii) performance evaluation using several state-of-the-art classifiers under different transformations of the original dataset. The performance of the best automated classifier closely matches that of experts (AUC over 0.95), suggesting that it is feasible to automatically detect the presence of nanotechnology products in CT summaries with a high degree of accuracy. This can significantly speed up the process of finding whether reports on ClinicalTrials.gov might be relevant to a particular nanoparticle or nanodevice.

  7. A machine learning approach to identify clinical trials involving nanodrugs and nanodevices from ClinicalTrials.gov.

    Science.gov (United States)

    de la Iglesia, Diana; García-Remesal, Miguel; Anguita, Alberto; Muñoz-Mármol, Miguel; Kulikowski, Casimir; Maojo, Víctor

    2014-01-01

    Clinical Trials (CTs) are essential for bridging the gap between experimental research on new drugs and their clinical application. Just like CTs for traditional drugs and biologics have helped accelerate the translation of biomedical findings into medical practice, CTs for nanodrugs and nanodevices could advance novel nanomaterials as agents for diagnosis and therapy. Although there is publicly available information about nanomedicine-related CTs, the online archiving of this information is carried out without adhering to criteria that discriminate between studies involving nanomaterials or nanotechnology-based processes (nano), and CTs that do not involve nanotechnology (non-nano). Finding out whether nanodrugs and nanodevices were involved in a study from CT summaries alone is a challenging task. At the time of writing, CTs archived in the well-known online registry ClinicalTrials.gov are not easily told apart as to whether they are nano or non-nano CTs-even when performed by domain experts, due to the lack of both a common definition for nanotechnology and of standards for reporting nanomedical experiments and results. We propose a supervised learning approach for classifying CT summaries from ClinicalTrials.gov according to whether they fall into the nano or the non-nano categories. Our method involves several stages: i) extraction and manual annotation of CTs as nano vs. non-nano, ii) pre-processing and automatic classification, and iii) performance evaluation using several state-of-the-art classifiers under different transformations of the original dataset. The performance of the best automated classifier closely matches that of experts (AUC over 0.95), suggesting that it is feasible to automatically detect the presence of nanotechnology products in CT summaries with a high degree of accuracy. This can significantly speed up the process of finding whether reports on ClinicalTrials.gov might be relevant to a particular nanoparticle or nanodevice, which is
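
    Neither record specifies the exact feature extraction or classifiers, so the sketch below is only a generic stand-in for the kind of supervised pipeline described: bag-of-words (TF-IDF) features from trial summaries, a linear classifier, and AUC evaluation. The four-sentence corpus is a placeholder, not the annotated ClinicalTrials.gov data set.

```python
# Generic sketch of a supervised nano vs. non-nano text classifier for CT
# summaries: TF-IDF features, logistic regression, and cross-validated AUC.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

summaries = [
    "liposomal doxorubicin nanoparticle formulation for solid tumors",
    "albumin-bound paclitaxel nanoparticles in metastatic breast cancer",
    "standard oral metformin therapy in type 2 diabetes",
    "behavioral counseling intervention for smoking cessation",
] * 10  # repeated only so cross-validation has enough samples
labels = [1, 1, 0, 0] * 10  # 1 = nano, 0 = non-nano

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
auc = cross_val_score(clf, summaries, labels, cv=5, scoring="roc_auc")
print("mean AUC:", auc.mean().round(3))
```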

  8. Cerebellar involvement in metabolic disorders: a pattern-recognition approach

    International Nuclear Information System (INIS)

    Steinlin, M.; Boltshauser, E.; Blaser, S.

    1998-01-01

    Inborn errors of metabolism can affect the cerebellum during development, maturation and later during life. We have established criteria for pattern recognition of cerebellar abnormalities in metabolic disorders. The abnormalities can be divided into four major groups: cerebellar hypoplasia (CH), hyperplasia, cerebellar atrophy (CA), cerebellar white matter abnormalities (WMA) or swelling, and involvement of the dentate nuclei (DN) or cerebellar cortex. CH can be an isolated typical finding, as in adenylsuccinase deficiency, but is also occasionally seen in many other disorders. Differentiation from CH and CA is often difficult, as in carbohydrate deficient glycoprotein syndrome or 2-l-hydroxyglutaric acidaemia. In cases of atrophy the relationship of cerebellar to cerebral atrophy is important. WMA may be diffuse or patchy, frequently predominantly around the DN. Severe swelling of white matter is present during metabolic crisis in maple syrup urine disease. The DN can be affected by metabolite deposition, necrosis, calcification or demyelination. Involvement of cerebellar cortex is seen in infantile neuroaxonal dystrophy. Changes in DN and cerebellar cortex are rather typical and therefore most helpful; additional features should be sought as they are useful in narrowing down the differential diagnosis. (orig.)

  9. Sensitivity and uncertainty analyses for performance assessment modeling

    International Nuclear Information System (INIS)

    Doctor, P.G.

    1988-08-01

    Sensitivity and uncertainty analysis methods for computer models are being applied in performance assessment modeling in the geologic high-level radioactive waste repository program. The models used in performance assessment tend to be complex physical/chemical models with large numbers of input variables. There are two basic approaches to sensitivity and uncertainty analyses: deterministic and statistical. The deterministic approach to sensitivity analysis involves numerical calculation or employs the adjoint form of a partial differential equation to compute partial derivatives; the uncertainty analysis is based on Taylor series expansions of the input variables propagated through the model to compute means and variances of the output variable. The statistical approach to sensitivity analysis involves a response surface approximation to the model with the sensitivity coefficients calculated from the response surface parameters; the uncertainty analysis is based on simulation. The methods each have strengths and weaknesses. 44 refs
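
    The toy example below illustrates the two propagation routes contrasted in the record, applied to a simple nonlinear function that merely stands in for a performance-assessment model: a first-order Taylor-series estimate of the output mean and variance versus a Monte Carlo estimate of the same moments.

```python
# Toy illustration of deterministic (Taylor-series) vs. statistical
# (Monte Carlo) uncertainty propagation through a simple nonlinear function;
# the function is a placeholder, not a repository performance-assessment code.
import numpy as np

def model(x1, x2):
    return np.exp(-x1) * x2 ** 2

mu = np.array([1.0, 2.0])        # means of the input variables
sigma = np.array([0.1, 0.2])     # standard deviations (assumed independent)

# Deterministic route: first-order Taylor series using numerical derivatives.
eps = 1e-6
d1 = (model(mu[0] + eps, mu[1]) - model(mu[0] - eps, mu[1])) / (2 * eps)
d2 = (model(mu[0], mu[1] + eps) - model(mu[0], mu[1] - eps)) / (2 * eps)
taylor_mean = model(*mu)
taylor_var = (d1 * sigma[0]) ** 2 + (d2 * sigma[1]) ** 2

# Statistical route: Monte Carlo simulation of the same moments.
rng = np.random.default_rng(0)
samples = rng.normal(mu, sigma, size=(100_000, 2))
mc = model(samples[:, 0], samples[:, 1])

print("Taylor mean/variance:     ", round(taylor_mean, 4), round(taylor_var, 5))
print("Monte Carlo mean/variance:", round(mc.mean(), 4), round(mc.var(), 5))
```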

  10. Alternative approach to the surface-excitation model

    International Nuclear Information System (INIS)

    Krohn, V.E.

    1981-01-01

    Although the development of the surface-excitation model of sputtered-ion emission involved a detailed description of the ionization process, one can arrive at the same result by assuming an equilibrium treatment, e.g. the Saha-Langmuir equation, with the temperature falling as the collision cascade develops. This suggests that, even if situations are found where the surface-excitation model is successful, it does not follow that the original detailed description of the ionization process is correct. Nevertheless, the surface-excitation model does contain an interesting new idea which should not be overlooked, i.e. that atoms sputtered during the early stages of a collision cascade will be relatively energetic, and to the extent that the Saha-Langmuir equation has some applicability, will have a probability of positive ionization which will be low for atoms of low ionization potential (I < φ), relative to lower-energy atoms emitted during the later stages of the collision cascade. The extended abstract will discuss recent experimental results

  11. Design of laser-generated shockwave experiments. An approach using analytic models

    International Nuclear Information System (INIS)

    Lee, Y.T.; Trainor, R.J.

    1980-01-01

    Two of the target-physics phenomena which must be understood before a clean experiment can be confidently performed are preheating due to suprathermal electrons and shock decay due to a shock-rarefaction interaction. Simple analytic models are described for these two processes and the predictions of these models are compared with those of the LASNEX fluid physics code. We have approached this work not with the view of surpassing or even approaching the reliability of the code calculations, but rather with the aim of providing simple models which may be used for quick parameter-sensitivity evaluations, while providing physical insight into the problems

  12. On Approaches to Analyze the Sensitivity of Simulated Hydrologic Fluxes to Model Parameters in the Community Land Model

    Directory of Open Access Journals (Sweden)

    Jie Bao

    2015-12-01

    Full Text Available Effective sensitivity analysis approaches are needed to identify important parameters or factors and their uncertainties in complex Earth system models composed of multi-phase multi-component phenomena and multiple biogeophysical-biogeochemical processes. In this study, the impacts of 10 hydrologic parameters in the Community Land Model on simulations of runoff and latent heat flux are evaluated using data from a watershed. Different metrics, including residual statistics, the Nash–Sutcliffe coefficient, and log mean square error, are used as alternative measures of the deviations between the simulated and field observed values. Four sensitivity analysis (SA) approaches, including analysis of variance based on the generalized linear model, generalized cross validation based on the multivariate adaptive regression splines model, standardized regression coefficients based on a linear regression model, and analysis of variance based on support vector machine, are investigated. Results suggest that these approaches show consistent measurement of the impacts of major hydrologic parameters on response variables, but with differences in the relative contributions, particularly for the secondary parameters. The convergence behaviors of the SA with respect to the number of sampling points are also examined with different combinations of input parameter sets and output response variables and their alternative metrics. This study helps identify the optimal SA approach, provides guidance for the calibration of the Community Land Model parameters to improve the model simulations of land surface fluxes, and approximates the magnitudes to be adjusted in the parameter values during parametric model optimization.
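
    Of the four SA approaches listed, standardized regression coefficients are the simplest to reproduce; the sketch below computes them from synthetic parameter samples and a synthetic response. The parameter names and the response function are invented for illustration, not Community Land Model quantities.

```python
# Sketch of one of the four sensitivity-analysis approaches named above:
# standardized regression coefficients (SRC) from a linear regression of a
# model response on sampled parameters.
import numpy as np

rng = np.random.default_rng(0)
n = 500
names = ["fdrai", "fover", "fmax", "b_slope"]  # illustrative parameter names
params = rng.uniform(0.0, 1.0, size=(n, len(names)))

# Synthetic "runoff" response: unequal parameter influence plus noise;
# the fourth parameter is deliberately uninfluential.
runoff = (2.0 * params[:, 0] - 1.0 * params[:, 1]
          + 0.3 * params[:, 2] + rng.normal(0, 0.1, n))

# Standardize inputs and output, then fit ordinary least squares;
# the resulting coefficients are the SRCs (comparable across parameters).
Z = (params - params.mean(axis=0)) / params.std(axis=0)
z_out = (runoff - runoff.mean()) / runoff.std()
src, *_ = np.linalg.lstsq(Z, z_out, rcond=None)

for name, coef in sorted(zip(names, src), key=lambda kv: -abs(kv[1])):
    print(f"{name:8s} SRC = {coef:+.3f}")
```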

  13. Geographic and temporal validity of prediction models: Different approaches were useful to examine model performance

    NARCIS (Netherlands)

    P.C. Austin (Peter); D. van Klaveren (David); Y. Vergouwe (Yvonne); D. Nieboer (Daan); D.S. Lee (Douglas); E.W. Steyerberg (Ewout)

    2016-01-01

    Objective: Validation of clinical prediction models traditionally refers to the assessment of model performance in new patients. We studied different approaches to geographic and temporal validation in the setting of multicenter data from two time periods. Study Design and Setting: We

  14. Beyond GLMs: a generative mixture modeling approach to neural system identification.

    Directory of Open Access Journals (Sweden)

    Lucas Theis

    Full Text Available Generalized linear models (GLMs) represent a popular choice for the probabilistic characterization of neural spike responses. While GLMs are attractive for their computational tractability, they also impose strong assumptions and thus only allow for a limited range of stimulus-response relationships to be discovered. Alternative approaches exist that make only very weak assumptions but scale poorly to high-dimensional stimulus spaces. Here we seek an approach which can gracefully interpolate between the two extremes. We extend two frequently used special cases of the GLM, a linear and a quadratic model, by assuming that the spike-triggered and non-spike-triggered distributions can be adequately represented using Gaussian mixtures. Because we derive the model from a generative perspective, its components are easy to interpret as they correspond to, for example, the spike-triggered distribution and the interspike interval distribution. The model is able to capture complex dependencies on high-dimensional stimuli with far fewer parameters than other approaches such as histogram-based methods. The added flexibility comes at the cost of a non-concave log-likelihood. We show that in practice this does not have to be an issue and the mixture-based model is able to outperform generalized linear and quadratic models.
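
    The sketch below illustrates the generative construction described in the abstract, though not the authors' exact parameterization: Gaussian mixtures are fitted to spike-triggered and non-spike-triggered stimulus samples and combined through Bayes' rule to yield a spiking probability. The synthetic stimulus and spike-generation rule are assumptions for illustration.

```python
# Sketch of a generative mixture model for spike prediction: fit one Gaussian
# mixture to spike-triggered stimuli and one to non-spike-triggered stimuli,
# then combine them with the spike prior via Bayes' rule.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
dim = 5
# Synthetic stimuli; spikes are more likely when the first component is large.
stim = rng.normal(size=(5000, dim))
p_true = 1.0 / (1.0 + np.exp(-(2.0 * stim[:, 0] - 1.0)))
spikes = rng.random(5000) < p_true

gmm_spike = GaussianMixture(n_components=3, random_state=0).fit(stim[spikes])
gmm_no = GaussianMixture(n_components=3, random_state=0).fit(stim[~spikes])
prior = spikes.mean()

def p_spike_given_stim(x):
    """Posterior spike probability for stimuli x, shape (n, dim)."""
    log_ps = gmm_spike.score_samples(x) + np.log(prior)
    log_pn = gmm_no.score_samples(x) + np.log(1.0 - prior)
    return 1.0 / (1.0 + np.exp(log_pn - log_ps))

test = rng.normal(size=(5, dim))
print(np.round(p_spike_given_stim(test), 3))
```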

  15. A Model-Driven Approach for Telecommunications Network Services Definition

    Science.gov (United States)

    Chiprianov, Vanea; Kermarrec, Yvon; Alff, Patrick D.

    The present-day telecommunications market imposes a short concept-to-market time for service providers. To reduce it, we propose a computer-aided, model-driven, service-specific tool, with support for collaborative work and for checking properties on models. We started by defining a prototype of the Meta-model (MM) of the service domain. Using this prototype, we defined a simple graphical modeling language specific to service designers. We are currently enlarging the MM of the domain using model transformations from Network Abstractions Layers (NALs). In the future, we will investigate approaches to ensure the support for collaborative work and for checking properties on models.

  16. Portraiture of constructivist parental involvement: A model to develop a community of practice

    Science.gov (United States)

    Dignam, Christopher Anthony

    This qualitative research study addressed the problem of the lack of parental involvement in secondary school science. Increasing parental involvement is vital in supporting student academic achievement and social growth. The purpose of this emergent phenomenological study was to identify conditions required to successfully construct a supportive learning environment to form partnerships between students, parents, and educators. The overall research question in this study investigated the conditions necessary to successfully enlist parental participation with students during science inquiry investigations at the secondary school level. One hundred thirteen pairs of parents and students engaged in a 6-week scientific inquiry activity and recorded attitudinal data in dialogue journals, questionnaires, open-ended surveys, and during one-on-one interviews conducted by the researcher between individual parents and students. Comparisons and cross-interpretations of inter-rater, codified, triangulated data were utilized for identifying emergent themes. Data analysis revealed that the active involvement of parents in researching with their child during inquiry investigations, engaging in journaling, and assessing student performance fostered partnerships among students, parents, and educators and supported students' social skills development. The resulting model, employing constructivist leadership and enlisting parent involvement, provides conditions and strategies required to develop a community of practice that can help effect social change. The active involvement of parents fostered improved efficacy and a holistic mindset to develop in parents, students, and teachers. Based on these findings, the interactive collaboration of parents in science learning activities can proactively facilitate a community of practice that will assist educators in facilitating social change.

  17. A stochastic approach to the derivation of exemption and clearance levels

    International Nuclear Information System (INIS)

    Deckert, A.

    1997-01-01

    Deciding what clearance levels are appropriate for a particular waste stream inherently involves a number of uncertainties. Some of these uncertainties can be quantified using stochastic modeling techniques, which can aid the process of decision making. In this presentation the German approach to dealing with the uncertainties involved in setting clearance levels is addressed. (author)

  18. Effective use of integrated hydrological models in basin-scale water resources management: surrogate modeling approaches

    Science.gov (United States)

    Zheng, Y.; Wu, B.; Wu, X.

    2015-12-01

    Integrated hydrological models (IHMs) consider surface water and subsurface water as a unified system, and have been widely adopted in basin-scale water resources studies. However, due to IHMs' mathematical complexity and high computational cost, it is difficult to implement them in an iterative model evaluation process (e.g., Monte Carlo Simulation, simulation-optimization analysis, etc.), which diminishes their applicability for supporting decision-making in real-world situations. Our studies investigated how to effectively use complex IHMs to address real-world water issues via surrogate modeling. Three surrogate modeling approaches were considered, including 1) DYCORS (DYnamic COordinate search using Response Surface models), a well-established response surface-based optimization algorithm; 2) SOIM (Surrogate-based Optimization for Integrated surface water-groundwater Modeling), a response surface-based optimization algorithm that we developed specifically for IHMs; and 3) Probabilistic Collocation Method (PCM), a stochastic response surface approach. Our investigation was based on a modeling case study in the Heihe River Basin (HRB), China's second largest endorheic river basin. The GSFLOW (Coupled Ground-Water and Surface-Water Flow Model) model was employed. Two decision problems were discussed. One is to optimize, both in time and in space, the conjunctive use of surface water and groundwater for agricultural irrigation in the middle HRB region; and the other is to cost-effectively collect hydrological data based on a data-worth evaluation. Overall, our study results highlight the value of incorporating an IHM in making decisions of water resources management and hydrological data collection. An IHM like GSFLOW can provide great flexibility to formulating proper objective functions and constraints for various optimization problems. On the other hand, it has been demonstrated that surrogate modeling approaches can pave the path for such incorporation in real
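
    The sketch below shows the generic surrogate idea in its simplest form, not DYCORS, SOIM or PCM themselves: a Gaussian-process emulator is fitted to a small design of expensive model runs (an analytic function stands in for GSFLOW) and the optimizer is run on the emulator instead of the expensive model.

```python
# Generic sketch of surrogate-based optimization: fit a cheap emulator to a
# handful of expensive model evaluations and optimize the emulator. The
# "expensive_model" below is an analytic placeholder, not an IHM.
import numpy as np
from scipy.optimize import minimize
from sklearn.gaussian_process import GaussianProcessRegressor

def expensive_model(x):
    """Placeholder for a costly integrated-model run; returns a scalar objective."""
    return (x[0] - 0.3) ** 2 + (x[1] - 0.7) ** 2 + 0.1 * np.sin(5 * x[0])

# 1. A small design of expensive runs.
rng = np.random.default_rng(0)
X_design = rng.uniform(0.0, 1.0, size=(30, 2))
y_design = np.array([expensive_model(x) for x in X_design])

# 2. Fit the surrogate (Gaussian-process emulator).
surrogate = GaussianProcessRegressor(normalize_y=True).fit(X_design, y_design)

# 3. Optimize the surrogate instead of the expensive model.
def surrogate_objective(x):
    return float(surrogate.predict(x.reshape(1, -1))[0])

res = minimize(surrogate_objective, x0=np.array([0.5, 0.5]),
               bounds=[(0.0, 1.0), (0.0, 1.0)])
print("surrogate optimum:", np.round(res.x, 3),
      "true value there:", round(expensive_model(res.x), 4))
```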

  19. Modelling diversity in building occupant behaviour: a novel statistical approach

    DEFF Research Database (Denmark)

    Haldi, Frédéric; Calì, Davide; Andersen, Rune Korsholm

    2016-01-01

    We propose an advanced modelling framework to predict the scope and effects of behavioural diversity regarding building occupant actions on window openings, shading devices and lighting. We develop a statistical approach based on generalised linear mixed models to account for the longitudinal nat...

  20. Multiphysics modeling using COMSOL a first principles approach

    CERN Document Server

    Pryor, Roger W

    2011-01-01

    Multiphysics Modeling Using COMSOL rapidly introduces the senior level undergraduate, graduate or professional scientist or engineer to the art and science of computerized modeling for physical systems and devices. It offers a step-by-step modeling methodology through examples that are linked to the Fundamental Laws of Physics through a First Principles Analysis approach. The text explores a breadth of multiphysics models in coordinate systems that range from 1D to 3D and introduces the readers to the numerical analysis modeling techniques employed in the COMSOL Multiphysics software. After readers have built and run the examples, they will have a much firmer understanding of the concepts, skills, and benefits acquired from the use of computerized modeling techniques to solve their current technological problems and to explore new areas of application for their particular technological areas of interest.