WorldWideScience

Sample records for modelling methodology australia

  1. A case study in data audit and modelling methodology-Australia

    Energy Technology Data Exchange (ETDEWEB)

    Apelbaum, John [Apelbaum Consulting Group, 750 Blackburn Road, Melbourne VIC 3170 (Australia)

    2009-10-15

    The purpose of the paper is to outline a rigorous, spatially consistent and cost-effective transport planning tool that projects travel demand, energy and emissions for all modes associated with domestic and international transport. The planning tool (Aus-e-Tran) is a multi-modal, multi-fuel and multi-regional macroeconomic and demographic-based computational model of the Australian transport sector that overcomes some of the gaps associated with existing strategic level transport emission models. The paper also identifies a number of key data issues that need to be resolved prior to model development with particular reference to the Australian environment. The strategic model structure endogenously derives transport demand, energy and emissions by jurisdiction, vehicle type, emission type and transport service for both freight and passenger transport. Importantly, the analytical framework delineates the national transport task, energy consumed and emissions according to region, state/territory of origin and jurisdictional protocols, provides an audit mechanism for the evaluation of the methodological framework, integrates a mathematical protocol to derive time series FFC emission factors and allows for the impact of non-registered road vehicles on transport, fuel and emissions. (author)

  2. A case study in data audit and modelling methodology-Australia

    International Nuclear Information System (INIS)

    Apelbaum, John

    2009-01-01

    The purpose of the paper is to outline a rigorous, spatially consistent and cost-effective transport planning tool that projects travel demand, energy and emissions for all modes associated with domestic and international transport. The planning tool (Aus-e-Tran) is a multi-modal, multi-fuel and multi-regional macroeconomic and demographic-based computational model of the Australian transport sector that overcomes some of the gaps associated with existing strategic level transport emission models. The paper also identifies a number of key data issues that need to be resolved prior to model development with particular reference to the Australian environment. The strategic model structure endogenously derives transport demand, energy and emissions by jurisdiction, vehicle type, emission type and transport service for both freight and passenger transport. Importantly, the analytical framework delineates the national transport task, energy consumed and emissions according to region, state/territory of origin and jurisdictional protocols, provides an audit mechanism for the evaluation of the methodological framework, integrates a mathematical protocol to derive time series FFC emission factors and allows for the impact of non-registered road vehicles on transport, fuel and emissions.

  4. Archetype modeling methodology.

    Science.gov (United States)

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although there exist many experiences about using archetypes in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can be also a reference for CIMs development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  5. Towards a Residential Air-Conditioner Usage Model for Australia

    Directory of Open Access Journals (Sweden)

    Mark Goldsworthy

    2017-08-01

    Realistic models of occupant behaviour in relation to air-conditioner (a/c) use are fundamentally important for developing accurate building energy simulation tools. In Australia and elsewhere, such simulation tools are inextricably bound both in legislation and in the design of new technology, electricity infrastructure and regulatory schemes. An increasing number of studies in the literature confirm just how important occupants are in determining overall energy consumption, but obtaining the data on which to build behaviour models is a non-trivial task. Here data is presented on air-conditioner usage derived from three different types of case study analyses. These are: (i) use of aggregate energy consumption data coupled with weather, demographic and building statistics across Australia to estimate key predictors of energy use at the aggregate level; (ii) use of survey data to determine characteristic a/c switch on/off behaviours and usage frequencies; and (iii) use of detailed household level sub-circuit monitoring from 140 households to determine a/c switch on/off probabilities and their dependence on different building and occupant parameters. These case studies are used to assess the difficulties associated with translation of different forms of individual, aggregate and survey based information into a/c behaviour simulation models. Finally a method of linking the data gathering methodologies with the model development is suggested. This method would combine whole-of-house “smart”-meter data measurements with linked targeted occupant surveying.
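
    As an illustration of the kind of switch-on model case study (iii) points to, the following is a minimal Python sketch, not the authors' implementation: it fits a logistic switch-on probability against a single hypothetical outdoor-temperature predictor drawn from synthetic data.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Synthetic stand-in for sub-circuit monitoring data (hypothetical).
        rng = np.random.default_rng(0)
        temp = rng.uniform(15, 40, 1000)             # outdoor temperature (deg C)
        p_on = 1 / (1 + np.exp(-(temp - 30) / 2))    # assumed "true" switch-on curve
        ac_on = rng.binomial(1, p_on)                # 1 = a/c switched on

        # Switch-on probability model with temperature as the sole predictor.
        model = LogisticRegression().fit(temp.reshape(-1, 1), ac_on)
        print(model.predict_proba([[35.0]])[0, 1])   # estimated P(a/c on) at 35 deg C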

  6. Chapter three: methodology of exposure modeling

    CSIR Research Space (South Africa)

    Moschandreas, DJ

    2002-12-01

    Methodologies and models are reviewed. Three exposure/measurement methodologies are assessed. Estimation methods focus on source evaluation and attribution; sources include those outdoors and indoors as well as in occupational and in-transit environments. Fate...

  7. Operations management research methodologies using quantitative modeling

    NARCIS (Netherlands)

    Bertrand, J.W.M.; Fransoo, J.C.

    2002-01-01

    Gives an overview of quantitative model-based research in operations management, focusing on research methodology. Distinguishes between empirical and axiomatic research, and furthermore between descriptive and normative research. Presents guidelines for doing quantitative model-based research in operations management.

  8. Methodology of simulation of underground working in metal mines. Application to a uranium deposit in Australia

    International Nuclear Information System (INIS)

    Deraisme, J.; de Fouquet, C.; Fraisse, H.

    1983-01-01

    For the Ben Lomond (Northern Queensland, Australia) underground uranium mining project, studies were carried out to compare the feasibility of different mining methods according to their cost per ton and selectivity, i.e. cut and fill, sublevel stoping, and a combination of both. First, a geostatistical orebody model was built. The ore grade variability of this model results from the drillhole structural analysis. Working on two-dimensional vertical cross sections, the usual hand-drawn stope reserve estimate obtained with computer-assisted design for each of the three mining methods is compared with the results obtained with automatic algorithms adapted to the characteristics of each mining method. These algorithms use mathematical morphology to reproduce the geometrical constraints associated with each mining method and/or dynamic programming. These techniques lead to fully automatic generation of optimal economic stope designs. The comparison is positive: automatic stope designs are in agreement with hand-made drawings, but they can be produced faster through interactive questioning of the computer, and the total maximum profit obtained is at least as high as the best profit found through hand-designed projects [fr]

  9. Modelling the Balassa-Samuelson Effect in Australia

    Directory of Open Access Journals (Sweden)

    Khorshed Chowdhury

    2011-03-01

    This paper examines the Balassa-Samuelson hypothesis in Australia using the ARDL cointegration framework. Evidence was found of a significant long-run relationship between the real exchange rate and the Australia-US productivity differential during the period 1950-2003. The results indicate that a one per cent increase in labour productivity in Australia relative to the US will lead to a 5.6 per cent appreciation in the real exchange rate of Australia. The estimated coefficient for the error correction term is -0.1983 and is highly significant, indicating that the deviation from the long-term real exchange rate equilibrium path is corrected by nearly 20 per cent over the following year. The author suspects that the elasticity coefficient is “overestimated” due to the exclusion of relevant explanatory variables in the analytical model. The real exchange rate movements are affected by real fundamentals and policy-induced shifts in its real fundamentals. The fundamentals include the terms of trade, government expenditure, real interest rate differentials and net foreign liabilities, among others, along with the labour productivity differential.
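
    A short worked interpretation of the reported error-correction coefficient, assuming the usual partial-adjustment reading of an ARDL error-correction term:

        import math

        ect = -0.1983                                  # coefficient reported above
        print(f"deviation corrected per year: {abs(ect):.1%}")     # ~19.8%
        # Years for half of a deviation from equilibrium to decay away:
        half_life = math.log(0.5) / math.log(1 + ect)
        print(f"half-life of a deviation: {half_life:.1f} years")  # ~3.1 years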

  10. Theories, Models and Methodology in Writing Research

    NARCIS (Netherlands)

    Rijlaarsdam, Gert; Bergh, van den Huub; Couzijn, Michel

    1996-01-01

    Theories, Models and Methodology in Writing Research describes the current state of the art in research on written text production. The chapters in the first part offer contributions to the creation of new theories and models for writing processes. The second part examines specific elements of the

  11. A model for an inland port in Australia

    Directory of Open Access Journals (Sweden)

    K. T.K. Toh

    2008-11-01

    This paper examines the role of an inland port particular to the outer regions of Melbourne, Australia. In this study, the broad use of terminology in the Melbourne context proved to be a stumbling block, and this provided the impetus for the development of an unambiguous model of an inland port. It is clear from international examples that such a development acts as a significant potential nucleus for regional economic growth, but the lack of a facilitated discussion is an impediment. The model is offered as a facilitator and a useful tool in the construction of a common understanding.

  12. Predictive modelling of Ross River virus notifications in southeastern Australia.

    Science.gov (United States)

    Cutcher, Z; Williamson, E; Lynch, S E; Rowe, S; Clothier, H J; Firestone, S M

    2017-02-01

    Ross River virus (RRV) is a mosquito-borne virus endemic to Australia. The disease, marked by arthritis, myalgia and rash, has a complex epidemiology involving several mosquito species and wildlife reservoirs. Outbreak years coincide with climatic conditions conducive to mosquito population growth. We developed regression models for human RRV notifications in the Mildura Local Government Area, Victoria, Australia with the objective of increasing understanding of the relationships in this complex system, providing trigger points for intervention and developing a forecast model. Surveillance, climatic, environmental and entomological data for the period July 2000-June 2011 were used for model training; forecasts were then validated for July 2011-June 2015. Rainfall and vapour pressure were the key factors for forecasting RRV notifications. Validation of the models showed they predicted RRV counts with an accuracy of 81%. Two major RRV mosquito vectors (Culex annulirostris and Aedes camptorhynchus) were important in the final estimation model at proximal lags. The findings of this analysis advance understanding of the drivers of RRV in temperate climatic zones and the models will inform public health agencies of periods of increased risk.
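
    The abstract does not give the model equations; as a sketch only, a count regression of the general family commonly used for notification data (negative binomial, with a lagged rainfall covariate) can be set up with statsmodels on synthetic data. Variable names, lags and coefficients here are hypothetical.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 120                                        # months of synthetic surveillance
        df = pd.DataFrame({"rain": rng.gamma(2.0, 20.0, n),
                           "vap": rng.normal(15.0, 3.0, n)})
        df["rain_lag2"] = df["rain"].shift(2).fillna(df["rain"].mean())  # proximal lag
        mu = np.exp(0.5 + 0.01 * df["rain_lag2"] + 0.03 * df["vap"])
        df["cases"] = rng.poisson(mu)                  # synthetic notification counts

        X = sm.add_constant(df[["rain_lag2", "vap"]])
        fit = sm.GLM(df["cases"], X, family=sm.families.NegativeBinomial()).fit()
        print(fit.params)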

  13. Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2014-01-01

    This book includes extended and revised versions of a set of selected papers from the 2012 International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2012) which was sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and held in Rome, Italy. SIMULTECH 2012 was technically co-sponsored by the Society for Modeling & Simulation International (SCS), GDR I3, Lionphant Simulation, Simulation Team and IFIP and held in cooperation with AIS Special Interest Group of Modeling and Simulation (AIS SIGMAS) and the Movimento Italiano Modellazione e Simulazione (MIMOS).

  14. Bayesian methodology for reliability model acceptance

    International Nuclear Information System (INIS)

    Zhang Ruoxue; Mahadevan, Sankaran

    2003-01-01

    This paper develops a methodology to assess the reliability computation model validity using the concept of Bayesian hypothesis testing, by comparing the model prediction and experimental observation, when there is only one computational model available to evaluate system behavior. Time-independent and time-dependent problems are investigated, with consideration of both cases: with and without statistical uncertainty in the model. The case of time-independent failure probability prediction with no statistical uncertainty is a straightforward application of Bayesian hypothesis testing. However, for the life prediction (time-dependent reliability) problem, a new methodology is developed in this paper to make the same Bayesian hypothesis testing concept applicable. With the existence of statistical uncertainty in the model, in addition to the application of a predictor estimator of the Bayes factor, the uncertainty in the Bayes factor is explicitly quantified through treating it as a random variable and calculating the probability that it exceeds a specified value. The developed method provides a rational criterion to decision-makers for the acceptance or rejection of the computational model
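
    A minimal sketch of the underlying idea, not the paper's full treatment: under a normal error model, a Bayes factor compares the likelihood of the observation given the model prediction against a data-centred alternative. The numbers below are placeholders.

        from scipy.stats import norm

        pred, obs, sigma = 0.95, 0.93, 0.05   # prediction, observation, noise (hypothetical)
        like_model = norm.pdf(obs, loc=pred, scale=sigma)   # H0: model is valid
        like_alt = norm.pdf(obs, loc=obs, scale=sigma)      # H1: data-centred alternative
        bf = like_model / like_alt
        print(f"Bayes factor = {bf:.2f}")     # larger values favour accepting the model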

  15. A methodology for PSA model validation

    International Nuclear Information System (INIS)

    Unwin, S.D.

    1995-09-01

    This document reports Phase 2 of work undertaken by Science Applications International Corporation (SAIC) in support of the Atomic Energy Control Board's Probabilistic Safety Assessment (PSA) review. A methodology is presented for the systematic review and evaluation of a PSA model. These methods are intended to support consideration of the following question: To within the scope and depth of modeling resolution of a PSA study, is the resultant model a complete and accurate representation of the subject plant? This question was identified as a key PSA validation issue in SAIC's Phase 1 project. The validation methods are based on a model transformation process devised to enhance the transparency of the modeling assumptions. Through conversion to a 'success-oriented' framework, a closer correspondence to plant design and operational specifications is achieved. This can both enhance the scrutability of the model by plant personnel, and provide an alternative perspective on the model that may assist in the identification of deficiencies. The model transformation process is defined and applied to fault trees documented in the Darlington Probabilistic Safety Evaluation. A tentative real-time process is outlined for implementation and documentation of a PSA review based on the proposed methods. (author). 11 figs., 9 tabs., 30 refs.
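
    At its core, the 'success-oriented' transformation is the logical dual of the failure logic. A tiny self-contained sketch (a hypothetical three-component system, not a Darlington fault tree) shows the conversion via De Morgan's laws:

        from itertools import product

        # Failure logic: TOP fails if A fails OR both B and C fail.
        def top_fails(a, b, c):
            return a or (b and c)

        # Success-oriented dual: TOP succeeds if A works AND (B works OR C works).
        def top_succeeds(a_ok, b_ok, c_ok):
            return a_ok and (b_ok or c_ok)

        # The two formulations agree on every component state:
        assert all(top_fails(a, b, c) == (not top_succeeds(not a, not b, not c))
                   for a, b, c in product([False, True], repeat=3))
        print("success-oriented dual verified")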

  16. A methodology for acquiring qualitative knowledge for probabilistic graphical models

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders L.

    2004-01-01

    We present a practical and general methodology that simplifies the task of acquiring and formulating qualitative knowledge for constructing probabilistic graphical models (PGMs). The methodology efficiently captures and communicates expert knowledge, and has significantly eased the model...

  17. A methodology for modeling regional terrorism risk.

    Science.gov (United States)

    Chatterjee, Samrat; Abkowitz, Mark D

    2011-07-01

    Over the past decade, terrorism risk has become a prominent consideration in protecting the well-being of individuals and organizations. More recently, there has been interest in not only quantifying terrorism risk, but also placing it in the context of an all-hazards environment in which consideration is given to accidents and natural hazards, as well as intentional acts. This article discusses the development of a regional terrorism risk assessment model designed for this purpose. The approach taken is to model terrorism risk as a dependent variable, expressed in expected annual monetary terms, as a function of attributes of population concentration and critical infrastructure. This allows for an assessment of regional terrorism risk in and of itself, as well as in relation to man-made accident and natural hazard risks, so that mitigation resources can be allocated in an effective manner. The adopted methodology incorporates elements of two terrorism risk modeling approaches (event-based models and risk indicators), producing results that can be utilized at various jurisdictional levels. The validity, strengths, and limitations of the model are discussed in the context of a case study application within the United States. © 2011 Society for Risk Analysis.
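
    The dependent-variable form described (risk as expected annual monetary loss) reduces to a product of likelihood and consequence. A sketch with placeholder numbers, not the study's values:

        def expected_annual_loss(p_attack, p_success, consequence_usd):
            """Risk = annual attack likelihood x success probability x consequence."""
            return p_attack * p_success * consequence_usd

        # Hypothetical region: 2% annual attack likelihood, 30% success, $500M loss.
        print(f"${expected_annual_loss(0.02, 0.30, 5e8):,.0f} per year")  # $3,000,000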

  18. Methodology, models and algorithms in thermographic diagnostics

    CERN Document Server

    Živčák, Jozef; Madarász, Ladislav; Rudas, Imre J

    2013-01-01

    This book presents the methodology and techniques of thermographic applications, with a focus primarily on medical thermography implemented for parametrizing the diagnostics of the human body. The first part of the book describes the basics of infrared thermography, the possibilities of thermographic diagnostics and the physical nature of thermography. The second half includes tools of intelligent engineering applied to the solving of selected applications and projects. Thermographic diagnostics was applied to the problems of paraplegia, tetraplegia and carpal tunnel syndrome (CTS). The results of the research activities were created with the cooperation of four projects within the Ministry of Education, Science, Research and Sport of the Slovak Republic entitled Digital control of complex systems with two degrees of freedom, Progressive methods of education in the area of control and modeling of complex object oriented systems on aircraft turbocompressor engines, Center for research of control of te...

  19. Developing a savanna burning emissions abatement methodology for tussock grasslands in high rainfall regions of northern Australia

    Directory of Open Access Journals (Sweden)

    Jeremy Russell-Smith

    2014-06-01

    Fire-prone tropical savanna and grassland systems are a significant source of atmospheric emissions of greenhouse gases. In recent years, substantial research has been directed towards developing accounting methodologies for savanna burning emissions to be applied in Australia’s National Greenhouse Gas Inventory, as well as for commercial carbon trading purposes. That work has focused on woody savanna systems. Here, we extend the methodological approach to include tussock grasslands and associated Melaleuca-dominated open woodlands (<10% foliage cover) in higher rainfall (>1,000 mm/annum) regions of northern Australia. Field assessments under dry season conditions focused on deriving fuel accumulation, fire patchiness and combustion relationships for key fuel types: fine fuels − grass and litter; coarse woody fuels − twigs <6 mm diameter; heavy woody fuels − >6 mm diameter; and shrubs. In contrast with previous savanna burning assessments, fire treatments undertaken under early dry season burning conditions resulted in negligible patchiness and very substantial consumption of fine fuels. In effect, burning in the early dry season provides no benefits in greenhouse gas emissions, and emissions reductions in tussock grasslands can be achieved only through reducing the extent of burning. The practical implications of reduced burning in higher rainfall northern Australian grassland systems are discussed, indicating that there are significant constraints, including infrastructural, cultural and woody thickening issues. Similar opportunities and constraints are observed in other international contexts, but especially project implementation challenges associated with legislative, political and governance issues.
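
    Savanna burning accounting of this kind rests on the standard relation: emissions = area burnt x fuel load x patchiness x combustion factor x emission factor. A sketch with placeholder coefficients (not the calibrated values from this work) shows why negligible patchiness and near-complete fine-fuel consumption leave little early-season abatement:

        def burning_emissions_t(area_ha, fuel_t_ha, patchiness, combustion, emission_factor):
            """Emissions (t) from burning one fuel class over a given area."""
            return area_ha * fuel_t_ha * patchiness * combustion * emission_factor

        # Early vs late dry season for a fine-fuel class (hypothetical numbers):
        eds = burning_emissions_t(1000, 4.0, 1.0, 0.95, 0.0035)
        lds = burning_emissions_t(1000, 4.0, 1.0, 0.98, 0.0035)
        print(eds, lds)   # nearly equal -> abatement must come from burning less area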

  20. Methodological Developments in Geophysical Assimilation Modeling

    Science.gov (United States)

    Christakos, George

    2005-06-01

    This work presents recent methodological developments in geophysical assimilation research. We revisit the meaning of the term "solution" of a mathematical model representing a geophysical system, and we examine its operational formulations. We argue that an assimilation solution based on epistemic cognition (which assumes that the model describes incomplete knowledge about nature and focuses on conceptual mechanisms of scientific thinking) could lead to more realistic representations of the geophysical situation than a conventional ontologic assimilation solution (which assumes that the model describes nature as is and focuses on form manipulations). Conceptually, the two approaches are fundamentally different. Unlike the reasoning structure of conventional assimilation modeling that is based mainly on ad hoc technical schemes, the epistemic cognition approach is based on teleologic criteria and stochastic adaptation principles. In this way some key ideas are introduced that could open new areas of geophysical assimilation to detailed understanding in an integrated manner. A knowledge synthesis framework can provide the rational means for assimilating a variety of knowledge bases (general and site specific) that are relevant to the geophysical system of interest. Epistemic cognition-based assimilation techniques can produce a realistic representation of the geophysical system, provide a rigorous assessment of the uncertainty sources, and generate informative predictions across space-time. The mathematics of epistemic assimilation involves a powerful and versatile spatiotemporal random field theory that imposes no restriction on the shape of the probability distributions or the form of the predictors (non-Gaussian distributions, multiple-point statistics, and nonlinear models are automatically incorporated) and accounts rigorously for the uncertainty features of the geophysical system. In the epistemic cognition context the assimilation concept may be used to

  1. Model evaluation methodology applicable to environmental assessment models

    International Nuclear Information System (INIS)

    Shaeffer, D.L.

    1979-08-01

    A model evaluation methodology is presented to provide a systematic framework within which the adequacy of environmental assessment models might be examined. The necessity for such a tool is motivated by the widespread use of models for predicting the environmental consequences of various human activities and by the reliance on these model predictions for deciding whether a particular activity requires the deployment of costly control measures. Consequently, the uncertainty associated with prediction must be established for the use of such models. The methodology presented here consists of six major tasks: model examination, algorithm examination, data evaluation, sensitivity analyses, validation studies, and code comparison. This methodology is presented in the form of a flowchart to show the logical interrelatedness of the various tasks. Emphasis has been placed on identifying those parameters which are most important in determining the predictive outputs of a model. Importance has been attached to the process of collecting quality data. A method has been developed for analyzing multiplicative chain models when the input parameters are statistically independent and lognormally distributed. Latin hypercube sampling has been offered as a promising candidate for doing sensitivity analyses. Several different ways of viewing the validity of a model have been presented. Criteria are presented for selecting models for environmental assessment purposes
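
    As a sketch of two of the named tasks (a multiplicative chain model with independent lognormal inputs, sampled by Latin hypercube), assuming illustrative distribution parameters rather than any real assessment model:

        import numpy as np
        from scipy.stats import qmc, lognorm

        sampler = qmc.LatinHypercube(d=3, seed=0)
        u = sampler.random(n=1000)                    # stratified uniforms in [0,1)^3
        factors = [lognorm(s=0.5, scale=1.0),         # e.g. source term
                   lognorm(s=0.3, scale=2.0),         # e.g. transfer coefficient
                   lognorm(s=0.8, scale=0.1)]         # e.g. dose conversion factor
        x = np.column_stack([f.ppf(u[:, i]) for i, f in enumerate(factors)])
        output = x.prod(axis=1)                       # multiplicative chain prediction
        print(np.percentile(output, [5, 50, 95]))     # spread of predicted output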

  2. A methodology for spectral wave model evaluation

    Science.gov (United States)

    Siqueira, S. A.; Edwards, K. L.; Rogers, W. E.

    2017-12-01

    ... climate, omitting the energy in the frequency band between the two lower limits tested can lead to an incomplete characterization of model performance. This methodology was developed to aid in selecting a comparison frequency range that does not needlessly increase computational expense and does not exclude energy to the detriment of model performance analysis.

  3. Modeling, methodologies and tools for molecular and nano-scale communications modeling, methodologies and tools

    CERN Document Server

    Nakano, Tadashi; Moore, Michael

    2017-01-01

    (Preliminary) The book presents the state of the art in the emerging field of molecular and nanoscale communication. It gives special attention to fundamental models, and advanced methodologies and tools used in the field. It covers a wide range of applications, e.g. nanomedicine, nanorobot communication, bioremediation and environmental management. It addresses advanced graduate students, academics and professionals working at the forefront in their fields and at the interfaces between different areas of research, such as engineering, computer science, biology and nanotechnology.

  4. Interplay wellbeing framework: a collaborative methodology 'bringing together stories and numbers' to quantify Aboriginal cultural values in remote Australia.

    Science.gov (United States)

    Cairney, Sheree; Abbott, Tammy; Quinn, Stephen; Yamaguchi, Jessica; Wilson, Byron; Wakerman, John

    2017-05-03

    Wellbeing has been difficult to understand, measure and strengthen for Aboriginal people in remote Australia. Part of the challenge has been genuinely involving community members and incorporating their values and priorities into assessment and policy. Taking a 'shared space' collaborative approach between remote Aboriginal communities, governments and scientists, we merged Aboriginal knowledge with western science - by bringing together stories and numbers. This research aims to statistically validate the holistic Interplay Wellbeing Framework and Survey that bring together Aboriginal-identified priorities of culture, empowerment and community with government priorities including education, employment and health. Quantitative survey data were collected from a cohort of 842 Aboriginal people aged 15-34 years, recruited from four different Aboriginal communities in remote Australia. Aboriginal community researchers designed and administered the survey. Structural equation modeling showed good fit statistics (χ²/df = 2.69, CFI = 0.95 and RMSEA = 0.045), confirming the holistic nature of the Interplay Wellbeing Framework. The strongest direct impacts on wellbeing were 'social and emotional wellbeing' (r = 0.23; p < ...). The Interplay Wellbeing Framework and Survey were statistically validated as a collaborative approach to assessing wellbeing that is inclusive of other cultural worldviews, values and practices. New community-derived social and cultural indicators were established, contributing valuable insight to psychometric assessment across cultures. These analyses confirm that culture, empowerment and community play key roles in the interplay with education, employment and health, as part of a holistic and quantifiable system of wellbeing. This research supports the holistic concept of wellbeing, confirming that everything is interrelated and needs to be considered at the 'whole of system' level in policy approaches.
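
    For readers unfamiliar with the fit statistics quoted above, the sketch below shows how χ²/df, CFI and RMSEA are defined; the chi-square inputs are placeholders back-solved to reproduce the reported values, not the study's actual matrices.

        import math

        def fit_indices(chi2, df, chi2_null, df_null, n):
            """chi2/df ratio, CFI and RMSEA from model and null-model chi-squares."""
            cfi = 1 - max(chi2 - df, 0) / max(chi2_null - df_null, chi2 - df, 1e-9)
            rmsea = math.sqrt(max(chi2 - df, 0) / (df * (n - 1)))
            return chi2 / df, cfi, rmsea

        print(fit_indices(chi2=269.0, df=100, chi2_null=3500.0, df_null=120, n=842))
        # -> (2.69, ~0.95, ~0.045), matching the values reported above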

  5. Assessing trends in observed and modelled climate extremes over Australia in relation to future projections

    International Nuclear Information System (INIS)

    Alexander, Lisa

    2007-01-01

    Nine global coupled climate models were assessed for their ability to reproduce observed trends in a set of indices representing temperature and precipitation extremes over Australia. Observed trends for 1957-1999 were compared with individual and multi-modelled trends calculated over the same period. When averaged across Australia, the magnitude of trends and interannual variability of temperature extremes were well simulated by most models, particularly for the warm nights index. Except for consecutive dry days, the majority of models also reproduced the correct sign of trend for precipitation extremes. A bootstrapping technique was used to show that most models produce plausible trends when averaged over Australia, although only heavy precipitation days simulated from the multi-model ensemble showed significant skill at reproducing the observed spatial pattern of trends. Two of the models with output from different forcings showed that only with anthropogenic forcing included could the models capture the observed areally averaged trend for some of the temperature indices, but the forcing made little difference to the models' ability to reproduce the spatial pattern of trends over Australia. Future projected changes in extremes using three emissions scenarios were also analysed. Australia shows a shift towards significant warming of temperature extremes, with much longer dry spells interspersed with periods of increased extreme precipitation, irrespective of the scenario used. More work is required to determine whether regional projected changes over Australia are robust.
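
    A resampling check in the spirit of the bootstrapping technique mentioned can be sketched as follows (synthetic annual index series and a permutation-style null, purely illustrative):

        import numpy as np

        rng = np.random.default_rng(0)
        years = np.arange(1957, 2000)
        index = 0.02 * (years - years[0]) + rng.normal(0, 0.5, years.size)

        obs_trend = np.polyfit(years, index, 1)[0]            # observed linear trend
        null = [np.polyfit(years, rng.permutation(index), 1)[0] for _ in range(2000)]
        p = np.mean(np.abs(null) >= abs(obs_trend))           # two-sided p-value
        print(f"trend {obs_trend:.4f}/yr, resampling p = {p:.3f}")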

  6. Methodology for Modeling and Analysis of Business Processes (MMABP)

    Directory of Open Access Journals (Sweden)

    Vaclav Repa

    2015-10-01

    This paper introduces a methodology for modeling business processes. The creation of the methodology is described in terms of the Design Science Method. First, the gap in contemporary business process modeling approaches is identified, and general modeling principles that can fill the gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is then described. The most critical identified points of business process modeling are process states, process hierarchy and the granularity of process description. The methodology has been evaluated by use in a real project. Using examples from this project, the main methodology features are explained, together with the significant problems that were met during the project. Drawing on these problems, together with the results of the methodology evaluation, the needed future development of the methodology is outlined.

  7. Verification of Fault Tree Models with RBDGG Methodology

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2010-01-01

    Currently, fault tree analysis is widely used in the field of probabilistic safety assessment (PSA) of nuclear power plants (NPPs). To guarantee the correctness of fault tree models, which are usually constructed manually by analysts, a review by other analysts is widely used for verifying constructed fault tree models. Recently, an extension of the reliability block diagram was developed, named RBDGG (reliability block diagram with general gates). The advantage of the RBDGG methodology is that the structure of an RBDGG model is very similar to the actual structure of the analyzed system; therefore, the modeling of a system for system reliability and unavailability analysis becomes very intuitive and easy. The main idea behind the development of the RBDGG methodology is similar to that behind the development of the RGGG (Reliability Graph with General Gates) methodology. The difference is that the RBDGG methodology focuses on block failures, while the RGGG methodology focuses on connection line failures. It is also known, however, that an RGGG model can be converted into an RBDGG model and vice versa. In this paper, a new method for the verification of constructed fault tree models using the RBDGG methodology is proposed and demonstrated.

  8. A mass shooting at Port Arthur, Tasmania, Australia: a study of its impact on early pregnancy losses using a conception time-based methodology.

    Science.gov (United States)

    Dean, R G; Dean, J; Heller, G Z; Leader, L R

    2015-11-01

    Does an acute calamity in a community cause early miscarriage, and is this association the same for male and female fetuses? Estimated losses of 29.5% of first trimester pregnancies in the affected region could be associated with an acute calamity, with no statistically significant difference in estimated losses by fetal sex. There are very few studies on the impact of a calamity on early pregnancy loss and its differential effects on male and female fetuses. A decline in the human sex ratio at birth associated with the events of 9/11 in New York has been documented. This is a retrospective descriptive study of birth register data in Tasmania, Australia, from 1991 to 1997, covering the period in which the calamity occurred. The register contains data on all pregnancies that proceeded to >20 weeks gestation. The conception date was calculated by subtracting gestational age from birth date. We estimated that 40 318 pregnancies were conceived in the period 1991-1996 inclusive. These were aggregated to 4-weekly blocks classified by region and sex. The acute calamity was at Port Arthur, Tasmania, Australia. On 28 April 1996, a gunman opened fire on visitors and staff in a tourist cafe. A very stressful 20 h period ended with 35 people dead and 22 injured. A negative binomial regression model was used to assess the association between this calamity and pregnancy loss. This loss is evidenced by a shortfall in the registration of pregnancies that were in their first trimester at the time of the calamity. We estimated a shortfall of 29.5%, or 229 registered pregnancies, among those in the first trimester at the time of the calamity (P < ...). The use of postcode as a surrogate for geographic area or space assumes that the mother has not moved into the postcode area after the calamity and before the reporting of a birth. The results of this study suggest that calamities bring about significant pregnancy loss affecting both sexes. The methodology presented of inferring conception date from birth date and using ...
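
    The core of the conception-time methodology (inferring conception date by subtracting gestational age from birth date) is simple enough to state in a few lines; the example dates are hypothetical:

        from datetime import date, timedelta

        def conception_date(birth: date, gestation_weeks: float) -> date:
            """Infer conception date by subtracting gestational age from birth date."""
            return birth - timedelta(weeks=gestation_weeks)

        # A birth on 1997-01-20 at 38 weeks gestation maps to:
        print(conception_date(date(1997, 1, 20), 38))   # 1996-04-29, near the calamity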

  9. Methodological Approach for Modeling of Multienzyme in-pot Processes

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia; Roman Martinez, Alicia; Sin, Gürkan

    2011-01-01

    This paper presents a methodological approach for modeling multi-enzyme in-pot processes. The methodology is exemplified stepwise through the bi-enzymatic production of N-acetyl-D-neuraminic acid (Neu5Ac) from N-acetyl-D-glucosamine (GlcNAc). In this case study, sensitivity analysis is also used ...

  10. A Practical, Robust Methodology for Acquiring New Observation Data Using Computationally Expensive Groundwater Models

    Science.gov (United States)

    Siade, Adam J.; Hall, Joel; Karelse, Robert N.

    2017-11-01

    Regional groundwater flow models play an important role in decision making regarding water resources; however, the uncertainty embedded in model parameters and model assumptions can significantly hinder the reliability of model predictions. One way to reduce this uncertainty is to collect new observation data from the field. However, determining where and when to obtain such data is not straightforward. A number of data-worth and experimental design strategies have been developed for this purpose. However, these studies often ignore issues related to real-world groundwater models, such as computational expense, existing observation data and high parameter dimensionality. In this study, we propose a methodology, based on existing methods and software, to efficiently conduct such analyses for large-scale, complex regional groundwater flow systems for which there is a wealth of available observation data. The method utilizes the well-established d-optimality criterion and the minimax criterion for robust sampling strategies. The so-called Null-Space Monte Carlo method is used to reduce the computational burden associated with uncertainty quantification, and a heuristic methodology based on the concept of the greedy algorithm is proposed for developing robust designs with subsets of the posterior parameter samples. The proposed methodology is tested on a synthetic regional groundwater model, and subsequently applied to an existing, complex, regional groundwater system in the Perth region of Western Australia. The results indicate that robust designs can be obtained efficiently, within reasonable computational resources, for making regional decisions regarding groundwater level sampling.
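
    A minimal sketch of a greedy maximin (minimax-style) selection of the kind described, assuming a precomputed matrix worth[i, j] giving the data worth of candidate location i under posterior parameter sample j; the numbers are hypothetical, not Null-Space Monte Carlo output.

        import numpy as np

        rng = np.random.default_rng(0)
        worth = rng.random((20, 50))          # 20 candidate locations x 50 samples

        chosen, remaining = [], set(range(worth.shape[0]))
        for _ in range(5):                    # build a 5-location design greedily
            # Pick the candidate whose worst case over samples is best (maximin).
            best = max(remaining,
                       key=lambda i: np.min(worth[chosen + [i]].sum(axis=0)))
            chosen.append(best)
            remaining.remove(best)
        print(chosen)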

  11. Locally Simple Models Construction: Methodology and Practice

    Directory of Open Access Journals (Sweden)

    I. A. Kazakov

    2017-12-01

    One of the most notable trends associated with the Fourth Industrial Revolution is the significantly strengthened role of semantic methods: they are engaged in artificial intelligence, knowledge mining in huge flows of big data, robotization, and the internet of things. Smart contracts can also be mentioned here, although the 'intelligence' of smart contracts still needs serious elaboration. These trends should inevitably lead to an increased role for logical methods working with semantics, and significantly expand the scope of their practical application. However, a number of problems hinder this process. We are developing an approach which makes the application of logical modeling efficient in some important areas. The approach is based on the concept of locally simple models and is primarily focused on solving management tasks in enterprises, organizations and governing bodies. The most important feature of locally simple models is their ability to replace software systems. Replacing programming with modeling gives huge advantages; for instance, it dramatically reduces development and support costs. Modeling, unlike programming, preserves the explicit semantics of models, allowing integration with artificial intelligence and robots. In addition, models are much more understandable to ordinary people than programs. In this paper we propose an implementation of the concept of locally simple modeling on the basis of so-called document models, which we have developed earlier. It is shown that locally simple modeling is realized through document models with finite submodel coverages. In the second part of the paper, an example of using document models to solve a management problem of real complexity is demonstrated.

  12. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    Alvin, Kenneth F.; Oberkampf, William L.; Rutherford, Brian M.; Diegert, Kathleen V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  13. Agent-based Modeling Methodology for Analyzing Weapons Systems

    Science.gov (United States)

    2015-03-26

    ... technique involve model structure, system representation and the degree of validity, coupled with the simplicity, of the overall model. ABM is best suited ... system representation of the air combat system. We feel that a simulation model that combines ABM with equation-based representation of weapons and ... [thesis by Casey D. Connors, Major, USA]

  14. Downscaling an Eddy-Resolving Global Model for the Continental Shelf off South Eastern Australia

    Science.gov (United States)

    Roughan, M.; Baird, M.; MacDonald, H.; Oke, P.

    2008-12-01

    The Australian Bluelink collaboration between CSIRO, the Bureau of Meteorology and the Royal Australian Navy has made available to the research community the output of BODAS (Bluelink ocean data assimilation system), an ensemble optimal interpolation reanalysis system with ~10 km resolution around Australia. Within the Bluelink project, BODAS fields are assimilated into a dynamic ocean model of the same resolution to produce BRAN (BlueLink ReANalysis, a hindcast of water properties around Australia from 1992 to 2004). In this study, BODAS hydrographic fields are assimilated into a ~ 3 km resolution Princeton Ocean Model (POM) configuration of the coastal ocean off SE Australia. Experiments were undertaken to establish the optimal strength and duration of the assimilation of BODAS fields into the 3 km resolution POM configuration for the purpose of producing hindcasts of ocean state. It is shown that the resultant downscaling of Bluelink products is better able to reproduce coastal features, particularly velocities and hydrography over the continental shelf off south eastern Australia. The BODAS-POM modelling system is used to provide a high-resolution simulation of the East Australian Current over the period 1992 to 2004. One of the applications that we will present is an investigation of the seasonal and inter-annual variability in the dispersion of passive particles in the East Australian Current. The practical outcome is an estimate of the connectivity of estuaries along the coast of southeast Australia, which is relevant for the dispersion of marine pests.

  15. Modeling Methodologies for Representing Urban Cultural Geographies in Stability Operations

    National Research Council Canada - National Science Library

    Ferris, Todd P

    2008-01-01

    ... 2.0.0, in an effort to provide modeling methodologies for a single simulation tool capable of exploring the complex world of urban cultural geographies undergoing Stability Operations in an irregular warfare (IW) environment...

  16. Model-driven software migration a methodology

    CERN Document Server

    Wagner, Christian

    2014-01-01

    Today, reliable software systems are the basis of any business or company. The continuous further development of those systems is the central component of software evolution. It requires a huge amount of time and manpower, as well as financial resources. The challenges are the size, seniority and heterogeneity of those software systems. Christian Wagner addresses software evolution: the inherent problems and uncertainties in the process. He presents a model-driven method which leads to a synchronization between source code and design. As a result, the model layer will be the central part in further evolution.

  17. Efficient Modelling Methodology for Reconfigurable Underwater Robots

    DEFF Research Database (Denmark)

    Nielsen, Mikkel Cornelius; Blanke, Mogens; Schjølberg, Ingrid

    2016-01-01

    This paper considers the challenge of applying reconfigurable robots in an underwater environment. The main result presented is the development of a model for a system comprised of N, possibly heterogeneous, robots dynamically connected to each other and moving with 6 Degrees of Freedom (DOF). ...

  18. Which bank? A guardian model for regulation of embryonic stem cell research in Australia.

    Science.gov (United States)

    McLennan, A

    2007-08-01

    In late 2005 the Legislation Review: Prohibition of Human Cloning Act 2002 (Cth) and the Research Involving Human Embryos Act 2002 (Cth) recommended the establishment of an Australian stem cell bank. This article aims to address a lack of discussion of issues surrounding stem cell banking by suggesting possible answers to the questions of whether Australia should establish a stem cell bank and what its underlying philosophy and functions should be. Answers are developed through an analysis of regulatory, scientific and intellectual property issues relating to embryonic stem cell research in the United Kingdom, United States and Australia. This includes a detailed analysis of the United Kingdom Stem Cell Bank. It is argued that a "guardian" model stem cell bank should be established in Australia. This bank would aim to promote the maximum public benefit from human embryonic stem cell research by providing careful regulatory oversight and addressing ethical issues, while also facilitating research by addressing practical scientific concerns and intellectual property issues.

  19. A Stochastic Growth Model with Income Tax Evasion: Implications for Australia

    OpenAIRE

    Ratbek Dzhumashev; Emin Gahramanov

    2009-01-01

    In this paper we develop a stochastic endogenous growth model augmented with income tax evasion. Our model avoids some existing discrepancies between empirical evidence and theoretical predictions of traditional tax evasion models. Further, we show that: i) productive government expenditures play an important role in affecting the economy's tax evasion rate; ii) the average marginal income tax rate in Australia comes close to the optimal; and iii) the phenomenon of tax evasion is not an excuse for...

  20. Australia is Facing a Housing Affordability Crisis: Is the Solution to this Problem the Singapore Model of Housing?

    Directory of Open Access Journals (Sweden)

    John McLaren

    2016-12-01

    Australia is pricing young buyers out of the housing market. Unfortunately, debt-free home ownership in the retirement years is a key part of the Australian welfare system. This paper provides one possible solution to the current housing predicament of Australia. In doing so, the paper examines the housing strategy in Singapore, where residents are provided with accommodation at a reasonable cost. This strategy is examined and translated for use in Australia. In conclusion, the paper proposes that the Singapore model of home ownership is worthy of consideration by the government of Australia.

  1. Geologic modeling in risk assessment methodology for radioactive waste management

    International Nuclear Information System (INIS)

    Logan, S.E.; Berbano, M.C.

    1977-01-01

    Under contract to the U.S. Environmental Protection Agency (EPA), the University of New Mexico is developing a computer-based assessment methodology for evaluating public health and environmental impacts from the disposal of radioactive waste in geologic formations. The methodology incorporates a release or fault tree model, an environmental model, and an economic model. The release model and its application to a model repository in bedded salt are described. Fault trees are constructed to provide the relationships between various geologic and man-caused events which are potential mechanisms for release of radioactive material beyond the immediate environs of the repository. The environmental model includes: 1) the transport to and accumulations at various receptors in the biosphere, 2) pathways from these environmental concentrations, and 3) radiation dose to man. Finally, economic results are used to compare and assess various disposal configurations as a basis for formulating ...

  2. A satellite and model based flood inundation climatology of Australia

    Science.gov (United States)

    Schumann, G.; Andreadis, K.; Castillo, C. J.

    2013-12-01

    To date there is no coherent and consistent database on observed or simulated flood event inundation and magnitude at large scales (continental to global). The only compiled data set showing a consistent history of flood inundation area and extent at a near global scale is provided by the MODIS-based Dartmouth Flood Observatory. However, MODIS satellite imagery is only available from 2000 and is hampered by a number of issues associated with flood mapping using optical images (e.g. classification algorithms, cloud cover, vegetation). Here, we present for the first time a proof-of-concept study in which we employ a computationally efficient 2-D hydrodynamic model (LISFLOOD-FP) complemented with a sub-grid channel formulation to generate a complete flood inundation climatology of the past 40 years (1973-2012) for the entire Australian continent. The model was built completely from freely available SRTM-derived data, including channel widths, bank heights and floodplain topography, which was corrected for vegetation canopy height using a global ICESat canopy dataset. Channel hydraulics were resolved using actual channel data and bathymetry was estimated within the model using hydraulic geometry. On the floodplain, the model simulated the flow paths and inundation variables at a 1 km resolution. The developed model was run over a period of 40 years and a floodplain inundation climatology was generated and compared to satellite flood event observations. Our proof-of-concept study demonstrates that this type of model can reliably simulate past flood events with reasonable accuracies both in time and space. The Australian model was forced with both observed flow climatology and VIC-simulated flows in order to assess the feasibility of a model-based flood inundation climatology at the global scale.
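
    The bathymetry closure mentioned (depth estimated "within the model using hydraulic geometry") is typically a power law of discharge; a sketch with illustrative coefficients, not the study's calibrated values:

        def channel_depth_m(discharge_m3s, a=0.27, b=0.39):
            """At-a-station hydraulic geometry: depth ~ a * Q**b (illustrative a, b)."""
            return a * discharge_m3s ** b

        for q in (10, 100, 1000):                     # discharge in m^3/s
            print(q, round(channel_depth_m(q), 2))    # depth grows sublinearly with Q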

  3. Validating agent oriented methodology (AOM) for netlogo modelling and simulation

    Science.gov (United States)

    WaiShiang, Cheah; Nissom, Shane; YeeWai, Sim; Sharbini, Hamizan

    2017-10-01

    AOM (Agent Oriented Modeling) is a comprehensive and unified agent methodology for agent oriented software development. The AOM methodology was proposed to aid developers by introducing technique, terminology, notation and guidelines during agent systems development. Although the AOM methodology is claimed to be capable of developing complex real world systems, its potential is yet to be realized and recognized by the mainstream software community, and the adoption of AOM is still in its infancy. Among the reasons is that there are not many case studies or success stories for AOM. This paper presents two case studies on the adoption of AOM for individual-based modelling and simulation. It demonstrates how AOM is useful for epidemiology and ecological studies. Hence, it further validates AOM in a qualitative manner.

  4. A model of adaptation of overseas nurses: exploring the experiences of Japanese nurses working in Australia.

    Science.gov (United States)

    Kishi, Yuka; Inoue, Kumiyo; Crookes, Patrick; Shorten, Allison

    2014-04-01

    The purpose of the study was to investigate the experiences of Japanese nurses and their adaptation to their work environment in Australia. Using a qualitative research method and semistructured interviews, the study aimed to discover, describe, and analyze the experiences of 14 Japanese nurses. Fourteen Japanese registered nurses working in Australian hospitals participated in the study. Individual semistructured interviews were conducted from April to June 2008. Thematic analysis was used to identify themes within the data. Analysis of qualitative open-ended questions revealed the participants' adaptation process. It consists of three themes or phases: seeking (S), acclimatizing (A), and settling (S), subsequently named the S.A.S. model. The conceptual model of the adaptation processes of the 14 Japanese nurses working in Australia includes the seeking, acclimatizing, and settling phases. Although these phases are not mutually exclusive and the process is not necessarily uniformly linear, all participants in this study passed through the phases of the S.A.S. model in order to adapt to their new environment. The S.A.S. model of adaptation helps to describe the experiences of Japanese overseas-qualified nurses working in Australian hospitals. Future research is needed to examine whether this model can be applied to nurses from other countries and in other settings outside Australia.

  5. K-Means Subject Matter Expert Refined Topic Model Methodology

    Science.gov (United States)

    2017-01-01

    ... computing environment, the Visual Basic for Applications (VBA) programming language presents the option as our programming language of choice. We propose ... background, or access to other computational programming environments, to build topic models from free-text datasets using a familiar Excel-based ... environment that restricts access to other software-based text analytic tools. Opportunities to deploy developmental versions of the methodology and ...
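
    The report targets an Excel/VBA environment; as a language-neutral sketch of the same k-means topic-model core (TF-IDF vectors clustered, with top terms per cluster offered to subject matter experts as candidate topic labels), here is a Python version on a toy corpus:

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.cluster import KMeans

        docs = ["radar maintenance log", "engine oil maintenance report",
                "radar calibration drift notes", "engine fuel consumption report"]
        vec = TfidfVectorizer()
        X = vec.fit_transform(docs)                       # TF-IDF document vectors
        km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

        terms = vec.get_feature_names_out()
        for c in range(2):                                # top terms = draft topic labels
            top = km.cluster_centers_[c].argsort()[::-1][:3]
            print(c, [terms[i] for i in top])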

  6. Effect of model resolution on a regional climate model simulation over southeast Australia

    KAUST Repository

    Evans, J. P.; McCabe, Matthew

    2013-01-01

    Dynamically downscaling climate projections from global climate models (GCMs) for use in impacts and adaptation research has become a common practice in recent years. In this study, the CSIRO Mk3.5 GCM is downscaled using the Weather Research and Forecasting (WRF) regional climate model (RCM) to medium (50 km) and high (10 km) resolution over southeast Australia. The influence of model resolution on the present-day (1985 to 2009) modelled regional climate and projected future (2075 to 2099) changes are examined for both mean climate and extreme precipitation characteristics. Increasing model resolution tended to improve the simulation of present day climate, with larger improvements in areas affected by mountains and coastlines. Examination of circumstances under which increasing the resolution decreased performance revealed an error in the GCM circulation, the effects of which had been masked by the coarse GCM topography. Resolution modifications to projected changes were largest in regions with strong topographic and coastline influences, and can be large enough to change the sign of the climate change projected by the GCM. Known physical mechanisms for these changes included orographic uplift and low-level blocking of air-masses caused by mountains. In terms of precipitation extremes, the GCM projects increases in extremes even when the projected change in the mean was a decrease: but this was not always true for the higher resolution models. Thus, while the higher resolution RCM climate projections often concur with the GCM projections, there are times and places where they differ significantly due to their better representation of physical processes. It should also be noted that the model resolution can modify precipitation characteristics beyond just its mean value.

  7. Effect of model resolution on a regional climate model simulation over southeast Australia

    KAUST Repository

    Evans, J. P.

    2013-03-26

    Dynamically downscaling climate projections from global climate models (GCMs) for use in impacts and adaptation research has become a common practice in recent years. In this study, the CSIRO Mk3.5 GCM is downscaled using the Weather Research and Forecasting (WRF) regional climate model (RCM) to medium (50 km) and high (10 km) resolution over southeast Australia. The influence of model resolution on the present-day (1985 to 2009) modelled regional climate and on the projected future (2075 to 2099) changes is examined for both mean climate and extreme precipitation characteristics. Increasing model resolution tended to improve the simulation of present-day climate, with larger improvements in areas affected by mountains and coastlines. Examination of circumstances under which increasing the resolution decreased performance revealed an error in the GCM circulation, the effects of which had been masked by the coarse GCM topography. Resolution-induced modifications to the projected changes were largest in regions with strong topographic and coastline influences, and can be large enough to change the sign of the climate change projected by the GCM. Known physical mechanisms for these changes include orographic uplift and low-level blocking of air masses caused by mountains. In terms of precipitation extremes, the GCM projects increases in extremes even where the projected change in the mean is a decrease; this was not always true for the higher-resolution models. Thus, while the higher-resolution RCM climate projections often concur with the GCM projections, there are times and places where they differ significantly due to their better representation of physical processes. It should also be noted that model resolution can modify precipitation characteristics beyond just the mean value.

  8. Methodology of modeling and measuring computer architectures for plasma simulations

    Science.gov (United States)

    Wang, L. P. T.

    1977-01-01

    A brief introduction to plasma simulation using computers and to the difficulties encountered on currently available computers is given. Through the use of an analysis and measurement methodology, SARA, the control flow and data flow of the particle simulation model REM2-1/2D are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization can be configured to achieve the computation bound of an application problem. A sequential simulation model, an array/pipeline simulation model, and a fully parallel simulation model of the code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have an implicitly parallel nature.

  9. Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving.

    Science.gov (United States)

    Elfring, Jos; Appeldoorn, Rein; van den Dries, Sjoerd; Kwakkernaat, Maurice

    2016-10-11

    The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle's surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture.

  10. Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving

    Directory of Open Access Journals (Sweden)

    Jos Elfring

    2016-10-01

    Full Text Available The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle’s surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture.
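
    The two records above describe the fusion methodology at the architecture level. One recurring building block in such systems is combining redundant measurements by inverse-variance weighting, i.e. the scalar Kalman measurement update. A minimal sketch with hypothetical sensor readings, not taken from the paper:

        # Fuse two noisy sensor readings of the same quantity with
        # inverse-variance weighting (the scalar Kalman measurement update).
        def fuse(estimates):
            """estimates: list of (value, variance) pairs from different sensors."""
            w = [1.0 / var for _, var in estimates]
            value = sum(v * wi for (v, _), wi in zip(estimates, w)) / sum(w)
            variance = 1.0 / sum(w)
            return value, variance

        # e.g. radar says an obstacle is 10.2 m away, lidar says 9.8 m
        print(fuse([(10.2, 0.5**2), (9.8, 0.2**2)]))

    The fused estimate leans towards the lower-variance sensor and always has a smaller variance than either input, which is what makes adding sensors worthwhile.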

  11. A Methodology to Assess Ionospheric Models for GNSS

    Science.gov (United States)

    Rovira-Garcia, Adria; Juan, José Miguel; Sanz, Jaume; González-Casado, Guillermo; Ibánez, Deimos

    2015-04-01

    Testing the accuracy of the ionospheric models used in the Global Navigation Satellite System (GNSS) is a long-standing issue. It is still a challenging problem due to the lack of slant ionospheric determinations accurate enough to be used as a reference. The present study proposes a methodology to assess any ionospheric model used in satellite-based applications and, in particular, GNSS ionospheric models. The methodology complements other analyses comparing the navigation performance based on different models used to correct the code and carrier-phase observations. Specifically, the following ionospheric models are assessed: the operational models broadcast in the Global Positioning System (GPS), Galileo and the European Geostationary Navigation Overlay Service (EGNOS), the post-process Global Ionospheric Maps (GIMs) from different analysis centers belonging to the International GNSS Service (IGS) and, finally, a new GIM computed by the gAGE/UPC research group. The methodology is based on the comparison of the predictions of the ionospheric model with actual unambiguous carrier-phase measurements from a global distribution of permanent receivers. The differences are then separated into hardware delays (a receiver constant plus a satellite constant) per data interval, e.g., a day. The condition that these Differential Code Biases (DCBs) are commonly shared throughout the worldwide network of receivers and satellites gives the assessment a global character. This approach generalizes simple tests based on double-differenced Slant Total Electron Contents (STECs) between pairs of satellites and receivers on a much more local scale. The present study was conducted over the entire year 2014, i.e., the last solar maximum. The seasonal and latitudinal structures of the results clearly reflect the different strategies used by the different models. On one hand, ionospheric model corrections based on a grid (IGS-GIMs or EGNOS) are shown to be several times better than the models
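
    The bias-separation step can be sketched with synthetic numbers: each model-minus-measurement difference is modelled as one receiver constant plus one satellite constant, and the rank deficiency is removed by fixing a reference satellite. A minimal NumPy sketch; all values are invented, not real STEC data:

        # Estimate per-receiver and per-satellite hardware biases from
        # (model - carrier-phase) STEC differences by least squares.
        import numpy as np

        n_rec, n_sat = 3, 4
        rng = np.random.default_rng(0)
        rec_bias = rng.normal(0.0, 2.0, n_rec)
        sat_bias = rng.normal(0.0, 2.0, n_sat)
        sat_bias -= sat_bias[0]              # convention: satellite 0 bias = 0

        rows, rhs = [], []
        for r in range(n_rec):
            for s in range(n_sat):
                a = np.zeros(n_rec + n_sat)
                a[r] = a[n_rec + s] = 1.0
                rows.append(a)
                rhs.append(rec_bias[r] + sat_bias[s] + rng.normal(0.0, 0.1))

        # Drop satellite 0's column to remove the rank deficiency, then solve.
        A = np.delete(np.array(rows), n_rec, axis=1)
        x, *_ = np.linalg.lstsq(A, np.array(rhs), rcond=None)
        print("receiver biases:", np.round(x[:n_rec], 2))
        print("satellite biases 1..3:", np.round(x[n_rec:], 2))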

  12. An experimental methodology for a fuzzy set preference model

    Science.gov (United States)

    Turksen, I. B.; Willson, Ian A.

    1992-01-01

    A flexible fuzzy set preference model first requires approximate methodologies for implementation. Fuzzy sets must be defined for each individual consumer using computer software, requiring a minimum of time and expertise on the part of the consumer. The amount of information needed in defining sets must also be established. The model itself must adapt fully to the subject's choice of attributes (vague or precise), attribute levels, and importance weights. The resulting individual-level model should be fully adapted to each consumer. The methodologies needed to develop this model will be equally useful in a new generation of intelligent systems which interact with ordinary consumers, controlling electronic devices through fuzzy expert systems or making recommendations based on a variety of inputs. The power of personal computers and their acceptance by consumers has yet to be fully utilized to create interactive knowledge systems that fully adapt their function to the user. Understanding individual consumer preferences is critical to the design of new products and the estimation of demand (market share) for existing products, which in turn is an input to management systems concerned with production and distribution. The question of what to make, for whom to make it and how much to make requires an understanding of the customer's preferences and the trade-offs that exist between alternatives. Conjoint analysis is a widely used methodology which decomposes an overall preference for an object into a combination of preferences for its constituent parts (attributes such as taste and price), which are combined using an appropriate combination function. Preferences are often expressed using linguistic terms which cannot be represented in conjoint models. Current models are also not implemented at an individual level, making it difficult to reach meaningful conclusions about the cause of an individual's behavior from an aggregate model. The combination of complex aggregate
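
    A minimal sketch of the kind of computation such a model performs, with invented triangular membership functions and importance weights standing in for the consumer-defined fuzzy sets:

        # Score one product profile with fuzzy attribute preferences.
        def tri(x, a, b, c):
            """Triangular membership function rising from a, peaking at b, falling to c."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        # "cheap" and "tasty" as fuzzy sets over price ($) and taste rating (0-10);
        # membership functions and weights are hypothetical illustrations.
        prefs = {
            "price": (lambda x: tri(x, 0, 2, 5), 0.6),   # (membership, weight)
            "taste": (lambda x: tri(x, 4, 9, 10), 0.4),
        }
        profile = {"price": 3.0, "taste": 7.5}
        score = sum(mf(profile[k]) * w for k, (mf, w) in prefs.items())
        print(round(score, 3))   # overall fuzzy preference in [0, 1]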

  13. Modelling seasonal habitat suitability for wide-ranging species: Invasive wild pigs in northern Australia.

    Directory of Open Access Journals (Sweden)

    Jens G Froese

    Full Text Available Invasive wildlife often causes serious damage to the economy and agriculture as well as environmental, human and animal health. Habitat models can fill knowledge gaps about species distributions and assist planning to mitigate impacts. Yet, model accuracy and utility may be compromised by small study areas and limited integration of species ecology or temporal variability. Here we modelled seasonal habitat suitability for wild pigs, a widespread and harmful invader, in northern Australia. We developed a resource-based, spatially-explicit and regional-scale approach using Bayesian networks and spatial pattern suitability analysis. We integrated important ecological factors such as variability in environmental conditions, breeding requirements and home range movements. The habitat model was parameterized during a structured, iterative expert elicitation process and applied to a wet season and a dry season scenario. Model performance and uncertainty were evaluated against independent distributional data sets. Validation results showed that an expert-averaged model accurately predicted empirical wild pig presences in northern Australia for both seasonal scenarios. Model uncertainty was largely associated with different expert assumptions about wild pigs' resource-seeking home range movements. Habitat suitability varied considerably between seasons, retracting to resource-abundant rainforest, wetland and agricultural refuge areas during the dry season and expanding widely into surrounding grassland floodplains, savanna woodlands and coastal shrubs during the wet season. Overall, our model suggested that suitable wild pig habitat is less widely available in northern Australia than previously thought. Mapped results may be used to quantify impacts, assess risks, justify management investments and target control activities. Our methods are applicable to other wide-ranging species, especially in data-poor situations.
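
    The Bayesian-network core of such an approach can be conveyed with a toy example: seasonal priors on water and food availability feeding a small conditional probability table for habitat suitability. All probabilities below are invented for illustration, not the study's expert-elicited values:

        # Toy Bayesian-network flavour: P(suitable habitat) from water and food
        # nodes whose priors shift with season.
        priors = {
            "wet": {"water": 0.9, "food": 0.8},
            "dry": {"water": 0.4, "food": 0.5},
        }
        # P(suitable | water, food) as a conditional probability table
        cpt = {(True, True): 0.9, (True, False): 0.5,
               (False, True): 0.4, (False, False): 0.05}

        def p_suitable(season):
            pw, pf = priors[season]["water"], priors[season]["food"]
            return sum(cpt[(w, f)]
                       * (pw if w else 1 - pw) * (pf if f else 1 - pf)
                       for w in (True, False) for f in (True, False))

        for season in ("wet", "dry"):
            print(season, round(p_suitable(season), 3))

    Summing out the parent nodes in this way is exactly what a full Bayesian-network engine does at scale, with many more nodes and expert-derived tables.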

  14. Organizational information assets classification model and security architecture methodology

    Directory of Open Access Journals (Sweden)

    Mostafa Tamtaji

    2015-12-01

    Full Text Available Today, organizations are exposed to a huge volume and diversity of information and information assets, produced in different systems such as KMS, financial and accounting systems, office and industrial automation systems, and so on, and protection of this information is necessary. Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released. The several benefits of this model mean that organizations show a strong trend towards implementing cloud computing, while maintaining and managing information security remain the main challenges in developing and accepting the model. In this paper, at first, according to the "design science research methodology" and compatible with the "design process in information systems research", a complete categorization of organizational assets, comprising 355 different types of information assets in 7 groups and 3 levels, is presented so that managers are able to plan corresponding security controls according to the importance of each group. Then, to direct the organization in architecting its information security in a cloud computing environment, an appropriate methodology is presented. The presented cloud computing security architecture, resulting from the proposed methodology, and the presented classification model were discussed and verified according to the Delphi method and experts' comments.

  15. Methodologies for Quantitative Systems Pharmacology (QSP) Models: Design and Estimation.

    Science.gov (United States)

    Ribba, B; Grimm, H P; Agoram, B; Davies, M R; Gadkar, K; Niederer, S; van Riel, N; Timmis, J; van der Graaf, P H

    2017-08-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early Development to focus discussions on two critical methodological aspects of QSP model development: optimal structural granularity and parameter estimation. Here we report, in a perspective article, a summary of the presentations and discussions. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  16. The Role of Integrated Modelling and Assessment for Decision-Making: Lessons from Water Allocation Issues in Australia

    Science.gov (United States)

    Jakeman, A. J.; Guillaume, J. H. A.; El Sawah, S.; Hamilton, S.

    2014-12-01

    Integrated modelling and assessment (IMA) is best regarded as a process that can support environmental decision-making when issues are strongly contested and uncertainties pervasive. To be most useful, the process must be multi-dimensional and phased. Principally, it must be tailored to the problem context to encompass diverse issues of concern, management settings and stakeholders. This in turn requires the integration of multiple processes and components of natural and human systems and their corresponding spatial and temporal scales. Modellers therefore need to be able to integrate multiple disciplines, methods, models, tools and data, and many sources and types of uncertainty. These dimensions are incorporated into iteration between the various phases of the IMA process, including scoping, problem framing and formulation, assessing options and communicating findings. Two case studies in Australia are employed to share the lessons of how integration can be achieved in these IMA phases using a mix of stakeholder participation processes and modelling tools. One case study aims to improve the relevance of modelling by incorporating stakeholders' views of irrigated viticulture and water management decision making. It used a novel methodology with the acronym ICTAM, consisting of Interviews to elicit mental models, Cognitive maps to represent and analyse individual and group mental models, Time-sequence diagrams to chronologically structure the decision making process, an All-encompassing conceptual model, and computational Models of stakeholder decision making. The second case uses a hydro-economic river network model to examine basin-wide impacts of water allocation cuts and adoption of farm innovations. The knowledge exchange approach used in each case was designed to integrate data and knowledge, bearing in mind the contextual dimensions of the problem at hand and the specific contributions that environmental modelling was thought to be able to make.

  17. Teaching methodology for modeling reference evapotranspiration with artificial neural networks

    OpenAIRE

    Martí, Pau; Pulido Calvo, Inmaculada; Gutiérrez Estrada, Juan Carlos

    2015-01-01

    [EN] Artificial neural networks are a robust alternative to conventional models for estimating different targets in irrigation engineering, among others, reference evapotranspiration, a key variable for estimating crop water requirements. This paper presents a didactic methodology for introducing students to the application of artificial neural networks for reference evapotranspiration estimation using MATLAB. Apart from learning a specific application of this software wi...

  18. Evaluation of alternative model-data fusion approaches in water balance estimation across Australia

    Science.gov (United States)

    van Dijk, A. I. J. M.; Renzullo, L. J.

    2009-04-01

    Australia's national agencies are developing a continental modelling system to provide a range of water information services. It will include rolling water balance estimation to underpin national water accounts, water resources assessments that interpret current water resources availability and trends in a historical context, and water resources predictions coupled to climate and weather forecasting. The nation-wide coverage, currency, accuracy, and consistency required mean that remote sensing will need to play an important role along with in-situ observations. Different approaches to blending models and observations can be considered. Integration of on-ground and remote sensing data into land surface models in atmospheric applications often involves state updating through model-data assimilation techniques. By comparison, retrospective water balance estimation and hydrological scenario modelling to date have mostly relied on static parameter fitting against observations and have made little use of earth observation. The model-data fusion approach most appropriate for a continental water balance estimation system will need to consider the trade-off between computational overhead and the accuracy gains achieved when using more sophisticated synthesis techniques and additional observations. This trade-off was investigated using a landscape hydrological model and satellite-based estimates of soil moisture and vegetation properties for several gauged test catchments in southeast Australia.

  19. Mixed-mode modelling mixing methodologies for organisational intervention

    CERN Document Server

    Clarke, Steve; Lehaney, Brian

    2001-01-01

    The 1980s and 1990s have seen a growing interest in research and practice in the use of methodologies within problem contexts characterised by a primary focus on technology, human issues, or power. During the last five to ten years, this has given rise to challenges regarding the ability of a single methodology to address all such contexts, and the consequent development of approaches which aim to mix methodologies within a single problem situation. This has been particularly so where the situation has called for a mix of technological (the so-called 'hard') and human-centred (so-called 'soft') methods. The approach developed has been termed mixed-mode modelling. The area of mixed-mode modelling is relatively new, with the phrase being coined approximately four years ago by Brian Lehaney in a keynote paper published at the 1996 Annual Conference of the UK Operational Research Society. Mixed-mode modelling, as suggested above, is a new way of considering problem situations faced by organisations. Traditional...

  20. Evaluation of radioxenon releases in Australia using atmospheric dispersion modelling tools

    International Nuclear Information System (INIS)

    Tinker, Rick; Orr, Blake; Grzechnik, Marcus; Hoffmann, Emmy; Saey, Paul; Solomon, Stephen

    2010-01-01

    The origin of a series of atmospheric radioxenon events detected at the Comprehensive Test Ban Treaty Organisation (CTBTO) International Monitoring System site in Melbourne, Australia, between November 2008 and February 2009 was investigated. Backward tracking analyses indicated that the events were consistent with releases associated with hot commissioning testing of the Australian Nuclear Science and Technology Organisation (ANSTO) radiopharmaceutical production facility in Sydney, Australia. Forward dispersion analyses were used to estimate release magnitudes and transport times. The estimated 133Xe release magnitude of the largest event (between 0.2 and 34 TBq over a 2-day window) was in close agreement with the stack emission releases estimated by the facility for this time period (between 0.5 and 2 TBq). Modelling of irradiation conditions and theoretical radioxenon emission rates was undertaken and provided further evidence that the Melbourne detections originated from this radiopharmaceutical production facility. These findings do not have public health implications. This is the first comprehensive study of atmospheric radioxenon measurements and releases in Australia.
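
    The forward-dispersion step can be illustrated, very roughly, with a steady-state Gaussian plume rather than the full transport models used in the study. A minimal sketch with hypothetical stack and weather parameters (none are from the paper):

        # Much-simplified forward-dispersion sketch: centreline ground-level
        # concentration from a steady Gaussian plume.
        import math

        def plume_conc(Q, u, sigma_y, sigma_z, H):
            """Concentration (Bq/m^3) for release rate Q (Bq/s), wind speed u
            (m/s), dispersion widths sigma_y/z (m), effective stack height H (m)."""
            return (Q / (math.pi * sigma_y * sigma_z * u)) \
                   * math.exp(-H**2 / (2 * sigma_z**2))

        # e.g. 1 TBq released over 2 days ~= 5.8e6 Bq/s, 5 m/s wind, 40 m stack
        print(plume_conc(Q=1e12 / (2 * 86400), u=5.0,
                         sigma_y=300.0, sigma_z=80.0, H=40.0))

    Inverting such a relation, measured concentration back to release rate, is the essence of estimating a release magnitude from a downwind detection.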

  1. Methodology and Results of Mathematical Modelling of Complex Technological Processes

    Science.gov (United States)

    Mokrova, Nataliya V.

    2018-03-01

    The methodology of system analysis allows us to derive a mathematical model of a complex technological process. A mathematical description of the plasma-chemical process is proposed. The importance of the quenching rate and of the initial temperature decrease time for producing the maximum amount of the target product is confirmed. The results of numerical integration of the system of differential equations can be used to describe reagent concentrations, plasma jet rate and temperature in order to achieve an optimal hardening mode. Such models are applicable both for solving control problems and for predicting future states of sophisticated technological systems.
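
    The numerical-integration step can be sketched in miniature. A minimal sketch assuming SciPy, with an invented first-order reaction and an exponential temperature quench; none of the kinetic constants are from the paper:

        # Toy version of integrating the reagent-concentration ODEs: a single
        # reaction A -> B with an Arrhenius-type rate under a temperature quench.
        import numpy as np
        from scipy.integrate import solve_ivp

        def rhs(t, y, quench_rate):
            cA, T = y
            k = 1e3 * np.exp(-2000.0 / T)          # Arrhenius-type rate constant
            return [-k * cA, -quench_rate * (T - 300.0)]

        sol = solve_ivp(rhs, (0.0, 5.0), y0=[1.0, 3000.0],
                        args=(2.0,), dense_output=True)
        print(sol.y[0, -1])   # residual reagent concentration after quenching

    Sweeping the quench rate in such a model is how one would locate the hardening mode that maximizes the target product.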

  2. 3D Urban Virtual Models generation methodology for smart cities

    Directory of Open Access Journals (Sweden)

    M. Álvarez

    2018-04-01

    Full Text Available Currently the use of Urban 3D Models goes beyond mere three-dimensional visualization of our urban surroundings. Three-dimensional Urban Models are in themselves fundamental tools to manage the different phenomena that occur in smart cities. It is therefore necessary to generate realistic models in which BIM building design information can be integrated with GIS and other spatial technologies. The generation of 3D Urban Models benefits from the wealth of data provided by the latest sensor technologies, such as airborne sensors, and from the existence of international standards such as CityGML. This paper presents a methodology for the development of a three-dimensional Urban Model, based on LiDAR data and the CityGML standard, applied to the city of Lorca.

  3. A methodology for overall consequence modeling in chemical industry

    International Nuclear Information System (INIS)

    Arunraj, N.S.; Maiti, J.

    2009-01-01

    Risk assessment in the chemical process industry is a very important issue for safeguarding humans and the ecosystem from damage. Consequence assessment is an integral part of risk assessment. However, the commonly used consequence estimation methods involve either time-consuming complex mathematical models or simple assimilation of losses that ignores some consequence factors, which degrades the quality of the estimated risk value. Consequence modeling therefore has to be performed in detail, considering all major losses in an optimal time, to improve the decisive value of the risk estimate. The losses can be broadly categorized into production loss, asset loss, human health and safety loss, and environmental loss. In this paper, a conceptual framework is developed to assess the overall consequence considering all the important components of major losses. Secondly, a methodology is developed for the calculation of all the major losses, which are normalized to yield the overall consequence. Finally, as an illustration, the proposed methodology is applied to a case study plant involving benzene extraction. The case study result using the proposed consequence assessment scheme is compared with those from the existing methodologies.
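
    The normalization-and-aggregation step lends itself to a compact illustration. A minimal sketch with invented loss figures, maximum credible losses and weights (not values from the case study):

        # Normalize each loss category against a maximum credible loss, then
        # combine with weights into a single overall consequence index.
        losses = {"production": 2.0e6, "assets": 5.0e5,
                  "health_safety": 1.2e6, "environment": 3.0e5}   # $ equivalents
        max_loss = {"production": 1.0e7, "assets": 5.0e6,
                    "health_safety": 2.0e6, "environment": 4.0e6}
        weights = {"production": 0.2, "assets": 0.2,
                   "health_safety": 0.4, "environment": 0.2}

        overall = sum(weights[k] * min(losses[k] / max_loss[k], 1.0) for k in losses)
        print(round(overall, 3))   # overall consequence index in [0, 1]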

  4. METHODOLOGICAL APPROACHES FOR MODELING THE RURAL SETTLEMENT DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Gorbenkova Elena Vladimirovna

    2017-10-01

    Full Text Available Subject: the paper describes the research results on validation of a rural settlement development model. The basic methods and approaches for solving the problem of assessing the efficiency of urban and rural settlement development are considered. Research objectives: determination of methodological approaches to modeling and creating a model for the development of rural settlements. Materials and methods: domestic and foreign experience in modeling the territorial development of urban and rural settlements and settlement structures was generalized. The motivation for using the Pentagon-model for solving similar problems was demonstrated. Based on a systematic analysis of existing development models of urban and rural settlements, as well as the authors' method for assessing the level of agro-town development, the systems/factors necessary for the sustainable development of a rural settlement were identified. Results: we created a rural development model which consists of five major systems containing the critical factors essential for achieving sustainable development of a settlement system: the ecological system, economic system, administrative system, anthropogenic (physical) system and social system (supra-structure). The methodological approaches for creating an evaluation model of rural settlement development were revealed; the basic motivating factors that provide interrelations of systems were determined; the critical factors for each subsystem were identified and substantiated. Such an approach is justified by the composition of tasks for territorial planning at the local and state administration levels. The feasibility of applying the basic Pentagon-model, which has been successfully used for solving analogous problems of sustainable development, was shown. Conclusions: the resulting model can be used for identifying and substantiating the critical factors for rural sustainable development and also become the basis of

  5. 75 FR 10694 - Airworthiness Directives; AeroSpace Technologies of Australia Pty Ltd Models N22B, N22S, and N24A...

    Science.gov (United States)

    2010-03-09

    ... Airworthiness Directives; AeroSpace Technologies of Australia Pty Ltd Models N22B, N22S, and N24A Airplanes... authority for Australia, has issued AD GAF-N22-52, Amendment 1, dated January 2010 (referred to after this... examining the MCAI in the AD docket. Relevant Service Information AeroSpace Technologies of Australia...

  6. Logic flowgraph methodology - A tool for modeling embedded systems

    Science.gov (United States)

    Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.

    1991-01-01

    The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of using such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.
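
    The fault trees produced by such backtracking are ultimately expanded into the event combinations that trigger the top event. A minimal sketch of that downstream step on an invented AND/OR gate tree (not a model from the paper); the cut sets happen to be minimal for this simple example:

        # Expand a small AND/OR fault tree into its cut sets: the combinations
        # of basic events that produce the top event.
        from itertools import product

        def cut_sets(node):
            kind, children = node
            if kind == "basic":
                return [frozenset([children])]
            child_sets = [cut_sets(c) for c in children]
            if kind == "or":
                return [cs for sets in child_sets for cs in sets]
            # "and": union one cut set from every child
            return [frozenset().union(*combo) for combo in product(*child_sets)]

        top = ("and", [("or", [("basic", "sensor_fault"), ("basic", "sw_timeout")]),
                       ("basic", "valve_stuck")])
        for cs in cut_sets(top):
            print(sorted(cs))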

  7. Modern methodology and applications in spatial-temporal modeling

    CERN Document Server

    Matsui, Tomoko

    2015-01-01

    This book provides a modern introductory tutorial on specialized methodological and applied aspects of spatial and temporal modeling. The areas covered involve a range of topics which reflect the diversity of this domain of research across a number of quantitative disciplines. For instance, the first chapter deals with non-parametric Bayesian inference via a recently developed framework known as kernel mean embedding which has had a significant influence in machine learning disciplines. The second chapter takes up non-parametric statistical methods for spatial field reconstruction and exceedance probability estimation based on Gaussian process-based models in the context of wireless sensor network data. The third chapter presents signal-processing methods applied to acoustic mood analysis based on music signal analysis. The fourth chapter covers models that are applicable to time series modeling in the domain of speech and language processing. This includes aspects of factor analysis, independent component an...

  8. Spatially-explicit model for assessing wild dog control strategies in Western Australia

    Science.gov (United States)

    Large predators can significantly impact livestock industries. In Australia, wild dogs (Canis lupus familiaris, Canis lupus dingo, and hybrids) cause economic losses of more than AUD $40M annually. Landscape-scale exclusion fencing coupled with lethal techniques is a widely pract...

  9. Methodology and preliminary models for analyzing nuclear safeguards decisions

    International Nuclear Information System (INIS)

    1978-11-01

    This report describes a general analytical tool designed to assist the NRC in making nuclear safeguards decisions. The approach is based on decision analysis--a quantitative procedure for making decisions under uncertain conditions. The report: describes illustrative models that quantify the probability and consequences of diverted special nuclear material and the costs of safeguarding the material, demonstrates a methodology for using this information to set safeguards regulations (safeguards criteria), and summarizes insights gained in a very preliminary assessment of a hypothetical reprocessing plant

  10. Methodology and preliminary models for analyzing nuclear-safeguards decisions

    International Nuclear Information System (INIS)

    Judd, B.R.; Weissenberger, S.

    1978-11-01

    This report describes a general analytical tool designed with Lawrence Livermore Laboratory to assist the Nuclear Regulatory Commission in making nuclear safeguards decisions. The approach is based on decision analysis - a quantitative procedure for making decisions under uncertain conditions. The report: describes illustrative models that quantify the probability and consequences of diverted special nuclear material and the costs of safeguarding the material; demonstrates a methodology for using this information to set safeguards regulations (safeguards criteria); and summarizes insights gained in a very preliminary assessment of a hypothetical reprocessing plant
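
    The decision-analysis core of both reports can be illustrated with a toy expected-cost comparison of safeguards options. A minimal sketch with invented costs, probabilities and consequences, not figures from the assessments:

        # Compare safeguards options by expected total cost:
        # option cost plus diversion probability times consequence.
        options = {
            # name: (annual safeguards cost $, P(diversion), consequence $)
            "baseline": (1.0e6, 1e-3, 5.0e9),
            "upgraded": (4.0e6, 1e-4, 5.0e9),
        }
        for name, (cost, p, consequence) in options.items():
            print(name, f"expected total cost = ${cost + p * consequence:,.0f}")

    Here the more expensive option wins because its reduction in diversion risk outweighs its extra cost, which is the kind of trade-off a safeguards criterion has to formalize.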

  11. Model identification methodology for fluid-based inerters

    Science.gov (United States)

    Liu, Xiaofu; Jiang, Jason Zheng; Titurus, Branislav; Harrison, Andrew

    2018-06-01

    The inerter is the mechanical dual of the capacitor via the force-current analogy. It has the property that the force across its terminals is proportional to their relative acceleration. Compared with flywheel-based inerters, fluid-based forms have the advantages of improved durability, inherent damping and simplicity of design. In order to improve the understanding of the physical behaviour of this fluid-based device, especially that caused by the hydraulic resistance and inertial effects in the external tube, this work proposes a comprehensive model identification methodology. Firstly, a modelling procedure is established, which allows the topological arrangement of the mechanical networks to be obtained by mapping the damping, inertance and stiffness effects directly to their respective hydraulic counterparts. Secondly, an experimental sequence is followed, which separates the identification of friction, stiffness and various damping effects. Furthermore, an experimental set-up is introduced, where two pressure gauges are used to accurately measure the pressure drop across the external tube. Theoretical models with improved confidence are obtained using the proposed methodology for a helical-tube fluid inerter prototype. The sources of the remaining discrepancies are further analysed.

  12. Control of trachoma in Australia: a model based evaluation of current interventions.

    Directory of Open Access Journals (Sweden)

    Andrew J Shattock

    2015-04-01

    Full Text Available Australia is the only high-income country in which endemic trachoma persists. In response, the Australian Government has recently invested heavily towards the nationwide control of the disease. A novel simulation model was developed to reflect the trachoma epidemic in Australian Aboriginal communities. The model, which incorporates demographic, migration, mixing, and biological heterogeneities, was used to evaluate recent intervention measures against counterfactual past scenarios, and also to assess the potential impact of a series of hypothesized future intervention measures relative to the current national strategy and intensity. The model simulations indicate that, under the current intervention strategy and intensity, the likelihood of controlling trachoma to less than 5% prevalence among 5-9 year-old children in hyperendemic communities by 2020 is 31% (19%-43%). By shifting intervention priorities such that large increases in the facial cleanliness of children are observed, this likelihood of controlling trachoma in hyperendemic communities is increased to 64% (53%-76%). The most effective intervention strategy incorporated large-scale antibiotic distribution programs whilst attaining ambitious yet feasible screening, treatment, facial cleanliness and housing construction targets. Accordingly, the estimated likelihood of controlling trachoma in these communities is increased to 86% (76%-95%). Maintaining the current intervention strategy and intensity is unlikely to be sufficient to control trachoma across Australia by 2020. However, by shifting the intervention strategy and increasing intensity, the likelihood of controlling trachoma nationwide can be significantly increased.

  13. Control of trachoma in Australia: a model based evaluation of current interventions.

    Science.gov (United States)

    Shattock, Andrew J; Gambhir, Manoj; Taylor, Hugh R; Cowling, Carleigh S; Kaldor, John M; Wilson, David P

    2015-04-01

    Australia is the only high-income country in which endemic trachoma persists. In response, the Australian Government has recently invested heavily towards the nationwide control of the disease. A novel simulation model was developed to reflect the trachoma epidemic in Australian Aboriginal communities. The model, which incorporates demographic, migration, mixing, and biological heterogeneities, was used to evaluate recent intervention measures against counterfactual past scenarios, and also to assess the potential impact of a series of hypothesized future intervention measures relative to the current national strategy and intensity. The model simulations indicate that, under the current intervention strategy and intensity, the likelihood of controlling trachoma to less than 5% prevalence among 5-9 year-old children in hyperendemic communities by 2020 is 31% (19%-43%). By shifting intervention priorities such that large increases in the facial cleanliness of children are observed, this likelihood of controlling trachoma in hyperendemic communities is increased to 64% (53%-76%). The most effective intervention strategy incorporated large-scale antibiotic distribution programs whilst attaining ambitious yet feasible screening, treatment, facial cleanliness and housing construction targets. Accordingly, the estimated likelihood of controlling trachoma in these communities is increased to 86% (76%-95%). Maintaining the current intervention strategy and intensity is unlikely to be sufficient to control trachoma across Australia by 2020. However, by shifting the intervention strategy and increasing intensity, the likelihood of controlling trachoma nationwide can be significantly increased.

  14. Integrating FMEA in a Model-Driven Methodology

    Science.gov (United States)

    Scippacercola, Fabio; Pietrantuono, Roberto; Russo, Stefano; Esper, Alexandre; Silva, Nuno

    2016-08-01

    Failure Mode and Effects Analysis (FMEA) is a well-known technique for evaluating the effects of potential failures of components of a system. FMEA demands engineering methods and tools able to support the time-consuming tasks of the analyst. We propose to make FMEA part of the design of a critical system, by integration into a model-driven methodology. We show how to conduct the analysis of failure modes, propagation and effects from SysML design models, by means of custom diagrams, which we name FMEA Diagrams. They offer an additional view of the system, tailored to FMEA goals. The enriched model can then be exploited to automatically generate the FMEA worksheet and to conduct qualitative and quantitative analyses. We present a case study from a real-world project.
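
    The worksheet-generation step can be sketched compactly: failure modes with severity, occurrence and detection ratings, ranked by risk priority number (RPN), a common FMEA prioritization. The components and ratings below are hypothetical:

        # Rank failure modes by RPN = severity x occurrence x detection.
        rows = [
            # (component, failure mode, severity, occurrence, detection), all 1-10
            ("pump",   "bearing seizure", 8, 3, 4),
            ("sensor", "drift",           5, 6, 7),
            ("valve",  "fails closed",    9, 2, 3),
        ]
        worksheet = sorted(rows, key=lambda r: r[2] * r[3] * r[4], reverse=True)
        for comp, mode, s, o, d in worksheet:
            print(f"{comp:7s} {mode:16s} RPN={s * o * d}")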

  15. Analysing pseudoephedrine/methamphetamine policy options in Australia using multi-criteria decision modelling.

    Science.gov (United States)

    Manning, Matthew; Wong, Gabriel T W; Ransley, Janet; Smith, Christine

    2016-06-01

    In this paper we capture and synthesize the unique knowledge of experts so that choices regarding policy measures to address methamphetamine consumption and dependency in Australia can be strengthened. We examine perceptions of the: (1) influence of underlying factors that impact on the methamphetamine problem; (2) importance of various models of intervention that have the potential to affect the success of policies; and (3) efficacy of alternative pseudoephedrine policy options. We adopt a multi-criteria decision model to unpack factors that affect decisions made by experts and examine potential variations on weight/preference among groups. Seventy experts from five groups (i.e. academia (18.6%), government and policy (27.1%), health (18.6%), pharmaceutical (17.1%) and police (18.6%)) in Australia participated in the survey. Social characteristics are considered the most important underlying factor, prevention the most effective strategy and Project STOP the most preferred policy option with respect to reducing methamphetamine consumption and dependency in Australia. One-way repeated ANOVAs indicate a statistically significant difference with regards to the influence of underlying factors (F(2.3, 144.5) = 11.256, p < 0.001) on methamphetamine consumption and dependency. Most experts support the use of preventative mechanisms to inhibit drug initiation and delayed drug uptake. Compared to other policies, Project STOP (which aims to disrupt the initial diversion of pseudoephedrine) appears to be a more preferable preventative mechanism to control the production and subsequent sale and use of methamphetamine. This regulatory civil law lever engages third parties in controlling drug-related crime. The literature supports third-party partnerships as they engage experts who have knowledge and expertise with respect to prevention and harm minimization. Copyright © 2016 Elsevier B.V. All rights reserved.
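
    The aggregation at the heart of such a multi-criteria decision model can be illustrated as a weighted sum. A minimal sketch with invented weights and option scores, not the study's elicited values:

        # Combine expert-derived criterion weights with per-option scores
        # into a single ranking. All numbers are hypothetical.
        weights = {"prevention": 0.4, "treatment": 0.25,
                   "law_enforcement": 0.2, "harm_reduction": 0.15}
        scores = {   # option -> criterion -> score in [0, 1]
            "Project STOP":      {"prevention": 0.9, "treatment": 0.3,
                                  "law_enforcement": 0.7, "harm_reduction": 0.5},
            "Prescription-only": {"prevention": 0.7, "treatment": 0.4,
                                  "law_enforcement": 0.6, "harm_reduction": 0.4},
        }
        for option, s in scores.items():
            total = sum(weights[c] * s[c] for c in weights)
            print(option, round(total, 3))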

  16. The AUSGeoid98 geoid model of Australia: data treatment, computations and comparisons with GPS-levelling data

    DEFF Research Database (Denmark)

    Featherstone, W.E.; Kirby, J.F.; Kearsley, A.H.W.

    2001-01-01

    The AUSGeoid98 gravimetric geoid model of Australia has been computed using data from the EGM96 global geopotential model, the 1996 release of the Australian gravity database, a nationwide digital elevation model, and satellite altimeter-derived marine gravity anomalies. The geoid heights are on ...

  17. Methodological Aspects of Modelling and Simulation of Robotized Workstations

    Directory of Open Access Journals (Sweden)

    Naqib Daneshjo

    2018-05-01

    Full Text Available From the point of view of the development of application and program products, the key directions that need to be respected in computer support for project activities are quite clearly specified. User interfaces with a high degree of graphical interactive convenience, together with two-dimensional and three-dimensional computer graphics, contribute greatly to streamlining project methodologies and procedures in particular. This is mainly because a high proportion of the tasks solved in the modern design of robotic systems is clearly graphical in nature. Automation of such graphical tasks is therefore a significant development direction for the subject area. The authors present the results of their research in the area of automation and computer-aided design of robotized systems. A new methodical approach to modelling robotic workstations, consisting of ten steps incorporated into the four phases of the logistics process of creating and implementing a robotic workplace, is presented. The emphasis is placed on the modelling and simulation phase, with verification of the elaborated methodologies on specific projects or elements of a robotized welding plant in automotive production.

  18. Methodology Using MELCOR Code to Model Proposed Hazard Scenario

    Energy Technology Data Exchange (ETDEWEB)

    Gavin Hawkley

    2010-07-01

    This study demonstrates a methodology for using the MELCOR code to model a proposed hazard scenario within a building containing radioactive powder, and the subsequent evaluation of a leak path factor (LPF), the fraction of respirable material that escapes the facility into the outside environment, implicit in the scenario. The LPF evaluation analyzes the basis and applicability of an assumed standard multiplication of 0.5 × 0.5 (in which 0.5 represents the amount of material assumed to leave one area and enter another) for calculating an LPF value. The outside release depends upon the ventilation/filtration system, both filtered and unfiltered, and on other pathways from the building, such as doorways, both open and closed. The study shows how the multiple LPFs within the building can be evaluated in a combinatory process in which a total LPF is calculated, thus addressing the assumed multiplication and allowing for the designation and assessment of a respirable source term (ST) for later consequence analysis, in which the propagation of material released into the atmosphere can be modeled, the dose received by a downwind receptor can be estimated, and the distance adjusted to maintain such exposures as low as reasonably achievable (ALARA). The study also briefly addresses particle characteristics that affect atmospheric particle dispersion and compares this dispersion with the LPF methodology.
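
    The combinatory LPF evaluation the study argues for can be sketched as follows: multiply the leak path factors along each serial pathway, then sum the pathways weighted by the airflow fraction they carry. All factors below are hypothetical, standing in for the blanket 0.5 × 0.5 assumption:

        # Combine stage LPFs along serial paths and across parallel paths.
        import math

        pathways = [
            # (flow fraction, [LPF of each stage along the path])
            (0.7, [0.5, 0.01]),   # room -> corridor -> HEPA-filtered exhaust
            (0.3, [0.5, 0.2]),    # room -> corridor -> open doorway
        ]
        total_lpf = sum(frac * math.prod(stages) for frac, stages in pathways)
        print(total_lpf)   # compare with the blanket 0.5 * 0.5 = 0.25 assumption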

  19. Calibration Modeling Methodology to Optimize Performance for Low Range Applications

    Science.gov (United States)

    McCollum, Raymond A.; Commo, Sean A.; Parker, Peter A.

    2010-01-01

    Calibration is a vital process in characterizing the performance of an instrument in an application environment and seeks to obtain acceptable accuracy over the entire design range. Often, project requirements specify a maximum total measurement uncertainty, expressed as a percent of full-scale. However in some applications, we seek to obtain enhanced performance at the low range, therefore expressing the accuracy as a percent of reading should be considered as a modeling strategy. For example, it is common to desire to use a force balance in multiple facilities or regimes, often well below its designed full-scale capacity. This paper presents a general statistical methodology for optimizing calibration mathematical models based on a percent of reading accuracy requirement, which has broad application in all types of transducer applications where low range performance is required. A case study illustrates the proposed methodology for the Mars Entry Atmospheric Data System that employs seven strain-gage based pressure transducers mounted on the heatshield of the Mars Science Laboratory mission.
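
    A minimal sketch of the percent-of-reading idea: weighting a polynomial calibration fit by the reciprocal of the reading makes the fit minimize relative rather than full-scale error. Synthetic transducer data, assuming NumPy; nothing here is from the flight instrument's actual calibration:

        # Weighted vs unweighted calibration fits on synthetic data.
        import numpy as np

        counts = np.linspace(100, 4000, 25)          # raw sensor output
        rng = np.random.default_rng(1)
        pressure = 0.05 * counts + 2.0 + rng.normal(0, 0.5, counts.size)

        unweighted = np.polyfit(counts, pressure, 1)
        # w = 1/reading makes residuals count as a fraction of the reading,
        # trading full-scale accuracy for better low-range performance.
        weighted = np.polyfit(counts, pressure, 1, w=1.0 / pressure)
        print(unweighted, weighted)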

  20. The mentoring experiences of new graduate midwives working in midwifery continuity of care models in Australia.

    Science.gov (United States)

    Cummins, Allison M; Denney-Wilson, E; Homer, C S E

    2017-05-01

    The aim of this paper was to explore the mentoring experiences of new graduate midwives working in midwifery continuity of care models in Australia. Most new graduates find employment in hospitals and undertake a new graduate program rotating through different wards, and only a limited number of new graduate midwives were found to be working in midwifery continuity of care. The new graduate midwives in this study were mentored by more experienced midwives. Mentoring in midwifery has been described as being concerned with confidence building through a personal relationship. A qualitative descriptive study was undertaken and the data were analysed using continuity of care as a framework. We found that having a mentor was important: knowing the mentor made it easier for the new graduate to call their mentor at any time. The new graduate midwives had respect for their mentors, and the support helped build their confidence in transitioning from student to midwife. With the expansion of midwifery continuity of care models in Australia, mentoring should be provided for midwives transitioning to working in this way. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.

  1. A ROADMAP FOR GENERATING SEMANTICALLY ENRICHED BUILDING MODELS ACCORDING TO CITYGML MODEL VIA TWO DIFFERENT METHODOLOGIES

    Directory of Open Access Journals (Sweden)

    G. Floros

    2016-10-01

    Full Text Available The methodologies of 3D modeling have multiplied due to the rapid advances of new technologies. Nowadays, 3D modeling software focuses not only on the finest visualization of the models, but also on their semantic features during the modeling procedure. As a result, the models thus generated are both realistic and semantically enriched. Additionally, various extensions of modeling software allow for the immediate conversion of the model's format via semi-automatic procedures, with respect to the user's scope. The aim of this paper is to investigate the generation of a semantically enriched CityGML building model via two different methodologies. The first methodology includes modeling in Trimble SketchUp and transformation in FME Desktop Manager, while the second includes the model's generation in CityEngine and its transformation into the CityGML format via the 3DCitiesProject extension for ArcGIS. Finally, the two aforesaid methodologies are compared and specific characteristics are evaluated, in order to infer which methodology is best applied depending on the project's purposes.

  2. MoPCoM Methodology: Focus on Models of Computation

    Science.gov (United States)

    Koudri, Ali; Champeau, Joël; Le Lann, Jean-Christophe; Leilde, Vincent

    Today, developments of Real Time Embedded Systems have to face new challenges. On the one hand, Time-To-Market constraints require a reliable development process allowing quick design space exploration. On the other hand, rapidly developing technology, as stated by Moore's law, requires techniques to handle the resulting productivity gap. In a previous paper, we have presented our Model Based Engineering methodology addressing those issues. In this paper, we make a focus on Models of Computation design and analysis. We illustrate our approach on a Cognitive Radio System development implemented on an FPGA. This work is part of the MoPCoM research project gathering academic and industrial organizations (http://www.mopcom.fr).

  3. Building Modelling Methodologies for Virtual District Heating and Cooling Networks

    Energy Technology Data Exchange (ETDEWEB)

    Saurav, Kumar; Choudhury, Anamitra R.; Chandan, Vikas; Lingman, Peter; Linder, Nicklas

    2017-10-26

    District heating and cooling (DHC) systems are a proven energy solution that has been deployed for many years in a growing number of urban areas worldwide. They comprise a variety of technologies that seek to develop synergies between the production and supply of heat, cooling, domestic hot water and electricity. Although the benefits of DHC systems are significant and have been widely acclaimed, the full potential of modern DHC systems remains largely untapped. There are several opportunities for the development of energy-efficient DHC systems, which will enable the effective exploitation of alternative renewable resources, waste heat recovery, etc., in order to increase overall efficiency and facilitate the transition towards the next generation of DHC systems. This motivates the need for modelling these complex systems. Large-scale modelling of DHC networks is challenging, as they have several components interacting with each other. In this paper we present two building-modelling methodologies for the consumer buildings. These models will be further integrated with the network model and the control-system layer to create a virtual test bed for the entire DHC system. The model is validated using data collected from a real-life DHC system located in Lulea, a city on the coast of northern Sweden. The test bed will then be used for simulating various test cases such as peak energy reduction, overall demand reduction, etc.
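
    One common way to represent a consumer building in such a virtual network is a lumped first-order resistance-capacitance (RC) thermal model. A minimal sketch with invented parameters (not the Lulea buildings), stepped with explicit Euler:

        # First-order RC building model: one thermal resistance to ambient and
        # one lumped capacitance, driven by a constant heat supply.
        R = 0.005    # thermal resistance to ambient, K/W
        C = 2.0e7    # lumped thermal capacitance, J/K
        dt = 600.0   # time step, s
        T, T_out, q_heat = 20.0, -5.0, 6000.0   # indoor/outdoor temp (C), heat (W)
        for step in range(144):                  # 24 hours in 10-minute steps
            T += dt / C * ((T_out - T) / R + q_heat)
        print(round(T, 2))   # indoor temperature after a day of constant heating

    With R = 0.005 K/W the steady state is T_out + q_heat * R = 25 C; the model approaches it with time constant RC, which is the kind of dynamics a network-level test bed exploits for peak-shaving studies.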

  4. Developing and Testing a 3d Cadastral Data Model a Case Study in Australia

    Science.gov (United States)

    Aien, A.; Kalantari, M.; Rajabifard, A.; Williamson, I. P.; Shojaei, D.

    2012-07-01

    Population growth, urbanization and industrialization place more pressure on land use with the need for increased space. To extend the use and functionality of the land, complex infrastructures are being built, both vertically and horizontally, layered and stacked. These three-dimensional (3D) developments affect the interests (Rights, Restrictions, and Responsibilities (RRRs)) attached to the underlying land. A 3D cadastre will assist in managing the effects of 3D development on a particular extent of land. There are many elements that contribute to developing a 3D cadastre, such as the existence of 3D property legislation, 3D DBMS and 3D visualization. However, data modelling is one of the most important elements of a successful 3D cadastre. As architectural models of houses and high-rise buildings help their users visualize the final product, a 3D cadastre data model supports 3D cadastre users in understanding the structure or behavior of the system and provides a template that guides them in constructing and implementing the 3D cadastre. Many jurisdictions, organizations and software developers have built their own cadastral data models. Land Administration Domain Model (DIS-ISO 19152, The Netherlands) and ePlan (Intergovernmental Committee on Surveying and Mapping, Australia) are examples of existing data models. The variation between these data models is the result of different attitudes towards cadastres. However, there is a basic common thread among them all. Current cadastral data models use a 2D land-parcel concept and extend it to support 3D requirements. These data models cannot adequately manage and represent the spatial extent of 3D RRRs. Most of the current cadastral data models have been influenced by a very broad understanding of 3D cadastral concepts, because better clarity about what needs to be represented and analysed in the cadastre has yet to be established. This paper presents the first version of a 3D Cadastral Data Model (3DCDM_Version 1.0). 3DCDM models both the legal

  5. Distribution models for koalas in South Australia using citizen science-collected data.

    Science.gov (United States)

    Sequeira, Ana M M; Roetman, Philip E J; Daniels, Christopher B; Baker, Andrew K; Bradshaw, Corey J A

    2014-06-01

    The koala (Phascolarctos cinereus) occurs in the eucalypt forests of eastern and southern Australia and is currently threatened by habitat fragmentation, climate change, sexually transmitted diseases, and low genetic variability throughout most of its range. Using data collected during the Great Koala Count (a 1-day citizen science project in the state of South Australia), we developed generalized linear mixed-effects models to predict habitat suitability across South Australia accounting for potential errors associated with the dataset. We derived spatial environmental predictors for vegetation (based on dominant species of Eucalyptus or other vegetation), topographic water features, rain, elevation, and temperature range. We also included predictors accounting for human disturbance based on transport infrastructure (sealed and unsealed roads). We generated random pseudo-absences to account for the high prevalence bias typical of citizen-collected data. We accounted for biased sampling effort along sealed and unsealed roads by including an offset for distance to transport infrastructures. The model with the highest statistical support (wAICc ∼ 1) included all variables except rain, which was highly correlated with elevation. The same model also explained the highest deviance (61.6%), resulted in high R2(m) (76.4) and R2(c) (81.0), and had a good performance according to Cohen's κ (0.46). Cross-validation error was low (∼ 0.1). Temperature range, elevation, and rain were the best predictors of koala occurrence. Our models predict high habitat suitability in Kangaroo Island, along the Mount Lofty Ranges, and at the tips of the Eyre, Yorke and Fleurieu Peninsulas. In the highest-density region (5576 km2) of the Adelaide-Mount Lofty Ranges, a density-suitability relationship predicts a population of 113,704 (95% confidence interval: 27,685-199,723; average density = 5.0-35.8 km-2). We demonstrate the power of citizen science data for predicting species

  6. Distribution models for koalas in South Australia using citizen science-collected data

    Science.gov (United States)

    Sequeira, Ana M M; Roetman, Philip E J; Daniels, Christopher B; Baker, Andrew K; Bradshaw, Corey J A

    2014-01-01

    The koala (Phascolarctos cinereus) occurs in the eucalypt forests of eastern and southern Australia and is currently threatened by habitat fragmentation, climate change, sexually transmitted diseases, and low genetic variability throughout most of its range. Using data collected during the Great Koala Count (a 1-day citizen science project in the state of South Australia), we developed generalized linear mixed-effects models to predict habitat suitability across South Australia accounting for potential errors associated with the dataset. We derived spatial environmental predictors for vegetation (based on dominant species of Eucalyptus or other vegetation), topographic water features, rain, elevation, and temperature range. We also included predictors accounting for human disturbance based on transport infrastructure (sealed and unsealed roads). We generated random pseudo-absences to account for the high prevalence bias typical of citizen-collected data. We accounted for biased sampling effort along sealed and unsealed roads by including an offset for distance to transport infrastructures. The model with the highest statistical support (wAICc ∼ 1) included all variables except rain, which was highly correlated with elevation. The same model also explained the highest deviance (61.6%), resulted in high R2(m) (76.4) and R2(c) (81.0), and had a good performance according to Cohen's κ (0.46). Cross-validation error was low (∼ 0.1). Temperature range, elevation, and rain were the best predictors of koala occurrence. Our models predict high habitat suitability in Kangaroo Island, along the Mount Lofty Ranges, and at the tips of the Eyre, Yorke and Fleurieu Peninsulas. In the highest-density region (5576 km2) of the Adelaide–Mount Lofty Ranges, a density–suitability relationship predicts a population of 113,704 (95% confidence interval: 27,685–199,723; average density = 5.0–35.8 km−2). We demonstrate the power of citizen science data for predicting species
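
    The modelling approach described in the two records above can be sketched in miniature: a logistic GLM on synthetic presence/pseudo-absence data with an offset for road-distance sampling effort. A sketch assuming statsmodels and pandas are available; the covariates, coefficients and offset form are all invented:

        # Presence/pseudo-absence GLM with a sampling-effort offset.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(2)
        n = 500
        df = pd.DataFrame({
            "temp_range": rng.normal(12, 3, n),
            "elevation": rng.normal(300, 120, n),
            "road_dist": rng.uniform(0.1, 20, n),   # km to nearest road
        })
        logit = 4 - 0.4 * df.temp_range + 0.004 * df.elevation
        df["presence"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

        # The offset models survey effort declining with distance from roads,
        # so the fitted coefficients describe habitat rather than sampling bias.
        fit = smf.glm("presence ~ temp_range + elevation", data=df,
                      family=sm.families.Binomial(),
                      offset=-np.log(df.road_dist)).fit()
        print(fit.params)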

  7. Modeling methodology for a CMOS-MEMS electrostatic comb

    Science.gov (United States)

    Iyer, Sitaraman V.; Lakdawala, Hasnain; Mukherjee, Tamal; Fedder, Gary K.

    2002-04-01

    A methodology for combined modeling of capacitance and force in a multi-layer electrostatic comb is demonstrated in this paper. Conformal mapping-based analytical methods are limited to 2D symmetric cross-sections and cannot account for charge concentration effects at corners. Vertex capacitance can be more than 30% of the total capacitance in a single-layer 2 micrometers thick comb with 10 micrometers overlap. Furthermore, analytical equations are strictly valid only for perfectly symmetrical finger positions. Fringing and corner effects are likely to be more significant in a multi-layered CMOS-MEMS comb because of the presence of more edges and vertices. Vertical curling of CMOS-MEMS comb fingers may also lead to reduced capacitance and vertical forces. Gyroscopes are particularly sensitive to such undesirable forces, which therefore need to be well quantified. In order to address the above issues, a hybrid approach of superposing linear regression models over a set of core analytical models is implemented. Design of experiments is used to obtain data for capacitance and force using a commercial 3D boundary-element solver. Since accurate force values require significantly higher mesh refinement than accurate capacitance, we use numerical derivatives of capacitance values to compute the forces. The model is formulated such that the capacitance and force models use the same regression coefficients. The comb model thus obtained fits the numerical capacitance data to within +/- 3% and force to within +/- 10%. The model is experimentally verified by measuring capacitance change in a specially designed test structure. The capacitance model matches measurements to within 10%. The comb model is implemented in an Analog Hardware Description Language (AHDL) for use in behavioral simulation of manufacturing variations in a CMOS-MEMS gyroscope.
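    As a rough illustration of the hybrid approach, the sketch below superposes a linear regression correction on a textbook comb-drive capacitance formula (C = 2·N·ε0·t·x/g); the "solver" data are synthetic stand-ins for 3D boundary-element results, and all geometry values are assumed.

```python
# Sketch: regression correction over a core analytical comb model.
import numpy as np

eps0 = 8.854e-12
N, t = 50, 2e-6                       # finger pairs, thickness (m), assumed
rng = np.random.default_rng(5)
x = rng.uniform(5e-6, 15e-6, 100)     # finger overlap (m)
g = rng.uniform(1e-6, 3e-6, 100)      # gap (m)

C_core = 2 * N * eps0 * t * x / g     # core analytical model
C_solver = C_core * (1.1 + 5e3 * g)   # synthetic stand-in for BEM results

# Regress the solver/analytical ratio on the geometry parameters.
A = np.column_stack([np.ones_like(x), x, g])
coef, *_ = np.linalg.lstsq(A, C_solver / C_core, rcond=None)
C_model = C_core * (A @ coef)
err = np.max(np.abs(C_model / C_solver - 1)) * 100
print(f"max model error vs solver: {err:.2f}%")
```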

  8. Bushfire behaviour modelling using FARSITE with GIS integration for the Mitcham Hills, South Australia

    Directory of Open Access Journals (Sweden)

    SAAD ALSHARRAH

    2012-11-01

    Bushfires are of serious concern because they can have devastating effects on natural and human ecosystems. An important element of bushfires is fire behaviour, which describes the mode in which a fire reacts to the influences of fuel, weather, topography and fire fighting. In order to understand and predict fire growth, decision makers use fire models to simulate fire behaviour. Fire behaviour modelling can assist forest managers and environmental decision makers in understanding how a fire will behave under the influence of environmental factors such as fuels, weather and topography. This study models, spatially and temporally, the behaviour of a hypothetical fire for the Mitcham Hills in South Australia using FARSITE (Fire Area Simulator). FARSITE, a two-dimensional deterministic model, takes into account the factors that influence fire behaviour (fuels, weather and topography) and simulates the spread and behaviour of fires based on the input parameters. Geographic Information Systems (GIS) and Remote Sensing (RS) techniques were utilised for data preparation and the mapping of the parameters required by FARSITE. The results are simulations of fire spread, fireline intensity, flame length and time of arrival for the area of interest. The simulation confirmed that the model can be used to predict how a fire will spread and how long it will take, which can be very beneficial for fire suppression, control and risk assessment.

  9. Spatial Development Modeling Methodology Application Possibilities in Vilnius

    Directory of Open Access Journals (Sweden)

    Lina Panavaitė

    2017-05-01

    In order to control the continued development of high-rise buildings and their irreversible visual impact on the overall silhouette of the city, the great cities of the world have introduced new methodological principles into their spatial development models. These methodologies and spatial planning guidelines focus not only on the controlled development of high-rise buildings, but on the spatial modelling of the whole city, defining main development criteria and estimating possible consequences. Vilnius is no exception; however, the re-establishment of Lithuania's independence was followed by an uncontrolled urbanization process, so most of the city's development regulations emerged as a consequence of legalising investors' expectations after the fact. The importance of a consistent urban fabric, and of conserving and representing the city's most important objects, gained attention only when an actual threat emerged of overshadowing them with new architecture, alongside unmanaged urbanization in the city centre and land-use-driven urban sprawl in the suburbs. Current Vilnius spatial planning documents clearly define the urban structure and key development principles, but the definitions are relatively abstract, producing uniform building coverage requirements for territories with distinct qualities and simplified planar designs that do not meet quality standards. The overall quality of urban architecture is not regulated. The article deals with current spatial modelling methods, their individual parts, principles, the criteria for quality assessment and their applicability in Vilnius. The text contains an outline of possible building coverage regulations and impact assessment criteria for new development, together with a compendium of requirements for high-quality spatial planning and building design.

  10. Cost savings from a telemedicine model of care in northern Queensland, Australia.

    Science.gov (United States)

    Thaker, Darshit A; Monypenny, Richard; Olver, Ian; Sabesan, Sabe

    2013-09-16

    To conduct a cost analysis of a telemedicine model for cancer care (teleoncology) in northern Queensland, Australia, compared with the usual model of care, from the perspective of the Townsville and other participating hospital and health services. Retrospective cost-savings analysis, with a one-way sensitivity analysis performed to test the robustness of the net savings. Records of all patients managed by means of teleoncology at the Townsville Cancer Centre (TCC) and its six rural satellite centres in northern Queensland, Australia, between 1 March 2007 and 30 November 2011. Costs for set-up and staffing to manage the service, and savings from avoidance of travel expenses for specialist oncologists, patients and their escorts, and for aeromedical retrievals. There were 605 teleoncology consultations with 147 patients over 56 months, at a total cost of $442 276. The cost for project establishment was $36 000, for equipment and maintenance $143 271, and for staff $261 520. The estimated travel expense avoided was $762 394; this figure included the costs of travel for patients and escorts of $658 760, aeromedical retrievals of $52 400 and travel for specialists of $47 634, as well as an estimate of accommodation costs for a proportion of patients of $3600. This resulted in a net saving of $320 118. Costs would have to increase by 72% to negate the savings. The teleoncology model of care at the TCC resulted in net savings, mainly due to avoidance of travel costs. Such savings could be redirected to enhancing rural resources and service capabilities. This teleoncology model is applicable to geographically distant areas requiring lengthy travel.

  11. Health behaviour modelling for prenatal diagnosis in Australia: a geodemographic framework for health service utilisation and policy development

    Directory of Open Access Journals (Sweden)

    Halliday Jane L

    2006-09-01

    Background: Despite the wide availability of prenatal screening and diagnosis, a number of studies have reported no decrease in the rate of babies born with Down syndrome. The objective of this study was to investigate the geodemographic characteristics of women who have prenatal diagnosis in Victoria, Australia, by applying a novel consumer behaviour modelling technique in the analysis of health data. Methods: A descriptive analysis of data on all prenatal diagnostic tests, births (1998 and 2002) and births of babies with Down syndrome (1998 to 2002) was undertaken using a Geographic Information System and socioeconomic lifestyle segmentation classifications. Results: Most metropolitan women in Victoria have average or above State-average levels of uptake of prenatal diagnosis. Inner-city women residing in high socioeconomic lifestyle segments who have high rates of prenatal diagnosis spend 20% more on specialist physicians' fees than those whose rates are average. Rates of prenatal diagnosis are generally low amongst women in rural Victoria, with the lowest rates observed in farming districts. Reasons for this are likely to be a combination of lack of access to services (remoteness) and individual opportunity (lack of transportation, low levels of support and income). However, there are additional reasons for low uptake rates in farming areas that could not be explained by the behaviour modelling; these may relate to women's attitudes and choices. Conclusion: A lack of statewide geodemographic consistency in uptake of prenatal diagnosis implies that there is a need to target health professionals and pregnant women in specific areas to ensure there is increased equity of access to services and that all pregnant women can make informed choices that are best for them. Equally important is appropriate health service provision for families of children with Down syndrome. Our findings show that these potential interventions are

  12. A methodology for eliciting, representing, and analysing stakeholder knowledge for decision making on complex socio-ecological systems: from cognitive maps to agent-based models.

    Science.gov (United States)

    Elsawah, Sondoss; Guillaume, Joseph H A; Filatova, Tatiana; Rook, Josefine; Jakeman, Anthony J

    2015-03-15

    This paper aims to contribute to developing better ways of incorporating essential human elements in decision-making processes for modelling of complex socio-ecological systems. It presents a step-wise methodology for integrating perceptions of stakeholders (qualitative) into formal simulation models (quantitative) with the ultimate goal of improving understanding and communication about decision making in complex socio-ecological systems. The methodology integrates cognitive mapping and agent-based modelling. It cascades through a sequence of qualitative/soft and numerical methods comprising: (1) interviews to elicit mental models; (2) cognitive maps to represent and analyse individual and group mental models; (3) time-sequence diagrams to chronologically structure the decision-making process; (4) an all-encompassing conceptual model of decision making; and (5) a computational (in this case, agent-based) model. We apply the proposed methodology (labelled ICTAM) in a case study of viticulture irrigation in South Australia. Finally, we use strengths-weaknesses-opportunities-threats (SWOT) analysis to reflect on the methodology. Results show that the methodology leverages the use of cognitive mapping to capture the richness of decision making and mental models, and provides a combination of divergent and convergent analysis methods leading to the construction of an agent-based model.
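    The cascade from cognitive maps to a computational model can be illustrated in miniature. The toy sketch below (not ICTAM itself) iterates a signed, weighted cognitive map of hypothetical irrigation concepts to a stable activation pattern, the kind of structure that would seed agent rules in the final step.

```python
# Toy fuzzy-cognitive-map iteration; concepts and weights are invented.
import numpy as np

concepts = ["water price", "irrigation", "yield", "income"]
# W[i, j]: signed influence of concept i on concept j (assumed values).
W = np.array([
    [0.0, -0.6, 0.0, 0.0],   # higher price suppresses irrigation
    [0.0,  0.0, 0.7, 0.0],   # irrigation raises yield
    [0.0,  0.0, 0.0, 0.8],   # yield raises income
    [0.0,  0.3, 0.0, 0.0],   # income enables more irrigation
])

def step(a, W):
    return 1.0 / (1.0 + np.exp(-(a @ W)))  # squashed activation update

a = np.array([0.9, 0.5, 0.5, 0.5])  # scenario: high water price
for _ in range(50):
    a = step(a, W)
print(dict(zip(concepts, a.round(2))))
```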

  13. Modeling methodology for supply chain synthesis and disruption analysis

    Science.gov (United States)

    Wu, Teresa; Blackhurst, Jennifer

    2004-11-01

    The concept of an integrated or synthesized supply chain is a strategy for managing today's globalized and customer driven supply chains in order to better meet customer demands. Synthesizing individual entities into an integrated supply chain can be a challenging task due to a variety of factors including conflicting objectives, mismatched incentives and constraints of the individual entities. Furthermore, understanding the effects of disruptions occurring at any point in the system is difficult when working toward synthesizing supply chain operations. Therefore, the goal of this research is to present a modeling methodology to manage the synthesis of a supply chain by linking hierarchical levels of the system and to model and analyze disruptions in the integrated supply chain. The contribution of this research is threefold: (1) supply chain systems can be modeled hierarchically; (2) the performance of the synthesized supply chain system can be evaluated quantitatively; and (3) reachability analysis is used to evaluate the system performance and verify whether a specific state is reachable, allowing the user to understand the extent of the effects of a disruption.
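    Reachability analysis of the kind described reduces, in its simplest form, to graph search over a state space. The sketch below checks, for a hypothetical supply chain state-transition graph, whether a disruption state can propagate to a given system state.

```python
# Minimal reachability check over an invented state-transition graph.
from collections import deque

transitions = {
    "normal":        ["supplier_down"],
    "supplier_down": ["rerouted", "stockout"],
    "rerouted":      ["normal"],
    "stockout":      ["backlog"],
    "backlog":       ["normal"],
}

def reachable(start, target):
    seen, queue = {start}, deque([start])
    while queue:
        state = queue.popleft()
        if state == target:
            return True
        for nxt in transitions.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

print(reachable("supplier_down", "backlog"))  # True: disruption propagates
```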

  14. A spatial simulation model for the dispersal of the bluetongue vector Culicoides brevitarsis in Australia.

    Directory of Open Access Journals (Sweden)

    Joel K Kelso

    The spread of Bluetongue virus (BTV) among ruminants is caused by movement of infected host animals or by movement of infected Culicoides midges, the vector of BTV. Biologically plausible models of Culicoides dispersal are necessary for predicting the spread of BTV and are important for planning control and eradication strategies. A spatially-explicit simulation model which captures the two underlying population mechanisms, population dynamics and movement, was developed using extensive data from a trapping program for C. brevitarsis on the east coast of Australia. A realistic midge flight sub-model was developed and the annual incursion and population establishment of C. brevitarsis was simulated. Data from the literature were used to parameterise the model. The model was shown to reproduce the spread of C. brevitarsis southwards along the east Australian coastline in spring, from an endemic population to the north. Such incursions were shown to be reliant on wind dispersal; Culicoides midge active flight on its own was not capable of achieving known rates of southern spread, nor was re-emergence of southern populations due to overwintering larvae. Data from midge trapping programmes were used to qualitatively validate the resulting simulation model. The model described in this paper is intended to form the vector component of an extended model that will also include BTV transmission. A model of midge movement and population dynamics has been developed in sufficient detail such that the extended model may be used to evaluate the timing and extent of BTV outbreaks. This extended model could then be used as a platform for addressing the effectiveness of spatially targeted vaccination strategies or animal movement bans as BTV spread mitigation measures, or the impact of climate change on the risk and extent of outbreaks. These questions involving incursive Culicoides spread cannot be simply addressed with non-spatial models.
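    The two population mechanisms named above, active flight and wind-borne dispersal, can be caricatured in a few lines. The sketch below is a generic random-walk-plus-advection step, not the authors' model; the uplift probability, flight step and wind vector are all assumed.

```python
# Schematic dispersal: short active flight plus occasional wind advection.
import numpy as np

rng = np.random.default_rng(1)
pos = np.zeros((1000, 2))          # midge positions (km), hypothetical
flight_step = 0.5                  # active flight per day (km), assumed
wind = np.array([0.0, -8.0])       # daily wind drift (km), e.g. southward

for day in range(30):
    headings = rng.uniform(0, 2 * np.pi, len(pos))
    active = flight_step * np.column_stack((np.cos(headings), np.sin(headings)))
    airborne = rng.random(len(pos)) < 0.1   # assumed daily uplift probability
    pos += active + airborne[:, None] * wind

print("mean southward displacement (km):", round(-pos[:, 1].mean(), 1))
```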

  15. Development and application of a large scale river system model for National Water Accounting in Australia

    Science.gov (United States)

    Dutta, Dushmanta; Vaze, Jai; Kim, Shaun; Hughes, Justin; Yang, Ang; Teng, Jin; Lerat, Julien

    2017-04-01

    Existing global and continental scale river models, mainly designed for integrating with global climate models, are of very coarse spatial resolutions and lack many important hydrological processes, such as overbank flow, irrigation diversion and groundwater seepage/recharge, which operate at a much finer resolution. Thus, these models are not suitable for producing water accounts, which have become increasingly important for water resources planning and management at regional and national scales. A continental scale river system model called the Australian Water Resource Assessment River System model (AWRA-R) has been developed and implemented for national water accounting in Australia using a node-link architecture. The model includes major hydrological processes, anthropogenic water utilisation and storage routing that influence the streamflow in both regulated and unregulated river systems. Two key components of the model are an irrigation model to compute water diversion for irrigation use and associated fluxes and stores, and a storage-based floodplain inundation model to compute overbank flow from river to floodplain and associated floodplain fluxes and stores. The results in the Murray-Darling Basin show highly satisfactory performance of the model, with a median daily Nash-Sutcliffe Efficiency (NSE) of 0.64 and a median annual bias of less than 1% for the calibration period (1970-1991), and a median daily NSE of 0.69 and a median annual bias of 12% for the validation period (1992-2014). The results have demonstrated that the performance of the model is less satisfactory when key processes such as overbank flow, groundwater seepage and irrigation diversion are switched off. The AWRA-R model, which has been operationalised by the Australian Bureau of Meteorology for continental scale water accounting, has contributed to improvements in the national water account by substantially reducing the accounted difference volume (gain/loss).
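    The calibration statistic quoted above, the Nash-Sutcliffe Efficiency, is easy to state precisely; a value of 1 indicates a perfect fit and values at or below 0 indicate a model no better than the observed mean.

```python
# Nash-Sutcliffe Efficiency of simulated vs observed streamflow.
import numpy as np

def nse(observed, simulated):
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) \
               / np.sum((observed - observed.mean()) ** 2)

print(nse([3.0, 5.0, 9.0, 4.0], [2.8, 5.5, 8.4, 4.2]))  # ~0.97
```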

  16. Asia Pacific retirement: Models for Australia, Fiji, Malaysia, Philippines and Republic of Korea.

    Science.gov (United States)

    McCallum, J

    1992-01-01

    Survey data from Australia, Fiji, Malaysia, Philippines, and the Republic of Korea are used to model older workers' choices. The co-existence of a traditional sector along with a modern sector in much of the Asia Pacific region offers a traditional family lifestyle, as well as paid work and retirement choices. Differences are analyzed between countries, by expanding choices to include traditional family support, and within countries by use of ethnic group dummies along with economic factors. Results demonstrate the importance of cultural and developmental factors within and between countries. There is less dependency on family in more developed countries but inverse effects for wealthy persons. Wealthier households in more developed countries depend upon income from their own work while in developing countries they depend on families. Women in the developing countries work whilst those in developed countries tend to retire with their husbands to share retirement leisure.

  17. Culture Clash? Investigating constructions of sexual and reproductive health from the perspective of 1.5 generation migrants in Australia using Q methodology.

    Science.gov (United States)

    Dune, T; Perz, J; Mengesha, Z; Ayika, D

    2017-04-04

    In Australia, those who migrate as children or adolescents (1.5 generation migrants) may have entered a new cultural environment at a crucial time in their psychosexual development. These migrants may have to contend with constructions of sexual and reproductive health from at least two cultures which may be at conflict on the matter. This study was designed to investigate the role of culture in constructions of sexual and reproductive health and health care seeking behaviour from the perspective of 1.5 generation migrants. Forty-two adults from various ethno-cultural backgrounds took part in this Q methodological study. Online, participants rank-ordered forty-two statements about constructions of sexual and reproductive health and health seeking behaviours based on the level to which they agreed or disagreed with them. Participants then answered a series of questions about the extent to which their ethnic/cultural affiliations influenced their identity. A by-person factor analysis was then conducted, with factors extracted using the centroid technique and a varimax rotation. A seven-factor solution provided the best conceptual fit for constructions of sexual and reproductive health and help-seeking. Factor A compared progressive and traditional sexual and reproductive health values. Factor B highlighted migrants' experiences through two cultural lenses. Factor C explored migrant understandings of sexual and reproductive health in the context of culture. Factor D explained the role of culture in migrants' intimate relationships, beliefs about migrant sexual and reproductive health and engagement of health care services. Factor E described the impact of culture on sexual and reproductive health related behaviour. Factor F presented the messages migrant youth are given about sexual and reproductive health. Lastly, Factor G compared constructions of sexual and reproductive health across cultures. This study has demonstrated that when the cultural norms of migrants
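    The by-person factor analysis step can be sketched as follows; sklearn's varimax-rotated factor analysis stands in for the centroid extraction used in the study, and the Q-sort matrix is random toy data.

```python
# Sketch of by-person factor analysis of Q-sorts (toy data).
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(2)
sorts = rng.integers(-4, 5, size=(42, 42)).astype(float)  # persons x statements

fa = FactorAnalysis(n_components=7, rotation="varimax")
fa.fit(sorts.T)                 # observations = statements, variables = persons
loadings = fa.components_.T     # person loadings on the seven factors
print(loadings.shape)           # (42 persons, 7 factors)
```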

  18. A methodology for ecosystem-scale modeling of selenium

    Science.gov (United States)

    Presser, T.S.; Luoma, S.N.

    2010-01-01

    The main route of exposure for selenium (Se) is dietary, yet regulations lack biologically based protocols for evaluations of risk. We propose here an ecosystem-scale model that conceptualizes and quantifies the variables that determine how Se is processed from water through diet to predators. This approach uses biogeochemical and physiological factors from laboratory and field studies and considers loading, speciation, transformation to particulate material, bioavailability, bioaccumulation in invertebrates, and trophic transfer to predators. Validation of the model is through data sets from 29 historic and recent field case studies of Se-exposed sites. The model links Se concentrations across media (water, particulate, tissue of different food web species). It can be used to forecast toxicity under different management or regulatory proposals or as a methodology for translating a fish-tissue (or other predator tissue) Se concentration guideline to a dissolved Se concentration. The model illustrates some critical aspects of implementing a tissue criterion: 1) the choice of fish species determines the food web through which Se should be modeled, 2) the choice of food web is critical because the particulate material to prey kinetics of bioaccumulation differs widely among invertebrates, 3) the characterization of the type and phase of particulate material is important to quantifying Se exposure to prey through the base of the food web, and 4) the metric describing partitioning between particulate material and dissolved Se concentrations allows determination of a site-specific dissolved Se concentration that would be responsible for that fish body burden in the specific environment. The linked approach illustrates that environmentally safe dissolved Se concentrations will differ among ecosystems depending on the ecological pathways and biogeochemical conditions in that system. Uncertainties and model sensitivities can be directly illustrated by varying exposure
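    The linked-factor structure of the model (dissolved water concentration to particulate material to invertebrate prey to predator tissue) can be illustrated with placeholder coefficients; the partitioning and trophic transfer factors below are assumptions for the example, not values from the case studies.

```python
# Illustrative coefficients only; units noted inline.
Kd = 1000.0        # particulate/dissolved partitioning (L/kg), assumed
TTF_invert = 5.0   # particulate -> invertebrate transfer factor, assumed
TTF_fish = 1.1     # invertebrate -> fish transfer factor, assumed

dissolved = 2.0                          # water column Se (ug/L)
particulate = Kd * dissolved / 1000.0    # ug/g dry weight
invertebrate = TTF_invert * particulate  # prey tissue (ug/g dw)
fish = TTF_fish * invertebrate           # predator tissue (ug/g dw)
print(f"predicted fish tissue Se: {fish:.1f} ug/g dw")  # 11.0
```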

  19. Short-term droughts forecast using Markov chain model in Victoria, Australia

    Science.gov (United States)

    Rahmat, Siti Nazahiyah; Jayasuriya, Niranjali; Bhuiyan, Muhammed A.

    2017-07-01

    A comprehensive risk management strategy for dealing with drought should include both short-term and long-term planning. The objective of this paper is to present an early warning method to forecast drought using the Standardised Precipitation Index (SPI) and a non-homogeneous Markov chain model. A model such as this is useful for short-term planning. The developed method has been used to forecast droughts at a number of meteorological monitoring stations that have been regionalised into six homogeneous clusters with similar drought characteristics based on SPI. The non-homogeneous Markov chain model was used to estimate drought probabilities and drought predictions up to 3 months ahead. The drought severity classes defined using the SPI were computed at a 12-month time scale. The drought probabilities and the predictions were computed for six clusters that depict similar drought characteristics in Victoria, Australia. Overall, the drought severity class predicted was quite similar for all the clusters, with the non-drought class probabilities ranging from 49 to 57 %. For all clusters, the near normal class had a probability of occurrence varying from 27 to 38 %. For the more moderate and severe classes, the probabilities ranged from 2 to 13 % and 3 to 1 %, respectively. The developed model predicted drought situations 1 month ahead reasonably well. However, 2 and 3 months ahead predictions should be used with caution until the models are developed further.
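    Mechanically, a non-homogeneous Markov chain forecast pushes the current class probabilities through month-specific transition matrices. The sketch below uses four illustrative drought classes and toy transition probabilities.

```python
# Toy non-homogeneous Markov chain: class probabilities 3 months ahead.
# States: non-drought, near normal, moderate, severe (invented values).
import numpy as np

P_month1 = np.array([[0.80, 0.15, 0.04, 0.01],
                     [0.30, 0.50, 0.15, 0.05],
                     [0.10, 0.30, 0.45, 0.15],
                     [0.05, 0.15, 0.30, 0.50]])
P_month2 = P_month1.copy()  # in practice, each month has its own matrix
P_month3 = P_month1.copy()

p0 = np.array([1.0, 0.0, 0.0, 0.0])          # currently non-drought
p3 = p0 @ P_month1 @ P_month2 @ P_month3     # 3-months-ahead probabilities
print(p3.round(3))
```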

  20. Models of psychological service provision under Australia's Better Outcomes in Mental Health Care program.

    Science.gov (United States)

    Pirkis, Jane; Burgess, Philip; Kohn, Fay; Morley, Belinda; Blashki, Grant; Naccarella, Lucio

    2006-08-01

    The Access to Allied Psychological Services component of Australia's Better Outcomes in Mental Health Care program enables eligible general practitioners to refer consumers to allied health professionals for affordable, evidence-based mental health care, via 108 projects conducted by Divisions of General Practice. The current study profiled the models of service delivery across these projects, and examined whether particular models were associated with differential levels of access to services. We found that: 76% of projects were retaining their allied health professionals under contract, 28% via direct employment, and 7% some other way; allied health professionals were providing services from GPs' rooms in 63% of projects, from their own rooms in 63%, and from a third location in 42%; and the referral mechanism of choice was direct referral in 51% of projects, a voucher system in 27%, a brokerage system in 24%, and a register system in 25%. Many of these models were being used in combination. No model was predictive of differential levels of access, suggesting that the approach of adapting models to the local context is proving successful.

  1. SR-Site groundwater flow modelling methodology, setup and results

    International Nuclear Information System (INIS)

    Selroos, Jan-Olof; Follin, Sven

    2010-12-01

    As a part of the license application for a final repository for spent nuclear fuel at Forsmark, the Swedish Nuclear Fuel and Waste Management Company (SKB) has undertaken three groundwater flow modelling studies. These are performed within the SR-Site project and represent time periods with different climate conditions. The simulations carried out contribute to the overall evaluation of the repository design and long-term radiological safety. Three time periods are addressed: the Excavation and operational phases, the Initial period of temperate climate after closure, and the Remaining part of the reference glacial cycle. The present report is a synthesis of the background reports describing the modelling methodology, setup, and results. It is the primary reference for the conclusions drawn in a SR-Site specific context concerning groundwater flow during the three climate periods. These conclusions are not necessarily provided explicitly in the background reports, but are based on the results provided in these reports. The main results and comparisons presented in the present report are summarised in the SR-Site Main report.

  2. Development of a General Modelling Methodology for Vacuum Residue Hydroconversion

    Directory of Open Access Journals (Sweden)

    Pereira de Oliveira L.

    2013-11-01

    This work concerns the development of a methodology for kinetic modelling of refining processes, and more specifically for vacuum residue conversion. The proposed approach makes it possible to overcome the lack of molecular detail of the petroleum fractions and to simulate the transformation of the feedstock molecules into effluent molecules by means of a two-step procedure. In the first step, a synthetic mixture of molecules representing the feedstock for the process is generated via a molecular reconstruction method, termed SR-REM molecular reconstruction. In the second step, a kinetic Monte-Carlo method (kMC) is used to simulate the conversion reactions on this mixture of molecules. The molecular reconstruction was applied to several petroleum residues and is illustrated for an Athabasca (Canada) vacuum residue. The kinetic Monte-Carlo method is then described in detail. In order to validate this stochastic approach, a lumped deterministic model for vacuum residue conversion was simulated using Gillespie's Stochastic Simulation Algorithm. Despite the fact that the two approaches are based on very different hypotheses, the stochastic simulation algorithm simulates the conversion reactions with the same accuracy as the deterministic approach. The full-scale stochastic simulation approach using molecular-level reaction pathways provides a high amount of detail on the effluent composition and is briefly illustrated for Athabasca VR hydrocracking.
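    Gillespie's Stochastic Simulation Algorithm, used above for validation, is compact enough to show whole. The sketch below runs the SSA for a toy first-order A → B reaction; the real model tracks reconstructed vacuum-residue molecules rather than a single lump, and the rate constant here is assumed.

```python
# Toy SSA: a single first-order channel A -> B with propensity k * nA.
import math, random

random.seed(0)
nA, nB = 1000, 0
k = 0.1                    # rate constant (1/s), assumed
t, t_end = 0.0, 60.0

while t < t_end and nA > 0:
    a_total = k * nA                   # total propensity
    u = 1.0 - random.random()          # uniform on (0, 1]
    t += -math.log(u) / a_total        # exponential waiting time
    nA, nB = nA - 1, nB + 1            # fire the reaction
print(f"t = {t:.1f} s: A = {nA}, B = {nB}")
```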

  3. SR-Site groundwater flow modelling methodology, setup and results

    Energy Technology Data Exchange (ETDEWEB)

    Selroos, Jan-Olof (Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)); Follin, Sven (SF GeoLogic AB, Taeby (Sweden))

    2010-12-15

    As a part of the license application for a final repository for spent nuclear fuel at Forsmark, the Swedish Nuclear Fuel and Waste Management Company (SKB) has undertaken three groundwater flow modelling studies. These are performed within the SR-Site project and represent time periods with different climate conditions. The simulations carried out contribute to the overall evaluation of the repository design and long-term radiological safety. Three time periods are addressed: the Excavation and operational phases, the Initial period of temperate climate after closure, and the Remaining part of the reference glacial cycle. The present report is a synthesis of the background reports describing the modelling methodology, setup, and results. It is the primary reference for the conclusions drawn in a SR-Site specific context concerning groundwater flow during the three climate periods. These conclusions are not necessarily provided explicitly in the background reports, but are based on the results provided in these reports. The main results and comparisons presented in the present report are summarised in the SR-Site Main report.

  4. A methodology for the assessment of rehabilitation success of post mining landscapes-sediment and radionuclide transport at the former Nabarlek uranium mine, Northern Territory, Australia

    International Nuclear Information System (INIS)

    Hancock, G.R.; Grabham, M.K.; Martin, P.; Evans, K.G.; Bollhoefer, A.

    2006-01-01

    Protection of the environment post-mining is an important issue, especially where runoff and erosion can lead to undesirable material leaving post-mining landscapes and contaminating surrounding land and watercourses. Methods for assessing the environmental impact and long-term behaviour of post-mining landforms based on scientific methodology are needed, especially where field data are absent or poor. An appraisal of the former Nabarlek uranium mine was conducted to assess the site from a soil erosion perspective as part of an independent evaluation of overall rehabilitation success. Determination of the gross erosion occurring, the sediment discharge to Cooper Creek and the resultant sediment-associated radionuclide load in Cooper Creek were the primary objectives of the study. These objectives were achieved through the application of several models using parameter values collected from the site. The study found that the area containing the mill tailings repository is extremely stable and meets the guidelines established for long-term storage of uranium mill tailings. Most other areas on the site are stable; however, there are some areas with a high sediment loss. Sediment concentration in Cooper Creek, which drains the site, was found to be within the Australian water quality guidelines for fresh water; however, sediment concentrations in tributaries were found to exceed recommended levels. Radionuclide determinations on soil samples showed that the highest specific activities (Bq kg-1) were present on a small (0.44 ha) area with a relatively high erosion rate. This small area contributed the majority of the estimated flux to Cooper Creek of uranium-series radionuclides sorbed to or structurally incorporated in eroded soil particles sourced from the mine site. This study provides a methodology for assessing the erosional stability of such a landscape and its consequent impact on water quality, using extensive field data and readily available, well-known models.

  5. "Grey nomads" in Australia: are they a good model for successful aging and health?

    Science.gov (United States)

    Higgs, Paul F D; Quirk, Frances

    2007-10-01

    Lifestyle factors have been identified as being very important in determining health in later life. Nutrition, exercise, and social environment all interact to promote, or to limit, opportunities for an active and healthy post-working life. Not only are rates of chronic illness and disability reduced through the promotion of healthy lifestyles, but also quality of life is maintained through the compression of morbidity. Governments in Australia, as in the European Union and North America, have highlighted the importance of behavioral change in health promotion strategies with the aim of having an impact on the health-related lifestyles of their populations. This paper examines the example of a group of older Australians, the "grey nomads," who may present opportunities for examining health-related lifestyle changes. The term grey nomad refers to a portion of the older population in Australia who choose to use their later years and retirement as opportunities for travel and leisure, mainly within the confines of the Australian continent. As such, they are similar to groups in North America, such as the "snow birds," who travel to the southern United States to escape the colder winters of more northerly latitudes. Similar seasonal migrations occur from Northern to Southern Europe. What all share in common is an active culture/lifestyle of attempting to "age successfully." Grey nomads also participate in the creation of what can be termed postmodern communities, where they and other regular travelers may develop a sense of community feeling with others who are also regularly returning to the same spot year after year. Social support is highly predictive of health outcomes and such mobile communities may prove a positive factor in promoting good health. In this paper we examine whether the "grey nomads" represent a good model for improving health-related lifestyles in later life.

  6. Methodology for assessing electric vehicle charging infrastructure business models

    International Nuclear Information System (INIS)

    Madina, Carlos; Zamora, Inmaculada; Zabala, Eduardo

    2016-01-01

    The analysis of the economic implications of innovative business models in networked environments, such as electro-mobility, requires a global approach to ensure that all the involved actors obtain a benefit. Although electric vehicles (EVs) provide benefits for the society as a whole, there are a number of hurdles for their widespread adoption, mainly the high investment cost for the EV and for the infrastructure. Therefore, a sound business model must be built up for charging service operators, which allows them to recover their costs while, at the same time, offering EV users a charging price which makes electro-mobility comparable to internal combustion engine vehicles. For that purpose, three scenarios are defined, which present different EV charging alternatives in terms of charging power and charging station ownership and accessibility. A case study is presented for each scenario and the charging station usage required for a profitable business model is calculated. We demonstrate that private home charging is likely to be the preferred option for EV users who can charge at home, as it offers a lower total cost of ownership under certain conditions, even today. On the contrary, finding a profitable business case for fast charging requires more intensive infrastructure usage. - Highlights: • Ecosystem is a network of actors who collaborate to create a positive business case. • Electro-mobility (electricity-powered road vehicles and ICT) is a complex ecosystem. • Methodological analysis to ensure that all actors benefit from electro-mobility. • Economic analysis of charging infrastructure deployment linked to its usage. • Comparison of EV ownership cost vs. ICE for vehicle users.
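    The break-even logic behind such business cases can be sketched with a capital recovery factor: annualise the investment, add operating costs, and divide by the margin earned per charging session. All figures below are illustrative assumptions, not the paper's inputs.

```python
# Break-even charger utilisation under assumed costs and margins.
capex = 40000.0      # charger hardware + installation (EUR), assumed
opex = 3000.0        # annual maintenance and fees (EUR), assumed
r, years = 0.07, 10  # discount rate and depreciation period, assumed

crf = r * (1 + r) ** years / ((1 + r) ** years - 1)  # capital recovery factor
annual_cost = capex * crf + opex

margin_per_session = 4.0  # price minus energy cost per charge (EUR), assumed
sessions_per_day = annual_cost / (365 * margin_per_session)
print(f"break-even: {sessions_per_day:.1f} sessions/day")  # ~6.0
```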

  7. Optimisation modelling to assess cost of dietary improvement in remote Aboriginal Australia.

    Directory of Open Access Journals (Sweden)

    Julie Brimblecombe

    The cost and dietary choices required to fulfil nutrient recommendations defined nationally need investigation, particularly for disadvantaged populations. We used optimisation modelling to examine the dietary change required to achieve nutrient requirements at minimum cost for an Aboriginal population in remote Australia, using where possible minimally-processed whole foods. A twelve-month cross-section of population-level purchased food, food price and nutrient content data was used as the baseline. Relative amounts from 34 food group categories were varied to achieve specific energy and nutrient density goals at minimum cost while meeting model constraints intended to minimise deviation from the purchased diet. Simultaneous achievement of all nutrient goals was not feasible. The two most successful models (A & B) met all nutrient targets except sodium (146.2% and 148.9% of the respective target) and saturated fat (12.0% and 11.7% of energy). Model A was achieved at 3.2% lower cost than the baseline diet (which cost approximately AUD$13.01/person/day) and Model B at 7.8% lower cost but with a reduction in energy of 4.4%. Both models required very large reductions in sugar-sweetened beverages (-90%) and refined cereals (-90%) and an approximate four-fold increase in vegetables, fruit, dairy foods, eggs, fish and seafood, and wholegrain cereals. This modelling approach suggested population-level dietary recommendations at minimal cost based on the baseline purchased diet. Large shifts in diet in remote Aboriginal Australian populations are needed to achieve national nutrient targets. The modelling approach used was not able to meet all nutrient targets at less than current food expenditure.

  8. Optimisation modelling to assess cost of dietary improvement in remote Aboriginal Australia.

    Science.gov (United States)

    Brimblecombe, Julie; Ferguson, Megan; Liberato, Selma C; O'Dea, Kerin; Riley, Malcolm

    2013-01-01

    The cost and dietary choices required to fulfil nutrient recommendations defined nationally need investigation, particularly for disadvantaged populations. We used optimisation modelling to examine the dietary change required to achieve nutrient requirements at minimum cost for an Aboriginal population in remote Australia, using where possible minimally-processed whole foods. A twelve-month cross-section of population-level purchased food, food price and nutrient content data was used as the baseline. Relative amounts from 34 food group categories were varied to achieve specific energy and nutrient density goals at minimum cost while meeting model constraints intended to minimise deviation from the purchased diet. Simultaneous achievement of all nutrient goals was not feasible. The two most successful models (A & B) met all nutrient targets except sodium (146.2% and 148.9% of the respective target) and saturated fat (12.0% and 11.7% of energy). Model A was achieved at 3.2% lower cost than the baseline diet (which cost approximately AUD$13.01/person/day) and Model B at 7.8% lower cost but with a reduction in energy of 4.4%. Both models required very large reductions in sugar-sweetened beverages (-90%) and refined cereals (-90%) and an approximate four-fold increase in vegetables, fruit, dairy foods, eggs, fish and seafood, and wholegrain cereals. This modelling approach suggested population-level dietary recommendations at minimal cost based on the baseline purchased diet. Large shifts in diet in remote Aboriginal Australian populations are needed to achieve national nutrient targets. The modelling approach used was not able to meet all nutrient targets at less than current food expenditure.
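    The underlying optimisation is the classic diet problem: minimise cost subject to nutrient floors. A minimal sketch with scipy.optimize.linprog follows; the foods, prices and nutrient contents are fabricated for illustration, whereas the real model used 34 food groups and many more constraints.

```python
# Toy diet problem: minimise cost subject to nutrient floors.
import numpy as np
from scipy.optimize import linprog

cost = np.array([0.8, 2.5, 1.2])        # $/100 g: cereal, meat, vegetables
nutrients = np.array([[1500.0, 800.0, 150.0],   # energy (kJ) per 100 g
                      [   3.0,  20.0,   2.0]])  # protein (g) per 100 g
floors = np.array([8700.0, 50.0])       # daily energy and protein targets

# linprog minimises c @ x subject to A_ub @ x <= b_ub, so negate the floors.
res = linprog(cost, A_ub=-nutrients, b_ub=-floors, bounds=[(0, None)] * 3)
print("100 g units of each food:", res.x.round(2), "cost:", round(res.fun, 2))
```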

  9. A methodology model for quality management in a general hospital.

    Science.gov (United States)

    Stern, Z; Naveh, E

    1997-01-01

    A reappraisal is made of the relevance of industrial modes of quality management to the issues of medical care. Analysis of the nature of medical care, which differentiates it from the supplier-client relationships of industry, presents the main intrinsic characteristics, which create problems in application of the industrial quality management approaches to medical care. Several examples are the complexity of the relationship between the medical action and the result obtained, the client's nonacceptance of economic profitability as a value in his medical care, and customer satisfaction biased by variable standards of knowledge. The real problems unique to hospitals are addressed, and a methodology model for their quality management is offered. Included is a sample of indicator vectors, measurements of quality care, cost of medical care, quality of service, and human resources. These are based on the trilogy of planning quality, quality control, and improving quality. The conclusions confirm the inadequacy of industrial quality management approaches for medical institutions and recommend investment in formulation of appropriate concepts.

  10. Modeling myocardial infarction in mice: methodology, monitoring, pathomorphology.

    Science.gov (United States)

    Ovsepyan, A A; Panchenkov, D N; Prokhortchouk, E B; Telegin, G B; Zhigalova, N A; Golubev, E P; Sviridova, T E; Matskeplishvili, S T; Skryabin, K G; Buziashvili, U I

    2011-01-01

    Myocardial infarction is one of the most serious and widespread diseases in the world. In this work, a minimally invasive method for simulating myocardial infarction in mice is described in the Russian Federation for the very first time; the procedure is carried out by ligation of the coronary artery or by controlled electrocoagulation. As a part of the methodology, a series of anesthetic, microsurgical and revival protocols are designed, owing to which a decrease in postoperational mortality from the initial 94.6% to 13.6% is achieved. ECG confirms the development of large-focal or surface myocardial infarction. Postmortal histological examination confirms the presence of necrosis foci in the heart muscles of 87.5% of animals. Altogether, the medical data allow us to conclude that an adequate mouse model for myocardial infarction was generated. A further study is focused on the standardization of the experimental procedure and the use of genetically modified mouse strains, with the purpose of finding the most efficient therapeutic approaches for this disease.

  11. Temporal modelling of ballast water discharge and ship-mediated invasion risk to Australia.

    Science.gov (United States)

    Cope, Robert C; Prowse, Thomas A A; Ross, Joshua V; Wittmann, Talia A; Cassey, Phillip

    2015-04-01

    Biological invasions have the potential to cause extensive ecological and economic damage. Maritime trade facilitates biological invasions by transferring species in ballast water, and on ships' hulls. With volumes of maritime trade increasing globally, efforts to prevent these biological invasions are of significant importance. Both the International Maritime Organization and the Australian government have developed policy seeking to reduce the risk of these invasions. In this study, we constructed models for the transfer of ballast water into Australian waters, based on historic ballast survey data. We used these models to hindcast ballast water discharge over all vessels that arrived in Australian waters between 1999 and 2012. We used models for propagule survival to compare the risk of ballast-mediated propagule transport between ecoregions. We found that total annual ballast discharge volume into Australia more than doubled over the study period, with the vast majority of ballast water discharge and propagule pressure associated with bulk carrier traffic. As such, the ecoregions suffering the greatest risk are those associated with the export of mining commodities. As global marine trade continues to increase, effective monitoring and biosecurity policy will remain necessary to combat the risk of future marine invasion events.

  12. An equivalent layer magnetization model for Australia based on Magsat data

    Science.gov (United States)

    Mayhew, M. A.; Johnson, B. D.

    1987-01-01

    An equivalent layer magnetization model for Australia and adjacent oceanic areas is presented. The model is obtained by linear inversion of Magsat anomaly data measured in the altitude range 325-550 km. The anomaly data set has been isolated from the raw data set by use of models of the core field and very long wavelength external fields, and is internally consistent. Certain major structural features of the Australian continent are geographically associated with magnetization anomalies. A first-order difference is seen between the Tasman Zone and the Precambrian cratonic areas: magnetization anomalies are much more subdued in the former, possibly reflecting a shallowing of the Curie isotherm within the crust. A profile of the vertical integral of magnetization is presented for a crustal section extending from the Gawler Block to the southeast coast. It is shown that the magnetization variations are probably due partly, but not wholly, to depth to Curie isotherm variations; gross magnetization variations among at least three distinct crustal units must be involved.
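    The inversion step can be sketched as damped linear least squares: given anomaly observations d = Gm, solve for equivalent-source magnetizations m with Tikhonov regularisation. The kernel below is a random stand-in for the real dipole geometry matrix.

```python
# Damped least-squares inversion with a synthetic kernel matrix.
import numpy as np

rng = np.random.default_rng(3)
n_obs, n_src = 200, 50
G = rng.normal(size=(n_obs, n_src))        # stand-in geometry/kernel matrix
m_true = rng.normal(size=n_src)
d = G @ m_true + 0.05 * rng.normal(size=n_obs)   # noisy anomaly data

lam = 0.1  # Tikhonov damping parameter, assumed
m_est = np.linalg.solve(G.T @ G + lam * np.eye(n_src), G.T @ d)
print("recovery correlation:", np.corrcoef(m_true, m_est)[0, 1].round(3))
```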

  13. The value of model averaging and dynamical climate model predictions for improving statistical seasonal streamflow forecasts over Australia

    Science.gov (United States)

    Pokhrel, Prafulla; Wang, Q. J.; Robertson, David E.

    2013-10-01

    Seasonal streamflow forecasts are valuable for planning and allocation of water resources. In Australia, the Bureau of Meteorology employs a statistical method to forecast seasonal streamflows. The method uses predictors that are related to catchment wetness at the start of a forecast period and to climate during the forecast period. For the latter, a predictor is selected among a number of lagged climate indices as candidates to give the "best" model in terms of model performance in cross validation. This study investigates two strategies for further improvement in seasonal streamflow forecasts. The first is to combine, through Bayesian model averaging, multiple candidate models with different lagged climate indices as predictors, to take advantage of different predictive strengths of the multiple models. The second strategy is to introduce additional candidate models, using rainfall and sea surface temperature predictions from a global climate model as predictors. This is to take advantage of the direct simulations of various dynamic processes. The results show that combining forecasts from multiple statistical models generally yields more skillful forecasts than using only the best model and appears to moderate the worst forecast errors. The use of rainfall predictions from the dynamical climate model marginally improves the streamflow forecasts when viewed over all the study catchments and seasons, but the use of sea surface temperature predictions provides little additional benefit.
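    The combination step can be sketched as a weighted average, with weights derived from each candidate model's cross-validation skill. The scores below are hypothetical, and the softmax-style weighting is a simplification of a full Bayesian model averaging treatment.

```python
# Weighted combination of candidate forecasts (illustrative values).
import numpy as np

forecasts = np.array([120.0, 150.0, 135.0])           # GL, three models
cv_log_likelihoods = np.array([-10.2, -11.5, -10.6])  # assumed CV scores

w = np.exp(cv_log_likelihoods - cv_log_likelihoods.max())
w /= w.sum()                                          # normalised weights
print("weights:", w.round(3), "combined forecast:", (w @ forecasts).round(1))
```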

  14. A review of the literature to inform a best-practice clinical supervision model for midwifery students in Australia.

    Science.gov (United States)

    McKellar, Lois; Graham, Kristen

    2017-05-01

    Effective clinical supervision in midwifery programs leading to registration is essential to ensure that students can provide safe and competent woman centred care by the completion of their program. A number of different clinical supervision models exist in Australia and internationally, with varying levels of support and facilitation of student learning opportunities. In Australia, midwifery students must achieve specified learning outcomes and midwifery practice requirements to be eligible to register as a midwife. Identifying a best practice clinical supervision model for Australian midwifery students is therefore a priority for all key stakeholders, particularly education and maternity care providers. The aim of this literature review was to explore different types of clinical supervision models in order to develop and implement a best practice model in midwifery education programs.

  15. A Study on Uncertainty Quantification of Reflood Model using CIRCE Methodology

    International Nuclear Information System (INIS)

    Jeon, Seongsu; Hong, Soonjoon; Oh, Deogyeon; Bang, Youngseok

    2013-01-01

    The CIRCE method is intended to quantify the uncertainties of the correlations of a code and may replace the expert judgment generally used. In this study, an uncertainty quantification of the reflood model was performed using the CIRCE methodology; the application process and main results are briefly described. The application of CIRCE provided satisfactory results. This research is expected to be useful for improving the present audit calculation methodology, KINS-REM.

  16. Reference Management Methodologies for Large Structural Models at Kennedy Space Center

    Science.gov (United States)

    Jones, Corey; Bingham, Ryan; Schmidt, Rick

    2011-01-01

    There have been many challenges associated with modeling some of NASA KSC's largest structures. Given the size of the welded structures at KSC, it was critically important to properly organize model structure and carefully manage references. Additionally, because of the amount of hardware to be installed on these structures, it was very important to have a means to coordinate between different design teams and organizations, check for interferences, produce consistent drawings, and allow for simple release processes. Facing these challenges, the modeling team developed a unique reference management methodology and model fidelity methodology. This presentation will describe the techniques and methodologies that were developed for these projects. The attendees will learn about KSC's reference management and model fidelity methodologies for large structures, understand the goals of these methodologies, and appreciate the advantages of developing a reference management methodology.

  17. Discussion on the Implementation of the Patient Centred Medical Home model - Experiences from Australia

    Directory of Open Access Journals (Sweden)

    Safa Majidi Rahbar

    2017-07-01

    Introduction: Different practitioners and academics have been working on the application of the Patient Centred Medical Home (PCMH) model within the Australian context for many years. In early 2016, the Commonwealth government of Australia announced plans to establish Health Care Homes throughout the country based on the PCMH model, beginning with trial sites focused on the bundling of payments. As a result, the number of Primary Health Networks, policy makers and general practices receptive to establishing Health Care Homes is growing rapidly. The time is ripe to identify how best the elements of the model translate into the Australian context and how to implement them with success. As a contribution to the opportunity for widespread implementation, the North Coast Primary Health Network is engaged in a project to build capacity in general practices to transition into Health Care Homes. The main outcomes of this project include: 1. Preparing "The Australian Handbook for Transitioning to Health Care Homes", a resource which will provide a rationale for transitioning to a HCH, milestones for transitioning along a continuum, and tools for practice and practice support for establishing the model in general practice, thus developing capacity to train 'change facilitators' to accompany transitioning practices. 2. Establishment of a National Network of Patient Centred HCH Collaborators, made up of PHN representatives, experts and policy makers working in the PCMH development space, focused on improving advocacy effectiveness, knowledge sharing and keeping stakeholders up to date with unfolding developments. 3. Increasing local preparedness and interest in establishing HCHs, focused on propagating local interest in transitioning practices into HCHs; a local network of practitioners and collaborators informed of project updates and HCH learning and development opportunities in the region. 4. Local trial and

  18. Risk methodology for geologic disposal of radioactive waste: model description and user manual for Pathways model

    International Nuclear Information System (INIS)

    Helton, J.C.; Kaestner, P.C.

    1981-03-01

    A model for the environmental movement and human uptake of radionuclides is presented. This model is designated the Pathways-to-Man Model and was developed as part of a project funded by the Nuclear Regulatory Commission to design a methodology to assess the risk associated with the geologic disposal of high-level radioactive waste. The Pathways-to-Man Model is divided into two submodels. One of these, the Environmental Transport Model, represents the long-term distribution and accumulation of radionuclides in the environment. This model is based on a mixed-cell approach and describes radionuclide movement with a system of linear differential equations. The other, the Transport-to-Man Model, represents the movement of radionuclides from the environment to man. This model is based on concentration ratios. General descriptions of these models are provided in this report. Further, documentation is provided for the computer program which implements the Pathways Model.
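    The mixed-cell structure amounts to a linear compartment system, dC/dt = AC, where the off-diagonal entries of A are first-order transfer rates between environmental cells. The sketch below integrates a toy three-compartment version; the rates are illustrative, not the report's values.

```python
# Toy mixed-cell system: soil, surface water, sediment (rates per year).
import numpy as np
from scipy.integrate import solve_ivp

# Columns sum to zero, so total inventory is conserved (no decay shown).
A = np.array([[-0.05,  0.00,  0.00],
              [ 0.05, -0.30,  0.01],
              [ 0.00,  0.30, -0.01]])

sol = solve_ivp(lambda t, c: A @ c, (0.0, 100.0), [1.0, 0.0, 0.0],
                t_eval=[0, 10, 50, 100])
print(sol.y.round(3))  # inventory of each compartment over time
```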

  19. A proterozoic tectonic model for northern Australia and its economic implications

    International Nuclear Information System (INIS)

    Rossiter, A.G.; Ferguson, J.

    1980-01-01

    It is argued that at the end of Archaean time the Australian continent was confined to the area now occupied by the Yilgarn, Pilbara, Gawler, and Musgrave Blocks, and the southern part of the Arunta Block. During the Early Proterozoic, sedimentary and volcanic rocks were laid down in an extensive depositional zone trending roughly east-west along the northern margin of the Archaean continent. Copper and gold mineralization, commonly showing stratigraphic control, is widespread in this belt. Following deformation and metamorphism of the Early Proterozoic rocks, felsic and mafic igneous activity, and accumulation of platform sediments on the newly stabilized crust, a predominantly north-south depositional zone developed along the eastern margin of the continent during the Middle Proterozoic. Lead and zinc assume much more importance in the mineral deposits of this belt. It is postulated that the present positions of rocks of the Pine Creek and Georgetown regions are due to horizontal displacements of several hundred kilometres along major fault zones. Apparent rifting of these blocks away from palaeo-continental margins may be related to the occurrence of uraniferous granitic rocks and uranium mineralization within them via a mantle plume mechanism. Although current data are limited, tectonic environments suggested for Proterozoic mafic igneous rocks of northern Australia by their geochemistry are compatible with the geological settings of these rocks and with the tectonic model put forward. (author)

  20. An Assessment of the Effectiveness of Tree-Based Models for Multi-Variate Flood Damage Assessment in Australia

    Directory of Open Access Journals (Sweden)

    Roozbeh Hasanzadeh Nafari

    2016-07-01

    Full Text Available Flooding is a frequent natural hazard with significant financial consequences for Australia. In Australia, physical losses caused by floods are commonly estimated by stage-damage functions. These methods usually consider only the depth of the water and the type of building at risk. However, flood damage is a complicated process, dependent on a variety of factors which are rarely taken into account. This study explores the interaction, importance, and influence of water depth, flow velocity, water contamination, precautionary measures, emergency measures, flood experience, floor area, building value, building quality, and socioeconomic status. The study uses tree-based models (regression trees and bagged decision trees) and a dataset collected from the 2012 and 2013 flood events in Queensland, which includes information on structural damage, impact parameters, and resistance variables. The tree-based approaches show water depth, floor area, precautionary measures, building value, and building quality to be important damage-influencing parameters. Furthermore, the performance of the tree-based models is validated and contrasted with the outcomes of a multi-parameter loss function (FLFArs) from Australia. The tree-based models are shown to be more accurate than the stage-damage function. Consequently, considering more parameters and taking advantage of tree-based models is recommended. The outcome is important for improving established Australian flood loss models and assisting decision-makers and insurance companies dealing with flood risk assessment.
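
    As a sketch of the tree-based approach described above, the snippet below fits bagged regression trees to synthetic damage data and reads off impurity-based variable importances; the variables and response are invented stand-ins for the study's Queensland dataset, not the real records.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import BaggingRegressor

rng = np.random.default_rng(0)
n = 500

# Synthetic stand-ins for a few of the damage-influencing variables.
X = np.column_stack([
    rng.uniform(0, 3, n),      # water depth (m)
    rng.uniform(50, 400, n),   # floor area (m^2)
    rng.integers(0, 2, n),     # precautionary measures taken (0/1)
    rng.uniform(1e5, 1e6, n),  # building value ($)
])
# Toy relative-loss response: depth-driven, damped by precautions.
y = np.clip(0.25 * X[:, 0] - 0.10 * X[:, 2] + rng.normal(0, 0.03, n), 0, 1)

# Bagged regression trees, in the spirit of the study's tree-based models.
model = BaggingRegressor(DecisionTreeRegressor(max_depth=5),
                         n_estimators=100, random_state=0).fit(X, y)

# Mean impurity-based importance across the ensemble indicates which
# parameters most influence the predicted damage.
importances = np.mean([t.feature_importances_ for t in model.estimators_], axis=0)
for name, imp in zip(["depth", "floor_area", "precaution", "value"], importances):
    print(f"{name:12s} {imp:.3f}")
```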

  1. Modelling the costs and consequences of treating paediatric faecal impaction in Australia.

    Science.gov (United States)

    Guest, Julian F; Clegg, John P

    2006-01-01

    To compare the costs and consequences of using oral macrogol 3350 plus electrolytes (macrogol 3350; Movicol) compared to enemas/suppositories, manual evacuation and naso-gastric administration of macrogol (NGA-PEG) lavage solution in treating paediatric faecal impaction in Australia. A decision model was constructed using published clinical outcomes, utilities and clinician-derived resource utilisation estimates. The model was used to determine the expected Commonwealth and parent costs associated with each treatment over the period of disimpaction and 12 weeks post-disimpaction, in Australian dollars at 2003/2004 prices. 92% of oral macrogol 3350-treated patients are expected to be disimpacted within 6 days following initial treatment, compared with 79% of patients treated with enemas and suppositories who are expected to be disimpacted within 8 days. All patients are expected to be disimpacted within 5 days following a manual evacuation and within 2 days following NGA-PEG. The level of health gain at 12 weeks post-disimpaction irrespective of treatment for disimpaction and subsequent maintenance is expected to be the same; the expected quality-adjusted life years (QALYs) being 0.20 (95% CI: 0.17; 0.23). Starting treatment with oral macrogol 3350 in an outpatient setting is expected to lead to a Commonwealth cost of $758, compared to $1838 with NGA-PEG, $2125 with enemas and suppositories, $3931 with oral macrogol 3350 in an inpatient setting and $4478 with manual evacuation. Resource use associated with maintenance following initial disimpaction is expected to be broadly similar, irrespective of initial laxative. Hence, the expected Commonwealth cost is primarily affected by the treatment used to initially disimpact a patient. Expected parents' costs are expected to be comparable irrespective of treatment ranging from $89 to $112 per patient. Within the limitations of our model, using oral macrogol 3350 in an outpatient setting for treating faecally impacted

  2. Weather-Driven Variation in Dengue Activity in Australia Examined Using a Process-Based Modeling Approach

    Science.gov (United States)

    Bannister-Tyrrell, Melanie; Williams, Craig; Ritchie, Scott A.; Rau, Gina; Lindesay, Janette; Mercer, Geoff; Harley, David

    2013-01-01

    The impact of weather variation on dengue transmission in Cairns, Australia, was determined by applying a process-based dengue simulation model (DENSiM) that incorporated local meteorologic, entomologic, and demographic data. Analysis showed that inter-annual weather variation is one of the significant determinants of dengue outbreak receptivity. Cross-correlation analyses showed that DENSiM simulated epidemics of similar relative magnitude and timing to those historically recorded in reported dengue cases in Cairns during 1991–2009 (r = 0.372, P < 0.01). The DENSiM model can now be used to study the potential impacts of future climate change on dengue transmission. Understanding the impact of climate variation on the geographic range, seasonality, and magnitude of dengue transmission will enhance development of adaptation strategies to minimize future disease burden in Australia. PMID:23166197
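
    A minimal sketch of the kind of correspondence check reported here: Pearson and lagged correlations between a simulated and an observed annual series. Both series below are fabricated placeholders, not the Cairns data.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical annual series, 1991-2009: reported cases vs. model output.
observed  = np.array([ 3, 15,  2,  0, 40,  5,  1, 60,  8,  2,
                       0, 25, 90,  4,  1, 12, 30,  2, 70], float)
simulated = observed * np.random.default_rng(1).uniform(0.5, 1.5, observed.size)

# Pearson correlation of simulated vs. observed magnitudes, analogous to
# the r = 0.372 correspondence reported above.
r, p = pearsonr(simulated, observed)
print(f"r = {r:.3f}, p = {p:.4f}")

# A lagged cross-correlation screens for timing offsets between series.
for lag in (0, 1, 2):
    r_lag = np.corrcoef(simulated[lag:], observed[:observed.size - lag])[0, 1]
    print(f"lag {lag} yr: r = {r_lag:.3f}")
```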

  3. Risk Prediction Models for Incident Heart Failure: A Systematic Review of Methodology and Model Performance.

    Science.gov (United States)

    Sahle, Berhe W; Owen, Alice J; Chin, Ken Lee; Reid, Christopher M

    2017-09-01

    Numerous models predicting the risk of incident heart failure (HF) have been developed; however, evidence of their methodological rigor and reporting remains unclear. This study critically appraises the methods underpinning incident HF risk prediction models. EMBASE and PubMed were searched for articles published between 1990 and June 2016 that reported at least 1 multivariable model for prediction of HF. Model development information, including study design, variable coding, missing data, and predictor selection, was extracted. Nineteen studies reporting 40 risk prediction models were included. Existing models have acceptable discriminative ability (C-statistics > 0.70), although only 6 models were externally validated. Candidate variable selection was based on statistical significance from a univariate screening in 11 models, whereas it was unclear in 12 models. Continuous predictors were retained in 16 models, whereas it was unclear how continuous variables were handled in 16 models. Missing values were excluded in 19 of 23 models that reported missing data, and the number of events per variable was … models. Only 2 models presented recommended regression equations. There was significant heterogeneity in the discriminative ability of models with respect to age (P …). … prediction models that had sufficient discriminative ability, although few are externally validated. Methods not recommended for the conduct and reporting of risk prediction modeling were frequently used, and the resulting algorithms should be applied with caution. Copyright © 2017 Elsevier Inc. All rights reserved.
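
    For readers unfamiliar with the metrics the review relies on, the sketch below computes a C-statistic and an events-per-variable count for a toy logistic model on synthetic data; none of it reproduces the reviewed models.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, p = 2000, 5
X = rng.normal(size=(n, p))                      # candidate predictors
logit = 0.8 * X[:, 0] - 0.5 * X[:, 1] - 2.0      # true signal in 2 of 5
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))    # incident-HF style outcome

# Events per variable (EPV >= 10 is the usual rule of thumb).
print("events per variable:", y.sum() / p)

X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression().fit(X_dev, y_dev)

# C-statistic = area under the ROC curve; > 0.70 is the threshold for
# "acceptable discrimination" used in the review above.
c_dev = roc_auc_score(y_dev, model.predict_proba(X_dev)[:, 1])
c_val = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
print(f"C-statistic, development: {c_dev:.3f}; validation: {c_val:.3f}")
```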

  4. Simulating endosulfan transport in runoff from cotton fields in Australia with the GLEAMS model.

    Science.gov (United States)

    Connolly, R D; Kennedy, I R; Silburn, D M; Simpson, B W; Freebairn, D M

    2001-01-01

    Endosulfan (6,7,8,9,10,10-hexachloro-1,5,5a,6,9,9a-hexahydro-6,9-methano-2,4,3-benzodioxathiepin 3-oxide), a pesticide that is highly toxic to aquatic organisms, is widely used in the cotton (Gossypium hirsutum L.) industry in Australia and is a risk to the downstream riverine environment. We used the GLEAMS model to evaluate the effectiveness of a range of management scenarios aimed at minimizing endosulfan transport in runoff at the field scale. The field management scenarios simulated were (i) Conventional, bare soil at the beginning of the cotton season and seven irrigations per season; (ii) Improved Irrigation, irrigation amounts reduced and frequency increased to reduce runoff from excess irrigation; (iii) Dryland, no irrigation; (iv) Stubble Retained, increased soil cover created by retaining residue from the previous crop or a specially planted winter cover crop; and (v) Reduced Sprays, a fewer number of sprays. Stubble Retained was the most effective scenario for minimizing endosulfan transport because infiltration was increased and erosion reduced, and the stubble intercepted and neutralized a proportion of the applied endosulfan. Reducing excess irrigation reduced annual export rates by 80 to 90%, but transport in larger storm events was still high. Reducing the number of pesticide applications only reduced transport when three or fewer sprays were applied. We conclude that endosulfan transport from cotton farms can be minimized with a combination of field management practices that reduce excess irrigation and concentration of pesticide on the soil at any point in time; however, discharges, probably with endosulfan concentrations exceeding guideline values, will still occur in storm events.

  5. A methodology for the parametric modelling of the flow coefficients and flow rate in hydraulic valves

    International Nuclear Information System (INIS)

    Valdés, José R.; Rodríguez, José M.; Saumell, Javier; Pütz, Thomas

    2014-01-01

    Highlights: • We develop a methodology for the parametric modelling of flow in hydraulic valves. • We characterize the flow coefficients with a generic function with two parameters. • The parameters are derived from CFD simulations of the generic geometry. • We apply the methodology to two cases from the automotive brake industry. • We validate by comparing with CFD results varying the original dimensions. - Abstract: The main objective of this work is to develop a methodology for the parametric modelling of the flow rate in hydraulic valve systems. This methodology is based on the derivation, from CFD simulations, of the flow coefficient of the critical restrictions as a function of the Reynolds number, using a generalized square root function with two parameters. The methodology is then demonstrated by applying it to two completely different hydraulic systems: a brake master cylinder and an ABS valve. This type of parametric valve model facilitates implementation in dynamic simulation models of complex hydraulic systems.
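
    The paper's exact two-parameter function is not reproduced in this abstract, so the sketch below assumes one plausible square-root-type form, fits it to hypothetical CFD samples with scipy's curve_fit, and plugs the result into the standard orifice equation; every number is illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

# One plausible two-parameter "generalized square root" law for the flow
# coefficient: Cq tends to a at high Reynolds number, with transition scale b.
def cq(re, a, b):
    return a * np.sqrt(re / (re + b))

# Hypothetical CFD-derived samples (Re, Cq) for a valve restriction.
re_data = np.array([50, 100, 300, 1000, 3000, 10000], float)
cq_data = np.array([0.28, 0.38, 0.52, 0.62, 0.66, 0.68])

(a, b), _ = curve_fit(cq, re_data, cq_data, p0=(0.7, 500.0))
print(f"fitted: a={a:.3f}, b={b:.1f}")

# The fitted coefficient then parameterizes the orifice equation
# Q = Cq * A * sqrt(2*dp/rho) inside a lumped hydraulic model.
rho, area, dp = 850.0, 5e-6, 2e6   # oil density, orifice area, pressure drop
re = 2000.0                        # assumed operating Reynolds number
q = cq(re, a, b) * area * np.sqrt(2 * dp / rho)
print(f"flow rate: {q * 6e4:.2f} L/min")
```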

  6. Insight into model mechanisms through automatic parameter fitting: a new methodological framework for model development.

    Science.gov (United States)

    Tøndel, Kristin; Niederer, Steven A; Land, Sander; Smith, Nicolas P

    2014-05-20

    Striking a balance between the degree of model complexity and parameter identifiability, while still producing biologically feasible simulations, is a major challenge in computational biology. While these two elements of model development are closely coupled, parameter fitting from measured data and analysis of model mechanisms have traditionally been performed separately and sequentially. This process produces potential mismatches between model and data complexities that can compromise the ability of computational frameworks to reveal mechanistic insights or predict new behaviour. In this study we address this issue by presenting a generic framework for combined model parameterisation, comparison of model alternatives and analysis of model mechanisms. The presented methodology is based on a combination of multivariate metamodelling (statistical approximation of the input-output relationships of deterministic models) and a systematic zooming into biologically feasible regions of the parameter space by iterative generation of new experimental designs and look-up of simulations in the proximity of the measured data. The parameter fitting pipeline includes an implicit sensitivity analysis and analysis of parameter identifiability, making it suitable for testing hypotheses for model reduction. Using this approach, under-constrained model parameters, as well as the coupling between parameters within the model, are identified. The methodology is demonstrated by refitting the parameters of a published model of cardiac cellular mechanics using a combination of measured data and synthetic data from an alternative model of the same system. Using this approach, reduced models with simplified expressions for the tropomyosin/crossbridge kinetics were found by identification of model components that can be omitted without affecting the fit to the parameterising data. Our analysis revealed that model parameters could be constrained to a standard deviation of on
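
    A compressed sketch of the metamodelling-plus-zooming loop described above, under strong simplifications: a cubic polynomial surrogate stands in for the multivariate metamodel, the "simulator" is a one-line toy, and the zoom step keeps parameter sets whose predicted output is near a single measured datum.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Stand-in "deterministic model": an expensive simulator y = f(parameters).
def simulator(p):
    k1, k2 = p[:, 0], p[:, 1]
    return k1 * np.exp(-k2) + 0.1 * k1 * k2

rng = np.random.default_rng(0)
design = rng.uniform([0, 0], [2, 2], size=(200, 2))   # initial experimental design
response = simulator(design)

# Multivariate metamodel: cheap statistical approximation of the
# input-output relationship of the deterministic model.
meta = make_pipeline(PolynomialFeatures(degree=3), LinearRegression())
meta.fit(design, response)

# "Zoom" step: keep parameter sets whose metamodelled output lies close
# to the measured datum, then refine the design inside that region.
measured = 0.9
keep = design[np.abs(meta.predict(design) - measured) < 0.05]
print("feasible candidates:", len(keep))
if len(keep):
    lo, hi = keep.min(axis=0), keep.max(axis=0)
    new_design = rng.uniform(lo, hi, size=(200, 2))   # next-iteration design
    print("refined parameter box:", lo.round(2), hi.round(2))
```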

  7. Methodology for estimating soil carbon for the forest carbon budget model of the United States, 2001

    Science.gov (United States)

    L. S. Heath; R. A. Birdsey; D. W. Williams

    2002-01-01

    The largest carbon (C) pool in United States forests is the soil C pool. We present methodology and soil C pool estimates used in the FORCARB model, which estimates and projects forest carbon budgets for the United States. The methodology balances knowledge, uncertainties, and ease of use. The estimates are calculated using the USDA Natural Resources Conservation...

  8. Methodology for Designing Models Predicting Success of Infertility Treatment

    OpenAIRE

    Alireza Zarinara; Mohammad Mahdi Akhondi; Hojjat Zeraati; Koorsh Kamali; Kazem Mohammad

    2016-01-01

    Abstract Background: Prediction models for infertility treatment success have been presented for some 25 years. There are scientific principles for designing and applying prediction models, which are also used to predict the success rate of infertility treatment. The purpose of this study is to provide basic principles for designing models to predict infertility treatment success. Materials and Methods: In this paper, the principles for developing predictive models are explained and...

  9. Evolving electrical SCLM models of the Australian continent - results of the South Australia AusLAMP deployment

    Science.gov (United States)

    Robertson, K. E.; Thiel, S.; Heinson, G. S.

    2017-12-01

    The Australian Lithospheric Architecture Magnetotelluric Project (AusLAMP) is an Australian initiative to map the Australian continental lithosphere using magnetotelluric (MT) stations to obtain a resistivity model of the subsurface. It is a joint project between Geoscience Australia, state surveys, and universities. We present new 3D MT inversion results from the largest coherent array of the AusLAMP MT deployments to date, covering two-thirds of South Australia, funded largely by the Geological Survey of South Australia with additional funding from Geoscience Australia and The University of Adelaide. The model extends across the South Australian Gawler Craton, including the Eucla Basin to the west of the craton and the Flinders Ranges and Curnamona Province to the east. The MT array covers parts of the Australian lithosphere which have been largely unexplored with seismic tomography methods, and provides a unique insight into the tectonic evolution of the continent. We incorporate 284 long-period (10 s-10,000 s) MT stations separated roughly every half degree of latitude and longitude across an area spanning 1200 km x 800 km, south of latitude -28.5 degrees and from longitude 129 degrees to 141 degrees. We invert 24 discrete periods of the impedance tensor between 7 s and 13,000 s, and 22 discrete periods of the tipper data between 7 s and 8000 s. The results show a heterogeneous lower crust and mantle lithosphere with a primarily resistive (>1000 Ωm) mantle lithosphere in the central and western part of the Gawler Craton and Eucla Domain. The model shows a generally NS-oriented electric LAB, offset from deeper cratonic lithosphere in the west to a shallower lithosphere along the eastern margin of the Gawler Craton, extending further east towards the Proterozoic and Phanerozoic eastern part of Australia. The lower crust is generally resistive, with elongated lower crustal conductivity anomalies which are associated with major translithospheric shear zones likely existent

  10. A generic methodology for developing fuzzy decision models

    NARCIS (Netherlands)

    Bosma, R.; Berg, van den J.; Kaymak, U.; Udo, H.; Verreth, J.

    2012-01-01

    An important paradigm in decision-making models is utility-maximization, where most models do not include actors’ motives. Fuzzy set theory on the other hand offers a method to simulate human decision-making. However, the literature describing expert-driven fuzzy logic models rarely gives precise

  11. A generic methodology for developing fuzzy decision models

    NARCIS (Netherlands)

    Bosma, R.H.; Berg, van den J.; Kaymak, Uzay; Udo, H.M.J.; Verreth, J.A.J.

    2012-01-01

    An important paradigm in decision-making models is utility-maximization, where most models do not include actors’ motives. Fuzzy set theory on the other hand offers a method to simulate human decision-making. However, the literature describing expert-driven fuzzy logic models rarely gives precise

  12. A methodology for constructing the calculation model of scientific spreadsheets

    NARCIS (Netherlands)

    Vos, de M.; Wielemaker, J.; Schreiber, G.; Wielinga, B.; Top, J.L.

    2015-01-01

    Spreadsheet models are frequently used by scientists to analyze research data. These models are typically described in a paper or a report, which serves as the single source of information on the underlying research project. As the calculation workflow in these models is not made explicit, readers are

  13. Validation of multi-body modelling methodology for reconfigurable underwater robots

    DEFF Research Database (Denmark)

    Nielsen, M.C.; Eidsvik, O. A.; Blanke, Mogens

    2016-01-01

    This paper investigates the problem of employing reconfigurable robots in an underwater setting. The main result presented is the experimental validation of a modelling methodology for a system consisting of N dynamically connected robots with heterogeneous dynamics. Two distinct types of experiments are performed: a series of hydrostatic free-decay tests and a series of open-loop trajectory tests. The results are compared to a simulation based on the modelling methodology. The modelling methodology shows promising results for usage with systems composed of reconfigurable underwater modules...

  14. Mathematical modelling methodologies in predictive food microbiology: a SWOT analysis.

    Science.gov (United States)

    Ferrer, Jordi; Prats, Clara; López, Daniel; Vives-Rego, Josep

    2009-08-31

    Predictive microbiology is the area of food microbiology that attempts to forecast the quantitative evolution of microbial populations over time. This is achieved to a great extent through models that include the mechanisms governing population dynamics. Traditionally, the models used in predictive microbiology are whole-system continuous models that describe population dynamics by means of equations applied to extensive or averaged variables of the whole system. Many existing models can be classified by specific criteria. We can distinguish between survival and growth models by seeing whether they tackle mortality or cell duplication. We can distinguish between empirical (phenomenological) models, which mathematically describe specific behaviour, and theoretical (mechanistic) models with a biological basis, which search for the underlying mechanisms driving already observed phenomena. We can also distinguish between primary, secondary and tertiary models, by examining their treatment of the effects of external factors and constraints on the microbial community. Recently, the use of spatially explicit Individual-based Models (IbMs) has spread through predictive microbiology, due to the current technological capacity of performing measurements on single individual cells and thanks to the consolidation of computational modelling. Spatially explicit IbMs are bottom-up approaches to microbial communities that build bridges between the description of micro-organisms at the cell level and macroscopic observations at the population level. They provide greater insight into the mesoscale phenomena that link unicellular and population levels. Every model is built in response to a particular question and with different aims. Even so, in this research we conducted a SWOT (Strengths, Weaknesses, Opportunities and Threats) analysis of the different approaches (population continuous modelling and Individual-based Modelling), which we hope will be helpful for current and future
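
    To make the contrast between whole-system continuous models and individual-based models concrete, the sketch below runs a deterministic logistic model next to a crude stochastic, individual-level analogue; both are toy constructions, not models from the predictive-microbiology literature.

```python
import numpy as np

rng = np.random.default_rng(3)

# Whole-system continuous model: logistic growth of a population N(t),
# integrated with a simple forward-Euler step.
r, K, dt, steps = 0.6, 1e4, 0.05, 200
N_cont = [10.0]
for _ in range(steps):
    N = N_cont[-1]
    N_cont.append(N + dt * r * N * (1 - N / K))

# Individual-based counterpart: each cell divides independently with a
# crowding-thinned probability per step -- a bottom-up stochastic analogue.
N_ibm = [10]
for _ in range(steps):
    N = N_ibm[-1]
    births = rng.binomial(N, r * dt * max(0.0, 1 - N / K))
    N_ibm.append(N + births)

print(f"continuous model final:  {N_cont[-1]:.0f}")
print(f"individual-based final:  {N_ibm[-1]} (varies run to run)")
```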

  15. Proposal for product development model focused on ce certification methodology

    Directory of Open Access Journals (Sweden)

    Nathalia Marcia Goulart Pinheiro

    2015-09-01

    Full Text Available This paper presents a critical analysis comparing 21 product development models in order to identify whether these structures meet the demands of Product Certification of the European Community (CE). Furthermore, it presents a product development model comprising the steps found in the models analyzed, including improvements in activities for the referred product certification. The proposed improvements are justified by the growing quest for the internationalization of products and processes within companies.

  16. Methodology for Modeling Building Energy Performance across the Commercial Sector

    Energy Technology Data Exchange (ETDEWEB)

    Griffith, B.; Long, N.; Torcellini, P.; Judkoff, R.; Crawley, D.; Ryan, J.

    2008-03-01

    This report uses EnergyPlus simulations of each building in the 2003 Commercial Buildings Energy Consumption Survey (CBECS) to document and demonstrate bottom-up methods of modeling the entire U.S. commercial buildings sector (EIA 2006). The ability to use a whole-building simulation tool to model the entire sector is of interest because the energy models enable us to answer subsequent 'what-if' questions that involve technologies and practices related to energy. This report documents how the whole-building models were generated from the building characteristics in 2003 CBECS and compares the simulation results to the survey data for energy use.
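
    A minimal sketch of the bottom-up roll-up this report describes: per-building simulation results are scaled to the sector with survey weights. The field names and numbers are hypothetical stand-ins for 2003 CBECS records and EnergyPlus outputs.

```python
# Bottom-up roll-up: each surveyed building is simulated individually,
# then scaled to the sector with its survey weight.
buildings = [
    # survey weight = number of buildings represented; sim_gj = simulated
    # annual site energy (GJ). All values are invented for illustration.
    {"id": "B1", "weight": 4200.0, "sim_gj": 1500.0},
    {"id": "B2", "weight":  900.0, "sim_gj": 9800.0},
    {"id": "B3", "weight": 1750.0, "sim_gj":  640.0},
]

# Sector estimate: weighted sum over the sample, reported in PJ.
sector_pj = sum(b["weight"] * b["sim_gj"] for b in buildings) / 1e6
print(f"simulated sector site energy: {sector_pj:.3f} PJ")

# The comparison step described above repeats the same roll-up with the
# survey-reported consumption in place of "sim_gj" and inspects the gap.
```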

  17. New systematic methodology for incorporating dynamic heat transfer modelling in multi-phase biochemical reactors.

    Science.gov (United States)

    Fernández-Arévalo, T; Lizarralde, I; Grau, P; Ayesa, E

    2014-09-01

    This paper presents a new modelling methodology for dynamically predicting the heat produced or consumed in the transformations of any biological reactor using Hess's law. Starting from a complete description of model component stoichiometry and formation enthalpies, the proposed modelling methodology successfully integrates the simultaneous calculation of both the conventional mass balances and the enthalpy change of reaction in an expandable multi-phase matrix structure, which facilitates a detailed prediction of the main heat fluxes in biochemical reactors. The methodology has been implemented in a plant-wide modelling framework in order to facilitate the dynamic description of mass and heat throughout the plant. After validation with literature data, two case studies are described as illustrative examples of the capability of the methodology. In the first, a predenitrification-nitrification dynamic process is analysed, with the aim of demonstrating the easy integration of the methodology in any system. In the second case study, the simulation of a thermal model for an ATAD shows the potential of the proposed methodology for analysing the effect of ventilation and influent characterization. Copyright © 2014 Elsevier Ltd. All rights reserved.
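
    The core calculation the methodology builds on, Hess's law, is easy to show in isolation. In the sketch below the species, formation enthalpies and process rate are illustrative assumptions, not values from the paper.

```python
# Enthalpy change of reaction from formation enthalpies (Hess's law):
# dH_rxn = sum(nu_i * dHf_i), with nu negative for reactants.
# Illustrative reaction: aerobic oxidation CH2O + O2 -> CO2 + H2O.
dHf = {                    # standard formation enthalpies, kJ/mol
    "CH2O(aq)": -150.0,    # assumed value for a generic organic substrate
    "O2(aq)":      0.0,    # dissolved-state corrections ignored here
    "CO2(aq)":  -413.8,
    "H2O(l)":   -285.8,
}
stoich = {"CH2O(aq)": -1, "O2(aq)": -1, "CO2(aq)": 1, "H2O(l)": 1}

dH_rxn = sum(nu * dHf[s] for s, nu in stoich.items())   # kJ/mol substrate
print(f"enthalpy change of reaction: {dH_rxn:.1f} kJ/mol (exothermic if < 0)")

# In a multi-phase matrix model this scalar joins the stoichiometric row
# of the transformation, so each reaction rate also yields a heat flux:
rate = 0.004            # mol/(L*h), hypothetical process rate
heat = -dH_rxn * rate   # kJ/(L*h) released
print(f"heat release rate: {heat:.3f} kJ/(L*h)")
```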

  18. Climate Modelling Shows Increased Risk to Eucalyptus sideroxylon on the Eastern Coast of Australia Compared to Eucalyptus albens

    Directory of Open Access Journals (Sweden)

    Farzin Shabani

    2017-11-01

    Full Text Available Aim: To identify the extent and direction of range shift of Eucalyptus sideroxylon and E. albens in Australia by 2050 through an ensemble forecast of four species distribution models (SDMs), each generated using four global climate models (GCMs) under two representative concentration pathways (RCPs). Location: Australia. Methods: We used four SDMs, (i) generalized linear model, (ii) MaxEnt, (iii) random forest, and (iv) boosted regression tree, to construct SDMs for the species E. sideroxylon and E. albens under four GCMs, (a) MRI-CGCM3, (b) MIROC5, (c) HadGEM2-AO and (d) CCSM4, under the two RCPs of 4.5 and 6.0. Here, the true skill statistic (TSS) index was used to assess the accuracy of each SDM. Results: Results showed that E. albens and E. sideroxylon will lose large areas of their current suitable range by 2050, and E. sideroxylon is projected to gain in eastern and southeastern Australia. Some areas were also projected to remain suitable for each species between now and 2050. Our modelling showed that E. sideroxylon will lose suitable habitat on the western side and will not gain any on the eastern side because this region is one of the most heavily populated areas in the country, and the populated areas are moving westward. The predicted decrease in E. sideroxylon’s distribution suggests that land managers should monitor its population closely and evaluate whether it meets criteria for a protected legal status. Main conclusions: Both Eucalyptus sideroxylon and E. albens will be negatively affected by climate change, and it is projected that E. sideroxylon will be at greater risk of losing habitat than E. albens.
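
    The accuracy measure used above, the true skill statistic, is straightforward to compute from a confusion matrix; a minimal sketch with invented presence/absence records follows.

```python
import numpy as np

def true_skill_statistic(y_true, y_pred):
    """TSS = sensitivity + specificity - 1, ranging over [-1, 1]."""
    y_true, y_pred = np.asarray(y_true, bool), np.asarray(y_pred, bool)
    tp = np.sum( y_true &  y_pred)   # presences correctly predicted
    tn = np.sum(~y_true & ~y_pred)   # absences correctly predicted
    fp = np.sum(~y_true &  y_pred)
    fn = np.sum( y_true & ~y_pred)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity + specificity - 1

# Hypothetical presence/absence records vs. one SDM's binary prediction.
presence  = [1, 1, 1, 0, 0, 0, 1, 0, 1, 0]
predicted = [1, 1, 0, 0, 0, 1, 1, 0, 1, 0]
print(f"TSS = {true_skill_statistic(presence, predicted):.2f}")
```

    In an ensemble forecast of the kind described here, per-model TSS scores are often also used to weight each SDM's contribution to the combined projection.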

  19. Methodologies for quantitative systems pharmacology (QSP) models : Design and Estimation

    NARCIS (Netherlands)

    Ribba, B.; Grimm, Hp; Agoram, B.; Davies, M.R.; Gadkar, K.; Niederer, S.; van Riel, N.; Timmis, J.; van der Graaf, Ph.

    2017-01-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early

  20. Methodologies for Quantitative Systems Pharmacology (QSP) Models: Design and Estimation

    NARCIS (Netherlands)

    Ribba, B.; Grimm, H. P.; Agoram, B.; Davies, M. R.; Gadkar, K.; Niederer, S.; van Riel, N.; Timmis, J.; van der Graaf, P. H.

    2017-01-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early

  1. A Model for a State Income Tax in Australia: Historical Considerations, Key Design Issues and Recommendations

    OpenAIRE

    Mellor, Peter Warren

    2017-01-01

    This thesis addresses the question: would the reintroduction of income taxation at the State level in Australia be feasible at the present time? The States levied income taxes from the late nineteenth century until 1942, when the Commonwealth unilaterally enacted legislation for its ‘uniform tax’ scheme of centralised income taxation, which made it effectively impossible for State income taxation to continue. As the States also face significant constitutional restrictions ...

  2. A changing climate: impacts on human exposures to O3 using an integrated modeling methodology

    Science.gov (United States)

    Predicting the impacts of changing climate on human exposure to air pollution requires future scenarios that account for changes in ambient pollutant concentrations, population sizes and distributions, and housing stocks. An integrated methodology to model changes in human exposu...

  3. Event based uncertainty assessment in urban drainage modelling, applying the GLUE methodology

    DEFF Research Database (Denmark)

    Thorndahl, Søren; Beven, K.J.; Jensen, Jacob Birk

    2008-01-01

    In the present paper an uncertainty analysis of an application of the commercial urban drainage model MOUSE is conducted. Applying the Generalized Likelihood Uncertainty Estimation (GLUE) methodology, the model is conditioned on observation time series from two flow gauges as well as the occurrence of combined sewer overflow. The GLUE methodology is used to test different conceptual setups in order to determine if one model setup gives a better goodness of fit conditional on the observations than the other. Moreover, different methodological investigations of GLUE are conducted in order to test if the uncertainty analysis is unambiguous. It is shown that the GLUE methodology is very applicable in uncertainty analysis of this application of an urban drainage model, although it was shown to be quite difficult to get good fits of the whole time series.
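
    A stripped-down sketch of the GLUE procedure itself (not of MOUSE): sample parameter sets, score them with an informal likelihood, keep the behavioural ones, and summarise the surviving spread. The "drainage model" is a one-line stand-in and every number is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def drainage_model(theta, t):
    """Toy stand-in for the urban drainage model: flow vs. time."""
    a, b = theta
    return a * np.exp(-b * t)

t = np.linspace(0, 10, 50)
observed = drainage_model((2.0, 0.4), t) + rng.normal(0, 0.05, t.size)

# GLUE: sample many parameter sets, score each with an informal
# likelihood, and keep the "behavioural" ones above a threshold.
samples = rng.uniform([0.5, 0.05], [4.0, 1.0], size=(5000, 2))
sse = np.array([np.sum((drainage_model(s, t) - observed) ** 2) for s in samples])
likelihood = np.exp(-sse / sse.min())        # one common informal choice
behavioural = samples[likelihood > 0.1]
weights = likelihood[likelihood > 0.1]
weights /= weights.sum()

# Weighted parameter quantiles express the residual uncertainty.
order = np.argsort(behavioural[:, 0])
cdf = np.cumsum(weights[order])
lo, hi = np.interp([0.05, 0.95], cdf, behavioural[order, 0])
print(f"behavioural sets: {len(behavioural)}; 90% band for a: [{lo:.2f}, {hi:.2f}]")
```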

  4. An improved methodology for dynamic modelling and simulation of ...

    Indian Academy of Sciences (India)

    The diversity of the processes and the complexity of the drive system ... modelling the specific event, general simulation tools such as MATLAB® provide the user with tools for creating ... using the pulse width modulation (PWM) techniques.

  5. Tornado missile simulation and design methodology. Volume 2: model verification and data base updates. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments
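
    A toy analogue of the Monte Carlo sequencing described above, with every probability invented for illustration: sample tornado years, loft a debris set, and count the years in which any missile both impacts and damages the target.

```python
import numpy as np

rng = np.random.default_rng(42)
trials = 200_000                      # simulated plant-years

# Hypothetical event chain, loosely following a time-history simulation;
# every rate below is an illustrative placeholder, not a TORMIS value.
p_tornado = 1e-3                      # annual tornado strike probability
damage_years = 0
for _ in range(trials):
    if rng.random() >= p_tornado:     # no strike this simulated year
        continue
    n_missiles = rng.poisson(50)      # debris set lofted by the tornado
    impacts = rng.random(n_missiles) < 2e-3           # impact on the structure
    damaging = rng.random(int(impacts.sum())) < 0.1   # exceeds damage threshold
    damage_years += bool(damaging.any())

print(f"estimated annual damage probability: {damage_years / trials:.2e}")
```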

  6. Methodologic model to scheduling on service systems: a software engineering approach

    Directory of Open Access Journals (Sweden)

    Eduyn Ramiro Lopez-Santana

    2016-06-01

    Full Text Available This paper presents a software engineering approach to a research proposal to build an expert system for scheduling in service systems, using methodologies and processes of software development. We use adaptive software development as the methodology for the software architecture, based on its description as a software metaprocess that characterizes the research process. We draw UML (Unified Modeling Language) diagrams to provide a visual model that describes the research methodology, in order to identify the actors, elements and interactions in the research process.

  7. Methodology for assessing electric vehicle charging infrastructure business models

    OpenAIRE

    Madina, Carlos; Zamora, Inmaculada; Zabala, Eduardo

    2016-01-01

    The analysis of economic implications of innovative business models in networked environments, as electro-mobility is, requires a global approach to ensure that all the involved actors obtain a benefit. Although electric vehicles (EVs) provide benefits for the society as a whole, there are a number of hurdles for their widespread adoption, mainly the high investment cost for the EV and for the infrastructure. Therefore, a sound business model must be built up for charging service operators, w...

  8. The methodology of energy policy-making in economical models

    Energy Technology Data Exchange (ETDEWEB)

    Poursina, B.

    1998-08-01

    Scrutiny and careful study of energy is a subject that has been investigated in the human sciences from different points of view. The expansion of this research, because of its importance and effect on different dimensions of human life, has also reached the fields of political and economic science. Economics evaluates the energy phenomenon alongside elements such as labor, capital and technology in the production functions of firms. The nature of these discussions is mainly from the viewpoint of micro analyses. Nevertheless, the variation and challenges concerning energy and environment during the recent decades, and economists' detailed investigations in their analysis and evaluation, have led to the arrival of energy discussions, in a special shape, in macro planning and large economic models. The paper compares various energy models - EFDM, MEDEE, MIDAS and HERMES. This extent of planning and consequently modelling, which lacks a background in the processes of economic research, deals with the analysis of the interacting effects of energy and the economy. Modelling of energy-economy interaction and of energy policy in large macroeconomic models are new ideas in energy studies and economics. 7 refs., 6 figs., 1 tab.

  9. A Comparative Study of Three Methodologies for Modeling Dynamic Stall

    Science.gov (United States)

    Sankar, L.; Rhee, M.; Tung, C.; ZibiBailly, J.; LeBalleur, J. C.; Blaise, D.; Rouzaud, O.

    2002-01-01

    During the past two decades, there has been increased reliance on the use of computational fluid dynamics methods for modeling rotors in high-speed forward flight. Computational methods are being developed for modeling the shock-induced loads on the advancing side, first-principles-based modeling of the trailing wake evolution, and retreating blade stall. The retreating blade dynamic stall problem has received particular attention, because the large variations in lift and pitching moments encountered in dynamic stall can lead to blade vibrations and pitch link fatigue. Restricting attention to aerodynamics, the numerical prediction of dynamic stall is still a complex and challenging CFD problem that, even in two dimensions at low speed, gathers the major difficulties of aerodynamics, such as the grid resolution requirements for the viscous phenomena at leading-edge bubbles or in mixing layers and the bias of the numerical viscosity, together with the major difficulties of physical modeling, such as the turbulence and transition models, whose determinant influences, already present in static maximum-lift or stall computations, are emphasized by the dynamic aspect of the phenomena.

  10. Exploration of the beliefs and experiences of Aboriginal people with cancer in Western Australia: a methodology to acknowledge cultural difference and build understanding

    Directory of Open Access Journals (Sweden)

    Howat Peter

    2009-08-01

    Full Text Available Abstract Background Aboriginal Australians experience poorer outcomes, and are 2.5 times more likely to die from cancer than non-Aboriginal people, even after adjustment for stage of diagnosis, cancer treatment and comorbidities. They are also less likely to present early as a result of symptoms and to access treatment. Psycho-social factors affect Aboriginal people's willingness and ability to participate in cancer-related screening and treatment services, but little exploration of this has occurred within Australia to date. The current research adopted a phenomenological qualitative approach to understand and explore the lived experiences of Aboriginal Australians with cancer and their beliefs and understanding around this disease in Western Australia (WA). This paper details considerations in the design and process of conducting the research. Methods/Design The National Health and Medical Research Council (NHMRC) guidelines for the ethical conduct of Aboriginal research were followed. Researchers acknowledged the past negative experiences of Aboriginal people with research and were keen to build trust and relationships prior to conducting research with them. Thirty in-depth interviews with Aboriginal people affected by cancer and twenty with health service providers were carried out in urban, rural and remote areas of WA. Interviews were audio-recorded, transcribed verbatim and coded independently by two researchers. NVivo7 software was used to assist data management and analysis. Participants' narratives were divided into broad categories to allow identification of key themes and discussed by the research team. Discussion and conclusion Key issues specific to Aboriginal research include the need for the research process to be relationship-based, respectful, culturally appropriate and inclusive of Aboriginal people. Researchers are accountable to both participants and the wider community for reporting their findings and for research translation so

  11. Integrated modeling and analysis methodology for precision pointing applications

    Science.gov (United States)

    Gutierrez, Homero L.

    2002-07-01

    Space-based optical systems that perform tasks such as laser communications, Earth imaging, and astronomical observations require precise line-of-sight (LOS) pointing. A general approach is described for integrated modeling and analysis of these types of systems within the MATLAB/Simulink environment. The approach can be applied during all stages of program development, from early conceptual design studies to hardware implementation phases. The main objective is to predict the dynamic pointing performance subject to anticipated disturbances and noise sources. Secondary objectives include assessing the control stability, levying subsystem requirements, supporting pointing error budgets, and performing trade studies. The integrated model resides in Simulink, and several MATLAB graphical user interfaces (GUIs) allow the user to configure the model, select analysis options, run analyses, and process the results. A convenient parameter naming and storage scheme, as well as model conditioning and reduction tools and run-time enhancements, are incorporated into the framework. This enables the proposed architecture to accommodate models of realistic complexity.

  12. Terminology and methodology in modelling for water quality management

    DEFF Research Database (Denmark)

    Carstensen, J.; Vanrolleghem, P.; Rauch, W.

    1997-01-01

    There is a widespread need for a common terminology in modelling for water quality management. This paper points out sources of confusion in the communication between researchers due to misuse of existing terminology or use of unclear terminology. The paper attempts to clarify the context of the most widely used terms for characterising models and within the process of model building. It is essential to the ever-growing community of researchers within water quality management that communication is eased by establishing a common terminology. This should not be done by giving broader definitions of the terms, but by stressing the use of a stringent terminology. Therefore, the goal of the paper is to advocate the use of such a well defined and clear terminology. (C) 1997 IAWQ. Published by Elsevier Science Ltd.

  13. A refined methodology for modeling volume quantification performance in CT

    Science.gov (United States)

    Chen, Baiyu; Wilson, Joshua; Samei, Ehsan

    2014-03-01

    The utility of the CT lung nodule volume quantification technique depends on the precision of the quantification. To enable the evaluation of quantification precision, we previously developed a mathematical model that related precision to image resolution and noise properties in uniform backgrounds in terms of an estimability index (e'). The e' was shown to predict empirical precision across 54 imaging and reconstruction protocols, but with different correlation qualities for FBP and iterative reconstruction (IR) due to the non-linearity of IR impacted by anatomical structure. To better account for the non-linearity of IR, this study aimed to refine the noise characterization of the model in the presence of textured backgrounds. Repeated scans of an anthropomorphic lung phantom were acquired. Subtracted images were used to measure the image quantum noise, which was then used to adjust the noise component of the e' calculation measured from a uniform region. In addition to the model refinement, the validation of the model was further extended to 2 nodule sizes (5 and 10 mm) and 2 segmentation algorithms. Results showed that the magnitude of IR's quantum noise was significantly higher in structured backgrounds than in uniform backgrounds (ASiR, 30-50%; MBIR, 100-200%). With the refined model, the correlation between e' values and empirical precision no longer depended on the reconstruction algorithm. In conclusion, the model with refined noise characterization reflected the non-linearity of iterative reconstruction in structured backgrounds, and further showed successful prediction of quantification precision across a variety of nodule sizes, dose levels, slice thicknesses, reconstruction algorithms, and segmentation software.

  14. A Methodology for Validation of High Resolution Combat Models

    Science.gov (United States)

    1988-06-01

    [Contents residue] B. The Teleological Problem; C. The Epistemological Problem; D. The Uncertainty Principle. ... theoretical issues. "The Teleological Problem" -- how a model by its nature formulates an explicit cause-and-effect relationship that excludes other ... "experts" in establishing the standard for reality. Generalization from personal experience is often hampered by the parochial aspects of the

  15. Experimental animal models for COPD: a methodological review

    Directory of Open Access Journals (Sweden)

    Vahideh Ghorani

    2017-05-01

    The present review describes the various methods used for the induction of animal models of COPD, the different animals used (mainly mice, guinea pigs and rats), and the parameters measured. The information provided in this review is valuable for choosing the appropriate animal, the method of induction and the parameters to be measured in studies concerning COPD.

  16. Controlling disease outbreaks in wildlife using limited culling: modelling classical swine fever incursions in wild pigs in Australia.

    Science.gov (United States)

    Cowled, Brendan D; Garner, M Graeme; Negus, Katherine; Ward, Michael P

    2012-01-16

    Disease modelling is one approach for providing new insights into wildlife disease epidemiology. This paper describes a spatio-temporal, stochastic, susceptible-exposed-infected-recovered process model that simulates the potential spread of classical swine fever (CSF) through a documented, large and free-living wild pig population following a simulated incursion. The study area (300 000 km2) was in northern Australia. Published data on wild pig ecology from Australia and international classical swine fever data were used to parameterise the model. Sensitivity analyses revealed that herd density (best estimate 1-3 pigs km-2), daily herd movement distances (best estimate approximately 1 km), probability of infection transmission between herds (best estimate 0.75) and disease-related herd mortality (best estimate 42%) were highly influential on epidemic size, but that extraordinary movements of pigs and the yearly home range size of a pig herd were not. CSF generally established (98% of simulations) following a single point introduction. CSF spread at approximately 9 km2 per day with low incidence rates … management in wildlife. An important finding was that it may only be necessary to cull or vaccinate relatively small proportions of a population to successfully contain and eradicate some wildlife disease epidemics.
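
    A much-reduced sketch of a stochastic SEIR process over herds, in the spirit of the model described above; the rates and the culling fraction are illustrative, not the paper's calibrated estimates, and the spatial component is omitted entirely.

```python
import numpy as np

rng = np.random.default_rng(7)

# Discrete-time stochastic SEIR over herds (not individual pigs).
n_herds, days = 2000, 365
beta = 0.5                 # herd-to-herd transmission rate per day (assumed)
latent, infectious = 4, 7  # mean latent and infectious periods, days (assumed)
cull_frac = 0.02           # fraction of infected herds culled per day (assumed)

S, E, I, R = n_herds - 1, 0, 1, 0
for day in range(days):
    new_E = rng.binomial(S, 1 - np.exp(-beta * I / n_herds))  # new exposures
    new_I = rng.binomial(E, 1 / latent)                       # end of latency
    removed = rng.binomial(I, min(1 / infectious + cull_frac, 1.0))
    S, E, I, R = S - new_E, E + new_E - new_I, I + new_I - removed, R + removed
    if E + I == 0:                                            # epidemic fade-out
        break

print(f"epidemic ended on day {day}; herds ever infected: {R}")
```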

  17. Long-term relationships of major macro-variables in a resource-related economic model of Australia

    International Nuclear Information System (INIS)

    Harvie, Charles; Hoa, T. van

    1993-01-01

    The paper reports the results of a simple cointegration analysis applied to bivariate causality models using data on resource output, oil prices, terms of trade, current account and output growth to investigate the long-term relationships among these major macroeconomic aggregates in a resource-related economic model of Australia. For the period 1960-1990, the empirical evidence indicates that these five macro-variables, as formulated in our model, are not random walks. In addition, resource production and oil prices are significantly cointegrated, and they are also significantly cointegrated with the current account, terms of trade and economic growth. These findings provide support to the long-term adjustments foundation of our resource-related model. (author)
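
    A minimal sketch of the kind of test battery behind these findings, using statsmodels on fabricated series that are cointegrated by construction: unit-root checks first, then an Engle-Granger style cointegration test on the pair.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller, coint

rng = np.random.default_rng(0)
n = 124                                  # e.g. quarterly data, 1960-1990

# Hypothetical series: oil prices as a random walk, resource output
# sharing its stochastic trend (i.e. cointegrated by construction).
oil = np.cumsum(rng.normal(0, 1, n))
resource_output = 0.6 * oil + rng.normal(0, 0.5, n)

# Step 1: unit-root (random walk) check on each series.
for name, series in [("oil", oil), ("resource output", resource_output)]:
    p = adfuller(series)[1]
    print(f"ADF p-value, {name}: {p:.3f}  (large p => unit root not rejected)")

# Step 2: cointegration test on the pair.
t_stat, p_value, _ = coint(resource_output, oil)
print(f"cointegration test: t = {t_stat:.2f}, p = {p_value:.3f}")
```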

  18. A European test of pesticide-leaching models: methodology and major recommendations

    NARCIS (Netherlands)

    Vanclooster, M.; Boesten, J.J.T.I.; Trevisan, M.; Brown, C.D.; Capri, E.; Eklo, O.M.; Gottesbüren, B.; Gouy, V.; Linden, van der A.M.A.

    2000-01-01

    Testing of pesticide-leaching models is important in view of their increasing use in pesticide registration procedures in the European Union. This paper presents the methodology and major conclusions of a test of pesticide-leaching models. Twelve models simulating the vertical one-dimensional

  19. A model-based software development methodology for high-end automotive components

    NARCIS (Netherlands)

    Ravanan, Mahmoud

    2014-01-01

    This report provides a model-based software development methodology for high-end automotive components. The V-model is used as a process model throughout the development of the software platform. It offers a framework that simplifies the relation between requirements, design, implementation,

  20. Modelling the effects of climate and land cover change on groundwater recharge in south-west Western Australia

    Directory of Open Access Journals (Sweden)

    W. Dawes

    2012-08-01

    Full Text Available The groundwater resource contained within the sandy aquifers of the Swan Coastal Plain, south-west Western Australia, provides approximately 60 percent of the drinking water for the metropolitan population of Perth. Rainfall decline over the past three decades coupled with increasing water demand from a growing population has resulted in falling dam storage and groundwater levels. Projected future changes in climate across south-west Western Australia consistently show a decline in annual rainfall of between 5 and 15 percent. There is expected to be a reduction of diffuse recharge across the Swan Coastal Plain. This study aims to quantify the change in groundwater recharge in response to a range of future climate and land cover patterns across south-west Western Australia.

    Modelling the impact on the groundwater resource of potential climate change was achieved with a dynamically linked unsaturated/saturated groundwater model. A vertical flux manager was used in the unsaturated zone to estimate groundwater recharge using a variety of simple and complex models based on climate, land cover type (e.g. native trees, plantation, cropping, urban, wetland), soil type, and groundwater depth.

    In the area centred on the city of Perth, Western Australia, the patterns of recharge change and groundwater level change are not consistent spatially, or consistently downward. In areas with land-use change, recharge rates have increased. Where rainfall has declined sufficiently, recharge rates are decreasing, and where compensating factors combine, there is little change to recharge. In the southwestern part of the study area, the patterns of groundwater recharge are dictated primarily by soil, geology and land cover. In the sand-dominated areas, there is little response to future climate change, because groundwater levels are shallow and much rainfall is rejected recharge. Where the combination of native vegetation and
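
    A single-store bucket model is about the simplest possible stand-in for the vertical flux calculation this record describes; the sketch below illustrates why a 5-15 percent rainfall decline can translate into a proportionally larger recharge decline. All parameters are assumed, not taken from the study.

```python
import random

random.seed(1)

# A single-store "bucket" of the unsaturated zone: recharge is the
# drainage that occurs once storage exceeds a capacity threshold.
def annual_recharge(daily_rain_mm, et_mm=1.5, capacity_mm=120.0):
    store, recharge = 60.0, 0.0
    for rain in daily_rain_mm:
        store = max(store + rain - et_mm, 0.0)   # wetting minus evapotranspiration
        if store > capacity_mm:                  # excess drains below the roots
            recharge += store - capacity_mm
            store = capacity_mm
    return recharge

base_year = [random.expovariate(1 / 2.4) for _ in range(365)]  # ~875 mm/yr

# Rainfall-decline scenarios in the 5-15 percent range discussed above.
for scale, label in [(1.00, "historical"), (0.95, "-5% rain"), (0.85, "-15% rain")]:
    r = annual_recharge([d * scale for d in base_year])
    print(f"{label:>11s}: recharge {r:7.1f} mm/yr")
```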

  1. A Methodology for Modeling Confined, Temperature Sensitive Cushioning Systems

    Science.gov (United States)

    1980-06-01

    ... thickness of cushion T, temperature θ, and, as a dependent variable, G, the peak acceleration. The initial model, Equation (IV-11), proved deficient ... [the remaining listing of regression terms (products of TR, TCTH and AL factors) is unrecoverable OCR residue]

  2. Modeling postpartum depression in rats: theoretic and methodological issues

    Science.gov (United States)

    Ming, LI; Shinn-Yi, CHOU

    2016-01-01

    The postpartum period is when a host of changes occur at molecular, cellular, physiological and behavioral levels to prepare female humans for the challenge of maternity. Alteration or prevention of these normal adaptions is thought to contribute to disruptions of emotion regulation, motivation and cognitive abilities that underlie postpartum mental disorders, such as postpartum depression. Despite the high incidence of this disorder, and the detrimental consequences for both mother and child, its etiology and related neurobiological mechanisms remain poorly understood, partially due to the lack of appropriate animal models. In recent decades, there have been a number of attempts to model postpartum depression disorder in rats. In the present review, we first describe clinical symptoms of postpartum depression and discuss known risk factors, including both genetic and environmental factors. Thereafter, we discuss various rat models that have been developed to capture various aspects of this disorder and knowledge gained from such attempts. In doing so, we focus on the theories behind each attempt and the methods used to achieve their goals. Finally, we point out several understudied areas in this field and make suggestions for future directions. PMID:27469254

  3. Modeling postpartum depression in rats: theoretic and methodological issues

    Directory of Open Access Journals (Sweden)

    Ming LI

    2018-06-01

    Full Text Available The postpartum period is when a host of changes occur at molecular, cellular, physiological and behavioral levels to prepare female humans for the challenge of maternity. Alteration or prevention of these normal adaptions is thought to contribute to disruptions of emotion regulation, motivation and cognitive abilities that underlie postpartum mental disorders, such as postpartum depression. Despite the high incidence of this disorder, and the detrimental consequences for both mother and child, its etiology and related neurobiological mechanisms remain poorly understood, partially due to the lack of appropriate animal models. In recent decades, there have been a number of attempts to model postpartum depression disorder in rats. In the present review, we first describe clinical symptoms of postpartum depression and discuss known risk factors, including both genetic and environmental factors. Thereafter, we discuss various rat models that have been developed to capture various aspects of this disorder and knowledge gained from such attempts. In doing so, we focus on the theories behind each attempt and the methods used to achieve their goals. Finally, we point out several understudied areas in this field and make suggestions for future directions.

  4. Myxomatosis in Australia and Europe: a model for emerging infectious diseases.

    Science.gov (United States)

    Kerr, Peter J

    2012-03-01

    Myxoma virus is a poxvirus naturally found in two American leporid (rabbit) species (Sylvilagus brasiliensis and Sylvilagus bachmani) in which it causes an innocuous localised cutaneous fibroma. However, in European rabbits (Oryctolagus cuniculus) the same virus causes the lethal disseminated disease myxomatosis. The introduction of myxoma virus into the European rabbit population in Australia in 1950 initiated the best known example of what happens when a novel pathogen jumps into a completely naïve new mammalian host species. The short generation time of the rabbit and their vast numbers in Australia meant evolution could be studied in real time. The carefully documented emergence of attenuated strains of virus that were more effectively transmitted by the mosquito vector and the subsequent selection of rabbits with genetic resistance to myxomatosis is the paradigm for pathogen virulence and host-pathogen coevolution. This natural experiment was repeated with the release of a separate strain of myxoma virus in France in 1952. The subsequent spread of the virus throughout Europe and its coevolution with the rabbit essentially paralleled what occurred in Australia. Detailed molecular studies on myxoma virus have dissected the role of virulence genes in the pathogenesis of myxomatosis and when combined with genomic data and reverse genetics should in future enable the understanding of the molecular evolution of the virus as it adapted to its new host. This review describes the natural history and evolution of myxoma virus together with the molecular biology and experimental pathogenesis studies that are informing our understanding of evolution of emerging diseases. Crown Copyright © 2012. Published by Elsevier B.V. All rights reserved.

  5. The persistence of subsistence: qualitative social-ecological modeling of indigenous aquatic hunting and gathering in tropical Australia

    Directory of Open Access Journals (Sweden)

    Marcus Barber

    2015-03-01

    Full Text Available Subsistence remains critical to indigenous people in settler-colonial states such as Australia, providing key foundations for indigenous identities and for wider state recognition. However, the drivers of contemporary subsistence are rarely fully articulated and analyzed in terms of likely changing conditions. Our interdisciplinary team combined past research experience gained from multiple sites with published literature to create two generalized qualitative models of the socio-cultural and environmental influences on indigenous aquatic subsistence in northern Australia. One model focused on the longer-term (inter-year to generational) persistence of subsistence at the community scale, the other model on shorter-term (day to season) drivers of effort by active individuals. The specification of driver definitions and relationships demonstrates the complexities of even generalized and materialist models of contemporary subsistence practices. The qualitative models were analyzed for emergent properties and for responses to plausible changes in key variables: access, habitat degradation, social security availability, and community dysfunction. Positive human community condition is shown to be critical to the long-term persistence of subsistence, but complex interactions of negative and positive drivers shape subsistence effort expended at the individual scale and within shorter time frames. Such models enable motivations, complexities, and the potential management and policy levers of significance to be identified, defined, causally related, and debated. The models can be used to augment future models of human-natural systems, be tested against case-specific field conditions and/or indigenous perspectives, and aid preliminary assessments of the effects on subsistence of changes in social and environmental conditions, including policy settings.

  6. Fuel cycle assessment: A compendium of models, methodologies, and approaches

    Energy Technology Data Exchange (ETDEWEB)

    1994-07-01

    The purpose of this document is to profile analytical tools and methods which could be used in a total fuel cycle analysis. The information in this document provides a significant step towards: (1) Characterizing the stages of the fuel cycle. (2) Identifying relevant impacts which can feasibly be evaluated quantitatively or qualitatively. (3) Identifying and reviewing other activities that have been conducted to perform a fuel cycle assessment or some component thereof. (4) Reviewing the successes/deficiencies and opportunities/constraints of previous activities. (5) Identifying methods and modeling techniques/tools that are available, tested and could be used for a fuel cycle assessment.

  7. Methodological aspects of journaling a dynamic adjusting entry model

    Directory of Open Access Journals (Sweden)

    Vlasta Kašparovská

    2011-01-01

    Full Text Available This paper expands the discussion of the importance and function of adjusting entries for loan receivables. Discussion of the cyclical development of adjusting entries, their negative impact on the business cycle and potential solutions has intensified during the financial crisis. These discussions are still ongoing and continue to be relevant to members of the professional public, banking regulators and representatives of international accounting institutions. The objective of this paper is to evaluate a method of journaling dynamic adjusting entries under current accounting law. It also expresses the authors’ opinions on the potential for consistently implementing basic accounting principles in journaling adjusting entries for loan receivables under a dynamic model.

  8. Methodology for modeling the microbial contamination of air filters.

    Science.gov (United States)

    Joe, Yun Haeng; Yoon, Ki Young; Hwang, Jungho

    2014-01-01

    In this paper, we propose a theoretical model to simulate microbial growth on contaminated air filters and entrainment of bioaerosols from the filters to an indoor environment. Air filter filtration and antimicrobial efficiencies, and effects of dust particles on these efficiencies, were evaluated. The number of bioaerosols downstream of the filter could be characterized according to three phases: initial, transitional, and stationary. In the initial phase, the number was determined by filtration efficiency, the concentration of dust particles entering the filter, and the flow rate. During the transitional phase, the number of bioaerosols gradually increased up to the stationary phase, at which point no further increase was observed. The antimicrobial efficiency and flow rate were the dominant parameters affecting the number of bioaerosols downstream of the filter in the transitional and stationary phase, respectively. It was found that the nutrient fraction of dust particles entering the filter caused a significant change in the number of bioaerosols in both the transitional and stationary phases. The proposed model would be a solution for predicting the air filter life cycle in terms of microbiological activity by simulating the microbial contamination of the filter.
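
    The record above gives no equations, so the following is only a toy sketch of the three-phase behaviour it describes: downstream counts start at the filter's penetration level, rise as surviving microbes grow on the filter, and plateau once growth saturates. All parameter names and values are hypothetical, not taken from the paper.

```python
import numpy as np

# Toy three-phase model (not the authors' equations): downstream bioaerosol
# emission = penetration through the filter + re-entrainment from microbes
# growing on the filter, with logistic growth producing the transitional
# and stationary phases.
def downstream_bioaerosols(t_hours, c_in=100.0, flow=1.0, eta_f=0.95,
                           eta_am=0.90, mu=0.05, capacity=1e6):
    """c_in: incoming bioaerosols [#/m^3]; flow: [m^3/h]; eta_f: filtration
    efficiency; eta_am: antimicrobial efficiency; mu: growth rate [1/h];
    capacity: carrying capacity of the filter [#]. All values assumed."""
    penetration = c_in * (1.0 - eta_f) * flow          # initial phase
    seed = max(c_in * eta_f * (1.0 - eta_am) * flow, 1.0)
    colonies = capacity / (1.0 + (capacity / seed - 1.0) * np.exp(-mu * t_hours))
    entrainment = 1e-4 * colonies * flow               # assumed re-entrainment
    return penetration + entrainment

t = np.linspace(0.0, 500.0, 6)
print(downstream_bioaerosols(t).round(1))  # rises, then plateaus (stationary)
```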

  9. Methodology for modeling the microbial contamination of air filters.

    Directory of Open Access Journals (Sweden)

    Yun Haeng Joe

    Full Text Available In this paper, we propose a theoretical model to simulate microbial growth on contaminated air filters and entrainment of bioaerosols from the filters to an indoor environment. Air filter filtration and antimicrobial efficiencies, and effects of dust particles on these efficiencies, were evaluated. The number of bioaerosols downstream of the filter could be characterized according to three phases: initial, transitional, and stationary. In the initial phase, the number was determined by filtration efficiency, the concentration of dust particles entering the filter, and the flow rate. During the transitional phase, the number of bioaerosols gradually increased up to the stationary phase, at which point no further increase was observed. The antimicrobial efficiency and flow rate were the dominant parameters affecting the number of bioaerosols downstream of the filter in the transitional and stationary phase, respectively. It was found that the nutrient fraction of dust particles entering the filter caused a significant change in the number of bioaerosols in both the transitional and stationary phases. The proposed model would be a solution for predicting the air filter life cycle in terms of microbiological activity by simulating the microbial contamination of the filter.

  10. New temperature model of the Netherlands from new data and novel modelling methodology

    Science.gov (United States)

    Bonté, Damien; Struijk, Maartje; Békési, Eszter; Cloetingh, Sierd; van Wees, Jan-Diederik

    2017-04-01

    Deep geothermal energy has grown in interest in Western Europe in recent decades, for direct use but also, as knowledge of the subsurface improves, for electricity generation. In the Netherlands, where the sector took off with the first system in 2005, geothermal energy is seen as a key player for a sustainable future. Knowledge of the subsurface temperature, together with the available flow from the reservoir, is an important factor that can determine the success of a geothermal energy project. To support the development of deep geothermal energy systems in the Netherlands, we made a first assessment of the subsurface temperature based on thermal data but also on geological elements (Bonté et al, 2012). An outcome of this work was ThermoGIS, which uses the temperature model. This work is a revision of the model that is used in ThermoGIS. The improvements on the first model are multiple: we have improved not only the dataset used for the calibration and the structural model, but also the methodology, through improved software (called b3t). The temperature dataset has been updated by integrating temperatures from newly accessible wells. The sedimentary description of the basin has been improved by using an updated and refined structural model and an improved lithological definition. A major improvement comes from the modelling methodology: with b3t the calibration is made not only using the lithospheric parameters but also using the thermal conductivity of the sediments. The result is a much more accurate definition of the model parameters and better handling of the calibration process. The outcome is a precise and improved temperature model of the Netherlands. The thermal conductivity variation in the sediments, associated with the geometry of the layers, is an important factor in temperature variations, and the influence of the Zechstein salt in the north of the country is important. In addition, the radiogenic heat

  11. A model for a drug distribution system in remote Australia as a social determinant of health using event structure analysis.

    Science.gov (United States)

    Rovers, John P; Mages, Michelle D

    2017-09-25

    The social determinants of health include the health systems under which people live and utilize health services. One social determinant, for which pharmacists are responsible, is designing drug distribution systems that ensure patients have safe and convenient access to medications. This is critical for settings with poor access to health care. Rural and remote Australia is one example of a setting where the pharmacy profession, schools of pharmacy, and regulatory agencies require pharmacists to assure medication access. Studies of drug distribution systems in such settings are uncommon. This study describes a model for a drug distribution system in an Aboriginal Health Service in remote Australia. The results may be useful for policy setting, pharmacy system design, health professions education, benchmarking, or quality assurance efforts for health system managers in similarly remote locations. The results also suggest that pharmacists can promote access to medications as a social determinant of health. The primary objective of this study was to propose a model for a drug procurement, storage, and distribution system in a remote region of Australia. The secondary objective was to learn the opinions and experiences of healthcare workers under the model. Qualitative research methods were used. Semi-structured interviews were performed with a convenience sample of 11 individuals employed by an Aboriginal health service. Transcripts were analyzed using Event Structure Analysis (ESA) to develop the model. Transcripts were also analyzed to determine the opinions and experiences of health care workers. The model was comprised of 24 unique steps with seven distinct components: choosing a supplier; creating a list of preferred medications; budgeting and ordering; supply and shipping; receipt and storage in the clinic; prescribing process; dispensing and patient counseling. Interviewees described opportunities for quality improvement in choosing suppliers, legal issues and

  12. Adaptability and stability of maize varieties using mixed model methodology

    Directory of Open Access Journals (Sweden)

    Walter Fernandes Meirelles

    2012-01-01

    Full Text Available The objective of this study was to evaluate the performance, adaptability and stability of corn cultivars simultaneously in unbalanced experiments, using the method of harmonic means of the relative performance of genetic values. The grain yield of 45 cultivars, including hybrids and varieties, was evaluated in 49 environments in two growing seasons. In the 2007/2008 growing season, 36 cultivars were evaluated and in 2008/2009 25 cultivars, of which 16 were used in both seasons. Statistical analyses were performed based on mixed models, considering genotypes as random and replications within environments as fixed factors. The experimental precision in the combined analyses was high (accuracy estimates > 92 %. Despite the existence of genotype x environment interaction, hybrids and varieties with high adaptability and stability were identified. Results showed that the method of harmonic means of the relative performance of genetic values is a suitable method for maize breeding programs.
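
    The ranking statistic named in the abstract, the harmonic mean of the relative performance of genetic values (often abbreviated MHPRVG), can be sketched in a few lines. The matrix of genotypic values below is invented for illustration; in the study these would be BLUPs from the mixed-model analysis.

```python
import numpy as np

# Sketch of the harmonic mean of the relative performance of genotypic
# values (MHPRVG-style ranking). Rows = genotypes, columns = environments;
# entries would be mixed-model BLUPs in the study, invented here.
gv = np.array([[6.2, 5.8, 7.1],
               [5.9, 6.4, 6.6],
               [7.0, 5.1, 6.9]])

relative = gv / gv.mean(axis=0)    # performance relative to each environment mean
mhprvg = relative.shape[1] / (1.0 / relative).sum(axis=1)  # harmonic mean per genotype
print(mhprvg.round(3), np.argsort(-mhprvg))  # higher = better adaptability and stability
```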

  13. Reliability modelling of repairable systems using Petri nets and fuzzy Lambda-Tau methodology

    International Nuclear Information System (INIS)

    Knezevic, J.; Odoom, E.R.

    2001-01-01

    A methodology is developed which uses Petri nets instead of the fault tree methodology and solves for reliability indices utilising fuzzy Lambda-Tau method. Fuzzy set theory is used for representing the failure rate and repair time instead of the classical (crisp) set theory because fuzzy numbers allow expert opinions, linguistic variables, operating conditions, uncertainty and imprecision in reliability information to be incorporated into the system model. Petri nets are used because unlike the fault tree methodology, the use of Petri nets allows efficient simultaneous generation of minimal cut and path sets
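
    A minimal sketch of the fuzzy Lambda-Tau arithmetic follows, using the classic gate expressions and triangular fuzzy numbers represented as (low, mode, high) triples. Vertex arithmetic is used here as a first-order stand-in for the paper's alpha-cut interval computations, and all numeric values are illustrative.

```python
# Sketch of fuzzy Lambda-Tau arithmetic on triangular fuzzy numbers given
# as (low, mode, high) triples; vertex arithmetic approximates the exact
# alpha-cut interval computation.
def tri_op(x, y, op):
    vals = [op(a, b) for a in x for b in y]
    return (min(vals), op(x[1], y[1]), max(vals))

def and_gate(l1, t1, l2, t2):
    """Classic Lambda-Tau AND-gate expressions:
    lambda = l1*l2*(t1+t2),  tau = t1*t2/(t1+t2)."""
    s = tri_op(t1, t2, lambda a, b: a + b)
    lam = tri_op(tri_op(l1, l2, lambda a, b: a * b), s, lambda a, b: a * b)
    tau = tri_op(tri_op(t1, t2, lambda a, b: a * b), s, lambda a, b: a / b)
    return lam, tau

def or_gate(l1, t1, l2, t2):
    """lambda = l1+l2,  tau = (l1*t1+l2*t2)/(l1+l2)."""
    lam = tri_op(l1, l2, lambda a, b: a + b)
    num = tri_op(tri_op(l1, t1, lambda a, b: a * b),
                 tri_op(l2, t2, lambda a, b: a * b), lambda a, b: a + b)
    return lam, tri_op(num, lam, lambda a, b: a / b)

# Fuzzy failure rates [1/h] and repair times [h]; values illustrative.
lam, tau = or_gate((1e-4, 2e-4, 3e-4), (2.0, 4.0, 6.0),
                   (0.5e-4, 1e-4, 2e-4), (3.0, 5.0, 8.0))
print(lam, tau)
```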

  14. Selection of low-level radioactive waste disposal sites using screening models versus more complex methodologies

    International Nuclear Information System (INIS)

    Uslu, I.; Fields, D.E.

    1993-01-01

    The task of choosing a waste-disposal site from a set of candidate sites requires an approach capable of objectively handling many environmental variables for each site. Several computer methodologies have been developed to assist in the process of choosing a site for the disposal of low-level radioactive waste; however, most of these models are costly to apply, in terms of computer resources and the time and effort required by professional modelers, geologists, and waste-disposal experts. The authors describe how the relatively simple DRASTIC methodology (a standardized system for evaluating groundwater pollution potential using hydrogeologic settings) may be used for "pre-screening" of sites to determine which subset of candidate sites is worthy of more detailed screening. Results of site comparisons made with DRASTIC are compared with results obtained using PRESTO-II methodology, which is representative of the more complex release-transport-human exposure methodologies. 6 refs., 1 fig., 1 tab
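
    The DRASTIC index itself is a simple weighted sum of seven hydrogeologic ratings, which is what makes it attractive for pre-screening. A sketch with the standard weights follows; the site ratings are hypothetical.

```python
# Illustrative DRASTIC index: a weighted sum of seven hydrogeologic ratings
# (each rated 1-10). The standard DRASTIC weights are shown; the site
# ratings below are hypothetical.
WEIGHTS = {"Depth_to_water": 5, "net_Recharge": 4, "Aquifer_media": 3,
           "Soil_media": 2, "Topography": 1, "Impact_of_vadose_zone": 5,
           "hydraulic_Conductivity": 3}

def drastic_index(ratings):
    return sum(WEIGHTS[k] * r for k, r in ratings.items())

site_a = {"Depth_to_water": 7, "net_Recharge": 6, "Aquifer_media": 4,
          "Soil_media": 5, "Topography": 9, "Impact_of_vadose_zone": 3,
          "hydraulic_Conductivity": 2}
print(drastic_index(site_a))  # higher index = higher pollution potential
```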

  15. Models of expected returns on the brazilian market: Empirical tests using predictive methodology

    Directory of Open Access Journals (Sweden)

    Adriano Mussa

    2009-01-01

    Full Text Available Predictive methodologies for testing expected returns models are widely diffused in the international academic environment. However, these methods have not been used in Brazil in a systematic way. Generally, empirical studies conducted with Brazilian stock market data concentrate only on the first step of these methodologies. The purpose of this article was to test and compare the CAPM, 3-factor and 4-factor models using a predictive methodology, considering two steps (time-series and cross-sectional regressions) with standard errors obtained by the techniques of Fama and MacBeth (1973). The results indicated the superiority of the 4-factor model as compared to the 3-factor model, and the superiority of the 3-factor model as compared to the CAPM, but none of the tested models was sufficient to explain Brazilian stock returns. Contrary to some empirical evidence that does not use predictive methodology, the size and momentum effects do not seem to exist in the Brazilian capital markets, but there is evidence of the value effect and of the relevance of the market factor in explaining expected returns. These findings raise some questions, mainly due to the originality of the methodology in the local market and the fact that this subject is still incipient and polemic in the Brazilian academic environment.
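
    The two-step predictive methodology referred to (Fama and MacBeth 1973) can be sketched on synthetic data: time-series regressions first estimate factor loadings, then period-by-period cross-sectional regressions estimate risk premia, whose time-series mean and standard error give the test statistics. The data and dimensions below are invented.

```python
import numpy as np

# Minimal Fama-MacBeth (1973) two-step sketch with synthetic data.
# Step 1: time-series regressions estimate each asset's factor loadings.
# Step 2: cross-sectional regressions at each date estimate risk premia;
# standard errors come from the time series of those estimates.
rng = np.random.default_rng(0)
T, N, K = 120, 25, 3                      # months, assets, factors
factors = rng.normal(size=(T, K))
betas_true = rng.uniform(0.5, 1.5, size=(N, K))
returns = factors @ betas_true.T + rng.normal(scale=0.5, size=(T, N))

# Step 1: betas from time-series regressions (with intercept).
X = np.column_stack([np.ones(T), factors])
betas = np.linalg.lstsq(X, returns, rcond=None)[0][1:].T   # (N, K)

# Step 2: premia estimated by a cross-sectional regression each month.
Xc = np.column_stack([np.ones(N), betas])
lambdas = np.array([np.linalg.lstsq(Xc, returns[t], rcond=None)[0][1:]
                    for t in range(T)])                    # (T, K)

premia = lambdas.mean(axis=0)
se = lambdas.std(axis=0, ddof=1) / np.sqrt(T)              # Fama-MacBeth SEs
print(premia.round(3), (premia / se).round(2))             # estimates, t-stats
```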

  16. A comparison of multivariate and univariate time series approaches to modelling and forecasting emergency department demand in Western Australia.

    Science.gov (United States)

    Aboagye-Sarfo, Patrick; Mai, Qun; Sanfilippo, Frank M; Preen, David B; Stewart, Louise M; Fatovich, Daniel M

    2015-10-01

    To develop multivariate vector-ARMA (VARMA) forecast models for predicting emergency department (ED) demand in Western Australia (WA) and compare them to the benchmark univariate autoregressive moving average (ARMA) and Winters' models. Seven-year monthly WA state-wide public hospital ED presentation data from 2006/07 to 2012/13 were modelled. Graphical and VARMA modelling methods were used for descriptive analysis and model fitting. The VARMA models were compared to the benchmark univariate ARMA and Winters' models to determine their accuracy in predicting ED demand. The best models were evaluated for accuracy using error-correction methods. Descriptive analysis of all the dependent variables showed an increasing pattern of ED use with seasonal trends over time. The VARMA models provided a more precise and accurate forecast, with smaller confidence intervals and better measures of accuracy in predicting ED demand in WA than the ARMA and Winters' methods. VARMA models are a reliable forecasting method to predict ED demand for strategic planning and resource allocation. While the ARMA models are a closely competing alternative, they under-estimated future ED demand. Copyright © 2015 Elsevier Inc. All rights reserved.
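
    A minimal sketch of the VARMA-versus-ARMA comparison, using statsmodels on synthetic monthly series standing in for the ED presentation data; the model orders, the series themselves, and the accuracy measure (MAPE) are illustrative choices, not those of the study.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.varmax import VARMAX
from statsmodels.tsa.arima.model import ARIMA

# Synthetic monthly "ED demand" streams with trend, seasonality and
# cross-correlated noise (illustrative, not the WA data).
rng = np.random.default_rng(1)
t = np.arange(84)
trend = 1000 + 5 * t
season = 80 * np.sin(2 * np.pi * t / 12)
noise = rng.multivariate_normal([0, 0], [[400, 150], [150, 300]], size=84)
ed = pd.DataFrame({"metro": trend + season + noise[:, 0],
                   "rural": 0.6 * trend + 0.5 * season + noise[:, 1]})

train, test = ed.iloc[:72], ed.iloc[72:]

# Multivariate VARMA(1,1) versus univariate ARMA(1,1) on one series.
varma_fc = VARMAX(train, order=(1, 1)).fit(disp=False).forecast(steps=12)
arma_fc = ARIMA(train["metro"], order=(1, 0, 1)).fit().forecast(steps=12)

mape = lambda y, f: float(np.mean(np.abs((y - f) / y))) * 100
print(mape(test["metro"].to_numpy(), varma_fc["metro"].to_numpy()),
      mape(test["metro"].to_numpy(), arma_fc.to_numpy()))
```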

  17. Characterisation of current and future GNSS performance in urban canyons using a high quality 3-D urban model of Melbourne, Australia

    Science.gov (United States)

    Gang-jun, Liu; Kefei, Zhang; Falin, Wu; Liam, Densley; Retscher, Günther

    2009-03-01

    Global Navigation Satellite System (GNSS) is a critical space-borne geospatial infrastructure providing essential positioning support to a range of location-sensitive applications. GNSS is currently dominated by the US Global Positioning System (GPS) constellation. The next generation GNSS is expected to offer more satellites, better positioning provision, and improved availability and continuity of navigation support. However, GNSS performance in 3-D urban environments is problematic because GNSS signals are either completely blocked or severely degraded by high-rise geographic features such as buildings. The aim of this study is to gain an in-depth understanding of the changing spatial patterns of GNSS performance, measured by the number of visible satellites (NVS) and position dilution-of-precision (PDOP), in the urban canyons of Melbourne, Australia. The methodology used includes the following steps: (1) determination of the dynamic orbital positions of current and future GNSS satellites; (2) development of a 3-D urban model of high geometric quality for the Melbourne Central Business District (CBD); (3) evaluation of GNSS performance for every specified location in the urban canyons; and (4) visualisation and characterisation of the dynamic spatial patterns of GNSS performance in the urban canyons. As expected, the study shows that the integration of the GPS and Galileo constellations results in higher availability and stronger geometry, leading to significant improvement of GNSS performance in urban canyons of Melbourne CBD. Some conclusions are drawn and further research currently undertaken is also outlined.
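
    The two performance measures used, NVS and PDOP, follow directly from satellite geometry once blocked satellites have been culled against the 3-D city model. A sketch of the PDOP computation from line-of-sight unit vectors follows; the vectors are invented and the visibility culling itself is not reproduced.

```python
import numpy as np

# Sketch of the satellite-geometry calculation behind NVS and PDOP.
# `los` holds unit line-of-sight vectors to each *visible* satellite
# (i.e., after sky-blockage culling against the 3-D urban model).
def pdop(sats):
    sats = np.asarray(sats, dtype=float)
    G = np.hstack([-sats, np.ones((len(sats), 1))])  # rows: [-ux, -uy, -uz, 1]
    Q = np.linalg.inv(G.T @ G)                       # cofactor matrix
    return float(np.sqrt(Q[0, 0] + Q[1, 1] + Q[2, 2]))

# Four illustrative line-of-sight vectors (east, north, up), normalised.
los = np.array([[0.3, 0.5, 0.81], [-0.6, 0.2, 0.77],
                [0.1, -0.7, 0.70], [0.5, -0.1, 0.86]])
los /= np.linalg.norm(los, axis=1, keepdims=True)
print(len(los), pdop(los))  # NVS and PDOP; more, better-spread satellites -> lower PDOP
```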

  18. Competence development organizations in project management on the basis of genomic model methodologies

    OpenAIRE

    Бушуев, Сергей Дмитриевич; Рогозина, Виктория Борисовна; Ярошенко, Юрий Федерович

    2013-01-01

    A matrix technology for the identification of organisational competencies in project management is presented in the article. The matrix elements are the components of organisational competence in the field of project management, with the project management methodology represented in the structure of a genome. The matrix model of competence within the adopted methodologies, and a scanning method for identifying organisational competences, are formalised. Proposed methods for building effective proj...

  19. Oil Well Blowout 3D computational modeling: review of methodology and environmental requirements

    OpenAIRE

    Pedro Mello Paiva; Alexandre Nunes Barreto; Jader Lugon Junior; Leticia Ferraço de Campos

    2016-01-01

    This literature review aims to present the different methodologies used in the three-dimensional modeling of the dispersion of hydrocarbons originating from an oil well blowout. It presents the concepts of coastal environmental sensitivity and vulnerability, their importance for prioritizing the most vulnerable areas in case of contingency, and the relevant legislation. We also discuss some limitations of the methodology currently used in environmental studies of oil drift, which considers sim...

  20. Development and demonstration of a validation methodology for vehicle lateral dynamics simulation models

    Energy Technology Data Exchange (ETDEWEB)

    Kutluay, Emir

    2013-02-01

    In this thesis a validation methodology to be used in the assessment of vehicle dynamics simulation models is presented. Simulation of vehicle dynamics is used to estimate the dynamic responses of existing or proposed vehicles and has a wide array of applications in the development of vehicle technologies. Although simulation environments, measurement tools and mathematical theories on vehicle dynamics are well established, the methodical link between the experimental test data and the validity analysis of the simulation model is still lacking. The developed validation paradigm takes a top-down approach to the problem. It is ascertained that vehicle dynamics simulation models can only be validated using test maneuvers, although they are aimed at real-world maneuvers. Test maneuvers are determined according to the requirements of the real event at the start of the model development project, and data handling techniques, validation metrics and criteria are declared for each of the selected maneuvers. If the simulation results satisfy these criteria, the simulation is deemed 'not invalid'. If the simulation model fails to meet the criteria, the model is deemed invalid, and a model iteration should be performed. The results are analyzed to determine whether they indicate a modeling error or a modeling inadequacy, and whether a conditional validity in terms of system variables can be defined. Three test cases are used to demonstrate the application of the methodology. The developed methodology successfully identified the shortcomings of the tested simulation model and defined its limits of application. The tested simulation model is found to be acceptable, but valid only in a certain dynamical range. Several insights into the deficiencies of the model are reported in the analysis, but the iteration step of the methodology is not demonstrated. Utilizing the proposed methodology will help to achieve more time and cost efficient simulation projects with

  1. Methodology of inspections to carry out the nuclear outages model

    International Nuclear Information System (INIS)

    Aycart, J.; Mortenson, S.; Fourquet, J. M.

    2005-01-01

    Before the nuclear generation industry was deregulated in the United States, refueling and maintenance outages in nuclear power plants usually lasted around 100 days. After deregulation took effect, improved capability factors and performance became more important. As a result, it became essential to reduce the critical path time during the outage, which meant that activities that had typically been done in series had to be executed in parallel. The new outage model required the development of new tools and new processes. The 360-degree platform developed by GE Energy has made it possible to execute multiple activities in parallel. Various in-vessel visual inspection (IVVI) equipment can now simultaneously perform inspections on reactor pressure vessel (RPV) components. The larger amount of inspection equipment in turn results in a larger volume of data, with the risk of increasing the time needed to examine it and postponing the end of the analysis phase, which is critical for the outage. To decrease data analysis times, the IVVI digitalisation process has been developed. With this process, the IVVI data are sent via a high-speed transmission line to a site outside the plant called the Center of Excellence (COE), where a team of Level III experts is in charge of analyzing them. The tools for the different product lines are being developed to interfere with each other as little as possible, thus minimizing the impact on the critical path of plant refueling activities. Methods are also being developed to increase the intervals between inspections. In accordance with the guidelines of the Boiling Water Reactor Vessel and Internals Project (BWRVIP), the intervals between inspections are typically longer if ultrasonic volumetric inspections are performed than if the scope is limited to IVVI. (Author)

  2. A methodology and supply chain management inspired reference ontology for modeling healthcare teams.

    Science.gov (United States)

    Kuziemsky, Craig E; Yazdi, Sara

    2011-01-01

    Numerous studies and strategic plans are advocating more team based healthcare delivery that is facilitated by information and communication technologies (ICTs). However before we can design ICTs to support teams we need a solid conceptual model of team processes and a methodology for using such a model in healthcare settings. This paper draws upon success in the supply chain management domain to develop a reference ontology of healthcare teams and a methodology for modeling teams to instantiate the ontology in specific settings. This research can help us understand how teams function and how we can design ICTs to support teams.

  3. Methodological challenges in collecting social and behavioural data regarding the HIV epidemic among gay and other men who have sex with men in Australia.

    Directory of Open Access Journals (Sweden)

    Iryna B Zablotska

    Full Text Available BACKGROUND: Behavioural surveillance and research among gay and other men who have sex with men (GMSM) commonly relies on non-random recruitment approaches. Methodological challenges limit their ability to accurately represent the population of adult GMSM. We compared the social and behavioural profiles of GMSM recruited via venue-based, online, and respondent-driven sampling (RDS) and discussed their utility for behavioural surveillance. METHODS: Data from four studies were selected to reflect each recruitment method. We compared demographic characteristics and the prevalence of key indicators, including sexual and HIV testing practices, obtained from samples recruited through different methods, and population estimates from respondent-driven sampling partition analysis. RESULTS: Overall, the socio-demographic profile of GMSM was similar across samples, with some differences observed in age and sexual identification. Men recruited through time-location sampling appeared more connected to the gay community, reported a greater number of sexual partners, but engaged in less unprotected anal intercourse with regular (UAIR) or casual partners (UAIC). The RDS sample overestimated the proportion of HIV-positive men and appeared to recruit men with an overall higher number of sexual partners. A single-website survey recruited a sample whose characteristics differed considerably from the population estimates with regard to age, ethnic diversity and behaviour. Data acquired through time-location sampling underestimated the rates of UAIR and UAIC, while RDS and online sampling both generated samples that underestimated UAIR. Simulated composite samples combining recruits from time-location and multi-website online sampling may produce characteristics more consistent with the population estimates, particularly with regard to sexual practices. CONCLUSION: Respondent-driven sampling produced the sample that was most consistent with population estimates

  4. Methodological challenges in collecting social and behavioural data regarding the HIV epidemic among gay and other men who have sex with men in Australia.

    Science.gov (United States)

    Zablotska, Iryna B; Frankland, Andrew; Holt, Martin; de Wit, John; Brown, Graham; Maycock, Bruce; Fairley, Christopher; Prestage, Garrett

    2014-01-01

    Behavioural surveillance and research among gay and other men who have sex with men (GMSM) commonly relies on non-random recruitment approaches. Methodological challenges limit their ability to accurately represent the population of adult GMSM. We compared the social and behavioural profiles of GMSM recruited via venue-based, online, and respondent-driven sampling (RDS) and discussed their utility for behavioural surveillance. Data from four studies were selected to reflect each recruitment method. We compared demographic characteristics and the prevalence of key indicators, including sexual and HIV testing practices, obtained from samples recruited through different methods, and population estimates from respondent-driven sampling partition analysis. Overall, the socio-demographic profile of GMSM was similar across samples, with some differences observed in age and sexual identification. Men recruited through time-location sampling appeared more connected to the gay community, reported a greater number of sexual partners, but engaged in less unprotected anal intercourse with regular (UAIR) or casual partners (UAIC). The RDS sample overestimated the proportion of HIV-positive men and appeared to recruit men with an overall higher number of sexual partners. A single-website survey recruited a sample whose characteristics differed considerably from the population estimates with regard to age, ethnic diversity and behaviour. Data acquired through time-location sampling underestimated the rates of UAIR and UAIC, while RDS and online sampling both generated samples that underestimated UAIR. Simulated composite samples combining recruits from time-location and multi-website online sampling may produce characteristics more consistent with the population estimates, particularly with regard to sexual practices. Respondent-driven sampling produced the sample that was most consistent with population estimates, but this methodology is complex and logistically demanding

  5. Unsettling Australia

    DEFF Research Database (Denmark)

    Jensen, Lars

    This book is a critical intervention into debates on Australia's cultural history. The book demonstrates the interconnectedness of themes commonly seen as separate discursive formations, and shows the fruitfulness of bringing a combined cultural studies and postcolonial approach to bear on a number...

  6. Towards A Model-Based Prognostics Methodology For Electrolytic Capacitors: A Case Study Based On Electrical Overstress Accelerated Aging

    Data.gov (United States)

    National Aeronautics and Space Administration — This paper presents a model-driven methodology for predicting the remaining useful life of electrolytic capacitors. This methodology adopts a Kalman filter...

  7. Application of SADT and ARIS methodologies for modeling and management of business processes of information systems

    Directory of Open Access Journals (Sweden)

    O. V. Fedorova

    2018-01-01

    Full Text Available The article is devoted to the application of the SADT and ARIS methodologies for modeling and managing business processes of information systems. The relevance of this article is beyond doubt, because the design of the architecture of information systems, based on a thorough system analysis of the subject area, is of paramount importance for the development of information systems in general. The authors conducted serious work on the analysis of the application of the SADT and ARIS methodologies for modeling and managing business processes of information systems. The analysis was carried out both in terms of modeling business processes (notation and application of the CASE tool) and in terms of business process management. The first point of view reflects the interaction of the business analyst and the programmer in the development of the information system. The second point of view is the interaction of the business analyst and the customer. The SADT methodology is the basis of many modern methodologies for modeling business processes. Using the methodologies of the IDEF family, it is possible to efficiently display and analyze activity models of a wide range of complex information systems in various aspects. The CASE tool ARIS is a suite of tools for the analysis and modeling of an organization's activities. The methodical basis of ARIS is a set of different modeling methods that reflect different views on the system under study. The authors' conclusions are fully justified. The results of the work can be useful for specialists in the field of modeling business processes of information systems. In addition, the article can inform the development of curricula for students specializing in information technology and management, updating the content and structure of courses on modeling the architecture of information systems and on organization management using models.

  8. Novel Methodology for Functional Modeling and Simulation of Wireless Embedded Systems

    Directory of Open Access Journals (Sweden)

    Sosa Morales Emma

    2008-01-01

    Full Text Available A novel methodology is presented for the modeling and simulation of wireless embedded systems. Tight interaction between the analog and the digital functionality makes the design and verification of such systems a real challenge. The applied methodology brings together the functional models of the baseband algorithms written in C language with the circuit descriptions at behavioral level in Verilog or Verilog-AMS for system simulations in a single-kernel environment. The physical layer of an ultrawideband system has been successfully modeled and simulated. The results confirm that this methodology provides a standardized framework in order to efficiently and accurately simulate complex mixed-signal applications for embedded systems.

  9. Multi-model approach to petroleum resource appraisal using analytic methodologies for probabilistic systems

    Science.gov (United States)

    Crovelli, R.A.

    1988-01-01

    The geologic appraisal model that is selected for a petroleum resource assessment depends upon purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) includes a variety of geologic models, (2) uses an analytic methodology instead of Monte Carlo simulation, (3) possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. © 1988 International Association for Mathematical Geology.
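
    The contrast drawn in requirement (2), analytic methodology versus Monte Carlo simulation, can be illustrated with a simple aggregation example: under independence, the mean and variance of the aggregate follow directly from per-play moments. This is only an illustration of the idea, not the USGS system; the play parameters are hypothetical and a normal size distribution is used for simplicity.

```python
import numpy as np

# Analytic aggregation: for independent plays, the mean and variance of
# total resources follow from per-play moments (risked by the success
# probability), with no sampling. Play parameters are hypothetical.
plays = [  # (success probability, conditional mean, conditional std) in MMbbl
    (0.30, 120.0, 60.0),
    (0.55, 40.0, 15.0),
    (0.15, 300.0, 200.0),
]

mean = sum(p * m for p, m, s in plays)
var = sum(p * (s**2 + m**2) - (p * m)**2 for p, m, s in plays)
print(round(mean, 2), round(var**0.5, 2))   # analytic aggregate mean and std

# Monte Carlo cross-check of the same aggregate (normal sizes used for
# simplicity; a real system would use a non-negative size distribution).
rng = np.random.default_rng(2)
n = 200_000
total = np.zeros(n)
for p, m, s in plays:
    hit = rng.random(n) < p
    total += hit * rng.normal(m, s, n)
print(round(total.mean(), 2), round(total.std(), 2))
```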

  10. Analysis of Feedback processes in Online Group Interaction: a methodological model

    Directory of Open Access Journals (Sweden)

    Anna Espasa

    2013-06-01

    Full Text Available The aim of this article is to present a methodological model to analyze students' group interaction to improve their essays in online learning environments, based on asynchronous and written communication. In these environments teacher and student scaffolds for discussion are essential to promote interaction. One of these scaffolds can be feedback. Research on feedback processes has predominantly focused on feedback design rather than on how students utilize feedback to improve learning. This methodological model fills this gap, contributing to the analysis of how feedback processes are implemented while students discuss collaboratively in a specific case of writing assignments. A review of different methodological models was carried out to define a framework adjusted to the analysis of the relationship between written and asynchronous group interaction, and students' activity and the changes incorporated into the final text. The model proposed includes the following dimensions: (1) student participation, (2) nature of student learning, and (3) quality of student learning. The main contribution of this article is to present the methodological model and also to ascertain the model's operativity regarding how students incorporate such feedback into their essays.

  11. Cervical cancer screening in Australia: modelled evaluation of the impact of changing the recommended interval from two to three years

    Directory of Open Access Journals (Sweden)

    Howard Kirsten

    2010-11-01

    Full Text Available Background The National Cervical Screening Program in Australia currently recommends that sexually active women between the ages of 18-70 years attend routine screening every 2 years. The publicly funded National HPV Vaccination Program commenced in 2007, with catch-up in females aged 12-26 years conducted until 2009; and this may prompt consideration of whether the screening interval and other aspects of the organized screening program could be reviewed. The aim of the current evaluation was to assess the epidemiologic outcomes and cost implications of changing the recommended screening interval in Australia to 3 years. Methods We used a modelling approach to evaluate the effects of moving to a 3-yearly recommended screening interval. We used data from the Victorian Cervical Cytology Registry over the period 1997-2007 to model compliance with routine screening under current practice, and registry data from other countries with 3-yearly recommendations to inform assumptions about future screening behaviour under two alternative systems for screening organisation - retention of a reminder-based system (as in New Zealand), or a move to a call-and-recall system (as in England). Results A 3-yearly recommendation is predicted to be of similar effectiveness to the current 2-yearly recommendation, resulting in no substantial change to the total number of incident cervical cancer cases or cancer deaths, or to the estimated 0.68% average cumulative lifetime risk of cervical cancer in unvaccinated Australian women. However, a 3-yearly screening policy would be associated with decreases in the annual number of colposcopy and biopsy procedures performed (by 4-10%) and decreases in the number of treatments for pre-invasive lesions (by 2-4%). The magnitude of the decrease in the number of diagnostic procedures and treatments would depend on the method of screening organization, with call-and-recall screening associated with the highest reductions. The

  12. Summary of the Supplemental Model Reports Supporting the Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    Brownson, D. A.

    2002-01-01

    The Department of Energy (DOE) Office of Civilian Radioactive Waste Management (OCRWM) has committed to a series of model reports documenting the methodology to be utilized in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000). These model reports detail and provide validation of the methodology to be utilized for criticality analyses related to: (1) Waste form/waste package degradation; (2) Waste package isotopic inventory; (3) Criticality potential of degraded waste form/waste package configurations (effective neutron multiplication factor); (4) Probability of criticality (for each potential critical configuration as well as total event); and (5) Criticality consequences. The purpose of this summary report is to provide a status of the model reports and a schedule for their completion. This report also provides information relative to the model report content and validation. The model reports and their revisions are being generated as a result of: (1) Commitments made in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000); (2) Open Items from the Safety Evaluation Report (Reamer 2000); (3) Key Technical Issue agreements made during DOE/U.S. Nuclear Regulatory Commission (NRC) Technical Exchange Meeting (Reamer and Williams 2000); and (4) NRC requests for additional information (Schlueter 2002)

  13. The Development of Marine Accidents Human Reliability Assessment Approach: HEART Methodology and MOP Model

    Directory of Open Access Journals (Sweden)

    Ludfi Pratiwi Bowo

    2017-06-01

    Full Text Available Humans are one of the important factors in the assessment of accidents, particularly marine accidents. Hence, studies are conducted to assess the contribution of human factors in accidents. Two generations of Human Reliability Assessment (HRA) methodologies have been developed, classified by their differing viewpoints on problem-solving. Accident analysis can be performed using three techniques: sequential techniques, epidemiological techniques and systemic techniques, with marine accidents falling under the epidemiological technique. This study compares the Human Error Assessment and Reduction Technique (HEART) methodology and the 4M Overturned Pyramid (MOP) model, which are applied to assess marine accidents. Furthermore, the MOP model can effectively describe the relationships of the other factors which affect accidents, whereas the HEART methodology is focused only on human factors.
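
    The quantification step of the HEART methodology compared in this study follows a standard formula: a nominal human error probability for the generic task type is multiplied by each error-producing condition's maximum effect, weighted by the assessor's proportion of affect. A sketch with illustrative values follows.

```python
# Sketch of HEART quantification: the generic task's nominal human error
# probability (HEP) is scaled by each error-producing condition (EPC),
# weighted by the assessed proportion of affect (APOA). Values below are
# illustrative, not taken from the paper.
def heart_hep(nominal_hep, epcs):
    """epcs: list of (max_effect, apoa) pairs, with 0 <= apoa <= 1."""
    hep = nominal_hep
    for max_effect, apoa in epcs:
        hep *= (max_effect - 1.0) * apoa + 1.0
    return min(hep, 1.0)

# e.g. nominal HEP 0.003 with two EPCs: time shortage (x11, 40% assessed
# effect) and operator inexperience (x3, 60% assessed effect).
print(heart_hep(0.003, [(11, 0.4), (3, 0.6)]))   # -> 0.033
```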

  14. An innovative approach to address homelessness in regional Australia: Participant evaluation of a co-payment model.

    Science.gov (United States)

    Jacups, S; Rogerson, B; Kinchin, I

    2018-03-01

    Homelessness is not only about lack of secure housing; it is sometimes caused by reasons as simple as a lack of money to travel home. The purpose of this study was to investigate whether the participant co-funded assistance program ('Return to Country' [R2C]), when offered to low socio-economic individuals experiencing homelessness, represented an effective use of scarce resources. In northern Australia, a remote and sparsely populated area, Indigenous persons who travel to regional centres cannot always afford airfares home; they therefore become stranded away from their 'country', leading to rapidly deteriorating health, isolation and separation from family and kin. The R2C program was designed to facilitate travel for persons who were temporarily stranded and were voluntarily seeking to return home. The program provided operational support and funding (participants co-funded AU$99) to participants to return home. Using a descriptive, case series research design, university researchers independently evaluated the R2C program using semi-structured interviews with 37 participants. An investment of AU$970 per participant in the program with partial co-payment was associated with high participant acceptability and satisfaction, in line with harm reduction around substance abuse and crime, which is suggestive of long-term success for the model. Findings from this study can contribute to the development of best practice guidelines and policies that specifically address the needs of this unique population of stranded persons who are seeking to return home. The co-payment model can be adopted by policy makers involved in homelessness prevention elsewhere in Australia or internationally as an add-on to mainstream housing support. Copyright © 2017 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  15. Methodologies for Wind Turbine and STATCOM Integration in Wind Power Plant Models for Harmonic Resonances Assessment

    DEFF Research Database (Denmark)

    Freijedo Fernandez, Francisco Daniel; Chaudhary, Sanjay Kumar; Guerrero, Josep M.

    2015-01-01

    This paper approaches modelling methodologies for the integration of wind turbines and STATCOM in harmonic resonance studies. Firstly, an admittance equivalent model representing the harmonic signature of grid-connected voltage source converters is provided. A simplified type IV wind turbine modelling is then straightforward. This linear modelling is suitable to represent the wind turbine in the range of frequencies at which harmonic interactions are likely. Even though the admittance method is suitable both for frequency and time domain studies, some limitations arise in practice when implementing it in the time domain. As an alternative, a power-based averaged modelling is also proposed. Type IV wind turbine harmonic signature and STATCOM active harmonic mitigation are considered for the simulation case studies. Simulation results provide a good insight into the features and limitations of the proposed methodologies.
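
    The admittance-equivalent idea can be illustrated with a frequency-domain resonance scan: the converter and the grid are each reduced to admittances at the point of connection, and peaks of the resulting impedance locate likely harmonic resonances. All component values below are invented for illustration.

```python
import numpy as np

# Frequency-domain resonance scan with admittance equivalents: the turbine
# converter is treated as a harmonic current source behind an admittance,
# in parallel with the grid admittance at the point of connection (POC).
f1 = 50.0                                 # fundamental frequency [Hz]
h = np.arange(2, 50)                      # harmonic orders scanned
w = 2 * np.pi * f1 * h

Lg, Rg, Cc = 5e-3, 0.5, 20e-6             # grid R-L branch and cable capacitance
Y_grid = 1.0 / (Rg + 1j * w * Lg) + 1j * w * Cc

Rf, Lf = 0.1, 2e-3                        # converter output filter (assumed)
Y_conv = 1.0 / (Rf + 1j * w * Lf)

Z_poc = 1.0 / (Y_grid + Y_conv)           # impedance seen by harmonic injections
print(f"parallel resonance near harmonic order {h[np.argmax(np.abs(Z_poc))]}")
```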

  16. Australia: Population.

    Science.gov (United States)

    The Australian Bureau of Census and Statistics reported on 27 August 1979 that Australia's total population was 14,376,400 at the end of the first quarter of 1979. Net immigration gain during the same period was 12,700. Natural increase was 32,100--births were 57,100 and deaths were 25,000. In January 1979, Australia introduced a new immigration scheme to improve methods of selecting immigrants. Points are awarded on the basis of personal qualities and employability; an applicant must score 60 out of 100. This scheme supersedes the earlier system under which immigrants were selected on the family reunion criterion and employability. Migrants from Britain and Ireland made up the bulk of the newcomers, but their proportion has dropped from 50% in the mid-1960s to 30% in early 1979. In contrast, Asian immigrants have risen from 2% to 22% over the same period. Asian immigration began in the mid-1960s with the relaxation of the "White Australia" policy which barred non-European migrants, and increased when the ban was abolished by Prime Minister Gough Whitlam in 1973.

  17. From LCAs to simplified models: a generic methodology applied to wind power electricity.

    Science.gov (United States)

    Padey, Pierryves; Girard, Robin; le Boulch, Denis; Blanc, Isabelle

    2013-02-05

    This study presents a generic methodology to produce simplified models able to provide a comprehensive life cycle impact assessment of energy pathways. The methodology relies on the application of global sensitivity analysis to identify key parameters explaining the impact variability of systems over their life cycle. Simplified models are built upon the identification of such key parameters. The methodology is applied to one energy pathway: onshore wind turbines of medium size, considering a large sample of possible configurations representative of European conditions. Among several technological, geographical, and methodological parameters, we identified the turbine load factor and the wind turbine lifetime as the most influential parameters. Greenhouse Gas (GHG) performances have been plotted as a function of the identified key parameters. Using these curves, the GHG performance of a specific wind turbine can be estimated, thus avoiding the undertaking of an extensive Life Cycle Assessment (LCA). This methodology should be useful for decision makers, providing them with a robust but simple support tool for assessing the environmental performance of energy systems.
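
    A simplified model of the kind described reduces, for GHG intensity, to spreading (mostly manufacturing-phase) embodied emissions over lifetime electricity production, which is why the load factor and lifetime dominate. The sketch below assumes a placeholder embodied-emissions figure, not the study's fitted values.

```python
import numpy as np

# Sketch of the simplified-model idea: life-cycle GHG intensity of an
# onshore turbine as a function of the two key parameters identified by
# the sensitivity analysis (load factor, lifetime). The embodied-emissions
# figure is an assumed placeholder, not the study's value.
def ghg_intensity(load_factor, lifetime_yr, rated_mw=2.0,
                  embodied_tco2_per_mw=700.0):
    """Returns gCO2-eq per kWh produced over the turbine's life."""
    produced_kwh = rated_mw * 1000.0 * load_factor * 8760.0 * lifetime_yr
    embodied_g = embodied_tco2_per_mw * rated_mw * 1e6
    return embodied_g / produced_kwh

lf = np.linspace(0.15, 0.40, 6)
print([round(ghg_intensity(x, 20), 1) for x in lf])  # intensity falls as
                                                     # the load factor rises
```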

  18. Through the Looking Glass: No Wonderland Yet! (The Reciprocal Relationship between Methodology and Models of Reality).

    Science.gov (United States)

    Unger, Rhoda Kesler

    1983-01-01

    Discusses the relationship between conceptual frameworks and methodology in psychology. Argues that models of reality influence research in terms of question selection, causal factors hypothesized, and interpretation of data. Considers the position and role of women as objects and agents of research using a sociology of knowledge perspective.…

  19. Nirex methodology for scenario and conceptual model development. An international peer review

    International Nuclear Information System (INIS)

    1999-06-01

    Nirex has responsibilities for nuclear waste management in the UK. The company's top level objectives are to maintain technical credibility on deep disposal, to gain public acceptance for a deep geologic repository, and to provide relevant advice to customers on the safety implications of their waste packaging proposals. Nirex utilizes peer reviews as appropriate to keep its scientific tools up-to-date and to periodically verify the quality of its products. The NEA formed an International Review Team (IRT) consisting of four internationally recognised experts plus a member of the NEA Secretariat. The IRT performed an in-depth analysis of five Nirex scientific reports identified in the terms of reference of the review. The review was to primarily judge whether the Nirex methodology provides an adequate framework to support the building of a future licensing safety case. Another objective was to judge whether the methodology could aid in establishing a better understanding, and, ideally, enhance acceptance of a repository among stakeholders. Methodologies for conducting safety assessments include at a very basic level the identification of features, events, and processes (FEPs) relevant to the system at hand, their convolution in scenarios for analysis, and the formulation of conceptual models to be addressed through numerical modelling. The main conclusion of the IRT is that Nirex has developed a potentially sound methodology for the identification and analysis of FEPs and for the identification of conceptual model needs and model requirements. The work is still in progress and is not yet complete. (R.P.)

  20. A Methodological Review of Structural Equation Modelling in Higher Education Research

    Science.gov (United States)

    Green, Teegan

    2016-01-01

    Despite increases in the number of articles published in higher education journals using structural equation modelling (SEM), research addressing their statistical sufficiency, methodological appropriateness and quantitative rigour is sparse. In response, this article provides a census of all covariance-based SEM articles published up until 2013…

  1. Projecting future expansion of invasive species: comparing and improving methodologies for species distribution modeling.

    Science.gov (United States)

    Mainali, Kumar P; Warren, Dan L; Dhileepan, Kunjithapatham; McConnachie, Andrew; Strathie, Lorraine; Hassan, Gul; Karki, Debendra; Shrestha, Bharat B; Parmesan, Camille

    2015-12-01

    … However, discrepancies between model predictions and parthenium invasion in Australia indicate successful management for this globally significant weed. © 2015 John Wiley & Sons Ltd.

  2. Modelling and Monitoring in Preparedness for Nuclear Powered Warship Visits in Australia

    Energy Technology Data Exchange (ETDEWEB)

    Grzechnik, Marcus P.; Orr, Blake W.; Bokor, Ilonka; Solomon, Stephen B. [Australian Radiation Protection and Nuclear Safety Agency, 619 Lower Plenty Road, Yallambie, Victoria, 3084 (Australia)

    2014-07-01

    As part of reciprocal inter-governmental arrangements, Australia hosts regular Government-approved peacetime visits of naval vessels. These vessels can be conventionally or nuclear powered. Because of the nature of Nuclear Powered Warships (NPWs), special procedures have been adopted to ensure that the safety of the general public is maintained during visits by such vessels (Nimitz-class carriers or submarines). These procedures include Conditions of Entry and the arrangements for visits, as well as contingency arrangements in the unlikely event of an accident resulting in a hazardous release of radioactivity to the environment. The Australian Radiation Protection and Nuclear Safety Agency (ARPANSA) is involved in an inter-departmental committee, the Visiting Ships Panel (Nuclear) (VSP(N)), which oversees the arrangements for visits to Australian ports by NPWs. This panel includes representatives from the Navy and the jurisdictions which host NPW visits. In addition to the existing arrangements (including reference accident scenarios and sampling of sediment and local seafood), ARPANSA has initiated programmes to: - Present automated atmospheric dispersion products based on current and predicted weather; these are expected to be housed on a secure web site, with outputs tailored to the needs of first responders using the ARGOS decision support tool. - Monitor water concentrations within NPW ports prior to visits, in order to establish baseline values. The monitoring (for Cs-137) involves a high-volume filtering and extraction technique which has been developed at ARPANSA and will be discussed. An update on progress will be given, including an overview of relevant systems, procedures in place and work to be completed. Issues to be resolved and lessons learned will also be considered. (authors)

  3. On the fit of models to covariances and methodology to the Bulletin.

    Science.gov (United States)

    Bentler, P M

    1992-11-01

    It is noted that 7 of the 10 top-cited articles in the Psychological Bulletin deal with methodological topics. One of these is the Bentler-Bonett (1980) article on the assessment of fit in covariance structure models. Some context is provided on the popularity of this article. In addition, a citation study of methodology articles appearing in the Bulletin since 1978 was carried out. It verified that publications in design, evaluation, measurement, and statistics continue to be important to psychological research. Some thoughts are offered on the role of the journal in making developments in these areas more accessible to psychologists.
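
    For context, the fit index introduced in the cited Bentler-Bonett (1980) article, the normed fit index, is the proportional chi-square improvement of the hypothesised model over the independence model; a one-line sketch with illustrative values follows.

```python
# Sketch of the Bentler-Bonett (1980) normed fit index (NFI): the
# proportional improvement in chi-square moving from the independence
# (null) model to the hypothesised model. Values are illustrative.
def normed_fit_index(chi2_null, chi2_model):
    return (chi2_null - chi2_model) / chi2_null

print(normed_fit_index(chi2_null=1250.0, chi2_model=85.0))  # ~0.93; values
# above roughly 0.90 are the conventional rule of thumb for acceptable fit
```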

  4. Validation of a model with climatic and flow scenario analysis: case of Lake Burrumbeet in southeastern Australia.

    Science.gov (United States)

    Yihdego, Yohannes; Webb, John

    2016-05-01

    Forecast evaluation is an important topic that addresses the development of reliable hydrological probabilistic forecasts, mainly through the use of climate uncertainties. Validation is often neglected in hydrology, even though model parameters are uncertain and the structure of the model can be incorrectly chosen. A calibrated and verified dynamic hydrologic water balance spreadsheet model has been used to assess the effect of climate variability on Lake Burrumbeet, southeastern Australia. The model has been verified against lake level, lake volume, lake surface area, surface outflow and lake salinity. The current study aims to increase confidence in lake level predictions through historical validation for the years 2008-2013 under different climatic scenarios. The observed climatic conditions (2008-2013) match a hybrid of scenarios fairly well, since the period corresponds to both dry and wet climatic conditions. Besides the uncertainty in hydrologic stresses, uncertainty in the calibrated model is among the major drawbacks involved in making scenario simulations. In line with this, the uncertainty in the calibrated model was tested using sensitivity analysis, which showed that errors in the model can largely be attributed to erroneous estimates of evaporation and rainfall, and to a lesser extent surface inflow. The study demonstrates that several climatic scenarios should be analysed, with a combination of extreme climate, stream flow and climate change instead of one assumed climatic sequence, to improve climate variability prediction in the future. Performing such scenario analysis is a valid exercise to comprehend the uncertainty in the model structure and hydrology in a meaningful way, without missing scenarios that, even if considered less probable, may ultimately turn out to be crucial for decision making, and will definitely increase the confidence of model prediction for management of the water
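
    A dynamic lake water-balance model of the general kind described can be sketched as a monthly volume update, dV = (P - E) * A(V) + Qin - Qout. The area-volume relation and all forcing series below are hypothetical, not Lake Burrumbeet data.

```python
import numpy as np

# Minimal monthly water-balance sketch: dV = (P - E) * A(V) + Qin - Qout.
# The area-volume power law and all forcing series are hypothetical.
def simulate(v0, rain_m, evap_m, inflow, outflow, k_area=2.5e3):
    v, vols = v0, []
    for p, e, qi, qo in zip(rain_m, evap_m, inflow, outflow):
        area = k_area * v ** 0.5                 # assumed area-volume relation [m^2]
        v = max(v + (p - e) * area + qi - qo, 0.0)
        vols.append(v)
    return np.array(vols)

months = 24
rng = np.random.default_rng(3)
vols = simulate(5e6,
                rng.gamma(2.0, 0.02, months),    # rainfall depth [m/month]
                np.full(months, 0.09),           # evaporation depth [m/month]
                rng.gamma(2.0, 5e4, months),     # surface inflow [m^3/month]
                np.full(months, 2e4))            # outflow/extraction [m^3/month]
print(vols[:6].round(0))                         # simulated volume trajectory [m^3]
```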

  5. Australia's uranium opportunities

    Energy Technology Data Exchange (ETDEWEB)

    Alder, K.

    1996-12-31

    The book is a personal account by an insider who was deeply involved in the rise and fall of the Australian Atomic Energy Commission (AAEC), and in particular in its efforts to bring Australia into the nuclear age. It reveals the thinking behind the Commission's research programmes and major projects, such as the centrifuge enrichment program and Jervis Bay Nuclear Power project. It shows how politics, politicians and sensational journalism had disastrous effects on the AAEC, its programmes and aspirations. ills.

  6. A statistical methodology for quantification of uncertainty in best estimate code physical models

    International Nuclear Information System (INIS)

    Vinai, Paolo; Macian-Juan, Rafael; Chawla, Rakesh

    2007-01-01

    A novel uncertainty assessment methodology, based on a statistical non-parametric approach, is presented in this paper. It achieves quantification of code physical model uncertainty by making use of model performance information obtained from studies of appropriate separate-effect tests. Uncertainties are quantified in the form of estimated probability density functions (pdf's), calculated with a newly developed non-parametric estimator. The new estimator objectively predicts the probability distribution of the model's 'error' (its uncertainty) from databases reflecting the model's accuracy on the basis of available experiments. The methodology is completed by applying a novel multi-dimensional clustering technique based on the comparison of model error samples with the Kruskall-Wallis test. This takes into account the fact that a model's uncertainty depends on system conditions, since a best estimate code can give predictions for which the accuracy is affected by the regions of the physical space in which the experiments occur. The final result is an objective, rigorous and accurate manner of assigning uncertainty to coded models, i.e. the input information needed by code uncertainty propagation methodologies used for assessing the accuracy of best estimate codes in nuclear systems analysis. The new methodology has been applied to the quantification of the uncertainty in the RETRAN-3D void model and then used in the analysis of an independent separate-effect experiment. This has clearly demonstrated the basic feasibility of the approach, as well as its advantages in yielding narrower uncertainty bands in quantifying the code's accuracy for void fraction predictions
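
    Two ingredients of the methodology can be sketched with standard tools: a non-parametric estimate of the model-error pdf (a kernel density estimate stands in here for the paper's own estimator) and a Kruskal-Wallis comparison of error samples from different regions of the physical space. The residual samples are synthetic.

```python
import numpy as np
from scipy.stats import gaussian_kde, kruskal

# (Synthetic) code-vs-experiment residuals for two flow regimes; in the
# paper these would be void-fraction errors from separate-effect tests.
rng = np.random.default_rng(4)
err_low_flow = rng.normal(0.02, 0.05, 60)
err_high_flow = rng.normal(0.06, 0.04, 45)

# (1) Non-parametric pdf of the pooled error sample (KDE as a stand-in
# for the paper's own estimator).
pdf = gaussian_kde(np.concatenate([err_low_flow, err_high_flow]))
grid = np.linspace(-0.2, 0.3, 5)
print(pdf(grid).round(3))

# (2) Kruskal-Wallis test: do the regimes share one error population,
# or should uncertainty bands be clustered by system conditions?
stat, p = kruskal(err_low_flow, err_high_flow)
print(p < 0.05)   # True -> keep condition-dependent uncertainty clusters
```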

  7. Long-term evaluation of benefits, harms, and cost-effectiveness of the National Bowel Cancer Screening Program in Australia: A modelling study

    NARCIS (Netherlands)

    Lew, J.-B. (Jie-Bin); St John, D.J.B. (D James B); X.-M. Xu (Xiang-Ming); M.J.W. Greuter (Marcel); Caruana, M. (Michael); D.R. Cenin (Dayna R.); He, E. (Emily); Saville, M. (Marion); Grogan, P. (Paul); V.M.H. Coupé (Veerle); K. Canfell (Karen)

    2017-01-01

    Background: No assessment of the National Bowel Cancer Screening Program (NBCSP) in Australia, which considers all downstream benefits, costs, and harms, has been done. We aimed to use a comprehensive natural history model and the most recent information about cancer treatment costs to estimate

  8. Simplified life cycle assessment models: methodological framework and applications to energy pathways

    International Nuclear Information System (INIS)

    Padey, Pierryves

    2013-01-01

    The energy transition debate is a key issue for today and the coming years. One of the challenges is to limit the environmental impacts of electricity production. Decision support tools that are sufficiently accurate, simple to use, account for environmental aspects and inform future energy choices must be implemented. However, the environmental assessment of energy pathways is complex, and it requires characterization at two levels. The 'energy pathway' level is the first and corresponds to the distribution of environmental impacts across a pathway, used to compare pathways overall. The 'system' level is the second and compares the environmental impacts of systems within each pathway. We have devised a generic methodology covering both characterization levels, estimating an energy pathway's environmental profile while allowing a simple comparison of the environmental impacts of its systems. This methodology is based on the definition of a parameterized Life Cycle Assessment model and considers, through a Global Sensitivity Analysis, the environmental impacts of a large sample of systems representative of an energy pathway. As a second step, the methodology defines simplified models based on the few key parameters identified as inducing the largest variability in the energy pathway's environmental impacts. These models assess the systems' environmental impacts in a simple way, avoiding any complex LCAs. This reduction methodology has been applied to the onshore wind power pathway in Europe and the photovoltaic pathway in France. (author)
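    The reduction step ranks parameters by their contribution to impact variability. As a rough stand-in for the thesis's Global Sensitivity Analysis, the sketch below screens a toy parameterized impact model with standardized regression coefficients, a simple variance-based sensitivity measure; the impact function and parameter ranges are invented, not the actual LCA model.

```python
# Sketch: screen a parameterized LCA model for its most influential
# parameters via standardized regression coefficients (SRC). The impact
# function below is hypothetical, loosely shaped like a wind-power
# gCO2/kWh model (not the paper's model).
import numpy as np

rng = np.random.default_rng(0)
n = 5000
# Hypothetical parameter ranges for an onshore wind system.
capacity_factor = rng.uniform(0.20, 0.45, n)
lifetime_years  = rng.uniform(15, 30, n)
rated_power_mw  = rng.uniform(1.0, 3.0, n)

def impact_gco2_per_kwh(cf, life, p_mw):
    # Fixed manufacturing burden spread over lifetime production (toy model).
    embodied = 1.5e9 * p_mw                    # gCO2 embodied, scales with size
    produced = p_mw * 1e3 * 8760 * cf * life   # kWh over lifetime
    return embodied / produced

y = impact_gco2_per_kwh(capacity_factor, lifetime_years, rated_power_mw)

X = np.column_stack([capacity_factor, lifetime_years, rated_power_mw])
Xs = (X - X.mean(0)) / X.std(0)            # standardize inputs
ys = (y - y.mean()) / y.std()              # standardize output
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, c in zip(["capacity_factor", "lifetime", "rated_power"], src):
    print(f"{name:16s} SRC = {c:+.2f}")
# Parameters with the largest |SRC| are retained in the simplified model.
```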

  9. A Dynamic Defense Modeling and Simulation Methodology using Semantic Web Services

    Directory of Open Access Journals (Sweden)

    Kangsun Lee

    2010-04-01

    Defense Modeling and Simulations require interoperable and autonomous federates in order to fully simulate the complex behavior of war-fighters and to dynamically adapt themselves to various war-game events, commands and controls. In this paper, we propose a semantic web service based methodology to develop war-game simulations. Our methodology encapsulates war-game logic into a set of web services with additional semantic information in WSDL (Web Service Description Language) and OWL (Web Ontology Language). By utilizing the dynamic discovery and binding power of semantic web services, we are able to dynamically reconfigure federates according to various simulation events. An ASuW (Anti-Surface Warfare) simulator is constructed to demonstrate the methodology and successfully shows that the level of interoperability and autonomy can be greatly improved.

  10. Methodology Development for SiC Sensor Signal Modelling in the Nuclear Reactor Radiation Environments

    International Nuclear Information System (INIS)

    Cetnar, J.; Krolikowski, I.P.

    2013-06-01

    This paper deals with SiC detector simulation methodology for signal formation by neutrons and induced secondary radiation, as well as its inverse interpretation. The primary goal is to achieve the SiC capability of simultaneous spectroscopic measurements of neutrons and gamma-rays, for which an appropriate methodology of detector signal modelling and interpretation must be adopted. The process of detector simulation is divided into two basically separate but interconnected sections. The first is the forward simulation of detector signal formation in the field of the primary neutron and secondary radiations, whereas the second is the inverse problem of finding a representation of the primary radiation based on the measured detector signals. The methodology under development is based on the Monte Carlo description of radiation transport and analysis of the reactor physics. The methodology of SiC detector signal interpretation will be based on the existing experience in neutron metrology developed in the past for various neutron and gamma-ray detection systems. Since the novel SiC-based sensors are characterised by a new structure, yet to be finally designed, the methodology for spectroscopic particle fluence measurement must be developed while giving productive feedback to the SiC sensor design process, in order to arrive at the best possible design. (authors)

  11. Knowledge-based and model-based hybrid methodology for comprehensive waste minimization in electroplating plants

    Science.gov (United States)

    Luo, Keqin

    1999-11-01

    The electroplating industry, with over 10,000 plating plants nationwide, is one of the major industrial waste generators. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are generated daily in plants, which costs the industry tremendously in waste treatment and disposal and hinders the further development of the industry. It is therefore urgent for the industry to identify the technically most effective and economically most attractive methodologies and technologies to minimize waste while maintaining production competitiveness. This dissertation aims at developing a novel waste minimization (WM) methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, together with an intelligent decision support tool. The WM methodology consists of two parts: a heuristic knowledge-based qualitative WM decision analysis and support methodology, and a fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules. This becomes the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This development allows us to perform a thorough process analysis on bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation, which are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner. The unique contribution of this research is that it is the first time for the electroplating industry to (i) use systematically available WM strategies, (ii) know quantitatively and

  12. A coupled groundwater-flow-modelling and vulnerability-mapping methodology for karstic terrain management

    Science.gov (United States)

    Kavouri, Konstantina P.; Karatzas, George P.; Plagnes, Valérie

    2017-08-01

    A coupled groundwater-flow-modelling and vulnerability-mapping methodology for the management of karst aquifers with spatial variability is developed. The methodology takes into consideration the duality of flow and recharge in karst and introduces a simple method to integrate the effect of temporal storage in the unsaturated zone. In order to investigate the applicability of the developed methodology, simulation results are validated against available field measurement data. The criteria maps from the PaPRIKa vulnerability-mapping method are used to document the groundwater flow model. The FEFLOW model is employed for the simulation of the saturated zone of the Palaikastro-Chochlakies karst aquifer, on the island of Crete, Greece, for the hydrological years 2010-2012. The simulated water table reproduces typical karst characteristics, such as steep slopes and preferred drain axes, and is in good agreement with field observations. The calculated error indicators, namely Nash-Sutcliffe efficiency (NSE), root mean squared error (RMSE) and model efficiency (E'), are within acceptable value ranges. Results indicate that different storage processes take place in different parts of the aquifer. The north-central part seems to be more sensitive to diffuse recharge, while the southern part is affected primarily by precipitation events. Sensitivity analysis is performed on the parameters of hydraulic conductivity and specific yield. The methodology is used to estimate the feasibility of artificial aquifer recharge (AAR) in the study area. Based on the developed methodology, guidelines are provided for the selection of an appropriate AAR scenario with a positive impact on the water table.
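    The error indicators cited here have standard definitions, sketched below with placeholder head observations; an NSE of 1 indicates a perfect fit and RMSE is expressed in the units of the observations.

```python
# Sketch: the goodness-of-fit indicators cited for the karst model.
# Standard definitions; obs/sim arrays are illustrative placeholders.
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, <0 is worse than the mean."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def rmse(obs, sim):
    """Root mean squared error in the units of the observations."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return float(np.sqrt(np.mean((obs - sim) ** 2)))

observed_heads  = np.array([12.3, 12.1, 11.8, 11.9, 12.4, 12.6])  # m a.s.l.
simulated_heads = np.array([12.2, 12.0, 11.9, 12.0, 12.3, 12.5])

print(f"NSE  = {nse(observed_heads, simulated_heads):.3f}")
print(f"RMSE = {rmse(observed_heads, simulated_heads):.3f} m")
```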

  13. A methodology to annotate systems biology markup language models with the synthetic biology open language.

    Science.gov (United States)

    Roehner, Nicholas; Myers, Chris J

    2014-02-21

    Recently, we have begun to witness the potential of synthetic biology, noted here in the form of bacteria and yeast that have been genetically engineered to produce biofuels, manufacture drug precursors, and even invade tumor cells. The success of these projects, however, has often failed to translate to new projects, a problem exacerbated by a lack of engineering standards that combine descriptions of the structure and function of DNA. To address this need, this paper describes a methodology to connect the systems biology markup language (SBML) to the synthetic biology open language (SBOL), existing standards that describe biochemical models and DNA components, respectively. Our methodology involves first annotating SBML model elements such as species and reactions with SBOL DNA components. A graph is then constructed from the model, with vertices corresponding to elements within the model and edges corresponding to the cause-and-effect relationships between these elements. Lastly, the graph is traversed to assemble the annotating DNA components into a composite DNA component, which is used to annotate the model itself and can be referenced by other composite models and DNA components. In this way, our methodology can be used to build up a hierarchical library of models annotated with DNA components. Such a library is a useful input to any future genetic technology mapping algorithm that would automate the process of composing DNA components to satisfy a behavioral specification. Our methodology for SBML-to-SBOL annotation is implemented in the latest version of our genetic design automation (GDA) software tool, iBioSim.
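    The graph step can be sketched with a general-purpose graph library: annotate elements, connect them by cause-and-effect edges, and read the composite part list off a topological traversal. networkx stands in for iBioSim's internals here, and the element and part names are hypothetical.

```python
# Sketch: build a causal graph over annotated model elements and traverse
# it to assemble a composite DNA component. Names are hypothetical.
import networkx as nx

# Vertices: SBML elements annotated with SBOL DNA component IDs.
annotations = {
    "promoter_reaction": "BBa_R0040",   # hypothetical component IDs
    "cds_species":       "BBa_E0040",
    "terminator_param":  "BBa_B0015",
}

g = nx.DiGraph()
# Edges: cause-and-effect relationships between model elements.
g.add_edge("promoter_reaction", "cds_species")
g.add_edge("cds_species", "terminator_param")

# Traverse in causal order and assemble the composite DNA component.
composite = [annotations[v] for v in nx.topological_sort(g)
             if v in annotations]
print("Composite DNA component sub-parts:", composite)
```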

  14. Estimates of potential childhood lead exposure from contaminated soil using the US EPA IEUBK Model in Sydney, Australia.

    Science.gov (United States)

    Laidlaw, Mark A S; Mohmmad, Shaike M; Gulson, Brian L; Taylor, Mark P; Kristensen, Louise J; Birch, Gavin

    2017-07-01

    Surface soils in portions of the Sydney (New South Wales, Australia) urban area are contaminated with lead (Pb), primarily from past use of Pb in gasoline, the deterioration of exterior lead-based paints, and industrial activities. Surface soil samples (n=341) were collected from a depth of 0-2.5 cm at a density of approximately one sample per square kilometre within the Sydney estuary catchment and analysed for lead. The bioaccessibility of soil Pb was analysed in 18 samples. The blood lead level (BLL) of a hypothetical 24 month old child was predicted at soil sampling sites in residential and open land use using the United States Environmental Protection Agency (US EPA) Integrated Exposure Uptake and Biokinetic (IEUBK) model. Other environmental exposures used the Australian National Environmental Protection Measure (NEPM) default values. The IEUBK model predicted a geometric mean BLL of 2.0±2.1 µg/dL using measured soil Pb bioavailability (34%) and 2.4±2.8 µg/dL using the Australian NEPM default assumption (bioavailability = 50%). Assuming children were present and residing at the sampling locations, the IEUBK model incorporating soil Pb bioavailability predicted that 5.6% of the children at the sampling locations could potentially have BLLs exceeding 5 µg/dL and 2.1% could potentially have BLLs exceeding 10 µg/dL. These estimates are consistent with BLLs previously measured in children in Sydney.
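    The IEUBK model itself is not reproduced here, but the step from a geometric mean and geometric standard deviation to an exceedance fraction can be illustrated under a single lognormal assumption, as sketched below. This population-level simplification will not reproduce the published 5.6%/2.1% figures, which aggregate site-by-site predictions.

```python
# Sketch: fraction of BLLs above a threshold implied by a lognormal
# distribution with a given geometric mean (GM) and geometric standard
# deviation (GSD). Simplification only: the paper's figures come from
# site-by-site IEUBK runs, not one population-level distribution.
import math
from scipy.stats import norm

def fraction_exceeding(gm, gsd, threshold):
    z = (math.log(threshold) - math.log(gm)) / math.log(gsd)
    return 1.0 - norm.cdf(z)

gm, gsd = 2.0, 2.1          # ug/dL, bioavailability-based run from the study
for thr in (5.0, 10.0):
    print(f"P(BLL > {thr:>4} ug/dL) = {fraction_exceeding(gm, gsd, thr):.1%}")
```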

  15. Population Screening for Hereditary Haemochromatosis in Australia: Construction and Validation of a State-Transition Cost-Effectiveness Model.

    Science.gov (United States)

    de Graaff, Barbara; Si, Lei; Neil, Amanda L; Yee, Kwang Chien; Sanderson, Kristy; Gurrin, Lyle C; Palmer, Andrew J

    2017-03-01

    HFE-associated haemochromatosis, the most common monogenic disorder amongst populations of northern European ancestry, is characterised by iron overload. Excess iron is stored in parenchymal tissues, leading to morbidity and mortality. Population screening programmes are likely to improve early diagnosis, thereby decreasing associated disease. Our aim was to develop and validate a health economics model of screening using utilities and costs from a haemochromatosis cohort. A state-transition model was developed with Markov states based on disease severity. Australian males (aged 30 years) and females (aged 45 years) of northern European ancestry were the target populations. The screening strategy was the status quo approach in Australia; the model was run over a lifetime horizon. Costs were estimated from the government perspective and reported in 2015 Australian dollars ($A); costs and quality-adjusted life-years (QALYs) were discounted at 5% annually. Model validity was assessed using goodness-of-fit analyses. Second-order Monte-Carlo simulation was used to account for uncertainty in multiple parameters. For validity, the model reproduced mortality, life expectancy (LE) and prevalence rates in line with published data. LE for C282Y homozygote males and females were 49.9 and 40.2 years, respectively, slightly lower than population rates. Mean (95% confidence interval) QALYs were 15.7 (7.7-23.7) for males and 14.4 (6.7-22.1) for females. Mean discounted lifetime costs for C282Y homozygotes were $A22,737 (3670-85,793) for males and $A13,840 (1335-67,377) for females. Sensitivity analyses revealed that discount rates and prevalence had the greatest impacts on outcomes. We have developed a transparent, validated health economics model of C282Y homozygote haemochromatosis. The model will be useful to decision makers to identify cost-effective screening strategies.
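    A state-transition model of this kind advances a cohort through health states with annual transition probabilities, accumulating discounted costs and QALYs. The sketch below is a minimal Markov cohort model using the paper's 5% discount rate; the states, probabilities, costs and utilities are invented placeholders, not the published parameters.

```python
# Sketch: a minimal Markov cohort model with annual cycles and 5%
# discounting. All parameters are illustrative placeholders.
import numpy as np

states = ["asymptomatic", "symptomatic", "cirrhosis", "dead"]
P = np.array([  # row = from-state, column = to-state (annual probabilities)
    [0.95, 0.04, 0.00, 0.01],
    [0.00, 0.90, 0.07, 0.03],
    [0.00, 0.00, 0.90, 0.10],
    [0.00, 0.00, 0.00, 1.00],
])
utility = np.array([0.90, 0.75, 0.55, 0.0])   # QALY weight per state-year
cost    = np.array([100., 900., 7000., 0.0])  # $A per state-year

cohort = np.array([1.0, 0.0, 0.0, 0.0])       # start everyone asymptomatic
discount = 0.05
qalys = dollars = 0.0
for year in range(60):                        # lifetime horizon
    d = 1.0 / (1.0 + discount) ** year
    qalys   += d * cohort @ utility
    dollars += d * cohort @ cost
    cohort = cohort @ P                       # advance one annual cycle
print(f"Discounted QALYs: {qalys:.2f}, discounted cost: $A{dollars:,.0f}")
```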

  16. A hierarchical modeling methodology for the definition and selection of requirements

    Science.gov (United States)

    Dufresne, Stephane

    This dissertation describes the development of a requirements analysis methodology that takes into account the concept of operations and the hierarchical decomposition of aerospace systems. At the core of the methodology, the Analytic Network Process (ANP) is used to ensure the traceability between the qualitative and quantitative information present in the hierarchical model. The proposed methodology is implemented to the requirements definition of a hurricane tracker Unmanned Aerial Vehicle. Three research objectives are identified in this work; (1) improve the requirements mapping process by matching the stakeholder expectations with the concept of operations, systems and available resources; (2) reduce the epistemic uncertainty surrounding the requirements and requirements mapping; and (3) improve the requirements down-selection process by taking into account the level of importance of the criteria and the available resources. Several challenges are associated with the identification and definition of requirements. The complexity of the system implies that a large number of requirements are needed to define the systems. These requirements are defined early in the conceptual design, where the level of knowledge is relatively low and the level of uncertainty is large. The proposed methodology intends to increase the level of knowledge and reduce the level of uncertainty by guiding the design team through a structured process. To address these challenges, a new methodology is created to flow-down the requirements from the stakeholder expectations to the systems alternatives. A taxonomy of requirements is created to classify the information gathered during the problem definition. Subsequently, the operational and systems functions and measures of effectiveness are integrated to a hierarchical model to allow the traceability of the information. Monte Carlo methods are used to evaluate the variations of the hierarchical model elements and consequently reduce the

  17. Increase in Total Joint Arthroplasty Projected from 2014 to 2046 in Australia: A Conservative Local Model With International Implications.

    Science.gov (United States)

    Inacio, Maria C S; Graves, Stephen E; Pratt, Nicole L; Roughead, Elizabeth E; Nemes, Szilard

    2017-08-01

    The incidence of joint arthroplasty is increasing worldwide. International estimates of future demand for joint arthroplasty have used models that propose either an exponential future increase, despite obvious system constraints, or static increases, which do not account for past trends. Country-specific projection estimates that address the limitations of past projections are necessary. In Australia, a high-income country with the 7th highest incidence of TKA and 15th highest incidence of THA among the Organization for Economic Cooperation and Development (OECD) countries, the volume of TKAs and THAs increased 198% between 1994 and 2014. To determine the projected incidence and volume of primary TKAs and THAs from 2014 to 2046 in the Australian population older than 40 years, Australian State and Territory Health Department data were used to identify TKAs and THAs performed between 1994-95 and 2013-14. The Australian Bureau of Statistics was the source of the population estimates for the same periods and of population-projected estimates until 2046. The incidence rate (IR), 95% CI, and prediction interval (PI) of TKAs and THAs per 100,000 Australian citizens older than 40 years were calculated. Future IRs were estimated using a logistic model, and volume was calculated from projected IR and population. The logistic growth model assumes the existence of an upper limit of the TKA and THA incidences and a growth rate directly related to this incidence. At the beginning, when the observed incidence is much lower than the asymptote, the increase is exponential, but it decreases as the incidence approaches the upper limit. A 66% increase in the IR of primary THAs between 2013 and 2046 is projected for Australia (2013: IR = 307 per 100,000 [95% CI, 262-329 per 100,000] compared with 2046: IR = 510 per 100,000 [95% PI, 98-567 per 100,000]), which translates to a 219% increase in volume over this period. For TKAs the IR is expected to increase by 26% by 2046 (IR = 575 per
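    The logistic assumption can be sketched as follows: fit IR(t) = K / (1 + exp(-r(t - t0))) to historical incidence rates and evaluate the curve at future years. The observations below are synthetic stand-ins for the registry data, so the fitted values are illustrative only.

```python
# Sketch: fitting a logistic growth curve to incidence rates and
# projecting forward, mirroring the paper's upper-limit assumption.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Incidence rate with asymptote K, growth rate r, midpoint t0."""
    return K / (1.0 + np.exp(-r * (t - t0)))

years = np.arange(1994, 2015)
# Synthetic THA incidence per 100,000 (roughly 200 -> ~310 over 1994-2014).
rates = (200 + 110 * (years - 1994) / 20
         + np.random.default_rng(1).normal(0, 5, years.size))

popt, _ = curve_fit(logistic, years, rates, p0=[600, 0.1, 2010],
                    bounds=([300, 0.01, 1990], [2000, 1.0, 2060]))
K, r, t0 = popt
ir_2046 = logistic(2046, *popt)
print(f"Fitted asymptote K = {K:.0f} per 100,000; "
      f"projected IR(2046) = {ir_2046:.0f}")
# Volume projection = projected IR x projected population / 100,000.
```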

  18. Forecasting the Number of Australian Tourist Arrivals to Bali Using the Time Varying Parameter (TVP) Model

    Directory of Open Access Journals (Sweden)

    I PUTU GEDE DIAN GERRY SUWEDAYANA

    2016-08-01

    The purpose of this research is to forecast the number of Australian tourist arrivals to Bali using the Time Varying Parameter (TVP) model, with Indonesian inflation and the AUD-to-IDR exchange rate from January 2010 to December 2015 as explanatory variables. The TVP model is specified in state-space form and estimated by the Kalman filter algorithm. The results show that the TVP model can be used to forecast the number of Australian tourist arrivals to Bali because it satisfies the assumptions that the residuals are normally distributed and that the residuals of the measurement and transition equations are uncorrelated. The estimated TVP model is . The model has a mean absolute percentage error (MAPE) equal to and a root mean square percentage error (RMSPE) equal to . The number of Australian tourist arrivals to Bali for the next five periods (January-May 2016) is predicted as: ; ; ; ; and .
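    A TVP regression treats the coefficients as latent states following random walks and filters them recursively. The sketch below hand-rolls the Kalman filter on synthetic data; in practice the state and observation variances would be estimated by maximum likelihood rather than fixed.

```python
# Sketch: a time-varying parameter (TVP) regression estimated with a
# Kalman filter, where the coefficients on the explanatory variables
# (here stand-ins for inflation and the exchange rate) follow random
# walks. Synthetic data; q and h are fixed for brevity.
import numpy as np

rng = np.random.default_rng(7)
T, k = 72, 3                                   # 72 months, intercept + 2 regressors
X = np.column_stack([np.ones(T),
                     rng.normal(5, 1, T),      # "inflation"
                     rng.normal(10, 0.5, T)])  # "exchange rate" (scaled)
true_beta = np.cumsum(rng.normal(0, 0.05, (T, k)), axis=0) + [50, 2, 1]
y = np.einsum("tk,tk->t", X, true_beta) + rng.normal(0, 1, T)

q, h = 0.05**2, 1.0**2                      # state and observation variances
beta = np.zeros(k)                          # state estimate
P = np.eye(k) * 10.0                        # state covariance (diffuse-ish)
filtered = np.zeros((T, k))
for t in range(T):
    P = P + q * np.eye(k)                   # predict: random-walk coefficients
    x = X[t]
    S = x @ P @ x + h                       # innovation variance
    Kg = P @ x / S                          # Kalman gain
    beta = beta + Kg * (y[t] - x @ beta)    # update with forecast error
    P = P - np.outer(Kg, x @ P)
    filtered[t] = beta

print("Final coefficient estimates:", np.round(filtered[-1], 2))
# A one-step-ahead forecast for t = T+1 uses the last filtered coefficients.
```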

  19. Development of an economical model to determine an appropriate feed-in tariff for grid-connected solar PV electricity in all states of Australia

    International Nuclear Information System (INIS)

    Zahedi, A.

    2009-01-01

    Australia is a country with a vast amount of natural resources, including sun and wind. Australia lies between latitudes of 10-45 S and longitudes of 112-152 E, with a daily solar exposure of between less than 3 MJ/(m² day) in winter and more than 30 MJ/(m² day) in summer. Global solar radiation in Australia varies between a minimum of 3285 MJ/(m² year) in Hobart and 8760 MJ/(m² year) in the Northern Territory. As a result of this wide range of radiation levels, there is a big difference between the costs of solar PV electricity in different locations. We have recently conducted a study on the solar PV electricity price in all states of Australia. For this purpose we have developed an economical model and a computer simulation to determine the accurate unit price of grid-connected roof-top solar photovoltaic (PV) electricity in A$/kWh for all states of Australia. The benefit of this computer simulation is that we can accurately determine the most appropriate feed-in tariff for grid-connected solar PV energy systems. The main objective of this paper is to present the results of this study. A further objective is to present the details of the unit price of solar PV electricity in the state of Victoria in each month and to compare it with the electricity price from conventional power systems currently applied in this state. Victoria is located in the south of Australia and, in terms of solar radiation, is the second lowest of the Australian states. The computer simulation developed for this study makes it possible to determine the cost of grid-connected solar PV electricity at any location in any country, based on the availability of average daily solar exposure for each month as well as economic factors of the country. (author)
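    The unit-price calculation can be sketched as an annualized-cost-over-annual-yield ratio: a capital recovery factor converts the up-front system cost into an annual payment, and the yield follows from daily solar exposure. All inputs below are illustrative assumptions, not the paper's figures.

```python
# Sketch: unit price of PV electricity (A$/kWh) from capital cost and
# daily solar exposure, via a capital recovery factor. Illustrative only.
MJ_TO_KWH = 1.0 / 3.6

def unit_price(capital_cost, om_frac, discount, life_years,
               system_kw, daily_exposure_mj, performance_ratio=0.75):
    """A$/kWh for a grid-connected roof-top PV system."""
    # Capital recovery factor annualizes the up-front cost.
    crf = (discount * (1 + discount) ** life_years
           / ((1 + discount) ** life_years - 1))
    annual_cost = capital_cost * (crf + om_frac)
    # Energy yield: exposure (MJ/m2/day) -> kWh/kWp/day, times system size.
    annual_kwh = (system_kw * daily_exposure_mj * MJ_TO_KWH
                  * performance_ratio * 365)
    return annual_cost / annual_kwh

# Hypothetical 2 kW system in Victoria vs. the Northern Territory.
for site, exposure in [("Victoria", 14.0), ("Northern Territory", 22.0)]:
    price = unit_price(capital_cost=12000, om_frac=0.01, discount=0.06,
                       life_years=25, system_kw=2.0,
                       daily_exposure_mj=exposure)
    print(f"{site:20s} ~{price:.2f} A$/kWh")
```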

  20. A methodology for including wall roughness effects in k-ε low-Reynolds turbulence models

    International Nuclear Information System (INIS)

    Ambrosini, W.; Pucciarelli, A.; Borroni, I.

    2015-01-01

    Highlights: • A model for taking into account wall roughness in low-Reynolds k-ε models is presented. • The model is subjected to a first validation to show its potential in general applications. • The application of the model in predicting heat transfer to supercritical fluids is also discussed. - Abstract: A model accounting for wall roughness effects in k-ε low-Reynolds turbulence models is described in the present paper. In particular, the introduction in the transport equations of k and ε of additional source terms related to roughness, based on simple assumptions and dimensional relationships, is proposed. An objective of the present paper, in addition to obtaining more realistic predictions of wall friction, is the application of the proposed model to the study of heat transfer to supercritical fluids. A first validation of the model is reported. The model shows the capability of predicting, at least qualitatively, some of the most important trends observed when dealing with rough pipes in very different flow conditions. Qualitative comparisons with some DNS data available in literature are also performed. Further analyses provided promising results concerning the ability of the model in reproducing the trend of friction factor when varying the flow conditions, though improvements are necessary for achieving better quantitative accuracy. First applications of the model in simulating heat transfer to supercritical fluids are also described, showing the capability of the model to affect the predictions of these heat transfer phenomena, in particular in the vicinity of the pseudo-critical conditions. A more extended application of the model to relevant deteriorated heat transfer conditions will clarify the usefulness of this modelling methodology in improving predictions of these difficult phenomena. Whatever the possible success in this particular application that motivated its development, this approach suggests a general methodology for accounting

  1. A methodology for the design of experiments in computational intelligence with multiple regression models.

    Science.gov (United States)

    Fernandez-Lozano, Carlos; Gestal, Marcos; Munteanu, Cristian R; Dorado, Julian; Pazos, Alejandro

    2016-01-01

    The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence and especially on a correct comparison between the different results provided for different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results are different for three out of five state-of-the-art simple datasets and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is of relevance to use a statistical approach to indicate whether the differences are statistically significant using this kind of algorithms. Furthermore, our results with three real complex datasets report different best models than with the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as from other fields, such as for bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.

  2. A methodology for the design of experiments in computational intelligence with multiple regression models

    Directory of Open Access Journals (Sweden)

    Carlos Fernandez-Lozano

    2016-12-01

    The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence and especially on a correct comparison between the different results provided for different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results are different for three out of five state-of-the-art simple datasets and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is of relevance to use a statistical approach to indicate whether the differences are statistically significant using this kind of algorithms. Furthermore, our results with three real complex datasets report different best models than with the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as from other fields, such as for bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.

  3. A methodology for collection and analysis of human error data based on a cognitive model: IDA

    International Nuclear Information System (INIS)

    Shen, S.-H.; Smidts, C.; Mosleh, A.

    1997-01-01

    This paper presents a model-based human error taxonomy and data collection. The underlying model, IDA (described in two companion papers), is a cognitive model of behavior developed for analysis of the actions of nuclear power plant operating crews during abnormal situations. The taxonomy is established with reference to three external reference points (i.e. plant status, procedures, and crew) and four reference points internal to the model (i.e. information collected, diagnosis, decision, action). The taxonomy helps the analyst: (1) recognize errors as such; (2) categorize the error in terms of generic characteristics such as 'error in selection of problem solving strategies'; and (3) identify the root causes of the error. The data collection methodology is summarized in post-event operator interview and analysis summary forms. The root cause analysis methodology is illustrated using a subset of an actual event. Statistics that extract generic characteristics of error-prone behaviors and error-prone situations are presented. Finally, applications of the human error data collection are reviewed. A primary benefit of this methodology is to define better symptom-based and other auxiliary procedures, with associated training, to minimize or preclude certain human errors. It also helps in the design of control rooms, and in the assessment of human error probabilities in the probabilistic risk assessment framework. (orig.)

  4. Modelling the distribution of hard seabed using calibrated multibeam acoustic backscatter data in a tropical, macrotidal embayment: Darwin Harbour, Australia

    Science.gov (United States)

    Siwabessy, P. Justy W.; Tran, Maggie; Picard, Kim; Brooke, Brendan P.; Huang, Zhi; Smit, Neil; Williams, David K.; Nicholas, William A.; Nichol, Scott L.; Atkinson, Ian

    2018-06-01

    Spatial information on the distribution of seabed substrate types in high-use coastal areas is essential to support their effective management and environmental monitoring. For Darwin Harbour, a rapidly developing port in northern Australia, the distribution of hard substrate is poorly documented but known to influence the location and composition of important benthic biological communities (corals, sponges). In this study, we use angular backscatter response curves to model the distribution of hard seabed in the subtidal areas of Darwin Harbour. The angular backscatter response curve data were extracted from multibeam sonar data and analysed against backscatter intensity for sites observed from seabed video to be representative of "hard" seabed. Data from these sites were consolidated into an "average curve", which became a reference curve that was in turn compared to all other angular backscatter response curves using the Kolmogorov-Smirnov goodness-of-fit test. The output was used to generate interpolated spatial predictions of the probability of hard seabed (p-hard) and derived hard-seabed parameters for the mapped area of Darwin Harbour. The results agree well with the ground-truth data, with an overall classification accuracy of 75% and an area-under-curve measure of 0.79, and with modelled bed shear stress for the Harbour. Limitations of this technique are discussed with attention to discrepancies between the video and acoustic results, such as in areas where sediment forms a veneer over hard substrate.
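    The curve-matching step can be read as follows: normalize each angular response curve into a cumulative form and take the maximum distance to the reference curve, in the spirit of the Kolmogorov-Smirnov statistic. The sketch below uses synthetic curves and is a simplified reading of the method, not the authors' implementation.

```python
# Sketch: comparing an angular backscatter response curve against a
# "hard seabed" reference curve with a Kolmogorov-Smirnov-style statistic
# (max distance between normalized cumulative curves). Synthetic data.
import numpy as np

angles = np.arange(1, 61)                       # incidence angles, degrees

def ks_distance(curve, reference):
    """Max absolute difference between normalized cumulative curves."""
    c1 = np.cumsum(curve - curve.min())
    c1 = c1 / c1[-1]
    c2 = np.cumsum(reference - reference.min())
    c2 = c2 / c2[-1]
    return float(np.abs(c1 - c2).max())

# Synthetic angular responses in dB (higher, flatter = harder seabed).
reference_hard = -15 - 0.10 * angles            # averaged "hard" curve
candidate_soft = -25 - 0.35 * angles
candidate_hard = (-16 - 0.12 * angles
                  + np.random.default_rng(3).normal(0, 0.3, angles.size))

for name, c in [("soft-looking", candidate_soft),
                ("hard-looking", candidate_hard)]:
    d = ks_distance(c, reference_hard)
    print(f"{name:13s} KS distance to hard reference = {d:.3f}")
# Small distances map to high p-hard in the interpolated prediction grid.
```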

  5. A new model of collaborative research: experiences from one of Australia's NHMRC Partnership Centres for Better Health.

    Science.gov (United States)

    Wutzke, Sonia; Redman, Sally; Bauman, Adrian; Hawe, Penelope; Shiell, Alan; Thackway, Sarah; Wilson, Andrew

    2017-02-15

    There is often a disconnection between the creation of evidence and its use in policy and practice. Cross-sectoral, multidisciplinary partnership research, founded on shared governance and coproduction, is considered to be one of the most effective means of overcoming this research-policy-practice disconnect. Similar to a number of funding bodies internationally, Australia's National Health and Medical Research Council has introduced Partnership Centres for Better Health: a scheme explicitly designed to encourage coproduced partnership research. In this paper, we describe our experiences of The Australian Prevention Partnership Centre, established in June 2013 to explore the systems, strategies and structures that inform decisions about how to prevent lifestyle-related chronic disease. We present our view on how the Partnership Centre model is working in practice. We comment on the unique features of the Partnership Centre funding model, how these features enable ways of working that are different from both investigator-initiated and commissioned research, and how these ways of working can result in unique outcomes that would otherwise not have been possible. Although not without challenges, the Partnership Centre approach addresses a major gap in the Australian research environment, whereby large-scale, research-policy-practice partnerships are established with sufficient time, resources and flexibility to deliver highly innovative, timely and accessible research that is of use to policy and practice.

  6. Simulating large-scale pedestrian movement using CA and event driven model: Methodology and case study

    Science.gov (United States)

    Li, Jun; Fu, Siyao; He, Haibo; Jia, Hongfei; Li, Yanzhong; Guo, Yi

    2015-11-01

    Large-scale regional evacuation is an important part of national security emergency response planning. The large commercial shopping area is a typical service system, and its emergency evacuation is one of the hot research topics. A systematic methodology based on Cellular Automata with a Dynamic Floor Field and an event-driven model has been proposed, and the methodology has been examined within the context of a case study involving evacuation within a commercial shopping mall. Pedestrian walking is based on Cellular Automata and the event-driven model. In this paper, the event-driven model is adopted to simulate pedestrian movement patterns, and the simulation process is divided into a normal situation and emergency evacuation. The model is composed of four layers: an environment layer, customer layer, clerk layer and trajectory layer. For the simulation of pedestrian movement routes, the model takes into account the purchase intention of customers and the density of pedestrians. Based on the evacuation model combining Cellular Automata with a Dynamic Floor Field and the event-driven model, the behavior characteristics of customers and clerks in normal and emergency evacuation situations can be reflected. The distribution of individual evacuation time as a function of initial positions and the dynamics of the evacuation process are studied. Our results indicate that the evacuation model using the combination of Cellular Automata with a Dynamic Floor Field and event-driven scheduling can be used to simulate the evacuation of pedestrian flows in indoor areas with complicated surroundings and to investigate the layout of a shopping mall.
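    The movement rule of floor-field CA models can be sketched in a few lines: precompute a static field as the walking distance to the exit, then let each pedestrian step to the free neighbouring cell with the smallest field value. The toy below omits the dynamic field, purchase intention and the event-driven scheduler described in the paper.

```python
# Sketch: one update rule of a CA evacuation model with a static floor
# field (BFS distance to the exit). Toy open grid, no obstacles.
import numpy as np
from collections import deque

H, W = 8, 12
exit_cell = (4, 11)

# Static floor field: BFS distance from the exit over the grid.
field = np.full((H, W), np.inf)
field[exit_cell] = 0
queue = deque([exit_cell])
while queue:
    r, c = queue.popleft()
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < H and 0 <= nc < W and field[nr, nc] == np.inf:
            field[nr, nc] = field[r, c] + 1
            queue.append((nr, nc))

def step(peds):
    """Sequential update: pedestrians closest to the exit move first."""
    occupied = set(peds)
    new = set()
    for r, c in sorted(peds, key=lambda p: field[p]):
        occupied.discard((r, c))                # leaving the current cell
        options = [(r + dr, c + dc)
                   for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                   if 0 <= r + dr < H and 0 <= c + dc < W]
        free = [p for p in options if p not in occupied and p not in new]
        best = min(free + [(r, c)], key=lambda p: field[p])
        if best != exit_cell:                   # the exit absorbs walkers
            new.add(best)
    return new

pedestrians = {(2, 1), (6, 3), (3, 7)}
t = 0
while pedestrians:
    pedestrians = step(pedestrians)
    t += 1
print(f"All pedestrians evacuated after {t} steps.")
```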

  7. Investigating compound flooding in an estuary using hydrodynamic modelling: a case study from the Shoalhaven River, Australia

    Science.gov (United States)

    Kumbier, Kristian; Carvalho, Rafael C.; Vafeidis, Athanasios T.; Woodroffe, Colin D.

    2018-02-01

    Many previous modelling studies have considered storm-tide and riverine flooding independently, even though joint-probability analysis highlighted significant dependence between extreme rainfall and extreme storm surges in estuarine environments. This study investigates compound flooding by quantifying horizontal and vertical differences in coastal flood risk estimates resulting from a separation of storm-tide and riverine flooding processes. We used an open-source version of the Delft3D model to simulate flood extent and inundation depth due to a storm event that occurred in June 2016 in the Shoalhaven Estuary, south-eastern Australia. Time series of observed water levels and discharge measurements are used to force model boundaries, whereas observational data such as satellite imagery, aerial photographs, tidal gauges and water level logger measurements are used to validate modelling results. The comparison of simulation results including and excluding riverine discharge demonstrated large differences in modelled flood extents and inundation depths. A flood risk assessment accounting only for storm-tide flooding would have underestimated the flood extent of the June 2016 storm event by 30 % (20.5 km2). Furthermore, inundation depths would have been underestimated on average by 0.34 m and by up to 1.5 m locally. We recommend considering storm-tide and riverine flooding processes jointly in estuaries with large catchment areas, which are known to have a quick response time to extreme rainfall. In addition, comparison of different boundary set-ups at the intermittent entrance in Shoalhaven Heads indicated that a permanent opening, in order to reduce exposure to riverine flooding, would increase tidal range and exposure to both storm-tide flooding and wave action.

  8. A methodology for modeling photocatalytic reactors for indoor pollution control using previously estimated kinetic parameters

    Energy Technology Data Exchange (ETDEWEB)

    Passalia, Claudio; Alfano, Orlando M. [INTEC - Instituto de Desarrollo Tecnologico para la Industria Quimica, CONICET - UNL, Gueemes 3450, 3000 Santa Fe (Argentina); FICH - Departamento de Medio Ambiente, Facultad de Ingenieria y Ciencias Hidricas, Universidad Nacional del Litoral, Ciudad Universitaria, 3000 Santa Fe (Argentina); Brandi, Rodolfo J., E-mail: rbrandi@santafe-conicet.gov.ar [INTEC - Instituto de Desarrollo Tecnologico para la Industria Quimica, CONICET - UNL, Gueemes 3450, 3000 Santa Fe (Argentina); FICH - Departamento de Medio Ambiente, Facultad de Ingenieria y Ciencias Hidricas, Universidad Nacional del Litoral, Ciudad Universitaria, 3000 Santa Fe (Argentina)

    2012-04-15

    Highlights: • Indoor pollution control via photocatalytic reactors. • Scaling-up methodology based on previously determined mechanistic kinetics. • Radiation interchange model between catalytic walls using configuration factors. • Modeling and experimental validation of a complex geometry photocatalytic reactor. - Abstract: A methodology for modeling photocatalytic reactors for application in indoor air pollution control is presented. The methodology implies, firstly, the determination of intrinsic reaction kinetics for the removal of formaldehyde. This is achieved by means of a simple-geometry, continuous reactor operating under a kinetic control regime at steady state. The kinetic parameters were estimated from experimental data by means of a nonlinear optimization algorithm. The second step was the application of the obtained kinetic parameters to a very different photoreactor configuration. In this case, the reactor is a corrugated-wall type using nanosize TiO2 as catalyst, irradiated by UV lamps that provide a spatially uniform radiation field. The radiative transfer within the reactor was modeled through a superficial emission model for the lamps, the ray tracing method and the computation of view factors. The velocity and concentration fields were evaluated by means of a commercial CFD tool (Fluent 12) into which the radiation model was introduced externally. The results of the model were compared with experiments in a corrugated-wall, bench-scale reactor constructed in the laboratory. The overall pollutant conversion showed good agreement between model predictions and experiments, with a root mean square error of less than 4%.

  9. 2014 International Conference on Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Ören, Tuncer; Kacprzyk, Janusz; Filipe, Joaquim

    2015-01-01

    The present book includes a set of selected extended papers from the 4th International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2014), held in Vienna, Austria, from 28 to 30 August 2014. The conference brought together researchers, engineers and practitioners interested in methodologies and applications of modeling and simulation. New and innovative solutions are reported in this book. SIMULTECH 2014 received 167 submissions, from 45 countries, in all continents. After a double blind paper review performed by the Program Committee, 23% were accepted as full papers and thus selected for oral presentation. Additional papers were accepted as short papers and posters. A further selection was made after the Conference, based also on the assessment of presentation quality and audience interest, so that this book includes the extended and revised versions of the very best papers of SIMULTECH 2014. Commitment to high quality standards is a major concern of SIMULTEC...

  10. 5th International Conference on Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Kacprzyk, Janusz; Ören, Tuncer; Filipe, Joaquim

    2016-01-01

    The present book includes a set of selected extended papers from the 5th International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2015), held in Colmar, France, from 21 to 23 July 2015. The conference brought together researchers, engineers and practitioners interested in methodologies and applications of modeling and simulation. New and innovative solutions are reported in this book. SIMULTECH 2015 received 102 submissions, from 36 countries, in all continents. After a double blind paper review performed by the Program Committee, 19% were accepted as full papers and thus selected for oral presentation. Additional papers were accepted as short papers and posters. A further selection was made after the Conference, based also on the assessment of presentation quality and audience interest, so that this book includes the extended and revised versions of the very best papers of SIMULTECH 2015. Commitment to high quality standards is a major concern of SIMULTECH t...

  11. Two-dimensional chronostratigraphic modelling of OSL ages from recent beach-ridge deposits, SE Australia

    DEFF Research Database (Denmark)

    Tamura, Toru; Cunningham, Alastair C.; Oliver, Thomas S.N.

    2018-01-01

    Optically-stimulated luminesecne (OSL) dating, in concert with two-dimensional ground-penetrating radar (GPR) profiling, has contributed to significant advances in our understanding of beach-ridge systems and other sedimentary landforms in various settings. For recent beach-ridges, the good OSL...... samples may be larger than the difference in sample ages. Age inversions can be avoided, however, if the stratigraphic constraints are included in the age estimation process. Here, we create a custom Bayesian chronological model for a recent (..., for direct comparison with a GPR profile. The model includes a full ‘burial-dose model’ for each sample and a dose rate term with the modelled ages constrained by the vertical and shore-normal sample order. The modelled ages are visualized by plotting isochrones on the beach-ridge cross section...

  12. A Model-Driven Methodology for Big Data Analytics-as-a-Service

    OpenAIRE

    Damiani, Ernesto; Ardagna, Claudio Agostino; Ceravolo, Paolo; Bellandi, Valerio; Bezzi, Michele; Hebert, Cedric

    2017-01-01

    The Big Data revolution has promised to build a data-driven ecosystem where better decisions are supported by enhanced analytics and data management. However, critical issues still need to be solved on the road that leads to the commoditization of Big Data Analytics, such as the management of Big Data complexity and the protection of data security and privacy. In this paper, we focus on the first issue and propose a methodology based on Model Driven Engineering (MDE) that aims to substantially lower...

  13. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    Science.gov (United States)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  14. Methodological Aspects of Modeling Development and Viability of Systems and Counterparties in the Digital Economy

    Directory of Open Access Journals (Sweden)

    Vitlinskyy Valdemar V.

    2018-03-01

    The aim of the article is to study and generalize methodological approaches to modeling the economic development and viability of economic systems with consideration of risk and of changes in their goals, status, and behavior in the digital economy. A definition of the categories of economic development and viability is offered, and directions for their research by means of mathematical modeling are grounded. The system of characteristics and markers of the external economic environment under conditions of digitalization of economic activity is analyzed. The theoretical foundations and methodology for mathematical modeling of the development of economic systems, as well as for ensuring their viability and security under conditions of introducing the infrastructure of the information society and the digital economy on the principles of the information and knowledge approach, are considered. It is proved that in an information society, predictive model technologies are a growing safety resource. Prerequisites are studied for replacing the traditional integration concept of evaluation, analysis, modeling, management, and administration of economic development, based on a threat-oriented approach to the definition of security protectors, information, and knowledge. A concept is proposed for creating a database of models for examining trends and patterns of economic development which, unlike traditional trend models of dynamics, identifies and iteratively conceptualizes processes based on a set of knowledge-based predictors and on the use of data mining and machine learning tools, including deep learning.

  15. Model methodology for estimating pesticide concentration extremes based on sparse monitoring data

    Science.gov (United States)

    Vecchia, Aldo V.

    2018-03-22

    This report describes a new methodology for using sparse (weekly or less frequent observations) and potentially highly censored pesticide monitoring data to simulate daily pesticide concentrations and associated quantities used for acute and chronic exposure assessments, such as the annual maximum daily concentration. The new methodology is based on a statistical model that expresses log-transformed daily pesticide concentration in terms of a seasonal wave, flow-related variability, long-term trend, and serially correlated errors. Methods are described for estimating the model parameters, generating conditional simulations of daily pesticide concentration given sparse (weekly or less frequent) and potentially highly censored observations, and estimating concentration extremes based on the conditional simulations. The model can be applied to datasets with as few as 3 years of record, as few as 30 total observations, and as few as 10 uncensored observations. The model was applied to atrazine, carbaryl, chlorpyrifos, and fipronil data for U.S. Geological Survey pesticide sampling sites with sufficient data for applying the model. A total of 112 sites were analyzed for atrazine, 38 for carbaryl, 34 for chlorpyrifos, and 33 for fipronil. The results are summarized in this report; and, R functions, described in this report and provided in an accompanying model archive, can be used to fit the model parameters and generate conditional simulations of daily concentrations for use in investigations involving pesticide exposure risk and uncertainty.
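    The model structure lends itself to a short simulation sketch: daily log concentration as a seasonal wave plus trend plus serially correlated errors (the flow-related term is omitted for brevity). Coefficients below are invented; the report's accompanying R functions perform the actual fitting and handle censoring.

```python
# Sketch: generating daily log-concentration series from the model's
# components (seasonal wave + trend + AR(1) errors). Illustrative only.
import numpy as np

rng = np.random.default_rng(11)
days = np.arange(3 * 365)                      # three years of daily values
t = days / 365.25

seasonal = 0.8 * np.sin(2 * np.pi * t + 1.2)   # seasonal wave (application timing)
trend = -0.05 * t                              # slow long-term decline
phi, sigma = 0.9, 0.25                         # AR(1) errors, lag-1 correlation
eps = np.zeros(days.size)
for i in range(1, days.size):
    eps[i] = phi * eps[i - 1] + rng.normal(0, sigma)

log_conc = np.log(0.5) + seasonal + trend + eps   # base level 0.5 ug/L
conc = np.exp(log_conc)

# Quantities like the annual maximum daily concentration follow directly.
annual_max = [conc[y * 365:(y + 1) * 365].max() for y in range(3)]
print("Simulated annual maximum daily concentrations (ug/L):",
      np.round(annual_max, 2))
```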

  16. Harvesting Australia's mineral wealth

    Energy Technology Data Exchange (ETDEWEB)

    1982-07-01

    Anderson Strathclyde plc is becoming increasingly involved in supplying equipment for the coal industry in Australia. It now has 2 subsidiary companies based in Australia: Anderson Strathclyde Australia and A B Rea.

  17. MODEL - INTEGRAL METHODOLOGY FOR SUCCESSFUL DESIGNING AND IMPLEMENTING OF TQM SYSTEM IN MACEDONIAN COMPANIES

    Directory of Open Access Journals (Sweden)

    Elizabeta Mitreva

    2011-12-01

    The subject of this paper is the valorization of the meaning and perspectives of Total Quality Management (TQM) system design and implementation within domestic companies, and the creation of a model-methodology for improved performance, efficiency and effectiveness. The research is designed as an attempt to depict the existing condition in Macedonian companies regarding quality system design and implementation, analysed through four pillars of the "house of quality", whose top is the top management and whose basis is the measurement, evaluation, analysis and comparison of quality. This "house" is held up by four subsystems, i.e. internal standardization; methods and techniques for flawless work performance; education and motivation; and analysis of quality costs. The data received from the research, and the proposed integral methodology for designing and implementing a TQM system, are intended to help and give useful directions to all Macedonian companies aiming to become "world class" organizations. The basis for the creation of this model is the redesign of business processes, after which a new phase of business performance begins: continued improvement, the rolling of Deming's Quality Circle (Plan-Do-Check-Act). The model-methodology proposed in this paper is integral and universal, which means that it is applicable to all companies regardless of the business area.

  18. Energy Demand Modeling Methodology of Key State Transitions of Turning Processes

    Directory of Open Access Journals (Sweden)

    Shun Jia

    2017-04-01

    Energy demand modeling of machining processes is the foundation of energy optimization. The energy demand of machining state transitions is integral to the energy requirements of the machining process. However, research focusing on energy modeling of state transitions is scarce. To fill this gap, an energy demand modeling methodology for key state transitions of the turning process is proposed. The establishment of an energy demand model of state transitions improves the accuracy of the energy model of the machining process, and provides an accurate model and reliable data for energy optimization of the machining process. Finally, case studies were conducted on a CK6153i CNC lathe, the results demonstrating that predictive accuracy with the proposed method is generally above 90% for the state transition cases.

  19. Development in methodologies for modelling of human and ecotoxic impacts in LCA

    DEFF Research Database (Denmark)

    Hauschild, Michael Zwicky; Huijbregts, Mark; Jolliet, Olivier

    2009-01-01

    Under the UNEP-SETAC Life Cycle Initiative there is an aim to develop an internationally backed recommended practice of life cycle impact assessment, addressing methodological issues like the choice of characterization model and characterization factors. In this context, an international comparison was performed of characterization models for toxic impacts from chemicals in life cycle assessment. Six commonly used characterization models were compared in a sequence of workshops. Crucial fate, exposure and effect aspects were identified for which the models differed in their treatment. The models were.... The USEtox™ model has been used to calculate characterization factors for several thousand substances and is currently under review with the intention that it shall form the basis of the recommendations from the UNEP-SETAC Life Cycle Initiative regarding characterization of toxic impacts in Life Cycle

  20. Water level management of lakes connected to regulated rivers: An integrated modeling and analytical methodology

    Science.gov (United States)

    Hu, Tengfei; Mao, Jingqiao; Pan, Shunqi; Dai, Lingquan; Zhang, Peipei; Xu, Diandian; Dai, Huichao

    2018-07-01

    Reservoir operations significantly alter the hydrological regime of the downstream river and river-connected lake, which has far-reaching impacts on the lake ecosystem. To facilitate the management of lakes connected to regulated rivers, the following information must be provided: (1) the response of lake water levels to reservoir operation schedules in the near future and (2) the importance of different rivers in terms of affecting the water levels in different lake regions of interest. We develop an integrated modeling and analytical methodology for the water level management of such lakes. The data-driven method is used to model the lake level as it has the potential of producing quick and accurate predictions. A new genetic algorithm-based synchronized search is proposed to optimize input variable time lags and data-driven model parameters simultaneously. The methodology also involves the orthogonal design and range analysis for extracting the influence of an individual river from that of all the rivers. The integrated methodology is applied to the second largest freshwater lake in China, the Dongting Lake. The results show that: (1) the antecedent lake levels are of crucial importance for the current lake level prediction; (2) the selected river discharge time lags reflect the spatial heterogeneity of the rivers' impacts on lake level changes; (3) the predicted lake levels are in very good agreement with the observed data (RMSE ≤ 0.091 m; R2 ≥ 0.9986). This study demonstrates the practical potential of the integrated methodology, which can provide both the lake level responses to future dam releases and the relative contributions of different rivers to lake level changes.
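    Once lags are chosen, the data-driven core reduces to a regression on lagged inputs. The sketch below fits a linear model with fixed, hand-picked lags on synthetic data and reports the same indicators (RMSE, R2) used in the study; the genetic algorithm's job is to search jointly over such lag combinations and model parameters.

```python
# Sketch: a lagged-input linear model of lake level, the kind of
# relationship the GA-based search optimizes lags for. Synthetic data;
# lags are hand-picked here for illustration.
import numpy as np

rng = np.random.default_rng(5)
n = 400
inflow = rng.gamma(2.0, 500.0, n)              # daily river discharge (m3/s)
level = np.zeros(n)
for i in range(1, n):                          # synthetic "true" lake response
    level[i] = 0.98 * level[i - 1] + 1e-5 * inflow[i - 2 if i >= 2 else 0]
level += rng.normal(0, 0.002, n)

lag_level, lag_flow = 1, 2                     # candidate lags (GA would search)
rows = range(max(lag_level, lag_flow), n)
X = np.column_stack([np.ones(len(rows)),
                     [level[i - lag_level] for i in rows],
                     [inflow[i - lag_flow] for i in rows]])
y = level[list(rows)]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ coef
rmse = np.sqrt(np.mean((y - pred) ** 2))
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"RMSE = {rmse:.4f} m, R2 = {r2:.4f}")
```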

  1. A novel methodology for interpreting air quality measurements from urban streets using CFD modelling

    Science.gov (United States)

    Solazzo, Efisio; Vardoulakis, Sotiris; Cai, Xiaoming

    2011-09-01

    In this study, a novel computational fluid dynamics (CFD) based methodology has been developed to interpret long-term averaged measurements of pollutant concentrations collected at roadside locations. The methodology is applied to the analysis of pollutant dispersion in Stratford Road (SR), a busy street canyon in Birmingham (UK), where a one-year sampling campaign was carried out between August 2005 and July 2006. Firstly, a number of dispersion scenarios are defined by combining sets of synoptic wind velocity and direction. Assuming neutral atmospheric stability, CFD simulations are conducted for all the scenarios, applying the standard k-ɛ turbulence model, with the aim of creating a database of normalised pollutant concentrations at specific locations within the street. Modelled concentrations for all wind scenarios were compared with hourly observed NOx data. In order to compare with long-term averaged measurements, a weighted average of the CFD-calculated concentration fields was derived, with the weighting coefficients proportional to the frequency of each scenario observed during the examined period (either monthly or annually). In summary, the methodology consists of (i) identifying the main dispersion scenarios for the street based on wind speed and direction data, (ii) creating a database of CFD-calculated concentration fields for the identified dispersion scenarios, and (iii) combining the CFD results based on the frequency of occurrence of each dispersion scenario during the examined period. The methodology has been applied to calculate monthly and annually averaged benzene concentrations at several locations within the street canyon so that a direct comparison with observations could be made. The results of this study indicate that, within the simplifying assumption of non-buoyant flow, CFD modelling can aid understanding of long-term air quality measurements, and help assess the representativeness of monitoring locations for population
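    The combination step is a frequency-weighted average over the scenario library, as sketched below with placeholder fields and frequencies: each CFD-calculated field is weighted by how often its wind scenario occurred in the averaging period.

```python
# Sketch: the final combination step, assuming a precomputed library of
# CFD concentration fields (one per wind scenario) and observed scenario
# frequencies for the averaging period. Numbers are placeholders.
import numpy as np

# Normalised concentration fields at receptor points for 4 wind scenarios
# (rows: scenario, columns: receptor locations within the canyon).
scenario_fields = np.array([
    [12.0, 8.0, 5.0],    # wind parallel to street
    [20.0, 15.0, 6.0],   # wind perpendicular, leeward build-up
    [6.0, 5.0, 4.0],     # oblique, well ventilated
    [9.0, 7.0, 5.5],     # low-wind scenario
])
# Frequency of each scenario in the examined month (must sum to 1).
freq = np.array([0.30, 0.25, 0.20, 0.25])
emission_rate = 1.8     # scales normalised fields to concentrations

monthly_mean = emission_rate * freq @ scenario_fields
print("Long-term averaged concentration at each receptor:", monthly_mean)
```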

  2. Testing methodologies for quantifying physical models uncertainties. A comparative exercise using CIRCE and IPREM (FFTBM)

    Energy Technology Data Exchange (ETDEWEB)

    Freixa, Jordi, E-mail: jordi.freixa-terradas@upc.edu; Alfonso, Elsa de, E-mail: elsa.de.alfonso@upc.edu; Reventós, Francesc, E-mail: francesc.reventos@upc.edu

    2016-08-15

    Highlights: • Uncertainties of physical models are a key issue in Best Estimate Plus Uncertainty analysis. • Estimation of uncertainties of physical models of thermal hydraulics system codes. • Comparison of CIRCÉ and FFTBM methodologies. • Simulation of reflood experiments in order to evaluate uncertainty of physical models related to the reflood scenario. - Abstract: The increasing importance of Best-Estimate Plus Uncertainty (BEPU) analyses in nuclear safety and licensing processes has led to several international activities. The latest findings highlighted the uncertainties of physical models as one of the most controversial aspects of BEPU. This type of uncertainty is an important contributor to the total uncertainty of NPP BE calculations. Due to the complexity of estimating this uncertainty, it is often assessed solely by engineering judgment. The present study comprises a comparison of two different state-of-the-art methodologies, CIRCÉ and IPREM (FFTBM), capable of quantifying the uncertainty of physical models. Similarities and differences of their results are discussed through the observation of probability distribution functions and envelope calculations. In particular, the analyzed scenario is core reflood. Experimental data from the FEBA and PERICLES test facilities are employed, while the thermal hydraulic simulations are carried out with RELAP5/mod3.3. This work is undertaken under the framework of the PREMIUM (Post-BEMUSE Reflood Model Input Uncertainty Methods) benchmark.
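
    Neither CIRCÉ's nor IPREM's internals are reproduced in the abstract, but both methodologies end by propagating a quantified model-parameter distribution through the thermal-hydraulic code and reading off an uncertainty envelope. A minimal sketch of that downstream step, with an assumed log-normal multiplier and a toy surrogate standing in for RELAP5/mod3.3:

        import numpy as np

        rng = np.random.default_rng(1)

        # Assumed output of the quantification step: a log-normal multiplier on the
        # reflood heat-transfer coefficient (parameters illustrative, not CIRCE's).
        multipliers = rng.lognormal(mean=0.0, sigma=0.25, size=200)

        def peak_cladding_temp(m):
            # Toy surrogate standing in for a reflood code run: better heat
            # transfer (m > 1) lowers the predicted peak cladding temperature.
            return 900.0 + 250.0 / m

        runs = peak_cladding_temp(multipliers)
        lo, hi = np.percentile(runs, [2.5, 97.5])
        print(f"best estimate {peak_cladding_temp(1.0):.0f} K, "
              f"95% envelope [{lo:.0f}, {hi:.0f}] K")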

  3. Analysing the economy-wide effects of the energy tax: results for Australia from the ORANI-E model

    Energy Technology Data Exchange (ETDEWEB)

    McDougall, R.A.; Dixon, P.B. [Monash Univ., Clayton, VIC (Australia); Australian Bureau of Agricultural and Resource Economics (ABARE), Canberra, ACT (Australia)

    1996-12-31

    Since the mid 1980s, economists have devoted considerable effort to greenhouse issues. Among the questions to which they have sought answers were: what would be the effects on economic growth and employment of adopting different approaches to restricting greenhouse gas emissions, and what are the distributional effects of restricting greenhouse gas emissions, i.e. how would income and economic activity be re-allocated between countries, between industries, and between income classes. One approach to reducing greenhouse gas emissions is to impose taxes on the use of fossil fuels. Such a policy might, however, cause short-run economic disruption. This issue is investigated for Australia using a general equilibrium model, ORANI-E. The short-run effects of an energy tax are shown to depend on what is done with the tax revenue, how the labour market reacts, and on substitution possibilities between energy, capital and labour. Overall, the results indicate that energy taxes need not be damaging to the macro-economy. (author). 5 tabs., 2 figs., refs.

  4. Concepts and methodologies for modeling and simulation a tribute to Tuncer Oren

    CERN Document Server

    Yilmaz, Levent

    2015-01-01

    This comprehensive text/reference presents cutting-edge advances in the theory and methodology of modeling and simulation (M&S), and reveals how this work has been influenced by the fundamental contributions of Professor Tuncer Ören to this field. Exploring the synergies among the domains of M&S and systems engineering (SE), the book describes how M&S and SE can help to address the complex problems identified as "Grand Challenges" more effectively under a model-driven and simulation-directed systems engineering framework. Topics and features: examines frameworks for the development of advan

  5. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models: Appendices

    Science.gov (United States)

    Coppolino, Robert N.

    2018-01-01

    Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS subsystems and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and finally (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.

  6. Decision modelling of non-pharmacological interventions for individuals with dementia: a systematic review of methodologies

    DEFF Research Database (Denmark)

    Sopina, Liza; Sørensen, Jan

    2018-01-01

    alongside an RCT without additional modelling. Results: Two primary, five secondary and three tertiary prevention intervention studies were identified and reviewed. Five studies utilised Markov models, with others using discrete event, regression-based simulation, and decision tree approaches. A number...... of challenging methodological issues were identified, including the use of MMSE-score as the main outcome measure, limited number of strategies compared, restricted time horizons, and limited or dated data on dementia onset, progression and mortality. Only one of the three tertiary prevention studies explicitly...

  7. Methodology and Applications in Non-linear Model-based Geostatistics

    DEFF Research Database (Denmark)

    Christensen, Ole Fredslund

    that are approximately Gaussian. Parameter estimation and prediction for the transformed Gaussian model is studied. In some cases a transformation cannot possibly render the data Gaussian. A methodology for analysing such data was introduced by Diggle, Tawn and Moyeed (1998): The generalised linear spatial model...... priors for Bayesian inference is discussed. Procedures for parameter estimation and prediction are studied. Theoretical properties of Markov chain Monte Carlo algorithms are investigated, and different algorithms are compared. In addition, the thesis contains a manual for an R-package, geoRglmm, which...

  8. A rigorous methodology for development and uncertainty analysis of group contribution based property models

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Abildskov, Jens; Sin, Gürkan

    ) weighted-least-square regression. 3) Initialization of estimation by use of linear algebra providing a first guess. 4) Sequential parameter estimation and simultaneous GC parameter estimation using 4 different minimization algorithms. 5) Thorough uncertainty analysis: a) based on asymptotic approximation of parameter...... covariance matrix; b) based on the bootstrap method, providing 95%-confidence intervals of parameters and the predicted property. 6) Performance statistics analysis and model application. The application of the methodology is shown for a new GC model built to predict the lower flammability limit (LFL) for refrigerants...... their credibility and robustness in wider industrial and scientific applications....
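
    The covariance-based and bootstrap branches of the uncertainty analysis (step 5 above) can be sketched for a linear group-contribution model as follows; the group counts, contributions and noise level are synthetic placeholders, not the refrigerant data.

        import numpy as np

        rng = np.random.default_rng(2)

        # Toy GC setup (assumed): property = sum of group counts x group contributions.
        n_comp, n_groups = 60, 5
        X = rng.integers(0, 4, (n_comp, n_groups)).astype(float)  # group occurrence counts
        theta_true = np.array([1.2, -0.4, 0.8, 2.1, 0.3])
        y = X @ theta_true + rng.normal(0.0, 0.15, n_comp)        # "measured" property

        # Least-squares estimate (the linear-algebra first-guess step).
        theta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ theta
        s2 = resid @ resid / (n_comp - n_groups)

        # (a) Asymptotic approximation: cov(theta) ~ s^2 (X'X)^-1, 95% CI ~ +/-1.96 sd.
        cov = s2 * np.linalg.inv(X.T @ X)
        ci_asym = 1.96 * np.sqrt(np.diag(cov))

        # (b) Bootstrap: refit on resampled compounds, take percentile intervals.
        boot = []
        for _ in range(2000):
            idx = rng.integers(0, n_comp, n_comp)                 # resample compounds
            boot.append(np.linalg.lstsq(X[idx], y[idx], rcond=None)[0])
        ci_boot = np.percentile(boot, [2.5, 97.5], axis=0)

        for g in range(n_groups):
            print(f"group {g}: {theta[g]:+.3f}  asym +/-{ci_asym[g]:.3f}  "
                  f"boot [{ci_boot[0, g]:+.3f}, {ci_boot[1, g]:+.3f}]")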

  9. Taxes and Subsidies for Improving Diet and Population Health in Australia: A Cost-Effectiveness Modelling Study.

    Science.gov (United States)

    Cobiac, Linda J; Tam, King; Veerman, Lennert; Blakely, Tony

    2017-02-01

    An increasing number of countries are implementing taxes on unhealthy foods and drinks to address the growing burden of dietary-related disease, but the cost-effectiveness of combining taxes on unhealthy foods and subsidies on healthy foods is not well understood. Using a population model of dietary-related diseases and health care costs and food price elasticities, we simulated the effect of taxes on saturated fat, salt, sugar, and sugar-sweetened beverages and a subsidy on fruits and vegetables, over the lifetime of the Australian population. The sizes of the taxes and subsidy were set such that, when combined as a package, there would be a negligible effect on average weekly expenditure on food. Among the individual measures, the sugar-sweetened beverage tax averted 12,000 [95% UI: 2,100 to 21,000] DALYs. The fruit and vegetable subsidy (-13,000 [95% UI: -44,000 to 18,000] DALYs) was a cost-effective addition to the package of taxes. However, it did not necessarily lead to a net health benefit for the population when modelled as an intervention on its own, because of the possible adverse cross-price elasticity effects on consumption of other foods (e.g., foods high in saturated fat and salt). The study suggests that taxes and subsidies on foods and beverages can potentially be combined to achieve substantial improvements in population health and cost-savings to the health sector. However, the magnitude of health benefits is sensitive to measures of price elasticity, and further work is needed to incorporate potential benefits or harms associated with changes in other foods and nutrients that are not currently modelled, such as red and processed meats and fibre. With potentially large health benefits for the Australian population and large benefits in reducing health sector spending on the treatment of non-communicable diseases, the formulation of a tax and subsidy package should be given a more prominent role in Australia's public health nutrition strategy.
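
    The role of cross-price elasticities, which drive the subsidy's possible adverse effect when modelled on its own, can be illustrated with a first-order demand response; the elasticity values below are placeholders, not the estimates used in the study.

        import numpy as np

        # Illustrative own- and cross-price elasticities (rows: foods, cols: prices).
        foods = ["SSBs", "high-sat-fat foods", "fruit & veg"]
        E = np.array([[-1.2,  0.1,  0.0],   # SSB demand falls when SSBs are taxed
                      [ 0.2, -0.8,  0.1],   # off-diagonals: cross-price effects
                      [ 0.0,  0.1, -0.6]])
        price_change = np.array([0.20, 0.15, -0.20])  # taxes on first two, subsidy on third

        # First-order response: % change in quantity = E . (% change in price)
        quantity_change = E @ price_change
        for f, dq in zip(foods, quantity_change):
            print(f"{f}: {dq * 100:+.1f}% consumption")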

  10. Modelling drivers and distribution of lead and zinc concentrations in soils of an urban catchment (Sydney estuary, Australia).

    Science.gov (United States)

    Johnson, L E; Bishop, T F A; Birch, G F

    2017-11-15

    The human population is increasing globally and land use is changing to accommodate this growth. Soils within urban areas require closer attention as the higher population density increases the chance of human exposure to urban contaminants. One such example of an urban area undergoing an increase in population density is Sydney, Australia. The city also possesses a notable history of intense industrial activity. By integrating multiple soil surveys and covariates into a linear mixed model, it was possible to determine the main drivers and map the distribution of lead and zinc concentrations within the Sydney estuary catchment. The main drivers as derived from the model included elevation, distance to main roads, main road type, soil landscape, population density (lead only) and land use (zinc only). Lead concentrations predicted using the model exceeded the established guideline value of 300 mg kg⁻¹ over a large portion of the study area, with concentrations exceeding 1000 mg kg⁻¹ in the south of the catchment. Predicted zinc did not exceed the established guideline value of 7400 mg kg⁻¹; however, concentrations were higher to the south and west of the study area. Unlike many other studies, we considered the prediction uncertainty when assessing the contamination risk. Although the predictions indicate contamination over a large area, the broadness of the prediction intervals suggests that in many of these areas we cannot be sure that a site is contaminated. More samples are required to determine the contaminant distribution with greater precision, especially in residential areas where contamination was highest. Managing sources and addressing areas of elevated lead and zinc concentrations in urban areas has the potential to reduce the impact of past human activities and improve the urban environment of the future.
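
    The uncertainty-aware classification step can be sketched as follows. Ordinary least squares stands in for the paper's linear mixed model, and the covariates, coefficients and noise level are synthetic; the point is how a prediction interval, rather than the prediction alone, decides whether a site counts as contaminated.

        import numpy as np

        rng = np.random.default_rng(10)

        # Synthetic stand-in: lead (mg/kg) vs two covariates (elevation, road distance).
        n = 150
        X = np.column_stack([np.ones(n), rng.uniform(0, 100, n), rng.uniform(0, 500, n)])
        lead = X @ np.array([400.0, -1.5, -0.4]) + rng.normal(0, 80, n)

        beta, *_ = np.linalg.lstsq(X, lead, rcond=None)
        resid = lead - X @ beta
        s2 = resid @ resid / (n - X.shape[1])
        XtX_inv = np.linalg.inv(X.T @ X)

        def classify(x_new, guideline=300.0, z=1.645):
            # 90% prediction interval for a new site, compared against the guideline.
            pred = x_new @ beta
            half = z * np.sqrt(s2 * (1.0 + x_new @ XtX_inv @ x_new))
            lo, hi = pred - half, pred + half
            if lo > guideline:
                return pred, (lo, hi), "contaminated"
            if hi < guideline:
                return pred, (lo, hi), "below guideline"
            return pred, (lo, hi), "uncertain - more samples needed"

        print(classify(np.array([1.0, 10.0, 50.0])))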

  11. A methodology to model flow-thermals inside a domestic gas oven

    International Nuclear Information System (INIS)

    Mistry, Hiteshkumar; Ganapathisubbu, S.; Dey, Subhrajit; Bishnoi, Peeush; Castillo, Jose Luis

    2011-01-01

    In this paper, the authors describe the development of a CFD-based methodology to evaluate the performance of a domestic gas oven. This involves modeling a three-dimensional, unsteady, forced convective flow field coupled with a radiatively participating medium. Various strategies for capturing transient heat transfer coupled with a mixed convection flow field are evaluated considering the trade-off between computational time and accuracy of predictions. A new technique for modeling the gas oven that does not require detailed modeling of flow-thermals through the burner is highlighted. Experiments carried out to support this modeling development show that heat transfer from burners can be represented as non-dimensional false bottom temperature profiles. Transient validation of this model against experiments shows less than 6% discrepancy in the thermal field during preheating in the bake cycle of the gas oven.

  12. Risk-adjusted capitation funding models for chronic disease in Australia: alternatives to casemix funding.

    Science.gov (United States)

    Antioch, K M; Walsh, M K

    2002-01-01

    Under Australian casemix funding arrangements that use Diagnosis-Related Groups (DRGs), the average price is policy based, not benchmarked. Cost weights are too low for State-wide chronic disease services. Risk-adjusted Capitation Funding Models (RACFMs) are feasible alternatives. A RACFM was developed for public patients with cystic fibrosis treated by an Australian Health Maintenance Organization (AHMO). Adverse selection is of limited concern since patients pay solidarity contributions via the Medicare levy with no premium contributions to the AHMO. Sponsors paying premium subsidies are the State of Victoria and the Federal Government. Cost per patient is the dependent variable in the multiple regression. Data on DRG 173 (cystic fibrosis) patients were assessed for heteroskedasticity, multicollinearity, structural stability and functional form. Stepwise linear regression excluded non-significant variables. Significant variables were 'emergency' (1276.9), 'outlier' (6377.1), 'complexity' (3043.5), 'procedures' (317.4) and the constant (4492.7) (R2=0.21, SE=3598.3, F=14.39). The model explained 21% of the variance in cost per patient. The payment rate is adjusted by a best practice annual admission rate per patient. The model is a blended RACFM for in-patient, out-patient, Hospital In The Home and Fee-For-Service Federal payments for drugs and medical services; lump sum lung transplant payments; and risk sharing through cost (loss) outlier payments. State and Federally funded home and palliative services are 'carved out'. The model, which has national application via Coordinated Care Trials and by Australian States for RACFMs, may be instructive for Germany, which plans to use Australian DRGs for casemix funding. The capitation alternative for chronic disease can improve equity, allocative efficiency and distributional justice. The use of Diagnostic Cost Groups (DCGs) is a promising alternative classification system for capitation arrangements.
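
    Using the published coefficients, the fitted regression can be written out directly; the 0/1 coding of the indicator variables is an assumption made for illustration.

        # Cost per patient from the reported stepwise regression (coefficients as
        # published; indicator coding assumed to be 0/1).
        def predicted_cost(emergency, outlier, complexity, procedures):
            return (4492.7                 # constant
                    + 1276.9 * emergency   # emergency admission indicator
                    + 6377.1 * outlier     # cost (loss) outlier indicator
                    + 3043.5 * complexity  # complexity indicator
                    + 317.4 * procedures)  # number of procedures

        # Example: complex emergency admission with three procedures, not an outlier.
        print(predicted_cost(emergency=1, outlier=0, complexity=1, procedures=3))  # 9765.3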

  13. Validation of a Methodology to Predict Micro-Vibrations Based on Finite Element Model Approach

    Science.gov (United States)

    Soula, Laurent; Rathband, Ian; Laduree, Gregory

    2014-06-01

    This paper presents the second part of the ESA R&D study called "METhodology for Analysis of structure-borne MICro-vibrations" (METAMIC). After defining an integrated analysis and test methodology to help predict micro-vibrations [1], a full-scale validation test campaign has been carried out. It is based on a bread-board representative of a typical spacecraft (S/C) platform, consisting of a versatile structure made of aluminium sandwich panels equipped with different disturbance sources and a dummy payload made of a silicon carbide (SiC) bench. The bread-board has been instrumented with a large set of sensitive accelerometers, and tests have been performed including background noise measurement, modal characterization and micro-vibration tests. The results provided responses to the perturbation coming from a reaction wheel or cryo-cooler compressors, operated independently and then simultaneously with different operation modes. Using consistent modelling and associated experimental characterization techniques, a correlation status has been assessed by comparing test results with predictions based on the FEM approach. Very good results have been achieved, particularly for the case of a wheel in sweeping rate operation, with test results over-predicted within a reasonable margin (a factor lower than two). Some limitations of the methodology have also been identified for sources operating at a fixed rate or with a small number of dominant harmonics, and recommendations have been issued in order to deal with model uncertainties and stay conservative.

  14. Methodological considerations for economic modelling of latent tuberculous infection screening in migrants.

    Science.gov (United States)

    Shedrawy, J; Siroka, A; Oxlade, O; Matteelli, A; Lönnroth, K

    2017-09-01

    Tuberculosis (TB) in migrants from endemic to low-incidence countries results mainly from the reactivation of latent tuberculous infection (LTBI). LTBI screening policies for migrants vary greatly between countries, and the evidence on the cost-effectiveness of the different approaches is weak and heterogeneous. The aim of this review was to assess the methodology used in published economic evaluations of LTBI screening among migrants to identify critical methodological options that must be considered when using modelling to determine value for money from different economic perspectives. Three electronic databases were searched and 10 articles were included. There was considerable variation across this small number of studies with regard to economic perspective, main outcomes, modelling technique, screening options and target populations considered, as well as in parameterisation of the epidemiological situation, test accuracy, efficacy, safety and programme performance. Only one study adopted a societal perspective; others adopted a health care or wider government perspective. Parameters representing the cascade of screening and treating LTBI varied widely, with some studies using highly aspirational scenarios. This review emphasises the need for a more harmonised approach for economic analysis, and better transparency in how policy options and economic perspectives influence methodological choices. Variability is justifiable for some parameters. However, sufficient data are available to standardise others. A societal perspective is ideal, but can be challenging due to limited data. Assumptions about programme performance should be based on empirical data or at least realistic assumptions. Results should be interpreted within specific contexts and policy options, with cautious generalisations.

  15. Methodology for Developing a Probabilistic Risk Assessment Model of Spacecraft Rendezvous and Dockings

    Science.gov (United States)

    Farnham, Steven J., II; Garza, Joel, Jr.; Castillo, Theresa M.; Lutomski, Michael

    2011-01-01

    In 2007 NASA was preparing to send two new visiting vehicles carrying logistics and propellant to the International Space Station (ISS). These new vehicles were the European Space Agency's (ESA) Automated Transfer Vehicle (ATV), the Jules Verne, and the Japan Aerospace Exploration Agency's (JAXA) H-II Transfer Vehicle (HTV). The ISS Program wanted to quantify the increased risk to the ISS from these visiting vehicles. At the time, only the Shuttle, the Soyuz, and the Progress vehicles rendezvoused and docked to the ISS. The increased risk to the ISS was from an increase in vehicle traffic, thereby increasing the potential for a catastrophic collision during the rendezvous and the docking or berthing of a spacecraft to the ISS. A universal method of evaluating the risk of rendezvous and docking or berthing was created by the ISS's Risk Team to accommodate the increasing number of rendezvous and docking or berthing operations due to the increasing number of different spacecraft, as well as the future arrival of commercial spacecraft. Before the first docking attempt of ESA's ATV and JAXA's HTV to the ISS, a probabilistic risk model was developed to quantitatively calculate the risk of collision of each spacecraft with the ISS. The 5 rendezvous and docking risk models (Soyuz, Progress, Shuttle, ATV, and HTV) have been used to build and refine the modeling methodology for rendezvous and docking of spacecraft. This risk modeling methodology will be NASA's basis for evaluating the hazards of adding future ISS visiting spacecraft, including SpaceX's Dragon, Orbital Sciences' Cygnus, and NASA's own Orion spacecraft. This paper will describe the methodology used for developing a visiting vehicle risk model.
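
    The traffic effect described above compounds multiplicatively across missions. A minimal sketch, assuming independent missions and placeholder per-mission collision probabilities (not NASA's quantified values):

        # Aggregate rendezvous/docking collision risk across visiting-vehicle traffic.
        traffic = {             # vehicle: (missions per year, P(collision) per mission)
            "Soyuz":    (4, 1e-4),
            "Progress": (4, 1e-4),
            "ATV":      (1, 2e-4),
            "HTV":      (1, 2e-4),
        }

        p_no_collision = 1.0
        for vehicle, (n, p) in traffic.items():
            p_no_collision *= (1.0 - p) ** n      # independence assumption

        print(f"P(at least one collision in a year) = {1.0 - p_no_collision:.2e}")

    Adding a new vehicle to the dictionary immediately shows how extra traffic raises the aggregate annual risk even when each individual mission is very safe.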

  16. [Methodological novelties applied to the anthropology of food: agent-based models and social networks analysis].

    Science.gov (United States)

    Díaz Córdova, Diego

    2016-01-01

    The aim of this article is to introduce two methodological strategies that have not often been utilized in the anthropology of food: agent-based models and social networks analysis. In order to illustrate these methods in action, two cases based in materials typical of the anthropology of food are presented. For the first strategy, fieldwork carried out in Quebrada de Humahuaca (province of Jujuy, Argentina) regarding meal recall was used, and for the second, elements of the concept of "domestic consumption strategies" applied by Aguirre were employed. The underlying idea is that, given that eating is recognized as a "total social fact" and, therefore, as a complex phenomenon, the methodological approach must also be characterized by complexity. The greater the number of methods utilized (with the appropriate rigor), the better able we will be to understand the dynamics of feeding in the social environment.

  17. A modelling methodology for assessing the impact of climate variability and climatic change on hydroelectric generation

    International Nuclear Information System (INIS)

    Munoz, J.R.; Sailor, D.J.

    1998-01-01

    A new methodology relating basic climatic variables to hydroelectric generation was developed. The methodology can be implemented in large or small basins with any number of hydro plants. The method was applied to the Sacramento, Eel and Russian river basins in northern California, where more than 100 hydroelectric plants are located. The final model predicts the availability of hydroelectric generation for the entire basin, given present and recent past climate conditions, with about 90% accuracy. The results can be used for water management purposes or for analyzing the effect of climate variability on hydrogeneration availability in the basin. A wide range of results can be obtained depending on the climate change scenario used. (Author)

  18. Mathematical Methodology for New Modeling of Water Hammer in Emergency Core Cooling System

    International Nuclear Information System (INIS)

    Lee, Seungchan; Yoon, Dukjoo; Ha, Sangjun

    2013-01-01

    From an engineering standpoint, the study of water hammer has been carried out through experimental work and fluid mechanics. In this study, a new analysis methodology is introduced based on Newtonian mechanics and a mathematical method. Also, NRC Generic Letter 2008-01 requires nuclear power plant operators to evaluate the effect of water hammer for the protection of pipes of the Emergency Core Cooling System, which is related to the Residual Heat Removal System and the Containment Spray System. This paper includes the modeling, the derivation of the mathematical equations, and the comparison with other experimental work. To analyze the effect of water hammer, this mathematical methodology is carried out. The results of this study are in good agreement with other experimental results. The method is very efficient for explaining water-hammer phenomena.

  19. Mathematical Methodology for New Modeling of Water Hammer in Emergency Core Cooling System

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seungchan; Yoon, Dukjoo; Ha, Sangjun [Korea Hydro Nuclear Power Co. Ltd, Daejeon (Korea, Republic of)

    2013-05-15

    From an engineering standpoint, the study of water hammer has been carried out through experimental work and fluid mechanics. In this study, a new analysis methodology is introduced based on Newtonian mechanics and a mathematical method. Also, NRC Generic Letter 2008-01 requires nuclear power plant operators to evaluate the effect of water hammer for the protection of pipes of the Emergency Core Cooling System, which is related to the Residual Heat Removal System and the Containment Spray System. This paper includes the modeling, the derivation of the mathematical equations, and the comparison with other experimental work. To analyze the effect of water hammer, this mathematical methodology is carried out. The results of this study are in good agreement with other experimental results. The method is very efficient for explaining water-hammer phenomena.
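
    The paper's own derivation is not reproduced in the abstract; the classical benchmark any water-hammer analysis is compared against is the Joukowsky relation ΔP = ρ·a·Δv. A short sketch with the standard thin-walled-pipe wave-speed correction and illustrative ECCS-like values (all inputs assumed):

        import math

        def joukowsky_surge(rho, K, E, D, e, dv):
            # Joukowsky estimate dP = rho * a * dv, with the pressure-wave speed a
            # reduced by pipe-wall elasticity (thin-walled pipe formula).
            a = math.sqrt((K / rho) / (1.0 + (K * D) / (E * e)))  # wave speed, m/s
            return a, rho * a * dv                                 # surge, Pa

        # Water in a steel pipe: 2 m/s flow arrested rapidly (e.g. void collapse).
        a, dP = joukowsky_surge(rho=1000.0,  # water density, kg/m^3
                                K=2.2e9,     # bulk modulus of water, Pa
                                E=2.0e11,    # Young's modulus of steel, Pa
                                D=0.3,       # pipe diameter, m
                                e=0.01,      # wall thickness, m
                                dv=2.0)      # arrested velocity, m/s
        print(f"wave speed ~{a:.0f} m/s, surge ~{dP / 1e6:.1f} MPa")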

  20. Modeling of Throughput in Production Lines Using Response Surface Methodology and Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Federico Nuñez-Piña

    2018-01-01

    Full Text Available The problem of assigning buffers in a production line to obtain an optimum production rate is a combinatorial problem of type NP-hard, known as the Buffer Allocation Problem. It is of great importance for designers of production systems due to the costs involved in terms of space requirements. In this work, the relationship among the number of buffer slots, the number of work stations, and the production rate is studied. Response surface methodology and an artificial neural network were used to develop predictive models to find optimal throughput values. 360 production rate values for different numbers of buffer slots and workstations were used to obtain a fourth-order mathematical model and a four-hidden-layer artificial neural network. Both models perform well in predicting the throughput, although the artificial neural network model shows a better fit (R=1.0000) than the response surface methodology (R=0.9996). Moreover, the artificial neural network produces better predictions for data not utilized in the models' construction. Finally, this study can be used as a guide to forecast the maximum or near-maximum throughput of production lines taking into account the buffer size and the number of machines in the line.

  1. A new methodology for modeling of direct landslide costs for transportation infrastructures

    Science.gov (United States)

    Klose, Martin; Terhorst, Birgit

    2014-05-01

    The world's transportation infrastructure is at risk of landslides in many areas across the globe. Safe and affordable operation of traffic routes are the two main criteria for transportation planning in landslide-prone areas. The right balancing of these often conflicting priorities requires, amongst others, profound knowledge of the direct costs of landslide damage. These costs include capital investments for landslide repair and mitigation as well as operational expenditures for first response and maintenance works. This contribution presents a new methodology for ex post assessment of direct landslide costs for transportation infrastructures. The methodology includes tools to compile, model, and extrapolate landslide losses on different spatial scales over time. A landslide susceptibility model enables regional cost extrapolation by means of a cost figure obtained from local cost compilation for representative case study areas. On the local level, cost survey is closely linked with cost modeling, a toolset for cost estimation based on landslide databases. Cost modeling uses Landslide Disaster Management Process Models (LDMMs) and cost modules to simulate and monetize cost factors for certain types of landslide damage. The landslide susceptibility model provides a regional exposure index and updates the cost figure to a cost index which describes the costs per km of traffic route at risk of landslides. Both indexes enable the regionalization of local landslide losses. The methodology is applied and tested in a cost assessment for highways in the Lower Saxon Uplands, NW Germany, in the period 1980 to 2010. The basis of this research is a regional subset of a landslide database for the Federal Republic of Germany. In the 7,000 km² large Lower Saxon Uplands, 77 km of highway are located in potential landslide hazard area. Annual average costs of 52k per km of highway at risk of landslides are identified as cost index for a local case study area in this region. The
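
    The regional extrapolation step reduces to scaling the local cost index by the exposed network length, using the two figures reported above (the currency unit is elided in the record, so it is left unstated here):

        # Regional extrapolation: local cost index (annual average cost per km of
        # highway at risk) x regionally exposed highway length from the
        # susceptibility model.
        cost_index_per_km = 52_000    # annual average cost per km at risk
        exposed_highway_km = 77       # highway length in potential landslide hazard area

        regional_annual_cost = cost_index_per_km * exposed_highway_km
        print(f"estimated regional direct landslide cost: "
              f"{regional_annual_cost / 1e6:.1f} M per year")   # ~4.0 M per year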

  2. Design of an Emulsion-based Personal Detergent through a Model-based Chemical Product Design Methodology

    DEFF Research Database (Denmark)

    Mattei, Michele; Hill, Michael; Kontogeorgis, Georgios

    2013-01-01

    An extended systematic methodology for the design of emulsion-based Chemical products is presented. The methodology consists of a model-based framework involving seven sequential hierarchical steps: starting with the identification of the needs to be satisfied by the product and then adding one-b...... to obtain one or more candidate formulations. A conceptual casestudy representing a personal detergent is presented to highlight the methodology....

  3. Design of an Emulsion-based Personal Detergent through a Model-based Chemical Product Design Methodology

    DEFF Research Database (Denmark)

    Mattei, Michele; Hill, Michael; Kontogeorgis, Georgios

    2013-01-01

    An extended systematic methodology for the design of emulsion-based Chemical products is presented. The methodology consists of a model-based framework involving seven sequential hierarchical steps: starting with the identification of the needs to be satisfied by the product and then adding one...... to obtain one or more candidate formulations. A conceptual casestudy representing a personal detergent is presented to highlight the methodology....

  4. Model checking methodology for large systems, faults and asynchronous behaviour. SARANA 2011 work report

    International Nuclear Information System (INIS)

    Lahtinen, J.; Launiainen, T.; Heljanko, K.; Ropponen, J.

    2012-01-01

    Digital instrumentation and control (I and C) systems are challenging to verify. They enable complicated control functions, and the state spaces of the models easily become too large for comprehensive verification through traditional methods. Model checking is a formal method that can be used for system verification. A number of efficient model checking systems are available that provide analysis tools to determine automatically whether a given state machine model satisfies the desired safety properties. This report reviews the work performed in the Safety Evaluation and Reliability Analysis of Nuclear Automation (SARANA) project in 2011 regarding model checking. We have developed new, more exact modelling methods that are able to capture the behaviour of a system more realistically. In particular, we have developed more detailed fault models depicting the hardware configuration of a system, and methodology to model function-block-based systems asynchronously. In order to improve the usability of our model checking methods, we have developed an algorithm for model checking large modular systems. The algorithm can be used to verify properties of a model that could otherwise not be verified in a straightforward manner. (orig.)

  5. Model checking methodology for large systems, faults and asynchronous behaviour. SARANA 2011 work report

    Energy Technology Data Exchange (ETDEWEB)

    Lahtinen, J. [VTT Technical Research Centre of Finland, Espoo (Finland); Launiainen, T.; Heljanko, K.; Ropponen, J. [Aalto Univ., Espoo (Finland). Dept. of Information and Computer Science

    2012-07-01

    Digital instrumentation and control (I and C) systems are challenging to verify. They enable complicated control functions, and the state spaces of the models easily become too large for comprehensive verification through traditional methods. Model checking is a formal method that can be used for system verification. A number of efficient model checking systems are available that provide analysis tools to determine automatically whether a given state machine model satisfies the desired safety properties. This report reviews the work performed in the Safety Evaluation and Reliability Analysis of Nuclear Automation (SARANA) project in 2011 regarding model checking. We have developed new, more exact modelling methods that are able to capture the behaviour of a system more realistically. In particular, we have developed more detailed fault models depicting the hardware configuration of a system, and methodology to model function-block-based systems asynchronously. In order to improve the usability of our model checking methods, we have developed an algorithm for model checking large modular systems. The algorithm can be used to verify properties of a model that could otherwise not be verified in a straightforward manner. (orig.)
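
    The core loop behind any explicit-state model checker is an exhaustive exploration of reachable states against a safety property. The toy two-component state machine below is an illustrative stand-in, not the SARANA tooling; its deliberately unguarded pump toggle lets the check surface a counterexample, which is exactly how model checking exposes unsafe reachable states.

        from collections import deque

        def successors(state):
            level, pump = state                  # tank level 0..2, pump on/off
            return [
                (max(level - 1, 0) if pump else level, pump),  # pumping drains tank
                (min(level + 1, 2), pump),                     # inflow refills tank
                (level, not pump),               # controller may toggle the pump
            ]

        def invariant(state):
            level, pump = state
            return not (pump and level == 0)     # safety: never pump an empty tank

        init = (2, False)
        seen, queue = {init}, deque([init])      # breadth-first state exploration
        while queue:
            s = queue.popleft()
            if not invariant(s):
                print("counterexample state reached:", s)
                break
            for nxt in successors(s):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        else:
            print(f"invariant holds over all {len(seen)} reachable states")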

  6. On interception modelling of a lowland coastal rainforest in northern Queensland, Australia

    Science.gov (United States)

    Wallace, Jim; McJannet, Dave

    2006-10-01

    Summary: Recent studies of the water balance of tropical rainforests in northern Queensland have revealed that large fractions of rainfall, up to 30%, are intercepted by the canopy and lost as evaporation. These loss rates are much higher than those reported for continental rainforests, for example, in the Amazon basin, where interception is around 9% of rainfall. Higher interception losses have been found in coastal and mountain rainforests, and substantial advection of energy during rainfall is proposed to account for these results. This paper uses a process-based model of interception to analyse the interception losses at Oliver Creek, a lowland coastal rainforest site in northern Queensland with a mean annual rainfall of 3952 mm. The observed interception loss of 25% of rainfall for the period August 2001 to January 2004 can be reproduced by the model with a suitable choice of the three key controlling variables: the canopy storage capacity, the mean rainfall rate and the mean wet-canopy evaporation rate. Our analyses suggest that the canopy storage capacity of the Oliver Creek rainforest is between 3.0 and 3.5 mm, higher than reported for most other rainforests. Despite the high canopy capacity at our site, the interception losses can only be accounted for with energy advection during rainfall in the range of 40-70% of the incident energy.
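
    A minimal running canopy water balance (a Rutter-type sketch, simpler than the paper's model) shows how the three controlling variables interact. The storage capacity default is taken from the range inferred above; the wet-canopy evaporation rate, drainage rule and synthetic rainfall series are assumptions.

        import numpy as np

        def interception_fraction(rain_mm_h, S=3.25, e_wet=0.5):
            # Hour-by-hour canopy water balance: storage C is filled by rain,
            # evaporates at e_wet scaled by the wetted fraction C/S, and spills
            # as throughfall above the capacity S.
            C = evap = 0.0
            for r in rain_mm_h:
                C += r
                e = min(C, e_wet * min(C / S, 1.0))  # wet-canopy evaporation, mm/h
                C -= e
                evap += e
                if C > S:                            # canopy full: excess drains
                    C = S
            total = float(np.sum(rain_mm_h))
            return evap / total if total else 0.0

        rng = np.random.default_rng(3)
        rain = rng.gamma(0.6, 4.0, 2000) * (rng.random(2000) < 0.25)  # spiky rainfall
        print(f"modelled interception loss: {interception_fraction(rain):.0%} of rainfall")

    Raising e_wet, which stands in for advection-enhanced wet-canopy evaporation, is what pushes the modelled loss fraction toward the high values observed at coastal sites.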

  7. An isotopic and modelling study of flow paths and storage in Quaternary calcarenite, SW Australia: implications for speleothem paleoclimate records

    Science.gov (United States)

    Treble, Pauline C.; Bradley, Chris; Wood, Anne; Baker, Andy; Jex, Catherine N.; Fairchild, Ian J.; Gagan, Michael K.; Cowley, Joan; Azcurra, Cecilia

    2013-03-01

    We investigated the distinctive shallow sub-surface hydrology of the southwest Western Australia (SWWA) dune calcarenite using observed rainfall and rainfall δ18O, soil moisture, cave drip rate and dripwater δ18O over a six-year period (August 2005-March 2012). A lumped parameter hydrological model is developed to describe water fluxes and drip δ18O. Comparison of observed data and model output allows us to assess the critical non-climatic karst hydrological processes that modify the precipitation δ18O signal, and to discuss the implications for speleothem paleoclimate records from this cave and those with a similar karst setting. Our findings include evidence of multiple reservoirs, characterised by distinct δ18O values and recharge responses ('low' and 'high' flow sites). Dripwaters exhibit δ18O variations in wet versus dry years at low-flow sites receiving diffuse seepage from the epikarst, with an attenuated isotopic composition that approximates mean rainfall. Recharge from high-magnitude rain events is stored in a secondary reservoir, which is associated with high-flow dripwater that is 1‰ lower in δ18O than our monitored low-flow sites. One drip site is characterised by mixed-flow behaviour and exhibits a non-linear threshold response after the cessation of drainage from a secondary reservoir following a record dry year (2006). Additionally, our results yield a better understanding of the vadose zone hydrology and dripwater characteristics in Quaternary-age dune limestones. We show that flow to our monitored sites is dominated by diffuse flow, with inferred transit times of less than one year. Diffuse flow appears to follow vertical preferential paths through the limestone, reflecting differences in permeability and deep recharge into the host rock.

  8. 3rd International Conference on Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Koziel, Slawomir; Kacprzyk, Janusz; Leifsson, Leifur; Ören, Tuncer

    2015-01-01

    This book includes extended and revised versions of a set of selected papers from the 3rd International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2013), which was co-organized by Reykjavik University (RU) and sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC). SIMULTECH 2013 was held in cooperation with the ACM SIGSIM - Special Interest Group (SIG) on SImulation and Modeling (SIM), Movimento Italiano Modellazione e Simulazione (MIMOS) and the AIS Special Interest Group on Modeling and Simulation (AIS SIGMAS), and technically co-sponsored by the Society for Modeling & Simulation International (SCS), Liophant Simulation, Simulation Team and the International Federation for Information Processing (IFIP). These proceedings bring together researchers, engineers, applied mathematicians and practitioners working on advances and applications in the field of system simulation.

  9. Power Prediction Model for Turning EN-31 Steel Using Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    M. Hameedullah

    2010-01-01

    Full Text Available Power consumption in turning EN-31 steel (a material that is most extensively used in the automotive industry) with a tungsten carbide tool under different cutting conditions was experimentally investigated. The experimental runs were planned according to a 2⁴+8 added centre point factorial design of experiments, replicated thrice. The data collected were statistically analyzed using the Analysis of Variance technique, and first-order and second-order power consumption prediction models were developed using response surface methodology (RSM). It is concluded that the second-order model is more accurate than the first-order model and fits well with the experimental data. The model can be used in the automotive industry for deciding the cutting parameters for minimum power consumption and hence maximum productivity.
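
    The first-order versus second-order comparison can be reproduced in miniature by fitting both response surfaces to the same data; the two-factor synthetic experiment below is a stand-in for the paper's four-factor design with centre points.

        import numpy as np

        rng = np.random.default_rng(4)

        # Synthetic turning experiment in coded units: power vs speed and feed.
        n = 24
        speed = rng.uniform(-1.0, 1.0, n)
        feed = rng.uniform(-1.0, 1.0, n)
        power = (2.0 + 0.9 * speed + 0.6 * feed + 0.35 * speed * feed
                 + 0.25 * speed**2 + rng.normal(0.0, 0.05, n))

        def rsm_design(x1, x2, order):
            cols = [np.ones_like(x1), x1, x2]
            if order == 2:                       # add interaction and quadratic terms
                cols += [x1 * x2, x1**2, x2**2]
            return np.column_stack(cols)

        for order in (1, 2):
            X = rsm_design(speed, feed, order)
            b, *_ = np.linalg.lstsq(X, power, rcond=None)
            resid = power - X @ b
            r2 = 1.0 - resid @ resid / np.sum((power - power.mean())**2)
            print(f"order {order} response surface: R^2 = {r2:.4f}")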

  10. Topobathymetric elevation model development using a new methodology: Coastal National Elevation Database

    Science.gov (United States)

    Danielson, Jeffrey J.; Poppenga, Sandra K.; Brock, John C.; Evans, Gayla A.; Tyler, Dean; Gesch, Dean B.; Thatcher, Cindy A.; Barras, John

    2016-01-01

    During the coming decades, coastlines will respond to widely predicted sea-level rise, storm surge, and coastal inundation flooding from disastrous events. Because physical processes in coastal environments are controlled by the geomorphology of over-the-land topography and underwater bathymetry, many applications of geospatial data in coastal environments require detailed knowledge of the near-shore topography and bathymetry. In this paper, an updated methodology used by the U.S. Geological Survey Coastal National Elevation Database (CoNED) Applications Project is presented for developing coastal topobathymetric elevation models (TBDEMs) from multiple topographic data sources with adjacent intertidal topobathymetric and offshore bathymetric sources to generate seamlessly integrated TBDEMs. This repeatable, updatable, and logically consistent methodology assimilates topographic data (land elevation) and bathymetry (water depth) into a seamless coastal elevation model. Within the overarching framework, vertical datum transformations are standardized in a workflow that interweaves spatially consistent interpolation (gridding) techniques with a land/water boundary mask delineation approach. Output gridded raster TBDEMs are stacked into a file storage system of mosaic datasets within an Esri ArcGIS geodatabase for efficient updating while maintaining current and updated spatially referenced metadata. Topobathymetric data provide a required seamless elevation product for several science application studies, such as shoreline delineation, coastal inundation mapping, sediment transport, sea-level rise, storm surge models, and tsunami impact assessment. These detailed coastal elevation data are critical to depict regions prone to climate change impacts and are essential to planners and managers responsible for mitigating the associated risks and costs to both human communities and ecosystems. The CoNED methodology approach has been used to construct integrated TBDEM models.
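
    The merge step at the heart of the workflow can be sketched as follows: a land DEM and a bathymetric grid are shifted to a common vertical datum and combined under a land/water mask. The grids and datum offsets are synthetic stand-ins; real workflows use modelled datum separation surfaces rather than constant shifts.

        import numpy as np

        shape = (50, 50)
        land = np.full(shape, np.nan)
        land[:, 25:] = 2.0 + np.random.default_rng(11).random((50, 25))    # land, m
        bathy = np.full(shape, np.nan)
        bathy[:, :27] = -4.0 * np.random.default_rng(12).random((50, 27))  # depths, m

        LAND_TO_COMMON = -0.3    # assumed orthometric -> common datum shift, m
        BATHY_TO_COMMON = +0.5   # assumed tidal datum -> common datum shift, m

        # Where sources overlap near the shoreline, prefer the topographic source.
        tbdem = np.where(~np.isnan(land), land + LAND_TO_COMMON,
                         bathy + BATHY_TO_COMMON)
        print(f"seamless grid covers {np.isfinite(tbdem).mean():.0%} of cells")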

  11. Efficient methodologies for system matrix modelling in iterative image reconstruction for rotating high-resolution PET

    Energy Technology Data Exchange (ETDEWEB)

    Ortuno, J E; Kontaxakis, G; Rubio, J L; Santos, A [Departamento de Ingenieria Electronica (DIE), Universidad Politecnica de Madrid, Ciudad Universitaria s/n, 28040 Madrid (Spain); Guerra, P [Networking Research Center on Bioengineering, Biomaterials and Nanomedicine (CIBER-BBN), Madrid (Spain)], E-mail: juanen@die.upm.es

    2010-04-07

    A fully 3D iterative image reconstruction algorithm has been developed for high-resolution PET cameras composed of pixelated scintillator crystal arrays and rotating planar detectors, based on the ordered subsets approach. The associated system matrix is precalculated with Monte Carlo methods that incorporate physical effects not included in analytical models, such as positron range effects and interaction of the incident gammas with the scintillator material. Custom Monte Carlo methodologies have been developed and optimized for modelling of system matrices for fast iterative image reconstruction adapted to specific scanner geometries, without redundant calculations. According to the methodology proposed here, only one-eighth of the voxels within two central transaxial slices need to be modelled in detail. The rest of the system matrix elements can be obtained with the aid of axial symmetries and redundancies, as well as in-plane symmetries within transaxial slices. Sparse matrix techniques for the non-zero system matrix elements are employed, allowing for fast execution of the image reconstruction process. This 3D image reconstruction scheme has been compared in terms of image quality to a 2D fast implementation of the OSEM algorithm combined with Fourier rebinning approaches. This work confirms the superiority of fully 3D OSEM in terms of spatial resolution, contrast recovery and noise reduction as compared to conventional 2D approaches based on rebinning schemes. At the same time it demonstrates that fully 3D methodologies can be efficiently applied to the image reconstruction problem for high-resolution rotational PET cameras by applying accurate pre-calculated system models and taking advantage of the system's symmetries.
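
    Once the system matrix is available in sparse form, the reconstruction itself reduces to sparse matrix-vector products. A minimal OSEM sketch, with a random sparse system matrix standing in for the Monte Carlo-precalculated, symmetry-expanded one:

        import numpy as np
        from scipy.sparse import random as sparse_random

        rng = np.random.default_rng(5)

        # Toy sparse system matrix A (LORs x voxels); in practice precalculated by
        # Monte Carlo for a fraction of central voxels and expanded by symmetry.
        n_lor, n_vox, n_subsets = 600, 256, 4
        A = sparse_random(n_lor, n_vox, density=0.05, random_state=7, format="csr")
        x_true = rng.gamma(2.0, 1.0, n_vox)
        y = rng.poisson(A @ x_true)                  # measured coincidences

        x = np.ones(n_vox)                           # OSEM iterations
        for _ in range(10):
            for s in range(n_subsets):
                rows = slice(s, n_lor, n_subsets)    # interleaved ordered subsets
                As = A[rows]
                proj = As @ x                        # forward projection
                ratio = np.divide(y[rows], proj, out=np.zeros_like(proj),
                                  where=proj > 0)
                sens = np.asarray(As.sum(axis=0)).ravel()   # subset sensitivity
                x *= np.divide(As.T @ ratio, sens, out=np.zeros_like(x),
                               where=sens > 0)       # multiplicative update

        print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))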

  12. Phoenix – A model-based Human Reliability Analysis methodology: Qualitative Analysis Procedure

    International Nuclear Information System (INIS)

    Ekanem, Nsimah J.; Mosleh, Ali; Shen, Song-Hua

    2016-01-01

    Phoenix method is an attempt to address various issues in the field of Human Reliability Analysis (HRA). Built on a cognitive human response model, Phoenix incorporates strong elements of current HRA good practices, leverages lessons learned from empirical studies, and takes advantage of the best features of existing and emerging HRA methods. Its original framework was introduced in previous publications. This paper reports on the completed methodology, summarizing the steps and techniques of its qualitative analysis phase. The methodology introduces the “Crew Response Tree” which provides a structure for capturing the context associated with Human Failure Events (HFEs), including errors of omission and commission. It also uses a team-centered version of the Information, Decision and Action cognitive model and “macro-cognitive” abstractions of crew behavior, as well as relevant findings from cognitive psychology literature and operating experience, to identify potential causes of failures and influencing factors during procedure-driven and knowledge-supported crew-plant interactions. The result is the set of identified HFEs and likely scenarios leading to each. The methodology itself is generic in the sense that it is compatible with various quantification methods, and can be adapted for use across different environments including nuclear, oil and gas, aerospace, aviation, and healthcare. - Highlights: • Produces a detailed, consistent, traceable, reproducible and properly documented HRA. • Uses “Crew Response Tree” to capture context associated with Human Failure Events. • Models dependencies between Human Failure Events and influencing factors. • Provides a human performance model for relating context to performance. • Provides a framework for relating Crew Failure Modes to its influencing factors.

  13. Tissue banking in Australia.

    Science.gov (United States)

    Ireland, Lynette; McKelvie, Helen

    2003-01-01

    The legal structure for the regulation of tissue banking has existed for many years. In Australia, the donation of human tissue is regulated by legislation in each of the eight States and Territories. These substantially uniform Acts were passed in the late 1970s and early 1980s, based on model legislation and underpinned by the concept of consensual giving. However, it was not until the early 1990s that tissue banking came under the notice of regulatory authorities. Since then the Australian Government has moved quickly to oversee the tissue banking sector in Australia. Banked human tissue has been deemed to be a therapeutic good under the Therapeutic Goods Act 1989, and tissue banks are required to be licensed by the Therapeutic Goods Administration and are audited for compliance with the Code of Good Manufacturing Practice - Human Blood and Tissues. In addition, tissue banks must comply with a myriad of other standards, guidelines and recommendations.

  14. A geostatistical methodology to assess the accuracy of unsaturated flow models

    International Nuclear Information System (INIS)

    Smoot, J.L.; Williams, R.E.

    1996-04-01

    The Pacific Northwest National Laboratory (PNNL) has developed a Hydrologic Evaluation Methodology (HEM) to assist the U.S. Nuclear Regulatory Commission in evaluating the potential that infiltrating meteoric water will produce leachate at commercial low-level radioactive waste disposal sites. Two key issues are raised in the HEM: (1) evaluation of mathematical models that predict facility performance, and (2) estimation of the uncertainty associated with these mathematical model predictions. The technical objective of this research is to adapt geostatistical tools commonly used for model parameter estimation to the problem of estimating the spatial distribution of the dependent variable to be calculated by the model. To fulfill this objective, a database describing the spatiotemporal movement of water injected into unsaturated sediments at the Hanford Site in Washington State was used to develop a new method for evaluating mathematical model predictions. Measured water content data were interpolated geostatistically to a 16 x 16 x 36 grid at several time intervals. Then a mathematical model was used to predict water content at the same grid locations at the selected times. Node-by-node comparison of the mathematical model predictions with the geostatistically interpolated values was conducted. The method facilitates a complete accounting and categorization of model error at every node. The comparison suggests that model results generally are within measurement error. The worst model error occurs in silt lenses and is in excess of measurement error.

  15. A geostatistical methodology to assess the accuracy of unsaturated flow models

    Energy Technology Data Exchange (ETDEWEB)

    Smoot, J.L.; Williams, R.E.

    1996-04-01

    The Pacific Northwest National Laboratory (PNNL) has developed a Hydrologic Evaluation Methodology (HEM) to assist the U.S. Nuclear Regulatory Commission in evaluating the potential that infiltrating meteoric water will produce leachate at commercial low-level radioactive waste disposal sites. Two key issues are raised in the HEM: (1) evaluation of mathematical models that predict facility performance, and (2) estimation of the uncertainty associated with these mathematical model predictions. The technical objective of this research is to adapt geostatistical tools commonly used for model parameter estimation to the problem of estimating the spatial distribution of the dependent variable to be calculated by the model. To fulfill this objective, a database describing the spatiotemporal movement of water injected into unsaturated sediments at the Hanford Site in Washington State was used to develop a new method for evaluating mathematical model predictions. Measured water content data were interpolated geostatistically to a 16 x 16 x 36 grid at several time intervals. Then a mathematical model was used to predict water content at the same grid locations at the selected times. Node-by-node comparison of the mathematical model predictions with the geostatistically interpolated values was conducted. The method facilitates a complete accounting and categorization of model error at every node. The comparison suggests that model results generally are within measurement error. The worst model error occurs in silt lenses and is in excess of measurement error.
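
    The node-by-node comparison can be sketched directly on a 16 x 16 x 36 grid; the two fields, the measurement error bound and the "silt lens" bias are synthetic stand-ins for the Hanford data.

        import numpy as np

        rng = np.random.default_rng(6)

        # Water content fields: geostatistically interpolated "observed" grid vs
        # model prediction (synthetic stand-ins).
        shape = (16, 16, 36)
        observed = 0.25 + 0.05 * rng.standard_normal(shape)
        model = observed + 0.01 * rng.standard_normal(shape)  # mostly small error
        model[:, :, 20:24] += 0.04                            # a "silt lens" bias

        measurement_error = 0.02            # assumed +/- volumetric water content
        diff = model - observed
        within = np.abs(diff) <= measurement_error

        print(f"nodes within measurement error: {within.mean():.1%}")
        print(f"worst error: {np.abs(diff).max():.3f}; exceedance in lens layers: "
              f"{(np.abs(diff) > measurement_error)[:, :, 20:24].mean():.0%}")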

  16. DATA MINING METHODOLOGY FOR DETERMINING THE OPTIMAL MODEL OF COST PREDICTION IN SHIP INTERIM PRODUCT ASSEMBLY

    Directory of Open Access Journals (Sweden)

    Damir Kolich

    2016-03-01

    Full Text Available In order to accurately predict the costs of the thousands of interim products that are assembled in shipyards, it is necessary to use skilled engineers to develop detailed Gantt charts for each interim product separately, which takes many hours. It is helpful to develop a prediction tool to estimate the cost of interim products accurately and quickly without the need for skilled engineers. This will drive down shipyard costs and improve competitiveness. Data mining is used extensively for developing prediction models in other industries. Since ships consist of thousands of interim products, it is logical to develop a data mining methodology for a shipyard or any other manufacturing industry where interim products are produced. The methodology involves analysis of existing interim products and data collection. Pre-processing and principal component analysis are carried out to make the data "user-friendly" for later prediction processing and the development of both accurate and robust models. The support vector machine is demonstrated to be the better model when there is a lower number of tuples. However, as the number of tuples increases to over 10,000, the artificial neural network model is recommended.
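
    The SVM-versus-ANN comparison is straightforward to reproduce with standard tooling; the features, cost function and sample size below are synthetic stand-ins for the shipyard's Gantt-chart-derived data.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVR

        rng = np.random.default_rng(8)

        # Synthetic interim-product features (e.g. weld length, part count, area)
        # and assembly cost.
        n = 2000
        X = rng.uniform(0, 1, (n, 3))
        cost = (50 + 120 * X[:, 0] + 40 * X[:, 1]**2 + 30 * X[:, 0] * X[:, 2]
                + rng.normal(0, 3, n))

        X_tr, X_te, y_tr, y_te = train_test_split(X, cost, test_size=0.25,
                                                  random_state=0)
        models = {
            "SVR": make_pipeline(StandardScaler(), SVR(C=100.0, epsilon=0.5)),
            "ANN": make_pipeline(StandardScaler(),
                                 MLPRegressor(hidden_layer_sizes=(32, 32),
                                              max_iter=2000, random_state=0)),
        }
        for name, m in models.items():
            m.fit(X_tr, y_tr)
            print(name, "R^2 on held-out products:", round(m.score(X_te, y_te), 4))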

  17. Methodology of synchronization among strategy and operation. A standards-based modeling approach

    Directory of Open Access Journals (Sweden)

    VICTOR EDWIN COLLAZOS

    2017-05-01

    Full Text Available Enterprise Architecture (EA) has gained importance in recent years, mainly for its concept of "alignment" between the strategic and operational levels of organizations. Such alignment occurs when Information Technology (IT) is applied correctly and timely, working in synergy and harmony with strategy and the operation to achieve mutually their own goals and satisfy the organizational needs. Both the strategic and operational levels have standards that help model the elements necessary to obtain desired results. In this sense, BMM and BPMN were selected because both have the support of the OMG and are fairly well known for modelling the strategic level and the operational level, respectively. In addition, i* goal modeling can be used to reduce the gap between these two standards. This proposal may contribute both to the high-level design of the information system and to the appropriate identification of the business processes that will support it. This paper presents a methodology for aligning strategy and the operation based on standards and heuristics. We have made a classification of the elements of the models and, for some specific cases, an extension of the heuristics associated between them. This allows us to propose a methodology which uses the above-mentioned standards and combines mappings, transformations and actions to be considered in the alignment process.

  18. Methodology for identifying parameters for the TRNSYS model Type 210 - wood pellet stoves and boilers

    Energy Technology Data Exchange (ETDEWEB)

    Persson, Tomas; Fiedler, Frank; Nordlander, Svante

    2006-05-15

    This report describes a method for performing measurements on boilers and stoves and for identifying parameters from the measurements for the boiler/stove model TRNSYS Type 210. The model can be used for detailed annual system simulations using TRNSYS. Experience from measurements on three different pellet stoves and four boilers was used to develop this methodology. Recommendations for the measurement set-up are given, together with the combustion theory required for data evaluation and preparation. The data evaluation showed that the uncertainties are quite large for the measured flue gas flow rate; for boilers and stoves with a high fraction of energy going to the water jacket, the calculated heat rate to the room may also have large uncertainties. A methodology for the parameter identification process is described, and identified parameters for two different stoves and three boilers are given. Finally, the identified models are compared with measured data, showing that the model generally agrees well with measurements during both stationary and dynamic conditions.
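
    The identification step can be sketched with a deliberately simplified one-node boiler model fitted by least squares; Type 210 itself has many more parameters, so the efficiency eta, the loss coefficient UA and the thermal mass C below are illustrative stand-ins.

        import numpy as np
        from scipy.optimize import curve_fit

        DT, T_ROOM, P_FUEL = 60.0, 20.0, 10e3      # time step (s), deg C, burner W

        def simulate(t, eta, ua, c_e5):
            # One-node water jacket: C*dT/dt = eta*P_fuel - UA*(T - T_room),
            # explicit Euler. c_e5 is the thermal mass in units of 1e5 J/K so all
            # three parameters have comparable magnitude for the optimizer.
            C = c_e5 * 1e5
            T = np.empty(len(t))
            T[0] = T_ROOM
            for k in range(1, len(t)):
                T[k] = T[k - 1] + (eta * P_FUEL - ua * (T[k - 1] - T_ROOM)) / C * DT
            return T

        t = np.arange(0.0, 4 * 3600.0, DT)
        rng = np.random.default_rng(9)
        measured = simulate(t, 0.85, 15.0, 5.0) + rng.normal(0.0, 0.2, len(t))

        popt, pcov = curve_fit(simulate, t, measured, p0=[0.7, 10.0, 3.0])
        for name, val, err in zip(("eta", "UA", "C/1e5"), popt, np.sqrt(np.diag(pcov))):
            print(f"{name}: {val:.3f} +/- {err:.3f}")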

  19. Oil Well Blowout 3D computational modeling: review of methodology and environmental requirements

    Directory of Open Access Journals (Sweden)

    Pedro Mello Paiva

    2016-12-01

    Full Text Available This literature review aims to present the different methodologies used in three-dimensional modeling of hydrocarbon dispersion originating from an oil well blowout. It presents the concepts of coastal environmental sensitivity and vulnerability, their importance for prioritizing the most vulnerable areas in a contingency, and the relevant legislation. We also discuss some limitations of the methodology currently used in environmental studies of oil drift, which simplifies the spill as a surface release even in the well blowout scenario. Efforts to better understand the behavior of oil and gas in the water column, and to model the trajectory three-dimensionally, gained strength after the Deepwater Horizon spill in 2010 in the Gulf of Mexico. The data collected and the observations made during the accident were widely used for adjustment of the models, incorporating various factors related to hydrodynamic forcing and the weathering processes to which the hydrocarbons are subjected during subsurface leaks. The difficulties prove to be even more challenging in the case of blowouts in deep waters, where the uncertainties are still larger. The studies addressed different variables to adjust oil and gas dispersion models along the upward trajectory. Factors that exert strong influences include: the speed of subsurface currents; gas separation from the main plume; hydrate formation; dissolution of oil and gas droplets; variations in droplet diameter; intrusion of the droplets at intermediate depths; biodegradation; and appropriate parametrization of the density, salinity and temperature profiles of the water column.

  20. Probabilistic risk assessment modeling of digital instrumentation and control systems using two dynamic methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Aldemir, T., E-mail: aldemir.1@osu.ed [Ohio State University, Nuclear Engineering Program, Columbus, OH 43210 (United States); Guarro, S. [ASCA, Inc., 1720 S. Catalina Avenue, Suite 220, Redondo Beach, CA 90277-5501 (United States); Mandelli, D. [Ohio State University, Nuclear Engineering Program, Columbus, OH 43210 (United States); Kirschenbaum, J. [Ohio State University, Department of Computer Science and Engineering, Columbus, OH 43210 (United States); Mangan, L.A. [Ohio State University, Nuclear Engineering Program, Columbus, OH 43210 (United States); Bucci, P. [Ohio State University, Department of Computer Science and Engineering, Columbus, OH 43210 (United States); Yau, M. [ASCA, Inc., 1720 S. Catalina Avenue, Suite 220, Redondo Beach, CA 90277-5501 (United States); Ekici, E. [Ohio State University, Department of Electrical and Computer Engineering, Columbus, OH 43210 (United States); Miller, D.W.; Sun, X. [Ohio State University, Nuclear Engineering Program, Columbus, OH 43210 (United States); Arndt, S.A. [U.S. Nuclear Regulatory Commission, Washington, DC 20555-0001 (United States)

    2010-10-15

    The Markov/cell-to-cell mapping technique (CCMT) and the dynamic flowgraph methodology (DFM) are two system logic modeling methodologies that have been proposed to address the dynamic characteristics of digital instrumentation and control (I and C) systems and to provide risk-analytical capabilities that supplement those of traditional probabilistic risk assessment (PRA) techniques for nuclear power plants. Both methodologies utilize a discrete-state, multi-valued logic representation of the digital I and C system. For probabilistic quantification purposes, both techniques require estimation of the probabilities of basic system failure modes, including digital I and C software failure modes, that appear in the prime implicants identified as contributors to a given system event of interest. As in any other system modeling process, the accuracy and predictive value of the models produced by the two techniques depend not only on the intrinsic features of the modeling paradigm, but also, to a considerable extent, on the information and knowledge available to the analyst concerning the system behavior and operation rules under normal and off-nominal conditions, and the associated controlled/monitored process dynamics. The application of the two methodologies is illustrated using a digital feedwater control system (DFWCS) similar to that of an operating pressurized water reactor. This application was carried out to demonstrate how the use of either technique, or both, can facilitate the updating of an existing nuclear power plant PRA model following an upgrade of the instrumentation and control system from analog to digital. Because of scope limitations, the demonstration was intentionally limited to aspects of digital I and C system behavior for which probabilistic data were on hand or could be generated within the existing project bounds of time and resources. The data used in the probabilistic quantification portion of the
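
    To make the Markov/CCMT idea concrete, here is a minimal sketch in which a controlled process variable is discretized into cells and the probability of reaching an absorbing failure cell is propagated through a cell-to-cell transition matrix. The four-cell system and the transition probabilities are illustrative assumptions, not the DFWCS benchmark model.

```python
# Minimal sketch of the Markov/CCMT idea: the controlled process variable is
# discretized into cells, the controller's behavior fixes cell-to-cell
# transition probabilities, and the probability of reaching a failure cell is
# obtained by propagating the cell probability vector over mapping steps.
import numpy as np

# Cells: 0 = low, 1 = nominal, 2 = high, 3 = failure (absorbing).
P = np.array([
    [0.90, 0.09, 0.00, 0.01],
    [0.05, 0.90, 0.04, 0.01],
    [0.00, 0.09, 0.89, 0.02],
    [0.00, 0.00, 0.00, 1.00],
])

p = np.array([0.0, 1.0, 0.0, 0.0])   # start in the nominal cell
for step in range(24):               # 24 mapping time steps
    p = p @ P
print("P(failure within 24 steps) = %.4f" % p[3])
```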

  1. Capturing complexity in work disability research: application of system dynamics modeling methodology.

    Science.gov (United States)

    Jetha, Arif; Pransky, Glenn; Hettinger, Lawrence J

    2016-01-01

    Work disability (WD) is characterized by variable and occasionally undesirable outcomes. The underlying determinants of WD outcomes include patterns of dynamic relationships among health, personal, organizational and regulatory factors that have been challenging to characterize, and inadequately represented by contemporary WD models. System dynamics modeling (SDM) methodology applies a sociotechnical systems thinking lens to view WD systems as comprising a range of influential factors linked by feedback relationships. SDM can potentially overcome limitations in contemporary WD models by uncovering causal feedback relationships, and conceptualizing dynamic system behaviors. It employs a collaborative and stakeholder-based model building methodology to create a visual depiction of the system as a whole. SDM can also enable researchers to run dynamic simulations to provide evidence of anticipated or unanticipated outcomes that could result from policy and programmatic intervention. SDM may advance rehabilitation research by providing greater insights into the structure and dynamics of WD systems while helping to understand inherent complexity. Challenges related to data availability, determining validity, and the extensive time and technical skill requirements for model building may limit SDM's use in the field and should be considered. Contemporary work disability (WD) models provide limited insight into complexity associated with WD processes. System dynamics modeling (SDM) has the potential to capture complexity through a stakeholder-based approach that generates a simulation model consisting of multiple feedback loops. SDM may enable WD researchers and practitioners to understand the structure and behavior of the WD system as a whole, and inform development of improved strategies to manage straightforward and complex WD cases.
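
    A minimal sketch of the SDM mechanics referred to above: stocks connected by feedback-dependent flows, numerically integrated over time. The two-stock work disability structure and all rates below are illustrative assumptions, not a validated WD model.

```python
# Minimal sketch of system dynamics modeling (SDM): stocks linked by
# feedback-dependent flows, integrated with a fixed time step.
import numpy as np

dt, n = 0.25, 200                             # time step [weeks], number of steps
off_work = np.empty(n); off_work[0] = 100.0   # stock: workers on disability leave
capacity = np.empty(n); capacity[0] = 10.0    # stock: return-to-work program capacity

for k in range(1, n):
    # Balancing feedback loop: more claimants -> more program capacity (with decay).
    capacity[k] = capacity[k-1] + dt * (0.05 * off_work[k-1] - 0.1 * capacity[k-1])
    # Return-to-work flow saturates with available capacity; new claims are constant.
    rtw_rate = 0.3 * off_work[k-1] * capacity[k-1] / (capacity[k-1] + 50.0)
    off_work[k] = off_work[k-1] + dt * (8.0 - rtw_rate)

print("final on-leave stock: %.1f workers" % off_work[-1])
```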

  2. Uncertainty in a monthly water balance model using the generalized likelihood uncertainty estimation methodology

    Science.gov (United States)

    Rivera, Diego; Rivas, Yessica; Godoy, Alex

    2015-02-01

    Hydrological models are simplified representations of natural processes and are subject to errors. Uncertainty bounds are a commonly used way to assess the impact of input or model-architecture uncertainty on model outputs. Different sets of parameters can yield equally robust goodness-of-fit indicators, which is known as equifinality. We assessed the outputs of a lumped conceptual hydrological model applied to an agricultural watershed in central Chile under strong interannual variability (coefficient of variation of 25%) using the equifinality concept and uncertainty bounds. The simulation period ran from January 1999 to December 2006. Equifinality and uncertainty bounds from the GLUE (Generalized Likelihood Uncertainty Estimation) methodology were used to identify parameter sets as potential representations of the system. The aim of this paper is to exploit the use of uncertainty bounds to differentiate behavioural parameter sets in a simple hydrological model. We then analyze the presence of equifinality in order to improve the identification of relevant hydrological processes. The water balance model for the Chillan River exhibits equifinality at a first stage. However, it was possible to narrow the parameter ranges and eventually identify a set of parameters representing the behaviour of the watershed (a behavioural model) in agreement with observational and soft data (calculation of areal precipitation over the watershed using an isohyetal map). The mean width of the uncertainty bound around the predicted runoff for the simulation period decreased from 50 to 20 m3 s-1 after fixing the parameter controlling the areal precipitation over the watershed, equivalent to decreasing the ratio between simulated and observed discharge from 5.2 to 2.5. Despite criticisms of the GLUE methodology, such as its lack of statistical formality, it proves a useful tool for assisting the modeller in identifying critical parameters.
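
    The GLUE procedure itself is compact enough to sketch: Monte Carlo sampling of parameter sets, a likelihood threshold separating behavioural from non-behavioural sets, and percentile uncertainty bounds computed from the behavioural ensemble. The one-parameter toy model below stands in for the monthly water balance model, and the threshold value is an illustrative analyst's choice.

```python
# Minimal sketch of GLUE: sample parameters, score each set with a likelihood
# measure (here Nash-Sutcliffe efficiency), keep behavioural sets above a
# threshold, and derive percentile uncertainty bounds from their simulations.
import numpy as np

rng = np.random.default_rng(42)
obs = np.sin(np.linspace(0, 6, 96)) + 1.5 + rng.normal(0, 0.1, 96)  # "observed" runoff

def model(k):                                     # toy one-parameter model
    return k * (np.sin(np.linspace(0, 6, 96)) + 1.5)

samples = rng.uniform(0.5, 1.5, 5000)             # uniform prior on parameter k
nse = np.array([1 - np.sum((model(k) - obs)**2) / np.sum((obs - obs.mean())**2)
                for k in samples])                # likelihood measure per sample
behavioural = samples[nse > 0.7]                  # threshold = 0.7 (analyst's choice)

sims = np.array([model(k) for k in behavioural])
lower, upper = np.percentile(sims, [5, 95], axis=0)   # 90% uncertainty bounds
print("behavioural sets: %d; mean bound width: %.3f"
      % (behavioural.size, (upper - lower).mean()))
```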

  3. New methodologies for calculation of flight parameters on reduced-scale wing models in wind tunnel

    Science.gov (United States)

    Ben Mosbah, Abdallah

    In order to improve the quality of wind tunnel tests and the tools used to perform aerodynamic tests on aircraft wings in the wind tunnel, new methodologies were developed and tested on rigid and flexible wing models. The flexible wing concept consists of replacing a portion (lower and/or upper) of the skin with a flexible portion whose shape can be changed using an actuation system installed inside the wing. The main purpose of this concept is to improve the aerodynamic performance of the aircraft, and especially to reduce the fuel consumption of the airplane. Numerical and experimental analyses were conducted to develop and test the methodologies proposed in this thesis. To control the flow inside the test section of the Price-Paidoussis wind tunnel of LARCASE, numerical and experimental analyses were performed. Computational fluid dynamics calculations were made in order to obtain a database used to develop a new hybrid methodology for wind tunnel calibration; this approach allows the flow in the test section of the Price-Paidoussis wind tunnel to be controlled. For the fast determination of aerodynamic parameters, new hybrid methodologies were proposed. These methodologies were used to control flight parameters through calculation of the drag, lift and pitching moment coefficients and of the pressure distribution around an airfoil. These aerodynamic coefficients were calculated from known airflow conditions such as angle of attack and the Mach and Reynolds numbers. In order to modify the shape of the wing skin, electric actuators were installed inside the wing to obtain the desired shape. These deformations provide optimal profiles for different flight conditions in order to reduce fuel consumption. A controller based on neural networks was implemented to obtain the desired actuator displacements. A metaheuristic algorithm was used in hybridization with neural networks and support vector machine approaches and their

  4. Agile Methodologies and Software Process Improvement Maturity Models, Current State of Practice in Small and Medium Enterprises

    OpenAIRE

    Koutsoumpos, Vasileios; Marinelarena, Iker

    2013-01-01

    Abstract—Background: Software Process Improvement (SPI) maturity models have been developed to assist organizations in enhancing software quality. Agile methodologies are used to ensure the productivity and quality of a software product. Among other settings, they are applied in Small and Medium-sized Enterprises (SMEs). However, little is known about the combination of Agile methodologies and SPI maturity models in SMEs and the results that could emerge, as all the current SPI models are address...

  5. Frescoed Vaults: Accuracy Controlled Simplified Methodology for Planar Development of Three-Dimensional Textured Models

    Directory of Open Access Journals (Sweden)

    Marco Giorgio Bevilacqua

    2016-03-01

    Full Text Available In the field of documentation and preservation of cultural heritage, there is keen interest in 3D metric viewing and rendering of architecture, for both formal appearance and color. On the other hand, the operative steps of restoration interventions still require full-scale, 2D metric surface representations. The transition from 3D to 2D representation, with the related geometric transformations, has not yet been fully formalized for the planar development of frescoed vaults. Methodologies proposed so far on this subject transition from point cloud models to ideal mathematical surfaces and project textures using software tools. The methodology used for geometry and texture development in the present work does not require any dedicated software. The different processing steps can be individually checked for any error introduced, which can then be quantified. A direct accuracy check of the planar development of the frescoed surface, carried out by qualified restorers, yielded an accuracy of 3 mm. The proposed methodology, although requiring further studies to improve automation of the different processing steps, allowed the extraction of 2D drafts fully usable by operators restoring the vault frescoes.

  6. Boolean modeling in systems biology: an overview of methodology and applications

    International Nuclear Information System (INIS)

    Wang, Rui-Sheng; Albert, Réka; Saadatpour, Assieh

    2012-01-01

    Mathematical modeling of biological processes provides deep insights into complex cellular systems. While quantitative and continuous models such as differential equations have been widely used, their use is hindered in systems where knowledge of mechanistic details and kinetic parameters is scarce. On the other hand, a wealth of molecular-level qualitative data on individual components and interactions can be obtained from the experimental literature and high-throughput technologies, making qualitative approaches such as Boolean network modeling extremely useful. In this paper, we build on our research to provide a methodology overview of Boolean modeling in systems biology, including Boolean dynamic modeling of cellular networks, attractor analysis of Boolean dynamic models, and the inference of biological regulatory mechanisms from high-throughput data using Boolean models. We finally demonstrate how Boolean models can be applied to perform the structural analysis of cellular networks. This overview aims to acquaint life science researchers with the basic steps of Boolean modeling and its applications in several areas of systems biology. (paper)
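
    The basic steps named above (Boolean update rules, synchronous dynamics, attractor analysis) can be sketched in a few lines. The three-node network and its rules are illustrative assumptions, not a published biological model.

```python
# Minimal sketch of Boolean dynamic modeling: nodes take values 0/1, update
# rules are logical functions of regulators, and attractors are found by
# iterating synchronous updates from every initial state.
from itertools import product

def update(state):
    a, b, c = state
    return (int(not c),       # A is inhibited by C
            int(a),           # B is activated by A
            int(a and b))     # C requires both A and B

def attractor(start):
    seen, state = [], start
    while state not in seen:
        seen.append(state)
        state = update(state)
    cycle = seen[seen.index(state):]   # states revisited periodically
    return tuple(sorted(cycle))        # canonical, phase-independent form

attractors = {attractor(s) for s in product((0, 1), repeat=3)}
for att in sorted(attractors):
    kind = "fixed point" if len(att) == 1 else "limit cycle of length %d" % len(att)
    print(kind, att)
```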

  7. A novel methodology improves reservoir characterization models using geologic fuzzy variables

    Energy Technology Data Exchange (ETDEWEB)

    Soto B, Rodolfo [DIGITOIL, Maracaibo (Venezuela); Soto O, David A. [Texas A and M University, College Station, TX (United States)

    2004-07-01

    One of the research projects carried out in the Cusiana field to explain its rapid decline in recent years aimed to obtain better permeability models. The reservoir in this field has a complex layered system that is not easy to model using conventional methods. The new technique included the development of porosity and permeability maps from cored wells following the trend of sand deposition for each facies or layer, according to the sedimentary facies and depositional system models. We then used fuzzy logic to reproduce those maps in three dimensions as geologic fuzzy variables. After multivariate statistical and factor analyses, we found independence and a good correlation coefficient between the geologic fuzzy variables and core permeability and porosity. This means the geologic fuzzy variable can explain the fabric, grain size and pore geometry of the reservoir rock through the field. Finally, we developed a neural network permeability model using porosity, gamma ray and the geologic fuzzy variable as input variables. This model has a cross-correlation coefficient of 0.873 and an average absolute error of 33%, compared with the previous model's correlation coefficient of 0.511 and absolute error greater than 250%. We tested different methodologies, and this new one proved to be a promising way to obtain better permeability models. The models have had a high impact on the explanation of well performance and workovers, and on reservoir simulation models. (author)
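
    A minimal sketch of the final modeling step described: a neural network regressor predicting (log) permeability from porosity, gamma ray and a geologic fuzzy variable. The synthetic training data, network size and the log-linear ground truth are illustrative assumptions, not the Cusiana data.

```python
# Minimal sketch: neural-net permeability model with porosity, gamma ray and a
# geologic fuzzy variable as inputs. Data are synthetic stand-ins.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
phi = rng.uniform(0.05, 0.30, n)          # porosity [fraction]
gr = rng.uniform(20, 120, n)              # gamma ray [API]
fz = rng.uniform(0.0, 1.0, n)             # geologic fuzzy variable [0..1]
# Synthetic "core" log-permeability: log-linear in inputs plus noise.
logk = 12.0 * phi - 0.01 * gr + 2.0 * fz + rng.normal(0, 0.2, n)

X = np.column_stack([phi, gr, fz])
X_tr, X_te, y_tr, y_te = train_test_split(X, logk, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000,
                   random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out wells: %.3f" % net.score(X_te, y_te))
```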

  8. PWR Facility Dose Modeling Using MCNP5 and the CADIS/ADVANTG Variance-Reduction Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Blakeman, Edward D [ORNL; Peplow, Douglas E. [ORNL; Wagner, John C [ORNL; Murphy, Brian D [ORNL; Mueller, Don [ORNL

    2007-09-01

    The feasibility of modeling a pressurized-water-reactor (PWR) facility and calculating dose rates at all locations within the containment and adjoining structures using MCNP5 with mesh tallies is presented. Calculations of dose rates resulting from neutron and photon sources from the reactor (operating and shut down for various periods) and the spent fuel pool, as well as for the photon source from the primary coolant loop, were all of interest. Identification of the PWR facility, development of the MCNP-based model and automation of the run process, calculation of the various sources, and development of methods for visually examining mesh tally files and extracting dose rates were all a significant part of the project. Advanced variance reduction, which was required because of the size of the model and the large amount of shielding, was performed via the CADIS/ADVANTG approach. This methodology uses an automatically generated three-dimensional discrete ordinates model to calculate adjoint fluxes from which MCNP weight windows and source bias parameters are generated. Investigative calculations were performed using a simple block model and a simplified full-scale model of the PWR containment, in which the adjoint source was placed in various regions. In general, it was shown that placement of the adjoint source on the periphery of the model provided adequate results for regions reasonably close to the source (e.g., within the containment structure for the reactor source). A modification to the CADIS/ADVANTG methodology was also studied in which a global adjoint source is weighted by the reciprocal of the dose response calculated by an earlier forward discrete ordinates calculation. This method showed improved results over those using the standard CADIS/ADVANTG approach, and its further investigation is recommended for future efforts.

  9. Modeling companion diagnostics in economic evaluations of targeted oncology therapies: systematic review and methodological checklist.

    Science.gov (United States)

    Doble, Brett; Tan, Marcus; Harris, Anthony; Lorgelly, Paula

    2015-02-01

    The successful use of a targeted therapy is intrinsically linked to the ability of a companion diagnostic to correctly identify the patients most likely to benefit from treatment. The aim of this study was to review the characteristics of companion diagnostics that are important to include in an economic evaluation. Approaches for including these characteristics in model-based economic evaluations are compared, with the intent of describing best-practice methods. Five databases and government agency websites were searched to identify model-based economic evaluations comparing a companion diagnostic and subsequent treatment strategy with another treatment strategy, with model parameters for the sensitivity and specificity of the companion diagnostic (primary synthesis). Economic evaluations that limited model parameters for the companion diagnostic to only its cost were also identified (secondary synthesis). Quality was assessed using the Quality of Health Economic Studies instrument. Thirty studies were included in the review (primary synthesis n = 12; secondary synthesis n = 18). Incremental cost-effectiveness ratios may be lower when the only parameter for the companion diagnostic included in a model is the cost of testing. Incorporating the test's accuracy in addition to its cost may be a more appropriate methodological approach. Altering the prevalence of the genetic biomarker, the specific population tested, the type of test, the test accuracy and the timing/sequence of multiple tests can all impact overall model results. The impact of altering a test's threshold for positivity is unknown, as it was not addressed in any of the included studies. Additional quality criteria, as outlined in our methodological checklist, should be considered, given the shortcomings of standard quality assessment tools in differentiating studies that incorporate important test-related characteristics from those that do not. There is a need to refine methods for incorporating the characteristics
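
    To show why including test accuracy, and not only test cost, changes the result, here is a minimal sketch of an ICER for a test-and-treat strategy in which sensitivity and specificity determine who receives the targeted therapy. All prices, QALY values, prevalence and accuracy figures are illustrative assumptions, not data from the reviewed studies.

```python
# Minimal sketch: ICER of "test-and-treat" vs standard care, with test
# sensitivity/specificity deciding who is treated. All numbers illustrative.
prev = 0.30                     # biomarker prevalence
sens, spec = 0.95, 0.90         # companion diagnostic accuracy
c_test, c_drug, c_std = 300.0, 40_000.0, 5_000.0   # costs per patient
q_resp, q_std = 1.20, 0.80      # QALYs: responders vs standard care

tp = prev * sens                # true positives: benefit from targeted therapy
fp = (1 - prev) * (1 - spec)    # false positives: pay for the drug, no benefit
treated = tp + fp

cost_tt = c_test + treated * c_drug + (1 - treated) * c_std
qaly_tt = tp * q_resp + (1 - tp) * q_std           # only TPs gain response QALYs
icer = (cost_tt - c_std) / (qaly_tt - q_std)
print("ICER = %.0f per QALY gained" % icer)
```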

  10. PWR Facility Dose Modeling Using MCNP5 and the CADIS/ADVANTG Variance-Reduction Methodology

    International Nuclear Information System (INIS)

    Blakeman, Edward D.; Peplow, Douglas E.; Wagner, John C.; Murphy, Brian D.; Mueller, Don

    2007-01-01

    The feasibility of modeling a pressurized-water-reactor (PWR) facility and calculating dose rates at all locations within the containment and adjoining structures using MCNP5 with mesh tallies is presented. Calculations of dose rates resulting from neutron and photon sources from the reactor (operating and shut down for various periods) and the spent fuel pool, as well as for the photon source from the primary coolant loop, were all of interest. Identification of the PWR facility, development of the MCNP-based model and automation of the run process, calculation of the various sources, and development of methods for visually examining mesh tally files and extracting dose rates were all a significant part of the project. Advanced variance reduction, which was required because of the size of the model and the large amount of shielding, was performed via the CADIS/ADVANTG approach. This methodology uses an automatically generated three-dimensional discrete ordinates model to calculate adjoint fluxes from which MCNP weight windows and source bias parameters are generated. Investigative calculations were performed using a simple block model and a simplified full-scale model of the PWR containment, in which the adjoint source was placed in various regions. In general, it was shown that placement of the adjoint source on the periphery of the model provided adequate results for regions reasonably close to the source (e.g., within the containment structure for the reactor source). A modification to the CADIS/ADVANTG methodology was also studied in which a global adjoint source is weighted by the reciprocal of the dose response calculated by an earlier forward discrete ordinates calculation. This method showed improved results over those using the standard CADIS/ADVANTG approach, and its further investigation is recommended for future efforts

  11. Modeling collective animal behavior with a cognitive perspective: a methodological framework.

    Directory of Open Access Journals (Sweden)

    Sebastian Weitz

    Full Text Available The last decades have seen increasing interest in modeling collective animal behavior. Some studies try to reproduce as accurately as possible the collective dynamics and patterns observed in several animal groups with biologically plausible individual behavioral rules. The objective is then essentially to demonstrate that the observed collective features may be the result of self-organizing processes involving quite simple individual behaviors. Other studies concentrate on establishing or enriching links between collective behavior research and cognitive or physiological research, which requires that each individual rule be carefully validated. Here we discuss the methodological consequences of this additional requirement. Using the example of corpse clustering in ants, we first illustrate that it may be impossible to discriminate among alternative individual rules by considering only observational data collected at the group level. Six individual behavioral models are described: they are clearly distinct in terms of individual behaviors, they all reproduce satisfactorily the collective dynamics and distribution patterns observed in experiments, and we show theoretically that it is strictly impossible to discriminate between two of these models even in the limit of an infinite amount of data, whatever the accuracy level. A set of methodological steps is then listed and discussed as practical ways to partially overcome this problem. They involve complementary experimental protocols specifically designed to address the behavioral rules successively, conserving group-level data for overall model validation. In this context, we highlight the importance of maintaining a sharp distinction between model enunciation, with explicit references to validated biological concepts, and the formal translation of these concepts in terms of quantitative state variables and fittable functional dependences. Illustrative examples are provided of the
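
    One classic candidate individual rule for corpse clustering can be sketched directly: density-dependent pick-up and drop probabilities of the Deneubourg form on a one-dimensional ring. The grid size, the k1/k2 constants and the sensing radius are illustrative assumptions; the paper's six models are not reproduced here.

```python
# Minimal sketch of a Deneubourg-style corpse-clustering rule: ants pick up
# isolated corpses with probability (k1/(k1+f))^2 and drop carried corpses
# with probability (f/(k2+f))^2, where f is the local corpse density.
import numpy as np

rng = np.random.default_rng(1)
L, n_ants, steps = 200, 20, 20000
grid = (rng.random(L) < 0.2).astype(int)       # corpses scattered on a ring
carrying = np.zeros(n_ants, dtype=bool)
pos = rng.integers(0, L, n_ants)
k1, k2, r = 0.1, 0.3, 5                        # pick/drop constants, sensing radius

for _ in range(steps):
    for a in range(n_ants):
        pos[a] = (pos[a] + rng.integers(-1, 2)) % L           # random walk
        f = grid[(pos[a] + np.arange(-r, r + 1)) % L].mean()  # local density
        if not carrying[a] and grid[pos[a]] and rng.random() < (k1 / (k1 + f))**2:
            grid[pos[a]], carrying[a] = 0, True               # pick up isolated corpse
        elif carrying[a] and not grid[pos[a]] and rng.random() < (f / (k2 + f))**2:
            grid[pos[a]], carrying[a] = 1, False              # drop near a pile

print("clusters:", np.sum(np.diff(np.r_[0, grid, 0]) == 1))   # count runs of corpses
```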

  12. Methodology for geometric modelling. Presentation and administration of site descriptive models; Metodik foer geometrisk modellering. Presentation och administration av platsbeskrivande modeller

    Energy Technology Data Exchange (ETDEWEB)

    Munier, Raymond [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden); Hermanson, Jan [Golder Associates (Sweden)

    2001-03-01

    This report presents a methodology for constructing, visualising and presenting geoscientific descriptive models based on data from the site investigations that SKB is currently performing in order to build an underground nuclear waste disposal facility in Sweden. It is designed for interaction with SICADA (SKB's site characterisation database) and RVS (SKB's Rock Visualisation System). However, the concepts of the methodology are general and can be used with other tools capable of handling 3D geometries and parameters. The descriptive model is intended to be an instrument in which site investigation data from all disciplines are put together to form a comprehensive visual interpretation of the studied rock mass. The methodology has four main components: 1. Construction of a geometrical model of the interpreted main structures at the site. 2. Description of the geoscientific characteristics of the structures. 3. Description and geometrical implementation of the geometric uncertainties in the interpreted model structures. 4. A quality system for handling the geometrical model, its associated database and some aspects of the technical auditing. The geometrical model forms a basis for understanding the main elements and structures of the investigated site. Once the interpreted geometries are in place in the model, the system allows descriptive and quantitative data to be added to each modelled object through a system of intuitive menus. The associated database gives each geometrical object a complete quantitative description across all geoscientific disciplines, with variabilities, uncertainties in interpretation and a full version history. The complete geometrical model and its associated database of object descriptions are recorded in a central quality system. Official, new and old versions of the model are administered centrally in order to have complete quality assurance of each step in the interpretation process. The descriptive model is a cornerstone in the understanding of the

  13. Complex methodology of the model elaboration of the quantified transnationalization process assessment

    Directory of Open Access Journals (Sweden)

    Larysa Rudenko-Sudarieva

    2009-03-01

    Full Text Available The article studies the theoretical fundamentals of transnationalization and the peculiarities of its development based on a study of world theory and practice; suggests a systematic methodical approach to defining the economic category of «transnationalization», together with the author's definition; develops a complex methodology for building a model of quantified transnationalization process assessment based on a seven-milestone algorithm for the formation of key indicators; and systematizes and synthesizes empirical investigations concerning the state and development of current tendencies, with a comparative analysis of the transnationalization level within separate TNC groups.

  14. 3CE Methodology for Conducting a Modeling, Simulation, and Instrumentation Tool Capability Analysis

    Science.gov (United States)

    2010-05-01

    ... a modeling, simulation, and instrumentation (MS&I) environment. This methodology uses the DoDAF product set to document operational and systems ... engineering process were identified and resolved, such as duplication of data elements derived from DoDAF operational and system views used to

  15. Animal Models of Virus-Induced Neurobehavioral Sequelae: Recent Advances, Methodological Issues, and Future Prospects

    Directory of Open Access Journals (Sweden)

    Marco Bortolato

    2010-01-01

    Full Text Available Converging lines of clinical and epidemiological evidence suggest that viral infections in early developmental stages may be a causal factor in neuropsychiatric disorders such as schizophrenia, bipolar disorder, and autism-spectrum disorders. This etiological link, however, remains controversial in view of the lack of consistent and reproducible associations between viruses and mental illness. Animal models of virus-induced neurobehavioral disturbances afford powerful tools to test etiological hypotheses and explore pathophysiological mechanisms. Prenatal or neonatal inoculations of neurotropic agents (such as herpes-, influenza-, and retroviruses) in rodents result in a broad spectrum of long-term alterations reminiscent of psychiatric abnormalities. Nevertheless, the complexity of these sequelae often poses methodological and interpretational challenges and thwarts their characterization. Recent conceptual advancements in psychiatric nosology and behavioral science may help determine new heuristic criteria to enhance the translational value of these models. A particularly critical issue is the identification of intermediate phenotypes, defined as quantifiable factors representing single neurochemical, neuropsychological, or neuroanatomical aspects of a diagnostic category. In this paper, we examine how the employment of these novel concepts may lead to new methodological refinements in the study of virus-induced neurobehavioral sequelae through animal models.

  16. Do Methodological Choices in Environmental Modeling Bias Rebound Effects? A Case Study on Electric Cars.

    Science.gov (United States)

    Font Vivanco, David; Tukker, Arnold; Kemp, René

    2016-10-18

    Improvements in resource efficiency often underperform because of rebound effects. Calculations of the size of rebound effects are subject to various types of bias, among which methodological choices have received particular attention. Modellers have primarily focused on choices related to changes in demand; choices related to modeling the environmental burdens of such changes have received less attention. In this study, we analyze choices in the environmental assessment methods (life cycle assessment (LCA) and hybrid LCA) and environmental input-output databases (E3IOT, Exiobase and WIOD) used as a source of bias. The analysis is done for a case study on battery electric and hydrogen cars in Europe. The results describe moderate rebound effects for both technologies in the short term. Additionally, long-run scenarios are calculated by simulating the total cost of ownership; these describe notable rebound effect sizes (from 26 to 59% and from 18 to 28%, respectively, depending on the methodological choices) under favorable economic conditions. Relevant sources of bias are found to be related to incomplete background systems, technology assumptions and sectoral aggregation. These findings highlight the importance of the method setup and of sensitivity analyses of choices related to environmental modeling in rebound effect assessments.

  17. An efficient hysteresis modeling methodology and its implementation in field computation applications

    Energy Technology Data Exchange (ETDEWEB)

    Adly, A.A., E-mail: adlyamr@gmail.com [Electrical Power and Machines Dept., Faculty of Engineering, Cairo University, Giza 12613 (Egypt); Abd-El-Hafiz, S.K. [Engineering Mathematics Department, Faculty of Engineering, Cairo University, Giza 12613 (Egypt)

    2017-07-15

    Highlights: • An approach to simulate hysteresis while taking shape anisotropy into consideration. • Utilizing an ensemble of triangular sub-region hysteresis models in field computation. • A novel tool capable of carrying out field computation while keeping track of hysteresis losses. • The approach may be extended to 3D tetrahedral sub-volumes. Abstract: Field computation in media exhibiting hysteresis is crucial to a variety of applications such as magnetic recording processes and accurate determination of core losses in power devices. Recently, Hopfield neural networks (HNN) have been successfully configured to construct scalar and vector hysteresis models. This paper presents an efficient hysteresis modeling methodology and its implementation in field computation applications. The methodology is based on applying the integral equation approach to discretized triangular magnetic sub-regions. Within every triangular sub-region, hysteresis properties are realized using a 3-node HNN. Details of the approach and sample computation results are given in the paper.

  18. A methodology for assessing the market benefits of alternative motor fuels: The Alternative Fuels Trade Model

    Energy Technology Data Exchange (ETDEWEB)

    Leiby, P.N.

    1993-09-01

    This report describes a modeling methodology for examining the prospective economic benefits of displacing motor gasoline use with alternative fuels. The approach is based on the Alternative Fuels Trade Model (AFTM). AFTM development was undertaken by the US Department of Energy (DOE) as part of a longer-term study of alternative fuels issues. The AFTM is intended to assist with evaluating how alternative fuels may be promoted effectively, and what the consequences of substantial alternative fuels use might be. Such an evaluation of the policies and consequences of an alternative fuels program is being undertaken by DOE as required by Section 502(b) of the Energy Policy Act of 1992. Interest in alternative fuels is based on the prospective economic, environmental and energy security benefits from the substitution of these fuels for conventional transportation fuels. The transportation sector is heavily dependent on oil. Increased oil use implies increased petroleum imports, with much of the increase coming from OPEC countries. Conversely, displacement of gasoline has the potential to reduce US petroleum imports, thereby reducing reliance on OPEC oil and possibly weakening OPEC's ability to extract monopoly profits. The magnitude of US petroleum import reduction, the attendant fuel price changes, and the resulting US benefits depend upon the nature of oil-gas substitution and the supply and demand behavior of other world regions. The methodology applies an integrated model of fuel market interactions to characterize these effects.

  19. Methodological aspects of modeling household solid waste generation in Japan: Evidence from Okayama and Otsu cities.

    Science.gov (United States)

    Gu, Binxian; Fujiwara, Takeshi; Jia, Renfu; Duan, Ruiyang; Gu, Aijun

    2017-12-01

    This paper presents a quantitative methodology and two empirical case studies in Japan on modeling household solid waste (HSW) generation based on individual consumption expenditure (ICE) and local waste policy effects, using coupled estimation model systems. Results indicate that ICE on food, miscellaneous commodities and services, and education, cultural and recreation services is mainly associated with changes in HSW generation and its components in Okayama and Otsu from 1980 to 2014. The effects of waste policy measures were also identified. HSW generation in Okayama will increase from 11.60 million tons (mt) in 1980 to 25.02 mt in 2025; the corresponding figures for Otsu are 6.82 mt (1980) and 14.00 mt (2025). To better manage local HSW, several possible and appropriate measures, such as promoting a green lifestyle, extending producer responsibility, intensifying recycling and source separation, generalizing composting, and establishing flexible measures and sustainable policies, should be adopted. The results of this study would support consumer-oriented management of low waste generation and an effective HSW policy design in the two case cities. Success could lead to emulation by other Japanese cities seeking to build and maintain a sustainable, eco-friendly society. Moreover, the methodology of establishing coupled estimation model systems could be extended to China and other global cities.

  20. Spatio-temporal modelling of heat stress and climate change implications for the Murray dairy region, Australia

    Science.gov (United States)

    Nidumolu, Uday; Crimp, Steven; Gobbett, David; Laing, Alison; Howden, Mark; Little, Stephen

    2014-08-01

    The Murray dairy region produces approximately 1.85 billion litres of milk each year, representing about 20% of Australia's total annual milk production. An ongoing production challenge in this region is managing the impacts of heat stress during spring and summer. An increase in the frequency and severity of extreme temperature events due to climate change may result in additional heat stress and production losses. This paper assesses the changing nature of heat stress, now and into the future, using historical data and climate change projections for the region based on the temperature humidity index (THI). Projected temperature and relative humidity changes from two global climate models (GCMs), CSIRO MK3.5 and CCR-MIROC-H, have been used to calculate THI values for 2025 and 2050, summarized as the mean occurrence, and mean length, of consecutive high heat stress periods. The future climate scenarios explored show that by 2025 an additional 12-15 days of moderate to severe heat stress (compared to the 1971-2000 baseline) are likely across much of the study region. By 2050, larger increases in the severity and occurrence of heat stress are likely (an additional 31-42 moderate to severe heat stress days compared with the baseline). This increasing trend will negatively affect milk production among dairy cattle in the region. The results provide useful insights into THI trends in the region; dairy farmers and the dairy industry could use them to devise and prioritise adaptation options for projected increases in heat stress frequency and severity.
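
    The THI screening step can be sketched as follows, using one common THI formulation; the exact variant and class thresholds used in the study may differ, and the sample weather series is synthetic.

```python
# Minimal sketch: compute a temperature humidity index (THI) from daily
# temperature and relative humidity, count stress days, and summarize mean
# consecutive stress spell length, as in the study's summary statistics.
import numpy as np

def thi(t_c, rh_pct):
    """One common THI formulation; t_c in degC, rh_pct in %."""
    return (1.8 * t_c + 32) - (0.55 - 0.0055 * rh_pct) * (1.8 * t_c - 26)

rng = np.random.default_rng(0)
t = rng.normal(28, 5, 90)        # 90 summer days of maximum temperature [degC]
rh = rng.uniform(30, 80, 90)     # daily relative humidity [%]

index = thi(t, rh)
stress = (index >= 78).astype(int)   # threshold 78 as an illustrative cut-off
print("days with THI >= 78: %d" % stress.sum())

# Run lengths of consecutive stress days: diff of rising/falling edge positions.
edges = np.flatnonzero(np.diff(np.r_[0, stress, 0]))
runs = np.diff(edges)[::2]
print("mean consecutive stress spell: %.1f days" % (runs.mean() if runs.size else 0))
```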

  1. A new methodology for modelling of health risk from urban flooding exemplified by cholera

    DEFF Research Database (Denmark)

    Mark, Ole; Jørgensen, Claus; Hammond, Michael

    2016-01-01

    ... and mortality, especially during floods. At present, there are no software tools capable of combining hydrodynamic modelling and health risk analyses, and the links between urban flooding and the health risk for the population due to direct contact with the flood water are poorly understood. The present paper outlines a novel methodology for linking dynamic urban flood modelling with quantitative microbial risk assessment (QMRA). This provides a unique possibility for understanding the interaction between urban flooding and health risk caused by direct human contact with the flood water, and hence gives an option for reducing the burden of disease in the population by use of intelligent urban flood risk management. The model linking urban flooding and health risk is applied to Dhaka City in Bangladesh, where waterborne diseases including cholera are endemic. The application to Dhaka City is supported...

  2. METHODOLOGY FOR THE ESTIMATION OF PARAMETERS, OF THE MODIFIED BOUC-WEN MODEL

    Directory of Open Access Journals (Sweden)

    Tomasz HANISZEWSKI

    2015-03-01

    Full Text Available The Bouc-Wen model is a theoretical formulation that can reproduce the real hysteresis loop of a modeled object, for example a wire rope such as those found in crane lifting mechanisms. The modified version of the model adopted here has nine parameters, and determining such a number of parameters is a complex and problematic issue. This article presents the identification methodology and sample results of numerical simulations. The results were compared with data obtained from laboratory tests of ropes [3], and on that basis it was found that the results agree and that the model can be applied to dynamic systems containing wire ropes in their structures [4].
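
    For reference, the unmodified Bouc-Wen equations are easy to integrate numerically; the sketch below drives the hysteretic force with a sinusoidal displacement. The modified model in the paper has nine parameters, whereas this baseline form uses five plus a stiffness, with illustrative values.

```python
# Minimal sketch of the baseline Bouc-Wen hysteresis model under a sinusoidal
# displacement, integrated with explicit Euler:
#   dz/dt = A*xd - beta*|xd|*|z|^(n-1)*z - gamma*xd*|z|^n
#   F = alpha*k*x + (1-alpha)*k*z
import numpy as np

A, beta, gamma, n = 1.0, 0.5, 0.5, 1.2   # hysteresis shape parameters
alpha, k = 0.3, 1.0e4                    # post/pre-yield ratio, stiffness [N/m]

dt = 1e-3
t = np.arange(0, 10, dt)
x = 0.01 * np.sin(2 * np.pi * 0.5 * t)   # imposed displacement [m]
xdot = np.gradient(x, dt)

z = np.zeros_like(t)
for i in range(1, t.size):
    zd = (A * xdot[i-1]
          - beta * abs(xdot[i-1]) * abs(z[i-1])**(n - 1) * z[i-1]
          - gamma * xdot[i-1] * abs(z[i-1])**n)
    z[i] = z[i-1] + dt * zd

F = alpha * k * x + (1 - alpha) * k * z  # hysteretic restoring force [N]
print("force range: %.1f .. %.1f N" % (F.min(), F.max()))
```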

  3. A Consistent Methodology Based Parameter Estimation for a Lactic Acid Bacteria Fermentation Model

    DEFF Research Database (Denmark)

    Spann, Robert; Roca, Christophe; Kold, David

    2017-01-01

    Lactic acid bacteria are used in many industrial applications, e.g. as starter cultures in the dairy industry or as probiotics, and research on their cell production is much in demand. A first-principles kinetic model was developed to describe and understand the biological, physical, and chemical ... mechanisms in a lactic acid bacteria fermentation. We present here a consistent approach for a methodology-based parameter estimation for a lactic acid fermentation. In the beginning, just an initial knowledge-based guess of parameters was available, and an initial parameter estimation of the complete set ... of parameters was performed in order to get a good model fit to the data. However, not all parameters are identifiable with the given data set and model structure. Sensitivity, identifiability, and uncertainty analyses were completed and a relevant identifiable subset of parameters was determined for a new ...

  4. Modeling and analysis of power processing systems: Feasibility investigation and formulation of a methodology

    Science.gov (United States)

    Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.

    1974-01-01

    A review is given of future power processing systems planned for the next 20 years and of the state-of-the-art power processing design modeling and analysis techniques used to optimize power processing systems. A methodology for modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff-study and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented, so that meaningful results can be obtained each year to aid power processing system engineers and power processing equipment circuit designers in their conceptual and detailed design and analysis tasks.

  5. CALS and the Product State Model - Methodology and Supporting Schools and Paradigms

    DEFF Research Database (Denmark)

    Larsen, Michael Holm

    1998-01-01

    This paper addresses the preliminary considerations in a research project, initiated February 1997, regarding Continuous Acquisition and Life-cycle Support (CALS), which is part of the activities in CALS Center Denmark. The CALS concept is presented with a focus on the Product State Model (PSM). The PSM incorporates relevant information about each stage of the production process. The paper describes the research object and the model object, and discusses part of the methodology for developing a Product State Model. The project is primarily technological; however, the organisational and human aspects ... will be discussed. Also, the parameters for evaluating the PSM will be considered. In establishing the theoretical body of knowledge with respect to CALS, an identification of schools and paradigms within the research area of applying information technology in a manufacturing environment...

  6. The epistemology of mathematical and statistical modeling: a quiet methodological revolution.

    Science.gov (United States)

    Rodgers, Joseph Lee

    2010-01-01

    A quiet methodological revolution, a modeling revolution, has occurred over the past several decades, almost without discussion. In contrast, the 20th century ended with contentious argument over the utility of null hypothesis significance testing (NHST). The NHST controversy may have been at least partially irrelevant, because in certain ways the modeling revolution obviated the NHST argument. I begin with a history of NHST and modeling and their relation to one another. Next, I define and illustrate principles involved in developing and evaluating mathematical models. Following that, I discuss the difference between using statistical procedures within a rule-based framework and building mathematical models from a scientific epistemology. Only the former is treated carefully in most psychology graduate training. The pedagogical implications of this imbalance and the revised pedagogy required to account for the modeling revolution are described. To conclude, I discuss how attention to modeling implies shifting statistical practice in certain progressive ways. The epistemological basis of statistics has moved away from being a set of procedures applied mechanistically, and toward building and evaluating statistical and scientific models. Copyright 2009 APA, all rights reserved.

  7. Socio-hydrologic Modeling to Understand and Mediate the Competition for Water between Humans and Ecosystems: Murrumbidgee River Basin, Australia

    Science.gov (United States)

    van Emmerik, Tim; Sivapalan, Murugesu; Li, Zheng; Pande, Saket; Savenije, Hubert

    2014-05-01

    Around the world the demand for water resources is growing in order to satisfy rapidly increasing human populations, leading to competition for water between humans and ecosystems. An entirely new and comprehensive quantitative framework is needed to establish a holistic understanding of that competition, thereby enabling development and evaluation of effective mediation strategies. We present a case study centered on the Murrumbidgee river basin in eastern Australia that illustrates the dynamics of the balance between water extraction and use for food production and efforts to mitigate and reverse consequent degradation of the riparian environment. Interactions between patterns of water resources management and climate driven hydrological variability within the prevailing socio-economic environment have contributed to the emergence of new whole system dynamics over the last 100 years. In particular, data analysis reveals a pendulum swing between an exclusive focus on agricultural development and food production in the initial stages of water resources development and its attendant socio-economic benefits, followed by the gradual realization of the adverse environmental impacts, efforts to mitigate these with the use of remedial measures, and ultimately concerted efforts and externally imposed solutions to restore environmental health and ecosystem services. A quasi-distributed coupled socio-hydrologic system model that explicitly includes the two-way coupling between human and hydrological systems, including evolution of human values/norms relating to water and the environment, is able to mimic broad features of this pendulum swing. The model consists of coupled nonlinear differential equations that include four state variables describing the co-evolution of storage capacity, irrigated area, human population, and ecosystem health, which are all connected by feedback mechanisms. The model is used to generate insights into the dominant controls of the trajectory of
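
    A minimal sketch of the kind of coupled structure described, four state variables linked by feedbacks and integrated in time, is given below. The equations, closures and parameter values are illustrative assumptions, not the published Murrumbidgee model.

```python
# Minimal sketch of a coupled socio-hydrologic system: population N, irrigated
# area A, ecosystem health E and community sensitivity S evolve together
# through feedback terms, integrated with explicit Euler.
dt, n = 0.1, 1000                      # years per step, number of steps
N, A, E, S = 1.0, 0.5, 1.0, 0.1        # initial states (normalized units)

for _ in range(n):
    dN = 0.02 * N * (A / (A + 0.5))            # population grows with agricultural output
    dA = 0.05 * A * (1 - S) - 0.02 * A * S     # expansion, curtailed as sensitivity rises
    dE = 0.03 * (1 - E) - 0.06 * A * E         # recovery vs degradation by extraction
    dS = 0.08 * (1 - E) - 0.02 * S             # degraded environment raises sensitivity
    N, A, E, S = N + dt*dN, A + dt*dA, E + dt*dE, S + dt*dS

print("after %d years: area=%.2f, ecosystem=%.2f, sensitivity=%.2f"
      % (int(n * dt), A, E, S))
```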

  8. Economic and quality of care evaluation of dialysis service models in remote Australia: protocol for a mixed methods study.

    Science.gov (United States)

    Gorham, Gillian; Howard, Kirsten; Togni, Samantha; Lawton, Paul; Hughes, Jaquelyne; Majoni, Sandawana William; Brown, Sarah; Barnes, Sue; Cass, Alan

    2017-05-03

    Australia's Northern Territory (NT) has the country's highest incidence and prevalence of kidney disease. Indigenous people from remote areas suffer the heaviest disease burden. Concerns regarding cost and sustainability limit the provision of dialysis treatments in remote areas, and most Indigenous people requiring dialysis relocate to urban areas. However, this dislocation of people from their family, community and support networks may prove more costly when the broader health, societal and economic consequences for the individual, family and whole of government are considered. The Dialysis Models of Care Study is a large cross-organisation mixed methods study. It includes a retrospective (2000-2014) longitudinal data linkage study of two NT cohorts: Renal Cohort 1, comprising approximately 2000 adults who received dialysis, and Renal Cohort 2, comprising approximately 400 children of those adults. Linkage of administrative data sets from the Australian and New Zealand Dialysis and Transplant Registry and the NT Departments of Health, Housing and Education by a specialist third party (SA/NT Datalink) will enable extraction of activity, financial and outcome data. Interviews with patients, clinicians and service providers, using a snowball technique, will canvass relevant issues and assist in determining the full costs and impacts of the five most used dialysis models of care. The study uses a mixed methods approach to investigate the quantitative and qualitative dimensions of the full costs and outcomes associated with the choice of a particular dialysis model of care for any given patient. The study includes a large data linkage component that, for the first time, links health, housing and education data to fully analyse and evaluate the impact on patients, their families and the broader community resulting from the relocation of people for treatment. The study will generate a large amount of activity, financial and qualitative data that will investigate health costs less

  9. A MAINTENANCE STRATEGY MODEL FOR STATIC EQUIPMENT USING INSPECTION METHODOLOGIES AND RISK MANAGEMENT

    Directory of Open Access Journals (Sweden)

    J.K. Visser

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Mechanical equipment used on process plants can be categorised into two main types, namely static and rotating equipment. A brief survey at a number of chemical process plants indicated that several maintenance strategies exist and are used for rotating equipment. However, some of these strategies are not directly applicable to static equipment, although the risk-based inspection (RBI) methodology has been developed for pressure vessels. A generalised risk-based maintenance strategy for all types of static equipment does not currently exist. This paper describes the development of an optimised model of inspection methodologies, maintenance strategies, and risk management principles that is generically applicable to static equipment. It enables maintenance managers and engineers to select an applicable maintenance strategy and inspection methodology, based on the operational and business risks posed by the individual pieces of equipment.

  10. Prototype methodology for obtaining cloud seeding guidance from HRRR model data

    Science.gov (United States)

    Dawson, N.; Blestrud, D.; Kunkel, M. L.; Waller, B.; Ceratto, J.

    2017-12-01

    Weather model data, along with real-time observations, are critical for determining whether atmospheric conditions are prime for super-cooled liquid water during cloud seeding operations. Cloud seeding groups can either use operational forecast models or run their own model on a computer cluster. A custom weather model provides the most flexibility, but is also expensive. For programs with smaller budgets, openly available operational forecasting models are the de facto method for obtaining forecast data. The High-Resolution Rapid Refresh (HRRR) model (3 x 3 km grid size), developed by the Earth System Research Laboratory (ESRL), provides hourly model runs with 18 forecast hours per run. While the model cannot be fine-tuned for a specific area or edited to provide cloud-seeding-specific output, model output is openly available on a near-real-time basis. This presentation focuses on a prototype methodology for using HRRR model data to create maps that aid near-real-time cloud seeding decision making. The R programming language is used to run a script on a Windows® desktop/laptop computer either on a schedule (such as every half hour) or manually. The latest HRRR model run is downloaded from NOAA's Operational Model Archive and Distribution System (NOMADS). A GRIB filter service, provided by NOMADS, is used to obtain surface and mandatory-pressure-level data for a subset domain, which greatly cuts down on the amount of data transfer. Then, a set of criteria identified by the Idaho Power Atmospheric Science Group is used to create guidance maps. These criteria include atmospheric stability (lapse rates), dew point depression, air temperature, and wet bulb temperature. The maps highlight potential areas where super-cooled liquid water may exist, reasons why cloud seeding should not be attempted, and wind speed at flight level.
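
    A minimal sketch of turning gridded forecast fields into a seeding-guidance mask is shown below, using dew point depression, a temperature window and the Stull (2011) wet-bulb approximation. The threshold values are illustrative assumptions (the abstract does not publish the Idaho Power criteria), the small arrays stand in for decoded HRRR fields, and Python is used here although the operational script described is written in R.

```python
# Minimal sketch: derive a cloud-seeding guidance mask from forecast fields
# using illustrative criteria (temperature window, dew point depression,
# wet-bulb temperature via the Stull 2011 empirical fit).
import numpy as np

def wet_bulb_stull(t_c, rh_pct):
    """Stull (2011) wet-bulb approximation [degC] from T [degC] and RH [%]."""
    return (t_c * np.arctan(0.151977 * np.sqrt(rh_pct + 8.313659))
            + np.arctan(t_c + rh_pct) - np.arctan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct**1.5 * np.arctan(0.023101 * rh_pct)
            - 4.686035)

t = np.array([[-8.0, -4.0], [-12.0, 2.0]])    # flight-level temperature grid [degC]
td = np.array([[-10.0, -9.0], [-13.0, -5.0]]) # dew point grid [degC]
rh = np.array([[85.0, 60.0], [90.0, 55.0]])   # relative humidity [%]

tw = wet_bulb_stull(t, rh)
seedable = ((t >= -15) & (t <= -3)            # plausible supercooled-liquid window
            & (t - td <= 2.0)                 # small dew point depression
            & (tw <= -2.0))                   # cold enough wet-bulb temperature
print(seedable)
```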

  11. A new methodology for dynamic modelling of health risks arising from wastewater influenced urban flooding

    Science.gov (United States)

    Jørgensen, Claus; Mark, Ole; Djordjevic, Slobodan; Hammond, Michael; Khan, David M.; Erichsen, Anders; Dorrit Enevoldsen, Ann; Heinicke, Gerald; Helwigh, Birgitte

    2015-04-01

    Introduction: Urban flooding due to rainfall exceeding the design capacity of drainage systems is a global problem with significant economic and social consequences. While the cost of the direct damages of urban flooding is well understood, the indirect damages, such as waterborne diseases, are in general still poorly understood. Climate change is expected to increase the frequency of urban flooding in many countries, which is likely to increase waterborne disease. Diarrheal diseases are most prevalent in developing countries, where poor sanitation, poor drinking water and poor surface water quality cause a high disease burden and mortality, especially during floods. Waterborne diarrhea in countries with well-developed water and wastewater infrastructure has been reduced to an acceptable level, and the population in general does not consider wastewater a health risk. Even so, exposure to wastewater-influenced urban flood water still has the potential to transmit diarrheal diseases. When managing urban flooding and planning urban climate change adaptations, health risks are rarely taken into consideration. This paper outlines a novel methodology for linking dynamic urban flood modelling with Quantitative Microbial Risk Assessment (QMRA). This provides a unique possibility for understanding the interaction between urban flooding and the health risks caused by direct human contact with flood water, and provides an option for reducing the burden of disease in the population through intelligent urban flood risk management. Methodology: We have linked hydrodynamic urban flood modelling with quantitative microbial risk assessment (QMRA) to determine the risk of infection caused by exposure to wastewater-influenced urban flood water. The deterministic model MIKE Flood, which integrates the sewer network model in MIKE Urban and the 2D surface model MIKE21, was used to calculate the concentration of pathogens in the
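
    The QMRA step can be sketched once the flood model supplies a pathogen concentration: an ingested dose and an approximate beta-Poisson dose-response model give the probability of infection per exposure event. The concentration, ingestion volume and alpha/N50 values below are illustrative assumptions.

```python
# Minimal sketch of the QMRA step: exposure dose from ingested flood water and
# an approximate beta-Poisson dose-response model,
#   P_inf = 1 - (1 + (dose/N50) * (2^(1/alpha) - 1))^(-alpha).
conc = 1.0e4        # pathogens per litre in flood water (from the flood model)
ingest = 0.01       # litres ingested per flood exposure event
dose = conc * ingest

alpha, n50 = 0.25, 243.0   # dose-response parameters (illustrative)
p_inf = 1.0 - (1.0 + (dose / n50) * (2.0**(1.0 / alpha) - 1.0))**(-alpha)

events = 5          # exposure events during a flood
p_total = 1.0 - (1.0 - p_inf)**events
print("P(infection | event) = %.3f; over %d events = %.3f" % (p_inf, events, p_total))
```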

  12. A system-of-systems modeling methodology for strategic general aviation design decision-making

    Science.gov (United States)

    Won, Henry Thome

    General aviation has long been studied as a means of providing an on-demand "personal air vehicle" that bypasses the traffic at major commercial hubs. This thesis continues this research through the development of a system-of-systems modeling methodology applicable to the selection of synergistic product concepts, market segments, and business models. From the perspective of the conceptual design engineer, the design and selection of future general aviation aircraft is complicated by the definition of constraints and requirements, and by the tradeoffs among performance and cost aspects. Qualitative problem definition methods have been utilized, although their accuracy in determining specific requirement and metric values is uncertain. In industry, customers are surveyed, and business plans are created through a lengthy, iterative process. In recent years, techniques have been developed for predicting the characteristics of US travel demand based on travel mode attributes, such as door-to-door time and ticket price. As yet, these models treat the contributing systems (aircraft manufacturers and service providers) as independently variable assumptions. In this research, a methodology is developed which seeks to build a strategic design decision-making environment through the construction of a system-of-systems model. The demonstrated implementation brings together models of the aircraft and manufacturer, the service provider, and, most importantly, the travel demand. Thus represented are the behavior of the consumers and the reactive behavior of the suppliers (the manufacturers and transportation service providers) in a common modeling framework. The results indicate an ability to guide the design process, specifically the selection of design requirements, through the optimization of "capability" metrics. Additionally, the results indicate the ability to find synergistic solutions, that is, solutions in which two systems might collaborate to achieve a better result than acting

  13. UPCaD: A Methodology of Integration Between Ontology-Based Context-Awareness Modeling and Relational Domain Data

    Directory of Open Access Journals (Sweden)

    Vinícius Maran

    2018-01-01

    Full Text Available Context-awareness is a key feature for applications in ubiquitous computing scenarios. Currently, technologies and methodologies have been proposed for the integration of context-awareness concepts in intelligent information systems to adapt them to the execution of services, user interfaces and data retrieval. Recent research has proposed conceptual modeling alternatives for the integration of domain modeling in RDBMS and context-awareness modeling, describing the use of highly expressive ontologies. The present work describes the UPCaD (Unified Process for Integration between Context-Awareness and Domain) methodology, which is composed of formalisms and processes to guide the data integration considering RDBMS and context modeling. The methodology was evaluated in a virtual learning environment application. The evaluation shows the possibility of using a highly expressive context ontology to filter the relational data query and discusses the main contributions of the methodology compared with recent approaches.

  14. [Systemic inflammation: theoretical and methodological approaches to description of general pathological process model. Part 3. Background for nonsyndromic approach].

    Science.gov (United States)

    Gusev, E Yu; Chereshnev, V A

    2013-01-01

    Theoretical and methodological approaches to the description of systemic inflammation as a general pathological process are discussed. It is shown that a wide range of research types needs to be integrated in order to develop a model of systemic inflammation.

  15. Methodological Bases for Describing Risks of the Enterprise Business Model in Integrated Reporting

    Directory of Open Access Journals (Sweden)

    Nesterenko Oksana O.

    2017-12-01

    Full Text Available The aim of the article is to substantiate the methodological bases for describing the business and accounting risks of an enterprise business model in integrated reporting for their timely detection and assessment, and to develop methods for their leveling or minimizing and possible prevention. It is proposed to consider risks in the process of forming integrated reporting from two sides: first, risks that arise in the business model of an organization and should be disclosed in its integrated report; second, accounting risks of integrated reporting, which should be taken into account by members of the cross-sectoral working group and management personnel in the process of forming and promulgating integrated reporting. To develop an adequate accounting and analytical tool for disclosure of information about the risks of the business model and integrated reporting, and for their leveling or minimization, a terminological analysis of the essence of entrepreneurial and accounting risks is carried out in the article. The entrepreneurial risk is defined as an objective-subjective economic category that characterizes the probability of negative or positive consequences of economic-social-ecological activity within the framework of the business model of an enterprise under uncertainty. The accounting risk is suggested to be understood as the probability of unfavorable consequences as a result of organizational and methodological errors in the integrated accounting system, which present a threat to the quality, accuracy and reliability of the reporting information on economic, social and environmental activities in integrated reporting, as well as the threat of inappropriate decision-making by stakeholders based on the integrated report. For the timely identification of business risks and maximum leveling of the influence of accounting risks on the process of formation and publication of integrated reporting, in the study the place of entrepreneurial and accounting risks in

  16. Modeling and Analysis of The Pressure Die Casting Using Response Surface Methodology

    International Nuclear Information System (INIS)

    Kittur, Jayant K.; Herwadkar, T. V.; Parappagoudar, M. B.

    2010-01-01

    Pressure die casting is successfully used in the manufacture of aluminum alloy components for the automobile and many other industries. Die casting is a process involving many process parameters that have a complex relationship with the quality of the cast product. Though various process parameters influence the quality of a die cast component, the major influence comes from the die casting machine parameters and their proper settings. In the present work, non-linear regression models have been developed for making predictions and analyzing the effect of die casting machine parameters on the performance characteristics of the die casting process. Design of Experiments (DOE) with Response Surface Methodology (RSM) has been used to analyze the effect of input parameters and their interactions on the response, and further used to develop nonlinear input-output relationships. Die casting machine parameters, namely fast shot velocity, slow shot to fast shot change-over point, intensification pressure and holding time, have been considered as the input variables. The quality characteristics of the cast product were determined by porosity, hardness and surface roughness (outputs/responses). Design of experiments has been used to plan the experiments and analyze the impact of variables on the quality of casting. On the other hand, Response Surface Methodology (Central Composite Design) is utilized to develop non-linear input-output relationships (regression models). The developed regression models have been tested for their statistical adequacy through the ANOVA test. The practical usefulness of these models has been tested with some test cases. These models can be used to make predictions about different quality characteristics, for a known set of die casting machine parameters, without conducting the experiments.
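
As an illustration of the response-surface step, the sketch below fits a full second-order model to a small central-composite-style data set. This is a generic RSM example, not the authors' data: the coded factor levels and porosity values are made up, and the design is reduced to two factors for brevity.

```python
import numpy as np

# Hypothetical CCD-style data, reduced to two coded machine parameters
# (e.g. fast shot velocity and intensification pressure) and one response.
x1 = np.array([-1, -1, 1, 1, 0, 0, -1.414, 1.414, 0, 0])
x2 = np.array([-1, 1, -1, 1, 0, 0, 0, 0, -1.414, 1.414])
y  = np.array([3.1, 2.8, 2.2, 2.0, 2.4, 2.5, 3.3, 1.9, 2.7, 2.3])  # porosity (%)

# Full second-order model: b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted coefficients:", coef.round(3))

# Predict the response at a new coded setting (0.5, -0.5).
x_new = np.array([1.0, 0.5, -0.5, -0.25, 0.25, 0.25])
print("predicted porosity:", float(x_new @ coef))
```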

  17. Australia's marine virtual laboratory

    Science.gov (United States)

    Proctor, Roger; Gillibrand, Philip; Oke, Peter; Rosebrock, Uwe

    2014-05-01

    In all modelling studies of realistic scenarios, a researcher has to go through a number of steps to set up a model in order to produce a model simulation of value. The steps are generally the same, independent of the modelling system chosen. These steps include determining the time and space scales and processes of the required simulation; obtaining data for the initial set up and for input during the simulation time; obtaining observation data for validation or data assimilation; implementing scripts to run the simulation(s); and running utilities or custom-built software to extract results. These steps are time consuming and resource hungry, and have to be done every time irrespective of the simulation - the more complex the processes, the more effort is required to set up the simulation. The Australian Marine Virtual Laboratory (MARVL) is a new development in modelling frameworks for researchers in Australia. MARVL uses the TRIKE framework, a Java-based control system developed by CSIRO that allows a non-specialist user to configure and run a model, to automate many of the modelling preparation steps needed to bring the researcher faster to the stage of simulation and analysis. The tool is seen as enhancing the efficiency of researchers and marine managers, and is being considered as an educational aid in teaching. In MARVL we are developing a web-based open source application which provides a number of model choices and provides search and recovery of relevant observations, allowing researchers to: a) efficiently configure a range of different community ocean and wave models for any region, for any historical time period, with model specifications of their choice, through a user-friendly web application, b) access data sets to force a model and nest a model into, c) discover and assemble ocean observations from the Australian Ocean Data Network (AODN, http://portal.aodn.org.au/webportal/) in a format that is suitable for model evaluation or data assimilation, and

  18. Testing a Moderated Model of Satisfaction with Urban Living Using Data for Brisbane-South East Queensland, Australia

    Science.gov (United States)

    Mccrea, Rod; Stimson, Robert; Western, John

    2005-01-01

    Using survey data collected from households living in the Brisbane-South East Queensland region, a rapidly growing metropolis in Australia, path analysis is used to test links between urban residents' assessment of various urban attributes and their level of satisfaction in three urban domains--housing, neighbourhood or local area, and the wider…

  19. Model-driven methodology for rapid deployment of smart spaces based on resource-oriented architectures.

    Science.gov (United States)

    Corredor, Iván; Bernardos, Ana M; Iglesias, Josué; Casar, José R

    2012-01-01

    Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need to have solid understanding about specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.

  20. Model-Driven Methodology for Rapid Deployment of Smart Spaces Based on Resource-Oriented Architectures

    Directory of Open Access Journals (Sweden)

    José R. Casar

    2012-07-01

    Full Text Available Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need to have solid understanding about specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.

  1. PAGIS summary report of phase 1: a common methodological approach based on European data and models

    International Nuclear Information System (INIS)

    Cadelli, N.; Cottone, G.; Bertozzi, G.; Girardi, F.

    1984-01-01

    Since 1982 a joint study has been launched by the CEC with the participation of national institutions in the E.C., aiming at a Performance Assessment of Geological Isolation Systems (PAGIS) for HLW disposal. This document is a summary of the first phase of the study, which was devoted to the collection of data and models and to the choice of an appropriate methodology. To this purpose, real or national sites have been chosen, which are representative of three types of continental geological formations in the E.C.: clay, granite and salt (although the choices imply no commitment of any kind about their final use). Moreover, sub-seabed areas have also been identified. The study covers the following items: - basic data on waste characteristics, site data and repository designs; - methodology, which allows sensitivity and uncertainty analyses to be performed, as well as the assessment of radiation doses to individuals and populations; - preliminary modelling of radionuclide release and migration through the geosphere (near- and far-field) and the biosphere following their various pathways to man; - selection of the most relevant radionuclide release scenarios and their probability of occurrence. Reference values have been selected for the basic data as well as variants covering the various options which are under consideration in the different Countries of the E.C.

  2. WAVFH delegates' reports: Australia

    International Nuclear Information System (INIS)

    Scanlan, W.A.

    1986-01-01

    Radiation measuring and control before Chernobyl: Continuous measurements of fallout in different parts of Australia, including the food producing areas, have been made since the mid 1950s. Levels have decreased rapidly since the cessation of atmospheric nuclear tests in the Southern Hemisphere in 1974 and in the Northern Hemisphere in 1980. Measurements of concentrations of radionuclides arising from fallout were made for the major groups of foods affected by the radioactive contaminants, starting in the 1950s and continuing until concentrations were so low that further effort in measurement was not warranted, i.e., less than 0.1 Bq/kg or 0.1 Bq/l. Changes in the concentrations of radionuclides in foods follow the same trends as the fallout levels. Based on the low levels of fallout measured in Australia since the 1950s, and taking into account the extremely low levels during the past decade, the concentrations of radionuclides arising from fallout in foods grown and processed in Australia are extremely small. Results from the fallout from Chernobyl: Since the Chernobyl accident, measurements of the concentrations of 137Cs in a variety of foodstuffs grown in Australia have been made, mainly for export purposes. A summary of the results of these measurements is given in Table III of Attachment 2. No 134Cs has been detected, nor is it likely to be. By taking into account these measurements, the earlier measurements of foodstuffs, predictive modelling values and the very low levels of fallout in deposit and in air, it is concluded that the concentrations of 137Cs in all foodstuffs grown in Australia are extremely small. Accordingly, their consumption would result in no significant risk to the health of a population. With world atmospheric conditions being as they are, it will probably be 12 to 18 months before any fallout reaches Australia. Even if some fallout does occur, it will be minimal and should not significantly increase our very low natural levels

  3. Improving Baseline Model Assumptions: Evaluating the Impacts of Typical Methodological Approaches in Watershed Models

    Science.gov (United States)

    Muenich, R. L.; Kalcic, M. M.; Teshager, A. D.; Long, C. M.; Wang, Y. C.; Scavia, D.

    2017-12-01

    Thanks to the availability of open-source software, online tutorials, and advanced software capabilities, watershed modeling has expanded its user-base and applications significantly in the past thirty years. Even complicated models like the Soil and Water Assessment Tool (SWAT) are being used and documented in hundreds of peer-reviewed publications each year, and are likely applied even more widely in practice. These models can help improve our understanding of present, past, and future conditions, or analyze important "what-if" management scenarios. However, baseline data and methods are often adopted and applied without rigorous testing. In multiple collaborative projects, we have evaluated the influence of some of these common approaches on model results. Specifically, we examined impacts of baseline data and assumptions involved in manure application, combined sewer overflows, and climate data incorporation across multiple watersheds in the Western Lake Erie Basin. In these efforts, we seek to understand the impact of using typical modeling data and assumptions, versus using improved data and enhanced assumptions, on model outcomes and thus ultimately, study conclusions. We provide guidance for modelers as they adopt and apply data and models for their specific study region. While it is difficult to quantitatively assess the full uncertainty surrounding model input data and assumptions, recognizing the impacts of model input choices is important when considering actions at both the field and watershed scales.

  4. Assessment of historical leak model methodology as applied to the REDOX high-level waste tank SX-108

    International Nuclear Information System (INIS)

    JONES, T.E.

    1999-01-01

    Using the Historical Leak Model approach, the estimated leak rate (and therefore, projected leak volume) for Tank 241-SX-108 could not be reproduced using the data included in the initial document describing the leak methodology. An analysis of parameters impacting tank heat load calculations strongly suggests that the historical tank operating data lack the precision and accuracy required to estimate tank leak volumes using the Historical Leak Model methodology.

  5. Methodology for Outdoor Water Savings Model and Spreadsheet Tool for U.S. and Selected States

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Alison A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Chen, Yuting [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dunham, Camilla [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fuchs, Heidi [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Price, Sarah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Stratton, Hannah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-07-31

    Green lawns and landscaping are archetypical of the populated American landscape, and typically require irrigation, which corresponds to a significant fraction of residential, commercial, and institutional water use. In North American cities, the estimated portion of residential water used for outdoor purposes ranges from 22-38% in cooler climates up to 59-67% in dry and hot environments, while turfgrass coverage within the United States spans 11.1-20.2 million hectares (Milesi et al. 2009). One national estimate uses satellite and aerial photography data to develop a relationship between impervious surface and lawn surface area, yielding a conservative estimate of 16.4 (± 3.6) million hectares of lawn surface area in the United States—an area three times larger than that devoted to any irrigated crop (Milesi et al. 2005). One approach that holds promise for cutting unnecessary outdoor water use is the increased deployment of “smart” irrigation controllers to increase the water efficiency of irrigation systems. This report describes the methodology and inputs employed in a mathematical model that quantifies the effects of the U.S. Environmental Protection Agency’s WaterSense labeling program for one such type of controller, weather-based irrigation controllers (WBIC). This model builds off that described in “Methodology for National Water Savings Model and Spreadsheet Tool–Outdoor Water Use” and uses a two-tiered approach to quantify outdoor water savings attributable to the WaterSense program for WBIC, as well as net present value (NPV) of that savings. While the first iteration of the model assessed national impacts using averaged national values, this version begins by evaluating impacts in three key large states that make up a sizable portion of the irrigation market: California, Florida, and Texas. These states are considered to be the principal market of “smart” irrigation controllers that may result in the bulk of national savings. Modeled
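
A minimal sketch of the savings-to-NPV arithmetic a model like this performs; every figure below (per-controller water savings, water price, controller count, discount rate, horizon) is hypothetical rather than taken from the report.

```python
def npv(annual_saving, rate, years):
    """Present value of a constant annual saving over a fixed horizon."""
    return sum(annual_saving / (1.0 + rate) ** t for t in range(1, years + 1))

# All figures hypothetical: per-controller water savings, water price,
# number of controllers in service, discount rate and analysis horizon.
kilolitres_saved = 40.0          # kL saved per controller per year
price_per_kilolitre = 2.0        # $/kL
controllers = 100_000
annual_saving = kilolitres_saved * price_per_kilolitre * controllers
print(f"NPV = ${npv(annual_saving, rate=0.03, years=15):,.0f}")
```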

  6. Methodology for predicting oily mixture properties in the mathematical modeling of molecular distillation

    Directory of Open Access Journals (Sweden)

    M. F. Gayol

    2017-06-01

    Full Text Available A methodology for predicting the thermodynamic and transport properties of a multi-component oily mixture, in which the different mixture components are grouped into a small number of pseudo components, is shown. This prediction of properties is used in the mathematical modeling of molecular distillation, which consists of a system of differential equations in partial derivatives, according to the principles of the Transport Phenomena, and is solved by an implicit finite difference method using a computer code. The mathematical model was validated with experimental data, specifically the molecular distillation of a deodorizer distillate (DD) of sunflower oil. The results obtained were satisfactory, with errors less than 10% with respect to the experimental data in a temperature range in which it is possible to apply the proposed method.

  7. Methodology for predicting oily mixture properties in the mathematical modeling of molecular distillation

    International Nuclear Information System (INIS)

    Gayol, M.F.; Pramparo, M.C.; Miró Erdmann, S.M.

    2017-01-01

    A methodology for predicting the thermodynamic and transport properties of a multi-component oily mixture, in which the different mixture components are grouped into a small number of pseudo components is shown. This prediction of properties is used in the mathematical modeling of molecular distillation, which consists of a system of differential equations in partial derivatives, according to the principles of the Transport Phenomena and is solved by an implicit finite difference method using a computer code. The mathematical model was validated with experimental data, specifically the molecular distillation of a deodorizer distillate (DD) of sunflower oil. The results obtained were satisfactory, with errors less than 10% with respect to the experimental data in a temperature range in which it is possible to apply the proposed method.
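
The abstracts above mention an implicit finite difference solution of the governing PDE system. The sketch below shows that scheme family on the simplest representative problem, a 1-D diffusion equation with backward-Euler time stepping; the grid, diffusivity and boundary values are arbitrary, and this is not the authors' distillation model.

```python
import numpy as np

nx, nt = 50, 200
L, D, dt = 1.0, 1e-3, 0.05
dx = L / (nx - 1)
r = D * dt / dx**2

# Backward-Euler system (I + tridiagonal Laplacian) u^{n+1} = u^n,
# with Dirichlet values held fixed at both ends.
A = np.zeros((nx, nx))
np.fill_diagonal(A, 1.0 + 2.0 * r)
np.fill_diagonal(A[1:], -r)      # sub-diagonal
np.fill_diagonal(A[:, 1:], -r)   # super-diagonal
A[0, :] = 0.0; A[0, 0] = 1.0     # boundary rows: identity
A[-1, :] = 0.0; A[-1, -1] = 1.0

u = np.zeros(nx)
u[0] = 1.0   # fixed concentration at the left boundary
for _ in range(nt):
    b = u.copy()
    b[0], b[-1] = 1.0, 0.0
    u = np.linalg.solve(A, b)    # implicit schemes are unconditionally stable
print(u[:5].round(4))
```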

  8. A methodology for modeling surface effects on stiff and soft solids

    Science.gov (United States)

    He, Jin; Park, Harold S.

    2018-06-01

    We present a computational method that can be applied to capture surface stress and surface tension-driven effects in both stiff, crystalline nanostructures, like size-dependent mechanical properties, and soft solids, like elastocapillary effects. We show that the method is equivalent to the classical Young-Laplace model. The method is based on converting surface tension and surface elasticity on a zero-thickness surface to an initial stress and corresponding elastic properties on a finite thickness shell, where the consideration of geometric nonlinearity enables capturing the out-of-plane component of the surface tension that results for curved surfaces through evaluation of the surface stress in the deformed configuration. In doing so, we are able to use commercially available finite element technology, and thus do not require consideration and implementation of the classical Young-Laplace equation. Several examples are presented to demonstrate the capability of the methodology for modeling surface stress in both soft solids and crystalline nanostructures.
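
For reference, the equivalence the abstract appeals to can be stated compactly. Here gamma is the surface tension, R_1 and R_2 the principal radii of curvature, and h the (assumed small) thickness of the finite-thickness shell used in place of the zero-thickness surface; this is the standard identification, not a quotation from the paper.

```latex
% Young--Laplace pressure jump across a curved surface with principal
% radii of curvature R_1 and R_2:
\[
  \Delta p \;=\; \gamma \left( \frac{1}{R_1} + \frac{1}{R_2} \right)
\]
% Representing the zero-thickness surface tension as an initial in-plane
% stress sigma_0 acting through a shell of small finite thickness h:
\[
  \sigma_0 \, h \;=\; \gamma
  \qquad\Longrightarrow\qquad
  \sigma_0 \;=\; \frac{\gamma}{h}
\]
```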

  9. Reliability Modeling of Electromechanical System with Meta-Action Chain Methodology

    Directory of Open Access Journals (Sweden)

    Genbao Zhang

    2018-01-01

    Full Text Available To establish a more flexible and accurate reliability model, a reliability modeling and solving algorithm based on the meta-action chain concept is used in this thesis. Instead of estimating the reliability of the whole system only in the standard operating mode, this dissertation adopts the structure chain and the operating action chain for system reliability modeling. The failure information and structure information for each component are integrated into the model to overcome the given factors applied in traditional modeling. In industrial applications, there may be different operating modes for a multicomponent system. The meta-action chain methodology can estimate the system reliability under different operating modes by modeling the components with varying failure sensitivities. This approach has been demonstrated by computing several electromechanical system cases. The results indicate that the process can improve the system reliability estimation. It is an effective tool for solving the reliability estimation problem in systems under various operating modes.
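
A minimal sketch of chain-style reliability composition under different operating modes, assuming independent meta-actions in series; the action list and per-action reliabilities are invented for illustration and are not from the paper.

```python
# Hypothetical per-meta-action reliabilities; series (independence) assumed.
meta_actions = {"clamp": 0.999, "rotate": 0.995, "feed": 0.990, "index": 0.997}
modes = {
    "standard":  ["clamp", "rotate", "feed"],
    "high_load": ["clamp", "rotate", "feed", "index"],
}

def chain_reliability(mode):
    """System reliability as the product over the meta-action chain
    exercised by the given operating mode."""
    r = 1.0
    for action in modes[mode]:
        r *= meta_actions[action]
    return r

for m in modes:
    print(m, round(chain_reliability(m), 5))
```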

  10. Introduction of a methodology for visualization and graphical interpretation of Bayesian classification models.

    Science.gov (United States)

    Balfer, Jenny; Bajorath, Jürgen

    2014-09-22

    Supervised machine learning models are widely used in chemoinformatics, especially for the prediction of new active compounds or targets of known actives. Bayesian classification methods are among the most popular machine learning approaches for the prediction of activity from chemical structure. Much work has focused on predicting structure-activity relationships (SARs) on the basis of experimental training data. By contrast, only a few efforts have thus far been made to rationalize the performance of Bayesian or other supervised machine learning models and better understand why they might succeed or fail. In this study, we introduce an intuitive approach for the visualization and graphical interpretation of naïve Bayesian classification models. Parameters derived during supervised learning are visualized and interactively analyzed to gain insights into model performance and identify features that determine predictions. The methodology is introduced in detail and applied to assess Bayesian modeling efforts and predictions on compound data sets of varying structural complexity. Different classification models and features determining their performance are characterized in detail. A prototypic implementation of the approach is provided.
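
A minimal sketch of the kind of quantity such a visualization is built on: for a naive Bayes classifier, per-feature log-odds contributions decompose a single prediction. The data below are random binary "fingerprints" with a toy activity rule; none of this reproduces the authors' data sets or tooling.

```python
import numpy as np
from sklearn.naive_bayes import BernoulliNB

# Hypothetical fingerprint-style data: rows are compounds, columns are
# binary structural features; activity follows a toy two-feature rule.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(200, 8))
y = (X[:, 0] * (1 - X[:, 3])).astype(int)

model = BernoulliNB().fit(X, y)

# Per-feature log-odds contribution for one compound: positive values push
# the prediction towards "active" -- the quantity one would visualise.
x = X[0]
log_p1 = model.feature_log_prob_[1]   # log P(feature=1 | active)
log_p0 = model.feature_log_prob_[0]   # log P(feature=1 | inactive)
contrib = np.where(x == 1,
                   log_p1 - log_p0,
                   np.log1p(-np.exp(log_p1)) - np.log1p(-np.exp(log_p0)))
print(dict(enumerate(np.round(contrib, 3))))
```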

  11. Development of a practical methodology for integrating shoreline oil-holding capacity into modeling

    International Nuclear Information System (INIS)

    Schmidt Etkin, D.; French-McCay, D.; Rowe, J.; Michel, J.; Boufadel, M.; Li, H.

    2008-01-01

    The factors that influence the behaviour of oil in the aftermath of an oil spill on water include oil type and characteristics; oil thickness on the shoreline; time until shoreline impact; timing with regards to tides; weathering during and after the spill; and nearshore wave energy. The oil behaviour also depends on the shoreline characteristics, particularly porosity and permeability. The interactions of spilled oil with sediments on beaches must be well understood in order to model the oil spill trajectory, fate and risk. The movement of oil can be most accurately simulated if the algorithm incorporates an estimate of shoreline oil retention. This paper presented a literature review of relevant shoreline oiling studies and considered the relevance of study findings for inclusion in modelling. Survey data from a detailed shoreline cleanup assessment team (SCAT) were analyzed for patterns in oil penetration and oil-holding capacity by shoreline sediment type and oil type for potential use in modelling algorithms. A theoretical beach hydraulics model was then developed for use in a stochastic spill model. Gaps in information were identified, including the manner in which wave action and other environmental variables have an impact on the dynamic processes involved in shoreline oiling. The methodology presented in this paper can be used to estimate the amount of oil held by a shoreline upon impact to allow a trajectory model to more accurately project the total spread of oil. 27 refs., 13 tabs., 3 figs
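
A crude sketch of the kind of oil-holding-capacity estimate such an algorithm needs: retained volume as a surface film plus oil held in the sediment pore space. All dimensions, the holding thickness and the porosity below are hypothetical, and the SCAT-derived relations discussed in the paper are more elaborate.

```python
def shoreline_retention(length_m, width_m, film_thickness_m, porosity, depth_m):
    """Crude oil-holding capacity (m^3): surface film plus oil retained in
    the sediment pore space over the penetration depth."""
    surface = length_m * width_m * film_thickness_m
    subsurface = length_m * width_m * depth_m * porosity
    return surface + subsurface

# Hypothetical sand-beach segment: 500 m long, 10 m wide oiled band,
# 5 mm surface holding thickness, 0.1 m penetration into 30%-porosity sand.
print(shoreline_retention(500, 10, 0.005, 0.30, 0.1), "m^3 retained")
```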

  12. A generalized methodology for identification of threshold for HRU delineation in SWAT model

    Science.gov (United States)

    M, J.; Sudheer, K.; Chaubey, I.; Raj, C.

    2016-12-01

    The distributed hydrological model, the Soil and Water Assessment Tool (SWAT), is a comprehensive hydrologic model widely used for making various decisions. The simulation accuracy of a distributed hydrological model differs according to the mechanism involved in the subdivision of the watershed. SWAT sub-divides the watershed and the sub-basins into small computing units known as hydrologic response units (HRUs). The delineation of HRUs is done based on unique combinations of land use, soil types, and slope within the sub-watersheds, which are not spatially defined. The computations in SWAT are done at the HRU level and are then aggregated up to the sub-basin outlet, which is routed through the stream system. Generally, the HRUs are delineated by considering threshold percentages of land use, soil and slope, given by the modeler to decrease the computation time of the model. The thresholds constrain the minimum area for constructing an HRU. In the current HRU delineation practice in SWAT, any land use, soil or slope class within a sub-basin that falls below the predefined threshold is subsumed by the dominating land use, soil and slope classes, which introduces some level of ambiguity in the process simulations in terms of inappropriate representation of the area (see the sketch after this record). The loss of information due to variation in the threshold values depends highly on the purpose of the study. Therefore this research studies the effects of threshold values for HRU delineation on the hydrological modeling of SWAT for sediment simulations and suggests guidelines for selecting appropriate threshold values considering sediment simulation accuracy. A preliminary study was done on an Illinois watershed by assigning different thresholds for land use and soil. A general methodology was proposed for identifying an appropriate threshold for HRU delineation in the SWAT model that considered computational time and accuracy of the simulation
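
A minimal sketch of the thresholding behaviour described above, not SWAT's actual implementation: classes below the areal threshold are dropped and their area is reabsorbed proportionally by the dominant classes. The land-use composition is hypothetical.

```python
def apply_threshold(class_fractions, threshold):
    """Drop classes below the areal threshold and renormalise the rest,
    mimicking how sub-threshold classes are subsumed by dominant ones."""
    kept = {k: v for k, v in class_fractions.items() if v >= threshold}
    total = sum(kept.values())
    return {k: round(v / total, 3) for k, v in kept.items()}

# Hypothetical land-use composition of one sub-basin (fractions of area).
landuse = {"corn": 0.55, "soy": 0.30, "forest": 0.10, "urban": 0.05}
print(apply_threshold(landuse, threshold=0.10))   # urban is subsumed
```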

  13. A consistent modelling methodology for secondary settling tanks in wastewater treatment.

    Science.gov (United States)

    Bürger, Raimund; Diehl, Stefan; Nopens, Ingmar

    2011-03-01

    The aim of this contribution is partly to build consensus on a consistent modelling methodology (CMM) of complex real processes in wastewater treatment by combining classical concepts with results from applied mathematics, and partly to apply it to the clarification-thickening process in the secondary settling tank. In the CMM, the real process should be approximated by a mathematical model (process model; ordinary or partial differential equation (ODE or PDE)), which in turn is approximated by a simulation model (numerical method) implemented on a computer. These steps have often not been carried out in a correct way. The secondary settling tank was chosen as a case since this is one of the most complex processes in a wastewater treatment plant and simulation models developed decades ago have no guarantee of satisfying fundamental mathematical and physical properties. Nevertheless, such methods are still used in commercial tools to date. This particularly becomes of interest as the state-of-the-art practice is moving towards plant-wide modelling. Then all submodels interact and errors propagate through the model and severely hamper any calibration effort and, hence, the predictive purpose of the model. The CMM is described by applying it first to a simple conversion process in the biological reactor yielding an ODE solver, and then to the solid-liquid separation in the secondary settling tank, yielding a PDE solver. Time has come to incorporate established mathematical techniques into environmental engineering, and wastewater treatment modelling in particular, and to use proven reliable and consistent simulation models. Copyright © 2011 Elsevier Ltd. All rights reserved.

  14. Application of fault tree methodology to modeling of the AP1000 plant digital reactor protection system

    International Nuclear Information System (INIS)

    Teolis, D.S.; Zarewczynski, S.A.; Detar, H.L.

    2012-01-01

    The reactor trip system (RTS) and engineered safety features actuation system (ESFAS) in nuclear power plants utilize instrumentation and control (IC) to provide automatic protection against unsafe and improper reactor operation during steady-state and transient power operations. During normal operating conditions, various plant parameters are continuously monitored to assure that the plant is operating in a safe state. In response to deviations of these parameters from pre-determined set points, the protection system will initiate the actions required to maintain the reactor in a safe state. These actions may include shutting down the reactor by opening the reactor trip breakers and actuating safety equipment based on the situation. The RTS and ESFAS are represented in probabilistic risk assessments (PRAs) to reflect the impact of their contribution to core damage frequency (CDF). The reactor protection systems (RPS) in existing nuclear power plants are generally analog based, and there is general consensus within the PRA community on fault tree modeling of these systems. In new plants, such as the AP1000 plant, the RPS is based on digital technology. Digital systems are more complex combinations of hardware components and software. This combination of complex hardware and software can result in the presence of faults and failure modes unique to a digital RPS. The United States Nuclear Regulatory Commission (NRC) is currently performing research on the development of probabilistic models for digital systems for inclusion in PRAs; however, no consensus methodology exists at this time. Westinghouse is currently updating the AP1000 plant PRA to support initial operation of plants currently under construction in the United States. The digital RPS is modeled using fault tree methodology similar to that used for analog-based systems. This paper presents high-level descriptions of a typical analog-based RPS and of the AP1000 plant digital RPS. Application of current fault
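
For readers unfamiliar with the formalism, the sketch below evaluates a toy fault tree with independent basic events through OR and AND gates. The structure (two redundant digital divisions) and the failure probabilities are invented for illustration and are not the AP1000 model.

```python
def p_or(*ps):
    """Probability that at least one independent input event occurs."""
    q = 1.0
    for p in ps:
        q *= 1.0 - p
    return 1.0 - q

def p_and(*ps):
    """Probability that all independent input events occur."""
    out = 1.0
    for p in ps:
        out *= p
    return out

# Hypothetical top event: the protection function fails if BOTH redundant
# digital divisions fail; each division fails on processor OR software failure.
division = p_or(1e-4, 5e-5)
top_event = p_and(division, division)
print(f"top event probability = {top_event:.2e}")
```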

  15. External validation of multivariable prediction models: a systematic review of methodological conduct and reporting

    Science.gov (United States)

    2014-01-01

    Background Before considering whether to use a multivariable (diagnostic or prognostic) prediction model, it is essential that its performance be evaluated in data that were not used to develop the model (referred to as external validation). We critically appraised the methodological conduct and reporting of external validation studies of multivariable prediction models. Methods We conducted a systematic review of articles describing some form of external validation of one or more multivariable prediction models indexed in PubMed core clinical journals published in 2010. Study data were extracted in duplicate on design, sample size, handling of missing data, reference to the original study developing the prediction models and predictive performance measures. Results 11,826 articles were identified and 78 were included for full review, which described the evaluation of 120 prediction models in participant data that were not used to develop the model. Thirty-three articles described both the development of a prediction model and an evaluation of its performance on a separate dataset, and 45 articles described only the evaluation of an existing published prediction model on another dataset. Fifty-seven percent of the prediction models were presented and evaluated as simplified scoring systems. Sixteen percent of articles failed to report the number of outcome events in the validation datasets. Fifty-four percent of studies made no explicit mention of missing data. Sixty-seven percent did not report evaluating model calibration whilst most studies evaluated model discrimination. It was often unclear whether the reported performance measures were for the full regression model or for the simplified models. Conclusions The vast majority of studies describing some form of external validation of a multivariable prediction model were poorly reported with key details frequently not presented. The validation studies were characterised by poor design, inappropriate handling
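
The two performance measures the review found under-reported, discrimination and calibration, can be computed as sketched below. The data are simulated: a published linear predictor is applied to a new, deliberately mis-calibrated population, and all numbers are hypothetical.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 500
lp = rng.normal(0.0, 1.0, n)   # linear predictor of a published model (hypothetical)
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(0.3 + 0.7 * lp))))  # mis-calibrated truth

# Discrimination: c-statistic via the rank (Mann-Whitney) formulation.
ranks = np.empty(n)
ranks[np.argsort(lp)] = np.arange(1, n + 1)
n1 = y.sum()
c_stat = (ranks[y == 1].sum() - n1 * (n1 + 1) / 2) / (n1 * (n - n1))

# Calibration: logistic regression of outcomes on the linear predictor;
# intercept 0 and slope 1 would indicate perfect calibration.
fit = sm.Logit(y, sm.add_constant(lp)).fit(disp=0)
print(f"c-statistic={c_stat:.3f}, "
      f"intercept={fit.params[0]:.2f}, slope={fit.params[1]:.2f}")
```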

  16. Methodological challenges to bridge the gap between regional climate and hydrology models

    Science.gov (United States)

    Bozhinova, Denica; José Gómez-Navarro, Juan; Raible, Christoph; Felder, Guido

    2017-04-01

    The frequency and severity of floods worldwide, together with their impacts, are expected to increase under climate change scenarios. It is therefore very important to gain insight into the physical mechanisms responsible for such events in order to constrain the associated uncertainties. Model simulations of the climate and hydrological processes are important tools that can provide insight into the underlying physical processes and thus enable an accurate assessment of the risks. Coupled together, they can provide a physically consistent picture that allows the phenomenon to be assessed in a comprehensive way. However, climate and hydrological models work at different temporal and spatial scales, so there are a number of methodological challenges that need to be carefully addressed. An important issue pertains to the presence of biases in the simulation of precipitation. Climate models in general, and Regional Climate Models (RCMs) in particular, are affected by a number of systematic biases that limit their reliability. In many studies, most prominently the assessment of changes due to climate change, such biases are minimised by applying the so-called delta approach, which focuses on changes while disregarding absolute values that are more affected by biases. However, this approach is not suitable in this scenario, as the absolute value of precipitation, rather than the change, is fed into the hydrological model. Therefore, bias has to be removed beforehand, which is a complex matter for which various methodologies have been proposed. In this study, we apply and discuss the advantages and caveats of two different methodologies that correct the simulated precipitation to minimise differences with respect to an observational dataset: a linear fit (FIT) of the accumulated distributions and Quantile Mapping (QM). The target region is Switzerland, and therefore the observational dataset is provided by MeteoSwiss. The RCM is the Weather Research and Forecasting model (WRF), driven at the
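
A minimal sketch of one of the two correction methods discussed, empirical quantile mapping; the gamma-distributed "model" and "observed" precipitation series below are synthetic stand-ins, not the WRF or MeteoSwiss data.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_new):
    """Empirical quantile mapping: map each new model value to its quantile
    in the historical model distribution, then to the observed value at
    that same quantile."""
    qs = np.linspace(0.0, 1.0, 101)
    model_q = np.quantile(model_hist, qs)
    obs_q = np.quantile(obs_hist, qs)
    ranks = np.interp(model_new, model_q, qs)
    return np.interp(ranks, qs, obs_q)

# Synthetic daily precipitation (mm): the 'model' has a systematic wet bias.
rng = np.random.default_rng(1)
obs = rng.gamma(shape=0.8, scale=4.0, size=3650)
mod = rng.gamma(shape=0.8, scale=5.5, size=3650)
corrected = quantile_map(mod, obs, mod[:365])
print(mod[:365].mean(), corrected.mean(), obs.mean())
```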

  17. A Hybrid Methodology for Modeling Risk of Adverse Events in Complex Health-Care Settings.

    Science.gov (United States)

    Kazemi, Reza; Mosleh, Ali; Dierks, Meghan

    2017-03-01

    In spite of increased attention to quality and efforts to provide safe medical care, adverse events (AEs) are still frequent in clinical practice. Reports from various sources indicate that a substantial number of hospitalized patients suffer treatment-caused injuries while in the hospital. While risk cannot be entirely eliminated from health-care activities, an important goal is to develop effective and durable mitigation strategies to render the system "safer." In order to do this, though, we must develop models that comprehensively and realistically characterize the risk. In the health-care domain, this can be extremely challenging due to the wide variability in the way that health-care processes and interventions are executed and also due to the dynamic nature of risk in this particular domain. In this study, we have developed a generic methodology for evaluating dynamic changes in AE risk in acute care hospitals as a function of organizational and nonorganizational factors, using a combination of modeling formalisms. First, a system dynamics (SD) framework is used to demonstrate how organizational-level and policy-level contributions to risk evolve over time, and how policies and decisions may affect the general system-level contribution to AE risk. It also captures the feedback of organizational factors and decisions over time and the nonlinearities in these feedback effects. SD is a popular approach to understanding the behavior of complex social and economic systems. It is a simulation-based, differential equation modeling tool that is widely used in situations where the formal model is complex and an analytical solution is very difficult to obtain. Second, a Bayesian belief network (BBN) framework is used to represent patient-level factors and also physician-level decisions and factors in the management of an individual patient, which contribute to the risk of hospital-acquired AE. BBNs are networks of probabilities that can capture probabilistic relations
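
A minimal sketch of the BBN half of such a hybrid: marginalizing a conditional probability table for an adverse event over two parent factors, one of which could in principle be driven by the SD model. All probabilities are invented for illustration.

```python
# All probabilities below are invented for illustration.
p_acuity_high = 0.3     # patient-level factor
p_workload_high = 0.4   # organizational factor; could be fed in from the SD model

p_ae = {  # conditional probability table: P(AE | acuity, workload)
    (True, True): 0.10, (True, False): 0.04,
    (False, True): 0.02, (False, False): 0.005,
}

# Marginalize over the two parent variables to get the overall AE risk.
total = sum(p_ae[(a, w)]
            * (p_acuity_high if a else 1 - p_acuity_high)
            * (p_workload_high if w else 1 - p_workload_high)
            for a in (True, False) for w in (True, False))
print(f"P(adverse event) = {total:.4f}")
```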

  18. Study on dynamic team performance evaluation methodology based on team situation awareness model

    International Nuclear Information System (INIS)

    Kim, Suk Chul

    2005-02-01

    The purpose of this thesis is to provide a theoretical framework and an evaluation methodology for the dynamic task performance of operating teams at nuclear power plants under dynamic and tactical environments such as a radiological accident. This thesis suggests a team dynamic task performance evaluation model, the so-called team crystallization model, stemming from Endsley's situation awareness model and comprising four elements: state, information, organization, and orientation, together with quantification methods using a system dynamics approach and a communication process model based on a receding horizon control approach. The team crystallization model is a holistic approach for evaluating team dynamic task performance in conjunction with team situation awareness, considering physical system dynamics and team behavioral dynamics for a tactical and dynamic task at a nuclear power plant. This model provides a systematic measure to evaluate time-dependent team effectiveness or performance affected by multiple agents such as plant states, communication quality in terms of transferring situation-specific information and strategies for achieving the team task goal at a given time, and organizational factors. To demonstrate the applicability of the proposed model and its quantification method, a case study was carried out using data obtained from a full-scope power plant simulator for 1,000 MWe pressurized water reactors, with four on-the-job operating groups and one expert group that knows the accident sequences. Simulated results for team dynamic task performance, with reference to key plant parameter behavior and the team-specific organizational center of gravity and cue-and-response matrix, showed good agreement with observed values. The team crystallization model will be a useful and effective tool for evaluating team effectiveness, for example when recruiting new operating teams for new plants in a cost-benefit manner. Also, this model can be utilized as a systematic analysis tool for

  19. Study on dynamic team performance evaluation methodology based on team situation awareness model

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Suk Chul

    2005-02-15

    The purpose of this thesis is to provide a theoretical framework and an evaluation methodology for the dynamic task performance of operating teams at nuclear power plants under dynamic and tactical environments such as a radiological accident. This thesis suggests a team dynamic task performance evaluation model, the so-called team crystallization model, stemming from Endsley's situation awareness model and comprising four elements: state, information, organization, and orientation, together with quantification methods using a system dynamics approach and a communication process model based on a receding horizon control approach. The team crystallization model is a holistic approach for evaluating team dynamic task performance in conjunction with team situation awareness, considering physical system dynamics and team behavioral dynamics for a tactical and dynamic task at a nuclear power plant. This model provides a systematic measure to evaluate time-dependent team effectiveness or performance affected by multiple agents such as plant states, communication quality in terms of transferring situation-specific information and strategies for achieving the team task goal at a given time, and organizational factors. To demonstrate the applicability of the proposed model and its quantification method, a case study was carried out using data obtained from a full-scope power plant simulator for 1,000 MWe pressurized water reactors, with four on-the-job operating groups and one expert group that knows the accident sequences. Simulated results for team dynamic task performance, with reference to key plant parameter behavior and the team-specific organizational center of gravity and cue-and-response matrix, showed good agreement with observed values. The team crystallization model will be a useful and effective tool for evaluating team effectiveness, for example when recruiting new operating teams for new plants in a cost-benefit manner. Also, this model can be utilized as a systematic analysis tool for

  20. Greenhouse gas network design using backward Lagrangian particle dispersion modelling - Part 1: Methodology and Australian test case

    Science.gov (United States)

    Ziehn, T.; Nickless, A.; Rayner, P. J.; Law, R. M.; Roff, G.; Fraser, P.

    2014-09-01

    This paper describes the generation of optimal atmospheric measurement networks for determining carbon dioxide fluxes over Australia using inverse methods. A Lagrangian particle dispersion model is used in reverse mode together with a Bayesian inverse modelling framework to calculate the relationship between weekly surface fluxes, comprising contributions from the biosphere and fossil fuel combustion, and hourly concentration observations for the Australian continent. Meteorological driving fields are provided by the regional version of the Australian Community Climate and Earth System Simulator (ACCESS) at 12 km resolution at an hourly timescale. Prior uncertainties are derived on a weekly timescale for biosphere fluxes and fossil fuel emissions from high-resolution model runs using the Community Atmosphere Biosphere Land Exchange (CABLE) model and the Fossil Fuel Data Assimilation System (FFDAS) respectively. The influence from outside the modelled domain is investigated, but proves to be negligible for the network design. Existing ground-based measurement stations in Australia are assessed in terms of their ability to constrain local flux estimates from the land. We find that the six stations that are currently operational are already able to reduce the uncertainties on surface flux estimates by about 30%. A candidate list of 59 stations is generated based on logistic constraints and an incremental optimisation scheme is used to extend the network of existing stations. In order to achieve an uncertainty reduction of about 50%, we need to double the number of measurement stations in Australia. Assuming equal data uncertainties for all sites, new stations would be mainly located in the northern and eastern part of the continent.

  1. Greenhouse gas network design using backward Lagrangian particle dispersion modelling − Part 1: Methodology and Australian test case

    Directory of Open Access Journals (Sweden)

    T. Ziehn

    2014-09-01

    Full Text Available This paper describes the generation of optimal atmospheric measurement networks for determining carbon dioxide fluxes over Australia using inverse methods. A Lagrangian particle dispersion model is used in reverse mode together with a Bayesian inverse modelling framework to calculate the relationship between weekly surface fluxes, comprising contributions from the biosphere and fossil fuel combustion, and hourly concentration observations for the Australian continent. Meteorological driving fields are provided by the regional version of the Australian Community Climate and Earth System Simulator (ACCESS) at 12 km resolution at an hourly timescale. Prior uncertainties are derived on a weekly timescale for biosphere fluxes and fossil fuel emissions from high-resolution model runs using the Community Atmosphere Biosphere Land Exchange (CABLE) model and the Fossil Fuel Data Assimilation System (FFDAS) respectively. The influence from outside the modelled domain is investigated, but proves to be negligible for the network design. Existing ground-based measurement stations in Australia are assessed in terms of their ability to constrain local flux estimates from the land. We find that the six stations that are currently operational are already able to reduce the uncertainties on surface flux estimates by about 30%. A candidate list of 59 stations is generated based on logistic constraints and an incremental optimisation scheme is used to extend the network of existing stations. In order to achieve an uncertainty reduction of about 50%, we need to double the number of measurement stations in Australia. Assuming equal data uncertainties for all sites, new stations would be mainly located in the northern and eastern part of the continent.
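
A toy version of the incremental optimisation scheme described above: under a linear Bayesian inversion, stations are added greedily according to how much they reduce total posterior flux uncertainty. The footprint matrix, covariances and problem sizes are arbitrary stand-ins for the real transport-model sensitivities.

```python
import numpy as np

rng = np.random.default_rng(2)
n_flux, n_cand = 20, 8
H = rng.normal(size=(n_cand, n_flux)) * 0.3   # hypothetical station footprints
P0 = np.eye(n_flux)                           # prior flux covariance
R = 0.5                                       # data error variance per station

def posterior_trace(rows):
    """Total posterior flux uncertainty for a set of candidate stations."""
    if not rows:
        return np.trace(P0)
    Hs = H[rows]
    S = Hs @ P0 @ Hs.T + R * np.eye(len(rows))
    K = P0 @ Hs.T @ np.linalg.inv(S)
    return np.trace((np.eye(n_flux) - K @ Hs) @ P0)

chosen = []
for _ in range(4):   # add four stations greedily
    best = min((i for i in range(n_cand) if i not in chosen),
               key=lambda i: posterior_trace(chosen + [i]))
    chosen.append(best)
    print(chosen, round(posterior_trace(chosen), 3))
```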

  2. Development of a new damage function model for power plants: Methodology and applications

    International Nuclear Information System (INIS)

    Levy, J.I.; Hammitt, J.K.; Yanagisawa, Y.; Spengler, J.D.

    1999-01-01

    Recent models have estimated the environmental impacts of power plants, but differences in assumptions and analytical methodologies have led to diverging findings. In this paper, the authors present a new damage function model that synthesizes previous efforts and refines components that have been associated with variations in impact estimates. Their model focuses on end-use emissions and quantifies the direct human health impacts of criteria air pollutants. To compare their model to previous efforts and to evaluate potential policy applications, the authors assess the impacts of an oil and natural gas-fueled cogeneration power plant in Boston, MA. Impacts under baseline assumptions are estimated to be $0.007/kWh of electricity, $0.23/klb of steam, and $0.004/ton-h of chilled water (representing 2--9% of the market value of outputs). Impacts are largely related to ozone (48%) and particulate matter (42%). Addition of upstream emissions and nonpublic health impacts increases externalities by as much as 50%. Sensitivity analyses demonstrate the importance of plant siting, meteorological conditions, epidemiological assumptions, and the monetary value placed on premature mortality as well as the potential influence of global warming. Comparative analyses demonstrate that their model provides reasonable impact estimates and would therefore be applicable in a broad range of policy settings

  3. Accounting for methodological, structural, and parameter uncertainty in decision-analytic models: a practical guide.

    Science.gov (United States)

    Bilcke, Joke; Beutels, Philippe; Brisson, Marc; Jit, Mark

    2011-01-01

    Accounting for uncertainty is now a standard part of decision-analytic modeling and is recommended by many health technology agencies and published guidelines. However, the scope of such analyses is often limited, even though techniques have been developed for presenting the effects of methodological, structural, and parameter uncertainty on model results. To help bring these techniques into mainstream use, the authors present a step-by-step guide that offers an integrated approach to account for different kinds of uncertainty in the same model, along with a checklist for assessing the way in which uncertainty has been incorporated. The guide also addresses special situations such as when a source of uncertainty is difficult to parameterize, resources are limited for an ideal exploration of uncertainty, or evidence to inform the model is not available or not reliable. Methods for identifying the sources of uncertainty that influence results most are also described. Besides guiding analysts, the guide and checklist may be useful to decision makers who need to assess how well uncertainty has been accounted for in a decision-analytic model before using the results to make a decision.

  4. A consistent modelling methodology for secondary settling tanks: a reliable numerical method.

    Science.gov (United States)

    Bürger, Raimund; Diehl, Stefan; Farås, Sebastian; Nopens, Ingmar; Torfs, Elena

    2013-01-01

    The consistent modelling methodology for secondary settling tanks (SSTs) leads to a partial differential equation (PDE) of nonlinear convection-diffusion type as a one-dimensional model for the solids concentration as a function of depth and time. This PDE includes a flux that depends discontinuously on spatial position modelling hindered settling and bulk flows, a singular source term describing the feed mechanism, a degenerating term accounting for sediment compressibility, and a dispersion term for turbulence. In addition, the solution itself is discontinuous. A consistent, reliable and robust numerical method that properly handles these difficulties is presented. Many constitutive relations for hindered settling, compression and dispersion can be used within the model, allowing the user to switch on and off effects of interest depending on the modelling goal as well as investigate the suitability of certain constitutive expressions. Simulations show the effect of the dispersion term on effluent suspended solids and total sludge mass in the SST. The focus is on correct implementation whereas calibration and validation are not pursued.
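
A minimal sketch of the scheme family such a reliable method belongs to: an explicit first-order Godunov update for the batch-settling limit u_t + f(u)_z = 0 with a Vesilind-type hindered-settling flux. The parameters are hypothetical, and the omission of the compression and dispersion terms is a deliberate simplification; this is not the authors' full method.

```python
import numpy as np

v0, k = 6.0, 0.5                        # hypothetical Vesilind parameters
f = lambda u: v0 * u * np.exp(-k * u)   # hindered-settling flux
u_star = 1.0 / k                        # concentration where the flux peaks

nz, dz, dt = 100, 0.03, 0.001           # grid and time step (dt satisfies CFL)
u = np.full(nz, 3.0)                    # uniform initial concentration (kg/m3)

for _ in range(2000):
    ul, ur = u[:-1], u[1:]
    # Godunov interface flux for a unimodal f: min of the endpoint fluxes if
    # ul <= ur, else max over [ur, ul], which may sit at the interior peak.
    fmax = np.where((ur <= u_star) & (u_star <= ul), f(u_star),
                    np.maximum(f(ul), f(ur)))
    F = np.where(ul <= ur, np.minimum(f(ul), f(ur)), fmax)
    flux = np.concatenate(([0.0], F, [0.0]))   # zero flux at top and bottom
    u -= dt / dz * (flux[1:] - flux[:-1])
print(u.round(2))   # a sludge blanket builds up at the bottom of the column
```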

  5. On the Inclusion of Energy-Shifting Demand Response in Production Cost Models: Methodology and a Case Study

    DEFF Research Database (Denmark)

    O'Connell, Niamh; Hale, Elaine; Doebber, Ian

    and communications, power system characteristics, regulatory environments, market structures, and business models. The work described in this report focuses on the enablement of such analysis from the production cost modeling perspective. In particular, we contribute a bottom-up methodology for modeling load...

  6. Modeling and Simulation Behavior Validation Methodology and Extension Model Validation for the Individual Soldier

    Science.gov (United States)

    2015-03-01

    domains. Major model functions include: • Ground combat: Light and heavy forces. • Air mobile forces. • Future forces. • Fixed-wing and rotary-wing...Constraints: • Study must be completed no later than 31 December 2014. • Entity behavior limited to select COMBATXXI Mobility, Unmanned Aerial System...and SQL backend, as well as any open application programming interface (API). • Allows data transparency and data-driven navigation through the model

  7. Testing the methodology for site descriptive modelling. Application for the Laxemar area

    International Nuclear Information System (INIS)

    Andersson, Johan; Berglund, Johan; Follin, Sven; Hakami, Eva; Halvarson, Jan; Hermanson, Jan; Laaksoharju, Marcus; Rhen, Ingvar; Wahlgren, C.H.

    2002-08-01

    A special project has been conducted where the currently available data from the Laxemar area, which is part of the Simpevarp site, have been evaluated and interpreted into a Site Descriptive Model covering: geology, hydrogeology, hydrogeochemistry and rock mechanics. Description of the surface ecosystem has been omitted, since it was re-characterised in another, parallel, project. Furthermore, there has been no evaluation of transport properties. The project is primarily a methodology test. The lessons learnt will be implemented in the Site Descriptive Modelling during the coming site investigation. The intent of the project has been to explore whether available methodology for Site Descriptive Modelling based on surface and borehole data is adequate and to identify potential needs for development and improvement in the methodology. The project has developed, with limitations in scope, a Site Descriptive Model in local scale, corresponding to the situation after completion of the Initial Site Investigations for the Laxemar area (i.e. 'version 1.2' using the vocabulary of the general execution program for the site investigations). The Site Descriptive Model should be reasonable, but should not be regarded as a 'real' model. There are limitations both in input data and in the scope of the analysis. The measured (primary) data constitute a wide range of different measurement results including data from two deep core drilled boreholes. These data both need to be checked for consistency and to be interpreted into a format more amenable for three-dimensional modelling. Examples of such evaluations are estimation of surface geology, lineament interpretation, geological single hole interpretation, hydrogeological single hole interpretation and assessment of hydrogeochemical data. Furthermore, while cross discipline interpretation is encouraged there is also a need for transparency. This means that the evaluations first are made within each discipline and after this

  8. Inverse modeling of emissions for local photooxidant pollution: Testing a new methodology with kriging constraints

    Directory of Open Access Journals (Sweden)

    I. Pison

    2006-07-01

    Full Text Available A new methodology for the inversion of anthropogenic emissions at a local scale is tested. The inversion constraints are provided by a kriging technique used in air-quality forecasting in the Paris area, which computes an analyzed concentration field from network measurements and the first-guess simulation of a CTM. The inverse model developed here is based on the CHIMERE model and its adjoint to perform 4-D integration. The methodology is validated on synthetic cases by inverting emission fluxes. It is shown that the information provided by the analyzed concentrations is sufficient to reach a mathematically acceptable solution to the optimization, even when little information is available in the measurements. As compared to the use of measurements alone, or of measurements and a background matrix, the use of kriging leads to a more homogeneous distribution of the corrections, both in space and time. Moreover, it is then possible to double the accuracy of the inversion by performing two kriging-optimization cycles. Nevertheless, kriging analysis cannot compensate for a very important lack of information in the measurements.
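
    As a toy illustration of the analysis step (not of the CHIMERE adjoint machinery), the sketch below kriges station residuals, i.e. observation minus first-guess model value, onto a 1-D grid with ordinary kriging under an exponential covariance, and adds them back to the first guess to form an analyzed field. Station locations, values and covariance parameters are all invented.

```python
# Ordinary kriging of station residuals onto a grid (1-D toy example).
import numpy as np

def ordinary_kriging(xs, zs, xg, sill=1.0, rng=20.0):
    """Ordinary kriging with an exponential covariance model."""
    cov = lambda d: sill * np.exp(-d / rng)
    n = len(xs)
    K = np.ones((n + 1, n + 1))           # kriging system, last row/col
    K[-1, -1] = 0.0                       # enforce unbiasedness (Lagrange)
    K[:n, :n] = cov(np.abs(xs[:, None] - xs[None, :]))
    out = np.empty(len(xg))
    for j, x in enumerate(xg):
        k = np.ones(n + 1)
        k[:n] = cov(np.abs(xs - x))
        w = np.linalg.solve(K, k)
        out[j] = w[:n] @ zs               # kriged estimate at grid point
    return out

# synthetic stations, observations, and a stand-in CTM first guess
x_sta = np.array([2.0, 11.0, 18.0, 33.0, 41.0])
obs = np.array([48.0, 55.0, 61.0, 52.0, 45.0])   # e.g. ozone [ppb]
grid = np.linspace(0.0, 50.0, 101)
first_guess = 50.0 + 0.1 * grid
residual = obs - (50.0 + 0.1 * x_sta)
analysis = first_guess + ordinary_kriging(x_sta, residual, grid)
print(np.round(analysis[::20], 1))
```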

  9. A Model-Based Prognostics Methodology For Electrolytic Capacitors Based On Electrical Overstress Accelerated Aging

    Data.gov (United States)

    National Aeronautics and Space Administration — A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical...

  10. A Model-based Prognostics Methodology for Electrolytic Capacitors Based on Electrical Overstress Accelerated Aging

    Data.gov (United States)

    National Aeronautics and Space Administration — A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical...

  11. Model for diffusion and porewater chemistry in compacted bentonite. Theoretical basis and the solution methodology for the transport model

    International Nuclear Information System (INIS)

    Lehikoinen, J.

    1997-01-01

    This report describes the progress of the computer model for ionic transport in bentonite. The research is part of the project Microstructural and chemical parameters of bentonite as determinants of waste isolation efficiency within the Nuclear fission safety program organized by The Commission of the European Communities. The study was started by collecting a comprehensive body of available data on space-charge transport modelling and creating a conceptualization of the problem at hand. The numerical discretization of the governing equations by finite differences was also initiated. This report introduces the theoretical basis for the model, somewhat more elaborated than presented in Progress Report 1/1996, and rectifies a few mistakes appearing in that report. It also gives a brief introduction to the solution methodology of the discretized governing equations. (orig.) (12 refs.)

  12. Incorporation of ice sheet models into an Earth system model: Focus on methodology of coupling

    Science.gov (United States)

    Rybak, Oleg; Volodin, Evgeny; Morozova, Polina; Nevecherja, Artiom

    2018-03-01

    Elaboration of a modern Earth system model (ESM) requires incorporation of ice sheet dynamics. Coupling of an ice sheet model (ICM) to an AOGCM is complicated by essential differences in the spatial and temporal scales of the cryospheric, atmospheric and oceanic components. To overcome this difficulty, we apply two different approaches for the incorporation of ice sheets into an ESM. Coupling of the Antarctic ice sheet model (AISM) to the AOGCM is accomplished via procedures of resampling, interpolation and assigning to the AISM grid points annually averaged values of the air surface temperature and precipitation fields generated by the AOGCM. Surface melting, which takes place mainly on the margins of the Antarctic Peninsula and on ice shelves fringing the continent, is currently ignored. The AISM returns anomalies of surface topography back to the AOGCM. To couple the Greenland ice sheet model (GrISM) to the AOGCM, we use a simple buffer energy- and water-balance model (EWBM-G) to account for orographically driven precipitation and other sub-grid AOGCM-generated quantities. The output of the EWBM-G consists of the surface mass balance and air surface temperature to force the GrISM, and freshwater run-off to force thermohaline circulation in the oceanic block of the AOGCM. Because the coupling procedure for the GrIS is considerably more complex than that for the AIS, the paper mostly focuses on Greenland.
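
    The resampling and interpolation step of the AISM coupling can be pictured with a small regridding sketch: a coarse annual-mean AOGCM temperature field is bilinearly interpolated onto a finer ice-sheet grid. Both grids and the synthetic field below are placeholders, not the actual model grids.

```python
# Regrid a coarse annual-mean field onto a finer ice-sheet grid.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# coarse "AOGCM" grid (degrees) with a synthetic temperature field [K]
lat_c = np.linspace(-90.0, -60.0, 16)
lon_c = np.linspace(0.0, 360.0, 73)
T_c = (230.0
       + 0.5 * (lat_c[:, None] + 90.0)              # warms away from pole
       + 2.0 * np.cos(np.radians(lon_c))[None, :])  # zonal wave

interp = RegularGridInterpolator((lat_c, lon_c), T_c, method="linear")

# finer "ISM" grid points inside the coarse domain
lat_i = np.linspace(-89.0, -61.0, 140)
lon_i = np.linspace(0.0, 359.0, 360)
pts = np.array(np.meshgrid(lat_i, lon_i, indexing="ij")).reshape(2, -1).T
T_ism = interp(pts).reshape(len(lat_i), len(lon_i))
print(T_ism.shape, round(T_ism.min(), 1), round(T_ism.max(), 1))
```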

  13. Proposition of a modeling and an analysis methodology of integrated reverse logistics chain in the direct chain

    Directory of Open Access Journals (Sweden)

    Faycal Mimouni

    2016-04-01

    Full Text Available Purpose: Propose a modeling and analysis methodology, based on the combination of Bayesian networks and Petri networks, for reverse logistics integrated into the direct supply chain. Design/methodology/approach: Network modeling by combining Petri and Bayesian networks. Findings: Modeling with a Bayesian network complemented with a Petri network to break the cycle problem in the Bayesian network. Research limitations/implications: Demands are independent of returns. Practical implications: The model can only be used for nonperishable products. Social implications: legislative aspects: recycling laws; protection of the environment; client satisfaction via after-sale service. Originality/value: a Bayesian network with a cycle, combined with a Petri network.

  14. Development and validation of a new turbocharger simulation methodology for marine two stroke diesel engine modelling and diagnostic applications

    International Nuclear Information System (INIS)

    Sakellaridis, Nikolaos F.; Raptotasios, Spyridon I.; Antonopoulos, Antonis K.; Mavropoulos, Georgios C.; Hountalas, Dimitrios T.

    2015-01-01

    Engine cycle simulation models are increasingly used in diesel engine simulation and diagnostic applications, reducing experimental effort. Turbocharger simulation plays an important role in a model's ability to accurately predict engine performance and emissions. The present work describes the development of a complete engine simulation model for marine diesel engines based on a new methodology for turbocharger modelling utilizing physically based meanline models for the compressor and turbine. Simulation accuracy is evaluated against engine bench measurements. The methodology was developed to overcome the problem of limited availability of experimental maps for the compressor and turbine, often encountered in large marine diesel engine simulation and diagnostic studies. Data from the engine bench are used to calibrate the models, as well as to estimate the turbocharger shaft mechanical efficiency. The closed cycle and gas exchange are modelled using an existing multizone thermodynamic model. The proposed methodology is applied to a 2-stroke marine diesel engine and its evaluation is based on the comparison of predictions against measured engine data. The model's ability to predict engine response to load variation, regarding both turbocharger performance and closed-cycle parameters as well as NOx emission trends, is demonstrated, making it an effective tool for both engine diagnostic and optimization studies. - Highlights: • Marine two-stroke diesel engine simulation model. • Turbine and compressor simulation using physical meanline models. • Methodology to derive T/C component efficiency and T/C shaft mechanical efficiency. • Extensive validation of predictions against experimental data.

  15. Review of Project SAFE: Comments on biosphere conceptual model description and risk assessment methodology

    Energy Technology Data Exchange (ETDEWEB)

    Klos, Richard; Wilmot, Roger [Galson Sciences Ltd (United Kingdom)

    2002-09-01

    The Swedish Nuclear Fuel and Waste Management Company's (SKB's) most recent assessment of the safety of the Forsmark repository for low-level and intermediate-level waste (Project SAFE) is currently undergoing review by the Swedish regulators. As part of its review, the Swedish Radiation Protection Institute (SSI) identified that two components of SAFE require more detailed review: (i) the conceptual model description of the biosphere system, and (ii) SKB's risk assessment methodology. We have reviewed the biosphere system interaction matrix and how this has been used in the identification, justification and description of biosphere models for radiological assessment purposes. The risk assessment methodology has been reviewed considering in particular issues associated with scenario selection, assessment timescale, and the probability and risk associated with the well scenario. There is an extensive range of supporting information on which biosphere modelling in Project SAFE is based. However, the link between this material and the biosphere models themselves is not clearly set out. This leads to some contradictions and mis-matches between description and implementation. One example concerns the representation of the geosphere-biosphere interface. The supporting description of lakes indicates that interaction between groundwaters entering the biosphere through lake bed sediments could lead to accumulations of radionuclides in sediments. These sediments may become agricultural areas at some time in the future. In the numerical modelling of the biosphere carried out in Project SAFE, the direct accumulation of contaminants in bed sediments is not represented. Application of a more rigorous procedure to ensure numerical models are fit for purpose is recommended, paying more attention to issues associated with the geosphere-biosphere interface. A more structured approach to risk assessment would be beneficial, with a better explanation of the difference

  16. Digital System Categorization Methodology to Support Integration of Digital Instrumentation and Control Models into PRAs

    International Nuclear Information System (INIS)

    Arndt, Steven A.

    2011-01-01

    It has been suggested that by categorizing the various digital systems used in safety critical applications in nuclear power plants, it would be possible to determine which systems should be modeled in the analysis of the larger plant-wide PRA, at what level of detail the digital system should be modeled, and using which methods. The research reported in this paper develops a categorization method using system attributes to permit a modeler to more effectively model the systems that will likely have the most critical contributions to the overall plant safety, and to more effectively model system interactions for those digital systems where the interactions are most important to the overall accuracy and completeness of the plant PRA. The proposed methodology will categorize digital systems based on certain attributes of the systems themselves and how they will be used in the specific application. This will help determine which digital systems need to be modeled and at what level of detail, and can be used to guide PRA analysis and regulatory reviews. The three-attribute categorization strategy that was proposed by Arndt is used as the basis for the categorization methodology developed here. The first attribute, digital system complexity, is based on Type II interactions defined by Aldemir and an overall digital system size and complexity index. The size and complexity indices used are previously defined software complexity metrics. Potential sub-attributes of digital system complexity include design complexity, software complexity, hardware complexity, system function complexity and testability. The second attribute, digital system interactions/inter-conductivity, is a combination of Rushby's coupling and Aldemir's Type I interactions. Digital systems that are loosely coupled and/or have very few Type I interactions would not interact dynamically with the overall system and would have a low interactions/inter-conductivity score. Potential sub-attributes of digital system

  17. Digital System Categorization Methodology to Support Integration of Digital Instrumentation and Control Models into PRAs

    Energy Technology Data Exchange (ETDEWEB)

    Arndt, Steven A. [U.S. Nuclear Regulatory Commission, Washington D.C. (United States)

    2011-08-15

    It has been suggested that by categorizing the various digital systems used in safety critical applications in nuclear power plants, it would be possible to determine which systems should be modeled in the analysis of the larger plant-wide PRA, at what level of detail the digital system should be modeled, and using which methods. The research reported in this paper develops a categorization method using system attributes to permit a modeler to more effectively model the systems that will likely have the most critical contributions to the overall plant safety, and to more effectively model system interactions for those digital systems where the interactions are most important to the overall accuracy and completeness of the plant PRA. The proposed methodology will categorize digital systems based on certain attributes of the systems themselves and how they will be used in the specific application. This will help determine which digital systems need to be modeled and at what level of detail, and can be used to guide PRA analysis and regulatory reviews. The three-attribute categorization strategy that was proposed by Arndt is used as the basis for the categorization methodology developed here. The first attribute, digital system complexity, is based on Type II interactions defined by Aldemir and an overall digital system size and complexity index. The size and complexity indices used are previously defined software complexity metrics. Potential sub-attributes of digital system complexity include design complexity, software complexity, hardware complexity, system function complexity and testability. The second attribute, digital system interactions/inter-conductivity, is a combination of Rushby's coupling and Aldemir's Type I interactions. Digital systems that are loosely coupled and/or have very few Type I interactions would not interact dynamically with the overall system and would have a low interactions/inter-conductivity score. Potential sub-attributes of

  18. Review of Project SAFE: Comments on biosphere conceptual model description and risk assessment methodology

    International Nuclear Information System (INIS)

    Klos, Richard; Wilmot, Roger

    2002-09-01

    The Swedish Nuclear Fuel and Waste Management Company's (SKB's) most recent assessment of the safety of the Forsmark repository for low-level and intermediate-level waste (Project SAFE) is currently undergoing review by the Swedish regulators. As part of its review, the Swedish Radiation Protection Institute (SSI) identified that two components of SAFE require more detailed review: (i) the conceptual model description of the biosphere system, and (ii) SKB's risk assessment methodology. We have reviewed the biosphere system interaction matrix and how this has been used in the identification, justification and description of biosphere models for radiological assessment purposes. The risk assessment methodology has been reviewed considering in particular issues associated with scenario selection, assessment timescale, and the probability and risk associated with the well scenario. There is an extensive range of supporting information on which biosphere modelling in Project SAFE is based. However, the link between this material and the biosphere models themselves is not clearly set out. This leads to some contradictions and mis-matches between description and implementation. One example concerns the representation of the geosphere-biosphere interface. The supporting description of lakes indicates that interaction between groundwaters entering the biosphere through lake bed sediments could lead to accumulations of radionuclides in sediments. These sediments may become agricultural areas at some time in the future. In the numerical modelling of the biosphere carried out in Project SAFE, the direct accumulation of contaminants in bed sediments is not represented. Application of a more rigorous procedure to ensure numerical models are fit for purpose is recommended, paying more attention to issues associated with the geosphere-biosphere interface. A more structured approach to risk assessment would be beneficial, with a better explanation of the difference between

  19. ABOUT THE RELEVANCE AND METHODOLOGY ASPECTS OF TEACHING THE MATHEMATICAL MODELING TO PEDAGOGICAL STUDENTS

    Directory of Open Access Journals (Sweden)

    Y. A. Perminov

    2014-01-01

    Full Text Available The paper substantiates the need for profile training in mathematical modeling for pedagogical students, caused by the total penetration of mathematics into different sciences, including the humanities; the fast development of information and communications technologies; and the growing importance of mathematical modeling, combining the informal scientific and formal mathematical languages with the unique opportunities of computer programming. The author singles out the reasons for mastering and using the mathematical apparatus by teachers in every discipline. Indeed, among all the modern mathematical methods and ideas, mathematical modeling retains its priority in all professional spheres. Therefore, the discipline of "Mathematical Modeling" can play an important role in integrating different components of specialist training in various profiles. By mastering the basics of mathematical modeling, students acquire skills of methodological thinking; learn the principles of analysis, synthesis and generalization of ideas and methods in different disciplines and scientific spheres; and achieve general cultural competences. In conclusion, the author recommends incorporating the "Methods of Profile Training in Mathematical Modeling" into the pedagogical magistracy curricula.

  20. Methodological issues in cardiovascular epidemiology: the risk of determining absolute risk through statistical models

    Directory of Open Access Journals (Sweden)

    Demosthenes B Panagiotakos

    2006-09-01

    Full Text Available Demosthenes B Panagiotakos, Vassilis Stavrinos, Office of Biostatistics, Epidemiology, Department of Dietetics, Nutrition, Harokopio University, Athens, Greece. Abstract: During the past years there has been increasing interest in the development of cardiovascular disease functions that predict future events at the individual level. However, this effort has so far not been very successful, since several investigators have reported large differences in the estimation of absolute risk among different populations. For example, it seems that predictive models that have been derived from US or north European populations overestimate the incidence of cardiovascular events in south European and Japanese populations. A potential explanation could be attributed to several factors, such as geographical, cultural, social, behavioral, as well as genetic variations between the investigated populations, in addition to various methodological and statistical issues relating to the estimation of these predictive models. Based on the current literature it can be concluded that, while risk prediction of future cardiovascular events is a useful tool and might be valuable in controlling the burden of the disease in a population, further work is required to improve the accuracy of the present predictive models. Keywords: cardiovascular disease, risk, models

  1. A Radiative Transfer Modeling Methodology in Gas-Liquid Multiphase Flow Simulations

    Directory of Open Access Journals (Sweden)

    Gautham Krishnamoorthy

    2014-01-01

    Full Text Available A methodology for performing radiative transfer calculations in computational fluid dynamic simulations of gas-liquid multiphase flows is presented. By considering an externally irradiated bubble column photoreactor as our model system, the bubble scattering coefficients were determined through add-on functions by employing as inputs the bubble volume fractions, number densities, and the fractional contribution of each bubble size to the bubble volume from four different multiphase modeling options. The scattering coefficient profiles resulting from the models were significantly different from one another and aligned closely with their predicted gas-phase volume fraction distributions. The impacts of the multiphase modeling option, initial bubble diameter, and gas flow rates on the radiation distribution patterns within the reactor were also examined. An increase in air inlet velocities resulted in an increase in the fraction of larger sized bubbles and their contribution to the scattering coefficient. However, the initial bubble sizes were found to have the strongest impact on the radiation field.
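
    The add-on step described here, turning binned bubble data into a scattering coefficient, reduces to a sum over size bins. The sketch below assumes the common form sigma_s = sum_i n_i * Qsca * (pi/4) * d_i^2 with a constant large-bubble scattering efficiency; the bin diameters, fractions and Qsca value are illustrative, not taken from the paper.

```python
# Assemble a cell's bubble scattering coefficient from binned data.
import numpy as np

d = np.array([1e-3, 3e-3, 5e-3])      # bubble diameters per bin [m]
alpha = 0.05                           # local gas volume fraction
frac = np.array([0.2, 0.5, 0.3])       # volume fraction per size bin
Qsca = 2.0                             # scattering efficiency (large-particle limit)

vol = np.pi / 6.0 * d**3               # single-bubble volume [m^3]
n = alpha * frac / vol                 # number density per bin [1/m^3]
sigma_s = np.sum(n * Qsca * np.pi / 4.0 * d**2)
print("scattering coefficient: %.2f 1/m" % sigma_s)
```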

  2. Partial least squares path modeling basic concepts, methodological issues and applications

    CERN Document Server

    Noonan, Richard

    2017-01-01

    This edited book presents the recent developments in partial least squares-path modeling (PLS-PM) and provides a comprehensive overview of the current state of the most advanced research related to PLS-PM. The first section of this book emphasizes the basic concepts and extensions of the PLS-PM method. The second section discusses the methodological issues that are the focus of the recent development of the PLS-PM method. The third part discusses the real world application of the PLS-PM method in various disciplines. The contributions from expert authors in the field of PLS focus on topics such as the factor-based PLS-PM, the perfect match between a model and a mode, quantile composite-based path modeling (QC-PM), ordinal consistent partial least squares (OrdPLSc), non-symmetrical composite-based path modeling (NSCPM), modern view for mediation analysis in PLS-PM, a multi-method approach for identifying and treating unobserved heterogeneity, multigroup analysis (PLS-MGA), the assessment of the common method b...

  3. Risk methodology for geologic disposal of radioactive waste: asymptotic properties of the environmental transport model

    International Nuclear Information System (INIS)

    Helton, J.C.; Brown, J.B.; Iman, R.L.

    1981-02-01

    The Environmental Transport Model is a compartmental model developed to represent the surface movement of radionuclides. The purpose of the present study is to investigate the asymptotic behavior of the model and to acquire insight with respect to such behavior and the variables which influence it. For four variations of a hypothetical river receiving a radionuclide discharge, the following properties are considered: predicted asymptotic values for environmental radionuclide concentrations and time required for environmental radionuclide concentrations to reach 90% of their predicted asymptotic values. Independent variables of two types are used to define each variation of the river: variables which define physical properties of the river system (e.g., soil depth, river discharge and sediment resuspension) and variables which summarize radionuclide properties (i.e., distribution coefficients). Sensitivity analysis techniques based on stepwise regression are used to determine the dominant variables influencing the behavior of the model. This work constitutes part of a project at Sandia National Laboratories funded by the Nuclear Regulatory Commission to develop a methodology to assess the risk associated with geologic disposal of radioactive waste
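
    The two quantities the study examines, asymptotic concentrations and the time to reach 90% of them, can be read off any linear compartment model. A minimal two-compartment sketch with invented rate constants (not the study's river systems) follows.

```python
# Asymptote and 90%-time of a linear compartment model dx/dt = A x + b.
import numpy as np
from scipy.integrate import solve_ivp

A = np.array([[-0.50, 0.02],    # water column: flushing + exchange [1/day]
              [ 0.10, -0.03]])  # sediment: uptake, resuspension/decay
b = np.array([1.0, 0.0])        # constant discharge into the water column

x_inf = -np.linalg.solve(A, b)  # predicted asymptotic concentrations
sol = solve_ivp(lambda t, x: A @ x + b, (0.0, 400.0), [0.0, 0.0],
                dense_output=True)
t = np.linspace(0.0, 400.0, 4001)
x = sol.sol(t)
for i, name in enumerate(("water", "sediment")):
    t90 = t[np.argmax(x[i] >= 0.9 * x_inf[i])]   # first time above 90%
    print("%s: asymptote %.2f, reaches 90%% at t = %.1f days"
          % (name, x_inf[i], t90))
```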

  4. Mathematical model of marine diesel engine simulator for a new methodology of self propulsion tests

    Energy Technology Data Exchange (ETDEWEB)

    Izzuddin, Nur; Sunarsih; Priyanto, Agoes [Faculty of Mechanical Engineering, Universiti Teknologi Malaysia, 81310 Skudai, Johor (Malaysia)

    2015-05-15

    For a vessel operating in the open seas, a marine diesel engine simulator whose engine rotation is controlled and transmitted through the propeller shaft provides a new methodology for self-propulsion tests that tracks fuel saving in real time. Against this background, this paper presents a real-time marine diesel engine simulator system that tracks the real performance of a ship through a computer-simulated model. A mathematical model of the marine diesel engine and the propeller is used in the simulation to estimate the fuel rate, engine rotating speed, and the thrust and torque of the propeller, and thus achieve the target vessel speed. The input and output form a real-time control system of fuel-saving rate and propeller rotating speed representing the marine diesel engine characteristics. Self-propulsion tests in calm water were conducted using a vessel model to validate the marine diesel engine simulator. The simulator was then used to evaluate the fuel saving by employing a new mathematical model of turbochargers for the marine diesel engine simulator. The control system developed will help users analyze different vessel-speed conditions to obtain better characteristics and hence optimize the fuel-saving rate.
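
    A minimal sketch of the coupled equations such a simulator integrates in real time: shaft dynamics I*d(omega)/dt = Q_eng - Q_prop and surge dynamics m*dV/dt = (1 - t)*T - R(V), with quadratic propeller laws T = Kt*rho*n^2*D^4 and Q = Kq*rho*n^2*D^5. Constant Kt and Kq (real coefficients vary with advance ratio) and all parameter values are assumptions for illustration.

```python
# Coupled engine-propeller-hull dynamics for a small model vessel.
import numpy as np
from scipy.integrate import solve_ivp

rho, D = 1025.0, 0.25            # water density [kg/m^3], prop diameter [m]
Kt, Kq = 0.30, 0.045             # propeller coefficients (assumed constant)
I, m = 0.08, 60.0                # shaft inertia [kg m^2], model mass [kg]
tded, a = 0.15, 9.0              # thrust deduction, resistance R = a*V^2
Q_eng = 1.2                      # constant engine torque [N m]

def rhs(t, y):
    n, V = y                     # shaft speed [rev/s], ship speed [m/s]
    T = Kt * rho * n**2 * D**4   # propeller thrust
    Q = Kq * rho * n**2 * D**5   # propeller torque
    dn = (Q_eng - Q) / (2.0 * np.pi * I)     # omega = 2*pi*n
    dV = ((1.0 - tded) * T - a * V**2) / m   # surge equation
    return [dn, dV]

sol = solve_ivp(rhs, (0.0, 60.0), [5.0, 0.0], max_step=0.05)
print("steady shaft speed %.2f rev/s, ship speed %.2f m/s"
      % (sol.y[0, -1], sol.y[1, -1]))
```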

  5. Development of a system dynamics model based on Six Sigma methodology

    Directory of Open Access Journals (Sweden)

    José Jovani Cardiel Ortega

    2017-01-01

    Full Text Available A dynamic model to analyze the complexity associated with manufacturing systems and to improve the performance of the process through the Six Sigma philosophy is proposed. The research focuses on the implementation of the system dynamics tool to comply with each of the phases of the DMAIC methodology. In the first phase, define, the problem is articulated: data are collected, the variables are selected, and they are represented in a mental map that helps build the dynamic hypothesis. In the second phase, measure, the model is formulated, equations are developed, and a Forrester diagram is drawn up to carry out the simulation. In the third phase, analyze, the simulation results are studied. In the fourth phase, improve, the model is validated through a sensitivity analysis. Finally, in the control phase, operation policies are proposed. This paper presents the development of a dynamic model of a knitted textile production system; the implementation was carried out in a textile company in southern Guanajuato. The results show an improvement in process performance by increasing the sigma level, validating the proposed approach.
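
    A stock-and-flow fragment of the kind a Forrester diagram encodes can be integrated with plain Euler stepping, as in the sketch below: a work-in-progress stock drained by a first-order completion delay, with a defect fraction splitting the outflow. Lowering defect_frac emulates the "improve" phase; all rates are invented.

```python
# Euler integration of a tiny stock-and-flow production model.
import numpy as np

dt, T = 0.1, 200.0               # step and horizon [days]
start_rate = 120.0               # knitting starts [units/day]
cycle = 2.0                      # processing delay [days]
defect_frac = 0.06               # defect fraction before improvement

wip, fg, scrap = 0.0, 0.0, 0.0   # stocks: WIP, finished goods, scrap
for _ in np.arange(0.0, T, dt):
    completion = wip / cycle                 # first-order delay outflow
    good = (1.0 - defect_frac) * completion
    wip += dt * (start_rate - completion)
    fg += dt * good
    scrap += dt * defect_frac * completion
print("WIP %.0f units, finished %.0f, scrapped %.0f" % (wip, fg, scrap))
```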

  6. Implementation of a documentation model comprising nursing terminologies--theoretical and methodological issues.

    Science.gov (United States)

    von Krogh, Gunn; Nåden, Dagfinn

    2008-04-01

    To describe and discuss theoretical and methodological issues in the implementation of a nursing services documentation model comprising the NANDA nursing diagnoses, Nursing Interventions Classification and Nursing Outcomes Classification terminologies. The model was developed for the electronic patient record and was implemented in a psychiatric hospital at the organizational level and on five test wards in 2001-2005. The theory of Rogers guided the process of innovation, whereas the implementation procedure of McCloskey and Bulechek combined with adult learning principles guided the test site implementation. The test wards adopted the model to differing degrees. Two wards succeeded fully, including a ward with a high percentage of staff with an interdisciplinary background. Better planning regarding the impact of the organization's innovative aptitude, the innovation strategies and the use of differentiated methods geared to the clinicians' individual premises for learning nursing terminologies might have enhanced adoption of the model. The findings help in understanding the nature of barriers and the importance of careful planning in the implementation of electronic patient record elements in nursing care services focusing on nursing terminologies, and indicate how a theory and a specific procedure can be used to guide the process of implementation throughout the different levels of management.

  7. MODELS AND METHODS OF SAFETY-ORIENTED PROJECT MANAGEMENT OF DEVELOPMENT OF COMPLEX SYSTEMS: METHODOLOGICAL APPROACH

    Directory of Open Access Journals (Sweden)

    Олег Богданович ЗАЧКО

    2016-03-01

    Full Text Available The methods and models of safety-oriented project management of the development of complex systems are proposed, resulting from the convergence of existing approaches in project management, in contrast to the mechanism of value-oriented management. A cognitive model of safety-oriented project management of the development of complex systems is developed, which provides a synergistic effect, namely moving the system from the original (pre-project) condition into one that is optimal from the viewpoint of life safety (the post-project state). An approach to assessing project complexity is proposed, which consists in taking into account the seasonal component of the time characteristic of the life cycles of complex organizational and technical systems with occupancy. This made it possible to take the seasonal component into account in simulation models of the life cycle of product operation in a complex organizational and technical system, modeling the critical points of operation of systems with occupancy, which forms a new methodology for safety-oriented management of projects, programs and portfolios of projects with the formalization of the elements of complexity.

  8. Modeling and process optimization of electrospinning of chitosan-collagen nanofiber by response surface methodology

    Science.gov (United States)

    Amiri, Nafise; Moradi, Ali; Abolghasem Sajjadi Tabasi, Sayyed; Movaffagh, Jebrail

    2018-04-01

    Chitosan-collagen composite nanofiber is of great interest to researchers in biomedical fields. Since electrospinning is the most popular method for nanofiber production, having a comprehensive knowledge of the electrospinning process is beneficial. Modeling techniques are precious tools for managing variables in the electrospinning process, prior to the more time-consuming and expensive experimental techniques. In this study, a central composite design of response surface methodology (RSM) was employed to develop a statistical model as well as to define the optimum condition for fabrication of chitosan-collagen nanofiber with minimum diameter. The individual and interaction effects of applied voltage (10–25 kV), flow rate (0.5–1.5 mL h−1), and needle-to-collector distance (15–25 cm) on the fiber diameter were investigated. ATR-FTIR and a cell study were done to evaluate the optimized nanofibers. According to the RSM, a two-factor interaction (2FI) model was the most suitable model. The high regression coefficient value (R² ≥ 0.9666) of the fitted regression model and the insignificant lack of fit (P = 0.0715) indicated that the model was highly adequate in predicting the chitosan-collagen nanofiber diameter. The optimization process showed that a chitosan-collagen nanofiber diameter of 156.05 nm could be obtained at 9 kV, 0.2 mL h−1, and 25 cm, which was confirmed by experiment (155.92 ± 18.95 nm). The ATR-FTIR and cell study confirmed the structure and biocompatibility of the optimized membrane. The presented model could assist researchers in fabricating chitosan-collagen electrospun scaffolds with a predictable fiber diameter, and the optimized chitosan-collagen nanofibrous mat could be a potential candidate for wound healing and tissue engineering.
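
    The 2FI model class named here is just a linear model in the factors and their pairwise products, so it can be fitted by ordinary least squares. The sketch below does this on synthetic coded data standing in for the real design runs; the coefficients and noise level are invented.

```python
# Fit a two-factor-interaction (2FI) response surface by least squares.
import numpy as np

rng = np.random.default_rng(1)
# coded settings of a small factorial design (-1/+1) plus centre points
X = np.array([[i, j, k] for i in (-1, 1) for j in (-1, 1) for k in (-1, 1)]
             + [[0, 0, 0]] * 3, float)
y = (200.0 + 25.0 * X[:, 0] + 40.0 * X[:, 1] - 15.0 * X[:, 2]
     + 10.0 * X[:, 0] * X[:, 1] + rng.normal(0.0, 5.0, len(X)))  # diameter [nm]

def design_matrix(X):
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1 * x2, x1 * x3, x2 * x3])

b, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
pred = design_matrix(X) @ b
r2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print("coefficients:", np.round(b, 1), " R^2 = %.4f" % r2)
```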

  9. Cognitive models of executive functions development: methodological limitations and theoretical challenges

    Directory of Open Access Journals (Sweden)

    Florencia Stelzer

    2014-01-01

    Full Text Available Executive functions (EF) have been defined as a series of higher-order cognitive processes which allow the control of thought, behavior and affect in the pursuit of a goal. Such processes follow a lengthy postnatal development and mature completely only by the end of adolescence. In this article we review some of the main models of EF development during childhood. The aim of this work is to describe the state of the art on the topic, identifying the main theoretical difficulties and methodological limitations associated with the different proposed paradigms. Finally, some suggestions are given for coping with such difficulties, emphasizing that the development of an ontology of EF could be a viable alternative to counter them. We believe that future research should guide its efforts toward the development of that ontology.

  10. A Diagnostic Model for Dementia in Clinical Practice-Case Methodology Assisting Dementia Diagnosis.

    Science.gov (United States)

    Londos, Elisabet

    2015-04-02

    Dementia diagnosis is important for many different reasons. Firstly, to separate dementia, or major neurocognitive disorder, from mild cognitive impairment (MCI), or mild neurocognitive disorder. Secondly, to define the specific underlying brain disorder to aid treatment, prognosis and decisions regarding care needs and assistance. The diagnostic method for dementias is a puzzle of different data pieces to be fitted together in the best possible way to reach a clinical diagnosis. Using a modified case methodology concept, risk factors affecting cognitive reserve and symptoms constituting the basis of the brain damage hypothesis can be visualized, balanced and reflected against test results as well as structural and biochemical markers. The model's origin is the case method initially described at Harvard Business School, here modified to serve dementia diagnostics.

  11. An Innovative Structural Mode Selection Methodology: Application for the X-33 Launch Vehicle Finite Element Model

    Science.gov (United States)

    Hidalgo, Homero, Jr.

    2000-01-01

    An innovative methodology for determining structural target mode selection and mode selection based on a specific criterion is presented. An effective approach to singling out modes which interact with specific locations on a structure has been developed for the X-33 Launch Vehicle Finite Element Model (FEM). We present a Root-Sum-Square (RSS) displacement method that computes the resultant modal displacement for each mode at selected degrees of freedom (DOF) and sorts the results to locate the modes with the highest values. This method was used to determine the modes which most influenced specific locations/points on the X-33 flight vehicle, such as avionics control components, aero-surface control actuators, propellant valve and engine points, for use in flight control stability analysis and flight POGO stability analysis. Additionally, the modal RSS method allows primary or global target vehicle modes to be identified in an accurate and efficient manner.
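
    The RSS ranking itself is a one-liner over the mode shape matrix, as in this sketch (random placeholder mode shapes, hypothetical target DOF):

```python
# Rank modes by Root-Sum-Square displacement over selected DOF.
import numpy as np

rng = np.random.default_rng(0)
n_dof, n_modes = 300, 40
phi = rng.normal(size=(n_dof, n_modes))      # mode shapes (placeholder)
target_dof = [12, 13, 14, 200, 201, 202]     # e.g. actuator attachment DOF

rss = np.sqrt(np.sum(phi[target_dof, :] ** 2, axis=0))   # one value per mode
ranking = np.argsort(rss)[::-1]                           # highest first
print("top 5 target modes:", ranking[:5])
print("their RSS values:  ", np.round(rss[ranking[:5]], 2))
```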

  12. Generalized Characterization Methodology for Performance Modelling of Lithium-Ion Batteries

    DEFF Research Database (Denmark)

    Stroe, Daniel Loan; Swierczynski, Maciej Jozef; Stroe, Ana-Irina

    2016-01-01

    Lithium-ion (Li-ion) batteries are complex energy storage devices whose performance behavior is highly dependent on the operating conditions (i.e., temperature, load current, and state-of-charge (SOC)). Thus, in order to evaluate their techno-economic viability for a certain application, detailed information about Li-ion battery performance behavior becomes necessary. This paper proposes a comprehensive seven-step methodology for laboratory characterization of Li-ion batteries, in which the battery's performance parameters (i.e., capacity, open-circuit voltage (OCV), and impedance) are determined and their dependence on the operating conditions is obtained. Furthermore, this paper proposes a novel hybrid procedure for parameterizing the batteries' equivalent electrical circuit (EEC), which is used to emulate the batteries' dynamic behavior. Based on this novel parameterization procedure, the performance model...
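
    A minimal sketch of the kind of equivalent-electrical-circuit performance model such a characterization feeds: a Thevenin model with one RC pair, where OCV(SOC) and the R and C values would come from the capacity, OCV and impedance measurements. The parameter values and the linear OCV curve below are illustrative assumptions.

```python
# One-RC Thevenin equivalent-circuit model of a Li-ion cell.
import numpy as np

Q = 2.5 * 3600.0                   # capacity [As]
R0, R1, C1 = 0.015, 0.010, 2000.0  # series resistance and RC pair
ocv = lambda soc: 3.0 + 1.2 * soc  # crude linear OCV(SOC) stand-in

dt, soc, v1 = 1.0, 0.9, 0.0        # time step [s], initial SOC, RC voltage
v = None
for _ in range(1800):              # 30 min of 1C discharge
    i = 2.5                        # load current [A], discharge positive
    soc -= i * dt / Q                        # coulomb counting
    v1 += dt * (i / C1 - v1 / (R1 * C1))     # RC polarization dynamics
    v = ocv(soc) - R0 * i - v1               # terminal voltage
print("terminal voltage after 30 min: %.3f V, SOC %.3f" % (v, soc))
```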

  13. The use of mental models in chemical risk protection: developing a generic workplace methodology.

    Science.gov (United States)

    Cox, Patrick; Niewöhmer, Jörg; Pidgeon, Nick; Gerrard, Simon; Fischhoff, Baruch; Riley, Donna

    2003-04-01

    We adopted a comparative approach to evaluate and extend a generic methodology to analyze the different sets of beliefs held about chemical hazards in the workplace. Our study mapped existing knowledge structures about the risks associated with the use of perchloroethylene and rosin-based solder flux in differing workplaces. "Influence diagrams" were used to represent beliefs held by chemical experts; "user models" were developed from data elicited from open-ended interviews with the workplace users of the chemicals. The juxtaposition of expert and user understandings of chemical risks enabled us to identify knowledge gaps and misunderstandings and to reinforce appropriate sets of safety beliefs and behavior relevant to chemical risk communications. By designing safety information to be more relevant to the workplace context of users, we believe that employers and employees may gain improved knowledge about chemical hazards in the workplace, such that better chemical risk management, self-protection, and informed decision making develop over time.

  14. Failure detection by adaptive lattice modelling using Kalman filtering methodology : application to NPP

    International Nuclear Information System (INIS)

    Ciftcioglu, O.

    1991-03-01

    Detection of failure in the operational status of an NPP is described. The method uses a lattice form of signal modelling established by means of the Kalman filtering methodology. In this approach each lattice parameter is considered to be a state, and the minimum-variance estimate of the states is performed adaptively by optimal parameter estimation, with fast convergence and favourable statistical properties. In particular, the state covariance is also the covariance of the error committed by that estimate of the state value, and the Mahalanobis distance formed for pattern comparison follows a χ² distribution for normally distributed signals. The failure detection is performed after a decision-making process by probabilistic assessments based on the statistical information provided. The failure detection system is implemented in the multi-channel signal environment of the Borssele NPP and its favourable features are demonstrated. (author). 29 refs.; 7 figs
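
    The decision step can be sketched as follows: a squared Mahalanobis distance between the current lattice-parameter vector and its no-failure reference distribution is compared against a chi-squared quantile, the distribution the abstract notes it follows for normally distributed signals. Reference statistics and the test vector are synthetic.

```python
# Mahalanobis-distance failure test against a chi-squared threshold.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(3)
ref = rng.normal(size=(500, 4))            # reference lattice parameters
mu = ref.mean(axis=0)
S_inv = np.linalg.inv(np.cov(ref, rowvar=False))

def d2(x):
    """Squared Mahalanobis distance to the reference distribution."""
    return (x - mu) @ S_inv @ (x - mu)

threshold = chi2.ppf(0.999, df=4)          # false-alarm rate 0.1%
x_now = np.array([0.1, -0.2, 5.0, 0.4])    # suspicious test vector
print("d^2 = %.1f, threshold = %.1f -> %s"
      % (d2(x_now), threshold,
         "ALARM" if d2(x_now) > threshold else "normal"))
```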

  15. SCIENTIFIC METHODOLOGICAL APPROACHES TO CREATION OF COMPLEX CONTROL SYSTEM MODEL FOR THE STREAMS OF BUILDING WASTE

    Directory of Open Access Journals (Sweden)

    Tskhovrebov Eduard Stanislavovich

    2015-09-01

    Full Text Available In 2011, a Strategy for the Development of Production of Construction Materials and Industrial Housing Construction for the period up to 2020 was approved in Russia as one of the strategic documents in the construction sphere. In the process of developing this strategy, the needs of the construction complex in all spheres of the economy, including the transport system, were taken into account. The strategy also underlined that the construction industry provides a strong basis for the use and application, in secondary economic turnover, of hazardous waste from different production branches. This makes it possible to produce construction products from recycled materials and at the same time to solve the problem of environmental protection. The article considers and analyzes scientific methodological approaches to the creation of a model of a complex control system for building waste streams, within the framework of organizing a uniform, ecologically safe and economically effective complex waste treatment system in the country's regions.

  16. Pathways for scale and discipline reconciliation: current socio-ecological modelling methodologies to explore and reconstitute human prehistoric dynamics

    OpenAIRE

    Saqalli , Mehdi; Baum , Tilman

    2016-01-01

    International audience; This communication elaborates a plea for the necessity of a specific modelling methodology which does not sacrifice either of two modelling principles: explanation at the micro level and correlation at the macro level. Three goals are assigned to modelling strategies: to describe, to understand and to predict. One tendency in historical and spatial modelling is to develop models at a micro level in order to describe and, by that way, understand the connection between local ecological contexts, acquired through loc...

  17. Uranium mining in Australia

    International Nuclear Information System (INIS)

    Anon.

    1984-01-01

    The mining of uranium in Australia is criticised in relation to its environmental impact, economics and effects on mine workers and Aborigines. A brief report is given on each of the operating and proposed uranium mines in Australia

  18. Testing the methodology for site descriptive modelling. Application for the Laxemar area

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Johan [JA Streamflow AB, Aelvsjoe (Sweden); Berglund, Johan [SwedPower AB, Stockholm (Sweden); Follin, Sven [SF Geologic AB, Stockholm (Sweden); Hakami, Eva [Itasca Geomekanik AB, Stockholm (Sweden); Halvarson, Jan [Swedish Nuclear Fuel and Waste Management Co, Stockholm (Sweden); Hermanson, Jan [Golder Associates AB, Stockholm (Sweden); Laaksoharju, Marcus [Geopoint (Sweden); Rhen, Ingvar [Sweco VBB/VIAK, Stockholm (Sweden); Wahlgren, C.H. [Sveriges Geologiska Undersoekning, Uppsala (Sweden)

    2002-08-01

    A special project has been conducted where the currently available data from the Laxemar area, which is part of the Simpevarp site, have been evaluated and interpreted into a Site Descriptive Model covering: geology, hydrogeology, hydrogeochemistry and rock mechanics. Description of the surface ecosystem has been omitted, since it was re-characterised in another, parallel, project. Furthermore, there has been no evaluation of transport properties. The project is primarily a methodology test. The lessons learnt will be implemented in the Site Descriptive Modelling during the coming site investigation. The intent of the project has been to explore whether the available methodology for Site Descriptive Modelling based on surface and borehole data is adequate, and to identify potential needs for development and improvement in the methodology. The project has developed, with limitations in scope, a Site Descriptive Model on the local scale, corresponding to the situation after completion of the Initial Site Investigations for the Laxemar area (i.e. 'version 1.2' using the vocabulary of the general execution program for the site investigations). The Site Descriptive Model should be reasonable, but should not be regarded as a 'real' model. There are limitations both in the input data and in the scope of the analysis. The measured (primary) data constitute a wide range of different measurement results, including data from two deep core-drilled boreholes. These data both need to be checked for consistency and to be interpreted into a format more amenable to three-dimensional modelling. Examples of such evaluations are estimation of surface geology, lineament interpretation, geological single-hole interpretation, hydrogeological single-hole interpretation and assessment of hydrogeochemical data. Furthermore, while cross-discipline interpretation is encouraged there is also a need for transparency. This means that the evaluations first are made within each discipline

  19. Research methodology workshops evaluation using the Kirkpatrick's model: translating theory into practice.

    Science.gov (United States)

    Abdulghani, Hamza Mohammad; Shaik, Shaffi Ahamed; Khamis, Nehal; Al-Drees, Abdulmajeed Abdulrahman; Irshad, Mohammad; Khalil, Mahmoud Salah; Alhaqwi, Ali Ibrahim; Isnani, Arthur

    2014-04-01

    Qualitative and quantitative evaluation of academic programs can enhance the development, effectiveness, and dissemination of comparative quality reports as well as quality improvement efforts. The aim was to evaluate five research methodology workshops by assessing participants' satisfaction, knowledge and skills gain, and impact on practices using Kirkpatrick's evaluation model. The four-level Kirkpatrick model was applied for the evaluation. Training feedback questionnaires, pre- and post-tests, learner development plan reports and behavioral surveys were used to evaluate the effectiveness of the workshop programs. Of the 116 participants, 28 (24.1%) liked the programs with appreciation, 62 (53.4%) liked them with suggestions and 26 (22.4%) disliked them. Pre- and post-test MCQ mean scores showed a significant improvement in relevant basic knowledge and cognitive skills of 17.67% (p ≤ 0.005). Pre- and post-test scores on workshop sub-topics also improved for manuscript writing (p ≤ 0.031) and proposal writing (p ≤ 0.834). As for the impact, 56.9% of participants started research, and 6.9% published their studies. The results of participants' performance revealed overall positive feedback, and 79% of participants reported transfer of training skills to their workplace. The achievement of course outcomes and the suggestions given for improvement offer encouraging and very useful insight into the program. Encouraging a "research culture" and work-based learning are probably the most powerful determinants of research promotion. These findings therefore encourage the faculty development unit to continue its training and development in research methodology.

  20. Site-conditions map for Portugal based on VS measurements: methodology and final model

    Science.gov (United States)

    Vilanova, Susana; Narciso, João; Carvalho, João; Lopes, Isabel; Quinta Ferreira, Mario; Moura, Rui; Borges, José; Nemser, Eliza; Pinto, Carlos

    2017-04-01

    In this paper we present a statistically significant site-condition model for Portugal based on shear-wave velocity (VS) data and surface geology. We also evaluate the performance of commonly used Vs30 proxies based on exogenous data and analyze the implications of using those proxies for calculating site amplification in seismic hazard assessment. The dataset contains 161 Vs profiles acquired in Portugal in the context of research projects, technical reports, academic theses and academic papers. The methodologies involved in characterizing the Vs structure at the sites in the database include seismic refraction, multichannel analysis of surface waves and refraction microtremor. Invasive measurements were performed in selected locations in order to compare the Vs profiles obtained from both invasive and non-invasive techniques. In general there was good agreement in the subsurface Vs structure obtained from the different methodologies. The database flat-file includes information on Vs30, surface geology at 1:50,000 and 1:500,000 scales, and elevation and topographic slope based on the SRTM30 topographic dataset. The procedure used to develop the site-conditions map is based on a three-step process that includes defining a preliminary set of geological units based on the literature, performing statistical tests to assess whether or not the differences in the distributions of Vs30 are statistically significant, and merging the geological units accordingly. The dataset was, to some extent, affected by clustering and/or preferential sampling and therefore a declustering algorithm was applied. The final model includes three geological units: 1) igneous, metamorphic and old (Paleogene and Mesozoic) sedimentary rocks; 2) Neogene and Pleistocene formations; and 3) Holocene formations. The evaluation of proxies indicates that although geological analogues and topographic slope are in general unbiased, the latter shows significant bias for particular geological units and
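
    The Vs30 value tabulated in such a flat-file is the time-averaged velocity of the top 30 m, Vs30 = 30 / sum_i(h_i / v_i). A small sketch with an invented layered profile:

```python
# Compute Vs30 from a layered shear-wave velocity profile.
import numpy as np

h = np.array([4.0, 8.0, 10.0, 20.0])          # layer thicknesses [m]
vs = np.array([180.0, 240.0, 400.0, 760.0])   # layer velocities [m/s]

depth_used, tt = 0.0, 0.0                     # travel time down to 30 m
for hi, vi in zip(h, vs):
    take = min(hi, 30.0 - depth_used)         # clip the layer at 30 m depth
    tt += take / vi
    depth_used += take
    if depth_used >= 30.0:
        break
vs30 = 30.0 / tt
print("Vs30 = %.0f m/s" % vs30)
```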

  1. A Methodological Review of US Budget-Impact Models for New Drugs.

    Science.gov (United States)

    Mauskopf, Josephine; Earnshaw, Stephanie

    2016-11-01

    A budget-impact analysis is required by many jurisdictions when adding a new drug to the formulary. However, previous reviews have indicated that adherence to methodological guidelines is variable. In this methodological review, we assess the extent to which US budget-impact analyses for new drugs use recommended practices. We describe recommended practice for seven key elements in the design of a budget-impact analysis. Targeted literature searches for US studies reporting estimates of the budget impact of a new drug were performed, and we prepared a summary of how each study addressed the seven key elements. The primary finding from this review is that recommended practice is not followed in many budget-impact analyses. For example, we found that growth in the treated population size and/or changes in disease-related costs expected during the model time horizon for more effective treatments were not included in several analyses for chronic conditions. In addition, not all drug-related costs were captured in the majority of the models. Finally, for most studies, one-way sensitivity and scenario analyses were very limited, and the ranges used in one-way sensitivity analyses were frequently arbitrary percentages rather than being data driven. The conclusions from our review are that changes in population size, disease severity mix, and/or disease-related costs should be properly accounted for to avoid over- or underestimating the budget impact. Since each budget holder might have different perspectives and different values for many of the input parameters, it is also critical for published budget-impact analyses to include extensive sensitivity and scenario analyses based on realistic input values.
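
    The core arithmetic of a budget-impact model that does respect those recommendations (treated-population growth, market-share uptake, and drug plus disease-related costs) fits in a few lines. All inputs below are invented:

```python
# Three-year budget impact of a new drug vs. a reference scenario.
pop = 10000.0                      # treated patients in year 1
growth = 0.05                      # annual growth in treated population
share_new = [0.10, 0.20, 0.30]     # uptake of the new drug by year
cost_old, cost_new = 8000.0, 12000.0       # annual drug cost per patient
disease_old, disease_new = 5000.0, 4200.0  # disease-related cost per patient

for yr, s in enumerate(share_new):
    n = pop * (1.0 + growth) ** yr
    ref = n * (cost_old + disease_old)          # everyone on the old drug
    mix = n * ((1 - s) * (cost_old + disease_old)
               + s * (cost_new + disease_new))  # market-share mix
    print("year %d: budget impact = $%.0f" % (yr + 1, mix - ref))
```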

  2. Application of Binomial Model and Market Asset Declaimer Methodology for Valuation of Abandon and Expand Options. The Case Study

    Directory of Open Access Journals (Sweden)

    Paweł Mielcarz

    2007-06-01

    Full Text Available The article presents a case study of the valuation of real options included in an investment project. The main goal of the article is to present the calculation and methodological issues of applying the methodology for real option valuation. To this end, the binomial model and the Market Asset Declaimer methodology are used. The project presented in the article concerns the introduction of a radio station to a new market. It includes two valuable real options: the option to abandon the project and the option to expand.
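
    A minimal sketch of the valuation approach described: a Cox-Ross-Rubinstein binomial tree for the project value with an embedded American-style option to abandon for a salvage value, rolled back under risk-neutral probabilities (the market-asset-disclaimer idea of treating the project's own PV as the twin security). All numbers are invented.

```python
# Binomial-tree valuation of a project with an abandonment option.
import numpy as np

V0, salvage = 100.0, 85.0           # PV of project cash flows, salvage value
sigma, r, T, n = 0.35, 0.05, 3.0, 36
dt = T / n
u = np.exp(sigma * np.sqrt(dt))
d = 1.0 / u
p = (np.exp(r * dt) - d) / (u - d)  # risk-neutral up probability
disc = np.exp(-r * dt)

# terminal values, including the right to abandon at maturity
V = V0 * u ** np.arange(n, -1, -1) * d ** np.arange(0, n + 1)
W = np.maximum(V, salvage)
for _ in range(n):                  # roll back, abandoning when optimal
    W = np.maximum(disc * (p * W[:-1] + (1.0 - p) * W[1:]), salvage)

print("project value with abandonment option: %.2f" % W[0])
print("value of the option itself:            %.2f" % (W[0] - V0))
```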

  3. A combined reaction class approach with integrated molecular orbital+molecular orbital (IMOMO) methodology: A practical tool for kinetic modeling

    International Nuclear Information System (INIS)

    Truong, Thanh N.; Maity, Dilip K.; Truong, Thanh-Thai T.

    2000-01-01

    We present a new practical computational methodology for predicting thermal rate constants of reactions involving large molecules or a large number of elementary reactions in the same class. This methodology combines the integrated molecular orbital+molecular orbital (IMOMO) approach with our recently proposed reaction class models for tunneling. With the new methodology, we show that it is possible to significantly reduce the computational cost by several orders of magnitude while compromising the accuracy in the predicted rate constants by less than 40% over a wide range of temperatures. Another important result is that the computational cost increases only slightly as the system size increases. (c) 2000 American Institute of Physics

  4. Taxes and Subsidies for Improving Diet and Population Health in Australia: A Cost-Effectiveness Modelling Study.

    Directory of Open Access Journals (Sweden)

    Linda J Cobiac

    2017-02-01

    saturated fat and salt. The study suggests that taxes and subsidies on foods and beverages can potentially be combined to achieve substantial improvements in population health and cost-savings to the health sector. However, the magnitude of health benefits is sensitive to measures of price elasticity, and further work is needed to incorporate potential benefits or harms associated with changes in other foods and nutrients that are not currently modelled, such as red and processed meats and fibre.With potentially large health benefits for the Australian population and large benefits in reducing health sector spending on the treatment of non-communicable diseases, the formulation of a tax and subsidy package should be given a more prominent role in Australia's public health nutrition strategy.
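
    The price-elasticity step such models rest on maps a tax-induced price rise to a consumption change. A toy sketch for a single product with an assumed own-price elasticity (cross-price effects, which the abstract flags as important, are ignored):

```python
# Own-price elasticity: tax-induced price rise -> consumption change.
price_rise = 0.20            # 20% price increase from the tax
elasticity = -0.89           # assumed own-price elasticity of demand
baseline_ml_day = 300.0      # baseline consumption per person [mL/day]

change = elasticity * price_rise            # fractional change in demand
new_intake = baseline_ml_day * (1.0 + change)
print("consumption falls %.1f%% to %.0f mL/day" % (-100 * change, new_intake))
```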

  5. A novel methodology to model the cooling processes of packed horticultural produce using 3D shape models

    Science.gov (United States)

    Gruyters, Willem; Verboven, Pieter; Rogge, Seppe; Vanmaercke, Simon; Ramon, Herman; Nicolai, Bart

    2017-10-01

    Freshly harvested horticultural produce requires proper temperature management to maintain its high economic value. To this end, low-temperature storage is of crucial importance for maintaining high product quality. Optimizing both the package design of packed produce and the different steps in the postharvest cold chain can be achieved by numerical modelling of the relevant transport phenomena. This work presents a novel methodology to accurately model both the random filling of produce in a package and the subsequent cooling process. First, a cultivar-specific database of more than 100 realistic CAD models of apple and pear fruit is built with a validated geometrical 3D shape model generator. To provide an accurate representation of a realistic picking season, the model generator also takes into account the biological variability of the produce shape. Next, a discrete element model (DEM) randomly chooses surface-meshed bodies from the database to simulate the gravitational filling process of produce in a box or bin, using actual mechanical properties of the fruit. A computational fluid dynamics (CFD) model is then developed with the final stacking arrangement of the produce to study the cooling efficiency of packages under several conditions and configurations. Here, a typical precooling operation is simulated to demonstrate the large differences between using actual 3D shapes of the fruit and an equivalent-spheres approach that simplifies the problem drastically. From this study, it is concluded that using a simplified representation of the actual fruit shape may lead to a severe overestimation of the cooling behaviour.
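
    The "equivalent spheres" simplification benchmarked here has a classical closed form: the one-term series solution for transient conduction in a convectively cooled sphere, with the first eigenvalue from the Biot relation 1 - zeta*cot(zeta) = Bi. The fruit and air-side properties below are illustrative assumptions.

```python
# One-term transient cooling of an "equivalent sphere" fruit.
import numpy as np
from scipy.optimize import brentq

R = 0.035                          # equivalent sphere radius [m]
k, rho, cp = 0.42, 950.0, 3800.0   # conductivity, density, heat capacity
h = 25.0                           # convective coefficient [W/m^2 K]
alpha = k / (rho * cp)             # thermal diffusivity [m^2/s]
Bi = h * R / k

# first eigenvalue of 1 - zeta*cot(zeta) = Bi on (0, pi)
zeta = brentq(lambda z: 1.0 - z / np.tan(z) - Bi, 1e-6, np.pi - 1e-6)
C1 = (4.0 * (np.sin(zeta) - zeta * np.cos(zeta))
      / (2.0 * zeta - np.sin(2.0 * zeta)))

def center_theta(t):
    """Dimensionless centre temperature excess (one-term approximation)."""
    Fo = alpha * t / R**2
    return C1 * np.exp(-zeta**2 * Fo)

# seven-eighths cooling time (theta = 1/8), a standard precooling metric
t78 = brentq(lambda t: center_theta(t) - 0.125, 10.0, 1.0e6)
print("Bi = %.2f, 7/8 cooling time = %.2f h" % (Bi, t78 / 3600.0))
```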

  6. A non-linear dimension reduction methodology for generating data-driven stochastic input models

    Science.gov (United States)

    Ganapathysubramanian, Baskar; Zabaras, Nicholas

    2008-06-01

    Stochastic analysis of random heterogeneous media (polycrystalline materials, porous media, functionally graded materials) provides information of significance only if realistic input models of the topology and property variations are used. This paper proposes a framework to construct such input stochastic models for the topology and thermal diffusivity variations in heterogeneous media using a data-driven strategy. Given a set of microstructure realizations (input samples) generated from given statistical information about the medium topology, the framework constructs a reduced-order stochastic representation of the thermal diffusivity. This problem of constructing a low-dimensional stochastic representation of property variations is analogous to the problem of manifold learning and parametric fitting of hyper-surfaces encountered in image processing and psychology. Denote by M the set of microstructures that satisfy the given experimental statistics. A non-linear dimension reduction strategy is utilized to map M to a low-dimensional region, A. We first show that M is a compact manifold embedded in a high-dimensional input space R^n. An isometric mapping F from M to a low-dimensional, compact, connected set A ⊂ R^d (d ≪ n) is constructed. Given only a finite set of samples of the data, the methodology uses arguments from graph theory and differential geometry to construct the isometric transformation F: M → A. Asymptotic convergence of the representation of M by A is shown. This mapping F serves as an accurate, low-dimensional, data-driven representation of the property variations. The reduced-order model of the material topology and thermal diffusivity variations is subsequently used as an input in the solution of stochastic partial differential equations that describe the evolution of dependent variables. A sparse grid collocation strategy (Smolyak algorithm) is utilized to solve these stochastic equations efficiently. We showcase the methodology by constructing low
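
    The flavor of non-linear dimension reduction formalized here, an approximately isometric embedding of high-dimensional samples into a low-dimensional set, can be tried out with off-the-shelf Isomap (graph geodesics plus classical MDS). The binary "microstructure" samples below are random placeholders for realizations consistent with given statistics.

```python
# Isomap embedding of synthetic high-dimensional "microstructure" samples.
import numpy as np
from sklearn.manifold import Isomap

rng = np.random.default_rng(7)
# 200 synthetic samples, each a flattened 16x16 indicator field
base = rng.normal(size=(3, 256))                 # three hidden generating modes
coeff = rng.uniform(-1.0, 1.0, size=(200, 3))
samples = (coeff @ base + 0.05 * rng.normal(size=(200, 256)) > 0.0)

embedding = Isomap(n_neighbors=12, n_components=3)
A = embedding.fit_transform(samples.astype(float))
print("reduced representation:", A.shape)        # (200, 3): points in A in R^d
```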

  7. A non-linear dimension reduction methodology for generating data-driven stochastic input models

    International Nuclear Information System (INIS)

    Ganapathysubramanian, Baskar; Zabaras, Nicholas

    2008-01-01

    Stochastic analysis of random heterogeneous media (polycrystalline materials, porous media, functionally graded materials) provides information of significance only if realistic input models of the topology and property variations are used. This paper proposes a framework to construct such input stochastic models for the topology and thermal diffusivity variations in heterogeneous media using a data-driven strategy. Given a set of microstructure realizations (input samples) generated from given statistical information about the medium topology, the framework constructs a reduced-order stochastic representation of the thermal diffusivity. This problem of constructing a low-dimensional stochastic representation of property variations is analogous to the problem of manifold learning and parametric fitting of hyper-surfaces encountered in image processing and psychology. Denote by M the set of microstructures that satisfy the given experimental statistics. A non-linear dimension reduction strategy is utilized to map M to a low-dimensional region, A. We first show that M is a compact manifold embedded in a high-dimensional input space R^n. An isometric mapping F from M to a low-dimensional, compact, connected set A ⊂ R^d (d ≪ n) is constructed. Given only a finite set of samples of the data, the methodology uses arguments from graph theory and differential geometry to construct the isometric transformation F: M → A. Asymptotic convergence of the representation of M by A is shown. This mapping F serves as an accurate, low-dimensional, data-driven representation of the property variations. The reduced-order model of the material topology and thermal diffusivity variations is subsequently used as an input in the solution of stochastic partial differential equations that describe the evolution of dependent variables. A sparse grid collocation strategy (Smolyak algorithm) is utilized to solve these stochastic equations efficiently. We showcase the methodology

  8. The dynamics of structures - Necessity and methodology for amendment by comparing the calculated model with experimental model

    International Nuclear Information System (INIS)

    Caneparo, B.; Zirilli, S.

    1987-01-01

    In this work, relating to support structures for seismic tests, the authors present a mixed procedure requiring the experimental measurement of natural frequencies, dampings, and the response to impulse excitations (in the case of a seismic excitation, the subject of this study, a single impulse is sufficient) in the zone in question. The experimental measurements are used to adjust the finite element model, which may then be used for later studies. In the presence of interaction with structures not included in the model, such as, for example, the equipment used for the actual test, the model cannot be adjusted by the methods proposed, and it is up to the experienced analyst to introduce whatever modifications are judged opportune to take into account everything that is not part of the model. The authors have, however, written a programme, based on the local modification of Young's modulus and using only natural frequencies, that is useful in the adjustment process. Once a zone of poor modelling has been found, this programme enables the value of E to be optimized as a function of the experimental data, whilst also furnishing an estimate of the residual differences. Dynamic tests have shown that the model thus obtained can be further refined using the forced response to an impulse excitation. In addition to setting out the theories and formulae used, we then give an account of the verification of the methodology using a plate, and of its application to a frame-shaped support structure for seismic tests. The appendices include both experimental measurements and tests. The authors carried out the modal analysis with even greater care than necessary in view of the methodology verification phase
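
    A minimal sketch of the adjustment idea, simplified from the authors' local-modification programme to a single global parameter: when the stiffness of a linear structure scales with E, every natural frequency scales with sqrt(E), so a least-squares update of E against measured frequencies has a closed form. The frequency values below are hypothetical.

```python
# Minimal sketch (single global parameter): adjust Young's modulus E so that
# computed natural frequencies best match measured ones. Since each frequency
# scales as sqrt(E), the least-squares update is closed-form.
import numpy as np

E0 = 210e9                                  # initial modulus in the FE model, Pa
f_model = np.array([12.4, 77.6, 217.3])     # FEM frequencies at E0, Hz (hypothetical)
f_meas = np.array([11.8, 74.1, 206.9])      # measured frequencies, Hz (hypothetical)

# f_i(E) = f_model_i * sqrt(E/E0); minimize sum (f_meas - r*f_model)^2 over r
ratio = np.sum(f_meas * f_model) / np.sum(f_model**2)   # optimal r = sqrt(E/E0)
E_updated = E0 * ratio**2
residual = f_meas - f_model * ratio          # estimate of residual differences
print(f"E updated from {E0:.3e} to {E_updated:.3e} Pa")
print("residual frequency errors [Hz]:", np.round(residual, 2))
```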

  9. MODELING AND STRUCTURING OF ENTERPRISE MANAGEMENT SYSTEM RESORT SPHERE BASED ON ELEMENTS OF NEURAL NETWORK THEORY: THE METHODOLOGICAL BASIS

    Directory of Open Access Journals (Sweden)

    Rena R. Timirualeeva

    2015-01-01

    Full Text Available The article describes a methodological basis for modeling and structuring the management systems of resort-sphere enterprises using elements of neural network theory. It accounts for environmental factors at the mega-, macro- and meso-levels, for the internal state of the managed system, and for errors in the execution of management commands by the control system. The proposed methodology can improve the quality of management of resort complex enterprises through a more flexible response to changes in the parameters of the internal and external environments.

  10. Modelled Cost-Effectiveness of a Package Size Cap and a Kilojoule Reduction Intervention to Reduce Energy Intake from Sugar-Sweetened Beverages in Australia

    Science.gov (United States)

    Mantilla Herrera, Ana Maria; Neal, Bruce; Zheng, Miaobing; Lal, Anita; Sacks, Gary

    2017-01-01

    Interventions targeting the portion size and energy density of food and beverage products have been identified as a promising approach to obesity prevention. This study modelled the potential cost-effectiveness of: a package size cap on single-serve sugar-sweetened beverages (SSBs) >375 mL (package size cap), and product reformulation to reduce the energy content of packaged SSBs (energy reduction). The cost-effectiveness of each intervention was modelled for the 2010 Australian population using a multi-state life table Markov model with a lifetime time horizon. Long-term health outcomes were modelled from calculated changes in body mass index through to their impact on Health-Adjusted Life Years (HALYs). Intervention costs were estimated from a limited societal perspective. Costs and health outcomes were discounted at 3%. Total intervention costs were estimated at AUD 210 million (in 2010 AUD). Both interventions resulted in reduced mean body weight (package size cap: 0.12 kg; energy reduction: 0.23 kg) and HALYs gained (package size cap: 73,883; energy reduction: 144,621). Cost offsets were estimated at AUD 750.8 million (package size cap) and AUD 1.4 billion (energy reduction). Cost-effectiveness analyses showed that both interventions were “dominant”, and likely to result in long-term cost savings and health benefits. A package size cap and kJ reduction of SSBs are likely to offer excellent “value for money” as obesity prevention measures in Australia. PMID:28878175
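
    A toy sketch of a multi-state life table Markov model of the kind described above, accumulating discounted HALYs and costs over a lifetime horizon for a baseline and an intervention scenario. All transition probabilities, utilities and costs are hypothetical and not the study's inputs.

```python
# Toy cohort Markov model with 3% discounting, illustrating how HALYs gained
# and cost offsets are accumulated. All numbers are hypothetical.
import numpy as np

# States: 0 = healthy, 1 = ill, 2 = dead (absorbing).
P = np.array([[0.96, 0.03, 0.01],
              [0.00, 0.90, 0.10],
              [0.00, 0.00, 1.00]])
utility = np.array([1.0, 0.7, 0.0])    # health-adjusted weight per state-year
cost = np.array([0.0, 5000.0, 0.0])    # annual health-care cost per state

def run(P, years=80, r=0.03):
    state = np.array([1.0, 0.0, 0.0])  # cohort starts healthy
    halys = costs = 0.0
    for t in range(years):
        disc = 1.0 / (1.0 + r) ** t
        halys += disc * state @ utility
        costs += disc * state @ cost
        state = state @ P
    return halys, costs

base_halys, base_costs = run(P)
P_int = P.copy()
P_int[0, 1] -= 0.002                   # hypothetical drop in disease incidence
P_int[0, 0] += 0.002
int_halys, int_costs = run(P_int)
print(f"HALYs gained per person: {int_halys - base_halys:.3f}")
print(f"Cost offset per person:  AUD {base_costs - int_costs:.0f}")
```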

  11. Adolescent depressive symptoms in India, Australia and USA: Exploratory Structural Equation Modelling of cross-national invariance and predictions by gender and age.

    Science.gov (United States)

    Lewis, Andrew J; Rowland, Bosco; Tran, Aiden; Solomon, Renatti F; Patton, George C; Catalano, Richard F; Toumbourou, John W

    2017-04-01

    The present study compares depressive symptoms in adolescents from three countries: Mumbai, India; Seattle, United States; and Melbourne, Australia, measured using the Short Moods and Feelings Questionnaire (SMFQ). The study cross-nationally compares SMFQ depressive symptom responses by age and gender. Data from a cross-nationally matched survey were used to compare factorial and measurement characteristics of samples of students from Grades 7 and 9 in Mumbai, India (n=3268) with the equivalent cohorts in Washington State, USA (n=1907) and Victoria, Australia (n=1900). Exploratory Structural Equation Modelling (ESEM) was used to examine factor structure and measurement invariance cross-nationally. A number of reports suggesting that the SMFQ is uni-dimensional were not supported by the findings from any country. A model with two factors was a better fit, suggesting a first factor clustering affective and physiologically based symptoms and a second factor of self-critical, cognitive symptoms. The two-factor model showed convincing cross-national configural invariance and acceptable measurement invariance. The present findings revealed that adolescents in Mumbai, India, reported substantially higher depressive symptoms on both factors, but particularly on the self-critical dimension, compared to their peers in Australia and the USA, and that males in Mumbai reported higher levels of depressive symptoms than females in Mumbai. Limitations: the cross-sectional study collected data for adolescents in Melbourne and Seattle in 2002, whereas the data for adolescents in Mumbai were obtained in 2010-2011. Conclusions: These findings suggest that previous findings in developed nations of higher depressive symptoms amongst females compared to males may have an important cultural component and cannot be generalised as a universal feature of adolescent development. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Methodology study for documentation and 3D modelling of blast induced fractures

    Energy Technology Data Exchange (ETDEWEB)

    Olsson, Mats (Swebrec - Swedish Blasting Research Centre, Luleaa (Sweden)); Markstroem, Ingemar; Pettersson, Anders (Golder Associates (Sweden))

    2008-05-15

    The purpose of this activity, as part of the Zuse project, was to test whether it is possible to produce a 3D model of blast-induced fractures around a tunnel, and also to find a methodology suitable for large-scale studies. The purpose of the studies is to increase the understanding of the excavation damage zone (EDZ) and the possibility of a continuous EDZ existing along the tunnel. For the investigation, an old test area in the Q tunnel at the Aespoe Hard Rock Laboratory was selected, where slabs were excavated in 2003 to investigate the fracture pattern around the contour holes of a blasted tunnel. The rock walls of the excavated niche were studied and documented in the tunnel, while the excavated rock slabs were documented above ground. The workflow included photo documentation of both sides. The photos taken in the tunnel had to be rectified, and the fractures were then vectorized automatically in a vectorization program, generating AutoCad DWG files as output. The vectorized fractures were then moved to MicroStation/RVS, where they were interpreted and connected into continuous line strings. The digitized slab and rock sides were then moved to the correct position in 3D space. Finally, a 3D model was made in RVS in which the fracture traces were connected into undulating fracture planes in 3D. The conclusion is that it is possible to build a 3D model; the model is presented in Chapter 3.5. However, the age and condition of the slabs may have influenced the quality of the model in this study. The quality of a model built in a future investigation should be much better if the surveys are adapted to the investigation at hand and the slabs and rock sides are fresh and in better condition. The validity of a model depends on the density of the investigation data. There is also always a risk of over-interpretation; the wish to identify a fracture from one section to the next can lead to an interpretation of the fractures as more persistent than they actually

  13. Methodology study for documentation and 3D modelling of blast induced fractures

    International Nuclear Information System (INIS)

    Olsson, Mats; Markstroem, Ingemar; Pettersson, Anders

    2008-05-01

    The purpose of this activity, as part of the Zuse project, was to test whether it is possible to produce a 3D model of blast-induced fractures around a tunnel, and also to find a methodology suitable for large-scale studies. The purpose of the studies is to increase the understanding of the excavation damage zone (EDZ) and the possibility of a continuous EDZ existing along the tunnel. For the investigation, an old test area in the Q tunnel at the Aespoe Hard Rock Laboratory was selected, where slabs were excavated in 2003 to investigate the fracture pattern around the contour holes of a blasted tunnel. The rock walls of the excavated niche were studied and documented in the tunnel, while the excavated rock slabs were documented above ground. The workflow included photo documentation of both sides. The photos taken in the tunnel had to be rectified, and the fractures were then vectorized automatically in a vectorization program, generating AutoCad DWG files as output. The vectorized fractures were then moved to MicroStation/RVS, where they were interpreted and connected into continuous line strings. The digitized slab and rock sides were then moved to the correct position in 3D space. Finally, a 3D model was made in RVS in which the fracture traces were connected into undulating fracture planes in 3D. The conclusion is that it is possible to build a 3D model; the model is presented in Chapter 3.5. However, the age and condition of the slabs may have influenced the quality of the model in this study. The quality of a model built in a future investigation should be much better if the surveys are adapted to the investigation at hand and the slabs and rock sides are fresh and in better condition. The validity of a model depends on the density of the investigation data. There is also always a risk of over-interpretation; the wish to identify a fracture from one section to the next can lead to an interpretation of the fractures as more persistent than they actually

  14. Climate Change Modeling Methodology Selected Entries from the Encyclopedia of Sustainability Science and Technology

    CERN Document Server

    2012-01-01

    The Earth's average temperature has risen by 1.4°F over the past century, and computer models project that it will rise much more over the next hundred years, with significant impacts on weather, climate, and human society. Many climate scientists attribute these increases to the buildup of greenhouse gases produced by the burning of fossil fuels and to the anthropogenic production of short-lived climate pollutants. Climate Change Modeling Methodologies: Selected Entries from the Encyclopedia of Sustainability Science and Technology provides readers with an introduction to the tools and analysis techniques used by climate change scientists to interpret the role of these forcing agents on climate.  Readers will also gain a deeper understanding of the strengths and weaknesses of these models and how to test and assess them.  The contributions include a glossary of key terms and a concise definition of the subject for each topic, as well as recommendations for sources of more detailed information. Features au...

  15. Study on an ISO 15926 based data modeling methodology for nuclear power industry

    Energy Technology Data Exchange (ETDEWEB)

    Cheon, Yang Ho; Park, Byeong Ho; Park, Seong Chan; Kim, Eun Kee [KEPCO E-C, Yongin (Korea, Republic of)

    2014-10-15

    The scope is therefore data integration and data support across the whole life of a plant. This representation is specified by a generic, conceptual Data Model (DM) that is independent of any particular application, but that is able to record data from the applications used in plant design, fabrication and operation. The data model is designed to be used in conjunction with Reference Data (RD): standard instances of the DM that represent information common to a number of users, plants, or both. This paper gives a high-level description of the structure of ISO 15926 and of how it can be adapted to the nuclear power plant industry in particular, introducing the ISO 15926 methodology and how to extend the existing RDL for the nuclear power industry. As the ISO 15926 representation is independent of applications, interfaces to existing or future applications have to be developed. Such interfaces are provided by Templates that take input from external sources and 'lift' it into an ISO 15926 repository, and/or 'lower' the data into other applications, similar to the process defined by the W3C. Data exchange can be done using e.g. XML messages, but the modelling is independent of the technology used for the exchange.

  16. The Cultural Analysis of Soft Systems Methodology and the Configuration Model of Organizational Culture

    Directory of Open Access Journals (Sweden)

    Jürgen Staadt

    2015-06-01

    Full Text Available Organizations that find themselves in a problematic situation connected with cultural issues such as politics and power require adaptable research and corresponding modeling approaches so as to grasp the arrangements of that situation and their impact on organizational development. This article originates from an insider-ethnographic intervention into the problematic situation of the leading public housing provider in Luxembourg. Its aim is to describe how the more action-oriented cultural analysis of soft systems methodology and the theory-driven configuration model of organizational culture are mutually beneficial rather than contradictory. The data, collected between 2007 and 2013, were analyzed manually as well as by means of ATLAS.ti. Results demonstrate that the cultural analysis enables an in-depth understanding of the power-laden environment within the organization that brings about the so-called “socio-political system”, and that the configuration model makes it possible to depict the influence of that system on the whole organization. The overall research approach thus contributes toward a better understanding of the influence and impact of oppressive social environments and evolving power relations on the development of an organization.

  17. Alcohol, psychomotor-stimulants and behaviour: methodological considerations in preclinical models of early-life stress.

    Science.gov (United States)

    McDonnell-Dowling, Kate; Miczek, Klaus A

    2018-04-01

    In order to assess the risk associated with early-life stress, there has been an increase in the number of preclinical studies investigating early-life stress. There are many challenges associated with investigating early-life stress in animal models and with ensuring that such models are appropriate and clinically relevant. The purpose of this review is to highlight the methodological considerations in the design of preclinical studies investigating the effects of early-life stress on alcohol and psychomotor-stimulant intake and behaviour. The protocols employed for exploring early-life stress were investigated and summarised. Experimental variables include the animals, stress models, and endpoints employed. The findings in this paper suggest that there is little consistency among these studies, and so the interpretation of these results may not be as clinically relevant as previously thought. The standardisation of these simple stress procedures means that results will be more comparable between studies and that the results generated will give us a more robust understanding of what can and may be happening in the human and veterinary clinic.

  18. Processing of the GALILEO fuel rod code model uncertainties within the AREVA LWR realistic thermal-mechanical analysis methodology

    International Nuclear Information System (INIS)

    Mailhe, P.; Barbier, B.; Garnier, C.; Landskron, H.; Sedlacek, R.; Arimescu, I.; Smith, M.; Bellanger, P.

    2013-01-01

    The availability of reliable tools, and an associated methodology, able to accurately predict LWR fuel behavior in all conditions is of great importance for safe and economic fuel usage. For that purpose, AREVA has developed its new global fuel rod performance code GALILEO along with its associated realistic thermal-mechanical analysis methodology. This realistic methodology is based on a Monte Carlo type random sampling of all relevant input variables. After outlining the AREVA realistic methodology, this paper focuses on the GALILEO code benchmarking process, on its extended experimental database and on the assessment of the GALILEO model uncertainties. The propagation of these model uncertainties through the AREVA realistic methodology is also presented. This processing of the GALILEO model uncertainties is of the utmost importance for accurate fuel design margin evaluation, as illustrated by some application examples. With the submittal of the GALILEO Topical Report to the U.S. NRC in 2013, GALILEO and its methodology are on the way to being used industrially in a wide range of irradiation conditions. (authors)
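
    A generic sketch of the Monte Carlo-type random sampling described above: uncertain inputs are sampled, propagated through the performance model, and a design margin is read off the output distribution. The stand-in thermal function and all distributions below are hypothetical; this is not the GALILEO code.

```python
# Generic Monte Carlo uncertainty propagation: sample uncertain model inputs,
# push them through a performance model, and evaluate percentile margins.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Hypothetical uncertain inputs: gap conductance multiplier, fuel
# conductivity multiplier, and linear heat generation rate (kW/m).
h_gap = rng.normal(1.0, 0.08, N)
k_fuel = rng.normal(1.0, 0.05, N)
lhgr = rng.triangular(20.0, 25.0, 30.0, N)

def fuel_centerline_temp(h_gap, k_fuel, lhgr):
    """Toy surrogate for a fuel rod thermal model (not the GALILEO code)."""
    return 300.0 + lhgr * (28.0 / k_fuel + 9.0 / h_gap)

T = fuel_centerline_temp(h_gap, k_fuel, lhgr)
p95 = np.percentile(T, 95.0)
print(f"95th percentile centerline temperature: {p95:.0f} C "
      f"(margin to a 2000 C limit: {2000.0 - p95:.0f} C)")
```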

  19. Methodology for Applying Cyber Security Risk Evaluation from BN Model to PSA Model

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Jin Soo; Heo, Gyun Young [Kyung Hee University, Youngin (Korea, Republic of); Kang, Hyun Gook [KAIST, Dajeon (Korea, Republic of); Son, Han Seong [Joongbu University, Chubu (Korea, Republic of)

    2014-08-15

    There are several advantages to using digital equipment, such as cost, convenience, and availability, and it is inevitable that digital I and C equipment will replace analog. Nuclear facilities have already started applying digital systems to their I and C systems. However, nuclear facilities also have to change their I and C systems even though it is difficult to use digital equipment there, due to the high level of safety required, irradiation embrittlement, and cyber security. Cyber security, which is one of the important concerns in using digital equipment, can affect the whole integrity of nuclear facilities. For instance, cyber-attacks such as the SQL Slammer worm, Stuxnet, Duqu, and Flame have occurred at nuclear facilities. The regulatory authorities have published many regulatory requirement documents, such as U.S. NRC Regulatory Guides 5.71 and 1.152, IAEA guide NSS-17, IEEE standards, and KINS Regulatory Guides. One of the important problems of cyber security research for nuclear facilities is the difficulty of obtaining data through penetration experiments. Therefore, we build a cyber security risk evaluation model with a Bayesian network (BN) for the nuclear reactor protection system (RPS), which is one of the safety-critical systems that trip the reactor when an accident happens at the facility; a BN can be used to overcome these problems. We propose a method to apply the BN cyber security model to the probabilistic safety assessment (PSA) model, which has been used for the safety assessment of the systems, structures and components of a facility. The proposed method will be able to provide insight into safety as well as cyber risk for the facility.
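
    A toy sketch of the proposed linkage, assuming a drastically simplified structure: a two-node Bayesian network yields the probability that the RPS software fails under a possible cyber compromise, and that probability then enters a fault-tree-style PSA top event. All probabilities below are hypothetical.

```python
# Toy sketch: BN marginalization gives P(RPS software fails), which is then
# inserted as a basic event in a simplified fault-tree (PSA-style) top event.
p_attack = 1e-3                       # P(successful cyber attack)
p_fail_given_attack = 0.30            # P(RPS software fails | attack)
p_fail_no_attack = 1e-5               # P(RPS software fails | no attack)

# BN side: marginalize over the attack node.
p_sw = (p_attack * p_fail_given_attack
        + (1.0 - p_attack) * p_fail_no_attack)

# PSA side: RPS fails if software OR hardware fails (assumed independent);
# core damage requires an initiating event plus RPS failure.
p_hw = 2e-5
p_rps = 1.0 - (1.0 - p_sw) * (1.0 - p_hw)
f_initiator = 0.1                     # initiating event frequency, per year
cdf = f_initiator * p_rps             # contribution to core damage frequency
print(f"P(RPS fails) = {p_rps:.3e}, CDF contribution = {cdf:.3e} /yr")
```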

  20. Methodology for Applying Cyber Security Risk Evaluation from BN Model to PSA Model

    International Nuclear Information System (INIS)

    Shin, Jin Soo; Heo, Gyun Young; Kang, Hyun Gook; Son, Han Seong

    2014-01-01

    There are several advantages to using digital equipment, such as cost, convenience, and availability, and it is inevitable that digital I and C equipment will replace analog. Nuclear facilities have already started applying digital systems to their I and C systems. However, nuclear facilities also have to change their I and C systems even though it is difficult to use digital equipment there, due to the high level of safety required, irradiation embrittlement, and cyber security. Cyber security, which is one of the important concerns in using digital equipment, can affect the whole integrity of nuclear facilities. For instance, cyber-attacks such as the SQL Slammer worm, Stuxnet, Duqu, and Flame have occurred at nuclear facilities. The regulatory authorities have published many regulatory requirement documents, such as U.S. NRC Regulatory Guides 5.71 and 1.152, IAEA guide NSS-17, IEEE standards, and KINS Regulatory Guides. One of the important problems of cyber security research for nuclear facilities is the difficulty of obtaining data through penetration experiments. Therefore, we build a cyber security risk evaluation model with a Bayesian network (BN) for the nuclear reactor protection system (RPS), which is one of the safety-critical systems that trip the reactor when an accident happens at the facility; a BN can be used to overcome these problems. We propose a method to apply the BN cyber security model to the probabilistic safety assessment (PSA) model, which has been used for the safety assessment of the systems, structures and components of a facility. The proposed method will be able to provide insight into safety as well as cyber risk for the facility.

  1. The Development Strategies for the Management Models of the Electronic Documents and Records in the United States, United Kingdom and Australia

    Directory of Open Access Journals (Sweden)

    Chiu-Yen Lin

    2015-06-01

    Full Text Available The trend toward electronic government has produced a large quantity of electronic records, which challenge the existing records management models in modern countries. This paper describes and compares the development of and transition toward electronic records management in the United States, United Kingdom, and Australia, to show how these three advanced countries have evolved their government records management practices. The analysis emphasizes the holistic policy-initiative perspective and compares directives and regulations, research and development programs and plans, the emerging structures of governance, staffing and professional training, and risk management provisions. The comparison may shed light on government electronic records management in other countries. [Article content in Chinese]

  2. Harmonising and Matching IPR Holders at IP Australia

    OpenAIRE

    T’Mir D. Julius; Gaétan de Rassenfosse

    2014-01-01

    This document describes the methodology developed by the Melbourne Institute to: (i) harmonise holders of intellectual property rights (IPRs) at IP Australia (applications for patents, designs, trademarks and plant breeder's rights); (ii) match Australian IPR holders to the Australian business register; (iii) identify the ultimate owners within Australia; and (iv) identify which holders are small and medium size enterprises.

  3. A systematic methodology to extend the applicability of a bioconversion model for the simulation of various co-digestion scenarios

    DEFF Research Database (Denmark)

    Kovalovszki, Adam; Alvarado-Morales, Merlin; Fotidis, Ioannis

    2017-01-01

    Detailed simulation of anaerobic digestion (AD) requires complex mathematical models and the optimization of numerous model parameters. By performing a systematic methodology and identifying parameters with the highest impact on process variables in a well-established AD model, its applicability...... was extended to various co-digestion scenarios. More specifically, the application of the step-by-step methodology led to the estimation of a general and reduced set of parameters, for the simulation of scenarios where either manure or wastewater were co-digested with different organic substrates. Validation...... experimental data quite well, indicating that it offers a reliable reference point for future simulations of anaerobic co-digestion scenarios....

  4. Methodology for experimental validation of a CFD model for predicting noise generation in centrifugal compressors

    International Nuclear Information System (INIS)

    Broatch, A.; Galindo, J.; Navarro, R.; García-Tíscar, J.

    2014-01-01

    Highlights: • A DES of a turbocharger compressor working at the peak pressure point is performed. • In-duct pressure signals are measured in a steady flow rig with 3-sensor arrays. • Pressure spectra comparison is performed as a validation of the numerical model. • A suitable comparison methodology is developed, relying on pressure decomposition. • Whoosh noise at the outlet duct is detected in experimental and numerical spectra. - Abstract: Centrifugal compressors working on the surge side of the map generate a broadband noise in the range of 1–3 kHz, known as whoosh noise. This noise is perceived in strongly downsized engines operating at particular conditions (full load, tip-in and tip-out maneuvers). A 3-dimensional CFD model of a centrifugal compressor is built to analyze fluid phenomena related to whoosh noise. A detached eddy simulation is performed with the compressor operating at the peak pressure point at 160 krpm. A steady flow rig mounted in an anechoic chamber is used to obtain experimental measurements as a means of validation for the numerical model. In-duct pressure signals are obtained in addition to standard averaged global variables. The numerical simulation provides global variables showing excellent agreement with the experimental measurements. A pressure spectra comparison is performed to assess the noise prediction capability of the numerical model. The influence of the type and position of the virtual pressure probes is evaluated. Pressure decomposition is required in the simulations to obtain meaningful spectra. Different techniques for obtaining the pressure components are analyzed. At the simulated conditions, a broadband noise in the 1–3 kHz frequency band is detected in the experimental measurements. This whoosh noise is also captured by the numerical model
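
    A minimal sketch of the spectra-comparison step, assuming synthetic stand-ins for the measured and simulated probe pressures: power spectral densities are estimated with Welch's method and the band-integrated level in the 1–3 kHz whoosh band is compared.

```python
# Welch PSD comparison of two pressure signals in the 1-3 kHz "whoosh" band.
# The signals below are synthetic placeholders, not the study's data.
import numpy as np
from scipy.signal import welch

fs = 50_000                      # sampling rate, Hz (hypothetical)
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(1)
p_exp = rng.normal(0, 1.0, t.size) + 0.5 * np.sin(2 * np.pi * 1800 * t)
p_cfd = rng.normal(0, 0.9, t.size) + 0.6 * np.sin(2 * np.pi * 1750 * t)

def band_level(p, f_lo=1000.0, f_hi=3000.0):
    f, Pxx = welch(p, fs=fs, nperseg=4096)
    band = (f >= f_lo) & (f <= f_hi)
    return np.trapz(Pxx[band], f[band])   # mean-square pressure in the band

lvl_exp, lvl_cfd = band_level(p_exp), band_level(p_cfd)
print(f"1-3 kHz level, experiment vs CFD: {lvl_exp:.3g} vs {lvl_cfd:.3g} Pa^2")
```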

  5. Miedema model based methodology to predict amorphous-forming-composition range in binary and ternary systems

    Energy Technology Data Exchange (ETDEWEB)

    Das, N., E-mail: nirupamd@barc.gov.in [Materials Science Division, Bhabha Atomic Research Centre, Trombay, Mumbai 400 085 (India); Mittra, J. [Materials Science Division, Bhabha Atomic Research Centre, Trombay, Mumbai 400 085 (India); Murty, B.S. [Department of Metallurgical and Materials Engineering, IIT Madras, Chennai 600 036 (India); Pabi, S.K. [Department of Metallurgical and Materials Engineering, IIT Kharagpur, Kharagpur 721 302 (India); Kulkarni, U.D.; Dey, G.K. [Materials Science Division, Bhabha Atomic Research Centre, Trombay, Mumbai 400 085 (India)

    2013-02-15

    Highlights: • A methodology was proposed to predict amorphous forming compositions (AFCs). • Chemical contribution to the enthalpy of mixing ∝ enthalpy of the amorphous phase for AFCs. • Accuracy in the prediction of the AFC range was noticed in the Al-Ni-Ti system. • Mechanical alloying (MA) results for Al-Ni-Ti followed the predicted AFC range. • Earlier MA results for Al-Ni-Ti also conformed to the predicted AFC range. - Abstract: From the earlier works on the prediction of the amorphous forming composition range (AFCR) using the Miedema based model, and also on mechanical alloying experiments, it has been observed that all amorphous forming compositions of a given alloy system fall within a linear band when the chemical contribution to the enthalpy of the solid solution (ΔH^ss) is plotted against the enthalpy of mixing in the amorphous phase (ΔH^amor). On the basis of this observation, a methodology has been proposed in this article to identify the AFCR of a ternary system that is likely to be more precise than what can be obtained using the ΔH^amor − ΔH^ss < 0 criterion. MA experiments on various compositions of the Al-Ni-Ti system, producing amorphous, crystalline, and mixtures of amorphous plus crystalline phases, have been carried out, and the phases have been characterized using X-ray diffraction and transmission electron microscopy techniques. Data from the present MA experiments, and also from the literature, have been used to validate the proposed approach. The proximity of compositions producing a mixture of amorphous and crystalline phases to the boundary of the AFCR in the Al-Ni-Ti ternary has also been found useful to validate the effectiveness of the prediction.

  6. A conceptual model for groundwater - surface water interactions in the Darling River Floodplain, N.S.W., Australia

    Science.gov (United States)

    Brodie, R. S.; Lawrie, K.; Somerville, P.; Hostetler, S.; Magee, J.; Tan, K. P.; Clarke, J.

    2013-12-01

    Multiple lines of evidence were used to develop a conceptual model for interaction between the Darling River and associated floodplain aquifers in western New South Wales, Australia. Hydrostratigraphy and groundwater salinities were mapped using airborne electromagnetics (AEM), validated by sonic-core drilling. The AEM was highly effective in mapping groundwater freshening due to river leakage in discrete zones along the river corridor. These fresh resources occurred in both the unconfined Quaternary aquifers and the underlying, largely semi-confined Pliocene aquifers. The AEM was also fundamental to mapping the Blanchetown Clay aquitard which separates these two aquifer systems. Major-ion chemistry highlighted a mixing signature between river waters and groundwaters in both the Quaternary and Pliocene aquifers. Stable isotope data indicates that recharge to the key Pliocene aquifers is episodic and linked to high-flow flood events rather than river leakage being continuous. This was also evident when groundwater chemistry was compared with river chemistry under different flow conditions. Mapping of borehole levels showed groundwater mounding near the river, emphasising the regional significance of losing river conditions for both aquifer systems. Critically, rapid and significant groundwater level responses were measured during large flood events. In the Pliocene aquifers, continuation of rising trends after the flood peak receded confirms that this is an actual recharge response rather than hydraulic loading. The flow dependency of river leakage can be explained by the presence of mud veneers and mineral precipitates along the Darling River channel bank when river flows are low. During low flow conditions these act as impediments to river leakage. During floods, high flow velocities scour these deposits, revealing lateral-accretion surfaces in the shallow scroll plain sediments. This scouring allows lateral bank recharge to the shallow aquifer. During flood

  7. A methodology for the design and testing of atmospheric boundary layer models for wind energy applications

    Directory of Open Access Journals (Sweden)

    J. Sanz Rodrigo

    2017-02-01

    Full Text Available The GEWEX Atmospheric Boundary Layer Studies (GABLS) 1, 2 and 3 are used to develop a methodology for the design and testing of Reynolds-averaged Navier–Stokes (RANS) atmospheric boundary layer (ABL) models for wind energy applications. The first two GABLS cases are based on idealized boundary conditions and are suitable for verification purposes by comparison with results from higher-fidelity models based on large-eddy simulation. Results from three single-column RANS models, of 1st, 1.5th and 2nd turbulence closure order, show high consistency in predicting the mean flow. The third GABLS case is suitable for the study of these ABL models under realistic forcing, such that validation against observations from the Cabauw meteorological tower is possible. The case consists of a diurnal cycle that leads to a nocturnal low-level jet and addresses fundamental questions related to the definition of the large-scale forcing, the interaction of the ABL with the surface, and the evaluation of model results against observations. The simulations are evaluated in terms of surface-layer fluxes and wind energy quantities of interest: rotor equivalent wind speed, hub-height wind direction, wind speed shear and wind direction veer. The characterization of the mesoscale forcing is based on spatially and temporally averaged momentum budget terms from Weather Research and Forecasting (WRF) simulations. These mesoscale tendencies are used to drive the single-column models, which were verified previously in the first two GABLS cases, to first demonstrate that they can produce wind profile characteristics similar to the WRF simulations even though the physics are more simplified. The added value of incorporating different forcing mechanisms into microscale models is quantified by systematically removing forcing terms in the momentum and heat equations. This mesoscale-to-microscale modeling approach is affected, to a large extent, by the input uncertainties of the mesoscale

  8. A methodology for thermo-economic modeling and optimization of solid oxide fuel cell systems

    International Nuclear Information System (INIS)

    Palazzi, Francesca; Autissier, Nordahl; Marechal, Francois M.A.; Favrat, Daniel

    2007-01-01

    In the context of stationary power generation, fuel cell-based systems are foreseen as a valuable alternative to thermodynamic cycle-based power plants, especially in small-scale applications. As the technology is not yet established, many aspects of fuel cell development are currently being investigated worldwide. Part of the research focuses on integrating the fuel cell in a system that is both efficient and economically attractive. To address this problem, we present in this paper a thermo-economic optimization method that systematically generates the most attractive configurations of an integrated system. In the developed methodology, the energy flows are computed using conventional process simulation software. The system is integrated using pinch-based methods that rely on optimization techniques. This defines the minimum energy requirement and sets the basis for designing the ideal heat exchanger network. A thermo-economic method is then used to compute the integrated system's performance, sizes and costs. This allows the optimization of the system with regard to two objectives: minimizing the specific cost and maximizing the efficiency. A solid oxide fuel cell (SOFC) system of 50 kW integrating a planar SOFC is modeled and optimized, leading to designs with efficiencies ranging from 34% to 44%. The multi-objective optimization strategy identifies interesting system configurations and their performance for the developed SOFC system model. The method proves to be an attractive tool both for advanced analysis and as support to decision makers when designing new systems

  9. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models

    Science.gov (United States)

    Coppolino, Robert N.

    2018-01-01

    Responses to challenges associated with verification and validation (V&V) of Space Launch System (SLS) structural dynamics models are presented in this paper. Four methodologies addressing specific requirements for V&V are discussed. (1) Residual Mode Augmentation (RMA), which has gained acceptance by various principals in the NASA community, defines efficient and accurate FEM modal sensitivity models that are useful in test-analysis correlation and reconciliation and parametric uncertainty studies. (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976), developed to remedy difficulties encountered with the widely used Classical Guyan Reduction (CGR) method, are presented. MGR and HR are particularly relevant for estimation of "body dominant" target modes of shell-type SLS assemblies that have numerous "body", "breathing" and local component constituents. Realities associated with configuration features and "imperfections" cause "body" and "breathing" mode characteristics to mix resulting in a lack of clarity in the understanding and correlation of FEM- and test-derived modal data. (3) Mode Consolidation (MC) is a newly introduced procedure designed to effectively "de-feature" FEM and experimental modes of detailed structural shell assemblies for unambiguous estimation of "body" dominant target modes. Finally, (4) Experimental Mode Verification (EMV) is a procedure that addresses ambiguities associated with experimental modal analysis of complex structural systems. Specifically, EMV directly separates well-defined modal data from spurious and poorly excited modal data employing newly introduced graphical and coherence metrics.

  10. What Can Be Learned From a Laboratory Model of Conceptual Change? Descriptive Findings and Methodological Issues

    Science.gov (United States)

    Ohlsson, Stellan; Cosejo, David G.

    2014-07-01

    The problem of how people process novel and unexpected information - deep learning (Ohlsson in Deep learning: how the mind overrides experience. Cambridge University Press, New York, 2011) - is central to several fields of research, including creativity, belief revision, and conceptual change. Researchers have not converged on a single theory for conceptual change, nor has any one theory been decisively falsified. One contributing reason is the difficulty of collecting informative data in this field. We propose that the commonly used methodologies of historical analysis, classroom interventions, and developmental studies, although indispensable, can be supplemented with studies of laboratory models of conceptual change. We introduce re-categorization, an experimental paradigm in which learners transition from one definition of a categorical concept to another, incompatible definition of the same concept, a simple form of conceptual change. We describe a re-categorization experiment and report some descriptive findings pertaining to the effects of category complexity, the temporal unfolding of learning, and the nature of the learner's final knowledge state. We end with a brief discussion of ways in which the re-categorization model can be improved.

  11. Flying personal planes: modeling the airport choices of general aviation pilots using stated preference methodology.

    Science.gov (United States)

    Camasso, M J; Jagannathan, R

    2001-01-01

    This study employed stated preference (SP) models to determine why general aviation pilots choose to base and operate their aircraft at some airports and not others. Thirteen decision variables identified in pilot focus groups and in the general aviation literature were incorporated into a series of hypothetical choice tasks or scenarios. The scenarios were offered within a fractional factorial design to establish orthogonality and to preclude dominance in any combination of variables. Data from 113 pilots were analyzed for individual differences across pilots using conditional logit regression with and without controls. The results demonstrate that some airport attributes (e.g., full-range hospitality services, paved parallel taxiway, and specific types of runway lighting and landing aids) increase pilot utility. Heavy airport congestion and airport landing fees, on the other hand, decrease pilot utility. The importance of SP methodology as a vehicle for modeling choice behavior and as an input into the planning and prioritization process is discussed. Actual or potential applications include the development of structured decision-making instruments in the behavioral sciences and in human service programs.
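
    A minimal sketch of the conditional (McFadden) logit behind such SP analyses, fitted by maximum likelihood on synthetic choice data: each pilot picks the airport with the highest utility, which is linear in the airport attributes. The data and dimensions are hypothetical, not the study's thirteen decision variables.

```python
# Hand-rolled conditional logit: P(choose j) = exp(x_j'b) / sum_k exp(x_k'b)
# within each choice set. Synthetic data stands in for the SP scenarios.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
n_sets, n_alts, n_attr = 113, 3, 4       # pilots x alternatives x attributes
X = rng.normal(size=(n_sets, n_alts, n_attr))
beta_true = np.array([1.0, -0.8, 0.5, -0.3])   # e.g. services +, fees -
u = X @ beta_true + rng.gumbel(size=(n_sets, n_alts))
y = u.argmax(axis=1)                      # observed choices

def neg_loglik(beta):
    v = X @ beta
    v -= v.max(axis=1, keepdims=True)     # numerical stability
    logp = v - np.log(np.exp(v).sum(axis=1, keepdims=True))
    return -logp[np.arange(n_sets), y].sum()

res = minimize(neg_loglik, np.zeros(n_attr), method="BFGS")
print("estimated attribute weights:", np.round(res.x, 2))
```

    Positive estimated weights correspond to attributes that increase pilot utility (e.g., hospitality services), negative weights to deterrents such as congestion or landing fees.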

  12. Modeling and optimization of ammonia treatment by acidic biochar using response surface methodology

    Directory of Open Access Journals (Sweden)

    Narong Chaisongkroh

    2012-09-01

    Full Text Available Emission of ammonia (NH3)-contaminated waste air to the atmosphere without treatment has affected humans and the environment. Eliminating NH3 in waste air emitted from industries is considered an environmental requisite. In this study, optimization of the NH3 adsorption time using acidic rubber wood biochar (RWB) impregnated with sulfuric acid (H2SO4) was investigated. The central composite design (CCD) in response surface methodology (RSM), implemented with the Design Expert software, was used for designing the experiments as well as for the full response surface estimation. The RSM was used to evaluate the effect of the adsorption parameters in a continuous fixed-bed column, including waste air flow rate, inlet NH3 concentration in the waste air stream, and H2SO4 concentration for adsorbent surface modification. Based on statistical analysis, the model for the NH3 symmetric adsorption time (at 50% NH3 removal efficiency) proved to be very highly significant (p < 0.0001). The optimum conditions obtained were 300 ppmv inlet NH3 concentration, 72% H2SO4, and 2.1 l/min waste air flow rate. This resulted in 219 minutes of NH3 adsorption time as obtained from the predicted model, which fitted well with the laboratory verification result, as supported by the high coefficient of determination (R2 = 0.9137). (NH4)2SO4, a nitrogen fertilizer for planting, was the by-product of the chemical adsorption between NH3 and H2SO4.
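
    A minimal sketch of the CCD/RSM workflow, reduced to two coded factors for brevity (the study used three): build a face-centred central composite design, fit a full quadratic response surface by least squares, and locate its stationary point. The response values below are synthetic, not the study's measurements.

```python
# Face-centred central composite design + quadratic response surface fit.
import numpy as np
from itertools import product

# Design points in coded units: factorial corners, axial points, centre runs.
corners = np.array(list(product([-1, 1], repeat=2)), dtype=float)
axial = np.array([[-1, 0], [1, 0], [0, -1], [0, 1]], dtype=float)
center = np.zeros((3, 2))
D = np.vstack([corners, axial, center])

def design_matrix(D):
    x1, x2 = D[:, 0], D[:, 1]
    return np.column_stack([np.ones(len(D)), x1, x2, x1**2, x2**2, x1 * x2])

rng = np.random.default_rng(3)
y = (200 + 15*D[:, 0] - 10*D[:, 1] - 20*D[:, 0]**2 - 8*D[:, 1]**2
     + 5*D[:, 0]*D[:, 1] + rng.normal(0, 2, len(D)))   # synthetic responses

b, *_ = np.linalg.lstsq(design_matrix(D), y, rcond=None)
# Stationary point: solve grad = 0 for the fitted quadratic surface.
H = np.array([[2*b[3], b[5]], [b[5], 2*b[4]]])
x_opt = np.linalg.solve(H, -b[1:3])
print("coded optimum:", np.round(x_opt, 3))
```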

  13. Development of A Methodology for Assessing Various Accident Management Strategies Using Decision Tree Models

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Nam Yeong; Kim, Jin Tae; Jae, Moo Sung [Hanyang University, Seoul (Korea, Republic of); Jerng, Dong Wook [Chung-Ang University, Seoul (Korea, Republic of)

    2016-05-15

    The purpose of ASP (Accident Sequence Precursor) analysis is to evaluate operational accidents in full power and low power operation by using PRA (Probabilistic Risk Assessment) technologies. Awareness of the importance of ASP analysis has been on the rise. A methodology for ASP analysis has been developed in Korea, and KINS (Korea Institute of Nuclear Safety) has managed the KINS-ASP program since it was developed. In this study, we applied ASP analysis to operational accidents in full power and low power operation to quantify the CCDP (Conditional Core Damage Probability). To reflect these two cases in the PRA model, we modified the fault trees and event trees of the existing PRA model. We also suggest an ASP regulatory system in the conclusion. In this study, we reviewed previous studies of ASP analysis and, based on them, applied it to operational accidents in full power and low power operation. The CCDPs of these two cases are 1.195E-06 and 2.261E-03. Unlike other countries, Korea has no regulatory basis for ASP analysis. ASP analysis can detect risk by assessing existing operational accidents, and it can improve the safety of nuclear power plants by detecting and reviewing operational accidents and finally removing potential risk. Operators have to notify the regulatory institute of an operational accident before taking recovery action for the accident. After following up on an accident, they have to check precursors in the database to find similar accidents.

  14. External Validity and Model Validity: A Conceptual Approach for Systematic Review Methodology

    Directory of Open Access Journals (Sweden)

    Raheleh Khorsan

    2014-01-01

    Full Text Available Background. Evidence rankings do not consider equally internal validity (IV), external validity (EV), and model validity (MV) for clinical studies, including complementary and alternative medicine/integrative health care (CAM/IHC) research. This paper describes this model and offers an EV assessment tool (EVAT©) for weighing studies according to EV and MV in addition to IV. Methods. An abbreviated systematic review methodology was employed to search, assemble, and evaluate the literature that has been published on EV/MV criteria. Standard databases were searched for keywords relating to EV, MV, and bias-scoring from inception to Jan 2013. The tools identified and concepts described were pooled to assemble a robust tool for evaluating these quality criteria. Results. This study assembled a streamlined, objective tool to incorporate into the evaluation of the quality of EV/MV research that is more sensitive to CAM/IHC research. Conclusion. Improved reporting on EV can help produce and provide information that will help guide policy makers, public health researchers, and other scientists in the selection, development, and improvement of their research-tested interventions. Overall, clinical studies with high EV have the potential to provide the most useful information about “real-world” consequences of health interventions. It is hoped that this novel tool, which considers IV, EV, and MV on an equal footing, will better guide clinical decision making.

  15. Model development and optimization of operating conditions to maximize PEMFC performance by response surface methodology

    International Nuclear Information System (INIS)

    Kanani, Homayoon; Shams, Mehrzad; Hasheminasab, Mohammadreza; Bozorgnezhad, Ali

    2015-01-01

    Highlights: • The optimization of the operating parameters in a serpentine PEMFC is done using RSM. • The RSM model can predict the cell power over a wide range of operating conditions. • St-An, St-Ca and RH-Ca have optimum values for obtaining the best performance. • The interactions of the operating conditions affect the output power significantly. • The cathode and anode stoichiometries are the most influential parameters for the power. - Abstract: Optimization of operating conditions to obtain maximum power in PEMFCs could play a significant role in reducing the costs of this emerging technology. In the present experimental study, a single serpentine PEMFC is used to investigate the effects of operating conditions on the electrical power production of the cell. Four significant parameters, including cathode stoichiometry, anode stoichiometry, gas inlet temperature, and cathode relative humidity, are studied using Design of Experiments (DOE) to obtain optimal power. Central composite second-order Response Surface Methodology (RSM) is used to model the relationship between the goal function (power) and the considered input parameters (operating conditions). Using this statistical-mathematical method leads to a second-order equation for the cell power. This model considers interactions and quadratic effects of the different operating conditions and predicts the maximum or minimum power production over the entire working range of the parameters. In this range, high cathode stoichiometry together with low anode stoichiometry results in the minimum cell power; conversely, the medium range of fuel and oxidant stoichiometry leads to the maximum power. Results show that there are optimum values of the anode stoichiometry, cathode stoichiometry and relative humidity for reaching the best performance. The predictions of the model are evaluated by experimental tests and are in good agreement over the different ranges of the parameters

  16. Application of a Bayesian model for the quantification of the European methodology for qualification of non-destructive testing

    International Nuclear Information System (INIS)

    Gandossi, Luca; Simola, Kaisa; Shepherd, Barrie

    2010-01-01

    The European methodology for qualification of non-destructive testing is a well-established approach adopted by nuclear utilities in many European countries. According to this methodology, qualification is based on a combination of technical justification and practical trials. The methodology is qualitative in nature, and it does not give explicit guidance on how the evidence from the technical justification and results from trials should be weighted. A Bayesian model for the quantification process was presented in a previous paper, proposing a way to combine the 'soft' evidence contained in a technical justification with the 'hard' evidence obtained from practical trials. This paper describes the results of a pilot study in which such a Bayesian model was applied to two realistic Qualification Dossiers by experienced NDT qualification specialists. At the end of the study, recommendations were made and a set of guidelines was developed for the application of the Bayesian model.
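
    One common way to realize such a Bayesian combination — used here purely as an illustrative sketch, not necessarily the exact model of the paper — is a Beta-Binomial update: the technical justification is encoded as a Beta prior on the probability of detection (POD), and the practical-trial outcomes update it to a posterior. All numbers below are hypothetical.

```python
# Beta-Binomial sketch: "soft" evidence from the technical justification as a
# Beta prior on POD, updated with "hard" evidence from practical trials.
from scipy.stats import beta

# Prior worth roughly 20 pseudo-observations with expected POD ~ 0.9
# (hypothetical weighting of the technical justification).
a0, b0 = 18.0, 2.0

# Practical trials: e.g. 28 detections in 30 flawed specimens (hypothetical).
detections, trials = 28, 30
a1, b1 = a0 + detections, b0 + (trials - detections)

post = beta(a1, b1)
print(f"posterior mean POD = {post.mean():.3f}")
print(f"95% lower credible bound on POD = {post.ppf(0.05):.3f}")
```

    The prior's pseudo-observation count is where the weighting question raised above becomes explicit: a stronger technical justification corresponds to larger a0 + b0, so the trials move the posterior less.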

  17. Non-linear mixed effects modeling - from methodology and software development to driving implementation in drug development science.

    Science.gov (United States)

    Pillai, Goonaseelan Colin; Mentré, France; Steimer, Jean-Louis

    2005-04-01

    Few scientific contributions have made significant impact unless there was a champion who had the vision to see the potential for their use in seemingly disparate areas, and who then drove active implementation. In this paper, we present a historical summary of the development of non-linear mixed effects (NLME) modeling, up to the more recent extensions of this statistical methodology. The paper places strong emphasis on the pivotal role played by Lewis B. Sheiner (1940-2004), who used this statistical methodology to elucidate solutions to real problems identified in clinical practice and in medical research, and on how he drove implementation of the proposed solutions. A succinct overview of the evolution of the NLME modeling methodology is presented, as well as ideas on how its expansion helped to provide guidance for a more scientific view of (model-based) drug development that reduces empiricism in favor of critical quantitative thinking and decision making.

  18. Scenario Methodology for Modelling of Future Landscape Developments as Basis for Assessing Ecosystem Services

    Directory of Open Access Journals (Sweden)

    Matthias Rosenberg

    2014-04-01

    Full Text Available The ecosystems of our intensively used European landscapes produce a variety of natural goods and services for the benefit of humankind, and secure the basics and quality of life. Because these ecosystems are still undergoing fundamental changes, the interest of society is to know more about future developments and their ecological impacts. To describe and analyze these changes, scenarios can be developed, and an assessment of the ecological changes can be carried out subsequently. In the project „Landscape Saxony 2050“, a methodology for the construction of exploratory scenarios was worked out. The presented methodology provides a means to identify the driving forces (socio-cultural, economic and ecological conditions) of landscape development. It makes it possible to indicate possible future paths which lead to a change of structures and processes in the landscape and can influence the capability to provide ecosystem services. One essential component of the applied technique is that an approach for assessing the effects of landscape changes on ecosystem services is integrated into the developed scenario methodology. Another is that the methodology is designed to be strongly participatory, i.e. stakeholders are actively integrated. The method is a seven-phase model which provides the option of integrating stakeholder participation at all levels of scenario development. The scenario framework was applied to the district of Görlitz, an area of 2100 sq km located on the eastern border of Germany. The region is affected by strong demographic as well as economic changes. The core issue focused on the examination of landscape change in terms of biodiversity. Together with stakeholders, a trend scenario and two alternative scenarios were developed. The changes in the landscape structure are represented in storylines, maps and tables. On the basis of the driving forces of the issue areas „cultural / social values“ and

  19. A data-driven multi-model methodology with deep feature selection for short-term wind forecasting

    International Nuclear Information System (INIS)

    Feng, Cong; Cui, Mingjian; Hodge, Bri-Mathias; Zhang, Jie

    2017-01-01

    Highlights: • An ensemble model is developed to produce both deterministic and probabilistic wind forecasts. • A deep feature selection framework is developed to optimally determine the inputs to the forecasting methodology. • The developed ensemble methodology improves the forecasting accuracy by up to 30%. - Abstract: With growing wind penetration into power systems worldwide, improving wind power forecasting accuracy is becoming increasingly important to ensure continued economic and reliable power system operations. In this paper, a data-driven multi-model wind forecasting methodology is developed with a two-layer ensemble machine learning technique. The first layer is composed of multiple machine learning models that generate individual forecasts. A deep feature selection framework is developed to determine the most suitable inputs to the first-layer machine learning models. Then, a blending algorithm is applied in the second layer to create an ensemble of the forecasts produced by the first-layer models and generate both deterministic and probabilistic forecasts. This two-layer model seeks to utilize the statistically different characteristics of each machine learning algorithm. A number of machine learning algorithms are selected and compared in both layers. The developed multi-model wind forecasting methodology is compared to several benchmarks. The effectiveness of the proposed methodology is evaluated for 1-hour-ahead wind speed forecasting at seven locations of the Surface Radiation network. Numerical results show that, compared to the single-algorithm models, the developed multi-model framework with the deep feature selection procedure improves the forecasting accuracy by up to 30%.
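
    A minimal sketch of the two-layer ensemble, assuming synthetic data and omitting the deep feature selection step: first-layer learners produce out-of-fold forecasts, and a second-layer blender is trained on those forecasts.

```python
# Two-layer stacked ensemble: out-of-fold predictions of first-layer models
# become the features of a second-layer blender. Data are synthetic stand-ins
# for lagged wind observations, not the Surface Radiation network data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 6))                 # e.g. lagged speeds, direction
y = 8 + 2*X[:, 0] + np.sin(3*X[:, 1]) + 0.5*rng.normal(size=n)

layer1 = [RandomForestRegressor(n_estimators=200, random_state=0),
          GradientBoostingRegressor(random_state=0),
          Ridge(alpha=1.0)]

# Out-of-fold predictions avoid leaking training targets into the blender.
Z = np.column_stack([cross_val_predict(m, X, y, cv=5) for m in layer1])
blender = Ridge(alpha=1.0).fit(Z, y)
for m in layer1:
    m.fit(X, y)                             # refit on all data for deployment

x_new = X[:5]
z_new = np.column_stack([m.predict(x_new) for m in layer1])
print("blended 1-h-ahead forecasts:", np.round(blender.predict(z_new), 2))
```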

  20. A Comparison Study of a Generic Coupling Methodology for Modeling Wake Effects of Wave Energy Converter Arrays

    Directory of Open Access Journals (Sweden)

    Tim Verbrugghe

    2017-10-01

    Full Text Available Wave Energy Converters (WECs) need to be deployed in large numbers in an array layout in order to have a significant power production. Each WEC has an impact on the incoming wave field, by diffracting, reflecting and radiating waves. Simulating the wave transformations within and around a WEC array is complex; it is difficult, or in some cases impossible, to simulate both these near-field and far-field wake effects using a single numerical model in a way that is efficient in terms of computational time and effort. Within this research, a generic coupling methodology is developed to model both near-field and far-field wake effects caused by floating (e.g., WECs, platforms) or fixed offshore structures. The methodology is based on the coupling of a wave-structure interaction solver (Nemoh) and a wave propagation model. In this paper, this methodology is applied to two wave propagation models (OceanWave3D and MILDwave), which are compared to each other in a wide spectrum of tests. Additionally, the Nemoh-OceanWave3D model is validated by comparing it to experimental wave basin data. The methodology proves to be a reliable instrument to model wake effects of WEC arrays; results demonstrate a high degree of agreement between the numerical simulations, with relative errors lower than 5%, and to a lesser extent for the experimental data, where errors range from 4% to 17%.

  1. A methodology for constraining power in finite element modeling of radiofrequency ablation.

    Science.gov (United States)

    Jiang, Yansheng; Possebon, Ricardo; Mulier, Stefaan; Wang, Chong; Chen, Feng; Feng, Yuanbo; Xia, Qian; Liu, Yewei; Yin, Ting; Oyen, Raymond; Ni, Yicheng

    2017-07-01

    Radiofrequency ablation (RFA) is a minimally invasive thermal therapy for the treatment of cancer, hyperopia, and cardiac tachyarrhythmia. In RFA, the power delivered to the tissue is a key parameter. The objective of this study was to establish a methodology for the finite element modeling of RFA with constant power. Because of changes in the electric conductivity of tissue with temperature, a nonconventional boundary value problem arises in the mathematical modeling of RFA: neither the voltage (Dirichlet condition) nor the current (Neumann condition), but the power, that is, the product of voltage and current, was prescribed on part of the boundary. We solved the problem using a Lagrange multiplier: the product of the voltage and current on the electrode surface is constrained to be equal to the Joule heating. We theoretically proved the equality between the product of the voltage and current on the surface of the electrode and the Joule heating in the domain. We also proved the well-posedness of the problem of solving the Laplace equation for the electric potential under a constant power constraint prescribed on the electrode surface. The Pennes bioheat transfer equation and the Laplace equation for the electric potential, augmented with the constraint of constant power, were solved simultaneously using the Newton-Raphson algorithm. Three validation problems were solved. Numerical results were compared either with an analytical solution deduced in this study or with results obtained by ANSYS or experiments. This work provides the finite element modeling of constant power RFA with a firm mathematical basis and opens a pathway for achieving the optimal RFA power. Copyright © 2016 John Wiley & Sons, Ltd.
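
    In generic notation (a sketch of the constraint as the abstract describes it, not the paper's exact formulation), the electric subproblem couples a Laplace-type equation with a power equality enforced on the electrode:

        \nabla \cdot \left( \sigma(T)\, \nabla V \right) = 0 \quad \text{in } \Omega,
        \qquad
        \int_{\Omega} \sigma(T)\, \lvert \nabla V \rvert^{2} \, \mathrm{d}\Omega = U I = P_0 ,

    where U and I are the electrode voltage and current and P_0 is the prescribed power; a Lagrange multiplier enforces the equality, and the Pennes equation supplies the temperature field that feeds back through the conductivity sigma(T) in the Newton-Raphson iterations.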

  2. Modeling Electric Double-Layer Capacitors Using Charge Variation Methodology in Gibbs Ensemble

    Directory of Open Access Journals (Sweden)

    Ganeshprasad Pavaskar

    2018-01-01

    Full Text Available Supercapacitors deliver higher power than batteries and find applications in grid integration and electric vehicles. Recent work by Chmiola et al. (2006) has revealed an unexpected increase in the capacitance of porous carbon electrodes using ionic liquids as electrolytes. The work has generated curiosity among both experimentalists and theoreticians. Here, we have performed molecular simulations using a recently developed technique (Punnathanam, 2014) for simulating supercapacitor systems. In this technique, the two electrodes (containing electrolyte in a slit pore) are simulated in two different boxes using the Gibbs ensemble methodology. This reduces the number of particles required and the interfacial interactions, which helps in reducing the computational load. The method simulates an electric double-layer capacitor (EDLC) with macroscopic electrodes using much smaller system sizes. In addition, the charges on individual electrode atoms are allowed to vary in response to movement of electrolyte ions (i.e., the electrode is polarizable) while ensuring these atoms are at the same electric potential. We also present the application of our technique to EDLCs with the electrodes modeled as slit pores and as complex three-dimensional pore networks for different electrolyte geometries. The smallest pore geometry showed an increase in capacitance toward the potential of zero charge. This is in agreement with the new understanding of the electrical double layer in regions of dense ionic packing, as noted in Kornyshev’s theoretical model (Kornyshev, 2007), which showed a similar trend. This is not addressed by the classical Gouy–Chapman theory for the electric double layer. Furthermore, the electrode polarizability simulated in the model improved the accuracy of the calculated capacitance. However, its addition did not significantly alter the capacitance values in the voltage range considered.

  3. Adsorption of cellulase on cereal brans: a simple functional model from response surface methodology

    Directory of Open Access Journals (Sweden)

    Rui Sergio F. da Silva

    1980-11-01

    Full Text Available A functional model based on Langmuirian adsorption as a limiting mechanism was proposed to explain the effect of cellulase during the enzymatic pretreatment of bran, conducted prior to the extraction of proteins by a wet alkaline process from wheat and buckwheat bran materials. The proposed model provides a good fit (r = 0.99) for the data generated through the predictive model taken from response surface methodology, permitting calculation of an affinity constant (b) and a capacity constant (k) for wheat bran (b = 0.255 g/IU and k = 17.42%) and buckwheat bran (b = 0.066 g/IU and k = 78.74%), using an equation analogous to the Langmuir adsorption isotherm. The results indicated that buckwheat bran has a higher capacity to adsorb cellulase and, consequently, a greater response to pretreatment with this enzyme can be expected.
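
    Read as a standard Langmuir form, the reported constants imply a response of roughly R(C) = k·b·C/(1 + b·C) for cellulase dose C; a small Python check under that assumed form (the exact equation is inferred from the abstract, not quoted from the paper):

        # Langmuir-type response to cellulase dose C (IU/g), using the reported
        # affinity (b, g/IU) and capacity (k, %) constants for the two brans.
        def langmuir(c, b, k):
            return k * b * c / (1.0 + b * c)

        for name, b, k in [("wheat bran", 0.255, 17.42),
                           ("buckwheat bran", 0.066, 78.74)]:
            print(name, [round(langmuir(c, b, k), 2) for c in (5, 10, 20)])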

  4. Modeling of cryogenic frictional behaviour of titanium alloys using Response Surface Methodology approach

    International Nuclear Information System (INIS)

    El-Tayeb, N.S.M.; Yap, T.C.; Venkatesh, V.C.; Brevern, P.V.

    2009-01-01

    The potential of the cryogenic effect on the frictional behaviour of the newly developed titanium alloy Ti-5Al-4V-0.6Mo-0.4Fe (Ti54) sliding against tungsten carbide was investigated and compared with the conventional titanium alloy Ti6Al4V (Ti64). In this study, four models were developed to describe the interrelationship between the friction coefficient (response) and independent variables such as speed, load, and sliding distance (time). These variables were investigated using the design of experiments and the response surface methodology (RSM). By using this method, it was possible to study the effect of main and mixed (interaction) independent variables on the friction coefficient (COF) of both titanium alloys. Under cryogenic conditions, the friction coefficients of Ti64 and Ti54 behaved differently, i.e. an increase in the case of Ti64 and a decrease in the case of Ti54. For Ti64, at higher levels of load and speed, sliding in cryogenic conditions produced relatively higher friction coefficients compared to those obtained in dry air conditions. On the contrary, introduction of the cryogenic fluid reduced the friction coefficients of Ti54 at all tested conditions of load, speed, and time. The established models demonstrated that the mixed effects of load/speed, time/speed, and load/time consistently decrease the COF of Ti54. However, this was not the case for Ti64, where the COF increased by up to 20% when Ti64 was tested at higher levels of load and sliding time. Furthermore, the models indicated that the interaction of load and speed was the most effective for both Ti-alloys and had the most substantial influence on the friction. In addition, the COF for both alloys behaved linearly with speed but nonlinearly with load.
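
    The RSM models referred to in this and several later records are typically full second-order polynomials fitted by least squares; a generic sketch with synthetic data (the factors and coefficients are invented for illustration, not taken from the study):

        # Generic second-order response-surface fit for two factors.
        import numpy as np

        rng = np.random.default_rng(1)
        load = rng.uniform(10, 50, 30)               # factor x1
        speed = rng.uniform(0.5, 2.0, 30)            # factor x2
        cof = (0.3 + 0.002 * load - 0.05 * speed
               + 0.001 * load * speed + rng.normal(0, 0.01, 30))  # synthetic response

        # Design matrix: 1, x1, x2, x1*x2, x1^2, x2^2 (main, interaction, quadratic).
        X = np.column_stack([np.ones_like(load), load, speed,
                             load * speed, load ** 2, speed ** 2])
        beta, *_ = np.linalg.lstsq(X, cof, rcond=None)
        print(np.round(beta, 5))                     # fitted surface coefficients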

  5. Modelling of aflatoxin G1 reduction by kefir grain using response surface methodology.

    Science.gov (United States)

    Ansari, Farzaneh; Khodaiyan, Faramarz; Rezaei, Karamatollah; Rahmani, Anosheh

    2015-01-01

    Aflatoxin G1 (AFG1) is one of the main toxic contaminants in pistachio nuts and causes potential health hazards. Hence, AFG1 reduction is one of the main concerns in food safety. Kefir grains contain a symbiotic association of microorganisms well known for their aflatoxin decontamination effects. In this study, a central composite design (CCD) using response surface methodology (RSM) was applied to develop a model to predict AFG1 reduction in pistachio nuts by kefir grain (already heated at 70 and 110°C). The independent variables were: toxin concentration (X1: 5, 10, 15, 20 and 25 ng/g), kefir-grain level (X2: 5, 10, 15, 20 and 25%), contact time (X3: 0, 2, 4, 6 and 8 h), and incubation temperature (X4: 20, 30, 40, 50 and 60°C). There was a significant reduction in AFG1 (p < 0.05) with the kefir grain used. The variables X1 and X3, and the interactions between X2-X4 as well as X3-X4, had significant effects on AFG1 reduction. The model provided a good prediction of AFG1 reduction under the assay conditions. Optimization was used to enhance the efficiency of kefir grain on AFG1 reduction. The optimum conditions for the highest AFG1 reduction (96.8%) were predicted by the model as follows: toxin concentration = 20 ng/g, kefir-grain level = 10%, contact time = 6 h, and incubation temperature = 30°C, which was validated practically in six replications.

  6. Computer modelling of the UK wind energy resource. Phase 2. Application of the methodology

    Energy Technology Data Exchange (ETDEWEB)

    Burch, S F; Makari, M; Newton, K; Ravenscroft, F; Whittaker, J

    1993-12-31

    This report presents the results of the second phase of a programme to estimate the UK wind energy resource. The overall objective of the programme is to provide quantitative resource estimates using a mesoscale (resolution about 1 km) numerical model for the prediction of wind flow over complex terrain, in conjunction with digitised terrain data and wind data from surface meteorological stations. A network of suitable meteorological stations has been established and long term wind data obtained. Digitised terrain data for the whole UK were obtained, and wind flow modelling using the NOABL computer program has been performed. Maps of extractable wind power have been derived for various assumptions about wind turbine characteristics. Validation of the methodology indicates that the results are internally consistent, and in good agreement with available comparison data. Existing isovent maps, based on standard meteorological data which take no account of terrain effects, indicate that 10m annual mean wind speeds vary between about 4.5 and 7 m/s over the UK with only a few coastal areas over 6 m/s. The present study indicates that 28% of the UK land area had speeds over 6 m/s, with many hill sites having 10m speeds over 10 m/s. It is concluded that these 'first order' resource estimates represent a substantial improvement over the presently available 'zero order' estimates. The results will be useful for broad resource studies and initial site screening. Detailed resource evaluation for local sites will require more detailed local modelling or ideally long term field measurements. (12 figures, 14 tables, 21 references). (Author)
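
    The step from mean-speed maps to extractable power rests on the cubic dependence of wind power density on speed, which is why the difference between 6 and 10 m/s sites matters so much; a reminder sketch (real resource estimates integrate over the full speed distribution rather than using the mean alone):

        # Wind power density P/A = 0.5 * rho * v^3 (W/m^2).
        rho = 1.225                      # air density, kg/m^3 (sea-level standard)
        for v in (4.5, 6.0, 10.0):       # m/s, spanning the speeds quoted above
            print(v, round(0.5 * rho * v ** 3, 1))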

  7. Optimization of Maillard Reaction in Model System of Glucosamine and Cysteine Using Response Surface Methodology.

    Science.gov (United States)

    Arachchi, Shanika Jeewantha Thewarapperuma; Kim, Ye-Joo; Kim, Dae-Wook; Oh, Sang-Chul; Lee, Yang-Bong

    2017-03-01

    Sulfur-containing amino acids play important roles in good flavor generation in the Maillard reaction of non-enzymatic browning, so aqueous model systems of glucosamine and cysteine were studied to investigate the effects of reaction temperature, initial pH, reaction time, and concentration ratio of glucosamine and cysteine. Response surface methodology was applied to optimize the independent reaction parameters of cysteine and glucosamine in the Maillard reaction. A Box-Behnken factorial design was used with 30 runs of 16 factorial levels, 8 axial levels and 6 central levels. The degree of Maillard reaction was determined by reading absorption at 425 nm in a spectrophotometer and Hunter's L, a, and b values. ΔE was consequently set as the fifth response factor. In the statistical analyses, the determination coefficients (R²) for absorbance, Hunter's L, a, b values, and ΔE were 0.94, 0.79, 0.73, 0.96, and 0.79, respectively, showing that the absorbance and Hunter's b value were good dependent variables for this model system. The optimum processing parameters were determined to yield a glucosamine-cysteine Maillard reaction product with higher absorbance and higher colour change. The optimum estimated absorbance was achieved at the condition of initial pH 8.0, 111°C reaction temperature, 2.47 h reaction time, and 1.30 concentration ratio. The optimum condition for colour change measured by Hunter's b value was 2.41 h reaction time, 114°C reaction temperature, initial pH 8.3, and 1.26 concentration ratio. These results can provide basic information for the Maillard reaction of an aqueous model system of glucosamine and cysteine.

  8. A Model-Based Methodology for Simultaneous Design and Control of a Bioethanol Production Process

    DEFF Research Database (Denmark)

    Alvarado-Morales, Merlin; Abd.Hamid, Mohd-Kamaruddin; Sin, Gürkan

    2010-01-01

    . The PGC methodology is used to generate more efficient separation designs in terms of energy consumption by targeting the separation task at the largest DF. Both methodologies are highlighted through the application of two case studies, a bioethanol production process and a succinic acid production...

  9. Building a Continental Scale Land Cover Monitoring Framework for Australia

    Science.gov (United States)

    Thankappan, Medhavy; Lymburner, Leo; Tan, Peter; McIntyre, Alexis; Curnow, Steven; Lewis, Adam

    2012-04-01

    Land cover information is critical for national reporting and decision making in Australia. A review of information requirements for reporting on national environmental indicators identified the need for consistent land cover information to be compared against a baseline. A Dynamic Land Cover Dataset (DLCD) for Australia has recently been developed by Geoscience Australia and the Australian Bureau of Agricultural and Resource Economics and Sciences (ABARES) to provide a comprehensive and consistent land cover information baseline to enable monitoring and reporting for sustainable farming practices, water resource management, soil erosion, and forests at national and regional scales. The DLCD was produced from the analysis of Enhanced Vegetation Index (EVI) data at 250-metre resolution derived from the Moderate Resolution Imaging Spectroradiometer (MODIS) for the period from 2000 to 2008. The EVI time series for each pixel was modelled as 12 coefficients based on its statistical, phenological and seasonal characteristics. The time series were then clustered in coefficient space and labelled using ancillary information on vegetation and land use at the catchment scale. The accuracy of the DLCD was assessed using field survey data from over 25,000 locations provided by vegetation and land management agencies in State and Territory jurisdictions, and by ABARES. The DLCD is seen as the first in a series of steps to build a framework for national land cover monitoring in Australia. A robust methodology to provide annual updates to the DLCD is currently being developed at Geoscience Australia. There is also a growing demand from the user community for land cover information at better spatial resolution than currently available through the DLCD. Global land cover mapping initiatives that rely on Earth observation data offer many opportunities for national and international programs to work in concert and deliver better outcomes by streamlining efforts on development and

  10. A three-compartment model for micropollutants sorption in sludge: methodological approach and insights.

    Science.gov (United States)

    Barret, Maialen; Patureau, Dominique; Latrille, Eric; Carrère, Hélène

    2010-01-01

    In sludge resulting from wastewater treatment, organic micropollutants sorb to particles and to dissolved/colloidal matter (DCM). Both interactions may influence their physical and biological fate throughout the wastewater treatment processes. To our knowledge, sludge has never been considered as a three-compartment matrix, in which micropollutants coexist in three states: freely dissolved, sorbed-to-particles and sorbed-to-DCM. A methodology is proposed to concomitantly determine equilibrium constants of sorption to particles (Kpart) and to DCM (KDCM). Polycyclic Aromatic Hydrocarbons (PAHs) were chosen as model compounds for the experiments. The logarithm of estimated equilibrium constants ranged from 3.1 to 4.3 and their usual correlation to PAH hydrophobicity was verified. Moreover, PAH affinities for particles and for DCM could be compared. Affinity for particles was found to be stronger, probably due to their physical and chemical characteristics. This work provided a useful tool to assess the freely dissolved, sorbed-to-particles and sorbed-to-DCM concentrations of contaminants, which are necessary to accurately predict their fate. Besides, guidelines to investigate the link between sorption and the fundamental concept of bioavailability were proposed. (c) 2009 Elsevier Ltd. All rights reserved.

  11. Modeling and Design Analysis Methodology for Tailoring of Aircraft Structures with Composites

    Science.gov (United States)

    Rehfield, Lawrence W.

    2004-01-01

    Composite materials provide design flexibility in that fiber placement and orientation can be specified and a variety of material forms and manufacturing processes are available. It is possible, therefore, to 'tailor' the structure to a high degree in order to meet specific design requirements in an optimum manner. Common industrial practices, however, have limited the choices designers make. One of the reasons for this is that there is a dearth of conceptual/preliminary design analysis tools specifically devoted to identifying structural concepts for composite airframe structures. Large scale finite element simulations are not suitable for such purposes. The present project has been devoted to creating modeling and design analysis methodology for use in the tailoring process of aircraft structures. Emphasis has been given to creating bend-twist elastic coupling in high aspect ratio wings or other lifting surfaces. The direction of our work was in concert with the overall NASA effort Twenty-First Century Aircraft Technology (TCAT). A multi-disciplinary team was assembled by Dr. Damodar Ambur to work on wing technology, which included our project.

  12. MAP: an iterative experimental design methodology for the optimization of catalytic search space structure modeling.

    Science.gov (United States)

    Baumes, Laurent A

    2006-01-01

    One of the main problems in high-throughput research for materials is still the design of experiments. At the early stages of discovery programs, purely exploratory methodologies coupled with fast screening tools should be employed. This should lead to opportunities to find unexpected catalytic results and identify the "groups" of catalyst outputs, providing well-defined boundaries for future optimizations. However, very few new papers deal with strategies that guide exploratory studies. Mostly, traditional designs, homogeneous covering, or simple random samplings are exploited. Typical catalytic output distributions exhibit unbalanced datasets on which efficient learning is hard to carry out, and interesting but rare classes usually go unrecognized. Here a new iterative algorithm is suggested for the characterization of the search space structure, working independently of learning processes. It enhances recognition rates by transferring catalysts to be screened from "performance-stable" space zones to "unsteady" ones which require more experiments to be well-modeled. The evaluation of new algorithms through benchmarks is compulsory due to the lack of past proof of their efficiency. The method is detailed and thoroughly tested with mathematical functions exhibiting different levels of complexity. The strategy is not only evaluated empirically; the effect of sampling on future machine learning performance is also quantified. The minimum sample size required by the algorithm to be statistically discriminated from simple random sampling is investigated.

  13. [Proposed difficult airway teaching methodology. Presentation of an interactive fresh frozen cadaver model].

    Science.gov (United States)

    Catalá Bauset, J C; de Andres Ibañez, J A; Valverde Navarro, A; Martinez Soriano, F

    2014-04-01

    The aim of this paper is to present a methodology based on the use of fresh-frozen cadavers for training in the management of the airway, and to evaluate the degree of satisfaction among the physicians being trained. Six fresh-frozen cadavers and 14 workstations were prepared, where participants were trained in the different skills needed for airway management. The details of the preparation of the cadavers are described. The level of satisfaction of the participants was determined using a 5-point Likert rating scale at each of the 14 stations, as well as for the overall assessment and clinical usefulness of the course. The mean overall evaluations of the course and its usefulness were 4.75 and 4.9 out of 5, respectively. All parts of the course were rated above 4 out of 5. The high level of satisfaction with the course remained homogeneous across the 2 editions analysed. The overall satisfaction with the course was not determined by any single one of its parts. The fresh cadaver model for training physicians in techniques of airway management is a proposal that satisfies participants, with a realism that approaches the live patient. Copyright © 2013 Sociedad Española de Anestesiología, Reanimación y Terapéutica del Dolor. Published by Elsevier España. All rights reserved.

  14. Optimization and Modeling of Process Variables of Biodiesel Production from Marula Oil using Response Surface Methodology

    International Nuclear Information System (INIS)

    Enweremadu, C. C.; Rutto, H. L.

    2015-01-01

    This paper presents an optimization study of biodiesel production from Marula oil. The study was carried out using a central composite design of experiments under response surface methodology. A mathematical model was developed to correlate the transesterification process variables to biodiesel yield. The transesterification reaction variables were methanol-to-oil ratio, x1 (10-50 wt%), reaction time, x2 (30-90 min), reaction temperature, x3 (30-90 °C), stirring speed, x4 (100-400 rpm) and amount of catalyst, x5 (0.5-1.5 g). The optimum conditions for the production of the biodiesel were found to be methanol-to-oil ratio (29.43 wt%), reaction time (59.17 minutes), reaction temperature (58.80 °C), stirring speed (325 rpm) and amount of catalyst (1.02 g). The optimum yield of biodiesel that can be produced was 95%. The results revealed that the crucial fuel properties of the biodiesel produced at the optimum conditions met the ASTM biodiesel specifications. (author)

  15. Methodological application so as to obtain digital elevation models DEM in wetland areas

    International Nuclear Information System (INIS)

    Quintero, Deiby A; Montoya V, Diana M; Betancur, Teresita

    2009-01-01

    In order to understand hydrological systems and describe the flow processes that occur among their components, it is essential to have a physiographic description that includes morphometric and relief characteristics. When local studies are performed, the basic cartography available, at best at 1:25,000 scale, tends not to meet the needs of representing the water dynamics that characterize the interactions between streams, aquifers and lenticular water bodies in flat zones, particularly in those where wetlands are localized in the ancient flood plains of rivers. A lack of financial resources is the principal obstacle to acquiring information that is current and sufficient for the scale of the project. The geomorphological conditions of flat relief zones are a good alternative for the construction of new data. Using the available basic cartography and the new data, it is possible to obtain DEMs that are improved and consistent with the dynamics of surface and groundwater flows in the hydrological system. To accomplish this, one must use spatial modeling tools coupled with a Geographic Information System (GIS). This article presents a methodological application for the region surrounding the catchment of the Cienaga Colombia wetland in the Bajo Cauca region of Antioquia.

  16. Multiphysics Simulation of Welding-Arc and Nozzle-Arc System: Mathematical-Model, Solution-Methodology and Validation

    Science.gov (United States)

    Pawar, Sumedh; Sharma, Atul

    2018-01-01

    This work presents a mathematical model and solution methodology for a multiphysics engineering problem on arc formation during welding and inside a nozzle. A general-purpose commercial CFD solver, ANSYS FLUENT 13.0.0, is used in this work. Arc formation involves strongly coupled gas dynamics and electro-dynamics, simulated by solution of the coupled Navier-Stokes equations, Maxwell's equations and the radiation heat-transfer equation. Validation of the present numerical methodology is demonstrated by excellent agreement with published results. The developed mathematical model and the user defined functions (UDFs) are independent of the geometry and are applicable to any system that involves arc formation, in a 2D axisymmetric coordinate system. The high-pressure flow of SF6 gas in the nozzle-arc system resembles the arc chamber of an SF6 gas circuit breaker; thus, this methodology can be extended to simulate the arcing phenomenon during current interruption.

  17. Isoprene and monoterpene emissions in south-east Australia: comparison of a multi-layer canopy model with MEGAN and with atmospheric observations

    Directory of Open Access Journals (Sweden)

    K. M. Emmerson

    2018-05-01

    Full Text Available One of the key challenges in atmospheric chemistry is to reduce the uncertainty of biogenic volatile organic compound (BVOC) emission estimates from vegetation to the atmosphere. In Australia, eucalypt trees are a primary source of biogenic emissions, but their contribution to Australian air sheds is poorly quantified. The Model of Emissions of Gases and Aerosols from Nature (MEGAN) has performed poorly against Australian isoprene and monoterpene observations. Finding reasons for the MEGAN discrepancies and strengthening our understanding of biogenic emissions in this region is our focus. We compare MEGAN to the locally produced Australian Biogenic Canopy and Grass Emissions Model (ABCGEM) to identify the uncertainties associated with the emission estimates and the data requirements necessary to improve isoprene and monoterpene emission estimates for the application of MEGAN in Australia. Previously unpublished, ABCGEM is applied as an online biogenic emissions inventory to model BVOCs in the air shed overlying Sydney, Australia. The two models use the same meteorological inputs and chemical mechanism, but independent inputs of leaf area index (LAI), plant functional type (PFT) and emission factors. We find that LAI, a proxy for leaf biomass, plays a small role in spatial, temporal and inter-model biogenic emission variability, particularly in urban areas for ABCGEM. After removing LAI as the source of the differences, we found large differences in the emission activity function for monoterpenes. In MEGAN monoterpenes are partially light dependent, reducing their dependence on temperature. In ABCGEM monoterpenes are not light dependent, meaning they continue to be emitted at high rates during hot summer days, and at night. When the light dependence of monoterpenes is switched off in MEGAN, night-time emissions increase by 90–100%, improving the comparison with observations, suggesting the possibility that monoterpenes emitted from Australian
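
    The night-time behaviour described here can be sketched with Guenther-style activity factors: a temperature-only factor stays positive in darkness, while a light-dependent factor vanishes. The constants below are generic placeholder values, not MEGAN's or ABCGEM's actual parameterisations.

        # Temperature-only vs partially light-dependent monoterpene activity.
        import math

        def gamma_temp(t_k, beta=0.09, t_ref=303.0):
            # Exponential temperature response (Guenther-style).
            return math.exp(beta * (t_k - t_ref))

        def gamma_light(par):
            # Simplified light response; zero in darkness (placeholder constants).
            alpha, c1 = 0.0027, 1.066
            return c1 * alpha * par / math.sqrt(1.0 + alpha ** 2 * par ** 2)

        t_night, ldf = 293.0, 0.6        # ldf: assumed light-dependent fraction
        temp_only = gamma_temp(t_night)  # temperature-only model still emits at night
        partial = gamma_temp(t_night) * ((1 - ldf) + ldf * gamma_light(0.0))
        print(temp_only, partial)        # light dependence suppresses night emission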

  18. A comparative review of multi-risk modelling methodologies for climate change adaptation in mountain regions

    Science.gov (United States)

    Terzi, Stefano; Torresan, Silvia; Schneiderbauer, Stefan

    2017-04-01

    Keywords: Climate change, mountain regions, multi-risk assessment, climate change adaptation. Climate change has already led to a wide range of impacts on the environment, the economy and society. Adaptation actions are needed to cope with the impacts that have already occurred (e.g. storms, glacier melting, floods, droughts) and to prepare for future scenarios of climate change. The mountain environment is particularly vulnerable to climate change due to its exposure to recent climate warming (e.g. water regime changes, thawing of permafrost) and due to the high degree of specialization of both natural and human systems (e.g. alpine species, valley population density, tourism-based economy). As a consequence, local mountain governments are encouraged to undertake territorial governance policies for climate change, considering multiple risks and opportunities for the mountain economy and identifying the best portfolio of adaptation strategies. This study aims to provide a literature review of available qualitative and quantitative tools, methodological guidelines and best practices for conducting multi-risk assessments in the mountain environment within the context of climate change. We analyzed multi-risk modelling and assessment methods applied in alpine regions (e.g. event trees, Bayesian networks, agent-based models) in order to identify key concepts (exposure, resilience, vulnerability, risk, adaptive capacity), climatic drivers, cause-effect relationships and socio-ecological systems to be integrated in a comprehensive framework. The main outcomes of the review, including a comparison of existing techniques based on different criteria (e.g. scale of analysis, targeted questions, level of complexity) and a snapshot of the developed multi-risk framework for climate change adaptation, are presented and discussed.

  19. Reliability of Soft Tissue Model Based Implant Surgical Guides; A Methodological Mistake.

    Science.gov (United States)

    Sabour, Siamak; Dastjerdi, Elahe Vahid

    2012-08-20

    Abstract We were interested to read the paper by Maney P and colleagues published in the July 2012 issue of J Oral Implantol. The authors aimed to assess the reliability of soft tissue model based implant surgical guides and reported that the accuracy was evaluated using software. 1 I found the manuscript title of Maney P, et al. incorrect and misleading. Moreover, they reported that twenty-two sites (46.81%) were considered accurate (13 of 24 maxillary and 9 of 23 mandibular sites). As the authors point out in their conclusion, soft tissue models do not always provide sufficient accuracy for implant surgical guide fabrication. Reliability (precision) and validity (accuracy) are two different methodological issues in research. Sensitivity, specificity, PPV, NPV, likelihood ratio positive (true positive/false negative) and likelihood ratio negative (false positive/true negative), as well as the odds ratio (true results/false results - preferably more than 50), are among the tests used to evaluate the validity (accuracy) of a single test compared to a gold standard.2-4 It is not clear to which of the above-mentioned estimates for validity analysis the reported twenty-two accurate sites (46.81%) relate. Reliability (repeatability or reproducibility) is assessed by different statistical tests; using the Pearson r, least squares or the paired t-test for this purpose is among the common mistakes in reliability analysis.5 Briefly, for quantitative variables the Intra Class Correlation Coefficient (ICC) and for qualitative variables weighted kappa should be used, with caution, because kappa has its own limitations too. Regarding reliability or agreement, it is good to know that for computing the kappa value only concordant cells are considered, whereas discordant cells should also be taken into account in order to reach a correct estimation of agreement (weighted kappa).2-4 As a take-home message, for reliability and validity analysis, appropriate tests should be

  20. A new general methodology for incorporating physico-chemical transformations into multi-phase wastewater treatment process models.

    Science.gov (United States)

    Lizarralde, I; Fernández-Arévalo, T; Brouckaert, C; Vanrolleghem, P; Ikumi, D S; Ekama, G A; Ayesa, E; Grau, P

    2015-05-01

    This paper introduces a new general methodology for incorporating physico-chemical and chemical transformations into multi-phase wastewater treatment process models in a systematic and rigorous way under a Plant-Wide modelling (PWM) framework. The methodology presented in this paper requires the selection of the relevant biochemical, chemical and physico-chemical transformations taking place and the definition of the mass transport for the co-existing phases. As an example a mathematical model has been constructed to describe a system for biological COD, nitrogen and phosphorus removal, liquid-gas transfer, precipitation processes, and chemical reactions. The capability of the model has been tested by comparing simulated and experimental results for a nutrient removal system with sludge digestion. Finally, a scenario analysis has been undertaken to show the potential of the obtained mathematical model to study phosphorus recovery. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. A Modeling methodology for NoSQL Key-Value databases

    Directory of Open Access Journals (Sweden)

    Gerardo ROSSEL

    2017-08-01

    Full Text Available In recent years, there has been an increasing interest in the field of non-relational databases. However, far too little attention has been paid to design methodology. Key-value data stores are an important component of a class of non-relational technologies that are grouped under the name of NoSQL databases. The aim of this paper is to propose a design methodology for this type of database that makes it possible to overcome the limitations of traditional techniques. The proposed methodology leads to a clean design that also allows for better data management and consistency.
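
    A common ingredient of key-value design methodologies is to model around access paths rather than normalised entities, encoding the query into a composite key and denormalising the value; a toy sketch, with an in-memory dict standing in for a real key-value store (the key scheme is illustrative, not the paper's):

        # Query-first key design: the key encodes the access path, and the
        # value is a denormalised document answering that query directly.
        store = {}

        def put_order(customer_id, order_id, order):
            store[f"customer:{customer_id}:order:{order_id}"] = order

        def orders_for_customer(customer_id):
            prefix = f"customer:{customer_id}:order:"
            return [v for k, v in store.items() if k.startswith(prefix)]

        put_order("c42", "o1", {"total": 99.5, "items": ["book"]})
        put_order("c42", "o2", {"total": 15.0, "items": ["pen"]})
        print(orders_for_customer("c42"))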

  2. A robust methodology for kinetic model parameter estimation for biocatalytic reactions

    DEFF Research Database (Denmark)

    Al-Haque, Naweed; Andrade Santacoloma, Paloma de Gracia; Lima Afonso Neto, Watson

    2012-01-01

    lead to globally optimized parameter values. In this article, a robust methodology to estimate parameters for biocatalytic reaction kinetic expressions is proposed. The methodology determines the parameters in a systematic manner by exploiting the best features of several of the current approaches...... parameters, which are strongly correlated with each other. State-of-the-art methodologies such as nonlinear regression (using progress curves) or graphical analysis (using initial rate data, for example, the Lineweaver-Burk plot, Hanes plot or Dixon plot) often incorporate errors in the estimates and rarely
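
    A typical building block of such parameter estimation is nonlinear regression of rate data against a Michaelis-Menten-type expression, with the covariance of the estimates exposing the parameter correlation the record mentions; a minimal sketch on synthetic data, assuming scipy is available (the paper's methodology combines several approaches beyond this single fit):

        # Nonlinear least-squares estimation of Michaelis-Menten parameters.
        import numpy as np
        from scipy.optimize import curve_fit

        def mm_rate(s, vmax, km):
            return vmax * s / (km + s)

        s = np.array([0.5, 1, 2, 5, 10, 20, 50.0])          # substrate conc.
        rng = np.random.default_rng(2)
        v = mm_rate(s, 4.2, 3.0) * (1 + rng.normal(0, 0.03, s.size))  # noisy rates

        (vmax_hat, km_hat), cov = curve_fit(mm_rate, s, v, p0=[1.0, 1.0])
        print(vmax_hat, km_hat)          # point estimates
        print(np.sqrt(np.diag(cov)))     # standard errors: identifiability check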

  3. Methodologies Related to Computational models in View of Developing Anti-Alzheimer Drugs: An Overview.

    Science.gov (United States)

    Baheti, Kirtee; Kale, Mayura Ajay

    2018-04-17

    carried out on various heterocyclic scaffolds that can serve as lead compounds for the design of anti-Alzheimer drugs in the future. Molecular modeling methods can thus become a better alternative for the discovery of newer anti-Alzheimer agents. This methodology is extremely useful for designing drugs in minimal time, with enhanced activity, while keeping ethical considerations balanced. Thus, researchers are opting for this improved process over conventional methods, hoping to achieve a reliable way to relieve the suffering of people affected by Alzheimer's as well as other diseases. Copyright © Bentham Science Publishers; for any queries, please email epub@benthamscience.org.

  4. Soft Systems Methodology and Problem Framing: Development of an Environmental Problem Solving Model Respecting a New Emergent Reflexive Paradigm.

    Science.gov (United States)

    Gauthier, Benoit; And Others

    1997-01-01

    Identifies the more representative problem-solving models in environmental education. Suggests the addition of a strategy for defining a problem situation using Soft Systems Methodology to environmental education activities explicitly designed for the development of critical thinking. Contains 45 references. (JRH)

  5. Application of the methodology of safety probabilistic analysis to the modelling the emergency feedwater system of Juragua nuclear power plant

    International Nuclear Information System (INIS)

    Troncoso, M.; Oliva, G.

    1993-01-01

    The application of the methodology developed in the framework of the national plan for probabilistic safety analysis (APS) to the emergency feedwater system, for small-LOCA failures and loss of the external electrical supply at the Juragua nuclear power plant, is illustrated in this work. The facilities provided by the ARCON code for modelling the systems and documenting them are also described.
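
    PSA system models of this kind are typically fault trees evaluated from basic-event probabilities; a toy sketch of the AND/OR gate arithmetic (structure and numbers invented for illustration, not taken from the Juragua analysis):

        # Toy fault-tree arithmetic for a two-train emergency feedwater system.
        def or_gate(*p):    # union of independent failure events
            q = 1.0
            for x in p:
                q *= 1.0 - x
            return 1.0 - q

        def and_gate(*p):   # intersection of independent failure events
            q = 1.0
            for x in p:
                q *= x
            return q

        pump_fail, valve_fail = 1e-3, 5e-4
        train = or_gate(pump_fail, valve_fail)   # train fails if pump OR valve fails
        system = and_gate(train, train)          # system fails if both trains fail
        print(train, system)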

  6. The Impact of e-Skills on the Settlement of Iranian Refugees in Australia

    Directory of Open Access Journals (Sweden)

    Saeed Shariati

    2017-04-01

    Full Text Available Aim/Purpose: The research investigates the impact of Information and Communication Technologies (ICT) on Iranian refugees’ settlement in Australia. Background: The study identifies issues of settlement, such as language, cultural and social differences. Methodology: Multi-Sited Ethnography (MSE), a qualitative methodology, has been used with a thematic analysis drawing on a series of semi-structured interviews with two groups of participants (51 Iranian refugees and 55 people with a role in assisting refugees). Contribution: The research findings may enable the creation of a model for use by the Australian Government with Iranian refugees. Findings: The findings show the vital role ICT play in refugees’ ongoing day-to-day life towards settlement. Recommendations for Practitioners: The results from this paper could be generalised to other groups of refugees in Australia and could also be used for Iranian refugees in other countries. Recommendation for Researchers: Researchers may use a similar study for refugees of different backgrounds in Australia and around the world. Impact on Society: ICT may assist refugees to become less isolated, less marginalized and part of mainstream society. Future Research: Future research could look into the digital divide between refugees in Australia and mainstream Australians.

  7. Model-based Organization Manning, Strategy, and Structure Design via Team Optimal Design (TOD) Methodology

    National Research Council Canada - National Science Library

    Levchuk, Georgiy; Chopra, Kari; Paley, Michael; Levchuk, Yuri; Clark, David

    2005-01-01

    This paper describes a quantitative Team Optimal Design (TOD) methodology and its application to the design of optimized manning for E-10 Multi-sensor Command and Control Aircraft. The E-10 (USAF, 2002...

  8. Operation room tool handling and miscommunication scenarios: an object-process methodology conceptual model.

    Science.gov (United States)

    Wachs, Juan P; Frenkel, Boaz; Dori, Dov

    2014-11-01

    Errors in the delivery of medical care are the principal cause of inpatient mortality and morbidity, accounting for around 98,000 deaths in the United States of America (USA) annually. Ineffective team communication, especially in the operating room (OR), is a major root cause of these errors. This miscommunication can be reduced by analyzing and constructing a conceptual model of communication and miscommunication in the OR. We introduce the principles underlying Object-Process Methodology (OPM)-based modeling of the intricate interactions between the surgeon and the surgical technician while handling surgical instruments in the OR. This model is a software- and hardware-independent description of the agents engaged in communication events, their physical activities, and their interactions. The model enables assessing whether the task-related objectives of the surgical procedure were achieved and completed successfully and what errors can occur during the communication. The facts used to construct the model were gathered from observations of various types of miscommunication in the operating room and their outcomes. The model takes advantage of the compact ontology of OPM, which comprises stateful objects - things that exist physically or informatically - and processes - things that transform objects by creating them, consuming them or changing their state. The modeled communication modalities are verbal and non-verbal, and errors are modeled as processes that deviate from the "sunny day" scenario. Using the OPM refinement mechanism of in-zooming, key processes are drilled into and elaborated, along with the objects that are required as agents or instruments, or that these processes transform. The model was developed through an iterative process of observation, modeling, group discussions, and simplification. The model faithfully represents the processes related to tool handling that take place in an OR during an operation. The specification is at

  9. Energy in Australia 2011

    International Nuclear Information System (INIS)

    Cuevas-Cubria, C.; Schultz, A.; Petchey, R.; Beaini, F.; New, R.

    2011-04-01

    Securing access to affordable, reliable and clean energy is one of the great challenges facing governments around the world. The Australian Government is committed to ensuring the security of Australia's domestic energy systems as a fundamental part of Australia's social and economic prosperity. Energy in Australia 2011 is a key reference for anyone with an interest in Australian energy issues. It provides a detailed overview of energy in Australia from production to consumption, and serves as a useful resource to inform industry, government and the community.

  10. How evidence-based workforce planning in Australia is informing policy development in the retention and distribution of the health workforce.

    Science.gov (United States)

    Crettenden, Ian F; McCarty, Maureen V; Fenech, Bethany J; Heywood, Troy; Taitz, Michelle C; Tudman, Sam

    2014-02-03

    Australia's health workforce is facing significant challenges now and into the future. Health Workforce Australia (HWA) was established by the Council of Australian Governments as the national agency to progress health workforce reform to address the challenges of providing a skilled, innovative and flexible health workforce in Australia. HWA developed Australia's first major, long-term national workforce projections for doctors, nurses and midwives over a planning horizon to 2025 (called Health Workforce 2025; HW 2025), which provided a national platform for developing policies to help ensure Australia's health workforce meets the community's needs. A review of existing workforce planning methodologies, in concert with the project brief and an examination of data availability, identified that the best fit-for-purpose workforce planning methodology was the stock and flow model for estimating workforce supply and the utilisation method for estimating workforce demand. Scenario modelling was conducted to explore the implications of possible alternative futures, and to demonstrate the sensitivity of the model to various input parameters. Extensive consultation was conducted to test the methodology, data and assumptions used, and also influenced the scenarios selected for modelling. Additionally, a number of other key principles were adopted in developing HW 2025 to ensure the workforce projections were robust and able to be applied nationally. The findings from HW 2025 highlighted that a 'business as usual' approach to Australia's health workforce is not sustainable over the next 10 years, with a need for co-ordinated, long-term reforms by government, professions and the higher education and training sector for a sustainable and affordable health workforce. The main policy levers identified to achieve change were innovation and reform, immigration, training capacity and efficiency and workforce distribution. While HW 2025 has provided a national platform for health
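
    The supply side of a stock-and-flow projection reduces to a simple recurrence: next year's stock equals this year's stock plus inflows (graduates, migration) minus outflows (retirement and other exits); a schematic sketch with invented numbers, not HW 2025's data:

        # Schematic stock-and-flow projection of workforce headcount.
        stock = 300_000                      # current registered workforce (invented)
        graduates, migration = 9_000, 3_000  # annual inflows (invented)
        exit_rate = 0.035                    # annual outflow fraction (invented)

        for year in range(2025, 2031):
            stock = stock + graduates + migration - exit_rate * stock
            print(year, round(stock))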

  11. Development and application of a statistical methodology to evaluate the predictive accuracy of building energy baseline models

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Energy Technologies Area Div.; Price, Phillip N. [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Energy Technologies Area Div.

    2014-03-01

    This paper documents the development and application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings. The methodology complements the principles addressed in resources such as ASHRAE Guideline 14 and the International Performance Measurement and Verification Protocol. It requires fitting a baseline model to data from a "training period" and using the model to predict total electricity consumption during a subsequent "prediction period." We illustrate the methodology by evaluating five baseline models using data from 29 buildings. The training period and prediction period were varied, and model predictions of daily, weekly, and monthly energy consumption were compared to meter data to determine model accuracy. Several metrics were used to characterize the accuracy of the predictions, and in some cases the best-performing model as judged by one metric was not the best performer when judged by another metric.
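
    The accuracy metrics typically used in this setting include the normalised mean bias error (NMBE) and the coefficient of variation of the RMSE, as in ASHRAE Guideline 14; a minimal sketch of computing them for a baseline model's prediction-period output (the metric set in the paper may differ):

        # Prediction-period accuracy metrics for a baseline energy model.
        import numpy as np

        def nmbe(actual, predicted):
            return (predicted - actual).sum() / actual.sum() * 100.0

        def cv_rmse(actual, predicted):
            rmse = np.sqrt(np.mean((predicted - actual) ** 2))
            return rmse / actual.mean() * 100.0

        actual = np.array([120.0, 135, 150, 142, 128])     # metered daily kWh
        predicted = np.array([118.0, 140, 147, 145, 125])  # baseline model output
        print(nmbe(actual, predicted), cv_rmse(actual, predicted))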

  12. Bioregional Assessments: Determining the Impacts of Coal Resource Development on Water Resources in Australia through Groundwater, Surface Water and Ecological Modelling

    Science.gov (United States)

    Peeters, L. J.; Post, D. A.; Crosbie, R.; Holland, K.

    2017-12-01

    While extraction of methane from shale gas deposits has been the principal source of the recent expansion of the industry in the United States, in Australia extraction of methane from coal bed methane deposits (termed 'coal seam gas' in Australia) has been the focus to date. The two sources of methane share many of the same characteristics, including the potential requirement for hydraulic fracturing. However, as coal seam gas deposits generally occur at shallower depths than shale gas, the potential impacts of extraction on surface and groundwater resources may be of even greater concern. The Australian Federal Government commissioned a multi-disciplinary programme of bioregional assessments to improve understanding of the potential impacts of coal seam gas and large coal mining activities on water resources and water-dependent assets across six bioregions in Australia. A bioregional assessment is a transparent scientific analysis of the ecology, hydrology, geology and hydrogeology of a bioregion with explicit assessment of the potential direct, indirect and cumulative impacts of coal seam gas and large coal mining development on water resources. The first step in the analysis is to establish the most likely scenario for coal development in each region and establish a causal pathway linking coal development to impacts on the social, economic and ecological functioning of water resources. This forms the basis for a sequence of probabilistic geological, hydrogeological, hydrological and ecological models to quantify the probability of potential impacts. This suite of models is developed independently of the proponents and regulators of coal resource developments and so can provide unbiased information to all stakeholders. To demonstrate transparency of the modelling, all inputs, outputs and executables will be available from http://www.bioregionalassessments.gov.au. The analysis delineated a zone of potential hydrological change for each region, outside of which impacts

  13. Modeling and optimization of ethanol fermentation using Saccharomyces cerevisiae: Response surface methodology and artificial neural network

    Directory of Open Access Journals (Sweden)

    Esfahanian Mehri

    2013-01-01

    Full Text Available In this study, the capabilities of response surface methodology (RSM) and artificial neural networks (ANN) for modeling and optimization of ethanol production from glucose using Saccharomyces cerevisiae in a batch fermentation process were investigated. The effect of three independent variables in a defined range of pH (4.2-5.8), temperature (20-40ºC) and glucose concentration (20-60 g/l) on cell growth and ethanol production was evaluated. Results showed that the prediction accuracy of ANN was apparently similar to RSM. At the optimum condition of temperature (32°C), pH (5.2) and glucose concentration (50 g/l) suggested by the statistical methods, the maximum cell dry weight and ethanol concentration obtained from RSM were 12.06 and 16.2 g/l whereas the experimental values were 12.09 and 16.53 g/l, respectively. The present study showed that using ANN as the fitness function, the maximum cell dry weight and ethanol concentration were 12.05 and 16.16 g/l, respectively. Also, the coefficients of determination for biomass and ethanol concentration obtained from RSM were 0.9965 and 0.9853, and from ANN were 0.9975 and 0.9936, respectively. The process parameter optimization was successfully conducted using RSM and ANN; however, prediction by ANN was slightly more precise than by RSM. Based on experimental data, a maximum yield of ethanol production of 0.5 g ethanol/g substrate (97% of theoretical yield) was obtained.

  14. Precarious Rock Methodology for Seismic Hazard: Physical Testing, Numerical Modeling and Coherence Studies

    Energy Technology Data Exchange (ETDEWEB)

    Anooshehpoor, Rasool; Purvance, Matthew D.; Brune, James N.; Preston, Leiph A.; Anderson, John G.; Smith, Kenneth D.

    2006-09-29

    This report covers the following projects: shake table tests of the precarious rock methodology, field tests of precarious rocks at Yucca Mountain and comparison of the results with PSHA predictions, a study of the coherence of the wave field in the ESF, and a limited survey of precarious rocks south of the proposed repository footprint. A series of shake table experiments has been carried out at the University of Nevada, Reno Large Scale Structures Laboratory. The bulk of the experiments involved scaling acceleration time histories (uniaxial forcing) from 0.1g to the point where the objects on the shake table overturned a specified number of times. The results of these experiments have been compared with numerical overturning predictions. Numerical predictions for toppling of large objects with simple contact conditions (e.g., I-beams with sharp basal edges) agree well with shake-table results. The numerical model slightly underpredicts the overturning of small rectangular blocks. It overpredicts the overturning PGA for asymmetric granite boulders with complex basal contact conditions. In general the results confirm the approximate predictions of previous studies. Field testing of several rocks at Yucca Mountain has approximately confirmed the preliminary results from previous studies, suggesting that the PSHA predictions are too high, possibly because of the uncertainty in the mean of the attenuation relations. Study of the coherence of wavefields in the ESF has provided results which will be very important in the design of the canister distribution, in particular a preliminary estimate of the wavelengths at which the wavefields become incoherent. No evidence was found for extreme focusing by lens-like inhomogeneities. A limited survey for precarious rocks confirmed that they extend south of the repository, and one of these has been field tested.
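
    As a rough quasi-static orientation (the report's overturning analysis is fully dynamic, so this is only the textbook limiting case): a rigid block with basal half-width b and centre-of-mass height h begins to rock, and can ultimately topple, when the horizontal ground acceleration exceeds

        a_{\mathrm{topple}} \approx g \tan\alpha = g\, \frac{b}{h},

    so a precarious rock with a small b/h ratio that has stood for millennia places an upper bound on the peak accelerations its site has experienced.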

  15. Phases and Actions of the Evolution of the Concept of Quality in Canada and Australia – A Theoretical Modelling of the Development of Knowledge in Business Performance in the XXI Century - The Approach to Excellence

    Directory of Open Access Journals (Sweden)

    Cristina Raluca Popescu

    2015-05-01

    Full Text Available In the paper “Phases and Actions of the Evolution of the Concept of Quality in Canada and Australia – A Theoretical Modelling of the Development of Knowledge in Business Performance in the XXI Century - The Approach to Excellence”, the authors present the basic features of the phases and actions of the evolution of the concept of quality in Canada and Australia, as a theoretical modelling of the development of knowledge in business performance in the XXI century, with the aim of improving organizational processes so that excellence can be achieved.

  16. A METHODOLOGICAL MODEL FOR INTEGRATING CHARACTER WITHIN CONTENT AND LANGUAGE INTEGRATED LEARNING IN SOCIOLOGY OF RELIGION

    Directory of Open Access Journals (Sweden)

    Moh Yasir Alimi

    2014-02-01

    Full Text Available Abstract: In this article, I describe a methodological model I used in an experimental study on how to integrate character within the practice of Content and Language Integrated Learning (CLIL) in higher education in Indonesia. This research adds to the research on character education and CLIL in tertiary education, giving nuance to the practice of CLIL, so far predominantly a practice in primary and secondary schools. The research was conducted at Semarang State University, in the Department of Sociology and Anthropology, in a bilingual Sociology of Religion class of 25 students. The research indicates that the integration of character within CLIL enriches the perspective of CLIL by strengthening the use of CLIL for intellectual growth and moral development. On the other side, the use of CLIL with character education gives methods and perspectives to the practice of character education, which so far has only emphasised content reforms without learning-method reforms. The research also reveals that the weakness of CLIL in using text for classroom learning can be overcome by the use of specific reading and writing strategies. I develop a practical text strategy which can be used effectively in highly conceptual subjects such as sociology of religion.

  17. A Methodological Demonstration of Set-theoretical Approach to Social Media Maturity Models Using Necessary Condition Analysis

    DEFF Research Database (Denmark)

    Lasrado, Lester Allan; Vatrapu, Ravi; Andersen, Kim Normann

    2016-01-01

    Despite being widely accepted and applied across research domains, maturity models have been criticized for lacking academic rigor; methodologically rigorous and empirically grounded or tested maturity models are quite rare. Attempting to close this gap, we adopt a set-theoretic approach by applying the Necessary Condition Analysis (NCA) technique to derive maturity stages and stage-boundary conditions. The ontology is to view stages (boundaries) in maturity models as a collection of necessary conditions. Using social media maturity data, we demonstrate the strength of our approach and evaluate some of the arguments presented by previous conceptually focused social media maturity models.
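
    For readers unfamiliar with NCA, the core computation is small. The sketch below, on assumed toy data, builds the CE-FDH (ceiling envelopment with free disposal hull) step ceiling and a simple effect-size estimate (the share of the scope left empty above the ceiling); the authors' actual analysis, and NCA tooling generally, offer more (regression-based ceilings, significance tests) that is not shown here.

```python
# Toy CE-FDH sketch (assumed data); NCA proper also offers regression-
# based ceilings and statistical tests not shown here.
import numpy as np

def ce_fdh_ceiling(x: np.ndarray, y: np.ndarray):
    """Step ceiling: for each x, the maximum y observed at or below x.
    The region above this non-decreasing step function is the 'empty'
    upper-left zone that signals X may be necessary for Y."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    return xs, np.maximum.accumulate(ys)

def nca_effect_size(x, y) -> float:
    """Effect size d = empty area above the ceiling / scope area."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xs, c = ce_fdh_ceiling(x, y)
    scope = (xs[-1] - xs[0]) * (y.max() - y.min())
    empty = float(np.sum(np.diff(xs) * (y.max() - c[:-1])))
    return empty / scope if scope > 0 else 0.0

# Hypothetical maturity data: x = practice maturity, y = outcome level.
print(round(nca_effect_size([1, 2, 3, 4, 5, 6], [2, 1, 4, 3, 6, 5]), 2))  # 0.48
```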

  18. Proposition of a modeling and an analysis methodology of integrated reverse logistics chain in the direct chain

    Energy Technology Data Exchange (ETDEWEB)

    Mimouni, F.; Abouabdellah, A.

    2016-07-01

    The paper proposes a modeling and analysis methodology, based on the combination of Bayesian networks and Petri nets, for reverse logistics integrated with the direct supply chain. The Bayesian network is complemented with a Petri net to break the cycle problem in the Bayesian network (a Bayesian network must be acyclic, while return flows form a loop). The model assumes that demands are independent of returns and applies only to non-perishable products. Legislative aspects considered include recycling laws, protection of the environment, and client satisfaction via after-sale service. (Author)
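
    Because the record above is terse, a concrete illustration may help. The loop that motivates the pairing is easy to express as a Petri net; below is a minimal token-firing sketch with hypothetical places and transitions. The paper's actual net, its probabilities, and its coupling to the Bayesian network are not specified here.

```python
# Minimal Petri-net firing sketch (hypothetical places/transitions; the
# paper's actual net and its Bayesian-network coupling are not given here).
from typing import Dict, List, Tuple

Marking = Dict[str, int]
# transition -> (input places, output places), one token per arc
TRANSITIONS: Dict[str, Tuple[List[str], List[str]]] = {
    "ship":    (["factory"],    ["customer"]),    # forward flow
    "return":  (["customer"],   ["collection"]),  # reverse flow
    "recycle": (["collection"], ["factory"]),     # closes the loop a BN cannot
}

def enabled(t: str, m: Marking) -> bool:
    ins, _ = TRANSITIONS[t]
    return all(m.get(p, 0) >= 1 for p in ins)

def fire(t: str, m: Marking) -> Marking:
    ins, outs = TRANSITIONS[t]
    m = dict(m)
    for p in ins:
        m[p] -= 1
    for p in outs:
        m[p] = m.get(p, 0) + 1
    return m

m: Marking = {"factory": 2, "customer": 0, "collection": 0}
for t in ["ship", "return", "recycle", "ship"]:
    if enabled(t, m):
        m = fire(t, m)
print(m)  # tokens have traversed the forward/reverse cycle
```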

  19. Modelled health benefits of a sugar-sweetened beverage tax across different socioeconomic groups in Australia: A cost-effectiveness and equity analysis.

    Science.gov (United States)

    Lal, Anita; Mantilla-Herrera, Ana Maria; Veerman, Lennert; Backholer, Kathryn; Sacks, Gary; Moodie, Marjory; Siahpush, Mohammad; Carter, Rob; Peeters, Anna

    2017-06-01

    A sugar-sweetened beverage (SSB) tax in Mexico has been effective in reducing consumption of SSBs, with larger decreases for low-income households. The health and financial effects across socioeconomic groups are important considerations for policy-makers. From a societal perspective, we assessed the potential cost-effectiveness, health gains, and financial impacts by socioeconomic position (SEP) of a 20% SSB tax for Australia. Australia-specific price elasticities were used to predict decreases in SSB consumption for each Socio-Economic Indexes for Areas (SEIFA) quintile. Changes in body mass index (BMI) were based on SSB consumption, BMI from the Australian Health Survey 2011-12, and energy balance equations. Markov cohort models were used to estimate the health impact for the Australian population, taking into account obesity-related diseases. Health-adjusted life years (HALYs) gained, healthcare costs saved, and out-of-pocket costs were estimated for each SEIFA quintile. Loss of economic welfare was calculated as the amount of deadweight loss in excess of taxation revenue. A 20% SSB tax would lead to HALY gains of 175,300 (95% CI: 68,700; 277,800) and healthcare cost savings of AU$1,733 million (m) (95% CI: $650m; $2,744m) over the lifetime of the population, with 49.5% of the total health gains accruing to the 2 lowest quintiles. We estimated the increase in annual expenditure on SSBs to be AU$35.40/capita (0.54% of expenditure on food and non-alcoholic drinks) in the lowest SEIFA quintile, a difference of AU$3.80/capita (0.32%) compared to the highest quintile. Annual tax revenue was estimated at AU$642.9m (95% CI: $348.2m; $1,117.2m). The main limitation of this study, as with all simulation models, is that the results represent only the best estimate of a potential effect in the absence of stronger direct evidence. This study demonstrates that from a 20% tax on SSBs, the most HALYs gained and healthcare costs saved would accrue to the most disadvantaged

  20. Modelled health benefits of a sugar-sweetened beverage tax across different socioeconomic groups in Australia: A cost-effectiveness and equity analysis

    Science.gov (United States)

    Mantilla-Herrera, Ana Maria; Veerman, Lennert; Backholer, Kathryn; Moodie, Marjory; Siahpush, Mohammad; Carter, Rob; Peeters, Anna

    2017-01-01

    Background A sugar-sweetened beverage (SSB) tax in Mexico has been effective in reducing consumption of SSBs, with larger decreases for low-income households. The health and financial effects across socioeconomic groups are important considerations for policy-makers. From a societal perspective, we assessed the potential cost-effectiveness, health gains, and financial impacts by socioeconomic position (SEP) of a 20% SSB tax for Australia. Methods and findings Australia-specific price elasticities were used to predict decreases in SSB consumption for each Socio-Economic Indexes for Areas (SEIFA) quintile. Changes in body mass index (BMI) were based on SSB consumption, BMI from the Australian Health Survey 2011–12, and energy balance equations. Markov cohort models were used to estimate the health impact for the Australian population, taking into account obesity-related diseases. Health-adjusted life years (HALYs) gained, healthcare costs saved, and out-of-pocket costs were estimated for each SEIFA quintile. Loss of economic welfare was calculated as the amount of deadweight loss in excess of taxation revenue. A 20% SSB tax would lead to HALY gains of 175,300 (95% CI: 68,700; 277,800) and healthcare cost savings of AU$1,733 million (m) (95% CI: $650m; $2,744m) over the lifetime of the population, with 49.5% of the total health gains accruing to the 2 lowest quintiles. We estimated the increase in annual expenditure on SSBs to be AU$35.40/capita (0.54% of expenditure on food and non-alcoholic drinks) in the lowest SEIFA quintile, a difference of AU$3.80/capita (0.32%) compared to the highest quintile. Annual tax revenue was estimated at AU$642.9m (95% CI: $348.2m; $1,117.2m). The main limitation of this study, as with all simulation models, is that the results represent only the best estimate of a potential effect in the absence of stronger direct evidence. Conclusions This study demonstrates that from a 20% tax on SSBs, the most HALYs gained and healthcare costs
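
    Both records derive consumption changes from price elasticities before feeding them into Markov cohort models. A minimal sketch of that first step follows, using the constant-elasticity approximation ΔQ/Q ≈ ε·ΔP/P and assuming full pass-through of the 20% tax to shelf prices; the elasticities shown are illustrative placeholders, not the study's SEIFA-quintile estimates.

```python
# First modelling step only (illustrative): elasticity-based change in
# SSB consumption under a 20% tax with assumed full price pass-through.

def consumption_change(own_price_elasticity: float, price_rise: float) -> float:
    """Proportional change in consumption: dQ/Q ~= elasticity * dP/P."""
    return own_price_elasticity * price_rise

TAX = 0.20  # 20% ad valorem tax, assumed fully passed through
# Placeholder elasticities (more elastic for lower-income groups, as the
# Mexican evidence cited by the study suggests); not the paper's values.
for group, eps in [("lowest SEIFA quintile", -0.8), ("highest SEIFA quintile", -0.5)]:
    print(f"{group}: consumption change ~ {consumption_change(eps, TAX):+.1%}")
```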

  1. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    Science.gov (United States)

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    control laboratory: It is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points (HACCP). This tool makes it possible to identify the procedural steps with the greatest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.
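
    The ranking in the record rests on tallying critical control points (CCPs) per candidate method. The sketch below shows only that tallying logic over an entirely hypothetical hazard table; in a real HACCP analysis each CCP flag comes from the standard decision tree applied to each procedural step.

```python
# Hypothetical CCP flags per procedural step; illustrative only, not the
# paper's hazard analysis. True/False marks whether a step is a CCP.
ccp_flags = {
    "radioactive dot-blot": {"probe labelling": True, "hybridization": True,
                             "membrane washing": True, "autoradiography": True},
    "quantitative PCR":     {"DNA extraction": True, "amplification": True,
                             "standard curve": False},
    "threshold analysis":   {"reaction setup": True, "signal reading": False},
}

# Rank methods by CCP count (more CCPs = more steps where reliability
# can be compromised), mirroring the paper's comparison logic.
for method in sorted(ccp_flags, key=lambda m: -sum(ccp_flags[m].values())):
    print(f"{method}: {sum(ccp_flags[method].values())} CCPs")
```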

  2. A health-promoting community dental service in Melbourne, Victoria, Australia: protocol for the North Richmond model of oral health care.

    Science.gov (United States)

    Hall, Martin; Christian, Bradley

    2017-10-01

    Despite the best efforts and commitment of oral health programs, there is no evidence that the current surgical output-based model of oral health care is delivering better oral health outcomes to the community. In fact, Australian evidence indicates the oral health of the community could be getting worse. It is now well-understood that this traditional surgical model of oral health care will never successfully manage the disease itself. It is proposed that a health-promoting, minimally invasive oral disease management model of care may lead to a sustainable benefit to the oral health status of the individual and community groups. The aim of this paper is to describe such a model of oral health care (MoC) currently being implemented by the North Richmond Community Health Oral Health (NRCH-OH) program in Melbourne, Victoria, Australia; this model may serve as a template for other services to re-orient their healthcare delivery towards health promotion and prevention. The paper describes the guiding principles and theories for the model and also its operational components, which are: pre-engagement while on the waitlist; client engagement at the reception area; the assessment phase; oral health education (high-risk clients only); disease management; and reviews and recall.

  3. Community Music in Australia

    Science.gov (United States)

    Harrison, Gillian

    2010-01-01

    This paper presents a historical perspective to the development of community music in Australia. Finding political support in Australia's progressive arts policies of the late 1970s, community music is discussed as embracing the principles of access and equity and supporting the development of musical skills in the context of social change and…

  4. Methodological guidelines

    International Nuclear Information System (INIS)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-01-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objectives of the project have been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs

  5. Methodological guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-04-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objectives of the project have been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs.

  6. Artificial neural network and response surface methodology modeling in mass transfer parameters predictions during osmotic dehydration of Carica papaya L.

    Directory of Open Access Journals (Sweden)

    J. Prakash Maran

    2013-09-01

    Full Text Available In this study, a comparative approach was made between artificial neural network (ANN) and response surface methodology (RSM) models to predict the mass transfer parameters of osmotic dehydration of papaya. The effects of process variables such as temperature, osmotic solution concentration, and agitation speed on water loss, weight reduction, and solid gain during osmotic dehydration were investigated using a three-level, three-factor Box-Behnken experimental design. The same design was utilized to train a feed-forward multilayer perceptron (MLP) ANN with the back-propagation algorithm. The predictive capabilities of the two methodologies were compared in terms of root mean square error (RMSE), mean absolute error (MAE), standard error of prediction (SEP), model predictive error (MPE), chi-square statistic (χ²), and coefficient of determination (R²) based on the validation data set. The results showed that a properly trained ANN model is more accurate in prediction than the RSM model.
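
    As a rough illustration of the comparison, the sketch below fits an RSM-style full quadratic model and a small MLP on synthetic data standing in for the Box-Behnken measurements (the record does not include the paper's data) and compares RMSE and R² on a held-out set; scikit-learn is assumed.

```python
# Sketch: RSM-style quadratic model vs a small MLP on synthetic data
# standing in for the Box-Behnken measurements (not the paper's data).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(90, 3))         # temperature, concentration, agitation (coded)
y = (2.0 + 1.5 * X[:, 0] + 0.8 * X[:, 1]     # synthetic water-loss response
     - 0.6 * X[:, 0] * X[:, 1] + 0.4 * X[:, 2] ** 2
     + rng.normal(0, 0.1, 90))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0))

for name, model in [("RSM (quadratic)", rsm), ("ANN (MLP)", ann)]:
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{name}: RMSE={rmse:.3f}, R2={r2_score(y_te, pred):.3f}")
```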

  7. Methodology Development for Passive Component Reliability Modeling in a Multi-Physics Simulation Environment

    Energy Technology Data Exchange (ETDEWEB)

    Aldemir, Tunc [The Ohio State Univ., Columbus, OH (United States); Denning, Richard [The Ohio State Univ., Columbus, OH (United States); Catalyurek, Umit [The Ohio State Univ., Columbus, OH (United States); Unwin, Stephen [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-01-23

    Reduction in safety margin can be expected as passive structures and components undergo degradation with time. Limitations in the traditional probabilistic risk assessment (PRA) methodology constrain its value as an effective tool to address the impact of aging effects on risk and for quantifying the impact of aging management strategies in maintaining safety margins. A methodology has been developed to address multiple aging mechanisms involving large numbers of components (with possibly statistically dependent failures) within the PRA framework in a computationally feasible manner when the sequencing of events is conditioned on the physical conditions predicted in a simulation environment, such as the New Generation System Code (NGSC) concept. Both epistemic and aleatory uncertainties can be accounted for within the same phenomenological framework and maintenance can be accounted for in a coherent fashion. The framework accommodates the prospective impacts of various intervention strategies such as testing, maintenance, and refurbishment. The methodology is illustrated with several examples.

  8. Methodology Development for Passive Component Reliability Modeling in a Multi-Physics Simulation Environment

    International Nuclear Information System (INIS)

    Aldemir, Tunc; Denning, Richard; Catalyurek, Umit; Unwin, Stephen

    2015-01-01

    Reduction in safety margin can be expected as passive structures and components undergo degradation with time. Limitations in the traditional probabilistic risk assessment (PRA) methodology constrain its value as an effective tool to address the impact of aging effects on risk and for quantifying the impact of aging management strategies in maintaining safety margins. A methodology has been developed to address multiple aging mechanisms involving large numbers of components (with possibly statistically dependent failures) within the PRA framework in a computationally feasible manner when the sequencing of events is conditioned on the physical conditions predicted in a simulation environment, such as the New Generation System Code (NGSC) concept. Both epistemic and aleatory uncertainties can be accounted for within the same phenomenological framework and maintenance can be accounted for in a coherent fashion. The framework accommodates the prospective impacts of various intervention strategies such as testing, maintenance, and refurbishment. The methodology is illustrated with several examples.
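
    Both records stress keeping epistemic and aleatory uncertainty within one phenomenological framework. A common way to realize that split is a two-loop Monte Carlo, sketched below with hypothetical Weibull degradation parameters (not taken from the methodology itself): the outer loop samples epistemic parameter uncertainty, the inner loop samples aleatory failure times given those parameters.

```python
# Two-loop Monte Carlo sketch of the epistemic/aleatory split described
# above; the Weibull parameters are hypothetical, not the methodology's.
import numpy as np

rng = np.random.default_rng(42)
MISSION_TIME = 40.0  # years
N_EPISTEMIC, N_ALEATORY = 500, 2000

failure_probs = []
for _ in range(N_EPISTEMIC):
    # Epistemic loop: uncertain degradation-model parameters.
    shape = rng.uniform(1.5, 3.0)      # Weibull shape (>1 means aging)
    scale = rng.uniform(60.0, 120.0)   # characteristic life, years
    # Aleatory loop: random failure times given those parameters.
    times = scale * rng.weibull(shape, size=N_ALEATORY)
    failure_probs.append(np.mean(times <= MISSION_TIME))

lo, med, hi = np.percentile(failure_probs, [5, 50, 95])
print(f"P(fail by {MISSION_TIME:.0f} y): median {med:.3f}, 90% band [{lo:.3f}, {hi:.3f}]")
```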

  9. Current costing models: are they suitable for allocating health resources? The example of fall injury prevention in Australia.

    Science.gov (United States)

    Moller, Jerry

    2005-01-01

    The example of fall injury among older people is used to define and illustrate how current Australian systems for the allocation of health resources perform in funding emerging public health issues. While the examples are Australian, the allocation and priority-setting methods are common in the health sector in all developed western nations. With an ageing population, the number of fall injuries in Australia and the cost of treatment will rise dramatically over the next 20-50 years. Current methods of allocating funds within the health system are not well suited to meeting this coming epidemic. The information requirements for cost-benefit and cost-effectiveness measures cannot be met. Marginal approaches to health funding are likely to continue to fund already well-funded treatment or politically dr