WorldWideScience

Sample records for modeling based study

  1. Image based 3D city modeling: Comparative study

    Directory of Open Access Journals (Sweden)

    S. P. Singh

    2014-06-01

    A 3D city model is a digital representation of the Earth's surface and its related objects, such as buildings, trees, vegetation, and man-made features belonging to urban areas. The demand for 3D city modeling is increasing rapidly for various engineering and non-engineering applications. Four main image-based approaches are generally used to generate virtual 3D city models: sketch-based modeling, procedural-grammar-based modeling, close-range-photogrammetry-based modeling, and modeling based on computer vision techniques. SketchUp, CityEngine, PhotoModeler and Agisoft PhotoScan are the main software packages representing these approaches, respectively. These packages follow different approaches and methods suitable for image-based 3D city modeling. A literature study shows that, to date, no complete comparative study is available on creating a full 3D city model from images. This paper gives a comparative assessment of these four image-based 3D modeling approaches, based mainly on data acquisition methods, data processing techniques and the output 3D model products. For this research work, the study area is the campus of the Civil Engineering Department, Indian Institute of Technology, Roorkee (India); this 3D campus acts as a prototype for a city. The study also explains various governing parameters, factors and work experiences, gives a brief introduction to the strengths and weaknesses of the four image-based techniques, and offers some comments on what can and cannot be done with each package. Finally, the study concludes that every package has some advantages and limitations, and the choice of software depends on the user's requirements for the 3D project. For a normal visualization project, SketchUp is a good option. For 3D documentation records, PhotoModeler gives good

  2. A Collective Case Study of Secondary Students' Model-Based Inquiry on Natural Selection through Programming in an Agent-Based Modeling Environment

    Science.gov (United States)

    Xiang, Lin

    2011-01-01

    This is a collective case study seeking to develop detailed descriptions of how programming an agent-based simulation influences a group of 8th grade students' model-based inquiry (MBI) by examining students' agent-based programmable modeling (ABPM) processes and the learning outcomes. The context of the present study was a biology unit on…

  3. Model-based estimation for dynamic cardiac studies using ECT.

    Science.gov (United States)

    Chiao, P C; Rogers, W L; Clinthorne, N H; Fessler, J A; Hero, A O

    1994-01-01

    The authors develop a strategy for joint estimation of physiological parameters and myocardial boundaries using ECT (emission computed tomography). They construct an observation model to relate parameters of interest to the projection data and to account for limited ECT system resolution and measurement noise. The authors then use a maximum likelihood (ML) estimator to jointly estimate all the parameters directly from the projection data without reconstruction of intermediate images. They also simulate myocardial perfusion studies based on a simplified heart model to evaluate the performance of the model-based joint ML estimator and compare this performance to the Cramer-Rao lower bound. Finally, the authors discuss model assumptions and potential uses of the joint estimation strategy.
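    The ML-versus-CRLB comparison can be illustrated with a toy example. The sketch below is hypothetical and much simpler than the authors' ECT observation model: it treats projection counts as Poisson with a single physiological parameter, for which the ML estimate and the Cramer-Rao bound have closed forms. The sensitivity vector and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "system matrix" row: sensitivities of each projection bin
a = rng.uniform(0.5, 2.0, size=64)
theta_true = 10.0                      # true physiological parameter
y = rng.poisson(a * theta_true)        # simulated projection counts

# ML estimate for y_i ~ Poisson(a_i * theta): theta_hat = sum(y) / sum(a)
theta_hat = y.sum() / a.sum()

# Fisher information I(theta) = sum(a_i) / theta, so the Cramer-Rao
# lower bound on Var(theta_hat) is theta / sum(a).
crlb = theta_true / a.sum()

print(f"ML estimate: {theta_hat:.3f}, CRLB std: {np.sqrt(crlb):.3f}")
```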

  4. An agent-based simulation model to study accountable care organizations.

    Science.gov (United States)

    Liu, Pai; Wu, Shinyi

    2016-03-01

    Creating accountable care organizations (ACOs) has been widely discussed as a strategy to control rapidly rising healthcare costs and improve quality of care; however, building an effective ACO is a complex process involving multiple stakeholders (payers, providers, patients) with their own interests. Implementation of an ACO is also costly in terms of time and money, and an immature design could cause safety hazards. Therefore, there is a need for analytical model-based decision-support tools that can predict the outcomes of different strategies to facilitate ACO design and implementation. In this study, an agent-based simulation model was developed to study ACOs; it considers payers, healthcare providers, and patients as agents under the shared-savings payment model of care for congestive heart failure (CHF), one of the most expensive causes of sometimes-preventable hospitalizations. The agent-based simulation model identified the critical determinants of payment model design that can motivate provider behavior changes to achieve maximum financial and quality outcomes of an ACO. The results show nonlinear provider behavior change patterns corresponding to changes in payment model designs. The outcomes vary by providers with different quality or financial priorities, and are most sensitive to the cost-effectiveness of the CHF interventions that an ACO implements. This study demonstrates an increasingly important method of constructing a healthcare system analytics model that can help inform health policy and healthcare management decisions. The study also points out that the likely success of an ACO is interdependent with payment model design, provider characteristics, and the cost and effectiveness of healthcare interventions.
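    A minimal, hypothetical sketch of the shared-savings feedback loop described above (not the authors' model; the agents, parameters and adaptation rule are all invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical shared-savings loop: a provider adjusts CHF-intervention
# effort based on last period's shared-savings payment.
BENCHMARK = 100.0    # payer's expected cost per patient
SHARE = 0.5          # fraction of savings returned to the provider
INTERVENTION_COST = 2.0
EFFECTIVENESS = 8.0  # cost reduction per unit effort (diminishing returns)

effort = 0.1
for period in range(10):
    # Cost falls with effort (diminishing returns), plus patient-level noise
    cost = BENCHMARK - EFFECTIVENESS * np.sqrt(effort) + rng.normal(0, 1)
    savings = max(BENCHMARK - cost, 0.0)
    payment = SHARE * savings - INTERVENTION_COST * effort
    # Simple adaptive rule: raise effort if it paid off, lower it otherwise
    effort = max(0.0, effort + 0.05 * np.sign(payment))
    print(f"period {period}: effort={effort:.2f} cost={cost:.1f} payment={payment:.2f}")
```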

  5. Prospects of Source-Separation-Based Sanitation Concepts: A Model-Based Study

    NARCIS (Netherlands)

    Tervahauta, T.H.; Trang Hoang,; Hernández, L.; Zeeman, G.; Buisman, C.J.N.

    2013-01-01

    Separation of different domestic wastewater streams and targeted on-site treatment for resource recovery has been recognized as one of the most promising sanitation concepts to re-establish the balance in carbon, nutrient and water cycles. In this study a model was developed based on literature data

  6. Model-based estimation for dynamic cardiac studies using ECT

    International Nuclear Information System (INIS)

    Chiao, P.C.; Rogers, W.L.; Clinthorne, N.H.; Fessler, J.A.; Hero, A.O.

    1994-01-01

    In this paper, the authors develop a strategy for joint estimation of physiological parameters and myocardial boundaries using ECT (Emission Computed Tomography). The authors construct an observation model to relate parameters of interest to the projection data and to account for limited ECT system resolution and measurement noise. The authors then use a maximum likelihood (ML) estimator to jointly estimate all the parameters directly from the projection data without reconstruction of intermediate images. The authors also simulate myocardial perfusion studies based on a simplified heart model to evaluate the performance of the model-based joint ML estimator and compare this performance to the Cramer-Rao lower bound. Finally, model assumptions and potential uses of the joint estimation strategy are discussed.

  7. A Collective Case Study of Secondary Students' Model-Based Inquiry on Natural Selection through Programming in an Agent-Based Modeling Environment

    Science.gov (United States)

    Xiang, Lin

    This is a collective case study seeking to develop detailed descriptions of how programming an agent-based simulation influences a group of 8th grade students' model-based inquiry (MBI) by examining students' agent-based programmable modeling (ABPM) processes and the learning outcomes. The context of the present study was a biology unit on natural selection implemented in a charter school of a major California city during the spring semester of 2009. Eight 8th grade students, two boys and six girls, participated in this study. All of them were of low socioeconomic status (SES). English was a second language for all of them, but they had been identified as fluent English speakers at least a year before the study. None of them had learned either natural selection or programming before the study. The study spanned 7 weeks and comprised two phases. In phase one, the students learned natural selection in the science classroom and learned how to program in NetLogo, an ABPM tool, in a computer lab; in phase two, the students were asked to program a simulation of adaptation based on the natural selection model in NetLogo. Both qualitative and quantitative data were collected in this study. The data sources included (1) pre- and post-test questionnaires, (2) student in-class worksheets, (3) programming planning sheets, (4) code-conception matching sheets, (5) student NetLogo projects, (6) videotaped programming processes, (7) final interviews, and (8) the investigator's field notes. Both qualitative and quantitative approaches were applied to analyze the gathered data. The findings suggested that students made progress in understanding adaptation phenomena and natural selection by the end of ABPM-supported MBI learning, but the progress was limited. These students still held some misconceptions in their conceptual models, such as the idea that animals need to "learn" to adapt to the environment. Besides, their models of natural selection appeared to be

  8. Mobile Agent-Based Software Systems Modeling Approaches: A Comparative Study

    Directory of Open Access Journals (Sweden)

    Aissam Belghiat

    2016-06-01

    Mobile agent-based applications are a special type of software system that takes advantage of mobile agents to provide a new, beneficial paradigm for solving complex problems in several fields and areas such as network management, e-commerce, and e-learning. At the same time, we notice a lack of real applications based on this paradigm and a lack of serious evaluations of their modeling approaches. Hence, this paper provides a comparative study of modeling approaches for mobile agent-based software systems. The objective is to give the reader an overview and a thorough understanding of the work that has been done and where the gaps in the research are.

  9. Developing Computer Model-Based Assessment of Chemical Reasoning: A Feasibility Study

    Science.gov (United States)

    Liu, Xiufeng; Waight, Noemi; Gregorius, Roberto; Smith, Erica; Park, Mihwa

    2012-01-01

    This paper reports a feasibility study on developing computer model-based assessments of chemical reasoning at the high school level. The computer models are Flash and NetLogo environments that make three domains of chemistry simultaneously available: macroscopic, submicroscopic, and symbolic. Students interact with the computer models to answer assessment…

  10. An approach to model validation and model-based prediction -- polyurethane foam case study.

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, Kevin J.; Rutherford, Brian Milne

    2003-07-01

    Enhanced software methodology and improved computing hardware have advanced the state of simulation technology to a point where large physics-based codes can be a major contributor in many systems analyses. This shift toward the use of computational methods has brought with it new research challenges in a number of areas including characterization of uncertainty, model validation, and the analysis of computer output. It is these challenges that have motivated the work described in this report. Approaches to and methods for model validation and (model-based) prediction have been developed recently in the engineering, mathematics and statistical literatures. In this report we have provided a fairly detailed account of one approach to model validation and prediction applied to an analysis investigating thermal decomposition of polyurethane foam. A model simulates the evolution of the foam in a high temperature environment as it transforms from a solid to a gas phase. The available modeling and experimental results serve as data for a case study focusing our model validation and prediction developmental efforts on this specific thermal application. We discuss several elements of the "philosophy" behind the validation and prediction approach: (1) We view the validation process as an activity applying to the use of a specific computational model for a specific application. We do acknowledge, however, that an important part of the overall development of a computational simulation initiative is the feedback provided to model developers and analysts associated with the application. (2) We utilize information obtained for the calibration of model parameters to estimate the parameters and quantify uncertainty in the estimates. We rely, however, on validation data (or data from similar analyses) to measure the variability that contributes to the uncertainty in predictions for specific systems or units (unit-to-unit variability). (3) We perform statistical

  11. Study on evaluation method for heterogeneous sedimentary rocks based on forward model

    International Nuclear Information System (INIS)

    Masui, Yasuhiro; Kawada, Koji; Katoh, Arata; Tsuji, Takashi; Suwabe, Mizue

    2004-02-01

    It is very important to estimate the facies distribution of heterogeneous sedimentary rocks for the geological disposal of high-level radioactive waste. The heterogeneity of sedimentary rocks is due to the variable distribution of grain size and mineral composition. The objective of this study is to establish an evaluation method for heterogeneous sedimentary rocks based on a forward model. This study consisted of a geological study of the Horonobe area and the development of software for a sedimentary model. The geological study was composed of the following items. 1. The sedimentary system of the Koetoi and Wakkanai formations in the Horonobe area was compiled based on published papers. 2. The cores of HDB-1 were observed mainly from a sedimentological point of view. 3. The facies and compaction properties of argillaceous rocks were studied based on physical logs and core analysis data of wells. 4. Structure maps, isochrone maps, isopach maps and restored geological sections were made. The software for the sedimentary model, which shows the sedimentary system on a basin scale, was developed. This software estimates the facies distribution and hydraulic conductivity of sedimentary rocks in three dimensions by numerical simulation. (author)

  12. Topic model-based mass spectrometric data analysis in cancer biomarker discovery studies.

    Science.gov (United States)

    Wang, Minkun; Tsai, Tsung-Heng; Di Poto, Cristina; Ferrarini, Alessia; Yu, Guoqiang; Ressom, Habtom W

    2016-08-18

    A fundamental challenge in the quantitation of biomolecules for cancer biomarker discovery arises from the heterogeneous nature of human biospecimens. Although this issue has been a subject of discussion in cancer genomic studies, it has not yet been rigorously investigated in mass spectrometry based proteomic and metabolomic studies. Purification of mass spectrometric data is highly desired prior to subsequent analysis, e.g., quantitative comparison of the abundance of biomolecules in biological samples. We investigated topic models to computationally analyze mass spectrometric data considering both integrated peak intensities and scan-level features, i.e., extracted ion chromatograms (EICs). Probabilistic generative models enable flexible representation of the data structure and infer sample-specific pure resources. Scan-level modeling helps alleviate information loss during data preprocessing. We evaluated the capability of the proposed models in capturing mixture proportions of contaminants and cancer profiles on LC-MS based serum proteomic and GC-MS based tissue metabolomic datasets acquired from patients with hepatocellular carcinoma (HCC) and liver cirrhosis, as well as synthetic data we generated based on the serum proteomic data. The results we obtained by analysis of the synthetic data demonstrated that both intensity-level and scan-level purification models can accurately infer the mixture proportions and the underlying true cancerous sources with small average error ratios (data, we found more proteins and metabolites with significant changes between HCC cases and cirrhotic controls. Candidate biomarkers selected after purification yielded biologically meaningful pathway analysis results and improved disease discrimination power in terms of the area under the ROC curve compared to the results found prior to purification. We investigated topic model-based inference methods to computationally address the heterogeneity issue in samples analyzed by LC/GC-MS. We observed
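    As a rough analogue of the purification idea (the paper uses custom probabilistic generative models, not this library), a standard topic model such as scikit-learn's LatentDirichletAllocation can decompose a synthetic peak-intensity matrix into source profiles and per-sample mixture proportions:

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(2)

# Synthetic peak-intensity matrix: 20 samples x 50 peaks, mixing two
# hypothetical sources (a "cancer" profile and a "contaminant" profile)
cancer = rng.gamma(2.0, 1.0, 50)
contam = rng.gamma(2.0, 1.0, 50)
props = rng.beta(2, 2, size=(20, 1))                 # true mixture proportions
X = rng.poisson(200 * (props * cancer + (1 - props) * contam))

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
mix = lda.transform(X)        # inferred per-sample mixture proportions
print(np.round(mix[:5], 2))   # compare against props[:5]
```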

  13. Model-based design languages: A case study

    OpenAIRE

    Cibrario Bertolotti, Ivan; Hu, Tingting; Navet, Nicolas

    2017-01-01

    Fast-paced innovation in the embedded systems domain puts an ever increasing pressure on effective software development methods, leading to the growing popularity of Model-Based Design (MBD). In this context, a proper choice of modeling languages and related tools - depending on design goals and problem qualities - is crucial to make the most of MBD benefits. In this paper, a comparison between two dissimilar approaches to modeling is carried out, with the goal of highlighting their relative ...

  14. Performance-Based Service Quality Model: An Empirical Study on Japanese Universities

    Science.gov (United States)

    Sultan, Parves; Wong, Ho

    2010-01-01

    Purpose: This paper aims to develop and empirically test a performance-based higher education service quality model. Design/methodology/approach: The study develops a 67-item instrument for measuring performance-based service quality with a particular focus on the higher education sector. Scale reliability is confirmed using Cronbach's alpha.…

  15. A Sensitivity Analysis Method to Study the Behavior of Complex Process-based Models

    Science.gov (United States)

    Brugnach, M.; Neilson, R.; Bolte, J.

    2001-12-01

    The use of process-based models as a tool for scientific inquiry is becoming increasingly relevant in ecosystem studies. Process-based models are artificial constructs that simulate the system by mechanistically mimicking the functioning of its component processes. Structurally, a process-based model can be characterized in terms of its processes and the relationships established among them. Each process comprises a set of functional relationships among several model components (e.g., state variables, parameters and input data). While not encoded explicitly, the dynamics of the model emerge from this set of components and interactions organized in terms of processes. It is the task of the modeler to guarantee that the dynamics generated are appropriate and semantically equivalent to the phenomena being modeled. Despite the availability of techniques to characterize and understand model behavior, they do not suffice to completely and easily understand how a complex process-based model operates. For example, sensitivity analysis studies model behavior by determining the rate of change in model output as parameters or input data are varied. One of the problems with this approach is that it treats the model as a "black box" and focuses on explaining model behavior by analyzing the input-output relationship. Since these models have a high degree of non-linearity, understanding how the input affects an output can be an extremely difficult task. Operationally, applying this technique can also be challenging because complex process-based models are generally characterized by a large parameter space. In order to overcome some of these difficulties, we propose a method of sensitivity analysis applicable to complex process-based models. This method focuses sensitivity analysis at the process level, and it aims to determine how sensitive the model output is to variations in the processes. Once the processes that exert the major influence in
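    A minimal sketch of process-level sensitivity analysis, under the assumption that each process can be scaled by a multiplier and the relative change in model output attributed to it; the two-process model below is invented for illustration:

```python
import numpy as np

# Hypothetical process-based model: output emerges from two processes,
# "growth" and "decay", each lumping several parameters together.
def growth(biomass, light=1.0, k=0.3):
    return k * light * biomass

def decay(biomass, m=0.1):
    return m * biomass

def run_model(scale_growth=1.0, scale_decay=1.0, steps=100):
    b = 1.0
    for _ in range(steps):
        b += scale_growth * growth(b) - scale_decay * decay(b)
    return b

base = run_model()
for name in ("growth", "decay"):
    kwargs = {f"scale_{name}": 1.05}           # perturb the whole process by 5%
    s = (run_model(**kwargs) - base) / base    # relative change in output
    print(f"sensitivity to {name}: {s:+.2%}")
```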

  16. Experimental Study of Dowel Bar Alternatives Based on Similarity Model Test

    Directory of Open Access Journals (Sweden)

    Chichun Hu

    2017-01-01

    In this study, a small-scale accelerated loading test based on similarity theory and the Accelerated Pavement Analyzer was developed to evaluate dowel bars with different materials and cross-sections. A jointed concrete specimen consisting of one dowel was designed as the scaled model for the test, and each specimen was subjected to 864 thousand loading cycles. Deflections between jointed slabs were measured with dial indicators, and strains of the dowel bars were monitored with strain gauges. The load transfer efficiency, differential deflection, and dowel-concrete bearing stress for each case were calculated from these measurements. The test results indicated that the effect of the dowel modulus on load transfer efficiency can be characterized based on the similarity model test developed in the study. Moreover, the round steel dowel was found to perform similarly to the larger FRP dowel, and the elliptical dowel can be preferentially considered in practice.

  17. Model of Values-Based Management Process in Schools: A Mixed Design Study

    Science.gov (United States)

    Dogan, Soner

    2016-01-01

    The aim of this paper is to evaluate school administrators' values-based management behaviours according to teachers' perceptions and opinions and, accordingly, to build a model of the values-based management process in schools. The study was conducted using an explanatory design that includes both quantitative and qualitative methods.…

  18. A case study to estimate costs using Neural Networks and regression based models

    Directory of Open Access Journals (Sweden)

    Nadia Bhuiyan

    2012-07-01

    Bombardier Aerospace's high-performance aircraft and services set the standard for the aerospace industry. A case study in collaboration with Bombardier Aerospace was conducted in order to estimate the target cost of a landing gear. More precisely, the study uses both a parametric model and neural network models to estimate the cost of main landing gears, a major aircraft commodity. A comparative analysis between the parametric-based model and the neural-network-based models is carried out in order to determine the most accurate method for predicting the cost of a main landing gear. Several trials are presented for the design and use of the neural network model. The analysis for the case under study shows the flexibility in the design of the neural network model. Furthermore, the performance of the neural network model is deemed superior to that of the parametric models for this case study.
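    A hedged sketch of the kind of comparison described, using scikit-learn stand-ins on synthetic data; the features, cost relation and model settings are invented, not Bombardier's:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

# Synthetic data: cost depends nonlinearly on weight and load (hypothetical)
X = rng.uniform(1, 10, size=(200, 2))            # [weight, design load]
y = 5 * X[:, 0] + 0.8 * X[:, 0] * X[:, 1] + rng.normal(0, 2, 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

lin = LinearRegression().fit(X_tr, y_tr)         # parametric cost model
nn = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                  random_state=0).fit(X_tr, y_tr)

print("parametric R^2:", round(lin.score(X_te, y_te), 3))
print("neural net R^2:", round(nn.score(X_te, y_te), 3))
```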

  19. A model for fine mapping in family based association studies.

    Science.gov (United States)

    Boehringer, Stefan; Pfeiffer, Ruth M

    2009-01-01

    Genome-wide association studies for complex diseases are typically followed by more focused characterization of the identified genetic region. We propose a latent class model to evaluate a candidate region with several measured markers using observations on families. The main goal is to estimate linkage disequilibrium (LD) between the observed markers and the putative true but unobserved disease locus in the region. Based on this model, we estimate the joint distribution of alleles at the observed markers and the unobserved true disease locus, and a penetrance parameter measuring the impact of the disease allele on disease risk. A family-specific random effect allows for varying baseline disease prevalences across families. We present a likelihood framework for our model and assess its properties in simulations. We apply the model to an Alzheimer data set and confirm previous findings in the ApoE region.

  20. Society by Numbers : Studies on Model-Based Explanations in the Social Sciences

    OpenAIRE

    Kuorikoski, Jaakko

    2010-01-01

    The aim of this dissertation is to provide conceptual tools for the social scientist for clarifying, evaluating and comparing explanations of social phenomena based on formal mathematical models. The focus is on relatively simple theoretical models and simulations, not statistical models. These studies apply a theory of explanation according to which explanation is about tracing objective relations of dependence, knowledge of which enables answers to contrastive why and how-questions. This th...

  1. Aircraft operational reliability—A model-based approach and a case study

    International Nuclear Information System (INIS)

    Tiassou, Kossi; Kanoun, Karama; Kaâniche, Mohamed; Seguin, Christel; Papadopoulos, Chris

    2013-01-01

    The success of an aircraft mission is subject to the fulfillment of some operational requirements before and during each flight. As these requirements depend essentially on the aircraft system components and the mission profile, the effects of failures can be very severe if they are not anticipated. Hence, one should be able to assess the aircraft operational reliability with regard to its missions in order to be able to cope with failures. We address aircraft operational reliability modeling to support maintenance planning during the mission achievement. We develop a modeling approach, based on a meta-model that is used as a basis: (i) to structure the information needed to assess aircraft operational reliability and (ii) to build a stochastic model that can be tuned dynamically, in order to take into account the aircraft system operational state, a mission profile and the maintenance facilities available at the flight stop locations involved in the mission. The aim is to enable operational reliability assessment online. A case study, based on an aircraft subsystem, is considered for illustration using the Stochastic Activity Networks (SANs) formalism

  2. The evolution of network-based business models illustrated through the case study of an entrepreneurship project

    DEFF Research Database (Denmark)

    Lund, Morten; Nielsen, Christian

    2014-01-01

    Purpose: Existing frameworks for understanding and analyzing the value configuration and structuring of partnerships in relation to such network-based business models are found to be inferior. The purpose of this paper is therefore to broaden our understanding of how business models may change over time … Findings: … how partners positioned around a business model can be organized into a network-based business model that generates additional value for the core business model and for both the partners and the customers. Research limitations/implications: The results should be taken with caution as they are based on the case study of a single network-based business model. Practical implications: Managers can gain insight into barriers and enablers relating to different types of loose organisations and how to best manage such relationships and interactions. Originality/value: This study adds value to the existing literature by reflecting the dynamics created in the interactions between a business model's strategic partners …

  3. Copula based prediction models: an application to an aortic regurgitation study

    Directory of Open Access Journals (Sweden)

    Shoukri Mohamed M

    2007-06-01

    Background: An important issue in prediction modeling of multivariate data is the measure of dependence structure. The use of Pearson's correlation as a dependence measure has several pitfalls, and hence the application of regression prediction models based on this correlation may not be an appropriate methodology. As an alternative, a copula-based methodology for prediction modeling and an algorithm to simulate data are proposed. Methods: The method consists of introducing copulas as an alternative to the correlation coefficient commonly used as a measure of dependence. An algorithm based on the marginal distributions of the random variables is applied to construct the Archimedean copulas. Monte Carlo simulations are carried out to replicate datasets, estimate prediction model parameters and validate them using Lin's concordance measure. Results: We carried out a correlation-based regression analysis on data from 20 patients aged 17–82 years on pre-operative and post-operative ejection fractions after surgery and estimated the prediction model: Post-operative ejection fraction = -0.0658 + 0.8403 (Pre-operative ejection fraction); p = 0.0008; 95% confidence interval of the slope coefficient (0.3998, 1.2808). From the exploratory data analysis, it is noted that both the pre-operative and post-operative ejection fraction measurements have slight departures from symmetry and are skewed to the left. It is also noted that the measurements tend to be widely spread and have shorter tails compared to the normal distribution. Therefore, predictions made from the correlation-based model corresponding to pre-operative ejection fraction measurements in the lower range may not be accurate. Further, it is found that the best approximated marginal distributions of the pre-operative and post-operative ejection fractions (using q-q plots) are gamma distributions. The copula-based prediction model is estimated as: Post-operative ejection fraction = -0.0933 + 0
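    The simulation step can be sketched as follows, assuming a Clayton (Archimedean) copula sampled via the standard gamma-frailty (Marshall-Olkin) construction, with gamma marginals as suggested by the paper's q-q plots; the copula and marginal parameters below are invented:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

def clayton_sample(n, theta):
    """Marshall-Olkin sampling: V ~ Gamma(1/theta) frailty gives dependent uniforms."""
    v = rng.gamma(1.0 / theta, 1.0, size=n)
    e = rng.exponential(size=(n, 2))
    return (1.0 + e / v[:, None]) ** (-1.0 / theta)

u = clayton_sample(5000, theta=2.0)          # dependent uniform pairs

# Map to hypothetical gamma marginals for pre/post ejection fraction
pre = stats.gamma.ppf(u[:, 0], a=20, scale=0.03)
post = stats.gamma.ppf(u[:, 1], a=15, scale=0.04)

print("simulated Pearson r:", round(np.corrcoef(pre, post)[0, 1], 2))
```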

  4. The impact of design-based modeling instruction on seventh graders' spatial abilities and model-based argumentation

    Science.gov (United States)

    McConnell, William J.

    Due to the call of current science education reform for the integration of engineering practices within science classrooms, design-based instruction is receiving much attention in science education literature. Although some aspect of modeling is often included in well-known design-based instructional methods, it is not always a primary focus. The purpose of this study was to better understand how design-based instruction with an emphasis on scientific modeling might impact students' spatial abilities and their model-based argumentation abilities. In the following mixed-method multiple case study, seven seventh grade students attending a secular private school in the Mid-Atlantic region of the United States underwent an instructional intervention involving design-based instruction, modeling and argumentation. Through the course of a lesson involving students in exploring the interrelatedness of the environment and an animal's form and function, students created and used multiple forms of expressed models to assist them in model-based scientific argument. Pre/post data were collected through the use of The Purdue Spatial Visualization Test: Rotation, the Mental Rotation Test and interviews. Other data included a spatial activities survey, student artifacts in the form of models, notes, exit tickets, and video recordings of students throughout the intervention. Spatial abilities tests were analyzed using descriptive statistics while students' arguments were analyzed using the Instrument for the Analysis of Scientific Curricular Arguments and a behavior protocol. Models were analyzed using content analysis and interviews and all other data were coded and analyzed for emergent themes. Findings in the area of spatial abilities included increases in spatial reasoning for six out of seven participants, and an immense difference in the spatial challenges encountered by students when using CAD software instead of paper drawings to create models. Students perceived 3D printed

  5. Tourism Village Model Based on Local Indigenous: Case Study of Nongkosawit Tourism Village, Gunungpati, Semarang

    Science.gov (United States)

    Kurniasih; Nihayah, Dyah Maya; Sudibyo, Syafitri Amalia; Winda, Fajri Nur

    2018-02-01

    Nongkosawit Village has officially been a tourism village since 2012. However, the local community has not yet felt the economic impact because of an inappropriate tourism village model. Therefore, this study aims to find the best model for the development of Nongkosawit Tourism Village. This research used the Analytical Hierarchy Process method. The results show that the tourism village model best suited to the local indigenous character of Nongkosawit Tourism Village was the culture-based tourism village, with a percentage of 58%. Therefore, it is necessary to re-orient from the nature-based village model to the culture-based village model by raising and exploring the existing culture through unique and different tourism products.

  6. Parametric study of a turbocompound diesel engine based on an analytical model

    International Nuclear Information System (INIS)

    Zhao, Rongchao; Zhuge, Weilin; Zhang, Yangjun; Yin, Yong; Zhao, Yanting; Chen, Zhen

    2016-01-01

    Turbocompounding is an important technique to recover waste heat from engine exhaust and reduce CO2 emissions. This paper presents a parametric study of a turbocompound diesel engine based on an analytical model. The analytical model was developed to investigate the influence of system parameters on engine fuel consumption. It is based on thermodynamics and empirical models, and can consider the impact of each parameter independently. The effects of turbine efficiency, back pressure, exhaust temperature, pressure ratio and engine speed on the recovered energy, pumping loss and engine fuel reduction were studied. Results show that turbine efficiency, exhaust temperature and back pressure have a great influence on the fuel reduction and the optimal power turbine (PT) expansion ratio, whereas engine operating speed has little impact on the fuel savings obtained by turbocompounding. The interaction mechanism between the PT recovery power and the engine pumping loss is presented in the paper. Due to the nonlinear characteristic of turbine power, there is an optimum value of the PT expansion ratio that achieves the largest power gain. Finally, the fuel saving potential of a high performance turbocompound engine and the requirements for it are proposed. - Highlights: • An analytical model for a turbocompound engine is developed and validated. • A parametric study is performed to obtain the lowest BSFC and the optimal expansion ratio. • The influence of each parameter on the fuel saving potential is presented. • The impact mechanisms of each parameter on the energy tradeoff are disclosed. • It provides an effective tool to guide the preliminary design of turbocompounding.
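    A hypothetical sketch of the tradeoff the abstract describes: power-turbine recovery grows with expansion ratio while pumping loss also rises, so sweeping the ratio reveals an interior optimum. The gas properties, flows and the superlinear pumping-loss law are illustrative, not the paper's model:

```python
import numpy as np

# Illustrative constants (not from the paper)
CP, GAMMA = 1005.0, 1.35          # exhaust gas cp (J/(kg K)), heat ratio
MDOT, T_EX = 0.4, 900.0           # exhaust mass flow (kg/s), temperature (K)
ETA_T = 0.75                      # power-turbine efficiency
V_DISP_FLOW = 0.05                # engine volume flow for pumping loss, m^3/s
P_AMB = 1.0e5                     # ambient pressure, Pa

ratios = np.linspace(1.05, 3.0, 60)
# Isentropic power recovered by the power turbine
p_turb = ETA_T * MDOT * CP * T_EX * (1 - ratios ** ((1 - GAMMA) / GAMMA))
# Pumping loss rises superlinearly with back pressure (illustrative law)
p_pump = (ratios - 1.0) ** 2 * P_AMB * V_DISP_FLOW

net = p_turb - p_pump
best = ratios[np.argmax(net)]
print(f"optimal PT expansion ratio ~ {best:.2f}, net gain {net.max()/1e3:.1f} kW")
```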

  7. Lévy-based growth models

    DEFF Research Database (Denmark)

    Jónsdóttir, Kristjana Ýr; Schmiegel, Jürgen; Jensen, Eva Bjørn Vedel

    2008-01-01

    In the present paper, we give a condensed review, for the nonspecialist reader, of a new modelling framework for spatio-temporal processes, based on Lévy theory. We show the potential of the approach in stochastic geometry and spatial statistics by studying Lévy-based growth modelling of planar objects. The growth models considered are spatio-temporal stochastic processes on the circle. As a by-product, flexible new models for space–time covariance functions on the circle are provided. An application of the Lévy-based growth models to tumour growth is discussed.

  8. Improved workflow modelling using role activity diagram-based modelling with application to a radiology service case study.

    Science.gov (United States)

    Shukla, Nagesh; Keast, John E; Ceglarek, Darek

    2014-10-01

    The modelling of complex workflows is an important problem-solving technique within healthcare settings. However, currently most of the workflow models use a simplified flow chart of patient flow obtained using on-site observations, group-based debates and brainstorming sessions, together with historic patient data. This paper presents a systematic and semi-automatic methodology for knowledge acquisition with detailed process representation using sequential interviews of people in the key roles involved in the service delivery process. The proposed methodology allows the modelling of roles, interactions, actions, and decisions involved in the service delivery process. This approach is based on protocol generation and analysis techniques such as: (i) initial protocol generation based on qualitative interviews of radiology staff, (ii) extraction of key features of the service delivery process, (iii) discovering the relationships among the key features extracted, and, (iv) a graphical representation of the final structured model of the service delivery process. The methodology is demonstrated through a case study of a magnetic resonance (MR) scanning service-delivery process in the radiology department of a large hospital. A set of guidelines is also presented in this paper to visually analyze the resulting process model for identifying process vulnerabilities. A comparative analysis of different workflow models is also conducted.

  9. Culturicon model: A new model for cultural-based emoticon

    Science.gov (United States)

    Zukhi, Mohd Zhafri Bin Mohd; Hussain, Azham

    2017-10-01

    Emoticons are popular among users of distributed collective interaction for expressing their emotions, gestures and actions. Emoticons have been shown to prevent misunderstanding of messages, save attention and improve communication among speakers of different native languages. However, besides the benefits that emoticons provide, research on emoticons from a cultural perspective is still lacking. As emoticons are crucial in global communication, culture should be one of the extensively researched aspects of distributed collective interaction. Therefore, this study attempts to explore and develop a model for cultural-based emoticons. Three cultural models that have been used in Human-Computer Interaction were studied: the Hall culture model, the Trompenaars and Hampden-Turner culture model, and the Hofstede culture model. The dimensions from these three models will be used in developing the proposed cultural-based emoticon model.

  10. Coach simplified structure modeling and optimization study based on the PBM method

    Science.gov (United States)

    Zhang, Miaoli; Ren, Jindong; Yin, Ying; Du, Jian

    2016-09-01

    For the coach industry, rapid modeling and efficient optimization methods are desirable for structure modeling and optimization based on simplified structures, especially early in the concept phase, with the ability to accurately express the mechanical properties of the structure and with flexible section forms. However, present dimension-based methods cannot easily meet these requirements. To achieve these goals, the property-based modeling (PBM) beam modeling method is studied, based on PBM theory and on the characteristic of coach structures that beams are the main components. For a beam component of given length, its mechanical characteristics are primarily determined by the section properties. Four section parameters are adopted to describe the mechanical properties of a beam: the section area, the principal moments of inertia about the two principal axes, and the torsion constant of the section. Based on an equivalent-stiffness strategy, expressions for the above section parameters are derived, and the PBM beam element is implemented in the HyperMesh software. A case using this method is presented, in which the structure of a passenger coach is simplified. The model precision is validated by comparing the basic performance of the simplified structure with that of the original structure, including the bending and torsion stiffness and the first-order bending and torsional modal frequencies. Sensitivity analysis is conducted to choose design variables. An optimal Latin hypercube experiment design is adopted to sample the test points, and polynomial response surfaces are used to fit these points. To improve the bending and torsion stiffness and the first-order torsional frequency, and taking the allowable maximum stresses of the braking and left-turning conditions as constraints, multi-objective optimization of the structure is conducted using the NSGA-II genetic algorithm on the ISIGHT platform. The result of the
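    The four section parameters named above (area, two principal moments of inertia, torsion constant) can be computed for any candidate section. Below is a sketch for a thin-walled rectangular tube, a plausible coach-frame section, using standard formulas (Bredt's formula for the torsion constant); the dimensions are hypothetical:

```python
def rect_tube_properties(b, h, t):
    """Area, principal moments of inertia and torsion constant of a
    thin-walled rectangular tube (outer width b, height h, wall t)."""
    area = b * h - (b - 2 * t) * (h - 2 * t)
    iy = (b * h**3 - (b - 2 * t) * (h - 2 * t) ** 3) / 12.0
    iz = (h * b**3 - (h - 2 * t) * (b - 2 * t) ** 3) / 12.0
    # Bredt's formula for a thin-walled closed section with constant wall:
    # J = 4 * A_m^2 * t / perimeter, A_m = area enclosed by the wall midline
    a_m = (b - t) * (h - t)
    perim = 2 * ((b - t) + (h - t))
    j = 4.0 * a_m**2 * t / perim
    return area, iy, iz, j

# Hypothetical 60 x 40 x 3 mm coach-frame tube (dimensions in mm)
print(rect_tube_properties(60.0, 40.0, 3.0))
```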

  11. The Evolution of Network-based Business Models Illustrated Through the Case Study of an Entrepreneurship Project

    Directory of Open Access Journals (Sweden)

    Morten Lund

    2014-08-01

    Purpose: Existing frameworks for understanding and analyzing the value configuration and structuring of partnerships in relation to such network-based business models are found to be inferior. The purpose of this paper is therefore to broaden our understanding of how business models may change over time and how the role of strategic partners may differ over time too. Design/methodology/approach: A longitudinal case study spanning multiple years and mobilising multiple qualitative methods such as interviews, observation and participative observation forms the basis of the data collection. Findings: This paper illustrates how a network-based business model arises and evolves and how the forces of a network structure impact the development of its partner relationships. The contribution of this article is to the understanding of how partners positioned around a business model can be organized into a network-based business model that generates additional value for the core business model and for both the partners and the customers. Research limitations/implications: The results should be taken with caution as they are based on the case study of a single network-based business model. Practical implications: Managers can gain insight into barriers and enablers relating to different types of loose organisations and how to best manage such relationships and interactions. Originality/value: This study adds value to the existing literature by reflecting the dynamics created in the interactions between a business model's strategic partners and how a business model can evolve in a series of distinct phases.

  12. Experimental and Computer Modelling Studies of Metastability of Amorphous Silicon Based Solar Cells

    NARCIS (Netherlands)

    Munyeme, Geoffrey

    2003-01-01

    We present a combination of experimental and computer modelling studies of the light-induced degradation in the performance of amorphous silicon based single junction solar cells. Of particular interest in this study are the degradation kinetics of different types of amorphous silicon single junction

  13. The Influence of Study-Level Inference Models and Study Set Size on Coordinate-Based fMRI Meta-Analyses

    Directory of Open Access Journals (Sweden)

    Han Bossier

    2018-01-01

    Given the increasing number of neuroimaging studies, there is a growing need to summarize published results. Coordinate-based meta-analyses use the locations of statistically significant local maxima, possibly with the associated effect sizes, to aggregate studies. In this paper, we investigate the influence of key characteristics of a coordinate-based meta-analysis on (1) the balance between false and true positives and (2) the activation reliability of the outcome from a coordinate-based meta-analysis. More particularly, we consider the influence of the chosen group-level model at the study level [fixed effects, ordinary least squares (OLS), or mixed effects models], the type of coordinate-based meta-analysis [Activation Likelihood Estimation (ALE), which only uses peak locations, versus fixed effects and random effects meta-analyses that take into account both peak location and height] and the number of studies included in the analysis (from 10 to 35). To do this, we apply a resampling scheme on a large dataset (N = 1,400) to create a test condition and compare this with an independent evaluation condition. The test condition corresponds to subsampling participants into studies and combining these using meta-analyses. The evaluation condition corresponds to a high-powered group analysis. We observe the best performance when using mixed effects models in individual studies combined with a random effects meta-analysis. Moreover, the performance increases with the number of studies included in the meta-analysis. When peak height is not taken into consideration, we show that the popular ALE procedure is a good alternative in terms of the balance between type I and II errors. However, it requires more studies compared to other procedures in terms of activation reliability. Finally, we discuss the differences, interpretations, and limitations of our results.

  14. Collaborative Model-based Systems Engineering for Cyber-Physical Systems, with a Building Automation Case Study

    DEFF Research Database (Denmark)

    Fitzgerald, John; Gamble, Carl; Payne, Richard

    2016-01-01

    We describe an approach to the model-based engineering of cyber-physical systems that permits the coupling of diverse discrete-event and continuous-time models and their simulators. A case study in the building automation domain demonstrates how such co-models and co-simulation can promote early...

  15. The Use of Modeling-Based Text to Improve Students' Modeling Competencies

    Science.gov (United States)

    Jong, Jing-Ping; Chiu, Mei-Hung; Chung, Shiao-Lan

    2015-01-01

    This study investigated the effects of a modeling-based text on 10th graders' modeling competencies. Fifteen 10th graders read a researcher-developed modeling-based science text on the ideal gas law that included explicit descriptions and representations of modeling processes (i.e., model selection, model construction, model validation, model…

  16. Measurement error in epidemiologic studies of air pollution based on land-use regression models.

    Science.gov (United States)

    Basagaña, Xavier; Aguilera, Inmaculada; Rivera, Marcela; Agis, David; Foraster, Maria; Marrugat, Jaume; Elosua, Roberto; Künzli, Nino

    2013-10-15

    Land-use regression (LUR) models are increasingly used to estimate air pollution exposure in epidemiologic studies. These models use air pollution measurements taken at a small set of locations and modeling based on geographical covariates for which data are available at all study participant locations. The process of LUR model development commonly includes a variable selection procedure. When LUR model predictions are used as explanatory variables in a model for a health outcome, measurement error can lead to bias of the regression coefficients and to inflation of their variance. In previous studies dealing with spatial predictions of air pollution, bias was shown to be small while most of the effect of measurement error was on the variance. In this study, we show that in realistic cases where LUR models are applied to health data, bias in health-effect estimates can be substantial. This bias depends on the number of air pollution measurement sites, the number of available predictors for model selection, and the amount of explainable variability in the true exposure. These results should be taken into account when interpreting health effects from studies that used LUR models.
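    A small Monte Carlo sketch of the paper's point, with an entirely synthetic setup: an LUR model fitted on only a few monitoring sites with several candidate predictors yields noisy predicted exposures, and regressing a health outcome on those predictions biases the effect estimate:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
BETA = 0.5                           # true health effect per unit exposure

biases = []
for _ in range(200):
    # True exposure driven by one geographic covariate out of many candidates
    G = rng.normal(size=(2000, 10))              # candidate predictors
    x_true = G[:, 0] + 0.5 * rng.normal(size=2000)

    # LUR fitted on only 20 monitoring sites, using all 10 predictors
    sites = rng.choice(2000, 20, replace=False)
    lur = LinearRegression().fit(G[sites], x_true[sites])
    x_hat = lur.predict(G)                       # predicted exposure, everyone

    # Health outcome regressed on the predicted exposure
    y = BETA * x_true + rng.normal(size=2000)
    beta_hat = LinearRegression().fit(x_hat.reshape(-1, 1), y).coef_[0]
    biases.append(beta_hat - BETA)

print(f"mean bias in health-effect estimate: {np.mean(biases):+.3f}")
```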

  17. Model-based machine learning.

    Science.gov (United States)

    Bishop, Christopher M

    2013-02-13

    Several decades of research in the field of machine learning have resulted in a multitude of different algorithms for solving a broad range of problems. To tackle a new application, a researcher typically tries to map their problem onto one of these existing methods, often influenced by their familiarity with specific algorithms and by the availability of corresponding software implementations. In this study, we describe an alternative methodology for applying machine learning, in which a bespoke solution is formulated for each new application. The solution is expressed through a compact modelling language, and the corresponding custom machine learning code is then generated automatically. This model-based approach offers several major advantages, including the opportunity to create highly tailored models for specific scenarios, as well as rapid prototyping and comparison of a range of alternative models. Furthermore, newcomers to the field of machine learning do not have to learn about the huge range of traditional methods, but instead can focus their attention on understanding a single modelling environment. In this study, we show how probabilistic graphical models, coupled with efficient inference algorithms, provide a very flexible foundation for model-based machine learning, and we outline a large-scale commercial application of this framework involving tens of millions of users. We also describe the concept of probabilistic programming as a powerful software environment for model-based machine learning, and we discuss a specific probabilistic programming language called Infer.NET, which has been widely used in practical applications.
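    A minimal Python analogue of the model-based idea (Infer.NET itself is a .NET framework, so this is only illustrative): declare the generative model as a prior plus a likelihood, then hand both to one generic inference routine rather than a hand-picked algorithm:

```python
import numpy as np

# A model-based sketch (not Infer.NET): the "model" is just the prior and
# likelihood; inference is a single generic grid-based routine.
def prior(theta):                      # success probability in [0, 1]
    return np.ones_like(theta)         # uniform prior

def likelihood(theta, data):           # Bernoulli observations
    k, n = data.sum(), data.size
    return theta**k * (1 - theta) ** (n - k)

def grid_posterior(prior, likelihood, data, grid):
    p = prior(grid) * likelihood(grid, data)
    return p / np.trapz(p, grid)       # normalise numerically

grid = np.linspace(1e-6, 1 - 1e-6, 1000)
data = np.array([1, 0, 1, 1, 1, 0, 1])         # hypothetical outcomes
post = grid_posterior(prior, likelihood, data, grid)
print("posterior mean:", round(np.trapz(grid * post, grid), 3))
```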

  18. Model-free and model-based reward prediction errors in EEG.

    Science.gov (United States)

    Sambrook, Thomas D; Hardwick, Ben; Wills, Andy J; Goslin, Jeremy

    2018-05-24

    Learning theorists posit two reinforcement learning systems: model-free and model-based. Model-based learning incorporates knowledge about structure and contingencies in the world to assign candidate actions with an expected value. Model-free learning is ignorant of the world's structure; instead, actions hold a value based on prior reinforcement, with this value updated by expectancy violation in the form of a reward prediction error. Because they use such different learning mechanisms, it has been previously assumed that model-based and model-free learning are computationally dissociated in the brain. However, recent fMRI evidence suggests that the brain may compute reward prediction errors to both model-free and model-based estimates of value, signalling the possibility that these systems interact. Because of its poor temporal resolution, fMRI risks confounding reward prediction errors with other feedback-related neural activity. In the present study, EEG was used to show the presence of both model-based and model-free reward prediction errors and their place in a temporal sequence of events including state prediction errors and action value updates. This demonstration of model-based prediction errors questions a long-held assumption that model-free and model-based learning are dissociated in the brain.
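    A hedged sketch of the two systems on a Daw-style two-step task, the canonical setting for dissociating them (transition and reward probabilities here are invented): the model-free learner updates cached values from reward prediction errors, while the model-based learner derives values from a learned world model:

```python
import numpy as np

rng = np.random.default_rng(6)

# Two-step task sketch: action 0 commonly (p=0.7) leads to state 0,
# action 1 to state 1; each second-stage state pays reward at its own rate.
T = np.array([[0.7, 0.3],        # P(second-stage state | action)
              [0.3, 0.7]])
P_REWARD = np.array([0.9, 0.1])  # reward probability in each state
ALPHA = 0.2

q_mf = np.zeros(2)               # model-free cached first-stage values
q2 = np.zeros(2)                 # learned second-stage state values

for t in range(1000):
    a = rng.integers(2)
    s2 = rng.choice(2, p=T[a])
    r = float(rng.random() < P_REWARD[s2])

    q2[s2] += ALPHA * (r - q2[s2])          # second-stage learning

    rpe_mf = r - q_mf[a]                    # model-free RPE ignores structure
    q_mf[a] += ALPHA * rpe_mf

    q_mb = T @ q2                           # model-based values use T
    rpe_mb = r - q_mb[a]                    # model-based RPE

print("model-free Q1: ", np.round(q_mf, 2))
print("model-based Q1:", np.round(T @ q2, 2))
```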

  19. Model-Based Motion Tracking of Infants

    DEFF Research Database (Denmark)

    Olsen, Mikkel Damgaard; Herskind, Anna; Nielsen, Jens Bo

    2014-01-01

    Even though motion tracking is a widely used technique to analyze and measure human movements, only a few studies focus on motion tracking of infants. In recent years, a number of studies have emerged focusing on analyzing the motion pattern of infants using computer vision. Most of these studies are based on 2D images, but few are based on 3D information. In this paper, we present a model-based approach for tracking infants in 3D. The study extends a novel study on graph-based motion tracking of infants and we show that the extension improves the tracking results. A 3D model is constructed …

  20. An Empirical Rate Constant Based Model to Study Capacity Fading in Lithium Ion Batteries

    Directory of Open Access Journals (Sweden)

    Srivatsan Ramesh

    2015-01-01

    A one-dimensional model based on solvent diffusion and kinetics to study the formation of the SEI (solid electrolyte interphase) layer and its impact on the capacity of a lithium-ion battery is developed. The model uses earlier work on silicon oxidation but studies the kinetic limitations of the SEI growth process. The rate constant of the SEI formation reaction at the anode is seen to play a major role in film formation. The kinetics of the capacity-fading reactions for various battery systems are studied and the rate constants are evaluated. The model is used to fit the capacity fade in different battery systems.
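    A sketch of the generic mechanism described, assuming the standard simplified SEI growth law in which the solvent flux is limited jointly by reaction kinetics and by diffusion through the film; all parameters are invented, and the near-sqrt(t) late-time film growth is what drives the capacity fade:

```python
import numpy as np

# Simplified SEI growth: flux through the film is limited by both reaction
# kinetics (rate constant K) and solvent diffusion (D over thickness L):
#   dL/dt = c0 / (1/K + L/D)   (quasi-steady combination, illustrative)
K, D, C0 = 1e-9, 1e-15, 1.0     # rate constant, diffusivity, scaled conc.
DT, STEPS = 100.0, 200_000      # time step (s), number of steps

L = 0.0
thickness = []
for _ in range(STEPS):
    L += DT * C0 / (1.0 / K + L / D)
    thickness.append(L)

t = DT * np.arange(1, STEPS + 1)
# Capacity loss is proportional to lithium consumed, i.e. to film thickness;
# the late-time log-log slope should approach the diffusion-limited 0.5.
slope = np.polyfit(np.log(t[-1000:]), np.log(thickness[-1000:]), 1)[0]
print("late-time growth exponent ~", round(slope, 2))
```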

  1. A comparative study of independent particle model based ...

    Indian Academy of Sciences (India)

    We find that among these three independent particle model based methods, the ss-VSCF method provides the most accurate thermal averages, followed by t-SCF, with v-VSCF being the least accurate. However, the ss-VSCF is found to be computationally very expensive for large molecules. The t-SCF gives ...

  2. Agent-based modeling of sustainable behaviors

    CERN Document Server

    Sánchez-Maroño, Noelia; Fontenla-Romero, Oscar; Polhill, J; Craig, Tony; Bajo, Javier; Corchado, Juan

    2017-01-01

    Using the O.D.D. (Overview, Design concepts, Detail) protocol, this title explores the role of agent-based modeling in predicting the feasibility of various approaches to sustainability. The chapters incorporated in this volume consist of real case studies to illustrate the utility of agent-based modeling and complexity theory in discovering a path to more efficient and sustainable lifestyles. The topics covered within include: households' attitudes toward recycling, designing decision trees for representing sustainable behaviors, negotiation-based parking allocation, auction-based traffic signal control, and others. This selection of papers will be of interest to social scientists who wish to learn more about agent-based modeling as well as experts in the field of agent-based modeling.

  3. Building a semantic web-based metadata repository for facilitating detailed clinical modeling in cancer genome studies.

    Science.gov (United States)

    Sharma, Deepak K; Solbrig, Harold R; Tao, Cui; Weng, Chunhua; Chute, Christopher G; Jiang, Guoqian

    2017-06-05

    Detailed Clinical Models (DCMs) have been regarded as the basis for retaining computable meaning when data are exchanged between heterogeneous computer systems. To better support clinical cancer data capturing and reporting, there is an emerging need to develop informatics solutions for standards-based clinical models in cancer study domains. The objective of the study is to develop and evaluate a cancer genome study metadata management system that serves as a key infrastructure in supporting clinical information modeling in cancer genome study domains. We leveraged a Semantic Web-based metadata repository enhanced with both ISO11179 metadata standard and Clinical Information Modeling Initiative (CIMI) Reference Model. We used the common data elements (CDEs) defined in The Cancer Genome Atlas (TCGA) data dictionary, and extracted the metadata of the CDEs using the NCI Cancer Data Standards Repository (caDSR) CDE dataset rendered in the Resource Description Framework (RDF). The ITEM/ITEM_GROUP pattern defined in the latest CIMI Reference Model is used to represent reusable model elements (mini-Archetypes). We produced a metadata repository with 38 clinical cancer genome study domains, comprising a rich collection of mini-Archetype pattern instances. We performed a case study of the domain "clinical pharmaceutical" in the TCGA data dictionary and demonstrated enriched data elements in the metadata repository are very useful in support of building detailed clinical models. Our informatics approach leveraging Semantic Web technologies provides an effective way to build a CIMI-compliant metadata repository that would facilitate the detailed clinical modeling to support use cases beyond TCGA in clinical cancer study domains.
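    A small rdflib sketch of the flavor of the ITEM/ITEM_GROUP pattern in RDF. The namespaces, property names and CDE identifier below are invented placeholders, not the actual CIMI or caDSR vocabularies:

```python
from rdflib import Graph, Literal, Namespace, RDF

# Placeholder namespaces -- NOT the official CIMI/caDSR vocabularies
CIMI = Namespace("http://example.org/cimi#")
EX = Namespace("http://example.org/tcga/")

g = Graph()
cde = EX["cde_therapy_type"]                 # hypothetical CDE identifier
group = EX["clinical_pharmaceutical"]        # hypothetical domain group

# ITEM/ITEM_GROUP pattern: a domain group contains reusable data elements
g.add((group, RDF.type, CIMI.ITEM_GROUP))
g.add((cde, RDF.type, CIMI.ITEM))
g.add((cde, CIMI.label, Literal("Therapy Type")))
g.add((group, CIMI.item, cde))

print(g.serialize(format="turtle"))
```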

  4. Discovering the Power of Individual-Based Modelling in Teaching and Learning: The Study of a Predator-Prey System

    Science.gov (United States)

    Ginovart, Marta

    2014-08-01

    The general aim is to promote the use of individual-based models (biological agent-based models) in teaching and learning contexts in the life sciences and to ease their progressive incorporation into academic curricula, complementing other modelling strategies more frequently used in the classroom. Modelling activities for the study of a predator-prey system were designed and implemented for a mathematics classroom in the first year of an undergraduate program in biosystems engineering. These activities were designed to put two modelling approaches side by side: an individual-based model and a set of ordinary differential equations. In order to organize and display this, a system with wolves and sheep in a confined domain was considered and studied. With the teaching material elaborated and a computer to perform the numerical solutions involved and the corresponding individual-based simulations, the students answered questions and completed exercises to achieve the learning goals set. Students' responses regarding the modelling of biological systems and these two distinct methodologies applied to the study of a predator-prey system were collected via questionnaires, open-ended queries and face-to-face dialogues. Taking into account the positive responses of the students to these activities, it was clear that using a discrete individual-based model to deal with a predator-prey system jointly with a set of ordinary differential equations enriches the understanding of the modelling process, adds new insights and opens novel perspectives on what can be done with computational models versus other models. The complementary views given by the two modelling approaches were highly valued by students.
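    The two approaches the activities placed side by side can be sketched compactly; the rates below are illustrative, and the individual-based version adds the demographic stochasticity (plus a crude carrying-capacity cap) that the deterministic ODE lacks:

```python
import numpy as np

rng = np.random.default_rng(7)
A, B, C, D = 0.1, 0.002, 0.002, 0.1   # illustrative Lotka-Volterra rates
STEPS, DT = 5000, 0.1

# Approach 1: ordinary differential equations (explicit Euler integration)
s, w = 100.0, 20.0                     # sheep, wolves
for _ in range(STEPS):
    ds = (A * s - B * s * w) * DT
    dw = (C * s * w - D * w) * DT
    s, w = s + ds, w + dw
print(f"ODE: sheep={s:.0f} wolves={w:.0f}")

# Approach 2: minimal stochastic individual-based version of the same rates
sheep, wolves = 100, 20
for _ in range(STEPS):
    born = rng.binomial(sheep, min(A * DT, 1))
    eaten = rng.binomial(sheep, min(B * wolves * DT, 1))
    w_born = rng.binomial(wolves, min(C * sheep * DT, 1))
    w_dead = rng.binomial(wolves, min(D * DT, 1))
    sheep = min(max(sheep + born - eaten, 0), 5_000)  # cap: crude capacity
    wolves = max(wolves + w_born - w_dead, 0)
print(f"IBM: sheep={sheep} wolves={wolves}")
```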

  5. Model based feasibility study on bidirectional check valves in wave energy converters

    DEFF Research Database (Denmark)

    Hansen, Anders Hedegaard; Pedersen, Henrik C.; Andersen, Torben Ole

    2014-01-01

    Discrete fluid power force systems have been proposed as the primary stage for Wave Energy Converters (WECs) when converting ocean waves into electricity, in order to improve the overall efficiency of wave energy devices. This paper presents a model based feasibility study of using bidirectional check valves … On/Off and bidirectional check valves. Based on the analysis it is found that the energy production may be slightly improved by using bidirectional check valves as compared to on/off valves, due to a decrease in switching losses. Furthermore, a reduction in high flow peaks is realised, the downside being increased …

  6. Method for mapping population-based case-control studies: an application using generalized additive models

    Directory of Open Access Journals (Sweden)

    Aschengrau Ann

    2006-06-01

    Background: Mapping spatial distributions of disease occurrence and risk can serve as a useful tool for identifying exposures of public health concern. Disease registry data are often mapped by town or county of diagnosis and contain limited data on covariates. These maps often possess poor spatial resolution, the potential for spatial confounding, and the inability to consider latency. Population-based case-control studies can provide detailed information on residential history and covariates. Results: Generalized additive models (GAMs) provide a useful framework for mapping point-based epidemiologic data. Smoothing on location while controlling for covariates produces adjusted maps. We generate maps of odds ratios using the entire study area as a reference. We smooth using a locally weighted regression smoother (loess), a method that combines the advantages of nearest neighbor and kernel methods. We choose an optimal degree of smoothing by minimizing Akaike's Information Criterion. We use a deviance-based test to assess the overall importance of location in the model and pointwise permutation tests to locate regions of significantly increased or decreased risk. The method is illustrated with synthetic data and data from a population-based case-control study, using S-Plus and ArcView software. Conclusion: Our goal is to develop practical methods for mapping population-based case-control and cohort studies. The method described here performs well for our synthetic data, reproducing important features of the data and adequately controlling the covariate. When applied to the population-based case-control data set, the method suggests spatial confounding and identifies statistically significant areas of increased and decreased odds ratios.
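
    As a rough illustration of the idea (not the authors' S-Plus implementation), the sketch below fits a loess-style locally weighted logistic regression on synthetic case-control locations while adjusting for one covariate; the kernel, span and all data are invented for the example.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
xy = rng.uniform(0, 10, size=(n, 2))             # residence locations
age = rng.normal(50, 10, size=n)                 # covariate to adjust for
risk = 0.5 * (xy[:, 0] > 5) + 0.02 * (age - 50)  # elevated risk in east half
case = (rng.uniform(size=n) < 1 / (1 + np.exp(0.5 - risk))).astype(int)

def local_log_odds(x0, y0, span=3.0):
    """Tricube-weighted (loess-like) logistic fit centred on (x0, y0)."""
    d = np.hypot(xy[:, 0] - x0, xy[:, 1] - y0) / span
    w = np.clip(1 - d**3, 0, None) ** 3 + 1e-9   # tricube kernel weights
    model = LogisticRegression().fit((age - age.mean())[:, None], case,
                                     sample_weight=w)
    return model.intercept_[0]                   # local log-odds at mean age

grid = [(x, y) for x in range(1, 10, 2) for y in range(1, 10, 2)]
surface = {g: local_log_odds(*g) for g in grid}
print("highest adjusted log-odds near:", max(surface, key=surface.get))
```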

  7. An Agent-Based Model for Studying Child Maltreatment and Child Maltreatment Prevention

    Science.gov (United States)

    Hu, Xiaolin; Puddy, Richard W.

    This paper presents an agent-based model that simulates the dynamics of child maltreatment and child maltreatment prevention. The developed model follows the principles of complex systems science and explicitly models a community and its families with multi-level factors and interconnections across the social ecology. This makes it possible to experiment with how different factors and prevention strategies can affect the rate of child maltreatment. We present the background of this work, give an overview of the agent-based model, and show some simulation results.

  8. Study of the attractor structure of an agent-based sociological model

    Energy Technology Data Exchange (ETDEWEB)

    Timpanaro, Andre M; Prado, Carmen P C, E-mail: timpa@if.usp.br, E-mail: prado@if.usp.br [Instituto de Fisica da Universidade de Sao Paulo, Sao Paulo (Brazil)

    2011-03-01

    The Sznajd model is a sociophysics model based on the Potts model and used for describing opinion propagation in a society. It employs an agent-based approach and interaction rules favouring pairs of agreeing agents. It has been successfully employed in modeling some properties and scale features of both proportional and majority elections (see for instance the works of A. T. Bernardes and R. N. Costa Filho), but its stationary states are always consensus states. In order to explain more complicated behaviours, we have modified the bounded confidence idea (introduced before in other opinion models, like the Deffuant model) with the introduction of prejudices and biases (we called this modification confidence rules), and have adapted it to the discrete Sznajd model. This generalized Sznajd model is able to reproduce almost all of the previous versions of the Sznajd model by using appropriate choices of parameters. We solved the attractor structure of the resulting model in a mean-field approach and made Monte Carlo simulations on a Barabasi-Albert network. These simulations show great similarities with the mean-field results for the tested cases of 3 and 4 opinions. The dynamical systems approach that we devised allows for a deeper understanding of the potential of the Sznajd model as an opinion propagation model and can be easily extended to other models, like the voter model. Our modification of the bounded confidence rule can also be readily applied to other opinion propagation models.
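
    A bare-bones Monte Carlo sketch of a Sznajd-type update rule on a Barabasi-Albert network is shown below; it omits the paper's confidence rules and prejudices, and the network size, opinion count and iteration budget are invented, but it illustrates the agreeing-pair dynamics that drive the model toward consensus.

```python
import random
import networkx as nx

random.seed(1)
G = nx.barabasi_albert_graph(n=500, m=3)
nodes = list(G)
opinion = {v: random.randrange(3) for v in nodes}   # three opinions

for _ in range(50_000):
    i = random.choice(nodes)
    j = random.choice(list(G[i]))                   # a neighbour of i
    if opinion[i] == opinion[j]:                    # agreeing pair convinces
        for k in set(G[i]) | set(G[j]):             # ...their neighbourhoods
            opinion[k] = opinion[i]

shares = [sum(1 for v in nodes if opinion[v] == o) for o in range(3)]
print("opinion shares:", shares)                    # typically near consensus
```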

  9. A predictive coding account of bistable perception - a model-based fMRI study.

    Science.gov (United States)

    Weilnhammer, Veith; Stuke, Heiner; Hesselmann, Guido; Sterzer, Philipp; Schmack, Katharina

    2017-05-01

    In bistable vision, subjective perception wavers between two interpretations of a constant ambiguous stimulus. This dissociation between conscious perception and sensory stimulation has motivated various empirical studies on the neural correlates of bistable perception, but the neurocomputational mechanism behind endogenous perceptual transitions has remained elusive. Here, we recurred to a generic Bayesian framework of predictive coding and devised a model that casts endogenous perceptual transitions as a consequence of prediction errors emerging from residual evidence for the suppressed percept. Data simulations revealed close similarities between the model's predictions and key temporal characteristics of perceptual bistability, indicating that the model was able to reproduce bistable perception. Fitting the predictive coding model to behavioural data from an fMRI experiment on bistable perception, we found a correlation across participants between the model parameter encoding perceptual stabilization and the behaviourally measured frequency of perceptual transitions, corroborating that the model successfully accounted for participants' perception. Formal model comparison with established models of bistable perception based on mutual inhibition and adaptation, noise or a combination of adaptation and noise was used for the validation of the predictive coding model against the established models. Most importantly, model-based analyses of the fMRI data revealed that prediction error time-courses derived from the predictive coding model correlated with neural signal time-courses in bilateral inferior frontal gyri and anterior insulae. Voxel-wise model selection indicated a superiority of the predictive coding model over conventional analysis approaches in explaining neural activity in these frontal areas, suggesting that frontal cortex encodes prediction errors that mediate endogenous perceptual transitions in bistable perception. Taken together, our current work …
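
    The following toy simulation is a loose illustration of the verbal mechanism, not the authors' fitted model: residual evidence for the suppressed percept accumulates as a prediction error, and a transition occurs once it outweighs a stabilisation parameter. All constants are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)
stabilisation = 1.0            # arbitrary stabilisation level of the percept
percept, error = +1, 0.0
durations, t_last = [], 0

for t in range(1, 20_000):
    # half the constant ambiguous input supports the suppressed percept
    # and stays unexplained, feeding the prediction error
    error += 0.025 + 0.02 * rng.standard_normal()
    if error > stabilisation:                       # error wins: transition
        percept, error = -percept, 0.0
        durations.append(t - t_last)
        t_last = t

print(f"{len(durations)} transitions, "
      f"mean dominance {np.mean(durations):.1f} steps")
```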

  11. Modeling and Sensitivity Study of Consensus Algorithm-Based Distributed Hierarchical Control for DC Microgrids

    DEFF Research Database (Denmark)

    Meng, Lexuan; Dragicevic, Tomislav; Roldan Perez, Javier

    2016-01-01

    Distributed control methods based on consensus algorithms have become popular in recent years for microgrid (MG) systems. These kinds of algorithms can be applied to share information in order to coordinate multiple distributed generators within a MG. However, stability analysis becomes a challenge … in the communication network, continuous-time methods can be inaccurate for this kind of dynamic study. Therefore, this paper aims at modeling a complete DC MG using a discrete-time approach in order to perform a sensitivity analysis taking into account the effects of the consensus algorithm. To this end, a generalized modeling method is proposed and the influence of key control parameters, the communication topology and the communication speed are studied in detail. The theoretical results obtained with the proposed model are verified by comparing them with the results obtained with a detailed switching …
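
    The core consensus iteration that such a discrete-time analysis starts from can be written in a few lines. The sketch below uses an invented four-node ring topology and gain; it is not the paper's full DC microgrid model.

```python
import numpy as np

A = np.array([[0, 1, 0, 1],                      # ring of four units
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A                   # graph Laplacian
eps = 0.2                                        # gain x sampling period

x = np.array([48.0, 50.5, 49.0, 51.5])           # e.g. local voltage estimates
for _ in range(100):
    x = x - eps * (L @ x)                        # x[k+1] = (I - eps L) x[k]
print(x)                                         # -> consensus at mean 49.75
```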

  12. Correlation between the model accuracy and model-based SOC estimation

    International Nuclear Information System (INIS)

    Wang, Qianqian; Wang, Jiao; Zhao, Pengju; Kang, Jianqiang; Yan, Few; Du, Changqing

    2017-01-01

    State-of-charge (SOC) estimation is a core technology for battery management systems. Considerable progress has been achieved in the study of SOC estimation algorithms, especially algorithms based on the Kalman filter, to meet the increasing demand of model-based battery management systems. The Kalman filter weakens the influence of white noise and initial error during SOC estimation but cannot eliminate the inherent error of the battery model itself. As such, the accuracy of SOC estimation is directly related to the accuracy of the battery model. Thus far, the quantitative relationship between model accuracy and model-based SOC estimation remains unknown. This study summarizes three equivalent circuit lithium-ion battery models, namely, the Thevenin, PNGV, and DP models. The model parameters are identified through a hybrid pulse power characterization test. The three models are evaluated, and the SOC estimation conducted by the EKF-Ah method under three operating conditions is quantitatively studied. The regression and correlation of the standard deviation and normalized RMSE are studied and compared between the model error and the SOC estimation error. These parameters exhibit a strong linear relationship. Results indicate that the model accuracy affects the SOC estimation accuracy mainly in two ways: the dispersion of the frequency distribution of the error and the overall level of the error. On the basis of the relationship between model error and SOC estimation error, our study provides a strategy for selecting a suitable cell model to meet the requirements of SOC precision using the Kalman filter.
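
    For readers unfamiliar with the EKF-Ah idea, the following stripped-down sketch combines Ah (Coulomb) counting with an extended Kalman filter correction on a simplified Thevenin-style model (the RC branch is dropped for brevity). The linear OCV curve and all constants are assumptions made here for illustration only.

```python
import numpy as np

Q_cap, R0, dt = 2.0 * 3600, 0.05, 1.0            # capacity [As], ohms, s
ocv = lambda soc: 3.0 + 1.2 * soc                # toy linearised OCV(SOC)

rng = np.random.default_rng(7)
soc_true, soc_hat, P = 0.9, 0.6, 0.1             # deliberately wrong start
Qn, Rn = 1e-7, 1e-3                              # process/measurement noise

for _ in range(600):
    i = 1.0                                      # 1 A discharge
    soc_true -= i * dt / Q_cap
    v_meas = ocv(soc_true) - R0 * i + 0.01 * rng.standard_normal()

    soc_hat -= i * dt / Q_cap                    # predict (Ah counting, F = 1)
    P += Qn
    H = 1.2                                      # d OCV / d SOC
    K = P * H / (H * P * H + Rn)                 # Kalman gain
    soc_hat += K * (v_meas - (ocv(soc_hat) - R0 * i))
    P *= (1 - K * H)

print(f"true SOC {soc_true:.3f}, estimated {soc_hat:.3f}")
```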

  13. Incident Duration Modeling Using Flexible Parametric Hazard-Based Models

    Directory of Open Access Journals (Sweden)

    Ruimin Li

    2014-01-01

    Assessing and prioritizing the duration time and effects of traffic incidents on major roads present significant challenges for road network managers. This study examines the effect of numerous factors associated with various types of incidents on their duration and proposes an incident duration prediction model. Several parametric accelerated failure time hazard-based models were examined, including Weibull, log-logistic, log-normal, and generalized gamma, as well as all models with gamma heterogeneity and flexible parametric hazard-based models with degrees of freedom ranging from one to ten, by analyzing a traffic incident dataset obtained from the Incident Reporting and Dispatching System in Beijing in 2008. Results show that different factors significantly affect different incident time phases, whose best-fitting distributions were diverse. Given the best hazard-based models of each incident time phase, the prediction results are reasonable for most incidents. The results of this study can aid traffic incident management agencies not only in implementing strategies that would reduce incident duration, and thus reduce congestion, secondary incidents, and the associated human and economic losses, but also in effectively predicting incident duration time.
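
    As a hedged illustration of one candidate from this model family, the snippet below fits a Weibull accelerated-failure-time model to synthetic incident durations using the lifelines library (assumed available); the covariates and coefficients are invented, not taken from the Beijing dataset.

```python
import numpy as np
import pandas as pd
from lifelines import WeibullAFTFitter

rng = np.random.default_rng(5)
n = 400
df = pd.DataFrame({
    "injury": rng.integers(0, 2, n),             # incident involved injury
    "lanes_blocked": rng.integers(0, 3, n),
})
scale = np.exp(3.0 + 0.4 * df["injury"] + 0.2 * df["lanes_blocked"])
df["duration"] = scale * rng.weibull(1.5, n)     # minutes
df["observed"] = 1                               # no censoring in this toy

aft = WeibullAFTFitter().fit(df, duration_col="duration",
                             event_col="observed")
aft.print_summary()                              # acceleration factors
```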

  14. Technological progress and effects of (supra) regional innovation and production collaboration. An agent-based model simulation study.

    NARCIS (Netherlands)

    Vermeulen, B.; Pyka, A.; Serguieva, A.; Maringer, D.; Palade, V.; Almeida, R.J.

    2014-01-01

    We provide a novel technology development model in which economic agents search for transformations to build artifacts. Using this technology development model, we conduct an agent-based model simulation study on the effect of (supra-)regional collaboration in production and innovation on …

  15. Parametric Study of Synthetic-Jet-Based Flow Control on a Vertical Tail Model

    Science.gov (United States)

    Monastero, Marianne; Lindstrom, Annika; Beyar, Michael; Amitay, Michael

    2015-11-01

    Separation control over the rudder of the vertical tail of a commercial airplane using synthetic-jet-based flow control can lead to a reduction in tail size, with an associated decrease in drag and increase in fuel savings. A parametric, experimental study was undertaken using an array of finite span synthetic jets to investigate the sensitivity of the enhanced vertical tail side force to jet parameters, such as jet spanwise spacing and jet momentum coefficient. A generic wind tunnel model was designed and fabricated to fundamentally study the effects of the jet parameters at varying rudder deflection and model sideslip angles. Wind tunnel results obtained from pressure measurements and tuft flow visualization in the Rensselaer Polytechnic Subsonic Wind Tunnel show a decrease in separation severity and increase in model performance in comparison to the baseline, non-actuated case. The sensitivity to various parameters will be presented.

  16. Unifying Model-Based and Reactive Programming within a Model-Based Executive

    Science.gov (United States)

    Williams, Brian C.; Gupta, Vineet; Norvig, Peter (Technical Monitor)

    1999-01-01

    Real-time, model-based, deduction has recently emerged as a vital component in AI's tool box for developing highly autonomous reactive systems. Yet one of the current hurdles towards developing model-based reactive systems is the number of methods simultaneously employed, and their corresponding melange of programming and modeling languages. This paper offers an important step towards unification. We introduce RMPL, a rich modeling language that combines probabilistic, constraint-based modeling with reactive programming constructs, while offering a simple semantics in terms of hidden state Markov processes. We introduce probabilistic, hierarchical constraint automata (PHCA), which allow Markov processes to be expressed in a compact representation that preserves the modularity of RMPL programs. Finally, a model-based executive, called Reactive Burton, is described that exploits this compact encoding to perform efficient simulation, belief state update and control sequence generation.

  17. Prospects of Source-Separation-Based Sanitation Concepts: A Model-Based Study

    Directory of Open Access Journals (Sweden)

    Cees Buisman

    2013-07-01

    Separation of different domestic wastewater streams and targeted on-site treatment for resource recovery has been recognized as one of the most promising sanitation concepts to re-establish the balance in carbon, nutrient and water cycles. In this study a model was developed based on literature data to compare the energy and water balance, nutrient recovery, chemical use, effluent quality and land area requirement of four different sanitation concepts: (1) centralized; (2) centralized with source-separation of urine; (3) source-separation of black water, kitchen refuse and grey water; and (4) source-separation of urine, feces, kitchen refuse and grey water. The highest primary energy consumption of 914 MJ/cap/year was attained within the centralized sanitation concept, and the lowest primary energy consumption of 437 MJ/cap/year was attained within source-separation of urine, feces, kitchen refuse and grey water. Grey water bio-flocculation and subsequent grey water sludge co-digestion decreased the primary energy consumption, but was not energetically favorable to couple with grey water effluent reuse. Source-separation of urine improved the energy balance, nutrient recovery and effluent quality, but required a larger land area and higher chemical use than the centralized concept.

  18. Injury Based on Its Study in Experimental Models

    Directory of Open Access Journals (Sweden)

    M. Mendes-Braz

    2012-01-01

    The present review focuses on the numerous experimental models used to study the complexity of hepatic ischemia/reperfusion (I/R) injury. Although experimental models of hepatic I/R injury represent a compromise between clinical reality and experimental simplification, the clinical transfer of experimental results is problematic because of anatomical and physiological differences and the inevitable simplification of experimental work. In this review, the strengths and limitations of the various models of hepatic I/R are discussed. Several strategies to protect the liver from I/R injury have been developed in animal models, and some of these might find their way into clinical practice. We also attempt to highlight the fact that the mechanisms responsible for hepatic I/R injury depend on the experimental model used, and therefore the therapeutic strategies also differ according to the model used. The choice of model must therefore be adapted to the clinical question being answered.

  19. Néron Models and Base Change

    DEFF Research Database (Denmark)

    Halle, Lars Halvard; Nicaise, Johannes

    Presenting the first systematic treatment of the behavior of Néron models under ramified base change, this book can be read as an introduction to various subtle invariants and constructions related to Néron models of semi-abelian varieties, motivated by concrete research problems and complemented … on Néron component groups, Edixhoven's filtration and the base change conductor of Chai and Yu, and we study these invariants using various techniques such as models of curves, sheaves on Grothendieck sites and non-archimedean uniformization. We then apply our results to the study of motivic zeta functions …

  20. Environmental Sound Perception: Metadescription and Modeling Based on Independent Primary Studies

    Directory of Open Access Journals (Sweden)

    Stephen McAdams

    2010-01-01

    The aim of the study is to transpose and extend to a set of environmental sounds the notion of sound descriptors usually used for musical sounds. Four separate primary studies dealing with interior car sounds, air-conditioning units, car horns, and closing car doors are considered collectively. The corpus formed by these initial stimuli is submitted to new experimental studies and analyses, both for revealing metacategories and for defining more precisely the limits of each of the resulting categories. In a second step, the new structure is modeled: common and specific dimensions within each category are derived from the initial results and new investigations of audio features are performed. Furthermore, an automatic classifier based on two audio descriptors and a multinomial logistic regression procedure is implemented and validated with the corpus.
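
    A minimal re-creation of that final step might look as follows: a multinomial logistic classifier over two descriptors, with synthetic feature values standing in for the audio descriptors identified in the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
classes = ["car interior", "air conditioner", "horn", "door"]
X = np.vstack([rng.normal(loc=[c, -c], scale=0.6, size=(50, 2))
               for c in range(4)])               # two descriptors per sound
y = np.repeat(np.arange(4), 50)

clf = LogisticRegression(max_iter=500).fit(X, y) # multinomial by default
print("training accuracy:", clf.score(X, y))
print("predicted category:", classes[clf.predict([[2.1, -1.9]])[0]])
```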

  1. A pedagogical model for simulation-based learning in healthcare

    Directory of Open Access Journals (Sweden)

    Tuulikki Keskitalo

    2015-11-01

    The aim of this study was to design a pedagogical model for a simulation-based learning environment (SBLE) in healthcare. Currently, simulation and virtual reality are a major focus in healthcare education. However, when and how these learning environments should be applied is not well-known. The present study tries to fill that gap. We pose the following research question: What kind of pedagogical model supports and facilitates students' meaningful learning in SBLEs? The study used design-based research (DBR) and case study approaches. We report the results from our second case study and how the pedagogical model was developed based on the lessons learned. The study involved nine facilitators and 25 students. Data were collected and analysed using mixed methods. The main result of this study is the refined pedagogical model. The model is based on the socio-cultural theory of learning and characteristics of meaningful learning as well as previous pedagogical models. The model will provide a more holistic and meaningful approach to teaching and learning in SBLEs. However, the model requires evidence and further development.

  2. Investigating the consistency between proxy-based reconstructions and climate models using data assimilation: a mid-Holocene case study

    NARCIS (Netherlands)

    A. Mairesse; H. Goosse; P. Mathiot; H. Wanner; S. Dubinkina (Svetlana)

    2013-01-01

    The mid-Holocene (6 kyr BP; thousand years before present) is a key period to study the consistency between model results and proxy-based reconstruction data, as it corresponds to a standard test for models and a reasonable number of proxy-based records is available. Taking advantage of …

  3. Event-based model diagnosis of rainfall-runoff model structures

    International Nuclear Information System (INIS)

    Stanzel, P.

    2012-01-01

    The objective of this research is a comparative evaluation of different rainfall-runoff model structures. Comparative model diagnostics facilitate the assessment of strengths and weaknesses of each model. The application of multiple models allows an analysis of simulation uncertainties arising from the selection of model structure, as compared with effects of uncertain parameters and precipitation input. Four different model structures, including conceptual and physically based approaches, are compared. In addition to runoff simulations, results for soil moisture and the runoff components of overland flow, interflow and base flow are analysed. Catchment runoff is simulated satisfactorily by all four model structures and shows only minor differences. Systematic deviations from runoff observations provide insight into model structural deficiencies. While physically based model structures capture some single runoff events better, they do not generally outperform conceptual model structures. Contributions to uncertainty in runoff simulations stemming from the choice of model structure show similar dimensions to those arising from parameter selection and the representation of precipitation input. Variations in precipitation mainly affect the general level and peaks of runoff, while different model structures lead to different simulated runoff dynamics. Large differences between the four analysed models are detected for simulations of soil moisture and, even more pronounced, runoff components. Soil moisture changes are more dynamical in the physically based model structures, which is in better agreement with observations. Streamflow contributions of overland flow are considerably lower in these models than in the more conceptual approaches. Observations of runoff components are rarely made and are not available in this study, but are shown to have high potential for an effective selection of appropriate model structures.
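
    For orientation, a conceptual rainfall-runoff structure of the kind compared in such studies can be sketched in a few lines: one soil store splitting rainfall into overland flow, interflow and base flow through linear reservoirs. The parameters and forcing below are invented, not those of the study catchment.

```python
import numpy as np

rng = np.random.default_rng(4)
rain = rng.gamma(0.3, 8.0, 365)                  # daily rainfall [mm]

s, smax = 50.0, 120.0                            # soil moisture store [mm]
k_if, k_bf = 0.10, 0.01                          # interflow/baseflow rates
runoff = []
for p in rain:
    overland = max(0.0, s + p - smax)            # saturation excess
    s = min(smax, s + p)
    interflow, baseflow = k_if * s, k_bf * s     # linear reservoir outflows
    s -= interflow + baseflow
    runoff.append(overland + interflow + baseflow)

print(f"mean simulated runoff: {np.mean(runoff):.2f} mm/day")
```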

  4. Model-Based Knowing: How Do Students Ground Their Understanding About Climate Systems in Agent-Based Computer Models?

    Science.gov (United States)

    Markauskaite, Lina; Kelly, Nick; Jacobson, Michael J.

    2017-12-01

    This paper gives a grounded cognition account of model-based learning of complex scientific knowledge related to socio-scientific issues, such as climate change. It draws on the results from a study of high school students learning about the carbon cycle through computational agent-based models and investigates two questions: First, how do students ground their understanding about the phenomenon when they learn and solve problems with computer models? Second, what are common sources of mistakes in students' reasoning with computer models? Results show that students ground their understanding in computer models in five ways: direct observation, straight abstraction, generalisation, conceptualisation, and extension. Students also incorporate into their reasoning their knowledge and experiences that extend beyond phenomena represented in the models, such as attitudes about unsustainable carbon emission rates, human agency, external events, and the nature of computational models. The most common difficulties of the students relate to seeing the modelled scientific phenomenon and connecting results from the observations with other experiences and understandings about the phenomenon in the outside world. An important contribution of this study is the constructed coding scheme for establishing different ways of grounding, which helps to understand some challenges that students encounter when they learn about complex phenomena with agent-based computer models.

  5. The Application of FIA-based Data to Wildlife Habitat Modeling: A Comparative Study

    Science.gov (United States)

    Thomas C., Jr. Edwards; Gretchen G. Moisen; Tracey S. Frescino; Randall J. Schultz

    2005-01-01

    We evaluated the capability of two types of models, one based on spatially explicit variables derived from FIA data and one using so-called traditional habitat evaluation methods, for predicting the presence of cavity-nesting bird habitat in Fishlake National Forest, Utah. Both models performed equally well in measures of predictive accuracy, with the FIA-based model...

  6. Identification of walking human model using agent-based modelling

    Science.gov (United States)

    Shahabpoor, Erfan; Pavic, Aleksandar; Racic, Vitomir

    2018-03-01

    The interaction of walking people with large vibrating structures, such as footbridges and floors, in the vertical direction is an important yet challenging phenomenon to describe mathematically. Several different models have been proposed in the literature to simulate the interaction of stationary people with vibrating structures. However, research on moving (walking) human models, explicitly identified for vibration serviceability assessment of civil structures, is still sparse. In this study, the results of a comprehensive set of FRF-based modal tests were used, in which over a hundred test subjects walked in different group sizes and walking patterns on a test structure. An agent-based model was used to simulate discrete traffic-structure interactions. The occupied-structure modal parameters found in tests were used to identify the parameters of the walking individual's single-degree-of-freedom (SDOF) mass-spring-damper model using a 'reverse engineering' methodology. The analysis of the results suggested that a normal distribution with an average of μ = 2.85 Hz and a standard deviation of σ = 0.34 Hz can describe the human SDOF model natural frequency. Similarly, a normal distribution with μ = 0.295 and σ = 0.047 can describe the human model damping ratio. Compared to previous studies, the agent-based modelling methodology proposed in this paper offers significant flexibility in simulating multi-pedestrian walking traffic, external forces and different mechanisms of human-structure and human-environment interaction at the same time.
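
    Using the distributions reported above, one way such an SDOF pedestrian population could be instantiated is sketched below; the 70 kg modal mass is an assumption made here for illustration, not a value from the study.

```python
import numpy as np

rng = np.random.default_rng(6)
n_people = 100
f_n = rng.normal(2.85, 0.34, n_people)           # natural frequency [Hz]
zeta = rng.normal(0.295, 0.047, n_people)        # damping ratio
m = 70.0                                         # modal mass [kg] (assumed)

w_n = 2 * np.pi * f_n                            # circular frequency [rad/s]
k = m * w_n**2                                   # spring stiffness [N/m]
c = 2 * zeta * m * w_n                           # damping coefficient [Ns/m]

print(f"stiffness range: {k.min():.0f}-{k.max():.0f} N/m, "
      f"damping range: {c.min():.0f}-{c.max():.0f} Ns/m")
```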

  7. ANFIS-Based Modeling for Photovoltaic Characteristics Estimation

    Directory of Open Access Journals (Sweden)

    Ziqiang Bi

    2016-09-01

    Due to the high cost of photovoltaic (PV) modules, an accurate performance estimation method is significantly valuable for studying the electrical characteristics of PV generation systems. Conventional analytical PV models are usually composed of nonlinear exponential functions and a good number of unknown parameters that must be identified before use. In this paper, an adaptive-network-based fuzzy inference system (ANFIS) based modeling method is proposed to predict the current-voltage characteristics of PV modules. The effectiveness of the proposed modeling method is evaluated through comparison with Villalva's model, a radial basis function neural network (RBFNN) based model and a support vector regression (SVR) based model. Simulation and experimental results confirm both the feasibility and the effectiveness of the proposed method.

  8. Model-based control of the resistive wall mode in DIII-D: A comparison study

    International Nuclear Information System (INIS)

    Dalessio, J.; Schuster, E.; Humphreys, D.A.; Walker, M.L.; In, Y.; Kim, J.-S.

    2009-01-01

    One of the major non-axisymmetric instabilities under study in the DIII-D tokamak is the resistive wall mode (RWM), a form of plasma kink instability whose growth rate is moderated by the influence of a resistive wall. One of the approaches for RWM stabilization, referred to as magnetic control, uses feedback control to produce magnetic fields opposing the moving field that accompanies the growth of the mode. These fields are generated by coils arranged around the tokamak. One problem with RWM control methods used in present experiments is that they predominantly use simple non-model-based proportional-derivative (PD) controllers requiring substantial derivative gain for stabilization, which implies a large response to noise and perturbations, leading to a requirement for high peak voltages and coil currents, usually leading to actuation saturation and instability. Motivated by this limitation, current efforts in DIII-D include the development of model-based RWM controllers. The General Atomics (GA)/Far-Tech DIII-D RWM model represents the plasma surface as a toroidal current sheet and characterizes the wall using an eigenmode approach. Optimal and robust controllers have been designed exploiting the availability of the RWM dynamic model. The controllers are tested through simulations, and results are compared to present non-model-based PD controllers. This comparison also makes use of the μ structured singular value as a measure of robust stability and performance of the closed-loop system.

  9. Thermal conductivity model for powdered materials under vacuum based on experimental studies

    Directory of Open Access Journals (Sweden)

    N. Sakatani

    2017-01-01

    The thermal conductivity of powdered media is characteristically very low in vacuum, and is effectively dependent on many parameters of their constituent particles and packing structure. Understanding of the heat transfer mechanism within powder layers in vacuum and theoretical modeling of their thermal conductivity are of great importance for several scientific and engineering problems. In this paper, we report the results of systematic thermal conductivity measurements of powdered media of varied particle size, porosity, and temperature under vacuum using glass beads as a model material. Based on the obtained experimental data, we investigated the heat transfer mechanism in powdered media in detail, and constructed a new theoretical thermal conductivity model for the vacuum condition. This model enables an absolute thermal conductivity to be calculated for a powder with the input of a set of powder parameters including particle size, porosity, temperature, and compressional stress or gravity, and vice versa. Our model is expected to be a competent tool for several scientific and engineering fields of study related to powders, such as the thermal infrared observation of air-less planetary bodies, thermal evolution of planetesimals, and performance of thermal insulators and heat storage powders.
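
    As a generic illustration (not Sakatani et al.'s exact formulation), the sketch below sums the two channels such vacuum models typically combine: solid conduction through interparticle contacts and radiative transfer across pores, the latter scaling with particle size and T^3. The contact conductivity and emissivity are assumed values.

```python
import numpy as np

sigma_sb = 5.67e-8                               # Stefan-Boltzmann [W/m^2K^4]

def k_powder(T, d, k_contact=1e-3, emissivity=0.9):
    """Effective conductivity [W/m/K]; k_contact lumps the solid path."""
    k_rad = 4.0 * emissivity * sigma_sb * d * T**3   # radiation across pores
    return k_contact + k_rad

for T in (200.0, 300.0, 400.0):                  # temperatures [K]
    print(f"T = {T:.0f} K: k = {k_powder(T, d=1e-4):.2e} W/m/K")
```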

  10. Lithium-ion battery models: a comparative study and a model-based powerline communication

    Directory of Open Access Journals (Sweden)

    F. Saidani

    2017-09-01

    In this work, various Lithium-ion (Li-ion) battery models are evaluated according to their accuracy, complexity and physical interpretability. An initial classification into physical, empirical and abstract models is introduced. Also known as white, black and grey boxes, respectively, the nature and characteristics of these model types are compared. Since the Li-ion battery cell is a thermo-electro-chemical system, the models are either in the thermal or in the electrochemical state-space. Physical models attempt to capture key features of the physical process inside the cell. Empirical models describe the system with empirical parameters offering poor analytical insight, whereas abstract models provide an alternative representation. In addition, a model selection guideline is proposed based on applications and design requirements. A complex model with detailed analytical insight is of use for battery designers but impractical for real-time applications and in situ diagnosis. In automotive applications, an abstract model reproducing the battery behavior in an equivalent but more practical form, mainly as an equivalent circuit diagram, is recommended for the purpose of battery management. As a general rule, a trade-off should be reached between high fidelity and computational feasibility. Especially if the model is embedded in a real-time monitoring unit such as a microprocessor or an FPGA, the calculation time and memory requirements rise dramatically with a higher number of parameters. Moreover, examples of equivalent circuit models of Lithium-ion batteries are covered. Equivalent circuit topologies are introduced and compared according to the previously introduced criteria. An experimental sequence to model a 20 Ah cell is presented and the results are used for the purposes of powerline communication.
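
    A minimal first-order (1-RC) equivalent-circuit simulation, the kind classed above as an abstract or grey-box model, is sketched below; all parameter values are illustrative, not those identified for the 20 Ah cell in the paper.

```python
import numpy as np

R0, R1, C1, dt = 0.002, 0.001, 5000.0, 1.0       # ohms, ohms, farads, s
ocv = lambda soc: 3.2 + 0.8 * soc                # toy OCV curve
Q = 20.0 * 3600                                  # 20 Ah in ampere-seconds

soc, v1 = 1.0, 0.0                               # states: SOC and RC voltage
alpha = np.exp(-dt / (R1 * C1))                  # exact ZOH discretisation
for _ in range(1800):                            # 30 min of 20 A discharge
    i = 20.0
    soc -= i * dt / Q
    v1 = alpha * v1 + R1 * (1 - alpha) * i       # RC branch update
    v_term = ocv(soc) - v1 - R0 * i              # terminal voltage
print(f"SOC {soc:.3f}, terminal voltage {v_term:.3f} V")
```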

  11. Convex-based void filling method for CAD-based Monte Carlo geometry modeling

    International Nuclear Information System (INIS)

    Yu, Shengpeng; Cheng, Mengyun; Song, Jing; Long, Pengcheng; Hu, Liqin

    2015-01-01

    Highlights: • We present a new void filling method named CVF for CAD-based MC geometry modeling. • We describe convex-based void description and quality-based space subdivision. • The results show the improvements provided by CVF for both modeling and MC calculation efficiency. - Abstract: CAD-based automatic geometry modeling tools have been widely applied to generate Monte Carlo (MC) calculation geometry for complex systems according to CAD models. Automatic void filling is one of the main functions in CAD-based MC geometry modeling tools, because the void space between parts in CAD models is traditionally not modeled, while MC codes such as MCNP need all the problem space to be described. A dedicated void filling method, named Convex-based Void Filling (CVF), is proposed in this study for efficient void filling and concise void descriptions. The method subdivides the whole problem space into disjoint regions using Quality-based Subdivision (QS) and describes the void space in each region with complementary descriptions of the convex volumes intersecting that region. It has been implemented in SuperMC/MCAM, the Multiple-Physics Coupling Analysis Modeling Program, and tested on the International Thermonuclear Experimental Reactor (ITER) Alite model. The results showed that the new method reduced both automatic modeling time and MC calculation time.

  12. Model of Procedure Usage – Results from a Qualitative Study to Inform Design of Computer-Based Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Johanna H Oxstrand; Katya L Le Blanc

    2012-07-01

    The nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned from these previous efforts we are now exploring a more unknown application for computer-based procedures - field procedures, i.e. procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory, the Institute for Energy Technology, and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field operators. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how to best design the computer-based procedures to do this. The underlying philosophy in the research effort is "Stop - Start - Continue", i.e. what features from the use of paper-based procedures should we not incorporate (Stop), what should we keep (Continue), and what new features or work processes should be added (Start). One step in identifying the Stop - Start - Continue was to conduct a baseline study where affordances related to the current usage of paper-based procedures were identified. The purpose of the study was to develop a model of paper-based procedure use which will help to identify desirable features for computer-based procedure prototypes. Affordances such as note taking, markups …

  13. Material model of pelvic bone based on modal analysis: a study on the composite bone.

    Science.gov (United States)

    Henyš, Petr; Čapek, Lukáš

    2017-02-01

    Digital models based on finite element (FE) analysis are widely used in orthopaedics to predict the stress or strain in the bone due to bone-implant interaction. The usability of the model depends strongly on the bone material description. The material model that is most commonly used is based on a constant Young's modulus or on the apparent density of bone obtained from computer tomography (CT) data. The Young's modulus of bone is described in many experimental works with large variations in the results. The concept of measuring and validating the material model of the pelvic bone based on modal analysis is introduced in this pilot study. The modal frequencies, damping, and shapes of the composite bone were measured precisely by an impact hammer at 239 points. An FE model was built using the data pertaining to the geometry and apparent density obtained from the CT of the composite bone. The isotropic homogeneous Young's modulus and Poisson's ratio of the cortical and trabecular bone were estimated from the optimisation procedure including Gaussian statistical properties. The performance of the updated model was investigated through the sensitivity analysis of the natural frequencies with respect to the material parameters. The maximal error between the numerical and experimental natural frequencies of the bone reached 1.74 % in the first modal shape. Finally, the optimised parameters were matched with the data sheets of the composite bone. The maximal difference between the calibrated material properties and that obtained from the data sheet was 34 %. The optimisation scheme of the FE model based on the modal analysis data provides extremely useful calibration of the FE models with the uncertainty bounds and without the influence of the boundary conditions.

  14. Econophysics of agent-based models

    CERN Document Server

    Aoyama, Hideaki; Chakrabarti, Bikas; Chakraborti, Anirban; Ghosh, Asim

    2014-01-01

    The primary goal of this book is to present the research findings and conclusions of physicists, economists, mathematicians and financial engineers working in the field of "Econophysics" who have undertaken agent-based modelling, comparison with empirical studies and related investigations. Most standard economic models assume the existence of the representative agent, who is “perfectly rational” and applies the utility maximization principle when taking action. One reason for this is the desire to keep models mathematically tractable: no tools are available to economists for solving non-linear models of heterogeneous adaptive agents without explicit optimization. In contrast, multi-agent models, which originated from statistical physics considerations, allow us to go beyond the prototype theories of traditional economics involving the representative agent. This book is based on the Econophys-Kolkata VII Workshop, at which many such modelling efforts were presented. In the book, leading researchers in the...

  15. LEARNING CREATIVE WRITING MODEL BASED ON NEUROLINGUISTIC PROGRAMMING

    OpenAIRE

    Rustan, Edhy

    2017-01-01

    The objectives of the study are to determine: (1) the conditions of learning creative writing among high school students in Makassar, (2) the requirements of a learning model for creative writing, (3) the planning and design of an ideal creative writing program model, (4) the feasibility of a creative writing learning model based on neurolinguistic programming, and (5) the effectiveness of the creative writing learning model based on neurolinguistic programming. The method of this research uses research development of L...

  16. Physics Based Electrolytic Capacitor Degradation Models for Prognostic Studies under Thermal Overstress

    Science.gov (United States)

    Kulkarni, Chetan S.; Celaya, Jose R.; Goebel, Kai; Biswas, Gautam

    2012-01-01

    Electrolytic capacitors are used in several applications ranging from power supplies on safety critical avionics equipment to power drivers for electro-mechanical actuators. This makes them good candidates for prognostics and health management research. Prognostics provides a way to assess remaining useful life of components or systems based on their current state of health and their anticipated future use and operational conditions. Past experiences show that capacitors tend to degrade and fail faster under high electrical and thermal stress conditions that they are often subjected to during operations. In this work, we study the effects of accelerated aging due to thermal stress on different sets of capacitors under different conditions. Our focus is on deriving first principles degradation models for thermal stress conditions. Data collected from simultaneous experiments are used to validate the desired models. Our overall goal is to derive accurate models of capacitor degradation, and use them to predict performance changes in DC-DC converters.
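
    A generic sketch of such a first-principles-flavoured degradation law (not the authors' exact derivation) couples an Arrhenius-accelerated electrolyte loss rate to a linear capacitance fade; the activation energy, prefactor and failure threshold below are assumptions made here for illustration.

```python
import numpy as np

k_B = 8.617e-5                                   # Boltzmann constant [eV/K]
Ea, A = 0.5, 1.0e3                               # activation energy, prefactor
C0 = 2200e-6                                     # nominal capacitance [F]

def capacitance(t_hours, T_kelvin):
    rate = A * np.exp(-Ea / (k_B * T_kelvin))    # Arrhenius loss rate [1/h]
    return C0 * (1.0 - rate * t_hours)           # linear capacitance fade

for T in (273.0 + 85.0, 273.0 + 105.0):          # two ageing temperatures
    rate = A * np.exp(-Ea / (k_B * T))
    print(f"{T - 273:.0f} C: ~{0.2 / rate:.0f} h to 20% capacitance loss")
```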

  17. Chinese Students' Goal Orientation in English Learning: A Study Based on Autonomous Inquiry Model

    Science.gov (United States)

    Zhang, Jianfeng

    2014-01-01

    Goal orientation is a theory of learning motivation which holds that learners develop their capability through an emphasis on acquiring new techniques and adapting to their environment. In this study, based on the autonomous inquiry model, the construction of Chinese students' goal orientations in English learning is summarized according to the data…

  18. Settings for Physical Activity – Developing a Site-specific Physical Activity Behavior Model based on Multi-level Intervention Studies

    DEFF Research Database (Denmark)

    Troelsen, Jens; Klinker, Charlotte Demant; Breum, Lars

    Introduction: Ecological models of health behavior have potential as a theoretical framework to comprehend the multiple levels of factors influencing physical activity (PA). The potential is shown by the fact that there has been a dramatic increase in the application of ecological models in research and practice. One proposed core principle is that an ecological model is most powerful if the model is behavior-specific. However, based on multi-level interventions … to be taken into consideration. A theoretical implication of this finding is to develop a site-specific physical activity behavior model adding a layered structure to the ecological model representing the determinants related to the specific site. Support: This study was supported by TrygFonden, Realdania …

  19. NMR studies concerning base-base interactions in oligonucleotides

    International Nuclear Information System (INIS)

    Hoogen, Y.T. van den.

    1988-01-01

    Two main subjects are treated in the present thesis. The first part principally deals with the base-base interactions in single-stranded oligoribonucleotides. The second part presents NMR and model-building studies of DNA and RNA duplexes containing an unpaired base. (author). 242 refs.; 26 figs.; 24 tabs

  20. Automatic CT-based finite element model generation for temperature-based death time estimation: feasibility study and sensitivity analysis.

    Science.gov (United States)

    Schenkl, Sebastian; Muggenthaler, Holger; Hubig, Michael; Erdmann, Bodo; Weiser, Martin; Zachow, Stefan; Heinrich, Andreas; Güttler, Felix Victor; Teichgräber, Ulf; Mall, Gita

    2017-05-01

    Temperature-based death time estimation is based either on simple phenomenological models of corpse cooling or on detailed physical heat transfer models. The latter are much more complex but allow a higher accuracy of death time estimation, as in principle all relevant cooling mechanisms can be taken into account. Here, a complete workflow for finite element-based cooling simulation is presented. The following steps are demonstrated on a CT phantom: (1) computed tomography (CT) scan; (2) segmentation of the CT images for thermodynamically relevant features of individual geometries and compilation in a geometric computer-aided design (CAD) model; (3) conversion of the segmentation result into a finite element (FE) simulation model; (4) computation of the model cooling curve (MOD); and (5) calculation of the cooling time (CTE). For the first time in FE-based cooling time estimation, the steps from the CT image through segmentation to FE model generation are performed semi-automatically. The cooling time calculation results are compared to cooling measurements performed on the phantoms under controlled conditions. In this context, the method is validated using a CT phantom. Some of the phantom's thermodynamic material parameters had to be determined via independent experiments. Moreover, the impact of geometry and material parameter uncertainties on the estimated cooling time is investigated by a sensitivity analysis.

  1. Antecedents of employee electricity saving behavior in organizations: An empirical study based on norm activation model

    International Nuclear Information System (INIS)

    Zhang, Yixiang; Wang, Zhaohua; Zhou, Guanghui

    2013-01-01

    China is one of the major energy-consuming countries, and is under great pressure to promote energy saving and reduce domestic energy consumption. Employees constitute an important target group for energy saving. However, few research efforts have been devoted to studying what drives employee energy saving behavior in organizations. To fill this gap, drawing on the norm activation model (NAM), we built a research model to study the antecedents of employee electricity saving behavior in organizations. The model was empirically tested using survey data collected from office workers in Beijing, China. Results show that personal norm positively influences employee electricity saving behavior. Organizational electricity saving climate negatively moderates the effect of personal norm on electricity saving behavior. Awareness of consequences, ascription of responsibility, and organizational electricity saving climate positively influence personal norm. Furthermore, awareness of consequences positively influences ascription of responsibility. This paper contributes to the energy saving behavior literature by building a theoretical model of employee electricity saving behavior, which is understudied in the current literature. Based on the empirical results, implications for how to promote employee electricity saving are discussed. - Highlights: • We studied employee electricity saving behavior based on the norm activation model. • The model was tested using survey data collected from office workers in China. • Personal norm positively influences employees' electricity saving behavior. • Electricity saving climate negatively moderates personal norm's effect. • This research enhances our understanding of employee electricity saving behavior

  2. A Comparative Study of the Effects of the Neurocognitive-Based Model and the Conventional Model on Learner Attention, Working Memory and Mood

    Science.gov (United States)

    Srikoon, Sanit; Bunterm, Tassanee; Nethanomsak, Teerachai; Ngang, Tang Keow

    2017-01-01

    Purpose: The attention, working memory, and mood of learners are the most important abilities in the learning process. This study was concerned with the comparison of contextualized attention, working memory, and mood through a neurocognitive-based model (5P) and a conventional model (5E). It sought to examine the significant change in attention,…

  3. Anisotropy in wavelet-based phase field models

    KAUST Repository

    Korzec, Maciek; Mü nch, Andreas; Sü li, Endre; Wagner, Barbara

    2016-01-01

    When describing the anisotropic evolution of microstructures in solids using phase-field models, the anisotropy of the crystalline phases is usually introduced into the interfacial energy by directional dependencies of the gradient energy coefficients. We consider an alternative approach based on a wavelet analogue of the Laplace operator that is intrinsically anisotropic and linear. The paper focuses on the classical coupled temperature/Ginzburg--Landau type phase-field model for dendritic growth. For the model based on the wavelet analogue, existence, uniqueness and continuous dependence on initial data are proved for weak solutions. Numerical studies of the wavelet based phase-field model show dendritic growth similar to the results obtained for classical phase-field models.

  5. Applying an expectancy-value model to study motivators for work-task based information seeking

    DEFF Research Database (Denmark)

    Sigaard, Karen Tølbøl; Skov, Mette

    2015-01-01

    … on the theory of expectancy-value and on the operationalisation used when the model was first developed. Data for the analysis were collected from a sample of seven informants working as consultants in Danish municipalities. Each participant filled out a questionnaire, kept a log book for a week … for interpersonal and internal sources increased when the task had high-value motivation or low-expectancy motivation or both. Research limitations/implications: The study is based on a relatively small sample and considers only one motivation theory. This should be addressed in future research, along with a broadening of the studied group to involve professions other than municipality consultants. Originality/value: Motivational theories from the field of psychology have been used sparsely in studies of information seeking. This study operationalises and verifies such a theory based on a theoretical adaptation …

  6. Remaining useful life estimation based on stochastic deterioration models: A comparative study

    International Nuclear Information System (INIS)

    Le Son, Khanh; Fouladirad, Mitra; Barros, Anne; Levrat, Eric; Iung, Benoît

    2013-01-01

    Prognostics of system lifetime is a basic requirement for condition-based maintenance in many application domains where safety, reliability, and availability are considered of first importance. This paper presents a probabilistic method for prognostics applied to the 2008 PHM Conference Challenge data. A stochastic process (Wiener process) combined with a data analysis method (Principal Component Analysis) is proposed to model the deterioration of the components and to estimate the RUL in a case study. The advantages of our probabilistic approach are pointed out, and a comparison with existing results on the same data is made.
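
    The Wiener-process ingredient is easy to illustrate: degradation drifts upward with Brownian noise, and the first passage to a failure threshold (whose mean is threshold/drift) gives the lifetime. The sketch below uses invented parameters, not those fitted to the PHM Challenge data.

```python
import numpy as np

rng = np.random.default_rng(8)
mu, sigma, D, dt = 0.02, 0.05, 10.0, 1.0         # drift, diffusion, threshold

def first_passage_time():
    """Simulate degradation until it first crosses the threshold D."""
    x, t = 0.0, 0.0
    while x < D:
        x += mu * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return t

lives = [first_passage_time() for _ in range(2000)]
print(f"simulated mean life: {np.mean(lives):.0f} cycles")
print(f"analytic mean (D/mu): {D / mu:.0f} cycles")  # inverse-Gaussian mean
```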

  7. Which model based on fluorescence quenching is suitable to study the interaction between trans-resveratrol and BSA?

    Science.gov (United States)

    Wei, Xin Lin; Xiao, Jian Bo; Wang, Yuanfeng; Bai, Yalong

    2010-01-01

    Several models based on the quenching of BSA fluorescence are used to determine binding parameters. The binding parameters obtained from different models are quite different from each other. Which model is suitable to study the interaction between trans-resveratrol and BSA? Herein, twelve models based on fluorescence quenching of BSA were compared. The number of binding sites increasing with increased binding constant for similar compounds binding to BSA may be one approach to resolve this question. For example, here eleven flavonoids were tested to illustrate that the double logarithm regression curve is suitable for studying the binding of polyphenols to BSA.
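
    The double logarithm model referred to above is the linear regression log((F0 - F)/F) = log(Ka) + n log([Q]). The sketch below fits it to synthetic quenching data (all values invented) to recover the binding constant Ka and the number of binding sites n.

```python
import numpy as np

F0 = 100.0                                       # unquenched fluorescence
Q = np.array([2, 4, 6, 8, 10]) * 1e-6            # quencher conc. [mol/L]
Ka_true, n_true = 2.0e5, 1.1                     # values used to fake data
F = F0 / (1 + Ka_true * Q**n_true)               # synthetic quenched signal

x = np.log10(Q)
y = np.log10((F0 - F) / F)
n_fit, logKa = np.polyfit(x, y, 1)               # slope = n, intercept = log Ka
print(f"n = {n_fit:.2f}, Ka = {10**logKa:.2e} L/mol")
```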

  8. DEVELOPMENT MODEL OF PATISSERIE PROJECT-BASED LEARNING

    OpenAIRE

    Ana Ana; Lutfhiyah Nurlaela

    2013-01-01

    The study aims to find a model of patisserie project-based learning with a production approach that can improve the effectiveness of patisserie learning. The Delphi Technique, Cohen's Kappa and percentages of agreement were used to assess the model of patisserie project-based learning. Data collection techniques employed in the study were questionnaires, checklist worksheets, observation, and interview sheets. Subjects were 13 lecturers with expertise in food and nutrition and 91 students of Food and Nutrition ...
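
    As a small aside on one of the agreement statistics mentioned, Cohen's kappa between two raters can be computed directly, for example with scikit-learn; the ratings below are invented.

```python
from sklearn.metrics import cohen_kappa_score

rater_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]         # 1 = item approved
rater_b = [1, 0, 0, 1, 0, 1, 1, 0, 1, 0]
print("Cohen's kappa:", cohen_kappa_score(rater_a, rater_b))
```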

  9. Study on the combined influence of battery models and sizing strategy for hybrid and battery-based electric vehicles

    DEFF Research Database (Denmark)

    Pinto, Cláudio; Barreras, Jorge V.; de Castro, Ricardo

    2017-01-01

    This paper presents a study of the combined influence of battery models and sizing strategy for hybrid and battery-based electric vehicles. In particular, the aim is to find the number of battery (and supercapacitor) cells to propel a light vehicle to run two different standard driving cycles....... Despite the same tendency, when a hybrid vehicle is taken into account, the influence of the battery models is dependent on the sizing strategy. In this work, two sizing strategies are evaluated: dynamic programming and filter-based. For the latter, the complexity of the battery model has a clear....... Three equivalent circuit models are considered to simulate the battery electrical performance: linear static, non-linear static and non-linear with first-order dynamics. When dimensioning a battery-based vehicle, less complex models may lead to a solution with more battery cells and higher costs...

  10. Integrated Agent-Based and Production Cost Modeling Framework for Renewable Energy Studies: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Gallo, Giulia

    2015-10-07

    The agent-based framework for renewable energy studies (ARES) is an integrated approach that adds an agent-based model of industry actors to PLEXOS and combines the strengths of the two to overcome their individual shortcomings. It can examine existing and novel wholesale electricity markets under high penetrations of renewables. ARES is demonstrated by studying how increasing levels of wind will impact the operations and the exercise of market power of generation companies that exploit an economic withholding strategy. The analysis is carried out on a test system that represents the Electric Reliability Council of Texas energy-only market in the year 2020. The results more realistically reproduce the operations of an energy market under different and increasing penetrations of wind, and ARES can be extended to address pressing issues in current and future wholesale electricity markets.

  11. Model-Based Power Plant Master Control

    Energy Technology Data Exchange (ETDEWEB)

    Boman, Katarina; Thomas, Jean; Funkquist, Jonas

    2010-08-15

    The main goal of the project has been to evaluate the potential of a coordinated master control for a solid fuel power plant in terms of tracking capability, stability and robustness. The control strategy has been model-based predictive control (MPC) and the plant used in the case study has been the Vattenfall power plant Idbaecken in Nykoeping. A dynamic plant model based on nonlinear physical models was used to imitate the true plant in MATLAB/SIMULINK simulations. The basis for this model was already developed in previous Vattenfall internal projects, along with a simulation model of the existing control implementation with traditional PID controllers. The existing PID control is used as a reference performance, and it has been thoroughly studied and tuned in these previous Vattenfall internal projects. A turbine model was developed with characteristics based on the results of steady-state simulations of the plant using the software EBSILON. Using the derived model as a representative for the actual process, an MPC control strategy was developed using linearization and gain-scheduling. The control signal constraints (rate of change) and constraints on outputs were implemented to comply with plant constraints. After tuning the MPC control parameters, a number of simulation scenarios were performed to compare the MPC strategy with the existing PID control structure. The simulation scenarios also included cases highlighting the robustness properties of the MPC strategy. From the study, the main conclusions are: - The proposed Master MPC controller shows excellent set-point tracking performance even though the plant has strong interactions and non-linearity, and the controls and their rate of change are bounded. - The proposed Master MPC controller is robust, stable in the presence of disturbances and parameter variations. Even though the current study only considered a very small number of the possible disturbances and modelling errors, the considered cases are
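
    The core of such a master controller, one receding-horizon step of linear MPC with bounds on the controls and their rate of change, can be sketched as below using cvxpy. The plant matrices, horizon and limits are toy assumptions, not the project's actual controller; the gain-scheduling described above would swap in a different linearisation (A, B) at each operating point.

        import numpy as np
        import cvxpy as cp

        # toy linearised plant (assumed); gain-scheduling would update (A, B) per load point
        A = np.array([[0.95, 0.10], [0.00, 0.90]])
        B = np.array([[0.0], [0.05]])
        N = 20                                    # prediction horizon
        x0 = np.array([0.0, 0.0])
        xref = np.array([1.0, 0.0])               # set point
        u_max, du_max = 2.0, 0.2                  # actuator and rate-of-change limits

        x = cp.Variable((2, N + 1))
        u = cp.Variable((1, N))
        cost, constr = 0, [x[:, 0] == x0]
        for k in range(N):
            cost += cp.sum_squares(x[:, k + 1] - xref) + 0.1 * cp.sum_squares(u[:, k])
            constr += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],
                       cp.abs(u[:, k]) <= u_max]
            if k > 0:
                constr += [cp.abs(u[:, k] - u[:, k - 1]) <= du_max]
        cp.Problem(cp.Minimize(cost), constr).solve()
        print(u.value[:, 0])   # apply the first move, then re-solve at the next sample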

  12. A real options-based CCS investment evaluation model: Case study of China's power generation sector

    International Nuclear Information System (INIS)

    Zhu, Lei; Fan, Ying

    2011-01-01

    Highlights: → This paper establishes a carbon capture and storage (CCS) investment evaluation model. → The model is based on real options theory and solved by the Least Squares Monte Carlo (LSM) method. → China is taken as a case study to evaluate the effects of regulations on CCS investment. → The findings show that the current investment risk of CCS is high, with climate policy having the greatest impact on CCS development. -- Abstract: This paper establishes a carbon capture and storage (CCS) investment evaluation model based on real options theory, considering uncertainties from the existing thermal power generating cost, the carbon price, the generating cost of thermal power with CCS, and investment in CCS technology deployment. The model aims to evaluate the value of the cost-saving effect and the amount of CO2 emission reduction achieved by investing in newly built thermal power with CCS technology to replace existing thermal power in a given period, from the perspective of power generation enterprises. The model is solved by the Least Squares Monte Carlo (LSM) method. Since the model can be used as a policy analysis tool, China is taken as a case study to evaluate the effects of regulations on CCS investment through scenario analysis. The findings show that the current investment risk of CCS is high, with climate policy having the greatest impact on CCS development. Thus, there is an important trade-off for policy makers between reducing greenhouse gas emissions and protecting the interests of power generation enterprises. The research presented would be useful for CCS technology evaluation and related policy-making.
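
    The Least Squares Monte Carlo method cited here is the Longstaff-Schwartz regression scheme. As a self-contained illustration of its mechanics, the sketch below prices a plain American put, standing in for the option to defer an irreversible investment; all market parameters are toy assumptions, not figures from the CCS case study.

        import numpy as np

        rng = np.random.default_rng(0)
        S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0   # toy market parameters
        M, N = 50, 100_000                                  # time steps, simulated paths
        dt = T / M
        disc = np.exp(-r * dt)

        # simulate geometric Brownian motion paths of the underlying value
        Z = rng.standard_normal((N, M))
        S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt
                                  + sigma * np.sqrt(dt) * Z, axis=1))

        cash = np.maximum(K - S[:, -1], 0.0)         # exercise value at maturity
        for t in range(M - 2, -1, -1):
            cash *= disc                             # discount one step back
            itm = K - S[:, t] > 0.0                  # regress only on in-the-money paths
            coef = np.polyfit(S[itm, t], cash[itm], 2)
            cont = np.polyval(coef, S[itm, t])       # estimated continuation value
            exercise = (K - S[itm, t]) > cont
            cash[itm] = np.where(exercise, K - S[itm, t], cash[itm])
        print("option value:", disc * cash.mean())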

  13. Cost Analysis of Prenatal Care Using the Activity-Based Costing Model: A Pilot Study

    Science.gov (United States)

    Gesse, Theresa; Golembeski, Susan; Potter, Jonell

    1999-01-01

    The cost of prenatal care in a private nurse-midwifery practice was examined using the activity-based costing system. Findings suggest that the activities of the nurse-midwife (the health care provider) constitute the major cost driver of this practice and that the model of care and associated, time-related activities influence the cost. This pilot study information will be used in the development of a comparative study of prenatal care, client education, and self care. PMID:22945985

  14. Cost analysis of prenatal care using the activity-based costing model: a pilot study.

    Science.gov (United States)

    Gesse, T; Golembeski, S; Potter, J

    1999-01-01

    The cost of prenatal care in a private nurse-midwifery practice was examined using the activity-based costing system. Findings suggest that the activities of the nurse-midwife (the health care provider) constitute the major cost driver of this practice and that the model of care and associated, time-related activities influence the cost. This pilot study information will be used in the development of a comparative study of prenatal care, client education, and self care.

  15. New Temperature-based Models for Predicting Global Solar Radiation

    International Nuclear Information System (INIS)

    Hassan, Gasser E.; Youssef, M. Elsayed; Mohamed, Zahraa E.; Ali, Mohamed A.; Hanafy, Ahmed A.

    2016-01-01

    Highlights: • New temperature-based models for estimating solar radiation are investigated. • The models are validated against 20 years of measured global solar radiation data. • The new temperature-based model shows the best performance for coastal sites. • The new temperature-based model is more accurate than the sunshine-based models. • The new model is highly applicable with weather temperature forecast techniques. - Abstract: This study presents new ambient-temperature-based models for estimating global solar radiation as alternatives to the widely used sunshine-based models, owing to the unavailability of sunshine data at all locations around the world. Seventeen new temperature-based models are established, validated and compared with three other models proposed in the literature (the Annandale, Allen and Goodin models) to estimate the monthly average daily global solar radiation on a horizontal surface. These models are developed using a 20-year measured dataset of global solar radiation for the case study location (lat. 30°51′N and long. 29°34′E), and then the general formulae of the newly suggested models are examined for ten different locations around Egypt. Moreover, local formulae for the models are established and validated for two coastal locations where the general formulae give inaccurate predictions. The most common statistical error measures are utilized to evaluate the performance of these models and identify the most accurate model. The obtained results show that the local formula for the most accurate new model provides good predictions for global solar radiation at different locations, especially at coastal sites. Moreover, the local and general formulae of the most accurate temperature-based model also perform better than the two most accurate sunshine-based models from the literature. The quick and accurate estimation of global solar radiation using this approach can be employed in the design and evaluation of performance for
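
    Temperature-based models of this family typically relate global solar radiation to the daily temperature range; the best-known member is the Hargreaves-type form Rs = k · Ra · sqrt(Tmax - Tmin), of which the compared Annandale, Allen and Goodin models are variations. A minimal calibration sketch on synthetic monthly data (the coefficient, data and units are assumptions):

        import numpy as np

        def fit_hargreaves(tmax, tmin, ra, rs):
            """Least-squares fit of k in Rs = k * Ra * sqrt(Tmax - Tmin)."""
            x = ra * np.sqrt(tmax - tmin)
            return float(x @ rs / (x @ x))    # slope of a regression through the origin

        # hypothetical monthly means: extraterrestrial radiation Ra (MJ/m^2/day), temps (°C)
        ra = np.array([25.0, 29.0, 33.0, 36.0, 38.0, 39.0])
        tmax = np.array([18.0, 20.0, 23.0, 27.0, 30.0, 33.0])
        tmin = np.array([9.0, 10.0, 12.0, 15.0, 18.0, 21.0])
        rs_meas = np.array([11.8, 14.2, 17.5, 20.5, 22.8, 24.6])   # measured Rs

        k = fit_hargreaves(tmax, tmin, ra, rs_meas)
        print("k =", round(k, 4), "predicted Rs:", np.round(k * ra * np.sqrt(tmax - tmin), 1))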

  16. Néron Models and Base Change

    DEFF Research Database (Denmark)

    Halle, Lars Halvard; Nicaise, Johannes

    Presenting the first systematic treatment of the behavior of Néron models under ramified base change, this book can be read as an introduction to various subtle invariants and constructions related to Néron models of semi-abelian varieties, motivated by concrete research problems and complemented...... with explicit examples. Néron models of abelian and semi-abelian varieties have become an indispensable tool in algebraic and arithmetic geometry since Néron introduced them in his seminal 1964 paper. Applications range from the theory of heights in Diophantine geometry to Hodge theory. We focus specifically...... on Néron component groups, Edixhoven’s filtration and the base change conductor of Chai and Yu, and we study these invariants using various techniques such as models of curves, sheaves on Grothendieck sites and non-archimedean uniformization. We then apply our results to the study of motivic zeta functions...

  17. In Vivo RNAi-Based Screens: Studies in Model Organisms

    Directory of Open Access Journals (Sweden)

    Miki Yamamoto-Hino

    2013-11-01

    Full Text Available RNA interference (RNAi) is a technique widely used for gene silencing in organisms and cultured cells, and depends on sequence homology between double-stranded RNA (dsRNA) and target mRNA molecules. Numerous cell-based genome-wide screens have successfully identified novel genes involved in various biological processes, including signal transduction, cell viability/death, and cell morphology. However, cell-based screens cannot address cellular processes such as development, behavior, and immunity. Drosophila and Caenorhabditis elegans are two model organisms whose whole bodies and individual body parts have been subjected to RNAi-based genome-wide screening. Moreover, Drosophila RNAi allows the manipulation of gene function in a spatiotemporal manner when it is implemented using the Gal4/UAS system. Using this inducible RNAi technique, various large-scale screens have been performed in Drosophila, demonstrating that the method is straightforward and valuable. However, accumulated results reveal that the results of RNAi-based screens have relatively high levels of error, such as false positives and negatives. Here, we review in vivo RNAi screens in Drosophila and the methods that could be used to remove ambiguity from screening results.

  18. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event......-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms...... of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches....

  19. Model-based studies into ground water movement, with water density depending on salt content. Case studies and model validation with respect to the long-term safety of radwaste repositories. Final report

    International Nuclear Information System (INIS)

    Schelkes, K.

    1995-12-01

    Near-to-reality studies into ground water movement in the environment of planned radwaste repositories have to take into account that the flow conditions are influenced by the water density, which in turn depends on the salt content. Based on results from earlier studies, computer programs were established that allow computation and modelling of ground water movement in salt water/fresh water systems, and the programs were tested and improved according to progress of the studies performed under the INTRAVAL international project. The computed models of ground water movement in the region of the Gorlebener Rinne showed, for strongly simplified model profiles, that the developing salinity distribution varies very sensitively in response to the applied model geometry, initial input data for salinity distribution, time frame of the model, and size of the transversal dispersion length. The WIPP 2 INTRAVAL experiment likewise studied a large-area ground water movement system influenced by salt water. Based on the concept of a hydraulically closed, regional ground water system (basin model), a sectional profile was worked out covering all relevant layers of the cap rock above the salt formation planned to serve as a repository. The model data derived to describe the salt water/fresh water movements in this profile resulted in essential enlargements and modifications of the applied ROCKFLOW computer program (relating to input data for dispersion modelling, the particle tracker, and the computer graphics interface), and yielded important information for the modelling of such systems (relating to initial pressure data at the upper margin, network enhancement for important concentration boundary conditions, or treatment of permeability contrasts). (orig.) [de

  20. Uncertainty and Variability in Physiologically-Based Pharmacokinetic (PBPK) Models: Key Issues and Case Studies (Final Report)

    Science.gov (United States)

    EPA announced the availability of the final report, Uncertainty and Variability in Physiologically-Based Pharmacokinetic (PBPK) Models: Key Issues and Case Studies. This report summarizes some of the recent progress in characterizing uncertainty and variability in physi...

  1. The impact of hospital-based and community based models of cerebral palsy rehabilitation: a quasi-experimental study.

    Science.gov (United States)

    Dambi, Jermaine M; Jelsma, Jennifer

    2014-12-05

    Cerebral palsy requires appropriate on-going rehabilitation intervention which should effectively meet the needs of both children and parents/care-givers. The provision of effective support is a challenge, particularly in resource-constrained settings. A quasi-experimental pragmatic research design was used to compare the impact of two models of rehabilitation service delivery currently offered in Harare, Zimbabwe: an outreach-based programme and an institution-based one. Questionnaires were distributed to 46 caregivers of children with cerebral palsy at baseline and after three months. Twenty children received rehabilitation services in a community setting and 26 received services as outpatients at a central hospital. The Gross Motor Function Measure was used to assess functional change. The burden of care was measured using the Caregiver Strain Index; satisfaction with physiotherapy was assessed using the modified Medrisk satisfaction with physiotherapy services questionnaire; and compliance was measured as the proportion of scheduled appointments met. Children receiving outreach-based treatment were significantly older than children in the institution-based group. Regression analysis revealed that, once age and level of severity were controlled for, children in the outreach-based treatment group improved their motor function 6% more than children receiving institution-based services. There were no differences detected between the groups with regard to caregiver well-being, and 51% of the caregivers reported signs consistent with clinical distress/depression. Most caregivers (83%) expressed that they were overwhelmed by the caregiving role, and this increased with the chronicity of care. The financial burden of caregiving was predictive of caregiver strain. Caregivers in the outreach-based group reported greater satisfaction with services and were more compliant (p … design interventions to alleviate the burden. The study was a pragmatic, quasi…

  2. Néron models and base change

    CERN Document Server

    Halle, Lars Halvard

    2016-01-01

    Presenting the first systematic treatment of the behavior of Néron models under ramified base change, this book can be read as an introduction to various subtle invariants and constructions related to Néron models of semi-abelian varieties, motivated by concrete research problems and complemented with explicit examples. Néron models of abelian and semi-abelian varieties have become an indispensable tool in algebraic and arithmetic geometry since Néron introduced them in his seminal 1964 paper. Applications range from the theory of heights in Diophantine geometry to Hodge theory. We focus specifically on Néron component groups, Edixhoven’s filtration and the base change conductor of Chai and Yu, and we study these invariants using various techniques such as models of curves, sheaves on Grothendieck sites and non-archimedean uniformization. We then apply our results to the study of motivic zeta functions of abelian varieties. The final chapter contains a list of challenging open questions. This book is a...

  3. Uses of Agent-Based Modeling for Health Communication: the TELL ME Case Study.

    Science.gov (United States)

    Barbrook-Johnson, Peter; Badham, Jennifer; Gilbert, Nigel

    2017-08-01

    Government communication is an important management tool during a public health crisis, but understanding its impact is difficult. Strategies may be adjusted in reaction to developments on the ground and it is challenging to evaluate the impact of communication separately from other crisis management activities. Agent-based modeling is a well-established research tool in social science to respond to similar challenges. However, there have been few such models in public health. We use the example of the TELL ME agent-based model to consider ways in which a non-predictive policy model can assist policy makers. This model concerns individuals' protective behaviors in response to an epidemic, and the communication that influences such behavior. Drawing on findings from stakeholder workshops and the results of the model itself, we suggest such a model can be useful: (i) as a teaching tool, (ii) to test theory, and (iii) to inform data collection. We also plot a path for development of similar models that could assist with communication planning for epidemics.

  4. Wavelet-based study of valence-arousal model of emotions on EEG signals with LabVIEW.

    Science.gov (United States)

    Guzel Aydin, Seda; Kaya, Turgay; Guler, Hasan

    2016-06-01

    This paper illustrates wavelet-based feature extraction for emotion assessment from electroencephalogram (EEG) signals through graphical coding design. A two-dimensional (valence-arousal) emotion model was studied. Different emotions (happiness, joy, melancholy, and disgust) were assessed. These emotions were stimulated by video clips. EEG signals obtained from four subjects were decomposed into five frequency bands (gamma, beta, alpha, theta, and delta) using the "db5" wavelet function. Relative features were calculated to obtain further information. The impact of the emotions according to valence was observed to be strongest on the power spectral density of the gamma band. The main objective of this work is not only to investigate the influence of the emotions on different frequency bands but also to overcome the difficulties of text-based programming. This work offers an alternative approach for emotion evaluation through EEG processing. There are a number of methods for emotion recognition, such as wavelet-transform-based, Fourier-transform-based, and Hilbert-Huang-transform-based methods. However, the majority of these methods have been applied with text-based programming languages. In this study, we proposed and implemented an experimental feature extraction with a graphics-based language, which provides great convenience in bioelectrical signal processing.
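
    A minimal sketch of the decomposition step in Python (rather than LabVIEW) using PyWavelets: a five-level "db5" transform whose detail levels approximate the five EEG bands for an assumed 128 Hz sampling rate. The band mapping shifts with the actual sampling frequency, and the test signal is synthetic.

        import numpy as np
        import pywt

        def relative_band_energies(eeg, wavelet="db5", level=5):
            """5-level DWT of one EEG channel; returns relative energy per sub-band.

            For fs = 128 Hz the levels map roughly to D1: 32-64 Hz (gamma),
            D2: 16-32 (beta), D3: 8-16 (alpha), D4: 4-8 (theta),
            D5: 2-4 and A5: 0-2 Hz (delta range).
            """
            coeffs = pywt.wavedec(eeg, wavelet, level=level)   # [A5, D5, D4, D3, D2, D1]
            e = np.array([np.sum(c**2) for c in coeffs])
            return e / e.sum()

        rng = np.random.default_rng(0)
        t = np.arange(0, 4, 1 / 128)
        signal = np.sin(2 * np.pi * 40 * t) + 0.5 * rng.standard_normal(t.size)  # gamma-ish tone
        print(np.round(relative_band_energies(signal), 3))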

  5. A model-based approach to estimating forest area

    Science.gov (United States)

    Ronald E. McRoberts

    2006-01-01

    A logistic regression model based on forest inventory plot data and transformations of Landsat Thematic Mapper satellite imagery was used to predict the probability of forest for 15 study areas in Indiana, USA, and 15 in Minnesota, USA. Within each study area, model-based estimates of forest area were obtained for circular areas with radii of 5 km, 10 km, and 15 km and...

  6. Model-Based Reasoning in Humans Becomes Automatic with Training.

    Directory of Open Access Journals (Sweden)

    Marcos Economides

    2015-09-01

    Full Text Available Model-based and model-free reinforcement learning (RL) have been suggested as algorithmic realizations of goal-directed and habitual action strategies. Model-based RL is more flexible than model-free but requires sophisticated calculations using a learnt model of the world. This has led model-based RL to be identified with slow, deliberative processing, and model-free RL with fast, automatic processing. In support of this distinction, it has recently been shown that model-based reasoning is impaired by placing subjects under cognitive load--a hallmark of non-automaticity. Here, using the same task, we show that cognitive load does not impair model-based reasoning if subjects receive prior training on the task. This finding is replicated across two studies and a variety of analysis methods. Thus, task familiarity permits use of model-based reasoning in parallel with other cognitive demands. The ability to deploy model-based reasoning in an automatic, parallelizable fashion has widespread theoretical implications, particularly for the learning and execution of complex behaviors. It also suggests a range of important failure modes in psychiatric disorders.

  7. Research on Turbofan Engine Model above Idle State Based on NARX Modeling Approach

    Science.gov (United States)

    Yu, Bing; Shu, Wenjun

    2017-03-01

    A nonlinear model for a turbofan engine above idle state, based on the NARX (nonlinear autoregressive with exogenous input) approach, is studied. First, data sets for the JT9D engine are obtained via simulation of an existing model. Then, a nonlinear modeling scheme based on NARX is proposed and several models with different parameters are built from these data sets. Finally, simulations are performed to verify the accuracy and dynamic performance of the models; the results show that the NARX models reflect the dynamic characteristics of the turbofan engine with high accuracy.
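
    As a sketch of the regression structure involved, the linear special case of a NARX model (an ARX model) can be fitted by least squares as below. The paper's model is nonlinear in these regressors, and the data here are synthetic rather than JT9D simulation records.

        import numpy as np

        def fit_arx(y, u, na=2, nb=2):
            """Least-squares fit of y[k] = sum_i a_i*y[k-i] + sum_j b_j*u[k-j]."""
            n = max(na, nb)
            Phi = np.array([np.r_[y[k - na:k][::-1], u[k - nb:k][::-1]]
                            for k in range(n, len(y))])
            theta, *_ = np.linalg.lstsq(Phi, y[n:], rcond=None)
            return theta   # [a_1..a_na, b_1..b_nb]

        # synthetic input/output data standing in for engine simulation records
        rng = np.random.default_rng(0)
        u = rng.uniform(-1, 1, 500)
        y = np.zeros(500)
        for k in range(2, 500):
            y[k] = 1.2 * y[k-1] - 0.4 * y[k-2] + 0.5 * u[k-1] + 0.1 * u[k-2]
        print(np.round(fit_arx(y, u), 3))   # recovers [1.2, -0.4, 0.5, 0.1]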

  8. Rule-based decision making model

    International Nuclear Information System (INIS)

    Sirola, Miki

    1998-01-01

    A rule-based decision making model is designed in the G2 environment. A theoretical and methodological frame for the model is composed and motivated. The rule-based decision making model is based on object-oriented modelling, knowledge engineering and decision theory. The idea of a safety objective tree is utilized. Advanced rule-based methodologies are applied. A general decision making model, the 'decision element', is constructed. The strategy planning of the decision element is based on, e.g., value theory and utility theory. A hypothetical process model is built to give input data for the decision element. The basic principle of the object model in decision making is division into tasks. Probability models are used in characterizing component availabilities. Bayes' theorem is used to recalculate the probability figures when new information is obtained. The model includes simple learning features to save the solution path. A decision analytic interpretation is given to the decision making process. (author)
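
    The Bayes step mentioned, recalculating a component's availability when new evidence arrives, reduces to a one-line posterior update; the probabilities below are illustrative assumptions, not figures from the paper.

        def bayes_update(prior_avail, p_obs_given_avail, p_obs_given_failed):
            """Posterior availability after an observation, via Bayes' theorem."""
            num = p_obs_given_avail * prior_avail
            den = num + p_obs_given_failed * (1.0 - prior_avail)
            return num / den

        # e.g. a reading that is much more likely if the component is available
        posterior = bayes_update(prior_avail=0.90,
                                 p_obs_given_avail=0.80,
                                 p_obs_given_failed=0.20)
        print(round(posterior, 3))   # 0.973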

  9. Empirical agent-based modelling challenges and solutions

    CERN Document Server

    Barreteau, Olivier

    2014-01-01

    This instructional book showcases techniques to parameterise human agents in empirical agent-based models (ABM). In doing so, it provides a timely overview of key ABM methodologies and the most innovative approaches through a variety of empirical applications.  It features cutting-edge research from leading academics and practitioners, and will provide a guide for characterising and parameterising human agents in empirical ABM.  In order to facilitate learning, this text shares the valuable experiences of other modellers in particular modelling situations. Very little has been published in the area of empirical ABM, and this contributed volume will appeal to graduate-level students and researchers studying simulation modeling in economics, sociology, ecology, and trans-disciplinary studies, such as topics related to sustainability. In a similar vein to the instruction found in a cookbook, this text provides the empirical modeller with a set of 'recipes'  ready to be implemented. Agent-based modeling (AB...

  10. Adapting an evidence-based model to retain adolescent study participants in longitudinal research.

    Science.gov (United States)

    Davis, Erin; Demby, Hilary; Jenner, Lynne Woodward; Gregory, Alethia; Broussard, Marsha

    2016-02-01

    Maintaining contact with and collecting outcome data from adolescent study participants can present a significant challenge for researchers conducting longitudinal studies. Establishing an organized and effective protocol for participant follow-up is crucial to reduce attrition and maintain high retention rates. This paper describes our methods in using and adapting the evidence-based Engagement, Verification, Maintenance, and Confirmation (EVMC) model to follow up with adolescents 6 and 12 months after implementation of a health program. It extends previous research by focusing on two key modifications to the model: (1) the central role of cell phones and texting to maintain contact with study participants throughout the EVMC process and, (2) use of responsive two-way communication between staff and participants and flexible administration modes and methods in the confirmation phase to ensure that busy teens not only respond to contacts, but also complete data collection. These strategies have resulted in high overall retention rates (87-91%) with adolescent study participants at each follow-up data collection point without the utilization of other, more involved tracking measures. The methods and findings presented may be valuable for other researchers with limited resources planning for or engaged in collecting follow-up outcome data from adolescents enrolled in longitudinal studies. Copyright © 2015. Published by Elsevier Ltd.

  11. Learning Design of Problem Based Learning Model Based on Recommendations of Sintax Study and Contents Issues on Physics Impulse Materials with Experimental Activities

    Directory of Open Access Journals (Sweden)

    Kristia Agustina

    2017-08-01

    Full Text Available This study aims to design Problem Based Learning (PBL) lessons based on syntax study recommendations and on the content of known implementation problems, applied to physics impulse material through experimental activities. This research is development research following the Kemp model. The reference for the learning design is the result of the syntax study and the content of existing PBL implementation problems from Agustina's research. The instructional design is applied to physics material on impulse, taught through experimental activity. Limited trials were conducted with a group of students of the SWCU Physics Education Study Program in Salatiga, while the validity test was conducted by high school teachers and physics education lecturers. The results of the limited-trial evaluation and the validity test were used to improve the designs that had been made. The conclusion of this research is that a learning design using the PBL model on impulse material, referring to the results of the syntax study and the problem content of existing PBL implementations, can be produced with learning activities designed around laboratory experiments. A car crash test video from a factory can serve as the real-world problem for the impulse material. The validation tests and limited trials indicated that the design made by the researchers can be used with small revisions. This research suggests that, when making a learning design using the PBL model, real-world problems can be obtained by collecting news items from newspapers, YouTube, the internet, and television.

  12. Behavioral Modeling Based on Probabilistic Finite Automata: An Empirical Study.

    Science.gov (United States)

    Tîrnăucă, Cristina; Montaña, José L; Ontañón, Santiago; González, Avelino J; Pardo, Luis M

    2016-06-24

    Imagine an agent that performs tasks according to different strategies. The goal of Behavioral Recognition (BR) is to identify which of the available strategies is the one being used by the agent, by simply observing the agent's actions and the environmental conditions during a certain period of time. The goal of Behavioral Cloning (BC) is more ambitious. In this last case, the learner must be able to build a model of the behavior of the agent. In both settings, the only assumption is that the learner has access to a training set that contains instances of observed behavioral traces for each available strategy. This paper studies a machine learning approach based on Probabilistic Finite Automata (PFAs), capable of achieving both the recognition and cloning tasks. We evaluate the performance of PFAs in the context of a simulated learning environment (in this case, a virtual Roomba vacuum cleaner robot), and compare it with a collection of other machine learning approaches.
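
    A toy sketch of the recognition side: score an observed action sequence under each strategy's PFA and pick the best-scoring strategy. The automata below (encoded as {state: {symbol: (next_state, prob)}}, deterministic transitions, start state 0) and the two strategies are invented for illustration only.

        import numpy as np

        def log_likelihood(seq, pfa, start=0):
            """Log P(seq) under a PFA given as {state: {symbol: (next_state, prob)}}."""
            s, ll = start, 0.0
            for sym in seq:
                if sym not in pfa[s]:
                    return -np.inf        # sequence impossible under this strategy
                s, p = pfa[s][sym]
                ll += np.log(p)
            return ll

        def recognize(seq, models):
            """Behavioral recognition: the strategy whose PFA best explains seq."""
            return max(models, key=lambda name: log_likelihood(seq, models[name]))

        # two invented strategies over actions 'f' (forward) and 't' (turn)
        spiral = {0: {"f": (0, 0.7), "t": (0, 0.3)}}
        wall = {0: {"f": (1, 0.9), "t": (0, 0.1)},
                1: {"f": (1, 0.5), "t": (0, 0.5)}}
        print(recognize("ffftf", {"spiral": spiral, "wall-follow": wall}))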

  13. A study on model fidelity for model predictive control-based obstacle avoidance in high-speed autonomous ground vehicles

    Science.gov (United States)

    Liu, Jiechao; Jayakumar, Paramsothy; Stein, Jeffrey L.; Ersal, Tulga

    2016-11-01

    This paper investigates the level of model fidelity needed in order for a model predictive control (MPC)-based obstacle avoidance algorithm to be able to safely and quickly avoid obstacles even when the vehicle is close to its dynamic limits. The context of this work is large autonomous ground vehicles that manoeuvre at high speed within unknown, unstructured, flat environments and have significant vehicle dynamics-related constraints. Five different representations of vehicle dynamics models are considered: four variations of the two degrees-of-freedom (DoF) representation as lower fidelity models and a fourteen DoF representation with combined-slip Magic Formula tyre model as a higher fidelity model. It is concluded that the two DoF representation that accounts for tyre nonlinearities and longitudinal load transfer is necessary for the MPC-based obstacle avoidance algorithm in order to operate the vehicle at its limits within an environment that includes large obstacles. For less challenging environments, however, the two DoF representation with linear tyre model and constant axle loads is sufficient.

  14. Agent-based Modeling Automated: Data-driven Generation of Innovation Diffusion Models

    NARCIS (Netherlands)

    Jensen, T.; Chappin, E.J.L.

    2016-01-01

    Simulation modeling is useful to gain insights into driving mechanisms of diffusion of innovations. This study aims to introduce automation to make identification of such mechanisms with agent-based simulation modeling less costly in time and labor. We present a novel automation procedure in which

  15. Introducing Model-Based System Engineering Transforming System Engineering through Model-Based Systems Engineering

    Science.gov (United States)

    2014-03-31

    [Abstract garbled in extraction; recoverable fragments include a table-of-contents entry ("Web Presentation...Software", p. 20), a figure caption ("Figure 6. Published Web Page from Data Collection"), and the terms Model Based Engineering (MBE), Model Driven Engineering (MDE), and Model-Based Systems Engineering.]

  16. Skull base tumor model.

    Science.gov (United States)

    Gragnaniello, Cristian; Nader, Remi; van Doormaal, Tristan; Kamel, Mahmoud; Voormolen, Eduard H J; Lasio, Giovanni; Aboud, Emad; Regli, Luca; Tulleken, Cornelius A F; Al-Mefty, Ossama

    2010-11-01

    Resident duty-hours restrictions have now been instituted in many countries worldwide. Shortened training times and increased public scrutiny of surgical competency have led to a move away from the traditional apprenticeship model of training. The development of educational models for brain anatomy is a fascinating innovation allowing neurosurgeons to train without the need to practice on real patients and it may be a solution to achieve competency within a shortened training period. The authors describe the use of Stratathane resin ST-504 polymer (SRSP), which is inserted at different intracranial locations to closely mimic meningiomas and other pathological entities of the skull base, in a cadaveric model, for use in neurosurgical training. Silicone-injected and pressurized cadaveric heads were used for studying the SRSP model. The SRSP presents unique intrinsic metamorphic characteristics: liquid at first, it expands and foams when injected into the desired area of the brain, forming a solid tumorlike structure. The authors injected SRSP via different passages that did not influence routes used for the surgical approach for resection of the simulated lesion. For example, SRSP injection routes included endonasal transsphenoidal or transoral approaches if lesions were to be removed through standard skull base approach, or, alternatively, SRSP was injected via a cranial approach if the removal was planned to be via the transsphenoidal or transoral route. The model was set in place in 3 countries (US, Italy, and The Netherlands), and a pool of 13 physicians from 4 different institutions (all surgeons and surgeons in training) participated in evaluating it and provided feedback. All 13 evaluating physicians had overall positive impressions of the model. The overall score on 9 components evaluated--including comparison between the tumor model and real tumor cases, perioperative requirements, general impression, and applicability--was 88% (100% being the best possible

  17. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae

    2008-01-01

    Model based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model based design involves the use of multiple models that represent different views of a system, having different semantics...... of abstract system descriptions. Usually, in mechatronics systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, a plant and controller behavior is simulated using graphical blocks to represent mathematical and logical...... constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems) is such a component-based system framework developed by the software engineering group of Mads Clausen Institute for Product Innovation (MCI), University of Southern Denmark. Once specified, the software model has...

  18. Validation of a model-based measurement of the minimum insert thickness of knee prostheses: a retrieval study.

    Science.gov (United States)

    van IJsseldijk, E A; Harman, M K; Luetzner, J; Valstar, E R; Stoel, B C; Nelissen, R G H H; Kaptein, B L

    2014-10-01

    Wear of polyethylene inserts plays an important role in failure of total knee replacement and can be monitored in vivo by measuring the minimum joint space width in anteroposterior radiographs. The objective of this retrospective cross-sectional study was to compare the accuracy and precision of a new model-based method with the conventional method by analysing the difference between the minimum joint space width measurements and the actual thickness of retrieved polyethylene tibial inserts. Before revision, the minimum joint space width values and their locations on the insert were measured in 15 fully weight-bearing radiographs. These measurements were compared with the actual minimum thickness values and locations of the retrieved tibial inserts after revision. The mean error in the model-based minimum joint space width measurement was significantly smaller than that of the conventional method for medial condyles (0.50 vs 0.94 mm, p …). … model-based measurements was less than 10 mm in the medial direction in 12 cases and less in the lateral direction in 13 cases. The model-based minimum joint space width measurement method is more accurate than the conventional measurement, with the same precision. Cite this article: Bone Joint Res 2014;3:289-96. ©2014 The British Editorial Society of Bone & Joint Surgery.

  19. Structural Acoustic Physics Based Modeling of Curved Composite Shells

    Science.gov (United States)

    2017-09-19

    NUWC-NPT Technical Report 12,236, 19 September 2017: Structural Acoustic Physics-Based Modeling of Curved Composite Shells, by Rachel E. Hesse. [Report documentation-page boilerplate removed.] … The study was to use physics-based modeling (PBM) to investigate wave propagation through curved shells that are subjected to acoustic excitation. An…

  20. Study on heat transfer and hydraulic model of spiral-fin fuel rods based on equivalent annulus method

    International Nuclear Information System (INIS)

    Zhang Dan; Liu Changwen; Lu Jianchao

    2011-01-01

    Tight-lattice fuel assemblies usually adopt spiral-fin fuel elements. Compared with traditional PWR fuel rods, the close packing and the spiral-fin spacers make the heat transfer and hydraulic phenomena in the sub-channels very complicated, and there was no suitable model or correlation to study them. This paper studied the effect of the spiral spacers on the channel geometry and physical performance in the equivalent annulus, based on the Rehme equivalent annulus method, and heat transfer and hydraulic models for the spiral-fin fuel rods were obtained. The new models were verified against the traditional ones, and the verification showed that they agreed well, which provides a theoretical explanation of the effect of the spiral spacers on the thermal hydraulics. (authors)

  1. A Full-Body Layered Deformable Model for Automatic Model-Based Gait Recognition

    Science.gov (United States)

    Lu, Haiping; Plataniotis, Konstantinos N.; Venetsanopoulos, Anastasios N.

    2007-12-01

    This paper proposes a full-body layered deformable model (LDM) inspired by manually labeled silhouettes for automatic model-based gait recognition from part-level gait dynamics in monocular video sequences. The LDM is defined for the fronto-parallel gait with 22 parameters describing the human body part shapes (widths and lengths) and dynamics (positions and orientations). There are four layers in the LDM and the limbs are deformable. Algorithms for LDM-based human body pose recovery are then developed to estimate the LDM parameters from both manually labeled and automatically extracted silhouettes, where the automatic silhouette extraction is through a coarse-to-fine localization and extraction procedure. The estimated LDM parameters are used for model-based gait recognition by employing the dynamic time warping for matching and adopting the combination scheme in AdaBoost.M2. While the existing model-based gait recognition approaches focus primarily on the lower limbs, the estimated LDM parameters enable us to study full-body model-based gait recognition by utilizing the dynamics of the upper limbs, the shoulders and the head as well. In the experiments, the LDM-based gait recognition is tested on gait sequences with differences in shoe-type, surface, carrying condition and time. The results demonstrate that the recognition performance benefits from not only the lower limb dynamics, but also the dynamics of the upper limbs, the shoulders and the head. In addition, the LDM can serve as an analysis tool for studying factors affecting the gait under various conditions.
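
    The matching step named here, dynamic time warping, aligns two gait-parameter sequences of unequal length before comparison. A minimal scalar-sequence implementation is sketched below; the LDM would supply multi-dimensional parameter trajectories, and the toy traces are invented.

        import numpy as np

        def dtw_distance(a, b):
            """Classic O(len(a)*len(b)) dynamic time warping distance."""
            n, m = len(a), len(b)
            D = np.full((n + 1, m + 1), np.inf)
            D[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = abs(a[i - 1] - b[j - 1])
                    D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
            return D[n, m]

        # two toy limb-orientation traces: the same gait cycle at different speeds
        probe = np.sin(np.linspace(0, 2 * np.pi, 60))
        gallery = np.sin(np.linspace(0, 2 * np.pi, 45))
        print(round(dtw_distance(probe, gallery), 4))   # near 0 despite the length mismatch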

  2. On Feature Relevance in Image-Based Prediction Models: An Empirical Study

    DEFF Research Database (Denmark)

    Konukoglu, E.; Ganz, Melanie; Van Leemput, Koen

    2013-01-01

    Determining disease-related variations of the anatomy and function is an important step in better understanding diseases and developing early diagnostic systems. In particular, image-based multivariate prediction models and the “relevant features” they produce are attracting attention from the co...

  3. Large-scale Comparative Study of Hi-C-based Chromatin 3D Structure Modeling Methods

    KAUST Repository

    Wang, Cheng

    2018-05-17

    Chromatin is a complex polymer molecule in eukaryotic cells, primarily consisting of DNA and histones. Many works have shown that the 3D folding of chromatin structure plays an important role in gene expression. The recently proposed Chromosome Conformation Capture technologies, especially the Hi-C assays, provide us with an opportunity to study how the 3D structures of the chromatin are organized. Based on the data from Hi-C experiments, many chromatin 3D structure modeling methods have been proposed. However, there is limited ground truth to validate these methods and no robust chromatin structure alignment algorithms to evaluate their performance. In our work, we first made a thorough literature review of 25 publicly available population Hi-C-based chromatin 3D structure modeling methods. Furthermore, to evaluate and compare the performance of these methods, we proposed a novel data simulation method, which combined population Hi-C data and single-cell Hi-C data without ad hoc parameters. Also, we designed global and local alignment algorithms to measure the similarity between the templates and the chromatin structures predicted by the different modeling methods. Finally, the results from large-scale comparative tests indicated that our alignment algorithms significantly outperform the algorithms in the literature.

  4. Elastoplastic cup model for cement-based materials

    Directory of Open Access Journals (Sweden)

    Yan Zhang

    2010-03-01

    Full Text Available Based on experimental data obtained from triaxial tests and a hydrostatic test, a cup model was formulated. Two plastic mechanisms, deviatoric shearing and pore collapse, are taken into account. The model also considers the influence of confining pressure. In this paper, the calibration of the model is detailed and numerical simulations of the main mechanical behavior of cement paste over a large range of stress are described, showing good agreement with experimental results. The case study shows that this cup model is widely applicable to cement-based materials and other quasi-brittle, high-porosity materials in complex stress states.

  5. Agent-Based Modeling in Systems Pharmacology.

    Science.gov (United States)

    Cosgrove, J; Butler, J; Alden, K; Read, M; Kumar, V; Cucurull-Sanchez, L; Timmis, J; Coles, M

    2015-11-01

    Modeling and simulation (M&S) techniques provide a platform for knowledge integration and hypothesis testing to gain insights into biological systems that would not be possible a priori. Agent-based modeling (ABM) is an M&S technique that focuses on describing individual components rather than homogenous populations. This tutorial introduces ABM to systems pharmacologists, using relevant case studies to highlight how ABM-specific strengths have yielded success in the area of preclinical mechanistic modeling.

  6. Guide to APA-Based Models

    Science.gov (United States)

    Robins, Robert E.; Delisi, Donald P.

    2008-01-01

    In Robins and Delisi (2008), a linear decay model, a new IGE model by Sarpkaya (2006), and a series of APA-Based models were scored using data from three airports. This report is a guide to the APA-based models.

  7. PARTICIPATION BASED MODEL OF SHIP CREW MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Toni Bielić

    2014-10-01

    Full Text Available This paper analyses the participation-based model on board ship as a possibly optimal leadership model in the shipping industry, with emphasis on the decision-making process. The authors define the master's behaviour model and management style, identifying the drawbacks and disadvantages of a vertical, pyramidal organization with the master at the top. The paper describes the efficiency of decision making within a team organization and the optimization of a ship's organisation by introducing teamwork on board. Three ship accidents are studied and evaluated through the "leader-participation" model. The participation-based management model, as a teamwork model, has been applied in studying the cause and effect of the accidents, with a critical review of communication and human resource management on board. The results showed that the cause of all three accidents was the autocratic behaviour of the leaders and a lack of communication within the teams.

  8. Study on solitary word based on HMM model and Baum-Welch algorithm

    Directory of Open Access Journals (Sweden)

    Junxia CHEN

    Full Text Available This paper introduces the principle of the Hidden Markov Model (HMM), a probability model that describes the statistical properties of a Markov process with unknown parameters. On this basis, a solitary-word detection experiment based on the HMM is designed. By optimizing the experimental model and using the Baum-Welch algorithm to solve the HMM training problem, the model parameters λ are estimated, which from a mathematical point of view is equivalent to estimating other linear prediction coefficients. The experiment reduces unnecessary HMM training and at the same time reduces the algorithm's complexity. To test the effectiveness of the Baum-Welch algorithm, simulations were run on experimental data; the results show that the algorithm is effective.
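
    For context, the quantity Baum-Welch maximises is the sequence likelihood computed by the forward algorithm; a scaled implementation for a discrete-observation HMM is sketched below. The training loop itself (re-estimating π, A, B from expected counts) is omitted, and all matrices are invented.

        import numpy as np

        def forward_log_prob(obs, pi, A, B):
            """Scaled forward algorithm: log P(obs | lambda=(pi, A, B))."""
            alpha = pi * B[:, obs[0]]
            scale = alpha.sum()
            alpha /= scale
            log_p = np.log(scale)
            for o in obs[1:]:
                alpha = (alpha @ A) * B[:, o]   # propagate, then weight by emission
                scale = alpha.sum()
                alpha /= scale
                log_p += np.log(scale)
            return log_p

        pi = np.array([0.6, 0.4])                  # initial state distribution
        A = np.array([[0.7, 0.3], [0.2, 0.8]])     # state transition probabilities
        B = np.array([[0.9, 0.1], [0.3, 0.7]])     # emission probabilities
        print(forward_log_prob([0, 0, 1, 0], pi, A, B))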

  9. Traceability in Model-Based Testing

    Directory of Open Access Journals (Sweden)

    Mathew George

    2012-11-01

    Full Text Available The growing complexities of software and the demand for shorter time to market are two important challenges that face today’s IT industry. These challenges demand the increase of both productivity and quality of software. Model-based testing is a promising technique for meeting these challenges. Traceability modeling is a key issue and challenge in model-based testing. Relationships between the different models will help to navigate from one model to another, and trace back to the respective requirements and the design model when the test fails. In this paper, we present an approach for bridging the gaps between the different models in model-based testing. We propose relation definition markup language (RDML for defining the relationships between models.

  10. Study on a Threat-Countermeasure Model Based on International Standard Information

    Directory of Open Access Journals (Sweden)

    Guillermo Horacio Ramirez Caceres

    2008-12-01

    Full Text Available Many international standards exist in the field of IT security. This research is based on the ISO/IEC 15408, 15446, 19791, 13335 and 17799 standards. In this paper, we propose a knowledge base comprising a threat countermeasure model based on international standards for identifying and specifying threats which affect IT environments. In addition, the proposed knowledge base system aims at fusing similar security control policies and objectives in order to create effective security guidelines for specific IT environments. As a result, a knowledge base of security objectives was developed on the basis of the relationships inside the standards as well as the relationships between different standards. In addition, a web application was developed which displays details about the most common threats to information systems, and for each threat presents a set of related security control policies from different international standards, including ISO/IEC 27002.

  11. Study of Railway Track Irregularity Standard Deviation Time Series Based on Data Mining and Linear Model

    Directory of Open Access Journals (Sweden)

    Jia Chaolong

    2013-01-01

    Full Text Available A good track geometry state ensures the safe operation of railway passenger and freight services. Railway transportation plays an important role in Chinese economic and social development. This paper studies track irregularity standard deviation time series data and focuses on the characteristics and trend changes of the track state by applying clustering analysis. A linear recursive model and a linear-ARMA model based on wavelet decomposition and reconstruction are proposed, both of which offer support for the safe management of railway transportation.
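
    A minimal sketch of the linear-ARMA ingredient using statsmodels, fitted on a synthetic standard-deviation series; in the paper the ARMA part is applied to wavelet-reconstructed components rather than the raw series, and the orders and data below are assumptions.

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        # synthetic series of monthly track-irregularity standard deviations (mm)
        rng = np.random.default_rng(1)
        t = np.arange(120)
        sd_series = 0.8 + 0.002 * t + 0.05 * rng.standard_normal(t.size)

        fit = ARIMA(sd_series, order=(2, 0, 1), trend="ct").fit()  # ARMA(2,1) + linear trend
        print(fit.forecast(steps=3))   # short-horizon deterioration forecast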

  12. Study on geology and geological structure based on literature studies

    International Nuclear Information System (INIS)

    Funaki, Hironori; Ishii, Eiichi; Yasue, Ken-ichi; Takahashi, Kazuharu

    2005-03-01

    Japan Nuclear Cycle Development Institute (JNC) is proceeding with an underground research laboratory (URL) project in the sedimentary rock at Horonobe, Hokkaido. The project is an investigation programme planned to span over 20 years. Surface-based investigations (Phase 1) have been conducted so far. The purposes of Phase 1 are to construct the geological environment model (geological-structural, hydrogeological, and hydrochemical models) and to confirm the applicability of investigation technologies for the geological environment. The geological-structural model forms the base for the hydrogeological and hydrochemical models. We constructed the geological-structural model mainly using data obtained from literature studies. Particulars regarding which data the model is based on, and who performed the interpretation, are also recorded for traceability. As a result, we explain the degree of understanding and the need for information on stratigraphy and discontinuous structures. (author)

  13. Towards a standard model for research in agent-based modeling and simulation

    Directory of Open Access Journals (Sweden)

    Nuno Fachada

    2015-11-01

    Full Text Available Agent-based modeling (ABM) is a bottom-up modeling approach, where each entity of the system being modeled is uniquely represented as an independent decision-making agent. ABMs are very sensitive to implementation details. Thus, it is very easy to inadvertently introduce changes which modify model dynamics. Such problems usually arise due to the lack of transparency in model descriptions, which constrains how models are assessed, implemented and replicated. In this paper, we present PPHPC, a model which aims to serve as a standard in agent based modeling research, namely, but not limited to, conceptual model specification, statistical analysis of simulation output, model comparison and parallelization studies. This paper focuses on the first two aspects (conceptual model specification and statistical analysis of simulation output), also providing a canonical implementation of PPHPC. The paper serves as a complete reference to the presented model, and can be used as a tutorial for simulation practitioners who wish to improve the way they communicate their ABMs.

  14. Bayesian Based Diagnostic Model for Condition Based Maintenance of Offshore Wind Farms

    Directory of Open Access Journals (Sweden)

    Masoud Asgarpour

    2018-01-01

    Full Text Available Operation and maintenance costs are a major contributor to the Levelized Cost of Energy for electricity produced by offshore wind and can be significantly reduced if existing corrective actions are performed as efficiently as possible and if future corrective actions are avoided by performing sufficient preventive actions. This paper presents an applied and generic diagnostic model for fault detection and condition based maintenance of offshore wind components. The diagnostic model is based on two probabilistic matrices: first, a confidence matrix, representing the probability of detection using each fault detection method, and second, a diagnosis matrix, representing the individual outcome of each fault detection method. Once the confidence and diagnosis matrices of a component are defined, the individual diagnoses of each fault detection method are combined into a final verdict on the fault state of that component. Furthermore, this paper introduces a Bayesian updating model based on observations collected by inspections to decrease the uncertainty of the initial confidence matrix. The framework and implementation of the presented diagnostic model are further explained within a case study for a wind turbine component based on vibration, temperature, and oil particle fault detection methods. The last part of the paper discusses the case study results and presents conclusions.
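
    The combination of individual diagnoses into a final verdict can be illustrated with a naive-Bayes fusion over per-method detection probabilities of the kind a confidence matrix would contain; the numbers below are invented, and the paper's actual combination rule may differ.

        def fuse_diagnoses(prior_fault, alarms, p_detect, p_false_alarm):
            """Naive-Bayes fusion of independent fault-detection methods.

            alarms[i]        -- True if method i flagged a fault
            p_detect[i]      -- P(alarm | fault) for method i
            p_false_alarm[i] -- P(alarm | healthy) for method i
            """
            p_f, p_h = prior_fault, 1.0 - prior_fault
            for a, pd, pfa in zip(alarms, p_detect, p_false_alarm):
                p_f *= pd if a else (1.0 - pd)
                p_h *= pfa if a else (1.0 - pfa)
            return p_f / (p_f + p_h)

        # vibration and temperature methods alarm, oil-particle method stays silent
        post = fuse_diagnoses(0.05, [True, True, False],
                              p_detect=[0.9, 0.7, 0.6],
                              p_false_alarm=[0.1, 0.2, 0.05])
        print(round(post, 3))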

  15. Evolutionary modeling-based approach for model errors correction

    Directory of Open Access Journals (Sweden)

    S. Q. Wan

    2012-08-01

    Full Text Available The inverse problem of using the information of historical data to estimate model errors is one of the science frontier research topics. In this study, we investigate such a problem using the classic Lorenz (1963) equation as a prediction model and the Lorenz equation with a periodic evolutionary function as an accurate representation of reality to generate "observational data."

    On the basis of the intelligent features of evolutionary modeling (EM), including self-organization, self-adaptation and self-learning, the dynamic information contained in the historical data can be identified and extracted automatically by computer. Thereby, a new approach to estimating model errors based on EM is proposed in the present paper. Numerical tests demonstrate the ability of the new approach to correct model structural errors. In fact, it combines statistics and dynamics to a certain extent.

  16. Activity-based DEVS modeling

    DEFF Research Database (Denmark)

    Alshareef, Abdurrahman; Sarjoughian, Hessam S.; Zarrin, Bahram

    2018-01-01

    architecture and the UML concepts. In this paper, we further this work by grounding Activity-based DEVS modeling and developing a fully-fledged modeling engine to demonstrate applicability. We also detail the relevant aspects of the created metamodel in terms of modeling and simulation. A significant number......Use of model-driven approaches has been increasing to significantly benefit the process of building complex systems. Recently, an approach for specifying model behavior using UML activities has been devised to support the creation of DEVS models in a disciplined manner based on the model driven...... of the artifacts of the UML 2.5 activities and actions, from the vantage point of DEVS behavioral modeling, is covered in details. Their semantics are discussed to the extent of time-accurate requirements for simulation. We characterize them in correspondence with the specification of the atomic model behavior. We...

  17. Effectiveness of Facebook Based Learning to Enhance Creativity among Islamic Studies Students by Employing Isman Instructional Design Model

    Science.gov (United States)

    Alias, Norlidah; Siraj, Saedah; Daud, Mohd Khairul Azman Md; Hussin, Zaharah

    2013-01-01

    The study examines the effectiveness of Facebook-based learning to enhance creativity among Islamic Studies students in the secondary educational setting in Malaysia. It describes the design process by employing the Isman Instructional Design Model. A quantitative study was carried out using an experimental method and a background survey. The…

  18. A study of tumour growth based on stoichiometric principles: a continuous model and its discrete analogue.

    Science.gov (United States)

    Saleem, M; Agrawal, Tanuja; Anees, Afzal

    2014-01-01

    In this paper, we consider a continuous, mathematically tractable model and its discrete analogue for tumour growth. The model formulation is based on stoichiometric principles considering tumour-immune cell interactions in a potassium (K+)-limited environment. Both our continuous and discrete models illustrate 'cancer immunoediting' as a dynamic process having all three phases, namely elimination, equilibrium and escape. The stoichiometric principles introduced into the model allow us to study its dynamics with variation in the total potassium in the surroundings of the tumour region. It is found that an increase in the total potassium may help the patient fight the disease for a longer period of time. This result seems to be in line with the protective role of potassium against the risk of pancreatic cancer, as has been reported by Bravi et al. [Dietary intake of selected micronutrients and risk of pancreatic cancer: An Italian case-control study, Ann. Oncol. 22 (2011), pp. 202-206].

  19. A semi-analytical bearing model considering outer race flexibility for model based bearing load monitoring

    Science.gov (United States)

    Kerst, Stijn; Shyrokau, Barys; Holweg, Edward

    2018-05-01

    This paper proposes a novel semi-analytical bearing model addressing the flexibility of the bearing outer race structure. It furthermore presents the application of this model in a bearing load condition monitoring approach. The bearing model is developed because current computationally low-cost bearing models, owing to their rigidity assumptions, fail to provide an accurate description of the increasingly common flexible, size- and weight-optimized bearing designs. In the proposed bearing model, raceway flexibility is described by the use of static deformation shapes. The excitation of the deformation shapes is calculated based on the modelled rolling element loads and a Fourier series based compliance approximation. The resulting model is computationally inexpensive and provides an accurate description of the rolling element loads for flexible outer raceway structures. The latter is validated by a simulation-based comparison study with a well-established bearing simulation software tool. An experimental study finally shows the potential of the proposed model in a bearing load monitoring approach.

  20. Model-Based Reconstructive Elasticity Imaging Using Ultrasound

    Directory of Open Access Journals (Sweden)

    Salavat R. Aglyamov

    2007-01-01

    Full Text Available Elasticity imaging is a reconstructive imaging technique where tissue motion in response to mechanical excitation is measured using modern imaging systems, and the estimated displacements are then used to reconstruct the spatial distribution of Young's modulus. Here we present an ultrasound elasticity imaging method that utilizes the model-based technique for Young's modulus reconstruction. Based on the geometry of the imaged object, only one axial component of the strain tensor is used. The numerical implementation of the method is highly efficient because the reconstruction is based on an analytic solution of the forward elastic problem. The model-based approach is illustrated using two potential clinical applications: differentiation of liver hemangioma and staging of deep venous thrombosis. Overall, these studies demonstrate that model-based reconstructive elasticity imaging can be used in applications where the geometry of the object and the surrounding tissue is somewhat known and certain assumptions about the pathology can be made.

  1. Perceptual decision neurosciences: a model-based review

    NARCIS (Netherlands)

    Mulder, M.J.; van Maanen, L.; Forstmann, B.U.

    2014-01-01

    In this review we summarize findings published over the past 10 years focusing on the neural correlates of perceptual decision-making. Importantly, this review highlights only studies that employ a model-based approach, i.e., they use quantitative cognitive models in combination with neuroscientific

  2. Structuring Qualitative Data for Agent-Based Modelling

    NARCIS (Netherlands)

    Ghorbani, Amineh; Dijkema, Gerard P.J.; Schrauwen, Noortje

    2015-01-01

    Using ethnography to build agent-based models may result in more empirically grounded simulations. Our study on innovation practice and culture in the Westland horticulture sector served to explore what information and data from ethnographic analysis could be used in models and how. MAIA, a

  3. A system-theory-based model for monthly river runoff forecasting: model calibration and optimization

    Directory of Open Access Journals (Sweden)

    Wu Jianhua

    2014-03-01

    Full Text Available River runoff is not only a crucial part of the global water cycle, but it is also an important source for hydropower and an essential element of water balance. This study presents a system-theory-based model for river runoff forecasting taking the Hailiutu River as a case study. The forecasting model, designed for the Hailiutu watershed, was calibrated and verified by long-term precipitation observation data and groundwater exploitation data from the study area. Additionally, frequency analysis, taken as an optimization technique, was applied to improve prediction accuracy. Following model optimization, the overall relative prediction errors are below 10%. The system-theory-based prediction model is applicable to river runoff forecasting, and following optimization by frequency analysis, the prediction error is acceptable.

  4. Interactive Coherence-Based Façade Modeling

    KAUST Repository

    Musialski, Przemyslaw

    2012-05-01

    We propose a novel interactive framework for modeling building facades from images. Our method is based on the notion of coherence-based editing which allows exploiting partial symmetries across the facade at any level of detail. The proposed workflow mixes manual interaction with automatic splitting and grouping operations based on unsupervised cluster analysis. In contrast to previous work, our approach leads to detailed 3D geometric models with up to several thousand regions per facade. We compare our modeling scheme to others and evaluate our approach in a user study with an experienced user and several novice users.

  5. A Case Study on a Combination NDVI Forecasting Model Based on the Entropy Weight Method

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Shengzhi; Ming, Bo; Huang, Qiang; Leng, Guoyong; Hou, Beibei

    2017-05-05

    It is critically meaningful to accurately predict NDVI (Normalized Difference Vegetation Index), which helps guide regional ecological remediation and environmental management. In this study, a combination forecasting model (CFM) was proposed to improve the performance of NDVI predictions in the Yellow River Basin (YRB) based on three individual forecasting models, i.e., the Multiple Linear Regression (MLR), Artificial Neural Network (ANN), and Support Vector Machine (SVM) models. The entropy weight method was employed to determine the weight coefficient for each individual model depending on its predictive performance. Results showed that: (1) ANN exhibits the highest fitting capability among the four forecasting models in the calibration period, whilst its generalization ability becomes weak in the validation period; MLR has a poor performance in both calibration and validation periods; the predicted results of CFM in the calibration period have the highest stability; (2) CFM generally outperforms all individual models in the validation period, and can improve the reliability and stability of predicted results through combining the strengths while reducing the weaknesses of individual models; (3) the performances of all forecasting models are better in dense vegetation areas than in sparse vegetation areas.
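
    The entropy weighting step itself is short; the sketch below applies it to a synthetic matrix of model performance indices (here, normalized absolute errors), which is an assumption for illustration rather than the paper's actual evaluation data.

        import numpy as np

        # rows = validation samples, columns = models (MLR, ANN, SVM)
        errors = np.array([[0.12, 0.05, 0.08],
                           [0.20, 0.07, 0.06],
                           [0.15, 0.09, 0.10]])

        p = errors / errors.sum(axis=0)             # proportion of each sample per model
        k = 1.0 / np.log(errors.shape[0])           # normalization constant for m samples
        entropy = -k * (p * np.log(p)).sum(axis=0)  # information entropy per model
        divergence = 1.0 - entropy                  # degree of divergence
        weights = divergence / divergence.sum()     # entropy weight of each model
        print(weights)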

  6. Low contrast detectability and spatial resolution with model-based iterative reconstructions of MDCT images: a phantom and cadaveric study

    Energy Technology Data Exchange (ETDEWEB)

    Millon, Domitille; Coche, Emmanuel E. [Universite Catholique de Louvain, Department of Radiology and Medical Imaging, Cliniques Universitaires Saint Luc, Brussels (Belgium); Vlassenbroek, Alain [Philips Healthcare, Brussels (Belgium); Maanen, Aline G. van; Cambier, Samantha E. [Universite Catholique de Louvain, Statistics Unit, King Albert II Cancer Institute, Brussels (Belgium)

    2017-03-15

    To compare the image quality [low contrast (LC) detectability, noise, contrast-to-noise ratio (CNR) and spatial resolution (SR)] of MDCT images reconstructed with an iterative reconstruction (IR) algorithm and a filtered back projection (FBP) algorithm. The experimental study was performed on a 256-slice MDCT. LC detectability, noise, CNR and SR were measured on a Catphan phantom scanned with decreasing doses (48.8 down to 0.7 mGy) and parameters typical of a chest CT examination. Images were reconstructed with FBP and a model-based IR algorithm. Additionally, human chest cadavers were scanned and reconstructed using the same technical parameters. Images were analyzed to illustrate the phantom results. LC detectability and noise were statistically significantly different between the techniques, favouring the model-based IR algorithm (p < 0.0001). At low doses, the noise in FBP images only enabled SR measurements of high contrast objects. The superior CNR of the model-based IR algorithm enabled lower-dose measurements, which showed that SR was dose and contrast dependent. Cadaver images reconstructed with model-based IR illustrated that the visibility and delineation of anatomical structure edges could deteriorate at low doses. Model-based IR improved LC detectability and enabled dose reduction. At low dose, SR became dose and contrast dependent. (orig.)

  7. Gradient-based model calibration with proxy-model assistance

    Science.gov (United States)

    Burrows, Wesley; Doherty, John

    2016-02-01

    Use of a proxy model in gradient-based calibration and uncertainty analysis of a complex groundwater model with long run times and problematic numerical behaviour is described. The methodology is general and can be used with models of all types. The proxy model is based on a series of analytical functions that link all model outputs used in the calibration process to all parameters requiring estimation. In enforcing history-matching constraints during the calibration and post-calibration uncertainty analysis processes, the proxy model is run to populate the Jacobian matrix, while the original model is run when testing parameter upgrades; the latter process is readily parallelized. Use of a proxy model in this fashion dramatically reduces the computational burden of complex model calibration and uncertainty analysis. At the same time, the effect of model numerical misbehaviour on the calculation of local gradients is mitigated, thus allowing access to the benefits of gradient-based analysis where lack of integrity in finite-difference derivative calculation would otherwise have impeded such access. Construction of a proxy model, and its subsequent use in calibrating a complex model and in analysing the uncertainties of predictions made by that model, is implemented in the PEST suite.
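
    The division of labour between the two models can be outlined as follows; the function names and the Levenberg-Marquardt-style upgrade are assumptions for a conceptual sketch, not the PEST implementation.

        import numpy as np

        def jacobian_from_proxy(proxy, params, eps=1e-4):
            """Finite-difference Jacobian populated with cheap proxy-model runs."""
            base = proxy(params)
            J = np.zeros((base.size, params.size))
            for i in range(params.size):
                p = params.copy()
                p[i] += eps
                J[:, i] = (proxy(p) - base) / eps
            return J

        def calibrate(model, proxy, params, obs, iters=10, lam=1e-2):
            for _ in range(iters):
                r = obs - model(params)                 # expensive original-model run
                J = jacobian_from_proxy(proxy, params)  # many cheap proxy runs
                step = np.linalg.solve(J.T @ J + lam * np.eye(params.size), J.T @ r)
                trial = params + step
                if np.sum((obs - model(trial)) ** 2) < np.sum(r ** 2):
                    params = trial                      # upgrade tested on the original
                else:
                    lam *= 10.0                         # dampen the step and retry
            return params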

  8. Quantitative Analysis of Intra Urban Growth Modeling using socio economic agents by combining cellular automata model with agent based model

    Science.gov (United States)

    Singh, V. K.; Jha, A. K.; Gupta, K.; Srivastav, S. K.

    2017-12-01

    Recent studies indicate that modeling at finer spatial resolutions significantly improves the representation of urban land use dynamics. Geo-computational models such as cellular automata and agent-based models have provided clear evidence for quantifying urban growth patterns within the urban boundary. Recent studies consider socio-economic factors such as demography, education rate, household density, current-year parcel price, and distance to roads, schools, hospitals, commercial centers and police stations to be the major factors influencing the Land Use Land Cover (LULC) pattern of a city. These factors take a unidirectional approach to the land use pattern, which makes it difficult to analyze the spatial aspects of model results both quantitatively and qualitatively. In this study, a cellular automata model is combined with an agent-based model to evaluate the impact of socio-economic factors on the land use pattern. For this purpose, Dehradun, an Indian city, is selected as a case study. Socio-economic factors were collected from a field survey, the Census of India, and the Directorate of Economic Census, Uttarakhand, India. A 3x3 simulating window is used to consider the impact on LULC. The cellular automata model results are examined to identify hot-spot areas within the urban area, and the agent-based model uses a logistic regression approach to identify the correlation between each factor and LULC and to classify the available area into low-density, medium-density, high-density residential or commercial areas. In the modeling phase, transition rules, neighborhood effects and cell change factors are used to improve the representation of built-up classes. A significant improvement in the built-up classes is observed, from 84% to 89%; after incorporating the agent-based model with the cellular automata model, the accuracy further improved from 89% to 94% in three urban classes, i.e. low density, medium density and commercial classes
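
    A stripped-down version of one cellular-automata transition pass with a 3x3 simulating window is sketched below; the classes, the suitability layer and the threshold are illustrative assumptions, not the study's calibrated rules.

        import numpy as np

        VACANT, LOW, MEDIUM, COMMERCIAL = 0, 1, 2, 3

        def ca_step(grid, suitability, threshold=0.5):
            """One synchronous CA pass; suitability stands in for the socio-economic factors."""
            new = grid.copy()
            rows, cols = grid.shape
            for i in range(1, rows - 1):
                for j in range(1, cols - 1):
                    if grid[i, j] != VACANT:
                        continue
                    window = grid[i - 1:i + 2, j - 1:j + 2]    # 3x3 neighborhood
                    built = np.count_nonzero(window) / 8.0     # neighborhood effect
                    if built * suitability[i, j] > threshold:  # cell change factor
                        new[i, j] = LOW                        # densified in later passes
            return new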

  9. HMM-based Trust Model

    DEFF Research Database (Denmark)

    ElSalamouny, Ehab; Nielsen, Mogens; Sassone, Vladimiro

    2010-01-01

    Probabilistic trust has been adopted as an approach to taking security sensitive decisions in modern global computing environments. Existing probabilistic trust frameworks either assume fixed behaviour for the principals or incorporate the notion of ‘decay' as an ad hoc approach to cope with their dynamic behaviour. Using Hidden Markov Models (HMMs) for both modelling and approximating the behaviours of principals, we introduce the HMM-based trust model as a new approach to evaluating trust in systems exhibiting dynamic behaviour. This model avoids the fixed behaviour assumption which is considered the major limitation of the existing Beta trust model. We show the consistency of the HMM-based trust model and contrast it against the well-known Beta trust model with the decay principle in terms of estimation precision.
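
    The core of the approach can be illustrated with a two-state hidden Markov chain and the standard forward recursion; the states and all probabilities below are illustrative assumptions, not the paper's estimates.

        import numpy as np

        A = np.array([[0.95, 0.05],    # transition matrix: stable vs. erratic behaviour
                      [0.10, 0.90]])
        B = np.array([[0.90, 0.10],    # P(outcome | state), outcomes = (success, failure)
                      [0.30, 0.70]])
        pi = np.array([0.5, 0.5])      # initial state distribution

        def forward(outcomes):
            """Filtered state distribution after a sequence of 0/1 outcomes."""
            alpha = pi * B[:, outcomes[0]]
            alpha /= alpha.sum()
            for o in outcomes[1:]:
                alpha = (alpha @ A) * B[:, o]
                alpha /= alpha.sum()
            return alpha

        state = forward([0, 0, 1, 1, 0])   # 0 = success, 1 = failure
        trust = (state @ A) @ B[:, 0]      # predicted probability of next success
        print(f"P(next interaction succeeds) = {trust:.3f}")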

  10. The Challenge of Forecasting Metropolitan Growth: Urban Characteristics Based Models versus Regional Dummy Based Models

    OpenAIRE

    NA

    2005-01-01

    This paper presents a study of errors in forecasting the population of Metropolitan Statistical Areas and the Primary MSAs of Consolidated Metropolitan Statistical Areas and New England MAs. The forecasts are for the year 2000 and are based on a semi-structural model estimated by Mills and Lubelle using 1970 to 1990 census data on population, employment and relative real wages. This model allows the testing of regional effects on population and employment growth. The year 2000 forecasts are f...

  11. Generalized Empirical Likelihood-Based Focused Information Criterion and Model Averaging

    Directory of Open Access Journals (Sweden)

    Naoya Sueishi

    2013-07-01

    Full Text Available This paper develops model selection and averaging methods for moment restriction models. We first propose a focused information criterion based on the generalized empirical likelihood estimator. We address the issue of selecting an optimal model, rather than a correct model, for estimating a specific parameter of interest. Then, this study investigates a generalized empirical likelihood-based model averaging estimator that minimizes the asymptotic mean squared error. A simulation study suggests that our averaging estimator can be a useful alternative to existing post-selection estimators.

  12. Trait-based model development to support breeding programs. A case study for salt tolerance and rice.

    Science.gov (United States)

    Paleari, Livia; Movedi, Ermes; Confalonieri, Roberto

    2017-06-28

    Eco-physiological models are increasingly used to analyze G × E × M interactions to support breeding programs via the design of ideotypes for specific contexts. However, available crop models are only partly suitable for this purpose, since they often lack clear relationships between parameters and traits breeders are working on. Taking salt stress tolerance and rice as a case study, we propose a paradigm shift towards the building of ideotyping-specific models explicitly around traits involved in breeding programs. Salt tolerance is a complex trait relying on different physiological processes that can be alternatively selected to improve the overall crop tolerance. We developed a new model explicitly accounting for these traits and we evaluated its performance using data from growth chamber experiments (e.g., R2 ranged from 0.74 to 0.94 for the biomass of different plant organs). Using the model, we were able to show how an increase in the overall tolerance can derive from completely different physiological mechanisms according to soil/water salinity dynamics. The study demonstrated that a trait-based approach can increase the usefulness of mathematical models for supporting breeding programs.

  13. Base Station Performance Model

    OpenAIRE

    Walsh, Barbara; Farrell, Ronan

    2005-01-01

    At present the testing of power amplifiers within base station transmitters is limited to testing at component level as opposed to testing at the system level. While the detection of catastrophic failure is possible, that of performance degradation is not. This paper proposes a base station model with respect to transmitter output power with the aim of introducing system level monitoring of the power amplifier behaviour within the base station. Our model reflects the expe...

  14. Physically based modeling of rainfall-triggered landslides: a case study in the Luquillo forest, Puerto Rico

    Science.gov (United States)

    Lepore, C.; Arnone, E.; Noto, L. V.; Sivandran, G.; Bras, R. L.

    2013-09-01

    This paper presents the development of a rainfall-triggered landslide module within an existing physically based spatially distributed ecohydrologic model. The model, tRIBS-VEGGIE (Triangulated Irregular Networks-based Real-time Integrated Basin Simulator and Vegetation Generator for Interactive Evolution), is capable of a sophisticated description of many hydrological processes; in particular, the soil moisture dynamics are resolved at a temporal and spatial resolution required to examine the triggering mechanisms of rainfall-induced landslides. The validity of the tRIBS-VEGGIE model to a tropical environment is shown with an evaluation of its performance against direct observations made within the study area of Luquillo Forest. The newly developed landslide module builds upon the previous version of the tRIBS landslide component. This new module utilizes a numerical solution to the Richards' equation (present in tRIBS-VEGGIE but not in tRIBS), which better represents the time evolution of soil moisture transport through the soil column. Moreover, the new landslide module utilizes an extended formulation of the factor of safety (FS) to correctly quantify the role of matric suction in slope stability and to account for unsaturated conditions in the evaluation of FS. The new modeling framework couples the capabilities of the detailed hydrologic model to describe soil moisture dynamics with the infinite slope model, creating a powerful tool for the assessment of rainfall-triggered landslide risk.

  15. Physically based modeling of rainfall-triggered landslides: a case study in the Luquillo forest, Puerto Rico

    Directory of Open Access Journals (Sweden)

    C. Lepore

    2013-09-01

    Full Text Available This paper presents the development of a rainfall-triggered landslide module within an existing physically based spatially distributed ecohydrologic model. The model, tRIBS-VEGGIE (Triangulated Irregular Networks-based Real-time Integrated Basin Simulator and Vegetation Generator for Interactive Evolution), is capable of a sophisticated description of many hydrological processes; in particular, the soil moisture dynamics are resolved at a temporal and spatial resolution required to examine the triggering mechanisms of rainfall-induced landslides. The validity of the tRIBS-VEGGIE model to a tropical environment is shown with an evaluation of its performance against direct observations made within the study area of Luquillo Forest. The newly developed landslide module builds upon the previous version of the tRIBS landslide component. This new module utilizes a numerical solution to the Richards' equation (present in tRIBS-VEGGIE but not in tRIBS), which better represents the time evolution of soil moisture transport through the soil column. Moreover, the new landslide module utilizes an extended formulation of the factor of safety (FS) to correctly quantify the role of matric suction in slope stability and to account for unsaturated conditions in the evaluation of FS. The new modeling framework couples the capabilities of the detailed hydrologic model to describe soil moisture dynamics with the infinite slope model, creating a powerful tool for the assessment of rainfall-triggered landslide risk.

  16. Energy Sustainability Evaluation Model Based on the Matter-Element Extension Method: A Case Study of Shandong Province, China

    Directory of Open Access Journals (Sweden)

    Siqi Li

    2017-11-01

    Full Text Available Energy sustainability is of vital importance to regional sustainability, because energy sustainability is closely related to both regional economic growth and social stability. The existing energy sustainability evaluation methods lack a unified system to determine the relevant influencing factors, are relatively weak in quantitative analysis, and do not fully describe the ‘paradoxical’ characteristics of energy sustainability. To solve those problems and to reasonably and objectively evaluate energy sustainability, we propose an energy sustainability evaluation model based on the matter-element extension method. We first select energy sustainability evaluation indexes based on previous research and experience. Then, a variation coefficient method is used to determine the weights of these indexes. Finally, the study establishes the classical domain, joint domain, and the matter-element relationship to evaluate energy sustainability through matter-element extension. Data from Shandong Province is used as a case study to evaluate the region’s energy sustainability. The case study shows that the proposed energy sustainability evaluation model, based on the matter-element extension method, can effectively evaluate regional energy sustainability.

  17. Variability in Dopamine Genes Dissociates Model-Based and Model-Free Reinforcement Learning.

    Science.gov (United States)

    Doll, Bradley B; Bath, Kevin G; Daw, Nathaniel D; Frank, Michael J

    2016-01-27

    Considerable evidence suggests that multiple learning systems can drive behavior. Choice can proceed reflexively from previous actions and their associated outcomes, as captured by "model-free" learning algorithms, or flexibly from prospective consideration of outcomes that might occur, as captured by "model-based" learning algorithms. However, differential contributions of dopamine to these systems are poorly understood. Dopamine is widely thought to support model-free learning by modulating plasticity in striatum. Model-based learning may also be affected by these striatal effects, or by other dopaminergic effects elsewhere, notably on prefrontal working memory function. Indeed, prominent demonstrations linking striatal dopamine to putatively model-free learning did not rule out model-based effects, whereas other studies have reported dopaminergic modulation of verifiably model-based learning, but without distinguishing a prefrontal versus striatal locus. To clarify the relationships between dopamine, neural systems, and learning strategies, we combine a genetic association approach in humans with two well-studied reinforcement learning tasks: one isolating model-based from model-free behavior and the other sensitive to key aspects of striatal plasticity. Prefrontal function was indexed by a polymorphism in the COMT gene, differences of which reflect dopamine levels in the prefrontal cortex. This polymorphism has been associated with differences in prefrontal activity and working memory. Striatal function was indexed by a gene coding for DARPP-32, which is densely expressed in the striatum where it is necessary for synaptic plasticity. We found evidence for our hypothesis that variations in prefrontal dopamine relate to model-based learning, whereas variations in striatal dopamine function relate to model-free learning. Decisions can stem reflexively from their previously associated outcomes or flexibly from deliberative consideration of potential choice outcomes
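
    The computational distinction drawn above can be made concrete with a toy sketch: the model-free system caches action values with a temporal-difference update, while the model-based system learns transition and reward estimates and derives values prospectively. The small MDP and learning rates below are generic assumptions, not the tasks used in the study.

        import numpy as np

        n_states, n_actions, alpha, gamma = 3, 2, 0.1, 0.9
        Q_mf = np.zeros((n_states, n_actions))                    # model-free values
        T = np.ones((n_states, n_actions, n_states)) / n_states   # learned transitions
        R = np.zeros((n_states, n_actions))                       # learned rewards

        def learn(s, a, r, s2):
            # model-free: reflexive TD update from the experienced outcome
            Q_mf[s, a] += alpha * (r + gamma * Q_mf[s2].max() - Q_mf[s, a])
            # model-based: update the world model instead of the values
            T[s, a] *= 0.9
            T[s, a, s2] += 0.1
            R[s, a] += alpha * (r - R[s, a])

        def q_model_based(sweeps=50):
            # flexible, prospective evaluation by iterating over the learned model
            Q = np.zeros_like(Q_mf)
            for _ in range(sweeps):
                Q = R + gamma * T @ Q.max(axis=1)
            return Q

        learn(0, 1, 1.0, 2)        # one experienced transition updates both systems
        Q_mb = q_model_based()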

  18. Study on dynamic team performance evaluation methodology based on team situation awareness model

    International Nuclear Information System (INIS)

    Kim, Suk Chul

    2005-02-01

    The purpose of this thesis is to provide a theoretical framework and an evaluation methodology for the dynamic task performance of operating teams at nuclear power plants under dynamic and tactical environments such as a radiological accident. This thesis suggests a team dynamic task performance evaluation model, the so-called team crystallization model, stemming from Endsley's situation awareness model and comprising four elements: state, information, organization, and orientation, together with quantification methods using a system dynamics approach and a communication process model based on a receding horizon control approach. The team crystallization model is a holistic approach for evaluating team dynamic task performance in conjunction with team situation awareness, considering physical system dynamics and team behavioral dynamics for a tactical and dynamic task at a nuclear power plant. This model provides a systematic measure to evaluate time-dependent team effectiveness or performance affected by multiple agents such as plant states, communication quality in terms of transferring situation-specific information and strategies for achieving the team task goal at a given time, and organizational factors. To demonstrate the applicability of the proposed model and its quantification method, a case study was carried out using data obtained from a full-scope power plant simulator for 1,000MWe pressurized water reactors with four on-the-job operating groups and one expert group familiar with the accident sequences. Simulated team dynamic task performance, with reference to key plant parameter behavior, the team-specific organizational center of gravity, and the cue-and-response matrix, showed good agreement with observed values. The team crystallization model will be a useful and effective tool for evaluating team effectiveness, for example when recruiting new operating teams for new plants in a cost-effective manner. Also, this model can be utilized as a systematic analysis tool for

  19. Study on dynamic team performance evaluation methodology based on team situation awareness model

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Suk Chul

    2005-02-15

    The purpose of this thesis is to provide a theoretical framework and an evaluation methodology for the dynamic task performance of operating teams at nuclear power plants under dynamic and tactical environments such as a radiological accident. This thesis suggests a team dynamic task performance evaluation model, the so-called team crystallization model, stemming from Endsley's situation awareness model and comprising four elements: state, information, organization, and orientation, together with quantification methods using a system dynamics approach and a communication process model based on a receding horizon control approach. The team crystallization model is a holistic approach for evaluating team dynamic task performance in conjunction with team situation awareness, considering physical system dynamics and team behavioral dynamics for a tactical and dynamic task at a nuclear power plant. This model provides a systematic measure to evaluate time-dependent team effectiveness or performance affected by multiple agents such as plant states, communication quality in terms of transferring situation-specific information and strategies for achieving the team task goal at a given time, and organizational factors. To demonstrate the applicability of the proposed model and its quantification method, a case study was carried out using data obtained from a full-scope power plant simulator for 1,000MWe pressurized water reactors with four on-the-job operating groups and one expert group familiar with the accident sequences. Simulated team dynamic task performance, with reference to key plant parameter behavior, the team-specific organizational center of gravity, and the cue-and-response matrix, showed good agreement with observed values. The team crystallization model will be a useful and effective tool for evaluating team effectiveness, for example when recruiting new operating teams for new plants in a cost-effective manner. Also, this model can be utilized as a systematic analysis tool for

  20. Model based design introduction: modeling game controllers to microprocessor architectures

    Science.gov (United States)

    Jungwirth, Patrick; Badawy, Abdel-Hameed

    2017-04-01

    We present an introduction to model-based design. Model-based design is a visual representation, generally a block diagram, used to model and incrementally develop a complex system. It is a commonly used design methodology for digital signal processing, control systems, and embedded systems. Model-based design's philosophy is to solve a problem one step at a time; the approach can be compared to a series of steps that converge to a solution. A block diagram simulation tool allows a design to be simulated with real-world measurement data. For example, if an analog control system is being upgraded to a digital control system, the analog sensor input signals can be recorded. The digital control algorithm can then be simulated with the real-world sensor data, and the output from the simulated digital control system can be compared to the old analog-based control system. Model-based design can be compared to Agile software development. The Agile goal is to develop working software in incremental steps, with progress measured in completed and tested code units; progress in model-based design is measured by completed and tested blocks. We present a concept for a video game controller and then use model-based design to iterate the design towards a working system. We also describe a model-based design effort to develop an OS Friendly Microprocessor Architecture based on RISC-V.

  1. Simulation-Based Internal Models for Safer Robots

    Directory of Open Access Journals (Sweden)

    Christian Blum

    2018-01-01

    Full Text Available In this paper, we explore the potential of mobile robots with simulation-based internal models for safety in highly dynamic environments. We propose a robot with a simulation of itself, other dynamic actors and its environment, inside itself. Operating in real time, this simulation-based internal model is able to look ahead and predict the consequences of both the robot’s own actions and those of the other dynamic actors in its vicinity. Hence, the robot continuously modifies its own actions in order to actively maintain its own safety while also achieving its goal. Inspired by the problem of how mobile robots could move quickly and safely through crowds of moving humans, we present experimental results which compare the performance of our internal simulation-based controller with a purely reactive approach as a proof-of-concept study for the practical use of simulation-based internal models.

  2. Towards A Model-based Prognostics Methodology for Electrolytic Capacitors: A Case Study Based on Electrical Overstress Accelerated Aging

    Directory of Open Access Journals (Sweden)

    Gautam Biswas

    2012-12-01

    Full Text Available This paper presents a model-driven methodology for predicting the remaining useful life of electrolytic capacitors. This methodology adopts a Kalman filter approach in conjunction with an empirical state-based degradation model to predict the degradation of capacitor parameters through the life of the capacitor. Electrolytic capacitors are important components of systems that range from power supplies on critical avionics equipment to power drivers for electro-mechanical actuators. These devices are known for their comparatively low reliability and, given their critical role in the system, they are good candidates for component-level prognostics and health management. Prognostics provides a way to assess the remaining useful life of a capacitor based on its current state of health and its anticipated future usage and operational conditions. This paper proposes an empirical degradation model and discusses experimental results for an accelerated aging test performed on a set of identical capacitors subjected to electrical stress. The data form the basis for developing the Kalman-filter-based remaining life prediction algorithm.
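
    A scalar sketch of the filtering idea is given below; the exponential degradation law, the noise levels and the end-of-life threshold are illustrative assumptions standing in for the paper's empirical degradation model.

        import numpy as np

        C0, decay = 1.0, 0.9999        # assumed per-cycle degradation of capacitance
        q, r_meas = 1e-6, 1e-3         # process / measurement noise variances
        x, P = C0, 1e-2                # state estimate and its variance
        threshold = 0.8 * C0           # assumed end-of-life criterion (-20 % capacitance)

        def kalman_step(x, P, z):
            x_pred = decay * x                  # predict with the degradation model
            P_pred = decay * P * decay + q
            K = P_pred / (P_pred + r_meas)      # Kalman gain
            return x_pred + K * (z - x_pred), (1.0 - K) * P_pred

        rng = np.random.default_rng(0)
        for k in range(500):                    # synthetic noisy capacitance readings
            z = C0 * decay ** (k + 1) + rng.normal(0.0, np.sqrt(r_meas))
            x, P = kalman_step(x, P, z)

        # remaining useful life: cycles until the estimate crosses the threshold
        rul = np.log(threshold / x) / np.log(decay)
        print(f"estimated capacitance {x:.4f}, RUL ~ {rul:.0f} cycles")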

  3. Physiologically Based Toxicokinetic Modelling as a Tool to Support Risk Assessment: Three Case Studies

    Directory of Open Access Journals (Sweden)

    Hans Mielke

    2012-01-01

    Full Text Available In this contribution we present three case studies of physiologically based toxicokinetic (PBTK) modelling in regulatory risk assessment. (1) Age-dependent lower enzyme expression in the newborn leads to bisphenol A (BPA) blood levels which are near the levels of the tolerated daily intake (TDI) at the oral exposure as calculated by EFSA. (2) Dermal exposure to BPA via receipts, car park tickets, and so forth, contributes to the overall exposure to BPA. However, at the present levels of dermal exposure there is no risk for the adult. (3) Dermal exposure to coumarin via cosmetic products leads to external exposures of twice the TDI. PBTK modelling helped to identify liver peak concentration as the metric for liver toxicity. After dermal exposure of twice the TDI, the liver peak concentration was lower than that present after oral exposure at the TDI dose. In the presented cases, PBTK modelling was useful for reaching scientifically sound regulatory decisions.

  4. In silico modelling and molecular dynamics simulation studies of thiazolidine based PTP1B inhibitors.

    Science.gov (United States)

    Mahapatra, Manoj Kumar; Bera, Krishnendu; Singh, Durg Vijay; Kumar, Rajnish; Kumar, Manoj

    2018-04-01

    Protein tyrosine phosphatase 1B (PTP1B) has been identified as a negative regulator of the insulin and leptin signalling pathways; hence, it can be considered a new therapeutic target of intervention for the treatment of type 2 diabetes. Inhibition of this molecular target addresses both diabetes and obesity, i.e. diabesity. In order to obtain more information for lead identification and optimization, pharmacophore modelling, atom-based 3D QSAR, docking and molecular dynamics studies were carried out on a set of ligands containing the thiazolidine scaffold. A six-point pharmacophore model consisting of three hydrogen bond acceptors (A), one negative ionic (N) and two aromatic rings (R) with discrete geometries as pharmacophoric features was developed for a predictive 3D QSAR model. The probable binding conformation of the ligands within the active site was studied through molecular docking. The molecular interactions and the structural features responsible for PTP1B inhibition and selectivity were further supplemented by a molecular dynamics simulation study over a time scale of 30 ns. The present investigation has identified some of the indispensable structural features of thiazolidine analogues which can be further explored to optimize PTP1B inhibitors.

  5. EPR-based material modelling of soils

    Science.gov (United States)

    Faramarzi, Asaad; Alani, Amir M.

    2013-04-01

    In the past few decades, as a result of the rapid developments in computational software and hardware, alternative computer-aided pattern recognition approaches have been introduced for modelling many engineering problems, including constitutive modelling of materials. The main idea behind pattern recognition systems is that they learn adaptively from experience and extract various discriminants, each appropriate for its purpose. In this work an approach is presented for developing material models for soils based on evolutionary polynomial regression (EPR). EPR is a recently developed hybrid data mining technique that searches for structured mathematical equations (representing the behaviour of a system) using a genetic algorithm and the least squares method. Stress-strain data from triaxial tests are used to train and develop EPR-based material models for soil. The developed models are compared with some of the well-known conventional material models and it is shown that EPR-based models can provide better predictions of the behaviour of soils. The main benefits of EPR-based material models are that they provide a unified approach to constitutive modelling of all materials (i.e., all aspects of material behaviour can be implemented within the unified environment of an EPR model), they do not require any arbitrary choice of constitutive (mathematical) models, and there are no material parameters to be identified. As the model is trained directly on experimental data, EPR-based material models are the shortest route from experimental research (data) to numerical modelling. Another advantage of EPR-based constitutive models is that, as more experimental data become available, the quality of the EPR prediction can be improved by learning from the additional data, making the EPR model more effective and robust. The developed EPR-based material models can be incorporated in finite element (FE) analysis.

  6. Application Study of Comprehensive Forecasting Model Based on Entropy Weighting Method on Trend of PM2.5 Concentration in Guangzhou, China

    Science.gov (United States)

    Liu, Dong-jun; Li, Li

    2015-01-01

    For the issue of haze-fog, PM2.5 is the main factor in haze-fog pollution in China. The trend of PM2.5 concentration was analyzed from a qualitative point of view based on mathematical models and simulation in this study. The comprehensive forecasting model (CFM) was developed based on combination forecasting ideas. The Autoregressive Integrated Moving Average (ARIMA) model, Artificial Neural Network (ANN) model and Exponential Smoothing Method (ESM) were used to predict the time series data of PM2.5 concentration. The results of the comprehensive forecasting model were obtained by combining the results of the three methods based on the weights from the Entropy Weighting Method. The trend of PM2.5 concentration in Guangzhou, China was quantitatively forecasted based on the comprehensive forecasting model. The results were compared with those of the three single models, and PM2.5 concentration values in the next ten days were predicted. The comprehensive forecasting model balances the deviations of the single prediction methods and has better applicability. It offers a new prediction method for the air quality forecasting field. PMID:26110332

  7. SMS and Web-Based e-Government Model Case Study: Citizens Complaints Management System at District of Gihosha –Burundi

    Directory of Open Access Journals (Sweden)

    Mugenzi Thierry

    2017-01-01

    Full Text Available E-Government basically comprises the use of electronic communications technologies, such as the internet, in enhancing and advancing citizens' access to public services. In most developing countries, including Burundi, citizens face many difficulties in accessing public services. One of the identified problems is the poor quality of service in managing citizens' complaints. This study proposes an SMS- and web-based e-Government model as a solution. A case study of a complaint management system at the District of Gihosha has been used as a reference to demonstrate that an SMS- and web-based e-Government model can enhance access to public services. The objective of this study is the development of an SMS- and web-based system that can enhance the process and the management of citizens' complaints at the District of Gihosha. The system has been developed using PHP as the front end, Apache as the web server, MySQL as the database, and Gammu as the SMS gateway. The results obtained after testing the system show that all the functionalities of the developed system worked properly. Thus, the SMS- and web-based complaint management system developed is considered to be effective.

  8. EFFECTS OF ECONOMIC BEHAVIOUR AND PEOPLE MIGRATION ON THE EPIDEMIOLOGY OF MALARIA: A MODEL BASED STUDY

    Directory of Open Access Journals (Sweden)

    Sajal Bhattacharya

    2006-11-01

    Full Text Available The objective of the paper is to study the socio-economic behaviour of migrant labourers in the context of the control of diseases like malaria. The paper therefore makes a model- and survey-based study in the city of Kolkata, India, to drive home the point that the low income of people, particularly of migrant workers, can be a major hurdle in the malaria control programme. The paper first looks at the economic behaviour pattern theoretically, through a neo-classical optimization exercise, and then tries to test the theoretical result empirically from a primary survey. The theoretical model gives the result that low-income people are likely to take less rest and discontinue medical treatment. Since migrant workers in less developed countries are usually low-income people, our model suggests that migrant workers will have incomplete treatment, and their migration even before complete recovery may contribute to the spread of the disease. We have empirically tested the model econometrically with a logit model, and derived the result that migrant workers do take less rest and discontinue treatment because of economic compulsion. Thus the data support the result of the theoretical model and reveal a behaviour pattern conducive to the spread of malaria infection. The paper derives some policy prescriptions on the basis of these studies, such as insurance support and health surveillance of the migrant population as part of an integrated malaria control programme.
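
    The econometric step can be sketched as a logistic regression of treatment discontinuation on income; the synthetic data below merely stand in for the Kolkata survey data, and the assumed relationship is the one the theoretical model predicts.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        income = rng.normal(100.0, 30.0, size=200)   # e.g., daily wage of a worker
        # assumed data-generating process: lower income -> higher odds of discontinuing
        p = 1.0 / (1.0 + np.exp(0.05 * (income - 100.0)))
        discontinued = rng.binomial(1, p)

        logit = LogisticRegression().fit(income.reshape(-1, 1), discontinued)
        print("income coefficient:", logit.coef_[0][0])  # expected to be negative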

  9. Consumer Adoption of Future MyData-Based Preventive eHealth Services: An Acceptance Model and Survey Study.

    Science.gov (United States)

    Koivumäki, Timo; Pekkarinen, Saara; Lappi, Minna; Väisänen, Jere; Juntunen, Jouni; Pikkarainen, Minna

    2017-12-22

    Constantly increasing health care costs have led countries and health care providers to the point where health care systems must be reinvented. Consequently, electronic health (eHealth) has recently received a great deal of attention in social sciences in the domain of Internet studies. However, only a fraction of these studies focuses on the acceptability of eHealth, making consumers' subjective evaluation an understudied field. This study will address this gap by focusing on the acceptance of MyData-based preventive eHealth services from the consumer point of view. We are adopting the term "MyData", which according to a White Paper of the Finnish Ministry of Transport and Communication refers to "1) a new approach, a paradigm shift in personal data management and processing that seeks to transform the current organization centric system to a human centric system, 2) to personal data as a resource that the individual can access and control." The aim of this study was to investigate what factors influence consumers' intentions to use a MyData-based preventive eHealth service before use. We applied a new adoption model combining Venkatesh's unified theory of acceptance and use of technology 2 (UTAUT2) in a consumer context and three constructs from health behavior theories, namely threat appraisals, self-efficacy, and perceived barriers. To test the research model, we applied structural equation modeling (SEM) with Mplus software, version 7.4. A Web-based survey was administered. We collected 855 responses. We first applied traditional SEM for the research model, which was not statistically significant. We then tested for possible heterogeneity in the data by running a mixture analysis. We found that heterogeneity was not the cause for the poor performance of the research model. Thus, we moved on to model-generating SEM and ended up with a statistically significant empirical model (root mean square error of approximation [RMSEA] 0.051, Tucker-Lewis index [TLI] 0

  10. A knowledge representation meta-model for rule-based modelling of signalling networks

    Directory of Open Access Journals (Sweden)

    Adrien Basso-Blandin

    2016-03-01

    Full Text Available The study of cellular signalling pathways and their deregulation in disease states, such as cancer, is a large and extremely complex task. Indeed, these systems involve many parts and processes but are studied piecewise and their literatures and data are consequently fragmented, distributed and sometimes—at least apparently—inconsistent. This makes it extremely difficult to build significant explanatory models with the result that effects in these systems that are brought about by many interacting factors are poorly understood. The rule-based approach to modelling has shown some promise for the representation of the highly combinatorial systems typically found in signalling where many of the proteins are composed of multiple binding domains, capable of simultaneous interactions, and/or peptide motifs controlled by post-translational modifications. However, the rule-based approach requires highly detailed information about the precise conditions for each and every interaction which is rarely available from any one single source. Rather, these conditions must be painstakingly inferred and curated, by hand, from information contained in many papers—each of which contains only part of the story. In this paper, we introduce a graph-based meta-model, attuned to the representation of cellular signalling networks, which aims to ease this massive cognitive burden on the rule-based curation process. This meta-model is a generalization of that used by Kappa and BNGL which allows for the flexible representation of knowledge at various levels of granularity. In particular, it allows us to deal with information which has either too little, or too much, detail with respect to the strict rule-based meta-model. Our approach provides a basis for the gradual aggregation of fragmented biological knowledge extracted from the literature into an instance of the meta-model from which we can define an automated translation into executable Kappa programs.

  11. Issues in practical model-based diagnosis

    NARCIS (Netherlands)

    Bakker, R.R.; van den Bempt, P.C.A.; Mars, Nicolaas; Out, D.J.; van Soest, D.C.

    1993-01-01

    The model-based diagnosis project at the University of Twente has been directed at improving the practical usefulness of model-based diagnosis. In cooperation with industrial partners, the research addressed the modeling problem and the efficiency problem in model-based reasoning. Main results of

  12. Modeling and Experimental Study of Soft Error Propagation Based on Cellular Automaton

    OpenAIRE

    He, Wei; Wang, Yueke; Xing, Kefei; Yang, Jianwei

    2016-01-01

    Aiming to estimate SEE soft error performance of complex electronic systems, a soft error propagation model based on cellular automaton is proposed and an estimation methodology based on circuit partitioning and error propagation is presented. Simulations indicate that different fault grade jamming and different coupling factors between cells are the main parameters influencing the vulnerability of the system. Accelerated radiation experiments have been developed to determine the main paramet...

  13. Agent-based modeling and network dynamics

    CERN Document Server

    Namatame, Akira

    2016-01-01

    The book integrates agent-based modeling and network science. It is divided into three parts, namely, foundations, primary dynamics on and of social networks, and applications. The book begins with the network origin of agent-based models, known as cellular automata, and introduces a number of classic models, such as Schelling's segregation model and Axelrod's spatial game. The essence of the foundation part is the network-based agent-based models in which agents follow network-based decision rules. Under the influence of the substantial progress in network science in the late 1990s, these models have been extended from lattices to small-world networks, scale-free networks, etc. The book also shows that modern network science, mainly driven by game theorists and sociophysicists, has inspired agent-based social scientists to develop alternative formation algorithms, known as agent-based social networks. The book reviews a number of pioneering and representative models in this family. Upon the gi...
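
    As a taste of the classic models the book introduces, here is a compact Schelling segregation sketch on a lattice; grid size, densities and the similarity threshold are illustrative choices.

        import numpy as np

        rng = np.random.default_rng(0)
        grid = rng.choice([0, 1, 2], p=[0.1, 0.45, 0.45], size=(30, 30))  # 0 = empty
        SIMILARITY = 0.5   # an agent is content if >= 50 % of its neighbours match

        def unhappy(grid, i, j):
            me = grid[i, j]
            nb = grid[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2].ravel()
            nb = nb[nb != 0]                        # occupied cells, incl. the agent
            same = np.count_nonzero(nb == me) - 1   # exclude the agent itself
            return len(nb) > 1 and same / (len(nb) - 1) < SIMILARITY

        for _ in range(50_000):                     # random sequential updates
            i, j = rng.integers(30, size=2)
            if grid[i, j] != 0 and unhappy(grid, i, j):
                empties = np.argwhere(grid == 0)
                ei, ej = empties[rng.integers(len(empties))]
                grid[ei, ej], grid[i, j] = grid[i, j], 0   # relocate to an empty cell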

  14. Solution processed deposition of electron transport layers on perovskite crystal surface—A modeling based study

    Energy Technology Data Exchange (ETDEWEB)

    Mortuza, S.M.; Taufique, M.F.N.; Banerjee, Soumik, E-mail: soumik.banerjee@wsu.edu

    2017-02-01

    Highlights:
    • The model determined the surface coverage of solution-processed film on perovskite.
    • Calculated surface density map provides insight into morphology of the monolayer.
    • Carbonyl oxygen atom of PCBM strongly attaches to the (110) surface of perovskite.
    • Uniform distribution of clusters on perovskite surface at lower PCBM concentration.
    • Deposition rate of PCBM on the surface is very high at initial stage of film growth.
    Abstract: The power conversion efficiency (PCE) of planar perovskite solar cells (PSCs) has reached up to ∼20%. However, structural and chemical defects that lead to hysteresis in perovskite-based thin films pose challenges. Recent work has shown that thin films of [6,6]-phenyl-C61-butyric acid methyl ester (PCBM) deposited on the photo absorption layer using solution processing techniques minimize surface pinholes and defects, thereby increasing the PCE. We developed and employed a multiscale model based on molecular dynamics (MD) and kinetic Monte Carlo (kMC) to establish a relationship between deposition rate and surface coverage on the perovskite surface. The MD simulations of PCBMs dispersed in chlorobenzene, sandwiched between (110) perovskite substrates, indicate that PCBMs are deposited through anchoring of the oxygen atom of the carbonyl group to the exposed lead (Pb) atoms of the (110) perovskite surface. Based on rates of distinct deposition events calculated from MD, kMC simulations were run to determine surface coverage at much larger time and length scales than accessible by MD alone. Based on the model, a generic relationship is established between the deposition rate of PCBMs and surface coverage on the perovskite crystal. The study also provides detailed insights into the morphology of the deposited film.
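
    For a single event type, the kMC stage of such a multiscale scheme reduces to a few lines; the lattice size and deposition rate below are assumptions, standing in for the event rates extracted from the MD stage.

        import numpy as np

        rng = np.random.default_rng(0)
        n_sites = 10_000                 # coarse-grained lattice of (110) surface sites
        k_dep = 5.0                      # assumed deposition rate per empty site (1/s)
        occupied, t, coverage = 0, 0.0, []

        while occupied < n_sites:
            total_rate = k_dep * (n_sites - occupied)   # only empty sites can fill
            t += -np.log(rng.random()) / total_rate     # exponential waiting time
            occupied += 1                               # execute one deposition event
            coverage.append((t, occupied / n_sites))

        # coverage follows 1 - exp(-k_dep * t): deposition is fastest at the initial
        # stage of film growth, consistent with the behaviour noted above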

  15. Solution processed deposition of electron transport layers on perovskite crystal surface—A modeling based study

    International Nuclear Information System (INIS)

    Mortuza, S.M.; Taufique, M.F.N.; Banerjee, Soumik

    2017-01-01

    Highlights:
    • The model determined the surface coverage of solution-processed film on perovskite.
    • Calculated surface density map provides insight into morphology of the monolayer.
    • Carbonyl oxygen atom of PCBM strongly attaches to the (110) surface of perovskite.
    • Uniform distribution of clusters on perovskite surface at lower PCBM concentration.
    • Deposition rate of PCBM on the surface is very high at initial stage of film growth.
    Abstract: The power conversion efficiency (PCE) of planar perovskite solar cells (PSCs) has reached up to ∼20%. However, structural and chemical defects that lead to hysteresis in perovskite-based thin films pose challenges. Recent work has shown that thin films of [6,6]-phenyl-C61-butyric acid methyl ester (PCBM) deposited on the photo absorption layer using solution processing techniques minimize surface pinholes and defects, thereby increasing the PCE. We developed and employed a multiscale model based on molecular dynamics (MD) and kinetic Monte Carlo (kMC) to establish a relationship between deposition rate and surface coverage on the perovskite surface. The MD simulations of PCBMs dispersed in chlorobenzene, sandwiched between (110) perovskite substrates, indicate that PCBMs are deposited through anchoring of the oxygen atom of the carbonyl group to the exposed lead (Pb) atoms of the (110) perovskite surface. Based on rates of distinct deposition events calculated from MD, kMC simulations were run to determine surface coverage at much larger time and length scales than accessible by MD alone. Based on the model, a generic relationship is established between the deposition rate of PCBMs and surface coverage on the perovskite crystal. The study also provides detailed insights into the morphology of the deposited film.

  16. Three-dimensional model of plate geometry and velocity model for Nankai Trough seismogenic zone based on results from structural studies

    Science.gov (United States)

    Nakanishi, A.; Shimomura, N.; Kodaira, S.; Obana, K.; Takahashi, T.; Yamamoto, Y.; Yamashita, M.; Takahashi, N.; Kaneda, Y.

    2012-12-01

    In the Nankai Trough subduction seismogenic zone, the Nankai and Tonankai earthquakes have often occurred simultaneously, causing great events. In order to reduce the damage to coastal areas from both strong ground motion and tsunami generation, it is necessary to understand the rupture synchronization and segmentation of the Nankai megathrust earthquake. For a precise estimate of the rupture zone of the Nankai megathrust event, based on knowledge of the realistic earthquake cycle and variation of magnitude, it is important to know the geometry and properties of the plate boundary of the subduction seismogenic zone. To improve the physical model of the Nankai Trough seismogenic zone, a large-scale, high-resolution wide-angle and reflection (MCS) seismic study and long-term observation have been conducted since 2008. Marine active-source seismic data have been acquired every year along gridded two-dimensional profiles with a total length of ~800 km. Three-dimensional seismic tomography using active- and passive-source data recorded at both land and ocean-bottom stations has also been performed. From those data, we found several strong lateral variations of the subducting Philippine Sea plate and the overriding plate corresponding to the margins of the coseismic rupture zones of historical large events along the Nankai Trough. In particular, a possible prominent reflector for the forearc Moho has recently been imaged on the offshore side of the Kii channel at a depth of ~18 km, shallower than in other areas along the Nankai Trough. Such a drastic variation of the overriding plate might be related to the segmentation of the Nankai megathrust earthquake. Based on the results of our seismic studies, we have constructed a geometrical model of the Philippine Sea plate and a three-dimensional velocity structure model of the Nankai Trough seismogenic zone. In this presentation, we will summarize the major results of our seismic studies, and

  17. Modelling regime shifts in the southern Benguela: a frame-based ...

    African Journals Online (AJOL)

    Modelling regime shifts in the southern Benguela: a frame-based approach. MD Smith, A Jarre. Abstract. This study explores the usefulness of a frame-based modelling approach in the southern Benguela upwelling ecosystem, with four frames describing observed small pelagic fish dominance patterns. We modelled the ...

  18. Multiscale agent-based cancer modeling.

    Science.gov (United States)

    Zhang, Le; Wang, Zhihui; Sagotsky, Jonathan A; Deisboeck, Thomas S

    2009-04-01

    Agent-based modeling (ABM) is an in silico technique that is being used in a variety of research areas, such as the social sciences and economics, and increasingly in biomedicine, as an interdisciplinary tool to study the dynamics of complex systems. Here, we describe its applicability to integrative tumor biology research by introducing a multi-scale tumor modeling platform that treats brain cancer as a complex dynamic biosystem. We summarize significant findings of this work, and discuss both challenges and future directions for ABM in the field of cancer research.

  19. An Experimental Study on Mechanical Modeling of Ceramics Based on Microstructure

    Directory of Open Access Journals (Sweden)

    Ya-Nan Zhang

    2015-11-01

    Full Text Available The actual grinding behavior of ceramics has not been well predicted by present mechanical models: they make no allowance for the direct effects of material microstructure, and almost all of them were derived from crystalline ceramics. In order to improve the mechanical models of ceramics, surface grinding experiments on crystalline and non-crystalline ceramics were conducted in this research. The normal and tangential grinding forces were measured to calculate the single-grit force and the specific grinding energy, and the ground surfaces were observed. For the crystalline alumina ceramic, the predictive model of normal force per grit fits the experimental results well when the maximum undeformed chip thickness is less than a critical depth, which turns out to be close to the grain size of alumina. Meanwhile, there is a negative correlation between the specific grinding energy and the maximum undeformed chip thickness. With decreasing maximum undeformed chip thickness, the proportions of ductile removal and transgranular fracture increase. However, the grinding force models are not applicable to the non-crystalline ceramic fused silica, for which the experiments show the specific grinding energy fluctuating irregularly as a function of the maximum undeformed chip thickness.

  20. Temperature based daily incoming solar radiation modeling based on gene expression programming, neuro-fuzzy and neural network computing techniques.

    Science.gov (United States)

    Landeras, G.; López, J. J.; Kisi, O.; Shiri, J.

    2012-04-01

    The correct observation/estimation of surface incoming solar radiation (RS) is very important for many agricultural, meteorological and hydrological applications. While most weather stations are provided with sensors for air temperature, sensors for solar radiation are much less common, and the data they provide are sometimes of poor quality. In these cases it is necessary to estimate this variable. Temperature-based modeling procedures are reported in this study for estimating daily incoming solar radiation, using Gene Expression Programming (GEP) for the first time, alongside other artificial intelligence models such as Artificial Neural Networks (ANNs) and the Adaptive Neuro-Fuzzy Inference System (ANFIS). Traditional temperature-based solar radiation equations were also included in this study and compared with the artificial intelligence approaches. Root mean square error (RMSE), mean absolute error (MAE), RMSE-based skill score (SSRMSE), MAE-based skill score (SSMAE) and the r2 criterion of Nash and Sutcliffe were used to assess the models' performances. An ANN (a four-input multilayer perceptron with ten neurons in the hidden layer) presented the best performance among the studied models (2.93 MJ m-2 d-1 of RMSE). A four-input ANFIS model proved an interesting alternative to ANNs (3.14 MJ m-2 d-1 of RMSE). Only a very limited number of studies have estimated solar radiation with ANFIS, and the present one demonstrates the ability of ANFIS to model solar radiation based on temperatures and extraterrestrial radiation. This study also demonstrates, for the first time, the ability of GEP models to model solar radiation based on daily atmospheric variables. Although the accuracy of the GEP models was slightly lower than that of the ANFIS and ANN models, genetic programming models (i.e., GEP) are superior to other artificial intelligence models in giving a simple explicit equation for the
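
    As an illustration of the kind of temperature-based ANN described above, the following scikit-learn sketch trains a four-input multilayer perceptron with ten hidden neurons, matching the architecture reported in the abstract. The inputs (Tmax, Tmin, extraterrestrial radiation Ra, day of year) and the Hargreaves-type synthetic signal are assumptions for demonstration, not the study's dataset.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import mean_squared_error

# Synthetic stand-in data: columns are hypothetical inputs (Tmax, Tmin,
# extraterrestrial radiation Ra, day of year); target is daily RS (MJ m-2 d-1).
rng = np.random.default_rng(0)
X = rng.uniform([0, -10, 10, 1], [40, 25, 45, 365], size=(1000, 4))
rs = 0.16 * X[:, 2] * np.sqrt(np.maximum(X[:, 0] - X[:, 1], 0))  # Hargreaves-type signal
y = rs + rng.normal(0, 1.0, size=1000)

# Four-input multilayer perceptron with ten neurons in one hidden layer.
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000,
                                   random_state=0))
model.fit(X[:800], y[:800])
pred = model.predict(X[800:])
rmse = mean_squared_error(y[800:], pred) ** 0.5
print(f"RMSE on held-out data: {rmse:.2f} MJ m-2 d-1")
```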

  1. The Design of Model-Based Training Programs

    Science.gov (United States)

    Polson, Peter; Sherry, Lance; Feary, Michael; Palmer, Everett; Alkin, Marty; McCrobie, Dan; Kelley, Jerry; Rosekind, Mark (Technical Monitor)

    1997-01-01

    This paper proposes a model-based training program for the skills necessary to operate advanced avionics systems that incorporate advanced autopilots and flight management systems. The training model is based on a formalism, the operational procedure model, that represents the mission model, the rules, and the functions of a modern avionics system. This formalism has been defined such that it can be understood and shared by pilots, the avionics software, and design engineers. Each element of the software is defined in terms of its intent (What?), the rationale (Why?), and the resulting behavior (How?). The Advanced Computer Tutoring project at Carnegie Mellon University has developed a type of model-based, computer-aided instructional technology called cognitive tutors. They summarize numerous studies showing that training to a specified level of competence can be achieved in one third the time of conventional classroom instruction. We are developing a similar model-based training program for the skills necessary to operate the avionics. The model underlying the instructional program, which simulates the effects of pilot entries and the behavior of the avionics, is based on the operational procedure model. Pilots are given a series of vertical flightpath management problems. Entries that result in violations, such as failure to make a crossing restriction or violating speed limits, result in error messages with instruction. At any time, the flightcrew can request suggestions on the appropriate set of actions. A similar and successful training program for basic skills for the FMS on the Boeing 737-300 was developed and evaluated. The results strongly support the claim that the training methodology can be adapted to the cockpit.

  2. Principles of models based engineering

    Energy Technology Data Exchange (ETDEWEB)

    Dolin, R.M.; Hefele, J.

    1996-11-01

    This report describes a Models Based Engineering (MBE) philosophy and implementation strategy that has been developed at Los Alamos National Laboratory's Center for Advanced Engineering Technology. A major theme in this discussion is that models based engineering is an information management technology enabling the development of information driven engineering. Unlike other information management technologies, models based engineering encompasses the breadth of engineering information, from design intent through product definition to consumer application.

  3. Constructing rule-based models using the belief functions framework

    NARCIS (Netherlands)

    Almeida, R.J.; Denoeux, T.; Kaymak, U.; Greco, S.; Bouchon-Meunier, B.; Coletti, G.; Fedrizzi, M.; Matarazzo, B.; Yager, R.R.

    2012-01-01

    We study a new approach to regression analysis. We propose a new rule-based regression model using the theoretical framework of belief functions. For this purpose we use the recently proposed Evidential c-means (ECM) to derive rule-based models solely from data. ECM allocates, for each

  4. Comparisons of complex network based models and real train flow model to analyze Chinese railway vulnerability

    International Nuclear Information System (INIS)

    Ouyang, Min; Zhao, Lijing; Hong, Liu; Pan, Zhezhe

    2014-01-01

    Recently, numerous studies have applied complex network based models to study the performance and vulnerability of infrastructure systems under various types of attacks and hazards. But how effectively these models capture real performance responses is still a question worthy of research. Taking the Chinese railway system as an example, this paper selects three typical complex network based models, including the purely topological model (PTM), the purely shortest path model (PSPM), and the weight (link length) based shortest path model (WBSPM), to analyze railway accessibility and flow-based vulnerability and compare their results with those from the real train flow model (RTFM). The results show that the WBSPM can produce train routes with 83% of stations and 77% of railway links identical to the real routes and approximates the RTFM best for railway vulnerability under both single and multiple component failures. The correlation coefficient for accessibility vulnerability from the WBSPM and RTFM under single station failures is 0.96, while it is 0.92 for flow-based vulnerability; under multiple station failures, where each station has the same failure probability fp, the WBSPM produces almost identical vulnerability results to those from the RTFM under almost all failure scenarios when fp is larger than 0.62 for accessibility vulnerability and 0.86 for flow-based vulnerability
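
    A minimal sketch of the weight-based shortest-path style of analysis, using networkx on a toy five-link network. The topology, edge lengths, and the use of network efficiency (mean inverse shortest-path length) as the accessibility measure are illustrative assumptions, not the paper's Chinese railway data.

```python
import networkx as nx

# Toy rail network with hypothetical link lengths (km).
edges = [("A", "B", 120), ("B", "C", 80), ("A", "C", 300),
         ("C", "D", 150), ("B", "D", 260)]
G = nx.Graph()
G.add_weighted_edges_from(edges, weight="length")

def efficiency(g):
    """Mean inverse shortest-path length over all ordered station pairs."""
    n = g.number_of_nodes()
    if n < 2:
        return 0.0
    total = 0.0
    for src, dists in nx.all_pairs_dijkstra_path_length(g, weight="length"):
        total += sum(1.0 / d for dst, d in dists.items() if dst != src)
    return total / (n * (n - 1))

base = efficiency(G)
for station in list(G.nodes):
    H = G.copy()
    H.remove_node(station)                    # single station failure
    drop = 1.0 - efficiency(H) / base         # accessibility vulnerability proxy
    print(f"removing {station}: efficiency drop = {drop:.2%}")
```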

  5. Supporting 3rd-grade students' model-based explanations about groundwater: a quasi-experimental study of a curricular intervention

    Science.gov (United States)

    Zangori, Laura; Vo, Tina; Forbes, Cory T.; Schwarz, Christina V.

    2017-07-01

    Scientific modelling is a key practice in which K-12 students should engage to begin developing robust conceptual understanding of natural systems, including water. However, little past research has explored primary students' learning about groundwater, engagement in scientific modelling, and/or the ways in which teachers conceptualise and cultivate model-based science learning environments. We are engaged in a multi-year project designed to support 3rd-grade students' formulation of model-based explanations (MBE) for hydrologic phenomena, including groundwater, through curricular and instructional support. In this quasi-experimental comparative study of five 3rd-grade classrooms, we present findings from analysis of students' MBE generated as part of experiencing a baseline curricular intervention (Year 1) and a modelling-enhanced curricular intervention (Year 2). Findings show that students experiencing the latter version of the unit made significant gains in both conceptual understanding and reasoning about groundwater, but that these gains varied by classroom. Overall, student gains from Year 1 to Year 2 were attributed to changes in two of the five classrooms in which students were provided additional instructional supports and scaffolds to enhance their MBE for groundwater. Within these two classrooms, the teachers enacted the Year 2 curriculum in unique ways that reflected their deeper understanding about the practices of modelling. Their enactments played a critical role in supporting students' MBE about groundwater. Study findings contribute to research on scientific modelling in elementary science learning environments and have important implications for teachers and curriculum developers.

  6. Model-based Software Engineering

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2010-01-01

    The vision of model-based software engineering is to make models the main focus of software development and to automatically generate software from these models. Part of that vision already works today. But there are still difficulties when it comes to behaviour. Actually, there is no lack of models...

  7. A Numerical Study of Water Loss Rate Distributions in MDCT-based Human Airway Models

    Science.gov (United States)

    Wu, Dan; Miyawaki, Shinjiro; Tawhai, Merryn H.; Hoffman, Eric A.; Lin, Ching-Long

    2015-01-01

    Both three-dimensional (3D) and one-dimensional (1D) computational fluid dynamics (CFD) methods are applied to study regional water loss in three multi-detector row computed-tomography (MDCT)-based human airway models at minute ventilations of 6, 15 and 30 L/min. The overall water losses predicted by both 3D and 1D models in the entire respiratory tract agree with available experimental measurements. However, 3D and 1D models reveal different regional water loss rate distributions due to the 3D secondary flows formed at bifurcations. The secondary flows cause locally skewed temperature and humidity distributions on inspiration, acting to elevate the local water loss rate, and the secondary flow at the carina tends to distribute more cold air to the lower lobes. As a result, the 3D model predicts that the water loss rate first increases with increasing airway generation, and then decreases as the air approaches saturation, while the 1D model predicts a monotonic decrease of water loss rate with increasing airway generation. Moreover, the 3D (or 1D) model predicts relatively higher water loss rates in lower (or upper) lobes. The regional water loss rate can be related to the non-dimensional wall shear stress (τ*) through the non-dimensional mass transfer coefficient (h0*) as h0* = 1.15 τ*^0.272 (R = 0.842). PMID:25869455
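
    The reported correlation can be checked numerically; the sketch below evaluates h0* = 1.15 τ*^0.272 over a hypothetical τ* range and recovers the exponent by a log-log least-squares fit to noisy synthetic samples.

```python
import numpy as np

# Evaluate the reported power law h0* = 1.15 * tau*^0.272 and recover its
# exponent from synthetic (tau*, h0*) samples via a log-log linear fit.
tau_star = np.logspace(-2, 1, 50)          # hypothetical wall shear stress range
h0_star = 1.15 * tau_star ** 0.272
noisy = h0_star * np.exp(np.random.default_rng(1).normal(0, 0.05, 50))

slope, intercept = np.polyfit(np.log(tau_star), np.log(noisy), 1)
print(f"fitted exponent = {slope:.3f}, prefactor = {np.exp(intercept):.3f}")
```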

  8. Understanding Elementary Astronomy by Making Drawing-Based Models

    NARCIS (Netherlands)

    van Joolingen, Wouter; Aukes, A.V.A.; Gijlers, Aaltje H.; Bollen, Lars

    2015-01-01

    Modeling is an important approach in the teaching and learning of science. In this study, we attempt to bring modeling within the reach of young children by creating the SimSketch modeling system, which is based on freehand drawings that can be turned into simulations. This system was used by 247

  9. Modeling Dynamic Systems with Efficient Ensembles of Process-Based Models.

    Directory of Open Access Journals (Sweden)

    Nikola Simidjievski

    Full Text Available Ensembles are a well-established machine learning paradigm, leading to accurate and robust models, predominantly applied to predictive modeling tasks. Ensemble models comprise a finite set of diverse predictive models whose combined output is expected to yield an improved predictive performance as compared to an individual model. In this paper, we propose a new method for learning ensembles of process-based models of dynamic systems. The process-based modeling paradigm employs domain-specific knowledge to automatically learn models of dynamic systems from time-series observational data. Previous work has shown that ensembles based on sampling observational data (i.e., bagging and boosting) significantly improve predictive performance of process-based models. However, this improvement comes at the cost of a substantial increase of the computational time needed for learning. To address this problem, the paper proposes a method that aims at efficiently learning ensembles of process-based models, while maintaining their accurate long-term predictive performance. This is achieved by constructing ensembles by sampling domain-specific knowledge instead of sampling data. We apply the proposed method to and evaluate its performance on a set of problems of automated predictive modeling in three lake ecosystems using a library of process-based knowledge for modeling population dynamics. The experimental results identify the optimal design decisions regarding the learning algorithm. The results also show that the proposed ensembles yield significantly more accurate predictions of population dynamics as compared to individual process-based models. Finally, while their predictive performance is comparable to the one of ensembles obtained with the state-of-the-art methods of bagging and boosting, they are substantially more efficient.
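
    For the data-sampling baseline the paper improves upon, a generic bagging sketch is shown below; a decision-tree regressor stands in for an individual process-based model, and a noisy sinusoid stands in for population-dynamics data, both assumptions made purely for illustration.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Generic bagging sketch: train many models on bootstrap resamples of the
# data and average their predictions (the baseline the paper contrasts with
# sampling domain-specific knowledge instead of data).
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 200)
y = np.sin(t) + rng.normal(0, 0.2, t.size)   # noisy "population dynamics"
X = t.reshape(-1, 1)

preds = []
for _ in range(25):                          # 25 bootstrap replicates
    idx = rng.integers(0, len(t), len(t))
    m = DecisionTreeRegressor(max_depth=4).fit(X[idx], y[idx])
    preds.append(m.predict(X))

ensemble = np.mean(preds, axis=0)            # combined ensemble output
print("ensemble RMSE:", np.sqrt(np.mean((ensemble - np.sin(t)) ** 2)))
```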

  10. Model-based internal wave processing

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J.V.; Chambers, D.H.

    1995-06-09

    A model-based approach is proposed to solve the oceanic internal wave signal processing problem that is based on state-space representations of the normal-mode vertical velocity and plane wave horizontal velocity propagation models. It is shown that these representations can be utilized to spatially propagate the modal (depth) vertical velocity functions given the basic parameters (wave numbers, Brunt-Vaisala frequency profile, etc.) developed from the solution of the associated boundary value problem as well as the horizontal velocity components. Based on this framework, investigations are made of model-based solutions to the signal enhancement problem for internal waves.

  11. A study of gradient strengthening based on a finite-deformation gradient crystal-plasticity model

    Science.gov (United States)

    Pouriayevali, Habib; Xu, Bai-Xiang

    2017-11-01

    A comprehensive study of a finite-deformation gradient crystal-plasticity model derived based on Gurtin's framework (Int J Plast 24:702-725, 2008) is carried out here. This systematic investigation of the different roles of the governing components of the model demonstrates the strength of this framework in predicting a wide range of hardening behaviors as well as rate-dependent and scale-variation responses in a single crystal. The model is represented in the reference configuration for the purpose of numerical implementation and then implemented in the FEM software ABAQUS via a user-defined subroutine (UEL). Furthermore, a function of the accumulation rates of dislocations is employed and viewed as a measure of the formation of short-range interactions. Our simulation results reveal that the dissipative gradient strengthening can be identified as a source of isotropic-hardening behavior, which may represent the effect of the irrecoverable work introduced by Gurtin and Ohno (J Mech Phys Solids 59:320-343, 2011). The variation of size dependency at different magnitudes of the rate-sensitivity parameter is also discussed. Moreover, the effect of a distinctive feature of the model, which accounts for the distortion of the crystal lattice in the reference configuration, is reported in this study for the first time. In addition, plastic flow on predefined slip systems and the expansion of the accumulation of GNDs are distinctly observed at varying scales and under different loading conditions.

  12. Whole body acid-base modeling revisited.

    Science.gov (United States)

    Ring, Troels; Nielsen, Søren

    2017-04-01

    The textbook account of whole body acid-base balance in terms of endogenous acid production, renal net acid excretion, and gastrointestinal alkali absorption, which is the only comprehensive model around, has never been applied in clinical practice or been formally validated. To improve understanding of acid-base modeling, we managed to write up this conventional model as an expression solely in terms of urine chemistry. Renal net acid excretion and endogenous acid production were already formulated in terms of urine chemistry, and from the literature gastrointestinal alkali absorption can also be seen in terms of urine excretions. With a few assumptions it was possible to show that this expression of net acid balance was arithmetically identical to minus urine charge, whereby under the development of acidosis, urine was predicted to acquire a net negative charge. The literature already mentions unexplained negative urine charges, so we scrutinized a series of seminal papers and confirmed empirically the theoretical prediction that observed urine charge did acquire a negative charge as acidosis developed. Hence, we conclude that the conventional model is problematic, since it predicts what is physiologically impossible. Therefore, we need a new model for whole body acid-base balance, one without impossible implications. Furthermore, new experimental studies are needed to account for the charge imbalance in urine under the development of acidosis. Copyright © 2017 the American Physiological Society.

  13. Mathematical Modeling of Column-Base Connections under Monotonic Loading

    Directory of Open Access Journals (Sweden)

    Gholamreza Abdollahzadeh

    2014-12-01

    Full Text Available Considerable damage to steel structures occurred during the Hyogo-ken Nanbu Earthquake. Among it, many exposed-type column bases failed in several consistent patterns, such as brittle base plate fracture, excessive bolt elongation, unexpected early bolt failure, and inferior construction work. The lessons from these failures led to the need for an improved understanding of column base behavior. Joint behavior must be modeled when analyzing semi-rigid frames, which requires a mathematical model of the moment–rotation curve; the most accurate models use continuous nonlinear functions. This article covers three areas of steel joint research: (1) analysis methods for semi-rigid joints; (2) prediction methods for the mechanical behavior of joints; (3) mathematical representations of the moment–rotation curve. In the current study, a new exponential model to depict the moment–rotation relationship of column base connections is proposed. The proposed nonlinear model represents an approach to the prediction of M–θ curves, taking into account the possible failure modes and the deformation characteristics of the connection elements. The new model has three physical parameters, along with two curve-fitted factors. These physical parameters are generated from the dimensional details of the connection, as well as the material properties. The M–θ curves obtained by the model are compared with published connection tests and 3D FEM research. The proposed mathematical model comes adequately close to characterizing M–θ behavior through the full range of loading/rotations. As a result, modeling of column base connections using the proposed mathematical model can give crucial information beforehand, and avoid the time and cost of experimental studies.
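
    The abstract does not state the exponential form, so the sketch below fits a generic three-parameter exponential moment-rotation model (ultimate moment Mu, initial stiffness ki, hardening stiffness kp) to hypothetical test data with scipy; treat both the functional form and the numbers as assumptions, not the paper's model.

```python
import numpy as np
from scipy.optimize import curve_fit

# Generic exponential moment-rotation model for semi-rigid connections:
# M(theta) = Mu * (1 - exp(-ki * theta / Mu)) + kp * theta.
# Mu: ultimate moment, ki: initial stiffness, kp: strain-hardening stiffness.
def m_theta(theta, mu, ki, kp):
    return mu * (1.0 - np.exp(-ki * theta / mu)) + kp * theta

# Hypothetical test data (rotation in rad, moment in kN*m).
theta = np.linspace(0, 0.05, 20)
m_obs = m_theta(theta, 180.0, 25000.0, 400.0)
m_obs += np.random.default_rng(2).normal(0, 2.0, theta.size)

popt, _ = curve_fit(m_theta, theta, m_obs, p0=(150.0, 20000.0, 300.0))
print("fitted Mu, ki, kp:", popt)
```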

  14. A DSM-based framework for integrated function modelling

    DEFF Research Database (Denmark)

    Eisenbart, Boris; Gericke, Kilian; Blessing, Lucienne T. M.

    2017-01-01

    an integrated function modelling framework, which specifically aims at relating the different function modelling perspectives prominently addressed in different disciplines. It uses interlinked matrices based on the concept of DSM and MDM in order to facilitate cross-disciplinary modelling and analysis of the functionality of a system. The article further presents the application of the framework based on a product example. Finally, an empirical study in industry is presented. Therein, feedback on the potential of the proposed framework to support interdisciplinary design practice as well as on areas of further...

  15. Agent Based Modeling as an Educational Tool

    Science.gov (United States)

    Fuller, J. H.; Johnson, R.; Castillo, V.

    2012-12-01

    Motivation is a key element in high school education. One way to improve motivation and provide content, while helping address critical thinking and problem solving skills, is to have students build and study agent based models in the classroom. This activity visually connects concepts with their applied mathematical representation. "Engaging students in constructing models may provide a bridge between frequently disconnected conceptual and mathematical forms of knowledge." (Levy and Wilensky, 2011) We wanted to discover the feasibility of implementing a model based curriculum in the classroom given current and anticipated core and content standards. [Figure captions: simulation using California GIS data; simulation of high school student lunch popularity using an aerial photograph on top of a terrain value map.]

  16. Taxi trips distribution modeling based on Entropy-Maximizing theory: A case study in Harbin city-China

    Science.gov (United States)

    Tang, Jinjun; Zhang, Shen; Chen, Xinqiang; Liu, Fang; Zou, Yajie

    2018-03-01

    Understanding the origin-destination (OD) distribution of taxi trips is very important for improving the effects of transportation planning and enhancing the quality of taxi services. This study proposes a new method based on Entropy-Maximizing theory to model the OD distribution in Harbin city using large-scale taxi GPS trajectories. First, a K-means clustering method is utilized to partition raw pick-up and drop-off locations into different zones, and trips are assumed to start from and end at zone centers. A generalized cost function is further defined by considering travel distance, time and fee between each OD pair. GPS data collected from more than 1000 taxis at an interval of 30 s during one month are divided into two parts: data from the first twenty days are treated as the training dataset and the last ten days as the testing dataset. The training dataset is used to calibrate the model, while the testing dataset is used to validate it. Furthermore, three indicators, mean absolute error (MAE), root mean square error (RMSE) and mean percentage absolute error (MPAE), are applied to evaluate the training and testing performance of the Entropy-Maximizing model versus the Gravity model. The results demonstrate that the Entropy-Maximizing model is superior to the Gravity model. The findings of the study validate the feasibility of deriving OD distributions from taxi GPS data in urban systems.
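
    A minimal sketch of the classic entropy-maximizing (Wilson) trip distribution that this family of methods builds on: T_ij = A_i O_i B_j D_j exp(-β c_ij), solved by iterative balancing. The three-zone productions, attractions, costs, and β below are hypothetical, not the Harbin data.

```python
import numpy as np

# Doubly-constrained entropy-maximizing trip distribution, solved by
# iteratively balancing the row (A) and column (B) factors.
O = np.array([400.0, 300.0, 300.0])        # trips produced per origin zone
D = np.array([350.0, 250.0, 400.0])        # trips attracted per destination
c = np.array([[2.0, 5.0, 8.0],
              [5.0, 2.0, 4.0],
              [8.0, 4.0, 2.0]])            # generalized cost between zones
beta = 0.3                                 # cost-deterrence parameter

f = np.exp(-beta * c)                      # deterrence matrix
A = np.ones(3)
for _ in range(100):                       # balance until constraints hold
    B = 1.0 / (f.T @ (A * O))
    A = 1.0 / (f @ (B * D))

T = (A * O)[:, None] * (B * D)[None, :] * f
print(np.round(T, 1))
print("row sums:", np.round(T.sum(axis=1), 1))   # should match O
print("col sums:", np.round(T.sum(axis=0), 1))   # should match D
```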

  17. Integration of Simulink Models with Component-based Software Models

    Directory of Open Access Journals (Sweden)

    MARIAN, N.

    2008-06-01

    Full Text Available Model based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model based design involves the use of multiple models that represent different views of a system, having different semantics of abstract system descriptions. Usually, in mechatronic systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, a plant and controller behavior is simulated using graphical blocks to represent mathematical and logical constructs and process flow; then software code is generated. A Simulink model is a representation of the design or implementation of a physical system that satisfies a set of requirements. A software component-based system aims to organize system architecture and behavior as a means of computation, communication and constraints, using computational blocks and aggregates for both discrete and continuous behavior, and different interconnection and execution disciplines for event-based and time-based controllers, in order to meet demands for more functionality at lower cost and under tighter constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems) is such a component-based system framework developed by the software engineering group of the Mads Clausen Institute for Product Innovation (MCI), University of Southern Denmark. Once specified, the software model has to be analyzed. One way of doing that is to wrap the model back into Simulink S-functions and use Simulink's extensive simulation features, thus allowing an early exploration of the possible design choices over multiple disciplines. The paper describes a safe translation of a restricted set of MATLAB/Simulink blocks to COMDES software components, both for continuous and discrete behavior, and the transformation of the software system into the S

  18. Embracing model-based designs for dose-finding trials.

    Science.gov (United States)

    Love, Sharon B; Brown, Sarah; Weir, Christopher J; Harbron, Chris; Yap, Christina; Gaschler-Markefski, Birgit; Matcham, James; Caffrey, Louise; McKevitt, Christopher; Clive, Sally; Craddock, Charlie; Spicer, James; Cornelius, Victoria

    2017-07-25

    Dose-finding trials are essential to drug development as they establish recommended doses for later-phase testing. We aim to motivate wider use of model-based designs for dose finding, such as the continual reassessment method (CRM). We carried out a literature review of dose-finding designs and conducted a survey to identify perceived barriers to their implementation. We describe the benefits of model-based designs (flexibility, superior operating characteristics, extended scope), their current uptake, and existing resources. The most prominent barriers to implementation of a model-based design were lack of suitable training, chief investigators' preference for algorithm-based designs (e.g., 3+3), and limited resources for study design before funding. We use a real-world example to illustrate how these barriers can be overcome. There is overwhelming evidence for the benefits of CRM. Many leading pharmaceutical companies routinely implement model-based designs. Our analysis identified barriers for academic statisticians and clinical academics in mirroring the progress industry has made in trial design. Unified support from funders, regulators, and journal editors could result in more accurate doses for later-phase testing, and increase the efficiency and success of clinical drug development. We give recommendations for increasing the uptake of model-based designs for dose-finding trials in academia.
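
    A minimal sketch of a one-parameter CRM of the kind the paper advocates, using the empiric ("power") dose-toxicity model and a grid-based posterior; the skeleton, prior variance, target, and accumulated outcomes are hypothetical choices for illustration only.

```python
import numpy as np

# One-parameter CRM: dose-toxicity model p_i = skeleton_i ** exp(a),
# a ~ N(0, var = 1.34); posterior computed by numerical integration.
skeleton = np.array([0.05, 0.12, 0.25, 0.40])   # hypothetical prior guesses
target = 0.25                                   # target toxicity probability
doses_given = [0, 0, 1, 1, 2]                   # dose indices assigned so far
tox = [0, 0, 0, 1, 0]                           # 1 = dose-limiting toxicity

a = np.linspace(-4, 4, 2001)
prior = np.exp(-a**2 / (2 * 1.34))              # unnormalized normal prior
like = np.ones_like(a)
for d, y in zip(doses_given, tox):
    p = skeleton[d] ** np.exp(a)
    like *= p**y * (1 - p)**(1 - y)             # Bernoulli likelihood

post = prior * like
post /= np.trapz(post, a)                       # normalize the posterior
p_hat = [np.trapz(skeleton[i] ** np.exp(a) * post, a) for i in range(4)]
next_dose = int(np.argmin(np.abs(np.array(p_hat) - target)))
print("posterior mean toxicity by dose:", np.round(p_hat, 3))
print("recommended next dose index:", next_dose)
```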

  19. A study of the spreading scheme for viral marketing based on a complex network model

    Science.gov (United States)

    Yang, Jianmei; Yao, Canzhong; Ma, Weicheng; Chen, Guanrong

    2010-02-01

    Buzzword-based viral marketing, also known as digital word-of-mouth marketing, is a marketing mode attached to carriers on the Internet, which can rapidly copy marketing information at low cost. Viral marketing actually uses a pre-existing social network, but the scale of that network is believed to be so large and so random that its theoretical analysis is intractable and unmanageable. There are very few reports in the literature on how to design a spreading scheme for viral marketing on real social networks according to traditional marketing theory or the relatively new network marketing theory. Complex network theory provides a new model for the study of large-scale complex systems, using the latest developments of graph theory and computing techniques. From this perspective, the present paper extends complex network theory and modeling into the research of general viral marketing and develops a specific spreading scheme for viral marketing, and an approach to designing the scheme, based on a real complex network on the QQ instant messaging system. This approach is shown to be rather universal and can be further extended to the design of various spreading schemes for viral marketing based on different instant messaging systems.
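
    A spreading scheme of this kind can be prototyped with an independent-cascade simulation; in the sketch below a scale-free graph stands in for the (unavailable) QQ network, and the seed set and forwarding probability are hypothetical.

```python
import random
import networkx as nx

# Independent-cascade spread on a synthetic scale-free graph: each newly
# informed user gets one chance to forward the message to each neighbor.
random.seed(3)
G = nx.barabasi_albert_graph(n=1000, m=3, seed=3)
p = 0.08                      # hypothetical chance a contact forwards the message
seeds = [0, 1, 2]             # hypothetical initial spreaders

active, frontier = set(seeds), list(seeds)
while frontier:
    nxt = []
    for u in frontier:
        for v in G.neighbors(u):
            if v not in active and random.random() < p:
                active.add(v)
                nxt.append(v)
    frontier = nxt

print(f"message reached {len(active)} of {G.number_of_nodes()} users")
```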

  20. Model-Based Data Integration and Process Standardization Techniques for Fault Management: A Feasibility Study

    Science.gov (United States)

    Haste, Deepak; Ghoshal, Sudipto; Johnson, Stephen B.; Moore, Craig

    2018-01-01

    This paper describes the theory and considerations in the application of model-based techniques to assimilate information from disjoint knowledge sources for performing NASA's Fault Management (FM)-related activities using the TEAMS® toolset. FM consists of the operational mitigation of existing and impending spacecraft failures. NASA's FM directives have both design-phase and operational-phase goals. This paper highlights recent studies by QSI and DST of the capabilities required in the TEAMS® toolset for conducting FM activities with the aim of reducing operating costs, increasing autonomy, and conforming to time schedules. These studies use and extend the analytic capabilities of QSI's TEAMS® toolset to conduct a range of FM activities within a centralized platform.

  1. Group-Based Active Learning of Classification Models.

    Science.gov (United States)

    Luo, Zhipeng; Hauskrecht, Milos

    2017-05-01

    Learning of classification models from real-world data often requires additional human expert effort to annotate the data. However, this process can be rather costly and finding ways of reducing the human annotation effort is critical for this task. The objective of this paper is to develop and study new ways of providing human feedback for efficient learning of classification models by labeling groups of examples. Briefly, unlike traditional active learning methods that seek feedback on individual examples, we develop a new group-based active learning framework that solicits label information on groups of multiple examples. In order to describe groups in a user-friendly way, conjunctive patterns are used to compactly represent groups. Our empirical study on 12 UCI data sets demonstrates the advantages and superiority of our approach over both classic instance-based active learning work, as well as existing group-based active-learning methods.

  2. Developing a Physiologically-Based Pharmacokinetic Model Knowledgebase in Support of Provisional Model Construction

    Science.gov (United States)

    Grulke, Christopher M.; Chang, Daniel T.; Brooks, Raina D.; Leonard, Jeremy A.; Phillips, Martin B.; Hypes, Ethan D.; Fair, Matthew J.; Tornero-Velez, Rogelio; Johnson, Jeffre; Dary, Curtis C.; Tan, Yu-Mei

    2016-01-01

    Developing physiologically-based pharmacokinetic (PBPK) models for chemicals can be resource-intensive, as neither chemical-specific parameters nor in vivo pharmacokinetic data are easily available for model construction. Previously developed, well-parameterized, and thoroughly vetted models can be a great resource for the construction of models pertaining to new chemicals. A PBPK knowledgebase was compiled and developed from existing PBPK-related articles and used to develop new models. From 2,039 PBPK-related articles published between 1977 and 2013, 307 unique chemicals were identified for use as the basis of our knowledgebase. Keywords related to species, gender, developmental stages, and organs were analyzed from the articles within the PBPK knowledgebase. A correlation matrix of the 307 chemicals in the PBPK knowledgebase was calculated based on pharmacokinetic-relevant molecular descriptors. Chemicals in the PBPK knowledgebase were ranked based on their correlation with ethylbenzene and gefitinib. Next, multiple chemicals were selected to represent exact matches, close analogues, or non-analogues of the target case study chemicals. Parameters, equations, or experimental data relevant to existing models for these chemicals and their analogues were used to construct new models, and model predictions were compared to observed values. This compiled knowledgebase provides a chemical structure-based approach for identifying PBPK models relevant to other chemical entities. Using suitable correlation metrics, we demonstrated that models of chemical analogues in the PBPK knowledgebase can guide the construction of PBPK models for other chemicals. PMID:26871706
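
    The chemical-analogue ranking step can be sketched as follows; the descriptor table, the target vector, and the use of standardized Euclidean distance as the similarity metric are illustrative assumptions, not the knowledgebase's curated descriptors or exact correlation measure.

```python
import numpy as np

# Rank knowledgebase chemicals by similarity of pharmacokinetic-relevant
# descriptors to a target chemical. All values are hypothetical stand-ins.
chemicals = ["chem_A", "chem_B", "chem_C", "chem_D"]
descriptors = np.array([[3.1, 106.2, 0.2],   # rows: chemicals
                        [2.9, 110.4, 0.3],   # cols: e.g. logP, MW, f_unbound
                        [0.5, 447.0, 0.9],
                        [3.3, 104.1, 0.1]])
target = np.array([3.15, 106.2, 0.2])        # an ethylbenzene-like query vector

# Standardize columns so descriptors on different scales contribute equally,
# then rank by Euclidean distance in the standardized descriptor space.
mu, sd = descriptors.mean(axis=0), descriptors.std(axis=0)
Z = (descriptors - mu) / sd
zt = (target - mu) / sd
dist = np.linalg.norm(Z - zt, axis=1)
for name, d in sorted(zip(chemicals, dist), key=lambda x: x[1]):
    print(f"{name}: distance = {d:.2f}")
```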

  3. Understanding Group/Party Affiliation Using Social Networks and Agent-Based Modeling

    Science.gov (United States)

    Campbell, Kenyth

    2012-01-01

    The dynamics of group affiliation and group dispersion is a concept most often studied so that political candidates can better understand the most efficient way to conduct their campaigns. While political campaigning in the United States is a very hot topic that most politicians analyze and study, the concept of group/party affiliation presents its own area of study that produces very interesting results. One tool for examining party affiliation on a large scale is agent-based modeling (ABM), a paradigm in the modeling and simulation (M&S) field perfectly suited for aggregating individual behaviors to observe large swaths of a population. For this study, agent-based modeling was used to look at a community of agents and determine what factors can affect the group/party affiliation patterns that are present. Many factors were present in the agent-based model used for this experiment, but two main factors were used to determine the results. The results of this study show that it is possible to use agent-based modeling to explore group/party affiliation and construct a model that can mimic real world events. More importantly, the model in the study allows the results found in a smaller community to be translated into larger experiments to determine if the results remain present on a much larger scale.

  4. The Research on Informal Learning Model of College Students Based on SNS and Case Study

    Science.gov (United States)

    Lu, Peng; Cong, Xiao; Bi, Fangyan; Zhou, Dongdai

    2017-03-01

    With the rapid development of network technology, online informal learning has become the main way for college students to learn a variety of subject knowledge. Students' affinity for SNS communities and the characteristics of SNS itself provide a good opportunity for the informal learning of college students. This research first analyzes related work on informal learning and SNS, then discusses the characteristics and theoretical basis of informal learning. It then proposes an informal learning model for college students based on SNS, according to the supporting role SNS plays in students' informal learning. Finally, following the theoretical model and the principles proposed in this study, the informal learning community is implemented using Elgg, an open-source SNS program, and related tools. This research tries to overcome issues such as the lack of social realism, interactivity, and resource transfer modes in current online informal learning communities, so as to provide a new way of informal learning for college students.

  5. On Process Modelling Using Physical Oriented And Phenomena Based Principles

    Directory of Open Access Journals (Sweden)

    Mihai Culea

    2000-12-01

    Full Text Available This work presents a modelling framework based on a phenomena description of the process. The approach is intended to make process models easy to understand and construct in heterogeneous, possibly distributed, modelling and simulation environments. A simplified case study of a heat exchanger is considered, with the Modelica modelling language used to check the proposed concept. The partial results are promising, and the research effort will be extended toward a computer-aided modelling environment based on phenomena.

  6. Multi-Collinearity Based Model Selection for Landslide Susceptibility Mapping: A Case Study from Ulus District of Karabuk, Turkey

    Science.gov (United States)

    Sahin, E. K.; Colkesen, I., , Dr; Kavzoglu, T.

    2017-12-01

    Identification of localities prone to landslides plays an important role in emergency planning, disaster management and recovery planning. Due to its great importance for disaster management, producing accurate and up-to-date landslide susceptibility maps is essential for hazard mitigation and regional planning. The main objective of the present study was to apply a multi-collinearity based model selection approach for the production of a landslide susceptibility map of the Ulus district of Karabuk, Turkey. When factors are highly correlated with each other, the data do not contain enough information to describe the problem under consideration. In such cases, choosing a subset of the original features will often lead to better performance. This paper presents a multi-collinearity based model selection approach to deal with high correlation within the dataset. Two collinearity diagnostics, tolerance (TOL) and the variance inflation factor (VIF), are commonly used to identify multi-collinearity; VIF values that exceed 10.0 and TOL values less than 0.1 are often regarded as indicating multi-collinearity. Five causative factors (slope length, curvature, plan curvature, profile curvature and topographical roughness index) were found to be highly correlated with each other among the 15 factors available for the study area. As a result, the five correlated factors were removed from the model estimation, and the performance of the model including the remaining 10 factors (aspect, drainage density, elevation, lithology, land use/land cover, NDVI, slope, sediment transport index, topographical position index and topographical wetness index) was evaluated using logistic regression. The performance of the prediction model constructed with 10 factors was compared to that of the 15-factor model. The prediction performance of the two susceptibility maps was evaluated by overall accuracy and the area under the ROC curve (AUC). Results showed that overall
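
    The collinearity screen can be reproduced with statsmodels; in the sketch below a deliberately collinear pair of synthetic factors triggers the VIF > 10 (equivalently TOL < 0.10) rule. The factor names and data are hypothetical, not the Ulus district dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Compute VIF and tolerance for a hypothetical factor table; factors with
# VIF > 10 (TOL < 0.10) would be dropped before fitting the logistic model.
rng = np.random.default_rng(0)
n = 500
slope = rng.normal(20, 5, n)
slope_length = 2.0 * slope + rng.normal(0, 1, n)   # nearly collinear with slope
elevation = rng.normal(800, 150, n)                # independent factor
X = pd.DataFrame({"slope": slope,
                  "slope_length": slope_length,
                  "elevation": elevation})

exog = sm.add_constant(X)                          # intercept for a proper VIF
for i, col in enumerate(X.columns, start=1):       # skip the constant column
    vif = variance_inflation_factor(exog.values, i)
    print(f"{col}: VIF = {vif:.1f}, TOL = {1.0 / vif:.3f}")
```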

  7. High-level PC-based laser system modeling

    Science.gov (United States)

    Taylor, Michael S.

    1991-05-01

    Since the inception of the Strategic Defense Initiative (SDI) there have been a multitude of comparison studies done in an attempt to evaluate the effectiveness and relative sizes of complementary, and sometimes competitive, laser weapon systems. It became more and more apparent that what the systems analyst needed was not only a fast but also a cost-effective way to perform high-level trade studies. In the present investigation, a general procedure is presented for the development of PC-based algorithmic system models for laser systems. This procedure points out all of the major issues that should be addressed in the design and development of such a model. Issues addressed include defining the problem to be modeled, defining a strategy for development, and, finally, effective use of the model once developed. Being a general procedure, it will allow a systems analyst to develop a model to meet specific needs. To illustrate this method of model development, a description of the Strategic Defense Simulation - Design To (SDS-DT) model developed and used by Science Applications International Corporation (SAIC) is presented. SDS-DT is a menu-driven, fast-executing, PC-based program that can be used either to calculate performance, weight, volume, and cost values for a particular design or, alternatively, to run parametrics on particular system parameters to perhaps optimize a design.

  8. Model based energy benchmarking for glass furnace

    International Nuclear Information System (INIS)

    Sardeshpande, Vishal; Gaitonde, U.N.; Banerjee, Rangan

    2007-01-01

    Energy benchmarking of processes is important for setting energy efficiency targets and planning energy management strategies. Most approaches used for energy benchmarking are based on statistical methods by comparing with a sample of existing plants. This paper presents a model based approach for benchmarking of energy intensive industrial processes and illustrates this approach for industrial glass furnaces. A simulation model for a glass furnace is developed using mass and energy balances, and heat loss equations for the different zones and empirical equations based on operating practices. The model is checked with field data from end fired industrial glass furnaces in India. The simulation model enables calculation of the energy performance of a given furnace design. The model results show the potential for improvement and the impact of different operating and design preferences on specific energy consumption. A case study for a 100 TPD end fired furnace is presented. An achievable minimum energy consumption of about 3830 kJ/kg is estimated for this furnace. The useful heat carried by glass is about 53% of the heat supplied by the fuel. Actual furnaces operating at these production scales have a potential for reduction in energy consumption of about 20-25%

  9. Marker-based or model-based RSA for evaluation of hip resurfacing arthroplasty? A clinical validation and 5-year follow-up.

    Science.gov (United States)

    Lorenzen, Nina Dyrberg; Stilling, Maiken; Jakobsen, Stig Storgaard; Gustafson, Klas; Søballe, Kjeld; Baad-Hansen, Thomas

    2013-11-01

    The stability of implants is vital to ensure long-term survival. RSA determines micro-motions of implants as a predictor of early implant failure. RSA can be performed as a marker-based or model-based analysis. So far, CAD and RE model-based RSA have not been validated for use in hip resurfacing arthroplasty (HRA). A phantom study determined the precision of marker-based and CAD and RE model-based RSA on an HRA implant. In a clinical study, 19 patients were followed with stereoradiographs until 5 years after surgery. Analysis of double-examination migration results determined the clinical precision of marker-based and CAD model-based RSA, and at the 5-year follow-up, results of the total translation (TT) and the total rotation (TR) for marker- and CAD model-based RSA were compared. The phantom study showed that comparison of the precision (SDdiff) in marker-based RSA analysis was more precise than model-based RSA analysis in TT (p CAD RSA analysis (p = 0.002), but showed no difference between the marker- and CAD model-based RSA analysis regarding the TR (p = 0.91). Comparing the mean signed values for the TT and the TR at the 5-year follow-up in 13 patients, the TT was lower (p = 0.03) and the TR higher (p = 0.04) in marker-based RSA compared to CAD model-based RSA. The precision of marker-based RSA was significantly better than that of model-based RSA. However, problems with occluded markers led to the exclusion of many patients, which was not a problem with model-based RSA. The HRA implants were stable at the 5-year follow-up. The detection limit was 0.2 mm TT and 1° TR for marker-based RSA and 0.5 mm TT and 1° TR for CAD model-based RSA for HRA.

  10. Empirically Based Composite Fracture Prediction Model From the Global Longitudinal Study of Osteoporosis in Postmenopausal Women (GLOW)

    Science.gov (United States)

    Compston, Juliet E.; Chapurlat, Roland D.; Pfeilschifter, Johannes; Cooper, Cyrus; Hosmer, David W.; Adachi, Jonathan D.; Anderson, Frederick A.; Díez-Pérez, Adolfo; Greenspan, Susan L.; Netelenbos, J. Coen; Nieves, Jeri W.; Rossini, Maurizio; Watts, Nelson B.; Hooven, Frederick H.; LaCroix, Andrea Z.; March, Lyn; Roux, Christian; Saag, Kenneth G.; Siris, Ethel S.; Silverman, Stuart; Gehlbach, Stephen H.

    2014-01-01

    Context: Several fracture prediction models that combine fractures at different sites into a composite outcome are in current use. However, to the extent individual fracture sites have differing risk factor profiles, model discrimination is impaired. Objective: The objective of the study was to improve model discrimination by developing a 5-year composite fracture prediction model for fracture sites that display similar risk profiles. Design: This was a prospective, observational cohort study. Setting: The study was conducted at primary care practices in 10 countries. Patients: Women aged 55 years or older participated in the study. Intervention: Self-administered questionnaires collected data on patient characteristics, fracture risk factors, and previous fractures. Main Outcome Measure: The main outcome is time to first clinical fracture of hip, pelvis, upper leg, clavicle, or spine, each of which exhibits a strong association with advanced age. Results: Of four composite fracture models considered, model discrimination (c index) is highest for an age-related fracture model (c index of 0.75, 47,066 women), and lowest for Fracture Risk Assessment Tool (FRAX) major fracture and a 10-site model (c indices of 0.67 and 0.65). The unadjusted increase in fracture risk for an additional 10 years of age ranges from 80% to 180% for the individual bones in the age-associated model. Five other fracture sites not considered for the age-associated model (upper arm/shoulder, rib, wrist, lower leg, and ankle) have age associations for an additional 10 years of age from a 10% decrease to a 60% increase. Conclusions: After examining results for 10 different bone fracture sites, advanced age appeared the single best possibility for uniting several different sites, resulting in an empirically based composite fracture risk model. PMID:24423345

  11. A case study on the historical peninsula of Istanbul based on three-dimensional modeling by using photogrammetry and terrestrial laser scanning.

    Science.gov (United States)

    Ergun, Bahadir; Sahin, Cumhur; Baz, Ibrahim; Ustuntas, Taner

    2010-06-01

    Terrestrial laser scanning is a popular methodology that is used frequently in documenting historical buildings and cultural heritage. The historical peninsula region sprawls over an area of approximately 1,500 ha and contains one of the main concentrations of historical buildings in Istanbul. In this study, terrestrial laser scanning and close range photogrammetry techniques are integrated with each other to create a 3D city model of this part of Istanbul, including some of the buildings that represent the most brilliant periods of the Byzantine and Ottoman Empires. Several terrestrial laser scanners with different specifications were used to solve various geometric scanning problems for distinct areas of the subject city. The photogrammetric method was used for the documentation of the façades of these historical buildings for architectural purposes. This study differentiates itself from similar ones by an application process that focuses on the geometry, the building texture, and the density of the study area. Nowadays, the largest-scale studies among 3D modeling studies, in terms of measurement methodology, are urban modeling studies. Because of this large scale, 3D urban modeling studies are executed in a gradual way. In this study, a modeling method based on street façades was used, and the complementary elements for the modeling process were combined in several ways. A street model is presented as a sample, being the subject of the applied study. In our application of 3D modeling, modeling based on close range photogrammetry and combined-calibration data from the terrestrial laser scanner were used in a compatible way. The final work was formed with the pedestal data for 3D visualization.

  12. A Model of Decision-Making Based on Critical Thinking

    OpenAIRE

    Uluçınar, Ufuk; Aypay, Ahmet

    2016-01-01

    The aim of this study is to examine the causal relationships between high school students' inquisitiveness, open-mindedness, causal thinking, and rational and intuitive decision-making dispositions through an assumed model based on research data. The study was designed as a correlational study. Confirmatory factor analysis and path analysis, which are structural equation modelling applications, were used to explain these relationships. The participants were 404 students studying in five high s...

  13. Ratio-based vs. model-based methods to correct for urinary creatinine concentrations.

    Science.gov (United States)

    Jain, Ram B

    2016-08-01

    Creatinine-corrected urinary analyte concentration is usually computed as the ratio of the observed analyte concentration to the observed urinary creatinine concentration (UCR). This ratio-based method is flawed, since it implicitly assumes that hydration is the only factor that affects urinary creatinine concentrations. On the contrary, it has been shown in the literature that age, gender, race/ethnicity, and other factors also affect UCR. Consequently, an optimal method to correct for UCR should correct for hydration as well as other factors, like age, gender, and race/ethnicity, that affect UCR. Model-based creatinine correction, in which observed UCRs are used as an independent variable in regression models, has been proposed. This study was conducted to evaluate the performance of the ratio-based and model-based creatinine correction methods when the effects of gender, age, and race/ethnicity are evaluated one factor at a time for selected urinary analytes and metabolites. It was observed that the ratio-based method leads to statistically significant pairwise differences, for example, between males and females or between non-Hispanic whites (NHW) and non-Hispanic blacks (NHB), more often than the model-based method. However, depending upon the analyte of interest, the reverse is also possible. The estimated ratios of geometric means (GM), for example, male to female or NHW to NHB, were also compared for the two methods. When estimated UCRs were higher for the group (for example, males) in the numerator of this ratio, these ratios were higher for the model-based method, for example, the male to female ratio of GMs. When estimated UCRs were lower for the group (for example, NHW) in the numerator of this ratio, these ratios were higher for the ratio-based method, for example, the NHW to NHB ratio of GMs. The model-based method is the method of choice if all factors that affect UCR are to be accounted for.
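
    The contrast between the two corrections can be sketched on synthetic data: ratio-based correction divides by UCR, while model-based correction enters log(UCR) as a regression covariate; the data-generating assumptions below (log-normal UCR, a hypothetical gender effect on UCR) are for illustration only.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic data: UCR depends on gender, analyte depends only on UCR, so a
# correct method should find no gender effect on the corrected analyte.
rng = np.random.default_rng(4)
n = 400
male = rng.integers(0, 2, n)
ucr = np.exp(rng.normal(0.0, 0.3, n) + 0.25 * male)       # creatinine, g/L
analyte = np.exp(rng.normal(1.0, 0.4, n) + 0.6 * np.log(ucr))

# Ratio-based: divide by creatinine and compare group geometric means.
ratio = analyte / ucr
gm_ratio = np.exp(np.log(ratio[male == 1]).mean() -
                  np.log(ratio[male == 0]).mean())
print("ratio-based male/female GM ratio:", round(gm_ratio, 3))

# Model-based: regress log(analyte) on log(UCR) plus the covariate of interest.
X = sm.add_constant(np.column_stack([np.log(ucr), male]))
fit = sm.OLS(np.log(analyte), X).fit()
print("model-based male/female GM ratio:", round(np.exp(fit.params[2]), 3))
```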

  14. Temporal validation for landsat-based volume estimation model

    Science.gov (United States)

    Renaldo J. Arroyo; Emily B. Schultz; Thomas G. Matney; David L. Evans; Zhaofei Fan

    2015-01-01

    Satellite imagery can potentially reduce the costs and time associated with ground-based forest inventories; however, for satellite imagery to provide reliable forest inventory data, it must produce consistent results from one time period to the next. The objective of this study was to temporally validate a Landsat-based volume estimation model in a four county study...

  15. Model-predictive control based on Takagi-Sugeno fuzzy model for electrical vehicles delayed model

    DEFF Research Database (Denmark)

    Khooban, Mohammad-Hassan; Vafamand, Navid; Niknam, Taher

    2017-01-01

    Electric vehicles (EVs) play a significant role in different applications, such as commuter vehicles and short distance transport applications. This study presents a new structure of model-predictive control based on the Takagi-Sugeno fuzzy model, linear matrix inequalities, and a non-quadratic Lyapunov function for the speed control of EVs including time-delay states and parameter uncertainty. Experimental data, using the Federal Test Procedure (FTP-75), is applied to test the performance and robustness of the suggested controller in the presence of time-varying parameters. Besides, a comparison is made between the results of the suggested robust strategy and those obtained from some of the most recent studies on the same topic, to assess the efficiency of the suggested controller. Finally, the experimental results based on a TMS320F28335 DSP are performed on a direct current motor. Simulation

  16. Physiologically-Based Toxicokinetic Modeling of Zearalenone and Its Metabolites: Application to the Jersey Girl Study

    Science.gov (United States)

    Mukherjee, Dwaipayan; Royce, Steven G.; Alexander, Jocelyn A.; Buckley, Brian; Isukapalli, Sastry S.; Bandera, Elisa V.; Zarbl, Helmut; Georgopoulos, Panos G.

    2014-01-01

    Zearalenone (ZEA), a fungal mycotoxin, and its metabolite zeranol (ZAL) are known estrogen agonists in mammals, and are found as contaminants in food. Zeranol, which is more potent than ZEA and comparable in potency to estradiol, is also added as a growth additive in beef in the US and Canada. This article presents the development and application of a Physiologically-Based Toxicokinetic (PBTK) model for ZEA and ZAL and their primary metabolites, zearalenol, zearalanone, and their conjugated glucuronides, for rats and for human subjects. The PBTK modeling study explicitly simulates critical metabolic pathways in the gastrointestinal and hepatic systems. Metabolic events such as dehydrogenation and glucuronidation of the chemicals, which have direct effects on the accumulation and elimination of the toxic compounds, have been quantified. The PBTK model considers urinary and fecal excretion and biliary recirculation and compares the predicted biomarkers of blood, urinary and fecal concentrations with published in vivo measurements in rats and human subjects. Additionally, the toxicokinetic model has been coupled with a novel probabilistic dietary exposure model and applied to the Jersey Girl Study (JGS), which involved measurement of mycoestrogens as urinary biomarkers, in a cohort of young girls in New Jersey, USA. A probabilistic exposure characterization for the study population has been conducted and the predicted urinary concentrations have been compared to measurements considering inter-individual physiological and dietary variability. The in vivo measurements from the JGS fall within the high and low predicted distributions of biomarker values corresponding to dietary exposure estimates calculated by the probabilistic modeling system. The work described here is the first of its kind to present a comprehensive framework developing estimates of potential exposures to mycotoxins and linking them with biologically relevant doses and biomarker measurements

  17. A study for production simulation model generation system based on data model at a shipyard

    Directory of Open Access Journals (Sweden)

    Myung-Gi Back

    2016-09-01

    Full Text Available Simulation technology is a type of shipbuilding product lifecycle management solution used to support production planning or decision-making. Most shipbuilding processes consist of job-shop production, and their modeling and simulation require professional skills and experience in shipbuilding. For these reasons, many shipbuilding companies have difficulty adopting simulation systems, despite the necessity of the technology. In this paper, the data model for shipyard production simulation model generation was defined by analyzing the iterative simulation modeling procedure. The shipyard production simulation data model defined in this study contains the information necessary for the conventional simulation modeling procedure and can serve as a basis for simulation model generation. The efficacy of the developed system was validated by applying it to the simulation model generation of a panel block production line. By implementing the initial simulation model generation process, which in the past was performed by a simulation modeler, the proposed system substantially reduced the modeling time. In addition, by reducing the difficulties posed by modeler-dependent generation methods, the proposed system makes standardization of simulation model quality possible.

  18. Dissemination of Cultural Norms and Values: Agent-Based Modeling

    Directory of Open Access Journals (Sweden)

    Denis Andreevich Degterev

    2016-12-01

    Full Text Available This article shows how agent-based modeling allows us to explore the mechanisms of the dissemination of cultural norms and values, both within one country and across the whole world. In recent years, this type of simulation has become particularly prevalent in the analysis of international relations, growing more popular than system dynamics and discrete event simulation. The use of agent-based modeling in the analysis of international relations is connected with the agent-structure problem in international relations: structure and agents act as interdependent entities that change dynamically in the process of their interaction. Agent-structure interaction can be modeled by means of the theory of complex adaptive systems, using agent-based modeling techniques. One of the first examples of agent-based modeling in political science is T. Schelling's model of racial segregation. On the basis of this model, the author shows how changes in behavioral patterns at the micro-level impact the macro-level. Patterns change due to the dynamics of cultural norms and values, formed by mass media and other social institutions. The author surveys the main areas of modern application of agent-based modeling in international studies, including the analysis of ethnic conflicts and the formation of international coalitions. Particular attention is paid to Robert Axelrod's approach, which uses genetic algorithms to model the spread of cultural norms and values. Agent-based modeling shows how to create conditions under which norms that originally are not shared by a significant part of the population eventually spread everywhere. The author illustrates the practical application of these algorithms with the example of the situation in Ukraine in 2015-2016. The article also reveals the mechanisms of the international spread of cultural norms and values. The main think-tanks using agent-based modeling in international studies are…
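
    As a concrete illustration of the micro-to-macro mechanism cited above, the following is a minimal Python sketch of Schelling's segregation model; the grid size, tolerance threshold, and type shares are illustrative assumptions, not parameters from the article.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n, tolerance, rounds = 40, 0.35, 50
    grid = rng.choice([0, 1, 2], p=[0.10, 0.45, 0.45], size=(n, n))  # 0 = empty cell

    def like_share(g, i, j):
        """Fraction of occupied Moore neighbors sharing cell (i, j)'s type."""
        nb = g[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2]
        occupied = (nb != 0).sum() - 1          # exclude the cell itself
        same = (nb == g[i, j]).sum() - 1
        return same / occupied if occupied else 1.0

    for _ in range(rounds):
        # Agents unhappy with their neighborhood relocate to a random empty cell
        movers = [(i, j) for i, j in zip(*np.nonzero(grid))
                  if like_share(grid, i, j) < tolerance]
        empties = [tuple(c) for c in np.argwhere(grid == 0)]
        order = rng.permutation(len(empties))
        for (i, j), k in zip(movers, order):
            ei, ej = empties[k]
            grid[ei, ej], grid[i, j] = grid[i, j], 0

    avg = np.mean([like_share(grid, i, j) for i, j in zip(*np.nonzero(grid))])
    print(f"average like-neighbor share after {rounds} rounds: {avg:.2f}")
    ```

    Even with a tolerance well below a majority preference, the average like-neighbor share typically climbs far above the threshold, which is exactly the emergent macro-level segregation the model is known for.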

  19. Gender-related model for mobile-based learning

    Science.gov (United States)

    Simanjuntak, R. R.; Dewi, U. P.; Rifai, I.

    2018-03-01

    The study investigates gender influence on mobile-based learning. This case study of university students in Jakarta involved 235 students (128 male, 97 female). Results of this qualitative study showed a 96% preference for mobile-based learning, with 94% indicating a need for collaboration and 92% a need for authenticity. Hofstede's cultural dimensions were used to identify the gender aspects of MALL. Masculinity (65%) was preferred over Femininity (35%), even among the female respondents (70% of the population). Professions and professionalism received the strongest preference (70%), while Individuality and Collectivism had equal preferences among students. Both female and male respondents requested Indulgence (84%) for mobile-based learning, with more male respondents opting for Indulgence. The study provides a model for gender-sensitive mobile-based learning. Implications of implementing mobile-based learning as an ideal alternative for well-accommodated education are also discussed.

  20. Bayesian based Diagnostic Model for Condition based Maintenance of Offshore Wind Farms

    DEFF Research Database (Denmark)

    Asgarpour, Masoud; Sørensen, John Dalsgaard

    2018-01-01

    Operation and maintenance costs are a major contributor to the Levelized Cost of Energy for electricity produced by offshore wind, and can be significantly reduced if existing corrective actions are performed as efficiently as possible and if future corrective actions are avoided by performing sufficient preventive actions. This paper presents an applied and generic diagnostic model for fault detection and condition based maintenance of offshore wind components. The diagnostic model is based on two probabilistic matrices; first, a confidence matrix, representing the probability of detection using … for a wind turbine component based on vibration, temperature, and oil particle fault detection methods. The last part of the paper discusses the case study results and presents conclusions.

  1. Predicting seizure by modeling synaptic plasticity based on EEG signals - a case study of inherited epilepsy

    Science.gov (United States)

    Zhang, Honghui; Su, Jianzhong; Wang, Qingyun; Liu, Yueming; Good, Levi; Pascual, Juan M.

    2018-03-01

    This paper explores the internal dynamical mechanisms of epileptic seizures through quantitative modeling based on full-brain electroencephalogram (EEG) signals. Our goal is to provide seizure prediction and facilitate treatment for epileptic patients. Motivated by an earlier mathematical model with incorporated synaptic plasticity, we studied the nonlinear dynamics of inherited seizures through a differential equation model. First, driven by a set of clinical inherited electroencephalogram data recorded from a patient with diagnosed Glucose Transporter Deficiency, we developed a dynamic seizure model based on a system of ordinary differential equations. The model was reduced in complexity after considering and removing the redundancy of each EEG channel. We then verified that the proposed model produces qualitatively relevant behavior that matches the basic experimental observations of inherited seizure, including synchronization index and frequency. Meanwhile, the rationality of the connectivity-structure hypothesis in the modeling process was verified. Further, by varying the threshold condition and excitation strength of synaptic plasticity, we elucidated the effect of synaptic plasticity on our seizure model. Results suggest that synaptic plasticity has a great effect on the duration of seizure activities, which supports the plausibility of therapeutic interventions for seizure control.

  2. A text-based data mining and toxicity prediction modeling system for a clinical decision support in radiation oncology: A preliminary study

    Science.gov (United States)

    Kim, Kwang Hyeon; Lee, Suk; Shim, Jang Bo; Chang, Kyung Hwan; Yang, Dae Sik; Yoon, Won Sup; Park, Young Je; Kim, Chul Yong; Cao, Yuan Jie

    2017-08-01

    The aim of this preliminary study is integrated research on a text-based data mining and toxicity prediction modeling system for clinical decision support based on big data in radiation oncology. The structured data were prepared from treatment plans, and the unstructured data were extracted by image pattern recognition of dose-volume data from prostate cancer research articles crawled from the internet. We modeled an artificial neural network to build a predictor system for toxicity prediction of organs at risk. We used a text-based data mining approach to build the artificial neural network model for bladder and rectum complication predictions. The pattern recognition method mined the unstructured dose-volume toxicity data with a detection accuracy of 97.9%. The confusion matrix and training model of the neural network were obtained with 50 modeled plans (n = 50) for validation. The toxicity level was analyzed, and the risk factors for 25% bladder, 50% bladder, 20% rectum, and 50% rectum were calculated by the artificial neural network algorithm. Of the 50 modeled plans, 32 were predicted to cause complications and 18 were classified as non-complication. We integrated data mining and a toxicity modeling method for toxicity prediction using prostate cancer cases. It is shown that a preprocessing analysis using text-based data mining and prediction modeling can be expanded to personalized patient treatment decision support based on big data.
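
    A minimal sketch of the kind of neural-network complication classifier the abstract describes, using scikit-learn; the four dose-volume features, the synthetic labeling rule, and all data are placeholders rather than the study's records.

    ```python
    # Toy complication/non-complication predictor on dose-volume features.
    # Feature layout and synthetic data are illustrative assumptions.
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import confusion_matrix

    rng = np.random.default_rng(0)

    n_plans = 50  # the study validated with 50 modeled plans
    # Columns: dose to 25%/50% bladder volume, dose to 20%/50% rectum volume (Gy)
    X = rng.uniform(40.0, 80.0, size=(n_plans, 4))
    # Synthetic rule: higher overall dose-volume burden -> complication (label 1)
    y = (X.mean(axis=1) > 60.0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
    clf.fit(X_tr, y_tr)

    print(confusion_matrix(y_te, clf.predict(X_te)))
    ```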

  3. An agent-based model to study market penetration of plug-in hybrid electric vehicles

    International Nuclear Information System (INIS)

    Eppstein, Margaret J.; Grover, David K.; Marshall, Jeffrey S.; Rizzo, Donna M.

    2011-01-01

    A spatially explicit agent-based vehicle consumer choice model is developed to explore sensitivities and nonlinear interactions between various potential influences on plug-in hybrid vehicle (PHEV) market penetration. The model accounts for spatial and social effects (including threshold effects, homophily, and conformity) and media influences. Preliminary simulations demonstrate how such a model could be used to identify nonlinear interactions among potential leverage points, inform policies affecting PHEV market penetration, and help identify the future data collection necessary to model the system more accurately. We examine sensitivity of the model to gasoline prices, to accuracy in estimation of fuel costs, to agent willingness to adopt the PHEV technology, to PHEV purchase price and rebates, to PHEV battery range, and to heuristic values related to gasoline usage. Our simulations indicate that PHEV market penetration could be enhanced significantly by providing consumers with ready estimates of expected lifetime fuel costs associated with different vehicles (e.g., on vehicle stickers), and that increases in gasoline prices could nonlinearly magnify the impact on fleet efficiency. We also infer a potential synergy from a gasoline tax whose proceeds are used to fund research into longer-range, lower-cost PHEV batteries. - Highlights: → We model consumer agents to study potential market penetration of PHEVs. → The model accounts for spatial, social, and media effects. → We identify interactions among potential leverage points that could inform policy. → Consumer access to expected lifetime fuel costs may enhance PHEV market penetration. → Increasing PHEV battery range has synergistic effects on fleet efficiency.
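
    A toy sketch of the social-threshold adoption dynamics such a model rests on; real PHEV agent models are far richer (spatially explicit networks, fuel-cost heuristics, media channels), and every parameter here is an invented stand-in.

    ```python
    # Minimal threshold/conformity adoption model on a ring of agents.
    import numpy as np

    rng = np.random.default_rng(1)
    n_agents, n_steps, k = 400, 50, 5   # k neighbors on each side

    threshold = rng.uniform(0.05, 0.5, n_agents)       # share of local adopters needed
    media_nudge = rng.uniform(0.0, 0.02, n_agents)     # per-step spontaneous adoption
    adopted = rng.random(n_agents) < 0.02              # small seed of early adopters

    for _ in range(n_steps):
        # Share of adopters in each agent's local window (self included)
        local_share = np.array([
            adopted[np.arange(i - k, i + k + 1) % n_agents].mean()
            for i in range(n_agents)
        ])
        adopt_now = (local_share >= threshold) | (rng.random(n_agents) < media_nudge)
        adopted = adopted | adopt_now

    print(f"final PHEV market penetration: {adopted.mean():.1%}")
    ```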

  4. Training Post-9/11 Police Officers with a Counter-Terrorism Reality-Based Training Model: A Case Study

    Science.gov (United States)

    Biddle, Christopher J.

    2013-01-01

    The purpose of this qualitative holistic multiple-case study was to identify the optimal theoretical approach for a Counter-Terrorism Reality-Based Training (CTRBT) model to train post-9/11 police officers to perform effectively in their counter-terrorism assignments. Post-9/11 police officers assigned to counter-terrorism duties are not trained…

  5. Modeling and cellular studies

    International Nuclear Information System (INIS)

    Anon.

    1982-01-01

    Testing the applicability of mathematical models with carefully designed experiments is a powerful tool in investigating the effects of ionizing radiation on cells. The modeling and cellular studies complement each other, for modeling provides guidance for designing critical experiments, which must provide definitive results, while the experiments themselves provide new input to the model. Based on previous experimental results, the model for the accumulation of damage in Chlamydomonas reinhardi has been extended to include various multiple two-event combinations. Split-dose survival experiments have shown that models tested to date predict most but not all of the observed behavior. Stationary-phase mammalian cells, required for tests of other aspects of the model, have been shown to be at different points in the cell cycle depending on how they were forced to stop proliferating. These cultures also demonstrate different capacities for repair of sublethal radiation damage.

  6. Measurement-based harmonic current modeling of mobile storage for power quality study in the distribution system

    Directory of Open Access Journals (Sweden)

    Wenge Christoph

    2017-12-01

    Full Text Available Electric vehicles (EVs) can be utilized as mobile storage in a power system. The use of battery chargers can cause current harmonics in the supplying AC system. In order to analyze the impact of different EVs with regard to their number and their emission of current harmonics, a generic harmonic current model of EV types was built and implemented in the power system simulation tool PSS®NETOMAC. Based on measurement data for different types of EVs, three standardized harmonic EV models were developed and parametrized. The identified harmonic models were then used in load-flow computations in a modeled German power distribution system. As a benchmark, a case scenario assuming high market penetration of EVs in Germany in the year 2030 was studied. The impact of EV charging on the power distribution system was analyzed and evaluated against valid power quality standards.
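
    A sketch of the aggregation step such a study implies: summing per-charger harmonic current spectra over an assumed fleet and computing total harmonic distortion (THD). The spectra and fleet counts are invented placeholders, not the paper's measured models, and the arithmetic summation is a worst case (standards such as IEC 61000-3-6 apply diversity exponents instead).

    ```python
    import numpy as np

    # Harmonic orders considered and per-charger currents in amperes
    # (fundamental first); values are illustrative only.
    orders = np.array([1, 3, 5, 7, 9, 11])
    spectra = {
        "type_A": np.array([16.0, 2.4, 1.1, 0.6, 0.3, 0.2]),
        "type_B": np.array([32.0, 3.1, 1.8, 0.9, 0.5, 0.3]),
        "type_C": np.array([10.0, 1.2, 0.7, 0.4, 0.2, 0.1]),
    }
    fleet = {"type_A": 120, "type_B": 40, "type_C": 300}  # assumed 2030 counts

    # Worst-case (in-phase) summation of all chargers' harmonic currents
    total = sum(n * spectra[t] for t, n in fleet.items())
    thd = np.sqrt(np.sum(total[1:] ** 2)) / total[0]
    print(f"aggregate current THD (worst case): {thd:.1%}")
    ```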

  7. Thermodynamic data base needs for modeling studies of the Yucca Mountain project

    International Nuclear Information System (INIS)

    Palmer, C.E.A.; Silva, R.J.; Bucher, J.J.

    1996-01-01

    This document is the first in a series of documents outlining the thermodynamic data needs for performing geochemical modeling calculations in support of various waste package performance assessment activities for the Yucca Mountain Project. The documents are intended to identify and justify the critical thermodynamic data needs for the data base to be used with the models. The Thermodynamic Data Determinations task supplies data needed to resolve performance or design issues, and the development of the data base will remain an iterative process as needs change or data improve. For example, data are needed to predict: (1) major ion groundwater chemistry and its evolution, (2) mineral stabilities and evolution, (3) engineered barrier near-field transport and retardation properties, (4) changes in geochemical conditions and processes, (5) solubilities, speciation and transport of waste radionuclides and (6) the dissolution or corrosion of construction and canister materials and the effect on groundwater chemistry and radionuclide solubilities and transport. The system is complex and interactive, and data need to be supplied in order to model the changes and their effect on other components of the system, e.g., temperature, pH and redox conditions (Eh). Through sensitivity and uncertainty analyses, the critical data and system parameters will be identified and the acceptable variations in them documented.

  8. Feasibility model study for Blumbangreksa product model based on lean startup method

    Science.gov (United States)

    Pakpahan, A. K.; Dewobroto, W. S.; Pratama, R. Y.

    2017-12-01

    Based on data from the Ministry of Maritime Affairs and Fisheries in 2015, the productivity of shrimp farmers in Indonesia is still below that of China, India and Thailand, because of the low survival rate of shrimp seeds planted in Indonesia. Water quality is a significant factor in increasing the survival rate of planted shrimp seeds; therefore, the team of PT. Atnic EkoteknoWicaksana created a tool called Blumbangreksa that is able to monitor the water quality of shrimp farms, measuring temperature, salinity, pH, DO (dissolved oxygen) and TDS (total dissolved solids) in the water as well as the humidity above the water surface, and that is GSM-based and built on the Internet of Things. Based on the research results, the unique value proposition of the Blumbangreksa product is accurate, real-time water quality measurement, based on the Internet of Things and with the ability to take several measurements at once. Based on the feasibility study using Marty Cagan's opportunity assessment, the product fulfills all ten elements of the opportunity assessment, so the Blumbangreksa product is considered feasible. The initial investment for the Blumbangreksa product is Rp 1,369,856,574, with a profitability index of 1.51, an average breakeven point of 18 products sold per year, and a payback period of 4 years and 2 months; therefore, the Blumbangreksa product business is feasible.
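
    The feasibility arithmetic is easy to reproduce back-of-the-envelope. Only the investment figure below comes from the abstract; the flat cash-flow series and the discount rate are assumptions, chosen so that the undiscounted payback lands near the reported 4 years and 2 months.

    ```python
    # Profitability index = PV(cash flows) / initial investment, plus the
    # undiscounted payback period. Cash flows and rate are hypothetical.
    initial_investment = 1_369_856_574            # Rp, from the abstract
    cash_flows = [330_000_000] * 8                # assumed yearly net inflows (Rp)
    discount_rate = 0.05                          # assumed

    pv = sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cash_flows, 1))
    print(f"profitability index: {pv / initial_investment:.2f}")

    cumulative = 0.0
    for year, cf in enumerate(cash_flows, 1):
        if cumulative + cf >= initial_investment:
            frac = (initial_investment - cumulative) / cf
            print(f"payback period: {year - 1 + frac:.2f} years")  # ~4.15 = 4 y 2 mo
            break
        cumulative += cf
    ```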

  9. The Research of Clinical Decision Support System Based on Three-Layer Knowledge Base Model

    Directory of Open Access Journals (Sweden)

    Yicheng Jiang

    2017-01-01

    Full Text Available In many clinical decision support systems, a two-layer knowledge base model (disease-symptom) of rule reasoning is used. This model often does not express knowledge very well, since it simply infers disease from the presence of certain symptoms. In this study, we propose a three-layer knowledge base model (disease-symptom-property) to utilize more useful information in inference. The system iteratively calculates the probability that patients suffer from particular diseases based on a multisymptom naive Bayes algorithm, in which the specificity of each disease symptom is weighted by an estimate of its degree of contribution to diagnosing the disease. This significantly reduces the dependencies between attributes, so that the naive Bayes algorithm can be applied more properly. Then, the online learning process for parameter optimization of the inference engine was completed. Finally, our decision support system utilizing the three-layer model was formally evaluated by two experienced doctors. Comparisons between predicted and clinical results show that our system can provide effective clinical recommendations to doctors. Moreover, we found that the three-layer model can improve the accuracy of predictions compared with the two-layer model. In light of some of the limitations of this study, we also identify and discuss several areas that need continued improvement.
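
    A minimal sketch of a specificity-weighted multisymptom naive Bayes ranking in the spirit of the three-layer model; the diseases, probabilities, and weights are invented for illustration and do not come from the paper's knowledge base.

    ```python
    import math

    priors = {"flu": 0.05, "cold": 0.20}
    # P(symptom present | disease)
    likelihood = {
        "flu":  {"fever": 0.90, "cough": 0.80, "headache": 0.60},
        "cold": {"fever": 0.20, "cough": 0.70, "headache": 0.30},
    }
    # Specificity weight: how strongly a symptom discriminates between diseases
    weight = {"fever": 1.0, "cough": 0.4, "headache": 0.6}

    observed = ["fever", "cough"]

    scores = {}
    for disease, prior in priors.items():
        log_post = math.log(prior)
        for s in observed:
            # Weighted naive Bayes: scale each log-likelihood contribution
            log_post += weight[s] * math.log(likelihood[disease][s])
        scores[disease] = log_post

    z = sum(math.exp(v) for v in scores.values())  # normalize for ranking
    for disease, v in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(f"{disease}: {math.exp(v) / z:.2f}")
    ```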

  10. Study on quantification method based on Monte Carlo sampling for multiunit probabilistic safety assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Kye Min [KHNP Central Research Institute, Daejeon (Korea, Republic of); Han, Sang Hoon; Park, Jin Hee; Lim, Ho Gon; Yang, Joon Yang [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Heo, Gyun Young [Kyung Hee University, Yongin (Korea, Republic of)

    2017-06-15

    In Korea, many nuclear power plants operate at a single site based on geographical characteristics, but the population density near the sites is higher than that in other countries. Thus, multiunit accidents are a more important consideration than in other countries and should be addressed appropriately. Currently, there are many issues related to a multiunit probabilistic safety assessment (PSA). One of them is the quantification of a multiunit PSA model. A traditional PSA uses a Boolean manipulation of the fault tree in terms of the minimal cut set. However, such methods have some limitations when rare event approximations cannot be used effectively or a very small truncation limit should be applied to identify accident sequence combinations for a multiunit site. In particular, it is well known that seismic risk in terms of core damage frequency can be overestimated because there are many events that have a high failure probability. In this study, we propose a quantification method based on a Monte Carlo approach for a multiunit PSA model. This method can consider all possible accident sequence combinations in a multiunit site and calculate a more exact value for events that have a high failure probability. An example model for six identical units at a site was also developed and quantified to confirm the applicability of the proposed method.
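
    The motivation for Monte Carlo quantification can be shown in a few lines: for basic events with high failure probabilities (as in seismic PSA), the minimal-cut-set rare-event approximation overestimates the top-event probability, while direct sampling of the fault-tree logic does not. The two-cut-set example below is illustrative, not a multiunit model.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    p = {"A": 0.4, "B": 0.5, "C": 0.3}      # high-probability basic events
    cut_sets = [("A", "B"), ("A", "C")]     # top event = (A and B) or (A and C)

    # Rare-event approximation: sum of minimal-cut-set probabilities
    approx = sum(np.prod([p[e] for e in cs]) for cs in cut_sets)

    # Monte Carlo: sample every basic event and evaluate the top logic exactly
    n = 1_000_000
    samples = {e: rng.random(n) < q for e, q in p.items()}
    top = (samples["A"] & samples["B"]) | (samples["A"] & samples["C"])

    print(f"rare-event approximation: {approx:.3f}")   # 0.320
    print(f"Monte Carlo estimate:     {top.mean():.3f}")  # ~0.260 (exact value)
    ```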

  11. Measurement-based reliability/performability models

    Science.gov (United States)

    Hsueh, Mei-Chen

    1987-01-01

    Measurement-based models based on real error-data collected on a multiprocessor system are described. Model development from the raw error-data to the estimation of cumulative reward is also described. A workload/reliability model is developed based on low-level error and resource usage data collected on an IBM 3081 system during its normal operation in order to evaluate the resource usage/error/recovery process in a large mainframe system. Thus, both normal and erroneous behavior of the system are modeled. The results provide an understanding of the different types of errors and recovery processes. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A sensitivity analysis is performed to investigate the significance of using a semi-Markov process, as opposed to a Markov process, to model the measured system.
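
    The paper's key observation, non-exponential holding times forcing a semi-Markov treatment, can be sketched as follows: transitions follow a jump matrix while dwell times are drawn from Weibull distributions (shape not equal to 1 means non-exponential). States and parameters are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    states = ["normal", "error", "recovery"]
    P = np.array([              # jump-chain transition probabilities (rows sum to 1)
        [0.0, 1.0, 0.0],
        [0.0, 0.0, 1.0],
        [1.0, 0.0, 0.0],
    ])
    # (shape, scale) of the Weibull holding time in each state
    holding = {"normal": (1.8, 100.0), "error": (0.7, 0.5), "recovery": (1.2, 2.0)}

    def simulate(horizon=10_000.0):
        t, s, time_in = 0.0, 0, dict.fromkeys(states, 0.0)
        while t < horizon:
            shape, scale = holding[states[s]]
            dwell = scale * rng.weibull(shape)   # non-exponential dwell time
            time_in[states[s]] += dwell
            t += dwell
            s = rng.choice(3, p=P[s])            # jump to the next state
        return {k: v / t for k, v in time_in.items()}

    print(simulate())  # long-run fraction of time spent in each state
    ```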

  12. Model-free stochastic processes studied with q-wavelet-based informational tools

    International Nuclear Information System (INIS)

    Perez, D.G.; Zunino, L.; Martin, M.T.; Garavaglia, M.; Plastino, A.; Rosso, O.A.

    2007-01-01

    We undertake a model-free investigation of stochastic processes employing q-wavelet-based quantifiers, which constitute a generalization of their Shannon counterparts. It is shown that (i) interesting physical information becomes accessible in this way, (ii) for special q values the quantifiers are more sensitive than the Shannon ones, and (iii) there exists an implicit relationship between the Hurst parameter H and q within this wavelet framework.

  13. Study on fusion potential barrier in heavy ion reactions based on the dynamical model

    International Nuclear Information System (INIS)

    Tian Junlong; Wu Xizhen; Li Zhuxia; Wang Ning; Liu Fuhu

    2004-01-01

    Based on an improved quantum molecular dynamics model, the static and dynamic potentials in the entrance channel for the synthesis of superheavy nuclei are studied. The dependence of the static potential (and the driving potential) on mass asymmetry is obtained. From this study the authors find that mass-symmetric systems seem to be difficult to fuse, and that the fusing system with the largest driving potential could be the optimal choice of projectile-target combination. By comparing the static potential barrier with the dynamic one, the authors find that the latter is clearly lower than the former, and that the dynamical potential barrier depends on the entrance-channel energy. The maximum and minimum of the dynamic potential barriers approach the diabatic (sudden approximation) and the adiabatic static potential barriers, respectively.

  14. The "proactive" model of learning: Integrative framework for model-free and model-based reinforcement learning utilizing the associative learning-based proactive brain concept.

    Science.gov (United States)

    Zsuga, Judit; Biro, Klara; Papp, Csaba; Tajti, Gabor; Gesztelyi, Rudolf

    2016-02-01

    Reinforcement learning (RL) is a powerful concept underlying forms of associative learning governed by the use of a scalar reward signal, with learning taking place if expectations are violated. RL may be assessed using model-based and model-free approaches. Model-based reinforcement learning involves the amygdala, the hippocampus, and the orbitofrontal cortex (OFC). The model-free system involves the pedunculopontine-tegmental nucleus (PPTgN), the ventral tegmental area (VTA) and the ventral striatum (VS). Based on the functional connectivity of the VS, both model-free and model-based RL systems center on the VS, which computes value by integrating model-free signals (received as reward prediction errors) with model-based, reward-related input. Using the concept of a reinforcement learning agent, we propose that the VS serves as the value function component of the RL agent. Regarding the model utilized for model-based computations, we turn to the proactive brain concept, which assigns a ubiquitous function to the default network based on its great functional overlap with contextual associative areas. By means of the default network, the brain continuously organizes its environment into context frames, enabling the formulation of analogy-based associations that are turned into predictions of what to expect. The OFC integrates reward-related information into context frames upon computing reward expectation, by compiling the stimulus-reward and context-reward information offered by the amygdala and hippocampus, respectively. Furthermore, we suggest that the integration of model-based reward expectations into the value signal is further supported by efferents of the OFC that reach structures canonical for model-free learning (e.g., the PPTgN, VTA, and VS). (c) 2016 APA, all rights reserved.
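
    The model-free half of this account reduces to temporal-difference learning driven by a reward prediction error, the quantity the VTA-to-VS dopamine signal is commonly identified with. A minimal Q-learning sketch on an invented three-state chain:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_states, n_actions = 3, 2
    Q = np.zeros((n_states, n_actions))
    alpha, gamma, eps = 0.1, 0.9, 0.1

    def step(s, a):
        # Toy environment: action 1 advances toward the reward at the last state
        s_next = min(s + a, n_states - 1)
        r = 1.0 if s_next == n_states - 1 else 0.0
        return s_next, r

    for _ in range(2000):
        s = 0
        while s != n_states - 1:
            a = rng.integers(n_actions) if rng.random() < eps else int(Q[s].argmax())
            s_next, r = step(s, a)
            # Reward prediction error: learning only when expectations are violated
            delta = r + gamma * Q[s_next].max() - Q[s, a]
            Q[s, a] += alpha * delta
            s = s_next

    print(Q.round(2))
    ```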

  15. Implementing three evidence-based program models: early lessons from the Teen Pregnancy Prevention Replication Study.

    Science.gov (United States)

    Kelsey, Meredith; Layzer, Jean

    2014-03-01

    This article describes some of the early implementation challenges faced by nine grantees participating in the Teen Pregnancy Prevention Replication Study and their response to them. The article draws on information collected as part of a comprehensive implementation study. Sources include site and program documents; program officer reports; notes from site investigation, selection and negotiation; ongoing communications with grantees as part of putting the study into place; and semi-structured interviews with program staff. The issues faced by grantees in implementing evidence-based programs designed to prevent teen pregnancy varied by program model. Grantees implementing a classroom-based curriculum faced challenges in delivering the curriculum within the constraints of school schedules and calendars (program length and size of class). Grantees implementing a culturally tailored curriculum faced a series of challenges, including implementing the intervention as part of the regular school curriculum in schools with diverse populations; low attendance when delivered as an after-school program; and resistance on the part of schools to specific curriculum content. The third set of grantees, implementing a program in clinics, faced challenges in identifying and recruiting young women into the program and in retaining young women once they were in the program. The experiences of these grantees reflect some of the complexities that should be carefully considered when choosing to replicate evidence-based programs. The Teen Pregnancy Prevention replication study will provide important context for assessing the effectiveness of some of the more widely replicated evidence-based programs. Copyright © 2014 Society for Adolescent Health and Medicine. All rights reserved.

  16. A Coupled Simulation Architecture for Agent-Based/Geohydrological Modelling

    Science.gov (United States)

    Jaxa-Rozen, M.

    2016-12-01

    The quantitative modelling of social-ecological systems can provide useful insights into the interplay between social and environmental processes, and their impact on emergent system dynamics. However, such models should acknowledge the complexity and uncertainty of both of the underlying subsystems. For instance, the agent-based models which are increasingly popular for groundwater management studies can be made more useful by directly accounting for the hydrological processes which drive environmental outcomes. Conversely, conventional environmental models can benefit from an agent-based depiction of the feedbacks and heuristics which influence the decisions of groundwater users. From this perspective, this work describes a Python-based software architecture which couples the popular NetLogo agent-based platform with the MODFLOW/SEAWAT geohydrological modelling environment. This approach enables users to implement agent-based models in NetLogo's user-friendly platform, while benefiting from the full capabilities of MODFLOW/SEAWAT packages or reusing existing geohydrological models. The software architecture is based on the pyNetLogo connector, which provides an interface between the NetLogo agent-based modelling software and the Python programming language. This functionality is then extended and combined with Python's object-oriented features, to design a simulation architecture which couples NetLogo with MODFLOW/SEAWAT through the FloPy library (Bakker et al., 2016). The Python programming language also provides access to a range of external packages which can be used for testing and analysing the coupled models, which is illustrated for an application of Aquifer Thermal Energy Storage (ATES).
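
    A condensed sketch of the coupling loop the abstract describes, assuming a hypothetical NetLogo model of groundwater users and an existing MODFLOW model on disk. The file names, the NetLogo reporters and procedures (pumping-rate, decide-pumping, update-heads), and the well-to-cell mapping are all placeholders; the loaded MODFLOW model is assumed to already contain a WEL package, and depending on the installed version the connector is imported as pyNetLogo or pynetlogo.

    ```python
    import flopy
    import pyNetLogo

    netlogo = pyNetLogo.NetLogoLink(gui=False)
    netlogo.load_model("groundwater_users.nlogo")   # hypothetical agent-based model
    netlogo.command("setup")

    mf = flopy.modflow.Modflow.load("aquifer.nam", model_ws="mf_model")

    for year in range(20):
        # 1. Agents choose pumping rates given the heads they currently observe
        netlogo.command("decide-pumping")
        rates = netlogo.report("map [a -> [pumping-rate] of a] sort turtles")

        # 2. Push those rates into the well package (one well per agent; the
        #    layer/row/column mapping here is an assumption of this sketch)
        mf.remove_package("WEL")
        wel_data = {0: [[0, i, i, -float(q)] for i, q in enumerate(rates)]}
        flopy.modflow.ModflowWel(mf, stress_period_data=wel_data)
        mf.write_input()
        mf.run_model(silent=True)

        # 3. Feed the resulting heads back into the agents' environment
        heads = flopy.utils.HeadFile("mf_model/aquifer.hds").get_data()
        netlogo.command(f"update-heads {heads.mean():.2f}")

    netlogo.kill_workspace()
    ```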

  17. A continuum mechanics-based musculo-mechanical model for esophageal transport

    Science.gov (United States)

    Kou, Wenjun; Griffith, Boyce E.; Pandolfino, John E.; Kahrilas, Peter J.; Patankar, Neelesh A.

    2017-11-01

    In this work, we extend our previous esophageal transport model, which used an immersed boundary (IB) method with a discrete fiber-based structural model, to one using a continuum mechanics-based model approximated with finite elements (IB-FE). To deal with the leakage of flow when the Lagrangian mesh becomes coarser than the fluid mesh, we employ adaptive interaction quadrature points in the Lagrangian-Eulerian interaction equations, following previous work (Griffith and Luo [1]). In particular, we introduce a new anisotropic adaptive interaction quadrature rule. The new rule permits us to vary the interaction quadrature points not only at each time-step and element but also at different orientations per element. This helps to avoid the leakage issue without sacrificing computational efficiency or accuracy in dealing with the interaction equations. For the material model, we extend our previous fiber-based model to a continuum-based model and present formulations for general fiber-reinforced material models in the IB-FE framework. The new material model can handle nonlinear elasticity and fiber-matrix interactions, and thus permits us to consider more realistic material behavior of biological tissues. To validate our method, we first study a case in which a three-dimensional short tube is dilated. Results on the pressure-displacement relationship and the stress distribution match very well with those obtained from the implicit FE method. We remark that in our IB-FE case the three-dimensional tube undergoes a very large deformation and the Lagrangian mesh-size becomes about 6 times the Eulerian mesh-size in the circumferential orientation. To validate the performance of the method in handling fiber-matrix material models, we perform a second study on dilating a long fiber-reinforced tube. Errors are small when we compare numerical solutions with analytical solutions. The technique is then applied to the problem of esophageal transport. We use two…

  18. Modeling and Experimental Study of Soft Error Propagation Based on Cellular Automaton

    Directory of Open Access Journals (Sweden)

    Wei He

    2016-01-01

    Full Text Available Aiming to estimate SEE soft error performance of complex electronic systems, a soft error propagation model based on cellular automaton is proposed and an estimation methodology based on circuit partitioning and error propagation is presented. Simulations indicate that different fault grade jamming and different coupling factors between cells are the main parameters influencing the vulnerability of the system. Accelerated radiation experiments have been developed to determine the main parameters for raw soft error vulnerability of the module and coupling factors. Results indicate that the proposed method is feasible.
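
    A toy cellular-automaton sketch of the propagation idea: a single upset cell corrupts neighbors with a probability set by a coupling factor, echoing the finding that inter-cell coupling drives system vulnerability. Grid size, coupling value, and the update rule are illustrative assumptions, not the paper's model.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n, steps, coupling = 32, 10, 0.25

    state = np.zeros((n, n), dtype=bool)
    state[n // 2, n // 2] = True  # initial single-event upset

    for _ in range(steps):
        # Count corrupted von Neumann neighbors (toroidal wraparound)
        neighbors = (
            np.roll(state, 1, 0).astype(int) + np.roll(state, -1, 0)
            + np.roll(state, 1, 1) + np.roll(state, -1, 1)
        )
        # Each corrupted neighbor independently infects with prob = coupling
        p_infect = 1.0 - (1.0 - coupling) ** neighbors
        state |= rng.random((n, n)) < p_infect

    print(f"corrupted cells after {steps} steps: {state.sum()}")
    ```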

  19. Understanding Elementary Astronomy by Making Drawing-Based Models

    Science.gov (United States)

    van Joolingen, W. R.; Aukes, Annika V.; Gijlers, H.; Bollen, L.

    2015-01-01

    Modeling is an important approach in the teaching and learning of science. In this study, we attempt to bring modeling within the reach of young children by creating the SimSketch modeling system, which is based on freehand drawings that can be turned into simulations. This system was used by 247 children (ages ranging from 7 to 15) to create a…

  20. Model For Marketing Strategy Decision Based On Multicriteria Decicion Making: A Case Study In Batik Madura Industry

    Science.gov (United States)

    Anna, I. D.; Cahyadi, I.; Yakin, A.

    2018-01-01

    Selection of a marketing strategy is a prominent competitive advantage for small and medium enterprise business development. The selection process is a multiple criteria decision-making problem, which includes the evaluation of various attributes or criteria in the process of strategy formulation. The objective of this paper is to develop a model for the selection of a marketing strategy in the Batik Madura industry. The current study proposes an integrated approach based on the analytic network process (ANP) and the technique for order preference by similarity to ideal solution (TOPSIS) to determine the best strategy for Batik Madura marketing problems. Based on the results of a group decision-making technique, this study selected fourteen criteria (consistency, cost, trend following, customer loyalty, business volume, uniqueness, manpower, customer numbers, promotion, branding, business network, outlet location, credibility and innovation) as Batik Madura marketing strategy evaluation criteria. A survey questionnaire developed from a literature review was distributed to a sample frame of Batik Madura SMEs in Pamekasan. In the decision procedure step, expert evaluators were asked to establish the decision matrix by comparing the marketing strategy alternatives under each of the individual criteria. Then, considerations obtained from the ANP and TOPSIS methods were applied to build the specific criteria constraints and the range of the launch strategy in the model. The model in this study demonstrates that, under the current business situation, the Straight-focus marketing strategy is the best marketing strategy for Batik Madura SMEs in Pamekasan.
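
    The TOPSIS step is mechanical enough to sketch directly. In the paper's approach the weights would come from ANP; here the weights, the decision matrix, and the reduced criteria set are invented for illustration.

    ```python
    import numpy as np

    # rows: strategy alternatives, cols: criteria (second column is a cost criterion)
    X = np.array([
        [7.0, 3.0, 8.0, 6.0],   # straight-focus
        [5.0, 4.0, 6.0, 7.0],   # differentiation
        [6.0, 2.0, 5.0, 8.0],   # online-first
    ])
    weights = np.array([0.40, 0.20, 0.25, 0.15])     # ANP would supply these
    benefit = np.array([True, False, True, True])    # False -> cost criterion

    V = weights * X / np.linalg.norm(X, axis=0)      # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))

    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    closeness = d_neg / (d_pos + d_neg)              # higher = better

    for name, c in zip(["straight-focus", "differentiation", "online-first"], closeness):
        print(f"{name}: {c:.3f}")
    ```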

  1. Application of a Microstructure-Based ISV Plasticity Damage Model to Study Penetration Mechanics of Metals and Validation through Penetration Study of Aluminum

    Directory of Open Access Journals (Sweden)

    Yangqing Dou

    2017-01-01

    Full Text Available A developed microstructure-based internal state variable (ISV) plasticity damage model is used for the first time to simulate the penetration mechanics of aluminum and determine its penetration properties. The ISV damage model explains the interplay between physics at different length scales that governs the failure and damage mechanisms of materials by linking the macroscopic failure and damage behavior of the materials with their micromechanical performance, such as void nucleation, growth, and coalescence. Within the continuum modeling framework, microstructural features of materials are represented using a set of ISVs, and rate equations are employed to depict the damage history and evolution of the materials. For experimental calibration of this damage model, compression, tension, and torsion straining conditions are considered to distinguish damage evolution under different stress states. To demonstrate the reliability of the presented ISV model, the model is applied to study the penetration mechanics of aluminum, and the numerical results are validated by comparison with simulation results from the Johnson-Cook model as well as analytical results calculated from an existing theoretical model.

  2. The implementation of discovery learning model based on lesson study to increase student's achievement in colloid

    Science.gov (United States)

    Suyanti, Retno Dwi; Purba, Deby Monika

    2017-03-01

    The objective of this research is to determine the increase in students' achievement under a discovery learning model based on lesson study, along with the associated cognitive aspects. The research was conducted in three schools, including SMA N 3 Medan and SMA N 11 Medan. The population comprises all students of SMA N 11 Medan, sampled by purposive random sampling. The research instruments are achievement test instruments that have been validated. The research data were analyzed statistically using MS Excel. The results show that the achievement of students taught with the discovery learning model based on lesson study is higher than that of students taught with the direct instruction method. This can be seen from the average gain and is confirmed by a t-test: the normalized gain in the experimental class of SMA N 11 is (0.74±0.12) and in the control class (0.45±0.12); at significance level α = 0.05, Ha is accepted and Ho is rejected, since t_count > t_table in SMA N 11 (9.81 > 1.66). The cognitive aspect with the greatest improvement across the schools is C2, reaching 0.84 (high) in SMA N 11. The lesson study observation sheets from SMA N 11 show 92% of students working together, while 67% were less active in using media.
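
    The reported statistics follow a standard recipe: Hake's normalized gain g = (post - pre) / (100 - pre) per student, compared between classes with an independent-samples t-test. The score arrays below are synthetic, generated to resemble the reported gains.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    pre_exp, pre_ctl = rng.normal(40, 8, 30), rng.normal(40, 8, 30)
    # Synthetic per-student gain factors near the reported class means
    g_true_exp = np.clip(rng.normal(0.74, 0.12, 30), 0, 1)
    g_true_ctl = np.clip(rng.normal(0.45, 0.12, 30), 0, 1)
    post_exp = pre_exp + (100 - pre_exp) * g_true_exp
    post_ctl = pre_ctl + (100 - pre_ctl) * g_true_ctl

    g_exp = (post_exp - pre_exp) / (100 - pre_exp)   # Hake's normalized gain
    g_ctl = (post_ctl - pre_ctl) / (100 - pre_ctl)

    t, p = stats.ttest_ind(g_exp, g_ctl)
    print(f"g(exp) = {g_exp.mean():.2f}, g(ctl) = {g_ctl.mean():.2f}")
    print(f"t = {t:.2f}, one-tailed p = {p / 2:.2g}")
    ```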

  3. Cloud-Based Model Calibration Using OpenStudio: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Hale, E.; Lisell, L.; Goldwasser, D.; Macumber, D.; Dean, J.; Metzger, I.; Parker, A.; Long, N.; Ball, B.; Schott, M.; Weaver, E.; Brackney, L.

    2014-03-01

    OpenStudio is a free, open source Software Development Kit (SDK) and application suite for performing building energy modeling and analysis. The OpenStudio Parametric Analysis Tool has been extended to allow cloud-based simulation of multiple OpenStudio models parametrically related to a baseline model. This paper describes the new cloud-based simulation functionality and presents a model calibration case study. Calibration is initiated by entering actual monthly utility bill data into the baseline model. Multiple parameters are then varied over multiple iterations to reduce the difference between actual energy consumption and model simulation results, as calculated and visualized by billing period and by fuel type. Simulations are performed in parallel using the Amazon Elastic Cloud service. This paper highlights model parameterizations (measures) used for calibration, but the same multi-nodal computing architecture is available for other purposes, for example, recommending combinations of retrofit energy saving measures using the calibrated model as the new baseline.
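
    The calibration objective implied here, the gap between actual utility bills and simulated consumption by billing period, is conventionally scored with NMBE and CV(RMSE), the ASHRAE Guideline 14 criteria. The monthly values below are fabricated for illustration.

    ```python
    import numpy as np

    actual = np.array([820, 760, 700, 640, 600, 710,
                       830, 870, 720, 650, 690, 800.0])   # monthly kWh (invented)
    simulated = np.array([790, 780, 690, 610, 640, 700,
                          860, 840, 700, 680, 660, 820.0])

    n, mean = len(actual), actual.mean()
    nmbe = (actual - simulated).sum() / ((n - 1) * mean) * 100
    cvrmse = np.sqrt(((actual - simulated) ** 2).sum() / (n - 1)) / mean * 100

    print(f"NMBE = {nmbe:.1f}%  (Guideline 14 monthly target: within +/-5%)")
    print(f"CV(RMSE) = {cvrmse:.1f}%  (monthly target: below 15%)")
    ```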

  4. Comparative Study of SSVEP- and P300-Based Models for the Telepresence Control of Humanoid Robots.

    Directory of Open Access Journals (Sweden)

    Jing Zhao

    Full Text Available In this paper, we evaluate the control performance of SSVEP (steady-state visual evoked potential)- and P300-based models using Cerebot, a mind-controlled humanoid robot platform. Seven subjects with diverse experience participated in experiments concerning the open-loop and closed-loop control of a humanoid robot via brain signals. The visual stimuli of both the SSVEP- and P300-based models were implemented on an LCD computer monitor with a refresh frequency of 60 Hz. Considering operational safety, we set a classification accuracy above 90.0% as the most important requirement for telepresence control of the humanoid robot. The open-loop experiments demonstrated that the SSVEP model with at most four stimulus targets achieved an average accuracy of about 90%, whereas the P300 model with six or more stimulus targets under five repetitions per trial was able to achieve accuracy rates over 90.0%. Therefore, four SSVEP stimuli were used to control four types of robot behavior, while six P300 stimuli were chosen to control six types of robot behavior. The 4-class SSVEP and 6-class P300 models achieved average success rates of 90.3% and 91.3%, average response times of 3.65 s and 6.6 s, and average information transfer rates (ITR) of 24.7 bits/min and 18.8 bits/min, respectively. The closed-loop experiments addressed the telepresence control of the robot; the objective was to cause the robot to walk along a white lane marked in an office environment using live video feedback. Comparative studies reveal that the SSVEP model yielded a faster response to the subject's mental activity with less reliance on channel selection, whereas the P300 model was found to be suitable for more classifiable targets and required less training. To conclude, we discuss the existing SSVEP and P300 models for the control of humanoid robots, including the models proposed in this paper.
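
    The bits/min figures follow from the standard Wolpaw ITR formula; plugging in the reported class counts, accuracies, and trial times gives values in the same range (exact agreement depends on the timing conventions used).

    ```python
    import math

    def itr_bits_per_min(n_classes: int, accuracy: float, trial_seconds: float) -> float:
        """Wolpaw information transfer rate for an n-class selection task."""
        p, n = accuracy, n_classes
        bits = (math.log2(n) + p * math.log2(p)
                + (1 - p) * math.log2((1 - p) / (n - 1)))
        return bits * 60.0 / trial_seconds

    print(f"SSVEP: {itr_bits_per_min(4, 0.903, 3.65):.1f} bits/min")
    print(f"P300:  {itr_bits_per_min(6, 0.913, 6.6):.1f} bits/min")
    ```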

  5. Study of a risk-based piping inspection guideline system.

    Science.gov (United States)

    Tien, Shiaw-Wen; Hwang, Wen-Tsung; Tsai, Chih-Hung

    2007-02-01

    A risk-based inspection system and a piping inspection guideline model were developed in this study. The research procedure consists of two parts: building a risk-based inspection model for piping and constructing a risk-based piping inspection guideline model. Field visits to the plant were conducted to develop the risk-based inspection and strategic analysis system. A knowledge-based model was built in accordance with international standards and local government regulations, and the rational unified process was applied to reduce discrepancies in the development of the models. The models were designed to analyze damage factors, damage models, and potential damage positions of piping in petrochemical plants. The purpose of this study was to provide inspection-related personnel with optimal planning tools for piping inspections, enabling effective prediction of potential piping risks and enhancing the degree of safety that petrochemical plants can be expected to achieve in their operations. A risk analysis was conducted on the piping system of a petrochemical plant. The outcome indicated that most of the risk resulted from a small number of pipelines.

  6. Visualization study of operators' plant knowledge model

    International Nuclear Information System (INIS)

    Kanno, Tarou; Furuta, Kazuo; Yoshikawa, Shinji

    1999-03-01

    Nuclear plants are typically very complicated systems, and extremely high levels of safety are required in their operation. Since it is never possible to include all possible anomaly scenarios in an education/training curriculum, plant knowledge formation is desired for operators, to enable them to act against unexpected anomalies through knowledge-based decision making. The authors have conducted a study on operators' plant knowledge models for the purpose of supporting operators' efforts in forming this kind of plant knowledge. In this report, an integrated plant knowledge model consisting of a configuration space, a causality space, a goal space and a status space is proposed. The authors examined the appropriateness of this model and developed a prototype system to support knowledge formation by visualizing the operators' knowledge model and the decision-making process in knowledge-based actions with this model on a software system. Finally, the feasibility of this prototype as a supportive method in operator education/training to enhance operators' ability in knowledge-based performance has been evaluated. (author)

  7. SEP modeling based on global heliospheric models at the CCMC

    Science.gov (United States)

    Mays, M. L.; Luhmann, J. G.; Odstrcil, D.; Bain, H. M.; Schwadron, N.; Gorby, M.; Li, Y.; Lee, K.; Zeitlin, C.; Jian, L. K.; Lee, C. O.; Mewaldt, R. A.; Galvin, A. B.

    2017-12-01

    Heliospheric models provide contextual information of conditions in the heliosphere, including the background solar wind conditions and shock structures, and are used as input to SEP models, providing an essential tool for understanding SEP properties. The global 3D MHD WSA-ENLIL+Cone model provides a time-dependent background heliospheric description, into which a spherical shaped hydrodynamic CME can be inserted. ENLIL simulates solar wind parameters and additionally one can extract the magnetic topologies of observer-connected magnetic field lines and all plasma and shock properties along those field lines. An accurate representation of the background solar wind is necessary for simulating transients. ENLIL simulations also drive SEP models such as the Solar Energetic Particle Model (SEPMOD) (Luhmann et al. 2007, 2010) and the Energetic Particle Radiation Environment Module (EPREM) (Schwadron et al. 2010). The Community Coordinated Modeling Center (CCMC) is in the process of making these SEP models available to the community and offering a system to run SEP models driven by a variety of heliospheric models available at CCMC. SEPMOD injects protons onto a sequence of observer field lines at intensities dependent on the connected shock source strength which are then integrated at the observer to approximate the proton flux. EPREM couples with MHD models such as ENLIL and computes energetic particle distributions based on the focused transport equation along a Lagrangian grid of nodes that propagate out with the solar wind. The coupled ENLIL and SEP models allow us to derive the longitudinal distribution of SEP profiles of different types of events throughout the heliosphere. In this presentation we demonstrate several case studies of SEP event modeling at different observers based on WSA-ENLIL+Cone simulations.

  8. Mixed models in cerebral ischemia study

    Directory of Open Access Journals (Sweden)

    Matheus Henrique Dal Molin Ribeiro

    2016-06-01

    Full Text Available Data modeling for longitudinal studies stands out in the current scientific scenario, especially in the health and biological sciences, where repeated measurements on the same observed unit are correlated. Modeling the intra-individual dependency is therefore required, through the choice of a covariance structure that is able to accommodate the sample variability. A lack of appropriate methodology for correlated data analysis may result in an increased occurrence of type I or type II errors and in underestimated or overestimated standard errors of the model estimates. In the present study, a Gaussian mixed model was adopted for the response variable latency in an experiment investigating memory deficits in animals subjected to cerebral ischemia when treated with fish oil (FO). Model parameter estimation was based on maximum likelihood methods. Based on the restricted likelihood ratio test and information criteria, an autoregressive covariance matrix was adopted for the errors. The diagnostic analyses for the model were satisfactory, since the basic assumptions held and the results corroborate the biological evidence; that is, the FO treatment was found to be effective in alleviating the cognitive effects caused by cerebral ischemia.

  9. Rule-based modeling: a computational approach for studying biomolecular site dynamics in cell signaling systems

    Science.gov (United States)

    Chylek, Lily A.; Harris, Leonard A.; Tung, Chang-Shung; Faeder, James R.; Lopez, Carlos F.

    2013-01-01

    Rule-based modeling was developed to address the limitations of traditional approaches for modeling chemical kinetics in cell signaling systems. These systems consist of multiple interacting biomolecules (e.g., proteins), which themselves consist of multiple parts (e.g., domains, linear motifs, and sites of phosphorylation). Consequently, biomolecules that mediate information processing generally have the potential to interact in multiple ways, with the number of possible complexes and post-translational modification states tending to grow exponentially with the number of binary interactions considered. As a result, only large reaction networks capture all possible consequences of the molecular interactions that occur in a cell signaling system, which is problematic because traditional modeling approaches for chemical kinetics (e.g., ordinary differential equations) require explicit network specification. This problem is circumvented through representation of interactions in terms of local rules. With this approach, network specification is implicit and model specification is concise. Concise representation results in a coarse graining of chemical kinetics, which is introduced because all reactions implied by a rule inherit the rate law associated with that rule. Coarse graining can be appropriate if interactions are modular, and the coarseness of a model can be adjusted as needed. Rules can be specified using specialized model-specification languages, and recently developed tools designed for specification of rule-based models allow one to leverage powerful software engineering capabilities. A rule-based model comprises a set of rules, which can be processed by general-purpose simulation and analysis tools to achieve different objectives (e.g., to perform either a deterministic or stochastic simulation). PMID:24123887
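
    The combinatorial point is easy to make concrete: one phosphorylation rule applied to a protein with a few independent sites implies a whole family of reactions, all inheriting the rule's single rate constant. The site names and rate below are invented.

    ```python
    from itertools import product

    sites = ["Y100", "Y200", "Y300"]
    k_phos = 0.5  # one rate law shared by every reaction the rule implies

    # Every species is a tuple of 0/1 flags, one per site -> 2**3 = 8 species
    reactions = []
    for state in product([0, 1], repeat=len(sites)):
        for i, occupied in enumerate(state):
            if not occupied:  # rule: any unphosphorylated site gets phosphorylated
                after = tuple(1 if j == i else s for j, s in enumerate(state))
                reactions.append((state, sites[i], after, k_phos))

    print(f"{2 ** len(sites)} species, {len(reactions)} reactions from one rule")
    for before, site, after, k in reactions[:3]:
        print(before, f"--{site}-->", after, f"k={k}")
    ```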

  10. Rule-based modeling: a computational approach for studying biomolecular site dynamics in cell signaling systems.

    Science.gov (United States)

    Chylek, Lily A; Harris, Leonard A; Tung, Chang-Shung; Faeder, James R; Lopez, Carlos F; Hlavacek, William S

    2014-01-01

    Rule-based modeling was developed to address the limitations of traditional approaches for modeling chemical kinetics in cell signaling systems. These systems consist of multiple interacting biomolecules (e.g., proteins), which themselves consist of multiple parts (e.g., domains, linear motifs, and sites of phosphorylation). Consequently, biomolecules that mediate information processing generally have the potential to interact in multiple ways, with the number of possible complexes and posttranslational modification states tending to grow exponentially with the number of binary interactions considered. As a result, only large reaction networks capture all possible consequences of the molecular interactions that occur in a cell signaling system, which is problematic because traditional modeling approaches for chemical kinetics (e.g., ordinary differential equations) require explicit network specification. This problem is circumvented through representation of interactions in terms of local rules. With this approach, network specification is implicit and model specification is concise. Concise representation results in a coarse graining of chemical kinetics, which is introduced because all reactions implied by a rule inherit the rate law associated with that rule. Coarse graining can be appropriate if interactions are modular, and the coarseness of a model can be adjusted as needed. Rules can be specified using specialized model-specification languages, and recently developed tools designed for specification of rule-based models allow one to leverage powerful software engineering capabilities. A rule-based model comprises a set of rules, which can be processed by general-purpose simulation and analysis tools to achieve different objectives (e.g., to perform either a deterministic or stochastic simulation). © 2013 Wiley Periodicals, Inc.

  11. Case Study for the Return on Investment of Internet of Things Using Agent-Based Modelling and Data Science

    Directory of Open Access Journals (Sweden)

    Charles Houston

    2017-01-01

    Full Text Available As technology advances towards new paradigms such as the Internet of Things, there is a desire among business leaders for a reliable method to determine the value of supporting these ventures. Traditional simulation and analysis techniques cannot model the complex systems inherent in fields such as infrastructure asset management, or suffer from a lack of data on which to build a prediction. Agent-based modelling, through an integration with data science, presents an attractive simulation method to capture these underlying complexities and provide a solution. The aim of this work is to investigate this integration as a refined process for answering practical business questions. A specific case study is addressed to assess the return on investment of installing condition monitoring sensors on lift assets in a London Underground station. An agent-based model is developed for this purpose, supported by analysis from historical data. The simulation results demonstrate how returns can be achieved and highlight features induced as a result of stochasticity in the model. Suggestions of future research paths are additionally outlined.

  12. Knowledge-Based Environmental Context Modeling

    Science.gov (United States)

    Pukite, P. R.; Challou, D. J.

    2017-12-01

    As we move from the oil-age to an energy infrastructure based on renewables, the need arises for new educational tools to support the analysis of geophysical phenomena and their behavior and properties. Our objective is to present models of these phenomena to make them amenable for incorporation into more comprehensive analysis contexts. Starting at the level of a college-level computer science course, the intent is to keep the models tractable and therefore practical for student use. Based on research performed via an open-source investigation managed by DARPA and funded by the Department of Interior [1], we have adapted a variety of physics-based environmental models for a computer-science curriculum. The original research described a semantic web architecture based on patterns and logical archetypal building-blocks (see figure) well suited for a comprehensive environmental modeling framework. The patterns span a range of features that cover specific land, atmospheric and aquatic domains intended for engineering modeling within a virtual environment. The modeling engine contained within the server relied on knowledge-based inferencing capable of supporting formal terminology (through NASA JPL's Semantic Web for Earth and Environmental Technology (SWEET) ontology and a domain-specific language) and levels of abstraction via integrated reasoning modules. One of the key goals of the research was to simplify models that were ordinarily computationally intensive to keep them lightweight enough for interactive or virtual environment contexts. The breadth of the elements incorporated is well-suited for learning as the trend toward ontologies and applying semantic information is vital for advancing an open knowledge infrastructure. As examples of modeling, we have covered such geophysics topics as fossil-fuel depletion, wind statistics, tidal analysis, and terrain modeling, among others. Techniques from the world of computer science will be necessary to promote efficient

  13. Testing the Community-Based Learning Collaborative (CBLC) implementation model: a study protocol.

    Science.gov (United States)

    Hanson, Rochelle F; Schoenwald, Sonja; Saunders, Benjamin E; Chapman, Jason; Palinkas, Lawrence A; Moreland, Angela D; Dopp, Alex

    2016-01-01

    High rates of youth exposure to violence, either through direct victimization or witnessing, result in significant health/mental health consequences and high associated lifetime costs. Evidence-based treatments (EBTs), such as Trauma-Focused Cognitive Behavioral Therapy (TF-CBT), can prevent and/or reduce these negative effects, yet these treatments are not standard practice for therapists working with children identified by child welfare or mental health systems as needing services. While research indicates that collaboration among child welfare and mental health services sectors improves availability and sustainment of EBTs for children, few implementation strategies designed specifically to promote and sustain inter-professional collaboration (IC) and inter-organizational relationships (IOR) have undergone empirical investigation. A potential candidate for evaluation is the Community-Based Learning Collaborative (CBLC) implementation model, an adaptation of the Learning Collaborative which includes strategies designed to develop and strengthen inter-professional relationships between brokers and providers of mental health services to promote IC and IOR and achieve sustained implementation of EBTs for children within a community. This non-experimental, mixed methods study involves two phases: (1) analysis of existing prospective quantitative and qualitative quality improvement and project evaluation data collected pre and post, weekly, and monthly from 998 participants in one of seven CBLCs conducted as part of a statewide initiative; and (2) Phase 2 collection of new quantitative and qualitative (key informant interviews) data during the funded study period to evaluate changes in relations among IC, IOR, social networks and the penetration and sustainment of TF-CBT in targeted communities. Recruitment for Phase 2 is from the pool of 998 CBLC participants to achieve a targeted enrollment of n = 150. Study aims include: (1) Use existing quality improvement

  14. Computer-Based Molecular Modelling: Finnish School Teachers' Experiences and Views

    Science.gov (United States)

    Aksela, Maija; Lundell, Jan

    2008-01-01

    Modern computer-based molecular modelling opens up new possibilities for chemistry teaching at different levels. This article presents a case study seeking insight into Finnish school teachers' use of computer-based molecular modelling in teaching chemistry, into the different working and teaching methods used, and their opinions about necessary…

  15. A Causal Model of Consumer-Based Brand Equity

    Directory of Open Access Journals (Sweden)

    Szőcs Attila

    2015-12-01

    Full Text Available Branding literature suggests that consumer-based brand equity (CBBE) is a multidimensional construct. Starting from this approach and developing a conceptual multidimensional model, this study finds that CBBE is best modelled with a two-dimensional structure, a result achieved by choosing the theoretically grounded causal specification. By contrast, with a reflective specification one will be able to fit almost any valid construct because of the halo effect and common method bias. In the final model, Trust (in quality) and Advantage cause the second-order Brand Equity. The two-dimensional brand equity model is intuitive, easy to interpret and easy to measure, and may thus be much more attractive to management.

  16. Agent-based models in economics a toolkit

    CERN Document Server

    Fagiolo, Giorgio; Gallegati, Mauro; Richiardi, Matteo; Russo, Alberto

    2018-01-01

    In contrast to mainstream economics, complexity theory conceives the economy as a complex system of heterogeneous interacting agents characterised by limited information and bounded rationality. Agent Based Models (ABMs) are the analytical and computational tools developed by the proponents of this emerging methodology. Aimed at students and scholars of contemporary economics, this book includes a comprehensive toolkit for agent-based computational economics, now quickly becoming the new way to study evolving economic systems. Leading scholars in the field explain how ABMs can be applied fruitfully to many real-world economic examples and represent a great advancement over mainstream approaches. The essays discuss the methodological bases of agent-based approaches and demonstrate step-by-step how to build, simulate and analyse ABMs and how to validate their outputs empirically using the data. They also present a wide set of applications of these models to key economic topics, including the business cycle, lab...

  17. A Novel Combined Model Based on an Artificial Intelligence Algorithm—A Case Study on Wind Speed Forecasting in Penglai, China

    Directory of Open Access Journals (Sweden)

    Feiyu Zhang

    2016-06-01

    Full Text Available Wind speed forecasting plays a key role in wind-related engineering studies and is important in the management of wind farms. Current forecasting models based on different optimization algorithms can be adapted to various wind speed time series data. However, these methodologies cannot aggregate different hybrid forecasting methods and take advantage of the component models. To avoid these limitations, we propose a novel combined forecasting model called SSA-PSO-DWCM, i.e., a particle swarm optimization (PSO) determined weight coefficients model. This model consists of three main steps: decomposition of the original wind speed signals to discard the noise, parameter optimization of the forecasting method, and combination of the different models in a nonlinear way. The proposed combined model is examined by forecasting the wind speed (10-min intervals) of wind turbine 5, located in the Penglai region of China. The simulations reveal that the proposed combined model demonstrates a more reliable forecast than the component forecasting engines and the traditional combined method, which is based on a linear method.
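
    The record names PSO-determined weight coefficients but not their algebra. The sketch below illustrates the general idea under stated assumptions: a small particle swarm searches for weights that blend two synthetic component forecasts so as to minimize validation error. It is not the authors' SSA-PSO-DWCM implementation; all data and settings are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy validation data: true wind speeds and two hypothetical component forecasts.
y_true = rng.uniform(4, 12, size=100)
f1 = y_true + rng.normal(0.0, 1.0, size=100)   # e.g. a statistical forecast
f2 = y_true + rng.normal(0.5, 0.7, size=100)   # e.g. a neural-net forecast

def rmse(w):
    combined = w[0] * f1 + w[1] * f2           # combination step being optimized
    return np.sqrt(np.mean((combined - y_true) ** 2))

# Minimal particle swarm optimization over the two combination weights.
n_particles, n_iter = 20, 100
pos = rng.uniform(0, 1, (n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([rmse(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.uniform(size=(2, n_particles, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([rmse(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("combined-forecast weights:", gbest, "RMSE:", rmse(gbest))
```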

  13. DEVELOPMENT OF SCIENCE PROCESS SKILLS STUDENTS WITH PROJECT BASED LEARNING MODEL-BASED TRAINING IN LEARNING PHYSICS

    Directory of Open Access Journals (Sweden)

    Ratna Malawati

    2016-06-01

    Full Text Available This study aims to improve students' physics Science Process Skills in the cognitive and psychomotor domains by using a training-based Project Based Learning model. The object of this study is the Project Based Learning model used in the learning process of Computational Physics. The method used is classroom action research through two learning cycles, each cycle consisting of the stages of planning, implementation, observation and reflection. In the first cycle, treatment emphasised training in the first through third phases of the Project Based Learning model, while in the second cycle additional treatment emphasised collaborative discussion to achieve the best product from each group. The results of data analysis showed improved thinking ability of students in the cognitive domain and improved Science Process Skills in the psychomotor domain.

  19. Forcefields based molecular modeling on the mechanical and physical properties of emeraldine base polyaniline

    NARCIS (Netherlands)

    Chen, X.; Yuan, C.A.; Wong, K.Y.; Zhang, G.Q.

    2010-01-01

    Molecular dynamics (MD) and molecular mechanics (MM) analyses are carried out to provide a reliable and accurate model for emeraldine base polyaniline. This study validates the forcefields and the model against the physical and mechanical properties of the polyaniline. The temperature effects on non-bond

  20. Evidence-based quantification of uncertainties induced via simulation-based modeling

    International Nuclear Information System (INIS)

    Riley, Matthew E.

    2015-01-01

    The quantification of uncertainties in simulation-based modeling traditionally focuses upon quantifying uncertainties in the parameters input into the model, referred to as parametric uncertainties. Often neglected in such an approach are the uncertainties induced by the modeling process itself. This deficiency is often due to a lack of information regarding the problem or the models considered, which could theoretically be reduced through the introduction of additional data. Because of the nature of this epistemic uncertainty, traditional probabilistic frameworks utilized for the quantification of uncertainties are not necessarily applicable to quantify the uncertainties induced in the modeling process itself. This work develops and utilizes a methodology – incorporating aspects of Dempster–Shafer Theory and Bayesian model averaging – to quantify uncertainties of all forms for simulation-based modeling problems. The approach expands upon classical parametric uncertainty approaches, allowing for the quantification of modeling-induced uncertainties as well, ultimately providing bounds on classical probability without the loss of epistemic generality. The approach is demonstrated on two different simulation-based modeling problems: the computation of the natural frequency of a simple two degree of freedom non-linear spring mass system and the calculation of the flutter velocity coefficient for the AGARD 445.6 wing given a subset of commercially available modeling choices. - Highlights: • Modeling-induced uncertainties are often mishandled or ignored in the literature. • Modeling-induced uncertainties are epistemic in nature. • Probabilistic representations of modeling-induced uncertainties are restrictive. • Evidence theory and Bayesian model averaging are integrated. • Developed approach is applicable for simulation-based modeling problems

  1. Research on Multi-Person Parallel Modeling Method Based on Integrated Model Persistent Storage

    Science.gov (United States)

    Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Liu, Ying

    2018-03-01

    This paper mainly studies a multi-person parallel modeling method based on integrated model persistent storage. The integrated model refers to a set of MDDT modeling graphics systems, which can carry out multi-angle, multi-level and multi-stage description of aerospace general embedded software. Persistent storage refers to converting the data model in memory into a storage model and converting the storage model back into a data model in memory, where the data model refers to the object model and the storage model is a binary stream. Multi-person parallel modeling refers to the need for multi-person collaboration, separation of roles, and even real-time remote synchronization during modeling.

  2. Comparison of Highly Resolved Model-Based Exposure Metrics for Traffic-Related Air Pollutants to Support Environmental Health Studies

    Directory of Open Access Journals (Sweden)

    Shih Ying Chang

    2015-12-01

    Full Text Available Human exposure to air pollution in many studies is represented by ambient concentrations from space-time kriging of observed values. Space-time kriging techniques based on a limited number of ambient monitors may fail to capture the concentration from local sources. Further, because people spend more time indoors, using ambient concentration to represent exposure may cause error. To quantify the associated exposure error, we computed a series of six different hourly-based exposure metrics at 16,095 Census blocks of three counties in North Carolina for CO, NOx, PM2.5, and elemental carbon (EC) during 2012. These metrics include ambient background concentration from space-time ordinary kriging (STOK), ambient on-road concentration from the Research LINE source dispersion model (R-LINE), a hybrid concentration combining STOK and R-LINE, and their associated indoor concentrations from an indoor infiltration mass balance model. Using the hybrid-based indoor concentration as the standard, the comparison showed that outdoor STOK metrics yielded large error at both the population (67% to 93%) and individual level (average bias between −10% and 95%). For pollutants with a significant contribution from on-road emission (EC and NOx), the on-road based indoor metric performs the best at the population level (error less than 52%). At the individual level, however, the STOK-based indoor concentration performs the best (average bias below 30%). For PM2.5, due to the relatively low contribution from on-road emission (7%), the STOK-based indoor metric performs the best at both the population (error below 40%) and individual level (error below 25%). The results of the study will help future epidemiology studies to select appropriate exposure metrics and reduce potential bias in exposure characterization.
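
    As a hedged illustration of the indoor infiltration mass balance step, the following sketch applies the standard steady-state single-zone relation C_in = P·a·C_out/(a + k). The parameter values are illustrative and are not those used in the study.

```python
def indoor_from_ambient(c_out, penetration=0.8, air_exchange=0.5, decay=0.2):
    """Steady-state single-zone infiltration mass balance.

    c_out: ambient concentration; penetration: particle penetration factor P;
    air_exchange: air-exchange rate a (1/h); decay: deposition/decay rate k (1/h).
    Returns the indoor concentration of outdoor origin. Parameter values are
    illustrative assumptions, not the study's.
    """
    return penetration * air_exchange * c_out / (air_exchange + decay)

# Example: 10 ug/m3 ambient PM2.5 -> indoor concentration of outdoor origin.
print(indoor_from_ambient(10.0))  # ~5.7 ug/m3 with the assumed parameters
```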

  3. Model-based version management system framework

    International Nuclear Information System (INIS)

    Mehmood, W.

    2016-01-01

    In this paper we present a model-based version management system. A Version Management System (VMS), a branch of software configuration management (SCM), aims to provide a controlling mechanism for the evolution of software artifacts created during the software development process. Controlling the evolution requires many activities, such as construction and creation of versions, identification of differences between versions, conflict detection and merging. Traditional VMS systems are file-based and consider software systems as a set of text files. File-based VMS systems are not adequate for performing software configuration management activities, such as version control, on software artifacts produced in earlier phases of the software life cycle. New challenges of model differencing, merging, and evolution control arise while using models as the central artifact. The goal of this work is to present a generic framework for model-based VMS which can be used to overcome the problems of traditional file-based VMS systems and provide model versioning services. (author)

  4. TP-model transformation-based-control design frameworks

    CERN Document Server

    Baranyi, Péter

    2016-01-01

    This book covers new aspects and frameworks of control, design, and optimization based on the TP model transformation and its various extensions. The author outlines the three main steps of polytopic and LMI based control design: 1) development of the qLPV state-space model, 2) generation of the polytopic model; and 3) application of LMI to derive controller and observer. He goes on to describe why literature has extensively studied LMI design, but has not focused much on the second step, in part because the generation and manipulation of the polytopic form was not tractable in many cases. The author then shows how the TP model transformation facilitates this second step and hence reveals new directions, leading to powerful design procedures and the formulation of new questions. The chapters of this book, and the complex dynamical control tasks which they cover, are organized so as to present and analyze the beneficial aspect of the family of approaches (control, design, and optimization). Additionally, the b...

  5. Dynamics of sustained use and abandonment of clean cooking systems: study protocol for community-based system dynamics modeling.

    Science.gov (United States)

    Kumar, Praveen; Chalise, Nishesh; Yadama, Gautam N

    2016-04-26

    More than 3 billion of the world's population are affected by household air pollution from relying on unprocessed solid fuels for heating and cooking. Household air pollution is harmful to human health, climate, and environment. Sustained uptake and use of cleaner cooking technologies and fuels are proposed as solutions to this problem. In this paper, we present our study protocol aimed at understanding multiple interacting feedback mechanisms involved in the dynamic behavior between social, ecological, and technological systems driving sustained use or abandonment of cleaner cooking technologies among the rural poor in India. This study uses a comparative case study design to understand the dynamics of sustained use or abandonment of cleaner cooking technologies and fuels in four rural communities of Rajasthan, India. The study adopts a community based system dynamics modeling approach. We describe our approach of using community based system dynamics with rural communities to delineate the feedback mechanisms involved in the uptake and sustainment of clean cooking technologies. We develop a reference mode with communities showing the trend over time of use or abandonment of cleaner cooking technologies and fuels in these communities. Subsequently, the study develops a system dynamics model with communities to understand the complex sub-systems driving the behavior in these communities as reflected in the reference mode. We use group model building techniques to facilitate participation of relevant stakeholders in the four communities and elicit a narrative describing the feedback mechanisms underlying sustained adoption or abandonment of cleaner cooking technologies. In understanding the dynamics of feedback mechanisms in the uptake and exclusive use of cleaner cooking systems, we increase the likelihood of dissemination and implementation of efficacious interventions into everyday settings to improve the health and wellbeing of women and children most affected
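
    To illustrate the kind of stock-flow structure a system dynamics model of uptake and abandonment might take, here is a minimal sketch with one reinforcing word-of-mouth adoption loop and one abandonment outflow. The structure and rates are illustrative assumptions, not the model elicited from the communities.

```python
# Minimal stock-flow sketch of cookstove uptake/abandonment dynamics.
# The two feedback loops (word-of-mouth adoption, abandonment back to
# traditional stoves) are illustrative, not the communities' actual model.
dt, t_end = 0.1, 60.0                 # time step and horizon, in months
households = 1000.0
users, non_users = 50.0, households - 50.0
adoption_rate, abandonment_rate = 0.02, 0.01

t = 0.0
while t < t_end:
    adopting = adoption_rate * non_users * users / households  # reinforcing loop
    abandoning = abandonment_rate * users                      # balancing outflow
    users += dt * (adopting - abandoning)                      # Euler integration
    non_users = households - users
    t += dt

print(f"clean-stove users after {t_end:.0f} months: {users:.0f} of {households:.0f}")
```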

  6. Model-based explanation of plant knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Huuskonen, P.J. [VTT Electronics, Oulu (Finland). Embedded Software]

    1997-12-31

    This thesis deals with computer explanation of knowledge related to design and operation of industrial plants. The needs for explanation are motivated through case studies and literature reviews. A general framework for analysing plant explanations is presented. Prototypes demonstrate key mechanisms for implementing parts of the framework. Power plants, steel mills, paper factories, and high energy physics control systems are studied to set requirements for explanation. The main problems are seen to be either lack or abundance of information. Design knowledge in particular is found missing at plants. Support systems and automation should be enhanced with ways to explain plant knowledge to the plant staff. A framework is formulated for analysing explanations of plant knowledge. It consists of three parts: 1. a typology of explanation, organised by the class of knowledge (factual, functional, or strategic) and by the target of explanation (processes, automation, or support systems), 2. an identification of explanation tasks generic for the plant domain, and 3. an identification of essential model types for explanation (structural, behavioural, functional, and teleological). The tasks use the models to create the explanations of the given classes. Key mechanisms are discussed to implement the generic explanation tasks. Knowledge representations based on objects and their relations form a vocabulary to model and present plant knowledge. A particular class of models, means-end models, are used to explain plant knowledge. Explanations are generated through searches in the models. Hypertext is adopted to communicate explanations over dialogue based on context. The results are demonstrated in prototypes. The VICE prototype explains the reasoning of an expert system for diagnosis of rotating machines at power plants. The Justifier prototype explains design knowledge obtained from an object-oriented plant design tool. Enhanced access mechanisms into on-line documentation are

  8. Feasibility Study on Tension Estimation Technique for Hanger Cables Using the FE Model-Based System Identification Method

    Directory of Open Access Journals (Sweden)

    Kyu-Sik Park

    2015-01-01

    Full Text Available Hanger cables in suspension bridges are partly constrained by horizontal clamps, so existing tension estimation methods based on a single-cable model are prone to higher errors as the cable gets shorter, making it more sensitive to flexural rigidity. Therefore, inverse analysis and system identification methods based on finite element models have been suggested recently. In this paper, the applicability of system identification methods is investigated using the hanger cables of the Gwang-An bridge. The test results show that the inverse analysis and system identification methods based on finite element models are more reliable than the existing string theory and linear regression method for calculating the tension, in terms of natural-frequency errors. However, the estimation error of the tension can vary according to the accuracy of the finite element model in model-based methods. In particular, the boundary conditions affect the results more profoundly as the cable gets shorter. Therefore, it is important to identify the boundary conditions through experiment where possible. The FE model-based tension estimation method using system identification can take various boundary conditions into account, and since it is not sensitive to the number of natural-frequency inputs, the availability of this system is high.
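
    For context, the string theory the record compares against reduces to the taut-string relation T = 4·m·(L·f_n/n)², which ignores flexural rigidity and clamp constraints, the very effects that degrade it for short hanger cables. A minimal sketch, with hypothetical cable properties:

```python
def string_theory_tension(mass_per_length, length, freq_n, mode_n=1):
    """Taut-string tension estimate from the n-th natural frequency.

    Ignores flexural rigidity and clamp constraints, which is why this
    estimate degrades for short hanger cables, as the record notes.
    """
    return 4.0 * mass_per_length * (length * freq_n / mode_n) ** 2

# Hypothetical hanger cable: 40 kg/m, 25 m long, first natural frequency 2.1 Hz.
print(f"estimated tension: {string_theory_tension(40.0, 25.0, 2.1) / 1e3:.0f} kN")
```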

  9. Physiologically based pharmacokinetic toolkit to evaluate environmental exposures: Applications of the dioxin model to study real life exposures

    Energy Technology Data Exchange (ETDEWEB)

    Emond, Claude, E-mail: claude.emond@biosmc.com [BioSimulation Consulting Inc, Newark, DE (United States); Ruiz, Patricia; Mumtaz, Moiz [Division of Toxicology and Human Health Sciences, Agency for Toxic Substances and Disease Registry, Atlanta, GA (United States)

    2017-01-15

    Chlorinated dibenzo-p-dioxins (CDDs) are a series of mono- to octa-chlorinated homologous chemicals commonly referred to as polychlorinated dioxins. One of the most potent, well-known, and persistent members of this family is 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD). As part of translational research to make computerized models accessible to health risk assessors, we present a Berkeley Madonna recoded version of the human physiologically based pharmacokinetic (PBPK) model used by the U.S. Environmental Protection Agency (EPA) in the recent dioxin assessment. This model incorporates CYP1A2 induction, which is an important metabolic vector that drives dioxin distribution in the human body, and it uses a variable elimination half-life that is body burden dependent. To evaluate the model accuracy, the recoded model predictions were compared with those of the original published model. The simulations performed with the recoded model matched well with those of the original model. The recoded model was then applied to available data sets of real life exposure studies. The recoded model can describe acute and chronic exposures and can be useful for interpreting human biomonitoring data as part of an overall dioxin and/or dioxin-like compounds risk assessment. - Highlights: • The best available dioxin PBPK model for interpreting human biomonitoring data is presented. • The original PBPK model was recoded from acslX to the Berkeley Madonna (BM) platform. • Comparisons were made of the accuracy of the recoded model with the original model. • The model is a useful addition to ATSDR's BM-based PBPK toolkit that supports risk assessors. • The application of the model to real-life exposure data sets is illustrated.
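
    The recoded EPA model itself is not reproduced here; as a hedged analogue of its body-burden-dependent elimination, the sketch below integrates a one-compartment model whose elimination rate varies with burden. The functional form and constants are assumptions for illustration only.

```python
import numpy as np
from scipy.integrate import solve_ivp

def elimination_rate(burden, k_fast=0.05, k_slow=0.005, half_sat=10.0):
    """Body-burden-dependent elimination rate (per day): faster at high burden.

    A stand-in for the variable half-life the record describes; the
    functional form and constants are assumptions, not the EPA model's.
    """
    return k_slow + (k_fast - k_slow) * burden / (burden + half_sat)

def one_compartment(t, y, daily_intake=0.1):
    burden = y[0]
    return [daily_intake - elimination_rate(burden) * burden]

# Ten years of constant daily intake, starting from zero body burden.
sol = solve_ivp(one_compartment, (0.0, 3650.0), [0.0], max_step=1.0)
print(f"body burden after 10 years: {sol.y[0, -1]:.2f} (arbitrary units)")
```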

  10. Business Models for NFC based mobile payments

    OpenAIRE

    Johannes Sang Un Chae; Jonas Hedman

    2015-01-01

    Purpose: The purpose of the paper is to develop a business model framework for NFC-based mobile payment solutions consisting of four mutually interdependent components: the value service, value network, value architecture, and value finance. Design: Using a comparative case study method, the paper investigates Google Wallet and ISIS Mobile Wallet and their underlying business models. Findings: Google Wallet and ISIS Mobile Wallet are focusing on providing an enhanced customer experienc...

  11. Impact of Diabetes Education Based on Type 1 Diabetes management model

    OpenAIRE

    Ocakçı, Ayşe Ferda

    2015-01-01

    Diabetes management is considered to be adversely affected when adolescent-specific education methods are not used. In this study, the Type 1 Diabetes Management Model, which ensures standardisation of diabetes education and was formed by applying the health promotion model (HPM) according to mastery-learning theory, was used. The study was performed to determine the effectiveness of diabetes education based on the “Type 1 Diabetes Management Model” on adolesc...

  12. Simulation-based modeling of building complexes construction management

    Science.gov (United States)

    Shepelev, Aleksandr; Severova, Galina; Potashova, Irina

    2018-03-01

    The study reported here examines the experience in the development and implementation of business simulation games based on network planning and management of high-rise construction. Appropriate network models of different types and levels of detail have been developed; a simulation model including 51 blocks (11 stages combined in 4 units) is proposed.

  13. Towards longitudinal activity-based models of travel demand

    NARCIS (Netherlands)

    Arentze, T.A.; Timmermans, H.J.P.; Lo, H.P.; Leung, Stephen C.H.; Tan, Susanna M.L.

    2008-01-01

    Existing activity-based models of travel demand consider a day as the time unit of observation and predict activity patterns of individuals for a typical or average day. In this study we argue that the use of a time span of one day severely limits the ability of the models to predict responsive

  14. A Physics-Based Modeling Framework for Prognostic Studies

    Science.gov (United States)

    Kulkarni, Chetan S.

    2014-01-01

    Prognostics and Health Management (PHM) methodologies have emerged as one of the key enablers for achieving efficient system level maintenance as part of a busy operations schedule, and lowering overall life cycle costs. PHM is also emerging as a high-priority issue in critical applications, where the focus is on conducting fundamental research in the field of integrated systems health management. The term diagnostics relates to the ability to detect and isolate faults or failures in a system. Prognostics, on the other hand, is the process of predicting health condition and remaining useful life based on current state, previous conditions and future operating conditions. PHM methods combine sensing, data collection, and interpretation of environmental, operational, and performance related parameters to indicate a system's health under its actual application conditions. The development of prognostics methodologies for the electronics field has become more important as more electrical systems are being used to replace traditional systems in several applications in the aeronautics, maritime, and automotive fields. The development of prognostics methods for electronics presents several challenges due to the great variety of components used in a system, a continuous development of new electronics technologies, and a general lack of understanding of how electronics fail. Similarly, with electric unmanned aerial vehicles, electric hybrid cars, and commercial passenger aircraft, we are witnessing a drastic increase in the usage of batteries to power vehicles. However, for battery-powered vehicles to operate at maximum efficiency and reliability, it becomes crucial to both monitor battery health and performance and to predict end of discharge (EOD) and end of useful life (EOL) events. We develop an electrochemistry-based model of Li-ion batteries that captures the significant electrochemical processes, is computationally efficient, captures the effects of aging, and is of suitable

  15. Physically-based modelling of polycrystalline semiconductor devices

    International Nuclear Information System (INIS)

    Lee, S.

    2000-01-01

    Thin-film technology using polycrystalline semiconductors has been widely applied to active-matrix-addressed liquid crystal displays (AMLCDs), where thin-film transistors act as digital pixel switches. Research and development is in progress to integrate the driver circuits around the periphery of the display, resulting in significant cost reduction of connections between rows and columns and the peripheral circuitry. For this latter application, where for instance it is important to control the greyscale voltage level delivered to the pixel, an understanding of device behaviour is required so that models can be developed for analogue circuit simulation. For this purpose, various analytical models have been developed based on that of Seto, who considered the effect of monoenergetic trap states and grain boundaries in polycrystalline materials but not the contribution of the grains to the electrical properties. The principal aim of this thesis is to describe the use of a numerical device simulator (ATLAS) as a tool to investigate the physics of the trapping process involved in the device operation, which additionally takes into account the effect of multienergetic trapping levels and the contribution of the grain in the modelling. A study of the conventional analytical models is presented, and an alternative approach is introduced which takes into account the grain regions to enhance the accuracy of the analytical modelling. A physically-based discrete-grain-boundary model and characterisation method are introduced to study the effects of the multienergetic trap states on the electrical characteristics of poly-TFTs, using CdSe devices as the experimental example, and electrical parameters such as the density distribution of the trapping states are extracted. The results show excellent agreement between the simulation and experimental data. The limitations of this proposed physical model are also studied and discussed. (author)

  16. Least-squares model-based halftoning

    Science.gov (United States)

    Pappas, Thrasyvoulos N.; Neuhoff, David L.

    1992-08-01

    A least-squares model-based approach to digital halftoning is proposed. It exploits both a printer model and a model for visual perception. It attempts to produce an 'optimal' halftoned reproduction by minimizing the squared error between the response of the cascade of the printer and visual models to the binary image and the response of the visual model to the original gray-scale image. Conventional methods, such as clustered ordered dither, use the properties of the eye only implicitly, and resist printer distortions at the expense of spatial and gray-scale resolution. In previous work we showed that our printer model can be used to modify error diffusion to account for printer distortions. The modified error diffusion algorithm has better spatial and gray-scale resolution than conventional techniques, but produces some well-known artifacts and asymmetries because it does not make use of an explicit eye model. Least-squares model-based halftoning uses explicit eye models and relies on printer models that predict distortions and exploit them to increase, rather than decrease, both spatial and gray-scale resolution. We have shown that the one-dimensional least-squares problem, in which each row or column of the image is halftoned independently, can be implemented with the Viterbi algorithm. Unfortunately, no closed-form solution can be found in two dimensions. The two-dimensional least-squares solution is obtained by iterative techniques. Experiments show that least-squares model-based halftoning produces more gray levels and better spatial resolution than conventional techniques. We also show that the least-squares approach eliminates the problems associated with error diffusion. Model-based halftoning can be especially useful in transmission of high quality documents using high fidelity gray-scale image encoders. As we have shown, in such cases halftoning can be performed at the receiver, just before printing. Apart from coding efficiency, this approach
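
    The error diffusion baseline that the record modifies and ultimately improves upon can be stated compactly. Below is a minimal sketch of classic Floyd-Steinberg error diffusion, without the printer or visual models that distinguish the least-squares approach.

```python
import numpy as np

def floyd_steinberg(gray):
    """Classic error-diffusion halftoning (no printer or eye model).

    gray: 2-D array in [0, 1]. Returns a binary image. This is the baseline
    that the record's least-squares, model-based approach improves upon.
    """
    img = gray.astype(float).copy()
    h, w = img.shape
    out = np.zeros_like(img)
    for y in range(h):
        for x in range(w):
            old = img[y, x]
            new = 1.0 if old >= 0.5 else 0.0
            out[y, x] = new
            err = old - new  # diffuse the quantization error to neighbors
            if x + 1 < w:
                img[y, x + 1] += err * 7 / 16
            if y + 1 < h and x > 0:
                img[y + 1, x - 1] += err * 3 / 16
            if y + 1 < h:
                img[y + 1, x] += err * 5 / 16
            if y + 1 < h and x + 1 < w:
                img[y + 1, x + 1] += err * 1 / 16
    return out

ramp = np.tile(np.linspace(0, 1, 64), (16, 1))
print(floyd_steinberg(ramp).mean())  # ~0.5 for a full gray ramp
```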

  17. Model-based sensor diagnosis

    International Nuclear Information System (INIS)

    Milgram, J.; Dormoy, J.L.

    1994-09-01

    Running a nuclear power plant involves monitoring data provided by the installation's sensors. Operators and computerized systems then use these data to establish a diagnostic of the plant. However, the instrumentation system is complex, and is not immune to faults and failures. This paper presents a system for detecting sensor failures using a topological description of the installation and a set of component models. This model of the plant implicitly contains relations between sensor data. These relations must always be checked if all the components are functioning correctly. The failure detection task thus consists of checking these constraints. The constraints are extracted in two stages. Firstly, a qualitative model of their existence is built using structural analysis. Secondly, the models are formally handled according to the results of the structural analysis, in order to establish the constraints on the sensor data. This work constitutes an initial step in extending model-based diagnosis, as the information on which it is based is suspect. This work will be followed by surveillance of the detection system. When the instrumentation is assumed to be sound, the unverified constraints indicate errors on the plant model. (authors). 8 refs., 4 figs

  18. Study on reliability analysis based on multilevel flow models and fault tree method

    International Nuclear Information System (INIS)

    Chen Qiang; Yang Ming

    2014-01-01

    Multilevel flow models (MFM) and the fault tree method describe system knowledge in different forms, so the two methods express an equivalent logic of system reliability under the same boundary conditions and assumptions. Based on this, and combined with the characteristics of MFM, a method for mapping MFM to fault trees was put forward, thus providing a way to establish fault trees rapidly and realize qualitative reliability analysis based on MFM. Taking the safety injection system of a pressurized water reactor nuclear power plant as an example, its MFM was established and its reliability was analyzed qualitatively. The analysis result shows that the logic of mapping MFM to fault trees is correct. The MFM is easily understood, created and modified. Compared with the traditional fault tree analysis, the workload is greatly reduced and modeling time is saved. (authors)
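
    The fault-tree side of the mapping reduces to evaluating AND/OR gates over basic-event probabilities. A minimal sketch follows, with a toy gate structure standing in for the safety injection system tree; the structure and probabilities are illustrative, not the paper's.

```python
# Minimal fault-tree evaluation with independent basic events.
# The gate structure below is a toy stand-in, not the paper's actual tree.
def or_gate(*p):
    """Failure occurs if any input fails: 1 - prod(1 - p_i)."""
    q = 1.0
    for pi in p:
        q *= (1.0 - pi)
    return 1.0 - q

def and_gate(*p):
    """Failure occurs only if all inputs fail: prod(p_i)."""
    q = 1.0
    for pi in p:
        q *= pi
    return q

pump_a, pump_b, valve, signal = 1e-3, 1e-3, 5e-4, 1e-4
pumps_fail = and_gate(pump_a, pump_b)            # redundant pumps
top_event = or_gate(pumps_fail, valve, signal)   # any branch fails the system
print(f"top-event probability: {top_event:.2e}")
```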

  19. Discovering the Power of Individual-Based Modelling in Teaching and Learning: The Study of a Predator-Prey System

    Science.gov (United States)

    Ginovart, Marta

    2014-01-01

    The general aim is to promote the use of individual-based models (biological agent-based models) in teaching and learning contexts in life sciences and to make their progressive incorporation into academic curricula easier, complementing other existing modelling strategies more frequently used in the classroom. Modelling activities for the study…

  20. GIS-based modelling of odour emitted from the waste processing plant: case study

    Directory of Open Access Journals (Sweden)

    Sówka Izabela

    2017-01-01

    Full Text Available The emission of odours into the atmospheric air from the municipal economy and industrial plants, especially in urbanized areas, causes a serious problem which mankind has been struggling with for years. Excessive exposure to odours may result in many negative health effects, including, for example, headaches and vomiting. Many different methods are used to evaluate odour nuisance. The results obtained through those methods can then be used to visualize and analyse the distribution of odour concentrations in a given area using GIS (Geographic Information Systems). Applying GIS to the spatial analysis of the impact of odours enables the assessment of the magnitude and likelihood of the occurrence of odour nuisance. Modelling using GIS tools and spatial interpolation, such as the IDW method and kriging, can provide an alternative to the standard modelling tools, which generally use emission values from sources identified as major emitters of odours. Based on odour measurement data from a waste processing plant, the work presents the results of an attempt to connect two different tools – the reference model OPERAT FB and GIS-based dispersion modelling performed using the IDW method and ordinary kriging – to analyse their behaviour given limited observation values.
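
    Of the two interpolation methods named, IDW is the simpler to state. The sketch below is a minimal IDW interpolator over hypothetical odour measurements; it is not the OPERAT FB model or the study's GIS workflow.

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2.0):
    """Inverse-distance-weighted interpolation of scattered measurements."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)               # avoid division by zero at sample points
    w = 1.0 / d ** power
    return (w @ values) / w.sum(axis=1)

# Hypothetical odour measurements (ou/m3) around a waste-processing plant.
pts = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
ou = np.array([12.0, 5.0, 7.0, 2.0])
grid = np.array([[50.0, 50.0], [10.0, 10.0]])
print(idw(pts, ou, grid))                  # interpolated concentrations
```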

  1. Modeling the impact of prostate edema on LDR brachytherapy: a Monte Carlo dosimetry study based on a 3D biphasic finite element biomechanical model

    Science.gov (United States)

    Mountris, K. A.; Bert, J.; Noailly, J.; Rodriguez Aguilera, A.; Valeri, A.; Pradier, O.; Schick, U.; Promayon, E.; Gonzalez Ballester, M. A.; Troccaz, J.; Visvikis, D.

    2017-03-01

    Prostate volume changes due to edema occurrence during transperineal permanent brachytherapy should be taken into consideration to ensure optimal dose delivery. Available edema models, based on prostate volume observations, face several limitations. Therefore, patient-specific models need to be developed to accurately account for the impact of edema. In this study we present a biomechanical model developed to reproduce edema resolution patterns documented in the literature. Using the biphasic mixture theory and finite element analysis, the proposed model takes into consideration the mechanical properties of the pubic area tissues in the evolution of prostate edema. The model’s computed deformations are incorporated in a Monte Carlo simulation to investigate their effect on post-operative dosimetry. The comparison of Day1 and Day30 dosimetry results demonstrates the capability of the proposed model for patient-specific dosimetry improvements, considering the edema dynamics. The proposed model shows excellent ability to reproduce previously described edema resolution patterns and was validated based on previous findings. According to our results, for a prostate volume increase of 10-20%, the Day30 urethra D10 dose metric is higher by 4.2%-10.5% compared to the Day1 value. The introduction of the edema dynamics in Day30 dosimetry reveals a significant global dose overestimation in the conventional static Day30 dosimetry. In conclusion, the proposed edema biomechanical model can improve the treatment planning of transperineal permanent brachytherapy by accounting for post-implant dose alterations during the planning procedure.
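
    The edema resolution patterns documented in the literature are commonly described by an exponential decay of the volume increase. A minimal sketch follows; the functional form mirrors the classical volume-observation models the record contrasts with, and the numbers are illustrative assumptions.

```python
import numpy as np

def prostate_volume(t_days, v0=40.0, edema_magnitude=0.15, half_life=9.3):
    """Exponential edema-resolution pattern.

    v0: baseline prostate volume (cm^3); edema_magnitude: fractional volume
    increase at implant; half_life: edema resolution half-life (days).
    All parameter values are illustrative, not the study's.
    """
    return v0 * (1.0 + edema_magnitude * np.exp(-np.log(2.0) * t_days / half_life))

for day in (1, 10, 30):
    print(f"Day {day}: {prostate_volume(day):.1f} cm^3")
```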

  2. Adopting a Models-Based Approach to Teaching Physical Education

    Science.gov (United States)

    Casey, Ashley; MacPhail, Ann

    2018-01-01

    Background: The popularised notion of models-based practice (MBP) is one that focuses on the delivery of a model, e.g. Cooperative Learning, Sport Education, Teaching Personal and Social Responsibility, Teaching Games for Understanding. Indeed, while an abundance of research studies have examined the delivery of a single model and some have…

  3. Citrate synthase proteins in extremophilic organisms: Studies within a structure-based model

    International Nuclear Information System (INIS)

    Różycki, Bartosz; Cieplak, Marek

    2014-01-01

    We study four citrate synthase homodimeric proteins within a structure-based coarse-grained model. Two of these proteins come from thermophilic bacteria, one from a cryophilic bacterium and one from a mesophilic organism; three are in the closed and two in the open conformations. Even though the proteins belong to the same fold, the model distinguishes the properties of these proteins in a way which is consistent with experiments. For instance, the thermophilic proteins are more stable thermodynamically than their mesophilic and cryophilic homologues, which we observe both in the magnitude of thermal fluctuations near the native state and in the kinetics of thermal unfolding. The level of stability correlates with the average coordination number for amino acid contacts and with the degree of structural compactness. The pattern of positional fluctuations along the sequence in the closed conformation is different than in the open conformation, including within the active site. The modes of correlated and anticorrelated movements of pairs of amino acids forming the active site are very different in the open and closed conformations. Taken together, our results show that the precise location of amino acid contacts in the native structure appears to be a critical element in explaining the similarities and differences in the thermodynamic properties, local flexibility, and collective motions of the different forms of the enzyme

  4. Model-Based Learning Environment Based on The Concept IPS School-Based Management

    Directory of Open Access Journals (Sweden)

    Hamid Darmadi

    2017-03-01

    Full Text Available The results showed: (1) the environment-based IPS learning model can grow and nurture love of regional cultural values as a basis for the development of national culture; (2) community participation and the role of government in implementing the environment-based IPS learning model have a positive impact on the improvement of school resource management; (3) the environment-based IPS learning model effectively creates a way of living together peacefully, increasing the intensity of togetherness and mutual respect; (4) the environment-based IPS learning model can improve student learning outcomes; (5) there are differences in expressed attitudes and learning outcomes between students located in the conflict area and students outside the conflict area; (6) attitude-scale analysis of junior and senior high school students shows high regard for the values of national unity, respect for diversity and peaceful coexistence. It is recommended that the Department of Education, as the institution responsible for fostering and developing social and cultural values in the province, apply the environment-based IPS learning model.

  5. Study of thermal environment in Jingjintang urban agglomeration based on WRF model and Landsat data

    International Nuclear Information System (INIS)

    Huang, Q N; Cao, Z Q; Guo, H D; Xi, X H; Li, X W

    2014-01-01

    In recent decades, unprecedented urban expansion has taken place in developing countries, resulting in the emergence of megacities and urban agglomerations. A variety of urban environmental issues associated with urbanization, such as greenhouse gas emissions and the urban heat island phenomenon, have raised concern in many countries. Generally, the thermal environment is monitored using remote sensing satellite data, a method usually limited by weather and the repeat cycle. Another approach relies on numerical simulation based on models. In this study, the two means are combined to study the thermal environment of the Jingjintang urban agglomeration. The high-temperature processes of the study area in 2009 and the 1990s are simulated using WRF (the Weather Research and Forecasting Model) coupled with UCM (Urban Canopy Model) and the urban impervious surface estimated from Landsat-5 TM data using a support vector machine. Results show that the trend of simulated air temperature (2 m) is in accord with observed air temperature, and the differences in air temperature and land surface temperature caused by urbanization are indicated efficiently. The UHI effect at night is stronger than that in the day; the maximum difference in LST reaches 8–10°C for newly built-up areas at night. The method provided in this research can be used to analyze the impacts of urbanization on the urban thermal environment and offers a means of thermal environment monitoring and prediction that will benefit coping capacity for extreme events

  6. Model-based and model-free “plug-and-play” building energy efficient control

    International Nuclear Information System (INIS)

    Baldi, Simone; Michailidis, Iakovos; Ravanis, Christos; Kosmatopoulos, Elias B.

    2015-01-01

    Highlights: • “Plug-and-play” Building Optimization and Control (BOC) driven by building data. • Ability to handle the large-scale and complex nature of the BOC problem. • Adaptation to learn the optimal BOC policy when no building model is available. • Comparisons with rule-based and advanced BOC strategies. • Simulation and real-life experiments in a ten-office building. - Abstract: Considerable research efforts in Building Optimization and Control (BOC) have been directed toward the development of “plug-and-play” BOC systems that can achieve energy efficiency without compromising thermal comfort and without the need of qualified personnel engaged in a tedious and time-consuming manual fine-tuning phase. In this paper, we report on how a recently introduced Parametrized Cognitive Adaptive Optimization – abbreviated as PCAO – can be used toward the design of both model-based and model-free “plug-and-play” BOC systems, with minimum human effort required to accomplish the design. In the model-based case, PCAO assesses the performance of its control strategy via a simulation model of the building dynamics; in the model-free case, PCAO optimizes its control strategy without relying on any model of the building dynamics. Extensive simulation and real-life experiments performed on a 10-office building demonstrate the effectiveness of the PCAO–BOC system in providing significant energy efficiency and improved thermal comfort. The mechanisms embedded within PCAO render it capable of automatically and quickly learning an efficient BOC strategy either in the presence of complex nonlinear simulation models of the building dynamics (model-based) or when no model for the building dynamics is available (model-free). Comparative studies with alternative state-of-the-art BOC systems show the effectiveness of the PCAO–BOC solution

  7. Photovoltaic Grid-Connected Modeling and Characterization Based on Experimental Results.

    Science.gov (United States)

    Humada, Ali M; Hojabri, Mojgan; Sulaiman, Mohd Herwan Bin; Hamada, Hussein M; Ahmed, Mushtaq N

    2016-01-01

    A grid-connected photovoltaic (PV) system operating under fluctuating weather conditions has been modeled and characterized based on a specific test bed. A mathematical model of a small-scale PV system has been developed mainly for residential usage, and the potential results have been simulated. The proposed PV model is based on three PV parameters: the photocurrent IL, the reverse diode saturation current Io, and the diode ideality factor n. The accuracy of the proposed model and its parameters was evaluated against different benchmarks. The results showed that the proposed model fits the experimental results, including the I-V characteristic curve, with higher accuracy than the other models. The results of this study can be considered valuable for the installation of grid-connected PV systems under fluctuating climatic conditions.
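
    The three named parameters define the ideal single-diode relation I = IL − Io·(exp(V/(n·Ns·Vt)) − 1). A minimal sketch, neglecting series and shunt resistance and using illustrative module values:

```python
import numpy as np

def pv_current(v, i_l=8.0, i_o=1e-9, n=1.3, cells=60, temp_k=298.15):
    """Ideal single-diode PV model: I = IL - Io*(exp(V/(n*Ns*Vt)) - 1).

    Series and shunt resistances are neglected; i_l, i_o, n are the three
    parameters the record names, with illustrative values.
    """
    k, q = 1.380649e-23, 1.602176634e-19
    v_t = k * temp_k / q                       # thermal voltage (~25.7 mV)
    return i_l - i_o * (np.exp(v / (n * cells * v_t)) - 1.0)

v = np.linspace(0.0, 50.0, 500)
i = np.clip(pv_current(v), 0.0, None)          # no negative current past Voc
p = v * i
print(f"maximum power point: {p.max():.1f} W at {v[p.argmax()]:.1f} V")
```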

  8. Model based process-product design and analysis

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    This paper gives a perspective on modelling and the important role it has within product-process design and analysis. Different modelling issues related to development and application of systematic model-based solution approaches for product-process design are discussed, and the need for a hybrid model-based framework is highlighted. This framework should be able to manage knowledge-data, models, and associated methods and tools integrated with design work-flows and data-flows for specific product-process design problems. In particular, the framework needs to manage models of different types, forms and complexity, together with their associated parameters. An example of a model-based system for design of chemicals-based formulated products is also given.

  9. A model evaluation checklist for process-based environmental models

    Science.gov (United States)

    Jackson-Blake, Leah

    2015-04-01

    the conceptual model on which it is based. In this study, a number of model structural shortcomings were identified, such as a lack of dissolved phosphorus transport via infiltration excess overland flow, potential discrepancies in the particulate phosphorus simulation and a lack of spatial granularity. (4) Conceptual challenges, as conceptual models on which predictive models are built are often outdated, having not kept up with new insights from monitoring and experiments. For example, soil solution dissolved phosphorus concentration in INCA-P is determined by the Freundlich adsorption isotherm, which could potentially be replaced using more recently-developed adsorption models that take additional soil properties into account. This checklist could be used to assist in identifying why model performance may be poor or unreliable. By providing a model evaluation framework, it could help prioritise which areas should be targeted to improve model performance or model credibility, whether that be through using alternative calibration techniques and statistics, improved data collection, improving or simplifying the model structure or updating the model to better represent current understanding of catchment processes.

  10. A probabilistic graphical model based stochastic input model construction

    International Nuclear Information System (INIS)

    Wan, Jiang; Zabaras, Nicholas

    2014-01-01

    Model reduction techniques have been widely used in modeling of high-dimensional stochastic input in uncertainty quantification tasks. However, the probabilistic modeling of random variables projected into reduced-order spaces presents a number of computational challenges. Due to the curse of dimensionality, the underlying dependence relationships between these random variables are difficult to capture. In this work, a probabilistic graphical model based approach is employed to learn the dependence by running a number of conditional independence tests using observation data. Thus a probabilistic model of the joint PDF is obtained and the PDF is factorized into a set of conditional distributions based on the dependence structure of the variables. The estimation of the joint PDF from data is then transformed to estimating conditional distributions under reduced dimensions. To improve the computational efficiency, a polynomial chaos expansion is further applied to represent the random field in terms of a set of standard random variables. This technique is combined with both linear and nonlinear model reduction methods. Numerical examples are presented to demonstrate the accuracy and efficiency of the probabilistic graphical model based stochastic input models. - Highlights: • Data-driven stochastic input models without the assumption of independence of the reduced random variables. • The problem is transformed to a Bayesian network structure learning problem. • Examples are given in flows in random media

  11. Variance-based sensitivity analysis for wastewater treatment plant modelling.

    Science.gov (United States)

    Cosenza, Alida; Mannina, Giorgio; Vanrolleghem, Peter A; Neumann, Marc B

    2014-02-01

    Global sensitivity analysis (GSA) is a valuable tool to support the use of mathematical models that characterise technical or natural systems. In the field of wastewater modelling, most of the recent applications of GSA use either regression-based methods, which require close to linear relationships between the model outputs and model factors, or screening methods, which only yield qualitative results. However, due to the characteristics of membrane bioreactors (MBR) (non-linear kinetics, complexity, etc.) there is an interest to adequately quantify the effects of non-linearity and interactions. This can be achieved with variance-based sensitivity analysis methods. In this paper, the Extended Fourier Amplitude Sensitivity Testing (Extended-FAST) method is applied to an integrated activated sludge model (ASM2d) for an MBR system including microbial product formation and physical separation processes. Twenty-one model outputs located throughout the different sections of the bioreactor and 79 model factors are considered. Significant interactions among the model factors are found. Contrary to previous GSA studies for ASM models, we find the relationship between variables and factors to be non-linear and non-additive. By analysing the pattern of the variance decomposition along the plant, the model factors having the highest variance contributions were identified. This study demonstrates the usefulness of variance-based methods in membrane bioreactor modelling where, due to the presence of membranes and different operating conditions than those typically found in conventional activated sludge systems, several highly non-linear effects are present. Further, the obtained results highlight the relevant role played by the modelling approach for MBR taking into account simultaneously biological and physical processes.
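
    The study uses Extended-FAST; as a stand-in, the sketch below runs the closely related variance-based Sobol analysis with the SALib package (an assumption; the authors' tooling is not specified) on a toy non-additive function, showing how total-order indices exceeding first-order indices reveal interactions.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# Toy non-additive model standing in for the ASM2d/MBR model.
def toy_model(x):
    return x[0] + x[1] ** 2 + 0.5 * x[0] * x[2]   # interaction term x0*x2

problem = {
    "num_vars": 3,
    "names": ["k_h", "mu_max", "K_O2"],           # illustrative factor names
    "bounds": [[0.5, 1.5], [0.5, 1.5], [0.5, 1.5]],
}

X = saltelli.sample(problem, 1024)
Y = np.apply_along_axis(toy_model, 1, X)
Si = sobol.analyze(problem, Y)

# ST - S1 > 0 indicates the kind of interaction effects the study reports.
for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
    print(f"{name}: first-order={s1:.2f}, total-order={st:.2f}")
```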

  12. Effective modelling of percolation at the landscape scale using data-based approaches

    Science.gov (United States)

    Selle, Benny; Lischeid, Gunnar; Huwe, Bernd

    2008-06-01

    Process-based models have been extensively applied to assess the impact of landuse change on water quantity and quality at landscape scales. However, the routine application of those models suffers from large computational efforts, lack of transparency and the requirement of many input parameters. Data-based models such as Feed-Forward Multilayer Perceptrons (MLP) and Classification and Regression Trees (CART) may be used as effective models, i.e. simple approximations of complex process-based models. These data-based approaches can subsequently be applied for scenario analysis and as a transparent management tool provided climatic boundary conditions and the basic model assumptions of the process-based models do not change dramatically. In this study, we apply MLP, CART and Multiple Linear Regression (LR) to model the spatially distributed and spatially aggregated percolation in soils using weather, groundwater and soil data. The percolation data are obtained via numerical experiments with Hydrus1D. Thus, the complex process-based model is approximated using simpler data-based approaches. The MLP model explains most of the percolation variance in time and space without using any soil information. This reflects the effective dimensionality of the process-based model and suggests that percolation in the study area may be modelled much more simply than using Hydrus1D. The CART model shows that soil properties play a negligible role for percolation under wet climatic conditions. However, they become more important if the conditions turn drier. The LR method does not yield satisfactory predictions for the spatially distributed percolation; however, the spatially aggregated percolation is well approximated. This may indicate that the soils behave more simply (i.e. more linearly) when percolation dynamics are upscaled.
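
    A minimal sketch of the surrogate idea: fit an MLP and a regression tree to input-output pairs that stand in for runs of a process-based percolation model. The synthetic data and hyperparameters are assumptions for illustration, not the study's setup.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic stand-in for process-model output: percolation from rain, ET, depth.
X = rng.uniform([0, 0, 0.2], [10, 5, 2.0], size=(2000, 3))  # rain, ET, soil depth
y = np.maximum(X[:, 0] - X[:, 1] - 0.5 * X[:, 2], 0) + rng.normal(0, 0.1, 2000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for name, model in [("MLP", MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000)),
                    ("CART", DecisionTreeRegressor(max_depth=6))]:
    model.fit(X_tr, y_tr)
    print(f"{name} surrogate R^2: {model.score(X_te, y_te):.2f}")
```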

  13. Web Based VRML Modelling

    NARCIS (Netherlands)

    Kiss, S.; Sarfraz, M.

    2004-01-01

    Presents a method to connect VRML (Virtual Reality Modeling Language) and Java components in a Web page using EAI (External Authoring Interface), which makes it possible to interactively generate and edit VRML meshes. The meshes used are based on regular grids, to provide an interaction and modeling

  14. Study of cosmic ray interaction model based on atmospheric muons for the neutrino flux calculation

    International Nuclear Information System (INIS)

    Sanuki, T.; Honda, M.; Kajita, T.; Kasahara, K.; Midorikawa, S.

    2007-01-01

    We have studied the hadronic interaction for the calculation of the atmospheric neutrino flux by summarizing the accurately measured atmospheric muon flux data and comparing with simulations. We find the atmospheric muon and neutrino fluxes respond similarly to errors in the π-production of the hadronic interaction, and compare the atmospheric muon flux calculated using the HKKM04 code [M. Honda, T. Kajita, K. Kasahara, and S. Midorikawa, Phys. Rev. D 70, 043008 (2004)] with experimental measurements. The μ⁺ + μ⁻ data show good agreement in the 1∼30 GeV/c range, but a large disagreement above 30 GeV/c. The μ⁺/μ⁻ ratio shows sizable differences at lower and higher momenta, in opposite directions. As the disagreements are considered to be due to assumptions in the hadronic interaction model, we try to improve it phenomenologically based on the quark parton model. The improved interaction model reproduces the observed muon flux data well. The calculation of the atmospheric neutrino flux will be reported in the following paper [M. Honda et al., Phys. Rev. D 75, 043006 (2007)].

  15. Physics Based Modeling of Compressible Turbulence

    Science.gov (United States)

    2016-11-07

    AFRL-AFOSR-VA-TR-2016-0345. Final Report (09/13/2016) on the AFOSR project (FA9550-11-1-0111) entitled "Physics based modeling of compressible turbulence," Parviz Moin, Leland Stanford Junior University, CA. The period of performance was from June 15, 2011...

  16. Structured model of Consumer-based Brand Equity based on Promotional-mix elements (Case Study: Food Active Industries of Tehran)

    OpenAIRE

    Mehran Rezvani; Siran Mehrnia

    2014-01-01

    Abstract: This paper aims to examine the relationships among Promotional-mix elements and Customer-based Brand Equity. A model is then developed to examine the relationships between Promotional-mix elements and Customer-based brand equity in the Food Industries of Tehran. The sample size is 240. Data were collected by a specially designed questionnaire and estimated using LISREL and the SEM method. The test results show that four dimensions of brand equity (brand awareness, perceived quality, and br...

  17. Case studies of extended model-based flood forecasting: prediction of dike strength and flood impacts

    Science.gov (United States)

    Stuparu, Dana; Bachmann, Daniel; Bogaard, Tom; Twigt, Daniel; Verkade, Jan; de Bruijn, Karin; de Leeuw, Annemargreet

    2017-04-01

    Flood forecasts, warning and emergency response are important components of flood risk management. Most flood forecasting systems use models to translate weather predictions into forecasted discharges or water levels. However, this information is often not sufficient for real-time decisions. A sound understanding of the reliability of embankments and of flood dynamics is needed to react in a timely manner and reduce the negative effects of the flood. Where are the weak points in the dike system? When, where and how much water will flow? When and where is the greatest impact expected? Model-based flood impact forecasting tries to answer these questions by adding new dimensions to existing forecasting systems, providing forecasted information about: (a) the dike strength during the event (reliability), (b) the flood extent in case of an overflow or a dike failure (flood spread) and (c) the assets at risk (impacts). This work presents three case studies in which such a set-up is applied, and highlights their special features. Forecasting of dike strength: the first case study focuses on the forecast of dike strength in the Netherlands for the Rhine branches Waal, Nederrijn and IJssel. A so-called reliability transformation is used to translate the predicted water levels at selected dike sections into failure probabilities during a flood event. The reliability of a dike section is defined by fragility curves - a summary of the dike strength conditional on the water level. The reliability information enhances the emergency management and inspection of embankments. Ensemble forecasting: the second case study shows the setup of a flood impact forecasting system in Dumfries, Scotland. The existing forecasting system is extended with a 2D flood spreading model in combination with the Delft-FIAT impact model. Ensemble forecasts are used to account for the uncertainty in the precipitation forecasts, which is useful for quantifying the certainty of a forecasted flood event. From global
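    The reliability transformation step is simple to illustrate. Below is a minimal sketch, assuming a fragility curve is available as tabulated points of failure probability conditional on water level; the curve values and names are invented for illustration, not taken from the Dutch system.

```python
import numpy as np

# Hypothetical fragility curve for one dike section:
water_level = np.array([2.0, 3.0, 4.0, 5.0, 6.0])      # m above datum
p_fail      = np.array([0.0, 0.01, 0.10, 0.45, 0.90])  # failure probability given that level

def failure_probability(forecast_level_m: float) -> float:
    """Interpolate the fragility curve at the forecasted water level."""
    return float(np.interp(forecast_level_m, water_level, p_fail))

print(failure_probability(4.6))  # -> ~0.31 for a forecasted 4.6 m level
```

    Applied to every dike section along a forecasted water-level profile, this yields the kind of section-by-section reliability map described above.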

  18. New global ICT-based business models

    DEFF Research Database (Denmark)

    The New Global Business Model (NEWGIBM) book describes the background, theory references, case studies, results and learning imparted by the NEWGIBM project, which is supported by ICT, to a research group during the period from 2005-2011. The book is a result of the efforts and the collaborative... The NEWGIBM book serves as a part of the final evaluation and documentation of the NEWGIBM project and is supported by results from the following projects: M-commerce, Global Innovation, Global Ebusiness & M-commerce, The Blue Ocean project, International Center for Innovation and Women in Business, NEFFICS... Topics covered include: What The NEWGIBM Cases Show?; The Strategy Concept in Light of the Increased Importance of Innovative Business Models; Successful Implementation of Global BM Innovation; Globalisation Of ICT Based Business Models: Today And In 2020.

  19. A heat transfer correlation based on a surface renewal model for molten core concrete interaction study

    International Nuclear Information System (INIS)

    Tourniaire, B. (E-mail: bruno.tourniaire@cea.fr)

    2006-01-01

    The prediction of heat transfer between the corium pool and the concrete basemat is of particular significance in the framework of the study of PWR severe accidents. Heat transfer directly governs the ablation velocity of the concrete in case of molten core concrete interaction (MCCI) and, consequently, the time at which the reactor cavity may fail. From a restricted hydrodynamic point of view, this issue is related to heat transfer between a heated bubbling pool and a porous wall with gas injection. Several experimental studies have been performed with simulant materials and many correlations have been provided to address this issue. Comparisons of the results of these correlations with the measurements, and their extrapolation to reactor materials, show strong discrepancies between the models, which probably means that some phenomena are not well taken into account. The main purpose of this paper is to present an alternative heat transfer model which was originally developed by Deckwer for chemical engineering applications (bubble columns). Part of this work is devoted to the presentation of this model, which is based on a surface renewal assumption. Comparisons of the results of this model with available experimental data in different systems are presented and discussed. These comparisons clearly show that this model can be used to deal with the particular problem of MCCI. The analyses also lead to an enrichment of the original model by taking into account the thermal resistance of the wall: a new formulation of Deckwer's correlation is finally proposed.
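    For orientation, the surface renewal picture behind Deckwer's model can be summarized as follows. This is a hedged sketch using the forms commonly quoted in the bubble-column literature, not formulas taken from the paper; in particular the 0.1 coefficient and the dimensionless grouping should be checked against Deckwer's original work.

```latex
% Transient conduction into liquid elements renewed at the wall after a
% mean contact time t_c gives the surface renewal heat transfer coefficient:
h \;=\; \frac{2}{\sqrt{\pi}}\,\sqrt{\frac{k\,\rho\,c_p}{t_c}}
% Closing t_c with the energy dissipated by the rising gas yields the
% dimensionless bubble-column correlation, with St = h/(\rho\, c_p\, u_g):
St \;=\; 0.1\,\left(Re\,Fr\,Pr^{2}\right)^{-1/4}
```

    Here k, ρ and c_p are the melt conductivity, density and heat capacity, u_g the superficial gas velocity, and Re, Fr, Pr are built on that velocity.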

  20. Study on the Cooperative E-commerce Model between Enterprises based on the Value Chain

    Institute of Scientific and Technical Information of China (English)

    XU Jun [1,2]; LIU Xiaoxing [1]

    2015-01-01

    Real e-commerce between enterprises is based on the internal departments of the enterprises and the cooperative interaction between an enterprise and its partners. In this paper, on the basis of value chain theory, 11 cooperative e-commerce models between enterprises are classified according to the activities of the cooperation between enterprises, and each cooperative e-commerce model is then discussed. In practice, cooperative e-commerce between enterprises can be a combination of one or more of these models.

  1. CFD based aerodynamic modeling to study flight dynamics of a flapping wing micro air vehicle

    Science.gov (United States)

    Rege, Alok Ashok

    The demand for small unmanned air vehicles, commonly termed micro air vehicles or MAVs, is rapidly increasing. Driven by applications ranging from civil search-and-rescue missions to military surveillance missions, there is a rising level of interest and investment in better vehicle designs, and miniaturized components are enabling many rapid advances. The need to better understand fundamental aspects of flight for small vehicles has spawned a surge in high-quality research in the area of micro air vehicles. These aircraft have a set of constraints which are, in many ways, considerably different from those of traditional aircraft and are often best addressed by a multidisciplinary approach. Fast-response nonlinear controls, nano-structures, integrated propulsion and lift mechanisms, highly flexible structures, and low-Reynolds-number aerodynamics are just a few of the important considerations which may be combined in the execution of MAV research. The main objective of this thesis is to derive a consistent nonlinear dynamic model to study the flight dynamics of micro air vehicles with a reasonably accurate representation of aerodynamic forces and moments. The research is divided into two sections. In the first section, the derivation of the nonlinear dynamics of flapping wing micro air vehicles is presented. The flapping wing micro air vehicle (MAV) used in this research is modeled as a system of three rigid bodies: a body and two wings. The design is based on an insect called Drosophila melanogaster, commonly known as the fruit fly. The mass and inertial effects of the wings on the body are neglected for the present work. The nonlinear dynamics is simulated with the aerodynamic data published in the open literature. The flapping frequency is used as the control input. Simulations are run for different cases of wing positions and the chosen parameters are studied for boundedness. Results show a qualitative inconsistency in boundedness for some cases, and demand a better

  2. Experimental and modeling study on charge storage/transfer mechanism of graphene-based supercapacitors

    Science.gov (United States)

    Ban, Shuai; Jing, Xie; Zhou, Hongjun; Zhang, Lei; Zhang, Jiujun

    2014-12-01

    A symmetrical graphene-based supercapacitor is constructed for studying the charge-transfer mechanism within the graphene-based electrodes using both experimental measurements and molecular simulation. The in-house synthesized graphene is characterized by XRD, SEM and BET measurements for morphology and surface area. It is observed that the electric capacity of the graphene electrode can be reduced by both high internal resistance and limited mass transfer. Computer modeling is conducted at the molecular level to characterize the diffusion behavior of electrolyte ions into the interior of the electrode, with emphasis on the unique 2D confinement imposed by the graphene layers. Although the graphene powder possesses only a moderate internal surface area of 400 m2 g-1, the capacitance performance of the graphene electrode can be as good as that of commercial activated carbon, which has an overwhelming surface area of 1700 m2 g-1. An explanation for this abnormal correlation is that the graphene material has an intrinsic capability of adaptively reorganizing its microporous structure in response to the intercalation of ions and the immersion of electrolyte solvent. The accessible surface of graphene is believed to be dramatically enlarged for ion adsorption during the charging process of the capacitor.

  3. Between Complexity and Parsimony: Can Agent-Based Modelling Resolve the Trade-off

    DEFF Research Database (Denmark)

    Nielsen, Helle Ørsted; Malawska, Anna Katarzyna

    2013-01-01

    While Herbert Simon espoused development of general models of behavior, he also strongly advocated that these models be based on realistic assumptions about humans and therefore reflect the complexity of human cognition and social systems (Simon 1997). Hence, the model of bounded rationality... An approach to BR-based policy studies would be to couple research on bounded rationality with agent-based modeling. Agent-based models (ABMs) are computational models for simulating the behavior and interactions of any number of decision makers in a dynamic system. Agent-based models are better suited than general equilibrium models for capturing behavior patterns of complex systems. ABMs may have the potential to represent complex systems without oversimplifying them. At the same time, research in bounded rationality and behavioral economics has already yielded many insights that could inform the modeling...

  4. A CBR-Based and MAHP-Based Customer Value Prediction Model for New Product Development

    Science.gov (United States)

    Zhao, Yu-Jie; Luo, Xin-xing; Deng, Li

    2014-01-01

    In the fierce market environment, an enterprise that wants to meet customer needs and boost its market profit and share must focus on new product development. To overcome the limitations of previous research, Chan et al. proposed a dynamic decision support system to predict the customer lifetime value (CLV) for new product development. However, to better meet customer needs, there are still some deficiencies in their model, so this study proposes a CBR-based and MAHP-based customer value prediction model for a new product (C&M-CVPM). CBR (case-based reasoning) can reduce experts' workload and evaluation time, while MAHP (multiplicative analytic hierarchy process) can use actual but averaged influencing factors' effectiveness in simulation; at the same time, C&M-CVPM uses dynamic customers' transition probabilities, which is closer to reality. This study not only introduces the realization of CBR and MAHP, but also elaborates C&M-CVPM's three main modules. The application of the proposed model is illustrated and confirmed to be sensible and convincing through a simulation experiment. PMID:25162050

  5. Strategic directions for agent-based modeling: avoiding the YAAWN syndrome.

    Science.gov (United States)

    O'Sullivan, David; Evans, Tom; Manson, Steven; Metcalf, Sara; Ligmann-Zielinska, Arika; Bone, Chris

    In this short communication, we examine how agent-based modeling has become common in land change science and is increasingly used to develop case studies for particular times and places. There is a danger that the research community is missing a prime opportunity to learn broader lessons from the use of agent-based modeling (ABM), or at the very least not sharing these lessons more widely. How do we find an appropriate balance between empirically rich, realistic models and simpler theoretically grounded models? What are appropriate and effective approaches to model evaluation in light of uncertainties not only in model parameters but also in model structure? How can we best explore hybrid model structures that enable us to better understand the dynamics of the systems under study, recognizing that no single approach is best suited to this task? Under what circumstances - in terms of model complexity, model evaluation, and model structure - can ABMs be used most effectively to lead to new insight for stakeholders? We explore these questions in the hope of helping the growing community of land change scientists using models in their research to move from 'yet another model' to doing better science with models.

  6. Empirical study of travel mode forecasting improvement for the combined revealed preference/stated preference data–based discrete choice model

    Directory of Open Access Journals (Sweden)

    Yanfu Qiao

    2016-01-01

    Full Text Available The combined revealed preference/stated preference data-based discrete choice model captures actual choice-making constraints and reduces prediction errors. However, the different random error variances of alternatives belonging to the two data sources affect its general applicability. In this article, we studied the traffic corridor between Chengdu and Longquan with the revealed preference/stated preference joint model, while the single stated preference data model separately predicted the choice probability of each mode. We found that the revealed preference/stated preference joint model is applicable only when there is a significant difference between the random error terms of the two data sources. The single stated preference data would amplify the travelers' preferences and cause prediction error. We propose a general approach that uses revealed preference data to modify the parameter estimation results of the single stated preference data model, so as to obtain a composite utility and reduce the prediction error. The results suggest that predictions based on the composite utility are more reasonable than those based on the single stated preference data alone, especially when forecasting the mode share of bus. The future metro line will be the main travel mode in this corridor, and 45% of passenger flow will transfer to the metro.
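    As background for the scale-difference argument, the standard joint RP/SP setup can be written as follows. This is a hedged, generic formulation from the discrete choice literature, not notation quoted from the article:

```latex
U^{RP}_{in} = \beta' x^{RP}_{in} + \varepsilon^{RP}_{in}, \qquad
U^{SP}_{in} = \beta' x^{SP}_{in} + \varepsilon^{SP}_{in}, \qquad
\operatorname{Var}\!\left(\varepsilon^{RP}\right) = \mu^{2}\,\operatorname{Var}\!\left(\varepsilon^{SP}\right)
```

    The shared taste parameters β are estimated on the pooled data while the SP error terms are rescaled by μ so that both sources share one Gumbel scale; pooling adds value over separate estimation mainly when μ differs significantly from one, which matches the applicability condition reported above.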

  7. Lithologic Effects on Landscape Response to Base Level Changes: A Modeling Study in the Context of the Eastern Jura Mountains, Switzerland

    Science.gov (United States)

    Yanites, Brian J.; Becker, Jens K.; Madritsch, Herfried; Schnellmann, Michael; Ehlers, Todd A.

    2017-11-01

    Landscape evolution is a product of the forces that drive geomorphic processes (e.g., tectonics and climate) and the resistance to those processes. The underlying lithology and structural setting in many landscapes set the resistance to erosion. This study uses a modified version of the Channel-Hillslope Integrated Landscape Development (CHILD) landscape evolution model to determine the effect of a spatially and temporally changing erodibility in a terrain with a complex base level history. Specifically, our focus is to quantify how the effects of variable lithology influence transient base level signals. We set up a series of numerical landscape evolution models with increasing levels of complexity based on the lithologic variability and base level history of the Jura Mountains of northern Switzerland. The models are consistent with lithology (and therewith erodibility) playing an important role in the transient evolution of the landscape. The results show that the erosion rate history at a location depends on the rock uplift and base level history, the range of erodibilities of the different lithologies, and the history of the surface geology downstream from the analyzed location. Near the model boundary, the history of erosion is dominated by the base level history. The transient wave of incision, however, is quite variable in the different model runs and depends on the geometric structure of lithology used. It is thus important to constrain the spatiotemporal erodibility patterns downstream of any given point of interest to understand the evolution of a landscape subject to variable base level in a quantitative framework.
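    Although the paper works with a modified CHILD model, the mechanism it describes - a transient incision wave whose speed depends on erodibility - is usually expressed through a detachment-limited stream power law. A hedged, generic form (not necessarily the exact rule used in the modified CHILD runs) is:

```latex
\frac{\partial z}{\partial t} \;=\; U \;-\; K(x,t)\,A^{m}\,S^{n}
```

    where z is elevation, U rock uplift relative to base level, A upstream drainage area, S channel slope, m and n positive exponents, and K(x,t) the erodibility, here varying in space and time as different lithologies are exposed. In this form, a drop in base level propagates upstream as a knickpoint whose celerity scales with K, which is why the arrangement of weak and strong units downstream of a point controls when the erosion signal arrives there.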

  8. Probabilistic reasoning for assembly-based 3D modeling

    KAUST Repository

    Chaudhuri, Siddhartha; Kalogerakis, Evangelos; Guibas, Leonidas; Koltun, Vladlen

    2011-01-01

    Assembly-based modeling is a promising approach to broadening the accessibility of 3D modeling. In assembly-based modeling, new models are assembled from shape components extracted from a database. A key challenge in assembly-based modeling

  9. Model Based Temporal Reasoning

    Science.gov (United States)

    Rabin, Marla J.; Spinrad, Paul R.; Fall, Thomas C.

    1988-03-01

    Systems that assess the real world must cope with evidence that is uncertain, ambiguous, and spread over time. Typically, the most important function of an assessment system is to identify when activities are occurring that are unusual or unanticipated. Model based temporal reasoning addresses both of these requirements. The differences among temporal reasoning schemes lie in the methods used to avoid computational intractability. If we had n pieces of data and we wanted to examine how they were related, the worst case would be where we had to examine every subset of these points to see if that subset satisfied the relations. This would be 2^n subsets, which is intractable. Models compress this: if several data points are all compatible with a model, then that model represents all those data points. Data points are then considered related if they lie within the same model or if they lie in models that are related. Models thus address the intractability problem. They also address the problem of determining unusual activities: if the data do not agree with models that are indicated by earlier data, then something out of the norm is taking place. The models can summarize what we know up to that time, so when they are not predicting correctly, either something unusual is happening or we need to revise our models. The model based reasoner developed at Advanced Decision Systems is thus both intuitive and powerful. It is currently being used on one operational system and several prototype systems. It has enough power to be used in domains spanning the spectrum from manufacturing engineering and project management to low-intensity conflict and strategic assessment.

  10. Photovoltaic Grid-Connected Modeling and Characterization Based on Experimental Results

    Science.gov (United States)

    Humada, Ali M.; Hojabri, Mojgan; Sulaiman, Mohd Herwan Bin; Hamada, Hussein M.; Ahmed, Mushtaq N.

    2016-01-01

    A grid-connected photovoltaic (PV) system operating under fluctuating weather conditions has been modeled and characterized based on a specific test bed. A mathematical model of a small-scale PV system has been developed mainly for residential usage, and the potential results have been simulated. The proposed PV model is based on three PV parameters: the photocurrent, IL; the reverse diode saturation current, Io; and the diode ideality factor, n. The accuracy of the proposed model and its parameters was evaluated against different benchmarks. The results showed that the proposed model fits the experimental results, including the I-V characteristic curve, with high accuracy compared to the other models. The results of this study can be considered valuable for the installation of grid-connected PV systems under fluctuating climatic conditions. PMID:27035575
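    A three-parameter single-diode model of the kind named above (IL, Io, n; series and shunt resistances neglected) is compact enough to sketch directly. The parameter values below are illustrative placeholders, not the test-bed values from the study.

```python
import numpy as np

k, q = 1.380649e-23, 1.602176634e-19   # Boltzmann constant, electron charge

def pv_current(V, I_L=8.0, I_o=1e-9, n=1.3, T=298.15, N_s=60):
    """Module current from the 3-parameter diode law I = IL - Io*(exp(V/Vt) - 1),
    for N_s series-connected cells at temperature T (kelvin)."""
    V_t = n * N_s * k * T / q          # thermal voltage of the cell stack
    return I_L - I_o * (np.exp(V / V_t) - 1.0)

V = np.linspace(0.0, 40.0, 5)          # sample points along the I-V curve
print(np.round(pv_current(V), 3))
```

    Sweeping V from zero to open circuit reproduces the I-V characteristic curve referred to in the abstract; fitting I_L, I_o and n to measured points is then a small least-squares problem.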

  11. Fostering Transfer of Study Strategies: A Spiral Model.

    Science.gov (United States)

    Davis, Denise M.; Clery, Carolsue

    1994-01-01

    Describes the design and implementation of a Spiral Model for the introduction and repeated practice of study strategies, based on Taba's model for social studies. In a college reading and studies strategies course, key strategies were introduced early and used through several sets of humanities and social and physical sciences readings. (Contains…

  12. Evaluation of adamantane hydroxamates as botulinum neurotoxin inhibitors: synthesis, crystallography, modeling, kinetic and cellular based studies.

    Science.gov (United States)

    Šilhár, Peter; Silvaggi, Nicholas R; Pellett, Sabine; Čapková, Kateřina; Johnson, Eric A; Allen, Karen N; Janda, Kim D

    2013-03-01

    Botulinum neurotoxins (BoNTs) are the most lethal biotoxins known to mankind and are responsible for the neuroparalytic disease botulism. Current treatments for botulinum poisoning are all protein based and thus have a limited window of treatment opportunity. Inhibition of the BoNT light chain protease (LC) has emerged as a therapeutic strategy for the treatment of botulism, as it may provide an effective post-exposure remedy. Using a combination of crystallographic and modeling studies, a series of hydroxamates derived from 1-adamantylacetohydroxamic acid (3a) was prepared. From this group of compounds, an improved potency of about 17-fold was observed for two derivatives. Detailed mechanistic studies on these structures revealed a competitive inhibition model, with K_i = 27 nM, which makes these compounds some of the most potent small-molecule, non-peptidic BoNT/A LC inhibitors reported to date. Copyright © 2012 Elsevier Ltd. All rights reserved.
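    For reference, the competitive-inhibition model to which such kinetic data are fit has the standard Michaelis-Menten form below; this is the generic textbook expression, not a formula quoted from the paper.

```latex
v \;=\; \frac{V_{\max}\,[S]}{K_m\left(1 + \dfrac{[I]}{K_i}\right) + [S]}, \qquad K_i = 27\ \mathrm{nM}
```

    A competitive inhibitor raises the apparent K_m without changing V_max, which is the signature used to discriminate this model in the mechanistic studies mentioned above.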

  13. Characteristics of the large corporation-based, bureaucratic model among oecd countries - an foi model analysis

    Directory of Open Access Journals (Sweden)

    Bartha Zoltán

    2014-03-01

    Full Text Available Deciding on the development path of the economy has been a delicate question in economic policy, not least because of trade-off effects which immediately worsen certain economic indicators as steps are taken to improve others. The aim of the paper is to present a framework that helps decide on such policy dilemmas. This framework is based on an analysis conducted among OECD countries with the FOI model (focusing on future, outside and inside potentials). Several development models can be deduced by this method, of which only the large corporation-based, bureaucratic model is discussed in detail. The large corporation-based, bureaucratic model implies a development strategy focused on the creation of domestic safe havens. Based on country studies, it is concluded that well-performing safe havens require the active participation of the state. We find that, in countries adhering to this model, business competitiveness is sustained through intensive public support and an active role taken by the government in education, research and development, in detecting and exploiting special market niches, and in encouraging sectorial cooperation.

  14. Teachers' Knowledge Base for Implementing Response-to-Intervention Models in Reading

    Science.gov (United States)

    Spear-Swerling, Louise; Cheesman, Elaine

    2012-01-01

    This study examined the knowledge base of 142 elementary-level educators for implementing response-to-intervention (RTI) models in reading. A questionnaire assessed participants' professional background for teaching reading, as well as their familiarity with specific assessments, research-based instructional models, and interventions potentially…

  15. Output-Feedback Model Predictive Control of a Pasteurization Pilot Plant based on an LPV model

    Science.gov (United States)

    Karimi Pour, Fatemeh; Ocampo-Martinez, Carlos; Puig, Vicenç

    2017-01-01

    This paper presents a model predictive control (MPC) strategy for a pasteurization pilot plant based on an LPV model. Since not all the states are measured, an observer is also designed, which allows implementing an output-feedback MPC scheme. However, the model of the plant is not completely observable when augmented with the disturbance models. In order to solve this problem, the following strategies are used: (i) the whole system is decoupled into two subsystems, and (ii) an inner state-feedback controller is implemented within the MPC control scheme. A real-time example based on the pasteurization pilot plant is simulated as a case study for testing the behavior of the approaches.

  16. A model-based meta-analysis of monoclonal antibody pharmacokinetics to guide optimal first-in-human study design

    Science.gov (United States)

    Davda, Jasmine P; Dodds, Michael G; Gibbs, Megan A; Wisdom, Wendy; Gibbs, John P

    2014-01-01

    The objectives of this retrospective analysis were (1) to characterize the population pharmacokinetics (popPK) of four different monoclonal antibodies (mAbs) in a combined analysis of individual data collected during first-in-human (FIH) studies and (2) to provide a scientific rationale for prospective design of FIH studies with mAbs. The data set was composed of 171 subjects contributing a total of 2716 mAb serum concentrations, following intravenous (IV) and subcutaneous (SC) doses. mAb PK was described by an open 2-compartment model with first-order elimination from the central compartment and a depot compartment with first-order absorption. Parameter values obtained from the popPK model were further used to generate optimal sampling times for a single dose study. A robust fit to the combined data from four mAbs was obtained using the 2-compartment model. Population parameter estimates for systemic clearance and central volume of distribution were 0.20 L/day and 3.6 L with intersubject variability of 31% and 34%, respectively. The random residual error was 14%. Differences (> 2-fold) in PK parameters were not apparent across mAbs. Rich designs (22 samples/subject), minimal designs for popPK (5 samples/subject), and optimal designs for non-compartmental analysis (NCA) and popPK (10 samples/subject) were examined by stochastic simulation and estimation. Single-dose PK studies for linear mAbs executed using the optimal designs are expected to yield high-quality model estimates, and accurate capture of NCA estimations. This model-based meta-analysis has determined typical popPK values for four mAbs with linear elimination and enabled prospective optimization of FIH study designs, potentially improving the efficiency of FIH studies for this class of therapeutics. PMID:24837591
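    The structural model described above is easy to prototype. Below is a minimal simulation sketch, assuming a single SC dose and using the reported typical clearance (0.20 L/day) and central volume (3.6 L); the absorption rate ka and the peripheral parameters Q and Vp are invented placeholders, not estimates from the analysis.

```python
import numpy as np
from scipy.integrate import solve_ivp

CL, Vc, Q, Vp, ka = 0.20, 3.6, 0.5, 3.0, 0.3   # L/day, L, L/day, L, 1/day

def rhs(t, y):
    """Depot -> central (first-order absorption), central <-> peripheral,
    first-order elimination from the central compartment."""
    depot, central, periph = y
    return [-ka * depot,
            ka * depot - (CL / Vc) * central - (Q / Vc) * central + (Q / Vp) * periph,
            (Q / Vc) * central - (Q / Vp) * periph]

dose_mg = 100.0                                 # hypothetical SC dose into the depot
sol = solve_ivp(rhs, (0, 56), [dose_mg, 0, 0], t_eval=np.linspace(0, 56, 8))
print(np.round(sol.y[1] / Vc, 2))               # serum concentration (mg/L) over 8 weeks
```

    Simulating candidate sampling schedules against such a model is essentially what the stochastic simulation-and-estimation exercise in the paper does at population scale.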

  17. How Model Can Help Inquiry--A Qualitative Study of Model Based Inquiry Learning (Mobile) in Engineering Education

    Science.gov (United States)

    Gong, Yu

    2017-01-01

    This study investigates how students can use "interactive example models" in inquiry activities to develop their conceptual knowledge about an engineering phenomenon like electromagnetic fields and waves. An interactive model, for example a computational model, could be used to develop and teach principles of dynamic complex systems, and…

  18. Model study on transesterification of soybean oil to biodiesel with methanol using solid base catalyst.

    Science.gov (United States)

    Liu, Xuejun; Piao, Xianglan; Wang, Yujun; Zhu, Shenlin

    2010-03-25

    Modeling the transesterification of vegetable oils to biodiesel using a solid base as a catalyst is very important because the mutual solubilities of oil and methanol increase with increasing biodiesel yield. The heterogeneous liquid-liquid-solid reaction system becomes a liquid-solid system when the biodiesel reaches a certain content. In this work, we adopted two-film theory and a steady-state approximation, and then established a heterogeneous liquid-liquid-solid model for the first stage. After the diffusion coefficients at the liquid-liquid interface and the liquid-solid interface were calculated on the basis of the properties of the system, the theoretical evolution of biodiesel productivity with time was obtained. The predicted values were very close to the experimental data, which indicated that the proposed models are suitable for the transesterification of soybean oil to biodiesel when solid bases are used as catalysts. The model also indicated that the transesterification reaction is controlled by both mass transfer and reaction, and that the total resistance decreases with increasing biodiesel yield in the liquid-liquid-solid stage. The solid base catalyst exhibited an activation energy range of 9-20 kcal/mol, which is consistent with the reported activation energy range of homogeneous catalysts.

  19. Team-Based Models for End-of-Life Care: An Evidence-Based Analysis

    Science.gov (United States)

    2014-01-01

    Background: End of life refers to the period when people are living with advanced illness that will not stabilize and from which they will not recover and will eventually die. It is not limited to the period immediately before death. Multiple services are required to support people and their families during this time period. The model of care used to deliver these services can affect the quality of the care they receive. Objectives: Our objective was to determine whether an optimal team-based model of care exists for service delivery at end of life. In systematically reviewing such models, we considered their core components: team membership, services offered, modes of patient contact, and setting. Data Sources: A literature search was performed on October 14, 2013, using Ovid MEDLINE, Ovid MEDLINE In-Process and Other Non-Indexed Citations, Ovid Embase, EBSCO Cumulative Index to Nursing & Allied Health Literature (CINAHL), and EBM Reviews, for studies published from January 1, 2000, to October 14, 2013. Review Methods: Abstracts were reviewed by a single reviewer and full-text articles were obtained that met the inclusion criteria. Studies were included if they evaluated a team model of care compared with usual care in an end-of-life adult population. A team was defined as having at least 2 health care disciplines represented. Studies were limited to English publications. A meta-analysis was completed to obtain pooled effect estimates where data permitted. The GRADE quality of the evidence was evaluated. Results: Our literature search located 10 randomized controlled trials which, among them, evaluated the following 6 team-based models of care: hospital, direct contact; home, direct contact; home, indirect contact; comprehensive, indirect contact; comprehensive, direct contact; and comprehensive, direct and early contact. Direct contact is when team members see the patient; indirect contact is when they advise another health care practitioner (e.g., a family doctor) who sees

  20. A model-based framework for design of intensified enzyme-based processes

    DEFF Research Database (Denmark)

    Román-Martinez, Alicia

    This thesis presents a generic and systematic model-based framework to design intensified enzyme-based processes. The development of the presented methodology was motivated by the needs of the bio-based industry for a more systematic approach to achieve intensification in its production plants... in enzyme-based processes, which have found significant application in the pharmaceutical, food, and renewable fuels sectors. The framework uses model-based strategies for (bio-)chemical process design and optimization, including the use of a superstructure to generate all potential reaction(s)-separation(s) options according to desired performance criteria, and a generic mathematical model represented by the superstructure to derive the specific models corresponding to a specific process option. In principle, three methods of intensification of bioprocesses are considered in this thesis: 1. enzymatic one...

  1. Model-based Acceleration Control of Turbofan Engines with a Hammerstein-Wiener Representation

    Science.gov (United States)

    Wang, Jiqiang; Ye, Zhifeng; Hu, Zhongzhi; Wu, Xin; Dimirovsky, Georgi; Yue, Hong

    2017-05-01

    Acceleration control of turbofan engines is conventionally designed through either a schedule-based or an acceleration-based approach. With the widespread acceptance of model-based design in the aviation industry, it becomes necessary to investigate the issues associated with model-based design for acceleration control. In this paper, the challenges of implementing model-based acceleration control are explained; a novel Hammerstein-Wiener representation of engine models is introduced; and, based on the Hammerstein-Wiener model, a nonlinear generalized minimum variance type of optimal control law is derived. The feature of the proposed approach is that it does not require the inversion operation that usually upsets nonlinear control techniques. The effectiveness of the proposed control design method is validated through a detailed numerical study.
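    The Hammerstein-Wiener structure itself is simple: a static input nonlinearity, a linear dynamic block, and a static output nonlinearity in series. The sketch below illustrates the structure only; the nonlinearities and first-order dynamics are invented placeholders, not the engine model from the paper.

```python
import numpy as np

def f(u):            # static input nonlinearity (e.g., an actuator-like saturation)
    return np.tanh(u)

def g(x):            # static output nonlinearity (e.g., a spool-speed map)
    return x + 0.1 * x**3

def simulate(u_seq, a=0.9, b=0.1):
    """Hammerstein-Wiener chain: y[k] = g(x[k]), x[k+1] = a*x[k] + b*f(u[k])."""
    x, y_seq = 0.0, []
    for u in u_seq:
        x = a * x + b * f(u)
        y_seq.append(g(x))
    return np.array(y_seq)

print(np.round(simulate(np.ones(10)), 3))   # step response of the toy HW model
```

    Because the nonlinearities are static and sit outside the linear dynamics, a controller can be designed against the linear core, which is what makes the inversion-free control law mentioned above attractive.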

  2. Semantic reasoning with XML-based biomedical information models.

    Science.gov (United States)

    O'Connor, Martin J; Das, Amar

    2010-01-01

    The Extensible Markup Language (XML) is increasingly being used for biomedical data exchange. The parallel growth in the use of ontologies in biomedicine presents opportunities for combining the two technologies to leverage the semantic reasoning services provided by ontology-based tools. There are currently no standardized approaches for taking XML-encoded biomedical information models and representing and reasoning with them using ontologies. To address this shortcoming, we have developed a workflow and a suite of tools for transforming XML-based information models into domain ontologies encoded using OWL. In this study, we applied semantic reasoning methods to these ontologies to automatically generate domain-level inferences, and successfully used them for information models in the HIV and radiological image domains.

  3. Impact of implementation choices on quantitative predictions of cell-based computational models

    Science.gov (United States)

    Kursawe, Jochen; Baker, Ruth E.; Fletcher, Alexander G.

    2017-09-01

    'Cell-based' models provide a powerful computational tool for studying the mechanisms underlying the growth and dynamics of biological tissues in health and disease. An increasing amount of quantitative data with cellular resolution has paved the way for the quantitative parameterisation and validation of such models. However, the numerical implementation of cell-based models remains challenging, and little work has been done to understand to what extent implementation choices may influence model predictions. Here, we consider the numerical implementation of a popular class of cell-based models called vertex models, which are often used to study epithelial tissues. In two-dimensional vertex models, a tissue is approximated as a tessellation of polygons and the vertices of these polygons move due to mechanical forces originating from the cells. Such models have been used extensively to study the mechanical regulation of tissue topology in the literature. Here, we analyse how the model predictions may be affected by numerical parameters, such as the size of the time step, and non-physical model parameters, such as length thresholds for cell rearrangement. We find that vertex positions and summary statistics are sensitive to several of these implementation parameters. For example, the predicted tissue size decreases with decreasing cell cycle durations, and cell rearrangement may be suppressed by large time steps. These findings are counter-intuitive and illustrate that model predictions need to be thoroughly analysed and implementation details carefully considered when applying cell-based computational models in a quantitative setting.

  4. Best Practices in Academic Management. Study Programs Classification Model

    Directory of Open Access Journals (Sweden)

    Ofelia Ema Aleca

    2016-05-01

    Full Text Available This article proposes and tests a set of performance indicators for the assessment of Bachelor and Master studies, from two perspectives: the study programs and the disciplines. Academic performance at the level of a study program is calculated based on success and efficiency rates, and at the discipline level on the basis of rates of efficiency, success and absenteeism. This research proposes a model for the classification of the study programs within a Bachelor and Master cycle based on educational performance and efficiency. What recommends this model as a best-practice model in academic management is the possibility of grouping a study program or a discipline in a particular category of efficiency.

  5. Constraints based analysis of extended cybernetic models.

    Science.gov (United States)

    Mandli, Aravinda R; Venkatesh, Kareenhalli V; Modak, Jayant M

    2015-11-01

    The cybernetic modeling framework provides an interesting approach to model the regulatory phenomena occurring in microorganisms. In the present work, we adopt a constraints-based approach to analyze the nonlinear behavior of the extended equations of the cybernetic model. We first show that the cybernetic model exhibits linear growth behavior under the constraint of no resource allocation for the induction of the key enzyme. We then quantify the maximum achievable specific growth rate of microorganisms on mixtures of substitutable substrates under various kinds of regulation and show its use in gaining an understanding of the regulatory strategies of microorganisms. Finally, we show that Saccharomyces cerevisiae exhibits suboptimal dynamic growth with a long diauxic lag phase when growing on a mixture of glucose and galactose and discuss its potential to achieve optimal growth with a significantly reduced diauxic lag period. The analysis carried out in the present study illustrates the utility of adopting a constraints-based approach to understand the dynamic growth strategies of microorganisms. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  6. A framework for modeling scenario-based barrier island storm impacts

    Science.gov (United States)

    Mickey, Rangley; Long, Joseph W.; Dalyander, P. Soupy; Plant, Nathaniel G.; Thompson, David M.

    2018-01-01

    Methods for investigating the vulnerability of existing or proposed coastal features to storm impacts often rely on simplified parametric models or one-dimensional process-based modeling studies that focus on changes to a profile across a dune or barrier island. These simple studies tend to neglect the impacts to curvilinear or alongshore varying island planforms, influence of non-uniform nearshore hydrodynamics and sediment transport, irregular morphology of the offshore bathymetry, and impacts from low magnitude wave events (e.g. cold fronts). Presented here is a framework for simulating regionally specific, low and high magnitude scenario-based storm impacts to assess the alongshore variable vulnerabilities of a coastal feature. Storm scenarios based on historic hydrodynamic conditions were derived and simulated using the process-based morphologic evolution model XBeach. Model results show that the scenarios predicted similar patterns of erosion and overwash when compared to observed qualitative morphologic changes from recent storm events that were not included in the dataset used to build the scenarios. The framework model simulations were capable of predicting specific areas of vulnerability in the existing feature and the results illustrate how this storm vulnerability simulation framework could be used as a tool to help inform the decision-making process for scientists, engineers, and stakeholders involved in coastal zone management or restoration projects.

  7. Nursing students learning the pharmacology of diabetes mellitus with complexity-based computerized models: A quasi-experimental study.

    Science.gov (United States)

    Dubovi, Ilana; Dagan, Efrat; Sader Mazbar, Ola; Nassar, Laila; Levy, Sharona T

    2018-02-01

    Pharmacology is a crucial component of medication administration in nursing, yet nursing students generally find it difficult and self-rate their pharmacology skills as low. The aim was to evaluate nursing students learning pharmacology with the Pharmacology Inter-Leaved Learning-Cells environment, a novel approach to modeling biochemical interactions using a multiscale, computer-based model with a complexity perspective based on a small set of entities and simple rules. This environment represents molecules, organelles and cells to enhance the understanding of cellular processes, and combines these cells at a higher scale to obtain whole-body interactions. In a quasi-experimental pre- and post-test design, sophomore nursing students learned the pharmacology of diabetes mellitus either with the Pharmacology Inter-Leaved Learning-Cells environment (experimental group; n=94) or via a lecture-based curriculum (comparison group; n=54). The Pharmacology-Diabetes-Mellitus questionnaire and the course's final exam were used to evaluate students' knowledge of the pharmacology of diabetes mellitus. Conceptual learning was significantly higher for the experimental than for the comparison group on the course final exam scores (unpaired t = -3.8). Learning with complexity-based computerized models is highly effective and enhances the understanding of moving between micro and macro levels of biochemical phenomena, which in turn is related to better understanding of medication actions. Moreover, the Pharmacology Inter-Leaved Learning-Cells approach provides a more general reasoning scheme for biochemical processes, which enhances pharmacology learning beyond the specific topic learned. The present study implies that a deeper understanding of pharmacology will support nursing students' clinical decisions and empower their proficiency in medication administration. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. A novel unified dislocation density-based model for hot deformation behavior of a nickel-based superalloy under dynamic recrystallization conditions

    International Nuclear Information System (INIS)

    Lin, Y.C.; Wen, Dong-Xu; Chen, Xiao-Min; Chen, Ming-Song

    2016-01-01

    In this study, a novel unified dislocation density-based model is presented for characterizing the hot deformation behavior of a nickel-based superalloy under dynamic recrystallization (DRX) conditions. In the Kocks-Mecking model, a new softening term is proposed to represent the impact of DRX behavior on dislocation density evolution. The grain size evolution and DRX kinetics are incorporated into the developed model. Material parameters of the developed model are calibrated by a derivative-free method in MATLAB software. Comparisons between experimental and predicted results confirm that the developed unified dislocation density-based model can nicely reproduce the hot deformation behavior, DRX kinetics, and grain size evolution over a wide range of initial grain sizes, strain rates, and deformation temperatures. Moreover, the developed unified dislocation density-based model is well suited to analyzing the time-variant forming processes of the studied superalloy. (orig.)
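    A hedged sketch of the model family described: the classical Kocks-Mecking evolution law augmented with an additional DRX softening term. The exact functional form of the new term is the paper's contribution and is not reproduced here; the expression below only names its place in the balance.

```latex
\frac{d\rho}{d\varepsilon} \;=\;
\underbrace{k_1\sqrt{\rho}}_{\text{storage}}
\;-\; \underbrace{k_2\,\rho}_{\text{dynamic recovery}}
\;-\; \underbrace{f_{\mathrm{DRX}}\!\left(\rho,\,X_{\mathrm{DRX}}\right)}_{\text{softening by DRX}}
```

    Here ρ is the dislocation density, ε the strain, k1 and k2 the storage and recovery coefficients, and X_DRX the recrystallized fraction that couples the flow-stress model to the DRX kinetics and grain size evolution mentioned above.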

  9. A novel unified dislocation density-based model for hot deformation behavior of a nickel-based superalloy under dynamic recrystallization conditions

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Y.C. [Central South University, School of Mechanical and Electrical Engineering, Changsha (China); Light Alloy Research Institute of Central South University, Changsha (China); State Key Laboratory of High Performance Complex Manufacturing, Changsha (China); Wen, Dong-Xu; Chen, Xiao-Min [Central South University, School of Mechanical and Electrical Engineering, Changsha (China); Chen, Ming-Song [Central South University, School of Mechanical and Electrical Engineering, Changsha (China); State Key Laboratory of High Performance Complex Manufacturing, Changsha (China)

    2016-09-15

    In this study, a novel unified dislocation density-based model is presented for characterizing the hot deformation behavior of a nickel-based superalloy under dynamic recrystallization (DRX) conditions. In the Kocks-Mecking model, a new softening term is proposed to represent the impact of DRX behavior on dislocation density evolution. The grain size evolution and DRX kinetics are incorporated into the developed model. Material parameters of the developed model are calibrated by a derivative-free method in MATLAB software. Comparisons between experimental and predicted results confirm that the developed unified dislocation density-based model can nicely reproduce the hot deformation behavior, DRX kinetics, and grain size evolution over a wide range of initial grain sizes, strain rates, and deformation temperatures. Moreover, the developed unified dislocation density-based model is well suited to analyzing the time-variant forming processes of the studied superalloy. (orig.)

  10. Simulation study of a magnetocardiogram based on a virtual heart model: effect of a cardiac equivalent source and a volume conductor

    International Nuclear Information System (INIS)

    Shou Guo-Fa; Xia Ling; Dai Ling; Ma Ping; Tang Fa-Kuan

    2011-01-01

    In this paper, we present a magnetocardiogram (MCG) simulation study using the boundary element method (BEM) and based on the virtual heart model and the realistic human volume conductor model. The different contributions of cardiac equivalent source models and volume conductor models to the MCG are deeply and comprehensively investigated. The single dipole source model, the multiple dipoles source model and the equivalent double layer (EDL) source model are analysed and compared with the cardiac equivalent source models. Meanwhile, the effect of the volume conductor model on the MCG combined with these cardiac equivalent sources is investigated. The simulation results demonstrate that the cardiac electrophysiological information will be partly missed when only the single dipole source is taken, while the EDL source is a good option for MCG simulation and the effect of the volume conductor is smallest for the EDL source. Therefore, the EDL source is suitable for the study of MCG forward and inverse problems, and more attention should be paid to it in future MCG studies. (general)

  11. Problem solving based learning model with multiple representations to improve student's mental modelling ability on physics

    Science.gov (United States)

    Haili, Hasnawati; Maknun, Johar; Siahaan, Parsaoran

    2017-08-01

    Physics is a lesson related to students' daily experience. Therefore, before students study it formally in class, they already have a visualization and prior knowledge of natural phenomena and can widen it themselves. The learning process in class should aim to detect, process, construct, and use students' mental models, so that students' mental models agree with and build on the right concepts. A previous study conducted in MAN 1 Muna shows that in the learning process the teacher did not pay attention to students' mental models; as a consequence, the learning process had not tried to build students' mental modelling ability (MMA). The purpose of this study is to describe the improvement of students' MMA as an effect of a problem-solving based learning model with a multiple representations approach. This study is a pre-experimental design with one group pre-post test, conducted in XI IPA MAN 1 Muna 2016/2017. Data collection uses a problem-solving test on the concept of the kinetic theory of gases and interviews to assess students' MMA. The result of this study is a classification of students' MMA into 3 categories: High Mental Modelling Ability (H-MMA) for scores x > 7, Medium Mental Modelling Ability (M-MMA) for 3 < x ≤ 7, and Low Mental Modelling Ability (L-MMA) for 0 ≤ x ≤ 3. The result shows that a problem-solving based learning model with a multiple representations approach can be an alternative to be applied in improving students' MMA.

  12. A Mixing Based Model for DME Combustion in Diesel Engines

    DEFF Research Database (Denmark)

    Bek, Bjarne H.; Sorenson, Spencer C.

    1998-01-01

    A series of studies has been conducted investigating the behavior of di-methyl ether (DME) fuel jets injected into quiescent combustion chambers. These studies have shown that it is possible to make a good estimate of the penetration of the jet based on existing correlations for diesel fuel, by using appropriate fuel properties. The results of the spray studies have been incorporated into a first generation model for DME combustion. The model is entirely based on physical mixing, where chemical processes have been assumed to be very fast in relation to mixing. The assumption was made

  13. A mixing based model for DME combustion in diesel engines

    DEFF Research Database (Denmark)

    Bek, Bjarne Hjort; Sorenson, Spencer C

    2001-01-01

    A series of studies has been conducted investigating the behavior of di-methyl ether (DME) fuel jets injected into quiescent combustion chambers. These studies have shown that it is possible to make a good estimate of the penetration of the jet based on existing correlations for diesel fuel, by using appropriate fuel properties. The results of the spray studies have been incorporated into a first generation model for DME combustion. The model is entirely based on physical mixing, where chemical processes have been assumed to be very fast in relation to mixing. The assumption was made

  14. Model-based verification and validation of the SMAP uplink processes

    Science.gov (United States)

    Khan, M. O.; Dubos, G. F.; Tirona, J.; Standley, S.

    Model-Based Systems Engineering (MBSE) is being used increasingly within the spacecraft design community because of its benefits when compared to document-based approaches. As the complexity of projects expands dramatically with continually increasing computational power and technology infusion, the time and effort needed for verification and validation (V&V) increases geometrically. Using simulation to perform design validation with system-level models earlier in the life cycle stands to bridge the gap between design of the system (based on system-level requirements) and verifying those requirements/validating the system as a whole. This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based development efforts.

  15. Banking Crisis Early Warning Model based on a Bayesian Model Averaging Approach

    Directory of Open Access Journals (Sweden)

    Taha Zaghdoudi

    2016-08-01

    Full Text Available The succession of banking crises, most of which have resulted in huge economic and financial losses, has prompted several authors to study their determinants. These authors constructed early warning models to anticipate their occurrence. It is in this same vein that our study takes its inspiration. In particular, we have developed an early warning model of banking crises based on a Bayesian model averaging approach. The results of this approach have allowed us to identify the involvement of declining bank profitability, deterioration of the competitiveness of traditional intermediation, banking concentration and higher real interest rates in triggering banking crises.

  16. Admission rates in a general practitioner-based versus a hospital specialist based, hospital-at-home model

    DEFF Research Database (Denmark)

    Mogensen, Christian Backer; Ankersen, Ejnar Skytte; Lindberg, Mats J

    2018-01-01

    BACKGROUND: Hospital at home (HaH) is an alternative to acute admission for elderly patients. It is unclear if they should be cared for primarily by a hospital intern specialist or by the patient's own general practitioner (GP). The study assessed whether a GP based model was more effective than... Denmark, including 65+ year old patients with an acute medical condition that required acute hospital in-patient care. The patients were randomly assigned to the hospital specialist based model or the GP model of HaH care. Five physical and cognitive performance tests were performed at inclusion and after 7... CONCLUSIONS: The GP based HaH model was more effective than the hospital specialist model in avoiding hospital admissions within 7 days among elderly patients with an acute medical condition, with no differences in mental or physical recovery rates or deaths between the two models. REGISTRATION: No. NCT...

  17. Variable cycle control model for intersection based on multi-source information

    Science.gov (United States)

    Sun, Zhi-Yuan; Li, Yue; Qu, Wen-Cong; Chen, Yan-Yan

    2018-05-01

    In order to improve the efficiency of traffic control systems in the era of big data, a new variable cycle control model based on multi-source information is presented for intersections in this paper. Firstly, with consideration of multi-source information, a unified framework based on a cyber-physical system is proposed. Secondly, taking into account the variable length of cells, the hysteresis phenomenon of traffic flow and the characteristics of lane groups, a Lane-group-based Cell Transmission Model is established to describe the physical properties of traffic flow under different traffic signal control schemes. Thirdly, the variable cycle control problem is abstracted into a bi-level programming model: the upper-level model is put forward for cycle length optimization considering traffic capacity and delay, while the lower-level model is a dynamic signal control decision model based on fairness analysis. A Hybrid Intelligent Optimization Algorithm is then proposed to solve the model. Finally, a case study shows the efficiency and applicability of the proposed model and algorithm.
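    The physical layer the paper builds on is a Cell Transmission Model. Below is a minimal sketch of a textbook (Daganzo-style) CTM update step, not the paper's extended lane-group formulation; cell sizes, capacities and speeds are unitless demo numbers.

```python
import numpy as np

def ctm_step(n, n_max, q_max, v=1.0, w=0.5):
    """One CTM time step: interface flow = min(upstream demand, downstream supply)."""
    send = np.minimum(v * n, q_max)             # what each cell can send
    recv = np.minimum(q_max, w * (n_max - n))   # what each cell can receive
    flow = np.minimum(send[:-1], recv[1:])      # flows across cell boundaries
    n_new = n.copy()
    n_new[:-1] -= flow                          # vehicles leave upstream cells
    n_new[1:] += flow                           # and enter downstream cells
    return n_new

n = np.array([10.0, 4.0, 0.0, 0.0])             # initial cell occupancies
for _ in range(3):
    n = ctm_step(n, n_max=np.full(4, 12.0), q_max=np.full(4, 3.0))
print(np.round(n, 2))
```

    In the signalized setting, the receiving capacity of the cell behind a red light is set to zero during that phase, which is how a signal control scheme enters the model.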

  18. Pricing Mining Concessions Based on Combined Multinomial Pricing Model

    Directory of Open Access Journals (Sweden)

    Chang Xiao

    2017-01-01

    Full Text Available A combined multinomial pricing model is proposed for pricing mining concession in which the annualized volatility of the price of mineral products follows a multinomial distribution. First, a combined multinomial pricing model is proposed which consists of binomial pricing models calculated according to different volatility values. Second, a method is provided to calculate the annualized volatility and the distribution. Third, the value of convenience yields is calculated based on the relationship between the futures price and the spot price. The notion of convenience yields is used to adjust our model as well. Based on an empirical study of a Chinese copper mine concession, we verify that our model is easy to use and better than the model with constant volatility when considering the changing annualized volatility of the price of the mineral product.
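    The combination idea - price with a standard binomial tree at each candidate volatility, then weight the results by the estimated probability of that volatility state - can be sketched as follows. The volatility states, weights and contract terms below are invented for illustration; a real mining concession would be valued as a real option on the mineral price, adjusted for the convenience yield discussed above.

```python
import math

def crr_call(S, K, r, sigma, T, steps=200):
    """European call via the Cox-Ross-Rubinstein binomial tree."""
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))         # up factor
    d = 1.0 / u                                 # down factor
    p = (math.exp(r * dt) - d) / (u - d)        # risk-neutral up probability
    disc = math.exp(-r * dt)
    values = [max(S * u**j * d**(steps - j) - K, 0.0) for j in range(steps + 1)]
    for _ in range(steps):                      # backward induction
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

vol_dist = {0.20: 0.3, 0.30: 0.5, 0.45: 0.2}    # multinomial volatility states
price = sum(w * crr_call(S=100, K=95, r=0.03, sigma=s, T=1.0)
            for s, w in vol_dist.items())
print(round(price, 2))                          # probability-weighted value
```

    The weighted sum over volatility states is what distinguishes the combined model from a single constant-volatility tree.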

  19. FEM BASED PARAMETRIC DESIGN STUDY OF TIRE PROFILE USING DEDICATED CAD MODEL AND TRANSLATION CODE

    Directory of Open Access Journals (Sweden)

    Nikola Korunović

    2014-12-01

    Full Text Available In this paper a finite element method (FEM) based parametric design study of the tire profile shape and belt width is presented. One of the main obstacles that similar studies have faced is how to change the finite element mesh after a modification of the tire geometry is performed. In order to overcome this problem, a new approach is proposed. It implies automatic update of the finite element mesh, which follows the change of geometric design parameters on a dedicated CAD model. The mesh update is facilitated by an originally developed mapping and translation code. In this way, the performance of a large number of geometrically different tire design variations may be analyzed in a very short time. Although only a pilot study, the presented work has also led to improvement of the existing tire design.

  20. Using Model Replication to Improve the Reliability of Agent-Based Models

    Science.gov (United States)

    Zhong, Wei; Kim, Yushim

    The basic presupposition of model replication activities for a computational model such as an agent-based model (ABM) is that, as a robust and reliable tool, it must be replicable in other computing settings. This assumption has recently gained attention in the artificial society and simulation community due to the challenges of model verification and validation. Illustrating the replication of an ABM representing fraudulent behavior in a public service delivery system, originally developed in the Java-based MASON toolkit and reimplemented in NetLogo by a different author, this paper exemplifies how model replication exercises provide unique opportunities for the model verification and validation process. At the same time, replication helps accumulate best practices and patterns of model replication and contributes to the agenda of developing a standard methodological protocol for agent-based social simulation.

  1. An artificial neural network prediction model of congenital heart disease based on risk factors: A hospital-based case-control study.

    Science.gov (United States)

    Li, Huixia; Luo, Miyang; Zheng, Jianfei; Luo, Jiayou; Zeng, Rong; Feng, Na; Du, Qiyun; Fang, Junqun

    2017-02-01

    An artificial neural network (ANN) model was developed to predict the risks of congenital heart disease (CHD) in pregnant women. This hospital-based case-control study involved 119 CHD cases and 239 controls, all recruited from birth defect surveillance hospitals in Hunan Province between July 2013 and June 2014. All subjects were interviewed face-to-face to fill in a questionnaire that covered 36 CHD-related variables. The 358 subjects were randomly divided into a training set and a testing set at a ratio of 85:15. The training set was used to identify the significant predictors of CHD by univariate logistic regression analyses and to develop a standard feed-forward back-propagation neural network (BPNN) model for the prediction of CHD. The testing set was used to test and evaluate the performance of the ANN model. Univariate logistic regression analyses were performed in SPSS 18.0; the ANN models were developed in Matlab 7.1. The univariate logistic regression identified 15 predictors that were significantly associated with CHD, including education level (odds ratio = 0.55), gravidity (1.95), parity (2.01), history of abnormal reproduction (2.49), family history of CHD (5.23), maternal chronic disease (4.19), maternal upper respiratory tract infection (2.08), environmental pollution around the maternal dwelling place (3.63), maternal exposure to occupational hazards (3.53), maternal mental stress (2.48), paternal chronic disease (4.87), paternal exposure to occupational hazards (2.51), intake of vegetables/fruit (0.45), intake of fish/shrimp/meat/eggs (0.59), and intake of milk/soymilk (0.55). After many trials, we selected a 3-layer BPNN model with 15, 12, and 1 neurons in the input, hidden, and output layers, respectively, as the best prediction model. The prediction model has accuracies of 0.91 and 0.86 on the training and testing sets, respectively. The sensitivity, specificity, and Youden index on the testing set (training set) are 0.78 (0.83), 0.90 (0.95), and 0…
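
    The network shape reported above (15 inputs, 12 hidden units, 1 output) is easy to reproduce; the sketch below uses scikit-learn with random stand-in data, since the Hunan case-control data are not available here.

```python
# A hedged re-creation of the reported network shape only: a 15-12-1
# feed-forward net trained by back-propagation on synthetic data.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(358, 15))        # 15 significant predictors
y = rng.integers(0, 2, size=358)      # CHD case (1) vs control (0)

# 85:15 split, as in the study
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.15,
                                          random_state=0)

model = MLPClassifier(hidden_layer_sizes=(12,), activation="logistic",
                      solver="adam", max_iter=2000, random_state=0)
model.fit(X_tr, y_tr)
print("train accuracy:", model.score(X_tr, y_tr))
print("test accuracy:", model.score(X_te, y_te))
```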

  2. An Agent-Based Approach to Modeling Online Social Influence

    NARCIS (Netherlands)

    Maanen, P.P. van; Vecht, B. van der

    2013-01-01

    The aim of this study is to better understand social influence in online social media. Therefore, we propose a method in which we implement, validate and improve an individual behavior model. The behavior model is based on three fundamental behavioral principles of social influence from the…

  3. A Novel Modeling Method for Aircraft Engine Using Nonlinear Autoregressive Exogenous (NARX) Models Based on Wavelet Neural Networks

    Science.gov (United States)

    Yu, Bing; Shu, Wenjun; Cao, Can

    2018-05-01

    A novel modeling method for aircraft engines using nonlinear autoregressive exogenous (NARX) models based on wavelet neural networks is proposed. The identification principle and process based on wavelet neural networks are studied, and a modeling scheme based on NARX is proposed. Then, time series data sets from three types of aircraft engines are used to build the corresponding NARX models, and these NARX models are validated by simulation. The results show that all the best NARX models capture the original aircraft engine's dynamic characteristics well, with high accuracy: for every type of engine, the relative identification errors of its best NARX model against the component-level model are no more than 3.5%, and most of them are within 1%.
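
    The NARX structure itself is simple to demonstrate: the next output is regressed on lagged outputs and lagged exogenous inputs. In this sketch an ordinary MLP stands in for the paper's wavelet neural network, and the "engine" is a toy nonlinear difference equation rather than real engine data.

```python
# Sketch of the NARX input structure: y(t) = f(y(t-1..t-ny), u(t-1..t-nu)).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
T = 1000
u = rng.uniform(-1, 1, T)            # exogenous input (e.g. fuel flow)
y = np.zeros(T)
for t in range(2, T):                # toy nonlinear plant, not an engine
    y[t] = 0.6 * y[t - 1] - 0.2 * y[t - 2] + 0.5 * np.tanh(u[t - 1])

ny, nu = 2, 2                        # output and input lag orders
rows = range(max(ny, nu), T)
X = np.array([np.r_[y[t - ny:t], u[t - nu:t]] for t in rows])
target = y[max(ny, nu):]

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                     random_state=0)
model.fit(X[:800], target[:800])
pred = model.predict(X[800:])
print("relative error: %.3f" % (np.linalg.norm(pred - target[800:]) /
                                np.linalg.norm(target[800:])))
```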

  4. Introducing Waqf Based Takaful Model in India

    Directory of Open Access Journals (Sweden)

    Syed Ahmed Salman

    2014-03-01

    Full Text Available Objective – Waqf is a unique feature of the socioeconomic system of Islam in a multi-religious and developing country like India. India is rich in waqf assets, and the history of waqf in India can be traced back some 800 years. Many researchers suggest ways in which waqf can be used as a tool to mitigate the poverty of Muslims. India has the third largest Muslim population after Indonesia and Pakistan; however, the majority of Muslims belong to the low-income group and are in need of help. It is believed that waqf can be utilized for the betterment of the Indian Muslim community. Among the available uses of waqf assets, the main objective of this paper is to introduce a waqf-based takaful model in India and to highlight how the proposed model can be adopted there. Methods – Library research is applied, since this paper relies on secondary data gathered by thoroughly reviewing the most relevant literature. Result – India, as a country rich in waqf assets, should fully utilize these resources to help Muslims through takaful. Conclusion – In this study, we propose a waqf-based takaful model for India combining the concepts of mudarabah and wakalah. We recommend this model based on the background and situation of the country. Since we have not tested the viability of this model in India, future research should continue with this testing. Keywords: Waqf, Takaful, Poverty and India

  5. Polynomial fuzzy model-based approach for underactuated surface vessels

    DEFF Research Database (Denmark)

    Khooban, Mohammad Hassan; Vafamand, Navid; Dragicevic, Tomislav

    2018-01-01

    The main goal of this study is to introduce a new polynomial fuzzy model-based structure for a class of marine systems with non-linear and polynomial dynamics. The suggested technique relies on polynomial Takagi–Sugeno (T–S) fuzzy modelling, a polynomial dynamic parallel distributed compensation… surface vessel (USV). Additionally, in order to overcome the USV control challenges, including un-modelled dynamics, complex nonlinear dynamics, external disturbances and parameter uncertainties, the polynomial fuzzy model representation is adopted. Moreover, the USV-based control structure… and a sum-of-squares (SOS) decomposition. The newly proposed approach is a generalisation of standard T–S fuzzy models and linear matrix inequalities, which indicates its effectiveness in decreasing the tracking time and increasing the efficiency of robust tracking control for an underactuated…

  6. Business Models for NFC based mobile payments

    Directory of Open Access Journals (Sweden)

    Johannes Sang Un Chae

    2015-01-01

    Full Text Available Purpose: The purpose of the paper is to develop a business model framework for NFC based mobile payment solutions consisting of four mutually interdependent components: the value service, value network, value architecture, and value finance. Design: Using a comparative case study method, the paper investigates Google Wallet and ISIS Mobile Wallet and their underlying business models. Findings: Google Wallet and ISIS Mobile Wallet are focusing on providing an enhanced customer experience with their mobile wallet through a multifaceted value proposition. The delivery of its offering requires cooperation from multiple stakeholders and the creation of an ecosystem. Furthermore, they focus on the scalability of their value propositions. Originality / value: The paper offers an applicable business model framework that allows practitioners and academics to study current and future mobile payment approaches.

  7. Business Models for NFC Based Mobile Payments

    DEFF Research Database (Denmark)

    Chae, Johannes Sang-Un; Hedman, Jonas

    2015-01-01

    Purpose: The purpose of the paper is to develop a business model framework for NFC based mobile payment solutions consisting of four mutually interdependent components: the value service, value network, value architecture, and value finance. Design: Using a comparative case study method, the paper investigates Google Wallet and ISIS Mobile Wallet and their underlying business models. Findings: Google Wallet and ISIS Mobile Wallet are focusing on providing an enhanced customer experience with their mobile wallet through a multifaceted value proposition. The delivery of its offering requires cooperation from multiple stakeholders and the creation of an ecosystem. Furthermore, they focus on the scalability of their value propositions. Originality / value: The paper offers an applicable business model framework that allows practitioners and academics to study current and future mobile payment approaches.

  8. A burnout prediction model based around char morphology

    Energy Technology Data Exchange (ETDEWEB)

    Tao Wu; Edward Lester; Michael Cloke [University of Nottingham, Nottingham (United Kingdom). School of Chemical, Environmental and Mining Engineering

    2006-05-15

    Several combustion models have been developed that can make predictions about coal burnout and burnout potential. Most of these kinetic models require standard parameters such as volatile content and particle size to make a burnout prediction. This article presents a new model called the char burnout (ChB) model, which also uses detailed information about char morphology in its prediction. The input data to the model is based on information derived from two different image analysis techniques. One technique generates characterization data from real char samples, and the other predicts char types based on characterization data from image analysis of coal particles. The pyrolyzed chars in this study were created in a drop tube furnace operating at 1300 °C, 200 ms, and 1% oxygen. Modeling results were compared with a different carbon burnout kinetic model as well as the actual burnout data from refiring the same chars in a drop tube furnace operating at 1300 °C, 5% oxygen, and residence times of 200, 400, and 600 ms. A good agreement between the ChB model and experimental data indicates that the inclusion of char morphology in combustion models could well improve model predictions. 38 refs., 5 figs., 6 tabs.

  9. A physiologically based nonhomogeneous Poisson counter model of visual identification

    DEFF Research Database (Denmark)

    Christensen, Jeppe H; Markussen, Bo; Bundesen, Claus

    2018-01-01

    A physiologically based nonhomogeneous Poisson counter model of visual identification is presented. The model was developed in the framework of a Theory of Visual Attention (Bundesen, 1990; Kyllingsbæk, Markussen, & Bundesen, 2012) and is meant for modeling visual identification of objects that are… that mimicked the dynamics of receptive field selectivity as found in neurophysiological studies. Furthermore, the initial sensory response yielded theoretical hazard rate functions that closely resembled empirically estimated ones. Finally, supplied with a Naka-Rushton type contrast gain control, the model…

  10. Intelligent-based Structural Damage Detection Model

    International Nuclear Information System (INIS)

    Lee, Eric Wai Ming; Yu, K.F.

    2010-01-01

    This paper presents the application of a novel Artificial Neural Network (ANN) model for the diagnosis of structural damage. The ANN model, denoted as the GRNNFA, is a hybrid model combining the General Regression Neural Network Model (GRNN) and the Fuzzy ART (FA) model. It not only retains the important features of the GRNN and FA models (i.e. fast and stable network training and incremental growth of network structure) but also facilitates the removal of the noise embedded in the training samples. Structural damage alters the stiffness distribution of the structure and so changes the natural frequencies and mode shapes of the system. The measured modal parameter changes due to a particular damage are treated as patterns for that damage. The proposed GRNNFA model was trained to learn those patterns in order to detect the possible damage location of the structure. Simulated data were employed to verify and illustrate the procedures of the proposed ANN-based damage diagnosis methodology. The results of this study have demonstrated the feasibility of applying the GRNNFA model to structural damage diagnosis even when the training samples were noise contaminated.
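
    The GRNN half of the hybrid is a kernel-weighted average and can be sketched in a few lines; the Fuzzy ART clustering and noise-removal stages of the GRNNFA are omitted here, and the modal-parameter data are random stand-ins for measured frequency and mode-shape changes.

```python
# Sketch of a plain General Regression Neural Network (GRNN): the
# prediction is a Gaussian-kernel-weighted average of training targets.
import numpy as np

def grnn_predict(X_train, y_train, x, sigma=0.3):
    d2 = np.sum((X_train - x) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * sigma**2))        # Gaussian kernel weights
    return float(w @ y_train / w.sum())

rng = np.random.default_rng(9)
X = rng.normal(size=(100, 5))                 # changes in 5 modal parameters
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(float)  # toy damage label

x_new = rng.normal(size=5)
print("damage indicator: %.2f" % grnn_predict(X, y, x_new))
```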

  11. Intelligent-based Structural Damage Detection Model

    Science.gov (United States)

    Lee, Eric Wai Ming; Yu, Kin Fung

    2010-05-01

    This paper presents the application of a novel Artificial Neural Network (ANN) model for the diagnosis of structural damage. The ANN model, denoted as the GRNNFA, is a hybrid model combining the General Regression Neural Network Model (GRNN) and the Fuzzy ART (FA) model. It not only retains the important features of the GRNN and FA models (i.e. fast and stable network training and incremental growth of network structure) but also facilitates the removal of the noise embedded in the training samples. Structural damage alters the stiffness distribution of the structure and so changes the natural frequencies and mode shapes of the system. The measured modal parameter changes due to a particular damage are treated as patterns for that damage. The proposed GRNNFA model was trained to learn those patterns in order to detect the possible damage location of the structure. Simulated data were employed to verify and illustrate the procedures of the proposed ANN-based damage diagnosis methodology. The results of this study have demonstrated the feasibility of applying the GRNNFA model to structural damage diagnosis even when the training samples were noise contaminated.

  12. e-Government Maturity Model Based on Systematic Review and Meta-Ethnography Approach

    Directory of Open Access Journals (Sweden)

    Darmawan Napitupulu

    2016-11-01

    Full Text Available Maturity models based on e-Government portals have been developed by a number of researchers, both individually and institutionally, but they remain scattered across various journal and conference articles and can be said to have different focuses, both in terms of stages and features. The aim of this research is to conduct a study integrating a number of existing maturity models in order to build a generic maturity model for e-Government portals. The method used in this study is Systematic Review with a meta-ethnography qualitative approach. Meta-ethnography, which is part of the Systematic Review method, is a technique for performing data integration to obtain theories and concepts with a new level of understanding that is deeper and more thorough. The result obtained is a maturity model for e-Government portals that consists of 7 (seven) stages, namely web presence, interaction, transaction, vertical integration, horizontal integration, full integration, and open participation. These seven stages are synthesized from 111 key concepts drawn from 25 studies of e-Government portal maturity models. The resulting maturity model is more comprehensive and generic because it integrates the models (best practices) that exist today.

  13. Alternative ways of using field-based estimates to calibrate ecosystem models and their implications for carbon cycle studies

    Science.gov (United States)

    He, Yujie; Zhuang, Qianlai; McGuire, David; Liu, Yaling; Chen, Min

    2013-01-01

    Model-data fusion is a process in which field observations are used to constrain model parameters. How observations are used to constrain parameters has a direct impact on the carbon cycle dynamics simulated by ecosystem models. In this study, we present an evaluation of several options for the use of observations in modeling regional carbon dynamics and explore the implications of those options. We calibrated the Terrestrial Ecosystem Model on a hierarchy of three vegetation classification levels for the Alaskan boreal forest: species level, plant-functional-type level (PFT level), and biome level, and we examined the differences in simulated carbon dynamics. Species-specific field-based estimates were directly used to parameterize the model for species-level simulations, while weighted averages based on species percent cover were used to generate estimates for PFT- and biome-level model parameterization. We found that calibrated key ecosystem process parameters differed substantially among species and overlapped for species that are categorized into different PFTs. Our analysis of parameter sets suggests that the PFT-level parameterizations primarily reflected the dominant species and that functional information of some species was lost from the PFT-level parameterizations. The biome-level parameterization was primarily representative of the needleleaf PFT and lost information on broadleaf species or PFT function. Our results indicate that PFT-level simulations may be potentially representative of the performance of species-level simulations, while biome-level simulations may result in biased estimates. Improved theoretical and empirical justifications for grouping species into PFTs or biomes are needed to adequately represent the dynamics of ecosystem functioning and structure.

  14. An intelligent trust-based access control model for affective ...

    African Journals Online (AJOL)

    In this study, a fuzzy expert system Trust-Based Access Control (TBAC) model for improving the quality of crowdsourcing using emotional affective computing is presented. This model takes into consideration a pre-processing module consisting of three inputs, namely crowd-worker category, trust metric and emotional…

  15. Are individual based models a suitable approach to estimate population vulnerability? - a case study

    Directory of Open Access Journals (Sweden)

    Eva Maria Griebeler

    2011-04-01

    Full Text Available European populations of the Large Blue butterfly Maculinea arion have experienced severe declines in recent decades, especially in the northern part of the species' range. This endangered lycaenid butterfly needs two resources for development: flower buds of specific plants (Thymus spp., Origanum vulgare), on which young caterpillars briefly feed, and red ants of the genus Myrmica, whose nests support caterpillars during a prolonged final instar. I present an analytically solvable deterministic model to estimate the vulnerability of populations of M. arion. Results obtained from the sensitivity analysis of this mathematical model (MM) are contrasted with the respective results that had been derived from a spatially explicit individual-based model (IBM) for this butterfly. I demonstrate that details of landscape configuration which are neglected by the MM but are easily taken into consideration by the IBM result in a different degree of intraspecific competition of caterpillars on flower buds and within host ant nests. The resulting differences in caterpillar mortality lead to erroneous estimates of the extinction risk of a butterfly population living in habitat with low food plant coverage and low abundance of host ant nests. This observation favors the use of an individual-based modeling approach over the deterministic approach, at least for the management of this threatened butterfly.

  16. Outcompeting nitrite-oxidizing bacteria in single-stage nitrogen removal in sewage treatment plants: a model-based study.

    Science.gov (United States)

    Pérez, Julio; Lotti, Tommaso; Kleerebezem, Robbert; Picioreanu, Cristian; van Loosdrecht, Mark C M

    2014-12-01

    This model-based study investigated the mechanisms and operational window for efficient repression of nitrite-oxidizing bacteria (NOB) in an autotrophic nitrogen removal process. The operation of a continuous single-stage granular sludge process was simulated for nitrogen removal from pretreated sewage at 10 °C. The effects of the residual ammonium concentration were explicitly analyzed with the model. Competition for oxygen between ammonia-oxidizing bacteria (AOB) and NOB was found to be essential for NOB repression, even when the suppression of nitrite oxidation is assisted by nitrite reduction by anammox (AMX). The nitrite half-saturation coefficients of NOB and AMX proved insensitive for the model output. The maximum specific growth rate of AMX bacteria proved a sensitive process parameter, because higher rates would provide a competitive advantage for AMX. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Firm Based Trade Models and Turkish Economy

    Directory of Open Access Journals (Sweden)

    Nilüfer ARGIN

    2015-12-01

    Full Text Available Among international trade models, only firm-based trade models explain firms' actions and behavior in world trade. Firm-based trade models focus on the trade behavior of the individual firms that actually conduct intra-industry trade, and they can properly explain the globalization process. These approaches also cover multinational corporations, supply chains and outsourcing. Our paper aims to explain and analyze Turkish exports in the context of firm-based trade models. We use UNCTAD data on exports by SITC Rev. 3 categorization to explain total exports and 255 products, and we calculate the intensive and extensive margins of Turkish firms.

  18. NASA Software Cost Estimation Model: An Analogy Based Estimation Model

    Science.gov (United States)

    Hihn, Jairus; Juster, Leora; Menzies, Tim; Mathew, George; Johnson, James

    2015-01-01

    The cost estimation of software development activities is increasingly critical for large-scale integrated projects such as those at DOD and NASA, especially as software systems become larger and more complex. As an example, MSL (Mars Science Laboratory), developed at the Jet Propulsion Laboratory, launched with over 2 million lines of code, making it the largest robotic spacecraft ever flown (based on the size of the software). Software development activities are also notorious for their cost growth, with NASA flight software averaging over 50% cost growth. All across the agency, estimators and analysts are increasingly being tasked to develop reliable cost estimates in support of program planning and execution. While there has been extensive work on improving parametric methods, there is very little focus on the use of models based on analogy and clustering algorithms. In this paper we summarize our findings on effort/cost model estimation and model development based on ten years of software effort estimation research using data mining and machine learning methods to develop estimation models based on analogy and clustering. The NASA Software Cost Model's performance is evaluated by comparing it to COCOMO II, linear regression, and K-nearest neighbor prediction model performance on the same data set.
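
    A minimal sketch of analogy-based estimation of the kind described: predict effort as an aggregate of the k nearest historical projects in a normalized feature space. The features and data below are hypothetical, not the NASA data set.

```python
# Analogy (k-nearest-neighbor) effort estimation on synthetic projects.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(7)
X = rng.uniform(1, 10, size=(60, 3))    # e.g. KSLOC, complexity, team size
effort = 3.0 * X[:, 0] ** 1.1 * rng.lognormal(0, 0.2, 60)  # person-months

# Scale features so no single attribute dominates the distance metric,
# then average the 3 most similar historical projects.
model = make_pipeline(StandardScaler(),
                      KNeighborsRegressor(n_neighbors=3))
model.fit(X[:50], effort[:50])
print("predicted effort:", model.predict(X[50:]).round(1))
```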

  19. COMPARISON of FUZZY-BASED MODELS in LANDSLIDE HAZARD MAPPING

    Directory of Open Access Journals (Sweden)

    N. Mijani

    2017-09-01

    Full Text Available Landslide is one of the main geomorphic processes affecting development prospects in mountainous areas and causing disastrous accidents. A landslide event involves uncertain criteria such as altitude, slope, aspect, land use, vegetation density, precipitation, distance from the river and distance from the road network. This research aims to compare and evaluate different fuzzy-based models, including the Fuzzy Analytic Hierarchy Process (Fuzzy-AHP), Fuzzy Gamma and Fuzzy-OR. The main contribution of this paper is a comprehensive treatment of the criteria causing landslide hazard, considering their uncertainties, together with a comparison of different fuzzy-based models. The evaluation is quantified by the Density Ratio (DR) and Quality Sum (QS). The proposed methodology was implemented in Sari, a city of Iran that has faced multiple landslide accidents in recent years due to its particular environmental conditions. The accuracy assessment showed that the Fuzzy-AHP model has higher accuracy than the other two models in landslide hazard zonation: the accuracy of the zoning obtained from the Fuzzy-AHP model is 0.92 and 0.45 based on the Precision (P) and QS indicators, respectively. Based on the obtained landslide hazard maps, Fuzzy-AHP, Fuzzy Gamma and Fuzzy-OR cover 13, 26 and 35 percent of the study area, respectively, with a very high risk level. Based on these findings, the Fuzzy-AHP model was selected as the most appropriate method for landslide zoning in the city of Sari, with the Fuzzy Gamma method second by a minor difference.
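
    The three combination rules being compared can be sketched in a few lines; the membership values, gamma exponent and AHP-style weights below are made up for illustration.

```python
# Sketch of Fuzzy-OR, Fuzzy Gamma and a rough Fuzzy-AHP aggregation,
# applied to per-criterion fuzzy memberships of one map cell.
import numpy as np

mu = np.array([0.8, 0.6, 0.3, 0.7])   # e.g. slope, land use, rainfall, road distance

fuzzy_or = mu.max()                    # Fuzzy-OR: governed by the largest membership

prod = mu.prod()                       # fuzzy algebraic product
alg_sum = 1.0 - np.prod(1.0 - mu)      # fuzzy algebraic sum
gamma = 0.9
fuzzy_gamma = alg_sum**gamma * prod**(1.0 - gamma)

# Fuzzy-AHP, very roughly: AHP-derived weights applied to the memberships.
weights = np.array([0.4, 0.3, 0.2, 0.1])  # would come from pairwise comparisons
fuzzy_ahp = float(weights @ mu)

print(f"OR={fuzzy_or:.2f}  gamma={fuzzy_gamma:.2f}  AHP={fuzzy_ahp:.2f}")
```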

  20. Prediction of residential radon exposure of the whole Swiss population: comparison of model-based predictions with measurement-based predictions.

    Science.gov (United States)

    Hauri, D D; Huss, A; Zimmermann, F; Kuehni, C E; Röösli, M

    2013-10-01

    Radon plays an important role in human exposure to natural sources of ionizing radiation. The aim of this article is to compare two approaches to estimating mean radon exposure in the Swiss population: model-based predictions at the individual level and measurement-based predictions based on measurements aggregated at the municipality level. A nationwide model was used to predict radon levels in each household and for each individual based on the corresponding tectonic unit, building age, building type, soil texture, degree of urbanization, and floor. Measurement-based predictions were carried out within a health impact assessment on residential radon and lung cancer. Mean measured radon levels were corrected for the average floor distribution and weighted with the population size of each municipality. Model-based predictions yielded a mean radon exposure of the Swiss population of 84.1 Bq/m³. Measurement-based predictions yielded an average exposure of 78 Bq/m³. This study demonstrates that the model- and the measurement-based predictions provided similar results. The advantage of the measurement-based approach is its simplicity, which is sufficient for assessing exposure distribution in a population. The model-based approach allows predicting radon levels at specific sites, which is needed in an epidemiological study, and the results do not depend on how the measurement sites have been selected. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  1. Modeling Guru: Knowledge Base for NASA Modelers

    Science.gov (United States)

    Seablom, M. S.; Wojcik, G. S.; van Aartsen, B. H.

    2009-05-01

    Modeling Guru is an on-line knowledge-sharing resource for anyone involved with or interested in NASA's scientific models or High End Computing (HEC) systems. Developed and maintained by NASA's Software Integration and Visualization Office (SIVO) and the NASA Center for Computational Sciences (NCCS), Modeling Guru's combined forums and knowledge base for research and collaboration is becoming a repository for the accumulated expertise of NASA's scientific modeling and HEC communities. All NASA modelers and associates are encouraged to participate and provide knowledge about the models and systems so that other users may benefit from their experience. Modeling Guru is divided into a hierarchy of communities, each with its own set of forums and knowledge base documents. Current modeling communities include those for space science, land and atmospheric dynamics, atmospheric chemistry, and oceanography. In addition, there are communities focused on NCCS systems, HEC tools and libraries, and programming and scripting languages. Anyone may view most of the content on Modeling Guru (available at http://modelingguru.nasa.gov/), but you must log in to post messages and subscribe to community postings. The site offers a full range of "Web 2.0" features, including discussion forums, "wiki" document generation, document uploading, RSS feeds, search tools, blogs, email notification, and "breadcrumb" links. A discussion (a.k.a. forum "thread") is used to post comments, solicit feedback, or ask questions. If marked as a question, SIVO will monitor the thread and normally respond within a day. Discussions can include embedded images, tables, and formatting through the use of the Rich Text Editor. Also, the user can add "Tags" to their thread to facilitate later searches. The "knowledge base" is comprised of documents that are used to capture and share expertise with others. The default "wiki" document lets users edit within the browser so others can easily collaborate on the…

  2. Student Teachers' Modeling of Acceleration Using a Video-Based Laboratory in Physics Education: A Multimodal Case Study

    Directory of Open Access Journals (Sweden)

    Louis Trudel

    2016-06-01

    Full Text Available This exploratory study intends to model the kinematics learning of a pair of student teachers exposed to prescribed teaching strategies in a video-based laboratory. Two student teachers were chosen from the Francophone B.Ed. program of the Faculty of Education of a Canadian university. The study method consisted of having the participants interact with a video-based laboratory to complete two activities for learning properties of acceleration in rectilinear motion. Time limits were placed on the learning activities, during which the researcher collected detailed multimodal information from the student teachers' answers to questions, the graphs they produced from experimental data, and the videos taken during the learning sessions. As a result, we describe the learning approach each one followed, the evidence of conceptual change, and the difficulties they faced in tackling various aspects of accelerated motion. We then specify the advantages and limits of our research and propose recommendations for further study.

  3. Polarimetry data inversion in conditions of tokamak plasma: Model based tomography concept

    International Nuclear Information System (INIS)

    Bieg, B.; Chrzanowski, J.; Kravtsov, Yu. A.; Mazon, D.

    2015-01-01

    Highlights: • Model based plasma tomography is presented. • A minimization procedure for the error function using the gradient method is suggested. • A model based procedure of data inversion is given for the case of joint polarimetry–interferometry data. - Abstract: Model based plasma tomography is studied, which fits a hypothetical multi-parameter plasma model to polarimetry and interferometry experimental data. The fitting procedure implies minimization of the error function, defined as a sum of squared differences between theoretical and empirical values. The minimization is suggested to be performed using the gradient method. Contrary to traditional tomography, which deals exclusively with observational data, model-based tomography (MBT) also operates with a reasonable model of the inhomogeneous plasma distribution and verifies which profile of a given class better fits the experimental data. MBT restricts itself to a definite class of models, for instance power series, Fourier expansions, etc. The basic equations of MBT are presented, which generalize the equations of the model-based procedure of polarimetric data inversion to the case of joint polarimetry–interferometry data.
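
    The fitting idea can be demonstrated on a toy forward model: gradient descent on the sum of squared differences between modelled and "measured" chord data, over a two-parameter power-series profile. Everything below is a stand-in for the real polarimetry-interferometry forward model.

```python
# Toy MBT-style fit: choose parameters of an assumed profile class by
# gradient descent on a squared-error function. Synthetic data only.
import numpy as np

rng = np.random.default_rng(4)
r = np.linspace(-1, 1, 50)              # chord coordinates

def forward(params):
    a0, a2 = params                     # power-series profile n(r) = a0 + a2 r^2
    return a0 + a2 * r**2               # stand-in "measurement" per chord

measured = forward((1.0, -0.7)) + 0.01 * rng.normal(size=r.size)

params = np.array([0.5, 0.0])
lr = 0.01
for _ in range(2000):
    resid = forward(params) - measured  # error function: sum of squared residuals
    grad = 2.0 * np.array([resid.sum(), (resid * r**2).sum()])
    params -= lr * grad / r.size        # gradient descent step
print("fitted (a0, a2):", np.round(params, 3))
```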

  4. Polarimetry data inversion in conditions of tokamak plasma: Model based tomography concept

    Energy Technology Data Exchange (ETDEWEB)

    Bieg, B. [Maritime University of Szczecin, Waly Chrobrego 1-2, 70-500 Szczecin (Poland); Chrzanowski, J., E-mail: j.chrzanowski@am.szczecin.pl [Maritime University of Szczecin, Waly Chrobrego 1-2, 70-500 Szczecin (Poland); Kravtsov, Yu. A. [Maritime University of Szczecin, Waly Chrobrego 1-2, 70-500 Szczecin (Poland); Space Research Institute, Profsoyuznaya St. 82/34 Russian Academy of Science, Moscow 117997 (Russian Federation); Mazon, D. [CEA, IRFM, F-13108 Saint Paul-lez-Durance (France)

    2015-10-15

    Highlights: • Model based plasma tomography is presented. • A minimization procedure for the error function using the gradient method is suggested. • A model based procedure of data inversion is given for the case of joint polarimetry–interferometry data. - Abstract: Model based plasma tomography is studied, which fits a hypothetical multi-parameter plasma model to polarimetry and interferometry experimental data. The fitting procedure implies minimization of the error function, defined as a sum of squared differences between theoretical and empirical values. The minimization is suggested to be performed using the gradient method. Contrary to traditional tomography, which deals exclusively with observational data, model-based tomography (MBT) also operates with a reasonable model of the inhomogeneous plasma distribution and verifies which profile of a given class better fits the experimental data. MBT restricts itself to a definite class of models, for instance power series, Fourier expansions, etc. The basic equations of MBT are presented, which generalize the equations of the model-based procedure of polarimetric data inversion to the case of joint polarimetry–interferometry data.

  5. Study on Software Quality Improvement based on Rayleigh Model and PDCA Model

    OpenAIRE

    Ning Jingfeng; Hu Ming

    2013-01-01

    As the software industry gradually matures, software quality is regarded as the life of a software enterprise. This article discusses how to improve the quality of software: it applies the Rayleigh model and the PDCA model to software quality management, combines them with the defect removal effectiveness index, and uses the PDCA model to solve the problem of setting quality management objectives when the Rayleigh model is used in bidirectional quality improvement strategies of software quality management, a…

  6. Comparing ESC and iPSC-Based Models for Human Genetic Disorders

    Directory of Open Access Journals (Sweden)

    Tomer Halevy

    2014-10-01

    Full Text Available Traditionally, human disorders were studied using animal models or somatic cells taken from patients. Such studies enabled the analysis of the molecular mechanisms of numerous disorders, and led to the discovery of new treatments. Yet, these systems are limited or even irrelevant in modeling multiple genetic diseases. The isolation of human embryonic stem cells (ESCs from diseased blastocysts, the derivation of induced pluripotent stem cells (iPSCs from patients’ somatic cells, and the new technologies for genome editing of pluripotent stem cells have opened a new window of opportunities in the field of disease modeling, and enabled studying diseases that couldn’t be modeled in the past. Importantly, despite the high similarity between ESCs and iPSCs, there are several fundamental differences between these cells, which have important implications regarding disease modeling. In this review we compare ESC-based models to iPSC-based models, and highlight the advantages and disadvantages of each system. We further suggest a roadmap for how to choose the optimal strategy to model each specific disorder.

  7. Comparing ESC and iPSC-Based Models for Human Genetic Disorders.

    Science.gov (United States)

    Halevy, Tomer; Urbach, Achia

    2014-10-24

    Traditionally, human disorders were studied using animal models or somatic cells taken from patients. Such studies enabled the analysis of the molecular mechanisms of numerous disorders, and led to the discovery of new treatments. Yet, these systems are limited or even irrelevant in modeling multiple genetic diseases. The isolation of human embryonic stem cells (ESCs) from diseased blastocysts, the derivation of induced pluripotent stem cells (iPSCs) from patients' somatic cells, and the new technologies for genome editing of pluripotent stem cells have opened a new window of opportunities in the field of disease modeling, and enabled studying diseases that couldn't be modeled in the past. Importantly, despite the high similarity between ESCs and iPSCs, there are several fundamental differences between these cells, which have important implications regarding disease modeling. In this review we compare ESC-based models to iPSC-based models, and highlight the advantages and disadvantages of each system. We further suggest a roadmap for how to choose the optimal strategy to model each specific disorder.

  8. Practicality of Agent-Based Modeling of Civil Violence: an Assessment

    OpenAIRE

    Thron, Christopher; Jackson, Elizabeth

    2015-01-01

    Joshua Epstein (2002) proposed a simple agent-based model to describe the formation and evolution of spontaneous civil violence (such as riots or violent demonstrations). In this paper we study the practical applicability of Epstein's model.
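
    The core agent rule of Epstein's model is compact enough to sketch directly: an agent rebels when grievance minus perceived risk exceeds a threshold. The sketch below compresses the model (a single global cop-to-rebel ratio replaces local vision, and there is no grid or movement) and uses illustrative parameters.

```python
# Compressed sketch of Epstein's civil violence agent rule:
# rebel if G - N > T, with G = H(1 - L) and N = R * P(arrest).
import numpy as np

rng = np.random.default_rng(3)
n_agents = 500
hardship = rng.random(n_agents)          # H ~ U(0, 1)
risk_aversion = rng.random(n_agents)     # R ~ U(0, 1)
legitimacy = 0.6                         # L, government legitimacy
threshold = 0.1                          # T

grievance = hardship * (1.0 - legitimacy)        # G = H(1 - L)

def step(active):
    # Estimated arrest probability rises with the cop-to-rebel ratio;
    # a global ratio stands in for Epstein's local vision radius.
    cops, rebels = 50, max(int(active.sum()), 1)
    p_arrest = 1.0 - np.exp(-2.3 * cops / rebels)
    net_risk = risk_aversion * p_arrest          # N = R * P
    return grievance - net_risk > threshold      # rebel if G - N > T

active = np.zeros(n_agents, dtype=bool)
for _ in range(20):
    active = step(active)
print("rebels:", int(active.sum()))
```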

  9. A conceptual sedimentological-geostatistical model of aquifer heterogeneity based on outcrop studies

    International Nuclear Information System (INIS)

    Davis, J.M.

    1994-01-01

    Three outcrop studies were conducted in deposits of different depositional environments. At each site, permeability measurements were obtained with an air-minipermeameter developed as part of this study. In addition, the geological units were mapped with either surveying, photographs, or both. Geostatistical analysis of the permeability data was performed to estimate the characteristics of the probability distribution function and the spatial correlation structure. The information obtained from the geological mapping was then compared with the results of the geostatistical analysis for any relationships that may exist. The main field site was located in the Albuquerque Basin of central New Mexico at an outcrop of the Pliocene-Pleistocene Sierra Ladrones Formation. The second study was conducted on the walls of waste pits in alluvial fan deposits at the Nevada Test Site. The third study was conducted on an outcrop of an eolian deposit (Miocene) south of Socorro, New Mexico. The results of the three studies were then used to construct a conceptual model relating depositional environment to geostatistical models of heterogeneity. The model presented is largely qualitative but provides a basis for further hypothesis formulation and testing

  10. A conceptual sedimentological-geostatistical model of aquifer heterogeneity based on outcrop studies

    Energy Technology Data Exchange (ETDEWEB)

    Davis, J.M.

    1994-01-01

    Three outcrop studies were conducted in deposits of different depositional environments. At each site, permeability measurements were obtained with an air-minipermeameter developed as part of this study. In addition, the geological units were mapped with either surveying, photographs, or both. Geostatistical analysis of the permeability data was performed to estimate the characteristics of the probability distribution function and the spatial correlation structure. The information obtained from the geological mapping was then compared with the results of the geostatistical analysis for any relationships that may exist. The main field site was located in the Albuquerque Basin of central New Mexico at an outcrop of the Pliocene-Pleistocene Sierra Ladrones Formation. The second study was conducted on the walls of waste pits in alluvial fan deposits at the Nevada Test Site. The third study was conducted on an outcrop of an eolian deposit (Miocene) south of Socorro, New Mexico. The results of the three studies were then used to construct a conceptual model relating depositional environment to geostatistical models of heterogeneity. The model presented is largely qualitative but provides a basis for further hypothesis formulation and testing.

  11. A Novel GMM-Based Behavioral Modeling Approach for Smartwatch-Based Driver Authentication.

    Science.gov (United States)

    Yang, Ching-Han; Chang, Chin-Chun; Liang, Deron

    2018-03-28

    All drivers have their own distinct driving habits, and usually hold and operate the steering wheel differently in different driving scenarios. In this study, we propose a novel Gaussian mixture model (GMM)-based method that improves on the traditional GMM for modeling driving behavior. This new method can be applied to build a better driver authentication system based on the accelerometer and orientation sensor of a smartwatch. To demonstrate the feasibility of the proposed method, we created an experimental system that analyzes driving behavior using the built-in sensors of a smartwatch. The experimental results for driver authentication, an equal error rate (EER) of 4.62% in the simulated environment and an EER of 7.86% in the real-traffic environment, confirm the feasibility of this approach.
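
    As a baseline for the approach described, here is a sketch of plain GMM-based authentication: fit a mixture to the enrolled driver's sensor features and accept a new trace when its average log-likelihood clears a threshold tuned on enrollment data. The feature dimensions, component count and threshold rule are assumptions for the example, not the paper's settings.

```python
# Baseline GMM authentication on synthetic smartwatch-style features.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)
# Stand-in 6-D features (e.g. accelerometer + orientation channels)
enrolled = rng.normal(0.0, 1.0, size=(2000, 6))
impostor = rng.normal(0.8, 1.2, size=(500, 6))

gmm = GaussianMixture(n_components=8, covariance_type="diag",
                      random_state=0).fit(enrolled)

# Accept if the trace scores above the 5th percentile of enrollment
# log-likelihoods (an assumed, illustrative threshold rule).
threshold = np.percentile(gmm.score_samples(enrolled), 5)

def authenticate(trace):
    return gmm.score_samples(trace).mean() > threshold

print("genuine accepted:", authenticate(rng.normal(0.0, 1.0, size=(200, 6))))
print("impostor accepted:", authenticate(impostor[:200]))
```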

  12. Simulation-Based Dynamic Passenger Flow Assignment Modelling for a Schedule-Based Transit Network

    Directory of Open Access Journals (Sweden)

    Xiangming Yao

    2017-01-01

    Full Text Available The online operation management and offline policy evaluation of complex transit networks require an effective dynamic traffic assignment (DTA) method that can capture the temporal-spatial nature of traffic flows. The objective of this work is to propose a simulation-based dynamic passenger assignment framework and models for such applications in the context of schedule-based rail transit systems. In the simulation framework, travellers are regarded as individual agents who are able to obtain complete information on current traffic conditions. A combined route selection model integrating pre-trip route selection and en-trip route switching is established for achieving the dynamic network flow equilibrium status. The train agents are operated strictly according to the timetable, and their capacity limitations are considered. A continuous time-driven simulator based on the proposed framework and models is developed, and its performance is illustrated through the large-scale network of the Beijing subway. The results indicate that more than 0.8 million individual passengers and thousands of trains can be simulated simultaneously at a speed ten times faster than real time. This study provides an efficient approach to analyze the dynamic demand-supply relationship for large schedule-based transit networks.

  13. Study on non-linear bistable dynamics model based EEG signal discrimination analysis method.

    Science.gov (United States)

    Ying, Xiaoguo; Lin, Han; Hui, Guohua

    2015-01-01

    Electroencephalogram (EEG) is the recording of electrical activity along the scalp; it measures the voltage fluctuations generated by ionic current flows within the neurons of the brain. The EEG signal is regarded as one of the most important factors to be studied in the next 20 years. In this paper, EEG signal discrimination based on a non-linear bistable dynamical model is proposed. EEG signals were processed by the non-linear bistable dynamical model, and their features were characterized by a coherence index. Experimental results showed that the proposed method could properly extract the features of different EEG signals.
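
    A generic bistable element of the kind referred to can be sketched as an overdamped double-well system driven by the input signal, integrated with the Euler-Maruyama method; the "EEG" input below is simulated, and the dynamics are a common textbook form rather than the paper's specific model.

```python
# Generic driven double-well system: dx/dt = x - x^3 + s(t) + noise.
import numpy as np

rng = np.random.default_rng(2)
dt, T = 1e-3, 10.0
n = int(T / dt)
t = np.arange(n) * dt

signal = 0.3 * np.sin(2 * np.pi * 1.5 * t)   # stand-in for an EEG component
noise = 0.5

x = np.zeros(n)
for k in range(n - 1):
    drift = x[k] - x[k] ** 3 + signal[k]     # potential V(x) = x^4/4 - x^2/2
    x[k + 1] = x[k] + drift * dt + noise * np.sqrt(dt) * rng.normal()

# Well occupancy: fraction of time spent in the right-hand well
print("fraction in right well: %.2f" % (x > 0).mean())
```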

  14. Model-reduced gradient-based history matching

    NARCIS (Netherlands)

    Kaleta, M.P.

    2011-01-01

    Since the world's energy demand increases every year, the oil & gas industry makes a continuous effort to improve fossil fuel recovery. Physics-based petroleum reservoir modeling and the closed-loop model-based reservoir management concept can play an important role here. In this concept, measured data…

  15. Stimulating Scientific Reasoning with Drawing-Based Modeling

    Science.gov (United States)

    Heijnes, Dewi; van Joolingen, Wouter; Leenaars, Frank

    2018-01-01

    We investigate the way students' reasoning about evolution can be supported by drawing-based modeling. We modified the drawing-based modeling tool SimSketch to allow for modeling evolutionary processes. In three iterations of development and testing, students in lower secondary education worked on creating an evolutionary model. After each…

  16. Model-Based Requirements Management in Gear Systems Design Based On Graph-Based Design Languages

    Directory of Open Access Journals (Sweden)

    Kevin Holder

    2017-10-01

    Full Text Available For several decades, a wide-spread consensus concerning the enormous importance of an in-depth clarification of the specifications of a product has been observed. A weak clarification of specifications is repeatedly listed as a main cause for the failure of product development projects. Requirements, which can be defined as the purpose, goals, constraints, and criteria associated with a product development project, play a central role in the clarification of specifications. The collection of activities which ensure that requirements are identified, documented, maintained, communicated, and traced throughout the life cycle of a system, product, or service can be referred to as "requirements engineering". These activities can be supported by a collection and combination of strategies, methods, and tools which are appropriate for the clarification of specifications. Numerous publications describe the strategy and the components of requirements management. Furthermore, recent research investigates its industrial application. Simultaneously, promising developments of graph-based design languages for a holistic digital representation of the product life cycle are presented. Current developments realize graph-based languages by the diagrams of the Unified Modelling Language (UML), and allow the automatic generation and evaluation of multiple product variants. The research presented in this paper seeks to present a method to combine the advantages of a conscious requirements management process and graph-based design languages. Consequently, the main objective of this paper is the investigation of a model-based integration of requirements in a product development process by means of graph-based design languages. The research method is based on an in-depth analysis of an exemplary industrial product development, a gear system for so-called "Electrical Multiple Units" (EMU). Important requirements were abstracted from a gear system…

  17. Model-Based Development of Control Systems for Forestry Cranes

    Directory of Open Access Journals (Sweden)

    Pedro La Hera

    2015-01-01

    Full Text Available Model-based methods are used in industry for prototyping concepts based on mathematical models. With our forest industry partners, we have established a model-based workflow for rapid development of motion control systems for forestry cranes. Applying this working method, we can verify control algorithms, both theoretically and practically. This paper is an example of this workflow and presents four topics related to the application of nonlinear control theory. The first topic presents the system of differential equations describing the motion dynamics. The second topic presents nonlinear control laws formulated according to sliding mode control theory. The third topic presents a procedure for model calibration and control tuning that are a prerequisite to realize experimental tests. The fourth topic presents the results of tests performed on an experimental crane specifically equipped for these tasks. Results of these studies show the advantages and disadvantages of these control algorithms, and they highlight their performance in terms of robustness and smoothness.
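
    The sliding mode control laws mentioned can be illustrated on the simplest possible plant; the sketch below steers a unit-mass double integrator onto a sliding surface with a smoothed switching term, with gains chosen only for the example (the real crane dynamics are far richer than this).

```python
# Minimal sliding mode controller for a double integrator \ddot{x} = u.
import numpy as np

dt = 1e-3
lam, K = 2.0, 5.0                  # sliding surface slope, switching gain

def smc_step(e, e_dot):
    s = e_dot + lam * e            # sliding surface s = de/dt + lambda*e
    return -K * np.tanh(s / 0.05)  # smoothed sign(s) to reduce chattering

# Track a constant reference with a unit-mass double integrator
x, v, ref = 0.0, 0.0, 1.0
for _ in range(int(5.0 / dt)):
    u = smc_step(x - ref, v)
    v += u * dt
    x += v * dt
print("final position: %.3f" % x)
```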

  18. Exploratory modeling and simulation to support development of motesanib in Asian patients with non-small cell lung cancer based on MONET1 study results.

    Science.gov (United States)

    Claret, L; Bruno, R; Lu, J-F; Sun, Y-N; Hsu, C-P

    2014-04-01

    The motesanib phase III MONET1 study failed to show improvement in overall survival (OS) in non-small cell lung cancer, but a subpopulation of Asian patients had a favorable outcome. We performed exploratory modeling and simulations based on MONET1 data to support further development of motesanib in Asian patients. A model-based estimate of time to tumor growth was the best of the tested tumor size response metrics in a multivariate OS model (P …). Simulations indicated that a phase III study in 500 Asian patients would exceed 80% power to confirm superior efficacy of motesanib combination therapy (expected HR: 0.74), suggesting that motesanib combination therapy may benefit Asian patients.

  19. Test-Driven, Model-Based Systems Engineering

    DEFF Research Database (Denmark)

    Munck, Allan

    Hearing systems have evolved over many years from simple mechanical devices (horns) to electronic units consisting of microphones, amplifiers, analog filters, loudspeakers, batteries, etc. Digital signal processors replaced analog filters to provide better performance and new features. Central… This thesis concerns methods for identifying, selecting and implementing tools for various aspects of model-based systems engineering. A comprehensive method was proposed that includes several novel steps, such as techniques for analyzing the gap between requirements and tool capabilities. The method… was verified with good results in two case studies for the selection of a traceability tool (single-tool scenario) and a set of modeling tools (multi-tool scenario). Models must be subjected to testing to allow engineers to predict functionality and performance of systems. Test-first strategies are known…

  20. An Integrated Model of Co-ordinated Community-Based Care.

    Science.gov (United States)

    Scharlach, Andrew E; Graham, Carrie L; Berridge, Clara

    2015-08-01

    Co-ordinated approaches to community-based care are a central component of current and proposed efforts to help vulnerable older adults obtain needed services and supports and reduce unnecessary use of health care resources. This study examines ElderHelp Concierge Club, an integrated community-based care model that includes comprehensive personal and environmental assessment, multilevel care co-ordination, a mix of professional and volunteer service providers, and a capitated, income-adjusted fee model. Evaluation includes a retrospective study (n = 96) of service use and perceived program impact, and a prospective study (n = 21) of changes in participant physical and social well-being and health services utilization. Over the period of this study, participants showed greater mobility, greater ability to meet household needs, greater access to health care, reduced social isolation, reduced home hazards, fewer falls, and greater perceived ability to obtain assistance needed to age in place. This study provides preliminary evidence that an integrated multilevel care co-ordination approach may be an effective and efficient model for serving vulnerable community-based elders, especially low and moderate-income elders who otherwise could not afford the cost of care. The findings suggest the need for multisite controlled studies to more rigorously evaluate program impacts and the optimal mix of various program components. © The Author 2014. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  1. Limiting CT radiation dose in children with craniosynostosis: phantom study using model-based iterative reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Kaasalainen, Touko; Lampinen, Anniina [University of Helsinki and Helsinki University Hospital, HUS Medical Imaging Center, Radiology, POB 340, Helsinki (Finland); University of Helsinki, Department of Physics, Helsinki (Finland); Palmu, Kirsi [University of Helsinki and Helsinki University Hospital, HUS Medical Imaging Center, Radiology, POB 340, Helsinki (Finland); School of Science, Aalto University, Department of Biomedical Engineering and Computational Science, Helsinki (Finland); Reijonen, Vappu; Kortesniemi, Mika [University of Helsinki and Helsinki University Hospital, HUS Medical Imaging Center, Radiology, POB 340, Helsinki (Finland); Leikola, Junnu [University of Helsinki and Helsinki University Hospital, Department of Plastic Surgery, Helsinki (Finland); Kivisaari, Riku [University of Helsinki and Helsinki University Hospital, Department of Neurosurgery, Helsinki (Finland)

    2015-09-15

    Medical professionals need to exercise particular caution when developing CT scanning protocols for children who require multiple CT studies, such as those with craniosynostosis. To evaluate the utility of ultra-low-dose CT protocols with model-based iterative reconstruction techniques for craniosynostosis imaging. We scanned two pediatric anthropomorphic phantoms with a 64-slice CT scanner using different low-dose protocols for craniosynostosis. We measured organ doses in the head region with metal-oxide-semiconductor field-effect transistor (MOSFET) dosimeters. Numerical simulations served to estimate organ and effective doses. We objectively and subjectively evaluated the quality of images produced by adaptive statistical iterative reconstruction (ASiR) 30%, ASiR 50% and Veo (all by GE Healthcare, Waukesha, WI). Image noise and contrast were determined for different tissues. Mean organ dose with the newborn phantom was decreased up to 83% compared to the routine protocol when using ultra-low-dose scanning settings. Similarly, for the 5-year phantom the greatest radiation dose reduction was 88%. The numerical simulations supported the findings with MOSFET measurements. The image quality remained adequate with Veo reconstruction, even at the lowest dose level. Craniosynostosis CT with model-based iterative reconstruction could be performed with a 20-μSv effective dose, corresponding to the radiation exposure of plain skull radiography, without compromising required image quality. (orig.)

  2. Development and Analysis of Patient-Based Complete Conducting Airways Models.

    Directory of Open Access Journals (Sweden)

    Rafel Bordas

    Full Text Available The analysis of high-resolution computed tomography (CT) images of the lung is dependent on inter-subject differences in airway geometry. The application of computational models in understanding the significance of these differences has previously been shown to be a useful tool in biomedical research. Studies using image-based geometries alone are limited to the analysis of the central airways, down to generation 6-10, as other airways are not visible on high-resolution CT. However, airways distal to this, often termed the small airways, are known to play a crucial role in common airway diseases such as asthma and chronic obstructive pulmonary disease (COPD). Other studies have incorporated an algorithmic approach to extrapolate CT segmented airways in order to obtain a complete conducting airway tree down to the level of the acinus. These models have typically been used for mechanistic studies, but also have the potential to be used in a patient-specific setting. In the current study, an image analysis and modelling pipeline was developed and applied to a number of healthy (n = 11) and asthmatic (n = 24) CT patient scans to produce complete patient-based airway models to the acinar level (mean terminal generation 15.8 ± 0.47). The resulting models are analysed in terms of morphometric properties and seen to be consistent with previous work. A number of global clinical lung function measures are compared to resistance predictions in the models to assess their suitability for use in a patient-specific setting. We show a significant difference (p < 0.01) in airways resistance at all tested flow rates in complete airway trees built using CT data from severe asthmatics (GINA 3-5) versus healthy subjects. Further, model predictions of airways resistance at all flow rates are shown to correlate with patient forced expiratory volume in one second (FEV1) (Spearman ρ = -0.65, p < 0.001) and, at low flow rates (0.00017 L/s), FEV1 over forced vital capacity (FEV1…

  3. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2009-01-01

    The purpose of the paper is to obtain insight into and provide practical advice for event-based conceptual modeling. We analyze a set of event concepts and use the results to formulate a conceptual event model that is used to identify guidelines for creation of dynamic process models and static information models. We characterize events as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The conceptual event model is used to characterize a variety of event concepts and it is used to illustrate how events can be used to integrate dynamic modeling of processes and static modeling of information structures. The results are unique in the sense that no other general event concept has been used to unify a similar broad variety of seemingly incompatible event concepts. The general event concept can be used…

  4. Campus network security model study

    Science.gov (United States)

    Zhang, Yong-ku; Song, Li-ren

    2011-12-01

    Campus network security is of growing importance. The focus of this paper is the design of an effective defense against hacker attacks, viruses and data theft, together with an internal defense system. The paper compares firewall and IDS approaches, integrates the two, designs a campus network security model on that basis, and details its specific implementation principles.

  5. A Three-Pulse Release Tablet for Amoxicillin: Preparation, Pharmacokinetic Study and Physiologically Based Pharmacokinetic Modeling.

    Science.gov (United States)

    Li, Jin; Chai, Hongyu; Li, Yang; Chai, Xuyu; Zhao, Yan; Zhao, Yunfan; Tao, Tao; Xiang, Xiaoqiang

    2016-01-01

    Amoxicillin is a commonly used antibiotic which has a short half-life in humans. Frequent administration of amoxicillin is therefore often required to keep the plasma drug level in an effective range. The short dosing interval of amoxicillin can also cause side effects and drug resistance, and impair its therapeutic efficacy and patients' compliance. Therefore, a three-pulse release tablet of amoxicillin is desired to generate sustained release in vivo and thus avoid the above-mentioned disadvantages. The pulsatile release tablet consists of three pulsatile components: one immediate-release granule and two delayed-release pellets, all containing amoxicillin. The preparation of the pulsatile release tablet of amoxicillin mainly involves wet granulation, extrusion/spheronization, pellet coating, mixing, tablet compression and film coating. A Box-Behnken design, scanning electron microscopy and in vitro drug release testing were used to optimize the formulation. A crossover pharmacokinetic study was performed to compare the pharmacokinetic profile of our in-house pulsatile tablet with that of a commercial immediate-release tablet. The pharmacokinetic profile of this pulse formulation was simulated by a physiologically based pharmacokinetic (PBPK) model with the help of Simcyp®. Single-factor experiments identified four important factors of the formulation, namely, coating weight of Eudragit L30 D-55 (X1), coating weight of AQOAT AS-HF (X2), the extrusion screen aperture (X3) and compression force (X4). The interrelations of the four factors were uncovered by a Box-Behnken design to help determine the optimal formulation. The immediate-release granule and two delayed-release pellets, together with other excipients, namely, Avicel PH 102, colloidal silicon dioxide, polyplasdone and magnesium stearate, were mixed and compressed into tablets, which were subsequently coated with Opadry® film to produce the pulsatile tablet of
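
    The four-factor optimization described above lends itself to a standard Box-Behnken design. The sketch below generates such a design in coded units and maps it to factor levels; it assumes the third-party pyDOE2 package (pip install pyDOE2), and the ranges for X1-X4 are hypothetical placeholders, not the paper's values.

    ```python
    import numpy as np
    from pyDOE2 import bbdesign

    coded = bbdesign(4, center=3)           # 4 factors in coded units (-1, 0, +1)

    # Hypothetical low/high levels for X1..X4 (two coating weights, screen
    # aperture, compression force); replace with the formulation's real ranges.
    lows  = np.array([2.0, 4.0, 0.6, 5.0])
    highs = np.array([6.0, 10.0, 1.2, 15.0])
    runs = lows + (coded + 1) / 2 * (highs - lows)
    print(f"{len(runs)} runs; first three:\n{runs[:3]}")
    ```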

  6. Systematic review of model-based cervical screening evaluations.

    Science.gov (United States)

    Mendes, Diana; Bains, Iren; Vanni, Tazio; Jit, Mark

    2015-05-01

    Optimising population-based cervical screening policies is becoming more complex due to the expanding range of screening technologies available and the interplay with vaccine-induced changes in epidemiology. Mathematical models are increasingly being applied to assess the impact of cervical cancer screening strategies. We systematically reviewed the MEDLINE®, Embase, Web of Science®, EconLit, Health Economic Evaluation Database, and Cochrane Library databases in order to identify the mathematical models of human papillomavirus (HPV) infection and cervical cancer progression used to assess the effectiveness and/or cost-effectiveness of cervical cancer screening strategies. Key model features and conclusions relevant to decision-making were extracted. We found 153 articles meeting our eligibility criteria published up to May 2013. Most studies (72/153) evaluated the introduction of a new screening technology, with particular focus on the comparison of HPV DNA testing and cytology (n = 58). Twenty-eight of forty such analyses supported HPV DNA primary screening implementation. A few studies analysed more recent technologies, namely rapid HPV DNA testing (n = 3), HPV DNA self-sampling (n = 4), and genotyping (n = 1), and were also supportive of their introduction. However, no study was found on emerging molecular markers and their potential utility in future screening programmes. Most evaluations (113/153) were based on models simulating aggregate groups of women at risk of cervical cancer over time without accounting for HPV infection transmission. Calibration to country-specific outcome data is becoming more common but has not yet become standard practice. Models of cervical screening are increasingly used and allow extrapolation of trial data to project the population-level health and economic impact of different screening policies. However, post-vaccination analyses have rarely incorporated transmission dynamics. Model calibration to country

  7. Development of a Project-Based Learning Model for the Computer Aided Design Course

    Directory of Open Access Journals (Sweden)

    Satoto Endar Nayono

    2013-09-01

    Full Text Available One of the key competencies of graduates majoring in Civil Engineering and Planning Education, Faculty of Engineering, Yogyakarta State University (YSU), is the ability to plan buildings. CAD courses aim to train students to translate planning concepts into drawings. One of the obstacles faced in the course is that the concepts and drawings created by the students often do not correspond to the standards used in the field. This study aims to develop a project-based learning model so that the students' drawings are more in line with actual conditions in the field. The study was carried out through the following stages: (1) pre-test, (2) planning of learning, (3) implementation of the project-based learning model, (4) monitoring and evaluation, (5) reflection and revision, (6) implementation of learning in the next cycle, and (7) evaluation of the learning outcomes. The study was conducted over four months in 2012 in the Department of Civil Engineering and Planning Education, Faculty of Engineering, YSU. The subjects were the students who took the Computer Aided Design course. The data were analysed using qualitative descriptive analysis and descriptive statistics. The results were: (1) the implementation of the project-based learning model was proven to improve the learning process and the learning outcomes of students in the CAD course through building planning drawing tasks for school buildings based on real conditions in the field; the task was delivered in every meeting and improved based on feedback from the lecturers; (2) the project-based learning model is easier to implement if it is accompanied by peer tutoring and the PAIKEM learning model.

  8. Multi-Scale Analysis of Regional Inequality based on Spatial Field Model: A Case Study of China from 2000 to 2012

    Directory of Open Access Journals (Sweden)

    Shasha Lu

    2015-10-01

    Full Text Available A large body of recent studies, from both inside and outside of China, is devoted to understanding China's regional inequality. The current study introduces "the spatial field model" to achieve a comprehensive evaluation and multi-scale analysis of regional inequality. The model is based on growth pole theory, regional interaction theory, and energy space theory. The spatial field is an abstract concept that defines the potential energy difference formed in the process of a regional growth pole driving the economic development of peripheral areas through transportation and communication corridors. The model can provide potentially more precise regional inequality estimates and generates isarithmic maps that offer highly intuitive, visualized presentations. The model is applied to evaluate the spatiotemporal pattern of economic inequality in China from 2000 to 2012 among the eastern-central-western regions as well as north-south regions at three geographical scales, i.e., inter-province, inter-city, and inter-county. The results indicate that the spatial field model can comprehensively evaluate regional inequality, provide aesthetically pleasing and highly adaptable presentations based on a pixel-based raster, and realise multi-scale analyses of regional inequality. The paper also investigates the limitations and extensions of the spatial field model for future applications.

  9. A two-stage stochastic rule-based model to determine pre-assembly buffer content

    Science.gov (United States)

    Gunay, Elif Elcin; Kula, Ufuk

    2018-01-01

    This study considers the instant decision-making needs of automobile manufacturers for resequencing vehicles before final assembly (FA). We propose a rule-based two-stage stochastic model to determine the number of spare vehicles that should be kept in the pre-assembly buffer to restore the sequence altered by paint defects and upstream department constraints. The first stage of the model decides the spare vehicle quantities, while the second stage recovers the scrambled sequence with respect to pre-defined rules. The problem is solved by a sample average approximation (SAA) algorithm. We conduct a numerical study to compare the solutions of the heuristic model with optimal ones and provide the following insights: (i) as the mismatch between the paint entrance and scheduled sequences decreases, the rule-based heuristic model recovers the scrambled sequence as well as the optimal resequencing model; (ii) the rule-based model is more sensitive to the mismatch between the paint entrance and scheduled sequences when recovering the scrambled sequence; (iii) as the defect rate increases, the difference in recovery effectiveness between the rule-based heuristic and optimal solutions increases; (iv) as buffer capacity increases, the recovery effectiveness of the optimization model outperforms the heuristic model; (v) as expected, the rule-based model holds more inventory than the optimization model.
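
    A minimal sketch of the sample average approximation idea used here: evaluate each candidate buffer size against a common set of sampled paint-defect scenarios and keep the cheapest. The costs, defect rate and sequence length below are invented for illustration, not the paper's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    SEQ_LEN, DEFECT_RATE = 60, 0.08         # vehicles per shift, paint-defect rate
    HOLD_COST, MISMATCH_COST = 1.0, 5.0     # per spare held / per unrecovered slot
    defects = rng.binomial(SEQ_LEN, DEFECT_RATE, size=2000)   # sampled scenarios

    def avg_cost(n_spares: int) -> float:
        # second stage: defects beyond the buffered spares leave slots unrecovered
        unrecovered = np.maximum(defects - n_spares, 0)
        return HOLD_COST * n_spares + MISMATCH_COST * unrecovered.mean()

    best = min(range(16), key=avg_cost)     # first stage: pick the buffer content
    print("SAA-optimal spare-vehicle count:", best)
    ```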

  10. Ground-Based Telescope Parametric Cost Model

    Science.gov (United States)

    Stahl, H. Philip; Rowell, Ginger Holmes

    2004-01-01

    A parametric cost model for ground-based telescopes is developed using multi-variable statistical analysis. The model includes both engineering and performance parameters. While diameter continues to be the dominant cost driver, other significant factors include primary mirror radius of curvature and diffraction-limited wavelength. The model includes an explicit factor for primary mirror segmentation and/or duplication (i.e., multi-telescope phased-array systems). Additionally, single-variable models based on aperture diameter are derived. This analysis indicates that recent mirror technology advances have indeed reduced the historical telescope cost curve.
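
    A single-variable model of the kind mentioned in the last sentence is typically a power law, cost = a·D^b, which can be fitted by least squares in log-log space. The (diameter, cost) pairs below are invented for illustration; they are not the paper's data.

    ```python
    import numpy as np

    diam = np.array([2.4, 3.5, 4.2, 6.5, 8.1, 10.0])     # aperture D (m), invented
    cost = np.array([30., 80., 120., 330., 560., 900.])  # cost (arb. units), invented

    b, log_a = np.polyfit(np.log(diam), np.log(cost), 1) # linear fit in log-log space
    print(f"cost ~ {np.exp(log_a):.1f} * D^{b:.2f}")     # fitted power-law exponent
    ```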

  11. Modeling of Mixing Behavior in a Combined Blowing Steelmaking Converter with a Filter-Based Euler-Lagrange Model

    Science.gov (United States)

    Li, Mingming; Li, Lin; Li, Qiang; Zou, Zongshu

    2018-05-01

    A filter-based Euler-Lagrange multiphase flow model is used to study the mixing behavior in a combined blowing steelmaking converter. The Euler-based volume-of-fluid approach is employed to simulate the top blowing, while the Lagrange-based discrete phase model, which embeds the local volume change of rising bubbles, is used for the bottom blowing. A filter-based turbulence method relying on the local mesh resolution is proposed to improve the modeling of turbulent eddy viscosities. The model validity is verified through comparison with physical experiments in terms of mixing curves and mixing times. The effects of the bottom gas flow rate on bath flow and mixing behavior are investigated, and the underlying reasons for the mixing results are clarified in terms of the characteristics of the bottom-blowing plumes, the interaction between plumes and top-blowing jets, and the change of bath flow structure.

  12. Multiagent-Based Model For ESCM

    Directory of Open Access Journals (Sweden)

    Delia MARINCAS

    2011-01-01

    Full Text Available Web-based applications for Supply Chain Management (SCM) are now a necessity for every company in order to meet increasing customer demands, to face global competition and to make a profit. A multiagent-based approach is appropriate for eSCM because it shows many of the characteristics an SCM system should have. For this reason, we have proposed a multiagent-based eSCM model which configures a virtual SC and automates the SC activities: selling, purchasing, manufacturing, planning, inventory, etc. This model will allow better coordination of the supply chain network and will increase the effectiveness of the Web and intelligent technologies employed in eSCM software.

  13. Statistically Based Morphodynamic Modeling of Tracer Slowdown

    Science.gov (United States)

    Borhani, S.; Ghasemi, A.; Hill, K. M.; Viparelli, E.

    2017-12-01

    Tracer particles are used to study bedload transport in gravel-bed rivers. One of the advantages associated with the use of tracer particles is that they allow for direct measurement of entrainment rates and their size distributions. The main issue in large-scale studies with tracer particles is the difference between the short-term and long-term behavior of tracer stones. This difference is due to the fact that particles undergo vertical mixing or move to less active locations such as bars or even floodplains. For these reasons the average virtual velocity of tracer particles decreases in time, i.e., the tracers slow down. In summary, tracer slowdown can have a significant impact on the estimation of bedload transport rates or the long-term dispersal of contaminated sediment. The vast majority of morphodynamic models that account for the non-uniformity of the bed material (tracer and non-tracer, in this case) are based on a discrete description of the alluvial deposit. The deposit is divided into two regions: the active layer and the substrate. The active layer is a thin layer in the topmost part of the deposit whose particles can interact with the bed material transport. The substrate is the part of the deposit below the active layer. Due to the discrete representation of the alluvial deposit, active layer models are not able to reproduce tracer slowdown. In this study we model the slowdown of tracer particles with the continuous Parker-Paola-Leclair morphodynamic framework. This continuous, i.e., not layer-based, framework rests on a stochastic description of the temporal variation of bed surface elevation and of the elevation-specific particle entrainment and deposition. Particle entrainment rates are computed as a function of the flow and sediment characteristics, while particle deposition is estimated with a step length formulation. Here we present one of the first implementations of the continuum framework at laboratory scale and its validation against

  14. SLS Navigation Model-Based Design Approach

    Science.gov (United States)

    Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas

    2018-01-01

    The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the Navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of overlap between the requirements development and management process, and the design and analysis process by efficiently combining the control (i.e. the requirement) and the design mechanisms. The design mechanism is the representation of the component behavior and performance in design and analysis tools. The focus in the early design process shifts from the development and

  15. Study of the coupling of geochemical models based on thermodynamic equilibrium with models of component transfer as solutions in porous media or fractures

    International Nuclear Information System (INIS)

    Coudrain-Ribstein, A.

    1985-01-01

    This study contributes to the analysis of possibilities for modelling the transfer of components in the subsurface while taking complex geochemical phenomena into account. In the first part, the aims and methodology of existing codes are presented. Transfer codes describe the physical phenomena of transport with great precision, but they rest on a very simple conceptualisation of the geochemical phenomena of retention by the rock. Geochemical models, by contrast, consider a stable unit volume: they allow computation of the equilibrium distribution of components between the chemical species of the solution and the solid and gaseous phases, drawing on large thermodynamic databases covering each possible reaction. To summarise the state of geochemical codes in Europe and the United States, a list of about thirty codes describes their methods and capabilities. A mathematical analysis of the different methods used in both types of codes is presented. The principles of a model combining the capabilities of the transport codes and the geochemical codes are then discussed. A simple coupling is not feasible: a general code must be established on the basis of the existing codes but also on new concepts and under new constraints. In such studies one must always deal with the problem of reaction kinetics. When reaction rates are large compared with the rates of the transport processes, the assumption of local geochemical equilibrium can be retained. A fully general code would be very cumbersome, expensive and difficult to use, and its results would be difficult to analyse and exploit. For each case study, on the other hand, a detailed analysis can point out many computational simplifications that do not simplify the concepts. [fr
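
    The simplest realisation of the coupling discussed above is operator splitting under the local-equilibrium assumption: advance transport one step, then re-equilibrate each cell. In the hedged sketch below, a linear sorption isotherm stands in for a full speciation solver, and all parameters are illustrative.

    ```python
    import numpy as np

    n_cells, courant, kd = 100, 0.5, 1.0   # grid cells, CFL number, sorption coeff.
    aqueous, sorbed = np.zeros(n_cells), np.zeros(n_cells)

    for _ in range(120):
        aqueous[0] = 1.0                   # fixed-concentration inlet
        # transport step: explicit upwind advection of the aqueous phase only
        aqueous[1:] -= courant * (aqueous[1:] - aqueous[:-1])
        # chemistry step: instantaneous linear equilibrium between the phases
        total = aqueous + sorbed
        aqueous, sorbed = total / (1 + kd), total * kd / (1 + kd)

    print("front near cell", int(np.argmax(aqueous < 0.5)))  # retarded by 1 + kd
    ```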

  16. An acoustical model based monitoring network

    NARCIS (Netherlands)

    Wessels, P.W.; Basten, T.G.H.; Eerden, F.J.M. van der

    2010-01-01

    In this paper the approach for an acoustical model-based monitoring network is demonstrated. This network is capable of reconstructing a noise map based on the combination of measured sound levels and an acoustic model of the area. By pre-calculating the sound attenuation within the network the

  17. Identifiability study of the proteins degradation model, based on ADM1, using simultaneous batch experiments

    DEFF Research Database (Denmark)

    Flotats, X.; Palatsi, J.; Ahring, Birgitte Kiær

    2006-01-01

    … are not inhibiting the hydrolysis process. The ADM1 model adequately expressed the consecutive steps of hydrolysis and acidogenesis, with estimated kinetic values corresponding to a fast acidogenesis and slower hydrolysis. The hydrolysis was found to be the rate-limiting step of anaerobic degradation. Estimation of yield coefficients based on the relative initial slopes of VFA profiles obtained in a simple batch experiment produced satisfactory results. From the identification study, it was concluded that it is possible to determine uniquely the related kinetic parameter values for protein degradation if the evolution of amino acids is measured in simultaneous batch experiments with different initial protein and amino acid concentrations.

  18. Model-Based Security Testing

    Directory of Open Access Journals (Sweden)

    Ina Schieferdecker

    2012-02-01

    Full Text Available Security testing aims at validating software system requirements related to security properties like confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for the specification of test cases at a higher level of abstraction, for enabling guidance on test identification and specification, and for automated test generation. Model-based security testing (MBST) is a relatively new field, dedicated to the systematic and efficient specification and documentation of security test objectives, security test cases and test suites, as well as to their automated or semi-automated generation. In particular, the combination of security modelling and test generation approaches is still a challenge in research and of high interest for industrial applications. MBST includes, e.g., security functional testing, model-based fuzzing, risk- and threat-oriented testing, and the usage of security test patterns. This paper provides a survey of MBST techniques and the related models, as well as samples of new methods and tools that are under development in the European ITEA2 project DIAMONDS.

  19. A web GIS based integrated flood assessment modeling tool for coastal urban watersheds

    Science.gov (United States)

    Kulkarni, A. T.; Mohanty, J.; Eldho, T. I.; Rao, E. P.; Mohan, B. K.

    2014-03-01

    Urban flooding has become an increasingly important issue in many parts of the world. In this study, an integrated flood assessment model (IFAM) is presented for coastal urban flood simulation. A web-based GIS framework has been adopted to organize the spatial datasets for the study area and to run the model within this framework. The integrated flood model consists of a mass-balance-based 1-D overland flow model, a 1-D finite-element channel flow model based on the diffusion wave approximation, and a quasi-2-D raster flood inundation model based on the continuity equation. The model code is written in MATLAB and the application is integrated within a web GIS server product, viz. Web Gram Server™ (WGS), developed at IIT Bombay using Java, JSP and jQuery technologies. Its user interface is developed using OpenLayers and the attribute data are stored in the MySQL open-source DBMS. The model is integrated within WGS and is called via JavaScript. The application has been demonstrated for two coastal urban watersheds of Navi Mumbai, India. Simulated flood extents for the extreme rainfall event of 26 July 2005 in the two urban watersheds of Navi Mumbai city are presented and discussed. The study demonstrates the effectiveness of the flood simulation tool in a web GIS environment in facilitating data access and the visualization of GIS datasets and simulation results.
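
    The quasi-2-D raster inundation component can be illustrated with a continuity-only scheme: each cell passes part of its water-surface head difference to lower neighbours until the surface levels out. The sketch below uses an invented 3 × 3 DEM and exchange coefficient; it is a toy version of the idea, not the IFAM code (which is in MATLAB).

    ```python
    import numpy as np

    dem = np.array([[3.0, 2.5, 2.0],
                    [2.8, 2.2, 1.8],
                    [2.6, 2.0, 1.5]])       # ground elevation (m), hypothetical
    depth = np.zeros_like(dem)
    depth[0, 0] = 1.0                        # water delivered by the channel model (m)

    for _ in range(200):                     # relax towards a level water surface
        for (i, j), _ in np.ndenumerate(dem):
            for ni, nj in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
                if 0 <= ni < dem.shape[0] and 0 <= nj < dem.shape[1]:
                    head = (dem[i, j] + depth[i, j]) - (dem[ni, nj] + depth[ni, nj])
                    if head > 0:             # move a fraction of the head difference,
                        q = min(depth[i, j], 0.25 * head)  # limited by available water
                        depth[i, j] -= q
                        depth[ni, nj] += q

    print(np.round(depth, 3))                # mass-conserving inundation depths
    ```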

  20. Sample size calculation to externally validate scoring systems based on logistic regression models.

    Directory of Open Access Journals (Sweden)

    Antonio Palazón-Bru

    Full Text Available A sample size containing at least 100 events and 100 non-events has been suggested to validate a predictive model, regardless of the model being validated, even though certain factors (discrimination, parameterization and incidence) can influence calibration of the predictive model. Scoring systems based on binary logistic regression models are a specific type of predictive model. The aim of this study was to develop an algorithm to determine the sample size for validating a scoring system based on a binary logistic regression model and to apply it to a case study. The algorithm was based on bootstrap samples in which the area under the ROC curve, the observed event probabilities through smooth curves, and a measure of the lack of calibration (the estimated calibration index) were calculated. To illustrate its use for interested researchers, the algorithm was applied to a scoring system, based on a binary logistic regression model, for determining mortality in intensive care units. In the case study provided, the algorithm obtained a sample size with 69 events, which is lower than the value suggested in the literature. An algorithm is provided for finding the appropriate sample size to validate scoring systems based on binary logistic regression models. This could be applied to determine the sample size in other similar cases.
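
    The bootstrap idea behind the algorithm can be sketched as follows: fix a candidate number of events, repeatedly simulate validation samples, and inspect the stability of the AUC (the calibration-index step is omitted here). The event rate and the "true" score model below are hypothetical stand-ins, not the paper's algorithm.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(2)

    def simulate_validation(n_events: int, event_rate: float = 0.2):
        """One hypothetical validation sample with exactly n_events events."""
        n_nonevents = round(n_events * (1 - event_rate) / event_rate)
        score = np.concatenate([rng.normal(1.0, 1.0, n_events),   # events score higher
                                rng.normal(0.0, 1.0, n_nonevents)])
        y = np.concatenate([np.ones(n_events, int), np.zeros(n_nonevents, int)])
        return score, y

    aucs = []
    for _ in range(500):                          # bootstrap-style replicates
        score, y = simulate_validation(n_events=69)   # 69 events, as in the case study
        aucs.append(roc_auc_score(y, score))
    print(f"AUC across replicates: {np.mean(aucs):.3f} +/- {np.std(aucs):.3f}")
    ```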

  1. Model-based Abstraction of Data Provenance

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, René Rydhof

    2014-01-01

    Identifying provenance of data provides insights into the origin of data and intermediate results, and has recently gained increased interest due to data-centric applications. In this work we extend a data-centric system view with actors handling the data and policies restricting actions. This extension is based on provenance analysis performed on system models. System models have been introduced to model and analyse spatial and organisational aspects of organisations, to identify, e.g., potential insider threats. Both the models and analyses are naturally modular; models can be combined into bigger models, and the analyses adapt accordingly. Our approach extends provenance both with the origin of data, the actors and processes involved in the handling of data, and policies applied while doing so. The model and corresponding analyses are based on a formal model of spatial and organisational…

  2. An Exploratory Study of the Butterfly Effect Using Agent-Based Modeling

    Science.gov (United States)

    Khasawneh, Mahmoud T.; Zhang, Jun; Shearer, Nevan E. N.; Rodriquez-Velasquez, Elkin; Bowling, Shannon R.

    2010-01-01

    This paper provides insights into the behavior of chaotic complex systems and the sensitive dependence of such systems on their initial starting conditions. How much does a small change in the initial conditions of a complex system affect it in the long term? Do complex systems exhibit what is called the "Butterfly Effect"? This paper uses an agent-based modeling approach to address these questions. An existing model from the NetLogo library was extended in order to compare chaotic complex systems with near-identical initial conditions. Results show that small changes in initial starting conditions can have a huge impact on the behavior of chaotic complex systems. The term "butterfly effect" is attributed to the work of Edward Lorenz [1]. It is used to describe the sensitive dependence of the behavior of chaotic complex systems on the initial conditions of those systems. The metaphor refers to the notion that a butterfly flapping its wings somewhere may cause extreme changes in the ecological system's behavior in the future, such as a hurricane.
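
    The sensitivity the paper probes can be demonstrated in a few lines with any chaotic map; the logistic map at r = 4 is a standard stand-in (not the NetLogo model used in the study). Two trajectories separated by 1e-10 initially diverge to order-one differences within a few dozen steps.

    ```python
    x, y = 0.4, 0.4 + 1e-10                        # two near-identical initial conditions
    for step in range(1, 61):
        x, y = 4 * x * (1 - x), 4 * y * (1 - y)    # chaotic logistic map, r = 4
        if step % 15 == 0:
            print(f"step {step:2d}: |x - y| = {abs(x - y):.3e}")
    ```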

  3. Making the most of sparse clinical data by using a predictive-model-based analysis, illustrated with a stavudine pharmacokinetic study.

    Science.gov (United States)

    Zhang, L; Price, R; Aweeka, F; Bellibas, S E; Sheiner, L B

    2001-02-01

    A small-scale clinical investigation was done to quantify the penetration of stavudine (D4T) into cerebrospinal fluid (CSF). A model-based analysis estimates the steady-state ratio of the AUCs of CSF and plasma concentrations (R(AUC)) to be 0.270, and the mean residence time of drug in the CSF to be 7.04 h. The analysis illustrates the advantages of a causal (scientific, predictive) model-based approach over a noncausal (empirical, descriptive) approach when the data, as here, exhibit certain problematic features commonly encountered in clinical data, namely (i) few subjects, (ii) sparse sampling, (iii) repeated measures, (iv) imbalance, and (v) individual design variation. These features generally require special attention in data analysis. The causal-model-based analysis deals with features (i) and (ii), both of which reduce efficiency, by combining data from different studies and adding subject-matter prior information. It deals with features (iii)-(v), all of which prevent 'averaging' individual data points directly, first, by adjusting in the model for interindividual data differences due to design differences; secondly, by explicitly differentiating between interpatient, interoccasion, and measurement-error variation; and lastly, by defining a scientifically meaningful estimand (R(AUC)) that is independent of design.

  4. Mars 2020 Model Based Systems Engineering Pilot

    Science.gov (United States)

    Dukes, Alexandra Marie

    2017-01-01

    The pilot study is led by the Integration Engineering group in NASA's Launch Services Program (LSP). The Integration Engineering (IE) group is responsible for managing the interfaces between the spacecraft and launch vehicle. This pilot investigates the utility of Model-Based Systems Engineering (MBSE) with respect to managing and verifying interface requirements. The main objectives of the pilot are to model several key aspects of the Mars 2020 integrated operations and interface requirements based on the design and verification artifacts from the Mars Science Laboratory (MSL), and to demonstrate how MBSE could be used by LSP to gain further insight into the interface between the spacecraft and launch vehicle as well as to enhance how LSP manages the launch service. The pilot began with familiarization with SysML, MagicDraw, and the Mars 2020 and MSL systems through books, tutorials, and NASA documentation. MSL was chosen as the focus of the model since its processes and verifications translate easily to the Mars 2020 mission. The study was further focused by modeling specialized systems and processes within MSL in order to demonstrate the utility of MBSE for the rest of the mission. The systems chosen were the In-Flight Disconnect (IFD) system and the Mass Properties process. The IFD was chosen since it is an interface between the spacecraft and launch vehicle and can demonstrate the usefulness of MBSE from a system perspective. The Mass Properties process was chosen since its verifications occur throughout the lifecycle and can demonstrate the usefulness of MBSE from a multi-discipline perspective. Several iterations of both perspectives have been modeled and evaluated. While the pilot study will continue for another 2 weeks, pros and cons of using MBSE for LSP IE have been identified. A pro of using MBSE includes an integrated view of the disciplines, requirements, and

  5. A Comparative Study of Marketing Channel Multiagent Stackelberg Model Based on Perfect Rationality and Fairness Preference

    Directory of Open Access Journals (Sweden)

    Kaihong Wang

    2014-01-01

    Full Text Available This paper studies a channel consisting of a manufacturer and two retailers. As a basis for comparison, a multiagent Stackelberg model is first constructed based on perfect rationality. Fairness preference theory is then embedded in the marketing channel multiagent Stackelberg model, and the results show that if the retailers have a jealous fairness preference, the manufacturer will reduce the wholesale price, the retailers will increase their effort level, product sales will increase, and the total channel utility and the manufacturer's utility will be Pareto-improved; however, the Pareto improvement of the retailers' utility depends on the interval of the jealousy fairness preference coefficient. If the retailers have a sympathetic fairness preference, the manufacturer increases the wholesale price, the retailers reduce their effort level, and the total channel utility, manufacturer's utility, and retailers' utility are all less than under no fairness preference.
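
    The perfect-rationality baseline can be sketched by backward induction on a one-retailer channel with linear demand; fairness preferences would enter through the retailer's utility function. All functional forms and parameter values below are illustrative, not the paper's.

    ```python
    a, c = 10.0, 2.0                      # demand intercept, unit production cost

    def retailer_price(w: float) -> float:
        """Follower's best response: maximize (p - w) * (a - p) over p."""
        return (a + w) / 2

    def manufacturer_profit(w: float) -> float:
        """Leader's payoff, anticipating the follower's response."""
        return (w - c) * (a - retailer_price(w))

    # crude grid search over wholesale prices for the leader's optimum
    w_star = max((c + i * 0.001 for i in range(8001)), key=manufacturer_profit)
    print(f"w* = {w_star:.2f}, p* = {retailer_price(w_star):.2f}")  # expect 6.00, 8.00
    ```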

  6. Model Validation in Ontology Based Transformations

    Directory of Open Access Journals (Sweden)

    Jesús M. Almendros-Jiménez

    2012-10-01

    Full Text Available Model Driven Engineering (MDE) is an emerging approach in software engineering. MDE emphasizes the construction of models from which the implementation should be derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling makes it possible to give a syntactic structure to source and target models. However, semantic requirements also have to be imposed on source and target models. A given transformation is sound when source and target models fulfill the syntactic and semantic requirements. In this paper, we present an approach for model validation in ODM-based transformations. Adopting a logic-programming-based transformational approach, we show how it is possible to transform and validate models. Properties to be validated range from structural and semantic requirements of models (pre- and post-conditions) to properties of the transformation (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.

  7. Regionalization Study of Satellite based Hydrological Model (SHM) in Hydrologically Homogeneous River Basins of India

    Science.gov (United States)

    Kumari, Babita; Paul, Pranesh Kumar; Singh, Rajendra; Mishra, Ashok; Gupta, Praveen Kumar; Singh, Raghvendra P.

    2017-04-01

    A new semi-distributed conceptual hydrological model, namely the Satellite based Hydrological Model (SHM), has been developed under the 'PRACRITI-2' program of the Space Application Centre (SAC), Ahmedabad, for sustainable water resources management of India using data from Indian Remote Sensing satellites. The whole of India is divided into 5 km × 5 km grid cells, and properties at the center of a cell are assumed to represent the properties of that cell. SHM contains five modules, namely surface water, forest, snow, groundwater and routing. Two empirical equations (SCS-CN and Hargreaves) and a water balance method are used in the surface water module; the forest module is based on calculations of water balance and subsurface dynamics. The 2-D Boussinesq equation, solved using an implicit finite-difference scheme, is used for groundwater modelling. The routing module follows a distributed routing approach, which requires the flow path and network, with travel time estimation as the key point. The aim of this study is to evaluate the performance of SHM using a regionalization technique, which also tests the usefulness of the model under data-scarce conditions or for ungauged basins. However, homogeneity analysis is a pre-requisite to regionalization. A similarity index (Φ) and hierarchical agglomerative cluster analysis are adopted to test the homogeneity of three basins, namely Brahmani (39,033 km²), Baitarani (10,982 km²) and Kangsabati (9,660 km²), with respect to the Subarnarekha (29,196 km²) basin, in terms of physical attributes. The results of both homogeneity analyses show that the Brahmani basin is the most homogeneous with respect to the Subarnarekha river basin in terms of physical characteristics (land use land cover classes, soil type and elevation). The calibration and validation of the model parameters of the Brahmani basin are in progress; these parameters are to be transferred into the SHM set-up of the Subarnarekha basin and the results compared with those of the calibrated and validated
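
    The SCS-CN relation used in SHM's surface-water module has a compact closed form; the sketch below implements the standard version with initial abstraction Ia = 0.2S. The curve number and rainfall depth are illustrative, not values from the study.

    ```python
    def scs_cn_runoff(p_mm: float, cn: float) -> float:
        """Direct runoff depth (mm) from event rainfall p_mm and curve number cn."""
        s = 25400.0 / cn - 254.0        # potential maximum retention (mm)
        ia = 0.2 * s                    # initial abstraction
        return 0.0 if p_mm <= ia else (p_mm - ia) ** 2 / (p_mm - ia + s)

    print(scs_cn_runoff(75.0, cn=80))   # ~30.9 mm of runoff for a 75 mm storm
    ```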

  8. Acid-base chemistry of white wine: analytical characterisation and chemical modelling.

    Science.gov (United States)

    Prenesti, Enrico; Berto, Silvia; Toso, Simona; Daniele, Pier Giuseppe

    2012-01-01

    A chemical model of the acid-base properties is optimized for each white wine under study, together with the calculation of their ionic strength, taking into account the contributions of all significant ionic species (strong electrolytes and weak ones sensitive to the chemical equilibria). Coupling the HPLC-IEC and HPLC-RP methods, we are able to quantify up to 12 carboxylic acids, the most relevant substances responsible for the acid-base equilibria of wine. The analytical concentrations of carboxylic acids and of other acid-base active substances were used as input, with the total acidity, for the chemical modelling step of the study based on the simultaneous treatment of overlapped protonation equilibria. New protonation constants were refined (L-lactic and succinic acids) with respect to our previous investigation on red wines. Attention was paid to the mixed solvent (ethanol-water mixture), ionic strength, and temperature to ensure the thermodynamic soundness of the study. Validation of the optimized chemical model is achieved by way of conductometric measurements and using a synthetic "wine" especially adapted for testing.
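
    The core of the modelling step, solving overlapped protonation equilibria, can be sketched by finding the pH that closes the charge balance for a mixture of weak acids. The sketch below includes only tartaric and L-lactic acid with textbook pKa values, and it ignores the ionic-strength and mixed-solvent corrections the paper applies.

    ```python
    from scipy.optimize import brentq

    # (total concentration mol/L, pKa list); values are textbook, not the paper's
    acids = [(0.020, [3.04, 4.37]),     # tartaric
             (0.015, [3.86])]           # L-lactic

    def charge_balance(ph: float) -> float:
        h = 10.0 ** -ph
        negative = 1e-14 / h                            # hydroxide
        for c_tot, pkas in acids:
            terms = [1.0]                               # cumulative Ka products / h^j
            for pka in pkas:
                terms.append(terms[-1] * 10.0 ** -pka / h)
            denom = sum(terms)
            for j in range(1, len(terms)):              # charge j of species A^(j-)
                negative += j * c_tot * terms[j] / denom
        return h - negative                             # zero at the equilibrium pH

    print("pH =", round(brentq(charge_balance, 1.0, 13.0), 2))
    ```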

  9. The effectiveness of snow cube throwing learning model based on exploration

    Science.gov (United States)

    Sari, Nenden Mutiara

    2017-08-01

    This study aimed to determine the effectiveness of the Snow Cube Throwing (SCT) and cooperative models in exploration-based math learning, in terms of the time required to complete the teaching materials and student engagement. This quasi-experimental study was conducted at SMPN 5 Cimahi, Indonesia. The population comprised all 382 grade VIII students of SMPN 5 Cimahi. The sample consisted of two classes selected through purposive sampling, each with 38 students. An observation sheet was used to record the time required to complete the teaching materials and the number of students involved in each meeting. The data were analysed using independent-samples t-tests and charts. The results show that the exploration-based SCT learning model is more effective than the exploration-based cooperative learning model in terms of the time required to complete exploration-based teaching materials and student engagement.

  10. Multivariate EMD-Based Modeling and Forecasting of Crude Oil Price

    Directory of Open Access Journals (Sweden)

    Kaijian He

    2016-04-01

    Full Text Available Recent empirical studies reveal evidence of the co-existence of heterogeneous data characteristics, distinguishable by time scale, in the movements of crude oil prices. In this paper we propose a new multivariate Empirical Mode Decomposition (EMD)-based model to take advantage of these heterogeneous characteristics of the price movement and to model them in the crude oil markets. Empirical studies in benchmark crude oil markets confirm that more diverse heterogeneous data characteristics can be revealed and modeled in the projected time-delayed domain. The proposed model demonstrates superior performance compared to the benchmark models.
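
    The decomposition step can be sketched with an off-the-shelf EMD implementation: the series is split into intrinsic mode functions (IMFs) by time scale, each of which can be modelled separately and recombined. This assumes the third-party PyEMD package (pip install EMD-signal); the "price" series is synthetic, not market data.

    ```python
    import numpy as np
    from PyEMD import EMD

    t = np.linspace(0.0, 1.0, 500)
    rng = np.random.default_rng(3)
    price = 60 + 10 * t + 3 * np.sin(40 * t) + rng.normal(0, 0.5, 500)  # synthetic

    imfs = EMD()(price)          # rows: fastest IMF first, residual trend last
    print("IMFs extracted:", imfs.shape[0])
    ```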

  11. A Nonlinear Model for Gene-Based Gene-Environment Interaction

    Directory of Open Access Journals (Sweden)

    Jian Sa

    2016-06-01

    Full Text Available A vast amount of literature has confirmed the role of gene-environment (G×E) interaction in the etiology of complex human diseases. Traditional methods are predominantly focused on the analysis of interaction between a single nucleotide polymorphism (SNP) and an environmental variable. Given that genes are the functional units, it is crucial to understand how gene effects (rather than single-SNP effects) are influenced by an environmental variable to affect disease risk. Motivated by the increasing awareness of the power of gene-based association analysis over single-variant-based approaches, in this work we proposed a sparse principal component regression (sPCR) model to understand the gene-based G×E interaction effect on complex disease. We first extracted the sparse principal components for the SNPs in a gene; the effect of each principal component was then modeled by a varying-coefficient (VC) model. The model can jointly model variants in a gene whose effects are nonlinearly influenced by an environmental variable. In addition, the varying-coefficient sPCR (VC-sPCR) model has a nice interpretation property, since the sparsity of the principal component loadings indicates the relative importance of the corresponding SNPs in each component. We applied our method to a human birth-weight dataset from a Thai population. We analyzed 12,005 genes across 22 chromosomes and found one significant interaction effect using the Bonferroni correction method and one suggestive interaction. The model performance was further evaluated through simulation studies. Our model provides a systems approach to evaluate gene-based G×E interaction.
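
    A rough sketch of the VC-sPCR idea: extract sparse principal components from the SNP matrix, then let each component's effect vary with the environment, here approximated by simple component-by-environment interaction terms rather than the paper's varying-coefficient smoothing. The data are simulated placeholders.

    ```python
    import numpy as np
    from sklearn.decomposition import SparsePCA
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(4)
    snps = rng.integers(0, 3, size=(300, 40)).astype(float)  # 0/1/2 genotypes
    env = rng.uniform(0.0, 1.0, 300)                         # environmental variable
    y = 0.5 * snps[:, 0] * env + rng.normal(0.0, 1.0, 300)   # G x E signal on SNP 0

    pcs = SparsePCA(n_components=3, random_state=0).fit_transform(snps)
    X = np.column_stack([pcs, pcs * env[:, None], env])      # PC, PC-by-env, env terms
    print("R^2 =", LinearRegression().fit(X, y).score(X, y))
    ```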

  12. Experimental study on unsteady open channel flow and bedload transport based on a physical model

    Science.gov (United States)

    Cao, W.

    2015-12-01

    Flow in natural rivers is usually unsteady, while nearly all theories of bedload transport are based on steady, uniform flow and an assumed equilibrium state of sediment transport. This may be one of the main reasons why bedload transport formulas are notoriously inaccurate in predicting bedload. The aim of this research is to shed light on the effect of unsteadiness on bedload transport, based on experimental studies. The novelty of this study is that the experiments were carried out not in a conventional flume but in a physical model, which is more similar to an actual river. Moreover, in our experiments multiple consecutive flood waves were reproduced in the physical model, and all the flow and sediment parameters are based on a large number of data obtained from many identical flood waves. This method allowed us to obtain more data for one flood and efficiently avoids the uncertainty of the bedload rate estimated from a single flood wave, which arises from the stochastic fluctuation of bedload transport. Three different flood waves were used in the experiments. During each experimental run, the water level at five positions along the model was measured by ultrasonic water level gauges, and the flow velocity at the middle of the channel was measured by a two-dimensional electromagnetic current meter. Moreover, the bedload transport rate was measured by a unique automatic trap collecting and weighing system at the end of the physical model. The results show that the celerity of flood wave propagation varies with flow conditions. The velocity distribution approximately accorded with the log-law profile during the entire rising and falling limbs of the flood. The bedload transport rate showed intense fluctuations in all the experiments; moreover, for the different flood waves, the moment when the shear stress reaches its maximum value is not the exact moment when the sediment transport rate reaches its maximum value, which indicates

  13. Reactor kinetics revisited: a coefficient based model (CBM)

    International Nuclear Information System (INIS)

    Ratemi, W.M.

    2011-01-01

    In this paper, a nuclear reactor kinetics model based on the Guelph expansion coefficient calculation (Coefficients Based Model, CBM) for n groups of delayed neutrons is developed. The accompanying characteristic equation is a polynomial form of the Inhour equation with the same coefficients as the CBM kinetics model. Those coefficients depend on universal abc-values, which in turn depend on the type of fuel fueling the reactor. Furthermore, such coefficients are linearly dependent on the inserted reactivity. In this paper, the universal abc-values are presented symbolically, for the first time, as well as with their numerical values for U-235 fueled reactors for one, two, three, and six groups of delayed neutrons. Simulation studies for constant and variable reactivity insertions are made with the CBM kinetics model, and comparisons with numerical solutions of classical kinetics models for one, two, three, and six groups of delayed neutrons are presented. The results show good agreement, especially for a single-step insertion of reactivity, with the advantage that the CBM solution does not encounter the stiffness problem accompanying the numerical solutions of the classical kinetics model. (author)
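
    For a single delayed-neutron group, the Inhour equation reduces to a quadratic in the inverse period w: Lam*w^2 + (Lam*lam + beta - rho)*w - rho*lam = 0, whose roots numpy can return directly. The parameter values below are typical U-235 figures for illustration, not the paper's universal abc-values.

    ```python
    import numpy as np

    Lam, beta, lam = 1e-4, 0.0065, 0.08   # generation time (s), delayed fraction, decay (1/s)
    rho = 0.001                           # inserted reactivity

    # polynomial form of the one-group Inhour equation
    roots = np.roots([Lam, Lam * lam + beta - rho, -rho * lam])
    omega = roots.max()                   # the positive root sets the asymptotic period
    print(f"inverse period = {omega:.4f} 1/s, period = {1 / omega:.1f} s")
    ```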

  14. Reaction time for trimolecular reactions in compartment-based reaction-diffusion models

    Science.gov (United States)

    Li, Fei; Chen, Minghan; Erban, Radek; Cao, Yang

    2018-05-01

    Trimolecular reaction models are investigated in the compartment-based (lattice-based) framework for stochastic reaction-diffusion modeling. The formulae for the first collision time and the mean reaction time are derived for the case where three molecules are present in the solution under periodic boundary conditions. For the case of reflecting boundary conditions, similar formulae are obtained using a computer-assisted approach. The accuracy of these formulae is further verified through comparison with numerical results. The presented derivation is based on the first passage time analysis of Montroll [J. Math. Phys. 10, 753 (1969)]. Montroll's results for two-dimensional lattice-based random walks are adapted and applied to compartment-based models of trimolecular reactions, which are studied in one-dimensional or pseudo one-dimensional domains.
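
    The quantity analysed above can be estimated by direct simulation: place three molecules on a periodic 1-D lattice and count steps until they first share a compartment. The sketch below uses asynchronous single-molecule hops (one randomly chosen molecule per step), an assumption made to keep the toy model simple; it is not the paper's derivation. Lattice size and trial count are arbitrary.

    ```python
    import random

    def first_collision_steps(n_sites: int = 21, max_steps: int = 10**7) -> int:
        pos = [random.randrange(n_sites) for _ in range(3)]   # three molecules
        for step in range(1, max_steps):
            i = random.randrange(3)                           # one molecule hops
            pos[i] = (pos[i] + random.choice((-1, 1))) % n_sites
            if pos[0] == pos[1] == pos[2]:                    # all in one compartment
                return step
        return max_steps

    random.seed(0)
    samples = [first_collision_steps() for _ in range(200)]
    print("mean steps to first triple collision:", sum(samples) / len(samples))
    ```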

  15. Computational-Model-Based Analysis of Context Effects on Harmonic Expectancy.

    Science.gov (United States)

    Morimoto, Satoshi; Remijn, Gerard B; Nakajima, Yoshitaka

    2016-01-01

    Expectancy for an upcoming musical chord, harmonic expectancy, is supposedly based on automatic activation of tonal knowledge. Since previous studies implicitly relied on interpretations based on Western music theory, the underlying computational processes involved in harmonic expectancy and how it relates to tonality need further clarification. In particular, short chord sequences which cannot lead to unique keys are difficult to interpret in music theory. In this study, we examined effects of preceding chords on harmonic expectancy from a computational perspective, using stochastic modeling. We conducted a behavioral experiment, in which participants listened to short chord sequences and evaluated the subjective relatedness of the last chord to the preceding ones. Based on these judgments, we built stochastic models of the computational process underlying harmonic expectancy. Following this, we compared the explanatory power of the models. Our results imply that, even when listening to short chord sequences, internally constructed and updated tonal assumptions determine the expectancy of the upcoming chord.

  16. Combretastatin A-4 based thiophene derivatives as antitumor agent: Development of structure activity correlation model using 3D-QSAR, pharmacophore and docking studies

    Directory of Open Access Journals (Sweden)

    Vijay K. Patel

    2017-12-01

    Full Text Available A combined structure- and ligand-based synergistic approach allows ligands to be designed more accurately. The present report discloses the combination of structure- and ligand-based tactics, i.e., molecular docking, energetic-based pharmacophore modeling, and pharmacophore- and atom-based 3D-QSAR modeling, for the analysis of thiophene derivatives as anticancer agents. The main purpose of using the synergistic approach is to ascertain a correlation between structure and biological activity. Thiophene derivatives have been found to possess cytotoxic activity in several cancer cell lines, and their mechanism of action basically involves binding to the colchicine site on β-tubulin. The structure-based approach (molecular docking) was performed on a series of thiophene derivatives. All the structures were docked into the colchicine binding site of β-tubulin to examine the binding affinity of the compounds for antitumor activity. The pharmacophore- and atom-based 3D-QSAR modeling was accomplished on a series of 32 thiophene analogues. A five-point common pharmacophore hypothesis (AAAAR.38) was selected for the alignment of all compounds. The atom-based 3D-QSAR models, developed with 23 compounds as the training set and 9 compounds as the test set, demonstrated good partial least squares (PLS) statistics. The generated common pharmacophore hypothesis and 3D-QSAR models were further validated externally by estimating the activity of database compounds and comparing it with the actual activity. The common pharmacophore hypothesis AAAAR.38 resulted in a 3D-QSAR model with excellent PLS data for factor two, characterized by the best prediction coefficient Q² (cross-validated r²) = 0.7213, regression R² = 0.8311, SD = 0.3672, F = 49.2, P = 1.89E-08, RMSE = 0.3864, stability = 0.8702, and Pearson r = 0.8722. The results of these molecular modeling studies, i.e., molecular docking, energetic-based pharmacophore, and pharmacophore- and atom-based 3D-QSAR modeling

  17. Agent-based modeling of noncommunicable diseases: a systematic review.

    Science.gov (United States)

    Nianogo, Roch A; Arah, Onyebuchi A

    2015-03-01

    We reviewed the use of agent-based modeling (ABM), a systems science method, in understanding noncommunicable diseases (NCDs) and their public health risk factors. We systematically reviewed studies in PubMed, ScienceDirect, and Web of Science published from January 2003 to July 2014. We retrieved 22 relevant articles; each had an observational or interventional design. Physical activity and diet were the most-studied outcomes. Often, single agent types were modeled, and the environment was usually irrelevant to the studied outcome. Predictive validation and sensitivity analyses were most used to validate models. Although increasingly used to study NCDs, ABM remains underutilized and, where used, is suboptimally reported in public health studies. Its use in studying NCDs will benefit from clarified best practices and improved rigor to establish its usefulness and facilitate replication, interpretation, and application.

  18. Knowledge-based inspection: modelling complex processes with the integrated Safeguards Modelling Method (iSMM)

    International Nuclear Information System (INIS)

    Abazi, F.

    2011-01-01

    Increased levels of complexity in almost every discipline and operation today raise the demand for the knowledge needed to run an organization successfully, whether to generate profit or to attain a non-profit mission. Traditional ways of transferring knowledge into information systems rich in data structures and complex algorithms continue to hinder the ability to swiftly turn concepts into operations. Diagrammatic modelling, commonly applied in engineering to represent concepts or reality, remains an excellent way of capturing knowledge from domain experts. The nuclear verification domain is a matter of ever greater importance to world safety and security. Demand for knowledge about the nuclear processes and verification activities used to offset potential misuse of nuclear technology will intensify as the technology spreads. This doctoral thesis contributes a model-based approach for representing complex processes such as nuclear inspections. The work presented also applies to other domains characterized by knowledge-intensive and complex processes. Based on the characteristics of a complex process, a conceptual framework was established as the theoretical basis for creating a number of modelling languages to represent the domain. The integrated Safeguards Modelling Method (iSMM) is formalized through an integrated meta-model. The diagrammatic modelling languages represent the verification domain and relevant nuclear verification aspects. Such a meta-model conceptualizes the relation between practices of process management, knowledge management and domain-specific verification principles. This fusion is considered necessary in order to create quality processes. The study also extends the formalization achieved through the meta-model by contributing a formalization language based on Pattern Theory. Through the use of graphical and mathematical constructs of the theory, process structures are formalized, enhancing

  1. Graph configuration model based evaluation of the education-occupation match.

    Science.gov (United States)

    Gadar, Laszlo; Abonyi, Janos

    2018-01-01

    To study education-occupation matching, we developed a bipartite network model of the education-to-work transition and a metric based on the graph configuration model. We studied the career paths of 15 thousand Hungarian students based on the integrated database of the National Tax Administration, the National Health Insurance Fund, and the higher education information system of the Hungarian Government. A brief analysis of the gender pay gap and the spatial distribution of over-education is presented to demonstrate the background of the research and the resulting open dataset. We highlight the hierarchical and clustered structure of the career paths based on a multi-resolution analysis of the graph modularity. The results of the cluster analysis can support policymakers in fine-tuning the fragmented program structure of higher education.
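
    The record above scores education-to-occupation links against a configuration-model baseline. A minimal sketch of that idea, assuming networkx is available and inventing a toy career-path table (the Hungarian registry data are not reproduced here):

        # Hypothetical sketch: score education->occupation links against a
        # bipartite configuration-model baseline (toy data, invented names).
        import networkx as nx

        # (degree program, occupation, number of graduates on that path)
        paths = [("CS", "software_dev", 120), ("CS", "analyst", 30),
                 ("economics", "analyst", 80), ("economics", "clerk", 40)]

        B = nx.Graph()
        for edu, occ, w in paths:
            B.add_edge(("edu", edu), ("occ", occ), weight=w)

        m = sum(d["weight"] for _, _, d in B.edges(data=True))
        k = dict(B.degree(weight="weight"))

        # Bipartite configuration-model expectation: E[w_uv] = k_u * k_v / m.
        # Ratios above 1 mark pairs matched more often than the
        # degree-preserving random baseline predicts (over-representation).
        for u, v, d in B.edges(data=True):
            print(u[1], "->", v[1], round(d["weight"] / (k[u] * k[v] / m), 2))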

  2. A Model of Homeschooling Based on Technology in Malaysia

    Science.gov (United States)

    Alias, Norlidah; Rahman, Mohd. Nazri Abdul; Siraj, Saedah; Ibrahim, Ruslina

    2013-01-01

    Homeschooling in Malaysia is a form of alternative education that emphasizes quality education based on moral values and belief in strengthening family ties. The purpose of this study is to produce a model of homeschooling technology-based learning activities in Malaysia as a guideline to improve the quality of education, curriculum and organize…

  3. 2008 GEM Modeling Challenge: Metrics Study of the Dst Index in Physics-Based Magnetosphere and Ring Current Models and in Statistical and Analytic Specifications

    Science.gov (United States)

    Rastaetter, L.; Kuznetsova, M.; Hesse, M.; Pulkkinen, A.; Glocer, A.; Yu, Y.; Meng, X.; Raeder, J.; Wiltberger, M.; Welling, D.; hide

    2011-01-01

    In this paper the metrics-based results of the Dst part of the 2008-2009 GEM Metrics Challenge are reported. The Metrics Challenge asked modelers to submit results for 4 geomagnetic storm events and 5 different types of observations that can be modeled by statistical or climatological or physics-based (e.g. MHD) models of the magnetosphere-ionosphere system. We present the results of over 25 model settings that were run at the Community Coordinated Modeling Center (CCMC) and at the institutions of various modelers for these events. To measure the performance of each of the models against the observations we use comparisons of one-hour averaged model data with the Dst index issued by the World Data Center for Geomagnetism, Kyoto, Japan, and direct comparison of one-minute model data with the one-minute Dst index calculated by the United States Geological Survey (USGS).
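
    As an illustration of the kind of point-wise skill metrics such a challenge relies on, the sketch below computes RMSE and a prediction-efficiency score for a model Dst trace against observations; the numbers are invented, and the challenge's actual metric definitions may differ:

        # Toy skill metrics for one model's hourly Dst trace (values invented).
        import numpy as np

        dst_obs = np.array([-12.0, -35.0, -80.0, -120.0, -90.0, -60.0])    # nT
        dst_model = np.array([-10.0, -30.0, -95.0, -100.0, -85.0, -70.0])  # nT

        rmse = np.sqrt(np.mean((dst_model - dst_obs) ** 2))
        # Prediction efficiency: 1 is a perfect trace, 0 is no better than
        # predicting the climatological mean of the observations.
        pe = 1 - np.sum((dst_model - dst_obs) ** 2) / \
                 np.sum((dst_obs - dst_obs.mean()) ** 2)
        print(f"RMSE = {rmse:.1f} nT, prediction efficiency = {pe:.2f}")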

  4. Data-based Non-Markovian Model Inference

    Science.gov (United States)

    Ghil, Michael

    2015-04-01

    This talk concentrates on obtaining stable and efficient data-based models for simulation and prediction in the geosciences and life sciences. The proposed model derivation relies on using a multivariate time series of partial observations from a large-dimensional system, and the resulting low-order models are compared with the optimal closures predicted by the non-Markovian Mori-Zwanzig formalism of statistical physics. Multilayer stochastic models (MSMs) are introduced as both a very broad generalization and a time-continuous limit of existing multilevel, regression-based approaches to data-based closure, in particular of empirical model reduction (EMR). We show that the multilayer structure of MSMs can provide a natural Markov approximation to the generalized Langevin equation (GLE) of the Mori-Zwanzig formalism. A simple correlation-based stopping criterion for an EMR-MSM model is derived to assess how well it approximates the GLE solution. Sufficient conditions are given for the nonlinear cross-interactions between the constitutive layers of a given MSM to guarantee the existence of a global random attractor. This existence ensures that no blow-up can occur for a very broad class of MSM applications. The EMR-MSM methodology is first applied to a conceptual, nonlinear, stochastic climate model of coupled slow and fast variables, in which only slow variables are observed. The resulting reduced model with energy-conserving nonlinearities captures the main statistical features of the slow variables, even when there is no formal scale separation and the fast variables are quite energetic. Second, an MSM is shown to successfully reproduce the statistics of a partially observed, generalized Lotka-Volterra model of population dynamics in its chaotic regime. The positivity constraint on the solutions' components replaces here the quadratic-energy-preserving constraint of fluid-flow problems and it successfully prevents blow-up. This work is based on a close

  5. An Emotional Agent Model Based on Granular Computing

    Directory of Open Access Journals (Sweden)

    Jun Hu

    2012-01-01

    Full Text Available Affective computing is of great significance for intelligent information processing and for harmonious communication between human beings and computers. A new emotional agent model is proposed in this paper to give agents the ability to handle emotions, based on granular computing theory and the traditional BDI agent model. Firstly, a new emotion knowledge base based on granular computing for emotion expression is presented in the model. Secondly, a new emotional reasoning algorithm based on granular computing is proposed. Thirdly, a new emotional agent model based on granular computing is presented. Finally, based on the model, an emotional agent for patient assistance in hospital is realized; experimental results show that it handles simple emotions efficiently.

  6. Biologically based neural circuit modelling for the study of fear learning and extinction

    Science.gov (United States)

    Nair, Satish S.; Paré, Denis; Vicentic, Aleksandra

    2016-11-01

    The neuronal systems that promote protective defensive behaviours have been studied extensively using Pavlovian conditioning. In this paradigm, an initially neutral conditioned stimulus is paired with an aversive unconditioned stimulus, leading the subjects to display behavioural signs of fear. Decades of research into the neural bases of this simple behavioural paradigm uncovered that the amygdala, a complex structure comprising several interconnected nuclei, is an essential part of the neural circuits required for the acquisition, consolidation and expression of fear memory. However, emerging evidence from the confluence of electrophysiological, tract-tracing, imaging, molecular, optogenetic and chemogenetic methodologies reveals that fear learning is mediated by multiple connections between several amygdala nuclei and their distributed targets, dynamic changes in plasticity in local circuit elements, and neuromodulatory mechanisms that promote synaptic plasticity. To uncover these complex relations and analyse the multi-modal data sets acquired from these studies, we argue that biologically realistic computational modelling, in conjunction with experiments, offers an opportunity to advance our understanding of the neural circuit mechanisms of fear learning and to address how their dysfunction may lead to maladaptive fear responses in mental disorders.

  7. CEAI: CCM based Email Authorship Identification Model

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah

    2013-01-01

    In this paper we present a model for email authorship identification (EAI) by employing a Cluster-based Classification (CCM) technique. Traditionally, stylometric features have been successfully employed in various authorship analysis tasks; we extend the traditional feature-set to include some...... more interesting and effective features for email authorship identification (e.g. the last punctuation mark used in an email, the tendency of an author to use capitalization at the start of an email, or the punctuation after a greeting or farewell). We also included Info Gain feature selection based...... reveal that the proposed CCM-based email authorship identification model, along with the proposed feature set, outperforms the state-of-the-art support vector machine (SVM)-based models, as well as the models proposed by Iqbal et al. [1, 2]. The proposed model attains an accuracy rate of 94% for 10...

  8. Comparison of a Conceptual Groundwater Model and Physically Based Groundwater Model

    Science.gov (United States)

    Yang, J.; Zammit, C.; Griffiths, J.; Moore, C.; Woods, R. A.

    2017-12-01

    Groundwater is a vital resource for human activities, including agricultural practice and urban water demand. Hydrologic modelling is an important way to study groundwater recharge, movement and discharge, and their response to both human activity and climate change. To understand groundwater hydrologic processes nationally in New Zealand, we have developed a conceptually based groundwater flow model, which is fully integrated into a national surface-water model (TopNet) and is able to simulate groundwater recharge, movement, and interaction with surface water. To demonstrate the capability of this groundwater model (TopNet-GW), we applied it to an irrigated area with water shortage and pollution problems in the upper Ruamahanga catchment in the Greater Wellington Region, New Zealand, and compared its performance with that of a physically based groundwater model (MODFLOW). The comparison includes river flow at flow-gauging sites and the interaction between groundwater and the river. Results showed that TopNet-GW produced similar flow and groundwater interaction patterns to the MODFLOW model but took less computation time. This shows that the conceptually based groundwater model has the potential to simulate national groundwater processes and could be used as a surrogate for the more physically based model.

  9. Getting water right: A case study in water yield modelling based on precipitation data.

    Science.gov (United States)

    Pessacg, Natalia; Flaherty, Silvia; Brandizi, Laura; Solman, Silvina; Pascual, Miguel

    2015-12-15

    Water yield is a key ecosystem service in river basins and especially in dry regions around the world. In this study we carry out a modelling analysis of water yields in the Chubut River basin, located in one of the driest districts of Patagonia, Argentina. We focus on the uncertainty around precipitation data, a driver of paramount importance for water yield. The objectives of this study are to: i) explore the spatial and numeric differences among six widely used global precipitation datasets for this region, ii) test them against data from independent ground stations, and iii) explore the effects of precipitation data uncertainty on simulations of water yield. The simulations were performed using the ecosystem services model InVEST (Integrated Valuation of Ecosystem Services and Tradeoffs) with each of the six precipitation datasets as input. Our results show marked differences among datasets for the Chubut watershed region, both in the magnitude of precipitation and in its spatial arrangement. Five of the precipitation databases overestimate the precipitation over the basin by 50% or more, particularly over the more humid western range, while the remaining dataset (Tropical Rainfall Measuring Mission - TRMM), based on satellite measurements, adjusts well to the observed rainfall at different stations throughout the watershed and provides a better representation of the precipitation gradient characteristic of the rain shadow of the Andes. The observed differences among datasets in the representation of the rainfall gradient translate into large differences in water yield simulations: errors in precipitation of +30% (-30%) amplify to water yield errors ranging from 50 to 150% (-45 to -60%) in some sub-basins. These results highlight the importance of assessing uncertainties in the main input data when quantifying and mapping ecosystem services with biophysical models, and caution against the uncritical use of global environmental datasets.

  10. Modeling of Individual and Organizational Factors Affecting Traumatic Occupational Injuries Based on the Structural Equation Modeling: A Case Study in Large Construction Industries.

    Science.gov (United States)

    Mohammadfam, Iraj; Soltanzadeh, Ahmad; Moghimbeigi, Abbas; Akbarzadeh, Mehdi

    2016-09-01

    Individual and organizational factors are among the factors influencing traumatic occupational injuries. The aim of the present study was a path analysis of the severity of occupational injuries based on individual and organizational factors. This cross-sectional analytical study covered traumatic occupational injuries over a ten-year timeframe in 13 large Iranian construction industries. Modeling and data analysis were done using the structural equation modeling (SEM) approach and the IBM SPSS AMOS statistical software version 22.0, respectively. The mean age and working experience of the injured workers were 28.03 ± 5.33 and 4.53 ± 3.82 years, respectively. Construction and installation activities accounted for 64.4% and 18.1% of traumatic occupational injuries, respectively. The SEM findings showed that individual factors, organizational factors and accident type were significant determinants of the severity of occupational injuries (P < 0.05) in large construction industries.

  11. Individual-based modeling of fish: Linking to physical models and water quality.

    Energy Technology Data Exchange (ETDEWEB)

    Rose, K.A.

    1997-08-01

    The individual-based modeling approach for simulating fish population and community dynamics is gaining popularity. Individual-based modeling has been used in many other fields, such as forest succession and astronomy. The popularity of the individual-based approach is partly a result of the lack of success of the more aggregate modeling approaches traditionally used for simulating fish population and community dynamics. Also, the recent recognition that it is often the atypical individual that survives has fostered interest in the individual-based approach. Two general types of individual-based models are distribution and configuration models. Distribution models follow the probability distributions of individual characteristics, such as length and age. Configuration models explicitly simulate each individual, the sum over individuals being the population. DeAngelis et al. (1992) showed that, when distribution and configuration models were formulated from the same common pool of information, both approaches generated similar predictions. The distribution approach was more compact and general, while the configuration approach was more flexible. Simple biological changes, such as making growth rate dependent on the previous day's growth rate, were easy to implement in the configuration version but prevented simple analytical solution of the distribution version.
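
    The distribution/configuration distinction drawn above can be made concrete with a toy cohort; the growth and mortality numbers below are invented, not taken from the cited work:

        # Toy contrast of the two IBM types (growth/mortality rates invented).
        import random

        random.seed(1)

        # Configuration model: every individual fish is simulated explicitly.
        fish = [{"length": random.gauss(10.0, 2.0)} for _ in range(1000)]
        for day in range(30):
            for f in fish:
                f["length"] += random.gauss(0.1, 0.02)          # daily growth
            fish = [f for f in fish if random.random() > 0.01]  # 1%/day mortality

        # Distribution model: only summary statistics of the cohort are tracked.
        mean_len, n = 10.0, 1000.0
        for day in range(30):
            mean_len, n = mean_len + 0.1, n * 0.99

        print(len(fish), sum(f["length"] for f in fish) / len(fish))
        print(round(n), round(mean_len, 2))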

  12. Feature-based component model for design of embedded systems

    Science.gov (United States)

    Zha, Xuan Fang; Sriram, Ram D.

    2004-11-01

    An embedded system is a hybrid of hardware and software, combining the flexibility of software with the real-time performance of hardware. Embedded systems can be considered as assemblies of hardware and software components. An Open Embedded System Model (OESM) is currently being developed at NIST to provide a standard representation and exchange protocol for embedded systems and for system-level design, simulation, and testing information. This paper proposes an approach to representing an embedded system feature-based model in OESM, i.e., an Open Embedded System Feature Model (OESFM), addressing models of embedded system artifacts, embedded system components, embedded system features, and embedded system configuration/assembly. The approach provides an object-oriented UML (Unified Modeling Language) representation for the embedded system feature model and defines an extension to the NIST Core Product Model. The model provides a feature-based component framework allowing the designer to develop a virtual embedded system prototype by assembling virtual components. The framework not only provides a formal, precise model of the embedded system prototype but also offers the possibility of designing variations of prototypes whose members are derived by changing certain virtual components with different features. A case study example is discussed to illustrate the embedded system model.

  13. A general model for membrane-based separation processes

    DEFF Research Database (Denmark)

    Soni, Vipasha; Abildskov, Jens; Jonsson, Gunnar Eigil

    2009-01-01

    behaviour will play an important role. In this paper, modelling of membrane-based processes for separation of gas and liquid mixtures are considered. Two general models, one for membrane-based liquid separation processes (with phase change) and another for membrane-based gas separation are presented....... The separation processes covered are: membrane-based gas separation processes, pervaporation and various types of membrane distillation processes. The specific model for each type of membrane-based process is generated from the two general models by applying the specific system descriptions and the corresponding...

  14. Context-model-based instruction in teaching EFL writing: A narrative inquiry

    Directory of Open Access Journals (Sweden)

    Zheng Lin

    2016-12-01

    Full Text Available This study aims to re-story the provision of context-model-based instruction in teaching EFL writing, focusing especially on students' development of the context model and on learning to guide EFL writing with the context model. The research data have been collected from audio recordings of the classroom instruction, the teacher-researcher's memos, and the students' reflections on their learning experience in the study. The findings that have resulted from this narrative inquiry show that (1) the context-model-based instruction has helped students develop their context model; (2) students could learn to configure the four elements of the context model (i.e., "the purpose of communication, the subject matter, the relationship with the reader and the normal pattern of presentation"); and (3) students could learn to be mindful in proactively applying the context model in the process of EFL writing to manage the situated, dynamic and intercultural issues involved.

  15. Modeling uranium transport in acidic contaminated groundwater with base addition

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Fan [Institute of Tibetan Plateau Research, Chinese Academy of Sciences; Luo, Wensui [ORNL; Parker, Jack C. [University of Tennessee, Knoxville (UTK); Brooks, Scott C [ORNL; Watson, David B [ORNL; Jardine, Philip [University of Tennessee, Knoxville (UTK); Gu, Baohua [ORNL

    2011-01-01

    This study investigates reactive transport modeling in a column of uranium(VI)-contaminated sediments with base additions in the circulating influent. The groundwater and sediment exhibit oxic conditions with low pH, high concentrations of NO{sub 3}{sup -}, SO{sub 4}{sup 2-}, U and various metal cations. Preliminary batch experiments indicate that additions of strong base induce rapid immobilization of U for this material. In the column experiment that is the focus of the present study, effluent groundwater was titrated with NaOH solution in an inflow reservoir before reinjection to gradually increase the solution pH in the column. An equilibrium hydrolysis, precipitation and ion exchange reaction model developed through simulation of the preliminary batch titration experiments predicted faster reduction of aqueous Al than observed in the column experiment. The model was therefore modified to consider reaction kinetics for the precipitation and dissolution processes which are the major mechanism for Al immobilization. The combined kinetic and equilibrium reaction model adequately described variations in pH, aqueous concentrations of metal cations (Al, Ca, Mg, Sr, Mn, Ni, Co), sulfate and U(VI). The experimental and modeling results indicate that U(VI) can be effectively sequestered with controlled base addition due to sorption by slowly precipitated Al with pH-dependent surface charge. The model may prove useful to predict field-scale U(VI) sequestration and remediation effectiveness.

  16. Modeling uranium transport in acidic contaminated groundwater with base addition

    International Nuclear Information System (INIS)

    Zhang Fan; Luo Wensui; Parker, Jack C.; Brooks, Scott C.; Watson, David B.; Jardine, Philip M.; Gu Baohua

    2011-01-01

    This study investigates reactive transport modeling in a column of uranium(VI)-contaminated sediments with base additions in the circulating influent. The groundwater and sediment exhibit oxic conditions with low pH, high concentrations of NO{sub 3}{sup -}, SO{sub 4}{sup 2-}, U and various metal cations. Preliminary batch experiments indicate that additions of strong base induce rapid immobilization of U for this material. In the column experiment that is the focus of the present study, effluent groundwater was titrated with NaOH solution in an inflow reservoir before reinjection to gradually increase the solution pH in the column. An equilibrium hydrolysis, precipitation and ion exchange reaction model developed through simulation of the preliminary batch titration experiments predicted faster reduction of aqueous Al than observed in the column experiment. The model was therefore modified to consider reaction kinetics for the precipitation and dissolution processes which are the major mechanism for Al immobilization. The combined kinetic and equilibrium reaction model adequately described variations in pH, aqueous concentrations of metal cations (Al, Ca, Mg, Sr, Mn, Ni, Co), sulfate and U(VI). The experimental and modeling results indicate that U(VI) can be effectively sequestered with controlled base addition due to sorption by slowly precipitated Al with pH-dependent surface charge. The model may prove useful to predict field-scale U(VI) sequestration and remediation effectiveness.

  17. Modeling uranium transport in acidic contaminated groundwater with base addition

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Fan, E-mail: zhangfan@itpcas.ac.cn [Key Laboratory of Tibetan Environment Changes and Land Surface Processes, Institute of Tibetan Plateau Research, Chinese Academy of Sciences, P.O. Box 2871, Beijing, 100085 (China); Luo Wensui [Institute of Urban Environment, Chinese Academy of Sciences, Xiamen, 361021 (China); Parker, Jack C. [Institute for a Secure and Sustainable Environment, Department of Civil and Environmental Engineering, University of Tennessee, Knoxville, TN 37996 (United States); Brooks, Scott C.; Watson, David B. [Environmental Sciences Division, Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States); Jardine, Philip M. [Biosystems Engineering and Soil Science Department, University of Tennessee, Knoxville, TN 37996 (United States); Gu Baohua [Environmental Sciences Division, Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States)

    2011-06-15

    This study investigates reactive transport modeling in a column of uranium(VI)-contaminated sediments with base additions in the circulating influent. The groundwater and sediment exhibit oxic conditions with low pH, high concentrations of NO{sub 3}{sup -}, SO{sub 4}{sup 2-}, U and various metal cations. Preliminary batch experiments indicate that additions of strong base induce rapid immobilization of U for this material. In the column experiment that is the focus of the present study, effluent groundwater was titrated with NaOH solution in an inflow reservoir before reinjection to gradually increase the solution pH in the column. An equilibrium hydrolysis, precipitation and ion exchange reaction model developed through simulation of the preliminary batch titration experiments predicted faster reduction of aqueous Al than observed in the column experiment. The model was therefore modified to consider reaction kinetics for the precipitation and dissolution processes which are the major mechanism for Al immobilization. The combined kinetic and equilibrium reaction model adequately described variations in pH, aqueous concentrations of metal cations (Al, Ca, Mg, Sr, Mn, Ni, Co), sulfate and U(VI). The experimental and modeling results indicate that U(VI) can be effectively sequestered with controlled base addition due to sorption by slowly precipitated Al with pH-dependent surface charge. The model may prove useful to predict field-scale U(VI) sequestration and remediation effectiveness.
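
    A toy numerical sketch of the kind of kinetic precipitation step that the three records above describe adding to the equilibrium model; the rate constant and solubility function are invented placeholders, not the paper's calibrated chemistry:

        # Toy kinetic step: aqueous Al relaxes toward a pH-dependent
        # solubility limit while base is titrated in (all values invented).
        def al_solubility(ph):
            # Placeholder gibbsite-like curve: solubility drops as pH rises.
            return 10.0 ** (-2.0 - 1.5 * (ph - 4.0))

        k, dt = 0.5, 0.1        # rate constant (1/h) and time step (h)
        al, ph = 1e-2, 4.0      # aqueous Al (mol/L), initial acidic pH
        for _ in range(100):
            ph = min(7.0, ph + 0.03)                 # gradual base addition
            sat = min(al, al_solubility(ph))         # precipitation only
            al += -k * (al - sat) * dt
        print(f"final pH = {ph:.2f}, aqueous Al = {al:.2e} mol/L")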

  18. Re-evaluation of model-based light-scattering spectroscopy for tissue spectroscopy

    Science.gov (United States)

    Lau, Condon; Šćepanović, Obrad; Mirkovic, Jelena; McGee, Sasha; Yu, Chung-Chieh; Fulghum, Stephen; Wallace, Michael; Tunnell, James; Bechtel, Kate; Feld, Michael

    2009-01-01

    Model-based light scattering spectroscopy (LSS) seemed a promising technique for in-vivo diagnosis of dysplasia in multiple organs. In those studies, the residual spectrum, the difference between the observed and modeled diffuse reflectance spectra, was attributed to single elastic light scattering from epithelial nuclei, and diagnostic information due to nuclear changes was extracted from it. We show that this picture is incorrect. The actual single scattering signal arising from epithelial nuclei is much smaller than the previously computed residual spectrum, and does not have the wavelength dependence characteristic of Mie scattering. Rather, the residual spectrum largely arises from assuming a uniform hemoglobin distribution. In fact, hemoglobin is packaged in blood vessels, which alters the reflectance. When we include vessel packaging, which accounts for an inhomogeneous hemoglobin distribution, in the diffuse reflectance model, the reflectance is modeled more accurately, greatly reducing the amplitude of the residual spectrum. These findings are verified via numerical estimates based on light propagation and Mie theory, tissue phantom experiments, and analysis of published data measured from Barrett's esophagus. In future studies, vessel packaging should be included in the model of diffuse reflectance and use of model-based LSS should be discontinued. PMID:19405760

  19. Model Integrated Problem Solving Based Learning pada Perkuliahan Dasar-dasar Kimia Analitik

    Directory of Open Access Journals (Sweden)

    Indarini Dwi Pursitasari

    2013-07-01

    Full Text Available Abstract: Integrated Problem Solving Based Learning Model on Foundations of Analytical Chemistry. This study was conducted to determine the effects of the Integrated Problem Solving Based Learning (IPSBL) model on the problem-solving skills and cognitive ability of pre-service teachers. The subjects of the study were 41 pre-service teachers, 21 in the experimental group and 20 in the control group. The data were collected through a test on problem-solving skills, a test on cognitive ability, and a questionnaire on the students' opinions on the use of the IPSBL model. The quantitative data were analyzed using t-tests and one-way ANOVA with SPSS 16.0, and the qualitative data were analyzed by counting percentages. The results of the study show that the implementation of the IPSBL model increased the problem-solving skills and cognitive ability of the pre-service teachers, and that the model was responded to positively by the research subjects.

  20. Mathematical Modelling Research in Turkey: A Content Analysis Study

    Science.gov (United States)

    Çelik, H. Coskun

    2017-01-01

    The aim of the present study was to examine the mathematical modelling studies conducted between 2004 and 2015 in Turkey and to reveal their tendencies. Forty-nine studies were selected using purposeful sampling based on the term "mathematical modelling" with the Higher Education Academic Search Engine. They were analyzed with content analysis.…

  1. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Computer Based Modelling and Simulation – Modelling Deterministic Systems. N K Srinivasan. General Article, Resonance – Journal of Science Education, Volume 6, Issue 3, March 2001, pp 46-54.

  2. SDRAM-based packet buffer model for high speed switches

    DEFF Research Database (Denmark)

    Rasmussen, Anders; Ruepp, Sarah Renée; Berger, Michael Stübert

    2011-01-01

    based on the specifications of a real-life DDR3-SDRAM chip. Based on this model the performance of different schemes for optimizing the performance of such a packet buffer can be evaluated. The purpose of this study is to find efficient schemes for memory mapping of the packet queues and I/O traffic...

  3. Humidification of base flow gas during adult high-frequency oscillatory ventilation: an experimental study using a lung model.

    Science.gov (United States)

    Shiba, Naoki; Nagano, Osamu; Hirayama, Takahiro; Ichiba, Shingo; Ujike, Yoshihito

    2012-01-01

    In adult high-frequency oscillatory ventilation (HFOV) with an R100 artificial ventilator, exhaled gas from the patient's lung may warm the temperature probe and thereby disturb the humidification of the base flow (BF) gas. Using an original lung model, we measured the humidity of the BF gas during HFOV with frequencies of 6, 8 and 10 Hz, maximum stroke volumes (SV) of 285, 205 and 160 ml at the respective frequencies, and BFs of 20, 30 and 40 l/min. The R100 device was equipped with a heated humidifier, Hummax II, consisting of a porous hollow fiber in the circuit. A 50-cm length of circuit was added between the temperature probe (located 50 cm proximal to the Y-piece) and the hollow fiber. The lung model was made of a plastic container and a circuit equipped with another Hummax II; its temperature was controlled at 37°C. The Hummax II of the R100 was inactivated in study 1 and was set at 35°C or 37°C in study 2. The humidity was measured at the distal end of the added circuit in study 1 and at the proximal end in study 2. In study 1, humidity was detected at 6 Hz (SV 285 ml) and BF 20 l/min, indicating that exhaled gas from the lung model directly reached the temperature probe. In study 2, the absolute humidity of the BF gas decreased with increasing SV and with increasing BF, and it was low at the 35°C setting.

  4. Does Accrual Management Impair the Performance of Earnings-Based Valuation Models?

    OpenAIRE

    Lucie Courteau; Jennifer L. Kao; Yao Tian

    2013-01-01

    This study examines empirically how the presence of accrual management may affect firm valuation. We compare the performance of earnings-based and non-earnings-based valuation models, represented by Residual Income Model (RIM) and Discounted Cash Flow (DCF), respectively, based on the absolute percentage pricing and valuation errors for two subsets of US firms: “Suspect” firms that are likely to have engaged in accrual management and “Normal” firms matched on industry, year and size. Results ...

  5. Model Based Mission Assurance in a Model Based Systems Engineering (MBSE) Framework: State-of-the-Art Assessment

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.

    2016-01-01

    This report explores the current state of the art of Safety and Mission Assurance (S&MA) in projects that have shifted towards Model Based Systems Engineering (MBSE). Its goal is to provide insight into how NASA's Office of Safety and Mission Assurance (OSMA) should respond to this shift. In MBSE, systems engineering information is organized and represented in models: rigorous computer-based representations, which collectively make many activities easier to perform, less error prone, and scalable. S&MA practices must shift accordingly. The "Objective Structure Hierarchies" recently developed by OSMA provide the framework for understanding this shift. Although the objectives themselves will remain constant, S&MA practices (activities, processes, tools) to achieve them are subject to change. This report presents insights derived from literature studies and interviews. The literature studies gleaned assurance implications from reports of space-related applications of MBSE. The interviews with knowledgeable S&MA and MBSE personnel discovered concerns and ideas for how assurance may adapt. Preliminary findings and observations are presented on the state of practice of S&MA with respect to MBSE, how it is already changing, and how it is likely to change further. Finally, recommendations are provided on how to foster the evolution of S&MA to best fit with MBSE.

  6. Application of model-based and knowledge-based measuring methods as analytical redundancy

    International Nuclear Information System (INIS)

    Hampel, R.; Kaestner, W.; Chaker, N.; Vandreier, B.

    1997-01-01

    The safe operation of nuclear power plants requires the application of modern and intelligent methods of signal processing for normal operation as well as for the management of accident conditions. Such modern and intelligent methods are model-based and knowledge-based ones, founded on analytical knowledge (mathematical models) as well as experience (fuzzy information). In addition to the existing hardware redundancies, analytical redundancies will be established with the help of these modern methods. These analytical redundancies support the operating staff during decision-making. The design of a hybrid model-based and knowledge-based measuring method will be demonstrated by the example of a fuzzy-supported observer. Within the fuzzy-supported observer a classical linear observer is connected with a fuzzy-supported adaptation of the model matrices of the observer model. This application is realized for the estimation of non-measurable variables, such as steam content and mixture level, within pressure vessels containing a water-steam mixture during accidental depressurizations. For this example the existing non-linearities will be classified and the verification of the model will be explained. The advantages of the hybrid method in comparison to the classical model-based measuring methods will be demonstrated by the estimation results. The consideration of the parameters which have an important influence on the non-linearities requires the inclusion of high-dimensional structures of fuzzy logic within the model-based measuring methods. Therefore methods will be presented which allow the conversion of these high-dimensional structures to two-dimensional structures of fuzzy logic. As an efficient solution of this problem, a method based on cascaded fuzzy controllers will be presented. (author). 2 refs, 12 figs, 5 tabs

  7. Biologically based modelling and simulation of carcinogenesis at low doses

    International Nuclear Information System (INIS)

    Ouchi, Noriyuki B.

    2003-01-01

    The process of carcinogenesis is studied by computer simulation. In general, a large number of experimental samples is needed to detect mutations at low doses, but in practice it is difficult to obtain such a large number of data. To meet the requirements of the low-dose situation, it is useful to study the process of carcinogenesis using biologically based mathematical models. We have mainly studied it using the so-called 'multi-stage model'; the model becomes complicated as the recent new findings of molecular biological experiments are adopted. Moreover, because the basic idea of the multi-stage model is based on epidemiologic data showing log-log variation of cancer incidence with age, it is difficult to compare with experimental data from irradiated cell culture systems, which have been increasing in recent years. Taking the above into consideration, we concluded that a new model was needed with the following features: 1) the unit of the target system is a cell, 2) new information from molecular biology can be easily introduced, and 3) it has spatial coordinates for checking colony formation or tumorigenesis. In this presentation, we show the details of the model and some simulation results on carcinogenesis. (author)

  8. Towards computer-based perception by modeling visual perception: A probabilistic theory

    NARCIS (Netherlands)

    Ciftcioglu, O.; Bittermann, M.; Sariyildiz, S.

    2006-01-01

    Studies on computer-based perception by vision modelling are described. Visual perception is mathematically modelled, with the model receiving and interpreting visual data from the environment. Perception is defined in probabilistic terms so that it is quantified in the same way. Human visual

  9. A mathematical framework for agent based models of complex biological networks.

    Science.gov (United States)

    Hinkelmann, Franziska; Murrugarra, David; Jarrah, Abdul Salam; Laubenbacher, Reinhard

    2011-07-01

    Agent-based modeling and simulation is a useful method to study biological phenomena in a wide range of fields, from molecular biology to ecology. Since there is currently no agreed-upon standard way to specify such models, it is not always easy to use published models. Also, since model descriptions are not usually given in mathematical terms, it is difficult to bring mathematical analysis tools to bear, so that models are typically studied through simulation. In order to address this issue, Grimm et al. proposed a protocol for model specification, the so-called ODD protocol, which provides a standard way to describe models. This paper proposes an addition to the ODD protocol which allows the description of an agent-based model as a dynamical system, which provides access to computational and theoretical tools for its analysis. The mathematical framework is that of algebraic models, that is, time-discrete dynamical systems with algebraic structure. It is shown by way of several examples how this mathematical specification can help with model analysis. This mathematical framework can also accommodate other model types such as Boolean networks and the more general logical models, as well as Petri nets.
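
    To make the proposed specification concrete, here is a minimal time-discrete dynamical system with algebraic structure (a 3-node Boolean network, invented for illustration); enumerating the state space yields the kind of analysis, such as finding fixed points, that simulation alone does not provide:

        # A 3-node Boolean network as a time-discrete dynamical system
        # (rules invented for illustration, not taken from the paper).
        from itertools import product

        def step(state):
            a, b, c = state
            return (b, a, a and c)   # a' = b, b' = a, c' = a AND c

        # Exhaustive state-space analysis, unavailable from simulation alone:
        # enumerate all 2^3 states and report the system's fixed points.
        for s in product([False, True], repeat=3):
            if step(s) == s:
                print("fixed point:", s)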

  10. Effects of creating video-based modeling examples on learning and transfer

    NARCIS (Netherlands)

    Hoogerheide, Vincent; Loyens, Sofie M M; van Gog, Tamara

    2014-01-01

    Two experiments investigated whether acting as a peer model for a video-based modeling example, which entails studying a text with the intention to explain it to others and then actually explaining it on video, would foster learning and transfer. In both experiments, novices were instructed to study

  11. The Martian Water Cycle Based on 3-D Modeling

    Science.gov (United States)

    Houben, H.; Haberle, R. M.; Joshi, M. M.

    1999-01-01

    Understanding the distribution of Martian water is a major goal of the Mars Surveyor program. However, until the bulk of the data from the nominal missions of TES, PMIRR, GRS, MVACS, and the DS2 probes are available, we are bound to be in a state where much of our knowledge of the seasonal behavior of water is based on theoretical modeling. We therefore summarize the results of this modeling at the present time. The most complete calculations come from a somewhat simplified treatment of the Martian climate system which is capable of simulating many decades of weather. More elaborate meteorological models are now being applied to study of the problem. The results show a high degree of consistency with observations of aspects of the Martian water cycle made by Viking MAWD, a large number of ground-based measurements of atmospheric column water vapor, studies of Martian frosts, and the widespread occurrence of water ice clouds. Additional information is contained in the original extended abstract.

  12. A polynomial based model for cell fate prediction in human diseases.

    Science.gov (United States)

    Ma, Lichun; Zheng, Jie

    2017-12-21

    Cell fate regulation directly affects tissue homeostasis and human health. Research on cell fate decisions sheds light on key regulators, facilitates understanding of the mechanisms, and suggests novel strategies to treat human diseases that are related to abnormal cell development. In this study, we proposed a polynomial based model, derived from the Taylor series, to predict cell fate. As a case study, gene expression data of pancreatic cells were adopted to test and verify the model. As numerous features (genes) are available, we employed two kinds of feature selection methods, i.e., correlation-based and apoptosis-pathway-based. Then polynomials of different degrees were used to refine the cell fate prediction function, and 10-fold cross-validation was carried out to evaluate the performance of our model. In addition, we analyzed the stability of the resulting cell fate prediction model by evaluating the ranges of the parameters, as well as assessing the variances of the predicted values at randomly selected points. Results show that, with both of the considered gene selection methods, the prediction accuracies of polynomials of different degrees show little difference. Interestingly, the linear polynomial (degree 1) is more stable than the others. When comparing the linear polynomials based on the two gene selection methods, it appears that although the accuracy of the linear polynomial that uses the correlation analysis outcomes is a little higher (86.62%), the one using genes of the apoptosis pathway is much more stable. Considering both the prediction accuracy and the stability of polynomial models of different degrees, the linear model is the preferred choice for cell fate prediction with gene expression data of pancreatic cells. The presented cell fate prediction model can be extended to other cells, which may be important for basic research as well as clinical studies of cell development related diseases.
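
    A sketch of the degree-comparison experiment described above, using scikit-learn on synthetic data (the pancreatic expression data and the selected gene lists are not reproduced here):

        # Degree comparison with 10-fold CV on synthetic data (not the
        # pancreatic dataset); scikit-learn is assumed to be available.
        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import PolynomialFeatures, StandardScaler

        X, y = make_classification(n_samples=300, n_features=10, random_state=0)

        # Accuracy mean/std across folds: the spread is a rough stability cue.
        for degree in (1, 2, 3):
            model = make_pipeline(PolynomialFeatures(degree), StandardScaler(),
                                  LogisticRegression(max_iter=1000))
            scores = cross_val_score(model, X, y, cv=10)
            print(degree, round(scores.mean(), 3), round(scores.std(), 3))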

  13. A multi-objective constraint-based approach for modeling genome-scale microbial ecosystems.

    Science.gov (United States)

    Budinich, Marko; Bourdon, Jérémie; Larhlimi, Abdelhalim; Eveillard, Damien

    2017-01-01

    Interplay within microbial communities impacts ecosystems on several scales, and elucidation of the consequent effects is a difficult task in ecology. In particular, the integration of genome-scale data within quantitative models of microbial ecosystems remains elusive. This study advocates the use of constraint-based modeling to build predictive models from recent high-resolution -omics datasets. Following recent studies that have demonstrated the accuracy of constraint-based models (CBMs) for simulating single-strain metabolic networks, we sought to study microbial ecosystems as a combination of single-strain metabolic networks that exchange nutrients. This study presents two multi-objective extensions of CBMs for modeling communities: multi-objective flux balance analysis (MO-FBA) and multi-objective flux variability analysis (MO-FVA). Both methods were applied to a hot spring mat model ecosystem. As a result, multiple trade-offs between nutrients and growth rates, as well as thermodynamically favorable relative abundances at the community level, were highlighted. We expect this approach to be used for integrating genomic information in microbial ecosystems. The resulting models will provide insights into behaviors (including diversity) that take place at the ecosystem scale.
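
    Underlying MO-FBA is ordinary flux balance analysis, which is a linear program; a minimal single-objective sketch with an invented three-reaction network:

        # Single-objective FBA as a linear program (toy 3-reaction network).
        import numpy as np
        from scipy.optimize import linprog

        # Rows: metabolites A, B. Columns: uptake of A, A -> B, biomass from B.
        # Steady state requires S @ v = 0; fluxes carry individual bounds.
        S = np.array([[1, -1,  0],
                      [0,  1, -1]])
        bounds = [(0, 10), (0, 8), (0, None)]

        # linprog minimizes, so negate the biomass flux v3 to maximize it.
        res = linprog(c=[0, 0, -1], A_eq=S, b_eq=[0, 0], bounds=bounds)
        print("optimal biomass flux:", res.x[2])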

  14. Landscape Epidemiology Modeling Using an Agent-Based Model and a Geographic Information System

    Directory of Open Access Journals (Sweden)

    S. M. Niaz Arifin

    2015-05-01

    Full Text Available A landscape epidemiology modeling framework is presented which integrates the simulation outputs from an established spatial agent-based model (ABM) of malaria with a geographic information system (GIS). For a study area in Kenya, five landscape scenarios are constructed with varying coverage levels of two mosquito-control interventions. For each scenario, maps are presented to show the average distributions of three output indices obtained from the results of 750 simulation runs. Hot spot analysis is performed to detect statistically significant hot spots and cold spots. Additional spatial analysis is conducted using ordinary kriging with circular semivariograms for all scenarios. The integration of epidemiological simulation-based results with spatial analysis techniques within a single modeling framework can be a valuable tool for conducting a variety of disease control activities such as exploring new biological insights, monitoring epidemiological landscape changes, and guiding resource allocation for further investigation.

  15. A hybrid agent-based approach for modeling microbiological systems.

    Science.gov (United States)

    Guo, Zaiyi; Sloot, Peter M A; Tay, Joc Cing

    2008-11-21

    Models for systems biology commonly adopt differential equation or agent-based modeling approaches for simulating the processes as a whole. Models based on differential equations presuppose phenomenological intracellular behavioral mechanisms, while models based on the multi-agent approach often use directly translated, and quantitatively less precise, if-then logical rule constructs. We propose an extendible systems model based on a hybrid agent-based approach where biological cells are modeled as individuals (agents) while molecules are represented by quantities. This hybridization in entity representation entails a combined modeling strategy with agent-based behavioral rules and differential equations, thereby balancing the requirements of extendible model granularity with computational tractability. We demonstrate the efficacy of this approach with models of chemotaxis involving an assay of 10^3 cells and 1.2x10^6 molecules. The model produces cell migration patterns that are comparable to laboratory observations.
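
    A toy sketch of the hybrid representation described above: molecules as a gridded quantity updated by a difference equation, and cells as discrete agents with an if-then movement rule (grid size, rates and rules are invented):

        # Hybrid toy: chemoattractant as a gridded quantity (difference
        # equation), cells as agents with an if-then movement rule.
        import random

        random.seed(0)
        N = 20
        chem = [[0.0] * N for _ in range(N)]
        cells = [[random.randrange(N), random.randrange(N)] for _ in range(5)]

        for t in range(50):
            chem[10][10] = 100.0                    # replenished source
            # Quantity layer: crude explicit diffusion on the grid interior.
            new = [row[:] for row in chem]
            for i in range(1, N - 1):
                for j in range(1, N - 1):
                    new[i][j] = chem[i][j] + 0.2 * (chem[i-1][j] + chem[i+1][j]
                                + chem[i][j-1] + chem[i][j+1] - 4 * chem[i][j])
            chem = new
            # Agent layer: each cell steps onto its highest-valued neighbour.
            for c in cells:
                i, j = c
                nbrs = [(i+di, j+dj) for di, dj in ((1,0),(-1,0),(0,1),(0,-1))
                        if 0 <= i+di < N and 0 <= j+dj < N]
                c[0], c[1] = max(nbrs, key=lambda p: chem[p[0]][p[1]])

        print(cells)  # agents accumulate around the source at (10, 10)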

  16. Cognitive components underpinning the development of model-based learning.

    Science.gov (United States)

    Potter, Tracey C S; Bryce, Nessa V; Hartley, Catherine A

    2017-06-01

    Reinforcement learning theory distinguishes "model-free" learning, which fosters reflexive repetition of previously rewarded actions, from "model-based" learning, which recruits a mental model of the environment to flexibly select goal-directed actions. Whereas model-free learning is evident across development, recruitment of model-based learning appears to increase with age. However, the cognitive processes underlying the development of model-based learning remain poorly characterized. Here, we examined whether age-related differences in cognitive processes underlying the construction and flexible recruitment of mental models predict developmental increases in model-based choice. In a cohort of participants aged 9-25, we examined whether the abilities to infer sequential regularities in the environment ("statistical learning"), maintain information in an active state ("working memory") and integrate distant concepts to solve problems ("fluid reasoning") predicted age-related improvements in model-based choice. We found that age-related improvements in statistical learning performance did not mediate the relationship between age and model-based choice. Ceiling performance on our working memory assay prevented examination of its contribution to model-based learning. However, age-related improvements in fluid reasoning statistically mediated the developmental increase in the recruitment of a model-based strategy. These findings suggest that gradual development of fluid reasoning may be a critical component process underlying the emergence of model-based learning. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
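
    The model-free/model-based distinction from reinforcement learning theory can be illustrated on a toy two-state task; the environment and learning parameters below are invented, not the task battery used in the study:

        # Toy two-state task contrasting the two strategies (numbers invented).
        import random

        random.seed(0)

        def env_step(s, a):           # action 0 stays, action 1 switches;
            s2 = s if a == 0 else 1 - s
            return s2, (1.0 if s2 == 1 else 0.0)   # reward only in state 1

        gamma, alpha = 0.9, 0.1
        Q = [[0.0, 0.0], [0.0, 0.0]]                   # model-free value cache
        counts = [[[0, 0], [0, 0]], [[0, 0], [0, 0]]]  # counts[s][a][s2]

        s = 0
        for t in range(2000):
            a = random.randrange(2)
            s2, r = env_step(s, a)
            # Model-free: cache values by temporal-difference update.
            Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
            # Model-based: learn the transition model from the same experience.
            counts[s][a][s2] += 1
            s = s2

        # Model-based planning: value iteration on the estimated model.
        V = [0.0, 0.0]
        for _ in range(100):
            V = [max(sum(counts[s][a][s2] / max(1, sum(counts[s][a]))
                         * ((1.0 if s2 == 1 else 0.0) + gamma * V[s2])
                         for s2 in (0, 1))
                     for a in (0, 1))
                 for s in (0, 1)]

        print("model-free Q:", Q)
        print("model-based V:", V)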

  17. Rule-based modularization in model transformation languages illustrated with ATL

    NARCIS (Netherlands)

    Ivanov, Ivan; van den Berg, Klaas; Jouault, Frédéric

    2007-01-01

    This paper studies ways of modularizing transformation definitions in current rule-based model transformation languages. Two scenarios are shown in which the modular units are identified on the basis of relations between source and target metamodels and on the basis of generic transformation

  18. An Agent-Based Monetary Production Simulation Model

    DEFF Research Database (Denmark)

    Bruun, Charlotte

    2006-01-01

    An Agent-Based Simulation Model Programmed in Objective Borland Pascal. Program and source code are downloadable.

  19. Information modelling and knowledge bases XXV

    CERN Document Server

    Tokuda, T; Jaakkola, H; Yoshida, N

    2014-01-01

    Because of our ever increasing use of and reliance on technology and information systems, information modelling and knowledge bases continue to be important topics in those academic communities concerned with data handling and computer science. As the information itself becomes more complex, so do the levels of abstraction and the databases themselves. This book is part of the series Information Modelling and Knowledge Bases, which concentrates on a variety of themes in the important domains of conceptual modeling, design and specification of information systems, multimedia information modelin

  20. Evaluating Emulation-based Models of Distributed Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Stephen T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Cyber Initiatives; Gabert, Kasimir G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Cyber Initiatives; Tarman, Thomas D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Emulytics Initiatives

    2017-08-01

    Emulation-based models of distributed computing systems are collections of virtual machines, virtual networks, and other emulation components configured to stand in for operational systems when performing experimental science, training, analysis of design alternatives, test and evaluation, or idea generation. As with any tool, we should carefully evaluate whether our uses of emulation-based models are appropriate and justified. Otherwise, we run the risk of using a model incorrectly and creating meaningless results. The variety of uses of emulation-based models each have their own goals and deserve thoughtful evaluation. In this paper, we enumerate some of these uses and describe approaches that one can take to build an evidence-based case that a use of an emulation-based model is credible. Predictive uses of emulation-based models, where we expect a model to tell us something true about the real world, set the bar especially high, and the principal evaluation method, called validation, is commensurately rigorous. We spend the majority of our time describing and demonstrating the validation of a simple predictive model using a well-established methodology inherited from decades of development in the computational science and engineering community.

  1. Integrating Biodiversity into Biosphere-Atmosphere Interactions Using Individual-Based Models (IBM)

    Science.gov (United States)

    Wang, B.; Shugart, H. H., Jr.; Lerdau, M.

    2017-12-01

    A key component regulating complex, nonlinear, and dynamic biosphere-atmosphere interactions is the inherent diversity of biological systems. The model frameworks currently in wide use (i.e., Plant Functional Type models) do not even begin to capture the metabolic and taxonomic diversity found in many terrestrial systems. We propose that a transition from PFT-based to individual-based modeling approaches (hereafter referred to as IBM) is essential for integrating biodiversity into research on biosphere-atmosphere interactions. The proposal emerges from our study of the interactions of forests with atmospheric processes in the context of climate change, using an individual-based forest volatile organic compounds model, UVAFME-VOC. This individual-based model can explicitly simulate VOC emissions on top of an explicit model of forest dynamics, computing the growth, death, and regeneration of each individual tree of different species and their competition for light, moisture, and nutrients; system-level VOC emissions are then obtained by explicitly computing and summing each individual's emissions. We found that elevated O3 significantly altered forest dynamics by favoring species that are O3-resistant but, at the same time, isoprene producers. Such compositional changes, on the one hand, resulted in unsuppressed forest productivity and carbon stock because of compensation by O3-resistant species. On the other hand, with more isoprene produced by the increased number of producers, a possible positive feedback loop between tropospheric O3 and the forest emerged. We also found that climate warming will not always stimulate isoprene emissions, because warming simultaneously reduces isoprene emissions by causing a decline in the abundance of isoprene-emitting species. These results suggest that species diversity is of great significance and that individual-based modelling strategies should be applied in studying biosphere-atmosphere interactions.

  2. Reduced material model for closed cell metal foam infiltrated with phase change material based on high resolution numerical studies

    International Nuclear Information System (INIS)

    Ohsenbrügge, Christoph; Marth, Wieland; Navarro y de Sosa, Iñaki; Drossel, Welf-Guntram; Voigt, Axel

    2016-01-01

    Highlights: • Closed-cell metal foam sandwich structures were investigated. • High-resolution numerical studies were conducted using CT scan data. • A reduced model for use in commercial FE software reduces the required degrees of freedom. • Thermal inertia is increased about 4 to 5 times in PCM-filled structures. • The reduced material model was verified using experimental data. - Abstract: The thermal behaviour of closed-cell metal foam infiltrated with paraffin wax as a latent heat storage for application in high-precision machine tools was examined. Aluminium foam sandwiches with metallically bound cover layers were prepared in a powder-metallurgical process, and cross-sectional images of the structures were generated with X-ray computed tomography. Based on the image data, a highly detailed three-dimensional model was derived and prepared for simulation with the adaptive FE library AMDiS. The pores were assumed to be filled with paraffin wax. The thermal conductivity and the transient thermal behaviour in the phase-change region were investigated. Based on the results of the highly detailed simulations, a reduced model for use in commercial FE software (ANSYS) was derived. It incorporates the properties of the matrix and the phase change material into a homogenized material. A sandwich structure with and without paraffin was investigated experimentally under constant thermal load. The results were used to verify the reduced material model in ANSYS.

  3. Ecosystem Based Business Model of Smart Grid

    OpenAIRE

    Lundgaard, Morten Raahauge; Ma, Zheng; Jørgensen, Bo Nørregaard

    2015-01-01

    This paper investigates the ecosystem-based business model in a smart grid infrastructure and the potential for value capture in a highly complex macro-infrastructure such as the smart grid. It proposes an alternative perspective for studying the smart grid business ecosystem in order to address infrastructural challenges, such as the interoperability of business components for the smart grid. So far, little research has explored the business ecosystem in the smart grid concept. The study on t...

  4. Identifying Multiple Levels of Discussion-Based Teaching Strategies for Constructing Scientific Models

    Science.gov (United States)

    Williams, Grant; Clement, John

    2015-01-01

    This study sought to identify specific types of discussion-based strategies that two successful high school physics teachers using a model-based approach utilized in attempting to foster students' construction of explanatory models for scientific concepts. We found evidence that, in addition to previously documented dialogical strategies that…

  5. A study on the energy management in domestic micro-grids based on Model Predictive Control strategies

    International Nuclear Information System (INIS)

    Bruni, G.; Cordiner, S.; Mulone, V.; Rocco, V.; Spagnolo, F.

    2015-01-01

    Highlights: • Development of a domestic microgrid and house thermal model. • Model Predictive Control for simultaneous management of power flows and thermal comfort. • Modeling of typical summer and winter conditions. • Comparison with the results of a standard rule-based controller. • Fuel cell output downsizing potential of up to 60%. - Abstract: In this paper a Model Predictive Control (MPC) logic, based on weather forecasts, has been applied to the analysis of power management in a domestic off-grid system. The system is laid out as the integration of renewable energy conversion devices (photovoltaic, PV), a high-efficiency programmable energy conversion system (a fuel cell, FC) and electrochemical energy storage (batteries). The control strategy has the objective of minimizing energy costs while maintaining optimal environmental comfort in the house, thus optimizing the use of renewable sources. To that aim, a validated numerical model of the whole system has been developed, and simulations have been carried out for winter and summer periods. The performance attainable with an MPC-based logic has been evaluated in comparison with a standard rule-based control logic, by means of cost and efficiency parameters of the micro-grid. Temperature violations have been taken into account to represent the impact of the control on comfort. Results show an improvement of the house comfort conditions and a lower use (on average 14.5%) of primary fossil energy. This is due both to a reduction of required energy and to an increased use of renewable energy sources. Moreover, the modulation of the HVAC load and of the FC operation reduces the requested power by approximately 40%. Smoother battery pack charge and discharge processes are also obtained. As a main positive effect, a reduction of the FC powerplant size and an increase of its durability seem feasible, leading to an overall reduction of capital costs.
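
    The receding-horizon idea behind MPC can be condensed to a few lines: solve a dispatch problem over a forecast window, apply only the first decision, then shift the window and re-solve. The sketch below poses one window as a linear program for a fuel-cell/battery/PV system; all forecasts, prices, and ratings are invented, and the paper's house thermal model and comfort constraints are omitted:

        # One receding-horizon window as a linear program (illustrative only).
        import numpy as np
        from scipy.optimize import linprog

        H = 6                                              # hours of look-ahead
        load = np.array([1.0, 1.2, 1.5, 2.0, 1.8, 1.3])    # kW, forecast demand
        pv = np.array([0.0, 0.5, 1.5, 2.5, 1.0, 0.0])      # kW, forecast PV
        fuel_cost = 0.25                                   # EUR/kWh of FC output
        soc0, cap, b_max, fc_max = 2.0, 5.0, 2.0, 3.0      # kWh, kWh, kW, kW

        net = load - pv                    # power the FC and battery must cover
        # decision vector x = [fc_0..fc_5, b_0..b_5]; b > 0 means discharging
        c = np.concatenate([np.full(H, fuel_cost), np.zeros(H)])
        A_eq = np.hstack([np.eye(H), np.eye(H)])           # fc_t + b_t = net_t
        tri = np.tril(np.ones((H, H)))                     # cumulative energy
        A_ub = np.vstack([np.hstack([np.zeros((H, H)), tri]),    # soc_t >= 0
                          np.hstack([np.zeros((H, H)), -tri])])  # soc_t <= cap
        b_ub = np.concatenate([np.full(H, soc0), np.full(H, cap - soc0)])
        bounds = [(0, fc_max)] * H + [(-b_max, b_max)] * H
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=net,
                      bounds=bounds)
        print("FC kW:     ", res.x[:H].round(2))
        print("battery kW:", res.x[H:].round(2))
        # In the full MPC loop, only hour 0 is applied before the horizon
        # shifts and the problem is re-solved with fresh forecasts.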

  6. Modeling river total bed material load discharge using artificial intelligence approaches (based on conceptual inputs)

    Science.gov (United States)

    Roushangar, Kiyoumars; Mehrabani, Fatemeh Vojoudi; Shiri, Jalal

    2014-06-01

    This study presents Artificial Intelligence (AI)-based modeling of total bed material load aimed at improving the accuracy of the predictions of traditional models. Gene expression programming (GEP) and adaptive neuro-fuzzy inference system (ANFIS)-based models were developed and validated for the estimations. Sediment data from the Qotur River (northwestern Iran) were used for development and validation of the applied techniques. In order to assess the applied techniques against traditional models, stream-power-based and shear-stress-based physical models were also applied to the studied case. The obtained results reveal that the developed AI-based models, using a minimum number of dominant factors, give more accurate results than the other applied models. It was also revealed that k-fold testing is a practical but computationally costly technique for completely scanning the applied data and avoiding over-fitting, as sketched below.
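
    A minimal illustration of the k-fold testing scheme referred to above (a generic regressor stands in for the GEP/ANFIS models, and the data are synthetic):

        # k-fold testing: every record serves as test data exactly once.
        import numpy as np
        from sklearn.model_selection import KFold
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.metrics import r2_score

        rng = np.random.default_rng(0)
        X = rng.uniform(size=(120, 3))     # synthetic hydraulic predictors
        y = 5 * X[:, 0] ** 1.5 + rng.normal(0, 0.2, 120)   # synthetic load

        scores = []
        for train, test in KFold(n_splits=5, shuffle=True,
                                 random_state=1).split(X):
            model = RandomForestRegressor(n_estimators=200, random_state=1)
            model.fit(X[train], y[train])
            scores.append(r2_score(y[test], model.predict(X[test])))
        print("per-fold R2:", np.round(scores, 3))
        # Complete scanning of the data at the cost of refitting k times.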

  7. Bond graph model-based fault diagnosis of hybrid systems

    CERN Document Server

    Borutzky, Wolfgang

    2015-01-01

    This book presents a bond graph model-based approach to fault diagnosis in mechatronic systems appropriately represented by a hybrid model. The book begins by giving a survey of the fundamentals of fault diagnosis and failure prognosis, then recalls state-of-the-art developments referring to the latest publications, and goes on to discuss various bond graph representations of hybrid system models, equation formulation for switched systems, and simulation of their dynamic behavior. The structured text: • focuses on bond graph model-based fault detection and isolation in hybrid systems; • addresses isolation of multiple parametric faults in hybrid systems; • considers system mode identification; • provides a number of elaborated case studies that consider fault scenarios for switched power electronic systems commonly used in a variety of applications; and • indicates that bond graph modelling can also be used for failure prognosis. In order to facilitate the understanding of fault diagnosis and the presented...

  8. Modified hyperbolic sine model for titanium dioxide-based memristive thin films

    Science.gov (United States)

    Abu Bakar, Raudah; Syahirah Kamarozaman, Nur; Fazlida Hanim Abdullah, Wan; Herman, Sukreen Hana

    2018-03-01

    Since the emergence of the memristor as the newest fundamental circuit element, studies on memristor modeling have evolved. To date, the developed models have been based on the linear model, the linear ionic drift model with different window functions, the tunnelling barrier model, and hyperbolic-sine-function-based models. Although the hyperbolic-sine function model could predict the memristor's electrical properties, it was not well fitted to the experimental data. In order to improve the performance of the hyperbolic-sine function model, the state variable equation was modified. Adding a window function alone could not improve the fit; multiplying Yakopcic's state variable model with Chang's model, on the other hand, resulted in closer agreement with the TiO2 thin film experimental data, with a percentage error of approximately 2.15%.

  9. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing the structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for the execution of agent-based models, from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize the underlying hardware.

  10. Meta Analisis Model Pembelajaran Problem Based Learning dalam Meningkatkan Keterampilan Berpikir Kritis di Sekolah Dasar [A Meta-analysis of Problem-Based Learning Models in Increasing Critical Thinking Skills in Elementary Schools

    Directory of Open Access Journals (Sweden)

    Indri Anugraheni

    2018-01-01

    This study aims to analyze Problem-based Learning models intended to improve critical thinking skills in elementary school students. Problem-based learning models are learning processes in which students become open-minded, reflective, active, and critical through real-world context activities. In this study the researcher used a meta-analysis method. First, the researcher formulated the research problem, then proceeded to review the existing relevant research for analysis. Data were collected using a non-test technique, by browsing electronic journals through Google Scholar and studying documentation in the library. Seven articles were found through Google Scholar and only one was found in the library. Based on the analysis of the results, the problem-based learning model can improve students' thinking ability by as little as 2.87% and as much as 33.56%, with an average of 14.18%. [Translated from the Indonesian abstract:] This study aims to re-analyze the Problem Based Learning model for improving critical thinking skills in elementary schools. The Problem Based Learning model is a learning process in which students develop an open, reflective, active, and critical mindset through real-world context activities. In this study the researcher used a meta-analysis method. First, the researcher formulated the research problem, then proceeded to search the existing relevant research for analysis. Data were collected using a non-test technique, namely by browsing electronic journals through Google Scholar and by documentation study in the library. The search yielded 20 articles from journals and 3 from repositories. Based on the analysis, the Problem Based Learning model was able to improve students' thinking ability from a low of 2.87% to a high of 33.56%, with an average of 12.73%.

  11. Urban flood simulation based on the SWMM model

    Directory of Open Access Journals (Sweden)

    L. Jiang

    2015-05-01

    China is the nation with the fastest urbanization of the past decades, which has caused serious urban flooding. Flood forecasting is regarded as one of the important flood mitigation methods and is widely used for catchment flood mitigation, but not yet widely for urban flood mitigation. This paper employs the SWMM model, one of the widely used urban flood planning and management models, to simulate the urban flooding of Dongguan City in rapidly urbanized southern China. SWMM is first set up based on the DEM, a digital map and the underground pipeline network; parameters are then derived from the properties of the subcatchments and the storm sewer conduits, and a parameter sensitivity analysis shows the parameters' robustness. The simulated results show that with the 1-year return period precipitation the studied area will have no flooding, but for the 2-, 5-, 10- and 20-year return period precipitation the studied area will be inundated. The results show the SWMM model is promising for urban flood forecasting, but since it has no surface runoff routing, urban flooding could not be forecast precisely.

  12. Event-based soil loss models for construction sites

    Science.gov (United States)

    Trenouth, William R.; Gharabaghi, Bahram

    2015-05-01

    The elevated rates of soil erosion stemming from land clearing and grading activities during urban development can result in excessive amounts of eroded sediments entering waterways and causing harm to the biota living therein. However, construction site event-based soil loss simulations - required for reliable design of erosion and sediment controls - are one of the most uncertain types of hydrologic models. This study presents models with an improved degree of accuracy to advance the design of erosion and sediment controls for construction sites. The new models are developed using multiple linear regression (MLR) on event-based permutations of the Universal Soil Loss Equation (USLE) and artificial neural networks (ANN). The models were developed using surface runoff monitoring datasets obtained from three sites in Ontario - Greensborough, Cookstown, and Alcona - and datasets mined from the literature for three additional sites - Treynor, Iowa; Coshocton, Ohio; and Cordoba, Spain. The predictive MLR and ANN models can serve as both diagnostic and design tools for the effective sizing of erosion and sediment controls on active construction sites, and can be used for dynamic scenario forecasting when considering rapidly changing land use conditions during the various phases of construction.
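
    Because the USLE is multiplicative (A = R K LS C P), an event-based MLR permutation of it becomes an ordinary linear regression in log space, which is presumably the form such a fit takes. A synthetic sketch (the monitored-site data are not reproduced):

        # Log-space multiple linear regression on USLE-style event factors.
        import numpy as np

        rng = np.random.default_rng(2)
        n = 60
        R = rng.uniform(10, 100, n)        # event rainfall erosivity
        K = rng.uniform(0.1, 0.5, n)       # soil erodibility
        LS = rng.uniform(0.5, 3.0, n)      # slope length-steepness
        A = 0.8 * R**1.1 * K**0.9 * LS**1.2 * rng.lognormal(0.0, 0.1, n)

        X = np.column_stack([np.ones(n), np.log(R), np.log(K), np.log(LS)])
        coef, *_ = np.linalg.lstsq(X, np.log(A), rcond=None)
        print("intercept and exponents:", coef.round(2))
        # Recovers roughly [log 0.8, 1.1, 0.9, 1.2]; the fitted exponents
        # play the role of the regression coefficients in the event model.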

  13. Flow Formulation-based Model for the Curriculum-based Course Timetabling Problem

    DEFF Research Database (Denmark)

    Bagger, Niels-Christian Fink; Kristiansen, Simon; Sørensen, Matias

    2015-01-01

    In this work we will present a new mixed integer programming formulation for the curriculum-based course timetabling problem. We show that the model contains an underlying network model by dividing the problem into two models and then connecting the two models back into one model using a maximum flow problem. This decreases the number of integer variables significantly and improves the performance compared to the basic formulation. It also shows competitiveness with other approaches based on mixed integer programming from the literature and improves the currently best known lower bound on one data instance in the benchmark data set from the second international timetabling competition.

  14. Model-based sensor-augmented pump therapy.

    Science.gov (United States)

    Grosman, Benyamin; Voskanyan, Gayane; Loutseiko, Mikhail; Roy, Anirban; Mehta, Aloke; Kurtz, Natalie; Parikh, Neha; Kaufman, Francine R; Mastrototaro, John J; Keenan, Barry

    2013-03-01

    In insulin pump therapy, optimization of bolus and basal insulin dose settings is a challenge. We introduce a new algorithm that provides individualized basal rates and new carbohydrate ratio and correction factor recommendations. The algorithm utilizes a mathematical model of blood glucose (BG) as a function of carbohydrate intake and delivered insulin; the model includes fixed parameters and several individualized parameters derived from the subject's sensor BG measurements and insulin delivery data downloaded from the patient's pump. Performance of the new algorithm was assessed using n = 4 diabetic canine experiments over a 32 h duration. In addition, 10 in silico adults from the University of Virginia/Padova type 1 diabetes mellitus metabolic simulator were tested. The percentage of time in the glucose range 80-180 mg/dl was 86%, 85%, 61%, and 30% using model-based therapy and [78%, 100%] (brackets denote multiple experiments conducted under the same therapy and animal model), [75%, 67%], 47%, and 86% for the control experiments for dogs 1 to 4, respectively. The BG measurements obtained in the simulation using our individualized algorithm fell within a 61-231 mg/dl min-max envelope, whereas use of the simulator's default treatment resulted in BG measurements within a 90-210 mg/dl min-max envelope. The study results demonstrate the potential of this method, which could serve as a platform for improving, facilitating, and standardizing insulin pump therapy based on a single download of data. © 2013 Diabetes Technology Society.
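
    The two settings the algorithm recommends enter therapy through the standard pump bolus arithmetic, sketched below (the generic textbook formula, not the authors' glucose model; all numbers are illustrative):

        # Standard bolus calculation from carbohydrate ratio and correction
        # factor -- the parameters the algorithm individualizes.
        def bolus_units(carbs_g, bg_mgdl, target_mgdl=110.0,
                        carb_ratio=10.0,   # g of carbohydrate per unit
                        corr_factor=40.0,  # mg/dl BG drop per unit
                        iob_units=0.0):    # insulin still active on board
            meal = carbs_g / carb_ratio
            correction = max(bg_mgdl - target_mgdl, 0.0) / corr_factor
            return max(meal + correction - iob_units, 0.0)

        print(f"{bolus_units(60, 190):.1f} U")  # 60 g meal at BG 190 -> 8.0 U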

  15. Feedback loops and temporal misalignment in component-based hydrologic modeling

    Science.gov (United States)

    Elag, Mostafa M.; Goodall, Jonathan L.; Castronova, Anthony M.

    2011-12-01

    In component-based modeling, a complex system is represented as a series of loosely integrated components with defined interfaces and data exchanges that allow the components to be coupled together through shared boundary conditions. Although the component-based paradigm is commonly used in software engineering, it has only recently been applied for modeling hydrologic and earth systems. As a result, research is needed to test and verify the applicability of the approach for modeling hydrologic systems. The objective of this work was therefore to investigate two aspects of using component-based software architecture for hydrologic modeling: (1) simulation of feedback loops between components that share a boundary condition and (2) data transfers between temporally misaligned model components. We investigated these topics using a simple case study where diffusion of mass is modeled across a water-sediment interface. We simulated the multimedia system using two model components, one for the water and one for the sediment, coupled using the Open Modeling Interface (OpenMI) standard. The results were compared with a more conventional numerical approach for solving the system where the domain is represented by a single multidimensional array. Results showed that the component-based approach was able to produce the same results obtained with the more conventional numerical approach. When the two components were temporally misaligned, we explored the use of different interpolation schemes to minimize mass balance error within the coupled system. The outcome of this work provides evidence that component-based modeling can be used to simulate complicated feedback loops between systems and guidance as to how different interpolation schemes minimize mass balance error introduced when components are temporally misaligned.
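
    The coupling pattern examined here is easy to reproduce in miniature: two diffusion components share an interface value, and the faster-stepping component interpolates the slower one's boundary history in time. The sketch below is plain Python rather than the OpenMI API, with an ad hoc averaging closure at the interface:

        # Two coupled 1D diffusion components with temporally misaligned steps.
        import numpy as np

        class Column:
            def __init__(self, n, D, dx, dt, c0):
                self.c = np.full(n, c0)
                self.D, self.dx, self.dt, self.t = D, dx, dt, 0.0
                self.history = [(0.0, c0)]     # (time, interface value)

            def step(self, boundary_value):
                c = self.c.copy()
                c[0] = boundary_value          # shared interface cell
                lap = np.zeros_like(c)
                lap[1:-1] = (c[2:] - 2 * c[1:-1] + c[:-2]) / self.dx**2
                self.c = c + self.D * self.dt * lap
                self.t += self.dt
                self.history.append((self.t, self.c[0]))

            def interface_at(self, t):
                """Linear interpolation in time over the stored boundary."""
                ts, vs = zip(*self.history)
                return np.interp(t, ts, vs)

        water = Column(20, D=1e-3, dx=0.1, dt=0.5, c0=1.0)     # fast side
        sediment = Column(20, D=1e-4, dx=0.1, dt=2.0, c0=0.0)  # slow side
        for _ in range(10):                    # 20 s of coupled simulation
            for _ in range(4):                 # water substeps per exchange
                shared = 0.5 * (water.c[0] + sediment.interface_at(water.t))
                water.step(shared)
            sediment.step(0.5 * (sediment.c[0] + water.interface_at(sediment.t)))
        print(f"interface: water {water.c[0]:.3f}, sediment {sediment.c[0]:.3f}")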

  16. Assessing the external validity of model-based estimates of the incidence of heart attack in England: a modelling study

    Directory of Open Access Journals (Sweden)

    Peter Scarborough

    2016-11-01

    Abstract Background The DisMod II model is designed to estimate epidemiological parameters for diseases where measured data are incomplete, and it has been used to provide estimates of disease incidence for the Global Burden of Disease study. We assessed the external validity of the DisMod II model by comparing modelled estimates of the incidence of first acute myocardial infarction (AMI) in England in 2010 with estimates derived from a linked dataset of hospital records and death certificates. Methods Inputs for DisMod II were prevalence rates of ever having had an AMI taken from a population health survey, and total mortality rates and AMI mortality rates taken from death certificates. By definition, remission rates were zero. We estimated first AMI incidence in an external dataset from England in 2010 using a linked dataset including all hospital admissions and death certificates since 1998. 95% confidence intervals were derived around the estimates from the external dataset and the DisMod II estimates, based on sampling variance and reported uncertainty in prevalence estimates respectively. Results Estimates of the incidence rate for the whole population were higher in the DisMod II results than in the external dataset (+54% for men and +26% for women). Age-specific results showed that DisMod II over-estimated incidence for all but the oldest age groups. Confidence intervals for the DisMod II and external dataset estimates did not overlap for most age groups. Conclusion By comparison with AMI incidence rates in England, DisMod II did not achieve external validity for age-specific incidence rates, but it did provide global estimates of incidence of similar magnitude to the measured estimates. The model should be used with caution when estimating age-specific incidence rates.
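
    The consistency logic behind DisMod-style estimation can be illustrated with a toy illness-death model: search for the incidence rate that, combined with the observed mortality rates, reproduces the observed prevalence. A sketch with constant rates (DisMod II solves the age-dependent version; all rates below are invented):

        # Invert prevalence for incidence in a two-state illness-death model.
        def prevalence_at(age, incidence, mort_well=0.01, mort_case=0.03,
                          dt=0.1):
            s, c = 1.0, 0.0                # susceptible, cases (no remission)
            for _ in range(int(age / dt)):
                new = incidence * s * dt
                s += -new - mort_well * s * dt
                c += new - mort_case * c * dt
            return c / (s + c)

        def incidence_for(target_prev, age):
            lo, hi = 0.0, 1.0              # bisection on the incidence rate
            for _ in range(60):
                mid = 0.5 * (lo + hi)
                if prevalence_at(age, mid) < target_prev:
                    lo = mid
                else:
                    hi = mid
            return 0.5 * (lo + hi)

        print(f"incidence for 10% prevalence at age 60: "
              f"{incidence_for(0.10, 60):.4f}/yr")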

  17. Model-based reasoning technology for the power industry

    International Nuclear Information System (INIS)

    Touchton, R.A.; Subramanyan, N.S.; Naser, J.A.

    1991-01-01

    This paper reports on model-based reasoning, which refers to an expert system implementation methodology that uses a model of the system being reasoned about. Model-based representation and reasoning techniques offer many advantages and are highly suitable for domains where the individual components, their interconnections, and their behavior are well known. Technology Applications, Inc. (TAI), under contract to the Electric Power Research Institute (EPRI), investigated the use of model-based reasoning in the power industry, including the nuclear power industry. During this project, a model-based monitoring and diagnostic tool, called ProSys, was developed, and an alarm prioritization system was developed as a demonstration prototype.

  18. The Cognitive Processes Underlying Event-Based Prospective Memory In School Age Children and Young Adults: A Formal Model-Based Study

    OpenAIRE

    Smith, Rebekah E.; Bayen, Ute Johanna; Martin, Claudia

    2010-01-01

    Fifty 7-year-olds (29 female), 53 10-year-olds (29 female), and 36 young adults (19 female) performed a computerized event-based prospective memory task. All three groups differed significantly in prospective memory performance, with adults showing the best performance and 7-year-olds the poorest. We used a formal multinomial process tree model of event-based prospective memory to decompose age differences in cognitive processes that jointly contribute to prospective memory perfor...

  19. Fatigue crack initiation in nickel-based superalloys studied by microstructure-based FE modeling and scanning electron microscopy

    Directory of Open Access Journals (Sweden)

    Fried M.

    2014-01-01

    Full Text Available In this work stage I crack initiation in polycrystalline nickel-based superalloys is investigated by analyzing anisotropic mechanical properties, local stress concentrations and plastic deformation on the microstructural length scale. The grain structure in the gauge section of fatigue specimens was characterized by EBSD. Based on the measured data, a microstructure-based FE model could be established to simulate the strain and stress distribution in the specimens during the first loading cycle of a fatigue test. The results were in fairly good agreement with experimentally measured local strains. Furthermore, the onset of plastic deformation was predicted by identifying shear stress maxima in the microstructure, presumably leading to activation of slip systems. Measurement of plastic deformation and observation of slip traces in the respective regions of the microstructure confirmed the predicted slip activity. The close relation between micro-plasticity, formation of slip traces and stage I crack initiation was demonstrated by SEM surface analyses of fatigued specimens and an in-situ fatigue test in a large chamber SEM.

  20. AlgiMatrix™ based 3D cell culture system as an in-vitro tumor model for anticancer studies.

    Directory of Open Access Journals (Sweden)

    Chandraiah Godugu

    Full Text Available Three-dimensional (3D in-vitro cultures are recognized for recapitulating the physiological microenvironment and exhibiting high concordance with in-vivo conditions. Taking the advantages of 3D culture, we have developed the in-vitro tumor model for anticancer drug screening.Cancer cells grown in 6 and 96 well AlgiMatrix™ scaffolds resulted in the formation of multicellular spheroids in the size range of 100-300 µm. Spheroids were grown in two weeks in cultures without compromising the growth characteristics. Different marketed anticancer drugs were screened by incubating them for 24 h at 7, 9 and 11 days in 3D cultures and cytotoxicity was measured by AlamarBlue® assay. Effectiveness of anticancer drug treatments were measured based on spheroid number and size distribution. Evaluation of apoptotic and anti-apoptotic markers was done by immunohistochemistry and RT-PCR. The 3D results were compared with the conventional 2D monolayer cultures. Cellular uptake studies for drug (Doxorubicin and nanoparticle (NLC were done using spheroids.IC(50 values for anticancer drugs were significantly higher in AlgiMatrix™ systems compared to 2D culture models. The cleaved caspase-3 expression was significantly decreased (2.09 and 2.47 folds respectively for 5-Fluorouracil and Camptothecin in H460 spheroid cultures compared to 2D culture system. The cytotoxicity, spheroid size distribution, immunohistochemistry, RT-PCR and nanoparticle penetration data suggested that in vitro tumor models show higher resistance to anticancer drugs and supporting the fact that 3D culture is a better model for the cytotoxic evaluation of anticancer drugs in vitro.The results from our studies are useful to develop a high throughput in vitro tumor model to study the effect of various anticancer agents and various molecular pathways affected by the anticancer drugs and formulations.

  1. Structure-Based Turbulence Model

    National Research Council Canada - National Science Library

    Reynolds, W

    2000-01-01

    ... Maire carried out this work as part of his PhD research. During the award period we began to explore ways to simplify the structure-based modeling so that it could be used in repetitive engineering calculations...

  2. Agent-Based Modeling of Day-Ahead Real Time Pricing in a Pool-Based Electricity Market

    Directory of Open Access Journals (Sweden)

    Sh. Yousefi

    2011-09-01

    In this paper, an agent-based structure of the electricity retail market is presented, based on which day-ahead (DA) energy procurement for customers is modeled. Here, we focus on the operation of a single Retail Energy Provider (REP) agent who purchases energy from the DA pool-based wholesale market and offers DA real-time tariffs to a group of its customers. As a model of customer response to the offered real-time prices, an hourly acceptance function is proposed in order to represent the hourly changes in the customers' effective demand according to the prices. Q-learning (QL) is applied to day-ahead real-time pricing, enabling the REP agent to discover which price yields the most benefit through a trial-and-error search. Numerical studies are presented based on New England day-ahead market data, comparing the results of RTP based on the QL approach with those of genetic-based pricing.
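
    The pricing loop can be sketched as a stateless (bandit-style) simplification of the Q-learning the paper applies: the REP repeatedly posts a tariff, observes profit through a toy acceptance function, and updates its value estimates. All market numbers below are synthetic:

        # Epsilon-greedy Q-learning over a discrete tariff set (toy model).
        import random
        random.seed(0)

        prices = [0.10, 0.14, 0.18, 0.22]      # candidate tariffs, $/kWh
        wholesale = 0.09                       # procurement cost, $/kWh
        Q = {p: 0.0 for p in prices}
        alpha, eps = 0.1, 0.2

        def demand_kwh(price):                 # toy hourly acceptance function
            return max(100.0 * (1.8 - 6.0 * price) + random.gauss(0, 3), 0.0)

        for _ in range(5000):
            p = (random.choice(prices) if random.random() < eps
                 else max(Q, key=Q.get))       # explore vs. exploit
            profit = (p - wholesale) * demand_kwh(p)
            Q[p] += alpha * (profit - Q[p])    # incremental value update
        print({p: round(q, 1) for p, q in Q.items()})
        # The learned Q-values identify the tariff with the best expected
        # profit through trial and error, without an explicit demand model.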

  3. Fog Simulations Based on Multi-Model System: A Feasibility Study

    Science.gov (United States)

    Shi, Chune; Wang, Lei; Zhang, Hao; Zhang, Su; Deng, Xueliang; Li, Yaosun; Qiu, Mingyan

    2012-05-01

    Accurate forecasts of fog and visibility are very important to air and highway traffic, and are still a big challenge. A 1D fog model (PAFOG) is coupled to MM5 by obtaining the initial and boundary conditions (IC/BC) and some other necessary input parameters from MM5; thus, PAFOG can be run for any area of interest. MM5 itself, on the other hand, can be used to simulate fog events over a large domain. This paper presents evaluations of the fog predictability of these two systems for December 2006 and December 2007, with nine regional fog events observed in a field experiment, as well as over a large domain in eastern China. Among the simulations of the nine fog events by the two systems, two cases were investigated in detail. Daily results for ground-level meteorology were validated against the routine observations of the CMA observational network. Daily fog occurrences for the two study periods were validated in Nanjing. The general performance of the two models for the nine fog cases is presented by comparison with routine and field observational data, and the results of MM5 and PAFOG for two typical fog cases are verified in detail against field observations. The verifications demonstrated that all methods tended to overestimate fog occurrence, especially for near-fog cases. In terms of TS/ETS, the LWC-only threshold with MM5 showed the best performance, while PAFOG showed the worst. MM5 performed better for advection-radiation fog than for radiation fog, and PAFOG could be an alternative tool for forecasting radiation fogs. PAFOG did show advantages over MM5 in the fog dissipation time. The performance of PAFOG depended highly on the quality of the MM5 output. The sensitivity runs of PAFOG with different IC/BC showed the capability of using MM5 output to run the 1D model and the high sensitivity of PAFOG to cloud cover. Future work should intensify the study of how to improve the quality of input data (e.g. cloud cover, advection, large-scale subsidence) for the 1D...

  4. A role based coordination model in agent systems

    Institute of Scientific and Technical Information of China (English)

    ZHANG Ya-ying; YOU Jin-yuan

    2005-01-01

    Coordination technology addresses the construction of open, flexible systems from active and independent software agents in concurrent and distributed systems. In most open distributed applications, multiple agents need interaction and communication to achieve their overall goal. Coordination technologies for the Internet are typically concerned with enabling interaction among agents and helping them cooperate with each other. At the same time, access control should also be considered to constrain interaction so as to make it harmless. Access control should be regarded as the security counterpart of coordination. At present, the combination of coordination and access control remains an open problem. Thus, we propose a role-based coordination model with policy enforcement in agent application systems. In this model, coordination is combined with access control so as to fully characterize the interactions in agent systems. A set of agents interacting with each other for a common global system task constitutes a coordination group. Role-based access control is applied in this model to prevent unauthorized accesses. Coordination policy is enforced in a distributed manner so that the model can be applied to open distributed systems such as the Internet. An Internet online auction system is presented as a case study to illustrate the proposed coordination model, and finally a performance analysis of the model is introduced.

  5. Modeling the milling tool wear by using an evolutionary SVM-based model from milling runs experimental data

    Science.gov (United States)

    Nieto, Paulino José García; García-Gonzalo, Esperanza; Vilán, José Antonio Vilán; Robleda, Abraham Segade

    2015-12-01

    The main aim of this research work is to build a new practical hybrid regression model to predict milling tool wear in a regular cut as well as the entry cut and exit cut of a milling tool. The model is based on Particle Swarm Optimization (PSO) in combination with support vector machines (SVMs). This optimization mechanism handles the kernel parameter setting in the SVM training procedure, which significantly influences the regression accuracy. Bearing this in mind, a PSO-SVM-based model grounded in statistical learning theory was successfully used here to predict the milling tool flank wear (output variable) as a function of the following input variables: the time duration of the experiment, depth of cut, feed, type of material, etc. The experimental dataset represents runs on a milling machine under various operating conditions; data sampled by three different types of sensors (acoustic emission sensor, vibration sensor and current sensor) were acquired at several positions. A second aim is to determine the factors with the greatest bearing on milling tool flank wear with a view to proposing improvements to the milling machine. Firstly, this hybrid PSO-SVM-based regression model captures the main insight of statistical learning theory in order to obtain a good prediction of the dependence between the flank wear (output variable) and the input variables (time, depth of cut, feed, etc.). Indeed, regression with optimal hyperparameters was performed and a determination coefficient of 0.95 was obtained. The agreement of this model with the experimental data confirmed its good performance. Secondly, the main advantages of this PSO-SVM-based model are its capacity to produce a simple, easy-to-interpret model, its ability to estimate the contributions of the input variables, and its computational efficiency. Finally, the main conclusions of this study are presented.
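
    The hybrid's structure is straightforward to sketch: PSO particles move through the SVR hyperparameter space and their fitness is a cross-validated score. The code below uses synthetic stand-ins for the sensor features and a hand-rolled PSO; the parameter ranges and swarm settings are assumptions, not the paper's:

        # PSO search over (log10 C, log10 gamma) of an SVR; CV R2 as fitness.
        import numpy as np
        from sklearn.svm import SVR
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(3)
        X = rng.uniform(size=(150, 4))           # time, depth of cut, feed, ...
        y = X @ np.array([0.5, 1.2, 0.8, 0.1]) + 0.05 * rng.normal(size=150)

        def fitness(log_c, log_g):
            svr = SVR(C=10.0 ** log_c, gamma=10.0 ** log_g)
            return cross_val_score(svr, X, y, cv=3, scoring="r2").mean()

        lo, hi = np.array([-1.0, -3.0]), np.array([3.0, 1.0])
        pos = rng.uniform(lo, hi, size=(12, 2))  # 12 particles
        vel = np.zeros_like(pos)
        pbest = pos.copy()
        pbest_f = np.array([fitness(*p) for p in pos])
        for _ in range(15):                      # PSO iterations
            gbest = pbest[pbest_f.argmax()]
            r1, r2 = rng.uniform(size=(2, 12, 1))
            vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
            pos = np.clip(pos + vel, lo, hi)
            f = np.array([fitness(*p) for p in pos])
            improved = f > pbest_f
            pbest[improved], pbest_f[improved] = pos[improved], f[improved]
        print("best (log10 C, log10 gamma):", pbest[pbest_f.argmax()].round(2),
              "CV R2:", round(pbest_f.max(), 3))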

  6. Technical Note: Approximate Bayesian parameterization of a process-based tropical forest model

    Science.gov (United States)

    Hartig, F.; Dislich, C.; Wiegand, T.; Huth, A.

    2014-02-01

    ...can be successfully applied to process-based models of high complexity. The methodology is particularly suitable for heterogeneous and complex data structures and can easily be adjusted to other model types, including most stochastic population and individual-based models. Our study therefore provides a blueprint for a fairly general approach to parameter estimation of stochastic process-based models.
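
    A rejection-sampling loop, the simplest member of the approximate Bayesian computation family this note builds on, fits in a dozen lines: draw parameters from the prior, run the stochastic model, and keep draws whose summary statistic lands near the observation. The toy model below stands in for the forest simulator:

        # Rejection ABC on a toy stochastic population model.
        import numpy as np
        rng = np.random.default_rng(4)

        def simulate(mortality):           # toy stochastic "stand density"
            pop = 100.0
            for _ in range(50):
                pop += rng.normal(5.0, 1.0) - mortality * pop
            return pop

        observed = 62.0                    # "field" summary statistic
        accepted = [theta for theta in rng.uniform(0.01, 0.2, 20000)
                    if abs(simulate(theta) - observed) < 2.0]   # epsilon = 2
        print(f"posterior mean mortality ~ {np.mean(accepted):.3f} "
              f"({len(accepted)}/20000 accepted)")
        # Shrinking epsilon sharpens the posterior at the price of more
        # (expensive) model runs -- the trade-off ABC manages.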

  7. Model-based Prognostics with Concurrent Damage Progression Processes

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based prognostics approaches rely on physics-based models that describe the behavior of systems and their components. These models must account for the several...

  8. Probabilistic Model-based Background Subtraction

    DEFF Research Database (Denmark)

    Krüger, Volker; Anderson, Jakob; Prehn, Thomas

    2005-01-01

    ...is the correlation between pixels. In this paper we introduce a model-based background subtraction approach which facilitates prior knowledge of pixel correlations for clearer and better results. Model knowledge is learned from good training video data; the data is stored for fast access in a hierarchical...

  9. Evaluation of Artificial Intelligence Based Models for Chemical Biodegradability Prediction

    Directory of Open Access Journals (Sweden)

    Aleksandar Sabljic

    2004-12-01

    This study presents a review of biodegradability modeling efforts, including a detailed assessment of two models developed using an artificial-intelligence-based methodology. Validation results for these models using an independent, quality-reviewed database demonstrate that the models perform well when compared to another commonly used biodegradability model against the same data. The ability of models induced by an artificial intelligence methodology to accommodate complex interactions in detailed systems, and the reliability of the approach demonstrated by this study, indicate that the methodology may have application in broadening the scope of biodegradability models. Given adequate data on the biodegradability of chemicals under environmental conditions, this may allow for the development of future models that include, for example, surface interface impacts on biodegradability.

  10. Coast-down model based on rated parameters of reactor coolant pump

    International Nuclear Information System (INIS)

    Jiang Maohua; Zou Zhichao; Wang Pengfei; Ruan Xiaodong

    2014-01-01

    For a sudden loss of power in a reactor coolant pump (RCP), a calculation model of rotor speed and flow characteristics based on rated parameters was studied. The derived model was verified by comparison with the power-off experimental data of the 100D RCP. The results indicate that it can be used in preliminary design calculations and verification analysis. A design criterion for RCPs was then described based on the calculation model, and the moment of inertia of the AP1000 RCP was verified against this criterion. (authors)
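
    For the common idealization in which the hydraulic resisting torque scales with the square of the speed, the rated parameters fix the entire coast-down curve: integrating I dw/dt = -T0 (w/w0)^2 gives N(t) = N0 / (1 + t/t_h) with t_h = I w0 / T0, and flow follows speed by the affinity laws. A sketch with invented rated values (not the 100D or AP1000 data):

        # Coast-down of speed and flow from rated parameters (illustrative).
        import numpy as np

        N0 = 1500.0                  # rated speed, rpm
        inertia = 5000.0             # rotating moment of inertia, kg m^2
        P0 = 6.0e6                   # rated shaft power, W

        omega0 = N0 * 2.0 * np.pi / 60.0
        T0 = P0 / omega0             # rated torque, N m
        t_h = inertia * omega0 / T0  # time to reach half speed, s

        for t in (0.0, 5.0, 10.0, 30.0):
            frac = 1.0 / (1.0 + t / t_h)
            print(f"t={t:5.1f} s  speed={N0 * frac:7.1f} rpm"
                  f"  flow={100.0 * frac:5.1f} % of rated")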

  11. Discrete Event System Based Pyroprocessing Modeling and Simulation: Oxide Reduction

    International Nuclear Information System (INIS)

    Lee, H. J.; Ko, W. I.; Choi, S. Y.; Kim, S. K.; Hur, J. M.; Choi, E. Y.; Im, H. S.; Park, K. I.; Kim, I. T.

    2014-01-01

    Dynamic changes arising from batch operation cannot be predicted with an equilibrium material flow. This study therefore began to build a dynamic material balance model based on the previously developed pyroprocessing flowsheet. As mid- and long-term research, an integrated pyroprocessing simulator is being developed at the Korea Atomic Energy Research Institute (KAERI) to support reviews of technical feasibility, safeguards assessment, conceptual facility design, and economic feasibility evaluation. The most fundamental element of such a simulator is the dynamic material flow framework, so this study focused on modeling pyroprocessing operations to implement a dynamic material flow. As a case study, oxide reduction was investigated in terms of dynamic material flow. Discrete event system (DES) based modeling was applied to build the pyroprocessing operation model, and a dynamic material flow, the basic framework for an integrated pyroprocessing, was successfully implemented through ExtendSim's internal database and item blocks. Complex operation logic was verified, using the oxide reduction process as an example, in terms of dynamic material flow. Compared to an equilibrium material flow, a model-based dynamic material flow provides such detailed information that a careful analysis of every batch is necessary to confirm the dynamic material balance results. With the default oxide reduction scenario, the batch mass balance was verified in comparison with a one-year equilibrium mass balance. This study is still in progress, with the mid- and long-term goal of developing a multi-purpose pyroprocessing simulator able to cope with safeguards assessment, economic feasibility, technical evaluation, conceptual design, and support of licensing for a future pyroprocessing facility.

  12. Model-based optimization biofilm based systems performing autotrophic nitrogen removal using the comprehensive NDHA model

    DEFF Research Database (Denmark)

    Valverde Pérez, Borja; Ma, Yunjie; Morset, Martin

    Completely autotrophic nitrogen removal (CANR) can be obtained in single-stage biofilm-based bioreactors. However, their environmental footprint is compromised by elevated N2O emissions. We developed a novel, spatially explicit biochemical process model of biofilm-based CANR systems that predicts...

  13. Towards a model-based cognitive neuroscience of stopping - a neuroimaging perspective.

    Science.gov (United States)

    Sebastian, Alexandra; Forstmann, Birte U; Matzke, Dora

    2018-07-01

    Our understanding of the neural correlates of response inhibition has greatly advanced over the last decade. Nevertheless, the specific function of regions within this stopping network remains controversial, and the traditional neuroimaging approach cannot capture many of the processes affecting stopping performance. Despite the shortcomings of the traditional neuroimaging approach and great progress in mathematical and computational models of stopping, model-based cognitive neuroscience approaches are largely lacking in human neuroimaging studies. To foster model-based approaches and ultimately gain a deeper understanding of the neural signature of stopping, we outline the most prominent models of response inhibition and recent advances in the field. We highlight how a model-based approach in clinical samples has improved our understanding of altered cognitive functions in these disorders. Moreover, we show how linking evidence-accumulation models and neuroimaging data improves the identification of neural pathways involved in the stopping process and helps to delineate these from neural networks of related but distinct functions. In conclusion, adopting a model-based approach is indispensable for identifying the actual neural processes underlying stopping. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. A temperature dependent slip factor based thermal model for friction

    Indian Academy of Sciences (India)

    This paper proposes a new slip factor based three-dimensional thermal model to predict the temperature distribution during friction stir welding of 304L stainless steel plates. The proposed model employs temperature and radius dependent heat source to study the thermal cycle, temperature distribution, power required, the ...

  15. Particle-based model for skiing traffic.

    Science.gov (United States)

    Holleczek, Thomas; Tröster, Gerhard

    2012-05-01

    We develop and investigate a particle-based model for ski slope traffic. Skiers are modeled as particles with a mass that are exposed to social and physical forces, which define the riding behavior of skiers during their descents on ski slopes. We also report position and speed data of 21 skiers recorded with GPS-equipped cell phones on two ski slopes. A comparison of these data with the trajectories resulting from computer simulations of our model shows a good correspondence. A study of the relationship among the density, speed, and flow of skiers reveals that congestion does not occur even with arrival rates of skiers exceeding the maximum ski lift capacity. In a sensitivity analysis, we identify the kinetic friction coefficient of skis on snow, the skier mass, the range of repelling social forces, and the arrival rate of skiers as the crucial parameters influencing the simulation results. Our model allows for the prediction of speed zones and skier densities on ski slopes, which is important in the prevention of skiing accidents.
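
    The force balance the model assigns to each skier-particle is compact enough to sketch directly: a downslope gravity component, kinetic friction and air drag opposing motion, and an exponentially decaying social repulsion from other skiers. Parameter values below are assumptions of the kind listed in the sensitivity analysis, not the paper's calibrated ones:

        # One skier-particle under gravity, friction, drag and social forces.
        import numpy as np

        g, m = 9.81, 80.0                  # gravity, skier mass (kg)
        slope = np.radians(15.0)           # slope inclination
        mu = 0.05                          # ski-snow kinetic friction
        k_drag = 0.6                       # lumped air drag, kg/m
        A_soc, r_soc = 200.0, 3.0          # social force strength (N), range (m)

        def acceleration(x, v, others):
            f = m * g * np.sin(slope) * np.array([0.0, -1.0])   # downslope
            speed = np.linalg.norm(v)
            if speed > 1e-9:               # friction and drag oppose motion
                f -= (mu * m * g * np.cos(slope) + k_drag * speed**2) * v / speed
            for xo in others:              # repulsion from nearby skiers
                d = x - xo
                dist = np.linalg.norm(d)
                if 1e-9 < dist < 5.0 * r_soc:
                    f += A_soc * np.exp(-dist / r_soc) * d / dist
            return f / m

        # explicit Euler: one skier swerves around a slower skier below
        x, v, dt = np.array([0.5, 0.0]), np.array([0.0, -8.0]), 0.02
        for _ in range(200):
            v += acceleration(x, v, [np.array([0.0, -10.0])]) * dt
            x += v * dt
        print("position after 4 s:", x.round(2))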

  16. Probabilistic reasoning for assembly-based 3D modeling

    KAUST Repository

    Chaudhuri, Siddhartha

    2011-01-01

    Assembly-based modeling is a promising approach to broadening the accessibility of 3D modeling. In assembly-based modeling, new models are assembled from shape components extracted from a database. A key challenge in assembly-based modeling is the identification of relevant components to be presented to the user. In this paper, we introduce a probabilistic reasoning approach to this problem. Given a repository of shapes, our approach learns a probabilistic graphical model that encodes semantic and geometric relationships among shape components. The probabilistic model is used to present components that are semantically and stylistically compatible with the 3D model that is being assembled. Our experiments indicate that the probabilistic model increases the relevance of presented components. © 2011 ACM.

  17. Interactive physically-based structural modeling of hydrocarbon systems

    International Nuclear Information System (INIS)

    Bosson, Mael; Grudinin, Sergei; Bouju, Xavier; Redon, Stephane

    2012-01-01

    Hydrocarbon systems have been intensively studied via numerical methods, including electronic structure computations, molecular dynamics and Monte Carlo simulations. Typically, these methods require an initial structural model (atomic positions and types, topology, etc.) that may be produced using scripts and/or modeling tools. For many systems, however, these building methods may be ineffective, as the user may have to specify the positions of numerous atoms while maintaining structural plausibility. In this paper, we present an interactive physically-based modeling tool to construct structural models of hydrocarbon systems. As the user edits the geometry of the system, atomic positions are also influenced by the Brenner potential, a well-known bond-order reactive potential. In order to be able to interactively edit systems containing numerous atoms, we introduce a new adaptive simulation algorithm, as well as a novel algorithm to incrementally update the forces and the total potential energy based on the list of updated relative atomic positions. The computational cost of the adaptive simulation algorithm depends on user-defined error thresholds, and our potential update algorithm scales linearly with the number of updated bonds. This allows us to enable efficient physically-based editing, since the computational cost is decoupled from the number of atoms in the system. We show that our approach may be used to effectively build realistic models of hydrocarbon structures that would be difficult or impossible to produce using other tools.

  18. Multi-Domain Modeling Based on Modelica

    Directory of Open Access Journals (Sweden)

    Liu Jun

    2016-01-01

    With the application of simulation technology to large-scale, multi-field problems, multi-domain unified modeling has become an effective way to solve them. This paper introduces several basic methods and advantages of multidisciplinary modeling, and focuses on simulation based on the Modelica language. Modelica/MWorks is a newly developed simulation platform featuring an object-oriented, acausal (non-causal) language for modeling large, multi-domain systems, which makes models easier to grasp, develop and maintain. This article demonstrates a single-degree-of-freedom mechanical vibration system built in MWorks using Modelica's connection mechanism. This multi-domain modeling method is simple and feasible, offers high reusability, stays closer to the physical system, and has many other advantages.
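
    For comparison with the acausal Modelica formulation, the same single-degree-of-freedom example reduces to the ODE m x'' + c x' + k x = 0, which can be integrated directly (assumed parameters; the MWorks source is not reproduced):

        # Free vibration of a mass-spring-damper system.
        import numpy as np
        from scipy.integrate import solve_ivp

        m, c, k = 1.0, 0.4, 25.0           # mass, damping, stiffness

        def rhs(t, y):
            x, v = y
            return [v, -(c * v + k * x) / m]

        sol = solve_ivp(rhs, (0.0, 10.0), [0.1, 0.0], max_step=0.01)
        f_d = np.sqrt(k / m - (c / (2 * m)) ** 2) / (2.0 * np.pi)
        print(f"x(10) = {sol.y[0, -1]:+.5f} m, damped frequency ~ {f_d:.2f} Hz")
        # In Modelica the same system is declared from Mass, Spring and Damper
        # components joined by connect() equations; the tool derives this ODE.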

  19. Web-based reactive transport modeling using PFLOTRAN

    Science.gov (United States)

    Zhou, H.; Karra, S.; Lichtner, P. C.; Versteeg, R.; Zhang, Y.

    2017-12-01

    Actionable understanding of system behavior in the subsurface is required for a wide spectrum of societal and engineering needs by commercial firms, government entities and academia. These needs include, for example, water resource management, precision agriculture, contaminant remediation, unconventional energy production, CO2 sequestration monitoring, and climate studies. Such understanding requires the ability to numerically model various coupled processes that occur across different temporal and spatial scales as well as multiple physical domains (reservoir-overburden, surface-subsurface, groundwater-surface water, saturated-unsaturated zone). Currently, this ability is typically met through an in-house approach where computational resources, model expertise, and data for model parameterization are brought together to meet modeling needs. However, such an approach has multiple drawbacks which limit the application of high-end reactive transport codes such as the Department of Energy-funded PFLOTRAN code. In addition, while many end users have a need for the capabilities provided by high-end reactive transport codes, they do not have the expertise - nor the time required to obtain the expertise - to effectively use these codes. We have developed and are actively enhancing a cloud-based software platform through which diverse users are able to easily configure, execute, visualize, share, and interpret PFLOTRAN models. This platform consists of a web application and on-demand HPC computational infrastructure. The web application consists of (1) a browser-based graphical user interface which allows users to configure models and visualize results interactively, (2) a central server with back-end relational databases which hold configuration, data, modeling results, and Python scripts for model configuration, and (3) an HPC environment for on-demand model execution. We will discuss lessons learned in the development of this platform, the...

  20. Testing R&D-Based Endogenous Growth Models

    DEFF Research Database (Denmark)

    Kruse-Andersen, Peter Kjær

    2017-01-01

    R&D-based growth models are tested using US data for the period 1953-2014. A general growth model is developed which nests the model varieties of interest. The model implies a cointegrating relationship between multifactor productivity, research intensity, and employment. This relationship is estimated using cointegrated VAR models. The results provide evidence against the widely used fully endogenous variety and in favor of the semi-endogenous variety. Forecasts based on the empirical estimates suggest that the slowdown in US productivity growth will continue. Particularly, the annual long...