WorldWideScience

Sample records for models based primarily

  1. Development and Sensitivity Analysis of a Frost Risk model based primarily on freely distributed Earth Observation data

    Science.gov (United States)

    Louka, Panagiota; Petropoulos, George; Papanikolaou, Ioannis

    2015-04-01

    The ability to map the spatiotemporal distribution of extreme climatic conditions, such as frost, is a significant tool in successful agricultural management and decision making. Nowadays, with the development of Earth Observation (EO) technology, it is possible to obtain accurate, timely and cost-effective information on the spatiotemporal distribution of frost conditions, particularly over large and otherwise inaccessible areas. The present study aimed at developing and evaluating a frost risk prediction model, exploiting primarily EO data from the MODIS and ASTER sensors and ancillary ground observation data. For the evaluation of our model, a region in north-western Greece was selected as the test site and a detailed sensitivity analysis was implemented. The agreement between the model predictions and the observed (remotely sensed) frost frequency obtained by the MODIS sensor was evaluated thoroughly. Also, detailed comparisons of the model predictions were performed against reference frost ground observations acquired from the Greek Agricultural Insurance Organization (ELGA) over a 10-year period (2000-2010). Overall, the results evidenced the ability of the model to reproduce the frost conditions reasonably well, following largely explainable patterns with respect to the study site and local weather characteristics. Implementation of our proposed frost risk model is based primarily on satellite imagery analysis that is nowadays provided globally at no cost. It is also straightforward and computationally inexpensive, requiring much less effort than, for example, field surveying. Finally, the method can be adjusted to integrate other high-resolution data available from both commercial and non-commercial vendors. Keywords: Sensitivity analysis, frost risk mapping, GIS, remote sensing, MODIS, Greece

  2. Disgust sensitivity is primarily associated with purity-based moral judgments.

    Science.gov (United States)

    Wagemans, Fieke M A; Brandt, Mark J; Zeelenberg, Marcel

    2018-03-01

    Individual differences in disgust sensitivity are associated with a range of judgments and attitudes related to the moral domain. Some perspectives suggest that the association between disgust sensitivity and moral judgments will be equally strong across all moral domains (i.e., purity, authority, loyalty, care, fairness, and liberty). Other perspectives predict that disgust sensitivity is primarily associated with judgments of specific moral domains (e.g., primarily purity). However, no study has systematically tested whether disgust sensitivity is associated specifically with moral judgments of the purity domain, more generally with moral judgments of the binding moral domains, or equally with moral judgments of all of the moral domains. Across 5 studies (total N = 1,104), we find consistent evidence for the notion that disgust sensitivity relates more strongly to moral condemnation of purity-based transgressions (meta-analytic r = .40) than to moral condemnation of transgressions of any of the other domains (range of meta-analytic rs: .07-.27). Our findings are in line with predictions from Moral Foundations Theory, which predicts that personality characteristics like disgust sensitivity make people more sensitive to a certain set of moral issues. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  3. Electroejaculation functions primarily by direct activation of pelvic musculature: Perspectives from a porcine model

    Directory of Open Access Journals (Sweden)

    Adam M.R. Groh

    2018-03-01

    Full Text Available Ejaculatory dysfunction is a significant cause of infertility in men who have incurred spinal cord injury or iatrogenic lesions to the sympathetic nerves in the retroperitoneum. For such patients, electroejaculation – whereby a voltage is applied transrectally under general anesthesia – is a highly effective procedure to obtain ejaculate. At present, however, there remains uncertainty as to the physiological mechanism by which electroejaculation prompts seminal emission in males with neurogenic anejaculation. Thus, in the present study, we aimed to determine, for the first time, whether electroejaculation functions by mimicking a neurophysiological response, or by directly activating local pelvic musculature. Using electroejaculation in a novel porcine model, we monitored the strength of contraction of the internal urethral sphincter (a smooth muscle involved in ejaculation) before and after lesioning its sympathetic innervation with a combination of progressively worsening surgical and pharmacological insults in three anesthetized boars (46.1 ± 7.4 kg). Importantly, prior to this investigation, we confirmed the comparative structural anatomy of the porcine model to humans through gross dissection and histological analysis of the infrarenal retroperitoneal sympathetic nerves and ganglia in 18 unembalmed boars. Prior to sacrifice, three of these boars underwent functional testing to confirm control of the internal urethral sphincter by the hypogastric nerves. Our results demonstrate that electroejaculation-induced contraction of the internal urethral sphincter was preserved following each progressive neural insult compared to the control state (p > 0.05). In contrast, these same insults resulted in paralysis/paresis of the internal urethral sphincter when its sympathetic innervation was directly stimulated with bipolar electrodes (p < 0.05). Taken together, our results provide the first empirical evidence to suggest that

  4. Digital Surface and Terrain Models (DSM,DTM), The DTM associated with the Base Mapping Program consists of mass points and breaklines used primarily for ortho rectification. The DTM specifications included all breaklines for all hydro and transportation features and are the source for the TIPS (Tenn, Published in 2007, 1:4800 (1in=400ft) scale, Tennessee, OIR-GIS.

    Data.gov (United States)

    NSGIC State | GIS Inventory — Digital Surface and Terrain Models (DSM,DTM) dataset current as of 2007. The DTM associated with the Base Mapping Program consists of mass points and breaklines used...

  5. Incidence of diseases primarily affecting the skin by age group: population-based epidemiologic study in Olmsted County, Minnesota, and comparison with age-specific incidence rates worldwide.

    Science.gov (United States)

    Wessman, Laurel L; Andersen, Louise K; Davis, Mark D P

    2018-01-29

    Understanding the effects of age on the epidemiology of diseases primarily affecting the skin is important to the practice of dermatology, both for proper allocation of resources and for optimal patient-centered care. To fully appreciate the effect that age may have on the population-based calculations of incidence of diseases primarily affecting the skin in Olmsted County, Minnesota, and worldwide, we performed a review of all relevant Rochester Epidemiology Project-published data and compared them to similar reports in the worldwide English literature. Using the Rochester Epidemiology Project, population-based epidemiologic studies have been performed to estimate the incidence of specific skin diseases over the past 50 years. In older persons (>65 years), nonmelanoma skin cancer, lentigo maligna, herpes zoster, delusional infestation, venous stasis syndrome, venous ulcer, and burning mouth syndrome were more commonly diagnosed. In those younger than 65 years, atypical nevi, psoriatic arthritis, pityriasis rosea, herpes progenitalis, genital warts, alopecia areata, hidradenitis suppurativa, infantile hemangioma, Behçet's disease, and sarcoidosis (isolated cutaneous, with sarcoidosis-specific cutaneous lesions and with erythema nodosum) had a higher incidence. Many of the incidence rates by age group of diseases primarily affecting the skin derived from the Rochester Epidemiology Project were similar to those reported elsewhere. © 2018 The International Society of Dermatology.

  6. Nanoparticles affect PCR primarily via surface interactions with PCR components: using amino-modified silica-coated magnetic nanoparticles as a main model

    Science.gov (United States)

    Nanomaterials have been widely reported to affect the polymerase chain reaction (PCR). However, many studies in which these effects were observed were not comprehensive, and many of the proposed mechanisms have been primarily speculative. In this work, we used amino-modified silica-coated magnetic n...

  7. An empirical test of the information-motivation-behavioral skills model of ART adherence in a sample of HIV-positive persons primarily in out-of-HIV-care settings.

    Science.gov (United States)

    Horvath, Keith J; Smolenski, Derek; Amico, K Rivet

    2014-02-01

    The current body of evidence supporting the Information-Motivation-Behavioral Skills (IMB) model of antiretroviral therapy (ART) adherence rests exclusively on data collected from people living with HIV (PLWH) at point-of-HIV-care services. The aims of this study were to: (1) determine if the IMB model is a useful predictive model of ART adherence among PLWH who were primarily recruited in out-of-HIV-care settings; and (2) assess whether the theorized associations between IMB model constructs and adherence persist in the presence of depression and current drug use. PLWH (n = 312) responding to a one-time online survey completed the Life Windows IMB-ART-Adherence Questionnaire, and demographic, depression (CES-D 10), and drug use items. Path models were used to assess the fit of a saturated versus fully mediated IMB model of adherence and to examine moderating effects of depression and current drug use. Participants were on average 43 years of age, had been living with HIV for 9 or more years, and were mostly male (84.0%), Caucasian (68.8%), and gay-identified (74.8%). The a priori measurement models for information and behavioral skills did not have acceptable fit to the data and were modified accordingly. Using the revised IMB scales, IMB constructs were associated with adherence as predicted by the theory in all but one model (i.e., the IMB model operated as predicted among nondrug users and those with and without depression). Among drug users, information exerted a direct effect on adherence but was not significantly associated with behavioral skills. Results of this study suggest that the fully or partially mediated IMB model is supported for use with samples of PLWH recruited primarily in out-of-HIV-care service settings and is robust in the presence of depression and drug use.

  8. Podocytes Degrade Endocytosed Albumin Primarily in Lysosomes

    Science.gov (United States)

    Carson, John M.; Okamura, Kayo; Wakashin, Hidefumi; McFann, Kim; Dobrinskikh, Evgenia; Kopp, Jeffrey B.; Blaine, Judith

    2014-01-01

    Albuminuria is a strong, independent predictor of chronic kidney disease progression. We hypothesize that podocyte processing of albumin via the lysosome may be an important determinant of podocyte injury and loss. A human urine-derived podocyte-like epithelial cell (HUPEC) line was used for in vitro experiments. Albumin uptake was quantified by Western blot after loading HUPECs with fluorescein-labeled (FITC) albumin. Co-localization of albumin with lysosomes was determined by confocal microscopy. Albumin degradation was measured by quantifying FITC-albumin abundance in HUPEC lysates by Western blot. Degradation experiments were repeated using HUPECs treated with chloroquine, a lysosome inhibitor, or MG-132, a proteasome inhibitor. Lysosome activity was measured by fluorescence recovery after photobleaching (FRAP). Cytokine production was measured by ELISA. Cell death was determined by trypan blue staining. In vivo, staining with lysosome-associated membrane protein-1 (LAMP-1) was performed on tissue from a Denys-Drash transgenic mouse model of nephrotic syndrome. HUPECs endocytosed albumin, which co-localized with lysosomes. Chloroquine, but not MG-132, inhibited albumin degradation, indicating that degradation occurs in lysosomes. Cathepsin B activity, measured by FRAP, significantly decreased in HUPECs exposed to albumin (12.5% of activity in controls) and chloroquine (12.8%), and declined further with exposure to albumin plus chloroquine (8.2%, p ...) ... albumin and chloroquine alone, and these effects were potentiated by exposure to albumin plus chloroquine. Compared to wild-type mice, glomerular staining of LAMP-1 was significantly increased in Denys-Drash mice and appeared to be most prominent in podocytes. These data suggest lysosomes are involved in the processing of endocytosed albumin in podocytes, and lysosomal dysfunction may contribute to podocyte injury and glomerulosclerosis in albuminuric diseases. Modifiers of lysosomal activity may have therapeutic

  9. Podocytes degrade endocytosed albumin primarily in lysosomes.

    Science.gov (United States)

    Carson, John M; Okamura, Kayo; Wakashin, Hidefumi; McFann, Kim; Dobrinskikh, Evgenia; Kopp, Jeffrey B; Blaine, Judith

    2014-01-01

    Albuminuria is a strong, independent predictor of chronic kidney disease progression. We hypothesize that podocyte processing of albumin via the lysosome may be an important determinant of podocyte injury and loss. A human urine-derived podocyte-like epithelial cell (HUPEC) line was used for in vitro experiments. Albumin uptake was quantified by Western blot after loading HUPECs with fluorescein-labeled (FITC) albumin. Co-localization of albumin with lysosomes was determined by confocal microscopy. Albumin degradation was measured by quantifying FITC-albumin abundance in HUPEC lysates by Western blot. Degradation experiments were repeated using HUPECs treated with chloroquine, a lysosome inhibitor, or MG-132, a proteasome inhibitor. Lysosome activity was measured by fluorescence recovery after photobleaching (FRAP). Cytokine production was measured by ELISA. Cell death was determined by trypan blue staining. In vivo, staining with lysosome-associated membrane protein-1 (LAMP-1) was performed on tissue from a Denys-Drash transgenic mouse model of nephrotic syndrome. HUPECs endocytosed albumin, which co-localized with lysosomes. Chloroquine, but not MG-132, inhibited albumin degradation, indicating that degradation occurs in lysosomes. Cathepsin B activity, measured by FRAP, significantly decreased in HUPECs exposed to albumin (12.5% of activity in controls) and chloroquine (12.8%), and declined further with exposure to albumin plus chloroquine (8.2%, p ...). ... These data suggest lysosomes are involved in the processing of endocytosed albumin in podocytes, and lysosomal dysfunction may contribute to podocyte injury and glomerulosclerosis in albuminuric diseases. Modifiers of lysosomal activity may have therapeutic potential in slowing the progression of glomerulosclerosis by enhancing the ability of podocytes to process and degrade albumin.

  10. Relationship between obsessive-compulsive disorders and diseases affecting primarily the basal ganglia

    Directory of Open Access Journals (Sweden)

    Alex S. S. Freire Maia

    1999-12-01

    Full Text Available Obsessive-compulsive disorder (OCD) has been reported in association with some neurological diseases that affect the basal ganglia, such as Tourette's syndrome, Sydenham's chorea, Parkinson's disease, and Huntington's disease. Furthermore, neuroimaging studies suggest a role of the basal ganglia in the pathophysiology of OCD. The aim of this paper is to describe the association of OCD with several neurologic disorders affecting the basal ganglia, report the existing evidence of the role of the basal ganglia in the pathophysiology of OCD, and analyze the mechanisms probably involved in this pathophysiology.

  11. Base Station Performance Model

    OpenAIRE

    Walsh, Barbara; Farrell, Ronan

    2005-01-01

    At present the testing of power amplifiers within base station transmitters is limited to testing at component level as opposed to testing at the system level. While the detection of catastrophic failure is possible, that of performance degradation is not. This paper proposes a base station model with respect to transmitter output power with the aim of introducing system level monitoring of the power amplifier behaviour within the base station. Our model reflects the expe...

  12. Web Based VRML Modelling

    NARCIS (Netherlands)

    Kiss, S.; Sarfraz, M.

    2004-01-01

    Presents a method to connect VRML (Virtual Reality Modeling Language) and Java components in a Web page using EAI (External Authoring Interface), which makes it possible to interactively generate and edit VRML meshes. The meshes used are based on regular grids, to provide an interaction and modeling

  13. Model Based Temporal Reasoning

    Science.gov (United States)

    Rabin, Marla J.; Spinrad, Paul R.; Fall, Thomas C.

    1988-03-01

    Systems that assess the real world must cope with evidence that is uncertain, ambiguous, and spread over time. Typically, the most important function of an assessment system is to identify when activities are occurring that are unusual or unanticipated. Model based temporal reasoning addresses both of these requirements. The differences among temporal reasoning schemes lie in the methods used to avoid computational intractability. If we had n pieces of data and we wanted to examine how they were related, the worst case would be where we had to examine every subset of these points to see if that subset satisfied the relations. This would be 2^n subsets, which is intractable. Models compress this; if several data points are all compatible with a model, then that model represents all those data points. Data points are then considered related if they lie within the same model or if they lie in models that are related. Models thus address the intractability problem. They also address the problem of determining unusual activities: if the data do not agree with models that are indicated by earlier data, then something out of the norm is taking place. The models can summarize what we know up to that time, so when they are not predicting correctly, either something unusual is happening or we need to revise our models. The model based reasoner developed at Advanced Decision Systems is thus both intuitive and powerful. It is currently being used on one operational system and several prototype systems. It has enough power to be used in domains spanning the spectrum from manufacturing engineering and project management to low-intensity conflict and strategic assessment.
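
    The compression-and-anomaly idea described above can be illustrated with a minimal sketch (hypothetical data, a tolerance of 1.0, and a simple least-squares line standing in for the much richer models used by the reasoner): observations that agree with the model fitted to earlier data are absorbed by it, and observations that disagree are flagged as unusual.

        # Minimal sketch: a line fitted to recent trusted observations acts as the
        # "model"; a new observation that disagrees with its prediction is unusual.
        def fit_line(points):
            """Least-squares line through (t, v) pairs; returns (slope, intercept)."""
            n = len(points)
            mt = sum(t for t, _ in points) / n
            mv = sum(v for _, v in points) / n
            denom = sum((t - mt) ** 2 for t, _ in points) or 1.0
            slope = sum((t - mt) * (v - mv) for t, v in points) / denom
            return slope, mv - slope * mt

        def detect_unusual(series, window=4, tol=1.0):
            trusted, unusual = list(series[:window]), []
            for t, v in series[window:]:
                slope, icept = fit_line(trusted[-window:])
                if abs(v - (slope * t + icept)) > tol:
                    unusual.append((t, v))      # disagrees with the model: unusual
                else:
                    trusted.append((t, v))      # the model absorbs (compresses) it
            return unusual

        data = [(0, 1.0), (1, 1.5), (2, 2.1), (3, 2.4), (4, 9.0), (5, 3.0), (6, 3.5)]
        print(detect_unusual(data))             # [(4, 9.0)] is the out-of-norm point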

  14. Skull base tumor model.

    Science.gov (United States)

    Gragnaniello, Cristian; Nader, Remi; van Doormaal, Tristan; Kamel, Mahmoud; Voormolen, Eduard H J; Lasio, Giovanni; Aboud, Emad; Regli, Luca; Tulleken, Cornelius A F; Al-Mefty, Ossama

    2010-11-01

    Resident duty-hours restrictions have now been instituted in many countries worldwide. Shortened training times and increased public scrutiny of surgical competency have led to a move away from the traditional apprenticeship model of training. The development of educational models for brain anatomy is a fascinating innovation allowing neurosurgeons to train without the need to practice on real patients and it may be a solution to achieve competency within a shortened training period. The authors describe the use of Stratathane resin ST-504 polymer (SRSP), which is inserted at different intracranial locations to closely mimic meningiomas and other pathological entities of the skull base, in a cadaveric model, for use in neurosurgical training. Silicone-injected and pressurized cadaveric heads were used for studying the SRSP model. The SRSP presents unique intrinsic metamorphic characteristics: liquid at first, it expands and foams when injected into the desired area of the brain, forming a solid tumorlike structure. The authors injected SRSP via different passages that did not influence routes used for the surgical approach for resection of the simulated lesion. For example, SRSP injection routes included endonasal transsphenoidal or transoral approaches if lesions were to be removed through standard skull base approach, or, alternatively, SRSP was injected via a cranial approach if the removal was planned to be via the transsphenoidal or transoral route. The model was set in place in 3 countries (US, Italy, and The Netherlands), and a pool of 13 physicians from 4 different institutions (all surgeons and surgeons in training) participated in evaluating it and provided feedback. All 13 evaluating physicians had overall positive impressions of the model. The overall score on 9 components evaluated--including comparison between the tumor model and real tumor cases, perioperative requirements, general impression, and applicability--was 88% (100% being the best possible

  15. Primarily Experimental Results for a W Wire Array Z Pinch

    International Nuclear Information System (INIS)

    Kuai Bin; Aici, Qiu; Wang Liangping; Zeng Zhengzhong; Wang Wensheng; Cong Peitian; Gai Tongyang; Wei Fuli; Guo Ning; Zhang Zhong

    2006-01-01

    Primarily experimental results are given for a W wire array Z pinch imploded with up to 2 MA in 100 ns on a Qiangguang-I pulsed power generator. The configuration and parameters of the generator, the W wire array load assembly and the diagnostic system for the experiment are described. The total X-ray energy has been obtained, with an average X-ray radiation power of 1.28 TW

  16. Human punishment is not primarily motivated by inequality.

    Science.gov (United States)

    Marczyk, Jesse

    2017-01-01

    Previous theorizing about punishment has suggested that humans desire to punish inequality per se. However, the research supporting such an interpretation contains important methodological confounds. The main objective of the current experiment was to remove those confounds in order to test whether generating inequality per se is punished. Participants were recruited from an online market to take part in a wealth-alteration game with an ostensible second player. The participants were given an option to deduct from the other player's payment as punishment for their behavior during the game. The results suggest that human punishment does not appear to be motivated by inequality per se, as inequality that was generated without inflicting costs on others was not reliably punished. Instead, punishment seems to respond primarily to the infliction of costs, with inequality only becoming relevant as a secondary input for punishment decisions. The theoretical significance of this finding is discussed in the context of its possible adaptive value.

  17. Human punishment is not primarily motivated by inequality

    Science.gov (United States)

    Marczyk, Jesse

    2017-01-01

    Previous theorizing about punishment has suggested that humans desire to punish inequality per se. However, the research supporting such an interpretation contains important methodological confounds. The main objective of the current experiment was to remove those confounds in order to test whether generating inequality per se is punished. Participants were recruited from an online market to take part in a wealth-alteration game with an ostensible second player. The participants were given an option to deduct from the other player’s payment as punishment for their behavior during the game. The results suggest that human punishment does not appear to be motivated by inequality per se, as inequality that was generated without inflicting costs on others was not reliably punished. Instead, punishment seems to respond primarily to the infliction of costs, with inequality only becoming relevant as a secondary input for punishment decisions. The theoretical significance of this finding is discussed in the context of its possible adaptive value. PMID:28187166

  18. Human punishment is not primarily motivated by inequality.

    Directory of Open Access Journals (Sweden)

    Jesse Marczyk

    Full Text Available Previous theorizing about punishment has suggested that humans desire to punish inequality per se. However, the research supporting such an interpretation contains important methodological confounds. The main objective of the current experiment was to remove those confounds in order to test whether generating inequality per se is punished. Participants were recruited from an online market to take part in a wealth-alteration game with an ostensible second player. The participants were given an option to deduct from the other player's payment as punishment for their behavior during the game. The results suggest that human punishment does not appear to be motivated by inequality per se, as inequality that was generated without inflicting costs on others was not reliably punished. Instead, punishment seems to respond primarily to the infliction of costs, with inequality only becoming relevant as a secondary input for punishment decisions. The theoretical significance of this finding is discussed in the context of its possible adaptive value.

  19. Model-Based Reasoning

    Science.gov (United States)

    Ifenthaler, Dirk; Seel, Norbert M.

    2013-01-01

    In this paper, there will be a particular focus on mental models and their application to inductive reasoning within the realm of instruction. A basic assumption of this study is the observation that the construction of mental models and related reasoning is a slowly developing capability of cognitive systems that emerges effectively with proper…

  20. Model-based Software Engineering

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2010-01-01

    The vision of model-based software engineering is to make models the main focus of software development and to automatically generate software from these models. Part of that idea works already today. But, there are still difficulties when it comes to behaviour. Actually, there is no lack in models...

  1. Principles of models based engineering

    Energy Technology Data Exchange (ETDEWEB)

    Dolin, R.M.; Hefele, J.

    1996-11-01

    This report describes a Models Based Engineering (MBE) philosophy and implementation strategy that has been developed at Los Alamos National Laboratory's Center for Advanced Engineering Technology. A major theme in this discussion is that models based engineering is an information management technology enabling the development of information driven engineering. Unlike other information management technologies, models based engineering encompasses the breadth of engineering information, from design intent through product definition to consumer application.

  2. Risk based modelling

    International Nuclear Information System (INIS)

    Chapman, O.J.V.; Baker, A.E.

    1993-01-01

    Risk based analysis is a tool becoming available to both engineers and managers to aid decision making concerning plant matters such as In-Service Inspection (ISI). In order to develop a risk based method, some form of Structural Reliability Risk Assessment (SRRA) needs to be performed to provide a probability of failure ranking for all sites around the plant. A Probabilistic Risk Assessment (PRA) can then be carried out to combine these possible events with the capability of plant safety systems and procedures, to establish the consequences of failure for the sites. In this way the probabilities of failure are converted into a risk based ranking which can be used to assist the process of deciding which sites should be included in an ISI programme. This paper reviews the technique and typical results of a risk based ranking assessment carried out for nuclear power plant pipework. (author)
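
    The ranking step described above reduces to combining a probability of failure with a consequence score per site; the following minimal sketch (hypothetical sites, probabilities, and consequence scores) illustrates the arithmetic only, not the authors' actual SRRA/PRA procedure.

        # Hypothetical sites: an SRRA-style probability of failure is combined with
        # a PRA-style consequence score; sites are ranked by the resulting risk.
        sites = {
            "weld A12":  {"p_fail": 1e-4, "consequence": 8.0},
            "elbow B03": {"p_fail": 5e-6, "consequence": 9.5},
            "tee C07":   {"p_fail": 2e-5, "consequence": 2.0},
        }

        ranking = sorted(sites.items(),
                         key=lambda kv: kv[1]["p_fail"] * kv[1]["consequence"],
                         reverse=True)

        for name, s in ranking:
            print(f"{name}: risk = {s['p_fail'] * s['consequence']:.1e}")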

  3. Model-based consensus

    NARCIS (Netherlands)

    Boumans, M.; Martini, C.; Boumans, M.

    2014-01-01

    The aim of the rational-consensus method is to produce "rational consensus", that is, "mathematical aggregation", by weighing the performance of each expert on the basis of his or her knowledge and ability to judge relevant uncertainties. The measurement of the performance of the experts is based on

  4. Model-based consensus

    NARCIS (Netherlands)

    Boumans, Marcel

    2014-01-01

    The aim of the rational-consensus method is to produce “rational consensus”, that is, “mathematical aggregation”, by weighing the performance of each expert on the basis of his or her knowledge and ability to judge relevant uncertainties. The measurement of the performance of the experts is based on
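
    A small numerical sketch of the performance-weighted aggregation described in these two records (hypothetical experts, scores, and point estimates; the normalized-score weighting shown here illustrates the principle and is not necessarily the exact scheme used by the authors):

        # Each (hypothetical) expert gives an estimate of an uncertain quantity and
        # has a performance/calibration score; the consensus is the weighted mean.
        experts = [
            {"name": "expert 1", "score": 0.9, "estimate": 12.0},
            {"name": "expert 2", "score": 0.3, "estimate": 20.0},
            {"name": "expert 3", "score": 0.6, "estimate": 15.0},
        ]

        total = sum(e["score"] for e in experts)
        consensus = sum(e["score"] / total * e["estimate"] for e in experts)
        print(f"weighted consensus estimate: {consensus:.2f}")   # 14.33 here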

  5. Activity-based DEVS modeling

    DEFF Research Database (Denmark)

    Alshareef, Abdurrahman; Sarjoughian, Hessam S.; Zarrin, Bahram

    2018-01-01

    Use of model-driven approaches has been increasing to significantly benefit the process of building complex systems. Recently, an approach for specifying model behavior using UML activities has been devised to support the creation of DEVS models in a disciplined manner based on the model driven architecture and the UML concepts. In this paper, we further this work by grounding Activity-based DEVS modeling and developing a fully-fledged modeling engine to demonstrate applicability. We also detail the relevant aspects of the created metamodel in terms of modeling and simulation. A significant number of the artifacts of the UML 2.5 activities and actions, from the vantage point of DEVS behavioral modeling, are covered in detail. Their semantics are discussed to the extent of time-accurate requirements for simulation. We characterize them in correspondence with the specification of the atomic model behavior. We...

  6. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches....

  7. Spatial agent-based models for socio-ecological systems: challenges and prospects

    NARCIS (Netherlands)

    de Filatova, T.; Verburg, P.H.; Parker, D.C.; Stannard, S.R.

    2013-01-01

    Departing from the comprehensive reviews carried out in the field, we identify the key challenges that agent-based methodology faces when modeling coupled socio-ecological systems. Focusing primarily on the papers presented in this thematic issue, we review progress in spatial agent-based models

  8. Central Puget Sound Ecopath/Ecosim model outputs - Developing food web models for ecosystem-based management applications in Puget Sound

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This project is developing food web models for ecosystem-based management applications in Puget Sound. It is primarily being done by NMFS FTEs and contractors, in...

  9. Central Puget Sound Ecopath/Ecosim model biological parameters - Developing food web models for ecosystem-based management applications in Puget Sound

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This project is developing food web models for ecosystem-based management applications in Puget Sound. It is primarily being done by NMFS FTEs and contractors, in...

  10. HMM-based Trust Model

    DEFF Research Database (Denmark)

    ElSalamouny, Ehab; Nielsen, Mogens; Sassone, Vladimiro

    2010-01-01

    Probabilistic trust has been adopted as an approach to taking security sensitive decisions in modern global computing environments. Existing probabilistic trust frameworks either assume fixed behaviour for the principals or incorporate the notion of 'decay' as an ad hoc approach to cope with their dynamic behaviour. Using Hidden Markov Models (HMMs) for both modelling and approximating the behaviours of principals, we introduce the HMM-based trust model as a new approach to evaluating trust in systems exhibiting dynamic behaviour. This model avoids the fixed behaviour assumption which is considered the major limitation of existing Beta trust model. We show the consistency of the HMM-based trust model and contrast it against the well known Beta trust model with the decay principle in terms of the estimation precision....
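
    As a concrete and purely illustrative rendering of the idea, trust in a principal can be computed as the predictive probability of a satisfactory next interaction under a small HMM of the principal's behaviour; the two-state model, its parameters, and the observation encoding below are hypothetical, and the estimator shown is a generic forward-algorithm filter rather than the authors' exact formulation.

        # Hypothetical 2-state HMM of a principal ("good"/"bad"); observations are
        # 1 = satisfactory interaction, 0 = unsatisfactory. Trust is taken as the
        # predictive probability that the next interaction is satisfactory.
        import numpy as np

        A = np.array([[0.95, 0.05],      # state transition matrix
                      [0.10, 0.90]])
        B = np.array([[0.90, 0.10],      # P(obs=1 | state), P(obs=0 | state)
                      [0.20, 0.80]])
        pi = np.array([0.5, 0.5])        # initial state distribution

        def trust(observations):
            """Forward-filter the hidden state, then predict P(next obs = 1)."""
            alpha = pi * B[:, 1 - observations[0]]
            alpha /= alpha.sum()
            for o in observations[1:]:
                alpha = (alpha @ A) * B[:, 1 - o]
                alpha /= alpha.sum()             # normalise to avoid underflow
            next_state = alpha @ A               # one-step state prediction
            return float(next_state @ B[:, 0])   # probability of a good outcome

        print(trust([1, 1, 1, 1, 1]))   # high trust after consistent good behaviour
        print(trust([1, 1, 0, 0, 0]))   # trust drops after a run of bad outcomes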

  11. Modeling Guru: Knowledge Base for NASA Modelers

    Science.gov (United States)

    Seablom, M. S.; Wojcik, G. S.; van Aartsen, B. H.

    2009-05-01

    Modeling Guru is an on-line knowledge-sharing resource for anyone involved with or interested in NASA's scientific models or High End Computing (HEC) systems. Developed and maintained by the NASA's Software Integration and Visualization Office (SIVO) and the NASA Center for Computational Sciences (NCCS), Modeling Guru's combined forums and knowledge base for research and collaboration is becoming a repository for the accumulated expertise of NASA's scientific modeling and HEC communities. All NASA modelers and associates are encouraged to participate and provide knowledge about the models and systems so that other users may benefit from their experience. Modeling Guru is divided into a hierarchy of communities, each with its own set forums and knowledge base documents. Current modeling communities include those for space science, land and atmospheric dynamics, atmospheric chemistry, and oceanography. In addition, there are communities focused on NCCS systems, HEC tools and libraries, and programming and scripting languages. Anyone may view most of the content on Modeling Guru (available at http://modelingguru.nasa.gov/), but you must log in to post messages and subscribe to community postings. The site offers a full range of "Web 2.0" features, including discussion forums, "wiki" document generation, document uploading, RSS feeds, search tools, blogs, email notification, and "breadcrumb" links. A discussion (a.k.a. forum "thread") is used to post comments, solicit feedback, or ask questions. If marked as a question, SIVO will monitor the thread, and normally respond within a day. Discussions can include embedded images, tables, and formatting through the use of the Rich Text Editor. Also, the user can add "Tags" to their thread to facilitate later searches. The "knowledge base" is comprised of documents that are used to capture and share expertise with others. The default "wiki" document lets users edit within the browser so others can easily collaborate on the

  12. Structure-Based Turbulence Model

    National Research Council Canada - National Science Library

    Reynolds, W

    2000-01-01

    ... Maire carried out this work as part of his Ph.D. research. During the award period we began to explore ways to simplify the structure-based modeling so that it could be used in repetitive engineering calculations...

  13. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2009-01-01

    The purpose of the paper is to obtain insight into and provide practical advice for event-based conceptual modeling. We analyze a set of event concepts and use the results to formulate a conceptual event model that is used to identify guidelines for creation of dynamic process models and static information models. We characterize events as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The conceptual event model is used to characterize a variety of event concepts and it is used to illustrate how events can be used to integrate dynamic modeling of processes and static modeling of information structures. The results are unique in the sense that no other general event concept has been used to unify a similar broad variety of seemingly incompatible event concepts. The general event concept can be used....

  14. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Computer Based Modelling and Simulation - Modelling Deterministic Systems. N K Srinivasan. General Article, Resonance – Journal of Science Education, Volume 6, Issue 3, March 2001, pp. 46-54.

  15. Process-Based Modeling of Constructed Wetlands

    Science.gov (United States)

    Baechler, S.; Brovelli, A.; Rossi, L.; Barry, D. A.

    2007-12-01

    Constructed wetlands (CWs) are widespread facilities for wastewater treatment. In subsurface flow wetlands, contaminated wastewater flows through a porous matrix, where oxidation and detoxification phenomena occur. Despite the large number of working CWs, system design and optimization are still mainly based upon empirical equations or simplified first-order kinetics. This results from an incomplete understanding of the system functioning, and may in turn hinder the performance and effectiveness of the treatment process. As a result, CWs are often considered not suitable to meet high water quality-standards, or to treat water contaminated with recalcitrant anthropogenic contaminants. To date, only a limited number of detailed numerical models have been developed and successfully applied to simulate constructed wetland behavior. Among these, one of the most complete and powerful is CW2D, which is based on Hydrus2D. The aim of this work is to develop a comprehensive simulator tailored to model the functioning of horizontal flow constructed wetlands and in turn provide a reliable design and optimization tool. The model is based upon PHWAT, a general reactive transport code for saturated flow. PHWAT couples MODFLOW, MT3DMS and PHREEQC-2 using an operator-splitting approach. The use of PHREEQC to simulate reactions allows great flexibility in simulating biogeochemical processes. The biogeochemical reaction network is similar to that of CW2D, and is based on the Activated Sludge Model (ASM). Kinetic oxidation of carbon sources and nutrient transformations (nitrogen and phosphorous primarily) are modeled via Monod-type kinetic equations. Oxygen dissolution is accounted for via a first-order mass-transfer equation. While the ASM model only includes a limited number of kinetic equations, the new simulator permits incorporation of an unlimited number of both kinetic and equilibrium reactions. Changes in pH, redox potential and surface reactions can be easily incorporated
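
    The ASM-style kinetics mentioned above can be written out explicitly; in generic notation (not the exact CW2D/PHWAT parameterization), aerobic degradation of a dissolved substrate S by biomass X with dissolved oxygen S_O follows a Monod-type rate, and oxygen dissolution follows first-order mass transfer:

        \frac{dS}{dt} = -\frac{\mu_{\max}}{Y}\,\frac{S}{K_S + S}\,\frac{S_O}{K_O + S_O}\,X,
        \qquad
        \frac{dS_O}{dt} = k_L a\,\bigl(S_{O,\mathrm{sat}} - S_O\bigr) - \ldots

    Here \mu_{\max} is the maximum specific growth rate, Y the yield coefficient, K_S and K_O half-saturation constants, and k_L a the oxygen mass-transfer coefficient; the ellipsis stands for the oxygen consumed by the degradation reactions.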

  16. Model-based machine learning.

    Science.gov (United States)

    Bishop, Christopher M

    2013-02-13

    Several decades of research in the field of machine learning have resulted in a multitude of different algorithms for solving a broad range of problems. To tackle a new application, a researcher typically tries to map their problem onto one of these existing methods, often influenced by their familiarity with specific algorithms and by the availability of corresponding software implementations. In this study, we describe an alternative methodology for applying machine learning, in which a bespoke solution is formulated for each new application. The solution is expressed through a compact modelling language, and the corresponding custom machine learning code is then generated automatically. This model-based approach offers several major advantages, including the opportunity to create highly tailored models for specific scenarios, as well as rapid prototyping and comparison of a range of alternative models. Furthermore, newcomers to the field of machine learning do not have to learn about the huge range of traditional methods, but instead can focus their attention on understanding a single modelling environment. In this study, we show how probabilistic graphical models, coupled with efficient inference algorithms, provide a very flexible foundation for model-based machine learning, and we outline a large-scale commercial application of this framework involving tens of millions of users. We also describe the concept of probabilistic programming as a powerful software environment for model-based machine learning, and we discuss a specific probabilistic programming language called Infer.NET, which has been widely used in practical applications.
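
    The workflow described above (write the model once, let a generic inference engine do the rest) can be sketched in a few lines; the data are hypothetical, brute-force grid inference stands in for the efficient message-passing code that a framework such as Infer.NET would generate, and nothing here reflects Infer.NET's actual API.

        # Model-based workflow in miniature: the generative model is stated once,
        # and a generic inference routine (here, enumeration over a parameter grid
        # with a uniform prior) produces the posterior automatically.
        import math

        def log_likelihood(theta, data):
            """Generative model: each observation is Bernoulli(theta)."""
            return sum(math.log(theta if x else 1.0 - theta) for x in data)

        def posterior(data, grid_size=101):
            grid = [i / grid_size for i in range(1, grid_size)]     # avoid 0 and 1
            weights = [math.exp(log_likelihood(t, data)) for t in grid]
            z = sum(weights)
            return grid, [w / z for w in weights]

        data = [1, 1, 0, 1, 1, 1, 0, 1]                  # hypothetical observations
        grid, post = posterior(data)
        print(sum(t * p for t, p in zip(grid, post)))    # posterior mean, about 0.7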

  17. Cluster Based Text Classification Model

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah; Wiil, Uffe Kock

    2011-01-01

    We propose a cluster based classification model for suspicious email detection and other text classification tasks. The text classification tasks comprise many training examples that require a complex classification model. Using clusters for classification makes the model simpler and increases the accuracy at the same time. The test example is classified using a simpler and smaller model. The training examples in a particular cluster share a common vocabulary. At the time of clustering, we do not take into account the labels of the training examples. After the clusters have been created, the classifier is trained on each cluster, which has reduced dimensionality and fewer examples. The experimental results show that the proposed model outperforms the existing classification models for the task of suspicious email detection and topic categorization on the Reuters-21578 and 20 Newsgroups
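
    A minimal sketch of the cluster-then-classify pipeline described above, using a hypothetical toy corpus and off-the-shelf scikit-learn components in place of the authors' implementation: documents are clustered without their labels, a small classifier is trained per cluster, and a test document is routed to the classifier of its nearest cluster.

        # Cluster-based text classification sketch (hypothetical toy data).
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.cluster import KMeans
        from sklearn.linear_model import LogisticRegression

        docs = ["cheap meds online now", "meeting agenda for monday",
                "win a free prize today", "quarterly budget report attached",
                "claim your free prize now", "agenda and minutes attached"]
        labels = [1, 0, 1, 0, 1, 0]                 # 1 = suspicious, 0 = normal

        vec = TfidfVectorizer()
        X = vec.fit_transform(docs)
        km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)   # labels unused

        # One small classifier per cluster, trained only on that cluster's examples;
        # a cluster containing a single class simply becomes a constant predictor.
        models = {}
        for c in set(km.labels_):
            idx = [i for i, ci in enumerate(km.labels_) if ci == c]
            y = [labels[i] for i in idx]
            models[int(c)] = y[0] if len(set(y)) == 1 else LogisticRegression().fit(X[idx], y)

        def classify(text):
            x = vec.transform([text])
            m = models[int(km.predict(x)[0])]       # route to the nearest cluster
            return m if isinstance(m, int) else int(m.predict(x)[0])

        print(classify("free prize waiting, claim now"))   # likely 1 for this toy corpus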

  18. Graph Model Based Indoor Tracking

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Lu, Hua; Yang, Bin

    2009-01-01

    The tracking of the locations of moving objects in large indoor spaces is important, as it enables a range of applications related to, e.g., security and indoor navigation and guidance. This paper presents a graph model based approach to indoor tracking that offers a uniform data management...

  19. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Computer Based ... universities, and later did system analysis, ... personal computers (PC) and low cost software packages and tools. They can serve as useful learning experience through student projects. Models are .... Let us consider a numerical example: to calculate the velocity of a trainer aircraft ...

  20. Model-based sensor diagnosis

    International Nuclear Information System (INIS)

    Milgram, J.; Dormoy, J.L.

    1994-09-01

    Running a nuclear power plant involves monitoring data provided by the installation's sensors. Operators and computerized systems then use these data to establish a diagnostic of the plant. However, the instrumentation system is complex, and is not immune to faults and failures. This paper presents a system for detecting sensor failures using a topological description of the installation and a set of component models. This model of the plant implicitly contains relations between sensor data. These relations must always be checked if all the components are functioning correctly. The failure detection task thus consists of checking these constraints. The constraints are extracted in two stages. Firstly, a qualitative model of their existence is built using structural analysis. Secondly, the models are formally handled according to the results of the structural analysis, in order to establish the constraints on the sensor data. This work constitutes an initial step in extending model-based diagnosis, as the information on which it is based is suspect. This work will be followed by surveillance of the detection system. When the instrumentation is assumed to be sound, the unverified constraints indicate errors on the plant model. (authors). 8 refs., 4 figs
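
    The constraint-checking step described above can be illustrated with a small hypothetical example: a component model (here a mass balance across a flow splitter) implies a relation between sensor readings, and a residual larger than the measurement tolerance points to a faulty sensor or an incorrect model. The sensors, values, and tolerance below are invented for illustration.

        # Hypothetical analytical-redundancy check: a flow splitter model implies
        # q_in = q_out1 + q_out2; a large residual flags a sensor (or model) fault.
        def check_constraint(q_in, q_out1, q_out2, tol=0.5):
            residual = q_in - (q_out1 + q_out2)
            return abs(residual) <= tol, residual

        ok, r = check_constraint(q_in=10.2, q_out1=6.1, q_out2=4.0)
        print(ok, round(r, 3))    # True: readings consistent within tolerance

        ok, r = check_constraint(q_in=10.2, q_out1=6.1, q_out2=1.0)
        print(ok, round(r, 3))    # False: one of the three sensors is suspect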

  1. Model-based security testing

    OpenAIRE

    Schieferdecker, Ina; Großmann, Jürgen; Schneider, Martin

    2012-01-01

    Security testing aims at validating software system requirements related to security properties like confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for specification of test cases at a higher level of abstraction, for enabling guidance on test identification and specification as well as for automated test generation. Model-based security...

  2. Model-Based Security Testing

    Directory of Open Access Journals (Sweden)

    Ina Schieferdecker

    2012-02-01

    Full Text Available Security testing aims at validating software system requirements related to security properties like confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for specification of test cases at a higher level of abstraction, for enabling guidance on test identification and specification as well as for automated test generation. Model-based security testing (MBST) is a relatively new field and especially dedicated to the systematic and efficient specification and documentation of security test objectives, security test cases and test suites, as well as to their automated or semi-automated generation. In particular, the combination of security modelling and test generation approaches is still a challenge in research and of high interest for industrial applications. MBST includes e.g. security functional testing, model-based fuzzing, risk- and threat-oriented testing, and the usage of security test patterns. This paper provides a survey on MBST techniques and the related models as well as samples of new methods and tools that are under development in the European ITEA2-project DIAMONDS.

  3. Crowdsourcing Based 3d Modeling

    Science.gov (United States)

    Somogyi, A.; Barsi, A.; Molnar, B.; Lovas, T.

    2016-06-01

    Web-based photo albums that support organizing and viewing the users' images are widely used. These services provide a convenient solution for storing, editing and sharing images. In many cases, the users attach geotags to the images in order to enable using them e.g. in location based applications on social networks. Our paper discusses a procedure that collects open access images from a site frequently visited by tourists. Geotagged pictures showing the image of a sight or tourist attraction are selected and processed in photogrammetric processing software that produces the 3D model of the captured object. For the particular investigation we selected three attractions in Budapest. To assess the geometrical accuracy, we used laser scanner and DSLR as well as smart phone photography to derive reference values to enable verifying the spatial model obtained from the web-album images. The investigation shows how detailed and accurate models could be derived applying photogrammetric processing software, simply by using images of the community, without visiting the site.

  4. CROWDSOURCING BASED 3D MODELING

    Directory of Open Access Journals (Sweden)

    A. Somogyi

    2016-06-01

    Full Text Available Web-based photo albums that support organizing and viewing the users’ images are widely used. These services provide a convenient solution for storing, editing and sharing images. In many cases, the users attach geotags to the images in order to enable using them e.g. in location based applications on social networks. Our paper discusses a procedure that collects open access images from a site frequently visited by tourists. Geotagged pictures showing the image of a sight or tourist attraction are selected and processed in photogrammetric processing software that produces the 3D model of the captured object. For the particular investigation we selected three attractions in Budapest. To assess the geometrical accuracy, we used laser scanner and DSLR as well as smart phone photography to derive reference values to enable verifying the spatial model obtained from the web-album images. The investigation shows how detailed and accurate models could be derived applying photogrammetric processing software, simply by using images of the community, without visiting the site.

  5. Microinstability-based model for anomalous thermal confinement in tokamaks

    International Nuclear Information System (INIS)

    Tang, W.M.

    1986-03-01

    This paper deals with the formulation of microinstability-based thermal transport coefficients (χ_j) for the purpose of modelling anomalous energy confinement properties in tokamak plasmas. Attention is primarily focused on ohmically heated discharges and the associated anomalous electron thermal transport. An appropriate expression for χ_e is developed which is consistent with reasonable global constraints on the current and electron temperature profiles as well as with the key properties of the kinetic instabilities most likely to be present. Comparisons of confinement scaling trends predicted by this model with the empirical ohmic data base indicate quite favorable agreement. The subject of anomalous ion thermal transport and its implications for high density ohmic discharges and for auxiliary-heated plasmas is also addressed

  6. Issues in practical model-based diagnosis

    NARCIS (Netherlands)

    Bakker, R.R.; van den Bempt, P.C.A.; Mars, Nicolaas; Out, D.-J.; van Soest, D.C.

    1993-01-01

    The model-based diagnosis project at the University of Twente has been directed at improving the practical usefulness of model-based diagnosis. In cooperation with industrial partners, the research addressed the modeling problem and the efficiency problem in model-based reasoning. Main results of

  7. A simplified physics-based model for nickel hydrogen battery

    Science.gov (United States)

    Liu, Shengyi; Dougal, Roger A.; Weidner, John W.; Gao, Lijun

    This paper presents a simplified model of a nickel hydrogen battery based on a first approximation. The battery is assumed uniform throughout. The reversible potential is considered primarily due to one-electron transfer redox reaction of nickel hydroxide and nickel oxyhydroxide. The non-ideality due to phase reactions is characterized by the two-parameter activity coefficients. The overcharge process is characterized by the oxygen reaction. The overpotentials are lumped to a tunable resistive drop to fit particular battery designs. The model is implemented in the Virtual Test Bed environment, and the characteristics of the battery are simulated and in good agreement with the experimental data within the normal operating regime. The model can be used for battery dynamic simulation and design in a satellite power system, an example of which is given.
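
    The structure of the model described above can be summarized compactly; a hedged sketch of the cell voltage under the stated assumptions (one-electron Ni(OH)2/NiOOH couple, activity-coefficient-corrected activities a, and a single lumped resistive drop) is a Nernst-type expression of roughly the following form, with the hydrogen-electrode and water/electrolyte activity terms folded into the constant for brevity:

        V_{\mathrm{cell}} \approx E^{0'} + \frac{RT}{F}\,\ln\!\left(\frac{a_{\mathrm{NiOOH}}}{a_{\mathrm{Ni(OH)_2}}}\right) - I\,R_{\Omega}

    Here E^{0'} is an effective standard potential, I the discharge current, and R_\Omega the tunable lumped resistance that absorbs the overpotentials; during overcharge an additional oxygen-evolution term would apply.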

  8. Control volume based modelling of compressible flow in reciprocating machines

    DEFF Research Database (Denmark)

    Andersen, Stig Kildegård; Thomsen, Per Grove; Carlsen, Henrik

    2004-01-01

    An approach to modelling unsteady compressible flow that is primarily one dimensional is presented. The approach was developed for creating distributed models of machines with reciprocating pistons but it is not limited to this application. The approach is based on the integral form of the unsteady conservation laws for mass, energy, and momentum applied to a staggered mesh consisting of two overlapping strings of control volumes. Loss mechanisms can be included directly in the governing equations of models by including them as terms in the conservation laws. Heat transfer, flow friction, and multidimensional effects must be calculated using empirical correlations; correlations for steady state flow can be used as an approximation. A transformation that assumes ideal gas is presented for transforming equations for masses and energies in control volumes into the corresponding pressures and temperatures.
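
    The ideal-gas transformation mentioned above can be made explicit; given the mass m and internal energy U of a control volume of volume V obtained from the conservation laws, a generic sketch with a constant specific heat (the paper's formulation may differ in detail) is:

        T = \frac{U}{m\,c_v}, \qquad p = \frac{m\,R\,T}{V} = \frac{(\gamma - 1)\,U}{V}

    where c_v is the constant-volume specific heat, R the specific gas constant, and \gamma = c_p / c_v.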

  9. Sensor-based interior modeling

    International Nuclear Information System (INIS)

    Herbert, M.; Hoffman, R.; Johnson, A.; Osborn, J.

    1995-01-01

    Robots and remote systems will play crucial roles in future decontamination and decommissioning (D&D) of nuclear facilities. Many of these facilities, such as uranium enrichment plants, weapons assembly plants, research and production reactors, and fuel recycling facilities, are dormant; there is also an increasing number of commercial reactors whose useful lifetime is nearly over. To reduce worker exposure to radiation, occupational and other hazards associated with D&D tasks, robots will execute much of the work agenda. Traditional teleoperated systems rely on human understanding (based on information gathered by remote viewing cameras) of the work environment to safely control the remote equipment. However, removing the operator from the work site substantially reduces his efficiency and effectiveness. To approach the productivity of a human worker, tasks will be performed telerobotically, in which many aspects of task execution are delegated to robot controllers and other software. This paper describes a system that semi-automatically builds a virtual world for remote D&D operations by constructing 3-D models of a robot's work environment. Planar and quadric surface representations of objects typically found in nuclear facilities are generated from laser rangefinder data with a minimum of human interaction. The surface representations are then incorporated into a task space model that can be viewed and analyzed by the operator, accessed by motion planning and robot safeguarding algorithms, and ultimately used by the operator to instruct the robot at a level much higher than teleoperation

  10. Differential Geometry Based Multiscale Models

    Science.gov (United States)

    Wei, Guo-Wei

    2010-01-01

    Large chemical and biological systems such as fuel cells, ion channels, molecular motors, and viruses are of great importance to the scientific community and public health. Typically, these complex systems in conjunction with their aquatic environment pose a fabulous challenge to theoretical description, simulation, and prediction. In this work, we propose a differential geometry based multiscale paradigm to model complex macromolecular systems, and to put macroscopic and microscopic descriptions on an equal footing. In our approach, the differential geometry theory of surfaces and geometric measure theory are employed as a natural means to couple the macroscopic continuum mechanical description of the aquatic environment with the microscopic discrete atom-istic description of the macromolecule. Multiscale free energy functionals, or multiscale action functionals are constructed as a unified framework to derive the governing equations for the dynamics of different scales and different descriptions. Two types of aqueous macromolecular complexes, ones that are near equilibrium and others that are far from equilibrium, are considered in our formulations. We show that generalized Navier–Stokes equations for the fluid dynamics, generalized Poisson equations or generalized Poisson–Boltzmann equations for electrostatic interactions, and Newton's equation for the molecular dynamics can be derived by the least action principle. These equations are coupled through the continuum-discrete interface whose dynamics is governed by potential driven geometric flows. Comparison is given to classical descriptions of the fluid and electrostatic interactions without geometric flow based micro-macro interfaces. The detailed balance of forces is emphasized in the present work. We further extend the proposed multiscale paradigm to micro-macro analysis of electrohydrodynamics, electrophoresis, fuel cells, and ion channels. We derive generalized Poisson–Nernst–Planck equations that
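
    Schematically, and only as an illustration consistent with the description above (the full functional also contains nonpolar solvation and hydrodynamic terms, indicated here by the ellipsis), the coupling is driven by a total free energy of the form

        G_{\mathrm{total}}[S,\phi] = \gamma\,\mathrm{Area}(S) + p\,\mathrm{Vol}(\Omega_m)
          + \int \Bigl[-\tfrac{\epsilon}{2}\,|\nabla\phi|^{2} + \rho_m\,\phi
          - k_B T \sum_i c_i\bigl(e^{-q_i \phi / k_B T} - 1\bigr)\Bigr]\,d\mathbf{r} + \ldots

    whose variations with respect to the interface S and the potential \phi yield, respectively, a mean-curvature-driven geometric flow for the micro-macro interface and a generalized Poisson-Boltzmann equation, in line with the least-action derivation described in the abstract.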

  11. Differential geometry based multiscale models.

    Science.gov (United States)

    Wei, Guo-Wei

    2010-08-01

    Large chemical and biological systems such as fuel cells, ion channels, molecular motors, and viruses are of great importance to the scientific community and public health. Typically, these complex systems in conjunction with their aquatic environment pose a fabulous challenge to theoretical description, simulation, and prediction. In this work, we propose a differential geometry based multiscale paradigm to model complex macromolecular systems, and to put macroscopic and microscopic descriptions on an equal footing. In our approach, the differential geometry theory of surfaces and geometric measure theory are employed as a natural means to couple the macroscopic continuum mechanical description of the aquatic environment with the microscopic discrete atomistic description of the macromolecule. Multiscale free energy functionals, or multiscale action functionals are constructed as a unified framework to derive the governing equations for the dynamics of different scales and different descriptions. Two types of aqueous macromolecular complexes, ones that are near equilibrium and others that are far from equilibrium, are considered in our formulations. We show that generalized Navier-Stokes equations for the fluid dynamics, generalized Poisson equations or generalized Poisson-Boltzmann equations for electrostatic interactions, and Newton's equation for the molecular dynamics can be derived by the least action principle. These equations are coupled through the continuum-discrete interface whose dynamics is governed by potential driven geometric flows. Comparison is given to classical descriptions of the fluid and electrostatic interactions without geometric flow based micro-macro interfaces. The detailed balance of forces is emphasized in the present work. We further extend the proposed multiscale paradigm to micro-macro analysis of electrohydrodynamics, electrophoresis, fuel cells, and ion channels. We derive generalized Poisson-Nernst-Planck equations that are
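
    For orientation only, the classical (non-generalized) Poisson–Nernst–Planck system referred to above has the standard form below; the generalized versions derived in the paper add geometric-flow and micro–macro coupling terms that are not reproduced here. Here $\phi$ is the electrostatic potential and $c_i$, $q_i$, $D_i$ are the concentration, charge and diffusivity of ionic species $i$.

$$
\nabla \cdot \bigl( \epsilon \, \nabla \phi \bigr) = -\rho_{\text{fixed}} - \sum_i q_i c_i ,
\qquad
\frac{\partial c_i}{\partial t} = \nabla \cdot \left[ D_i \left( \nabla c_i + \frac{q_i c_i}{k_B T} \, \nabla \phi \right) \right].
$$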

  12. Temporal integration of loudness in listeners with hearing losses of primarily cochlear origin

    DEFF Research Database (Denmark)

    Buus, Søren; Florentine, Mary; Poulsen, Torben

    1999-01-01

    To investigate how hearing loss of primarily cochlear origin affects the loudness of brief tones, loudness matches between 5- and 200-ms tones were obtained as a function of level for 15 listeners with cochlear impairments and for seven age-matched controls. Three frequencies, usually 0.5, 1, and 4...... of temporal integration—defined as the level difference between equally loud short and long tones—varied nonmonotonically with level and was largest at moderate levels. No consistent effect of frequency was apparent. The impaired listeners varied widely, but most showed a clear effect of level on the amount...... of temporal integration. Overall, their results appear consistent with expectations based on knowledge of the general properties of their loudness-growth functions and the equal-loudness-ratio hypothesis, which states that the loudness ratio between equal-SPL long and brief tones is the same at all SPLs...

  13. Observation-Based Modeling for Model-Based Testing

    NARCIS (Netherlands)

    Kanstrén, T.; Piel, E.; Gross, H.G.

    2009-01-01

    One of the most important reasons that modeling and model-based testing are not yet common practice in industry is the perceived difficulty of making the models up to the level of detail and quality required for their automated processing. Models unleash their full potential only through

  14. Guide to APA-Based Models

    Science.gov (United States)

    Robins, Robert E.; Delisi, Donald P.

    2008-01-01

    In Robins and Delisi (2008), a linear decay model, a new IGE model by Sarpkaya (2006), and a series of APA-based models were scored using data from three airports. This report is a guide to the APA-based models.

  15. The Serotonin Transporter Undergoes Constitutive Internalization and Is Primarily Sorted to Late Endosomes and Lysosomal Degradation*

    Science.gov (United States)

    Rahbek-Clemmensen, Troels; Bay, Tina; Eriksen, Jacob; Gether, Ulrik; Jørgensen, Trine Nygaard

    2014-01-01

    The serotonin transporter (SERT) plays a critical role in regulating serotonin signaling by mediating reuptake of serotonin from the extracellular space. The molecular and cellular mechanisms controlling SERT levels in the membrane remain poorly understood. To study trafficking of the surface resident SERT, two functional epitope-tagged variants were generated. Fusion of a FLAG-tagged one-transmembrane segment protein Tac to the SERT N terminus generated a transporter with an extracellular epitope suited for trafficking studies (TacSERT). Likewise, a construct with an extracellular antibody epitope was generated by introducing an HA (hemagglutinin) tag in the extracellular loop 2 of SERT (HA-SERT). By using TacSERT and HA-SERT in antibody-based internalization assays, we show that SERT undergoes constitutive internalization in a dynamin-dependent manner. Confocal images of constitutively internalized SERT demonstrated that SERT primarily co-localized with the late endosomal/lysosomal marker Rab7, whereas little co-localization was observed with Rab11, a marker of the “long loop” recycling pathway. This sorting pattern was distinct from that of a prototypical recycling membrane protein, the β2-adrenergic receptor. Furthermore, internalized SERT co-localized with the lysosomal marker LysoTracker and not with transferrin. The sorting pattern was further confirmed by visualizing internalization of SERT using the fluorescent cocaine analog JHC1-64 and by reversible and pulse-chase biotinylation assays showing evidence for lysosomal degradation of the internalized transporter. Finally, we found that SERT internalized in response to stimulation with phorbol 12-myristate 13-acetate co-localized primarily with Rab7- and LysoTracker-positive compartments. We conclude that SERT is constitutively internalized and that the internalized transporter is sorted mainly to degradation. PMID:24973209

  16. Meningiomas: outcome and analysis of prognostic factors of primarily resected tumors

    International Nuclear Information System (INIS)

    Stafford, S.L.; Perry, A.; Suman, V.; Meyer, B.; Scheithauer, B.W.; Shaw, E.G.; Earle, J.D.

    1996-01-01

    Purpose: 582 consecutive cases of primary intracranial meningioma undergoing resection at the Mayo Clinic (Rochester, MN) were reviewed to determine overall survival (OS), progression free survival (PFS), prognostic factors predicting recurrence, and to determine the importance of radiation therapy in the management of this tumor. Materials and Methods: Between 1978-1988, 582 cases of primarily resected meningiomas were identified based on the tumor and operative registries where diagnosis was between 1978-1988 inclusive. PFS was identified by radiographic progression. Follow-up was accomplished by chart review, and a detailed questionnaire sent to patients and referring physicians. Estimation of OS and PFS distributions was done by the Kaplan-Meier method. The log rank test was used to assess which factors were associated with PFS. Proportional hazard modeling was performed to obtain a subset of independent predictors of PFS. Results: The median age was 57 (range 5-93). 67% were female. CT identified the tumor in 91% of cases. There was associated edema in 21% and 2% were radiographically en plaque. There were 17 patients with multiple tumors, four of whom had a known diagnosis of neurofibromatosis. Gross total resection (GTR) was accomplished in 80%, radical subtotal or subtotal resection (STR) in 20%, and biopsy in 53) cellularity, and four or more mitoses per 10 HPF. Multivariate analysis indicated that young age, male sex, and en plaque presentation at surgery were significant predictors of decreased PFS when only patient characteristics were considered. When treatment and pathologic factors were also considered, then young age, male sex, less than GTR, and tumor sheeting were predictors of decreased PFS. 10 patients had RT after initial resection, two of whom recurred. There were 107 first recurrences. 50 were observed (no intervention within 3 months), 35 treated by surgery alone, 11 had S+RT, and 11 were treated with RT alone. Considering those patients treated at recurrence (n=57), PFS was at

  17. Perceptions of Mindfulness in a Low-income, Primarily African American Treatment-Seeking Sample.

    Science.gov (United States)

    Spears, Claire Adams; Houchins, Sean C; Bamatter, Wendy P; Barrueco, Sandra; Hoover, Diana Stewart; Perskaudas, Rokas

    2017-12-01

    Individuals with low socioeconomic status (SES) and members of racial/ethnic minority groups often experience profound disparities in mental health and physical well-being. Mindfulness-based interventions show promise for improving mood and health behaviors in higher-SES and non-Latino White populations. However, research is needed to explore what types of adaptations, if any, are needed to best support underserved populations. This study used qualitative methods to gain information about a) perceptions of mindfulness, b) experiences with meditation, c) barriers to practicing mindfulness, and d) recommendations for tailoring mindfulness-based interventions in a low-income, primarily African American treatment-seeking sample. Eight focus groups were conducted with 32 adults (16 men and 16 women) currently receiving services at a community mental health center. Most participants (91%) were African American. Focus group data were transcribed and analyzed using NVivo 10. A team of coders reviewed the transcripts to identify salient themes. Relevant themes included beliefs that mindfulness practice might improve mental health (e.g., managing stress and anger more effectively) and physical health (e.g., improving sleep and chronic pain, promoting healthier behaviors). Participants also discussed ways in which mindfulness might be consistent with, and even enhance, their religious and spiritual practices. Results could be helpful in tailoring mindfulness-based treatments to optimize feasibility and effectiveness for low-SES adults receiving mental health services.

  18. Rule-based decision making model

    International Nuclear Information System (INIS)

    Sirola, Miki

    1998-01-01

    A rule-based decision making model is designed in G2 environment. A theoretical and methodological frame for the model is composed and motivated. The rule-based decision making model is based on object-oriented modelling, knowledge engineering and decision theory. The idea of safety objective tree is utilized. Advanced rule-based methodologies are applied. A general decision making model 'decision element' is constructed. The strategy planning of the decision element is based on, e.g., value theory and utility theory. A hypothetical process model is built to give input data for the decision element. The basic principle of the object model in decision making is division in tasks. Probability models are used in characterizing component availabilities. Bayes' theorem is used to recalculate the probability figures when new information is obtained. The model includes simple learning features to save the solution path. A decision analytic interpretation is given to the decision making process. (author)
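
    The abstract notes that Bayes' theorem is used to recalculate component-availability figures when new information arrives. The snippet below is a minimal sketch of such an update for a single hypothetical component; the prior, the likelihood values and the "alarm fires" evidence are illustrative assumptions, not figures from the paper.

```python
# Minimal Bayesian update sketch for a component-availability estimate.
# All numbers are illustrative assumptions, not values from the paper.

def bayes_update(prior_up, p_evidence_given_up, p_evidence_given_down):
    """Return P(component up | evidence) via Bayes' theorem."""
    p_evidence = (p_evidence_given_up * prior_up
                  + p_evidence_given_down * (1.0 - prior_up))
    return p_evidence_given_up * prior_up / p_evidence

prior_up = 0.95                      # prior availability of the component
posterior = bayes_update(prior_up,
                         p_evidence_given_up=0.05,   # alarm rarely fires when up
                         p_evidence_given_down=0.90) # alarm usually fires when down
print(f"P(up | alarm) = {posterior:.3f}")
```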

  19. Admission rates in a general practitioner-based versus a hospital specialist based, hospital-at-home model

    DEFF Research Database (Denmark)

    Mogensen, Christian Backer; Ankersen, Ejnar Skytte; Lindberg, Mats J

    2018-01-01

    CONCLUSIONS: The GP based HaH model was more effective than the hospital specialist model in avoiding hospital admissions within 7 days among elderly patients with an acute medical condition, with no differences in mental or physical recovery rates or deaths between the two models. REGISTRATION: No. NCT......BACKGROUND: Hospital at home (HaH) is an alternative to acute admission for elderly patients. It is unclear whether they should be cared for primarily by a hospital specialist or by the patient's own general practitioner (GP). The study assessed whether a GP based model was more effective than...... Denmark, including patients aged 65 years or older with an acute medical condition that required acute hospital in-patient care. The patients were randomly assigned to the hospital specialist based model or the GP model of HaH care. Five physical and cognitive performance tests were performed at inclusion and after 7

  20. Cost Effective Community Based Dementia Screening: A Markov Model Simulation

    Directory of Open Access Journals (Sweden)

    Erin Saito

    2014-01-01

    Background. Given the dementia epidemic and the increasing cost of healthcare, there is a need to assess the economic benefit of community based dementia screening programs. Materials and Methods. Markov model simulations were generated using data obtained from a community based dementia screening program over a one-year period. The models simulated yearly costs of caring for patients based on clinical transitions beginning in pre-dementia and extending for 10 years. Results. A total of 93 individuals (74 female, 19 male) were screened for dementia and 12 meeting clinical criteria for either mild cognitive impairment (n=7) or dementia (n=5) were identified. Assuming early therapeutic intervention beginning during the year of dementia detection, Markov model simulations demonstrated a 9.8% reduction in cost of dementia care over a ten-year simulation period, primarily through increased duration in mild stages and reduced time in more costly moderate and severe stages. Discussion. Community based dementia screening can reduce healthcare costs associated with caring for demented individuals through earlier detection and treatment, resulting in proportionately reduced time in more costly advanced stages.
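
    As a rough illustration of the kind of Markov cost simulation described, the sketch below propagates a cohort through hypothetical yearly transition probabilities and per-state care costs for ten years; all states, probabilities and costs are invented placeholders, not the study's calibrated values.

```python
import numpy as np

# Hypothetical yearly transition matrix over the states
# [MCI, mild, moderate, severe, dead]; rows sum to 1. Placeholder values only.
P = np.array([
    [0.70, 0.20, 0.05, 0.00, 0.05],
    [0.00, 0.60, 0.25, 0.10, 0.05],
    [0.00, 0.00, 0.55, 0.35, 0.10],
    [0.00, 0.00, 0.00, 0.80, 0.20],
    [0.00, 0.00, 0.00, 0.00, 1.00],
])
annual_cost = np.array([5_000, 15_000, 35_000, 65_000, 0])  # per-state yearly cost (USD, assumed)

state = np.array([1.0, 0.0, 0.0, 0.0, 0.0])   # cohort starts in MCI
total_cost = 0.0
for year in range(10):
    total_cost += state @ annual_cost          # expected cost accrued this year
    state = state @ P                          # advance the cohort one year
print(f"Expected 10-year cost per patient: ${total_cost:,.0f}")
```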

  1. Model-based DSL frameworks

    NARCIS (Netherlands)

    Ivanov, Ivan; Bézivin, J.; Jouault, F.; Valduriez, P.

    2006-01-01

    More than five years ago, the OMG proposed the Model Driven Architecture (MDA™) approach to deal with the separation of platform dependent and independent aspects in information systems. Since then, the initial idea of MDA evolved and Model Driven Engineering (MDE) is being increasingly promoted to

  2. Model based design introduction: modeling game controllers to microprocessor architectures

    Science.gov (United States)

    Jungwirth, Patrick; Badawy, Abdel-Hameed

    2017-04-01

    We present an introduction to model based design. Model based design is a visual representation, generally a block diagram, to model and incrementally develop a complex system. Model based design is a commonly used design methodology for digital signal processing, control systems, and embedded systems. Model based design's philosophy is: to solve a problem - a step at a time. The approach can be compared to a series of steps to converge to a solution. A block diagram simulation tool allows a design to be simulated with real world measurement data. For example, if an analog control system is being upgraded to a digital control system, the analog sensor input signals can be recorded. The digital control algorithm can be simulated with the real world sensor data. The output from the simulated digital control system can then be compared to the old analog based control system. Model based design can be compared to Agile software development. The Agile software development goal is to develop working software in incremental steps. Progress is measured in completed and tested code units. Progress is measured in model based design by completed and tested blocks. We present a concept for a video game controller and then use model based design to iterate the design towards a working system. We will also describe a model based design effort to develop an OS Friendly Microprocessor Architecture based on the RISC-V.

  3. EPR-based material modelling of soils

    Science.gov (United States)

    Faramarzi, Asaad; Alani, Amir M.

    2013-04-01

    In the past few decades, as a result of the rapid developments in computational software and hardware, alternative computer aided pattern recognition approaches have been introduced to modelling many engineering problems, including constitutive modelling of materials. The main idea behind pattern recognition systems is that they learn adaptively from experience and extract various discriminants, each appropriate for its purpose. In this work an approach is presented for developing material models for soils based on evolutionary polynomial regression (EPR). EPR is a recently developed hybrid data mining technique that searches for structured mathematical equations (representing the behaviour of a system) using genetic algorithm and the least squares method. Stress-strain data from triaxial tests are used to train and develop EPR-based material models for soil. The developed models are compared with some of the well-known conventional material models and it is shown that EPR-based models can provide a better prediction for the behaviour of soils. The main benefits of using EPR-based material models are that they provide a unified approach to constitutive modelling of all materials (i.e., all aspects of material behaviour can be implemented within a unified environment of an EPR model) and that they do not require any arbitrary choice of constitutive (mathematical) models. In EPR-based material models there are no material parameters to be identified. Because the model is trained directly on experimental data, EPR-based material models are the shortest route from experimental research (data) to numerical modelling. Another advantage of EPR-based constitutive models is that, as more experimental data become available, the quality of the EPR prediction can be improved by learning from the additional data, and therefore, the EPR model can become more effective and robust. The developed EPR-based material models can be incorporated in finite element (FE) analysis.
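
    EPR couples a genetic algorithm, which proposes candidate polynomial structures, with least squares, which fits their coefficients. The sketch below shows only the inner least-squares step for one hypothetical candidate structure relating stress to strain terms; the term set and the synthetic triaxial-like data are assumptions for illustration, not the authors' algorithm or data.

```python
import numpy as np

# Synthetic "triaxial" data: axial strain and a made-up stress response with noise.
rng = np.random.default_rng(0)
strain = np.linspace(0.0, 0.1, 50)
stress = 120.0 * strain - 450.0 * strain**2 + rng.normal(0, 0.2, strain.size)

# One candidate structure proposed by the (omitted) genetic algorithm:
# stress ~ a1*strain + a2*strain**2 + a3*sqrt(strain)
X = np.column_stack([strain, strain**2, np.sqrt(strain)])
coeffs, *_ = np.linalg.lstsq(X, stress, rcond=None)   # least-squares coefficient fit
print("fitted coefficients:", coeffs)
print("fit RMSE:", np.sqrt(np.mean((X @ coeffs - stress) ** 2)))
```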

  4. Through the Loupe: Visitor engagement with a primarily text-based handheld AR application

    NARCIS (Netherlands)

    van der Vaart, M.; Damala, A.; Guidi, G; Scopigno, R.; Torres, J.C.; Graf, H.; Remondino, F.; Duranti, L.; Brunet, P.; Hazan, S.; Barceló, J.

    2015-01-01

    The use of Augmented Reality (AR) in a museum or heritage setting holds great potential. However, until now, introducing AR into their buildings has been prohibitively expensive for most museums. On the one hand, programming the AR application could not be done in-house and would be rather costly.

  5. Disgust sensitivity is primarily associated with purity-based moral judgments

    NARCIS (Netherlands)

    Wagemans, F.M.A.; Brandt, M.J.; Zeelenberg, M.

    2018-01-01

    Individual differences in disgust sensitivity are associated with a range of judgments and attitudes related to the moral domain. Some perspectives suggest that the association between disgust sensitivity and moral judgments will be equally strong across all moral domains (i.e., purity, authority,

  6. Siting studies for an asymptotic U.S. energy supply system based primarily on nuclear energy

    International Nuclear Information System (INIS)

    Burwell, C.C.

    1977-01-01

    The nuclear energy center (NEC) concept is an approach to siting wherein nuclear facilities would be clustered in and delimited to a relatively small number of locations throughout the United States. These designated centers would be concurrently developed to their full capability over several decades, at which time, they would be several times larger than the largest nuclear power stations in existence today. The centers would be permanently dedicated to nuclear operations including the future decommissioning of functionally obsolescent facilities as well as the commissioning of their replacements. The criteria for and characteristics of an acceptable nuclear energy system that could supply most of the U.S. energy requirements in the distant future are discussed. The time period is unspecified but occurs when fossil-fuel resources are depleted to such an extent that their use is economic only in special situations, and is not economic, in general, for use as fuel

  7. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae

    2008-01-01

    Model based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model based design involves the use of multiple models that represent different views of a system, having different semantics...... of abstract system descriptions. Usually, in mechatronics systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, a plant and controller behavior is simulated using graphical blocks to represent mathematical and logical...... constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems) is such a component-based system framework developed by the software engineering group of Mads Clausen Institute for Product Innovation (MCI), University of Southern Denmark. Once specified, the software model has...

  8. An information theory-based approach to modeling the information processing of NPP operators

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Seong, Poong Hyun

    2002-01-01

    This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task. The focus is on i) developing a model for the information processing of NPP operators and ii) quantifying the model. To resolve the problems of previous information-theoretic approaches, i.e. the problems of single-channel approaches, we first develop an information processing model with multiple stages that contains information flows. The uncertainty of the information is then quantified using Conant's model, a kind of information theory
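
    To make the information-theoretic quantification concrete, the sketch below computes the Shannon entropy (in bits) of a hypothetical distribution over the alarm states an operator must process during a control task; the distribution is an invented example, and Conant's multi-stage decomposition itself is not reproduced here.

```python
import numpy as np

def shannon_entropy_bits(p):
    """Shannon entropy H(X) = -sum p*log2(p) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # ignore zero-probability outcomes
    return float(-np.sum(p * np.log2(p)))

# Hypothetical probabilities of the alarm states an operator monitors.
alarm_states = [0.70, 0.15, 0.10, 0.05]
print(f"Information per observation: {shannon_entropy_bits(alarm_states):.2f} bits")
```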

  9. Improving Agent Based Modeling of Critical Incidents

    Directory of Open Access Journals (Sweden)

    Robert Till

    2010-04-01

    Agent Based Modeling (ABM) is a powerful method that has been used to simulate potential critical incidents in the infrastructure and built environments. This paper will discuss the modeling of some critical incidents currently simulated using ABM and how they may be expanded and improved by using better physiological modeling, psychological modeling, modeling the actions of interveners, introducing Geographic Information Systems (GIS) and open source models.

  10. Models for Rational Number Bases

    Science.gov (United States)

    Pedersen, Jean J.; Armbruster, Frank O.

    1975-01-01

    This article extends number bases to negative integers, then to positive rationals and finally to negative rationals. Methods and rules for operations in positive and negative rational bases greater than one or less than negative one are summarized in tables. Sample problems are explained and illustrated. (KM)

  11. Airfoil Shape Optimization based on Surrogate Model

    Science.gov (United States)

    Mukesh, R.; Lingadurai, K.; Selvakumar, U.

    2018-02-01

    Engineering design problems always require an enormous amount of real-time experiments and computational simulations in order to assess and ensure the design objectives of the problems subject to various constraints. In most cases, the computational resources and time required per simulation are large. In cases like sensitivity analysis and design optimisation, where thousands or millions of simulations have to be carried out, this becomes a severe burden for designers. Nowadays approximation models, also known as surrogate models (SM), are widely employed in order to reduce the computational resources and time needed to analyse various engineering systems. Various approaches such as Kriging, neural networks, polynomials, and Gaussian processes are used to construct the approximation models. The primary intention of this work is to employ the k-fold cross validation approach to study and evaluate the influence of various theoretical variogram models on the accuracy of the surrogate model construction. Ordinary Kriging and design of experiments (DOE) approaches are used to construct the SMs by approximating panel and viscous solution algorithms which are primarily used to solve the flow around airfoils and aircraft wings. The method of coupling the SMs with a suitable optimisation scheme to carry out an aerodynamic design optimisation process for airfoil shapes is also discussed.
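
    A minimal sketch of the workflow described, using scikit-learn's Gaussian-process regressor as a stand-in for ordinary Kriging and k-fold cross-validation to compare covariance (variogram-like) models; the toy response and the choice of kernels are assumptions, not the authors' panel/viscous solver data.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, Matern
from sklearn.model_selection import KFold, cross_val_score

# Toy design-of-experiments sample: 2 design variables, synthetic response.
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(60, 2))
y = np.sin(6 * X[:, 0]) * np.cos(4 * X[:, 1]) + 0.05 * rng.normal(size=60)

# Compare two covariance models (analogous to variogram choices) by 5-fold CV.
for name, kernel in [("RBF", RBF(length_scale=0.2)),
                     ("Matern 3/2", Matern(length_scale=0.2, nu=1.5))]:
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    scores = cross_val_score(gp, X, y,
                             cv=KFold(n_splits=5, shuffle=True, random_state=0),
                             scoring="r2")
    print(f"{name:>10}: mean CV R^2 = {scores.mean():.3f}")
```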

  12. PV panel model based on datasheet values

    DEFF Research Database (Denmark)

    Sera, Dezso; Teodorescu, Remus; Rodriguez, Pedro

    2007-01-01

    This work presents the construction of a model for a PV panel using the single-diode five-parameter model, based exclusively on data-sheet parameters. The model takes into account the series and parallel (shunt) resistance of the panel. The equivalent circuit and the basic equations of the PV cell....... Based on these equations, a PV panel model, which is able to predict the panel behavior in different temperature and irradiance conditions, is built and tested....
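
    The single-diode five-parameter model leads to an implicit current–voltage relation, I = I_ph − I_0[exp((V + I·R_s)/(n·N_s·V_t)) − 1] − (V + I·R_s)/R_sh, which has to be solved numerically at each voltage. The sketch below does this with a bracketed root solve; all parameter values are illustrative placeholders, not the datasheet-extracted values from the paper.

```python
import numpy as np
from scipy.optimize import brentq

# Illustrative single-diode parameters (not datasheet-derived values from the paper).
I_ph, I_0 = 8.2, 7e-8        # photo-current and diode saturation current [A]
R_s, R_sh = 0.35, 300.0      # series and shunt resistance [ohm]
n, N_s = 1.3, 60             # ideality factor and cells in series
V_t = 0.02585                # thermal voltage at ~25 degC [V]

def panel_current(V):
    """Solve the implicit single-diode equation for the current at panel voltage V."""
    f = lambda I: (I_ph - I_0 * (np.exp((V + I * R_s) / (n * N_s * V_t)) - 1.0)
                   - (V + I * R_s) / R_sh - I)
    return brentq(f, -2.0, I_ph + 1.0)   # the physical current lies in this bracket

for V in np.linspace(0.0, 37.0, 8):
    print(f"V = {V:5.1f} V  ->  I = {panel_current(V):6.3f} A")
```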

  13. Traceability in Model-Based Testing

    Directory of Open Access Journals (Sweden)

    Mathew George

    2012-11-01

    The growing complexities of software and the demand for shorter time to market are two important challenges that face today’s IT industry. These challenges demand the increase of both productivity and quality of software. Model-based testing is a promising technique for meeting these challenges. Traceability modeling is a key issue and challenge in model-based testing. Relationships between the different models will help to navigate from one model to another, and trace back to the respective requirements and the design model when the test fails. In this paper, we present an approach for bridging the gaps between the different models in model-based testing. We propose relation definition markup language (RDML) for defining the relationships between models.

  14. Firm Based Trade Models and Turkish Economy

    Directory of Open Access Journals (Sweden)

    Nilüfer ARGIN

    2015-12-01

    Among all international trade models, only firm-based trade models explain firms' actions and behavior in world trade. Firm-based trade models focus on the trade behavior of the individual firms that actually carry out intra-industry trade, and they can properly explain the globalization process. These approaches also cover multinational cooperation, supply chains and outsourcing. Our paper aims to explain and analyze Turkish exports in the context of firm-based trade models. We use UNCTAD data on exports by SITC Rev. 3 categorization to explain total exports and 255 products, and to calculate the intensive and extensive margins of Turkish firms.

  15. Efficient transfection of DNA into primarily cultured rat sertoli cells by electroporation.

    Science.gov (United States)

    Li, Fuping; Yamaguchi, Kohei; Okada, Keisuke; Matsushita, Kei; Enatsu, Noritoshi; Chiba, Koji; Yue, Huanxun; Fujisawa, Masato

    2013-03-01

    The expression of exogenous DNA in Sertoli cells is essential for studying its functional genomics, pathway analysis, and medical applications. Electroporation is a valuable tool for nucleic acid delivery, even in primarily cultured cells, which are considered difficult to transfect. In this study, we developed an optimized protocol for electroporation-based transfection of Sertoli cells and compared its efficiency with conventional lipofection. Sertoli cells were transfected with pCMV-GFP plasmid by square-wave electroporation under different conditions. After transfection of plasmid into Sertoli cells, enhanced green fluorescent protein (EGFP) expression could be easily detected by fluorescent microscopy, and cell survival was evaluated by dye exclusion assay using Trypan blue. In terms of both cell survival and the percentage expressing EGFP, 250 V was determined to produce the greatest number of transiently transfected cells. Keeping the voltage constant (250 V), relatively high cell survival (76.5% ± 3.4%) and transfection efficiency (30.6% ± 5.6%) were observed with a pulse length of 20 μm. The number of pulses significantly affected cell survival and EGFP expression (P transfection methods, the transfection efficiency of electroporation (21.5% ± 5.7%) was significantly higher than those of Lipofectamine 2000 (2.9% ± 1.0%) and Effectene (1.9% ± 0.8%) in this experiment (P transfection of Sertoli cells.

  16. GOLD HULL AND INTERNODE2 encodes a primarily multifunctional cinnamyl-alcohol dehydrogenase in rice.

    Science.gov (United States)

    Zhang, Kewei; Qian, Qian; Huang, Zejun; Wang, Yiqin; Li, Ming; Hong, Lilan; Zeng, Dali; Gu, Minghong; Chu, Chengcai; Cheng, Zhukuan

    2006-03-01

    Lignin content and composition are two important agronomic traits for the utilization of agricultural residues. Rice (Oryza sativa) gold hull and internode phenotype is a classical morphological marker trait that has long been applied to breeding and genetics study. In this study, we have cloned the GOLD HULL AND INTERNODE2 (GH2) gene in rice using a map-based cloning approach. The result shows that the gh2 mutant is a lignin-deficient mutant, and GH2 encodes a cinnamyl-alcohol dehydrogenase (CAD). Consistent with this finding, extracts from roots, internodes, hulls, and panicles of the gh2 plants exhibited drastically reduced CAD activity and undetectable sinapyl alcohol dehydrogenase activity. When expressed in Escherichia coli, purified recombinant GH2 was found to exhibit strong catalytic ability toward coniferaldehyde and sinapaldehyde, while the mutant protein gh2 completely lost the corresponding CAD and sinapyl alcohol dehydrogenase activities. Further phenotypic analysis of the gh2 mutant plants revealed that the p-hydroxyphenyl, guaiacyl, and sinapyl monomers were reduced in almost the same ratio compared to the wild type. Our results suggest GH2 acts as a primarily multifunctional CAD to synthesize coniferyl and sinapyl alcohol precursors in rice lignin biosynthesis.

  17. GOLD HULL AND INTERNODE2 Encodes a Primarily Multifunctional Cinnamyl-Alcohol Dehydrogenase in Rice1

    Science.gov (United States)

    Zhang, Kewei; Qian, Qian; Huang, Zejun; Wang, Yiqin; Li, Ming; Hong, Lilan; Zeng, Dali; Gu, Minghong; Chu, Chengcai; Cheng, Zhukuan

    2006-01-01

    Lignin content and composition are two important agronomic traits for the utilization of agricultural residues. Rice (Oryza sativa) gold hull and internode phenotype is a classical morphological marker trait that has long been applied to breeding and genetics study. In this study, we have cloned the GOLD HULL AND INTERNODE2 (GH2) gene in rice using a map-based cloning approach. The result shows that the gh2 mutant is a lignin-deficient mutant, and GH2 encodes a cinnamyl-alcohol dehydrogenase (CAD). Consistent with this finding, extracts from roots, internodes, hulls, and panicles of the gh2 plants exhibited drastically reduced CAD activity and undetectable sinapyl alcohol dehydrogenase activity. When expressed in Escherichia coli, purified recombinant GH2 was found to exhibit strong catalytic ability toward coniferaldehyde and sinapaldehyde, while the mutant protein gh2 completely lost the corresponding CAD and sinapyl alcohol dehydrogenase activities. Further phenotypic analysis of the gh2 mutant plants revealed that the p-hydroxyphenyl, guaiacyl, and sinapyl monomers were reduced in almost the same ratio compared to the wild type. Our results suggest GH2 acts as a primarily multifunctional CAD to synthesize coniferyl and sinapyl alcohol precursors in rice lignin biosynthesis. PMID:16443696

  18. Lévy-based growth models

    DEFF Research Database (Denmark)

    Jónsdóttir, Kristjana Ýr; Schmiegel, Jürgen; Jensen, Eva Bjørn Vedel

    2008-01-01

    In the present paper, we give a condensed review, for the nonspecialist reader, of a new modelling framework for spatio-temporal processes, based on Lévy theory. We show the potential of the approach in stochastic geometry and spatial statistics by studying Lévy-based growth modelling of planar objects. The growth models considered are spatio-temporal stochastic processes on the circle. As a by-product, flexible new models for space–time covariance functions on the circle are provided. An application of the Lévy-based growth models to tumour growth is discussed....

  19. Distributed Prognostics Based on Structural Model Decomposition

    Data.gov (United States)

    National Aeronautics and Space Administration — Within systems health management, prognostics focuses on predicting the remaining useful life of a system. In the model-based prognostics paradigm, physics-based...

  20. Uncertainty modelling and analysis of volume calculations based on a regular grid digital elevation model (DEM)

    Science.gov (United States)

    Li, Chang; Wang, Qing; Shi, Wenzhong; Zhao, Sisi

    2018-05-01

    The accuracy of earthwork calculations that compute terrain volume is critical to digital terrain analysis (DTA). The uncertainties in volume calculations (VCs) based on a DEM are primarily related to three factors: 1) model error (ME), which is caused by an adopted algorithm for a VC model, 2) discrete error (DE), which is usually caused by DEM resolution and terrain complexity, and 3) propagation error (PE), which is caused by the errors of the input variables. Based on these factors, the uncertainty modelling and analysis of VCs based on a regular grid DEM are investigated in this paper. In particular, a method is proposed for quantifying the uncertainty of VCs with a confidence interval based on truncation error (TE). In the experiments, the trapezoidal double rule (TDR) and Simpson's double rule (SDR) were used to calculate volume, where the TE is the major ME, and six simulated regular grid DEMs with different terrain complexity and resolution (i.e. DE) were generated by a Gauss synthetic surface to easily obtain the theoretical true value and eliminate the interference of data errors. For PE, Monte-Carlo simulation techniques and spatial autocorrelation were used to represent DEM uncertainty. This study can enrich uncertainty modelling and analysis-related theories of geographic information science.
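
    As an illustration of the two composite rules named above, the sketch below computes the volume under a synthetic Gauss-type surface sampled on a regular grid with the trapezoidal double rule and Simpson's double rule, applying each 1-D rule along both grid axes; the surface and grid spacing are arbitrary test choices, not the paper's simulated DEMs.

```python
import numpy as np
from scipy.integrate import trapezoid, simpson

# Synthetic Gauss-type surface on a regular grid (arbitrary test surface).
dx = dy = 5.0                                  # grid spacing [m]
x = np.arange(0.0, 500.0 + dx, dx)             # 101 nodes per axis (odd, as Simpson needs)
y = np.arange(0.0, 500.0 + dy, dy)
X, Y = np.meshgrid(x, y, indexing="ij")
Z = 40.0 * np.exp(-((X - 250.0) ** 2 + (Y - 250.0) ** 2) / (2 * 120.0 ** 2))

# Trapezoidal double rule: integrate along y, then along x.
vol_tdr = trapezoid(trapezoid(Z, dx=dy, axis=1), dx=dx)

# Simpson's double rule, applied the same way.
vol_sdr = simpson(simpson(Z, dx=dy, axis=1), dx=dx)

print(f"Trapezoidal double rule: {vol_tdr:,.0f} m^3")
print(f"Simpson's double rule:   {vol_sdr:,.0f} m^3")
```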

  1. Model Validation in Ontology Based Transformations

    Directory of Open Access Journals (Sweden)

    Jesús M. Almendros-Jiménez

    2012-10-01

    Model Driven Engineering (MDE) is an emerging approach to software engineering. MDE emphasizes the construction of models from which the implementation should be derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling permits giving a syntactic structure to source and target models. However, semantic requirements have to be imposed on source and target models. A given transformation will be sound when source and target models fulfill the syntactic and semantic requirements. In this paper, we present an approach for model validation in ODM based transformations. Adopting a logic programming based transformational approach, we show how it is possible to transform and validate models. Properties to be validated range from structural and semantic requirements of models (pre- and post-conditions) to properties of the transformation (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.

  2. An acoustical model based monitoring network

    NARCIS (Netherlands)

    Wessels, P.W.; Basten, T.G.H.; Eerden, F.J.M. van der

    2010-01-01

    In this paper the approach for an acoustical model based monitoring network is demonstrated. This network is capable of reconstructing a noise map, based on the combination of measured sound levels and an acoustic model of the area. By pre-calculating the sound attenuation within the network the

  3. Model-based version management system framework

    International Nuclear Information System (INIS)

    Mehmood, W.

    2016-01-01

    In this paper we present a model-based version management system. Version management (VMS), a branch of software configuration management (SCM), aims to provide a controlling mechanism for the evolution of software artifacts created during the software development process. Controlling the evolution requires many activities, such as construction and creation of versions, identification of differences between versions, conflict detection and merging. Traditional VMSs are file-based and consider software systems as a set of text files. File-based VMSs are not adequate for performing software configuration management activities such as version control on software artifacts produced in earlier phases of the software life cycle. New challenges of model differencing, merging, and evolution control arise when models are used as the central artifact. The goal of this work is to present a generic model-based VMS framework which can be used to overcome the problems of traditional file-based VMSs and provide model versioning services. (author)

  4. Model-based internal wave processing

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J.V.; Chambers, D.H.

    1995-06-09

    A model-based approach is proposed to solve the oceanic internal wave signal processing problem that is based on state-space representations of the normal-mode vertical velocity and plane wave horizontal velocity propagation models. It is shown that these representations can be utilized to spatially propagate the modal (depth) vertical velocity functions given the basic parameters (wave numbers, Brunt-Vaisala frequency profile etc.) developed from the solution of the associated boundary value problem as well as the horizontal velocity components. Based on this framework, investigations are made of model-based solutions to the signal enhancement problem for internal waves.

  5. Gradient-based model calibration with proxy-model assistance

    Science.gov (United States)

    Burrows, Wesley; Doherty, John

    2016-02-01

    Use of a proxy model in gradient-based calibration and uncertainty analysis of a complex groundwater model with large run times and problematic numerical behaviour is described. The methodology is general, and can be used with models of all types. The proxy model is based on a series of analytical functions that link all model outputs used in the calibration process to all parameters requiring estimation. In enforcing history-matching constraints during the calibration and post-calibration uncertainty analysis processes, the proxy model is run for the purposes of populating the Jacobian matrix, while the original model is run when testing parameter upgrades; the latter process is readily parallelized. Use of a proxy model in this fashion dramatically reduces the computational burden of complex model calibration and uncertainty analysis. At the same time, the effect of model numerical misbehaviour on calculation of local gradients is mitigated, thus allowing access to the benefits of gradient-based analysis where lack of integrity in finite-difference derivatives calculation would otherwise have impeded such access. Construction of a proxy model, and its subsequent use in calibration of a complex model, and in analysing the uncertainties of predictions made by that model, is implemented in the PEST suite.
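
    A minimal sketch of the division of labour described: a cheap proxy function supplies finite-difference derivatives for the Jacobian, while the "complex" model (here just another analytic function standing in for a long-running simulator) is reserved for evaluating residuals and testing the proposed parameter upgrade. Both functions and the observation data are invented for illustration; this is not the PEST implementation.

```python
import numpy as np

# Stand-ins: complex_model plays the expensive simulator, proxy_model the cheap surrogate.
def complex_model(p, t):
    return p[0] * np.exp(-p[1] * t) + 0.02 * np.sin(5 * t)   # "slow" model

def proxy_model(p, t):
    return p[0] * np.exp(-p[1] * t)                           # fast analytic proxy

t = np.linspace(0.0, 2.0, 20)
obs = complex_model(np.array([2.0, 1.3]), t)                  # synthetic observations
p = np.array([1.0, 0.5])                                      # initial parameter estimate

for it in range(10):
    resid = obs - complex_model(p, t)          # expensive run: current misfit
    # Populate the Jacobian with cheap finite differences on the proxy only.
    J = np.empty((t.size, p.size))
    for j in range(p.size):
        dp = np.zeros_like(p); dp[j] = 1e-6
        J[:, j] = (proxy_model(p + dp, t) - proxy_model(p - dp, t)) / 2e-6
    step, *_ = np.linalg.lstsq(J, resid, rcond=None)          # Gauss-Newton upgrade
    # Expensive run: accept the upgrade only if it improves the objective.
    if np.sum((obs - complex_model(p + step, t)) ** 2) < np.sum(resid ** 2):
        p = p + step
print("calibrated parameters:", p)
```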

  6. Opinion dynamics model based on quantum formalism

    Energy Technology Data Exchange (ETDEWEB)

    Artawan, I. Nengah, E-mail: nengahartawan@gmail.com [Theoretical Physics Division, Department of Physics, Udayana University (Indonesia); Trisnawati, N. L. P., E-mail: nlptrisnawati@gmail.com [Biophysics, Department of Physics, Udayana University (Indonesia)

    2016-03-11

    An opinion dynamics model based on quantum formalism is proposed. The core of the quantum formalism is the half-spin dynamics system. In this research the implicit time evolution operators are derived. The analogy between the model and the Deffuant and Sznajd models is discussed.

  7. Modeling the interdependent network based on two-mode networks

    Science.gov (United States)

    An, Feng; Gao, Xiangyun; Guan, Jianhe; Huang, Shupei; Liu, Qian

    2017-10-01

    Among heterogeneous networks, there exist obvious and close interdependent linkages. Unlike existing research, which focuses primarily on theoretical models of physical interdependent networks, we propose a two-layer interdependent network model based on two-mode networks to explore interdependent features in reality. Specifically, we construct a two-layer interdependent loan network and develop several dependence-feature indices. The model is verified to enable us to capture the loan dependence features of listed companies based on loan behaviors and shared shareholders. Taking the Chinese debit and credit market as a case study, the main conclusions are: (1) only a few listed companies shoulder the main capital transmission (20% of listed companies account for almost 70% of the dependence degree). (2) Controlling these key listed companies would be more effective in avoiding the spread of financial risks. (3) Identifying the companies with high betweenness centrality and controlling them could help to monitor the spread of financial risk. (4) The capital transmission channel between Chinese financial listed companies and Chinese non-financial listed companies is relatively strong. However, under greater pressure on capital transmission demand (70% of edges failed), the transmission channel constructed by debit and credit behavior will eventually collapse.
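
    The monitoring idea in point (3) can be illustrated with a few lines of networkx: build a small directed loan network (edges point from lender to borrower; the companies and edges below are fictitious) and rank nodes by betweenness centrality to flag the main capital-transmission intermediaries.

```python
import networkx as nx

# Fictitious inter-company loan network: edge u -> v means u lends to v.
loans = [("BankA", "Firm1"), ("BankA", "Firm2"), ("Firm1", "Firm3"),
         ("Firm2", "Firm3"), ("Firm3", "Firm4"), ("Firm3", "Firm5"),
         ("BankB", "Firm3"), ("Firm4", "Firm6"), ("Firm5", "Firm6")]
G = nx.DiGraph(loans)

# Rank companies by betweenness centrality to spot key transmission channels.
bc = nx.betweenness_centrality(G)
for node, score in sorted(bc.items(), key=lambda kv: kv[1], reverse=True)[:3]:
    print(f"{node}: betweenness = {score:.3f}")
```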

  8. Agent-based modeling of sustainable behaviors

    CERN Document Server

    Sánchez-Maroño, Noelia; Fontenla-Romero, Oscar; Polhill, J; Craig, Tony; Bajo, Javier; Corchado, Juan

    2017-01-01

    Using the O.D.D. (Overview, Design concepts, Detail) protocol, this title explores the role of agent-based modeling in predicting the feasibility of various approaches to sustainability. The chapters incorporated in this volume consist of real case studies to illustrate the utility of agent-based modeling and complexity theory in discovering a path to more efficient and sustainable lifestyles. The topics covered within include: households' attitudes toward recycling, designing decision trees for representing sustainable behaviors, negotiation-based parking allocation, auction-based traffic signal control, and others. This selection of papers will be of interest to social scientists who wish to learn more about agent-based modeling as well as experts in the field of agent-based modeling.

  9. Ammonia concentration modeling based on retained gas sampler data

    International Nuclear Information System (INIS)

    Terrones, G.; Palmer, B.J.; Cuta, J.M.

    1997-09-01

    The vertical ammonia concentration distributions determined by the retained gas sampler (RGS) apparatus were modeled for double-shell tanks (DSTs) AW-101, AN-103, AN-104, and AN-105 and single-shell tanks (SSTs) A-101, S-106, and U-103. Models of the vertical transport of ammonia in the tanks were used for the modeling. Transport in the non-convective settled solids and floating solids layers is assumed to occur primarily via some type of diffusion process, while transport in the convective liquid layers is incorporated into the model via mass transfer coefficients based on empirical correlations. Mass transfer between the top of the waste and the tank headspace and the effects of ventilation of the headspace are also included in the models. The resulting models contain a large number of parameters, but many of them can be determined from known properties of the waste configuration or can be estimated within reasonable bounds from data on the waste samples themselves. The models are used to extract effective diffusion coefficients for transport in the nonconvective layers based on the measured values of ammonia from the RGS apparatus. The modeling indicates that the higher concentrations of ammonia seen in bubbles trapped inside the waste relative to the ammonia concentrations in the tank headspace can be explained by a combination of slow transport of ammonia via diffusion in the nonconvective layers and ventilation of the tank headspace by either passive or active means. Slow transport by diffusion causes a higher concentration of ammonia to build up deep within the waste until the concentration gradients between the interior and top of the waste are sufficient to allow ammonia to escape at the same rate at which it is being generated in the waste.
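
    A highly simplified sketch of the transport picture described: one-dimensional diffusion through a non-convective layer with a uniform generation term, a zero-flux bottom boundary, and loss at the top through a mass-transfer coefficient into a ventilated headspace held at a fixed concentration. Every coefficient below is a placeholder, not a value fitted to the RGS data.

```python
import numpy as np

# Placeholder parameters (illustrative orders of magnitude, not fitted values).
L, n = 2.0, 50                 # non-convective layer thickness [m], grid cells
D = 1e-8                       # effective diffusion coefficient [m^2/s]
gen = 1e-9                     # volumetric ammonia generation rate [mol/m^3/s]
k_m = 1e-6                     # top-surface mass-transfer coefficient [m/s]
c_head = 0.0                   # headspace kept dilute by ventilation [mol/m^3]

dz = L / n
dt = min(0.4 * dz * dz / D, 0.5 * dz / k_m)   # explicit-scheme stability limit
c = np.zeros(n)                               # concentration, bottom (0) to top (-1)

for _ in range(100_000):                      # march in time toward steady state
    diff_flux = -D * np.diff(c) / dz          # fluxes between interior cells
    flux = np.concatenate(([0.0],             # zero-flux bottom boundary
                           diff_flux,
                           [k_m * (c[-1] - c_head)]))   # loss to ventilated headspace
    c += dt * (gen - np.diff(flux) / dz)
print(f"bottom-to-top concentration ratio: {c[0] / c[-1]:.1f}")
```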

  10. Model-based Abstraction of Data Provenance

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, René Rydhof

    2014-01-01

    to bigger models, and the analyses adapt accordingly. Our approach extends provenance both with the origin of data, the actors and processes involved in the handling of data, and policies applied while doing so. The model and corresponding analyses are based on a formal model of spatial and organisational......Identifying provenance of data provides insights to the origin of data and intermediate results, and has recently gained increased interest due to data-centric applications. In this work we extend a data-centric system view with actors handling the data and policies restricting actions....... This extension is based on provenance analysis performed on system models. System models have been introduced to model and analyse spatial and organisational aspects of organisations, to identify, e.g., potential insider threats. Both the models and analyses are naturally modular; models can be combined...

  11. A Full-Body Layered Deformable Model for Automatic Model-Based Gait Recognition

    Science.gov (United States)

    Lu, Haiping; Plataniotis, Konstantinos N.; Venetsanopoulos, Anastasios N.

    2007-12-01

    This paper proposes a full-body layered deformable model (LDM) inspired by manually labeled silhouettes for automatic model-based gait recognition from part-level gait dynamics in monocular video sequences. The LDM is defined for the fronto-parallel gait with 22 parameters describing the human body part shapes (widths and lengths) and dynamics (positions and orientations). There are four layers in the LDM and the limbs are deformable. Algorithms for LDM-based human body pose recovery are then developed to estimate the LDM parameters from both manually labeled and automatically extracted silhouettes, where the automatic silhouette extraction is through a coarse-to-fine localization and extraction procedure. The estimated LDM parameters are used for model-based gait recognition by employing the dynamic time warping for matching and adopting the combination scheme in AdaBoost.M2. While the existing model-based gait recognition approaches focus primarily on the lower limbs, the estimated LDM parameters enable us to study full-body model-based gait recognition by utilizing the dynamics of the upper limbs, the shoulders and the head as well. In the experiments, the LDM-based gait recognition is tested on gait sequences with differences in shoe-type, surface, carrying condition and time. The results demonstrate that the recognition performance benefits from not only the lower limb dynamics, but also the dynamics of the upper limbs, the shoulders and the head. In addition, the LDM can serve as an analysis tool for studying factors affecting the gait under various conditions.
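
    The matching step mentioned above relies on dynamic time warping between sequences of estimated LDM parameters. The sketch below is a generic DTW distance between two one-dimensional toy sequences; it is not the authors' 22-parameter implementation, only the core recurrence.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Toy limb-orientation traces from two gait cycles sampled at different rates.
t1 = np.linspace(0, 2 * np.pi, 40)
t2 = np.linspace(0, 2 * np.pi, 55)
print(f"DTW distance: {dtw_distance(np.sin(t1), np.sin(1.05 * t2)):.3f}")
```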

  12. CEAI: CCM based Email Authorship Identification Model

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah

    2013-01-01

    In this paper we present a model for email authorship identification (EAI) by employing a Cluster-based Classification (CCM) technique. Traditionally, stylometric features have been successfully employed in various authorship analysis tasks; we extend the traditional feature-set to include some...... more interesting and effective features for email authorship identification (e.g. the last punctuation mark used in an email, the tendency of an author to use capitalization at the start of an email, or the punctuation after a greeting or farewell). We also included Info Gain feature selection based...... reveal that the proposed CCM-based email authorship identification model, along with the proposed feature set, outperforms the state-of-the-art support vector machine (SVM)-based models, as well as the models proposed by Iqbal et al. [1, 2]. The proposed model attains an accuracy rate of 94% for 10...

  13. Agent-based modeling and network dynamics

    CERN Document Server

    Namatame, Akira

    2016-01-01

    The book integrates agent-based modeling and network science. It is divided into three parts, namely, foundations, primary dynamics on and of social networks, and applications. The book begins with the network origin of agent-based models, known as cellular automata, and introduces a number of classic models, such as Schelling's segregation model and Axelrod's spatial game. The essence of the foundation part is the network-based agent-based models in which agents follow network-based decision rules. Under the influence of the substantial progress in network science in the late 1990s, these models have been extended from using lattices to using small-world networks, scale-free networks, etc. The book also shows that modern network science, mainly driven by game theorists and sociophysicists, has inspired agent-based social scientists to develop alternative formation algorithms, known as agent-based social networks. The book reviews a number of pioneering and representative models in this family. Upon the gi...

  14. Measurement-based reliability/performability models

    Science.gov (United States)

    Hsueh, Mei-Chen

    1987-01-01

    Measurement-based models based on real error-data collected on a multiprocessor system are described. Model development from the raw error-data to the estimation of cumulative reward is also described. A workload/reliability model is developed based on low-level error and resource usage data collected on an IBM 3081 system during its normal operation in order to evaluate the resource usage/error/recovery process in a large mainframe system. Thus, both normal and erroneous behavior of the system are modeled. The results provide an understanding of the different types of errors and recovery processes. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A sensitivity analysis is performed to investigate the significance of using a semi-Markov process, as opposed to a Markov process, to model the measured system.

  15. Culturicon model: A new model for cultural-based emoticon

    Science.gov (United States)

    Zukhi, Mohd Zhafri Bin Mohd; Hussain, Azham

    2017-10-01

    Emoticons are popular among users of distributed collective interaction for expressing their emotions, gestures and actions. Emoticons have been shown to help avoid misunderstanding of the message, save attention and improve communication among different native speakers. However, besides the benefits that emoticons can provide, research on emoticons from a cultural perspective is still lacking. As emoticons are crucial in global communication, culture should be one of the extensively researched aspects of distributed collective interaction. Therefore, this study attempts to explore and develop a model for cultural-based emoticons. Three cultural models that have been used in Human-Computer Interaction were studied: the Hall culture model, the Trompenaars and Hampden-Turner culture model and the Hofstede culture model. The dimensions from these three models will be used in developing the proposed cultural-based emoticon model.

  16. Integration of Simulink Models with Component-based Software Models

    Directory of Open Access Journals (Sweden)

    MARIAN, N.

    2008-06-01

    Model based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model based design involves the use of multiple models that represent different views of a system, having different semantics of abstract system descriptions. Usually, in mechatronics systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, a plant and controller behavior is simulated using graphical blocks to represent mathematical and logical constructs and process flow, then software code is generated. A Simulink model is a representation of the design or implementation of a physical system that satisfies a set of requirements. A software component-based system aims to organize system architecture and behavior as a means of computation, communication and constraints, using computational blocks and aggregates for both discrete and continuous behavior, different interconnection and execution disciplines for event-based and time-based controllers, and so on, to encompass the demands to more functionality, at even lower prices, and with opposite constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems) is such a component-based system framework developed by the software engineering group of Mads Clausen Institute for Product Innovation (MCI), University of Southern Denmark. Once specified, the software model has to be analyzed. One way of doing that is to integrate in wrapper files the model back into Simulink S-functions, and use its extensive simulation features, thus allowing an early exploration of the possible design choices over multiple disciplines. The paper describes a safe translation of a restricted set of MATLAB/Simulink blocks to COMDES software components, both for continuous and discrete behavior, and the transformation of the software system into the S

  17. Physics Based Modeling of Compressible Turbulence

    Science.gov (United States)

    2016-11-07

    AFRL-AFOSR-VA-TR-2016-0345, "Physics-Based Modeling of Compressible Turbulence," Parviz Moin, Leland Stanford Junior University, CA. Final report, 09/13/2016, on the AFOSR project (FA9550-11-1-0111) entitled: Physics based modeling of compressible turbulence. The period of performance was June 15, 2011...

  18. Ground-Based Telescope Parametric Cost Model

    Science.gov (United States)

    Stahl, H. Philip; Rowell, Ginger Holmes

    2004-01-01

    A parametric cost model for ground-based telescopes is developed using multi-variable statistical analysis. The model includes both engineering and performance parameters. While diameter continues to be the dominant cost driver, other significant factors include primary mirror radius of curvature and diffraction limited wavelength. The model includes an explicit factor for primary mirror segmentation and/or duplication (i.e., multi-telescope phased-array systems). Additionally, single variable models based on aperture diameter are derived. This analysis indicates that recent mirror technology advances have indeed reduced the historical telescope cost curve.
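
    A single-variable model of the kind mentioned in the last sentence is simply a power law, cost ≈ a·D^b, fitted in log–log space. The sketch below fits such a law to a fabricated set of aperture/cost pairs; the data points and the resulting exponent are illustrative only, not the paper's statistics.

```python
import numpy as np

# Fabricated (diameter [m], cost [M$]) pairs -- illustrative only.
D = np.array([2.4, 3.5, 4.2, 6.5, 8.1, 10.0])
cost = np.array([12.0, 30.0, 48.0, 140.0, 260.0, 450.0])

# Fit log(cost) = log(a) + b*log(D), i.e. cost ~ a * D**b.
b, log_a = np.polyfit(np.log(D), np.log(cost), 1)
a = np.exp(log_a)
print(f"cost ~ {a:.2f} * D^{b:.2f}  [M$]")
print(f"predicted cost for D = 12 m: {a * 12.0 ** b:,.0f} M$")
```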

  19. Probabilistic reasoning for assembly-based 3D modeling

    KAUST Repository

    Chaudhuri, Siddhartha; Kalogerakis, Evangelos; Guibas, Leonidas; Koltun, Vladlen

    2011-01-01

    Assembly-based modeling is a promising approach to broadening the accessibility of 3D modeling. In assembly-based modeling, new models are assembled from shape components extracted from a database. A key challenge in assembly-based modeling

  20. Model-Based Motion Tracking of Infants

    DEFF Research Database (Denmark)

    Olsen, Mikkel Damgaard; Herskind, Anna; Nielsen, Jens Bo

    2014-01-01

    Even though motion tracking is a widely used technique to analyze and measure human movements, only a few studies focus on motion tracking of infants. In recent years, a number of studies have emerged focusing on analyzing the motion pattern of infants, using computer vision. Most of these studies...... are based on 2D images, but few are based on 3D information. In this paper, we present a model-based approach for tracking infants in 3D. The study extends a novel study on graph-based motion tracking of infants and we show that the extension improves the tracking results. A 3D model is constructed...

  1. 29 CFR 780.607 - “Primarily employed” in agriculture.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false “Primarily employed” in agriculture. 780.607 Section 780... AGRICULTURE, PROCESSING OF AGRICULTURAL COMMODITIES, AND RELATED SUBJECTS UNDER THE FAIR LABOR STANDARDS ACT Employment in Agriculture and Livestock Auction Operations Under the Section 13(b)(13) Exemption Requirements...

  2. Base Flow Model Validation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The program focuses on turbulence modeling enhancements for predicting high-speed rocket base flows. A key component of the effort is the collection of high-fidelity...

  3. A probabilistic graphical model based stochastic input model construction

    International Nuclear Information System (INIS)

    Wan, Jiang; Zabaras, Nicholas

    2014-01-01

    Model reduction techniques have been widely used in modeling of high-dimensional stochastic input in uncertainty quantification tasks. However, the probabilistic modeling of random variables projected into reduced-order spaces presents a number of computational challenges. Due to the curse of dimensionality, the underlying dependence relationships between these random variables are difficult to capture. In this work, a probabilistic graphical model based approach is employed to learn the dependence by running a number of conditional independence tests using observation data. Thus a probabilistic model of the joint PDF is obtained and the PDF is factorized into a set of conditional distributions based on the dependence structure of the variables. The estimation of the joint PDF from data is then transformed to estimating conditional distributions under reduced dimensions. To improve the computational efficiency, a polynomial chaos expansion is further applied to represent the random field in terms of a set of standard random variables. This technique is combined with both linear and nonlinear model reduction methods. Numerical examples are presented to demonstrate the accuracy and efficiency of the probabilistic graphical model based stochastic input models. - Highlights: • Data-driven stochastic input models without the assumption of independence of the reduced random variables. • The problem is transformed to a Bayesian network structure learning problem. • Examples are given in flows in random media
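
    The dependence-learning step described above can be illustrated with a much simpler stand-in: a Gaussian conditional-independence check via partial correlations, which is not the authors' procedure but conveys the idea of building a dependence graph from data before factorizing the joint PDF:

        # Hedged sketch: infer a sparse dependence graph among reduced-order random
        # variables by thresholding partial correlations (a Gaussian stand-in for
        # conditional independence testing). Synthetic data with a known chain
        # x0 -> x1 -> x2 and an independent x3 are used for illustration.
        import numpy as np

        rng = np.random.default_rng(0)
        n_samples, n_vars = 2000, 4
        x0 = rng.normal(size=n_samples)
        x1 = 0.8 * x0 + 0.3 * rng.normal(size=n_samples)
        x2 = 0.7 * x1 + 0.3 * rng.normal(size=n_samples)
        x3 = rng.normal(size=n_samples)
        data = np.column_stack([x0, x1, x2, x3])

        prec = np.linalg.inv(np.cov(data, rowvar=False))       # precision matrix
        d = np.sqrt(np.diag(prec))
        partial_corr = -prec / np.outer(d, d)                  # partial correlations
        np.fill_diagonal(partial_corr, 1.0)

        edges = [(i, j) for i in range(n_vars) for j in range(i + 1, n_vars)
                 if abs(partial_corr[i, j]) > 0.1]
        print("dependence edges:", edges)                      # expect [(0, 1), (1, 2)]

    The joint PDF can then be factorized along the resulting graph into conditional distributions, as described in the record above.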

  4. Springer handbook of model-based science

    CERN Document Server

    Bertolotti, Tommaso

    2017-01-01

    The handbook offers the first comprehensive reference guide to the interdisciplinary field of model-based reasoning. It highlights the role of models as mediators between theory and experimentation, and as educational devices, as well as their relevance in testing hypotheses and explanatory functions. The Springer Handbook merges philosophical, cognitive and epistemological perspectives on models with the more practical needs related to the application of this tool across various disciplines and practices. The result is a unique, reliable source of information that guides readers toward an understanding of different aspects of model-based science, such as the theoretical and cognitive nature of models, as well as their practical and logical aspects. The inferential role of models in hypothetical reasoning, abduction and creativity once they are constructed, adopted, and manipulated for different scientific and technological purposes is also discussed. Written by a group of internationally renowned experts in ...

  5. Econophysics of agent-based models

    CERN Document Server

    Aoyama, Hideaki; Chakrabarti, Bikas; Chakraborti, Anirban; Ghosh, Asim

    2014-01-01

    The primary goal of this book is to present the research findings and conclusions of physicists, economists, mathematicians and financial engineers working in the field of "Econophysics" who have undertaken agent-based modelling, comparison with empirical studies and related investigations. Most standard economic models assume the existence of the representative agent, who is “perfectly rational” and applies the utility maximization principle when taking action. One reason for this is the desire to keep models mathematically tractable: no tools are available to economists for solving non-linear models of heterogeneous adaptive agents without explicit optimization. In contrast, multi-agent models, which originated from statistical physics considerations, allow us to go beyond the prototype theories of traditional economics involving the representative agent. This book is based on the Econophys-Kolkata VII Workshop, at which many such modelling efforts were presented. In the book, leading researchers in the...
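
    As a hedged illustration of the kind of multi-agent model discussed in this field (not an example taken from the book), a kinetic wealth-exchange model with a uniform saving propensity can be simulated in a few lines:

        # Sketch of a simple kinetic wealth-exchange agent-based model: at each step
        # two random agents pool the non-saved part of their wealth and split it randomly.
        import numpy as np

        rng = np.random.default_rng(1)
        n_agents, n_steps, saving = 1000, 200_000, 0.5
        wealth = np.ones(n_agents)                        # everyone starts with unit wealth

        for _ in range(n_steps):
            i, j = rng.choice(n_agents, size=2, replace=False)
            pot = (1.0 - saving) * (wealth[i] + wealth[j])    # amount put at stake
            eps = rng.random()
            wealth[i] = saving * wealth[i] + eps * pot
            wealth[j] = saving * wealth[j] + (1.0 - eps) * pot

        print("mean wealth (conserved):", wealth.mean())
        print("inequality (std/mean):", wealth.std() / wealth.mean())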

  6. Agent-based modelling of cholera diffusion

    NARCIS (Netherlands)

    Augustijn-Beckers, Petronella; Doldersum, Tom; Useya, Juliana; Augustijn, Dionysius C.M.

    2016-01-01

    This paper introduces a spatially explicit agent-based simulation model for micro-scale cholera diffusion. The model simulates both an environmental reservoir of naturally occurring V.cholerae bacteria and hyperinfectious V. cholerae. Objective of the research is to test if runoff from open refuse

  7. Approximation Algorithms for Model-Based Diagnosis

    NARCIS (Netherlands)

    Feldman, A.B.

    2010-01-01

    Model-based diagnosis is an area of abductive inference that uses a system model, together with observations about system behavior, to isolate sets of faulty components (diagnoses) that explain the observed behavior, according to some minimality criterion. This thesis presents greedy approximation

  8. Probabilistic Model-based Background Subtraction

    DEFF Research Database (Denmark)

    Krüger, Volker; Anderson, Jakob; Prehn, Thomas

    2005-01-01

    is the correlation between pixels. In this paper we introduce a model-based background subtraction approach which facilitates prior knowledge of pixel correlations for clearer and better results. Model knowledge is being learned from good training video data, the data is stored for fast access in a hierarchical...

  9. Model-based testing for software safety

    NARCIS (Netherlands)

    Gurbuz, Havva Gulay; Tekinerdogan, Bedir

    2017-01-01

    Testing safety-critical systems is crucial since a failure or malfunction may result in death or serious injuries to people, equipment, or environment. An important challenge in testing is the derivation of test cases that can identify the potential faults. Model-based testing adopts models of a

  10. Predictor-Based Model Reference Adaptive Control

    Science.gov (United States)

    Lavretsky, Eugene; Gadient, Ross; Gregory, Irene M.

    2010-01-01

    This paper is devoted to the design and analysis of a predictor-based model reference adaptive control. Stable adaptive laws are derived using Lyapunov framework. The proposed architecture is compared with the now classical model reference adaptive control. A simulation example is presented in which numerical evidence indicates that the proposed controller yields improved transient characteristics.
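
    A minimal sketch of the classical (non-predictor) model reference adaptive controller that the record compares against, for a scalar plant with unknown parameters (all values are illustrative, not from the paper):

        # Hedged sketch: classical scalar MRAC.  Plant: xdot = a*x + b*u with a, b
        # unknown (sign of b known).  Reference model: xmdot = am*xm + bm*r.
        # The adaptive gains follow the standard Lyapunov-based gradient laws.
        import numpy as np

        a, b = 1.0, 3.0          # "unknown" true plant parameters (illustrative)
        am, bm = -4.0, 4.0       # stable reference model
        gamma, dt, T = 2.0, 1e-3, 10.0

        x = xm = 0.0
        kx = kr = 0.0            # adaptive feedback and feedforward gains
        for k in range(int(T / dt)):
            t = k * dt
            r = 1.0 if (t % 4.0) < 2.0 else -1.0          # square-wave reference
            u = kx * x + kr * r
            e = x - xm                                     # tracking error
            # Euler integration of plant, reference model and adaptive laws
            x += dt * (a * x + b * u)
            xm += dt * (am * xm + bm * r)
            kx += dt * (-gamma * e * x * np.sign(b))
            kr += dt * (-gamma * e * r * np.sign(b))

        print(f"final tracking error: {x - xm:.4f}")
        print(f"ideal gains kx*={(am - a) / b:.2f}, kr*={bm / b:.2f}; adapted: {kx:.2f}, {kr:.2f}")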

  11. Agent-Based Modeling in Systems Pharmacology.

    Science.gov (United States)

    Cosgrove, J; Butler, J; Alden, K; Read, M; Kumar, V; Cucurull-Sanchez, L; Timmis, J; Coles, M

    2015-11-01

    Modeling and simulation (M&S) techniques provide a platform for knowledge integration and hypothesis testing to gain insights into biological systems that would not be possible a priori. Agent-based modeling (ABM) is an M&S technique that focuses on describing individual components rather than homogenous populations. This tutorial introduces ABM to systems pharmacologists, using relevant case studies to highlight how ABM-specific strengths have yielded success in the area of preclinical mechanistic modeling.

  12. Gradient based filtering of digital elevation models

    DEFF Research Database (Denmark)

    Knudsen, Thomas; Andersen, Rune Carbuhn

    We present a filtering method for digital terrain models (DTMs). The method is based on mathematical morphological filtering within gradient (slope) defined domains. The intention with the filtering procedure is to improve the cartographic quality of height contours generated from a DTM based...
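
    A hedged sketch (not necessarily the authors' exact procedure) of slope-dependent morphological filtering of a gridded DTM, assuming a regular grid and using scipy:

        # Sketch: smooth a DTM with grey-scale morphological opening/closing, but only
        # inside low-slope (gradient-defined) domains, leaving steep terrain untouched.
        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(0)
        x, y = np.meshgrid(np.linspace(0, 1, 200), np.linspace(0, 1, 200))
        dtm = 50 * x + 10 * np.sin(8 * np.pi * y) + rng.normal(0, 0.3, x.shape)  # synthetic terrain

        cell = 5.0                                   # assumed cell size in metres
        gy, gx = np.gradient(dtm, cell)
        slope = np.hypot(gx, gy)                     # dimensionless slope magnitude

        smoothed = 0.5 * (ndimage.grey_opening(dtm, size=5) + ndimage.grey_closing(dtm, size=5))
        flat = slope < 0.2                           # gradient-defined domain
        filtered = np.where(flat, smoothed, dtm)     # filter only within the flat domain
        print("cells filtered:", int(flat.sum()), "of", flat.size)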

  13. Néron Models and Base Change

    DEFF Research Database (Denmark)

    Halle, Lars Halvard; Nicaise, Johannes

    Presenting the first systematic treatment of the behavior of Néron models under ramified base change, this book can be read as an introduction to various subtle invariants and constructions related to Néron models of semi-abelian varieties, motivated by concrete research problems and complemented...... on Néron component groups, Edixhoven’s filtration and the base change conductor of Chai and Yu, and we study these invariants using various techniques such as models of curves, sheaves on Grothendieck sites and non-archimedean uniformization. We then apply our results to the study of motivic zeta functions...

  14. Information modelling and knowledge bases XXV

    CERN Document Server

    Tokuda, T; Jaakkola, H; Yoshida, N

    2014-01-01

    Because of our ever increasing use of and reliance on technology and information systems, information modelling and knowledge bases continue to be important topics in those academic communities concerned with data handling and computer science. As the information itself becomes more complex, so do the levels of abstraction and the databases themselves. This book is part of the series Information Modelling and Knowledge Bases, which concentrates on a variety of themes in the important domains of conceptual modeling, design and specification of information systems, multimedia information modelin

  15. Energy based prediction models for building acoustics

    DEFF Research Database (Denmark)

    Brunskog, Jonas

    2012-01-01

    In order to reach robust and simplified yet accurate prediction models, energy based principles are commonly used in many fields of acoustics, especially in building acoustics. This includes simple energy flow models, the framework of statistical energy analysis (SEA) as well as more elaborated...... principles as, e.g., wave intensity analysis (WIA). The European standards for building acoustic predictions, the EN 12354 series, are based on energy flow and SEA principles. In the present paper, different energy based prediction models are discussed and critically reviewed. Special attention is placed...... on underlying basic assumptions, such as diffuse fields, high modal overlap, resonant field being dominant, etc., and the consequences of these in terms of limitations in the theory and in the practical use of the models....
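
    The kind of steady-state energy flow balance underlying SEA can be illustrated with a two-subsystem toy example; the loss factors and input power below are made-up values, not taken from the paper:

        # Hedged sketch: steady-state SEA power balance for two coupled subsystems,
        #   P1 =  w*(eta1 + eta12)*E1 - w*eta21*E2
        #   P2 = -w*eta12*E1 + w*(eta2 + eta21)*E2
        # Solve for the subsystem energies E1, E2 given the injected powers.
        import numpy as np

        w = 2 * np.pi * 1000.0          # angular frequency of the 1 kHz band
        eta1, eta2 = 0.01, 0.02         # internal (damping) loss factors
        eta12, eta21 = 0.001, 0.0005    # coupling loss factors
        P = np.array([1.0, 0.0])        # 1 W injected into subsystem 1 only

        A = w * np.array([[eta1 + eta12, -eta21],
                          [-eta12, eta2 + eta21]])
        E = np.linalg.solve(A, P)
        print("subsystem energies [J]:", E)
        print("power flow 1->2 [W]:", w * (eta12 * E[0] - eta21 * E[1]))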

  16. A subchannel based annular flow dryout model

    International Nuclear Information System (INIS)

    Hammouda, Najmeddine; Cheng, Zhong; Rao, Yanfei F.

    2016-01-01

    Highlights: • A modified annular flow dryout model for subchannel thermalhydraulic analysis. • Implementation of the model in Canadian subchannel code ASSERT-PV. • Assessment of the model against tube CHF experiments. • Assessment of the model against CANDU-bundle CHF experiments. - Abstract: This paper assesses a popular tube-based mechanistic critical heat flux model, Hewitt and Govan’s annular flow model (based on the model of Whalley et al.), and modifies and implements it for bundle geometries. It describes the results of the ASSERT subchannel code predictions using the modified model, as applied to a single tube and the 28-element, 37-element and 43-element (CANFLEX) CANDU bundles. A quantitative comparison between the model predictions and experimental data indicates good agreement for a wide range of flow conditions. The comparison has resulted in an overall average error of −0.15% and an overall root-mean-square error of 5.46% with tube data representing annular film dryout type critical heat flux, and in an overall average error of −0.9% and an overall RMS error of 9.9% with Stern Laboratories’ CANDU-bundle data.

  17. Multi-Domain Modeling Based on Modelica

    Directory of Open Access Journals (Sweden)

    Liu Jun

    2016-01-01

    Full Text Available With the application of simulation technology to large-scale, multi-field problems, multi-domain unified modeling has become an effective way to solve them. This paper introduces several basic methods and advantages of the multidisciplinary model, and focuses on simulation based on the Modelica language. Modelica/Mworks is a newly developed simulation environment featuring an object-oriented, non-causal language for modeling of large, multi-domain systems, which makes the model easier to grasp, develop and maintain. This article shows a single degree of freedom mechanical vibration system modeled in the Modelica language using the special connection mechanism in Mworks. This multi-domain modeling method is simple and feasible, offers high reusability, stays closer to the physical system, and has many other advantages.
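
    Since the record's Modelica source is not reproduced here, a Python stand-in for the same single-degree-of-freedom mass-spring-damper vibration system (with illustrative parameters) looks like this:

        # Sketch: free vibration of a single-degree-of-freedom mass-spring-damper system,
        #   m*x'' + c*x' + k*x = 0, the example system mentioned in the record above.
        import numpy as np
        from scipy.integrate import solve_ivp

        m, c, k = 1.0, 0.4, 100.0                    # illustrative mass, damping, stiffness

        def rhs(t, y):
            x, v = y
            return [v, -(c * v + k * x) / m]

        sol = solve_ivp(rhs, (0.0, 5.0), [0.05, 0.0], max_step=0.001)  # 5 cm initial offset
        fn = np.sqrt(k / m) / (2 * np.pi)
        zeta = c / (2 * np.sqrt(k * m))
        print(f"natural frequency: {fn:.2f} Hz, damping ratio: {zeta:.3f}")
        print(f"final displacement: {sol.y[0, -1]:.4e} m")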

  18. SEP modeling based on global heliospheric models at the CCMC

    Science.gov (United States)

    Mays, M. L.; Luhmann, J. G.; Odstrcil, D.; Bain, H. M.; Schwadron, N.; Gorby, M.; Li, Y.; Lee, K.; Zeitlin, C.; Jian, L. K.; Lee, C. O.; Mewaldt, R. A.; Galvin, A. B.

    2017-12-01

    Heliospheric models provide contextual information of conditions in the heliosphere, including the background solar wind conditions and shock structures, and are used as input to SEP models, providing an essential tool for understanding SEP properties. The global 3D MHD WSA-ENLIL+Cone model provides a time-dependent background heliospheric description, into which a spherical shaped hydrodynamic CME can be inserted. ENLIL simulates solar wind parameters and additionally one can extract the magnetic topologies of observer-connected magnetic field lines and all plasma and shock properties along those field lines. An accurate representation of the background solar wind is necessary for simulating transients. ENLIL simulations also drive SEP models such as the Solar Energetic Particle Model (SEPMOD) (Luhmann et al. 2007, 2010) and the Energetic Particle Radiation Environment Module (EPREM) (Schwadron et al. 2010). The Community Coordinated Modeling Center (CCMC) is in the process of making these SEP models available to the community and offering a system to run SEP models driven by a variety of heliospheric models available at CCMC. SEPMOD injects protons onto a sequence of observer field lines at intensities dependent on the connected shock source strength which are then integrated at the observer to approximate the proton flux. EPREM couples with MHD models such as ENLIL and computes energetic particle distributions based on the focused transport equation along a Lagrangian grid of nodes that propagate out with the solar wind. The coupled ENLIL and SEP models allow us to derive the longitudinal distribution of SEP profiles of different types of events throughout the heliosphere. In this presentation we demonstrate several case studies of SEP event modeling at different observers based on WSA-ENLIL+Cone simulations.

  19. NASA Software Cost Estimation Model: An Analogy Based Estimation Model

    Science.gov (United States)

    Hihn, Jairus; Juster, Leora; Menzies, Tim; Mathew, George; Johnson, James

    2015-01-01

    The cost estimation of software development activities is increasingly critical for large scale integrated projects such as those at DOD and NASA especially as the software systems become larger and more complex. As an example MSL (Mars Scientific Laboratory) developed at the Jet Propulsion Laboratory launched with over 2 million lines of code making it the largest robotic spacecraft ever flown (based on the size of the software). Software development activities are also notorious for their cost growth, with NASA flight software averaging over 50% cost growth. All across the agency, estimators and analysts are increasingly being tasked to develop reliable cost estimates in support of program planning and execution. While there has been extensive work on improving parametric methods there is very little focus on the use of models based on analogy and clustering algorithms. In this paper we summarize our findings on effort/cost model estimation and model development based on ten years of software effort estimation research using data mining and machine learning methods to develop estimation models based on analogy and clustering. The NASA Software Cost Model performance is evaluated by comparing it to COCOMO II, linear regression, and K-nearest neighbor prediction model performance on the same data set.
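
    A hedged sketch of the analogy (k-nearest-neighbor) idea behind such estimation models, using made-up project data rather than the NASA dataset:

        # Sketch: estimate effort for a new project as the mean effort of its k nearest
        # analogues in log-size space.  All data are invented for illustration only.
        import numpy as np

        size_ksloc = np.array([10, 25, 40, 80, 150, 300, 600])        # historical sizes
        effort_pm = np.array([30, 70, 110, 250, 520, 1100, 2500])     # historical effort (person-months)

        def knn_estimate(new_size, k=3):
            dist = np.abs(np.log(size_ksloc) - np.log(new_size))      # distance in log-size
            nearest = np.argsort(dist)[:k]                            # indices of k analogues
            return effort_pm[nearest].mean()

        print("estimated effort for a 200 KSLOC project:",
              round(knn_estimate(200.0)), "person-months")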

  20. Hydrogen peroxide production is not primarily increased in human myotubes established from type 2 diabetic subjects.

    Science.gov (United States)

    Minet, A D; Gaster, M

    2011-09-01

    Increased oxidative stress and mitochondrial dysfunction have been implicated in the development of insulin resistance in type 2 diabetes. To date, it is unknown whether increased mitochondrial reactive oxygen species (ROS) production in skeletal muscle from patients with type 2 diabetes is primarily increased or a secondary adaptation to environmental, lifestyle, and hormonal factors. This study investigates whether ROS production is primarily increased in isolated diabetic myotubes. Mitochondrial membrane potential, hydrogen peroxide (H(2)O(2)), superoxide, and mitochondrial mass were determined in human myotubes precultured under normophysiological conditions. Furthermore, the corresponding ATP synthesis was measured in isolated mitochondria. Muscle biopsies were taken from 10 lean subjects, 10 obese subjects, and 10 subjects with type 2 diabetes; satellite cells were isolated, cultured, and differentiated to myotubes. Mitochondrial mass, membrane potential/mitochondrial mass, and superoxide-production/mitochondrial mass were not different between groups. In contrast, H(2)O(2) production/mitochondrial mass and ATP production were significantly reduced in diabetic myotubes compared to lean controls. Hence, H(2)O(2) production is not primarily increased in diabetic myotubes but rather is reduced. Moreover, the comparable ATP/H(2)O(2) ratios indicate that the reduced ROS production in diabetic myotubes parallels the reduced ATP production because ROS production in diabetic myotubes must be considered to be in a proportion comparable to lean. Thus, the increased ROS production seen in skeletal muscle of type 2 diabetic patients is an adaptation to the in vivo conditions.

  1. Identification of walking human model using agent-based modelling

    Science.gov (United States)

    Shahabpoor, Erfan; Pavic, Aleksandar; Racic, Vitomir

    2018-03-01

    The interaction of walking people with large vibrating structures, such as footbridges and floors, in the vertical direction is an important yet challenging phenomenon to describe mathematically. Several different models have been proposed in the literature to simulate interaction of stationary people with vibrating structures. However, the research on moving (walking) human models, explicitly identified for vibration serviceability assessment of civil structures, is still sparse. In this study, the results of a comprehensive set of FRF-based modal tests were used, in which, over a hundred test subjects walked in different group sizes and walking patterns on a test structure. An agent-based model was used to simulate discrete traffic-structure interactions. The occupied structure modal parameters found in tests were used to identify the parameters of the walking individual's single-degree-of-freedom (SDOF) mass-spring-damper model using 'reverse engineering' methodology. The analysis of the results suggested that the normal distribution with the average of μ = 2.85Hz and standard deviation of σ = 0.34Hz can describe human SDOF model natural frequency. Similarly, the normal distribution with μ = 0.295 and σ = 0.047 can describe the human model damping ratio. Compared to the previous studies, the agent-based modelling methodology proposed in this paper offers significant flexibility in simulating multi-pedestrian walking traffics, external forces and simulating different mechanisms of human-structure and human-environment interaction at the same time.
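
    Using the distributions reported in the record, one could sample walking-human SDOF models for Monte Carlo serviceability checks as sketched below; the 70 kg modal mass is an assumption, not a value given above:

        # Sketch: sample SDOF walking-human models with natural frequency ~ N(2.85, 0.34) Hz
        # and damping ratio ~ N(0.295, 0.047), as identified in the record above, and
        # convert them to mass-spring-damper parameters.  The modal mass is assumed.
        import numpy as np

        rng = np.random.default_rng(42)
        n = 1000
        fn = rng.normal(2.85, 0.34, n)         # natural frequency [Hz]
        zeta = rng.normal(0.295, 0.047, n)     # damping ratio [-]
        m = 70.0                               # assumed modal mass [kg]

        k = m * (2 * np.pi * fn) ** 2          # stiffness [N/m]
        c = 2 * zeta * np.sqrt(k * m)          # damping coefficient [N s/m]
        print(f"mean stiffness: {k.mean():.0f} N/m, mean damping: {c.mean():.0f} N s/m")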

  2. A model evaluation checklist for process-based environmental models

    Science.gov (United States)

    Jackson-Blake, Leah

    2015-04-01

    the conceptual model on which it is based. In this study, a number of model structural shortcomings were identified, such as a lack of dissolved phosphorus transport via infiltration excess overland flow, potential discrepancies in the particulate phosphorus simulation and a lack of spatial granularity. (4) Conceptual challenges, as conceptual models on which predictive models are built are often outdated, having not kept up with new insights from monitoring and experiments. For example, soil solution dissolved phosphorus concentration in INCA-P is determined by the Freundlich adsorption isotherm, which could potentially be replaced using more recently-developed adsorption models that take additional soil properties into account. This checklist could be used to assist in identifying why model performance may be poor or unreliable. By providing a model evaluation framework, it could help prioritise which areas should be targeted to improve model performance or model credibility, whether that be through using alternative calibration techniques and statistics, improved data collection, improving or simplifying the model structure or updating the model to better represent current understanding of catchment processes.

  3. Nonlinear system modeling based on bilinear Laguerre orthonormal bases.

    Science.gov (United States)

    Garna, Tarek; Bouzrara, Kais; Ragot, José; Messaoud, Hassani

    2013-05-01

    This paper proposes a new representation of the discrete bilinear model by developing its coefficients associated to the input, to the output and to the crossed product on three independent Laguerre orthonormal bases. Compared to the classical bilinear model, the resulting model, entitled the bilinear-Laguerre model, ensures a significant reduction in the number of parameters as well as a simple recursive representation. However, such a reduction is still constrained by an optimal choice of the Laguerre pole characterizing each basis. To do so, we develop a pole optimization algorithm which constitutes an extension of that proposed by Tanguy et al. The bilinear-Laguerre model as well as the proposed pole optimization algorithm are illustrated and tested on numerical simulations and validated on the Continuous Stirred Tank Reactor (CSTR) System. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
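
    A sketch of how a discrete Laguerre orthonormal basis with pole a can be generated by filtering; this is the standard filter-bank construction, not the paper's pole-optimization algorithm:

        # Sketch: outputs of the first n_basis discrete Laguerre filters driven by a
        # signal u.  The first filter is sqrt(1-a^2)/(1 - a z^-1); each subsequent one
        # cascades the all-pass factor (z^-1 - a)/(1 - a z^-1).
        import numpy as np
        from scipy.signal import lfilter

        def laguerre_outputs(u, a=0.6, n_basis=4):
            x = lfilter([np.sqrt(1.0 - a**2)], [1.0, -a], u)
            outputs = [x]
            for _ in range(n_basis - 1):
                x = lfilter([-a, 1.0], [1.0, -a], x)     # cascade the all-pass section
                outputs.append(x)
            return np.array(outputs)

        u = np.random.default_rng(0).normal(size=500)    # excitation signal
        phi = laguerre_outputs(u)                        # regressors for identification
        print("regressor matrix shape:", phi.shape)      # (n_basis, len(u))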

  4. SLS Navigation Model-Based Design Approach

    Science.gov (United States)

    Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas

    2018-01-01

    The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the Navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of overlap between the requirements development and management process, and the design and analysis process by efficiently combining the control (i.e. the requirement) and the design mechanisms. The design mechanism is the representation of the component behavior and performance in design and analysis tools. The focus in the early design process shifts from the development and

  5. Néron Models and Base Change

    DEFF Research Database (Denmark)

    Halle, Lars Halvard; Nicaise, Johannes

    Presenting the first systematic treatment of the behavior of Néron models under ramified base change, this book can be read as an introduction to various subtle invariants and constructions related to Néron models of semi-abelian varieties, motivated by concrete research problems and complemented...... with explicit examples. Néron models of abelian and semi-abelian varieties have become an indispensable tool in algebraic and arithmetic geometry since Néron introduced them in his seminal 1964 paper. Applications range from the theory of heights in Diophantine geometry to Hodge theory. We focus specifically...... on Néron component groups, Edixhoven’s filtration and the base change conductor of Chai and Yu, and we study these invariants using various techniques such as models of curves, sheaves on Grothendieck sites and non-archimedean uniformization. We then apply our results to the study of motivic zeta functions...

  6. Business Models for NFC based mobile payments

    OpenAIRE

    Johannes Sang Un Chae; Jonas Hedman

    2015-01-01

    Purpose: The purpose of the paper is to develop a business model framework for NFC based mobile payment solutions consisting of four mutually interdepended components: the value service, value network, value architecture, and value finance. Design: Using a comparative case study method, the paper investigates Google Wallet and ISIS Mobile Wallet and their underlying business models. Findings: Google Wallet and ISIS Mobile Wallet are focusing on providing an enhanced customer experienc...

  7. Quality Model Based on Cots Quality Attributes

    OpenAIRE

    Jawad Alkhateeb; Khaled Musa

    2013-01-01

    The quality of software is essential to corporations in making their commercial software. Good or poor software quality plays an important role in some systems, such as embedded systems, real-time systems, and control systems, that have an important place in human life. Software products or commercial off-the-shelf software are usually programmed based on a software quality model. In the software engineering field, each quality model contains a set of attributes or characteristics that drives i...

  8. A Multiagent Based Model for Tactical Planning

    Science.gov (United States)

    2002-10-01

    ...been developed under the same conceptual model and using similar Artificial Intelligence tools. We use four different stimulus/response agents in... The conceptual model is built on the basis of agent theory. To implement the different agents we have used Artificial Intelligence techniques such

  9. Evolutionary modeling-based approach for model errors correction

    Directory of Open Access Journals (Sweden)

    S. Q. Wan

    2012-08-01

    Full Text Available The inverse problem of using the information in historical data to estimate model errors is one of the frontier research topics in science. In this study, we investigate such a problem using the classic Lorenz (1963) equation as a prediction model and the Lorenz equation with a periodic evolutionary function as an accurate representation of reality to generate "observational data."

    On the basis of the intelligent features of evolutionary modeling (EM), including self-organization, self-adaptation and self-learning, the dynamic information contained in the historical data can be identified and extracted by the computer automatically. Thereby, a new approach is proposed to estimate model errors based on EM in the present paper. Numerical tests demonstrate the ability of the new approach to correct model structural errors. In fact, it can actualize the combination of statistics and dynamics to a certain extent.
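
    The twin-experiment setup described above can be reproduced in outline as follows; the parameters are illustrative and the evolutionary-modeling step itself is omitted:

        # Sketch of the twin experiment: "reality" is a Lorenz-63 system with an added
        # periodic term; the imperfect prediction model is the classic Lorenz-63 system.
        # The difference between the two trajectories plays the role of model error.
        import numpy as np
        from scipy.integrate import solve_ivp

        sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0

        def lorenz(t, s, forcing=0.0):
            x, y, z = s
            return [sigma * (y - x),
                    x * (rho - z) - y + forcing * np.sin(2 * np.pi * t / 5.0),  # periodic "truth" term
                    x * y - beta * z]

        s0, t_span = [1.0, 1.0, 1.0], (0.0, 2.0)
        t_eval = np.linspace(*t_span, 201)
        truth = solve_ivp(lorenz, t_span, s0, t_eval=t_eval, args=(2.0,))   # "reality"
        model = solve_ivp(lorenz, t_span, s0, t_eval=t_eval, args=(0.0,))   # classic Lorenz model
        err = truth.y - model.y                                             # model error to estimate
        print("RMS model error per component:", np.sqrt((err**2).mean(axis=1)))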

  10. Model-based testing for embedded systems

    CERN Document Server

    Zander, Justyna; Mosterman, Pieter J

    2011-01-01

    What the experts have to say about Model-Based Testing for Embedded Systems: "This book is exactly what is needed at the exact right time in this fast-growing area. From its beginnings over 10 years ago of deriving tests from UML statecharts, model-based testing has matured into a topic with both breadth and depth. Testing embedded systems is a natural application of MBT, and this book hits the nail exactly on the head. Numerous topics are presented clearly, thoroughly, and concisely in this cutting-edge book. The authors are world-class leading experts in this area and teach us well-used

  11. Model Based Control of Reefer Container Systems

    DEFF Research Database (Denmark)

    Sørensen, Kresten Kjær

    This thesis is concerned with the development of model based control for the Star Cool refrigerated container (reefer) with the objective of reducing energy consumption. This project has been carried out under the Danish Industrial PhD programme and has been financed by Lodam together with the Da...

  12. Least-squares model-based halftoning

    Science.gov (United States)

    Pappas, Thrasyvoulos N.; Neuhoff, David L.

    1992-08-01

    A least-squares model-based approach to digital halftoning is proposed. It exploits both a printer model and a model for visual perception. It attempts to produce an 'optimal' halftoned reproduction, by minimizing the squared error between the response of the cascade of the printer and visual models to the binary image and the response of the visual model to the original gray-scale image. Conventional methods, such as clustered ordered dither, use the properties of the eye only implicitly, and resist printer distortions at the expense of spatial and gray-scale resolution. In previous work we showed that our printer model can be used to modify error diffusion to account for printer distortions. The modified error diffusion algorithm has better spatial and gray-scale resolution than conventional techniques, but produces some well known artifacts and asymmetries because it does not make use of an explicit eye model. Least-squares model-based halftoning uses explicit eye models and relies on printer models that predict distortions and exploit them to increase, rather than decrease, both spatial and gray-scale resolution. We have shown that the one-dimensional least-squares problem, in which each row or column of the image is halftoned independently, can be solved with the Viterbi algorithm. Unfortunately, no closed form solution can be found in two dimensions. The two-dimensional least squares solution is obtained by iterative techniques. Experiments show that least-squares model-based halftoning produces more gray levels and better spatial resolution than conventional techniques. We also show that the least-squares approach eliminates the problems associated with error diffusion. Model-based halftoning can be especially useful in transmission of high quality documents using high fidelity gray-scale image encoders. As we have shown, in such cases halftoning can be performed at the receiver, just before printing. Apart from coding efficiency, this approach
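
    A toy version of the least-squares model-based idea is sketched below, using a Gaussian low-pass filter as a stand-in for the visual model and an identity printer model, which is far simpler than the printer and eye models in the record:

        # Sketch: greedy least-squares halftoning of a small grayscale image.  The "eye"
        # is a Gaussian low-pass filter; each pixel is set to the binary value that
        # minimizes the squared error between the filtered halftone and the filtered
        # grayscale original.  A real implementation would also include a printer model.
        import numpy as np
        from scipy.ndimage import gaussian_filter

        gray = np.tile(np.linspace(0.0, 1.0, 32), (32, 1))       # toy grayscale ramp
        target = gaussian_filter(gray, sigma=1.5)                 # eye response to the original

        halftone = (gray > 0.5).astype(float)                     # initial binary image
        for _ in range(3):                                        # a few greedy sweeps
            for i in range(32):
                for j in range(32):
                    best_val, best_err = None, np.inf
                    for v in (0.0, 1.0):
                        halftone[i, j] = v
                        err = np.sum((gaussian_filter(halftone, sigma=1.5) - target) ** 2)
                        if err < best_err:
                            best_val, best_err = v, err
                    halftone[i, j] = best_val
        print("mean level of halftone:", halftone.mean(), "vs gray:", gray.mean())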

  13. Designing Network-based Business Model Ontology

    DEFF Research Database (Denmark)

    Hashemi Nekoo, Ali Reza; Ashourizadeh, Shayegheh; Zarei, Behrouz

    2015-01-01

    Survival in a dynamic environment is not achieved without a map. Scanning and monitoring of the market show business models to be a fruitful tool. But scholars believe that old-fashioned business models are dead, as they do not include the effects of the internet and networks. This paper...... is going to propose an e-business model ontology from the network point of view and its application in the real world. The suggested ontology for network-based businesses is composed of individuals' characteristics and what kind of resources they own, as well as their connections and pre-conceptions of connections...... such as shared mental models and trust. However, it mostly covers previous business model elements. To confirm the applicability of this ontology, it has been implemented in a business angel network to show how it works....

  14. Triacylglycerol Accumulation is not primarily affected in Myotubes established from Type 2 Diabetic Subjects

    DEFF Research Database (Denmark)

    Gaster, Michael; Beck-Nielsen, Henning

    2006-01-01

    In the present study, we investigated triacylglycerol (TAG) accumulation, glucose and fatty acid (FA) uptake, and glycogen synthesis (GS) in human myotubes from healthy, lean, and obese subjects with and without type 2 diabetes (T2D), exposed to increasing palmitate (PA) and oleate (OA...... uptake (P0.05). These results indicate that (1) TAG accumulation is not primarily affected in skeletal muscle tissue of obese and T2D; (2) induced inhibition of oxidative phosphorylation is followed by TAG accumulation...... in skeletal muscle of obese and T2D subjects is adaptive....

  15. Incident Duration Modeling Using Flexible Parametric Hazard-Based Models

    Directory of Open Access Journals (Sweden)

    Ruimin Li

    2014-01-01

    Full Text Available Assessing and prioritizing the duration time and effects of traffic incidents on major roads present significant challenges for road network managers. This study examines the effect of numerous factors associated with various types of incidents on their duration and proposes an incident duration prediction model. Several parametric accelerated failure time hazard-based models were examined, including Weibull, log-logistic, log-normal, and generalized gamma, as well as all models with gamma heterogeneity and flexible parametric hazard-based models with freedom ranging from one to ten, by analyzing a traffic incident dataset obtained from the Incident Reporting and Dispatching System in Beijing in 2008. Results show that different factors significantly affect different incident time phases, whose best distributions were diverse. Given the best hazard-based models of each incident time phase, the prediction result can be reasonable for most incidents. The results of this study can aid traffic incident management agencies not only in implementing strategies that would reduce incident duration, and thus reduce congestion, secondary incidents, and the associated human and economic losses, but also in effectively predicting incident duration time.
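
    As a hedged illustration of the parametric hazard-based approach (without covariates or gamma heterogeneity), a Weibull distribution can be fit to incident durations with scipy:

        # Sketch: fit a Weibull duration model to (synthetic) incident clearance times
        # and report the median and 90th-percentile duration.  A full AFT model would
        # let the scale depend on covariates such as incident type and lane blockage.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        durations_min = stats.weibull_min.rvs(c=1.4, scale=45.0, size=500, random_state=rng)

        shape, loc, scale = stats.weibull_min.fit(durations_min, floc=0)   # fix location at 0
        fitted = stats.weibull_min(shape, loc=loc, scale=scale)
        print(f"shape={shape:.2f}, scale={scale:.1f} min")
        print(f"median duration: {fitted.median():.1f} min, 90th percentile: {fitted.ppf(0.9):.1f} min")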

  16. Knowledge-Based Environmental Context Modeling

    Science.gov (United States)

    Pukite, P. R.; Challou, D. J.

    2017-12-01

    As we move from the oil-age to an energy infrastructure based on renewables, the need arises for new educational tools to support the analysis of geophysical phenomena and their behavior and properties. Our objective is to present models of these phenomena to make them amenable for incorporation into more comprehensive analysis contexts. Starting at the level of a college-level computer science course, the intent is to keep the models tractable and therefore practical for student use. Based on research performed via an open-source investigation managed by DARPA and funded by the Department of Interior [1], we have adapted a variety of physics-based environmental models for a computer-science curriculum. The original research described a semantic web architecture based on patterns and logical archetypal building-blocks (see figure) well suited for a comprehensive environmental modeling framework. The patterns span a range of features that cover specific land, atmospheric and aquatic domains intended for engineering modeling within a virtual environment. The modeling engine contained within the server relied on knowledge-based inferencing capable of supporting formal terminology (through NASA JPL's Semantic Web for Earth and Environmental Technology (SWEET) ontology and a domain-specific language) and levels of abstraction via integrated reasoning modules. One of the key goals of the research was to simplify models that were ordinarily computationally intensive to keep them lightweight enough for interactive or virtual environment contexts. The breadth of the elements incorporated is well-suited for learning as the trend toward ontologies and applying semantic information is vital for advancing an open knowledge infrastructure. As examples of modeling, we have covered such geophysics topics as fossil-fuel depletion, wind statistics, tidal analysis, and terrain modeling, among others. Techniques from the world of computer science will be necessary to promote efficient

  17. Model-Based Power Plant Master Control

    Energy Technology Data Exchange (ETDEWEB)

    Boman, Katarina; Thomas, Jean; Funkquist, Jonas

    2010-08-15

    The main goal of the project has been to evaluate the potential of a coordinated master control for a solid fuel power plant in terms of tracking capability, stability and robustness. The control strategy has been model-based predictive control (MPC) and the plant used in the case study has been the Vattenfall power plant Idbaecken in Nykoeping. A dynamic plant model based on nonlinear physical models was used to imitate the true plant in MATLAB/SIMULINK simulations. The basis for this model was already developed in previous Vattenfall internal projects, along with a simulation model of the existing control implementation with traditional PID controllers. The existing PID control is used as a reference performance, and it has been thoroughly studied and tuned in these previous Vattenfall internal projects. A turbine model was developed with characteristics based on the results of steady-state simulations of the plant using the software EBSILON. Using the derived model as a representative for the actual process, an MPC control strategy was developed using linearization and gain-scheduling. The control signal constraints (rate of change) and constraints on outputs were implemented to comply with plant constraints. After tuning the MPC control parameters, a number of simulation scenarios were performed to compare the MPC strategy with the existing PID control structure. The simulation scenarios also included cases highlighting the robustness properties of the MPC strategy. From the study, the main conclusions are: - The proposed Master MPC controller shows excellent set-point tracking performance even though the plant has strong interactions and non-linearity, and the controls and their rate of change are bounded. - The proposed Master MPC controller is robust, stable in the presence of disturbances and parameter variations. Even though the current study only considered a very small number of the possible disturbances and modelling errors, the considered cases are
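
    A minimal sketch of the constrained linear MPC formulation used in such master-control studies, with a generic second-order plant and input rate limits standing in for the Idbaecken boiler/turbine model:

        # Sketch: finite-horizon linear MPC with bounds on the control and on its rate
        # of change, solved as a convex QP with cvxpy.  Plant matrices are illustrative.
        import numpy as np
        import cvxpy as cp

        A = np.array([[0.98, 0.10], [0.00, 0.95]])
        B = np.array([[0.00], [0.05]])
        C = np.array([[1.0, 0.0]])
        N, x0, setpoint = 30, np.zeros(2), 1.0

        x = cp.Variable((2, N + 1))
        u = cp.Variable((1, N))
        cost, constr = 0, [x[:, 0] == x0]
        for k in range(N):
            cost += cp.sum_squares(C @ x[:, k + 1] - setpoint) + 0.01 * cp.sum_squares(u[:, k])
            constr += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k], cp.abs(u[:, k]) <= 1.0]
            if k > 0:
                constr += [cp.abs(u[:, k] - u[:, k - 1]) <= 0.1]   # rate-of-change limit

        prob = cp.Problem(cp.Minimize(cost), constr)
        prob.solve()
        print("first control move:", float(u.value[0, 0]))
        print("predicted final output:", (C @ x.value[:, -1]).item())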

  18. Ligand based pharmacophore modelling of anticancer histone ...

    African Journals Online (AJOL)

    USER

    2010-06-21

    Jun 21, 2010 ... are useful in predicting the biological activity of the compound or compound library by screening it ... with high affinity of binding toward a given protein ...

  19. Service creation: a model-based approach

    NARCIS (Netherlands)

    Quartel, Dick; van Sinderen, Marten J.; Ferreira Pires, Luis

    1999-01-01

    This paper presents a model-based approach to support service creation. In this approach, services are assumed to be created from (available) software components. The creation process may involve multiple design steps in which the requested service is repeatedly decomposed into more detailed

  20. Model based development of engine control algorithms

    NARCIS (Netherlands)

    Dekker, H.J.; Sturm, W.L.

    1996-01-01

    Model based development of engine control systems has several advantages. The development time and costs are strongly reduced because much of the development and optimization work is carried out by simulating both engine and control system. After optimizing the control algorithm it can be executed

  1. Modelling Web-Based Instructional Systems

    NARCIS (Netherlands)

    Retalis, Symeon; Avgeriou, Paris

    2002-01-01

    The size and complexity of modern instructional systems, which are based on the World Wide Web, bring about great intricacy in their crafting, as there is not enough knowledge or experience in this field. This imposes the use of new instructional design models in order to achieve risk-mitigation,

  2. Agent Based Modelling for Social Simulation

    NARCIS (Netherlands)

    Smit, S.K.; Ubink, E.M.; Vecht, B. van der; Langley, D.J.

    2013-01-01

    This document is the result of an exploratory project looking into the status of, and opportunities for Agent Based Modelling (ABM) at TNO. The project focussed on ABM applications containing social interactions and human factors, which we termed ABM for social simulation (ABM4SS). During the course

  3. Prototype-based models in machine learning

    NARCIS (Netherlands)

    Biehl, Michael; Hammer, Barbara; Villmann, Thomas

    2016-01-01

    An overview is given of prototype-based models in machine learning. In this framework, observations, i.e., data, are stored in terms of typical representatives. Together with a suitable measure of similarity, the systems can be employed in the context of unsupervised and supervised analysis of
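
    A compact sketch of one classical prototype-based method, LVQ1 (learning vector quantization), as an illustration of the framework described above (synthetic data, simplified learning schedule):

        # Sketch: LVQ1 -- each class is represented by prototypes; the nearest prototype
        # is attracted to correctly classified samples and repelled from misclassified ones.
        import numpy as np

        rng = np.random.default_rng(0)
        # Two Gaussian classes in 2D (synthetic data)
        X = np.vstack([rng.normal([0, 0], 1, (200, 2)), rng.normal([3, 3], 1, (200, 2))])
        y = np.array([0] * 200 + [1] * 200)

        protos = np.array([[0.5, 0.5], [2.5, 2.5]], dtype=float)   # one prototype per class
        proto_labels = np.array([0, 1])
        lr = 0.05
        for epoch in range(20):
            for i in rng.permutation(len(X)):
                w = np.argmin(np.linalg.norm(protos - X[i], axis=1))   # winning prototype
                sign = 1.0 if proto_labels[w] == y[i] else -1.0
                protos[w] += sign * lr * (X[i] - protos[w])

        pred = proto_labels[np.argmin(np.linalg.norm(X[:, None] - protos[None], axis=2), axis=1)]
        print("training accuracy:", (pred == y).mean())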

  4. Model-based auditing using REA

    NARCIS (Netherlands)

    Weigand, H.; Elsas, P.

    2012-01-01

    The recent financial crisis has renewed interest in the value of the owner-ordered auditing tradition that starts from society's long-term interest rather than management interest. This tradition uses a model-based auditing approach in which control requirements are derived in a principled way. A

  5. Model based energy benchmarking for glass furnace

    International Nuclear Information System (INIS)

    Sardeshpande, Vishal; Gaitonde, U.N.; Banerjee, Rangan

    2007-01-01

    Energy benchmarking of processes is important for setting energy efficiency targets and planning energy management strategies. Most approaches used for energy benchmarking are based on statistical methods by comparing with a sample of existing plants. This paper presents a model based approach for benchmarking of energy intensive industrial processes and illustrates this approach for industrial glass furnaces. A simulation model for a glass furnace is developed using mass and energy balances, and heat loss equations for the different zones and empirical equations based on operating practices. The model is checked with field data from end fired industrial glass furnaces in India. The simulation model enables calculation of the energy performance of a given furnace design. The model results show the potential for improvement and the impact of different operating and design preferences on specific energy consumption. A case study for a 100 TPD end fired furnace is presented. An achievable minimum energy consumption of about 3830 kJ/kg is estimated for this furnace. The useful heat carried by glass is about 53% of the heat supplied by the fuel. Actual furnaces operating at these production scales have a potential for reduction in energy consumption of about 20-25%
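
    The benchmarking arithmetic can be illustrated with a back-of-the-envelope energy balance; all numbers below are illustrative assumptions, not the paper's model:

        # Sketch: specific energy consumption (SEC) of a glass furnace from a simple
        # energy balance.  Heat to glass is the theoretical melting enthalpy; everything
        # else is lumped into losses.  Values are illustrative only.
        pull_tpd = 100.0                        # production rate [tonnes glass per day]
        fuel_energy_gj_per_day = 520.0          # assumed fuel input [GJ/day]
        melting_enthalpy_kj_per_kg = 2600.0     # assumed theoretical heat demand of glass

        glass_kg_per_day = pull_tpd * 1000.0
        sec_kj_per_kg = fuel_energy_gj_per_day * 1e6 / glass_kg_per_day
        useful_fraction = melting_enthalpy_kj_per_kg / sec_kj_per_kg

        print(f"specific energy consumption: {sec_kj_per_kg:.0f} kJ/kg")
        print(f"useful heat carried by glass: {useful_fraction:.0%}")
        print(f"gap to a 3830 kJ/kg benchmark: {sec_kj_per_kg - 3830.0:.0f} kJ/kg")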

  6. Whole body acid-base modeling revisited.

    Science.gov (United States)

    Ring, Troels; Nielsen, Søren

    2017-04-01

    The textbook account of whole body acid-base balance in terms of endogenous acid production, renal net acid excretion, and gastrointestinal alkali absorption, which is the only comprehensive model around, has never been applied in clinical practice or been formally validated. To improve understanding of acid-base modeling, we managed to write up this conventional model as an expression solely on urine chemistry. Renal net acid excretion and endogenous acid production were already formulated in terms of urine chemistry, and we could from the literature also see gastrointestinal alkali absorption in terms of urine excretions. With a few assumptions it was possible to see that this expression of net acid balance was arithmetically identical to minus urine charge, whereby under the development of acidosis, urine was predicted to acquire a net negative charge. The literature already mentions unexplained negative urine charges so we scrutinized a series of seminal papers and confirmed empirically the theoretical prediction that observed urine charge did acquire negative charge as acidosis developed. Hence, we can conclude that the conventional model is problematic since it predicts what is physiologically impossible. Therefore, we need a new model for whole body acid-base balance, which does not have impossible implications. Furthermore, new experimental studies are needed to account for charge imbalance in urine under development of acidosis. Copyright © 2017 the American Physiological Society.

  7. Introducing Waqf Based Takaful Model in India

    Directory of Open Access Journals (Sweden)

    Syed Ahmed Salman

    2014-03-01

    Full Text Available Objective – Waqf is a unique feature of the socioeconomic system of Islam in a multi-religious and developing country like India. India is a rich country with waqf assets. The history of waqf in India can be traced back to 800 years ago. Most researchers suggest how waqf can be used as a tool to mitigate the poverty of Muslims. India has the third highest Muslim population after Indonesia and Pakistan. However, the majority of Muslims belong to the low income group and they are in need of help. It is believed that waqf can be utilized for the betterment of the Indian Muslim community. Among the available uses of waqf assets, the main objective of this paper is to introduce a waqf based takaful model in India. In addition, how this proposed model can be adopted in India is highlighted. Methods – Library research is applied since this paper relies on secondary data by thoroughly reviewing the most relevant literature. Result – India as a rich country with waqf assets should fully utilize the resources to help the Muslims through takaful. Conclusion – In this study, we have proposed a waqf based takaful model for India combining the concepts of mudarabah and wakalah. We recommend this model based on the country's background and situation. Since we have not tested the viability of this model in India, future research should continue to test it. Keywords: Waqf, Takaful, Poverty and India

  8. Agent Based Modeling Applications for Geosciences

    Science.gov (United States)

    Stein, J. S.

    2004-12-01

    Agent-based modeling techniques have successfully been applied to systems in which complex behaviors or outcomes arise from varied interactions between individuals in the system. Each individual interacts with its environment, as well as with other individuals, by following a set of relatively simple rules. Traditionally this "bottom-up" modeling approach has been applied to problems in the fields of economics and sociology, but more recently has been introduced to various disciplines in the geosciences. This technique can help explain the origin of complex processes from a relatively simple set of rules, incorporate large and detailed datasets when they exist, and simulate the effects of extreme events on system-wide behavior. Some of the challenges associated with this modeling method include: significant computational requirements in order to keep track of thousands to millions of agents, methods and strategies of model validation are lacking, as is a formal methodology for evaluating model uncertainty. Challenges specific to the geosciences, include how to define agents that control water, contaminant fluxes, climate forcing and other physical processes and how to link these "geo-agents" into larger agent-based simulations that include social systems such as demographics economics and regulations. Effective management of limited natural resources (such as water, hydrocarbons, or land) requires an understanding of what factors influence the demand for these resources on a regional and temporal scale. Agent-based models can be used to simulate this demand across a variety of sectors under a range of conditions and determine effective and robust management policies and monitoring strategies. The recent focus on the role of biological processes in the geosciences is another example of an area that could benefit from agent-based applications. A typical approach to modeling the effect of biological processes in geologic media has been to represent these processes in

  9. Agent Based Modeling as an Educational Tool

    Science.gov (United States)

    Fuller, J. H.; Johnson, R.; Castillo, V.

    2012-12-01

    Motivation is a key element in high school education. One way to improve motivation and provide content, while helping address critical thinking and problem solving skills, is to have students build and study agent based models in the classroom. This activity visually connects concepts with their applied mathematical representation. "Engaging students in constructing models may provide a bridge between frequently disconnected conceptual and mathematical forms of knowledge." (Levy and Wilensky, 2011) We wanted to discover the feasibility of implementing a model based curriculum in the classroom given current and anticipated core and content standards. (Figure captions: simulation using California GIS data; simulation of high school student lunch popularity using an aerial photograph on top of a terrain value map.)

  10. Lamin A/C mutation affecting primarily the right side of the heart

    Directory of Open Access Journals (Sweden)

    Laura Ollila

    2013-04-01

    Full Text Available LMNA mutations are amongst the most important causes of familial dilated cardiomyopathy. The most important cause of arrhythmogenic right ventricular cardiomyopathy (ARVC) is desmosomal pathology. The aim of the study was to elucidate the role of LMNA mutations among Finnish cardiomyopathy patients. We screened 135 unrelated cardiomyopathy patients for LMNA mutations. Because of unusual phenotype, two patients were screened for the known Finnish ARVC-related mutations of desmosomal genes, and their Plakophilin-2b gene was sequenced. Myocardial samples from two patients were examined by immunohistochemical plakoglobin staining and in one case by electron microscopy. We found a new LMNA mutation Phe237Ser in a family of five affected members with a cardiomyopathy affecting primarily the right side of the heart. The phenotype resembles ARVC but does not fulfill the Task Force Criteria. The main clinical manifestations of the mutation were severe tricuspid insufficiency, right ventricular enlargement and failure. Three of the affected patients died of the heart disease, and the two living patients received heart transplants at ages 44 and 47. Electron microscopy showed nuclear blebbing compatible with laminopathy. Immunohistochemical analysis did not suggest desmosomal pathology. No desmosomal mutations were found. The Phe237Ser LMNA mutation causes a phenotype different from traditional cardiolaminopathy. Our findings suggest that cardiomyopathy affecting primarily the right side of the heart is not always caused by desmosomal pathology. Our observations highlight the challenges in classifying cardiomyopathies, as there often is significant overlap between the traditional categories.

  11. Intelligent-based Structural Damage Detection Model

    International Nuclear Information System (INIS)

    Lee, Eric Wai Ming; Yu, K.F.

    2010-01-01

    This paper presents the application of a novel Artificial Neural Network (ANN) model for the diagnosis of structural damage. The ANN model, denoted as the GRNNFA, is a hybrid model combining the General Regression Neural Network Model (GRNN) and the Fuzzy ART (FA) model. It not only retains the important features of the GRNN and FA models (i.e. fast and stable network training and incremental growth of network structure) but also facilitates the removal of the noise embedded in the training samples. Structural damage alters the stiffness distribution of the structure, thereby changing the natural frequencies and mode shapes of the system. The measured modal parameter changes due to a particular damage are treated as patterns for that damage. The proposed GRNNFA model was trained to learn those patterns in order to detect the possible damage location of the structure. Simulated data is employed to verify and illustrate the procedures of the proposed ANN-based damage diagnosis methodology. The results of this study have demonstrated the feasibility of applying the GRNNFA model to structural damage diagnosis even when the training samples were noise contaminated.

  12. Intelligent-based Structural Damage Detection Model

    Science.gov (United States)

    Lee, Eric Wai Ming; Yu, Kin Fung

    2010-05-01

    This paper presents the application of a novel Artificial Neural Network (ANN) model for the diagnosis of structural damage. The ANN model, denoted as the GRNNFA, is a hybrid model combining the General Regression Neural Network Model (GRNN) and the Fuzzy ART (FA) model. It not only retains the important features of the GRNN and FA models (i.e. fast and stable network training and incremental growth of network structure) but also facilitates the removal of the noise embedded in the training samples. Structural damage alters the stiffness distribution of the structure, thereby changing the natural frequencies and mode shapes of the system. The measured modal parameter changes due to a particular damage are treated as patterns for that damage. The proposed GRNNFA model was trained to learn those patterns in order to detect the possible damage location of the structure. Simulated data is employed to verify and illustrate the procedures of the proposed ANN-based damage diagnosis methodology. The results of this study have demonstrated the feasibility of applying the GRNNFA model to structural damage diagnosis even when the training samples were noise contaminated.
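
    Although the GRNNFA hybrid itself is not reproduced here, the GRNN half of the model is essentially a kernel-weighted (Nadaraya-Watson style) regression and can be sketched as follows, with synthetic data standing in for measured modal-parameter changes:

        # Sketch: a General Regression Neural Network (GRNN) prediction is a kernel-
        # weighted average of stored training targets -- here mapping (noisy) modal-
        # parameter changes to a damage indicator.  Data are synthetic.
        import numpy as np

        def grnn_predict(X_train, y_train, x_query, sigma=0.2):
            d2 = np.sum((X_train - x_query) ** 2, axis=1)
            w = np.exp(-d2 / (2.0 * sigma ** 2))            # RBF kernel weights
            return np.dot(w, y_train) / np.sum(w)

        rng = np.random.default_rng(0)
        X = rng.uniform(-1, 1, (200, 3))                    # e.g. normalized frequency shifts
        y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] + 0.05 * rng.normal(size=200)  # damage measure

        x_new = np.array([0.2, -0.1, 0.4])
        print("GRNN prediction:", grnn_predict(X, y, x_new))
        print("true underlying value:", np.sin(3 * 0.2) + 0.5 * -0.1)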

  13. Mesoscopic model of actin-based propulsion.

    Directory of Open Access Journals (Sweden)

    Jie Zhu

    Full Text Available Two theoretical models dominate current understanding of actin-based propulsion: microscopic polymerization ratchet model predicts that growing and writhing actin filaments generate forces and movements, while macroscopic elastic propulsion model suggests that deformation and stress of growing actin gel are responsible for the propulsion. We examine both experimentally and computationally the 2D movement of ellipsoidal beads propelled by actin tails and show that neither of the two models can explain the observed bistability of the orientation of the beads. To explain the data, we develop a 2D hybrid mesoscopic model by reconciling these two models such that individual actin filaments undergoing nucleation, elongation, attachment, detachment and capping are embedded into the boundary of a node-spring viscoelastic network representing the macroscopic actin gel. Stochastic simulations of this 'in silico' actin network show that the combined effects of the macroscopic elastic deformation and microscopic ratchets can explain the observed bistable orientation of the actin-propelled ellipsoidal beads. To test the theory further, we analyze observed distribution of the curvatures of the trajectories and show that the hybrid model's predictions fit the data. Finally, we demonstrate that the model can explain both concave-up and concave-down force-velocity relations for growing actin networks depending on the characteristic time scale and network recoil. To summarize, we propose that both microscopic polymerization ratchets and macroscopic stresses of the deformable actin network are responsible for the force and movement generation.

  14. Physiologically Based Pharmacokinetic Modeling of Therapeutic Proteins.

    Science.gov (United States)

    Wong, Harvey; Chow, Timothy W

    2017-09-01

    Biologics or therapeutic proteins are becoming increasingly important as treatments for disease. The most common class of biologics are monoclonal antibodies (mAbs). Recently, there has been an increase in the use of physiologically based pharmacokinetic (PBPK) modeling in the pharmaceutical industry during drug development. We review PBPK models for therapeutic proteins with an emphasis on mAbs. Due to their size and similarity to endogenous antibodies, there are distinct differences between PBPK models for small molecules and mAbs. The high-level organization of a typical mAb PBPK model consists of a whole-body PBPK model with organ compartments interconnected by both blood and lymph flows. The whole-body PBPK model is coupled with tissue-level submodels used to describe key mechanisms governing mAb disposition including tissue efflux via the lymphatic system, elimination by catabolism, protection from catabolism via binding to the neonatal Fc receptor (FcRn), and nonlinear binding to specific pharmacological targets of interest. The use of PBPK modeling in the development of therapeutic proteins is still in its infancy. Further application of PBPK modeling for therapeutic proteins will help to define its developing role in drug discovery and development. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
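
    To make the compartment structure described above concrete, the following toy script wires a plasma compartment to a single tissue compartment with convective uptake, lymphatic return and catabolic loss. It is a deliberately stripped-down sketch, not a validated whole-body mAb PBPK model; all flows, volumes and rate constants are assumed values chosen only to make the example run.

```python
# Toy plasma/tissue compartment sketch with a lymphatic return path, meant only
# to illustrate the flow structure described above; it is not a validated
# whole-body mAb PBPK model, and all parameter values are made up.
import numpy as np
from scipy.integrate import solve_ivp

Q_lymph = 0.002                 # lymph flow (L/h), illustrative
V_p, V_t = 3.0, 1.0             # plasma and tissue volumes (L)
sigma_v = 0.95                  # vascular reflection coefficient
k_cat = 0.01                    # tissue catabolism rate (1/h)

def rhs(t, y):
    C_p, C_t = y
    uptake = Q_lymph * (1 - sigma_v) * C_p      # convective entry into tissue (mg/h)
    back = Q_lymph * C_t                        # return to plasma via lymph (mg/h)
    dCp = (-uptake + back) / V_p
    dCt = (uptake - back) / V_t - k_cat * C_t   # catabolic loss in tissue
    return [dCp, dCt]

sol = solve_ivp(rhs, (0, 500), [10.0, 0.0])     # 10 mg/L initial plasma conc.
print("plasma conc. at t=500 h:", round(float(sol.y[0, -1]), 3))
```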

  15. Unifying Model-Based and Reactive Programming within a Model-Based Executive

    Science.gov (United States)

    Williams, Brian C.; Gupta, Vineet; Norvig, Peter (Technical Monitor)

    1999-01-01

    Real-time, model-based, deduction has recently emerged as a vital component in AI's tool box for developing highly autonomous reactive systems. Yet one of the current hurdles towards developing model-based reactive systems is the number of methods simultaneously employed, and their corresponding melange of programming and modeling languages. This paper offers an important step towards unification. We introduce RMPL, a rich modeling language that combines probabilistic, constraint-based modeling with reactive programming constructs, while offering a simple semantics in terms of hidden state Markov processes. We introduce probabilistic, hierarchical constraint automata (PHCA), which allow Markov processes to be expressed in a compact representation that preserves the modularity of RMPL programs. Finally, a model-based executive, called Reactive Burton, is described that exploits this compact encoding to perform efficient simulation, belief state update and control sequence generation.

  16. Sandboxes for Model-Based Inquiry

    Science.gov (United States)

    Brady, Corey; Holbert, Nathan; Soylu, Firat; Novak, Michael; Wilensky, Uri

    2015-04-01

    In this article, we introduce a class of constructionist learning environments that we call Emergent Systems Sandboxes (ESSs), which have served as a centerpiece of our recent work in developing curriculum to support scalable model-based learning in classroom settings. ESSs are a carefully specified form of virtual construction environment that support students in creating, exploring, and sharing computational models of dynamic systems that exhibit emergent phenomena. They provide learners with "entity"-level construction primitives that reflect an underlying scientific model. These primitives can be directly "painted" into a sandbox space, where they can then be combined, arranged, and manipulated to construct complex systems and explore the emergent properties of those systems. We argue that ESSs offer a means of addressing some of the key barriers to adopting rich, constructionist model-based inquiry approaches in science classrooms at scale. Situating the ESS in a large-scale science modeling curriculum we are implementing across the USA, we describe how the unique "entity-level" primitive design of an ESS facilitates knowledge system refinement at both an individual and a social level, how it supports flexible modeling practices by providing both continuous and discrete modes of executability, and how it offers students a variety of opportunities for validating their qualitative understandings of emergent systems as they develop.

  17. PARTICIPATION BASED MODEL OF SHIP CREW MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Toni Bielić

    2014-10-01

    Full Text Available This paper analyses the participation-based model on board the ship as a possibly optimal leadership model existing in the shipping industry, with an accent on the decision-making process. In the paper the authors have tried to define the master's behaviour model and management style, identifying drawbacks and disadvantages of a vertical, pyramidal organization with the master at the top. The paper describes the efficiency of decision making within a team organization and the optimization of a ship's organisation by introducing teamwork on board the ship. Three examples of ship accidents are studied and evaluated through the "Leader-participation" model. The model of participation-based management as a model of teamwork has been applied in studying the cause-and-effect of the accidents, with a critical review of the communication and the management of human resources on a ship. The results have shown that the cause of all three accidents is the autocratic behaviour of the leaders and a lack of communication within the teams.

  18. Multiagent-Based Model For ESCM

    Directory of Open Access Journals (Sweden)

    Delia MARINCAS

    2011-01-01

    Full Text Available Web based applications for Supply Chain Management (SCM) are now a necessity for every company in order to meet the increasing customer demands, to face the global competition and to make profit. A multiagent-based approach is appropriate for eSCM because it shows many of the characteristics a SCM system should have. For this reason, we have proposed a multiagent-based eSCM model which configures a virtual SC and automates the SC activities: selling, purchasing, manufacturing, planning, inventory, etc. This model will allow a better coordination of the supply chain network and will increase the effectiveness of Web and intelligent technologies employed in eSCM software.

  19. Agent based modeling of energy networks

    International Nuclear Information System (INIS)

    Gonzalez de Durana, José María; Barambones, Oscar; Kremers, Enrique; Varga, Liz

    2014-01-01

    Highlights: • A new approach for energy network modeling is designed and tested. • The agent-based approach is general and not technology dependent. • The models can be easily extended. • The range of applications encompasses small to large energy infrastructures. - Abstract: Attempts to model any present or future power grid face a huge challenge because a power grid is a complex system, with feedback and multi-agent behaviors, integrated by generation, distribution, storage and consumption systems, using various control and automation computing systems to manage electricity flows. Our approach to modeling is to build upon an established model of the low voltage electricity network which is tested and proven, by extending it to a generalized energy model. But, in order to address the crucial issues of energy efficiency, additional processes like energy conversion and storage, and further energy carriers, such as gas, heat, etc., besides the traditional electrical one, must be considered. Therefore a more powerful model, provided with enhanced nodes or conversion points, able to deal with multidimensional flows, is required. This article addresses the issue of modeling a local multi-carrier energy network. This problem can be considered as an extension of modeling a low voltage distribution network located in some urban or rural geographic area. But instead of using an external power flow analysis package to do the power flow calculations, as used in electric networks, in this work we integrate a multiagent algorithm to perform the task, in a concurrent way to the other simulation tasks, and not only for the electric fluid but also for a number of additional energy carriers. As the model is mainly focused on system operation, generation and load models are not developed.

  20. Modeling acquaintance networks based on balance theory

    Directory of Open Access Journals (Sweden)

    Vukašinović Vida

    2014-09-01

    Full Text Available An acquaintance network is a social structure made up of a set of actors and the ties between them. These ties change dynamically as a consequence of incessant interactions between the actors. In this paper we introduce a social network model called the Interaction-Based (IB) model that involves well-known sociological principles. The connections between the actors and the strength of the connections are influenced by the continuous positive and negative interactions between the actors and, vice versa, the future interactions are more likely to happen between the actors that are connected with stronger ties. The model is also inspired by the social behavior of animal species, particularly that of ants in their colony. A model evaluation showed that the IB model turned out to be sparse. The model has a small diameter and an average path length that grows in proportion to the logarithm of the number of vertices. The clustering coefficient is relatively high, and its value stabilizes in larger networks. The degree distributions are slightly right-skewed. In the mature phase of the IB model, i.e., when the number of edges does not change significantly, most of the network properties do not change significantly either. The IB model was found to be the best of all the compared models in simulating the e-mail URV (University Rovira i Virgili of Tarragona) network because the properties of the IB model more closely matched those of the e-mail URV network than the other models.
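
    A rough flavour of an interaction-based network model can be given in a few lines: tie strengths grow or shrink with signed interactions, and actors preferentially re-engage their strong ties. The update rule, probabilities and network size below are assumptions for illustration only and do not reproduce the published IB model or its evaluation.

```python
# Rough sketch of an interaction-based network model: tie strengths are updated
# by signed interactions, and partners with stronger ties are more likely to be
# chosen again. The update rule and parameters are assumptions, not the
# published IB model.
import random
import networkx as nx

random.seed(1)
N, STEPS = 50, 5000
G = nx.Graph()
G.add_nodes_from(range(N))

for _ in range(STEPS):
    a = random.randrange(N)
    nbrs = list(G[a])
    if nbrs and random.random() < 0.7:
        b = max(nbrs, key=lambda n: G[a][n]["w"])    # re-engage the strongest tie
    else:
        b = random.choice([n for n in range(N) if n != a])
    sign = 1 if random.random() < 0.8 else -1        # mostly positive interactions
    w = G[a][b]["w"] + sign if G.has_edge(a, b) else sign
    if w <= 0:
        if G.has_edge(a, b):
            G.remove_edge(a, b)                      # negative ties dissolve
    else:
        G.add_edge(a, b, w=w)

print("edges:", G.number_of_edges())
print("average clustering:", round(nx.average_clustering(G), 3))
```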

  1. Introducing Model-Based System Engineering Transforming System Engineering through Model-Based Systems Engineering

    Science.gov (United States)

    2014-03-31

    …the term Model Based Engineering (MBE), Model Driven Engineering (MDE), or Model-Based Systems Engineering…

  2. Multivariate statistical modelling based on generalized linear models

    CERN Document Server

    Fahrmeir, Ludwig

    1994-01-01

    This book is concerned with the use of generalized linear models for univariate and multivariate regression analysis. Its emphasis is to provide a detailed introductory survey of the subject based on the analysis of real data drawn from a variety of subjects including the biological sciences, economics, and the social sciences. Where possible, technical details and proofs are deferred to an appendix in order to provide an accessible account for non-experts. Topics covered include: models for multi-categorical responses, model checking, time series and longitudinal data, random effects models, and state-space models. Throughout, the authors have taken great pains to discuss the underlying theoretical ideas in ways that relate well to the data at hand. As a result, numerous researchers whose work relies on the use of these models will find this an invaluable account to have on their desks. "The basic aim of the authors is to bring together and review a large part of recent advances in statistical modelling of m...

  3. Agent-Based Models in Social Physics

    Science.gov (United States)

    Quang, Le Anh; Jung, Nam; Cho, Eun Sung; Choi, Jae Han; Lee, Jae Woo

    2018-06-01

    We review agent-based models (ABM) on social physics, including econophysics. An ABM consists of agents, a system space, and an external environment. The agent is autonomous and decides his/her behavior by interacting with the neighbors or the external environment according to rules of behavior. Agents are irrational because they have only limited information when they make decisions. They adapt using learning from past memories. Agents have various attributes and are heterogeneous. An ABM is a non-equilibrium complex system that exhibits various emergence phenomena. The social complexity ABM describes human behavioral characteristics. In ABMs of econophysics, we introduce the Sugarscape model and the artificial market models. We review minority games and majority games in ABMs of game theory. Social flow ABMs cover crowding, evacuation, traffic congestion, and pedestrian dynamics. We also review ABMs for opinion dynamics and the voter model. We discuss the features, advantages, and disadvantages of NetLogo, Repast, Swarm, and Mason, which are representative platforms for implementing ABMs.
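
    As a concrete example of one of the game-theoretic ABMs mentioned above, the sketch below implements a stripped-down minority game: agents hold two fixed random strategies over the last M outcomes and play the one with the better virtual score. Population size, memory length and scoring are illustrative choices, not taken from the review.

```python
# Stripped-down minority game: N agents repeatedly choose side 0 or 1, and
# those on the minority side win. Each agent keeps two fixed random strategies
# (lookup tables over the last M outcomes) and plays the one with the better
# virtual score. Parameters are illustrative.
import random

random.seed(0)
N, M, ROUNDS = 101, 3, 200
history = tuple(random.randint(0, 1) for _ in range(M))

def random_strategy():
    all_histories = [tuple((i >> b) & 1 for b in range(M)) for i in range(2 ** M)]
    return {h: random.randint(0, 1) for h in all_histories}

agents = [{"strats": [random_strategy(), random_strategy()], "scores": [0, 0]}
          for _ in range(N)]

for _ in range(ROUNDS):
    choices = []
    for ag in agents:
        best = 0 if ag["scores"][0] >= ag["scores"][1] else 1
        choices.append(ag["strats"][best][history])
    minority = 0 if sum(choices) > N / 2 else 1
    for ag in agents:
        for k in (0, 1):                             # update virtual scores
            ag["scores"][k] += 1 if ag["strats"][k][history] == minority else -1
    history = history[1:] + (minority,)

print("last-round minority size:", sum(1 for c in choices if c == minority))
```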

  4. Mechanics model for actin-based motility.

    Science.gov (United States)

    Lin, Yuan

    2009-02-01

    We present here a mechanics model for the force generation by actin polymerization. The possible adhesions between the actin filaments and the load surface, as well as the nucleation and capping of filament tips, are included in this model on top of the well-known elastic Brownian ratchet formulation. A closed form solution is provided from which the force-velocity relationship, summarizing the mechanics of polymerization, can be drawn. Model predictions on the velocity of moving beads driven by actin polymerization are consistent with experimental observations. This model also seems capable of explaining the enhanced actin-based motility of Listeria monocytogenes and beads by the presence of Vasodilator-stimulated phosphoprotein, as observed in recent experiments.
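
    The load dependence that elastic Brownian ratchet formulations typically produce can be illustrated with the textbook relation v(F) = v_free * exp(-F * delta / (N * kB * T)). The worked example below uses that simplified relation with invented numbers; it is not the closed-form solution derived in the paper.

```python
# Worked example of the textbook Brownian-ratchet force-velocity relation,
# v(F) = v_free * exp(-F * delta / (N * kB * T)), shown only to illustrate the
# kind of load dependence such models predict; it is not the paper's solution,
# and the parameter values are illustrative.
import math

kB_T = 4.1e-21        # thermal energy at ~300 K (J)
delta = 2.7e-9        # gap a filament tip must fluctuate open to add a monomer (m)
v_free = 0.5e-6       # load-free polymerization velocity (m/s)
N = 50                # number of working filaments sharing the load

for F_pN in (0, 20, 50, 100, 200):
    F = F_pN * 1e-12
    v = v_free * math.exp(-F * delta / (N * kB_T))
    print(f"load {F_pN:4d} pN -> velocity {v * 1e9:7.1f} nm/s")
```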

  5. A satellite-based global landslide model

    Directory of Open Access Journals (Sweden)

    A. Farahmand

    2013-05-01

    Full Text Available Landslides are devastating phenomena that cause huge damage around the world. This paper presents a quasi-global landslide model derived using satellite precipitation data, land-use land cover maps, and 250 m topography information. The suggested landslide model is based on Support Vector Machines (SVM), a machine learning algorithm. The National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) landslide inventory data are used as observations and reference data. In all, 70% of the data are used for model development and training, whereas 30% are used for validation and verification. The results of 100 random subsamples of available landslide observations revealed that the suggested landslide model can predict historical landslides reliably. The average error of 100 iterations of landslide prediction is estimated to be approximately 7%, while approximately 2% false landslide events are observed.
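
    The classification step described above (an SVM trained on a 70/30 split) can be sketched as follows. The features stand in for precipitation, slope and land cover, and both the data and the feature choices are synthetic; the sketch is not the model trained on the NASA GSFC inventory.

```python
# Sketch of an SVM landslide classifier on synthetic data, with a 70/30
# train/validation split as described above. The data, labels and feature
# choices are made up for illustration.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(42)
n = 1000
precip = rng.gamma(2.0, 20.0, n)            # mm/day
slope = rng.uniform(0, 45, n)               # degrees
landcover = rng.integers(0, 4, n)           # coded land-use class
X = np.column_stack([precip, slope, landcover])

# synthetic label: landslides more likely with heavy rain on steep slopes
logit = 0.03 * precip + 0.08 * slope - 4.5
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)
model = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)
print("validation accuracy:", round(model.score(X_va, y_va), 3))
```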

  6. Electrochemistry-based Battery Modeling for Prognostics

    Science.gov (United States)

    Daigle, Matthew J.; Kulkarni, Chetan Shrikant

    2013-01-01

    Batteries are used in a wide variety of applications. In recent years, they have become popular as a source of power for electric vehicles such as cars, unmanned aerial vehicles, and commercial passenger aircraft. In such application domains, it becomes crucial to both monitor battery health and performance and to predict end of discharge (EOD) and end of useful life (EOL) events. To implement such technologies, it is crucial to understand how batteries work and to capture that knowledge in the form of models that can be used by monitoring, diagnosis, and prognosis algorithms. In this work, we develop electrochemistry-based models of lithium-ion batteries that capture the significant electrochemical processes, are computationally efficient, capture the effects of aging, and are of suitable accuracy for reliable EOD prediction in a variety of usage profiles. This paper reports on the progress of such a model, with results demonstrating the model validity and accurate EOD predictions.
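
    The paper relies on detailed electrochemistry models; as a much simpler stand-in, the sketch below shows the general idea of model-based end-of-discharge prediction by stepping a crude equivalent-circuit model (SOC-dependent open-circuit voltage minus an ohmic drop) forward under an assumed load until the voltage hits the cutoff. All parameters are invented.

```python
# Much simpler stand-in for model-based end-of-discharge (EOD) prediction: an
# equivalent-circuit model is stepped forward under an assumed constant load
# until the terminal voltage crosses the cutoff. The paper's electrochemistry
# models are far more detailed; every number here is illustrative.
import numpy as np

capacity_As = 2.2 * 3600        # 2.2 Ah cell, in ampere-seconds
R_int = 0.05                    # internal resistance (ohm)
V_cutoff = 3.0                  # end-of-discharge voltage (V)
I_load = 2.0                    # constant discharge current (A)

def ocv(soc):
    """Crude open-circuit-voltage curve vs. state of charge."""
    return 3.0 + 1.2 * soc - 0.8 * np.exp(-10 * soc)

soc, t, dt = 1.0, 0.0, 1.0      # start fully charged, 1 s time step
while ocv(soc) - I_load * R_int > V_cutoff and soc > 0:
    soc -= I_load * dt / capacity_As        # Coulomb counting
    t += dt

print(f"predicted EOD in {t / 60:.1f} minutes at {I_load} A")
```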

  7. 3-D model-based vehicle tracking.

    Science.gov (United States)

    Lou, Jianguang; Tan, Tieniu; Hu, Weiming; Yang, Hao; Maybank, Steven J

    2005-10-01

    This paper aims at tracking vehicles from monocular intensity image sequences and presents an efficient and robust approach to three-dimensional (3-D) model-based vehicle tracking. Under the weak perspective assumption and the ground-plane constraint, the movements of model projection in the two-dimensional image plane can be decomposed into two motions: translation and rotation. They are the results of the corresponding movements of 3-D translation on the ground plane (GP) and rotation around the normal of the GP, which can be determined separately. A new metric based on point-to-line segment distance is proposed to evaluate the similarity between an image region and an instantiation of a 3-D vehicle model under a given pose. Based on this, we provide an efficient pose refinement method to refine the vehicle's pose parameters. An improved EKF is also proposed to track and to predict vehicle motion with a precise kinematics model. Experimental results with both indoor and outdoor data show that the algorithm obtains desirable performance even under severe occlusion and clutter.
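
    The geometric primitive behind the proposed similarity metric, the point-to-line-segment distance, can be written compactly as below. The full metric in the paper aggregates such distances between image features and the projected model segments under a candidate pose, which is not reproduced here.

```python
# Minimal point-to-line-segment distance, the geometric primitive behind the
# similarity metric described above; the full pose-evaluation metric is not
# reproduced here.
import numpy as np

def point_to_segment_distance(p, a, b):
    """Distance from point p to segment ab (2-D)."""
    p, a, b = map(np.asarray, (p, a, b))
    ab = b - a
    denom = float(np.dot(ab, ab))
    if denom == 0.0:                                  # degenerate segment
        return float(np.linalg.norm(p - a))
    t = np.clip(np.dot(p - a, ab) / denom, 0.0, 1.0)  # clamp foot to the segment
    return float(np.linalg.norm(p - (a + t * ab)))

print(point_to_segment_distance([2, 1], [0, 0], [4, 0]))  # 1.0 (foot inside segment)
print(point_to_segment_distance([6, 1], [0, 0], [4, 0]))  # distance to endpoint (4, 0)
```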

  8. SLS Model Based Design: A Navigation Perspective

    Science.gov (United States)

    Oliver, T. Emerson; Anzalone, Evan; Park, Thomas; Geohagan, Kevin

    2018-01-01

    The SLS Program has implemented a Model-based Design (MBD) and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team is responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1B design, the additional GPS Receiver hardware model is managed as a DMM at the vehicle design level. This paper describes the models, and discusses the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the navigation components.

  9. Model Predictive Control based on Finite Impulse Response Models

    DEFF Research Database (Denmark)

    Prasath, Guru; Jørgensen, John Bagterp

    2008-01-01

    We develop a regularized l2 finite impulse response (FIR) predictive controller with input and input-rate constraints. Feedback is based on a simple constant output disturbance filter. The performance of the predictive controller in the face of plant-model mismatch is investigated by simulations and related to the uncertainty of the impulse response coefficients. The simulations can be used to benchmark l2 MPC against FIR-based robust MPC as well as to estimate the maximum performance improvements by robust MPC.
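
    The prediction machinery behind an FIR-based predictive controller can be sketched by stacking step-response coefficients into a lower-triangular prediction matrix and solving a regularized least-squares problem for the input moves. The sketch below omits the input and input-rate constraints and the output disturbance filter from the paper, and the impulse response is made up.

```python
# Sketch of the prediction step behind an FIR-based predictive controller:
# build a lower-triangular step-response prediction matrix from the impulse
# response and solve an l2-regularized least-squares problem for the input
# moves. Constraints and the disturbance filter are omitted; the impulse
# response is made up.
import numpy as np

h = 0.8 ** np.arange(1, 21) * 0.5          # illustrative FIR coefficients h_1..h_20
Np = 20                                    # prediction horizon

S = np.cumsum(h)                           # step-response coefficients
Gamma = np.zeros((Np, Np))
for k in range(Np):
    Gamma[k, :k + 1] = S[k::-1]            # y_k = sum_{i<=k} S_{k-i} * du_i

r = np.ones(Np)                            # setpoint trajectory
lam = 0.1                                  # l2 regularization on input moves

# min_du ||Gamma du - r||^2 + lam ||du||^2  (unconstrained solution)
du = np.linalg.solve(Gamma.T @ Gamma + lam * np.eye(Np), Gamma.T @ r)
print("first input move:", round(du[0], 3))
```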

  10. Model based monitoring of stormwater runoff quality

    DEFF Research Database (Denmark)

    Birch, Heidi; Vezzaro, Luca; Mikkelsen, Peter Steen

    2012-01-01

    …the information obtained about MPs discharged from the monitored system. A dynamic stormwater quality model was calibrated using MP data collected by volume-proportional and passive sampling in a storm drainage system in the outskirts of Copenhagen (Denmark), and a 10-year rain series was used to find annual average and maximum event mean concentrations. Use of this model reduced the uncertainty of predicted annual average concentrations compared to a simple stochastic method based solely on data. The predicted annual average obtained by using passive sampler measurements (one month installation…

  11. Divergence-based tests for model diagnostic

    Czech Academy of Sciences Publication Activity Database

    Hobza, Tomáš; Esteban, M. D.; Morales, D.; Marhuenda, Y.

    2008-01-01

    Roč. 78, č. 13 (2008), s. 1702-1710 ISSN 0167-7152 R&D Projects: GA MŠk 1M0572 Grant - others:Instituto Nacional de Estadistica (ES) MTM2006-05693 Institutional research plan: CEZ:AV0Z10750506 Keywords: goodness of fit * divergence statistics * GLM * model checking * bootstrap Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.445, year: 2008 http://library.utia.cas.cz/separaty/2008/SI/hobza-divergence-based%20tests%20for%20model%20diagnostic.pdf

  12. Multiscale agent-based cancer modeling.

    Science.gov (United States)

    Zhang, Le; Wang, Zhihui; Sagotsky, Jonathan A; Deisboeck, Thomas S

    2009-04-01

    Agent-based modeling (ABM) is an in silico technique that is being used in a variety of research areas such as in social sciences, economics and increasingly in biomedicine as an interdisciplinary tool to study the dynamics of complex systems. Here, we describe its applicability to integrative tumor biology research by introducing a multi-scale tumor modeling platform that understands brain cancer as a complex dynamic biosystem. We summarize significant findings of this work, and discuss both challenges and future directions for ABM in the field of cancer research.

  13. Direct healthcare costs of selected diseases primarily or partially transmitted by water.

    Science.gov (United States)

    Collier, S A; Stockman, L J; Hicks, L A; Garrison, L E; Zhou, F J; Beach, M J

    2012-11-01

    Despite US sanitation advancements, millions of waterborne disease cases occur annually, although the precise burden of disease is not well quantified. Estimating the direct healthcare cost of specific infections would be useful in prioritizing waterborne disease prevention activities. Hospitalization and outpatient visit costs per case and total US hospitalization costs for ten waterborne diseases were calculated using large healthcare claims and hospital discharge databases. The five primarily waterborne diseases in this analysis (giardiasis, cryptosporidiosis, Legionnaires' disease, otitis externa, and non-tuberculous mycobacterial infection) were responsible for over 40 000 hospitalizations at a cost of $970 million per year, including at least $430 million in hospitalization costs for Medicaid and Medicare patients. An additional 50 000 hospitalizations for campylobacteriosis, salmonellosis, shigellosis, haemolytic uraemic syndrome, and toxoplasmosis cost $860 million annually ($390 million in payments for Medicaid and Medicare patients), a portion of which can be assumed to be due to waterborne transmission.

  14. Physics-Based Modeling of Meteor Entry and Breakup

    Science.gov (United States)

    Prabhu, Dinesh K.; Agrawal, Parul; Allen, Gary A., Jr.; Bauschlicher, Charles W., Jr.; Brandis, Aaron M.; Chen, Yih-Kang; Jaffe, Richard L.; Palmer, Grant E.; Saunders, David A.; Stern, Eric C.; hide

    2015-01-01

    A new research effort at NASA Ames Research Center has been initiated in Planetary Defense, which integrates the disciplines of planetary science, atmospheric entry physics, and physics-based risk assessment. This paper describes work within the new program and is focused on meteor entry and breakup. Over the last six decades significant effort was expended in the US and in Europe to understand meteor entry, including ablation, fragmentation and airburst (if any), for various types of meteors ranging from stony to iron spectral types. These efforts have produced primarily empirical mathematical models based on observations. Weaknesses of these models, apart from their empiricism, are reliance on idealized shapes (spheres, cylinders, etc.) and simplified models for the thermal response of meteoritic materials to aerodynamic and radiative heating. Furthermore, the fragmentation and energy release of meteors (airburst) is poorly understood. On the other hand, flight of human-made atmospheric entry capsules is well understood. The capsules and their requisite heatshields are designed and margined to survive entry. However, the highest speed Earth entry for capsules is 13 km/s (Stardust). Furthermore, Earth entry capsules have never exceeded diameters of 5 m, nor have their peak aerothermal environments exceeded 0.3 atm and 1 kW/sq cm. The aims of the current work are: (i) to define the aerothermal environments for objects with entry velocities from 13 to 20 km/s; (ii) to explore various hypotheses of fragmentation and airburst of stony meteors in the near term; (iii) to explore the possibility of performing relevant ground-based tests to verify candidate hypotheses; and (iv) to quantify the energy released in airbursts. The results of the new simulations will be used to anchor said risk assessment analyses. With these aims in mind, state-of-the-art entry capsule design tools are being extended for meteor entries. We describe: (i) applications of current simulation tools to…

  15. Knowledge-based geometric modeling in construction

    DEFF Research Database (Denmark)

    Bonev, Martin; Hvam, Lars

    2012-01-01

    …a considerably high amount of their resources is required for designing and specifying the majority of their product assortment. As design decisions are hereby based on knowledge and experience about the behaviour and applicability of construction techniques and materials for a predefined design situation, smart… A wider application of IT-based solutions, such as configuration systems and the implementation of modeling standards, has facilitated the trend to produce mass customized products to support, inter alia, the specification process of the increasing product variety. However, not all industries have…

  16. Item Modeling Concept Based on Multimedia Authoring

    Directory of Open Access Journals (Sweden)

    Janez Stergar

    2008-09-01

    Full Text Available In this paper a modern item design framework for computer-based assessment, based on the Flash authoring environment, is introduced. Question design is discussed and the multimedia authoring environment used for item modeling is emphasized. Item type templates are a structured means of collecting and storing item information that can be used to improve the efficiency and security of the innovative item design process. Templates can modernize item design and enhance and speed up the development process. Along with content creation, multimedia has vast potential for use in innovative testing. The introduced item design template is based on a taxonomy of innovative items, which have great potential for expanding the content areas and construct coverage of an assessment. The presented item design approach is based on GUIs – one for question design based on the implemented item design templates and one for user interaction tracking/retrieval. The concept of user interfaces based on Flash technology is discussed, as well as the implementation of the innovative approach to item design forms with multimedia authoring. An innovative method for user interaction storage/retrieval based on PHP, extending Flash capabilities in the proposed framework, is also introduced.

  17. Agent-based modeling in ecological economics.

    Science.gov (United States)

    Heckbert, Scott; Baynes, Tim; Reeson, Andrew

    2010-01-01

    Interconnected social and environmental systems are the domain of ecological economics, and models can be used to explore feedbacks and adaptations inherent in these systems. Agent-based modeling (ABM) represents autonomous entities, each with dynamic behavior and heterogeneous characteristics. Agents interact with each other and their environment, resulting in emergent outcomes at the macroscale that can be used to quantitatively analyze complex systems. ABM is contributing to research questions in ecological economics in the areas of natural resource management and land-use change, urban systems modeling, market dynamics, changes in consumer attitudes, innovation, and diffusion of technology and management practices, commons dilemmas and self-governance, and psychological aspects to human decision making and behavior change. Frontiers for ABM research in ecological economics involve advancing the empirical calibration and validation of models through mixed methods, including surveys, interviews, participatory modeling, and, notably, experimental economics to test specific decision-making hypotheses. Linking ABM with other modeling techniques at the level of emergent properties will further advance efforts to understand dynamics of social-environmental systems.

  18. Agent Based Model of Livestock Movements

    Science.gov (United States)

    Miron, D. J.; Emelyanova, I. V.; Donald, G. E.; Garner, G. M.

    The modelling of livestock movements within Australia is of national importance for the purposes of the management and control of exotic disease spread, infrastructure development and the economic forecasting of livestock markets. In this paper an agent based model for the forecasting of livestock movements is presented. The model represents livestock movements from farm to farm through a saleyard. The decision of farmers to sell or buy cattle is often complex and involves many factors such as climate forecasts, commodity prices, the type of farm enterprise, the number of animals available and associated off-shore effects. In this model the farm agent's intelligence is implemented using a fuzzy decision tree that utilises two of these factors. These two factors are the livestock price fetched at the last sale and the number of stock on the farm. On each iteration of the model farms choose either to buy, sell or abstain from the market, thus creating an artificial supply and demand. The buyers and sellers then congregate at the saleyard where livestock are auctioned using a second-price sealed bid. The price time series output by the model exhibits properties similar to those found in real livestock markets.
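
    Two of the ingredients named above, a rule-based sell/buy/abstain decision driven by the last sale price and stock on hand, and a second-price sealed-bid auction at the saleyard, are sketched below. The thresholds and the crude if/else rules are assumptions standing in for the paper's fuzzy decision tree.

```python
# Sketch of a farm agent's sell/buy/abstain decision (a crude stand-in for the
# fuzzy decision tree) and a second-price sealed-bid auction. All thresholds,
# prices and stock numbers are invented.
import random

random.seed(3)

def farm_decision(last_price, stock, price_ref=100.0, capacity=500):
    """Crude rule-based stand-in for the fuzzy decision tree."""
    price_high = last_price > 1.1 * price_ref
    crowded = stock > 0.8 * capacity
    if price_high and stock > 0:
        return "sell"
    if not price_high and not crowded:
        return "buy"
    return "abstain"

def second_price_auction(bids):
    """Winner pays the second-highest bid (sealed-bid Vickrey auction)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price_paid = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, round(price_paid, 2)

farms = {f"farm{i}": {"stock": random.randint(50, 500)} for i in range(5)}
last_price = 95.0                      # below the reference price, so some farms buy
buyers = {name: random.uniform(90, 130) for name, f in farms.items()
          if farm_decision(last_price, f["stock"]) == "buy"}
print("auction result:", second_price_auction(buyers) if buyers else "no buyers")
```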

  19. Online constrained model-based reinforcement learning

    CSIR Research Space (South Africa)

    Van Niekerk, B

    2017-08-01

    Full Text Available Online Constrained Model-based Reinforcement Learning, by Benjamin van Niekerk, Andreas Damianou, and Benjamin Rosman. … Using direct multiple shooting (Bock and Plitt, 1984), problem (1) can be transformed into a structured nonlinear program (NLP). First, the time horizon [t0, t0 + T] is partitioned into N equal subintervals [tk, tk+1] for k = 0…
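
    The partitioning step quoted above is easy to illustrate: split [t0, t0 + T] into N equal subintervals, integrate the dynamics separately on each one from a guessed shooting-node state, and hand the resulting continuity defects to an NLP solver (not shown). The toy dynamics, horizon and guesses below are illustrative.

```python
# Sketch of the first step in direct multiple shooting: partition the horizon
# [t0, t0 + T] into N equal subintervals, integrate the dynamics on each one
# from a guessed node state, and collect the continuity defects that the NLP
# solver would drive to zero. Dynamics and numbers are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

t0, T, N = 0.0, 2.0, 4
knots = np.linspace(t0, t0 + T, N + 1)           # t_0 < t_1 < ... < t_N

def f(t, x, u):                                  # toy controlled dynamics
    return [-x[0] + u]

u_guess = np.zeros(N)                            # one constant control per interval
s_guess = np.ones(N + 1)                         # shooting-node state guesses

defects = []
for k in range(N):
    sol = solve_ivp(f, (knots[k], knots[k + 1]), [s_guess[k]], args=(u_guess[k],))
    defects.append(sol.y[0, -1] - s_guess[k + 1])   # continuity defect for the NLP

print("subintervals:", list(zip(knots[:-1], knots[1:])))
print("continuity defects:", np.round(defects, 3))
```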

  20. Statistical models based on conditional probability distributions

    International Nuclear Information System (INIS)

    Narayanan, R.S.

    1991-10-01

    We present a formulation of statistical mechanics models based on conditional probability distributions rather than a Hamiltonian. We show that it is possible to realize critical phenomena through this procedure. Closely linked with this formulation is a Monte Carlo algorithm, in which a generated configuration is guaranteed to be statistically independent from any other configuration for all values of the parameters, in particular near the critical point. (orig.)

  1. A model-based risk management framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune

    2002-08-15

    The ongoing research activity addresses these issues through two co-operative activities. The first is the IST funded research project CORAS, where Institutt for energiteknikk takes part as responsible for the work package for Risk Analysis. The main objective of the CORAS project is to develop a framework to support risk assessment of security critical systems. The second, called the Halden Open Dependability Demonstrator (HODD), is established in cooperation between Oestfold University College, local companies and HRP. The objective of HODD is to provide an open-source test bed for testing, teaching and learning about risk analysis methods, risk analysis tools, and fault tolerance techniques. The Inverted Pendulum Control System (IPCON), whose main task is to keep a pendulum balanced and controlled, is the first system that has been established. In order to make a risk assessment one needs to know what a system does, or is intended to do. Furthermore, the risk assessment requires correct descriptions of the system, its context and all relevant features. A basic assumption is that a precise model of this knowledge, based on formal or semi-formal descriptions, such as UML, will facilitate a systematic risk assessment. It is also necessary to have a framework to integrate the different risk assessment methods. The experiences so far support this hypothesis. This report presents CORAS and the CORAS model-based risk management framework, including a preliminary guideline for model-based risk assessment. The CORAS framework for model-based risk analysis offers a structured and systematic approach to identify and assess security issues of ICT systems. From the initial assessment of IPCON, we also believe that the framework is applicable in a safety context. Further work on IPCON, as well as the experiences from the CORAS trials, will provide insight and feedback for further improvements. (Author)

  2. Agent Based Modelling for Social Simulation

    OpenAIRE

    Smit, S.K.; Ubink, E.M.; Vecht, B. van der; Langley, D.J.

    2013-01-01

    This document is the result of an exploratory project looking into the status of, and opportunities for Agent Based Modelling (ABM) at TNO. The project focussed on ABM applications containing social interactions and human factors, which we termed ABM for social simulation (ABM4SS). During the course of this project two workshops were organized. At these workshops, a wide range of experts, both ABM experts and domain experts, worked on several potential applications of ABM. The results and ins...

  3. Ecosystem Based Business Model of Smart Grid

    OpenAIRE

    Lundgaard, Morten Raahauge; Ma, Zheng; Jørgensen, Bo Nørregaard

    2015-01-01

    This paper investigates the ecosystem-based business model in a smart grid infrastructure and the potential for value capture in a highly complex macro infrastructure such as the smart grid. The paper proposes an alternative perspective for studying the smart grid business ecosystem to address the infrastructural challenges, such as the interoperability of business components for the smart grid. So far little research has explored the business ecosystem in the smart grid concept. The study on t...

  4. Model based risk assessment - the CORAS framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.

    2004-04-15

    Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast to this, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with successful outcome through a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general and the CORAS application of MBRA in particular have contributed positively to the visibility of model-based risk assessment and thus to the disclosure of several potentials for further exploitation of various aspects within this important research field. In that connection, the CORAS methodology's possibilities for further improvement towards utilization in more complex architectures and also in other application domains such as the nuclear field can be addressed. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial driven approach within the nuclear field. The tool supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)

  5. Business Models for NFC based mobile payments

    Directory of Open Access Journals (Sweden)

    Johannes Sang Un Chae

    2015-01-01

    Full Text Available Purpose: The purpose of the paper is to develop a business model framework for NFC based mobile payment solutions consisting of four mutually interdependent components: the value service, value network, value architecture, and value finance. Design: Using a comparative case study method, the paper investigates Google Wallet and ISIS Mobile Wallet and their underlying business models. Findings: Google Wallet and ISIS Mobile Wallet are focusing on providing an enhanced customer experience with their mobile wallet through a multifaceted value proposition. The delivery of its offering requires cooperation from multiple stakeholders and the creation of an ecosystem. Furthermore, they focus on the scalability of their value propositions. Originality / value: The paper offers an applicable business model framework that allows practitioners and academics to study current and future mobile payment approaches.

  6. Business Models for NFC Based Mobile Payments

    DEFF Research Database (Denmark)

    Chae, Johannes Sang-Un; Hedman, Jonas

    2015-01-01

    Purpose: The purpose of the paper is to develop a business model framework for NFC based mobile payment solutions consisting of four mutually interdependent components: the value service, value network, value architecture, and value finance. Design: Using a comparative case study method, the paper investigates Google Wallet and ISIS Mobile Wallet and their underlying business models. Findings: Google Wallet and ISIS Mobile Wallet are focusing on providing an enhanced customer experience with their mobile wallet through a multifaceted value proposition. The delivery of its offering requires cooperation from multiple stakeholders and the creation of an ecosystem. Furthermore, they focus on the scalability of their value propositions. Originality / value: The paper offers an applicable business model framework that allows practitioners and academics to study current and future mobile payment approaches.

  7. Constraints based analysis of extended cybernetic models.

    Science.gov (United States)

    Mandli, Aravinda R; Venkatesh, Kareenhalli V; Modak, Jayant M

    2015-11-01

    The cybernetic modeling framework provides an interesting approach to model the regulatory phenomena occurring in microorganisms. In the present work, we adopt a constraints based approach to analyze the nonlinear behavior of the extended equations of the cybernetic model. We first show that the cybernetic model exhibits linear growth behavior under the constraint of no resource allocation for the induction of the key enzyme. We then quantify the maximum achievable specific growth rate of microorganisms on mixtures of substitutable substrates under various kinds of regulation and show its use in gaining an understanding of the regulatory strategies of microorganisms. Finally, we show that Saccharomyces cerevisiae exhibits suboptimal dynamic growth with a long diauxic lag phase when growing on a mixture of glucose and galactose and discuss its potential to achieve optimal growth with a significantly reduced diauxic lag period. The analysis carried out in the present study illustrates the utility of adopting a constraints based approach to understand the dynamic growth strategies of microorganisms. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  8. INDIVIDUAL BASED MODELLING APPROACH TO THERMAL ...

    Science.gov (United States)

    Diadromous fish populations in the Pacific Northwest face challenges along their migratory routes from declining habitat quality, harvest, and barriers to longitudinal connectivity. Changes in river temperature regimes are producing an additional challenge for upstream migrating adult salmon and steelhead, species that are sensitive to absolute and cumulative thermal exposure. Adult salmon populations have been shown to utilize cold water patches along migration routes when mainstem river temperatures exceed thermal optimums. We are employing an individual based model (IBM) to explore the costs and benefits of spatially-distributed cold water refugia for adult migrating salmon. Our model, developed in the HexSim platform, is built around a mechanistic behavioral decision tree that drives individual interactions with their spatially explicit simulated environment. Population-scale responses to dynamic thermal regimes, coupled with other stressors such as disease and harvest, become emergent properties of the spatial IBM. Other model outputs include arrival times, species-specific survival rates, body energetic content, and reproductive fitness levels. Here, we discuss the challenges associated with parameterizing an individual based model of salmon and steelhead in a section of the Columbia River. Many rivers and streams in the Pacific Northwest are currently listed as impaired under the Clean Water Act as a result of high summer water temperatures. Adverse effec

  9. Néron models and base change

    CERN Document Server

    Halle, Lars Halvard

    2016-01-01

    Presenting the first systematic treatment of the behavior of Néron models under ramified base change, this book can be read as an introduction to various subtle invariants and constructions related to Néron models of semi-abelian varieties, motivated by concrete research problems and complemented with explicit examples. Néron models of abelian and semi-abelian varieties have become an indispensable tool in algebraic and arithmetic geometry since Néron introduced them in his seminal 1964 paper. Applications range from the theory of heights in Diophantine geometry to Hodge theory. We focus specifically on Néron component groups, Edixhoven’s filtration and the base change conductor of Chai and Yu, and we study these invariants using various techniques such as models of curves, sheaves on Grothendieck sites and non-archimedean uniformization. We then apply our results to the study of motivic zeta functions of abelian varieties. The final chapter contains a list of challenging open questions. This book is a...

  10. Inference-based procedural modeling of solids

    KAUST Repository

    Biggers, Keith

    2011-11-01

    As virtual environments become larger and more complex, there is an increasing need for more automated construction algorithms to support the development process. We present an approach for modeling solids by combining prior examples with a simple sketch. Our algorithm uses an inference-based approach to incrementally fit patches together in a consistent fashion to define the boundary of an object. This algorithm samples and extracts surface patches from input models, and develops a Petri net structure that describes the relationship between patches along an imposed parameterization. Then, given a new parameterized line or curve, we use the Petri net to logically fit patches together in a manner consistent with the input model. This allows us to easily construct objects of varying sizes and configurations using arbitrary articulation, repetition, and interchanging of parts. The result of our process is a solid model representation of the constructed object that can be integrated into a simulation-based environment. © 2011 Elsevier Ltd. All rights reserved.

  11. Model-based explanation of plant knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Huuskonen, P.J. [VTT Electronics, Oulu (Finland). Embedded Software

    1997-12-31

    This thesis deals with computer explanation of knowledge related to design and operation of industrial plants. The needs for explanation are motivated through case studies and literature reviews. A general framework for analysing plant explanations is presented. Prototypes demonstrate key mechanisms for implementing parts of the framework. Power plants, steel mills, paper factories, and high energy physics control systems are studied to set requirements for explanation. The main problems are seen to be either lack or abundance of information. Design knowledge in particular is found missing at plants. Support systems and automation should be enhanced with ways to explain plant knowledge to the plant staff. A framework is formulated for analysing explanations of plant knowledge. It consists of three parts: 1. a typology of explanation, organised by the class of knowledge (factual, functional, or strategic) and by the target of explanation (processes, automation, or support systems), 2. an identification of explanation tasks generic for the plant domain, and 3. an identification of essential model types for explanation (structural, behavioural, functional, and teleological). The tasks use the models to create the explanations of the given classes. Key mechanisms are discussed to implement the generic explanation tasks. Knowledge representations based on objects and their relations form a vocabulary to model and present plant knowledge. A particular class of models, means-end models, are used to explain plant knowledge. Explanations are generated through searches in the models. Hypertext is adopted to communicate explanations over dialogue based on context. The results are demonstrated in prototypes. The VICE prototype explains the reasoning of an expert system for diagnosis of rotating machines at power plants. The Justifier prototype explains design knowledge obtained from an object-oriented plant design tool. Enhanced access mechanisms into on-line documentation are

  12. Model-based explanation of plant knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Huuskonen, P J [VTT Electronics, Oulu (Finland). Embedded Software

    1998-12-31

    This thesis deals with computer explanation of knowledge related to design and operation of industrial plants. The needs for explanation are motivated through case studies and literature reviews. A general framework for analysing plant explanations is presented. Prototypes demonstrate key mechanisms for implementing parts of the framework. Power plants, steel mills, paper factories, and high energy physics control systems are studied to set requirements for explanation. The main problems are seen to be either lack or abundance of information. Design knowledge in particular is found missing at plants. Support systems and automation should be enhanced with ways to explain plant knowledge to the plant staff. A framework is formulated for analysing explanations of plant knowledge. It consists of three parts: 1. a typology of explanation, organised by the class of knowledge (factual, functional, or strategic) and by the target of explanation (processes, automation, or support systems), 2. an identification of explanation tasks generic for the plant domain, and 3. an identification of essential model types for explanation (structural, behavioural, functional, and teleological). The tasks use the models to create the explanations of the given classes. Key mechanisms are discussed to implement the generic explanation tasks. Knowledge representations based on objects and their relations form a vocabulary to model and present plant knowledge. A particular class of models, means-end models, are used to explain plant knowledge. Explanations are generated through searches in the models. Hypertext is adopted to communicate explanations over dialogue based on context. The results are demonstrated in prototypes. The VICE prototype explains the reasoning of an expert system for diagnosis of rotating machines at power plants. The Justifier prototype explains design knowledge obtained from an object-oriented plant design tool. Enhanced access mechanisms into on-line documentation are

  13. Model-Based Method for Sensor Validation

    Science.gov (United States)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered to be part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. Therefore, these methods can only predict the most probable faulty sensors, which are subject to the initial probabilities defined for the failures. The method developed in this work is based on a model-based approach and provides the faulty sensors (if any), which can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems where it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concepts of analytical redundant relations (ARRs).
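
    The analytical-redundant-relations idea can be sketched as a set of consistency checks plus simple single-fault isolation logic: a sensor is suspected if it participates in every violated relation. The relations, readings and threshold below are invented, and the logic is far simpler than the algorithm described in the record.

```python
# Sketch of the analytical-redundant-relations (ARR) idea: evaluate consistency
# checks that should hold when all sensors are healthy, then infer which sensor
# could explain the violated checks. The relations, readings and threshold are
# made up; this is not the paper's algorithm.
ARRS = {
    "r1": {"flow_in", "flow_out"},          # conservation: flow_in == flow_out
    "r2": {"flow_in", "level_rate"},        # tank model: d(level)/dt ~ flow_in
    "r3": {"flow_out", "level_rate"},       # tank model written via flow_out
}

def residuals(z):
    return {
        "r1": abs(z["flow_in"] - z["flow_out"]),
        "r2": abs(z["level_rate"] - 0.5 * z["flow_in"]),
        "r3": abs(z["level_rate"] + 0.5 * z["flow_out"] - z["flow_in"]),
    }

readings = {"flow_in": 2.0, "flow_out": 3.1, "level_rate": 1.0}   # flow_out biased
violated = {r for r, v in residuals(readings).items() if v > 0.2}

# A sensor is a candidate fault if it appears in every violated relation
# (simplified single-fault isolation logic).
candidates = [s for s in readings if all(s in ARRS[r] for r in violated)]
print("violated relations:", sorted(violated), "-> candidate faulty sensors:", candidates)
```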

  14. Bacterial diversity shift determined by different diets in the gut of the spotted wing fly Drosophila suzukii is primarily reflected on acetic acid bacteria

    KAUST Repository

    Vacchini, Violetta

    2016-11-25

    The pivotal role of diet in shaping gut microbiota has been evaluated in different animal models, including insects. Drosophila flies harbour an inconstant microbiota among which acetic acid bacteria (AAB) are important components. Here, we investigated the bacterial and AAB components of the invasive pest Drosophila suzukii microbiota, by studying the same insect population separately grown on fruit-based or non-fruit artificial diet. AAB were highly prevalent in the gut under both diets (90 and 92% infection rates with fruits and artificial diet, respectively). Fluorescent in situ hybridization and recolonization experiments with green fluorescent protein (Gfp)-labelled strains showed AAB capability to massively colonize insect gut. High-throughput sequencing on 16S rRNA gene indicated that the bacterial microbiota of guts fed with the two diets clustered separately. By excluding AAB-related OTUs from the analysis, insect bacterial communities did not cluster separately according to the diet, suggesting that diet-based diversification of the community is primarily reflected on the AAB component of the community. Diet influenced also AAB alpha-diversity, with separate OTU distributions based on diets. High prevalence, localization and massive recolonization, together with AAB clustering behaviour in relation to diet, suggest an AAB role in the D. suzukii gut response to diet modification. This article is protected by copyright. All rights reserved.

  15. Physics-based models of the plasmasphere

    Energy Technology Data Exchange (ETDEWEB)

    Jordanova, Vania K [Los Alamos National Laboratory; Pierrard, Viviane [BELGIUM; Goldstein, Jerry [SWRI; André, Nicolas [ESTEC/ESA; Kotova, Galina A [SRI, RUSSIA; Lemaire, Joseph F [BELGIUM; Liemohn, Mike W [U OF MICHIGAN; Matsui, H [UNIV OF NEW HAMPSHIRE

    2008-01-01

    We describe recent progress in physics-based models of the plasmasphere using the fluid and the kinetic approaches. Global modeling of the dynamics and influence of the plasmasphere is presented. Results from global plasmasphere simulations are used to understand and quantify (i) the electric potential pattern and evolution during geomagnetic storms, and (ii) the influence of the plasmasphere on the excitation of electromagnetic ion cyclotron (EMIC) waves and precipitation of energetic ions in the inner magnetosphere. The interactions of the plasmasphere with the ionosphere and the other regions of the magnetosphere are pointed out. We show the results of simulations for the formation of the plasmapause and discuss the influence of plasmaspheric wind and of ultra low frequency (ULF) waves on the transport of plasmaspheric material. Theoretical formulations used to model the electric field and plasma distribution in the plasmasphere are given. Model predictions are compared to recent CLUSTER and IMAGE observations, but also to results of earlier models and satellite observations.

  16. Modeling oil production based on symbolic regression

    International Nuclear Information System (INIS)

    Yang, Guangfei; Li, Xianneng; Wang, Jianliang; Lian, Lian; Ma, Tieju

    2015-01-01

    Numerous models have been proposed to forecast the future trends of oil production and almost all of them are based on some predefined assumptions with various uncertainties. In this study, we propose a novel data-driven approach that uses symbolic regression to model oil production. We validate our approach on both synthetic and real data, and the results prove that symbolic regression could effectively identify the true models beneath the oil production data and also make reliable predictions. Symbolic regression indicates that world oil production will peak in 2021, which broadly agrees with other techniques used by researchers. Our results also show that the rate of decline after the peak is almost half the rate of increase before the peak, and it takes nearly 12 years to drop 4% from the peak. These predictions are more optimistic than those in several other reports, and the smoother decline will provide the world, especially the developing countries, with more time to orchestrate mitigation plans. -- Highlights: •A data-driven approach has been shown to be effective at modeling the oil production. •The Hubbert model could be discovered automatically from data. •The peak of world oil production is predicted to appear in 2021. •The decline rate after peak is half of the increase rate before peak. •Oil production projected to decline 4% post-peak
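
    The record notes that symbolic regression recovered the Hubbert model from production data. As a point of reference for that functional form, the sketch below fits a Hubbert (logistic-derivative) curve to synthetic data with ordinary nonlinear least squares; it is not the authors' symbolic-regression procedure and uses no real production data.

```python
# Fit a Hubbert (logistic-derivative) production curve to synthetic data with
# nonlinear least squares, only to illustrate the functional form that symbolic
# regression reportedly recovered. The data are synthetic and this is not the
# authors' symbolic-regression procedure.
import numpy as np
from scipy.optimize import curve_fit

def hubbert(t, Q, b, t_peak):
    """Annual production for ultimate recovery Q, steepness b, peak year t_peak."""
    e = np.exp(-b * (t - t_peak))
    return Q * b * e / (1.0 + e) ** 2

rng = np.random.default_rng(7)
years = np.arange(1950, 2015)
true = hubbert(years, Q=2500.0, b=0.06, t_peak=2021.0)
observed = true * (1 + 0.05 * rng.standard_normal(years.size))   # add 5% noise

popt, _ = curve_fit(hubbert, years, observed, p0=(2000.0, 0.05, 2010.0))
print("fitted peak year:", round(popt[2], 1))
```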

  17. Grid based calibration of SWAT hydrological models

    Directory of Open Access Journals (Sweden)

    D. Gorgan

    2012-07-01

    Full Text Available The calibration and execution of large hydrological models, such as SWAT (Soil and Water Assessment Tool), developed for large areas, high resolution, and huge input data, require not only long execution times but also high computation resources. The SWAT hydrological model supports studies and predictions of the impact of land management practices on water, sediment, and agricultural chemical yields in complex watersheds. The paper presents the gSWAT application as a practical web-based solution for environmental specialists to calibrate extensive hydrological models and to run scenarios, by hiding the complex control of processes and heterogeneous resources across the grid-based high-computation infrastructure. The paper highlights the basic functionalities of the gSWAT platform and the features of the graphical user interface. The presentation is concerned with the development of working sessions, interactive control of calibration, direct and basic editing of parameters, process monitoring, and graphical and interactive visualization of the results. The experiments performed on different SWAT models and the obtained results demonstrate the benefits brought by the grid parallel and distributed environment as a solution for the processing platform. All the instances of SWAT models used in the reported experiments have been developed through the enviroGRIDS project, targeting the Black Sea catchment area.

  18. Mars 2020 Model Based Systems Engineering Pilot

    Science.gov (United States)

    Dukes, Alexandra Marie

    2017-01-01

    The pilot study is led by the Integration Engineering group in NASA's Launch Services Program (LSP). The Integration Engineering (IE) group is responsible for managing the interfaces between the spacecraft and launch vehicle. This pilot investigates the utility of Model-Based Systems Engineering (MBSE) with respect to managing and verifying interface requirements. The main objectives of the pilot are to model several key aspects of the Mars 2020 integrated operations and interface requirements based on the design and verification artifacts from Mars Science Laboratory (MSL) and to demonstrate how MBSE could be used by LSP to gain further insight on the interface between the spacecraft and launch vehicle as well as to enhance how LSP manages the launch service. The method used to accomplish this pilot started through familiarization of SysML, MagicDraw, and the Mars 2020 and MSL systems through books, tutorials, and NASA documentation. MSL was chosen as the focus of the model since its processes and verifications translate easily to the Mars 2020 mission. The study was further focused by modeling specialized systems and processes within MSL in order to demonstrate the utility of MBSE for the rest of the mission. The systems chosen were the In-Flight Disconnect (IFD) system and the Mass Properties process. The IFD was chosen as a system of focus since it is an interface between the spacecraft and launch vehicle which can demonstrate the usefulness of MBSE from a system perspective. The Mass Properties process was chosen as a process of focus since the verifications for mass properties occur throughout the lifecycle and can demonstrate the usefulness of MBSE from a multi-discipline perspective. Several iterations of both perspectives have been modeled and evaluated. While the pilot study will continue for another 2 weeks, pros and cons of using MBSE for LSP IE have been identified. A pro of using MBSE includes an integrated view of the disciplines, requirements, and

  19. Nitric oxide circulates in mammalian plasma primarily as an S-nitroso adduct of serum albumin.

    Science.gov (United States)

    Stamler, J S; Jaraki, O; Osborne, J; Simon, D I; Keaney, J; Vita, J; Singel, D; Valeri, C R; Loscalzo, J

    1992-01-01

    We have recently shown that nitric oxide or authentic endothelium-derived relaxing factor generated in a biologic system reacts in the presence of specific protein thiols to form S-nitrosoprotein derivatives that have endothelium-derived relaxing factor-like properties. The single free cysteine of serum albumin, Cys-34, is particularly reactive toward nitrogen oxides (most likely nitrosonium ion) under physiologic conditions, primarily because of its anomalously low pK; given its abundance in plasma, where it accounts for approximately 0.5 mM thiol, we hypothesized that this plasma protein serves as a reservoir for nitric oxide produced by the endothelial cell. To test this hypothesis, we developed a methodology, which involves UV photolytic cleavage of the S--NO bond before reaction with ozone for chemiluminescence detection, with which to measure free nitric oxide, S-nitrosothiols, and S-nitrosoproteins in biologic systems. We found that human plasma contains approximately 7 microM S-nitrosothiols, of which 96% are S-nitrosoproteins, 82% of which is accounted for by S-nitroso-serum albumin. By contrast, plasma levels of free nitric oxide are only in the 3-nM range. In rabbits, plasma S-nitrosothiols are present at approximately 1 microM; 60 min after administration of NG-monomethyl-L-arginine at 50 mg/ml, a selective and potent inhibitor of nitric oxide synthetases, S-nitrosothiols decreased by approximately 40% (greater than 95% of which were accounted for by S-nitrosoproteins, and approximately 80% of which was S-nitroso-serum albumin); this decrease was accompanied by a concomitant increase in mean arterial blood pressure of 22%. These data suggest that naturally produced nitric oxide circulates in plasma primarily complexed in S-nitrosothiol species, principal among which is S-nitroso-serum albumin. This abundant, relatively long-lived adduct likely serves as a reservoir with which plasma levels of highly reactive, short-lived free nitric oxide can be

  20. The development and characterization of a primarily mineral calcium phosphate - poly(epsilon-caprolactone) biocomposite

    Science.gov (United States)

    Dunkley, Ian Robert

    Orthopaedic reconstruction often involves the surgical introduction of structural implants that provide for rigid fixation, skeletal stabilization, and bone integration. The high stresses incurred by these implanted devices have historically limited material choices to metallic and select polymeric formulations. While mechanical requirements are achieved, these non-degradable materials do not participate actively in the remodeling of the skeleton and present the possibility of long-term failure or rejection. This is particularly relevant in cervical fusion, an orthopaedic procedure to treat damaged, degenerative or diseased intervertebral discs. A significant improvement on the available synthetic bone replacement/regeneration options for implants to treat these conditions in the cervical spine may be achieved with the development of primarily mineral biocomposites comprised of a bioactive ceramic matrix reinforced with a biodegradable polymer. Such a biocomposite may be engineered to possess the clinically required mechanical properties of a particular application, while maintaining the ability to be remodeled completely by the body. A biocomposite of Si-doped calcium phosphate (Si-CaP) and poly(epsilon-caprolactone) (PCL) was developed for application as such a synthetic bone material for potential use as a fusion device in the cervical spine. In this thesis, a method for preparing high-mineral-content Si-CaP/PCL biocomposites with interpenetrating mineral and polymer matrices is demonstrated, along with the effects of the various preparation parameters on the biocomposite density, porosity, and mechanical properties. This new technique, by which dense, primarily ceramic Si-CaP/PCL biocomposites were prepared, allowed for the incorporation of mineral contents ranging between 45 and 97 vol%. Polymer infiltration, accomplished solely by passive capillary uptake over several days, was found to be capable of fully infiltrating the microporosity

  1. Mathematical modeling of acid-base physiology.

    Science.gov (United States)

    Occhipinti, Rossana; Boron, Walter F

    2015-01-01

    pH is one of the most important parameters in life, influencing virtually every biological process at the cellular, tissue, and whole-body level. Thus, for cells, it is critical to regulate intracellular pH (pHi) and, for multicellular organisms, to regulate extracellular pH (pHo). pHi regulation depends on the opposing actions of plasma-membrane transporters that tend to increase pHi, and others that tend to decrease pHi. In addition, passive fluxes of uncharged species (e.g., CO2, NH3) and charged species (e.g., HCO3(-), [Formula: see text] ) perturb pHi. These movements not only influence one another, but also perturb the equilibria of a multitude of intracellular and extracellular buffers. Thus, even at the level of a single cell, perturbations in acid-base reactions, diffusion, and transport are so complex that it is impossible to understand them without a quantitative model. Here we summarize some mathematical models developed to shed light on the complex interconnected events triggered by acid-base movements. We then describe a mathematical model of a spherical cell (to our knowledge the first one capable of handling a multitude of buffer reactions) that our team has recently developed to simulate changes in pHi and pHo caused by movements of acid-base equivalents across the plasma membrane of a Xenopus oocyte. Finally, we extend our work to a consideration of the effects of simultaneous CO2 and HCO3(-) influx into a cell, and envision how future models might extend to other cell types (e.g., erythrocytes) or tissues (e.g., renal proximal-tubule epithelium) important for whole-body pH homeostasis. Copyright © 2015 Elsevier Ltd. All rights reserved.
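    As a small, self-contained example of the kind of equilibrium relation such models embed, the sketch below evaluates the Henderson-Hasselbalch equation for the CO2/HCO3- buffer pair using standard textbook constants; it is only an illustrative fragment, not the spherical-cell model described in the abstract.

```python
import math

def plasma_ph(hco3_mM, pco2_mmHg, pKa=6.1, co2_solubility=0.0301):
    """Henderson-Hasselbalch estimate of pH for the CO2/HCO3- buffer pair.
    pKa and the CO2 solubility coefficient (mM per mmHg) are standard
    textbook values for plasma at 37 degrees C."""
    dissolved_co2 = co2_solubility * pco2_mmHg        # mM of dissolved CO2
    return pKa + math.log10(hco3_mM / dissolved_co2)

print(round(plasma_ph(24.0, 40.0), 2))   # ~7.40 for normal arterial values
```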

  2. Image-Based Models Using Crowdsourcing Strategy

    Directory of Open Access Journals (Sweden)

    Antonia Spanò

    2016-12-01

    Full Text Available The conservation and valorization of Cultural Heritage require extensive documentation, both in properly historic-artistic terms and regarding the physical characteristics of position, shape, color, and geometry. With the use of digital photogrammetry, which makes it possible to acquire overlapping images for 3D photo modeling, and with the development of dense and accurate 3D point models, it is possible to obtain high-resolution orthoprojections of surfaces. Recent years have seen a growing interest in the role crowdsourcing can play in the protection and dissemination of cultural heritage; in parallel, there is an increasing awareness that the immense wealth of images available on the web can contribute to the generation of digital models useful for heritage documentation. In this way, the availability and ease of automation of SfM (Structure from Motion) algorithms enable the generation of digital models of the built heritage, which can be inserted positively into crowdsourcing processes. In fact, non-expert users can handle the technology in the acquisition process, which today is one of the fundamental points for involving the wider public in cultural heritage protection. To present the image-based models and their derivatives that can be made from such a great digital resource, an emblematic case study of little-known and not easily accessible heritage was selected: the Vank Cathedral in Isfahan, Iran. The availability of accurate point clouds and reliable orthophotos is very convenient, since the internal surfaces of this Safavid-epoch building (17th-18th centuries) are completely frescoed, and in them the architecture and especially the architectural decoration reach their peak. The experimental part of the paper also explores some aspects of the usability of the digital output from the image-based modeling methods. The availability of orthophotos allows and facilitates the iconographic
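    As a hedged illustration of the automated SfM workflow such image-based modeling relies on, the sketch below matches SIFT features between two overlapping photographs and recovers their relative pose with OpenCV; the image filenames and the intrinsic matrix K are placeholders, and a full pipeline (bundle adjustment, dense matching, orthophoto generation) is handled by dedicated SfM packages rather than this fragment.

```python
# Requires opencv-python >= 4.4 (SIFT included in the main module).
import cv2
import numpy as np

img1 = cv2.imread("view_01.jpg", cv2.IMREAD_GRAYSCALE)   # placeholder image paths
img2 = cv2.imread("view_02.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Ratio-test matching of SIFT descriptors between the two overlapping views.
matcher = cv2.BFMatcher()
good = [m for m, n in matcher.knnMatch(des1, des2, k=2) if m.distance < 0.75 * n.distance]

pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# Assumed camera intrinsics; in practice these come from calibration or EXIF data.
K = np.array([[3000.0, 0.0, 2000.0], [0.0, 3000.0, 1500.0], [0.0, 0.0, 1.0]])
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
print("relative rotation:\n", R, "\ntranslation direction:", t.ravel())
```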

  3. Identification of a novel CoA synthase isoform, which is primarily expressed in Brain

    International Nuclear Information System (INIS)

    Nemazanyy, Ivan; Panasyuk, Ganna; Breus, Oksana; Zhyvoloup, Alexander; Filonenko, Valeriy; Gout, Ivan T.

    2006-01-01

    CoA and its derivatives acetyl-CoA and acyl-CoA are important players in cellular metabolism and signal transduction. CoA synthase is a bifunctional enzyme which mediates the final stages of CoA biosynthesis. In previous studies, we have reported molecular cloning, biochemical characterization, and subcellular localization of CoA synthase (CoASy). Here, we describe the existence of a novel CoA synthase isoform, which is the product of alternative splicing and possesses a 29 aa extension at the N-terminus. We termed it CoASy β and the originally identified CoA synthase, CoASy α. The transcript specific for CoASy β was identified by electronic screening and by RT-PCR analysis of various rat tissues. The existence of this novel isoform was further confirmed by immunoblot analysis with antibodies directed to the N-terminal peptide of CoASy β. In contrast to CoASy α, which shows ubiquitous expression, CoASy β is primarily expressed in brain. Using confocal microscopy, we demonstrated that both isoforms are localized on mitochondria. The N-terminal extension does not affect the activity of CoA synthase, but possesses a proline-rich sequence which can bring the enzyme into complexes with signalling proteins containing SH3 or WW domains. The role of this novel isoform in CoA biosynthesis, especially in brain, requires further elucidation.

  4. Model-based optimization biofilm based systems performing autotrophic nitrogen removal using the comprehensive NDHA model

    DEFF Research Database (Denmark)

    Valverde Pérez, Borja; Ma, Yunjie; Morset, Martin

    Completely autotrophic nitrogen removal (CANR) can be obtained in single-stage biofilm-based bioreactors. However, their environmental footprint is compromised due to elevated N2O emissions. We developed a novel spatially explicit biochemical process model of biofilm-based CANR systems that predicts...

  5. Circulation-based Modeling of Gravity Currents

    Science.gov (United States)

    Meiburg, E. H.; Borden, Z.

    2013-05-01

    Atmospheric and oceanic flows driven by predominantly horizontal density differences, such as sea breezes, thunderstorm outflows, powder snow avalanches, and turbidity currents, are frequently modeled as gravity currents. Efforts to develop simplified models of such currents date back to von Karman (1940), who considered a two-dimensional gravity current in an inviscid, irrotational and infinitely deep ambient. Benjamin (1968) presented an alternative model, focusing on the inviscid, irrotational flow past a gravity current in a finite-depth channel. More recently, Shin et al. (2004) proposed a model for gravity currents generated by partial-depth lock releases, considering a control volume that encompasses both fronts. All of the above models, in addition to the conservation of mass and horizontal momentum, invoke Bernoulli's law along some specific streamline in the flow field, in order to obtain a closed system of equations that can be solved for the front velocity as a function of the current height. More recent computational investigations based on the Navier-Stokes equations, on the other hand, reproduce the dynamics of gravity currents based on the conservation of mass and momentum alone. We propose that it should therefore be possible to formulate a fundamental gravity current model without invoking Bernoulli's law. The talk will show that the front velocity of gravity currents can indeed be predicted as a function of their height from mass and momentum considerations alone, by considering the evolution of interfacial vorticity. This approach does not require information on the pressure field and therefore avoids the need for an energy closure argument such as those invoked by the earlier models. Predictions by the new theory are shown to be in close agreement with direct numerical simulation results. References Von Karman, T. 1940 The engineer grapples with nonlinear problems, Bull. Am. Math Soc. 46, 615-683. Benjamin, T.B. 1968 Gravity currents and related
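    As a concrete anchor for the scaling these models share, the sketch below evaluates the classical front-speed relation U = Fr * sqrt(g'h); the example densities and depth are arbitrary, and Fr = sqrt(2) corresponds to the deep-ambient limit common to the von Karman and Benjamin results.

```python
import math

def front_speed(rho_current, rho_ambient, height_m, froude=math.sqrt(2)):
    """Front speed U = Fr * sqrt(g' * h) with reduced gravity g' = g * drho/rho."""
    g_reduced = 9.81 * (rho_current - rho_ambient) / rho_ambient
    return froude * math.sqrt(g_reduced * height_m)

# Example: a saline current 2% denser than the ambient fluid, 0.1 m deep.
print(round(front_speed(1020.0, 1000.0, 0.1), 3), "m/s")
```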

  6. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae; Top, Søren

    2008-01-01

    , communication and constraints, using computational blocks and aggregates for both discrete and continuous behaviour, different interconnection and execution disciplines for event-based and time-based controllers, and so on, to encompass the demands to more functionality, at even lower prices, and with opposite...... to be analyzed. One way of doing that is to integrate in wrapper files the model back into Simulink S-functions, and use its extensive simulation features, thus allowing an early exploration of the possible design choices over multiple disciplines. The paper describes a safe translation of a restricted set...... of MATLAB/Simulink blocks to COMDES software components, both for continuous and discrete behaviour, and the transformation of the software system into the S-functions. The general aim of this work is the improvement of multi-disciplinary development of embedded systems with the focus on the relation...

  7. Model Based Autonomy for Robust Mars Operations

    Science.gov (United States)

    Kurien, James A.; Nayak, P. Pandurang; Williams, Brian C.; Lau, Sonie (Technical Monitor)

    1998-01-01

    Space missions have historically relied upon a large ground staff, numbering in the hundreds for complex missions, to maintain routine operations. When an anomaly occurs, this small army of engineers attempts to identify and work around the problem. A piloted Mars mission, with its multiyear duration, cost pressures, half-hour communication delays and two-week blackouts, cannot be closely controlled by a battalion of engineers on Earth. Flight crew involvement in routine system operations must also be minimized to maximize science return. It also may be unrealistic to require that the crew have the expertise in each mission subsystem needed to diagnose a system failure and effect a timely repair, as engineers did for Apollo 13. Enter model-based autonomy, which allows complex systems to autonomously maintain operation despite failures or anomalous conditions, contributing to safe, robust, and minimally supervised operation of spacecraft, life support, In Situ Resource Utilization (ISRU) and power systems. Autonomous reasoning is central to the approach. A reasoning algorithm uses a logical or mathematical model of a system to infer how to operate the system, diagnose failures and generate appropriate behavior to repair or reconfigure the system in response. The 'plug and play' nature of the models enables low-cost development of autonomy for multiple platforms. Declarative, reusable models capture relevant aspects of the behavior of simple devices (e.g. valves or thrusters). Reasoning algorithms combine device models to create a model of the system-wide interactions and behavior of a complex, unique artifact such as a spacecraft. Rather than requiring engineers to anticipate all possible interactions and failures at design time or to perform analysis during the mission, the reasoning engine generates the appropriate response to the current situation, taking into account its system-wide knowledge, the current state, and even sensor failures or unexpected behavior.
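    As a hedged illustration of the reasoning step described above, the sketch below performs a toy consistency-based diagnosis over declarative valve models; the two-valve propulsion line, its commands, and the observation are invented for illustration and are not drawn from the mission systems discussed in the abstract.

```python
from itertools import combinations, product

COMPONENTS = ["valve_A", "valve_B"]

def consistent(faulty, commanded_open, observed_flow):
    """True if some behavior of the faulty valves reproduces the observation,
    given that healthy valves in the series line behave exactly as commanded."""
    for faulty_states in product([True, False], repeat=len(faulty)):
        actual = dict(commanded_open)                      # healthy valves obey commands
        actual.update(dict(zip(faulty, faulty_states)))    # faulty valves: any state
        flow = all(actual[c] for c in COMPONENTS)          # series line: all must be open
        if flow == observed_flow:
            return True
    return False

def diagnose(commanded_open, observed_flow):
    for size in range(len(COMPONENTS) + 1):                # prefer minimal fault sets
        found = [set(f) for f in combinations(COMPONENTS, size)
                 if consistent(f, commanded_open, observed_flow)]
        if found:
            return found
    return []

# Both valves commanded open, yet no flow is observed downstream:
print(diagnose({"valve_A": True, "valve_B": True}, observed_flow=False))
# -> [{'valve_A'}, {'valve_B'}]  (either valve stuck closed explains the data)
```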

  8. Diet Quality and Nutrient Intake of Urban Overweight and Obese Primarily African American Older Adults with Osteoarthritis

    Directory of Open Access Journals (Sweden)

    Sevasti Vergis

    2018-04-01

    Full Text Available Diet quality may be a unique target for preventing and managing obesity-related osteoarthritis (OA). Using the Healthy Eating Index-2010 (HEI-2010), this study examined the nutrient intake and diet quality of 400 urban overweight and obese primarily African American older adults with self-reported lower extremity OA. Associations between sociodemographic and health-related factors and diet quality were explored. Participants (mean age 67.8 years, SD 5.9) were included. Habitual dietary intake was assessed using a food frequency questionnaire (FFQ). Nutrient intake and diet quality were calculated from the FFQ. Results indicated that diet quality needs improvement (HEI-2010: 66.3, SD 10.5). Age, body mass index, employment (multivariable model only), and OA severity (bivariate model only) were significant predictors of HEI-2010 total score in linear models. Mean intakes for fiber, calcium, and vitamin D were below recommendations, while percentage of calories as total fat exceeded recommendations. These findings can inform future dietary intervention trials and public health messaging for a sub-population at a high risk for obesity-related OA.

  9. Human physiologically based pharmacokinetic model for propofol

    Directory of Open Access Journals (Sweden)

    Schnider Thomas W

    2005-04-01

    Full Text Available Abstract Background Propofol is widely used for both short-term anesthesia and long-term sedation. It has unusual pharmacokinetics because of its high lipid solubility. The standard approach to describing the pharmacokinetics is by a multi-compartmental model. This paper presents the first detailed human physiologically based pharmacokinetic (PBPK) model for propofol. Methods PKQuest, a freely distributed software routine (http://www.pkquest.com), was used for all the calculations. The "standard human" PBPK parameters developed in previous applications are used. It is assumed that the blood and tissue binding is determined by simple partition into the tissue lipid, which is characterized by two previously determined sets of parameters: (1) the value of the propofol oil/water partition coefficient; (2) the lipid fraction in the blood and tissues. The model was fit to the individual experimental data of Schnider et al., Anesthesiology, 1998; 88:1170, in which an initial bolus dose was followed 60 minutes later by a one hour constant infusion. Results The PBPK model provides a good description of the experimental data over a large range of input dosage, subject age and fat fraction. Only one adjustable parameter (the liver clearance) is required to describe the constant infusion phase for each individual subject. In order to fit the bolus injection phase, for 10 of the 24 subjects it was necessary to assume that a fraction of the bolus dose was sequestered and then slowly released from the lungs (characterized by two additional parameters). The average weighted residual error (WRE) of the PBPK model fit to both the bolus and infusion phases was 15%, similar to the WRE for just the constant infusion phase obtained by Schnider et al. using a 6-parameter NONMEM compartmental model. Conclusion A PBPK model using standard human parameters and a simple description of tissue binding provides a good description of human propofol kinetics. The major advantage of a
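    For readers unfamiliar with the structure of such models, the following minimal sketch integrates a flow-limited PBPK system after a bolus input; the compartments, volumes, flows, partition coefficients, and dose are invented placeholders and are not the PKQuest propofol parameterization.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative three-tissue, flow-limited PBPK structure (all values made up).
V = {"blood": 5.0, "fat": 12.0, "muscle": 30.0, "liver": 1.8}   # volumes, L
Q = {"fat": 0.3, "muscle": 1.0, "liver": 1.5}                   # blood flows, L/min
P = {"fat": 12.0, "muscle": 3.0, "liver": 4.0}                  # tissue:blood partition
CL_liver = 1.5                                                  # hepatic clearance, L/min

def pbpk(t, y):
    cb, cf, cm, cl = y                                          # concentrations, mg/L
    dcf = Q["fat"]    * (cb - cf / P["fat"])    / V["fat"]
    dcm = Q["muscle"] * (cb - cm / P["muscle"]) / V["muscle"]
    dcl = (Q["liver"] * (cb - cl / P["liver"]) - CL_liver * cl / P["liver"]) / V["liver"]
    dcb = (Q["fat"] * (cf / P["fat"] - cb) + Q["muscle"] * (cm / P["muscle"] - cb)
           + Q["liver"] * (cl / P["liver"] - cb)) / V["blood"]
    return [dcb, dcf, dcm, dcl]

bolus_mg = 140.0
y0 = [bolus_mg / V["blood"], 0.0, 0.0, 0.0]            # bolus mixed into blood
sol = solve_ivp(pbpk, (0.0, 120.0), y0, t_eval=np.linspace(0, 120, 7))
print(np.round(sol.y[0], 3))                           # blood concentration over 2 h
```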

  10. Model-based Prognostics with Concurrent Damage Progression Processes

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based prognostics approaches rely on physics-based models that describe the behavior of systems and their components. These models must account for the several...

  11. Particle-based model for skiing traffic.

    Science.gov (United States)

    Holleczek, Thomas; Tröster, Gerhard

    2012-05-01

    We develop and investigate a particle-based model for ski slope traffic. Skiers are modeled as particles with a mass that are exposed to social and physical forces, which define the riding behavior of skiers during their descents on ski slopes. We also report position and speed data of 21 skiers recorded with GPS-equipped cell phones on two ski slopes. A comparison of these data with the trajectories resulting from computer simulations of our model shows a good correspondence. A study of the relationship among the density, speed, and flow of skiers reveals that congestion does not occur even with arrival rates of skiers exceeding the maximum ski lift capacity. In a sensitivity analysis, we identify the kinetic friction coefficient of skis on snow, the skier mass, the range of repelling social forces, and the arrival rate of skiers as the crucial parameters influencing the simulation results. Our model allows for the prediction of speed zones and skier densities on ski slopes, which is important in the prevention of skiing accidents.
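    A rough sketch of the kind of particle update such a model performs is shown below: gravity along the slope, kinetic friction, air drag, and a pairwise repelling "social" force are integrated with a simple Euler step. All parameter values are illustrative and not those reported in the paper.

```python
import numpy as np

N, dt, steps = 30, 0.1, 200
mass, mu, g = 75.0, 0.05, 9.81             # kg, kinetic friction coefficient, gravity
slope = np.radians(15)                     # slope angle
drag = 0.5                                 # lumped air-drag coefficient (kg/m), made up
repel_strength, repel_range = 40.0, 3.0    # social-force parameters, made up

rng = np.random.default_rng(0)
pos = rng.uniform([0.0, 0.0], [30.0, 5.0], size=(N, 2))   # x across slope, y downhill
vel = np.zeros((N, 2))

for _ in range(steps):
    force = np.zeros((N, 2))
    force[:, 1] += mass * g * np.sin(slope)                    # gravity along the slope
    speed = np.linalg.norm(vel, axis=1, keepdims=True) + 1e-9
    force -= mu * mass * g * np.cos(slope) * vel / speed       # kinetic friction
    force -= drag * speed * vel                                # quadratic air drag
    for i in range(N):                                         # pairwise social repulsion
        d = pos[i] - pos
        dist = np.linalg.norm(d, axis=1, keepdims=True) + 1e-9
        force[i] += (d / dist * repel_strength * np.exp(-dist / repel_range)).sum(axis=0)
    vel += force / mass * dt
    pos += vel * dt

print("mean downhill speed after %.0f s: %.1f m/s" % (steps * dt, vel[:, 1].mean()))
```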

  12. Diminishing musyarakah investment model based on equity

    Science.gov (United States)

    Jaffar, Maheran Mohd; Zain, Shaharir Mohamad; Jemain, Abdul Aziz

    2017-11-01

    Most of the mudharabah and musyarakah contract funds are involved in debt financing. This does not support the theory that a profit-sharing contract is better than debt financing due to the sharing of risks and ownership of equity. Indeed, it is believed that Islamic banking is a financial model based on equity, or musyarakah, which emphasizes the sharing of risks, profit and loss in the investment between the investor and entrepreneur. The focus of this paper is to introduce a mathematical model that internalizes diminishing musyarakah, the sharing of profit and equity between entrepreneur and investor. The entrepreneur makes monthly deferred payments to buy out the equity that belongs to the investor (bank), where at the end of the specified period the entrepreneur owns the business and the investor (bank) exits the joint venture. The model is able to calculate the amount of equity at any time for both parties and hence would be a guide in helping to estimate the value of the investment should the entrepreneur or investor exit before the end of the specified period. The model is closer to the Islamic principles of justice and fairness.
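    To make the mechanics concrete, the sketch below computes a simple diminishing musyarakah schedule in which monthly profit is shared in proportion to current ownership while the entrepreneur buys equal equity units each month; the figures are invented and the payment structure is a simplification, not the paper's model.

```python
# Illustrative diminishing musyarakah schedule (all figures invented).
venture_value = 120_000.0      # total equity of the joint venture
bank_share = 0.80              # bank initially funds 80%
months = 24
monthly_profit = 2_000.0
buyout_per_month = bank_share * venture_value / months   # equal equity units

for month in range(1, months + 1):
    bank_profit = monthly_profit * bank_share             # profit shared by ownership
    entrepreneur_profit = monthly_profit - bank_profit
    # Entrepreneur pays for one equity unit, reducing the bank's share.
    bank_share -= buyout_per_month / venture_value
    if month % 6 == 0:
        print(f"month {month:2d}: bank share {bank_share:5.1%}, "
              f"bank profit this month {bank_profit:7.2f}")
```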

  13. New global ICT-based business models

    DEFF Research Database (Denmark)

    The New Global Business model (NEWGIBM) book describes the background, theory references, case studies, results and learning imparted by the NEWGIBM project, which is supported by ICT, to a research group during the period from 2005-2011. The book is a result of the efforts and the collaborative ...... The NEWGIBM Cases Show? The Strategy Concept in Light of the Increased Importance of Innovative Business Models Successful Implementation of Global BM Innovation Globalisation Of ICT Based Business Models: Today And In 2020......The New Global Business model (NEWGIBM) book describes the background, theory references, case studies, results and learning imparted by the NEWGIBM project, which is supported by ICT, to a research group during the period from 2005-2011. The book is a result of the efforts and the collaborative....... The NEWGIBM book serves as a part of the final evaluation and documentation of the NEWGIBM project and is supported by results from the following projects: M-commerce, Global Innovation, Global Ebusiness & M-commerce, The Blue Ocean project, International Center for Innovation and Women in Business, NEFFICS...

  14. CONFIRMING THE PRIMARILY SMOOTH STRUCTURE OF THE VEGA DEBRIS DISK AT MILLIMETER WAVELENGTHS

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, A. Meredith; Plambeck, Richard; Chiang, Eugene [Department of Astronomy, University of California, Berkeley, CA 94720 (United States); Wilner, David J.; Andrews, Sean M. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Mason, Brian [National Radio Astronomy Observatory, 520 Edgemont Road, Charlottesville, VA 22903-2475 (United States); Carpenter, John M. [California Institute of Technology, Department of Astronomy, MC 105-24, Pasadena, CA 91125 (United States); Chiang, Hsin-Fang [Institute for Astronomy, University of Hawaii, 640 North Aohoku Place, Hilo, HI 96720 (United States); Williams, Jonathan P. [Institute for Astronomy, University of Hawaii, 2680 Woodlawn Drive, Honolulu, HI 96822 (United States); Hales, Antonio [Joint ALMA Observatory, Av. El Golf 40, Piso 18, Santiago (Chile); Su, Kate [Steward Observatory, University of Arizona, 933 North Cherry Avenue, Tucson, AZ 85721 (United States); Dicker, Simon; Korngut, Phil; Devlin, Mark, E-mail: mhughes@astro.berkeley.edu [Department of Physics and Astronomy, University of Pennsylvania, 209 South 33rd Street, Philadelphia, PA 19104 (United States)

    2012-05-01

    Clumpy structure in the debris disk around Vega has been previously reported at millimeter wavelengths and attributed to concentrations of dust grains trapped in resonances with an unseen planet. However, recent imaging at similar wavelengths with higher sensitivity has disputed the observed structure. We present three new millimeter-wavelength observations that help to resolve the puzzling and contradictory observations. We have observed the Vega system with the Submillimeter Array (SMA) at a wavelength of 880 μm and an angular resolution of 5''; with the Combined Array for Research in Millimeter-wave Astronomy (CARMA) at a wavelength of 1.3 mm and an angular resolution of 5''; and with the Green Bank Telescope (GBT) at a wavelength of 3.3 mm and angular resolution of 10''. Despite high sensitivity and short baselines, we do not detect the Vega debris disk in either of the interferometric data sets (SMA and CARMA), which should be sensitive at high significance to clumpy structure based on previously reported observations. We obtain a marginal (3σ) detection of disk emission in the GBT data; the spatial distribution of the emission is not well constrained. We analyze the observations in the context of several different models, demonstrating that the observations are consistent with a smooth, broad, axisymmetric disk with inner radius 20-100 AU and width ∼> 50 AU. The interferometric data require that at least half of the 860 μm emission detected by previous single-dish observations with the James Clerk Maxwell Telescope be distributed axisymmetrically, ruling out strong contributions from flux concentrations on spatial scales of ∼<100 AU. These observations support recent results from the Plateau de Bure Interferometer indicating that previous detections of clumpy structure in the Vega debris disk were spurious.

  15. CONFIRMING THE PRIMARILY SMOOTH STRUCTURE OF THE VEGA DEBRIS DISK AT MILLIMETER WAVELENGTHS

    International Nuclear Information System (INIS)

    Hughes, A. Meredith; Plambeck, Richard; Chiang, Eugene; Wilner, David J.; Andrews, Sean M.; Mason, Brian; Carpenter, John M.; Chiang, Hsin-Fang; Williams, Jonathan P.; Hales, Antonio; Su, Kate; Dicker, Simon; Korngut, Phil; Devlin, Mark

    2012-01-01

    Clumpy structure in the debris disk around Vega has been previously reported at millimeter wavelengths and attributed to concentrations of dust grains trapped in resonances with an unseen planet. However, recent imaging at similar wavelengths with higher sensitivity has disputed the observed structure. We present three new millimeter-wavelength observations that help to resolve the puzzling and contradictory observations. We have observed the Vega system with the Submillimeter Array (SMA) at a wavelength of 880 μm and an angular resolution of 5''; with the Combined Array for Research in Millimeter-wave Astronomy (CARMA) at a wavelength of 1.3 mm and an angular resolution of 5''; and with the Green Bank Telescope (GBT) at a wavelength of 3.3 mm and angular resolution of 10''. Despite high sensitivity and short baselines, we do not detect the Vega debris disk in either of the interferometric data sets (SMA and CARMA), which should be sensitive at high significance to clumpy structure based on previously reported observations. We obtain a marginal (3σ) detection of disk emission in the GBT data; the spatial distribution of the emission is not well constrained. We analyze the observations in the context of several different models, demonstrating that the observations are consistent with a smooth, broad, axisymmetric disk with inner radius 20-100 AU and width ∼> 50 AU. The interferometric data require that at least half of the 860 μm emission detected by previous single-dish observations with the James Clerk Maxwell Telescope be distributed axisymmetrically, ruling out strong contributions from flux concentrations on spatial scales of ∼<100 AU. These observations support recent results from the Plateau de Bure Interferometer indicating that previous detections of clumpy structure in the Vega debris disk were spurious.

  16. Activity based costing model for inventory valuation

    Directory of Open Access Journals (Sweden)

    Vineet Chouhan

    2017-03-01

    Full Text Available Activity-based costing (ABC) is used to significantly improve overhead accounting systems by providing the best information required for managerial decisions. This paper discusses the applicability of the ABC technique to inventory valuation as a management accounting innovation. In order to prove the applicability of ABC for inventory control, a material-driven, medium-sized, privately owned company from the engineering (iron and steel) industry is selected, and by analysis of its production process, its material dependency, and its use of indirect inventory, an ABC model is explored for better inventory control. The case revealed that the necessity of ABC in the area of inventory control is significant. The company is not only able to increase the quality of its decisions but can also analyze in detail its direct material cost and the valuation of direct material, and use the implications for better decision making.
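    As a minimal illustration of the allocation logic behind ABC (not the company data analyzed in the paper), the sketch below converts overhead cost pools into activity rates and drives them down to two hypothetical products.

```python
# All cost pools, drivers, products, and quantities below are invented.
cost_pools = {"machine_setups": 60_000.0, "material_handling": 45_000.0,
              "quality_inspection": 30_000.0}
driver_volume = {"machine_setups": 300,          # number of setups
                 "material_handling": 9_000,     # material moves
                 "quality_inspection": 1_500}    # inspection hours

# Activity rate = cost pool / total driver volume for that activity.
activity_rate = {a: cost_pools[a] / driver_volume[a] for a in cost_pools}

products = {
    "steel_bracket": {"machine_setups": 40, "material_handling": 1_200,
                      "quality_inspection": 150, "units": 10_000},
    "cast_housing":  {"machine_setups": 90, "material_handling": 2_500,
                      "quality_inspection": 400, "units": 4_000},
}

for name, usage in products.items():
    overhead = sum(activity_rate[a] * usage[a] for a in cost_pools)
    print(f"{name}: overhead {overhead:,.0f} total, "
          f"{overhead / usage['units']:.2f} per unit")
```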

  17. Reputation based security model for android applications

    OpenAIRE

    Tesfay, Welderufael Berhane; Booth, Todd; Andersson, Karl

    2012-01-01

    The market for smart phones has been booming in the past few years. There are now over 400,000 applications on the Android market. Over 10 billion Android applications have been downloaded from the Android market. Due to the Android popularity, there are now a large number of malicious vendors targeting the platform. Many honest end users are being successfully hacked on a regular basis. In this work, a cloud based reputation security model has been proposed as a solution which greatly mitiga...

  18. Average Nuclear properties based on statistical model

    International Nuclear Information System (INIS)

    El-Jaick, L.J.

    1974-01-01

    The gross properties of nuclei were investigated with a statistical model, separately for systems with equal and with different numbers of protons and neutrons, the Coulomb energy being considered in the latter case. Some average nuclear properties were calculated based on the energy density of nuclear matter, from the Weizsäcker-Bethe semi-empirical mass formula generalized for compressible nuclei. In the study of the surface energy coefficient, the strong influence exerted by the Coulomb energy and the nuclear compressibility was verified. For a good fit of the beta-stability line and the mass excess, the surface symmetry energy was established. (M.C.K.) [pt]
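    For orientation, the sketch below evaluates the standard Weizsäcker-Bethe semi-empirical mass formula with typical textbook coefficients; these are not the compressible-nucleus generalization or the coefficients established in the work above.

```python
def binding_energy(Z, A):
    """Weizsacker-Bethe semi-empirical binding energy in MeV, using typical
    textbook coefficients (volume, surface, Coulomb, asymmetry, pairing)."""
    aV, aS, aC, aA, aP = 15.8, 18.3, 0.714, 23.2, 12.0
    N = A - Z
    if Z % 2 == 0 and N % 2 == 0:
        pairing = aP / A**0.5          # even-even nuclei are extra bound
    elif Z % 2 == 1 and N % 2 == 1:
        pairing = -aP / A**0.5         # odd-odd nuclei are less bound
    else:
        pairing = 0.0
    return (aV * A - aS * A**(2 / 3) - aC * Z * (Z - 1) / A**(1 / 3)
            - aA * (A - 2 * Z)**2 / A + pairing)

for Z, A, name in [(26, 56, "Fe-56"), (82, 208, "Pb-208")]:
    B = binding_energy(Z, A)
    print(f"{name}: B = {B:7.1f} MeV, B/A = {B / A:.2f} MeV per nucleon")
```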

  19. Ontology-Based Model Of Firm Competitiveness

    Science.gov (United States)

    Deliyska, Boryana; Stoenchev, Nikolay

    2010-10-01

    Competitiveness is an important characteristic of every business organization (firm, company, corporation, etc.). It is of great significance for the organization's existence and defines evaluation criteria of business success at the microeconomic level. Each criterion comprises a set of indicators with specific weight coefficients. In this work an ontology-based model of firm competitiveness is presented as a set of several mutually connected ontologies. It would be useful for knowledge structuring, standardization, and sharing among the experts and software engineers who develop applications in the domain. The assessment of the competitiveness of various business organizations could then be generated more effectively.

  20. Building v/s Exploring Models: Comparing Learning of Evolutionary Processes through Agent-based Modeling

    Science.gov (United States)

    Wagh, Aditi

    Two strands of work motivate the three studies in this dissertation. Evolutionary change can be viewed as a computational complex system in which a small set of rules operating at the individual level result in different population level outcomes under different conditions. Extensive research has documented students' difficulties with learning about evolutionary change (Rosengren et al., 2012), particularly in terms of levels slippage (Wilensky & Resnick, 1999). Second, though building and using computational models is becoming increasingly common in K-12 science education, we know little about how these two modalities compare. This dissertation adopts agent-based modeling as a representational system to compare these modalities in the conceptual context of micro-evolutionary processes. Drawing on interviews, Study 1 examines middle-school students' productive ways of reasoning about micro-evolutionary processes to find that the specific framing of traits plays a key role in whether slippage explanations are cued. Study 2, which was conducted in 2 schools with about 150 students, forms the crux of the dissertation. It compares learning processes and outcomes when students build their own models or explore a pre-built model. Analysis of Camtasia videos of student pairs reveals that builders' and explorers' ways of accessing rules, and sense-making of observed trends are of a different character. Builders notice rules through available blocks-based primitives, often bypassing their enactment while explorers attend to rules primarily through the enactment. Moreover, builders' sense-making of observed trends is more rule-driven while explorers' is more enactment-driven. Pre and posttests reveal that builders manifest a greater facility with accessing rules, providing explanations manifesting targeted assembly. Explorers use rules to construct explanations manifesting non-targeted assembly. Interviews reveal varying degrees of shifts away from slippage in both
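    The following minimal sketch, with invented parameters, illustrates the kind of individual-level rules such agent-based models encode and how a population-level trend emerges from them; it is not the blocks-based environment used in the dissertation.

```python
import random

# Minimal agent-based micro-evolution sketch: survival depends on a heritable
# trait, offspring inherit it with small mutation, and the population-level
# mean trait shifts as a result. All parameter values are arbitrary.
random.seed(1)
population = [random.gauss(0.3, 0.1) for _ in range(200)]   # trait, e.g. camouflage

for generation in range(30):
    # Selection: survival probability increases with the trait value.
    survivors = [t for t in population if random.random() < min(1.0, 0.4 + t)]
    # Reproduction with mutation back up to a constant population size.
    population = [random.gauss(random.choice(survivors), 0.02) for _ in range(200)]

print("mean trait after 30 generations: %.2f" % (sum(population) / len(population)))
```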

  1. Biologically based multistage modeling of radiation effects

    Energy Technology Data Exchange (ETDEWEB)

    William Hazelton; Suresh Moolgavkar; E. Georg Luebeck

    2005-08-30

    This past year we have made substantial progress in modeling the contribution of homeostatic regulation to low-dose radiation effects and carcinogenesis. We have worked to refine and apply our multistage carcinogenesis models to explicitly incorporate cell cycle states, simple and complex damage, checkpoint delay, slow and fast repair, differentiation, and apoptosis to study the effects of low-dose ionizing radiation in mouse intestinal crypts, as well as in other tissues. We have one paper accepted for publication in "Advances in Space Research", and another manuscript in preparation describing this work. I also wrote a chapter describing our combined cell-cycle and multistage carcinogenesis model that will be published in a book on stochastic carcinogenesis models edited by Wei-Yuan Tan. In addition, we organized and held a workshop on "Biologically Based Modeling of Human Health Effects of Low dose Ionizing Radiation", July 28-29, 2005 at Fred Hutchinson Cancer Research Center in Seattle, Washington. We had over 20 participants, including Mary Helen Barcellos-Hoff as keynote speaker, talks by most of the low-dose modelers in the DOE low-dose program, experimentalists including Les Redpath (and Mary Helen), Noelle Metting from DOE, and Tony Brooks. It appears that homeostatic regulation may be central to understanding low-dose radiation phenomena. The primary effects of ionizing radiation (IR) are cell killing, delayed cell cycling, and induction of mutations. However, homeostatic regulation causes cells that are killed or damaged by IR to eventually be replaced. Cells with an initiating mutation may have a replacement advantage, leading to clonal expansion of these initiated cells. Thus we have focused particularly on modeling effects that disturb homeostatic regulation as early steps in the carcinogenic process. There are two primary considerations that support our focus on homeostatic regulation. First, a number of

  2. Model based control of refrigeration systems

    Energy Technology Data Exchange (ETDEWEB)

    Sloth Larsen, L.F.

    2005-11-15

    The subject of this Ph.D. thesis is model-based control of refrigeration systems. Model-based control covers a variety of control types that incorporate mathematical models. In this thesis the main subject has therefore been restricted to system optimizing control. The optimizing control is divided into two layers, where the system-oriented top layer deals with set-point optimizing control and the lower layer deals with dynamical optimizing control in the subsystems. The thesis has two main contributions, i.e. a novel approach for set-point optimization and a novel approach for desynchronization based on dynamical optimization. The focus in the development of the proposed set-point optimizing control has been on deriving a simple and general method that can easily be applied to various compositions of the same class of systems, such as refrigeration systems. The method is based on a set of parameter-dependent static equations describing the considered process. By adapting the parameters to the given process, predicting the steady state, and computing a steady-state gradient of the cost function, the process can be driven continuously towards zero gradient, i.e. the optimum (if the cost function is convex). The method furthermore deals with system constraints by introducing barrier functions; hereby the best possible performance, taking the given constraints into account, can be obtained, e.g. under extreme operational conditions. The proposed method has been applied to a test refrigeration system, located at Aalborg University, for minimization of the energy consumption. Here it was proved that by using general static parameter-dependent system equations it was possible to drive the set-points close to the optimum and thus reduce the power consumption by up to 20%. In the dynamical optimizing layer the idea is to optimize the operation of the subsystem or the grouping of subsystems that limits the obtainable system performance. In systems
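    A minimal numerical sketch of the set-point optimizing idea is given below, assuming an invented convex steady-state power model and a log-barrier for a single operating constraint; it only illustrates the drive-toward-zero-gradient mechanism, not the thesis' refrigeration equations.

```python
import math

def cost(setpoint, barrier_weight=0.5, limit=12.0):
    power = 0.08 * (setpoint - 3.0) ** 2 + 4.0                  # assumed convex power model (kW)
    return power - barrier_weight * math.log(limit - setpoint)  # barrier keeps setpoint < limit

def gradient(f, x, h=1e-4):
    return (f(x + h) - f(x - h)) / (2.0 * h)                    # central finite difference

setpoint = 9.0                                                  # e.g. a condensing-pressure proxy
for _ in range(200):
    setpoint -= 0.5 * gradient(cost, setpoint)                  # drive toward zero gradient
print("near-optimal set-point: %.2f, steady-state cost: %.3f" % (setpoint, cost(setpoint)))
```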

  3. Agent-based modelling in synthetic biology.

    Science.gov (United States)

    Gorochowski, Thomas E

    2016-11-30

    Biological systems exhibit complex behaviours that emerge at many different levels of organization. These span the regulation of gene expression within single cells to the use of quorum sensing to co-ordinate the action of entire bacterial colonies. Synthetic biology aims to make the engineering of biology easier, offering an opportunity to control natural systems and develop new synthetic systems with useful prescribed behaviours. However, in many cases, it is not understood how individual cells should be programmed to ensure the emergence of a required collective behaviour. Agent-based modelling aims to tackle this problem, offering a framework in which to simulate such systems and explore cellular design rules. In this article, I review the use of agent-based models in synthetic biology, outline the available computational tools, and provide details on recently engineered biological systems that are amenable to this approach. I further highlight the challenges facing this methodology and some of the potential future directions. © 2016 The Author(s).

  4. Statistically Based Morphodynamic Modeling of Tracer Slowdown

    Science.gov (United States)

    Borhani, S.; Ghasemi, A.; Hill, K. M.; Viparelli, E.

    2017-12-01

    Tracer particles are used to study bedload transport in gravel-bed rivers. One of the advantages associated with the use of tracer particles is that they allow for direct measurement of the entrainment rates and their size distributions. The main issue in large-scale studies with tracer particles is the difference between the short-term and long-term behavior of tracer stones. This difference is due to the fact that particles undergo vertical mixing or move to less active locations such as bars or even floodplains. For these reasons the average virtual velocity of tracer particles decreases in time, i.e. the tracer slowdown. Tracer slowdown can therefore have a significant impact on the estimation of bedload transport rates or the long-term dispersal of contaminated sediment. The vast majority of the morphodynamic models that account for the non-uniformity of the bed material (tracer and non-tracer, in this case) are based on a discrete description of the alluvial deposit. The deposit is divided into two regions: the active layer and the substrate. The active layer is a thin layer in the topmost part of the deposit whose particles can interact with the bed material transport. The substrate is the part of the deposit below the active layer. Due to the discrete representation of the alluvial deposit, active layer models are not able to reproduce tracer slowdown. In this study we model the slowdown of tracer particles with the continuous Parker-Paola-Leclair morphodynamic framework. This continuous, i.e. not layer-based, framework is based on a stochastic description of the temporal variation of bed surface elevation, and of the elevation-specific particle entrainment and deposition. Particle entrainment rates are computed as a function of the flow and sediment characteristics, while particle deposition is estimated with a step-length formulation. Here we present one of the first implementations of the continuum framework at laboratory scale, its validation against

  5. Use of Physiologically Based Pharmacokinetic (PBPK) Models ...

    Science.gov (United States)

    EPA announced the availability of the final report, Use of Physiologically Based Pharmacokinetic (PBPK) Models to Quantify the Impact of Human Age and Interindividual Differences in Physiology and Biochemistry Pertinent to Risk Final Report for Cooperative Agreement. This report describes and demonstrates techniques necessary to extrapolate and incorporate in vitro-derived metabolic rate constants in PBPK models. It also includes two case study examples designed to demonstrate the applicability of such data for health risk assessment and addresses the quantification, extrapolation and interpretation of advanced biochemical information on human interindividual variability of chemical metabolism for risk assessment application. It comprises five chapters; topics and results covered in the first four chapters have been published in the peer-reviewed scientific literature. Topics covered include: data quality objectives; the experimental framework; the required data; and two example case studies that develop and incorporate in vitro metabolic rate constants in PBPK models designed to quantify human interindividual variability to better direct the choice of uncertainty factors for health risk assessment. This report is intended to serve as a reference document for risk assessors to use when quantifying, extrapolating, and interpreting advanced biochemical information about human interindividual variability of chemical metabolism.

  6. Intellectual Model-Based Configuration Management Conception

    Directory of Open Access Journals (Sweden)

    Bartusevics Arturs

    2014-07-01

    Full Text Available Software configuration management is one of the most important disciplines within a software development project: it helps control the software evolution process and ensures that only tested and validated changes are included in the end product. To achieve this, configuration management completes certain tasks. Concrete tools, such as version control systems, continuous integration servers, and compilers, are used for the technical implementation of these tasks. A correct configuration management process usually requires several tools, which mutually exchange information through various kinds of transfers. When the configuration management process is being introduced, there are often situations in which tool installation is started before there is a general picture of the total process. The article offers a model-based configuration management concept, which foresees the development of an abstract model of the configuration management process that is later transformed into lower-abstraction-level models, with tools indicated to support the technical process. A solution of this kind allows a more rational introduction and configuration of tools

  7. Homogenous stretching or detachment faulting? Which process is primarily extending the Aegean crust

    Science.gov (United States)

    Kumerics, C.; Ring, U.

    2003-04-01

    In extending orogens like the Aegean Sea of Greece and the Basin-and-Range province of the western United States, knowledge of the rates of tectonic processes is important for understanding which process is primarily extending the crust. Platt et al. (1998) proposed that homogeneous stretching of the lithosphere (i.e. vertical ductile thinning associated with a subhorizontal foliation) at rates of 4-5 km Myr-1 is the dominant process that formed the Alboran Sea in the western Mediterranean. The Aegean Sea in the eastern Mediterranean is well-known for its low-angle normal faults (detachments) (Lister et al., 1984; Lister & Forster, 1996), suggesting that detachment faulting may have been the primary agent achieving ~>250 km (McKenzie, 1978) of extension since the Miocene. Ring et al. (2003) provided evidence for a very fast-slipping detachment on the islands of Syros and Tinos in the western Cyclades, which suggests that normal faulting was the dominant tectonic process that formed the Aegean Sea. However, most extensional detachments in the Aegean do not allow quantification of the amount of vertical ductile thinning associated with extension, and therefore a full evaluation of the significance of vertical ductile thinning is not possible. On the island of Ikaria in the eastern Aegean Sea, a subhorizontal extensional ductile shear zone is well exposed. We studied this shear zone in detail to quantify the amount of vertical ductile thinning associated with extension. Numerous studies have shown that natural shear zones usually deviate significantly from progressive simple shear and are characterized by pronounced shortening perpendicular to the shear zone. Numerous deformed pegmatitic veins in this shear zone on Ikaria allow the reconstruction of deformation and flow parameters (Passchier, 1990), which are necessary for quantifying the amount of vertical ductile thinning in the shear zone. Furthermore, a flow-path and finite-strain study in a syn-tectonic granite, which

  8. Hazard identification based on plant functional modelling

    International Nuclear Information System (INIS)

    Rasmussen, B.; Whetton, C.

    1993-10-01

    A major objective of the present work is to provide means for representing a process plant as a socio-technical system, so as to allow hazard identification at a high level. The method includes technical, human and organisational aspects and is intended to be used for plant level hazard identification so as to identify critical areas and the need for further analysis using existing methods. The first part of the method is the preparation of a plant functional model where a set of plant functions link together hardware, software, operations, work organisation and other safety related aspects of the plant. The basic principle of the functional modelling is that any aspect of the plant can be represented by an object (in the sense that this term is used in computer science) based upon an Intent (or goal); associated with each Intent are Methods, by which the Intent is realized, and Constraints, which limit the Intent. The Methods and Constraints can themselves be treated as objects and decomposed into lower-level Intents (hence the procedure is known as functional decomposition) so giving rise to a hierarchical, object-oriented structure. The plant level hazard identification is carried out on the plant functional model using the Concept Hazard Analysis method. In this, the user will be supported by checklists and keywords and the analysis is structured by pre-defined worksheets. The preparation of the plant functional model and the performance of the hazard identification can be carried out manually or with computer support. (au) (4 tabs., 10 ills., 7 refs.)
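    A rough sketch of the Intent/Method/Constraint decomposition described above is given below in Python; the cooling-related example objects are invented for illustration and are not taken from the report.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Intent:
    """A plant aspect as an object: a goal, the Methods that realize it, and the
    Constraints that limit it; both can decompose into lower-level Intents."""
    goal: str
    methods: List["Intent"] = field(default_factory=list)
    constraints: List["Intent"] = field(default_factory=list)

    def walk(self, depth=0):
        print("  " * depth + self.goal)
        for child in self.methods + self.constraints:
            child.walk(depth + 1)

# Invented example of a functional decomposition hierarchy.
cooling = Intent(
    goal="Maintain reactor coolant temperature",
    methods=[Intent("Circulate coolant",
                    methods=[Intent("Operate primary pump")],
                    constraints=[Intent("Pump maintenance schedule")])],
    constraints=[Intent("Maximum allowed coolant temperature")],
)
cooling.walk()   # print the hierarchy for inspection
```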

  9. Agent-based models of financial markets

    Energy Technology Data Exchange (ETDEWEB)

    Samanidou, E [Department of Economics, University of Kiel, Olshausenstrasse 40, D-24118 Kiel (Germany); Zschischang, E [HSH Nord Bank, Portfolio Mngmt. and Inv., Martensdamm 6, D-24103 Kiel (Germany); Stauffer, D [Institute for Theoretical Physics, Cologne University, D-50923 Koeln (Germany); Lux, T [Department of Economics, University of Kiel, Olshausenstrasse 40, D-24118 Kiel (Germany)

    2007-03-15

    This review deals with several microscopic ('agent-based') models of financial markets which have been studied by economists and physicists over the last decade: Kim-Markowitz, Levy-Levy-Solomon, Cont-Bouchaud, Solomon-Weisbuch, Lux-Marchesi, Donangelo-Sneppen and Solomon-Levy-Huang. After an overview of simulation approaches in financial economics, we first give a summary of the Donangelo-Sneppen model of monetary exchange and compare it with related models in economics literature. Our selective review then outlines the main ingredients of some influential early models of multi-agent dynamics in financial markets (Kim-Markowitz, Levy-Levy-Solomon). As will be seen, these contributions draw their inspiration from the complex appearance of investors' interactions in real-life markets. Their main aim is to reproduce (and, thereby, provide possible explanations) for the spectacular bubbles and crashes seen in certain historical episodes, but they lack (like almost all the work before 1998 or so) a perspective in terms of the universal statistical features of financial time series. In fact, awareness of a set of such regularities (power-law tails of the distribution of returns, temporal scaling of volatility) only gradually appeared over the nineties. With the more precise description of the formerly relatively vague characteristics (e.g. moving from the notion of fat tails to the more concrete one of a power law with index around three), it became clear that financial market dynamics give rise to some kind of universal scaling law. Showing similarities with scaling laws for other systems with many interacting sub-units, an exploration of financial markets as multi-agent systems appeared to be a natural consequence. This topic has been pursued by quite a number of contributions appearing in both the physics and economics literature since the late nineties. From the wealth of different flavours of multi-agent models that have appeared up to now, we

  10. Agent-based models of financial markets

    Science.gov (United States)

    Samanidou, E.; Zschischang, E.; Stauffer, D.; Lux, T.

    2007-03-01

    This review deals with several microscopic ('agent-based') models of financial markets which have been studied by economists and physicists over the last decade: Kim-Markowitz, Levy-Levy-Solomon, Cont-Bouchaud, Solomon-Weisbuch, Lux-Marchesi, Donangelo-Sneppen and Solomon-Levy-Huang. After an overview of simulation approaches in financial economics, we first give a summary of the Donangelo-Sneppen model of monetary exchange and compare it with related models in economics literature. Our selective review then outlines the main ingredients of some influential early models of multi-agent dynamics in financial markets (Kim-Markowitz, Levy-Levy-Solomon). As will be seen, these contributions draw their inspiration from the complex appearance of investors' interactions in real-life markets. Their main aim is to reproduce (and, thereby, provide possible explanations) for the spectacular bubbles and crashes seen in certain historical episodes, but they lack (like almost all the work before 1998 or so) a perspective in terms of the universal statistical features of financial time series. In fact, awareness of a set of such regularities (power-law tails of the distribution of returns, temporal scaling of volatility) only gradually appeared over the nineties. With the more precise description of the formerly relatively vague characteristics (e.g. moving from the notion of fat tails to the more concrete one of a power law with index around three), it became clear that financial market dynamics give rise to some kind of universal scaling law. Showing similarities with scaling laws for other systems with many interacting sub-units, an exploration of financial markets as multi-agent systems appeared to be a natural consequence. This topic has been pursued by quite a number of contributions appearing in both the physics and economics literature since the late nineties. From the wealth of different flavours of multi-agent models that have appeared up to now, we discuss the Cont

  11. Agent-based models of financial markets

    International Nuclear Information System (INIS)

    Samanidou, E; Zschischang, E; Stauffer, D; Lux, T

    2007-01-01

    This review deals with several microscopic ('agent-based') models of financial markets which have been studied by economists and physicists over the last decade: Kim-Markowitz, Levy-Levy-Solomon, Cont-Bouchaud, Solomon-Weisbuch, Lux-Marchesi, Donangelo-Sneppen and Solomon-Levy-Huang. After an overview of simulation approaches in financial economics, we first give a summary of the Donangelo-Sneppen model of monetary exchange and compare it with related models in economics literature. Our selective review then outlines the main ingredients of some influential early models of multi-agent dynamics in financial markets (Kim-Markowitz, Levy-Levy-Solomon). As will be seen, these contributions draw their inspiration from the complex appearance of investors' interactions in real-life markets. Their main aim is to reproduce (and, thereby, provide possible explanations) for the spectacular bubbles and crashes seen in certain historical episodes, but they lack (like almost all the work before 1998 or so) a perspective in terms of the universal statistical features of financial time series. In fact, awareness of a set of such regularities (power-law tails of the distribution of returns, temporal scaling of volatility) only gradually appeared over the nineties. With the more precise description of the formerly relatively vague characteristics (e.g. moving from the notion of fat tails to the more concrete one of a power law with index around three), it became clear that financial market dynamics give rise to some kind of universal scaling law. Showing similarities with scaling laws for other systems with many interacting sub-units, an exploration of financial markets as multi-agent systems appeared to be a natural consequence. This topic has been pursued by quite a number of contributions appearing in both the physics and economics literature since the late nineties. From the wealth of different flavours of multi-agent models that have appeared up to now, we discuss the Cont
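    As a loose, hedged illustration of the herding mechanism that several of the surveyed models (e.g. Cont-Bouchaud) build on, the sketch below lets randomly formed trader clusters buy or sell and maps the aggregate imbalance into a return series; the cluster formation rule and all parameters are simplified placeholders rather than any of the reviewed models.

```python
import random

random.seed(42)
N_TRADERS, STEPS = 1000, 2000
ACTIVATION_PROB = 0.05          # chance that a cluster trades in a given step

returns = []
for _ in range(STEPS):
    # Re-draw a crude random partition of traders into clusters (herding proxy).
    clusters, remaining = [], N_TRADERS
    while remaining > 0:
        size = min(remaining, random.randint(1, 50))
        clusters.append(size)
        remaining -= size
    # Each active cluster buys (+1) or sells (-1) as a block; the imbalance
    # of aggregate demand drives the period's return.
    imbalance = sum(s * random.choice([-1, 1])
                    for s in clusters if random.random() < ACTIVATION_PROB)
    returns.append(imbalance / N_TRADERS)

largest = max(abs(r) for r in returns)
typical = sum(abs(r) for r in returns) / len(returns)
print(f"typical |return| {typical:.4f}, largest |return| {largest:.4f}")
```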

  12. The Challenge of Forecasting Metropolitan Growth: Urban Characteristics Based Models versus Regional Dummy Based Models

    OpenAIRE

    NA

    2005-01-01

    This paper presents a study of errors in forecasting the population of Metropolitan Statistical Areas and the Primary MSAs of Consolidated Metropolitan Statistical Areas and New England MAs. The forecasts are for the year 2000 and are based on a semi-structural model estimated by Mills and Lubelle using 1970 to 1990 census data on population, employment and relative real wages. This model allows the testing of regional effects on population and employment growth. The year 2000 forecasts are f...

  13. The Use of Modeling-Based Text to Improve Students' Modeling Competencies

    Science.gov (United States)

    Jong, Jing-Ping; Chiu, Mei-Hung; Chung, Shiao-Lan

    2015-01-01

    This study investigated the effects of a modeling-based text on 10th graders' modeling competencies. Fifteen 10th graders read a researcher-developed modeling-based science text on the ideal gas law that included explicit descriptions and representations of modeling processes (i.e., model selection, model construction, model validation, model…

  14. Madrasah Culture Based Transformational Leadership Model

    Directory of Open Access Journals (Sweden)

    Nur Khoiri

    2016-10-01

    Full Text Available Leadership is the ability to influence and direct behavior, combined with particular expertise in the field, within a group that wants to achieve its goals. A dynamic organization requires a transformational leadership model. A school principal, as the leader of a school, aims to actualize good learning leadership. Learning leadership focuses on learning, whose components include the curriculum, the teaching and learning process, assessment, teacher assessment and development, good service in learning, and the development of a learning community in the school, grounded in organizational culture: the values, assumptions, and beliefs that have evolved from the thinking of the organization's members, are believed by all members of the organization, and are implemented in everyday life in a way that gives meaning. Keywords: leadership, transformational leadership, headmaster, instructional leadership, organizational culture.

  15. Route constraints model based on polychromatic sets

    Science.gov (United States)

    Yin, Xianjun; Cai, Chao; Wang, Houjun; Li, Dongwu

    2018-03-01

    With the development of unmanned aerial vehicle (UAV) technology, its fields of application are constantly expanding. UAV mission planning is especially important, and the planning result directly influences whether the UAV can accomplish its task. In order to make the results of UAV mission planning more realistic, it is necessary to consider not only the physical properties of the aircraft, but also the constraints among the various pieces of equipment on the UAV. However, the constraints among UAV equipment are complex, and the equipment has strong diversity and variability, which makes these constraints difficult to describe. In order to solve this problem, this paper, drawing on the polychromatic sets theory used in the advanced manufacturing field to describe complex systems, presents a mission constraint model of the UAV based on polychromatic sets.

  16. Model Based Control of Refrigeration Systems

    DEFF Research Database (Denmark)

    Larsen, Lars Finn Sloth

    for automation of these procedures, that is to incorporate some "intelligence" in the control system, this project was started up. The main emphasis of this work has been on model based methods for system optimizing control in supermarket refrigeration systems. The idea of implementing a system optimizing...... control is to let an optimization procedure take over the task of operating the refrigeration system and thereby replace the role of the operator in the traditional control structure. In the context of refrigeration systems, the idea is to divide the optimizing control structure into two parts: A part...... optimizing the steady state operation "set-point optimizing control" and a part optimizing dynamic behaviour of the system "dynamical optimizing control". A novel approach for set-point optimization will be presented. The general idea is to use a prediction of the steady state, for computation of the cost...

  17. Ecosystem Based Business Model of Smart Grid

    DEFF Research Database (Denmark)

    Lundgaard, Morten Raahauge; Ma, Zheng; Jørgensen, Bo Nørregaard

    2015-01-01

    This paper tries to investigate the ecosystem based business model in a smart grid infrastructure and the potential of value capture in a highly complex macro infrastructure such as the smart grid. This paper proposes an alternative perspective to study the smart grid business ecosystem to support...... the infrastructural challenges, such as the interoperability of business components for smart grid. So far little research has explored the business ecosystem in the smart grid concept. The study on the smart grid with the theory of business ecosystem may open opportunities to understand market catalysts. This study...... contributes an understanding of business ecosystem applicable for smart grid. Smart grid infrastructure is an intricate business ecosystem, which has several intentions to deliver the value proposition and what it should be. The findings help to identify and capture value from markets....

  18. Prototype-based models in machine learning.

    Science.gov (United States)

    Biehl, Michael; Hammer, Barbara; Villmann, Thomas

    2016-01-01

    An overview is given of prototype-based models in machine learning. In this framework, observations, i.e., data, are stored in terms of typical representatives. Together with a suitable measure of similarity, the systems can be employed in the context of unsupervised and supervised analysis of potentially high-dimensional, complex datasets. We discuss basic schemes of competitive vector quantization as well as the so-called neural gas approach and Kohonen's topology-preserving self-organizing map. Supervised learning in prototype systems is exemplified in terms of learning vector quantization. Most frequently, the familiar Euclidean distance serves as a dissimilarity measure. We present extensions of the framework to nonstandard measures and give an introduction to the use of adaptive distances in relevance learning. © 2016 Wiley Periodicals, Inc.
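
    The prototype-and-distance idea summarized above lends itself to a compact illustration. The sketch below is a generic LVQ1-style learner (prototypes attracted to same-class samples and repelled from other-class samples, using the Euclidean distance the review cites as the most common choice); the toy data, learning rate and schedule are assumptions for illustration, not code from the reviewed systems.

```python
import numpy as np

def train_lvq1(X, y, prototypes_per_class=1, lr=0.05, epochs=30, seed=0):
    """Minimal LVQ1: move the closest prototype toward (same class) or
    away from (different class) each training sample."""
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    # Initialise prototypes near class means with a little noise.
    protos, labels = [], []
    for c in classes:
        Xc = X[y == c]
        for _ in range(prototypes_per_class):
            protos.append(Xc.mean(axis=0) + 0.01 * rng.standard_normal(X.shape[1]))
            labels.append(c)
    W, Wy = np.array(protos), np.array(labels)

    for epoch in range(epochs):
        for i in rng.permutation(len(X)):
            d = np.linalg.norm(W - X[i], axis=1)      # Euclidean dissimilarity
            k = np.argmin(d)                          # winning prototype
            sign = 1.0 if Wy[k] == y[i] else -1.0     # attract or repel
            W[k] += sign * lr * (X[i] - W[k])
    return W, Wy

def predict(W, Wy, X):
    return Wy[np.argmin(np.linalg.norm(W[None, :, :] - X[:, None, :], axis=2), axis=1)]

# Toy two-class data (assumed for illustration only).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
W, Wy = train_lvq1(X, y)
print("training accuracy:", (predict(W, Wy, X) == y).mean())
```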

  19. Discovering Diabetes Complications: an Ontology Based Model.

    Science.gov (United States)

    Daghistani, Tahani; Shammari, Riyad Al; Razzak, Muhammad Imran

    2015-12-01

    Diabetes is a serious disease that has spread dramatically around the world. Every diabetes patient carries some level of risk of experiencing complications. Taking advantage of recorded information to build an ontology, as an information technology solution, can help to predict which patients are at risk of a certain complication. It is also helpful for searching and presenting a patient's history with regard to different risk factors. Discovering diabetes complications could be useful for preventing or delaying them. We designed an ontology based model, using adult diabetes patients' data, to discover the rules linking diabetes with its complications as disease-to-disease relationships. Various rules between different risk factors of diabetes patients and certain complications were generated. Furthermore, new complications (diseases) might be discovered as a new finding of this study; again, discovering diabetes complications could be useful to prevent or delay them. The system can identify patients who have certain risk factors, such as a high body mass index (obesity), and start a plan for controlling and maintaining their condition.

  20. Model based design of electronic throttle control

    Science.gov (United States)

    Cherian, Fenin; Ranjan, Ashish; Bhowmick, Pathikrit; Rammohan, A.

    2017-11-01

    With the advent of torque based Engine Management Systems, the precise control and robust performance of the throttle body becomes a key factor in the overall performance of the vehicle. Electronic Throttle Control provides benefits such as an improved air-fuel ratio for better vehicle performance and lower exhaust emissions to meet the stringent emission norms. Modern vehicles facilitate various features such as Cruise Control, Traction Control, Electronic Stability Program and Pre-crash systems. These systems require control over engine power without driver intervention, which is not possible with a conventional mechanical throttle system. Thus these systems are integrated to function with the electronic throttle control. However, due to inherent non-linearities in the throttle body, the control becomes a difficult task. In order to eliminate the influence of this hysteresis during the initial operation of the butterfly valve, a compensating control action must be added to the duty required for starting throttle operation when initial operation is detected. Therefore, a lot of work is being done in this field to incorporate the various nonlinearities to achieve robust control. In our present work, the ETB was tested to verify the working of the system. Calibration of the TPS sensors was carried out in order to acquire an accurate throttle opening angle. The response of the calibrated system was then plotted against a step input signal. A linear model of the ETB was prepared using Simulink and its response was compared with the experimental data to find out the initial deviation of the model from the actual system. To reduce this deviation, non-linearities from the existing literature were introduced to the system and a response analysis was performed to check the deviation from the actual system. Based on this investigation, an introduction of a new nonlinearity parameter can be used in future to reduce the deviation further making the control of the ETB more

  1. Analysis of survival for patients with chronic kidney disease primarily related to renal cancer surgery.

    Science.gov (United States)

    Wu, Jitao; Suk-Ouichai, Chalairat; Dong, Wen; Antonio, Elvis Caraballo; Derweesh, Ithaar H; Lane, Brian R; Demirjian, Sevag; Li, Jianbo; Campbell, Steven C

    2018-01-01

    To evaluate predictors of long-term survival for patients with chronic kidney disease primarily due to surgery (CKD-S). Patients with CKD-S have generally good survival that approximates patients who do not have CKD even after renal cancer surgery (RCS), yet there may be heterogeneity within this cohort. From 1997 to 2008, 4 246 patients underwent RCS at our centre. The median (interquartile range [IQR]) follow-up was 9.4 (7.3-11.0) years. New baseline glomerular filtration rate (GFR) was defined as highest GFR between nadir and 6 weeks after RCS. We retrospectively evaluated three cohorts: no-CKD (new baseline GFR of ≥60 mL/min/1.73 m²); CKD-S (new baseline GFR of cancer-related survival (NRCRS) for the CKD-S cohort. Kaplan-Meier analysis assessed the longitudinal impact of new baseline GFR (45-60 mL/min/1.73 m² vs <45 mL/min/1.73 m²) and Cox regression evaluated relative impact of preoperative GFR, new baseline GFR, and relevant demographics/comorbidities. Of the 4 246 patients who underwent RCS, 931 had CKD-S and 1 113 had CKD-M/S, whilst 2 202 had no-CKD even after RCS. Partial/radical nephrectomy (PN/RN) was performed in 54%/46% of the patients, respectively. For CKD-S, 641 patients had a new baseline GFR of 45-60 mL/min/1.73 m² and 290 had a new baseline GFR of <45 mL/min/1.73 m². Kaplan-Meier analysis showed significantly reduced NRCRS for patients with CKD-S with a GFR of <45 mL/min/1.73 m² compared to those with no-CKD or CKD-S with a GFR of 45-60 mL/min/1.73 m² (both P ≤ 0.004), and competing risk analysis confirmed this (P < 0.001). Age, gender, heart disease, and new baseline GFR were all associated independently with NRCRS for patients with CKD-S (all P ≤ 0.02). Our data suggest that CKD-S is heterogeneous, and patients with a reduced new baseline GFR have compromised survival, particularly if <45 mL/min/1.73 m². Our findings may have implications regarding choice of PN/RN in patients at risk of developing

  2. Modeling stochastic frontier based on vine copulas

    Science.gov (United States)

    Constantino, Michel; Candido, Osvaldo; Tabak, Benjamin M.; da Costa, Reginaldo Brito

    2017-11-01

    This article models a production function and analyzes the technical efficiency of listed companies in the United States, Germany and England between 2005 and 2012 based on the vine copula approach. Traditional estimates of the stochastic frontier assume that data is multivariate normally distributed and that there is no source of asymmetry. The proposed method based on vine copulas allows us to explore different types of asymmetry and multivariate distribution. Using data on product, capital and labor, we measure the relative efficiency of the vine production function and estimate the coefficient used in the stochastic frontier literature for comparison purposes. This production vine copula predicts the value added by firms with given capital and labor in a probabilistic way. It thereby stands in sharp contrast to the production function, where the output of firms is completely deterministic. The results show that, on average, S&P500 companies are more efficient than companies listed in England and Germany, which presented similar average efficiency coefficients. For comparative purposes, the traditional stochastic frontier was estimated and the results showed discrepancies between the coefficients obtained by the application of the two methods, traditional and frontier-vine, opening new paths of non-linear research.
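
    For readers unfamiliar with the frontier framework that the vine-copula method is benchmarked against, the sketch below simulates a conventional composed-error production frontier (Cobb-Douglas technology, symmetric noise plus one-sided inefficiency) and recovers rough technical-efficiency scores with a corrected-OLS shift. The data-generating values and the COLS shortcut are assumptions for illustration; they are not the estimator or data used in the article.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulated firm data (assumed, for illustration only): Cobb-Douglas technology
# with a symmetric noise term v and a one-sided inefficiency term u >= 0.
lnK = rng.normal(3.0, 0.5, n)
lnL = rng.normal(2.0, 0.5, n)
v = rng.normal(0.0, 0.10, n)                 # statistical noise
u = np.abs(rng.normal(0.0, 0.25, n))         # inefficiency (half-normal)
lnY = 0.5 + 0.4 * lnK + 0.6 * lnL + v - u    # composed-error frontier model

# OLS fit of the production function.
X = np.column_stack([np.ones(n), lnK, lnL])
beta, *_ = np.linalg.lstsq(X, lnY, rcond=None)
resid = lnY - X @ beta

# Corrected OLS (COLS): shift the fitted function up to the best-practice firm
# and read off technical efficiency relative to that frontier.
efficiency = np.exp(resid - resid.max())
print("estimated elasticities:", np.round(beta[1:], 3))
print("mean technical efficiency:", round(efficiency.mean(), 3))
```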

  3. Validating agent based models through virtual worlds.

    Energy Technology Data Exchange (ETDEWEB)

    Lakkaraju, Kiran; Whetzel, Jonathan H.; Lee, Jina; Bier, Asmeret Brooke; Cardona-Rivera, Rogelio E.; Bernstein, Jeremy Ray Rhythm

    2014-01-01

    As the US continues its vigilance against distributed, embedded threats, understanding the political and social structure of these groups becomes paramount for predicting and disrupting their attacks. Agent-based models (ABMs) serve as a powerful tool to study these groups. While the popularity of social network tools (e.g., Facebook, Twitter) has provided extensive communication data, there is a lack of fine-grained behavioral data with which to inform and validate existing ABMs. Virtual worlds, in particular massively multiplayer online games (MMOG), where large numbers of people interact within a complex environment for long periods of time, provide an alternative source of data. These environments provide a rich social environment where players engage in a variety of activities observed between real-world groups: collaborating and/or competing with other groups, conducting battles for scarce resources, and trading in a market economy. Strategies employed by player groups surprisingly reflect those seen in present-day conflicts, where players use diplomacy or espionage as their means for accomplishing their goals. In this project, we propose to address the need for fine-grained behavioral data by acquiring and analyzing game data from a commercial MMOG, referred to within this report as Game X. The goals of this research were: (1) devising toolsets for analyzing virtual world data to better inform the rules that govern a social ABM and (2) exploring how virtual worlds could serve as a source of data to validate ABMs established for analogous real-world phenomena. During this research, we studied certain patterns of group behavior to complement social modeling efforts where a significant lack of detailed examples of observed phenomena exists. This report outlines our work examining group behaviors that underlie what we have termed the Expression-To-Action (E2A) problem: determining the changes in social contact that lead individuals/groups to engage in a particular behavior

  4. Parasites Affect Food Web Structure Primarily through Increased Diversity and Complexity

    NARCIS (Netherlands)

    Dunne, J.A.; Lafferty, K.D.; Dobson, A.P.; Hechinger, R.F.; Kuris, A.M.; Martinez, N.D.; McLaughlin, J.P.; Mouritsen, K.N.; Poulin, R.; Reise, K.; Stouffer, D.B.; Thieltges, D.W.; Williams, R.J.; Zander, C.D.

    2013-01-01

    Comparative research on food web structure has revealed generalities in trophic organization, produced simple models, and allowed assessment of robustness to species loss. These studies have mostly focused on free-living species. Recent research has suggested that inclusion of parasites alters

  5. Factors affecting the number and type of student research products for chemistry and physics students at primarily undergraduate institutions: A case study.

    Science.gov (United States)

    Mellis, Birgit; Soto, Patricia; Bruce, Chrystal D; Lacueva, Graciela; Wilson, Anne M; Jayasekare, Rasitha

    2018-01-01

    For undergraduate students, involvement in authentic research represents scholarship that is consistent with disciplinary quality standards and provides an integrative learning experience. In conjunction with performing research, the communication of the results via presentations or publications is a measure of the level of scientific engagement. The empirical study presented here uses generalized linear mixed models with hierarchical bootstrapping to examine the factors that impact the means of dissemination of undergraduate research results. Focusing on the research experiences in physics and chemistry of undergraduates at four Primarily Undergraduate Institutions (PUIs) from 2004-2013, statistical analysis indicates that the gender of the student does not impact the number and type of research products. However, in chemistry, the rank of the faculty advisor and the venue of the presentation do impact the number of research products by undergraduate student, whereas in physics, gender match between student and advisor has an effect on the number of undergraduate research products. This study provides a baseline for future studies of discipline-based bibliometrics and factors that affect the number of research products of undergraduate students.

  6. Factors affecting the number and type of student research products for chemistry and physics students at primarily undergraduate institutions: A case study

    Science.gov (United States)

    Soto, Patricia; Bruce, Chrystal D.; Lacueva, Graciela; Wilson, Anne M.; Jayasekare, Rasitha

    2018-01-01

    For undergraduate students, involvement in authentic research represents scholarship that is consistent with disciplinary quality standards and provides an integrative learning experience. In conjunction with performing research, the communication of the results via presentations or publications is a measure of the level of scientific engagement. The empirical study presented here uses generalized linear mixed models with hierarchical bootstrapping to examine the factors that impact the means of dissemination of undergraduate research results. Focusing on the research experiences in physics and chemistry of undergraduates at four Primarily Undergraduate Institutions (PUIs) from 2004–2013, statistical analysis indicates that the gender of the student does not impact the number and type of research products. However, in chemistry, the rank of the faculty advisor and the venue of the presentation do impact the number of research products by undergraduate student, whereas in physics, gender match between student and advisor has an effect on the number of undergraduate research products. This study provides a baseline for future studies of discipline-based bibliometrics and factors that affect the number of research products of undergraduate students. PMID:29698502

  7. Experiences of LGBTQ Students at a Primarily White Institution in the South

    Science.gov (United States)

    Cain, Leia Kristin

    2015-01-01

    In the United States, often Lesbian, Gay, Bisexual, Transgender, Queer, and/or Questioning (LGBTQ) students are targets of verbal harassment and violence due to their sexual orientation or gender identity and gender expression. Further, 31 states do not offer protection against sexuality- or gender identity-based discrimination (ACLU, 2015).…

  8. An Agent-Based Monetary Production Simulation Model

    DEFF Research Database (Denmark)

    Bruun, Charlotte

    2006-01-01

    An Agent-Based Simulation Model Programmed in Objective Borland Pascal. Program and source code is downloadable.

  9. A hybrid agent-based approach for modeling microbiological systems.

    Science.gov (United States)

    Guo, Zaiyi; Sloot, Peter M A; Tay, Joc Cing

    2008-11-21

    Models for systems biology commonly adopt Differential Equations or Agent-Based modeling approaches for simulating the processes as a whole. Models based on differential equations presuppose phenomenological intracellular behavioral mechanisms, while models based on Multi-Agent approach often use directly translated, and quantitatively less precise if-then logical rule constructs. We propose an extendible systems model based on a hybrid agent-based approach where biological cells are modeled as individuals (agents) while molecules are represented by quantities. This hybridization in entity representation entails a combined modeling strategy with agent-based behavioral rules and differential equations, thereby balancing the requirements of extendible model granularity with computational tractability. We demonstrate the efficacy of this approach with models of chemotaxis involving an assay of 10³ cells and 1.2×10⁶ molecules. The model produces cell migration patterns that are comparable to laboratory observations.
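
    A minimal sketch of the hybrid representation described above (cells as discrete agents, molecules as continuous quantities) is given below for a one-dimensional chemotaxis toy problem. The grid size, rates and biased-walk rule are assumptions chosen for brevity and are not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Continuous quantity: chemoattractant concentration on a 1-D grid.
n_bins, dt, D, decay = 100, 0.1, 0.5, 0.01
conc = np.zeros(n_bins)
conc[-1] = 100.0                      # source at the right-hand boundary

# Discrete agents: cell positions (bin indices).
cells = rng.integers(0, 20, size=200)

for step in range(500):
    # Molecules: explicit finite-difference diffusion + first-order decay.
    lap = np.roll(conc, 1) + np.roll(conc, -1) - 2 * conc
    lap[0] = conc[1] - conc[0]        # no-flux left boundary
    lap[-1] = conc[-2] - conc[-1]     # no-flux right boundary
    conc += dt * (D * lap - decay * conc)
    conc[-1] = 100.0                  # keep the source replenished

    # Cells: rule-based biased random walk up the local gradient.
    left = conc[np.clip(cells - 1, 0, n_bins - 1)]
    right = conc[np.clip(cells + 1, 0, n_bins - 1)]
    bias = np.where(right > left, 0.7, 0.3)     # probability of stepping right
    step_dir = np.where(rng.random(cells.size) < bias, 1, -1)
    cells = np.clip(cells + step_dir, 0, n_bins - 1)

print("mean cell position after run:", cells.mean())  # drifts toward the source
```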

  10. Adolescent Pornography Use and Dating Violence among a Sample of Primarily Black and Hispanic, Urban-Residing, Underage Youth

    Directory of Open Access Journals (Sweden)

    Emily F. Rothman

    2015-12-01

    Full Text Available This cross-sectional study was designed to characterize the pornography viewing preferences of a sample of U.S.-based, urban-residing, economically disadvantaged, primarily Black and Hispanic youth (n = 72), and to assess whether pornography use was associated with experiences of adolescent dating abuse (ADA) victimization. The sample was recruited from a large, urban, safety net hospital, and participants were 53% female, 59% Black, 19% Hispanic, 14% Other race, 6% White, and 1% Native American. All were 16–17 years old. More than half (51%) had been asked to watch pornography together by a dating or sexual partner, and 44% had been asked to do something sexual that a partner saw in pornography. Adolescent dating abuse (ADA) victimization was associated with more frequent pornography use, viewing pornography in the company of others, being asked to perform a sexual act that a partner first saw in pornography, and watching pornography during or after marijuana use. Approximately 50% of ADA victims and 32% of non-victims reported that they had been asked to do a sexual act that their partner saw in pornography (p = 0.15), and 58% did not feel happy to have been asked. Results suggest that weekly pornography use among underage, urban-residing youth is common, and may be associated with ADA victimization.

  11. Adolescent Pornography Use and Dating Violence among a Sample of Primarily Black and Hispanic, Urban-Residing, Underage Youth

    Science.gov (United States)

    Rothman, Emily F.; Adhia, Avanti

    2015-01-01

    This cross-sectional study was designed to characterize the pornography viewing preferences of a sample of U.S.-based, urban-residing, economically disadvantaged, primarily Black and Hispanic youth (n = 72), and to assess whether pornography use was associated with experiences of adolescent dating abuse (ADA) victimization. The sample was recruited from a large, urban, safety net hospital, and participants were 53% female, 59% Black, 19% Hispanic, 14% Other race, 6% White, and 1% Native American. All were 16–17 years old. More than half (51%) had been asked to watch pornography together by a dating or sexual partner, and 44% had been asked to do something sexual that a partner saw in pornography. Adolescent dating abuse (ADA) victimization was associated with more frequent pornography use, viewing pornography in the company of others, being asked to perform a sexual act that a partner first saw in pornography, and watching pornography during or after marijuana use. Approximately 50% of ADA victims and 32% of non-victims reported that they had been asked to do a sexual act that their partner saw in pornography (p = 0.15), and 58% did not feel happy to have been asked. Results suggest that weekly pornography use among underage, urban-residing youth may be common, and may be associated with ADA victimization. PMID:26703744

  12. Dynamic Metabolic Model Building Based on the Ensemble Modeling Approach

    Energy Technology Data Exchange (ETDEWEB)

    Liao, James C. [Univ. of California, Los Angeles, CA (United States)

    2016-10-01

    Ensemble modeling of kinetic systems addresses the challenges of kinetic model construction, with respect to parameter value selection, and still allows for the rich insights possible from kinetic models. This project aimed to show that constructing, implementing, and analyzing such models is a useful tool for the metabolic engineering toolkit, and that they can result in actionable insights from models. Key concepts are developed and deliverable publications and results are presented.

  13. Parasites affect food web structure primarily through increased diversity and complexity.

    Directory of Open Access Journals (Sweden)

    Jennifer A Dunne

    Full Text Available Comparative research on food web structure has revealed generalities in trophic organization, produced simple models, and allowed assessment of robustness to species loss. These studies have mostly focused on free-living species. Recent research has suggested that inclusion of parasites alters structure. We assess whether such changes in network structure result from unique roles and traits of parasites or from changes to diversity and complexity. We analyzed seven highly resolved food webs that include metazoan parasite data. Our analyses show that adding parasites usually increases link density and connectance (simple measures of complexity, particularly when including concomitant links (links from predators to parasites of their prey. However, we clarify prior claims that parasites "dominate" food web links. Although parasites can be involved in a majority of links, in most cases classic predation links outnumber classic parasitism links. Regarding network structure, observed changes in degree distributions, 14 commonly studied metrics, and link probabilities are consistent with scale-dependent changes in structure associated with changes in diversity and complexity. Parasite and free-living species thus have similar effects on these aspects of structure. However, two changes point to unique roles of parasites. First, adding parasites and concomitant links strongly alters the frequency of most motifs of interactions among three taxa, reflecting parasites' roles as resources for predators of their hosts, driven by trophic intimacy with their hosts. Second, compared to free-living consumers, many parasites' feeding niches appear broader and less contiguous, which may reflect complex life cycles and small body sizes. This study provides new insights about generic versus unique impacts of parasites on food web structure, extends the generality of food web theory, gives a more rigorous framework for assessing the impact of any species on trophic

  14. Modeling base excision repair in Escherichia coli bacterial cells

    International Nuclear Information System (INIS)

    Belov, O.V.

    2011-01-01

    A model describing the key processes in Escherichia coli bacterial cells during base excision repair is developed. The mechanism of damaged base elimination involving formamidopyrimidine DNA glycosylase (the Fpg protein), which possesses several types of activities, is modeled. The modeling of the transitions between DNA states is based on a stochastic approach to the chemical reaction description.
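
    The stochastic treatment of reaction steps mentioned above is commonly realised with Gillespie's stochastic simulation algorithm; the sketch below applies it to a deliberately simplified two-step scheme (glycosylase binding followed by excision). The species counts and rate constants are hypothetical placeholders, not parameters from the Belov model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simplified scheme (placeholder rates, not the paper's values):
#   damaged site + Fpg -> complex         (k_bind)
#   complex            -> repaired + Fpg  (k_excise)
k_bind, k_excise = 0.002, 0.05
damaged, fpg, complex_, repaired = 200, 50, 0, 0

t, t_end = 0.0, 500.0
while t < t_end:
    a1 = k_bind * damaged * fpg          # propensity of binding
    a2 = k_excise * complex_             # propensity of excision
    a0 = a1 + a2
    if a0 == 0.0:
        break
    t += rng.exponential(1.0 / a0)       # time to the next reaction event
    if rng.random() < a1 / a0:           # choose which reaction fires
        damaged -= 1; fpg -= 1; complex_ += 1
    else:
        complex_ -= 1; fpg += 1; repaired += 1

print(f"t = {t:.1f}: damaged={damaged}, repaired={repaired}")
```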

  15. Computer Profiling Based Model for Investigation

    OpenAIRE

    Neeraj Choudhary; Nikhil Kumar Singh; Parmalik Singh

    2011-01-01

    Computer profiling is used for computer forensic analysis; this work proposes and elaborates on a novel model for use in computer profiling, the computer profiling object model. The computer profiling object model is an information model which models a computer as objects with various attributes and inter-relationships. These together provide the information necessary for a human investigator or an automated reasoning engine to make judgments as to the probable usage and evidentiary value of a comp...

  16. Performance Measurement Model A TarBase model with ...

    Indian Academy of Sciences (India)

    rohit

    Model A: 8.0, 2.0, 94.52%, 88.46%, 76, 108, 12, 12, 0.86, 0.91, 0.78, 0.94
    Model B: 2.0, 2.0, 93.18%, 89.33%, 64, 95, 10, 9, 0.88, 0.90, 0.75, 0.98
    The above results for TEST-1 show details for our two models (Model A and Model B). Performance of Model A after adding the 32-entry negative dataset of MiRTif to our testing set (MiRecords) ...

  17. Stimulating Scientific Reasoning with Drawing-Based Modeling

    Science.gov (United States)

    Heijnes, Dewi; van Joolingen, Wouter; Leenaars, Frank

    2018-01-01

    We investigate the way students' reasoning about evolution can be supported by drawing-based modeling. We modified the drawing-based modeling tool SimSketch to allow for modeling evolutionary processes. In three iterations of development and testing, students in lower secondary education worked on creating an evolutionary model. After each…

  18. Agent-based Modeling Methodology for Analyzing Weapons Systems

    Science.gov (United States)

    2015-03-26

    technique involve model structure, system representation and the degree of validity, coupled with the simplicity, of the overall model. ABM is best suited... system representation of the air combat system. We feel that a simulation model that combines ABM with equation-based representation of weapons and... AGENT-BASED MODELING METHODOLOGY FOR ANALYZING WEAPONS SYSTEMS. THESIS. Casey D. Connors, Major, USA

  19. Learning of Chemical Equilibrium through Modelling-Based Teaching

    Science.gov (United States)

    Maia, Poliana Flavia; Justi, Rosaria

    2009-01-01

    This paper presents and discusses students' learning process of chemical equilibrium from a modelling-based approach developed from the use of the "Model of Modelling" diagram. The investigation was conducted in a regular classroom (students 14-15 years old) and aimed at discussing how modelling-based teaching can contribute to students…

  20. Recommendation based on trust diffusion model.

    Science.gov (United States)

    Yuan, Jinfeng; Li, Li

    2014-01-01

    Recommender system is emerging as a powerful and popular tool for online information relevant to a given user. The traditional recommendation system suffers from the cold start problem and the data sparsity problem. Many methods have been proposed to solve these problems, but few can achieve satisfactory efficiency. In this paper, we present a method which combines the trust diffusion (DiffTrust) algorithm and the probabilistic matrix factorization (PMF). DiffTrust is first used to study the possible diffusions of trust between various users. It is able to make use of the implicit relationship of the trust network, thus alleviating the data sparsity problem. The probabilistic matrix factorization (PMF) is then employed to combine the users' tastes with their trusted friends' interests. We evaluate the algorithm on Flixster, Moviedata, and Epinions datasets, respectively. The experimental results show that the recommendation based on our proposed DiffTrust + PMF model achieves high performance in terms of the root mean square error (RMSE), Recall, and F Measure.
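
    The matrix-factorization half of the DiffTrust + PMF combination can be illustrated on its own. The sketch below fits user and item latent factors to a handful of toy ratings by stochastic gradient descent and reports the training RMSE used as an evaluation metric in the abstract; the ratings, dimensions and hyperparameters are assumptions, and the trust-diffusion regularisation is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy observed ratings as (user, item, rating) triples -- assumed data.
ratings = [(0, 0, 5), (0, 1, 3), (1, 0, 4), (1, 2, 1), (2, 1, 2), (2, 2, 5)]
n_users, n_items, k = 3, 3, 2

U = 0.1 * rng.standard_normal((n_users, k))   # user latent factors
V = 0.1 * rng.standard_normal((n_items, k))   # item latent factors
lr, reg = 0.05, 0.02

for epoch in range(200):
    for u, i, r in ratings:
        err = r - U[u] @ V[i]                 # prediction error
        U[u] += lr * (err * V[i] - reg * U[u])
        V[i] += lr * (err * U[u] - reg * V[i])

rmse = np.sqrt(np.mean([(r - U[u] @ V[i]) ** 2 for u, i, r in ratings]))
print("training RMSE:", round(rmse, 3))
```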

  1. A 3D City Model with Dynamic Behaviour Based on Geospatial Managed Objects

    DEFF Research Database (Denmark)

    Kjems, Erik; Kolář, Jan

    2014-01-01

    of a geographic data representation of the world. The combination of 3D city models and real time information based systems though can provide a whole new setup for data fusion within an urban environment and provide time critical information preserving our limited resources in the most sustainable way. Using 3D......One of the major development efforts within the GI Science domain are pointing at real time information coming from geographic referenced features in general. At the same time 3D City models are mostly justified as being objects for visualization purposes rather than constituting the foundation...... occasions we have been advocating for a new and advanced formulation of real world features using the concept of Geospatial Managed Objects (GMO). This chapter presents the outcome of the InfraWorld project, a 4 million Euro project financed primarily by the Norwegian Research Council where the concept...

  2. Model-Based Requirements Management in Gear Systems Design Based On Graph-Based Design Languages

    Directory of Open Access Journals (Sweden)

    Kevin Holder

    2017-10-01

    Full Text Available For several decades, a wide-spread consensus concerning the enormous importance of an in-depth clarification of the specifications of a product has been observed. A weak clarification of specifications is repeatedly listed as a main cause for the failure of product development projects. Requirements, which can be defined as the purpose, goals, constraints, and criteria associated with a product development project, play a central role in the clarification of specifications. The collection of activities which ensure that requirements are identified, documented, maintained, communicated, and traced throughout the life cycle of a system, product, or service can be referred to as "requirements engineering". These activities can be supported by a collection and combination of strategies, methods, and tools which are appropriate for the clarification of specifications. Numerous publications describe the strategy and the components of requirements management. Furthermore, recent research investigates its industrial application. Simultaneously, promising developments of graph-based design languages for a holistic digital representation of the product life cycle are presented. Current developments realize graph-based languages by the diagrams of the Unified Modelling Language (UML), and allow the automatic generation and evaluation of multiple product variants. The research presented in this paper seeks to present a method in order to combine the advantages of a conscious requirements management process and graph-based design languages. Consequently, the main objective of this paper is the investigation of a model-based integration of requirements in a product development process by means of graph-based design languages. The research method is based on an in-depth analysis of an exemplary industrial product development, a gear system for so-called "Electrical Multiple Units" (EMU). Important requirements were abstracted from a gear system

  3. CLIMLAB: a Python-based software toolkit for interactive, process-oriented climate modeling

    Science.gov (United States)

    Rose, B. E. J.

    2015-12-01

    Global climate is a complex emergent property of the rich interactions between simpler components of the climate system. We build scientific understanding of this system by breaking it down into component process models (e.g. radiation, large-scale dynamics, boundary layer turbulence), understanding each component, and putting them back together. Hands-on experience and freedom to tinker with climate models (whether simple or complex) is invaluable for building physical understanding. CLIMLAB is an open-ended software engine for interactive, process-oriented climate modeling. With CLIMLAB you can interactively mix and match model components, or combine simpler process models together into a more comprehensive model. It was created primarily to support classroom activities, using hands-on modeling to teach fundamentals of climate science at both undergraduate and graduate levels. CLIMLAB is written in Python and ties in with the rich ecosystem of open-source scientific Python tools for numerics and graphics. The IPython notebook format provides an elegant medium for distributing interactive example code. I will give an overview of the current capabilities of CLIMLAB, the curriculum we have developed thus far, and plans for the future. Using CLIMLAB requires some basic Python coding skills. We consider this an educational asset, as we are targeting upper-level undergraduates and Python is an increasingly important language in STEM fields. However CLIMLAB is well suited to be deployed as a computational back-end for a graphical gaming environment based on earth-system modeling.
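
    A minimal usage sketch is shown below, assuming the diffusive energy-balance-model constructor and the integrate_years() time-stepping method present in recent CLIMLAB releases; the run length is arbitrary and the unweighted global average is only a rough summary.

```python
# A minimal sketch, assuming climlab.EBM() and integrate_years() are available
# in the installed climlab release; parameter defaults are used throughout.
import numpy as np
import climlab

model = climlab.EBM()            # zonal-mean energy balance model process
model.integrate_years(5.0)       # step the coupled processes forward in time

# Inspect the resulting surface temperature field (unweighted average for brevity).
print("approximate mean Ts:", float(np.mean(model.Ts)))
```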

  4. Supply based on demand dynamical model

    Science.gov (United States)

    Levi, Asaf; Sabuco, Juan; Sanjuán, Miguel A. F.

    2018-04-01

    We propose and numerically analyze a simple dynamical model that describes the firm behaviors under uncertainty of demand. Iterating this simple model and varying some parameter values, we observe a wide variety of market dynamics such as equilibria, periodic, and chaotic behaviors. Interestingly, the model is also able to reproduce market collapses.
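
    The abstract does not reproduce the model's equations, so the sketch below only mimics the workflow it describes: iterate a simple one-dimensional production-adjustment map and watch the long-run behaviour change from a fixed point to periodic orbits to chaos as a parameter is varied. The logistic-type rule and parameter values are placeholders, not the authors' firm-behaviour model.

```python
import numpy as np

def iterate_map(r, q0=0.2, n_transient=500, n_keep=50):
    """Iterate a toy production-adjustment rule q -> r*q*(1-q) and return the
    long-run values reached after discarding a transient."""
    q = q0
    for _ in range(n_transient):
        q = r * q * (1 - q)
    orbit = []
    for _ in range(n_keep):
        q = r * q * (1 - q)
        orbit.append(q)
    return np.array(orbit)

# Sweeping the parameter reveals equilibria, periodic orbits and chaos.
for r in (2.8, 3.2, 3.9):
    orbit = iterate_map(r)
    print(f"r={r}: {len(np.unique(np.round(orbit, 6)))} distinct long-run values")
```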

  5. Search for primarily non-interacting decay modes of the upsilon

    International Nuclear Information System (INIS)

    Leffler, J.S.

    1986-03-01

    The hadronic transition UPSILON(2S) → π⁰π⁰ UPSILON(1S) is utilized to search for the reactions: UPSILON(1S) → non-interacting particles and UPSILON(1S) → γ + non-interacting particles. 44 pb⁻¹ of UPSILON(2S) data were taken by the Crystal Ball detector at the DORIS II storage ring in order to perform this study. An upper limit of BR(UPSILON → Unseen) -1 of UPSILON(2S) data was available for this study. An upper limit on the branching ratio BR(UPSILON → γ + Unseen) is measured for photon energies in the range 500 MeV -3 (90% C.L.), is obtained. The compact size of the Crystal Ball detector enhances the observable branching ratio for noninteracting particles with short lifetimes such as massive axions. The identification of the recent Darmstadt events with a 1.6 MeV axion is excluded by the present result assuming the minimal Peccei-Quinn model. Limits on the spontaneous supersymmetry breaking mass scale are also derived as a function of gravitino mass

  6. Amazonian Amphibian Diversity Is Primarily Derived from Late Miocene Andean Lineages

    Science.gov (United States)

    Santos, Juan C; Coloma, Luis A; Summers, Kyle; Caldwell, Janalee P; Ree, Richard; Cannatella, David C

    2009-01-01

    The Neotropics contains half of remaining rainforests and Earth's largest reservoir of amphibian biodiversity. However, determinants of Neotropical biodiversity (i.e., vicariance, dispersals, extinctions, and radiations) earlier than the Quaternary are largely unstudied. Using a novel method of ancestral area reconstruction and relaxed Bayesian clock analyses, we reconstructed the biogeography of the poison frog clade (Dendrobatidae). We rejected an Amazonian center-of-origin in favor of a complex connectivity model expanding over the Neotropics. We inferred 14 dispersals into and 18 out of Amazonia to adjacent regions; the Andes were the major source of dispersals into Amazonia. We found three episodes of lineage dispersal with two interleaved periods of vicariant events between South and Central America. During the late Miocene, Amazonian and Central American-Chocoan lineages significantly increased their diversity compared to the Andean and Guianan-Venezuelan-Brazilian Shield counterparts. A significant percentage of dendrobatid diversity in Amazonia and Chocó resulted from repeated immigrations, while radiations in the Venezuelan Highlands and the Guiana Shield have undergone extended in situ diversification at a near-constant rate since the Oligocene. The effects of Miocene paleogeographic events on Neotropical diversification dynamics provided the framework under which Quaternary patterns of endemism evolved. PMID:19278298

  7. Why we cannot conclude that sexual orientation is primarily a biological phenomenon.

    Science.gov (United States)

    Byne, W

    1997-01-01

    While all mental phenomena must have an ultimate biological substrate, the precise contribution of biological factors to the development of sexual orientation remains to be elucidated. Does biology merely provide the slate of neural circuitry upon which sexual orientation is inscribed by experience? Do biological factors directly wire the brain so that it will support a particular orientation? Or do biological factors influence sexual orientation only indirectly, perhaps by influencing personality variables that in turn influence how one interacts with and shapes the environment as it contributes to the social relationships and experiences that shape sexual orientation as it emerges developmentally? Recent neurostructural and genetic linkage evidence pertaining to sexual orientation must be viewed tentatively until it has been adequately corroborated and integrated with psychological and cultural models. Moreover, even a reliable and robust correlation between a biological marker and sexual orientation would be equally compatible with the second and third possibilities delineated above. Yet if the third possibility more closely approximates reality, the search for predisposing biological factors will result in incomplete and misleading findings until their interactions with environmental factors are taken into account and controlled for in adequate longitudinal studies.

  8. Sub-micrometre Particulate Matter is Primarily in Liquid Form over Amazon Rainforests

    Energy Technology Data Exchange (ETDEWEB)

    Bateman, Adam P.; Gong, Z. H.; Liu, Pengfei; Sato, Bruno; Cirino, Glauber; Zhang, Yue; Artaxo, Paulo; Bertram, Allan K.; Manzi, A.; Rizzo, L. V.; Souza, Rodrigo A.; Zaveri, Rahul A.; Martin, Scot T.

    2016-01-01

    Particulate matter (PM) occurs in the Earth’s atmosphere both in liquid and non-liquid forms. The physical state affects the available physical and chemical mechanisms of growth and reactivity, ultimately affecting the number, size, and composition of the atmospheric particle population. Herein, the physical state, including the response to relative humidity (RH), was investigated on-line and in real time for PM (< 1 μm) over the tropical rain forest of central Amazonia during both the wet and dry seasons of 2013. The results show that the PM was liquid for RH > 80% across 296 to 300 K. These results, in conjunction with the distributions of RH and temperature in Amazonia, imply that near-surface submicron PM in Amazonia is liquid most of the time. The observations are consistent with laboratory experiments showing that PM produced by isoprene photo-oxidation is liquid across these meteorological conditions. The findings have implications for the mechanisms of new particle production in Amazonia, the growth of submicron particles and hence dynamics of the cloud life cycle, and the sensitivity of these processes to anthropogenic activities. An approach for inclusion of particle physical state in chemical transport models is presented.

  9. Analytical modeling of a sandwiched plate piezoelectric transformer-based acoustic-electric transmission channel.

    Science.gov (United States)

    Lawry, Tristan J; Wilt, Kyle R; Scarton, Henry A; Saulnier, Gary J

    2012-11-01

    The linear propagation of electromagnetic and dilatational waves through a sandwiched plate piezoelectric transformer (SPPT)-based acoustic-electric transmission channel is modeled using the transfer matrix method with mixed-domain two-port ABCD parameters. This SPPT structure is of great interest because it has been explored in recent years as a mechanism for wireless transmission of electrical signals through solid metallic barriers using ultrasound. The model we present is developed to allow for accurate channel performance prediction while greatly reducing the computational complexity associated with 2- and 3-dimensional finite element analysis. As a result, the model primarily considers 1-dimensional wave propagation; however, approximate solutions for higher-dimensional phenomena (e.g., diffraction in the SPPT's metallic core layer) are also incorporated. The model is then assessed by comparing it to the measured wideband frequency response of a physical SPPT-based channel from our previous work. Very strong agreement between the modeled and measured data is observed, confirming the accuracy and utility of the presented model.
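
    The transfer-matrix bookkeeping referred to above reduces to multiplying two-port ABCD matrices for the cascaded sections of the channel. The sketch below chains generic electrical sections (a series impedance and a shunt admittance) and converts the product to a loaded voltage transfer function; the section values are placeholders, not the mixed-domain piezoelectric and acoustic layers modeled in the paper.

```python
import numpy as np

def series_impedance(Z):
    """ABCD matrix of a series impedance element."""
    return np.array([[1.0, Z], [0.0, 1.0]], dtype=complex)

def shunt_admittance(Y):
    """ABCD matrix of a shunt admittance element."""
    return np.array([[1.0, 0.0], [Y, 1.0]], dtype=complex)

def cascade(sections):
    """Overall two-port of a chain: ordinary matrix product of the sections."""
    total = np.eye(2, dtype=complex)
    for m in sections:
        total = total @ m
    return total

# Placeholder channel evaluated at one frequency (values assumed for illustration).
f = 1.0e6                                  # 1 MHz
w = 2 * np.pi * f
chain = [
    series_impedance(1j * w * 10e-6),      # 10 uH series inductance
    shunt_admittance(1j * w * 1e-9),       # 1 nF shunt capacitance
    series_impedance(50.0),                # 50 ohm series resistance
]
A, B, C, D = cascade(chain).ravel()
Z_load = 50.0
gain = Z_load / (A * Z_load + B)           # loaded voltage transfer V_out/V_in
print("voltage gain magnitude:", abs(gain))
```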

  10. Abstracting event-based control models for high autonomy systems

    Science.gov (United States)

    Luh, Cheng-Jye; Zeigler, Bernard P.

    1993-01-01

    A high autonomy system needs many models on which to base control, management, design, and other interventions. These models differ in level of abstraction and in formalism. Concepts and tools are needed to organize the models into a coherent whole. The paper deals with the abstraction processes for systematic derivation of related models for use in event-based control. The multifaceted modeling methodology is briefly reviewed. The morphism concepts needed for application to model abstraction are described. A theory for supporting the construction of DEVS models needed for event-based control is then presented. An implemented morphism on the basis of this theory is also described.

  11. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  12. Cognitive components underpinning the development of model-based learning.

    Science.gov (United States)

    Potter, Tracey C S; Bryce, Nessa V; Hartley, Catherine A

    2017-06-01

    Reinforcement learning theory distinguishes "model-free" learning, which fosters reflexive repetition of previously rewarded actions, from "model-based" learning, which recruits a mental model of the environment to flexibly select goal-directed actions. Whereas model-free learning is evident across development, recruitment of model-based learning appears to increase with age. However, the cognitive processes underlying the development of model-based learning remain poorly characterized. Here, we examined whether age-related differences in cognitive processes underlying the construction and flexible recruitment of mental models predict developmental increases in model-based choice. In a cohort of participants aged 9-25, we examined whether the abilities to infer sequential regularities in the environment ("statistical learning"), maintain information in an active state ("working memory") and integrate distant concepts to solve problems ("fluid reasoning") predicted age-related improvements in model-based choice. We found that age-related improvements in statistical learning performance did not mediate the relationship between age and model-based choice. Ceiling performance on our working memory assay prevented examination of its contribution to model-based learning. However, age-related improvements in fluid reasoning statistically mediated the developmental increase in the recruitment of a model-based strategy. These findings suggest that gradual development of fluid reasoning may be a critical component process underlying the emergence of model-based learning. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.
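
    The model-free/model-based distinction that the study builds on can be made concrete with a toy problem: cache action values directly (model-free) versus learn a transition model and plan with it (model-based). Everything in the sketch below (the two-state environment, rewards and learning rates) is an assumed illustration, not the task administered to the participants.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy environment: action 0 usually leads to state A, action 1 to state B
# (transition structure and rewards are assumed for illustration).
P = np.array([[0.7, 0.3],     # P(next state | action 0)
              [0.3, 0.7]])    # P(next state | action 1)
state_value = np.array([1.0, 0.0])   # reward available in states A and B

alpha = 0.1
Q_mf = np.zeros(2)            # model-free cached action values
T_hat = np.full((2, 2), 0.5)  # model-based: learned transition estimates

for trial in range(2000):
    a = rng.integers(2)                       # explore both actions
    s = rng.choice(2, p=P[a])                 # sampled next state
    r = state_value[s]
    Q_mf[a] += alpha * (r - Q_mf[a])          # model-free: repeat what paid off
    T_hat[a] += alpha * (np.eye(2)[s] - T_hat[a])   # model-based: update the model

Q_mb = T_hat @ state_value                    # model-based: plan with the model
print("model-free values:", np.round(Q_mf, 2))
print("model-based values:", np.round(Q_mb, 2))
```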

  13. Examination of occupational exposure to medical staff (primarily nurses) during 131I medical treatments

    International Nuclear Information System (INIS)

    Watanabe, Masayoshi; Ishikawa, Naofumi; Ito, Kunihiko; Ito, Koichi

    2004-01-01

    Recently, a new amendment to protect humans against radiation damage has been enacted based on a 1990 recommendation by the International Commission on Radiological Protection (ICRP). Consequently, the dose limits of occupational exposure for medical staff were cut down sharply compared with the conventional regulations. This amended bill, however, may be triggering a reduction in the number of applicants who hope to engage in radiotherapy. This being the case, we measured the doses of occupational exposure to medical staff (doctor's group, nuclear medicine technologist's group, nurse's group and pharmacist's group) from 1999 to 2002. Moreover, we investigated the main factor in nurses' occupational exposure to ¹³¹I. The highest doses of occupational exposure were 3.640 mSv for doctors, 7.060 mSv for nuclear medicine technologists, 1.486 mSv for nurses and 0.552 mSv for pharmacists. According to our results, it was clear that the highest doses in each group were far below the legally mandated upper limits of exposure doses. Although we investigated the correlations between nurses' occupational exposure to ¹³¹I and the number of inpatients, the amount of ¹³¹I and the number of servicing times for patients, no correlations were found. Furthermore, on analyzing the factors in detail, it became clear that the main factor in the nurses' occupational exposure was the existence of patients who needed many more servicing times for their care than ordinary patients. (author)

  14. Genetic Algorithm Based Microscale Vehicle Emissions Modelling

    Directory of Open Access Journals (Sweden)

    Sicong Zhu

    2015-01-01

    Full Text Available There is a need to match emission estimation accuracy with the outputs of transport models. The overall error rate in long-term traffic forecasts resulting from strategic transport models is likely to be significant. Microsimulation models, whilst high-resolution in nature, may have similar measurement errors if they use the outputs of strategic models to obtain traffic demand predictions. At the microlevel, this paper discusses the limitations of existing emissions estimation approaches. Emission models for predicting pollutants other than CO2 are proposed. A genetic algorithm approach is adopted to select the predicting variables for the black box model. The approach is capable of solving combinatorial optimization problems. Overall, the emission prediction results reveal that the proposed new models outperform conventional equations in terms of accuracy and robustness.
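
    A genetic-algorithm variable-selection loop of the kind described can be sketched compactly: candidate solutions are bit masks over the predictor set, fitness is the quality of a regression restricted to the selected predictors, and selection, crossover and mutation produce each new generation. The synthetic data, population size and rates below are assumptions, not the calibrated emission model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic driving data: only 3 of the 8 candidate variables matter (assumed).
n, p = 400, 8
X = rng.standard_normal((n, p))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + 0.8 * X[:, 5] + 0.3 * rng.standard_normal(n)

def fitness(mask):
    """Negative residual error of an OLS fit on the selected variables."""
    if not mask.any():
        return -np.inf
    Xs = X[:, mask.astype(bool)]
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    resid = y - Xs @ beta
    return -np.mean(resid ** 2) - 0.01 * mask.sum()   # penalise extra variables

pop = rng.integers(0, 2, size=(30, p))                 # random initial bit masks
for gen in range(40):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[::-1][:10]]       # truncation selection
    children = []
    while len(children) < len(pop):
        a, b = parents[rng.integers(10)], parents[rng.integers(10)]
        cut = rng.integers(1, p)                       # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(p) < 0.05                    # mutation
        children.append(np.where(flip, 1 - child, child))
    pop = np.array(children)

best = pop[np.argmax([fitness(m) for m in pop])]
print("selected variables:", np.flatnonzero(best))
```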

  15. Model-Based Learning Environment Based on The Concept IPS School-Based Management

    Directory of Open Access Journals (Sweden)

    Hamid Darmadi

    2017-03-01

    Full Text Available The results showed that: (1) the environment-based IPS learning model can grow love of the cultural values of the area as a basis for the development of national culture; (2) community participation and the role of government in implementing the environment-based IPS learning model have a positive impact on the improvement of the management of school resources; (3) the environment-based IPS learning model is effective in creating a way of living together peacefully, increasing the intensity of togetherness and mutual respect; (4) the environment-based IPS learning model can improve student learning outcomes; (5) there are differences in the expression of attitudes and learning results between students who are located in the area of conflict and students who are outside the area of conflict; (6) the analysis of the attitude scale among junior and senior high school students shows that students respect the values of unity and nation, respect diversity, and value peaceful coexistence. It is recommended that the Department of Education, as the institution entrusted with the preservation and development of social and cultural values in the province, apply the environment-based IPS learning model.

  16. Y-Chromosomal Diversity in Europe Is Clinal and Influenced Primarily by Geography, Rather than by Language

    Science.gov (United States)

    Rosser, Zoë H.; Zerjal, Tatiana; Hurles, Matthew E.; Adojaan, Maarja; Alavantic, Dragan; Amorim, António; Amos, William; Armenteros, Manuel; Arroyo, Eduardo; Barbujani, Guido; Beckman, Gunhild; Beckman, Lars; Bertranpetit, Jaume; Bosch, Elena; Bradley, Daniel G.; Brede, Gaute; Cooper, Gillian; Côrte-Real, Helena B. S. M.; de Knijff, Peter; Decorte, Ronny; Dubrova, Yuri E.; Evgrafov, Oleg; Gilissen, Anja; Glisic, Sanja; Gölge, Mukaddes; Hill, Emmeline W.; Jeziorowska, Anna; Kalaydjieva, Luba; Kayser, Manfred; Kivisild, Toomas; Kravchenko, Sergey A.; Krumina, Astrida; Kučinskas, Vaidutis; Lavinha, João; Livshits, Ludmila A.; Malaspina, Patrizia; Maria, Syrrou; McElreavey, Ken; Meitinger, Thomas A.; Mikelsaar, Aavo-Valdur; Mitchell, R. John; Nafa, Khedoudja; Nicholson, Jayne; Nørby, Søren; Pandya, Arpita; Parik, Jüri; Patsalis, Philippos C.; Pereira, Luísa; Peterlin, Borut; Pielberg, Gerli; Prata, Maria João; Previderé, Carlo; Roewer, Lutz; Rootsi, Siiri; Rubinsztein, D. C.; Saillard, Juliette; Santos, Fabrício R.; Stefanescu, Gheorghe; Sykes, Bryan C.; Tolun, Aslihan; Villems, Richard; Tyler-Smith, Chris; Jobling, Mark A.

    2000-01-01

    Clinal patterns of autosomal genetic diversity within Europe have been interpreted in previous studies in terms of a Neolithic demic diffusion model for the spread of agriculture; in contrast, studies using mtDNA have traced many founding lineages to the Paleolithic and have not shown strongly clinal variation. We have used 11 human Y-chromosomal biallelic polymorphisms, defining 10 haplogroups, to analyze a sample of 3,616 Y chromosomes belonging to 47 European and circum-European populations. Patterns of geographic differentiation are highly nonrandom, and, when they are assessed using spatial autocorrelation analysis, they show significant clines for five of six haplogroups analyzed. Clines for two haplogroups, representing 45% of the chromosomes, are continentwide and consistent with the demic diffusion hypothesis. Clines for three other haplogroups each have different foci and are more regionally restricted and are likely to reflect distinct population movements, including one from north of the Black Sea. Principal-components analysis suggests that populations are related primarily on the basis of geography, rather than on the basis of linguistic affinity. This is confirmed in Mantel tests, which show a strong and highly significant partial correlation between genetics and geography but a low, nonsignificant partial correlation between genetics and language. Genetic-barrier analysis also indicates the primacy of geography in the shaping of patterns of variation. These patterns retain a strong signal of expansion from the Near East but also suggest that the demographic history of Europe has been complex and influenced by other major population movements, as well as by linguistic and geographic heterogeneities and the effects of drift. PMID:11078479
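
    The (simple, non-partial) Mantel test used in such analyses is easy to sketch: correlate the off-diagonal entries of two distance matrices and obtain a p-value by permuting the rows and columns of one of them. The toy "genetic" and "geographic" matrices below are assumptions for illustration, not the haplogroup data of the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def mantel(dist_a, dist_b, n_perm=999):
    """Simple Mantel test: correlation between two distance matrices, with a
    permutation p-value obtained by shuffling the rows/columns of one matrix."""
    iu = np.triu_indices_from(dist_a, k=1)
    r_obs = np.corrcoef(dist_a[iu], dist_b[iu])[0, 1]
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(dist_a.shape[0])
        r_perm = np.corrcoef(dist_a[perm][:, perm][iu], dist_b[iu])[0, 1]
        count += r_perm >= r_obs
    return r_obs, (count + 1) / (n_perm + 1)

# Toy "genetic" and "geographic" distance matrices for 20 populations (assumed).
n = 20
coords = rng.random((n, 2))
geo = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
gen = geo + 0.2 * rng.random((n, n))
gen = (gen + gen.T) / 2          # keep the matrix symmetric
np.fill_diagonal(gen, 0.0)

r, p_value = mantel(gen, geo)
print(f"Mantel r = {r:.2f}, permutation p = {p_value:.3f}")
```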

  17. Is cruising along European rivers primarily intended for seniors and workers from Eastern Europe?

    Directory of Open Access Journals (Sweden)

    Erdeji Irma

    2017-01-01

    Full Text Available The subject of the paper is river cruising along the rivers of Europe in Belgium, the Netherlands, Germany, Switzerland, France, Austria, Slovakia, Hungary, Croatia, Serbia, Bulgaria, Romania and Italy. The research needed to determine the trends in terms of consumers and the labour force, particularly considering the great political and economic changes in Europe (and the world) in the last ten years. The aims of the research were set in relation to the following: to determine the profile of the tourists as well as of the crew members. This paper is based on empirical and theoretical research. It combines quantitative primary and secondary data collection as well as qualitative data collection (interviews). Primary data was collected from the 'Uniworld' company by analysing crew manifests in order to define the demographic profile of the employees. Secondary data collection was used to define the profile of the tourist, for which the latest relevant publications were consulted. The qualitative method was used to gain more insight into the latest trends in river cruising by interviewing the managers of the 'Uniworld' company. It was determined that 'baby boomers' are no longer prevalent on the cruise ships, and that the 'millennials' cohort is on the rise. Such changes will require a new approach among the cruising companies - in terms of the concept of service delivery and marketing. However, among employees there is no significant change, suggesting that this segment of the job market is tightly regulated by EU regulations. This research offers valuable data in the field of tourism destination management, as well as for the needs of some stakeholders, especially in terms of human resources management and the management of strategic development issues. This is important both for the countries which have already positioned themselves on the cruising market as well as for emerging destinations.

  18. Model-based accelerator controls: What, why and how

    International Nuclear Information System (INIS)

    Sidhu, S.S.

    1987-01-01

    Model-based control is defined as a gamut of techniques whose aim is to improve the reliability of an accelerator and enhance the capabilities of the operator, and therefore of the whole control system. The aim of model-based control is seen as gradually moving the function of model-reference from the operator to the computer. The role of the operator in accelerator control and the need for and application of model-based control are briefly summarized

  19. Stochastic Wake Modelling Based on POD Analysis

    Directory of Open Access Journals (Sweden)

    David Bastine

    2018-03-01

    Full Text Available In this work, large eddy simulation data is analysed to investigate a new stochastic modeling approach for the wake of a wind turbine. The data is generated by the large eddy simulation (LES) model PALM combined with an actuator disk with rotation representing the turbine. After applying a proper orthogonal decomposition (POD), three different stochastic models for the weighting coefficients of the POD modes are deduced resulting in three different wake models. Their performance is investigated mainly on the basis of aeroelastic simulations of a wind turbine in the wake. Three different load cases and their statistical characteristics are compared for the original LES, truncated PODs and the stochastic wake models including different numbers of POD modes. It is shown that approximately six POD modes are enough to capture the load dynamics on large temporal scales. Modeling the weighting coefficients as independent stochastic processes leads to similar load characteristics as in the case of the truncated POD. To complete this simplified wake description, we show evidence that the small-scale dynamics can be captured by adding to our model a homogeneous turbulent field. In this way, we present a procedure to derive stochastic wake models from costly computational fluid dynamics (CFD) calculations or elaborated experimental investigations. These numerically efficient models provide the added value of possible long-term studies. Depending on the aspects of interest, different minimalized models may be obtained.
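
    The pipeline described above (POD of snapshot data, then a stochastic surrogate for the weighting coefficients) can be illustrated with a minimal sketch. The code below is an assumption-laden toy: the snapshot array is random placeholder data, the AR(1) model for each coefficient is one simple choice of stochastic process, and none of the names correspond to the PALM/actuator-disk workflow of the paper.

```python
import numpy as np

# Minimal POD-based stochastic wake surrogate (illustrative only).
rng = np.random.default_rng(0)
n_time, n_points = 2000, 500
snapshots = rng.standard_normal((n_time, n_points))  # placeholder LES snapshots

mean_field = snapshots.mean(axis=0)
fluct = snapshots - mean_field

# POD via thin SVD: rows of vt are spatial modes, a holds weighting coefficients.
u, s, vt = np.linalg.svd(fluct, full_matrices=False)
a = u * s                      # (n_time, n_modes) POD weighting coefficients

n_modes = 6                    # ~6 modes capture the large-scale load dynamics
a_trunc = a[:, :n_modes]

# Model each coefficient as an independent AR(1) process fitted to its
# lag-1 autocorrelation -- one simple choice of "stochastic model".
def fit_ar1(x):
    phi = np.corrcoef(x[:-1], x[1:])[0, 1]
    sigma = np.std(x) * np.sqrt(1.0 - phi ** 2)
    return phi, sigma

def simulate_ar1(phi, sigma, n):
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = phi * y[t - 1] + sigma * rng.standard_normal()
    return y

a_synth = np.column_stack([
    simulate_ar1(*fit_ar1(a_trunc[:, k]), n_time) for k in range(n_modes)
])

# Reconstruct a synthetic wake field from the stochastic coefficients.
wake_synth = mean_field + a_synth @ vt[:n_modes, :]
print(wake_synth.shape)
```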

  20. New Temperature-based Models for Predicting Global Solar Radiation

    International Nuclear Information System (INIS)

    Hassan, Gasser E.; Youssef, M. Elsayed; Mohamed, Zahraa E.; Ali, Mohamed A.; Hanafy, Ahmed A.

    2016-01-01

    Highlights: • New temperature-based models for estimating solar radiation are investigated. • The models are validated against 20 years of measured data of global solar radiation. • The new temperature-based model shows the best performance for coastal sites. • The new temperature-based model is more accurate than the sunshine-based models. • The new model is highly applicable with weather temperature forecast techniques. - Abstract: This study presents new ambient-temperature-based models for estimating global solar radiation as alternatives to the widely used sunshine-based models owing to the unavailability of sunshine data at all locations around the world. Seventeen new temperature-based models are established, validated and compared with three other models proposed in the literature (the Annandale, Allen and Goodin models) to estimate the monthly average daily global solar radiation on a horizontal surface. These models are developed using a 20-year measured dataset of global solar radiation for the case study location (Lat. 30°51′N and long. 29°34′E), and then, the general formulae of the newly suggested models are examined for ten different locations around Egypt. Moreover, the local formulae for the models are established and validated for two coastal locations where the general formulae give inaccurate predictions. The most common statistical error measures are utilized to evaluate the performance of these models and identify the most accurate model. The obtained results show that the local formula for the most accurate new model provides good predictions for global solar radiation at different locations, especially at coastal sites. Moreover, the local and general formulae of the most accurate temperature-based model also perform better than the two most accurate sunshine-based models from the literature. The quick and accurate estimations of the global solar radiation using this approach can be employed in the design and evaluation of performance for
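
    Temperature-based radiation models of this family typically relate the clearness index to the diurnal temperature range. The sketch below fits a Hargreaves-type form, H/H0 = a·sqrt(Tmax − Tmin) + b, by least squares; the functional form, the synthetic data and the coefficient values are illustrative assumptions, not the seventeen models developed in the paper.

```python
import numpy as np

# Fit a Hargreaves-type temperature-based solar radiation model to
# (synthetic) monthly-average data.
rng = np.random.default_rng(1)
t_range = rng.uniform(5.0, 15.0, size=120)        # monthly Tmax - Tmin [degC]
h0 = rng.uniform(20.0, 40.0, size=120)            # extraterrestrial radiation [MJ/m2/day]
h_true = h0 * (0.16 * np.sqrt(t_range) + 0.02)    # "measured" global radiation
h_meas = h_true + rng.normal(0.0, 0.5, size=120)

# Linear least squares for a and b in the clearness-index form H/H0 = a*sqrt(dT) + b.
X = np.column_stack([np.sqrt(t_range), np.ones_like(t_range)])
y = h_meas / h0
a, b = np.linalg.lstsq(X, y, rcond=None)[0]
h_pred = h0 * (a * np.sqrt(t_range) + b)

rmse = np.sqrt(np.mean((h_pred - h_meas) ** 2))
print(f"a={a:.3f}, b={b:.3f}, RMSE={rmse:.2f} MJ/m2/day")
```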

  1. Culturable bioaerosols along an urban waterfront are primarily associated with coarse particles

    Directory of Open Access Journals (Sweden)

    Angel Montero

    2016-12-01

    Full Text Available The source, characteristics and transport of viable microbial aerosols in urban centers are topics of significant environmental and public health concern. Recent studies have identified adjacent waterways, and especially polluted waterways, as an important source of microbial aerosols to urban air. The size of these aerosols influences how far they travel, their resistance to environmental stress, and their inhalation potential. In this study, we utilize a cascade impactor and aerosol particle monitor to characterize the size distribution of particles and culturable bacterial and fungal aerosols along the waterfront of a New York City embayment. We seek to address the potential contribution of bacterial aerosols from local sources and to determine how their number, size distribution, and taxonomic identity are affected by wind speed and wind direction (onshore vs. offshore). Total culturable microbial counts were higher under offshore winds (average of 778 CFU/m3 ± 67), with bacteria comprising the majority of colonies (58.5%), as compared to onshore winds (580 CFU/m3 ± 110), where fungi were dominant (87.7%). The majority of cultured bacteria and fungi sampled during both offshore winds (88%) and onshore winds (72%) were associated with coarse aerosols (>2.1 µm), indicative of production from local sources. There was a significant correlation (p < 0.05) of wind speed with both total and coarse culturable microbial aerosol concentrations. Taxonomic analysis, based on DNA sequencing, showed that Actinobacteria was the dominant phylum among aerosol isolates. In particular, Streptomyces and Bacillus, both spore forming genera that are often soil-associated, were abundant under both offshore and onshore wind conditions. Comparisons of bacterial communities present in the bioaerosol sequence libraries revealed that particle size played an important role in microbial aerosol taxonomy. Onshore and offshore coarse libraries were found to be most similar

  2. Agent Based Reasoning in Multilevel Flow Modeling

    DEFF Research Database (Denmark)

    Lind, Morten; Zhang, Xinxin

    2012-01-01

    to launch the MFM Workbench into an agent-based environment, which can compensate for disadvantages of the original software. The agent-based MFM Workbench is centered on a concept called “Blackboard System” and uses an event-based mechanism to arrange the reasoning tasks. This design will support the new...

  3. On Process Modelling Using Physical Oriented And Phenomena Based Principles

    Directory of Open Access Journals (Sweden)

    Mihai Culea

    2000-12-01

    Full Text Available This work presents a modelling framework based on a phenomena description of the process. The approach is intended to make process models easy to understand and construct in heterogeneous, possibly distributed, modelling and simulation environments. A simplified case study of a heat exchanger is considered, and the Modelica modelling language is used to check the proposed concept. The partial results are promising, and the research effort will be extended into a computer-aided modelling environment based on phenomena.
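
    The phenomena-based idea can be illustrated with a lumped heat-exchanger sketch: two energy balances coupled by a single heat-transfer phenomenon. The sketch is written in Python rather than Modelica, and all parameter values are illustrative assumptions, not the case study from the paper.

```python
import numpy as np

# Phenomena-based lumped counter-flow heat exchanger: two energy balances
# coupled by one transfer phenomenon q = UA*(T_hot - T_cold).
UA = 500.0                     # overall heat transfer coefficient * area [W/K]
m_hot, m_cold = 0.5, 0.4       # mass flow rates [kg/s]
cp = 4180.0                    # specific heat [J/(kg K)]
C_hot, C_cold = 20e3, 20e3     # lumped thermal capacities [J/K]
T_hot_in, T_cold_in = 90.0, 20.0

def rhs(T):
    T_hot, T_cold = T
    q = UA * (T_hot - T_cold)                                # transfer phenomenon
    dT_hot = (m_hot * cp * (T_hot_in - T_hot) - q) / C_hot   # hot-side balance
    dT_cold = (m_cold * cp * (T_cold_in - T_cold) + q) / C_cold  # cold-side balance
    return np.array([dT_hot, dT_cold])

# Explicit Euler integration of the two balances to steady state.
T = np.array([90.0, 20.0])
dt = 0.5
for _ in range(2000):
    T = T + dt * rhs(T)
print(f"steady outlet temperatures: hot={T[0]:.1f} C, cold={T[1]:.1f} C")
```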

  4. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for execution of agent-based models -from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hard

  5. A Size-based Ecosystem Model

    DEFF Research Database (Denmark)

    Ravn-Jonsen, Lars

     Ecosystem Management requires models that can link the ecosystem level to the operation level. This link can be created by an ecosystem production model. Because the function of the individual fish in the marine ecosystem, seen in trophic context, is closely related to its size, the model groups...... fish according to size. The model summarises individual predation events into ecosystem level properties, and thereby uses the law of conservation of mass as a framework. This paper provides the background, the conceptual model, basic assumptions, integration of fishing activities, mathematical...... the predator-prey interaction, (ii) mass balance in the predator-prey allocation, and (iii) mortality and somatic growth as a consequence of the predator-prey allocation. By incorporating additional assumptions, the model can be extended to other dimensions of the ecosystem, for example, space or species...

  6. Tree-Based Global Model Tests for Polytomous Rasch Models

    Science.gov (United States)

    Komboz, Basil; Strobl, Carolin; Zeileis, Achim

    2018-01-01

    Psychometric measurement models are only valid if measurement invariance holds between test takers of different groups. Global model tests, such as the well-established likelihood ratio (LR) test, are sensitive to violations of measurement invariance, such as differential item functioning and differential step functioning. However, these…

  7. Model-based software process improvement

    Science.gov (United States)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  8. Integrating Effects-Based and Attrition-Based Modeling

    National Research Council Canada - National Science Library

    DeGregorio, Edward A; Janssen, Raymond A; Wagenhals, Lee W; Messier, Richard H

    2004-01-01

    .... Modeling the NCW EBO process attempts to codify the belief structure and reasoning of adversaries and their cause-effect relationships with US and coalition actions, including mitigating undesired effects...

  9. Handling high predictor dimensionality in slope-unit-based landslide susceptibility models through LASSO-penalized Generalized Linear Model

    KAUST Repository

    Camilo, Daniela Castro

    2017-08-30

    Grid-based landslide susceptibility models at regional scales are computationally demanding when using a fine grid resolution. Conversely, Slope-Unit (SU) based susceptibility models allow the same areas to be investigated, offering two main advantages: 1) a smaller computational burden and 2) a more geomorphologically-oriented interpretation. In this contribution, we generate SU-based landslide susceptibility for the Sado Island in Japan. This island is characterized by deep-seated landslides, which we assume can be only partially explained by the first two statistical moments (mean and variance) of a set of predictors within each slope unit. As a consequence, in a nested experiment, we first analyse the distributions of a set of continuous predictors within each slope unit, computing the standard deviation and quantiles from 0.05 to 0.95 with a step of 0.05. These are then used as predictors for landslide susceptibility. In addition, we combine shape indices for polygon features and the normalized extent of each class belonging to the outcropping lithology in a given SU. This procedure significantly enlarges the size of the predictor hyperspace, thus producing a high level of slope-unit characterization. In a second step, we adopt a LASSO-penalized Generalized Linear Model to shrink back the predictor set to a sensible and interpretable number, carrying only the most significant covariates in the models. As a result, we are able to document the geomorphic features (e.g., 95% quantile of Elevation and 5% quantile of Plan Curvature) that primarily control the SU-based susceptibility within the test area while producing high predictive performances. The implementation of the statistical analyses is included in a parallelized R script (LUDARA), which is made available here for the community to replicate analogous experiments.
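
    The core statistical step, a LASSO-penalized GLM that shrinks a large predictor set down to a few non-zero coefficients, can be sketched as follows. The sketch uses Python and scikit-learn with random placeholder data, not the LUDARA R script or the actual slope-unit predictors; feature names and dimensions are assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# LASSO-penalized GLM (L1 logistic regression) for slope-unit susceptibility.
rng = np.random.default_rng(42)
n_units, n_predictors = 1000, 150          # many quantile/shape predictors per SU
X = rng.standard_normal((n_units, n_predictors))
true_beta = np.zeros(n_predictors)
true_beta[[3, 17, 60]] = [1.5, -1.0, 0.8]  # only a few predictors truly matter
p = 1.0 / (1.0 + np.exp(-(X @ true_beta - 0.5)))
y = rng.binomial(1, p)                     # 1 = slope unit hosts a landslide

# The L1 penalty shrinks irrelevant coefficients exactly to zero.
model = make_pipeline(
    StandardScaler(),
    LogisticRegression(penalty="l1", solver="liblinear", C=0.1),
)
model.fit(X, y)

coefs = model.named_steps["logisticregression"].coef_.ravel()
kept = np.flatnonzero(coefs != 0.0)
print(f"{kept.size} of {n_predictors} predictors retained:", kept[:10])
```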

  10. Handling high predictor dimensionality in slope-unit-based landslide susceptibility models through LASSO-penalized Generalized Linear Model

    KAUST Repository

    Camilo, Daniela Castro; Lombardo, Luigi; Mai, Paul Martin; Dou, Jie; Huser, Raphaë l

    2017-01-01

    Grid-based landslide susceptibility models at regional scales are computationally demanding when using a fine grid resolution. Conversely, Slope-Unit (SU) based susceptibility models allow the same areas to be investigated, offering two main advantages: 1) a smaller computational burden and 2) a more geomorphologically-oriented interpretation. In this contribution, we generate SU-based landslide susceptibility for the Sado Island in Japan. This island is characterized by deep-seated landslides, which we assume can be only partially explained by the first two statistical moments (mean and variance) of a set of predictors within each slope unit. As a consequence, in a nested experiment, we first analyse the distributions of a set of continuous predictors within each slope unit, computing the standard deviation and quantiles from 0.05 to 0.95 with a step of 0.05. These are then used as predictors for landslide susceptibility. In addition, we combine shape indices for polygon features and the normalized extent of each class belonging to the outcropping lithology in a given SU. This procedure significantly enlarges the size of the predictor hyperspace, thus producing a high level of slope-unit characterization. In a second step, we adopt a LASSO-penalized Generalized Linear Model to shrink back the predictor set to a sensible and interpretable number, carrying only the most significant covariates in the models. As a result, we are able to document the geomorphic features (e.g., 95% quantile of Elevation and 5% quantile of Plan Curvature) that primarily control the SU-based susceptibility within the test area while producing high predictive performances. The implementation of the statistical analyses is included in a parallelized R script (LUDARA), which is made available here for the community to replicate analogous experiments.

  11. A general model for membrane-based separation processes

    DEFF Research Database (Denmark)

    Soni, Vipasha; Abildskov, Jens; Jonsson, Gunnar Eigil

    2009-01-01

    behaviour will play an important role. In this paper, modelling of membrane-based processes for separation of gas and liquid mixtures are considered. Two general models, one for membrane-based liquid separation processes (with phase change) and another for membrane-based gas separation are presented....... The separation processes covered are: membrane-based gas separation processes, pervaporation and various types of membrane distillation processes. The specific model for each type of membrane-based process is generated from the two general models by applying the specific system descriptions and the corresponding...

  12. Neural Network Based Models for Fusion Applications

    Science.gov (United States)

    Meneghini, Orso; Tema Biwole, Arsene; Luda, Teobaldo; Zywicki, Bailey; Rea, Cristina; Smith, Sterling; Snyder, Phil; Belli, Emily; Staebler, Gary; Canty, Jeff

    2017-10-01

    Whole device modeling, engineering design, experimental planning and control applications demand models that are simultaneously physically accurate and fast. This poster reports on the ongoing effort towards the development and validation of a series of models that leverage neural-network (NN) multidimensional regression techniques to accelerate some of the most mission critical first-principles models for the fusion community, such as: the EPED workflow for prediction of the H-Mode and Super H-Mode pedestal structure; the TGLF and NEO models for the prediction of the turbulent and neoclassical particle, energy and momentum fluxes; and the NEO model for the drift-kinetic solution of the bootstrap current. We also applied NNs to DIII-D experimental data for disruption prediction and quantifying the effect of RMPs on the pedestal and ELMs. All of these projects were supported by the infrastructure provided by the OMFIT integrated modeling framework. Work supported by US DOE under DE-SC0012656, DE-FG02-95ER54309, DE-FC02-04ER54698.
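
    The general surrogate idea, training a neural-network regression on precomputed runs of an expensive physics code and then evaluating the network in place of the code, can be sketched generically. The toy "expensive_model" below stands in for codes such as TGLF or NEO but is not their actual parameterization; all names and dimensions are assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Neural-network surrogate for an expensive first-principles model.
rng = np.random.default_rng(7)

def expensive_model(x):
    # Placeholder nonlinear response in 4 normalized plasma parameters.
    return np.sin(x[:, 0]) * np.exp(-x[:, 1] ** 2) + 0.5 * x[:, 2] * x[:, 3]

X_train = rng.uniform(-2, 2, size=(5000, 4))
y_train = expensive_model(X_train)

surrogate = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0),
)
surrogate.fit(X_train, y_train)

# The trained network is evaluated in microseconds inside a modeling loop.
X_test = rng.uniform(-2, 2, size=(1000, 4))
err = surrogate.predict(X_test) - expensive_model(X_test)
print(f"surrogate RMSE on held-out points: {np.sqrt(np.mean(err ** 2)):.3f}")
```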

  13. Equifinality and process-based modelling

    Science.gov (United States)

    Khatami, S.; Peel, M. C.; Peterson, T. J.; Western, A. W.

    2017-12-01

    Equifinality is understood as one of the fundamental difficulties in the study of open complex systems, including catchment hydrology. A review of the hydrologic literature reveals that the term equifinality has been widely used, but in many cases inconsistently and without coherent recognition of the various facets of equifinality, which can lead to ambiguity as well as to methodological fallacies. Therefore, in this study we first characterise the term equifinality within the context of hydrological modelling by reviewing the genesis of the concept of equifinality and then presenting a theoretical framework. During past decades, equifinality has mainly been studied as a subset of aleatory (arising due to randomness) uncertainty and for the assessment of model parameter uncertainty. Although the connection between parameter uncertainty and equifinality is undeniable, we argue there is more to equifinality than just aleatory parameter uncertainty. That is, the importance of equifinality and epistemic uncertainty (arising due to lack of knowledge) and their implications are overlooked in our current practice of model evaluation. Equifinality and epistemic uncertainty in studying, modelling, and evaluating hydrologic processes are treated as if they can be simply discussed in (or often reduced to) probabilistic terms (as for aleatory uncertainty). The deficiencies of this approach to conceptual rainfall-runoff modelling are demonstrated for selected Australian catchments by examination of parameter and internal flux distributions and interactions within SIMHYD. On this basis, we present a new approach that expands the equifinality concept beyond model parameters to inform epistemic uncertainty. The new approach potentially facilitates the identification and development of more physically plausible models and model evaluation schemes, particularly within the multiple working hypotheses framework, and is generalisable to other fields of environmental modelling as well.

  14. Statistical Model-Based Face Pose Estimation

    Institute of Scientific and Technical Information of China (English)

    GE Xinliang; YANG Jie; LI Feng; WANG Huahua

    2007-01-01

    A robust face pose estimation approach is proposed that uses a statistical face shape model and represents pose parameters by trigonometric functions. The statistical face shape model is first built by analyzing the face shapes of different people under varying poses. Shape alignment is vital in the process of building the statistical model. Then, six trigonometric functions are employed to represent the face pose parameters. Lastly, a mapping function between face image and face pose is constructed by linearly relating the different parameters. The proposed approach is able to estimate different face poses using only a few face training samples. Experimental results are provided to demonstrate its efficiency and accuracy.

  15. AGAMA: Action-based galaxy modeling framework

    Science.gov (United States)

    Vasiliev, Eugene

    2018-05-01

    The AGAMA library models galaxies. It computes gravitational potential and forces, performs orbit integration and analysis, and can convert between position/velocity and action/angle coordinates. It offers a framework for finding best-fit parameters of a model from data and self-consistent multi-component galaxy models, and contains useful auxiliary utilities such as various mathematical routines. The core of the library is written in C++, and there are Python and Fortran interfaces. AGAMA may be used as a plugin for the stellar-dynamical software packages galpy (ascl:1411.008), AMUSE (ascl:1107.007), and NEMO (ascl:1010.051).

  16. An ontology-based approach for modelling architectural styles

    OpenAIRE

    Pahl, Claus; Giesecke, Simon; Hasselbring, Wilhelm

    2007-01-01

    The conceptual modelling of software architectures is of central importance for the quality of a software system. A rich modelling language is required to integrate the different aspects of architecture modelling, such as architectural styles, structural and behavioural modelling, into a coherent framework. We propose an ontological approach for architectural style modelling based on description logic as an abstract, meta-level modelling instrument. Architect...

  17. Inference-based procedural modeling of solids

    KAUST Repository

    Biggers, Keith; Keyser, John

    2011-01-01

    As virtual environments become larger and more complex, there is an increasing need for more automated construction algorithms to support the development process. We present an approach for modeling solids by combining prior examples with a simple

  18. Base Flow Model Validation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is the systematic "building-block" validation of CFD/turbulence models employing a GUI driven CFD code (RPFM) and existing as well as new data sets to...

  19. Models for Patch-Based Image Restoration

    Directory of Open Access Journals (Sweden)

    Petrovic Nemanja

    2009-01-01

    Full Text Available Abstract We present a supervised learning approach for object-category specific restoration, recognition, and segmentation of images which are blurred using an unknown kernel. The novelty of this work is a multilayer graphical model which unifies the low-level vision task of restoration and the high-level vision task of recognition in a cooperative framework. The graphical model is an interconnected two-layer Markov random field. The restoration layer accounts for the compatibility between sharp and blurred images and models the association between adjacent patches in the sharp image. The recognition layer encodes the entity class and its location in the underlying scene. The potentials are represented using nonparametric kernel densities and are learnt from training data. Inference is performed using nonparametric belief propagation. Experiments demonstrate the effectiveness of our model for the restoration and recognition of blurred license plates as well as face images.

  20. Models for Patch-Based Image Restoration

    Directory of Open Access Journals (Sweden)

    Mithun Das Gupta

    2009-01-01

    Full Text Available We present a supervised learning approach for object-category specific restoration, recognition, and segmentation of images which are blurred using an unknown kernel. The novelty of this work is a multilayer graphical model which unifies the low-level vision task of restoration and the high-level vision task of recognition in a cooperative framework. The graphical model is an interconnected two-layer Markov random field. The restoration layer accounts for the compatibility between sharp and blurred images and models the association between adjacent patches in the sharp image. The recognition layer encodes the entity class and its location in the underlying scene. The potentials are represented using nonparametric kernel densities and are learnt from training data. Inference is performed using nonparametric belief propagation. Experiments demonstrate the effectiveness of our model for the restoration and recognition of blurred license plates as well as face images.

  1. An operator model-based filtering scheme

    International Nuclear Information System (INIS)

    Sawhney, R.S.; Dodds, H.L.; Schryer, J.C.

    1990-01-01

    This paper presents a diagnostic model developed at Oak Ridge National Laboratory (ORNL) for off-normal nuclear power plant events. The diagnostic model is intended to serve as an embedded module of a cognitive model of the human operator, one application of which could be to assist control room operators in correctly responding to off-normal events by providing a rapid and accurate assessment of alarm patterns and parameter trends. The sequential filter model is composed of two distinct subsystems: an alarm analysis followed by an analysis of interpreted plant signals. During the alarm analysis phase, the alarm pattern is evaluated to generate hypotheses of possible initiating events in order of likelihood of occurrence. Each hypothesis is further evaluated through analysis of the current trends of state variables in order to validate/reject (in the form of an increased or decreased certainty factor) the given hypothesis. 7 refs., 4 figs

  2. Demand forecast model based on CRM

    Science.gov (United States)

    Cai, Yuancui; Chen, Lichao

    2006-11-01

    As customer-centred management thinking becomes internalized in day-to-day practice, forecasting customer demand becomes more and more important. In demand forecasting for customer relationship management, traditional forecast methods are severely limited by the large uncertainty in demand, which calls for new modeling approaches. In this paper, the idea is to forecast demand according to the characteristics of the potential customer and to build the model from those characteristics. The model first describes each customer with a uniform set of multiple indexes. Secondly, the model extracts characteristic customers from a data warehouse using data-mining techniques. Finally, the most similar characteristic customer is found by comparison, and the demand of a new customer is forecast from that most similar characteristic customer.
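
    The "most similar characteristic customer" step amounts to a nearest-neighbour lookup in the space of customer indexes. The sketch below is a minimal illustration under invented index names and random data; the paper's actual indexes and similarity measure are not specified here.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.preprocessing import StandardScaler

# Forecast a new customer's demand from the most similar existing customers.
rng = np.random.default_rng(3)
n_customers = 500
# columns (assumed): company size, past order volume, order frequency, region code
profiles = rng.normal(size=(n_customers, 4))
demand = 100 + 40 * profiles[:, 1] + 10 * profiles[:, 2] + rng.normal(0, 5, n_customers)

scaler = StandardScaler().fit(profiles)
nn = NearestNeighbors(n_neighbors=3).fit(scaler.transform(profiles))

new_customer = rng.normal(size=(1, 4))
_, idx = nn.kneighbors(scaler.transform(new_customer))
forecast = demand[idx[0]].mean()   # average demand of the 3 most similar customers
print(f"forecast demand for new customer: {forecast:.1f}")
```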

  3. An event-based model for contracts

    Directory of Open Access Journals (Sweden)

    Tiziana Cimoli

    2013-02-01

    Full Text Available We introduce a basic model for contracts. Our model extends event structures with a new relation, which faithfully captures the circular dependencies among contract clauses. We establish whether an agreement exists which respects all the contracts at hand (i.e., all the dependencies can be resolved), and we detect the obligations of each participant. The main technical contribution is a correspondence between our model and a fragment of the contract logic PCL. More precisely, we show that the reachable events are exactly those which correspond to provable atoms in the logic. Despite this strong correspondence, our model improves previous work on PCL by exhibiting a finer-grained notion of culpability, which takes into account the legitimate orderings of events.

  4. A Model-Based Privacy Compliance Checker

    OpenAIRE

    Siani Pearson; Damien Allison

    2009-01-01

    Increasingly, e-business organisations are coming under pressure to be compliant to a range of privacy legislation, policies and best practice. There is a clear need for high-level management and administrators to be able to assess in a dynamic, customisable way the degree to which their enterprise complies with these. We outline a solution to this problem in the form of a model-driven automated privacy process analysis and configuration checking system. This system models privacy compliance ...

  5. Examining Change in K-3 Teachers' Mathematical Knowledge, Attitudes, and Beliefs: The Case of Primarily Math

    Science.gov (United States)

    Kutaka, T. S.; Ren, L.; Smith, W. M.; Beattie, H. L.; Edwards, C. P.; Green, J. L.; Chernyavskiy, P.; Stroup, W.; Heaton, R. M.; Lewis, W. J.

    2018-01-01

    This study examines the impact of the Primarily Math Elementary Mathematics Specialist program on K-3 teachers' mathematical content knowledge for teaching, attitudes toward learning mathematics, and beliefs about mathematics teaching and learning. Three cohorts of teachers participating in the program were compared to a similar group of…

  6. An Elaboration of a Strategic Alignment Model of University Information Systems based on SAM Model

    Directory of Open Access Journals (Sweden)

    S. Ahriz

    2018-02-01

    Full Text Available An information system is a guarantee of a university's ability to carry out the functions essential to its development and durability. The alignment of the information system, one of the pillars of IT governance, has become a necessity. In this paper, we consider the problem of implementing a strategic alignment model in Moroccan universities. The literature reveals that few studies have examined strategic alignment in the public sector, particularly in higher education institutions. Hence we opted for an exploratory approach that aims at a better understanding of strategic alignment and at evaluating the degree of its use within Moroccan universities. The data, gained primarily through interviews with top managers and IT managers, reveal that the alignment is not formalized and that it would be appropriate to implement an alignment model. It is found that the implementation of our proposed model can help managers to maximize returns on IT investment and to increase their efficiency.

  7. Dynamic ligand-based pharmacophore modeling and virtual ...

    Indian Academy of Sciences (India)

    Five ligand-based pharmacophore models were generated from 40 different .... the Phase module of the Schrodinger program [35]. Each model consisted of six types of ... ligand preparation included the OPLS_2005 force field and to retain the ...

  8. Cloud model construct for transaction-based cooperative systems ...

    African Journals Online (AJOL)

    Cloud model construct for transaction-based cooperative systems. ... procure cutting edge Information Technology infrastructure are some of the problems faced ... Results also reveal that credit cooperatives will benefit from the model by taking ...

  9. Structural Acoustic Physics Based Modeling of Curved Composite Shells

    Science.gov (United States)

    2017-09-19

    NUWC-NPT Technical Report 12,236 (19 September 2017), Structural Acoustic Physics-Based Modeling of Curved Composite Shells, by Rachel E. Hesse. The purpose of the study was to use physics-based modeling (PBM) to investigate wave propagation through curved shells that are subjected to acoustic excitation.

  10. The fractional volatility model: An agent-based interpretation

    Science.gov (United States)

    Vilela Mendes, R.

    2008-06-01

    Based on the criteria of mathematical simplicity and consistency with empirical market data, a model with volatility driven by fractional noise has been constructed which provides a fairly accurate mathematical parametrization of the data. Here, some features of the model are reviewed and extended to account for leverage effects. Using agent-based models, one tries to find which agent strategies and (or) properties of the financial institutions might be responsible for the features of the fractional volatility model.
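
    A minimal sketch of the core ingredient, volatility driven by fractional noise, is given below: fractional Gaussian noise with Hurst exponent H is generated by the Cholesky method and used to drive log-volatility, which then modulates returns. The parameter values are illustrative assumptions and are not calibrated to market data or to the paper's model.

```python
import numpy as np

# Volatility driven by fractional Gaussian noise (fGn), Cholesky construction.
rng = np.random.default_rng(11)
n, H = 1000, 0.8
k = np.arange(n)
# Autocovariance of fGn: gamma(k) = 0.5*(|k+1|^2H - 2|k|^2H + |k-1|^2H)
gamma = 0.5 * (np.abs(k + 1) ** (2 * H) - 2 * np.abs(k) ** (2 * H)
               + np.abs(k - 1) ** (2 * H))
cov = gamma[np.abs(k[:, None] - k[None, :])]          # stationary covariance matrix
fgn = np.linalg.cholesky(cov + 1e-10 * np.eye(n)) @ rng.standard_normal(n)

# Log-volatility follows the fractional noise; returns are conditionally Gaussian.
sigma = 0.01 * np.exp(0.3 * fgn)
returns = sigma * rng.standard_normal(n)
print(f"per-step volatility range: {sigma.min():.4f} to {sigma.max():.4f}")
```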

  11. LEARNING CREATIVE WRITING MODEL BASED ON NEUROLINGUISTIC PROGRAMMING

    OpenAIRE

    Rustan, Edhy

    2017-01-01

    The objectives of the study are to determine: (1) the conditions of creative writing instruction among high school students in Makassar, (2) the requirements of a learning model for creative writing, (3) the planning and design of an ideal creative writing programme model, (4) the feasibility of a creative writing learning model based on neurolinguistic programming, and (5) the effectiveness of the creative writing learning model based on neurolinguistic programming. The method of this research uses research development of L...

  12. Model-based Prognostics under Limited Sensing

    Data.gov (United States)

    National Aeronautics and Space Administration — Prognostics is crucial to providing reliable condition-based maintenance decisions. To obtain accurate predictions of component life, a variety of sensors are often...

  13. Optimal portfolio model based on WVAR

    OpenAIRE

    Hao, Tianyu

    2012-01-01

    This article focuses on using a new risk measure, Weighted Value at Risk (WVaR), to develop a new method of portfolio construction, starting from the problem of solving TVaR. Based on MATLAB software and the historical simulation method (which avoids having to assume that the return distribution is normal), and building on the results of previous studies, the U.S. Nasdaq composite index is studied, combining the Simpson formula for the solution of TVaR with its deeper study; then, through the representation of WVAR for...
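
    Historical-simulation VaR and TVaR (expected shortfall) are simple to compute: sort the historical returns, read off the loss quantile, and average the tail. The weighted variant shown below uses an arbitrary linear tail-weighting purely for illustration; it is an assumption, not the weighting scheme defined in the article.

```python
import numpy as np

# Historical-simulation VaR, TVaR and a simple weighted tail measure.
rng = np.random.default_rng(5)
returns = rng.standard_t(df=4, size=2500) * 0.01   # placeholder daily returns
alpha = 0.95

losses = np.sort(-returns)                  # losses, ascending
var = np.quantile(losses, alpha)            # historical VaR at 95%
tail = losses[losses >= var]
tvar = tail.mean()                          # TVaR / expected shortfall

# A weighted tail average: larger tail losses receive larger weights.
w = np.linspace(1.0, 2.0, tail.size)
wvar = np.average(tail, weights=w)

print(f"VaR={var:.4f}, TVaR={tvar:.4f}, weighted tail measure={wvar:.4f}")
```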

  14. Probabilistic reasoning for assembly-based 3D modeling

    KAUST Repository

    Chaudhuri, Siddhartha

    2011-01-01

    Assembly-based modeling is a promising approach to broadening the accessibility of 3D modeling. In assembly-based modeling, new models are assembled from shape components extracted from a database. A key challenge in assembly-based modeling is the identification of relevant components to be presented to the user. In this paper, we introduce a probabilistic reasoning approach to this problem. Given a repository of shapes, our approach learns a probabilistic graphical model that encodes semantic and geometric relationships among shape components. The probabilistic model is used to present components that are semantically and stylistically compatible with the 3D model that is being assembled. Our experiments indicate that the probabilistic model increases the relevance of presented components. © 2011 ACM.

  15. Model-Based Reasoning in Humans Becomes Automatic with Training.

    Directory of Open Access Journals (Sweden)

    Marcos Economides

    2015-09-01

    Full Text Available Model-based and model-free reinforcement learning (RL have been suggested as algorithmic realizations of goal-directed and habitual action strategies. Model-based RL is more flexible than model-free but requires sophisticated calculations using a learnt model of the world. This has led model-based RL to be identified with slow, deliberative processing, and model-free RL with fast, automatic processing. In support of this distinction, it has recently been shown that model-based reasoning is impaired by placing subjects under cognitive load--a hallmark of non-automaticity. Here, using the same task, we show that cognitive load does not impair model-based reasoning if subjects receive prior training on the task. This finding is replicated across two studies and a variety of analysis methods. Thus, task familiarity permits use of model-based reasoning in parallel with other cognitive demands. The ability to deploy model-based reasoning in an automatic, parallelizable fashion has widespread theoretical implications, particularly for the learning and execution of complex behaviors. It also suggests a range of important failure modes in psychiatric disorders.

  16. Physics-Based Pneumatic Hammer Instability Model, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Florida Turbine Technologies (FTT) proposes to conduct research necessary to develop a physics-based pneumatic hammer instability model for hydrostatic bearings...

  17. Alcoholics Anonymous and twelve-step recovery: a model based on social and cognitive neuroscience.

    Science.gov (United States)

    Galanter, Marc

    2014-01-01

    In the course of achieving abstinence from alcohol, longstanding members of Alcoholics Anonymous (AA) typically experience a change in their addiction-related attitudes and behaviors. These changes are reflective of physiologically grounded mechanisms which can be investigated within the disciplines of social and cognitive neuroscience. This article is designed to examine recent findings associated with these disciplines that may shed light on the mechanisms underlying this change. Literature review and hypothesis development. Pertinent aspects of the neural impact of drugs of abuse are summarized. After this, research regarding specific brain sites, elucidated primarily by imaging techniques, is reviewed relative to the following: Mirroring and mentalizing are described in relation to experimentally modeled studies on empathy and mutuality, which may parallel the experiences of social interaction and influence on AA members. Integration and retrieval of memories acquired in a setting like AA are described, and are related to studies on storytelling, models of self-schema development, and value formation. A model for ascription to a Higher Power is presented. The phenomena associated with AA reflect greater complexity than the empirical studies on which this article is based, and certainly require further elucidation. Despite this substantial limitation in currently available findings, there is heuristic value in considering the relationship between the brain-based and clinical phenomena described here. There are opportunities for the study of neuroscientific correlates of Twelve-Step-based recovery, and these can potentially enhance our understanding of related clinical phenomena. © American Academy of Addiction Psychiatry.

  18. Interactive Coherence-Based Façade Modeling

    KAUST Repository

    Musialski, Przemyslaw; Wimmer, Michael; Wonka, Peter

    2012-01-01

    We propose a novel interactive framework for modeling building facades from images. Our method is based on the notion of coherence-based editing which allows exploiting partial symmetries across the facade at any level of detail. The proposed

  19. Game Based Learning (GBL) adoption model for universities: cesim ...

    African Journals Online (AJOL)

    Game Based Learning (GBL) adoption model for universities: cesim simulation. ... The global market has escalated the need of Game Based Learning (GBL) to offer a wide range of courses since there is a ...

  20. Model Based Analysis of Insider Threats

    DEFF Research Database (Denmark)

    Chen, Taolue; Han, Tingting; Kammueller, Florian

    2016-01-01

    In order to detect malicious insider attacks it is important to model and analyse infrastructures and policies of organisations and the insiders acting within them. We extend formal approaches that allow modelling such scenarios by quantitative aspects to enable a precise analysis of security...... designs. Our framework enables evaluating the risks of an insider attack to happen quantitatively. The framework first identifies an insider's intention to perform an inside attack, using Bayesian networks, and in a second phase computes the probability of success for an inside attack by this actor, using...

  1. Probabilistic mixture-based image modelling

    Czech Academy of Sciences Publication Activity Database

    Haindl, Michal; Havlíček, Vojtěch; Grim, Jiří

    2011-01-01

    Roč. 47, č. 3 (2011), s. 482-500 ISSN 0023-5954 R&D Projects: GA MŠk 1M0572; GA ČR GA102/08/0593 Grant - others:CESNET(CZ) 387/2010; GA MŠk(CZ) 2C06019; GA ČR(CZ) GA103/11/0335 Institutional research plan: CEZ:AV0Z10750506 Keywords : BTF texture modelling * discrete distribution mixtures * Bernoulli mixture * Gaussian mixture * multi-spectral texture modelling Subject RIV: BD - Theory of Information Impact factor: 0.454, year: 2011 http://library.utia.cas.cz/separaty/2011/RO/haindl-0360244.pdf

  2. Automata-Based CSL Model Checking

    DEFF Research Database (Denmark)

    Zhang, Lijun; Jansen, David N.; Nielson, Flemming

    2011-01-01

    For continuous-time Markov chains, the model-checking problem with respect to continuous-time stochastic logic (CSL) has been introduced and shown to be decidable by Aziz, Sanwal, Singhal and Brayton in 1996. The presented decision procedure, however, has exponential complexity. In this paper, we...... probability can then be approximated in polynomial time (using uniformization). This makes the present work the centerpiece of a broadly applicable full CSL model checker. Recently, the decision algorithm by Aziz et al. was shown to be incorrect in general. In fact, it works only for stratified CTMCs...

  3. Kalman filter-based gap conductance modeling

    International Nuclear Information System (INIS)

    Tylee, J.L.

    1983-01-01

    Geometric and thermal property uncertainties contribute greatly to the problem of determining conductance within the fuel-clad gas gap of a nuclear fuel pin. Accurate conductance values are needed for power plant licensing transient analysis and for test analyses at research facilities. Recent work by Meek, Doerner, and Adams has shown that use of Kalman filters to estimate gap conductance is a promising approach. A Kalman filter is simply a mathematical algorithm that employs available system measurements and assumed dynamic models to generate optimal system state vector estimates. This summary addresses another Kalman filter approach to gap conductance estimation and subsequent identification of an empirical conductance model
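
    The filtering idea can be illustrated with a minimal scalar (extended) Kalman filter that tracks a slowly varying conductance from noisy measurements. The measurement model below (temperature drop across the gap proportional to 1/h, linearized at each step) and all numerical values are illustrative assumptions, not the fuel-pin model of the paper.

```python
import numpy as np

# Scalar EKF sketch: estimate gap conductance h from noisy delta-T measurements.
rng = np.random.default_rng(2)
n_steps = 200
h_true = 5000.0 + 500.0 * np.sin(np.linspace(0, 2 * np.pi, n_steps))  # W/m2K
q = 1.0e5                                   # heat flux across the gap [W/m2]
meas = q / h_true + rng.normal(0.0, 0.5, n_steps)   # measured delta-T [K]

# State: h, modeled as a random walk; measurement z = q/h (nonlinear, linearized).
h_est, P = 4000.0, 1.0e6
Q, R = 50.0 ** 2, 0.5 ** 2
estimates = []
for z in meas:
    P = P + Q                               # predict (random-walk process model)
    H = -q / h_est ** 2                     # d(q/h)/dh, measurement Jacobian
    K = P * H / (H * P * H + R)             # Kalman gain
    h_est = h_est + K * (z - q / h_est)     # update with innovation
    P = (1.0 - K * H) * P
    estimates.append(h_est)

print(f"final estimate {estimates[-1]:.0f} vs true {h_true[-1]:.0f} W/m2K")
```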

  4. Model-Based Enterprise Summit Report

    Science.gov (United States)

    2014-02-01

    Agenda excerpts from the Model-Based Enterprise Summit (Tuesday, 11 December 2012; website: http://www.nist.gov/el/msid/mbesummit_2012.cfm), moderated in part by John Horst: a wrap-up and vendor demonstration session (1700-1830), a talk on affordable access to modeling & simulation and high-performance computing for SMEs (Dennis Thompson, SCRA), and an NAMII overview (Ed Morris, Director; 1210-1230).

  5. DNA sequence modeling based on context trees

    NARCIS (Netherlands)

    Kusters, C.J.; Ignatenko, T.; Roland, J.; Horlin, F.

    2015-01-01

    Genomic sequences contain instructions for protein and cell production. Therefore understanding and identification of biologically and functionally meaningful patterns in DNA sequences is of paramount importance. Modeling of DNA sequences in its turn can help to better understand and identify such

  6. A Qualitative Acceleration Model Based on Intervals

    Directory of Open Access Journals (Sweden)

    Ester MARTINEZ-MARTIN

    2013-08-01

    Full Text Available On the way to autonomous service robots, spatial reasoning plays a main role since it properly deals with problems involving uncertainty. In particular, we are interested in knowing people's pose to avoid collisions. With that aim, in this paper, we present a qualitative acceleration model for robotic applications including representation, reasoning and a practical application.

  7. Recursive renormalization group theory based subgrid modeling

    Science.gov (United States)

    Zhou, YE

    1991-01-01

    Advancing the knowledge and understanding of turbulence theory is addressed. Specific problems to be addressed will include studies of subgrid models to understand the effects of unresolved small scale dynamics on the large scale motion which, if successful, might substantially reduce the number of degrees of freedom that need to be computed in turbulence simulation.

  8. AADL and Model-based Engineering

    Science.gov (United States)

    2014-10-20

    Presentation excerpts on AADL and model-based engineering: MDE and MDA with UML, automatically generated documents, and the need for a strongly typed, well-defined language for architecture modeling. Further information: Wiki.sei.cmu.edu/aadl and www.aadl.info (Software Engineering Institute, Pittsburgh, PA).

  9. Sparse-Based Modeling of Hyperspectral Data

    DEFF Research Database (Denmark)

    Calvini, Rosalba; Ulrici, Alessandro; Amigo Rubio, Jose Manuel

    2016-01-01

    One of the main issues of hyperspectral imaging data is to unravel the relevant, yet overlapped, huge amount of information contained in the spatial and spectral dimensions. When dealing with the application of multivariate models in such high-dimensional data, sparsity can improve...

  10. From qualitative reasoning models to Bayesian-based learner modeling

    NARCIS (Netherlands)

    Milošević, U.; Bredeweg, B.; de Kleer, J.; Forbus, K.D.

    2010-01-01

    Assessing the knowledge of a student is a fundamental part of intelligent learning environments. We present a Bayesian network based approach to dealing with uncertainty when estimating a learner’s state of knowledge in the context of Qualitative Reasoning (QR). A proposal for a global architecture

  11. A model-based control system concept

    International Nuclear Information System (INIS)

    Aarzen, K.E.

    1992-12-01

    This paper presents an overview of a new concept for DCSs developed within the KBRTCS (Knowledge-Based Real-Time Control Systems) project performed between 1988 and 1991 as a part of the Swedish IT4 programme. The partners of the project have been the Department of Automatic Control at Lund University, Asea Brown Boveri, and, during parts of the project, SattControl and TeleLogic. The aim of the project has been to develop a concept for future generations of DCSs based on a plant database containing a description of the plant together with the control system. The database is object-based and supports multiple views of an object. A demonstrator is presented where a DCS system of this type is emulated. The demonstrator contains a number of control, monitoring, and diagnosis applications that execute in real time against a simulation of the Steritherm sterilization process. (25 refs.)

  12. Model based process-product design and analysis

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    This paper gives a perspective on modelling and the important role it has within product-process design and analysis. Different modelling issues related to development and application of systematic model-based solution approaches for product-process design are discussed and the need for a hybrid...... model-based framework is highlighted. This framework should be able to manage knowledge-data, models, and associated methods and tools integrated with design work-flows and data-flows for specific product-process design problems. In particular, the framework needs to manage models of different types......, forms and complexity, together with their associated parameters. An example of a model-based system for the design of chemicals-based formulated products is also given....

  13. An Emotional Agent Model Based on Granular Computing

    Directory of Open Access Journals (Sweden)

    Jun Hu

    2012-01-01

    Full Text Available Affective computing is of great significance for achieving intelligent information processing and harmonious communication between human beings and computers. A new model for an emotional agent is proposed in this paper to give agents the ability to handle emotions, based on granular computing theory and the traditional BDI agent model. Firstly, a new emotion knowledge base based on granular computing for emotion expression is presented in the model. Secondly, a new emotional reasoning algorithm based on granular computing is proposed. Thirdly, a new emotional agent model based on granular computing is presented. Finally, based on the model, an emotional agent for patient assistance in a hospital is realized; experimental results show that it handles simple emotions efficiently.

  14. Mechanics and model-based control of advanced engineering systems

    CERN Document Server

    Irschik, Hans; Krommer, Michael

    2014-01-01

    Mechanics and Model-Based Control of Advanced Engineering Systems collects 32 contributions presented at the International Workshop on Advanced Dynamics and Model Based Control of Structures and Machines, which took place in St. Petersburg, Russia in July 2012. The workshop continued a series of international workshops, which started with a Japan-Austria Joint Workshop on Mechanics and Model Based Control of Smart Materials and Structures and a Russia-Austria Joint Workshop on Advanced Dynamics and Model Based Control of Structures and Machines. In the present volume, 10 full-length papers based on presentations from Russia, 9 from Austria, 8 from Japan, 3 from Italy, one from Germany and one from Taiwan are included, which represent the state of the art in the field of mechanics and model based control, with particular emphasis on the application of advanced structures and machines.

  15. CSPBuilder - CSP based Scientific Workflow Modelling

    DEFF Research Database (Denmark)

    Friborg, Rune Møllegaard; Vinter, Brian

    2008-01-01

    This paper introduces a framework for building CSP based applications, targeted for clusters and next generation CPU designs. CPUs are produced with several cores today and every future CPU generation will feature increasingly more cores, resulting in a requirement for concurrency that has not pr...

  16. Fire and Heat Spreading Model Based on Cellular Automata Theory

    Science.gov (United States)

    Samartsev, A. A.; Rezchikov, A. F.; Kushnikov, V. A.; Ivashchenko, V. A.; Bogomolov, A. S.; Filimonyuk, L. Yu; Dolinina, O. N.; Kushnikov, O. V.; Shulga, T. E.; Tverdokhlebov, V. A.; Fominykh, D. S.

    2018-05-01

    The distinctive feature of the proposed model of fire and heat spreading in premises is its reduced computational complexity, achieved by using the theory of cellular automata with probabilistic behavior rules. The possibilities and prospects of using this model in practice are noted. The proposed model has a simple mechanism for integration with agent-based evacuation models. The joint use of these models could improve floor plans and reduce the time of evacuation from premises during fires.
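
    A probabilistic cellular automaton of this kind is compact to sketch: each burning cell ignites its intact neighbours with some probability per step and eventually burns out. Grid size and the two probabilities below are illustrative assumptions, not the rules calibrated in the paper.

```python
import numpy as np

# Probabilistic CA for fire spread on a floor-plan grid:
# 0 = intact, 1 = burning, 2 = burnt out.
rng = np.random.default_rng(8)
n, p_spread, p_burnout = 50, 0.3, 0.1
grid = np.zeros((n, n), dtype=int)
grid[n // 2, n // 2] = 1                   # ignition point

def step(grid):
    new = grid.copy()
    for i, j in np.argwhere(grid == 1):
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < n and 0 <= nj < n and grid[ni, nj] == 0:
                if rng.random() < p_spread:
                    new[ni, nj] = 1        # neighbour ignites
        if rng.random() < p_burnout:
            new[i, j] = 2                  # cell burns out
    return new

for t in range(100):
    grid = step(grid)
print(f"burning cells: {(grid == 1).sum()}, burnt out: {(grid == 2).sum()}")
```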

  17. Cognitive Modeling for Agent-Based Simulation of Child Maltreatment

    Science.gov (United States)

    Hu, Xiaolin; Puddy, Richard

    This paper extends previous work to develop cognitive modeling for agent-based simulation of child maltreatment (CM). The developed model is inspired by parental efficacy, parenting stress, and the theory of planned behavior. It provides an explanatory, process-oriented model of CM and incorporates causality relationships and feedback loops among different factors in the social ecology in order to simulate the dynamics of CM. We describe the model and present simulation results to demonstrate its features.

  18. Thermodynamics-based models of transcriptional regulation with gene sequence.

    Science.gov (United States)

    Wang, Shuqiang; Shen, Yanyan; Hu, Jinxing

    2015-12-01

    Quantitative models of gene regulatory activity have the potential to improve our mechanistic understanding of transcriptional regulation. However, the few models available today have been based on simplistic assumptions about the sequences being modeled or heuristic approximations of the underlying regulatory mechanisms. In this work, we have developed a thermodynamics-based model to predict gene expression driven by any DNA sequence. The proposed model relies on a continuous time, differential equation description of transcriptional dynamics. The sequence features of the promoter are exploited to derive the binding affinity, based on statistical molecular thermodynamics. Experimental results show that the proposed model can effectively identify the activity levels of transcription factors and the regulatory parameters. Compared with previous models, the proposed model reveals more biological insight.
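
    The two ingredients, a thermodynamic (Boltzmann-weighted) promoter occupancy and a continuous-time differential equation for mRNA, combine into a very small sketch. The single-site occupancy form and all rate constants below are illustrative assumptions, not the sequence-derived affinities of the paper.

```python
# Thermodynamic occupancy driving a simple transcription ODE.
K_d = 50.0        # dissociation constant of the activating TF [nM] (assumed)
alpha = 2.0       # maximal transcription rate [mRNA/min] (assumed)
delta = 0.1       # mRNA degradation rate [1/min] (assumed)

def occupancy(tf_conc):
    # Equilibrium probability that the TF occupies its binding site.
    return (tf_conc / K_d) / (1.0 + tf_conc / K_d)

def simulate(tf_conc, t_end=200.0, dt=0.1):
    m = 0.0
    for _ in range(int(t_end / dt)):
        dm = alpha * occupancy(tf_conc) - delta * m   # transcriptional dynamics
        m += dt * dm
    return m

for tf in (10.0, 50.0, 250.0):
    print(f"TF = {tf:5.0f} nM -> steady-state mRNA ~ {simulate(tf):.2f}")
```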

  19. Spatial analysis and modelling based on activities

    CSIR Research Space (South Africa)

    Conradie, Dirk CU

    2010-01-01

    Full Text Available (deliberative attitudes) (Pokahr, 2005). The BDI model does not cover emotional and other ‘higher’ human attitudes. KRONOS is a generic Computational Building Simulation (CBS) tool that was developed over the past three years to work on advanced... featured, stable, mature and platform independent with an easy to use C/C++ Application Program Interface (API). It has advanced joint types and integrated collision detection with friction. ODE is particularly useful for simulating vehicles, objects...

  20. Text document classification based on mixture models

    Czech Academy of Sciences Publication Activity Database

    Novovičová, Jana; Malík, Antonín

    2004-01-01

    Roč. 40, č. 3 (2004), s. 293-304 ISSN 0023-5954 R&D Projects: GA AV ČR IAA2075302; GA ČR GA102/03/0049; GA AV ČR KSK1019101 Institutional research plan: CEZ:AV0Z1075907 Keywords : text classification * text categorization * multinomial mixture model Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.224, year: 2004

  1. Tsunami Propagation Models Based on First Principles

    Science.gov (United States)

    2012-11-21

    Geodesic lines from the epicenter shown in the figure are great circles with a longitudinal separation of 90°, which define a ‘lune’ that covers one... past which the waves begin to converge according to Model C. A tsunami propagating in this lune does not encounter any continental landmass until... [Figure: the 2011 Japan tsunami in a lune of angle 90°, with wavefronts at intervals of 5,000 km.] The 2011 Japan tsunami was felt throughout the Pacific Ocean.

  2. Madrasah Culture Based Transformational Leadership Model

    OpenAIRE

    Nur Khoiri

    2016-01-01

    Leadership is the ability to influence and direct behavior, together with particular expertise in the field of the group that wants to achieve its goals. A dynamic organization requires a transformational leadership model. A school principal, as the leader of a school, aims to realize good learning leadership. Learning leadership focuses on learning, whose components include the curriculum, the teaching and learning process, assessment, teacher assessment and development, good service in learning, and developing a ...

  3. Annotation-based feature extraction from sets of SBML models.

    Science.gov (United States)

    Alm, Rebekka; Waltemath, Dagmar; Wolfien, Markus; Wolkenhauer, Olaf; Henkel, Ron

    2015-01-01

    Model repositories such as BioModels Database provide computational models of biological systems for the scientific community. These models contain rich semantic annotations that link model entities to concepts in well-established bio-ontologies such as Gene Ontology. Consequently, thematically similar models are likely to share similar annotations. Based on this assumption, we argue that semantic annotations are a suitable tool to characterize sets of models. These characteristics improve model classification, allow additional features to be identified for model retrieval tasks, and enable the comparison of sets of models. In this paper we discuss four methods for annotation-based feature extraction from model sets. We tested all methods on sets of models in SBML format which were composed from BioModels Database. To characterize each of these sets, we analyzed and extracted concepts from three frequently used ontologies, namely Gene Ontology, ChEBI and SBO. We find that three out of the four methods are suitable to determine characteristic features for arbitrary sets of models: The selected features vary depending on the underlying model set, and they are also specific to the chosen model set. We show that the identified features map onto concepts that are higher up in the hierarchy of the ontologies than the concepts used for model annotations. Our analysis also reveals that the information content of concepts in ontologies and their usage for model annotation do not correlate. Annotation-based feature extraction enables the comparison of model sets, as opposed to existing methods for model-to-keyword comparison, or model-to-model comparison.

  4. Weather forecasting based on hybrid neural model

    Science.gov (United States)

    Saba, Tanzila; Rehman, Amjad; AlGhamdi, Jarallah S.

    2017-11-01

    Making deductions and predictions about the climate has been a challenge throughout mankind's history. Accurate meteorological guidance helps to foresee and handle problems well in time. Different strategies have been investigated using various machine learning techniques in reported forecasting systems. The current research investigates climate as a major challenge for machine information mining and deduction. Accordingly, this paper presents a hybrid neural model (MLP and RBF) to enhance the accuracy of weather forecasting. The proposed hybrid model ensures precise forecasting, reflecting the specialised nature of climate-forecasting frameworks. The study concentrates on data representing weather forecasting in Saudi Arabia. The main input features employed to train the individual and hybrid neural networks include average dew point, minimum temperature, maximum temperature, mean temperature, average relative humidity, precipitation, normal wind speed, high wind speed and average cloudiness. The output layer is composed of two neurons representing rainy and dry weather. Moreover, a trial-and-error approach is adopted to select an appropriate number of inputs to the hybrid neural network. The correlation coefficient, RMSE and scatter index are the standard yardsticks adopted for measuring forecast accuracy. Individually, the MLP's forecasting results are better than the RBF's; however, the proposed simplified hybrid neural model achieves better forecasting accuracy than both individual networks. Additionally, the results are better than those reported in the state of the art, using a simple neural structure that reduces training time and complexity.
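
    As a hedged illustration of the "combine two learners" idea behind such hybrid models (not the paper's exact MLP/RBF network), the sketch below trains an MLP classifier and an RBF-kernel classifier on synthetic nine-feature weather data and averages their class probabilities for the rainy/dry decision; all data, features and hyperparameters are invented.

    ```python
    # Toy hybrid of an MLP and an RBF-kernel classifier for a rainy/dry label.
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(600, 9))          # nine synthetic meteorological inputs
    y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.3, size=600) > 0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    mlp = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0).fit(X_tr, y_tr)
    rbf = SVC(kernel="rbf", probability=True, random_state=0).fit(X_tr, y_tr)  # stands in for an RBF network

    # Hybrid decision: average the class probabilities of both models.
    p_hybrid = 0.5 * (mlp.predict_proba(X_te) + rbf.predict_proba(X_te))
    print("hybrid accuracy:", accuracy_score(y_te, p_hybrid.argmax(axis=1)))
    ```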

  5. Model-predictive control based on Takagi-Sugeno fuzzy model for electrical vehicles delayed model

    DEFF Research Database (Denmark)

    Khooban, Mohammad-Hassan; Vafamand, Navid; Niknam, Taher

    2017-01-01

    Electric vehicles (EVs) play a significant role in different applications, such as commuter vehicles and short distance transport applications. This study presents a new structure of model-predictive control based on the Takagi-Sugeno fuzzy model, linear matrix inequalities, and a non-quadratic Lyapunov function for the speed control of EVs including time-delay states and parameter uncertainty. Experimental data, using the Federal Test Procedure (FTP-75), is applied to test the performance and robustness of the suggested controller in the presence of time-varying parameters. Besides, a comparison is made between the results of the suggested robust strategy and those obtained from some of the most recent studies on the same topic, to assess the efficiency of the suggested controller. Finally, the experimental results based on a TMS320F28335 DSP are performed on a direct current motor. Simulation...

  6. A Technology-based Model for Learning

    Directory of Open Access Journals (Sweden)

    Michael Williams

    2004-12-01

    Full Text Available The Math Emporium, opened in 1997, is an open 7000-square-meter facility with 550+ workstations arranged in an array of widely spaced hexagonal "pods", designed to support group work while maintaining an academic air. We operate it 24/7 with math support personnel in attendance 12 hours per day. Students have access to online course resources at all times, from anywhere. We have used this unique asset to transform traditional classroom-based courses into technology-based learning programs that have no class meetings at all. The structure of the program is very different from the conventional one, having a new set of expectations and motivations. The results include: more effective students, substantial cost savings, economies of scale and scope, and a streamlined process for creating new on-line courses.

  7. Adopsi Model Competency Based Training dalam Kewirausahaan

    OpenAIRE

    I Ketut Santra

    2009-01-01

    The aim of the research is to improve the teaching method in the entrepreneurship subject. This research adopted competency-based training (CBT) into the entrepreneurship subject. The major task in this research was to formulate and design the entrepreneurship competencies. Entrepreneurship competency is indicated by Personal, Strategic and Situational, and Business competence. All of the entrepreneurship competences are broken down into sub-topics of competence. After designing and formulating the game and simulat...

  8. X-ray and CT signs of connective tissue dysplasia in patients with primarily diagnosed infiltrative pulmonary tuberculosis

    International Nuclear Information System (INIS)

    Sukhanova, L.A.; Sharmazanova, O.P.

    2009-01-01

    The x-ray signs of connective tissue systemic dysplasia (CTSD) in patients with primarily diagnosed pulmonary tuberculosis were investigated. Fifty-four patients (28 men and 26 women, aged 18-70) with primarily diagnosed infiltrative pulmonary tuberculosis underwent x-ray study. In patients with infiltrative pulmonary tuberculosis, CTSD in the lungs manifests as reduced lung volume, deformity of the lung pattern, a high position of the diaphragm cupola, and mediastinal shift towards the side of the pathology, which is better seen on CT. The degree of the CTSD x-ray signs in the lungs depends on the number of phenotypical signs, that is, on the degree of the disease manifestation. CT allows more accurate determination of the signs of the connective tissue dysplasia against which tuberculosis develops

  9. Enabling Accessibility Through Model-Based User Interface Development.

    Science.gov (United States)

    Ziegler, Daniel; Peissner, Matthias

    2017-01-01

    Adaptive user interfaces (AUIs) can increase the accessibility of interactive systems. They provide personalized display and interaction modes to fit individual user needs. Most AUI approaches rely on model-based development, which is considered relatively demanding. This paper explores strategies to make model-based development more attractive for mainstream developers.

  10. Constructing rule-based models using the belief functions framework

    NARCIS (Netherlands)

    Almeida, R.J.; Denoeux, T.; Kaymak, U.; Greco, S.; Bouchon-Meunier, B.; Coletti, G.; Fedrizzi, M.; Matarazzo, B.; Yager, R.R.

    2012-01-01

    We study a new approach to regression analysis. We propose a new rule-based regression model using the theoretical framework of belief functions. For this purpose we use the recently proposed Evidential c-means (ECM) to derive rule-based models solely from data. ECM allocates, for each

  11. Model-Based Software Testing for Object-Oriented Software

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Model-based testing is one of the best solutions for testing object-oriented software. It has a better test coverage than other testing styles. Model-based testing takes into consideration behavioural aspects of a class, which are usually unchecked in other testing methods. An increase in the complexity of software has forced the software industry…

  12. An Active Learning Exercise for Introducing Agent-Based Modeling

    Science.gov (United States)

    Pinder, Jonathan P.

    2013-01-01

    Recent developments in agent-based modeling as a method of systems analysis and optimization indicate that students in business analytics need an introduction to the terminology, concepts, and framework of agent-based modeling. This article presents an active learning exercise for MBA students in business analytics that demonstrates agent-based…

  13. Facilitating Change to a Problem-based Model

    DEFF Research Database (Denmark)

    Kolmos, Anette

    2002-01-01

    The paper presents the barriers which arise during the change process from a traditional educational system to a problem-based educational model.

  14. Towards automatic model based controller design for reconfigurable plants

    DEFF Research Database (Denmark)

    Michelsen, Axel Gottlieb; Stoustrup, Jakob; Izadi-Zamanabadi, Roozbeh

    2008-01-01

    This paper introduces model-based Plug and Play Process Control, a novel concept for process control, which allows a model-based control system to be reconfigured when a sensor or an actuator is plugged into a controlled process. The work reported in this paper focuses on composing a monolithic m...

  15. Model-reduced gradient-based history matching

    NARCIS (Netherlands)

    Kaleta, M.P.

    2011-01-01

    Since the world's energy demand increases every year, the oil & gas industry makes a continuous effort to improve fossil fuel recovery. Physics-based petroleum reservoir modeling and closed-loop model-based reservoir management concept can play an important role here. In this concept measured data

  16. Pattern-based translation of BPMN process models to BPEL web services

    NARCIS (Netherlands)

    Ouyang, C.; Dumas, M.; Hofstede, ter A.H.M.; Aalst, van der W.M.P.

    2008-01-01

    The business process modeling notation (BPMN) is a graph-oriented language primarily targeted at domain analysts and supported by many modeling tools. The business process execution language for Web services (BPEL) on the other hand is a mainly block-structured language targeted at software

  17. A Requirements Analysis Model Based on QFD

    Institute of Scientific and Technical Information of China (English)

    TANG Zhi-wei; Nelson K.H.Tang

    2004-01-01

    The enterprise resource planning (ERP) system has emerged to offer an integrated IT solution, and more and more enterprises are adopting this system and regarding it as an important innovation. However, there is already evidence of high failure risks in ERP project implementation; one major reason is poor analysis of the requirements for system implementation. In this paper, the importance of requirements analysis for ERP project implementation is highlighted, and a requirements analysis model applying quality function deployment (QFD) is presented, which supports conducting requirements analysis for ERP projects.

  18. The Community-based Whole Magnetosphere Model

    Science.gov (United States)

    2011-11-15

    2008. Colloquia: A.J. Ridley, Y. Yu, M. W. Liemohn, A. M. Dodger, Understanding the geoeffective properties of rapid changes in the solar wind and in... enhancement, 2010 AGU Fall Meeting, San Francisco, CA, December 13-17, 2010. A. M. Dodger, A.J. Ridley, Comparing a Coupled Ionosphere-Plasmasphere Model to... Meeting, San Francisco, CA, December 13-17, 2010. CWMM-20 Ridley CWMM Final Report. A. M. Jorgensen, A.J. Ridley, A. M. Dodger, J. Lichtenberger

  19. Graph-based modelling in engineering

    CERN Document Server

    Rysiński, Jacek

    2017-01-01

    This book presents versatile, modern and creative applications of graph theory in mechanical engineering, robotics and computer networks. Topics related to mechanical engineering include e.g. machine and mechanism science, mechatronics, robotics, gearing and transmissions, design theory and production processes. The graphs treated are simple graphs, weighted and mixed graphs, bond graphs, Petri nets, logical trees etc. The authors represent several countries in Europe and America, and their contributions show how different, elegant, useful and fruitful the utilization of graphs in modelling of engineering systems can be.

  20. Assessment of Vegetation Variation on Primarily Creation Zones of the Dust Storms Around the Euphrates Using Remote Sensing Images

    Directory of Open Access Journals (Sweden)

    Jamil Amanollahi

    2012-06-01

    Full Text Available Recently, the frequency and the affected area of the dust storms that enter Iran from Iraq have increased. In this study, in addition to detecting the creation zones of the dust storms, the effect of vegetation cover variation on their creation was investigated using remote sensing. The Moderate Resolution Imaging Spectroradiometer (MODIS) and the Landsat Thematic Mapper (TM5) have been utilized to identify the primary creation zones of the dust storms and to assess the vegetation cover variation, respectively. Vegetation cover variation was studied using the Normalized Difference Vegetation Index (NDVI) obtained from bands 3 and 4 of the Landsat satellite. The results showed that the surrounding area of the Euphrates in Syria, the deserts in the vicinity of this river in Iraq, including the deserts of Alanbar Province, and the northern deserts of Saudi Arabia are the primary creation zones of the dust storms entering the west and south-west of Iran. The NDVI results showed that, excluding the deserts on the border of Syria and Iraq, the area with very weak vegetation cover increased by between 2.44% and 20.65% from 1991 to 2009. Meanwhile, the retention pond surface areas in the southern deserts of Syria as well as the deserts on its border with Iraq decreased by 6320 and 4397 hectares, respectively. As can be concluded from the findings, one of the main environmental parameters initiating these dust storms is the decrease in the vegetation cover in their primary creation zones.
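
    The NDVI mentioned above follows the standard (NIR - Red)/(NIR + Red) formula; the sketch below illustrates the calculation on synthetic reflectance arrays, taking band 3 as red and band 4 as near-infrared for Landsat TM. The 0.2 threshold used to flag "very weak vegetation" is an assumption for illustration only.

    ```python
    # NDVI from synthetic Landsat TM band 3 (red) and band 4 (NIR) reflectances.
    import numpy as np

    rng = np.random.default_rng(1)
    red = rng.uniform(0.05, 0.30, size=(100, 100))   # band 3 reflectance (synthetic)
    nir = rng.uniform(0.10, 0.60, size=(100, 100))   # band 4 reflectance (synthetic)

    ndvi = (nir - red) / (nir + red + 1e-12)          # epsilon avoids division by zero
    weak_fraction = (ndvi < 0.2).mean() * 100         # assumed "very weak vegetation" cut-off
    print(f"area with very weak vegetation cover: {weak_fraction:.1f}%")
    ```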

  1. Model-free and model-based reward prediction errors in EEG.

    Science.gov (United States)

    Sambrook, Thomas D; Hardwick, Ben; Wills, Andy J; Goslin, Jeremy

    2018-05-24

    Learning theorists posit two reinforcement learning systems: model-free and model-based. Model-based learning incorporates knowledge about structure and contingencies in the world to assign candidate actions with an expected value. Model-free learning is ignorant of the world's structure; instead, actions hold a value based on prior reinforcement, with this value updated by expectancy violation in the form of a reward prediction error. Because they use such different learning mechanisms, it has been previously assumed that model-based and model-free learning are computationally dissociated in the brain. However, recent fMRI evidence suggests that the brain may compute reward prediction errors to both model-free and model-based estimates of value, signalling the possibility that these systems interact. Because of its poor temporal resolution, fMRI risks confounding reward prediction errors with other feedback-related neural activity. In the present study, EEG was used to show the presence of both model-based and model-free reward prediction errors and their place in a temporal sequence of events including state prediction errors and action value updates. This demonstration of model-based prediction errors questions a long-held assumption that model-free and model-based learning are dissociated in the brain. Copyright © 2018 Elsevier Inc. All rights reserved.
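
    To make the two error signals concrete, the minimal sketch below computes a model-free temporal-difference reward prediction error from a cached state value and a model-based one from an explicit transition model; the numbers are purely illustrative and are unrelated to the study's EEG analysis.

    ```python
    # Model-free vs. model-based reward prediction error for one transition.
    import numpy as np

    gamma = 0.9            # discount factor (assumed)
    reward = 1.0           # reward received after the transition
    v_current = 0.6        # cached value of the current state

    # Model-free: the next state's value is a cached estimate from past reinforcement.
    v_next_cached = 0.4
    rpe_model_free = reward + gamma * v_next_cached - v_current

    # Model-based: the next state's value is computed from an internal transition model.
    transition_probs = np.array([0.7, 0.3])   # P(successor | next state, best action)
    successor_values = np.array([1.0, 0.0])   # values of the successors under the model
    v_next_model = float(transition_probs @ successor_values)
    rpe_model_based = reward + gamma * v_next_model - v_current

    print(rpe_model_free, rpe_model_based)
    ```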

  2. Radiobiological analysis based on cell cluster models

    International Nuclear Information System (INIS)

    Lin Hui; Jing Jia; Meng Damin; Xu Yuanying; Xu Liangfeng

    2010-01-01

    The influence of cell cluster dimension on EUD and TCP for targeted radionuclide therapy was studied using the radiobiological method. The radiobiological features of a tumor with an activity-lacking core were evaluated and analyzed by associating EUD, TCP and SF. The results show that EUD increases with increasing tumor dimension under a homogeneous activity distribution. If the extra-cellular activity is taken into consideration, the EUD increases by 47%. With activity lacking in the tumor center and the requirement of TCP = 0.90, the α cross-fire influence of ²¹¹At could compensate for at most a (48 μm)³ activity-lacking region for the Nucleus source, but (72 μm)³ for the Cytoplasm, Cell Surface, Cell and Voxel sources. In the clinic, the physician could prefer the suggested dose of the Cell Surface source in case of failure of local tumor control due to under-dosing. Generally, TCP could well exhibit the effect difference between under-dose and due-dose, but not between due-dose and over-dose, which makes TCP more suitable for the choice of therapy plan. EUD could well exhibit the difference between different models and activity distributions, which makes it more suitable for research work. When EUD is used to study the influence of an inhomogeneous activity distribution, one should keep the configuration and volume of the compared models consistent. (authors)

  3. Model based analysis of piezoelectric transformers.

    Science.gov (United States)

    Hemsel, T; Priya, S

    2006-12-22

    Piezoelectric transformers are increasingly popular in electrical devices owing to several advantages such as small size, high efficiency, no electromagnetic noise and non-flammability. In addition to conventional applications such as the ballast for the back-light inverter in notebook computers, camera flashes, and fuel ignition, several new applications have emerged such as AC/DC converters, battery chargers and automobile lighting. These new applications demand high power density and a wide range of voltage gain. Currently, the transformer power density is limited to 40 W/cm³, obtained at low voltage gain. The purpose of this study was to investigate a transformer design that has the potential of providing higher power density and a wider range of voltage gain. The new transformer design utilizes the radial mode at both the input and output ports and has unidirectional polarization in the ceramics. This design was found to provide 30 W of power with an efficiency of 98% and a 30 °C temperature rise above room temperature. An electro-mechanical equivalent circuit model was developed to describe the characteristics of the piezoelectric transformer. The model was found to successfully predict the characteristics of the transformer, with excellent matching between the computed and experimental results. The results of this study will allow unipoled piezoelectric transformers with specified performance to be designed deterministically. It is expected that in the near future the unipoled transformer will gain significant importance in various electrical components.

  4. ANFIS-Based Modeling for Photovoltaic Characteristics Estimation

    Directory of Open Access Journals (Sweden)

    Ziqiang Bi

    2016-09-01

    Full Text Available Due to the high cost of photovoltaic (PV) modules, an accurate performance estimation method is significantly valuable for studying the electrical characteristics of PV generation systems. Conventional analytical PV models are usually composed of nonlinear exponential functions, and a good number of unknown parameters must be identified before use. In this paper, an adaptive-network-based fuzzy inference system (ANFIS) based modeling method is proposed to predict the current-voltage characteristics of PV modules. The effectiveness of the proposed modeling method is evaluated through comparison with Villalva's model, a radial basis function neural network (RBFNN) based model and a support vector regression (SVR) based model. Simulation and experimental results confirm both the feasibility and the effectiveness of the proposed method.

  5. [Model-based biofuels system analysis: a review].

    Science.gov (United States)

    Chang, Shiyan; Zhang, Xiliang; Zhao, Lili; Ou, Xunmin

    2011-03-01

    Model-based system analysis is an important tool for evaluating the potential and impacts of biofuels, and for drafting biofuels technology roadmaps and targets. The broad reach of the biofuels supply chain requires that biofuels system analyses span a range of disciplines, including agriculture/forestry, energy, economics, and the environment. Here we reviewed various models developed for or applied to modeling biofuels, and presented a critical analysis of Agriculture/Forestry System Models, Energy System Models, Integrated Assessment Models, Micro-level Cost, Energy and Emission Calculation Models, and Specific Macro-level Biofuel Models. We focused on the models' strengths, weaknesses, and applicability, facilitating the selection of a suitable type of model for specific issues. Such an analysis was a prerequisite for future biofuels system modeling, and represented a valuable resource for researchers and policy makers.

  6. Prediction of pipeline corrosion rate based on grey Markov models

    International Nuclear Information System (INIS)

    Chen Yonghong; Zhang Dafa; Peng Guichu; Wang Yuemin

    2009-01-01

    Based on a model combining the grey model and the Markov model, the prediction of the corrosion rate of nuclear power pipelines was studied. Work was done to improve the grey model, and an optimized unbiased grey model was obtained. This new model was used to predict the tendency of the corrosion rate, and the Markov model was used to predict the residual errors. In order to improve the prediction precision, a rolling operation method was used in these prediction processes. The results indicate that the improvement to the grey model is effective, that the prediction precision of the new model, which combines the optimized unbiased grey model and the Markov model, is better, and that the use of the rolling operation method may improve the prediction precision further. (authors)
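
    For readers unfamiliar with grey prediction, the sketch below implements the basic GM(1,1) step (accumulated generating operation, least-squares estimation of the whitening equation, and back-differencing) on an invented corrosion-rate series; the unbiased optimization and the Markov correction of residuals described in the record are deliberately omitted.

    ```python
    # Basic GM(1,1) grey prediction on a hypothetical corrosion-rate series.
    import numpy as np

    x0 = np.array([0.32, 0.35, 0.37, 0.41, 0.44])   # observed rates (invented)
    x1 = np.cumsum(x0)                               # accumulated generating operation

    # Least-squares fit of dx1/dt + a*x1 = b using background values z1.
    z1 = 0.5 * (x1[1:] + x1[:-1])
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]

    def predict(k):
        """Predicted original value x0_hat at step k (k = 0 is the first datum)."""
        x1_hat = lambda j: (x0[0] - b / a) * np.exp(-a * j) + b / a
        return x1_hat(k) - x1_hat(k - 1)

    print("next-step corrosion-rate forecast:", predict(len(x0)))
    ```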

  7. Model and Behavior-Based Robotic Goalkeeper

    DEFF Research Database (Denmark)

    Lausen, H.; Nielsen, J.; Nielsen, M.

    2003-01-01

    This paper describes the design, implementation and test of a goalkeeper robot for the Middle-Size League of RoboCup. The goalkeeper task is implemented by a set of primitive tasks and behaviours coordinated by a 2-level hierarchical state machine. The primitive tasks concerning complex motion control are implemented by a non-linear control algorithm, adapted to the different task goals (e.g., follow the ball) ... the robot posture from local features extracted from images acquired by a catadioptric omni-directional vision system. Most robot parameters were designed based on simulations carried...

  8. Understanding Elementary Astronomy by Making Drawing-Based Models

    NARCIS (Netherlands)

    van Joolingen, Wouter; Aukes, A.V.A.; Gijlers, Aaltje H.; Bollen, Lars

    2015-01-01

    Modeling is an important approach in the teaching and learning of science. In this study, we attempt to bring modeling within the reach of young children by creating the SimSketch modeling system, which is based on freehand drawings that can be turned into simulations. This system was used by 247

  9. Modeling ground-based timber harvesting systems using computer simulation

    Science.gov (United States)

    Jingxin Wang; Chris B. LeDoux

    2001-01-01

    Modeling ground-based timber harvesting systems with an object-oriented methodology was investigated. Object-oriented modeling and design promote a better understanding of requirements, cleaner designs, and better maintainability of the harvesting simulation system. The model developed simulates chainsaw felling, drive-to-tree feller-buncher, swing-to-tree single-grip...

  10. Adopting a Models-Based Approach to Teaching Physical Education

    Science.gov (United States)

    Casey, Ashley; MacPhail, Ann

    2018-01-01

    Background: The popularised notion of models-based practice (MBP) is one that focuses on the delivery of a model, e.g. Cooperative Learning, Sport Education, Teaching Personal and Social Responsibility, Teaching Games for Understanding. Indeed, while an abundance of research studies have examined the delivery of a single model and some have…

  11. Understanding Elementary Astronomy by Making Drawing-Based Models

    Science.gov (United States)

    van Joolingen, W. R.; Aukes, Annika V.; Gijlers, H.; Bollen, L.

    2015-01-01

    Modeling is an important approach in the teaching and learning of science. In this study, we attempt to bring modeling within the reach of young children by creating the SimSketch modeling system, which is based on freehand drawings that can be turned into simulations. This system was used by 247 children (ages ranging from 7 to 15) to create a…

  12. The evolution of process-based hydrologic models

    NARCIS (Netherlands)

    Clark, Martyn P.; Bierkens, Marc F.P.; Samaniego, Luis; Woods, Ross A.; Uijlenhoet, Remko; Bennett, Katrina E.; Pauwels, Valentijn R.N.; Cai, Xitian; Wood, Andrew W.; Peters-Lidard, Christa D.

    2017-01-01

    The diversity in hydrologic models has historically led to great controversy on the "correct" approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. In this

  13. A review of Agent Based Modeling for agricultural policy evaluation

    NARCIS (Netherlands)

    Kremmydas, Dimitris; Athanasiadis, I.N.; Rozakis, Stelios

    2018-01-01

    Farm level scale policy analysis is receiving increased attention due to a changing agricultural policy orientation. Agent based models (ABM) are farm level models that have appeared in the end of 1990's, having several differences from traditional farm level models, like the consideration of

  14. Group Contribution Based Process Flowsheet Synthesis, Design and Modelling

    DEFF Research Database (Denmark)

    d'Anterroches, Loïc; Gani, Rafiqul

    2004-01-01

    This paper presents a process-group-contribution method to model, simulate and synthesize a flowsheet. The process-group based representation of a flowsheet together with a process "property" model are presented. The process-group based synthesis method is developed on the basis of the computer-aided molecular design methods and gives the ability to screen numerous process alternatives without the need to use rigorous process simulation models. The process "property" model calculates the design targets for the generated flowsheet alternatives, while a reverse modelling method (also developed) determines the design variables matching the target. A simple illustrative example highlighting the main features of the methodology is also presented.

  15. Testing R&D-Based Endogenous Growth Models

    DEFF Research Database (Denmark)

    Kruse-Andersen, Peter Kjær

    2017-01-01

    R&D-based growth models are tested using US data for the period 1953-2014. A general growth model is developed which nests the model varieties of interest. The model implies a cointegrating relationship between multifactor productivity, research intensity, and employment. This relationship is estimated using cointegrated VAR models. The results provide evidence against the widely used fully endogenous variety and in favor of the semi-endogenous variety. Forecasts based on the empirical estimates suggest that the slowdown in US productivity growth will continue. Particularly, the annual long...

  16. Fuzzy model-based control of a nuclear reactor

    International Nuclear Information System (INIS)

    Van Den Durpel, L.; Ruan, D.

    1994-01-01

    The fuzzy model-based control of a nuclear power reactor is an emerging research topic world-wide. SCK-CEN is dealing with this research at a preliminary stage, covering two aspects, namely fuzzy control and fuzzy modelling. The aim is to combine both methodologies, in contrast to conventional model-based PID control techniques, and to highlight the advantages of including fuzzy parameters such as safety and operator feedback. This paper summarizes the general scheme of this new research project

  17. Model-based Sensor Data Acquisition and Management

    OpenAIRE

    Aggarwal, Charu C.; Sathe, Saket; Papaioannou, Thanasis G.; Jeung, Ho Young; Aberer, Karl

    2012-01-01

    In recent years, due to the proliferation of sensor networks, there has been a genuine need for research into techniques for sensor data acquisition and management. To this end, a large number of techniques have emerged that advocate model-based sensor data acquisition and management. These techniques use mathematical models for performing various day-to-day tasks involved in managing sensor data. In this chapter, we survey the state-of-the-art techniques for model-based sensor data acquisition...

  18. Predictive value of EEG in postanoxic encephalopathy: A quantitative model-based approach.

    Science.gov (United States)

    Efthymiou, Evdokia; Renzel, Roland; Baumann, Christian R; Poryazova, Rositsa; Imbach, Lukas L

    2017-10-01

    The majority of comatose patients after cardiac arrest do not regain consciousness due to severe postanoxic encephalopathy. Early and accurate outcome prediction is therefore essential in determining further therapeutic interventions. The electroencephalogram is a standardized and commonly available tool used to estimate prognosis in postanoxic patients. The identification of pathological EEG patterns with poor prognosis relies, however, primarily on visual EEG scoring by experts. We introduced a model-based approach to EEG analysis (state space model) that allows for an objective and quantitative description of spectral EEG variability. We retrospectively analyzed standard EEG recordings in 83 comatose patients after cardiac arrest between 2005 and 2013 in the intensive care unit of the University Hospital Zürich. Neurological outcome was assessed one month after cardiac arrest using the Cerebral Performance Category. For a dynamic and quantitative EEG analysis, we implemented a model-based approach (state space analysis) to quantify EEG background variability independently of the visual scoring of EEG epochs. Spectral variability was compared between groups and correlated with clinical outcome parameters and visually scored EEG patterns. Quantitative assessment of spectral EEG variability (state space velocity) revealed significant differences between patients with poor and good outcome after cardiac arrest: lower mean velocity in the temporal electrodes (T4 and T5) was significantly associated with poor prognostic outcome. Visually scored EEG patterns such as generalized periodic discharges were also assessed. Quantitative EEG analysis (state space analysis) thus provides a novel, complementary marker for prognosis in postanoxic encephalopathy. Copyright © 2017 Elsevier B.V. All rights reserved.
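
    One possible reading of the "state space velocity" idea, offered purely as an assumption rather than the authors' implementation, is the mean distance between consecutive epoch power spectra; the sketch below computes such a spectral-variability index on synthetic single-channel EEG.

    ```python
    # Spectral "velocity" of synthetic EEG: mean distance between consecutive
    # epoch log-power spectra (an assumed proxy, not the published algorithm).
    import numpy as np
    from scipy.signal import welch

    rng = np.random.default_rng(0)
    fs = 250
    eeg = rng.normal(size=60 * fs)                  # 60 s of synthetic EEG

    epoch = 2 * fs                                  # 2-second epochs
    spectra = [welch(eeg[i:i + epoch], fs=fs, nperseg=fs)[1]
               for i in range(0, len(eeg) - epoch, epoch)]
    spectra = np.log(np.array(spectra) + 1e-12)

    velocity = np.linalg.norm(np.diff(spectra, axis=0), axis=1).mean()
    print("mean spectral 'velocity':", velocity)
    ```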

  19. Large-scale Comparative Study of Hi-C-based Chromatin 3D Structure Modeling Methods

    KAUST Repository

    Wang, Cheng

    2018-05-17

    Chromatin is a complex polymer molecule in eukaryotic cells, primarily consisting of DNA and histones. Many works have shown that the 3D folding of chromatin structure plays an important role in DNA expression. The recently proposed Chromosome Conformation Capture technologies, especially the Hi-C assays, provide us an opportunity to study how the 3D structures of the chromatin are organized. Based on the data from Hi-C experiments, many chromatin 3D structure modeling methods have been proposed. However, there is limited ground truth to validate these methods and no robust chromatin structure alignment algorithms to evaluate their performance. In our work, we first made a thorough literature review of 25 publicly available population Hi-C-based chromatin 3D structure modeling methods. Furthermore, to evaluate and compare the performance of these methods, we proposed a novel data simulation method, which combined the population Hi-C data and single-cell Hi-C data without ad hoc parameters. Also, we designed a global and a local alignment algorithm to measure the similarity between the templates and the chromatin structures predicted by different modeling methods. Finally, the results from large-scale comparative tests indicated that our alignment algorithms significantly outperform the algorithms in the literature.

  20. Characteristics-based modelling of flow problems

    International Nuclear Information System (INIS)

    Saarinen, M.

    1994-02-01

    The method of characteristics is an exact way to proceed to the solution of hyperbolic partial differential equations. The numerical solutions, however, are obtained in the fixed computational grid where interpolations of values between the mesh points cause numerical errors. The Piecewise Linear Interpolation Method, PLIM, the utilization of which is based on the method of characteristics, has been developed to overcome these deficiencies. The thesis concentrates on the computer simulation of the two-phase flow. The main topics studied are: (1) the PLIM method has been applied to study the validity of the numerical scheme through solving various flow problems to achieve knowledge for the further development of the method, (2) the mathematical and physical validity and applicability of the two-phase flow equations based on the SFAV (Separation of the two-phase Flow According to Velocities) approach has been studied, and (3) The SFAV approach has been further developed for particular cases such as stratified horizontal two-phase flow. (63 refs., 4 figs.)

  1. Modeling and Analysis of Space Based Transceivers

    Science.gov (United States)

    Moore, Michael S.; Price, Jeremy C.; Abbott, Ben; Liebetreu, John; Reinhart, Richard C.; Kacpura, Thomas J.

    2007-01-01

    This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and generate analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.

  2. Model-based satellite image fusion

    DEFF Research Database (Denmark)

    Aanæs, Henrik; Sveinsson, J. R.; Nielsen, Allan Aasbjerg

    2008-01-01

    A method is proposed for pixel-level satellite image fusion derived directly from a model of the imaging sensor. By design, the proposed method is spectrally consistent. It is argued that the proposed method needs regularization, as is the case for any method for this problem. A framework for pixel neighborhood regularization is presented. This framework enables the formulation of the regularization in a way that corresponds well with our prior assumptions about the image data. The proposed method is validated and compared with other approaches on several data sets. Lastly, the intensity-hue-saturation method is revisited in order to gain additional insight into what implications spectral consistency has for an image fusion method.

  3. Evaluation of pipeline defect's characteristic axial length via model-based parameter estimation in ultrasonic guided wave-based inspection

    International Nuclear Information System (INIS)

    Wang, Xiaojuan; Tse, Peter W; Dordjevich, Alexandar

    2011-01-01

    The reflection signal from a defect in the process of guided wave-based pipeline inspection usually includes sufficient information to detect and define the defect. In previous research, it has been found that the reflection of guided waves from even a complex defect primarily results from the interference between reflection components generated at the front and the back edges of the defect. The respective contribution of different parameters of a defect to the overall reflection can be affected by the features of the two primary reflection components. The identification of these components embedded in the reflection signal is therefore useful in characterizing the concerned defect. In this research, we propose a method of model-based parameter estimation with the aid of the Hilbert–Huang transform technique for the purpose of decomposition of a reflection signal to enable characterization of the pipeline defect. Once two primary edge reflection components are decomposed and identified, the distance between the reflection positions, which closely relates to the axial length of the defect, could be easily and accurately determined. Considering the irregular profiles of complex pipeline defects at their two edges, which is often the case in real situations, the average of varied axial lengths of such a defect along the circumference of the pipeline is used in this paper as the characteristic value of actual axial length for comparison purpose. The experimental results of artificial defects and real corrosion in sample pipes were considered in this paper to demonstrate the effectiveness of the proposed method
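
    The axial-length estimate rests on the time separation between the front- and back-edge reflections. The sketch below, with an assumed sampling rate, group velocity and a synthetic two-echo signal, extracts the envelope with a Hilbert transform, picks the two edge peaks, and converts their delay into a length via L ≈ v_g·Δt/2; it is a stand-in for, not a reproduction of, the paper's Hilbert–Huang-based decomposition.

    ```python
    # Estimate a defect's axial length from two edge echoes (synthetic signal).
    import numpy as np
    from scipy.signal import hilbert, find_peaks

    fs = 1.0e6                    # sampling rate, Hz (assumed)
    v_group = 3200.0              # guided-wave group velocity, m/s (assumed)

    t = np.arange(0, 2e-3, 1 / fs)
    tone = lambda t0: np.exp(-((t - t0) / 2e-5) ** 2) * np.sin(2 * np.pi * 60e3 * (t - t0))
    signal = tone(0.6e-3) + 0.7 * tone(0.7e-3)      # front- and back-edge reflections

    envelope = np.abs(hilbert(signal))
    peaks, _ = find_peaks(envelope, height=0.3 * envelope.max(), distance=int(5e-5 * fs))
    dt = t[peaks[1]] - t[peaks[0]]                  # delay between the two edge echoes
    axial_length = v_group * dt / 2                 # extra round trip over the defect
    print(f"estimated axial length: {axial_length * 1e3:.1f} mm")
    ```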

  4. Repetition-based Interactive Facade Modeling

    KAUST Repository

    AlHalawani, Sawsan

    2012-07-01

    Modeling and reconstruction of urban environments has gained researchers' attention throughout the past few years. It spreads in a variety of directions across multiple disciplines such as image processing, computer graphics and computer vision, as well as architecture, geoscience and remote sensing. Having a virtual world of our real cities is very attractive for various purposes such as entertainment, engineering and government, among many others. In this thesis, we address the problem of processing a single facade image to acquire useful information that can be utilized to manipulate the facade and generate variations of facade images which can later be used for texturing building models. Typical facade structures exhibit a rectilinear distribution wherein windows and other elements are organized in a grid of horizontal and vertical repetitions of similar patterns. In the first part of this thesis, we propose an efficient algorithm that exploits information obtained from a single image to identify the distribution grid of the dominant elements, i.e. windows. This detection method is initially assisted by the user marking the dominant window, followed by an automatic process for identifying its repeated instances, which are used to define the structure grid. Given the distribution grid, we allow the user to interactively manipulate the facade by adding, deleting, resizing or repositioning the windows in order to generate new facade structures. Having this utility for interactive facade editing is very valuable for creating facade variations and generating new textures for building models. Ultimately, there is a wide range of interesting possibilities for interaction to be explored.

  5. System Dynamics as Model-Based Theory Building

    OpenAIRE

    Schwaninger, Markus; Grösser, Stefan N.

    2008-01-01

    This paper introduces model-based theory building as a feature of system dynamics (SD) with large potential. It presents a systemic approach to actualizing that potential, thereby opening up a new perspective on theory building in the social sciences. The question addressed is if and how SD enables the construction of high-quality theories. This contribution is based on field experiment type projects which have been focused on model-based theory building, specifically the construction of a mi...

  6. User Context Aware Base Station Power Flow Model

    OpenAIRE

    Walsh, Barbara; Farrell, Ronan

    2005-01-01

    At present the testing of power amplifiers within base station transmitters is limited to testing at component level as opposed to testing at the system level. While the detection of catastrophic failure is possible, that of performance degradation is not. This paper proposes a base station model with respect to transmitter output power with the aim of introducing system level monitoring of the power amplifier behaviour within the base station. Our model reflects the expe...

  7. Rodent model of activity-based anorexia.

    Science.gov (United States)

    Carrera, Olaia; Fraga, Ángela; Pellón, Ricardo; Gutiérrez, Emilio

    2014-04-10

    Activity-based anorexia (ABA) consists of a procedure that involves the simultaneous exposure of animals to a restricted feeding schedule, while free access is allowed to an activity wheel. Under these conditions, animals show a progressive increase in wheel running, a reduced efficiency in food intake to compensate for their increased activity, and a severe progression of weight loss. Due to the parallelism with the clinical manifestations of anorexia nervosa including increased activity, reduced food intake and severe weight loss, the ABA procedure has been proposed as the best analog of human anorexia nervosa (AN). Thus, ABA research could both allow a better understanding of the mechanisms underlying AN and generate useful leads for treatment development in AN. Copyright © 2014 John Wiley & Sons, Inc.

  8. Modeling thrombin generation: plasma composition based approach.

    Science.gov (United States)

    Brummel-Ziedins, Kathleen E; Everse, Stephen J; Mann, Kenneth G; Orfeo, Thomas

    2014-01-01

    Thrombin has multiple functions in blood coagulation and its regulation is central to maintaining the balance between hemorrhage and thrombosis. Empirical and computational methods that capture thrombin generation can provide advancements to current clinical screening of the hemostatic balance at the level of the individual. In any individual, procoagulant and anticoagulant factor levels together act to generate a unique coagulation phenotype (net balance) that is reflective of the sum of its developmental, environmental, genetic, nutritional and pharmacological influences. Defining such thrombin phenotypes may provide a means to track disease progression pre-crisis. In this review we briefly describe thrombin function, methods for assessing thrombin dynamics as a phenotypic marker, computationally derived thrombin phenotypes versus determined clinical phenotypes, the boundaries of normal range thrombin generation using plasma composition based approaches and the feasibility of these approaches for predicting risk.

  9. Adopsi Model Competency Based Training dalam Kewirausahaan

    Directory of Open Access Journals (Sweden)

    I Ketut Santra

    2009-01-01

    Full Text Available The aim of the research is to improve the teaching method in the entrepreneurship subject. This research adopted competency-based training (CBT) into the entrepreneurship subject. The major task in this research was to formulate and design the entrepreneurship competencies. Entrepreneurship competency is indicated by Personal, Strategic and Situational, and Business competence. All of the entrepreneurship competences are broken down into sub-topics of competence. After designing and formulating the game and simulation, the research continued by implementing the competency-based training in a real class. The implementation took one semester, starting in September 2006 and ending in early February 2007. The lesson learnt from the implementation period is that CBT could improve the students' competence in the Personal, Situational and Strategic, and Business areas. These three competencies are important for a successful entrepreneur and are a sign of the application of "Kurikulum Berbasis Kompetensi". There is ample evidence of the achievement of CBT in the entrepreneurship subject. Firstly, the physical achievement: all of the students' business plans became real businesses, as documented by pictures of the students' actual businesses. Secondly, the theoretical achievement: the Personal, Situational and Strategic, and Business competences statistically have a significant relation with Business Plan and even Real Business quality. The effect of the Personal, Situational and Strategic, and Business competences on Business Plan quality is 84.4%, and on Real Business quality 77.2%. The statistical evidence suggests that the redesign of the entrepreneurship subject is the right way forward. The content of the entrepreneurial competences (Personal, Situational and Strategic, and Business competence) has an impact on the students' ability to set up and run their own businesses.

  10. Guidelines for visualizing and annotating rule-based models.

    Science.gov (United States)

    Chylek, Lily A; Hu, Bin; Blinov, Michael L; Emonet, Thierry; Faeder, James R; Goldstein, Byron; Gutenkunst, Ryan N; Haugh, Jason M; Lipniacki, Tomasz; Posner, Richard G; Yang, Jin; Hlavacek, William S

    2011-10-01

    Rule-based modeling provides a means to represent cell signaling systems in a way that captures site-specific details of molecular interactions. For rule-based models to be more widely understood and (re)used, conventions for model visualization and annotation are needed. We have developed the concepts of an extended contact map and a model guide for illustrating and annotating rule-based models. An extended contact map represents the scope of a model by providing an illustration of each molecule, molecular component, direct physical interaction, post-translational modification, and enzyme-substrate relationship considered in a model. A map can also illustrate allosteric effects, structural relationships among molecular components, and compartmental locations of molecules. A model guide associates elements of a contact map with annotation and elements of an underlying model, which may be fully or partially specified. A guide can also serve to document the biological knowledge upon which a model is based. We provide examples of a map and guide for a published rule-based model that characterizes early events in IgE receptor (FcεRI) signaling. We also provide examples of how to visualize a variety of processes that are common in cell signaling systems but not considered in the example model, such as ubiquitination. An extended contact map and an associated guide can document knowledge of a cell signaling system in a form that is visual as well as executable. As a tool for model annotation, a map and guide can communicate the content of a model clearly and with precision, even for large models.

  11. A community-based framework for aquatic ecosystem models

    DEFF Research Database (Denmark)

    Trolle, Didde; Hamilton, D. P.; Hipsey, M. R.

    2012-01-01

    Here, we communicate a point of departure in the development of aquatic ecosystem models, namely a new community-based framework, which supports an enhanced and transparent union between the collective expertise that exists in the communities of traditional ecologists and model developers. Through a literature survey, we document the growing importance of numerical aquatic ecosystem models while also noting the difficulties, up until now, of the aquatic scientific community in making significant advances in these models during the past two decades. Through a common forum for aquatic ecosystem modellers we aim to (i) advance collaboration within the aquatic ecosystem modelling community, (ii) enable increased use of models for research, policy and ecosystem-based management, (iii) facilitate a collective framework using common (standardised) code to ensure that model development is incremental, (iv...

  12. Improving satellite-based PM2.5 estimates in China using Gaussian processes modeling in a Bayesian hierarchical setting.

    Science.gov (United States)

    Yu, Wenxi; Liu, Yang; Ma, Zongwei; Bi, Jun

    2017-08-01

    Using satellite-based aerosol optical depth (AOD) measurements and statistical models to estimate ground-level PM2.5 is a promising way to fill the areas that are not covered by ground PM2.5 monitors. The statistical models used in previous studies are primarily Linear Mixed Effects (LME) and Geographically Weighted Regression (GWR) models. In this study, we developed a new regression model between PM2.5 and AOD using Gaussian processes in a Bayesian hierarchical setting. Gaussian processes model the stochastic nature of the spatial random effects, where the mean surface and the covariance function are specified. The spatial stochastic process is incorporated under the Bayesian hierarchical framework to explain the variation of PM2.5 concentrations together with other factors, such as AOD, spatial and non-spatial random effects. We evaluate the results of our model and compare them with those of other, conventional statistical models (GWR and LME) by within-sample model fitting and out-of-sample validation (cross validation, CV). The results show that our model possesses a CV result (R² = 0.81) that reflects higher accuracy than that of GWR and LME (0.74 and 0.48, respectively). Our results indicate that Gaussian process models have the potential to improve the accuracy of satellite-based PM2.5 estimates.
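
    As a much-simplified, non-hierarchical stand-in for the modeling idea (not the authors' Bayesian implementation), the sketch below fits a Gaussian-process regression from AOD plus coordinates to synthetic PM2.5 values and returns predictions with uncertainty.

    ```python
    # Gaussian-process regression from AOD and coordinates to synthetic PM2.5.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(0)
    n = 200
    X = np.column_stack([
        rng.uniform(0.1, 1.5, n),      # AOD
        rng.uniform(100, 120, n),      # longitude
        rng.uniform(20, 45, n),        # latitude
    ])
    pm25 = 30 + 40 * X[:, 0] + 5 * np.sin(X[:, 1] / 3) + rng.normal(scale=5, size=n)

    kernel = 1.0 * RBF(length_scale=[0.5, 5.0, 5.0]) + WhiteKernel(noise_level=1.0)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, pm25)

    mean, std = gp.predict(X[:5], return_std=True)   # predictions with uncertainty
    print(np.round(mean, 1), np.round(std, 1))
    ```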

  13. Traffic simulation based ship collision probability modeling

    Energy Technology Data Exchange (ETDEWEB)

    Goerlandt, Floris, E-mail: floris.goerlandt@tkk.f [Aalto University, School of Science and Technology, Department of Applied Mechanics, Marine Technology, P.O. Box 15300, FI-00076 AALTO, Espoo (Finland); Kujala, Pentti [Aalto University, School of Science and Technology, Department of Applied Mechanics, Marine Technology, P.O. Box 15300, FI-00076 AALTO, Espoo (Finland)

    2011-01-15

    Maritime traffic poses various risks in terms of human, environmental and economic loss. In a risk analysis of ship collisions, it is important to get a reasonable estimate for the probability of such accidents and the consequences they lead to. In this paper, a method is proposed to assess the probability of vessels colliding with each other. The method is capable of determining the expected number of accidents, the locations where and the times when they are most likely to occur, while providing input for models concerned with the expected consequences. At the basis of the collision detection algorithm lies an extensive time-domain micro-simulation of vessel traffic in the given area. The Monte Carlo simulation technique is applied to obtain a meaningful prediction of the relevant factors of the collision events. Data obtained through the Automatic Identification System is analyzed in detail to obtain realistic input data for the traffic simulation: traffic routes, the number of vessels on each route, the ship departure times, main dimensions and sailing speed. The results obtained by the proposed method for the studied case of the Gulf of Finland are presented, showing reasonable agreement with registered accident and near-miss data.
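
    To illustrate the Monte Carlo flavour of such traffic-based collision models, and only as a toy unrelated to the paper's AIS-driven micro-simulation, the sketch below draws Poisson vessel arrivals on two crossing routes and counts time bins in which both routes are occupied; all rates and the bin width are invented.

    ```python
    # Toy Monte Carlo count of potential crossing conflicts between two routes.
    import numpy as np

    rng = np.random.default_rng(42)
    bin_minutes = 3
    bins = 24 * 365 * 60 // bin_minutes              # one simulated year
    rate_a, rate_b = 0.8, 0.5                        # crossings per hour per route (assumed)

    def simulate_year():
        # A bin with traffic on both routes counts as one potential conflict.
        arrivals_a = rng.poisson(rate_a * bin_minutes / 60, size=bins)
        arrivals_b = rng.poisson(rate_b * bin_minutes / 60, size=bins)
        return int(((arrivals_a > 0) & (arrivals_b > 0)).sum())

    conflicts = np.array([simulate_year() for _ in range(200)])
    print(f"potential crossing conflicts per year: {conflicts.mean():.1f} ± {conflicts.std():.1f}")
    ```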

  14. An approach for activity-based DEVS model specification

    DEFF Research Database (Denmark)

    Alshareef, Abdurrahman; Sarjoughian, Hessam S.; Zarrin, Bahram

    2016-01-01

    Creation of DEVS models has been advanced through Model Driven Architecture and its frameworks. The overarching role of the frameworks has been to help develop model specifications in a disciplined fashion. Frameworks can provide intermediary layers between the higher level mathematical models and their corresponding software specifications from both structural and behavioral aspects. Unlike structural modeling, developing models to specify the behavior of systems is known to be harder and more complex, particularly when operations with non-trivial control schemes are required. In this paper, we propose specifying activity-based behavior modeling of parallel DEVS atomic models. We consider UML activities and actions as fundamental units of behavior modeling, especially in the presence of recent advances in the UML 2.5 specifications. We describe in detail how to approach activity modeling with a set of elemental

  15. Business model for sensor-based fall recognition systems.

    Science.gov (United States)

    Fachinger, Uwe; Schöpke, Birte

    2014-01-01

    AAL systems require, in addition to sophisticated and reliable technology, adequate business models for their launch and sustainable establishment. This paper presents the basic features of alternative business models for a sensor-based fall recognition system which was developed within the context of the "Lower Saxony Research Network Design of Environments for Ageing" (GAL). The models were developed parallel to the R&D process with successive adaptation and concretization. An overview of the basic features (i.e. nine partial models) of the business model is given and the mutual exclusive alternatives for each partial model are presented. The partial models are interconnected and the combinations of compatible alternatives lead to consistent alternative business models. However, in the current state, only initial concepts of alternative business models can be deduced. The next step will be to gather additional information to work out more detailed models.

  16. PWR surveillance based on correspondence between empirical models and physical models

    International Nuclear Information System (INIS)

    Zwingelstein, G.; Upadhyaya, B.R.; Kerlin, T.W.

    1976-01-01

    An on-line surveillance method based on the correspondence between empirical models and physical models is proposed for pressurized water reactors. Two types of empirical models are considered, as well as the mathematical models defining the correspondence between the physical and empirical parameters. The efficiency of this method is illustrated for the surveillance of the Doppler coefficient for Oconee I (an 886 MWe PWR).

  17. Intelligent Transportation and Evacuation Planning A Modeling-Based Approach

    CERN Document Server

    Naser, Arab

    2012-01-01

    Intelligent Transportation and Evacuation Planning: A Modeling-Based Approach provides a new paradigm for evacuation planning strategies and techniques. Recently, evacuation planning and modeling have increasingly attracted interest among researchers as well as government officials. This interest stems from the recent catastrophic hurricanes and weather-related events that occurred in the southeastern United States (Hurricane Katrina and Rita). The evacuation methods that were in place before and during the hurricanes did not work well and resulted in thousands of deaths. This book offers insights into the methods and techniques that allow for implementing mathematical-based, simulation-based, and integrated optimization and simulation-based engineering approaches for evacuation planning. This book also: Comprehensively discusses the application of mathematical models for evacuation and intelligent transportation modeling Covers advanced methodologies in evacuation modeling and planning Discusses principles a...

  18. Model-based reasoning technology for the power industry

    International Nuclear Information System (INIS)

    Touchton, R.A.; Subramanyan, N.S.; Naser, J.A.

    1991-01-01

    This paper reports on model-based reasoning, which refers to an expert system implementation methodology that uses a model of the system being reasoned about. Model-based representation and reasoning techniques offer many advantages and are highly suitable for domains where the individual components, their interconnection, and their behavior are well known. Technology Applications, Inc. (TAI), under contract to the Electric Power Research Institute (EPRI), investigated the use of model-based reasoning in the power industry, including the nuclear power industry. During this project, a model-based monitoring and diagnostic tool, called ProSys, was developed. Also, an alarm prioritization system was developed as a demonstration prototype

  19. SDG and qualitative trend based model multiple scale validation

    Science.gov (United States)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods have weak completeness, are carried out at a single scale, and depend on human experience. A multiple-scale validation based on the SDG (Signed Directed Graph) and qualitative trends is therefore proposed. First, the SDG model is built and qualitative trends are added to the model. Then, complete testing scenarios are produced by positive inference. The multiple-scale validation is carried out by comparing the testing scenarios with the outputs of the simulation model at different scales. Finally, the effectiveness is proved by carrying out validation for a reactor model.

  20. Evaluating Emulation-based Models of Distributed Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Stephen T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Cyber Initiatives; Gabert, Kasimir G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Cyber Initiatives; Tarman, Thomas D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Emulytics Initiatives

    2017-08-01

    Emulation-based models of distributed computing systems are collections of virtual machines, virtual networks, and other emulation components configured to stand in for operational systems when performing experimental science, training, analysis of design alternatives, test and evaluation, or idea generation. As with any tool, we should carefully evaluate whether our uses of emulation-based models are appropriate and justified. Otherwise, we run the risk of using a model incorrectly and creating meaningless results. The variety of uses of emulation-based models each have their own goals and deserve thoughtful evaluation. In this paper, we enumerate some of these uses and describe approaches that one can take to build an evidence-based case that a use of an emulation-based model is credible. Predictive uses of emulation-based models, where we expect a model to tell us something true about the real world, set the bar especially high, and the principal evaluation method, called validation, is commensurately rigorous. We spend the majority of our time describing and demonstrating the validation of a simple predictive model using a well-established methodology inherited from decades of development in the computational science and engineering community.

  1. Towards a CPN-Based Modelling Approach for Reconciling Verification and Implementation of Protocol Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2013-01-01

    Formal modelling of protocols is often aimed at one specific purpose such as verification or automatically generating an implementation. This leads to models that are useful for one purpose, but not for others. Being able to derive models for verification and implementation from a single model...... is beneficial both in terms of reduced total modelling effort and confidence that the verification results are valid also for the implementation model. In this paper we introduce the concept of a descriptive specification model and an approach based on refining a descriptive model to target both verification...... how this model can be refined to target both verification and implementation....

  2. Model Based Mission Assurance: Emerging Opportunities for Robotic Systems

    Science.gov (United States)

    Evans, John W.; DiVenti, Tony

    2016-01-01

    The emergence of Model Based Systems Engineering (MBSE) in a Model Based Engineering framework has created new opportunities to improve effectiveness and efficiencies across the assurance functions. The MBSE environment supports not only system architecture development, but also provides support for Systems Safety, Reliability and Risk Analysis concurrently in the same framework. Linking to detailed design will further improve assurance capabilities to support failure avoidance and mitigation in flight systems. This is also leading to new assurance functions, including model assurance and management of uncertainty in the modeling environment. Further, assurance cases, structured hierarchical arguments or models, are emerging as a basis for supporting a comprehensive viewpoint in which to support Model Based Mission Assurance (MBMA).

  3. Automated extraction of knowledge for model-based diagnostics

    Science.gov (United States)

    Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.

    1990-01-01

    The concept of accessing computer-aided design (CAD) databases and extracting a process model automatically is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques, as well as an internal database of component descriptions, to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data from the drawn system. The AKG system is capable of defining knowledge bases in formats required by various model-based reasoning tools.

  4. Modeling and knowledge acquisition processes using case-based inference

    Directory of Open Access Journals (Sweden)

    Ameneh Khadivar

    2017-03-01

    The acquisition and presentation of organizational process knowledge has been considered by many knowledge management researchers. In this research, a model for process knowledge acquisition and presentation is presented using a case-based reasoning approach. The validity of the presented model was evaluated by conducting an expert panel. A software system was then developed based on the presented model and implemented in Eghtesad Novin Bank of Iran. Following the stages of the presented model, the knowledge-intensive processes were first identified, and the process knowledge was then stored in a knowledge base in the format of problem/solution/consequent. Knowledge retrieval was based on nearest-neighbour similarity. To validate the implemented system, its results were compared with the decisions of the process experts.
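
    The retrieval step described above can be illustrated with a minimal nearest-neighbour sketch over problem/solution/consequent cases; the feature encoding and the cases themselves are hypothetical and not taken from the bank's knowledge base.

```python
# Nearest-neighbour retrieval over problem/solution/consequent cases.
import math

CASE_BASE = [
    # problem features are hypothetical numeric descriptors of a process issue
    {"problem": [0.9, 0.2, 0.1], "solution": "escalate to credit committee", "consequent": "resolved"},
    {"problem": [0.1, 0.8, 0.3], "solution": "request extra documentation", "consequent": "resolved"},
    {"problem": [0.2, 0.1, 0.9], "solution": "reassign to branch manager", "consequent": "pending"},
]

def retrieve(query, k=1):
    """Return the k stored cases whose problem description is closest to the query."""
    ranked = sorted(CASE_BASE, key=lambda case: math.dist(case["problem"], query))
    return ranked[:k]

best = retrieve([0.85, 0.25, 0.15])[0]
print(best["solution"], "->", best["consequent"])
```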

  5. Software for medical image based phantom modelling

    International Nuclear Information System (INIS)

    Possani, R.G.; Massicano, F.; Coelho, T.S.; Yoriyaz, H.

    2011-01-01

    The latest treatment planning systems depend strongly on CT images, so the tendency is for dosimetry procedures in nuclear medicine therapy to also be image based, using magnetic resonance imaging (MRI) or computed tomography (CT) to extract anatomical and histological information, as well as functional imaging or activity maps such as PET or SPECT. This information, combined with radiation transport simulation software, is used to estimate internal dose in patients undergoing treatment in nuclear medicine. This work aims to re-engineer the software SCMS, an interface between the Monte Carlo code MCNP and the medical images that carry information from the patient under treatment. In other words, the necessary information contained in the images is interpreted and presented in a specific format to the MCNP Monte Carlo code to perform the simulation of radiation transport. The user therefore does not need to understand the complex process of preparing MCNP input, as SCMS automatically constructs the anatomical data of the patient as well as the radioactive source data. SCMS was originally developed in Fortran-77. In this work it was rewritten in an object-oriented language (Java). New features and data options have also been incorporated into the software. The new software thus has a number of improvements, such as an intuitive GUI and a menu for selecting the energy spectra corresponding to a specific radioisotope stored in an XML data bank. The new version also supports new materials, and the user can specify an image region of interest for the calculation of absorbed dose. (author)

  6. Evaluating performances of simplified physically based landslide susceptibility models.

    Science.gov (United States)

    Capparelli, Giovanna; Formetta, Giuseppe; Versace, Pasquale

    2015-04-01

    Rainfall-induced shallow landslides cause significant damage, involving loss of life and property. Predicting locations susceptible to shallow landslides is a complex task that involves many disciplines: hydrology, geotechnical science, geomorphology, and statistics. Usually, two main approaches are used to accomplish this task: statistical or physically based models. This paper presents a package of GIS-based models for landslide susceptibility analysis. It was integrated into the NewAge-JGrass hydrological model using the Object Modeling System (OMS) modeling framework. The package includes three simplified physically based models for landslide susceptibility analysis (M1, M2, and M3) and a component for model verification. It computes eight goodness-of-fit (GOF) indices by comparing model results and measurement data pixel by pixel. Moreover, the package integration in NewAge-JGrass allows the use of other components, such as geographic information system tools to manage input-output processes, and automatic calibration algorithms to estimate model parameters. The system offers the possibility to investigate and fairly compare the quality and the robustness of models and model parameters, according to a procedure that includes: i) model parameter estimation by optimizing each GOF index separately, ii) model evaluation in the ROC plane using each optimal parameter set, and iii) GOF robustness evaluation by assessing sensitivity to input parameter variation. This procedure was repeated for all three models. The system was applied to a case study in Calabria (Italy) along the Salerno-Reggio Calabria highway, between Cosenza and the Altilia municipality. The analysis showed that, among all the optimized indices and all three models, Average Index (AI) optimization coupled with model M3 is the best modeling solution for our test case. This research was funded by PON Project No. 01_01503 "Integrated Systems for Hydrogeological Risk
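
    For illustration, the sketch below computes confusion-matrix based goodness-of-fit indices from a binary susceptibility map and an observed landslide map, pixel by pixel; the particular indices shown (accuracy, true skill statistic, critical success index) are generic examples and not necessarily the paper's eight GOF indices.

```python
import numpy as np

def gof_indices(predicted, observed):
    """Confusion-matrix based goodness-of-fit indices for binary maps."""
    predicted = np.asarray(predicted, dtype=bool)
    observed = np.asarray(observed, dtype=bool)
    tp = np.sum(predicted & observed)      # correctly predicted landslide pixels
    tn = np.sum(~predicted & ~observed)    # correctly predicted stable pixels
    fp = np.sum(predicted & ~observed)
    fn = np.sum(~predicted & observed)
    return {
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "true_skill_statistic": tp / (tp + fn) - fp / (fp + tn),
        "critical_success_index": tp / (tp + fp + fn),
    }

# Toy 3x3 maps standing in for a susceptibility map and a landslide inventory.
model_map = np.array([[1, 1, 0], [0, 1, 0], [1, 0, 0]], dtype=bool)
observed_map = np.array([[1, 0, 0], [0, 1, 1], [1, 0, 0]], dtype=bool)
print(gof_indices(model_map, observed_map))
```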

  7. Convex-based void filling method for CAD-based Monte Carlo geometry modeling

    International Nuclear Information System (INIS)

    Yu, Shengpeng; Cheng, Mengyun; Song, Jing; Long, Pengcheng; Hu, Liqin

    2015-01-01

    Highlights: • We present a new void filling method named CVF for CAD-based MC geometry modeling. • We describe convex-based void description and quality-based space subdivision. • The results show the improvements provided by CVF in both modeling and MC calculation efficiency. - Abstract: CAD-based automatic geometry modeling tools have been widely applied to generate Monte Carlo (MC) calculation geometry for complex systems from CAD models. Automatic void filling is one of the main functions in CAD-based MC geometry modeling tools, because the void space between parts in CAD models is traditionally not modeled, while MC codes such as MCNP need the entire problem space to be described. A dedicated void filling method, named Convex-based Void Filling (CVF), is proposed in this study for efficient void filling and concise void descriptions. The method subdivides the whole problem space into disjoint regions using Quality-based Subdivision (QS) and describes the void space in each region with complementary descriptions of the convex volumes intersecting with that region. It has been implemented in SuperMC/MCAM, the Multiple-Physics Coupling Analysis Modeling Program, and tested on the International Thermonuclear Experimental Reactor (ITER) Alite model. The results showed that the new method reduced both automatic modeling time and MC calculation time.

  8. Interactive Coherence-Based Façade Modeling

    KAUST Repository

    Musialski, Przemyslaw

    2012-05-01

    We propose a novel interactive framework for modeling building facades from images. Our method is based on the notion of coherence-based editing, which allows exploiting partial symmetries across the facade at any level of detail. The proposed workflow mixes manual interaction with automatic splitting and grouping operations based on unsupervised cluster analysis. In contrast to previous work, our approach leads to detailed 3D geometric models with up to several thousand regions per facade. We compare our modeling scheme to others and evaluate our approach in a user study with an experienced user and several novice users.

  9. Model-Based Integration and Interpretation of Data

    DEFF Research Database (Denmark)

    Petersen, Johannes

    2004-01-01

    Data integration and interpretation plays a crucial role in supervisory control. The paper defines a set of generic inference steps for the data integration and interpretation process based on a three-layer model of system representations. The three-layer model is used to clarify the combination...... of constraint and object-centered representations of the work domain throwing new light on the basic principles underlying the data integration and interpretation process of Rasmussen's abstraction hierarchy as well as other model-based approaches combining constraint and object-centered representations. Based...

  10. Pixel-based meshfree modelling of skeletal muscles.

    Science.gov (United States)

    Chen, Jiun-Shyan; Basava, Ramya Rao; Zhang, Yantao; Csapo, Robert; Malis, Vadim; Sinha, Usha; Hodgson, John; Sinha, Shantanu

    2016-01-01

    This paper introduces the meshfree Reproducing Kernel Particle Method (RKPM) for 3D image-based modeling of skeletal muscles. This approach allows for the construction of a simulation model based on pixel data obtained from medical images. The material properties and muscle fiber direction obtained from Diffusion Tensor Imaging (DTI) are input at each pixel point. The reproducing kernel (RK) approximation allows a representation of material heterogeneity with smooth transitions. A multiphase, multichannel, level-set based segmentation framework is adopted for individual muscle segmentation using Magnetic Resonance Images (MRI) and DTI. The application of the proposed methods to modeling the human lower leg is demonstrated.

  11. Elastoplastic cup model for cement-based materials

    Directory of Open Access Journals (Sweden)

    Yan Zhang

    2010-03-01

    Based on experimental data obtained from triaxial tests and a hydrostatic test, a cup model was formulated. Two plastic mechanisms, deviatoric shearing and pore collapse, are taken into account. The model also considers the influence of confining pressure. In this paper, the calibration of the model is detailed and numerical simulations of the main mechanical behavior of cement paste over a large range of stress are described, showing good agreement with experimental results. The case study shows that this cup model has extensive applicability for cement-based materials and other quasi-brittle and high-porosity materials in a complex stress state.

  12. A comprehensive gaze stabilization controller based on cerebellar internal models

    DEFF Research Database (Denmark)

    Vannucci, Lorenzo; Falotico, Egidio; Tolu, Silvia

    2017-01-01

    . The VOR works in conjunction with the opto-kinetic reflex (OKR), which is a visual feedback mechanism that allows to move the eye at the same speed as the observed scene. Together they keep the image stationary on the retina. In this work we implement on a humanoid robot a model of gaze stabilization...... based on the coordination of VCR and VOR and OKR. The model, inspired by neuroscientific cerebellar theories, is provided with learning and adaptation capabilities based on internal models. We present the results for the gaze stabilization model on three sets of experiments conducted on the SABIAN robot...

  13. Map-based model of the cardiac action potential

    International Nuclear Information System (INIS)

    Pavlov, Evgeny A.; Osipov, Grigory V.; Chan, C.K.; Suykens, Johan A.K.

    2011-01-01

    A simple, computationally efficient model which is capable of replicating the basic features of the cardiac cell action potential is proposed. The model is a four-dimensional map and demonstrates good correspondence with real cardiac cells. Various regimes of cardiac activity, which can be reproduced by the proposed model, are shown. Bifurcation mechanisms of the transitions between these regimes are explained using phase space analysis. The dynamics of 1D and 2D lattices of coupled maps, which model the behavior of electrically connected cells, is discussed in the context of synchronization theory. -- Highlights: → Recent experimental-data-based models are complicated for analysis and simulation. → A simplified map-based model of the cardiac cell is constructed. → The model is capable of replicating different types of cardiac activity. → The spatio-temporal dynamics of ensembles of coupled maps are investigated. → The data obtained are analyzed in the context of biophysical processes in the myocardium.

  14. Map-based model of the cardiac action potential

    Energy Technology Data Exchange (ETDEWEB)

    Pavlov, Evgeny A., E-mail: genie.pavlov@gmail.com [Department of Computational Mathematics and Cybernetics, Nizhny Novgorod State University, 23, Gagarin Avenue, 603950 Nizhny Novgorod (Russian Federation); Osipov, Grigory V. [Department of Computational Mathematics and Cybernetics, Nizhny Novgorod State University, 23, Gagarin Avenue, 603950 Nizhny Novgorod (Russian Federation); Chan, C.K. [Institute of Physics, Academia Sinica, 128 Sec. 2, Academia Road, Nankang, Taipei 115, Taiwan (China); Suykens, Johan A.K. [K.U. Leuven, ESAT-SCD/SISTA, Kasteelpark Arenberg 10, B-3001 Leuven (Heverlee) (Belgium)

    2011-07-25

    A simple, computationally efficient model which is capable of replicating the basic features of the cardiac cell action potential is proposed. The model is a four-dimensional map and demonstrates good correspondence with real cardiac cells. Various regimes of cardiac activity, which can be reproduced by the proposed model, are shown. Bifurcation mechanisms of the transitions between these regimes are explained using phase space analysis. The dynamics of 1D and 2D lattices of coupled maps, which model the behavior of electrically connected cells, is discussed in the context of synchronization theory. -- Highlights: → Recent experimental-data-based models are complicated for analysis and simulation. → A simplified map-based model of the cardiac cell is constructed. → The model is capable of replicating different types of cardiac activity. → The spatio-temporal dynamics of ensembles of coupled maps are investigated. → The data obtained are analyzed in the context of biophysical processes in the myocardium.

  15. A physiologically based nonhomogeneous Poisson counter model of visual identification

    DEFF Research Database (Denmark)

    Christensen, Jeppe H; Markussen, Bo; Bundesen, Claus

    2018-01-01

    A physiologically based nonhomogeneous Poisson counter model of visual identification is presented. The model was developed in the framework of a Theory of Visual Attention (Bundesen, 1990; Kyllingsbæk, Markussen, & Bundesen, 2012) and meant for modeling visual identification of objects that are ......A physiologically based nonhomogeneous Poisson counter model of visual identification is presented. The model was developed in the framework of a Theory of Visual Attention (Bundesen, 1990; Kyllingsbæk, Markussen, & Bundesen, 2012) and meant for modeling visual identification of objects...... that mimicked the dynamics of receptive field selectivity as found in neurophysiological studies. Furthermore, the initial sensory response yielded theoretical hazard rate functions that closely resembled empirically estimated ones. Finally, supplied with a Naka-Rushton type contrast gain control, the model...

  16. A stream-based mathematical model for distributed information processing systems - SysLab system model

    OpenAIRE

    Klein, Cornel; Rumpe, Bernhard; Broy, Manfred

    2014-01-01

    In the SysLab project we develop a software engineering method based on a mathematical foundation. The SysLab system model serves as an abstract mathematical model for information systems and their components. It is used to formalize the semantics of all description techniques used, such as object diagrams, state automata, sequence charts, or data-flow diagrams. Based on the requirements for such a reference model, we define the system model, including its different views and their relationships.

  17. Random sampling or geostatistical modelling? Choosing between design-based and model-based sampling strategies for soil (with discussion)

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    1997-01-01

    Classical sampling theory has been repeatedly identified with classical statistics which assumes that data are identically and independently distributed. This explains the switch of many soil scientists from design-based sampling strategies, based on classical sampling theory, to the model-based

  18. Model predictive control based on reduced order models applied to belt conveyor system.

    Science.gov (United States)

    Chen, Wei; Li, Xin

    2016-11-01

    In this paper, a model predictive controller based on a reduced-order model is proposed to control a belt conveyor system, which is a complex electro-mechanical system with a long visco-elastic body. Firstly, in order to design a low-order controller, the balanced truncation method is used for belt conveyor model reduction. Secondly, an MPC algorithm based on the reduced-order model of the belt conveyor system is presented. Because of the error bound between the full-order model and the reduced-order model, two Kalman state estimators are applied in the control scheme to achieve better system performance. Finally, simulation experiments show that the balanced truncation method can significantly reduce the model order with high accuracy, and that model predictive control based on the reduced model performs well in controlling the belt conveyor system. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
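
    A minimal sketch of the balanced truncation step on a generic stable state-space model (not the paper's belt conveyor model), using SciPy's Lyapunov solver; the system matrices and the reduced order are illustrative.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

def balanced_truncation(A, B, C, r):
    """Reduce a stable LTI system (A, B, C) to order r by balanced truncation."""
    # Gramians: A Wc + Wc A' = -B B'  and  A' Wo + Wo A = -C' C
    Wc = solve_continuous_lyapunov(A, -B @ B.T)
    Wo = solve_continuous_lyapunov(A.T, -C.T @ C)
    Lc = cholesky(Wc, lower=True)
    Lo = cholesky(Wo, lower=True)
    U, s, Vt = svd(Lo.T @ Lc)                 # s holds the Hankel singular values
    S_inv_sqrt = np.diag(s ** -0.5)
    T = Lc @ Vt.T @ S_inv_sqrt                # balancing transformation
    Tinv = S_inv_sqrt @ U.T @ Lo.T
    Ab, Bb, Cb = Tinv @ A @ T, Tinv @ B, C @ T
    return Ab[:r, :r], Bb[:r, :], Cb[:, :r], s

# Illustrative 4-state stable system reduced to 2 states.
A = np.diag([-1.0, -2.0, -5.0, -10.0])
B = np.array([[1.0], [0.5], [0.2], [0.1]])
C = np.array([[1.0, 1.0, 0.5, 0.2]])
Ar, Br, Cr, hsv = balanced_truncation(A, B, C, r=2)
print("Hankel singular values:", np.round(hsv, 4))
```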

  19. Alternative ways of using field-based estimates to calibrate ecosystem models and their implications for carbon cycle studies

    Science.gov (United States)

    He, Yujie; Zhuang, Qianlai; McGuire, David; Liu, Yaling; Chen, Min

    2013-01-01

    Model-data fusion is a process in which field observations are used to constrain model parameters. How observations are used to constrain parameters has a direct impact on the carbon cycle dynamics simulated by ecosystem models. In this study, we present an evaluation of several options for the use of observations in modeling regional carbon dynamics and explore the implications of those options. We calibrated the Terrestrial Ecosystem Model on a hierarchy of three vegetation classification levels for the Alaskan boreal forest: species level, plant-functional-type level (PFT level), and biome level, and we examined the differences in simulated carbon dynamics. Species-specific field-based estimates were directly used to parameterize the model for species-level simulations, while weighted averages based on species percent cover were used to generate estimates for PFT- and biome-level model parameterization. We found that calibrated key ecosystem process parameters differed substantially among species and overlapped for species that are categorized into different PFTs. Our analysis of parameter sets suggests that the PFT-level parameterizations primarily reflected the dominant species and that functional information of some species was lost from the PFT-level parameterizations. The biome-level parameterization was primarily representative of the needleleaf PFT and lost information on broadleaf species or PFT function. Our results indicate that PFT-level simulations may be potentially representative of the performance of species-level simulations, while biome-level simulations may result in biased estimates. Improved theoretical and empirical justifications for grouping species into PFTs or biomes are needed to adequately represent the dynamics of ecosystem functioning and structure.

  20. Variance-based sensitivity indices for models with dependent inputs

    International Nuclear Information System (INIS)

    Mara, Thierry A.; Tarantola, Stefano

    2012-01-01

    Computational models are intensively used in engineering for risk analysis or prediction of future outcomes. Uncertainty and sensitivity analyses are of great help for these purposes. Although several methods exist to perform variance-based sensitivity analysis of model output with independent inputs, only a few have been proposed in the literature for the case of dependent inputs. This is explained by the fact that the theoretical framework for the independent case is set and a univocal set of variance-based sensitivity indices is defined. In the present work, we propose a set of variance-based sensitivity indices to perform sensitivity analysis of models with dependent inputs. These measures allow us to distinguish between the mutual dependent contribution and the independent contribution of an input to the model response variance. Their definition relies on a specific orthogonalisation of the inputs and ANOVA representations of the model output. In the applications, we show the interest of the new sensitivity indices for model simplification. - Highlights: ► Uncertainty and sensitivity analyses are of great help in engineering. ► Several methods exist to perform variance-based sensitivity analysis of model output with independent inputs. ► We define a set of variance-based sensitivity indices for models with dependent inputs. ► Inputs' mutual contributions are distinguished from their independent contributions. ► Analytical and computational tests are performed and discussed.
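
    For context, the sketch below estimates standard first-order variance-based (Sobol') indices for independent inputs with the common pick-and-freeze Monte Carlo estimator; it is not the paper's estimator for dependent inputs, and the test function is purely illustrative.

```python
import numpy as np

def model(x):
    # Illustrative additive test function with unequal input importance.
    return 4.0 * x[:, 0] + 2.0 * x[:, 1] + 0.5 * x[:, 2]

def first_order_indices(f, dim, n=100_000, seed=0):
    """Pick-and-freeze Monte Carlo estimate of first-order Sobol' indices."""
    rng = np.random.default_rng(seed)
    A = rng.uniform(size=(n, dim))
    B = rng.uniform(size=(n, dim))
    fA, fB = f(A), f(B)
    total_var = np.var(np.concatenate([fA, fB]))
    indices = []
    for i in range(dim):
        ABi = A.copy()
        ABi[:, i] = B[:, i]               # resample only the i-th input
        indices.append(np.mean(fB * (f(ABi) - fA)) / total_var)
    return np.array(indices)

print(np.round(first_order_indices(model, dim=3), 3))
```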

  1. Image based 3D city modeling : Comparative study

    Directory of Open Access Journals (Sweden)

    S. P. Singh

    2014-06-01

    A 3D city model is a digital representation of the Earth's surface and related objects such as buildings, trees, vegetation, and man-made features belonging to an urban area. The demand for 3D city modeling is increasing rapidly for various engineering and non-engineering applications. Generally, four main image-based approaches are used for virtual 3D city model generation: sketch-based modeling, procedural-grammar-based modeling, close-range-photogrammetry-based modeling, and computer-vision-based modeling. SketchUp, CityEngine, Photomodeler and Agisoft Photoscan are the main software packages representing these approaches, respectively. These packages take different approaches and methods to image-based 3D city modeling. The literature shows that, to date, no comprehensive comparative study is available on creating complete 3D city models from images. This paper gives a comparative assessment of these four image-based 3D modeling approaches. The comparison is mainly based on data acquisition methods, data processing techniques, and output 3D model products. For this research work, the study area is the campus of the civil engineering department, Indian Institute of Technology, Roorkee (India). This 3D campus acts as a prototype for a city. The study also explains various governing parameters, factors, and work experiences, gives a brief introduction to the strengths and weaknesses of the four image-based techniques, and comments on what can and cannot be done with each package. Finally, the study concludes that each package has some advantages and limitations; the choice of software depends on the user requirements of the 3D project. For normal visualization projects, SketchUp is a good option. For 3D documentation records, Photomodeler gives good

  2. Research on Multi - Person Parallel Modeling Method Based on Integrated Model Persistent Storage

    Science.gov (United States)

    Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Liu, Ying

    2018-03-01

    This paper studies a multi-person parallel modeling method based on integrated model persistent storage. The integrated model refers to a set of MDDT modeling graphics systems, which can carry out multi-angle, multi-level and multi-stage description of aerospace general embedded software. Persistent storage refers to converting the data model in memory into a storage model and converting the storage model back into a data model in memory, where the data model is the object model and the storage model is a binary stream. Multi-person parallel modeling refers to the need for multi-person collaboration, separation of roles, and even real-time remote synchronized modeling.

  3. Search-based model identification of smart-structure damage

    Science.gov (United States)

    Glass, B. J.; Macalou, A.

    1991-01-01

    This paper describes the use of a combined model and parameter identification approach, based on modal analysis and artificial intelligence (AI) techniques, for identifying damage or flaws in a rotating truss structure incorporating embedded piezoceramic sensors. This smart structure example is representative of a class of structures commonly found in aerospace systems and next-generation space structures. Artificial intelligence techniques of classification, heuristic search, and an object-oriented knowledge base are used in an AI-based model identification approach. A finite model space is classified into a search tree, over which a variant of best-first search is used to identify the model whose stored response most closely matches that of the input. Newly encountered models can be incorporated into the model space. This adaptiveness demonstrates the potential for learning control. Following this output-error model identification, numerical parameter identification is used to further refine the identified model. Given the rotating truss example in this paper, noisy data corresponding to various damage configurations are input to both this approach and a conventional parameter identification method. The combination of AI-based model identification with parameter identification is shown to lead to smaller parameter corrections than required by the use of parameter identification alone.
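
    A sketch of the best-first search idea: candidate models in a finite model space are scored by output error against the measured response and expanded in order of increasing error; the model space, stored responses and measurement below are hypothetical.

```python
import heapq
import numpy as np

# Hypothetical model space: each candidate damage model has a stored response.
MODEL_SPACE = {
    "undamaged":        np.array([1.00, 2.00, 3.00]),
    "loose_joint_A":    np.array([0.95, 1.90, 3.05]),
    "cracked_member_B": np.array([0.80, 1.70, 3.20]),
}
# Tree structure over the model space (parent -> refinements to explore next).
CHILDREN = {"undamaged": ["loose_joint_A", "cracked_member_B"]}

def output_error(name, measured):
    return float(np.linalg.norm(MODEL_SPACE[name] - measured))

def best_first_identify(measured, root="undamaged"):
    """Best-first search for the model whose stored response best matches the input."""
    frontier = [(output_error(root, measured), root)]
    best = frontier[0]
    while frontier:
        err, name = heapq.heappop(frontier)   # lowest output error expanded first
        best = min(best, (err, name))
        for child in CHILDREN.get(name, []):
            heapq.heappush(frontier, (output_error(child, measured), child))
    return best

print(best_first_identify(np.array([0.82, 1.72, 3.18])))
```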

  4. Vibration-based health monitoring and model refinement of civil engineering structures

    Energy Technology Data Exchange (ETDEWEB)

    Farrar, C.R.; Doebling, S.W.

    1997-10-01

    Damage or fault detection, as determined by changes in the dynamic properties of structures, is a subject that has received considerable attention in the technical literature beginning approximately 30 years ago. The basic idea is that changes in the structure's properties, primarily stiffness, will alter the dynamic properties of the structure, such as resonant frequencies and mode shapes, and properties derived from these quantities such as modal-based flexibility. Recently, this technology has been investigated for applications to health monitoring of large civil engineering structures. This presentation will discuss such a study undertaken by engineers from New Mexico State University, Sandia National Laboratory and Los Alamos National Laboratory. Experimental modal analyses were performed on an undamaged interstate highway bridge and immediately after each of four successively more severe damage cases was inflicted on the main girder of the structure. Results of these tests provide insight into the ability of modal-based damage identification methods to identify damage and into the current limitations of this technology. Closely related topics that will be discussed are the use of modal properties to validate computer models of the structure, the use of these computer models in the damage detection process, and the general lack of experimental investigation of large civil engineering structures.

  5. Model-based and memory-based collaborative filtering algorithms for complex knowledge models

    NARCIS (Netherlands)

    Lozano, E.; Gracia, J.; Collarana, D.; Corcho, O.; Gómez-Pérez, A.; Villazón, B.; Latour, S.; Liem, J.

    2011-01-01

    In DynaLearn, learners, teachers and domain experts create Qualitative Reasoning (QR) conceptual models that may be stored in a common repository. These models represent a valuable source of knowledge that could be used to assist new users in the creation of models on related topics. However, finding

  6. Research on Turbofan Engine Model above Idle State Based on NARX Modeling Approach

    Science.gov (United States)

    Yu, Bing; Shu, Wenjun

    2017-03-01

    A nonlinear model for a turbofan engine above idle state, based on NARX, is studied. First of all, data sets for the JT9D engine are obtained via simulation from an existing model. Then, a nonlinear modeling scheme based on NARX is proposed and several models with different parameters are built from these data sets. Finally, simulations are carried out to verify the accuracy and dynamic performance of the models; the results show that the NARX model reflects the dynamic characteristics of the turbofan engine with high accuracy.
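
    A minimal sketch of the NARX idea, assuming a linear-in-parameters regressor (polynomial or neural variants are equally common): the next output is regressed on lagged outputs and lagged inputs. The data are synthetic, standing in for the simulated JT9D data sets.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic single-input single-output data standing in for engine simulations.
u = rng.uniform(-1.0, 1.0, size=500)                # input (e.g. fuel flow command)
y = np.zeros(500)                                    # output (e.g. spool speed)
for k in range(2, 500):
    y[k] = 0.6 * y[k - 1] - 0.1 * y[k - 2] + 0.8 * u[k - 1] + 0.01 * rng.standard_normal()

def build_regressors(u, y, ny=2, nu=2):
    """Stack lagged outputs and inputs: y[k] ~ f(y[k-1..k-ny], u[k-1..k-nu])."""
    start = max(ny, nu)
    rows = [np.concatenate([y[k - ny:k][::-1], u[k - nu:k][::-1]])
            for k in range(start, len(y))]
    return np.array(rows), y[start:]

X, target = build_regressors(u, y)
theta, *_ = np.linalg.lstsq(X, target, rcond=None)   # least-squares NARX parameters
pred = X @ theta
print("one-step-ahead RMS error:", float(np.sqrt(np.mean((pred - target) ** 2))))
```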

  7. FADD Expression as a Prognosticator in Early-Stage Glottic Squamous Cell Carcinoma of the Larynx Treated Primarily With Radiotherapy

    International Nuclear Information System (INIS)

    Schrijvers, Michiel L.; Pattje, Wouter J.; Slagter-Menkema, Lorian; Mastik, Mirjam F.; Gibcus, Johan H.; Langendijk, Johannes A.; Wal, Jacqueline E. van der; Laan, Bernard F.A.M. van der; Schuuring, E.

    2012-01-01

    Purpose: We recently reported on the identification of the Fas-associated death domain (FADD) as a possible driver of the chromosome 11q13 amplicon and the association between increased FADD expression and disease-specific survival in advanced-stage laryngeal carcinoma. The aim of this study was to examine whether expression of FADD and its Ser194-phosphorylated isoform (pFADD) predicts local control in patients with early-stage glottic carcinoma primarily treated with radiotherapy only. Methods and Materials: Immunohistochemical staining for FADD and pFADD was performed on pretreatment biopsy specimens of 92 patients with T1–T2 glottic squamous cell carcinoma primarily treated with radiotherapy between 1996 and 2005. Cox regression analysis was used to correlate expression levels with local control. Results: High levels of pFADD were associated with significantly better local control (hazard ratio, 2.40; 95% confidence interval, 1.04–5.55; p = 0.040). FADD overexpression showed a trend toward better local control (hazard ratio, 3.656; 95% confidence interval, 0.853–15.663; p = 0.081). Multivariate Cox regression analysis showed that high pFADD expression was the best predictor of local control after radiotherapy. Conclusions: This study showed that expression of phosphorylated FADD is a new prognostic biomarker for better local control after radiotherapy in patients with early-stage glottic carcinomas.

  8. A Model-based Avionic Prognostic Reasoner (MAPR)

    Data.gov (United States)

    National Aeronautics and Space Administration — The Model-based Avionic Prognostic Reasoner (MAPR) presented in this paper is an innovative solution for non-intrusively monitoring the state of health (SoH) and...

  9. Physics-Based Pneumatic Hammer Instability Model, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this project is to develop a physics-based pneumatic hammer instability model that accurately predicts the stability of hydrostatic bearings...

  10. Towards Modeling False Memory With Computational Knowledge Bases.

    Science.gov (United States)

    Li, Justin; Kohanyi, Emma

    2017-01-01

    One challenge to creating realistic cognitive models of memory is the inability to account for the vast common-sense knowledge of human participants. Large computational knowledge bases such as WordNet and DBpedia may offer a solution to this problem but may pose other challenges. This paper explores some of these difficulties through a semantic network spreading activation model of the Deese-Roediger-McDermott false memory task. In three experiments, we show that these knowledge bases only capture a subset of human associations, while irrelevant information introduces noise and makes efficient modeling difficult. We conclude that the contents of these knowledge bases must be augmented and, more important, that the algorithms must be refined and optimized, before large knowledge bases can be widely used for cognitive modeling. Copyright © 2016 Cognitive Science Society, Inc.
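
    A sketch of the spreading-activation mechanism used in such models: activation from studied words spreads along weighted associative links, so a related but unstudied lure can accumulate activation and be falsely recalled. The tiny network below is hypothetical, not WordNet or DBpedia.

```python
# Spreading activation over a toy semantic network (DRM-style false memory).
ASSOCIATIONS = {
    # word -> list of (associate, link strength); hypothetical values
    "bed":    [("sleep", 0.8), ("rest", 0.4)],
    "dream":  [("sleep", 0.7), ("night", 0.5)],
    "pillow": [("sleep", 0.6), ("bed", 0.3)],
}

def spread(studied, decay=0.5, steps=2):
    """Spread activation from studied words for a fixed number of steps."""
    activation = {word: 1.0 for word in studied}
    for _ in range(steps):
        incoming = {}
        for word, act in activation.items():
            for neighbour, strength in ASSOCIATIONS.get(word, []):
                incoming[neighbour] = incoming.get(neighbour, 0.0) + decay * strength * act
        for word, extra in incoming.items():
            activation[word] = activation.get(word, 0.0) + extra
    return activation

result = spread(["bed", "dream", "pillow"])
# The unstudied critical lure "sleep" ends up highly active -> false recall.
print(sorted(result.items(), key=lambda kv: -kv[1]))
```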

  11. A Model-Based Prognostics Approach Applied to Pneumatic Valves

    Data.gov (United States)

    National Aeronautics and Space Administration — Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain...

  12. A Model-based Prognostics Approach Applied to Pneumatic Valves

    Data.gov (United States)

    National Aeronautics and Space Administration — Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain...

  13. An interactive web-based extranet system model for managing ...

    African Journals Online (AJOL)

    ... objectives for students, lecturers and parents to access and compute results ... The database will serve as a repository of students' academic records over a ... Keywords: Extranet-Model, Interactive, Web-Based, Students, Academic, Records ...

  14. Model-based Prognostics with Fixed-lag Particle Filters

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based prognostics exploits domain knowl- edge of the system, its components, and how they fail by casting the underlying physical phenom- ena in a...

  15. Game Based Learning (GBL) Adoption Model for Universities ...

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... faced while adopting the Game Based Learning (GBL) model, its benefits and ... preferred traditional lecture styles, 7% online class and 34% preferred .... students in developing problem-solving skills which in turn may help ...

  16. Transcription-based model for the induction of chromosomal exchange events by ionising radiation

    International Nuclear Information System (INIS)

    Radford, I.A.

    2003-01-01

    The mechanistic basis for chromosomal aberration formation, following exposure of mammalian cells to ionising radiation, has long been debated. Although chromosomal aberrations are probably initiated by DNA double-strand breaks (DSB), little is understood about the mechanisms that generate and modulate DNA rearrangement. Based on results from our laboratory and data from the literature, a novel model of chromosomal aberration formation has been suggested (Radford 2002). The basic postulates of this model are that: (1) DSB, primarily those involving multiple individual damage sites (i.e. complex DSB), are the critical initiating lesion; (2) only those DSB occurring in transcription units that are associated with transcription 'factories' (complexes containing multiple transcription units) induce chromosomal exchange events; (3) such DSB are brought into contact with a DNA topoisomerase I molecule through RNA polymerase II catalysed transcription and give rise to trapped DNA-topo I cleavage complexes; and (4) trapped complexes interact with another topo I molecule on a temporarily inactive transcription unit at the same transcription factory leading to DNA cleavage and subsequent strand exchange between the cleavage complexes. We have developed a method using inverse PCR that allows the detection and sequencing of putative ionising radiation-induced DNA rearrangements involving different regions of the human genome (Forrester and Radford 1998). The sequences detected by inverse PCR can provide a test of the prediction of the transcription-based model that ionising radiation-induced DNA rearrangements occur between sequences in active transcription units. Accordingly, reverse transcriptase PCR was used to determine if sequences involved in rearrangements were transcribed in the test cells. Consistent with the transcription-based model, nearly all of the sequences examined gave a positive result to reverse transcriptase PCR (Forrester and Radford unpublished)

  17. A mathematical model for camera calibration based on straight lines

    Directory of Open Access Journals (Sweden)

    Antonio M. G. Tommaselli

    2005-12-01

    In order to facilitate the automation of the camera calibration process, a mathematical model using straight lines was developed, based on the equivalent-planes mathematical model. Parameter estimation for the developed model is achieved by the Least Squares Method with Conditions and Observations. The same adjustment method was used to implement camera calibration with bundles, which is based on points. Experiments using simulated and real data have shown that the developed model based on straight lines gives results comparable to the conventional method with points. Details concerning the mathematical development of the model and experiments with simulated and real data are presented, and the results of both methods of camera calibration, with straight lines and with points, are compared.

  18. Anisotropy in wavelet-based phase field models

    KAUST Repository

    Korzec, Maciek; Mü nch, Andreas; Sü li, Endre; Wagner, Barbara

    2016-01-01

    When describing the anisotropic evolution of microstructures in solids using phase-field models, the anisotropy of the crystalline phases is usually introduced into the interfacial energy by directional dependencies of the gradient energy coefficients. We consider an alternative approach based on a wavelet analogue of the Laplace operator that is intrinsically anisotropic and linear. The paper focuses on the classical coupled temperature/Ginzburg--Landau type phase-field model for dendritic growth. For the model based on the wavelet analogue, existence, uniqueness and continuous dependence on initial data are proved for weak solutions. Numerical studies of the wavelet based phase-field model show dendritic growth similar to the results obtained for classical phase-field models.

  19. Anisotropy in wavelet-based phase field models

    KAUST Repository

    Korzec, Maciek

    2016-04-01

    When describing the anisotropic evolution of microstructures in solids using phase-field models, the anisotropy of the crystalline phases is usually introduced into the interfacial energy by directional dependencies of the gradient energy coefficients. We consider an alternative approach based on a wavelet analogue of the Laplace operator that is intrinsically anisotropic and linear. The paper focuses on the classical coupled temperature/Ginzburg--Landau type phase-field model for dendritic growth. For the model based on the wavelet analogue, existence, uniqueness and continuous dependence on initial data are proved for weak solutions. Numerical studies of the wavelet based phase-field model show dendritic growth similar to the results obtained for classical phase-field models.

  20. Similar words analysis based on POS-CBOW language model

    Directory of Open Access Journals (Sweden)

    Dongru RUAN

    2015-10-01

    Similar-words analysis is one of the important tasks in the field of natural language processing, with significant research and application value in text classification, machine translation and information recommendation. Focusing on the features of Sina Weibo's short texts, this paper presents a language model named POS-CBOW, a continuous bag-of-words language model with a filtering layer and a part-of-speech tagging layer. The proposed approach can adjust word-vector similarity according to cosine similarity and the word vectors' part-of-speech metrics. It can also filter the set of similar words on the basis of the statistical analysis model. The experimental results show that the similar-words analysis algorithm based on the proposed POS-CBOW language model outperforms that based on the traditional CBOW language model.

  1. Constitutive modeling of a nickel base superalloy -with a focus on gas turbine applications

    Energy Technology Data Exchange (ETDEWEB)

    Almroth, Per

    2003-05-01

    Gas turbines are used where large amounts of energy are needed, typically as engines in aircraft, ferries and power plants. From an efficiency point of view it is desirable to increase the service temperature as much as possible. One of the limiting factors is then the maximum allowable metal temperature in the turbine stages, primarily in the blades of the first stage, which are exposed to the highest gas temperatures. Specially designed materials, such as the nickel base superalloy IN792, are used to cope with these severe conditions. In order to be able to design the components for higher temperatures and tighter tolerances, a detailed understanding and computational models of the material behaviour are needed. The models presented in this work have been developed with the objective of being physically well motivated and with the intention of avoiding excessive numbers of parameters. The influence of the parameters should also be as easy as possible to interpret. The models describe the behaviour of IN792 under conditions typically found for a gas turbine blade. Specifically, the high- and intermediate-temperature isothermal modelling of IN792 has been addressed. One main issue when characterising the material and calibrating the models is the use of relevant tests that are representative of component conditions. Therefore, isothermal tests with an eye on the typical environment of a turbine blade have been planned and performed. Using numerical optimization techniques, the material parameters for the isothermal behaviour of IN792 at 650 deg and 850 deg have been estimated. The good overall calibration results for these specific temperatures, using the presented modeling concept and non-standard constitutive tests, suggest that the model can describe the behaviour of IN792 in gas turbine hot-part applications.

  2. Petri Net-Based Workflow Access Control Model

    Institute of Scientific and Technical Information of China (English)

    陈卓; 骆婷; 石磊; 洪帆

    2004-01-01

    Access control is an important protection mechanism for information systems. This paper shows how to implement access control in a workflow system. We give a workflow access control model (WACM) based on several current access control models. The model supports role assignment and dynamic authorization. The paper defines the workflow using Petri nets. It first gives the definition and description of the workflow, and then analyzes the architecture of the workflow access control model (WACM). Finally, an example of an e-commerce workflow access control model is discussed in detail.

  3. Hysteresis modeling based on saturation operator without constraints

    International Nuclear Information System (INIS)

    Park, Y.W.; Seok, Y.T.; Park, H.J.; Chung, J.Y.

    2007-01-01

    This paper proposes a simple way to model complex hysteresis in a magnetostrictive actuator by employing saturation operators without constraints. Having no constraints causes a singularity problem, i.e. the inverse matrix cannot be obtained when calculating the weights. To overcome this, a pseudoinverse concept is introduced. Simulation results are compared with experimental data from a Terfenol-D actuator. The proposed model is clearly much closer to the experimental data than the modified PI model: the relative error is 12% with the modified PI model and less than 1% with the proposed model.
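
    A minimal sketch of the pseudoinverse step, assuming a linear-in-weights operator model: when the operator outputs are linearly dependent, the normal equations are singular, but the Moore-Penrose pseudoinverse still yields a least-squares weight vector. The matrices are illustrative, not Terfenol-D data.

```python
import numpy as np

# Columns are outputs of individual saturation operators evaluated on the
# input history; without constraints the columns may be linearly dependent
# (here the third column is the sum of the first two), so Phi.T @ Phi is singular.
Phi = np.array([
    [1.0, 2.0, 3.0],
    [2.0, 4.0, 6.0],
    [1.0, 1.0, 2.0],
    [0.5, 1.0, 1.5],
])
y = np.array([1.2, 2.4, 0.9, 0.6])   # measured actuator response

# The ordinary normal-equation inverse does not exist; the pseudoinverse
# returns the minimum-norm least-squares weight vector instead.
weights = np.linalg.pinv(Phi) @ y
print(weights, "residual:", np.linalg.norm(Phi @ weights - y))
```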

  4. A Coupled Simulation Architecture for Agent-Based/Geohydrological Modelling

    Science.gov (United States)

    Jaxa-Rozen, M.

    2016-12-01

    The quantitative modelling of social-ecological systems can provide useful insights into the interplay between social and environmental processes, and their impact on emergent system dynamics. However, such models should acknowledge the complexity and uncertainty of both of the underlying subsystems. For instance, the agent-based models which are increasingly popular for groundwater management studies can be made more useful by directly accounting for the hydrological processes which drive environmental outcomes. Conversely, conventional environmental models can benefit from an agent-based depiction of the feedbacks and heuristics which influence the decisions of groundwater users. From this perspective, this work describes a Python-based software architecture which couples the popular NetLogo agent-based platform with the MODFLOW/SEAWAT geohydrological modelling environment. This approach enables users to implement agent-based models in NetLogo's user-friendly platform, while benefiting from the full capabilities of MODFLOW/SEAWAT packages or reusing existing geohydrological models. The software architecture is based on the pyNetLogo connector, which provides an interface between the NetLogo agent-based modelling software and the Python programming language. This functionality is then extended and combined with Python's object-oriented features, to design a simulation architecture which couples NetLogo with MODFLOW/SEAWAT through the FloPy library (Bakker et al., 2016). The Python programming language also provides access to a range of external packages which can be used for testing and analysing the coupled models, which is illustrated for an application of Aquifer Thermal Energy Storage (ATES).
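
    A schematic sketch of the coupling loop described above, assuming the pyNetLogo and FloPy APIs roughly as documented; the NetLogo model file, reporters and globals are hypothetical placeholders, the MODFLOW model is deliberately minimal, and running it requires NetLogo and a MODFLOW executable.

```python
"""Schematic coupling of a NetLogo ABM with a MODFLOW model via pyNetLogo and FloPy.
'abm.nlogo', the 'pumping-rate' reporter and the 'observed-head' global are placeholders."""
import flopy
import pyNetLogo

# --- agent-based side -------------------------------------------------------
netlogo = pyNetLogo.NetLogoLink(gui=False)
netlogo.load_model("abm.nlogo")          # hypothetical groundwater-user ABM
netlogo.command("setup")

for year in range(10):
    # 1. agents decide how much to pump this step
    netlogo.command("go")
    pumping = float(netlogo.report("sum [pumping-rate] of turtles"))

    # 2. rebuild a minimal MODFLOW model with this step's pumping and run it
    mf = flopy.modflow.Modflow(modelname="coupled", exe_name="mf2005")
    flopy.modflow.ModflowDis(mf, nlay=1, nrow=10, ncol=10, top=10.0, botm=0.0)
    flopy.modflow.ModflowBas(mf, strt=5.0)
    flopy.modflow.ModflowLpf(mf, hk=1.0)
    flopy.modflow.ModflowWel(mf, stress_period_data={0: [[0, 5, 5, -pumping]]})
    flopy.modflow.ModflowPcg(mf)
    flopy.modflow.ModflowOc(mf)
    mf.write_input()
    mf.run_model(silent=True)
    head = flopy.utils.HeadFile("coupled.hds").get_data()[0, 5, 5]

    # 3. feed the environmental state back to the agents
    netlogo.command(f"set observed-head {head}")

netlogo.kill_workspace()
```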

  5. DEVELOPMENT MODEL OF PATISSERIE PROJECT-BASED LEARNING

    OpenAIRE

    Ana Ana; Lutfhiyah Nurlaela

    2013-01-01

    The study aims to find a model of patisserie project-based learning with production approach that can improve effectiveness of patisserie learning. Delphi Technique, Cohen's Kappa and percentages of agreements were used to assess model of patisserie project based learning. Data collection techniques employed in the study were questionnaire, check list worksheet, observation, and interview sheets. Subjects were 13 lectures of expertise food and nutrition and 91 students of Food and Nutrition ...

  6. Short review of runoff and erosion physically based models

    Directory of Open Access Journals (Sweden)

    Gabrić Ognjen

    2015-01-01

    Runoff and erosion processes are among the main research subjects in hydrological science. Based on field and laboratory measurements, and in step with the development of computational techniques, runoff and erosion models based on equations that describe the physics of the process have also been developed. Several models of runoff and erosion that describe the entire process of sediment generation and transport on the catchment are described and compared.

  7. Trojan detection model based on network behavior analysis

    International Nuclear Information System (INIS)

    Liu Junrong; Liu Baoxu; Wang Wenjin

    2012-01-01

    Based on an analysis of existing Trojan detection technology, this paper presents a Trojan detection model based on network behavior analysis. First, an abstract description of Trojan network behavior is given; a characteristic behavior library is then established according to certain rules, and a support vector machine algorithm is used to determine whether a Trojan intrusion has occurred. Finally, intrusion detection experiments show that this model can effectively detect Trojans. (authors)
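
    A sketch of the classification step using a support vector machine from scikit-learn on synthetic network-behaviour features (connection rate, outbound traffic, beaconing regularity); the features and data are invented for illustration and are not the authors' behavior library.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Synthetic behaviour features: [connections/min, outbound kB/s, beacon regularity]
benign = rng.normal(loc=[5.0, 20.0, 0.1], scale=[2.0, 10.0, 0.05], size=(200, 3))
trojan = rng.normal(loc=[30.0, 5.0, 0.9], scale=[8.0, 3.0, 0.05], size=(200, 3))
X = np.vstack([benign, trojan])
y = np.array([0] * 200 + [1] * 200)          # 1 = Trojan-like behaviour

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```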

  8. On Model Based Synthesis of Embedded Control Software

    OpenAIRE

    Alimguzhin, Vadim; Mari, Federico; Melatti, Igor; Salvo, Ivano; Tronci, Enrico

    2012-01-01

    Many Embedded Systems are indeed Software Based Control Systems (SBCSs), that is control systems whose controller consists of control software running on a microcontroller device. This motivates investigation on Formal Model Based Design approaches for control software. Given the formal model of a plant as a Discrete Time Linear Hybrid System and the implementation specifications (that is, number of bits in the Analog-to-Digital (AD) conversion) correct-by-construction control software can be...

  9. A model-based approach to estimating forest area

    Science.gov (United States)

    Ronald E. McRoberts

    2006-01-01

    A logistic regression model based on forest inventory plot data and transformations of Landsat Thematic Mapper satellite imagery was used to predict the probability of forest for 15 study areas in Indiana, USA, and 15 in Minnesota, USA. Within each study area, model-based estimates of forest area were obtained for circular areas with radii of 5 km, 10 km, and 15 km and...
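
    A sketch of the modeling idea under stated assumptions: a logistic regression is fitted to plot-level forest/non-forest labels with synthetic spectral covariates, and per-pixel forest probabilities are summed to a model-based area estimate; the data and the relationship are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Synthetic "Landsat band" covariates for inventory plots and forest labels.
n_plots = 300
bands = rng.normal(size=(n_plots, 3))                 # e.g. transformed TM bands
logit = 1.5 * bands[:, 0] - 2.0 * bands[:, 1] + 0.3   # hidden true relationship
is_forest = rng.uniform(size=n_plots) < 1 / (1 + np.exp(-logit))

model = LogisticRegression().fit(bands, is_forest)

# Per-pixel probability of forest for new imagery; summing probabilities over
# pixels gives a model-based forest-area estimate for a circular study area.
pixels = rng.normal(size=(10_000, 3))
p_forest = model.predict_proba(pixels)[:, 1]
pixel_area_ha = 0.09                                  # 30 m x 30 m Landsat pixel
print("estimated forest area (ha):", float(p_forest.sum() * pixel_area_ha))
```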

  10. Pixel-based meshfree modelling of skeletal muscles

    OpenAIRE

    Chen, Jiun-Shyan; Basava, Ramya Rao; Zhang, Yantao; Csapo, Robert; Malis, Vadim; Sinha, Usha; Hodgson, John; Sinha, Shantanu

    2015-01-01

    This paper introduces the meshfree Reproducing Kernel Particle Method (RKPM) for 3D image-based modeling of skeletal muscles. This approach allows for construction of simulation model based on pixel data obtained from medical images. The material properties and muscle fiber direction obtained from Diffusion Tensor Imaging (DTI) are input at each pixel point. The reproducing kernel (RK) approximation allows a representation of material heterogeneity with smooth transition. A ...

  11. Modeling the Behaviour of an Advanced Material Based Smart Landing Gear System for Aerospace Vehicles

    International Nuclear Information System (INIS)

    Varughese, Byji; Dayananda, G. N.; Rao, M. Subba

    2008-01-01

    The last two decades have seen a substantial rise in the use of advanced materials such as polymer composites for aerospace structural applications. In more recent years there has been a concerted effort to integrate materials which mimic biological functions (referred to as smart materials) with polymeric composites. Prominent among smart materials are shape memory alloys, which possess both actuating and sensory functions that can be realized simultaneously. The proper characterization and modeling of advanced and smart materials holds the key to the design and development of efficient smart devices/systems. This paper focuses on the material characterization, modeling and validation of the model in relation to the development of a Shape Memory Alloy (SMA) based smart landing gear (with high energy dissipation features) for a semi-rigid radio-controlled airship (RC-blimp). The Super Elastic (SE) SMA element is configured in such a way that it is forced into a tensile mode of high elastic deformation. The smart landing gear comprises a landing beam, an arch and a super-elastic Nickel-Titanium (Ni-Ti) SMA element. The landing gear is primarily made of polymer carbon composites, which possess high specific stiffness and high specific strength compared to conventional materials, and are therefore ideally suited for the design and development of an efficient skid landing gear system with good energy dissipation characteristics. The development of the smart landing gear in relation to a conventional metal landing gear design is also dealt with.

  12. A model-data based systems approach to process intensification

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    . Their developments, however, are largely due to experiment based trial and error approaches and while they do not require validation, they can be time consuming and resource intensive. Also, one may ask, can a truly new intensified unit operation be obtained in this way? An alternative two-stage approach is to apply...... a model-based synthesis method to systematically generate and evaluate alternatives in the first stage and an experiment-model based validation in the second stage. In this way, the search for alternatives is done very quickly, reliably and systematically over a wide range, while resources are preserved...... for focused validation of only the promising candidates in the second-stage. This approach, however, would be limited to intensification based on “known” unit operations, unless the PI process synthesis/design is considered at a lower level of aggregation, namely the phenomena level. That is, the model-based...

  13. Model-Based Reconstructive Elasticity Imaging Using Ultrasound

    Directory of Open Access Journals (Sweden)

    Salavat R. Aglyamov

    2007-01-01

    Elasticity imaging is a reconstructive imaging technique where tissue motion in response to mechanical excitation is measured using modern imaging systems, and the estimated displacements are then used to reconstruct the spatial distribution of Young's modulus. Here we present an ultrasound elasticity imaging method that utilizes a model-based technique for Young's modulus reconstruction. Based on the geometry of the imaged object, only one axial component of the strain tensor is used. The numerical implementation of the method is highly efficient because the reconstruction is based on an analytic solution of the forward elastic problem. The model-based approach is illustrated using two potential clinical applications: differentiation of liver hemangioma and staging of deep venous thrombosis. Overall, these studies demonstrate that model-based reconstructive elasticity imaging can be used in applications where the geometry of the object and the surrounding tissue is somewhat known and certain assumptions about the pathology can be made.

  14. Quality prediction modeling for sintered ores based on mechanism models of sintering and extreme learning machine based error compensation

    Science.gov (United States)

    Tiebin, Wu; Yunlian, Liu; Xinjun, Li; Yi, Yu; Bin, Zhang

    2018-06-01

    To address the difficulty of predicting the quality of sintered ores, a hybrid prediction model is established based on mechanism models of sintering and time-weighted error compensation using an extreme learning machine (ELM). First, mechanism models of the drum index, total iron, and alkalinity are constructed according to the chemical reaction mechanisms and conservation of matter in the sintering process. Because the process is simplified in the mechanism models, they cannot describe strong nonlinearity, so errors are inevitable. For this reason, a time-weighted ELM-based error compensation model is established. Simulation results verify that the hybrid model has high accuracy and can meet the requirements of industrial applications.
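
    A minimal sketch of the hybrid structure described above: a placeholder mechanism model supplies the base prediction, and an extreme learning machine fitted to the residuals with recency (time) weights supplies the error compensation. The mechanism function, weighting scheme and data are illustrative assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

def mechanism_model(x):
    """Placeholder for the sintering mechanism models (drum index, total
    iron, alkalinity); here just an illustrative nonlinear function."""
    return 0.8 * x[:, 0] + 0.1 * np.sqrt(np.abs(x[:, 1]))

def train_elm(X, r, weights, n_hidden=30, reg=1e-3):
    """Fit an extreme learning machine to the mechanism-model residuals
    using time-weighted (recency-weighted) least squares."""
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                      # random hidden layer
    D = np.diag(weights)
    beta = np.linalg.solve(H.T @ D @ H + reg * np.eye(n_hidden),
                           H.T @ D @ r)
    return W, b, beta

def hybrid_predict(X, elm):
    W, b, beta = elm
    return mechanism_model(X) + np.tanh(X @ W + b) @ beta

# Synthetic training data: true quality = mechanism part + unmodelled term.
X = rng.uniform(0, 1, size=(200, 2))
y = mechanism_model(X) + 0.2 * np.sin(6 * X[:, 1])
residuals = y - mechanism_model(X)
time_weights = 0.99 ** np.arange(len(X))[::-1]  # newer samples weigh more
elm = train_elm(X, residuals, time_weights)
print(np.mean((hybrid_predict(X, elm) - y) ** 2))  # small training error
```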

  15. Map-Based Channel Model for Urban Macrocell Propagation Scenarios

    Directory of Open Access Journals (Sweden)

    Jose F. Monserrat

    2015-01-01

    Full Text Available The evolution of LTE towards 5G has started and different research projects and institutions are in the process of verifying new technology components through simulations. Coordination between groups is strongly recommended and, in this sense, a common definition of test cases and simulation models is needed. The scope of this paper is to present a realistic channel model for urban macrocell scenarios. This model is map-based and takes into account the layout of buildings situated in the area under study. A detailed description of the model is given together with a comparison with other widely used channel models. The benchmark includes a measurement campaign in which the proposed model is shown to be much closer to the actual behavior of a cellular system. Particular attention is given to the outdoor component of the model, since it is here that the proposed approach differs most from previous models.

  16. Converting biomolecular modelling data based on an XML representation.

    Science.gov (United States)

    Sun, Yudong; McKeever, Steve

    2008-08-25

    Biomolecular modelling has provided computational simulation-based methods for investigating biological processes from quantum chemical to cellular levels. Modelling such microscopic processes requires an atomic-level description of the biological system and proceeds in fine timesteps. Consequently the simulations are extremely computationally demanding. To tackle this limitation, different biomolecular models have to be integrated in order to achieve high-performance simulations. The integration of diverse biomolecular models requires converting molecular data between the data representations of the different models. This data conversion is often non-trivial, requires extensive human input and is inevitably error-prone. In this paper we present an automated data conversion method for biomolecular simulations between molecular dynamics and quantum mechanics/molecular mechanics models. Our approach is developed around an XML data representation called BioSimML (Biomolecular Simulation Markup Language). BioSimML provides a domain-specific data representation for biomolecular modelling which can efficiently support data interoperability between different biomolecular simulation models and data formats.
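
    A minimal sketch of XML-mediated conversion in the spirit described above: an MD-style atom list is written to a neutral XML tree and re-emitted as a QM/MM-style geometry block. The element and attribute names are invented for illustration and are not the actual BioSimML schema.

```python
import xml.etree.ElementTree as ET

# Minimal, illustrative stand-in for an MD coordinate record; the real
# BioSimML schema is richer and is defined by the paper's authors.
md_atoms = [
    {"name": "OW", "element": "O", "x": 0.000, "y": 0.000, "z": 0.000},
    {"name": "HW1", "element": "H", "x": 0.096, "y": 0.000, "z": 0.000},
]

def md_to_xml(atoms):
    """Convert an MD atom list into a neutral XML representation."""
    root = ET.Element("biosim")
    mol = ET.SubElement(root, "molecule")
    for a in atoms:
        ET.SubElement(mol, "atom", name=a["name"], element=a["element"],
                      x=str(a["x"]), y=str(a["y"]), z=str(a["z"]))
    return ET.ElementTree(root)

def xml_to_qmmm(tree):
    """Emit a simple element/x/y/z card deck from the XML, the kind of
    geometry block a QM/MM code might expect."""
    lines = []
    for atom in tree.getroot().iter("atom"):
        lines.append(f'{atom.get("element"):2s} '
                     f'{float(atom.get("x")):10.4f} '
                     f'{float(atom.get("y")):10.4f} '
                     f'{float(atom.get("z")):10.4f}')
    return "\n".join(lines)

tree = md_to_xml(md_atoms)
print(xml_to_qmmm(tree))
```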

  17. Evaluation of Artificial Intelligence Based Models for Chemical Biodegradability Prediction

    Directory of Open Access Journals (Sweden)

    Aleksandar Sabljic

    2004-12-01

    Full Text Available This study presents a review of biodegradability modeling efforts, including a detailed assessment of two models developed using an artificial intelligence based methodology. Validation results for these models, using an independent, quality-reviewed database, demonstrate that the models perform well when compared against another commonly used biodegradability model on the same data. The ability of models induced by an artificial intelligence methodology to accommodate complex interactions in detailed systems, and the demonstrated reliability of the approach evaluated by this study, indicate that the methodology may have application in broadening the scope of biodegradability models. Given adequate data on the biodegradability of chemicals under environmental conditions, this may allow the development of future models that include, for example, the effects of surface interfaces on biodegradability.

  18. Propositions for a PDF model based on fluid particle acceleration

    International Nuclear Information System (INIS)

    Minier, J.P.; Pozorski, J.

    1997-05-01

    This paper describes theoretical propositions for modelling the acceleration of a fluid particle in a turbulent flow. Such a model is useful for the PDF approach to turbulent reactive flows as well as for the Lagrangian modelling of two-phase flows. The model developed here draws on ideas already put forward by Sawford, generalized to the case of non-homogeneous flows. The model is built so as to revert continuously to Pope's model, which uses a Langevin equation for particle velocities, when the Reynolds number becomes very high. The derivation is based on the technique of fast-variable elimination. This technique allows a careful analysis of the relations between different levels of modelling and allows certain problems to be addressed in a more rigorous way. In particular, its application shows that models presently used can in principle simulate bubbly flows including the pressure-gradient and added-mass forces. (author)
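
    For orientation, the high-Reynolds-number limit referred to above corresponds to a Langevin (Ornstein-Uhlenbeck) equation for the fluctuating particle velocity. The sketch below integrates such an equation with the Euler-Maruyama scheme for stationary homogeneous turbulence; parameter values are illustrative and the paper's generalization to non-homogeneous flows is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

# Euler-Maruyama integration of the classical Langevin model for the
# fluctuating velocity of a fluid particle in stationary homogeneous
# turbulence.  Parameter values are illustrative only.
C0 = 2.1               # Lagrangian structure-function constant
sigma_u2 = 0.6         # velocity variance, m^2/s^2
epsilon = 0.5          # mean dissipation rate, m^2/s^3
T_L = 2.0 * sigma_u2 / (C0 * epsilon)   # Lagrangian integral time scale
dt, n_steps, n_particles = 1e-3, 20000, 2000

u = rng.normal(0.0, np.sqrt(sigma_u2), n_particles)
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), n_particles)
    u += -u / T_L * dt + np.sqrt(C0 * epsilon) * dW

print(np.var(u), sigma_u2)   # the stationary variance is preserved
```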

  19. CEAI: CCM-based email authorship identification model

    Directory of Open Access Journals (Sweden)

    Sarwat Nizamani

    2013-11-01

    Full Text Available In this paper we present a model for email authorship identification (EAI) employing a Cluster-based Classification (CCM) technique. Traditionally, stylometric features have been successfully employed in various authorship analysis tasks; we extend the traditional feature set to include some more interesting and effective features for email authorship identification (e.g., the last punctuation mark used in an email, the tendency of an author to use capitalization at the start of an email, or the punctuation after a greeting or farewell). We also included content features selected with Info Gain feature selection. The use of such features in the authorship identification process is observed to have a positive impact on the accuracy of the authorship identification task. We performed experiments to justify our arguments and compared the results with other baseline models. Experimental results reveal that the proposed CCM-based email authorship identification model, along with the proposed feature set, outperforms state-of-the-art support vector machine (SVM) based models, as well as the models proposed by Iqbal et al. (2010, 2013) [1,2]. The proposed model attains accuracy rates of 94% for 10 authors, 89% for 25 authors, and 81% for 50 authors on the Enron dataset, while 89.5% accuracy is achieved on a real email dataset constructed by the authors. The results on the Enron dataset were achieved with a considerably larger number of authors than in the models proposed by Iqbal et al. [1,2].
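
    A minimal sketch of the kind of email-specific stylometric features mentioned above (last punctuation mark, capitalization at the start, punctuation after a greeting). The exact feature definitions are assumptions for illustration, not the paper's feature set.

```python
import re

def stylometric_features(email_body: str) -> dict:
    """Extract a few illustrative email-specific stylometric features."""
    lines = [ln for ln in email_body.splitlines() if ln.strip()]
    first_char = email_body.lstrip()[:1]
    # Last punctuation mark used anywhere in the message.
    punct = re.findall(r"[.!?,;:]", email_body)
    greeting = lines[0] if lines else ""
    return {
        "starts_capitalized": first_char.isupper(),
        "last_punctuation": punct[-1] if punct else None,
        "greeting_ends_with_comma": greeting.rstrip().endswith(","),
        "avg_sentence_length": (
            len(email_body.split())
            / max(1, len(re.findall(r"[.!?]", email_body)))
        ),
    }

print(stylometric_features("Hi Anna,\n\nplease find the draft attached.\nThanks!"))
```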

  20. Fuzzy rule-based model for hydropower reservoirs operation

    Energy Technology Data Exchange (ETDEWEB)

    Moeini, R.; Afshar, A.; Afshar, M.H. [School of Civil Engineering, Iran University of Science and Technology, Tehran (Iran, Islamic Republic of)

    2011-02-15

    Real-time hydropower reservoir operation is a continuous decision-making process of determining the water level of a reservoir or the volume of water released from it. Hydropower operation is usually based on operating policies and rules defined and decided upon in strategic planning. This paper presents a fuzzy rule-based model for the operation of hydropower reservoirs. The proposed fuzzy rule-based model provides a set of suitable operating rules for release from the reservoir based on ideal or target storage levels. The model operates on an 'if-then' principle, in which the 'if' is a vector of fuzzy premises and the 'then' is a vector of fuzzy consequences. In this paper, reservoir storage, inflow, and period are used as premises and the release as the consequence. The steps involved in the development of the model include construction of membership functions for the inflow, storage and release, formulation of fuzzy rules, implication, aggregation and defuzzification. The knowledge base required for the formulation of the fuzzy rules is obtained from a stochastic dynamic programming (SDP) model with a steady-state policy. The proposed model is applied to the hydropower operation of the Dez reservoir in Iran and the results are presented and compared with those of the SDP model. The results indicate the ability of the method to solve hydropower reservoir operation problems. (author)
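
    A minimal two-rule Mamdani-style sketch of the 'if storage and inflow then release' structure described above, with triangular membership functions, min implication, max aggregation and centroid defuzzification. All membership-function parameters are illustrative; the real model also uses the period as a premise and many more rules.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Universe of discourse for release (million m^3); values are illustrative.
release = np.linspace(0.0, 100.0, 501)
release_low = tri(release, 0.0, 20.0, 50.0)
release_high = tri(release, 40.0, 80.0, 100.0)

def fuzzy_release(storage, inflow):
    """Two-rule Mamdani sketch of the rule-based release decision."""
    storage_low = tri(storage, 0.0, 20.0, 60.0)     # premise memberships
    storage_high = tri(storage, 40.0, 80.0, 120.0)
    inflow_low = tri(inflow, 0.0, 10.0, 30.0)
    inflow_high = tri(inflow, 20.0, 40.0, 60.0)

    # Rule firing strengths (min as the AND operator).
    r1 = min(storage_low, inflow_low)    # -> release low
    r2 = min(storage_high, inflow_high)  # -> release high

    # Implication (min), aggregation (max) and centroid defuzzification.
    aggregated = np.maximum(np.minimum(r1, release_low),
                            np.minimum(r2, release_high))
    return np.sum(aggregated * release) / (np.sum(aggregated) + 1e-12)

print(fuzzy_release(storage=75.0, inflow=35.0))   # a relatively high release
```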

  1. Vocational Teachers and Professionalism - A Model Based on Empirical Analyses

    DEFF Research Database (Denmark)

    Duch, Henriette Skjærbæk; Andreasen, Karen E

    Vocational Teachers and Professionalism - A Model Based on Empirical Analyses Several theorists have developed models to illustrate the processes of adult learning and professional development (e.g. Illeris, Argyris, Engeström; Wahlgren & Aarkorg, Kolb and Wenger). Models can sometimes be criticized...... emphasis on the adult employee, the organization, its surroundings as well as other contextual factors. Our concern is adult vocational teachers attending a pedagogical course and teaching at vocational colleges. The aim of the paper is to discuss different models and develop a model concerning teachers...... at vocational colleges based on empirical data in a specific context, a vocational teacher-training course in Denmark. By offering a basis and concepts for the analysis of practice, such a model is meant to support the development of vocational teachers’ professionalism in courses and in organizational contexts......

  2. Stochastic Watershed Models for Risk Based Decision Making

    Science.gov (United States)

    Vogel, R. M.

    2017-12-01

    Over half a century ago, the Harvard Water Program introduced the field of operational or synthetic hydrology, providing stochastic streamflow models (SSMs) that could generate ensembles of synthetic streamflow traces useful for hydrologic risk management. The application of SSMs, based on streamflow observations alone, revolutionized water resources planning activities, yet has fallen out of favor due, in part, to their inability to account for the now nearly ubiquitous anthropogenic influences on streamflow. This commentary advances the modern equivalent of SSMs, termed 'stochastic watershed models' (SWMs), useful as input to nearly all modern risk-based water resource decision-making approaches. SWMs are deterministic watershed models implemented using stochastic meteorological series, model parameters and model errors to generate ensembles of streamflow traces that represent the variability in possible future streamflows. SWMs combine deterministic watershed models, which are ideally suited to accounting for anthropogenic influences, with recent developments in uncertainty analysis and principles of stochastic simulation.
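
    A minimal sketch of the SWM idea described above: a toy deterministic watershed model (a single linear reservoir) is driven by resampled meteorological forcing, a perturbed parameter and a multiplicative model error to produce an ensemble of streamflow traces. The toy model and all distributions are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(7)

def watershed_model(precip, k):
    """Toy deterministic watershed model: a single linear reservoir,
    standing in for a much richer calibrated watershed model."""
    storage, flow = 0.0, []
    for p in precip:
        storage += p
        q = k * storage
        storage -= q
        flow.append(q)
    return np.array(flow)

def swm_ensemble(precip_obs, n_traces=100):
    """Generate an ensemble of streamflow traces by jointly perturbing
    meteorological forcing, a model parameter and a multiplicative error."""
    traces = []
    for _ in range(n_traces):
        precip = precip_obs * rng.lognormal(0.0, 0.3, size=precip_obs.size)
        k = rng.uniform(0.2, 0.4)                       # parameter uncertainty
        error = rng.lognormal(0.0, 0.1, size=precip_obs.size)
        traces.append(watershed_model(precip, k) * error)
    return np.array(traces)

precip_obs = rng.gamma(2.0, 2.0, size=365)              # one year of daily rain
ensemble = swm_ensemble(precip_obs)
print(ensemble.shape, np.percentile(ensemble, [5, 95]).round(2))
```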

  3. A sediment graph model based on SCS-CN method

    Science.gov (United States)

    Singh, P. K.; Bhunya, P. K.; Mishra, S. K.; Chaube, U. C.

    2008-01-01

    This paper proposes new conceptual sediment graph models based on coupling of popular and extensively used methods, viz., Nash model based instantaneous unit sediment graph (IUSG), soil conservation service curve number (SCS-CN) method, and Power law. These models vary in their complexity and this paper tests their performance using data of the Nagwan watershed (area = 92.46 km²) (India). The sensitivity of total sediment yield and peak sediment flow rate computations to model parameterisation is analysed. The exponent of the Power law, β, is more sensitive than other model parameters. The models are found to have substantial potential for computing sediment graphs (temporal sediment flow rate distribution) as well as total sediment yield.
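
    For reference, the SCS-CN component referred to above rests on the standard curve-number runoff relation, sketched below with the usual initial-abstraction ratio of 0.2; the coupling to the IUSG and the power law is not shown.

```python
def scs_cn_runoff(p_mm: float, cn: float, lam: float = 0.2) -> float:
    """Direct runoff depth (mm) from the SCS-CN relation; lam is the
    initial abstraction ratio (commonly 0.2)."""
    s = 25400.0 / cn - 254.0          # potential maximum retention, mm
    ia = lam * s                      # initial abstraction, mm
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Example: a 75 mm storm on a watershed with CN = 70.
print(round(scs_cn_runoff(75.0, 70.0), 1))   # ~17 mm of direct runoff
```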

  4. Pricing Mining Concessions Based on Combined Multinomial Pricing Model

    Directory of Open Access Journals (Sweden)

    Chang Xiao

    2017-01-01

    Full Text Available A combined multinomial pricing model is proposed for pricing mining concessions in which the annualized volatility of the price of mineral products follows a multinomial distribution. First, a combined multinomial pricing model is constructed that consists of binomial pricing models calculated according to different volatility values. Second, a method is provided to calculate the annualized volatility and its distribution. Third, the value of convenience yields is calculated based on the relationship between the futures price and the spot price, and the notion of convenience yields is used to adjust the model. Based on an empirical study of a Chinese copper mine concession, we verify that the model is easy to use and performs better than a model with constant volatility when the changing annualized volatility of the mineral product price is taken into account.
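
    A minimal sketch of the combination step described above: a Cox-Ross-Rubinstein binomial price is computed for each volatility level and the prices are weighted by the multinomial probabilities. A European call on a generic asset is used as a stand-in payoff; the concession-specific cash flows and the convenience-yield adjustment are not modelled, and all numbers are illustrative.

```python
import math

def crr_call(s0, k, r, sigma, t, n=200):
    """Cox-Ross-Rubinstein binomial price of a European call."""
    dt = t / n
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    p = (math.exp(r * dt) - d) / (u - d)     # risk-neutral up probability
    disc = math.exp(-r * dt)
    values = [max(s0 * u**j * d**(n - j) - k, 0.0) for j in range(n + 1)]
    for _ in range(n):                       # backward induction
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

def combined_multinomial_price(s0, k, r, t, vol_distribution):
    """Weight binomial prices by a discrete (multinomial) distribution of
    the annualized volatility."""
    return sum(w * crr_call(s0, k, r, sig, t) for sig, w in vol_distribution)

vol_distribution = [(0.15, 0.3), (0.25, 0.5), (0.40, 0.2)]   # (sigma, prob)
print(round(combined_multinomial_price(100.0, 95.0, 0.03, 1.0, vol_distribution), 2))
```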

  5. Process optimization of friction stir welding based on thermal models

    DEFF Research Database (Denmark)

    Larsen, Anders Astrup

    2010-01-01

    This thesis investigates how to apply optimization methods to numerical models of a friction stir welding process. The work is intended as a proof-of-concept using different methods that are applicable to models of high complexity, possibly with high computational cost, and without the possibility...... information of the high-fidelity model. The optimization schemes are applied to stationary thermal models of differing complexity of the friction stir welding process. The optimization problems considered are based on optimizing the temperature field in the workpiece by finding optimal translational speed....... Also an optimization problem based on a microstructure model is solved, allowing the hardness distribution in the plate to be optimized. The use of purely thermal models represents a simplification of the real process; nonetheless, it shows the applicability of the optimization methods considered...

  6. Modeling of Complex Life Cycle Prediction Based on Cell Division

    Directory of Open Access Journals (Sweden)

    Fucheng Zhang

    2017-01-01

    Full Text Available Effective fault diagnosis and reasonable life prediction are of great significance and practical engineering value for the safety, reliability, and maintenance cost of equipment and working environments. At present, equipment life prediction methods include condition-monitoring-based prediction, combined forecasting models, and data-driven approaches, most of which require large amounts of data. To address this issue, we propose learning from the mechanism of cell division in organisms. By studying the complex multifactor correlation life model, we establish a life prediction model of moderate complexity built on the cell division process. Experiments show that our model can effectively simulate the state of cell division. Using this model as a reference, we apply it to complex equipment life prediction.

  7. Model-based monitoring of rotors with multiple coexisting faults

    International Nuclear Information System (INIS)

    Rossner, Markus

    2015-01-01

    Monitoring systems are applied to many rotors, but only a few can separate coexisting faults and identify their magnitudes. This research project solves this problem using a combination of signal-based and model-based monitoring. The signal-based part performs a pre-selection of possible faults; these faults are further separated with model-based methods. This approach is demonstrated for unbalance, bow, stator-fixed misalignment, rotor-fixed misalignment and roundness errors. For the model-based part, unambiguous fault definitions and models are set up. The Ritz approach reduces the model order and therefore speeds up the diagnosis. Identification algorithms are developed for the different rotor faults. To this end, reliable damage indicators and proper sub-steps of the diagnosis have to be defined. For several monitoring problems, measuring both deflection and bearing force is very useful. The monitoring system is verified by experiments on an academic rotor test rig. The interpretation of the measurements requires considerable knowledge of the dynamics of the rotor. Due to the model-based approach, the system can separate faults with similar signal patterns and identify bow and roundness errors online at operating speed.

  8. The impact of design-based modeling instruction on seventh graders' spatial abilities and model-based argumentation

    Science.gov (United States)

    McConnell, William J.

    Due to the call of current science education reform for the integration of engineering practices within science classrooms, design-based instruction is receiving much attention in science education literature. Although some aspect of modeling is often included in well-known design-based instructional methods, it is not always a primary focus. The purpose of this study was to better understand how design-based instruction with an emphasis on scientific modeling might impact students' spatial abilities and their model-based argumentation abilities. In the following mixed-method multiple case study, seven seventh grade students attending a secular private school in the Mid-Atlantic region of the United States underwent an instructional intervention involving design-based instruction, modeling and argumentation. Through the course of a lesson involving students in exploring the interrelatedness of the environment and an animal's form and function, students created and used multiple forms of expressed models to assist them in model-based scientific argument. Pre/post data were collected through the use of The Purdue Spatial Visualization Test: Rotation, the Mental Rotation Test and interviews. Other data included a spatial activities survey, student artifacts in the form of models, notes, exit tickets, and video recordings of students throughout the intervention. Spatial abilities tests were analyzed using descriptive statistics while students' arguments were analyzed using the Instrument for the Analysis of Scientific Curricular Arguments and a behavior protocol. Models were analyzed using content analysis and interviews and all other data were coded and analyzed for emergent themes. Findings in the area of spatial abilities included increases in spatial reasoning for six out of seven participants, and an immense difference in the spatial challenges encountered by students when using CAD software instead of paper drawings to create models. Students perceived 3D printed

  9. An Agent-Based Approach to Modeling Online Social Influence

    NARCIS (Netherlands)

    Maanen, P.P. van; Vecht, B. van der

    2013-01-01

    The aim of this study is to better understand social influence in online social media. Therefore, we propose a method in which we implement, validate and improve an individual behavior model. The behavior model is based on three fundamental behavioral principles of social influence from the

  10. Simulating individual-based models of epidemics in hierarchical networks

    NARCIS (Netherlands)

    Quax, R.; Bader, D.A.; Sloot, P.M.A.

    2009-01-01

    Current mathematical modeling methods for the spreading of infectious diseases are too simplified and do not scale well. We present the Simulator of Epidemic Evolution in Complex Networks (SEECN), an efficient simulator of detailed individual-based models by parameterizing separate dynamics

  11. Model-based uncertainty in species range prediction

    DEFF Research Database (Denmark)

    Pearson, R. G.; Thuiller, Wilfried; Bastos Araujo, Miguel

    2006-01-01

    Aim Many attempts to predict the potential range of species rely on environmental niche (or 'bioclimate envelope') modelling, yet the effects of using different niche-based methodologies require further investigation. Here we investigate the impact that the choice of model can have on predictions...

  12. Model-Based GUI Testing Using Uppaal at Novo Nordisk

    DEFF Research Database (Denmark)

    H. Hjort, Ulrik; Rasmussen, Jacob Illum; Larsen, Kim Guldstrand

    2009-01-01

    This paper details a collaboration between Aalborg University and Novo Nordisk in developing an automatic model-based test generation tool for system testing of the graphical user interface of a medical device on an embedded platform. The tool takes as input a UML Statemachine model and generates...

  13. Event-based Simulation Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    De Raedt, H.; Michielsen, K.; Jaeger, G; Khrennikov, A; Schlosshauer, M; Weihs, G

    2011-01-01

    We present a corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one. The event-based corpuscular model gives a unified

  14. Model-based segmentation and classification of trajectories (Extended abstract)

    NARCIS (Netherlands)

    Alewijnse, S.P.A.; Buchin, K.; Buchin, M.; Sijben, S.; Westenberg, M.A.

    2014-01-01

    We present efficient algorithms for segmenting and classifying a trajectory based on a parameterized movement model such as the Brownian bridge movement model. Segmentation is the problem of subdividing a trajectory into parts such that each part is homogeneous in its movement characteristics. We

  15. A pedagogical model for simulation-based learning in healthcare

    Directory of Open Access Journals (Sweden)

    Tuulikki Keskitalo

    2015-11-01

    Full Text Available The aim of this study was to design a pedagogical model for a simulation-based learning environment (SBLE in healthcare. Currently, simulation and virtual reality are a major focus in healthcare education. However, when and how these learning environments should be applied is not well-known. The present study tries to fill that gap. We pose the following research question: What kind of pedagogical model supports and facilitates students’ meaningful learning in SBLEs? The study used design-based research (DBR and case study approaches. We report the results from our second case study and how the pedagogical model was developed based on the lessons learned. The study involved nine facilitators and 25 students. Data were collected and analysed using mixed methods. The main result of this study is the refined pedagogical model. The model is based on the socio-cultural theory of learning and characteristics of meaningful learning as well as previous pedagogical models. The model will provide a more holistic and meaningful approach to teaching and learning in SBLEs. However, the model requires evidence and further development.

  16. Runtime Optimizations for Tree-Based Machine Learning Models

    NARCIS (Netherlands)

    N. Asadi; J.J.P. Lin (Jimmy); A.P. de Vries (Arjen)

    2014-01-01

    Tree-based models have proven to be an effective solution for web ranking as well as other machine learning problems in diverse domains. This paper focuses on optimizing the runtime performance of applying such models to make predictions, specifically using gradient-boosted regression

  17. An intelligent trust-based access control model for affective ...

    African Journals Online (AJOL)

    In this study, a fuzzy expert system Trust-Based Access Control (TBAC) model for improving the quality of crowdsourcing using emotional affective computing is presented. This model takes into consideration a pre-processing module consisting of three inputs: crowd-worker category, trust metric and emotional ...

  18. WATGIS: A GIS-Based Lumped Parameter Water Quality Model

    Science.gov (United States)

    Glenn P. Fernandez; George M. Chescheir; R. Wayne Skaggs; Devendra M. Amatya

    2002-01-01

    A Geographic Information System (GIS)-based, lumped parameter water quality model was developed to estimate the spatial and temporal nitrogen-loading patterns for lower coastal plain watersheds in eastern North Carolina. The model uses a spatially distributed delivery ratio (DR) parameter to account for nitrogen retention or loss along a drainage network. Delivery...

  19. Perceptual decision neurosciences: a model-based review

    NARCIS (Netherlands)

    Mulder, M.J.; van Maanen, L.; Forstmann, B.U.

    2014-01-01

    In this review we summarize findings published over the past 10 years focusing on the neural correlates of perceptual decision-making. Importantly, this review highlights only studies that employ a model-based approach, i.e., they use quantitative cognitive models in combination with neuroscientific

  20. Simulation-based modeling of building complexes construction management

    Science.gov (United States)

    Shepelev, Aleksandr; Severova, Galina; Potashova, Irina

    2018-03-01

    The study reported here examines the experience in the development and implementation of business simulation games based on network planning and management of high-rise construction. Appropriate network models of different types and levels of detail have been developed; a simulation model including 51 blocks (11 stages combined in 4 units) is proposed.

  1. A physically based analytical spatial air temperature and humidity model

    Science.gov (United States)

    Yang Yang; Theodore A. Endreny; David J. Nowak

    2013-01-01

    Spatial variation of urban surface air temperature and humidity influences human thermal comfort, the settling rate of atmospheric pollutants, and plant physiology and growth. Given the lack of observations, we developed a Physically based Analytical Spatial Air Temperature and Humidity (PASATH) model. The PASATH model calculates spatial solar radiation and heat...

  2. Simulating an elastic bipedal robot based on musculoskeletal modeling

    NARCIS (Netherlands)

    Bortoletto, Roberto; Sartori, Massimo; He, Fuben; Pagello, Enrico

    2012-01-01

    Many of the processes involved in the synthesis of human motion have much in common with problems found in robotics research. This paper describes the modeling and the simulation of a novel bipedal robot based on series elastic actuators [1]. The robot model takes inspiration from the human

  3. Structuring Qualitative Data for Agent-Based Modelling

    NARCIS (Netherlands)

    Ghorbani, Amineh; Dijkema, Gerard P.J.; Schrauwen, Noortje

    2015-01-01

    Using ethnography to build agent-based models may result in more empirically grounded simulations. Our study on innovation practice and culture in the Westland horticulture sector served to explore what information and data from ethnographic analysis could be used in models and how. MAIA, a

  4. Statistical model of stress corrosion cracking based on extended

    Indian Academy of Sciences (India)

    The mechanism of stress corrosion cracking (SCC) has been discussed for decades. Here I propose a model of SCC that reflects the brittle character of the fracture, based on the variational principle under an approximate assumption of thermal equilibrium. In that model the functionals are expressed with extended forms of ...

  5. A Judgement-Based Model of Workplace Learning

    Science.gov (United States)

    Athanasou, James A.

    2004-01-01

    The purpose of this paper is to outline a judgement-based model of adult learning. This approach is set out as a Perceptual-Judgemental-Reinforcement approach to social learning under conditions of complexity and where there is no single, clearly identified correct response. The model builds upon the Hager-Halliday thesis of workplace learning and…

  6. A temperature dependent slip factor based thermal model for friction

    Indian Academy of Sciences (India)

    This paper proposes a new slip-factor-based three-dimensional thermal model to predict the temperature distribution during friction stir welding of 304L stainless steel plates. The proposed model employs a temperature- and radius-dependent heat source to study the thermal cycle, temperature distribution, power required, the ...

  7. A community-based framework for aquatic ecosystem models

    NARCIS (Netherlands)

    Trolle, D.; Hamilton, D.P.; Hipsey, M.R.; Bolding, K.; Bruggeman, J.; Mooij, W.M.; Janse, J.H.; Nielsen, A.; Jeppesen, E.; Elliott, J.A.; Makler-Pick, V.; Petzoldt, T.; Rinke, K.; Flindt, M.R.; Arhonditsis, G.B.; Gal, G.; Bjerring, R.; Tominaga, K.; Hoen, 't J.; Downing, A.S.; Marques, D.M.; Fragoso, C.R.; Sondergaard, M.; Hanson, P.C.

    2012-01-01

    Here, we communicate a point of departure in the development of aquatic ecosystem models, namely a new community-based framework, which supports an enhanced and transparent union between the collective expertise that exists in the communities of traditional ecologists and model developers. Through a

  8. Inquiry based learning as didactic model in distant learning

    NARCIS (Netherlands)

    Rothkrantz, L.J.M.

    2015-01-01

    In recent years many universities have become involved in the development of Massive Open Online Courses (MOOCs). Unfortunately, an appropriate didactic model for cooperative networked learning is lacking. In this paper we introduce inquiry-based learning as a didactic model. Students are assumed to ask themselves

  9. A Memory-Based Model of Hick's Law

    Science.gov (United States)

    Schneider, Darryl W.; Anderson, John R.

    2011-01-01

    We propose and evaluate a memory-based model of Hick's law, the approximately linear increase in choice reaction time with the logarithm of set size (the number of stimulus-response alternatives). According to the model, Hick's law reflects a combination of associative interference during retrieval from declarative memory and occasional savings…
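
    For reference, Hick's law is commonly written as the fitted relation below, where n is the number of stimulus-response alternatives and a and b are empirical constants; the memory-based model summarized above offers a mechanistic account of why reaction time follows this form.

```latex
\[
  \mathrm{RT} \;\approx\; a + b \,\log_2(n + 1)
\]
```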

  10. Individual-based modeling of ecological and evolutionary processes

    NARCIS (Netherlands)

    DeAngelis, D.L.; Mooij, W.M.

    2005-01-01

    Individual-based models (IBMs) allow the explicit inclusion of individual variation in greater detail than do classical differential and difference equation models. Inclusion of such variation is important for continued progress in ecological and evolutionary theory. We provide a conceptual basis

  11. Model-Based Engineering of Supervisory Controllers using CIF

    NARCIS (Netherlands)

    Schiffelers, R.R.H.; Theunissen, R.J.M.; Beek, van D.A.; Rooda, J.E.; Levendovsky, T.; Lengyel, L.

    2009-01-01

    In the Model-Based Engineering (MBE) paradigm, models are the core elements in the design process of a system from its requirements to the actual implementation of the system. By means of Supervisory Control Theory (SCT), supervisory controllers (supervisors) can be synthesized instead of

  12. Event-Based Corpuscular Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    Michielsen, K.; Jin, F.; Raedt, H. De

    A corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one is presented. The event-based corpuscular model is shown to give a

  13. Towards longitudinal activity-based models of travel demand

    NARCIS (Netherlands)

    Arentze, T.A.; Timmermans, H.J.P.; Lo, H.P.; Leung, Stephen C.H.; Tan, Susanna M.L.

    2008-01-01

    Existing activity-based models of travel demand consider a day as the time unit of observation and predict activity patterns of individuals for a typical or average day. In this study we argue that the use of a time span of one day severely limits the ability of the models to predict responsive

  14. A General Polygon-based Deformable Model for Object Recognition

    DEFF Research Database (Denmark)

    Jensen, Rune Fisker; Carstensen, Jens Michael

    1999-01-01

    We propose a general scheme for object localization and recognition based on a deformable model. The model combines shape and image properties by warping an arbitrary prototype intensity template according to the deformation in shape. The shape deformations are constrained by a probabilistic distr...

  15. A Range-Based Multivariate Model for Exchange Rate Volatility

    NARCIS (Netherlands)

    B. Tims (Ben); R.J. Mahieu (Ronald)

    2003-01-01

    textabstractIn this paper we present a parsimonious multivariate model for exchange rate volatilities based on logarithmic high-low ranges of daily exchange rates. The multivariate stochastic volatility model divides the log range of each exchange rate into two independent latent factors, which are

  16. Model based decision support for planning of road maintenance

    NARCIS (Netherlands)

    van Harten, Aart; Worm, J.M.; Worm, J.M.

    1996-01-01

    In this article we describe a Decision Support Model, based on Operational Research methods, for the multi-period planning of maintenance of bituminous pavements. This model is a tool for the road manager to assist in generating an optimal maintenance plan for a road. Optimal means: minimising the

  17. Archive Design Based on Planets Inspired Logical Object Model

    DEFF Research Database (Denmark)

    Zierau, Eld; Johansen, Anders

    2008-01-01

    We describe a proposal for a logical data model based on preliminary work in the Planets project. In OAIS terms, the main areas discussed relate to the introduction of a logical data model for representing the past, present and future versions of the digital object associated with the Archival St...... Storage Package for the publications deposited by our client repositories.

  18. Analysis of Food Hub Commerce and Participation Using Agent-Based Modeling: Integrating Financial and Social Drivers.

    Science.gov (United States)

    Krejci, Caroline C; Stone, Richard T; Dorneich, Michael C; Gilbert, Stephen B

    2016-02-01

    Factors influencing long-term viability of an intermediated regional food supply network (food hub) were modeled using agent-based modeling techniques informed by interview data gathered from food hub participants. Previous analyses of food hub dynamics focused primarily on financial drivers rather than social factors and have not used mathematical models. Based on qualitative and quantitative data gathered from 22 customers and 11 vendors at a midwestern food hub, an agent-based model (ABM) was created with distinct consumer personas characterizing the range of consumer priorities. A comparison study determined if the ABM behaved differently than a model based on traditional economic assumptions. Further simulation studies assessed the effect of changes in parameters, such as producer reliability and the consumer profiles, on long-term food hub sustainability. The persona-based ABM model produced different and more resilient results than the more traditional way of modeling consumers. Reduced producer reliability significantly reduced trade; in some instances, a modest reduction in reliability threatened the sustainability of the system. Finally, a modest increase in price-driven consumers at the outset of the simulation quickly resulted in those consumers becoming a majority of the overall customer base. Results suggest that social factors, such as desire to support the community, can be more important than financial factors. An ABM of food hub dynamics, based on human factors data gathered from the field, can be a useful tool for policy decisions. Similar approaches can be used for modeling customer dynamics with other sustainable organizations. © 2015, Human Factors and Ergonomics Society.
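
    A minimal sketch of a persona-based consumer agent of the kind described above: two illustrative personas (price-driven and community-driven) react differently to the hub's price premium and vendor reliability, and repeated bad experiences can cause churn. The personas, thresholds and parameters are assumptions for illustration, not the interview-derived model of the study.

```python
import random

random.seed(42)

class Consumer:
    """Consumer agent with an illustrative persona."""
    def __init__(self, persona):
        self.persona = persona           # 'price_driven' or 'community_driven'
        self.active = True

    def shops_this_week(self, hub_price_premium, vendor_reliability):
        if not self.active:
            return False
        if self.persona == "price_driven":
            stay = hub_price_premium < 0.10 and vendor_reliability > 0.9
        else:  # community-driven consumers tolerate price and supply hiccups
            stay = vendor_reliability > 0.6
        if not stay and random.random() < 0.2:
            self.active = False          # churn after repeated bad experiences
        return stay

def simulate(n_weeks=52, n_consumers=200, share_price_driven=0.3,
             vendor_reliability=0.85, hub_price_premium=0.15):
    consumers = [Consumer("price_driven" if random.random() < share_price_driven
                          else "community_driven") for _ in range(n_consumers)]
    weekly_sales = []
    for _ in range(n_weeks):
        sales = sum(c.shops_this_week(hub_price_premium, vendor_reliability)
                    for c in consumers)
        weekly_sales.append(sales)
    return weekly_sales

print(simulate()[-1], "active purchases in the final week")
```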

  19. Authentication in Virtual Organizations: A Reputation Based PKI Interconnection Model

    Science.gov (United States)

    Wazan, Ahmad Samer; Laborde, Romain; Barrere, Francois; Benzekri, Abdelmalek

    The authentication mechanism constitutes a central part of virtual organization work. PKI technology is used to provide authentication in each organization involved in the virtual organization. Different trust models have been proposed to interconnect the different PKIs in order to propagate trust between them. Since the existing trust models have many drawbacks, we propose a new trust model based on the reputation of PKIs.

  20. A MODELING METHOD OF FLUTTERING LEAVES BASED ON POINT CLOUD

    OpenAIRE

    J. Tang; Y. Wang; Y. Zhao; Y. Zhao; W. Hao; X. Ning; K. Lv; Z. Shi; M. Zhao

    2017-01-01

    Leaves falling gently or fluttering are a common phenomenon in natural scenes. The authenticity of falling leaves plays an important part in the dynamic modeling of natural scenes, and falling-leaf models have wide applications in animation and virtual reality. We propose a novel modeling method for fluttering leaves based on point clouds in this paper. According to the shape and weight of the leaves and the wind speed, three basic trajectories of falling leaves are defined, which ar...