WorldWideScience

Sample records for models based primarily

  1. Inquiry-based learning transitions to interdisciplinary research at a small primarily undergraduate institution.

    Science.gov (United States)

    Lehto, H.; Ward, J. W.

    2016-12-01

    Inquiry-based learning has been shown by many to be a useful way of engaging students and fostering a deeper learning of the subject matter. In traditional geophysics courses we take our equipment to a quad on campus or to a nearby site and have our students run surveys that countless students have run before. While this approach is active and does promote deeper learning than a lecture-only course, it can still be stale and inauthentic. By using new and unexplored sites for inquiry-based learning projects within our courses, we provide opportunities for students to be part of an authentic research experience. Inquiry-based learning started in my geophysics course when I needed a site where my students could run a resistivity survey. My colleague, James Ward, recommended a site that was contaminated with salts believed to come from either an unlined (or improperly lined) brine pit or a leaking casing from old oil field operations. The goal of the project was to use a resistivity survey to determine the shape, and therefore the cause, of the salt source. The students in my geophysics class were introduced to the 'client' (James Ward), who told them about the site and the two different hypotheses for the source of the salt contamination. The students studied site images, looked at soil data, and then each proposed a plan for the resistivity survey. We then met in the field, and the students were given a quick explanation of how the system worked and what they needed to do that day. The students were told to take thorough notes, take lots of photographs, and ask as many questions as they needed to understand what was going on. On the following Monday I broke the students into groups and taught them how to use the EarthImager 2D software to analyze the data. The students were then required to interpret their data and individually write up a technical report for our 'client'. The final graded technical reports suggested that authentic, inquiry-based learning facilitated a deeper

  2. Predator effects on a detritus-based food web are primarily mediated by non-trophic interactions.

    Science.gov (United States)

    Majdi, Nabil; Boiché, Anatole; Traunspurger, Walter; Lecerf, Antoine

    2014-07-01

    Predator effects on ecosystems can extend far beyond their prey and are often not solely lethally transmitted. Changes in prey traits in response to predation risk can have important repercussions on community assembly and key ecosystem processes (i.e. trait-mediated indirect effects). In addition, some predators themselves alter habitat structure or nutrient cycling through ecological engineering effects. Tracking these non-trophic pathways is thus an important, yet challenging, task for gaining a better grasp of the functional role of predators. Multiple lines of evidence suggest that, in detritus-based food webs, non-trophic interactions may prevail over purely trophic interactions in determining predator effects on plant litter decomposition. This hypothesis was tested in a headwater stream by modulating the density of a flatworm predator (Polycelis felina) in enclosures containing oak (Quercus robur) leaf litter exposed to natural colonization by small invertebrates and microbial decomposers. Causal path modelling was used to infer how predator effects propagated through the food web. Flatworms accelerated litter decomposition through positive effects on microbial decomposers. The biomass of prey and non-prey invertebrates was not negatively affected by flatworms, suggesting that the net predator effect on litter decomposition was primarily determined by non-trophic interactions. Flatworms enhanced the deposition and retention of fine sediments on the leaf surface, thereby improving leaf colonization by invertebrates - most of which have strong affinities with interstitial habitats. This predator-induced improvement of habitat availability was attributed to the sticky nature of the mucus that flatworms secrete in copious amounts while foraging. Results of path analyses further indicated that this bottom-up ecological engineering effect was as powerful as the top-down effect on invertebrate prey. Our findings suggest that predators have the potential to affect substantially

  3. Predator effects on a detritus‐based food web are primarily mediated by non‐trophic interactions

    National Research Council Canada - National Science Library

    Majdi, Nabil; Boiché, Anatole; Traunspurger, Walter; Lecerf, Antoine; Rudolf, Volker

    2014-01-01

    .... Multiple lines of evidence suggest that, in detritus‐based food webs, non‐trophic interactions may prevail over purely trophic interactions in determining predator effects on plant litter decomposition...

  4. Digital Surface and Terrain Models (DSM,DTM), The DTM associated with the Base Mapping Program consists of mass points and breaklines used primarily for ortho rectification. The DTM specifications included all breaklines for all hydro and transportation features and are the source for the TIPS (Tenn, Published in 2007, 1:4800 (1in=400ft) scale, State of Tennessee.

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — This Digital Surface and Terrain Models (DSM,DTM) dataset, published at 1:4800 (1in=400ft) scale, was produced all or in part from Orthoimagery information as of...

  5. A Simplified PBPK Modeling Approach for Prediction of Pharmacokinetics of Four Primarily Renally Excreted and CYP3A Metabolized Compounds During Pregnancy

    OpenAIRE

    Xia, Binfeng; Heimbach, Tycho; Gollen, Rakesh; Nanavati, Charvi; He, Handan

    2013-01-01

    During pregnancy, a drug's pharmacokinetics may be altered, and hence anticipation of potential systemic exposure changes is highly desirable. Physiologically based pharmacokinetic (PBPK) models have recently been used to influence clinical trial design or to facilitate regulatory interactions. Ideally, whole-body PBPK models can be used to predict a drug's systemic exposure in pregnant women based on major physiological changes which can impact drug clearance (i.e., in the kidney and liver) ...
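    The record is truncated, but the general idea of a simplified clearance adjustment can be sketched as follows. This is a minimal illustration assuming a two-pathway (renal plus CYP3A) clearance model; all scaling factors and drug parameters are invented rather than taken from the paper.

```python
# Minimal sketch of a simplified pregnancy clearance adjustment: scale renal
# clearance with the assumed change in GFR and hepatic clearance with the
# assumed change in CYP3A activity. All numbers are hypothetical placeholders,
# not values from the record.

def pregnancy_clearance(cl_renal_baseline, cl_cyp3a_baseline,
                        gfr_ratio=1.5, cyp3a_ratio=1.6):
    """Return adjusted total clearance (L/h) during pregnancy.

    gfr_ratio   -- assumed fold-change in glomerular filtration rate
    cyp3a_ratio -- assumed fold-change in hepatic CYP3A activity
    """
    cl_renal = cl_renal_baseline * gfr_ratio
    cl_hepatic = cl_cyp3a_baseline * cyp3a_ratio
    return cl_renal + cl_hepatic

# Example: a hypothetical drug cleared 70% renally and 30% by CYP3A at baseline (10 L/h total).
baseline_total = 10.0
adjusted = pregnancy_clearance(0.7 * baseline_total, 0.3 * baseline_total)
print(f"Adjusted total clearance: {adjusted:.1f} L/h")
```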

  6. Nanoparticles affect PCR primarily via surface interactions with PCR components: using amino-modified silica-coated magnetic nanoparticles as a main model

    Science.gov (United States)

    Nanomaterials have been widely reported to affect the polymerase chain reaction (PCR). However, many studies in which these effects were observed were not comprehensive, and many of the proposed mechanisms have been primarily speculative. In this work, we used amino-modified silica-coated magnetic n...

  7. HIV-1 frameshift efficiency is primarily determined by the stability of base pairs positioned at the mRNA entrance channel of the ribosome.

    Science.gov (United States)

    Mouzakis, Kathryn D; Lang, Andrew L; Vander Meulen, Kirk A; Easterday, Preston D; Butcher, Samuel E

    2013-02-01

    The human immunodeficiency virus (HIV) requires a programmed -1 ribosomal frameshift for Pol gene expression. The HIV frameshift site consists of a heptanucleotide slippery sequence (UUUUUUA) followed by a spacer region and a downstream RNA stem-loop structure. Here we investigate the role of the RNA structure in promoting the -1 frameshift. The stem-loop was systematically altered to decouple the contributions of local and overall thermodynamic stability towards frameshift efficiency. No correlation between overall stability and frameshift efficiency is observed. In contrast, there is a strong correlation between frameshift efficiency and the local thermodynamic stability of the first 3-4 bp in the stem-loop, which are predicted to reside at the opening of the mRNA entrance channel when the ribosome is paused at the slippery site. Insertions or deletions in the spacer region appear to correspondingly change the identity of the base pairs encountered 8 nt downstream of the slippery site. Finally, the role of the surrounding genomic secondary structure was investigated and found to have a modest impact on frameshift efficiency, consistent with the hypothesis that the genomic secondary structure attenuates frameshifting by affecting the overall rate of translation.

  8. The MyD88 pathway in plasmacytoid and CD4+ dendritic cells primarily triggers type I IFN production against measles virus in a mouse infection model.

    Science.gov (United States)

    Takaki, Hiromi; Takeda, Makoto; Tahara, Maino; Shingai, Masashi; Oshiumi, Hiroyuki; Matsumoto, Misako; Seya, Tsukasa

    2013-11-01

    Infection by measles virus (MV) induces type I IFN via the retinoic acid-inducible gene I/melanoma differentiation-associated gene 5/mitochondrial antiviral signaling protein (MAVS) pathway in human cells. However, the in vivo role of the MAVS pathway in host defense against MV infection remains undetermined. CD150 transgenic (Tg) mice, which express human CD150, an entry receptor for MV, on a type I IFN receptor-deficient background (Ifnar(-/-)), are susceptible to MV and serve as a model for MV infection. In this study, we generated CD150Tg/Mavs(-/-) mice and examined MV permissiveness compared with that in CD150Tg/Ifnar(-/-) mice. MV replicated mostly in the spleen of i.p.-infected CD150Tg/Ifnar(-/-) mice. Strikingly, CD150Tg/Mavs(-/-) mice were not permissive to MV in vivo because of substantial type I IFN induction. MV barely replicated in any other organs tested. When T cells, B cells, and dendritic cells (DCs) isolated from CD150Tg/Mavs(-/-) splenocytes were cultured with MV in vitro, only the DCs produced type I IFN. In vitro infection analysis using CD150Tg/Mavs(-/-) DC subsets revealed that CD4(+) and plasmacytoid DCs, but not CD8α(+) and CD8α(-)CD4(-) double negative DCs, were exclusively involved in type I IFN production in response to MV infection. Because CD150Tg/Mavs(-/-) mice became permissive to MV upon anti-IFNAR Ab treatment, type I IFN produced by CD4(+) DCs and plasmacytoid DCs plays a critical role in antiviral protection for neighboring cells expressing IFNAR. Induction of type I IFN in these DC subsets was abolished by the MyD88 inhibitory peptide. Thus, production of type I IFN occurs via the MyD88-dependent and MAVS-independent signaling pathway during MV infection.

  9. Model-based geostatistics

    CERN Document Server

    Diggle, Peter J

    2007-01-01

    Model-based geostatistics refers to the application of general statistical principles of modeling and inference to geostatistical problems. This volume provides a treatment of model-based geostatistics with an emphasis on statistical methods and applications. It also features analyses of datasets from a range of scientific contexts.

  10. Case-Based Modeling for Learning Management and Interpersonal Skills

    Science.gov (United States)

    Lyons, Paul

    2008-01-01

    This article offers an introduction to case-based modeling (CBM) and a demonstration of the efficacy of this instructional model. CBM is grounded primarily in the concepts and theory of experiential learning, augmented by concepts of script creation. Although it is labor intensive, the model is one that has value for instruction in various…

  11. Web Based VRML Modelling

    NARCIS (Netherlands)

    Kiss, S.

    2001-01-01

    Presents a method to connect VRML (Virtual Reality Modeling Language) and Java components in a Web page using EAI (External Authoring Interface), which makes it possible to interactively generate and edit VRML meshes. The meshes used are based on regular grids, to provide an interaction and modeling

  12. Model Based Definition

    Science.gov (United States)

    Rowe, Sidney E.

    2010-01-01

    In September 2007, the Engineering Directorate at the Marshall Space Flight Center (MSFC) created the Design System Focus Team (DSFT). MSFC was responsible for the in-house design and development of the Ares 1 Upper Stage, and the Engineering Directorate was preparing to deploy a new electronic Configuration Management and Data Management System with the Design Data Management System (DDMS), based upon a Commercial Off The Shelf (COTS) Product Data Management (PDM) System. The DSFT was to establish standardized CAD practices and a new data life cycle for design data. Of special interest here, the design teams were to implement Model Based Definition (MBD) in support of the Upper Stage manufacturing contract. Notably, this MBD implementation uses partially dimensioned drawings as auxiliary information to the model. The design data life cycle implemented several new release states, used prior to formal release, that allowed the models to move through a flow of progressive maturity. The DSFT identified some 17 Lessons Learned as outcomes of the standards development, pathfinder deployments and initial application to the Upper Stage design completion. Some of the high-value examples are reviewed.

  13. On Henselian valuations and Brauer groups of primarily quasilocal fields

    OpenAIRE

    Chipchakov, Ivan

    2011-01-01

    This paper gives a classification, up to isomorphism, of the abelian torsion groups realizable as Brauer groups of the major types of Henselian valued primarily quasilocal fields with totally indivisible value groups. When $E$ is a quasilocal field with such a valuation, it shows that the Brauer group of $E$ is divisible and embeddable in the quotient group of the additive group of rational numbers by the subgroup of integers.
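    In symbols (notation chosen here, not taken from the paper), the stated conclusion for such a field E can be written as:

```latex
% Restatement of the abstract's conclusion; Br(E) denotes the Brauer group of E.
\[
  \operatorname{Br}(E)\ \text{is divisible, and}\quad
  \operatorname{Br}(E)\hookrightarrow \mathbb{Q}/\mathbb{Z}.
\]
```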

  14. Careers in virology: teaching at a primarily undergraduate institution.

    Science.gov (United States)

    Kushner, David B

    2014-10-01

    A faculty position at a primarily undergraduate institution requires working with undergraduates in both the classroom and the research lab. Graduate students and postdoctoral fellows who are interested in such a career should understand that faculty at these institutions need to teach broadly and devise research questions that can be addressed safely and with limited resources compared to a research I university. Aspects of, and ways to prepare for, this career will be reviewed herein. Copyright © 2014, American Society for Microbiology. All Rights Reserved.

  15. Human punishment is not primarily motivated by inequality

    Science.gov (United States)

    Marczyk, Jesse

    2017-01-01

    Previous theorizing about punishment has suggested that humans desire to punish inequality per se. However, the research supporting such an interpretation contains important methodological confounds. The main objective of the current experiment was to remove those confounds in order to test whether generating inequality per se is punished. Participants were recruited from an online market to take part in a wealth-alteration game with an ostensible second player. The participants were given an option to deduct from the other player’s payment as punishment for their behavior during the game. The results suggest that human punishment does not appear to be motivated by inequality per se, as inequality that was generated without inflicting costs on others was not reliably punished. Instead, punishment seems to respond primarily to the infliction of costs, with inequality only becoming relevant as a secondary input for punishment decisions. The theoretical significance of this finding is discussed in the context of its possible adaptive value. PMID:28187166

  16. Cluster Based Text Classification Model

    DEFF Research Database (Denmark)

    2011-01-01

    We propose a cluster based classification model for suspicious email detection and other text classification tasks. The text classification tasks comprise many training examples that require a complex classification model. Using clusters for classification makes the model simpler and increases th...... datasets. Our model also outperforms A Decision Cluster Classification (ADCC) and the Decision Cluster Forest Classification (DCFC) models on the Reuters-21578 dataset....
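    The record above is truncated, but the general idea of cluster-based text classification can be sketched as follows. This is a minimal illustration built with scikit-learn rather than the authors' model; the toy documents, labels, and parameters are placeholders.

```python
# Minimal sketch of cluster-based text classification: cluster the training
# documents, label each cluster by majority vote, then classify new documents
# by their nearest cluster. This illustrates the general idea only; it is not
# the model proposed in the record.
from collections import Counter

from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

train_texts = ["free money offer", "meeting at noon", "win a prize now", "project status update"]
train_labels = ["spam", "ham", "spam", "ham"]

vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(train_texts)

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# Assign each cluster the majority label of its members.
cluster_labels = {
    c: Counter(l for l, a in zip(train_labels, kmeans.labels_) if a == c).most_common(1)[0][0]
    for c in range(kmeans.n_clusters)
}

def classify(text):
    cluster = kmeans.predict(vectorizer.transform([text]))[0]
    return cluster_labels[cluster]

print(classify("claim your free prize"))  # expected: spam
```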

  17. Method for gesture based modeling

    DEFF Research Database (Denmark)

    2006-01-01

    A computer program based method is described for creating models using gestures. On an input device, such as an electronic whiteboard, a user draws a gesture which is recognized by a computer program and interpreted relative to a predetermined meta-model. Based on the interpretation, an algorithm...... is assigned to the gesture drawn by the user. The executed algorithm may, for example, consist in creating a new model element, modifying an existing model element, or deleting an existing model element....

  18. Model Construct Based Enterprise Model Architecture and Its Modeling Approach

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In order to support enterprise integration, a model-construct-based enterprise model architecture and its modeling approach are studied in this paper. First, the structural makeup and internal relationships of the enterprise model architecture are discussed. Then, the concept of a reusable model construct (MC), which belongs to the control view and can help to derive other views, is proposed. The modeling approach based on model constructs consists of three steps: reference model architecture synthesis, enterprise model customization, and system design and implementation. Following the MC-based modeling approach, a case study with the background of one-kind-product machinery manufacturing enterprises is illustrated. It is shown that the proposed model-construct-based enterprise model architecture and modeling approach are practical and efficient.

  19. Primarily nonlinear effects observed in a driven asymmetrical vibrating wire

    Science.gov (United States)

    Hanson, Roger J.; Macomber, H. Kent; Morrison, Andrew C.; Boucher, Matthew A.

    2005-01-01

    The purpose of the work reported here is to further experimentally explore the wide variety of behaviors exhibited by driven vibrating wires, primarily in the nonlinear regime. When the wire is driven near a resonant frequency, it is found that most such behaviors are significantly affected by the splitting of the resonant frequency and by the existence of a "characteristic" axis associated with each split frequency. It is shown that frequency splitting decreases with increasing wire tension and can be altered by twisting. Two methods are described for determining the orientation of characteristic axes. Evidence is provided, with a possible explanation, that each axis has the same orientation everywhere along the wire. Frequency response data exhibiting nonlinear generation of transverse motion perpendicular to the driving direction, hysteresis, linear generation of perpendicular motion (sometimes tubular), and generation of motion at harmonics of the driving frequency are exhibited and discussed. Also reported, under seemingly unchanging conditions, are abrupt large changes in the harmonic content of the motion that sometimes involve large subharmonics and harmonics thereof. Slow transitions from one stable state of vibration to another and quasiperiodic motions are also exhibited. Possible musical significance is discussed.

  20. HMM-based Trust Model

    DEFF Research Database (Denmark)

    ElSalamouny, Ehab; Nielsen, Mogens; Sassone, Vladimiro

    2010-01-01

    with their dynamic behaviour. Using Hidden Markov Models (HMMs) for both modelling and approximating the behaviours of principals, we introduce the HMM-based trust model as a new approach to evaluating trust in systems exhibiting dynamic behaviour. This model avoids the fixed behaviour assumption which is considered...... the major limitation of existing Beta trust model. We show the consistency of the HMM-based trust model and contrast it against the well known Beta trust model with the decay principle in terms of the estimation precision....
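    As a minimal sketch of the general idea (not the paper's model), the forward algorithm of a two-state HMM can be used to predict the probability of a satisfactory next interaction from an observed history; all probabilities below are invented for illustration.

```python
# Minimal sketch of HMM-based trust evaluation: given a sequence of observed
# interaction outcomes (1 = satisfactory, 0 = unsatisfactory), run the forward
# algorithm on an assumed 2-state HMM and predict the probability that the
# next interaction is satisfactory. The matrices are illustrative placeholders.
import numpy as np

A = np.array([[0.9, 0.1],     # hidden-state transition matrix
              [0.2, 0.8]])
B = np.array([[0.95, 0.05],   # emission probs: P(obs | state), columns = (satisfactory, unsatisfactory)
              [0.30, 0.70]])
pi = np.array([0.5, 0.5])     # initial state distribution

def trust_estimate(observations):
    """Return P(next observation = satisfactory | history) under the HMM."""
    alpha = pi * B[:, 1 - observations[0]]    # forward variables for the first observation
    for o in observations[1:]:
        alpha = (alpha @ A) * B[:, 1 - o]
    alpha = alpha / alpha.sum()               # filtered state distribution
    next_state = alpha @ A                    # one-step-ahead state prediction
    return float(next_state @ B[:, 0])        # probability of a satisfactory outcome

print(trust_estimate([1, 1, 0, 1, 1]))
```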

  1. Model-Based Reasoning

    Science.gov (United States)

    Ifenthaler, Dirk; Seel, Norbert M.

    2013-01-01

    In this paper, there will be a particular focus on mental models and their application to inductive reasoning within the realm of instruction. A basic assumption of this study is the observation that the construction of mental models and related reasoning is a slowly developing capability of cognitive systems that emerges effectively with proper…

  2. Model-based Software Engineering

    DEFF Research Database (Denmark)

    2010-01-01

    The vision of model-based software engineering is to make models the main focus of software development and to automatically generate software from these models. Part of that idea works already today. But, there are still difficulties when it comes to behaviour. Actually, there is no lack in models...

  3. Model-based Software Engineering

    DEFF Research Database (Denmark)

    2010-01-01

    The vision of model-based software engineering is to make models the main focus of software development and to automatically generate software from these models. Part of that idea works already today. But, there are still difficulties when it comes to behaviour. Actually, there is no lack in models...

  4. Principles of models based engineering

    Energy Technology Data Exchange (ETDEWEB)

    Dolin, R.M.; Hefele, J.

    1996-11-01

    This report describes a Models Based Engineering (MBE) philosophy and implementation strategy that has been developed at Los Alamos National Laboratory's Center for Advanced Engineering Technology. A major theme in this discussion is that models based engineering is an information management technology enabling the development of information driven engineering. Unlike other information management technologies, models based engineering encompasses the breadth of engineering information, from design intent through product definition to consumer application.

  5. Element-Based Computational Model

    Directory of Open Access Journals (Sweden)

    Conrad Mueller

    2012-02-01

    Full Text Available A variation on the data-flow model is proposed for developing parallel architectures. While the model is a data-driven model, it has significant differences from the data-flow model. The proposed model has an evaluation cycle of processing elements (encapsulated data) that is similar to the instruction cycle of the von Neumann model. The elements contain the information required to process them. The model is inherently parallel. An emulation of the model has been implemented. The objective of this paper is to motivate support for taking the research further. Using matrix multiplication as a case study, the element/data-flow based model is compared with the instruction-based model. This is done using complexity analysis followed by empirical testing to verify this analysis. The positive results are given as motivation for the research to be taken to the next stage - that is, implementing the model using FPGAs.

  6. Graph Model Based Indoor Tracking

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Lu, Hua; Yang, Bin

    2009-01-01

    infrastructure for different symbolic positioning technologies, e.g., Bluetooth and RFID. More specifically, the paper proposes a model of indoor space that comprises a base graph and mappings that represent the topology of indoor space at different levels. The resulting model can be used for one or several...... indoor positioning technologies. Focusing on RFID-based positioning, an RFID specific reader deployment graph model is built from the base graph model. This model is then used in several algorithms for constructing and refining trajectories from raw RFID readings. Empirical studies with implementations...

  7. Central Puget Sound Ecopath/Ecosim model biological parameters - Developing food web models for ecosystem-based management applications in Puget Sound

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This project is developing food web models for ecosystem-based management applications in Puget Sound. It is primarily being done by NMFS FTEs and contractors, in...

  8. Central Puget Sound Ecopath/Ecosim model outputs - Developing food web models for ecosystem-based management applications in Puget Sound

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This project is developing food web models for ecosystem-based management applications in Puget Sound. It is primarily being done by NMFS FTEs and contractors, in...

  9. Model-based consensus

    NARCIS (Netherlands)

    Boumans, Marcel

    2014-01-01

    The aim of the rational-consensus method is to produce “rational consensus”, that is, “mathematical aggregation”, by weighing the performance of each expert on the basis of his or her knowledge and ability to judge relevant uncertainties. The measurement of the performance of the experts is based on

  10. Model-based consensus

    NARCIS (Netherlands)

    M. Boumans

    2014-01-01

    The aim of the rational-consensus method is to produce "rational consensus", that is, "mathematical aggregation", by weighing the performance of each expert on the basis of his or her knowledge and ability to judge relevant uncertainties. The measurement of the performance of the experts is based on
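    Both records above are truncated, but the core of performance-weighted aggregation can be sketched as follows; the weights and estimates are made up, and the actual method derives weights from each expert's measured performance on calibration questions.

```python
# Minimal sketch of performance-weighted "rational consensus": each expert's
# estimate of an uncertain quantity is combined using weights derived from the
# expert's measured performance. Numbers are invented for illustration and are
# not taken from the record.
import numpy as np

expert_estimates = np.array([12.0, 15.0, 9.5])    # estimates of the same quantity
performance_scores = np.array([0.8, 0.3, 0.6])    # e.g., calibration on seed questions

weights = performance_scores / performance_scores.sum()
consensus = float(weights @ expert_estimates)
print(f"Weighted consensus estimate: {consensus:.2f}")
```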

  11. Graph Model Based Indoor Tracking

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Lu, Hua; Yang, Bin

    2009-01-01

    The tracking of the locations of moving objects in large indoor spaces is important, as it enables a range of applications related to, e.g., security and indoor navigation and guidance. This paper presents a graph model based approach to indoor tracking that offers a uniform data management...... infrastructure for different symbolic positioning technologies, e.g., Bluetooth and RFID. More specifically, the paper proposes a model of indoor space that comprises a base graph and mappings that represent the topology of indoor space at different levels. The resulting model can be used for one or several...... indoor positioning technologies. Focusing on RFID-based positioning, an RFID specific reader deployment graph model is built from the base graph model. This model is then used in several algorithms for constructing and refining trajectories from raw RFID readings. Empirical studies with implementations...
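    A minimal sketch of the general idea follows: a hypothetical reader deployment graph and a toy routine that reduces raw RFID readings to a trajectory of plausible reader-to-reader moves. The paper's model is considerably richer (a base graph plus topology mappings at several levels).

```python
# Minimal sketch of graph-model-based indoor tracking: a reader deployment
# graph records which RFID readers are reachable from one another, and a raw
# reading sequence is reduced to a trajectory of visited readers. The graph
# and readings are invented for illustration.
from itertools import groupby

# Adjacency of a hypothetical reader deployment graph (reader -> reachable readers).
deployment_graph = {
    "R1": {"R2"},
    "R2": {"R1", "R3"},
    "R3": {"R2"},
}

raw_readings = ["R1", "R1", "R2", "R2", "R2", "R3", "R3"]

def to_trajectory(readings, graph):
    """Collapse repeated readings and drop transitions the graph disallows."""
    trajectory = [r for r, _ in groupby(readings)]
    cleaned = trajectory[:1]
    for nxt in trajectory[1:]:
        if nxt in graph.get(cleaned[-1], set()):
            cleaned.append(nxt)          # plausible move between adjacent readers
        # otherwise treat the reading as noise and skip it
    return cleaned

print(to_trajectory(raw_readings, deployment_graph))  # ['R1', 'R2', 'R3']
```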

  12. Efficient Model-Based Exploration

    NARCIS (Netherlands)

    Wiering, M.A.; Schmidhuber, J.

    1998-01-01

    Model-Based Reinforcement Learning (MBRL) can greatly profit from using world models for estimating the consequences of selecting particular actions: an animat can construct such a model from its experiences and use it for computing rewarding behavior. We study the problem of collecting useful exper
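    The record is truncated, but the core loop of model-based reinforcement learning can be sketched as follows: record experienced transitions and rewards in a tabular world model, then plan on that learned model with value iteration. The environment and parameters are invented for illustration, not taken from the paper.

```python
# Minimal sketch of model-based RL on a tiny deterministic chain: fill a
# tabular world model from (pretend) experience, then plan with value
# iteration on that model.
import numpy as np

n_states, n_actions, gamma = 4, 2, 0.9

# Learned model: transition[s, a] -> next state, reward[s, a] -> reward.
transition = np.zeros((n_states, n_actions), dtype=int)
reward = np.zeros((n_states, n_actions))

# Pretend experience: action 1 moves right, action 0 stays; reaching the last
# state from its neighbour pays +1.
for s in range(n_states):
    transition[s, 0] = s
    transition[s, 1] = min(s + 1, n_states - 1)
    reward[s, 1] = 1.0 if s == n_states - 2 else 0.0

# Plan on the learned model with value iteration.
V = np.zeros(n_states)
for _ in range(100):
    V = np.max(reward + gamma * V[transition], axis=1)

policy = np.argmax(reward + gamma * V[transition], axis=1)
print("values:", np.round(V, 2), "policy:", policy)
```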

  13. What Happens to Marriages Built Primarily on Sex?

    Science.gov (United States)

    Mace, David R.

    1971-01-01

    In an interview, a marriage counselor answers questions concerning sex in marriage. He concludes that sex alone is too narrow a base for a marriage to rest upon and for a successful marriage, there is a need for a deeper basis of companionship. (Author/CG)

  14. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event......-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms...... of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches....

  15. Modeling Web-based Educational Systems: Process Design Teaching Model

    Directory of Open Access Journals (Sweden)

    Elena Rokou

    2004-01-01

    Full Text Available Using modeling languages is essential to the construction of educational systems based on software engineering principles and methods. Furthermore, the instructional design is undoubtedly the cornerstone of the design and development of educational systems. Although several methodologies and languages have been proposed for the specification of isolated educational multimedia systems, none has optimum results for the description of these systems and, especially, of their pedagogical aspect. Of course, this is due primarily to how these systems function and are applied; it is not due to the language itself, although its special characteristics contribute substantially to the development of these systems, sometimes positively and sometimes negatively. In this paper, we briefly describe the introduction of stereotypes into the pedagogical design of educational systems and appropriate modifications of the existing package diagrams of UML (Unified Modeling Language). The main objective of these new stereotypes is to sufficiently describe the mechanisms of generation, monitoring and re-adaptation of teaching and student models which can be used in educational applications.

  16. Empirically Based, Agent-based models

    Directory of Open Access Journals (Sweden)

    Elinor Ostrom

    2006-12-01

    Full Text Available There is an increasing drive to combine agent-based models with empirical methods. An overview is provided of the various empirical methods that are used for different kinds of questions. Four categories of empirical approaches are identified in which agent-based models have been empirically tested: case studies, stylized facts, role-playing games, and laboratory experiments. We discuss how these different types of empirical studies can be combined. The various ways empirical techniques are used illustrate the main challenges of contemporary social sciences: (1) how to develop models that are generalizable and still applicable in specific cases, and (2) how to scale up the processes of interactions of a few agents to interactions among many agents.

  17. Leg stiffness primarily depends on ankle stiffness during human hopping.

    Science.gov (United States)

    Farley, C T; Morgenroth, D C

    1999-03-01

    When humans hop in place or run forward, they adjust leg stiffness to accommodate changes in stride frequency or surface stiffness. The goal of the present study was to determine the mechanisms by which humans adjust leg stiffness during hopping in place. Five subjects hopped in place at 2.2 Hz while we collected force platform and kinematic data. Each subject completed trials in which they hopped to whatever height they chose ("preferred height hopping") and trials in which they hopped as high as possible ("maximum height hopping"). Leg stiffness was approximately twice as great for maximum height hopping as for preferred height hopping. Ankle torsional stiffness was 1.9-times greater while knee torsional stiffness was 1.7-times greater in maximum height hopping than in preferred height hopping. We used a computer simulation to examine the sensitivity of leg stiffness to the observed changes in ankle and knee stiffness. Our model consisted of four segments (foot, shank, thigh, head-arms-trunk) interconnected by three torsional springs (ankle, knee, hip). In the model, increasing ankle stiffness by 1.9-fold, as observed in the subjects, caused leg stiffness to increase by 2.0-fold. Increasing knee stiffness by 1.7-fold had virtually no effect on leg stiffness. Thus, we conclude that the primary mechanism for leg stiffness adjustment is the adjustment of ankle stiffness.

  18. Base Flow Model Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The program focuses on turbulence modeling enhancements for predicting high-speed rocket base flows. A key component of the effort is the collection of high-fidelity...

  19. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2009-01-01

    The purpose of the paper is to obtain insight into and provide practical advice for event-based conceptual modeling. We analyze a set of event concepts and use the results to formulate a conceptual event model that is used to identify guidelines for creation of dynamic process models and static...... information models. We characterize events as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The conceptual event model is used to characterize a variety of event concepts and it is used to illustrate how events can...... be used to integrate dynamic modeling of processes and static modeling of information structures. The results are unique in the sense that no other general event concept has been used to unify a similar broad variety of seemingly incompatible event concepts. The general event concept can be used...

  20. Event-Based Activity Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2004-01-01

    We present and discuss a modeling approach that supports event-based modeling of information and activity in information systems. Interacting human actors and IT-actors may carry out such activity. We use events to create meaningful relations between information structures and the related activit...

  1. Modelling Gesture Based Ubiquitous Applications

    CERN Document Server

    Zacharia, Kurien; Varghese, Surekha Mariam

    2011-01-01

    A cost-effective, gesture-based modelling technique called Virtual Interactive Prototyping (VIP) is described in this paper. Prototyping is implemented by projecting a virtual model of the equipment to be prototyped. Users can interact with the virtual model like the original working equipment. For capturing and tracking the user interactions with the model, image and sound processing techniques are used. VIP is a flexible and interactive prototyping method that has many applications in ubiquitous computing environments. Different commercial and socio-economic applications of VIP, as well as its extension to interactive advertising, are also discussed.

  2. Potentially inappropriate prescribing of primarily renally cleared medications for older veterans affairs nursing home patients.

    Science.gov (United States)

    Hanlon, Joseph T; Wang, Xiaoqiang; Handler, Steven M; Weisbord, Steven; Pugh, Mary Jo; Semla, Todd; Stone, Roslyn A; Aspinall, Sherrie L

    2011-06-01

    Inappropriate prescribing of primarily renally cleared medications in older patients with kidney disease can lead to adverse outcomes. To estimate the prevalence of potentially inappropriate prescribing of 21 primarily renally cleared medications based on 2 separate estimates of renal function, and to identify factors associated with this form of suboptimal prescribing in older VA nursing home (NH) patients. Longitudinal study. Participants were 1304 patients, aged 65 years or older, admitted between January 1, 2004, and June 30, 2005, for 90 days or more to 1 of 133 VA NHs. Potentially inappropriate prescribing of primarily renally cleared medications was determined by estimating creatinine clearance using the Cockcroft-Gault (CG) and Modification of Diet in Renal Disease (MDRD) equations and applying explicit guidelines for contraindicated medications and dosing. The median estimated creatinine clearance via CG was 67 mL/min, whereas it was 80 mL/min/1.73m(2) with the MDRD. Overall, 11.89% of patients via CG and only 5.98% via MDRD had evidence of potentially inappropriate prescribing of at least 1 renally cleared medication. The most commonly involved medications were ranitidine, glyburide, gabapentin, and nitrofurantoin. Factors associated with potentially inappropriate prescribing per the CG were age older than 85 (adjusted odds ratio [AOR] 4.24, 95% confidence interval [CI] 2.42-7.43), obesity (AOR 0.26, 95% CI 0.14-0.50) and having multiple comorbidities (AOR 1.09 for each unit increase in the Charlson comorbidity index, 95% CI 1.01-1.19). Potentially inappropriate prescribing of renally cleared medications is common in older VA NH patients. Intervention studies to improve the prescribing of primarily renally cleared medications in nursing homes are needed. Copyright © 2011 American Medical Directors Association. Published by Elsevier Inc. All rights reserved.
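    As a minimal sketch of the screening step, creatinine clearance can be estimated with the Cockcroft-Gault equation and compared against a drug-specific threshold; the drug and threshold below are hypothetical placeholders, not the study's 21-drug criteria.

```python
# Minimal sketch: estimate creatinine clearance with the Cockcroft-Gault
# equation and flag a renally cleared drug whose (hypothetical) label sets a
# clearance threshold.

def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female):
    """Estimated creatinine clearance (mL/min) via Cockcroft-Gault."""
    crcl = (140 - age_years) * weight_kg / (72 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl

# Hypothetical rule: avoid "drug X" when CrCl falls below 60 mL/min.
crcl = cockcroft_gault(age_years=86, weight_kg=60, serum_creatinine_mg_dl=1.4, female=True)
if crcl < 60:
    print(f"CrCl {crcl:.0f} mL/min: potentially inappropriate prescription of drug X")
```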

  3. Agent Based Multiviews Requirements Model

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Based on current research in viewpoint-oriented requirements engineering and intelligent agents, we present the concept of a viewpoint agent and its abstract model based on a meta-language for multi-view requirements engineering. It provides a basis for consistency checking and integration of different viewpoint requirements; at the same time, this checking and integration can be realized automatically by virtue of an intelligent agent's autonomy, proactiveness and social ability. Finally, we introduce a practical application of the model through a case study of data flow diagrams.

  4. Model-Based Security Testing

    CERN Document Server

    Schieferdecker, Ina; Schneider, Martin; 10.4204/EPTCS.80.1

    2012-01-01

    Security testing aims at validating software system requirements related to security properties like confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for the specification of test cases at a higher level of abstraction, for enabling guidance on test identification and specification, as well as for automated test generation. Model-based security testing (MBST) is a relatively new field and is especially dedicated to the systematic and efficient specification and documentation of security test objectives, security test cases and test suites, as well as to their automated or semi-automated generation. In particular, the combination of security modelling and test generation approaches is still a challenge in research and of high interest for industrial applications. MBST includes e.g. security functional testing, model-based fuzzing, risk- and threat-oriented testing,...

  5. Kernel model-based diagnosis

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The methods for computing the kernel consistency-based diagnoses and the kernel abductive diagnoses are only suited to the situation where some of the fault behavioral modes of the components are known. A characterization of kernel model-based diagnosis based on the general causal theory is proposed, which can break through the limitation of the above methods when all behavioral modes of each component are known. Using this method, when the observation subsets deduced logically are respectively assigned to the empty set or the whole observation set, the kernel consistency-based diagnoses and the kernel abductive diagnoses can deal with all situations. The direct relationship between this diagnostic procedure and the prime implicants/implicates is proved, thus linking the theoretical result with implementation.

  6. Model-based tomographic reconstruction

    Science.gov (United States)

    Chambers, David H; Lehman, Sean K; Goodman, Dennis M

    2012-06-26

    A model-based approach to estimating wall positions for a building is developed and tested using simulated data. It borrows two techniques from geophysical inversion problems, layer stripping and stacking, and combines them with a model-based estimation algorithm that minimizes the mean-square error between the predicted signal and the data. The technique is designed to process multiple looks from an ultra wideband radar array. The processed signal is time-gated and each section processed to detect the presence of a wall and estimate its position, thickness, and material parameters. The floor plan of a building is determined by moving the array around the outside of the building. In this paper we describe how the stacking and layer stripping algorithms are combined and show the results from a simple numerical example of three parallel walls.
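    A minimal sketch of the model-based estimation step is given below: a toy signal model (Gaussian pulses standing in for wall reflections) is fit to synthetic data by minimizing the mean-square error with scipy. The geometry, signal model, and noise level are invented; this is not the paper's radar model.

```python
# Minimal sketch: predict a return as a sum of wall reflections and fit the
# wall delays/amplitudes by least squares against (synthetic) measured data.
import numpy as np
from scipy.optimize import least_squares

t = np.linspace(0, 1, 400)                      # fast-time axis (arbitrary units)

def predicted_signal(wall_delays, amplitudes):
    """Sum of Gaussian pulses, one per wall reflection."""
    return sum(a * np.exp(-((t - d) / 0.02) ** 2)
               for d, a in zip(wall_delays, amplitudes))

# Synthetic "measured" data from two walls plus noise.
rng = np.random.default_rng(0)
data = predicted_signal([0.30, 0.65], [1.0, 0.6]) + 0.02 * rng.standard_normal(t.size)

def residuals(params):
    d1, d2, a1, a2 = params
    return predicted_signal([d1, d2], [a1, a2]) - data

fit = least_squares(residuals, x0=[0.25, 0.70, 0.8, 0.8])
print("estimated wall delays:", np.round(fit.x[:2], 3))
```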

  7. Model-Based Security Testing

    Directory of Open Access Journals (Sweden)

    Ina Schieferdecker

    2012-02-01

    Full Text Available Security testing aims at validating software system requirements related to security properties like confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for the specification of test cases at a higher level of abstraction, for enabling guidance on test identification and specification, as well as for automated test generation. Model-based security testing (MBST) is a relatively new field and is especially dedicated to the systematic and efficient specification and documentation of security test objectives, security test cases and test suites, as well as to their automated or semi-automated generation. In particular, the combination of security modelling and test generation approaches is still a challenge in research and of high interest for industrial applications. MBST includes e.g. security functional testing, model-based fuzzing, risk- and threat-oriented testing, and the usage of security test patterns. This paper provides a survey on MBST techniques and the related models as well as samples of new methods and tools that are under development in the European ITEA2-project DIAMONDS.

  8. Temporal integration of loudness in listeners with hearing losses of primarily cochlear origin

    DEFF Research Database (Denmark)

    Buus, Søren; Florentine, Mary; Poulsen, Torben

    1999-01-01

    To investigate how hearing loss of primarily cochlear origin affects the loudness of brief tones, loudness matches between 5- and 200-ms tones were obtained as a function of level for 15 listeners with cochlear impairments and for seven age-matched controls. Three frequencies, usually 0.5, 1, and 4......-frequency hearing losses (slopes >50 dB/octave) showed larger-than-normal maximal amounts of temporal integration (40 to 50 dB). This finding is consistent with the shallow loudness functions predicted by our excitation-pattern model for impaired listeners [, in Modeling Sensorineural Hearing Loss, edited by W....... Jesteadt (Erlbaum, Mahwah, NJ, 1997), pp. 187–198]. Loudness functions derived from impaired listeners' temporal-integration functions indicate that restoration of loudness in listeners with cochlear hearing loss usually will require the same gain whether the sound is short or long. ©1999 Acoustical...

  9. Crowdsourcing Based 3d Modeling

    Science.gov (United States)

    Somogyi, A.; Barsi, A.; Molnar, B.; Lovas, T.

    2016-06-01

    Web-based photo albums that support organizing and viewing the users' images are widely used. These services provide a convenient solution for storing, editing and sharing images. In many cases, the users attach geotags to the images in order to enable their use, e.g. in location-based applications on social networks. Our paper discusses a procedure that collects open-access images from a site frequently visited by tourists. Geotagged pictures showing a sight or tourist attraction are selected and processed in photogrammetric processing software that produces the 3D model of the captured object. For the particular investigation we selected three attractions in Budapest. To assess the geometric accuracy, we used laser scanning as well as DSLR and smartphone photography to derive reference values for verifying the spatial model obtained from the web-album images. The investigation shows how detailed and accurate models can be derived using photogrammetric processing software, simply from images of the community, without visiting the site.

  10. An Agent Based Classification Model

    CERN Document Server

    Gu, Feng; Greensmith, Julie

    2009-01-01

    The major function of this model is to access the UCI Wisconsin Breast Cancer data-set[1] and classify the data items into two categories, normal and anomalous. This kind of classification can be referred to as anomaly detection, which discriminates anomalous behaviour from normal behaviour in computer systems. One popular solution for anomaly detection is Artificial Immune Systems (AIS). AIS are adaptive systems inspired by theoretical immunology and observed immune functions, principles and models which are applied to problem solving. The Dendritic Cell Algorithm (DCA)[2] is an AIS algorithm that is developed specifically for anomaly detection. It has been successfully applied to intrusion detection in computer security. It is believed that agent-based modelling is an ideal approach for implementing AIS, as intelligent agents could be the perfect representations of immune entities in AIS. This model evaluates the feasibility of re-implementing the DCA in an agent-based simulation environ- ...

  11. Defeaturing CAD models using a geometry-based size field and facet-based reduction operators.

    Energy Technology Data Exchange (ETDEWEB)

    Quadros, William Roshan; Owen, Steven James

    2010-04-01

    We propose a method to automatically defeature a CAD model by detecting irrelevant features using a geometry-based size field and a method to remove the irrelevant features via facet-based operations on a discrete representation. A discrete B-Rep model is first created by obtaining a faceted representation of the CAD entities. The candidate facet entities are then marked for reduction by using a geometry-based size field. This is accomplished by estimating local mesh sizes based on geometric criteria. If the field value at a facet entity goes below a user specified threshold value then it is identified as an irrelevant feature and is marked for reduction. The reduction of marked facet entities is primarily performed using an edge collapse operator. Care is taken to retain a valid geometry and topology of the discrete model throughout the procedure. The original model is not altered as the defeaturing is performed on a separate discrete model. Associativity between the entities of the discrete model and that of original CAD model is maintained in order to decode the attributes and boundary conditions applied on the original CAD entities onto the mesh via the entities of the discrete model. Example models are presented to illustrate the effectiveness of the proposed approach.
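    The marking step can be sketched as follows, using edge length as a crude stand-in for the geometry-based size field; the mesh and threshold are invented, and the paper additionally guards geometric and topological validity during the collapses.

```python
# Minimal sketch of the marking step in size-field-driven defeaturing: compute
# a local size estimate for each facet edge (here simply its length) and mark
# edges whose size falls below a user threshold as candidates for collapse.
import numpy as np

# A tiny facet mesh: vertex coordinates and edges (pairs of vertex indices).
vertices = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [1.02, 0.98]])
edges = [(0, 1), (1, 2), (2, 3), (3, 1), (0, 2)]

def edge_length(e):
    a, b = e
    return float(np.linalg.norm(vertices[a] - vertices[b]))

size_threshold = 0.1   # user-specified minimum relevant feature size

marked_for_collapse = [e for e in edges if edge_length(e) < size_threshold]
print("edges marked for collapse:", marked_for_collapse)
```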

  12. Research on BOM based composable modeling method

    NARCIS (Netherlands)

    Zhang, M.; He, Q.; Gong, J.

    2013-01-01

    Composable modeling methods have been a research hotspot in the area of Modeling and Simulation for a long time. In order to increase the reuse and interoperability of BOM-based models, this paper puts forward a composable modeling method based on BOM and studies the basic theory of composable modeling m

  13. Intelligent model-based OPC

    Science.gov (United States)

    Huang, W. C.; Lai, C. M.; Luo, B.; Tsai, C. K.; Chih, M. H.; Lai, C. W.; Kuo, C. C.; Liu, R. G.; Lin, H. T.

    2006-03-01

    Optical proximity correction is the technique of pre-distorting mask layouts so that the printed patterns are as close to the desired shapes as possible. For model-based optical proximity correction, a lithographic model is needed to predict the edge position (contour) of patterns on the wafer after lithographic processing. Generally, segmentation of edges is performed prior to the correction. Pattern edges are dissected into several small segments with corresponding target points. During the correction, the edges are moved back and forth from the initial drawn position, assisted by the lithographic model, to finally settle on the proper positions. When the correction converges, the intensity predicted by the model at every target point hits the model-specific threshold value. Several iterations are required to achieve convergence, and the computation time increases with the number of required iterations. An artificial neural network is an information-processing paradigm inspired by biological nervous systems, such as how the brain processes information. It is composed of a large number of highly interconnected processing elements (neurons) working in unison to solve specific problems. A neural network can be a powerful data-modeling tool that is able to capture and represent complex input/output relationships. The network can accurately predict the behavior of a system via the learning procedure. A radial basis function network, a variant of the artificial neural network, is an efficient function approximator. In this paper, a radial basis function network was used to build a mapping from the segment characteristics to the edge shift from the drawn position. This network can provide a good initial guess for each segment on which OPC is carried out. The good initial guess reduces the required iterations. Consequently, cycle time can be shortened effectively. The optimization of the radial basis function network for this system was carried out with a genetic algorithm
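    As a minimal sketch of the surrogate idea (not the production OPC flow), an RBF network can be fit to map segment features to an initial edge-shift guess; the features, targets, and network sizes below are synthetic.

```python
# Minimal sketch of a radial basis function (RBF) regression surrogate:
# k-means picks the centers, least squares fits the output weights, and the
# network then maps segment features to an initial edge-shift estimate.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 3))           # synthetic segment features
y = 0.5 * X[:, 0] - 0.2 * X[:, 1] ** 2 + 0.1 * rng.standard_normal(200)  # synthetic edge shift

centers = KMeans(n_clusters=20, n_init=10, random_state=0).fit(X).cluster_centers_
width = 0.5

def rbf_features(X):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * width ** 2))

weights, *_ = np.linalg.lstsq(rbf_features(X), y, rcond=None)

def initial_edge_shift(segment_features):
    return rbf_features(np.atleast_2d(segment_features)) @ weights

print(initial_edge_shift([0.2, -0.4, 0.1]))     # predicted initial shift for one segment
```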

  14. Efficient transfection of DNA into primarily cultured rat sertoli cells by electroporation.

    Science.gov (United States)

    Li, Fuping; Yamaguchi, Kohei; Okada, Keisuke; Matsushita, Kei; Enatsu, Noritoshi; Chiba, Koji; Yue, Huanxun; Fujisawa, Masato

    2013-03-01

    The expression of exogenous DNA in Sertoli cells is essential for studying their functional genomics, pathway analysis, and medical applications. Electroporation is a valuable tool for nucleic acid delivery, even in primarily cultured cells, which are considered difficult to transfect. In this study, we developed an optimized protocol for electroporation-based transfection of Sertoli cells and compared its efficiency with conventional lipofection. Sertoli cells were transfected with the pCMV-GFP plasmid by square-wave electroporation under different conditions. After transfection of the plasmid into Sertoli cells, enhanced green fluorescent protein (EGFP) expression could be easily detected by fluorescence microscopy, and cell survival was evaluated by dye exclusion assay using Trypan blue. In terms of both cell survival and the percentage of cells expressing EGFP, 250 V was determined to produce the greatest number of transiently transfected cells. Keeping the voltage constant (250 V), relatively high cell survival (76.5% ± 3.4%) and transfection efficiency (30.6% ± 5.6%) were observed with a pulse length of 20 μm. The number of pulses significantly affected cell survival and EGFP expression. The transfection efficiency of electroporation (21.5% ± 5.7%) was significantly higher than that of Lipofectamine 2000 (2.9% ± 1.0%) and Effectene (1.9% ± 0.8%) in this experiment. In conclusion, we determined optimal electroporation conditions and achieved successful electroporation of plasmid DNA into primarily cultured Sertoli cells.

  15. Optimal pricing decision model based on activity-based costing

    Institute of Scientific and Technical Information of China (English)

    王福胜; 常庆芳

    2003-01-01

    In order to determine whether the optimal pricing decision model based on the conventional cost-behavior model still applies now that activity-based costing has strongly challenged the conventional cost-behavior model and its assumptions, detailed analyses were made using the activity-based cost-behavior and cost-volume-profit analysis models. It is concluded from these analyses that the theory behind the construction of the optimal pricing decision model remains tenable under activity-based costing, but the conventional optimal pricing decision model must be modified to fit the activity-based cost-behavior and cost-volume-profit analysis models; an optimal pricing decision model is essentially a product pricing decision model constructed by following the economic principle of profit maximization.
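    A minimal numerical sketch of the profit-maximization idea under an activity-based cost structure follows; the demand curve, cost drivers, and all numbers are hypothetical and are not taken from the paper.

```python
# Minimal sketch of optimal pricing with activity-based costs: demand is
# assumed linear in price, unit-level costs scale with volume, batch-level
# costs scale with the number of batches, and the profit-maximizing price is
# found by a simple grid search.
import numpy as np

def demand(price):
    return max(0.0, 1000 - 8 * price)          # assumed linear demand

def profit(price, unit_cost=40, batch_cost=500, batch_size=100, facility_cost=5000):
    q = demand(price)
    batches = np.ceil(q / batch_size)          # batch-level cost driver
    total_cost = unit_cost * q + batch_cost * batches + facility_cost
    return price * q - total_cost

prices = np.linspace(40, 125, 1000)
best = max(prices, key=profit)
print(f"profit-maximizing price ~ {best:.2f}, profit ~ {profit(best):.0f}")
```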

  16. Differential geometry based multiscale models.

    Science.gov (United States)

    Wei, Guo-Wei

    2010-08-01

    Large chemical and biological systems such as fuel cells, ion channels, molecular motors, and viruses are of great importance to the scientific community and public health. Typically, these complex systems in conjunction with their aquatic environment pose a fabulous challenge to theoretical description, simulation, and prediction. In this work, we propose a differential geometry based multiscale paradigm to model complex macromolecular systems, and to put macroscopic and microscopic descriptions on an equal footing. In our approach, the differential geometry theory of surfaces and geometric measure theory are employed as a natural means to couple the macroscopic continuum mechanical description of the aquatic environment with the microscopic discrete atomistic description of the macromolecule. Multiscale free energy functionals, or multiscale action functionals are constructed as a unified framework to derive the governing equations for the dynamics of different scales and different descriptions. Two types of aqueous macromolecular complexes, ones that are near equilibrium and others that are far from equilibrium, are considered in our formulations. We show that generalized Navier-Stokes equations for the fluid dynamics, generalized Poisson equations or generalized Poisson-Boltzmann equations for electrostatic interactions, and Newton's equation for the molecular dynamics can be derived by the least action principle. These equations are coupled through the continuum-discrete interface whose dynamics is governed by potential driven geometric flows. Comparison is given to classical descriptions of the fluid and electrostatic interactions without geometric flow based micro-macro interfaces. The detailed balance of forces is emphasized in the present work. We further extend the proposed multiscale paradigm to micro-macro analysis of electrohydrodynamics, electrophoresis, fuel cells, and ion channels. We derive generalized Poisson-Nernst-Planck equations that are

  17. Developing Empirically Based Models of Practice.

    Science.gov (United States)

    Blythe, Betty J.; Briar, Scott

    1985-01-01

    Over the last decade emphasis has shifted from theoretically based models of practice to empirically based models whose elements are derived from clinical research. These models are defined and a developing model of practice through the use of single-case methodology is examined. Potential impediments to this new role are identified. (Author/BL)

  18. Guide to APA-Based Models

    Science.gov (United States)

    Robins, Robert E.; Delisi, Donald P.

    2008-01-01

    In Robins and Delisi (2008), a linear decay model, a new IGE model by Sarpkaya (2006), and a series of APA-Based models were scored using data from three airports. This report is a guide to the APA-based models.

  19. 12 CFR 225.127 - Investment in corporations or projects designed primarily to promote community welfare.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 3 2010-01-01 2010-01-01 false Investment in corporations or projects designed... equity and debt investments in corporations or projects designed primarily to promote community welfare... not intended to limit projects under § 225.25(b)(6) to that area. Other investments primarily designed...

  20. Proceedings Tenth Workshop on Model Based Testing

    OpenAIRE

    Pakulin, Nikolay; Petrenko, Alexander K.; Schlingloff, Bernd-Holger

    2015-01-01

    The workshop is devoted to model-based testing of both software and hardware. Model-based testing uses models describing the required behavior of the system under consideration to guide such efforts as test selection and test results evaluation. Testing validates the real system behavior against models and checks that the implementation conforms to them, but it is also capable of finding errors in the models themselves. The intent of this workshop is to bring together researchers and users of model...

  1. Product Modelling for Model-Based Maintenance

    NARCIS (Netherlands)

    Houten, van F.J.A.M.; Tomiyama, T.; Salomons, O.W.

    1998-01-01

    The paper describes the fundamental concepts of maintenance and the role that information technology can play in the support of maintenance activities. Function-Behaviour-State modelling is used to describe faults and deterioration of mechanisms in terms of user perception and measurable quantities.

  2. Observation-Based Modeling for Model-Based Testing

    NARCIS (Netherlands)

    Kanstrén, T.; Piel, E.; Gross, H.-G.

    2009-01-01

    One of the single most important reasons that modeling and modelbased testing are not yet common practice in industry is the perceived difficulty of making the models up to the level of detail and quality required for their automated processing. Models unleash their full potential only through suffi

  4. Cosine Based Latent Factor Model for Precision Oriented Recommendation

    Directory of Open Access Journals (Sweden)

    Bipul Kumar

    2016-01-01

    Full Text Available Recommender systems suggest a list of interesting items to users based on their prior purchase or browsing behaviour on e-commerce platforms. Continuing research in recommender systems has primarily focused on developing algorithms for the rating prediction task. However, most e-commerce platforms provide a ‘top-k’ list of interesting items for every user. In line with this idea, the paper proposes a novel machine learning algorithm to predict a list of ‘top-k’ items by optimizing the latent factors of users and items with the mapped scores from ratings. The basic idea is to learn latent factors based on the cosine similarity between the user and item latent features, which is then used to predict the scores for unseen items for every user. Comprehensive empirical evaluations on publicly available benchmark datasets reveal that the proposed model outperforms the state-of-the-art algorithms in recommending good items to a user.
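
    The record describes the algorithm only in words; the toy sketch below illustrates the general idea of cosine-based latent factors (learn user and item factors so that their cosine similarity tracks a rating mapped to [0, 1], then rank unseen items per user). All names, dimensions and learning rates are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def train_cosine_lf(ratings, n_users, n_items, k=8, lr=0.05, reg=0.02, epochs=50):
    """Toy cosine-based latent factor model (illustrative, not the paper's exact method).

    ratings: list of (user, item, score_in_[0,1]) triples, e.g. rating / 5.0.
    Learns factors P (users) and Q (items) so that cos(P[u], Q[i]) ~ score.
    """
    rng = np.random.default_rng(0)
    P = rng.normal(scale=0.1, size=(n_users, k))
    Q = rng.normal(scale=0.1, size=(n_items, k))
    for _ in range(epochs):
        for u, i, s in ratings:
            pu, qi = P[u], Q[i]
            npu, nqi = np.linalg.norm(pu), np.linalg.norm(qi)
            cos = pu @ qi / (npu * nqi + 1e-12)
            err = cos - s
            # Gradients of the cosine similarity w.r.t. pu and qi.
            gp = qi / (npu * nqi + 1e-12) - cos * pu / (npu**2 + 1e-12)
            gq = pu / (npu * nqi + 1e-12) - cos * qi / (nqi**2 + 1e-12)
            P[u] -= lr * (err * gp + reg * pu)
            Q[i] -= lr * (err * gq + reg * qi)
    return P, Q

def top_k(P, Q, user, seen, k=5):
    """Rank unseen items for one user by cosine score."""
    scores = (Q @ P[user]) / (np.linalg.norm(Q, axis=1) * np.linalg.norm(P[user]) + 1e-12)
    return [i for i in np.argsort(-scores) if i not in seen][:k]
```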

  5. Return of feature-based cost modeling

    Science.gov (United States)

    Creese, Robert C.; Patrawala, Taher B.

    1998-10-01

    Feature based cost modeling is thought of as a relatively new approach to cost modeling, but it saw considerable development in the 1950s. Considerable work was published in the 1950s by Boeing on the cost of various casting processes--sand casting, die casting, investment casting and permanent mold casting--as a function of a single casting feature, casting volume. Additional approaches to feature based cost modeling have since been made; this work reviews previous works and proposes an integrated model for feature based cost modeling.

  6. GIS-based hydrological model of the upstream ...

    African Journals Online (AJOL)

    eobe

    Meteorological Agency (NIMET) and Jebba Hydroelectric ... The simulation of the hydrological cycle by SWAT is based on the water balance equation ... The estimation of the base flow is done using Equation 5 ... [remainder of this record is fragmentary extraction from the source PDF]

  7. Extensions in model-based system analysis

    OpenAIRE

    Graham, Matthew R.

    2007-01-01

    Model-based system analysis techniques provide a means for determining desired system performance prior to actual implementation. In addition to specifying desired performance, model-based analysis techniques require mathematical descriptions that characterize relevant behavior of the system. The developments of this dissertation give extended formulations for control-relevant model estimation as well as model-based analysis conditions for performance requirements specified as frequency do...

  8. Trace-Based Code Generation for Model-Based Testing

    NARCIS (Netherlands)

    Kanstrén, T.; Piel, E.; Gross, H.-G.

    2009-01-01

    Paper Submitted for review at the Eighth International Conference on Generative Programming and Component Engineering. Model-based testing can be a powerful means to generate test cases for the system under test. However, creating a useful model for model-based testing requires expertise in the (fo

  10. Application of numerical methods for diffusion-based modeling of skin permeation.

    Science.gov (United States)

    Frasch, H Frederick; Barbero, Ana M

    2013-02-01

    The application of numerical methods for mechanistic, diffusion-based modeling of skin permeation is reviewed. Methods considered here are finite difference, method of lines, finite element, finite volume, random walk, cellular automata, and smoothed particle hydrodynamics. First the methods are briefly explained with rudimentary mathematical underpinnings. Current state of the art numerical models are described, and then a chronological overview of published models is provided. Key findings and insights of reviewed models are highlighted. Model results support a primarily transcellular pathway with anisotropic lipid transport. Future endeavors would benefit from a fundamental analysis of drug/vehicle/skin interactions.
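
    Finite difference is the first of the reviewed methods; as a minimal, purely illustrative sketch (made-up parameters, a homogeneous membrane, constant donor concentration and a sink boundary), the explicit scheme below solves Fick's second law across a membrane and reports the outgoing flux. It is not any specific model from the review.

```python
import numpy as np

def membrane_flux(D=1e-9, h=1e-5, c_donor=1.0, n=50, t_end=0.2, dt=None):
    """Explicit finite-difference solution of Fick's second law, dc/dt = D d2c/dx2,
    across a homogeneous membrane of thickness h (illustrative parameters only).
    Boundaries: constant donor concentration at x = 0, sink at x = h."""
    dx = h / n
    if dt is None:
        dt = 0.4 * dx**2 / D              # respect explicit stability limit dt <= dx^2 / (2D)
    c = np.zeros(n + 1)
    c[0] = c_donor                        # donor boundary
    for _ in range(int(t_end / dt)):
        c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
        c[-1] = 0.0                       # sink boundary
    return D * (c[-2] - c[-1]) / dx       # flux leaving the membrane (Fick's first law)

print(membrane_flux())                    # approaches the steady-state value D * c_donor / h
```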

  11. An information theory-based approach to modeling the information processing of NPP operators

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute, Taejon (Korea, Republic of)

    2002-08-01

    This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task. The focus is on i) developing a model for the information processing of NPP operators and ii) quantifying the model. To resolve the problems of previous approaches based on information theory, i.e. the problems of single-channel approaches, we primarily develop an information processing model having multiple stages, which contains information flows. Then the uncertainty of the information is quantified using Conant's model, an information-theoretic model.
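
    The basic information-theoretic ingredient behind such quantification is Shannon entropy and the mutual information transmitted through a stage; the sketch below shows only that generic ingredient with an invented joint distribution, not Conant's specific multi-stage partitioning.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def mutual_information(joint):
    """I(X; Y) in bits from a joint probability table p(x, y)."""
    joint = np.asarray(joint, dtype=float)
    return entropy(joint.sum(axis=1)) + entropy(joint.sum(axis=0)) - entropy(joint.ravel())

# Toy joint distribution of 'display state' vs. 'operator action' for one processing stage.
joint = np.array([[0.30, 0.05],
                  [0.05, 0.60]])
print(mutual_information(joint))   # information transmitted through this stage
```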

  12. Image-Based Structural Modeling of the Cardiac Purkinje Network

    Directory of Open Access Journals (Sweden)

    Benjamin R. Liu

    2015-01-01

    Full Text Available The Purkinje network is a specialized conduction system within the heart that ensures the proper activation of the ventricles to produce effective contraction. Its role during ventricular arrhythmias is less clear, but some experimental studies have suggested that the Purkinje network may significantly affect the genesis and maintenance of ventricular arrhythmias. Despite its importance, few structural models of the Purkinje network have been developed, primarily because current physical limitations prevent examination of the intact Purkinje network. In previous modeling efforts Purkinje-like structures have been developed through either automated or hand-drawn procedures, but these networks have been created according to general principles rather than based on real networks. To allow for greater realism in Purkinje structural models, we present a method for creating three-dimensional Purkinje networks based directly on imaging data. Our approach uses Purkinje network structures extracted from photographs of dissected ventricles and projects these flat networks onto realistic endocardial surfaces. Using this method, we create models for the combined ventricle-Purkinje system that can fully activate the ventricles through a stimulus delivered to the Purkinje network and can produce simulated activation sequences that match experimental observations. The combined models have the potential to help elucidate Purkinje network contributions during ventricular arrhythmias.

  13. Hepatitis B virus infection and replication in primarily cultured human fetal hepatocytes

    Institute of Scientific and Technical Information of China (English)

    Min Lin; Qun Chen; Li-Ye Yang; Wen-Yu Li; Xi-Biao Cao; Jiao-Ren Wu; You-Peng Peng; Mo-Rui Chen

    2007-01-01

    AIM: To investigate the infection and replication of hepatitis B virus (HBV) in primarily cultured human fetal hepatocytes (HFHs). METHODS: The human fetal hepatocytes were cultured in serum-free medium, and HBV-positive serum was added into the medium to study the susceptibility of the hepatocytes to HBV infection. The supernatant was collected daily for ELISA assay of HBsAg and HBeAg and for quantitative fluorescence PCR assay of HBV-DNA. Albumin and HBcAg, CK8 and CK18 expressions were detected by immunohistochemistry in the cultured hepatocytes. The content of lactate dehydrogenase (LDH) was measured to assess the integrity of the cell membrane. RESULTS: A stable hepatocyte culture system was established. HBV could infect the hepatocytes and replicate, and HBcAg expression could be detected by immunohistochemistry in hepatocyte-like cells. HBV-DNA in the supernatant could be detected from d 2 to d 18, and HBsAg and HBeAg were positive on d 3-d 18 after HBV infection. HBV in the medium increased from d 0 to d 6 and subsequently decreased as the cells progressively lost their hepatocyte phenotypes. CONCLUSION: HBV could infect human fetal hepatocytes and replicate. This in vitro model allows a detailed study of early events associated with human HBV entry into cells and subsequent replication.

  14. Electrical Compact Modeling of Graphene Base Transistors

    Directory of Open Access Journals (Sweden)

    Sébastien Frégonèse

    2015-11-01

    Full Text Available Following the recent development of the Graphene Base Transistor (GBT), a new electrical compact model for GBT devices is proposed. The transistor model includes the quantum capacitance model to obtain a self-consistent base potential. It also uses a versatile transfer current equation to be compatible with the different possible GBT configurations, and it accounts for high-injection conditions thanks to a transit-time based charge model. Finally, the developed large signal model has been implemented in Verilog-A code and can be used for simulation in a standard circuit design environment such as Cadence or ADS. This model has been verified using advanced numerical simulation.

  15. Hierarchical Geometric Constraint Model for Parametric Feature Based Modeling

    Institute of Scientific and Technical Information of China (English)

    高曙明; 彭群生

    1997-01-01

    A new geometric constraint model is described, which is hierarchical and suitable for parametric feature based modeling. In this model, different levels of geometric information are represented to support various stages of a design process. An efficient approach to parametric feature based modeling is also presented, adopting the high-level geometric constraint model. Low-level geometric models such as B-reps can be derived automatically from the high-level geometric constraint model, enabling designers to perform their task of detailed design.

  16. Floral biology of two Vanilloideae (Orchidaceae) primarily adapted to pollination by euglossine bees.

    Science.gov (United States)

    Pansarin, E R; Pansarin, L M

    2014-11-01

    Vanilloideae comprises 15 genera distributed worldwide, among which are Vanilla and Epistephium (tribe Vanilleae). Based on field and laboratory investigations, the pollination biology of V. dubia and E. sclerophyllum was analysed. The former was surveyed in a semi-deciduous mesophytic forest at the biological reserve of Serra do Japi and in a marshy forest at the city of Pradópolis, southeastern Brazil. The latter was examined in rocky outcrop vegetation in the Chapada Diamantina, northeastern Brazil. In the studied populations, the tubular flowers of V. dubia and E. sclerophyllum were pollinated by bees. Pollen was deposited on either their scutellum (V. dubia) or scutum (E. sclerophyllum). The mentum region of V. dubia is dry, whereas that of E. sclerophyllum presents a small quantity of dilute nectar. Flowers of E. sclerophyllum are scentless, while those of V. dubia are odoriferous. Although V. dubia is self-compatible, it needs a pollinator to produce fruit. In contrast, E. sclerophyllum sets fruit through spontaneous self-pollination, but biotic pollination also occurs. Both species are primarily adapted to pollination by euglossine bees. Pollination by Euglossina seems to have occurred at least twice during the evolution of Vanilleae. Furthermore, shifts between rewarding and reward-free flowers and between autogamous and allogamous species have been reported among vanillas. © 2014 German Botanical Society and The Royal Botanical Society of the Netherlands.

  17. The Culture Based Model: Constructing a Model of Culture

    Science.gov (United States)

    Young, Patricia A.

    2008-01-01

    Recent trends reveal that models of culture aid in mapping the design and analysis of information and communication technologies. Therefore, models of culture are powerful tools to guide the building of instructional products and services. This research examines the construction of the culture based model (CBM), a model of culture that evolved…

  18. Improving Agent Based Modeling of Critical Incidents

    Directory of Open Access Journals (Sweden)

    Robert Till

    2010-04-01

    Full Text Available Agent Based Modeling (ABM is a powerful method that has been used to simulate potential critical incidents in the infrastructure and built environments. This paper will discuss the modeling of some critical incidents currently simulated using ABM and how they may be expanded and improved by using better physiological modeling, psychological modeling, modeling the actions of interveners, introducing Geographic Information Systems (GIS and open source models.

  19. Testing Strategies for Model-Based Development

    Science.gov (United States)

    Heimdahl, Mats P. E.; Whalen, Mike; Rajan, Ajitha; Miller, Steven P.

    2006-01-01

    This report presents an approach for testing artifacts generated in a model-based development process. This approach divides the traditional testing process into two parts: requirements-based testing (validation testing) which determines whether the model implements the high-level requirements and model-based testing (conformance testing) which determines whether the code generated from a model is behaviorally equivalent to the model. The goals of the two processes differ significantly and this report explores suitable testing metrics and automation strategies for each. To support requirements-based testing, we define novel objective requirements coverage metrics similar to existing specification and code coverage metrics. For model-based testing, we briefly describe automation strategies and examine the fault-finding capability of different structural coverage metrics using tests automatically generated from the model.

  20. Model-Based Enterprise Summit Report

    Science.gov (United States)

    2014-02-01

    Models become much more efficient and effective when coupled with knowledge. [The remainder of this record consists only of slide/figure labels: design advisors, CAD, fit, machine motion, KanBan trigger, tolerance, geometry, kinematics, control, physics, planning system models.]

  1. Probabilistic Model-Based Safety Analysis

    CERN Document Server

    Güdemann, Matthias; 10.4204/EPTCS.28.8

    2010-01-01

    Model-based safety analysis approaches aim at finding critical failure combinations by analysis of models of the whole system (i.e. software, hardware, failure modes and environment). The advantage of these methods compared to traditional approaches is that the analysis of the whole system gives more precise results. Only a few model-based approaches have been applied to answer quantitative questions in safety analysis, often limited to analysis of specific failure propagation models, limited types of failure modes or without system dynamics and behavior, as direct quantitative analysis uses large amounts of computing resources. New achievements in the domain of (probabilistic) model-checking now allow for overcoming this problem. This paper shows how functional models based on synchronous parallel semantics, which can be used for system design, implementation and qualitative safety analysis, can be directly re-used for (model-based) quantitative safety analysis. Accurate modeling of different types of proba...

  2. A global parallel model based design of experiments method to minimize model output uncertainty.

    Science.gov (United States)

    Bazil, Jason N; Buzzard, Gregory T; Rundell, Ann E

    2012-03-01

    Model-based experiment design specifies the data to be collected that will most effectively characterize the biological system under study. Existing model-based design of experiment algorithms have primarily relied on Fisher Information Matrix-based methods to choose the best experiment in a sequential manner. However, these are largely local methods that require an initial estimate of the parameter values, which are often highly uncertain, particularly when data is limited. In this paper, we provide an approach to specify an informative sequence of multiple design points (parallel design) that will constrain the dynamical uncertainty of the biological system responses to within experimentally detectable limits as specified by the estimated experimental noise. The method is based upon computationally efficient sparse grids and requires only a bounded uncertain parameter space; it does not rely upon initial parameter estimates. The design sequence emerges through the use of scenario trees with experimental design points chosen to minimize the uncertainty in the predicted dynamics of the measurable responses of the system. The algorithm was illustrated herein using a T cell activation model for three problems that ranged in dimension from 2D to 19D. The results demonstrate that it is possible to extract useful information from a mathematical model where traditional model-based design of experiments approaches most certainly fail. The experiments designed via this method fully constrain the model output dynamics to within experimentally resolvable limits. The method is effective for highly uncertain biological systems characterized by deterministic mathematical models with limited data sets. Also, it is highly modular and can be modified to include a variety of methodologies such as input design and model discrimination.
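
    The paper's machinery (sparse grids and scenario trees) is not reproduced here; as a crude stand-in for the underlying idea, the sketch below samples a bounded parameter space and greedily picks the measurement times at which the predicted response spread (the dynamical uncertainty) is largest. The toy response function and all parameter names are assumptions for illustration only.

```python
import numpy as np

def response(theta, t):
    """Toy measurable response y(t; theta), standing in for the biological model."""
    k1, k2 = theta
    return np.exp(-k1 * t) - np.exp(-k2 * t)

def greedy_design(bounds, times, n_samples=500, n_points=3, seed=1):
    """Pick measurement times where the predicted output spread across the bounded
    parameter space is largest (a simplified stand-in for the paper's approach)."""
    rng = np.random.default_rng(seed)
    lows, highs = zip(*bounds)
    thetas = rng.uniform(lows, highs, size=(n_samples, len(bounds)))
    Y = np.array([[response(th, t) for t in times] for th in thetas])
    spread = Y.max(axis=0) - Y.min(axis=0)        # dynamical uncertainty per time point
    return times[np.argsort(-spread)[:n_points]]

times = np.linspace(0.1, 10.0, 100)
print(greedy_design(bounds=[(0.1, 1.0), (1.0, 5.0)], times=times))
```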

  3. A Behavior-Based Remote Trust Attestation Model

    Institute of Scientific and Technical Information of China (English)

    ZHANG Huanguo; WANG Fan

    2006-01-01

    While remote trust attestation is a useful concept to detect unauthorized changes to software, the current mechanism only ensures authenticity at the start of the operating system and cannot ensure the actions of running software. Our approach is to use a behavior-based monitoring agent to make remote attestation more flexible, dynamic, and trustworthy. This approach was mostly made possible by extensive use of process information which is readily available in Unix. We also made use of a behavior tree to effectively record predictable behaviors of each process. In this paper, we primarily focus on building a prototype implementation of such a framework, presenting one example built on it that successfully finds potential security risks in the runtime of an FTP program, and then evaluating the performance of this model.

  4. Dynamic modelling and analysis of biochemical networks: mechanism-based models and model-based experiments.

    Science.gov (United States)

    van Riel, Natal A W

    2006-12-01

    Systems biology applies quantitative, mechanistic modelling to study genetic networks, signal transduction pathways and metabolic networks. Mathematical models of biochemical networks can look very different. An important reason is that the purpose and application of a model are essential for the selection of the best mathematical framework. Fundamental aspects of selecting an appropriate modelling framework and a strategy for model building are discussed. Concepts and methods from system and control theory provide a sound basis for the further development of improved and dedicated computational tools for systems biology. Identification of the network components and rate constants that are most critical to the output behaviour of the system is one of the major problems raised in systems biology. Current approaches and methods of parameter sensitivity analysis and parameter estimation are reviewed. It is shown how these methods can be applied in the design of model-based experiments which iteratively yield models that are decreasingly wrong and increasingly gain predictive power.

  5. Firm Based Trade Models and Turkish Economy

    Directory of Open Access Journals (Sweden)

    Nilüfer ARGIN

    2015-12-01

    Full Text Available Among all international trade models, only Firm Based Trade Models explain firms' actions and behavior in world trade. Firm Based Trade Models focus on the trade behavior of individual firms that actually conduct intra-industry trade, and they can genuinely explain the globalization process. These approaches also include multinational cooperation, supply chains and outsourcing. Our paper aims to explain and analyze Turkish exports within the context of Firm Based Trade Models. We use UNCTAD data on exports by SITC Rev 3 categorization to explain total exports and 255 products and to calculate the intensive and extensive margins of Turkish firms.

  6. Traceability in Model-Based Testing

    Directory of Open Access Journals (Sweden)

    Mathew George

    2012-11-01

    Full Text Available The growing complexities of software and the demand for shorter time to market are two important challenges that face today’s IT industry. These challenges demand the increase of both productivity and quality of software. Model-based testing is a promising technique for meeting these challenges. Traceability modeling is a key issue and challenge in model-based testing. Relationships between the different models will help to navigate from one model to another, and trace back to the respective requirements and the design model when the test fails. In this paper, we present an approach for bridging the gaps between the different models in model-based testing. We propose relation definition markup language (RDML for defining the relationships between models.

  7. Lévy-based growth models

    DEFF Research Database (Denmark)

    Jónsdóttir, Kristjana Ýr; Schmiegel, Jürgen; Jensen, Eva Bjørn Vedel

    2008-01-01

    In the present paper, we give a condensed review, for the nonspecialist reader, of a new modelling framework for spatio-temporal processes, based on Lévy theory. We show the potential of the approach in stochastic geometry and spatial statistics by studying Lévy-based growth modelling of planar objects. The growth models considered are spatio-temporal stochastic processes on the circle. As a by-product, flexible new models for space–time covariance functions on the circle are provided. An application of the Lévy-based growth models to tumour growth is discussed.

  8. Base Flow Model Validation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation is the systematic "building-block" validation of CFD/turbulence models employing a GUI driven CFD code (RPFM) and existing as well as new data sets to...

  9. Distributed Prognostics Based on Structural Model Decomposition

    Data.gov (United States)

    National Aeronautics and Space Administration — Within systems health management, prognostics focuses on predicting the remaining useful life of a system. In the model-based prognostics paradigm, physics-based...

  10. Distributed hydrological models: comparison between TOPKAPI, a physically based model and TETIS, a conceptually based model

    Science.gov (United States)

    Ortiz, E.; Guna, V.

    2009-04-01

    The present work aims to carry out a comparison between two distributed hydrological models, the TOPKAPI (Ciarapica and Todini, 1998; Todini and Ciarapica, 2001) and TETIS (Vélez, J. J.; Vélez, J. I. and Francés, F., 2002) models, obtaining the hydrological solution computed on the basis of the same storm events. The first model is physically based and the second one is conceptually based. The analysis was performed on the 21.4 km2 Goodwin Creek watershed, located in Panola County, Mississippi. This watershed, extensively monitored by the Agricultural Research Service (ARS) National Sediment Laboratory (NSL), was chosen because it offers a complete database compiling precipitation (16 rain gauges), runoff (6 discharge stations) and GIS data. Three storm events were chosen to evaluate the performance of the two models: the first one was used to calibrate the models, and the other two to validate them. Both models produced a satisfactory hydrological response in both calibration and validation events. For the TOPKAPI model no real calibration was needed, owing to its very good performance with modal parameter values derived from watershed characteristics, whereas for the TETIS model a prior automatic calibration was necessary. This calibration was carried out using the observed hydrograph, in order to adjust the model's 9 correction factors. Keywords: TETIS, TOPKAPI, distributed models, hydrological response, ungauged basins.
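
    The record does not name its goodness-of-fit metric; the Nash–Sutcliffe efficiency is a common choice for judging simulated against observed hydrographs and is shown below purely as an illustration of how such a comparison can be scored (the hydrograph numbers are invented).

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means no better than
    predicting the observed mean. Shown as an assumed, not the paper's, metric."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

# Toy hydrographs (m3/s) for one storm event at a discharge station.
obs = np.array([0.5, 2.0, 6.5, 4.0, 2.2, 1.0])
sim_a = np.array([0.6, 2.3, 6.0, 4.2, 2.0, 1.1])   # e.g. a physically based run
sim_b = np.array([0.4, 1.8, 7.2, 3.6, 2.5, 0.9])   # e.g. a conceptually based run
print(nash_sutcliffe(obs, sim_a), nash_sutcliffe(obs, sim_b))
```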

  11. PCA-based lung motion model

    CERN Document Server

    Li, Ruijiang; Jia, Xun; Zhao, Tianyu; Lamb, James; Yang, Deshan; Low, Daniel A; Jiang, Steve B

    2010-01-01

    Organ motion induced by respiration may cause clinically significant targeting errors and greatly degrade the effectiveness of conformal radiotherapy. It is therefore crucial to be able to model respiratory motion accurately. A recently proposed lung motion model based on principal component analysis (PCA) has been shown to be promising on a few patients. However, there is still a need to understand the underlying reason why it works. In this paper, we present a deeper and more detailed analysis of the PCA-based lung motion model. We provide the theoretical justification of the effectiveness of PCA in modeling lung motion. We also prove that under certain conditions, the PCA motion model is equivalent to the 5D motion model, which is based on the physiology and anatomy of the lung. The modeling power of the PCA model was tested on clinical data, and the average 3D error was found to be below 1 mm.
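
    The core numerical step of a PCA motion model is simple enough to sketch: stack displacement vector fields from several respiratory phases, extract the leading principal components, and represent any breathing state by a few coefficients. The sketch below uses random stand-in data and is not the authors' pipeline.

```python
import numpy as np

def fit_pca_motion(dvfs, n_components=2):
    """dvfs: (n_phases, 3 * n_voxels) matrix of displacement vector fields.
    Returns the mean field and the leading principal components."""
    mean = dvfs.mean(axis=0)
    X = dvfs - mean
    _, _, Vt = np.linalg.svd(X, full_matrices=False)   # rows of Vt are the components
    return mean, Vt[:n_components]

def reconstruct(mean, components, coeffs):
    """Motion field for one breathing state from a few PCA coefficients."""
    return mean + coeffs @ components

rng = np.random.default_rng(0)
dvfs = rng.normal(size=(10, 3 * 500))                  # 10 phases, 500 voxels (toy data)
mean, comps = fit_pca_motion(dvfs)
print(reconstruct(mean, comps, np.array([1.0, -0.5])).shape)
```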

  12. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae

    2008-01-01

    Model based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model based design involves the use of multiple models that represent different views of a system, having different semantics ... constructs and process flow, then software code is generated. A Simulink model is a representation of the design or implementation of a physical system that satisfies a set of requirements. A software component-based system aims to organize system architecture and behaviour as a means of computation ... constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems) is such a component-based system framework developed by the software engineering group of the Mads Clausen Institute for Product Innovation (MCI), University of Southern Denmark. Once specified, the software model has ...

  13. A Multiple Model Approach to Modeling Based on LPF Algorithm

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Input-output data fitting methods are often used for unknown-structure nonlinear system modeling. Based on model-on-demand tactics, a multiple model approach to modeling for nonlinear systems is presented. The basic idea is to find, from vast historical system input-output data sets, some data sets matching the current working point, and then to develop a local model using the Local Polynomial Fitting (LPF) algorithm. As the working points change, multiple local models are built, which realize accurate modeling of the global system. Compared with other methods, the simulation results show good performance in terms of simple, effective and reliable estimation.

  14. Pituitary tumor disappearance in a patient with newly diagnosed acromegaly primarily treated with octreotide LAR.

    Science.gov (United States)

    Resmini, E; Murialdo, G; Giusti, M; Boschetti, M; Minuto, F; Ferone, D

    2005-02-01

    We describe the case of an acromegalic patient primarily treated with octreotide LAR in whom the pituitary tumor disappeared after 18 months of treatment. A 68-yr-old woman, with clinical suspicion of acromegaly, was admitted to our Unit with ultrasonographical evidence of cardiac hypertrophy, arrhythmias, right branch block and interatrial septum aneurysm. She reported enlargement of the hands and feet since the age of 30 and facial disfigurements since the age of 50. At the age of 45 she underwent surgery for carpal tunnel syndrome, and at the age of 61 a euthyroid nodular goiter was diagnosed. Hormonal evaluation showed elevated circulating GH levels (25+/-3.2 ng/ml), not suppressible after oral glucose load, and elevated IGF-I levels (646 ng/ml), whereas the remaining pituitary function was normal. Visual perimetry was normal, whereas magnetic resonance imaging (MRI) showed an intrasellar pituitary adenoma with maximal diameter of 9 mm. In order to improve cardiovascular function before surgery, the patient started octreotide LAR 20 mg every 4 weeks for 3 months. Then, based on IGF-I values, the dose was adjusted to 30 mg. After 6 months a second MRI showed significant tumor reduction (>50% of baseline maximal diameter), GH and IGF-I were within the normal range, and the patient continued the treatment. After one year of therapy, an improvement of cardiac alterations was recorded and the patient was referred to the neurosurgeon. However, she refused the operation. At 18-month follow-up, MRI showed the complete disappearance of direct and indirect signs of pituitary adenoma. To our knowledge, this is the first case of complete radiological remission of a pituitary tumor during octreotide LAR treatment in acromegaly.

  15. Behavior and Design Intent Based Product Modeling

    Directory of Open Access Journals (Sweden)

    László Horváth

    2004-11-01

    Full Text Available A knowledge based modeling of mechanical products is presented for industrial CAD/CAM systems. An active model is proposed that comprises knowledge from modeling procedures, generic part models and engineers. Present-day models of mechanical systems do not contain data about the background of human decisions. This situation motivated the authors' investigations on exchanging design intent information between engineers. Their concept was to extend product models so that they are capable of describing design intent information. Several human-computer and human-human communication issues were considered. The complex communication problem has been divided into four sub-problems, namely communication of the human intent source with the computer system, representation of human intent, exchange of intent data between modeling procedures, and communication of the represented intent with humans. The paper discusses the scenario of intelligent modeling based engineering. Then key concepts for the application of computational intelligence in computer model based engineering systems are detailed, including knowledge driven models as well as areas of their application. Next, behavior based models with intelligent content involving specifications and knowledge for the design processes are emphasized, an active part modeling is proposed, and possibilities for its application are outlined. Finally, design intent supported intelligent modeling is discussed.

  16. Model Validation in Ontology Based Transformations

    Directory of Open Access Journals (Sweden)

    Jesús M. Almendros-Jiménez

    2012-10-01

    Full Text Available Model Driven Engineering (MDE is an emerging approach of software engineering. MDE emphasizes the construction of models from which the implementation should be derived by applying model transformations. The Ontology Definition Meta-model (ODM has been proposed as a profile for UML models of the Web Ontology Language (OWL. In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling permits to give a syntactic structure to source and target models. However, semantic requirements have to be imposed on source and target models. A given transformation will be sound when source and target models fulfill the syntactic and semantic requirements. In this paper, we present an approach for model validation in ODM based transformations. Adopting a logic programming based transformational approach we will show how it is possible to transform and validate models. Properties to be validated range from structural and semantic requirements of models (pre and post conditions to properties of the transformation (invariants. The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER to Relational Model (RM transformation.

  17. Modeling the interdependent network based on two-mode networks

    Science.gov (United States)

    An, Feng; Gao, Xiangyun; Guan, Jianhe; Huang, Shupei; Liu, Qian

    2017-10-01

    Among heterogeneous networks there exist obvious and close interdependent linkages. Unlike existing research, which primarily focuses on theoretical physical interdependent network models, we propose a two-layer interdependent network model based on two-mode networks to explore interdependent features in reality. Specifically, we construct a two-layer interdependent loan network and develop several dependence feature indices. The model is verified to capture the loan dependence features of listed companies based on loan behaviors and shared shareholders. Taking the Chinese debit and credit market as a case study, the main conclusions are: (1) only a few listed companies shoulder the main capital transmission (20% of listed companies account for almost 70% of the dependence degree); (2) control of these key listed companies would be more effective in avoiding the spreading of financial risks; (3) identifying the companies with high betweenness centrality and controlling them could help monitor the spreading of financial risk; (4) the capital transmission channels between Chinese financial listed companies and Chinese non-financial listed companies are relatively strong. However, under greater pressure on capital transmission (70% of edges failed), the transmission channel constructed by debit and credit behavior will eventually collapse.
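
    The construction step the abstract describes, building company-company dependence layers from two-mode (company-bank loan, company-shareholder) data, can be sketched as a simple bipartite projection whose edge weights count shared counterparties. The data and names below are hypothetical, and the projection is only one plausible reading of "dependent features indices".

```python
from collections import defaultdict
from itertools import combinations

def project_two_mode(edges):
    """Project a two-mode network of (company, counterparty) pairs onto companies:
    two companies are linked with a weight equal to the number of counterparties
    (banks or shareholders) they share."""
    members = defaultdict(set)
    for company, counterparty in edges:
        members[counterparty].add(company)
    weights = defaultdict(int)
    for group in members.values():
        for a, b in combinations(sorted(group), 2):
            weights[(a, b)] += 1
    return dict(weights)

loan_edges = [("FirmA", "Bank1"), ("FirmB", "Bank1"), ("FirmB", "Bank2"), ("FirmC", "Bank2")]
share_edges = [("FirmA", "HolderX"), ("FirmC", "HolderX")]
loan_layer = project_two_mode(loan_edges)      # layer built from loan behaviour
share_layer = project_two_mode(share_edges)    # layer built from shared shareholders
print(loan_layer, share_layer)
```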

  18. Comparing repetition-based melody segmentation models

    NARCIS (Netherlands)

    Rodríguez López, M.E.; de Haas, Bas; Volk, Anja

    2014-01-01

    This paper reports on a comparative study of computational melody segmentation models based on repetition detection. For the comparison we implemented five repetition-based segmentation models, and subsequently evaluated their capacity to automatically find melodic phrase boundaries in a corpus of 2

  19. A Role-Based Fuzzy Assignment Model

    Institute of Scientific and Technical Information of China (English)

    ZUO Bao-he; FENG Shan

    2002-01-01

    It is very important to dynamically assign tasks to the corresponding actors in a workflow management system, especially in complex applications, as this improves the flexibility of workflow systems. In this paper, a role-based workflow model with fuzzy optimized intelligent assignment is proposed and applied in an investment management system. A groupware-based software model is also proposed.

  20. Key-Based Data Model

    Science.gov (United States)

    1994-05-16

    progressive repetition. It is used, principally, to train small units to perform tasks requiring a high degree of teamwork, such as fire and maneuver actions in ... an administrative structure that has a mission. An established need based on a valid deficiency in an administrative structure with a mission. Person A

  1. Phylogenetic invariants for group-based models

    CERN Document Server

    Donten-Bury, Maria

    2010-01-01

    In this paper we investigate properties of algebraic varieties representing group-based phylogenetic models. We give the (first) example of a nonnormal general group-based model for an abelian group. Following Kaie Kubjas we also determine some invariants of group-based models, showing that the associated varieties do not have to be deformation equivalent. We propose a method of generating many phylogenetic invariants, and in particular we show that our approach gives the whole ideal of the claw tree for the 3-Kimura model under the assumption of the conjecture of Sturmfels and Sullivant. This, combined with the results of Sturmfels and Sullivant, would make it possible to determine all phylogenetic invariants for any tree for the 3-Kimura model and possibly for other group-based models.

  2. Model-based internal wave processing

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J.V.; Chambers, D.H.

    1995-06-09

    A model-based approach is proposed to solve the oceanic internal wave signal processing problem that is based on state-space representations of the normal-mode vertical velocity and plane wave horizontal velocity propagation models. It is shown that these representations can be utilized to spatially propagate the modal (depth) vertical velocity functions given the basic parameters (wave numbers, Brunt-Vaisala frequency profile, etc.) developed from the solution of the associated boundary value problem, as well as the horizontal velocity components. Based on this framework, investigations are made of model-based solutions to the signal enhancement problem for internal waves.

  3. Agent-based modeling of sustainable behaviors

    CERN Document Server

    Sánchez-Maroño, Noelia; Fontenla-Romero, Oscar; Polhill, J; Craig, Tony; Bajo, Javier; Corchado, Juan

    2017-01-01

    Using the O.D.D. (Overview, Design concepts, Detail) protocol, this title explores the role of agent-based modeling in predicting the feasibility of various approaches to sustainability. The chapters incorporated in this volume consist of real case studies to illustrate the utility of agent-based modeling and complexity theory in discovering a path to more efficient and sustainable lifestyles. The topics covered within include: households' attitudes toward recycling, designing decision trees for representing sustainable behaviors, negotiation-based parking allocation, auction-based traffic signal control, and others. This selection of papers will be of interest to social scientists who wish to learn more about agent-based modeling as well as experts in the field of agent-based modeling.

  4. IP Network Management Model Based on NGOSS

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jin-yu; LI Hong-hui; LIU Feng

    2004-01-01

    This paper addresses a management model for IP networks based on the Next Generation Operation Support System (NGOSS). It bases network management on all the operational actions of the ISP and provides QoS to user services by offering end-to-end Service Level Agreement (SLA) management along the whole path. Based on web and coordination technology, the paper gives an implementation architecture for this model.

  5. Community-Based Participatory Research Conceptual Model: Community Partner Consultation and Face Validity.

    Science.gov (United States)

    Belone, Lorenda; Lucero, Julie E; Duran, Bonnie; Tafoya, Greg; Baker, Elizabeth A; Chan, Domin; Chang, Charlotte; Greene-Moton, Ella; Kelley, Michele A; Wallerstein, Nina

    2016-01-01

    A national community-based participatory research (CBPR) team developed a conceptual model of CBPR partnerships to understand the contribution of partnership processes to improved community capacity and health outcomes. With the model primarily developed through academic literature and expert consensus building, we sought community input to assess face validity and acceptability. Our research team conducted semi-structured focus groups with six partnerships nationwide. Participants validated and expanded on existing model constructs and identified new constructs based on "real-world" praxis, resulting in a revised model. Four cross-cutting constructs were identified: trust development, capacity, mutual learning, and power dynamics. By empirically testing the model, we found community face validity and capacity to adapt the model to diverse contexts. We recommend partnerships use and adapt the CBPR model and its constructs, for collective reflection and evaluation, to enhance their partnering practices and achieve their health and research goals. © The Author(s) 2014.

  6. Identity-Related Influences on the Success of Minority Workers in Primarily Nonminority Organizations.

    Science.gov (United States)

    James, Keith; Khoo, Gillian

    1991-01-01

    Reviews literature at the micro- (individual, interpersonal, and small group) and macro- (organizational, societal, and cultural) levels relating to the experiences and outcomes of minorities in work settings populated primarily by members of the majority. Uses Tajfel and Turner's Social Identity Theory as an organizational and integrative…

  7. 29 CFR 780.607 - “Primarily employed” in agriculture.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 3 2010-07-01 2010-07-01 false "Primarily employed" in agriculture. 780.607 Section 780... AGRICULTURE, PROCESSING OF AGRICULTURAL COMMODITIES, AND RELATED SUBJECTS UNDER THE FAIR LABOR STANDARDS ACT Employment in Agriculture and Livestock Auction Operations Under the Section 13(b)(13) Exemption Requirements...

  8. Propulsion and control propellers with thruster nozzles primarily for aircraft applications

    Science.gov (United States)

    Pabst, W.

    1986-01-01

    A propulsion and control propeller with thruster nozzles, primarily for aircraft applications, is described. Adjustability of the rotor blades at the hub and pressurized gas expulsion combined with an air propeller increase power. Both characteristics are combined in one simple device which, furthermore, incorporates overall aircraft control so that mechanisms governing lateral and horizontal movement become superfluous.

  9. Examining the Effects of Introducing Online Access to ACS Journals at Primarily Undergraduate Institutions

    Science.gov (United States)

    Landolt, R. G.

    2007-01-01

    In collaboration with the Publications Division of the American Chemical Society (ACS), students and faculty at 24 primarily undergraduate institutions were provided online access to ACS primary research journals for a period of 18 months, and a group of eight schools were granted access to use the archives of ACS journals for a year. Resources…

  10. CEAI: CCM based Email Authorship Identification Model

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah

    2013-01-01

    In this paper we present a model for email authorship identification (EAI) by employing a Cluster-based Classification (CCM) technique. Traditionally, stylometric features have been successfully employed in various authorship analysis tasks; we extend the traditional feature-set to include some more interesting and effective features for email authorship identification (e.g. the last punctuation mark used in an email, the tendency of an author to use capitalization at the start of an email, or the punctuation after a greeting or farewell). We also included Info Gain feature selection based ... reveal that the proposed CCM-based email authorship identification model, along with the proposed feature set, outperforms the state-of-the-art support vector machine (SVM)-based models, as well as the models proposed by Iqbal et al. [1, 2]. The proposed model attains an accuracy rate of 94% for 10...
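
    The exact CCM feature set is not given in the record; the sketch below shows, as hypothetical examples, how a few stylometric features of the kind the abstract mentions (last punctuation mark, capitalization at the start, a greeting check) could be extracted from an email body before classification.

```python
import re
import string

def email_style_features(body: str) -> dict:
    """A few illustrative stylometric features of the kind mentioned in the abstract
    (not the paper's full feature set)."""
    stripped = body.strip()
    punct = [c for c in stripped if c in string.punctuation]
    words = re.findall(r"[A-Za-z']+", stripped)
    return {
        "last_punctuation": punct[-1] if punct else "",
        "starts_capitalized": bool(stripped) and stripped[0].isupper(),
        "greeting_used": bool(re.match(r"^(hi|hello|dear)\b", stripped, re.IGNORECASE)),
        "avg_word_length": sum(map(len, words)) / len(words) if words else 0.0,
    }

print(email_style_features("Hi team, the report is attached. Thanks!"))
```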

  11. Agent-based modeling and network dynamics

    CERN Document Server

    Namatame, Akira

    2016-01-01

    The book integrates agent-based modeling and network science. It is divided into three parts, namely, foundations, primary dynamics on and of social networks, and applications. The book begins with the network origin of agent-based models, known as cellular automata, and introduces a number of classic models, such as Schelling’s segregation model and Axelrod’s spatial game. The essence of the foundation part is the network-based agent-based models in which agents follow network-based decision rules. Under the influence of the substantial progress in network science in the late 1990s, these models have been extended from using lattices to using small-world networks, scale-free networks, etc. The book also shows that modern network science, mainly driven by game-theorists and sociophysicists, has inspired agent-based social scientists to develop alternative formation algorithms, known as agent-based social networks. The book reviews a number of pioneering and representative models in this family. Upon the gi...

  12. Simulation-based Manufacturing System Modeling

    Institute of Scientific and Technical Information of China (English)

    卫东; 金烨; 范秀敏; 严隽琪

    2003-01-01

    In recent years, computer simulation has proven to be a very advantageous technique for researching resource-constrained manufacturing systems. This paper presents an object-oriented simulation modeling method which combines the merits of traditional methods such as IDEF0 and Petri Net. A four-layer, one-angle hierarchical modeling framework based on OOP is defined, and the modeling description of these layers, such as hybrid production control modeling and human resource dispatch modeling, is expounded. To validate the modeling method, a case study of an auto-product line in a motor manufacturing company has been carried out.

  13. Model-Based Motion Tracking of Infants

    DEFF Research Database (Denmark)

    Olsen, Mikkel Damgaard; Herskind, Anna; Nielsen, Jens Bo;

    2014-01-01

    Even though motion tracking is a widely used technique to analyze and measure human movements, only a few studies focus on motion tracking of infants. In recent years, a number of studies have emerged focusing on analyzing the motion pattern of infants, using computer vision. Most of these studies are based on 2D images, but few are based on 3D information. In this paper, we present a model-based approach for tracking infants in 3D. The study extends a novel study on graph-based motion tracking of infants and we show that the extension improves the tracking results. A 3D model is constructed that resembles the body surface of an infant, where the model is based on simple geometric shapes and a hierarchical skeleton model.

  14. Energy based prediction models for building acoustics

    DEFF Research Database (Denmark)

    Brunskog, Jonas

    2012-01-01

    In order to reach robust and simplified yet accurate prediction models, energy based principles are commonly used in many fields of acoustics, especially in building acoustics. This includes simple energy flow models and the framework of statistical energy analysis (SEA), as well as more elaborate principles such as wave intensity analysis (WIA). The European standards for building acoustic predictions, the EN 12354 series, are based on energy flow and SEA principles. In the present paper, different energy based prediction models are discussed and critically reviewed. Special attention is placed...

  15. Probabilistic Model-Based Background Subtraction

    DEFF Research Database (Denmark)

    Krüger, Volker; Andersen, Jakob; Prehn, Thomas

    2005-01-01

    ... is the correlation between pixels. In this paper we introduce a model-based background subtraction approach which facilitates prior knowledge of pixel correlations for clearer and better results. Model knowledge is learned from good training video data and stored for fast access in a hierarchical manner. Bayesian propagation over time is used for proper model selection and tracking during model-based background subtraction. Bayes propagation is attractive in our application as it allows us to deal with uncertainties during tracking. We have tested our approach on suitable outdoor video data.
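
    For orientation only, the sketch below shows the kind of per-pixel running Gaussian background model that such work typically builds on; the record's contribution (pixel-correlation priors and Bayesian model selection) is not implemented here, and all thresholds are invented.

```python
import numpy as np

class RunningGaussianBackground:
    """Per-pixel running Gaussian background model (a common baseline; the paper
    adds model-based knowledge of pixel correlations on top of this kind of scheme)."""
    def __init__(self, first_frame, alpha=0.05, k=2.5):
        self.mean = first_frame.astype(float)
        self.var = np.full_like(self.mean, 25.0)
        self.alpha, self.k = alpha, k

    def apply(self, frame):
        frame = frame.astype(float)
        diff = frame - self.mean
        foreground = diff**2 > (self.k**2) * self.var
        upd = ~foreground                      # update only background-looking pixels
        self.mean[upd] += self.alpha * diff[upd]
        self.var[upd] += self.alpha * (diff[upd]**2 - self.var[upd])
        return foreground

rng = np.random.default_rng(0)
bg = RunningGaussianBackground(rng.integers(0, 255, (4, 4)))
print(bg.apply(rng.integers(0, 255, (4, 4))).astype(int))
```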

  16. Software Testing Method Based on Model Comparison

    Institute of Scientific and Technical Information of China (English)

    XIE Xiao-dong; LU Yan-sheng; MAO Cheng-yin

    2008-01-01

    A model comparison based software testing method (MCST) is proposed. In this method, the requirements and programs of the software under test are transformed into the same form and described by the same model description language (MDL). Then, the requirements are transformed into a specification model and the programs into an implementation model. Thus, the elements and structures of the two models are compared, and the differences between them are obtained. Based on the differences, a test suite is generated. Different MDLs can be chosen for the software under test. The usage of two classical MDLs in MCST, the equivalence classes model and the extended finite state machine (EFSM) model, is described with example applications. The results show that the test suites generated by MCST are smaller and more efficient than those of some other testing methods, such as the path-coverage testing method, the object state diagram testing method, etc.

  17. Instance-Based Generative Biological Shape Modeling.

    Science.gov (United States)

    Peng, Tao; Wang, Wei; Rohde, Gustavo K; Murphy, Robert F

    2009-01-01

    Biological shape modeling is an essential task that is required for systems biology efforts to simulate complex cell behaviors. Statistical learning methods have been used to build generative shape models based on reconstructive shape parameters extracted from microscope image collections. However, such parametric modeling approaches are usually limited to simple shapes and easily-modeled parameter distributions. Moreover, to maximize the reconstruction accuracy, significant effort is required to design models for specific datasets or patterns. We have therefore developed an instance-based approach to model biological shapes within a shape space built upon diffeomorphic measurement. We also designed a recursive interpolation algorithm to probabilistically synthesize new shape instances using the shape space model and the original instances. The method is quite generalizable and therefore can be applied to most nuclear, cell and protein object shapes, in both 2D and 3D.
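
    The paper synthesizes new shapes recursively in a diffeomorphic shape space; as a much simpler stand-in for the instance-based idea, the sketch below just interpolates between two stored, aligned shape outlines with a random weight. The shapes and the linear interpolation are illustrative assumptions, not the paper's method.

```python
import numpy as np

def synthesize_shape(instances, rng=None):
    """Instance-based synthesis (toy version): pick two stored shape instances, given
    as aligned (n_points, 2) outlines, and interpolate between them with a random
    weight. The paper does this recursively in a diffeomorphic shape space."""
    rng = rng or np.random.default_rng()
    i, j = rng.choice(len(instances), size=2, replace=False)
    w = rng.uniform()
    return w * instances[i] + (1.0 - w) * instances[j]

theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
circle = np.c_[np.cos(theta), np.sin(theta)]
ellipse = np.c_[1.5 * np.cos(theta), 0.7 * np.sin(theta)]
new_shape = synthesize_shape([circle, ellipse], np.random.default_rng(3))
print(new_shape.shape)
```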

  18. Distributed Prognostics based on Structural Model Decomposition

    Science.gov (United States)

    Daigle, Matthew J.; Bregon, Anibal; Roychoudhury, I.

    2014-01-01

    Within systems health management, prognostics focuses on predicting the remaining useful life of a system. In the model-based prognostics paradigm, physics-based models are constructed that describe the operation of a system and how it fails. Such approaches consist of an estimation phase, in which the health state of the system is first identified, and a prediction phase, in which the health state is projected forward in time to determine the end of life. Centralized solutions to these problems are often computationally expensive, do not scale well as the size of the system grows, and introduce a single point of failure. In this paper, we propose a novel distributed model-based prognostics scheme that formally describes how to decompose both the estimation and prediction problems into independent local subproblems whose solutions may be easily composed into a global solution. The decomposition of the prognostics problem is achieved through structural decomposition of the underlying models. The decomposition algorithm creates from the global system model a set of local submodels suitable for prognostics. Independent local estimation and prediction problems are formed based on these local submodels, resulting in a scalable distributed prognostics approach that allows the local subproblems to be solved in parallel, thus offering increases in computational efficiency. Using a centrifugal pump as a case study, we perform a number of simulation-based experiments to demonstrate the distributed approach, compare the performance with a centralized approach, and establish its scalability. Index Terms: model-based prognostics, distributed prognostics, structural model decomposition.

  19. Sketch-based Interfaces and Modeling

    CERN Document Server

    Jorge, Joaquim

    2011-01-01

    The field of sketch-based interfaces and modeling (SBIM) is concerned with developing methods and techniques to enable users to interact with a computer through sketching - a simple, yet highly expressive medium. SBIM blends concepts from computer graphics, human-computer interaction, artificial intelligence, and machine learning. Recent improvements in hardware, coupled with new machine learning techniques for more accurate recognition, and more robust depth inferencing techniques for sketch-based modeling, have resulted in an explosion of both sketch-based interfaces and pen-based computing

  20. Multiscale agent-based consumer market modeling.

    Energy Technology Data Exchange (ETDEWEB)

    North, M. J.; Macal, C. M.; St. Aubin, J.; Thimmapuram, P.; Bragen, M.; Hahn, J.; Karr, J.; Brigham, N.; Lacy, M. E.; Hampton, D.; Decision and Information Sciences; Procter & Gamble Co.

    2010-05-01

    Consumer markets have been studied in great depth, and many techniques have been used to represent them. These have included regression-based models, logit models, and theoretical market-level models, such as the NBD-Dirichlet approach. Although many important contributions and insights have resulted from studies that relied on these models, there is still a need for a model that could more holistically represent the interdependencies of the decisions made by consumers, retailers, and manufacturers. When the need is for a model that could be used repeatedly over time to support decisions in an industrial setting, it is particularly critical. Although some existing methods can, in principle, represent such complex interdependencies, their capabilities might be outstripped if they had to be used for industrial applications, because of the details this type of modeling requires. However, a complementary method - agent-based modeling - shows promise for addressing these issues. Agent-based models use business-driven rules for individuals (e.g., individual consumer rules for buying items, individual retailer rules for stocking items, or individual firm rules for advertizing items) to determine holistic, system-level outcomes (e.g., to determine if brand X's market share is increasing). We applied agent-based modeling to develop a multi-scale consumer market model. We then conducted calibration, verification, and validation tests of this model. The model was successfully applied by Procter & Gamble to several challenging business problems. In these situations, it directly influenced managerial decision making and produced substantial cost savings.

  1. PDF-based heterogeneous multiscale filtration model.

    Science.gov (United States)

    Gong, Jian; Rutland, Christopher J

    2015-04-21

    Motivated by modeling of gasoline particulate filters (GPFs), a probability density function (PDF) based heterogeneous multiscale filtration (HMF) model is developed to calculate filtration efficiency of clean particulate filters. A new methodology based on statistical theory and classic filtration theory is developed in the HMF model. Based on the analysis of experimental porosimetry data, a pore size probability density function is introduced to represent heterogeneity and multiscale characteristics of the porous wall. The filtration efficiency of a filter can be calculated as the sum of the contributions of individual collectors. The resulting HMF model overcomes the limitations of classic mean filtration models which rely on tuning of the mean collector size. Sensitivity analysis shows that the HMF model recovers the classical mean model when the pore size variance is very small. The HMF model is validated by fundamental filtration experimental data from different scales of filter samples. The model shows a good agreement with experimental data at various operating conditions. The effects of the microstructure of filters on filtration efficiency as well as the most penetrating particle size are correctly predicted by the model.
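
    The headline idea, overall filtration efficiency as a pore-size-PDF-weighted combination of individual collector efficiencies, can be sketched numerically as below. The lognormal pore-size PDF and the exponential unit-collector expression are generic stand-ins chosen for illustration, not necessarily the paper's exact forms.

```python
import numpy as np

def overall_efficiency(pore_d, pdf, unit_eff):
    """Overall filtration efficiency as the pore-size-PDF-weighted average of the
    efficiencies of individual collectors (the HMF idea in its simplest form)."""
    w = pdf / pdf.sum()                        # discretised pore-size PDF
    return float(np.sum(w * unit_eff(pore_d)))

# Assumed lognormal pore-size distribution (microns) and a made-up unit-collector
# efficiency in which larger pores are penetrated more easily.
d = np.linspace(5, 40, 200)
mu, sigma = np.log(15.0), 0.35
pdf = np.exp(-(np.log(d) - mu) ** 2 / (2 * sigma**2)) / (d * sigma * np.sqrt(2 * np.pi))
unit_eff = lambda dp: 1.0 - np.exp(-200.0 / dp**1.5)
print(overall_efficiency(d, pdf, unit_eff))
```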

  2. Workflow-Based Dynamic Enterprise Modeling

    Institute of Scientific and Technical Information of China (English)

    黄双喜; 范玉顺; 罗海滨; 林慧萍

    2002-01-01

    Traditional systems for enterprise modeling and business process control are often static and cannot adapt to the changing environment. This paper presents a workflow-based method to dynamically execute the enterprise model. This method gives an explicit representation of the business process logic and the relationships between the elements involved in the process. An execution-oriented integrated enterprise modeling system is proposed in combination with other enterprise views. The enterprise model can be established and executed dynamically in the actual environment due to the dynamic properties of the workflow model.

  3. Bayesian Network Based XP Process Modelling

    Directory of Open Access Journals (Sweden)

    Mohamed Abouelela

    2010-07-01

    Full Text Available A Bayesian Network based mathematical model has been used for modelling Extreme Programming software development process. The model is capable of predicting the expected finish time and the expected defect rate for each XP release. Therefore, it can be used to determine the success/failure of any XP Project. The model takes into account the effect of three XP practices, namely: Pair Programming, Test Driven Development and Onsite Customer practices. The model's predictions were validated against two case studies. Results show the precision of our model especially in predicting the project finish time.

  4. Model Based Testing for Agent Systems

    Science.gov (United States)

    Zhang, Zhiyong; Thangarajah, John; Padgham, Lin

    Although agent technology is gaining worldwide popularity, a hindrance to its uptake is the lack of proper testing mechanisms for agent based systems. While many traditional software testing methods can be generalized to agent systems, there are many aspects that are different and which require an understanding of the underlying agent paradigm. In this paper we present certain aspects of a testing framework that we have developed for agent based systems. The testing framework is a model based approach using the design models of the Prometheus agent development methodology. In this paper we focus on model based unit testing and identify the appropriate units, present mechanisms for generating suitable test cases and for determining the order in which the units are to be tested, present a brief overview of the unit testing process and an example. Although we use the design artefacts from Prometheus the approach is suitable for any plan and event based agent system.

  5. Grey-theory based intrusion detection model

    Institute of Scientific and Technical Information of China (English)

    Qin Boping; Zhou Xianwei; Yang Jun; Song Cunyi

    2006-01-01

    To solve the problem that current intrusion detection models need large-scale data when formulating the model for real-time use, an intrusion detection system model based on grey theory (GTIDS) is presented. Grey theory has the merits of fewer requirements on the original data scale, less limitation on the distribution pattern and a simpler modeling algorithm. With these merits, GTIDS constructs its model from a partial time sequence for rapid detection of intrusive acts in a secure system. In this detection model the rates of false drop and false retrieval are effectively reduced through twice modeling and repeated detection of the target data. Furthermore, the GTIDS framework and the specific modeling algorithm are presented. The effectiveness of GTIDS is demonstrated through emulated experiments comparing it with Snort and the next-generation intrusion detection expert system (NIDES) of SRI International.
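
    For readers unfamiliar with grey theory, the basic building block the abstract alludes to is the GM(1,1) model, which fits a first-order grey differential equation to a short data sequence without distributional assumptions. The sketch below shows only that fitting and forecasting step on made-up connection counts; it is not the GTIDS detection pipeline itself.

      # Minimal GM(1,1) grey forecasting sketch: accumulate the sequence,
      # fit the grey coefficients by least squares, and extend the series.
      import numpy as np

      def gm11_forecast(x0, steps=3):
          """Fit GM(1,1) to a short sequence x0 and extend it by `steps` points."""
          x0 = np.asarray(x0, dtype=float)
          x1 = np.cumsum(x0)                                # accumulating generation
          z1 = 0.5 * (x1[1:] + x1[:-1])                     # background values
          B = np.column_stack((-z1, np.ones_like(z1)))
          a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]  # grey parameters
          k = np.arange(len(x0) + steps)
          x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a # whitened solution
          x0_hat = np.empty_like(x1_hat)
          x0_hat[0] = x0[0]
          x0_hat[1:] = np.diff(x1_hat)                      # inverse accumulation
          return x0_hat

      counts = [132, 141, 150, 158, 169, 177]   # hypothetical per-minute counts
      print(gm11_forecast(counts, steps=2).round(1))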

  6. Springer handbook of model-based science

    CERN Document Server

    Bertolotti, Tommaso

    2017-01-01

    The handbook offers the first comprehensive reference guide to the interdisciplinary field of model-based reasoning. It highlights the role of models as mediators between theory and experimentation, and as educational devices, as well as their relevance in testing hypotheses and explanatory functions. The Springer Handbook merges philosophical, cognitive and epistemological perspectives on models with the more practical needs related to the application of this tool across various disciplines and practices. The result is a unique, reliable source of information that guides readers toward an understanding of different aspects of model-based science, such as the theoretical and cognitive nature of models, as well as their practical and logical aspects. The inferential role of models in hypothetical reasoning, abduction and creativity once they are constructed, adopted, and manipulated for different scientific and technological purposes is also discussed. Written by a group of internationally renowned experts in ...

  7. Econophysics of agent-based models

    CERN Document Server

    Aoyama, Hideaki; Chakrabarti, Bikas; Chakraborti, Anirban; Ghosh, Asim

    2014-01-01

    The primary goal of this book is to present the research findings and conclusions of physicists, economists, mathematicians and financial engineers working in the field of "Econophysics" who have undertaken agent-based modelling, comparison with empirical studies and related investigations. Most standard economic models assume the existence of the representative agent, who is “perfectly rational” and applies the utility maximization principle when taking action. One reason for this is the desire to keep models mathematically tractable: no tools are available to economists for solving non-linear models of heterogeneous adaptive agents without explicit optimization. In contrast, multi-agent models, which originated from statistical physics considerations, allow us to go beyond the prototype theories of traditional economics involving the representative agent. This book is based on the Econophys-Kolkata VII Workshop, at which many such modelling efforts were presented. In the book, leading researchers in the...

  8. Organ-sparing treatment for primarily disseminated breast cancer with metachronous bilateral involvement

    Directory of Open Access Journals (Sweden)

    Yu. A. Ragulin

    2016-01-01

    Full Text Available Systemic drug therapy remains first-line treatment for primarily disseminated breast cancer (BC). The problem of using local methods to treat BC patients with distant metastases has not been fully solved. Most investigations in the modern literature suggest that patients show significantly better survival after adjuvant systemic therapy combined with local treatment of the primary tumor, the main goal of which is its local control. At the same time, the choice of optimal treatments and their sequence and combination remain to be explored. The paper describes a case of successful organ-sparing chemoradiation therapy for primarily disseminated BC with metachronous bilateral involvement.

  9. Ensemble-based conditioning of reservoir models to seismic data

    NARCIS (Netherlands)

    Leeuwenburgh, O.; Brouwer, J.; Trani, M.

    2011-01-01

    While 3D seismic has been the basis for geological model building for a long time, time-lapse seismic has primarily been used in a qualitative manner to assist in monitoring reservoir behavior. With the growing acceptance of assisted history matching methods has come an equally rising interest in in

  10. Ensemble-based conditioning of reservoir models to seismic data

    NARCIS (Netherlands)

    Leeuwenburgh, O.; Brouwer, J.; Trani, M.

    2010-01-01

    While 3D seismic has been the basis for geological model building for a long time, time-lapse seismic has primarily been used in a qualitative manner to assist in monitoring reservoir behavior. With the growing acceptance of assisted history matching methods has come an equally rising interest in in

  11. Agent-based modelling of cholera diffusion

    NARCIS (Netherlands)

    Augustijn, Ellen-Wien; Doldersum, Tom; Useya, Juliana; Augustijn, Denie

    2016-01-01

    This paper introduces a spatially explicit agent-based simulation model for micro-scale cholera diffusion. The model simulates both an environmental reservoir of naturally occurring V. cholerae bacteria and hyperinfectious V. cholerae. Objective of the research is to test if runoff from open refuse

  12. Agent-based modelling of cholera diffusion

    NARCIS (Netherlands)

    Augustijn, Ellen-Wien; Doldersum, Tom; Useya, Juliana; Augustijn, Denie

    2016-01-01

    This paper introduces a spatially explicit agent-based simulation model for micro-scale cholera diffusion. The model simulates both an environmental reservoir of naturally occurring V.cholerae bacteria and hyperinfectious V. cholerae. Objective of the research is to test if runoff from open refuse d

  13. Agent-based modelling of cholera diffusion

    NARCIS (Netherlands)

    Augustijn-Beckers, Petronella; Doldersum, Tom; Useya, Juliana; Augustijn, Dionysius C.M.

    2016-01-01

    This paper introduces a spatially explicit agent-based simulation model for micro-scale cholera diffusion. The model simulates both an environmental reservoir of naturally occurring V.cholerae bacteria and hyperinfectious V. cholerae. Objective of the research is to test if runoff from open refuse

  14. New global ICT-based business models

    DEFF Research Database (Denmark)

    House Case The Nano Solar Case The Master Cat Case The Pitfalls Of The Blue Ocean Strategy - Implications Of "The Six Paths Framework" Network-Based Innovation - Combining Exploration and Exploitation? Innovating New Business Models in Inter-firm Collaboration NEW Global Business Models - What Did...

  15. Approximation Algorithms for Model-Based Diagnosis

    NARCIS (Netherlands)

    Feldman, A.B.

    2010-01-01

    Model-based diagnosis is an area of abductive inference that uses a system model, together with observations about system behavior, to isolate sets of faulty components (diagnoses) that explain the observed behavior, according to some minimality criterion. This thesis presents greedy approximation a

  16. Rule-based Modelling and Tunable Resolution

    Directory of Open Access Journals (Sweden)

    Russ Harmer

    2009-11-01

    Full Text Available We investigate the use of an extension of rule-based modelling for cellular signalling to create a structured space of model variants. This enables the incremental development of rule sets that start from simple mechanisms and which, by a gradual increase in agent and rule resolution, evolve into more detailed descriptions.

  17. Rule-based Modelling and Tunable Resolution

    CERN Document Server

    Harmer, Russ

    2009-01-01

    We investigate the use of an extension of rule-based modelling for cellular signalling to create a structured space of model variants. This enables the incremental development of rule sets that start from simple mechanisms and which, by a gradual increase in agent and rule resolution, evolve into more detailed descriptions.

  18. Approximation Algorithms for Model-Based Diagnosis

    NARCIS (Netherlands)

    Feldman, A.B.

    2010-01-01

    Model-based diagnosis is an area of abductive inference that uses a system model, together with observations about system behavior, to isolate sets of faulty components (diagnoses) that explain the observed behavior, according to some minimality criterion. This thesis presents greedy approximation a

  19. Mineral resources estimation based on block modeling

    Science.gov (United States)

    Bargawa, Waterman Sulistyana; Amri, Nur Ali

    2016-02-01

    The estimation in this paper uses three kinds of block models: nearest neighbor polygon, inverse distance squared and ordinary kriging. The techniques are weighting schemes based on the principle that block content is a linear combination of the grade data of the samples around the block being estimated. The case study is in the Pongkor area, a gold-silver resource interpreted as quartz veins formed by an epithermal hydrothermal process. Resource modeling includes data entry, statistical and variographic analysis, topographic and geological modeling, block model construction, estimation parameters, model presentation and tabulation of mineral resources. The skewed distribution is handled here by a robust semivariogram. The mineral resource classification generated in this model is based on an analysis of the kriging standard deviation and the number of samples used in the estimation of each block. The results are used to evaluate the performance of the OK and IDS estimators. Based on the visual and statistical analysis, it is concluded that the OK model gives estimates closest to the data used for modeling.
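
    As a concrete illustration of the weighting idea, the snippet below computes an inverse-distance-squared (IDS) block estimate from a handful of nearby samples. The coordinates and gold grades are invented; ordinary kriging would replace the distance weights with weights derived from the fitted semivariogram.

      # Inverse-distance-squared block estimate: the block grade is a weighted
      # linear combination of nearby samples (hypothetical data).
      import numpy as np

      samples = np.array([            # x, y, z, Au grade (g/t)
          [10.0, 12.0,  5.0, 3.2],
          [18.0,  9.0,  7.0, 1.1],
          [14.0, 20.0,  4.0, 2.4],
          [ 9.0, 16.0,  9.0, 4.0],
      ])
      block_centre = np.array([13.0, 14.0, 6.0])

      d = np.linalg.norm(samples[:, :3] - block_centre, axis=1)
      w = 1.0 / d**2                  # inverse distance squared weights
      w /= w.sum()
      grade = np.dot(w, samples[:, 3])
      print(f"IDS block grade estimate: {grade:.2f} g/t")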

  20. Gradient based filtering of digital elevation models

    DEFF Research Database (Denmark)

    Knudsen, Thomas; Andersen, Rune Carbuhn

    We present a filtering method for digital terrain models (DTMs). The method is based on mathematical morphological filtering within gradient (slope) defined domains. The intention with the filtering procedure is to improve the cartographic quality of height contours generated from a DTM based on ...... in the landscape are washed out and misrepresented....
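
    A minimal sketch of the idea, assuming a synthetic DTM, a percentile slope threshold and a 5x5 grey-scale opening: the morphological filter is applied only inside the low-gradient domain so that steep landscape features are left untouched. The threshold, window size and filter choice are illustrative, not the published procedure.

      # Slope-domain morphological filtering of a synthetic DTM.
      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(0)
      x, y = np.meshgrid(np.linspace(0, 1, 200), np.linspace(0, 1, 200))
      dtm = 50 * x + 5 * np.sin(20 * y) + rng.normal(0, 0.3, x.shape)  # toy DTM

      dzdy, dzdx = np.gradient(dtm, 1.0)           # gradients at unit cell size
      slope = np.hypot(dzdx, dzdy)                 # slope magnitude per cell

      smoothed = ndimage.grey_opening(dtm, size=(5, 5))
      flat = slope < np.percentile(slope, 50)      # gradient-defined domain
      filtered = np.where(flat, smoothed, dtm)     # filter only the flat domain
      print("mean |change| inside flat domain:",
            round(float(np.abs(filtered - dtm)[flat].mean()), 3))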

  1. Gradient based filtering of digital elevation models

    DEFF Research Database (Denmark)

    Knudsen, Thomas; Andersen, Rune Carbuhn

    We present a filtering method for digital terrain models (DTMs). The method is based on mathematical morphological filtering within gradient (slope) defined domains. The intention with the filtering procedure is to improve the cartographic quality of height contours generated from a DTM based...

  2. Information modelling and knowledge bases XXV

    CERN Document Server

    Tokuda, T; Jaakkola, H; Yoshida, N

    2014-01-01

    Because of our ever increasing use of and reliance on technology and information systems, information modelling and knowledge bases continue to be important topics in those academic communities concerned with data handling and computer science. As the information itself becomes more complex, so do the levels of abstraction and the databases themselves. This book is part of the series Information Modelling and Knowledge Bases, which concentrates on a variety of themes in the important domains of conceptual modeling, design and specification of information systems, multimedia information modelin

  3. Multiagent-Based Model For ESCM

    OpenAIRE

    Delia MARINCAS

    2011-01-01

    Web based applications for Supply Chain Management (SCM) are now a necessity for every company in order to meet the increasing customer demands, to face the global competition and to make profit. Multiagent-based approach is appropriate for eSCM because it shows many of the characteristics a SCM system should have. For this reason, we have proposed a multiagent-based eSCM model which configures a virtual SC, automates the SC activities: selling, purchasing, manufacturing, planning, inventory,...

  4. Agent-Based vs. Equation-based Epidemiological Models:A Model Selection Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Sukumar, Sreenivas R [ORNL; Nutaro, James J [ORNL

    2012-01-01

    This paper is motivated by the need to design model validation strategies for epidemiological disease-spread models. We consider both agent-based and equation-based models of pandemic disease spread and study the nuances and complexities one has to consider from the perspective of model validation. For this purpose, we instantiate an equation based model and an agent based model of the 1918 Spanish flu and we leverage data published in the literature for our case study. We present our observations from the perspective of each implementation and discuss the application of model-selection criteria to compare the risk in choosing one modeling paradigm to another. We conclude with a discussion of our experience and document future ideas for a model validation framework.

  5. Flower solid modeling based on sketches

    Institute of Scientific and Technical Information of China (English)

    Zhan DING; Shu-chang XU; Xiu-zi YE; Yin ZHANG; San-yuan ZHANG

    2008-01-01

    In this paper we propose a method to model flowers of solid shape. Based on the method of Ijiri et al. (2005), we separate individual flower modeling and inflorescence modeling into structure and geometry modeling. We incorporate interactive editing gestures to allow the user to edit structure parameters freely on the structure diagram. Furthermore, we use free-hand sketching techniques to allow users to create and edit 3D geometrical elements freely and easily. The final step is to automatically merge all independent 3D geometrical elements into a single waterproof mesh. Our experiments show that this solid modeling approach is promising. Using our approach, novice users can create vivid flower models easily and freely. The generated flower model is waterproof. It can have applications in visualization, animation, gaming, and in toys and decorations if printed out on 3D rapid prototyping devices.

  6. New global ICT-based business models

    DEFF Research Database (Denmark)

    The New Global Business Model (NEWGIBM) book describes the background, theory references, case studies, results and learning imparted by the NEWGIBM project, which is supported by ICT, to a research group during the period from 2005-2011. The book is a result of the efforts and the collaborative...... Contents: The Theoretical History and Background of Business Models; The Theoretical Background of Business Model Innovation; ICT - a Key Enabler in Innovating New Global Business Models; The NEWGIBM Research Methodology; The Analytical Model for NEWGIBM; Industry Service - Technology Centre; The KMD Case; Smart House Case; The Nano Solar Case; The Master Cat Case; The Pitfalls Of The Blue Ocean Strategy - Implications Of "The Six Paths Framework"; Network-Based Innovation - Combining Exploration and Exploitation?; Innovating New Business Models in Inter-firm Collaboration; NEW Global Business Models - What Did......

  7. Deformable surface modeling based on dual subdivision

    Institute of Scientific and Technical Information of China (English)

    WANG Huawei; SUN Hanqiu; QIN Kaihuai

    2005-01-01

    Based on dual Doo-Sabin subdivision and the corresponding parameterization, a modeling technique of deformable surfaces is presented in this paper. In the proposed model, all the dynamic parameters are computed in a unified way for both non-defective and defective subdivision matrices, and central differences are used to discretize the Lagrangian dynamics equation instead of backward differences. Moreover, a local scheme is developed to solve the dynamics equation approximately, thus the order of the linear equation is reduced greatly. Therefore, the proposed model is more efficient and faster than the existing dynamic models. It can be used for deformable surface design, interactive surface editing, medical imaging and simulation.

  8. Rule-based transformations for geometric modelling

    Directory of Open Access Journals (Sweden)

    Thomas Bellet

    2011-02-01

    Full Text Available The context of this paper is the use of formal methods for topology-based geometric modelling. Topology-based geometric modelling deals with objects of various dimensions and shapes. Usually, objects are defined by a graph-based topological data structure and by an embedding that associates each topological element (vertex, edge, face, etc.) with relevant data as their geometric shape (position, curve, surface, etc.) or application dedicated data (e.g. molecule concentration level in a biological context). We propose to define topology-based geometric objects as labelled graphs. The arc labelling defines the topological structure of the object whose topological consistency is then ensured by labelling constraints. Nodes have as many labels as there are different data kinds in the embedding. Labelling constraints ensure then that the embedding is consistent with the topological structure. Thus, topology-based geometric objects constitute a particular subclass of a category of labelled graphs in which nodes have multiple labels.

  9. Rule-based transformations for geometric modelling

    CERN Document Server

    Bellet, Thomas; Gall, Pascale Le; 10.4204/EPTCS.48.5

    2011-01-01

    The context of this paper is the use of formal methods for topology-based geometric modelling. Topology-based geometric modelling deals with objects of various dimensions and shapes. Usually, objects are defined by a graph-based topological data structure and by an embedding that associates each topological element (vertex, edge, face, etc.) with relevant data as their geometric shape (position, curve, surface, etc.) or application dedicated data (e.g. molecule concentration level in a biological context). We propose to define topology-based geometric objects as labelled graphs. The arc labelling defines the topological structure of the object whose topological consistency is then ensured by labelling constraints. Nodes have as many labels as there are different data kinds in the embedding. Labelling constraints ensure then that the embedding is consistent with the topological structure. Thus, topology-based geometric objects constitute a particular subclass of a category of labelled graphs in which nodes hav...

  10. A Nonhydrostatic Model Based On A New Approach

    Science.gov (United States)

    Janjic, Z. I.

    Considerable experience with nonhydrostatic models has been accumulated on the scales of convective clouds and storms. However, numerical weather prediction (NWP) deals with motions on a much wider range of temporal and spatial scales. Thus, difficulties that may not be significant on the small scales may become important in NWP applications. Having in mind these considerations, a new approach has been proposed and applied in developing nonhydrostatic models intended for NWP applications. Namely, instead of extending the cloud models to synoptic scales, the hydrostatic approximation is relaxed in a hydrostatic NWP model. In this way the model validity is extended to nonhydrostatic motions, and at the same time favorable features of the hydrostatic formulation are preserved. In order to apply this approach, the system of nonhydrostatic equations is split into two parts: (a) the part that corresponds to the hydrostatic system, except for corrections due to vertical acceleration, and (b) the system of equations that allows computation of the corrections appearing in the first system. This procedure does not require any additional approximation. In the model, "isotropic" horizontal finite differencing is employed that conserves a number of basic and derived dynamical and quadratic quantities. The hybrid pressure-sigma vertical coordinate has been chosen as the primary option. The forward-backward scheme is used for horizontally propagating fast waves, and an implicit scheme is used for vertically propagating sound waves. The Adams-Bashforth scheme is applied for the advection of the basic dynamical variables and for the Coriolis terms. In real data runs, the nonhydrostatic dynamics does not require extra computational boundary conditions at the top. The philosophy of the physical package and possible future developments of physical parameterizations are also reviewed. A two-dimensional model based on the described approach successfully reproduced classical

  11. Development of Ensemble Model Based Water Demand Forecasting Model

    Science.gov (United States)

    Kwon, Hyun-Han; So, Byung-Jin; Kim, Seong-Hyeon; Kim, Byung-Seop

    2014-05-01

    The Smart Water Grid (SWG) concept has emerged globally over the last decade and gained significant recognition in South Korea. In particular, there has been growing interest in water demand forecasting and optimal pump operation, which has led to various studies on energy saving and improvement of water supply reliability. Existing water demand forecasting models fall into two groups in terms of how they model and predict behavior in time series. One considers embedded patterns such as seasonality, periodicity and trends; the other is an autoregressive model using short-memory Markovian processes (Emmanuel et al., 2012). The main disadvantage of the abovementioned models is that predictability of water demand at sub-daily scales is limited because the system is nonlinear. In this regard, this study aims to develop a nonlinear ensemble model for hourly water demand forecasting which allows us to estimate uncertainties across different model classes. The proposed model consists of two parts. One is a multi-model scheme based on the combination of independent prediction models. The other is a cross-validation scheme, the bagging approach introduced by Breiman (1996), used to derive weighting factors for the individual models. The individual forecasting models used in this study are a linear regression model, polynomial regression, multivariate adaptive regression splines (MARS) and a support vector machine (SVM). The concepts are demonstrated through application to data observed at water plants at several locations in South Korea. Keywords: water demand, non-linear model, ensemble forecasting model, uncertainty. Acknowledgements This subject is supported by the Korea Ministry of Environment as "Projects for Developing Eco-Innovation Technologies (GT-11-G-02-001-6)".
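
    A toy version of the multi-model idea is sketched below: two simple members (a linear trend and a daily harmonic regression) stand in for the regression/MARS/SVM models of the study, and their weights are derived from inverse validation error as a stand-in for the bagging-based weighting. The synthetic hourly demand series is invented.

      # Weighted combination of simple hourly-demand forecasting members.
      import numpy as np

      rng = np.random.default_rng(1)
      t = np.arange(24 * 14)                       # two weeks, hourly
      demand = (100 + 0.05 * t + 20 * np.sin(2 * np.pi * t / 24)
                + rng.normal(0, 3, t.size))

      train, valid = slice(0, 24 * 10), slice(24 * 10, None)

      def fit_predict(basis):
          X = np.column_stack(basis)
          coef, *_ = np.linalg.lstsq(X[train], demand[train], rcond=None)
          return X @ coef

      members = {
          "trend":    fit_predict([np.ones_like(t, float), t]),
          "harmonic": fit_predict([np.ones_like(t, float),
                                   np.sin(2 * np.pi * t / 24),
                                   np.cos(2 * np.pi * t / 24)]),
      }

      # weight each member by inverse validation RMSE (simple stand-in for
      # the bagging-derived weights mentioned in the abstract)
      rmse = {k: np.sqrt(np.mean((demand[valid] - p[valid]) ** 2))
              for k, p in members.items()}
      w = {k: (1 / e) / sum(1 / v for v in rmse.values()) for k, e in rmse.items()}
      ensemble = sum(w[k] * members[k] for k in members)
      print("weights:", {k: round(v, 2) for k, v in w.items()})
      print("ensemble validation RMSE:",
            round(float(np.sqrt(np.mean((demand[valid] - ensemble[valid]) ** 2))), 2))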

  12. MEGen: A Physiologically Based Pharmacokinetic Model Generator

    Directory of Open Access Journals (Sweden)

    George D Loizou

    2011-11-01

    Full Text Available Physiologically based pharmacokinetic models are being used in an increasing number of different areas. These not only include the human safety assessment of pharmaceuticals, pesticides, biocides and environmental chemicals but also for food animal, wild mammal and avian risk assessment. The value of PBPK models is that they are tools for estimating tissue dosimetry by integrating in vitro and in vivo mechanistic, pharmacokinetic and toxicological information through their explicit mathematical description of important anatomical, physiological and biochemical determinants of chemical uptake, disposition and elimination. However, PBPK models are perceived as complex, data hungry, resource intensive and time consuming. In addition, model validation and verification are hindered by the relative complexity of the equations. To begin to address these issues a freely available web application for the rapid construction and documentation of bespoke PBPK models is under development. Here we present an overview of the current capabilities of MEGen, a model equation generator and parameter database and discuss future developments.

  13. Applying Model Checking to Generate Model-Based Integration Tests from Choreography Models

    Science.gov (United States)

    Wieczorek, Sebastian; Kozyura, Vitaly; Roth, Andreas; Leuschel, Michael; Bendisposto, Jens; Plagge, Daniel; Schieferdecker, Ina

    Choreography models describe the communication protocols between services. Testing of service choreographies is an important task for the quality assurance of service-based systems as used e.g. in the context of service-oriented architectures (SOA). The formal modeling of service choreographies enables a model-based integration testing (MBIT) approach. We present MBIT methods for our service choreography modeling approach called Message Choreography Models (MCM). For the model-based testing of service choreographies, MCMs are translated into Event-B models and used as input for our test generator which uses the model checker ProB.

  14. Spatial interactions in agent-based modeling

    CERN Document Server

    Ausloos, Marcel; Merlone, Ugo

    2014-01-01

    Agent Based Modeling (ABM) has become a widespread approach to model complex interactions. In this chapter, after briefly summarizing some features of ABM, the different approaches to modeling spatial interactions are discussed. It is stressed that agents can interact either indirectly through a shared environment and/or directly with each other. In such an approach, higher-order variables such as commodity prices, population dynamics or even institutions are not exogenously specified but instead are seen as the results of interactions. It is highlighted in the chapter that understanding the patterns emerging from such spatial interaction between agents is a key problem, as much as their description through analytical or simulation means. The chapter reviews different approaches for modeling agents' behavior, taking into account either explicit spatial (lattice based) structures or networks. Some emphasis is placed on recent ABM as applied to the description of the dynamics of the geographical distribution o...

  15. Activity-based resource capability modeling

    Institute of Scientific and Technical Information of China (English)

    CHENG Shao-wu; XU Xiao-fei; WANG Gang; SUN Xue-dong

    2008-01-01

    To analyse and optimize an enterprise process in a wide scope, an activity-based method of modeling resource capabilities is presented. It models resource capabilities by means of the same structure as an activity, that is, resource capabilities are defined by input objects, actions and output objects. A set of activity-based resource capability modeling rules and matching rules between an activity and a resource are introduced. This method can be used to describe not only the capability of manufacturing tools, but also the capability of persons, applications, etc. It unifies the methods of modeling the capability of all kinds of resources in an enterprise and supports the optimization of the resource allocation of a process.

  16. Graphical model construction based on evolutionary algorithms

    Institute of Scientific and Technical Information of China (English)

    Youlong YANG; Yan WU; Sanyang LIU

    2006-01-01

    Using Bayesian networks to model promising solutions from the current population of an evolutionary algorithm can ensure an efficient and intelligent search for the optimum. However, constructing a Bayesian network that fits a given dataset is an NP-hard problem, and it also consumes massive computational resources. This paper develops a methodology for constructing a graphical model based on the Bayesian Dirichlet metric. Our approach is derived from a set of propositions and theorems obtained by researching the local metric relationship of networks matching the dataset. This paper presents an algorithm to construct a tree model from a set of potential solutions using the above approach. This method is important not only for evolutionary algorithms based on graphical models, but also for machine learning and data mining. The experimental results show that the exact theoretical results and the approximations match very well.

  17. PV panel model based on datasheet values

    DEFF Research Database (Denmark)

    Sera, Dezso; Teodorescu, Remus; Rodriguez, Pedro

    2007-01-01

    This work presents the construction of a model for a PV panel using the single-diode five-parameters model, based exclusively on data-sheet parameters. The model takes into account the series and parallel (shunt) resistance of the panel. The equivalent circuit and the basic equations of the PV cell....../panel in Standard Test Conditions (STC) are shown, as well as the parameters extraction from the data-sheet values. The temperature dependence of the cell dark saturation current is expressed with an alternative formula, which gives better correlation with the datasheet values of the power temperature dependence....... Based on these equations, a PV panel model, which is able to predict the panel behavior in different temperature and irradiance conditions, is built and tested....
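
    The single-diode five-parameter equation the abstract refers to can be evaluated pointwise as sketched below, assuming generic placeholder parameters (Iph, I0, n, Rs, Rsh) rather than values extracted from a particular datasheet; the implicit equation in the current is solved numerically for each voltage.

      # Single-diode model:  I = Iph - I0*(exp((V+I*Rs)/(n*Ns*Vt)) - 1) - (V+I*Rs)/Rsh
      import numpy as np
      from scipy.optimize import fsolve

      q, k = 1.602e-19, 1.381e-23
      Ns, T = 60, 298.15                        # cells in series, temperature (K)
      Vt = k * T / q                            # thermal voltage of one cell
      Iph, I0, n, Rs, Rsh = 8.2, 7.5e-8, 1.3, 0.35, 300.0   # assumed parameters

      def panel_current(V):
          def f(I):
              return (Iph - I0 * (np.exp((V + I * Rs) / (n * Ns * Vt)) - 1)
                      - (V + I * Rs) / Rsh - I)
          return fsolve(f, x0=Iph)[0]

      for V in np.linspace(0, 37, 8):
          I = panel_current(V)
          print(f"V = {V:5.1f} V   I = {I:5.2f} A   P = {V * I:6.1f} W")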

  18. Reaction-contingency based bipartite Boolean modelling

    Science.gov (United States)

    2013-01-01

    Background Intracellular signalling systems are highly complex, rendering mathematical modelling of large signalling networks infeasible or impractical. Boolean modelling provides one feasible approach to whole-network modelling, but at the cost of dequantification and decontextualisation of activation. That is, these models cannot distinguish between different downstream roles played by the same component activated in different contexts. Results Here, we address this with a bipartite Boolean modelling approach. Briefly, we use a state oriented approach with separate update rules based on reactions and contingencies. This approach retains contextual activation information and distinguishes distinct signals passing through a single component. Furthermore, we integrate this approach in the rxncon framework to support automatic model generation and iterative model definition and validation. We benchmark this method with the previously mapped MAP kinase network in yeast, showing that minor adjustments suffice to produce a functional network description. Conclusions Taken together, we (i) present a bipartite Boolean modelling approach that retains contextual activation information, (ii) provide software support for automatic model generation, visualisation and simulation, and (iii) demonstrate its use for iterative model generation and validation. PMID:23835289

  19. A Multiagent Based Model for Tactical Planning

    Science.gov (United States)

    2002-10-01

    ...been developed under the same conceptual model and using similar Artificial Intelligence tools. We use four different stimulus/response agents in... The conceptual model is built on the basis of agent theory. To implement the different agents we have used Artificial Intelligence techniques such...

  20. GIS-Based Hydrogeological-Parameter Modeling

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    A regression model is proposed to relate the variation of water well depth to topographic properties (area and slope), the variation of hydraulic conductivity and a vertical decay factor. The implementation of this model in a GIS environment (ARC/INFO), based on known water well data and a DEM, is used to estimate the variation of hydraulic conductivity and the decay factor of different lithology units in a watershed context.
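
    A small least-squares sketch of the kind of regression described, assuming invented well records and using log contributing area and slope as predictors of well depth; the actual model additionally estimates the conductivity variation and the vertical decay factor per lithology unit from the DEM inside the GIS.

      # Regression of well depth on topographic predictors (hypothetical data).
      import numpy as np

      # columns: log10(contributing area, m^2), slope (degrees), well depth (m)
      wells = np.array([
          [3.1, 4.0, 18.0],
          [3.8, 2.5, 12.5],
          [2.6, 9.0, 27.0],
          [4.2, 1.5,  9.0],
          [3.4, 6.0, 21.0],
      ])
      X = np.column_stack([np.ones(len(wells)), wells[:, 0], wells[:, 1]])
      coef, *_ = np.linalg.lstsq(X, wells[:, 2], rcond=None)
      print("depth = {:.1f} {:+.1f}*log10(area) {:+.1f}*slope".format(*coef))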

  1. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae; Top, Søren

    2008-01-01

    of abstract system descriptions. Usually, in mechatronics systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, a plant and controller behavior is simulated using graphical blocks to represent mathematical and logical...... of MATLAB/Simulink blocks to COMDES software components, both for continuous and discrete behaviour, and the transformation of the software system into the S-functions. The general aim of this work is the improvement of multi-disciplinary development of embedded systems with the focus on the relation...... constructs and process flow, then software code is generated. A Simulink model is a representation of the design or implementation of a physical system that satisfies a set of requirements. A software component-based system aims to organize system architecture and behaviour as a means of computation...

  2. Atom-Role-Based Access Control Model

    Science.gov (United States)

    Cai, Weihong; Huang, Richeng; Hou, Xiaoli; Wei, Gang; Xiao, Shui; Chen, Yindong

    Role-based access control (RBAC) model has been widely recognized as an efficient access control model and becomes a hot research topic of information security at present. However, in the large-scale enterprise application environments, the traditional RBAC model based on the role hierarchy has the following deficiencies: Firstly, it is unable to reflect the role relationships in complicated cases effectively, which does not accord with practical applications. Secondly, the senior role unconditionally inherits all permissions of the junior role, thus if a user is under the supervisor role, he may accumulate all permissions, and this easily causes the abuse of permission and violates the least privilege principle, which is one of the main security principles. To deal with these problems, we, after analyzing permission types and role relationships, proposed the concept of atom role and built an atom-role-based access control model, called ATRBAC, by dividing the permission set of each regular role based on inheritance path relationships. Through the application-specific analysis, this model can well meet the access control requirements.
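
    The core data-structure idea can be sketched as below: regular roles are composed of atom roles (small, non-overlapping permission sets), and a senior role holds only the atom roles it is explicitly granted instead of inheriting every permission of its juniors. The role and permission names are invented; this is only a sketch in the spirit of ATRBAC, not the formalism itself.

      # Atom roles group small permission sets; regular roles compose them.
      ATOM_ROLES = {
          "read_orders":    {"order:read"},
          "approve_orders": {"order:approve"},
          "manage_users":   {"user:create", "user:disable"},
      }

      ROLES = {                          # regular role -> atom roles granted
          "clerk":      {"read_orders"},
          "supervisor": {"read_orders", "approve_orders"},   # no user management
          "admin":      {"read_orders", "approve_orders", "manage_users"},
      }

      def permissions(role: str) -> set[str]:
          return set().union(*(ATOM_ROLES[a] for a in ROLES[role]))

      def allowed(role: str, permission: str) -> bool:
          return permission in permissions(role)

      print(allowed("supervisor", "order:approve"))   # True
      print(allowed("supervisor", "user:create"))     # False: least privilege kept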

  3. Image-Based Multiresolution Implicit Object Modeling

    Directory of Open Access Journals (Sweden)

    Sarti Augusto

    2002-01-01

    Full Text Available We discuss two image-based 3D modeling methods based on a multiresolution evolution of a volumetric function's level set. In the former method, the role of the level set implosion is to fuse ("sew" and "stitch") together several partial reconstructions (depth maps) into a closed model. In the latter, the level set's implosion is steered directly by the texture mismatch between views. Both solutions share the characteristic of operating in an adaptive multiresolution fashion, in order to boost computational efficiency and robustness.

  4. A multivalued knowledge-base model

    CERN Document Server

    Achs, Agnes

    2010-01-01

    The basic aim of our study is to give a possible model for handling uncertain information. This model is worked out in the framework of DATALOG. First the concept of fuzzy Datalog is summarized, then its extensions to intuitionistic- and interval-valued fuzzy logic are given and the concept of bipolar fuzzy Datalog is introduced. Based on these ideas the concept of a multivalued knowledge-base is defined as a quadruple of background knowledge; a deduction mechanism; a connecting algorithm, and a function set of the program, which help us to determine the uncertainty levels of the results. Finally a possible evaluation strategy is given.

  5. Physically based modeling and animation of tornado

    Institute of Scientific and Technical Information of China (English)

    LIU Shi-guang; WANG Zhang-ye; GONG Zheng; CHEN Fei-fei; PENG Qun-sheng

    2006-01-01

    Realistic modeling and rendering of dynamic tornado scene is recognized as a challenging task for researchers of computer graphics. In this paper a new physically based method for simulating and animating tornado scene is presented. We first propose a Two-Fluid model based on the physical theory of tornado, then we simulate the flow of tornado and its interaction with surrounding objects such as debris, etc. Taking the scattering and absorption of light by the participating media into account, the illumination effects of the tornado scene can be generated realistically. With the support of graphics hardware, various kinds of dynamic tornado scenes can be rendered at interactive rates.

  6. Model-based testing for embedded systems

    CERN Document Server

    Zander, Justyna; Mosterman, Pieter J

    2011-01-01

    What the experts have to say about Model-Based Testing for Embedded Systems: "This book is exactly what is needed at the exact right time in this fast-growing area. From its beginnings over 10 years ago of deriving tests from UML statecharts, model-based testing has matured into a topic with both breadth and depth. Testing embedded systems is a natural application of MBT, and this book hits the nail exactly on the head. Numerous topics are presented clearly, thoroughly, and concisely in this cutting-edge book. The authors are world-class leading experts in this area and teach us well-used

  7. Model Based Control of Reefer Container Systems

    DEFF Research Database (Denmark)

    Sørensen, Kresten Kjær

    This thesis is concerned with the development of model based control for the Star Cool refrigerated container (reefer) with the objective of reducing energy consumption. This project has been carried out under the Danish Industrial PhD programme and has been financed by Lodam together with the Da......

  8. SWIFT MODELLER: a Java based GUI for molecular modeling.

    Science.gov (United States)

    Mathur, Abhinav; Shankaracharya; Vidyarthi, Ambarish S

    2011-10-01

    MODELLER is command-line software that requires tedious formatting of inputs and the writing of Python scripts, which most people are not comfortable with. The visualization of output also becomes cumbersome due to verbose files. This makes the whole software protocol very complex and requires extensive study of MODELLER manuals and tutorials. Here we describe SWIFT MODELLER, a GUI that automates the formatting, scripting and data extraction processes and presents them in an interactive way, making MODELLER much easier to use than before. The screens in SWIFT MODELLER are designed with homology modeling in mind, and their flow is a depiction of its steps. It eliminates the formatting of inputs, scripting processes and analysis of verbose output files through automation, and makes pasting of the target sequence the only prerequisite. Jmol (a 3D structure visualization tool) has been integrated into the GUI, which opens and displays the protein data bank files created by the MODELLER software. All files required and created by the software are saved in a folder named after the work instance's date and time of execution. SWIFT MODELLER lowers the skill level required for the software through automation of many of the steps in the original software protocol, thus saving an enormous amount of time per instance and making MODELLER very easy to work with.

  9. Lamin A/C mutation affecting primarily the right side of the heart

    Directory of Open Access Journals (Sweden)

    Laura Ollila

    2013-04-01

    Full Text Available LMNA mutations are amongst the most important causes of familial dilated cardiomyopathy. The most important cause of arrhythmogenic right ventricular cardiomyopathy (ARVC) is desmosomal pathology. The aim of the study was to elucidate the role of LMNA mutations among Finnish cardiomyopathy patients. We screened 135 unrelated cardiomyopathy patients for LMNA mutations. Because of an unusual phenotype, two patients were screened for the known Finnish ARVC-related mutations of desmosomal genes, and their Plakophilin-2b gene was sequenced. Myocardial samples from two patients were examined by immunohistochemical plakoglobin staining and in one case by electron microscopy. We found a new LMNA mutation Phe237Ser in a family of five affected members with a cardiomyopathy affecting primarily the right side of the heart. The phenotype resembles ARVC but does not fulfill the Task Force Criteria. The main clinical manifestations of the mutation were severe tricuspid insufficiency, right ventricular enlargement and failure. Three of the affected patients died of the heart disease, and the two living patients received heart transplants at ages 44 and 47. Electron microscopy showed nuclear blebbing compatible with laminopathy. Immunohistochemical analysis did not suggest desmosomal pathology. No desmosomal mutations were found. The Phe237Ser LMNA mutation causes a phenotype different from traditional cardiolaminopathy. Our findings suggest that cardiomyopathy affecting primarily the right side of the heart is not always caused by desmosomal pathology. Our observations highlight the challenges in classifying cardiomyopathies, as there often is significant overlap between the traditional categories.

  10. Entropy-based portfolio models: Practical issues

    Science.gov (United States)

    Shirazi, Yasaman Izadparast; Sabiruzzaman, Md.; Hamzah, Nor Aishah

    2015-10-01

    Entropy is a nonparametric alternative to variance and has been used as a measure of risk in portfolio analysis. In this paper, the computation of entropy risk for a given set of data is discussed with illustration. A comparison between entropy-based portfolio models is made. We propose a natural extension of the mean entropy portfolio to make it more general and diversified. In terms of performance, this new model is similar to the mean-entropy portfolio when applied to real and simulated data, and offers higher return if no constraint is set for the desired return; also it is found to be the most diversified portfolio model.
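
    As a minimal illustration of entropy as a diversification measure, the snippet below computes the Shannon entropy of two weight vectors with hypothetical expected returns; the equally weighted portfolio maximizes entropy, while a concentrated portfolio scores low. The mean-entropy optimization itself is not shown.

      # Shannon entropy of portfolio weights as a diversification measure.
      import numpy as np

      def weight_entropy(w):
          w = np.asarray(w, dtype=float)
          w = w[w > 0]
          return float(-(w * np.log(w)).sum())

      mu = np.array([0.08, 0.05, 0.12, 0.07])   # hypothetical expected returns

      portfolios = {
          "concentrated":  np.array([0.05, 0.05, 0.85, 0.05]),
          "equal weights": np.full(4, 0.25),
      }
      for name, w in portfolios.items():
          print(f"{name:14s} return = {w @ mu:.3f}  entropy = {weight_entropy(w):.3f}")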

  11. Model Predictive Control based on Finite Impulse Response Models

    DEFF Research Database (Denmark)

    Prasath, Guru; Jørgensen, John Bagterp

    2008-01-01

    We develop a regularized l2 finite impulse response (FIR) predictive controller with input and input-rate constraints. Feedback is based on a simple constant output disturbance filter. The performance of the predictive controller in the face of plant-model mismatch is investigated by simulations...
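
    A stripped-down, unconstrained version of the idea is sketched below: the output is predicted over a horizon from an assumed FIR model and the future input moves are chosen by ridge-regularized least squares. The input and input-rate constraints handled in the paper would turn this into a quadratic program; the FIR coefficients here are an arbitrary stable example.

      # Regularized least-squares choice of control moves from an FIR model.
      import numpy as np

      h = 0.8 ** np.arange(1, 21) * 0.5      # assumed FIR coefficients h_1..h_20
      s = np.cumsum(h)                       # step-response coefficients
      N, M = 15, 5                           # prediction and control horizons

      # Dynamic matrix: y_pred = G @ du (zero past inputs assumed for brevity)
      G = np.zeros((N, M))
      for j in range(M):
          G[j:, j] = s[: N - j]

      r = np.ones(N)                         # set-point trajectory
      lam = 0.1                              # move-suppression weight

      # min ||r - G du||^2 + lam ||du||^2  ->  (G'G + lam I) du = G'r
      du = np.linalg.solve(G.T @ G + lam * np.eye(M), G.T @ r)
      print("first control move:", round(float(du[0]), 3))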

  12. Incident Duration Modeling Using Flexible Parametric Hazard-Based Models

    Directory of Open Access Journals (Sweden)

    Ruimin Li

    2014-01-01

    Full Text Available Assessing and prioritizing the duration time and effects of traffic incidents on major roads present significant challenges for road network managers. This study examines the effect of numerous factors associated with various types of incidents on their duration and proposes an incident duration prediction model. Several parametric accelerated failure time hazard-based models were examined, including Weibull, log-logistic, log-normal, and generalized gamma, as well as all models with gamma heterogeneity and flexible parametric hazard-based models with freedom ranging from one to ten, by analyzing a traffic incident dataset obtained from the Incident Reporting and Dispatching System in Beijing in 2008. Results show that different factors significantly affect different incident time phases, whose best distributions were diverse. Given the best hazard-based models of each incident time phase, the prediction result can be reasonable for most incidents. The results of this study can aid traffic incident management agencies not only in implementing strategies that would reduce incident duration, and thus reduce congestion, secondary incidents, and the associated human and economic losses, but also in effectively predicting incident duration time.
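
    A much-reduced illustration of hazard-based duration modelling, assuming synthetic clearance times: a Weibull distribution is fitted and a survival probability is read off. The models in the study additionally include covariates in accelerated failure time form, gamma heterogeneity and spline-based flexible hazards.

      # Fit a Weibull duration distribution and query a survival probability.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      durations = rng.weibull(1.4, size=300) * 45.0   # minutes, synthetic data

      shape, loc, scale = stats.weibull_min.fit(durations, floc=0)
      surv_60 = stats.weibull_min.sf(60, shape, loc=loc, scale=scale)
      print(f"fitted shape = {shape:.2f}, scale = {scale:.1f} min")
      print(f"P(duration > 60 min) = {surv_60:.2f}")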

  13. Incident duration modeling using flexible parametric hazard-based models.

    Science.gov (United States)

    Li, Ruimin; Shang, Pan

    2014-01-01

    Assessing and prioritizing the duration time and effects of traffic incidents on major roads present significant challenges for road network managers. This study examines the effect of numerous factors associated with various types of incidents on their duration and proposes an incident duration prediction model. Several parametric accelerated failure time hazard-based models were examined, including Weibull, log-logistic, log-normal, and generalized gamma, as well as all models with gamma heterogeneity and flexible parametric hazard-based models with freedom ranging from one to ten, by analyzing a traffic incident dataset obtained from the Incident Reporting and Dispatching System in Beijing in 2008. Results show that different factors significantly affect different incident time phases, whose best distributions were diverse. Given the best hazard-based models of each incident time phase, the prediction result can be reasonable for most incidents. The results of this study can aid traffic incident management agencies not only in implementing strategies that would reduce incident duration, and thus reduce congestion, secondary incidents, and the associated human and economic losses, but also in effectively predicting incident duration time.

  14. Designing Network-based Business Model Ontology

    DEFF Research Database (Denmark)

    Hashemi Nekoo, Ali Reza; Ashourizadeh, Shayegheh; Zarei, Behrouz

    2015-01-01

    Survival in a dynamic environment is not achieved without a map. Scanning and monitoring of the market show business models to be a fruitful tool. But scholars believe that old-fashioned business models are dead, as they do not include the effects of the internet and networks. This paper...... is going to propose an e-business model ontology from the network point of view and its application in the real world. The suggested ontology for network-based businesses is composed of individuals' characteristics and what kind of resources they own, as well as their connections and pre-conceptions of connections...... such as shared mental models and trust. However, it mostly covers previous business model elements. To confirm the applicability of this ontology, it has been implemented in a business angel network to show how it works....

  15. Haptics-based dynamic implicit solid modeling.

    Science.gov (United States)

    Hua, Jing; Qin, Hong

    2004-01-01

    This paper systematically presents a novel, interactive solid modeling framework, Haptics-based Dynamic Implicit Solid Modeling, which is founded upon volumetric implicit functions and powerful physics-based modeling. In particular, we augment our modeling framework with a haptic mechanism in order to take advantage of additional realism associated with a 3D haptic interface. Our dynamic implicit solids are semi-algebraic sets of volumetric implicit functions and are governed by the principles of dynamics, hence responding to sculpting forces in a natural and predictable manner. In order to directly manipulate existing volumetric data sets as well as point clouds, we develop a hierarchical fitting algorithm to reconstruct and represent discrete data sets using our continuous implicit functions, which permit users to further design and edit those existing 3D models in real-time using a large variety of haptic and geometric toolkits, and visualize their interactive deformation at arbitrary resolution. The additional geometric and physical constraints afford more sophisticated control of the dynamic implicit solids. The versatility of our dynamic implicit modeling enables the user to easily modify both the geometry and the topology of modeled objects, while the inherent physical properties can offer an intuitive haptic interface for direct manipulation with force feedback.

  16. Model-Based Power Plant Master Control

    Energy Technology Data Exchange (ETDEWEB)

    Boman, Katarina; Thomas, Jean; Funkquist, Jonas

    2010-08-15

    The main goal of the project has been to evaluate the potential of a coordinated master control for a solid fuel power plant in terms of tracking capability, stability and robustness. The control strategy has been model-based predictive control (MPC) and the plant used in the case study has been the Vattenfall power plant Idbaecken in Nykoeping. A dynamic plant model based on nonlinear physical models was used to imitate the true plant in MATLAB/SIMULINK simulations. The basis for this model was already developed in previous Vattenfall internal projects, along with a simulation model of the existing control implementation with traditional PID controllers. The existing PID control is used as a reference performance, and it has been thoroughly studied and tuned in these previous Vattenfall internal projects. A turbine model was developed with characteristics based on the results of steady-state simulations of the plant using the software EBSILON. Using the derived model as a representative for the actual process, an MPC control strategy was developed using linearization and gain-scheduling. The control signal constraints (rate of change) and constraints on outputs were implemented to comply with plant constraints. After tuning the MPC control parameters, a number of simulation scenarios were performed to compare the MPC strategy with the existing PID control structure. The simulation scenarios also included cases highlighting the robustness properties of the MPC strategy. From the study, the main conclusions are: - The proposed Master MPC controller shows excellent set-point tracking performance even though the plant has strong interactions and non-linearity, and the controls and their rate of change are bounded. - The proposed Master MPC controller is robust, stable in the presence of disturbances and parameter variations. Even though the current study only considered a very small number of the possible disturbances and modelling errors, the considered cases are

  17. Frequency response function-based model updating using Kriging model

    Science.gov (United States)

    Wang, J. T.; Wang, C. J.; Zhao, J. P.

    2017-03-01

    An acceleration frequency response function (FRF) based model updating method is presented in this paper, which introduces Kriging model as metamodel into the optimization process instead of iterating the finite element analysis directly. The Kriging model is taken as a fast running model that can reduce solving time and facilitate the application of intelligent algorithms in model updating. The training samples for Kriging model are generated by the design of experiment (DOE), whose response corresponds to the difference between experimental acceleration FRFs and its counterpart of finite element model (FEM) at selected frequency points. The boundary condition is taken into account, and a two-step DOE method is proposed for reducing the number of training samples. The first step is to select the design variables from the boundary condition, and the selected variables will be passed to the second step for generating the training samples. The optimization results of the design variables are taken as the updated values of the design variables to calibrate the FEM, and then the analytical FRFs tend to coincide with the experimental FRFs. The proposed method is performed successfully on a composite structure of honeycomb sandwich beam, after model updating, the analytical acceleration FRFs have a significant improvement to match the experimental data especially when the damping ratios are adjusted.

  18. Model based development of engine control algorithms

    NARCIS (Netherlands)

    Dekker, H.J.; Sturm, W.L.

    1996-01-01

    Model based development of engine control systems has several advantages. The development time and costs are strongly reduced because much of the development and optimization work is carried out by simulating both engine and control system. After optimizing the control algorithm it can be executed b

  19. Agent based computational model of trust

    NARCIS (Netherlands)

    A. Gorobets (Alexander); B. Nooteboom (Bart)

    2004-01-01

    textabstractThis paper employs the methodology of Agent-Based Computational Economics (ACE) to investigate under what conditions trust can be viable in markets. The emergence and breakdown of trust is modeled in a context of multiple buyers and suppliers. Agents adapt their trust in a partner, the w

  20. Néron Models and Base Change

    DEFF Research Database (Denmark)

    Halle, Lars Halvard; Nicaise, Johannes

    on Néron component groups, Edixhoven’s filtration and the base change conductor of Chai and Yu, and we study these invariants using various techniques such as models of curves, sheaves on Grothendieck sites and non-archimedean uniformization. We then apply our results to the study of motivic zeta functions...

  1. What's Missing in Model-Based Teaching

    Science.gov (United States)

    Khan, Samia

    2011-01-01

    In this study, the author investigated how four science teachers employed model-based teaching (MBT) over a 1-year period. The purpose of the research was to develop a baseline of the fundamental and specific dimensions of MBT that are present and absent in science teaching. Teacher interviews, classroom observations, and pre and post-student…

  2. Prototype-based models in machine learning

    NARCIS (Netherlands)

    Biehl, Michael; Hammer, Barbara; Villmann, Thomas

    2016-01-01

    An overview is given of prototype-based models in machine learning. In this framework, observations, i.e., data, are stored in terms of typical representatives. Together with a suitable measure of similarity, the systems can be employed in the context of unsupervised and supervised analysis of poten

  3. CEAI: CCM based Email Authorship Identification Model

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah

    2013-01-01

    content features. It is observed that the use of such features in the authorship identification process has a positive impact on the accuracy of the authorship identification task. We performed experiments to justify our arguments and compared the results with other base line models. Experimental results...

  4. Prototype-based models in machine learning

    NARCIS (Netherlands)

    Biehl, Michael; Hammer, Barbara; Villmann, Thomas

    2016-01-01

    An overview is given of prototype-based models in machine learning. In this framework, observations, i.e., data, are stored in terms of typical representatives. Together with a suitable measure of similarity, the systems can be employed in the context of unsupervised and supervised analysis of

  5. Port-based modeling of mechatronic systems

    NARCIS (Netherlands)

    Breedveld, Peter C.

    2004-01-01

    Many engineering activities, including mechatronic design, require that a multidomain or ‘multi-physics’ system and its control system be designed as an integrated system. This contribution discusses the background and tools for a port-based approach to integrated modeling and simulation of physical

  6. Sandboxes for Model-Based Inquiry

    Science.gov (United States)

    Brady, Corey; Holbert, Nathan; Soylu, Firat; Novak, Michael; Wilensky, Uri

    2015-01-01

    In this article, we introduce a class of constructionist learning environments that we call "Emergent Systems Sandboxes" ("ESSs"), which have served as a centerpiece of our recent work in developing curriculum to support scalable model-based learning in classroom settings. ESSs are a carefully specified form of virtual…

  7. Deriving Framework Usages Based on Behavioral Models

    Science.gov (United States)

    Zenmyo, Teruyoshi; Kobayashi, Takashi; Saeki, Motoshi

    One of the critical issues in framework-based software development is the huge introduction cost caused by the technical gap between developers and users of frameworks. This paper proposes a technique for deriving framework usages that implement a given requirements specification. By using the derived usages, the users can use the frameworks without understanding them in detail. Requirements specifications that describe definite behavioral requirements cannot be related to frameworks as-is, since the frameworks deliberately lack a definite control structure so that users can customize them to suit given requirements specifications. To cope with this issue, a technique based on satisfiability problems (SAT) is employed to derive the control structures of the framework model. In the proposed technique, requirements specifications and frameworks are modeled as Labeled Transition Systems (LTSs) with branch conditions represented by predicates. Truth assignments of the branch conditions in the framework models are not given initially, which represents the customizable control structure. Deriving the truth assignments of the branch conditions is treated as a SAT instance by assuming relations between termination states of the requirements specification model and those of the framework model. This derivation technique is incorporated into a technique we have proposed previously for relating actions of requirements specifications to those of frameworks. Furthermore, this paper discusses a case study of typical use cases in e-commerce systems.
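    The following toy sketch illustrates the core idea of searching for branch-condition truth assignments under which a framework LTS terminates as the specification requires; it enumerates assignments by brute force rather than calling a SAT solver, and the framework states, branch variables and termination states are invented for illustration.

```python
# Toy illustration (not the authors' tool): enumerate truth assignments for the
# framework's branch conditions and keep those under which the framework LTS
# terminates in the states required by the requirements specification.
from itertools import product

# Framework LTS: state -> list of (branch_condition or None, next_state)
framework = {
    "start":     [("useCart", "cart"), (None, "directBuy")],
    "cart":      [("checkout", "pay"), (None, "start")],
    "directBuy": [(None, "pay")],
    "pay":       [],                     # termination state
}
required_terminations = {"pay"}          # assumed to come from the specification
branch_vars = ["useCart", "checkout"]

def terminates_ok(assignment, state="start", seen=None):
    seen = set() if seen is None else seen
    if state in seen:
        return False                     # loop: never terminates
    seen = seen | {state}
    moves = framework[state]
    if not moves:
        return state in required_terminations
    for cond, nxt in moves:              # take the first enabled transition
        if cond is None or assignment[cond]:
            return terminates_ok(assignment, nxt, seen)
    return False

for values in product([True, False], repeat=len(branch_vars)):
    assignment = dict(zip(branch_vars, values))
    if terminates_ok(assignment):
        print("feasible assignment:", assignment)
```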

  8. Introducing Waqf Based Takaful Model in India

    Directory of Open Access Journals (Sweden)

    Syed Ahmed Salman

    2014-03-01

    Full Text Available Objective – Waqf is a unique feature of the socioeconomic system of Islam in a multi-religious and developing country like India. India is a country rich in waqf assets, and the history of waqf in India can be traced back 800 years. Most researchers suggest how waqf can be used as a tool to mitigate the poverty of Muslims. India has the third highest Muslim population after Indonesia and Pakistan. However, the majority of Muslims belong to the low income group and they are in need of help. It is believed that waqf can be utilized for the betterment of the Indian Muslim community. Among the available uses of waqf assets, the main objective of this paper is to introduce a waqf based takaful model in India. In addition, how this proposed model can be adopted in India is highlighted. Methods – Library research is applied, since this paper relies on secondary data and a thorough review of the most relevant literature. Result – India, as a country rich in waqf assets, should fully utilize these resources to help Muslims through takaful. Conclusion – In this study, we have proposed a waqf based takaful model for India that combines the concepts of mudarabah and wakalah. We recommend this model based on the background of the country and its situation. Since we have not tested the viability of this model in India, future research should continue with this testing. Keywords: Waqf, Takaful, Poverty and India

  9. Whole body acid-base modeling revisited.

    Science.gov (United States)

    Ring, Troels; Nielsen, Søren

    2017-04-01

    The textbook account of whole body acid-base balance in terms of endogenous acid production, renal net acid excretion, and gastrointestinal alkali absorption, which is the only comprehensive model around, has never been applied in clinical practice or been formally validated. To improve understanding of acid-base modeling, we rewrote this conventional model as an expression solely in terms of urine chemistry. Renal net acid excretion and endogenous acid production were already formulated in terms of urine chemistry, and from the literature gastrointestinal alkali absorption could also be expressed in terms of urine excretions. With a few assumptions, this expression of net acid balance turned out to be arithmetically identical to minus the urine charge, whereby urine was predicted to acquire a net negative charge as acidosis develops. The literature already mentions unexplained negative urine charges, so we scrutinized a series of seminal papers and confirmed empirically the theoretical prediction that observed urine charge did become negative as acidosis developed. Hence, we conclude that the conventional model is problematic, since it predicts what is physiologically impossible. Therefore, we need a new model for whole body acid-base balance that does not have impossible implications. Furthermore, new experimental studies are needed to account for the charge imbalance in urine during development of acidosis.
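    A schematic reading of the argument, in textbook notation rather than the paper's exact algebra: the reduction to minus the urine charge is stated by the abstract, and the intermediate expressions below are standard physiology written out as an assumption-laden sketch.

```latex
% Textbook renal net acid excretion written in urine chemistry:
\mathrm{NAE} = U_{\mathrm{TA}} + U_{\mathrm{NH_4^+}} - U_{\mathrm{HCO_3^-}}
% When endogenous acid production and gastrointestinal alkali absorption are
% likewise expressed through urinary excretions, the whole-body net acid balance
% collapses (under the paper's assumptions) to minus the net urine charge:
\text{net acid balance} \;\approx\; -\,Q_u,
\qquad
Q_u \;=\; \sum_{\text{cations } i} z_i\,U_i \;-\; \sum_{\text{anions } j} |z_j|\,U_j ,
% so a developing acidosis (positive acid balance) would require Q_u < 0, i.e. an
% apparently negative urine charge -- the physiologically impossible implication
% the paper points out.
```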

  10. Statistical Seasonal Sea Surface based Prediction Model

    Science.gov (United States)

    Suarez, Roberto; Rodriguez-Fonseca, Belen; Diouf, Ibrahima

    2014-05-01

    The interannual variability of the sea surface temperature (SST) plays a key role in the strongly seasonal rainfall regime of the West African region. The predictability of the seasonal cycle of rainfall is widely discussed by the scientific community, with results that fail to be satisfactory due to the difficulty dynamical models have in reproducing the behavior of the Inter Tropical Convergence Zone (ITCZ). To tackle this problem, a statistical model based on oceanic predictors has been developed at the Universidad Complutense de Madrid (UCM) with the aim of complementing and enhancing the predictability of the West African Monsoon (WAM) as an alternative to the coupled models. The model, called S4CAST (SST-based Statistical Seasonal Forecast), is based on discriminant analysis techniques, specifically Maximum Covariance Analysis (MCA) and Canonical Correlation Analysis (CCA). Beyond the application of the model to the prediction of rainfall in West Africa, its use extends to a range of oceanic, atmospheric and health related parameters influenced by the sea surface temperature as a defining factor of variability.
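    A minimal Maximum Covariance Analysis sketch on synthetic SST and rainfall anomaly fields; MCA extracts the paired spatial patterns whose expansion coefficients maximize covariance, via an SVD of the cross-covariance matrix. The array shapes are assumed for illustration; this is not the S4CAST code.

```python
# Minimal MCA sketch on synthetic data -- the core of an SST-based statistical
# seasonal forecast, not the S4CAST implementation itself.
import numpy as np

rng = np.random.default_rng(1)
n_years, n_sst, n_rain = 30, 500, 200          # assumed grid sizes
sst = rng.standard_normal((n_years, n_sst))    # SST anomalies (time x grid)
rain = rng.standard_normal((n_years, n_rain))  # rainfall anomalies (time x grid)

# Remove the time mean, build the cross-covariance matrix and take its SVD.
sst -= sst.mean(axis=0)
rain -= rain.mean(axis=0)
C = sst.T @ rain / (n_years - 1)
U, s, Vt = np.linalg.svd(C, full_matrices=False)

# Leading coupled mode: spatial patterns and their expansion coefficients,
# which would then be used to build the statistical prediction.
sst_pattern, rain_pattern = U[:, 0], Vt[0]
sst_ec, rain_ec = sst @ sst_pattern, rain @ rain_pattern
print("squared covariance fraction of mode 1:", s[0]**2 / np.sum(s**2))
```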

  11. Unifying Model-Based and Reactive Programming within a Model-Based Executive

    Science.gov (United States)

    Williams, Brian C.; Gupta, Vineet; Norvig, Peter (Technical Monitor)

    1999-01-01

    Real-time, model-based, deduction has recently emerged as a vital component in AI's tool box for developing highly autonomous reactive systems. Yet one of the current hurdles towards developing model-based reactive systems is the number of methods simultaneously employed, and their corresponding melange of programming and modeling languages. This paper offers an important step towards unification. We introduce RMPL, a rich modeling language that combines probabilistic, constraint-based modeling with reactive programming constructs, while offering a simple semantics in terms of hidden state Markov processes. We introduce probabilistic, hierarchical constraint automata (PHCA), which allow Markov processes to be expressed in a compact representation that preserves the modularity of RMPL programs. Finally, a model-based executive, called Reactive Burton is described that exploits this compact encoding to perform efficIent simulation, belief state update and control sequence generation.

  12. Agent Based Modeling Applications for Geosciences

    Science.gov (United States)

    Stein, J. S.

    2004-12-01

    Agent-based modeling techniques have successfully been applied to systems in which complex behaviors or outcomes arise from varied interactions between individuals in the system. Each individual interacts with its environment, as well as with other individuals, by following a set of relatively simple rules. Traditionally this "bottom-up" modeling approach has been applied to problems in the fields of economics and sociology, but more recently it has been introduced to various disciplines in the geosciences. The technique can help explain the origin of complex processes from a relatively simple set of rules, incorporate large and detailed datasets when they exist, and simulate the effects of extreme events on system-wide behavior. Some of the challenges associated with this modeling method include: significant computational requirements in order to keep track of thousands to millions of agents; a lack of methods and strategies for model validation; and the absence of a formal methodology for evaluating model uncertainty. Challenges specific to the geosciences include how to define agents that control water, contaminant fluxes, climate forcing and other physical processes, and how to link these "geo-agents" into larger agent-based simulations that include social systems such as demographics, economics and regulations. Effective management of limited natural resources (such as water, hydrocarbons, or land) requires an understanding of what factors influence the demand for these resources on a regional and temporal scale. Agent-based models can be used to simulate this demand across a variety of sectors under a range of conditions and determine effective and robust management policies and monitoring strategies. The recent focus on the role of biological processes in the geosciences is another example of an area that could benefit from agent-based applications. A typical approach to modeling the effect of biological processes in geologic media has been to represent these processes in

  13. Entropy Based Modelling for Estimating Demographic Trends.

    Directory of Open Access Journals (Sweden)

    Guoqi Li

    Full Text Available In this paper, an entropy-based method is proposed to forecast the demographical changes of countries. We formulate the estimation of future demographical profiles as a constrained optimization problem, anchored on the empirically validated assumption that the entropy of the age distribution increases over time. The procedure of the proposed method involves three stages, namely: (1) prediction of the age distribution of a country's population based on an "age-structured population model"; (2) estimation of the age distribution of each individual household size with an entropy-based formulation based on an "individual household size model"; and (3) estimation of the number of each household size based on a "total household size model". The last stage is achieved by projecting the age distribution of the country's population (obtained in stage 1) onto the age distributions of individual household sizes (obtained in stage 2). The effectiveness of the proposed method is demonstrated on real-world data, and it is general and versatile enough to be extended to other time dependent demographic variables.
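    A minimal sketch of the constrained maximum-entropy step that such a formulation relies on: choose a distribution that maximizes Shannon entropy subject to known constraints. The number of age bins and the moment constraint below are illustrative, not the paper's three-stage model.

```python
# Sketch of an entropy-based estimation step: maximize Shannon entropy of an age
# distribution subject to illustrative constraints (not the paper's full model).
import numpy as np
from scipy.optimize import minimize

n_bins = 10                      # age bins (assumed)
mean_age_bin = 4.2               # assumed known moment from the population model

def neg_entropy(p):
    p = np.clip(p, 1e-12, None)  # guard the log at the boundary
    return np.sum(p * np.log(p))

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},                              # probabilities sum to 1
    {"type": "eq", "fun": lambda p: np.dot(p, np.arange(n_bins)) - mean_age_bin},  # fixed mean age
]
p0 = np.full(n_bins, 1.0 / n_bins)
res = minimize(neg_entropy, p0, bounds=[(0, 1)] * n_bins, constraints=constraints)
print("max-entropy age distribution:", np.round(res.x, 3))
```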

  14. N170 Changes Show Identifiable Chinese Characters Compete Primarily with Faces Rather than Houses.

    Science.gov (United States)

    Fan, Cong; He, Weiqi; He, Huamin; Ren, Guofang; Luo, Yuejia; Li, Hong; Luo, Wenbo

    2015-01-01

    Character processing is a crucial cognitive skill that is highly emphasized and industriously cultivated in contemporary society. In the present study, using a competition paradigm, we examined the electrophysiological correlates of different relationships between Chinese characters and faces and between Chinese characters and houses during early visual processing. We observed that identifiable Chinese characters compete primarily with faces rather than houses at an early visual processing stage, with a significantly reduced N170 for faces but not for houses, when they were viewed concurrently with identifiable characters relative to when they were viewed concurrently with unidentifiable characters. Consistent with our previous study, there was a significant increase in N170 after characters have been learned, indicating a modulatory effect of Chinese character identification level on N170 amplitude. Furthermore, we found an enlarged N170 in response to faces compared to houses, indicating that the neural mechanisms for processing faces and houses are different at an early visual processing stage.

  15. Which sensory perception is primarily considered, in consumers’ hedonic evaluation of foods?

    DEFF Research Database (Denmark)

    Andersen, Barbara Vad; Brockhoff, Per B.; Hyldig, Grethe

    2015-01-01

    An analysis of the primary hedonic drivers of liking and sensory satisfaction will provide valuable information to product developers on which sensory properties to emphasise the most. The aims of the present study were: a) to study if liking of the sensory properties: appearance, odour, taste...... and texture were considered equally, when consumers rated overall liking and sensory satisfaction, b) to study if the relation depended on whether liking of sensory properties was related to overall liking or sensory satisfaction, and c) to study individual differences in which sensory properties...... the consumers primarily paid attention to when rating overall liking and sensory satisfaction, respectively. Four apple-cherry fruit drinks were used, varying in: type of sweetener, and addition of aroma and fibre. The fruit drinks were used in a cross-over consumer study on 67 subjects together......

  16. Non-Hodgkin's Lymphoma Primarily Presenting with Fanconi Syndrome and Acute Kidney Injury

    Institute of Scientific and Technical Information of China (English)

    Wen-ling Ye; Bing Han; Bing-yan Liu; Chan Meng; Wei Ye; Yu-bing Wen; Hang Li; Xue-mei Li

    2010-01-01

    KIDNEY involvement is common in non-Hodgkin's lymphoma (NHL), with incidence up to 30%-40% in autopsy studies. However, it usually occurs late in the course of the disease and is clinically silent. Clinically overt renal disease including acute kidney injury (AKI) as its primary manifestation is rarely reported; moreover, Fanconi syndrome (FS) is extremely rare as the main manifestation in NHL. In this report, we presented a case of NHL primarily presenting with FS and AKI due to diffuse interstitial infiltration of NHL cells, and emphasized the important role of renal biopsy, especially renal immunohistochemical analysis, in the diagnosis of renal diffuse lymphoma.

  17. Family-Based Model Checking Without a Family-Based Model Checker

    DEFF Research Database (Denmark)

    Dimovski, Aleksandar; Al-Sibahi, Ahmad Salim; Brabrand, Claus

    2015-01-01

    (systems with variability), specialized family-based model checking algorithms allow efficient verification of multiple variants, simultaneously. These algorithms scale much better than ``brute force'' verification of individual systems, one-by-one. Nevertheless, they can deal with only very small...... variational models. We address two key problems of family-based model checking. First, we improve scalability by introducing abstractions that simplify variability. Second, we reduce the burden of maintaining specialized family-based model checkers, by showing how the presented variability abstractions can...... be used to model-check variational models using the standard version of (single system) SPIN. The abstractions are first defined as Galois connections on semantic domains. We then show how to translate them into syntactic source-to-source transformations on variational models. This allows the use of SPIN...

  18. Mesoscopic model of actin-based propulsion.

    Directory of Open Access Journals (Sweden)

    Jie Zhu

    Full Text Available Two theoretical models dominate current understanding of actin-based propulsion: microscopic polymerization ratchet model predicts that growing and writhing actin filaments generate forces and movements, while macroscopic elastic propulsion model suggests that deformation and stress of growing actin gel are responsible for the propulsion. We examine both experimentally and computationally the 2D movement of ellipsoidal beads propelled by actin tails and show that neither of the two models can explain the observed bistability of the orientation of the beads. To explain the data, we develop a 2D hybrid mesoscopic model by reconciling these two models such that individual actin filaments undergoing nucleation, elongation, attachment, detachment and capping are embedded into the boundary of a node-spring viscoelastic network representing the macroscopic actin gel. Stochastic simulations of this 'in silico' actin network show that the combined effects of the macroscopic elastic deformation and microscopic ratchets can explain the observed bistable orientation of the actin-propelled ellipsoidal beads. To test the theory further, we analyze observed distribution of the curvatures of the trajectories and show that the hybrid model's predictions fit the data. Finally, we demonstrate that the model can explain both concave-up and concave-down force-velocity relations for growing actin networks depending on the characteristic time scale and network recoil. To summarize, we propose that both microscopic polymerization ratchets and macroscopic stresses of the deformable actin network are responsible for the force and movement generation.

  19. PARTICIPATION BASED MODEL OF SHIP CREW MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Toni Bielić

    2014-10-01

    Full Text Available This paper analyses the participation-based model on board ship as a possibly optimal leadership model in the shipping industry, with accent on the decision-making process. The authors have tried to define the master's behaviour model and management style, identifying the drawbacks and disadvantages of a vertical, pyramidal organization with the master on top. The paper describes the efficiency of decision making within a team organization and the optimization of a ship's organisation by introducing teamwork on board the ship. Three examples of ship accidents are studied and evaluated through the "leader-participation" model. The model of participation-based management, as a model of teamwork, has been applied in studying the cause and effect of accidents, with a critical review of communication and management of human resources on a ship. The results show that the cause of all three accidents was the autocratic behaviour of the leaders and a lack of communication within the teams.

  20. Mandarin Pronunciation Modeling Based on CASS Corpus

    Institute of Scientific and Technical Information of China (English)

    郑方; 宋战江; Pascale Fung; William Byrne

    2002-01-01

    Pronunciation variability is an important issue that must be faced when developing practical automatic spontaneous speech recognition systems. In this paper, the factors that may affect the recognition performance are analyzed, including those specific to the Chinese language. By studying the INITIAL/FINAL (IF) characteristics of the Chinese language and developing the Bayesian equation, the concepts of generalized INITIAL/FINAL (GIF) and generalized syllable (GS), the GIF modeling and the IF-GIF modeling, as well as context-dependent pronunciation weighting, are proposed based on a phonetically well-transcribed seed database. By using these methods, the Chinese syllable error rate (SER) is reduced by 6.3% and 4.2% compared with the GIF modeling and IF modeling respectively, when a language model, such as a syllable or word N-gram, is not used. The effectiveness of these methods is also proved when more data without phonetic transcription are used to refine the acoustic model using the proposed iterative forced-alignment based transcribing (IFABT) method, achieving a 5.7% SER reduction.

  1. Modeling Leaves Based on Real Image

    Institute of Scientific and Technical Information of China (English)

    CAO Yu-kun; LI Yun-feng; ZHU Qing-sheng; LIU Yin-bin

    2004-01-01

    Plants have complex structures. The shape of a plant component is vital for capturing the characteristics of a species. One of the challenges in computer graphics is to create the geometry of objects in an intuitive and direct way while allowing interactive manipulation of the resulting shapes. In this paper, an interactive method for modeling leaves based on real images is proposed, using biological data for individual plants. The modeling process begins with a one-dimensional analogue of implicit surfaces, from which a 2D silhouette of a leaf is generated based on image segmentation. The silhouette skeleton is thus obtained. Feature parameters of the leaf are extracted based on biologically experimental data, and the obtained leaf structure is then modified by comparing the synthetic result with the real leaf so as to make the leaf structure more realistic. Finally, the leaf mesh is constructed by sweeps.

  2. Multiagent-Based Model For ESCM

    Directory of Open Access Journals (Sweden)

    Delia MARINCAS

    2011-01-01

    Full Text Available Web based applications for Supply Chain Management (SCM) are now a necessity for every company in order to meet increasing customer demands, to face global competition and to make profit. A multiagent-based approach is appropriate for eSCM because it shows many of the characteristics an SCM system should have. For this reason, we have proposed a multiagent-based eSCM model which configures a virtual SC and automates the SC activities: selling, purchasing, manufacturing, planning, inventory, etc. This model will allow a better coordination of the supply chain network and will increase the effectiveness of the Web and intelligent technologies employed in eSCM software.

  3. Efficient Model-Based Diagnosis Engine

    Science.gov (United States)

    Fijany, Amir; Vatan, Farrokh; Barrett, Anthony; James, Mark; Mackey, Ryan; Williams, Colin

    2009-01-01

    An efficient diagnosis engine - a combination of mathematical models and algorithms - has been developed for identifying faulty components in a possibly complex engineering system. This model-based diagnosis engine embodies a twofold approach to reducing, relative to prior model-based diagnosis engines, the amount of computation needed to perform a thorough, accurate diagnosis. The first part of the approach involves a reconstruction of the general diagnostic engine to reduce the complexity of the mathematical-model calculations and of the software needed to perform them. The second part of the approach involves algorithms for computing a minimal diagnosis (the term "minimal diagnosis" is defined below). A somewhat lengthy background discussion is prerequisite to a meaningful summary of the innovative aspects of the present efficient model-based diagnosis engine. In model-based diagnosis, the function of each component and the relationships among all the components of the engineering system to be diagnosed are represented as a logical system denoted the system description (SD). Hence, the expected normal behavior of the engineering system is the set of logical consequences of the SD. Faulty components lead to inconsistencies between the observed behaviors of the system and the SD (see figure). Diagnosis - the task of finding faulty components - is reduced to finding those components, the abnormalities of which could explain all the inconsistencies. The solution of the diagnosis problem should be a minimal diagnosis, which is a minimal set of faulty components. A minimal diagnosis stands in contradistinction to the trivial solution, in which all components are deemed to be faulty, and which, therefore, always explains all inconsistencies.
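    A toy sketch of the minimal-diagnosis notion defined above (not the engine itself): given conflict sets of components that could explain each observed inconsistency, the smallest component sets that intersect every conflict are the minimal diagnoses. The components and conflicts below are invented for illustration.

```python
# Toy minimal-diagnosis sketch: find the smallest sets of components whose
# abnormality explains every inconsistency between observations and the
# system description. Conflict sets here are illustrative placeholders.
from itertools import combinations

components = ["A", "B", "C"]
conflicts = [{"A", "B"}, {"B", "C"}]   # each inconsistency's possible culprits

def explains_all(candidate):
    return all(candidate & conflict for conflict in conflicts)

minimal_diagnoses = []
for size in range(1, len(components) + 1):
    hits = [set(c) for c in combinations(components, size) if explains_all(set(c))]
    if hits:
        minimal_diagnoses = hits       # smallest working size = minimal diagnoses
        break
print("minimal diagnoses:", minimal_diagnoses)   # e.g. [{'B'}]
```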

  4. Consensus guidelines for oral dosing of primarily renally cleared medications in older adults.

    Science.gov (United States)

    Hanlon, Joseph T; Aspinall, Sherrie L; Semla, Todd P; Weisbord, Steven D; Fried, Linda F; Good, C Bernie; Fine, Michael J; Stone, Roslyn A; Pugh, Mary Jo V; Rossi, Michelle I; Handler, Steven M

    2009-02-01

    To establish consensus oral dosing guidelines for primarily renally cleared medications prescribed for older adults. Literature search followed by a two-round modified Delphi survey. A nationally representative survey of experts in geriatric clinical pharmacy. Eleven geriatric clinical pharmacists. After a comprehensive literature search and review by an investigative group of six physicians (2 general internal medicine, 2 nephrology, 2 geriatrics), 43 dosing recommendations for 30 medications at various levels of renal function were created. The expert panel rated its agreement with each of these 43 dosing recommendations using a 5-point Likert scale (1=strongly disagree to 5=strongly agree). Recommendation-specific means and 95% confidence intervals were estimated. Consensus was defined as a lower 95% confidence limit of greater than 4.0 for the recommendation-specific mean score. The response rate was 81.8% (9/11) for the first round. All respondents who completed the first round also completed the second round. The expert panel reached consensus on 26 recommendations involving 18 (60%) medications. For 10 medications (chlorpropamide, colchicine, cotrimoxazole, glyburide, meperidine, nitrofurantoin, probenecid, propoxyphene, spironolactone, and triamterene), the consensus recommendation was not to use the medication in older adults below a specified level of renal function (e.g., creatinine clearance <30 mL/min). For the remaining eight medications (acyclovir, amantadine, ciprofloxacin, gabapentin, memantine, ranitidine, rimantadine, and valacyclovir), specific recommendations for dose reduction or interval extension were made. An expert panel of geriatric clinical pharmacists was able to reach consensus agreement on a number of oral medications that are primarily renally cleared.
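    The consensus rule quoted above (lower 95% confidence limit of the mean Likert score greater than 4.0) is simple to reproduce; the sketch below uses hypothetical ratings and a t-based confidence interval, not the study's data.

```python
# Sketch of the consensus rule: consensus is reached when the lower 95%
# confidence limit of the mean Likert score exceeds 4.0. Ratings are hypothetical.
import numpy as np
from scipy import stats

ratings = np.array([5, 4, 5, 5, 4, 5, 4, 5, 5])   # one recommendation, 9 panelists
mean = ratings.mean()
sem = stats.sem(ratings)
lower, upper = stats.t.interval(0.95, len(ratings) - 1, loc=mean, scale=sem)
print(f"mean = {mean:.2f}, 95% CI = ({lower:.2f}, {upper:.2f}), consensus = {lower > 4.0}")
```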

  5. Modeling acquaintance networks based on balance theory

    Directory of Open Access Journals (Sweden)

    Vukašinović Vida

    2014-09-01

    Full Text Available An acquaintance network is a social structure made up of a set of actors and the ties between them. These ties change dynamically as a consequence of incessant interactions between the actors. In this paper we introduce a social network model called the Interaction-Based (IB) model that involves well-known sociological principles. The connections between the actors and the strength of the connections are influenced by the continuous positive and negative interactions between the actors and, vice versa, the future interactions are more likely to happen between the actors that are connected with stronger ties. The model is also inspired by the social behavior of animal species, particularly that of ants in their colony. A model evaluation showed that the IB model turned out to be sparse. The model has a small diameter and an average path length that grows in proportion to the logarithm of the number of vertices. The clustering coefficient is relatively high, and its value stabilizes in larger networks. The degree distributions are slightly right-skewed. In the mature phase of the IB model, i.e., when the number of edges does not change significantly, most of the network properties do not change significantly either. The IB model was found to be the best of all the compared models in simulating the e-mail URV (University Rovira i Virgili of Tarragona) network because the properties of the IB model more closely matched those of the e-mail URV network than the other models.
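    The network properties used in this kind of evaluation (average degree, clustering, average path length versus log n, diameter) are straightforward to compute; the sketch below does so with networkx on a stand-in small-world graph, since the IB growth rules themselves are not reproduced here.

```python
# Sketch of the property checks used to evaluate such a network model
# (computed on a stand-in graph, not on the IB model itself).
import math
import networkx as nx

G = nx.connected_watts_strogatz_graph(n=1000, k=6, p=0.1, seed=0)   # stand-in network

n = G.number_of_nodes()
print("average degree:", 2 * G.number_of_edges() / n)
print("average clustering coefficient:", nx.average_clustering(G))
print("average shortest path length:", nx.average_shortest_path_length(G))
print("log(n) for comparison:", math.log(n))
print("diameter:", nx.diameter(G))
```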

  6. Satellite-based terrestrial production efficiency modeling

    Directory of Open Access Journals (Sweden)

    Obersteiner Michael

    2009-09-01

    Full Text Available Abstract Production efficiency models (PEMs) are based on the theory of light use efficiency (LUE), which states that a relatively constant relationship exists between photosynthetic carbon uptake and radiation receipt at the canopy level. Challenges remain however in the application of the PEM methodology to global net primary productivity (NPP) monitoring. The objectives of this review are as follows: (1) to describe the general functioning of six PEMs (CASA, GLO-PEM, TURC, C-Fix, MOD17, and BEAMS) identified in the literature; (2) to review each model to determine potential improvements to the general PEM methodology; (3) to review the related literature on satellite-based gross primary productivity (GPP) and NPP modeling for additional possibilities for improvement; and (4) based on this review, to propose items for coordinated research. This review noted a number of possibilities for improvement to the general PEM architecture - ranging from LUE to meteorological and satellite-based inputs. Current PEMs tend to treat the globe similarly in terms of physiological and meteorological factors, often ignoring unique regional aspects. Each of the existing PEMs has developed unique methods to estimate NPP and the combination of the most successful of these could lead to improvements. It may be beneficial to develop regional PEMs that can be combined under a global framework. The results of this review suggest the creation of a hybrid PEM could bring about a significant enhancement to the PEM methodology and thus terrestrial carbon flux modeling. Key items topping the PEM research agenda identified in this review include the following: LUE should not be assumed constant, but should vary by plant functional type (PFT) or photosynthetic pathway; evidence is mounting that PEMs should consider incorporating diffuse radiation; continue to pursue relationships between satellite-derived variables and LUE, GPP and autotrophic respiration (Ra); there is an urgent need for
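    For reference, the light-use-efficiency relation that these PEMs share, written in its generic form; this is the standard formulation rather than that of any single model reviewed, and the environmental down-regulation scalar is an assumption about how most PEMs implement stress factors.

```latex
% Generic light-use-efficiency relation underlying PEMs (standard form, not the
% exact formulation of any one of the six models reviewed):
\mathrm{GPP} = \varepsilon \; f_{\mathrm{APAR}} \; \mathrm{PAR} \; \sigma(T,\mathrm{VPD},\ldots),
\qquad
\mathrm{NPP} = \mathrm{GPP} - R_a ,
% where \varepsilon is the (often PFT-dependent) light-use efficiency, f_APAR the
% fraction of absorbed photosynthetically active radiation, PAR the incident
% photosynthetically active radiation, \sigma an assumed environmental
% down-regulation scalar, and R_a autotrophic respiration.
```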

  7. Multivariate statistical modelling based on generalized linear models

    CERN Document Server

    Fahrmeir, Ludwig

    1994-01-01

    This book is concerned with the use of generalized linear models for univariate and multivariate regression analysis. Its emphasis is to provide a detailed introductory survey of the subject based on the analysis of real data drawn from a variety of subjects including the biological sciences, economics, and the social sciences. Where possible, technical details and proofs are deferred to an appendix in order to provide an accessible account for non-experts. Topics covered include: models for multi-categorical responses, model checking, time series and longitudinal data, random effects models, and state-space models. Throughout, the authors have taken great pains to discuss the underlying theoretical ideas in ways that relate well to the data at hand. As a result, numerous researchers whose work relies on the use of these models will find this an invaluable account to have on their desks. "The basic aim of the authors is to bring together and review a large part of recent advances in statistical modelling of m...

  8. Bacterial diversity shift determined by different diets in the gut of the spotted wing fly Drosophila suzukii is primarily reflected on acetic acid bacteria

    KAUST Repository

    Vacchini, Violetta

    2016-11-25

    The pivotal role of diet in shaping gut microbiota has been evaluated in different animal models, including insects. Drosophila flies harbour an inconstant microbiota among which acetic acid bacteria (AAB) are important components. Here, we investigated the bacterial and AAB components of the invasive pest Drosophila suzukii microbiota, by studying the same insect population separately grown on fruit-based or non-fruit artificial diet. AAB were highly prevalent in the gut under both diets (90 and 92% infection rates with fruits and artificial diet, respectively). Fluorescent in situ hybridization and recolonization experiments with green fluorescent protein (Gfp)-labelled strains showed AAB capability to massively colonize insect gut. High-throughput sequencing on 16S rRNA gene indicated that the bacterial microbiota of guts fed with the two diets clustered separately. By excluding AAB-related OTUs from the analysis, insect bacterial communities did not cluster separately according to the diet, suggesting that diet-based diversification of the community is primarily reflected on the AAB component of the community. Diet influenced also AAB alpha-diversity, with separate OTU distributions based on diets. High prevalence, localization and massive recolonization, together with AAB clustering behaviour in relation to diet, suggest an AAB role in the D. suzukii gut response to diet modification. This article is protected by copyright. All rights reserved.

  9. Bacterial diversity shift determined by different diets in the gut of the spotted wing fly Drosophila suzukii is primarily reflected on acetic acid bacteria.

    Science.gov (United States)

    Vacchini, Violetta; Gonella, Elena; Crotti, Elena; Prosdocimi, Erica M; Mazzetto, Fabio; Chouaia, Bessem; Callegari, Matteo; Mapelli, Francesca; Mandrioli, Mauro; Alma, Alberto; Daffonchio, Daniele

    2017-04-01

    The pivotal role of diet in shaping gut microbiota has been evaluated in different animal models, including insects. Drosophila flies harbour an inconstant microbiota among which acetic acid bacteria (AAB) are important components. Here, we investigated the bacterial and AAB components of the invasive pest Drosophila suzukii microbiota, by studying the same insect population separately grown on fruit-based or non-fruit artificial diet. AAB were highly prevalent in the gut under both diets (90 and 92% infection rates with fruits and artificial diet respectively). Fluorescent in situ hybridization and recolonization experiments with green fluorescent protein (Gfp)-labelled strains showed AAB capability to massively colonize insect gut. High-throughput sequencing on 16S rRNA gene indicated that the bacterial microbiota of guts fed with the two diets clustered separately. By excluding AAB-related OTUs from the analysis, insect bacterial communities did not cluster separately according to the diet, suggesting that diet-based diversification of the community is primarily reflected on the AAB component of the community. Diet influenced also AAB alpha-diversity, with separate OTU distributions based on diets. High prevalence, localization and massive recolonization, together with AAB clustering behaviour in relation to diet, suggest an AAB role in the D. suzukii gut response to diet modification. © 2016 Society for Applied Microbiology and John Wiley & Sons Ltd.

  10. Mechanics model for actin-based motility.

    Science.gov (United States)

    Lin, Yuan

    2009-02-01

    We present here a mechanics model for the force generation by actin polymerization. The possible adhesions between the actin filaments and the load surface, as well as the nucleation and capping of filament tips, are included in this model on top of the well-known elastic Brownian ratchet formulation. A closed form solution is provided from which the force-velocity relationship, summarizing the mechanics of polymerization, can be drawn. Model predictions on the velocity of moving beads driven by actin polymerization are consistent with experiment observations. This model also seems capable of explaining the enhanced actin-based motility of Listeria monocytogenes and beads by the presence of Vasodilator-stimulated phosphoprotein, as observed in recent experiments.
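    For context, the commonly quoted elastic Brownian ratchet force-velocity relation on which such models build; this is a textbook simplification, not the paper's closed-form solution with adhesion, nucleation and capping.

```latex
% Commonly quoted elastic Brownian ratchet force-velocity relation (simplified):
V(F) \;\approx\; V_{\max}\,\exp\!\left(-\frac{F\,\delta}{k_B T}\right),
% where F is the load per filament, \delta \approx 2.7\,\mathrm{nm} is the gap a
% monomer needs in order to intercalate at the filament tip, and
% V_max = k_{on} C \delta is the load-free polymerization velocity.
```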

  11. An immune based dynamic intrusion detection model

    Institute of Scientific and Technical Information of China (English)

    LI Tao

    2005-01-01

    With the dynamic description method for self and antigen, and the concept of dynamic immune tolerance for lymphocytes in the network-security domain presented in this paper, a new immune based dynamic intrusion detection model (Idid) is proposed. In Idid, the dynamic models and the corresponding recursive equations of the lifecycle of mature lymphocytes and of immune memory are built. Therefore, the problem of the dynamic description of self and nonself in computer immune systems is solved, and the defect of low efficiency in generating mature lymphocytes in traditional computer immune systems is overcome. Simulations of this model are performed, and the comparison experiment results show that the proposed dynamic intrusion detection model has better adaptability than the traditional methods.

  12. Electrochemistry-based Battery Modeling for Prognostics

    Science.gov (United States)

    Daigle, Matthew J.; Kulkarni, Chetan Shrikant

    2013-01-01

    Batteries are used in a wide variety of applications. In recent years, they have become popular as a source of power for electric vehicles such as cars, unmanned aerial vehicles, and commercial passenger aircraft. In such application domains, it becomes crucial to both monitor battery health and performance and to predict end of discharge (EOD) and end of useful life (EOL) events. To implement such technologies, it is crucial to understand how batteries work and to capture that knowledge in the form of models that can be used by monitoring, diagnosis, and prognosis algorithms. In this work, we develop electrochemistry-based models of lithium-ion batteries that capture the significant electrochemical processes, are computationally efficient, capture the effects of aging, and are of suitable accuracy for reliable EOD prediction in a variety of usage profiles. This paper reports on the progress of such a model, with results demonstrating the model validity and accurate EOD predictions.

  13. Model based monitoring of stormwater runoff quality

    DEFF Research Database (Denmark)

    Birch, Heidi; Vezzaro, Luca; Mikkelsen, Peter Steen

    2012-01-01

    the information obtained about MPs discharged from the monitored system. A dynamic stormwater quality model was calibrated using MP data collected by volume-proportional and passive sampling in a storm drainage system in the outskirts of Copenhagen (Denmark) and a 10-year rain series was used to find annual......) for calibration of the model resulted in the same predicted level but narrower model prediction bounds than calibrations based on volume-proportional samples, allowing a better exploitation of the resources allocated for stormwater quality management.......Monitoring of micropollutants (MP) in stormwater is essential to evaluate the impacts of stormwater on the receiving aquatic environment. The aim of this study was to investigate how different strategies for monitoring of stormwater quality (combination of model with field sampling) affect...

  14. Physics-Based Modeling of Meteor Entry and Breakup

    Science.gov (United States)

    Prabhu, Dinesh K.; Agrawal, Parul; Allen, Gary A., Jr.; Bauschlicher, Charles W., Jr.; Brandis, Aaron M.; Chen, Yih-Kang; Jaffe, Richard L.; Palmer, Grant E.; Saunders, David A.; Stern, Eric C.; Tauber, Michael E.; Venkatapathy, Ethiraj

    2015-01-01

    A new research effort at NASA Ames Research Center has been initiated in Planetary Defense, which integrates the disciplines of planetary science, atmospheric entry physics, and physics-based risk assessment. This paper describes work within the new program and is focused on meteor entry and breakup. Over the last six decades significant effort was expended in the US and in Europe to understand meteor entry, including ablation, fragmentation and airburst (if any) for various types of meteors ranging from stony to iron spectral types. These efforts have produced primarily empirical mathematical models based on observations. Weaknesses of these models, apart from their empiricism, are reliance on idealized shapes (spheres, cylinders, etc.) and simplified models for the thermal response of meteoritic materials to aerodynamic and radiative heating. Furthermore, the fragmentation and energy release of meteors (airburst) is poorly understood. On the other hand, flight of human-made atmospheric entry capsules is well understood. The capsules and their requisite heatshields are designed and margined to survive entry. However, the highest speed Earth entry for capsules is 13 km/s (Stardust). Furthermore, Earth entry capsules have never exceeded diameters of 5 m, nor have their peak aerothermal environments exceeded 0.3 atm and 1 kW/cm2. The aims of the current work are: (i) to define the aerothermal environments for objects with entry velocities from 13 to 20 km/s; (ii) to explore various hypotheses of fragmentation and airburst of stony meteors in the near term; (iii) to explore the possibility of performing relevant ground-based tests to verify candidate hypotheses; and (iv) to quantify the energy released in airbursts. The results of the new simulations will be used to anchor these risk assessment analyses. With these aims in mind, state-of-the-art entry capsule design tools are being extended for meteor entries. We describe: (i) applications of current simulation tools to

  15. Predicting Learners Styles Based on Fuzzy Model

    Science.gov (United States)

    Alian, Marwah; Shaout, Adnan

    2017-01-01

    Learners style is grouped into four types mainly; Visual, auditory, kinesthetic and Read/Write. Each type of learners learns primarily through one of the main receiving senses, visual, listening, or by doing. Learner style has an effect on the learning process and learner's achievement. It is better to select suitable learning tool for the learner…

  16. Image-based modelling of organogenesis.

    Science.gov (United States)

    Iber, Dagmar; Karimaddini, Zahra; Ünal, Erkan

    2016-07-01

    One of the major challenges in biology concerns the integration of data across length and time scales into a consistent framework: how do macroscopic properties and functionalities arise from the molecular regulatory networks, and how can they change as a result of mutations? Morphogenesis provides an excellent model system to study how simple molecular networks robustly control complex processes on the macroscopic scale despite molecular noise, and how important functional variants can emerge from small genetic changes. Recent advancements in three-dimensional imaging technologies, computer algorithms and computer power now allow us to develop and analyse increasingly realistic models of biological control. Here, we present our pipeline for image-based modelling that includes the segmentation of images, the determination of displacement fields and the solution of systems of partial differential equations on the growing, embryonic domains. The development of suitable mathematical models, the data-based inference of parameter sets and the evaluation of competing models are still challenging, and current approaches are discussed.

  17. Parcels versus pixels: modeling agricultural land use across broad geographic regions using parcel-based field boundaries

    Science.gov (United States)

    Sohl, Terry L.; Dornbierer, Jordan; Wika, Steve; Sayler, Kristi L.; Quenzer, Robert

    2017-01-01

    Land use and land cover (LULC) change occurs at a local level within contiguous ownership and management units (parcels), yet LULC models primarily use pixel-based spatial frameworks. The few parcel-based models being used overwhelmingly focus on small geographic areas, limiting the ability to assess LULC change impacts at regional to national scales. We developed a modified version of the Forecasting Scenarios of land use change model to project parcel-based agricultural change across a large region in the United States Great Plains. An agricultural biofuel scenario was modeled from 2012 to 2030, using real parcel boundaries based on contiguous ownership and land management units. The resulting LULC projection provides a vastly improved representation of landscape pattern over existing pixel-based models, while simultaneously providing an unprecedented combination of thematic detail and broad geographic extent. The conceptual approach is practical and scalable, with potential use for national-scale projections.

  18. Model-based Tomographic Reconstruction Literature Search

    Energy Technology Data Exchange (ETDEWEB)

    Chambers, D H; Lehman, S K

    2005-11-30

    In the process of preparing a proposal for internal research funding, a literature search was conducted on the subject of model-based tomographic reconstruction (MBTR). The purpose of the search was to ensure that the proposed research would not replicate any previous work. We found that the overwhelming majority of work on MBTR which used parameterized models of the object was theoretical in nature. Only three researchers had applied the technique to actual data. In this note, we summarize the findings of the literature search.

  19. Soft sensor modeling based on Gaussian processes

    Institute of Scientific and Technical Information of China (English)

    XIONG Zhi-hua; HUANG Guo-hong; SHAO Hui-he

    2005-01-01

    In order to meet the demand of online optimal running, a novel soft sensor modeling approach based on Gaussian processes was proposed. The approach is moderately simple to implement and use without loss of performance. It is trained by optimizing the hyperparameters using the scaled conjugate gradient algorithm, with the squared exponential covariance function employed. Experimental simulations on a real-world example from a refinery show the advantages of the soft sensor modeling approach. Meanwhile, the method opens new possibilities for the application of kernel methods to potential fields.
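    A minimal Gaussian process soft-sensor sketch with the squared exponential (RBF) covariance mentioned above; note that scikit-learn tunes the hyperparameters with L-BFGS rather than the scaled conjugate gradient used in the paper, and the process data below are synthetic.

```python
# Minimal Gaussian-process soft sensor sketch (synthetic data, sklearn stand-in
# for the paper's implementation): easy-to-measure inputs predict a
# hard-to-measure quality variable, with predictive uncertainty as a bonus.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(2)
X = rng.uniform(0, 10, size=(80, 2))                                   # measured process variables
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.05 * rng.standard_normal(80)   # target quality variable

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0) + WhiteKernel(), normalize_y=True)
gp.fit(X, y)                                  # hyperparameters optimized internally

X_new = np.array([[3.0, 4.0], [7.5, 1.0]])
mean, std = gp.predict(X_new, return_std=True)
print("soft-sensor estimate:", mean, "+/-", std)
```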

  20. Changes in Greenland ice sheet elevation attributed primarily to snow accumulation variability

    Science.gov (United States)

    McConnell; Arthern; Mosley-Thompson; Davis; Bales; Thomas; Burkhart; Kyne

    2000-08-24

    The response of grounded ice sheets to a changing climate critically influences possible future changes in sea level. Recent satellite surveys over southern Greenland show little overall elevation change at higher elevations, but large spatial variability. Using satellite studies alone, it is not possible to determine the geophysical processes responsible for the observed elevation changes and to decide if recent rates of change exceed the natural variability. Here we derive changes in ice-sheet elevation in southern Greenland, for the years 1978-88, using a physically based model of firn densification and records of annual snow accumulation reconstructed from 12 ice cores at high elevation. Our patterns of accumulation-driven elevation change agree closely with contemporaneous satellite measurements of ice-sheet elevation change, and we therefore attribute the changes observed in 1978-88 to variability in snow accumulation. Similar analyses of longer ice-core records show that in this decade the Greenland ice sheet exhibited typical variability at high elevations, well within the long-term natural variability. Our results indicate that a better understanding of ice-sheet mass changes will require long-term measurements of both surface elevation and snow accumulation.

  1. History-based trust negotiation model

    Institute of Scientific and Technical Information of China (English)

    ZHAO Yi-zhu; ZHAO Yan-hua; LU Hong-wei

    2009-01-01

    Trust negotiation (TN) is an approach to establish trust between strangers through iterative disclosure of digital credentials. Speeding up subsequent negotiations between the same negotiators is a problem worthy of research. This paper introduces the concept of a visiting card and presents a history-based trust negotiation (HBTN) model. HBTN creates an account for a counterpart at the first negotiation and records the valid credentials that the counterpart disclosed during each trust negotiation in its historical information base (HIB). In a subsequent negotiation, neither party needs to disclose those credentials again. HBTN speeds up subsequent negotiations between entities that interact with each other frequently, without impairing privacy preservation.
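    A toy data-structure sketch of the historical information base idea described above: record which credentials a counterpart has already disclosed and validated, so a repeat negotiation only asks for what is still missing. This is an illustration, not the HBTN protocol itself; the class and credential names are hypothetical.

```python
# Toy sketch of a historical information base (HIB): remember validated
# credentials per counterpart so repeat negotiations can skip re-disclosure.
class HistoricalInformationBase:
    def __init__(self):
        self._accounts = {}                       # counterpart id -> set of credentials

    def record(self, counterpart, credentials):
        self._accounts.setdefault(counterpart, set()).update(credentials)

    def still_needed(self, counterpart, required):
        return set(required) - self._accounts.get(counterpart, set())

hib = HistoricalInformationBase()
hib.record("alice", {"employee_cert", "payment_cert"})      # first negotiation
# Subsequent negotiation: only credentials not already on file must be disclosed.
print(hib.still_needed("alice", {"employee_cert", "payment_cert", "age_cert"}))
```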

  2. The development and characterization of a primarily mineral calcium phosphate - poly(epsilon-caprolactone) biocomposite

    Science.gov (United States)

    Dunkley, Ian Robert

    Orthopaedic reconstruction often involves the surgical introduction of structural implants that provide for rigid fixation, skeletal stabilization, and bone integration. The high stresses incurred by these implanted devices have historically limited material choices to metallic and select polymeric formulations. While mechanical requirements are achieved, these non-degradable materials do not participate actively in the remodeling of the skeleton and present the possibility of long-term failure or rejection. This is particularly relevant in cervical fusion, an orthopaedic procedure to treat damaged, degenerative or diseased intervertebral discs. A significant improvement on the available synthetic bone replacement/regeneration options for implants to treat these conditions in the cervical spine may be achieved with the development of primarily mineral biocomposites comprised of a bioactive ceramic matrix reinforced with a biodegradable polymer. Such a biocomposite may be engineered to possess the clinically required mechanical properties of a particular application, while maintaining the ability to be remodeled completely by the body. A biocomposite of Si-doped calcium phosphate (Si-CaP) and poly(epsilon-caprolactone) (PCL) was developed for application as such a synthetic bone material for potential use as a fusion device in the cervical spine. In this thesis, a method by which high mineral content Si-CaP/PCL biocomposites with interpenetrating matrices of mineral and polymer phases may be prepared will be demonstrated, in addition to the effects of the various preparation parameters on the biocomposite density, porosity and mechanical properties. This new technique by which dense, primarily ceramic Si-CaP/PCL biocomposites were prepared allowed for the incorporation of mineral contents ranging between 45-97 vol%. Polymer infiltration, accomplished solely by passive capillary uptake over several days, was found to be capable of fully infiltrating the microporosity

  3. Modelling Inter-Particle Forces and Resulting Agglomerate Sizes in Cement-Based Materials

    DEFF Research Database (Denmark)

    Kjeldsen, Ane Mette; Geiker, Mette Rica

    2005-01-01

    The theory of inter-particle forces versus external shear in cement-based materials is reviewed. On this basis, calculations of the maximum agglomerate size present after the combined action of superplasticizers and shear are carried out. Qualitative experimental results indicate that external shear...... affects the particle size distribution of Mg(OH)2 (used as model material) as well as silica, whereas the addition of superplasticizers affects only the smallest particles in cement; superplasticizers thus primarily act as water reducers and not as dispersers....

  4. Model-Based Trace-Checking

    CERN Document Server

    Howard, Y; Gravell, A; Ferreira, C; Augusto, J C

    2011-01-01

    Trace analysis can be a useful way to discover problems in a program under test. Rather than writing a special purpose trace analysis tool, this paper proposes that traces can usefully be analysed by checking them against a formal model using a standard model-checker or else an animator for executable specifications. These techniques are illustrated using a Travel Agent case study implemented in J2EE. We added trace beans to this code that write trace information to a database. The traces are then extracted and converted into a form suitable for analysis by Spin, a popular model-checker, and Pro-B, a model-checker and animator for the B notation. This illustrates the technique, and also the fact that such a system can have a variety of models, in different notations, that capture different features. These experiments have demonstrated that model-based trace-checking is feasible. Future work is focussed on scaling up the approach to larger systems by increasing the level of automation.

  5. Individual eye model based on wavefront aberration

    Science.gov (United States)

    Guo, Huanqing; Wang, Zhaoqi; Zhao, Qiuling; Quan, Wei; Wang, Yan

    2005-03-01

    Based on the widely used Gullstrand-Le Grand eye model, an individual human eye model has been established here, which has individual corneal data, anterior chamber depth and eyeball depth. Crucially, the wavefront aberration calculated from the individual eye model is equal to the eye's wavefront aberration measured with a Hartmann-Shack wavefront sensor. There are four main steps to build the model. Firstly, a corneal topography instrument was used to measure the corneal surfaces and depth, and in order to input the cornea into the optical model, a high-order aspheric surface (Zernike Fringe Sag surface) was chosen to fit the corneal surfaces. Secondly, a Hartmann-Shack wavefront sensor, which provides the Zernike polynomials describing the wavefront aberration, was built to measure the wavefront aberration of the eye. Thirdly, the axial lengths of the eye's components were measured with A-ultrasonic technology. Then the data were input into the optical design software ZEMAX and the crystalline lens shapes were optimized with the aberration as the merit function. The individual eye model, which has the same wavefront aberrations as the real eye, is thereby established.

  6. Agent Based Model of Livestock Movements

    Science.gov (United States)

    Miron, D. J.; Emelyanova, I. V.; Donald, G. E.; Garner, G. M.

    The modelling of livestock movements within Australia is of national importance for the purposes of the management and control of exotic disease spread, infrastructure development and the economic forecasting of livestock markets. In this paper an agent based model for the forecasting of livestock movements is presented. This models livestock movements from farm to farm through a saleyard. The decision of farmers to sell or buy cattle is often complex and involves many factors such as climate forecast, commodity prices, the type of farm enterprise, the number of animals available and associated off-shore effects. In this model the farm agent's intelligence is implemented using a fuzzy decision tree that utilises two of these factors. These two factors are the livestock price fetched at the last sale and the number of stock on the farm. On each iteration of the model farms choose either to buy, sell or abstain from the market thus creating an artificial supply and demand. The buyers and sellers then congregate at the saleyard where livestock are auctioned using a second price sealed bid. The price time series output by the model exhibits properties similar to those found in real livestock markets.
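    The saleyard clearing rule described above (a second-price sealed-bid auction) is simple to sketch; the bids below are hypothetical, and the fuzzy decision tree that would generate them in the model is not reproduced here.

```python
# Sketch of a second-price sealed-bid auction: the highest bidder wins but pays
# the second-highest bid. Bids are hypothetical placeholders.
def second_price_auction(bids):
    """bids: dict of buyer -> sealed bid. Returns (winner, price paid)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

bids = {"farm_A": 410.0, "farm_B": 455.0, "farm_C": 430.0}   # $/head, hypothetical
winner, price = second_price_auction(bids)
print(f"lot sold to {winner} at {price} per head")
```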

  7. Agent-based modeling in ecological economics.

    Science.gov (United States)

    Heckbert, Scott; Baynes, Tim; Reeson, Andrew

    2010-01-01

    Interconnected social and environmental systems are the domain of ecological economics, and models can be used to explore feedbacks and adaptations inherent in these systems. Agent-based modeling (ABM) represents autonomous entities, each with dynamic behavior and heterogeneous characteristics. Agents interact with each other and their environment, resulting in emergent outcomes at the macroscale that can be used to quantitatively analyze complex systems. ABM is contributing to research questions in ecological economics in the areas of natural resource management and land-use change, urban systems modeling, market dynamics, changes in consumer attitudes, innovation, and diffusion of technology and management practices, commons dilemmas and self-governance, and psychological aspects to human decision making and behavior change. Frontiers for ABM research in ecological economics involve advancing the empirical calibration and validation of models through mixed methods, including surveys, interviews, participatory modeling, and, notably, experimental economics to test specific decision-making hypotheses. Linking ABM with other modeling techniques at the level of emergent properties will further advance efforts to understand dynamics of social-environmental systems.

  8. Model based risk assessment - the CORAS framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.

    2004-04-15

    Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast to this, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with successful outcomes through a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general and the CORAS application of MBRA in particular have contributed positively to the visibility of model-based risk assessment and thus to the disclosure of several potentials for further exploitation of various aspects within this important research field. In that connection, the CORAS methodology's possibilities for further improvement towards utilization in more complex architectures and also in other application domains such as the nuclear field can be addressed. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial-driven approach within the nuclear field. The tool-supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)

  9. Notionally steady background noise acts primarily as a modulation masker of speech.

    Science.gov (United States)

    Stone, Michael A; Füllgrabe, Christian; Moore, Brian C J

    2012-07-01

    Stone et al. [J. Acoust. Soc. Am. 130, 2874-2881 (2011)], using vocoder processing, showed that the envelope modulations of a notionally steady noise were more effective than the envelope energy as a masker of speech. Here the same effect is demonstrated using non-vocoded signals. Speech was filtered into 28 channels. A masker centered on each channel was added to the channel signal at a target-to-background ratio of -5 or -10 dB. Maskers were sinusoids or noise bands with bandwidth 1/3 or 1 ERB(N) (ERB(N) being the bandwidth of "normal" auditory filters), synthesized with Gaussian (GN) or low-noise (LNN) statistics. To minimize peripheral interactions between maskers, odd-numbered channels were presented to one ear and even-numbered channels to the other. Speech intelligibility was assessed in the presence of each "steady" masker and of that masker 100% sinusoidally amplitude modulated (SAM) at 8 Hz. Intelligibility decreased with increasing envelope fluctuation of the maskers. Masking release, the difference in intelligibility between the SAM and its "steady" counterpart, increased with bandwidth from near-zero to around 50 percentage points for the 1-ERB(N) GN. It is concluded that the sinusoidal and GN maskers behaved primarily as energetic and modulation maskers, respectively.
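
    A minimal sketch of how one such masker pair might be generated is given below: a narrow band of noise and its 100% sinusoidally amplitude-modulated (SAM) counterpart at 8 Hz. The sample rate, band edges and duration are illustrative assumptions, not the study's stimulus parameters.

```python
import numpy as np

# Sketch of constructing one channel's masker: a narrow band of Gaussian-like
# noise, and its 8-Hz, 100% sinusoidally amplitude-modulated (SAM) counterpart.
# Sample rate, band edges and duration are illustrative, not the study's values.

fs = 16000
dur = 1.0
t = np.arange(int(fs * dur)) / fs

def narrowband_noise(f_lo, f_hi, n_components=200):
    """Sum of equal-amplitude, random-phase sinusoids within [f_lo, f_hi]."""
    rng = np.random.default_rng(0)
    freqs = rng.uniform(f_lo, f_hi, n_components)
    phases = rng.uniform(0, 2 * np.pi, n_components)
    x = np.sum(np.sin(2 * np.pi * freqs[:, None] * t + phases[:, None]), axis=0)
    return x / np.max(np.abs(x))

steady = narrowband_noise(950.0, 1050.0)             # "steady" narrowband masker
sam = steady * (1.0 + np.sin(2 * np.pi * 8.0 * t))   # 100% SAM at 8 Hz

# Equalise RMS so the two maskers are compared at the same level
sam *= np.sqrt(np.mean(steady**2) / np.mean(sam**2))
print(f"RMS steady={np.sqrt(np.mean(steady**2)):.3f}, SAM={np.sqrt(np.mean(sam**2)):.3f}")
```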

  10. Local field potentials primarily reflect inhibitory neuron activity in human and monkey cortex

    Science.gov (United States)

    Teleńczuk, Bartosz; Dehghani, Nima; Le Van Quyen, Michel; Cash, Sydney S.; Halgren, Eric; Hatsopoulos, Nicholas G.; Destexhe, Alain

    2017-01-01

    The local field potential (LFP) is generated by large populations of neurons, but the unitary contribution of spiking neurons to the LFP is not well characterised. We investigated this contribution in multi-electrode array recordings from human and monkey neocortex by examining the spike-triggered LFP average (st-LFP). The resulting st-LFPs were dominated by broad spatio-temporal components due to ongoing activity, synaptic inputs and recurrent connectivity. To reduce the spatial reach of the st-LFP and observe the local field related to a single spike, we applied a spatial filter whose weights were adapted to the covariance of the ongoing LFP. The filtered st-LFPs were confined to within 800 μm of the neuron and propagated at axonal speed, consistent with their unitary nature. In addition, we discriminated between putative inhibitory and excitatory neurons and found that the inhibitory st-LFP peaked at shorter latencies, consistent with previous findings in hippocampal slices. Thus, in human and monkey neocortex, the LFP reflects primarily inhibitory neuron activity. PMID:28074856
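
    The basic spike-triggered averaging step can be sketched as below: cut an LFP window around each spike time and average the windows. The data are synthetic and the covariance-adapted spatial filter used in the study is not reproduced; this only illustrates the st-LFP computation itself.

```python
import numpy as np

# Sketch of the basic spike-triggered LFP average (st-LFP): for each spike,
# cut an LFP window around the spike time and average the windows.
# Data below are synthetic; the covariance-adapted spatial filter used in the
# study to sharpen the st-LFP is not reproduced here.

fs = 1000                                                   # LFP sampling rate, Hz
lfp = np.random.default_rng(1).normal(size=60 * fs)         # 60 s of fake LFP
spike_times = np.sort(np.random.default_rng(2).uniform(1, 59, 500))  # seconds

def spike_triggered_average(lfp, spike_times, fs, window=(-0.05, 0.05)):
    pre, post = int(window[0] * fs), int(window[1] * fs)
    snippets = []
    for t_spk in spike_times:
        i = int(round(t_spk * fs))
        if 0 <= i + pre and i + post < len(lfp):
            snippets.append(lfp[i + pre:i + post])
        # windows running off the edges of the recording are skipped
    return np.mean(snippets, axis=0), np.arange(pre, post) / fs

st_lfp, lags = spike_triggered_average(lfp, spike_times, fs)
print(st_lfp.shape, lags[0], lags[-1])
```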

  11. Audiovisual contrast enhancement is articulated primarily via the M-pathway.

    Science.gov (United States)

    Jaekl, Philip M; Soto-Faraco, Salvador

    2010-12-17

    Although it has been previously reported that audiovisual integration can modulate performance on some visual tasks, multisensory interactions have not been explicitly assessed in the context of different visual processing pathways. In the present study, we test auditory influences on visual processing employing a psychophysical paradigm that reveals distinct spatial contrast signatures of magnocellular and parvocellular visual pathways. We found that contrast thresholds are reduced when noninformative sounds are presented with transient, low-frequency Gabor patch stimuli and thus favor the M-system. In contrast, visual thresholds are unaffected by concurrent sounds when detection is primarily attributed to P-pathway processing. These results demonstrate that the visual detection enhancement resulting from multisensory integration is mainly articulated by the magnocellular system, which is most sensitive at low spatial frequencies. Such enhancement may subserve stimulus-driven processes including the orientation of spatial attention and fast, automatic ocular and motor responses. This dissociation helps explain discrepancies between the results of previous studies investigating visual enhancement by sounds.

  12. Spontaneous reports of primarily suspected herbal hepatotoxicity by Pelargonium sidoides: was causality adequately ascertained?

    Science.gov (United States)

    Teschke, Rolf; Frenzel, Christian; Schulze, Johannes; Eickhoff, Axel

    2012-06-01

    Spontaneous reports of primarily assumed hepatotoxicity in connection with the use of Pelargonium sidoides (PS) have been interpreted by the Drug Commission of the German Medical Association (DCGMA) as showing some hepatotoxic potential of PS used to treat common cold and other respiratory tract infections. Causality for PS was assessed using the liver specific, structured, quantitative, and updated scale of the Council for International Organizations of Medical Sciences (CIOMS). In none of the 15 cases was there a highly probable or probable causality for PS. Analysis revealed confounding factors such as numerous final diagnoses unrelated to PS and poor data quality in virtually all cases. In only a minority of the cases were data provided to consider even common other diseases of the liver. For instance, biliary tract imaging data were available in only 3 patients; data to exclude virus infections by hepatitis A-C were provided in 4 cases and by CMV and EBV in 1 case, whereas HSV and VZV virus infections remained unconsidered. Thus, convincing evidence is lacking that PS was a potential hepatotoxin in the analyzed cases.

  13. Business Models for NFC based mobile payments

    Directory of Open Access Journals (Sweden)

    Johannes Sang Un Chae

    2015-01-01

    Full Text Available Purpose: The purpose of the paper is to develop a business model framework for NFC based mobile payment solutions consisting of four mutually interdependent components: the value service, value network, value architecture, and value finance. Design: Using a comparative case study method, the paper investigates Google Wallet and ISIS Mobile Wallet and their underlying business models. Findings: Google Wallet and ISIS Mobile Wallet focus on providing an enhanced customer experience with their mobile wallets through a multifaceted value proposition. The delivery of their offerings requires cooperation from multiple stakeholders and the creation of an ecosystem. Furthermore, they focus on the scalability of their value propositions. Originality / value: The paper offers an applicable business model framework that allows practitioners and academics to study current and future mobile payment approaches.

  14. Realistic face modeling based on multiple deformations

    Institute of Scientific and Technical Information of China (English)

    GONG Xun; WANG Guo-yin

    2007-01-01

    On the basis of the assumption that the human face belongs to a linear class, a multiple-deformation model is proposed to recover face shape from a few points on a single 2D image. Compared to the conventional methods, this study has the following advantages. First, the proposed modified 3D sparse deforming model is a noniterative approach that can compute global translation efficiently and accurately. Subsequently, the overfitting problem can be alleviated based on the proposed multiple deformation model. Finally, by keeping the main features, the texture generated is realistic. The comparison results show that this novel method outperforms the existing methods by using ground truth data and that realistic 3D faces can be recovered efficiently from a single photograph.

  15. Business Models for NFC Based Mobile Payments

    DEFF Research Database (Denmark)

    Chae, Johannes Sang-Un; Hedman, Jonas

    2015-01-01

    Purpose: The purpose of the paper is to develop a business model framework for NFC based mobile payment solutions consisting of four mutually interdependent components: the value service, value network, value architecture, and value finance. Design: Using a comparative case study method, the paper investigates Google Wallet and ISIS Mobile Wallet and their underlying business models. Findings: Google Wallet and ISIS Mobile Wallet focus on providing an enhanced customer experience with their mobile wallets through a multifaceted value proposition. The delivery of their offerings requires cooperation from multiple stakeholders and the creation of an ecosystem. Furthermore, they focus on the scalability of their value propositions. Originality / value: The paper offers an applicable business model framework that allows practitioners and academics to study current and future mobile payment approaches.

  16. On Reading-Based Writing Instruction Model

    Institute of Scientific and Technical Information of China (English)

    李大艳; 王建安

    2012-01-01

    English writing is a complex, integrative process drawing on comprehensive skills. Many students are still unable to write a coherent English paragraph after having learned English for many years at school. Helping college students improve their writing competence is thus a great challenge facing English teaching in China. Research on the teaching of writing abroad has flourished; in China, however, research in this field lags far behind. There is a great need for a more effective writing instruction model suited to the Chinese context. Drawing on Krashen's input hypothesis and Swain's output hypothesis, the author puts forward a Reading-Based Writing Instruction Model. This paper discusses the effectiveness of this model from different perspectives.

  17. Néron models and base change

    CERN Document Server

    Halle, Lars Halvard

    2016-01-01

    Presenting the first systematic treatment of the behavior of Néron models under ramified base change, this book can be read as an introduction to various subtle invariants and constructions related to Néron models of semi-abelian varieties, motivated by concrete research problems and complemented with explicit examples. Néron models of abelian and semi-abelian varieties have become an indispensable tool in algebraic and arithmetic geometry since Néron introduced them in his seminal 1964 paper. Applications range from the theory of heights in Diophantine geometry to Hodge theory. We focus specifically on Néron component groups, Edixhoven’s filtration and the base change conductor of Chai and Yu, and we study these invariants using various techniques such as models of curves, sheaves on Grothendieck sites and non-archimedean uniformization. We then apply our results to the study of motivic zeta functions of abelian varieties. The final chapter contains a list of challenging open questions. This book is a...

  18. Entropy-based consistent model driven architecture

    Science.gov (United States)

    Niepostyn, Stanisław Jerzy

    2016-09-01

    A description of software architecture is a plan of the IT system construction; therefore, any architecture gaps affect the overall success of the entire project. Most definitions describe software architecture as a set of views which are mutually unrelated, and hence potentially inconsistent. Software architecture completeness is also often described in an ambiguous way. As a result, most methods of building IT systems contain many gaps and ambiguities, presenting obstacles to automating software construction. In this article the consistency and completeness of a software architecture are defined mathematically, based on the calculated entropy of the architecture description. Following this approach, we also propose a method for automatic verification of the consistency and completeness of the software architecture development method presented in our previous article as Consistent Model Driven Architecture (CMDA). The proposed FBS (Functionality-Behaviour-Structure) entropy-based metric applied in our CMDA approach enables IT architects to decide whether the modelling process is complete and consistent. With this metric, software architects can assess the readiness of the ongoing modelling work for the start of IT system building. It even allows them to assess objectively whether the designed software architecture of the IT system could be implemented at all. The overall benefit of such an approach is that it facilitates the preparation of complete and consistent software architecture more effectively and enables assessment and monitoring of the ongoing modelling status. We demonstrate this with a few industry examples of IT system designs.
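
    The article defines its own FBS entropy-based metric; as a loose illustration of the underlying building block, the sketch below computes a plain Shannon entropy over a hypothetical tally of how completely architecture elements are covered by the Functionality, Behaviour and Structure views. The mapping from elements to a distribution is an assumption for illustration only.

```python
import math
from collections import Counter

# Generic Shannon entropy over a discrete distribution, the mathematical
# building block on which an entropy-based completeness/consistency metric can
# be defined. The actual FBS metric is specified in the article; the tally of
# architecture elements below is a hypothetical illustration only.

def shannon_entropy(labels):
    counts = Counter(labels)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical tally: each model element tagged by which FBS views describe it
element_coverage = ["F+B+S", "F+B+S", "F+B", "F", "F+B+S", "S", "F+B+S"]
print(f"H = {shannon_entropy(element_coverage):.3f} bits")
```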

  19. Concept Tree Based Information Retrieval Model

    Directory of Open Access Journals (Sweden)

    Chunyan Yuan

    2014-05-01

    Full Text Available This paper proposes a novel concept-based query expansion technique, the Markov concept tree model (MCTM), which discovers term relationships through a concept tree derived from a term Markov network. We address two important issues for query expansion: the selection and the weighting of expansion terms. In contrast to earlier methods, queries are expanded by adding those terms that are most similar to the concept of the query, rather than terms similar to a single query term. Using a Markov network constructed from term co-occurrence information in the collection, the model generates a concept tree for each original query term, removes redundant and irrelevant nodes in the concept tree, and then adjusts the weights of the original query terms and the expansion terms with a pruning algorithm. We use this model for query expansion and evaluate its effectiveness by examining the accuracy and robustness of the expansion methods. Experiments on a standard dataset reveal that, compared with the baseline model, this method achieves better query quality.
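
    A minimal sketch of the general idea, co-occurrence-driven query expansion, is given below: count term co-occurrences in a toy corpus, then add to a query the terms most strongly associated with it, at a reduced weight. The Markov concept tree and the pruning algorithm of the paper are not reproduced; the corpus, weights and top-k cut-off are illustrative assumptions.

```python
from collections import Counter, defaultdict
from itertools import combinations

# Minimal sketch of co-occurrence-driven query expansion: count how often terms
# co-occur in documents, then add to the query the terms most associated with
# it. The paper's Markov concept tree and pruning step are not reproduced here;
# the toy corpus and weights are illustrative only.

docs = [
    "solar panel energy storage battery",
    "battery storage grid energy",
    "solar energy grid inverter",
    "wind turbine energy grid",
]

cooc = defaultdict(Counter)
for doc in docs:
    terms = set(doc.split())
    for a, b in combinations(sorted(terms), 2):
        cooc[a][b] += 1
        cooc[b][a] += 1

def expand(query_terms, k=3, weight=0.4):
    """Return original terms (weight 1.0) plus top-k co-occurring terms (down-weighted)."""
    scores = Counter()
    for q in query_terms:
        scores.update(cooc[q])
    for q in query_terms:
        scores.pop(q, None)           # never re-add an original query term
    expanded = {q: 1.0 for q in query_terms}
    expanded.update({t: weight for t, _ in scores.most_common(k)})
    return expanded

print(expand(["solar", "energy"]))
```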

  20. Model-based target and background characterization

    Science.gov (United States)

    Mueller, Markus; Krueger, Wolfgang; Heinze, Norbert

    2000-07-01

    Up to now, most approaches to target and background characterization (and exploitation) concentrate solely on the information given by pixels. In many cases this is a complex and unprofitable task. During the development of automatic exploitation algorithms the main goal is the optimization of certain performance parameters. These parameters are measured during test runs while applying one algorithm with one parameter set to images that consist of image domains with very different characteristics (targets and various types of background clutter). Model-based geocoding and registration approaches provide means for utilizing the information stored in GIS (Geographical Information Systems). The geographical information stored in the various GIS layers can define ROEs (Regions of Expectation) and may allow for dedicated algorithm parametrization and development. ROI (Region of Interest) detection algorithms (in most cases MMO (Man-Made Object) detection) use implicit target and/or background models. The ROI detection algorithms utilize gradient direction models that have to be matched with transformed image domain data. In most cases simple threshold calculations on the match results discriminate target object signatures from the background. The geocoding approaches extract line-like structures (street signatures) from the image domain and match the graph constellation against a vector model extracted from a GIS (Geographical Information System) database. Apart from geocoding, the algorithms can also be used for image-to-image registration (multi-sensor and data fusion) and for the creation and validation of geographical maps.

  1. Inference-based procedural modeling of solids

    KAUST Repository

    Biggers, Keith

    2011-11-01

    As virtual environments become larger and more complex, there is an increasing need for more automated construction algorithms to support the development process. We present an approach for modeling solids by combining prior examples with a simple sketch. Our algorithm uses an inference-based approach to incrementally fit patches together in a consistent fashion to define the boundary of an object. This algorithm samples and extracts surface patches from input models, and develops a Petri net structure that describes the relationship between patches along an imposed parameterization. Then, given a new parameterized line or curve, we use the Petri net to logically fit patches together in a manner consistent with the input model. This allows us to easily construct objects of varying sizes and configurations using arbitrary articulation, repetition, and interchanging of parts. The result of our process is a solid model representation of the constructed object that can be integrated into a simulation-based environment. © 2011 Elsevier Ltd. All rights reserved.

  2. Model-based vision for car following

    Science.gov (United States)

    Schneiderman, Henry; Nashman, Marilyn; Lumia, Ronald

    1993-08-01

    This paper describes a vision processing algorithm that supports autonomous car following. The algorithm visually tracks the position of a `lead vehicle' from the vantage of a pursuing `chase vehicle.' The algorithm requires a 2-D model of the back of the lead vehicle. This model is composed of line segments corresponding to features that give rise to strong edges. There are seven sequential stages of computation: (1) Extracting edge points; (2) Associating extracted edge points with the model features; (3) Determining the position of each model feature; (4) Determining the model position; (5) Updating the motion model of the object; (6) Predicting the position of the object in the next image; (7) Predicting the location of all object features from the prediction of the object position. All processing is confined to the 2-D image plane. The 2-D model location computed in this processing is used to determine the position of the lead vehicle with respect to a 3-D coordinate frame affixed to the chase vehicle. This algorithm has been used as part of a complete system to drive an autonomous vehicle, a High Mobility Multipurpose Wheeled Vehicle (HMMWV), such that it follows a lead vehicle at speeds up to 35 km/hr. The algorithm runs at an update rate of 15 Hertz and has a worst case computational delay of 128 ms. The algorithm is implemented under the NASA/NBS Standard Reference Model for Telerobotic Control System Architecture (NASREM) and runs on a dedicated vision processing engine and a VME-based multiprocessor system.
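
    Stages (4) through (7) can be sketched as a simple constant-velocity tracker in the image plane, as below. Edge extraction and feature matching (stages 1 to 3) are stubbed out with fake measurements, and the filter gains are assumed values; this is an illustration of the tracking loop, not the NASREM implementation.

```python
import numpy as np

# Simplified skeleton of stages (4)-(7): estimate the 2-D model position from
# matched features, update a constant-velocity motion model, and predict where
# the lead vehicle will appear in the next frame. Edge extraction and feature
# matching (stages 1-3) are stubbed out; this is not the NASREM implementation.

class LeadVehicleTracker:
    def __init__(self, initial_xy, dt=1.0 / 15.0, alpha=0.85, beta=0.005):
        self.pos = np.asarray(initial_xy, dtype=float)   # image-plane position
        self.vel = np.zeros(2)
        self.dt, self.alpha, self.beta = dt, alpha, beta

    def predict(self):
        """Stage (6): predicted object position in the next image."""
        return self.pos + self.vel * self.dt

    def update(self, measured_xy):
        """Stages (4)-(5): alpha-beta update of position and velocity."""
        predicted = self.predict()
        residual = np.asarray(measured_xy, dtype=float) - predicted
        self.pos = predicted + self.alpha * residual
        self.vel = self.vel + (self.beta / self.dt) * residual
        return self.pos

tracker = LeadVehicleTracker(initial_xy=(320, 240))
for measurement in [(322, 240), (325, 241), (329, 241)]:  # fake feature-match results
    tracker.update(measurement)
print("next-frame prediction:", tracker.predict())
```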

  3. Model-based explanation of plant knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Huuskonen, P.J. [VTT Electronics, Oulu (Finland). Embedded Software

    1997-12-31

    This thesis deals with computer explanation of knowledge related to design and operation of industrial plants. The needs for explanation are motivated through case studies and literature reviews. A general framework for analysing plant explanations is presented. Prototypes demonstrate key mechanisms for implementing parts of the framework. Power plants, steel mills, paper factories, and high energy physics control systems are studied to set requirements for explanation. The main problems are seen to be either lack or abundance of information. Design knowledge in particular is found missing at plants. Support systems and automation should be enhanced with ways to explain plant knowledge to the plant staff. A framework is formulated for analysing explanations of plant knowledge. It consists of three parts: 1. a typology of explanation, organised by the class of knowledge (factual, functional, or strategic) and by the target of explanation (processes, automation, or support systems), 2. an identification of explanation tasks generic for the plant domain, and 3. an identification of essential model types for explanation (structural, behavioural, functional, and teleological). The tasks use the models to create the explanations of the given classes. Key mechanisms are discussed to implement the generic explanation tasks. Knowledge representations based on objects and their relations form a vocabulary to model and present plant knowledge. A particular class of models, means-end models, are used to explain plant knowledge. Explanations are generated through searches in the models. Hypertext is adopted to communicate explanations over dialogue based on context. The results are demonstrated in prototypes. The VICE prototype explains the reasoning of an expert system for diagnosis of rotating machines at power plants. The Justifier prototype explains design knowledge obtained from an object-oriented plant design tool. Enhanced access mechanisms into on-line documentation are

  4. CONFIRMING THE PRIMARILY SMOOTH STRUCTURE OF THE VEGA DEBRIS DISK AT MILLIMETER WAVELENGTHS

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, A. Meredith; Plambeck, Richard; Chiang, Eugene [Department of Astronomy, University of California, Berkeley, CA 94720 (United States); Wilner, David J.; Andrews, Sean M. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Mason, Brian [National Radio Astronomy Observatory, 520 Edgemont Road, Charlottesville, VA 22903-2475 (United States); Carpenter, John M. [California Institute of Technology, Department of Astronomy, MC 105-24, Pasadena, CA 91125 (United States); Chiang, Hsin-Fang [Institute for Astronomy, University of Hawaii, 640 North Aohoku Place, Hilo, HI 96720 (United States); Williams, Jonathan P. [Institute for Astronomy, University of Hawaii, 2680 Woodlawn Drive, Honolulu, HI 96822 (United States); Hales, Antonio [Joint ALMA Observatory, Av. El Golf 40, Piso 18, Santiago (Chile); Su, Kate [Steward Observatory, University of Arizona, 933 North Cherry Avenue, Tucson, AZ 85721 (United States); Dicker, Simon; Korngut, Phil; Devlin, Mark, E-mail: mhughes@astro.berkeley.edu [Department of Physics and Astronomy, University of Pennsylvania, 209 South 33rd Street, Philadelphia, PA 19104 (United States)

    2012-05-01

    Clumpy structure in the debris disk around Vega has been previously reported at millimeter wavelengths and attributed to concentrations of dust grains trapped in resonances with an unseen planet. However, recent imaging at similar wavelengths with higher sensitivity has disputed the observed structure. We present three new millimeter-wavelength observations that help to resolve the puzzling and contradictory observations. We have observed the Vega system with the Submillimeter Array (SMA) at a wavelength of 880 μm and an angular resolution of 5''; with the Combined Array for Research in Millimeter-wave Astronomy (CARMA) at a wavelength of 1.3 mm and an angular resolution of 5''; and with the Green Bank Telescope (GBT) at a wavelength of 3.3 mm and angular resolution of 10''. Despite high sensitivity and short baselines, we do not detect the Vega debris disk in either of the interferometric data sets (SMA and CARMA), which should be sensitive at high significance to clumpy structure based on previously reported observations. We obtain a marginal (3σ) detection of disk emission in the GBT data; the spatial distribution of the emission is not well constrained. We analyze the observations in the context of several different models, demonstrating that the observations are consistent with a smooth, broad, axisymmetric disk with inner radius 20-100 AU and width ≳50 AU. The interferometric data require that at least half of the 860 μm emission detected by previous single-dish observations with the James Clerk Maxwell Telescope be distributed axisymmetrically, ruling out strong contributions from flux concentrations on spatial scales of ≲100 AU. These observations support recent results from the Plateau de Bure Interferometer indicating that previous detections of clumpy structure in the Vega debris disk were spurious.

  5. Family-Based Model Checking Without a Family-Based Model Checker

    DEFF Research Database (Denmark)

    Dimovski, Aleksandar; Al-Sibahi, Ahmad Salim; Brabrand, Claus;

    2015-01-01

    …be used to model-check variational models using the standard version of (single system) SPIN. The abstractions are first defined as Galois connections on semantic domains. We then show how to translate them into syntactic source-to-source transformations on variational models. This allows the use of SPIN with all its accumulated optimizations for efficient verification of variational models without any knowledge about variability. We demonstrate the practicality of this method on several examples using both the SNIP (family-based) and SPIN (single-system) model checkers.

  6. Grid based calibration of SWAT hydrological models

    Directory of Open Access Journals (Sweden)

    D. Gorgan

    2012-07-01

    Full Text Available The calibration and execution of large hydrological models, such as SWAT (Soil and Water Assessment Tool), developed for large areas, high resolution and huge input data, require not only long execution times but also substantial computational resources. The SWAT hydrological model supports studies and predictions of the impact of land management practices on water, sediment, and agricultural chemical yields in complex watersheds. The paper presents the gSWAT application as a practical web-based solution for environmental specialists to calibrate extensive hydrological models and to run scenarios, hiding the complex control of processes and heterogeneous resources across the grid-based high-performance computation infrastructure. The paper highlights the basic functionality of the gSWAT platform and the features of its graphical user interface. The presentation covers the development of working sessions, interactive control of calibration, direct and basic editing of parameters, process monitoring, and graphical and interactive visualization of the results. The experiments performed on different SWAT models and the results obtained demonstrate the benefits brought by the grid parallel and distributed environment as a solution for the processing platform. All the instances of SWAT models used in the reported experiments have been developed through the enviroGRIDS project, targeting the Black Sea catchment area.

  7. Physics-based models of the plasmasphere

    Energy Technology Data Exchange (ETDEWEB)

    Jordanova, Vania K [Los Alamos National Laboratory; Pierrard, Vivane [BELGIUM; Goldstein, Jerry [SWRI; Andr' e, Nicolas [ESTEC/ESA; Kotova, Galina A [SRI, RUSSIA; Lemaire, Joseph F [BELGIUM; Liemohn, Mike W [U OF MICHIGAN; Matsui, H [UNIV OF NEW HAMPSHIRE

    2008-01-01

    We describe recent progress in physics-based models of the plasmasphere using the fluid and the kinetic approaches. Global modeling of the dynamics and influence of the plasmasphere is presented. Results from global plasmasphere simulations are used to understand and quantify (i) the electric potential pattern and its evolution during geomagnetic storms, and (ii) the influence of the plasmasphere on the excitation of electromagnetic ion cyclotron (EMIC) waves and the precipitation of energetic ions in the inner magnetosphere. The interactions of the plasmasphere with the ionosphere and the other regions of the magnetosphere are pointed out. We show the results of simulations for the formation of the plasmapause and discuss the influence of plasmaspheric wind and of ultra low frequency (ULF) waves on the transport of plasmaspheric material. Theoretical formulations used to model the electric field and plasma distribution in the plasmasphere are given. Model predictions are compared to recent CLUSTER and IMAGE observations, but also to results of earlier models and satellite observations.

  8. Microtechnology-Based Multi-Organ Models

    Directory of Open Access Journals (Sweden)

    Seung Hwan Lee

    2017-05-01

    Full Text Available Drugs affect the human body through absorption, distribution, metabolism, and elimination (ADME processes. Due to their importance, the ADME processes need to be studied to determine the efficacy and side effects of drugs. Various in vitro model systems have been developed and used to realize the ADME processes. However, conventional model systems have failed to simulate the ADME processes because they are different from in vivo, which has resulted in a high attrition rate of drugs and a decrease in the productivity of new drug development. Recently, a microtechnology-based in vitro system called “organ-on-a-chip” has been gaining attention, with more realistic cell behavior and physiological reactions, capable of better simulating the in vivo environment. Furthermore, multi-organ-on-a-chip models that can provide information on the interaction between the organs have been developed. The ultimate goal is the development of a “body-on-a-chip”, which can act as a whole body model. In this review, we introduce and summarize the current progress in the development of multi-organ models as a foundation for the development of body-on-a-chip.

  9. Microtechnology-Based Multi-Organ Models.

    Science.gov (United States)

    Lee, Seung Hwan; Sung, Jong Hwan

    2017-05-21

    Drugs affect the human body through absorption, distribution, metabolism, and elimination (ADME) processes. Due to their importance, the ADME processes need to be studied to determine the efficacy and side effects of drugs. Various in vitro model systems have been developed and used to realize the ADME processes. However, conventional model systems have failed to simulate the ADME processes because they are different from in vivo, which has resulted in a high attrition rate of drugs and a decrease in the productivity of new drug development. Recently, a microtechnology-based in vitro system called "organ-on-a-chip" has been gaining attention, with more realistic cell behavior and physiological reactions, capable of better simulating the in vivo environment. Furthermore, multi-organ-on-a-chip models that can provide information on the interaction between the organs have been developed. The ultimate goal is the development of a "body-on-a-chip", which can act as a whole body model. In this review, we introduce and summarize the current progress in the development of multi-organ models as a foundation for the development of body-on-a-chip.

  10. Fault diagnosis based on continuous simulation models

    Science.gov (United States)

    Feyock, Stefan

    1987-01-01

    This report describes the results of an investigation of techniques for using continuous simulation models as a basis for reasoning about physical systems, with emphasis on the diagnosis of system faults. It is assumed that a continuous simulation model of the properly operating system is available. Malfunctions are diagnosed by posing the question: how can the model be made to behave like the observed, faulty system? The adjustments that must be made to the model to produce the observed behavior usually provide definitive clues to the nature of the malfunction. A novel application of Dijkstra's weakest-precondition predicate transformer is used to derive the preconditions for producing the required model behavior. To minimize the size of the search space, an envisionment generator based on interval mathematics was developed. In addition to its intended application, the ability to generate qualitative state spaces automatically from quantitative simulations proved to be a fruitful avenue of investigation in its own right. Implementations of the Dijkstra transform and the envisionment generator are reproduced in the Appendix.
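
    In the spirit of the interval-mathematics envisionment generator, the sketch below propagates value bounds through simple arithmetic and maps the result to a qualitative sign. It is an illustration of the idea under assumed bounds, not the author's implementation.

```python
from dataclasses import dataclass

# Minimal interval-arithmetic sketch: propagate value bounds through arithmetic
# and map the result to a qualitative sign. This illustrates the idea behind an
# interval-based envisionment; it is not the author's implementation.

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

    def sign(self):
        """Qualitative value: '-', '0', '+', or '?' if the interval straddles zero."""
        if self.hi < 0: return "-"
        if self.lo > 0: return "+"
        if self.lo == self.hi == 0: return "0"
        return "?"

flow_in = Interval(0.8, 1.2)      # hypothetical sensor bounds
flow_out = Interval(-1.5, -1.1)   # hypothetical model prediction (outflow)
net = flow_in + flow_out
print(net, "qualitative:", net.sign())
```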

  11. Mars 2020 Model Based Systems Engineering Pilot

    Science.gov (United States)

    Dukes, Alexandra Marie

    2017-01-01

    The pilot study is led by the Integration Engineering group in NASA's Launch Services Program (LSP). The Integration Engineering (IE) group is responsible for managing the interfaces between the spacecraft and launch vehicle. This pilot investigates the utility of Model-Based Systems Engineering (MBSE) with respect to managing and verifying interface requirements. The main objectives of the pilot are to model several key aspects of the Mars 2020 integrated operations and interface requirements based on the design and verification artifacts from Mars Science Laboratory (MSL) and to demonstrate how MBSE could be used by LSP to gain further insight into the interface between the spacecraft and launch vehicle as well as to enhance how LSP manages the launch service. The method used to accomplish this pilot began with familiarization with SysML, MagicDraw, and the Mars 2020 and MSL systems through books, tutorials, and NASA documentation. MSL was chosen as the focus of the model since its processes and verifications translate easily to the Mars 2020 mission. The study was further focused by modeling specialized systems and processes within MSL in order to demonstrate the utility of MBSE for the rest of the mission. The systems chosen were the In-Flight Disconnect (IFD) system and the Mass Properties process. The IFD was chosen as a system of focus since it is an interface between the spacecraft and launch vehicle which can demonstrate the usefulness of MBSE from a system perspective. The Mass Properties process was chosen as a process of focus since the verifications for mass properties occur throughout the lifecycle and can demonstrate the usefulness of MBSE from a multi-discipline perspective. Several iterations of both perspectives have been modeled and evaluated. While the pilot study will continue for another 2 weeks, advantages and disadvantages of using MBSE for LSP IE have been identified. Advantages of using MBSE include an integrated view of the disciplines, requirements, and

  12. Flow based vs. demand based energy-water modelling

    Science.gov (United States)

    Rozos, Evangelos; Nikolopoulos, Dionysis; Efstratiadis, Andreas; Koukouvinos, Antonios; Makropoulos, Christos

    2015-04-01

    The water flow in hydro-power generation systems is often used downstream to cover other types of demand, such as irrigation and water supply. However, the typical case is that the energy demand (operation of the hydro-power plant) and the water demand do not coincide. Furthermore, the water inflow into a reservoir is a stochastic process. Things become more complicated if renewable resources (wind turbines or photovoltaic panels) are included in the system. For this reason, the assessment and optimization of the operation of hydro-power systems are challenging tasks that require computer modelling. This modelling should not only simulate the water budget of the reservoirs and the energy production/consumption (pumped storage), but should also take into account the constraints imposed by the natural or artificial water network using a flow routing algorithm. HYDRONOMEAS, for example, uses an elegant mathematical approach (a digraph) to calculate the flow in a water network based on the demands (input time series), the water availability (simulated) and the capacity of the transmission components (properties of channels, rivers, pipes, etc.). The input time series of demand should be estimated by another model and linked to the corresponding network nodes. A model that could be used to estimate these time series is UWOT. UWOT is a bottom-up urban water cycle model that simulates the generation, aggregation and routing of water demand signals. In this study, we explore the potential of UWOT in simulating the operation of complex hydrosystems that include energy generation. The evident advantage of this approach is the use of a single model instead of one for the estimation of demands and another for the system simulation. An application of UWOT to a large-scale system is attempted in mainland Greece, in an area extending over 130×170 km². The challenges, the peculiarities and the advantages of this approach are examined and critically discussed.

  13. Model-based Prognostics with Concurrent Damage Progression Processes

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based prognostics approaches rely on physics-based models that describe the behavior of systems and their components. These models must account for the several...

  14. Predictors of nephrectomy in high grade blunt renal trauma patients treated primarily with conservative intent.

    Science.gov (United States)

    Prasad, Narla Hari; Devraj, Rahul; Chandriah, G Ram; Sagar, S Vidya; Reddy, Ch Ram; Murthy, Pisapati Venkata Lakshmi Narsimha

    2014-04-01

    There is no consensus on the optimal management of high grade renal trauma. Delayed surgery increases the likelihood of secondary hemorrhage and persistent urinary extravasation, whereas immediate surgery results in high renal loss. Hence, the present study was undertaken to evaluate the predictors of nephrectomy and the outcome of high grade (III-V) renal injury treated primarily with conservative intent. The records of 55 patients who were admitted to our institute with varying degrees of blunt renal trauma from January 2005 to December 2012 were retrospectively reviewed. Grade III-V renal injury was defined as high grade blunt renal trauma and was present in 44 patients. The factors analyzed to predict emergency intervention were demographic profile, grade of injury, degree of hemodynamic instability, requirement of blood transfusion, need for intervention, mode of intervention, and duration of intensive care unit stay. The remaining 40 patients with high grade injury (grades III and IV) did not require emergency intervention and underwent a trial of conservative management. Seven of these 40 conservatively managed patients experienced complications requiring procedural intervention, and three required a delayed nephrectomy. Presence of grade V injuries with hemodynamic instability and requirement of more than 10 packed cell units for resuscitation were predictors of nephrectomy. Predictors of complications were urinary extravasation and hemodynamic instability at presentation. The majority of high grade renal injuries can be successfully managed conservatively. Grade V injuries and the need for more packed cell transfusions during resuscitation predict the need for emergency intervention.

  15. Predictors of nephrectomy in high grade blunt renal trauma patients treated primarily with conservative intent

    Directory of Open Access Journals (Sweden)

    Narla Hari Prasad

    2014-01-01

    Full Text Available Introduction: There is no consensus on the optimal management of high grade renal trauma. Delayed surgery increases the likelihood of secondary hemorrhage and persistent urinary extravasation, whereas immediate surgery results in high renal loss. Hence, the present study was undertaken to evaluate the predictors of nephrectomy and the outcome of high grade (III-V) renal injury treated primarily with conservative intent. Materials and Methods: The records of 55 patients who were admitted to our institute with varying degrees of blunt renal trauma from January 2005 to December 2012 were retrospectively reviewed. Grade III-V renal injury was defined as high grade blunt renal trauma and was present in 44 patients. The factors analyzed to predict emergency intervention were demographic profile, grade of injury, degree of hemodynamic instability, requirement of blood transfusion, need for intervention, mode of intervention, and duration of intensive care unit stay. Results: The remaining 40 patients with high grade injury (grades III and IV) did not require emergency intervention and underwent a trial of conservative management. Seven of these 40 conservatively managed patients experienced complications requiring procedural intervention, and three required a delayed nephrectomy. Presence of grade V injuries with hemodynamic instability and requirement of more than 10 packed cell units for resuscitation were predictors of nephrectomy. Predictors of complications were urinary extravasation and hemodynamic instability at presentation. Conclusion: The majority of high grade renal injuries can be successfully managed conservatively. Grade V injuries and the need for more packed cell transfusions during resuscitation predict the need for emergency intervention.

  16. A Comparison of Filter-based Approaches for Model-based Prognostics

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based prognostics approaches use domain knowledge about a system and its failure modes through the use of physics-based models. Model-based prognosis is...

  17. Building v/s Exploring Models: Comparing Learning of Evolutionary Processes through Agent-based Modeling

    Science.gov (United States)

    Wagh, Aditi

    Two strands of work motivate the three studies in this dissertation. Evolutionary change can be viewed as a computational complex system in which a small set of rules operating at the individual level result in different population level outcomes under different conditions. Extensive research has documented students' difficulties with learning about evolutionary change (Rosengren et al., 2012), particularly in terms of levels slippage (Wilensky & Resnick, 1999). Second, though building and using computational models is becoming increasingly common in K-12 science education, we know little about how these two modalities compare. This dissertation adopts agent-based modeling as a representational system to compare these modalities in the conceptual context of micro-evolutionary processes. Drawing on interviews, Study 1 examines middle-school students' productive ways of reasoning about micro-evolutionary processes to find that the specific framing of traits plays a key role in whether slippage explanations are cued. Study 2, which was conducted in 2 schools with about 150 students, forms the crux of the dissertation. It compares learning processes and outcomes when students build their own models or explore a pre-built model. Analysis of Camtasia videos of student pairs reveals that builders' and explorers' ways of accessing rules, and sense-making of observed trends are of a different character. Builders notice rules through available blocks-based primitives, often bypassing their enactment while explorers attend to rules primarily through the enactment. Moreover, builders' sense-making of observed trends is more rule-driven while explorers' is more enactment-driven. Pre and posttests reveal that builders manifest a greater facility with accessing rules, providing explanations manifesting targeted assembly. Explorers use rules to construct explanations manifesting non-targeted assembly. Interviews reveal varying degrees of shifts away from slippage in both

  18. Cosmic emergy based ecological systems modelling

    Science.gov (United States)

    Chen, H.; Chen, G. Q.; Ji, X.

    2010-09-01

    Ecological systems modelling based on the unified biophysical measure of cosmic emergy, in terms of embodied cosmic exergy, is illustrated in this paper with ecological accounting, simulation and scenario analysis, through a case study of the regional socio-economic ecosystem associated with the municipality of Beijing. An urbanized regional ecosystem model with eight subsystems (natural support, agriculture, urban production, population, finance, land area, potential environmental impact, and culture) is presented in exergy circuit language, with 12 state variables governed by corresponding ecodynamic equations and 60 flows and auxiliary variables. To characterize the regional socio-economy as an ecosystem, a series of ecological indicators based on cosmic emergy are devised. For systematic ecological accounting, cosmic exergy transformities are provided for various categories, including climate flows, natural resources, industrial products, cultural products, population by educational level, and environmental emissions. For the urban ecosystem of Beijing in the period from 1990 to 2005, ecological accounting is carried out and characterized in full detail. Taking 2000 as the starting point, systems modelling is used to predict the urban evolution over a one-hundred-year time horizon. For systems regulation, scenario analyses with essential policy-making implications are made to illustrate the long-term systemic effects of the expected water diversion and rise in energy prices.

  19. Physiologically Based Pharmacokinetic (PBPK) Modeling of ...

    Science.gov (United States)

    Background: Quantitative estimation of toxicokinetic variability in the human population is a persistent challenge in risk assessment of environmental chemicals. Traditionally, inter-individual differences in the population are accounted for by default assumptions or, in rare cases, are based on human toxicokinetic data.Objectives: To evaluate the utility of genetically diverse mouse strains for estimating toxicokinetic population variability for risk assessment, using trichloroethylene (TCE) metabolism as a case study. Methods: We used data on oxidative and glutathione conjugation metabolism of TCE in 16 inbred and one hybrid mouse strains to calibrate and extend existing physiologically-based pharmacokinetic (PBPK) models. We added one-compartment models for glutathione metabolites and a two-compartment model for dichloroacetic acid (DCA). A Bayesian population analysis of inter-strain variability was used to quantify variability in TCE metabolism. Results: Concentration-time profiles for TCE metabolism to oxidative and glutathione conjugation metabolites varied across strains. Median predictions for the metabolic flux through oxidation was less variable (5-fold range) than that through glutathione conjugation (10-fold range). For oxidative metabolites, median predictions of trichloroacetic acid production was less variable (2-fold range) than DCA production (5-fold range), although uncertainty bounds for DCA exceeded the predicted variability. Conclusions:
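
    The one-compartment building block mentioned for the glutathione metabolites can be sketched as a single linear ODE, dC/dt = k_prod(t)/V - k_elim * C, integrated numerically as below. All parameters are hypothetical placeholders, not the calibrated TCE/PBPK values.

```python
import numpy as np

# Sketch of a one-compartment building block: a metabolite is produced at some
# rate and eliminated by first-order kinetics,
#   dC/dt = k_prod(t) / V - k_elim * C.
# Parameters are hypothetical placeholders, not the calibrated TCE/PBPK values.

def simulate_one_compartment(k_prod, k_elim, volume, t_end=24.0, dt=0.01):
    times = np.arange(0.0, t_end, dt)
    conc = np.zeros_like(times)
    for i in range(1, len(times)):
        dCdt = k_prod(times[i - 1]) / volume - k_elim * conc[i - 1]
        conc[i] = conc[i - 1] + dCdt * dt      # simple forward-Euler step
    return times, conc

# Hypothetical: production only during the first 2 h (e.g., during exposure)
production = lambda t: 5.0 if t < 2.0 else 0.0   # mg/h
t, c = simulate_one_compartment(production, k_elim=0.3, volume=1.4)
print(f"peak concentration ~ {c.max():.2f} mg/L at t = {t[c.argmax()]:.2f} h")
```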

  20. Model-based phase-shifting interferometer

    Science.gov (United States)

    Liu, Dong; Zhang, Lei; Shi, Tu; Yang, Yongying; Chong, Shiyao; Miao, Liang; Huang, Wei; Shen, Yibing; Bai, Jian

    2015-10-01

    A model-based phase-shifting interferometer (MPI) is developed in which a novel calculation technique replaces the traditionally complicated system structure, achieving versatile, high-precision, quantitative surface tests. In the MPI, a partial null lens (PNL) is employed to implement the non-null test. With a set of alternative PNLs, similar to the transmission spheres in Zygo interferometers, the MPI provides a flexible test for general spherical and aspherical surfaces. Based on modern computer modeling techniques, a reverse iterative optimizing construction (ROR) method is employed for the retrace error correction of the non-null test, as well as for figure error reconstruction. A self-compiled ray-tracing program is set up for accurate system modeling and reverse ray tracing. The surface figure error can then be easily extracted from the wavefront data in the form of Zernike polynomials by the ROR method. Experiments on spherical and aspherical surface tests are presented to validate the flexibility and accuracy. The test results are compared with those of a Zygo interferometer (null tests), which demonstrates the high accuracy of the MPI. With such accuracy and flexibility, the MPI has large potential in modern optical shop testing.

  1. Rainwater harvesting: model-based design evaluation.

    Science.gov (United States)

    Ward, S; Memon, F A; Butler, D

    2010-01-01

    The rate of uptake of rainwater harvesting (RWH) in the UK has been slow to date, but is expected to gain momentum in the near future. The designs of two different new-build rainwater harvesting systems, based on simple methods, are evaluated using three different design methods, including a continuous simulation modelling approach. The RWH systems are shown to fulfill 36% and 46% of WC demand. Financial analyses reveal that RWH systems within large commercial buildings may be more financially viable than smaller domestic systems. It is identified that design methods based on simple approaches generate tank sizes substantially larger than the continuous simulation. Comparison of the actual tank sizes and those calculated using continuous simulation established that the installed tanks are oversized for their associated demand level and catchment size. Oversizing tanks can lead to excessive system capital costs, which currently hinders the uptake of systems. Furthermore, it is demonstrated that the catchment area size is often overlooked when designing UK-based RWH systems. With respect to these findings, a recommendation is made for a transition from simple tools to continuous simulation models.
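
    A minimal daily continuous-simulation (behavioural) sketch of a rainwater tank is shown below, using a yield-after-spillage convention: roof runoff fills the store, spill is removed, then WC demand is drawn. The roof area, tank size, demand and rainfall series are illustrative assumptions, not the designs evaluated in the paper.

```python
# Minimal daily continuous-simulation (behavioural) sketch of a rainwater tank:
# runoff from the roof fills the store, spillage is removed, then WC demand is
# drawn (yield-after-spillage convention). All parameters are illustrative,
# not the designs evaluated in the paper.

def simulate_tank(rain_mm, roof_area_m2, tank_m3, daily_demand_m3,
                  runoff_coeff=0.85):
    store, supplied, demanded = 0.0, 0.0, 0.0
    for r in rain_mm:                           # daily rainfall series, mm
        inflow = r / 1000.0 * roof_area_m2 * runoff_coeff
        store = min(store + inflow, tank_m3)    # spill anything above capacity
        yield_today = min(store, daily_demand_m3)
        store -= yield_today
        supplied += yield_today
        demanded += daily_demand_m3
    return supplied / demanded                  # fraction of WC demand met

rainfall = [0, 4, 0, 0, 12, 2, 0, 0, 7, 0] * 36   # fake ~1-year daily series
print(f"WC demand met: {simulate_tank(rainfall, 100, 2.0, 0.12):.0%}")
```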

  2. Patient-Provider Engagement and Chronic Pain in Drug-Using, Primarily African American Persons Living with HIV/AIDS.

    Science.gov (United States)

    Mitchell, Mary M; Nguyen, Trang Q; Maragh-Bass, Allysha C; Isenberg, Sarina R; Beach, Mary Catherine; Knowlton, Amy R

    2016-10-27

    Among disadvantaged persons living with HIV/AIDS (PLHIV), patient-provider engagement, which has been defined as patient-provider relationships that promote the use of health care services and are characterized by active listening and supportive decision making, has been associated with antiretroviral therapy (ART) maintenance and viral suppression. However, chronic pain, depression, and substance use, all of which are prevalent in this population, can reduce the quality of patient-provider engagement. We hypothesized a model in which chronic pain, depression, and substance use would be associated with poorer patient-provider engagement, which would be positively associated with adherence, with the latter associated positively with viral suppression. We analyzed data from the BEACON study, which included surveys from 383 PLHIV who were primarily African American, on ART, and had histories of drug use. Due to six missing cases on the chronic pain variable, we used data from 377 respondents in a structural equation model. Chronic pain and depressive symptoms were significantly associated with poorer patient-provider engagement, while substance use was associated with better engagement. Patient-provider engagement in turn was associated with better ART adherence, which was associated with higher viral suppression. Results suggest the role of chronic pain in poor patient-physician engagement in this population, which has potential implications for quality of HIV patient care and health outcomes. Findings suggest the need for attention to patient-provider engagement in PLHIV.

  3. Human physiologically based pharmacokinetic model for propofol

    Directory of Open Access Journals (Sweden)

    Schnider Thomas W

    2005-04-01

    Full Text Available Abstract Background Propofol is widely used for both short-term anesthesia and long-term sedation. It has unusual pharmacokinetics because of its high lipid solubility. The standard approach to describing the pharmacokinetics is a multi-compartmental model. This paper presents the first detailed human physiologically based pharmacokinetic (PBPK) model for propofol. Methods PKQuest, a freely distributed software routine (http://www.pkquest.com), was used for all the calculations. The "standard human" PBPK parameters developed in previous applications are used. It is assumed that the blood and tissue binding is determined by simple partition into the tissue lipid, which is characterized by two previously determined sets of parameters: (1) the value of the propofol oil/water partition coefficient; (2) the lipid fraction in the blood and tissues. The model was fit to the individual experimental data of Schnider et al. (Anesthesiology, 1998; 88:1170), in which an initial bolus dose was followed 60 minutes later by a one-hour constant infusion. Results The PBPK model provides a good description of the experimental data over a large range of input dosage, subject age and fat fraction. Only one adjustable parameter (the liver clearance) is required to describe the constant infusion phase for each individual subject. In order to fit the bolus injection phase, for 10 of the 24 subjects it was necessary to assume that a fraction of the bolus dose was sequestered and then slowly released from the lungs (characterized by two additional parameters). The average weighted residual error (WRE) of the PBPK model fit to both the bolus and infusion phases was 15%, similar to the WRE for just the constant infusion phase obtained by Schnider et al. using a 6-parameter NONMEM compartmental model. Conclusion A PBPK model using standard human parameters and a simple description of tissue binding provides a good description of human propofol kinetics. The major advantage of a
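
    The tissue-binding assumption stated above (simple partition into tissue lipid) can be sketched as a tissue-to-blood partition coefficient computed from the oil/water partition coefficient and the lipid fractions, approximating the non-lipid fraction as water. The values below are illustrative placeholders, not the PKQuest standard-human parameters.

```python
# Sketch of the tissue-binding assumption stated above: the tissue/blood
# partition of a highly lipophilic drug is approximated by simple partition
# into lipid, using the oil/water partition coefficient P and the lipid
# fractions of blood and tissue (non-lipid fraction treated as water).
# Values below are illustrative placeholders, not the PKQuest parameters.

def tissue_blood_partition(p_oil_water, f_lipid_tissue, f_lipid_blood):
    """Kp ~ (f_lip_t * P + (1 - f_lip_t)) / (f_lip_b * P + (1 - f_lip_b))."""
    tissue = f_lipid_tissue * p_oil_water + (1.0 - f_lipid_tissue)
    blood = f_lipid_blood * p_oil_water + (1.0 - f_lipid_blood)
    return tissue / blood

P = 4300.0                      # illustrative oil/water partition coefficient
for name, f_lip in {"fat": 0.80, "muscle": 0.02, "brain": 0.11}.items():
    print(name, round(tissue_blood_partition(P, f_lip, f_lipid_blood=0.004), 1))
```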

  4. CNEM: Cluster Based Network Evolution Model

    Directory of Open Access Journals (Sweden)

    Sarwat Nizamani

    2015-01-01

    Full Text Available This paper presents a network evolution model based on a clustering approach. The proposed approach depicts network evolution, demonstrating network formation from individual nodes to a fully evolved network. An agglomerative hierarchical clustering method is applied for the evolution of the network. In the paper, we present three case studies which show the evolution of networks from scratch. These case studies include: the terrorist network of the 9/11 incidents, the terrorist network of the WMD (Weapons of Mass Destruction) plot against France, and a network of tweets discussing a topic. The 9/11 network is also used for evaluation, using other social network analysis methods, which shows that the clusters created using the proposed model of network evolution are of good quality; thus the proposed method can be used by law enforcement agencies to further investigate criminal networks.
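
    The agglomerative step can be sketched as below: convert pairwise tie strengths between nodes into distances and merge nodes bottom-up with hierarchical clustering. The toy adjacency matrix is hypothetical; the 9/11, WMD-plot and Twitter case-study networks are not reproduced.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Sketch of the agglomerative step: turn pairwise tie strengths between nodes
# into distances and merge nodes bottom-up into clusters. The toy adjacency
# matrix is hypothetical; the case-study networks are not reproduced here.

nodes = ["A", "B", "C", "D", "E"]
ties = np.array([   # symmetric tie strength in [0, 1]
    [0.0, 0.9, 0.7, 0.1, 0.0],
    [0.9, 0.0, 0.8, 0.2, 0.1],
    [0.7, 0.8, 0.0, 0.1, 0.0],
    [0.1, 0.2, 0.1, 0.0, 0.9],
    [0.0, 0.1, 0.0, 0.9, 0.0],
])
distance = 1.0 - ties
np.fill_diagonal(distance, 0.0)

Z = linkage(squareform(distance), method="average")   # agglomerative merges
labels = fcluster(Z, t=2, criterion="maxclust")       # cut the tree at 2 clusters
print(dict(zip(nodes, labels)))
```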

  5. Agent based modeling in tactical wargaming

    Science.gov (United States)

    James, Alex; Hanratty, Timothy P.; Tuttle, Daniel C.; Coles, John B.

    2016-05-01

    Army staffs at division, brigade, and battalion levels often plan for contingency operations. As such, analysts consider the impact and potential consequences of actions taken. The Army Military Decision-Making Process (MDMP) dictates identification and evaluation of possible enemy courses of action; however, non-state actors often do not exhibit the same level and consistency of planned actions that the MDMP was originally designed to anticipate. The fourth MDMP step is a particular challenge, wargaming courses of action within the context of complex social-cultural behaviors. Agent-based Modeling (ABM) and its resulting emergent behavior is a potential solution to model terrain in terms of the human domain and improve the results and rigor of the traditional wargaming process.

  6. Trip Generation Model Based on Destination Attractiveness

    Institute of Scientific and Technical Information of China (English)

    YAO Liya; GUAN Hongzhi; YAN Hai

    2008-01-01

    Traditional trip generation forecasting methods use unified average trip generation rates to determine trip generation volumes in various traffic zones without considering the individual characteristics of each traffic zone. Therefore, the results can have significant errors. To reduce the forecasting error produced by uniform trip generation rates for different traffic zones, the behavior of each traveler was studied instead of the characteristics of the traffic zone. This paper gives a method for calculating the trip efficiency and the effect of traffic zones combined with a destination selection model based on disaggregate theory for trip generation. Beijing data is used with the trip generation method to predict trip volumes. The results show that the disaggregate model in this paper is more accurate than the traditional method. An analysis of the factors influencing traveler behavior and destination selection shows that the attractiveness of the traffic zone strongly affects the trip generation volume.
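
    A disaggregate destination-choice step of the kind referred to above is commonly formulated as a multinomial logit over zone attractiveness and travel impedance. The sketch below is a hypothetical example with made-up coefficients and zone data; it is not the calibrated Beijing model from the paper.

      # Hypothetical multinomial-logit destination choice: trips produced by an origin
      # zone are distributed according to zone attractiveness and travel time.
      import numpy as np

      attractiveness = np.array([120.0, 80.0, 45.0])   # destination-zone activity measure (invented)
      travel_time = np.array([10.0, 25.0, 15.0])       # minutes from the origin zone (invented)
      beta_attr, beta_time = 0.01, -0.08               # assumed taste coefficients

      utility = beta_attr * attractiveness + beta_time * travel_time
      prob = np.exp(utility) / np.exp(utility).sum()   # destination choice probabilities

      origin_trips = 1000                              # trips generated by the origin zone
      print(np.round(origin_trips * prob))             # expected trips attracted by each zone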

  7. Cloth Modeling Based on Particle System

    Institute of Scientific and Technical Information of China (English)

    钟跃崎; 王善元

    2001-01-01

    A physics-based particle system is employed for cloth modeling, supported by two basic algorithms: one is the construction of the internal and external forces acting on the particle system in terms of KES-F bending and shearing tests, and the other is the collision algorithm, in which collision detection is carried out by means of bisection of the time step and the collision response is handled according to the empirical law for frictionless collision. With these algorithms, the geometric state of the particles can be expressed as ordinary differential equations, which are numerically solved by fourth-order Runge-Kutta integration. Different draping figures of cotton fabric and wool fabric prove that such a particle system is suitable for 3D cloth modeling and simulation.
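
    As a minimal illustration of the integration step, the sketch below advances a single spring-anchored particle with a classical fourth-order Runge-Kutta step. The spring, damping and mass constants are placeholders, not the KES-F-derived bending and shearing forces used in the paper, and collisions are ignored.

      # One RK4 step for a single cloth particle attached to a fixed anchor by a spring;
      # constants are placeholders, not the KES-F-derived force model.
      import numpy as np

      MASS, K_SPRING, DAMPING, REST_LEN = 0.01, 50.0, 0.05, 0.0
      GRAVITY = np.array([0.0, -9.81, 0.0])
      ANCHOR = np.zeros(3)

      def acceleration(pos, vel):
          offset = pos - ANCHOR
          length = np.linalg.norm(offset) or 1e-9
          spring = -K_SPRING * (length - REST_LEN) * (offset / length)
          return GRAVITY + (spring - DAMPING * vel) / MASS

      def rk4_step(pos, vel, dt):
          # state derivative is (velocity, acceleration)
          k1v, k1a = vel,              acceleration(pos, vel)
          k2v, k2a = vel + 0.5*dt*k1a, acceleration(pos + 0.5*dt*k1v, vel + 0.5*dt*k1a)
          k3v, k3a = vel + 0.5*dt*k2a, acceleration(pos + 0.5*dt*k2v, vel + 0.5*dt*k2a)
          k4v, k4a = vel + dt*k3a,     acceleration(pos + dt*k3v, vel + dt*k3a)
          pos_new = pos + dt/6.0 * (k1v + 2*k2v + 2*k3v + k4v)
          vel_new = vel + dt/6.0 * (k1a + 2*k2a + 2*k3a + k4a)
          return pos_new, vel_new

      pos, vel = np.array([0.1, 0.0, 0.0]), np.zeros(3)
      for _ in range(100):
          pos, vel = rk4_step(pos, vel, dt=0.001)
      print(pos)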

  8. Activity based costing model for inventory valuation

    Directory of Open Access Journals (Sweden)

    Vineet Chouhan

    2017-03-01

    Full Text Available The Activity-Based Costing (ABC) model is used for the purpose of significantly improving overhead accounting systems by providing the best information required for managerial decisions. This paper discusses the applicability of the ABC technique to inventory valuation as a management accounting innovation. In order to prove the applicability of ABC for inventory control, a material-driven, medium-sized and privately owned company from the engineering (iron and steel) industry is selected, and by analysis of its production process, its material dependency and its use of indirect inventory, an ABC model is explored for better inventory control. The case revealed that the necessity of ABC in the area of inventory control is significant. The company is not only able to increase the quality of its decisions, but it can also analyze its direct material cost and the valuation of direct material, and use the implications for better decision making.

  9. Knowledge-based geometric modeling in construction

    DEFF Research Database (Denmark)

    Bonev, Martin; Hvam, Lars

    2012-01-01

    a considerably high amount of their resources is required for designing and specifying the majority of their product assortment. As design decisions are hereby based on knowledge and experience about behaviour and applicability of construction techniques and materials for a predefined design situation, smart...... tools need to be developed, to support these activities. In order to achieve a higher degree of design automation, this study proposes a framework for using configuration systems within the CAD environment together with suitable geometric modeling techniques on the example of a Danish manufacturer...

  10. Model-based vision using geometric hashing

    Science.gov (United States)

    Akerman, Alexander, III; Patton, Ronald

    1991-04-01

    The Geometric Hashing technique developed by the NYU Courant Institute has been applied to various automatic target recognition applications. In particular, I-MATH has extended the hashing algorithm to perform automatic target recognition of synthetic aperture radar (SAR) imagery. For this application, the hashing is performed upon the geometric locations of dominant scatterers. In addition to being a robust model-based matching algorithm -- invariant under translation, scale, and 3D rotations of the target -- hashing is of particular utility because it can still perform effective matching when the target is partially obscured. Moreover, hashing is very amenable to a SIMD parallel processing architecture, and thus potentially realtime implementable.
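
    To make the hashing-and-voting idea concrete, the sketch below implements a toy 2D, similarity-invariant geometric hashing scheme over invented point sets: every ordered pair of model points defines a basis, the remaining points are quantized in that basis and stored in a hash table, and a scene is recognized by letting its own basis encodings vote. It illustrates the general technique only; it is not the I-MATH SAR matcher, and all point sets and the cell size are made up.

      # Toy 2D geometric hashing (translation/rotation/scale invariant) over made-up models.
      import numpy as np
      from collections import defaultdict
      from itertools import permutations

      def basis_coords(points, i, j):
          # Express all points in the frame defined by the ordered basis pair (i, j).
          p, q = points[i], points[j]
          origin, axis = (p + q) / 2.0, q - p
          perp = np.array([-axis[1], axis[0]])
          scale = np.dot(axis, axis)
          rel = points - origin
          return np.stack([rel @ axis, rel @ perp], axis=1) / scale

      def quantize(coords, cell=0.1):
          return [tuple(np.round(c / cell).astype(int)) for c in coords]

      def build_table(models):
          table = defaultdict(list)
          for name, pts in models.items():
              for i, j in permutations(range(len(pts)), 2):
                  for key in quantize(basis_coords(pts, i, j)):
                      table[key].append((name, (i, j)))
          return table

      def recognize(table, scene):
          votes = defaultdict(int)
          for i, j in permutations(range(len(scene)), 2):
              for key in quantize(basis_coords(scene, i, j)):
                  for name, _basis in table.get(key, []):
                      votes[name] += 1
          return max(votes, key=votes.get) if votes else None

      models = {"target": np.array([[0., 0.], [1., 0.], [1., 1.], [0., 2.]]),
                "clutter": np.array([[0., 0.], [2., 0.], [0., 1.], [3., 3.]])}
      table = build_table(models)
      scene = 2.5 * models["target"] + np.array([4.0, -1.0])   # scaled + translated copy
      print(recognize(table, scene))                            # expected: "target"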

  11. Agent-based modeling and simulation

    CERN Document Server

    Taylor, Simon

    2014-01-01

    Operational Research (OR) deals with the use of advanced analytical methods to support better decision-making. It is multidisciplinary with strong links to management science, decision science, computer science and many application areas such as engineering, manufacturing, commerce and healthcare. In the study of emergent behaviour in complex adaptive systems, Agent-based Modelling & Simulation (ABMS) is being used in many different domains such as healthcare, energy, evacuation, commerce, manufacturing and defense. This collection of articles presents a convenient introduction to ABMS with pa

  12. Ontology-Based Model Of Firm Competitiveness

    Science.gov (United States)

    Deliyska, Boryana; Stoenchev, Nikolay

    2010-10-01

    Competitiveness is an important characteristic of any business organization (firm, company, corporation, etc.). It is of great significance for the organization's existence and defines the evaluation criteria of business success at the microeconomic level. Each criterion comprises a set of indicators with specific weight coefficients. In this work an ontology-based model of firm competitiveness is presented as a set of several mutually connected ontologies. It would be useful for knowledge structuring, standardization and sharing among experts and software engineers who develop applications in the domain. The assessment of the competitiveness of various business organizations could then be generated more effectively.

  13. Energy-based models for environmental biotechnology.

    Science.gov (United States)

    Rodríguez, Jorge; Lema, Juan M; Kleerebezem, Robbert

    2008-07-01

    Environmental biotechnology is evolving. Current process objectives include the production of chemicals and/or energy carriers (biofuels) in addition to the traditional objective of removing pollutants from waste. To maximise product yields and minimise biomass production, future processes will rely on anaerobic microbial communities. Anaerobic processes are characterised by small Gibbs energy changes in the reactions catalysed, and this provides clear thermodynamic process boundaries. Here, a Gibbs-energy-based methodology is proposed for mathematical modelling of energy-limited anaerobic ecosystems. This methodology provides a basis for the description of microbial activities as a function of environmental factors, which will allow enhanced catalysis of specific reactions of interest for process development.

  14. CSPBuilder - CSP based Scientific Workflow Modelling

    DEFF Research Database (Denmark)

    Friborg, Rune Møllegaard; Vinter, Brian

    2008-01-01

    This paper introduces a framework for building CSP based applications, targeted for clusters and next generation CPU designs. CPUs are produced with several cores today and every future CPU generation will feature increasingly more cores, resulting in a requirement for concurrency that has...... not previously been called for. The framework is CSP presented as a scientific workflow model, specialized for scientific computing applications. The purpose of the framework is to enable scientists to exploit large parallel computation resources, which has previously been hard due to the difficulty of concurrent...... programming using threads and locks....

  15. Integrated Semantic Similarity Model Based on Ontology

    Institute of Scientific and Technical Information of China (English)

    LIU Ya-Jun; ZHAO Yun

    2004-01-01

    To solve the problem of the inadequacy of semantic processing in intelligent question answering systems, an integrated semantic similarity model which calculates the semantic similarity using geometric distance and information content is presented in this paper. With the help of the interrelationship between concepts, the information content of concepts and the strength of the edges in the ontology network, we can calculate the semantic similarity between two concepts and provide information for the further calculation of the semantic similarity between a user's question and the answers in the knowledge base. The results of the experiments on the prototype have shown that the semantic problem in natural language processing can also be solved with the help of the knowledge and the abundant semantic information in the ontology. More than 90% accuracy with less than 50 ms average searching time has been reached in the intelligent question answering prototype system based on ontology. The results are very satisfactory.
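
    The combination of geometric (path-length) distance and information content can be sketched on a toy ontology as below. The hierarchy, the concept frequencies, the weighting scheme and the particular blending formula are all assumptions chosen for illustration; they are not the model or data reported in the paper.

      # Toy blend of path-length distance and information content for concept similarity
      # in a tiny hand-made ontology; weights and formula are assumed.
      import math

      parent = {"dog": "mammal", "cat": "mammal", "mammal": "animal",
                "fish": "animal", "animal": None}
      freq = {"dog": 10, "cat": 12, "fish": 8, "mammal": 30, "animal": 80}   # invented corpus counts
      total = sum(freq.values())

      def ancestors(c):
          chain = []
          while c is not None:
              chain.append(c)
              c = parent[c]
          return chain

      def lca(a, b):                                    # lowest common ancestor
          pb = ancestors(b)
          return next(x for x in ancestors(a) if x in pb)

      def info_content(c):
          return -math.log(freq[c] / total)

      def similarity(a, b, w_geo=0.5, w_ic=0.5):
          common = lca(a, b)
          dist = ancestors(a).index(common) + ancestors(b).index(common)
          geo = 1.0 / (1.0 + dist)                      # geometric-distance term
          ic = info_content(common) / info_content(a)   # shared-information term (assumed form)
          return w_geo * geo + w_ic * ic

      print(similarity("dog", "cat"), similarity("dog", "fish"))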

  16. Physiologically based Pharmacokinetic Modeling of 1,4-Dioxane in Rats, Mice, and Humans

    Energy Technology Data Exchange (ETDEWEB)

    Sweeney, Lisa M.; Thrall, Karla D.; Poet, Torka S.; Corley, Rick; Weber, Thomas J.; Locey, B. J.; Clarkson, Jacquelyn; Sager, S.; Gargas, M. L.

    2008-01-01

    1,4-Dioxane (CAS No. 123-91-1) is used primarily as a solvent or as a solvent stabilizer. It can cause lung, liver and kidney damage at sufficiently high exposure levels. Two physiologically-based pharmacokinetic (PBPK) models of 1,4-dioxane and its major metabolite, hydroxyethoxyacetic acid (HEAA), were published in 1990. These models have uncertainties and deficiencies that could be addressed and the model strengthened for use in a contemporary cancer risk assessment for 1,4-dioxane. Studies were performed to fill data gaps and reduce uncertainties pertaining to the pharmacokinetics of 1,4-dioxane and HEAA in rats, mice, and humans. Three types of studies were performed: partition coefficient measurements, blood time course in mice, and in vitro pharmacokinetics using rat, mouse, and human hepatocytes. Updated PBPK models were developed based on these new data and previously available data. The optimized rate of metabolism for the mouse was significantly higher than the value previously estimated. The optimized rat kinetic parameters were similar to those in the 1990 models. Only two human studies were identified. Model predictions were consistent with one study, but did not fit the second as well. In addition, a rat nasal exposure study was completed. The results confirmed that water directly contacts rat nasal tissues during drinking-water bioassays. Consistent with previous PBPK models, nasal tissues were not specifically included in the model. Use of these models will reduce the uncertainty in future 1,4-dioxane risk assessments.

  17. Physiologically based pharmacokinetic modeling of 1,4-Dioxane in rats, mice, and humans.

    Science.gov (United States)

    Sweeney, Lisa M; Thrall, Karla D; Poet, Torka S; Corley, Richard A; Weber, Thomas J; Locey, Betty J; Clarkson, Jacquelyn; Sager, Shawn; Gargas, Michael L

    2008-01-01

    1,4-Dioxane (CAS No. 123-91-1) is used primarily as a solvent or as a solvent stabilizer. It can cause lung, liver, and kidney damage at sufficiently high exposure levels. Two physiologically based pharmacokinetic (PBPK) models of 1,4-dioxane and its major metabolite, hydroxyethoxyacetic acid (HEAA), were published in 1990. These models have uncertainties and deficiencies that could be addressed and the model strengthened for use in a contemporary cancer risk assessment for 1,4-dioxane. Studies were performed to fill data gaps and reduce uncertainties pertaining to the pharmacokinetics of 1,4-dioxane and HEAA in rats, mice, and humans. Three types of studies were performed: partition coefficient measurements, blood time course in mice, and in vitro pharmacokinetics using rat, mouse, and human hepatocytes. Updated PBPK models were developed based on these new data and previously available data. The optimized rate of metabolism for the mouse was significantly higher than the value previously estimated. The optimized rat kinetic parameters were similar to those in the 1990 models. Only two human studies were identified. Model predictions were consistent with one study, but did not fit the second as well. In addition, a rat nasal exposure was completed. The results confirmed water directly contacts rat nasal tissues during drinking water under bioassay conditions. Consistent with previous PBPK models, nasal tissues were not specifically included in the model. Use of these models will reduce the uncertainty in future 1,4-dioxane risk assessments.

  18. Python-Based Applications for Hydrogeological Modeling

    Science.gov (United States)

    Khambhammettu, P.

    2013-12-01

    Python is a general-purpose, high-level programming language whose design philosophy emphasizes code readability. Add-on packages supporting fast array computation (numpy), plotting (matplotlib), and scientific/mathematical functions (scipy) have resulted in a powerful ecosystem for scientists interested in exploratory data analysis, high-performance computing and data visualization. Three examples are provided to demonstrate the applicability of the Python environment in hydrogeological applications. Python programs were used to model an aquifer test and estimate aquifer parameters at a Superfund site. The aquifer test conducted at a Groundwater Circulation Well was modeled with the Python/FORTRAN-based TTIM Analytic Element Code. The aquifer parameters were estimated with PEST such that a good match was produced between the simulated and observed drawdowns. Python scripts were written to interface with PEST and visualize the results. A convolution-based approach was used to estimate source concentration histories based on observed concentrations at receptor locations. Unit Response Functions (URFs) that relate the receptor concentrations to a unit release at the source were derived with the ATRANS code. The impact of any releases at the source could then be estimated by convolving the source release history with the URFs. Python scripts were written to compute and visualize receptor concentrations for user-specified source histories. The framework provided a simple and elegant way to test various hypotheses about the site. A Python/FORTRAN-based program TYPECURVEGRID-Py was developed to compute and visualize groundwater elevations and drawdown through time in response to a regional uniform hydraulic gradient and the influence of pumping wells using either the Theis solution for a fully-confined aquifer or the Hantush-Jacob solution for a leaky confined aquifer. The program supports an arbitrary number of wells that can operate according to arbitrary schedules. The
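
    The convolution step described above can be sketched in a few lines of numpy. The unit response function and the source release history below are invented placeholders, not ATRANS output or site data; the point is only that the receptor concentration is the discrete convolution of the release history with the URF.

      # Receptor concentration as the convolution of a source release history with a
      # unit response function (URF); both inputs are hypothetical placeholders.
      import numpy as np

      dt = 1.0                                           # time step (years)
      t = np.arange(0, 50, dt)
      urf = (t / 8.0) * np.exp(-t / 8.0)                 # hypothetical unit response per unit release
      release = np.zeros_like(t)
      release[5:15] = 100.0                              # assumed source history (mass per year)

      receptor_conc = np.convolve(release, urf)[:len(t)] * dt
      print(receptor_conc.max(), t[receptor_conc.argmax()])   # peak concentration and its timing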

  19. Re-examining concepts of occupation and occupation-based models: occupational therapy and community development.

    Science.gov (United States)

    Leclair, Leanne L

    2010-02-01

    A growing body of literature supports the role of occupational therapists in community development. Using a community development approach, occupational therapists respond to community-identified occupational needs. They work to build local resources and capacities and self-sustaining programs that foster change within the community and potentially beyond. The purpose of this paper is to highlight some key issues related to occupational therapy practice in community development. The definitions and classifications of occupation focus primarily on the individual and fail to elaborate on the shared occupations of a community. As well, occupation-based models of practice are not easily applied to occupational therapy practice in community development. In order for occupational therapy to articulate its role in community development, greater heed needs to be given to the definition and categorization of occupation, occupation-based models of practice, and their application to communities.

  20. Biologically based multistage modeling of radiation effects

    Energy Technology Data Exchange (ETDEWEB)

    William Hazelton; Suresh Moolgavkar; E. Georg Luebeck

    2005-08-30

    This past year we have made substantial progress in modeling the contribution of homeostatic regulation to low-dose radiation effects and carcinogenesis. We have worked to refine and apply our multistage carcinogenesis models to explicitly incorporate cell cycle states, simple and complex damage, checkpoint delay, slow and fast repair, differentiation, and apoptosis to study the effects of low-dose ionizing radiation in mouse intestinal crypts, as well as in other tissues. We have one paper accepted for publication in ''Advances in Space Research'', and another manuscript in preparation describing this work. I also wrote a chapter describing our combined cell-cycle and multistage carcinogenesis model that will be published in a book on stochastic carcinogenesis models edited by Wei-Yuan Tan. In addition, we organized and held a workshop on ''Biologically Based Modeling of Human Health Effects of Low dose Ionizing Radiation'', July 28-29, 2005 at Fred Hutchinson Cancer Research Center in Seattle, Washington. We had over 20 participants, including Mary Helen Barcellos-Hoff as keynote speaker, talks by most of the low-dose modelers in the DOE low-dose program, experimentalists including Les Redpath (and Mary Helen), Noelle Metting from DOE, and Tony Brooks. It appears that homeostatic regulation may be central to understanding low-dose radiation phenomena. The primary effects of ionizing radiation (IR) are cell killing, delayed cell cycling, and induction of mutations. However, homeostatic regulation causes cells that are killed or damaged by IR to eventually be replaced. Cells with an initiating mutation may have a replacement advantage, leading to clonal expansion of these initiated cells. Thus we have focused particularly on modeling effects that disturb homeostatic regulation as early steps in the carcinogenic process. There are two primary considerations that support our focus on homeostatic regulation. First, a number of

  1. Model based control of refrigeration systems

    Energy Technology Data Exchange (ETDEWEB)

    Sloth Larsen, L.F.

    2005-11-15

    The subject of this Ph.D. thesis is model based control of refrigeration systems. Model based control covers a variety of different types of controls that incorporate mathematical models. In this thesis the main subject has therefore been restricted to deal with system optimizing control. The optimizing control is divided into two layers, where the system oriented top layer deals with set-point optimizing control and the lower layer deals with dynamical optimizing control in the subsystems. The thesis has two main contributions, i.e. a novel approach for set-point optimization and a novel approach for desynchronization based on dynamical optimization. The focus in the development of the proposed set-point optimizing control has been on deriving a simple and general method that can easily be applied to various compositions of the same class of systems, such as refrigeration systems. The method is based on a set of parameter-dependent static equations describing the considered process. By adapting the parameters to the given process, predicting the steady state and computing a steady-state gradient of the cost function, the process can be driven continuously towards zero gradient, i.e. the optimum (if the cost function is convex). The method furthermore deals with system constraints by introducing barrier functions; hereby the best possible performance taking the given constraints into account can be obtained, e.g. under extreme operational conditions. The proposed method has been applied to a test refrigeration system, placed at Aalborg University, for minimization of the energy consumption. Here it was proved that by using general static parameter-dependent system equations it was possible to drive the set-points close to the optimum and thus reduce the power consumption by up to 20%. In the dynamical optimizing layer the idea is to optimize the operation of the subsystem or the groupings of subsystems that limit the obtainable system performance. In systems
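
    The set-point optimizing idea, driving the process towards zero steady-state gradient while barrier functions keep the set-point inside its constraints, can be sketched as below. The quadratic power model, the constraint limits, the barrier weight and the step size are invented stand-ins for the parameter-dependent static equations of a real refrigeration system.

      # Gradient-driven set-point optimization with a log-barrier constraint (illustrative only).
      import numpy as np

      def power(setpoint):
          # hypothetical steady-state power consumption vs. an evaporation-temperature set-point
          return 2.0 + 0.05 * (setpoint + 10.0) ** 2

      def barrier(setpoint, lower=-15.0, upper=-2.0, mu=0.05):
          # log-barrier keeping the set-point strictly inside its operating limits
          return -mu * (np.log(setpoint - lower) + np.log(upper - setpoint))

      def cost(setpoint):
          return power(setpoint) + barrier(setpoint)

      setpoint, step, eps = -4.0, 0.5, 1e-4
      for _ in range(200):
          grad = (cost(setpoint + eps) - cost(setpoint - eps)) / (2 * eps)  # numerical steady-state gradient
          setpoint -= step * grad                                           # drive towards zero gradient
      print(round(setpoint, 2), round(power(setpoint), 3))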

  2. Ontology Mapping of Business Process Modeling Based on Formal Temporal Logic

    Directory of Open Access Journals (Sweden)

    Irfan Chishti

    2014-08-01

    Full Text Available A business process is the combination of a set of activities with logical order and dependence, whose objective is to produce a desired goal. Business process modeling (BPM) using knowledge of the available process modeling techniques enables a common understanding and analysis of a business process. Industry and academia use informal and formal techniques respectively to represent business processes (BP), with the main objective of supporting an organization. Although both are aiming at BPM, the techniques used are quite different in their semantics. While carrying out the literature research, it was found that no general representation of business process modeling is available that is more expressive than the commercial modeling tools and techniques. Therefore, this work is primarily conceived to provide an ontology mapping of the modeling terms of Business Process Modeling Notation (BPMN), Unified Modeling Language (UML) Activity Diagrams (AD) and Event-Driven Process Chains (EPC) to temporal logic. Being a formal system, first-order logic assists in a thorough understanding of process modeling and its application. However, our contribution is to devise a versatile conceptual categorization of modeling terms/constructs and also to formalize them, based on well-accepted business notions, such as action, event, process, connector and flow. It is demonstrated that the new categorization of modeling terms, mapped to formal temporal logic, provides the expressive power to subsume business process modeling techniques, i.e. BPMN, UML AD and EPC.

  3. Agent-based modelling in synthetic biology.

    Science.gov (United States)

    Gorochowski, Thomas E

    2016-11-30

    Biological systems exhibit complex behaviours that emerge at many different levels of organization. These span the regulation of gene expression within single cells to the use of quorum sensing to co-ordinate the action of entire bacterial colonies. Synthetic biology aims to make the engineering of biology easier, offering an opportunity to control natural systems and develop new synthetic systems with useful prescribed behaviours. However, in many cases, it is not understood how individual cells should be programmed to ensure the emergence of a required collective behaviour. Agent-based modelling aims to tackle this problem, offering a framework in which to simulate such systems and explore cellular design rules. In this article, I review the use of agent-based models in synthetic biology, outline the available computational tools, and provide details on recently engineered biological systems that are amenable to this approach. I further highlight the challenges facing this methodology and some of the potential future directions. © 2016 The Author(s).

  4. Models-Based Practice: Great White Hope or White Elephant?

    Science.gov (United States)

    Casey, Ashley

    2014-01-01

    Background: Many critical curriculum theorists in physical education have advocated a model- or models-based approach to teaching in the subject. This paper explores the literature base around models-based practice (MBP) and asks if this multi-models approach to curriculum planning has the potential to be the great white hope of pedagogical change…

  6. Intentional switches between bimanual coordination patterns are primarily effectuated by the nondominant hand

    NARCIS (Netherlands)

    de Poel, HJ; Peper, CE; Beek, PJ

    2006-01-01

    Based on indications that hand dominance is characterized by asymmetrical interlimb coupling strength (with the dominant hand exerting stronger influences on the nondominant hand than vice versa), intentional switches between rhythmic bimanual coordination patterns were predicted to be mediated prim

  7. Intellectual Model-Based Configuration Management Conception

    Directory of Open Access Journals (Sweden)

    Bartusevics Arturs

    2014-07-01

    Full Text Available Software configuration management is one of the most important disciplines within a software development project; it helps control the software evolution process and allows including into the end product only tested and validated changes. To achieve this, software configuration management completes certain tasks. Concrete tools are used for the technical implementation of these tasks, such as version control systems, servers for continuous integration, compilers, etc. A correct configuration management process usually requires several tools, which mutually exchange information by generating various kinds of transfers. When it comes to introducing the configuration management process, there are often situations in which tool installation is started, yet at that moment there is no general picture of the total process. The article offers a model-based configuration management concept, which foresees the development of an abstract model of the configuration management process that is later transformed into lower-abstraction-level models, and tools are indicated to support the technical process. A solution of this kind allows a more rational introduction and configuration of tools.

  8. Model-based estimation of individual fitness

    Science.gov (United States)

    Link, W.A.; Cooch, E.G.; Cam, E.

    2002-01-01

    Fitness is the currency of natural selection, a measure of the propagation rate of genotypes into future generations. Its various definitions have the common feature that they are functions of survival and fertility rates. At the individual level, the operative level for natural selection, these rates must be understood as latent features, genetically determined propensities existing at birth. This conception of rates requires that individual fitness be defined and estimated by consideration of the individual in a modelled relation to a group of similar individuals; the only alternative is to consider a sample of size one, unless a clone of identical individuals is available. We present hierarchical models describing individual heterogeneity in survival and fertility rates and allowing for associations between these rates at the individual level. We apply these models to an analysis of life histories of Kittiwakes (Rissa tridactyla) observed at several colonies on the Brittany coast of France. We compare Bayesian estimation of the population distribution of individual fitness with estimation based on treating individual life histories in isolation, as samples of size one (e.g. McGraw & Caswell, 1996).

  9. Evaluating face trustworthiness: a model based approach.

    Science.gov (United States)

    Todorov, Alexander; Baron, Sean G; Oosterhof, Nikolaas N

    2008-06-01

    Judgments of trustworthiness from faces determine basic approach/avoidance responses and approximate the valence evaluation of faces that runs across multiple person judgments. Here, based on trustworthiness judgments and using a computer model for face representation, we built a model for representing face trustworthiness (study 1). Using this model, we generated novel faces with an increased range of trustworthiness and used these faces as stimuli in a functional Magnetic Resonance Imaging study (study 2). Although participants did not engage in explicit evaluation of the faces, the amygdala response changed as a function of face trustworthiness. An area in the right amygdala showed a negative linear response: as the untrustworthiness of faces increased, so did the amygdala response. Areas in the left and right putamen, the latter extending into the anterior insula, showed a similar negative linear response. The response in the left amygdala was quadratic: strongest for faces at both extremes of the trustworthiness dimension. The medial prefrontal cortex and precuneus also showed a quadratic response, but their response was strongest to faces in the middle range of the trustworthiness dimension.

  10. Fast Algorithms for Model-Based Diagnosis

    Science.gov (United States)

    Fijany, Amir; Barrett, Anthony; Vatan, Farrokh; Mackey, Ryan

    2005-01-01

    Two improved new methods for automated diagnosis of complex engineering systems involve the use of novel algorithms that are more efficient than prior algorithms used for the same purpose. Both the recently developed algorithms and the prior algorithms in question are instances of model-based diagnosis, which is based on exploring the logical inconsistency between an observation and a description of a system to be diagnosed. As engineering systems grow more complex and increasingly autonomous in their functions, the need for automated diagnosis increases concomitantly. In model-based diagnosis, the function of each component and the interconnections among all the components of the system to be diagnosed (for example, see figure) are represented as a logical system, called the system description (SD). Hence, the expected behavior of the system is the set of logical consequences of the SD. Faulty components lead to inconsistency between the observed behaviors of the system and the SD. The task of finding the faulty components (diagnosis) reduces to finding the components, the abnormalities of which could explain all the inconsistencies. Of course, the meaningful solution should be a minimal set of faulty components (called a minimal diagnosis), because the trivial solution, in which all components are assumed to be faulty, always explains all inconsistencies. Although the prior algorithms in question implement powerful methods of diagnosis, they are not practical because they essentially require exhaustive searches among all possible combinations of faulty components and therefore entail the amounts of computation that grow exponentially with the number of components of the system.
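
    The underlying combinatorial problem can be illustrated with a brute-force toy: each conflict set is a group of components that cannot all be healthy given the observations, and a minimal diagnosis is a minimal hitting set of the conflicts. The sketch below uses hypothetical conflict sets and exhaustive enumeration purely to show the idea; the algorithms described in the record are designed precisely to avoid this kind of exhaustive search.

      # Minimal diagnoses as minimal hitting sets of conflict sets (brute force, toy data).
      from itertools import combinations

      components = {"A1", "A2", "M1", "M2", "M3"}
      conflicts = [{"A1", "M1", "M2"}, {"A1", "A2", "M1", "M3"}]   # hypothetical conflict sets

      def minimal_diagnoses(components, conflicts):
          diagnoses = []
          for size in range(1, len(components) + 1):
              for candidate in combinations(sorted(components), size):
                  cand = set(candidate)
                  if all(cand & c for c in conflicts):             # hits every conflict
                      if not any(d < cand for d in diagnoses):     # keep only minimal sets
                          diagnoses.append(cand)
          return diagnoses

      print(minimal_diagnoses(components, conflicts))   # e.g. {'A1'}, {'M1'}, {'A2','M2'}, {'M2','M3'}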

  11. Activity Recognition Using Biomechanical Model Based Pose Estimation

    OpenAIRE

    Reiss, Attila; Hendeby, Gustaf; Bleser, Gabriele; Stricker, Didier

    2010-01-01

    In this paper, a novel activity recognition method based on signal-oriented and model-based features is presented. The model-based features are calculated from shoulder and elbow joint angles and torso orientation, provided by upper-body pose estimation based on a biomechanical body model. The recognition performance of signal-oriented and model-based features is compared within this paper, and the potential of improving recognition accuracy by combining the two approaches is proved: the accu...

  12. Human land uses enhance sediment denitrification and N2O production in Yangtze lakes primarily by influencing lake water quality

    Directory of Open Access Journals (Sweden)

    W. Liu

    2015-05-01

    Full Text Available Sediment denitrification in lakes alleviates the effects of eutrophication through removal of nitrogen to the atmosphere as N2O and N2. However, N2O contributes notably to the greenhouse effect and global warming. Human land uses (e.g., agricultural and urban areas) strongly affect lake water quality and sediment characteristics, which, in turn, may regulate lake sediment denitrification and N2O production. In this study, we investigated sediment denitrification and N2O production and their relationships to within-lake variables and watershed land uses in 20 lakes from the Yangtze River basin in China. The results indicated that both lake water quality and sediment characteristics were significantly influenced by watershed land uses. An increased background denitrification rate would result in an increased N2O production rate. Background denitrification and N2O production rates were positively related to water nitrogen concentrations but were not significantly correlated with sediment characteristics or plant community structure. A significant positive relationship was observed between background denitrification rate and the percentage of human-dominated land uses (HDL) in watersheds. Structural equation modelling revealed that the indirect effects of HDL on sediment denitrification and N2O production in Yangtze lakes were mediated primarily through lake water quality. Our findings also suggest that although sediments in Yangtze lakes can remove large quantities of nitrogen through denitrification, they may also be an important source of N2O, especially in lakes with high nitrogen content.

  13. Parasites Affect Food Web Structure Primarily through Increased Diversity and Complexity

    NARCIS (Netherlands)

    Dunne, J.A.; Lafferty, K.D.; Dobson, A.P.; Hechinger, R.F.; Kuris, A.M.; Martinez, N.D.; McLaughlin, J.P.; Mouritsen, K.N.; Poulin, R.; Reise, K.; Stouffer, D.B.; Thieltges, D.W.; Williams, R.J.; Zander, C.D.

    2013-01-01

    Comparative research on food web structure has revealed generalities in trophic organization, produced simple models, and allowed assessment of robustness to species loss. These studies have mostly focused on free-living species. Recent research has suggested that inclusion of parasites alters struc

  15. An Attack Modeling Based on Colored Petri Net

    Institute of Scientific and Technical Information of China (English)

    ZHOU Shijie; QIN Zhiguang; ZHANG Feng; LIU Jinde

    2004-01-01

    A colored Petri net (CPN) based attack modeling approach is addressed. Compared with graph-based modeling, a CPN based attack model is flexible enough to model Internet intrusions because of their static and dynamic features. The processes and rules for building a CPN based attack model from an attack tree are also presented. In order to evaluate the risk of intrusion, some cost elements are added to CPN based attack modeling. This extended model is useful in intrusion detection and risk evaluation. Experience shows that it is easy to exploit the CPN based attack modeling approach to provide controlling functions, such as intrusion response and intrusion defense. A case study given in this paper shows that the CPN based attack model has many unique characteristics which the attack tree model lacks.

  16. Context Based Reasoning in Business Process Models

    OpenAIRE

    Balabko, Pavel; Wegmann, Alain

    2003-01-01

    Modeling approaches often are not adapted to human reasoning: models are ambiguous and imprecise. The same model element may have multiple meanings in different functional roles of a system. Existing modeling approaches do not relate these functional roles explicitly to model elements. A principle that can solve this problem is that model elements should be defined in a context. We believe that the explicit modeling of context is especially useful in Business Process Modeling (BPM) where the ...

  17. Inverted Papilloma Originating Primarily from the Nasolacrimal Duct: A Case Report and Review of the Pertinent Literature

    Directory of Open Access Journals (Sweden)

    Hussein Z. Walijee

    2015-01-01

    Full Text Available Introduction. Inverted papilloma (IP) is an uncommon, benign yet aggressive neoplasm characterised by high recurrence rates and a tendency towards malignant transformation. The majority of IP cases originate in the ethmoid region, lateral wall of the nasal fossa, and maxillary sinus. The authors report a case of an IP originating primarily from the nasolacrimal duct (NLD). Case. A 69-year-old Caucasian gentleman presented with a lump in his right medial canthal region, epiphora, and discharge bilaterally. Radiological investigation revealed a well-defined, heterogeneous mass within the proximal NLD eroding the bony canal, protruding into the middle meatus and into the right orbit. The tumour was excised en bloc utilizing a combined external and endoscopic approach based on its location. Histology revealed hyperplastic ribbons of basement membrane-enclosed epithelium growing endophytically into the underlying stroma with no evidence of invasive malignancy. The patient made an uneventful recovery with unchanged visual acuity and normal extraocular movements. Conclusion. The case demonstrates the variability within the sinonasal tract from which IP can develop and the individuality of each case, necessitating tailored operative techniques for complete excision whilst minimising recurrence rates. We also present a combined endoscopic approach for the en bloc resection of a NLD IP with no clinical recurrence at 15-month follow-up.

  18. Adolescent Pornography Use and Dating Violence among a Sample of Primarily Black and Hispanic, Urban-Residing, Underage Youth

    Directory of Open Access Journals (Sweden)

    Emily F. Rothman

    2015-12-01

    Full Text Available This cross-sectional study was designed to characterize the pornography viewing preferences of a sample of U.S.-based, urban-residing, economically disadvantaged, primarily Black and Hispanic youth (n = 72), and to assess whether pornography use was associated with experiences of adolescent dating abuse (ADA) victimization. The sample was recruited from a large, urban, safety net hospital, and participants were 53% female, 59% Black, 19% Hispanic, 14% Other race, 6% White, and 1% Native American. All were 16–17 years old. More than half (51%) had been asked to watch pornography together by a dating or sexual partner, and 44% had been asked to do something sexual that a partner saw in pornography. Adolescent dating abuse (ADA) victimization was associated with more frequent pornography use, viewing pornography in the company of others, being asked to perform a sexual act that a partner first saw in pornography, and watching pornography during or after marijuana use. Approximately 50% of ADA victims and 32% of non-victims reported that they had been asked to do a sexual act that their partner saw in pornography (p = 0.15), and 58% did not feel happy to have been asked. Results suggest that weekly pornography use among underage, urban-residing youth is common, and may be associated with ADA victimization.

  19. Adolescent Pornography Use and Dating Violence among a Sample of Primarily Black and Hispanic, Urban-Residing, Underage Youth

    Science.gov (United States)

    Rothman, Emily F.; Adhia, Avanti

    2015-01-01

    This cross-sectional study was designed to characterize the pornography viewing preferences of a sample of U.S.-based, urban-residing, economically disadvantaged, primarily Black and Hispanic youth (n = 72), and to assess whether pornography use was associated with experiences of adolescent dating abuse (ADA) victimization. The sample was recruited from a large, urban, safety net hospital, and participants were 53% female, 59% Black, 19% Hispanic, 14% Other race, 6% White, and 1% Native American. All were 16–17 years old. More than half (51%) had been asked to watch pornography together by a dating or sexual partner, and 44% had been asked to do something sexual that a partner saw in pornography. Adolescent dating abuse (ADA) victimization was associated with more frequent pornography use, viewing pornography in the company of others, being asked to perform a sexual act that a partner first saw in pornography, and watching pornography during or after marijuana use. Approximately 50% of ADA victims and 32% of non-victims reported that they had been asked to do a sexual act that their partner saw in pornography (p = 0.15), and 58% did not feel happy to have been asked. Results suggest that weekly pornography use among underage, urban-residing youth may be common, and may be associated with ADA victimization. PMID:26703744

  20. Ecosystem Based Business Model of Smart Grid

    DEFF Research Database (Denmark)

    Lundgaard, Morten Raahauge; Ma, Zheng; Jørgensen, Bo Nørregaard

    2015-01-01

    This paper tries to investigate the ecosystem based business model in a smart grid infrastructure and the potential of value capture in a highly complex macro infrastructure such as the smart grid. This paper proposes an alternative perspective to study the smart grid business ecosystem to support...... the infrastructural challenges, such as the interoperability of business components for the smart grid. So far little research has explored the business ecosystem in the smart grid concept. Studying the smart grid with the theory of business ecosystems may open opportunities to understand market catalysts. This study...... contributes an understanding of the business ecosystem applicable to the smart grid. The smart grid infrastructure is an intricate business ecosystem, which has several intentions regarding the value proposition to deliver and what it should be. The findings help to identify and capture value from markets....

  1. Model-based control of networked systems

    CERN Document Server

    Garcia, Eloy; Montestruque, Luis A

    2014-01-01

    This monograph introduces a class of networked control systems (NCS) called model-based networked control systems (MB-NCS) and presents various architectures and control strategies designed to improve the performance of NCS. The overall performance of NCS considers the appropriate use of network resources, particularly network bandwidth, in conjunction with the desired response of the system being controlled.   The book begins with a detailed description of the basic MB-NCS architecture that provides stability conditions in terms of state feedback updates. It also covers typical problems in NCS such as network delays, network scheduling, and data quantization, as well as more general control problems such as output feedback control, nonlinear systems stabilization, and tracking control.   Key features and topics include: Time-triggered and event-triggered feedback updates Stabilization of uncertain systems subject to time delays, quantization, and extended absence of feedback Optimal control analysis and ...

  2. Model Based Control of Refrigeration Systems

    DEFF Research Database (Denmark)

    Larsen, Lars Finn Sloth

    of the supermarket refrigeration systems therefore greatly relies on a human operator to detect and accommodate failures, and to optimize system performance under varying operational conditions. Today these functions are maintained by monitoring centres located all over the world. Initiated by the growing need...... for automation of these procedures, that is to incorporate some "intelligence" in the control system, this project was started up. The main emphasis of this work has been on model based methods for system optimizing control in supermarket refrigeration systems. The idea of implementing a system optimizing.......e. by degrading the performance. The method has been successfully applied to a test refrigeration system for minimization of the power consumption; the hereby gained experimental results will be presented. The present control structure in a supermarket refrigeration system is distributed, which means

  3. Model based optimization of EMC input filters

    Energy Technology Data Exchange (ETDEWEB)

    Raggl, K; Kolar, J. W. [Swiss Federal Institute of Technology, Power Electronic Systems Laboratory, Zuerich (Switzerland); Nussbaumer, T. [Levitronix GmbH, Zuerich (Switzerland)

    2008-07-01

    Input filters of power converters for compliance with regulatory electromagnetic compatibility (EMC) standards are often over-dimensioned in practice due to a non-optimal selection of number of filter stages and/or the lack of solid volumetric models of the inductor cores. This paper presents a systematic filter design approach based on a specific filter attenuation requirement and volumetric component parameters. It is shown that a minimal volume can be found for a certain optimal number of filter stages for both the differential mode (DM) and common mode (CM) filter. The considerations are carried out exemplarily for an EMC input filter of a single phase power converter for the power levels of 100 W, 300 W, and 500 W. (author)
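
    The trade-off behind an optimal number of filter stages can be caricatured as below: each additional LC stage contributes roughly 40 dB/decade of attenuation, which relaxes the corner frequency and shrinks the passives, but also adds fixed overhead volume per stage. All scaling constants and the volume law are invented placeholders, not the volumetric component models or attenuation requirements used in the paper.

      # Rough sketch of choosing the number of EMC filter stages for minimal total volume.
      A_REQ = 80.0          # required attenuation at the first noise frequency (dB), assumed
      F_NOISE = 150e3       # first frequency of interest (Hz), assumed
      K_PASSIVE = 5e7       # assumed passive-volume scaling constant (placeholder units)
      V_OVERHEAD = 3.0      # assumed fixed volume per stage (cm^3): damping, board area, ...

      def total_volume(n_stages):
          f_corner = F_NOISE / 10 ** (A_REQ / (40.0 * n_stages))   # 40 dB/decade per LC stage
          passive = K_PASSIVE / f_corner ** 2                      # passives grow as the corner drops
          return n_stages * (passive + V_OVERHEAD)

      volumes = {n: total_volume(n) for n in range(1, 7)}
      best = min(volumes, key=volumes.get)
      print({n: round(v, 1) for n, v in volumes.items()}, "-> optimal number of stages:", best)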

  4. Distributed Damage Estimation for Prognostics based on Structural Model Decomposition

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based prognostics approaches capture system knowledge in the form of physics-based models of components that include how they fail. These methods consist of...

  5. Multiple Damage Progression Paths in Model-based Prognostics

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based prognostics approaches employ domain knowledge about a system, its components, and how they fail through the use of physics-based models. Component...

  6. An Agent-Based Monetary Production Simulation Model

    DEFF Research Database (Denmark)

    Bruun, Charlotte

    2006-01-01

    An Agent-Based Simulation Model Programmed in Objective Borland Pascal. Program and source code is downloadable.

  8. Integrated Enterprise Modeling Method Based on Workflow Model and Multiviews

    Institute of Scientific and Technical Information of China (English)

    林慧苹; 范玉顺; 吴澄

    2001-01-01

    Many enterprise modeling methods have been proposed to model the business process of enterprises and to implement CIM systems. But difficulties are still encountered when these methods are applied to CIM system design and implementation. This paper proposes a new integrated enterprise modeling methodology based on the workflow model. The system architecture and the integrated modeling environment are described with a new simulation strategy. The modeling process and the relationship between the workflow model and the views are discussed.

  9. Parasites affect food web structure primarily through increased diversity and complexity.

    Directory of Open Access Journals (Sweden)

    Jennifer A Dunne

    Full Text Available Comparative research on food web structure has revealed generalities in trophic organization, produced simple models, and allowed assessment of robustness to species loss. These studies have mostly focused on free-living species. Recent research has suggested that inclusion of parasites alters structure. We assess whether such changes in network structure result from unique roles and traits of parasites or from changes to diversity and complexity. We analyzed seven highly resolved food webs that include metazoan parasite data. Our analyses show that adding parasites usually increases link density and connectance (simple measures of complexity), particularly when including concomitant links (links from predators to parasites of their prey). However, we clarify prior claims that parasites "dominate" food web links. Although parasites can be involved in a majority of links, in most cases classic predation links outnumber classic parasitism links. Regarding network structure, observed changes in degree distributions, 14 commonly studied metrics, and link probabilities are consistent with scale-dependent changes in structure associated with changes in diversity and complexity. Parasite and free-living species thus have similar effects on these aspects of structure. However, two changes point to unique roles of parasites. First, adding parasites and concomitant links strongly alters the frequency of most motifs of interactions among three taxa, reflecting parasites' roles as resources for predators of their hosts, driven by trophic intimacy with their hosts. Second, compared to free-living consumers, many parasites' feeding niches appear broader and less contiguous, which may reflect complex life cycles and small body sizes. This study provides new insights about generic versus unique impacts of parasites on food web structure, extends the generality of food web theory, gives a more rigorous framework for assessing the impact of any species on trophic

  10. Parasites affect food web structure primarily through increased diversity and complexity

    Science.gov (United States)

    Dunne, Jennifer A.; Lafferty, Kevin D.; Dobson, Andrew P.; Hechinger, Ryan F.; Kuris, Armand M.; Martinez, Neo D.; McLaughlin, John P.; Mouritsen, Kim N.; Poulin, Robert; Reise, Karsten; Stouffer, Daniel B.; Thieltges, David W.; Williams, Richard J.; Zander, Claus Dieter

    2013-01-01

    Comparative research on food web structure has revealed generalities in trophic organization, produced simple models, and allowed assessment of robustness to species loss. These studies have mostly focused on free-living species. Recent research has suggested that inclusion of parasites alters structure. We assess whether such changes in network structure result from unique roles and traits of parasites or from changes to diversity and complexity. We analyzed seven highly resolved food webs that include metazoan parasite data. Our analyses show that adding parasites usually increases link density and connectance (simple measures of complexity), particularly when including concomitant links (links from predators to parasites of their prey). However, we clarify prior claims that parasites ‘‘dominate’’ food web links. Although parasites can be involved in a majority of links, in most cases classic predation links outnumber classic parasitism links. Regarding network structure, observed changes in degree distributions, 14 commonly studied metrics, and link probabilities are consistent with scale-dependent changes in structure associated with changes in diversity and complexity. Parasite and free-living species thus have similar effects on these aspects of structure. However, two changes point to unique roles of parasites. First, adding parasites and concomitant links strongly alters the frequency of most motifs of interactions among three taxa, reflecting parasites’ roles as resources for predators of their hosts, driven by trophic intimacy with their hosts. Second, compared to free-living consumers, many parasites’ feeding niches appear broader and less contiguous, which may reflect complex life cycles and small body sizes. This study provides new insights about generic versus unique impacts of parasites on food web structure, extends the generality of food web theory, gives a more rigorous framework for assessing the impact of any species on trophic

  11. Validating agent based models through virtual worlds.

    Energy Technology Data Exchange (ETDEWEB)

    Lakkaraju, Kiran; Whetzel, Jonathan H.; Lee, Jina; Bier, Asmeret Brooke; Cardona-Rivera, Rogelio E.; Bernstein, Jeremy Ray Rhythm

    2014-01-01

    As the US continues its vigilance against distributed, embedded threats, understanding the political and social structure of these groups becomes paramount for predicting and disrupting their attacks. Agent-based models (ABMs) serve as a powerful tool to study these groups. While the popularity of social network tools (e.g., Facebook, Twitter) has provided extensive communication data, there is a lack of fine-grained behavioral data with which to inform and validate existing ABMs. Virtual worlds, in particular massively multiplayer online games (MMOG), where large numbers of people interact within a complex environment for long periods of time, provide an alternative source of data. These environments provide a rich social setting where players engage in a variety of activities observed between real-world groups: collaborating and/or competing with other groups, conducting battles for scarce resources, and trading in a market economy. Strategies employed by player groups surprisingly reflect those seen in present-day conflicts, where players use diplomacy or espionage as their means for accomplishing their goals. In this project, we propose to address the need for fine-grained behavioral data by acquiring and analyzing game data from a commercial MMOG, referred to within this report as Game X. The goals of this research were: (1) devising toolsets for analyzing virtual world data to better inform the rules that govern a social ABM and (2) exploring how virtual worlds could serve as a source of data to validate ABMs established for analogous real-world phenomena. During this research, we studied certain patterns of group behavior to complement social modeling efforts where a significant lack of detailed examples of observed phenomena exists. This report outlines our work examining group behaviors that underlie what we have termed the Expression-To-Action (E2A) problem: determining the changes in social contact that lead individuals/groups to engage in a particular behavior

  12. A model-based evaluation system of enterprise

    Institute of Scientific and Technical Information of China (English)

    Yan Junwei; Ye Yang; Wang Jian

    2005-01-01

    This paper analyses the architecture of enterprise modeling, proposes indicator selection principles and indicator decomposition methods, examines the approaches to the evaluation of enterprise modeling, and designs an AHP-based evaluation model. Then a model-based evaluation system of the enterprise is presented to effectively evaluate the business model in the framework of enterprise modeling.

  13. Model Mapping Approach Based on Ontology Semantics

    Directory of Open Access Journals (Sweden)

    Jinkui Hou

    2013-09-01

    Full Text Available The mapping relations between different models are the foundation for model transformation in model-driven software development. On the basis of ontology semantics, model mappings between different levels are classified by using the structural semantics of modeling languages. The general definition process for mapping relations is explored, and the principles of structure mapping are proposed subsequently. The approach is further illustrated by the mapping relations from the class model of an object-oriented modeling language to C programming code. The application research shows that the approach provides theoretical guidance for the realization of model mapping, and thus can effectively support model-driven software development.

  14. Adipose stem cell-derived nanovesicles inhibit emphysema primarily via an FGF2-dependent pathway.

    Science.gov (United States)

    Kim, You-Sun; Kim, Ji-Young; Cho, RyeonJin; Shin, Dong-Myung; Lee, Sei Won; Oh, Yeon-Mok

    2017-01-13

    Cell therapy using stem cells has produced therapeutic benefits in animal models of COPD. Secretory mediators are proposed as one mechanism for stem cell effects because very few stem cells engraft after injection into recipient animals. Recently, nanovesicles that overcome the disadvantages of natural exosomes have been generated artificially from cells. We generated artificial nanovesicles from adipose-derived stem cells (ASCs) using sequential penetration through polycarbonate membranes. ASC-derived artificial nanovesicles displayed a 100 nm-sized spherical shape similar to ASC-derived natural exosomes and expressed both exosomal and stem cell markers. The proliferation rate of lung epithelial cells was increased in cells treated with ASC-derived artificial nanovesicles compared with cells treated with ASC-derived natural exosomes. The lower dose of ASC-derived artificial nanovesicles had similar regenerative capacity compared with a higher dose of ASCs and ASC-derived natural exosomes. In addition, FGF2 levels in the lungs of mice treated with ASC-derived artificial nanovesicles were increased. The uptake of ASC-derived artificial nanovesicles was inhibited by heparin, which is a competitive inhibitor of heparan sulfate proteoglycan that is associated with FGF2 signaling. Taken together, the data indicate that lower doses of ASC-derived artificial nanovesicles may have beneficial effects similar to higher doses of ASCs or ASC-derived natural exosomes in an animal model with emphysema, suggesting that artificial nanovesicles may have economic advantages that warrant future clinical studies.

  16. Framework of Pattern Recognition Model Based on the Cognitive Psychology

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    According to the fundamental theory of the visual cognition mechanism and cognitive psychology, the visual pattern recognition model is introduced briefly. Three pattern recognition models, i.e. the template-based matching model, the prototype-based matching model and the feature-based matching model, are built and discussed separately. In addition, the influence of object background information and the visual focus point on the result of pattern recognition is also discussed with the example of recognition of fuzzy letters and figures.
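
    To make the template-based matching model concrete, here is a minimal sketch of matching by normalized cross-correlation over image patches; the array sizes and threshold are illustrative assumptions, and a prototype- or feature-based matcher would replace the stored template with a class mean or a feature vector.

```python
import numpy as np

def normalized_cross_correlation(patch, template):
    # Zero-mean, unit-norm correlation; returns a score in [-1, 1].
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.linalg.norm(p) * np.linalg.norm(t)
    return float((p * t).sum() / denom) if denom > 0 else 0.0

def template_match(image, template, threshold=0.8):
    # Slide the template over the image and keep locations scoring above threshold.
    h, w = template.shape
    hits = []
    for i in range(image.shape[0] - h + 1):
        for j in range(image.shape[1] - w + 1):
            score = normalized_cross_correlation(image[i:i + h, j:j + w], template)
            if score >= threshold:
                hits.append((i, j, score))
    return hits

# Illustrative usage: a random image with the template cut out of it, so the
# matcher should report a perfect hit at (10, 20).
rng = np.random.default_rng(1)
image = rng.random((32, 32))
template = image[10:15, 20:25].copy()
print(template_match(image, template)[:3])
```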

  17. Embedding task-based neural models into a connectome-based model of the cerebral cortex

    Directory of Open Access Journals (Sweden)

    Antonio Ulloa

    2016-08-01

    Full Text Available A number of recent efforts have used large-scale, biologically realistic, neural models to help understand the neural basis for the patterns of activity observed in both resting state and task-related functional neural imaging data. An example of the former is The Virtual Brain (TVB) software platform, which allows one to apply large-scale neural modeling in a whole brain framework. TVB provides a set of structural connectomes of the human cerebral cortex, a collection of neural processing units for each connectome node, and various forward models that can convert simulated neural activity into a variety of functional brain imaging signals. In this paper, we demonstrate how to embed a previously or newly constructed task-based large-scale neural model into the TVB platform. We tested our method on a previously constructed large-scale neural model (LSNM) of visual object processing that consisted of interconnected neural populations that represent primary and secondary visual, inferotemporal, and prefrontal cortex. Some neural elements in the original model were 'non task-specific' (NS) neurons that served as noise generators for 'task-specific' neurons that processed shapes during a delayed match-to-sample (DMS) task. We replaced the NS neurons with an anatomical TVB connectome model of the cerebral cortex comprising 998 regions of interest interconnected by white matter fiber tract weights. We embedded our LSNM of visual object processing into corresponding nodes within the TVB connectome. Reciprocal connections between TVB nodes and our task-based modules were included in this framework. We ran visual object processing simulations and showed that the TVB simulator successfully replaced the noise generation originally provided by NS neurons; i.e., the DMS tasks performed with the hybrid LSNM/TVB simulator generated equivalent neural and fMRI activity to that of the original task-based models. Additionally, we found partial agreement between the

  18. Intramuscular adipose tissue determined by T1-weighted MRI at 3T primarily reflects extramyocellular lipids.

    Science.gov (United States)

    Akima, Hiroshi; Hioki, Maya; Yoshiko, Akito; Koike, Teruhiko; Sakakibara, Hisataka; Takahashi, Hideyuki; Oshida, Yoshiharu

    2016-05-01

    The purpose of this study was to assess relationships between intramuscular adipose tissue (IntraMAT) content determined by MRI and intramyocellular lipids (IMCL) and extramyocellular lipids (EMCL) determined by (1)H magnetic resonance spectroscopy ((1)H MRS) or echo intensity determined by B-mode ultrasonography of human skeletal muscles. Thirty young and elderly men and women were included. T1-weighted MRI was taken from the right mid-thigh to measure IntraMAT content of the vastus lateralis (VL) and biceps femoris (BF) using a histogram shape-based thresholding technique. IMCL and EMCL were measured from the VL and BF at the right mid-thigh using (1)H MRS. Ultrasonographic images were taken from the VL and BF of the right mid-thigh to measure echo intensity based on gray-scale level for quantitative analysis. There was a significant correlation between IntraMAT content by MRI and EMCL of the VL and BF (VL, r=0.506), suggesting that IntraMAT content determined by T1-weighted MRI primarily reflects extramyocellular lipids, not intramyocellular lipids, in human skeletal muscles. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Orsay, Santeuil and Le Blanc viruses primarily infect intestinal cells in Caenorhabditis nematodes.

    Science.gov (United States)

    Franz, Carl J; Renshaw, Hilary; Frezal, Lise; Jiang, Yanfang; Félix, Marie-Anne; Wang, David

    2014-01-05

    The discoveries of Orsay, Santeuil and Le Blanc viruses, three viruses infecting either Caenorhabditis elegans or its relative Caenorhabditis briggsae, enable the study of virus-host interactions using natural pathogens of these two well-established model organisms. We characterized the tissue tropism of infection in Caenorhabditis nematodes by these viruses. Using immunofluorescence assays targeting proteins from each of the viruses, and in situ hybridization, we demonstrate viral proteins and RNAs localize to intestinal cells in larval stage Caenorhabditis nematodes. Viral proteins were detected in one to six of the 20 intestinal cells present in Caenorhabditis nematodes. In Orsay virus-infected C. elegans, viral proteins were detected as early as 6h post-infection. The RNA-dependent RNA polymerase and capsid proteins of Orsay virus exhibited different subcellular localization patterns. Collectively, these observations provide the first experimental insights into viral protein expression in any nematode host, and broaden our understanding of viral infection in Caenorhabditis nematodes.

  20. A Software Service Framework Model Based on Agent

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper presents an agent-based software service framework model called ASF, and defines the basic concepts and structure of the ASF model. It also describes the management and process mechanisms in the ASF model.

  1. Application of Multi-Step Parameter Estimation Method Based on Optimization Algorithm in Sacramento Model

    Directory of Open Access Journals (Sweden)

    Gang Zhang

    2017-07-01

    Full Text Available The Sacramento model is widely utilized in hydrological forecasting, and its accuracy and performance are primarily determined by the model parameters, indicating the key role of parameter estimation. This paper presents a multi-step parameter estimation method, which divides the parameter estimation of the Sacramento model into three steps and realizes optimization step by step. We first use the immune clonal selection algorithm (ICSA) to solve the non-linear objective function of parameter estimation, and compare the parameter calibration results on ideal artificial data with the Shuffled Complex Evolution (SCE-UA), Parallel Genetic Algorithm (PGA), and Serial Master-slave Swarms Shuffling Evolution Algorithm Based on Particle Swarm Optimization (SMSE-PSO). The comparison shows that ICSA has the best convergence, efficiency and precision. We then apply ICSA to the parameter estimation of the single-step and multi-step Sacramento model and simulate 32 floods in the Dongyang and Tantou river basins for validation. The results of the multi-step method based on ICSA show higher accuracy and a 100% qualified rate, indicating its higher precision and reliability, which has great potential to improve the Sacramento model and hydrological forecasting.
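
    For readers unfamiliar with clonal selection, the sketch below shows the generic loop (evaluate affinity, clone the best candidates, hypermutate the clones with a step size tied to rank, reselect); it is a schematic illustration only, not the authors' ICSA, and the toy objective function stands in for the Sacramento calibration objective.

```python
import numpy as np

rng = np.random.default_rng(0)

def objective(params):
    # Toy stand-in for a calibration objective (e.g., sum of squared errors
    # between simulated and observed runoff); lower is better.
    return float(np.sum((params - 0.3) ** 2))

def clonal_selection(dim=5, pop=30, n_best=5, clones=4, generations=100):
    population = rng.uniform(0.0, 1.0, size=(pop, dim))
    for _ in range(generations):
        fitness = np.array([objective(p) for p in population])
        best_idx = np.argsort(fitness)[:n_best]
        new_pop = [population[i] for i in best_idx]          # keep the elites
        for rank, i in enumerate(best_idx):
            # Better-ranked parents get smaller mutation steps for their clones.
            step = 0.1 * (rank + 1) / n_best
            for _ in range(clones):
                clone = np.clip(population[i] + rng.normal(0.0, step, dim), 0.0, 1.0)
                new_pop.append(clone)
        while len(new_pop) < pop:                            # random immigrants
            new_pop.append(rng.uniform(0.0, 1.0, size=dim))
        population = np.array(new_pop[:pop])
    fitness = np.array([objective(p) for p in population])
    return population[np.argmin(fitness)], float(fitness.min())

best_params, best_score = clonal_selection()
print(best_params.round(3), best_score)
```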

  2. Radiotherapy planning for glioblastoma based on a tumor growth model: Improving target volume delineation

    CERN Document Server

    Unkelbach, Jan; Konukoglu, Ender; Dittmann, Florian; Le, Matthieu; Ayache, Nicholas; Shih, Helen A

    2013-01-01

    Glioblastomas are known to infiltrate the brain parenchyma instead of forming a solid tumor mass with a defined boundary. Only the part of the tumor with high tumor cell density can be localized through imaging directly. In contrast, brain tissue infiltrated by tumor cells at low density appears normal on current imaging modalities. In clinical practice, a uniform margin is applied to account for microscopic spread of disease. The current treatment planning procedure can potentially be improved by accounting for the anisotropy of tumor growth: Anatomical barriers such as the falx cerebri represent boundaries for migrating tumor cells. In addition, tumor cells primarily spread in white matter and infiltrate gray matter at a lower rate. We investigate the use of a phenomenological tumor growth model for treatment planning. The model is based on the Fisher-Kolmogorov equation, which formalizes these growth characteristics and estimates the spatial distribution of tumor cells in normal appearing regions of the brain...
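
    The Fisher-Kolmogorov (reaction-diffusion) equation mentioned above is commonly written as follows, with the notation assumed here rather than quoted from the paper: u(x, t) is the normalized tumor cell density, D(x) an anisotropic diffusion tensor encoding faster migration in white matter, and rho the proliferation rate; no-flux conditions enforce anatomical barriers such as the falx cerebri.

```latex
\frac{\partial u}{\partial t}
  = \nabla \cdot \bigl( D(\mathbf{x})\, \nabla u \bigr) + \rho\, u\,(1 - u),
\qquad
D(\mathbf{x})\,\nabla u \cdot \mathbf{n} = 0 \ \text{on anatomical boundaries.}
```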

  3. Dynamic Metabolic Model Building Based on the Ensemble Modeling Approach

    Energy Technology Data Exchange (ETDEWEB)

    Liao, James C. [Univ. of California, Los Angeles, CA (United States)

    2016-10-01

    Ensemble modeling of kinetic systems addresses the challenges of kinetic model construction, with respect to parameter value selection, and still allows for the rich insights possible from kinetic models. This project aimed to show that constructing, implementing, and analyzing such models is a useful tool for the metabolic engineering toolkit, and that they can result in actionable insights from models. Key concepts are developed and deliverable publications and results are presented.

  4. Measurement-based load modeling: Theory and application

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Load model is one of the most important elements in power system operation and control. However, owing to its complexity, load modeling is still an open and very difficult problem. Summarizing our work on measurement-based load modeling in China for more than twenty years, this paper systematically introduces the mathematical theory and applications regarding the load modeling. The flow chart and algorithms for measurement-based load modeling are presented. A composite load model structure with 13 parameters is also proposed. Analysis results based on the trajectory sensitivity theory indicate the importance of the load model parameters for the identification. Case studies show the accuracy of the presented measurement-based load model. The load model thus built has been validated by field measurements all over China. Future working directions on measurement-based load modeling are also discussed in the paper.
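
    For context, measurement-based composite load models typically pair a static part with a dynamic (induction-motor) part; the static part is often the ZIP form below. This is a generic illustration of the model class in standard notation, not the specific 13-parameter structure proposed in the paper.

```latex
P = P_{0}\left[ a_{p}\left(\tfrac{V}{V_{0}}\right)^{2} + b_{p}\left(\tfrac{V}{V_{0}}\right) + c_{p} \right],
\quad
Q = Q_{0}\left[ a_{q}\left(\tfrac{V}{V_{0}}\right)^{2} + b_{q}\left(\tfrac{V}{V_{0}}\right) + c_{q} \right],
\quad
a_{p}+b_{p}+c_{p} = a_{q}+b_{q}+c_{q} = 1.
```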

  5. MRO CTX-based Digital Terrain Models

    Science.gov (United States)

    Dumke, Alexander

    2016-04-01

    In planetary surface sciences, digital terrain models (DTM) are paramount when it comes to understanding and quantifying processes. In this contribution an approach for the derivation of digital terrain models from stereo images of the NASA Mars Reconnaissance Orbiter (MRO) Context Camera (CTX) is described. CTX consists of a 350 mm focal length telescope and 5000 CCD sensor elements and is operated as a pushbroom camera. It acquires images with ~6 m/px over a swath width of ~30 km of the Mars surface [1]. Today, several approaches for the derivation of CTX DTMs exist [e.g. 2, 3, 4]. The approach discussed here is based on established software packages and combines them with proprietary software as described below. The main processing task for the derivation of CTX stereo DTMs is based on six steps: (1) First, CTX images are radiometrically corrected using the ISIS software package [5]. (2) For selected CTX stereo images, exterior orientation data from reconstructed NAIF SPICE data are extracted [6]. (3) In the next step, High Resolution Stereo Camera (HRSC) DTMs [7, 8, 9] are used for the rectification of CTX stereo images to reduce the search area during the image matching. Here, HRSC DTMs are used due to their higher spatial resolution when compared to MOLA DTMs. (4) The determination of coordinates of homologous points between stereo images, i.e. the stereo image matching process, consists of two steps: first, cross-correlation to obtain approximate values and, second, their use in a least-squares matching (LSM) process in order to obtain subpixel positions. (5) The stereo matching results are then used to generate object points from forward ray intersections. (6) As a last step, the DTM-raster generation is performed using software developed at the German Aerospace Center, Berlin, whereby only object points with an error smaller than a threshold value are used. References: [1] Malin, M. C. et al., 2007, JGR 112, doi:10.1029/2006JE002808 [2] Broxton, M. J. et al

  6. Sub-micrometre Particulate Matter is Primarily in Liquid Form over Amazon Rainforests

    Energy Technology Data Exchange (ETDEWEB)

    Bateman, Adam P.; Gong, Z. H.; Liu, Pengfei; Sato, Bruno; Cirino, Glauber; Zhang, Yue; Artaxo, Paulo; Bertram, Allan K.; Manzi, A.; Rizzo, L. V.; Souza, Rodrigo A.; Zaveri, Rahul A.; Martin, Scot T.

    2016-01-01

    Particulate matter (PM) occurs in the Earth’s atmosphere both in liquid and non-liquid forms. The physical state affects the available physical and chemical mechanisms of growth and reactivity, ultimately affecting the number, size, and composition of the atmospheric particle population. Herein, the physical state, including the response to relative humidity (RH), was investigated on-line and in real time for PM (< 1 μm) over the tropical rain forest of central Amazonia during both the wet and dry seasons of 2013. The results show that the PM was liquid for RH > 80% across 296 to 300 K. These results, in conjunction with the distributions of RH and temperature in Amazonia, imply that near-surface submicron PM in Amazonia is liquid most of the time. The observations are consistent with laboratory experiments showing that PM produced by isoprene photo-oxidation is liquid across these meteorological conditions. The findings have implications for the mechanisms of new particle production in Amazonia, the growth of submicron particles and hence dynamics of the cloud life cycle, and the sensitivity of these processes to anthropogenic activities. An approach for inclusion of particle physical state in chemical transport models is presented.

  7. CLIMLAB: a Python-based software toolkit for interactive, process-oriented climate modeling

    Science.gov (United States)

    Rose, B. E. J.

    2015-12-01

    Global climate is a complex emergent property of the rich interactions between simpler components of the climate system. We build scientific understanding of this system by breaking it down into component process models (e.g. radiation, large-scale dynamics, boundary layer turbulence), understanding each component, and putting them back together. Hands-on experience and freedom to tinker with climate models (whether simple or complex) is invaluable for building physical understanding. CLIMLAB is an open-ended software engine for interactive, process-oriented climate modeling. With CLIMLAB you can interactively mix and match model components, or combine simpler process models together into a more comprehensive model. It was created primarily to support classroom activities, using hands-on modeling to teach fundamentals of climate science at both undergraduate and graduate levels. CLIMLAB is written in Python and ties in with the rich ecosystem of open-source scientific Python tools for numerics and graphics. The IPython notebook format provides an elegant medium for distributing interactive example code. I will give an overview of the current capabilities of CLIMLAB, the curriculum we have developed thus far, and plans for the future. Using CLIMLAB requires some basic Python coding skills. We consider this an educational asset, as we are targeting upper-level undergraduates and Python is an increasingly important language in STEM fields. However CLIMLAB is well suited to be deployed as a computational back-end for a graphical gaming environment based on earth-system modeling.
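
    As a flavor of the kind of process model CLIMLAB composes, here is a minimal zero-dimensional energy-balance model written in plain Python rather than against the CLIMLAB API (whose class names are deliberately not assumed here); the parameter values are standard textbook choices used purely for illustration.

```python
# Zero-dimensional energy balance: C dT/dt = (1 - albedo) * S0 / 4 - (A + B * T),
# with T in degrees Celsius and outgoing longwave radiation linearized as A + B*T.
S0 = 1365.0        # solar constant, W m^-2
albedo = 0.3       # planetary albedo
A, B = 210.0, 2.0  # linearized OLR coefficients, W m^-2 and W m^-2 K^-1
C = 4.0e8          # effective heat capacity, J m^-2 K^-1 (roughly 100 m of ocean)

def step(T, dt):
    absorbed = (1.0 - albedo) * S0 / 4.0
    emitted = A + B * T
    return T + dt * (absorbed - emitted) / C

T = 15.0                    # initial surface temperature, deg C
dt = 86400.0                # one-day time step, s
for _ in range(365 * 30):   # integrate ~30 years toward equilibrium
    T = step(T, dt)

# Analytic equilibrium for comparison: ((1 - albedo) * S0 / 4 - A) / B, about 14.4 C
print(f"near-equilibrium temperature: {T:.2f} C")
```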

  8. Analytical modeling of a sandwiched plate piezoelectric transformer-based acoustic-electric transmission channel.

    Science.gov (United States)

    Lawry, Tristan J; Wilt, Kyle R; Scarton, Henry A; Saulnier, Gary J

    2012-11-01

    The linear propagation of electromagnetic and dilatational waves through a sandwiched plate piezoelectric transformer (SPPT)-based acoustic-electric transmission channel is modeled using the transfer matrix method with mixed-domain two-port ABCD parameters. This SPPT structure is of great interest because it has been explored in recent years as a mechanism for wireless transmission of electrical signals through solid metallic barriers using ultrasound. The model we present is developed to allow for accurate channel performance prediction while greatly reducing the computational complexity associated with 2- and 3-dimensional finite element analysis. As a result, the model primarily considers 1-dimensional wave propagation; however, approximate solutions for higher-dimensional phenomena (e.g., diffraction in the SPPT's metallic core layer) are also incorporated. The model is then assessed by comparing it to the measured wideband frequency response of a physical SPPT-based channel from our previous work. Very strong agreement between the modeled and measured data is observed, confirming the accuracy and utility of the presented model.
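
    For reference, the two-port ABCD (transmission-matrix) formalism used above chains the layer matrices by multiplication, so the full channel response follows from the ordered product of the electrical, electromechanical, and acoustic layer matrices; in standard notation (assumed here, not quoted from the paper):

```latex
\begin{bmatrix} V_{1} \\ I_{1} \end{bmatrix}
=
\begin{bmatrix} A & B \\ C & D \end{bmatrix}
\begin{bmatrix} V_{2} \\ I_{2} \end{bmatrix},
\qquad
\begin{bmatrix} A & B \\ C & D \end{bmatrix}_{\text{channel}}
=
\prod_{k=1}^{N}
\begin{bmatrix} A_{k} & B_{k} \\ C_{k} & D_{k} \end{bmatrix}.
```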

  9. Evaluation of source water protection strategies: a fuzzy-based model.

    Science.gov (United States)

    Islam, Nilufar; Sadiq, Rehan; Rodriguez, Manuel J; Francisque, Alex

    2013-05-30

    Source water protection (SWP) is an important step in the implementation of a multi-barrier approach that ensures the delivery of safe drinking water. Available decision-making models for SWP primarily use complex mathematical formulations that require large data sets to perform analysis, which limit their use. Moreover, most of them cannot handle interconnection and redundancy among the parameters, or missing information. A fuzzy-based model is proposed in this study to overcome the above limitations. This model can estimate a reduction in the pollutant loads based on selected SWP strategies (e.g., storm water management ponds, vegetated filter strips). The proposed model employs an export coefficient approach and accounts for the number of animals to estimate the pollutant loads generated by different land usages (e.g., agriculture, forests, highways, livestock, and pasture land). A water quality index is used for the assessment of water quality once these pollutant loads are discharged into the receiving waters. To demonstrate the application of the proposed model, a case study of Page Creek was performed in the Clayburn watershed (British Columbia, Canada). The results show that increasing urban development and poorly managed agricultural areas have the most adverse effects on source water quality. The proposed model can help decision makers to make informed decisions related to the land use and resource allocation.
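
    The export coefficient approach mentioned above estimates an annual pollutant load as the sum, over land uses, of a per-area coefficient times the area of that land use, plus per-animal terms for livestock; a minimal sketch follows, with every coefficient and area being an illustrative assumption rather than data from the Page Creek case study.

```python
# Total load L = sum_i E_i * A_i (+ per-animal livestock terms), where E_i is
# the export coefficient of land use i (e.g., kg N / ha / yr) and A_i its area.
export_coefficients = {   # illustrative values, kg N / ha / yr
    "agriculture": 12.0,
    "forest": 2.0,
    "highway": 6.0,
    "pasture": 8.0,
}
areas_ha = {"agriculture": 420.0, "forest": 900.0, "highway": 35.0, "pasture": 150.0}

per_animal_coeff = 0.4    # illustrative, kg N / animal / yr
n_animals = 1200

land_load = sum(export_coefficients[k] * areas_ha[k] for k in areas_ha)
livestock_load = per_animal_coeff * n_animals
total_load = land_load + livestock_load

# An SWP strategy (e.g., a vegetated filter strip) can be represented as a
# fractional reduction applied to the load from the land use it treats.
reduction = 0.35          # illustrative removal efficiency for agricultural runoff
mitigated = total_load - reduction * export_coefficients["agriculture"] * areas_ha["agriculture"]
print(round(total_load, 1), round(mitigated, 1))
```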

  10. The Community-based Whole Magnetosphere Model

    Science.gov (United States)

    2011-11-15

    Tribulations and Exultations in Coupling Models of the Magnetosphere with Ionosphere-Thermosphere Models, Planetary Aeronomy ISSI Meeting, Bern.

  11. Probabilistic Based Modeling and Simulation Assessment

    Science.gov (United States)

    2010-06-01

    ... but with the head and neck replaced with a high-fidelity cervical spine and head model. The occupant models were used to determine the effects of ... The model represents the vertebrae, including the disks, ligaments and musculature (Figure 6); in total there are 57,837 elements with 63,713 nodes. A full description of the model ...

  12. Y-Chromosomal Diversity in Europe Is Clinal and Influenced Primarily by Geography, Rather than by Language

    Science.gov (United States)

    Rosser, Zoë H.; Zerjal, Tatiana; Hurles, Matthew E.; Adojaan, Maarja; Alavantic, Dragan; Amorim, António; Amos, William; Armenteros, Manuel; Arroyo, Eduardo; Barbujani, Guido; Beckman, Gunhild; Beckman, Lars; Bertranpetit, Jaume; Bosch, Elena; Bradley, Daniel G.; Brede, Gaute; Cooper, Gillian; Côrte-Real, Helena B. S. M.; de Knijff, Peter; Decorte, Ronny; Dubrova, Yuri E.; Evgrafov, Oleg; Gilissen, Anja; Glisic, Sanja; Gölge, Mukaddes; Hill, Emmeline W.; Jeziorowska, Anna; Kalaydjieva, Luba; Kayser, Manfred; Kivisild, Toomas; Kravchenko, Sergey A.; Krumina, Astrida; Kučinskas, Vaidutis; Lavinha, João; Livshits, Ludmila A.; Malaspina, Patrizia; Maria, Syrrou; McElreavey, Ken; Meitinger, Thomas A.; Mikelsaar, Aavo-Valdur; Mitchell, R. John; Nafa, Khedoudja; Nicholson, Jayne; Nørby, Søren; Pandya, Arpita; Parik, Jüri; Patsalis, Philippos C.; Pereira, Luísa; Peterlin, Borut; Pielberg, Gerli; Prata, Maria João; Previderé, Carlo; Roewer, Lutz; Rootsi, Siiri; Rubinsztein, D. C.; Saillard, Juliette; Santos, Fabrício R.; Stefanescu, Gheorghe; Sykes, Bryan C.; Tolun, Aslihan; Villems, Richard; Tyler-Smith, Chris; Jobling, Mark A.

    2000-01-01

    Clinal patterns of autosomal genetic diversity within Europe have been interpreted in previous studies in terms of a Neolithic demic diffusion model for the spread of agriculture; in contrast, studies using mtDNA have traced many founding lineages to the Paleolithic and have not shown strongly clinal variation. We have used 11 human Y-chromosomal biallelic polymorphisms, defining 10 haplogroups, to analyze a sample of 3,616 Y chromosomes belonging to 47 European and circum-European populations. Patterns of geographic differentiation are highly nonrandom, and, when they are assessed using spatial autocorrelation analysis, they show significant clines for five of six haplogroups analyzed. Clines for two haplogroups, representing 45% of the chromosomes, are continentwide and consistent with the demic diffusion hypothesis. Clines for three other haplogroups each have different foci and are more regionally restricted and are likely to reflect distinct population movements, including one from north of the Black Sea. Principal-components analysis suggests that populations are related primarily on the basis of geography, rather than on the basis of linguistic affinity. This is confirmed in Mantel tests, which show a strong and highly significant partial correlation between genetics and geography but a low, nonsignificant partial correlation between genetics and language. Genetic-barrier analysis also indicates the primacy of geography in the shaping of patterns of variation. These patterns retain a strong signal of expansion from the Near East but also suggest that the demographic history of Europe has been complex and influenced by other major population movements, as well as by linguistic and geographic heterogeneities and the effects of drift. PMID:11078479

  13. Transient activation of microglia following acute alcohol exposure in developing mouse neocortex is primarily driven by BAX-dependent neurodegeneration.

    Science.gov (United States)

    Ahlers, Katelin E; Karaçay, Bahri; Fuller, Leah; Bonthius, Daniel J; Dailey, Michael E

    2015-10-01

    Fetal alcohol exposure is the most common known cause of preventable mental retardation, yet we know little about how microglia respond to, or are affected by, alcohol in the developing brain in vivo. Using an acute (single day) model of moderate (3 g/kg) to severe (5 g/kg) alcohol exposure in postnatal day (P) 7 or P8 mice, we found that alcohol-induced neuroapoptosis in the neocortex is closely correlated in space and time with the appearance of activated microglia near dead cells. The timing and molecular pattern of microglial activation varied with the level of cell death. Although microglia rapidly mobilized to contact and engulf late-stage apoptotic neurons, apoptotic bodies temporarily accumulated in neocortex, suggesting that in severe cases of alcohol toxicity the neurodegeneration rate exceeds the clearance capacity of endogenous microglia. Nevertheless, most dead cells were cleared and microglia began to deactivate within 1-2 days of the initial insult. Coincident with microglial activation and deactivation, there was a transient increase in expression of pro-inflammatory factors, TNFα and IL-1β, after severe (5 g/kg) but not moderate (3 g/kg) EtOH levels. Alcohol-induced microglial activation and pro-inflammatory factor expression were largely abolished in BAX null mice lacking neuroapoptosis, indicating that microglial activation is primarily triggered by apoptosis rather than the alcohol. Therefore, acute alcohol exposure in the developing neocortex causes transient microglial activation and mobilization, promoting clearance of dead cells and tissue recovery. Moreover, cortical microglia show a remarkable capacity to rapidly deactivate following even severe neurodegenerative insults in the developing brain.

  14. A stability condition for turbulence model: From EMMS model to EMMS-based turbulence model

    CERN Document Server

    Zhang, Lin; Wang, Limin; Li, Jinghai

    2013-01-01

    The closure problem of turbulence is still a challenging issue in turbulence modeling. In this work, a stability condition is used to close turbulence. Specifically, we regard single-phase flow as a mixture of turbulent and non-turbulent fluids, separating the structure of turbulence. Subsequently, according to the picture of the turbulent eddy cascade, the energy contained in turbulent flow is decomposed into different parts and then quantified. A turbulence stability condition, similar to the principle of the energy-minimization multi-scale (EMMS) model for gas-solid systems, is formulated to close the dynamic constraint equations of turbulence, allowing the heterogeneous structural parameters of turbulence to be optimized. We call this model the `EMMS-based turbulence model', and use it to construct the corresponding turbulent viscosity coefficient. To validate the EMMS-based turbulence model, it is used to simulate two classical benchmark problems, lid-driven cavity flow and turbulent flow with forced con...

  15. A Bit Progress on Word—Based Language Model

    Institute of Scientific and Technical Information of China (English)

    陈勇; 陈国评

    2003-01-01

    A good language model is essential to a postprocessing algorithm for recognition systems. In the past, researchers have presented various language models, such as character-based language models, word-based language models, syntactical-rule language models, hybrid models, etc. The word N-gram model is by far an effective and efficient model, but one has to address the problem of data sparseness in establishing the model. Katz and Kneser et al. respectively presented effective remedies to solve this challenging problem. In this study, we proposed an improvement to their methods by incorporating Chinese language-specific information or Chinese word class information into the system.
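
    As a concrete reference point for the word N-gram discussion, the sketch below builds a word bigram model with simple linear-interpolation smoothing; it is a toy illustration of the model family, not the Katz back-off or Kneser-Ney schemes cited above (which redistribute probability mass more carefully), and the two-sentence corpus is a made-up assumption.

```python
from collections import Counter

# Toy corpus; in practice this would be a large segmented text corpus.
corpus = [
    "the model predicts the next word".split(),
    "the next word depends on the previous word".split(),
]

unigrams, bigrams = Counter(), Counter()
for sentence in corpus:
    tokens = ["<s>"] + sentence + ["</s>"]
    unigrams.update(tokens)
    bigrams.update(zip(tokens, tokens[1:]))

total_tokens = sum(unigrams.values())

def bigram_prob(prev, word, lam=0.7):
    # Interpolate the bigram MLE with the unigram distribution, a simple
    # remedy for data sparseness (Katz and Kneser-Ney are more refined).
    p_bigram = bigrams[(prev, word)] / unigrams[prev] if unigrams[prev] else 0.0
    p_unigram = unigrams[word] / total_tokens
    return lam * p_bigram + (1.0 - lam) * p_unigram

print(bigram_prob("the", "next"))    # seen bigram
print(bigram_prob("word", "model"))  # unseen bigram, falls back on unigram mass
```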

  16. Study of chaos based on a hierarchical model

    Energy Technology Data Exchange (ETDEWEB)

    Yagi, Masatoshi; Itoh, Sanae-I. [Kyushu Univ., Fukuoka (Japan). Research Inst. for Applied Mechanics

    2001-12-01

    Study of chaos based on a hierarchical model is briefly reviewed. Here we categorize hierarchical model equations, i.e., (1) a model with a few degrees of freedom, e.g., the Lorenz model, (2) a model with intermediate degrees of freedom like a shell model, and (3) a model with many degrees of freedom such as a Navier-Stokes equation. We discuss the nature of chaos and turbulence described by these models via Lyapunov exponents. The interpretation of results observed in fundamental plasma experiments is also shown based on a shell model. (author)

  17. One-dimensional analytical model development of a plasma-based actuator

    Science.gov (United States)

    Popkin, Sarah Haack

    This dissertation provides a method for modeling the complex, multi-physics, multi-dimensional processes associated with a plasma-based flow control actuator, also known as the SparkJet, by using a one-dimensional analytical model derived from the Euler and thermodynamic equations, under varying assumptions. This model is compared to CFD simulations and experimental data to verify and/or modify the model where simplifying assumptions poorly represent the real actuator. The model was exercised to explore high-frequency actuation and methods of improving actuator performance. Using peak jet momentum as a performance metric, the model shows that a typical SparkJet design (1 mm orifice diameter, 84.8 mm3 cavity volume, and 0.5 J energy input) operated over a range of frequencies from 1 Hz to 10 kHz shows a decrease in peak momentum corresponding to an actuation cutoff frequency of 800 Hz. The model results show that the cutoff frequency is primarily a function of orifice diameter and cavity volume. To further verify model accuracy, experimental testing was performed involving time-dependent, cavity pressure and arc power measurements as a function of orifice diameter, cavity volume, input energy, and electrode gap. The cavity pressure measurements showed that pressure-based efficiency ranges from 20% to 40%. The arc power measurements exposed the deficiency in assuming instantaneous energy deposition and a calorically perfect gas and also showed that arc efficiency was approximately 80%. Additional comparisons between the pressure-based modeling and experimental results show that the model captures the actuator dependence on orifice diameter, cavity volume, and input energy but over-estimates the duration of the jet flow during Stage 2. The likely cause of the disagreement is an inaccurate representation of thermal heat transfer related to convective heat transfer or heat loss to the electrodes.

  18. Retinal Morphology and Sensitivity Are Primarily Impaired in Eyes with Neuromyelitis Optica Spectrum Disorder (NMOSD)

    Science.gov (United States)

    Akiba, Ryutaro; Yokouchi, Hirotaka; Mori, Masahiro; Oshitari, Toshiyuki; Baba, Takayuki; Sawai, Setsu; Kuwabara, Satoshi; Yamamoto, Shuichi

    2016-01-01

    Background Previous studies of neuromyelitis optica spectrum disorder (NMOSD) using spectral domain optical coherence tomography (SD-OCT) showed that the outer nuclear layer (ONL) in eyes without a history of optic neuritis (ON) was thinner than that of healthy controls. It remains unclear whether the ONL thinning is caused by a direct attack on the retina by an autoantibody or a retrograde degeneration. Objective To determine the mechanisms involved in the retinal damage in eyes with NMOSD without ON. Methods SD-OCT was used to determine the thicknesses of the different retinal layers of 21 eyes of 12 NMOSD patients without prior ON and 19 eyes of 10 healthy controls. Eyes with peripapillary retinal nerve fiber layer (RNFL) thinning were excluded to eliminate the confounding effects of retrograde degeneration. Microperimetry was used to determine the central retinal sensitivity. The data of the two groups were compared using generalized estimated equation models to account for inter-eye dependencies. Results The ganglion cell plus inner plexiform layer and the inner nuclear layer plus outer plexiform layer thicknesses of the NMOSD eyes were not significantly different from that of the control eyes (P = 0.28, P = 0.78). However, the ONL and average macular thickness (AMT) in the NMOSD eyes were significantly thinner than that of the control eyes (P = 0.022, P = 0.036). The retinal sensitivity in the central 10°, 10° to 2°, and 2° sectors were significantly lower in the NMOSD eyes than in the control eyes (P = 0.013, P = 0.022, P = 0.002). Conclusions The ONL thinning, AMT thinning, and reduced retinal sensitivity in eyes with NMOSD without significant peripapillary RNFL thinning are most likely due to direct retinal pathology. PMID:27936154

  19. Cosmic-Ray Background Flux Model based on a Gamma-Ray Large-Area Space Telescope Balloon Flight Engineering Model

    Energy Technology Data Exchange (ETDEWEB)

    Mizuno, T

    2004-09-03

    Cosmic-ray background fluxes were modeled based on existing measurements and theories and are presented here. The model, originally developed for the Gamma-ray Large Area Space Telescope (GLAST) Balloon Experiment, covers the entire solid angle (4π sr), the sensitive energy range of the instrument (~10 MeV to 100 GeV) and abundant components (proton, alpha, e⁻, e⁺, μ⁻, μ⁺ and gamma). It is expressed in analytic functions in which modulations due to the solar activity and the Earth geomagnetism are parameterized. Although the model is intended to be used primarily for the GLAST Balloon Experiment, model functions in low-Earth orbit are also presented and can be used for other high energy astrophysical missions. The model has been validated via comparison with the data of the GLAST Balloon Experiment.

  20. Recommendation based on trust diffusion model.

    Science.gov (United States)

    Yuan, Jinfeng; Li, Li

    2014-01-01

    Recommender systems are emerging as a powerful and popular tool for delivering online information relevant to a given user. The traditional recommendation system suffers from the cold start problem and the data sparsity problem. Many methods have been proposed to solve these problems, but few can achieve satisfactory efficiency. In this paper, we present a method which combines the trust diffusion (DiffTrust) algorithm and the probabilistic matrix factorization (PMF). DiffTrust is first used to study the possible diffusions of trust between various users. It is able to make use of the implicit relationship of the trust network, thus alleviating the data sparsity problem. The probabilistic matrix factorization (PMF) is then employed to combine the users' tastes with their trusted friends' interests. We evaluate the algorithm on Flixster, Moviedata, and Epinions datasets, respectively. The experimental results show that the recommendation based on our proposed DiffTrust + PMF model achieves high performance in terms of the root mean square error (RMSE), Recall, and F Measure.
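
    For orientation, probabilistic matrix factorization fits user and item latent vectors so that an observed rating is approximated by their dot product under a Gaussian noise model; the minimal SGD sketch below illustrates only the PMF half of the proposed method (the DiffTrust coupling of trusted friends' interests is not reproduced), and the ratings and hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# (user, item, rating) triples; a real run would use Flixster/Moviedata/Epinions.
ratings = [(0, 0, 4.0), (0, 1, 2.0), (1, 0, 5.0), (1, 2, 3.0), (2, 1, 1.0), (2, 2, 4.0)]
n_users, n_items, k = 3, 3, 4

U = 0.1 * rng.standard_normal((n_users, k))   # user latent factors
V = 0.1 * rng.standard_normal((n_items, k))   # item latent factors
lr, reg = 0.02, 0.05                          # learning rate, L2 regularization

for epoch in range(200):
    for u, i, r in ratings:
        err = r - U[u] @ V[i]
        # Stochastic gradient steps on the regularized squared error.
        U[u] += lr * (err * V[i] - reg * U[u])
        V[i] += lr * (err * U[u] - reg * V[i])

rmse = np.sqrt(np.mean([(r - U[u] @ V[i]) ** 2 for u, i, r in ratings]))
print(f"training RMSE: {rmse:.3f}, predicted rating r(0, 2): {U[0] @ V[2]:.2f}")
```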

  1. Discovering Diabetes Complications: an Ontology Based Model

    Science.gov (United States)

    Daghistani, Tahani; Shammari, Riyad Al; Razzak, Muhammad Imran

    2015-01-01

    Background: Diabetes is a serious disease that is spreading dramatically around the world. The average diabetes patient is at risk of experiencing complications. Taking advantage of recorded information to build an ontology, as an information technology solution, will help to predict which patients have an average risk level for a certain complication. It is helpful to search and present a patient's history regarding different risk factors. Discovering diabetes complications could be useful to prevent or delay the complications. Method: We designed an ontology based model, using adult diabetes patients' data, to discover the rules of diabetes with its complications in a disease-to-disease relationship. Result: Various rules between different risk factors of diabetes patients and certain complications were generated. Furthermore, new complications (diseases) might be discovered as a new finding of this study; discovering diabetes complications could be useful to prevent or delay the complications. Conclusion: The system can identify the patients who are suffering from certain risk factors, such as high body mass index (obesity), and start a controlling and maintaining plan. PMID:26862251

  2. GENETIC-BASED NUTRITION RECOMMENDATION MODEL

    Directory of Open Access Journals (Sweden)

    S. A.A. Fayoumi

    2014-01-01

    Full Text Available Evolutionary computing is the collective name for a range of problem-solving techniques based on principles of biological evolution, such as natural selection and genetic inheritance. These techniques are being widely applied to a variety of problems in many vital fields. Evolutionary Algorithms (EAs), which apply the principles of evolutionary computation, such as the genetic algorithm, particle swarm, ant colony and bees algorithms, play an important role in the decision-making process. EAs serve many fields that affect our lives directly, such as medicine, engineering, transportation and communications. One of these vital fields is nutrition, which can be viewed from several perspectives: medical, physical, social, environmental and psychological. This study presents a proposed model that shows how evolutionary computing generally, and the genetic algorithm specifically, as a powerful evolutionary algorithm, can be used to recommend an appropriate nutrition style, on the medical and physical sides only, to each person according to his/her personal and medical measurements.

  3. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  4. Culturable bioaerosols along an urban waterfront are primarily associated with coarse particles

    Directory of Open Access Journals (Sweden)

    Angel Montero

    2016-12-01

    Full Text Available The source, characteristics and transport of viable microbial aerosols in urban centers are topics of significant environmental and public health concern. Recent studies have identified adjacent waterways, and especially polluted waterways, as an important source of microbial aerosols to urban air. The size of these aerosols influences how far they travel, their resistance to environmental stress, and their inhalation potential. In this study, we utilize a cascade impactor and aerosol particle monitor to characterize the size distribution of particles and culturable bacterial and fungal aerosols along the waterfront of a New York City embayment. We seek to address the potential contribution of bacterial aerosols from local sources and to determine how their number, size distribution, and taxonomic identity are affected by wind speed and wind direction (onshore vs. offshore). Total culturable microbial counts were higher under offshore winds (average of 778 CFU/m3 ± 67), with bacteria comprising the majority of colonies (58.5%), as compared to onshore winds (580 CFU/m3 ± 110), where fungi were dominant (87.7%). The majority of cultured bacteria and fungi sampled during both offshore winds (88%) and onshore winds (72%) were associated with coarse aerosols (>2.1 µm), indicative of production from local sources. There was a significant correlation (p < 0.05) of wind speed with both total and coarse culturable microbial aerosol concentrations. Taxonomic analysis, based on DNA sequencing, showed that Actinobacteria was the dominant phylum among aerosol isolates. In particular, Streptomyces and Bacillus, both spore-forming genera that are often soil-associated, were abundant under both offshore and onshore wind conditions. Comparisons of bacterial communities present in the bioaerosol sequence libraries revealed that particle size played an important role in microbial aerosol taxonomy. Onshore and offshore coarse libraries were found to be most similar

  5. Modeling dark fermentation for biohydrogen production: ADM1-based model vs. Gompertz model

    Energy Technology Data Exchange (ETDEWEB)

    Gadhamshetty, Venkataramana [Air Force Research Laboratory, Tyndall AFB, 139 Barnes Drive, Panama City, FL 32403 (United States); Arudchelvam, Yalini; Nirmalakhandan, Nagamany [Civil Engineering Department, New Mexico State University, Las Cruces, NM 88003 (United States); Johnson, David C. [Institute for Energy and Environment, New Mexico State University, Las Cruces, NM 88003 (United States)

    2010-01-15

    Biohydrogen production by dark fermentation in batch reactors was modeled using the Gompertz equation and a model based on the Anaerobic Digestion Model (ADM1). The ADM1 framework, which has been well accepted for modeling methane production by anaerobic digestion, was modified in this study for modeling hydrogen production. Experimental hydrogen production data from eight reactor configurations varying in pressure conditions, temperature, type and concentration of substrate, inocula source, and stirring conditions were used to evaluate the predictive abilities of the two modeling approaches. Although the quality of fit between the measured and fitted hydrogen evolution by the Gompertz equation was high in all the eight reactor configurations with r² ≈ 0.98, each configuration required a different set of model parameters, negating its utility as a general approach to predict hydrogen evolution. On the other hand, the ADM1-based model (ADM1BM) with predefined parameters was able to predict COD, cumulative hydrogen production, as well as volatile fatty acids production, albeit at a slightly lower quality of fit. Agreement between the experimental temporal hydrogen evolution data and the ADM1BM predictions was statistically significant with r² > 0.91 and p-value < 1E-04. Sensitivity analysis of the validated model revealed that hydrogen production was sensitive to only six parameters in the ADM1BM. (author)
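
    The Gompertz fit referred to above is usually the modified Gompertz form for cumulative hydrogen production, written below in standard notation (assumed here, not quoted from the paper): H(t) is cumulative hydrogen, P the production potential, R_m the maximum production rate, lambda the lag time, and e Euler's number.

```latex
H(t) = P \cdot \exp\!\left\{ -\exp\!\left[ \frac{R_{m}\, e}{P}\,(\lambda - t) + 1 \right] \right\}.
```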

  6. Computer Profiling Based Model for Investigation

    Directory of Open Access Journals (Sweden)

    Neeraj Choudhary

    2011-10-01

    Full Text Available Computer profiling is used for computer forensic analysis; this paper proposes and elaborates on a novel model for use in computer profiling, the computer profiling object model. The computer profiling object model is an information model which models a computer as objects with various attributes and inter-relationships. These together provide the information necessary for a human investigator or an automated reasoning engine to make judgments as to the probable usage and evidentiary value of a computer system. The computer profiling object model can be implemented so as to support automated analysis to provide an investigator with the information needed to decide whether manual analysis is required.

  7. Research of database-based modeling for mining management system

    Institute of Scientific and Technical Information of China (English)

    WU Hai-feng; JIN Zhi-xin; BAI Xi-jun

    2005-01-01

    A method is put forward to construct simulation models automatically with database-based automatic modeling (DBAM) for mining systems. A standard simulation model linked with an open-cut automobile dispatch system was designed. The laws governing them were analyzed and identified, and a model maker was designed to realize the automatic programming of new model programs.

  8. Definition of Model-based diagnosis problems with Altarica

    OpenAIRE

    Pencolé, Yannick; Chanthery, Elodie; Peynot, Thierry

    2016-01-01

    International audience; This paper presents a framework for modeling diagnosis problems based on a formal language called Altarica. The initial purpose of the language Altarica was to define a modeling language for safety analysis. This language has been developed as a collaboration between academics and industrial partners and is used in some industrial companies. The paper shows that the expressivity of this language, mixing event-based and state-based models, is sufficient to model classi...

  9. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for execution of agent-based models -from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hard

  10. Model based sustainable production of biomethane

    OpenAIRE

    Biernacki, Piotr

    2014-01-01

    The main intention of this dissertation was to evaluate sustainable production of biomethane with use of mathematical modelling. To achieve this goal, widely acknowledged models like Anaerobic Digestion Model No.1 (ADM1), describing anaerobic digestion, and electrolyte Non-Random Two Liquid Model (eNRTL), for gas purification, were utilized. The experimental results, batch anaerobic digestion of different substrates and carbon dioxide solubility in 2-(Ethylamino)ethanol, were used to determin...

  11. THE DEVELOPMENT OF LECTURE MODEL OF CHEMICAL EDUCATION MANAGEMENT BASED ON LESSON STUDY TO IMPROVE CHEMISTRY TEACHER CANDIDATES’ PROFESSIONALISM

    Directory of Open Access Journals (Sweden)

    S.S. Sumarti

    2015-04-01

    Full Text Available The purpose of this research is to produce a lecture model of chemical education management based on lesson study as an effort to improve chemistry teacher candidates’ professionalism. This study used the ADDIE model (Analysis-Design-Develop-Implement-Evaluate). Based on the results of the reflection, the lecturer and team can arrange the post-presentation activities (discussing the material theoretically alongside a variety of management practices in the field). Activities will be carried out by presenting a real problem in the field to find its solution; thus, the students’ curiosity about management implementation will be fulfilled. The Lecture Model of Chemical Education Management Based on Lesson Study can improve the chemistry teacher candidates’ professionalism, primarily in preparing, presenting and being responsible for their work by learning from their learning experience.

  12. On Process Modelling Using Physical Oriented And Phenomena Based Principles

    Directory of Open Access Journals (Sweden)

    Mihai Culea

    2000-12-01

    Full Text Available This work presents a modelling framework based on a phenomena description of the process. The approach is taken to make it easy to understand and construct process models in heterogeneous, possibly distributed, modelling and simulation environments. A simplified case study of a heat exchanger is considered, and the Modelica modelling language is used to check the proposed concept. The partial results are promising, and the research effort will be extended into a computer-aided modelling environment based on phenomena.

  13. Genetic Algorithm Based Microscale Vehicle Emissions Modelling

    Directory of Open Access Journals (Sweden)

    Sicong Zhu

    2015-01-01

    Full Text Available There is a need to match emission estimation accuracy with the outputs of transport models. The overall error rate in long-term traffic forecasts resulting from strategic transport models is likely to be significant. Microsimulation models, whilst high-resolution in nature, may have similar measurement errors if they use the outputs of strategic models to obtain traffic demand predictions. At the microlevel, this paper discusses the limitations of existing emissions estimation approaches. Emission models for predicting pollutants other than CO2 are proposed. A genetic algorithm approach is adopted to select the predicting variables for the black box model. The approach is capable of solving combinatorial optimization problems. Overall, the emission prediction results reveal that the proposed new models outperform conventional equations in terms of accuracy and robustness.

  14. A Size-based Ecosystem Model

    DEFF Research Database (Denmark)

    Ravn-Jonsen, Lars

    Ecosystem Management requires models that can link the ecosystem level to the operation level. This link can be created by an ecosystem production model. Because the function of the individual fish in the marine ecosystem, seen in trophic context, is closely related to its size, the model groups fish according to size. The model summarises individual predation events into ecosystem level properties, and thereby uses the law of conservation of mass as a framework. This paper provides the background, the conceptual model, basic assumptions, integration of fishing activities, mathematical completion, and a numeric implementation. Using two experiments, the model's ability to act as a tool for economic production analysis and regulation design testing is demonstrated. The presented model is the simplest possible and is built on the principles of (i) size, as the attribute that determines...

  15. A model-based multisensor data fusion knowledge management approach

    Science.gov (United States)

    Straub, Jeremy

    2014-06-01

    A variety of approaches exist for combining data from multiple sensors. The model-based approach combines data based on its support for or refutation of elements of the model which in turn can be used to evaluate an experimental thesis. This paper presents a collection of algorithms for mapping various types of sensor data onto a thesis-based model and evaluating the truth or falsity of the thesis, based on the model. The use of this approach for autonomously arriving at findings and for prioritizing data are considered. Techniques for updating the model (instead of arriving at a true/false assertion) are also discussed.

  16. Mars Science Laboratory; A Model for Event-Based EPO

    Science.gov (United States)

    Mayo, Louis; Lewis, E.; Cline, T.; Stephenson, B.; Erickson, K.; Ng, C.

    2012-10-01

    The NASA Mars Science Laboratory (MSL) and its Curiosity Rover, a part of NASA's Mars Exploration Program, represent the most ambitious undertaking to date to explore the red planet. MSL/Curiosity was designed primarily to determine whether Mars ever had an environment capable of supporting microbial life. NASA's MSL education program was designed to take advantage of existing, highly successful event based education programs to communicate Mars science and education themes to worldwide audiences through live webcasts, video interviews with scientists, TV broadcasts, professional development for teachers, and the latest social media frameworks. We report here on the success of the MSL education program and discuss how this methodological framework can be used to enhance other event based education programs.

  17. Overcoming limitations of model-based diagnostic reasoning systems

    Science.gov (United States)

    Holtzblatt, Lester J.; Marcotte, Richard A.; Piazza, Richard L.

    1989-01-01

    The development of a model-based diagnostic system to overcome the limitations of model-based reasoning systems is discussed. It is noted that model-based reasoning techniques can be used to analyze the failure behavior and diagnosability of system and circuit designs as part of the system process itself. One goal of current research is the development of a diagnostic algorithm which can reason efficiently about large numbers of diagnostic suspects and can handle both combinational and sequential circuits. A second goal is to address the model-creation problem by developing an approach for using design models to construct the GMODS model in an automated fashion.

  18. Agent Based Reasoning in Multilevel Flow Modeling

    DEFF Research Database (Denmark)

    Lind, Morten; Zhang, Xinxin

    2012-01-01

    ... to launch the MFM Workbench into an agent-based environment, which can compensate for the disadvantages of the original software. The agent-based MFM Workbench is centered on a concept called “Blackboard System” and uses an event-based mechanism to arrange the reasoning tasks. This design will support the new...

  19. Prognostic cell biological markers in cervical cancer patients primarily treated with (chemo)radiation : a systematic review

    NARCIS (Netherlands)

    Noordhuis, Maartje G; Eijsink, Jasper J H; Roossink, Frank; de Graeff, Pauline; Pras, Elisabeth; Schuuring, Ed; Wisman, G Bea A; de Bock, Geertruida H; van der Zee, Ate G J

    2011-01-01

    The aim of this study was to systematically review the prognostic and predictive significance of cell biological markers in cervical cancer patients primarily treated with (chemo)radiation. A PubMed, Embase, and Cochrane literature search was performed. Studies describing a relation between a cell b

  20. Ruptured episiotomia resutured primarily.

    Science.gov (United States)

    Monberg, J; Hammen, S

    1987-01-01

    In a randomized study, 35 patients with ruptured episiotomy were treated in two ways. One group, treated with Clindamycin and primary resuture, did better than the other group, which was not resutured but left to heal spontaneously.

  1. A general model for membrane-based separation processes

    DEFF Research Database (Denmark)

    Soni, Vipasha; Abildskov, Jens; Jonsson, Gunnar Eigil;

    2009-01-01

    behaviour will play an important role. In this paper, modelling of membrane-based processes for separation of gas and liquid mixtures are considered. Two general models, one for membrane-based liquid separation processes (with phase change) and another for membrane-based gas separation are presented...

  2. Comparing model predictions for ecosystem-based management

    DEFF Research Database (Denmark)

    Jacobsen, Nis Sand; Essington, Timothy E.; Andersen, Ken Haste

    2016-01-01

    Ecosystem modeling is becoming an integral part of fisheries management, but there is a need to identify differences between predictions derived from models employed for scientific and management purposes. Here, we compared two models: a biomass-based food-web model (Ecopath with Ecosim (EwE)) and a size-structured fish community model. The models were compared with respect to predicted ecological consequences of fishing to identify commonalities and differences in model predictions for the California Current fish community. We compared the models regarding direct and indirect responses to fishing on one or more species. The size-based model predicted a higher fishing mortality needed to reach maximum sustainable yield than EwE for most species. The size-based model also predicted stronger top-down effects of predator removals than EwE. In contrast, EwE predicted stronger bottom-up effects...

  3. Biglan Model Test Based on Institutional Diversity.

    Science.gov (United States)

    Roskens, Ronald W.; Creswell, John W.

    The Biglan model, a theoretical framework for empirically examining the differences among subject areas, classifies subject areas according to three dimensions: adherence to a common set of paradigms (hard or soft), application orientation (pure or applied), and emphasis on living systems (life or nonlife). Tests of the model are reviewed, and a further test is…

  4. Functional Behavioral Assessment: A School Based Model.

    Science.gov (United States)

    Asmus, Jennifer M.; Vollmer, Timothy R.; Borrero, John C.

    2002-01-01

    This article begins by discussing requirements for functional behavioral assessment under the Individuals with Disabilities Education Act and then describes a comprehensive model for the application of behavior analysis in the schools. The model includes descriptive assessment, functional analysis, and intervention and involves the participation…

  5. A Network Formation Model Based on Subgraphs

    CERN Document Server

    Chandrasekhar, Arun

    2016-01-01

    We develop a new class of random-graph models for the statistical estimation of network formation that allow for substantial correlation in links. Various subgraphs (e.g., links, triangles, cliques, stars) are generated and their union results in a network. We provide estimation techniques for recovering the rates at which the underlying subgraphs were formed. We illustrate the models via a series of applications including testing for incentives to form cross-caste relationships in rural India, testing to see whether network structure is used to enforce risk-sharing, testing as to whether networks change in response to a community's exposure to microcredit, and show that these models significantly outperform stochastic block models in matching observed network characteristics. We also establish asymptotic properties of the models and various estimators, which requires proving a new Central Limit Theorem for correlated random variables.
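    A minimal sketch of the generative idea (with assumed parameter values, and without the authors' estimation machinery): several subgraph types, here links and triangles, are generated independently and their union gives the observed network.

      # Sketch of a subgraph-union network formation model: independent links and
      # triangles are generated and their union gives the observed graph.
      # Illustrative only; parameter values and the estimator are assumptions.
      import itertools
      import random

      def generate_network(n, p_link, p_triangle, seed=0):
          rng = random.Random(seed)
          edges = set()
          # Subgraph type 1: pairwise links.
          for i, j in itertools.combinations(range(n), 2):
              if rng.random() < p_link:
                  edges.add((i, j))
          # Subgraph type 2: triangles (three mutual links formed jointly).
          for i, j, k in itertools.combinations(range(n), 3):
              if rng.random() < p_triangle:
                  edges.update({(i, j), (i, k), (j, k)})
          return edges

      net = generate_network(n=30, p_link=0.02, p_triangle=0.001)
      print(len(net), "edges in the union of generated subgraphs")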

  6. Web-based hydrological modeling system for flood forecasting and risk mapping

    Science.gov (United States)

    Wang, Lei; Cheng, Qiuming

    2008-10-01

    Flood forecasting is a complex process involving precipitation, drainage characteristics, land use/cover types, groundwater, and runoff discharge. The application of a flood forecasting model requires efficient management of large spatial and temporal datasets, covering data acquisition, storage, pre-processing, manipulation, analysis, and display of model results. The extensive datasets usually involve multiple organizations, but no single organization can collect and maintain all the multidisciplinary data. The possible usage of the available datasets remains limited primarily because of the difficulty associated with combining data from diverse and distributed data sources. Difficulty in linking data, analysis tools, and models is one of the barriers to be overcome in developing a real-time flood forecasting and risk prediction system. The current revolution in technology and online availability of spatial data, particularly with the construction of the Canadian Geospatial Data Infrastructure (CGDI), means that a lot of spatial data and information can be accessed in real time from distributed sources over the Internet to meet Canadians' need for information sharing in support of decision-making. This has resulted in research studies demonstrating the suitability of the web as a medium for implementation of flood forecasting and flood risk prediction. A web-based hydrological modeling system can provide the framework within which spatially distributed real-time data are accessed remotely to prepare model input files, perform model calculations, and evaluate model results for flood forecasting and flood risk prediction. This paper develops a prototype web-based hydrological modeling system for on-line flood forecasting and risk mapping in the Oak Ridges Moraine (ORM) area, southern Ontario, Canada, integrating information retrieval, analysis, and model analysis for near-real-time river runoff prediction, flood frequency prediction, flood risk and flood inundation

  7. Alcoholics Anonymous and twelve-step recovery: a model based on social and cognitive neuroscience.

    Science.gov (United States)

    Galanter, Marc

    2014-01-01

    In the course of achieving abstinence from alcohol, longstanding members of Alcoholics Anonymous (AA) typically experience a change in their addiction-related attitudes and behaviors. These changes are reflective of physiologically grounded mechanisms which can be investigated within the disciplines of social and cognitive neuroscience. This article is designed to examine recent findings associated with these disciplines that may shed light on the mechanisms underlying this change. Literature review and hypothesis development. Pertinent aspects of the neural impact of drugs of abuse are summarized. After this, research regarding specific brain sites, elucidated primarily by imaging techniques, is reviewed relative to the following: Mirroring and mentalizing are described in relation to experimentally modeled studies on empathy and mutuality, which may parallel the experiences of social interaction and influence on AA members. Integration and retrieval of memories acquired in a setting like AA are described, and are related to studies on storytelling, models of self-schema development, and value formation. A model for ascription to a Higher Power is presented. The phenomena associated with AA reflect greater complexity than the empirical studies on which this article is based, and certainly require further elucidation. Despite this substantial limitation in currently available findings, there is heuristic value in considering the relationship between the brain-based and clinical phenomena described here. There are opportunities for the study of neuroscientific correlates of Twelve-Step-based recovery, and these can potentially enhance our understanding of related clinical phenomena. © American Academy of Addiction Psychiatry.

  8. MQ-2 A Tool for Prolog-based Model Querying

    DEFF Research Database (Denmark)

    Acretoaie, Vlad; Störrle, Harald

    2012-01-01

    MQ-2 integrates a Prolog console into the MagicDraw modeling environment and equips this console with features targeted specifically to the task of querying models. The vision of MQ-2 is to make Prolog-based model querying accessible to both student and expert modelers by offering powerful query...

  9. An approach for activity-based DEVS model specification

    DEFF Research Database (Denmark)

    Alshareef, Abdurrahman; Sarjoughian, Hessam S.; Zarrin, Bahram

    2016-01-01

    activity-based behavior modeling of parallel DEVS atomic models. We consider UML activities and actions as fundamental units of behavior modeling, especially in the presence of recent advances in the UML 2.5 specifications. We describe in detail how to approach activity modeling with a set of elemental...
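    For orientation, a parallel DEVS atomic model is defined by a state, internal, external, and confluent transition functions, an output function, and a time advance. The sketch below is a generic, hand-written single-processor example; it is not generated from the UML-activity approach of the paper, and all names and behaviors are assumptions.

      # Minimal parallel-DEVS-style atomic model of a single-job processor.
      # Illustrative sketch only; not an artifact of the activity-based approach.
      INFINITY = float("inf")

      class Processor:
          def __init__(self, service_time=2.0):
              self.service_time = service_time
              self.phase = "idle"     # state
              self.job = None

          def time_advance(self):
              return self.service_time if self.phase == "busy" else INFINITY

          def output(self):
              # Called just before an internal transition.
              return {"out": self.job} if self.phase == "busy" else {}

          def internal_transition(self):
              self.phase, self.job = "idle", None

          def external_transition(self, elapsed, inputs):
              if self.phase == "idle" and "in" in inputs:
                  self.phase, self.job = "busy", inputs["in"]

          def confluent_transition(self, inputs):
              # Parallel DEVS default: internal transition first, then external.
              self.internal_transition()
              self.external_transition(0.0, inputs)

      p = Processor()
      p.external_transition(0.0, {"in": "job-1"})
      print(p.output(), p.time_advance())   # {'out': 'job-1'} 2.0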

  10. Testing R&D-Based Endogenous Growth Models

    DEFF Research Database (Denmark)

    Kruse-Andersen, Peter Kjær

    2017-01-01

    R&D-based growth models are tested using US data for the period 1953-2014. A general growth model is developed which nests the model varieties of interest. The model implies a cointegrating relationship between multifactor productivity, research intensity, and employment. This relationship is estimated ... the long-run growth rate of GDP per worker converges to between zero and 1.1 pct.

  11. A Stock Pricing Model Based on Arithmetic Brown Motion

    Institute of Scientific and Technical Information of China (English)

    YAN Yong-xin; HAN Wen-xiu

    2001-01-01

    This paper presents a new stock pricing model based on arithmetic Brownian motion. The model overcomes the shortcomings of the Gordon model. With the model, investors can estimate the stock value of surplus companies, deficit companies, zero-growth companies and bankrupt companies, for both long-term and short-term investment.
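    The paper's valuation formula is not reproduced in the abstract; the sketch below only illustrates price paths under arithmetic Brownian motion, S_t = S_0 + mu*t + sigma*W_t, which, unlike geometric Brownian motion or the constant-growth Gordon model, admits zero and negative values and hence zero-growth and deficit scenarios. All parameter values are assumed.

      # Simulate stock price paths under arithmetic Brownian motion:
      #   S_t = S_0 + mu * t + sigma * W_t
      # Drift mu may be zero or negative, so zero-growth and deficit scenarios can
      # be represented.  Parameters are illustrative, not the paper's values.
      import random

      def abm_path(s0, mu, sigma, horizon, steps, seed=0):
          rng = random.Random(seed)
          dt = horizon / steps
          s = s0
          path = [s]
          for _ in range(steps):
              s += mu * dt + sigma * (dt ** 0.5) * rng.gauss(0.0, 1.0)
              path.append(s)
          return path

      path = abm_path(s0=20.0, mu=-0.5, sigma=2.0, horizon=1.0, steps=252)
      print(f"price after one year: {path[-1]:.2f}")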

  12. Conflict Detection and Merging in Model based SCM Systems

    OpenAIRE

    Waqar Mehmood; Arshad Ali

    2014-01-01

    This study presents a fine-grained approach to conflict detection and merging in model-based Software Configuration Management (SCM) systems. Traditional SCM systems use textual or structured data to represent models at a fine-grained level. Our approach is based on defining a graph structure to represent model data at a fine-grained level. The approach transforms the textual or structured data into a graph structure and then performs the diff, merge and evolution con...

  13. Statistical Model-Based Face Pose Estimation

    Institute of Scientific and Technical Information of China (English)

    GE Xinliang; YANG Jie; LI Feng; WANG Huahua

    2007-01-01

    A robust face pose estimation approach is proposed in which a statistical face shape model is used and the pose parameters are represented by trigonometric functions. The face shape statistical model is first built by analyzing face shapes from different people under varying poses; shape alignment is vital in building the statistical model. Then, six trigonometric functions are employed to represent the face pose parameters. Lastly, a mapping function between face image and face pose is constructed by linearly relating the different parameters. The proposed approach is able to estimate different face poses using only a few training samples. Experimental results demonstrate its efficiency and accuracy.

  14. Simplified Atmospheric Dispersion Model and Model Based Real Field Estimation System of Air Pollution

    Institute of Scientific and Technical Information of China (English)

    2015-01-01

    The atmospheric dispersion model has been well developed and applied in pollution emergencies and prediction. Based on a sophisticated air diffusion model, this paper proposes a simplified model and some optimizations concerning meteorological and geological conditions. The model is suitable for the proposed Real Field Monitor and Estimation system. The principle of the simplified diffusion model and its optimization is studied. The design of the Real Field Monitor system based on this model and its fundamental implementation are introduced.
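    The paper's simplified model itself is not given in the abstract. As a stand-in, the sketch below shows the classical Gaussian plume formula that simplified dispersion models of this kind are typically built on; the power-law dispersion coefficients and all numerical values are assumptions.

      # Classical Gaussian plume concentration (g/m^3) at receptor (x, y, z) for a
      # point source of strength Q (g/s), wind speed u (m/s) and effective stack
      # height H (m).  Power-law sigma_y, sigma_z are crude stand-ins for the
      # stability-class curves a real model would use.
      import math

      def gaussian_plume(Q, u, H, x, y, z, a=0.08, b=0.06):
          sigma_y = a * x ** 0.9       # assumed growth of lateral spread with distance
          sigma_z = b * x ** 0.85      # assumed growth of vertical spread with distance
          lateral = math.exp(-y**2 / (2 * sigma_y**2))
          vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2))
                      + math.exp(-(z + H)**2 / (2 * sigma_z**2)))  # ground reflection
          return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

      # Ground-level concentration 1 km downwind of a 50 m stack emitting 100 g/s.
      print(gaussian_plume(Q=100.0, u=3.0, H=50.0, x=1000.0, y=0.0, z=1.5))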

  15. Multimedia Data Modeling Based on Temporal Logic and XYZ System

    Institute of Scientific and Technical Information of China (English)

    MA Huadong; LIU Shenquan

    1999-01-01

    This paper proposes a new approach to modeling multimedia data. The new approach is a multimedia data model based on temporal logic and the XYZ System. It supports formal specifications in a multimedia system. Using this model, we can not only specify information units but also design and script a multimedia title in a unified framework. Based on this model, an interactive multimedia authoring environment has been developed.

  16. Physics-Based Pneumatic Hammer Instability Model Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Florida Turbine Technologies (FTT) proposes to conduct research necessary to develop a physics-based pneumatic hammer instability model for hydrostatic bearings...

  17. Image-Based Modeling of Plants and Trees

    CERN Document Server

    Kang, Sing Bang

    2009-01-01

    Plants and trees are among the most complex natural objects. Much work has been done attempting to model them, with varying degrees of success. In this book, we review the various approaches in computer graphics, which we categorize as rule-based, image-based, and sketch-based methods. We describe our approaches for modeling plants and trees using images. Image-based approaches have the distinct advantage that the resulting model inherits the realistic shape and complexity of a real plant or tree. We use different techniques for modeling plants (with relatively large leaves) and trees (with re

  18. Comparing Ray-Based and Wave-Based Models of Cross-Beam Energy Transfer

    Science.gov (United States)

    Follett, R. K.; Edgell, D. H.; Shaw, J. G.; Froula, D. H.; Myatt, J. F.

    2016-10-01

    Ray-based models of cross-beam energy transfer (CBET) are used in radiation-hydrodynamics codes to calculate laser-energy deposition. The accuracy of ray-based CBET models is limited by assumptions about the polarization and phase of the interacting laser beams and by the use of a paraxial Wentzel-Kramers-Brillouin (WKB) approximation. A 3-D wave-based solver (LPSE-CBET) is used to study the nonlinear interaction between overlapping laser beams in underdense plasma. A ray-based CBET model is compared to the wave-based model and shows good agreement in simple geometries where the assumptions of the ray-based model are satisfied. Near caustic surfaces, the assumptions of the ray-based model break down and the calculated energy transfer deviates from wave-based calculations. This material is based upon work supported by the Department of Energy National Nuclear Security Administration under Award Number DE-NA0001944.

  19. Model-Based Reasoning in Humans Becomes Automatic with Training.

    Directory of Open Access Journals (Sweden)

    Marcos Economides

    2015-09-01

    Full Text Available Model-based and model-free reinforcement learning (RL) have been suggested as algorithmic realizations of goal-directed and habitual action strategies. Model-based RL is more flexible than model-free but requires sophisticated calculations using a learnt model of the world. This has led model-based RL to be identified with slow, deliberative processing, and model-free RL with fast, automatic processing. In support of this distinction, it has recently been shown that model-based reasoning is impaired by placing subjects under cognitive load, a hallmark of non-automaticity. Here, using the same task, we show that cognitive load does not impair model-based reasoning if subjects receive prior training on the task. This finding is replicated across two studies and a variety of analysis methods. Thus, task familiarity permits use of model-based reasoning in parallel with other cognitive demands. The ability to deploy model-based reasoning in an automatic, parallelizable fashion has widespread theoretical implications, particularly for the learning and execution of complex behaviors. It also suggests a range of important failure modes in psychiatric disorders.
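    A compact way to see the distinction the study builds on is a tabular illustration (generic, not the two-step task used in the experiments; environment, rewards and parameters are assumptions): the model-free learner caches action values directly, while the model-based learner first learns a transition model and then plans through it.

      # Generic tabular illustration of model-free vs. model-based value updates.
      # Not the experimental paradigm of the paper; all numbers are assumed.
      import random

      STATES, ACTIONS = range(3), range(2)
      ALPHA, GAMMA = 0.1, 0.9

      def true_step(s, a, rng):
          s2 = rng.choice(STATES)          # toy environment with random transitions
          r = 1.0 if s2 == 2 else 0.0
          return s2, r

      rng = random.Random(0)
      q_mf = {(s, a): 0.0 for s in STATES for a in ACTIONS}                     # model-free cache
      counts = {(s, a, s2): 1e-6 for s in STATES for a in ACTIONS for s2 in STATES}
      rew_sum = {(s, a): 0.0 for s in STATES for a in ACTIONS}                  # learnt model

      for _ in range(5000):
          s, a = rng.choice(STATES), rng.choice(ACTIONS)
          s2, r = true_step(s, a, rng)
          # Model-free: one-step temporal-difference update of the cached value.
          q_mf[s, a] += ALPHA * (r + GAMMA * max(q_mf[s2, b] for b in ACTIONS) - q_mf[s, a])
          # Model-based: only update the learnt model; planning happens afterwards.
          counts[s, a, s2] += 1
          rew_sum[s, a] += r

      # Model-based planning: value iteration over the learnt transition model.
      q_mb = {(s, a): 0.0 for s in STATES for a in ACTIONS}
      for _ in range(100):
          for s in STATES:
              for a in ACTIONS:
                  n = sum(counts[s, a, s2] for s2 in STATES)
                  q_mb[s, a] = rew_sum[s, a] / n + GAMMA * sum(
                      counts[s, a, s2] / n * max(q_mb[s2, b] for b in ACTIONS)
                      for s2 in STATES)

      print(q_mf[(0, 0)], q_mb[(0, 0)])   # both converge to similar values here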

  20. Probabilistic reasoning for assembly-based 3D modeling

    KAUST Repository

    Chaudhuri, Siddhartha

    2011-01-01

    Assembly-based modeling is a promising approach to broadening the accessibility of 3D modeling. In assembly-based modeling, new models are assembled from shape components extracted from a database. A key challenge in assembly-based modeling is the identification of relevant components to be presented to the user. In this paper, we introduce a probabilistic reasoning approach to this problem. Given a repository of shapes, our approach learns a probabilistic graphical model that encodes semantic and geometric relationships among shape components. The probabilistic model is used to present components that are semantically and stylistically compatible with the 3D model that is being assembled. Our experiments indicate that the probabilistic model increases the relevance of presented components. © 2011 ACM.

  1. A resonance based model of biological evolution

    Science.gov (United States)

    Damasco, Achille; Giuliani, Alessandro

    2017-04-01

    We propose a coarse-grained physical model of evolution. The proposed model is, at least in principle, amenable to experimental verification, even if this looks like a conundrum: evolution is a unique historical process and the tape cannot be reversed and played again. Nevertheless, we can imagine a phenomenological scenario tailored upon state transitions in physical chemistry, in which different agents of evolution play the role of the elements of a state transition, such as thermal noise or resonance effects. The abstract model we propose can be of help for sketching hypotheses about some well-known features of natural history, like the so-called Cambrian explosion. The possibility of an experimental proof of the model is discussed as well.

  2. Demand forecast model based on CRM

    Science.gov (United States)

    Cai, Yuancui; Chen, Lichao

    2006-11-01

    As customer-centric management thinking becomes more deeply internalized, forecasting customer demand becomes more and more important. In the demand forecasting of customer relationship management, traditional forecasting methods are severely limited by the large uncertainty in demand, which calls for new models. In this paper, the idea is to forecast demand from the characteristics of the potential customer and to build the model accordingly. The model first describes customers using a uniform set of multiple indexes. Second, it identifies characteristic customers on the basis of a data warehouse and data mining technology. Finally, the most similar characteristic customer is found by comparison, and the demand of a new customer is forecast from that most similar characteristic customer.
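    The "most similar characteristic customer" step can be sketched as a nearest-neighbour lookup over the index vectors; the index names, weights and data below are illustrative assumptions, not the paper's.

      # Sketch: describe customers with a uniform index vector, find the closest
      # characteristic customer, and use that customer's demand as the forecast.
      import math

      characteristic_customers = [
          # ([order_frequency, avg_order_size, industry_score], demand)
          ([0.8, 0.6, 0.3], 120.0),
          ([0.2, 0.9, 0.7],  65.0),
          ([0.5, 0.4, 0.9],  90.0),
      ]

      def forecast_demand(new_customer_indexes):
          def distance(a, b):
              return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
          _, demand = min(characteristic_customers,
                          key=lambda c: distance(c[0], new_customer_indexes))
          return demand

      print(forecast_demand([0.7, 0.5, 0.4]))   # 120.0, the closest characteristic customer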

  3. An event-based model for contracts

    Directory of Open Access Journals (Sweden)

    Tiziana Cimoli

    2013-02-01

    Full Text Available We introduce a basic model for contracts. Our model extends event structures with a new relation, which faithfully captures the circular dependencies among contract clauses. We establish whether an agreement exists which respects all the contracts at hand (i.e., all the dependencies can be resolved), and we detect the obligations of each participant. The main technical contribution is a correspondence between our model and a fragment of the contract logic PCL. More precisely, we show that the reachable events are exactly those which correspond to provable atoms in the logic. Despite this strong correspondence, our model improves on previous work on PCL by exhibiting a finer-grained notion of culpability, which takes into account the legitimate orderings of events.

  4. Geometric deviation modeling by kinematic matrix based on Lagrangian coordinate

    Science.gov (United States)

    Liu, Weidong; Hu, Yueming; Liu, Yu; Dai, Wanyi

    2015-09-01

    Typical representations of dimensional and geometric accuracy are limited to the self-representation of dimensional and geometric deviation based on geometry-variation thinking, and the interaction between geometric variation and the gesture (pose) variation of a multi-rigid body is not included. In this paper, a kinematic matrix model based on Lagrangian coordinates is introduced, with the purpose of providing a unified model for geometric variation and gesture variation and their interactive, integrated analysis. A kinematic model with a joint, a local base and a movable base is built. The ideal feature of the functional geometry is treated as the base body, and the fitting feature of the functional geometry is treated as the adjacent movable body; the local base of the kinematic model is fixed onto the ideal geometry, and the movable base is fixed onto the fitting geometry. The geometric deviation is then treated as the relative location or rotation variation between the movable base and the local base, and it is expressed in Lagrangian coordinates. Moreover, kinematic matrices based on Lagrangian coordinates are constructed for different types of geometric tolerance zones, and the total degrees of freedom for each kinematic model are discussed. Finally, the Lagrangian coordinate library and the kinematic matrix library for geometric deviation modeling are illustrated, and an example of a block-and-piston fit is introduced. Dimensional and geometric tolerances of the shaft and hole fitting features are constructed with kinematic matrices and Lagrangian coordinates, and the results indicate that the proposed kinematic matrix is capable and robust in modeling dimensional and geometric tolerances.
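    The core idea of expressing a deviation as a small relative motion between a local base and a movable base can be sketched with a homogeneous transformation whose entries are small translation and rotation coordinates. This is a generic first-order small-displacement form, not the paper's tolerance-zone-specific matrices.

      # Homogeneous transform for a small deviation of a movable base relative to a
      # local base: translations (dx, dy, dz) and small rotations (rx, ry, rz),
      # using the first-order small-angle approximation.  Generic sketch only.
      def deviation_matrix(dx, dy, dz, rx, ry, rz):
          return [
              [1.0, -rz,  ry, dx],
              [ rz, 1.0, -rx, dy],
              [-ry,  rx, 1.0, dz],
              [0.0, 0.0, 0.0, 1.0],
          ]

      def transform_point(T, p):
          x, y, z = p
          return tuple(T[i][0] * x + T[i][1] * y + T[i][2] * z + T[i][3] for i in range(3))

      T = deviation_matrix(dx=0.02, dy=0.0, dz=0.0, rx=0.0, ry=0.0, rz=0.001)
      print(transform_point(T, (10.0, 0.0, 0.0)))   # nominal point displaced by the deviation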

  5. ALC: automated reduction of rule-based models

    Directory of Open Access Journals (Sweden)

    Gilles Ernst

    2008-10-01

    Full Text Available Abstract. Background: Combinatorial complexity is a challenging problem for the modeling of cellular signal transduction, since the association of a few proteins can give rise to an enormous number of feasible protein complexes. The layer-based approach is an approximative, but accurate, method for the mathematical modeling of signaling systems with inherent combinatorial complexity. The number of variables in the simulation equations is highly reduced and the resulting dynamic models show a pronounced modularity. Layer-based modeling allows for the modeling of systems not accessible previously. Results: ALC (Automated Layer Construction) is a computer program that greatly simplifies the building of reduced modular models according to the layer-based approach. The model is defined using a simple but powerful rule-based syntax that supports the concepts of modularity and macrostates. ALC performs consistency checks on the model definition and provides the model output in different formats (C MEX, MATLAB, Mathematica and SBML) as ready-to-run simulation files. ALC also provides additional documentation files that simplify the publication or presentation of the models. The tool can be used offline or via a form on the ALC website. Conclusion: ALC allows for a simple rule-based generation of layer-based reduced models. The model files are given in different formats as ready-to-run simulation files.

  6. AN INFLATION-BASED MAINTENANCE SCHEDULING MODEL

    Directory of Open Access Journals (Sweden)

    S.A. Oke

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: This paper attempts to extend an original Gantt-charting model for optimal maintenance scheduling of multiple facilities in multiple periods, credited to Charles-Owaba. In particular, an inflationary factor is incorporated into the period-dependent cost element of the model. The new model is tested through computational experimentation. This work is perhaps the first to extend the Optimal Gantt Charting (OGC) model to account for inflation. The work presents a wealth of research opportunities and has at least a modest potential to elevate maintenance scheduling theory into the ranks of the major theories of maintenance.

    AFRIKAANSE OPSOMMING: The article attempts to accommodate inflation within the framework of a Gantt-chart model for determining the optimal scheduling of maintenance for multiple facilities over time. Inflation thus forms part of the cost element of the model. The model was also validated in a limited way by means of a computer study. The results obtained show that additional research can be done to establish the method in the front rank of maintenance planning.
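    The incorporation of an inflationary factor into a period-dependent cost element can be illustrated with simple compounding; the rates, base cost and schedule below are assumptions, and the actual OGC cost function of the paper is not reproduced.

      # Inflate a period-dependent maintenance cost: a base cost incurred in period t
      # grows with a per-period inflation rate.  Generic illustration only.
      def inflated_cost(base_cost, inflation_rate, period):
          return base_cost * (1.0 + inflation_rate) ** period

      schedule = {1: 500.0, 3: 500.0, 6: 500.0}   # maintenance planned in periods 1, 3, 6
      total = sum(inflated_cost(c, inflation_rate=0.05, period=t) for t, c in schedule.items())
      print(round(total, 2))   # total schedule cost with 5% inflation per period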

  7. Evaluating carbon fluxes of global forest ecosystems by using an individual tree-based model FORCCHN.

    Science.gov (United States)

    Ma, Jianyong; Shugart, Herman H; Yan, Xiaodong; Cao, Cougui; Wu, Shuang; Fang, Jing

    2017-02-14

    The carbon budget of forest ecosystems, an important component of the terrestrial carbon cycle, needs to be accurately quantified and predicted by ecological models. As a preamble to applying the model to estimate global carbon uptake by forest ecosystems, we used CO2 flux measurements from 37 forest eddy-covariance sites to examine the individual-tree-based FORCCHN model's performance globally. In these initial tests, the FORCCHN model simulated gross primary production (GPP), ecosystem respiration (ER) and net ecosystem production (NEP) with correlations of 0.72, 0.70 and 0.53, respectively, across all forest biomes. The model underestimated GPP and slightly overestimated ER across most of the eddy-covariance sites. The underestimation of NEP arose primarily from the lower GPP estimates. Model performance was better in capturing both the temporal changes and magnitude of carbon fluxes in deciduous broadleaf forest than in evergreen broadleaf forest, and it performed less well for sites in Mediterranean climates. We then applied the model to estimate the carbon fluxes of forest ecosystems on the global scale over 1982-2011. This application of FORCCHN gave a total GPP of 59.41 ± 5.67 Pg C yr-1 and an ER of 57.21 ± 5.32 Pg C yr-1 for global forest ecosystems during 1982-2011. Over the same period the forest ecosystems acted as a large carbon sink, with a total NEP of 2.20 ± 0.64 Pg C yr-1. These values are comparable to and reinforce estimates reported in other studies. This analysis highlights that the individual-tree-based model FORCCHN can be used to evaluate carbon fluxes of forest ecosystems on the global scale.

  8. 3D Object Recognition Based on Linear Lie Algebra Model

    Institute of Scientific and Technical Information of China (English)

    LI Fang-xing; WU Ping-dong; SUN Hua-fei; PENG Lin-yu

    2009-01-01

    A surface model called the fibre bundle model and a 3D object model based on the linear Lie algebra model are proposed. Then an algorithm for 3D object recognition using the linear Lie algebra models is presented. It is a convenient recognition method for objects which are symmetric about some axis. Using the presented algorithm, the representation matrices of the fibre or the base curve can be obtained from only a finite number of points of the linear Lie algebra model. Finally, some recognition results on real objects are given.

  9. Model-based Prognostics under Limited Sensing

    Data.gov (United States)

    National Aeronautics and Space Administration — Prognostics is crucial to providing reliable condition-based maintenance decisions. To obtain accurate predictions of component life, a variety of sensors are often...

  10. Mechanics and model-based control of advanced engineering systems

    CERN Document Server

    Irschik, Hans; Krommer, Michael

    2014-01-01

    Mechanics and Model-Based Control of Advanced Engineering Systems collects 32 contributions presented at the International Workshop on Advanced Dynamics and Model Based Control of Structures and Machines, which took place in St. Petersburg, Russia in July 2012. The workshop continued a series of international workshops, which started with a Japan-Austria Joint Workshop on Mechanics and Model Based Control of Smart Materials and Structures and a Russia-Austria Joint Workshop on Advanced Dynamics and Model Based Control of Structures and Machines. In the present volume, 10 full-length papers based on presentations from Russia, 9 from Austria, 8 from Japan, 3 from Italy, one from Germany and one from Taiwan are included, which represent the state of the art in the field of mechanics and model based control, with particular emphasis on the application of advanced structures and machines.

  11. An Emotional Agent Model Based on Granular Computing

    Directory of Open Access Journals (Sweden)

    Jun Hu

    2012-01-01

    Full Text Available Affective computing has very important significance for achieving intelligent information processing and harmonious communication between human beings and computers. A new model for an emotional agent is proposed in this paper to give agents the ability to handle emotions, based on granular computing theory and the traditional BDI agent model. First, a new emotion knowledge base based on granular computing for emotion expression is presented in the model. Second, a new emotional reasoning algorithm based on granular computing is proposed. Third, a new emotional agent model based on granular computing is presented. Finally, based on the model, an emotional agent for patient assistance in hospital is realized; experimental results show that it handles simple emotions efficiently.

  12. Understanding uncertainty in process-based hydrological models

    Science.gov (United States)

    Clark, M. P.; Kavetski, D.; Slater, A. G.; Newman, A. J.; Marks, D. G.; Landry, C.; Lundquist, J. D.; Rupp, D. E.; Nijssen, B.

    2013-12-01

    Building an environmental model requires making a series of decisions regarding the appropriate representation of natural processes. While some of these decisions can already be based on well-established physical understanding, gaps in our current understanding of environmental dynamics, combined with incomplete knowledge of properties and boundary conditions of most environmental systems, make many important modeling decisions far more ambiguous. There is consequently little agreement regarding what a 'correct' model structure is, especially at relatively larger spatial scales such as catchments and beyond. In current practice, faced with such a range of decisions, different modelers will generally make different modeling decisions, often on an ad hoc basis, based on their balancing of process understanding, the data available to evaluate the model, the purpose of the modeling exercise, and their familiarity with or investment in an existing model infrastructure. This presentation describes development and application of multiple-hypothesis models to evaluate process-based hydrologic models. Our numerical model uses robust solutions of the hydrology and thermodynamic governing equations as the structural core, and incorporates multiple options to represent the impact of different modeling decisions, including multiple options for model parameterizations (e.g., below-canopy wind speed, thermal conductivity, storage and transmission of liquid water through soil, etc.), as well as multiple options for model architecture, that is, the coupling and organization of different model components (e.g., representations of sub-grid variability and hydrologic connectivity, coupling with groundwater, etc.). Application of this modeling framework across a collection of different research basins demonstrates that differences among model parameterizations are often overwhelmed by differences among equally-plausible model parameter sets, while differences in model architecture lead

  13. Active Appearance Model Based Hand Gesture Recognition

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    This paper addresses hand gesture recognition in monocular image sequences using the Active Appearance Model (AAM). The proposed algorithm consists of constructing AAMs and fitting the models to the region of interest. In the training stage, the AAM is constructed from manually labeled feature points and the corresponding average features are obtained. In the recognition stage, the hand gesture region of interest is first segmented using skin and movement cues. Second, the models are fitted to the image that includes the hand gesture, and the relevant features are extracted. Third, classification is done by comparing the extracted features with the average features. Thirty different gestures of Chinese Sign Language are used to test the effectiveness of the method. Experimental results indicate good performance of the algorithm.

  14. Model Based Analysis of Insider Threats

    DEFF Research Database (Denmark)

    Chen, Taolue; Han, Tingting; Kammueller, Florian

    2016-01-01

    In order to detect malicious insider attacks it is important to model and analyse infrastructures and policies of organisations and the insiders acting within them. We extend formal approaches that allow modelling such scenarios by quantitative aspects to enable a precise analysis of security ... designs. Our framework enables evaluating the risks of an insider attack to happen quantitatively. The framework first identifies an insider's intention to perform an inside attack, using Bayesian networks, and in a second phase computes the probability of success for an inside attack by this actor, using ... probabilistic model checking. We provide prototype tool support using Matlab for Bayesian networks and PRISM for the analysis of Markov decision processes, and validate the framework with case studies ...

  15. Flood forecasting for River Mekong with data-based models

    Science.gov (United States)

    Shahzad, Khurram M.; Plate, Erich J.

    2014-09-01

    In many regions of the world, the task of flood forecasting is made difficult because only a limited database is available for generating a suitable forecast model. This paper demonstrates that in such cases parsimonious data-based hydrological models for flood forecasting can be developed if the special conditions of climate and topography are used to advantage. As an example, the middle reach of River Mekong in South East Asia is considered, where a database of discharges from seven gaging stations on the river and 31 rainfall stations on the subcatchments between gaging stations is available for model calibration. Special conditions existing for River Mekong are identified and used in developing first a network connecting all discharge gages and then models for forecasting discharge increments between gaging stations. Our final forecast model (Model 3) is a linear combination of two structurally different basic models: a model (Model 1) using linear regressions for forecasting discharge increments, and a model (Model 2) using rainfall-runoff models. Although the model based on linear regressions works reasonably well for short times, better results are obtained with rainfall-runoff modeling. However, forecast accuracy of Model 2 is limited by the quality of rainfall forecasts. For best results, both models are combined by taking weighted averages to form Model 3. Model quality is assessed by means of both persistence index PI and standard deviation of forecast error.
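    The structure of the final forecast, a weighted average of the regression-based Model 1 and the rainfall-runoff-based Model 2 scored with a persistence index, can be sketched as below; the weights, the persistence-index form and the data are illustrative assumptions, not the paper's calibrated values.

      # Combine two discharge forecasts by a weighted average (the Model 3 idea)
      # and score the result with a persistence index against a "no-change" benchmark.
      def combine(forecast1, forecast2, w1=0.4, w2=0.6):
          return [w1 * f1 + w2 * f2 for f1, f2 in zip(forecast1, forecast2)]

      def persistence_index(observed, forecast, lead):
          # Benchmark: persist the value observed `lead` steps earlier.
          num = sum((f - o) ** 2 for f, o in zip(forecast[lead:], observed[lead:]))
          den = sum((observed[t - lead] - observed[t]) ** 2 for t in range(lead, len(observed)))
          return 1.0 - num / den

      obs    = [100, 120, 160, 220, 300, 360, 380]   # illustrative discharges
      model1 = [100, 118, 150, 200, 270, 330, 370]   # regression-based forecast
      model2 = [100, 122, 158, 215, 290, 355, 378]   # rainfall-runoff-based forecast
      model3 = combine(model1, model2)
      print(round(persistence_index(obs, model3, lead=1), 3))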

  16. Assessment of Vegetation Variation on Primarily Creation Zones of the Dust Storms Around the Euphrates Using Remote Sensing Images

    Directory of Open Access Journals (Sweden)

    Jamil Amanollahi

    2012-06-01

    Full Text Available Recently, the frequency and affected area of the dust storms that enter Iran from Iraq have increased. In this study, in addition to detecting the creation zones of the dust storms, the effect of vegetation cover variation on their creation was investigated using remote sensing. Moderate Resolution Imaging Spectroradiometer (MODIS) and Landsat Thematic Mapper (TM5) images were utilized to identify the primary creation zones of the dust storms and to assess the vegetation cover variation, respectively. Vegetation cover variation was studied using the Normalized Difference Vegetation Index (NDVI) obtained from band 3 and band 4 of the Landsat satellite. The results showed that the surrounding area of the Euphrates in Syria, the deserts in the vicinity of this river in Iraq, including the deserts of Al Anbar Province, and the northern deserts of Saudi Arabia are the primary creation zones of the dust storms entering the west and south-west of Iran. The NDVI results showed that, excluding the deserts on the border of Syria and Iraq, the area with very weak vegetation cover increased by between 2.44% and 20.65% from 1991 to 2009. Meanwhile, the retention pond surface areas in the southern deserts of Syria and in the deserts on its border with Iraq decreased by 6320 and 4397 hectares, respectively. It can be concluded from the findings that one of the main environmental factors initiating these dust storms is the decrease in vegetation cover in their primary creation zones.
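    For reference, the NDVI used in the study is computed per pixel from the red and near-infrared reflectances, with Landsat TM band 4 as near-infrared and band 3 as red; the reflectance values in the sketch below are assumed.

      # NDVI = (NIR - Red) / (NIR + Red), here with Landsat TM band 4 as NIR and
      # band 3 as red, as in the study.  Reflectance values are illustrative.
      def ndvi(nir, red, eps=1e-9):
          return [(n - r) / (n + r + eps) for n, r in zip(nir, red)]

      band4_nir = [0.40, 0.35, 0.12]   # e.g. vegetated, sparse, bare-soil pixels
      band3_red = [0.08, 0.10, 0.11]
      print([round(v, 2) for v in ndvi(band4_nir, band3_red)])   # [0.67, 0.56, 0.04]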

  17. CLINICAL OBSERVATION ON TREATMENT OF ACUTE BRONCHITIS PRIMARILY WITH PRICKING-CUPPING ON BACK-SHU POINTS

    Institute of Scientific and Technical Information of China (English)

    XU Wei-dong; ZHANG Yong-juan; YANG Jie; CHEN Xiao-xiang; LIU Yong-xiang

    2006-01-01

    Objective: To observe the clinical effect of treating acute bronchitis primarily with the pricking-cupping method on Back-shu points. Methods: Patients with acute bronchitis were randomly divided into 2 groups. The observation group (36 cases) was treated with integrated traditional Chinese and Western medicine, primarily the pricking-cupping method on Back-shu points, while the control group (29 cases) was given conventional Western medicine treatment. Both groups were treated for 7 days as one course. Results: The total effective rate of the observation group was 97.2%, while that of the control group was 82.8%, a significant difference. On the first and third days the clinical manifestations improved more in the treatment group than in the control group (P < 0.01), while on the fifth and seventh days the comparison showed no significant difference (P > 0.05). Conclusion: Treatment of acute bronchitis with integrated traditional Chinese and Western medicine, primarily the pricking-cupping method on Back-shu points, has a marked therapeutic effect, simple manipulation, and few adverse effects, and thus offers a unique advantage.

  18. Community perceptions of safety in relation to perceived youth violence-delinquency in a primarily native Hawaiian and Asian American community in Hawai'i.

    Science.gov (United States)

    Hishinuma, Earl S; Chang, Janice Y; Soli, Faapisa M

    2012-02-01

    Perception of safety is an important component to the well-being of community members in their own neighborhood. The present study was the first of its kind to model community perception of safety utilizing a primarily Native Hawaiian and Asian American community sample (N = 101) and with perceived youth violence and delinquency as prominent potential influences. The study found that the majority of participants felt that several types of youth violence and delinquency were problems in the community. The overall social-ecological model evidenced a strong fit and indicated that community perception of safety was adversely impacted by perceived youth violence and delinquency and increased through positive relations with neighbors. The implications included the need for a more comprehensive approach to positive youth development and community capacity-building, including incorporation of cultural components, and to determine whether the model is applicable to other minority communities.

  19. Modeling cookoff of HMX based PBX explosives

    Energy Technology Data Exchange (ETDEWEB)

    Hobbs, Michael L.

    2017-03-01

    We have previously developed a cookoff model for the plastic bonded explosive PBX 9501 consisting of 95 wt% octahydro-1,3,5,7-tetranitro-1,3,5,7-tetrazocine (HMX), 2.5 wt% Estane® 5703 (a polyurethane thermoplastic), and 2.5 wt% of a nitroplasticizer (NP): BDNPA/F, a 50/50 wt% eutectic mixture of bis(2,2-dinitropropyl)-acetal (BDNPA) and bis(2,2-dinitropropyl)-formal (BDNPF). This five-step model includes desorption of water, decomposition of the NP to form NO2, reaction of the NO2 with Estane and HMX, and decomposition of HMX [1]. This model has been successfully validated with data from six laboratories with scales ranging from 2 g to more than 2.5 kg of explosive. We have determined that the PBX 9501 model can be used to predict cookoff of other plastic bonded explosives containing HMX and an inert binder, such as LX-04 consisting of 85 wt% HMX and 15 wt% Viton A (vinylidene fluoride/hexafluoropropylene copolymer), LX-07 (90 wt% HMX and 10 wt% Viton A), LX-10-0 (95 wt% HMX and 5 wt% Viton A), and LX-14 consisting of 95.5 wt% HMX and 4.5 wt% Estane® 5702-F1 (a polyurethane thermoplastic). Normally our cookoff models are verified using Sandia's Instrumented Thermal Initiation (SITI) experiment. However, SITI data for LX-04, LX-07, LX-10-0, and LX-14 are not available at pressed density, although some molding powder SITI data on LX-10-0 and LX-14 exist. Tarver and Tran [2] provide some one-dimensional time-to-explosion (ODTX) data for these explosives. The applicability of the PBX 9501 model to LX-04, LX-07, LX-10-0, and LX-14 was assessed using this ODTX data [2]. The PBX 9501 model is applied to these other explosives by accounting for the correct amount of HMX in the explosive and limiting the NP reaction. We have found the PBX 9501 model to be useful for predicting the response of these PBXs to abnormal thermal environments such as fire.

  20. Model based monitoring of stormwater runoff quality

    DEFF Research Database (Denmark)

    Birch, Heidi; Vezzaro, Luca; Mikkelsen, Peter Steen

    2012-01-01

    Monitoring of micropollutants (MP) in stormwater is essential to evaluate the impacts of stormwater on the receiving aquatic environment. The aim of this study was to investigate how different strategies for monitoring of stormwater quality (combination of model with field sampling) affect ... the information obtained about MPs discharged from the monitored system. A dynamic stormwater quality model was calibrated using MP data collected by volume-proportional and passive sampling in a storm drainage system in the outskirts of Copenhagen (Denmark) and a 10-year rain series was used to find annual...

  1. Improved word-based language model

    Institute of Scientific and Technical Information of China (English)

    CHEN Yong(陈勇); CHAN Kwok-ping

    2004-01-01

    In order to construct a good language model for use in the post-processing phase of a recognition system, a smoothing technique must be used to solve the data sparseness problem. Many smoothing techniques have been proposed in the past; among them, Katz's smoothing technique is well known. However, we found a weakness in Katz's smoothing technique. We improved this approach by incorporating a kind of special Chinese language information and Chinese word-class information into the language model. We tested the new smoothing technique with a Chinese character recognition system. The experimental results showed that better performance can be achieved.
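    Katz's smoothing, referred to above, redistributes probability mass from seen n-grams to unseen ones by discounting observed counts and backing off to the lower-order model. The toy bigram version below uses a fixed absolute discount as a simplification (the original method uses Good-Turing discounts), and the corpus is an assumption.

      # Toy Katz-style back-off bigram model with a fixed absolute discount.
      from collections import Counter

      corpus = "the cat sat on the mat the cat ate".split()
      unigrams = Counter(corpus)
      bigrams = Counter(zip(corpus[:-1], corpus[1:]))
      D = 0.5                                   # assumed absolute discount

      def p_unigram(w):
          return unigrams[w] / sum(unigrams.values())

      def p_bigram(w_prev, w):
          c_prev = unigrams[w_prev]
          if bigrams[(w_prev, w)] > 0:
              return (bigrams[(w_prev, w)] - D) / c_prev        # discounted ML estimate
          # Back-off: leftover mass from discounting, spread over unseen continuations
          # in proportion to their unigram probabilities.
          seen = [v for (u, v), c in bigrams.items() if u == w_prev and c > 0]
          leftover = D * len(seen) / c_prev
          unseen_mass = sum(p_unigram(v) for v in unigrams if v not in seen)
          return leftover * p_unigram(w) / unseen_mass

      print(p_bigram("the", "cat"), p_bigram("the", "sat"))   # seen vs. backed-off estimate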

  2. Selection of probability based weighting models for Boolean retrieval system

    Energy Technology Data Exchange (ETDEWEB)

    Ebinuma, Y. (Japan Atomic Energy Research Inst., Tokai, Ibaraki. Tokai Research Establishment)

    1981-09-01

    Automatic weighting models based on probability theory were studied to determine whether they can be applied to Boolean search logic including logical sums. The INIS database was used for searching with one particular search formula. Among sixteen models, three with good ranking performance were selected. These three models were further applied to searching with nine search formulas in the same database. It was found that two of them show slightly better average ranking performance, while the other model, the simplest one, also seems practical.

  3. Genetic programming-based chaotic time series modeling

    Institute of Scientific and Technical Information of China (English)

    张伟; 吴智铭; 杨根科

    2004-01-01

    This paper proposes a Genetic Programming-Based Modeling (GPM) algorithm on chaotic time series. GP is used here to search for appropriate model structures in function space, and the Particle Swarm Optimization (PSO) algorithm is used for Nonlinear Parameter Estimation (NPE) of dynamic model structures. In addition, GPM integrates the results of Nonlinear Time Series Analysis (NTSA) to adjust the parameters and takes them as the criteria of established models. Experiments showed the effectiveness of such improvements on chaotic time series modeling.

  4. Theory-based Practice: Comparing and Contrasting OT Models

    DEFF Research Database (Denmark)

    Nielsen, Kristina Tomra; Berg, Brett

    2012-01-01

    Theory- Based Practice: Comparing and Contrasting OT Models The workshop will present a critical analysis of the major models of occupational therapy, A Model of Human Occupation, Enabling Occupation II, and Occupational Therapy Intervention Process Model. Similarities and differences among...... the models will be discussed, including each model’s limitations and unique contributions to the profession. Workshop format will include short lectures and group discussions....

  5. Cognitive Modeling for Agent-Based Simulation of Child Maltreatment

    Science.gov (United States)

    Hu, Xiaolin; Puddy, Richard

    This paper extends previous work to develop cognitive modeling for agent-based simulation of child maltreatment (CM). The developed model is inspired by parental efficacy, parenting stress, and the theory of planned behavior. It provides an explanatory, process-oriented model of CM and incorporates causal relationships and feedback loops among different factors in the social ecology in order to simulate the dynamics of CM. We describe the model and present simulation results to demonstrate the features of this model.

  6. Gradient-based adaptation of continuous dynamic model structures

    Science.gov (United States)

    La Cava, William G.; Danai, Kourosh

    2016-01-01

    A gradient-based method of symbolic adaptation is introduced for a class of continuous dynamic models. The proposed model structure adaptation method starts with the first-principles model of the system and adapts its structure after adjusting its individual components in symbolic form. A key contribution of this work is its introduction of the model's parameter sensitivity as the measure of symbolic changes to the model. This measure, which is essential to defining the structural sensitivity of the model, not only accommodates algebraic evaluation of candidate models in lieu of more computationally expensive simulation-based evaluation, but also makes possible the implementation of gradient-based optimisation in symbolic adaptation. The proposed method is applied to models of several virtual and real-world systems that demonstrate its potential utility.

  7. Thermodynamics-based models of transcriptional regulation with gene sequence.

    Science.gov (United States)

    Wang, Shuqiang; Shen, Yanyan; Hu, Jinxing

    2015-12-01

    Quantitative models of gene regulatory activity have the potential to improve our mechanistic understanding of transcriptional regulation. However, the few models available today have been based on simplistic assumptions about the sequences being modeled or on heuristic approximations of the underlying regulatory mechanisms. In this work, we have developed a thermodynamics-based model to predict gene expression driven by any DNA sequence. The proposed model relies on a continuous-time, differential-equation description of transcriptional dynamics. The sequence features of the promoter are exploited to derive the binding affinity, which is computed using statistical molecular thermodynamics. Experimental results show that the proposed model can effectively identify the activity levels of transcription factors and the regulatory parameters. Compared with previous models, the proposed model reveals more biological insight.
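    The two ingredients described, a thermodynamic binding probability derived from sequence and a differential-equation description of transcription, can be sketched as below. The per-base energies, the single-site two-state simplification, and all rates are assumptions, not the paper's fitted model.

      # Sketch: (1) binding probability of a TF at one promoter site from a simple
      # sequence energy score (Boltzmann weighting), and (2) an ODE for mRNA driven
      # by that occupancy:  dm/dt = beta * P_bound - gamma * m.
      import math

      site_energy = {"A": -1.0, "C": 0.0, "G": 0.2, "T": 0.5}   # assumed per-base energies

      def binding_probability(site_seq, tf_concentration, beta_thermo=1.0):
          e = sum(site_energy[b] for b in site_seq)
          w = tf_concentration * math.exp(-beta_thermo * e)      # Boltzmann weight of bound state
          return w / (1.0 + w)                                    # two-state occupancy

      def simulate_mrna(p_bound, beta=2.0, gamma=0.1, m0=0.0, dt=0.1, steps=500):
          m = m0
          for _ in range(steps):
              m += dt * (beta * p_bound - gamma * m)              # forward-Euler integration
          return m

      p = binding_probability("AACA", tf_concentration=0.5)
      print(round(p, 3), round(simulate_mrna(p), 2))   # occupancy and resulting mRNA level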

  8. Model-based satellite image fusion

    DEFF Research Database (Denmark)

    Aanæs, Henrik; Sveinsson, J. R.; Nielsen, Allan Aasbjerg

    2008-01-01

    A method is proposed for pixel-level satellite image fusion derived directly from a model of the imaging sensor. By design, the proposed method is spectrally consistent. It is argued that the proposed method needs regularization, as is the case for any method for this problem. A framework for pixel...

  9. Will Rule based BPM obliterate Process Models?

    NARCIS (Netherlands)

    Joosten, S.; Joosten, H.J.M.

    2007-01-01

    Business rules can be used directly for controlling business processes, without reference to a business process model. In this paper we propose to use business rules to specify both business processes and the software that supports them. Business rules expressed in smart mathematical notations bring

  10. Automata-Based CSL Model Checking

    DEFF Research Database (Denmark)

    Zhang, Lijun; Jansen, David N.; Nielson, Flemming;

    2011-01-01

    For continuous-time Markov chains, the model-checking problem with respect to continuous-time stochastic logic (CSL) has been introduced and shown to be decidable by Aziz, Sanwal, Singhal and Brayton in 1996. The presented decision procedure, however, has exponential complexity. In this paper, we...

  11. Sparse-Based Modeling of Hyperspectral Data

    DEFF Research Database (Denmark)

    Calvini, Rosalba; Ulrici, Alessandro; Amigo Rubio, Jose Manuel

    2016-01-01

    One of the main issues of hyperspectral imaging data is to unravel the relevant, yet overlapped, huge amount of information contained in the spatial and spectral dimensions. When dealing with the application of multivariate models in such high-dimensional data, sparsity can improve...

  12. Recursive renormalization group theory based subgrid modeling

    Science.gov (United States)

    Zhou, YE

    1991-01-01

    Advancing the knowledge and understanding of turbulence theory is addressed. Specific problems to be addressed will include studies of subgrid models to understand the effects of unresolved small scale dynamics on the large scale motion which, if successful, might substantially reduce the number of degrees of freedom that need to be computed in turbulence simulation.

  13. An Optimization Model Based on Game Theory

    Directory of Open Access Journals (Sweden)

    Yang Shi

    2014-04-01

    Full Text Available Game theory has a wide range of applications in economics, but it is seldom used in the field of computer science, especially in optimization algorithms. In this paper, we integrate game-theoretic thinking into an optimization algorithm and propose a new optimization model which can be widely used in optimization processing. This optimization model has two variants, called "complete consistency" and "partial consistency"; partial consistency adds a disturbance strategy on top of complete consistency. When the model's consistency is satisfied, the Nash equilibrium of the optimization model is the global optimum, and when the consistency is not met, the presence of the perturbation strategy improves the applicability of the algorithm. Basic experiments suggest that this optimization model has broad applicability and good performance, and it gives a new idea for some intractable problems in the field of artificial intelligence.

  14. A Qualitative Acceleration Model Based on Intervals

    Directory of Open Access Journals (Sweden)

    Ester MARTINEZ-MARTIN

    2013-08-01

    Full Text Available On the way to autonomous service robots, spatial reasoning plays a main role since it properly deals with problems involving uncertainty. In particular, we are interested in knowing people's pose to avoid collisions. With that aim, in this paper, we present a qualitative acceleration model for robotic applications including representation, reasoning and a practical application.

  15. Controlling reuse in pattern-based model-to-model transformations

    OpenAIRE

    Guerra, Esther,; De Lara, Juan,; Orejas, Fernando

    2010-01-01

    Model-to-model transformation is a central activity in Model-Driven Engineering that consists of transforming models from a source to a target language. Pattern-based model-to-model transformation is our approach for specifying transformations in a declarative, relational and formal style. The approach relies on patterns describing allowed or forbidden relations between two models. These patterns are compiled into operational mechanisms to perform forward and backward transformations. Inspire...

  16. Kinetic data base for combustion modeling

    Energy Technology Data Exchange (ETDEWEB)

    Tsang, W.; Herron, J.T. [National Institute of Standards and Technology, Gaithersburg, MD (United States)

    1993-12-01

    The aim of this work is to develop a set of evaluated rate constants for use in the simulation of hydrocarbon combustion. The approach has been to begin with the small molecules and then introduce larger species with the various structural elements that can be found in all hydrocarbon fuels and decomposition products. Currently, the data base contains most of the species present in combustion systems with up to four carbon atoms. Thus, practically all the structural groupings found in aliphatic compounds have now been captured. The direction of future work is the addition of aromatic compounds to the data base.

  17. (Re)configuration based on model generation

    CERN Document Server

    Friedrich, Gerhard; Falkner, Andreas A; Haselböck, Alois; Schenner, Gottfried; Schreiner, Herwig; 10.4204/EPTCS.65.3

    2011-01-01

    Reconfiguration is an important activity for companies selling configurable products or services which have a long life time. However, identification of a set of required changes in a legacy configuration is a hard problem, since even small changes in the requirements might imply significant modifications. In this paper we show a solution based on answer set programming, which is a logic-based knowledge representation formalism well suited for a compact description of (re)configuration problems. Its applicability is demonstrated on simple abstractions of several real-world scenarios. The evaluation of our solution on a set of benchmark instances derived from commercial (re)configuration problems shows its practical applicability.

  18. Model-reduced gradient-based history matching

    NARCIS (Netherlands)

    Kaleta, M.P.

    2011-01-01

    Since the world's energy demand increases every year, the oil & gas industry makes a continuous effort to improve fossil fuel recovery. Physics-based petroleum reservoir modeling and closed-loop model-based reservoir management concept can play an important role here. In this concept measured data a

  19. Neural mass model-based tracking of anesthetic brain states

    NARCIS (Netherlands)

    Kuhlmann, Levin; Freestone, Dean R.; Manton, Jonathan H.; Heyse, Bjorn; Vereecke, Hugo E. M.; Lipping, Tarmo; Struys, Michel M. R. F.; Liley, David T. J.

    2016-01-01

    Neural mass model-based tracking of brain states from electroencephalographic signals holds the promise of simultaneously tracking brain states while inferring underlying physiological changes in various neuroscientific and clinical applications. Here, neural mass model-based tracking of brain state

  20. An Active Learning Exercise for Introducing Agent-Based Modeling

    Science.gov (United States)

    Pinder, Jonathan P.

    2013-01-01

    Recent developments in agent-based modeling as a method of systems analysis and optimization indicate that students in business analytics need an introduction to the terminology, concepts, and framework of agent-based modeling. This article presents an active learning exercise for MBA students in business analytics that demonstrates agent-based…

  1. A Granular Computing Model Based on Tolerance relation

    Institute of Scientific and Technical Information of China (English)

    WANG Guo-yin; HU Feng; HUANG Hai; WU Yu

    2005-01-01

    Granular computing is a new intelligent computing theory based on the partition of problem concepts. Processing incomplete information systems directly is an important problem in rough set theory. In this paper, a granular computing model based on a tolerance relation for processing incomplete information systems is developed. Furthermore, a criterion for attribute necessity is proposed in this model.

  2. Agent-based modelling of socio-technical systems

    CERN Document Server

    van Dam, Koen H; Lukszo, Zofia

    2012-01-01

    Here is a practical introduction to agent-based modelling of socio-technical systems, based on methodology developed at TU Delft, which has been deployed in a number of case studies. Offers theory, methods and practical steps for creating real-world models.

  3. Towards automatic model based controller design for reconfigurable plants

    DEFF Research Database (Denmark)

    Michelsen, Axel Gottlieb; Stoustrup, Jakob; Izadi-Zamanabadi, Roozbeh

    2008-01-01

    This paper introduces model-based Plug and Play Process Control, a novel concept for process control, which allows a model-based control system to be reconfigured when a sensor or an actuator is plugged into a controlled process. The work reported in this paper focuses on composing a monolithic m...

  4. Agent-based Modeling Methodology for Analyzing Weapons Systems

    Science.gov (United States)

    2015-03-26

    Figure 14: Simulation Study Methodology for the Weapon System Analysis (Metrics Definition and Data Collection). The analysis plan calls for... Agent-Based Modeling Methodology for Analyzing Weapons Systems, thesis by Casey D. Connors, Major, USA, presented to the Faculty of the Department of Operational Sciences.

  5. Software Reuse of Mobile Systems based on Modelling

    Directory of Open Access Journals (Sweden)

    Guo Ping

    2016-01-01

    Full Text Available This paper presents an architectural-style-based modelling approach for the architectural design and analysis of mobile systems. The approach is developed based on UML-like meta-models and graph transformation techniques to support sound methodological principles, formal analysis and refinement. The approach can support mobile system development.

  6. Group Contribution Based Process Flowsheet Synthesis, Design and Modelling

    DEFF Research Database (Denmark)

    Gani, Rafiqul; d'Anterroches, Loïc

    2004-01-01

    This paper presents a process-group-contribution method to model, simulate and synthesize a flowsheet. The process-group-based representation of a flowsheet together with a process "property" model are presented. The process-group-based synthesis method is developed on the basis of the computer...

  8. A data base for galaxy evolution modeling

    NARCIS (Netherlands)

    Leitherer, C; Alloin, D; FritzVonAlvensleben, U; Gallagher, JS; Huchra, JP; Matteucci, F; OConnell, RW; Beckman, JE; Bertelli, GP; Bica, E; Boisson, C; Bonatto, C; Bothun, GD; Bressan, A; Brodie, JP; Bruzual, G; Burstein, D; Buser, R; Caldwell, N; Casuso, E; Cervino, M; Charlot, S; Chavez, M; Chiosi, C; Christian, CA; Cuisinier, F; Dallier, R; deKoter, A; Delisle, S; Diaz, AI; Dopita, MA; Dorman, B; Fagotto, F; Fanelli, MN; Fioc, M; GarciaVargas, ML; Girardi, L; Goldader, JD; Hardy, E; Heckman, TM; Iglesias, J; Jablonka, P; Joly, M; Jones, L; Kurth, O; Lancon, A; Lejeune, T; Loxen, J; Maeder, A; Malagnini, ML; Marigo, P; MasHesse, JM; Meynet, G; Moller, CS; Molla, ML; Morossi, C; Nasi, E; Nichols, JS; Odegaard, KJR; Parker, JWM; Pastoriza, MG; Peletier, R; Robert, C; RoccaVolmerange, B; Schaerer, D; Schmidt, A; Schmitt, HR; Schommer, RA; Schmutz, W; Silva, L; Stasinska, G; Sutherland, RS; Tantalo, R; Traat, P; Vallenari, A; Vazdekis, A; Walborn, NR; Worthey, G

    1996-01-01

    This paper represents a collective effort to provide an extensive electronic data base useful for the interpretation of the spectra and evolution of galaxies. A broad variety of empirical and theoretical data is discussed here, and the data are made fully available in the AAS CD-ROM Series, Vol. 7.

  9. Geometric Feature Extraction and Model Reconstruction Based on Scattered Data

    Institute of Scientific and Technical Information of China (English)

    胡鑫; 习俊通; 金烨

    2004-01-01

    A method of 3D model reconstruction based on scattered point data in reverse engineering is presented here. The topological relationship of the scattered points was established first, then the data set was triangulated to reconstruct the mesh surface model. The curvatures of the cloud data were calculated based on the mesh surface, and the point data were segmented by an edge-based method. Each patch of data was fitted by a quadric or freeform surface, with the type of quadric surface decided automatically from its parameters; at last the whole CAD model was created (a least-squares sketch of the quadric-fitting step is given below). An example of a mouse model was employed to confirm the effect of the algorithm.
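    As a rough illustration of the quadric-fitting step, the sketch below fits a patch as z = ax² + bxy + cy² + dx + ey + f by linear least squares; this is an assumed formulation for illustration, not necessarily the paper's exact one.

```python
# Hedged sketch: least-squares fit of a quadric patch z = a*x^2 + b*x*y + c*y^2
# + d*x + e*y + f to scattered points; the sample data are synthetic.
import numpy as np

def fit_quadric(points):
    """points: (n, 3) array of x, y, z samples; returns (a, b, c, d, e, f)."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    A = np.column_stack([x**2, x*y, y**2, x, y, np.ones_like(x)])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coeffs

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    xy = rng.uniform(-1, 1, size=(200, 2))
    z = (0.5 * xy[:, 0]**2 - 0.2 * xy[:, 0] * xy[:, 1] + 0.1 * xy[:, 1]
         + 0.03 * rng.standard_normal(200))
    pts = np.column_stack([xy, z])
    print(np.round(fit_quadric(pts), 2))   # approximately [0.5, -0.2, 0, 0, 0.1, 0]
```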

  10. Model-based scenarios of Mediterranean droughts

    Directory of Open Access Journals (Sweden)

    M. Weiß

    2007-11-01

    Full Text Available This study examines the change in current 100-year hydrological drought frequencies in the Mediterranean in comparison to the 2070s as simulated by the global model WaterGAP. The analysis considers socio-economic and climate changes as indicated by the IPCC scenarios A2 and B2 and the global general circulation model ECHAM4. Under these conditions today's 100-year drought is estimated to occur 10 times more frequently in the future over a large part of the Northern Mediterranean while in North Africa, today's 100-year drought will occur less frequently. Water abstractions are shown to play a minor role in comparison to the impact of climate change, but can intensify the situation.

  11. Warehouse Optimization Model Based on Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Guofeng Qin

    2013-01-01

    Full Text Available This paper takes the Bao Steel logistics automated warehouse system as an example. The premise is to keep the center of gravity of the stored goods below half of the shelf height; as a result, the time spent getting goods from or putting goods on the shelf is reduced, and the distance between goods of the same kind is also reduced. A multiobjective optimization model is constructed and solved with a genetic algorithm, yielding a locally optimal solution. Before optimization, the average time to get or put goods is 4.52996 s, and the average distance between goods of the same kind is 2.35318 m. After optimization, the average time is 4.28859 s, and the average distance is 1.97366 m. From this analysis we can conclude that the model improves the efficiency of cargo storage.

  12. Nonlinear system modeling based on experimental data

    Energy Technology Data Exchange (ETDEWEB)

    PAEZ,THOMAS L.; HUNTER,NORMAN F.

    2000-02-02

    The canonical variate analysis technique is used in this investigation, along with a data transformation algorithm, to identify a system in a transform space. The transformation algorithm involves the preprocessing of measured excitation/response data with a zero-memory-nonlinear transform, specifically, the Rosenblatt transform. This transform approximately maps the measured excitation and response data from its own space into the space of uncorrelated, standard normal random variates. Following this transform, it is appropriate to model the excitation/response relation as linear since Gaussian inputs excite Gaussian responses in linear structures. The linear model is identified in the transform space using the canonical variate analysis approach, and system responses in the original space are predicted using inverse Rosenblatt transformation. An example is presented.
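    The idea of mapping measured data toward standard normal variates can be sketched for a single channel as below; this is only a marginal, rank-based version, whereas the full Rosenblatt transform conditions each variable on the preceding ones, so it is an illustrative simplification rather than the paper's procedure.

```python
# Hedged sketch: map one channel of data to approximately N(0, 1) via an
# empirical CDF followed by the normal quantile function.
import numpy as np
from scipy import stats

def to_standard_normal(x):
    """Map samples x to approximately standard normal via rank-based CDF."""
    ranks = stats.rankdata(x)                 # ranks 1..n
    u = ranks / (len(x) + 1.0)                # keep u strictly inside (0, 1)
    return stats.norm.ppf(u)

if __name__ == "__main__":
    x = np.random.default_rng(1).exponential(scale=2.0, size=1000)
    z = to_standard_normal(x)
    print(round(z.mean(), 3), round(z.std(), 3))   # near 0 and 1
```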

  13. Fujisaki Model Based Intonation Modeling for Korean TTS System

    Science.gov (United States)

    Kim, Byeongchang; Lee, Jinsik; Lee, Gary Geunbae

    One of the enduring problems in developing high-quality TTS (text-to-speech) systems is pitch contour generation. Taking language-specific knowledge into account, an adjusted Fujisaki model for a Korean TTS system is introduced along with refined machine learning features. The results of quantitative and qualitative evaluations show the validity of our system: the accuracy of the phrase command prediction is 0.8928; the correlations of the predicted amplitudes of a phrase command and an accent command are 0.6644 and 0.6002, respectively; and our method achieved "fair" naturalness (3.6) on a MOS scale for the generated F0 curves.

  14. Uncertainty Models for Knowledge-Based Systems

    Science.gov (United States)

    1991-08-01

  15. Internal Model Based Active Disturbance Rejection Control

    OpenAIRE

    Pan, Jinwen; Wang, Yong

    2016-01-01

    The basic active disturbance rejection control (BADRC) algorithm with only one order higher extended state observer (ESO) proves to be robust to both internal and external disturbances. An advantage of BADRC is that in many applications it can achieve high disturbance attenuation level without requiring a detailed model of the plant or disturbance. However, this can be regarded as a disadvantage when the disturbance characteristic is known since the BADRC algorithm cannot exploit such informa...

  16. Néron Models and Base Change

    DEFF Research Database (Denmark)

    Halle, Lars Halvard; Nicaise, Johannes

    with explicit examples. Néron models of abelian and semi-abelian varieties have become an indispensable tool in algebraic and arithmetic geometry since Néron introduced them in his seminal 1964 paper. Applications range from the theory of heights in Diophantine geometry to Hodge theory. We focus specifically...... of abelian varieties. The final chapter contains a list of challenging open questions. This book is aimed towards researchers with a background in algebraic and arithmetic geometry...

  17. Robust speech features representation based on computational auditory model

    Institute of Scientific and Technical Information of China (English)

    LU Xugang; JIA Chuan; DANG Jianwu

    2004-01-01

    A speech signal processing and feature extraction method based on a computational auditory model is proposed. The computational model is based on psychological and physiological knowledge and digital signal processing methods. For each stage of the hearing perception system, there is a corresponding computational model to simulate its function, and speech features are extracted at each stage at different levels. A further processing step for the primary auditory spectrum, based on lateral inhibition, is proposed to extract much more robust speech features. All these features can be regarded as internal representations of the speech stimulus in the hearing system. Robust speech recognition experiments are conducted to test the robustness of the features. Results show that the representations based on the proposed computational auditory model are robust representations of speech signals.

  18. Component-based event composition modeling for CPS

    Science.gov (United States)

    Yin, Zhonghai; Chu, Yanan

    2017-06-01

    In order to combine the event-driven model with component-based architecture design, this paper proposes a component-based event composition model to realize CPS event processing. Firstly, the formal representations of components and attribute-oriented events are defined. Each component consists of subcomponents and the corresponding event sets, and the attribute "type" is added to the attribute-oriented event definition to describe responsiveness to the component. Secondly, the component-based event composition model is constructed: a concept-lattice-based event algebra system is built to describe the relations between events, and the rules for drawing the Hasse diagram are discussed. Thirdly, as there are redundancies among composite events, two simplification methods are proposed. Finally, a communication-based train control system is simulated to verify the event composition model. Results show that the event composition model constructed here can express composite events correctly and effectively.

  19. A method to manage the model base in DSS

    Institute of Scientific and Technical Information of China (English)

    孙成双; 李桂君

    2004-01-01

    How to manage and use models in a DSS is an important subject. Generally, it costs a lot of money and time to develop a model base management system during DSS development, and most such systems are simple in function or cannot be used efficiently in practice. It is an effective, applicable, and economical choice to make use of the interfaces of professional software to develop a model base management system. This paper presents a method of using MATLAB, a well-known numerical computing package, as the development platform of a model base management system. The main functional framework of a MATLAB-based model base management system is discussed. Finally, its feasible application is illustrated in the field of construction projects.

  20. A Family of RBAC- Based Workflow Authorization Models

    Institute of Scientific and Technical Information of China (English)

    HONG Fan; XING Guang-lin

    2005-01-01

    A family of RBAC-based workflow authorization models, called RWAM, is proposed. RWAM consists of a basic model and other models constructed from the basic model. The basic model provides the notion of temporal permission, which means that a user can perform a certain operation on a task only for a time interval; this not only ensures that only authorized users can execute a task but also ensures that the authorization flow is synchronized with the workflow. The two advanced models of RWAM deal with role hierarchy and constraints, respectively. RWAM ranges from simple to complex and provides a general reference model for further research and development in this area.

  1. Weather forecasting based on hybrid neural model

    Science.gov (United States)

    Saba, Tanzila; Rehman, Amjad; AlGhamdi, Jarallah S.

    2017-02-01

    Making predictions about the weather has been a challenge throughout human history, and accurate meteorological guidance helps to foresee and handle problems in good time. Different strategies using various machine learning techniques have been investigated in reported forecasting systems, and weather remains a major challenge for machine data mining and inference. Accordingly, this paper presents a hybrid neural model (MLP and RBF) to enhance the accuracy of weather forecasting. The study concentrates on data representing weather forecasting for Saudi Arabia. The main input features employed to train the individual and hybrid neural networks include average dew point, minimum temperature, maximum temperature, mean temperature, average relative humidity, precipitation, normal wind speed, high wind speed and average cloudiness. The output layer is composed of two neurons representing rainy and dry weather. A trial-and-error approach is adopted to select an appropriate number of inputs to the hybrid neural network. Correlation coefficient, RMSE and scatter index are the standard yardsticks adopted for measuring forecast accuracy. Individually, the MLP forecasting results are better than those of the RBF; however, the proposed simplified hybrid neural model achieves better forecasting accuracy than both individual networks. Additionally, the results are better than those reported in the state of the art, using a simple neural structure that reduces training time and complexity.

  2. Characteristics of a Logistics-Based Business Model

    OpenAIRE

    Sandberg, Erik; Kihlén, Tobias; Abrahamsson, Mats

    2011-01-01

    In companies where excellence in logistics is decisive for the outperformance of competitors and logistics has an outspoken role for the strategy of the firm, there is present what we refer to here as a “logistics-based business model.” Based on a multiple case study of three Nordic retail companies, the purpose of this article is to explore the characteristics of such a logistics-based business model. As such, this research helps to provide structure to logistics-based business models and id...

  3. Improving satellite-based PM2.5 estimates in China using Gaussian processes modeling in a Bayesian hierarchical setting.

    Science.gov (United States)

    Yu, Wenxi; Liu, Yang; Ma, Zongwei; Bi, Jun

    2017-08-01

    Using satellite-based aerosol optical depth (AOD) measurements and statistical models to estimate ground-level PM2.5 is a promising way to fill in areas that are not covered by ground PM2.5 monitors. The statistical models used in previous studies are primarily Linear Mixed Effects (LME) and Geographically Weighted Regression (GWR) models. In this study, we developed a new regression model between PM2.5 and AOD using Gaussian processes in a Bayesian hierarchical setting. Gaussian processes model the stochastic nature of the spatial random effects, where the mean surface and the covariance function are specified. The spatial stochastic process is incorporated under the Bayesian hierarchical framework to explain the variation of PM2.5 concentrations together with other factors, such as AOD and spatial and non-spatial random effects. We evaluate the results of our model and compare them with those of other, conventional statistical models (GWR and LME) by within-sample model fitting and out-of-sample validation (cross validation, CV). The results show that our model achieves a CV result (R² = 0.81) reflecting higher accuracy than that of GWR and LME (0.74 and 0.48, respectively). Our results indicate that Gaussian process models have the potential to improve the accuracy of satellite-based PM2.5 estimates.
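    A minimal sketch of the Gaussian-process ingredient is given below, using scikit-learn on synthetic AOD and PM2.5 values; the study's Bayesian hierarchical model with spatial random effects is considerably richer and is not reproduced here.

```python
# Hedged sketch: plain Gaussian-process regression of PM2.5 on AOD with
# synthetic data, illustrating only the GP regression ingredient.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
aod = rng.uniform(0.1, 1.5, size=(200, 1))                 # synthetic AOD
pm25 = 15 + 40 * aod[:, 0] + rng.normal(0, 5, size=200)    # synthetic PM2.5

kernel = 1.0 * RBF(length_scale=0.5) + WhiteKernel(noise_level=5.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(aod, pm25)

aod_new = np.array([[0.3], [0.8], [1.2]])
mean, std = gp.predict(aod_new, return_std=True)
print(np.round(mean, 1), np.round(std, 1))                 # predictions with uncertainty
```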

  4. Solitonic Models Based on Quantum Groups and the Standard Model

    CERN Document Server

    Finkelstein, Robert J

    2010-01-01

    The idea that the elementary particles might have the symmetry of knots has had a long history. In any current formulation of this idea, however, the knot must be quantized. The present review is a summary of a small set of papers that began as an attempt to correlate the properties of quantized knots with the empirical properties of the elementary particles. As the ideas behind these papers have developed over a number of years the model has evolved, and this review is intended to present the model in its current form. The original picture of an elementary fermion as a solitonic knot of field, described by the trefoil representation of SUq(2), has expanded into its current form in which a knotted field is complementary to a composite structure composed of three or more preons that in turn are described by the fundamental representation of SLq(2). These complementary descriptions may be interpreted as describing single composite particles composed of three or more preons bound by a knotted field.

  5. [Model-based biofuels system analysis: a review].

    Science.gov (United States)

    Chang, Shiyan; Zhang, Xiliang; Zhao, Lili; Ou, Xunmin

    2011-03-01

    Model-based system analysis is an important tool for evaluating the potential and impacts of biofuels, and for drafting biofuels technology roadmaps and targets. The broad reach of the biofuels supply chain requires that biofuels system analyses span a range of disciplines, including agriculture/forestry, energy, economics, and the environment. Here we reviewed various models developed for or applied to modeling biofuels, and presented a critical analysis of Agriculture/Forestry System Models, Energy System Models, Integrated Assessment Models, Micro-level Cost, Energy and Emission Calculation Models, and Specific Macro-level Biofuel Models. We focused on the models' strengths, weaknesses, and applicability, facilitating the selection of a suitable type of model for specific issues. Such an analysis was a prerequisite for future biofuels system modeling, and represented a valuable resource for researchers and policy makers.

  6. The evolution of process-based hydrologic models

    NARCIS (Netherlands)

    Clark, Martyn P.; Bierkens, Marc F.P.; Samaniego, Luis; Woods, Ross A.; Uijlenhoet, Remko; Bennett, Katrina E.; Pauwels, Valentijn R.N.; Cai, Xitian; Wood, Andrew W.; Peters-Lidard, Christa D.

    2017-01-01

    The diversity in hydrologic models has historically led to great controversy on the "correct" approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. In this

  7. IDEF method-based simulation model design and development framework

    Directory of Open Access Journals (Sweden)

    Ki-Young Jeong

    2009-09-01

    Full Text Available The purpose of this study is to provide an IDEF method-based integrated framework for a business process simulation model to reduce the model development time by increasing the communication and knowledge reusability during a simulation project. In this framework, simulation requirements are collected by a function modeling method (IDEF0 and a process modeling method (IDEF3. Based on these requirements, a common data model is constructed using the IDEF1X method. From this reusable data model, multiple simulation models are automatically generated using a database-driven simulation model development approach. The framework is claimed to help both requirement collection and experimentation phases during a simulation project by improving system knowledge, model reusability, and maintainability through the systematic use of three descriptive IDEF methods and the features of the relational database technologies. A complex semiconductor fabrication case study was used as a testbed to evaluate and illustrate the concepts and the framework. Two different simulation software products were used to develop and control the semiconductor model from the same knowledge base. The case study empirically showed that this framework could help improve the simulation project processes by using IDEF-based descriptive models and the relational database technology. Authors also concluded that this framework could be easily applied to other analytical model generation by separating the logic from the data.

  8. Clone Detection for Graph-Based Model Transformation Languages

    DEFF Research Database (Denmark)

    Strüber, Daniel; Plöger, Jennifer; Acretoaie, Vlad

    2016-01-01

    has been proposed for programming and modeling languages; yet no specific ones have emerged for model transformation languages. In this paper, we explore clone detection for graph-based model transformation languages. We introduce potential use cases for such techniques in the context of constructive...

  9. Understanding Elementary Astronomy by Making Drawing-Based Models

    NARCIS (Netherlands)

    van Joolingen, W. R.; Aukes, Annika V A; Gijlers, H.; Bollen, L.

    2015-01-01

    Modeling is an important approach in the teaching and learning of science. In this study, we attempt to bring modeling within the reach of young children by creating the SimSketch modeling system, which is based on freehand drawings that can be turned into simulations. This system was used by 247 ch

  10. A DSM-based framework for integrated function modelling

    DEFF Research Database (Denmark)

    Eisenbart, Boris; Gericke, Kilian; Blessing, Lucienne T. M.

    2017-01-01

    an integrated function modelling framework, which specifically aims at relating between the different function modelling perspectives prominently addressed in different disciplines. It uses interlinked matrices based on the concept of DSM and MDM in order to facilitate cross-disciplinary modelling and analysis...

  11. Physiologically based kinetic modeling of the bioactivation of myristicin

    NARCIS (Netherlands)

    Al-Malahmeh, Amer J.; Al-Ajlouni, Abdelmajeed; Wesseling, Sebastiaan; Soffers, Ans E.M.F.; Al-Subeihi, A.; Kiwamoto, Reiko; Vervoort, Jacques; Rietjens, Ivonne M.C.M.

    2016-01-01

    The present study describes physiologically based kinetic (PBK) models for the alkenylbenzene myristicin that were developed by extension of the PBK models for the structurally related alkenylbenzene safrole in rat and human. The newly developed myristicin models revealed that the formation of th

  12. Model-Based Traffic Control for Sustainable Mobility

    NARCIS (Netherlands)

    Zegeye, S.K.

    2011-01-01

    Computationally efficient dynamic fuel consumption, emissions, and dispersion of emissions models are developed. Fast and practically feasible model-based controller is proposed. Using the developed models, the controller steers the traffic flow in such a way that a balanced trade-off between the t

  13. The social dimensions of system dynamics-based modelling

    NARCIS (Netherlands)

    Vriens, D.J.; Achterbergh, J.M.I.M.

    2006-01-01

    In this paper, the social dimension of system dynamics (SD)-based modelling is explored. Three manifestations of this dimension are identified: SD-models are made of social systems, they are built in social systems, and SD-models are built for social systems. The paper (1) explains the nature of

  14. A bond model for ribbed bars based on concrete confinement

    NARCIS (Netherlands)

    Den Uijl, J.A.; Bigaj, A.J.

    1996-01-01

    A new bond model for ribbed bars embedded in concrete has been developed. The model is based on the confining capacity of the concrete surrounding the bar. This confinement capacity is evaluated with the help of a thick-walled-cylinder model, with which the relation between the radial displacement

  16. An XML-based information model for archaeological pottery

    Institute of Scientific and Technical Information of China (English)

    LIU De-zhi; RAZDAN Anshuman; SIMON Arleyn; BAE Myungsoo

    2005-01-01

    An information model is defined to support sharing scientific information on the Web for archaeological pottery. Apart from non-shape information, such as age, material, etc., the model also consists of shape information and shape feature information. Shape information is collected by laser scanner and geometric modelling techniques, and feature information is generated from the shape information via feature extraction techniques. The model is used in an integrated storage, archival, and sketch-based query and retrieval system for 3D objects, Native American ceramic vessels. A novel aspect of the information model is that it is implemented entirely with XML and is designed for Web-based visual query and storage applications.

  17. On grey relation projection model based on projection pursuit

    Institute of Scientific and Technical Information of China (English)

    Wang Shuo; Yang Shanlin; Ma Xijun

    2008-01-01

    Multidimensional grey relation projection values can be synthesized into a one-dimensional projection value by using a projection pursuit model: the larger the projection value, the better the model, so the best model can be chosen from the model set according to its projection value (a small projection sketch is given below). Because projection pursuit modeling based on an accelerating genetic algorithm simplifies the implementation of the projection pursuit technique and overcomes both its computational complexity and the difficulty of implementing its program, a new method is obtained for choosing the best grey relation projection model based on the projection pursuit technique.
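    The projection step can be sketched as below: each alternative's grey relational coefficients are projected onto a unit direction vector. The optimization of that direction (by the accelerating genetic algorithm in the paper) is omitted, and the numbers are hypothetical.

```python
# Hedged sketch: one-dimensional projection values from multidimensional grey
# relational coefficients; the direction vector is fixed for illustration.
import numpy as np

def projection_values(R, a):
    """R: (n_alternatives, n_criteria) grey relational coefficients in [0, 1];
    a: direction vector, normalised to unit length before projecting."""
    a = np.asarray(a, dtype=float)
    a = a / np.linalg.norm(a)
    return R @ a

if __name__ == "__main__":
    R = np.array([[0.9, 0.6, 0.7],
                  [0.5, 0.8, 0.6],
                  [0.7, 0.7, 0.9]])
    print(np.round(projection_values(R, [1.0, 1.0, 1.0]), 3))   # larger is better
```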

  19. A VENSIM BASED ANALYSIS FOR SUPPLY CHAIN MODEL

    Directory of Open Access Journals (Sweden)

    Mohammad SHAMSUDDOHA

    2014-01-01

    Full Text Available The emphasis on the supply chain has increased in recent years in academic and industry circles. In this paper, a supply chain model is developed based on a case study of the poultry industry in the Vensim environment. System dynamics, supply chain, design science and the case method, under a positivist and quantitative paradigm, are studied to develop a simulation model. The objectives of this paper are to review the literature, develop a Vensim-based simulation supply chain model, and examine the model qualitatively and quantitatively. The model is also briefly discussed in relation to the forward, reverse and mainstream supply chains of the case.

  20. Online Knowledge-Based Model for Big Data Topic Extraction

    Science.gov (United States)

    Khan, Muhammad Taimoor; Durrani, Mehr; Khalid, Shehzad; Aziz, Furqan

    2016-01-01

    Lifelong machine learning (LML) models learn with experience, maintaining a knowledge base without user intervention. Unlike traditional single-domain models, they can easily scale up to explore big data. The existing LML models have high data dependency, consume more resources, and do not support streaming data. This paper proposes an online LML model (OAMC) to support streaming data with reduced data dependency. By engineering the knowledge base and introducing new knowledge features, the learning pattern of the model is improved for data arriving in pieces. OAMC improves accuracy, measured as topic coherence, by 7% for streaming data while reducing the processing cost by half. PMID:27195004

  1. Model based control of dynamic atomic force microscope

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chibum [Department of Mechanical System Design Engineering, Seoul National University of Science and Technology, Seoul 139-743 (Korea, Republic of); Salapaka, Srinivasa M., E-mail: salapaka@illinois.edu [Department of Mechanical Science and Engineering, University of Illinois at Urbana-Champaign, Urbana, Illinois 61801 (United States)

    2015-04-15

    A model-based robust control approach is proposed that significantly improves imaging bandwidth for dynamic mode atomic force microscopy. A model for cantilever oscillation amplitude and phase dynamics is derived and used for the control design. In particular, the control design is based on a linearized model and robust H∞ control theory. This design yields a significant improvement when compared to conventional proportional-integral designs and is verified by experiments.

  3. A Technology-based Model for Learning

    Directory of Open Access Journals (Sweden)

    Michael Williams

    2004-12-01

    Full Text Available The Math Emporium, opened in 1997, is an open 7000-square-meter facility with 550+ workstations arranged in an array of widely spaced hexagonal "pods", designed to support group work while maintaining an academic air. We operate it 24/7 with math support personnel in attendance 12 hours per day. Students have access to online course resources at all times, from anywhere. We have used this unique asset to transform traditional classroom-based courses into technology-based learning programs that have no class meetings at all. The structure of the program is very different from the conventional one, with a new set of expectations and motivations. The results include more effective students, substantial cost savings, economies of scale and scope, and a streamlined process for creating new online courses.

  4. A Case Based Learning Model in Therapeutics

    Directory of Open Access Journals (Sweden)

    Ângelo Jesus

    2012-01-01

    Full Text Available Nowadays, learning a Pharmaceutical Profession is an increasing challenge. Apart from traditional texts, lectures and self-guided individual learning, pharmaceutical educators are encouraged to find and implement ways to promote higher order thinking, collaborative learning and to increase students’ motivation. One way of achieving these objectives is to complement traditional learning methods with the development and implementation of Case Based Learning (CBL, supported in real life situations. Methods regarding real problems stand in contrast to a more traditional approach to learning and instruction. They promote learner-centered, small group, interactive learning experiences, instead of large group, didactic, teacher-centered instruction. Developing such a learning approach can be a challenge. In this sense, it becomes relevant to promote and share experiences already underway and by doing so, disseminate knowledge in this field. It is our goal with this text to share our experience in the design and implementation of a Case Based Approach to Therapeutics.

  5. Ray-Based Reflectance Model for Diffraction

    CERN Document Server

    Cuypers, Tom; Haber, Tom; Bekaert, Philippe; Raskar, Ramesh

    2011-01-01

    We present a novel method of simulating wave effects in graphics using ray-based renderers with a new function: the Wave BSDF (Bidirectional Scattering Distribution Function). Reflections from neighboring surface patches represented by local BSDFs are mutually independent. However, in many surfaces with wavelength-scale microstructures, interference and diffraction require a joint analysis of reflected wavefronts from neighboring patches. We demonstrate a simple method to compute the BSDF for the entire microstructure, which can be used independently for each patch. This allows us to use traditional ray-based rendering pipelines to synthesize wave effects of light and sound. We exploit the Wigner Distribution Function (WDF) to create transmissive, reflective, and emissive BSDFs for various diffraction phenomena in a physically accurate way. In contrast to previous methods for computing interference, we circumvent the need to explicitly keep track of the phase of the wave by using BSDFs that include positiv...

  6. Spherical Individual Cell-Based Models

    OpenAIRE

    Krinner, Axel

    2010-01-01

    Over the last decade a huge amount of experimental data on biological systems has been generated by modern high-throughput methods. Aided by bioinformatics, the '-omics' (genomics, transcriptomics, proteomics, metabolomics and interactomics) have listed, quantified and analyzed molecular components and interactions on all levels of cellular regulation. However, a comprehensive framework that does not only list but also links all those components is still largely missing. The biology-based but h...

  7. Modeling Morphogenesis in silico and in vitro: Towards Quantitative, Predictive, Cell-based Modeling

    NARCIS (Netherlands)

    R.M.H. Merks (Roeland); P. Koolwijk

    2009-01-01

    Cell-based, mathematical models help make sense of morphogenesis, i.e. cells organizing into shape and pattern, by capturing cell behavior in simple, purely descriptive models. Cell-based models then predict the tissue-level patterns the cells produce collectively. The first

  8. Constrained generalized predictive control of battery charging process based on a coupled thermoelectric model

    Science.gov (United States)

    Liu, Kailong; Li, Kang; Zhang, Cheng

    2017-04-01

    Battery temperature is a primary factor affecting the battery performance, and suitable battery temperature control in particular internal temperature control can not only guarantee battery safety but also improve its efficiency. This is however challenging as current controller designs for battery charging have no mechanisms to incorporate such information. This paper proposes a novel battery charging control strategy which applies the constrained generalized predictive control (GPC) to charge a LiFePO4 battery based on a newly developed coupled thermoelectric model. The control target primarily aims to maintain the battery cell internal temperature within a desirable range while delivering fast charging. To achieve this, the coupled thermoelectric model is firstly introduced to capture the battery behaviours in particular SOC and internal temperature which are not directly measurable in practice. Then a controlled auto-regressive integrated moving average (CARIMA) model whose parameters are identified by the recursive least squares (RLS) algorithm is developed as an online self-tuning predictive model for a GPC controller. Then the constrained generalized predictive controller is developed to control the charging current. Experiment results confirm the effectiveness of the proposed control strategy. Further, the best region of heat dissipation rate and proper internal temperature set-points are also investigated and analysed.
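    The recursive least squares (RLS) identification named above can be sketched generically as below for an ARX-type regression; the CARIMA structure and battery specifics are not reproduced, and the example system is hypothetical.

```python
# Hedged sketch: recursive least squares with a forgetting factor, identifying
# the parameters of a generic first-order ARX model y[k] = a*y[k-1] + b*u[k-1].
import numpy as np

def rls_step(theta, P, phi, y, lam=0.98):
    """One RLS update with forgetting factor lam; returns updated (theta, P)."""
    phi = phi.reshape(-1, 1)
    gain = P @ phi / (lam + (phi.T @ P @ phi).item())
    err = y - (phi.T @ theta).item()
    theta = theta + gain.ravel() * err
    P = (P - gain @ phi.T @ P) / lam
    return theta, P

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    true_theta = np.array([0.8, 0.3])          # y[k] = 0.8*y[k-1] + 0.3*u[k-1] + noise
    theta, P = np.zeros(2), np.eye(2) * 100.0
    y_prev, u_prev = 0.0, 0.0
    for _ in range(500):
        phi = np.array([y_prev, u_prev])
        y = float(true_theta @ phi) + 0.01 * rng.normal()
        theta, P = rls_step(theta, P, phi, y)
        y_prev, u_prev = y, rng.normal()
    print(np.round(theta, 3))                  # close to [0.8, 0.3]
```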

  9. Numerical simulation of base flow with hot base bleed for two jet models

    OpenAIRE

    Wen-jie Yu; Yong-gang Yu; Bin Ni

    2014-01-01

    In order to improve the benefits of base bleed in base flow field, the base flow with hot base bleed for two jet models is studied. Two-dimensional axisymmetric Navier–Stokes equations are computed by using a finite volume scheme. The base flow of a cylinder afterbody with base bleed is simulated. The simulation results are validated with the experimental data, and the experimental results are well reproduced. On this basis, the base flow fields with base bleed for a circular jet model and an...

  10. Identity-based encryption with wildcards in the standard model

    Institute of Scientific and Technical Information of China (English)

    MING Yang; SHEN Xiao-qin; WANG Yu-min

    2009-01-01

    In this article, based on Chatterjee and Sarkar's hierarchical identity-based encryption (HIBE), a novel identity-based encryption with wildcards (WIBE) scheme is proposed and proven secure in the standard model (without random oracles). The proposed scheme is proven secure assuming that the decisional Bilinear Diffie-Hellman (DBDH) problem is hard. Compared with the Wa-WIBE scheme, which is secure in the standard model, our scheme has shorter common parameters and ciphertext length.

  11. BeetleBase: the model organism database for Tribolium castaneum

    OpenAIRE

    Wang, Liangjiang; Wang, Suzhi; Li, Yonghua; Paradesi, Martin S. R.; Brown, Susan J

    2006-01-01

    BeetleBase () is an integrated resource for the Tribolium research community. The red flour beetle (Tribolium castaneum) is an important model organism for genetics, developmental biology, toxicology and comparative genomics, the genome of which has recently been sequenced. BeetleBase is constructed to integrate the genomic sequence data with information about genes, mutants, genetic markers, expressed sequence tags and publications. BeetleBase uses the Chado data model and software component...

  12. User Context Aware Base Station Power Flow Model

    OpenAIRE

    Walsh, Barbara; Farrell, Ronan

    2005-01-01

    At present the testing of power amplifiers within base station transmitters is limited to testing at component level as opposed to testing at the system level. While the detection of catastrophic failure is possible, that of performance degradation is not. This paper proposes a base station model with respect to transmitter output power with the aim of introducing system level monitoring of the power amplifier behaviour within the base station. Our model reflects the expe...

  13. A Knowledge Representation Model for Video—Based Animation

    Institute of Scientific and Technical Information of China (English)

    劳志强; 潘云鹤

    1998-01-01

    In this paper, a brief survey on knowledge-based animation techniques is given. Then a VideoStream-based Knowledge Representation Model (VSKRM) for Joint Objects is presented, which includes the knowledge representation of: Graphic Object, Action and VideoStream. Next, a general description of the UI framework of a system is given based on the VSKRM model. Finally, a conclusion is reached.

  14. Reduced model-based decision-making in schizophrenia.

    Science.gov (United States)

    Culbreth, Adam J; Westbrook, Andrew; Daw, Nathaniel D; Botvinick, Matthew; Barch, Deanna M

    2016-08-01

    Individuals with schizophrenia have a diminished ability to use reward history to adaptively guide behavior. However, tasks traditionally used to assess such deficits often rely on multiple cognitive and neural processes, leaving etiology unresolved. In the current study, we adopted recent computational formalisms of reinforcement learning to distinguish between model-based and model-free decision-making in hopes of specifying mechanisms associated with reinforcement-learning dysfunction in schizophrenia. Under this framework, decision-making is model-free to the extent that it relies solely on prior reward history, and model-based if it relies on prospective information such as motivational state, future consequences, and the likelihood of obtaining various outcomes. Model-based and model-free decision-making was assessed in 33 schizophrenia patients and 30 controls using a 2-stage 2-alternative forced choice task previously demonstrated to discern individual differences in reliance on the 2 forms of reinforcement-learning. We show that, compared with controls, schizophrenia patients demonstrate decreased reliance on model-based decision-making. Further, parameter estimates of model-based behavior correlate positively with IQ and working memory measures, suggesting that model-based deficits seen in schizophrenia may be partially explained by higher-order cognitive deficits. These findings demonstrate specific reinforcement-learning and decision-making deficits and thereby provide valuable insights for understanding disordered behavior in schizophrenia.

  15. Tsunami Propagation Models Based on First Principles

    Science.gov (United States)

    2012-11-21

    obstacle and strike land in the shadow regions. Since v ∝ √h according to Eq. (9), the velocity decreases nearer the coast as the depth decreases. The wave... Earth by the two locations is, from spherical trigonometry, θ = cos⁻¹(sin λs sin λd + cos λs cos λd cos(φs − φd)) (37). The linear... speed of propagation, bending of tsunamis around obstacles and depth of the ocean, among others. Two-dimensional models on flat and spherical ocean

  16. Piecewise Linear Model-Based Image Enhancement

    Directory of Open Access Journals (Sweden)

    Fabrizio Russo

    2004-09-01

    Full Text Available A novel technique for the sharpening of noisy images is presented. The proposed enhancement system adopts a simple piecewise linear (PWL function in order to sharpen the image edges and to reduce the noise. Such effects can easily be controlled by varying two parameters only. The noise sensitivity of the operator is further decreased by means of an additional filtering step, which resorts to a nonlinear model too. Results of computer simulations show that the proposed sharpening system is simple and effective. The application of the method to contrast enhancement of color images is also discussed.
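    One plausible reading of such a piecewise linear sharpening operator is sketched below: a Laplacian detail signal is passed through a PWL curve with a dead zone and a saturation level (standing in for the two control parameters) and added back. This is an assumed construction for illustration; the paper's exact operator may differ.

```python
# Hedged sketch: PWL-shaped unsharp correction. Small detail values (noise)
# fall in the dead zone and are suppressed; large ones saturate.
import numpy as np

def pwl(d, t_low, t_high):
    """Dead zone below t_low, linear ramp up to t_high, saturation beyond."""
    sign, mag = np.sign(d), np.abs(d)
    out = np.where(mag < t_low, 0.0,
                   np.where(mag < t_high, mag - t_low, t_high - t_low))
    return sign * out

def sharpen(img, t_low=2.0, t_high=20.0, gain=1.0):
    img = img.astype(float)
    lap = 4 * img - (np.roll(img, 1, 0) + np.roll(img, -1, 0)
                     + np.roll(img, 1, 1) + np.roll(img, -1, 1))
    return np.clip(img + gain * pwl(lap, t_low, t_high), 0, 255)

if __name__ == "__main__":
    img = np.zeros((32, 32))
    img[:, 16:] = 200.0                          # a vertical step edge
    out = sharpen(img, t_low=2.0, t_high=50.0, gain=0.5)
    print(img[16, 14:18], out[16, 14:18])        # bright side of the edge is boosted
```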

  17. A Requirements Analysis Model Based on QFD

    Institute of Scientific and Technical Information of China (English)

    TANG Zhi-wei; Nelson K.H.Tang

    2004-01-01

    The enterprise resource planning (ERP) system has emerged to offer an integrated IT solution, and more and more enterprises are adopting this system and regarding it as an important innovation. However, there is already evidence of high failure risk in ERP project implementation, and one major reason is poor analysis of the requirements for system implementation. In this paper, the importance of requirements analysis for ERP project implementation is highlighted, and a requirements analysis model applying quality function deployment (QFD) is presented, which supports conducting requirements analysis for ERP projects.

  18. Graph-based modelling in engineering

    CERN Document Server

    Rysiński, Jacek

    2017-01-01

    This book presents versatile, modern and creative applications of graph theory in mechanical engineering, robotics and computer networks. Topics related to mechanical engineering include e.g. machine and mechanism science, mechatronics, robotics, gearing and transmissions, design theory and production processes. The graphs treated are simple graphs, weighted and mixed graphs, bond graphs, Petri nets, logical trees etc. The authors represent several countries in Europe and America, and their contributions show how different, elegant, useful and fruitful the utilization of graphs in modelling of engineering systems can be.

  19. Online constrained model-based reinforcement learning

    CSIR Research Space (South Africa)

    Van Niekerk, B

    2017-08-01

    Full Text Available and forth in order to develop enough momentum to swing the pendulum up. The state of the system, x = [x, v, θ, ω], is described by the position of the cart, the velocity of the cart, the angle of the pendulum and its angular velocity. A horizontal force u... by assuming a no-slip model. The state space is described by the vector [x, y, v, φ], where x and y denote the position of the car, v the lon- gitudinal velocity of the car, and φ the car’s orientation. The control signal consists of the PWM duty cycle...

  20. A community-based framework for aquatic ecosystem models

    DEFF Research Database (Denmark)

    Trolle, Didde; Hamilton, D. P.; Hipsey, M. R.;

    2012-01-01

    Here, we communicate a point of departure in the development of aquatic ecosystem models, namely a new community-based framework, which supports an enhanced and transparent union between the collective expertise that exists in the communities of traditional ecologists and model developers. Through... a literature survey, we document the growing importance of numerical aquatic ecosystem models while also noting the difficulties, up until now, of the aquatic scientific community to make significant advances in these models during the past two decades. Through a common forum for aquatic ecosystem modellers we... aim to (i) advance collaboration within the aquatic ecosystem modelling community, (ii) enable increased use of models for research, policy and ecosystem-based management, (iii) facilitate a collective framework using common (standardised) code to ensure that model development is incremental, (iv......

  1. Phase Correlation Based Iris Image Registration Model

    Institute of Scientific and Technical Information of China (English)

    Jun-Zhou Huang; Tie-Niu Tan; Li Ma; Yun-Hong Wang

    2005-01-01

    Iris recognition is one of the most reliable personal identification methods. In iris recognition systems, image registration is an important component, and accurately registering iris images leads to a higher recognition rate. This paper proposes a phase-correlation-based method for iris image registration with sub-pixel accuracy. Compared with existing methods, it is insensitive to image intensity and can compensate, to a certain extent, for the non-linear iris deformation caused by pupil movement. Experimental results show that the proposed algorithm has an encouraging performance.
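    Basic phase correlation for a purely translational shift can be sketched as below; the paper's sub-pixel refinement and iris-specific handling are omitted, so this is only the core idea.

```python
# Hedged sketch: estimate an integer translational shift between two images
# from the normalised cross-power spectrum (phase correlation).
import numpy as np

def phase_correlation_shift(img_a, img_b):
    """Return the integer (dy, dx) shift that maps img_b onto img_a."""
    Fa, Fb = np.fft.fft2(img_a), np.fft.fft2(img_b)
    cross = Fa * np.conj(Fb)
    cross /= np.abs(cross) + 1e-12              # keep phase information only
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = img_a.shape
    if dy > h // 2:                              # map wrapped indices to signed shifts
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    base = rng.random((64, 64))
    shifted = np.roll(np.roll(base, 5, axis=0), -3, axis=1)
    print(phase_correlation_shift(shifted, base))   # expect (5, -3)
```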

  2. Spatial analysis and modelling based on activities

    CSIR Research Space (South Africa)

    Conradie, Dirk CU

    2010-01-01

    Full Text Available system. At the moment two fundamentally different types of approaches are used with regard to microscopic traffic simulation, i.e. cellular automata and agent-based approaches. A cellular automaton consists of a regular grid of cells, each in one of a... finite number of states. The grid can have any finite number of dimensions. Time is also discrete and the state of a particular cell at time t is a function of the states of a finite number of cells (called its neighbourhood) at time t – 1...

  3. A SPICE model for a phase-change memory cell based on the analytical conductivity model

    Science.gov (United States)

    Yiqun, Wei; Xinnan, Lin; Yuchao, Jia; Xiaole, Cui; Jin, He; Xing, Zhang

    2012-11-01

    For the peripheral circuit design of phase-change memory, it is necessary to have an accurate compact model of a phase-change memory cell for circuit simulation. Compared with existing models, the model presented in this work includes an analytical conductivity model, deduced by means of carrier transport theory instead of being fitted to measurements. In addition, this model includes an analytical temperature model based on the 1D heat-transfer equation and a phase-transition dynamic model based on the JMA equation to simulate the phase-change process. The above models for phase-change memory are integrated using the Verilog-A language, and results show that this model is able to simulate the I-V characteristics and the programming characteristics accurately.

  4. [Facial pain- a rare cause. Impacted lower third molars causing primarily "unclear" facial pain: a case report].

    Science.gov (United States)

    Gander, Thomas; Dagassan-Berndt, Dorothea; Mascolo, Luana; Kruse, Astrid L; Grätz, Klaus W; Lübbers, Heinz-Theo

    2013-01-01

    Orofacial pain often causes special difficulties to patients and dentists. Numerous differential diagnoses require the utilization of a coordinated diagnostic concept. Often, multiple causes lead to the need for a complex treatment plan. Impacted third molars are a potential cause of a variety of complications. Caries, pulp necrosis, and periapical infection are some of the infrequent causes of such pain. The presented case shows just such a constellation, resulting in primarily "unclear" orofacial pain. A diagnostic sequence generally leads to the correct diagnosis and thereby allows for fast and effective therapy. This shows how important structured diagnostics are, especially in cases of "unclear" pain.

  5. Pervasive Computing Location-aware Model Based on Ontology

    Institute of Scientific and Technical Information of China (English)

    PU Fang; CAI Hai-bin; CAO Qi-ying; SUN Dao-qing; LI Tong

    2008-01-01

    In order to integrate heterogeneous location-aware systems into pervasive computing environment, a novel pervasive computing location-aware model based on ontology is presented. A location-aware model ontology (LMO) is constructed. The location-aware model has the capabilities of sharing knowledge, reasoning and adjusting the usage policies of services dynamically through a unified semantic location manner. At last, the work process of our proposed location-aware model is explained by an application scenario.

  6. Markovian Building Blocks for Individual-Based Modelling

    OpenAIRE

    Nilsson, Lars Anders Fredrik; Nielsen, Bo Friis; Thygesen, Uffe Høgsbro; Beyer, Jan

    2007-01-01

    The present thesis consists of a summary report, four research articles, one technical report and one manuscript. The subject of the thesis is individual-based stochastic models. The summary report is composed of three parts and a brief history of some basic models in population biology. This history is included in order to provide a reader that has no previous exposure to models in population biology with a sufficient background to understand some of the biological models that are mentioned ...

  7. Research on Modeling of Hydropneumatic Suspension Based on Fractional Order

    OpenAIRE

    Junwei Zhang; Sizhong Chen; Yuzhuang Zhao; Jianbo Feng; Chang Liu; Ying Fan

    2015-01-01

    With such excellent properties as nonlinear stiffness, adjustable vehicle height, and good vibration resistance, hydropneumatic suspension (HS) has been increasingly applied to heavy vehicles and engineering vehicles. Traditional modeling methods are still confined to simple models that do not take many factors into consideration. A hydropneumatic suspension model based on fractional order (HSM-FO) is built, exploiting the advantage of fractional order (FO) in viscoelastic material modeling considerin...

  8. Weibull Parameters Estimation Based on Physics of Failure Model

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2012-01-01

    Reliability estimation procedures are discussed for the example of fatigue development in solder joints using a physics of failure model. The accumulated damage is estimated based on a physics of failure model, the Rainflow counting algorithm and the Miner’s rule. A threshold model is used...... distribution. Methods from structural reliability analysis are used to model the uncertainties and to assess the reliability for fatigue failure. Maximum Likelihood and Least Square estimation techniques are used to estimate fatigue life distribution parameters....
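    As a generic illustration of the maximum likelihood step (not the physics-of-failure damage model itself), a two-parameter Weibull distribution can be fitted to simulated fatigue lives with SciPy; the numbers are synthetic.

```python
# Hedged sketch: maximum-likelihood fit of a two-parameter Weibull distribution
# to simulated fatigue lives (cycles to failure), location fixed at zero.
from scipy import stats

lives = stats.weibull_min.rvs(c=2.2, scale=1.5e5, size=300, random_state=4)

shape, loc, scale = stats.weibull_min.fit(lives, floc=0)
print(f"shape k = {shape:.2f}, scale lambda = {scale:.3g}")   # near 2.2 and 1.5e5
```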

  9. UML statechart based rigorous modeling of real-time system

    Institute of Scientific and Technical Information of China (English)

    LAI Ming-zhi; YOU Jin-yuan

    2005-01-01

    Rigorous modeling can ensure correctness and enable verification of models at reduced cost in embedded real-time system development, and software methods are needed for such rigorous modeling. PVS is a formal method with precisely defined syntax and semantics, and a system modeled by a PVS specification can be verified by tools. Combining the widely used UML with PVS, this paper provides a novel modeling and verification approach for embedded real-time systems. In this approach, we provide 1) a time-extended UML statechart for modeling the dynamic behavior of an embedded real-time system; 2) an approach to capture timed-automata-based semantics from a timed statechart; and 3) an algorithm to generate a finite state model, expressed as a PVS specification, for model checking. The benefits of our approach include flexibility and user friendliness in modeling, extendability in formalization and verification content, and better performance. Time constraints are modeled and verified, which is a highlight of this paper.

  10. Business model for sensor-based fall recognition systems.

    Science.gov (United States)

    Fachinger, Uwe; Schöpke, Birte

    2014-01-01

    AAL systems require, in addition to sophisticated and reliable technology, adequate business models for their launch and sustainable establishment. This paper presents the basic features of alternative business models for a sensor-based fall recognition system which was developed within the context of the "Lower Saxony Research Network Design of Environments for Ageing" (GAL). The models were developed parallel to the R&D process with successive adaptation and concretization. An overview of the basic features (i.e. nine partial models) of the business model is given and the mutual exclusive alternatives for each partial model are presented. The partial models are interconnected and the combinations of compatible alternatives lead to consistent alternative business models. However, in the current state, only initial concepts of alternative business models can be deduced. The next step will be to gather additional information to work out more detailed models.

  11. Physiologically-based pharmacokinetic simulation modelling.

    Science.gov (United States)

    Grass, George M; Sinko, Patrick J

    2002-03-31

    Drug selection is now widely viewed as an important and relatively new, yet largely unsolved, bottleneck in the drug discovery and development process. In order to achieve an efficient selection process, high quality, rapid, predictive and correlative ADME models are required in order for them to be confidently used to support critical financial decisions. Systems that can be relied upon to accurately predict performance in humans have not existed, and decisions have been made using tools whose capabilities could not be verified until candidates went to clinical trial, leading to the high failure rates historically observed. However, with the sequencing of the human genome, advances in proteomics, the anticipation of the identification of a vastly greater number of potential targets for drug discovery, and the potential of pharmacogenomics to require individualized evaluation of drug kinetics as well as drug effects, there is an urgent need for rapid and accurately computed pharmacokinetic properties.

  12. Distributed Maximality based CTL Model Checking

    Directory of Open Access Journals (Sweden)

    Djamel Eddine Saidouni

    2010-05-01

    Full Text Available In this paper we investigate an approach for performing distributed CTL model checking on a network of workstations using Kleene three-valued logic. The state space is partitioned among the network nodes, and each incomplete state space is represented as a Maximality-based Labeled Transition System (MLTS), which is able to express true concurrency. The same algorithm is executed in parallel on each node: for a given property on an incomplete MLTS, it computes the set of states that satisfy the property, that fail it, or that are assigned the third value. The third value means that it is unknown whether the property is true or false, because the partial state space lacks the information needed for a precise answer concerning the complete state space. To solve this problem, each node exchanges the information needed to conclude the result for the complete state space. An experimental version of the algorithm is currently being implemented using the functional programming language Erlang.
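
    A minimal sketch of the Kleene three-valued connectives underlying such a distributed check, assuming None stands for the "unknown" verdict a node returns when its partial state space is insufficient; this illustrates only the logic, not the MLTS partitioning or the Erlang implementation.

```python
# Kleene three-valued logic: True, False, and None for "unknown".
# Only the connectives are shown, not the distributed MLTS algorithm itself.

def k_not(a):
    return None if a is None else (not a)

def k_and(a, b):
    if a is False or b is False:
        return False
    if a is True and b is True:
        return True
    return None  # unknown

def k_or(a, b):
    if a is True or b is True:
        return True
    if a is False and b is False:
        return False
    return None  # unknown

# A node that only sees part of the state space may report "unknown";
# combining results from several partitions can still yield a definite verdict.
partial_results = [None, True, None]   # hypothetical per-partition verdicts for one property
print(k_or(k_or(partial_results[0], partial_results[1]), partial_results[2]))  # True
```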

  13. Repetition-based Interactive Facade Modeling

    KAUST Repository

    AlHalawani, Sawsan

    2012-07-01

    Modeling and reconstruction of urban environments has gained researchers' attention over the past few years. It spans a variety of directions across multiple disciplines such as image processing, computer graphics, and computer vision, as well as architecture, geoscience, and remote sensing. Having a virtual counterpart of our real cities is attractive for purposes such as entertainment, engineering, and government applications, among many others. In this thesis, we address the problem of processing a single facade image to acquire useful information that can be used to manipulate the facade and generate variations of facade images, which can later be used for texturing buildings. Typical facade structures exhibit a rectilinear distribution wherein windows and other elements are organized in a grid of horizontal and vertical repetitions of similar patterns. In the first part of this thesis, we propose an efficient algorithm that exploits information obtained from a single image to identify the distribution grid of the dominant elements, i.e., windows. This detection method is initially assisted by the user marking the dominant window, followed by an automatic process that identifies its repeated instances, which are used to define the structure grid. Given the distribution grid, we allow the user to interactively manipulate the facade by adding, deleting, resizing, or repositioning windows in order to generate new facade structures. This interactive facade utility is valuable for creating facade variations and generating new textures for building models. Ultimately, there is a wide range of interesting interaction possibilities to be explored.
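
    A minimal sketch of the repetition-detection idea, assuming a grayscale NumPy image and a user-marked bounding box: brute-force normalized cross-correlation locates candidate repeated instances of the marked window. The interfaces and threshold are hypothetical, not the thesis pipeline.

```python
import numpy as np

def ncc_map(image, template):
    """Normalized cross-correlation of a template over an image (brute force)."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum()) + 1e-9
    H, W = image.shape
    scores = np.zeros((H - th + 1, W - tw + 1))
    for y in range(H - th + 1):
        for x in range(W - tw + 1):
            patch = image[y:y + th, x:x + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm + 1e-9
            scores[y, x] = (p * t).sum() / denom
    return scores

def detect_repetitions(image, box, threshold=0.8):
    """box = (y, x, h, w) of the user-marked dominant window (hypothetical interface)."""
    y, x, h, w = box
    scores = ncc_map(image, image[y:y + h, x:x + w])
    ys, xs = np.where(scores >= threshold)
    return list(zip(ys.tolist(), xs.tolist()))  # candidate top-left corners of the grid
```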

  14. Alternative ways of using field-based estimates to calibrate ecosystem models and their implications for carbon cycle studies

    Science.gov (United States)

    He, Yujie; Zhuang, Qianlai; McGuire, David; Liu, Yaling; Chen, Min

    2013-01-01

    Model-data fusion is a process in which field observations are used to constrain model parameters. How observations are used to constrain parameters has a direct impact on the carbon cycle dynamics simulated by ecosystem models. In this study, we present an evaluation of several options for the use of observations in modeling regional carbon dynamics and explore the implications of those options. We calibrated the Terrestrial Ecosystem Model on a hierarchy of three vegetation classification levels for the Alaskan boreal forest: species level, plant-functional-type (PFT) level, and biome level, and we examined the differences in simulated carbon dynamics. Species-specific field-based estimates were used directly to parameterize the model for species-level simulations, while weighted averages based on species percent cover were used to generate estimates for PFT- and biome-level model parameterization. We found that calibrated key ecosystem process parameters differed substantially among species and overlapped for species that are categorized into different PFTs. Our analysis of parameter sets suggests that the PFT-level parameterizations primarily reflected the dominant species and that functional information for some species was lost from the PFT-level parameterizations. The biome-level parameterization was primarily representative of the needleleaf PFT and lost information on broadleaf species or PFT function. Our results indicate that PFT-level simulations may be representative of the performance of species-level simulations, while biome-level simulations may result in biased estimates. Improved theoretical and empirical justifications for grouping species into PFTs or biomes are needed to adequately represent the dynamics of ecosystem functioning and structure.
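
    A minimal sketch of the cover-weighted aggregation described above, in which species-level parameter estimates are combined into PFT- or biome-level values; the species names, cover fractions, and parameter values are hypothetical.

```python
# Cover-weighted aggregation of species-level parameters to coarser levels.
# All names and numbers are hypothetical illustrations, not study values.

def cover_weighted_parameter(species_params, cover):
    """species_params: {species: value}; cover: {species: percent cover}."""
    total = sum(cover[s] for s in species_params)
    return sum(species_params[s] * cover[s] for s in species_params) / total

param_species = {"black_spruce": 310.0, "white_spruce": 280.0, "paper_birch": 450.0}
percent_cover = {"black_spruce": 55.0, "white_spruce": 30.0, "paper_birch": 15.0}

pft_value = cover_weighted_parameter(
    {s: param_species[s] for s in ("black_spruce", "white_spruce")},  # needleleaf PFT only
    percent_cover,
)
biome_value = cover_weighted_parameter(param_species, percent_cover)
print(pft_value, biome_value)  # the biome value is pulled toward the dominant species
```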

  15. Intelligent Transportation and Evacuation Planning A Modeling-Based Approach

    CERN Document Server

    Naser, Arab

    2012-01-01

    Intelligent Transportation and Evacuation Planning: A Modeling-Based Approach provides a new paradigm for evacuation planning strategies and techniques. Recently, evacuation planning and modeling have increasingly attracted interest among researchers as well as government officials. This interest stems from the recent catastrophic hurricanes and weather-related events that occurred in the southeastern United States (Hurricanes Katrina and Rita). The evacuation methods that were in place before and during the hurricanes did not work well and resulted in thousands of deaths. This book offers insights into the methods and techniques that allow for implementing mathematical-based, simulation-based, and integrated optimization and simulation-based engineering approaches for evacuation planning. This book also: Comprehensively discusses the application of mathematical models for evacuation and intelligent transportation modeling Covers advanced methodologies in evacuation modeling and planning Discusses principles a...

  16. Viscoelastic Parameter Model of Magnetorheological Elastomers Based on Abel Dashpot

    Directory of Open Access Journals (Sweden)

    Fei Guo

    2014-04-01

    Full Text Available In this paper, a parametric constitutive model based on the Abel dashpot is established in a simple form with clear physical meaning to derive the expression for the dynamic mechanical modulus of MREs. In consideration of the compressive stress on MREs in shear mechanical property experiments and in vibration damper applications, some improvements are also made to the particle chain model based on the coupled field. In addition, in order to verify the accuracy of the overall model, five groups of MRE samples based on silicone rubber with different volume fractions are prepared, and an MCR51 rheometer is used to conduct dynamic mechanical property experiments with frequency and magnetic field sweeps. Finally, the experimental results indicate that the established model fits the laboratory data well; namely, the relationship between the dynamic modulus of MREs and changes in frequency and magnetic field is well described by the model.
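
    A minimal sketch of the frequency dependence contributed by an Abel dashpot (springpot): for a constitutive law sigma = c * D^alpha(epsilon), the complex modulus is G*(omega) = c * (i*omega)^alpha, which gives the storage and loss moduli computed below. The parameter values are hypothetical, not fitted MRE data.

```python
import numpy as np

def springpot_moduli(omega, c, alpha):
    """Storage modulus G' and loss modulus G'' of an Abel dashpot (springpot),
    from G*(omega) = c * (i*omega)^alpha. c and alpha are hypothetical here."""
    g_storage = c * omega ** alpha * np.cos(alpha * np.pi / 2.0)
    g_loss = c * omega ** alpha * np.sin(alpha * np.pi / 2.0)
    return g_storage, g_loss

omega = np.logspace(-1, 3, 5)                 # angular frequency sweep, rad/s
g1, g2 = springpot_moduli(omega, c=2.0e4, alpha=0.35)
print(np.round(g1, 1))                        # both moduli rise as omega**alpha
print(np.round(g2, 1))
```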

  17. Evaluating Emulation-based Models of Distributed Computing Systems.

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Stephen T.

    2017-10-01

    Emulation-based models of distributed computing systems are collections of virtual machines, virtual networks, and other emulation components configured to stand in for operational systems when performing experimental science, training, analysis of design alternatives, test and evaluation, or idea generation. As with any tool, we should carefully evaluate whether our uses of emulation-based models are appropriate and justified. Otherwise, we run the risk of using a model incorrectly and creating meaningless results. The various uses of emulation-based models each have their own goals and deserve thoughtful evaluation. In this paper, we enumerate some of these uses and describe approaches one can take to build an evidence-based case that a use of an emulation-based model is credible. Predictive uses of emulation-based models, where we expect a model to tell us something true about the real world, set the bar especially high, and the principal evaluation method, called validation, is commensurately rigorous. We spend the majority of our time describing and demonstrating the validation of a simple predictive model using a well-established methodology inherited from decades of development in the computational science and engineering community.

  18. Discrete Element Simulation of Asphalt Mastics Based on Burgers Model

    Institute of Scientific and Technical Information of China (English)

    LIU Yu; FENG Shi-rong; HU Xia-guang

    2007-01-01

    In order to investigate the viscoelastic performance of asphalt mastics, a micro-mechanical model for asphalt mastics was built by applying the Burgers model to discrete element simulation and constructing a Burgers contact model. A numerical simulation of creep tests was then conducted, and the results from the simulation were compared with the analytical solution of the Burgers model. The comparison showed that the two results agreed well with each other, suggesting that a discrete element model based on the Burgers model can be employed in numerical simulations of asphalt mastics.
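
    A minimal sketch of the analytical creep response of a Burgers model (a Maxwell element in series with a Kelvin-Voigt element), the kind of closed-form reference a discrete element simulation can be compared against; the parameter values are hypothetical.

```python
import math

def burgers_creep_strain(t, sigma0, E1, eta1, E2, eta2):
    """Strain under constant stress sigma0 for a Burgers model:
    eps(t) = sigma0 * [ 1/E1 + t/eta1 + (1/E2) * (1 - exp(-E2*t/eta2)) ].
    All parameter values passed below are hypothetical."""
    compliance = 1.0 / E1 + t / eta1 + (1.0 / E2) * (1.0 - math.exp(-E2 * t / eta2))
    return sigma0 * compliance

for t in (0.0, 10.0, 100.0, 1000.0):  # seconds
    eps = burgers_creep_strain(t, sigma0=0.1, E1=50.0, eta1=5.0e4, E2=20.0, eta2=2.0e3)
    print(f"t = {t:7.1f} s   strain = {eps:.5f}")
```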

  19. INTRUSION DETECTION BASED ON THE SECOND-ORDER STOCHASTIC MODEL

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    This paper presents a new method based on a second-order stochastic model for computer intrusion detection. The results show that the performance of the second-order stochastic model is better than that of a first-order stochastic model. In this study, different window sizes are also used to test the performance of the model. The detection results show that the second-order stochastic model is not as sensitive to the window size as the first-order stochastic model and models from previous research; the detection results for window sizes 6 and 10 are the same.
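
    A minimal sketch of a second-order stochastic model over event sequences, assuming the probability of each event is conditioned on the two preceding events and that low-likelihood windows are flagged; the event alphabet, training trace, and scoring floor are hypothetical, not from the paper.

```python
from collections import defaultdict

def train_second_order(trace):
    """Estimate P(next event | two preceding events) from a training trace."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b, c in zip(trace, trace[1:], trace[2:]):
        counts[(a, b)][c] += 1
    probs = {}
    for ctx, nxt in counts.items():
        total = sum(nxt.values())
        probs[ctx] = {e: n / total for e, n in nxt.items()}
    return probs

def window_score(window, probs, floor=1e-6):
    """Average transition probability of a window; low scores suggest an intrusion."""
    ps = [probs.get((a, b), {}).get(c, floor)
          for a, b, c in zip(window, window[1:], window[2:])]
    return sum(ps) / len(ps)

normal = list("openreadreadwriteclose" * 50)       # hypothetical normal event trace
probs = train_second_order(normal)
print(window_score(list("openread"), probs))        # high: seen during training
print(window_score(list("zzzzzzzz"), probs))        # near the floor: anomalous
```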

  20. SDG and qualitative trend based model multiple scale validation

    Science.gov (United States)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods have weak completeness, are carried out at a single scale, and depend on human experience. A multiple-scale validation method based on SDG (Signed Directed Graph) models and qualitative trends is proposed. First, the SDG model is built and qualitative trends are added to it. Complete testing scenarios are then produced by positive inference. The multiple-scale validation is carried out by comparing the testing scenarios with the outputs of the simulation model at different scales. Finally, the effectiveness of the method is demonstrated by validating a reactor model.
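
    A minimal sketch of positive inference on a signed directed graph, in which a disturbance sign at a root variable is propagated along signed edges to produce a qualitative testing scenario; the variables and edge signs are hypothetical, not the reactor model from the paper.

```python
from collections import deque

def propagate(edges, root, root_sign):
    """edges: {source: [(target, +1 or -1), ...]}; returns predicted signs per variable."""
    signs = {root: root_sign}
    queue = deque([root])
    while queue:
        node = queue.popleft()
        for target, edge_sign in edges.get(node, []):
            if target not in signs:            # keep the first (shortest-path) prediction
                signs[target] = signs[node] * edge_sign
                queue.append(target)
    return signs

# Hypothetical SDG fragment, not the paper's reactor model.
sdg = {
    "feed_rate":   [("level", +1)],
    "level":       [("pressure", +1)],
    "cooling":     [("temperature", -1)],
    "temperature": [("pressure", +1)],
}
print(propagate(sdg, "feed_rate", +1))  # qualitative scenario: level +, pressure +
```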