WorldWideScience

Sample records for automated model abstraction

  1. Abstract Delta Modeling

    OpenAIRE

    Dave Clarke; Michiel Helvensteijn; Ina Schaefer

    2011-01-01

    Delta modeling is an approach to facilitate automated product derivation for software product lines. It is based on a set of deltas specifying modifications that are incrementally applied to a core product. The applicability of deltas depends on feature-dependent conditions. This paper presents abstract delta modeling, which explores delta modeling from an abstract, algebraic perspective. Compared to previous work, we take a more flexible approach with respect to conflicts between modificatio...
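Concretely, the core-plus-deltas scheme can be sketched in a few lines; the dict-based product and lambda-based deltas below are invented for illustration and do not reflect the paper's algebraic treatment (in particular its handling of conflicts between modifications):

```python
# Hypothetical sketch of delta modeling: feature-conditioned deltas are
# applied incrementally to a core product. All names are illustrative.

def derive_product(core, deltas, features):
    """Apply each delta whose feature condition holds, in order."""
    product = dict(core)
    for condition, modification in deltas:
        if condition(features):
            modification(product)
    return product

# Core product: a tiny editor with two basic operations.
core = {"open": "basic open", "save": "basic save"}

# Deltas as (applicability condition, modification) pairs.
deltas = [
    (lambda f: "encryption" in f,
     lambda p: p.update({"save": "encrypted save"})),
    (lambda f: "printing" in f,
     lambda p: p.update({"print": "send to printer"})),
]

product = derive_product(core, deltas, {"encryption"})
# Only the encryption delta applies; the core's save operation is replaced.
```

Product derivation thus stays automatic: selecting a feature set selects which deltas fire against the core.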

  2. Automated Assume-Guarantee Reasoning by Abstraction Refinement

    Science.gov (United States)

Pasareanu, Corina S.; Giannakopoulou, Dimitra

    2008-01-01

    Current automated approaches for compositional model checking in the assume-guarantee style are based on learning of assumptions as deterministic automata. We propose an alternative approach based on abstraction refinement. Our new method computes the assumptions for the assume-guarantee rules as conservative and not necessarily deterministic abstractions of some of the components, and refines those abstractions using counter-examples obtained from model checking them together with the other components. Our approach also exploits the alphabets of the interfaces between components and performs iterative refinement of those alphabets as well as of the abstractions. We show experimentally that our preliminary implementation of the proposed alternative achieves similar or better performance than a previous learning-based implementation.
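The refine-on-spurious-counterexample loop can be illustrated with a toy existential abstraction of a single transition system; the partition representation and splitting strategy below are invented for illustration, not the authors' assume-guarantee implementation:

```python
# Toy CEGAR loop: a conservative (over-approximating) abstraction is
# checked; if it reaches an error state the concrete system cannot,
# the abstraction is refined by splitting a partition block.

def reach(trans, init):
    """Set of states reachable from init under the transition map."""
    seen, stack = set(init), list(init)
    while stack:
        for t in trans.get(stack.pop(), ()):
            if t not in seen:
                seen.add(t)
                stack.append(t)
    return seen

def lift(trans, block):
    """Existential abstraction: block-level transitions over-approximate."""
    atrans = {}
    for s, ts in trans.items():
        for t in ts:
            atrans.setdefault(block[s], set()).add(block[t])
    return atrans

# Concrete component: 0 -> 1 -> 2; error state 3 is unreachable.
trans = {0: {1}, 1: {2}}
blocks = [{0, 2}, {1, 3}]          # initial coarse partition

while True:
    block = {s: i for i, b in enumerate(blocks) for s in b}
    abstract_bad = block[3] in reach(lift(trans, block), {block[0]})
    concrete_bad = 3 in reach(trans, {0})
    if abstract_bad and not concrete_bad:
        # Spurious counterexample: isolate the error state in its own block.
        blocks = [b - {3} for b in blocks if b - {3}] + [{3}]
    else:
        break
# After one refinement, abstraction and concrete model agree:
# state 3 is unreachable in both.
```

The key property mirrored here is conservativeness: the abstract model never misses a concrete error, so refinement only ever removes spurious behavior.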

  3. Teaching for Abstraction: A Model

    Science.gov (United States)

    White, Paul; Mitchelmore, Michael C.

    2010-01-01

    This article outlines a theoretical model for teaching elementary mathematical concepts that we have developed over the past 10 years. We begin with general ideas about the abstraction process and differentiate between "abstract-general" and "abstract-apart" concepts. A 4-phase model of teaching, called Teaching for Abstraction, is then proposed…

  4. Abstract Models of Probability

    Science.gov (United States)

    Maximov, V. M.

    2001-12-01

Probability theory presents a mathematical formalization of intuitive ideas of independent events and of probability as a measure of randomness. It is based on axioms 1-5 of A.N. Kolmogorov [1] and their generalizations [2]. Different formalized refinements have been proposed for such notions as events, independence, random value, etc. [2,3], whereas the measure of randomness, i.e. numbers from [0,1], remained unchanged. To be precise, we mention some attempts to generalize probability theory with negative probabilities [4]. From another side, physicists tried to use negative and even complex values of probability to explain some paradoxes in quantum mechanics [5,6,7]. Only recently, the necessity of formalizing quantum mechanics and its foundations [8] led to the construction of p-adic probabilities [9,10,11], which essentially extended our concept of probability and randomness. Therefore, a natural question arises: how to describe algebraic structures whose elements can be used as a measure of randomness. As a consequence, a necessity arises to define the types of randomness corresponding to every such algebraic structure. Possibly, this leads to another concept of randomness, different in nature from the combinatorial-metric conception of Kolmogorov. Apparently, a discrepancy between the real type of randomness of some experimental data and the model of randomness used for data processing leads to paradoxes [12]. An algebraic structure whose elements can be used to estimate randomness will be called a probability set Φ. Naturally, the elements of Φ are the probabilities.
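For reference, the axiomatic baseline that these generalizations relax is the standard Kolmogorov axiomatization, here in its usual condensed three-axiom form for a probability space $(\Omega, \mathcal{F}, P)$ rather than the original numbering 1-5:

```latex
\begin{align*}
&P(A) \ge 0 \quad \text{for all } A \in \mathcal{F}, \\
&P(\Omega) = 1, \\
&P\Big(\bigcup_{i=1}^{\infty} A_i\Big) = \sum_{i=1}^{\infty} P(A_i)
  \quad \text{for pairwise disjoint } A_i \in \mathcal{F}.
\end{align*}
```

The generalizations discussed in the abstract keep the event structure but replace the codomain $[0,1]$ of $P$ by a more general algebraic structure $\Phi$.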

  5. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides a historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.

  6. Replication and Abstraction: Symmetry in Automated Formal Verification

    Directory of Open Access Journals (Sweden)

    Thomas Wahl

    2010-04-01

    Full Text Available This article surveys fundamental and applied aspects of symmetry in system models, and of symmetry reduction methods used to counter state explosion in model checking, an automated formal verification technique. While covering the research field broadly, we particularly emphasize recent progress in applying the technique to realistic systems, including tools that promise to elevate the scope of symmetry reduction to large-scale program verification. The article targets researchers and engineers interested in formal verification of concurrent systems.
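The basic idea of symmetry reduction — exploring one representative per orbit of states under process permutations — can be sketched as follows; the fully symmetric system and sorted-tuple canonicalization are a textbook special case, not tied to any particular tool surveyed:

```python
# Sketch of symmetry reduction for a system of identical processes:
# states differing only by a permutation of process indices are
# collapsed to a canonical representative (the sorted tuple).

from itertools import product

def canonical(state):
    """Representative of a state's orbit under full process symmetry."""
    return tuple(sorted(state))

# All states of 3 identical processes, each in one of 3 local states.
full = set(product(range(3), repeat=3))          # 27 states
quotient = {canonical(s) for s in full}          # multisets of size 3: 10

# The quotient structure has 10 representatives instead of 27 states;
# the savings grow factorially with the number of processes.
```

Model checking then proceeds on the quotient structure, which is behaviorally equivalent for symmetric properties — this is the state-explosion countermeasure the survey discusses.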

  7. Automated Student Model Improvement

    Science.gov (United States)

    Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.

    2012-01-01

    Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…
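Learning Factors Analysis searches over variants of a logistic student model; the sketch below shows the commonly cited additive-factor form (student proficiency plus per-skill difficulty and learning terms), with invented parameter values — a simplified reading, not the authors' exact algorithm:

```python
# Hedged sketch of an additive-factor logistic student model:
# log-odds(correct) = student proficiency
#                     + sum over exercised skills of (difficulty
#                     + learning rate * prior practice opportunities).

import math

def p_correct(theta, skills, beta, gamma, practice):
    """skills: the knowledge components the item exercises (Q-matrix row)."""
    logit = theta + sum(beta[k] + gamma[k] * practice[k] for k in skills)
    return 1 / (1 + math.exp(-logit))

beta = {"slope": -1.0, "intercept": -0.5}     # skill difficulties (invented)
gamma = {"slope": 0.3, "intercept": 0.2}      # learning rates (invented)

p0 = p_correct(0.5, {"slope"}, beta, gamma, {"slope": 0})
p5 = p_correct(0.5, {"slope"}, beta, gamma, {"slope": 5})
# With practice, the predicted success probability rises (p5 > p0).
```

Improving a student model in this framework amounts to changing which skills an item is mapped to and refitting; the repository data supply the practice counts and outcomes.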

  8. Automated spatial and thematic generalization using a context transformation model: integrating steering parameters, classification and aggregation hierarchies, reduction factors, and topological structures for multiple abstractions.

    NARCIS (Netherlands)

    Richardson, D.E.

    1993-01-01

This dissertation presents a model for spatial and thematic digital generalization. To do so, the development of digital generalization over the last thirty years is first reviewed. The approach to generalization taken in this research differs from other existing works as it tackles the task from a da…

  9. Engineering Abstractions in Model Checking and Testing

    OpenAIRE

    Achenbach, Michael; Ostermann, Klaus

    2009-01-01

    Abstractions are used in model checking to tackle problems like state space explosion or modeling of IO. The application of these abstractions in real software development processes, however, lacks engineering support. This is one reason why model checking is not widely used in practice yet and testing is still state of the art in falsification. We show how user-defined abstractions can be integrated into a Java PathFinder setting with tools like AspectJ or Javassist and discuss implicati...

  10. An Abstraction Theory for Qualitative Models of Biological Systems

    CERN Document Server

Banks, Richard (DOI: 10.4204/EPTCS.40.3)

    2010-01-01

    Multi-valued network models are an important qualitative modelling approach used widely by the biological community. In this paper we consider developing an abstraction theory for multi-valued network models that allows the state space of a model to be reduced while preserving key properties of the model. This is important as it aids the analysis and comparison of multi-valued networks and in particular, helps address the well-known problem of state space explosion associated with such analysis. We also consider developing techniques for efficiently identifying abstractions and so provide a basis for the automation of this task. We illustrate the theory and techniques developed by investigating the identification of abstractions for two published MVN models of the lysis-lysogeny switch in the bacteriophage lambda.
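A minimal illustration of such a state-space-reducing abstraction, assuming an invented two-variable network and a simple merge of activity levels (not the authors' theory):

```python
# Sketch of a value abstraction for a multi-valued network: each
# variable's levels {0,1,2} are mapped to {0,1} by merging levels 0
# and 1, and transitions are lifted existentially so the abstract
# model over-approximates the concrete one. Example data invented.

def abstract_value(v):
    return 0 if v <= 1 else 1        # merge levels 0 and 1 into "low"

def abstract_state(state):
    return tuple(abstract_value(v) for v in state)

def lift(transitions):
    """Existential lift: an abstract transition exists iff some concrete one does."""
    return {(abstract_state(s), abstract_state(t)) for s, t in transitions}

# Tiny 2-variable multi-valued network with levels 0..2.
concrete = {((0, 0), (1, 0)), ((1, 0), (0, 1)), ((2, 1), (2, 2))}
abstracted = lift(concrete)
# Three concrete transitions collapse to two abstract ones: the first
# two both become ((0,0),(0,0)), the third becomes ((1,0),(1,1)).
```

Reducing the number of levels per variable shrinks the state space multiplicatively, which is the lever against the state-explosion problem the paper addresses; the theory's job is to say which property-preserving merges are sound.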

  11. Engineering Abstractions in Model Checking and Testing

    DEFF Research Database (Denmark)

    Achenbach, Michael; Ostermann, Klaus

    2009-01-01

Abstractions are used in model checking to tackle problems like state space explosion or modeling of IO. The application of these abstractions in real software development processes, however, lacks engineering support. This is one reason why model checking is not widely used in practice yet and testing is still state of the art in falsification. We show how user-defined abstractions can be integrated into a Java PathFinder setting with tools like AspectJ or Javassist and discuss implications of remaining weaknesses of these tools. We believe that a principled engineering approach to designing…

  12. SATURATED ZONE FLOW AND TRANSPORT MODEL ABSTRACTION

    Energy Technology Data Exchange (ETDEWEB)

    B.W. ARNOLD

    2004-10-27

The purpose of the saturated zone (SZ) flow and transport model abstraction task is to provide radionuclide-transport simulation results for use in the total system performance assessment (TSPA) for license application (LA) calculations. This task includes assessment of uncertainty in parameters that pertain to both groundwater flow and radionuclide transport in the models used for this purpose. This model report documents the following: (1) The SZ transport abstraction model, which consists of a set of radionuclide breakthrough curves at the accessible environment for use in the TSPA-LA simulations of radionuclide releases into the biosphere. These radionuclide breakthrough curves contain information on radionuclide-transport times through the SZ. (2) The SZ one-dimensional (1-D) transport model, which is incorporated in the TSPA-LA model to simulate the transport, decay, and ingrowth of radionuclide decay chains in the SZ. (3) The analysis of uncertainty in groundwater-flow and radionuclide-transport input parameters for the SZ transport abstraction model and the SZ 1-D transport model. (4) The analysis of the background concentration of alpha-emitting species in the groundwater of the SZ.
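As an illustration of the decay-and-ingrowth arithmetic such a 1-D transport model must track (not the report's actual model or parameter values), the two-member Bateman solution for a parent-daughter chain is:

```python
# Two-member decay chain parent -> daughter with decay constants
# lam1, lam2 and no initial daughter inventory (Bateman solution).
# All numbers below are invented for illustration.

import math

def bateman_two(n1_0, lam1, lam2, t):
    """Return (N1(t), N2(t)) for N1(0) = n1_0, N2(0) = 0."""
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = (n1_0 * lam1 / (lam2 - lam1)
          * (math.exp(-lam1 * t) - math.exp(-lam2 * t)))
    return n1, n2

n1, n2 = bateman_two(1000.0, lam1=0.01, lam2=0.05, t=30.0)
# The parent decays away while the daughter grows in and then decays.
```

Longer chains extend this with one exponential term per ancestor; coupling such ingrowth terms to advective-dispersive transport is what the SZ 1-D transport model does inside the TSPA-LA simulations.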

  13. SATURATED ZONE FLOW AND TRANSPORT MODEL ABSTRACTION

    International Nuclear Information System (INIS)

The purpose of the saturated zone (SZ) flow and transport model abstraction task is to provide radionuclide-transport simulation results for use in the total system performance assessment (TSPA) for license application (LA) calculations. This task includes assessment of uncertainty in parameters that pertain to both groundwater flow and radionuclide transport in the models used for this purpose. This model report documents the following: (1) The SZ transport abstraction model, which consists of a set of radionuclide breakthrough curves at the accessible environment for use in the TSPA-LA simulations of radionuclide releases into the biosphere. These radionuclide breakthrough curves contain information on radionuclide-transport times through the SZ. (2) The SZ one-dimensional (1-D) transport model, which is incorporated in the TSPA-LA model to simulate the transport, decay, and ingrowth of radionuclide decay chains in the SZ. (3) The analysis of uncertainty in groundwater-flow and radionuclide-transport input parameters for the SZ transport abstraction model and the SZ 1-D transport model. (4) The analysis of the background concentration of alpha-emitting species in the groundwater of the SZ.

  14. Abstract polymer models with general pair interactions

    CERN Document Server

    Procacci, Aldo

    2007-01-01

    A convergence criterion of cluster expansion is presented in the case of an abstract polymer system with general pair interactions (i.e. not necessarily hard core or repulsive). As a concrete example, the low temperature disordered phase of the BEG model with infinite range interactions, decaying polynomially as $1/r^{d+\\lambda}$ with $\\lambda>0$, is studied.

  15. Solicited abstract: Global hydrological modeling and models

    Science.gov (United States)

    Xu, Chong-Yu

    2010-05-01

The origins of rainfall-runoff modeling in the broad sense can be found in the middle of the 19th century, arising in response to three types of engineering problems: (1) urban sewer design, (2) land reclamation drainage systems design, and (3) reservoir spillway design. Since then numerous empirical, conceptual and physically-based models have been developed, including event-based models using the unit hydrograph concept, Nash's linear reservoir model, the HBV model, TOPMODEL, the SHE model, etc. From the late 1980s, the evolution of global and continental-scale hydrology has placed new demands on hydrologic modellers. The macro-scale (global and regional scale) hydrological models were developed on the basis of the following motivations (Arnell, 1999). First, for a variety of operational and planning purposes, water resource managers responsible for large regions need to estimate the spatial variability of resources over large areas, at a spatial resolution finer than can be provided by observed data alone. Second, hydrologists and water managers are interested in the effects of land-use and climate variability and change over a large geographic domain. Third, there is an increasing need to use hydrologic models as a base to estimate point and non-point sources of pollution loading to streams. Fourth, hydrologists and atmospheric modellers have perceived weaknesses in the representation of hydrological processes in regional and global climate models, and developed global hydrological models to overcome them. Considerable progress in the development and application of global hydrological models has been achieved to date; however, large uncertainties still exist concerning model structure (including large-scale flow routing), parameterization, input data, etc.
This presentation will focus on global hydrological models, and the discussion includes (1) types of global hydrological models, (2) the procedure of global hydrological model development…

  16. An Abstract Model of Historical Processes

    CERN Document Server

    Poulshock, Michael

    2016-01-01

    A game theoretic model is presented which attempts to simulate, at a very abstract level, power struggles in the social world. In the model, agents can benefit or harm each other, to varying degrees and with differing levels of influence. The agents play a dynamic, noncooperative, perfect information game where the goal is to maximize payoffs based on positional utility and intertemporal preference, while being constrained by social inertia. Agents use the power they have in order to get more of it, both in an absolute and relative sense. More research is needed to assess the model's empirical validity.

  17. Abstract Action Potential Models for Toxin Recognition

    OpenAIRE

    Peterson, James; Khan, Taufiquar

    2005-01-01

    In this paper, we present a robust methodology using mathematical pattern recognition schemes to detect and classify events in action potentials for recognizing toxins in biological cells. We focus on event detection in action potential via abstraction of information content into a low dimensional feature vector within the constrained computational environment of a biosensor. We use generated families of action potentials from a classic Hodgkin–Huxley model to verify our methodology and build...
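A hedged sketch of the abstraction step — collapsing a simulated trace into a low-dimensional feature vector — using a synthetic damped-sine "action potential" and invented features (peak, time-to-peak, threshold-crossing count) rather than the paper's actual scheme or a full Hodgkin-Huxley simulation:

```python
# Reduce a voltage trace to a small feature vector suitable for a
# constrained biosensor environment. Features and data are invented.

import math

def features(trace, dt, threshold=0.0):
    """Return (peak amplitude, time of peak, upward threshold crossings)."""
    peak = max(trace)
    t_peak = trace.index(peak) * dt
    # Count upward threshold crossings as candidate spike events.
    spikes = sum(1 for a, b in zip(trace, trace[1:]) if a < threshold <= b)
    return (peak, t_peak, spikes)

# Synthetic stand-in for an action-potential train: a damped sine burst.
dt = 0.1
trace = [math.exp(-0.05 * i) * math.sin(0.5 * i) for i in range(200)]
vec = features(trace, dt)
```

Pattern recognition (for toxin classification) then operates on vectors like `vec` instead of the raw trace, which is the dimensionality reduction the abstract describes.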

  18. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

The Western Theories of War Ethics and Contemporary Controversies, Li Xiaodong, U Ruijing (4). [Abstract] In the field of international relations, war ethics is a concept with a distinct Western ideological color. Due to factors of history and reality, the in…

  19. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

A Preliminary Inquiry into the Intellectual Origins of Li Dazhao's My Idea of Marxism. Abstract: By translingual-textual comparison, this paper attempts to make a preliminary inquiry into the intellectual origins of Li Dazhao's My Idea of Marxism, suggesting that Li's article, instead of being "a complete copy" of the Japanese scholar…

  20. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

[Abstract] Interaction between China and the international system has been a highlighted issue drawing a great deal of attention all over the world. It has been approached from a structural point of view and in terms of a conflicting pair of self and other, which is the prevailing ontological perspective of IR studies. Contrary to it, processual…

  1. Abstract

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

Cognitive Structure of Scientific Theory in the Scientist-Philosopher's Eyes; Two Theories of Scientific Abstraction Centered on Practices; Many-worlds Interpretation in Quantum Measurement and Its Meaning; Scientific Instrument: Paradigm Shift from Instrumentalism to Realism and Phenomenology

  2. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

China's Dual-identity Dilemma and Its Countermeasures, Li Shaojun (4). [Abstract] The international system, as the overall structure for interactions among actors, is the environment and stage for the implementation of China's foreign policy. In this system, identity is a fundamental factor determining China's international position and interests, and how to achieve them. China has long stressed that it is a "developing country,"…

  3. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

Abstract: The ethylene plant at SINOPEC Shanghai Petrochemical Company Limited ranked in the middle among SINOPEC subsidiaries in terms of ethylene and propylene yields, technical-economic indicators and so on, and its performance ranking went no further in the chemical sector. By means of feedstock optimization, steam optimization, and energy saving and consumption reduction, the company enhanced its competitiveness in the market and improved its efficiency. In addition, some ideas were put forward on performance improvement of the ethylene plant in the future.

  4. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

[Abstract] The global resurgence of religion and the return of religion from the so-called "Westphalia Exile" to the central stage of international relations have significantly transformed the viewpoints of both media and academia toward the role of religion in IR, and the challenges posed by religion to contemporary international relations are often described as entirely subversive. The author argues that as a second-tier factor in most countries' foreign policies and international affairs,…

  5. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

(1) Lenin's "Unity of Three Dialectics": Notes on Philosophy in the Dual Contexts of Science of Logic and The Capital, Sun Zhengyu (4). Lenin's dialectics in Notes on Philosophy is essentially a unity of materialistic logic, dialectics and epistemology that arose from the interaction between Hegel's Science of Logic and Marx's The Capital. Due to a lack of understanding of Lenin's "unity of three dialectics," people tend to mistake his dialectics for the meeting of the two extremes of the "sum total of living instances" and "abstract methods,"…

  6. ABSTRACT

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

Based on Marx's Economic and Philosophical Manuscripts of 1844. HE Jian-jin (Philosophy Department, Fujian Provincial Committee Party School, Fuzhou, Fujian 350012, China). Abstract: Socialism with Chinese characteristics has a close relationship with the return and growth of capital in China. To implement the scientific concept of development, we must confront the problem of scientifically controlling capital. In the Economic and Philosophical Manuscripts of 1844, Marx criticized three old ways of philosophical thinking about capital: object-oriented thinking, intuitive thinking, and purely spiritual abstract thinking, and he established his own unique understanding of capital, namely understanding capital from human perceptual and practical activities. Contemporary Chinese society faces the problems of underdevelopment and abnormal development, and the three concurrent heterogeneous problems of the pre-modern, modern and postmodern. In order to implement the scientific concept of development, we must reject any abstract affirmation or negation of the modern basic principles guided by capital, and oppose both the theory that capital is eternal and the theory that capital is evil. Key words: socialism with Chinese characteristics; capital; national economics; scientific concept of development

  7. Automating Data Abstraction in a Quality Improvement Platform for Surgical and Interventional Procedures

    Science.gov (United States)

    Yetisgen, Meliha; Klassen, Prescott; Tarczy-Hornoch, Peter

    2014-01-01

Objective: This paper describes a text processing system designed to automate the manual data abstraction process in a quality improvement (QI) program. The Surgical Care and Outcomes Assessment Program (SCOAP) is a clinician-led, statewide performance benchmarking QI platform for surgical and interventional procedures. The data elements abstracted as part of this program cover a wide range of clinical information, from patient medical history to details of surgical interventions. Methods: Statistical and rule-based extractors were developed to automatically abstract data elements. A preprocessing pipeline was created to chunk free-text notes into sections, sentences, and tokens. The information extracted in this preprocessing step was used by the statistical and rule-based extractors as features. Findings: Performance results for 25 extractors (14 statistical, 11 rule-based) are presented. The average f1-scores for the 11 rule-based extractors and the 14 statistical extractors are 0.785 (min=0.576, max=0.931, std-dev=0.113) and 0.812 (min=0.571, max=0.993, std-dev=0.135) respectively. Discussion: Our error analysis revealed that most extraction errors were due either to imbalance in the data set or to the way the gold standard had been created. Conclusion: As future work, more experiments will be conducted with a more comprehensive data set from multiple institutions contributing to the QI project. PMID:25848598
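The reported scores are the usual f1 = 2PR/(P+R); the snippet below shows the metric plus a toy rule-based extractor in the spirit described (the regex, the ASA-class data element, and the counts are invented for illustration, not taken from SCOAP):

```python
# Toy rule-based extractor and the f1 metric used to evaluate it.

import re

def extract_asa_class(note):
    """Hypothetical rule: pull an ASA physical-status value from a note."""
    m = re.search(r"\bASA\s*(?:class\s*)?([1-5IV]+)\b", note, re.IGNORECASE)
    return m.group(1) if m else None

def f1(tp, fp, fn):
    """Harmonic mean of precision and recall."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# e.g. 40 true positives, 8 false positives, 12 false negatives:
score = f1(40, 8, 12)   # precision ~0.833, recall ~0.769, f1 = 0.8
```

Statistical extractors are evaluated the same way; the paper's averages (0.785 and 0.812) are means of such per-element f1 values.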

  8. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

Spiritual Construction and Its Ideological Sources in Contemporary China. The spiritual construction in contemporary China is an important ideological task proposed by the historical practice of China. Modernized development often entails the meaning of entering into "modern civilization." Nevertheless, an abstract understanding of this civilization has covered up its essential stipulation and historical nature. China has pursued its development on a different historical prerequisite from the West, and therefore only partially belongs to modern capitalist modernization. The practical prospects of Chinese development imply a transformation and remodeling of the general lifestyle, life attitudes and values, which inevitably calls for a new form of philosophy. The ideological sources for this new philosophy are Chinese philosophy, Western philosophy and Marxist philosophy; a creative integration of them may point to a potentially new type of civilization.

  9. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

On the Construction of Water Conservancy during the 1930s in Hubei, Yu Tao (1). Abstract: Two extra-large floods in the 1930s drew the National Government's attention and prompted introspection. After those disasters, the government made some progress through a series of measures, such as repairing dikes, completing water conservancy institutions and enacting regulations, in order to strengthen water conservancy construction in Hubei. The government treated water conservancy construction as a complex system project and so gave it relatively comprehensive consideration. It reflects the advance of modern government to mobilize local people and bring such social power into unified planning. However, the limitations of the government's policy implementation weakened the effect of water conservancy construction. Keywords: flood; government; water conservancy construction; effect

  10. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

The Relation between Individuals and Work Units in State-Owned Enterprises in the Economic Transition Period: Changes and Their Influences. Abstract: As a representation of the extinction of the work unit system, dramatic changes have taken place in the relation between individuals and work units in state-owned enterprises. Among these changes are the radical change in the way the work unit stimulates and controls its employees, the extinction of the previous system supported by "work unit people", and a tense relation between employees and the work unit caused by the enterprise's over-pursuit of performance. These changes result in such problems as grievous inequality, violation of personal interests, lack of a mechanism for employees' voices and their low sense of belonging, which has brought unprecedented challenges for business administration and corporate culture development in China. Keywords: danwei/work unit; stimulate and control; relation between individuals and work units; work unit people

  11. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

On Rousseau's Equality and Freedom, GONG Qun. Abstract: Equality and freedom are the two core concepts of political philosophy, and Rousseau's political philosophy is no exception. Freedom and equality in Rousseau include two levels: the natural state and the social state under the social contract, and between them there is a state of inequality. The relationship between the two concepts here is that equality is a necessary precondition of freedom: where there is no equality, there is no freedom. Rousseau's equality is achieved by one contractual act in which all the members transfer their rights, especially property rights, and form the Community. Freedom, in Rousseau's mind, is achieved through the people's sovereignty in the Community.

  12. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

[Abstract] The essay analyzes the action logic of a hegemon with a power approach. Hegemony can be classified as benign or malignant. A benign hegemon should be productive, inclusive and maintain procedural justice when it uses its power. The power of a hegemon can be categorized into two types: hard power, which is the use of coercion and payment and can be measured by public products, and soft power, which shows the ability to attract and co-opt and can be measured by relationship-specific investments. The relationship between the input of public products and relationship-specific investments is not positively correlated; confusing public products with soft power might lead to strategic misleading. A country rich in power resources should comply with the following principles if it wants to improve its hard power and soft power: first, analyze the scope of the existing hegemon's soft power and avoid investing public products in that scope; second, maintain honesty in the long term and continue to increase others' benefits following the rule of neutral Pareto improvement; third, provide both public goods and public bads; fourth, be more patient in obtaining soft power. [Key Words] hegemon, soft power, relationship-specific investment, strategic misleading. [Authors] Feng Weijiang, Ph.D., Associate Professor, Institute of World Economics and Politics, Chinese Academy of Social Sciences; Yu Jieya, Master, PBC Shanghai Headquarters.

  13. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

The Western Characteristics of the Paradigms of International Studies in America: With the Huaxia System as a Counterexample, Ye Zicheng (4). [Abstract] Three flaws are obvious in the three paradigms of International Studies in America. Specifically, their arguments are based on the assumption that the world is anarchic; they go too far in employing scientific and rational methodology; and they pay little attention to humans. Hence, the three paradigms of international studies in America aren't necessarily useful for explaining China's history and culture as well as its relations with the outside world. The Huaxia system, for example, is anarchic but also apparently hierarchical; the approach of pursuing security in understanding the rise of Western powers may be meaningless, for the hegemon in the Huaxia system needn't worry about its security; the theory of power-balancing seemingly couldn't explain why Qin ended up defeating the alliance of the other six states in the Warring States period. The Huaxia system is quite open, with free movement of people, goods, and ideas. Some interstate regimes and institutions were formed through Huimeng (alliance-making) among states. However, this kind of limited and fragile interdependence and cooperation soon came to an end after the hegemonies of Qi, Jin and Wei. There does exist an identity problem among states in the Huaxia system, but it doesn't play as great a role as the constructivists would expect.

  14. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

Discussions of Design Highlights for Tailgas Treatment in Sulphuric Acid Plants Using a New Technology for Flue Gas Desulfurization Through Catalytic Reduction. LI Xin, CAO Long-wen, YIN Hua-qiang, El Yue-li, LI Jian-jun (1. College of Architecture and Environment, Sichuan University, Chengdu 610065, China; 2. Daye Nonferrous Metals Co., Ltd., Huangshi 435000, China; 3. The Sixth Construction Company Ltd. of China National Chemical Engineering Corp., Xiangfan 441021, China). Abstract: In view of the present situation of tailgas treatment in current sulphuric acid plants and the problems with commonly used technologies, the fundamental working principle, process flow and a reference project for a new flue gas desulfurization technology based on catalytic reduction, used for tailgas treatment and sulphur recovery in a sulphuric acid plant, are outlined. The design highlights of this technology are analyzed and corresponding suggestions are proposed. Compared to conventional technologies, the new technology offers high desulfurization efficiency and can effectively tackle the difficulties of tailgas treatment in sulphuric acid plants after enforcement of the new standard. It is expected to bring significant economic and environmental benefits, and has a promising future of application.

  15. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Strategic Realism: An Option for China's Grand Strategy Song Dexing (4) [Abstract] As a non-Western emerging power, China should positively adapt its grand strategy to the strategic psychological traits of the 21st century, maintain a realist tone consistent with the national conditions of China, and avoid adventurist policies while remaining aware of both its strategic strengths and weaknesses. In the 21st century, China's grand strategy should be based on such core values as security, development, peace and justice, focusing on development in particular; we name this orientation "strategic realism". Given the profound changes in China and the world, strategic realism encourages an active foreign policy to safeguard the long-term national interests of China. Following the self-help logic and the fundamental values of security and prosperity, strategic realism treats national interests as its top priority. It advocates the smart use of power, and aims to achieve its objectives by optimizing both domestic and international conditions. From the perspective of diplomatic philosophy, strategic realism is not a summary of concrete policies but a description of the orientation of China's grand strategy in the new century. [Key Words] China, grand strategy, strategic realism [Author] Song Dexing, Professor, Ph.D. Supervisor, and Director of the Center for International Strategic Studies, University of International Studies of PLA.

  16. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Research on the Theory and Method of the Miscible Flooding Well Test Model With Carbon Dioxide Injection and Its Pressure Analysis. 2011, 20(4): 1-4. Zhu Jianwei, Shao Changjin, Liao Xinwei, Yang Zhenqing (China University of Petroleum (Beijing)). Based on miscible flooding well test analysis theory with carbon dioxide injection, the diffusion model of mixture components when carbon dioxide is miscible with the oil, and the variation law of temperature and viscosity, are analyzed.

  17. Injecting Abstract Interpretations into Linear Cost Models

    Directory of Open Access Journals (Sweden)

    David Cachera

    2010-06-01

    Full Text Available We present a semantics-based framework for analysing the quantitative behaviour of programs with regard to resource usage. We start from an operational semantics equipped with costs. The dioid structure of the set of costs allows for defining the quantitative semantics as a linear operator. We then present an abstraction technique inspired by abstract interpretation in order to effectively compute global cost information from the program. Abstraction has to take two distinct notions of order into account: the order on costs and the order on states. We show that our abstraction technique provides a correct approximation of the concrete cost computations.
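The linear-operator view can be made concrete with a toy example (our own sketch, not the paper's formalism): take costs in the (min, +) dioid, where "addition" is min (choice between executions) and "multiplication" is + (sequencing of steps), so that composing execution steps becomes matrix multiplication over the dioid. The 3-state transition costs below are invented for illustration.

```python
INF = float("inf")

def dioid_matmul(A, B):
    """Multiply cost matrices over (min, +): (A*B)[i][j] = min_k A[i][k] + B[k][j]."""
    n = len(A)
    return [[min(A[i][k] + B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# One-step transition costs between 3 program states (INF = no transition).
step = [[INF, 2, INF],
        [INF, INF, 3],
        [1, INF, INF]]

# Cheapest cost of a 2-step execution from state 0 to state 2:
two_step = dioid_matmul(step, step)
print(two_step[0][2])  # 2 + 3 = 5
```

Iterating `dioid_matmul` gives cheapest costs over longer executions; an abstraction in this setting merges states and must over-approximate these costs.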

  18. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Ma Xiwu's trial mode is a model of adjudication in the Shaanxi-Gansu-Ningxia Border Region, resulting from the joint forces of the border region's specific wartime environment, its local environment, its social transformation and judicial reform, as well as many other factors.

  19. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Post-Western International System and the Rise of the East; Hegemonic Dependence and the Logic in the Declining Ascendance of Leading Powers; Constructive Leadership and China's Diplomatic Transformation; The Bargaining Model of International Mediation Onset: A Quantitative Test; The Impact of Gender Differences on National Military Expenditure

  20. Abstract

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Management fraud and auditing scandals have become more serious since the 1970s-80s, so that the independence of CPAs faced unprecedented challenges. Growing emphasis was put on the independence of CPAs, on which academic research deepened too. This article analyzes the influences on the independence of CPAs arising from the conflicts among owners/shareholders, managers, and CPAs. By analysing the balance of power in those conflicts and the factors that restrict them, and based on a summary of other scholars' research in this area, this paper puts forward a CPA conflict model based on the corporate governance structure, and makes suggestions on how to protect auditing independence under this new model.

  1. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    A Representative Work that Vainly Attempts to Westernize China: On Yang Ji-sheng's Paper "My View of the Chinese Pattern" XU Chong-wen Abstract: Mr. Yang Ji-sheng calls the economic connotation of the Chinese pattern a "market economy of power" with all sorts of drawbacks; this is to take the very problems that the Chinese model deliberately struggles against, and even the objects that must be resolutely eliminated, as parts of the Chinese pattern, which is absolute nonsense. He boils down the political connotation of the Chinese pattern to the "authority politics" of "thoroughly denying the modern democratic system",

  2. ABSTRACT

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Discussion on the Training Model of Cultivating Nursing Humanistic Quality Through College English Teaching Li Hongfeng (Nursing College of Zhengzhou University, Zhengzhou, Henan, 450052) Read and Write Periodical, vol.8, No.11, 27, 2011 (ISSN 1672-1578, in Chinese) Abstract: College English, as one of the most important public humanities courses in the curriculum system of collegiate nursing education, has in its teaching a positive orientation toward human values and a strong function in humanistic quality education.

  3. ABSTRACT

    Directory of Open Access Journals (Sweden)

    Michelle de Stefano Sabino

    2011-12-01

    Full Text Available This paper aims to describe and analyze the integration observed in the Sintonia project with respect to the comparison of project management processes with the Stage-Gate® model. The literature addresses these issues conceptually, but lacks an alignment between them that is evident in practice. A single case study was used as the method. The case reported is the Sintonia project, developed by PRODESP, the Data Processing Company of São Paulo. The results show the integration of project management processes with the Stage-Gate model developed during the project life cycle. The formalization of the project was defined in stages, which allowed the exploitation of economies of repetition and recombination in the development of new projects. This study contributes to the technical vision of dealing with the integration of project management processes. It was concluded that this system represents an attractive way, in terms of creating economic value and technological innovation, for the organization.

  4. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    STUDY ON THE FEATURES OF DIFFERENT REFLECTIONS IN TAHE OILFIELD PALEOCAVE RESERVOIRS / Jianfeng Wang, Pei Jin, Xinhua Li et al. Northwest Oilfield Company of SINOPEC, Urumqi, Xinjiang, 830011 / Xinjiang ShiYou TianRan Qi, 2011, 7(3): 1-5 Abstract: In this paper, in the light of seismic migration sections and tectonic analysis data, the seismic echo styles of Tahe oilfield paleocave reservoirs in time migration are summed up and different reflection features are classified. In the meantime, classification criteria for quantized identification with a reservoir seismic echo model are established. With the methods above, the paleocave reservoir forecasting degree and well arrangement ratio are improved, and the risk of development drilling is reduced while efficiency is improved. High development efficiency in the Ordovician paleocave reservoirs is fulfilled. Key Words: Tahe Oilfield; Paleocave Reservoir; Reflection features; Quantizing identification; Reservoir forecasting

  5. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Research on the Theory and Standard of Peasants' Life Cycle Pension Compensation Mu Huaizhong, Shen Yi The difficulties of achieving full coverage in the pension system lie with rural farmers. In this paper, we put forward a "dual agricultural welfare difference" theory and apply it to the issues regarding peasants' life cycle pension compensation. Taking the differential between equilibrium and biased agricultural incomes as the key indicator, we build mathematical models of the "dual agricultural welfare balance" and measure its size from 1953 to 2009. Our finding shows that China's "dual agricultural welfare difference" has fluctuated between 0.4 and 0.6. Based on life cycle characteristics, such as the natural life cycle and the policy and institutional life cycle, our suggestion is to compensate peasants' primary pension with a balance of the "dual agricultural welfare difference" and other countermeasures.

  6. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Fold distribution, offset distribution and azimuth distribution in a bin have direct effects on geometry attributes in 3D seismic survey design. If two adjacent bins have the same fold but different offsets, this non-uniform offset distribution brings some stack amplitude diversity in the adjacent bins. At present, the 3D geometry attribute uniformity of most analytical methods is expressed by qualitative analysis charts. We introduce in this paper a quantitative uniformity analysis method for offset distribution based on the average value, square deviation and a weighting factor. The paper analyses the effects of different 3D geometry parameters on offset distribution uniformity using the proposed quantitative analysis method. Furthermore, the paper analyses the effects of different 3D geometry parameters on seismic stack amplitude or frequency uniformity by seismic wave modeling. The results show that offset distribution uniformity is in good agreement with seismic stack amplitude or frequency uniformity. Therefore this improved method can be considered a useful tool for analyzing and evaluating 3D geometry attribute uniformity.
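A rough sketch of such a quantitative uniformity measure (the exact statistic and weighting below are our assumptions, not the paper's formulas): summarize the offsets in a bin by their weighted mean and square deviation, then normalize so that bins of equal fold can be compared directly.

```python
import math

def offset_uniformity(offsets, weights=None):
    """Quantify offset-distribution non-uniformity in a bin: the weighted
    square deviation from the mean offset, normalized by the mean.
    Lower values indicate a more uniform distribution. The weighting
    scheme is an illustrative assumption."""
    if weights is None:
        weights = [1.0] * len(offsets)
    w = sum(weights)
    mean = sum(o * wt for o, wt in zip(offsets, weights)) / w
    var = sum(wt * (o - mean) ** 2 for o, wt in zip(offsets, weights)) / w
    return math.sqrt(var) / mean  # dimensionless: compare bins directly

bin_a = [100, 300, 500, 700]   # same fold, evenly spread offsets
bin_b = [100, 120, 650, 700]   # same fold, clustered offsets
print(offset_uniformity(bin_a) < offset_uniformity(bin_b))  # True
```

Two bins with identical fold can thus be ranked by offset uniformity, which is what a qualitative chart cannot do.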

  7. Multimedia abstract generation of intensive care data: the automation of clinical processes through AI methodologies.

    Science.gov (United States)

    Jordan, Desmond; Rose, Sydney E

    2010-04-01

    Medical errors from communication failures are enormous during the perioperative period of cardiac surgical patients. As caregivers change shifts or surgical patients change location within the hospital, key information is lost or misconstrued. After a baseline cognitive study of information need and caregiver workflow, we implemented an advanced clinical decision support tool of intelligent agents, medical logic modules, and text generators called the "Inference Engine" to summarize individual patients' raw medical data elements into procedural milestones, illness severity, and care therapies. The system generates two displays: 1) for the continuum of care, multimedia abstract generation of intensive care data (MAGIC), an expert system that automatically generates a physician briefing of a cardiac patient's operative course in a multimodal format; and 2) for an isolated point in time, the "Inference Engine", a system that provides a real-time, high-level, summarized depiction of a patient's clinical status. In our studies, system accuracy and efficacy were judged against clinician performance in the workplace. To test the automated physician briefing, "MAGIC," the patient's intraoperative course was reviewed in the intensive care unit before patient arrival. It was then judged against the actual physician briefing and that given in a cohort of patients where the system was not used. To test the real-time representation of the patient's clinical status, system inferences were judged against clinician decisions. Changes in workflow and situational awareness were assessed by questionnaires and process evaluation. MAGIC provides 200% more information, twice the accuracy, and enhances situational awareness. This study demonstrates that the automation of clinical processes through AI methodologies yields positive results. PMID:20012610

  8. Geographic information abstractions: conceptual clarity for geographic modeling

    OpenAIRE

    T L Nyerges

    1991-01-01

    Just as we abstract our reality to make life intellectually manageable, we must create abstractions when we build models of geographic structure and process. Geographic information abstractions with aspects of theme, time, and space can be used to provide a comprehensive description of geographic reality in a geographic information system (GIS). In the context of geographic modeling a geographic information abstraction is defined as a simultaneous focus on important characteristics of geograp...

  9. Integration of Automated Decision Support Systems with Data Mining Abstract: A Client Perspective

    Directory of Open Access Journals (Sweden)

    Abdullah Saad AL-Malaise

    2013-03-01

    Full Text Available Customers' behavior and satisfaction always play an important role in increasing an organization's growth and market value. Customers are a top priority for a growing organization building up its business. This paper presents the architecture of Decision Support Systems (DSS) for dealing with customer enquiries and requests. The main purpose behind the proposed model is to enhance customer satisfaction and behavior using DSS. We propose the model by extending traditional DSS concepts with the integration of the Data Mining (DM) abstract. The model presented in this paper shows a comprehensive architecture for working on customer requests using DSS and knowledge management (KM) to improve customer behavior and satisfaction. Furthermore, the DM abstract provides methods and techniques to understand the contacted customers' data, to classify the replied answers into a number of classes, to generate associations between the same types of queries, and finally to maintain the KM for future correspondence.

  10. Model-based Abstraction of Data Provenance

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, René Rydhof

    2014-01-01

    Identifying provenance of data provides insights to the origin of data and intermediate results, and has recently gained increased interest due to data-centric applications. In this work we extend a data-centric system view with actors handling the data and policies restricting actions. This extension is based on provenance analysis performed on system models. System models have been introduced to model and analyse spatial and organisational aspects of organisations, to identify, e.g., potential insider threats. Both the models and analyses are naturally modular; models can be combined to bigger models, and the analyses adapt accordingly. Our approach extends provenance both with the origin of data, the actors and processes involved in the handling of data, and policies applied while doing so. The model and corresponding analyses are based on a formal model of spatial and organisational aspects.

  11. Model-based Abstraction of Data Provenance

    OpenAIRE

    Probst, Christian W.; Hansen, René Rydhof

    2014-01-01

    Identifying provenance of data provides insights to the origin of data and intermediate results, and has recently gained increased interest due to data-centric applications. In this work we extend a data-centric system view with actors handling the data and policies restricting actions. This extension is based on provenance analysis performed on system models. System models have been introduced to model and analyse spatial and organisational aspects of organisations, to identify, e.g., potent...

  12. Traffic Modeling and Probabilistic Process Abstraction

    Institute of Scientific and Technical Information of China (English)

    HU Lian-ming

    2003-01-01

    State-based models provide an attractive and simple approach to performance modeling. Unfortunately, this approach gives rise to two fundamental problems: 1) capturing the input loads to a system efficiently within such representations; and 2) coping with the explosion in the number of states when the system is compositionally presented. Both problems can be regarded as searching for an optimal representative state model with a minimal cost. In this paper a probabilistic feedback search approach (popularly referred to as a genetic algorithm) is presented for locating good models with low (state) cost.
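A minimal sketch of such a probabilistic feedback search (the encoding and the toy cost function are ours, not the paper's): candidate representative models are bit-vectors selecting which states to keep, and a simple genetic algorithm searches for a low-cost selection.

```python
import random

random.seed(1)

def cost(model):
    """Toy cost: states kept plus a penalty for accuracy loss.
    Dropping a state is assumed to cost 3 units of modelling error."""
    kept = sum(model)
    return kept + 3 * (len(model) - kept)

def evolve(n_states=12, pop_size=20, generations=40):
    """Probabilistic feedback search (a simple GA) over bit-vectors that
    select which states the representative model keeps."""
    pop = [[random.randint(0, 1) for _ in range(n_states)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[: pop_size // 2]          # feedback: keep the fittest
        children = []
        for _ in range(pop_size - len(survivors)):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, n_states)   # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(n_states)        # point mutation
            child[i] ^= 1
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost)

best = evolve()
print(cost(best))
```

A realistic cost function would instead measure how well the reduced state model reproduces the full model's performance measures.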

  13. A Model-Driven Parser Generator, from Abstract Syntax Trees to Abstract Syntax Graphs

    CERN Document Server

    Quesada, Luis; Cubero, Juan-Carlos

    2012-01-01

    Model-based parser generators decouple language specification from language processing. The model-driven approach avoids the limitations that conventional parser generators impose on the language designer. Conventional tools require the designed language grammar to conform to the specific kind of grammar supported by the particular parser generator (LL and LR parser generators being the most common). Model-driven parser generators, like ModelCC, do not require a grammar specification, since that grammar can be automatically derived from the language model and, if needed, adapted to conform to the requirements of the given kind of parser, all of this without interfering with the conceptual design of the language and its associated applications. Moreover, model-driven tools such as ModelCC are able to automatically resolve references between language elements, hence producing abstract syntax graphs instead of abstract syntax trees as the result of the parsing process. Such graphs are not confined to directed ac...

  14. Abstract Stobjs and Their Application to ISA Modeling

    Directory of Open Access Journals (Sweden)

    Shilpi Goel

    2013-04-01

    Full Text Available We introduce a new ACL2 feature, the abstract stobj, and show how to apply it to modeling the instruction set architecture of a microprocessor. Benefits of abstract stobjs over traditional ("concrete") stobjs can include faster execution, support for symbolic simulation, more efficient reasoning, and resilience of proof developments under modeling optimization.

  15. Automated systemic-cognitive analysis of images pixels (generalization, abstraction, classification and identification

    Directory of Open Access Journals (Sweden)

    Lutsenko Y. V.

    2015-09-01

    Full Text Available The article examines the application of systemic-cognitive analysis and its mathematical model, i.e. the system theory of information, together with its program toolkit, the "Eidos" system, for loading images from graphics files, synthesis of generalized images of classes, their abstraction, classification of the generalized images (clusters and constructs), and comparison of concrete images with the generalized images (identification). We suggest using information theory for processing the data, computing for every pixel the amount of information indicating that the image belongs to a certain class. A numerical example is given in which, on the basis of a number of specific examples of images belonging to different classes, generalized images of these classes are formed, independent of their specific implementations, i.e., the "Eidoses" of these images (in the definition of Plato, the prototypes or archetypes of the images in the definition of Jung). The "Eidos" system provides not only the formation of prototype images, which quantitatively reflect the amount of information in the elements of specific images on their belonging to particular prototypes, but also the comparison of specific images with the generalized ones (identification) and the generalization of the images with each other (classification).
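The pixel-level use of information theory can be illustrated with a tiny sketch (our own analogue, not the Eidos implementation): each lit pixel contributes its pointwise mutual information toward a class, and identification sums these contributions over the image.

```python
import math
from collections import defaultdict

# Toy training data: images as dicts {pixel_index: 1 if lit}, with class labels.
train = [
    ({0: 1, 1: 1}, "cross"), ({0: 1, 2: 1}, "cross"),
    ({3: 1, 4: 1}, "ring"),  ({3: 1, 5: 1}, "ring"),
]

# Count how often each lit pixel co-occurs with each class.
pix_class = defaultdict(lambda: defaultdict(int))
class_n = defaultdict(int)
for pixels, cls in train:
    class_n[cls] += 1
    for p in pixels:
        pix_class[p][cls] += 1

n = len(train)

def info(p, cls):
    """Pointwise mutual information (bits) a lit pixel carries about a class."""
    p_joint = pix_class[p][cls] / n
    p_pix = sum(pix_class[p].values()) / n
    p_cls = class_n[cls] / n
    if p_joint == 0:
        return -math.inf
    return math.log2(p_joint / (p_pix * p_cls))

def identify(pixels):
    """Identify an image by summing per-pixel information toward each class."""
    return max(class_n, key=lambda c: sum(info(p, c) for p in pixels))

print(identify({0: 1, 1: 1}))  # classified as "cross"
```

Averaging the per-pixel information over all training examples of a class would give the class's generalized image, the analogue of a prototype.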

  16. Syntactic Abstraction of B Models to Generate Tests

    CERN Document Server

    Julliand, Jacques; Bué, Pierre-Christophe; Masson, Pierre-Alain

    2010-01-01

    In a model-based testing approach as well as for the verification of properties, B models provide an interesting solution. However, for industrial applications, the size of their state space often makes them hard to handle. To reduce the amount of states, an abstraction function can be used, often combining state variable elimination and domain abstractions of the remaining variables. This paper complements previous results, based on domain abstraction for test generation, by adding a preliminary syntactic abstraction phase, based on variable elimination. We define a syntactic transformation that suppresses some variables from a B event model, in addition to a method that chooses relevant variables according to a test purpose. We propose two methods to compute an abstraction A of an initial model M. The first one computes A as a simulation of M, and the second one computes A as a bisimulation of M. The abstraction process produces a finite state system. We apply this abstraction computation to a Model Based T...
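Variable elimination as an abstraction can be sketched on a toy transition system (our own example, not the paper's B event syntax): projecting every state onto the kept variables yields an abstract system whose transitions over-approximate the concrete ones, i.e. it simulates the concrete model.

```python
from itertools import product

# Concrete model: states over variables (x, y); a toy event system where
# one event toggles x and another increments y up to its bound.
states = list(product(range(2), range(3)))
trans = {(x, y): [((x + 1) % 2, y)] + ([(x, y + 1)] if y < 2 else [])
         for x, y in states}

def abstract(trans, keep):
    """Syntactic abstraction by variable elimination: project every state
    onto the kept variable indices and merge the resulting transitions."""
    a = {}
    for s, succs in trans.items():
        ks = tuple(s[i] for i in keep)
        a.setdefault(ks, set()).update(tuple(t[i] for i in keep) for t in succs)
    return a

A = abstract(trans, keep=(0,))  # eliminate y, keep x
print(sorted(A))  # abstract state space: [(0,), (1,)]
```

The abstract system is finite and much smaller, which is the point: tests or properties checked on it can then be concretized on the full model.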

  17. Demo abstract: Flexhouse-2-an open source building automation platform with a focus on flexible control

    DEFF Research Database (Denmark)

    Gehrke, Oliver; Kosek, Anna Magdalena; Svendsen, Mathias

    2014-01-01

    The integration of distributed energy resources (DER) has been a strong focus of smart grid research in recent years. These resources include building-integrated generation such as photovoltaics, loads capable of demand response (DR) and local energy storage. A large portion of these resources will be installed in buildings. We present Flexhouse-2, an open-source implementation of a building automation system which has been designed with a strong focus on enabling the integration of the building into a smart power system and dedicated support for the requirements of an R&D environment. We will demonstrate the need for such a platform, and discuss its design.

  18. Compositional Abstraction of PEPA Models for Transient Analysis

    DEFF Research Database (Denmark)

    Smith, Michael James Andrew

    2010-01-01

    Stochastic process algebras such as PEPA allow complex stochastic models to be described in a compositional way, but this leads to state space explosion problems. To combat this, there has been a great deal of work in developing techniques for abstracting Markov chains. In particular, abstract - or...... explicitly. In this paper, we present a compositional application of abstract Markov chains to PEPA, based on a Kronecker representation of the underlying CTMC. This can be used to bound probabilistic reachability properties in the Continuous Stochastic Logic (CSL), and we have implemented this as part of...

  19. 07361 Abstracts Collection -- Programming Models for Ubiquitous Parallelism

    OpenAIRE

    Wong, David Chi-Leung; Cohen, Albert; Garzarán, María J.; Lengauer, Christian; Midkiff, Samuel P.

    2008-01-01

    From 02.09. to 07.09.2007, the Dagstuhl Seminar 07361 ``Programming Models for Ubiquitous Parallelism'' was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first section des...

  20. An intensionally fully-abstract sheaf model for π

    DEFF Research Database (Denmark)

    Eberhart, Clovis; Hirschowitz, Tom; Seiller, Thomas

    2015-01-01

    Following previous work on CCS, we propose a compositional model for the pi-calculus in which processes are interpreted as sheaves on certain simple sites. We define an analogue of fair testing equivalence in the model and show that our interpretation is intensionally fully abstract for it. That ...

  1. Automating Risk Analysis of Software Design Models

    Directory of Open Access Journals (Sweden)

    Maxime Frydman

    2014-01-01

    Full Text Available The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
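In the spirit of identification and mitigation trees, threat identification over a design model can be sketched as rule matching on data-flow-diagram elements (the rules, element attributes and names below are illustrative assumptions, not AutSEC's actual rule set):

```python
# Hypothetical identification rules: each matches properties of a DFD
# element, names a threat, and advises a mitigation technique.
RULES = [
    {"match": {"kind": "data_flow", "encrypted": False},
     "threat": "information disclosure",
     "mitigation": "encrypt the channel (TLS)"},
    {"match": {"kind": "process", "authenticated": False},
     "threat": "spoofing",
     "mitigation": "require mutual authentication"},
]

def analyze(elements):
    """Return (element, threat, mitigation) findings for every rule match."""
    findings = []
    for el in elements:
        for rule in RULES:
            if all(el.get(k) == v for k, v in rule["match"].items()):
                findings.append((el["name"], rule["threat"], rule["mitigation"]))
    return findings

# Illustrative data-flow diagram: one unencrypted flow, one authenticated process.
dfd = [
    {"name": "login-flow", "kind": "data_flow", "encrypted": False},
    {"name": "auth-svc", "kind": "process", "authenticated": True},
]
for name, threat, fix in analyze(dfd):
    print(f"{name}: {threat} -> {fix}")
```

A full tool would additionally rank findings by specification requirements and cost before advising mitigations, as the abstract describes.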

  2. Modeling Increased Complexity and the Reliance on Automation: FLightdeck Automation Problems (FLAP) Model

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    This paper highlights the development of a model that is focused on the safety issue of increasing complexity and reliance on automation systems in transport category aircraft. Recent statistics show an increase in mishaps related to manual handling and automation errors due to pilot complacency and over-reliance on automation, loss of situational awareness, automation system failures and/or pilot deficiencies. Consequently, the aircraft can enter a state outside the flight envelope and/or air traffic safety margins which potentially can lead to loss-of-control (LOC), controlled-flight-into-terrain (CFIT), or runway excursion/confusion accidents, etc. The goal of this modeling effort is to provide NASA's Aviation Safety Program (AvSP) with a platform capable of assessing the impacts of AvSP technologies and products towards reducing the relative risk of automation related accidents and incidents. In order to do so, a generic framework, capable of mapping both latent and active causal factors leading to automation errors, is developed. Next, the framework is converted into a Bayesian Belief Network model and populated with data gathered from Subject Matter Experts (SMEs). With the insertion of technologies and products, the model provides individual and collective risk reduction acquired by technologies and methodologies developed within AvSP.
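The way such a Bayesian Belief Network supports "what if a technology reduces a causal factor" queries can be sketched with a two-parent toy network (all probabilities below are invented for illustration, not SME data from the FLAP model):

```python
# Minimal hand-rolled belief network: P(mishap | complacency, system failure).
P_complacency = 0.2
P_sys_failure = 0.05
P_mishap = {  # conditional probability table keyed by (complacency, failure)
    (True, True): 0.60, (True, False): 0.10,
    (False, True): 0.25, (False, False): 0.01,
}

def p_mishap():
    """Marginal probability of an automation-related mishap."""
    total = 0.0
    for c in (True, False):
        for f in (True, False):
            pc = P_complacency if c else 1 - P_complacency
            pf = P_sys_failure if f else 1 - P_sys_failure
            total += pc * pf * P_mishap[(c, f)]
    return total

baseline = p_mishap()
P_complacency = 0.05  # insert a technology that reduces pilot complacency
print(p_mishap() < baseline)  # True: the relative risk drops
```

Inserting a technology into the real model works the same way: it shifts the probability of a latent causal factor, and the network propagates the change to the accident-risk node.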

  3. Automating Derivations of Abstract Machines from Reduction Semantics: A Generic Formalization of Refocusing in Coq

    DEFF Research Database (Denmark)

    Sieczkowski, Filip; Biernacka, Małgorzata; Biernacki, Dariusz

    2011-01-01

    machine via a succession of simple program transformations. So far, refocusing has been used only as an informal procedure: the conditions required of a reduction semantics have not been formally captured, and the transformation has not been formally proved correct. The aim of this work is to formalize...... into an abstract machine equivalent to it. The article is accompanied by a Coq development that contains the formalization of the refocusing method and a number of case studies that serve both as an illustration of the method and as a sanity check on the axiomatization.

  4. Learning Models of Communication Protocols using Abstraction Techniques

    OpenAIRE

    Uijen, Johan

    2009-01-01

    In order to accelerate the usage of model-based verification in real-life software life cycles, an approach is introduced in this thesis to learn models from black-box software modules. These models can then be used for model-based testing. In this thesis, models of communication protocols are considered. To learn these models efficiently, an abstraction needs to be defined over the parameters that are used in the messages that are sent and received by the protocols. The tools that are us...

  5. Particle Tracking Model and Abstraction of Transport Processes

    Energy Technology Data Exchange (ETDEWEB)

    B. Robinson

    2004-10-21

    The purpose of this report is to document the abstraction model being used in total system performance assessment (TSPA) model calculations for radionuclide transport in the unsaturated zone (UZ). The UZ transport abstraction model uses the particle-tracking method that is incorporated into the finite element heat and mass model (FEHM) computer code (Zyvoloski et al. 1997 [DIRS 100615]) to simulate radionuclide transport in the UZ. This report outlines the assumptions, design, and testing of a model for calculating radionuclide transport in the UZ at Yucca Mountain. In addition, methods for determining and inputting transport parameters are outlined for use in the TSPA for license application (LA) analyses. Process-level transport model calculations are documented in another report for the UZ (BSC 2004 [DIRS 164500]). Three-dimensional, dual-permeability flow fields generated to characterize UZ flow (documented by BSC 2004 [DIRS 169861]; DTN: LB03023DSSCP9I.001 [DIRS 163044]) are converted to make them compatible with the FEHM code for use in this abstraction model. This report establishes the numerical method and demonstrates the use of the model that is intended to represent UZ transport in the TSPA-LA. Capability of the UZ barrier for retarding the transport is demonstrated in this report, and by the underlying process model (BSC 2004 [DIRS 164500]). The technical scope, content, and management of this report are described in the planning document ''Technical Work Plan for: Unsaturated Zone Transport Model Report Integration'' (BSC 2004 [DIRS 171282]). Deviations from the technical work plan (TWP) are noted within the text of this report, as appropriate. The latest version of this document is being prepared principally to correct parameter values found to be in error due to transcription errors, changes in source data that were not captured in the report, calculation errors, and errors in interpretation of source data.
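The particle-tracking idea itself can be sketched as a random walk (an illustrative toy, not the FEHM method or Yucca Mountain parameters): each particle is advected and dispersed at every step, and a retardation factor R > 1 delays transport, as a sorbing barrier would.

```python
import random

random.seed(0)

def track(n_particles=2000, steps=200, dt=1.0, v=0.5, disp=0.2, R=2.0):
    """Random-walk particle tracking sketch: advection velocity v,
    dispersion disp, retardation factor R. All numbers are illustrative.
    Returns the mean plume displacement after all steps."""
    positions = []
    for _ in range(n_particles):
        x = 0.0
        for _ in range(steps):
            # advective step plus a random dispersive step, slowed by R
            x += (v * dt + random.gauss(0.0, disp)) / R
        positions.append(x)
    return sum(positions) / n_particles

unretarded = track(R=1.0)
retarded = track(R=2.0)
print(retarded < unretarded)  # retardation delays the plume
```

A TSPA-style abstraction runs such walks on flow fields from the process model and reports breakthrough statistics rather than solving the full transport equations.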

  6. Particle Tracking Model and Abstraction of Transport Processes

    International Nuclear Information System (INIS)

    The purpose of this report is to document the abstraction model being used in total system performance assessment (TSPA) model calculations for radionuclide transport in the unsaturated zone (UZ). The UZ transport abstraction model uses the particle-tracking method that is incorporated into the finite element heat and mass model (FEHM) computer code (Zyvoloski et al. 1997 [DIRS 100615]) to simulate radionuclide transport in the UZ. This report outlines the assumptions, design, and testing of a model for calculating radionuclide transport in the UZ at Yucca Mountain. In addition, methods for determining and inputting transport parameters are outlined for use in the TSPA for license application (LA) analyses. Process-level transport model calculations are documented in another report for the UZ (BSC 2004 [DIRS 164500]). Three-dimensional, dual-permeability flow fields generated to characterize UZ flow (documented by BSC 2004 [DIRS 169861]; DTN: LB03023DSSCP9I.001 [DIRS 163044]) are converted to make them compatible with the FEHM code for use in this abstraction model. This report establishes the numerical method and demonstrates the use of the model that is intended to represent UZ transport in the TSPA-LA. Capability of the UZ barrier for retarding the transport is demonstrated in this report, and by the underlying process model (BSC 2004 [DIRS 164500]). The technical scope, content, and management of this report are described in the planning document ''Technical Work Plan for: Unsaturated Zone Transport Model Report Integration'' (BSC 2004 [DIRS 171282]). Deviations from the technical work plan (TWP) are noted within the text of this report, as appropriate. The latest version of this document is being prepared principally to correct parameter values found to be in error due to transcription errors, changes in source data that were not captured in the report, calculation errors, and errors in interpretation of source data

  7. A Multiple Agent Model of Human Performance in Automated Air Traffic Control and Flight Management Operations

    Science.gov (United States)

    Corker, Kevin; Pisanich, Gregory; Condon, Gregory W. (Technical Monitor)

    1995-01-01

A predictive model of human operator performance (flight crew and air traffic control (ATC)) has been developed and applied in order to evaluate the impact of automation developments in flight management and air traffic control. The model is used to predict the performance of a two-person flight crew and the ATC operators generating and responding to clearances aided by the Center TRACON Automation System (CTAS). The purpose of the modeling is to support evaluation and design of automated aids for flight management and airspace management and to predict required changes in both air and ground procedures in response to advancing automation in both domains. Additional information is contained in the original extended abstract.

  8. ALC: automated reduction of rule-based models

    Directory of Open Access Journals (Sweden)

    Gilles Ernst

    2008-10-01

Background: Combinatorial complexity is a challenging problem for the modeling of cellular signal transduction since the association of a few proteins can give rise to an enormous number of feasible protein complexes. The layer-based approach is an approximative, but accurate method for the mathematical modeling of signaling systems with inherent combinatorial complexity. The number of variables in the simulation equations is highly reduced and the resulting dynamic models show a pronounced modularity. Layer-based modeling allows for the modeling of systems not accessible previously. Results: ALC (Automated Layer Construction) is a computer program that highly simplifies the building of reduced modular models, according to the layer-based approach. The model is defined using a simple but powerful rule-based syntax that supports the concepts of modularity and macrostates. ALC performs consistency checks on the model definition and provides the model output in different formats (C MEX, MATLAB, Mathematica and SBML) as ready-to-run simulation files. ALC also provides additional documentation files that simplify the publication or presentation of the models. The tool can be used offline or via a form on the ALC website. Conclusion: ALC allows for a simple rule-based generation of layer-based reduced models. The model files are given in different formats as ready-to-run simulation files.

  9. Modelling the influence of irrigation abstractions on Scotland's water resources.

    Science.gov (United States)

    Dunn, S M; Chalmers, N; Stalham, M; Lilly, A; Crabtree, B; Johnston, L

    2003-01-01

    Legislation to control abstraction of water in Scotland is limited and for purposes such as irrigation there are no restrictions in place over most of the country. This situation is set to change with implementation of the European Water Framework Directive. As a first step towards the development of appropriate policy for irrigation control there is a need to assess the current scale of irrigation practices in Scotland. This paper presents a modelling approach that has been used to quantify spatially the volume of water abstractions across the country for irrigation of potato crops under typical climatic conditions. A water balance model was developed to calculate soil moisture deficits and identify the potential need for irrigation. The results were then combined with spatial data on potato cropping and integrated to the sub-catchment scale to identify the river systems most at risk from over-abstraction. The results highlight that the areas that have greatest need for irrigation of potatoes are all concentrated in the central east-coast area of Scotland. The difference between irrigation demand in wet and dry years is very significant, although spatial patterns of the distribution are similar.
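A soil-moisture-deficit bookkeeping of the kind the abstract describes can be sketched as a daily water balance. The capacity and trigger values below are hypothetical, not those of the Scottish study:

```python
def irrigation_need(rainfall, pet, capacity=50.0, trigger=25.0):
    """Track the soil moisture deficit (mm) day by day from rainfall and
    potential evapotranspiration (PET); when the deficit exceeds a
    crop-specific trigger, apply just enough irrigation to return to it.
    Returns total irrigation demand in mm. (Illustrative parameters.)"""
    deficit, demand = 0.0, 0.0
    for rain, et in zip(rainfall, pet):
        # deficit grows with ET, shrinks with rain, bounded by soil capacity
        deficit = min(max(deficit + et - rain, 0.0), capacity)
        if deficit > trigger:
            demand += deficit - trigger
            deficit = trigger
    return demand

# Ten rain-free days with 4 mm/day PET: the deficit crosses the trigger on day 7.
demand = irrigation_need([0.0] * 10, [4.0] * 10)
```

Aggregating such per-field demands over cropped areas, as in the paper, yields sub-catchment abstraction volumes.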

  10. Situation models, mental simulations, and abstract concepts in discourse comprehension.

    Science.gov (United States)

    Zwaan, Rolf A

    2016-08-01

This article sets out to examine the role of symbolic and sensorimotor representations in discourse comprehension. It starts out with a review of the literature on situation models, showing how mental representations are constrained by linguistic and situational factors. These ideas are then extended to more explicitly include sensorimotor representations. Following Zwaan and Madden (2005), the author argues that sensorimotor and symbolic representations mutually constrain each other in discourse comprehension. These ideas are then developed further to propose two roles for abstract concepts in discourse comprehension. It is argued that they serve as pointers in memory, used (1) cataphorically to integrate upcoming information into a sensorimotor simulation, or (2) anaphorically to integrate previously presented information into a sensorimotor simulation. In either case, the sensorimotor representation is a specific instantiation of the abstract concept.

  11. Automated quantitative gait analysis in animal models of movement disorders

    Directory of Open Access Journals (Sweden)

    Vandeputte Caroline

    2010-08-01

Background: Accurate and reproducible behavioral tests in animal models are of major importance in the development and evaluation of new therapies for central nervous system disease. In this study we investigated for the first time gait parameters of rat models for Parkinson's disease (PD), Huntington's disease (HD) and stroke using the CatWalk method, a novel automated gait analysis test. Static and dynamic gait parameters were measured in all animal models, and these data were compared to readouts of established behavioral tests, such as the cylinder test in the PD and stroke rats and the rotarod test for the HD group. Results: Hemiparkinsonian rats were generated by unilateral injection of the neurotoxin 6-hydroxydopamine in the striatum or in the medial forebrain bundle. For Huntington's disease, a transgenic rat model expressing a truncated huntingtin fragment with multiple CAG repeats was used. Thirdly, a stroke model was generated by a photothrombotic induced infarct in the right sensorimotor cortex. We found that multiple gait parameters were significantly altered in all three disease models compared to their respective controls. Behavioural deficits could be efficiently measured using the cylinder test in the PD and stroke animals, and in the case of the PD model, the deficits in gait essentially confirmed results obtained by the cylinder test. However, in the HD model and the stroke model the CatWalk analysis proved more sensitive than the rotarod test and also added new and more detailed information on specific gait parameters. Conclusion: The automated quantitative gait analysis test may be a useful tool to study both motor impairment and recovery associated with various neurological motor disorders.

  12. Automation life-cycle cost model

    Science.gov (United States)

    Gathmann, Thomas P.; Reeves, Arlinda J.; Cline, Rick; Henrion, Max; Ruokangas, Corinne

    1992-01-01

The problem domain being addressed by this contractual effort can be summarized by the following list: Automation and Robotics (A&R) technologies appear to be viable alternatives to current, manual operations; Life-cycle cost models are typically judged with suspicion due to implicit assumptions and little associated documentation; and Uncertainty is a reality for increasingly complex problems and few models explicitly account for its effect on the solution space. The objectives for this effort range from the near-term (1-2 years) to far-term (3-5 years). In the near-term, the envisioned capabilities of the modeling tool are annotated. In addition, a framework is defined and developed in the Decision Modelling System (DEMOS) environment. Our approach is summarized as follows: Assess desirable capabilities (structure into near- and far-term); Identify useful existing models/data; Identify parameters for utility analysis; Define tool framework; Encode scenario thread for model validation; and Provide transition path for tool development. This report contains all relevant, technical progress made on this contractual effort.

  13. The Conceptual Integration Modeling Framework: Abstracting from the Multidimensional Model

    CERN Document Server

    Rizzolo, Flavio; Pottinger, Rachel; Wong, Kwok

    2010-01-01

    Data warehouses are overwhelmingly built through a bottom-up process, which starts with the identification of sources, continues with the extraction and transformation of data from these sources, and then loads the data into a set of data marts according to desired multidimensional relational schemas. End user business intelligence tools are added on top of the materialized multidimensional schemas to drive decision making in an organization. Unfortunately, this bottom-up approach is costly both in terms of the skilled users needed and the sheer size of the warehouses. This paper proposes a top-down framework in which data warehousing is driven by a conceptual model. The framework offers both design time and run time environments. At design time, a business user first uses the conceptual modeling language as a multidimensional object model to specify what business information is needed; then she maps the conceptual model to a pre-existing logical multidimensional representation. At run time, a system will tra...

  14. Models in Movies: Teaching Abstract Concepts in Concrete Models

    OpenAIRE

    Madeline Breer; Bianca Christensen; Jennifer Taylor

    2012-01-01

    The tool created here is an instructional video that demonstrates how to create models of the heavy and light chains of an antibody using pipe cleaners, and how the process of V, D, and J gene recombination functions, using the model as a visual aide. The video was created by undergraduate students, with the intended audience being other undergraduates. This type of direct peer teaching aids in education because the “teachers” in this situation greatly enhance their own knowledge throu...

  15. The Abstract Machine Model for Transaction-based System Control

    Energy Technology Data Exchange (ETDEWEB)

    Chassin, David P.

    2003-01-31

Recent work applying statistical mechanics to economic modeling has demonstrated the effectiveness of using thermodynamic theory to address the complexities of large scale economic systems. Transaction-based control systems depend on the conjecture that when control of thermodynamic systems is based on price-mediated strategies (e.g., auctions, markets), the optimal allocation of resources in a market-based control system results in an emergent optimal control of the thermodynamic system. This paper proposes an abstract machine model as the necessary precursor for demonstrating this conjecture and establishes the dynamic laws as the basis for a special theory of emergence applied to the global behavior and control of complex adaptive systems. The abstract machine in a large system amounts to the analog of a particle in thermodynamic theory. These permit the establishment of a theory of dynamic control of complex system behavior based on statistical mechanics. Thus we may be better able to engineer a few simple control laws for a very small number of device types, which when deployed in very large numbers and operated as a system of many interacting markets yields the stable and optimal control of the thermodynamic system.

  16. Bayesian Safety Risk Modeling of Human-Flightdeck Automation Interaction

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2015-01-01

Usage of automatic systems in airliners has increased fuel efficiency, added extra capabilities, and enhanced safety and reliability, as well as provided improved passenger comfort since their introduction in the late 1980s. However, the original automation benefits, including reduced flight crew workload, human errors and training requirements, were not achieved as originally expected. Instead, automation introduced new failure modes, redistributed, and sometimes increased workload, brought in new cognitive and attention demands, and increased training requirements. Modern airliners have numerous flight modes, providing more flexibility (and inherently more complexity) to the flight crew. However, the price to pay for the increased flexibility is the need for increased mode awareness, as well as the need to supervise, understand, and predict automated system behavior. Also, over-reliance on automation is linked to manual flight skill degradation and complacency in commercial pilots. As a result, recent accidents involving human errors are often caused by the interactions between humans and the automated systems (e.g., the breakdown in man-machine coordination), deteriorated manual flying skills, and/or loss of situational awareness due to heavy dependence on automated systems. This paper describes the development of the increased complexity and reliance on automation baseline model, named FLAP for FLightdeck Automation Problems. The model development process starts with a comprehensive literature review followed by the construction of a framework comprised of high-level causal factors leading to an automation-related flight anomaly. The framework was then converted into a Bayesian Belief Network (BBN) using the Hugin Software v7.8. The effects of automation on flight crew are incorporated into the model, including flight skill degradation, increased cognitive demand and training requirements along with their interactions. Besides flight crew deficiencies, automation system

  17. Automated smoother for the numerical decoupling of dynamics models

    Directory of Open Access Journals (Sweden)

    Santos Helena

    2007-08-01

Background: Structure identification of dynamic models for complex biological systems is the cornerstone of their reverse engineering. Biochemical Systems Theory (BST) offers a particularly convenient solution because its parameters are kinetic-order coefficients which directly identify the topology of the underlying network of processes. We have previously proposed a numerical decoupling procedure that allows the identification of multivariate dynamic models of complex biological processes. While described here within the context of BST, this procedure has a general applicability to signal extraction. Our original implementation relied on artificial neural networks (ANN), which caused slight, undesirable bias during the smoothing of the time courses. As an alternative, we propose here an adaptation of Whittaker's smoother and demonstrate its role within a robust, fully automated structure identification procedure. Results: In this report we propose a robust, fully automated solution for signal extraction from time series, which is the prerequisite for the efficient reverse engineering of biological systems models. Whittaker's smoother is reformulated within the context of information theory and extended by the development of adaptive signal segmentation to account for heterogeneous noise structures. The resulting procedure can be used on arbitrary time series with a nonstationary noise process; it is illustrated here with metabolic profiles obtained from in-vivo NMR experiments. The smoothed solution that is free of parametric bias permits differentiation, which is crucial for the numerical decoupling of systems of differential equations. Conclusion: The method is applicable in signal extraction from time series with nonstationary noise structure and can be applied in the numerical decoupling of systems of differential equations into algebraic equations, and thus constitutes a rather general tool for the reverse engineering of
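Whittaker's smoother itself is compact: it minimizes a penalised least-squares objective |y - z|^2 + lam |D z|^2, where D is a difference matrix. A minimal dense-matrix sketch follows (the paper's information-theoretic reformulation and adaptive segmentation are not shown, and a sparse solver would be used for long series):

```python
import numpy as np

def whittaker_smooth(y, lam=10.0, d=2):
    """Whittaker smoother: solve (I + lam * D'D) z = y, where D is the
    d-th order finite-difference matrix. Larger lam gives a smoother z.
    Dense solve for clarity; use sparse matrices for long series."""
    y = np.asarray(y, dtype=float)
    n = y.size
    D = np.diff(np.eye(n), n=d, axis=0)      # (n-d) x n difference operator
    z = np.linalg.solve(np.eye(n) + lam * (D.T @ D), y)
    return z
```

With the default second-order penalty, constant and linear trends pass through unchanged (D annihilates them), while high-frequency noise is damped as `lam` grows.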

  18. Model-based automated testing of critical PLC programs.

    CERN Document Server

    Fernández Adiego, B; Tournier, J-C; González Suárez, V M; Bliudze, S

    2014-01-01

Testing of critical PLC (Programmable Logic Controller) programs remains a challenging task for control system engineers as it can rarely be automated. This paper proposes a model-based approach which uses the BIP (Behavior, Interactions and Priorities) framework to perform automated testing of PLC programs developed with the UNICOS (UNified Industrial COntrol System) framework. This paper defines the translation procedure and rules from UNICOS to BIP which can be fully automated in order to hide the complexity of the underlying model from the control engineers. The approach is illustrated and validated through the study of a water treatment process.

  19. Proposal of a three level complexity model for office automation

    OpenAIRE

    Sáez Vacas, Fernando; Alonso García, Gustavo

    1989-01-01

    Office automation is one of the fields where the complexity related with technologies and working environments can be best shown. This is the starting point we have chosen to build up a theoretical model that shows us a scene quite different from the one traditionally considered. Through the development of the model, the levels of complexity associated with office automation and office environments have been identified, establishing a relationship between them. Thus...

  20. Selected translated abstracts of Russian-language climate-change publications. 4: General circulation models

    Energy Technology Data Exchange (ETDEWEB)

    Burtis, M.D. [comp.] [Oak Ridge National Lab., TN (United States). Carbon Dioxide Information Analysis Center; Razuvaev, V.N.; Sivachok, S.G. [All-Russian Research Inst. of Hydrometeorological Information--World Data Center, Obninsk (Russian Federation)

    1996-10-01

This report presents English-translated abstracts of important Russian-language literature concerning general circulation models as they relate to climate change. In addition to the bibliographic citations and abstracts translated into English, this report presents the original citations and abstracts in Russian. Author and title indexes are included to assist the reader in locating abstracts of particular interest.

  1. Automation Marketplace 2010: New Models, Core Systems

    Science.gov (United States)

    Breeding, Marshall

    2010-01-01

    In a year when a difficult economy presented fewer opportunities for immediate gains, the major industry players have defined their business strategies with fundamentally different concepts of library automation. This is no longer an industry where companies compete on the basis of the best or the most features in similar products but one where…

  2. Enhancing Formal Modelling Tool Support with Increased Automation

    DEFF Research Database (Denmark)

    Lausdahl, Kenneth

Progress report for the qualification exam of PhD student Kenneth Lausdahl. Initial work on enhancing tool support for the formal method VDM and the concept of unifying an abstract syntax tree with the ability for isolated extensions is described. The tool support includes a connection to UML and a test automation principle based on traces written as a kind of regular expressions.

  3. Resource Allocation Model for Modelling Abstract RTOS on Multiprocessor System-on-Chip

    DEFF Research Database (Denmark)

    Virk, Kashif Munir; Madsen, Jan

    2003-01-01

Resource allocation is an important problem in RTOSs, and has been an active area of research. Numerous approaches have been developed and many different techniques have been combined for a wide range of applications. In this paper, we address the problem of resource allocation in the context of modelling an abstract RTOS on multiprocessor SoC platforms. We discuss the implementation details of a simplified basic priority inheritance protocol for our abstract system model in SystemC.

  4. Abstract interpretation over non-deterministic finite tree automata for set-based analysis of logic programs

    DEFF Research Database (Denmark)

    Gallagher, John Patrick; Puebla, G.

    2002-01-01

Set-based program analysis has many potential applications, including compiler optimisations, type-checking, debugging, verification and planning. One method of set-based analysis is to solve a set of {\it set constraints} derived directly from the program text. Another approach is based on abstract interpretation (with widening) over an infinite-height domain of regular types, which offers a means of achieving the precision of set-constraints in the abstract interpretation framework.

  5. Abstract interpretation over non-deterministic finite tree automata for set-based analysis of logic programs

    DEFF Research Database (Denmark)

    Gallagher, John Patrick; Puebla, G.

    2002-01-01

Set-based program analysis has many potential applications, including compiler optimisations, type-checking, debugging, verification and planning. One method of set-based analysis is to solve a set of {\it set constraints} derived directly from the program text. Another approach is based on abstract interpretation (with widening) over an infinite-height domain of regular types. Up till now only deterministic types have been used in abstract interpretations, whereas solving set constraints yields non-deterministic types, which are more precise. It was pointed out by Cousot and Cousot that set constraint analysis of a particular program $P$ could be understood as an abstract interpretation over a finite domain of regular tree grammars, constructed from $P$. In this paper we define such an abstract interpretation for logic programs, formulated over a domain of non-deterministic finite tree automata...

  6. Automated Model Fit Method for Diesel Engine Control Development

    NARCIS (Netherlands)

    Seykens, X.; Willems, F.P.T.; Kuijpers, B.; Rietjens, C.

    2014-01-01

    This paper presents an automated fit for a control-oriented physics-based diesel engine combustion model. This method is based on the combination of a dedicated measurement procedure and structured approach to fit the required combustion model parameters. Only a data set is required that is consider

  7. Simulation Model of Automated Peat Briquetting Press Drive

    Directory of Open Access Journals (Sweden)

    A. Marozka

    2012-01-01

The paper presents a fully functional simulation model of an automated peat briquetting press drive. The model makes it possible to reduce financial and time costs while developing, designing and operating a double-stamp peat briquetting press drive.

  8. Mathematical models in marketing a collection of abstracts

    CERN Document Server

    Funke, Ursula H

    1976-01-01

Mathematical models can be classified in a number of ways, e.g., static and dynamic; deterministic and stochastic; linear and nonlinear; individual and aggregate; descriptive, predictive, and normative; according to the mathematical technique applied or according to the problem area in which they are used. In marketing, the level of sophistication of the mathematical models varies considerably, so that a number of models will be meaningful to a marketing specialist without an extensive mathematical background. To make it easier for the nontechnical user we have chosen to classify the models included in this collection according to the major marketing problem areas in which they are applied. Since the emphasis lies on mathematical models, we shall not as a rule present statistical models, flow chart models, computer models, or the empirical testing aspects of these theories. We have also excluded competitive bidding, inventory and transportation models since these areas do not form the core of the market...

  9. Parmodel: a web server for automated comparative modeling of proteins.

    Science.gov (United States)

    Uchôa, Hugo Brandão; Jorge, Guilherme Eberhart; Freitas Da Silveira, Nelson José; Camera, João Carlos; Canduri, Fernanda; De Azevedo, Walter Filgueira

    2004-12-24

Parmodel is a web server for automated comparative modeling and evaluation of protein structures. The aim of this tool is to help inexperienced users to perform modeling, assessment, visualization, and optimization of protein models as well as crystallographers to evaluate structures solved experimentally. It is subdivided into four modules: Parmodel Modeling, Parmodel Assessment, Parmodel Visualization, and Parmodel Optimization. The main module is Parmodel Modeling, which allows the building of several models for the same protein in a reduced time, through the distribution of modeling processes on a Beowulf cluster. Parmodel automates and integrates the main software packages used in comparative modeling, such as MODELLER, Whatcheck, Procheck, Raster3D, Molscript, and Gromacs. This web server is freely accessible at .

  10. Modeling situated abstraction : action coalescence via multidimensional coherence.

    Energy Technology Data Exchange (ETDEWEB)

    Sallach, D. L.; Decision and Information Sciences; Univ. of Chicago

    2007-01-01

Situated social agents weigh dozens of priorities, each with its own complexities. Domains of interest are intertwined, and progress in one area either complements or conflicts with other priorities. Interpretive agents address these complexities through: (1) integrating cognitive complexities through the use of radial concepts, (2) recognizing the role of emotion in prioritizing alternatives and urgencies, (3) using Miller-range constraints to avoid oversimplified notions of omniscience, and (4) constraining actions to 'moves' in multiple prototype games. Situated agent orientations are dynamically grounded in pragmatic considerations as well as intertwined with internal and external priorities. HokiPoki is a situated abstraction designed to shape and focus strategic agent orientations. The design integrates four pragmatic pairs: (1) problem and solution, (2) dependence and power, (3) constraint and affordance, and (4) (agent) intent and effect. In this way, agents are empowered to address multiple facets of a situation in an exploratory, or even arbitrary, order. HokiPoki is open to the internal orientation of the agent as it evolves, but also to the communications and actions of other agents.

  11. Modelling of evapotranspiration at field and landscape scales. Abstract

    DEFF Research Database (Denmark)

    Overgaard, Jesper; Butts, M.B.; Rosbjerg, Dan

    2002-01-01

Eddy-covariance observations from three 2-m masts, representing fluxes from grass, winter wheat and spring barley, were used to evaluate the model at the field scale. Observations from a 40-m mast representing a mixture of the land-use classes in the model domain were used to validate the model at the larger scale. Good agreement was found at both the field and landscape scale...

  12. Modelling and simulation of superalloys. Book of abstracts

    Energy Technology Data Exchange (ETDEWEB)

    Rogal, Jutta; Hammerschmidt, Thomas; Drautz, Ralf (eds.)

    2014-07-01

    Superalloys are multi-component materials with complex microstructures that offer unique properties for high-temperature applications. The complexity of the superalloy materials makes it particularly challenging to obtain fundamental insight into their behaviour from the atomic structure to turbine blades. Recent advances in modelling and simulation of superalloys contribute to a better understanding and prediction of materials properties and therefore offer guidance for the development of new alloys. This workshop will give an overview of recent progress in modelling and simulation of materials for superalloys, with a focus on single crystal Ni-base and Co-base alloys. Topics will include electronic structure methods, atomistic simulations, microstructure modelling and modelling of microstructural evolution, solidification and process simulation as well as the modelling of phase stability and thermodynamics.

  13. Reusing Test-Cases on Different Levels of Abstraction in a Model Based Development Tool

    CERN Document Server

    Blech, Jan Olaf; Ratiu, Daniel; 10.4204/EPTCS.80.2

    2012-01-01

Seamless model-based development aims to use models during all phases of the development process of a system. During the development process in a component-based approach, components of a system are described at qualitatively differing abstraction levels: during requirements engineering component models are rather abstract, high-level and underspecified, while during implementation the component models are rather concrete and fully specified in order to enable code generation. An important issue that arises is assuring that the concrete models correspond to the abstract models. In this paper, we propose a method to assure that concrete models for system components refine more abstract models for the same components. In particular we advocate a framework for reusing test-cases at different abstraction levels. Our approach, even if it cannot completely prove the refinement, can be used to ensure confidence in the development process. In particular we are targeting the refinement of requirements which are represented ...

  14. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies
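The derivative-generating idea behind a computer-calculus compiler such as GRESS can be illustrated with forward-mode automatic differentiation via dual numbers: each value carries its derivative along, and arithmetic operators propagate both. This is a toy sketch of the general technique, not the GRESS implementation:

```python
class Dual:
    """Forward-mode autodiff value: carries f(x) and df/dx together,
    so evaluating a model also evaluates its sensitivity."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (fg)' = f'g + fg'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def model(x):
    return 3 * x * x + 2 * x + 1   # dy/dx = 6x + 2

y = model(Dual(2.0, 1.0))          # seed dx/dx = 1 at x = 2
```

Evaluating `model` once yields both the value (17.0) and the sensitivity (14.0), with no hand-derived adjoint equations; a source-transformation tool automates the same bookkeeping at compile time.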

  15. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with ''direct'' and ''adjoint'' sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs

  16. Automation of Endmember Pixel Selection in SEBAL/METRIC Model

    Science.gov (United States)

    Bhattarai, N.; Quackenbush, L. J.; Im, J.; Shaw, S. B.

    2015-12-01

    The commonly applied surface energy balance for land (SEBAL) and its variant, mapping evapotranspiration (ET) at high resolution with internalized calibration (METRIC) models require manual selection of endmember (i.e. hot and cold) pixels to calibrate sensible heat flux. Current approaches for automating this process are based on statistical methods and do not appear to be robust under varying climate conditions and seasons. In this paper, we introduce a new approach based on simple machine learning tools and search algorithms that provides an automatic and time efficient way of identifying endmember pixels for use in these models. The fully automated models were applied on over 100 cloud-free Landsat images with each image covering several eddy covariance flux sites in Florida and Oklahoma. Observed land surface temperatures at automatically identified hot and cold pixels were within 0.5% of those from pixels manually identified by an experienced operator (coefficient of determination, R2, ≥ 0.92, Nash-Sutcliffe efficiency, NSE, ≥ 0.92, and root mean squared error, RMSE, ≤ 1.67 K). Daily ET estimates derived from the automated SEBAL and METRIC models were in good agreement with their manual counterparts (e.g., NSE ≥ 0.91 and RMSE ≤ 0.35 mm day-1). Automated and manual pixel selection resulted in similar estimates of observed ET across all sites. The proposed approach should reduce time demands for applying SEBAL/METRIC models and allow for their more widespread and frequent use. This automation can also reduce potential bias that could be introduced by an inexperienced operator and extend the domain of the models to new users.
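A toy version of such automated endmember screening might rank pixels by surface temperature percentiles and a vegetation index: the cold pixel is cool and well vegetated, the hot pixel warm and sparsely vegetated. The cut-off values here are illustrative, not the machine-learning rules of the paper:

```python
import numpy as np

def pick_endmembers(ts, ndvi, ts_pct=(2, 98), ndvi_hot=0.2, ndvi_cold=0.7):
    """Return candidate cold and hot pixel indices for SEBAL/METRIC-style
    calibration from surface temperature (ts, K) and NDVI arrays.
    Thresholds are hypothetical, for illustration only."""
    lo, hi = np.percentile(ts, ts_pct)
    cold = np.where((ts <= lo) & (ndvi >= ndvi_cold))[0]  # cool + vegetated
    hot = np.where((ts >= hi) & (ndvi <= ndvi_hot))[0]    # warm + bare
    return cold, hot

ts = np.array([288.0, 290.0, 295.0, 310.0, 320.0])
ndvi = np.array([0.75, 0.5, 0.4, 0.15, 0.1])
cold_idx, hot_idx = pick_endmembers(ts, ndvi)
```

In practice such candidates would be further screened (cloud masks, homogeneity, land cover) before anchoring the sensible heat flux calibration.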

  17. Model-driven design, refinement and transformation of abstract interactions

    NARCIS (Netherlands)

    Almeida, João Paolo A.; Dijkman, Remco; Ferreira Pires, Luis; Quartel, Dick; Sinderen, van Marten

    2006-01-01

    In a model-driven design process the interaction between application parts can be described at various levels of platform-independence. At the lowest level of platform-independence, interaction is realized by interaction mechanisms provided by specific middleware platforms. At higher levels of platform-independence

  18. Total human exposure and indoor air quality: An automated bibliography (BLIS) with summary abstracts. Volume 2. Final report, January 1987-December 1989

    International Nuclear Information System (INIS)

    The Bibliographical Literature Information System (BLIS) is a computer database that provides a comprehensive review of available literature on total human exposure to environmental pollution. Brief abstracts (often condensed versions of the original abstract) are included; if the original document had no abstract, one was prepared. Unpublished draft reports are listed, as well as final reports of the U.S. Government and other countries, reports by governmental research contractors, journal articles, and other publications on exposure models, field data, and newly emerging research methodologies. Emphasis is placed on those field studies measuring all the concentrations to which people may be exposed, including indoors, outdoors, and in-transit

  19. Dynamics Model Abstraction Scheme Using Radial Basis Functions

    Directory of Open Access Journals (Sweden)

    Silvia Tolu

    2012-01-01

    This paper presents a control model for object manipulation. Properties of objects and environmental conditions influence motor control and learning. System dynamics depend on an unobserved external context, for example, the work load of a robot manipulator. The dynamics of a robot arm change as it manipulates objects with different physical properties, for example, the mass, shape, or mass distribution. We address active sensing strategies to acquire object dynamical models with a radial basis function neural network (RBF). Experiments are done using a real robot arm, and trajectory data are gathered during various trials manipulating different objects. Biped robots do not have high-force joint servos, and the control system can hardly compensate for all the inertia variations of the adjacent joints and disturbance torques during dynamic gait control. In order to achieve smoother control and lead to more reliable sensorimotor complexes, we evaluate and compare a sparse velocity-driven versus a dense position-driven control scheme.
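
A minimal RBF regression of the kind used to abstract such dynamics can be sketched as follows; the "arm dynamics" here is a hypothetical one-dimensional torque-vs-angle function, not the paper's robot data:

```python
import numpy as np

def rbf_features(X, centers, width):
    """Gaussian radial basis activations, one column per center."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_rbf(X, y, centers, width):
    """Least-squares output weights for fixed centers and width."""
    Phi = rbf_features(X, centers, width)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def predict_rbf(X, centers, width, w):
    return rbf_features(X, centers, width) @ w

# Hypothetical 1-D "arm dynamics" to abstract: torque vs. joint angle.
angles = np.linspace(0.0, np.pi, 40)[:, None]
torque = 2.0 * np.sin(angles[:, 0])

centers = np.linspace(0.0, np.pi, 10)[:, None]
w = fit_rbf(angles, torque, centers, width=0.4)
max_err = np.max(np.abs(predict_rbf(angles, centers, 0.4, w) - torque))
```

With the centers fixed, training reduces to a linear least-squares solve for the output weights, which is what makes RBF networks attractive for fast, incremental model abstraction.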

  20. On Privacy Losses in the Trusted Agent Model (Abstract)

    OpenAIRE

    Mateus, Paulo; Vaudenay, Serge

    2009-01-01

    Tamper-proof devices are pretty powerful. They typically make security applications simpler (provided that the tamper-proof assumption is not violated). For applications requiring privacy, we observe that some properties may become harder (if possible at all) to achieve when devices are maliciously used. We take the example of deniability, receipt-freeness, and anonymity. We formalize the trusted agent model which assumes tamper-proof hardware in a way which captures the notion of programmable...

  1. An automated in vitro model for the evaluation of ultrasound modalities measuring myocardial deformation

    Directory of Open Access Journals (Sweden)

    Stigö Albin

    2010-09-01

    Background Echocardiography is the method of choice when one wishes to examine myocardial function. Qualitative assessment of the 2D grey scale images obtained is subjective, and objective methods are required. Speckle Tracking Ultrasound is an emerging technology, offering an objective means of quantifying left ventricular wall motion. However, before a new ultrasound technology can be adopted in the clinic, accuracy and reproducibility need to be investigated. Aim It was hypothesized that the collection of ultrasound sample data from an in vitro model could be automated. The aim was to optimize an in vitro model to allow for efficient collection of sample data. Material & Methods A tissue-mimicking phantom was made from water, gelatin powder, psyllium fibers and a preservative. Sonomicrometry crystals were molded into the phantom. The solid phantom was mounted in a stable stand and cyclically compressed. Peak strain was then measured by Speckle Tracking Ultrasound and sonomicrometry. Results We succeeded in automating the acquisition and analysis of sample data. Sample data were collected at a rate of 200 measurement pairs in 30 minutes. We found good agreement between Speckle Tracking Ultrasound and sonomicrometry in the in vitro model. Best agreement was 0.83 ± 0.70%. Worst agreement was -1.13 ± 6.46%. Conclusions It has been shown possible to automate a model that can be used for evaluating the in vitro accuracy and precision of ultrasound modalities measuring deformation. Sonomicrometry and Speckle Tracking Ultrasound had acceptable agreement.

  2. Toward designing for trust in database automation

    International Nuclear Information System (INIS)

    Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation. One model developed by Lee and See identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process-, and performance-related information serve, both individually and through inferences between them, to describe automation in such a way as to engender properly-calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases. The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated process

  3. Abstract Platform and Transformations for Model-Driven Service-Oriented Development

    OpenAIRE

    Andrade Almeida, J.P.; Ferreira Pires, L.; Sinderen, van Marten

    2006-01-01

    In this paper, we discuss the use of abstract platforms and transformations for designing applications according to the principles of the service-oriented architecture. We illustrate our approach by discussing the use of the service discovery pattern at a platform-independent design level. We show how a trader service can be specified at a high level of abstraction and incorporated in an abstract platform for service-oriented development. Designers can then build platform-independent models of...

  4. Model Search: Formalizing and Automating Constraint Solving in MDE Platforms

    Science.gov (United States)

    Kleiner, Mathias; Del Fabro, Marcos Didonet; Albert, Patrick

    Model Driven Engineering (MDE) and constraint programming (CP) have been widely used and combined in different applications. However, existing results are either ad-hoc, not fully integrated, or manually executed. In this article, we present a formalization and an approach for automating constraint-based solving in an MDE platform. Our approach generalizes existing work by combining known MDE concepts with CP techniques into a single operation called model search. We present the theoretical basis for model search, as well as an automated process that details the involved operations. We validate our approach by comparing two implemented solutions (one based on Alloy/SAT, the other on OPL/CP), and by executing them over an academic use-case.

  5. Model-Based approaches to Human-Automation Systems Design

    DEFF Research Database (Denmark)

    Jamieson, Greg A.; Andersson, Jonas; Bisantz, Ann;

    2012-01-01

    Human-automation interaction in complex systems is common, yet design for this interaction is often conducted without explicit consideration of the role of the human operator. Fortunately, there are a number of modeling frameworks proposed for supporting this design activity. However, the frameworks are often adapted from other purposes, usually applied to a limited range of problems, sometimes not fully described in the open literature, and rarely critically reviewed in a manner acceptable to proponents and critics alike. The present paper introduces a panel session wherein these proponents...

  6. Abstraction and Model Checking in the PEPA Plug-in for Eclipse

    DEFF Research Database (Denmark)

    Smith, Michael James Andrew

    2010-01-01

    The stochastic process algebra PEPA is a widely used language for performance modelling, and a large part of its success is due to the rich tool support that is available. As a compositional Markovian formalism, however, it suffers from the state space explosion problem, where even small models can lead to very large Markov chains. One way of analysing such models is to use abstraction - constructing a smaller model that bounds the properties of the original. We present an extension to the PEPA plug-in for Eclipse that enables abstracting and model checking of PEPA models. This implements two new...

  7. Automated Decomposition of Model-based Learning Problems

    Science.gov (United States)

    Williams, Brian C.; Millar, Bill

    1996-01-01

    A new generation of sensor-rich, massively distributed autonomous systems is being developed that has the potential for unprecedented performance, such as smart buildings, reconfigurable factories, adaptive traffic systems and remote earth ecosystem monitoring. To achieve high performance these massive systems will need to accurately model themselves and their environment from sensor information. Accomplishing this on a grand scale requires automating the art of large-scale modeling. This paper presents a formalization of decompositional model-based learning (DML), a method developed by observing a modeler's expertise at decomposing large scale model estimation tasks. The method exploits a striking analogy between learning and consistency-based diagnosis. Moriarty, an implementation of DML, has been applied to thermal modeling of a smart building, demonstrating a significant improvement in learning rate.

  8. Abstracting and reasoning over ship trajectories and web data with the Simple Event Model (SEM)

    NARCIS (Netherlands)

    W.R. van Hage; V. Malaisé; G.K.D. de Vries; A.Th. Schreiber; M.W. van Someren

    2012-01-01

    Bridging the gap between low-level features and semantics is a problem commonly acknowledged in the Multimedia community. Event modeling can fill this gap by representing knowledge about the data at different levels of abstraction. In this paper we present the Simple Event Model (SEM) and its application

  9. Abstract algebra

    CERN Document Server

    Deskins, W E

    1996-01-01

    This excellent textbook provides undergraduates with an accessible introduction to the basic concepts of abstract algebra and to the analysis of abstract algebraic systems. These systems, which consist of sets of elements, operations, and relations among the elements, and prescriptive axioms, are abstractions and generalizations of various models which evolved from efforts to explain or discuss physical phenomena. In Chapter 1, the author discusses the essential ingredients of a mathematical system, and in the next four chapters covers the basic number systems, decompositions of integers, diophantine

  10. Automated Modeling of Microwave Structures by Enhanced Neural Networks

    Directory of Open Access Journals (Sweden)

    Z. Raida

    2006-12-01

    The paper describes a methodology for the automated creation of neural models of microwave structures. During the creation process, artificial neural networks are trained using a combination of particle swarm optimization and the quasi-Newton method to avoid critical training problems of conventional neural nets. In the paper, neural networks are used to approximate the behavior of a planar microwave filter (moment method, Zeland IE3D). In order to evaluate the efficiency of neural modeling, global optimizations are performed using numerical models and neural ones. Both approaches are compared from the viewpoint of CPU-time demands and accuracy. In the conclusions, methodological recommendations for including neural networks in the microwave design process are formulated.
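
The hybrid training strategy, global exploration by particle swarm optimization followed by local refinement, can be illustrated on a toy objective. Everything below is a hypothetical sketch, not the paper's filter-modeling code, and the quasi-Newton phase is replaced by plain numerical-gradient descent to keep the example dependency-free:

```python
import math
import random

def f(p):
    """Rippled bowl: global minimum exactly at (1.0, -2.0), with
    shallow local minima that can trap a purely local optimizer."""
    x, y = p
    return ((x - 1.0) ** 2 + (y + 2.0) ** 2
            + 0.3 * math.sin(5.0 * (x - 1.0)) ** 2
            + 0.3 * math.sin(5.0 * (y + 2.0)) ** 2)

def pso(f, dim=2, n=30, iters=150, lo=-5.0, hi=5.0, seed=0):
    """Plain global-best particle swarm optimization."""
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    V = [[0.0] * dim for _ in range(n)]
    P = [x[:] for x in X]                      # personal bests
    pb = [f(x) for x in X]
    g = min(zip(pb, P))[1][:]                  # global best
    gb = f(g)
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                V[i][d] = (0.7 * V[i][d]
                           + 1.5 * r1 * (P[i][d] - X[i][d])
                           + 1.5 * r2 * (g[d] - X[i][d]))
                X[i][d] += V[i][d]
            fx = f(X[i])
            if fx < pb[i]:
                pb[i], P[i] = fx, X[i][:]
                if fx < gb:
                    gb, g = fx, X[i][:]
    return g

def polish(f, x, steps=300, lr=0.01, h=1e-6):
    """Local refinement stage (stands in for the quasi-Newton phase):
    descend along a finite-difference gradient estimate."""
    x = x[:]
    for _ in range(steps):
        fx = f(x)
        grad = []
        for d in range(len(x)):
            xp = x[:]
            xp[d] += h
            grad.append((f(xp) - fx) / h)
        x = [xd - lr * gd for xd, gd in zip(x, grad)]
    return x

best = polish(f, pso(f))   # expected to land near (1.0, -2.0)
```

The swarm supplies a starting point in the right basin; the local stage then sharpens it, which is the division of labor the abstract describes for avoiding bad local minima during network training.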

  11. Selected Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    The Three - Stage Interaction Model of Lexicon - Syntax Interface Liu Yuhong (80) The Three -Stage Interaction Model maintains that the interactions at the lexicon -syntax interface are divisible into three levels, namely, the interaction between lexical meaning and lexical grammar that determines syntactic items, the interaction among lexical items that determines syntactic structure, and the interaction between abstract syntactic structure (i. e. construction) and temporary syntactic combinations that determines and coerces grammaticality of the latter. The Three - Stage Interaction Model is hierarchical, complete and bidirectional in language comprehension. It also testifies to the varying abstractness between grammar (syntax) and semantics, and between the five grammatical cases. According to this model, temporary syntactic combination is sanctioned by abstract syntactic structure, therefore the conventional linguistic significance of P600 is maintained without coining contradictory new terms.

  12. Modelling, abstraction, and computation in systems biology: A view from computer science.

    Science.gov (United States)

    Melham, Tom

    2013-04-01

    Systems biology is centrally engaged with computational modelling across multiple scales and at many levels of abstraction. Formal modelling, precise and formalised abstraction relationships, and computation also lie at the heart of computer science--and over the past decade a growing number of computer scientists have been bringing their discipline's core intellectual and computational tools to bear on biology in fascinating new ways. This paper explores some of the apparent points of contact between the two fields, in the context of a multi-disciplinary discussion on conceptual foundations of systems biology.

  13. Combining search space partition and abstraction for LTL model checking

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    The state space explosion problem is still the key obstacle for applying model checking to systems of industrial size. Abstraction-based methods have been particularly successful in this regard. This paper presents an approach based on refinement of search space partition and abstraction which combines these two techniques for reducing the complexity of model checking. The refinement depends on the representation of each portion of the search space; in particular, the search space can be refined stepwise to obtain a better reduction. As reported in the case study, the integration of search space partition and abstraction improves the efficiency of verification with respect to memory requirements and obtains a significant advantage over the use of either technique in isolation.

  14. Geochemistry Model Abstraction and Sensitivity Studies for the 21 PWR CSNF Waste Package

    Energy Technology Data Exchange (ETDEWEB)

    P. Bernot; S. LeStrange; E. Thomas; K. Zarrabi; S. Arthur

    2002-10-29

    The CSNF geochemistry model abstraction, as directed by the TWP (BSC 2002b), was developed to provide regression analysis of EQ6 cases to obtain abstracted values of pH (and in some cases HCO{sub 3}{sup -} concentration) for use in the Configuration Generator Model. The pH of the system is the controlling factor over U mineralization, CSNF degradation rate, and HCO{sub 3}{sup -} concentration in solution. The abstraction encompasses a large variety of combinations for the degradation rates of materials. The "base case" used EQ6 simulations looking at differing steel/alloy corrosion rates, drip rates, and percent fuel exposure. Other values such as the pH/HCO{sub 3}{sup -} dependent fuel corrosion rate and the corrosion rate of A516 were kept constant. Relationships were developed for pH as a function of these differing rates to be used in the calculation of total C and, subsequently, the fuel rate. An additional refinement to the abstraction was the addition of abstracted pH values for cases where there was limited O{sub 2} for waste package corrosion and a flushing fluid other than J-13, which has been used in all EQ6 calculations up to this point. These abstractions also used EQ6 simulations with varying combinations of corrosion rates of materials to abstract the pH (and HCO{sub 3}{sup -} in the case of the limiting O{sub 2} cases) as a function of WP materials corrosion rates. The goodness of fit for most of the abstracted values was above an R{sup 2} of 0.9. Those below this value occurred at the very beginning of WP corrosion, when large variations in the system pH are observed. However, the significance of the F-statistic for all the abstractions showed that the variable relationships are significant. For the abstraction, an analysis of the minerals that may form the "sludge" in the waste package was also presented. This analysis indicates that a number of different iron and aluminum minerals may form in

  15. Development of an automated core model for nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Mosteller, R.D.

    1998-12-31

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The objective of this project was to develop an automated package of computer codes that can model the steady-state behavior of nuclear-reactor cores of various designs. As an added benefit, data produced for steady-state analysis also can be used as input to the TRAC transient-analysis code for subsequent safety analysis of the reactor at any point in its operating lifetime. The basic capability to perform steady-state reactor-core analysis already existed in the combination of the HELIOS lattice-physics code and the NESTLE advanced nodal code. In this project, the automated package was completed by (1) obtaining cross-section libraries for HELIOS, (2) validating HELIOS by comparing its predictions to results from critical experiments and from the MCNP Monte Carlo code, (3) validating NESTLE by comparing its predictions to results from numerical benchmarks and to measured data from operating reactors, and (4) developing a linkage code to transform HELIOS output into NESTLE input.

  16. An Automated 3d Indoor Topological Navigation Network Modelling

    Science.gov (United States)

    Jamali, A.; Rahman, A. A.; Boguslawski, P.; Gold, C. M.

    2015-10-01

    Indoor navigation is important for various applications such as disaster management and safety analysis. In the last decade, the indoor environment has been a focus of wide research, including techniques for acquiring indoor data (e.g. terrestrial laser scanning), 3D indoor modelling, and 3D indoor navigation models. In this paper, an automated 3D topological indoor network generated from inaccurate 3D building models is proposed. In a normal scenario, 3D indoor navigation network derivation needs accurate 3D models with no errors (e.g. gaps, intersections), and two cells (e.g. rooms, corridors) should touch each other to establish their connections. The presented 3D modeling of the indoor navigation network is based on surveying control points and is less dependent on the 3D geometrical building model. To reduce the time and cost of the indoor building data acquisition process, a Trimble LaserAce 1000 surveying instrument is used. The modelling results were validated against an accurate geometry of the indoor building environment which was acquired using a Trimble M3 total station.

  17. Flightdeck Automation Problems (FLAP) Model for Safety Technology Portfolio Assessment

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    NASA's Aviation Safety Program (AvSP) develops and advances methodologies and technologies to improve air transportation safety. The Safety Analysis and Integration Team (SAIT) conducts a safety technology portfolio assessment (PA) to analyze the program content, to examine the benefits and risks of products with respect to program goals, and to support programmatic decision making. The PA process includes systematic identification of current and future safety risks as well as tracking several quantitative and qualitative metrics to ensure the program goals are addressing prominent safety risks accurately and effectively. One of the metrics within the PA process involves using quantitative aviation safety models to gauge the impact of the safety products. This paper demonstrates the role of aviation safety modeling by providing model outputs and evaluating a sample of portfolio elements using the Flightdeck Automation Problems (FLAP) model. The model enables not only ranking of the quantitative relative risk reduction impact of all portfolio elements, but also highlighting the areas with high potential impact via sensitivity and gap analyses in support of the program office. Although the model outputs are preliminary and products are notional, the process shown in this paper is essential to a comprehensive PA of NASA's safety products in the current program and future programs/projects.

  18. Random many-particle systems: applications from biology, and propagation of chaos in abstract models

    CERN Document Server

    Wennberg, Bernt

    2011-01-01

    The paper discusses a family of Markov processes that represent many-particle systems, and their limiting behaviour when the number of particles goes to infinity. The first part concerns models of biological systems: a model for sympatric speciation, i.e. the process in which a genetically homogeneous population is split into two or more different species sharing the same habitat, and models for swarming animals. The second part of the paper deals with abstract many-particle systems, and methods for rigorously deriving mean field models.

  19. Software abstractions logic, language, and analysis

    CERN Document Server

    Jackson, Daniel

    2011-01-01

    In Software Abstractions Daniel Jackson introduces an approach to software design that draws on traditional formal methods but exploits automated tools to find flaws as early as possible. This approach--which Jackson calls "lightweight formal methods" or "agile modeling"--takes from formal specification the idea of a precise and expressive notation based on a tiny core of simple and robust concepts but replaces conventional analysis based on theorem proving with a fully automated analysis that gives designers immediate feedback. Jackson has developed Alloy, a language that captures the essence of software abstractions simply and succinctly, using a minimal toolkit of mathematical notions. This revised edition updates the text, examples, and appendixes to be fully compatible with the latest version of Alloy (Alloy 4). The designer can use automated analysis not only to correct errors but also to make models that are more precise and elegant. This approach, Jackson says, can rescue designers from "the tarpit of...

  20. HLA-Modeler: Automated Homology Modeling of Human Leukocyte Antigens

    Directory of Open Access Journals (Sweden)

    Shinji Amari

    2013-01-01

    The three-dimensional (3D) structures of human leukocyte antigen (HLA) molecules are indispensable for studies on their functions at the molecular level. We have developed a homology modeling system named HLA-modeler specialized in HLA molecules. A segment matching algorithm is employed for modeling, and the optimization of the model is carried out by use of the PFROSST force field considering the implicit solvent model. In order to efficiently construct the homology models, HLA-modeler uses a local database of the 3D structures of HLA molecules. The structure of the antigenic peptide-binding site is important for the function, and the 3D structure is highly conserved between various alleles. HLA-modeler optimizes the use of this structural motif. The leave-one-out cross-validation using the crystal structures of class I and class II HLA molecules has demonstrated that the rmsds of nonhydrogen atoms of the sites between homology models and crystal structures are less than 1.0 Å in most cases. The results indicate that the 3D structures of the antigenic peptide-binding sites can be reproduced by HLA-modeler at a level almost comparable to the crystal structures.
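
The rmsd criterion used in the validation can be sketched as follows; the coordinates below are hypothetical and assumed already superposed (a real model-vs-crystal comparison first requires structural alignment):

```python
import math

def rmsd(coords_a, coords_b):
    """Root-mean-square deviation between two already-superposed
    sets of atomic coordinates (lists of (x, y, z) tuples)."""
    assert len(coords_a) == len(coords_b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))

# Hypothetical model vs. crystal coordinates for three atoms (Å):
model   = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (1.5, 1.4, 0.0)]
crystal = [(0.1, 0.0, 0.0), (1.4, 0.1, 0.0), (1.6, 1.3, 0.0)]
# rmsd(model, crystal) ≈ 0.13 Å, well under the 1.0 Å threshold
```

In the study this statistic is evaluated over all non-hydrogen atoms of the peptide-binding site for each left-out structure.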

  1. HLA-Modeler: Automated Homology Modeling of Human Leukocyte Antigens.

    Science.gov (United States)

    Amari, Shinji; Kataoka, Ryoichi; Ikegami, Takashi; Hirayama, Noriaki

    2013-01-01

    The three-dimensional (3D) structures of human leukocyte antigen (HLA) molecules are indispensable for the studies on the functions at molecular level. We have developed a homology modeling system named HLA-modeler specialized in the HLA molecules. Segment matching algorithm is employed for modeling and the optimization of the model is carried out by use of the PFROSST force field considering the implicit solvent model. In order to efficiently construct the homology models, HLA-modeler uses a local database of the 3D structures of HLA molecules. The structure of the antigenic peptide-binding site is important for the function and the 3D structure is highly conserved between various alleles. HLA-modeler optimizes the use of this structural motif. The leave-one-out cross-validation using the crystal structures of class I and class II HLA molecules has demonstrated that the rmsds of nonhydrogen atoms of the sites between homology models and crystal structures are less than 1.0 Å in most cases. The results have indicated that the 3D structures of the antigenic peptide-binding sites can be reproduced by HLA-modeler at a level almost comparable to the crystal structures.

  2. Model-Based Control for Postal Automation and Baggage Handling

    NARCIS (Netherlands)

    Tarau, A.N.

    2010-01-01

    In this thesis we focus on two specific transportation systems, namely postal automation and baggage handling. Postal automation: During the last decades the volume of magazines, catalogs, and other plastic-wrapped mail items that have to be processed by post sorting centers has increased considerably

  3. Technical Work Plan for: Near Field Environment: Engineered System: Radionuclide Transport Abstraction Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J.D. Schreiber

    2006-12-08

    This technical work plan (TWP) describes work activities to be performed by the Near-Field Environment Team. The objective of the work scope covered by this TWP is to generate Revision 03 of EBS Radionuclide Transport Abstraction, referred to herein as the radionuclide transport abstraction (RTA) report. The RTA report is being revised primarily to address condition reports (CRs), to address issues identified by the Independent Validation Review Team (IVRT), to address the potential impact of transport, aging, and disposal (TAD) canister design on transport models, and to ensure integration with other models that are closely associated with the RTA report and being developed or revised in other analysis/model reports in response to IVRT comments. The RTA report will be developed in accordance with the most current version of LP-SIII.10Q-BSC and will reflect current administrative procedures (LP-3.15Q-BSC, "Managing Technical Product Inputs"; LP-SIII.2Q-BSC, "Qualification of Unqualified Data"; etc.), and will develop related Document Input Reference System (DIRS) reports and data qualifications as applicable in accordance with prevailing procedures. The RTA report consists of three models: the engineered barrier system (EBS) flow model, the EBS transport model, and the EBS-unsaturated zone (UZ) interface model. The flux-splitting submodel in the EBS flow model will change, so the EBS flow model will be validated again. The EBS transport model and validation of the model will be substantially revised in Revision 03 of the RTA report, which is the main subject of this TWP. The EBS-UZ interface model may be changed in Revision 03 of the RTA report due to changes in the conceptualization of the UZ transport abstraction model (a particle tracker transport model based on the discrete fracture transfer function will be used instead of the dual-continuum transport model previously used). Validation of the EBS-UZ interface model

  4. An initial-abstraction, constant-loss model for unit hydrograph modeling for applicable watersheds in Texas

    Science.gov (United States)

    Asquith, William H.; Roussel, Meghan C.

    2007-01-01

    Estimation of representative hydrographs from design storms, which are known as design hydrographs, provides for cost-effective, riskmitigated design of drainage structures such as bridges, culverts, roadways, and other infrastructure. During 2001?07, the U.S. Geological Survey (USGS), in cooperation with the Texas Department of Transportation, investigated runoff hydrographs, design storms, unit hydrographs,and watershed-loss models to enhance design hydrograph estimation in Texas. Design hydrographs ideally should mimic the general volume, peak, and shape of observed runoff hydrographs. Design hydrographs commonly are estimated in part by unit hydrographs. A unit hydrograph is defined as the runoff hydrograph that results from a unit pulse of excess rainfall uniformly distributed over the watershed at a constant rate for a specific duration. A time-distributed, watershed-loss model is required for modeling by unit hydrographs. This report develops a specific time-distributed, watershed-loss model known as an initial-abstraction, constant-loss model. For this watershed-loss model, a watershed is conceptualized to have the capacity to store or abstract an absolute depth of rainfall at and near the beginning of a storm. Depths of total rainfall less than this initial abstraction do not produce runoff. The watershed also is conceptualized to have the capacity to remove rainfall at a constant rate (loss) after the initial abstraction is satisfied. Additional rainfall inputs after the initial abstraction is satisfied contribute to runoff if the rainfall rate (intensity) is larger than the constant loss. The initial abstraction, constant-loss model thus is a two-parameter model. The initial-abstraction, constant-loss model is investigated through detailed computational and statistical analysis of observed rainfall and runoff data for 92 USGS streamflow-gaging stations (watersheds) in Texas with contributing drainage areas from 0.26 to 166 square miles. 
The analysis is
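The initial-abstraction, constant-loss model described in this record is fully specified by its two parameters, so it is simple to sketch in code. The following illustration is ours, not the USGS report's; the function and parameter names are invented:

```python
def excess_rainfall(rainfall, ia, cl):
    """Excess-rainfall hyetograph under an initial-abstraction,
    constant-loss model.

    rainfall : incremental rainfall depths per time interval (e.g. inches)
    ia       : initial abstraction, the depth stored before any runoff
    cl       : constant loss per interval once the abstraction is satisfied
    """
    remaining_ia = ia
    excess = []
    for depth in rainfall:
        absorbed = min(depth, remaining_ia)   # fill the initial abstraction first
        remaining_ia -= absorbed
        depth -= absorbed
        excess.append(max(0.0, depth - cl))   # then a constant-rate loss applies
    return excess

# A storm whose early pulses are absorbed; later, more intense pulses run off.
hyetograph = excess_rainfall([0.2, 0.5, 1.0, 0.3], ia=0.5, cl=0.2)
```

Convolving such an excess hyetograph with a unit hydrograph yields the design runoff hydrograph the report describes.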

  5. Aviation Safety: Modeling and Analyzing Complex Interactions between Humans and Automated Systems

    Science.gov (United States)

    Rungta, Neha; Brat, Guillaume; Clancey, William J.; Linde, Charlotte; Raimondi, Franco; Seah, Chin; Shafto, Michael

    2013-01-01

    The on-going transformation from the current US Air Traffic System (ATS) to the Next Generation Air Traffic System (NextGen) will force the introduction of new automated systems and most likely will cause automation to migrate from ground to air. This will yield new function allocations between humans and automation and therefore change the roles and responsibilities in the ATS. Yet, safety in NextGen is required to be at least as good as in the current system. We therefore need techniques to evaluate the safety of the interactions between humans and automation. We think that current human-factors studies and simulation-based techniques will fall short in the face of the complexity of the ATS, and that we need to add more automated techniques to simulations, such as model checking, which offers exhaustive coverage of the non-deterministic behaviors in nominal and off-nominal scenarios. In this work, we present a verification approach based both on simulations and on model checking for evaluating the roles and responsibilities of humans and automation. Models are created using Brahms (a multi-agent framework), and we show that the traditional Brahms simulations can be integrated with automated exploration techniques based on model checking, thus offering a complete exploration of the behavioral space of the scenario. Our formal analysis supports the notion of beliefs and probabilities to reason about human behavior. We demonstrate the technique with the Überlingen accident, since it exemplifies authority problems when receiving conflicting advice from human and automated systems.

  6. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  7. Individual Differences in Response to Automation: The Five Factor Model of Personality

    Science.gov (United States)

    Szalma, James L.; Taylor, Grant S.

    2011-01-01

    This study examined the relationship of operator personality (Five Factor Model) and characteristics of the task and of adaptive automation (reliability and adaptiveness--whether the automation was well-matched to changes in task demand) to operator performance, workload, stress, and coping. This represents the first investigation of how the Five…

  8. Virtual Machine Support for Many-Core Architectures: Decoupling Abstract from Concrete Concurrency Models

    Directory of Open Access Journals (Sweden)

    Stefan Marr

    2010-02-01

    Full Text Available The upcoming many-core architectures require software developers to exploit concurrency to utilize available computational power. Today's high-level language virtual machines (VMs), which are a cornerstone of software development, do not provide sufficient abstraction for concurrency concepts. We analyze concrete and abstract concurrency models and identify the challenges they impose for VMs. To provide sufficient concurrency support in VMs, we propose to integrate concurrency operations into VM instruction sets. Since there will always be VMs optimized for special purposes, our goal is to develop a methodology to design instruction sets with concurrency support. Therefore, we also propose a list of trade-offs that have to be investigated to advise the design of such instruction sets. As a first experiment, we implemented one instruction set extension for shared memory and one for non-shared memory concurrency. From our experimental results, we derived a list of requirements for a full-grown experimental environment for further research.

  9. The abstract geometry modeling language (AgML): experience and road map toward eRHIC

    International Nuclear Information System (INIS)

    The STAR experiment has adopted an Abstract Geometry Modeling Language (AgML) as the primary description of our geometry model. AgML establishes a level of abstraction, decoupling the definition of the detector from the software libraries used to create the concrete geometry model. Thus, AgML allows us to support both our legacy GEANT 3 simulation application and our ROOT/TGeo based reconstruction software from a single source, which is demonstrably self-consistent. While AgML was developed primarily as a tool to migrate away from our legacy FORTRAN-era geometry codes, it also provides a rich syntax geared towards the rapid development of detector models. AgML has been successfully employed by users to quickly develop and integrate the descriptions of several new detectors in the RHIC/STAR experiment including the Forward GEM Tracker (FGT) and Heavy Flavor Tracker (HFT) upgrades installed in STAR for the 2012 and 2013 runs. AgML has furthermore been heavily utilized to study future upgrades to the STAR detector as it prepares for the eRHIC era. With its track record of practical use in a live experiment in mind, we present the status, lessons learned and future of the AgML language as well as our experience in bringing the code into our production and development environments. We will discuss the path toward eRHIC and pushing the current model to accommodate detector misalignment and high-precision physics.

  10. The abstract geometry modeling language (AgML): experience and road map toward eRHIC

    Science.gov (United States)

    Webb, Jason; Lauret, Jerome; Perevoztchikov, Victor

    2014-06-01

    The STAR experiment has adopted an Abstract Geometry Modeling Language (AgML) as the primary description of our geometry model. AgML establishes a level of abstraction, decoupling the definition of the detector from the software libraries used to create the concrete geometry model. Thus, AgML allows us to support both our legacy GEANT 3 simulation application and our ROOT/TGeo based reconstruction software from a single source, which is demonstrably self-consistent. While AgML was developed primarily as a tool to migrate away from our legacy FORTRAN-era geometry codes, it also provides a rich syntax geared towards the rapid development of detector models. AgML has been successfully employed by users to quickly develop and integrate the descriptions of several new detectors in the RHIC/STAR experiment including the Forward GEM Tracker (FGT) and Heavy Flavor Tracker (HFT) upgrades installed in STAR for the 2012 and 2013 runs. AgML has furthermore been heavily utilized to study future upgrades to the STAR detector as it prepares for the eRHIC era. With its track record of practical use in a live experiment in mind, we present the status, lessons learned and future of the AgML language as well as our experience in bringing the code into our production and development environments. We will discuss the path toward eRHIC and pushing the current model to accommodate detector misalignment and high-precision physics.

  11. Automated forward mechanical modeling of wrinkle ridges on Mars

    Science.gov (United States)

    Nahm, Amanda; Peterson, Samuel

    2016-04-01

    One of the main goals of the InSight mission to Mars is to understand the internal structure of Mars [1], in part through passive seismology. Understanding the shallow surface structure of the landing site is critical to the robust interpretation of recorded seismic signals. Faults, such as the wrinkle ridges abundant in the proposed landing site in Elysium Planitia, can be used to determine the subsurface structure of the regions they deform. Here, we test a new automated method for modeling of the topography of a wrinkle ridge (WR) in Elysium Planitia, allowing for faster and more robust determination of subsurface fault geometry for interpretation of the local subsurface structure. We perform forward mechanical modeling of fault-related topography [e.g., 2, 3], utilizing the modeling program Coulomb [4, 5] to model surface displacements induced by blind thrust faulting. Fault lengths are difficult to determine for WR; we initially assume a fault length of 30 km, but also test the effects of different fault lengths on model results. At present, we model the wrinkle ridge as a single blind thrust fault with a constant fault dip, though WR are likely to have more complicated fault geometry [e.g., 6-8]. Typically, the modeling is performed using the Coulomb GUI. This approach can be time consuming, requiring user inputs to change model parameters and to calculate the associated displacements for each model, which limits the number of models and parameter space that can be tested. To reduce active user computation time, we have developed a method in which the Coulomb GUI is bypassed. The general modeling procedure remains unchanged, and a set of input files is generated before modeling with ranges of pre-defined parameter values. The displacement calculations are divided into two suites. For Suite 1, a total of 3770 input files were generated in which the fault displacement (D), dip angle (δ), depth to upper fault tip (t), and depth to lower fault tip (B
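The GUI-bypassing workflow this record describes, pre-generating a suite of input files over ranges of D, δ, t, and B, can be sketched as below. The file layout and the numeric ranges here are placeholders for illustration, not Coulomb's actual input syntax or the study's parameter values:

```python
import itertools
import pathlib
import tempfile

# Placeholder parameter ranges for the fault-geometry sweep.
displacements = [50, 100, 200]   # fault displacement D (m)
dips = [20, 30, 40]              # dip angle delta (degrees)
upper_tips = [0.5, 1.0]          # depth to upper fault tip t (km)
lower_tips = [5.0, 10.0]         # depth to lower fault tip B (km)

outdir = pathlib.Path(tempfile.mkdtemp(prefix="coulomb_inputs_"))

# One input file per combination of parameter values.
for i, (d, dip, t, b) in enumerate(
        itertools.product(displacements, dips, upper_tips, lower_tips)):
    text = (f"displacement_m = {d}\n"
            f"dip_deg       = {dip}\n"
            f"upper_tip_km  = {t}\n"
            f"lower_tip_km  = {b}\n")
    (outdir / f"model_{i:04d}.inp").write_text(text)
```

Each generated file then drives one displacement calculation, so the whole parameter space is explored without interactive input.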

  12. Model-Based Control for Postal Automation and Baggage Handling

    OpenAIRE

    Tarau, A.N.

    2010-01-01

    In this thesis we focus on two specific transportation systems, namely postal automation and baggage handling. Postal automation: During the last decades the volume of magazines, catalogs, and other plastic wrapped mail items that have to be processed by post sorting centers has increased considerably. In order to be able to handle the large volumes of mail, state-of-the-art post sorting centers are equipped with dedicated mail sorting machines. The throughput of a post sorting machine is def...

  13. Implementing the WebSocket Protocol Based on Formal Modelling and Automated Code Generation

    OpenAIRE

    Simonsen, Kent; Kristensen, Lars

    2014-01-01

    Model-based software engineering offers several attractive benefits for the implementation of protocols, including automated code generation for different platforms from design-level models. In earlier work, we have proposed a template-based approach using Coloured Petri Net formal models with pragmatic annotations for automated code generation of protocol software. The contribution of this paper is an application of the approach as implemented in the PetriCode tool to obtain protocol softwar...

  14. Modelling the sensitivity of river reaches to water abstraction: RAPHSA- a hydroecology tool for environmental managers

    Science.gov (United States)

    Klaar, Megan; Laize, Cedric; Maddock, Ian; Acreman, Mike; Tanner, Kath; Peet, Sarah

    2014-05-01

    A key challenge for environmental managers is the determination of environmental flows which allow a maximum yield of water resources to be taken from surface and sub-surface sources, whilst ensuring sufficient water remains in the environment to support biota and habitats. It has long been known that sensitivity to changes in water levels resulting from river and groundwater abstractions varies between rivers. Whilst assessment at the catchment scale is ideal for determining broad pressures on water resources and ecosystems, assessment of the sensitivity of reaches to changes in flow has previously been done on a site-by-site basis, often with the application of detailed but time-consuming techniques (e.g. PHABSIM). While this is appropriate for a limited number of sites, it is costly in terms of money and time and therefore not appropriate for application at the national level required by responsible licensing authorities. To address this need, the Environment Agency (England) is developing an operational tool to predict relationships between physical habitat and flow which may be applied by field staff to rapidly determine the sensitivity of physical habitat to flow alteration for use in water resource management planning. An initial model of river sensitivity to abstraction (defined as the change in physical habitat related to changes in river discharge) was developed using site characteristics and data from 66 individual PHABSIM surveys throughout the UK (Booker & Acreman, 2008). By applying a multivariate multiple linear regression analysis to the data to define habitat availability-flow curves using resource intensity as predictor variables, the model (known as RAPHSA, Rapid Assessment of Physical Habitat Sensitivity to Abstraction) is able to take a risk-based approach to modeled certainty. Site specific information gathered using desk-based, or a variable amount of field work can be used to predict the shape of the habitat- flow curves, with the

  15. The prototype effect revisited: Evidence for an abstract feature model of face recognition.

    Science.gov (United States)

    Wallis, Guy; Siebeck, Ulrike E; Swann, Kellie; Blanz, Volker; Bülthoff, Heinrich H

    2008-01-01

    Humans typically have a remarkable memory for faces. Nonetheless, in some cases they can be fooled. Experiments described in this paper provide new evidence for an effect in which observers falsely "recognize" a face that they have never seen before. The face is a chimera (prototype) built from parts extracted from previously viewed faces. It is known that faces of this kind can be confused with truly familiar faces, a result referred to as the prototype effect. However, recent studies have failed to find evidence for a full effect, one in which the prototype is regarded not only as familiar, but as more familiar than faces which have been seen before. This study sought to reinvestigate the effect. In a pair of experiments, evidence is reported for the full effect based on both an old/new discrimination task and a familiarity ranking task. The results are shown to be consistent with a recognition model in which faces are represented as combinations of reusable, abstract features. In a final experiment, novel predictions of the model are verified by comparing the size of the prototype effect for upright and upside-down faces. Despite the fundamentally piecewise nature of the model, an explanation is provided as to how it can also account for the sensitivity of observers to configural and holistic cues. This discussion is backed up with the use of an unsupervised network model. Overall, the paper describes how an abstract feature-based model can reconcile a range of results in the face recognition literature and, in turn, lessen currently perceived differences between the representation of faces and other objects. PMID:18484826

  16. Simulation of Self-Assembly in the Abstract Tile Assembly Model with ISU TAS

    CERN Document Server

    Patitz, Matthew J

    2011-01-01

    Since its introduction by Erik Winfree in 1998, the abstract Tile Assembly Model (aTAM) has inspired a wealth of research. As an abstract model for tile based self-assembly, it has proven to be remarkably powerful and expressive in terms of the structures which can self-assemble within it. As research has progressed in the aTAM, the self-assembling structures being studied have become progressively more complex. This increasing complexity, along with a need for standardization of definitions and tools among researchers, motivated the development of the Iowa State University Tile Assembly Simulator (ISU TAS). ISU TAS is a graphical simulator and tile set editor for designing and building 2-D and 3-D aTAM tile assembly systems and simulating their self-assembly. This paper reviews the features and functionality of ISU TAS and describes how it can be used to further research into the complexities of the aTAM. Software and source code are available at http://www.cs.iastate.edu/~lnsa.
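The attachment rule at the heart of the aTAM (and hence of an ISU TAS simulation step) is compact: at temperature τ, a tile may attach at a site if its glues that match the already-placed neighbouring tiles have total strength at least τ. A minimal sketch, with a data layout of our own choosing:

```python
NULL = (None, 0)                              # "no glue" on a side
OPPOSITE = {"N": "S", "E": "W", "S": "N", "W": "E"}

def can_attach(tile, neighbors, temperature=2):
    """aTAM attachment rule: the summed strength of glues matching the
    adjacent, already-placed tiles must reach the temperature."""
    total = 0
    for side, nbr in neighbors.items():
        if nbr is None:                       # empty neighbouring site
            continue
        label, strength = tile[side]
        if label is not None and (label, strength) == nbr[OPPOSITE[side]]:
            total += strength
    return total >= temperature

tile = {"N": ("a", 1), "E": ("b", 1), "S": NULL, "W": NULL}
north = {"N": NULL, "E": NULL, "S": ("a", 1), "W": NULL}   # placed above the site
east = {"N": NULL, "E": NULL, "S": NULL, "W": ("b", 1)}    # placed to the right
```

Two cooperating strength-1 bonds suffice at temperature 2, while either bond alone does not; this cooperativity is a major source of the model's expressive power.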

  17. Automated 4D analysis of dendritic spine morphology: applications to stimulus-induced spine remodeling and pharmacological rescue in a disease model

    Directory of Open Access Journals (Sweden)

    Swanger Sharon A

    2011-10-01

    Full Text Available Abstract Uncovering the mechanisms that regulate dendritic spine morphology has been limited, in part, by the lack of efficient and unbiased methods for analyzing spines. Here, we describe an automated 3D spine morphometry method and its application to spine remodeling in live neurons and spine abnormalities in a disease model. We anticipate that this approach will advance studies of synapse structure and function in brain development, plasticity, and disease.

  18. Inventory Abstraction

    Energy Technology Data Exchange (ETDEWEB)

    C. Leigh

    2000-11-03

    The purpose of the inventory abstraction as directed by the development plan (CRWMS M&O 1999b) is to: (1) Interpret the results of a series of relative dose calculations (CRWMS M&O 1999c, 1999d). (2) Recommend, including a basis thereof, a set of radionuclides that should be modeled in the Total System Performance Assessment in Support of the Site Recommendation (TSPA-SR) and the Total System Performance Assessment in Support of the Final Environmental Impact Statement (TSPA-FEIS). (3) Provide initial radionuclide inventories for the TSPA-SR and TSPA-FEIS models. (4) Answer the U.S. Nuclear Regulatory Commission (NRC)'s Issue Resolution Status Report ''Key Technical Issue: Container Life and Source Term'' (CLST IRSR) (NRC 1999) key technical issue (KTI): ''The rate at which radionuclides in SNF [Spent Nuclear Fuel] are released from the EBS [Engineered Barrier System] through the oxidation and dissolution of spent fuel'' (Subissue 3). The scope of the radionuclide screening analysis encompasses the period from 100 years to 10,000 years after the potential repository at Yucca Mountain is sealed for scenarios involving the breach of a waste package and subsequent degradation of the waste form as required for the TSPA-SR calculations. By extending the time period considered to one million years after repository closure, recommendations are made for the TSPA-FEIS. The waste forms included in the inventory abstraction are Commercial Spent Nuclear Fuel (CSNF), DOE Spent Nuclear Fuel (DSNF), High-Level Waste (HLW), naval Spent Nuclear Fuel (SNF), and U.S. Department of Energy (DOE) plutonium waste. The intended use of this analysis is in TSPA-SR and TSPA-FEIS. Based on the recommendations made here, models for release, transport, and possibly exposure will be developed for the isotopes that would be the highest contributors to the dose given a release to the accessible environment. The inventory abstraction is important in

  19. INVENTORY ABSTRACTION

    Energy Technology Data Exchange (ETDEWEB)

    G. Ragan

    2001-12-19

    The purpose of the inventory abstraction, which has been prepared in accordance with a technical work plan (CRWMS M&O 2000e for ICN 02 of the present analysis, and BSC 2001e for ICN 03 of the present analysis), is to: (1) Interpret the results of a series of relative dose calculations (CRWMS M&O 2000c, 2000f). (2) Recommend, including a basis thereof, a set of radionuclides that should be modeled in the Total System Performance Assessment in Support of the Site Recommendation (TSPA-SR) and the Total System Performance Assessment in Support of the Final Environmental Impact Statement (TSPA-FEIS). (3) Provide initial radionuclide inventories for the TSPA-SR and TSPA-FEIS models. (4) Answer the U.S. Nuclear Regulatory Commission (NRC)'s Issue Resolution Status Report ''Key Technical Issue: Container Life and Source Term'' (CLST IRSR) key technical issue (KTI): ''The rate at which radionuclides in SNF [spent nuclear fuel] are released from the EBS [engineered barrier system] through the oxidation and dissolution of spent fuel'' (NRC 1999, Subissue 3). The scope of the radionuclide screening analysis encompasses the period from 100 years to 10,000 years after the potential repository at Yucca Mountain is sealed for scenarios involving the breach of a waste package and subsequent degradation of the waste form as required for the TSPA-SR calculations. By extending the time period considered to one million years after repository closure, recommendations are made for the TSPA-FEIS. The waste forms included in the inventory abstraction are Commercial Spent Nuclear Fuel (CSNF), DOE Spent Nuclear Fuel (DSNF), High-Level Waste (HLW), naval Spent Nuclear Fuel (SNF), and U.S. Department of Energy (DOE) plutonium waste. The intended use of this analysis is in TSPA-SR and TSPA-FEIS. Based on the recommendations made here, models for release, transport, and possibly exposure will be developed for the isotopes that would be the highest

  20. Towards an Integrated System Model for Testing and Verification of Automation Machines

    OpenAIRE

    Peter Braun; Benjamin Hummel

    2016-01-01

    The models and documents created during the development of automation machines typically can be categorized into mechanics, electronics, and software/controller. The functionality of an automation machine is, however, usually realized by the interaction of all three of these domains. So no single model covering only one development category will be able to describe the behavior of the machine thoroughly. For early planning of the machine design, virtual prototypes, and especially for the form...

  1. Automating the Extraction of Model-Based Software Product Lines from Model Variants

    OpenAIRE

    Martinez, Jabier; Ziadi, Tewfik; Klein, Jacques; Le Traon, Yves

    2015-01-01

    We address the problem of automating 1) the analysis of existing similar model variants and 2) migrating them into a software product line. Our approach, named MoVa2PL, considers the identification of variability and commonality in model variants, as well as the extraction of a CVL-compliant Model-based Software Product Line (MSPL) from the features identified on these variants. MoVa2PL builds on a generic representation of models making it suitable to any MOF-based ...

  2. Côte de Resyste : Automated Model Based Testing

    NARCIS (Netherlands)

    Tretmans, Jan; Brinksma, Ed; Schweizer, M.

    2002-01-01

    Systematic testing is very important for assessing and improving the quality of embedded software. Yet, testing turns out to be expensive, laborious, time-consuming and error-prone. The project Côte de Resyste has been working since 1998 on methods, techniques and tools for automating specification

  3. TorX: Automated Model-Based Testing

    NARCIS (Netherlands)

    Tretmans, Jan; Brinksma, Ed; Hartman, A.; Dussa-Ziegler, K.

    2003-01-01

    Systematic testing is very important for assessing and improving the quality of software systems. Yet, testing turns out to be expensive, laborious, time-consuming and error-prone. The Dutch research and development project Côte de Resyste worked on methods, techniques and tools for automating speci

  4. Industrial Automation Mechanic Model Curriculum Project. Final Report.

    Science.gov (United States)

    Toledo Public Schools, OH.

    This document describes a demonstration program that developed secondary level competency-based instructional materials for industrial automation mechanics. Program activities included task list compilation, instructional materials research, learning activity packet (LAP) development, construction of lab elements, system implementation,…

  5. Context based mixture model for cell phase identification in automated fluorescence microscopy

    Directory of Open Access Journals (Sweden)

    Zhou Xiaobo

    2007-01-01

    Full Text Available Abstract Background Automated identification of cell cycle phases of individual live cells in a large population captured via automated fluorescence microscopy technique is important for cancer drug discovery and cell cycle studies. Time-lapse fluorescence microscopy images provide an important method to study the cell cycle process under different conditions of perturbation. Existing methods are limited in dealing with such time-lapse data sets while manual analysis is not feasible. This paper presents statistical data analysis and statistical pattern recognition to perform this task. Results The data is generated from HeLa H2B-GFP cells imaged during a 2-day period with images acquired 15 minutes apart using an automated time-lapse fluorescence microscopy. The patterns are described with four kinds of features, including twelve general features, Haralick texture features, Zernike moment features, and wavelet features. To generate a new set of features with more discriminative power, the commonly used feature reduction techniques are used, which include Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Maximum Margin Criterion (MMC), Stepwise Discriminant Analysis based Feature Selection (SDAFS), and Genetic Algorithm based Feature Selection (GAFS). Then, we propose a Context Based Mixture Model (CBMM) for dealing with the time-series cell sequence information and compare it to other traditional classifiers: Support Vector Machine (SVM), Neural Network (NN), and K-Nearest Neighbor (KNN). Being a standard practice in machine learning, we systematically compare the performance of a number of common feature reduction techniques and classifiers to select an optimal combination of a feature reduction technique and a classifier. A cellular database containing 100 manually labelled subsequences is built for evaluating the performance of the classifiers. The generalization error is estimated using the cross validation technique. The
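As a toy stand-in for the feature-reduction-plus-classifier pipelines compared in this record (here PCA followed by a k-nearest-neighbour classifier; the data and features below are synthetic, not the study's):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic feature vectors standing in for the cell-image features
# (texture, Zernike-moment, wavelet, ...) used in the study.
n, d = 60, 10
X = rng.normal(size=(n, d))
y = np.array([0, 1] * (n // 2))
X[y == 1, 0] += 4.0          # make the two phases separable along one axis

def pca_reduce(X, k):
    """Project the data onto its top-k principal components."""
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ vt[:k].T

def knn_predict(train_X, train_y, test_X, k=3):
    """Plain k-nearest-neighbour classification by majority vote."""
    preds = []
    for x in test_X:
        nearest = np.argsort(np.linalg.norm(train_X - x, axis=1))[:k]
        preds.append(np.bincount(train_y[nearest]).argmax())
    return np.array(preds)

Z = pca_reduce(X, k=2)
preds = knn_predict(Z[:40], y[:40], Z[40:])
accuracy = (preds == y[40:]).mean()
```

The study's contribution, the context-based mixture model, additionally exploits the temporal ordering of the cell sequences, which this per-frame sketch ignores.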

  6. New E-Commerce Model Based on Multi-Agent Automated Negotiation

    Institute of Scientific and Technical Information of China (English)

    向传杰; 贾云得

    2003-01-01

    A new multi-agent automated negotiation model is developed and evaluated, in which two competitive agents, such as the buyer and seller, have firm deadlines and incomplete information about each other. The negotiation is multi-dimensional in different cases. The model is discussed in six cases with different price strategies, warranty strategies and time strategies. The model improves the models of Wooldridge and of Sycara to a certain extent. In all possible situations, the optimal negotiation strategy is analyzed and presented, and an e-commerce model based on the multi-agent automated negotiation model is also illustrated for future e-commerce applications.
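The abstract does not reproduce the strategy functions themselves, but negotiation between agents with firm deadlines and incomplete information is commonly formalised with time-dependent concession tactics such as the sketch below (the function, parameter names and values are ours, not the paper's):

```python
def offer(t, deadline, p_min, p_max, beta, role):
    """Time-dependent concession: beta > 1 concedes early ("conceder"),
    beta < 1 holds out until near the deadline ("boulware")."""
    alpha = min(1.0, t / deadline) ** (1.0 / beta)
    if role == "buyer":                      # the buyer raises its offer
        return p_min + alpha * (p_max - p_min)
    return p_max - alpha * (p_max - p_min)   # the seller lowers its offer

# A one-dimensional (price-only) negotiation: a deal forms when offers cross.
deal = None
for t in range(0, 11):
    b = offer(t, deadline=10, p_min=50.0, p_max=100.0, beta=2.0, role="buyer")
    s = offer(t, deadline=12, p_min=60.0, p_max=120.0, beta=0.8, role="seller")
    if b >= s:
        deal = (t, (b + s) / 2)              # e.g. split the difference
        break
```

Multi-dimensional variants add warranty and delivery-time terms alongside price, as the record describes.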

  7. Automated Eukaryotic Gene Structure Annotation Using EVidenceModeler and the Program to Assemble Spliced Alignments

    Energy Technology Data Exchange (ETDEWEB)

    Haas, B J; Salzberg, S L; Zhu, W; Pertea, M; Allen, J E; Orvis, J; White, O; Buell, C R; Wortman, J R

    2007-12-10

    EVidenceModeler (EVM) is presented as an automated eukaryotic gene structure annotation tool that reports eukaryotic gene structures as a weighted consensus of all available evidence. EVM, when combined with the Program to Assemble Spliced Alignments (PASA), yields a comprehensive, configurable annotation system that predicts protein-coding genes and alternatively spliced isoforms. Our experiments on both rice and human genome sequences demonstrate that EVM produces automated gene structure annotation approaching the quality of manual curation.

  8. Feed forward neural networks and genetic algorithms for automated financial time series modelling

    OpenAIRE

    Kingdon, J. C.

    1995-01-01

    This thesis presents an automated system for financial time series modelling. Formal and applied methods are investigated for combining feed-forward Neural Networks and Genetic Algorithms (GAs) into a single adaptive/learning system for automated time series forecasting. Four important research contributions arise from this investigation: i) novel forms of GAs are introduced which are designed to counter the representational bias associated with the conventional Holland GA, ii) an...

  9. Bottom-Up Abstract Modelling of Optical Networks-on-Chip: From Physical to Architectural Layer

    Directory of Open Access Journals (Sweden)

    Alberto Parini

    2012-01-01

    Full Text Available This work presents a bottom-up abstraction procedure based on the design-flow FDTD + SystemC suitable for the modelling of optical Networks-on-Chip. In this procedure, a complex network is decomposed into elementary switching elements whose input-output behavior is described by means of scattering parameters models. The parameters of each elementary block are then determined through 2D-FDTD simulation, and the resulting analytical models are exported within functional blocks in SystemC environment. The inherent modularity and scalability of the S-matrix formalism are preserved inside SystemC, thus allowing the incremental composition and successive characterization of complex topologies typically out of reach for full-vectorial electromagnetic simulators. The consistency of the outlined approach is verified, in the first instance, by performing a SystemC analysis of a four-input, four-output ports switch and making a comparison with the results of 2D-FDTD simulations of the same device. Finally, a further complex network encompassing 160 microrings is investigated, the losses over each routing path are calculated, and the minimum amount of power needed to guarantee an assigned BER is determined. This work is a basic step in the direction of an automatic technology-aware network-level simulation framework capable of assembling complex optical switching fabrics, while at the same time assessing the practical feasibility and effectiveness at the physical/technological level.
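The modularity of the S-matrix formalism that this bottom-up flow relies on can be illustrated with the scalar cascade of two 2-port scattering blocks. The amplitudes are complex in general; real numbers are used here for brevity, and the routine is our illustration, not the paper's code:

```python
def cascade(sa, sb):
    """Redheffer-style cascade of two 2-port scattering matrices,
    each given as ((S11, S12), (S21, S22))."""
    (a11, a12), (a21, a22) = sa
    (b11, b12), (b21, b22) = sb
    denom = 1.0 - a22 * b11            # accounts for multiple reflections
    s11 = a11 + a12 * b11 * a21 / denom
    s12 = a12 * b12 / denom
    s21 = a21 * b21 / denom
    s22 = b22 + b21 * a22 * b12 / denom
    return ((s11, s12), (s21, s22))

# A matched, lossy element: no reflection, 90% amplitude transmission.
element = ((0.0, 0.9), (0.9, 0.0))
two_in_series = cascade(element, element)   # transmission compounds to 0.81
```

Composing blocks this way, with each block's parameters extracted from an FDTD run, is what lets large topologies be assembled without a full-vectorial simulation of the whole network.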

  10. The Conversion of Cardiovascular Conference Abstracts to Publications

    DEFF Research Database (Denmark)

    Fosbøl, Emil L.; Fosbøl, Philip Loldrup; Harrington, Robert A.;

    2012-01-01

    … a systematic and automated evaluation of rates, timing, and correlates of publication from scientific abstracts presented at 3 major cardiovascular conferences. Methods and Results—Using an automated computer algorithm, we searched the ISI Web of Science to identify peer-reviewed publications of abstracts presented at the American Heart Association (AHA), American College of Cardiology (ACC), and European Society of Cardiology (ESC) scientific sessions from 2006 to 2008. We compared abstract publication rates and journal impact factor between the 3 meetings using multivariable logistic regression modeling. … conference presentation in 2005, these rates had risen slightly to 49.7% for AHA, 42.6% for ACC, and 37.6% for ESC (P<0.0001). After adjustment for abstract characteristics and contributing countries, abstracts presented at the AHA meeting remained more likely for publication relative to the ESC (adjusted…

  11. Automated comparison of Bayesian reconstructions of experimental profiles with physical models

    International Nuclear Information System (INIS)

    In this work we developed an expert system that carries out in an integrated and fully automated way i) a reconstruction of plasma profiles from the measurements, using Bayesian analysis, ii) a prediction of the reconstructed quantities, according to some models, and iii) an intelligent comparison of the first two steps. This system includes systematic checking of the internal consistency of the reconstructed quantities, enables automated model validation and, if a well-validated model is used, can be applied to help detect interesting new physics in an experiment. The work shows three applications of this quite general system. The expert system can successfully detect failures in the automated plasma reconstruction and provide (on successful reconstruction cases) statistics of agreement of the models with the experimental data, i.e. information on the model validity. (author)

  12. Cellular Automaton Model of Traffic Flow Based on the Car-Following Model

    Institute of Scientific and Technical Information of China (English)

    LI Ke-Ping; GAO Zi-You

    2004-01-01

    We propose a new cellular automaton (CA) traffic model that is based on the car-following model. A class of driving strategies is used in the car-following model instead of the acceleration in the NaSch traffic model. In our model, some realistic driver behaviour and detailed vehicle characteristics have been taken into account, such as distance-headway and safe distance, etc. The simulation results show that our model can exhibit some traffic flow states that have been observed in real traffic, and both the maximum flux and the critical density are very close to real measurements. Moreover, it is easy to extend our method to multi-lane traffic.
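
    The update rule of such a CA model can be sketched in a few lines. The following is a minimal NaSch-style sketch with a gap-based safe-distance rule standing in for the paper's driving strategies; all names and parameter values (`v_max`, `p_slow`, `safe_gap`) are illustrative assumptions, not taken from the paper.

```python
import random

def step(positions, speeds, road_len, v_max=5, p_slow=0.3, safe_gap=1, rng=None):
    """One parallel update of a NaSch-style cellular automaton on a ring road.

    A simplified stand-in for the car-following CA described above: the
    acceleration rule is replaced by a gap-based driving strategy that keeps
    at least `safe_gap` empty cells ahead (distance-headway). Parameter
    values are illustrative, not the paper's.
    """
    if rng is None:
        rng = random
    n = len(positions)
    order = sorted(range(n), key=lambda i: positions[i])
    new_speeds = list(speeds)
    for idx, i in enumerate(order):
        ahead = order[(idx + 1) % n]
        gap = (positions[ahead] - positions[i] - 1) % road_len
        v = min(speeds[i] + 1, v_max)           # accelerate towards v_max
        v = min(v, max(gap - safe_gap + 1, 0))  # never close below safe_gap
        if v > 0 and rng.random() < p_slow:     # random slowdown
            v -= 1
        new_speeds[i] = v
    new_positions = [(positions[i] + new_speeds[i]) % road_len for i in range(n)]
    return new_positions, new_speeds
```

    Because each car moves at most up to the cell behind its predecessor's old position, the parallel update is collision-free by construction.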

  13. A Model of Process-Based Automation: Cost and Quality Implications in the Medication Management Process

    Science.gov (United States)

    Spaulding, Trent Joseph

    2011-01-01

    The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…

  14. Model of informational system for freight insurance automation based on digital signature

    Directory of Open Access Journals (Sweden)

    Maxim E. SLOBODYANYUK

    2009-01-01

    Full Text Available In the article, a model of an informational system for freight insurance automation based on a digital signature is considered; its architecture, a macro flowchart of the information flow in the model, and its components (modules) and their functions are shown. A method for calculating the costs of interactive cargo insurance via the proposed system is described, and the main characteristics and options of existing transport management systems and conceptual cost models are presented.

  15. Model-driven design using IEC 61499 a synchronous approach for embedded and automation systems

    CERN Document Server

    Yoong, Li Hsien; Bhatti, Zeeshan E; Kuo, Matthew M Y

    2015-01-01

    This book describes a novel approach for the design of embedded systems and industrial automation systems, using a unified model-driven approach that is applicable in both domains.  The authors illustrate their methodology, using the IEC 61499 standard as the main vehicle for specification, verification, static timing analysis and automated code synthesis.  The well-known synchronous approach is used as the main vehicle for defining an unambiguous semantics that ensures determinism and deadlock freedom. The proposed approach also ensures very efficient implementations either on small-scale embedded devices or on industry-scale programmable automation controllers (PACs). It can be used for both centralized and distributed implementations. Significantly, the proposed approach can be used without the need for any run-time support. This approach, for the first time, blurs the gap between embedded systems and automation systems and can be applied in wide-ranging applications in automotive, robotics, and industri...

  16. Model of automated computer aided NC machine tools programming

    Directory of Open Access Journals (Sweden)

    J. Balic

    2006-04-01

    Full Text Available Purpose: Modern companies tend towards the greatest possible automation in all areas. New concepts for controlling manufacturing processes require the development of adequate tools for introducing automated control in a given area. The paper presents such a system for the automated programming of CNC machine tools.Design/methodology/approach: The system is based on previously incorporated know-how and the rules for implementing it in the tool shop. The existing manufacturing knowledge of industrial tool production was collected and analysed, and on this basis a flow chart of all activities was made. The theoretical contribution lies in the systematisation of technological knowledge, which is now accessible to all workers in NC preparation units.Findings: Utilisation of technological knowledge. On the basis of the recognised properties, algorithms were worked out with which the manufacturing process, the tool and the optimum parameters are indirectly determined, the target function being the generation of the NC programme. We can point out that with the informational convergence of CAM and CAPP, the barriers between them, strict so far, disappear.Research limitations/implications: So far, the system is limited to milling, drilling and similar operations. It could be extended to other machining operations (turning, grinding, wire cutting, etc. with the same procedure. In the future, some methods of artificial intelligence could be used.Practical implications: The system is suitable for the industrial production of tools, dies and moulds, and was proved in a real tool shop (production of tools for casting. It reduces the preparation time of NC programs and can be used with any commercially available CAD/CAM/NC programming system. Human errors are avoided or reduced. It is important for engineers in the CAD/CAM field and in tool shops.Originality/value: The developed system is original and was not found in the literature or in the

  17. Modeling the Energy Use of a Connected and Automated Transportation System (Poster)

    Energy Technology Data Exchange (ETDEWEB)

    Gonder, J.; Brown, A.

    2014-07-01

    Early research points to large potential impacts of connected and automated vehicles (CAVs) on transportation energy use - dramatic savings, increased use, or anything in between. Due to a lack of suitable data and integrated modeling tools to explore these complex future systems, analyses to date have relied on simple combinations of isolated effects. This poster proposes a framework for modeling the potential energy implications from increasing penetration of CAV technologies and for assessing technology and policy options to steer them toward favorable energy outcomes. Current CAV modeling challenges include estimating behavior change, understanding potential vehicle-to-vehicle interactions, and assessing traffic flow and vehicle use under different automation scenarios. To bridge these gaps and develop a picture of potential future automated systems, NREL is integrating existing modeling capabilities with additional tools and data inputs to create a more fully integrated CAV assessment toolkit.

  18. Automated evolutionary restructuring of workflows to minimise errors via stochastic model checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents a framework for the automated restructuring of workflows that allows one to minimise the impact of errors on a production workflow. The framework allows for the modelling of workflows by means of a formalised subset of the Business Process Modelling and Notation (BPMN) languag...... of the production workflows and the expression of the goals require manual input....

  19. A Binary Programming Approach to Automated Test Assembly for Cognitive Diagnosis Models

    Science.gov (United States)

    Finkelman, Matthew D.; Kim, Wonsuk; Roussos, Louis; Verschoor, Angela

    2010-01-01

    Automated test assembly (ATA) has been an area of prolific psychometric research. Although ATA methodology is well developed for unidimensional models, its application alongside cognitive diagnosis models (CDMs) is a burgeoning topic. Two suggested procedures for combining ATA and CDMs are to maximize the cognitive diagnostic index and to use a…

  20. Data for Environmental Modeling (D4EM): Background and Applications of Data Automation

    Science.gov (United States)

    The Data for Environmental Modeling (D4EM) project demonstrates the development of a comprehensive set of open source software tools that overcome obstacles to accessing data needed by automating the process of populating model input data sets with environmental data available fr...

  1. Parameters Investigation of Mathematical Model of Productivity for Automated Line with Availability by DMAIC Methodology

    Directory of Open Access Journals (Sweden)

    Tan Chan Sin

    2014-01-01

    Full Text Available Automated lines are widely applied in industry, especially for mass production with little product variety. Productivity is one of the most important criteria for an automated line, as for industry generally, since it directly reflects outputs and profits. Industry must forecast productivity accurately in order to meet customer demand, and the forecast is calculated using a mathematical model. A mathematical model of productivity with availability for automated lines has been introduced to express productivity in terms of a single level of reliability for stations and mechanisms. Since this model cannot achieve productivity close enough to the actual value, owing to loss parameters it does not consider, it must be enhanced to include those parameters. This paper presents the productivity-loss parameters investigated using the DMAIC (Define, Measure, Analyze, Improve, and Control) concept and the PACE Prioritization Matrix (Priority, Action, Consider, and Eliminate). The investigated parameters are important for the further improvement of the mathematical model of productivity with availability, towards a robust productivity model for automated lines.
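
    As a rough illustration of a productivity-with-availability calculation (not the enhanced model the paper develops), one common simplification treats line productivity as the ideal cycle rate scaled by the product of per-station availabilities, each computed from assumed MTBF/MTTR figures:

```python
def line_productivity(cycle_time, stations_mtbf_mttr):
    """Illustrative productivity-with-availability estimate for a serial line.

    Ideal cycle rate (1 / cycle_time, parts per time unit) scaled by line
    availability, taken here as the product of per-station
    MTBF / (MTBF + MTTR) terms (an independent-downtime assumption). This is
    a generic textbook-style sketch, not the paper's model; all figures
    passed in are hypothetical.
    """
    availability = 1.0
    for mtbf, mttr in stations_mtbf_mttr:
        availability *= mtbf / (mtbf + mttr)
    return availability / cycle_time
```

    For example, a 0.5-minute cycle with two stations of availability 100/105 and 200/210 yields roughly 1.81 parts per minute instead of the ideal 2.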

  2. Conceptual model of an automated information system of marketing at the enterprise

    Directory of Open Access Journals (Sweden)

    D.V. Raiko

    2014-09-01

    Full Text Available The aim of the article. The purpose of this paper is to create a conceptual model of an automated marketing information system that has both theoretical and practical value. The results of the analysis. The main advantage of this model is a comprehensive disclosure of the relationships between concepts such as automated information technology, marketing information system and automated information system, which together solve the problem of processing large volumes of data in a short period of time, provide continuous communication with partners and customers, and make it possible to react quickly to market changes; this in turn contributes to competitiveness in domestic and foreign markets. The scientific novelty of this model is, firstly, the assertion that an information system based on automated information technology constitutes an automated information system. Secondly, the marketing information system is an integral part of the information system, whose structural elements are responsible for transforming data from internal and external sources into the information needed by managers and specialists of marketing services. Thirdly, the most important component ensuring the functioning of the marketing information system and the information system is automated information technology. Because these systems involve human resources, work within them is organized with the help of workstations. Conclusions and directions of further research. It was determined that this conceptual model provides multi-variant calculations for rational decision-making, including real-time organization of complex accounting and economic analysis, and ensures the reliability and efficiency of the information obtained and used in management. The results of testing this model on the example of several industries confirm its practical significance.

  3. Automated Generation of Digital Terrain Model using Point Clouds of Digital Surface Model in Forest Area

    Directory of Open Access Journals (Sweden)

    Yoshikazu Kamiya

    2011-04-01

    Full Text Available At present, most digital data acquisition methods generate a Digital Surface Model (DSM) and not a Digital Elevation Model (DEM). Conversion from DSM to DEM still has some drawbacks, especially the removal of off-terrain point clouds and the subsequent generation of the DEM within these spaces, even when the methods are automated. In this paper we attempt to overcome this issue by projecting off-terrain point clouds onto the terrain in forest areas using Artificial Neural Networks (ANN), instead of removing them and then filling the gaps by interpolation. Five sites were tested and their accuracies assessed; all give almost the same results. In conclusion, the ANN is able to obtain the DEM by projecting the DSM point clouds, and greater DEM accuracies were obtained. If the hollow areas resulting from the removal of DSM point clouds are larger, the accuracies are reduced.

  4. Automating Routine Tasks in AmI Systems by Using Models at Runtime

    Science.gov (United States)

    Serral, Estefanía; Valderas, Pedro; Pelechano, Vicente

    One of the most important challenges to be confronted in Ambient Intelligence (AmI) systems is to automate routine tasks on behalf of users. In this work, we confront this challenge by presenting a novel approach based on models at runtime. This approach proposes a context-adaptive task model that allows routine tasks to be specified in a way that users can understand, facilitating their participation in the specification. These tasks are described according to context, which is specified in an ontology-based context model. Both the context model and the task model are also used at runtime. The approach provides a software infrastructure capable of automating the routine tasks as specified in these models by interpreting them at runtime.

  5. Abstract Platform and Transformations for Model-Driven Service-Oriented Development

    NARCIS (Netherlands)

    Andrade Almeida, J.P.; Ferreira Pires, L.; Sinderen, van M.J.

    2006-01-01

    In this paper, we discuss the use of abstract platforms and transformation for designing applications according to the principles of the service-oriented architecture. We illustrate our approach by discussing the use of the service discovery pattern at a platform-independent design level. We show ho

  6. Tensor contraction engine: Abstraction and automated parallel implementation of configuration-interaction, coupled-cluster, and many-body perturbation theories

    International Nuclear Information System (INIS)

    We develop a symbolic manipulation program and program generator (Tensor Contraction Engine or TCE) that automatically derives the working equations of a well-defined model of second-quantized many-electron theories and synthesizes efficient parallel computer programs on the basis of these equations. Provided an ansatz of a many-electron theory model, TCE performs valid contractions of creation and annihilation operators according to Wick's theorem, consolidates identical terms, and reduces the expressions into the form of multiple tensor contractions acted by permutation operators. Subsequently, it determines the binary contraction order for each multiple tensor contraction with the minimal operation and memory cost, factorizes common binary contractions (defines intermediate tensors), and identifies reusable intermediates. The resulting ordered list of binary tensor contractions, additions, and index permutations is translated into an optimized program that is combined with the NWChem and UTChem computational chemistry software packages. The programs synthesized by TCE take advantage of spin symmetry, Abelian point-group symmetry, and index permutation symmetry at every stage of calculations to minimize the number of arithmetic operations and storage requirement, adjust the peak local memory usage by index range tiling, and support parallel I/O interfaces and dynamic load balancing for parallel executions. We demonstrate the utility of TCE through automatic derivation and implementation of parallel programs for various models of configuration-interaction theory (CISD, CISDT, CISDTQ), many-body perturbation theory[MBPT(2), MBPT(3), MBPT(4)], and coupled-cluster theory (LCCD, CCD, LCCSD, CCSD, QCISD, CCSDT, and CCSDTQ)
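
    The contraction-ordering step can be illustrated with a toy search. The sketch below represents each tensor by the set of its index labels, assumes every index has the same range `dim`, and exhaustively finds the cheapest binary contraction order; real TCE additionally factorizes common intermediates and exploits the symmetries described above.

```python
from itertools import combinations

def contraction_cost(a, b, dim):
    """FLOP proxy: product of the ranges of all indices involved."""
    return dim ** len(a | b)

def best_order(tensors, dim=10):
    """Exhaustively find the cheapest binary contraction order.

    Each tensor is a frozenset of index labels; contracting two tensors
    yields a tensor over their symmetric difference (shared indices are
    summed away), and every index is assumed to have the same range `dim`.
    A toy version of TCE's ordering step, not its actual implementation.
    """
    best = (float("inf"), [])

    def search(remaining, cost, steps):
        nonlocal best
        if cost >= best[0]:
            return  # prune: already worse than the best known order
        if len(remaining) == 1:
            best = (cost, steps)
            return
        for i, j in combinations(range(len(remaining)), 2):
            merged = remaining[i] ^ remaining[j]
            rest = [t for k, t in enumerate(remaining) if k not in (i, j)]
            search(rest + [merged],
                   cost + contraction_cost(remaining[i], remaining[j], dim),
                   steps + [(remaining[i], remaining[j])])

    search(list(tensors), 0, [])
    return best
```

    For the chain ab·bc·cd the search prefers the two `dim`³ contractions through the shared indices over the `dim`⁴ outer product.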

  7. Automated Test Assembly for Cognitive Diagnosis Models Using a Genetic Algorithm

    Science.gov (United States)

    Finkelman, Matthew; Kim, Wonsuk; Roussos, Louis A.

    2009-01-01

    Much recent psychometric literature has focused on cognitive diagnosis models (CDMs), a promising class of instruments used to measure the strengths and weaknesses of examinees. This article introduces a genetic algorithm to perform automated test assembly alongside CDMs. The algorithm is flexible in that it can be applied whether the goal is to…

  8. Automated Discovery and Modeling of Sequential Patterns Preceding Events of Interest

    Science.gov (United States)

    Rohloff, Kurt

    2010-01-01

    The integration of emerging data manipulation technologies has enabled a paradigm shift in practitioners' abilities to understand and anticipate events of interest in complex systems. Example events of interest include outbreaks of socio-political violence in nation-states. Rather than relying on human-centric modeling efforts that are limited by the availability of SMEs, automated data processing technologies have enabled the development of innovative automated complex-system modeling and predictive analysis technologies. We introduce one such emerging modeling technology - the sequential pattern methodology. We have applied the sequential pattern methodology to automatically identify patterns of observed behavior that precede outbreaks of socio-political violence such as riots, rebellions and coups in nation-states. The sequential pattern methodology is a groundbreaking approach to automated complex-system model discovery because it generates easily interpretable patterns based on direct observations of sampled factor data, yielding a deeper understanding of societal behaviors that is tolerant of observation noise and missing data. The discovered patterns are simple to interpret and mimic humans' identification of observed trends in temporal data. Discovered patterns also provide an automated forecasting ability: we discuss an example of using discovered patterns coupled with a rich data environment to forecast various types of socio-political violence in nation-states.
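
    A drastically simplified rendition of the idea: for each occurrence of an event of interest, count the ordered patterns of factor observations seen in a window before it. Everything here (window size, pattern length, symbols) is an illustrative assumption, not the methodology's actual algorithm.

```python
from collections import Counter
from itertools import combinations

def patterns_before_events(sequence, event, window=3, length=2):
    """Count ordered factor patterns observed before each event occurrence.

    For every occurrence of `event` in `sequence`, every ordered subsequence
    of `length` symbols drawn from the preceding `window` observations is
    counted. Frequently counted patterns are candidate precursors. A toy
    sketch of sequential-pattern discovery, not the paper's method.
    """
    counts = Counter()
    for t, obs in enumerate(sequence):
        if obs == event:
            pre = sequence[max(0, t - window):t]
            # combinations() preserves order, so each tuple is an ordered pattern
            for combo in combinations(pre, length):
                counts[combo] += 1
    return counts
```

    Running it on a symbol stream where `"X"` marks the event surfaces which ordered pairs of observations recur before `"X"`.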

  9. Spatial coincidence modelling, automated database updating and data consistency in vector GIS.

    NARCIS (Netherlands)

    Kufoniyi, O.

    1995-01-01

    This thesis presents formal approaches for automated database updating and consistency control in vector- structured spatial databases. To serve as a framework, a conceptual data model is formalized for the representation of geo-data from multiple map layers in which a map layer denotes a set of ter

  10. Modeling Multiple Human-Automation Distributed Systems using Network-form Games

    Science.gov (United States)

    Brat, Guillaume

    2012-01-01

    The paper describes at a high level the network-form game framework (based on Bayes nets and game theory), which can be used to model and analyze safety issues in large, distributed, mixed human-automation systems such as NextGen.

  11. Development and Evaluation of a Model for Modular Automation in Plant Manufacturing

    Directory of Open Access Journals (Sweden)

    Uwe Katzke

    2005-08-01

    Full Text Available The benefit of modular concepts in plant automation is seen as ambivalent: on the one hand modularity offers advantages, while on the other it imposes requirements on the system structure as well as on the discipline of the designer. The main reasons to use modularity in systems design for automation applications in industry are reusability and the reduction of complexity, but up to now modular concepts have been rare in plant automation. This paper analyses the reasons, and proposes measures and solution concepts. An analysis of the workflow and the working results of several companies in different branches shows differing proposals for modularity. These different proposals from production and process engineering are integrated into one model and represent different perspectives of an integrated system.

  12. Using Automated On-Site Monitoring to Calibrate Empirical Models of Trihalomethanes Concentrations in Drinking Water

    OpenAIRE

    Thomas E. Watts III; Robyn A. Snow; Brown, Aaron W.; J. C. York; Greg Fantom; Paul S. Simone Jr.; Gary L. Emmert

    2015-01-01

    An automated, on-site trihalomethanes concentration data set from a conventional water treatment plant was used to optimize powdered activated carbon and pre-chlorination doses. The trihalomethanes concentration data set was used with commonly monitored water quality parameters to improve an empirical model of trihalomethanes formation. A calibrated model was used to predict trihalomethanes concentrations the following year. The agreement between the models and measurements was evaluated. The...

  13. Automated Bayesian model development for frequency detection in biological time series

    Directory of Open Access Journals (Sweden)

    Oldroyd Giles ED

    2011-06-01

    Full Text Available Abstract Background A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. Results In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlights the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Conclusions Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and
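
    The core of Bayesian Spectrum Analysis for a single sinusoid can be sketched via the Schuster periodogram, to which the log-posterior over frequency is proportional under a flat prior and Gaussian noise; the paper's method goes further (background trends, noise-level marginalisation, direct model comparison). A minimal sketch:

```python
import math

def periodogram_posterior(y, freqs, dt=1.0):
    """Unnormalised log-posterior over candidate frequencies (single sinusoid).

    Under a flat frequency prior and Gaussian noise of known variance, the
    log-posterior for a single stationary sinusoid is proportional to the
    Schuster periodogram C(f) computed here (Bretthorst's Bayesian Spectrum
    Analysis). The paper's approach additionally handles background trends
    and marginalises the noise level; this sketch does not.
    """
    n = len(y)
    mean = sum(y) / n
    post = {}
    for f in freqs:
        re = sum((v - mean) * math.cos(2.0 * math.pi * f * k * dt)
                 for k, v in enumerate(y))
        im = sum((v - mean) * math.sin(2.0 * math.pi * f * k * dt)
                 for k, v in enumerate(y))
        post[f] = (re * re + im * im) / n  # periodogram, ∝ log-posterior
    return post
```

    Evaluating it on a clean sinusoid of known frequency, the maximum of the posterior lands on that frequency; on short, noisy series the full Bayesian treatment is what keeps the inference honest.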

  14. An automated construction of error models for uncertainty quantification and model calibration

    Science.gov (United States)

    Josset, L.; Lunati, I.

    2015-12-01

    To reduce the computational cost of stochastic predictions, it is common practice to rely on approximate flow solvers (or «proxy»), which provide an inexact, but computationally inexpensive response [1,2]. Error models can be constructed to correct the proxy response: based on a learning set of realizations for which both exact and proxy simulations are performed, a transformation is sought to map proxy into exact responses. Once the error model is constructed, a prediction of the exact response is obtained at the cost of a proxy simulation for any new realization. Despite its effectiveness [2,3], the methodology relies on several user-defined parameters, which impact the accuracy of the predictions. To achieve a fully automated construction, we propose a novel methodology based on an iterative scheme: we first initialize the error model with a small training set of realizations; then, at each iteration, we add a new realization both to improve the model and to evaluate its performance. More specifically, at each iteration we use the responses predicted by the updated model to identify the realizations that need to be considered to compute the quantity of interest. Another user-defined parameter is the number of dimensions of the response spaces between which the mapping is sought. To identify the space dimensions that optimally balance mapping accuracy and risk of overfitting, we follow a Leave-One-Out Cross Validation procedure. Also, the definition of a stopping criterion is central to an automated construction. We use a stability measure based on bootstrap techniques to stop the iterative procedure when the error model has converged. The methodology is illustrated with two test cases in which an inverse problem has to be solved, and the performance of the method is assessed. We show that an iterative scheme is crucial to increase the applicability of the approach. [1] Josset, L., and I. Lunati, Local and global error models for improving uncertainty quantification, Math
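
    A schematic version of the iterative construction, with the proxy-to-exact map reduced to a one-dimensional linear fit and the bootstrap stability criterion replaced by a simple slope-change tolerance (both simplifying assumptions, not the paper's choices):

```python
def fit_linear(xs, ys):
    """Closed-form least squares for ys ≈ a*xs + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    return a, my - a * mx

def iterative_error_model(proxy, exact, realizations, init=3, tol=1e-3):
    """Grow a proxy-to-exact error model until it stabilises (sketch).

    Starting from `init` realizations, one realization is added per
    iteration, the linear proxy-to-exact map is refitted, and iteration
    stops once the slope changes by less than `tol`. The 1-D linear map and
    the slope-change stopping rule are stand-ins for the paper's error
    model and bootstrap stability criterion.
    """
    xs = [proxy(r) for r in realizations[:init]]
    ys = [exact(r) for r in realizations[:init]]
    a, b = fit_linear(xs, ys)
    for r in realizations[init:]:
        xs.append(proxy(r))
        ys.append(exact(r))
        new_a, new_b = fit_linear(xs, ys)
        converged = abs(new_a - a) < tol
        a, b = new_a, new_b
        if converged:
            break
    return a, b
```

    Once the loop stops, the fitted map corrects any new proxy response at the cost of one cheap simulation.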

  15. Learning with Technology: Video Modeling with Concrete-Representational-Abstract Sequencing for Students with Autism Spectrum Disorder

    Science.gov (United States)

    Yakubova, Gulnoza; Hughes, Elizabeth M.; Shinaberry, Megan

    2016-01-01

    The purpose of this study was to determine the effectiveness of a video modeling intervention with concrete-representational-abstract instructional sequence in teaching mathematics concepts to students with autism spectrum disorder (ASD). A multiple baseline across skills design of single-case experimental methodology was used to determine the…

  16. Abstract algebra

    CERN Document Server

    Garrett, Paul B

    2007-01-01

    Designed for an advanced undergraduate- or graduate-level course, Abstract Algebra provides an example-oriented, less heavily symbolic approach to abstract algebra. The text emphasizes specifics such as basic number theory, polynomials, finite fields, as well as linear and multilinear algebra. This classroom-tested, how-to manual takes a more narrative approach than the stiff formalism of many other textbooks, presenting coherent storylines to convey crucial ideas in a student-friendly, accessible manner. An unusual feature of the text is the systematic characterization of objects by universal

  17. Automating Measurement for Software Process Models using Attribute Grammar Rules

    Directory of Open Access Journals (Sweden)

    Abdul Azim Abd. Ghani

    2007-08-01

    Full Text Available The modelling concept is well accepted in the software engineering discipline. Some software models are built either to control the development stages, to measure program quality, or to serve as a medium that gives a better understanding of the actual software system. Software process modelling has nowadays reached a level that allows software designs to be transformed into programming languages, such as architecture design languages and the Unified Modelling Language. This paper describes the adaptation of an attribute grammar approach to measuring software process models. A tool, called the Software Process Measurement Application, was developed to enable measurement according to specified attribute grammar rules. A context-free grammar for reading the process model is derived from the IDEF3 standard, and rules were attached to enable the calculation of measurement metrics. The metric values collected were used to aid in determining the decomposition and structuring of processes for the proposed software systems.
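
    The attribute-grammar idea of attaching measurement rules to a grammar can be caricatured as a synthesized attribute computed over a parse tree. The node labels and rule table below are hypothetical, loosely IDEF3-flavoured, not the tool's actual grammar:

```python
def count_metric(node, rule):
    """Metric as a synthesized attribute over a process-model tree.

    `node` is a (label, children) pair and `rule` maps a node label to its
    local metric contribution; a node's metric value is its own contribution
    plus the sum over its children, i.e. a synthesized attribute in
    attribute-grammar terms. Labels here are hypothetical IDEF3-style
    constructs.
    """
    label, children = node
    return rule.get(label, 0) + sum(count_metric(c, rule) for c in children)
```

    With `rule = {"task": 1}`, the metric counts tasks anywhere in the process tree; swapping the rule table changes the metric without touching the traversal.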

  18. An Intuitive Automated Modelling Interface for Systems Biology

    Directory of Open Access Journals (Sweden)

    Ozan Kahramanoğulları

    2009-11-01

    Full Text Available We introduce a natural language interface for building stochastic pi calculus models of biological systems. In this language, complex constructs describing biochemical events are built from basic primitives of association, dissociation and transformation. The language thus allows us to model biochemical systems modularly by describing their dynamics in a narrative-style language, while making amendments, refinements and extensions to the models easy. We demonstrate the language on a model of Fc-gamma receptor phosphorylation during phagocytosis. We provide a tool implementation of the translation into a stochastic pi calculus language, Microsoft Research's SPiM.
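
    The stochastic semantics given to the association/dissociation/transformation primitives can be imitated with a plain Gillespie simulation over mass-action rules. This is an independent toy SSA, not SPiM's implementation; the species names and rates are hypothetical:

```python
import random

def gillespie(state, rules, t_end, seed=0):
    """Toy Gillespie SSA over association/dissociation/transformation rules.

    Each rule is (reactants, products, rate); a rule's propensity is its
    rate times the product of its reactant counts (mass action). This
    imitates the stochastic semantics a pi-calculus tool such as SPiM gives
    the three primitives, but is an independent sketch, not SPiM's
    algorithm.
    """
    rng = random.Random(seed)
    t = 0.0
    while True:
        # Propensity of each rule under the current state.
        props = []
        for reactants, _, rate in rules:
            a = rate
            for sp in reactants:
                a *= state.get(sp, 0)
            props.append(a)
        total = sum(props)
        if total == 0.0:
            break  # no rule can fire
        t += rng.expovariate(total)
        if t >= t_end:
            break
        # Pick a rule proportionally to its propensity and fire it.
        r = rng.uniform(0.0, total)
        acc = 0.0
        for (reactants, products, _), a in zip(rules, props):
            acc += a
            if a > 0 and r <= acc:
                for sp in reactants:
                    state[sp] -= 1
                for sp in products:
                    state[sp] = state.get(sp, 0) + 1
                break
    return state
```

    For instance, a single association rule A + B → AB drives the two free species into the complex while conserving total molecule count.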

  19. BALWOIS: Abstracts

    International Nuclear Information System (INIS)

    anthropogenic pressures and international shared water. Here are the 320 abstracts proposed by authors and accepted by the Scientific Committee. More than 200 papers are presented during the Conference on 8 topics related to hydrology, climatology and hydrobiology: - Climate and Environment; - Hydrological regimes and water balances; - Droughts and Floods; - Integrated Water Resources Management; - Water bodies Protection and Ecohydrology; - Lakes; - Information Systems for decision support; - Hydrological modelling. Papers relevant to INIS are indexed separately.

  20. Using Automated On-Site Monitoring to Calibrate Empirical Models of Trihalomethanes Concentrations in Drinking Water

    Directory of Open Access Journals (Sweden)

    Thomas E. Watts III

    2015-10-01

    Full Text Available An automated, on-site trihalomethanes concentration data set from a conventional water treatment plant was used to optimize powdered activated carbon and pre-chlorination doses. The trihalomethanes concentration data set was used together with commonly monitored water quality parameters to improve an empirical model of trihalomethanes formation. The calibrated model was used to predict trihalomethanes concentrations the following year, and the agreement between the models and measurements was evaluated. The original model predicted trihalomethanes concentrations to within ~10 μg·L−1 of the measurements; calibration improved the predictions by a factor of three to five relative to the literature model.
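
    In the simplest possible caricature of such a calibration (the study fits a fuller empirical model against co-monitored water-quality parameters), a site-specific multiplicative factor can be estimated from paired predictions and measurements:

```python
def recalibrate(model_pred, measured):
    """One-factor site calibration of an empirical THM model (sketch).

    Reduces calibration to a single multiplicative factor: the mean ratio of
    measured to model-predicted TTHM over paired observations. The study
    fits a richer empirical model; all values here are hypothetical.
    """
    factor = sum(m / p for m, p in zip(measured, model_pred)) / len(measured)
    return lambda prediction: factor * prediction
```

    The returned callable then rescales any future literature-model prediction to the site's observed levels.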

  1. Restructuring of workflows to minimise errors via stochastic model checking: An automated evolutionary approach

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee

    2016-01-01

    This article presents a framework for the automated restructuring of stochastic workflows to reduce the impact of faults. The framework allows for the modelling of workflows by means of a formalised subset of the BPMN workflow language. We extend this modelling formalism to describe faults...... to model resources associated with a workflow. The approach is fully automated and only the modelling of the production workflows, potential faults and the expression of the goals require manual input. We present the design of a software tool implementing this framework and explore the practical utility...... of this approach through an industrial case study in which the risk of production failures and their impact are reduced by restructuring the workflow....
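
    The outer evolutionary loop of such a framework might look as follows; here `fitness` stands in for the stochastic-model-checking score of a candidate workflow (lower is better) and `mutate` for a restructuring operator. Both, and all parameter values, are illustrative assumptions:

```python
import random

def evolve(workflow, fitness, mutate, generations=50, pop=8, seed=0):
    """Elitist evolutionary loop for workflow restructuring (sketch).

    `workflow` is any candidate structure, `fitness` scores it (lower is
    better; in the paper this role is played by stochastic model checking of
    the BPMN model), and `mutate` proposes a restructured variant.
    """
    rng = random.Random(seed)
    population = [workflow] + [mutate(workflow, rng) for _ in range(pop - 1)]
    for _ in range(generations):
        population.sort(key=fitness)
        parents = population[: pop // 2]  # keep the best half (elitism)
        children = [mutate(rng.choice(parents), rng)
                    for _ in range(pop - len(parents))]
        population = parents + children
    return min(population, key=fitness)
```

    With a toy workflow encoded as a tuple of per-step fault weights and a mutation that reduces one weight, the loop steadily drives the total fault impact down.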

  2. Man power/cost estimation model: Automated planetary projects

    Science.gov (United States)

    Kitchen, L. D.

    1975-01-01

    A manpower/cost estimation model is developed, based on a detailed financial analysis of over 30 million raw data points which are then compacted by more than three orders of magnitude to the level at which the model is applicable. The major parameter of expenditure is manpower (specifically, direct labor hours) for all spacecraft subsystem and technical support categories. The resultant model provides a mean absolute error of less than fifteen percent for the eight programs comprising the model data base. The model includes cost-saving inheritance factors, broken down into four levels, for estimating follow-on programs where hardware and design inheritance are evident or expected.
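
    The inheritance-factor idea can be illustrated with a one-line estimator that discounts each subsystem's fresh-design labor hours by an assumed inheritance fraction (the model's four levels are collapsed into a single fraction here; all names and figures are hypothetical):

```python
def estimate_hours(subsystem_hours, inheritance):
    """Apply per-subsystem inheritance factors to a labor-hour estimate.

    `subsystem_hours` maps subsystem -> direct labor hours for a fresh
    design; `inheritance` maps subsystem -> fraction of effort saved by
    reusing prior hardware/design. A sketch of the idea only: the actual
    model distinguishes four inheritance levels, collapsed here to one
    fraction, and all figures are hypothetical.
    """
    return sum(h * (1.0 - inheritance.get(s, 0.0))
               for s, h in subsystem_hours.items())
```

    A subsystem with 40% inheritance thus contributes only 60% of its fresh-design hours to the program estimate.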

  3. PDB_REDO: automated re-refinement of X-ray structure models in the PDB

    OpenAIRE

    Joosten, R.P.; Salzemann, J.; Bloch, V.; Stockinger, H.; Berglund, A; Blanchet, C.; Bongcam-Rudloff, E.; Combet, C.; Da Costa, A.L.; Deleage, G.; Diarena, M.; Fabbretti, R.; Fettahi, G.; Flegel, V.; Gisel, A.

    2009-01-01

    Structural biology, homology modelling and rational drug design require accurate three-dimensional macromolecular coordinates. However, the coordinates in the Protein Data Bank (PDB) have not all been obtained using the latest experimental and computational methods. In this study a method is presented for automated re-refinement of existing structure models in the PDB. A large-scale benchmark with 16 807 PDB entries showed that they can be improved in terms of fit to the deposited experimenta...

  4. Automated longitudinal monitoring of in vivo protein aggregation in neurodegenerative disease C. elegans models

    OpenAIRE

    Cornaglia, Matteo; Krishnamani, Gopalan; Mouchiroud, Laurent; Sorrentino, Vincenzo; Lehnert, Thomas; Auwerx, Johan; Gijs, Martin A. M.

    2016-01-01

    Background While many biological studies can be performed on cell-based systems, the investigation of molecular pathways related to complex human dysfunctions – e.g. neurodegenerative diseases – often requires long-term studies in animal models. The nematode Caenorhabditis elegans represents one of the best model organisms for many of these tests and, therefore, versatile and automated systems for accurate time-resolved analyses on C. elegans are becoming highly desirable tools in the field. ...

  5. Software Test Case Automated Generation Algorithm with Extended EDPN Model

    Directory of Open Access Journals (Sweden)

    Jinlong Tao

    2013-08-01

    To improve the sufficiency of software testing and the performance of testing algorithms, an improved event-driven Petri net model using a combination method is proposed, abbreviated as the OEDPN model. It is then applied to the OATS method to extend its implementation. On the basis of the OEDPN model, a marked associative recursive method of state combination by category is presented to resolve combination conflicts, as well as the test case explosion caused by redundant test cases and the difficulty of extending the OATS method. Generation methods for the interactive test cases of the extended OATS are also presented.

  6. Automation of Safety Analysis with SysML Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This project was a small proof-of-concept case study, generating SysML model information as a side effect of safety analysis. A prototype FMEA Assistant was...

  7. TRAFFIC FLOW MODEL BASED ON CELLULAR AUTOMATION WITH ADAPTIVE DECELERATION

    OpenAIRE

    Shinkarev, A. A.

    2016-01-01

    This paper describes a continuation of the authors' work in the field of traffic flow mathematical models based on cellular automata theory. The refactored representation of the multifactorial traffic flow model based on cellular automata theory is used to implement an adaptive deceleration step. The adaptive deceleration step, in the case of leader deceleration, allows slowing down smoothly rather than instantly. Concepts of the number of time steps without confli...
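    A minimal single-lane cellular-automaton sketch in the spirit of such models (classic Nagel-Schreckenberg rules on a ring road). The paper's adaptive-deceleration step, which additionally limits braking per time step, is approximated here by the simpler always-safe rule v ≤ gap:

```python
import random

L, VMAX, P = 100, 5, 0.2  # ring length, max speed, random-slowdown probability

def step(cells, vel, rng):
    """One synchronous update of a circular single-lane traffic CA.
    cells: occupied cell indices; vel: position -> velocity."""
    order = sorted(cells)
    new_cells, new_vel = [], {}
    for i, x in enumerate(order):
        nxt = order[(i + 1) % len(order)]
        gap = (nxt - x - 1) % L
        v = min(vel[x] + 1, VMAX)   # acceleration
        v = min(v, gap)             # deceleration: never reach the leader's cell
        if v > 0 and rng.random() < P:
            v -= 1                  # random slowdown
        new_x = (x + v) % L
        new_cells.append(new_x)
        new_vel[new_x] = v
    return new_cells, new_vel

rng = random.Random(1)
cells = sorted(rng.sample(range(L), 30))
vel = {x: 0 for x in cells}
for _ in range(100):
    cells, vel = step(cells, vel, rng)
```

    Because each car's new velocity never exceeds the gap to its leader's old position, the update is collision-free by construction.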

  8. An Abstract Model for Proving Safety of Multi-lane Traffic Manoeuvres

    DEFF Research Database (Denmark)

    Hilscher, Martin; Linker, Sven; Olderog, Ernst-Rüdiger;

    2011-01-01

    We present an approach to prove safety (collision freedom) of multi-lane motorway traffic with lane-change manoeuvres. This is ultimately a hybrid verification problem due to the continuous dynamics of the cars. We abstract from the dynamics by introducing a new spatial interval logic based...... on the view of each car. To guarantee safety, we present two variants of a lane-change controller, one with perfect knowledge of the safety envelopes of neighbouring cars and one which takes only the size of the neighbouring cars into account. Based on these controllers we provide a local safety proof...

  9. Automated biowaste sampling system urine subsystem operating model, part 1

    Science.gov (United States)

    Fogal, G. L.; Mangialardi, J. K.; Rosen, F.

    1973-01-01

    The urine subsystem automatically provides for the collection, volume sensing, and sampling of urine from six subjects during space flight. Verification of the subsystem design was a primary objective of the current effort, which was accomplished through the detailed design, fabrication, and verification testing of an operating model of the subsystem.

  10. A Voyage to Arcturus: A model for automated management of a WLCG Tier-2 facility

    International Nuclear Information System (INIS)

    With the current trend towards 'On Demand Computing' in big data environments it is crucial that the deployment of services and resources becomes increasingly automated. Deployment based on cloud platforms is available for large scale data centre environments but these solutions can be too complex and heavyweight for smaller, resource constrained WLCG Tier-2 sites. Along with a greater desire for bespoke monitoring and collection of Grid related metrics, a more lightweight and modular approach is desired. In this paper we present a model for a lightweight automated framework which can be used to build WLCG grid sites, based on 'off the shelf' software components. As part of the research into an automation framework the use of both IPMI and SNMP for physical device management will be included, as well as the use of SNMP as a monitoring/data sampling layer such that more comprehensive decision making can take place and potentially be automated. This could lead to reduced down times and better performance as services are recognised to be in a non-functional state by autonomous systems.

  11. A Voyage to Arcturus: A model for automated management of a WLCG Tier-2 facility

    Science.gov (United States)

    Roy, Gareth; Crooks, David; Mertens, Lena; Mitchell, Mark; Purdie, Stuart; Cadellin Skipsey, Samuel; Britton, David

    2014-06-01

    With the current trend towards "On Demand Computing" in big data environments it is crucial that the deployment of services and resources becomes increasingly automated. Deployment based on cloud platforms is available for large scale data centre environments but these solutions can be too complex and heavyweight for smaller, resource constrained WLCG Tier-2 sites. Along with a greater desire for bespoke monitoring and collection of Grid related metrics, a more lightweight and modular approach is desired. In this paper we present a model for a lightweight automated framework which can be used to build WLCG grid sites, based on "off the shelf" software components. As part of the research into an automation framework the use of both IPMI and SNMP for physical device management will be included, as well as the use of SNMP as a monitoring/data sampling layer such that more comprehensive decision making can take place and potentially be automated. This could lead to reduced down times and better performance as services are recognised to be in a non-functional state by autonomous systems.

  12. An Improvement in Thermal Modelling of Automated Tape Placement Process

    International Nuclear Information System (INIS)

    The thermoplastic tape placement process offers the possibility of manufacturing large laminated composite parts with all kinds of geometries (e.g. doubly curved). This process is based on the fusion bonding of a thermoplastic tape on a substrate. It has received growing interest in recent years because of its out-of-autoclave abilities. In order to control and optimize the quality of the manufactured part, we need to predict the temperature field throughout the processing of the laminate. In this work, we focus on a thermal modeling of this process which takes into account the imperfect bonding existing between the different layers of the substrate by introducing thermal contact resistance into the model. This study draws on experimental results, which show that the value of the thermal resistance evolves with the temperature and pressure applied to the material.

  13. An Improvement in Thermal Modelling of Automated Tape Placement Process

    Science.gov (United States)

    Barasinski, Anaïs; Leygue, Adrien; Soccard, Eric; Poitou, Arnaud

    2011-01-01

    The thermoplastic tape placement process offers the possibility of manufacturing large laminated composite parts with all kinds of geometries (e.g. doubly curved). This process is based on the fusion bonding of a thermoplastic tape on a substrate. It has received growing interest in recent years because of its out-of-autoclave abilities. In order to control and optimize the quality of the manufactured part, we need to predict the temperature field throughout the processing of the laminate. In this work, we focus on a thermal modeling of this process which takes into account the imperfect bonding existing between the different layers of the substrate by introducing thermal contact resistance into the model. This study draws on experimental results, which show that the value of the thermal resistance evolves with the temperature and pressure applied to the material.
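    The interface model described (imperfect bonding represented as a thermal contact resistance) can be illustrated with a steady-state 1D composite-wall calculation; all material values below are illustrative placeholders, not the study's data:

```python
# Two bonded plies with an imperfect interface modelled as a contact resistance R_c.
k1, L1 = 0.72, 0.2e-3   # ply 1 conductivity [W/(m K)] and thickness [m] (assumed)
k2, L2 = 0.72, 0.2e-3   # ply 2 (assumed)
R_c = 1.0e-3            # contact resistance [m^2 K/W]; pressure/temperature dependent
T_hot, T_cold = 400.0, 300.0  # outer surface temperatures [K]

# Series thermal resistances: conduction through each ply plus the interface jump.
R_total = L1 / k1 + R_c + L2 / k2
q = (T_hot - T_cold) / R_total   # steady heat flux [W/m^2]
dT_interface = q * R_c           # temperature jump across the imperfect bond
```

    The larger the contact resistance (poorer bonding), the larger the temperature discontinuity at the ply interface, which is why the bond quality matters for the predicted temperature field.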

  14. ADGEN: a system for automated sensitivity analysis of predictive models

    International Nuclear Information System (INIS)

    A system that can automatically enhance computer codes with a sensitivity calculation capability is presented. With this new system, named ADGEN, rapid and cost-effective calculation of sensitivities can be performed in any FORTRAN code for all input data or parameters. The resulting sensitivities can be used in performance assessment studies related to licensing or interactions with the public to systematically and quantitatively prove the relative importance of each of the system parameters in calculating the final performance results. A general procedure calling for the systematic use of sensitivities in assessment studies is presented. The procedure can be used in modelling and model validation studies to avoid "over-modelling," in site characterization planning to avoid "over-collection of data," and in performance assessment to determine the uncertainties on the final calculated results. The added capability to formally perform the inverse problem, i.e., to determine the input data or parameters on which to focus additional research or analysis effort in order to improve the uncertainty of the final results, is also discussed.
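    ADGEN works by code-level differentiation of FORTRAN programs; as a rough stand-in for what such sensitivities mean, the derivative of a model output with respect to each input parameter can be estimated by central finite differences. The toy model below is hypothetical, not ADGEN's method:

```python
def sensitivities(f, params, rel=1e-6):
    """Central-difference sensitivity of f(params) to each parameter."""
    out = {}
    for name, value in params.items():
        h = rel * (abs(value) or 1.0)
        up = dict(params); up[name] = value + h
        dn = dict(params); dn[name] = value - h
        out[name] = (f(up) - f(dn)) / (2 * h)
    return out

# Toy performance model: which parameter dominates the final result?
model = lambda p: p["a"] ** 2 + 3 * p["b"]
s = sensitivities(model, {"a": 2.0, "b": 5.0})
```

    Ranking the entries of `s` identifies the parameters on which additional data collection or research effort would most improve the final result, which is the inverse-problem use described above.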

  15. Model-based metrics of human-automation function allocation in complex work environments

    Science.gov (United States)

    Kim, So Young

    Function allocation is the design decision which assigns work functions to all agents in a team, both human and automated. Efforts to guide function allocation systematically have been studied in many fields such as engineering, human factors, team and organization design, management science, and cognitive systems engineering. Each field focuses on certain aspects of function allocation, but not all; thus, an independent discussion of each does not address all necessary issues with function allocation. Four distinctive perspectives emerged from a review of these fields: technology-centered, human-centered, team-oriented, and work-oriented. Each perspective focuses on different aspects of function allocation: capabilities and characteristics of agents (automation or human), team structure and processes, and work structure and the work environment. Together, these perspectives identify the following eight issues with function allocation: 1) Workload, 2) Incoherency in function allocations, 3) Mismatches between responsibility and authority, 4) Interruptive automation, 5) Automation boundary conditions, 6) Function allocation preventing human adaptation to context, 7) Function allocation destabilizing the humans' work environment, and 8) Mission Performance. Addressing these issues systematically requires formal models and simulations that include all necessary aspects of human-automation function allocation: the work environment, the dynamics inherent to the work, agents, and relationships among them. Also, addressing these issues requires not only a (static) model, but also a (dynamic) simulation that captures temporal aspects of work such as the timing of actions and their impact on the agent's work.
Therefore, with properly modeled work as described by the work environment, the dynamics inherent to the work, agents, and relationships among them, a modeling framework developed by this thesis, which includes static work models and dynamic simulation, can capture the

  16. Automated soil resources mapping based on decision tree and Bayesian predictive modeling

    Institute of Scientific and Technical Information of China (English)

    周斌; 张新刚; 王人潮

    2004-01-01

    This article presents two approaches for automated building of knowledge bases for soil resources mapping. These methods used decision tree and Bayesian predictive modeling, respectively, to generate knowledge from training data. With these methods, building a knowledge base for automated soil mapping is easier than using the conventional knowledge acquisition approach. The knowledge bases built by these two methods were used by the knowledge classifier for soil type classification of the Longyou area, Zhejiang Province, China using TM bi-temporal imageries and GIS data. To evaluate the performance of the resultant knowledge bases, the classification results were compared to an existing soil map based on field survey. The accuracy assessment and analysis of the resultant soil maps suggested that the knowledge bases built by these two methods were of good quality for mapping the distribution of soil classes over the study area.

  17. Automated soil resources mapping based on decision tree and Bayesian predictive modeling

    Institute of Scientific and Technical Information of China (English)

    周斌; 张新刚; 王人潮

    2004-01-01

    This article presents two approaches for automated building of knowledge bases for soil resources mapping. These methods used decision tree and Bayesian predictive modeling, respectively, to generate knowledge from training data. With these methods, building a knowledge base for automated soil mapping is easier than using the conventional knowledge acquisition approach. The knowledge bases built by these two methods were used by the knowledge classifier for soil type classification of the Longyou area, Zhejiang Province, China using TM bi-temporal imageries and GIS data. To evaluate the performance of the resultant knowledge bases, the classification results were compared to an existing soil map based on field survey. The accuracy assessment and analysis of the resultant soil maps suggested that the knowledge bases built by these two methods were of good quality for mapping the distribution of soil classes over the study area.
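    The Bayesian predictive branch of such a knowledge base can be sketched as a tiny naive-Bayes classifier over discretised terrain/spectral features. The feature names, category values and soil classes below are invented for illustration:

```python
from collections import Counter, defaultdict

def train_nb(samples):
    """Build a naive-Bayes knowledge base from labelled training pixels.
    samples: list of (feature_dict, soil_class) with discretised features."""
    prior = Counter(lbl for _, lbl in samples)
    cond = defaultdict(Counter)  # (class, feature_name) -> value counts
    for feats, lbl in samples:
        for f, v in feats.items():
            cond[(lbl, f)][v] += 1
    return prior, cond, len(samples)

def classify(feats, prior, cond, n):
    best, best_p = None, -1.0
    for lbl, count in prior.items():
        p = count / n
        for f, v in feats.items():
            tot = sum(cond[(lbl, f)].values())
            # Laplace smoothing (assumes two possible values per feature).
            p *= (cond[(lbl, f)][v] + 1) / (tot + 2)
        if p > best_p:
            best, best_p = lbl, p
    return best

train = [({"slope": "steep", "band4": "high"}, "lithosol"),
         ({"slope": "steep", "band4": "high"}, "lithosol"),
         ({"slope": "flat", "band4": "low"}, "paddy"),
         ({"slope": "flat", "band4": "low"}, "paddy")]
prior, cond, n = train_nb(train)
```

    Classifying every raster cell with such posterior probabilities produces the predicted soil map that would then be compared against the field survey.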

  18. Analysis on Traffic Conflicts of Two-lane Highway Based on Improved Cellular Automation Model

    Directory of Open Access Journals (Sweden)

    Xiru Tang

    2013-06-01

    Based on the microscopic traffic characteristics of two-lane highways and the different driving characteristics of drivers, the characteristics of drivers and vehicle structure are introduced into a cellular automaton model to establish a new cellular automaton model of the two-lane highway. Through computer simulation, the paper analyzes the effect of different proportions of vehicles, drivers and arrival rates on traffic conflicts of two-lane highways, yielding the relationship between parameters such as road traffic flow, velocity variance and conflicts. The results indicate that the frequency of traffic conflicts is closely related to the product of traffic flow and velocity variance: when the traffic flow and velocity variance are great, the frequency of conflict is the greatest, and when they are small, the frequency of conflict is the least.

  19. Assessing model uncertainty using hexavalent chromium and lung cancer mortality as an example [Abstract 2015

    Science.gov (United States)

    Introduction: The National Research Council recommended quantitative evaluation of uncertainty in effect estimates for risk assessment. This analysis considers uncertainty across model forms and model parameterizations with hexavalent chromium [Cr(VI)] and lung cancer mortality a...

  20. Automated EEG monitoring in defining a chronic epilepsy model.

    Science.gov (United States)

    Mascott, C R; Gotman, J; Beaudet, A

    1994-01-01

    There has been a recent surge of interest in chronic animal models of epilepsy. Proper assessment of these models requires documentation of spontaneous seizures by EEG, observation, or both in each individual animal to confirm the presumed epileptic condition. We used the same automatic seizure detection system as that currently used for patients in our institution and many others. Electrodes were implanted in 43 rats before intraamygdalar administration of kainic acid (KA). Animals were monitored intermittently for 3 months. Nine of the rats were protected by anticonvulsants [pentobarbital (PB) and diazepam (DZP)] at the time of KA injection. Between 1 and 3 months after KA injection, spontaneous seizures were detected in 20 of the 34 unprotected animals (59%). Surprisingly, spontaneous seizures were also detected during the same period in 2 of the 9 protected animals that were intended to serve as nonepileptic controls. Although the absence of confirmed spontaneous seizures in the remaining animals cannot exclude their occurrence, it indicates that, if present, such seizures are at least rare. On the other hand, definitive proof of epilepsy is invaluable when interpreting pathologic data from experimental brains.
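    A crude stand-in for an automated seizure detector (not the clinical system used in the study) can be built on the line-length feature with a baseline-relative threshold; the synthetic signal below is invented for illustration:

```python
import math

def detect(signal, fs, win_s=1.0, thresh=3.0):
    """Flag windows whose line length exceeds `thresh` times the median
    window line length (a simple baseline estimate)."""
    w = int(fs * win_s)
    ll = [sum(abs(signal[i + 1] - signal[i]) for i in range(s, s + w - 1))
          for s in range(0, len(signal) - w + 1, w)]
    baseline = sorted(ll)[len(ll) // 2]
    return [i for i, v in enumerate(ll) if v > thresh * baseline]

fs = 100
t = [i / fs for i in range(fs * 10)]
# 10 s of low-amplitude background with a high-amplitude 4 Hz burst at 6-8 s.
sig = [0.1 * math.sin(2 * math.pi * 1.0 * x) +
       (2.0 * math.sin(2 * math.pi * 4.0 * x) if 6.0 <= x < 8.0 else 0.0)
       for x in t]
hits = detect(sig, fs)  # window indices flagged as seizure-like
```

    Real detectors combine several such features with patient- or animal-specific tuning, but the principle of flagging departures from a running baseline is the same.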

  1. The importance of information goods abstraction levels for information commerce process models

    NARCIS (Netherlands)

    Wijnhoven, Fons

    2002-01-01

    A process model, in the context of e-commerce, is an organized set of activities for the creation, (re-)production, trade and delivery of goods. Electronic commerce studies have created important process models for the trade of physical goods via Internet. These models are not easily suitable for th

  2. Automated optimal glycaemic control using a physiology based pharmacokinetic, pharmacodynamic model

    OpenAIRE

    Schaller, Stephan

    2015-01-01

    After decades of research, Automated Glucose Control (AGC) is still out of reach for everyday control of blood glucose. The inter- and intra-individual variability of glucose dynamics largely arising from variability in insulin absorption, distribution, and action, and related physiological lag-times remain a core problem in the development of suitable control algorithms. Over the years, model predictive control (MPC) has established itself as the gold standard in AGC systems in research. Mod...

  3. Composition of Petri nets models in service-oriented industrial automation

    OpenAIRE

    Mendes, João M.; Leitão, Paulo; Restivo, Francisco; Colombo, Armando W.

    2010-01-01

    In service-oriented systems, composition of services is required to build new, distributed and more complex services, based on the logic behavior of individual ones. This paper discusses the formal composition of Petri nets models used for the process description and control in service-oriented automation systems. The proposed approach considers two forms for the composition of services, notably the offline composition, applied during the design phase, and the online composition, related to t...

  4. Prototype of Automated PLC Model Checking Using Continuous Integration Tools

    CERN Document Server

    Lettrich, Michael

    2015-01-01

    To deal with the complexity of operating and supervising large scale industrial installations at CERN, Programmable Logic Controllers (PLCs) are often used. A failure in these control systems can cause a disaster in terms of economic losses, environmental damage or human losses. Therefore the requirements on software quality are very high. To provide PLC developers with a way to verify proper functionality against requirements, a Java tool named PLCverif has been developed which encapsulates and thus simplifies the use of third party model checkers. One of our goals in this project is to integrate PLCverif into the development process of PLC programs. When the developer changes the program, all the requirements should be verified again, as a change to the code can produce collateral effects and violate one or more requirements. For that reason, PLCverif has been extended to work with Jenkins CI in order to automatically trigger the verification cases when the developer changes the PLC program. This prototype has been...

  5. Beyond modeling abstractions: Learning nouns over developmental time in atypical populations and individuals

    Directory of Open Access Journals (Sweden)

    Clare Sims

    2013-11-01

    Connectionist models that capture developmental change over time have much to offer in the field of language development research. Several models in the literature have made good contact with developmental data, effectively captured behavioral tasks, and accurately represented linguistic input available to young children. However, fewer models of language development have truly captured the process of developmental change over time. In this review paper, we discuss several prominent connectionist models of early word learning, focusing on semantic development, as well as our recent work modeling the emergence of word learning biases in different populations. We also discuss the potential of these kinds of models to capture children’s language development at the individual level. We argue that a modeling approach that truly captures change over time has the potential to inform theory, guide research, and lead to innovations in early language intervention.

  6. Automated Finite Element Modeling of Wing Structures for Shape Optimization

    Science.gov (United States)

    Harvey, Michael Stephen

    1993-01-01

    The displacement formulation of the finite element method is the most general and most widely used technique for structural analysis of airplane configurations. Modern structural synthesis techniques based on the finite element method have reached a certain maturity in recent years, and large airplane structures can now be optimized with respect to sizing type design variables for many load cases subject to a rich variety of constraints including stress, buckling, frequency, stiffness and aeroelastic constraints (Refs. 1-3). These structural synthesis capabilities use gradient based nonlinear programming techniques to search for improved designs. For these techniques to be practical a major improvement was required in the computational cost of finite element analyses (needed repeatedly in the optimization process). Thus, associated with the progress in structural optimization, a new perspective of structural analysis has emerged, namely, structural analysis specialized for design optimization applications, or what is known as "design oriented structural analysis" (Ref. 4). This discipline includes approximation concepts and methods for obtaining behavior sensitivity information (Ref. 1), all needed to make the optimization of large structural systems (modeled by thousands of degrees of freedom and thousands of design variables) practical and cost effective.

  7. Modelling of series of types of automated trenchless works tunneling

    Science.gov (United States)

    Gendarz, P.; Rzasinski, R.

    2016-08-01

    Microtunneling is the newest method for making underground installations. The method is the result of experience with, and methods applied in, previous trenchless underground works. It is considered reasonable to elaborate a series of types of construction of tunneling machines in order to develop this particular earthworks method. There are many design solutions for such machines, but the current goal is to develop a non-excavation robotized machine. Erosion machines with main tunnel dimensions of 1600, 2000, 2500 and 3150 are designed with the use of computer-aided methods. The creation of the series of types of construction of tunneling machines was preceded by an analysis of the current state. The verification of the practical methodology for creating the systematic part series was based on the designed series of types of erosion machines. The following were developed: a method of construction similarity of the erosion machines; algorithmic methods for variant analyses of quantitative construction attributes in the I-DEAS advanced graphical program; and relational and program parameterization. The manufacturing process of the parts will then be created, which allows the technological process to be verified on CNC machines. The models designed will be modified and the construction consulted with erosion machine users and manufacturers such as Tauber Rohrbau GmbH & Co.KG from Minster and OHL ZS a.s. from Brna. The companies’ acceptance will result in practical verification by the JUMARPOL company.

  8. Channel and active component abstractions for WSN programming - a language model with operating system support

    OpenAIRE

    P. Harvey; Dearle, A.; Lewis, J.; Sventek, J.

    2012-01-01

    To support the programming of Wireless Sensor Networks, a number of unconventional programming models have evolved, in particular the event-based model. These models are non-intuitive to programmers due to the introduction of unnecessary, non-intrinsic complexity. Component-based languages like Insense can eliminate much of this unnecessary complexity via the use of active components and synchronous channels. However, simply layering an Insense implementation over an existing event-based syst...

  9. Tool-driven Design and Automated Parameterization for Real-time Generic Drivetrain Models

    Directory of Open Access Journals (Sweden)

    Schwarz Christina

    2015-01-01

    Real-time dynamic drivetrain modeling approaches have a great potential for development cost reduction in the automotive industry. Even though real-time drivetrain models are available, these solutions are specific to single transmission topologies. In this paper an environment for parameterization of a solution is proposed, based on a generic method applicable to all types of gear transmission topologies. This enables tool-guided modeling by non-experts in the fields of mechanical engineering and control theory, leading to reduced development and testing efforts. The approach is demonstrated for an exemplary automatic transmission using the environment for automated parameterization. Finally, the parameterization is validated via vehicle measurement data.

  10. Towards self-healing in distribution networks operation: Bipartite graph modelling for automated switching

    Energy Technology Data Exchange (ETDEWEB)

    Kost'alova, Alena; Carvalho, Pedro M.S. [Department of Electrical Engineering and Computers, Instituto Superior Tecnico, UTL, Av. Rovisco Pais, Lisbon (Portugal)

    2011-01-15

    The concept of self-healing has been recently introduced in power systems. The general self-healing framework is complex and includes several aspects of networks' operation. This paper deals with automated switching in the context of autonomous operation of distribution networks. The paper presents a new network data model that allows effective reconfiguration algorithms to be designed. The model is based on bipartite graph representation of switching possibilities. The model properties and capabilities are illustrated for simple self-healing algorithms and a small real world medium voltage distribution network. (author)
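    The bipartite idea (network nodes on one side, switches on the other, with energisation as reachability through closed switches) can be sketched as follows; the four-switch example network and its labels are hypothetical:

```python
from collections import deque

# Each switch connects two network nodes (feeders F* or load zones z*).
switch_ends = {"s1": ("F1", "z1"), "s2": ("z1", "z2"),
               "s3": ("z2", "z3"), "tie": ("F2", "z3")}

def energised(closed, sources=("F1", "F2")):
    """Return the set of load zones reachable from a feeder through
    closed switches (breadth-first search over the switch graph)."""
    adj = {}
    for s in closed:
        a, b = switch_ends[s]
        adj.setdefault(a, []).append(b)
        adj.setdefault(b, []).append(a)
    seen, queue = set(sources), deque(sources)
    while queue:
        u = queue.popleft()
        for v in adj.get(u, []):
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return {node for node in seen if node.startswith("z")}

# A fault forces s2 open, de-energising z2 and z3; closing the tie restores them.
before = energised({"s1", "s2", "s3"})
after_fault = energised({"s1", "s3"})
restored = energised({"s1", "s3", "tie"})
```

    A reconfiguration algorithm then searches over switch states for one that re-energises all zones while respecting radiality and loading constraints; the graph model above is only the representation layer that makes that search efficient.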

  11. A "Brutus" model checking of a spi-calculus dialect (Extended Abstract)

    NARCIS (Netherlands)

    Gnesi, S.; Latella, D.; Lenzini, G.

    2000-01-01

    This paper proposes a preliminary framework in which protocols, expressed in a dialect of the spi-calculus, can be verified using model checking algorithms. In particular we define a formal semantics for a dialect of the spi-calculus based on labeled transition systems in such a way that the model c

  12. Modelling Venting and Pressure Build-up in a 18650 LCO Cell during Thermal Runaway (ABSTRACT)

    DEFF Research Database (Denmark)

    Coman, Paul Tiberiu; Veje, Christian; White, Ralph;

    may lead to fires and explosions. To prevent this, it is therefore important to model thermal runaway considering different events such as venting and the pressure development inside the battery cell, which is the main purpose of this paper. A model consisting of the different decomposition...... reactions in the anode, cathode and SEI, but also in electrochemical reactions and boiling of the electrolyte is developed for a cylindrical 18650 LCO cell (Lithium Cobalt Oxide). For determining the pressure and the temperature after venting, the isentropic flow equations are included in the model....... By fitting the activation energies, and measuring experimentally the mass of the ejecta during thermal runaway, the model is compared and validated against an extensive experiment performed by Golukbov et al. [1] during oven heating. When analysing the results, it is found that by including the venting...

  13. An Ontology-based Model to Determine the Automation Level of an Automated Vehicle for Co-Driving

    OpenAIRE

    Pollard, Evangeline; Morignot, Philippe; Nashashibi, Fawzi

    2013-01-01

    Full autonomy of ground vehicles is a major goal of the ITS (Intelligent Transportation Systems) community. However, reaching such a high autonomy level in all situations (weather, traffic, ...) may seem difficult in practice, despite recent results regarding driverless cars (e.g., Google Cars). In addition, an automated vehicle should also self-assess its own perception abilities, and not only perceive its environment. In this paper, we propose an intermediate a...

  14. Towards automated software model checking using graph transformation systems and Bogor

    Institute of Scientific and Technical Information of China (English)

    Vahid RAFE; Adel T.RAHMANI

    2009-01-01

    Graph transformation systems have become a general formal modeling language for describing many models in the software development process. Behavioral modeling of dynamic systems and model-to-model transformations are only a few examples in which graphs have been applied in software development. But even the most perfect graph transformation system must be equipped with automated analysis capabilities to let users understand whether such a formal specification fulfills their requirements. In this paper, we present a new solution to verify graph transformation systems using the Bogor model checker. Attributed graph grammar (AGG)-like graph transformation systems are translated to the Bandera intermediate representation (BIR), the input language of Bogor, and Bogor verifies the model against properties of interest defined by combining linear temporal logic (LTL) and special-purpose graph rules. Experimental results are encouraging, showing that in most cases our solution improves on existing approaches in terms of both performance and expressiveness.

  15. An automation of design and modelling tasks in NX Siemens environment with original software - generator module

    Science.gov (United States)

    Zbiciak, M.; Grabowik, C.; Janik, W.

    2015-11-01

    Nowadays the constructional design process is almost exclusively aided with CAD/CAE/CAM systems. It is estimated that nearly 80% of design activities have a routine nature. These routine design tasks are highly susceptible to automation. Design automation is usually achieved with API tools which allow building original software for automating different engineering activities. In this paper, original software developed to automate engineering tasks at the product geometrical shape design stage is presented. The elaborated software works exclusively in the NX Siemens CAD/CAM/CAE environment and was prepared in Microsoft Visual Studio with application of the .NET technology and the NX SNAP library. The software functionality allows designing and modelling of spur and helicoidal involute gears. Moreover, it is possible to estimate relative manufacturing costs. With the Generator module it is possible to design and model both standard and non-standard gear wheels. The main advantage of a model generated in such a way is its better representation of the involute curve in comparison to those drawn in specialized standard CAD system tools. This comes from the fact that in CAD systems an involute curve is usually drawn through 3 points that correspond to points located on the addendum circle, the reference diameter of the gear and the base circle, respectively. In the Generator module the involute curve is drawn through 11 involute points located on and above the base and addendum circles; therefore the 3D gear wheel models are highly accurate. Application of the Generator module makes the modelling process very rapid, so that the gear wheel modelling time is reduced to several seconds. During the research, the differences between the standard 3-point and the 11-point involutes were analysed. The results and conclusions drawn from the analysis are shown in detail.
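    The multi-point involute sampling described above can be reproduced from the parametric involute equations; the radii and point count below are illustrative, not the Generator module's code:

```python
import math

def involute_points(r_base, r_max, n=11):
    """Sample n points on the involute of a base circle of radius r_base,
    from the base circle out to radius r_max (cf. the 11-point curve
    versus the usual 3-point CAD approximation)."""
    # Roll angle t at which the involute point reaches radius r_max:
    # radius(t) = r_base * sqrt(1 + t^2).
    t_max = math.sqrt((r_max / r_base) ** 2 - 1)
    pts = []
    for i in range(n):
        t = t_max * i / (n - 1)
        x = r_base * (math.cos(t) + t * math.sin(t))
        y = r_base * (math.sin(t) - t * math.cos(t))
        pts.append((x, y))
    return pts

pts = involute_points(r_base=40.0, r_max=50.0)
```

    Passing a spline through these 11 points gives a much closer approximation of the true tooth flank than a 3-point construction, which is the accuracy advantage the abstract claims.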

  16. A New Modular Strategy For Action Sequence Automation Using Neural Networks And Hidden Markov Models

    OpenAIRE

    Mohamed Adel Taher; Mostapha Abdeljawad

    2013-01-01

    In this paper, the authors propose a new hybrid strategy (using artificial neural networks and hidden Markov models) for skill automation. The strategy is based on the concept of using an “adaptive desired” that is introduced in the paper. The authors explain how using an adaptive desired can help a system for which an explicit model is not available or is difficult to obtain to smartly cope with environmental disturbances without requiring explicit rules specification (as with fuzzy syste...

  17. Automated model integration at source code level: An approach for implementing models into the NASA Land Information System

    Science.gov (United States)

    Wang, S.; Peters-Lidard, C. D.; Mocko, D. M.; Kumar, S.; Nearing, G. S.; Arsenault, K. R.; Geiger, J. V.

    2014-12-01

    Model integration bridges the data flow between modeling frameworks and models. However, models usually do not fit directly into a particular modeling environment unless they were designed for it. An example is implementing different types of models into the NASA Land Information System (LIS), a software framework for land-surface modeling and data assimilation. Model implementation requires scientific knowledge and software expertise, and it may take a developer months to learn LIS and the model software structure. Debugging and testing of the implementation are also time-consuming when LIS or the model is not fully understood, and this time is costly for research and operational projects. To address this issue, an approach has been developed to automate model integration into LIS. A general model interface was designed to retrieve the forcing inputs, parameters, and state variables needed by the model and to return state variables and outputs to LIS. Every model can be wrapped to comply with this interface, usually with a FORTRAN 90 subroutine; development then requires only knowledge of the model and basic programming skills. With such wrappers, the implementation logic is the same for all models, so code templates defined for the general model interface can be re-used with any specific model, and the implementation can be done automatically. An automated model implementation toolkit was developed with Microsoft Excel and its built-in VBA language. It takes model specifications in three worksheets and contains FORTRAN 90 code templates in VBA programs. According to the model specification, the toolkit generates the data structures and procedures within FORTRAN modules and subroutines that transfer data between LIS and the model wrapper. Model implementation is thereby standardized, and the development load is reduced by about 80-90%. In this presentation, the automated model implementation approach is described along with LIS programming
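
The template-expansion idea can be sketched in a few lines: a model specification (a Python dict here, standing in for the Excel worksheets) is substituted into a FORTRAN 90 wrapper skeleton. All names below (`noahmp`, the `_run` entry point, the argument list) are hypothetical illustrations, not the toolkit's actual templates:

```python
# Skeleton of a LIS-style model wrapper; placeholders are filled per model.
WRAPPER_TEMPLATE = """\
subroutine {model}_lis_wrapper(n, forcing, state, output)
  implicit none
  integer, intent(in)    :: n
  real,    intent(in)    :: forcing(n)   ! {forcing_desc}
  real,    intent(inout) :: state(n)     ! {state_desc}
  real,    intent(out)   :: output(n)    ! {output_desc}
  call {model}_run(n, forcing, state, output)
end subroutine {model}_lis_wrapper
"""

def generate_wrapper(spec):
    """Expand the wrapper skeleton from a model specification dict."""
    return WRAPPER_TEMPLATE.format(**spec)

wrapper_code = generate_wrapper({
    "model": "noahmp",
    "forcing_desc": "forcing inputs retrieved from LIS",
    "state_desc": "model state variables",
    "output_desc": "outputs returned to LIS",
})
```

Because the interface is fixed, the same expansion works for every wrapped model; only the specification dict changes.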

  18. Radionuclide Transport Modelling: Current Status and Future Needs. Synthesis, Work Group Reports and Extended Abstracts

    International Nuclear Information System (INIS)

    The workshop identified a set of critical issues for the Swedish Nuclear Power Inspectorate (SKI) and the Swedish Radiation Protection Authority (SSI) to address in preparing for future reviews of license applications, which have subsequently been considered in preparing this synthesis. Structure for organising expert participation: a structure for organising expert participation in future reviews is proposed, based on clearinghouses for (1) regulatory application and context, (2) engineered barrier systems, (3) geosphere, (4) biosphere, and (5) performance assessment integration and calculations. As part of their work, these clearinghouses could identify key issues that need to be resolved prior to future reviews. Performance assessment strategy and review context: future reviews will be conducted in the context of regulations based on risk criteria; this leads to a need to review the methods used in probabilistic risk assessment, as well as the underlying process models, and a plan is needed for accomplishing both aims. Despite the probabilistic framework, a need is anticipated for targeted, deterministic calculations to check particular assumptions. Priorities and ambition level for reviews: SKI's and SSI's resources can be utilised more efficiently through an early review of SKB's safety case, so that if necessary the authorities can make an early start on evaluating topics of primary significance to the safety case. As a guide to planning the allocation of effort in future reviews, this workshop produced a preliminary ranking of technical issues on a scale from 'non-controversial' to 'requiring independent modelling'. Analysis of repository system and scenarios: systems analysis tools including features/events/processes encyclopaedias, process-influence diagrams, and assessment-model flowcharts should be used as review tools, to check the processes and influences considered in SKB's analyses, and to evaluate the comprehensiveness of the scenarios that are

  19. Emergence of Consensus in a Multi-Robot Network: from Abstract Models to Empirical Validation

    CERN Document Server

    Trianni, Vito; Reina, Andreagiovanni; Baronchelli, Andrea

    2016-01-01

    Consensus dynamics in decentralised multi-agent systems have been intensely studied, and several different models have been proposed and analysed. Among these, the naming game stands out for its simplicity and applicability to a wide range of phenomena and applications, from semiotics to engineering. Despite the wide range of studies available, implementing theoretical models in real distributed systems is not always straightforward, as the physical platform imposes several constraints that may have a bearing on the consensus dynamics. In this paper, we investigate the effects of an implementation of the naming game on the kilobot robotic platform, in which we consider concurrent execution of games and physical interference. Consensus dynamics are analysed in the light of the continuously evolving communication network created by the robots, highlighting how the different regimes crucially depend on the robot density and on their ability to spread widely in the experimental arena. We find that ph...
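
Stripped of physical interference and concurrency, the underlying naming game is only a few lines. This sketch shows the pairwise-game dynamics the paper builds on (mean-field version with arbitrary parameters, not the kilobot implementation):

```python
import random

def naming_game(n_agents=50, max_steps=20000, seed=1):
    """Minimal mean-field naming game: a random speaker utters a word to a
    random hearer; on success both collapse their vocabularies to that word,
    on failure the hearer records it. Returns games played to consensus."""
    rng = random.Random(seed)
    vocab = [set() for _ in range(n_agents)]
    next_word = 0
    for step in range(max_steps):
        speaker, hearer = rng.sample(range(n_agents), 2)
        if not vocab[speaker]:            # empty vocabulary: invent a word
            vocab[speaker].add(next_word)
            next_word += 1
        word = rng.choice(sorted(vocab[speaker]))
        if word in vocab[hearer]:         # success: both agree on the word
            vocab[speaker] = {word}
            vocab[hearer] = {word}
        else:                             # failure: hearer learns the word
            vocab[hearer].add(word)
        if all(v == {word} for v in vocab):
            return step + 1               # global consensus reached
    return None

steps_to_consensus = naming_game()
```

On the physical platform, the locally evolving communication network replaces the "any pair may interact" assumption made here, which is exactly what changes the consensus regimes.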

  20. Modeling and predicting abstract concept or idea introduction and propagation through geopolitical groups

    Science.gov (United States)

    Jaenisch, Holger M.; Handley, James W.; Hicklen, Michael L.

    2007-04-01

    This paper describes a novel capability for modeling known idea-propagation transformations and predicting responses to new ideas from geopolitical groups. Ideas are captured using semantic words that are text based and carry cognitive definitions. We demonstrate a unique algorithm for converting these into analytical predictive equations. Using the illustrative idea of "proposing a gasoline price increase of $1 per gallon from $2" and its changing perceived impact across 5 demographic groups, we identify 13 cost-of-living Diplomatic, Information, Military, and Economic (DIME) features common across all 5 demographic groups. This enables modeling and monitoring of the Political, Military, Economic, Social, Information, and Infrastructure (PMESII) effects of this idea on each group and how their "perception" of the proposal changes. Our algorithm and results are summarized in this paper.

  1. Automated modelling of spatially-distributed glacier ice thickness and volume

    Science.gov (United States)

    James, William H. M.; Carrivick, Jonathan L.

    2016-07-01

    Ice thickness distribution and volume are both key parameters for glaciological and hydrological applications. This study presents VOLTA (Volume and Topography Automation), a Python script tool for ArcGIS™ that requires just a digital elevation model (DEM) and glacier outline(s) to model distributed ice thickness, volume and bed topography. Ice thickness is initially estimated at points along an automatically generated centreline network based on the perfect-plasticity rheology assumption, taking into account a valley-side drag component of the force-balance equation. Distributed ice thickness is subsequently interpolated using a glaciologically correct algorithm. For five glaciers with independent field-measured bed topography, VOLTA modelled volumes ranged from 26.5% below to 16.6% above those derived from field observations. The greatest differences occurred where the valley cross-section was asymmetric or where significant valley infill had occurred. Compared with other methods of modelling ice thickness and volume, key advantages of VOLTA are: a fully automated approach with a user-friendly graphical user interface (GUI), GIS-consistent geometry, fully automated centreline generation, inclusion of a side drag component in the force-balance equation, estimation of basal shear stress for each individual glacier, fully distributed ice thickness output and the ability to process multiple glaciers rapidly. VOLTA is capable of regional-scale ice volume assessment, a key parameter for exploring glacier response to climate change. VOLTA also permits subtraction of modelled ice thickness from the input surface elevation to produce an ice-free DEM, a key input for reconstructing former glaciers. VOLTA could thus assist with predicting future glacier geometry changes and hence with projecting future meltwater fluxes.
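
The centreline estimate rests on the perfect-plasticity assumption, which in its simplest form gives h = τ_b / (f ρ g sin α), where f < 1 is a shape factor representing valley-side drag. A one-line sketch (the stress, slope and shape-factor values below are illustrative, not VOLTA's calibrated values):

```python
import math

RHO_ICE = 917.0   # ice density, kg m^-3
G = 9.81          # gravitational acceleration, m s^-2

def ice_thickness(tau_b, slope_deg, shape_factor=0.8):
    """Perfect-plasticity thickness h = tau_b / (f * rho * g * sin(alpha)).
    shape_factor f (< 1) stands in for the valley-side drag term."""
    alpha = math.radians(slope_deg)
    return tau_b / (shape_factor * RHO_ICE * G * math.sin(alpha))

# e.g. 100 kPa basal shear stress on a 10-degree surface slope
h = ice_thickness(tau_b=100e3, slope_deg=10.0)
```

VOLTA applies this point-wise along the centreline network (with a per-glacier basal shear stress estimate) before interpolating the distributed field.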

  2. The development and verification of a highly accurate collision prediction model for automated noncoplanar plan delivery

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Victoria Y.; Tran, Angelia; Nguyen, Dan; Cao, Minsong; Ruan, Dan; Low, Daniel A.; Sheng, Ke, E-mail: ksheng@mednet.ucla.edu [Department of Radiation Oncology, David Geffen School of Medicine, University of California Los Angeles, Los Angeles, California 90024 (United States)

    2015-11-15

    Purpose: Significant dosimetric benefits had been previously demonstrated in highly noncoplanar treatment plans. In this study, the authors developed and verified an individualized collision model for the purpose of delivering highly noncoplanar radiotherapy and tested the feasibility of total delivery automation with Varian TrueBeam developer mode. Methods: A hand-held 3D scanner was used to capture the surfaces of an anthropomorphic phantom and a human subject, which were positioned with a computer-aided design model of a TrueBeam machine to create a detailed virtual geometrical collision model. The collision model included gantry, collimator, and couch motion degrees of freedom. The accuracy of the 3D scanner was validated by scanning a rigid cubical phantom with known dimensions. The collision model was then validated by generating 300 linear accelerator orientations corresponding to 300 gantry-to-couch and gantry-to-phantom distances, and comparing the corresponding distance measurements to their corresponding models. The linear accelerator orientations reflected uniformly sampled noncoplanar beam angles to the head, lung, and prostate. The distance discrepancies between measurements on the physical and virtual systems were used to estimate treatment-site-specific safety buffer distances with 0.1%, 0.01%, and 0.001% probability of collision between the gantry and couch or phantom. Plans containing 20 noncoplanar beams to the brain, lung, and prostate optimized via an in-house noncoplanar radiotherapy platform were converted into XML script for automated delivery and the entire delivery was recorded and timed to demonstrate the feasibility of automated delivery. Results: The 3D scanner measured the dimension of the 14 cm cubic phantom within 0.5 mm. The maximal absolute discrepancy between machine and model measurements for gantry-to-couch and gantry-to-phantom was 0.95 and 2.97 cm, respectively. The reduced accuracy of gantry-to-phantom measurements was

  3. An architecture and model for cognitive engineering simulation analysis - Application to advanced aviation automation

    Science.gov (United States)

    Corker, Kevin M.; Smith, Barry R.

    1993-01-01

    The process of designing crew stations for large-scale, complex automated systems is made difficult because of the flexibility of roles that the crew can assume, and by the rapid rate at which system designs become fixed. Modern cockpit automation frequently involves multiple layers of control and display technology in which human operators must exercise equipment in augmented, supervisory, and fully automated control modes. In this context, we maintain that effective human-centered design is dependent on adequate models of human/system performance in which representations of the equipment, the human operator(s), and the mission tasks are available to designers for manipulation and modification. The joint Army-NASA Aircrew/Aircraft Integration (A3I) Program, with its attendant Man-machine Integration Design and Analysis System (MIDAS), was initiated to meet this challenge. MIDAS provides designers with a test bed for analyzing human-system integration in an environment in which both cognitive human function and 'intelligent' machine function are described in similar terms. This distributed object-oriented simulation system, its architecture and assumptions, and our experiences from its application in advanced aviation crew stations are described.

  4. Automated Generation of Formal Models from ST Control Programs for Verification Purposes

    CERN Document Server

    Fernandez Adiego, B; Tournier, J-C; Blanco Vinuela, E; Blech, J-O; Gonzalez Suarez, V

    2014-01-01

    In large industrial control systems such as the ones installed at CERN, one of the main issues is the ability to verify the correct behaviour of the Programmable Logic Controller (PLC) programs. While manual and automated testing can achieve good results, some obvious problems remain unsolved such as the difficulty to check safety or liveness properties. This paper proposes a general methodology and a tool to verify PLC programs by automatically generating formal models for different model checkers out of ST code. The proposed methodology defines an automata-based formalism used as intermediate model (IM) to transform PLC programs written in ST language into different formal models for verification purposes. A tool based on Xtext has been implemented that automatically generates models for the NuSMV and UPPAAL model checkers and the BIP framework.

  5. Learning with Technology: Video Modeling with Concrete-Representational-Abstract Sequencing for Students with Autism Spectrum Disorder.

    Science.gov (United States)

    Yakubova, Gulnoza; Hughes, Elizabeth M; Shinaberry, Megan

    2016-07-01

    The purpose of this study was to determine the effectiveness of a video modeling intervention with concrete-representational-abstract instructional sequence in teaching mathematics concepts to students with autism spectrum disorder (ASD). A multiple baseline across skills design of single-case experimental methodology was used to determine the effectiveness of the intervention on the acquisition and maintenance of addition, subtraction, and number comparison skills for four elementary school students with ASD. Findings supported the effectiveness of the intervention in improving skill acquisition and maintenance at a 3-week follow-up. Implications for practice and future research are discussed. PMID:26983919

  6. A Psycholinguistic Model for Simultaneous Translation, and Proficiency Assessment by Automated Acoustic Analysis of Discourse.

    Science.gov (United States)

    Yaghi, Hussein M.

    Two separate but related issues are addressed: how simultaneous translation (ST) works on a cognitive level and how such translation can be objectively assessed. Both of these issues are discussed in the light of qualitative and quantitative analyses of a large corpus of recordings of ST and shadowing. The proposed ST model utilises knowledge derived from a discourse analysis of the data, many accepted facts in the psychology tradition, and evidence from controlled experiments that are carried out here. This model has three advantages: (i) it is based on analyses of extended spontaneous speech rather than word-, syllable-, or clause -bound stimuli; (ii) it draws equally on linguistic and psychological knowledge; and (iii) it adopts a non-traditional view of language called 'the linguistic construction of reality'. The discourse-based knowledge is also used to develop three computerised systems for the assessment of simultaneous translation: one is a semi-automated system that treats the content of the translation; and two are fully automated, one of which is based on the time structure of the acoustic signals whilst the other is based on their cross-correlation. For each system, several parameters of performance are identified, and they are correlated with assessments rendered by the traditional, subjective, qualitative method. Using signal processing techniques, the acoustic analysis of discourse leads to the conclusion that quality in simultaneous translation can be assessed quantitatively with varying degrees of automation. It identifies as measures of performance (i) three content-based standards; (ii) four time management parameters that reflect the influence of the source on the target language time structure; and (iii) two types of acoustical signal coherence. Proficiency in ST is shown to be directly related to coherence and speech rate but inversely related to omission and delay. High proficiency is associated with a high degree of simultaneity and
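
One of the two fully automated systems scores translations by cross-correlating the acoustic signals of source and target speech. A stripped-down sketch of lag-searching normalized cross-correlation on toy pulse signals (the thesis's actual signal processing is far richer; this only shows the mechanism):

```python
def norm_xcorr(x, y, max_lag):
    """Normalized cross-correlation over a lag window. The returned best
    lag is positive when y is delayed relative to x."""
    n = len(x)
    def dot(a, b):
        return sum(ai * bi for ai, bi in zip(a, b))
    norm = (dot(x, x) * dot(y, y)) ** 0.5
    best_lag, best_r = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            r = dot(x[:n - lag], y[lag:]) / norm   # correlate x[i] with y[i+lag]
        else:
            r = dot(x[-lag:], y[:n + lag]) / norm
        if r > best_r:
            best_lag, best_r = lag, r
    return best_lag, best_r

# y is x delayed by three samples, so the peak should sit at lag = +3
x = [0, 0, 1, 2, 3, 2, 1, 0, 0, 0, 0, 0]
y = [0, 0, 0, 0, 0, 1, 2, 3, 2, 1, 0, 0]
best_lag, best_r = norm_xcorr(x, y, max_lag=5)
```

In an assessment setting, the lag at the correlation peak relates to the delay parameter, and the peak height to the coherence between source and target signals.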

  7. Illuminance-based slat angle selection model for automated control of split blinds

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Jia; Olbina, Svetlana [Rinker School of Building Construction, University of Florida, Gainesville, FL 32611-5703 (United States)

    2011-03-15

    Venetian blinds play an important role in controlling daylight in buildings. Automated blinds overcome some limitations of manual blinds; however, existing automated systems mainly control direct solar radiation and glare and cannot be used for controlling innovative blind systems such as split blinds. This research developed an Illuminance-based Slat Angle Selection (ISAS) model that predicts the optimum slat angles of split blinds to achieve the designed indoor illuminance. The model was constructed from a series of multi-layer feed-forward artificial neural networks (ANNs). The illuminance values at the sensor points used to develop the ANNs were obtained with the software EnergyPlus™. The weather determinants (such as horizontal illuminance and sun angles) were the input variables for the ANNs, and the illuminance level at a sensor point was the output variable. The ISAS model was validated by evaluating the errors in calculating 1) the illuminance and 2) the optimum slat angles. The validation results showed that the accuracy of the ISAS model in predicting illuminance was 94.7%, while its accuracy in calculating the optimum slat angles was 98.5%. For about 90% of the time in the year, the illuminance percentage errors were less than 10%, and the percentage errors in calculating the optimum slat angles were less than 5%. This research offers a new approach for the automated control of split blinds and a guide for future research to utilize the adaptive nature of ANNs to develop a more practical and applicable blind control system. (author)
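
The selection logic can be separated from the learned model: given any predictor of illuminance as a function of slat angle and weather inputs, ISAS-style selection is a one-dimensional search for the angle whose prediction is closest to the design target. Here a toy surrogate stands in for the trained ANNs (all coefficients are invented for illustration):

```python
import math

def predicted_illuminance(slat_angle_deg, horiz_illum, sun_alt_deg):
    """Stand-in for the trained ANN: a smooth toy surrogate mapping
    sky condition and slat angle to indoor illuminance (illustrative only)."""
    return (horiz_illum * 0.02
            * math.cos(math.radians(slat_angle_deg))
            * math.sin(math.radians(sun_alt_deg)))

def select_slat_angle(target_lux, horiz_illum, sun_alt_deg, step=5):
    """ISAS-style selection: among candidate slat angles, pick the one whose
    predicted illuminance is closest to the design target."""
    return min(range(0, 91, step),
               key=lambda a: abs(predicted_illuminance(a, horiz_illum,
                                                       sun_alt_deg) - target_lux))

best_angle = select_slat_angle(target_lux=500.0, horiz_illum=40000.0,
                               sun_alt_deg=45.0)
```

Replacing the surrogate with the trained ANNs, and running the search per sensor point and blind section, recovers the structure of the ISAS model.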

  8. Intelligent sensor-model automated control of PMR-15 autoclave processing

    Science.gov (United States)

    Hart, S.; Kranbuehl, D.; Loos, A.; Hinds, B.; Koury, J.

    An intelligent sensor model system has been built and used for automated control of the PMR-15 cure process in the autoclave. The system uses frequency-dependent FM sensing (FDEMS), the Loos processing model, and the Air Force QPAL intelligent software shell. The Loos model is used to predict and optimize the cure process including the time-temperature dependence of the extent of reaction, flow, and part consolidation. The FDEMS sensing system in turn monitors, in situ, the removal of solvent, changes in the viscosity, reaction advancement and cure completion in the mold continuously throughout the processing cycle. The sensor information is compared with the optimum processing conditions from the model. The QPAL composite cure control system allows comparison of the sensor monitoring with the model predictions to be broken down into a series of discrete steps and provides a language for making decisions on what to do next regarding time-temperature and pressure.

  9. IDENTIFICATION OF TYPES AND MODELS OF AIRCRAFT USING ASC-ANALYSIS OF THEIR SILHOUETTES (CONTOURS): GENERALIZATION, ABSTRACTION, CLASSIFICATION AND IDENTIFICATION

    Directory of Open Access Journals (Sweden)

    Lutsenko Y. V.

    2015-12-01

    The article discusses the application of automated system-cognitive analysis (ASC-analysis), its mathematical model (the system theory of information) and its software tool, the intellectual system called "Eidos", to problems of identifying the types and models of aircraft by their silhouettes on the ground, more precisely by their external contours: (1) digitization of scanned images of aircraft and creation of their mathematical models; (2) formation of mathematical models of specific aircraft using information theory; (3) modelling of the generalized images of various aircraft types and models and their graphic visualization; (4) comparison of the image of a particular plane with the generalized images of various aircraft types and models, quantifying the degree of similarity and difference between them, i.e., identification of the type and model of an airplane by its silhouette (contour) on the ground; (5) quantification of the similarities and differences of the generalized images of the planes with each other, i.e., cluster-constructive analysis of generalized images of various aircraft types and models. The article gives a new approach to digitizing images of aircraft, based on the polar coordinate system, the centre of gravity of the image and its external contour. Before digitization, the images may be transformed to standardize their position, size (resolution, distance) and rotation angle (in three dimensions), so that the results of digitization and ASC-analysis can be invariant (independent) with respect to position, size and rotation. The shape of the contour of a particular aircraft is treated as noisy information on the type and model of the aircraft, comprising information about the true shape of the aircraft type and its model (the clean signal) and noise, which distorts the real shape, due to noise influences, both of the means of
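
The polar-coordinate digitization can be sketched as sampling the centroid-to-contour distance at fixed angles and normalizing by the mean radius for scale invariance. This simplified nearest-angle version is illustrative; the article's actual procedure may differ:

```python
import math

def polar_signature(contour, n_rays=36):
    """Digitize a closed contour as centroid-to-boundary distances sampled
    at n_rays evenly spaced polar angles, normalized by the mean radius
    (scale-invariant; simple nearest-angle lookup)."""
    cx = sum(p[0] for p in contour) / len(contour)
    cy = sum(p[1] for p in contour) / len(contour)
    # (angle, radius) of every contour point relative to the centroid
    polar = sorted((math.atan2(y - cy, x - cx), math.hypot(x - cx, y - cy))
                   for x, y in contour)
    sig = []
    for k in range(n_rays):
        theta = -math.pi + 2.0 * math.pi * k / n_rays
        _, r = min(polar, key=lambda ar: abs(ar[0] - theta))
        sig.append(r)
    mean_r = sum(sig) / len(sig)
    return [r / mean_r for r in sig]

# a circle of any radius yields a flat signature of ones
circle = [(5.0 * math.cos(2 * math.pi * k / 72),
           5.0 * math.sin(2 * math.pi * k / 72)) for k in range(72)]
signature = polar_signature(circle)
```

Rotation invariance would additionally require aligning or cyclically shifting the signature, which is where the standardizing transformations mentioned in the abstract come in.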

  10. GoSam 2.0: Automated one loop calculations within and beyond the Standard Model

    CERN Document Server

    Greiner, Nicolas

    2014-01-01

    We present GoSam 2.0, a fully automated framework for the generation and evaluation of one-loop amplitudes in multi-leg processes. The new version offers numerous improvements both in amplitude generation and in the reduction, leading to faster and more stable code for calculations within and beyond the Standard Model. It also contains an extended version of the standardized interface to Monte Carlo programs, which allows for easy combination with other existing tools. We briefly describe the conceptual innovations and present some phenomenological results.

  11. An automated model-based aim point distribution system for solar towers

    Science.gov (United States)

    Schwarzbözl, Peter; Rong, Amadeus; Macke, Ansgar; Säck, Jan-Peter; Ulmer, Steffen

    2016-05-01

    Distribution of heliostat aim points is a major task during central receiver operation, as the flux distribution produced by the heliostats varies continuously with time. Known methods for aim point distribution are mostly based on simple aim point patterns and focus on control strategies to meet local temperature and flux limits of the receiver. Lowering the peak flux on the receiver to avoid hot spots and maximizing thermal output are obviously competing targets that call for a comprehensive optimization process. This paper presents a model-based method for online aim point optimization that includes the current heliostat field mirror quality derived through an automated deflectometric measurement process.
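
The competing targets (lowering peak flux while maximizing thermal output) can be illustrated with a deliberately simplified greedy balancer: each heliostat's flux contribution (reduced here to a single number) goes to the aim point with the lowest accumulated flux. The real optimizer works with 2-D flux images, receiver limits and measured mirror quality; this shows only the load-balancing flavour of the problem:

```python
def assign_aim_points(heliostat_fluxes, n_points):
    """Greedy aim-point distribution sketch: largest contributors first,
    each assigned to the aim point with the lowest accumulated flux,
    keeping the peak down (longest-processing-time balancing)."""
    flux = [0.0] * n_points
    for f in sorted(heliostat_fluxes, reverse=True):
        k = min(range(n_points), key=lambda i: flux[i])
        flux[k] += f
    return flux

# six heliostats (arbitrary flux contributions) spread over three aim points
receiver_flux = assign_aim_points([5.0, 4.0, 3.0, 3.0, 2.0, 1.0], 3)
```

A model-based system re-runs such an assignment online as the per-heliostat flux predictions change with sun position and mirror condition.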

  12. Electronic design automation of analog ICs combining gradient models with multi-objective evolutionary algorithms

    CERN Document Server

    Rocha, Frederico AE; Lourenço, Nuno CC; Horta, Nuno CG

    2013-01-01

    This book belongs to the scientific area of electronic design automation (EDA) and addresses the automatic sizing of analog integrated circuits (ICs). In particular, it presents an approach to enhancing a state-of-the-art layout-aware circuit-level optimizer (GENOM-POF) by embedding statistical knowledge from an automatically generated gradient model into the multi-objective, multi-constraint optimization kernel based on the NSGA-II algorithm. The results shown allow the designer to explore the different trade-offs of the solution space, both through the achieved device sizes, or the resp

  13. Automated Brain Structure Segmentation Based on Atlas Registration and Appearance Models

    DEFF Research Database (Denmark)

    van der Lijn, Fedde; de Bruijne, Marleen; Klein, Stefan;

    2012-01-01

    Accurate automated brain structure segmentation methods facilitate the analysis of large-scale neuroimaging studies. This work describes a novel method for brain structure segmentation in magnetic resonance images that combines information about a structure's location and appearance. The spatial model is implemented by registering multiple atlas images to the target image and creating a spatial probability map. The structure's appearance is modeled by a classifier based on Gaussian scale-space features. These components are combined with a regularization term in a Bayesian framework that is globally optimized using graph cuts. The incorporation of the appearance model enables the method to segment structures with complex intensity distributions and increases its robustness against errors in the spatial model. The method is tested in cross-validation experiments on two datasets acquired

  14. Automated determination of fibrillar structures by simultaneous model building and fiber diffraction refinement.

    Science.gov (United States)

    Potrzebowski, Wojciech; André, Ingemar

    2015-07-01

    For highly oriented fibrillar molecules, three-dimensional structures can often be determined from X-ray fiber diffraction data. However, because of limited information content, structure determination and validation can be challenging. We demonstrate that automated structure determination of protein fibers can be achieved by guiding the building of macromolecular models with fiber diffraction data. We illustrate the power of our approach by determining the structures of six bacteriophage viruses de novo using fiber diffraction data alone and together with solid-state NMR data. Furthermore, we demonstrate the feasibility of molecular replacement from monomeric and fibrillar templates by solving the structure of a plant virus using homology modeling and protein-protein docking. The generated models explain the experimental data to the same degree as deposited reference structures but with improved structural quality. We also developed a cross-validation method for model selection. The results highlight the power of fiber diffraction data as structural constraints. PMID:25961412

  15. Automated Translation and Thermal Zoning of Digital Building Models for Energy Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Nathaniel L. [Cornell University; McCrone, Colin J. [Cornell University; Walter, Bruce J. [Cornell University; Pratt, Kevin B. [Cornell University; Greenberg, Donald P. [Cornell University

    2013-08-26

    Building energy simulation is valuable during the early stages of design, when decisions can have the greatest impact on energy performance. However, preparing digital design models for building energy simulation typically requires tedious manual alteration. This paper describes a series of five automated steps to translate geometric data from an unzoned CAD model into a multi-zone building energy model. First, CAD input is interpreted as geometric surfaces with materials. Second, surface pairs defining walls of various thicknesses are identified. Third, normal directions of unpaired surfaces are determined. Fourth, space boundaries are defined. Fifth, optionally, settings from previous simulations are applied, and spaces are aggregated into a smaller number of thermal zones. Building energy models created quickly using this method can offer guidance throughout the design process.
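
Of the five steps, the second (identifying surface pairs that define walls of various thicknesses) is the most geometric: two planar surfaces belong to one wall if their normals are anti-parallel and their separation is a plausible thickness. A minimal sketch, where the `(point_on_plane, unit_normal)` data model is an assumption for illustration, not the paper's internal representation:

```python
def pair_wall_surfaces(surfaces, max_thickness=0.5):
    """Pair planar surfaces with anti-parallel unit normals whose separation
    (the candidate wall thickness) is positive and below a threshold.
    Each surface is (point_on_plane, unit_normal) in 3D."""
    def dot(a, b):
        return sum(p * q for p, q in zip(a, b))
    pairs, used = [], set()
    for i, (pi, ni) in enumerate(surfaces):
        if i in used:
            continue
        for j in range(i + 1, len(surfaces)):
            if j in used:
                continue
            pj, nj = surfaces[j]
            if dot(ni, nj) > -0.999:          # normals not anti-parallel
                continue
            thickness = abs(dot([a - b for a, b in zip(pj, pi)], ni))
            if 0.0 < thickness <= max_thickness:
                pairs.append((i, j, thickness))
                used.update((i, j))
                break
    return pairs

# two opposite faces of a 0.3-unit-thick wall, plus an unrelated floor surface
walls = pair_wall_surfaces([
    ((0.0, 0.0, 0.0), (1.0, 0.0, 0.0)),
    ((0.3, 0.0, 0.0), (-1.0, 0.0, 0.0)),
    ((0.0, 0.0, 0.0), (0.0, 0.0, 1.0)),
])
```

Unpaired surfaces left over by this test are the ones whose normal directions must then be determined in step three.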

  16. PDB_REDO: automated re-refinement of X-ray structure models in the PDB.

    Science.gov (United States)

    Joosten, Robbie P; Salzemann, Jean; Bloch, Vincent; Stockinger, Heinz; Berglund, Ann-Charlott; Blanchet, Christophe; Bongcam-Rudloff, Erik; Combet, Christophe; Da Costa, Ana L; Deleage, Gilbert; Diarena, Matteo; Fabbretti, Roberto; Fettahi, Géraldine; Flegel, Volker; Gisel, Andreas; Kasam, Vinod; Kervinen, Timo; Korpelainen, Eija; Mattila, Kimmo; Pagni, Marco; Reichstadt, Matthieu; Breton, Vincent; Tickle, Ian J; Vriend, Gert

    2009-06-01

    Structural biology, homology modelling and rational drug design require accurate three-dimensional macromolecular coordinates. However, the coordinates in the Protein Data Bank (PDB) have not all been obtained using the latest experimental and computational methods. In this study a method is presented for automated re-refinement of existing structure models in the PDB. A large-scale benchmark with 16 807 PDB entries showed that they can be improved in terms of fit to the deposited experimental X-ray data as well as in terms of geometric quality. The re-refinement protocol uses TLS models to describe concerted atom movement. The resulting structure models are made available through the PDB_REDO databank (http://www.cmbi.ru.nl/pdb_redo/). Grid computing techniques were used to overcome the computational requirements of this endeavour. PMID:22477769

  17. A Study on Automated Context-aware Access Control Model Using Ontology

    Science.gov (United States)

    Jang, Bokman; Jang, Hyokyung; Choi, Euiin

    Applications in a context-aware computing environment are connected through wireless networks to a variety of devices. Consequently, reckless access to information resources can disrupt the system, so managing access authority is a very important issue, both for protecting information resources and for adapting the system through the establishment of the security policy the system needs. Existing security models, however, grant access to resources simply through a user ID and password, and they do not take the user's environment information into account. In this paper, we propose a model of automated context-aware access control using ontology, which can control resources more efficiently through inference and judgment over context information, collected from the user and the user's environment, by means of ontology modeling.

  18. Semi-automated proper orthogonal decomposition reduced order model non-linear analysis for future BWR stability

    International Nuclear Information System (INIS)

    Highlights: • Techniques within the field of ROMing based on POD are reviewed regarding “well-behaved” applications. • A systematic, general, mostly automated reduction methodology based on POD is derived. • It is applicable to many classes of dynamical problems including the envisioned BWR application. • Robustness of this approach is demonstrated by a “pathological” test example. • The derived ROM accurately predicts dynamics of transients not included in the data set. - Abstract: Thermal–hydraulic coupling between power, flow rate and density, intensified by neutronics feedback, is the main driver determining the stability behavior of a boiling water reactor (BWR). High-power low-flow conditions in connection with unfavorable power distributions can lead the BWR system into unstable regions where power oscillations can be triggered. This important threat to operational safety requires careful analysis for proper understanding. Current design rules assure admissible operation conditions by exclusion regions determined by numerical calculations and analytical methods based on non-linear states for specific transients. Analyzing an exhaustive parameter space of the non-linear BWR system becomes feasible with methodologies based on reduced order models (ROMs), saving computational cost and improving the physical understanding. A new self-contained methodology is developed, based on the general proper orthogonal decomposition (POD) reduction technique. It is mostly automated, applicable to generic partial differential equation (PDE) systems, and reduces them in a grid-free manner to a small ordinary differential equation (ODE) system able to capture even non-linear dynamics. This allows a much more extensive analysis of the represented physical system. Symbolic mathematical manipulations are performed automatically by Mathematica routines. A novel and general calibration roadmap is proposed which simplifies choices on specific POD
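
The core reduction step behind any POD-based ROM is extracting dominant spatial modes from solution snapshots. A deliberately tiny pure-Python sketch, using power iteration on the spatial covariance C = X Xᵀ to recover the leading mode (the paper's Mathematica pipeline performs the full grid-free symbolic reduction; this only illustrates the mode extraction):

```python
import math

def dominant_pod_mode(snapshots, iters=100):
    """Leading POD mode of a set of equal-length snapshot vectors, via
    power iteration on the spatial covariance C[i][j] = sum_k X[i,k] X[j,k]."""
    n = len(snapshots[0])
    C = [[sum(s[i] * s[j] for s in snapshots) for j in range(n)]
         for i in range(n)]
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(C[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# snapshots that are all scalings of one spatial shape: POD must recover it
shape = [math.sin(math.pi * (i + 1) / 9) for i in range(8)]
snapshots = [[a * s for s in shape] for a in (1.0, -0.5, 2.0)]
mode = dominant_pod_mode(snapshots)
```

Projecting the governing PDE onto a handful of such modes is what yields the small ODE system the abstract describes.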

  19. Automated Trust Negotiation with Time Behavior

    Directory of Open Access Journals (Sweden)

    Zuo Chen

    2011-08-01

Full Text Available In automated trust negotiation (ATN), strangers build a trust relationship by alternately disclosing attribute credentials. Recent studies of ATN have lacked a rigorous formal definition of the ATN abstract model and have not addressed time behavior, leaving systems open to denial-of-service attacks that can paralyze them. We present the basic components of an ATN abstract model, defining it as a state-transition system. On this formal basis, time behavior is then explored: we analyze and discuss the time semantics of ATN and describe an extended ATN by constructing a state-transition system with time behavior, which makes the ATN abstract model more comprehensive and secure. The satisfiability of security policies in ATN is also discussed.
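A state-transition system with a time bound of the kind described can be sketched as below (a hypothetical, minimal encoding invented for illustration: the class name, the credential-prerequisite policy map, and the idea of expiring a session to resist denial of service are this sketch's assumptions, not the paper's formalism):

```python
import time

class ATNSession:
    """Minimal ATN sketch: each credential disclosure is a transition,
    and sessions exceeding a deadline are aborted (the time behavior)."""

    def __init__(self, policy, deadline_s=5.0):
        self.policy = policy          # credential -> set of prerequisite credentials
        self.disclosed = set()
        self.state = "NEGOTIATING"
        self.deadline = time.monotonic() + deadline_s

    def disclose(self, credential, now=None):
        now = time.monotonic() if now is None else now
        if now > self.deadline:
            self.state = "ABORTED"    # time rule: expired sessions cannot proceed
            return self.state
        prereqs = self.policy.get(credential, set())
        if prereqs <= self.disclosed: # policy satisfied by prior disclosures
            self.disclosed.add(credential)
            if credential == "resource":
                self.state = "SUCCESS"
        return self.state

sess = ATNSession({"resource": {"clearance"}, "clearance": set()})
sess.disclose("clearance")
print(sess.disclose("resource"))  # SUCCESS
```

The deadline check is what distinguishes this from a plain (untimed) transition system: an attacker who stalls the alternating disclosures simply drives the session into ABORTED rather than tying up resources indefinitely.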

  20. MAIN ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

Reflection on Some Issues Regarding the System of Socialism with Chinese Characteristics Zhang Xingmao The establishment of the system of socialism with Chinese characteristics, as the symbol of China's entry into the socialist society with Chinese characteristics, is a significant development of the Marxist theory of social formation. The Chinese model is framed and defined by the socialist system with Chinese characteristics; therefore, the study of different levels and aspects of the Chinese model should be related to the relevant Chinese system to guarantee a scientific interpretation. Under the fundamental system of socialism, the historical and logical starting point of the formation of socialism with Chinese characteristics lies in eliminating private ownership first and then allowing the existence and rapid development of the non-public sectors of the economy. With the gradual establishment of, and on the basis of, the basic economic system in the preliminary stage of socialism, and with adaptive adjustments in the economic, political, cultural, and social systems, the socialist system with Chinese characteristics is gradually formed.

  1. Automated model-based bias field correction of MR images of the brain.

    Science.gov (United States)

    Van Leemput, K; Maes, F; Vandermeulen, D; Suetens, P

    1999-10-01

    We propose a model-based method for fully automated bias field correction of MR brain images. The MR signal is modeled as a realization of a random process with a parametric probability distribution that is corrupted by a smooth polynomial inhomogeneity or bias field. The method we propose applies an iterative expectation-maximization (EM) strategy that interleaves pixel classification with estimation of class distribution and bias field parameters, improving the likelihood of the model parameters at each iteration. The algorithm, which can handle multichannel data and slice-by-slice constant intensity offsets, is initialized with information from a digital brain atlas about the a priori expected location of tissue classes. This allows full automation of the method without need for user interaction, yielding more objective and reproducible results. We have validated the bias correction algorithm on simulated data and we illustrate its performance on various MR images with important field inhomogeneities. We also relate the proposed algorithm to other bias correction algorithms. PMID:10628948
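The interleaved classify/estimate loop described above can be sketched in a deliberately simplified form (synthetic 1-D data standing in for MR intensities, hard nearest-mean classification instead of the paper's EM soft assignments, and a quadratic polynomial bias; all assumptions of this sketch, not the published algorithm):

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic 1-D "scanline": two tissue classes corrupted by a smooth bias field.
n = 400
pos = np.linspace(-1, 1, n)
labels = rng.integers(0, 2, n)
means_true = np.array([100.0, 160.0])
signal = means_true[labels] + rng.normal(0, 5, n)
bias_true = 10 * pos + 8 * pos**2          # smooth polynomial inhomogeneity
observed = signal + bias_true

# EM-style interleaving (simplified): classify pixels given the current bias,
# re-estimate class means, then refit a quadratic bias to the residuals.
bias = np.zeros(n)
means = np.array([observed.min(), observed.max()])
for _ in range(20):
    corrected = observed - bias
    assign = np.argmin(np.abs(corrected[:, None] - means[None, :]), axis=1)
    means = np.array([corrected[assign == k].mean() for k in (0, 1)])
    residual = observed - means[assign]
    bias = np.polyval(np.polyfit(pos, residual, 2), pos)

print(float(means[1] - means[0]))  # close to the true class separation of 60
```

Note the classic identifiability caveat visible even in this toy: a constant offset can migrate between the bias field and the class means, so it is the separation between classes (and the shape of the bias) that the loop recovers reliably.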

  2. Automated Behavioral Phenotyping Reveals Presymptomatic Alterations in a SCA3 Genetrap Mouse Model

    Institute of Scientific and Technical Information of China (English)

Jeannette Hübener; Nicolas Casadei; Peter Teismann; Mathias W. Seeliger; Maria Björkqvist; Stephan von Hörsten; Olaf Riess; Huu Phuc Nguyen

    2012-01-01

Characterization of disease models of neurodegenerative disorders requires systematic and comprehensive phenotyping in a highly standardized manner. Therefore, automated high-resolution behavior test systems such as the homecage-based LabMaster system are of particular interest. We demonstrate the power of the automated LabMaster system by discovering previously unrecognized features of a recently characterized atxn3 mutant mouse model. This model displayed neurological symptoms including gait ataxia, tremor, weight loss and premature death at the age of 12 months, usually detectable just 2 weeks before the mice died. Moreover, using the LabMaster system we were able to detect hypoactivity in presymptomatic mutant mice in both the dark and the light phase. Additionally, we analyzed inflammation, immunological and hematological parameters, which indicated a reduced immune defense in phenotypic mice. Here we demonstrate that a detailed characterization even of organ systems that are usually not affected in SCA3 is important for further studies of pathogenesis and required for preclinical therapeutic studies.

  3. A conceptual model of the automated credibility assessment of the volunteered geographic information

    International Nuclear Information System (INIS)

The use of Volunteered Geographic Information (VGI) in collecting, sharing and disseminating geospatially referenced information on the Web is increasingly common. The potential of this localized and collective information has been seen to complement the maintenance process of authoritative mapping data sources and to aid the development of Digital Earth. The main barrier to the use of these data in supporting this bottom-up approach is the credibility (trust), completeness, accuracy, and quality of both the data input and the outputs generated. The only feasible approach to assess these data is to rely on an automated process. This paper describes a conceptual model of indicators (parameters) and practical approaches for the automated assessment of the credibility of information contributed through VGI, including map mashups, Geo Web and crowd-sourced applications. Two main components are proposed to be assessed in the conceptual model: metadata and data. The metadata component comprises indicators of the hosting (websites) and the sources of data/information. The data component comprises indicators to assess absolute and relative data positioning, attribute, thematic, temporal and geometric correctness and consistency. This paper suggests approaches to assess the components. To assess the metadata component, automated text categorization using supervised machine learning is proposed. To assess correctness and consistency in the data component, we suggest a matching validation approach using the current emerging technologies from Linked Data infrastructures and using third-party review validation. This study contributes to the research domain that focuses on the credibility, trust and quality issues of data contributed by web citizen providers.

  4. Unifying Abstractions

    DEFF Research Database (Denmark)

    Torgersen, Mads

This thesis presents the RUNE language, a semantic construction of related and tightly coupled programming constructs presented in the shape of a programming language. The major contribution is the successful design of a highly unified and general programming model, capable of expressing some of the most complex type relations put forth in type systems research, without compromising such fundamental qualities as conceptuality, modularity and static typing. While many new constructs and unifications are put forth, to substantiate their conceptual validity, type rules are given to support their typeability and examples are described to demonstrate their use. Novel constructs include a parallel approach to object generation, and a blend of structural and nominal subtyping, while a very general class construct integrates the notions of procedures, parameterisation and genericity, and provides...

  5. An Interactive Tool For Semi-automated Statistical Prediction Using Earth Observations and Models

    Science.gov (United States)

    Zaitchik, B. F.; Berhane, F.; Tadesse, T.

    2015-12-01

    We developed a semi-automated statistical prediction tool applicable to concurrent analysis or seasonal prediction of any time series variable in any geographic location. The tool was developed using Shiny, JavaScript, HTML and CSS. A user can extract a predictand by drawing a polygon over a region of interest on the provided user interface (global map). The user can select the Climatic Research Unit (CRU) precipitation or Climate Hazards Group InfraRed Precipitation with Station data (CHIRPS) as predictand. They can also upload their own predictand time series. Predictors can be extracted from sea surface temperature, sea level pressure, winds at different pressure levels, air temperature at various pressure levels, and geopotential height at different pressure levels. By default, reanalysis fields are applied as predictors, but the user can also upload their own predictors, including a wide range of compatible satellite-derived datasets. The package generates correlations of the variables selected with the predictand. The user also has the option to generate composites of the variables based on the predictand. Next, the user can extract predictors by drawing polygons over the regions that show strong correlations (composites). Then, the user can select some or all of the statistical prediction models provided. Provided models include Linear Regression models (GLM, SGLM), Tree-based models (bagging, random forest, boosting), Artificial Neural Network, and other non-linear models such as Generalized Additive Model (GAM) and Multivariate Adaptive Regression Splines (MARS). Finally, the user can download the analysis steps they used, such as the region they selected, the time period they specified, the predictand and predictors they chose and preprocessing options they used, and the model results in PDF or HTML format. Key words: Semi-automated prediction, Shiny, R, GLM, ANN, RF, GAM, MARS

  6. Abstraction and Learning for Infinite-State Compositional Verification

    Directory of Open Access Journals (Sweden)

    Dimitra Giannakopoulou

    2013-09-01

    Full Text Available Despite many advances that enable the application of model checking techniques to the verification of large systems, the state-explosion problem remains the main challenge for scalability. Compositional verification addresses this challenge by decomposing the verification of a large system into the verification of its components. Recent techniques use learning-based approaches to automate compositional verification based on the assume-guarantee style reasoning. However, these techniques are only applicable to finite-state systems. In this work, we propose a new framework that interleaves abstraction and learning to perform automated compositional verification of infinite-state systems. We also discuss the role of learning and abstraction in the related context of interface generation for infinite-state components.

  7. An advanced distributed automated extraction of drainage network model on high-resolution DEM

    Directory of Open Access Journals (Sweden)

    Y. Mao

    2014-07-01

An advanced distributed automated extraction of drainage network model (Adam) was proposed in this study. The Adam model has two features: (1) searching upward from the outlet of the basin instead of sink filling, and (2) dividing sub-basins on a low-resolution DEM, and then extracting the drainage network on sub-basins of the high-resolution DEM. The case study used elevation data of the Shuttle Radar Topography Mission (SRTM) at 3 arc-second resolution in the Zhujiang River basin, China. The results show the Adam model can dramatically reduce the computation time. The extracted drainage network was continuous and more accurate than HydroSHEDS (Hydrological data and maps based on SHuttle Elevation Derivatives at multiple Scales).
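The first feature can be illustrated on a toy DEM: compute D8 steepest-descent flow directions, then search upward from the outlet rather than filling sinks (a simplified sketch of the idea only; the Adam model's sub-basin partitioning on high-resolution DEMs is not reproduced, and the grid and outlet location are invented):

```python
import numpy as np
from collections import deque

# Toy DEM: a valley sloping toward the outlet at the bottom-left corner.
dem = np.array([[5., 4., 5.],
                [4., 3., 4.],
                [1., 2., 3.]])

# D8 flow direction: each cell drains to its steepest-descent neighbor.
rows, cols = dem.shape
flow_to = {}
for r in range(rows):
    for c in range(cols):
        best, target = 0.0, None
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                rr, cc = r + dr, c + dc
                if (dr, dc) != (0, 0) and 0 <= rr < rows and 0 <= cc < cols:
                    drop = (dem[r, c] - dem[rr, cc]) / np.hypot(dr, dc)
                    if drop > best:
                        best, target = drop, (rr, cc)
        flow_to[(r, c)] = target  # None for the outlet (no lower neighbor)

# "Searching upward from the outlet": walk the reversed flow graph from the
# outlet to collect every contributing cell, instead of filling sinks first.
upstream = {}
for cell, tgt in flow_to.items():
    upstream.setdefault(tgt, []).append(cell)
outlet = (2, 0)
basin, queue = {outlet}, deque([outlet])
while queue:
    for cell in upstream.get(queue.popleft(), []):
        if cell not in basin:
            basin.add(cell)
            queue.append(cell)
print(len(basin))  # all 9 cells of this toy DEM drain to the outlet
```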

  8. Policy-Based Automation of Dynamic and Multipoint Virtual Private Network Simulation on OPNET Modeler

    Directory of Open Access Journals (Sweden)

    Ayoub BAHNASSE

    2014-12-01

Full Text Available The simulation of large-scale networks is a challenging task, especially if the network to simulate is a Dynamic Multipoint Virtual Private Network, which requires expert knowledge to properly configure its component technologies. Studying these network architectures in a real environment is almost impossible because it requires a very large amount of equipment; the task is feasible, however, in a simulation environment such as OPNET Modeler, provided that one masters both the tool and the different Dynamic Multipoint Virtual Private Network architectures. Several research studies have been conducted to automate the generation and simulation of complex networks under various simulators, but to our knowledge no work has dealt with the Dynamic Multipoint Virtual Private Network. In this paper we present a simulation model of the Dynamic Multipoint Virtual Private Network in OPNET Modeler, and a web-based tool for project management on the same network.

  9. Improved automated diagnosis of misfire in internal combustion engines based on simulation models

    Science.gov (United States)

    Chen, Jian; Bond Randall, Robert

    2015-12-01

In this paper, a new advance in the application of Artificial Neural Networks (ANNs) to the automated diagnosis of misfires in Internal Combustion engines (IC engines) is detailed. The automated diagnostic system comprises three stages: fault detection, fault localization and fault severity identification. In particular, in the severity identification stage, separate Multi-Layer Perceptron networks (MLPs) with saturating linear transfer functions were designed for individual speed conditions, so they could achieve finer classification. In order to obtain sufficient data for network training, numerical simulation was used to simulate different ranges of misfires in the engine. The simulation models need to be updated and evaluated using experimental data, so a series of experiments were first carried out on the engine test rig to capture the vibration signals both for normal condition and with a range of misfires. Two methods were used for the misfire diagnosis: one based on the torsional vibration signals of the crankshaft and the other on the angular acceleration signals (rotational motion) of the engine block. Following the signal processing of the experimental and simulation signals, the best features were selected as the inputs to the ANN networks. The ANN systems were trained using only the simulated data and tested using real experimental cases, indicating that the simulation model can be used for a wider range of faults for which it can still be considered valid. The final results have shown that the diagnostic system based on simulation can efficiently diagnose misfire, including location and severity.

  10. TOBAGO — a semi-automated approach for the generation of 3-D building models

    Science.gov (United States)

    Gruen, Armin

3-D city models are in increasing demand for a great number of applications. Photogrammetry is a relevant technology that can provide an abundance of geometric, topologic and semantic information concerning these models. The pressure to generate a large amount of data with a high degree of accuracy and completeness poses a great challenge to photogrammetry. The development of automated and semi-automated methods for the generation of those data sets is therefore a key issue in photogrammetric research. We present in this article a strategy and methodology for the efficient generation of even fairly complex building models. Within this concept we request the operator to measure the house roofs from a stereomodel in the form of an unstructured point cloud. In our experience this can be done very quickly: even a non-experienced operator can measure several hundred roofs or roof units per day. In a second step we fit generic building models fully automatically to these point clouds. The structure information is inherently included in these building models. In this way geometric, topologic and even semantic data can be handed over to a CAD system, in our case AutoCad, for further visualization and manipulation. The structuring is achieved in three steps. In the first step a classifier is initiated which recognizes the class of houses a particular roof point cloud belongs to. This recognition step is primarily based on the analysis of the number of ridge points. In the second and third steps the concrete topological relations between roof points are investigated and generic building models are fitted to the point clouds. Based on the technique of constraint-based reasoning, two geometrical parsers solve this problem. We have tested the methodology under a variety of different conditions in several pilot projects. The results indicate the good performance of our approach. In addition we demonstrate how the results can be used for visualization (texture

  11. Automated 3D Damaged Cavity Model Builder for Lower Surface Acreage Tile on Orbiter

    Science.gov (United States)

    Belknap, Shannon; Zhang, Michael

    2013-01-01

The 3D Automated Thermal Tool for Damaged Acreage Tile Math Model builder was developed to perform quick and accurate 3D thermal analyses on damaged lower surface acreage tiles and the structures beneath the damaged locations on a Space Shuttle Orbiter. The 3D model builder created both TRASYS geometric math models (GMMs) and SINDA thermal math models (TMMs) to simulate an idealized damaged cavity in the damaged tile(s). The GMMs are processed in TRASYS to generate radiation conductors between the surfaces in the cavity. The radiation conductors are inserted into the TMMs, which are processed in SINDA to generate temperature histories for all of the nodes on each layer of the TMM. The invention allows a thermal analyst to create quickly and accurately a 3D model of a damaged lower surface tile on the orbiter. The 3D model builder can generate a GMM and the corresponding TMM in one or two minutes, with the damaged cavity included in the tile material. A separate program creates a configuration file, which would take a couple of minutes to edit. This configuration file is read by the model builder program to determine the location of the damage, the correct tile type, tile thickness, structure thickness, and SIP thickness of the damage, so that the model builder program can build an accurate model at the specified location. Once the models are built, they are processed by TRASYS and SINDA.

  12. A cellular automaton model for the change of public attitude regarding nuclear energy

    International Nuclear Information System (INIS)

A cellular automaton model was constructed to investigate how public opinion on nuclear energy in Japan depends upon the information environment and personal communication between people. From simulation with this model, the following became clear: (i) society is a highly non-linear system with a self-organizing potential; (ii) in a society composed of one type of constituent member with homogeneous characteristics, the trend of public opinion is substantially changed only when the effort to improve public acceptance over a long period of time, by means such as education, persuasion and advertisement, exceeds a certain threshold; and (iii) in the case when the amount of information on nuclear risk released by the news media is continuously reduced from now on, the acceptability of nuclear energy is significantly improved provided the extent of the reduction exceeds a certain threshold. (author)
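A toy cellular automaton in this spirit (not the paper's model: the update rule, neighborhood, initial density, and thresholds are all invented for illustration) reproduces the kind of threshold behavior described, where sustained external pressure below a certain level leaves the minority attitude marginal, while pressure above it flips the whole population:

```python
import numpy as np

rng = np.random.default_rng(42)

def run(media_pressure, steps=50, n=50):
    """Each person adopts the positive attitude if the average attitude of
    themselves and their four neighbors, plus an external 'information
    environment' field, exceeds 0.5. Toroidal (wraparound) boundaries."""
    grid = (rng.random((n, n)) < 0.2).astype(float)  # 20% initially positive
    for _ in range(steps):
        local = (grid
                 + np.roll(grid, 1, 0) + np.roll(grid, -1, 0)
                 + np.roll(grid, 1, 1) + np.roll(grid, -1, 1)) / 5.0
        grid = (local + media_pressure > 0.5).astype(float)
    return grid.mean()

# Weak pressure: the scattered minority collapses to a small residue.
# Strong pressure: positives persist and spread until opinion saturates.
print(run(0.0), run(0.35))
```

The non-linearity is the point: the final opinion share is not proportional to the applied pressure but jumps once a threshold is crossed, echoing conclusions (ii) and (iii) above.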

  13. CHANNEL MORPHOLOGY TOOL (CMT): A GIS-BASED AUTOMATED EXTRACTION MODEL FOR CHANNEL GEOMETRY

    Energy Technology Data Exchange (ETDEWEB)

    JUDI, DAVID [Los Alamos National Laboratory; KALYANAPU, ALFRED [Los Alamos National Laboratory; MCPHERSON, TIMOTHY [Los Alamos National Laboratory; BERSCHEID, ALAN [Los Alamos National Laboratory

    2007-01-17

    This paper describes an automated Channel Morphology Tool (CMT) developed in ArcGIS 9.1 environment. The CMT creates cross-sections along a stream centerline and uses a digital elevation model (DEM) to create station points with elevations along each of the cross-sections. The generated cross-sections may then be exported into a hydraulic model. Along with the rapid cross-section generation the CMT also eliminates any cross-section overlaps that might occur due to the sinuosity of the channels using the Cross-section Overlap Correction Algorithm (COCoA). The CMT was tested by extracting cross-sections from a 5-m DEM for a 50-km channel length in Houston, Texas. The extracted cross-sections were compared directly with surveyed cross-sections in terms of the cross-section area. Results indicated that the CMT-generated cross-sections satisfactorily matched the surveyed data.

  14. A Collaborative System Software Solution for Modeling Business Flows Based on Automated Semantic Web Service Composition

    Directory of Open Access Journals (Sweden)

    Ion SMEUREANU

    2009-01-01

    Full Text Available Nowadays, business interoperability is one of the key factors for assuring competitive advantage for the participant business partners. In order to implement business cooperation, scalable, distributed and portable collaborative systems have to be implemented. This article presents some of the mostly used technologies in this field. Furthermore, it presents a software application architecture based on Business Process Modeling Notation standard and automated semantic web service coupling for modeling business flow in a collaborative manner. The main business processes will be represented in a single, hierarchic flow diagram. Each element of the diagram will represent calls to semantic web services. The business logic (the business rules and constraints will be structured with the help of OWL (Ontology Web Language. Moreover, OWL will also be used to create the semantic web service specifications.

  15. Implementing the WebSocket Protocol Based on Formal Modelling and Automated Code Generation

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2014-01-01

...pragmatic annotations for automated code generation of protocol software. The contribution of this paper is an application of the approach as implemented in the PetriCode tool to obtain protocol software implementing the IETF WebSocket protocol. This demonstrates the scalability of our approach to real protocols. Furthermore, we perform formal verification of the CPN model prior to code generation, and test the implementation for interoperability against the Autobahn WebSocket test-suite, resulting in 97% and 99% success rates for the client and server implementations, respectively. The tests show that the causes of test failures were mostly local and trivial errors in newly written code-generation templates, and not related to the overall logical operation of the protocol as specified by the CPN model.

  16. Modeling and performance optimization of automated antenna alignment for telecommunication transceivers

    Directory of Open Access Journals (Sweden)

    Md. Ahsanul Hoque

    2015-09-01

Full Text Available Antenna alignment is very cumbersome in the telecommunication industry, and it especially affects microwave (MW) links through environmental anomalies or physical degradation over time. While in recent years the more conventional approach of redundancy has been employed, novel automation techniques are needed to ensure line-of-sight (LOS) link stability. The basic principle is to capture the desired Received Signal Level (RSL) by means of an outdoor unit installed at the tower top and to analyze the RSL in an indoor unit through a GUI interface. We propose a new smart antenna system in which automation is initiated when the transceivers receive low signal strength and report the finding to a processing comparator unit. A series architecture is used that includes a loop antenna and RCX Robonics with a LabVIEW interface coupled to a tunable external controller. Denavit–Hartenberg parameters are used in the analytical modeling, and numerous control techniques have been investigated to overcome imminent overshoot problems for the transport link. With this novel approach, a solution is put forward for the communication industry whereby any antenna can achieve optimal directivity for the desired RSL with low overshoot and fast steady-state response.
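The kinematic side of such analytical modeling rests on the standard Denavit–Hartenberg link transform; a minimal sketch for a hypothetical two-axis pan-tilt antenna positioner (the joint layout and link lengths are assumptions for illustration, not the authors' hardware):

```python
import numpy as np

def dh(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform for one link."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.,       sa,       ca,      d],
                     [0.,       0.,       0.,     1.]])

def antenna_pose(pan, tilt, d1=0.5, a2=0.2):
    """Hypothetical 2-DOF positioner: pan about the base z axis (alpha=90deg
    twists the next joint axis), then tilt carrying a 0.2 m boom."""
    T = dh(pan, d1, 0.0, np.pi / 2) @ dh(tilt, 0.0, a2, 0.0)
    return T[:3, 3]  # antenna mount position in the base frame

print(np.round(antenna_pose(0.0, 0.0), 3))  # boom forward at mast height
```

Chaining such transforms gives the pointing direction as a function of the joint angles, which is the plant model the overshoot-limiting controllers act on.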

  17. EBS Radionuclide Transport Abstraction

    Energy Technology Data Exchange (ETDEWEB)

    R. Schreiner

    2001-06-27

The purpose of this work is to develop the Engineered Barrier System (EBS) radionuclide transport abstraction model, as directed by a written development plan (CRWMS M&O 1999a). This abstraction is the conceptual model that will be used to determine the rate of release of radionuclides from the EBS to the unsaturated zone (UZ) in the total system performance assessment-license application (TSPA-LA). In particular, this model will be used to quantify the time-dependent radionuclide releases from a failed waste package (WP) and their subsequent transport through the EBS to the emplacement drift wall/UZ interface. The development of this conceptual model will allow Performance Assessment Operations (PAO) and its Engineered Barrier Performance Department to provide a more detailed and complete EBS flow and transport abstraction. The results from this conceptual model will allow PAO to address portions of the key technical issues (KTIs) presented in three NRC Issue Resolution Status Reports (IRSRs): (1) the Evolution of the Near-Field Environment (ENFE), Revision 2 (NRC 1999a), (2) the Container Life and Source Term (CLST), Revision 2 (NRC 1999b), and (3) the Thermal Effects on Flow (TEF), Revision 1 (NRC 1998). The conceptual model for flow and transport in the EBS will be referred to as the ''EBS RT Abstraction'' in this analysis/modeling report (AMR). The scope of this abstraction and report is limited to flow and transport processes. More specifically, this AMR does not discuss elements of the TSPA-SR and TSPA-LA that relate to the EBS but are discussed in other AMRs. These elements include corrosion processes, radionuclide solubility limits, waste form dissolution rates and concentrations of colloidal particles that are generally represented as boundary conditions or input parameters for the EBS RT Abstraction. In effect, this AMR provides the algorithms for transporting radionuclides using the flow geometry and radionuclide concentrations

  18. Testing abstract behavioral specifications

    NARCIS (Netherlands)

    Wong, P.Y.H.; Bubel, R.; Boer, F.S. de; Gouw, C.P.T. de; Gómez-Zamalloa, M.; Haehnle, R; Meinke, K.; Sindhu, M.A.

    2015-01-01

    We present a range of testing techniques for the Abstract Behavioral Specification (ABS) language and apply them to an industrial case study. ABS is a formal modeling language for highly variable, concurrent, component-based systems. The nature of these systems makes them susceptible to the introduc

  19. Abstracts of SIG Sessions.

    Science.gov (United States)

    Proceedings of the ASIS Annual Meeting, 1995

    1995-01-01

Presents abstracts of 15 special interest group (SIG) sessions. Topics include navigation and information utilization on the Internet, natural language processing, automatic indexing, image indexing, classification, users' models of database searching, online public access catalogs, education for the information professions, information services,…

  20. Examining Uncertainty in Demand Response Baseline Models and Variability in Automated Response to Dynamic Pricing

    Energy Technology Data Exchange (ETDEWEB)

    Mathieu, Johanna L.; Callaway, Duncan S.; Kiliccote, Sila

    2011-08-15

    Controlling electric loads to deliver power system services presents a number of interesting challenges. For example, changes in electricity consumption of Commercial and Industrial (C&I) facilities are usually estimated using counterfactual baseline models, and model uncertainty makes it difficult to precisely quantify control responsiveness. Moreover, C&I facilities exhibit variability in their response. This paper seeks to understand baseline model error and demand-side variability in responses to open-loop control signals (i.e. dynamic prices). Using a regression-based baseline model, we define several Demand Response (DR) parameters, which characterize changes in electricity use on DR days, and then present a method for computing the error associated with DR parameter estimates. In addition to analyzing the magnitude of DR parameter error, we develop a metric to determine how much observed DR parameter variability is attributable to real event-to-event variability versus simply baseline model error. Using data from 38 C&I facilities that participated in an automated DR program in California, we find that DR parameter errors are large. For most facilities, observed DR parameter variability is likely explained by baseline model error, not real DR parameter variability; however, a number of facilities exhibit real DR parameter variability. In some cases, the aggregate population of C&I facilities exhibits real DR parameter variability, resulting in implications for the system operator with respect to both resource planning and system stability.
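A regression-based baseline of the kind described can be sketched as follows (entirely synthetic data; the temperature-plus-hour-of-day specification, the 10 kW shed, and the event window are illustrative assumptions, not the study's dataset or exact model):

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic hourly load for one C&I facility over 30 non-event baseline days.
hours = np.tile(np.arange(24), 30)
temp = 20 + 8 * np.sin((hours - 8) / 24 * 2 * np.pi) + rng.normal(0, 1, hours.size)
load = 50 + 2.0 * temp + rng.normal(0, 2, hours.size)

# Counterfactual baseline model: load ~ temperature + hour-of-day dummies.
X = np.column_stack([np.ones(hours.size), temp] +
                    [(hours == h).astype(float) for h in range(1, 24)])
beta, *_ = np.linalg.lstsq(X, load, rcond=None)

# DR event day: the facility sheds 10 kW during hours 14-17.
ev_hours = np.arange(24)
ev_temp = 20 + 8 * np.sin((ev_hours - 8) / 24 * 2 * np.pi)
ev_load = 50 + 2.0 * ev_temp - 10.0 * ((ev_hours >= 14) & (ev_hours < 18))
Xe = np.column_stack([np.ones(24), ev_temp] +
                     [(ev_hours == h).astype(float) for h in range(1, 24)])
baseline = Xe @ beta

# Estimated DR parameter = predicted baseline minus observed event-day load.
shed = baseline - ev_load
print(np.round(shed[14:18].mean(), 1))  # recovers roughly 10 kW, within model error
```

The gap between the estimated and true shed in this sketch is exactly the baseline model error the paper quantifies: with noisy baselines, apparent event-to-event variability can be an artifact of the model rather than real facility behavior.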

  1. Modeling the alternative oxidase from the human pathogen Blastocystis using automated hybrid structural template assembly

    Directory of Open Access Journals (Sweden)

    Standley DM

    2012-01-01

Full Text Available Daron M Standley,1 Mark van der Giezen2. 1Laboratory of Systems Immunology, World Premier International Immunology Frontier Research Center, Osaka University, Osaka, Japan; 2Centre for Eukaryotic Evolutionary Microbiology, Biosciences, College of Life and Environmental Sciences, University of Exeter, Exeter, UK. Abstract: Alternative oxidases (AOX) of human parasites represent attractive drug targets due to their absence in humans. However, the lack of a structure has prevented structure-based drug design. Moreover, a large helical insertion proves difficult for automated structural modeling efforts. We have used a novel hybrid structural modeling approach to generate a model that is globally consistent with a previous model but based on a phylogenetically closer template and systematic sampling of known fragments in the helical insertion. Our model, in agreement with site-directed mutagenesis studies, clearly assigns E200 as the iron-ligating residue as opposed to the previously suggested E201. Crystallization of AOX from another species has recently been reported, suggesting that our blind prediction can be independently validated in the near future. Keywords: homology modeling, protein structure, blind prediction, fragment assembly, active site, parasite, mitosome, hydrogenosome, evolution

  2. Automated Generation of Fault Management Artifacts from a Simple System Model

    Science.gov (United States)

    Kennedy, Andrew K.; Day, John C.

    2013-01-01

    Our understanding of off-nominal behavior - failure modes and fault propagation - in complex systems is often based purely on engineering intuition; specific cases are assessed in an ad hoc fashion as a (fallible) fault management engineer sees fit. This work is an attempt to provide a more rigorous approach to this understanding and assessment by automating the creation of a fault management artifact, the Failure Modes and Effects Analysis (FMEA) through querying a representation of the system in a SysML model. This work builds off the previous development of an off-nominal behavior model for the upcoming Soil Moisture Active-Passive (SMAP) mission at the Jet Propulsion Laboratory. We further developed the previous system model to more fully incorporate the ideas of State Analysis, and it was restructured in an organizational hierarchy that models the system as layers of control systems while also incorporating the concept of "design authority". We present software that was developed to traverse the elements and relationships in this model to automatically construct an FMEA spreadsheet. We further discuss extending this model to automatically generate other typical fault management artifacts, such as Fault Trees, to efficiently portray system behavior, and depend less on the intuition of fault management engineers to ensure complete examination of off-nominal behavior.
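Stripped of the SysML specifics, the core of such an artifact generator is a traversal of the modeled dependency graph to fill in the "effects" column of each FMEA row; a minimal sketch (the component names, failure modes, and `feeds` relation are invented for illustration, not drawn from the SMAP model):

```python
from collections import deque

# Hypothetical miniature system model: an edge points from a component to
# the components that consume its output, i.e. the fault propagation path.
feeds = {
    "battery":         ["power_bus"],
    "power_bus":       ["flight_computer", "radio"],
    "flight_computer": ["radio"],
    "radio":           [],
}
failure_modes = {
    "battery":         ["cell short", "depletion"],
    "power_bus":       ["overcurrent trip"],
    "flight_computer": ["watchdog reset"],
    "radio":           ["transmitter failure"],
}

def downstream_effects(component):
    """All components reachable along 'feeds' edges: the FMEA effects column."""
    seen, queue = set(), deque(feeds[component])
    while queue:
        c = queue.popleft()
        if c not in seen:
            seen.add(c)
            queue.extend(feeds[c])
    return sorted(seen)

# One FMEA row per (component, failure mode) pair, effects derived by query
# rather than by engineering intuition.
fmea = [(comp, mode, downstream_effects(comp))
        for comp, modes in failure_modes.items() for mode in modes]
for row in fmea:
    print(row)
```

Because the rows are derived mechanically from the model, a change to the system (a new `feeds` edge, say) regenerates a consistent FMEA, which is precisely the repeatability the ad hoc approach lacks.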

  3. EBS Radionuclide Transport Abstraction

    International Nuclear Information System (INIS)

    The purpose of this report is to develop and analyze the engineered barrier system (EBS) radionuclide transport abstraction model, consistent with Level I and Level II model validation, as identified in Technical Work Plan for: Near-Field Environment and Transport: Engineered Barrier System: Radionuclide Transport Abstraction Model Report Integration (BSC 2005 [DIRS 173617]). The EBS radionuclide transport abstraction (or EBS RT Abstraction) is the conceptual model used in the total system performance assessment (TSPA) to determine the rate of radionuclide releases from the EBS to the unsaturated zone (UZ). The EBS RT Abstraction conceptual model consists of two main components: a flow model and a transport model. Both models are developed mathematically from first principles in order to show explicitly what assumptions, simplifications, and approximations are incorporated into the models used in the TSPA. The flow model defines the pathways for water flow in the EBS and specifies how the flow rate is computed in each pathway. Input to this model includes the seepage flux into a drift. The seepage flux is potentially split by the drip shield, with some (or all) of the flux being diverted by the drip shield and some passing through breaches in the drip shield that might result from corrosion or seismic damage. The flux through drip shield breaches is potentially split by the waste package, with some (or all) of the flux being diverted by the waste package and some passing through waste package breaches that might result from corrosion or seismic damage. Neither the drip shield nor the waste package survives an igneous intrusion, so the flux splitting submodel is not used in the igneous scenario class. The flow model is validated in an independent model validation technical review. The drip shield and waste package flux splitting algorithms are developed and validated using experimental data. 
The transport model considers advective transport and diffusive transport
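
    The flux-splitting idea described in the abstract can be sketched as repeated partitioning of the seepage flux at each barrier. The linear breach-fraction rule and all numbers below are illustrative assumptions, not the report's calibrated flux-splitting algorithm.

    ```python
    # Minimal sketch: seepage is partitioned at the drip shield and again at
    # the waste package according to the fraction of area breached.
    # The linear rule and the numeric values are assumptions for illustration.

    def split_flux(influx, breach_fraction):
        """Return (diverted, through) for a barrier with the given fraction
        of its surface breached (0 = intact, 1 = fully open)."""
        through = influx * breach_fraction
        return influx - through, through

    seepage = 10.0                          # m^3/yr entering the drift (assumed)
    _, past_shield = split_flux(seepage, breach_fraction=0.2)
    diverted, into_package = split_flux(past_shield, breach_fraction=0.05)
    print(round(into_package, 3))           # flux reaching the waste form
    ```

    In the igneous scenario class neither barrier survives, so in this sketch both breach fractions would simply be set to 1.0 and the full seepage would pass through.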

  4. EBS Radionuclide Transport Abstraction

    Energy Technology Data Exchange (ETDEWEB)

    J. Prouty

    2006-07-14

    The purpose of this report is to develop and analyze the engineered barrier system (EBS) radionuclide transport abstraction model, consistent with Level I and Level II model validation, as identified in Technical Work Plan for: Near-Field Environment and Transport: Engineered Barrier System: Radionuclide Transport Abstraction Model Report Integration (BSC 2005 [DIRS 173617]). The EBS radionuclide transport abstraction (or EBS RT Abstraction) is the conceptual model used in the total system performance assessment (TSPA) to determine the rate of radionuclide releases from the EBS to the unsaturated zone (UZ). The EBS RT Abstraction conceptual model consists of two main components: a flow model and a transport model. Both models are developed mathematically from first principles in order to show explicitly what assumptions, simplifications, and approximations are incorporated into the models used in the TSPA. The flow model defines the pathways for water flow in the EBS and specifies how the flow rate is computed in each pathway. Input to this model includes the seepage flux into a drift. The seepage flux is potentially split by the drip shield, with some (or all) of the flux being diverted by the drip shield and some passing through breaches in the drip shield that might result from corrosion or seismic damage. The flux through drip shield breaches is potentially split by the waste package, with some (or all) of the flux being diverted by the waste package and some passing through waste package breaches that might result from corrosion or seismic damage. Neither the drip shield nor the waste package survives an igneous intrusion, so the flux splitting submodel is not used in the igneous scenario class. The flow model is validated in an independent model validation technical review. The drip shield and waste package flux splitting algorithms are developed and validated using experimental data. 
The transport model considers advective transport and diffusive transport

  5. Transcranial Magnetic Stimulation: An Automated Procedure to Obtain Coil-specific Models for Field Calculations

    DEFF Research Database (Denmark)

    Madsen, Kristoffer Hougaard; Ewald, Lars; Siebner, Hartwig R.;

    2015-01-01

    Background: Field calculations for transcranial magnetic stimulation (TMS) are increasingly implemented online in neuronavigation systems and in more realistic offline approaches based on finite-element methods. They are often based on simplified and/or non-validated models of the magnetic vector...... potential of the TMS coils. Objective: To develop an approach to reconstruct the magnetic vector potential based on automated measurements. Methods: We implemented a setup that simultaneously measures the three components of the magnetic field with high spatial resolution. This is complemented by a novel...... approach to determine the magnetic vector potential via volume integration of the measured field. Results: The integration approach reproduces the vector potential with very good accuracy. The vector potential distribution of a standard figure-of-eight shaped coil determined with our setup corresponds well...

  6. Modeling and matching of landmarks for automation of Mars Rover localization

    Science.gov (United States)

    Wang, Jue

    The Mars Exploration Rover (MER) mission, begun in January 2004, has been extremely successful. However, decision-making for many operation tasks of the current MER mission and the 1997 Mars Pathfinder mission is performed on Earth through a predominantly manual, time-consuming process. Unmanned planetary rover navigation is ideally expected to reduce rover idle time, diminish the need for entering safe-mode, and dynamically handle opportunistic science events without required communication to Earth. Successful automation of rover navigation and localization during extraterrestrial exploration requires that accurate position and attitude information can be received by a rover and that the rover has the support of simultaneous localization and mapping. An integrated approach with Bundle Adjustment (BA) and Visual Odometry (VO) can efficiently refine the rover position. However, during the MER mission, BA is done manually because of the difficulty of automating cross-site tie point selection. This dissertation proposes an automatic approach to select cross-site tie points from multiple rover sites based on the methods of landmark extraction, landmark modeling, and landmark matching. The first step in this approach is that important landmarks such as craters and rocks are defined. Methods of automatic feature extraction and landmark modeling are then introduced. Complex models with orientation angles and simple models without those angles are compared. The results have shown that simple models can provide reasonably good results. Next, the sensitivity of different modeling parameters is analyzed. Based on this analysis, cross-site rocks are matched through two complementary stages: rock distribution pattern matching and rock model matching. In addition, a preliminary experiment on orbital and ground landmark matching is also briefly introduced. Finally, the reliability of the cross-site tie point selection is validated by fault detection, which
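
    The distribution-pattern stage of such matching can be illustrated with a tiny sketch: describe each rock by its sorted distances to the other rocks at its site (a translation- and rotation-invariant signature) and pair rocks across sites by most-similar signature. This is a toy stand-in for the dissertation's two-stage matching, with made-up coordinates.

    ```python
    # Toy cross-site rock matching by distribution pattern. Coordinates and
    # the signature definition are illustrative assumptions.
    import math

    def signature(rock, site):
        """Sorted distances from one rock to all others at its site."""
        return sorted(math.dist(rock, other) for other in site if other != rock)

    def match_sites(site_a, site_b):
        pairs = []
        for ra in site_a:
            sig_a = signature(ra, site_a)
            # Pair with the site-B rock whose signature differs least.
            best = min(site_b, key=lambda rb: sum(
                abs(x - y) for x, y in zip(sig_a, signature(rb, site_b))))
            pairs.append((ra, best))
        return pairs

    site1 = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0)]
    # The same rock pattern observed from a shifted rover position:
    site2 = [(10.0, 10.0), (14.0, 10.0), (10.0, 13.0)]
    matches = match_sites(site1, site2)
    for a, b in matches:
        print(a, "->", b)
    ```

    Because the signature ignores absolute position, the matching survives the unknown rigid transform between the two rover sites, which is exactly why pattern matching can precede precise model matching.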

  7. pynoddy 1.0: an experimental platform for automated 3-D kinematic and potential field modelling

    Directory of Open Access Journals (Sweden)

    J. F. Wellmann

    2015-11-01

    Full Text Available We present a novel methodology for performing experiments with subsurface structural models using a set of flexible and extensible Python modules. We utilise the ability of kinematic modelling techniques to describe major deformational, tectonic, and magmatic events at low computational cost to develop experiments testing the interactions between multiple kinematic events, effect of uncertainty regarding event timing, and kinematic properties. These tests are simple to implement and perform, as they are automated within the Python scripting language, allowing the encapsulation of entire kinematic experiments within high-level class definitions and fully reproducible results. In addition, we provide a link to geophysical potential-field simulations to evaluate the effect of parameter uncertainties on maps of gravity and magnetics. We provide relevant fundamental information on kinematic modelling and our implementation, and showcase the application of our novel methods to investigate the interaction of multiple tectonic events on a pre-defined stratigraphy, the effect of changing kinematic parameters on simulated geophysical potential-fields, and the distribution of uncertain areas in a full 3-D kinematic model, based on estimated uncertainties in kinematic input parameters. Additional possibilities for linking kinematic modelling to subsequent process simulations are discussed, as well as additional aspects of future research. Our modules are freely available on github, including documentation and tutorial examples, and we encourage the contribution to this project.
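
    The "encapsulation of entire kinematic experiments within high-level class definitions" can be sketched with a deliberately trivial example: an experiment object holds an ordered list of events, and re-running it with different event orderings probes event-timing uncertainty. The class, the 1-D "stratigraphy", and the event effects below are invented for illustration and are not pynoddy's actual API.

    ```python
    # Toy sketch of a kinematic experiment class: events applied in sequence
    # to a stratigraphic column, with event order as an experiment variable.
    # Event types and their effects are invented, not pynoddy's interface.
    import itertools

    class KinematicExperiment:
        def __init__(self, layers):
            self.layers = list(layers)      # bottom-to-top stratigraphy
            self.events = []

        def add_event(self, name, func):
            self.events.append((name, func))

        def run(self, order=None):
            layers = list(self.layers)
            seq = range(len(self.events)) if order is None else order
            for i in seq:
                layers = self.events[i][1](layers)
            return layers

    exp = KinematicExperiment(["sandstone", "shale", "limestone"])
    exp.add_event("deposit", lambda ls: ls + ["basalt"])
    exp.add_event("erode_top", lambda ls: ls[:-1])

    # Event timing matters: erosion before vs. after deposition.
    outcomes = {tuple(exp.run(order=o))
                for o in itertools.permutations(range(2))}
    print(outcomes)
    ```

    Even this two-event toy shows why timing uncertainty must be sampled: the two orderings yield different final columns, which is the kind of interaction the paper's experiments systematically explore.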

  8. Automated clustering of ensembles of alternative models in protein structure databases.

    Science.gov (United States)

    Domingues, Francisco S; Rahnenführer, Jörg; Lengauer, Thomas

    2004-06-01

    Experimentally determined protein structures have been classified in different public databases according to their structural and evolutionary relationships. Frequently, alternative structural models, determined using X-ray crystallography or NMR spectroscopy, are available for a protein. These models can present significant structural dissimilarity. Currently there is no classification available for these alternative structures. In order to classify them, we developed STRuster, an automated method for clustering ensembles of structural models according to their backbone structure. The method is based on the calculation of carbon alpha (Calpha) distance matrices. Two filters are applied in the calculation of the dissimilarity measure in order to identify both large and small (but significant) backbone conformational changes. The resulting dissimilarity value is used for hierarchical clustering and partitioning around medoids (PAM). Hierarchical clustering reflects the hierarchy of similarities between all pairs of models, while PAM groups the models into the 'optimal' number of clusters. The method has been applied to cluster the structures in each SCOP species level and can be easily applied to any other sets of conformers. The results are available at: http://bioinf.mpi-sb.mpg.de/projects/struster/. PMID:15319469
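
    The clustering idea can be sketched in a few lines: represent each model by its C-alpha distance matrix, define dissimilarity between two models as the mean absolute difference of their matrices, and group models whose dissimilarity falls below a threshold. STRuster's two filters and its PAM step are omitted here; the coordinates and the 0.2 Å threshold are illustrative assumptions.

    ```python
    # Minimal sketch of distance-matrix-based clustering of conformers.
    # Coordinates and the 0.2 A threshold are invented for illustration.
    import math

    def distance_matrix(coords):
        return [[math.dist(a, b) for b in coords] for a in coords]

    def dissimilarity(m1, m2):
        n = len(m1)
        return sum(abs(m1[i][j] - m2[i][j])
                   for i in range(n) for j in range(n)) / (n * n)

    def cluster(models, threshold=0.2):
        """Greedy single-linkage grouping by pairwise dissimilarity."""
        mats = [distance_matrix(m) for m in models]
        clusters = []
        for idx, mat in enumerate(mats):
            for c in clusters:
                if any(dissimilarity(mat, mats[j]) < threshold for j in c):
                    c.append(idx)
                    break
            else:
                clusters.append([idx])
        return clusters

    open_form   = [(0, 0, 0), (3.8, 0, 0), (7.6, 0, 0)]    # extended chain
    open_form2  = [(0, 0, 0), (3.9, 0, 0), (7.7, 0, 0)]    # near-identical
    closed_form = [(0, 0, 0), (3.8, 0, 0), (3.8, 3.8, 0)]  # large backbone change
    print(cluster([open_form, open_form2, closed_form]))
    ```

    Because distance matrices are invariant to superposition, no structural alignment is needed before comparing models, which is one reason this representation suits automated processing of whole databases.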

  9. Automated As-Built Model Generation of Subway Tunnels from Mobile LiDAR Data

    Directory of Open Access Journals (Sweden)

    Mostafa Arastounia

    2016-09-01

    Full Text Available This study proposes fully-automated methods for as-built model generation of subway tunnels employing mobile Light Detection and Ranging (LiDAR data. The employed dataset is acquired by a Velodyne HDL 32E and covers 155 m of a subway tunnel containing six million points. First, the tunnel’s main axis and cross sections are extracted. Next, a preliminary model is created by fitting an ellipse to each extracted cross section. The model is refined by employing residual analysis and Baarda’s data snooping method to eliminate outliers. The final model is then generated by applying least squares adjustment to outlier-free data. The obtained results indicate that the tunnel’s main axis and 1551 cross sections at 0.1 m intervals are successfully extracted. Cross sections have an average semi-major axis of 7.8508 m with a standard deviation of 0.2 mm and semi-minor axis of 7.7509 m with a standard deviation of 0.1 mm. The average normal distance of points from the constructed model (average absolute error is also 0.012 m. The developed algorithm is applicable to tunnels with any horizontal orientation and degree of curvature since it makes no assumptions, nor does it use any a priori knowledge regarding the tunnel’s curvature and horizontal orientation.
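
    The cross-section fitting step can be illustrated with a least-squares sketch: an axis-aligned ellipse x²/a² + y²/b² = 1 is linearized as A·x² + B·y² = 1, and A, B follow from 2×2 normal equations. A real tunnel cross section would additionally need centering, outlier removal (Baarda's data snooping), and an arbitrary orientation; the points below are synthetic.

    ```python
    # Least-squares fit of an axis-aligned, centered ellipse to 2-D points.
    # The synthetic semi-axes mimic the reported tunnel dimensions.
    import math

    def fit_axis_aligned_ellipse(points):
        # Normal equations for minimizing sum (A*x^2 + B*y^2 - 1)^2:
        #   A*Sum(x^4)    + B*Sum(x^2 y^2) = Sum(x^2)
        #   A*Sum(x^2y^2) + B*Sum(y^4)     = Sum(y^2)
        sxx = sum(x ** 4 for x, _ in points)
        syy = sum(y ** 4 for _, y in points)
        sxy = sum((x * y) ** 2 for x, y in points)
        sx = sum(x ** 2 for x, _ in points)
        sy = sum(y ** 2 for _, y in points)
        det = sxx * syy - sxy * sxy
        A = (sx * syy - sy * sxy) / det
        B = (sy * sxx - sx * sxy) / det
        return 1 / math.sqrt(A), 1 / math.sqrt(B)   # semi-axes a, b

    a_true, b_true = 7.8508, 7.7509
    pts = [(a_true * math.cos(t), b_true * math.sin(t))
           for t in (i * 2 * math.pi / 100 for i in range(100))]
    a, b = fit_axis_aligned_ellipse(pts)
    print(round(a, 4), round(b, 4))
    ```

    With noisy LiDAR points, the residuals of this fit are exactly what residual analysis and data snooping would then inspect to flag outliers before the final adjustment.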

  10. pynoddy 1.0: an experimental platform for automated 3-D kinematic and potential field modelling

    Science.gov (United States)

    Florian Wellmann, J.; Thiele, Sam T.; Lindsay, Mark D.; Jessell, Mark W.

    2016-03-01

    We present a novel methodology for performing experiments with subsurface structural models using a set of flexible and extensible Python modules. We utilize the ability of kinematic modelling techniques to describe major deformational, tectonic, and magmatic events at low computational cost to develop experiments testing the interactions between multiple kinematic events, effect of uncertainty regarding event timing, and kinematic properties. These tests are simple to implement and perform, as they are automated within the Python scripting language, allowing the encapsulation of entire kinematic experiments within high-level class definitions and fully reproducible results. In addition, we provide a link to geophysical potential-field simulations to evaluate the effect of parameter uncertainties on maps of gravity and magnetics. We provide relevant fundamental information on kinematic modelling and our implementation, and showcase the application of our novel methods to investigate the interaction of multiple tectonic events on a pre-defined stratigraphy, the effect of changing kinematic parameters on simulated geophysical potential fields, and the distribution of uncertain areas in a full 3-D kinematic model, based on estimated uncertainties in kinematic input parameters. Additional possibilities for linking kinematic modelling to subsequent process simulations are discussed, as well as additional aspects of future research. Our modules are freely available on github, including documentation and tutorial examples, and we encourage the contribution to this project.

  11. Automated As-Built Model Generation of Subway Tunnels from Mobile LiDAR Data

    Science.gov (United States)

    Arastounia, Mostafa

    2016-01-01

    This study proposes fully-automated methods for as-built model generation of subway tunnels employing mobile Light Detection and Ranging (LiDAR) data. The employed dataset is acquired by a Velodyne HDL 32E and covers 155 m of a subway tunnel containing six million points. First, the tunnel’s main axis and cross sections are extracted. Next, a preliminary model is created by fitting an ellipse to each extracted cross section. The model is refined by employing residual analysis and Baarda’s data snooping method to eliminate outliers. The final model is then generated by applying least squares adjustment to outlier-free data. The obtained results indicate that the tunnel’s main axis and 1551 cross sections at 0.1 m intervals are successfully extracted. Cross sections have an average semi-major axis of 7.8508 m with a standard deviation of 0.2 mm and semi-minor axis of 7.7509 m with a standard deviation of 0.1 mm. The average normal distance of points from the constructed model (average absolute error) is also 0.012 m. The developed algorithm is applicable to tunnels with any horizontal orientation and degree of curvature since it makes no assumptions, nor does it use any a priori knowledge regarding the tunnel’s curvature and horizontal orientation. PMID:27649172

  12. Diabetes in The Netherlands: Exploratory Modelling and Analysis of a Macro Diabetes System Dynamics Model (abstract only)

    NARCIS (Netherlands)

    Logtens, T.W.A.

    2014-01-01

    Type II Diabetes Mellitus is one of the fastest growing chronic diseases in the western world. This exploratory research provides an insight into the causes for the diabetes prevalence in The Netherlands by performing an Exploratory Modelling and Analysis (EMA) study on an adaptation of an existing

  13. Seismic Consequence Abstraction

    Energy Technology Data Exchange (ETDEWEB)

    M. Gross

    2004-10-25

    The primary purpose of this model report is to develop abstractions for the response of engineered barrier system (EBS) components to seismic hazards at a geologic repository at Yucca Mountain, Nevada, and to define the methodology for using these abstractions in a seismic scenario class for the Total System Performance Assessment - License Application (TSPA-LA). A secondary purpose of this model report is to provide information for criticality studies related to seismic hazards. The seismic hazards addressed herein are vibratory ground motion, fault displacement, and rockfall due to ground motion. The EBS components are the drip shield, the waste package, and the fuel cladding. The requirements for development of the abstractions and the associated algorithms for the seismic scenario class are defined in ''Technical Work Plan For: Regulatory Integration Modeling of Drift Degradation, Waste Package and Drip Shield Vibratory Motion and Seismic Consequences'' (BSC 2004 [DIRS 171520]). The development of these abstractions will provide a more complete representation of flow into and transport from the EBS under disruptive events. The results from this development will also address portions of integrated subissue ENG2, Mechanical Disruption of Engineered Barriers, including the acceptance criteria for this subissue defined in Section 2.2.1.3.2.3 of the ''Yucca Mountain Review Plan, Final Report'' (NRC 2003 [DIRS 163274]).

  14. Seismic Consequence Abstraction

    International Nuclear Information System (INIS)

    The primary purpose of this model report is to develop abstractions for the response of engineered barrier system (EBS) components to seismic hazards at a geologic repository at Yucca Mountain, Nevada, and to define the methodology for using these abstractions in a seismic scenario class for the Total System Performance Assessment - License Application (TSPA-LA). A secondary purpose of this model report is to provide information for criticality studies related to seismic hazards. The seismic hazards addressed herein are vibratory ground motion, fault displacement, and rockfall due to ground motion. The EBS components are the drip shield, the waste package, and the fuel cladding. The requirements for development of the abstractions and the associated algorithms for the seismic scenario class are defined in ''Technical Work Plan For: Regulatory Integration Modeling of Drift Degradation, Waste Package and Drip Shield Vibratory Motion and Seismic Consequences'' (BSC 2004 [DIRS 171520]). The development of these abstractions will provide a more complete representation of flow into and transport from the EBS under disruptive events. The results from this development will also address portions of integrated subissue ENG2, Mechanical Disruption of Engineered Barriers, including the acceptance criteria for this subissue defined in Section 2.2.1.3.2.3 of the ''Yucca Mountain Review Plan, Final Report'' (NRC 2003 [DIRS 163274])

  15. Fast Model Adaptation for Automated Section Classification in Electronic Medical Records.

    Science.gov (United States)

    Ni, Jian; Delaney, Brian; Florian, Radu

    2015-01-01

    Medical information extraction is the automatic extraction of structured information from electronic medical records, where such information can be used for improving healthcare processes and medical decision making. In this paper, we study one important medical information extraction task called section classification. The objective of section classification is to automatically identify sections in a medical document and classify them into one of the pre-defined section types. Training section classification models typically requires large amounts of human labeled training data to achieve high accuracy. Annotating institution-specific data, however, can be both expensive and time-consuming, which poses a big hurdle for adapting a section classification model to new medical institutions. In this paper, we apply two advanced machine learning techniques, active learning and distant supervision, to reduce annotation cost and achieve fast model adaptation for automated section classification in electronic medical records. Our experiment results show that active learning reduces the annotation cost and time by more than 50%, and distant supervision can achieve good model accuracy using weakly labeled training data only. PMID:26262005
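
    Pool-based active learning, the first of the two techniques, can be sketched with a deliberately tiny example: a trivial keyword-overlap classifier picks, at each round, the unlabeled section it is least certain about (smallest score margin) and asks the oracle for its label. The sections, labels, and classifier are invented; a real system would use a statistical model over section text.

    ```python
    # Toy pool-based active learning with uncertainty (margin) sampling.
    # All texts, labels, and the scoring rule are invented for illustration.

    LABELED = [("patient denies chest pain", "history"),
               ("prescribed aspirin 81 mg daily", "medications")]
    POOL = ["denies shortness of breath",
            "aspirin and lisinopril continued",
            "pain improved after aspirin"]          # genuinely ambiguous
    ORACLE = {"denies shortness of breath": "history",
              "aspirin and lisinopril continued": "medications",
              "pain improved after aspirin": "history"}

    def scores(text, labeled):
        """Word-overlap score per label, a stand-in for classifier scores."""
        out = {}
        for sect, label in labeled:
            overlap = len(set(text.split()) & set(sect.split()))
            out[label] = out.get(label, 0) + overlap
        return out

    def most_uncertain(pool, labeled):
        def margin(text):
            vals = sorted(scores(text, labeled).values(), reverse=True)
            return vals[0] - vals[1]
        return min(pool, key=margin)

    query = most_uncertain(POOL, LABELED)
    LABELED.append((query, ORACLE[query]))          # one oracle annotation
    print("queried:", query)
    ```

    The annotation budget is spent only on examples the current model cannot separate, which is the mechanism behind the reported >50% reduction in annotation cost.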

  16. The use of process simulation models in virtual commissioning of process automation software in drinking water treatment plants

    NARCIS (Netherlands)

    Worm, G.I.M.; Kelderman, J.P.; Lapikas, T.; Van der Helm, A.W.C.; Van Schagen, K.M.; Rietveld, L.C.

    2012-01-01

    This research deals with the contribution of process simulation models to the factory acceptance test (FAT) of process automation (PA) software of drinking water treatment plants. Two test teams tested the same piece of modified PA-software. One team used an advanced virtual commissioning (AVC) syst

  17. Semi-Automated Experimental Set-Up for CAD-oriented Low Frequency Noise Modeling of Bipolar Transistors

    OpenAIRE

    Borgarino, M.; Bogoni, A; Fantini, F.; Peroni, M.; Cetronio, A.

    2004-01-01

    The present work addresses the hardware and software development of a semi-automated experimental set-up devoted to the extraction of low frequency noise compact models of bipolar transistors for microwave circuit applications (e.g. oscillators). The obtained experimental setup is applied to GaInP/GaAs Heterojunction Bipolar Transistors.

  18. Semi-automated calibration method for modelling of mountain permafrost evolution in Switzerland

    Science.gov (United States)

    Marmy, A.; Rajczak, J.; Delaloye, R.; Hilbich, C.; Hoelzle, M.; Kotlarski, S.; Lambiel, C.; Noetzli, J.; Phillips, M.; Salzmann, N.; Staub, B.; Hauck, C.

    2015-09-01

    Permafrost is a widespread phenomenon in the European Alps. Many important topics such as the future evolution of permafrost related to climate change and the detection of permafrost related to potential natural hazard sites are of major concern to our society. Numerical permafrost models are the only tools which facilitate the projection of the future evolution of permafrost. Due to the complexity of the processes involved and the heterogeneity of Alpine terrain, models must be carefully calibrated and results should be compared with observations at the site (borehole) scale. However, a large number of local point data are necessary to obtain a broad overview of the thermal evolution of mountain permafrost over a larger area, such as the Swiss Alps, and the site-specific model calibration of each point would be time-consuming. To address this issue, this paper presents a semi-automated calibration method using the Generalized Likelihood Uncertainty Estimation (GLUE) as implemented in a 1-D soil model (CoupModel) and applies it to six permafrost sites in the Swiss Alps prior to long-term permafrost evolution simulations. We show that this automated calibration method is able to accurately reproduce the main thermal condition characteristics, with some limitations at sites with unique conditions such as 3-D air or water circulation, which have to be calibrated manually. The calibration obtained was used for RCM-based long-term simulations under the A1B climate scenario specifically downscaled at each borehole site. The projection shows general permafrost degradation with thawing at 10 m, even partially reaching 20 m depths until the end of the century, but with different timing among the sites. The degradation is more rapid at bedrock sites whereas ice-rich sites with a blocky surface cover showed a reduced sensitivity to climate change. The snow cover duration is expected to be reduced drastically (between -20 and -37 %) impacting the ground thermal regime. However
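
    The GLUE procedure itself is simple to sketch: sample parameter sets, score each against observations with a likelihood measure, keep only the "behavioural" sets above a threshold, and form likelihood-weighted predictions. The toy linear ground-temperature model, the synthetic data, and the thresholds below are invented; CoupModel's physics is far richer.

    ```python
    # Minimal GLUE sketch with a toy two-parameter ground-temperature model.
    # Model, observations, bounds, and thresholds are illustrative assumptions.
    import random

    random.seed(42)
    depths = [2.0, 5.0, 10.0]
    observed = [-1.0, -0.4, 0.2]       # synthetic borehole temperatures (degC)

    def model(surface_t, gradient):
        return [surface_t + gradient * z for z in depths]

    def likelihood(sim):
        sse = sum((s - o) ** 2 for s, o in zip(sim, observed))
        return 1.0 / (1.0 + sse)       # simple inverse-error score

    behavioural = []
    for _ in range(2000):
        params = (random.uniform(-3, 1), random.uniform(0.0, 0.3))
        score = likelihood(model(*params))
        if score > 0.8:                # behavioural threshold (assumed)
            behavioural.append((score, params))

    total = sum(s for s, _ in behavioural)
    pred = [sum(s * model(*p)[i] for s, p in behavioural) / total
            for i in range(len(depths))]
    print(len(behavioural), [round(t, 2) for t in pred])
    ```

    Keeping the whole behavioural ensemble, rather than a single best fit, is what lets GLUE carry parameter uncertainty forward into the long-term projections.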

  19. The modeling of transfer of steering between automated vehicle and human driver using hybrid control framework

    NARCIS (Netherlands)

    Kaustubh, M.; Willemsen, D.M.C.; Mazo, M.

    2016-01-01

    Proponents of autonomous driving pursue driverless technologies, whereas others foresee a gradual transition where there will be automated driving systems that share the control of the vehicle with the driver. With such advances it becomes pertinent that the developed automated systems need to be sa

  20. Automated Verification of Code Generated from Models: Comparing Specifications with Observations

    Science.gov (United States)

    Gerlich, R.; Sigg, D.; Gerlich, R.

    2008-08-01

    Interest in automatic code generation from models is increasing. A specification is expressed as a model, and verification and validation are performed in the application domain. Once the model is formally correct and complete, code can be generated automatically. The general belief is that this code should be correct as well. However, this might not be true: many parameters impact the generation of code and its correctness; it depends on conditions changing from application to application, and the properties of the code depend on the environment where it is executed. From the principles of ISVV (Independent Software Verification and Validation) it even must be doubted that the automatically generated code is correct. Therefore an additional activity is required to prove the correctness of the whole chain from the modelling level down to execution on the target platform. Certification of a code generator is the state-of-the-art approach for dealing with such risks. Scade [1] was the first code generator certified according to DO178B. The certification costs are a significant disadvantage of this certification approach. All code needs to be analysed manually, and this procedure has to be repeated for recertification after each maintenance step. But certification does not guarantee at all that the generated code complies with the model. Certification is based on compliance of the code of the code generator with given standards. Such compliance can never guarantee correctness of the whole chain through transformation down to the environment for execution, though the belief is that certification implies well-formed code at a reduced fault rate. The approach presented here goes in a direction different from manual certification. It is guided by the idea of automated proof: each time code is generated from a model, the properties of the code when being executed in its environment are compared with the properties specified in the model. 
This allows one to conclude on the correctness of

  1. Modelling and interpreting biologically crusted dryland soil sub-surface structure using automated micropenetrometry

    Science.gov (United States)

    Hoon, Stephen R.; Felde, Vincent J. M. N. L.; Drahorad, Sylvie L.; Felix-Henningsen, Peter

    2015-04-01

    Soil penetrometers are used routinely to determine the shear strength of soils and deformable sediments, both at the surface and throughout a depth profile, in disciplines as diverse as soil science, agriculture, geoengineering and alpine avalanche-safety (e.g. Grunwald et al. 2001, Van Herwijnen et al. 2009). Generically, penetrometers comprise two principal components: an advancing probe, and a transducer; the latter to measure the pressure or force required to cause the probe to penetrate or advance through the soil or sediment. The force transducer employed to determine the pressure can range, for example, from a simple mechanical spring gauge to an automatically data-logged electronic transducer. Automated computer control of the penetrometer step size and probe advance rate enables precise measurements to be made down to a resolution of tens of microns (e.g. the automated electronic micropenetrometer (EMP) described by Drahorad 2012). Here we discuss the determination, modelling and interpretation of biologically crusted dryland soil sub-surface structures using automated micropenetrometry. We outline a model enabling the interpretation of depth-dependent penetration resistance (PR) profiles and their spatial differentials using the model equations σ(z) = σ_c0 + Σ_n [σ_n(z) + a_n·z + b_n·z²] and dσ/dz = Σ_n [dσ_n(z)/dz + Fr_n(z)], where σ_c0 and σ_n are the plastic deformation stresses for the surface and the nth soil structure (e.g. soil crust, layer, horizon or void) respectively, and Fr_n(z)·dz is the frictional work done per unit volume by sliding the penetrometer rod an incremental distance, dz, through the nth layer. Both σ_n(z) and Fr_n(z) are related to soil structure. They determine the form of σ(z) measured by the EMP transducer. The model enables pores (regions of zero deformation stress) to be distinguished from changes in layer structure or probe friction. We have applied this method to both artificial calibration soils in the
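
    A numerical sketch of the layer model above: the penetration resistance at depth z is the deformation stress of the layer containing z plus its linear and quadratic depth terms, and pores show up as zero-resistance spans of the profile. Layer boundaries and all coefficients are invented for illustration.

    ```python
    # Evaluate a piecewise layered resistance profile and flag pore depths.
    # Layer geometry and coefficients are hypothetical.

    LAYERS = [
        # (z_top, z_bottom, sigma_n, a_n, b_n); a pore has sigma_n = a = b = 0
        (0.0, 1.0, 5.0, 0.8, 0.05),   # biological crust
        (1.0, 1.5, 0.0, 0.0, 0.0),    # pore / void
        (1.5, 4.0, 2.0, 0.4, 0.02),   # underlying soil layer
    ]

    def resistance(z, sigma_c0=0.5):
        for z_top, z_bot, s_n, a_n, b_n in LAYERS:
            if z_top <= z < z_bot:
                if s_n == a_n == b_n == 0.0:
                    return 0.0                 # pore: zero deformation stress
                return sigma_c0 + s_n + a_n * z + b_n * z * z
        return 0.0

    profile = [(round(z * 0.25, 2), round(resistance(z * 0.25), 3))
               for z in range(16)]
    pores = [z for z, r in profile if r == 0.0]
    print(pores)
    ```

    The zero-resistance criterion is exactly how the model separates pores from mere changes in layer stiffness or probe friction: a stiffness change shifts σ(z) to a new nonzero branch, while a pore drops it to zero.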

  2. Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, J.; Polly, B.; Collis, J.

    2013-09-01

    This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al. 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960s-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define 'explicit' input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.
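
    The simplest of the four methods, output-ratio calibration, can be sketched directly: scale the model's monthly energy predictions by the ratio of total measured to total simulated use, then apply the same factor when predicting retrofit savings. The numbers below are synthetic, not BEopt/DOE-2.2 output.

    ```python
    # Output-ratio calibration sketch with invented monthly energy data.

    simulated = [900, 820, 700, 560, 480, 430]   # kWh/month, pre-retrofit model
    measured  = [990, 902, 770, 616, 528, 473]   # synthetic utility bills

    ratio = sum(measured) / sum(simulated)       # single scaling factor
    calibrated = [s * ratio for s in simulated]

    retrofit_sim = [810, 738, 630, 504, 432, 387]  # model with measure applied
    predicted_savings = sum(calibrated) - sum(retrofit_sim) * ratio
    print(round(ratio, 2), round(predicted_savings, 1))
    ```

    A single multiplicative factor cannot correct input errors that distort the load shape, which is why the study compares this baseline against the optimization-based methods on predicted-savings accuracy.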

  3. Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, Joseph [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Polly, Ben [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Collis, Jon [Colorado School of Mines, Golden, CO (United States)]

    2013-09-01

    This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al. 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960s-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define "explicit" input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.

  4. Automating the Mapping Process of Traditional Malay Textile Knowledge Model with the Core Ontology

    Directory of Open Access Journals (Sweden)

    Syerina A.M. Nasir

    2011-01-01

    Problem statement: The wave of ontology has spread drastically in the cultural heritage domain. The impact can be seen in the growing number of cultural heritage web information systems, available textile ontologies and harmonization works with the core ontology, CIDOC CRM. The aim of this study is to provide a base for common views in automating the process of mapping between the revised TMT Knowledge Model and CIDOC CRM. Approach: Manual mapping was conducted to find similar or overlapping concepts which are aligned to each other in order to achieve ontology similarity. This was achieved after the TMT Knowledge Model had undergone a transformation process to match the CIDOC CRM structure. Results: Although several problems were encountered during the mapping process, the result gives an instant view of the classes which were found to be easily mapped between both models. Conclusion/Recommendations: Future research will focus on the construction of the Batik Heritage Ontology by using the mapping result obtained in this study. Further testing, evaluation and refinement using real collections of cultural artifacts within museums will also be conducted in the near future.
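    The manual concept mapping described above can be partially automated. The sketch below is a loose illustration, not the authors' method: it proposes candidate class alignments between two ontologies by label similarity, with made-up class names.

```python
from difflib import SequenceMatcher

def align_classes(source, target, threshold=0.6):
    """Propose candidate mappings between two ontologies' class labels.

    Returns (source_class, best_target_class, similarity) tuples for
    pairs whose label similarity clears the threshold.
    """
    mappings = []
    for s in source:
        best, score = None, 0.0
        for t in target:
            r = SequenceMatcher(None, s.lower(), t.lower()).ratio()
            if r > score:
                best, score = t, r
        if score >= threshold:
            mappings.append((s, best, round(score, 2)))
    return mappings
```

In practice such lexical matches would only seed the alignment; structural comparison against the CIDOC CRM hierarchy would still be needed to confirm each mapping.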

  5. AUTOMATED FORMATION OF CALCULATION MODELS OF TURBOGENERATORS FOR SOFTWARE ENVIRONMENT FEMM

    Directory of Open Access Journals (Sweden)

    V.I. Milykh

    2015-08-01

    Attention is paid to the popular FEMM (Finite Element Method Magnetics) program, which is effective in the numerical calculation of the magnetic fields of electrical machines. The main problem of its use (the high time cost of forming a graphical model representing the design and a physical model representing the material properties and winding currents of a machine) is solved. For this purpose, principles for the automated formation of such models are developed and presented using a turbogenerator as an example. The task is performed by a program written in the algorithmic language Lua, which is integrated into the FEMM package. The program is universal in terms of varying the geometry and dimensions of the designed turbogenerators. It uses a minimum of input information in digital form representing the design of the whole turbogenerator and its fragments. A general structure of the Lua script is provided, together with significant parts of its text, graphic results of the work's phases, explanations of the program and instructions for its use. The capabilities of the compiled Lua script are shown on the example of a real 340 MW turbogenerator.

  6. World-wide distribution automation systems

    Energy Technology Data Exchange (ETDEWEB)

    Devaney, T.M.

    1994-12-31

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include distribution management systems; substation, feeder, and customer functions; potential benefits; automation costs; planning and engineering considerations; automation trends; databases; system operation; and computer modeling of the system.

  7. Using Visual Specifications in Verification of Industrial Automation Controllers

    Directory of Open Access Journals (Sweden)

    Bouzon Gustavo

    2008-01-01

    This paper deals with further development of a graphical specification language resembling timing-diagrams and allowing specification of partially ordered events in input and output signals. The language specifically aims at application in modular modelling of industrial automation systems and their formal verification via model-checking. The graphical specifications are translated into a model which is connected with the original model under study.

  8. Modeling and simulation of control system for electron beam machine (EBM) using programmable automation controller (PAC)

    International Nuclear Information System (INIS)

    An EBM electronic model is designed to simulate the control system of the Nissin EBM, located at Block 43 of the MINT complex at Jalan Dengkil, with a maximum output of 3 MeV, 30 mA, using a Programmable Automation Controller (PAC). This model operates like a real EBM system, where all the start-up, interlocking and stopping procedures are fully followed. It also involves formulating mathematical models to relate certain outputs with the input parameters, using data from actual operation of the EB machine. The simulation involves a PAC system consisting of digital and analogue input/output modules. The program code is written using LabVIEW software (real-time version) on a PC and then downloaded into the PAC's stand-alone memory. All 23 interlocking signals required by the EB machine are manually controlled by mechanical switches and represented by LEDs. The EB parameters are manually controlled by potentiometers and displayed on analogue and digital meters. All these signals are then interfaced to the PC via the Wi-Fi wireless communication built into the PAC controller. The program is developed in accordance with the specifications and requirements of the original real EB system and displays them on the panel of the model and also on the PC monitor. All possible failures from human error, hardware and software malfunctions, including worst-case conditions, will be tested, evaluated and addressed. It is hoped that the performance of the model complies with the requirements for operating the EB machine, and that this electronic model can replace the original PC interfacing currently utilized in the Nissin EBM in the near future. The system can also be used to study fault tolerance analysis and automatic re-configuration for advanced control of the EB system. (Author)
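    The interlocking logic described, where every interlock signal must be healthy before start-up proceeds, can be sketched as follows. This is a hypothetical illustration; the signal names and the linear beam-current ramp are assumptions, not the Nissin EBM's actual procedure.

```python
def interlocks_ok(states):
    """Start-up is permitted only when every interlock signal is closed."""
    return all(states.values())

def start_sequence(interlocks, ramp_steps=5, target_ma=30.0):
    """Refuse to start if any interlock is open; otherwise ramp beam current.

    Returns the sequence of current setpoints (mA) up to the target.
    """
    if not interlocks_ok(interlocks):
        open_ones = [k for k, v in interlocks.items() if not v]
        raise RuntimeError(f"interlocks open: {open_ones}")
    # ramp the beam current gradually, as a real start-up procedure would
    return [round(target_ma * (i + 1) / ramp_steps, 1) for i in range(ramp_steps)]
```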

  9. What determines the take-over time? An integrated model approach of driver take-over after automated driving.

    Science.gov (United States)

    Zeeb, Kathrin; Buchner, Axel; Schrauf, Michael

    2015-05-01

    In recent years the automation level of driver assistance systems has increased continuously. One of the major challenges for highly automated driving is to ensure a safe driver take-over of the vehicle guidance. This must be ensured especially when the driver is engaged in non-driving related secondary tasks. For this purpose it is essential to find indicators of the driver's readiness to take over and to gain more knowledge about the take-over process in general. A simulator study was conducted to explore how drivers' allocation of visual attention during highly automated driving influences a take-over action in response to an emergency situation. Therefore we recorded drivers' gaze behavior during automated driving while simultaneously engaging in a visually demanding secondary task, and measured their reaction times in a take-over situation. According to their gaze behavior the drivers were categorized into "high", "medium" and "low-risk". The gaze parameters were found to be suitable for predicting the readiness to take over the vehicle, in such a way that high-risk drivers reacted late and more often inappropriately in the take-over situation. However, there was no difference among the driver groups in the time required by the drivers to establish motor readiness to intervene after the take-over request. An integrated model approach of driver behavior in emergency take-over situations during automated driving is presented. It is argued that primarily cognitive and not motor processes determine the take-over time. Given this, insights can be derived for further research and the development of automated systems. PMID:25794922

  10. An automated method for generating analogic signals that embody the Markov kinetics of model ionic channels.

    Science.gov (United States)

    Luchian, Tudor

    2005-08-30

    In this work we present an automated method for generating electrical signals which reflect the kinetics of ionic channels that have custom-tailored intermediate sub-states and intermediate reaction constants. The concept of our virtual single-channel waveform generator makes use of two software platforms, one for the numerical generation of single channel traces stemming from a pre-defined model and another for the digital-to-analog conversion of such numerically generated single channel traces. This technique of continuous generation and recording of the activity of a model ionic channel provides an efficient protocol to teach neophytes in the field of single-channel electrophysiology about its major phenomenological facets. Random analogic signals generated by using our technique can be successfully employed in a number of applications, such as: assisted learning of single-molecule kinetic investigation via electrical recordings, impedance spectroscopy, the evaluation of the linear frequency response of neurons and the study of stochastic resonance of ion channels. PMID:16054511
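    The numerical-generation half of such a system can be caricatured with a minimal sketch, not the authors' implementation: a two-state (closed/open) Markov channel with exponentially distributed dwell times, sampled at a fixed rate. Rate constants and units are illustrative.

```python
import random

def channel_trace(k_open, k_close, duration, dt, seed=0):
    """Simulate a two-state (closed <-> open) Markov channel.

    k_open / k_close: transition rates out of closed / open state (1/s).
    Returns 0/1 samples at interval dt over the given duration (s).
    """
    rng = random.Random(seed)
    state, t, samples = 0, 0.0, []
    while t < duration:
        rate = k_open if state == 0 else k_close
        dwell = rng.expovariate(rate)          # exponential dwell time
        n = max(1, int(dwell / dt))            # dwell expressed in samples
        samples.extend([state] * n)
        state = 1 - state                      # flip between closed and open
        t += n * dt
    return samples[: int(duration / dt)]
```

A richer model with several sub-states would replace the single flip with a transition matrix, but the sampling logic stays the same; the samples would then be fed to a DAC to produce the analog waveform.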

  11. Aircraft wing structural design optimization based on automated finite element modelling and ground structure approach

    Science.gov (United States)

    Yang, Weizhu; Yue, Zhufeng; Li, Lei; Wang, Peiyan

    2016-01-01

    An optimization procedure combining an automated finite element modelling (AFEM) technique with a ground structure approach (GSA) is proposed for structural layout and sizing design of aircraft wings. The AFEM technique, based on CATIA VBA scripting and PCL programming, is used to generate models automatically considering the arrangement of inner systems. GSA is used for local structural topology optimization. The design procedure is applied to a high-aspect-ratio wing. The arrangement of the integral fuel tank, landing gear and control surfaces is considered. For the landing gear region, a non-conventional initial structural layout is adopted. The positions of components, the number of ribs and local topology in the wing box and landing gear region are optimized to obtain a minimum structural weight. Constraints include tank volume, strength, buckling and aeroelastic parameters. The results show that the combined approach leads to a greater weight saving, i.e. 26.5%, compared with three additional optimizations based on individual design approaches.

  12. An Automated BIM Model to Conceptually Design, Analyze, Simulate, and Assess Sustainable Building Projects

    Directory of Open Access Journals (Sweden)

    Farzad Jalaei

    2014-01-01

    Quantifying the environmental impacts and simulating the energy consumption of a building's components at the conceptual design stage are very helpful for designers needing to make decisions related to the selection of the best design alternative that would lead to a more energy-efficient building. Building Information Modeling (BIM) offers designers the ability to assess different design alternatives at the conceptual stage of the project so that energy and life cycle assessment (LCA) strategies and systems are attained. This paper proposes an automated model that links BIM, LCA, energy analysis, and lighting simulation tools with green building certification systems. The implementation involves developing plug-ins for a BIM tool capable of measuring the environmental impacts (EI) and embodied energy of building components. Using this method, designers will be provided with a new way to visualize and to identify the potential gain or loss of energy for the building as a whole and for each of its associated components. Furthermore, designers will be able to detect and evaluate the sustainability of the proposed buildings based on the Leadership in Energy and Environmental Design (LEED) rating system. An actual building project is used to illustrate the workability of the proposed methodology.

  13. Automation of the Jarrell-Ash Model 70-314 emission spectrometer

    International Nuclear Information System (INIS)

    Automation of the Jarrell-Ash 3.4-Meter Ebert direct-reading emission spectrometer with digital scaler readout is described. The readout is interfaced to a Data General NOVA 840 minicomputer. The automation code consists of BASIC language programs for interactive routines, data processing, and report generation. Call statements within the BASIC programs invoke assembly language routines for real-time data acquisition and control. In addition, the automation objectives as well as the spectrometer-computer system functions, coding, and operating instructions are presented

  14. A logical partial equivalence relation model of abstract interpretation

    Institute of Scientific and Technical Information of China (English)

    王蓁蓁

    2015-01-01

    Abstract interpretation was proposed by Cousot P and Cousot R in 1977, and many authors have since built on it. Generally speaking, the classic abstract interpretation theory has developed within two equivalent formal frameworks, Galois connections and closure operators. This paper constructs, from a different perspective, a model based on partial equivalence relations and logical partial equivalence relations: it treats the abstract domain as a collection of "partial equivalence relations" and the semantic operators as a collection of "logical partial equivalence relations", so the model is fundamentally different from traditional abstract interpretation models. Apart from requiring that the concrete and abstract semantic operators satisfy certain logical relations, the model does not demand special properties such as monotonicity, which also distinguishes it from the classic model. Moreover, it is not an abstraction of a concrete system in an "approximate" sense, but an abstraction of all relations (including logical relations) on the original system. Thus the model is not a "simplification" of the original system but rather a "deepening" of it; in some sense it disjoins the "quality" unit from the original system and may be more complex than the original system. Problems posed on this model therefore have different characteristics, such as complexity and polymorphism. The advantage of this model is that it embodies some logical relations concerned with the system functions

  15. EBS Radionuclide Transport Abstraction

    Energy Technology Data Exchange (ETDEWEB)

    J.D. Schreiber

    2005-08-25

    The purpose of this report is to develop and analyze the engineered barrier system (EBS) radionuclide transport abstraction model, consistent with Level I and Level II model validation, as identified in "Technical Work Plan for: Near-Field Environment and Transport: Engineered Barrier System: Radionuclide Transport Abstraction Model Report Integration" (BSC 2005 [DIRS 173617]). The EBS radionuclide transport abstraction (or EBS RT Abstraction) is the conceptual model used in the total system performance assessment for the license application (TSPA-LA) to determine the rate of radionuclide releases from the EBS to the unsaturated zone (UZ). The EBS RT Abstraction conceptual model consists of two main components: a flow model and a transport model. Both models are developed mathematically from first principles in order to show explicitly what assumptions, simplifications, and approximations are incorporated into the models used in the TSPA-LA. The flow model defines the pathways for water flow in the EBS and specifies how the flow rate is computed in each pathway. Input to this model includes the seepage flux into a drift. The seepage flux is potentially split by the drip shield, with some (or all) of the flux being diverted by the drip shield and some passing through breaches in the drip shield that might result from corrosion or seismic damage. The flux through drip shield breaches is potentially split by the waste package, with some (or all) of the flux being diverted by the waste package and some passing through waste package breaches that might result from corrosion or seismic damage. Neither the drip shield nor the waste package survives an igneous intrusion, so the flux splitting submodel is not used in the igneous scenario class. The flow model is validated in an independent model validation technical review. The drip shield and waste package flux splitting algorithms are developed and validated using experimental data. The transport

  16. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  17. An Analytic Model for Design of a Multivehicle Automated Guided Vehicle System

    OpenAIRE

    Eric Johnson, M; Brandeau, Margaret L.

    1993-01-01

    We consider the problem of designing a multivehicle automated guided vehicle system (AGVS) to supplement an existing nonautomated material handling system. The AGVS consists of a pool of vehicles that deliver raw components from a central storage area to workcenters throughout the factory floor. The objective is to determine which workcenters warrant automated component delivery and the number of vehicles required to service those workcenters, to maximize the benefit of the AGVS, subject to a ...

  18. A model for business process automation in service oriented systems with knowledge management technologies

    OpenAIRE

    Šaša, Ana

    2009-01-01

    Due to increasing requirements for efficiency, effectiveness and flexibility of business systems, automation of business processes has become an important topic. In the last years, the most successful and predominant approach to business process automation has become the service-oriented architecture approach. In a service-oriented architecture a business process is composed of services, which represent different tasks that have to be performed in a business system. Typically, a business proc...

  19. Setup time reduction: SMED-balancing integrated model for manufacturing systems with automated transfer

    Directory of Open Access Journals (Sweden)

    Maurizio Faccio

    2013-10-01

    The importance of short setup times is increasing in every type of industry. It has been known how to address this problem for about 20 years. The SMED method, originally developed by the Japanese industrial engineer Shigeo Shingo for reducing the time to exchange dies, gives a really straightforward approach to improve existing setups. On the other hand, in the case of complex manufacturing systems the simple application of the SMED methodology is not enough. Manufacturing systems composed of different working machines with automated transfer facilities are a good example. Technological constraints, task precedence constraints, and synchronization between different setup tasks are just some of the influencing factors that make an improved SMED desirable. The present paper, starting from an industrial case, aims to provide a heuristics methodology that integrates the traditional SMED with the workload balancing problem that is typical of assembly systems, in order to address the setup reduction problem in the case of complex manufacturing systems. An industrial case is reported to validate the proposed model and to demonstrate its practical implications.
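    One way to picture the integration of SMED with workload balancing, purely as an illustrative sketch (task names and durations are made up, and this is not the paper's heuristic), is a greedy longest-processing-time assignment of external setup tasks to parallel operators:

```python
import heapq

def balance_setup_tasks(tasks, n_operators):
    """Assign setup tasks to operators, longest task first, least-loaded operator first.

    tasks: {task_name: duration}. Returns (load, operator_index, task_list) tuples.
    Precedence and synchronization constraints are deliberately ignored here.
    """
    heap = [(0.0, i, []) for i in range(n_operators)]
    heapq.heapify(heap)
    for name, dur in sorted(tasks.items(), key=lambda kv: -kv[1]):
        load, i, assigned = heapq.heappop(heap)   # least-loaded operator
        heapq.heappush(heap, (load + dur, i, assigned + [name]))
    return sorted(heap)
```

The makespan of the resulting assignment approximates the parallel setup time; a full method would add the precedence and synchronization constraints the abstract mentions.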

  20. Modelling molecule-surface interactions--an automated quantum-classical approach using a genetic algorithm.

    Science.gov (United States)

    Herbers, Claudia R; Johnston, Karen; van der Vegt, Nico F A

    2011-06-14

    We present an automated and efficient method to develop force fields for molecule-surface interactions. A genetic algorithm (GA) is used to parameterise a classical force field so that the classical adsorption energy landscape of a molecule on a surface matches the corresponding landscape from density functional theory (DFT) calculations. The procedure performs a sophisticated search in the parameter phase space and converges very quickly. The method is capable of fitting a significant number of structures and corresponding adsorption energies. Water on a ZnO(0001) surface was chosen as a benchmark system but the method is implemented in a flexible way and can be applied to any system of interest. In the present case, pairwise Lennard Jones (LJ) and Coulomb potentials are used to describe the molecule-surface interactions. In the course of the fitting procedure, the LJ parameters are refined in order to reproduce the adsorption energy landscape. The classical model is capable of describing a wide range of energies, which is essential for a realistic description of a fluid-solid interface. PMID:21594260
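    The GA fitting loop can be illustrated with a toy version, not the authors' code: fit the (epsilon, sigma) parameters of a 12-6 Lennard-Jones potential so that classical energies match a set of reference energies. The reference data below are synthetic, standing in for the DFT adsorption energies of the study.

```python
import random

def fit_lj(distances, target, generations=150, pop_size=40, seed=1):
    """Fit (epsilon, sigma) of a 12-6 LJ potential to reference energies via a GA."""
    rng = random.Random(seed)

    def lj(eps, sig, r):
        x = (sig / r) ** 6
        return 4 * eps * (x * x - x)

    def err(ind):
        eps, sig = ind
        return sum((lj(eps, sig, r) - e) ** 2 for r, e in zip(distances, target))

    # random initial population within assumed parameter bounds
    pop = [(rng.uniform(0.01, 2.0), rng.uniform(1.0, 5.0)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=err)
        survivors = pop[: pop_size // 2]           # elitist selection
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = rng.sample(survivors, 2)
            # averaging crossover plus Gaussian mutation
            children.append(tuple(max(1e-3, (x + y) / 2 + rng.gauss(0, 0.05))
                                  for x, y in zip(a, b)))
        pop = survivors + children
    return min(pop, key=err)
```

In the paper's setting the fitness evaluation compares a whole adsorption-energy landscape rather than a 1-D curve, but the select/crossover/mutate cycle has the same shape.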

  1. Piaget on Abstraction.

    Science.gov (United States)

    Moessinger, Pierre; Poulin-Dubois, Diane

    1981-01-01

    Reviews and discusses Piaget's recent work on abstract reasoning. Piaget's distinction between empirical and reflective abstraction is presented; his hypotheses are considered to be metaphorical. (Author/DB)

  2. Programme and abstracts

    International Nuclear Information System (INIS)

    Abstracts of 25 papers presented at the congress are given. The abstracts cover various topics including radiotherapy, radiopharmaceuticals, radioimmunoassay, health physics, radiation protection and nuclear medicine

  3. Automated parameter estimation for biological models using Bayesian statistical model checking

    OpenAIRE

    Hussain, Faraz; Langmead, Christopher J.; Mi, Qi; Dutta-Moscato, Joyeeta; Vodovotz, Yoram; Jha, Sumit K.

    2015-01-01

    Background Probabilistic models have gained widespread acceptance in the systems biology community as a useful way to represent complex biological systems. Such models are developed using existing knowledge of the structure and dynamics of the system, experimental observations, and inferences drawn from statistical analysis of empirical data. A key bottleneck in building such models is that some system variables cannot be measured experimentally. These variables are incorporated into the mode...

  4. Automated Techniques for the Qualitative Analysis of Ecological Models: Continuous Models

    Directory of Open Access Journals (Sweden)

    Lynn van Coller

    1997-06-01

    The mathematics required for a detailed analysis of the behavior of a model can be formidable. In this paper, I demonstrate how various computer packages can aid qualitative analyses by implementing techniques from dynamical systems theory. Because computer software is used to obtain the results, the techniques can be used by nonmathematicians as well as mathematicians. In-depth analyses of complicated models that were previously very difficult to study can now be done. Because the paper is intended as an introduction to applying the techniques to ecological models, I have included an appendix describing some of the ideas and terminology. A second appendix shows how the techniques can be applied to a fairly simple predator-prey model and establishes the reliability of the computer software. The main body of the paper discusses a ratio-dependent model. The new techniques highlight some limitations of isocline analyses in this three-dimensional setting and show that the model is structurally unstable. Another appendix describes a larger model of a sheep-pasture-hyrax-lynx system. Dynamical systems techniques are compared with a traditional sensitivity analysis and are found to give more information. As a result, an incomplete relationship in the model is highlighted.
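    For a flavor of the continuous models such packages analyze, the hypothetical sketch below integrates a basic Lotka-Volterra predator-prey system with forward Euler. Parameters are arbitrary; a real qualitative analysis would use the dedicated dynamical-systems tools the paper discusses, not raw integration.

```python
def simulate_lv(prey0, pred0, a, b, c, d, dt=0.001, steps=20000):
    """Forward-Euler integration of a Lotka-Volterra predator-prey model.

    dx/dt = a*x - b*x*y   (prey growth minus predation)
    dy/dt = c*x*y - d*y   (predator growth minus mortality)
    """
    x, y = prey0, pred0
    traj = [(x, y)]
    for _ in range(steps):
        dx = a * x - b * x * y
        dy = c * x * y - d * y
        x, y = x + dx * dt, y + dy * dt
        traj.append((x, y))
    return traj
```

Plotting the trajectory in the (prey, predator) plane shows the closed-orbit structure that isocline and continuation analyses characterize analytically.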

  5. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I briefly present a real-time, software- and hardware-oriented house automation research project that was designed and implemented, capable of automating a house's electricity and providing a security system to detect the presence of unexpected behavior.

  6. Model-Driven Development of Automation and Control Applications: Modeling and Simulation of Control Sequences

    Directory of Open Access Journals (Sweden)

    Timo Vepsäläinen

    2014-01-01

    The scope and responsibilities of control applications are increasing due to, for example, the emergence of the industrial internet. To meet the challenge, model-driven development techniques have been in active research in the application domain. Simulations that have been traditionally used in the domain, however, have not yet been sufficiently integrated into model-driven control application development. In this paper, a model-driven development process that includes support for design-time simulations is complemented with support for simulating sequential control functions. The approach is implemented with open source tools and demonstrated by creating and simulating a control system model in closed-loop with a large and complex model of a paper industry process.
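    The sequential control functions being simulated can be pictured with a tiny step-machine sketch. The step names and transition conditions below are invented for illustration; the paper's approach is model-driven and tool-generated, not hand-coded like this.

```python
def run_sequence(steps, env_ticks):
    """Advance an SFC-like step chain: one step active, transition tested per tick.

    steps:     list of (name, transition) pairs; transition maps an
               environment snapshot to True/False.
    env_ticks: iterable of environment snapshots, one per simulation tick.
    Returns the (tick, active_step_name) log.
    """
    active, log = 0, []
    for tick, env in enumerate(env_ticks):
        name, transition = steps[active]
        log.append((tick, name))
        if transition(env) and active < len(steps) - 1:
            active += 1   # fire the transition to the next step
    return log
```

Running such a step chain in closed loop against a process model is what lets design-time simulation reveal sequencing errors before deployment.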

  7. Automated Segmentation of Cardiac Magnetic Resonance Images

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Nilsson, Jens Chr.; Grønning, Bjørn A.

    2001-01-01

    Magnetic resonance imaging (MRI) has been shown to be an accurate and precise technique to assess cardiac volumes and function in a non-invasive manner and is generally considered to be the current gold-standard for cardiac imaging [1]. Measurement of ventricular volumes, muscle mass and function is based on determination of the left-ventricular endocardial and epicardial borders. Since manual border detection is laborious, automated segmentation is highly desirable as a fast, objective and reproducible alternative. Automated segmentation will thus enhance comparability between and within cardiac studies and increase accuracy by allowing acquisition of thinner MRI-slices. This abstract demonstrates that statistical models of shape and appearance, namely the deformable models: Active Appearance Models, can successfully segment cardiac MRIs.

  8. Evidence evaluation in fingerprint comparison and automated fingerprint identification systems--Modeling between finger variability.

    Science.gov (United States)

    Egli Anthonioz, N M; Champod, C

    2014-02-01

    In the context of the investigation of the use of automated fingerprint identification systems (AFIS) for the evaluation of fingerprint evidence, the current study presents investigations into the variability of scores from an AFIS system when fingermarks from a known donor are compared to fingerprints that are not from the same source. The ultimate goal is to propose a model, based on likelihood ratios, which allows the evaluation of mark-to-print comparisons. In particular, this model, through its use of AFIS technology, benefits from the possibility of using a large amount of data, as well as from an already built-in proximity measure, the AFIS score. More precisely, the numerator of the LR is obtained from scores issued from comparisons between impressions from the same source and showing the same minutia configuration. The denominator of the LR is obtained by extracting scores from comparisons of the questioned mark with a database of non-matching sources. This paper focuses solely on the assignment of the denominator of the LR. We refer to it by the generic term of between-finger variability. The issues addressed in this paper in relation to between-finger variability are the required sample size, the influence of the finger number and general pattern, as well as that of the number of minutiae included and their configuration on a given finger. Results show that reliable estimation of between-finger variability is feasible with 10,000 scores. These scores should come from the appropriate finger number/general pattern combination as defined by the mark. Furthermore, strategies of obtaining between-finger variability when these elements cannot be conclusively seen on the mark (and its position with respect to other marks for finger number) have been presented. These results immediately allow case-by-case estimation of the between-finger variability in an operational setting. PMID:24447455
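    The score-based likelihood ratio has the structure of a density ratio: the score of the questioned comparison is evaluated against the distribution of same-source scores (numerator) and the between-finger score distribution (denominator). The toy code below uses a hand-rolled Gaussian kernel density estimate and made-up score samples purely to show that structure; it is not the authors' model.

```python
import math

def kde_density(sample, x, h):
    """Gaussian kernel density estimate of the sample, evaluated at x."""
    return sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in sample) / (
        len(sample) * h * math.sqrt(2 * math.pi))

def likelihood_ratio(same_source, different_source, score, h=5.0):
    """LR = density under same-source scores / density under between-finger scores."""
    return kde_density(same_source, score, h) / kde_density(different_source, score, h)
```

In an operational setting the `different_source` sample would be the ~10,000 AFIS scores drawn from the appropriate finger number/general pattern combination, as the study recommends.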

  9. An automated system to simulate the River discharge in Kyushu Island using the H08 model

    Science.gov (United States)

    Maji, A.; Jeon, J.; Seto, S.

    2015-12-01

    Kyushu Island is located in the southwestern part of Japan, and it is often affected by typhoons and a Baiu front. Severe water-related disasters have been recorded in Kyushu Island. On the other hand, because of its high population density and crop growth, water resources are an important issue for Kyushu Island. The simulation of river discharge is important for water resource management and early warning of water-related disasters. This study attempts to apply the H08 model to simulate river discharge in Kyushu Island. Geospatial meteorological and topographical data were obtained from the Japanese Ministry of Land, Infrastructure, Transport and Tourism (MLIT) and the Automated Meteorological Data Acquisition System (AMeDAS) of the Japan Meteorological Agency (JMA). The number of AMeDAS observation stations is limited and not quite satisfactory for the application of water resources models in Kyushu, so it is necessary to spatially interpolate the point data to produce a grid dataset. The meteorological grid dataset is produced by considering elevation dependence. Solar radiation is estimated from hourly sunshine duration by a conventional formula. We successfully improved the accuracy of the interpolated data simply by considering elevation dependence and found that the bias is related to geographical location. The rain/snow classification is done by the H08 model and is validated by comparing estimated and observed snow rates. The estimates tend to be larger than the corresponding observed values. A system to automatically produce a daily meteorological grid dataset is being constructed. The geospatial river network data were produced by ArcGIS and utilized in the H08 model to simulate the river discharge. First, this research compares simulated and measured specific discharge, which is the ratio of discharge to watershed area. Significant errors between simulated and measured data were seen in some rivers.
Secondly, the outputs by the coupled model including crop growth
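    The elevation-dependent interpolation mentioned above can be sketched as inverse-distance weighting with a lapse-rate correction: station temperatures are reduced to sea level, interpolated horizontally, then restored to the grid cell's elevation. This is a generic hypothetical scheme with a standard-atmosphere lapse rate, not necessarily the authors' exact formulation.

```python
def idw_temperature(stations, px, py, pelev, lapse=-0.0065, power=2):
    """Inverse-distance-weighted temperature with an elevation correction.

    stations: list of (x, y, elev_m, temp_C) tuples (example data assumed).
    lapse:    temperature change per metre of elevation (standard atmosphere).
    """
    num = den = 0.0
    for x, y, elev, t in stations:
        t0 = t - lapse * elev                  # reduce station temp to sea level
        d2 = (x - px) ** 2 + (y - py) ** 2
        if d2 == 0:
            return t0 + lapse * pelev          # exact hit on a station
        w = 1.0 / d2 ** (power / 2)
        num += w * t0
        den += w
    return num / den + lapse * pelev           # restore to grid-cell elevation
```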

  10. Beyond captions: linking figures with abstract sentences in biomedical articles.

    Directory of Open Access Journals (Sweden)

    Joseph P Bockhorst

    Although figures in scientific articles have high information content and concisely communicate many key research findings, they are currently underutilized by literature search and retrieval systems. Many systems ignore figures, and those that do not typically consider only caption text. This study describes and evaluates a fully automated approach for associating figures in the body of a biomedical article with sentences in its abstract. We use supervised methods to learn probabilistic language models, hidden Markov models, and conditional random fields for predicting associations between abstract sentences and figures. Three kinds of evidence are used: text in abstract sentences and figures, relative positions of sentences and figures, and the patterns of sentence/figure associations across an article. Each information source is shown to have predictive value, and models that use all kinds of evidence are more accurate than models that do not. Our most accurate method has an F1-score of 69% on a cross-validation experiment, is competitive with the accuracy of human experts, has significantly better predictive accuracy than state-of-the-art methods and enables users to access figures associated with an abstract sentence with an average of 1.82 fewer mouse clicks. A user evaluation shows that human users find our system beneficial. The system is available at http://FigureItOut.askHERMES.org.
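    The supervised models in the study are beyond a snippet, but the core association idea can be caricatured with plain bag-of-words cosine similarity between captions and abstract sentences, a naive stand-in for the probabilistic models actually used (the example text is invented):

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two texts under a bag-of-words model."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    na = math.sqrt(sum(v * v for v in va.values()))
    nb = math.sqrt(sum(v * v for v in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

def link_figures(abstract_sentences, captions):
    """Associate each figure caption with its most similar abstract sentence."""
    return {i: max(range(len(abstract_sentences)),
                   key=lambda j: cosine(captions[i], abstract_sentences[j]))
            for i in range(len(captions))}
```

The study's models improve on this baseline by also exploiting sentence/figure positions and article-level association patterns, which pure text similarity ignores.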

  11. Forecasting macroeconomic variables using neural network models and three automated model selection techniques

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    2016-01-01

    When forecasting with neural network models one faces several problems, all of which influence the accuracy of the forecasts. First, neural networks are often hard to estimate due to their highly nonlinear structure. To alleviate the problem, White (2006) presented a solution (QuickNet) that conv...

  12. Automation based on knowledge modeling theory and its applications in engine diagnostic systems using Space Shuttle Main Engine vibrational data. M.S. Thesis

    Science.gov (United States)

    Kim, Jonnathan H.

    1995-01-01

    Humans can perform many complicated tasks without explicit rules. This inherent and advantageous capability becomes a hurdle when a task is to be automated. Modern computers and numerical calculations require explicit rules and discrete numerical values. In order to bridge the gap between human knowledge and automating tools, a knowledge model is proposed. Knowledge modeling techniques are discussed and utilized to automate a labor- and time-intensive task of detecting anomalous bearing wear patterns in the Space Shuttle Main Engine (SSME) High Pressure Oxygen Turbopump (HPOTP).

  13. Forecasting Macroeconomic Variables using Neural Network Models and Three Automated Model Selection Techniques

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    When forecasting with neural network models one faces several problems, all of which influence the accuracy of the forecasts. First, neural networks are often hard to estimate due to their highly nonlinear structure. In fact, their parameters are not even globally...

  14. Automated identification of potential snow avalanche release areas based on digital elevation models

    Directory of Open Access Journals (Sweden)

    Y. Bühler

    2013-05-01

    Full Text Available The identification of snow avalanche release areas is a very difficult task. The release mechanism of snow avalanches depends on many different terrain, meteorological, snowpack and triggering parameters and their interactions, which are very difficult to assess. In many alpine regions such as the Indian Himalaya, nearly no information on avalanche release areas exists, mainly due to the very rough and poorly accessible terrain, the vast size of the region and the lack of avalanche records. However, avalanche release information is urgently required for numerical simulation of avalanche events to plan mitigation measures, for hazard mapping and to secure important roads. The Rohtang tunnel access road near Manali, Himachal Pradesh, India, is such an example. By far the most reliable way to identify avalanche release areas is using historic avalanche records and field investigations accomplished by avalanche experts in the formation zones. But both methods are not feasible for this area due to the rough terrain, its vast extent and lack of time. Therefore, we develop an operational, easy-to-use automated potential release area (PRA) detection tool in Python/ArcGIS which uses high spatial resolution digital elevation models (DEMs) and forest cover information derived from airborne remote sensing instruments as input. Such instruments can acquire spatially continuous data even over inaccessible terrain and cover large areas. We validate our tool using a database of historic avalanches acquired over 56 yr in the neighborhood of Davos, Switzerland, and apply this method for the avalanche tracks along the Rohtang tunnel access road. This tool, used by avalanche experts, delivers valuable input to identify focus areas for more-detailed investigations on avalanche release areas in remote regions such as the Indian Himalaya and is a precondition for large-scale avalanche hazard mapping.
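    One core criterion in DEM-based release-area detection is the slope angle. The sketch below computes slope from an elevation grid by central differences and masks cells in a slope window; the 30-60 degree window is a common literature value assumed here, not taken from the tool above, which also uses curvature, roughness and forest cover.

```python
import math


def slope_deg(dem, cell_size):
    """Slope angle in degrees for interior cells, via central differences."""
    rows, cols = len(dem), len(dem[0])
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            dzdx = (dem[i][j + 1] - dem[i][j - 1]) / (2 * cell_size)
            dzdy = (dem[i + 1][j] - dem[i - 1][j]) / (2 * cell_size)
            out[i][j] = math.degrees(math.atan(math.hypot(dzdx, dzdy)))
    return out


def pra_mask(dem, cell_size, lo=30.0, hi=60.0):
    """True where the slope angle falls in the assumed release window."""
    s = slope_deg(dem, cell_size)
    return [[lo <= v <= hi for v in row] for row in s]


# Toy 3x3 elevation grid in metres, 25 m cell size.
dem = [[100, 100, 100],
       [100, 120, 140],
       [100, 140, 180]]
mask = pra_mask(dem, 25.0)
```

    In an ArcGIS workflow the same computation would be done with raster tools rather than Python loops; the sketch only shows the geometric criterion.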

  15. A novel automated behavioral test battery assessing cognitive rigidity in two genetic mouse models of autism.

    Directory of Open Access Journals (Sweden)

    Alicja ePuścian

    2014-04-01

    Full Text Available Repetitive behaviors are a key feature of many pervasive developmental disorders, such as autism. As a heterogeneous group of symptoms, repetitive behaviors are conceptualized into two main subgroups: sensory/motor (lower-order) and cognitive rigidity (higher-order). Although lower-order repetitive behaviors are measured in mouse models in several paradigms, so far there have been no high-throughput tests directly measuring cognitive rigidity. We describe a novel approach for monitoring repetitive behaviors during reversal learning in mice in the automated IntelliCage system. During reward-motivated place preference reversal learning, designed to assess cognitive abilities of mice, visits to the previously rewarded places were recorded to measure cognitive flexibility. Thereafter, emotional flexibility was assessed by measuring conditioned fear extinction. Additionally, to look for neuronal correlates of cognitive impairments, we measured CA3-CA1 hippocampal long-term potentiation (LTP). To standardize the designed tests we used C57BL/6 and BALB/c mice, representing two genetic backgrounds, for induction of autism by prenatal exposure to sodium valproate. We found impairments of place learning related to perseveration and no LTP impairments in C57BL/6 valproate-treated mice. In contrast, BALB/c valproate-treated mice displayed severe deficits of place learning not associated with perseverative behaviors and accompanied by hippocampal LTP impairments. Alterations of cognitive flexibility observed in C57BL/6 valproate-treated mice were related neither to a restricted exploration pattern nor to emotional flexibility. Altogether, we showed that the designed tests of cognitive performance and perseverative behaviors are efficient and highly replicable. Moreover, the results suggest that genetic background is crucial for the behavioral effects of prenatal valproate treatment.
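    A minimal sketch of the kind of perseveration measure such reversal-learning paradigms yield: the fraction of post-reversal visits that still go to the previously rewarded corner. The corner labels and visit sequence are illustrative, not data from the study.

```python
def perseveration_index(visits, old_corner):
    """Fraction of visits to the previously rewarded corner after reversal.

    visits: sequence of corner labels recorded after the reward switch.
    """
    if not visits:
        return 0.0
    return sum(1 for v in visits if v == old_corner) / len(visits)


# Hypothetical post-reversal visit log; "A" was rewarded before reversal.
post_reversal = ["A", "A", "B", "A", "C", "A"]
p = perseveration_index(post_reversal, "A")  # 4 of 6 visits
```

    A high index after the reward location has moved indicates cognitive rigidity; a flexible animal's index decays toward chance across sessions.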

  16. Towards Automation 2.0: A Neurocognitive Model for Environment Recognition, Decision-Making, and Action Execution

    Directory of Open Access Journals (Sweden)

    Zucker Gerhard

    2011-01-01

    Full Text Available The ongoing penetration of building automation by information technology is by far not saturated. Today's systems must not only be reliable and fault tolerant, they also have to regard energy efficiency and flexibility in the overall consumption. Meeting the quality and comfort goals in building automation while at the same time optimizing towards energy, carbon footprint and cost-efficiency requires systems that are able to handle large amounts of information and negotiate system behaviour that resolves conflicting demands: a decision-making process. In recent years, research has started to focus on bionic principles for designing new concepts in this area. The information processing principles of the human mind have turned out to be of particular interest, as the mind is capable of processing huge amounts of sensory data and taking adequate decisions for (re-)actions based on these analysed data. In this paper, we discuss how a bionic approach can solve the upcoming problems of energy-optimal systems. A recently developed model for environment recognition and decision-making processes, based on research findings from different disciplines of brain research, is introduced. This model is the foundation for applications in intelligent building automation that have to deal with information from home and office environments. All of these applications have in common that they consist of a combination of communicating nodes and have many, partly contradicting goals.

  17. Automated service quality and its behavioural consequences in CRM Environment: A structural equation modeling and causal loop diagramming approach

    Directory of Open Access Journals (Sweden)

    Arup Kumar Baksi

    2012-08-01

    Full Text Available Information technology induced communications (ICTs) have revolutionized the operational aspects of the service sector and triggered a perceptual shift in service quality, as rapid dis-intermediation has changed how consumers access services. ICT-enabled services further stimulated the perception of automated service quality with renewed dimensions and their subsequent significance in influencing the behavioural outcomes of consumers. Customer Relationship Management (CRM) has emerged as an offshoot of this technological breakthrough, as it ensures service encapsulation by integrating people, process and technology. This paper attempts to explore the relationship between automated service quality and its behavioural consequences in a relatively novel business philosophy, CRM. The study was conducted at Kolkata on the largest public sector bank of India, State Bank of India (SBI), which successfully completed its decade-long operational automation in the year 2008. The study used structural equation modeling (SEM) to justify the proposed model construct and causal loop diagramming (CLD) to depict the negative and positive linkages between the variables.

  18. Evidence evaluation in fingerprint comparison and automated fingerprint identification systems--modelling within finger variability.

    Science.gov (United States)

    Egli, Nicole M; Champod, Christophe; Margot, Pierre

    2007-04-11

    Recent challenges and errors in fingerprint identification have highlighted the need for assessing the information content of a papillary pattern in a systematic way. In particular, estimation of the statistical uncertainty associated with this type of evidence is more and more called upon. The approach used in the present study is based on the assessment of likelihood ratios (LRs). This evaluative tool weighs the likelihood of evidence given two mutually exclusive hypotheses. The computation of likelihood ratios on a database of marks of known sources (matching the unknown and non-matching the unknown mark) allows an estimation of the evidential contribution of fingerprint evidence. LRs are computed taking advantage of the scores obtained from an automated fingerprint identification system and hence are based exclusively on level II features (minutiae). The AFIS system attributes a score to any comparison (fingerprint to fingerprint, mark to mark and mark to fingerprint), used here as a proximity measure between the respective arrangements of minutiae. The numerator of the LR addresses the within finger variability and is obtained by comparing the same configurations of minutiae coming from the same source. Only comparisons where the same minutiae are visible both on the mark and on the print are therefore taken into account. The denominator of the LR is obtained by cross-comparison with a database of prints originating from non-matching sources. The estimation of the numerator of the LR is much more complex in terms of specific data requirements than the estimation of the denominator of the LR (that requires only a large database of prints from a non-associated population). Hence this paper addresses specific issues associated with the numerator, or within-finger variability. This study aims at answering the following questions: (1) how a database for modelling within finger variability should be acquired; (2) whether or not the visualisation technique or the
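    The score-based LR described above can be sketched as a ratio of two densities evaluated at the AFIS score of the questioned mark: one fitted to within-finger (same-source) scores, one to between-finger scores. The Gaussian kernel density, bandwidth, and score values below are illustrative modelling choices, not those of the study.

```python
import math


def gaussian_kde(samples, bandwidth):
    """Return a one-dimensional Gaussian kernel density estimator."""
    n = len(samples)

    def density(x):
        return sum(
            math.exp(-0.5 * ((x - s) / bandwidth) ** 2) for s in samples
        ) / (n * bandwidth * math.sqrt(2 * math.pi))

    return density


within = [820, 790, 850, 880, 805]   # same-source comparison scores (toy)
between = [120, 90, 200, 150, 170]   # scores against non-matching prints (toy)
numerator = gaussian_kde(within, 30.0)
denominator = gaussian_kde(between, 30.0)

score = 810.0  # AFIS score for the questioned mark vs. the suspect's print
lr = numerator(score) / denominator(score)  # LR >> 1 supports same source
```

    The data requirement asymmetry noted in the abstract shows up directly here: `between` only needs a large reference database, while `within` needs repeated comparisons of the same minutiae configuration from the same finger.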

  19. Global-scale assessment of groundwater depletion and related groundwater abstractions: Combining hydrological modeling with information from well observations and GRACE satellites

    Science.gov (United States)

    Döll, Petra; Müller Schmied, Hannes; Schuh, Carina; Portmann, Felix T.; Eicker, Annette

    2014-07-01

    Groundwater depletion (GWD) compromises crop production in major global agricultural areas and has negative ecological consequences. To derive GWD at the grid cell, country, and global levels, we applied a new version of the global hydrological model WaterGAP that simulates not only net groundwater abstractions and groundwater recharge from soils but also groundwater recharge from surface water bodies in dry regions. A large number of independent estimates of GWD as well as total water storage (TWS) trends determined from GRACE satellite data by three analysis centers were compared to model results. GWD and TWS trends are simulated best assuming that farmers in GWD areas irrigate at 70% of optimal water requirement. India, United States, Iran, Saudi Arabia, and China had the highest GWD rates in the first decade of the 21st century. On the Arabian Peninsula, in Libya, Egypt, Mali, Mozambique, and Mongolia, at least 30% of the abstracted groundwater was taken from nonrenewable groundwater during this time period. The rate of global GWD has likely more than doubled since the period 1960-2000. Estimated GWD of 113 km3/yr during 2000-2009, corresponding to a sea level rise of 0.31 mm/yr, is much smaller than most previous estimates. About 15% of the globally abstracted groundwater was taken from nonrenewable groundwater during this period. To monitor recent temporal dynamics of GWD and related water abstractions, GRACE data are best evaluated with a hydrological model that, like WaterGAP, simulates the impact of abstractions on water storage, but the low spatial resolution of GRACE remains a challenge.
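    The conversion quoted above (113 km3/yr of depletion corresponding to about 0.31 mm/yr of sea-level rise) can be checked by spreading the annual volume over the ocean surface. The ocean area constant is the only assumed input.

```python
OCEAN_AREA_KM2 = 3.61e8  # approximate ocean surface area, an assumed constant


def depletion_to_slr_mm(volume_km3_per_yr):
    """Annual water volume spread over the ocean surface, in mm/yr."""
    km_per_yr = volume_km3_per_yr / OCEAN_AREA_KM2
    return km_per_yr * 1e6  # km -> mm


slr = depletion_to_slr_mm(113.0)  # ~0.31 mm/yr, matching the abstract
```

    This simple check assumes all abstracted nonrenewable groundwater ultimately reaches the ocean, which is the standard accounting convention behind such estimates.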

  20. A Cognitive System Model for Human/Automation Dynamics in Airspace Management

    Science.gov (United States)

    Corker, Kevin M.; Pisanich, Gregory; Lebacqz, J. Victor (Technical Monitor)

    1997-01-01

    NASA has initiated a significant thrust of research and development focused on providing flight crews and air traffic managers with automation aids to increase capacity in en route and terminal area operations through the use of flexible, more fuel-efficient routing, while improving the level of safety in commercial carrier operations. In that system development, definition of cognitive requirements for integrated multi-operator dynamic aiding systems is fundamental. In order to support that cognitive function definition, we have extended the Man Machine Integrated Design and Analysis System (MIDAS) to include representation of multiple cognitive agents (both human operators and intelligent aiding systems) operating aircraft, airline operations centers and air traffic control centers in the evolving airspace. The demands of this application require representation of many intelligent agents sharing world-models, and coordinating action/intention with cooperative scheduling of goals and actions in a potentially unpredictable world of operations. The MIDAS operator models have undergone significant development in order to understand the requirements for operator aiding and the impact of that aiding in the complex, nondeterministic system of national airspace operations. The operator model's structure has been modified to include attention functions, action priority, and situation assessment. The cognitive function model has been expanded to include working memory operations including retrieval from long-term store, interference, visual-motor and verbal articulatory loop functions, and time-based losses. The operator's activity structures have been developed to include prioritization and interruption of multiple parallel activities among multiple operators, to provide for anticipation (knowledge of the intention and action of remote operators), and to respond to failures of the system and other operators in the system in situation-specific paradigms. The model's internal

  1. Enhanced Automated Canopy Characterization from Hyperspectral Data by a Novel Two Step Radiative Transfer Model Inversion Approach

    Directory of Open Access Journals (Sweden)

    Wolfgang Wagner

    2009-11-01

    Full Text Available Automated, image based methods for the retrieval of vegetation biophysical and biochemical variables are often hampered by the lack of a priori knowledge about land cover and phenology, which makes the retrieval a highly underdetermined problem. This study addresses this problem by presenting a novel approach, called CRASh, for the concurrent retrieval of leaf area index, leaf chlorophyll content, leaf water content and leaf dry matter content from high resolution solar reflective earth observation data. CRASh, which is based on the inversion of the combined PROSPECT+SAILh radiative transfer model (RTM), explores the benefits of combining semi-empirical and physically based approaches. The approach exploits novel ways to address the underdetermined problem in the context of an automated retrieval from mono-temporal high resolution data. To regularize the inverse problem in the variable domain, RTM inversion is coupled with an automated land cover classification. Model inversion is based on a two step lookup table (LUT) approach: First, a range of possible solutions is selected from a previously calculated LUT based on the analogy between measured and simulated reflectance. The final solution is determined from this subset by minimizing the difference between the variables used to simulate the spectra contained in the reduced LUT and a first guess of the solution. This first guess of the variables is derived from predictive semi-empirical relationships between classical vegetation indices and the single variables. Additional spectral regularization is obtained by the use of hyperspectral data. Results show that estimates obtained with CRASh are significantly more accurate than those obtained with a tested conventional RTM inversion and semi-empirical approach. Accuracies obtained in this study are comparable to the results obtained by various authors for better constrained inversions that assume more a priori information.
The completely automated
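    The two-step lookup-table inversion described above can be sketched as: (1) keep the LUT entries whose simulated spectra are closest to the measurement, then (2) among those, pick the entry whose variables are closest to a semi-empirical first guess. All numbers below are toy values, not PROSPECT+SAILh output.

```python
import math


def rmse(a, b):
    """Root-mean-square difference between two equal-length sequences."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))


def two_step_inversion(lut, measured, first_guess, keep=3):
    """lut: list of (variables, spectrum) pairs; returns the chosen variables."""
    # Step 1: spectral screening - retain the entries closest to the measurement.
    subset = sorted(lut, key=lambda e: rmse(e[1], measured))[:keep]
    # Step 2: regularisation toward the semi-empirical first guess.
    return min(subset, key=lambda e: rmse(e[0], first_guess))[0]


# Toy LUT: (LAI,) paired with a two-band reflectance spectrum.
lut = [((1.0,), [0.10, 0.40]), ((3.0,), [0.09, 0.41]),
       ((5.0,), [0.11, 0.39]), ((9.0,), [0.30, 0.70])]
lai = two_step_inversion(lut, [0.10, 0.40], (4.0,), keep=3)
```

    The second step is what distinguishes this from a plain best-spectral-match inversion: spectrally near-equivalent candidates are disambiguated by the vegetation-index-based first guess.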

  3. Using a Semi-Automated Strategy to Develop Multi-Compartment Models That Predict Biophysical Properties of Interneuron-Specific 3 (IS3) Cells in Hippocampus

    Science.gov (United States)

    Camiré, Olivier

    2016-01-01

    Abstract Determining how intrinsic cellular properties govern and modulate neuronal input–output processing is a critical endeavor for understanding microcircuit functions in the brain. However, lack of cellular specifics and nonlinear interactions prevent experiments alone from achieving this. Building and using cellular models is essential in these efforts. We focus on uncovering the intrinsic properties of Mus musculus hippocampal type 3 interneuron-specific (IS3) cells, a cell type that makes GABAergic synapses onto specific interneuron types, but not pyramidal cells. While IS3 cell morphology and synaptic output have been examined, their voltage-gated ion channel profile and distribution remain unknown. We combined whole-cell patch-clamp recordings and two-photon dendritic calcium imaging to examine IS3 cell membrane and dendritic properties. Using these data as a target reference, we developed a semi-automated strategy to obtain multi-compartment models for a cell type with unknown intrinsic properties. Our approach is based on generating populations of models to capture determined features of the experimental data, each of which possesses unique combinations of channel types and conductance values. From these populations, we chose models that most closely resembled the experimental data. We used these models to examine the impact of specific ion channel combinations on spike generation. Our models predict that fast delayed rectifier currents should be present in soma and proximal dendrites, and this is confirmed using immunohistochemistry. Further, without A-type potassium currents in the dendrites, spike generation is facilitated at more distal synaptic input locations. Our models will help to determine the functional role of IS3 cells in hippocampal microcircuits. PMID:27679813
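    The population-of-models strategy described above can be sketched as: sample many conductance combinations, simulate a summary feature, and keep the models that land near the experimental target. The "simulation" below is a stand-in linear response, not a real multi-compartment model; conductance ranges, target, and tolerance are illustrative.

```python
import random

random.seed(0)  # reproducible sampling


def simulated_rate(g_na, g_kdr):
    """Toy firing-rate surrogate (Hz) as a function of two conductances."""
    return 40.0 * g_na - 25.0 * g_kdr


def build_population(target, tol, n=5000):
    """Keep conductance combinations whose response is within tol of target."""
    kept = []
    for _ in range(n):
        g_na = random.uniform(0.0, 2.0)   # assumed conductance range (mS/cm2)
        g_kdr = random.uniform(0.0, 2.0)
        if abs(simulated_rate(g_na, g_kdr) - target) <= tol:
            kept.append((g_na, g_kdr))
    return kept


population = build_population(target=20.0, tol=2.0)
```

    Inspecting the spread of the retained combinations is what lets such studies make predictions (e.g. which channel must be present in the soma) despite the problem being underdetermined.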

  4. Implementing the WebSocket Protocol Based on Formal Modelling and Automated Code Generation

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2014-01-01

    with pragmatic annotations for automated code generation of protocol software. The contribution of this paper is an application of the approach as implemented in the PetriCode tool to obtain protocol software implementing the IETF WebSocket protocol. This demonstrates the scalability of our approach to real...

  5. AUTOMATED GIS WATERSHED ANALYSIS TOOLS FOR RUSLE/SEDMOD SOIL EROSION AND SEDIMENTATION MODELING

    Science.gov (United States)

    A comprehensive procedure for computing soil erosion and sediment delivery metrics has been developed using a suite of automated Arc Macro Language (AML ) scripts and a pair of processing- intensive ANSI C++ executable programs operating on an ESRI ArcGIS 8.x Workstation platform...

  6. Verification of AADL Models with Timed Abstract State Machines

    Institute of Scientific and Technical Information of China (English)

    杨志斌; 胡凯; 赵永望; 马殿富; Jean-Paul BODEVEIX

    2015-01-01

    This paper presents a formal verification method for AADL (architecture analysis and design language) models by translation to TASM (timed abstract state machines). The abstract syntax of the chosen subset of AADL and of TASM are given, and the translation rules are defined by semantic functions expressed in an ML-like metalanguage. On this basis, the AADL model verification and analysis tool AADL2TASM is designed and implemented on top of OSATE (Open Source AADL Tool Environment), providing model checking and simulation for AADL models. Finally, a case study of a spacecraft GNC (guidance, navigation and control) system is provided.

  7. Models, methods and software for distributed knowledge acquisition for the automated construction of integrated expert systems knowledge bases

    International Nuclear Information System (INIS)

    Based on an analysis of existing models, methods and means of acquiring knowledge, a base method of automated knowledge acquisition has been chosen. On the basis of this method, a new approach to integrating information acquired from knowledge sources of different typologies has been proposed, and the concept of distributed knowledge acquisition, aimed at the computerized formation of the most complete and consistent models of problem areas, has been introduced. An original algorithm for distributed knowledge acquisition from databases, based on the construction of binary decision trees, has been developed.
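    Knowledge acquisition from a database via binary decision trees can be sketched with a small ID3-style induction: at each node, split on the attribute with the highest information gain. This is a generic sketch of the technique, not the original algorithm; the attribute names and records are illustrative.

```python
import math


def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    out = 0.0
    for c in set(labels):
        p = labels.count(c) / n
        out -= p * math.log2(p)
    return out


def build_tree(rows, attrs):
    """rows: tuples of binary attribute values ending in a class label."""
    if not rows:
        return None
    labels = [r[-1] for r in rows]
    if len(set(labels)) == 1:
        return labels[0]                            # pure leaf
    if not attrs:
        return max(set(labels), key=labels.count)   # majority leaf

    def gain(a):
        parts = [[r[-1] for r in rows if r[a] == v] for v in (0, 1)]
        rem = sum(len(p) / len(rows) * entropy(p) for p in parts if p)
        return entropy(labels) - rem

    best = max(attrs, key=gain)
    rest = [a for a in attrs if a != best]
    return {best: {v: build_tree([r for r in rows if r[best] == v], rest)
                   for v in (0, 1)}}


# Toy database records: (high_vibration, high_temperature, label).
rows = [(1, 0, "fault"), (1, 1, "fault"), (0, 1, "ok"), (0, 0, "ok")]
tree = build_tree(rows, [0, 1])
```

    The resulting nested-dict tree is itself a piece of acquired knowledge: an explicit rule ("high vibration implies fault" in this toy data) extracted automatically from the records.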

  8. Work Practice Simulation of Complex Human-Automation Systems in Safety Critical Situations: The Brahms Generalized Überlingen Model

    Science.gov (United States)

    Clancey, William J.; Linde, Charlotte; Seah, Chin; Shafto, Michael

    2013-01-01

    The transition from the current air traffic system to the next generation air traffic system will require the introduction of new automated systems, including transferring some functions from air traffic controllers to on-board automation. This report describes a new design verification and validation (V&V) methodology for assessing aviation safety. The approach involves a detailed computer simulation of work practices that includes people interacting with flight-critical systems. The research is part of an effort to develop new modeling and verification methodologies that can assess the safety of flight-critical systems, system configurations, and operational concepts. The 2002 Überlingen mid-air collision was chosen for analysis and modeling because one of the main causes of the accident was one crew's response to a conflict between the instructions of the air traffic controller and the instructions of TCAS, an automated on-board Traffic Alert and Collision Avoidance System. It thus furnishes an example of the problem of authority versus autonomy. It provides a starting point for exploring authority/autonomy conflict in the larger system of organization, tools, and practices in which the participants' moment-by-moment actions take place. We have developed a general air traffic system model (not a specific simulation of Überlingen events), called the Brahms Generalized Überlingen Model (Brahms-GUeM). Brahms is a multi-agent simulation system that models people, tools, facilities/vehicles, and geography to simulate the current air transportation system as a collection of distributed, interactive subsystems (e.g., airports, air-traffic control towers and personnel, aircraft, automated flight systems and air-traffic tools, instruments, crew). Brahms-GUeM can be configured in different ways, called scenarios, such that anomalous events that contributed to the Überlingen accident can be modeled as functioning according to requirements or in an

  9. Automation of measurement of heights waves around a model ship; Mokeisen mawari no hako keisoku no jidoka

    Energy Technology Data Exchange (ETDEWEB)

    Ikehata, M.; Kato, M.; Yanagida, F. [Yokohama National University, Yokohama (Japan). Faculty of Engineering

    1997-10-01

    Trial fabrication and tests were performed on an instrument to automate the measurement of wave heights around a model ship. The electric wave-height measuring instrument in current use takes a long time for measurement and is therefore inefficient, while the optical image processing method has accuracy problems. Therefore, a computer-controlled system was constructed using AC servo motors to drive the X and Y axes of a traverse equipment. To automate the wave height measurement, four servo-type wave height meters were installed on a rack moving in the lateral (Y-axial) direction so that four wave heights can be measured automatically at once. Wave heights can be measured continuously by moving the rack at a constant speed, verifying that wave shapes in longitudinal cross sections can be acquired in a single towing. The time required for measurements with this instrument was 40 hours of net time for fixed-point measurement and 12 hours for continuous measurement, 52 hours in total. By contrast, fixed-point measurement may take up to 240 hours when the conventional all-point manual traverse equipment is used. Automating the instrument thus yielded substantial savings. Collection of wave height data will continue also on tankers and other types of ships. 2 refs., 8 figs., 1 tab.

  10. Designing Negotiating Agent for Automated Negotiations

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Traditional research in automated negotiation focuses on negotiation protocols and strategies. This paper studies automated negotiation from a new point of view: it proposes a novel concept, the negotiating agent, and discusses its significance in the construction of automated negotiation systems, with an abstract model formally described and an architecture designed that supports both goal-directed reasoning and reactive response. A communication model is proposed to construct the interaction mechanism used by negotiating agents, in which the negotiation language used by the agents is defined. The communication model and the language are defined in a way general enough to support a wide variety of market mechanisms, making them particularly suitable for flexible applications such as electronic business. The design and expression of the negotiation ontology are also discussed. On the basis of the theoretical model of the negotiating agent, the negotiating agent architecture and the negotiating agent communication model (NACM) are explicit and formal specifications for agents negotiating in an e-business environment; in particular, NACM formally and explicitly defines the negotiation language template shared among all agents. The novelty of the communication model is twofold.

  11. Program and abstracts

    International Nuclear Information System (INIS)

    Abstracts of the papers given at the conference are presented. The abstracts are arranged under sessions entitled: Theoretical Physics; Nuclear Physics; Solid State Physics; Spectroscopy; Physics Education; SANCGASS; Astronomy; Plasma Physics; Physics in Industry; Applied and General Physics

  12. Solving the AI Planning Plus Scheduling Problem Using Model Checking via Automatic Translation from the Abstract Plan Preparation Language (APPL) to the Symbolic Analysis Laboratory (SAL)

    Science.gov (United States)

    Butler, Ricky W.; Munoz, Cesar A.; Siminiceanu, Radu I.

    2007-01-01

    This paper describes a translator from a new planning language named the Abstract Plan Preparation Language (APPL) to the Symbolic Analysis Laboratory (SAL) model checker. This translator has been developed in support of the Spacecraft Autonomy for Vehicles and Habitats (SAVH) project sponsored by the Exploration Technology Development Program, which is seeking to mature autonomy technology for the vehicles and operations centers of Project Constellation.

  13. Reification of abstract concepts to improve comprehension using interactive virtual environments and a knowledge-based design: a renal physiology model.

    Science.gov (United States)

    Alverson, Dale C; Saiki, Stanley M; Caudell, Thomas P; Goldsmith, Timothy; Stevens, Susan; Saland, Linda; Colleran, Kathleen; Brandt, John; Danielson, Lee; Cerilli, Lisa; Harris, Alexis; Gregory, Martin C; Stewart, Randall; Norenberg, Jeffery; Shuster, George; Panaoitis; Holten, James; Vergera, Victor M; Sherstyuk, Andrei; Kihmm, Kathleen; Lui, Jack; Wang, Kin Lik

    2006-01-01

    Several abstract concepts in medical education are difficult to teach and comprehend. In order to address this challenge, we have been applying the approach of reification of abstract concepts using interactive virtual environments and a knowledge-based design. Reification is the process of making abstract concepts and events, beyond the realm of direct human experience, concrete and accessible to teachers and learners. Entering virtual worlds and simulations not otherwise easily accessible provides an opportunity to create, study, and evaluate the emergence of knowledge and comprehension from the direct interaction of learners with otherwise complex abstract ideas and principles by bringing them to life. Using a knowledge-based design process and appropriate subject matter experts, knowledge structure methods are applied in order to prioritize, characterize important relationships, and create a concept map that can be integrated into the reified models that are subsequently developed. Applying these principles, our interdisciplinary team has been developing a reified model of the nephron, into which important physiologic functions can be integrated, rendered in a three-dimensional virtual environment using Flatland, a virtual environment development software tool, within which learners can interact using off-the-shelf hardware. The nephron model can be driven dynamically by a rules-based artificial intelligence engine, applying the rules and concepts developed in conjunction with the subject matter experts. In the future, the nephron model can be used to interactively demonstrate a number of physiologic principles or a variety of pathological processes that may be difficult to teach and understand. In addition, this approach to reification can be applied to a host of other physiologic and pathological concepts in other systems. These methods will require further evaluation to determine their impact and role in learning.

  14. Abstractions on test design techniques

    OpenAIRE

    Wendland, Marc-Florian

    2014-01-01

    Automated test design is an approach to test design in which automata are utilized for generating test artifacts such as test cases and test data from a formal test basis, most often called a test model. A test generator operates on such a test model to meet a certain test coverage goal. In the plethora of approaches, tools and standards for model-based test design, the test design techniques to be applied and the test coverage goals to be met are not part of the test model, which may easily le...

  15. Introduction to abstract algebra

    CERN Document Server

    Nicholson, W Keith

    2012-01-01

    Praise for the Third Edition ". . . an expository masterpiece of the highest didactic value that has gained additional attractivity through the various improvements . . ."-Zentralblatt MATH The Fourth Edition of Introduction to Abstract Algebra continues to provide an accessible approach to the basic structures of abstract algebra: groups, rings, and fields. The book's unique presentation helps readers advance to abstract theory by presenting concrete examples of induction, number theory, integers modulo n, and permutations before the abstract structures are defined. Readers can immediately be

  16. Abstraction and Consolidation

    Science.gov (United States)

    Monaghan, John; Ozmantar, Mehmet Fatih

    2006-01-01

    The framework for this paper is a recently developed theory of abstraction in context. The paper reports on data collected from one student working on tasks concerned with absolute value functions. It examines the relationship between mathematical constructions and abstractions. It argues that an abstraction is a consolidated construction that can…

  17. Remarks on abstract Galois theory

    Directory of Open Access Journals (Sweden)

    Newton C. A. da Costa

    2011-06-01

    Full Text Available This paper is a historical companion to a previous one, in which the so-called abstract Galois theory, as formulated by the Portuguese mathematician José Sebastião e Silva, was studied (see da Costa and Rodrigues, 2007). Our purpose is to present some applications of abstract Galois theory to higher-order model theory, to discuss Silva's notion of expressibility, and to outline a classical Galois theory that can be obtained inside the two versions of the abstract theory, those of Mark Krasner and of Silva. Some comments are made on the universal theory of (set-theoretic) structures.

  18. Modeling nurses' attitude toward using automated unit-based medication storage and distribution systems: an extension of the technology acceptance model.

    Science.gov (United States)

    Escobar-Rodríguez, Tomás; Romero-Alonso, María Mercedes

    2013-05-01

    This article analyzes the attitude of nurses toward the use of automated unit-based medication storage and distribution systems and identifies influencing factors. Understanding these factors provides an opportunity to explore actions that might be taken to boost adoption by potential users. The theoretical grounding for this research is the Technology Acceptance Model. The Technology Acceptance Model specifies the causal relationships between perceived usefulness, perceived ease of use, attitude toward using, and actual usage behavior. The research model has six constructs, and nine hypotheses were generated from connections between these six constructs. These constructs include perceived risks, experience level, and training. The findings indicate that these three external variables are related to the perceived ease of use and perceived usefulness of automated unit-based medication storage and distribution systems, and therefore, they have a significant influence on attitude toward the use of these systems.

  19. NeuroGPS: automated localization of neurons for brain circuits using L1 minimization model

    OpenAIRE

    Quan, Tingwei; Zheng, Ting; Yang, Zhongqing; Ding, Wenxiang; Li, Shiwei; Li, Jing; Zhou, Hang; Luo, Qingming; Gong, Hui; Zeng, Shaoqun

    2013-01-01

    Drawing the map of neuronal circuits at microscopic resolution is important for explaining how the brain works. Recent progress in fluorescence labeling and imaging techniques has enabled imaging the whole brain of a rodent, such as a mouse, at submicron resolution. Considering the huge volume of such datasets, automatically tracing and reconstructing the neuronal connections from the image stacks is essential for assembling large-scale circuits. However, the first step among these, automated localization of the soma...

  20. Automated Kinematic Modelling of Warped Galaxy Discs in Large Hi Surveys: 3D Tilted Ring Fitting of HI Emission Cubes

    CERN Document Server

    Kamphuis, P; Oh, S- H; Spekkens, K; Urbancic, N; Serra, P; Koribalski, B S; Dettmar, R -J

    2015-01-01

    Kinematical parameterisations of disc galaxies, employing emission line observations, are indispensable tools for studying the formation and evolution of galaxies. Future large-scale HI surveys will resolve the discs of many thousands of galaxies, allowing a statistical analysis of their disc and halo kinematics, mass distribution and dark matter content. Here we present an automated procedure which fits tilted-ring models to HI data cubes of individual, well-resolved galaxies. The method builds on the 3D Tilted Ring Fitting Code (TiRiFiC) and is called FAT (Fully Automated TiRiFiC). To assess the accuracy of the code we apply it to a set of 52 artificial galaxies and 25 real galaxies from the Local Volume HI Survey (LVHIS). Using LVHIS data, we compare our 3D modelling to the 2D modelling methods DiskFit and rotcur. A conservative result is that FAT accurately models the kinematics and the morphologies of galaxies with an extent of eight beams across the major axis in the inclination range 20°-90°…
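A tilted-ring model describes a disc as concentric rings, each with its own inclination, position angle and rotation speed; the observed line-of-sight velocity at a sky position follows from projecting the ring's rotation. The sketch below evaluates that projection for a single ring under one common sign and angle convention; it illustrates the geometry only and is not code from TiRiFiC or FAT.

```python
import math

def ring_velocity(v_sys, v_rot, inc_deg, pa_deg, x, y):
    """Line-of-sight velocity at sky position (x, y) for one tilted ring.

    v_sys: systemic velocity; v_rot: ring rotation speed;
    inc_deg: inclination; pa_deg: position angle.
    The coordinate/angle convention here is an illustrative assumption.
    """
    inc = math.radians(inc_deg)
    pa = math.radians(pa_deg)
    # Rotate sky coordinates into the ring plane frame.
    xr = -x * math.sin(pa) + y * math.cos(pa)
    yr = -(x * math.cos(pa) + y * math.sin(pa)) / math.cos(inc)
    r = math.hypot(xr, yr)
    if r == 0:
        return v_sys
    cos_theta = xr / r  # azimuthal angle in the ring plane
    return v_sys + v_rot * math.sin(inc) * cos_theta
```

A face-on ring (inclination 0°) projects no rotation, so every position returns the systemic velocity; an edge-on ring sampled on its major axis returns the full rotation speed added to it.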

  1. Abstracts and program proceedings of the 1994 meeting of the International Society for Ecological Modelling North American Chapter

    Energy Technology Data Exchange (ETDEWEB)

    Kercher, J.R.

    1994-06-01

    This document contains information about the 1994 meeting of the International Society for Ecological Modelling North American Chapter. The topics discussed include: extinction risk assessment modelling, ecological risk analysis of uranium mining, impacts of pesticides, demography, habitats, atmospheric deposition, and climate change.

  2. (abstract) A Test of the Theoretical Models of Bipolar Outflows: The Bipolar Outflow in Mon R2

    Science.gov (United States)

    Xie, Taoling; Goldsmith, Paul; Patel, Nimesh

    1993-01-01

    We report some results of a study of the massive bipolar outflow in the central region of the relatively nearby giant molecular cloud Monoceros R2. We make a quantitative comparison of our results with the Shu et al. outflow model, which incorporates a radially directed wind sweeping up the ambient material into a shell. We find that this simple model naturally explains the shape of this thin shell. Although Shu's model in its simplest form predicts, with reasonable parameters, too much mass at very small polar angles, as previously pointed out by Masson and Chernin, it provides a reasonably good fit to the mass distribution at larger polar angles. It is possible that this discrepancy is due to inhomogeneities of the ambient molecular gas, which are not considered by the model. We also discuss the constraints imposed by these results on recent jet-driven outflow models.

  3. An automated approach for extracting Barrier Island morphology from digital elevation models

    Science.gov (United States)

    Wernette, Phillipe; Houser, Chris; Bishop, Michael P.

    2016-06-01

    The response and recovery of a barrier island to extreme storms depends on the elevation of the dune base and crest, both of which can vary considerably alongshore and through time. Quantifying the response to and recovery from storms requires that we can first identify and differentiate the dune(s) from the beach and back-barrier, which in turn depends on accurate identification and delineation of the dune toe, crest and heel. The purpose of this paper is to introduce a multi-scale automated approach for extracting beach, dune (dune toe, dune crest and dune heel), and barrier island morphology. The automated approach introduced here extracts the shoreline and back-barrier shoreline based on elevation thresholds, and extracts the dune toe, dune crest and dune heel based on the average relative relief (RR) across multiple spatial scales of analysis. The multi-scale automated RR approach to extracting dune toe, dune crest, and dune heel based upon relative relief is more objective than traditional approaches because every pixel is analyzed across multiple computational scales and the identification of features is based on the calculated RR values. The RR approach out-performed contemporary approaches and represents a fast objective means to define important beach and dune features for predicting barrier island response to storms. The RR method also does not require that the dune toe, crest, or heel are spatially continuous, which is important because dune morphology is likely naturally variable alongshore.
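The relative relief (RR) idea can be illustrated on a one-dimensional transect: each cell's elevation is placed within the local minimum-maximum range of a moving window, and the values are averaged over several window sizes. This is a hedged sketch of one common RR definition; the paper's exact operator, window scales, and 2-D implementation may differ.

```python
import numpy as np

def relative_relief(dem, window):
    """Relative relief of each cell: the cell's elevation expressed as a
    fraction of the local elevation range within a (2*window+1)-cell
    neighborhood. 1-D transect for brevity; the idea extends to grids."""
    n = len(dem)
    rr = np.zeros(n)
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        zmin, zmax = dem[lo:hi].min(), dem[lo:hi].max()
        rr[i] = 0.0 if zmax == zmin else (dem[i] - zmin) / (zmax - zmin)
    return rr

def mean_relative_relief(dem, windows=(2, 4, 8)):
    """Average RR across several window sizes (the multi-scale step)."""
    return np.mean([relative_relief(dem, w) for w in windows], axis=0)
```

On a monotonically rising transect, RR is 0 at the low end, 1 at the high end, and near 0.5 in between; dune crests stand out as persistent high-RR cells across scales.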

  4. Automated data evaluation and modelling of simultaneous 19F-1H medium-resolution NMR spectra for online reaction monitoring.

    Science.gov (United States)

    Zientek, Nicolai; Laurain, Clément; Meyer, Klas; Paul, Andrea; Engel, Dirk; Guthausen, Gisela; Kraume, Matthias; Maiwald, Michael

    2016-06-01

    Medium-resolution nuclear magnetic resonance spectroscopy (MR-NMR) is currently developing into an important analytical tool for both quality control and process monitoring. In contrast to high-resolution online NMR (HR-NMR), MR-NMR can be operated under rough environmental conditions. A continuous re-circulating stream of reaction mixture from the reaction vessel to the NMR spectrometer enables a non-invasive, volume-integrating online analysis of reactants and products. Here, we investigate the esterification of 2,2,2-trifluoroethanol with acetic acid to 2,2,2-trifluoroethyl acetate both by 1H HR-NMR (500 MHz) and by 1H and 19F MR-NMR (43 MHz) as a model system. The parallel online measurement is realised by splitting the flow, which allows the adjustment of quantitative and independent flow rates both in the HR-NMR probe and in the MR-NMR probe, in addition to a fast bypass line back to the reactor. One of the fundamental acceptance criteria for online MR-NMR spectroscopy is a robust data treatment and evaluation strategy with the potential for automation. The MR-NMR spectra are treated by an automated baseline and phase correction using the minimum entropy method. The evaluation strategies comprise (i) direct integration, (ii) automated line fitting, (iii) indirect hard modelling (IHM) and (iv) partial least squares regression (PLS-R). To assess the potential of these evaluation strategies for MR-NMR, prediction results are compared with the line fitting data derived from quantitative HR-NMR spectroscopy. Although superior results are obtained from both IHM and PLS-R for 1H MR-NMR, the latter in particular demands elaborate data pretreatment, whereas the IHM models needed no previous alignment. Copyright © 2015 John Wiley & Sons, Ltd. PMID:25854892
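Automated zero-order phase correction by entropy minimization can be sketched as follows: try candidate phase angles and keep the one whose phased real part has the lowest Shannon entropy of its normalized absolute first derivative. This is a simplified illustration of the minimum-entropy idea, not the authors' implementation; production routines also fit a first-order (frequency-dependent) phase term and a baseline.

```python
import numpy as np

def spectral_entropy(phased_real):
    """Shannon entropy of the normalized absolute first derivative of the
    real part -- a common objective for automated phasing. Assumes the
    derivative is not identically zero."""
    h = np.abs(np.diff(phased_real))
    p = h / h.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def min_entropy_phase(spectrum, n_steps=360):
    """Grid-search a zero-order phase angle minimizing spectral entropy.
    `spectrum` is a complex array (absorption + i*dispersion)."""
    angles = np.linspace(0, 2 * np.pi, n_steps, endpoint=False)
    best = min(angles,
               key=lambda a: spectral_entropy((spectrum * np.exp(-1j * a)).real))
    return best, spectrum * np.exp(-1j * best)
```

For a complex Lorentzian line dephased by a known angle, the search recovers that angle (up to the sign ambiguity of pi, since a negated absorption peak has identical derivative entropy).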

  5. Library Automation

    OpenAIRE

    Dhakne, B. N.; Giri, V. V; Waghmode, S. S.

    2010-01-01

    New technologies provide the library with several new materials, media, and modes of storing and communicating information. Library automation reduces the drudgery of repeated manual efforts in library routines, covering collection, storage, administration, processing, preservation, communication, etc.

  6. An initial abstraction and constant loss model, and methods for estimating unit hydrographs, peak streamflows, and flood volumes for urban basins in Missouri

    Science.gov (United States)

    Huizinga, Richard J.

    2014-01-01

    Streamflow data, basin characteristics, and rainfall data from 39 streamflow-gaging stations for urban areas in and adjacent to Missouri were used by the U.S. Geological Survey in cooperation with the Metropolitan Sewer District of St. Louis to develop an initial abstraction and constant loss model (a time-distributed basin-loss model) and a gamma unit hydrograph (GUH) for urban areas in Missouri. Study-specific methods to determine peak streamflow and flood volume for a given rainfall event also were developed.
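An initial-abstraction/constant-loss model is simple to state: no excess rainfall is produced until cumulative rainfall has satisfied the initial abstraction, after which a constant loss rate is deducted from each interval's rainfall. The sketch below illustrates that bookkeeping; the parameter values and units are illustrative assumptions, not those calibrated in the study.

```python
def excess_rainfall(rain, ia, loss_rate, dt=1.0):
    """Initial-abstraction/constant-loss model.

    rain: rainfall depth per interval (e.g. inches); ia: initial
    abstraction (same units); loss_rate: constant loss per hour;
    dt: interval length in hours. Returns excess depth per interval.
    """
    remaining_ia = ia
    excess = []
    for p in rain:
        absorbed = min(p, remaining_ia)   # fill the initial abstraction first
        remaining_ia -= absorbed
        p_after = p - absorbed
        excess.append(max(0.0, p_after - loss_rate * dt))  # then constant loss
    return excess
```

The excess hyetograph produced this way is what would then be convolved with a unit hydrograph (such as the study's gamma unit hydrograph) to obtain the runoff hydrograph.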

  7. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Managemen

  8. Complex System Model Based on the Theory of Abstract Algebra

    Institute of Scientific and Technical Information of China (English)

    李顺勇; 赵深淼

    2011-01-01

    The complex system model is widely used in ecology, traditional Chinese medicine, physics, chemistry, and other subjects. Here the complex system model is studied using the theory of abstract algebra. First, the definitions given by Zhang Yingshan of the complex system model and its three relations (the equivalence relation, transition relation, and atavism relation) are reviewed. Then several lemmas are stated and proved on the basis of abstract algebra. Finally, an equivalent definition of the complex system model is given.
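The equivalence relation invoked in this record can be illustrated with a finite check of the three defining properties (reflexivity, symmetry, transitivity). This is a generic abstract-algebra sketch, not the paper's construction over complex-system models.

```python
from itertools import product

def is_equivalence(elements, rel):
    """Check that the predicate `rel` is an equivalence relation on a
    finite set of elements: reflexive, symmetric, and transitive."""
    elements = list(elements)
    if not all(rel(a, a) for a in elements):                      # reflexive
        return False
    if not all(rel(b, a)                                          # symmetric
               for a, b in product(elements, repeat=2) if rel(a, b)):
        return False
    return all(rel(a, c)                                          # transitive
               for a, b, c in product(elements, repeat=3)
               if rel(a, b) and rel(b, c))
```

Congruence modulo 3, for instance, passes the check, while the ordering relation "less than or equal to" fails on symmetry.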

  9. Process automation

    International Nuclear Information System (INIS)

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  10. Extending and applying active appearance models for automated, high precision segmentation in different image modalities

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Fisker, Rune; Ersbøll, Bjarne Kjær

    2001-01-01

    … object class description, which can be employed to rapidly search images for new object instances. The proposed extensions concern enhanced shape representation, handling of homogeneous and heterogeneous textures, refinement optimization using Simulated Annealing and robust statistics. Finally, an initialization scheme is designed, thus making the usage of AAMs fully automated. Using these extensions it is demonstrated that AAMs can segment bone structures in radiographs, pork chops in perspective images and the left ventricle in cardiovascular magnetic resonance images in a robust, fast and accurate manner.

  11. A knowledge- and model-based system for automated weaning from mechanical ventilation: technical description and first clinical application.

    Science.gov (United States)

    Schädler, Dirk; Mersmann, Stefan; Frerichs, Inéz; Elke, Gunnar; Semmel-Griebeler, Thomas; Noll, Oliver; Pulletz, Sven; Zick, Günther; David, Matthias; Heinrichs, Wolfgang; Scholz, Jens; Weiler, Norbert

    2014-10-01

    To describe the principles and the first clinical application of a novel prototype automated weaning system called Evita Weaning System (EWS). EWS allows an automated control of all ventilator settings in pressure controlled and pressure support mode with the aim of decreasing the respiratory load of mechanical ventilation. Respiratory load takes inspired fraction of oxygen, positive end-expiratory pressure, pressure amplitude and spontaneous breathing activity into account. Spontaneous breathing activity is assessed by the number of controlled breaths needed to maintain a predefined respiratory rate. EWS was implemented as a knowledge- and model-based system that autonomously and remotely controlled a mechanical ventilator (Evita 4, Dräger Medical, Lübeck, Germany). In a selected case study (n = 19 patients), ventilator settings chosen by the responsible physician were compared with the settings 10 min after the start of EWS and at the end of the study session. Neither unsafe ventilator settings nor failure of the system occurred. All patients were successfully transferred from controlled ventilation to assisted spontaneous breathing in a mean time of 37 ± 17 min (± SD). Early settings applied by the EWS did not significantly differ from the initial settings, except for the fraction of oxygen in inspired gas. During the later course, EWS significantly modified most of the ventilator settings and reduced the imposed respiratory load. A novel prototype automated weaning system was successfully developed. The first clinical application of EWS revealed that its operation was stable, safe ventilator settings were defined and the respiratory load of mechanical ventilation was decreased.

  12. Nuclear medicine. Abstracts; Nuklearmedizin 2000. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2000-07-01

    This issue of the journal contains the abstracts of the 183 conference papers as well as the 266 posters presented at the conference. Subject fields covered are: neurology, psychology, oncology, pediatrics, radiopharmacy, endocrinology, EDP, measuring equipment and methods, radiological protection, cardiology, and therapy. (orig./CB)

  13. Automation of block assignment planning using a diagram-based scenario modeling method

    Directory of Open Access Journals (Sweden)

    Hwang In Hyuck

    2014-03-01

    Full Text Available Most shipbuilding scheduling research so far has focused on the load level on the dock plan. This is because the dock is the least extendable resource in shipyards, and its overloading is difficult to resolve. However, once dock scheduling is completed, making a plan that makes the best use of the rest of the resources in the shipyard to minimize any additional cost is also important. Block assignment planning is one of the midterm planning tasks; it assigns a block to the facility (factory/shop or surface plate) that will actually manufacture the block according to the block characteristics and the current situation of the facility. It is one of the most heavily loaded midterm planning tasks and is carried out manually by experienced workers. In this study, a method of representing the block assignment rules using a diagram was suggested through analysis of the existing manual process. A block allocation program was developed which automated the block assignment process according to the rules represented by the diagram. The planning scenario was validated through a case study that compared the manual assignment and two automated block assignment results.

  14. Automated delineation of karst sinkholes from LiDAR-derived digital elevation models

    Science.gov (United States)

    Wu, Qiusheng; Deng, Chengbin; Chen, Zuoqi

    2016-08-01

    Sinkhole mapping is critical for understanding hydrological processes and mitigating geological hazards in karst landscapes. Current methods for identifying sinkholes are primarily based on visual interpretation of low-resolution topographic maps and aerial photographs with subsequent field verification, which is labor-intensive and time-consuming. The increasing availability of high-resolution LiDAR-derived digital elevation data allows for an entirely new level of detailed delineation and analyses of small-scale geomorphologic features and landscape structures at fine scales. In this paper, we present a localized contour tree method for automated extraction of sinkholes in karst landscapes. One significant advantage of our automated approach for sinkhole extraction is that it may reduce inconsistencies and alleviate repeatability concerns associated with visual interpretation methods. In addition, the proposed method has contributed to improving the sinkhole inventory in several ways: (1) detection of non-inventoried sinkholes; (2) identification of previously inventoried sinkholes that have been filled; (3) delineation of sinkhole boundaries; and (4) characterization of sinkhole morphometric properties. We applied the method to Fillmore County in southeastern Minnesota, USA, and identified three times as many sinkholes as the existing database for the same area. The results suggest that the previous visual interpretation method might significantly underestimate the number of potential sinkholes in the region. Our method holds great potential for creating and updating sinkhole inventory databases at a regional scale in a timely manner.
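Although the paper uses a localized contour tree method, a closely related standard preprocessing technique, priority-flood depression filling, conveys the core idea: closed depressions such as sinkholes are exactly the cells whose filled elevation exceeds the original surface. The sketch below is a simplified illustration of that technique, not the authors' algorithm.

```python
import heapq

def fill_depressions(dem):
    """Priority-flood depression filling on a small 2-D grid (list of
    lists). Each interior cell is raised to its lowest spill elevation;
    cells where (filled - original) > 0 lie inside closed depressions."""
    rows, cols = len(dem), len(dem[0])
    filled = [[None] * cols for _ in range(rows)]
    heap = []
    # Seed the priority queue with the grid boundary (assumed drainable).
    for r in range(rows):
        for c in range(cols):
            if r in (0, rows - 1) or c in (0, cols - 1):
                filled[r][c] = dem[r][c]
                heapq.heappush(heap, (dem[r][c], r, c))
    # Grow inward from the lowest known spill elevation.
    while heap:
        z, r, c = heapq.heappop(heap)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and filled[nr][nc] is None:
                filled[nr][nc] = max(dem[nr][nc], z)
                heapq.heappush(heap, (filled[nr][nc], nr, nc))
    return filled
```

A pit fully enclosed by higher ground is raised to the rim elevation, while a pit with a low outlet on the boundary is raised only to that outlet's level.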

  15. Agenda, extended abstracts, and bibliographies for a workshop on Deposit modeling, mineral resources assessment, and their role in sustainable development

    Science.gov (United States)

    Briskey, Joseph A., (Edited By); Schulz, Klaus J.

    2002-01-01

    Global demand for mineral resources continues to increase because of increasing global population and the desire and efforts to improve living standards worldwide. The ability to meet this growing demand for minerals is affected by the concerns about possible environmental degradation associated with minerals production and by competing land uses. Informed planning and decisions concerning sustainability and resource development require a long-term perspective and an integrated approach to land-use, resource, and environmental management worldwide. This, in turn, requires unbiased information on the global distribution of identified and especially undiscovered resources, the economic and political factors influencing their development, and the potential environmental consequences of their exploitation. The purpose of the IGC workshop is to review the state-of-the-art in mineral-deposit modeling and quantitative resource assessment and to examine their role in the sustainability of mineral use. The workshop will address such questions as: Which of the available mineral-deposit models and assessment methods are best suited for predicting the locations, deposit types, and amounts of undiscovered nonfuel mineral resources remaining in the world? What is the availability of global geologic, mineral deposit, and mineral-exploration information? How can mineral-resource assessments be used to address economic and environmental issues? Presentations will include overviews of assessment methods used in previous national and other small-scale assessments of large regions as well as resulting assessment products and their uses.

  16. Completeness of Lyapunov Abstraction

    DEFF Research Database (Denmark)

    Wisniewski, Rafal; Sloth, Christoffer

    2013-01-01

    This paper addresses the generation of complete abstractions of polynomial dynamical systems by timed automata. For the proposed abstraction, the state space is divided into cells by sublevel sets of functions. We identify a relation between these functions and their directional derivatives along...

  17. Designing for Mathematical Abstraction

    Science.gov (United States)

    Pratt, Dave; Noss, Richard

    2010-01-01

    Our focus is on the design of systems (pedagogical, technical, social) that encourage mathematical abstraction, a process we refer to as "designing for abstraction." In this paper, we draw on detailed design experiments from our research on children's understanding about chance and distribution to re-present this work as a case study in designing…

  18. SWAT and River-2D Modelling of Pinder River for Analysing Snow Trout Habitat under Different Flow Abstraction Scenarios

    Science.gov (United States)

    Nale, J. P.; Gosain, A. K.; Khosa, R.

    2015-12-01

    Pinder River, one of the major headstreams of River Ganga, originates in the Pindari Glaciers of the Kumaon Himalayas and, after passing through rugged gorges, meets the Alaknanda at Karanprayag, forming one of the five celestial confluences of the Upper Ganga region. While other sub-basins of the Upper Ganga are facing severe ecological losses, the Pinder basin is still in its virginal state and is well known for its beautiful valleys besides being host to unique and rare biodiversity. A proposed 252 MW run-of-river hydroelectric project at Devsari on this river has been a major concern on account of its perceived potential for egregious environmental and social impacts. In this context, the study presented tries to analyse the expected changes in aquatic habitat conditions after this project is operational (with different operation policies). The SWAT hydrological modelling platform has been used to derive stream flow simulations under various scenarios ranging from the present to the likely future conditions. To analyse the habitat conditions, a two-dimensional hydraulic-habitat model, 'River-2D', a module of the iRIC software, is used. Snow trout has been identified as the target keystone species, and its habitat preferences, in the form of flow depths, flow velocity and substrate condition, are obtained from diverse sources of related literature and are provided as Habitat Suitability Indices to River-2D. Bed morphology constitutes an important River-2D input and has been obtained, for the designated 1 km long study reach of the Pinder up to Karanprayag, from a combination of actual field observations supplemented by SRTM 1 Arc-Second Global digital elevation data. Monthly Weighted Usable Area for three different life stages (spawning, juvenile and adult) of snow trout are obtained corresponding to seven different flow discharges ranging from 10 cumec to 1000 cumec. Comparing the present and proposed future river flow conditions obtained from SWAT modelling, losses in Weighted Usable Area for the…
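Weighted Usable Area combines the area of each computational cell with habitat suitability indices for depth, velocity, and substrate. The sketch below shows that aggregation with purely hypothetical suitability functions; the study's actual indices for snow trout come from the literature and River-2D computes the hydraulics.

```python
def weighted_usable_area(cells, depth_si, vel_si, sub_si):
    """Weighted Usable Area: sum over cells of area times the composite
    suitability, taken here as the product of the depth, velocity, and
    substrate suitability indices (each in [0, 1]).

    cells: iterable of (area, depth, velocity, substrate) tuples.
    """
    wua = 0.0
    for area, depth, velocity, substrate in cells:
        wua += area * depth_si(depth) * vel_si(velocity) * sub_si(substrate)
    return wua
```

Evaluating WUA at several discharges, as in the study, amounts to recomputing the cell depths and velocities per discharge and re-running this aggregation for each life stage's suitability curves.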

  19. Abstraction of Drift Seepage

    Energy Technology Data Exchange (ETDEWEB)

    J.T. Birkholzer

    2004-11-01

    This model report documents the abstraction of drift seepage, conducted to provide seepage-relevant parameters and their probability distributions for use in Total System Performance Assessment for License Application (TSPA-LA). Drift seepage refers to the flow of liquid water into waste emplacement drifts. Water that seeps into drifts may contact waste packages and potentially mobilize radionuclides, and may result in advective transport of radionuclides through breached waste packages ["Risk Information to Support Prioritization of Performance Assessment Models" (BSC 2003 [DIRS 168796], Section 3.3.2)]. The unsaturated rock layers overlying and hosting the repository form a natural barrier that reduces the amount of water entering emplacement drifts by natural subsurface processes. For example, drift seepage is limited by the capillary barrier forming at the drift crown, which decreases or even eliminates water flow from the unsaturated fractured rock into the drift. During the first few hundred years after waste emplacement, when above-boiling rock temperatures will develop as a result of heat generated by the decay of the radioactive waste, vaporization of percolation water is an additional factor limiting seepage. Estimating the effectiveness of these natural barrier capabilities and predicting the amount of seepage into drifts is an important aspect of assessing the performance of the repository. The TSPA-LA therefore includes a seepage component that calculates the amount of seepage into drifts ["Total System Performance Assessment (TSPA) Model/Analysis for the License Application" (BSC 2004 [DIRS 168504], Section 6.3.3.1)]. The TSPA-LA calculation is performed with a probabilistic approach that accounts for the spatial and temporal variability and inherent uncertainty of seepage-relevant properties and processes. Results are used for subsequent TSPA-LA components that may handle, for example, waste package

  20. Automation in organizations: Eternal conflict

    Science.gov (United States)

    Dieterly, D. L.

    1981-01-01

    Some ideas on and insights into the problems associated with automation in organizations are presented with emphasis on the concept of automation, its relationship to the individual, and its impact on system performance. An analogy is drawn, based on an American folk hero, to emphasize the extent of the problems encountered when dealing with automation within an organization. A model is proposed to focus attention on a set of appropriate dimensions. The function allocation process becomes a prominent aspect of the model. The current state of automation research is mentioned in relation to the ideas introduced. Proposed directions for an improved understanding of automation's effect on the individual's efficiency are discussed. The importance of understanding the individual's perception of the system in terms of the degree of automation is highlighted.

  1. Abstract Milling with Turn Costs

    CERN Document Server

    Fellows, M; Knauer, C; Paul, C; Rosamond, F; Whitesides, S; Yu, N

    2009-01-01

    The Abstract Milling problem is a natural and quite general graph-theoretic model for geometric milling problems. Given a graph, one asks for a walk that covers all its vertices with a minimum number of turns, as specified in the graph model by a 0/1 turn-cost function f_x at each vertex x giving, for each ordered pair of edges (e,f) incident at x, the turn cost at x of a walk that enters the vertex on edge e and departs on edge f. We describe an initial study of the parameterized complexity of the problem. Our main positive result shows that Abstract Milling, parameterized by the number of turns, treewidth, and maximum degree, is fixed-parameter tractable. We also show that Abstract Milling, parameterized by (only) the number of turns and the pathwidth, is hard for W[1] -- one of the few parameterized intractability results for bounded pathwidth.
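The turn-cost objective described above can be made concrete with a toy instance; the graph, walk, and cost table below are hypothetical:

```python
# Toy instance of the abstract milling objective: total turn cost of a walk,
# given a per-vertex 0/1 turn-cost table.

def walk_turn_cost(walk_edges, turncost):
    """walk_edges: consecutive directed edges (u, v) sharing endpoints.
    turncost: vertex -> {(incoming_edge, outgoing_edge): 0 or 1}."""
    total = 0
    for e_in, e_out in zip(walk_edges, walk_edges[1:]):
        x = e_in[1]                      # vertex where the walk may turn
        assert e_out[0] == x, "walk must be connected"
        total += turncost.get(x, {}).get((e_in, e_out), 0)
    return total

# Path a -> b -> c, where passing through b on these edges costs one turn.
walk = [("a", "b"), ("b", "c")]
cost_table = {"b": {(("a", "b"), ("b", "c")): 1}}
print(walk_turn_cost(walk, cost_table))  # 1
```

Minimizing this total over all covering walks is the optimization problem whose parameterized complexity the paper studies.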

  2. Efficient abstraction selection in reinforcement learning

    NARCIS (Netherlands)

    Seijen, H. van; Whiteson, S.; Kester, L.

    2013-01-01

    This paper introduces a novel approach for abstraction selection in reinforcement learning problems modelled as factored Markov decision processes (MDPs), for which a state is described via a set of state components. In abstraction selection, an agent must choose an abstraction from a set of candida

  3. A high throughput MATLAB program for automated force-curve processing using the AdG polymer model.

    Science.gov (United States)

    O'Connor, Samantha; Gaddis, Rebecca; Anderson, Evan; Camesano, Terri A; Burnham, Nancy A

    2015-02-01

    Research in understanding biofilm formation is dependent on accurate and representative measurements of the steric forces related to the brush on bacterial surfaces. A MATLAB program to analyze force curves from an AFM efficiently, accurately, and with minimal user bias has been developed. The analysis is based on a modified version of the Alexander and de Gennes (AdG) polymer model, which is a function of equilibrium polymer brush length, probe radius, temperature, separation distance, and a density variable. Automating the analysis reduces the amount of time required to process 100 force curves from several days to less than 2 min. The use of this program to crop and fit force curves to the AdG model will allow researchers to ensure proper processing of large amounts of experimental data and reduce the time required for analysis and comparison of data, thereby enabling higher quality results in a shorter period of time. PMID:25448021
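As an illustration of this kind of automated force-curve fitting, the widely used exponential approximation of the AdG steric force, F ≈ 50 kT R L0 Γ^(3/2) exp(−2πD/L0), can stand in for the paper's modified model; the probe radius, parameter scaling, and synthetic data below are all assumptions:

```python
import numpy as np
from scipy.optimize import curve_fit

KT = 1.38e-23 * 298        # thermal energy at ~25 C (J)
R = 20e-9                  # assumed probe radius (m)

def adg_force(D_nm, L0_nm, gamma16):
    """Steric force (N); D and brush length L0 in nm, grafting density in 1e16 m^-2."""
    return (50 * KT * R * (L0_nm * 1e-9) * (gamma16 * 1e16) ** 1.5
            * np.exp(-2 * np.pi * D_nm / L0_nm))

# Synthetic "measured" curve with 5 % noise, then recover the parameters.
rng = np.random.default_rng(0)
D = np.linspace(5, 60, 80)                                  # separations (nm)
noisy = adg_force(D, 30, 10) * (1 + 0.05 * rng.standard_normal(D.size))
(L0_fit, g_fit), _ = curve_fit(adg_force, D, noisy, p0=(20, 5))
print(f"L0 = {L0_fit:.1f} nm, Gamma = {g_fit * 1e16:.2e} m^-2")
```

Batch-applying such a fit to hundreds of curves, with automated cropping and consistent initial guesses, is what turns days of manual work into minutes.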

  5. Model-based analysis of an automated changeover switching unit for a busbar. MODSAFE 2009 work report

    Energy Technology Data Exchange (ETDEWEB)

    Bjorkman, K.; Valkonen, J.; Ranta, J.

    2011-06-15

    Verification of digital instrumentation and control (I and C) systems is challenging, because programmable logic controllers enable complicated control functions and the state spaces (the number of distinct values of inputs, outputs, and internal memory) of the designs easily become too large for comprehensive manual inspection. Model checking is a promising formal method that can be used for verifying the correctness of system designs. A number of efficient model checking systems are available, offering analysis tools that can determine automatically whether a given state machine model satisfies the desired safety properties. Model checking can also handle delays and other time-related operations, which are crucial in safety I and C systems and challenging to design and verify. The system analysed in this research project is called 'automated changeover switching unit for a busbar'; its purpose is to switch the power feed to the stand-by power supply in the event of voltage breaks. The system is modelled as a finite state machine and some of its key properties are verified with the NuSMV model checking tool. The time-dependent components are modelled to operate in discrete fixed-length time steps, and the lengths of the timed functions are scaled to avoid state explosion and enable efficient model checking. (orig.)
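What a model checker such as NuSMV automates can be sketched as an exhaustive reachability check over a finite state machine. The toy changeover model below (discrete time steps, an assumed delay of three steps before switchover) is invented for illustration and is far simpler than the actual unit:

```python
DELAY = 3  # assumed number of fixed-length time steps before switchover

def step(state, mains_ok):
    """state = (timer, source); source is 'mains' or 'standby'."""
    timer, source = state
    if mains_ok:
        return (0, "mains")
    timer = min(timer + 1, DELAY)
    return (timer, "standby" if timer >= DELAY else source)

def reachable_states(init):
    """Exhaustively explore every reachable state under all input sequences."""
    seen, frontier = {init}, [init]
    while frontier:
        s = frontier.pop()
        for mains_ok in (True, False):
            t = step(s, mains_ok)
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return seen

states = reachable_states((0, "mains"))
# Safety property: once the timer has expired, the feed must come from standby.
assert all(src == "standby" for timer, src in states if timer >= DELAY)
print(f"{len(states)} reachable states, property holds")
```

Scaling timed functions to coarse discrete steps, as the report describes, keeps this reachable set small enough to enumerate; real checkers express the same property in temporal logic and explore the state space symbolically.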

  6. DSL development based on target meta-models. Using AST transformations for automating semantic analysis in a textual DSL framework

    CERN Document Server

    Breslav, Andrey

    2008-01-01

    This paper describes an approach to creating textual syntax for Domain-Specific Languages (DSLs). We consider the target meta-model to be the main artifact and hence to be developed first. The key idea is to represent the analysis of textual syntax as a sequence of transformations. This is done by explicit operations on abstract syntax trees (ASTs), for which a simple language is proposed. The text-to-model transformation is divided into two parts: text-to-AST (provided by openArchitectureWare [1]) and AST-to-model (proposed by this paper). Our approach simplifies semantic analysis and helps to generate as much as possible.

  7. Toward the Development of a Fundamentally Based Chemical Model for Cyclopentanone: High-Pressure-Limit Rate Constants for H Atom Abstraction and Fuel Radical Decomposition.

    Science.gov (United States)

    Zhou, Chong-Wen; Simmie, John M; Pitz, William J; Curran, Henry J

    2016-09-15

    Theoretical aspects of the development of a chemical kinetic model for the pyrolysis and combustion of a cyclic ketone, cyclopentanone, are considered. Calculated thermodynamic and kinetic data are presented for the first time for the principal species including 2- and 3-oxo-cyclopentyl radicals, which are in reasonable agreement with the literature. These radicals can be formed via H atom abstraction reactions by Ḣ and Ö atoms and ȮH, HȮ2, and ĊH3 radicals, the rate constants of which have been calculated. Abstraction from the β-hydrogen atom is the dominant process when ȮH is involved, but the reverse holds true for HȮ2 radicals. The subsequent β-scission of the radicals formed is also determined, and it is shown that recent tunable VUV photoionization mass spectrometry experiments can be interpreted in this light. The bulk of the calculations used the composite model chemistry G4, which was benchmarked in the simplest case with a coupled cluster treatment, CCSD(T), in the complete basis set limit. PMID:27558073

  8. Computational Abstraction Steps

    DEFF Research Database (Denmark)

    Thomsen, Lone Leth; Thomsen, Bent; Nørmark, Kurt

    2010-01-01

    In this paper we discuss computational abstraction steps as a way to create class abstractions from concrete objects, and from examples. Computational abstraction steps are regarded as symmetric counterparts to computational concretisation steps, which are well-known in terms of function calls...... or capturing concrete values, objects, or actions. As the next step, some of these are lifted to a higher level by computational means. In the object-oriented paradigm the target of such steps is classes. We hypothesise that the proposed approach primarily will be beneficial to novice programmers or during...

  9. An Automated Feedback System Based on Adaptive Testing: Extending the Model

    Directory of Open Access Journals (Sweden)

    Trevor Barker

    2010-06-01

    Full Text Available Abstract—The results of the recent National Student Survey (NSS) revealed that a major problem in HE today is that of student feedback. Research carried out by members of the project team in the past has led to the development of an automated student feedback system for use with objective formative testing. This software relies on an ‘intelligent’ engine to determine the most appropriate individual feedback, based on test performance, relating not only to answers but also to Bloom’s cognitive levels. The system also recommends additional materials and challenges for each individual learner. Detailed evaluation with more than 500 students and 100 university staff has shown that the system is highly valued by learners and seen by staff as an important addition to the methods available. The software has been used on two modules so far over a two-year period

  10. Automating the development of Physical Mobile Workflows. A Model Driven Engineering approach

    OpenAIRE

    Giner Blasco, Pau

    2010-01-01

    The "Internet of Things" vision emphasizes the integration of real-world elements with Information Systems. Thanks to Automatic Identification (Auto-ID) technologies such as RFID, systems can perceive objects in the physical world. When these objects participate actively in business processes, the use of human beings as carriers of information is avoided; the number of errors is therefore reduced and the efficiency of the processes increases. Although...

  11. MATHEMATICAL MODELING, AUTOMATION AND CONTROL OF THE BIOCONVERSION OF SORBITOL TO SORBOSE IN THE VITAMIN C PRODUCTION PROCESS I. MATHEMATICAL MODELING

    Directory of Open Access Journals (Sweden)

    Bonomi A.

    1997-01-01

    Full Text Available In 1990, the Biotechnology and Control Systems Groups of IPT started developing a system for the control and automation of fermentation processes, applied to the oxidation of sorbitol to sorbose by the bacterium Gluconobacter oxydans, the microbial step of the vitamin C production process, which was chosen as a case study. Initially, a thirteen-parameter model was fitted to represent the batch operation of the system using nonlinear regression analysis (the flexible polyhedron method). Based on these results, a model for the continuous process (with the same kinetic equations) was constructed and its optimum operating point obtained
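The "flexible polyhedron" regression named above is the Nelder-Mead simplex method. A toy sketch of such a fit, using a hypothetical two-parameter logistic batch curve and synthetic data (far simpler than the thirteen-parameter IPT model):

```python
import numpy as np
from scipy.optimize import minimize

t = np.linspace(0, 10, 20)                    # time (h)
obs = 5.0 / (1 + np.exp(-1.2 * (t - 4.0)))    # synthetic product curve

def sse(p):
    """Sum of squared errors of a logistic batch model against the data."""
    mu, lag = p
    model = 5.0 / (1 + np.exp(-mu * (t - lag)))
    return np.sum((model - obs) ** 2)

# Nelder-Mead (the "flexible polyhedron") needs no derivatives, which suits
# kinetic models whose gradients are awkward to derive.
res = minimize(sse, x0=[0.5, 2.0], method="Nelder-Mead")
print(f"mu = {res.x[0]:.2f} 1/h, lag = {res.x[1]:.2f} h")
```

The derivative-free nature of the simplex search is exactly why it was a popular choice for fitting multi-parameter fermentation kinetics.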

  12. Automata Learning through Counterexample Guided Abstraction Refinement

    DEFF Research Database (Denmark)

    Aarts, Fides; Heidarian, Faranak; Kuppens, Harco;

    2012-01-01

    Abstraction is the key when learning behavioral models of realistic systems. Hence, in most practical applications where automata learning is used to construct models of software components, researchers manually define abstractions which, depending on the history, map a large set of concrete events...... are allowed. Our approach uses counterexample-guided abstraction refinement: whenever the current abstraction is too coarse and induces nondeterministic behavior, the abstraction is refined automatically. Using Tomte, a prototype tool implementing our algorithm, we have succeeded to learn – fully...
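The refinement loop described above can be sketched as follows; the integer-event mapper and the midpoint splitting heuristic are invented for illustration and are not the Tomte algorithm itself:

```python
# Counterexample-guided abstraction refinement sketch: concrete events map to
# abstract symbols; when two traces sharing an abstract symbol disagree on the
# output, the abstraction is too coarse and the symbol is split.

def make_mapper(split_points):
    """Abstract an integer event by the interval it falls in."""
    def mapper(event):
        return sum(event >= p for p in sorted(split_points))
    return mapper

def find_nondeterminism(traces, mapper):
    """traces: list of (concrete_event, output). Return a refinement point or None."""
    seen = {}
    for event, output in traces:
        sym = mapper(event)
        if sym in seen and seen[sym][1] != output:
            # Counterexample: same abstract symbol, different behavior.
            return (seen[sym][0] + event) // 2   # split between the two events
        seen.setdefault(sym, (event, output))
    return None

# Toy system whose output actually depends on whether the event is >= 10.
traces = [(3, "low"), (7, "low"), (12, "high")]
splits = []                          # start with the coarsest abstraction
mapper = make_mapper(splits)
while (p := find_nondeterminism(traces, mapper)) is not None:
    splits.append(p)                 # refine and retry
    mapper = make_mapper(splits)
print(f"refined split points: {sorted(splits)}")
```

Each detected conflict refines the abstraction just enough to restore determinism, which mirrors how the learner avoids defining the abstraction manually up front.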

  13. Modeling pilot interaction with automated digital avionics systems: Guidance and control algorithms for contour and nap-of-the-Earth flight

    Science.gov (United States)

    Hess, Ronald A.

    1990-01-01

    A collection of technical papers are presented that cover modeling pilot interaction with automated digital avionics systems and guidance and control algorithms for contour and nap-of-the-earth flight. The titles of the papers presented are as follows: (1) Automation effects in a multiloop manual control system; (2) A qualitative model of human interaction with complex dynamic systems; (3) Generalized predictive control of dynamic systems; (4) An application of generalized predictive control to rotorcraft terrain-following flight; (5) Self-tuning generalized predictive control applied to terrain-following flight; and (6) Precise flight path control using a predictive algorithm.

  14. ABSTRACTS AND KEY WORDS

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Influence of a Fermented Product from a Beneficial Microorganism on the Cultivation of Apostichopus japonicus Larvae. Li Shuang et al. (1) Abstract: The fermented product from a beneficial microorganism was applied in the seed rearing of sea cucumber. The result

  15. The deleuzian abstract machines

    DEFF Research Database (Denmark)

    Werner Petersen, Erik

    2005-01-01

    To most people the concept of abstract machines is connected to the name of Alan Turing and the development of the modern computer. The Turing machine is universal, axiomatic and symbolic (e.g., operating on symbols). Inspired by Foucault, Deleuze and Guattari extended the concept of abstract...... machines to singular, non-axiomatic and diagrammatic machines. That is: machines which constitute becomings. This presentation gives a survey of the development of the concept of abstract machines in the philosophy of Deleuze and Guattari and the function of these abstract machines in the creation of works...... of art. From Difference and Repetition to Anti-Oedipus, the machines are conceived as binary machines based on the exclusive or inclusive use respectively of the three syntheses: conexa, disjuncta and conjuncta. The machines have a twofold embedment: in the desiring-production and in the social...

  16. 2016 ACPA MEETING ABSTRACTS.

    Science.gov (United States)

    2016-07-01

    The peer-reviewed abstracts presented at the 73rd Annual Meeting of the ACPA are published as submitted by the authors. For financial conflict of interest disclosure, please visit http://meeting.acpa-cpf.org/disclosures.html. PMID:27447885

  17. Abstract sectional category

    CERN Document Server

    Diaz, F; Garcia, P; Murillo, A; Remedios, J

    2011-01-01

    We study, in an abstract axiomatic setting, the notion of sectional category of a morphism. From this, we unify and generalize known results about this invariant in different settings, and we deduce new applications.

  18. 08071 Abstracts Collection -- Scheduling

    OpenAIRE

    Jane W. S. Liu; Rolf H. Möhring; Pruhs, Kirk

    2008-01-01

    From 10.02. to 15.02., the Dagstuhl Seminar 08071 ``Scheduling'' was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first section describes the seminar topics and goals in gen...

  19. 10071 Abstracts Collection -- Scheduling

    OpenAIRE

    Albers, Susanne; Baruah, Sanjoy K; Rolf H. Möhring; Pruhs, Kirk

    2010-01-01

    From 14.02. to 19.02.2010, the Dagstuhl Seminar 10071 ``Scheduling '' was held in Schloss Dagstuhl-Leibniz Center for Informatics. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first section describes the seminar topics and goals in general. Links to ext...

  20. 07381 Abstracts Collection -- Cryptography

    OpenAIRE

    Blömer, Johannes; Boneh, Dan; Cramer, Ronald; Maurer, Ueli

    2008-01-01

    From 16.09.2007 to 21.09.2007 the Dagstuhl Seminar 07381 ``Cryptography'' was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first section describes the seminar topics and goa...

  1. Abstracts of contributed papers

    Energy Technology Data Exchange (ETDEWEB)

    1994-08-01

    This volume contains 571 abstracts of contributed papers to be presented during the Twelfth US National Congress of Applied Mechanics. Abstracts are arranged in the order in which they fall in the program -- the main sessions are listed chronologically in the Table of Contents. The Author Index is in alphabetical order and lists each paper number (matching the schedule in the Final Program) with its corresponding page number in the book.

  2. Norddesign 2012 - Book of Abstract

    DEFF Research Database (Denmark)

    has been organized in line with the original ideas. The topics mentioned in the call for abstracts were: Product Development: Integrated, Multidisciplinary, Product life oriented and Distributed. Multi-product Development. Innovation and Business Models. Engineering Design and Industrial Design...... received more than 140 abstracts and through the review process this have resulted in approximately 70 accepted papers. One of the new research fields included in this conference is the area of Biomechanics – hence the cover graphics of the conference proceedings. With this short introduction we encourage...

  3. Formal Abstractions for Automated Verification and Synthesis of Stochastic Systems

    NARCIS (Netherlands)

    Esmaeil Zadeh Soudjani, S.

    2014-01-01

    Stochastic hybrid systems involve the coupling of discrete, continuous, and probabilistic phenomena, in which the composition of continuous and discrete variables captures the behavior of physical systems interacting with digital, computational devices. Because of their versatility and generality, m

  4. Toward Automated FAÇADE Texture Generation for 3d Photorealistic City Modelling with Smartphones or Tablet Pcs

    Science.gov (United States)

    Wang, S.

    2012-07-01

    An automated model-image fitting algorithm is proposed in this paper for generating façade texture images from pictures taken by smartphones or tablet PCs. Façade texture generation requires tremendous labour and has thus been the bottleneck of 3D photo-realistic city modelling. With advanced developments in micro electro mechanical systems (MEMS), a camera, global positioning system (GPS), and gyroscope (G-sensors) can all be integrated into a smartphone or a tablet PC. These sensors bring the possibility of direct georeferencing for the pictures taken by smartphones or tablet PCs. Since the accuracy of these sensors cannot be compared to that of surveying instruments, the image position and orientation derived from them are not suitable for photogrammetric measurement. This paper adopts the least-squares model-image fitting (LSMIF) algorithm to iteratively improve the image's exterior orientation. The image position from GPS and the image orientation from the gyroscope are treated as the initial values. By fitting the projection of the wireframe model to the extracted edge pixels on the image, the image exterior orientation elements are solved when the optimal fit is achieved. With the exact exterior orientation elements, the wireframe model of the building can be correctly projected onto the image and, therefore, the façade texture image can be extracted from the picture.

  5. Modelling and automation of the process of phosphate ion removal from waste waters

    Directory of Open Access Journals (Sweden)

    L. Lupa

    2008-03-01

    Full Text Available Phosphate removal from waste waters has become an environmental necessity, since these phosphates stimulate the growth of aquatic plants and plankton and contribute to the eutrophication process in general. The physicochemical methods of phosphate ion removal are the most effective and reliable. This paper presents studies on the process of phosphate ion removal from waste waters from the fertiliser industry, using co-precipitation with iron salts and with calcium hydroxide as the neutralizing agent. The optimal process conditions were established as those that achieve the maximum degree of separation of the phosphate ions. The precipitate resulting from the co-precipitation process was analysed to determine its chemical composition, its thermal and structural stability, and the form in which the phosphate ions occur in it. Based on these considerations, the experimental data obtained in the process of phosphate ion removal were analysed mathematically, and equations were formulated for the dependence of the degree of phosphate separation and the residual concentration on the main parameters of the process. An automated scheme for phosphate ion removal from waste waters by co-precipitation is also presented.

  6. Metacognition and abstract reasoning.

    Science.gov (United States)

    Markovits, Henry; Thompson, Valerie A; Brisson, Janie

    2015-05-01

    The nature of people's meta-representations of deductive reasoning is critical to understanding how people control their own reasoning processes. We conducted two studies to examine whether people have a metacognitive representation of abstract validity and whether familiarity alone acts as a separate metacognitive cue. In Study 1, participants were asked to make a series of (1) abstract conditional inferences, (2) concrete conditional inferences with premises having many potential alternative antecedents and thus specifically conducive to the production of responses consistent with conditional logic, or (3) concrete problems with premises having relatively few potential alternative antecedents. Participants gave confidence ratings after each inference. Results show that confidence ratings were positively correlated with logical performance on abstract problems and concrete problems with many potential alternatives, but not with concrete problems with content less conducive to normative responses. Confidence ratings were higher with few alternatives than for abstract content. Study 2 used a generation of contrary-to-fact alternatives task to improve levels of abstract logical performance. The resulting increase in logical performance was mirrored by increases in mean confidence ratings. Results provide evidence for a metacognitive representation based on logical validity, and show that familiarity acts as a separate metacognitive cue. PMID:25416026

  7. From nominal sets binding to functions and lambda-abstraction: connecting the logic of permutation models with the logic of functions

    CERN Document Server

    Dowek, Gilles

    2011-01-01

    Permissive-Nominal Logic (PNL) extends first-order predicate logic with term-formers that can bind names in their arguments. It takes a semantics in (permissive-)nominal sets. In PNL, the forall-quantifier or lambda-binder are just term-formers satisfying axioms, and their denotation is functions on nominal atoms-abstraction. Then we have higher-order logic (HOL) and its models in ordinary (i.e. Zermelo-Fraenkel) sets; the denotation of forall or lambda is functions on full or partial function spaces. This raises the following question: how are these two models of binding connected? What translation is possible between PNL and HOL, and between nominal sets and functions? We exhibit a translation of PNL into HOL, and from models of PNL to certain models of HOL. It is natural, but also partial: we translate a restricted subsystem of full PNL to HOL. The extra part which does not translate is the symmetry properties of nominal sets with respect to permutations. To use a little nominal jargon: we can translate na...

  8. Thermography colloquium 2015. Abstracts

    International Nuclear Information System (INIS)

    The USB stick contains 17 lectures which were held at the Thermography Colloquium 2015 in Leinfelden-Echterdingen (Germany). A selection of the topics: Thermal Chladni sound figures in nondestructive testing (M. Rahammer); Flash thermography with several flashes (R. Krankenhagen); Frequency optimization of ultrasound-induced thermography during the measurement (C. Srajbr); Worldwide introduction of a thermographic inspection system for gas turbine components (M. Goldammer); Practical aspects of automation of thermographic weld inspection (G. Mahler); Investigations to determine the crack depth with inductive thermography (B. Oswald-Tranta); Testing of spot welds with laser thermography (M. Ziegler).

  9. Semi-automated curation of metabolic models via flux balance analysis: a case study with Mycoplasma gallisepticum.

    Directory of Open Access Journals (Sweden)

    Eddy J Bautista

    Full Text Available Primarily used for metabolic engineering and synthetic biology, genome-scale metabolic modeling shows tremendous potential as a tool for fundamental research and curation of metabolism. Through a novel integration of flux balance analysis and genetic algorithms, a strategy was developed to curate metabolic networks and facilitate identification of metabolic pathways that may not be directly inferable solely from genome annotation. Specifically, metabolites involved in unknown reactions can be determined, and potentially erroneous pathways can be identified. The procedure developed allows for new fundamental insight into metabolism, as well as acting as a semi-automated curation methodology for genome-scale metabolic modeling. To validate the methodology, a genome-scale metabolic model for the bacterium Mycoplasma gallisepticum was created. Several reactions not predicted by the genome annotation were postulated and validated via the literature. The model predicted an average growth rate of 0.358±0.12, closely matching the experimentally determined growth rate of M. gallisepticum of 0.244±0.03. This work presents a powerful algorithm for facilitating the identification and curation of previously known and new metabolic pathways, as well as presenting the first genome-scale reconstruction of M. gallisepticum.
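At the core of the strategy above is flux balance analysis, which is a linear program. A minimal sketch on a toy three-reaction network (not the M. gallisepticum reconstruction):

```python
import numpy as np
from scipy.optimize import linprog

# Metabolites: A, B. Reactions: uptake (-> A), conversion (A -> B), growth (B ->).
# Stoichiometric matrix S: rows are metabolites, columns are reaction fluxes.
S = np.array([
    [1, -1,  0],   # A
    [0,  1, -1],   # B
])
bounds = [(0, 10), (0, None), (0, None)]   # uptake capped at 10 flux units

# Maximize the growth flux subject to steady state S·v = 0; linprog minimizes,
# so negate the growth flux in the objective.
res = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(f"growth flux = {res.x[2]:.1f}")   # limited by the uptake bound
```

Wrapping this linear program inside a genetic algorithm, as the paper does, lets candidate network edits be scored by the growth rate the edited model can achieve.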

  10. Beyond the abstractions?

    DEFF Research Database (Denmark)

    Olesen, Henning Salling

    2006-01-01

      The anniversary of the International Journal of Lifelong Education takes place in the middle of a conceptual landslide from lifelong education to lifelong learning. Contemporary discourses of lifelong learning etc are however abstractions behind which new functions and agendas for adult education...... the abstractions by broadening it's scope of interest, by focussing on differences of learners, by age, gender, ethnicity, social experience, and differences in context, by socioeconomic environment, culture etc. Practically, it is argued, this means to embrace the new focus on informal learning and work related...

  11. A Knowledge Based Approach for Automated Modelling of Extended Wing Structures in Preliminary Aircraft Design

    OpenAIRE

    Dorbath, Felix; Nagel, Björn; Gollnick, Volker

    2011-01-01

    This paper introduces the concept of the ELWIS model generator for Finite Element models of aircraft wing structures. The physical modelling of the structure is extended beyond the wing primary structures, to increase the level of accuracy for aircraft which diverge from existing configurations. Also the impact of novel high lift technologies on structural masses can be captured already in the early stages of design by using the ELWIS models. The ELWIS model generator is able to c...

  12. Automating Finance

    Science.gov (United States)

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools which had taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  13. Automated evolutionary optimization of ion channel conductances and kinetics in models of young and aged rhesus monkey pyramidal neurons.

    Science.gov (United States)

    Rumbell, Timothy H; Draguljić, Danel; Yadav, Aniruddha; Hof, Patrick R; Luebke, Jennifer I; Weaver, Christina M

    2016-08-01

    Conductance-based compartment modeling requires tuning of many parameters to fit the neuron model to target electrophysiological data. Automated parameter optimization via evolutionary algorithms (EAs) is a common approach to accomplish this task, using error functions to quantify differences between model and target. We present a three-stage EA optimization protocol for tuning ion channel conductances and kinetics in a generic neuron model with minimal manual intervention. We use the technique of Latin hypercube sampling in a new way, to choose weights for error functions automatically so that each function influences the parameter search to a similar degree. This protocol requires no specialized physiological data collection and is applicable to commonly-collected current clamp data and either single- or multi-objective optimization. We applied the protocol to two representative pyramidal neurons from layer 3 of the prefrontal cortex of rhesus monkeys, in which action potential firing rates are significantly higher in aged compared to young animals. Using an idealized dendritic topology and models with either 4 or 8 ion channels (10 or 23 free parameters respectively), we produced populations of parameter combinations fitting the target datasets in less than 80 hours of optimization each. Passive parameter differences between young and aged models were consistent with our prior results using simpler models and hand tuning. We analyzed parameter values among fits to a single neuron to facilitate refinement of the underlying model, and across fits to multiple neurons to show how our protocol will lead to predictions of parameter differences with aging in these neurons. PMID:27106692
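The weight-selection step described above can be sketched with a Latin hypercube sample; the two error functions, their scales, and the parameter ranges below are hypothetical stand-ins for the real electrophysiological error measures:

```python
import numpy as np
from scipy.stats import qmc

# Stratified sample of the free parameters over their ranges.
sampler = qmc.LatinHypercube(d=2, seed=1)
unit = sampler.random(n=50)                       # 50 samples in [0, 1)^2
lo, hi = np.array([0.1, 1.0]), np.array([5.0, 20.0])
params = qmc.scale(unit, lo, hi)                  # map to parameter ranges

# Two hypothetical error functions with very different natural scales.
err_rate = np.abs(params[:, 0] - 2.0)             # firing-rate error
err_volt = 100 * np.abs(params[:, 1] - 10.0)      # voltage-trace error

# Weight each error inversely to its spread over the sample, so that both
# functions influence the combined objective to a similar degree.
w_rate, w_volt = 1 / err_rate.std(), 1 / err_volt.std()
combined = w_rate * err_rate + w_volt * err_volt
print(f"weighted std ratio = {(w_rate * err_rate).std() / (w_volt * err_volt).std():.2f}")
```

After this normalization both weighted errors have unit spread across the sampled design space, which is the property the protocol exploits before handing the combined objective to the evolutionary algorithm.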

  14. ABSTRACTS WELDED PIPE AND TUBE

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    ABSTRACTS WELDED PIPE AND TUBE Vol. 24 No. 3 May 2001
    Huang Jingan (1): Strengthen Intercourse and Coordination and Promote Development Together.
    Liang Aiyu (11): The Production and Development of the Water Supply Pipe for City Construction. From the aspects of quality, appearance, environmental protection, economic analysis, etc., this article evaluates the galvanized pipe, plastic-steel composite pipe, plastic-aluminum pipe, and stainless pipe for city water supply. In accordance with the requirements of city construction planning and development, it is considered that replacing the galvanized pipe with plastic-aluminum and plastic-steel pipe is the trend of development. The author also gives some constructive proposals for reference. Subject terms: galvanized pipe, composite pipe, stainless pipe, city water supply, evaluation.
    Zhao Rongbin, Li Guangjun (14): TIG Welding of the Protected Tantalum Pipe for Sheathed Thermocouples Used in Corrosive Environments. The welding of the protected tantalum pipe of sheathed thermocouples was investigated by TIG. The welding process and its key parameters are introduced, and the influence of the process on welding quality is discussed. Subject terms: welding, protected tantalum pipe, corrosion.
    He Defu et al. (18): Design and Research on an Automatic MIG Welding Machine for Automobile Catalytic Converters. Two different schemes for automatic MIG welding of automobile catalytic converters are compared and analysed, and a design for an automatic MIG welding machine is suggested. Subject terms: environmental protection, automobile, three-way catalytic converter, MIG welding, automatic welding, PLC.
    Fang Chucai (24): Cold Crack Analysis of the Weld Heat-Affected Zone in Low-Alloy High-Strength Steel. During the welding of low-alloy high-strength steel (X65 and above), fine cracks sometimes occur in the weld (especially the inner weld) and a low-plasticity, hard and brittle structure sometimes forms in the heat-affected zone (HAZ). This

  15. Automated Design Space Exploration with Aspen

    Directory of Open Access Journals (Sweden)

    Kyle L. Spafford

    2015-01-01

    Full Text Available Architects and applications scientists often use performance models to explore a multidimensional design space of architectural characteristics, algorithm designs, and application parameters. With traditional performance modeling tools, these explorations forced users to first develop a performance model and then repeatedly evaluate and analyze the model manually. These manual investigations proved laborious and error prone. More importantly, the complexity of this traditional process often forced users to simplify their investigations. To address this challenge of design space exploration, we extend our Aspen (Abstract Scalable Performance Engineering Notation language with three new language constructs: user-defined resources, parameter ranges, and a collection of costs in the abstract machine model. Then, we use these constructs to enable automated design space exploration via a nonlinear optimization solver. We show how four interesting classes of design space exploration scenarios can be derived from Aspen models and formulated as pure nonlinear programs. The analysis tools are demonstrated using examples based on Aspen models for a three-dimensional Fast Fourier Transform, the CoMD molecular dynamics proxy application, and the DARPA Streaming Sensor Challenge Problem. Our results show that this approach can compose and solve arbitrary performance modeling questions quickly and rigorously when compared to the traditional manual approach.
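    The workflow this abstract describes (evaluating a performance model over parameter ranges subject to abstract-machine resource limits, then picking the best design point) can be illustrated with a toy sketch. This is not actual Aspen syntax; the analytical model, parameter values, and memory limit below are invented for illustration:

```python
import itertools

def runtime_model(n, block, cores):
    """Toy analytical performance model (hypothetical, not Aspen output):
    compute time scales as n^3/cores; data movement is penalized for small blocks."""
    flops = n ** 3 / cores
    mem_traffic = (n ** 2) * (n / block)
    return flops * 1e-9 + mem_traffic * 1e-8

def explore(n, blocks, core_counts, mem_limit):
    """Enumerate the (block, cores) design space, discard configurations whose
    working set exceeds the abstract machine's memory, return the best point."""
    best = None
    for block, cores in itertools.product(blocks, core_counts):
        working_set = 3 * block * block * 8      # bytes for three blocked tiles
        if working_set > mem_limit:
            continue
        t = runtime_model(n, block, cores)
        if best is None or t < best[0]:
            best = (t, block, cores)
    return best

best_time, best_block, best_cores = explore(
    n=4096, blocks=[64, 128, 256, 512], core_counts=[16, 32, 64],
    mem_limit=2 ** 20)
```

    A real Aspen workflow would hand such a model to a nonlinear optimization solver rather than enumerating points, but the structure of the question (objective from the model, constraints from the abstract machine) is the same.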

  16. Building Safe Concurrency Abstractions

    DEFF Research Database (Denmark)

    Madsen, Ole Lehrmann

    2014-01-01

    as well as programming, and we describe how this has had an impact on the design of the language. Although Beta supports the definition of high-level concurrency abstractions, the use of these relies on the discipline of the programmer, as is the case for Java and other mainstream OO languages. We introduce...

  17. ESPR 2014. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2014-06-15

    The Proceedings on ESPR 2014 include abstracts concerning the following topics: pediatric imaging: thorax, cardiovascular system, CT technique, head and neck, perinatal imaging, molecular imaging; interventional imaging; specific focus: musculoskeletal imaging in juvenile idiopathic arthritis; radiation protection; oncology; molecular imaging - nuclear medicine; uroradiology and abdominal imaging.

  18. ABSTRACTS AND KEY WORDS

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Study of Feeding Effects of EM Fermented Feed on the Growth and Survival of Juvenile Sea Cucumber Apostichopus japonicus Gong Hai-ning et al(1) Abstract In this study, comparisons of feeding effects on the sea cucumber Apostichopus japonicus between th

  19. Leadership Abstracts, 2002.

    Science.gov (United States)

    Wilson, Cynthia, Ed.; Milliron, Mark David, Ed.

    2002-01-01

    This 2002 volume of Leadership Abstracts contains issue numbers 1-12. Articles include: (1) "Skills Certification and Workforce Development: Partnering with Industry and Ourselves," by Jeffrey A. Cantor; (2) "Starting Again: The Brookhaven Success College," by Alice W. Villadsen; (3) "From Digital Divide to Digital Democracy," by Gerardo E. de los…

  20. SPR 2015. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2015-04-01

    The volume contains the abstracts of the SPR (Society for Pediatric Radiology) 2015 meeting covering the following issues: fetal imaging, musculoskeletal imaging, cardiac imaging, chest imaging, oncologic imaging, tools for process improvement, child abuse, contrast-enhanced ultrasound, Image Gently - update of radiation dose recording/reporting/monitoring - meaningful or useless?, pediatric thoracic imaging, ALARA.

  1. ABSTRACTS AND KEY WORDS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Morphological Variations and Discriminant Analysis of Three Populations of Mytilus coruscus Ye Ya-qiu et al. (4) Abstract The multivariate morphometrics analysis method was used for studying four morphological characters of three geographical populations of Mytilus coruscus from Sheng-si, Zhou-shan, Tai-zhou along the coast of Zhe-jiang province of China.

  2. ABSTRACTS AND KEY WORDS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Study on the Enrichment Regularity of Semicarbazide in Algae Tian Xiu-hui et al. (1) Abstract Semicarbazide (SEM) in three kinds of representative algae (Nitzschia closterium, Tetraselmis chui and Dicrateria sp.) and in seawater was determined using ultra-performance liquid chromatography tandem mass spectrometry in this work. The accumulation of semicarbazide in algae under laboratory conditions was studied.

  3. 2002 NASPSA Conference Abstracts.

    Science.gov (United States)

    Journal of Sport & Exercise Psychology, 2002

    2002-01-01

    Contains abstracts from the 2002 conference of the North American Society for the Psychology of Sport and Physical Activity. The publication is divided into three sections: the preconference workshop, "Effective Teaching Methods in the Classroom;" symposia (motor development, motor learning and control, and sport psychology); and free…

  4. Monadic abstract interpreters

    DEFF Research Database (Denmark)

    Sergey, Ilya; Devriese, Dominique; Might, Matthew;

    2013-01-01

    Recent developments in the systematic construction of abstract interpreters hinted at the possibility of a broad unification of concepts in static analysis. We deliver that unification by showing context-sensitivity, polyvariance, flow-sensitivity, reachability pruning, heap-cloning and cardinalit...

  5. Abstracts of submitted papers

    International Nuclear Information System (INIS)

    The conference proceedings contain 152 abstracts of presented papers relating to various aspects of personnel dosimetry, the dosimetry of the working and living environment, various types of dosemeters and spectrometers, the use of radionuclides in various industrial fields, the migration of radionuclides on Czechoslovak territory after the Chernobyl accident, theoretical studies of some parameters of ionizing radiation detectors, and their calibration. (M.D.)

  6. ABSTRACTS AND KEY WORDS

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Study on Optimization of Enzymic Preparation of Collagen Polypeptide from Skin of Gadous macrocephaius Liu Chun-e et al. (1) Abstract Enzymolysis was used to prepare collagen peptide. The optimum condition was determined based on one-way ANOVA and orthogonal experimental design. The results indicated that using alkaline protease at a concentration of 4.5%,

  7. Abstract Film and Beyond.

    Science.gov (United States)

    Le Grice, Malcolm

    A theoretical and historical account of the main preoccupations of makers of abstract films is presented in this book. The book's scope includes discussion of nonrepresentational forms as well as examination of experiments in the manipulation of time in films. The ten chapters discuss the following topics: art and cinematography, the first…

  8. THE CHINA MEDICAL ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    The purpose of the China Medical Abstracts (Internal Medicine) is to promote international exchange of works done by the Chinese medical profession in the field of internal medicine. The papers selected from journals represent the newest and most important advances and progress in various specialities in internal medicine.

  9. Cambridge Scientific Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Meteorological and Environmental Research has been included by Cambridge Scientific Abstracts (CSA) since 2011. CSA is a retrieval system published by Cambridge Information Group. CSA was founded in the late 1950s, and became part of the CIG family in 1971. CSA's original mission was publishing secondary source materials relating to the physical sciences. Completely

  10. Cambridge Scientific Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Meteorological and Environmental Research has been included by Cambridge Scientific Abstracts (CSA) since 2011. CSA is a retrieval system published by Cambridge Information Group. CSA was founded in the late 1950’s,and became part of the CIG family in 1971. CSA’s original mission was publishing secondary source materials relating to the physical sciences. Completely

  11. Reasoning abstractly about resources

    Science.gov (United States)

    Clement, B.; Barrett, A.

    2001-01-01

    r describes a way to schedule high level activities before distributing them across multiple rovers in order to coordinate the resultant use of shared resources regardless of how each rover decides how to perform its activities. We present an algorithm for summarizing the metric resource requirements of an abstract activity based on the resource usages of its potential refinements.

  12. Test Construction: Automated

    NARCIS (Netherlands)

    Veldkamp, Bernard P.

    2014-01-01

    Optimal test construction deals with automated assembly of tests for educational and psychological measurement. Items are selected from an item bank to meet a predefined set of test specifications. Several models for optimal test construction are presented, and two algorithms for optimal test assemb

  13. Test Construction: Automated

    NARCIS (Netherlands)

    Veldkamp, Bernard P.

    2016-01-01

    Optimal test construction deals with automated assembly of tests for educational and psychological measurement. Items are selected from an item bank to meet a predefined set of test specifications. Several models for optimal test construction are presented, and two algorithms for optimal test assemb

  14. Bounded Rationality of Generalized Abstract Fuzzy Economies

    Directory of Open Access Journals (Sweden)

    Lei Wang

    2014-01-01

    Full Text Available By using a nonlinear scalarization technique, the bounded rationality model M for generalized abstract fuzzy economies in finite continuous spaces is established. Furthermore, by using the model M, some new theorems for structural stability and robustness to (λ,ϵ)-equilibria of generalized abstract fuzzy economies are proved.

  15. Abstractions of Awareness: Aware of What?

    Science.gov (United States)

    Metaxas, Georgios; Markopoulos, Panos

    This chapter presents FN-AAR, an abstract model of awareness systems. The purpose of the model is to capture in a concise and abstract form essential aspects of awareness systems, many of which have been discussed in design essays or in the context of evaluating specific design solutions.

  16. Early Validation of Automation Plant Control Software using Simulation Based on Assumption Modeling and Validation Use Cases

    Directory of Open Access Journals (Sweden)

    Veronika Brandstetter

    2015-10-01

    Full Text Available In automation plants, technical processes must be conducted in a way that products, substances, or services are produced reliably, with sufficient quality and with minimal strain on resources. A key driver in conducting these processes is the automation plant's control software, which controls the technical plant components and thereby affects the physical, chemical, and mechanical processes that take place in automation plants. To this end, the control software of an automation plant must adhere to strict process requirements arising from the technical processes and from the physical plant design. Currently, the validation of the control software often starts late in the engineering process, in many cases once the automation plant is almost completely constructed. However, as widely acknowledged, the later the control software of the automation plant is validated, the higher the effort for correcting revealed defects is, which can lead to serious budget overruns and project delays. In this article we propose an approach that allows the early validation of automation control software against the technical plant processes and assumptions about the physical plant design by means of simulation. We demonstrate the application of our approach on an actual plant project from the automation industry and present its technical implementation.
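    The idea of validating control logic early against assumptions about the plant physics can be sketched as a minimal simulation loop. The tank model, controller thresholds, and requirement band below are hypothetical and not taken from the paper:

```python
def control(level, pump_on, low=2.0, high=8.0):
    """Hypothetical control logic under test: hysteresis on/off drain-pump control."""
    if level >= high:
        return True           # level too high: switch drain pump on
    if level <= low:
        return False          # level too low: switch drain pump off
    return pump_on            # otherwise keep previous state (hysteresis)

def simulate(steps=10000, dt=0.1, inflow=0.5, outflow=1.2):
    """Simulate the assumed plant physics (constant inflow, fixed pump capacity)
    and record the level trajectory for requirement checking."""
    level, pump_on, trace = 5.0, False, []
    for _ in range(steps):
        pump_on = control(level, pump_on)
        level += dt * (inflow - (outflow if pump_on else 0.0))
        trace.append(level)
    return trace

trace = simulate()
# Validation use case: the level must never leave the safe band [1.5, 8.5].
violations = [x for x in trace if not 1.5 <= x <= 8.5]
```

    A defect in the control logic (for example, swapped thresholds) would show up here as a non-empty `violations` list long before any plant hardware exists, which is the point of the approach described in the abstract.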

  17. Using Personal Health Records for Automated Clinical Trials Recruitment: the ePaIRing Model

    OpenAIRE

    Wilcox, Adam; Natarajan, Karthik; Weng, Chunhua

    2009-01-01

    We describe the development of a model of the use of patient information to improve patient recruitment in clinical trials. This model, named ePaIRing (electronic Participant Identification and Recruitment Model), describes variations in how information flows between stakeholders, and how personal health records can specifically facilitate patient recruitment.

  18. On the Application of Macros to the Automation of Different Dating Models Using 210Pb

    International Nuclear Information System (INIS)

    Different dating models based on 210Pb measurements, used for dating recent events, are shown in this report, as well as models that describe different processes affecting the vertical distribution of radionuclides in lacustrine and marine sediments. Macro-commands are programs included in spreadsheets that allow automated operations to run. In this report macros are used to: a) obtain 210Pb results from a database created from different sampling campaigns; b) apply different dating models automatically; c) optimise the diffusion coefficient employed by the models through standard-deviation calculations between experimental values and those obtained by the model. (Author) 21 refs
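    One of the dating models such macros typically automate is the constant initial concentration (CIC) model, in which each layer's age follows directly from the 210Pb decay law t = ln(C0/C)/λ. A minimal sketch; the activity profile below is invented for illustration:

```python
import math

PB210_HALF_LIFE_YR = 22.3
LAMBDA = math.log(2) / PB210_HALF_LIFE_YR   # 210Pb decay constant, 1/yr

def cic_age(c_surface, c_depth):
    """Constant Initial Concentration (CIC) model: every layer is assumed to
    start with the same unsupported 210Pb activity, so its age follows from
    radioactive decay alone."""
    return math.log(c_surface / c_depth) / LAMBDA

# Hypothetical unsupported 210Pb activities (Bq/kg) down a sediment core.
profile = [120.0, 85.0, 60.0, 30.0, 15.0]
ages = [cic_age(profile[0], c) for c in profile]   # years before coring
```

    The deepest sample here has one eighth of the surface activity, i.e. three half-lives, so the CIC age comes out at about 67 years; other models mentioned in the report (e.g. those including diffusion) add terms to this basic decay relation.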

  19. Generic modeling and mapping languages for model management

    OpenAIRE

    Kensche, David Sebastian

    2010-01-01

    Activities in management of schemas and schema mappings are usually solved by special-purpose solutions such as coding wrapper components or manually updating view definitions. The goal of model management is to raise the level of abstraction for metadata-intensive activities by providing a set of high-level operators that automate or semi-automate such tasks. The problems of model management are aggravated by the fact that usually heterogeneous modeling languages, such as the relational data...

  20. Controlling Modelling Artifacts

    DEFF Research Database (Denmark)

    Smith, Michael James Andrew; Nielson, Flemming; Nielson, Hanne Riis

    2011-01-01

    artifacts that were inadvertently introduced. In this paper, we propose a novel methodology to reason about modelling artifacts, given a detailed model and a high-level (more abstract) model of the same system. By a series of automated abstraction steps, we lift the detailed model to the same state space..., and identify some modelling artifacts in the model. Since we can apply our abstractions on-the-fly, while exploring the state space of the detailed model, we can analyse larger networks than is possible with existing techniques.

  1. iGen 0.1: a program for the automated generation of models and parameterisations

    Directory of Open Access Journals (Sweden)

    D. F. Tang

    2011-09-01

    Full Text Available Complex physical systems can often be simulated using very high resolution models but this is not always practical because of computational restrictions. In this case the model must be simplified or parameterised in order to make it computationally tractable. A parameterised model is created using an ad-hoc selection of techniques which range from the formal to the purely intuitive, and as a result it is very difficult to objectively quantify the fidelity of the model to the physical system. It is rare that a parameterised model can be formally shown to simulate a physical system to within some bounded error. Here we introduce a new approach to parameterising models which allows error to be formally bounded. The approach makes use of a newly developed computer program, which we call iGen, that analyses the source code of a high-resolution model and formally derives a much faster, parameterised model that closely approximates the original, reporting bounds on the error introduced by any approximations. These error bounds can be used to formally justify conclusions about a physical system based on observations of the model's behaviour. Using increasingly complex physical systems as examples we illustrate that iGen has the ability to produce parameterisations that run typically orders of magnitude faster than the underlying, high-resolution models from which they are derived.

  2. Automated Test Case Generation

    CERN Document Server

    CERN. Geneva

    2015-01-01

    I would like to present the concept of automated test case generation. I work on it as part of my PhD and I think it would be interesting also for other people. It is also the topic of a workshop paper that I am introducing in Paris. (abstract below) Please note that the talk itself would be more general and not about the specifics of my PhD, but about the broad field of Automated Test Case Generation. I would introduce the main approaches (combinatorial testing, symbolic execution, adaptive random testing) and their advantages and problems. (oracle problem, combinatorial explosion, ...) Abstract of the paper: Over the last decade code-based test case generation techniques such as combinatorial testing or dynamic symbolic execution have seen growing research popularity. Most algorithms and tool implementations are based on finding assignments for input parameter values in order to maximise the execution branch coverage. Only few of them consider dependencies from outside the Code Under Test’s scope such...
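    Of the approaches mentioned, combinatorial testing is the easiest to sketch: a greedy pairwise generator that repeatedly picks the candidate test case covering the most not-yet-covered parameter-value pairs. The configuration space below is hypothetical, and real tools use cleverer constructions than exhaustive candidate enumeration:

```python
import itertools

def pairs_of(testcase):
    """All parameter-value pairs exercised by one concrete test case."""
    return set(itertools.combinations(sorted(testcase.items()), 2))

def greedy_pairwise(params):
    """Greedy combinatorial test generation: keep choosing the candidate test
    case that covers the most uncovered pairs until every pair is covered."""
    names = sorted(params)
    candidates = [dict(zip(names, values))
                  for values in itertools.product(*(params[n] for n in names))]
    uncovered = set().union(*(pairs_of(tc) for tc in candidates))
    suite = []
    while uncovered:
        best = max(candidates, key=lambda tc: len(pairs_of(tc) & uncovered))
        uncovered -= pairs_of(best)
        suite.append(best)
    return suite

# Hypothetical configuration space for a browser test matrix.
params = {"os": ["linux", "mac", "win"], "browser": ["ff", "chrome"],
          "locale": ["en", "de"]}
suite = greedy_pairwise(params)
```

    The suite covers every pairwise interaction with far fewer cases than the full Cartesian product, which is the combinatorial-explosion trade-off the talk discusses.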

  3. Statistical colour models: an automated digital image analysis method for quantification of histological biomarkers

    OpenAIRE

    Shu, Jie; Dolman, G. E.; Duan, Jiang; Qiu, Guoping; Ilyas, Mohammad

    2016-01-01

    Background Colour is the most important feature used in quantitative immunohistochemistry (IHC) image analysis; IHC is used to provide information relating to aetiology and to confirm malignancy. Methods Statistical modelling is a technique widely used for colour detection in computer vision. We have developed a statistical model of colour detection applicable to detection of stain colour in digital IHC images. The model was first trained on massive numbers of colour pixels collected semi-automatically. To ...
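    A statistical colour model of this general kind can be sketched under simplifying assumptions (an independent Gaussian per RGB channel rather than the paper's full model; the training pixels below are invented, not taken from the paper's IHC data):

```python
import math

def fit_colour_model(pixels):
    """Fit an independent-Gaussian model per RGB channel to training pixels
    sampled from the stain of interest (collected semi-automatically in the paper)."""
    n = len(pixels)
    mean = [sum(p[c] for p in pixels) / n for c in range(3)]
    var = [sum((p[c] - mean[c]) ** 2 for p in pixels) / n for c in range(3)]
    return mean, var

def stain_score(pixel, model):
    """Log-likelihood of a pixel under the colour model; thresholding this
    score classifies pixels as stain vs. background."""
    mean, var = model
    return -sum(((pixel[c] - mean[c]) ** 2) / (2 * var[c]) +
                0.5 * math.log(2 * math.pi * var[c]) for c in range(3))

# Hypothetical brown (DAB-like) training pixels and two query pixels.
train = [(120, 80, 40), (130, 85, 45), (125, 78, 38), (118, 82, 44)]
model = fit_colour_model(train)
brown_score = stain_score((124, 81, 42), model)   # close to the stain colour
blue_score = stain_score((60, 70, 180), model)    # counterstain-like pixel
```

    Running the scorer over every pixel of an image and thresholding yields a stain mask whose area can then be quantified, which is the biomarker-quantification step the abstract describes.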

  4. Ghana Science Abstracts

    International Nuclear Information System (INIS)

    This issue of the Ghana Science Abstracts combines in one publication all the country's bibliographic output in science and technology. The objective is to provide a quick reference source to facilitate the work of information professionals, research scientists, lecturers and policy makers. It is meant to give users an idea of the depth and scope and results of the studies and projects carried out. The scope and coverage comprise research outputs, conference proceedings and periodical articles published in Ghana. It does not capture those that were published outside Ghana. Abstracts reported have been grouped under the following subject areas: Agriculture, Biochemistry, Biodiversity conservation, biological sciences, biotechnology, chemistry, dentistry, engineering, environmental management, forestry, information management, mathematics, medicine, physics, nuclear science, pharmacy, renewable energy and science education

  5. Abstracts of Main Essays

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    The Position of Capitalist Study in Marx's Social Formation Theory Yang Xue-gong Xi Da-min The orientation and achievements of Marx's study of capitalism, or bourgeois society, are the foundation of his social formation theory. On the basis of his scientific study of capitalism, Marx evolves his concept of economic social formation, the scientific methodology for researching other social formations or social forms, the clues to the development of social formations, and the abstraction of the general laws, as well as his reflection on this abstraction. A full evaluation and acknowledgement of the position of capitalist study in Marx's social formation theory is crucial for revising Marx's social formation theory in the new era and for solving some controversial issues in the research of social formation theory.

  6. Abstracts of the communications

    OpenAIRE

    2014-01-01

    (P) paper, (A) abstract only Dietary patterns and habitat of the Grimm’s duiker, Sylvicapra grimmia in Benin, (P)Abdoul Razack Adjibi Oualiou, Jean Claude Codjia, Guy Apollinaire Mensah The distribution of protected areas and conservation of flora in the republic of Benin, (P)Aristide Adomou, Hounnankpon Yedomonhan, Brice Sinsin, Laurentius Josephus and Gerardus Van Der Maesen The problem of invasive plants in protected areas. Chromolaena odorata in the regeneration process of the dense, semi...

  7. SPR 2014. Abstracts

    International Nuclear Information System (INIS)

    The proceedings of the SPR 2014 meeting include abstracts on the following topics: body imaging techniques: practical advice for clinic work; thoracic imaging: focus on the lungs; gastrointestinal imaging: focus on the pancreas and bowel; genitourinary imaging: focus on gonadal radiology; musculoskeletal imaging: focus on oncology; child abuse and non-child abuse: focus on radiography; impact of NMR and CT imaging on management of CHD; education and communication: art and practice in pediatric radiology.

  8. Historical development of abstracting.

    Science.gov (United States)

    Skolnik, H

    1979-11-01

    The abstract, under a multitude of names, such as hypothesis, marginalia, abridgement, extract, digest, précis, resumé, and summary, has a long history, one which is concomitant with advancing scholarship. The progression of this history from the Sumerian civilization ca. 3600 B.C., through the Egyptian and Greek civilizations, the Hellenistic period, the Dark Ages, Middle Ages, Renaissance, and into the modern period is reviewed. PMID:399482

  9. Introduction to abstract analysis

    CERN Document Server

    Goldstein, Marvin E

    2015-01-01

    Developed from lectures delivered at NASA's Lewis Research Center, this concise text introduces scientists and engineers with backgrounds in applied mathematics to the concepts of abstract analysis. Rather than preparing readers for research in the field, this volume offers background necessary for reading the literature of pure mathematics. Starting with elementary set concepts, the treatment explores real numbers, vector and metric spaces, functions and relations, infinite collections of sets, and limits of sequences. Additional topics include continuity and function algebras, Cauchy complet

  10. SPR 2014. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2014-05-15

    The proceedings of the SPR 2014 meeting include abstracts on the following topics: body imaging techniques: practical advice for clinic work; thoracic imaging: focus on the lungs; gastrointestinal imaging: focus on the pancreas and bowel; genitourinary imaging: focus on gonadal radiology; musculoskeletal imaging: focus on oncology; child abuse and non-child abuse: focus on radiography; impact of NMR and CT imaging on management of CHD; education and communication: art and practice in pediatric radiology.

  11. Medical physics 2013. Abstracts

    International Nuclear Information System (INIS)

    The proceedings of the medical physics conference 2013 include abstract of lectures and poster sessions concerning the following issues: Tele-therapy - application systems, nuclear medicine and molecular imaging, neuromodulation, hearing and technical support, basic dosimetry, NMR imaging -CEST (chemical exchange saturation transfer), medical robotics, magnetic particle imaging, audiology, radiation protection, phase contrast - innovative concepts, particle therapy, brachytherapy, computerized tomography, quantity assurance, hybrid imaging techniques, diffusion and lung NMR imaging, image processing - visualization, cardiac and abdominal NMR imaging.

  12. Application of Holdridge life-zone model based on the terrain factor in Xinjiang Automous Region

    Institute of Scientific and Technical Information of China (English)

    NI Yong-ming; OUYANG Zhi-yun; WANG Xiao-ke

    2005-01-01

    This study improved the application of the Holdridge life-zone model to simulating the distribution of desert vegetation in China, providing data to support ecological recovery and ecosystem reconstruction in desert areas. The desert vegetation was classified into four types: (1) LAD: little arbor desert; (2) SD: shrub desert; (3) HLHSD: half-shrub, little half-shrub desert; (4) LHSCD: little half-shrub cushion desert. Based on this classification of Xinjiang desert vegetation, the classical Holdridge life-zone model was used to simulate the distribution of Xinjiang desert vegetation, and the model's Kappa coefficient was compared with the accuracy table represented by Kappa values. The Kappa value of the model was only 0.19, which means the simulation result was poor. To improve the application of the life-zone model to the Xinjiang desert vegetation types, a set of plot standards for terrain factors was developed, using the plot standard as the reclassification criterion for climate sub-regimes. The desert vegetation in Xinjiang was then simulated again. The average Kappa value of the second simulation for the respective climate regimes was 0.45, and the Kappa value of the final modeling result was 0.64, a much better value. The modification made the model applicable in more regions. Finally, the model's ecological relevance to the Xinjiang desert vegetation types was studied.
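    The Kappa coefficient used throughout this abstract scores the agreement between mapped and simulated vegetation classes, corrected for agreement expected by chance. A minimal computation; the grid-cell labels below are invented for illustration, using the study's four class codes:

```python
def cohens_kappa(observed, predicted):
    """Cohen's kappa: observed agreement p_o minus chance agreement p_e,
    normalized by the maximum possible improvement over chance."""
    n = len(observed)
    labels = set(observed) | set(predicted)
    p_o = sum(o == p for o, p in zip(observed, predicted)) / n
    p_e = sum((observed.count(l) / n) * (predicted.count(l) / n) for l in labels)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical grid cells labelled with the four desert vegetation classes.
obs  = ["LAD", "SD", "SD", "HLHSD", "LHSCD", "SD", "HLHSD", "LAD"]
pred = ["LAD", "SD", "HLHSD", "HLHSD", "LHSCD", "SD", "SD", "LAD"]
kappa = cohens_kappa(obs, pred)
```

    On this toy grid, 6 of 8 cells agree (p_o = 0.75) against a chance agreement of about 0.28, giving kappa ≈ 0.65, i.e. roughly the agreement level the study reports for its final model.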

  13. AN OPTIMIZATION-BASED HEURISTIC FOR A CAPACITATED LOT-SIZING MODEL IN AN AUTOMATED TELLER MACHINES NETWORK

    Directory of Open Access Journals (Sweden)

    Supatchaya Chotayakul

    2013-01-01

    Full Text Available This research studies a cash inventory problem in an ATM network to satisfy customers' cash needs over multiple periods with deterministic demand. The objective is to determine the amount of money to place in Automated Teller Machines (ATMs) and cash centers for each period over a given time horizon. The algorithms are designed as a multi-echelon inventory problem with single-item capacitated lot-sizing to minimize the total costs of running the ATM network. In this study, we formulate the problem as a Mixed Integer Program (MIP) and develop an approach based on reformulating the model as a shortest path formulation for finding a near-optimal solution of the problem. This reformulation is the same as the traditional model, except that the capacity constraints, inventory balance constraints and setup constraints related to the management of the money in ATMs are relaxed. This new formulation has more variables and constraints, but has a much tighter linear relaxation than the original and is faster to solve for short-term planning. Computational results show its effectiveness, especially for large-sized problems.
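    The shortest-path reformulation mentioned here is easiest to see in the uncapacitated special case (the classic Wagner-Whitin model), where an arc i→j represents one replenishment in period i covering all demand through period j-1, and its cost is the setup cost plus holding cost. A sketch with invented demand and cost figures:

```python
def lot_size_shortest_path(demand, setup, hold):
    """Uncapacitated single-item lot-sizing solved as a shortest path:
    dist[t] = minimum cost of covering demand for periods 0..t-1,
    arc (i, j) = one replenishment in period i serving periods i..j-1."""
    T = len(demand)
    INF = float("inf")
    dist = [0.0] + [INF] * T
    pred = [None] * (T + 1)
    for i in range(T):
        if dist[i] == INF:
            continue
        holding = 0.0
        for j in range(i + 1, T + 1):
            # demand of period j-1 is held in stock for (j-1-i) periods
            holding += hold * (j - 1 - i) * demand[j - 1]
            cost = dist[i] + setup + holding
            if cost < dist[j]:
                dist[j], pred[j] = cost, i
    # Recover the replenishment periods from the predecessor chain.
    orders, t = [], T
    while t > 0:
        orders.append(pred[t])
        t = pred[t]
    return dist[T], sorted(orders)

# Hypothetical weekly cash demand (k$) at one ATM; setup = cost of a refill trip.
total_cost, refill_periods = lot_size_shortest_path(
    demand=[20, 30, 10, 40, 25], setup=50, hold=1.0)
```

    The capacitated problem the paper actually solves adds per-period refill limits, which is what breaks this clean shortest-path structure and motivates their relaxation-based heuristic.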

  14. Automated Learning of Subcellular Variation among Punctate Protein Patterns and a Generative Model of Their Relation to Microtubules.

    Directory of Open Access Journals (Sweden)

    Gregory R Johnson

    2015-12-01

    Full Text Available Characterizing the spatial distribution of proteins directly from microscopy images is a difficult problem with numerous applications in cell biology (e.g. identifying motor-related proteins) and clinical research (e.g. identification of cancer biomarkers). Here we describe the design of a system that provides automated analysis of punctate protein patterns in microscope images, including quantification of their relationships to microtubules. We constructed the system using confocal immunofluorescence microscopy images from the Human Protein Atlas project for 11 punctate proteins in three cultured cell lines. These proteins have previously been characterized as being primarily located in punctate structures, but their images had all been annotated by visual examination as being simply "vesicular". We were able to show that these patterns could be distinguished from each other with high accuracy, and we were able to assign to one of these subclasses hundreds of proteins whose subcellular localization had not previously been well defined. In addition to providing these novel annotations, we built a generative approach to modeling of punctate distributions that captures the essential characteristics of the distinct patterns. Such models are expected to be valuable for representing and summarizing each pattern and for constructing systems biology simulations of cell behaviors.

  15. Automation Security

    OpenAIRE

    Mirzoev, Dr. Timur

    2014-01-01

    Web-based Automated Process Control systems are a new type of application that uses the Internet to control industrial processes with access to real-time data. Supervisory control and data acquisition (SCADA) networks contain computers and applications that perform key functions in providing essential services and commodities (e.g., electricity, natural gas, gasoline, water, waste treatment, transportation) to all Americans. As such, they are part of the nation's critical infrastructu...

  16. Automated Modeling and Simulation Using the Bond Graph Method for the Aerospace Industry

    Science.gov (United States)

    Granda, Jose J.; Montgomery, Raymond C.

    2003-01-01

    Bond graph modeling was originally developed in the late 1950s by the late Prof. Henry M. Paynter of M.I.T. Prof. Paynter was well ahead of his time, as the main advantages of his creation (the modeling insight it provides and the ability to deal effectively with mechatronics) came to fruition only with the recent advent of modern computer technology and the tools derived from it, including symbolic manipulation, MATLAB, SIMULINK and the Computer Aided Modeling Program (CAMPG). Thus, only recently have these tools been available, allowing one to fully utilize the advantages that the bond graph method has to offer. The purpose of this paper is to help fill the knowledge void concerning the use of bond graphs in the aerospace industry. The paper first presents simple examples to serve as a tutorial on bond graphs for those not familiar with the technique, giving the reader the basic understanding needed to appreciate the applications that follow. After that, several aerospace applications are developed, such as modeling of an arresting system for aircraft carrier landings, suspension models used for landing gears, and multibody dynamics. The paper also presents an update on NASA's progress in modeling the International Space Station (ISS) using bond graph techniques, and an advanced actuation system utilizing shape memory alloys. The latter covers the mechatronics advantages of the bond graph method: applications that simultaneously involve mechanical, hydraulic, thermal, and electrical subsystem modeling.

  17. Forecasting performance of three automated modelling techniques during the economic crisis 2007-2009

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    In this work we consider forecasting macroeconomic variables during an economic crisis. The focus is on a specific class of models, the so-called single hidden-layer feedforward autoregressive neural network models. What makes these models interesting in the present context is that they form a cla... during the economic crisis 2007–2009. Forecast accuracy is measured by the root mean square forecast error. Hypothesis testing is also used to compare the performance of the different techniques with each other.

  18. Forecasting performances of three automated modelling techniques during the economic crisis 2007-2009

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    2014-01-01

    In this work we consider the forecasting of macroeconomic variables during an economic crisis. The focus is on a specific class of models, the so-called single hidden-layer feed-forward autoregressive neural network models. What makes these models interesting in the present context is the fact th...... Scandinavian ones, and focus on forecasting during the economic crisis 2007–2009. The forecast accuracy is measured using the root mean square forecast error. Hypothesis testing is also used to compare the performances of the different techniques....
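    The two records above rank techniques by root mean square forecast error (RMSFE). As a minimal illustration with made-up numbers (the series and forecasts below are invented, not from the study):

```python
import numpy as np

def rmsfe(actual, forecast):
    """Root mean square forecast error between realized and forecast values."""
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    return float(np.sqrt(np.mean((actual - forecast) ** 2)))

# Hypothetical realizations and forecasts from two competing techniques
y    = [1.2, -0.5, -2.1, -3.0, -1.4, 0.3]
f_ar = [1.0, -0.2, -1.5, -2.2, -1.8, 0.0]   # e.g. a linear AR benchmark
f_nn = [1.1, -0.6, -1.9, -2.8, -1.2, 0.4]   # e.g. a neural network model

print(rmsfe(y, f_ar))
print(rmsfe(y, f_nn))   # the technique with the smaller RMSFE wins
```

A formal comparison, as in the papers, would add a test (e.g. Diebold-Mariano) on the loss differentials rather than comparing point values alone.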

  19. Automated test generation for production systems with a model-based testing approach

    OpenAIRE

    Durand, William

    2016-01-01

    This thesis tackles the problem of testing (legacy) production systems such as those of our industrial partner Michelin, one of the three largest tire manufacturers in the world, by means of Model-based Testing. A production system is defined as a set of production machines controlled by software in a factory. Despite the large body of work within the field of Model-based Testing, a common issue remains the writing of models describing either the system under test or its specification. It ...

  20. Bringing Automated Model Checking to PLC Program Development - A CERN Case Study

    CERN Document Server

    Fernandez Adiego, B; Tournier, J-C; Blanco Vinuela, E; Gonzalez Suarez, V M

    2014-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. Model checking appears to be an appropriate approach for this purpose. However, this technique is not yet widely used in industry, due to several obstacles. The main obstacles encountered when trying to apply formal verification techniques to industrial installations are the difficulty of creating models out of PLC programs and of formally defining the specification requirements. In addition, models produced out of real-life programs have a huge state space, preventing verification due to performance issues. Our work at CERN (European Organization for Nuclear Research) focuses on developing efficient automatic verification methods for industrial critical installations based on PLC (Programmable Logic Controller) control systems. In this paper, we present a tool generating automatically formal models out of PLC code. The tool implements a general methodology which can support several input languages, ...

  1. Designing attractive models via automated identification of chaotic and oscillatory dynamical regimes

    OpenAIRE

    Silk, Daniel; Kirk, Paul D. W.; Barnes, Chris P.; Toni, Tina; Rose, Anna; Moon, Simon; Dallman, Margaret J.; Michael P H Stumpf

    2011-01-01

    Chaos and oscillations continue to capture the interest of both the scientific and public domains. Yet despite the importance of these qualitative features, most attempts at constructing mathematical models of such phenomena have taken an indirect, quantitative approach, for example, by fitting models to a finite number of data points. Here we develop a qualitative inference framework that allows us to both reverse-engineer and design systems exhibiting these and other dynamical behaviours by...

  2. Reference Model of Desired Yaw Angle for Automated Lane Changing Behavior of Vehicle

    Institute of Scientific and Technical Information of China (English)

    Dianbo Ren; Guanzhe Zhang; Hangzhe Wu

    2016-01-01

    This paper studies the problem of trajectory planning and tracking for the lane changing behavior of a vehicle in automated highway systems. Based on a yaw angle acceleration model with positive and negative trapezoid constraints, and by analyzing the variation laws of the yaw motion of a vehicle during a lane changing maneuver, a reference model of the desired yaw angle and yaw rate for lane changing is generated. From the yaw angle model, the vertical and horizontal coordinates of the lane change trajectory are calculated. Assuming that the road curvature is constant, the differences and associations between two scenarios, lane changing maneuvers on curved roads and on straight roads, are analyzed. On this basis, the calculation method of the desired yaw angle for lane changing on a circular road is deduced. Simulation results show that, unlike the traditional lateral acceleration planning method with a trapezoid constraint, applying the trapezoidal yaw acceleration reference model proposed in this paper yields a continuous expected yaw angular acceleration, and step tracking of the steering angle need not be implemented. Because the desired yaw model is designed directly from the variation laws of the yaw movement of the vehicle during a lane changing maneuver, rather than calculated indirectly from the lane changing trajectory model, the calculation steps are simplified.
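    The trapezoidal yaw-acceleration idea can be sketched numerically: a positive trapezoid followed by its mirrored negative trapezoid is integrated twice, so the yaw rate returns to zero at the end of the maneuver and the acceleration stays continuous. All parameter values below are illustrative, not from the paper:

```python
import numpy as np

def trapezoid_accel(t, a_max, t1, t2, t3):
    """Positive trapezoid: ramp up over [0, t1], hold until t2, ramp down to t3."""
    if t < 0 or t >= t3:
        return 0.0
    if t < t1:
        return a_max * t / t1
    if t < t2:
        return a_max
    return a_max * (t3 - t) / (t3 - t2)

# Hypothetical lane-change profile (rad/s^2 and seconds): a positive trapezoid
# followed by its mirror-image negative trapezoid shifted by t3.
a_max, t1, t2, t3 = 0.15, 0.5, 1.5, 2.0
dt = 0.001
ts = np.arange(0.0, 2 * t3, dt)
acc = np.array([trapezoid_accel(t, a_max, t1, t2, t3)
                - trapezoid_accel(t - t3, a_max, t1, t2, t3) for t in ts])
yaw_rate = np.cumsum(acc) * dt   # integrate acceleration -> yaw rate
yaw = np.cumsum(yaw_rate) * dt   # integrate yaw rate -> yaw angle
```

Because the acceleration profile is continuous (no step change), the resulting steering command avoids the step tracking required by trapezoid-constrained lateral acceleration planning.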

  3. Automated measurement of Drosophila wings

    Directory of Open Access Journals (Sweden)

    Mezey Jason

    2003-12-01

    Full Text Available Abstract Background Many studies in evolutionary biology and genetics are limited by the rate at which phenotypic information can be acquired. The wings of Drosophila species are a favorable target for automated analysis because of the many interesting questions in evolution and development that can be addressed with them, and because of their simple structure. Results We have developed an automated image analysis system (WINGMACHINE that measures the positions of all the veins and the edges of the wing blade of Drosophilid flies. A video image is obtained with the aid of a simple suction device that immobilizes the wing of a live fly. Low-level processing is used to find the major intersections of the veins. High-level processing then optimizes the fit of an a priori B-spline model of wing shape. WINGMACHINE allows the measurement of 1 wing per minute, including handling, imaging, analysis, and data editing. The repeatabilities of 12 vein intersections averaged 86% in a sample of flies of the same species and sex. Comparison of 2400 wings of 25 Drosophilid species shows that wing shape is quite conservative within the group, but that almost all taxa are diagnosably different from one another. Wing shape retains some phylogenetic structure, although some species have shapes very different from closely related species. The WINGMACHINE system facilitates artificial selection experiments on complex aspects of wing shape. We selected on an index which is a function of 14 separate measurements of each wing. After 14 generations, we achieved a 15 S.D. difference between up and down-selected treatments. Conclusion WINGMACHINE enables rapid, highly repeatable measurements of wings in the family Drosophilidae. Our approach to image analysis may be applicable to a variety of biological objects that can be represented as a framework of connected lines.

  4. Abstract Cauchy problems three approaches

    CERN Document Server

    Melnikova, Irina V

    2001-01-01

    Although the theory of well-posed Cauchy problems is reasonably understood, ill-posed problems, which arise in numerous mathematical models in physics, engineering, and finance, can be approached in a variety of ways. Historically, there have been three major strategies for dealing with such problems: semigroup, abstract distribution, and regularization methods. Semigroup and distribution methods restore well-posedness, in a modern weak sense. Regularization methods provide approximate solutions to ill-posed problems. Although these approaches were extensively developed over the last decades by many researchers, nowhere could one find a comprehensive treatment of all three approaches. Abstract Cauchy Problems: Three Approaches provides an innovative, self-contained account of these methods and, furthermore, demonstrates and studies some of the profound connections between them. The authors discuss the application of different methods not only to the Cauchy problem that is not well-posed in the classical sense, b...

  5. Automation of reverse engineering process in aircraft modeling and related optimization problems

    Science.gov (United States)

    Li, W.; Swetits, J.

    1994-01-01

    During 1994, engineering problems in aircraft modeling were studied. The initial concern was to obtain a surface model with desirable geometric characteristics. Much of the effort during the first half of the year was to find an efficient way of solving a computationally difficult optimization model. Since the smoothing technique in the proposal 'Surface Modeling and Optimization Studies of Aerodynamic Configurations' requires solutions of a sequence of large-scale quadratic programming problems, it is important to design algorithms that can solve each quadratic program in a few iterations. This research led to three papers by Dr. W. Li, which were submitted to SIAM Journal on Optimization and Mathematical Programming. Two of these papers have been accepted for publication. Even though significant progress was made during this phase of research and computation time was reduced from 30 min to 2 min for a sample problem, it was not good enough for on-line processing of digitized data points. After discussion with Dr. Robert E. Smith Jr., it was decided not to enforce shape constraints in order to simplify the model. As a consequence, P. Dierckx's nonparametric spline fitting approach was adopted, where one has only one control parameter for the fitting process: the error tolerance. At the same time, the surface modeling software developed by Imageware was tested. Research indicated that a substantially improved fit of digitized data points can be achieved if a proper parameterization of the spline surface is chosen. A winning strategy is to incorporate Dierckx's surface fitting with a natural parameterization for aircraft parts. The report consists of 4 chapters. Chapter 1 provides an overview of reverse engineering related to aircraft modeling and some preliminary findings of the effort in the second half of the year. Chapters 2-4 are the research results by Dr. W. Li on penalty functions and conjugate gradient methods for

  6. Study on dynamic model of tractor system for automated navigation applications

    Institute of Scientific and Technical Information of China (English)

    FENG Lei; HE Yong

    2005-01-01

    This research aims at using a dynamic model of a tractor system to support navigation system design for an automatically guided agricultural tractor. The model, consisting of a bicycle model of the tractor system, was implemented in the MATLAB environment and developed based on a John Deere tractor. The simulation results from this MATLAB model were validated through field navigation tests. The accuracy of the trajectory estimation is strongly affected by the determination of the cornering stiffness of the tractor. In this simulation, the tractor cornering stiffness was identified during simulation analysis using the MATLAB model, based on the recorded trajectory data. The obtained data were used in simulation analyses for various navigation operations in the field of interest. The analysis of the field validation test results indicated that the developed model could accurately estimate wheel trajectories of a tractor system operating in agricultural fields at various speeds. The results also indicated that the developed system could accurately determine tractor velocity and steering angle while the tractor operates in curved fields.
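    The paper's dynamic bicycle model depends on the identified cornering stiffness; as a simpler, hypothetical stand-in, a kinematic bicycle model already illustrates how a wheel trajectory is estimated from speed and steering angle (all values below are illustrative):

```python
import math

def simulate_bicycle(v, delta, wheelbase, dt, steps, x=0.0, y=0.0, heading=0.0):
    """Kinematic bicycle model: integrate rear-axle position and heading
    for constant speed v (m/s) and steering angle delta (rad) by Euler steps."""
    traj = [(x, y, heading)]
    for _ in range(steps):
        x += v * math.cos(heading) * dt
        y += v * math.sin(heading) * dt
        heading += v / wheelbase * math.tan(delta) * dt
        traj.append((x, y, heading))
    return traj

# Illustrative values: 2 m/s tractor, 2.5 m wheelbase, 5 degrees of steering
traj = simulate_bicycle(v=2.0, delta=math.radians(5.0), wheelbase=2.5,
                        dt=0.01, steps=1000)
```

A dynamic version, closer to the paper, would replace the kinematic heading update with tire lateral forces proportional to the cornering stiffness and slip angles.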

  7. High-Throughput Automated Phenotyping of Two Genetic Mouse Models of Huntington's Disease.

    Science.gov (United States)

    Balci, Fuat; Oakeshott, Stephen; Shamy, Jul Lea; El-Khodor, Bassem F; Filippov, Igor; Mushlin, Richard; Port, Russell; Connor, David; Paintdakhi, Ahmad; Menalled, Liliana; Ramboz, Sylvie; Howland, David; Kwak, Seung; Brunner, Dani

    2013-01-01

    Phenotyping with traditional behavioral assays constitutes a major bottleneck in the primary screening, characterization, and validation of genetic mouse models of disease, leading to downstream delays in drug discovery efforts. We present a novel and comprehensive one-stop approach to phenotyping, the PhenoCube™. This system simultaneously captures the cognitive performance, motor activity, and circadian patterns of group-housed mice by use of home-cage operant conditioning modules (IntelliCage) and custom-built computer vision software. We evaluated two different mouse models of Huntington's Disease (HD), the R6/2 and the BACHD in the PhenoCube™ system. Our results demonstrated that this system can efficiently capture and track alterations in both cognitive performance and locomotor activity patterns associated with these disease models. This work extends our prior demonstration that PhenoCube™ can characterize circadian dysfunction in BACHD mice and shows that this system, with the experimental protocols used, is a sensitive and efficient tool for a first pass high-throughput screening of mouse disease models in general and mouse models of neurodegeneration in particular. PMID:23863947

  8. Designing Attractive Models via Automated Identification of Chaotic and Oscillatory Dynamical Regimes

    CERN Document Server

    Silk, Daniel; Barnes, Chris P; Toni, Tina; Rose, Anna; Moon, Simon; Dallman, Margaret J; Stumpf, Michael P H

    2011-01-01

    Chaos and oscillations continue to capture the interest of both the scientific and public domains. Yet despite the importance of these qualitative features, most attempts at constructing mathematical models of such phenomena have taken an indirect, quantitative approach, e.g. by fitting models to a finite number of data-points. Here we develop a qualitative inference framework that allows us to both reverse engineer and design systems exhibiting these and other dynamical behaviours by directly specifying the desired characteristics of the underlying dynamical attractor. This change in perspective from quantitative to qualitative dynamics provides fundamental and new insights into the properties of dynamical systems.

  9. Designing attractive models via automated identification of chaotic and oscillatory dynamical regimes.

    Science.gov (United States)

    Silk, Daniel; Kirk, Paul D W; Barnes, Chris P; Toni, Tina; Rose, Anna; Moon, Simon; Dallman, Margaret J; Stumpf, Michael P H

    2011-01-01

    Chaos and oscillations continue to capture the interest of both the scientific and public domains. Yet despite the importance of these qualitative features, most attempts at constructing mathematical models of such phenomena have taken an indirect, quantitative approach, for example, by fitting models to a finite number of data points. Here we develop a qualitative inference framework that allows us to both reverse-engineer and design systems exhibiting these and other dynamical behaviours by directly specifying the desired characteristics of the underlying dynamical attractor. This change in perspective from quantitative to qualitative dynamics provides fundamental and new insights into the properties of dynamical systems. PMID:21971504

  10. Automation strategies in five domains - A comparison of levels of automation, function allocation and visualisation of automatic functions

    International Nuclear Information System (INIS)

    This study was conducted as a field study where control room operators and engineers from the refinery, heat and power, aviation, shipping and nuclear domain were interviewed regarding use of automation and the visualisation of automatic functions. The purpose of the study was to collect experiences and best practices from the five studied domains on levels of automation, function allocation and visualisation of automatic functions. In total, nine different control room settings were visited. The studied settings were compared using a systemic approach based on a human-machine systems model. The results show that the 'left over principle' is still the most commonly applied approach for function allocation, but in high risk settings the decision whether to automate or not is more carefully considered. Regarding the visualisation of automatic functions, it was found that as long as each display type (process based, functional oriented, situation oriented and task based) is applied so that it corresponds to the same level of abstraction as the technical system, the operator's mental model will be supported. No single display type can however readily match all levels of abstraction at the same time - all display types are still needed and serve different purposes. (Author)

  11. Automation strategies in five domains - A comparison of levels of automation, function allocation and visualisation of automatic functions

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, J. (Chalmers Univ. of Technology. Division Design and Human factors. Dept. of Product and Production Development, Goeteborg (Sweden))

    2011-01-15

    This study was conducted as a field study where control room operators and engineers from the refinery, heat and power, aviation, shipping and nuclear domain were interviewed regarding use of automation and the visualisation of automatic functions. The purpose of the study was to collect experiences and best practices from the five studied domains on levels of automation, function allocation and visualisation of automatic functions. In total, nine different control room settings were visited. The studied settings were compared using a systemic approach based on a human-machine systems model. The results show that the 'left over principle' is still the most commonly applied approach for function allocation, but in high risk settings the decision whether to automate or not is more carefully considered. Regarding the visualisation of automatic functions, it was found that as long as each display type (process based, functional oriented, situation oriented and task based) is applied so that it corresponds to the same level of abstraction as the technical system, the operator's mental model will be supported. No single display type can however readily match all levels of abstraction at the same time - all display types are still needed and serve different purposes. (Author)

  12. IPR 2016. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2016-05-15

    The volume on the meeting of pediatric radiology includes abstracts on the following issues: chest, cardiovascular system, neuroradiology, CT radiation DRLs (diagnostic reference levels) and dose reporting guidelines, genitourinary imaging, gastrointestinal radiology, oncology and nuclear medicine, whole body imaging, fetal/neonatal imaging, child abuse, oncology and hybrid imaging, value added imaging, musculoskeletal imaging, dose and radiation safety, imaging children - immobilization and distraction techniques, information - education - QI and healthcare policy, ALARA, the knowledge skills and competences for a technologist/radiographer in pediatric radiology, full exploitation of new technological features in pediatric CT, image quality issues in pediatrics, abdominal imaging, interventional radiology, MR contrast agents, tumor - mass imaging, cardiothoracic imaging, ultrasonography.

  13. Program and abstracts

    International Nuclear Information System (INIS)

    This volume contains the program and abstracts of the conference. The following topics are included: metal vapor molecular lasers, magnetohydrodynamics, rare gas halide and nuclear pumped lasers, transfer mechanisms in arcs, kinetic processes in rare gas halide lasers, arcs and flows, XeF kinetics and lasers, fundamental processes in excimer lasers, electrode effects and vacuum arcs, electron and ion transport, ion interactions and mobilities, glow discharges, diagnostics and afterglows, dissociative recombination, electron ionization and excitation, rare gas excimers and group VI lasers, breakdown, novel laser pumping techniques, electrode-related discharge phenomena, photon interactions, attachment, plasma chemistry and infrared lasers, electron scattering, and reactions of excited species

  14. Circularity and Lambda Abstraction

    DEFF Research Database (Denmark)

    Danvy, Olivier; Thiemann, Peter; Zerny, Ian

    2013-01-01

    In this tribute to Doaitse Swierstra, we present the first transformation between lazy circular programs a la Bird and strict circular programs a la Pettorossi. Circular programs a la Bird rely on lazy recursive binding: they involve circular unknowns and make sense equationally. Circular...... unknowns from what is done to them, which we lambda-abstract with functions. The circular unknowns then become dead variables, which we eliminate. The result is a strict circular program a la Pettorossi. This transformation is reversible: given a strict circular program a la Pettorossi, we introduce...

  15. IPR 2016. Abstracts

    International Nuclear Information System (INIS)

    The volume on the meeting of pediatric radiology includes abstracts on the following issues: chest, cardiovascular system, neuroradiology, CT radiation DRLs (diagnostic reference levels) and dose reporting guidelines, genitourinary imaging, gastrointestinal radiology, oncology and nuclear medicine, whole body imaging, fetal/neonatal imaging, child abuse, oncology and hybrid imaging, value added imaging, musculoskeletal imaging, dose and radiation safety, imaging children - immobilization and distraction techniques, information - education - QI and healthcare policy, ALARA, the knowledge skills and competences for a technologist/radiographer in pediatric radiology, full exploitation of new technological features in pediatric CT, image quality issues in pediatrics, abdominal imaging, interventional radiology, MR contrast agents, tumor - mass imaging, cardiothoracic imaging, ultrasonography.

  16. ESPR 2015. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2015-05-10

    The volume includes the abstracts of the ESPR 2015 covering the following topics: PCG (post graduate courses): Radiography; fluoroscopy and general issues; nuclear medicine, interventional radiology and hybrid imaging, pediatric CT, pediatric ultrasound; MRI in childhood. Scientific sessions and task force sessions: International aspects; neuroradiology, neonatal imaging, engineering techniques to simulate injury in child abuse, CT - dose and quality, challenges in the chest, cardiovascular and chest, musculoskeletal, oncology, pediatric uroradiology and abdominal imaging, fetal and postmortem imaging, education and global challenges, neuroradiology - head and neck, gastrointestinal and genitourinary.

  17. Abstracts of Major Articles

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    On Problems in Fujian's Present Health Insurance Professionals and Related Suggestions LIN Deng-hui, WU Xiao-nan (School of Public Health, Fujian Medical University, Fuzhou 350108, China) Abstract: Based on a statistical analysis of questionnaire survey data collected from practitioners in Fujian's medical insurance management system, the paper discusses the problems relevant to the staff's quality structure in this industry as well as mechanisms for continuing education and motivation. Finally, the authors advance suggestions such as increasing the levels of practitioners' expertise and working capacity by developing disciplinary education and continuing education, and encouraging employees to become highly motivated through a well-established motivation system.

  18. ABSTRACTS AND KEY WORDS

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Establishment of a Method for Content Determination of Polysaccharide in Membranous milkvetch root Applied in Fisheries Yu Xiao-qing et al. (1) Abstract Some chemical components in the traditional Chinese medicine Membranous milkvetch root can improve the disease-prevention ability of animals, and it can be applied in fisheries. In this paper, a method for the content determination of polysaccharide in the root was established based on an orthogonal experimental design. Key words medicine; polysaccharide in Membranous milkvetch root; method of determination

  19. Economic Perspectives on Automated Demand Responsive Transportation and Shared Taxi Services - Analytical models and simulations for policy analysis

    OpenAIRE

    Jokinen, Jani-Pekka

    2016-01-01

    The automated demand responsive transportation (DRT) and modern shared taxi services provide shared trips for passengers, adapting dynamically to trip requests by routing a fleet of vehicles operating without any fixed routes or schedules. Compared with traditional public transportation, these new services provide trips without transfers and free passengers from the necessity of using timetables and maps of route networks. Furthermore, automated DRT applies real-time traffic information in ve...

  20. Towards Automation 2.0: A Neurocognitive Model for Environment Recognition, Decision-Making, and Action Execution

    OpenAIRE

    Zucker Gerhard; Dietrich Dietmar; Velik Rosemarie

    2011-01-01

    The ongoing penetration of building automation by information technology is by far not saturated. Today's systems must not only be reliable and fault tolerant, they also have to address energy efficiency and flexibility in overall consumption. Meeting the quality and comfort goals in building automation while at the same time optimizing for energy, carbon footprint and cost-efficiency requires systems that are able to handle large amounts of information and negotiate system behaviour ...

  1. Automated Voxel Model from Point Clouds for Structural Analysis of Cultural Heritage

    Science.gov (United States)

    Bitelli, G.; Castellazzi, G.; D'Altri, A. M.; De Miranda, S.; Lambertini, A.; Selvaggi, I.

    2016-06-01

    In the context of cultural heritage, an accurate and comprehensive digital survey of a historical building is today essential in order to measure its geometry in detail for documentation or restoration purposes, to support special studies regarding materials and constructive characteristics, and finally for structural analysis. Some proven geomatic techniques, such as photogrammetry and terrestrial laser scanning, are increasingly used to survey buildings of varying complexity and dimensions; one typical product is in the form of point clouds. We developed a semi-automatic procedure to convert point clouds, acquired by laser scanning or digital photogrammetry, into a filled volume model of the whole structure. The filled volume model, in a voxel format, can be useful for further analysis and also for the generation of a Finite Element Model (FEM) of the surveyed building. In this paper a new approach is presented with the aim of decreasing operator intervention in the workflow and obtaining a better description of the structure. To achieve this result, a voxel model with variable resolution is produced. Different parameters are compared and different steps of the procedure are tested and validated in the case study of the North tower of the San Felice sul Panaro Fortress, a monumental historical building located in San Felice sul Panaro (Modena, Italy) that was hit by an earthquake in 2012.
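    A minimal sketch of the core voxelization step, mapping a point cloud onto an occupancy grid; the variable-resolution and volume-filling logic of the actual procedure is omitted, and all names and values are illustrative:

```python
import numpy as np

def voxelize(points, voxel_size):
    """Map a point cloud (N x 3 array) onto a boolean occupancy grid.

    Returns the grid plus its origin, so voxel (i, j, k) covers the cube
    starting at origin + voxel_size * [i, j, k]."""
    points = np.asarray(points, dtype=float)
    origin = points.min(axis=0)
    idx = np.floor((points - origin) / voxel_size).astype(int)
    shape = idx.max(axis=0) + 1
    grid = np.zeros(shape, dtype=bool)
    grid[idx[:, 0], idx[:, 1], idx[:, 2]] = True
    return grid, origin

# Hypothetical mini point cloud (coordinates in metres)
pts = np.array([[0.0, 0.0, 0.0], [0.9, 0.1, 0.2], [2.1, 0.0, 0.0]])
grid, origin = voxelize(pts, voxel_size=1.0)
```

A FEM-oriented pipeline would then fill the interior of the occupied shell and export each occupied voxel as a hexahedral element.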

  2. Automating the Simulation of SME Processes through a Discrete Event Parametric Model

    Directory of Open Access Journals (Sweden)

    Francesco Aggogeri

    2015-02-01

    Full Text Available At the factory level, the manufacturing system can be described as a group of processes governed by complex weaves of engineering strategies and technologies. Decision-making processes involve a lot of information, driven by managerial strategies, technological implications and layout constraints. Many factors affect decisions, and their combination must be carefully managed to determine the best solutions to optimize performance. In this way, advanced simulation tools could support the decision-making process of many SMEs. The accessibility of these tools is limited by knowledge, cost, data availability and development time. These tools should be used to support strategic decisions rather than specific situations. In this paper, a novel approach is proposed that aims to facilitate the simulation of manufacturing processes by fast modelling and evaluation. The idea is to realize a model that can be automatically adapted to the user’s specific needs. The model must be characterized by a high degree of flexibility, configurability and adaptability in order to automatically simulate multiple/heterogeneous industrial scenarios. In this way, even an SME can easily access a complex tool, perform thorough analyses and be supported in taking strategic decisions. The parametric DES model is part of a greater software platform developed during the EU-funded COPERNICO project.

  3. Natural Environment Modeling and Fault-Diagnosis for Automated Agricultural Vehicle

    DEFF Research Database (Denmark)

    Blas, Morten Rufus; Blanke, Mogens

    2008-01-01

    This paper presents results for an automatic navigation system for agricultural vehicles. The system uses stereo-vision, inertial sensors and GPS. Special emphasis has been placed on modeling the natural environment in conjunction with a fault-tolerant navigation system. The results are exemplified...

  4. Meaning-Based Scoring: A Systemic Functional Linguistics Model for Automated Test Tasks

    Science.gov (United States)

    Gleason, Jesse

    2014-01-01

    Communicative approaches to language teaching that emphasize the importance of speaking (e.g., task-based language teaching) require innovative and evidence-based means of assessing oral language. Nonetheless, research has yet to produce an adequate assessment model for oral language (Chun 2006; Downey et al. 2008). Limited by automatic speech…

  5. Bilateral Image Subtraction and Multivariate Models for the Automated Triaging of Screening Mammograms

    Directory of Open Access Journals (Sweden)

    José Celaya-Padilla

    2015-01-01

    Full Text Available Mammography is the most common and effective breast cancer screening test. However, the rate of positive findings is very low, making the radiologic interpretation monotonous and biased toward errors. This work presents a computer-aided diagnosis (CADx) method aimed at automatically triaging mammogram sets. The method coregisters the left and right mammograms, extracts image features, and classifies the subjects into risk of having malignant calcifications (CS), malignant masses (MS), and healthy subjects (HS). In this study, 449 subjects (197 CS, 207 MS, and 45 HS) from a public database were used to train and evaluate the CADx. Percentile-rank (p-rank) and z-normalizations were used. For the p-rank, the CS versus HS model achieved a cross-validation accuracy of 0.797 with an area under the receiver operating characteristic curve (AUC) of 0.882; the MS versus HS model obtained an accuracy of 0.772 and an AUC of 0.842. For the z-normalization, the CS versus HS model achieved an accuracy of 0.825 with an AUC of 0.882 and the MS versus HS model obtained an accuracy of 0.698 and an AUC of 0.807. The proposed method has the potential to rank cases with a high probability of malignant findings, aiding in the prioritization of radiologists' work lists.
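    The two feature normalizations mentioned above can be sketched as follows; this is a naive version that ignores ties in the percentile ranking, and the feature values are made up:

```python
import numpy as np

def percentile_rank(x):
    """Percentile-rank normalization: map each value to its rank scaled to [0, 1].

    Naive version assuming distinct values (no tie handling)."""
    x = np.asarray(x, dtype=float)
    ranks = x.argsort().argsort()        # 0-based rank of each element
    return ranks / (len(x) - 1)

def z_normalize(x):
    """z-normalization: shift and scale to zero mean and unit standard deviation."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / x.std()

features = np.array([3.0, 1.0, 4.0, 1.5, 5.0])
p = percentile_rank(features)
z = z_normalize(features)
```

The p-rank version is robust to outliers and monotone transformations of a feature, while z-normalization preserves the shape of its distribution; the study compares classifiers built on each.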

  6. Automated Response Surface Methodology for Stochastic Optimization Models with Unknown Variance

    NARCIS (Netherlands)

    R.P. Nicolai (Robin); R. Dekker (Rommert)

    2005-01-01

    Response Surface Methodology (RSM) is a tool that was introduced in the early 1950s by Box and Wilson (1951). It is a collection of mathematical and statistical techniques useful for the approximation and optimization of stochastic models. Applications of RSM can be found in e.g. chemical
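    A minimal RSM-style sketch under stated assumptions: fit a second-order polynomial response surface to noisy observations by least squares and solve for its stationary point. The response function, design grid, and noise level are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Unknown noisy response with an optimum at (1, -2) (illustrative)
def response(x1, x2):
    return 5.0 - (x1 - 1.0) ** 2 - (x2 + 2.0) ** 2 + rng.normal(0.0, 0.1)

# Design: sample the region of interest on a small grid
xs = np.array([(a, b) for a in np.linspace(-1, 3, 5)
                      for b in np.linspace(-4, 0, 5)])
ys = np.array([response(a, b) for a, b in xs])

# Second-order model: y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
X = np.column_stack([np.ones(len(xs)), xs[:, 0], xs[:, 1],
                     xs[:, 0] ** 2, xs[:, 1] ** 2, xs[:, 0] * xs[:, 1]])
beta, *_ = np.linalg.lstsq(X, ys, rcond=None)

# Stationary point: set the gradient of the fitted quadratic to zero
A = np.array([[2 * beta[3], beta[5]], [beta[5], 2 * beta[4]]])
optimum = np.linalg.solve(A, -beta[1:3])
```

Full RSM iterates this fit-and-move cycle over sequential designs; the automated variant in the record above additionally estimates the unknown noise variance to decide how many replications each design point needs.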

  7. DATA FOR ENVIRONMENTAL MODELING (D4EM): BACKGROUND AND EXAMPLE APPLICATIONS OF DATA AUTOMATION

    Science.gov (United States)

    Data is a basic requirement for most modeling applications. Collecting data is expensive and time consuming. High speed internet connections and growing databases of online environmental data go a long way to overcoming issues of data scarcity. Among the obstacles still remaining...

  8. An automated nowcasting model of significant instability events in the flight terminal area of Rio de Janeiro, Brazil

    Science.gov (United States)

    Borges França, Gutemberg; Valdonel de Almeida, Manoel; Rosette, Alessana C.

    2016-05-01

    This paper presents a novel model, based on neural network techniques, to produce short-term and local-specific forecasts of significant instability for flights in the terminal area of Galeão Airport, Rio de Janeiro, Brazil. Twelve years of data were used for neural network training/validation and testing. Data are originally from four sources: (1) hourly meteorological observations from surface meteorological stations at five airports distributed around the study area; (2) atmospheric profiles collected twice a day at the meteorological station at Galeão Airport; (3) rain rate data collected from a network of 29 rain gauges in the study area; and (4) lightning data regularly collected by national detection networks. An investigation was undertaken regarding the capability of a neural network to produce early warning signs - or to serve as a nowcasting tool - for significant instability events in the study area. The automated nowcasting model was tested using five categorical statistics, whose values for forecasts of the first, second, and third hours, respectively, are given in parentheses: proportion correct (0.99, 0.97, and 0.94), BIAS (1.10, 1.42, and 2.31), probability of detection (0.79, 0.78, and 0.67), false-alarm ratio (0.28, 0.45, and 0.73), and threat score (0.61, 0.47, and 0.25). Possible sources of error related to the test procedure are presented and discussed. The test showed that the proposed model (or neural network) can capture the physical content of the data set, and its performance for the first and second hours is quite encouraging for nowcasting significant instability events in the study area.
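The five categorical verification scores quoted above are standard functions of a 2x2 forecast contingency table; a generic sketch (not the authors' code):

```python
def verification_scores(hits, misses, false_alarms, correct_negatives):
    """Standard scores for verifying dichotomous (event / no-event) forecasts,
    computed from the 2x2 contingency table counts."""
    total = hits + misses + false_alarms + correct_negatives
    return {
        "proportion_correct": (hits + correct_negatives) / total,
        "bias": (hits + false_alarms) / (hits + misses),       # forecast/observed event ratio
        "pod": hits / (hits + misses),                         # probability of detection
        "far": false_alarms / (hits + false_alarms),           # false-alarm ratio
        "threat_score": hits / (hits + misses + false_alarms),
    }
```

The pattern in the abstract (BIAS above 1 and FAR growing with lead time) is characteristic of a model that over-forecasts events more strongly as the forecast horizon lengthens.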

  9. Laser performance operations model (LPOM): The computational system that automates the setup and performance analysis of the National Ignition Facility

    Science.gov (United States)

    Shaw, Michael; House, Ronald

    2015-02-01

    The National Ignition Facility (NIF) is a stadium-sized facility containing a 192-beam, 1.8 MJ, 500-TW, 351-nm laser system together with a 10-m diameter target chamber with room for many target diagnostics. NIF is the world's largest laser experimental system, providing a national center to study inertial confinement fusion and the physics of matter at extreme energy densities and pressures. A computational system, the Laser Performance Operations Model (LPOM), has been developed that automates the laser setup process and accurately predicts laser energetics. LPOM uses diagnostic feedback from previous NIF shots to maintain accurate energetics models (gains and losses), as well as links to operational databases to provide `as currently installed' optical layouts for each of the 192 NIF beamlines. LPOM deploys a fully integrated laser physics model, the Virtual Beamline (VBL), in its predictive calculations in order to meet the accuracy requirements of NIF experiments, and to provide the ability to determine the damage risk to optical elements throughout the laser chain. LPOM determines the settings of the injection laser system required to achieve the desired laser output, provides equipment protection, and determines the diagnostic setup. Additionally, LPOM provides real-time post-shot data analysis and reporting for each NIF shot. The LPOM computational system is designed as a multi-host computational cluster (with 200 compute nodes, providing the capability to run full NIF simulations fully in parallel) to meet the demands of both the control systems within a shot cycle and the NIF user community outside of a shot cycle.

  10. Interactional Metadiscourse in Research Article Abstracts

    Science.gov (United States)

    Gillaerts, Paul; Van de Velde, Freek

    2010-01-01

    This paper deals with interpersonality in research article abstracts analysed in terms of interactional metadiscourse. The evolution in the distribution of three prominent interactional markers comprised in Hyland's (2005a) model, viz. hedges, boosters and attitude markers, is investigated in three decades of abstract writing in the field of…

  11. From abstract to peer-reviewed publication: country matters

    DEFF Research Database (Denmark)

    Fosbol, E.; Fosbøl, Philip Loldrup; Eapen, Z. J.;

    2013-01-01

    within 2 years of the conference. Less is known about the relative difference between countries in regards to likelihood of publication. Methods: Using a validated automated computer algorithm, we searched the ISI Web of Science to identify peer-reviewed publications of abstracts presented at the AHA...

  12. Multi-Instance Learning Models for Automated Support of Analysts in Simulated Surveillance Environments

    Science.gov (United States)

    Birisan, Mihnea; Beling, Peter

    2011-01-01

    New generations of surveillance drones are being outfitted with numerous high definition cameras. The rapid proliferation of fielded sensors and supporting capacity for processing and displaying data will translate into ever more capable platforms, but with increased capability comes increased complexity and scale that may diminish the usefulness of such platforms to human operators. We investigate methods for alleviating strain on analysts by automatically retrieving content specific to their current task using a machine learning technique known as Multi-Instance Learning (MIL). We use MIL to create a real time model of the analysts' task and subsequently use the model to dynamically retrieve relevant content. This paper presents results from a pilot experiment in which a computer agent is assigned analyst tasks such as identifying caravanning vehicles in a simulated vehicle traffic environment. We compare agent performance between MIL aided trials and unaided trials.
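The core MIL assumption underlying such methods (a bag of instances is labeled positive if at least one of its instances is positive) can be sketched generically; the scorer and threshold below are illustrative, not from the paper:

```python
def bag_score(instances, instance_score):
    """Score a bag by its highest-scoring instance: under the standard MIL
    assumption, a bag is positive iff some instance in it is positive."""
    return max(instance_score(x) for x in instances)

def classify_bag(instances, instance_score, threshold=0.5):
    """Label a bag positive when its best instance clears the threshold."""
    return bag_score(instances, instance_score) >= threshold
```

In the surveillance setting of the abstract, a "bag" might be one video segment and the instances its detected objects, so the analyst model only needs weak, bag-level feedback.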

  13. Evaluation of automated statistical shape model based knee kinematics from biplane fluoroscopy

    DEFF Research Database (Denmark)

    Baka, Nora; Kaptein, Bart L.; Giphart, J. Erik;

    2014-01-01

    State-of-the-art fluoroscopic knee kinematic analysis methods require the patient-specific bone shapes segmented from CT or MRI. Substituting the patient-specific bone shapes with personalizable models, such as statistical shape models (SSM), could eliminate the CT/MRI acquisitions, and thereby … decrease costs and radiation dose (when eliminating CT). SSM based kinematics, however, have not yet been evaluated on clinically relevant joint motion parameters. Therefore, in this work the applicability of SSMs for computing knee kinematics from biplane fluoroscopic sequences was explored. Kinematic …-posterior tibial drawer, joint distraction-contraction, flexion, tibial rotation and adduction. The relationship between kinematic precision and bone shape accuracy was also investigated. The SSM based kinematics resulted in sub-millimeter (0.48-0.81 mm) and approximately 1° (0.69-0.99°) median precision …

  14. Automated Object-Oriented Simulation Framework for Modelling of Superconducting Magnets at CERN

    CERN Document Server

    Maciejewski, Michał; Bartoszewicz, Andrzej

    The thesis aims at designing a flexible, extensible, user-friendly interface to model electro-thermal transients occurring in superconducting magnets. Simulations are a fundamental tool for assessing the performance of a magnet and its protection system against the effects of a quench. The application is created using a scalable and modular architecture based on the object-oriented programming paradigm, which opens an easy way for future extensions. What is more, each model, composed of thousands of blocks, is automatically created in MATLAB/Simulink. Additionally, the user is able to automatically run sets of simulations with varying parameters. Due to its scalability and modularity the framework can easily be used to simulate a wide range of materials and magnet configurations.

  15. Micro-simulation Modeling of Coordination of Automated Guided Vehicles at Intersection

    OpenAIRE

    Makarem, Laleh; Pham, Minh Hai; Dumont, André-Gilles; Gillet, Denis

    2012-01-01

    One of the challenging problems with autonomous vehicles is their performance at intersections. This paper shows an alternative control method for the coordination of autonomous vehicles at intersections. The proposed approach is grounded in multi-robot coordination and it also takes into account vehicle dynamics as well as realistic communication constraints. The existing concept of decentralized navigation functions is combined with a sensing model and a crossing strategy is developed. It i...

  16. Protection and Automation Devices Testing Using the Modeling Features of EUROSTAG

    OpenAIRE

    Sauhats, A; Utāns, A; Kucajevs, J; Pašņins, G; Antonovs, D; Bieļa-Dailidoviča, E

    2011-01-01

    Calculating the settings of asynchronous-regime liquidation devices can be very difficult because of variable power system parameters. Therefore, the efficiency of the operation and the correctness of the settings must be tested experimentally in various power system operating regimes. Asynchronous regimes are characterized by signals whose form is difficult to recreate using traditional relay testing techniques, so these test signals are obtained by modelling various regimes of power system w...

  17. A novel automated behavioral test battery assessing cognitive rigidity in two genetic mouse models of autism.

    OpenAIRE

    Alicja Puścian; Szymon Łęski; Tomasz Górkiewicz; Ksenia Meyza; Hans-Peter Lipp; Ewelina Anna Knapska

    2014-01-01

    Repetitive behaviors are a key feature of many pervasive developmental disorders, such as autism. As a heterogeneous group of symptoms, repetitive behaviors are conceptualized into two main subgroups: sensory/motor (lower-order) and cognitive rigidity (higher-order). Although lower-order repetitive behaviors are measured in mouse models in several paradigms, so far there have been no high-throughput tests directly measuring cognitive rigidity. We describe a novel approach for monitoring repet...

  18. Automated diagnosis of coronary artery disease based on data mining and fuzzy modeling.

    Science.gov (United States)

    Tsipouras, Markos G; Exarchos, Themis P; Fotiadis, Dimitrios I; Kotsia, Anna P; Vakalis, Konstantinos V; Naka, Katerina K; Michalis, Lampros K

    2008-07-01

    A fuzzy rule-based decision support system (DSS) is presented for the diagnosis of coronary artery disease (CAD). The system is automatically generated from an initial annotated dataset, using a four-stage methodology: 1) induction of a decision tree from the data; 2) extraction of a set of rules from the decision tree, in disjunctive normal form, and formulation of a crisp model; 3) transformation of the crisp set of rules into a fuzzy model; and 4) optimization of the parameters of the fuzzy model. The dataset used for the DSS generation and evaluation consists of 199 subjects, each one characterized by 19 features, including demographic and history data, as well as laboratory examinations. Tenfold cross validation is employed, and the average sensitivity and specificity obtained is 62% and 54%, respectively, using the set of rules extracted from the decision tree (first and second stages), while the average sensitivity and specificity increase to 80% and 65%, respectively, when the fuzzification and optimization stages are used. The system offers several advantages since it is automatically generated, it provides CAD diagnosis based on easily and noninvasively acquired features, and it is able to provide interpretation for the decisions made. PMID:18632325
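As a generic illustration of stage 3 (turning crisp threshold rules into fuzzy ones), a crisp condition "x > t" can be replaced by a sigmoid membership and conjunctive rules fired with a min operator; the feature names and thresholds below are hypothetical, not from the CAD dataset:

```python
import math

def sigmoid(x, t, s):
    """Fuzzy membership for the crisp condition 'x > t'; s controls steepness.
    As s -> 0 this approaches the original crisp step function."""
    return 1.0 / (1.0 + math.exp(-(x - t) / s))

def fuzzy_rule(features, conditions):
    """Fire a conjunctive rule as the min of its condition memberships.
    conditions: list of (feature_name, threshold, steepness) tuples."""
    return min(sigmoid(features[name], t, s) for name, t, s in conditions)
```

Stage 4 would then tune the thresholds and steepness values (e.g. to maximize cross-validated sensitivity/specificity), which is what lifts performance over the crisp tree.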

  19. An automated model for rooftop PV systems assessment in ArcGIS using LIDAR

    Directory of Open Access Journals (Sweden)

    Mesude Bayrakci Boz

    2015-08-01

    As photovoltaic (PV) systems have become less expensive, building rooftops have come to be attractive for local power production. Identifying rooftops suitable for solar energy systems over large geographic areas is needed for cities to obtain more accurate assessments of production potential and likely patterns of development. This paper presents a new method for extracting roof segments and locating suitable areas for PV systems using Light Detection and Ranging (LIDAR) data and building footprints. Rooftop segments are created using seven slope (tilt) classes, five aspect (azimuth) classes and six different building types. Moreover, direct beam shading caused by nearby objects and the surrounding terrain is taken into account on a monthly basis. Finally, the method is implemented as an ArcGIS model in ModelBuilder and a tool is created. In order to show its validity, the method is applied to the city of Philadelphia, PA, USA, with the criteria of slope, aspect, shading and area used to locate suitable areas for PV system installation. The results show that 33.7% of the building footprint areas and 48.6% of the rooftop segments identified are suitable for PV systems. Overall, this study provides a replicable model using commercial software that is capable of extracting individual roof segments with more detailed criteria across an urban area.
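The final screening step combines per-segment slope, aspect, shading and area criteria; a toy sketch of such a filter (all thresholds are illustrative placeholders, not the values used in the paper):

```python
def pv_suitable(slope_deg, aspect_deg, shaded_fraction, area_m2,
                max_slope=45.0, min_area=10.0, max_shade=0.3):
    """Toy suitability screen for one roof segment (northern hemisphere).
    Aspect is degrees clockwise from north; roughly south-facing segments
    (90-270 deg) pass, as do near-flat roofs where aspect hardly matters."""
    south_facing = 90.0 <= aspect_deg <= 270.0 or slope_deg < 10.0
    return (slope_deg <= max_slope and south_facing
            and shaded_fraction <= max_shade and area_m2 >= min_area)
```

In the paper this logic runs inside ArcGIS ModelBuilder over LIDAR-derived segments rather than as standalone code.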

  20. Automated Sperm Head Detection Using Intersecting Cortical Model Optimised by Particle Swarm Optimization

    Science.gov (United States)

    Tan, Weng Chun; Mat Isa, Nor Ashidi

    2016-01-01

    In human sperm motility analysis, sperm segmentation plays an important role to determine the location of multiple sperms. To ensure an improved segmentation result, the Laplacian of Gaussian filter is implemented as a kernel in a pre-processing step before applying the image segmentation process to automatically segment and detect human spermatozoa. This study proposes an intersecting cortical model (ICM), which was derived from several visual cortex models, to segment the sperm head region. However, the proposed method suffered from parameter selection; thus, the ICM network is optimised using particle swarm optimization where feature mutual information is introduced as the new fitness function. The final results showed that the proposed method is more accurate and robust than four state-of-the-art segmentation methods. The proposed method resulted in rates of 98.14%, 98.82%, 86.46% and 99.81% in accuracy, sensitivity, specificity and precision, respectively, after testing with 1200 sperms. The proposed algorithm is expected to be implemented in analysing sperm motility because of the robustness and capability of this algorithm. PMID:27632581
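The optimisation stage can be illustrated with a minimal particle swarm optimiser; this one-dimensional sketch minimises a toy function and is not the authors' implementation (which optimises ICM network parameters with a mutual-information fitness):

```python
import random

def pso(f, lo, hi, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal 1-D particle swarm optimiser (minimisation)."""
    rng = random.Random(seed)
    xs = [rng.uniform(lo, hi) for _ in range(n_particles)]     # positions
    vs = [0.0] * n_particles                                   # velocities
    pbest = xs[:]                                              # personal bests
    pbest_f = [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g], pbest_f[g]                      # global best
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            vs[i] = (w * vs[i] + c1 * r1 * (pbest[i] - xs[i])
                     + c2 * r2 * (gbest - xs[i]))
            xs[i] = min(hi, max(lo, xs[i] + vs[i]))            # clamp to bounds
            fx = f(xs[i])
            if fx < pbest_f[i]:
                pbest[i], pbest_f[i] = xs[i], fx
                if fx < gbest_f:
                    gbest, gbest_f = xs[i], fx
    return gbest
```

Swapping the toy objective for a segmentation-quality fitness (here, feature mutual information) is what turns this generic loop into the paper's parameter-selection step.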

  2. Smart Frameworks and Self-Describing Models: Model Metadata for Automated Coupling of Hydrologic Process Components (Invited)

    Science.gov (United States)

    Peckham, S. D.

    2013-12-01

    Model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that allow heterogeneous sets of process models to be assembled in a plug-and-play manner to create composite "system models". These mechanisms facilitate code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers, e.g. by requiring them to provide their output in specific forms that meet the input requirements of other models. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. 
If the caller is a modeling framework, it can compare the answers to these queries with similar answers from other process models in a collection and then automatically call framework…
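The standardized interface described above (control functions plus description functions) can be sketched as follows; the class and method names are illustrative, loosely in the spirit of CSDMS's Basic Model Interface rather than its exact API:

```python
class SelfDescribingModel:
    """Hypothetical minimal interface: a caller can control the model and
    query its description without knowing its internals."""
    # --- control functions ---
    def initialize(self, config): ...
    def update(self): ...
    def finalize(self): ...
    # --- description functions ---
    def get_input_var_names(self): ...
    def get_output_var_names(self): ...
    def get_time_step(self): ...

class DecayModel(SelfDescribingModel):
    """Toy process model: exponential decay of a single state variable."""
    def initialize(self, config):
        self.state = config["initial_value"]
        self.k = config["decay_rate"]
        self.dt = config["time_step"]
    def update(self):
        self.state -= self.k * self.state * self.dt
    def finalize(self):
        pass
    def get_input_var_names(self):
        return []
    def get_output_var_names(self):
        return ["tracer_concentration"]
    def get_time_step(self):
        return self.dt

# A framework can drive any model exposing this interface:
m = DecayModel()
m.initialize({"initial_value": 1.0, "decay_rate": 0.1, "time_step": 1.0})
for _ in range(3):
    m.update()
m.finalize()
```

Because the framework only ever calls the standardized functions, swapping one process model for another requires no change to the coupling code, which is the design goal the abstract describes.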

  3. ABSTRACTS AND KEY WORDS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Determination of the Estrogen Alkylphenols and Bisphenol A in Marine Sediments by Gas Chromatography-Mass Spectrometry Deng Xu-xiu et al. (1) Abstract Octylphenol, nonylphenol and bisphenol A are recognized environmental endocrine disruptors. A quantitative method was established for the simultaneous determination of octylphenol, nonylphenol and bisphenol A in marine sediments by gas chromatography-mass spectrometry. The test sample was extracted with methanol using an ultrasonic technique, purified with copper powder and a carbon solid-phase extraction column, and derivatized with heptafluorobutyric anhydride. The analytes were then separated on an HP-5ms column and determined by gas chromatography-mass spectrometry. The recovery of the method was between 84.3% and 94.5%, and the LOQ of 4-N-octylphenol, nonylphenol and bisphenol A was 0.25 µg/kg, 0.15 µg/kg and 0.15 µg/kg. Key words octylphenol; nonylphenol; bisphenol A; gas chromatography-mass spectrometry

  4. A LARI Experience (Abstract)

    Science.gov (United States)

    Cook, M.

    2015-12-01

    (Abstract only) In 2012, Lowell Observatory launched The Lowell Amateur Research Initiative (LARI) to formally involve amateur astronomers in scientific research by bringing them to the attention of and helping professional astronomers with their astronomical research. One of the LARI projects is the BVRI photometric monitoring of Young Stellar Objects (YSOs), wherein amateurs obtain observations to search for new outburst events and characterize the colour evolution of previously identified outbursters. A summary of the scientific and organizational aspects of this LARI project, including its goals and science motivation, the process for getting involved with the project, a description of the team members, their equipment and methods of collaboration, and an overview of the programme stars, preliminary findings, and lessons learned is presented.

  5. Contents and Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    [Ancient Mediterranean Civilizations] Title: On Poseidon's Image in Homeric Epics Author: Zhu Yizhang, Lecturer, School of History and Culture, Shandong University, Jinan, Shandong, 250100, China. Abstract: Poseidon was an important figure in the religion, myth and literature of ancient Greece. His religious functions and his mythical image in literature were mainly established by the Homeric Epics. Poseidon not only appears frequently in the Homeric Epics but also directly influences the development of the plots; therefore, he can be seen as one of the most important gods in the Epics. But the Homeric Epics do not introduce his basic image clearly. In the Homeric Epics, Poseidon's figure carries both deity and humanity aspects, and the latter is emphasized, which implies his archetype was a mortal wanax.

  6. Abstracts of Selected Papers

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    On the Social Solidarity of Organization: An Empirical Analysis Li Hanlin Abstract: Based on 2002 survey data, this paper tries to measure solidarity in organizations. This measurement is operationalized from two points of view: the degree of cohesion and the degree of vulnerability. To observe and measure the degree of cohesion, three subscales are used: social support, vertical integration and organizational identity. To observe and measure the degree of vulnerability, another three subscales are used: dissatisfaction, relative deprivation and anomie. Finally, the paper explores under which conditions organizational behavior and behavior orientation converge or diverge. Key words: Organization Cohesion Vulnerability Organization Behavior

  7. ABSTRACTS AND KEY WORDS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Comparative Study on Adhesion Effect Among Different Materials of Sepia esculenta Wang Xue-mei et al. (1) Abstract PE harness, mesh, sea cucumber seedling box attachment, sorghum bar, tamarix (fresh and old), artemisia annua (fresh and old) and an artificial egg-based substrate were used as spawning substrates of Sepia esculenta in a comparative study of adhesion effect during artificial breeding. The results showed that the best was the artificial egg-based substrate produced by the process invented in this study. The second best were old artemisia annua and tamarix. PE harness, mesh, sea cucumber seedling box attachment and sorghum bar were unsatisfactory as spawning substrates of Sepia esculenta. Key words Sepia esculenta; adhesion effect; different materials

  8. ICENES 2007 Abstracts

    International Nuclear Information System (INIS)

    This book contains the conference program and abstracts of the 13th International Conference on Emerging Nuclear Energy Systems, held on 03-08 June 2007 in Istanbul, Turkey. The main objective of the International Conference series on Emerging Nuclear Energy Systems (ICENES) is to provide an international scientific and technical forum for scientists, engineers, industry leaders, policy makers, decision makers and young professionals who will shape future energy supply and technology, for a broad review and discussion of various advanced, innovative and non-conventional nuclear energy production systems. The main topics of the 159 accepted papers from 35 countries are fusion science and technology, fission reactors, accelerator-driven systems, transmutation, lasers in nuclear technology, radiation shielding, nuclear reactions, hydrogen energy, solar energy, low energy physics and societal issues.

  9. Automated 3D Motion Tracking using Gabor Filter Bank, Robust Point Matching, and Deformable Models

    Science.gov (United States)

    Wang, Xiaoxu; Chung, Sohae; Metaxas, Dimitris; Axel, Leon

    2013-01-01

    Tagged Magnetic Resonance Imaging (tagged MRI or tMRI) provides a means of directly and noninvasively displaying the internal motion of the myocardium. Reconstruction of the motion field is needed to quantify important clinical information, e.g., the myocardial strain, and detect regional heart functional loss. In this paper, we present a three-step method for this task. First, we use a Gabor filter bank to detect and locate tag intersections in the image frames, based on local phase analysis. Next, we use an improved version of the Robust Point Matching (RPM) method to sparsely track the motion of the myocardium, by establishing a transformation function and a one-to-one correspondence between grid tag intersections in different image frames. In particular, the RPM helps to minimize the impact on the motion tracking result of: 1) through-plane motion, and 2) relatively large deformation and/or relatively small tag spacing. In the final step, a meshless deformable model is initialized using the transformation function computed by RPM. The model refines the motion tracking and generates a dense displacement map, by deforming under the influence of image information, and is constrained by the displacement magnitude to retain its geometric structure. The 2D displacement maps in short and long axis image planes can be combined to drive a 3D deformable model, using the Moving Least Square method, constrained by the minimization of the residual error at tag intersections. The method has been tested on a numerical phantom, as well as on in vivo heart data from normal volunteers and heart disease patients. The experimental results show that the new method has a good performance on both synthetic and real data. Furthermore, the method has been used in an initial clinical study to assess the differences in myocardial strain distributions between heart disease (left ventricular hypertrophy) patients and the normal control group. 
The final results show that the proposed method…
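The first step (tag detection with a Gabor filter bank) builds on the standard 2-D Gabor kernel, a cosine carrier under a Gaussian envelope; a textbook-style sketch (not the authors' code):

```python
import math

def gabor_kernel(size, wavelength, theta, sigma, psi=0.0, gamma=1.0):
    """Real 2-D Gabor kernel of odd `size`: a Gaussian envelope modulating
    a cosine of the given wavelength, oriented at angle theta (radians)."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            xr = x * math.cos(theta) + y * math.sin(theta)    # rotated coords
            yr = -x * math.sin(theta) + y * math.cos(theta)
            g = math.exp(-(xr**2 + (gamma * yr)**2) / (2 * sigma**2))
            row.append(g * math.cos(2 * math.pi * xr / wavelength + psi))
        kernel.append(row)
    return kernel
```

A bank of such kernels at several orientations and wavelengths, convolved with the tagged image, gives the local-phase responses from which tag intersections are located.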

  10. Automated Image-Based Procedures for Accurate Artifacts 3D Modeling and Orthoimage Generation

    Directory of Open Access Journals (Sweden)

    Marc Pierrot-Deseilligny

    2011-12-01

    The accurate 3D documentation of architectures and heritages is becoming very common and required in different application contexts. The potentialities of the image-based approach are nowadays very well known, but there is a lack of reliable, precise and flexible solutions, possibly open-source, which could be used for metric and accurate documentation or digital conservation and not only for simple visualization or web-based applications. The article presents a set of photogrammetric tools developed in order to derive accurate 3D point clouds and orthoimages for the digitization of archaeological and architectural objects. The aim is also to distribute free solutions (software, methodologies, guidelines, best practices, etc.) based on 3D surveying and modeling experiences, useful in different application contexts (architecture, excavations, museum collections, heritage documentation, etc.) and according to several representation needs (2D technical documentation, 3D reconstruction, web visualization, etc.).

  11. A functional tolerance model: an approach to automate the inspection process

    Directory of Open Access Journals (Sweden)

    R. Hunter

    2008-12-01

    Purpose: The purpose of this paper is the definition of a framework to describe the Technological Product Specifications (TPS) and the information associated with geometric dimensioning and tolerancing, in order to integrate the design concepts into a commercial inspection system. Design/methodology/approach: A functional tolerance model provides a complete framework to define the geometric dimensioning and tolerancing and its relationship with the part geometry and the inspection process. This framework establishes a connection between a computer-aided design and a computer-aided inspection system through the export of the information associated with the dimensions and tolerances of the part into a commercial CAI system. Findings: These are mainly focused on the definition of a framework that describes the relationship between the entities of dimensions and tolerances and the geometry of the part. The information imported into a CAI system allows the inspection process to be developed without the additional information provided by a physical drawing of the part. Research limitations/implications: These regard the limited access to commercial CAI systems and the lack of protocols for exchanging data associated with the tolerances of the part. Practical implications: These involve facilitating the development of the inspection process, allowing the inspection to be realized while reducing the time spent defining the geometry to inspect and the parameters that must be controlled. Originality/value: The main value of this research is the development of a unique framework to bring the information related to geometric dimensioning and tolerances and the geometry of the part into a common model. This model provides a complete definition and representation of the entities, attributes and relationships of the design and inspection system.
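A functional tolerance model of this kind essentially links tolerance entities to the geometric features they control; a minimal sketch (entity names and attributes are illustrative, not the paper's schema):

```python
from dataclasses import dataclass, field

@dataclass
class Feature:
    """A geometric entity of the part (e.g. a plane, cylinder or hole)."""
    name: str
    kind: str

@dataclass
class Tolerance:
    """A dimensional tolerance attached to one or more features."""
    spec: str                              # e.g. "diameter", "flatness"
    nominal: float
    lower: float                           # lower deviation (signed)
    upper: float                           # upper deviation (signed)
    features: list = field(default_factory=list)

    def accepts(self, measured):
        """Inspection check: is the measured value within the limits?"""
        return self.nominal + self.lower <= measured <= self.nominal + self.upper

hole = Feature("mounting_hole", "hole")
dia = Tolerance("diameter", 10.0, -0.05, 0.05, [hole])
```

Exporting such linked records to a CAI system is what lets the inspection plan be generated without a physical drawing.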

  12. Improving amino-acid identification, fit and C(alpha) prediction using the Simplex method in automated model building.

    Science.gov (United States)

    Romo, Tod D; Sacchettini, James C; Ioerger, Thomas R

    2006-11-01

    Automated methods for protein model building in X-ray crystallography typically use a two-phased approach that involves first modeling the protein backbone followed by building in the side chains. The latter phase requires the identification of the amino-acid side-chain type as well as fitting of the side-chain model into the observed electron density. While mistakes in identification of individual side chains are common for a number of reasons, sequence alignment can sometimes be used to correct errors by mapping fragments into the true (expected) amino-acid sequence and exploiting contiguity constraints among neighbors. However, side chains cannot always be confidently aligned; this depends on having sufficient accuracy in the initial calls. The recognition of amino-acid side-chains based on the surrounding pattern of electron density, whether by features, density correlation or free atoms, can be sensitive to inaccuracies in the coordinates of the predicted backbone C(alpha) atoms to which they are anchored. By incorporating a Nelder-Mead Simplex search into the side-chain identification and model-building routines of TEXTAL, it is demonstrated that this form of residue-by-residue rigid-body real-space refinement (in which the C(alpha) itself is allowed to shift) can improve the initial accuracy of side-chain selection by over 25% on average (from 25% average identity to 32% on a test set of five representative proteins, without corrections by sequence alignment). This improvement in amino-acid selection accuracy in TEXTAL is often sufficient to bring the pairwise amino-acid identity of chains in the model out of the so-called 'twilight zone' for sequence-alignment methods. When coupled with sequence alignment, use of the Simplex search yielded improvements in side-chain accuracy on average by over 13 percentage points (from 64 to 77%) and up to 38 percentage points (from 40 to 78%) in one case compared with using sequence alignment alone. PMID:17057345
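The Nelder-Mead Simplex search used for the rigid-body real-space refinement can be illustrated generically; this minimal implementation minimises a toy 2-D function rather than a density-correlation score:

```python
def nelder_mead(f, x0, step=0.5, iters=200):
    """Minimal derivative-free Nelder-Mead minimisation of f over R^n."""
    alpha, gamma, rho, sigma = 1.0, 2.0, 0.5, 0.5   # reflect/expand/contract/shrink
    n = len(x0)
    simplex = [list(x0)]
    for i in range(n):                               # initial simplex around x0
        p = list(x0)
        p[i] += step
        simplex.append(p)
    for _ in range(iters):
        simplex.sort(key=f)                          # best first, worst last
        centroid = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
        worst = simplex[-1]
        refl = [centroid[i] + alpha * (centroid[i] - worst[i]) for i in range(n)]
        if f(refl) < f(simplex[0]):                  # try expanding past the reflection
            exp = [centroid[i] + gamma * (refl[i] - centroid[i]) for i in range(n)]
            simplex[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):               # accept plain reflection
            simplex[-1] = refl
        else:                                        # contract toward the centroid
            contr = [centroid[i] + rho * (worst[i] - centroid[i]) for i in range(n)]
            if f(contr) < f(worst):
                simplex[-1] = contr
            else:                                    # shrink everything toward the best
                best = simplex[0]
                simplex = [best] + [
                    [best[i] + sigma * (p[i] - best[i]) for i in range(n)]
                    for p in simplex[1:]
                ]
    simplex.sort(key=f)
    return simplex[0]
```

In the paper's setting, the objective would score how well the side-chain model (anchored at a shiftable C-alpha position) matches the local electron density, and the search runs independently for each residue.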

  13. Improving the correlation of structural FEA models by the application of automated high density robotized laser Doppler vibrometry

    Science.gov (United States)

    Chowanietz, Maximilian; Bhangaonkar, Avinash; Semken, Michael; Cockrill, Martin

    2016-06-01

    Sound has had an intricate relation with the wellbeing of humans since time immemorial. It has the ability to enhance the quality of life immensely when present as music; at the same time, it can degrade its quality when manifested as noise. Hence, understanding its sources and the processes by which it is produced gains acute significance. Although various theories exist with respect to evolution of bells, it is indisputable that they carry millennia of cultural significance, and at least a few centuries of perfection with respect to design, casting and tuning. Despite the science behind its design, the nuances pertaining to founding and tuning have largely been empirical, and conveyed from one generation to the next. Post-production assessment for bells remains largely person-centric and traditional. However, progressive bell manufacturers have started adopting methods such as finite element analysis (FEA) for informing and optimising their future model designs. To establish confidence in the FEA process it is necessary to correlate the virtual model against a physical example. This is achieved by performing an experimental modal analysis (EMA) and comparing the results with those from FEA. Typically to collect the data for an EMA, the vibratory response of the structure is measured with the application of accelerometers. This technique has limitations; principally these are the observer effect and limited geometric resolution. In this paper, 3-dimensional laser Doppler vibrometry (LDV) has been used to measure the vibratory response with no observer effect due to the non-contact nature of the technique; resulting in higher accuracy measurements as the input to the correlation process. The laser heads were mounted on an industrial robot that enables large objects to be measured and extensive data sets to be captured quickly through an automated process. This approach gives previously unobtainable geometric resolution resulting in a higher confidence EMA. 
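The abstract does not name the FEA/EMA comparison metric, but mode-shape correlation of this kind is commonly summarized with the Modal Assurance Criterion (MAC). A minimal sketch, assuming real-valued mode-shape vectors sampled at matching measurement points (the mode-shape numbers below are invented):

```python
def mac(phi_a, phi_e):
    """Modal Assurance Criterion between two mode-shape vectors:
    MAC = |phi_a . phi_e|^2 / ((phi_a . phi_a) * (phi_e . phi_e)).
    1.0 = identical shapes (up to scale), 0.0 = orthogonal shapes."""
    dot = sum(a * e for a, e in zip(phi_a, phi_e))
    return dot * dot / (
        sum(a * a for a in phi_a) * sum(e * e for e in phi_e)
    )

# Hypothetical first bending mode from FEA vs. a rescaled EMA estimate.
fea_mode = [0.0, 0.31, 0.59, 0.81, 0.95, 1.00]
ema_mode = [0.0, 0.62, 1.18, 1.62, 1.90, 2.00]   # same shape, doubled scale
print(round(mac(fea_mode, ema_mode), 3))          # scaling does not affect MAC
```

Because MAC is scale-invariant, an unscaled LDV measurement can be correlated directly against an FEA mode shape.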

  14. An automation of design and modelling tasks in NX Siemens environment with original software - cost module

    Science.gov (United States)

    Zbiciak, R.; Grabowik, C.; Janik, W.

    2015-11-01

    The design-constructional process is a creative activity which strives to fulfil, as well as possible at a given moment in time, all demands and needs formulated by a user, taking into account social, technical and technological advances. An engineer's knowledge, skills and innate abilities have the greatest influence on the final product's quality and cost; they also decide the product's technical and economic value. Given the above, it seems advisable to build software tools that support the engineer in manufacturing cost estimation. The Cost module is built from analytical procedures used for relative manufacturing cost estimation. As in the case of the Generator module, the Cost module was written in the object-oriented programming language C# in the Visual Studio environment. During the research, the following eight factors, which have the greatest influence on the overall manufacturing cost, were distinguished and defined: (i) gear wheel teeth type, i.e. straight or helical; (ii) gear wheel design shape, A or B, with or without a wheel hub; (iii) gear tooth module; (iv) teeth number; (v) gear rim width; (vi) gear wheel material; (vii) heat treatment or thermochemical treatment; (viii) accuracy class. Knowledge of parameters (i) to (v) is indispensable for proper modelling of 3D gear wheel models in a CAD system environment; these parameters are also processed in the Cost module. The last three parameters, i.e. (vi) to (viii), are used exclusively in the Cost module. The estimation of relative manufacturing cost is based on indexes calculated for each particular parameter. Estimated in this way, the relative manufacturing cost gives an overview of the influence of the design parameters on the final gear wheel manufacturing cost. This relative manufacturing cost takes values in the range from 0.00 to 1.00; the bigger the index value, the higher the relative manufacturing cost. Verification whether the proposed algorithm of relative manufacturing
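The abstract does not give the Cost module's actual index tables or weights, so the sketch below invents illustrative partial indexes for the eight parameters and combines them with equal weights into a single relative cost in [0.00, 1.00]:

```python
# Illustrative sketch only: every partial index and threshold below is an
# assumption, not the paper's actual data.
PARTIAL_INDEX = {
    "teeth_type": {"straight": 0.2, "helical": 0.8},
    "shape":      {"A": 0.3, "B": 0.5, "B_with_hub": 0.7},
    "material":   {"C45": 0.3, "42CrMo4": 0.6, "18CrNiMo7-6": 0.9},
    "treatment":  {"none": 0.0, "hardening": 0.5, "carburizing": 0.9},
    "accuracy":   {7: 0.9, 8: 0.6, 9: 0.3},   # tighter class costs more
}

def relative_cost(teeth_type, shape, module_mm, teeth, rim_width_mm,
                  material, treatment, accuracy_class):
    """Combine eight per-parameter partial indexes into one relative
    manufacturing cost in [0.00, 1.00]; a bigger value means a costlier gear."""
    def norm(value, lo, hi):             # linear normalization to [0, 1]
        return min(1.0, max(0.0, (value - lo) / (hi - lo)))
    indexes = [
        PARTIAL_INDEX["teeth_type"][teeth_type],     # (i)
        PARTIAL_INDEX["shape"][shape],               # (ii)
        norm(module_mm, 1.0, 10.0),                  # (iii) gear tooth module
        norm(teeth, 10, 120),                        # (iv) teeth number
        norm(rim_width_mm, 10.0, 100.0),             # (v) gear rim width
        PARTIAL_INDEX["material"][material],         # (vi)
        PARTIAL_INDEX["treatment"][treatment],       # (vii)
        PARTIAL_INDEX["accuracy"][accuracy_class],   # (viii)
    ]
    return sum(indexes) / len(indexes)   # equal weights, an assumption

simple_gear = relative_cost("straight", "A", 2.0, 20, 20.0, "C45", "none", 9)
complex_gear = relative_cost("helical", "B_with_hub", 8.0, 90, 80.0,
                             "18CrNiMo7-6", "carburizing", 7)
```

With these invented tables, a small straight-toothed gear scores well below a large case-hardened helical one, which matches the intent of a relative index.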

  15. Modelling of the operation of the multi-storey automated garage with a big capacity

    Directory of Open Access Journals (Sweden)

    Czesław Pypno

    2013-09-01

    Background: The paper presents the issue of parking in cities. The idea of a multi-storey, overground garage with a capacity of 400 cars per hour is proposed. The main focus is on analysing the loading and unloading as well as the trans-shipment of cars on the storeys of the garage. Methods: Queuing theory has been used to model the vehicle handling process. The theory enables general methods to be drawn up which indicate the basic factors describing the handling process and evaluate the quality of work of the queuing system. Aims: The subject of the paper is to check the influence of stochastic effects on the effectiveness of parking operation in a multi-storey garage. Conclusions: The garage could be a solution to parking problems in city centres and in the vicinity of factories, office buildings, academic centres and the like. Furthermore, the research method may support and speed up the decision process when choosing the optimal structure, organization and, above all, construction of the parking garage.
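The abstract does not specify which queuing model was used; a common starting point for a bank of identical lifts is the M/M/c queue with the Erlang C formula. The sketch below assumes 400 cars/h arriving and a hypothetical service rate of 60 cars/h per lift, both invented for illustration:

```python
import math

def erlang_c(arrival_rate, service_rate, servers):
    """M/M/c queue: probability an arriving car must wait (Erlang C) and the
    mean wait in queue. Rates are per hour; requires utilization < 1."""
    a = arrival_rate / service_rate          # offered load in Erlangs
    rho = a / servers                        # per-server utilization
    assert rho < 1.0, "system is unstable"
    tail = (a ** servers) / (math.factorial(servers) * (1.0 - rho))
    head = sum((a ** k) / math.factorial(k) for k in range(servers))
    p_wait = tail / (head + tail)
    mean_wait_h = p_wait / (servers * service_rate - arrival_rate)
    return p_wait, mean_wait_h

# Hypothetical sizing study: how many lifts keep waits acceptable?
for lifts in (8, 10, 12):
    p, w = erlang_c(400.0, 60.0, lifts)
    print(lifts, round(p, 3), round(w * 3600, 1))   # wait printed in seconds
```

Adding lifts drives both the waiting probability and the mean wait down, which is the kind of trade-off such a model lets the designer quantify before committing to a structure.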

  16. Automated Method for Estimating Nutation Time Constant Model Parameters for Spacecraft Spinning on Axis

    Science.gov (United States)

    2008-01-01

    Calculating an accurate nutation time constant (NTC), or nutation rate of growth, for a spinning upper stage is important for ensuring mission success. Spacecraft nutation, or wobble, is caused by energy dissipation anywhere in the system. Propellant slosh in the spacecraft fuel tanks is the primary source for this dissipation and, if it is in a state of resonance, the NTC can become short enough to violate mission constraints. The Spinning Slosh Test Rig (SSTR) is a forced-motion spin table where fluid dynamic effects in full-scale fuel tanks can be tested in order to obtain key parameters used to calculate the NTC. We accomplish this by independently varying nutation frequency versus the spin rate and measuring force and torque responses on the tank. This method was used to predict parameters for the Genesis, Contour, and Stereo missions, whose tanks were mounted outboard from the spin axis. These parameters are incorporated into a mathematical model that uses mechanical analogs, such as pendulums and rotors, to simulate the force and torque resonances associated with fluid slosh.
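The NTC tau characterizes exponential growth of the nutation half-cone angle, theta(t) = theta0 * exp(t / tau). As a sketch of how tau can be extracted from a growth record, the code below fits a line to log(theta) against time; the true NTC, initial angle and sampling times are invented for illustration:

```python
import math

# Nutation angle grows as theta(t) = theta0 * exp(t / NTC); a line fitted to
# log(theta) versus t therefore has slope 1 / NTC. All numbers are invented.
TRUE_NTC = 120.0                     # seconds, hypothetical time constant
theta0 = 0.05                        # hypothetical initial half-cone angle
times = [10.0 * k for k in range(30)]
thetas = [theta0 * math.exp(t / TRUE_NTC) for t in times]

# Ordinary least-squares slope of log(theta) vs. t.
n = len(times)
ys = [math.log(th) for th in thetas]
t_mean = sum(times) / n
y_mean = sum(ys) / n
slope = sum((t - t_mean) * (y - y_mean) for t, y in zip(times, ys)) \
        / sum((t - t_mean) ** 2 for t in times)
estimated_ntc = 1.0 / slope
```

On real SSTR data the angle record is noisy and the slosh resonance shifts with fill fraction and spin rate, so the fit would be applied per test condition rather than once.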

  17. Analysis of complex networks using aggressive abstraction.

    Energy Technology Data Exchange (ETDEWEB)

    Colbaugh, Richard; Glass, Kristin; Willard, Gerald

    2008-10-01

    This paper presents a new methodology for analyzing complex networks in which the network of interest is first abstracted to a much simpler (but equivalent) representation, the required analysis is performed using the abstraction, and analytic conclusions are then mapped back to the original network and interpreted there. We begin by identifying a broad and important class of complex networks which admit abstractions that are simultaneously dramatically simplifying and property preserving (we call these aggressive abstractions) and which can therefore be analyzed using the proposed approach. We then introduce and develop two forms of aggressive abstraction: (1) finite state abstraction, in which dynamical networks with uncountable state spaces are modeled using finite state systems, and (2) one-dimensional abstraction, whereby high-dimensional network dynamics are captured in a meaningful way using a single scalar variable. In each case, the property-preserving nature of the abstraction process is rigorously established and efficient algorithms are presented for computing the abstraction. The considerable potential of the proposed approach to complex network analysis is illustrated through case studies involving vulnerability analysis of technological networks and predictive analysis for social processes.
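The paper's networks and algorithms are not reproduced here, but the idea of finite state abstraction can be shown on a toy example: quotient a concrete transition system by a partition of its states, keeping an abstract edge whenever some concrete state in one class steps into the other (existential abstraction), so abstract reachability over-approximates the concrete system:

```python
# Toy existential abstraction (invented example, not from the paper):
# concrete states 0..9 with the step x -> (x + 1) % 10, abstracted by parity.
concrete_states = range(10)
def concrete_step(x):
    return (x + 1) % 10

def alpha(x):                     # abstraction map: state -> parity class
    return x % 2

# Class A has an abstract edge to class B iff SOME concrete x in A has a
# successor in B; this preserves every concrete behaviour in the abstraction.
abstract_edges = {(alpha(x), alpha(concrete_step(x))) for x in concrete_states}
print(sorted(abstract_edges))     # -> [(0, 1), (1, 0)]
```

The ten-state system collapses to two classes that simply alternate, and any path in the concrete system maps to a path in the quotient, which is the property-preserving direction such abstractions rely on.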

  18. GIBS Geospatial Data Abstraction Library (GDAL)

    Data.gov (United States)

    National Aeronautics and Space Administration — GDAL is an open source translator library for raster geospatial data formats that presents a single abstract data model to the calling application for all supported...

  19. Towards Abstract Interpretation of Epistemic Logic

    DEFF Research Database (Denmark)

    Ajspur, Mai; Gallagher, John Patrick

    applicable to infinite models. The abstract model-checker allows model-checking with infinite-state models. When applied to the problem of whether M |= φ, it terminates and returns the set of states in M at which φ might hold. If the set is empty, then M definitely does not satisfy φ, while if the set is non...

  20. Evaluation of different lactation curve models fitted for milk viscosity recorded by an automated on-line California Mastitis Test.

    Science.gov (United States)

    Neitzel, Anne-Christin; Stamer, Eckhard; Junge, Wolfgang; Thaller, Georg

    2015-05-01

    Laboratory somatic cell count (LSCC) records are usually recorded monthly and provide an important information source for breeding and herd management. Daily milk viscosity detection in composite milking (expressed as drain time) with an automated on-line California Mastitis Test (CMT) could serve immediately as an early predictor of udder diseases and might be used as a selection criterion to improve udder health. The aim of the present study was to clarify the relationship between the well-established LSCS and the new trait, 'drain time', and to estimate their correlations with important production traits. Data were recorded on the dairy research farm Karkendamm in Germany. Viscosity sensors were installed on every fourth milking stall in the rotary parlour to measure daily drain time records. Weekly LSCC and milk composition data were available. Two data sets were created, containing records of 187,692 milkings from 320 cows (D1) and 25,887 drain time records from 311 cows (D2). Different fixed effect models describing the log-transformed drain time (logDT) were fitted to achieve applicable models for further analysis. Lactation curves were modelled with standard parametric functions (Ali and Schaeffer, Legendre polynomials of second and third degree) of days in milk (DIM). Random regression models were further applied to estimate correlations of cow effects between logDT, LSCS and further important production traits. LogDT and LSCS were most strongly correlated in mid-lactation (r = 0.78). Correlations between logDT and production traits were low to medium. The highest correlations were reached in late lactation between logDT and milk yield (r = -0.31), between logDT and protein content (r = 0.30), and in early as well as late lactation between logDT and lactose content (r = -0.28). The results of the present study show that drain time could be used as a new trait for daily mastitis control. PMID:25731191
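The Legendre-polynomial lactation curves mentioned above can be sketched as an ordinary least-squares fit of a degree-2 Legendre basis in DIM, solved via the normal equations. The curve coefficients, DIM range and sampling below are invented, and a real random-regression analysis would fit cow-specific curves with mixed-model software rather than this single fixed curve:

```python
# Sketch: fit a lactation curve with Legendre polynomials of degree 2.
# DIM is mapped to x in [-1, 1]; basis P0 = 1, P1 = x, P2 = (3x^2 - 1) / 2.
def legendre_basis(dim, dim_min=5.0, dim_max=305.0):
    x = -1.0 + 2.0 * (dim - dim_min) / (dim_max - dim_min)
    return [1.0, x, 0.5 * (3.0 * x * x - 1.0)]

def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0] * 3
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

TRUE_COEF = [3.2, -0.6, 0.25]           # hypothetical curve coefficients
dims = [5.0 + 10.0 * k for k in range(31)]
obs = [sum(c * p for c, p in zip(TRUE_COEF, legendre_basis(d))) for d in dims]

# Normal equations (B'B) c = B'y for the least-squares coefficients.
B = [legendre_basis(d) for d in dims]
BtB = [[sum(row[i] * row[j] for row in B) for j in range(3)] for i in range(3)]
Bty = [sum(row[i] * y for row, y in zip(B, obs)) for i in range(3)]
fitted = solve3(BtB, Bty)
```

With noiseless synthetic data the fit recovers the generating coefficients, which is a useful sanity check before applying the same basis to real drain-time records.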