WorldWideScience

Sample records for automated model abstraction

  1. Automated Predicate Abstraction for Real-Time Models

    Directory of Open Access Journals (Sweden)

    Bahareh Badban

    2009-11-01

    We present a technique designed to automatically compute predicate abstractions for dense real-time models represented as networks of timed automata. We build on the CIPM algorithm from our previous work, which computes new invariants for timed-automata control locations and prunes the model; we use it to compute a predicate abstraction of the model, taking the control locations and their newly computed invariants into account.
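
    As a rough illustration of the predicate-abstraction step itself (not the CIPM algorithm), the following sketch abstracts concrete clock values through a fixed set of hypothetical predicates and lifts a transition existentially; all names and predicates here are illustrative assumptions:

    ```python
    # Toy predicate abstraction: map concrete clock values to predicate
    # truth vectors and lift a concrete transition existentially.
    # In the paper, predicates come from CIPM-computed location invariants;
    # the two predicates below are made-up stand-ins.
    predicates = [
        lambda x: x < 1.0,    # p0
        lambda x: x >= 2.0,   # p1
    ]

    def alpha(x):
        """Map a concrete clock value to a tuple of predicate truth values."""
        return tuple(p(x) for p in predicates)

    def abstract_delay_transitions(samples, delay):
        """Existential lifting of the time-elapse transition: an abstract edge
        (a, b) exists if some sampled witness x has alpha(x) = a and
        alpha(x + delay) = b."""
        return {(alpha(x), alpha(x + delay)) for x in samples}

    if __name__ == "__main__":
        samples = [i * 0.25 for i in range(20)]
        for a, b in sorted(abstract_delay_transitions(samples, 0.5)):
            print(a, "->", b)
    ```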

  2. Automated Supernova Discovery (Abstract)

    Science.gov (United States)

    Post, R. S.

    2015-12-01

    (Abstract only) We are developing a system of robotic telescopes for automatic recognition of supernovae, as well as other transient events, in collaboration with the Puckett Supernova Search Team. At the SAS2014 meeting, the discovery program, SNARE, was first described. Since then, it has been continuously improved to handle searches under a wide variety of atmospheric conditions. Currently, two telescopes are used to build a reference library while searching for PSNs with a partial library. Since data are taken every clear night, we must deal with varying atmospheric conditions and high background illumination from the Moon. The software is configured to identify a PSN and reshoot for verification, with options to change the run plan to acquire photometric or spectrographic data. The telescopes are 24-inch CDK24s with Alta U230 cameras, one in CA and one in NM. Images and run plans are sent between sites so the CA telescope can search while photometry is done in NM. Our goal is to find bright PSNs of magnitude 17.5 or less, which is the limit of our planned spectroscopy. We present results from our first automated PSN discoveries and plans for PSN data acquisition.

  3. Abstract Delta Modeling

    OpenAIRE

    Dave Clarke; Michiel Helvensteijn; Ina Schaefer

    2011-01-01

    Delta modeling is an approach to facilitate automated product derivation for software product lines. It is based on a set of deltas specifying modifications that are incrementally applied to a core product. The applicability of deltas depends on feature-dependent conditions. This paper presents abstract delta modeling, which explores delta modeling from an abstract, algebraic perspective. Compared to previous work, we take a more flexible approach with respect to conflicts between modifications.

  4. Automated Assume-Guarantee Reasoning by Abstraction Refinement

    Science.gov (United States)

    Pasareanu, Corina S.; Giannakopoulou, Dimitra

    2008-01-01

    Current automated approaches for compositional model checking in the assume-guarantee style are based on learning assumptions as deterministic automata. We propose an alternative approach based on abstraction refinement. Our new method computes the assumptions for the assume-guarantee rules as conservative, not necessarily deterministic, abstractions of some of the components, and refines those abstractions using counterexamples obtained from model checking them together with the other components. Our approach also exploits the alphabets of the interfaces between components and performs iterative refinement of those alphabets as well as of the abstractions. We show experimentally that our preliminary implementation of the proposed alternative achieves performance similar to or better than a previous learning-based implementation.
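
    The refinement loop sketched in this abstract can be pictured as follows. This is a schematic reading of the approach: `abstract`, `model_check`, `is_spurious`, and `refine` are hypothetical callbacks standing in for the paper's actual procedures, not its API:

    ```python
    def assume_guarantee(component1, component2, property_phi,
                         abstract, model_check, is_spurious, refine):
        """Schematic CEGAR-style assume-guarantee loop: the assumption A is a
        conservative abstraction of component2, refined whenever model checking
        component1 against A yields a counterexample that component2 forbids."""
        A = abstract(component2)                  # initial coarse assumption
        while True:
            # Premise 1 of the rule: does component1 composed with A satisfy phi?
            cex = model_check(component1, A, property_phi)
            if cex is None:
                return True, A                    # property holds; A is the assumption
            # Premise 2: is the counterexample actually possible in component2?
            if not is_spurious(cex, component2):
                return False, cex                 # real violation of the property
            A = refine(A, cex)                    # eliminate the spurious behaviour
    ```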

  5. Teaching for Abstraction: A Model

    Science.gov (United States)

    White, Paul; Mitchelmore, Michael C.

    2010-01-01

    This article outlines a theoretical model for teaching elementary mathematical concepts that we have developed over the past 10 years. We begin with general ideas about the abstraction process and differentiate between "abstract-general" and "abstract-apart" concepts. A 4-phase model of teaching, called Teaching for Abstraction, is then proposed…

  6. Replication and Abstraction: Symmetry in Automated Formal Verification

    Directory of Open Access Journals (Sweden)

    Thomas Wahl

    2010-04-01

    This article surveys fundamental and applied aspects of symmetry in system models, and of symmetry reduction methods used to counter state explosion in model checking, an automated formal verification technique. While covering the research field broadly, we particularly emphasize recent progress in applying the technique to realistic systems, including tools that promise to elevate the scope of symmetry reduction to large-scale program verification. The article targets researchers and engineers interested in formal verification of concurrent systems.

  7. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

    This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides a historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.

  8. Automated data model evaluation

    International Nuclear Information System (INIS)

    The modeling process is an essential phase within information systems development and implementation. This paper presents methods and techniques for the analysis and evaluation of data model correctness. Recent methodologies and development results regarding automation of the process of model correctness analysis, and its relation to ontology tools, are presented. Key words: Database modeling, Data model correctness, Evaluation

  9. Automated spatial and thematic generalization using a context transformation model: integrating steering parameters, classification and aggregation hierarchies, reduction factors, and topological structures for multiple abstractions.

    NARCIS (Netherlands)

    Richardson, D.E.

    1993-01-01

    This dissertation presents a model for spatial and thematic digital generalization. To do so, the development of digital generalization over the last thirty years is first reviewed. The approach to generalization taken in this research differs from other existing works as it tackles the task from a da...

  10. Conjure Revisited: Towards Automated Constraint Modelling

    CERN Document Server

    Akgun, Ozgur; Hnich, Brahim; Jefferson, Chris; Miguel, Ian

    2011-01-01

    Automating the constraint modelling process is one of the key challenges facing the constraints field, and one of the principal obstacles preventing widespread adoption of constraint solving. This paper focuses on the refinement-based approach to automated modelling, where a user specifies a problem in an abstract constraint specification language and it is then automatically refined into a constraint model. In particular, we revisit the Conjure system, which first appeared in prototype form in 2005, and present a new implementation with much greater coverage of the specification language Essence.
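
    To illustrate the flavour of refinement-based modelling (these are not Conjure's actual rules), an abstract "find a set of size k" specification can be refined into an occurrence-vector model over 0/1 variables:

    ```python
    # Illustrative refinement, not Conjure's actual transformation rules:
    #   abstract spec:  find S : set (size k) of int(1..n)
    #   refined model:  0/1 variables occ_1..occ_n with sum(occ) = k
    def refine_set_of_size(n, k):
        variables = [f"occ_{i}" for i in range(1, n + 1)]   # occurrence vector
        constraints = [f"{' + '.join(variables)} = {k}"]    # cardinality constraint
        return variables, constraints

    vars_, cons = refine_set_of_size(5, 2)
    print(vars_)   # ['occ_1', 'occ_2', 'occ_3', 'occ_4', 'occ_5']
    print(cons)    # ['occ_1 + occ_2 + occ_3 + occ_4 + occ_5 = 2']
    ```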

  11. Engineering Abstractions in Model Checking and Testing

    OpenAIRE

    Achenbach, Michael; Ostermann, Klaus

    2009-01-01

    Abstractions are used in model checking to tackle problems like state space explosion or modeling of IO. The application of these abstractions in real software development processes, however, lacks engineering support. This is one reason why model checking is not widely used in practice yet and testing is still state of the art in falsification. We show how user-defined abstractions can be integrated into a Java PathFinder setting with tools like AspectJ or Javassist and discuss implications of remaining weaknesses of these tools...

  12. Engineering Abstractions in Model Checking and Testing

    DEFF Research Database (Denmark)

    Achenbach, Michael; Ostermann, Klaus

    2009-01-01

    Abstractions are used in model checking to tackle problems like state space explosion or modeling of IO. The application of these abstractions in real software development processes, however, lacks engineering support. This is one reason why model checking is not widely used in practice yet and testing is still state of the art in falsification. We show how user-defined abstractions can be integrated into a Java PathFinder setting with tools like AspectJ or Javassist and discuss implications of remaining weaknesses of these tools. We believe that a principled engineering approach to designing...

  13. 10111 Abstracts Collection -- Practical Software Testing : Tool Automation and Human Factors

    OpenAIRE

    Harman, Mark; Muccini, Henry; Schulte, Wolfram; Xie, Tao

    2010-01-01

    From March 14, 2010 to March 19, 2010, the Dagstuhl Seminar 10111 "Practical Software Testing: Tool Automation and Human Factors" was held in Schloss Dagstuhl - Leibniz Center for Informatics. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first section des...

  14. SATURATED ZONE FLOW AND TRANSPORT MODEL ABSTRACTION

    Energy Technology Data Exchange (ETDEWEB)

    B.W. ARNOLD

    2004-10-27

    The purpose of the saturated zone (SZ) flow and transport model abstraction task is to provide radionuclide-transport simulation results for use in the total system performance assessment (TSPA) for license application (LA) calculations. This task includes assessment of uncertainty in parameters that pertain to both groundwater flow and radionuclide transport in the models used for this purpose. This model report documents the following: (1) The SZ transport abstraction model, which consists of a set of radionuclide breakthrough curves at the accessible environment for use in the TSPA-LA simulations of radionuclide releases into the biosphere. These radionuclide breakthrough curves contain information on radionuclide-transport times through the SZ. (2) The SZ one-dimensional (1-D) transport model, which is incorporated in the TSPA-LA model to simulate the transport, decay, and ingrowth of radionuclide decay chains in the SZ. (3) The analysis of uncertainty in groundwater-flow and radionuclide-transport input parameters for the SZ transport abstraction model and the SZ 1-D transport model. (4) The analysis of the background concentration of alpha-emitting species in the groundwater of the SZ.
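
    For intuition about breakthrough curves, the classical Ogata-Banks solution of the 1-D advection-dispersion equation gives relative concentration versus time at a fixed distance. This is a textbook formula offered for illustration only, not the report's abstraction model; it assumes NumPy and SciPy are available:

    ```python
    import numpy as np
    from scipy.special import erfc

    def breakthrough(x, t, v, D, c0=1.0):
        """Ogata-Banks solution of 1-D advection-dispersion: relative
        concentration at distance x (m) and times t (s), for pore velocity
        v (m/s) and dispersion coefficient D (m^2/s)."""
        t = np.asarray(t, dtype=float)
        term1 = erfc((x - v * t) / (2.0 * np.sqrt(D * t)))
        term2 = np.exp(v * x / D) * erfc((x + v * t) / (2.0 * np.sqrt(D * t)))
        return 0.5 * c0 * (term1 + term2)

    # Toy parameters: 100 m travel distance, slow groundwater flow.
    times = np.linspace(1e3, 1e7, 5)
    print(breakthrough(x=100.0, t=times, v=1e-5, D=1e-3))
    ```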

  15. SATURATED ZONE FLOW AND TRANSPORT MODEL ABSTRACTION

    International Nuclear Information System (INIS)

    The purpose of the saturated zone (SZ) flow and transport model abstraction task is to provide radionuclide-transport simulation results for use in the total system performance assessment (TSPA) for license application (LA) calculations. This task includes assessment of uncertainty in parameters that pertain to both groundwater flow and radionuclide transport in the models used for this purpose. This model report documents the following: (1) The SZ transport abstraction model, which consists of a set of radionuclide breakthrough curves at the accessible environment for use in the TSPA-LA simulations of radionuclide releases into the biosphere. These radionuclide breakthrough curves contain information on radionuclide-transport times through the SZ. (2) The SZ one-dimensional (1-D) transport model, which is incorporated in the TSPA-LA model to simulate the transport, decay, and ingrowth of radionuclide decay chains in the SZ. (3) The analysis of uncertainty in groundwater-flow and radionuclide-transport input parameters for the SZ transport abstraction model and the SZ 1-D transport model. (4) The analysis of the background concentration of alpha-emitting species in the groundwater of the SZ.

  16. Abstract polymer models with general pair interactions

    CERN Document Server

    Procacci, Aldo

    2007-01-01

    A convergence criterion of cluster expansion is presented in the case of an abstract polymer system with general pair interactions (i.e. not necessarily hard core or repulsive). As a concrete example, the low temperature disordered phase of the BEG model with infinite range interactions, decaying polynomially as $1/r^{d+\\lambda}$ with $\\lambda>0$, is studied.
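
    For reference, the classical convergence criterion for hard-core polymer gases, which this paper generalizes to soft pair interactions, is commonly stated in the Kotecký-Preskill form below; the notation is our assumption:

    ```latex
    % Kotecky-Preskill criterion (hard-core polymer gas): the cluster expansion
    % of \log \Xi converges if there is a function a(\gamma) > 0 such that,
    % for every polymer \gamma,
    \sum_{\gamma' \nsim \gamma} |z_{\gamma'}| \, e^{a(\gamma')} \;\le\; a(\gamma)
    % where z_\gamma is the activity of polymer \gamma and \gamma' \nsim \gamma
    % denotes incompatibility (e.g., overlapping supports).
    ```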

  17. An Abstract Model of Historical Processes

    CERN Document Server

    Poulshock, Michael

    2016-01-01

    A game theoretic model is presented which attempts to simulate, at a very abstract level, power struggles in the social world. In the model, agents can benefit or harm each other, to varying degrees and with differing levels of influence. The agents play a dynamic, noncooperative, perfect information game where the goal is to maximize payoffs based on positional utility and intertemporal preference, while being constrained by social inertia. Agents use the power they have in order to get more of it, both in an absolute and relative sense. More research is needed to assess the model's empirical validity.

  18. Abstract Action Potential Models for Toxin Recognition

    OpenAIRE

    Peterson, James; Khan, Taufiquar

    2005-01-01

    In this paper, we present a robust methodology using mathematical pattern recognition schemes to detect and classify events in action potentials for recognizing toxins in biological cells. We focus on event detection in action potential via abstraction of information content into a low dimensional feature vector within the constrained computational environment of a biosensor. We use generated families of action potentials from a classic Hodgkin–Huxley model to verify our methodology and build...

  19. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    A Preliminary Inquiry into the Intellectual Origins of Li Dazhao's My Idea of Marxism. Abstract: By translingual-textual comparison, this paper attempts to make a preliminary inquiry into the intellectual origins of Li Dazhao's My Idea of Marxism, suggesting that Li's article, instead of being "a complete copy" of the Japanese scholar...

  20. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    The Western Theories of War Ethics and Contemporary Controversies. Li Xiaodong, U Ruijing (4). [Abstract] In the field of international relations, war ethics is a concept with a distinct Western ideological color. Due to factors of history and reality, the in...

  1. Abstract

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Cognitive Structure of Scientific Theory in the Scientist-Philosopher's Eyes、Two Theories of Scientific Abstraction Centered on Practices、Many-worlds Interpretation in Quantum Measurement and Its Meaning、Scientific Instrument: Paradigm Shift from Instrumentalism to Realism and Phenomenology

  2. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    [Abstract] The global resurgence of religion and the return of religion from the so-called "Westphalia Exile" to the central stage of international relations have significantly transformed the viewpoints of both media and academia toward the role of religion in IR, and the challenges posed by religion to contemporary international relations are often described as entirely subversive. The author argues that as a second-tier factor in most countries' foreign policies and international affairs,...

  3. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    (1) Lenin's "Unity of Three Dialectics": Notes on Philosophy in the Dual Contexts of Science of Logic and The Capital. Sun Zhengyu (4). Lenin's dialectics in Notes on Philosophy is essentially a unity of materialistic logic, dialectics and epistemology that has arisen from interactions between Hegel's Science of Logic and Marx's The Capital. Due to a lack of understanding of Lenin's "unity of three dialectics," people tend to mistake his dialectics for the meeting of two extremes, the "sum total of living instances" and "abstract methods,"...

  4. ABSTRACT

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Based on Marx's Economic and Philosophical Manuscripts of 1844. HE Jian-jin (Philosophy Department, Fujian Provincial Committee Party School, Fuzhou, Fujian 350012, China). Abstract: Socialism with Chinese characteristics has a close relationship with the return and growth of capital in China. To implement the scientific concept of development, we must confront the problem of scientifically controlling capital. In the Economic and Philosophical Manuscripts of 1844, Marx criticized three old ways of philosophical thinking about capital, namely object-oriented thinking, intuitive thinking, and purely spiritual abstract thinking, and he established his own unique understanding of capital: to understand capital through human perceptual and practical activities. Contemporary Chinese society exhibits problems of underdevelopment and abnormal development, and the three concurrent heterogeneity problems of the pre-modern, the modern and the postmodern. In order to implement the scientific concept of development, we must reject any abstract affirmation or negation of the modern basic principles guided by capital, and we must oppose the notion that capital is either eternal or evil. Key words: socialism with Chinese characteristics; capital; national economics; scientific concept of development

  5. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    The Relation between Individuals and Work Units in State-Owned Enterprises in the Economic Transition Period: Changes and their Influences. Abstract: As a representation of the extinction of the work-unit system, dramatic changes have taken place in the relation between individuals and work units in state-owned enterprises. Among the many changes are the radical change in the way a work unit stimulates and controls its employees, the extinction of the previous system supported by "work unit people", and a tense relation between employees and the work unit caused by the enterprise's over-pursuit of performance. These changes result in such problems as grievous inequality, violation of personal interests, lack of a mechanism for employees' voices, and a low sense of belonging, which has brought unprecedented challenges for business administration and corporate culture development in China. Keywords: danwei/work unit; stimulate and control; relation between individuals and work units; work unit people

  6. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    On Rousseau's Equality and Freedom. GONG Qun. Abstract: Equality and freedom are the two core concepts of political philosophy, and Rousseau's political philosophy is no exception. Freedom and equality in Rousseau include two levels, the natural state and the social state under the social contract, and between them there is one state of inequality. The relationship between the two concepts is that equality is a necessary precondition of freedom: where there is no equality, there is no freedom. Rousseau's equality is achieved by one contractual act in which all members transfer their rights, especially property rights, and form the Community. Freedom in Rousseau's mind is achieved through the people's sovereignty in the Community.

  7. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    [Abstract] The essay analyzes the action logic of hegemons with a power approach. Hegemony can be classified as benign or malignant. A benign hegemon should be productive and inclusive, and maintain procedural justice when it uses its power. The power of a hegemon can be categorized into two types: hard power, which is the use of coercion and payment and can be measured by public products, and soft power, which is the ability to attract and co-opt and can be measured by relationship-specific investments. The relationship between the input of public products and the relationship-specific investments is not positively correlated. Confusing public products with soft power might lead to strategic misleading. A country rich in power resources should comply with the following principles if it wants to improve its hard power and soft power: first, analyze the scope of the existing hegemon's soft power and avoid investing public products in that scope; second, maintain honesty in the long term and continue to increase others' benefits following the rule of neutral Pareto improvement; third, provide both public goods and public bads; fourth, be more patient in obtaining soft power. [Key Words] hegemon, soft power, relationship-specific investment, strategic misleading. [Authors] Feng Weijiang, Ph.D., Associate Professor, Institute of World Economics and Politics, Chinese Academy of Social Sciences; Yu Jieya, Master, PBC Shanghai Headquarters.

  8. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Discussions of Design Highlights for Tailgas Treatment in Sulphuric Acid Plants Using a New Technology for Flue Gas Desulfurization Through Catalytic Reduction. LI Xin, CAO Long-wen, YIN Hua-qiang, LI Yue-li, LI Jian-jun (1. College of Architecture and Environment, Sichuan University, Chengdu 610065, China; 2. Daye Nonferrous Metals Co., Ltd., Huangshi 435000, China; 3. The Sixth Construction Company Ltd. of China National Chemical Engineering Corp., Xiangfan 441021, China). Abstract: Considering the present situation of tailgas treatment in current sulphuric acid plants and the problems with commonly used technologies, the fundamental working principle, process flow and a reference project for a new flue-gas desulfurization technology based on catalytic reduction, used for tailgas treatment in a sulphuric acid plant and recovery of the sulphur resource, are outlined, and the design highlights of this technology are analyzed. Compared to conventional technologies, the new technology offers high desulfurization efficiency and a unique process, which can effectively tackle the difficulties of tailgas treatment in sulphuric acid plants after enforcement of the new standard; it promises significant economic and environmental benefits, as well as a promising future of application.

  9. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Strategic Realism: An Option for China's Grand Strategy. Song Dexing (4). [Abstract] As a non-Western emerging power, China should positively adapt its grand strategy to the strategic psychological traits of the 21st century, maintain a realist tone consistent with China's national conditions, and avoid adventurist policies while being aware of both its strategic strengths and weaknesses. In the 21st century, China's grand strategy should be based on such core values as security, development, peace and justice, focusing on development in particular; we name this "strategic realism". Given the profound changes in China and the world, strategic realism encourages an active foreign policy to safeguard the long-term national interests of China. Following the self-help logic and the fundamental values of security and prosperity, strategic realism takes national interests as its top priority. It advocates the smart use of power, and aims to achieve its objectives by optimizing both domestic and international conditions. From the perspective of diplomatic philosophy, strategic realism is not a summary of concrete policies but a description of the orientation of China's grand strategy in the new century. [Key Words] China, grand strategy, strategic realism. [Author] Song Dexing, Professor, Ph.D. Supervisor, and Director of the Center for International Strategic Studies, University of International Studies of PLA.

  10. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    The Western Characteristics of the Paradigms of International Studies in America: With the Huaxia System as a Counterexample. Ye Zicheng (4). [Abstract] Three flaws are obvious in the three paradigms of international studies in America. Specifically, their arguments are based on the assumption that the world is anarchic; they go too far in employing scientific and rational methodology; and they pay little attention to humans. Hence, the three paradigms of international studies in America aren't necessarily useful for explaining China's history and culture or its relations with the outside world. The Huaxia system, for example, is anarchic but also apparently hierarchical; the approach of pursuing security in understanding the rise of Western powers may be meaningless, for the hegemon in the Huaxia system needn't worry about its security; and the theory of power balancing seemingly couldn't explain why Qin ended up defeating the alliance of the other six states in the Warring States period. The Huaxia system is quite open, with free movement of people, goods, and ideas. Some interstate regimes and institutions were formed through Huimeng (alliance-making) among states. However, this kind of limited and fragile interdependence and cooperation soon came to an end after the hegemonies of Qi, Jin and Wei. There does exist an identity problem among states in the Huaxia system, but it does not play as great a role as the constructivists would expect.

  11. Automating Data Abstraction in a Quality Improvement Platform for Surgical and Interventional Procedures

    Science.gov (United States)

    Yetisgen, Meliha; Klassen, Prescott; Tarczy-Hornoch, Peter

    2014-01-01

    Objective: This paper describes a text processing system designed to automate the manual data abstraction process in a quality improvement (QI) program. The Surgical Care and Outcomes Assessment Program (SCOAP) is a clinician-led, statewide performance benchmarking QI platform for surgical and interventional procedures. The data elements abstracted as part of this program cover a wide range of clinical information, from patient medical history to details of surgical interventions. Methods: Statistical and rule-based extractors were developed to automatically abstract data elements. A preprocessing pipeline was created to chunk free-text notes into sections, sentences, and tokens. The information extracted in this preprocessing step was used by the statistical and rule-based extractors as features. Findings: Performance results for 25 extractors (14 statistical, 11 rule-based) are presented. The average f1-scores for the 11 rule-based extractors and the 14 statistical extractors are 0.785 (min=0.576, max=0.931, std-dev=0.113) and 0.812 (min=0.571, max=0.993, std-dev=0.135), respectively. Discussion: Our error analysis revealed that most extraction errors were due either to data imbalance in the data set or to the way the gold standard had been created. Conclusion: As future work, more experiments will be conducted with a more comprehensive data set from multiple institutions contributing to the QI project. PMID:25848598
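
    A minimal sketch of the rule-based flavour of such extractors over pre-chunked sentences; the pattern, the negation rule, and the chosen data element are illustrative assumptions, not SCOAP's actual rules:

    ```python
    import re

    # Hypothetical rule: detect mention of a surgical-site infection element.
    SSI_PATTERN = re.compile(
        r"\b(surgical site infection|wound infection|SSI)\b", re.IGNORECASE)

    def extract_ssi(note_sentences):
        """Rule-based extractor over pre-chunked sentences: return sentences
        asserting the data element, skipping simple preceding negations."""
        negation = re.compile(r"\b(no|denies|without|negative for)\b[^.]*$",
                              re.IGNORECASE)
        hits = []
        for sent in note_sentences:
            m = SSI_PATTERN.search(sent)
            # Keep the sentence only if no negation cue precedes the mention.
            if m and not negation.search(sent[:m.start()]):
                hits.append(sent)
        return hits

    print(extract_ssi([
        "Patient denies fever; no surgical site infection noted.",
        "Superficial SSI identified at the incision on postoperative day 5.",
    ]))
    ```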

  12. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Research of Theory & Method. Miscible Flooding Well Test Model with Carbon Dioxide Injection and Its Pressure Analysis. 2011, 20(4): 1-4. Zhu Jianwei, Shao Changjin, Liao Xinwei, Yang Zhenqing (China University of Petroleum, Beijing). Based on the well test analysis theory of miscible flooding with carbon dioxide injection, the diffusion model of the mixture components when carbon dioxide is miscible with the oil, and the variation law of temperature and viscosity, are analyzed,...

  13. Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Post-Western International System and the Rise of the East; Hegemonic Dependence and the Logic in the Declining Ascendance of Leading Powers; Constructive Leadership and China's Diplomatic Transformation; The Bargaining Model of International Mediation Onset: A Quantitative Test; The Impact of Gender Differences on National Military Expenditure

  14. Injecting Abstract Interpretations into Linear Cost Models

    Directory of Open Access Journals (Sweden)

    David Cachera

    2010-06-01

    We present a semantics-based framework for analysing the quantitative behaviour of programs with regard to resource usage. We start from an operational semantics equipped with costs. The dioid structure of the set of costs allows the quantitative semantics to be defined as a linear operator. We then present an abstraction technique inspired by abstract interpretation in order to effectively compute global cost information from the program. Abstraction has to take two distinct notions of order into account: the order on costs and the order on states. We show that our abstraction technique provides a correct approximation of the concrete cost computations.
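
    The "quantitative semantics as a linear operator" idea can be made concrete over the (min,+) cost dioid, where one program step acts linearly on state-indexed cost vectors. The three-state system below is a toy assumption, not the paper's calculus:

    ```python
    INF = float("inf")

    def minplus_apply(M, v):
        """One (min,+)-linear step: (M v)[i] = min over j of (M[i][j] + v[j]).
        In the (min,+) dioid, 'addition' is min and 'multiplication' is +."""
        n = len(v)
        return [min(M[i][j] + v[j] for j in range(n)) for i in range(len(M))]

    # Toy 3-state program: M[i][j] is the cost of one step from state i to j;
    # INF means there is no such step.
    M = [
        [INF, 1.0, INF],
        [INF, INF, 2.0],
        [5.0, INF, INF],
    ]
    v = [INF, INF, 0.0]          # cost-to-go vector: state 2 is the target
    for k in range(1, 4):
        v = minplus_apply(M, v)  # cheapest cost to reach the target in k steps
        print(k, v)
    ```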

  15. Abstract

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Management fraud and auditing scandals have become more serious since the 1970s and 1980s, so that the independence of CPAs has faced unprecedented challenges. Growing emphasis has been put on the independence of CPAs, on which academic research has deepened too. This article analyzes the influences on the independence of CPAs arising from the conflicts among owners/shareholders, managers, and CPAs. By analysing the balance of power in those conflicts and the factors that restrict them, and based on a summary of other scholars' research in this area, this paper puts forward a CPA conflict model based on the corporate governance structure, and makes suggestions on how to protect auditing independence under this new model.

  16. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    A Representative Work that Vainly Attempts to Westernize China: On Yang Ji-sheng's Paper "My View of the Chinese Pattern". XU Chong-wen. Abstract: Mr. Yang Ji-sheng calls the economic connotation of the Chinese pattern a "market economy of power" with all sorts of drawbacks; this takes the very problems that the Chinese model deliberately struggles against, and even the objects that must be resolutely eliminated, as parts of the Chinese pattern, and is thus absolute nonsense. He boils down the political connotation of the Chinese pattern to an "authority politics" that "thoroughly denies the modern democratic system",...

  17. ABSTRACT

    Directory of Open Access Journals (Sweden)

    Michelle de Stefano Sabino

    2011-12-01

    This paper aims to describe and analyze the integration observed in the Sintonia project with respect to the comparison of project management processes to the Stage-Gate® model. The literature addresses these issues conceptually, but lacks an alignment between them that is evident in practice. A single case study was used as the method. The case reported is the Sintonia project, developed by PRODESP, the Data Processing Company of São Paulo. The results show the integration of project management processes with the Stage-Gate model developed during the project life cycle. The formalization of the project was defined in stages, which allowed the exploitation of economies of repetition and recombination in the development of new projects. This study contributes to the technical vision in dealing with the integration of project management processes. It was concluded that this system represents an attractive way, in terms of creating economic value and technological innovation, for the organization.

  18. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Research on the Theory and Standard of Peasants' Life Cycle Pension Compensation. Mu Huaizhong, Shen Yi (2). The difficulty of achieving full coverage in the pension system lies with rural farmers. In this paper, we put forward a "dual agricultural welfare difference" theory and apply it to issues regarding peasants' life cycle pension compensation. Taking the differential between equilibrium and biased agricultural incomes as the key indicator, we build mathematical models of the "dual agricultural welfare balance" and measure its size from 1953 to 2009. Our finding shows that China's "dual agricultural welfare difference" has fluctuated between 0.4 and 0.6. Based on life cycle characteristics, such as the natural life cycle and the policy and institutional life cycle, our suggestion is to compensate peasants' primary pension with a balance of the "dual agricultural welfare difference", among other countermeasures.

  19. Automated parking garage system model

    Science.gov (United States)

    Collins, E. R., Jr.

    1975-01-01

    A one-twenty-fifth scale model of the key components of an automated parking garage system is described. The design of the model required transferring a vehicle from an entry level, vertically (+Z, -Z), to a storage location at any one of four storage positions (+X, -X, +Y, -Y) on the storage levels. There are three primary subsystems: (1) a screw jack to provide the vertical motion of the elevator, (2) a cam-driven track-switching device to provide X to Y motion, and (3) a transfer cart to provide horizontal travel and a small amount of vertical motion for transfer to the storage location. Motive power is provided by dc permanent-magnet gear motors, one each for the elevator and the track-switching device and two for the transfer cart drive system (one driving the cart horizontally and the other providing the vertical transfer). The control system, through the use of a microprocessor, provides complete automation through a feedback system which utilizes sensing devices.

  20. ABSTRACTS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Fold distribution, offset distribution and azimuth distribution in a bin have direct effects on geometry attributes in 3D seismic survey design. If two adjacent bins have the same fold but different offsets, this non-uniform offset distribution introduces stack-amplitude diversity between the adjacent bins. At present, the 3D geometry attribute uniformity of most analytical methods is expressed by qualitative analysis charts. We introduce in this paper a quantitative uniformity analysis method for offset distribution based on the average value, square deviation and a weighting factor. The paper analyses the effects of different 3D geometry parameters on offset distribution uniformity using the proposed quantitative analysis method. Furthermore, the paper analyses the effects of different 3D geometry parameters on seismic stack amplitude or frequency uniformity by seismic wave modeling. The results show that offset distribution uniformity is in good agreement with seismic stack amplitude or frequency uniformity. Therefore this improved method can be considered a useful tool for analyzing and evaluating 3D geometry attribute uniformity.
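
    One way to read the proposed quantitative measure (average value, square deviation, weighting factor) is as a weighted coefficient of variation of offset increments within a bin; the exact formula below is our illustrative assumption, not the authors':

    ```python
    import statistics

    def offset_uniformity(offsets, weight=1.0):
        """Score the regularity of offset spacing in one bin: 0 means perfectly
        uniform increments, larger values mean more irregular sampling.
        Metric: weighted coefficient of variation of sorted offset increments."""
        ordered = sorted(offsets)
        inc = [b - a for a, b in zip(ordered, ordered[1:])]
        mean = statistics.fmean(inc)
        return weight * statistics.pstdev(inc) / mean if mean else float("inf")

    print(offset_uniformity([100, 300, 500, 700]))   # 0.0  (uniform spacing)
    print(offset_uniformity([100, 150, 500, 700]))   # > 0  (irregular spacing)
    ```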

  1. Multimedia abstract generation of intensive care data: the automation of clinical processes through AI methodologies.

    Science.gov (United States)

    Jordan, Desmond; Rose, Sydney E

    2010-04-01

    Medical errors from communication failures are enormous during the perioperative period of cardiac surgical patients. As caregivers change shifts or surgical patients change location within the hospital, key information is lost or misconstrued. After a baseline cognitive study of information need and caregiver workflow, we implemented an advanced clinical decision support tool of intelligent agents, medical logic modules, and text generators called the "Inference Engine" to summarize an individual patient's raw medical data elements into procedural milestones, illness severity, and care therapies. The system generates two displays: 1) for the continuum of care, multimedia abstract generation of intensive care data (MAGIC), an expert system that automatically generates a physician briefing of a cardiac patient's operative course in a multimodal format; and 2) for an isolated point in time, the "Inference Engine", a system that provides a real-time, high-level, summarized depiction of a patient's clinical status. In our studies, system accuracy and efficacy were judged against clinician performance in the workplace. To test the automated physician briefing, MAGIC, the patient's intraoperative course was reviewed in the intensive care unit before patient arrival. It was then judged against the actual physician briefing and that given in a cohort of patients where the system was not used. To test the real-time representation of the patient's clinical status, system inferences were judged against clinician decisions. Changes in workflow and situational awareness were assessed by questionnaires and process evaluation. MAGIC provides 200% more information, twice the accuracy, and enhances situational awareness. This study demonstrates that the automation of clinical processes through AI methodologies yields positive results. PMID:20012610

  2. Geographic information abstractions: conceptual clarity for geographic modeling

    OpenAIRE

    T L Nyerges

    1991-01-01

    Just as we abstract our reality to make life intellectually manageable, we must create abstractions when we build models of geographic structure and process. Geographic information abstractions with aspects of theme, time, and space can be used to provide a comprehensive description of geographic reality in a geographic information system (GIS). In the context of geographic modeling a geographic information abstraction is defined as a simultaneous focus on important characteristics of geographic...

  3. Abstraction, visualization, and evolution of process models

    OpenAIRE

    Kolb, Jens

    2015-01-01

    The increasing adoption of process orientation in companies has resulted in large process model collections. Each process model of such a collection may comprise dozens of elements and captures various perspectives of a business process. Domain experts with only limited modeling knowledge hardly comprehend such complex process models. Therefore, they demand a customized view on business processes enabling them to optimize and evolve process models effectively. This thesis contributes...

  4. Model-based Abstraction of Data Provenance

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, René Rydhof

    2014-01-01

    Identifying provenance of data provides insights to the origin of data and intermediate results, and has recently gained increased interest due to data-centric applications. In this work we extend a data-centric system view with actors handling the data and policies restricting actions. This extension is based on provenance analysis performed on system models. System models have been introduced to model and analyse spatial and organisational aspects of organisations, to identify, e.g., potential insider threats. Both the models and analyses are naturally modular; models can be combined to bigger models, and the analyses adapt accordingly. Our approach extends provenance both with the origin of data, the actors and processes involved in the handling of data, and policies applied while doing so. The model and corresponding analyses are based on a formal model of spatial and organisational...

  5. Model-based Abstraction of Data Provenance

    OpenAIRE

    Probst, Christian W.; Hansen, René Rydhof

    2014-01-01

    Identifying provenance of data provides insights to the origin of data and intermediate results, and has recently gained increased interest due to data-centric applications. In this work we extend a data-centric system view with actors handling the data and policies restricting actions. This extension is based on provenance analysis performed on system models. System models have been introduced to model and analyse spatial and organisational aspects of organisations, to identify, e.g., potential insider threats...

  6. Traffic Modeling and Probabilistic Process Abstraction

    Institute of Scientific and Technical Information of China (English)

    HU Lian-ming

    2003-01-01

    State-based models provide an attractive and simple approach to performance modeling. Unfortunately, this approach gives rise to two fundamental problems: 1) capturing the input loads to a system efficiently within such representations; and 2) coping with the explosion in the number of states when the system is compositionally presented. Both problems can be regarded as searching for some optimal representative state model with a minimal cost. In this paper a probabilistic feedback search approach (popularly referred to as a genetic algorithm) is presented for locating good models with low (state) cost.
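
    A minimal genetic-algorithm loop of the kind described; the encoding of candidate state models as bit lists and the cost function here are toy assumptions, not the paper's actual representation:

    ```python
    import random

    def genetic_search(cost, random_candidate, mutate, crossover,
                       pop_size=30, generations=100):
        """Generic probabilistic feedback search: keep a population of candidate
        state models, breed the cheaper half, and return the best found."""
        pop = [random_candidate() for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=cost)
            parents = pop[: pop_size // 2]          # feedback: select by cost
            children = [mutate(crossover(random.choice(parents),
                                         random.choice(parents)))
                        for _ in range(pop_size - len(parents))]
            pop = parents + children
        return min(pop, key=cost)

    # Toy usage: a "model" is a bit list; cost = number of states kept (ones).
    best = genetic_search(
        cost=sum,
        random_candidate=lambda: [random.randint(0, 1) for _ in range(16)],
        mutate=lambda c: [b ^ (random.random() < 0.05) for b in c],
        crossover=lambda a, b: [random.choice(p) for p in zip(a, b)],
    )
    print(best, sum(best))
    ```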

  7. Integration of Automated Decision Support Systems with Data Mining Abstract: A Client Perspective

    Directory of Open Access Journals (Sweden)

    Abdullah Saad AL-Malaise

    2013-03-01

    Customer behavior and satisfaction always play an important role in increasing an organization's growth and market value. Customers are the top priority for a growing organization building up its business. This paper presents the architecture of a Decision Support System (DSS) designed to deal with customers' enquiries and requests. The main purpose behind the proposed model is to enhance customer satisfaction and behavior using DSS. We propose a model that extends traditional DSS concepts with the integration of a Data Mining (DM) abstract. The model presented in this paper shows a comprehensive architecture for handling customer requests using DSS and knowledge management (KM) to improve customer behavior and satisfaction. Furthermore, the DM abstract provides methods and techniques to understand the contacted customers' data, to classify the replied answers into a number of classes, to generate associations between the same types of queries, and finally to maintain the KM for future correspondence.

  8. A Model-Driven Parser Generator, from Abstract Syntax Trees to Abstract Syntax Graphs

    CERN Document Server

    Quesada, Luis; Cubero, Juan-Carlos

    2012-01-01

    Model-based parser generators decouple language specification from language processing. The model-driven approach avoids the limitations that conventional parser generators impose on the language designer. Conventional tools require the designed language grammar to conform to the specific kind of grammar supported by the particular parser generator (LL and LR parser generators being the most common). Model-driven parser generators, like ModelCC, do not require a grammar specification, since that grammar can be automatically derived from the language model and, if needed, adapted to conform to the requirements of the given kind of parser, all of this without interfering with the conceptual design of the language and its associated applications. Moreover, model-driven tools such as ModelCC are able to automatically resolve references between language elements, hence producing abstract syntax graphs instead of abstract syntax trees as the result of the parsing process. Such graphs are not confined to directed acyclic...

  9. Syntactic Abstraction of B Models to Generate Tests

    CERN Document Server

    Julliand, Jacques; Bué, Pierre-Christophe; Masson, Pierre-Alain

    2010-01-01

    In a model-based testing approach, as well as for the verification of properties, B models provide an interesting solution. However, for industrial applications, the size of their state space often makes them hard to handle. To reduce the number of states, an abstraction function can be used, often combining state-variable elimination and domain abstraction of the remaining variables. This paper complements previous results based on domain abstraction for test generation by adding a preliminary syntactic abstraction phase based on variable elimination. We define a syntactic transformation that suppresses some variables from a B event model, together with a method that chooses relevant variables according to a test purpose. We propose two methods to compute an abstraction A of an initial model M. The first computes A as a simulation of M, and the second computes A as a bisimulation of M. The abstraction process produces a finite state system. We apply this abstraction computation to a Model Based T...

  10. Automated systemic-cognitive analysis of images pixels (generalization, abstraction, classification and identification

    Directory of Open Access Journals (Sweden)

    Lutsenko Y. V.

    2015-09-01

    In the article, the application of systemic-cognitive analysis, its mathematical model (i.e. the system theory of information), and its program toolkit, the "Eidos" system, are examined for loading images from graphics files, synthesis of generalized images of classes, their abstraction, classification of the generalized images (clusters and constructs), and comparison of concrete images with the generalized ones (identification). We suggest using the theory of information to compute, for every pixel, the amount of information indicating that the image is of a certain class. A numerical example is given in which, on the basis of a number of specific examples of images belonging to different classes, generalized images of these classes are formed, independent of their specific implementations, i.e., the "Eidoses" of these images (in Plato's definition), the prototypes or archetypes of images (in Jung's definition). The "Eidos" system provides not only the formation of prototype images, which quantitatively reflect the amount of information in the elements of specific images on their belonging to a particular prototype, but also the comparison of specific images with generalized ones (identification) and the generalization of the images with each other (classification).
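
    The per-pixel "amount of information" can be read as pointwise mutual information between a pixel being lit and class membership; this reading and the numbers below are our assumptions, not necessarily the exact formula used by the "Eidos" system:

    ```python
    import math

    def pixel_class_information(p_pixel_given_class, p_pixel):
        """Pointwise information (bits) that an 'on' pixel contributes toward a
        class: log2 of P(pixel | class) / P(pixel). Positive values are
        evidence for the class, negative values are evidence against it."""
        return math.log2(p_pixel_given_class / p_pixel)

    # Toy numbers: a pixel lit in 80% of class-A images but 20% of all images.
    print(pixel_class_information(0.8, 0.2))   # 2.0 bits in favour of class A
    ```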

  11. Coupling Radar Rainfall to Hydrological Models for Water Abstraction Management

    Science.gov (United States)

    Asfaw, Alemayehu; Shucksmith, James; Smith, Andrea; MacDonald, Ken

    2015-04-01

    The impacts of climate change and growing water use are likely to put considerable pressure on water resources and the environment. In the UK, a reform of surface water abstraction policy has recently been proposed which aims to increase the efficiency of using available water resources whilst minimising impacts on the aquatic environment. Key aspects of this reform include the consideration of dynamic rather than static abstraction licensing, as well as the introduction of water trading concepts. Dynamic licensing will permit varying levels of abstraction depending on environmental conditions (i.e. river flow and quality). The practical implementation of an effective dynamic abstraction strategy requires suitable flow forecasting techniques to inform abstraction asset management. Potentially, the predicted availability of water resources within a catchment can be coupled to predicted demand and current storage to inform a cost-effective water resource management strategy which minimises environmental impacts. The aim of this work is to use a historical analysis of a UK case study catchment to compare the water resource availability of a modelled dynamic abstraction scenario, informed by a flow forecasting model, against observed abstraction under a conventional abstraction regime. The work also demonstrates the impact of modelling uncertainties on the accuracy of predicted water availability over a range of forecast lead times. The study utilised a conceptual rainfall-runoff model, PDM (Probability-Distributed Model, developed by the Centre for Ecology & Hydrology), set up in the Dove River catchment (UK) using 1 km2 resolution radar rainfall as input and 15-minute resolution gauged flow data for calibration and validation. Data assimilation procedures are implemented to improve flow predictions using observed flow data. Uncertainties in the radar rainfall data used in the model are quantified using an artificial statistical error model described by a Gaussian distribution and...
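
    A minimal sketch of the PDM idea used here: store capacities follow a Pareto distribution, so the runoff-contributing (saturated) area fraction at critical capacity c* is 1 - (1 - c*/c_max)^b. The parameters and the simplified water balance below are illustrative assumptions, not the calibrated Dove catchment model:

    ```python
    def pdm_step(storage, rain, pet, c_max=100.0, b=0.5):
        """One time step of a minimal probability-distributed soil moisture
        store: evaporate, compute the saturated area fraction from the Pareto
        capacity distribution, split rainfall into quick runoff and storage."""
        storage = max(storage - pet, 0.0)                # evaporation loss
        c_star = min(storage + rain, c_max)              # critical capacity
        sat_frac = 1.0 - (1.0 - c_star / c_max) ** b     # contributing area
        runoff = rain * sat_frac                         # quick runoff (mm)
        storage = min(storage + rain - runoff, c_max)
        return storage, runoff

    s, flows = 40.0, []
    for r in [0.0, 5.0, 20.0, 2.0]:                      # mm of radar rainfall
        s, q = pdm_step(s, rain=r, pet=1.0)
        flows.append(round(q, 2))
    print(flows)
    ```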

  12. Automating Weaving of Interaction Models in The Aspect Oriented Model Driven Framework using Kermeta

    OpenAIRE

    2007-01-01

    The Aspect Oriented Model Driven Framework (AOMDF) is a software design approach facilitating multidimensional separation of crosscutting concerns in a model-driven software development setting. This thesis provides a proof of concept of automated weaving (composition) of interaction models (sequence diagrams) in AOMDF. The contribution is twofold. Firstly, we develop a metamodel that orthogonally extends UML2 and introduces new abstract syntax concepts for modeling of interaction aspects and...

  13. Compositional Abstraction of PEPA Models for Transient Analysis

    DEFF Research Database (Denmark)

    Smith, Michael James Andrew

    2010-01-01

    Stochastic process algebras such as PEPA allow complex stochastic models to be described in a compositional way, but this leads to state space explosion problems. To combat this, there has been a great deal of work in developing techniques for abstracting Markov chains. In particular, abstract - or interval - Markov chains ... explicitly. In this paper, we present a compositional application of abstract Markov chains to PEPA, based on a Kronecker representation of the underlying CTMC. This can be used to bound probabilistic reachability properties in the Continuous Stochastic Logic (CSL), and we have implemented this as part of...

  14. Test automation for Markov Chain Usage Models

    OpenAIRE

    Bettinotti, Adriana M.; Garavaglia, Mauricio

    2011-01-01

    Statistical testing with Markov chain usage models is an effective method for programmers and testers to use during web site development to help guarantee software reliability. The JUMBL software works on these models; it supports model construction and analysis with the TML language, test generation and execution, and analysis of test results. This paper is targeted at test automation for web site development with JUMBL and JWebUnit.
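
    A Markov chain usage model and the random-walk test generation that JUMBL automates can be sketched as follows; the states and transition probabilities are hypothetical, and TML syntax is not reproduced:

    ```python
    import random

    # Hypothetical web-site usage model: P(next state | current state).
    usage_model = {
        "Enter":  [("Browse", 0.7), ("Search", 0.3)],
        "Browse": [("Search", 0.4), ("Exit", 0.6)],
        "Search": [("Browse", 0.5), ("Exit", 0.5)],
    }

    def generate_test_case(model, start="Enter", end="Exit"):
        """Sample one test case: a state path drawn from the Markov usage
        model, from the entry state to the exit state."""
        path, state = [start], start
        while state != end:
            nexts, probs = zip(*model[state])
            state = random.choices(nexts, weights=probs)[0]
            path.append(state)
        return path

    random.seed(7)
    print(generate_test_case(usage_model))
    ```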

  15. 07361 Abstracts Collection -- Programming Models for Ubiquitous Parallelism

    OpenAIRE

    Wong, David Chi-Leung; Cohen, Albert; Garzarán, María J.; Lengauer, Christian; Midkiff, Samuel P.

    2008-01-01

    From 02.09. to 07.09.2007, the Dagstuhl Seminar 07361 "Programming Models for Ubiquitous Parallelism" was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first section des...

  16. Automating Risk Analysis of Software Design Models

    Directory of Open Access Journals (Sweden)

    Maxime Frydman

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  17. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  18. Learning Models of Communication Protocols using Abstraction Techniques

    OpenAIRE

    Uijen, Johan

    2009-01-01

    In order to accelerate the usage of model-based verification in real-life software life cycles, an approach is introduced in this thesis to learn models from black-box software modules. These models can then be used for model-based testing. In this thesis, models of communication protocols are considered. To learn these models efficiently, an abstraction needs to be defined over the parameters that are used in the messages sent and received by the protocols. The tools that are us...

  19. Modeling Increased Complexity and the Reliance on Automation: FLightdeck Automation Problems (FLAP) Model

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    This paper highlights the development of a model that is focused on the safety issue of increasing complexity and reliance on automation systems in transport category aircraft. Recent statistics show an increase in mishaps related to manual handling and automation errors due to pilot complacency and over-reliance on automation, loss of situational awareness, automation system failures and/or pilot deficiencies. Consequently, the aircraft can enter a state outside the flight envelope and/or air traffic safety margins which potentially can lead to loss-of-control (LOC), controlled-flight-into-terrain (CFIT), or runway excursion/confusion accidents, etc. The goal of this modeling effort is to provide NASA's Aviation Safety Program (AvSP) with a platform capable of assessing the impacts of AvSP technologies and products towards reducing the relative risk of automation related accidents and incidents. In order to do so, a generic framework, capable of mapping both latent and active causal factors leading to automation errors, is developed. Next, the framework is converted into a Bayesian Belief Network model and populated with data gathered from Subject Matter Experts (SMEs). With the insertion of technologies and products, the model provides individual and collective risk reduction acquired by technologies and methodologies developed within AvSP.

  20. RCrane: semi-automated RNA model building

    International Nuclear Information System (INIS)

    RNA crystals typically diffract to much lower resolutions than protein crystals. This low-resolution diffraction results in unclear density maps, which cause considerable difficulties during the model-building process. These difficulties are exacerbated by the lack of computational tools for RNA modeling. Here, RCrane, a tool for the partially automated building of RNA into electron-density maps of low or intermediate resolution, is presented. This tool works within Coot, a common program for macromolecular model building. RCrane helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RCrane then allows the crystallographer to review the newly built structure and select alternative backbone conformations where desired. This tool can also be used to automatically correct the backbone structure of previously built nucleotides. These automated corrections can fix incorrect sugar puckers, steric clashes and other structural problems.

  1. Automated extraction of precise protein expression patterns in lymphoma by text mining abstracts of immunohistochemical studies

    Directory of Open Access Journals (Sweden)

    Jia-Fu Chang

    2013-01-01

    Full Text Available Background: In general, surgical pathology reviews report protein expression by tumors in a semi-quantitative manner, that is, -, -/+, +/-, +. At the same time, the experimental pathology literature provides multiple examples of precise expression levels determined by immunohistochemical (IHC) tissue examination of populations of tumors. Natural language processing (NLP) techniques enable the automated extraction of such information through text mining. We propose establishing a database linking quantitative protein expression levels with specific tumor classifications through NLP. Materials and Methods: Our method takes advantage of typical forms of representing experimental findings in terms of percentages of protein expression manifest by the tumor population under study. Characteristically, percentages are represented straightforwardly with the % symbol or as the number of positive findings of the total population. Such text is readily recognized using regular expressions and templates, permitting extraction of sentences containing these forms for further analysis using grammatical structures and rule-based algorithms. Results: Our pilot study is limited to the extraction of such information related to lymphomas. We achieved a satisfactory level of retrieval as reflected in scores of 69.91% precision and 57.25% recall with an F-score of 62.95%. In addition, we demonstrate the utility of a web-based curation tool for confirming and correcting our findings. Conclusions: The experimental pathology literature represents a rich source of pathobiological information, which has been relatively underutilized. There has been a combinatorial explosion of knowledge within the pathology domain as represented by increasing numbers of immunophenotypes and disease subclassifications. NLP techniques support practical text mining techniques for extracting this knowledge and organizing it in forms appropriate for pathology decision support systems.
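
    The two surface forms the authors mention, an explicit percentage and an "x of y" count, are easy to picture in code. The sketch below is a simplified stand-in for their extraction step (the real pipeline adds templates, grammatical analysis, and rule-based filtering); the patterns and example sentences are invented.

    ```python
    # Simplified sketch of percentage extraction from IHC sentences, covering the
    # two surface forms described above. Patterns and sentences are invented.
    import re

    PERCENT = re.compile(r"(\d+(?:\.\d+)?)\s*%")
    OF_TOTAL = re.compile(r"(\d+)\s+of\s+(?:the\s+)?(\d+)\b")

    def expression_levels(sentence):
        """Return all expression percentages found in one sentence."""
        levels = [float(m.group(1)) for m in PERCENT.finditer(sentence)]
        levels += [100.0 * int(m.group(1)) / int(m.group(2))
                   for m in OF_TOTAL.finditer(sentence)]
        return levels

    print(expression_levels("CD30 was expressed in 85% of cases."))        # [85.0]
    print(expression_levels("BCL2 was positive in 14 of 20 lymphomas."))   # [70.0]
    ```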

  2. Automating Risk Analysis of Software Design Models

    OpenAIRE

    Maxime Frydman; Guifré Ruiz; Elisa Heymann; Eduardo César; Barton P. Miller

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increasing number of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security e...

  3. Policy Recognition in the Abstract Hidden Markov Model

    CERN Document Server

    Bui, H H; West, G; 10.1613/jair.839

    2011-01-01

    In this paper, we present a method for recognising an agent's behaviour in dynamic, noisy, uncertain domains, and across multiple levels of abstraction. We term this problem on-line plan recognition under uncertainty and view it generally as probabilistic inference on the stochastic process representing the execution of the agent's plan. Our contributions in this paper are twofold. In terms of probabilistic inference, we introduce the Abstract Hidden Markov Model (AHMM), a novel type of stochastic process, provide its dynamic Bayesian network (DBN) structure and analyse the properties of this network. We then describe an application of the Rao-Blackwellised Particle Filter to the AHMM which allows us to construct an efficient, hybrid inference method for this model. In terms of plan recognition, we propose a novel plan recognition framework based on the AHMM as the plan execution model. The Rao-Blackwellised hybrid inference for AHMM can take advantage of the independence properties inherent in a model of p...
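
    The Rao-Blackwellised particle filter in the paper integrates out part of the AHMM state analytically; the plain bootstrap filter below, run on a flat two-state HMM, shows only the generic propagate-weight-resample pattern that underlies it. Transition and emission numbers are invented.

    ```python
    # Plain bootstrap particle filter on a flat two-state HMM, to illustrate the
    # inference pattern; the paper's Rao-Blackwellised filter is more elaborate.
    import random

    TRANS = {0: [0.9, 0.1], 1: [0.2, 0.8]}    # state transition probabilities
    EMIT  = {0: [0.7, 0.3], 1: [0.1, 0.9]}    # P(observation | state)

    def step(particles, obs):
        # Propagate each particle through the transition model.
        moved = [random.choices([0, 1], weights=TRANS[p])[0] for p in particles]
        # Weight by the observation likelihood, then resample.
        weights = [EMIT[p][obs] for p in moved]
        return random.choices(moved, weights=weights, k=len(moved))

    random.seed(0)
    particles = [random.choice([0, 1]) for _ in range(500)]
    for obs in [1, 1, 0, 1]:
        particles = step(particles, obs)
    print("P(state=1 | observations) ~", sum(particles) / len(particles))
    ```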

  4. Recent advances in automated system model extraction (SME)

    International Nuclear Information System (INIS)

    In this paper we present two different techniques for automated extraction of system models from FEA models. We discuss two algorithms: (i) automated N-DOF SME for electrostatically actuated MEMS and (ii) automated N-DOF SME for MEMS inertial sensors. Case studies are presented for both algorithms.

  5. Particle Tracking Model and Abstraction of Transport Processes

    International Nuclear Information System (INIS)

    The purpose of this report is to document the abstraction model being used in total system performance assessment (TSPA) model calculations for radionuclide transport in the unsaturated zone (UZ). The UZ transport abstraction model uses the particle-tracking method that is incorporated into the finite element heat and mass model (FEHM) computer code (Zyvoloski et al. 1997 [DIRS 100615]) to simulate radionuclide transport in the UZ. This report outlines the assumptions, design, and testing of a model for calculating radionuclide transport in the UZ at Yucca Mountain. In addition, methods for determining and inputting transport parameters are outlined for use in the TSPA for license application (LA) analyses. Process-level transport model calculations are documented in another report for the UZ (BSC 2004 [DIRS 164500]). Three-dimensional, dual-permeability flow fields generated to characterize UZ flow (documented by BSC 2004 [DIRS 169861]; DTN: LB03023DSSCP9I.001 [DIRS 163044]) are converted to make them compatible with the FEHM code for use in this abstraction model. This report establishes the numerical method and demonstrates the use of the model that is intended to represent UZ transport in the TSPA-LA. Capability of the UZ barrier for retarding the transport is demonstrated in this report, and by the underlying process model (BSC 2004 [DIRS 164500]). The technical scope, content, and management of this report are described in the planning document ''Technical Work Plan for: Unsaturated Zone Transport Model Report Integration'' (BSC 2004 [DIRS 171282]). Deviations from the technical work plan (TWP) are noted within the text of this report, as appropriate. The latest version of this document is being prepared principally to correct parameter values found to be in error due to transcription errors, changes in source data that were not captured in the report, calculation errors, and errors in interpretation of source data
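
    As a point of reference for the particle-tracking method mentioned above, the sketch below runs a textbook 1D random-walk particle-tracking scheme: each particle advects with the flow velocity and takes a Gaussian dispersive step each time step. The FEHM implementation is far more elaborate (3D dual-permeability grids, sorption, matrix diffusion); all parameter values here are invented.

    ```python
    # Textbook 1D random-walk particle tracking: advection plus dispersion.
    # Invented parameters; FEHM's scheme handles much richer physics.
    import math
    import random

    v, D, dt, steps = 1.0, 0.1, 0.01, 1000   # velocity, dispersion coeff., time step
    random.seed(1)

    def track(x0=0.0):
        x = x0
        for _ in range(steps):
            x += v * dt + random.gauss(0.0, math.sqrt(2.0 * D * dt))
        return x

    positions = [track() for _ in range(2000)]
    mean = sum(positions) / len(positions)
    print(f"mean travel distance ~ {mean:.2f} (pure advection gives {v*dt*steps:.2f})")
    ```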

  6. Acquisition of data for plasma simulation by automated extraction of terminology from article abstracts

    International Nuclear Information System (INIS)

    Computer simulation of burning plasmas, as well as computational plasma modeling in image processing, requires a large amount of accurate data in addition to a relevant model framework. To this end, it is very important to recognize, obtain and evaluate data relevant for such a simulation from the literature. This work focuses on the simultaneous search for relevant data across various online databases, extraction of cataloguing and numerical information, and automatic recognition of specific terminology in the text retrieved. The concept is illustrated on the particular terminology of Atomic and Molecular data relevant to edge plasma simulation. The IAEA search engine GENIE and the NIFS search engine Joint Search 2 are compared and discussed. Accurate modeling of the imaged object is considered to be the ultimate challenge in improving the resolution limits of plasma imaging. (author)

  7. Automated statistical modeling of analytical measurement systems

    International Nuclear Information System (INIS)

    The statistical modeling of analytical measurement systems at the Idaho Chemical Processing Plant (ICPP) has been completely automated through computer software. The statistical modeling of analytical measurement systems is one part of a complete quality control program used by the Remote Analytical Laboratory (RAL) at the ICPP. The quality control program is an integration of automated data input, measurement system calibration, database management, and statistical process control. The quality control program and statistical modeling program meet the guidelines set forth by the American Society for Testing and Materials and the American National Standards Institute. A statistical model is a set of mathematical equations describing any systematic bias inherent in a measurement system and the precision of a measurement system. A statistical model is developed from data generated from the analysis of control standards. Control standards are samples which are made up at precise known levels by an independent laboratory and submitted to the RAL. The RAL analysts who process control standards do not know the values of those control standards. The object behind statistical modeling is to describe real process samples in terms of their bias and precision and to verify that a measurement system is operating satisfactorily. The processing of control standards gives us this ability.
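
    To make the bias-and-precision idea concrete, the sketch below fits a linear bias model to invented control-standard data and takes the residual scatter as the precision estimate. The actual RAL software wraps this kind of fit in calibration, database management, and statistical-process-control layers.

    ```python
    # Minimal sketch: regress measured on known control-standard values to get the
    # systematic bias, and use residual scatter as precision. Data are invented.
    import statistics

    known    = [1.0, 2.0, 5.0, 10.0, 20.0]
    measured = [1.1, 2.1, 5.2, 10.3, 20.5]

    mx, my = statistics.mean(known), statistics.mean(measured)
    slope = sum((x - mx) * (y - my) for x, y in zip(known, measured)) / \
            sum((x - mx) ** 2 for x in known)
    intercept = my - slope * mx
    residuals = [y - (intercept + slope * x) for x, y in zip(known, measured)]
    precision = statistics.stdev(residuals)

    print(f"bias model: measured = {intercept:.3f} + {slope:.3f} * known")
    print(f"precision (residual s.d.): {precision:.3f}")
    ```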

  8. A Multiple Agent Model of Human Performance in Automated Air Traffic Control and Flight Management Operations

    Science.gov (United States)

    Corker, Kevin; Pisanich, Gregory; Condon, Gregory W. (Technical Monitor)

    1995-01-01

    A predictive model of human operator performance (flight crew and air traffic control (ATC)) has been developed and applied in order to evaluate the impact of automation developments in flight management and air traffic control. The model is used to predict the performance of a two-person flight crew and the ATC operators generating and responding to clearances aided by the Center TRACON Automation System (CTAS). The purpose of the modeling is to support evaluation and design of automated aids for flight management and airspace management, and to predict required procedural changes, both in the air and on the ground, in response to advancing automation in both domains. Additional information is contained in the original extended abstract.

  9. Models in Movies: Teaching Abstract Concepts in Concrete Models

    Directory of Open Access Journals (Sweden)

    Madeline Breer

    2012-02-01

    Full Text Available The tool created here is an instructional video that demonstrates how to create models of the heavy and light chains of an antibody using pipe cleaners, and how the process of V, D, and J gene recombination functions, using the model as a visual aid. The video was created by undergraduate students, with the intended audience being other undergraduates. This type of direct peer teaching aids in education because the “teachers” in this situation greatly enhance their own knowledge through instruction, and their imparting of knowledge is often more helpful to another student because they are more aware of comprehension gaps within their peer groups. As such, the supplementary goal of the model is to have students follow along with the video by creating and using their own models, thereby gaining a deeper knowledge of the process with a more thorough interaction.

  10. The Conceptual Integration Modeling Framework: Abstracting from the Multidimensional Model

    CERN Document Server

    Rizzolo, Flavio; Pottinger, Rachel; Wong, Kwok

    2010-01-01

    Data warehouses are overwhelmingly built through a bottom-up process, which starts with the identification of sources, continues with the extraction and transformation of data from these sources, and then loads the data into a set of data marts according to desired multidimensional relational schemas. End user business intelligence tools are added on top of the materialized multidimensional schemas to drive decision making in an organization. Unfortunately, this bottom-up approach is costly both in terms of the skilled users needed and the sheer size of the warehouses. This paper proposes a top-down framework in which data warehousing is driven by a conceptual model. The framework offers both design time and run time environments. At design time, a business user first uses the conceptual modeling language as a multidimensional object model to specify what business information is needed; then she maps the conceptual model to a pre-existing logical multidimensional representation. At run time, a system will tra...

  11. Simplifying Process Model Abstraction: Techniques for Generating Model Names

    OpenAIRE

    Leopold, Henrik; Mendling, Jan; Reijers, Hajo A.; La Rosa, Marcello

    2014-01-01

    The increased adoption of business process management approaches, tools, and practices has led organizations to accumulate large collections of business process models. These collections can easily include from a hundred to a thousand models, especially in the context of multinational corporations or as a result of organizational mergers and acquisitions. A concrete problem is thus how to maintain these large repositories in such a way that their complexity does not hamper their p...

  12. Models in Movies: Teaching Abstract Concepts in Concrete Models

    OpenAIRE

    Madeline Breer; Bianca Christensen; Jennifer Taylor

    2012-01-01

    The tool created here is an instructional video that demonstrates how to create models of the heavy and light chains of an antibody using pipe cleaners, and how the process of V, D, and J gene recombination functions, using the model as a visual aid. The video was created by undergraduate students, with the intended audience being other undergraduates. This type of direct peer teaching aids in education because the “teachers” in this situation greatly enhance their own knowledge throu...

  13. The Abstract Machine Model for Transaction-based System Control

    Energy Technology Data Exchange (ETDEWEB)

    Chassin, David P.

    2003-01-31

    Recent work applying statistical mechanics to economic modeling has demonstrated the effectiveness of using thermodynamic theory to address the complexities of large scale economic systems. Transaction-based control systems depend on the conjecture that when control of thermodynamic systems is based on price-mediated strategies (e.g., auctions, markets), the optimal allocation of resources in a market-based control system results in an emergent optimal control of the thermodynamic system. This paper proposes an abstract machine model as the necessary precursor for demonstrating this conjecture and establishes the dynamic laws as the basis for a special theory of emergence applied to the global behavior and control of complex adaptive systems. The abstract machine in a large system amounts to the analog of a particle in thermodynamic theory. These machines permit the establishment of a theory of dynamic control of complex system behavior based on statistical mechanics. Thus we may be better able to engineer a few simple control laws for a very small number of device types, which when deployed in very large numbers and operated as a system of many interacting markets yields the stable and optimal control of the thermodynamic system.

  14. Bayesian Safety Risk Modeling of Human-Flightdeck Automation Interaction

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2015-01-01

    Usage of automatic systems in airliners has increased fuel efficiency, added extra capabilities, enhanced safety and reliability, and provided improved passenger comfort since their introduction in the late 80s. However, the original automation benefits, including reductions in flight crew workload, human error, and training requirements, were not achieved as originally expected. Instead, automation introduced new failure modes, redistributed and sometimes increased workload, brought in new cognitive and attention demands, and increased training requirements. Modern airliners have numerous flight modes, providing more flexibility (and inherently more complexity) to the flight crew. However, the price to pay for the increased flexibility is the need for increased mode awareness, as well as the need to supervise, understand, and predict automated system behavior. Also, over-reliance on automation is linked to manual flight skill degradation and complacency in commercial pilots. As a result, recent accidents involving human errors are often caused by the interactions between humans and the automated systems (e.g., the breakdown in man-machine coordination), deteriorated manual flying skills, and/or loss of situational awareness due to heavy dependence on automated systems. This paper describes the development of the increased complexity and reliance on automation baseline model, named FLAP for FLightdeck Automation Problems. The model development process starts with a comprehensive literature review followed by the construction of a framework comprised of high-level causal factors leading to an automation-related flight anomaly. The framework was then converted into a Bayesian Belief Network (BBN) using the Hugin Software v7.8. The effects of automation on flight crew are incorporated into the model, including flight skill degradation, increased cognitive demand and training requirements along with their interactions. Besides flight crew deficiencies, automation system

  15. Model-based automated testing of critical PLC programs.

    CERN Document Server

    Fernández Adiego, B; Tournier, J-C; González Suárez, V M; Bliudze, S

    2014-01-01

    Testing of critical PLC (Programmable Logic Controller) programs remains a challenging task for control system engineers as it can rarely be automated. This paper proposes a model-based approach which uses the BIP (Behavior, Interactions and Priorities) framework to perform automated testing of PLC programs developed with the UNICOS (UNified Industrial COntrol System) framework. This paper defines the translation procedure and rules from UNICOS to BIP, which can be fully automated in order to hide the complexity of the underlying model from the control engineers. The approach is illustrated and validated through the study of a water treatment process.

  16. Selected translated abstracts of Russian-language climate-change publications. 4: General circulation models

    Energy Technology Data Exchange (ETDEWEB)

    Burtis, M.D. [comp.] [Oak Ridge National Lab., TN (United States). Carbon Dioxide Information Analysis Center; Razuvaev, V.N.; Sivachok, S.G. [All-Russian Research Inst. of Hydrometeorological Information--World Data Center, Obninsk (Russian Federation)

    1996-10-01

    This report presents English-translated abstracts of important Russian-language literature concerning general circulation models as they relate to climate change. In addition to the bibliographic citations and abstracts translated into English, this report presents the original citations and abstracts in Russian. Author and title indexes are included to assist the reader in locating abstracts of particular interest.

  17. Resource Allocation Model for Modelling Abstract RTOS on Multiprocessor System-on-Chip

    DEFF Research Database (Denmark)

    Virk, Kashif Munir; Madsen, Jan

    2003-01-01

    Resource allocation is an important problem in RTOSs and has been an active area of research. Numerous approaches have been developed, and many different techniques have been combined for a wide range of applications. In this paper, we address the problem of resource allocation in the context of modelling an abstract RTOS on multiprocessor SoC platforms. We discuss the implementation details of a simplified basic priority inheritance protocol for our abstract system model in SystemC.
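
    Basic priority inheritance is easy to show in miniature: when a higher-priority task blocks on a resource, the owner temporarily runs at the blocked task's priority so it cannot be preempted by medium-priority work. The sketch below shows just this bookkeeping in Python (the paper models it in SystemC) and omits the scheduler itself.

    ```python
    # Toy illustration of basic priority inheritance bookkeeping; not a scheduler.
    class Task:
        def __init__(self, name, priority):
            self.name, self.base, self.active = name, priority, priority

    class Mutex:
        def __init__(self):
            self.owner, self.waiters = None, []

        def acquire(self, task):
            if self.owner is None:
                self.owner = task
                return True
            self.waiters.append(task)
            # Priority inheritance: boost the owner if a higher-priority task blocks.
            self.owner.active = max(self.owner.active, task.active)
            return False

        def release(self):
            self.owner.active = self.owner.base      # drop the inherited priority
            self.owner = self.waiters.pop(0) if self.waiters else None

    low, high = Task("low", 1), Task("high", 10)
    m = Mutex()
    m.acquire(low)
    m.acquire(high)                  # high blocks; low inherits priority 10
    print(low.name, low.active)      # low 10
    m.release()
    print(low.active, m.owner.name)  # 1 high
    ```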

  18. Automation Marketplace 2010: New Models, Core Systems

    Science.gov (United States)

    Breeding, Marshall

    2010-01-01

    In a year when a difficult economy presented fewer opportunities for immediate gains, the major industry players have defined their business strategies with fundamentally different concepts of library automation. This is no longer an industry where companies compete on the basis of the best or the most features in similar products but one where…

  19. Enhancing Formal Modelling Tool Support with Increased Automation

    DEFF Research Database (Denmark)

    Lausdahl, Kenneth

    Progress report for the qualification exam of PhD student Kenneth Lausdahl. Initial work on enhancing tool support for the formal method VDM and the concept of unifying an abstract syntax tree with the ability for isolated extensions is described. The tool support includes a connection to UML and a test automation principle based on traces written as a kind of regular expressions.

  20. Mathematical models in marketing a collection of abstracts

    CERN Document Server

    Funke, Ursula H

    1976-01-01

    Mathematical models can be classified in a number of ways, e.g., static and dynamic; deterministic and stochastic; linear and nonlinear; individual and aggregate; descriptive, predictive, and normative; according to the mathematical technique applied or according to the problem area in which they are used. In marketing, the level of sophistication of the mathematical models varies considerably, so that a number of models will be meaningful to a marketing specialist without an extensive mathematical background. To make it easier for the nontechnical user we have chosen to classify the models included in this collection according to the major marketing problem areas in which they are applied. Since the emphasis lies on mathematical models, we shall not as a rule present statistical models, flow chart models, computer models, or the empirical testing aspects of these theories. We have also excluded competitive bidding, inventory and transportation models since these areas do not form the core of the market...

  1. Abstract interpretation over non-deterministic finite tree automata for set-based analysis of logic programs

    DEFF Research Database (Denmark)

    Gallagher, John Patrick; Puebla, G.

    2002-01-01

    Set-based program analysis has many potential applications, including compiler optimisations, type-checking, debugging, verification and planning. One method of set-based analysis is to solve a set of set constraints derived directly from the program text. Another approach is based on abstr...... of achieving the precision of set-constraints in the abstract interpretation framework.

  2. Modeling situated abstraction : action coalescence via multidimensional coherence.

    Energy Technology Data Exchange (ETDEWEB)

    Sallach, D. L.; Decision and Information Sciences; Univ. of Chicago

    2007-01-01

    Situated social agents weigh dozens of priorities, each with its own complexities. Domains of interest are intertwined, and progress in one area either complements or conflicts with other priorities. Interpretive agents address these complexities through: (1) integrating cognitive complexities through the use of radial concepts, (2) recognizing the role of emotion in prioritizing alternatives and urgencies, (3) using Miller-range constraints to avoid oversimplified notions of omniscience, and (4) constraining actions to 'moves' in multiple prototype games. Situated agent orientations are dynamically grounded in pragmatic considerations as well as intertwined with internal and external priorities. HokiPoki is a situated abstraction designed to shape and focus strategic agent orientations. The design integrates four pragmatic pairs: (1) problem and solution, (2) dependence and power, (3) constraint and affordance, and (4) (agent) intent and effect. In this way, agents are empowered to address multiple facets of a situation in an exploratory, or even arbitrary, order. HokiPoki is open to the internal orientation of the agent as it evolves, but also to the communications and actions of other agents.

  3. Modelling of evapotranspiration at field and landscape scales. Abstract

    DEFF Research Database (Denmark)

    Overgaard, Jesper; Butts, M.B.; Rosbjerg, Dan

    2002-01-01

    Eddy-covariance observations from three 2-m masts representing fluxes from grass, winter wheat and spring barley were used. Observations from a 40-m mast representing a mixture of the land-use classes in the model domain were used to validate the model at the larger scale. Good agreement was found at both the field and landscape scale...

  4. Compositional Abstraction of PEPA Models for Transient Analysis

    DEFF Research Database (Denmark)

    Smith, Michael James Andrew

    Interval Markov chains allow us to aggregate states in such a way as to safely bound transient probabilities of the original Markov chain. Whilst we can apply this technique directly to a PEPA model, it requires us to obtain the CTMC of the model, whose state space may be too large to construct...
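
    The aggregation step can be pictured concretely: for a partition of the concrete states, the abstract transition probability between two blocks is bounded by the minimum and maximum, over the block's concrete states, of the probability mass flowing into the target block. The chain and partition below are invented.

    ```python
    # Sketch of abstracting a Markov chain into an interval Markov chain by state
    # aggregation. Transition matrix and partition are invented.
    P = [[0.5, 0.3, 0.2],
         [0.3, 0.3, 0.4],
         [0.0, 0.1, 0.9]]
    partition = {"A": [0, 1], "B": [2]}   # aggregate states 0 and 1; keep 2 alone

    for src_name, src in partition.items():
        for dst_name, dst in partition.items():
            # Probability mass into the target block, per concrete source state.
            sums = [sum(P[i][j] for j in dst) for i in src]
            print(f"{src_name} -> {dst_name}: [{min(sums):.2f}, {max(sums):.2f}]")
    # A -> A is bounded by [0.60, 0.80], reflecting the two aggregated states.
    ```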

  5. Simulation Model of Automated Peat Briquetting Press Drive

    OpenAIRE

    A. Marozka; Y. Petrenko

    2014-01-01

    The paper presents the developed fully functional simulation model of an automated peat briquetting press drive. The given model makes it possible to reduce financial and time costs while developing, designing and operating a double-stamp peat briquetting press drive.

  6. Simulation Model of Automated Peat Briquetting Press Drive

    Directory of Open Access Journals (Sweden)

    A. Marozka

    2012-01-01

    Full Text Available The paper presents the developed fully functional simulation model of an automated peat briquetting press drive. The given model makes it possible to reduce financial and time costs while developing, designing and operating a double-stamp peat briquetting press drive.

  7. Partial Orders and Fully Abstract Models for Concurrency

    DEFF Research Database (Denmark)

    Engberg, Uffe Henrik

    1990-01-01

    In this thesis, sets of labelled partial orders are employed as fundamental mathematical entities for modelling nondeterministic and concurrent processes, thereby obtaining so-called noninterleaving semantics. Based on different closures of sets of labelled partial orders, simple algebraic languages...

  8. A catalog of automated analysis methods for enterprise models.

    Science.gov (United States)

    Florez, Hector; Sánchez, Mario; Villalobos, Jorge

    2016-01-01

    Enterprise models are created for documenting and communicating the structure and state of the Business and Information Technology elements of an enterprise. After models are completed, they are mainly used to support analysis. Model analysis is an activity typically based on human skills, and due to the size and complexity of the models, this process can be complicated and omissions or miscalculations are very likely. This situation has fostered research on automated analysis methods for supporting analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; nevertheless, they are based on specific situations and different metamodels, so some analysis methods might not be applicable to all enterprise models. This paper presents the work of compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool. PMID:27047732

  9. On Privacy Losses in the Trusted Agent Model (Abstract)

    OpenAIRE

    Mateus, Paulo; Vaudenay, Serge

    2009-01-01

    Tamper-proof devices are pretty powerful. They typically make security applications simpler (provided that the tamper-proof assumption is not violated). For application requiring privacy, we observe that some properties may become harder (if possible at all) to achieve when devices are maliciously used. We take the example of deniability, receipt-freeness, and anonymity. We formalize the trusted agent model which assumes tamper-proof hardware in a way which captures the notion of programmable...

  10. Constraint-Based Abstraction of a Model Checker for Infinite State Systems

    DEFF Research Database (Denmark)

    Banda, Gourinath; Gallagher, John Patrick

    Abstract interpretation-based model checking provides an approach to verifying properties of infinite-state systems. In practice, most previous work on abstract model checking is either restricted to verifying universal properties, or develops special techniques for temporal logics such as modal t...

  11. On automation of the procedure for crystal structure model refinement

    International Nuclear Information System (INIS)

    The methods of automation of the procedure for crystal structure model refinement from experimental diffraction data, implemented in the ASTRA program package, are described. Such tools as statistical tests, parameter scanning, and data scanning reduce the time necessary for structural investigation. In the presence of strong correlations between parameters, especially when the data set is limited, parameter scanning has an advantage over full-matrix refinement.

  12. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies.
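
    The core of such computer calculus is mechanical application of the chain rule. GRESS did this by transforming FORTRAN source; the sketch below shows the same idea in its simplest modern form, forward-mode automatic differentiation with dual numbers, computing a sensitivity dy/dp alongside the value y. The toy model function is invented.

    ```python
    # Forward-mode automatic differentiation with dual numbers: carry (value,
    # derivative) pairs through the computation. Toy model; not the GRESS code.
    class Dual:
        def __init__(self, value, deriv=0.0):
            self.v, self.d = value, deriv
        def __add__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.v + o.v, self.d + o.d)
        __radd__ = __add__
        def __mul__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.v * o.v, self.d * o.v + self.v * o.d)  # product rule
        __rmul__ = __mul__

    def model(p):
        """A toy nonlinear model: y = 3*p^2 + 2*p + 1."""
        return 3 * p * p + 2 * p + 1

    p = Dual(2.0, 1.0)       # seed dp/dp = 1
    y = model(p)
    print(y.v, y.d)          # 17.0 and dy/dp = 6*p + 2 = 14.0
    ```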

  13. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with ''direct'' and ''adjoint'' sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs

  14. Total human exposure and indoor air quality: An automated bibliography (BLIS) with summary abstracts. Volume 2. Final report, January 1987-December 1989

    International Nuclear Information System (INIS)

    The Bibliographical Literature Information System (BLIS) is a computer database that provides a comprehensive review of available literature on total human exposure to environmental pollution. Brief abstracts (often condensed versions of the original abstract) are included; if the original document had no abstract, one was prepared. Unpublished draft reports are listed, as well as final reports of the U.S. Government and other countries, reports by governmental research contractors, journal articles, and other publications on exposure models, field data, and newly emerging research methodologies. Emphasis is placed on those field studies measuring all the concentrations to which people may be exposed, including indoors, outdoors, and in transit.

  15. Semi-Automated Design Space Exploration for Formal Modelling

    OpenAIRE

    Grov, Gudmund; Ireland, Andrew; Llano, Maria Teresa; Kovacs, Peter; Colton, Simon; Gow, Jeremy

    2016-01-01

    Refinement-based formal methods allow the modelling of systems through incremental steps via abstraction. Discovering the right levels of abstraction, formulating correct and meaningful invariants, and analysing faulty models are some of the challenges faced when using this technique. Here, we propose Design Space Exploration, an approach that aims to assist a designer by automatically providing high-level modelling guidance in real time. More specifically, through the combination of common p...

  16. Abstract Platform and Transformations for Model-Driven Service-Oriented Development

    OpenAIRE

    Andrade Almeida, J.P.; Ferreira Pires, L.; van Sinderen, Marten

    2006-01-01

    In this paper, we discuss the use of abstract platforms and transformations for designing applications according to the principles of the service-oriented architecture. We illustrate our approach by discussing the use of the service discovery pattern at a platform-independent design level. We show how a trader service can be specified at a high level of abstraction and incorporated in an abstract platform for service-oriented development. Designers can then build platform-independent models of...

  17. Automated adaptive inference of phenomenological dynamical models

    Science.gov (United States)

    Daniels, Bryan C.; Nemenman, Ilya

    2015-08-01

    Dynamics of complex systems is often driven by large and intricate networks of microscopic interactions, whose sheer size obfuscates understanding. With limited experimental data, many parameters of such dynamics are unknown, and thus detailed, mechanistic models risk overfitting and making faulty predictions. At the other extreme, simple ad hoc models often miss defining features of the underlying systems. Here we develop an approach that instead constructs phenomenological, coarse-grained models of network dynamics that automatically adapt their complexity to the available data. Such adaptive models produce accurate predictions even when microscopic details are unknown. The approach is computationally tractable, even for a relatively large number of dynamical variables. Using simulated data, it correctly infers the phase space structure for planetary motion, avoids overfitting in a biological signalling system and produces accurate predictions for yeast glycolysis with tens of data points and over half of the interacting species unobserved.
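
    A miniature version of "complexity adapts to the data" can be shown with polynomial models scored by the Bayesian information criterion: noisy, limited data favours a low-order model, while more data supports more structure. The paper's method searches a much richer space of coupled dynamical models, so this only illustrates the selection principle, on invented data.

    ```python
    # Adaptive model complexity in miniature: fit polynomials of increasing order
    # and keep the best BIC score. Invented data; true underlying order is 2.
    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(0, 1, 20)
    y = 1.0 + 2.0 * x - 1.5 * x**2 + rng.normal(0, 0.05, x.size)

    def bic(order):
        coeffs = np.polyfit(x, y, order)
        rss = float(np.sum((np.polyval(coeffs, x) - y) ** 2))
        k = order + 1                          # number of fitted parameters
        return x.size * np.log(rss / x.size) + k * np.log(x.size)

    best = min(range(6), key=bic)
    print("selected polynomial order:", best)  # likely 2 with this noise level
    ```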

  18. Model Search: Formalizing and Automating Constraint Solving in MDE Platforms

    Science.gov (United States)

    Kleiner, Mathias; Del Fabro, Marcos Didonet; Albert, Patrick

    Model Driven Engineering (MDE) and constraint programming (CP) have been widely used and combined in different applications. However, existing results are either ad-hoc, not fully integrated or manually executed. In this article, we present a formalization and an approach for automating constraint-based solving in a MDE platform. Our approach generalizes existing work by combining known MDE concepts with CP techniques into a single operation called model search. We present the theoretical basis for model search, as well as an automated process that details the involved operations. We validate our approach by comparing two implemented solutions (one based on Alloy/SAT, the other on OPL/CP), and by executing them over an academic use-case.

  19. Toward designing for trust in database automation

    International Nuclear Information System (INIS)

    Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation. One model developed by Lee and See identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process-, and performance-related information serve, both individually and through inferences between them, to describe automation in such a way as to engender properly calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases. The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated process.

  20. Abstraction and Model Checking in the PEPA Plug-in for Eclipse

    DEFF Research Database (Denmark)

    Smith, Michael James Andrew

    2010-01-01

    The stochastic process algebra PEPA is a widely used language for performance modelling, and a large part of its success is due to the rich tool support that is available. As a compositional Markovian formalism, however, it suffers from the state space explosion problem, where even small models can lead to very large Markov chains. One way of analysing such models is to use abstraction - constructing a smaller model that bounds the properties of the original. We present an extension to the PEPA plug-in for Eclipse that enables abstracting and model checking of PEPA models. This implements two new...

  1. Modeling automated worlds: A performance shaping factors perspective

    International Nuclear Information System (INIS)

    This paper presents preliminary findings regarding a modeling framework under development for use in human reliability analysis (HRA) for automated systems. This framework supports the review of cognitive factors thought to be important for crew performance in control rooms modified by the introduction of digital control systems and advanced information presentation systems. The modeling framework proposed in this paper provides a means of conceptualizing the dynamic sharing of responsibilities and information that occurs as a natural part of human machine systems interaction in advanced systems. Performance shaping factors (PSFs) are presented as the interface between plant status and the crew. Special emphasis is placed on the evaluation of traditional versus non-traditional PSFs. Many of the non-traditional PSFs are cognitive in nature. A number of issues pertaining to crew performance in automated environments are discussed. Data collection activities in the form of field studies and controlled experiments necessary to support PSF evaluation methods are discussed.

  2. Automated Reasoning on Feature Models via Constraint Programming 

    OpenAIRE

    Alvarez Divo, Carlos Eduardo

    2011-01-01

    Feature models are often used in software product lines to represent a set of products and reason over their properties, similarities and differences, costs, etc. The problem then becomes automating such reasoning, which translates into a positive impact in terms of production, cost, and creation of the final products. To approach this matter we take advantage of the benefits of constraint programming technology, which has proven to be most effective when solving problems of large complexity. T...
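
    The encoding idea is straightforward to show in the small: a product is an assignment of features, and the feature diagram becomes constraints (mandatory child, alternative group, cross-tree requirement). Real tools hand this to a CP or SAT solver; the brute-force sketch below, with an invented feature model, only illustrates the semantics.

    ```python
    # Feature-model reasoning by exhaustive constraint checking; invented model.
    # Real reasoners delegate these constraints to a CP/SAT solver instead.
    from itertools import product

    FEATURES = ["car", "engine", "gas", "electric", "autopilot"]

    def valid(sel):
        return (sel["car"]                                     # root always selected
            and sel["engine"]                                  # mandatory child
            and (sel["gas"] or sel["electric"])                # group under engine
            and not (sel["gas"] and sel["electric"])           # alternative (xor)
            and (not sel["autopilot"] or sel["electric"]))     # requires-constraint

    products = [sel for vals in product([False, True], repeat=len(FEATURES))
                if valid(sel := dict(zip(FEATURES, vals)))]
    print(len(products), "valid products")                     # 3 for this model
    for p in products:
        print(sorted(f for f, on in p.items() if on))
    ```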

  3. Automated refinement and inference of analytical models for metabolic networks

    Science.gov (United States)

    Schmidt, Michael D.; Vallabhajosyula, Ravishankar R.; Jenkins, Jerry W.; Hood, Jonathan E.; Soni, Abhishek S.; Wikswo, John P.; Lipson, Hod

    2011-10-01

    The reverse engineering of metabolic networks from experimental data is traditionally a labor-intensive task requiring a priori systems knowledge. Using a proven model as a test system, we demonstrate an automated method to simplify this process by modifying an existing or related model--suggesting nonlinear terms and structural modifications--or even constructing a new model that agrees with the system's time series observations. In certain cases, this method can identify the full dynamical model from scratch without prior knowledge or structural assumptions. The algorithm selects between multiple candidate models by designing experiments to make their predictions disagree. We performed computational experiments to analyze a nonlinear seven-dimensional model of yeast glycolytic oscillations. This approach corrected mistakes reliably in both approximated and overspecified models. The method performed well to high levels of noise for most states, could identify the correct model de novo, and make better predictions than ordinary parametric regression and neural network models. We identified an invariant quantity in the model, which accurately derived kinetics and the numerical sensitivity coefficients of the system. Finally, we compared the system to dynamic flux estimation and discussed the scaling and application of this methodology to automated experiment design and control in biological systems in real time.

  4. Automated refinement and inference of analytical models for metabolic networks

    International Nuclear Information System (INIS)

    The reverse engineering of metabolic networks from experimental data is traditionally a labor-intensive task requiring a priori systems knowledge. Using a proven model as a test system, we demonstrate an automated method to simplify this process by modifying an existing or related model--suggesting nonlinear terms and structural modifications--or even constructing a new model that agrees with the system's time series observations. In certain cases, this method can identify the full dynamical model from scratch without prior knowledge or structural assumptions. The algorithm selects between multiple candidate models by designing experiments to make their predictions disagree. We performed computational experiments to analyze a nonlinear seven-dimensional model of yeast glycolytic oscillations. This approach corrected mistakes reliably in both approximated and overspecified models. The method performed well to high levels of noise for most states, could identify the correct model de novo, and make better predictions than ordinary parametric regression and neural network models. We identified an invariant quantity in the model, which accurately derived kinetics and the numerical sensitivity coefficients of the system. Finally, we compared the system to dynamic flux estimation and discussed the scaling and application of this methodology to automated experiment design and control in biological systems in real time.

  5. A model for automation of radioactive dose control

    International Nuclear Information System (INIS)

    The paper presents a proposal for automation of the personnel dose control system to be used in nuclear medicine environments. The model takes into account the standards and rules of the National Commission of Nuclear Energy (CNEN) and of the Health Ministry. The advantages of the model are robust management of the integrated dose and of technician qualification status. The software platform selected was Lotus Notes, and an analysis of the advantages and disadvantages of using this platform is also presented. (author)

  6. 04101 Abstracts Collection -- Language Engineering for Model-Driven Software Development

    OpenAIRE

    Bézivin, Jean; Heckel, Reiko

    2005-01-01

    From 29.02. to 05.03.04, the Dagstuhl Seminar 04101 ``Language Engineering for Model-Driven Software Development'' was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this pa...

  7. 07341 Abstracts Collection -- Code Instrumentation and Modeling for Parallel Performance Analysis

    OpenAIRE

    Hoisie, Adolfy; Miller, Barton P.; Mohr, Bernd

    2007-01-01

    From 20th to 24th August 2007, the Dagstuhl Seminar 07341 ``Code Instrumentation and Modeling for Parallel Performance Analysis'' was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The ...

  8. Automated Decomposition of Model-based Learning Problems

    Science.gov (United States)

    Williams, Brian C.; Millar, Bill

    1996-01-01

    A new generation of sensor rich, massively distributed autonomous systems is being developed that has the potential for unprecedented performance, such as smart buildings, reconfigurable factories, adaptive traffic systems and remote earth ecosystem monitoring. To achieve high performance these massive systems will need to accurately model themselves and their environment from sensor information. Accomplishing this on a grand scale requires automating the art of large-scale modeling. This paper presents a formalization of decompositional model-based learning (DML), a method developed by observing a modeler's expertise at decomposing large scale model estimation tasks. The method exploits a striking analogy between learning and consistency-based diagnosis. Moriarty, an implementation of DML, has been applied to thermal modeling of a smart building, demonstrating a significant improvement in learning rate.

  9. Model abstraction addressing long-term simulations of chemical degradation of large-scale concrete structures

    International Nuclear Information System (INIS)

    This paper presents a methodology to assess the spatial-temporal evolution of chemical degradation fronts in real-size concrete structures typical of a near-surface radioactive waste disposal facility. The methodology consists of the abstraction of a so-called full (complicated) model, accounting for the multicomponent, multi-scale nature of concrete, to an abstracted (simplified) model which simulates chemical concrete degradation based on a single component in the aqueous and solid phases. The abstracted model is verified against chemical degradation fronts simulated with the full model under both diffusive and advective transport conditions. Implementation in the multi-physics simulation tool COMSOL allows simulation of the spatial-temporal evolution of chemical degradation fronts in large-scale concrete structures. (authors)
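
    A single-component degradation front of the kind the abstracted model tracks can be pictured with the textbook diffusion solution: the front is the depth at which the intruding species exceeds a threshold concentration. The sketch below uses invented parameters and the analytical erfc profile; the paper's model is solved numerically in COMSOL with advection as well.

    ```python
    # 1D diffusion front sketch: c/c0 = erfc(x / (2*sqrt(D*t))), front taken where
    # c/c0 = 0.5, i.e. x = 2*z*sqrt(D*t) with erfc(z) = 0.5 giving z ~ 0.4769.
    # Diffusivity and threshold are assumed values, not from the paper.
    import math

    D = 1e-12                 # effective diffusivity, m^2/s (assumed)
    Z_HALF = 0.4769           # inverse complementary error function of 0.5

    for t_yr in (10, 100, 300):
        t = t_yr * 3.15e7     # years to seconds
        depth = 2.0 * Z_HALF * math.sqrt(D * t)
        print(f"{t_yr:4d} yr: degradation front ~ {depth * 100:.1f} cm")
    ```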

  10. Extraction Error Modeling and Automated Model Debugging in High-Performance Low Power Custom Designs

    OpenAIRE

    Yang, Yu-Shen; Veneris, Andreas; Thadikaran, Paul; Venkataraman, Srikanth

    2005-01-01

    Test model generation is common in the design cycle of custom made high performance low power designs targeted for high volume production. Logic extraction is a key step in test model generation to produce a logic level netlist from the transistor level representation. This is a semi-automated process which is error prone. This paper analyzes typical extraction errors applicable to clocking schemes seen in high-performance designs today. An automated debugging solution for these errors in des...

  11. Abstract algebra

    CERN Document Server

    Deskins, W E

    1996-01-01

    This excellent textbook provides undergraduates with an accessible introduction to the basic concepts of abstract algebra and to the analysis of abstract algebraic systems. These systems, which consist of sets of elements, operations, and relations among the elements, and prescriptive axioms, are abstractions and generalizations of various models which evolved from efforts to explain or discuss physical phenomena.In Chapter 1, the author discusses the essential ingredients of a mathematical system, and in the next four chapters covers the basic number systems, decompositions of integers, diop

  12. The Application of the Cumulative Logistic Regression Model to Automated Essay Scoring

    Science.gov (United States)

    Haberman, Shelby J.; Sinharay, Sandip

    2010-01-01

    Most automated essay scoring programs use a linear regression model to predict an essay score from several essay features. This article applied a cumulative logit model instead of the linear regression model to automated essay scoring. Comparison of the performances of the linear regression model and the cumulative logit model was performed on a…
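
    The cumulative logit (proportional-odds) model the article applies can be written as P(score <= j) = sigmoid(theta_j - x·beta), with category probabilities obtained as differences of adjacent cumulative probabilities. The sketch below evaluates exactly that; the feature weights and cut points are invented, not parameters estimated from essay data.

    ```python
    # Cumulative logit (proportional-odds) scoring sketch; weights and cut points
    # are invented for illustration.
    import math

    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    def score_probs(features, beta, thetas):
        """Return P(score = j) for j = 0..len(thetas)."""
        xb = sum(f * b for f, b in zip(features, beta))
        cum = [sigmoid(t - xb) for t in thetas] + [1.0]   # cumulative P(score <= j)
        probs, prev = [], 0.0
        for c in cum:
            probs.append(c - prev)                        # adjacent differences
            prev = c
        return probs

    essay = [0.8, 0.3]            # e.g. normalized length and vocabulary features
    beta = [2.0, 1.5]             # feature weights (assumed)
    thetas = [-1.0, 0.5, 2.0]     # cut points separating 4 score levels
    print([round(p, 3) for p in score_probs(essay, beta, thetas)])
    ```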

  13. Automated extraction of knowledge for model-based diagnostics

    Science.gov (United States)

    Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.

    1990-01-01

    The concept of accessing computer aided design (CAD) databases and extracting a process model automatically is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques, as well as an internal database of component descriptions, to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data from the drawn system. The AKG system is capable of defining knowledge bases in formats required by various model-based reasoning tools.

  14. M2m Automation: Matlab-To-Map Reduce Automation

    Directory of Open Access Journals (Sweden)

    Archana C S

    2014-06-01

    Full Text Available MapReduce is a very popular parallel programming model for cloud computing platforms, and has become an effective method for processing massive data by using a cluster of computers. A program-language-to-MapReduce automator is a possible solution to help traditional programmers easily deploy an application to cloud systems by translating sequential codes to MapReduce codes. M2M Automation mainly focuses on automating numerical computations by using Hadoop at the back end. M2M automates Hadoop for faster execution of Matlab commands using MapReduce code.
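
    For readers unfamiliar with the target model, the sketch below shows the MapReduce contract in plain Python: a mapper emits key-value pairs, a shuffle groups them by key, and a reducer folds each group. A generated M2M job would express its Matlab computation in this shape and hand it to Hadoop; the classic word count stands in here for a numerical kernel.

    ```python
    # The MapReduce contract in plain Python: map, shuffle by key, reduce.
    # A real deployment would run the same shape of job on a Hadoop cluster.
    from collections import defaultdict

    def mapper(line):
        for word in line.split():
            yield word.lower(), 1          # emit (key, value) pairs

    def reducer(key, values):
        return key, sum(values)            # fold the group for one key

    lines = ["the cat sat", "the cat ran"]
    groups = defaultdict(list)
    for line in lines:                     # map + shuffle
        for k, v in mapper(line):
            groups[k].append(v)
    results = [reducer(k, vs) for k, vs in sorted(groups.items())]
    print(results)   # [('cat', 2), ('ran', 1), ('sat', 1), ('the', 2)]
    ```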

  15. Automated Modeling of Microwave Structures by Enhanced Neural Networks

    Directory of Open Access Journals (Sweden)

    Z. Raida

    2006-12-01

    Full Text Available The paper describes a methodology for the automated creation of neural models of microwave structures. During the creation process, artificial neural networks are trained using a combination of particle swarm optimization and the quasi-Newton method to avoid critical training problems of conventional neural nets. In the paper, neural networks are used to approximate the behavior of a planar microwave filter (moment method, Zeland IE3D). In order to evaluate the efficiency of neural modeling, global optimizations are performed using both numerical models and neural ones. The two approaches are compared in terms of CPU-time demands and accuracy. In the conclusions, methodological recommendations for including neural networks in microwave design are formulated.

  16. Geochemistry Model Abstraction and Sensitivity Studies for the 21 PWR CSNF Waste Package

    Energy Technology Data Exchange (ETDEWEB)

    P. Bernot; S. LeStrange; E. Thomas; K. Zarrabi; S. Arthur

    2002-10-29

    The CSNF geochemistry model abstraction, as directed by the TWP (BSC 2002b), was developed to provide regression analysis of EQ6 cases to obtain abstracted values of pH (and in some cases HCO{sub 3}{sup -} concentration) for use in the Configuration Generator Model. The pH of the system is the controlling factor over U mineralization, CSNF degradation rate, and HCO{sub 3}{sup -} concentration in solution. The abstraction encompasses a large variety of combinations for the degradation rates of materials. The ''base case'' used EQ6 simulations looking at differing steel/alloy corrosion rates, drip rates, and percent fuel exposure. Other values such as the pH/HCO{sub 3}{sup -} dependent fuel corrosion rate and the corrosion rate of A516 were kept constant. Relationships were developed for pH as a function of these differing rates to be used in the calculation of total C and, subsequently, the fuel rate. An additional refinement to the abstraction was the addition of abstracted pH values for cases where there was limited O{sub 2} for waste package corrosion and a flushing fluid other than J-13, which has been used in all EQ6 calculations up to this point. These abstractions also used EQ6 simulations with varying combinations of corrosion rates of materials to abstract the pH (and HCO{sub 3}{sup -} in the case of the limiting O{sub 2} cases) as a function of WP materials corrosion rates. The goodness of fit for most of the abstracted values was above an R{sup 2} of 0.9. Those below this value occurred at the very beginning of WP corrosion, when large variations in the system pH are observed. However, the significance of the F-statistic for all the abstractions showed that the variable relationships are significant. For the abstraction, an analysis of the minerals that may form the ''sludge'' in the waste package was also presented. This analysis indicates that a number of different iron and aluminum minerals may form in

  17. Uncovering protein interaction in abstracts and text using a novel linear model and word proximity networks

    CERN Document Server

    Abi-Haidar, Alaa; Maguitman, Ana G; Radivojac, Predrag; Retchsteiner, Andreas; Verspoor, Karin; Wang, Zhiping; Rocha, Luis M; 10.1186/gb-2008-9-s2-s11

    2008-01-01

    We participated in three of the protein-protein interaction subtasks of the Second BioCreative Challenge: classification of abstracts relevant for protein-protein interaction (IAS), discovery of protein pairs (IPS) and text passages characterizing protein interaction (ISS) in full text documents. We approached the abstract classification task with a novel, lightweight linear model inspired by spam-detection techniques, as well as an uncertainty-based integration scheme. We also used a Support Vector Machine and the Singular Value Decomposition on the same features for comparison purposes. Our approach to the full text subtasks (protein pair and passage identification) includes a feature expansion method based on word-proximity networks. Our approach to the abstract classification task (IAS) was among the top submissions for this task in terms of the measures of performance used in the challenge evaluation (accuracy, F-score and AUC). We also report on a web-tool we produced using our approach: the Protein Int...

  18. Development of an automated core model for nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Mosteller, R.D.

    1998-12-31

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The objective of this project was to develop an automated package of computer codes that can model the steady-state behavior of nuclear-reactor cores of various designs. As an added benefit, data produced for steady-state analysis also can be used as input to the TRAC transient-analysis code for subsequent safety analysis of the reactor at any point in its operating lifetime. The basic capability to perform steady-state reactor-core analysis already existed in the combination of the HELIOS lattice-physics code and the NESTLE advanced nodal code. In this project, the automated package was completed by (1) obtaining cross-section libraries for HELIOS, (2) validating HELIOS by comparing its predictions to results from critical experiments and from the MCNP Monte Carlo code, (3) validating NESTLE by comparing its predictions to results from numerical benchmarks and to measured data from operating reactors, and (4) developing a linkage code to transform HELIOS output into NESTLE input.

  19. Development of an automated core model for nuclear reactors

    International Nuclear Information System (INIS)

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The objective of this project was to develop an automated package of computer codes that can model the steady-state behavior of nuclear-reactor cores of various designs. As an added benefit, data produced for steady-state analysis also can be used as input to the TRAC transient-analysis code for subsequent safety analysis of the reactor at any point in its operating lifetime. The basic capability to perform steady-state reactor-core analysis already existed in the combination of the HELIOS lattice-physics code and the NESTLE advanced nodal code. In this project, the automated package was completed by (1) obtaining cross-section libraries for HELIOS, (2) validating HELIOS by comparing its predictions to results from critical experiments and from the MCNP Monte Carlo code, (3) validating NESTLE by comparing its predictions to results from numerical benchmarks and to measured data from operating reactors, and (4) developing a linkage code to transform HELIOS output into NESTLE input

  20. An Automated 3d Indoor Topological Navigation Network Modelling

    Science.gov (United States)

    Jamali, A.; Rahman, A. A.; Boguslawski, P.; Gold, C. M.

    2015-10-01

    Indoor navigation is important for various applications such as disaster management and safety analysis. In the last decade, the indoor environment has been the focus of wide-ranging research, including techniques for acquiring indoor data (e.g. terrestrial laser scanning), 3D indoor modelling, and 3D indoor navigation models. In this paper, an automated 3D topological indoor network generated from inaccurate 3D building models is proposed. In a normal scenario, deriving a 3D indoor navigation network requires accurate 3D models with no errors (e.g. gaps, intersections), and two cells (e.g. rooms, corridors) must touch each other for their connection to be built. The presented 3D modelling of the indoor navigation network is based on surveying control points and is less dependent on the 3D geometrical building model. To reduce the time and cost of the indoor building data acquisition process, a Trimble LaserAce 1000 was used as the surveying instrument. The modelling results were validated against an accurate geometry of the indoor building environment acquired using a Trimble M3 total station.

  1. Conceptual model of an automated information system of marketing at the enterprise

    OpenAIRE

    Raiko, D. V.; L.E. Lebedeva

    2014-01-01

    The aim of the article. The purpose of this paper is to create a conceptual model of an automated information system of marketing that has a certain theoretical and practical value. The results of the analysis. The main advantage of this model is a comprehensive disclosure of the relationship of concepts such as automated information technology, marketing information system, and automated information system, which solve the problem of processing large volumes of data in a short period of time, pr...

  2. Random many-particle systems: applications from biology, and propagation of chaos in abstract models

    CERN Document Server

    Wennberg, Bernt

    2011-01-01

    The paper discusses a family of Markov processes that represent many-particle systems, and their limiting behaviour when the number of particles goes to infinity. The first part concerns models of biological systems: a model for sympatric speciation, i.e. the process in which a genetically homogeneous population is split into two or more different species sharing the same habitat, and models for swarming animals. The second part of the paper deals with abstract many-particle systems, and methods for rigorously deriving mean-field models.

  3. Flightdeck Automation Problems (FLAP) Model for Safety Technology Portfolio Assessment

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    NASA's Aviation Safety Program (AvSP) develops and advances methodologies and technologies to improve air transportation safety. The Safety Analysis and Integration Team (SAIT) conducts a safety technology portfolio assessment (PA) to analyze the program content, to examine the benefits and risks of products with respect to program goals, and to support programmatic decision making. The PA process includes systematic identification of current and future safety risks as well as tracking several quantitative and qualitative metrics to ensure the program goals are addressing prominent safety risks accurately and effectively. One of the metrics within the PA process involves using quantitative aviation safety models to gauge the impact of the safety products. This paper demonstrates the role of aviation safety modeling by providing model outputs and evaluating a sample of portfolio elements using the Flightdeck Automation Problems (FLAP) model. The model enables not only ranking of the quantitative relative risk reduction impact of all portfolio elements, but also highlighting the areas with high potential impact via sensitivity and gap analyses in support of the program office. Although the model outputs are preliminary and products are notional, the process shown in this paper is essential to a comprehensive PA of NASA's safety products in the current program and future programs/projects.

  4. Representation of water abstraction from a karst conduit with numerical discrete-continuum models

    Directory of Open Access Journals (Sweden)

    T. Reimann

    2013-04-01

    Karst aquifers are characterized by highly conductive conduit flow paths embedded in a less conductive fissured and fractured matrix, resulting in strong permeability contrasts with structured heterogeneity and anisotropy. Groundwater storage occurs predominantly in the fissured matrix. Hence, most karst models assume quasi-steady-state flow in conduits, neglecting conduit-associated drainable storage (CADS). The concept of CADS considers storage volumes where karst water is not part of the active flow system but is hydraulically connected to conduits (for example, karstic voids and large fractures). Disregarding conduit storage can be inappropriate when direct water abstraction from karst conduits occurs, e.g. large-scale pumping. In such cases, CADS may be relevant. Furthermore, the typical fixed-head boundary condition at the karst outlet can be inadequate for water abstraction scenarios because unhampered water inflow is possible. The objective of this paper is to analyze the significance of CADS and flow-limited boundary conditions on the hydraulic behavior of karst aquifers in water abstraction scenarios. To this end, the numerical hybrid model MODFLOW-2005 Conduit Flow Process Mode 1 (CFPM1) is enhanced to account for CADS. Additionally, a fixed-head limited-flow (FHLQ) boundary condition is added that limits inflow from constant-head boundaries to a user-defined threshold. The effect and proper functioning of these modifications are demonstrated by simplified model studies. Both enhancements, CADS and the FHLQ boundary, are shown to be useful for water abstraction scenarios within karst aquifers. An idealized representation of a large-scale pumping test in a karst conduit is used to demonstrate that the enhanced CFPM1 is potentially able to adequately represent water abstraction processes in both the conduits and the matrix of real karst systems.
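
    A schematic of the FHLQ idea, assuming a simple conductance law for the boundary: inflow grows with drawdown at the boundary node but is capped at a user-defined threshold, so abstraction beyond the cap must be supplied from storage. All parameter values below are illustrative, not CFPM1 defaults.

```python
# Schematic fixed-head limited-flow (FHLQ) boundary: inflow follows a
# conductance law toward the fixed head but is capped at a threshold.
def fhlq_inflow(h_boundary, h_node, conductance, q_limit):
    """Boundary inflow [m^3/d]; positive = into the model."""
    q = conductance * (h_boundary - h_node)  # ordinary fixed-head behaviour
    return min(q, q_limit)                   # cap unhampered inflow

# Without the limit, drawdown at the node would draw unbounded water from
# the boundary; with it, pumping beyond q_limit must come from CADS/matrix.
for head in (99.5, 98.0, 90.0):
    print(head, fhlq_inflow(h_boundary=100.0, h_node=head,
                            conductance=50.0, q_limit=250.0))
```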

  5. Software abstractions logic, language, and analysis

    CERN Document Server

    Jackson, Daniel

    2011-01-01

    In Software Abstractions Daniel Jackson introduces an approach to software design that draws on traditional formal methods but exploits automated tools to find flaws as early as possible. This approach--which Jackson calls "lightweight formal methods" or "agile modeling"--takes from formal specification the idea of a precise and expressive notation based on a tiny core of simple and robust concepts but replaces conventional analysis based on theorem proving with a fully automated analysis that gives designers immediate feedback. Jackson has developed Alloy, a language that captures the essence of software abstractions simply and succinctly, using a minimal toolkit of mathematical notions. This revised edition updates the text, examples, and appendixes to be fully compatible with the latest version of Alloy (Alloy 4). The designer can use automated analysis not only to correct errors but also to make models that are more precise and elegant. This approach, Jackson says, can rescue designers from "the tarpit of...

  6. Technical Work Plan for: Near Field Environment: Engineered System: Radionuclide Transport Abstraction Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J.D. Schreiber

    2006-12-08

    This technical work plan (TWP) describes work activities to be performed by the Near-Field Environment Team. The objective of the work scope covered by this TWP is to generate Revision 03 of EBS Radionuclide Transport Abstraction, referred to herein as the radionuclide transport abstraction (RTA) report. The RTA report is being revised primarily to address condition reports (CRs), to address issues identified by the Independent Validation Review Team (IVRT), to address the potential impact of transport, aging, and disposal (TAD) canister design on transport models, and to ensure integration with other models that are closely associated with the RTA report and being developed or revised in other analysis/model reports in response to IVRT comments. The RTA report will be developed in accordance with the most current version of LP-SIII.10Q-BSC and will reflect current administrative procedures (LP-3.15Q-BSC, ''Managing Technical Product Inputs''; LP-SIII.2Q-BSC, ''Qualification of Unqualified Data''; etc.), and will develop related Document Input Reference System (DIRS) reports and data qualifications as applicable in accordance with prevailing procedures. The RTA report consists of three models: the engineered barrier system (EBS) flow model, the EBS transport model, and the EBS-unsaturated zone (UZ) interface model. The flux-splitting submodel in the EBS flow model will change, so the EBS flow model will be validated again. The EBS transport model and validation of the model will be substantially revised in Revision 03 of the RTA report, which is the main subject of this TWP. The EBS-UZ interface model may be changed in Revision 03 of the RTA report due to changes in the conceptualization of the UZ transport abstraction model (a particle tracker transport model based on the discrete fracture transfer function will be used instead of the dual-continuum transport model previously used). Validation of the EBS-UZ interface model

  7. Virtual Machine Support for Many-Core Architectures: Decoupling Abstract from Concrete Concurrency Models

    Directory of Open Access Journals (Sweden)

    Stefan Marr

    2010-02-01

    The upcoming many-core architectures require software developers to exploit concurrency to utilize available computational power. Today's high-level language virtual machines (VMs), which are a cornerstone of software development, do not provide sufficient abstraction for concurrency concepts. We analyze concrete and abstract concurrency models and identify the challenges they impose for VMs. To provide sufficient concurrency support in VMs, we propose to integrate concurrency operations into VM instruction sets. Since there will always be VMs optimized for special purposes, our goal is to develop a methodology for designing instruction sets with concurrency support. Therefore, we also propose a list of trade-offs that have to be investigated to guide the design of such instruction sets. As a first experiment, we implemented one instruction set extension for shared-memory and one for non-shared-memory concurrency. From our experimental results, we derived a list of requirements for a full-grown experimental environment for further research.

  8. The abstract geometry modeling language (AgML): experience and road map toward eRHIC

    Science.gov (United States)

    Webb, Jason; Lauret, Jerome; Perevoztchikov, Victor

    2014-06-01

    The STAR experiment has adopted an Abstract Geometry Modeling Language (AgML) as the primary description of our geometry model. AgML establishes a level of abstraction, decoupling the definition of the detector from the software libraries used to create the concrete geometry model. Thus, AgML allows us to support both our legacy GEANT 3 simulation application and our ROOT/TGeo based reconstruction software from a single source, which is demonstrably self-consistent. While AgML was developed primarily as a tool to migrate away from our legacy FORTRAN-era geometry codes, it also provides a rich syntax geared towards the rapid development of detector models. AgML has been successfully employed by users to quickly develop and integrate the descriptions of several new detectors in the RHIC/STAR experiment including the Forward GEM Tracker (FGT) and Heavy Flavor Tracker (HFT) upgrades installed in STAR for the 2012 and 2013 runs. AgML has furthermore been heavily utilized to study future upgrades to the STAR detector as it prepares for the eRHIC era. With its track record of practical use in a live experiment in mind, we present the status, lessons learned and future of the AgML language as well as our experience in bringing the code into our production and development environments. We will discuss the path toward eRHIC and pushing the current model to accommodate detector misalignment and high precision physics.

  9. The abstract geometry modeling language (AgML): experience and road map toward eRHIC

    International Nuclear Information System (INIS)

    The STAR experiment has adopted an Abstract Geometry Modeling Language (AgML) as the primary description of our geometry model. AgML establishes a level of abstraction, decoupling the definition of the detector from the software libraries used to create the concrete geometry model. Thus, AgML allows us to support both our legacy GEANT 3 simulation application and our ROOT/TGeo based reconstruction software from a single source, which is demonstrably self-consistent. While AgML was developed primarily as a tool to migrate away from our legacy FORTRAN-era geometry codes, it also provides a rich syntax geared towards the rapid development of detector models. AgML has been successfully employed by users to quickly develop and integrate the descriptions of several new detectors in the RHIC/STAR experiment including the Forward GEM Tracker (FGT) and Heavy Flavor Tracker (HFT) upgrades installed in STAR for the 2012 and 2013 runs. AgML has furthermore been heavily utilized to study future upgrades to the STAR detector as it prepares for the eRHIC era. With its track record of practical use in a live experiment in mind, we present the status, lessons learned and future of the AgML language as well as our experience in bringing the code into our production and development environments. We will discuss the path toward eRHIC and pushing the current model to accommodate detector misalignment and high precision physics.

  10. Identifying crop vulnerability to groundwater abstraction: modelling and expert knowledge in a GIS.

    Science.gov (United States)

    Procter, Chris; Comber, Lex; Betson, Mark; Buckley, Dennis; Frost, Andy; Lyons, Hester; Riding, Alison; Voyce, Kevin

    2006-11-01

    Water use is expected to increase and climate change scenarios indicate the need for more frequent water abstraction. Abstracting groundwater may have a detrimental effect on soil moisture availability for crop growth and yields. This work presents an elegant and robust method for identifying zones of crop vulnerability to abstraction. Archive groundwater level datasets were used to generate a composite groundwater surface that was subtracted from a digital terrain model. The result was the depth from surface to groundwater and identified areas underlain by shallow groundwater. Knowledge from an expert agronomist was used to define classes of risk in terms of their depth below ground level. Combining information on the permeability of geological drift types further refined the assessment of the risk of crop growth vulnerability. The nature of the mapped output is one that is easy to communicate to the intended farming audience because of the general familiarity of mapped information. Such Geographic Information System (GIS)-based products can play a significant role in the characterisation of catchments under the EU Water Framework Directive especially in the process of public liaison that is fundamental to the setting of priorities for management change. The creation of a baseline allows the impact of future increased water abstraction rates to be modelled and the vulnerability maps are in a format that can be readily understood by the various stakeholders. This methodology can readily be extended to encompass additional data layers and for a range of groundwater vulnerability issues including water resources, ecological impacts, nitrate and phosphorus. PMID:16963176
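
    The core overlay operation lends itself to a short sketch: subtract the composite groundwater surface from the digital terrain model and bin the resulting depth-to-water into expert-defined risk classes. The grids and class breaks below are invented for illustration; the study's actual breaks came from the expert agronomist.

```python
# Sketch of the GIS overlay: depth-to-water = DTM - groundwater surface,
# then classify depth into vulnerability zones. All values are invented.
import numpy as np

# Ground elevation and composite groundwater surface (m above datum),
# tiny grids standing in for the DTM and interpolated archive levels
dtm = np.array([[12.0, 11.5, 11.0, 10.5],
                [11.8, 11.2, 10.6, 10.1],
                [11.5, 10.9, 10.2,  9.8],
                [11.2, 10.5,  9.9,  9.5]])
gw  = np.array([[ 8.0,  8.4,  8.8,  9.1],
                [ 8.2,  8.4,  8.7,  9.0],
                [ 8.3,  8.5,  8.7,  9.0],
                [ 8.3,  8.5,  8.7,  9.0]])

depth_to_water = dtm - gw            # m below ground level

# Hypothetical expert-defined breaks: <1 m high risk, 1-3 m moderate, >3 m low
risk_class = np.digitize(depth_to_water, bins=[1.0, 3.0])
print(depth_to_water)
print(risk_class)                    # 0 = high, 1 = moderate, 2 = low
```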

  11. Individual Differences in Response to Automation: The Five Factor Model of Personality

    Science.gov (United States)

    Szalma, James L.; Taylor, Grant S.

    2011-01-01

    This study examined the relationship of operator personality (Five Factor Model) and characteristics of the task and of adaptive automation (reliability and adaptiveness--whether the automation was well-matched to changes in task demand) to operator performance, workload, stress, and coping. This represents the first investigation of how the Five…

  12. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  13. Aviation Safety: Modeling and Analyzing Complex Interactions between Humans and Automated Systems

    Science.gov (United States)

    Rungta, Neha; Brat, Guillaume; Clancey, William J.; Linde, Charlotte; Raimondi, Franco; Seah, Chin; Shafto, Michael

    2013-01-01

    The on-going transformation from the current US Air Traffic System (ATS) to the Next Generation Air Traffic System (NextGen) will force the introduction of new automated systems and will most likely cause automation to migrate from ground to air. This will yield new function allocations between humans and automation and therefore change the roles and responsibilities in the ATS. Yet, safety in NextGen is required to be at least as good as in the current system. We therefore need techniques to evaluate the safety of the interactions between humans and automation. We think that current human-factors studies and simulation-based techniques will fall short given the complexity of the ATS, and that we need to add more automated techniques to simulations, such as model checking, which offers exhaustive coverage of the non-deterministic behaviors in nominal and off-nominal scenarios. In this work, we present a verification approach based both on simulations and on model checking for evaluating the roles and responsibilities of humans and automation. Models are created using Brahms (a multi-agent framework) and we show that the traditional Brahms simulations can be integrated with automated exploration techniques based on model checking, thus offering a complete exploration of the behavioral space of the scenario. Our formal analysis supports the notion of beliefs and probabilities to reason about human behavior. We demonstrate the technique with the Überlingen accident, since it exemplifies authority problems when receiving conflicting advice from human and automated systems.

  14. Model-Based Control for Postal Automation and Baggage Handling

    OpenAIRE

    Tarau, A.N.

    2010-01-01

    In this thesis we focus on two specific transportation systems, namely postal automation and baggage handling. Postal automation: During the last decades the volume of magazines, catalogs, and other plastic wrapped mail items that have to be processed by post sorting centers has increased considerably. In order to be able to handle the large volumes of mail, state-of-the-art post sorting centers are equipped with dedicated mail sorting machines. The throughput of a post sorting machine is def...

  15. Automated medical diagnosis with fuzzy stochastic models: monitoring chronic diseases.

    Science.gov (United States)

    Jeanpierre, Laurent; Charpillet, François

    2004-01-01

    As the world population ages, the patients per physician ratio keeps on increasing. This is even more important in the domain of chronic pathologies, where people are usually monitored for years and need regular consultations. To address this problem, we propose an automated system to monitor a patient population, detecting anomalies in instantaneous data and in their temporal evolution, so that it can alert physicians. By handling the population of healthy patients autonomously and by drawing the physicians' attention to the patients at risk, the system allows physicians to spend comparatively more time with patients who need their services. In such a system, the interaction between the patients, the diagnosis module, and the physicians is very important. We have based this system on a combination of stochastic models, fuzzy filters, and strong medical semantics. We focused on a particular tele-medicine application: the Diatelic Project. Its objective is to monitor chronic kidney-insufficient patients and to detect hydration troubles. During two years, physicians from the ALTIR have conducted a prospective randomized study of the system. This experiment clearly shows that the proposed system is really beneficial to the patients' health. PMID:15520535

  16. Modelling the sensitivity of river reaches to water abstraction: RAPHSA- a hydroecology tool for environmental managers

    Science.gov (United States)

    Klaar, Megan; Laize, Cedric; Maddock, Ian; Acreman, Mike; Tanner, Kath; Peet, Sarah

    2014-05-01

    A key challenge for environmental managers is the determination of environmental flows that allow a maximum yield of water resources to be taken from surface and sub-surface sources, whilst ensuring sufficient water remains in the environment to support biota and habitats. It has long been known that sensitivity to changes in water levels resulting from river and groundwater abstractions varies between rivers. Whilst assessment at the catchment scale is ideal for determining broad pressures on water resources and ecosystems, assessment of the sensitivity of reaches to changes in flow has previously been done on a site-by-site basis, often with the application of detailed but time-consuming techniques (e.g. PHABSIM). While this is appropriate for a limited number of sites, it is costly in both money and time and therefore not appropriate for application at the national level required by responsible licensing authorities. To address this need, the Environment Agency (England) is developing an operational tool to predict relationships between physical habitat and flow which may be applied by field staff to rapidly determine the sensitivity of physical habitat to flow alteration for use in water resource management planning. An initial model of river sensitivity to abstraction (defined as the change in physical habitat related to changes in river discharge) was developed using site characteristics and data from 66 individual PHABSIM surveys throughout the UK (Booker & Acreman, 2008). By applying a multivariate multiple linear regression analysis to the data to define habitat availability-flow curves, using resource intensity as a predictor variable, the model (known as RAPHSA: Rapid Assessment of Physical Habitat Sensitivity to Abstraction) is able to take a risk-based approach to modelled certainty. Site-specific information gathered using desk-based work, or a variable amount of field work, can be used to predict the shape of the habitat-flow curves, with the
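
    A minimal sketch of the regression step, under the assumption that a habitat-flow curve can be summarized by a single sensitivity parameter predicted from site characteristics. The predictors, coefficients, and data below are synthetic, not RAPHSA's.

```python
# Illustrative multivariate regression: predict a habitat-sensitivity
# parameter from site characteristics. All data are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(1)
n_sites = 66                                   # mirrors the 66 PHABSIM surveys

width = rng.uniform(2.0, 30.0, n_sites)        # channel width (m)
slope = rng.uniform(0.001, 0.02, n_sites)      # bed slope (-)
depth = rng.uniform(0.2, 2.0, n_sites)         # mean depth (m)

# Stand-in "observed" sensitivity of habitat to abstraction at each site,
# as if summarized from its PHABSIM habitat-flow curve
sens = 0.5 + 0.02 * width - 15.0 * slope - 0.3 * depth \
       + rng.normal(0.0, 0.05, n_sites)

# Multivariate linear regression: sensitivity ~ site characteristics
X = np.column_stack([np.ones(n_sites), width, slope, depth])
beta, *_ = np.linalg.lstsq(X, sens, rcond=None)

new_site = np.array([1.0, 12.0, 0.005, 0.8])   # intercept, width, slope, depth
print("predicted sensitivity to abstraction:", new_site @ beta)
```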

  17. Implementing the WebSocket Protocol Based on Formal Modelling and Automated Code Generation

    OpenAIRE

    Simonsen, Kent; Kristensen, Lars

    2014-01-01

    Model-based software engineering offers several attractive benefits for the implementation of protocols, including automated code generation for different platforms from design-level models. In earlier work, we have proposed a template-based approach using Coloured Petri Net formal models with pragmatic annotations for automated code generation of protocol software. The contribution of this paper is an application of the approach as implemented in the PetriCode tool to obtain protocol softwar...

  18. Simulation of Self-Assembly in the Abstract Tile Assembly Model with ISU TAS

    CERN Document Server

    Patitz, Matthew J

    2011-01-01

    Since its introduction by Erik Winfree in 1998, the abstract Tile Assembly Model (aTAM) has inspired a wealth of research. As an abstract model for tile based self-assembly, it has proven to be remarkably powerful and expressive in terms of the structures which can self-assemble within it. As research has progressed in the aTAM, the self-assembling structures being studied have become progressively more complex. This increasing complexity, along with a need for standardization of definitions and tools among researchers, motivated the development of the Iowa State University Tile Assembly Simulator (ISU TAS). ISU TAS is a graphical simulator and tile set editor for designing and building 2-D and 3-D aTAM tile assembly systems and simulating their self-assembly. This paper reviews the features and functionality of ISU TAS and describes how it can be used to further research into the complexities of the aTAM. Software and source code are available at http://www.cs.iastate.edu/~lnsa.

  19. INVENTORY ABSTRACTION

    Energy Technology Data Exchange (ETDEWEB)

    G. Ragan

    2001-12-19

    The purpose of the inventory abstraction, which has been prepared in accordance with a technical work plan (CRWMS M&O 2000e for ICN 02 of the present analysis, and BSC 2001e for ICN 03 of the present analysis), is to: (1) Interpret the results of a series of relative dose calculations (CRWMS M&O 2000c, 2000f). (2) Recommend, including a basis thereof, a set of radionuclides that should be modeled in the Total System Performance Assessment in Support of the Site Recommendation (TSPA-SR) and the Total System Performance Assessment in Support of the Final Environmental Impact Statement (TSPA-FEIS). (3) Provide initial radionuclide inventories for the TSPA-SR and TSPA-FEIS models. (4) Answer the U.S. Nuclear Regulatory Commission (NRC)'s Issue Resolution Status Report ''Key Technical Issue: Container Life and Source Term'' (CLST IRSR) key technical issue (KTI): ''The rate at which radionuclides in SNF [spent nuclear fuel] are released from the EBS [engineered barrier system] through the oxidation and dissolution of spent fuel'' (NRC 1999, Subissue 3). The scope of the radionuclide screening analysis encompasses the period from 100 years to 10,000 years after the potential repository at Yucca Mountain is sealed for scenarios involving the breach of a waste package and subsequent degradation of the waste form as required for the TSPA-SR calculations. By extending the time period considered to one million years after repository closure, recommendations are made for the TSPA-FEIS. The waste forms included in the inventory abstraction are Commercial Spent Nuclear Fuel (CSNF), DOE Spent Nuclear Fuel (DSNF), High-Level Waste (HLW), naval Spent Nuclear Fuel (SNF), and U.S. Department of Energy (DOE) plutonium waste. The intended use of this analysis is in TSPA-SR and TSPA-FEIS. Based on the recommendations made here, models for release, transport, and possibly exposure will be developed for the isotopes that would be the highest

  20. Automating the Extraction of Model-Based Software Product Lines from Model Variants

    OpenAIRE

    Martinez, Jabier; Ziadi, Tewfik; Klein, Jacques; Le Traon, Yves

    2015-01-01

    We address the problem of automating 1) the analysis of existing similar model variants and 2) migrating them into a software product line. Our approach, named MoVa2PL, considers the identification of variability and commonality in model variants, as well as the extraction of a CVL-compliant Model-based Software Product Line (MSPL) from the features identified on these variants. MoVa2PL builds on a generic representation of models making it suitable to any MOF-based ...

  1. Towards an Integrated System Model for Testing and Verification of Automation Machines

    OpenAIRE

    Peter Braun; Benjamin Hummel

    2016-01-01

    The models and documents created during the development of automation machines typically can be categorized into mechanics, electronics, and software/controller. The functionality of an automation machine is, however, usually realized by the interaction of all three of these domains. So no single model covering only one development category will be able to describe the behavior of the machine thoroughly. For early planning of the machine design, virtual prototypes, and especially for the form...

  2. Automated 4D analysis of dendritic spine morphology: applications to stimulus-induced spine remodeling and pharmacological rescue in a disease model

    Directory of Open Access Journals (Sweden)

    Swanger Sharon A

    2011-10-01

    Uncovering the mechanisms that regulate dendritic spine morphology has been limited, in part, by the lack of efficient and unbiased methods for analyzing spines. Here, we describe an automated 3D spine morphometry method and its application to spine remodeling in live neurons and spine abnormalities in a disease model. We anticipate that this approach will advance studies of synapse structure and function in brain development, plasticity, and disease.

  3. Design And Modeling An Automated Digsilent Power System For Optimal New Load Locations

    Directory of Open Access Journals (Sweden)

    Mohamed Saad

    2015-08-01

    The electric power utilities seek to take advantage of novel approaches to meet growing energy demand. Utilities are under pressure to evolve their classical topologies to increase the usage of distributed generation. Currently, electrical power engineers in many regions of the world apply manual methods to measure power consumption for further assessment of voltage violations; such processes have proved to be time-consuming, costly, and inaccurate. Demand response is a grid management technique in which retail or wholesale customers are requested, either electronically or manually, to reduce their load. Therefore, this paper aims to design and model an automated power system for optimal new-load locations using DPL (DIgSILENT Programming Language). This study is a diagnostic approach that informs the system operator about any voltage violations that would occur when adding a new load to the grid. The process of identifying the optimal busbar location involves a complicated calculation of the power consumption at each load bus. The DPL program therefore takes all the internal network data of the IEEE 30-bus system into account and executes a load-flow simulation, beginning with the new load placed at the first bus in the network. The developed model then simulates the new load at each available busbar in the network and generates three analytical reports for each case, capturing the over/under-voltage and the loading of elements across the grid.
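
    The screening loop that such a DPL script automates can be sketched as follows. This is not DIgSILENT code: the one-line voltage-sensitivity update is a toy stand-in for a full load-flow run, and all network data are invented.

```python
# Schematic bus-by-bus screening for a new load. A real implementation would
# call the power-flow solver here; we use a toy voltage-sensitivity model.
BUSES = {          # bus: (base voltage [pu], dV per MW of added load [pu/MW])
    "bus1": (1.02, 0.004),
    "bus2": (1.00, 0.007),
    "bus3": (0.99, 0.002),
    "bus4": (1.01, 0.009),
}
V_MIN, V_MAX = 0.95, 1.05
NEW_LOAD_MW = 8.0

reports = []
for bus, (v_base, dv_per_mw) in BUSES.items():
    v_after = v_base - dv_per_mw * NEW_LOAD_MW  # surrogate "load flow" result
    violation = not (V_MIN <= v_after <= V_MAX)
    reports.append((bus, round(v_after, 3), violation))

# Rank candidate locations: feasible buses with the highest post-load voltage
feasible = sorted((r for r in reports if not r[2]), key=lambda r: -r[1])
print("all cases:", reports)
print("optimal new-load location:", feasible[0][0])
```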

  4. The Problem of Automation in Dynamic Models Visualization

    OpenAIRE

    Krasnoproshin, V.; Mazouka, D.

    2012-01-01

    This paper describes a methodology for extending the graphics pipeline. The introduced approach is based on a specialized formal language called visualization algebra. We argue that this technique can lower visualization software development costs and pave the way for further automation in computer graphics.

  5. Côte de Resyste : Automated Model Based Testing

    NARCIS (Netherlands)

    Tretmans, Jan; Brinksma, Ed; Schweizer, M.

    2002-01-01

    Systematic testing is very important for assessing and improving the quality of embedded software. Yet, testing turns out to be expensive, laborious, time-consuming and error-prone. The project Côte de Resyste has been working since 1998 on methods, techniques and tools for automating specification

  6. TorX: Automated Model-Based Testing

    NARCIS (Netherlands)

    Tretmans, Jan; Brinksma, Ed; Hartman, A.; Dussa-Ziegler, K.

    2003-01-01

    Systematic testing is very important for assessing and improving the quality of software systems. Yet, testing turns out to be expensive, laborious, time-consuming and error-prone. The Dutch research and development project Côte de Resyste worked on methods, techniques and tools for automating speci

  7. New E-Commerce Model Based on Multi-Agent Automated Negotiation

    Institute of Scientific and Technical Information of China (English)

    向传杰; 贾云得

    2003-01-01

    A new multi-agent automated negotiation model is developed and evaluated, in which two competitive agents, such as the buyer and seller, have firm deadlines and incomplete information about each other. The negotiation is multi-dimensional in different cases. The model is discussed in 6 kinds of cases with different price strategies, warrantee strategies and time strategies. The model improves the model of Wooldridge and that of Sycara to a certain extent. In all possible situations, the optimal negotiation strategy is analyzed and presented, and an e-commerce model based on multi-agent automated negotiation model is also illustrated for the e-commerce application in the future.
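
    A sketch of a time-dependent concession tactic of the kind such deadline-constrained negotiation models typically build on (the paper's exact strategy functions are not reproduced; the deadlines, prices, and concession exponent beta are assumptions): each agent moves from its initial price toward its reserve price as its firm deadline approaches, and a deal is struck when the offers cross.

```python
# Time-dependent concession tactic for two agents with firm deadlines.
# beta > 1 concedes quickly early on; beta < 1 holds firm until late.
def offer(t, deadline, p_init, p_reserve, beta):
    """Price offered at time t: concede from p_init toward p_reserve."""
    alpha = (min(t, deadline) / deadline) ** (1.0 / beta)
    return p_init + alpha * (p_reserve - p_init)

T_BUYER, T_SELLER = 20, 25
for t in range(0, 21):
    buy = offer(t, T_BUYER, p_init=50.0, p_reserve=100.0, beta=2.0)   # eager
    sell = offer(t, T_SELLER, p_init=120.0, p_reserve=80.0, beta=0.5) # tough
    if buy >= sell:                      # offers cross: agreement reached
        print(f"agreement at t={t}, price ~ {(buy + sell) / 2:.2f}")
        break
```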

  8. Context based mixture model for cell phase identification in automated fluorescence microscopy

    Directory of Open Access Journals (Sweden)

    Zhou Xiaobo

    2007-01-01

    Background: Automated identification of the cell cycle phases of individual live cells in a large population, captured via automated fluorescence microscopy, is important for cancer drug discovery and cell cycle studies. Time-lapse fluorescence microscopy images provide an important method to study the cell cycle process under different conditions of perturbation. Existing methods are limited in dealing with such time-lapse data sets, while manual analysis is not feasible. This paper presents statistical data analysis and statistical pattern recognition to perform this task. Results: The data are generated from HeLa H2B-GFP cells imaged during a 2-day period, with images acquired 15 minutes apart using automated time-lapse fluorescence microscopy. The patterns are described with four kinds of features: twelve general features, Haralick texture features, Zernike moment features, and wavelet features. To generate a new set of features with more discriminative power, commonly used feature reduction techniques are applied, including Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Maximum Margin Criterion (MMC), Stepwise Discriminant Analysis based Feature Selection (SDAFS), and Genetic Algorithm based Feature Selection (GAFS). We then propose a Context Based Mixture Model (CBMM) for dealing with the time-series cell sequence information and compare it to other traditional classifiers: Support Vector Machine (SVM), Neural Network (NN), and K-Nearest Neighbor (KNN). Being a standard practice in machine learning, we systematically compare the performance of a number of common feature reduction techniques and classifiers to select an optimal combination of a feature reduction technique and a classifier. A cellular database containing 100 manually labelled subsequences is built for evaluating the performance of the classifiers. The generalization error is estimated using the cross-validation technique. The
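
    One of the compared feature-reduction/classifier combinations (PCA followed by an SVM) can be sketched with scikit-learn on synthetic stand-in features; the real study extracts texture, Zernike, and wavelet features from microscopy images, and the proposed CBMM is not reproduced here.

```python
# Minimal PCA + SVM pipeline with cross-validated accuracy, run on
# synthetic "cell" feature vectors standing in for the study's features.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_cells, n_features, n_phases = 300, 60, 4

# Synthetic feature vectors with phase-dependent means
y = rng.integers(0, n_phases, n_cells)
X = rng.normal(0.0, 1.0, (n_cells, n_features)) + y[:, None] * 0.4

clf = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)   # cross-validated generalization
print("estimated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```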

  9. Bottom-Up Abstract Modelling of Optical Networks-on-Chip: From Physical to Architectural Layer

    Directory of Open Access Journals (Sweden)

    Alberto Parini

    2012-01-01

    This work presents a bottom-up abstraction procedure based on the FDTD + SystemC design flow, suitable for the modelling of optical Networks-on-Chip. In this procedure, a complex network is decomposed into elementary switching elements whose input-output behavior is described by means of scattering-parameter models. The parameters of each elementary block are then determined through 2D-FDTD simulation, and the resulting analytical models are exported as functional blocks into the SystemC environment. The inherent modularity and scalability of the S-matrix formalism are preserved inside SystemC, thus allowing the incremental composition and successive characterization of complex topologies typically out of reach for full-vectorial electromagnetic simulators. The consistency of the outlined approach is verified, in the first instance, by performing a SystemC analysis of a four-input, four-output-port switch and comparing with the results of 2D-FDTD simulations of the same device. Finally, a further complex network encompassing 160 microrings is investigated, the losses over each routing path are calculated, and the minimum amount of power needed to guarantee an assigned BER is determined. This work is a basic step in the direction of an automatic technology-aware network-level simulation framework capable of assembling complex optical switching fabrics, while at the same time assessing practical feasibility and effectiveness at the physical/technological level.
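
    The network-level bookkeeping in the final step can be illustrated with a short power-budget sketch: sum the abstracted per-element insertion losses along a routing path and add the receiver sensitivity required for the assigned BER to obtain the minimum launch power. All dB figures below are assumptions, not values from the paper.

```python
# Back-of-the-envelope optical power budget over a routing path built from
# abstracted switching elements. Loss figures and sensitivity are invented.
LOSS_DB = {"ring_pass": 0.05, "ring_drop": 0.5, "crossing": 0.15, "bend": 0.1}

def path_loss_db(path):
    """Total insertion loss of a route given its element sequence."""
    return sum(LOSS_DB[element] for element in path)

# Hypothetical route through the switching fabric
route = ["bend", "ring_pass", "crossing", "ring_pass", "ring_drop", "bend"]

receiver_sensitivity_dbm = -20.0    # assumed power needed at the target BER
loss = path_loss_db(route)
print(f"route loss = {loss:.2f} dB, "
      f"min launch power = {receiver_sensitivity_dbm + loss:.2f} dBm")
```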

  10. Feed forward neural networks and genetic algorithms for automated financial time series modelling

    OpenAIRE

    Kingdon, J. C.

    1995-01-01

    This thesis presents an automated system for financial time series modelling. Formal and applied methods are investigated for combining feed-forward Neural Networks and Genetic Algorithms (GAs) into a single adaptive/learning system for automated time series forecasting. Four important research contributions arise from this investigation: i) novel forms of GAs are introduced which are designed to counter the representational bias associated with the conventional Holland GA, ii) an...

  11. Automated Eukaryotic Gene Structure Annotation Using EVidenceModeler and the Program to Assemble Spliced Alignments

    Energy Technology Data Exchange (ETDEWEB)

    Haas, B J; Salzberg, S L; Zhu, W; Pertea, M; Allen, J E; Orvis, J; White, O; Buell, C R; Wortman, J R

    2007-12-10

    EVidenceModeler (EVM) is presented as an automated eukaryotic gene structure annotation tool that reports eukaryotic gene structures as a weighted consensus of all available evidence. EVM, when combined with the Program to Assemble Spliced Alignments (PASA), yields a comprehensive, configurable annotation system that predicts protein-coding genes and alternatively spliced isoforms. Our experiments on both rice and human genome sequences demonstrate that EVM produces automated gene structure annotation approaching the quality of manual curation.

  12. Model-Based Test Automation Strategies for Data Processing Systems

    OpenAIRE

    Di Nardo, Daniel

    2016-01-01

    Data processing software is an essential component of systems that aggregate and analyse real-world data, thereby enabling automated interaction between such systems and the real world. In data processing systems, inputs are often big and complex files that have a well-defined structure, and that often have dependencies between several of their fields. Testing of data processing systems is complex. Software engineers, in charge of testing these systems, have to handcraft complex data files of...

  13. Cellular Automaton Model of Traffic Flow Based on the Car-Following Model

    Institute of Scientific and Technical Information of China (English)

    LI Ke-Ping; GAO Zi-You

    2004-01-01

    We propose a new cellular automaton (CA) traffic model that is based on the car-following model. A class of driving strategies is used in the car-following model instead of the acceleration in the NaSch traffic model. In our model, some realistic driver behaviour and detailed vehicle characteristics have been taken into account, such as distance-headway and safe distance. The simulation results show that our model can exhibit some traffic flow states that have been observed in real traffic, and both the maximum flux and the critical density are very close to real measurements. Moreover, it is easy to extend our method to multi-lane traffic.
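
    A toy NaSch-style cellular automaton with an explicit safe-distance rule, in the spirit of the model described (the authors' specific driving strategies are not reproduced; the lattice size, maximum speed, and density are arbitrary):

```python
# NaSch-style CA on a ring road: accelerate, enforce safe distance-headway,
# random slowdown, then move all cars in parallel. Parameters are arbitrary.
import random

L, V_MAX, P_SLOW, STEPS = 100, 5, 0.3, 200
cells = [V_MAX if random.random() < 0.2 else None for _ in range(L)]  # speeds

def gap(cells, i):
    """Distance-headway: number of empty cells ahead of the car at i."""
    d = 1
    while cells[(i + d) % L] is None:
        d += 1
    return d - 1

flux = 0
for _ in range(STEPS):
    moves = []
    for i, v in enumerate(cells):
        if v is None:
            continue
        v = min(v + 1, V_MAX)                 # accelerate
        v = min(v, gap(cells, i))             # safe distance: never collide
        if v > 0 and random.random() < P_SLOW:
            v -= 1                            # random slowdown
        moves.append((i, v))
    cells = [None] * L
    for i, v in moves:                        # parallel position update
        cells[(i + v) % L] = v
        flux += v
print("mean flux per site per step:", flux / (STEPS * L))
```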

  14. Automated comparison of Bayesian reconstructions of experimental profiles with physical models

    International Nuclear Information System (INIS)

    In this work we developed an expert system that carries out, in an integrated and fully automated way, i) a reconstruction of plasma profiles from the measurements, using Bayesian analysis, ii) a prediction of the reconstructed quantities, according to some models, and iii) an intelligent comparison of the first two steps. This system includes systematic checking of the internal consistency of the reconstructed quantities, enables automated model validation and, if a well-validated model is used, can be applied to help detect interesting new physics in an experiment. The work shows three applications of this quite general system. The expert system can successfully detect failures in the automated plasma reconstruction and provide (on successful reconstruction cases) statistics of agreement of the models with the experimental data, i.e. information on the model validity. (author)

  15. A Model of Process-Based Automation: Cost and Quality Implications in the Medication Management Process

    Science.gov (United States)

    Spaulding, Trent Joseph

    2011-01-01

    The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…

  16. Model of informational system for freight insurance automation based on digital signature

    Directory of Open Access Journals (Sweden)

    Maxim E. SLOBODYANYUK

    2009-01-01

    The article considers a model of an informational system for freight insurance automation based on digital signatures, and shows its architecture, a macro flowchart of the information flow in the model, and its components (modules) and their functions. It describes a method for calculating the costs of interactive cargo insurance via the proposed system, and presents the main characteristics and options of existing transport management systems as well as conceptual cost models.

  17. Finding Feasible Abstract Counter-Examples

    Science.gov (United States)

    Pasareanu, Corina S.; Dwyer, Matthew B.; Visser, Willem; Clancy, Daniel (Technical Monitor)

    2002-01-01

    A strength of model checking is its ability to automate the detection of subtle system errors and produce traces that exhibit those errors. Given the high computational cost of model checking most researchers advocate the use of aggressive property-preserving abstractions. Unfortunately, the more aggressively a system is abstracted the more infeasible behavior it will have. Thus, while abstraction enables efficient model checking it also threatens the usefulness of model checking as a defect detection tool, since it may be difficult to determine whether a counter-example is feasible and hence worth developer time to analyze. We have explored several strategies for addressing this problem by extending an explicit-state model checker, Java PathFinder (JPF), to search for and analyze counter-examples in the presence of abstractions. We demonstrate that these techniques effectively preserve the defect detection ability of model checking in the presence of aggressive abstraction by applying them to check properties of several abstracted multi-threaded Java programs. These new capabilities are not specific to JPF and can be easily adapted to other model checking frameworks; we describe how this was done for the Bandera toolset.

  18. Model-driven design using IEC 61499 a synchronous approach for embedded and automation systems

    CERN Document Server

    Yoong, Li Hsien; Bhatti, Zeeshan E; Kuo, Matthew M Y

    2015-01-01

    This book describes a novel approach for the design of embedded systems and industrial automation systems, using a unified model-driven approach that is applicable in both domains.  The authors illustrate their methodology, using the IEC 61499 standard as the main vehicle for specification, verification, static timing analysis and automated code synthesis.  The well-known synchronous approach is used as the main vehicle for defining an unambiguous semantics that ensures determinism and deadlock freedom. The proposed approach also ensures very efficient implementations either on small-scale embedded devices or on industry-scale programmable automation controllers (PACs). It can be used for both centralized and distributed implementations. Significantly, the proposed approach can be used without the need for any run-time support. This approach, for the first time, blurs the gap between embedded systems and automation systems and can be applied in wide-ranging applications in automotive, robotics, and industri...

  19. Model of automated computer aided NC machine tools programming

    Directory of Open Access Journals (Sweden)

    J. Balic

    2006-04-01

    Purpose: Modern companies tend towards the greatest possible automation in all areas. New concepts for controlling manufacturing processes require the development of adequate tools for introducing automated control in a given area. The paper presents such a system for the automated programming of CNC machine tools. Design/methodology/approach: The system is based on previously incorporated know-how and the rules for its implementation in the tool shop. The existing manufacturing knowledge of industrial tool production was collected and analysed, and on this basis a flow chart of all activities was made. A theoretical contribution is made in the systemization of technological knowledge, which is now accessible to all workers in NC preparation units. Findings: Utilization of technological knowledge. On the basis of the recognized properties, algorithms were worked out with which the manufacturing process, the tool, and the selected optimum parameters are indirectly determined, the target function being the generation of the NC programme. We can point out that, as CAM and CAPP converge informationally, the barriers between them, strict so far, disappear. Research limitations/implications: Until now, the system has been limited to milling, drilling, and similar operations. It could be extended to other machining operations (turning, grinding, wire cutting, etc.) with the same procedure. Further on, some methods of artificial intelligence could be used. Practical implications: The system is suitable for industrial tool, die, and mould production, and was proven in a real tool shop (production of tools for casting). It reduces the preparation time of NC programs and can be used with any commercially available CAD/CAM/NC programming system. Human errors are avoided or reduced. It is important for engineers in the CAD/CAM field and in tool shops. Originality/value: The developed system is original and was not found in the literature or in the

  20. Modeling the Energy Use of a Connected and Automated Transportation System (Poster)

    Energy Technology Data Exchange (ETDEWEB)

    Gonder, J.; Brown, A.

    2014-07-01

    Early research points to large potential impacts of connected and automated vehicles (CAVs) on transportation energy use - dramatic savings, increased use, or anything in between. Due to a lack of suitable data and integrated modeling tools to explore these complex future systems, analyses to date have relied on simple combinations of isolated effects. This poster proposes a framework for modeling the potential energy implications from increasing penetration of CAV technologies and for assessing technology and policy options to steer them toward favorable energy outcomes. Current CAV modeling challenges include estimating behavior change, understanding potential vehicle-to-vehicle interactions, and assessing traffic flow and vehicle use under different automation scenarios. To bridge these gaps and develop a picture of potential future automated systems, NREL is integrating existing modeling capabilities with additional tools and data inputs to create a more fully integrated CAV assessment toolkit.

  1. Data for Environmental Modeling (D4EM): Background and Applications of Data Automation

    Science.gov (United States)

    The Data for Environmental Modeling (D4EM) project demonstrates the development of a comprehensive set of open source software tools that overcome obstacles to accessing data needed by automating the process of populating model input data sets with environmental data available fr...

  2. A Binary Programming Approach to Automated Test Assembly for Cognitive Diagnosis Models

    Science.gov (United States)

    Finkelman, Matthew D.; Kim, Wonsuk; Roussos, Louis; Verschoor, Angela

    2010-01-01

    Automated test assembly (ATA) has been an area of prolific psychometric research. Although ATA methodology is well developed for unidimensional models, its application alongside cognitive diagnosis models (CDMs) is a burgeoning topic. Two suggested procedures for combining ATA and CDMs are to maximize the cognitive diagnostic index and to use a…
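
    The flavor of CDI-maximizing test assembly can be conveyed with a small sketch; note that it uses a greedy heuristic as a stand-in, whereas the paper's approach formulates the selection as a binary program solved to optimality. The item pool, CDI values, and content constraint below are invented.

```python
# Greedy stand-in for binary-programming ATA: pick a fixed-length test that
# maximizes summed CDI subject to a per-content-area cap. Data are invented.
items = [  # (item_id, cdi, content_area)
    ("i1", 2.1, "algebra"), ("i2", 1.7, "algebra"), ("i3", 2.4, "geometry"),
    ("i4", 0.9, "geometry"), ("i5", 1.5, "number"),  ("i6", 2.0, "number"),
    ("i7", 1.2, "algebra"), ("i8", 1.9, "geometry"),
]
TEST_LENGTH, MAX_PER_AREA = 4, 2

selected, per_area = [], {}
for item_id, cdi, area in sorted(items, key=lambda x: -x[1]):
    if len(selected) == TEST_LENGTH or per_area.get(area, 0) == MAX_PER_AREA:
        continue
    selected.append(item_id)
    per_area[area] = per_area.get(area, 0) + 1

print("assembled test:", selected)  # a binary-program solver would certify
                                    # optimality instead of this heuristic
```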

  3. Formalization & Data Abstraction During Use Case Modeling in Object Oriented Analysis & Design

    Directory of Open Access Journals (Sweden)

    Meena Sharma

    2013-05-01

    In object-oriented analysis and design, use cases represent the things of value that the system performs for its actors in UML and the Unified Process. Use cases are not functions or features: they allow us to obtain a behavioral abstraction of the system-to-be. The purpose of the behavioral abstraction is to get to the heart of what a system must do; we must first focus on who (or what) will use it, or be used by it, and then look at what the system must do for those users in order to do something useful. That is exactly what we expect from use cases as a behavioral abstraction. Apart from this, use cases are poor candidates for data abstraction; indeed, they provide no data abstraction at all. The main reason is that a use case describes the sequence of events or actions performed by the actor or the use case, and does not take data into account. In the earlier stages of development we focus on 'what' rather than 'how'; 'what' does not need to include data, whereas 'how' involves the data. Since use cases revolve around 'what' only, we are not able to extract the data. In order to incorporate data into use cases, one must recognize the need for data at the initial stages of development. We have developed a technique to integrate data into use cases. This paper concerns our investigations into handling data during the early stages of software development. The collected data abstraction helps in the analysis and then assists in forming the attributes of the candidate classes. This ensures that we will not miss any attribute required by the behavior abstracted using use cases. Formalization adds to the accuracy of the data abstraction. We have investigated the Object Constraint Language to perform better data abstraction during analysis and design in the unified paradigm. In this paper we present our research regarding early-stage data abstraction and its formalization.

  4. Integration of Heterogeneous Bibliographic Information through Data Abstractions.

    Science.gov (United States)

    Breazeal, Juliette Ow

    This study examines the integration of heterogeneous bibliographic information resources from geographically distributed locations in an automated, unified, and controlled way using abstract data types called "classes" through the Message-Object Model defined in Smalltalk-80 software. The concept of achieving data consistency by developing classes…

  5. Parameters Investigation of Mathematical Model of Productivity for Automated Line with Availability by DMAIC Methodology

    Directory of Open Access Journals (Sweden)

    Tan Chan Sin

    2014-01-01

    Automated lines are widely applied in industry, especially for mass production with low product variety. Productivity is one of the most important criteria for an automated line, as for industry in general, since it directly determines outputs and profits. Industry needs to forecast productivity accurately in order to meet customer demand, and the forecast is calculated using a mathematical model. A mathematical model of productivity with availability for automated lines has been introduced to express productivity in terms of a single level of reliability for stations and mechanisms. Since this model cannot achieve productivity close enough to the actual value, owing to parameters left out of consideration, an enhancement of the mathematical model is required that considers and adds the loss parameters missing from the current model. This paper presents the investigation of productivity-loss parameters using the DMAIC (Define, Measure, Analyze, Improve, and Control) concept and the PACE Prioritization Matrix (Priority, Action, Consider, and Eliminate). The investigated parameters are important for the further improvement of the mathematical model of productivity with availability, towards a robust mathematical model of productivity for automated lines.
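
    A hedged numeric sketch of the general relationship such models build on: productivity as the ideal cycle rate multiplied by availability, with availability derived from station failure rates and mean repair time. All numbers are invented, and the paper's model with its additional loss parameters is more detailed than this.

```python
# Productivity of an automated line as ideal cycle rate times availability.
# Cycle times, failure rates, and repair time are invented for illustration.
t_machining = 24.0      # machining time per part, s
t_auxiliary = 6.0       # transport/clamping time per cycle, s
cycle_time = t_machining + t_auxiliary

failure_rates = [1.2e-4, 0.8e-4, 2.0e-4]   # per second, one per station
mean_repair_time = 300.0                   # s

mtbf = 1.0 / sum(failure_rates)            # stations fail as a series system
availability = mtbf / (mtbf + mean_repair_time)

ideal_rate = 3600.0 / cycle_time           # parts per hour, no failures
print(f"availability = {availability:.3f}, "
      f"actual productivity ~ {ideal_rate * availability:.1f} parts/h")
```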

  6. Model for spatial synthesis of automated control system of the GCR type reactor

    International Nuclear Information System (INIS)

    This paper describes a model developed for the synthesis of the spatial distribution of automated control elements in the reactor. It represents a general, reliable mathematical model for analyzing transition states and for the synthesis of the automated control and regulation systems of GCR-type reactors. A one-dimensional system was defined under the assumption that the time dependences of the parameters of the neutron diffusion equation are identical over the total volume of the reactor and that the spatial distribution of neutrons is time-independent. It is shown that this assumption is satisfactory in the case of short-term variations, which are relevant for safety analysis

  7. Conceptual model of an automated information system of marketing at the enterprise

    Directory of Open Access Journals (Sweden)

    D.V. Raiko

    2014-09-01

    Full Text Available The aim of the article. The purpose of this paper is to create a conceptual model of an automated information system of marketing that has a certain theoretical and practical value. The results of the analysis. The main advantage of this model is its comprehensive disclosure of the relationships among concepts such as automated information technology, marketing information system and automated information system, which together solve the problem of processing large volumes of data in a short period of time, provide continuous communication with partners and customers, and make it possible to react quickly to market changes; this in turn contributes to competitiveness in domestic and foreign markets. The scientific novelty of this model is, firstly, the assertion that an information system based on automated information technology constitutes an automated information system. Secondly, the marketing information system is an integral part of the information system, whose structural elements are responsible for transforming data from internal and external sources of information into the information needed by managers and specialists of marketing services. Thirdly, the most important component ensuring the functioning of the marketing information system and the information system is automated information technology. Because these systems involve human resources, work within them is organized with the help of workstations. Conclusions and directions of further research. It is determined that this conceptual model provides multi-variant calculations for rational decision-making, including real-time organization of complex accounting and economic analysis, and ensures the reliability and efficiency of the information obtained and used in management. The model has been tested on the example of several industries, confirming its practical significance.

  8. Abstract Platform and Transformations for Model-Driven Service-Oriented Development

    NARCIS (Netherlands)

    Andrade Almeida, J.P.; Ferreira Pires, L.; Sinderen, van M.J.

    2006-01-01

    In this paper, we discuss the use of abstract platforms and transformations for designing applications according to the principles of service-oriented architecture. We illustrate our approach by discussing the use of the service discovery pattern at a platform-independent design level. We show how…

  9. Planning for Evolution in a Production Environment: Migration from a Legacy Geometry Code to an Abstract Geometry Modeling Language in STAR

    International Nuclear Information System (INIS)

    Increasingly detailed descriptions of complex detector geometries are required for the simulation and analysis of today's high-energy and nuclear physics experiments. As new tools for the representation of geometry models become available during the course of an experiment, a fundamental challenge arises: how best to migrate from legacy geometry codes developed over many runs to the new technologies, such as the ROOT/TGeo [1] framework, without losing touch with years of development, tuning and validation. One approach, which has been discussed within the community for a number of years, is to represent the geometry model in a higher-level language independent of the concrete implementation of the geometry. The STAR experiment has used this approach to successfully migrate its legacy GEANT 3-era geometry to an Abstract geometry Modelling Language (AgML), which allows us to create both native GEANT 3 and ROOT/TGeo implementations. The language is supported by parsers and a C++ class library which enables the automated conversion of the original source code to AgML, supports export back to the original AgSTAR[5] representation, and creates the concrete ROOT/TGeo geometry implementation used by our track reconstruction software. In this paper we present our approach, design and experience and will demonstrate physical consistency between the original AgSTAR and new AgML geometry representations.

  10. Automated Generation of Digital Terrain Model using Point Clouds of Digital Surface Model in Forest Area

    Directory of Open Access Journals (Sweden)

    Yoshikazu Kamiya

    2011-04-01

    Full Text Available At present, most digital data acquisition methods generate a Digital Surface Model (DSM) and not a Digital Elevation Model (DEM). Conversion from DSM to DEM still has some drawbacks, especially the removal of off-terrain point clouds and the subsequent generation of the DEM within these spaces, even though the methods are automated. In this paper we attempt to overcome this issue in forest areas by projecting off-terrain point clouds onto the terrain using Artificial Neural Networks (ANN), instead of removing them and then filling the gaps by interpolation. Five sites were tested and the accuracies assessed; all give almost the same results. In conclusion, the ANN is able to obtain the DEM by projecting the DSM point clouds, and higher DEM accuracies were obtained. The larger the hollow areas resulting from the removal of DSM point clouds, the lower the accuracy.
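
    The projection idea can be sketched as follows: train a regressor on known ground points and predict the terrain height beneath off-terrain points instead of discarding them. The use of scikit-learn's MLPRegressor and the synthetic terrain are illustrative assumptions, not the paper's network or data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic ground points (x, y, z) sampled from a smooth terrain surface.
xy_ground = rng.uniform(0, 100, size=(500, 2))
z_ground = 0.05 * xy_ground[:, 0] + 2.0 * np.sin(xy_ground[:, 1] / 10.0)

# Train an ANN to represent the terrain surface z = f(x, y).
ann = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
ann.fit(xy_ground, z_ground)

# "Project" off-terrain DSM points (e.g. canopy returns) onto the terrain:
# keep their planimetric position, replace the elevation with the prediction.
xy_canopy = rng.uniform(0, 100, size=(50, 2))
dem_points = np.column_stack([xy_canopy, ann.predict(xy_canopy)])
print(dem_points[:3])
```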

  11. Tensor contraction engine: Abstraction and automated parallel implementation of configuration-interaction, coupled-cluster, and many-body perturbation theories

    International Nuclear Information System (INIS)

    We develop a symbolic manipulation program and program generator (Tensor Contraction Engine or TCE) that automatically derives the working equations of a well-defined model of second-quantized many-electron theories and synthesizes efficient parallel computer programs on the basis of these equations. Provided an ansatz of a many-electron theory model, TCE performs valid contractions of creation and annihilation operators according to Wick's theorem, consolidates identical terms, and reduces the expressions into the form of multiple tensor contractions acted on by permutation operators. Subsequently, it determines the binary contraction order for each multiple tensor contraction with the minimal operation and memory cost, factorizes common binary contractions (defines intermediate tensors), and identifies reusable intermediates. The resulting ordered list of binary tensor contractions, additions, and index permutations is translated into an optimized program that is combined with the NWChem and UTChem computational chemistry software packages. The programs synthesized by TCE take advantage of spin symmetry, Abelian point-group symmetry, and index permutation symmetry at every stage of calculations to minimize the number of arithmetic operations and storage requirements, adjust the peak local memory usage by index range tiling, and support parallel I/O interfaces and dynamic load balancing for parallel executions. We demonstrate the utility of TCE through automatic derivation and implementation of parallel programs for various models of configuration-interaction theory (CISD, CISDT, CISDTQ), many-body perturbation theory [MBPT(2), MBPT(3), MBPT(4)], and coupled-cluster theory (LCCD, CCD, LCCSD, CCSD, QCISD, CCSDT, and CCSDTQ).
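
    TCE's search for a minimal-cost binary contraction order has a small-scale analogue in NumPy's einsum_path, which reports the pairwise contraction sequence it selects for a multiple tensor contraction. The snippet below is an illustrative stand-in for that one step, not the TCE algorithm; the contraction pattern is invented.

```python
import numpy as np

# Toy multiple tensor contraction loosely resembling a coupled-cluster term:
# R[a,b,i,j] = sum_{c,d,k,l} T[c,d,k,l] * V[a,c,i,k] * W[b,d,j,l]
n = 8
T, V, W = (np.random.rand(n, n, n, n) for _ in range(3))

# einsum_path searches for the binary contraction order with minimal
# floating-point cost, much as TCE does symbolically before code generation.
path, info = np.einsum_path('cdkl,acik,bdjl->abij', T, V, W, optimize='optimal')
print(info)  # reports the chosen pairwise order and the predicted FLOP saving
```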

  12. Automated modelling of complex refrigeration cycles through topological structure analysis

    International Nuclear Information System (INIS)

    We have developed a computational method for analysis of refrigeration cycles. The method is well suited for automated analysis of complex refrigeration systems. The refrigerator is specified through a description of flows representing thermodynamic states at system locations; components that modify the thermodynamic state of a flow; and controls that specify flow characteristics at selected points in the diagram. A system of equations is then established for the refrigerator, based on mass, energy and momentum balances for each of the system components. Controls specify the values of certain system variables, thereby reducing the number of unknowns. It is found that the system of equations for the refrigerator may contain a number of redundant or duplicate equations, and therefore further equations are necessary for a full characterization. The number of additional equations is related to the number of loops in the cycle, and this is calculated by a matrix-based topological method. The methodology is demonstrated through an analysis of a two-stage refrigeration cycle.
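
    The loop count that determines the number of additional equations is the cycle rank of the flow graph, E - N + C. A minimal sketch with networkx follows; the two-stage cycle topology here is invented for illustration.

```python
import networkx as nx

# Flow diagram of a two-stage cycle as an undirected graph: nodes are
# components (compressors, condenser, valve, evaporators), edges are flows.
G = nx.Graph()
G.add_edges_from([
    ("comp1", "cond"), ("cond", "valve"), ("valve", "evap1"),
    ("evap1", "comp1"),                     # first loop
    ("valve", "evap2"), ("evap2", "comp2"),
    ("comp2", "cond"),                      # second loop sharing components
])

# Number of independent loops = E - N + C (the cycle rank of the graph),
# i.e. the number of extra equations needed for a full characterization.
E, N = G.number_of_edges(), G.number_of_nodes()
C = nx.number_connected_components(G)
print("independent loops:", E - N + C)      # -> 2
print("a cycle basis:", nx.cycle_basis(G))
```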

  13. Learning with Technology: Video Modeling with Concrete-Representational-Abstract Sequencing for Students with Autism Spectrum Disorder

    Science.gov (United States)

    Yakubova, Gulnoza; Hughes, Elizabeth M.; Shinaberry, Megan

    2016-01-01

    The purpose of this study was to determine the effectiveness of a video modeling intervention with concrete-representational-abstract instructional sequence in teaching mathematics concepts to students with autism spectrum disorder (ASD). A multiple baseline across skills design of single-case experimental methodology was used to determine the…

  14. Automated Discovery and Modeling of Sequential Patterns Preceding Events of Interest

    Science.gov (United States)

    Rohloff, Kurt

    2010-01-01

    The integration of emerging data manipulation technologies has enabled a paradigm shift in practitioners' abilities to understand and anticipate events of interest in complex systems. Example events of interest include outbreaks of socio-political violence in nation-states. Rather than relying on human-centric modeling efforts that are limited by the availability of subject-matter experts (SMEs), automated data processing technologies have enabled the development of innovative automated complex system modeling and predictive analysis technologies. We introduce one such emerging modeling technology - the sequential pattern methodology. We have applied the sequential pattern methodology to automatically identify patterns of observed behavior that precede outbreaks of socio-political violence such as riots, rebellions and coups in nation-states. The sequential pattern methodology is a groundbreaking approach to automated complex system model discovery because it generates easily interpretable patterns based on direct observations of sampled factor data for a deeper understanding of societal behaviors that is tolerant of observation noise and missing data. The discovered patterns are simple to interpret and mimic human identification of observed trends in temporal data. Discovered patterns also provide an automated forecasting ability: we discuss an example of using discovered patterns coupled with a rich data environment to forecast various types of socio-political violence in nation-states.

  15. Modeling Multiple Human-Automation Distributed Systems using Network-form Games

    Science.gov (United States)

    Brat, Guillaume

    2012-01-01

    The paper describes at a high-level the network-form game framework (based on Bayes net and game theory), which can be used to model and analyze safety issues in large, distributed, mixed human-automation systems such as NextGen.

  16. Automated Test Assembly for Cognitive Diagnosis Models Using a Genetic Algorithm

    Science.gov (United States)

    Finkelman, Matthew; Kim, Wonsuk; Roussos, Louis A.

    2009-01-01

    Much recent psychometric literature has focused on cognitive diagnosis models (CDMs), a promising class of instruments used to measure the strengths and weaknesses of examinees. This article introduces a genetic algorithm to perform automated test assembly alongside CDMs. The algorithm is flexible in that it can be applied whether the goal is to…
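
    A minimal sketch of the assembly step the abstract describes: a simple evolutionary loop that selects a fixed-length test form maximizing a CDM-style discrimination index. The item statistics, fitness function and GA settings are all invented for illustration and are not the authors' algorithm.

```python
import random

random.seed(1)
N_ITEMS, TEST_LEN = 60, 10
# Invented per-item discrimination indices (standing in for CDM statistics).
disc = [random.uniform(0.1, 1.0) for _ in range(N_ITEMS)]

def fitness(form):
    return sum(disc[i] for i in form)

def mutate(form):
    """Swap one selected item for a random unselected one."""
    child = set(form)
    child.remove(random.choice(sorted(child)))
    child.add(random.choice([i for i in range(N_ITEMS) if i not in child]))
    return frozenset(child)

# Simple (mu + lambda) evolutionary loop over candidate test forms.
pop = [frozenset(random.sample(range(N_ITEMS), TEST_LEN)) for _ in range(20)]
for _ in range(200):
    pop += [mutate(p) for p in pop]
    pop = sorted(set(pop), key=fitness, reverse=True)[:20]

best = max(pop, key=fitness)
print(sorted(best), round(fitness(best), 3))
```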

  17. Evaluation of automated cell disruptor methods for oomycetous and ascomycetous model organisms

    Science.gov (United States)

    Two automated cell disruptor-based methods for RNA extraction, disruption of thawed cells submerged in TRIzol Reagent (method QP) and direct disruption of frozen cells on dry ice (method CP), were optimized for a model oomycete, Phytophthora capsici, and compared with grinding in a mortar and pestl...

  18. AUTOMATED FORMATION OF CALCULATION MODELS OF TURBOGENERATORS FOR SOFTWARE ENVIRONMENT FEMM

    OpenAIRE

    Milykh, V. I.; Polyakova, N. V.

    2015-01-01

    Attention is paid to the popular FEMM (Finite Element Method Magnetics) program, which is effective for numerical calculations of the magnetic fields of electrical machines. The main problem of its use, namely the high time cost of forming a graphical model representing the design and a physical model representing the material properties and winding currents of the machines, is solved. For this purpose, principles of the automated formation of such models are de...

  19. Using Automated On-Site Monitoring to Calibrate Empirical Models of Trihalomethanes Concentrations in Drinking Water

    OpenAIRE

    Thomas E. Watts III; Robyn A. Snow; Brown, Aaron W.; J. C. York; Greg Fantom; Paul S. Simone Jr.; Gary L. Emmert

    2015-01-01

    An automated, on-site trihalomethanes concentration data set from a conventional water treatment plant was used to optimize powdered activated carbon and pre-chlorination doses. The trihalomethanes concentration data set was used with commonly monitored water quality parameters to improve an empirical model of trihalomethanes formation. A calibrated model was used to predict trihalomethanes concentrations the following year. The agreement between the models and measurements was evaluated. The...

  20. BALWOIS: Abstracts

    International Nuclear Information System (INIS)

    anthropogenic pressures and international shared water. Here are the 320 abstracts proposed by authors and accepted by the Scientific Committee. More than 200 papers are presented during the Conference on 8 topics related to Hydrology, Climatology and Hydrobiology: - Climate and Environment; - Hydrological regimes and water balances; - Droughts and Floods; - Integrated Water Resources Management; - Water bodies Protection and Ecohydrology; - Lakes; - Information Systems for decision support; - Hydrological modelling. Papers relevant to INIS are indexed separately

  1. An automated construction of error models for uncertainty quantification and model calibration

    Science.gov (United States)

    Josset, L.; Lunati, I.

    2015-12-01

    To reduce the computational cost of stochastic predictions, it is common practice to rely on approximate flow solvers (or «proxy»), which provide an inexact but computationally inexpensive response [1,2]. Error models can be constructed to correct the proxy response: based on a learning set of realizations for which both exact and proxy simulations are performed, a transformation is sought to map proxy into exact responses. Once the error model is constructed, a prediction of the exact response is obtained at the cost of a proxy simulation for any new realization. Despite its effectiveness [2,3], the methodology relies on several user-defined parameters, which impact the accuracy of the predictions. To achieve a fully automated construction, we propose a novel methodology based on an iterative scheme: we first initialize the error model with a small training set of realizations; then, at each iteration, we add a new realization both to improve the model and to evaluate its performance. More specifically, at each iteration we use the responses predicted by the updated model to identify the realizations that need to be considered to compute the quantity of interest. Another user-defined parameter is the number of dimensions of the response spaces between which the mapping is sought. To identify the space dimensions that optimally balance mapping accuracy and risk of overfitting, we follow a Leave-One-Out Cross Validation. Also, the definition of a stopping criterion is central to an automated construction. We use a stability measure based on bootstrap techniques to stop the iterative procedure when the iterative model has converged. The methodology is illustrated with two test cases in which an inverse problem has to be solved, and we assess the performance of the method. We show that an iterative scheme is crucial to increase the applicability of the approach. [1] Josset, L., and I. Lunati, Local and global error models for improving uncertainty quantification, Math
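
    A compressed sketch of the iterative loop described above: fit a map from proxy to exact responses on a small learning set, choose the mapping dimension by leave-one-out cross validation, and enrich the set with the realization the current model flags. The linear map, the synthetic data and the selection rule are assumptions standing in for the authors' choices.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(3)

# Synthetic stand-ins: 5-dimensional proxy responses, scalar exact responses.
proxy = rng.normal(size=(200, 5))
exact = proxy @ rng.normal(size=5) + 0.1 * rng.normal(size=200)

train = list(range(8))                      # small initial learning set
for _ in range(10):                         # iterative enrichment
    # Pick the number of response-space dimensions by leave-one-out CV.
    scores = [cross_val_score(LinearRegression(), proxy[train][:, :d],
                              exact[train], cv=LeaveOneOut(),
                              scoring='neg_mean_squared_error').mean()
              for d in range(1, 6)]
    d_best = 1 + int(np.argmax(scores))
    model = LinearRegression().fit(proxy[train][:, :d_best], exact[train])
    # Add the realization with the most extreme prediction, standing in for
    # "the realizations needed to compute the quantity of interest".
    pred = model.predict(proxy[:, :d_best])
    new = next(i for i in np.argsort(-np.abs(pred)) if i not in train)
    train.append(int(new))

print("training set size:", len(train), "| chosen dimension:", d_best)
```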

  2. Automated Bayesian model development for frequency detection in biological time series

    Directory of Open Access Journals (Sweden)

    Oldroyd Giles ED

    2011-06-01

    Full Text Available Abstract Background A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domains, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. Results In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlight the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Conclusions Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and
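
    A minimal sketch of single-frequency Bayesian spectrum analysis in the Bretthorst style, where the posterior over frequency is a simple transform of the Schuster periodogram. The test signal is invented; the formula is the standard textbook result, not code from the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

# Short, noisy test series containing one oscillation (invented example).
N = 60
t = np.arange(N)
y = np.cos(2 * np.pi * 0.13 * t + 0.4) + 0.8 * rng.normal(size=N)
y = y - y.mean()

freqs = np.linspace(0.01, 0.49, 2000)
omega = 2 * np.pi * freqs

# Schuster periodogram C(w) = |sum_t y_t exp(-i w t)|^2 / N.
C = np.abs(np.exp(-1j * np.outer(omega, t)) @ y) ** 2 / N

# Bretthorst's single-frequency posterior (noise level marginalized out):
# p(w|D) is proportional to [1 - 2 C(w) / (N ybar2)] ** ((2 - N) / 2).
ybar2 = np.mean(y ** 2)
log_post = 0.5 * (2 - N) * np.log(np.clip(1 - 2 * C / (N * ybar2), 1e-12, None))
post = np.exp(log_post - log_post.max())
post /= post.sum()

print("most probable frequency:", freqs[np.argmax(post)])  # near the true 0.13
```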

  3. AUTOMATED FORMATION OF FUNCTIONAL MODELS OF RAILWAY STATIONS

    OpenAIRE

    Д. Н. КОЗАЧЕНКО; ВЕРНИГОРА Р.В.; МАЛАШКИН, В. В.

    2015-01-01

    Simulation of processes is an effective means of analysing and quantitatively assessing the functioning of railway stations. Ergatic models of railway stations provide highly adequate simulation. In such models a person is directly involved in the modelling process, acting as the station manager. The functioning of an ergatic model relies on data files of a special structure in which the formalized technical equipment and technological process of the station are encoded. Experience of creating ergati...

  4. Using Automated On-Site Monitoring to Calibrate Empirical Models of Trihalomethanes Concentrations in Drinking Water

    Directory of Open Access Journals (Sweden)

    Thomas E. Watts III

    2015-10-01

    Full Text Available An automated, on-site trihalomethanes concentration data set from a conventional water treatment plant was used to optimize powdered activated carbon and pre-chlorination doses. The trihalomethanes concentration data set was used with commonly monitored water quality parameters to improve an empirical model of trihalomethanes formation. The calibrated model was then used to predict trihalomethanes concentrations the following year, and the agreement between the models and measurements was evaluated. The original model predicted trihalomethanes concentrations to within ~10 μg·L⁻¹ of the measurement; calibration improved the model predictions by a factor of three to five relative to the literature model.
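
    The calibration step can be illustrated with a generic power-law THM formation model fitted by nonlinear least squares. The functional form, predictor set and data below are stand-ins (a common literature form), not the plant-specific model of the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

# Generic empirical THM formation model: THM = k * TOC^a * Cl2^b * t^c.
def thm_model(X, k, a, b, c):
    toc, cl2, t = X
    return k * toc**a * cl2**b * t**c

# Invented monitoring data: TOC (mg/L), chlorine dose (mg/L), contact time (h).
rng = np.random.default_rng(5)
toc = rng.uniform(2, 6, 80)
cl2 = rng.uniform(1, 4, 80)
t = rng.uniform(2, 24, 80)
thm = 12.0 * toc**0.9 * cl2**0.5 * t**0.3 * rng.normal(1.0, 0.05, 80)

# Calibrate the model against the on-site trihalomethanes measurements.
popt, _ = curve_fit(thm_model, (toc, cl2, t), thm, p0=[10, 1, 1, 0.5])
print("k, a, b, c =", np.round(popt, 2))
```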

  5. Automating Measurement for Software Process Models using Attribute Grammar Rules

    Directory of Open Access Journals (Sweden)

    Abdul Azim Abd. Ghani

    2007-08-01

    Full Text Available The modelling concept is well accepted in the software engineering discipline. Some software models are built to control the development stages, to measure program quality, or to serve as a medium that gives a better understanding of the actual software system. Software process modelling has nowadays reached a level that allows software designs to be transformed into programming languages, such as architecture design languages and the unified modelling language. This paper describes the adaptation of an attribute grammar approach to measuring software process models. A tool, the Software Process Measurement Application, was developed to perform measurement according to specified attribute grammar rules. A context-free grammar for reading the process model is derived from the IDEF3 standard, and rules are attached to enable the calculation of measurement metrics. The collected metric values were used to aid in determining the decomposition and structuring of processes for the proposed software systems.

  6. Restructuring of workflows to minimise errors via stochastic model checking: An automated evolutionary approach

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee

    2016-01-01

    This article presents a framework for the automated restructuring of stochastic workflows to reduce the impact of faults. The framework allows for the modelling of workflows by means of a formalised subset of the BPMN workflow language. We extend this modelling formalism to describe faults and...... model resources, associated with a workflow. The approach is fully automated and only the modelling of the production workflows, potential faults and the expression of the goals require manual input. We present the design of a software tool implementing this framework and explore the practical utility...... of this approach through an industrial case study in which the risk of production failures and their impact are reduced by restructuring the workflow....

  7. Automated evolutionary restructuring of workflows to minimise errors via stochastic model checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents a framework for the automated restructuring of workflows that allows one to minimise the impact of errors on a production workflow. The framework allows for the modelling of workflows by means of a formalised subset of the Business Process Modelling and Notation (BPMN) language......, a well-established visual language for modelling workflows in a business context. The framework’s modelling language is extended to include the tracking of real-valued quantities associated with the process (such as time, cost, temperature). In addition, this language also allows for an intention...... by means of a case study from the food industry. Through this case study we explore the extent to which the risk of production faults can be reduced and the impact of these can be minimised, primarily through restructuring of the production workflows. This approach is fully automated and only the...

  8. Gaia: automated quality assessment of protein structure models

    OpenAIRE

    Kota, Pradeep; Ding, Feng; Ramachandran, Srinivas; Dokholyan, Nikolay V.

    2011-01-01

    Motivation: Increasing use of structural modeling for understanding structure–function relationships in proteins has led to the need to ensure that the protein models being used are of acceptable quality. Quality of a given protein structure can be assessed by comparing various intrinsic structural properties of the protein to those observed in high-resolution protein structures.

  9. PDB_REDO: automated re-refinement of X-ray structure models in the PDB

    OpenAIRE

    Joosten, R.P.; Salzemann, J.; Bloch, V.; Stockinger, H.; Berglund, A; Blanchet, C.; Bongcam-Rudloff, E.; Combet, C.; Da Costa, A.L.; Deleage, G.; Diarena, M.; Fabbretti, R.; Fettahi, G.; Flegel, V.; Gisel, A.

    2009-01-01

    Structural biology, homology modelling and rational drug design require accurate three-dimensional macromolecular coordinates. However, the coordinates in the Protein Data Bank (PDB) have not all been obtained using the latest experimental and computational methods. In this study a method is presented for automated re-refinement of existing structure models in the PDB. A large-scale benchmark with 16 807 PDB entries showed that they can be improved in terms of fit to the deposited experimenta...

  10. Automated longitudinal monitoring of in vivo protein aggregation in neurodegenerative disease C. elegans models

    OpenAIRE

    Cornaglia, Matteo; Krishnamani, Gopalan; Mouchiroud, Laurent; Sorrentino, Vincenzo; Lehnert, Thomas; Auwerx, Johan; Gijs, Martin A. M.

    2016-01-01

    Background While many biological studies can be performed on cell-based systems, the investigation of molecular pathways related to complex human dysfunctions – e.g. neurodegenerative diseases – often requires long-term studies in animal models. The nematode Caenorhabditis elegans represents one of the best model organisms for many of these tests and, therefore, versatile and automated systems for accurate time-resolved analyses on C. elegans are becoming highly desirable tools in the field. ...

  11. K model for designing Data Driven Test Automation Frameworks and its Design Architecture Snow Leopard

    OpenAIRE

    Kachewar, Rohan R.

    2013-01-01

    Automated testing improves the efficiency of testing practice at various levels of projects in the organization. Unfortunately, we do not have a common architecture or common standards for designing frameworks across different test levels, projects and test tools which can assist developers, testers and business analysts. To address the above problem, in this paper, I have first proposed a unique reference model and then a design architecture using the proposed model for designing any Data Dr...

  12. MLP based Reusability Assessment Automation Model for Java based Software Systems

    Directory of Open Access Journals (Sweden)

    Surbhi Maggo

    2014-08-01

    Full Text Available Reuse refers to the common principle of using existing resources repeatedly, which is pervasively applicable. In software engineering, reuse refers to the development of software systems using already available artifacts or assets, partially or completely, with or without modifications. Software reuse not only promises significant improvements in productivity and quality but also provides for the development of more reliable, cost-effective, dependable and less buggy software (considering that prior use and testing have removed errors) with reduced time and effort. In this paper we present an efficient and reliable automation model for the reusability evaluation of procedure-based object-oriented software, predicting the reusability level of components as low, medium or high. The presented model follows a reusability metric framework that targets the requisite reusability attributes, including maintainability (using the Maintainability Index) for functional analysis of the components. Further, a multilayer perceptron (a back-propagation-based neural network) is applied to establish significant relationships among these attributes for reusability prediction. The proposed approach provides support for reusability evaluation at the functional level rather than at the structural level. Automation support for this approach is provided in the form of a tool named JRA2M2 (Java-based Reusability Assessment Automation Model using a Multilayer Perceptron (MLP)), implemented in Java. The performance of JRA2M2 is recorded using parameters like accuracy, classification error, precision and recall. The results generated using JRA2M2 indicate that the proposed automation tool can be effectively used as a reliable and efficient solution for the automated evaluation of reusability.
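
    The classification step can be sketched with a small multilayer perceptron mapping component metrics to low/medium/high reusability levels. The metrics, the labelling rule and the network settings below are invented for illustration; this is not the JRA2M2 tool.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(2)

# Invented component metrics: [maintainability index, coupling, cohesion, size].
X = rng.uniform(0, 1, size=(300, 4))
# Invented rule standing in for expert-assigned reusability levels.
score = 0.5 * X[:, 0] - 0.3 * X[:, 1] + 0.2 * X[:, 2]
y = np.digitize(score, [0.15, 0.35])        # 0 = low, 1 = medium, 2 = high

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), labels=[0, 1, 2],
                            target_names=["low", "medium", "high"]))
```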

  13. Software Test Case Automated Generation Algorithm with Extended EDPN Model

    Directory of Open Access Journals (Sweden)

    Jinlong Tao

    2013-08-01

    Full Text Available To improve the sufficiency of software testing and the performance of testing algorithms, an improved event-driven Petri net model built with a combination method, abbreviated the OEDPN model, is proposed. It is then applied to the OATS method to extend its implementation. On the basis of the OEDPN model, a marked associative recursive method of state combination by category is presented to address combination conflicts, the test-case explosion caused by redundant test cases, and the difficulty of extending the OATS method. Methods for generating interactive test cases with the extended OATS are also presented, based on research on test case generation.

  14. An Abstract Model for Proving Safety of Multi-lane Traffic Manoeuvres

    DEFF Research Database (Denmark)

    Hilscher, Martin; Linker, Sven; Olderog, Ernst-Rüdiger;

    2011-01-01

    We present an approach to prove safety (collision freedom) of multi-lane motorway traffic with lane-change manoeuvres. This is ultimately a hybrid verification problem due to the continuous dynamics of the cars. We abstract from the dynamics by introducing a new spatial interval logic based on the...... view of each car. To guarantee safety, we present two variants of a lane-change controller, one with perfect knowledge of the safety envelopes of neighbouring cars and one which takes only the size of the neighbouring cars into account. Based on these controllers we provide a local safety proof for...... unboundedly many cars by showing that at any moment the reserved space of each car is disjoint from the reserved space of any other car....

  15. Automation of Safety Analysis with SysML Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This project was a small proof-of-concept case study, generating SysML model information as a side effect of safety analysis. A prototype FMEA Assistant was...

  16. Assessing model uncertainty using hexavalent chromium and lung cancer mortality as an example [Abstract 2015

    Science.gov (United States)

    Introduction: The National Research Council recommended quantitative evaluation of uncertainty in effect estimates for risk assessment. This analysis considers uncertainty across model forms and model parameterizations with hexavalent chromium [Cr(VI)] and lung cancer mortality a...

  17. Automated Planning of context-aware Process Models

    OpenAIRE

    Heinrich, Bernd; Schön, Dominik

    2015-01-01

    Most real-world processes are heavily influenced by environmental factors, which are referred to as the context of a process. Thus, the consideration of context is proposed within the research strand of Business Process Modeling. Most existing context-aware modeling approaches consider context only in terms of static information like, for instance, the location where a process is performed. However, context information like the weather could change during the execution of a process, which w...

  18. TRAFFIC FLOW MODEL BASED ON CELLULAR AUTOMATON WITH ADAPTIVE DECELERATION

    OpenAIRE

    Shinkarev, A. A.

    2016-01-01

    This paper describes the continuation of the authors’ work in the field of traffic flow mathematical models based on cellular automata theory. The refactored representation of the multifactorial traffic flow model based on cellular automata theory is used to represent the implementation of an adaptive deceleration step. The adaptive deceleration step allows a car to slow down smoothly, rather than instantly, when its leader decelerates. Concepts of the number of time steps without confli...
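
    One schematic reading of the adaptive deceleration step is a Nagel-Schreckenberg-style cellular automaton in which a car sheds at most one unit of speed per step when that is safe, with emergency braking as a fallback. The rule set and parameters below are illustrative, not the authors' exact model.

```python
import random

random.seed(0)
L, N, VMAX, P_SLOW, STEPS = 100, 20, 5, 0.2, 200

pos = sorted(random.sample(range(L), N))   # car positions on a ring road
vel = [0] * N

for _ in range(STEPS):
    move = [0] * N
    # Update from the front car backwards so each follower knows how far its
    # leader actually advances this step (keeps the rule collision-free).
    for i in reversed(range(N)):
        lead = (i + 1) % N
        gap = (pos[lead] - pos[i] - 1) % L
        room = gap if i == N - 1 else gap + move[lead]
        v = min(vel[i] + 1, VMAX)              # acceleration intent
        if v > room:
            # Adaptive deceleration: slow down by one unit if that is safe,
            # otherwise fall back to emergency braking down to the free room.
            v = min(max(vel[i] - 1, 0), room)
        if v > 0 and random.random() < P_SLOW:
            v -= 1                             # random dawdling (NaSch rule)
        vel[i], move[i] = v, v
    pos = [(pos[i] + move[i]) % L for i in range(N)]

print("mean speed after relaxation:", sum(vel) / N)
```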

  19. The importance of information goods abstraction levels for information commerce process models

    NARCIS (Netherlands)

    Wijnhoven, Fons

    2002-01-01

    A process model, in the context of e-commerce, is an organized set of activities for the creation, (re-)production, trade and delivery of goods. Electronic commerce studies have created important process models for the trade of physical goods via Internet. These models are not easily suitable for th

  20. Optimization of automation: I. Estimation method of cognitive automation rates reflecting the effects of automation on human operators in nuclear power plants

    International Nuclear Information System (INIS)

    Highlights: • We propose an estimation method of the automation rate by taking the advantages of automation as the estimation measures. • We conduct the experiments to examine the validity of the suggested method. • The higher the cognitive automation rate is, the greater the decreased rate of the working time will be. • The usefulness of the suggested estimation method is proved by statistical analyses. - Abstract: Since automation was introduced in various industrial fields, the concept of the automation rate has been used to indicate the inclusion proportion of automation among all work processes or facilities. Expressions of the inclusion proportion of automation are predictable, as is the ability to express the degree of the enhancement of human performance. However, many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, this paper proposes a new estimation method of the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs). Automation in NPPs can be divided into two types: system automation and cognitive automation. Some general descriptions and characteristics of each type of automation are provided, and the advantages of automation are investigated. The advantages of each type of automation are used as measures of the estimation method of the automation rate. One advantage was found to be a reduction in the number of tasks, and another was a reduction in human cognitive task loads. The system and the cognitive automation rate were proposed as quantitative measures by taking advantage of the aforementioned benefits. To quantify the required human cognitive task loads and thus suggest the cognitive automation rate, Conant’s information-theory-based model was applied. The validity of the suggested method, especially as regards the cognitive automation rate, was proven by conducting

  1. Automated image analysis of lateral lumbar X-rays by a form model

    International Nuclear Information System (INIS)

    Development of software for fully automated image analysis of lateral lumbar spine X-rays. Material and method: Using the concept of active shape models, we developed software that produces a form model of the lumbar spine from lateral lumbar spine radiographs and runs an automated image segmentation. This model is able to detect lumbar vertebrae automatically after the filtering of digitized X-ray images. The model was trained with 20 lateral lumbar spine radiographs with no pathological findings before we evaluated the software with 30 further X-ray images, sorted by image quality from one (best) to three (worst), with 10 images per quality grade. Results: Image recognition depended strongly on image quality. In group one 52 and in group two 51 out of 60 vertebral bodies including the sacrum were recognized, but in group three only 18 vertebral bodies were properly identified. Conclusion: Fully automated and reliable recognition of vertebral bodies from lateral spine radiographs using the concept of active shape models is possible. The precision of this technique is limited by the superposition of different structures. Further improvements are necessary; standardized image quality and enlargement of the training data set are therefore required. (orig.)
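
    The statistical core of an active shape model is a principal component analysis over aligned landmark configurations; a minimal numpy sketch with invented landmark data follows (not the spine model of the paper, and with Procrustes alignment omitted).

```python
import numpy as np

rng = np.random.default_rng(4)

# Invented training data: 20 shapes, each 12 (x, y) landmarks along a spine,
# flattened to 24-vectors and assumed already aligned.
shapes = rng.normal(size=(20, 24))

mean_shape = shapes.mean(axis=0)
# PCA of the landmark covariance: eigenvectors are the modes of variation.
cov = np.cov((shapes - mean_shape).T)
eigval, eigvec = np.linalg.eigh(cov)
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

# Any plausible shape is the mean plus a few modes: x = x_mean + P @ b,
# with each |b_k| constrained to about 3 * sqrt(eigval_k) during search.
b = np.array([2.0, -1.0])
new_shape = mean_shape + eigvec[:, :2] @ b
print(new_shape.reshape(12, 2)[:3])
```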

  2. An Abstract Model Showing That the Spatial Structure of Social Networks Affects the Outcomes of Cultural Transmission Processes

    OpenAIRE

    Andrew White

    2013-01-01

    Space plays an important role in the transfer of information in most societies that archaeologists study. Social networks that mediate learning and the transmission of cultural information are situated in spatial environments. This paper uses an abstract agent-based model to represent the transmission of the value of a single "stylistic" variable among groups linked together within a social network, the spatial structure of which is varied using a few simple parameters. The properties of the ...

  3. Automated economic analysis model for hazardous waste minimization

    International Nuclear Information System (INIS)

    The US Army has established a policy of achieving a 50 percent reduction in hazardous waste generation by the end of 1992. To assist the Army in reaching this goal, the Environmental Division of the US Army Construction Engineering Research Laboratory (USACERL) designed the Economic Analysis Model for Hazardous Waste Minimization (EAHWM). The EAHWM was designed to allow the user to evaluate the life cycle costs for various techniques used in hazardous waste minimization and to compare them to the life cycle costs of current operating practices. The program was developed in the C language on an IBM-compatible PC and is consistent with other pertinent models for performing economic analyses. The potential hierarchical minimization categories used in EAHWM include source reduction, recovery and/or reuse, and treatment. Although treatment is no longer an acceptable minimization option, its use is widespread and has therefore been addressed in the model. The model allows for economic analysis of minimization of the Army's six most important hazardous waste streams. These include solvents, paint stripping wastes, metal plating wastes, industrial waste sludges, used oils, and batteries and battery electrolytes. The EAHWM also includes a general application which can be used to calculate and compare the life cycle costs for minimization alternatives of any waste stream, hazardous or non-hazardous. The EAHWM has been fully tested and implemented in more than 60 Army installations in the United States

  4. Automated biowaste sampling system urine subsystem operating model, part 1

    Science.gov (United States)

    Fogal, G. L.; Mangialardi, J. K.; Rosen, F.

    1973-01-01

    The urine subsystem automatically provides for the collection, volume sensing, and sampling of urine from six subjects during space flight. Verification of the subsystem design was a primary objective of the current effort, which was accomplished through the detailed design, fabrication, and verification testing of an operating model of the subsystem.

  5. National Automated Highway System Consortium: Modeling Stakeholder Preferences Project

    OpenAIRE

    Lathrop, John; Chen, Kan

    1997-01-01

    This document is the final report of the Modeling Stakeholder Preferences Project. The results of the project are threefold: (1) an evaluation framework; (2) non-quantitative focus-group findings and recommendations; and (3) performance/impact measures, with their endpoints, rankings and weights, for each stakeholder group.

  6. A Voyage to Arcturus: A model for automated management of a WLCG Tier-2 facility

    Science.gov (United States)

    Roy, Gareth; Crooks, David; Mertens, Lena; Mitchell, Mark; Purdie, Stuart; Cadellin Skipsey, Samuel; Britton, David

    2014-06-01

    With the current trend towards "On Demand Computing" in big data environments it is crucial that the deployment of services and resources becomes increasingly automated. Deployment based on cloud platforms is available for large scale data centre environments, but these solutions can be too complex and heavyweight for smaller, resource-constrained WLCG Tier-2 sites. Along with a greater desire for bespoke monitoring and collection of Grid-related metrics, a more lightweight and modular approach is desired. In this paper we present a model for a lightweight automated framework which can be used to build WLCG grid sites, based on "off the shelf" software components. As part of the research into an automation framework, the use of both IPMI and SNMP for physical device management will be included, as well as the use of SNMP as a monitoring/data sampling layer such that more comprehensive decision making can take place and potentially be automated. This could lead to reduced down times and better performance as services are recognised to be in a non-functional state by autonomous systems.

  7. A Voyage to Arcturus: A model for automated management of a WLCG Tier-2 facility

    International Nuclear Information System (INIS)

    With the current trend towards 'On Demand Computing' in big data environments it is crucial that the deployment of services and resources becomes increasingly automated. Deployment based on cloud platforms is available for large scale data centre environments, but these solutions can be too complex and heavyweight for smaller, resource-constrained WLCG Tier-2 sites. Along with a greater desire for bespoke monitoring and collection of Grid-related metrics, a more lightweight and modular approach is desired. In this paper we present a model for a lightweight automated framework which can be used to build WLCG grid sites, based on 'off the shelf' software components. As part of the research into an automation framework, the use of both IPMI and SNMP for physical device management will be included, as well as the use of SNMP as a monitoring/data sampling layer such that more comprehensive decision making can take place and potentially be automated. This could lead to reduced down times and better performance as services are recognised to be in a non-functional state by autonomous systems.

  8. Channel and active component abstractions for WSN programming - a language model with operating system support

    OpenAIRE

    P. Harvey; Dearle, A.; Lewis, J.; Sventek, J.

    2012-01-01

    To support the programming of Wireless Sensor Networks, a number of unconventional programming models have evolved, in particular the event-based model. These models are non-intuitive to programmers due to the introduction of unnecessary, non-intrinsic complexity. Component-based languages like Insense can eliminate much of this unnecessary complexity via the use of active components and synchronous channels. However, simply layering an Insense implementation over an existing event-based syst...

  9. A Local Search Modeling for Constrained Optimum Paths Problems (Extended Abstract)

    Directory of Open Access Journals (Sweden)

    Quang Dung Pham

    2009-10-01

    Full Text Available Constrained Optimum Path (COP) problems appear in many real-life applications, especially on communication networks. Some of these problems have been considered and solved by specific techniques which are usually difficult to extend. In this paper, we introduce a novel local search modeling for solving some COPs by local search. The modeling features compositionality, modularity and reuse, and strengthens the benefits of Constraint-Based Local Search. We also apply the modeling to the edge-disjoint paths problem (EDP). We show that side constraints can easily be added in the model. Computational results show the significance of the approach.

  10. An Improvement in Thermal Modelling of Automated Tape Placement Process

    International Nuclear Information System (INIS)

    The thermoplastic tape placement process offers the possibility of manufacturing large laminated composite parts with all kinds of geometries (e.g. doubly curved). The process is based on the fusion bonding of a thermoplastic tape onto a substrate and has received growing interest in recent years because it does not require an autoclave. In order to control and optimize the quality of the manufactured part, we need to predict the temperature field throughout the processing of the laminate. In this work we focus on a thermal model of the process which takes into account the imperfect bonding between the different layers of the substrate by introducing thermal contact resistances into the model. The study draws on experimental results indicating that the value of the thermal resistance evolves with the temperature and pressure applied to the material.
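
    A schematic fragment of such a model: a 1D explicit finite-difference scheme for tape and substrate in which the interface exchanges the flux q = dT / Rc instead of obeying a perfect-contact condition. Geometry, material properties and the value of Rc are invented; a fuller model would also make Rc depend on temperature and pressure.

```python
import numpy as np

# 1D explicit finite differences: tape (top) | imperfect interface | substrate.
n, dx, dt = 20, 1e-4, 1e-4     # nodes per layer, grid step (m), time step (s)
alpha = 3e-7                   # thermal diffusivity (m^2/s), invented
k = 0.7                        # thermal conductivity (W/m/K), invented
Rc = 2e-3                      # thermal contact resistance (K*m^2/W), invented
r = alpha * dt / dx**2         # diffusion number (stable for r < 0.5)

tape = np.full(n, 400.0)       # hot tape just laid down (deg C)
sub = np.full(n, 20.0)         # cooler substrate (deg C)

for _ in range(2000):
    # Heat flux across the imperfect bond: q = dT / Rc (not perfect contact).
    q = (tape[-1] - sub[0]) / Rc
    tape_new = tape + r * (np.roll(tape, -1) - 2 * tape + np.roll(tape, 1))
    sub_new = sub + r * (np.roll(sub, -1) - 2 * sub + np.roll(sub, 1))
    # Interface nodes: conduction from the inner neighbour plus/minus q.
    tape_new[-1] = tape[-1] + r * (tape[-2] - tape[-1]) - dt * q * alpha / (k * dx)
    sub_new[0] = sub[0] + r * (sub[1] - sub[0]) + dt * q * alpha / (k * dx)
    tape_new[0], sub_new[-1] = tape_new[1], sub_new[-2]   # insulated outer faces
    tape, sub = tape_new, sub_new

print(f"interface temperatures: tape {tape[-1]:.1f}, substrate {sub[0]:.1f}")
```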

  11. Combining Various Methods of Automated User Decision and Preferences Modelling

    Czech Academy of Sciences Publication Activity Database

    Eckhardt, Alan; Vojtáš, Peter

    Berlin: Springer, 2009 - (Torra, V.; Narukawa, Y.; Inuiguchi, M.), s. 172-181. (Lecture Notes in Artificial Intelligence. 5861). ISBN 978-3-642-04819-7. [MDAI 2009. Internationa Conference on Modeling Decisions for Artificial Intelligence /6./. Awaji Island (JP), 30.11.2009-02.12.2009] R&D Projects: GA AV ČR 1ET100300517 Institutional research plan: CEZ:AV0Z10300504 Keywords : user preferences learning * recommender systems Subject RIV: IN - Informatics, Computer Science

  12. ADGEN: a system for automated sensitivity analysis of predictive models

    International Nuclear Information System (INIS)

    A system that can automatically enhance computer codes with a sensitivity calculation capability is presented. With this new system, named ADGEN, rapid and cost-effective calculation of sensitivities can be performed in any FORTRAN code for all input data or parameters. The resulting sensitivities can be used in performance assessment studies related to licensing or interactions with the public to demonstrate systematically and quantitatively the relative importance of each of the system parameters in calculating the final performance results. A general procedure calling for the systematic use of sensitivities in assessment studies is presented. The procedure can be used in modelling and model validation studies to avoid 'over-modelling', in site characterization planning to avoid 'over-collection of data', and in performance assessment to determine the uncertainties in the final calculated results. The added capability to formally perform the inverse problem, i.e., to determine the input data or parameters on which to focus additional research or analysis effort in order to reduce the uncertainty of the final results, is also discussed
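
    The per-parameter sensitivities such a system extracts can be illustrated with forward-mode automatic differentiation via dual numbers. This is a sketch of the idea only: ADGEN itself instruments FORTRAN codes, whereas the toy below differentiates a two-parameter Python function.

```python
import math

class Dual:
    """Minimal forward-mode automatic differentiation (value, derivative)."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__
    def exp(self):
        e = math.exp(self.val)
        return Dual(e, e * self.der)

# A toy "performance result" computed from two input parameters.
def performance(k1, k2):
    return k1 * k2 + (0.5 * k1).exp()

# Sensitivity with respect to each input: seed that input's derivative with 1.
for name, (d1, d2) in [("k1", (1.0, 0.0)), ("k2", (0.0, 1.0))]:
    out = performance(Dual(2.0, d1), Dual(3.0, d2))
    print(f"d(result)/d({name}) = {out.der:.4f}")
```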

  13. Automated main-chain model building by template matching and iterative fragment extension

    International Nuclear Information System (INIS)

    An algorithm for the automated macromolecular main-chain model building of polypeptide backbones is described. The procedure is hierarchical. In the initial stages, many overlapping polypeptide fragments are built. In subsequent stages, the fragments are extended and then connected. Identification of the locations of helical and β-strand regions is carried out by FFT-based template matching. Fragment libraries of helices and β-strands from refined protein structures are then positioned at the potential locations of helices and strands, and the longest segments that fit the electron-density map are chosen. The helices and strands are then extended using fragment libraries consisting of sequences three amino acids long derived from refined protein structures. The resulting segments of polypeptide chain are then connected by choosing those which overlap at two or more Cα positions. The fully automated procedure has been implemented in RESOLVE and is capable of model building at resolutions as low as 3.5 Å. The algorithm is useful for building a preliminary main-chain model that can serve as a basis for refinement and side-chain addition

  14. Modeling Knowledge Bases for Automated Decision Making Systems – A Literature Review

    Directory of Open Access Journals (Sweden)

    Franz Felix Füssl

    2015-09-01

    Full Text Available Developing automated decision-making systems means dealing with knowledge in every possible manner. One of the most important aspects of developing artificially intelligent systems is building a precise knowledge base with integrated self-learning mechanisms. Moreover, when knowledge is used in expert systems or decision support systems, it must be documented and made visible so that it can be managed. The main goal of this work is to find a suitable solution for modeling knowledge bases in automated decision-making systems, covering both the representation of specific knowledge and learning mechanisms. Many different terms describe this kind of research, such as knowledge modeling, knowledge engineering and ontology engineering. This paper therefore compares the technical terms in this domain, illustrating their similarities, their specifics and how they are used in the literature.

  15. Analysis on Traffic Conflicts of Two-lane Highway Based on Improved Cellular Automaton Model

    Directory of Open Access Journals (Sweden)

    Xiru Tang

    2013-06-01

    Full Text Available Based on the microscopic traffic characteristics of two-lane highways and the differing driving characteristics of drivers, driver characteristics and vehicle structure are introduced into a cellular automaton model to establish a new cellular automaton model of the two-lane highway. Through computer simulation, the paper analyzes the effects of different proportions of vehicles and drivers, and of different arrival rates, on traffic conflicts on the two-lane highway, obtaining the relationship between parameters such as traffic flow and velocity variance and the frequency of conflicts. The results indicate that the frequency of traffic conflicts is closely related to the product of traffic flow and velocity variance: when both are large the conflict frequency is greatest, and when both are small it is least.

  16. Composition of Petri nets models in service-oriented industrial automation

    OpenAIRE

    Mendes, João M.; Leitão, Paulo; Restivo, Francisco; Colombo, Armando W.

    2010-01-01

    In service-oriented systems, composition of services is required to build new, distributed and more complex services, based on the logical behavior of individual ones. This paper discusses the formal composition of Petri net models used for process description and control in service-oriented automation systems. The proposed approach considers two forms of service composition, notably the offline composition, applied during the design phase, and the online composition, related to t...

  17. Automated optimal glycaemic control using a physiology based pharmacokinetic, pharmacodynamic model

    OpenAIRE

    Schaller, Stephan

    2015-01-01

    After decades of research, Automated Glucose Control (AGC) is still out of reach for everyday control of blood glucose. The inter- and intra-individual variability of glucose dynamics largely arising from variability in insulin absorption, distribution, and action, and related physiological lag-times remain a core problem in the development of suitable control algorithms. Over the years, model predictive control (MPC) has established itself as the gold standard in AGC systems in research. Mod...

  18. Management of MAS by Means of Automated Reasoning in the Role Model. Chapter 13

    Czech Academy of Sciences Publication Activity Database

    Kazík, O.; Neruda, Roman

    Amsterdam: IOS Press, 2012 - (Essaaidi, M.; Ganzha, M.; Paprzycki, M.), s. 309-322. (NATO Science for Peace and Security Series - D: Information and Communication Security. 32). ISBN 978-1-60750-817-5 R&D Projects: GA MŠk OC10047 Institutional research plan: CEZ:AV0Z10300504 Keywords : multi-agent systems * description logic * role model * matchmaking * integrity constraints * automated reasoning * computational intelligence Subject RIV: IN - Informatics, Computer Science

  19. Prototype of Automated PLC Model Checking Using Continuous Integration Tools

    CERN Document Server

    Lettrich, Michael

    2015-01-01

    To deal with the complexity of operating and supervising large-scale industrial installations at CERN, Programmable Logic Controllers (PLCs) are often used. A failure in these control systems can cause a disaster in terms of economic losses, environmental damage or human losses, so the requirements on software quality are very high. To provide PLC developers with a way to verify proper functionality against requirements, a Java tool named PLCverif has been developed which encapsulates, and thus simplifies, the use of third-party model checkers. One of our goals in this project is to integrate PLCverif into the development process of PLC programs: when the developer changes the program, all the requirements should be verified again, as a change in the code can produce side effects and violate one or more requirements. For that reason, PLCverif has been extended to work with Jenkins CI in order to automatically trigger the verification cases when the developer changes the PLC program. This prototype has been...

  20. A semi-automated vascular access system for preclinical models

    International Nuclear Information System (INIS)

    Murine models are used extensively in biological and translational research. For many of these studies it is necessary to access the vasculature for the injection of biologically active agents. Among the possible methods for accessing the mouse vasculature, tail vein injections are a routine but critical step for many experimental protocols. To perform successful tail vein injections, a high skill set and experience is required, leaving most scientists ill-suited to perform this task. This can lead to a high variability between injections, which can impact experimental results. To allow more scientists to perform tail vein injections and to decrease the variability between injections, a vascular access system (VAS) that semi-automatically inserts a needle into the tail vein of a mouse was developed. The VAS uses near infrared light, image processing techniques, computer controlled motors, and a pressure feedback system to insert the needle and to validate its proper placement within the vein. The VAS was tested by injecting a commonly used radiolabeled probe (FDG) into the tail veins of five mice. These mice were then imaged using micro-positron emission tomography to measure the percentage of the injected probe remaining in the tail. These studies showed that, on average, the VAS leaves 3.4% of the injected probe in the tail. With these preliminary results, the VAS system demonstrates the potential for improving the accuracy of tail vein injections in mice. (paper)

  1. Automated Finite Element Modeling of Wing Structures for Shape Optimization

    Science.gov (United States)

    Harvey, Michael Stephen

    1993-01-01

    The displacement formulation of the finite element method is the most general and most widely used technique for structural analysis of airplane configurations. Modern structural synthesis techniques based on the finite element method have reached a certain maturity in recent years, and large airplane structures can now be optimized with respect to sizing-type design variables for many load cases, subject to a rich variety of constraints including stress, buckling, frequency, stiffness and aeroelastic constraints (Refs. 1-3). These structural synthesis capabilities use gradient-based nonlinear programming techniques to search for improved designs. For these techniques to be practical, a major improvement was required in the computational cost of finite element analyses (needed repeatedly in the optimization process). Thus, associated with the progress in structural optimization, a new perspective of structural analysis has emerged, namely structural analysis specialized for design optimization application, or what is known as "design oriented structural analysis" (Ref. 4). This discipline includes approximation concepts and methods for obtaining behavior sensitivity information (Ref. 1), all needed to make the optimization of large structural systems (modeled by thousands of degrees of freedom and thousands of design variables) practical and cost effective.

  2. Tool-driven Design and Automated Parameterization for Real-time Generic Drivetrain Models

    Directory of Open Access Journals (Sweden)

    Schwarz Christina

    2015-01-01

    Full Text Available Real-time dynamic drivetrain modeling approaches have great potential for development cost reduction in the automotive industry. Even though real-time drivetrain models are available, these solutions are specific to single transmission topologies. In this paper an environment for the parameterization of such a solution is proposed, based on a generic method applicable to all types of gear transmission topologies. This enables tool-guided modeling by non-experts in the fields of mechanical engineering and control theory, leading to reduced development and testing efforts. The approach is demonstrated for an exemplary automatic transmission using the environment for automated parameterization. Finally, the parameterization is validated via vehicle measurement data.

  3. The Integration and Abstraction of EBS Models in Yucca Mountain Performance Assessment

    International Nuclear Information System (INIS)

    The safety strategy for geological disposal of radioactive waste at Yucca Mountain relies on a multi-barrier system to contain the waste and isolate it from the biosphere. The multi-barrier system consists of the natural barrier provided by the geological setting and the engineered barrier system (EBS). In the case of Yucca Mountain (YM) the geologic setting is the unsaturated-zone host rock, consisting of about 600 meters of layered ash-flow volcanic tuffs above the water table, and the saturated zone beneath the water table. Both the unsaturated and saturated rocks are part of a closed hydrologic basin in a desert surface environment. The waste is to be buried about halfway between the desert surface and the water table. The primary engineered barriers at YM consist of metal components that are highly durable in an oxidizing environment. The two primary components of the engineered barrier system are highly corrosion-resistant metal waste packages, made from a nickel-chromium-molybdenum alloy, Alloy 22, and titanium drip shields that protect the waste packages from corrosive dripping water and falling rocks. Design and performance assessment of the EBS require models that describe how the EBS and near field behave under anticipated repository-relevant conditions. These models must describe coupled thermal, hydrologic, chemical, and mechanical (THCM) processes that drive radionuclide transport in a highly fractured host rock, consisting of a relatively permeable network of conductive fractures in a setting of highly impermeable tuff rock matrix. An integrated performance assessment of the EBS must include a quantification of the uncertainties that arise from (1) incomplete understanding of processes and (2) lack of data representative of the large spatial scales and long time scales relevant to radioactive waste disposal (e.g., long-term metal corrosion rates and heterogeneities in rock properties over the large 5 km² emplacement area of the repository). A

  4. Radionuclide Transport Modelling: Current Status and Future Needs. Synthesis, Work Group Reports and Extended Abstracts

    International Nuclear Information System (INIS)

    The workshop identified a set of critical issues for the Swedish Nuclear Power Inspectorate (SKI) and the Swedish Radiation Protection Authority (SSI) to address in preparing for future reviews of license applications, which have subsequently been considered in preparing this synthesis. Structure for organising expert participation: A structure for organising expert participation in future reviews is proposed based on clearinghouses for (1) regulatory application and context, (2) engineered barrier systems, (3) geosphere, (4) biosphere, and (5) performance assessment integration and calculations. As part of their work, these clearinghouses could identify key issues that need to be resolved prior to future reviews. Performance assessment strategy and review context: Future reviews will be conducted in the context of regulations based on risk criteria; this leads to a need to review the methods used in probabilistic risk assessment, as well as the underlying process models. A plan is needed for accomplishing both aims. Despite the probabilistic framework, a need is anticipated for targeted, deterministic calculations to check particular assumptions. Priorities and ambition level for reviews: SKI's and SSI's resources can be more efficiently utilised by an early review of SKB's safety case, so that if necessary the authorities can make an early start on evaluating topics that are of primary significance to the safety case. As a guide to planning for allocation of effort in future reviews, this workshop produced a preliminary ranking of technical issues, on a scale from 'non-controversial' to 'requiring independent modelling'. Analysis of repository system and scenarios: Systems analysis tools including features/events/processes encyclopaedias, process-influence diagrams, and assessment-model flowcharts should be used as review tools, to check the processes and influences considered in SKB's analyses, and to evaluate the comprehensiveness of the scenarios that are

  5. Modeling and predicting abstract concept or idea introduction and propagation through geopolitical groups

    Science.gov (United States)

    Jaenisch, Holger M.; Handley, James W.; Hicklen, Michael L.

    2007-04-01

    This paper describes a novel capability for modeling known idea propagation transformations and predicting responses to new ideas from geopolitical groups. Ideas are captured using semantic words that are text based and bear cognitive definitions. We demonstrate a unique algorithm for converting these into analytical predictive equations. Using the illustrative idea of "proposing a gasoline price increase of $1 per gallon from $2" and its changing perceived impact across 5 demographic groups, we identify 13 cost-of-living Diplomatic, Information, Military, and Economic (DIME) features common across all 5 demographic groups. This enables the modeling and monitoring of the Political, Military, Economic, Social, Information, and Infrastructure (PMESII) effects of each group's response to this idea and how their "perception" of this proposal changes. Our algorithm and results are summarized in this paper.

  6. Emergence of Consensus in a Multi-Robot Network: from Abstract Models to Empirical Validation

    CERN Document Server

    Trianni, Vito; Reina, Andreagiovanni; Baronchelli, Andrea

    2016-01-01

    Consensus dynamics in decentralised multiagent systems are the subject of intense study, and several different models have been proposed and analysed. Among these, the naming game stands out for its simplicity and applicability to a wide range of phenomena and applications, from semiotics to engineering. Despite the wide range of studies available, the implementation of theoretical models in real distributed systems is not always straightforward, as the physical platform imposes several constraints that may have a bearing on the consensus dynamics. In this paper, we investigate the effects of an implementation of the naming game for the kilobot robotic platform, in which we consider concurrent execution of games and physical interferences. Consensus dynamics are analysed in the light of the continuously evolving communication network created by the robots, highlighting how the different regimes crucially depend on the robot density and on their ability to spread widely in the experimental arena. We find that ph...
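
    The underlying consensus dynamics can be reproduced with very little code. The following Python sketch implements the standard minimal naming game (random speaker-listener pairs on a complete graph); the kilobot study layers concurrent games and physical interference on top of dynamics of this kind, so this is only the abstract baseline.

        import random

        def naming_game(n_agents=100, steps=200_000, seed=1):
            rng = random.Random(seed)
            inventories = [set() for _ in range(n_agents)]
            next_name = 0
            for _ in range(steps):
                s, l = rng.sample(range(n_agents), 2)   # speaker, listener
                if not inventories[s]:                  # invent a new name
                    inventories[s].add(next_name)
                    next_name += 1
                word = rng.choice(sorted(inventories[s]))
                if word in inventories[l]:              # success: both collapse
                    inventories[s] = {word}
                    inventories[l] = {word}
                else:                                   # failure: listener learns
                    inventories[l].add(word)
            return inventories

        final = naming_game()
        print(len({w for inv in final for w in inv}))   # 1 once consensus is reached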

  7. A study of sound transmission in an abstract middle ear using physical and finite element models.

    Science.gov (United States)

    Gonzalez-Herrera, Antonio; Olson, Elizabeth S

    2015-11-01

    The classical picture of middle ear (ME) transmission has the tympanic membrane (TM) as a piston and the ME cavity as a vacuum. In reality, the TM moves in a complex multiphasic pattern and substantial pressure is radiated into the ME cavity by the motion of the TM. This study explores ME transmission with a simple model, using a tube terminated with a plastic membrane. Membrane motion was measured with a laser interferometer and pressure on both sides of the membrane with micro-sensors that could be positioned close to the membrane without disturbance. A finite element model of the system explored the experimental results. Both experimental and theoretical results show resonances that are in some cases primarily acoustical or mechanical and sometimes produced by coupled acousto-mechanics. The largest membrane motions were a result of the membrane's mechanical resonances. At these resonant frequencies, sound transmission through the system was larger with the membrane in place than it was when the membrane was absent. PMID:26627771

  8. Modeling Physical Processes at the Nanoscale—Insight into Self-Organization of Small Systems (abstract)

    Science.gov (United States)

    Proykova, Ana

    2009-04-01

    Essential contributions have been made in the field of finite-size systems of ingredients interacting with potentials of various ranges. Theoretical simulations have revealed peculiar size effects on stability, ground state structure, phases, and phase transformation of systems confined in space and time. Models developed in the field of pure physics (atomic and molecular clusters) have been extended and successfully transferred to finite-size systems that seem very different—small-scale financial markets, autoimmune reactions, and social group reactions to advertisements. The models show that small-scale markets diverge unexpectedly fast as a result of small fluctuations; autoimmune reactions are sequences of two discontinuous phase transitions; and social groups possess critical behavior (social percolation) under the influence of an external field (advertisement). Some predicted size-dependent properties have been experimentally observed. These findings lead to the hypothesis that restrictions on an object's size determine the object's total internal (configuration) and external (environmental) interactions. Since phases are emergent phenomena produced by self-organization of a large number of particles, the occurrence of a phase in a system containing a small number of ingredients is remarkable.

  9. An Ontology-based Model to Determine the Automation Level of an Automated Vehicle for Co-Driving

    OpenAIRE

    Pollard, Evangeline; Morignot, Philippe; Nashashibi, Fawzi

    2013-01-01

    Full autonomy of ground vehicles is a major goal of the ITS (Intelligent Transportation Systems) community. However, reaching such a high autonomy level in all situations (weather, traffic, ...) may seem difficult in practice, despite recent results regarding driverless cars (e.g., Google Cars). In addition, an automated vehicle should also self-assess its own perception abilities, and not only perceive its environment. In this paper, we propose an intermediate a...

  10. Learning with Technology: Video Modeling with Concrete-Representational-Abstract Sequencing for Students with Autism Spectrum Disorder.

    Science.gov (United States)

    Yakubova, Gulnoza; Hughes, Elizabeth M; Shinaberry, Megan

    2016-07-01

    The purpose of this study was to determine the effectiveness of a video modeling intervention with concrete-representational-abstract instructional sequence in teaching mathematics concepts to students with autism spectrum disorder (ASD). A multiple baseline across skills design of single-case experimental methodology was used to determine the effectiveness of the intervention on the acquisition and maintenance of addition, subtraction, and number comparison skills for four elementary school students with ASD. Findings supported the effectiveness of the intervention in improving skill acquisition and maintenance at a 3-week follow-up. Implications for practice and future research are discussed. PMID:26983919

  11. An automation of design and modelling tasks in NX Siemens environment with original software - generator module

    Science.gov (United States)

    Zbiciak, M.; Grabowik, C.; Janik, W.

    2015-11-01

    Nowadays the constructional design process is almost exclusively aided by CAD/CAE/CAM systems. It is estimated that nearly 80% of design activities have a routine nature. These routine design tasks are highly susceptible to automation. Design automation is usually achieved with API tools, which allow original software to be built for automating particular engineering activities. In this paper, original software developed to automate engineering tasks at the stage of a product's geometrical shape design is presented. The software works exclusively in the NX Siemens CAD/CAM/CAE environment and was prepared in Microsoft Visual Studio with the .NET technology and the NX SNAP library. The software functionality allows designing and modelling of spur and helicoidal involute gears. Moreover, it is possible to estimate relative manufacturing costs. With the Generator module it is possible to design and model both standard and non-standard gear wheels. The main advantage of a model generated in this way is its better representation of the involute curve in comparison to those drawn with the standard tools of specialized CAD systems. This stems from the fact that in CAD systems an involute curve is usually drawn through 3 points, corresponding to points located on the addendum circle, the reference diameter of the gear and the base circle respectively. In the Generator module the involute curve is drawn through 11 involute points located on and above the base circle up to the addendum circle; therefore the 3D gear wheel models are highly accurate. Application of the Generator module makes the modelling process very rapid, reducing the gear wheel modelling time to several seconds. During the conducted research an analysis of the differences between the standard 3-point and the 11-point involutes was made. The results and conclusions drawn from this analysis are presented in detail.
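
    The geometric core of the 11-point approach is easy to illustrate. The Python sketch below samples points on a circle involute from the base circle out to the tip; the gear dimensions are illustrative, and the Generator module itself works inside NX via SNAP, so this only demonstrates the underlying curve construction.

        import numpy as np

        def involute_points(base_radius, tip_radius, n_points=11):
            """Sample the involute of a circle from the base circle to the tip.

            Parametric involute: x = rb*(cos t + t*sin t), y = rb*(sin t - t*cos t);
            at radius r the roll angle is t = sqrt((r/rb)**2 - 1).
            """
            t_tip = np.sqrt((tip_radius / base_radius) ** 2 - 1.0)
            t = np.linspace(0.0, t_tip, n_points)
            x = base_radius * (np.cos(t) + t * np.sin(t))
            y = base_radius * (np.sin(t) - t * np.cos(t))
            return np.column_stack([x, y])

        profile = involute_points(47.0, 55.0)   # e.g. 47 mm base, 55 mm tip radius
        print(profile.round(3))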

  12. Automated model integration at source code level: An approach for implementing models into the NASA Land Information System

    Science.gov (United States)

    Wang, S.; Peters-Lidard, C. D.; Mocko, D. M.; Kumar, S.; Nearing, G. S.; Arsenault, K. R.; Geiger, J. V.

    2014-12-01

    Model integration bridges the data flow between modeling frameworks and models. However, models usually do not fit directly into a particular modeling environment if they were not designed for it. An example is implementing different types of models into the NASA Land Information System (LIS), a software framework for land-surface modeling and data assimilation. Model implementation requires scientific knowledge and software expertise, and it may take a developer months to learn LIS and the model software structure. Debugging and testing of the implementation are also time-consuming when LIS or the model is not fully understood. This time is costly for research and operational projects. To address this issue, an approach has been developed to automate model integration into LIS. With this in mind, a general model interface was designed to retrieve the forcing inputs, parameters, and state variables needed by the model, and to provide the state variables and outputs back to LIS. Every model can be wrapped to comply with the interface, usually with a FORTRAN 90 subroutine. Development requires only knowledge of the model and basic programming skills. With such wrappers, the logic is the same for implementing all models. Code templates defined for this general model interface can be re-used with any specific model, so the implementation can be done automatically. An automated model implementation toolkit was developed with Microsoft Excel and its built-in VBA language. It takes model specifications in three worksheets and contains FORTRAN 90 code templates in VBA programs. According to the model specification, the toolkit generates data structures and procedures within FORTRAN modules and subroutines, which transfer data between LIS and the model wrapper. Model implementation is standardized, and about 80–90% of the development load is reduced. In this presentation, the automated model implementation approach is described along with LIS programming
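
    The generic interface concept can be sketched compactly. The Python fragment below is an illustration only (the LIS wrappers described above are FORTRAN 90 subroutines, and all names here are invented): any model conforming to the interface can be driven identically, which is what makes template-based, automated integration possible.

        from abc import ABC, abstractmethod

        class LSMWrapper(ABC):
            @abstractmethod
            def setup(self, parameters: dict) -> None:
                """Receive static parameters from the framework."""

            @abstractmethod
            def run_step(self, forcing: dict, state: dict) -> dict:
                """Advance one timestep; return updated state and outputs."""

        class ToyBucketModel(LSMWrapper):
            def setup(self, parameters):
                self.capacity = parameters["bucket_capacity_mm"]

            def run_step(self, forcing, state):
                total = state["storage_mm"] + forcing["precip_mm"]
                return {"storage_mm": min(self.capacity, total),
                        "runoff_mm": max(0.0, total - self.capacity)}

        model = ToyBucketModel()
        model.setup({"bucket_capacity_mm": 100.0})
        print(model.run_step({"precip_mm": 30.0}, {"storage_mm": 90.0}))  # 20 mm runoff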

  13. IDENTIFICATION OF TYPES AND MODELS OF AIRCRAFT USING ASC-ANALYSIS OF THEIR SILHOUETTES (CONTOURS (GENERALIZATION, ABSTRACTION, CLASSIFICATION AND IDENTIFICATION

    Directory of Open Access Journals (Sweden)

    Lutsenko Y. V.

    2015-12-01

    Full Text Available The article discusses the application of automated system-cognitive analysis (ASC-analysis), its mathematical model, which is the system theory of information, and its software tool, the intellectual system called "Eidos", to solving problems related to the identification of aircraft types and models by their silhouettes on the ground, more precisely, their external contours: (1) digitization of scanned images of aircraft and creation of their mathematical models; (2) formation of mathematical models of specific aircraft with the use of information theory; (3) modeling of the generalized images of various aircraft types and models and their graphic visualization; (4) comparing an image of a particular plane with the generalized images of various aircraft types and models, and quantifying the degree of similarity and difference between them, i.e., the identification of the type and model of an airplane by its silhouette (contour) on the ground; (5) quantification of the similarities and differences of the generalized images of the planes with each other, i.e., cluster-constructive analysis of generalized images of various aircraft types and models. The article gives a new approach to digitizing images of aircraft, based on the use of the polar coordinate system, the center of gravity of the image and its external contour. Before digitizing the images, we may transform them to standardize the position of the images, their size (resolution, distance) and their angle of rotation (attitude in three dimensions). Therefore, the results of digitization and ASC-analysis of the images can be invariant (independent) of their position, dimensions and rotation. The shape of the contour of a particular aircraft is treated as noisy information on the aircraft type and model, comprising information about the true shape of the aircraft type and model (the clean signal) together with noise that distorts the real shape, due to distorting influences, both of the means of
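
    The polar digitization idea can be illustrated with a short sketch. In the Python fragment below, a closed contour is reduced to a signature of centroid-to-contour distance versus angle; normalizing by the maximum radius gives scale invariance, and comparing cyclic shifts gives rotation invariance. Details of the "Eidos" implementation differ; this only demonstrates the principle.

        import numpy as np

        def polar_signature(contour_xy, n_bins=360):
            centroid = contour_xy.mean(axis=0)
            d = contour_xy - centroid
            angles = np.arctan2(d[:, 1], d[:, 0])
            radii = np.hypot(d[:, 0], d[:, 1])
            bins = ((angles + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins
            sig = np.zeros(n_bins)
            for b, r in zip(bins, radii):
                sig[b] = max(sig[b], r)      # outer radius per angular bin
            return sig / sig.max()           # scale normalization

        def similarity(sig_a, sig_b):
            # best match over all rotations (cyclic shifts)
            return -min(np.linalg.norm(sig_a - np.roll(sig_b, k))
                        for k in range(len(sig_b)))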

  14. A New Modular Strategy For Action Sequence Automation Using Neural Networks And Hidden Markov Models

    OpenAIRE

    Mohamed Adel Taher; Mostapha Abdeljawad

    2013-01-01

    In this paper, the authors propose a new hybrid strategy (using artificial neural networks and hidden Markov models) for skill automation. The strategy is based on the concept of an "adaptive desired" that is introduced in the paper. The authors explain how using an adaptive desired can help a system for which an explicit model is not available or is difficult to obtain to smartly cope with environmental disturbances without requiring explicit rules specification (as with fuzzy syste...

  15. Hierarchical Modeling and Verification for High-speed Train Control Center by Time Automation

    OpenAIRE

    Lei Yuan; Shiying Yang; Dewang Chen; Kaicheng Li

    2014-01-01

    Chinese Train Control System level three (CTCS-3) is a major technical system in Chinese high-speed rail, and the Train Control Center (TCC) is an indispensable component of CTCS-3. Current research on TCC is mainly based on simulation, which cannot ensure that all conditions in TCC are tested. This paper presents a hierarchical modeling method and uses timed automata (TA) to model the TCC software. We take the design of the active balise telegram editing, a major part of the TCC software...

  16. ℮-conome: an automated tissue counting platform of cone photoreceptors for rodent models of retinitis pigmentosa

    Directory of Open Access Journals (Sweden)

    Clérin Emmanuelle

    2011-12-01

    Full Text Available Background: Retinitis pigmentosa is characterized by the sequential loss of rod and cone photoreceptors. The preservation of cones would prevent blindness due to their essential role in human vision. Rod-derived Cone Viability Factor is a thioredoxin-like protein that is secreted by rods and is involved in cone survival. To validate the activity of Rod-derived Cone Viability Factors (RdCVFs) as therapeutic agents for treating retinitis pigmentosa, we have developed e-conome, an automated cell counting platform for retinal flat mounts of rodent models of cone degeneration. This automated quantification method allows for faster data analysis, thereby accelerating translational research. Methods: An inverted fluorescent microscope, motorized and coupled to a CCD camera, records images of cones labeled with fluorescent peanut agglutinin lectin on flat-mounted retinas. For an average of 300 fields per retina, nine Z-planes at magnification ×40 are acquired, with two-stage autofocus performed individually for each field. The projection of the stack of nine images is thresholded and filtered to exclude aberrant images based on preset variables. The cones are identified by processing the resulting image using 13 empirically determined variables. The cone density is calculated over the 300 fields. Results: The method was validated by comparison to conventional stereological counting. The decrease in cone density in the rd1 mouse was found to be equivalent to the decrease determined by stereological counting. We also studied the spatiotemporal pattern of cone degeneration in the rd1 mouse and show that while the reduction in cone density starts in the central part of the retina, cone degeneration progresses at the same speed over the whole retinal surface. We finally show that for mice with an inactivation of the Nucleoredoxin-like genes Nxnl1 or Nxnl2 encoding RdCVFs, the loss of cones is more pronounced in the ventral retina. Conclusion: The automated

  17. Automated modelling of spatially-distributed glacier ice thickness and volume

    Science.gov (United States)

    James, William H. M.; Carrivick, Jonathan L.

    2016-07-01

    Ice thickness distribution and volume are both key parameters for glaciological and hydrological applications. This study presents VOLTA (Volume and Topography Automation), a Python script tool for ArcGIS™ that requires just a digital elevation model (DEM) and glacier outline(s) to model distributed ice thickness, volume and bed topography. Ice thickness is initially estimated at points along an automatically generated centreline network based on the perfect-plasticity rheology assumption, taking into account a valley-side drag component of the force balance equation. Distributed ice thickness is subsequently interpolated using a glaciologically correct algorithm. For five glaciers with independent field-measured bed topography, VOLTA-modelled volumes deviated from the field-derived values by between 26.5% (underestimate) and 16.6% (overestimate). The greatest differences occurred where an asymmetric valley cross-section was present or where significant valley infill had occurred. Compared with other methods of modelling ice thickness and volume, the key advantages of VOLTA are: a fully automated approach and a user-friendly graphical user interface (GUI), GIS-consistent geometry, fully automated centreline generation, inclusion of a side drag component in the force balance equation, estimation of basal shear stress for each individual glacier, fully distributed ice thickness output and the ability to process multiple glaciers rapidly. VOLTA is capable of regional-scale ice volume assessment, a key parameter for exploring glacier response to climate change. VOLTA also permits subtraction of modelled ice thickness from the input surface elevation to produce an ice-free DEM, which is a key input for reconstruction of former glaciers. VOLTA could assist with prediction of future glacier geometry changes and hence with projection of future meltwater fluxes.
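
    The initial centreline estimate rests on a one-line physical relation: under the perfect-plasticity assumption, the driving stress equals a basal shear stress tau_b, so thickness follows from the surface slope as h = tau_b / (f * rho * g * sin(alpha)), with f a valley-shape (side drag) factor. The Python sketch below uses illustrative values; VOLTA estimates tau_b per glacier and adds its own interpolation on top.

        import numpy as np

        RHO_ICE = 917.0   # kg m^-3
        G = 9.81          # m s^-2

        def plastic_thickness(surface_slope_rad, tau_b=1.0e5, shape_factor=0.8):
            slope = np.maximum(surface_slope_rad, np.radians(1.5))  # guard flat spots
            return tau_b / (shape_factor * RHO_ICE * G * np.sin(slope))

        print(plastic_thickness(np.radians(5.0)))  # ~160 m under these assumptions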

  18. The development and verification of a highly accurate collision prediction model for automated noncoplanar plan delivery

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Victoria Y.; Tran, Angelia; Nguyen, Dan; Cao, Minsong; Ruan, Dan; Low, Daniel A.; Sheng, Ke, E-mail: ksheng@mednet.ucla.edu [Department of Radiation Oncology, David Geffen School of Medicine, University of California Los Angeles, Los Angeles, California 90024 (United States)

    2015-11-15

    Purpose: Significant dosimetric benefits had been previously demonstrated in highly noncoplanar treatment plans. In this study, the authors developed and verified an individualized collision model for the purpose of delivering highly noncoplanar radiotherapy and tested the feasibility of total delivery automation with Varian TrueBeam developer mode. Methods: A hand-held 3D scanner was used to capture the surfaces of an anthropomorphic phantom and a human subject, which were positioned with a computer-aided design model of a TrueBeam machine to create a detailed virtual geometrical collision model. The collision model included gantry, collimator, and couch motion degrees of freedom. The accuracy of the 3D scanner was validated by scanning a rigid cubical phantom with known dimensions. The collision model was then validated by generating 300 linear accelerator orientations corresponding to 300 gantry-to-couch and gantry-to-phantom distances, and comparing the corresponding distance measurements to their corresponding models. The linear accelerator orientations reflected uniformly sampled noncoplanar beam angles to the head, lung, and prostate. The distance discrepancies between measurements on the physical and virtual systems were used to estimate treatment-site-specific safety buffer distances with 0.1%, 0.01%, and 0.001% probability of collision between the gantry and couch or phantom. Plans containing 20 noncoplanar beams to the brain, lung, and prostate optimized via an in-house noncoplanar radiotherapy platform were converted into XML script for automated delivery and the entire delivery was recorded and timed to demonstrate the feasibility of automated delivery. Results: The 3D scanner measured the dimension of the 14 cm cubic phantom within 0.5 mm. The maximal absolute discrepancy between machine and model measurements for gantry-to-couch and gantry-to-phantom was 0.95 and 2.97 cm, respectively. The reduced accuracy of gantry-to-phantom measurements was

  19. Automated Generation of Formal Models from ST Control Programs for Verification Purposes

    CERN Document Server

    Fernandez Adiego, B; Tournier, J-C; Blanco Vinuela, E; Blech, J-O; Gonzalez Suarez, V

    2014-01-01

    In large industrial control systems such as the ones installed at CERN, one of the main issues is the ability to verify the correct behaviour of the Programmable Logic Controller (PLC) programs. While manual and automated testing can achieve good results, some obvious problems remain unsolved, such as the difficulty of checking safety or liveness properties. This paper proposes a general methodology and a tool to verify PLC programs by automatically generating formal models for different model checkers from ST code. The proposed methodology defines an automata-based formalism used as an intermediate model (IM) to transform PLC programs written in the ST language into different formal models for verification purposes. A tool based on Xtext has been implemented that automatically generates models for the NuSMV and UPPAAL model checkers and the BIP framework.
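
    The intermediate-model idea can be made concrete with a toy data structure. The Python sketch below represents a PLC program fragment as an automaton of locations and guarded transitions from which checker-specific inputs could be printed; the data model and the NuSMV stub are invented for illustration and are far simpler than the paper's IM.

        from dataclasses import dataclass, field

        @dataclass
        class Transition:
            source: str
            target: str
            guard: str        # e.g. "level > HIGH"
            assignment: str   # e.g. "pump := False"

        @dataclass
        class Automaton:
            locations: list[str]
            initial: str
            transitions: list[Transition] = field(default_factory=list)

            def to_nusmv_stub(self) -> str:
                # emit a (partial) NuSMV-style skeleton for the control location
                return "\n".join([
                    "MODULE main",
                    "VAR loc : {" + ", ".join(self.locations) + "};",
                    "ASSIGN init(loc) := " + self.initial + ";"])

        im = Automaton(["idle", "filling", "alarm"], "idle",
                       [Transition("idle", "filling", "start", "pump := True")])
        print(im.to_nusmv_stub())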

  20. Statistical modelling of networked human-automation performance using working memory capacity.

    Science.gov (United States)

    Ahmed, Nisar; de Visser, Ewart; Shaw, Tyler; Mohamed-Ameen, Amira; Campbell, Mark; Parasuraman, Raja

    2014-01-01

    This study examines the challenging problem of modelling the interaction between individual attentional limitations and decision-making performance in networked human-automation system tasks. Analysis of real experimental data from a task involving networked supervision of multiple unmanned aerial vehicles by human participants shows that both task load and network message quality affect performance, but that these effects are modulated by individual differences in working memory (WM) capacity. These insights were used to assess three statistical approaches for modelling and making predictions with real experimental networked supervisory performance data: classical linear regression, non-parametric Gaussian processes and probabilistic Bayesian networks. It is shown that each of these approaches can help designers of networked human-automated systems cope with various uncertainties in order to accommodate future users by linking expected operating conditions and performance from real experimental data to observable cognitive traits like WM capacity. Practitioner Summary: Working memory (WM) capacity helps account for inter-individual variability in operator performance in networked unmanned aerial vehicle supervisory tasks. This is useful for reliable performance prediction near experimental conditions via linear models; robust statistical prediction beyond experimental conditions via Gaussian process models and probabilistic inference about unknown task conditions/WM capacities via Bayesian network models. PMID:24308716
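
    Of the three modelling approaches compared, the Gaussian process is the easiest to sketch. The Python fragment below regresses a performance score on task load and WM capacity and returns a predictive uncertainty alongside the mean; the data are synthetic placeholders, not the study's experimental measurements.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(0)
        X = rng.uniform([1, 0], [8, 100], size=(60, 2))   # task load, WM capacity
        y = 0.8 * X[:, 1] - 4.0 * X[:, 0] + rng.normal(0, 5, 60)  # toy performance

        gp = GaussianProcessRegressor(kernel=RBF([1.0, 10.0]) + WhiteKernel(),
                                      normalize_y=True).fit(X, y)
        mean, std = gp.predict([[5, 70]], return_std=True)
        print(mean, std)   # prediction plus uncertainty for a new operator/condition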

  1. Illuminance-based slat angle selection model for automated control of split blinds

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Jia; Olbina, Svetlana [Rinker School of Building Construction, University of Florida, Gainesville, FL 32611-5703 (United States)

    2011-03-15

    Venetian blinds play an important role in controlling daylight in buildings. Automated blinds overcome some limitations of manual blinds; however, the existing automated systems mainly control the direct solar radiation and glare and cannot be used for controlling innovative blind systems such as split blinds. This research developed an Illuminance-based Slat Angle Selection (ISAS) model that predicts the optimum slat angles of split blinds to achieve the designed indoor illuminance. The model was constructed based on a series of multi-layer feed-forward artificial neural networks (ANNs). The illuminance values at the sensor points used to develop the ANNs were obtained with the software EnergyPlus™. The weather determinants (such as horizontal illuminance and sun angles) were used as the input variables for the ANNs. The illuminance level at a sensor point was the output variable for the ANNs. The ISAS model was validated by evaluating the errors in the calculation of (1) illuminance and (2) optimum slat angles. The validation results showed that the power of the ISAS model to predict illuminance was 94.7% while its power to calculate the optimum slat angles was 98.5%. For about 90% of the time in the year, the illuminance percentage errors were less than 10%, and the percentage errors in calculating the optimum slat angles were less than 5%. This research offers a new approach for the automated control of split blinds and a guide for future research to utilize the adaptive nature of ANNs to develop a more practical and applicable blind control system. (author)
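
    The selection logic itself is simple once the networks are trained. The Python sketch below trains a small MLP on synthetic (weather, slat angle) to illuminance data and then picks the slat angle whose predicted illuminance is closest to the design target; the feature set, toy response and 5-degree search grid are assumptions, not the ISAS model's actual configuration.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(2)
        X = rng.uniform([0, 0, -90], [100_000, 90, 90], size=(2000, 3))
        # columns: horizontal illuminance (lx), sun altitude (deg), slat angle (deg)
        y = X[:, 0] * 1e-2 * np.cos(np.radians(X[:, 2] - X[:, 1] / 2))  # toy response

        ann = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000,
                           random_state=0).fit(X, y)

        def optimum_slat_angle(horiz_illum, sun_altitude, target_lux):
            angles = np.arange(-90, 91, 5, dtype=float)
            queries = np.column_stack([np.full_like(angles, horiz_illum),
                                       np.full_like(angles, sun_altitude), angles])
            return angles[np.argmin(np.abs(ann.predict(queries) - target_lux))]

        print(optimum_slat_angle(40_000, 35, target_lux=500))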

  2. a Psycholinguistic Model for Simultaneous Translation, and Proficiency Assessment by Automated Acoustic Analysis of Discourse.

    Science.gov (United States)

    Yaghi, Hussein M.

    Two separate but related issues are addressed: how simultaneous translation (ST) works on a cognitive level and how such translation can be objectively assessed. Both of these issues are discussed in the light of qualitative and quantitative analyses of a large corpus of recordings of ST and shadowing. The proposed ST model utilises knowledge derived from a discourse analysis of the data, many accepted facts in the psychology tradition, and evidence from controlled experiments that are carried out here. This model has three advantages: (i) it is based on analyses of extended spontaneous speech rather than word-, syllable-, or clause -bound stimuli; (ii) it draws equally on linguistic and psychological knowledge; and (iii) it adopts a non-traditional view of language called 'the linguistic construction of reality'. The discourse-based knowledge is also used to develop three computerised systems for the assessment of simultaneous translation: one is a semi-automated system that treats the content of the translation; and two are fully automated, one of which is based on the time structure of the acoustic signals whilst the other is based on their cross-correlation. For each system, several parameters of performance are identified, and they are correlated with assessments rendered by the traditional, subjective, qualitative method. Using signal processing techniques, the acoustic analysis of discourse leads to the conclusion that quality in simultaneous translation can be assessed quantitatively with varying degrees of automation. It identifies as measures of performance (i) three content-based standards; (ii) four time management parameters that reflect the influence of the source on the target language time structure; and (iii) two types of acoustical signal coherence. Proficiency in ST is shown to be directly related to coherence and speech rate but inversely related to omission and delay. High proficiency is associated with a high degree of simultaneity and

  3. Research of Network System Reconfigurable Model Based on the Finite State Automation

    Directory of Open Access Journals (Sweden)

    Shenghan Zhou

    2014-05-01

    Full Text Available Since network analysis models based on system state face the issues of network survivability, safety, fault tolerance and dynamic adaptation to environmental changes, in this paper a network system model based on finite state automata is endowed with reconfigurable qualities. The model first puts forward the concept of reconfigurable network systems and reveals robustness, evolution and survivability as their basic attributes. By establishing a hierarchical model of system state, the system's robust behavior, evolution behavior and survival behavior are described. Secondly, taking network topology reconfigurability measurement as an example, quantitative reconfigurability metrics are put forward. Finally, an example verification is given. Experiments show that the proposed quantitative reconfigurability indicators of [1.391, 1.140, 1.591] for the reconfiguration resistance model prove that the network is an efficiently reconfigurable network topology, which can effectively adapt to dynamic changes in the environment

  4. Implementing the WebSocket Protocol Based on Formal Modelling and Automated Code Generation

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2014-01-01

    pragmatic annotations for automated code generation of protocol software. The contribution of this paper is an application of the approach as implemented in the PetriCode tool to obtain protocol software implementing the IETF WebSocket protocol. This demonstrates the scalability of our approach to real... protocols. Furthermore, we perform formal verification of the CPN model prior to code generation, and test the implementation for interoperability against the Autobahn WebSocket test-suite, resulting in 97% and 99% success rates for the client and server implementation, respectively. The tests show that the...

  5. An automated model-based aim point distribution system for solar towers

    Science.gov (United States)

    Schwarzbözl, Peter; Rong, Amadeus; Macke, Ansgar; Säck, Jan-Peter; Ulmer, Steffen

    2016-05-01

    Distribution of heliostat aim points is a major task during central receiver operation, as the flux distribution produced by the heliostats varies continuously with time. Known methods for aim point distribution are mostly based on simple aim point patterns and focus on control strategies to meet local temperature and flux limits of the receiver. Lowering the peak flux on the receiver to avoid hot spots and maximizing thermal output are obviously competing targets that call for a comprehensive optimization process. This paper presents a model-based method for online aim point optimization that includes the current heliostat field mirror quality derived through an automated deflectometric measurement process.

  6. GoSam 2.0. Automated one loop calculations within and beyond the standard model

    International Nuclear Information System (INIS)

    We present GoSam 2.0, a fully automated framework for the generation and evaluation of one-loop amplitudes in multi-leg processes. The new version offers numerous improvements both on the generational aspects as well as on the reduction side. This leads to faster and more stable code for calculations within and beyond the Standard Model. Furthermore, it contains the extended version of the standardized interface to Monte Carlo programs, which allows for an easy combination with other existing tools. We briefly describe the conceptual innovations and present some phenomenological results.

  7. Electronic design automation of analog ICs combining gradient models with multi-objective evolutionary algorithms

    CERN Document Server

    Rocha, Frederico AE; Lourenço, Nuno CC; Horta, Nuno CG

    2013-01-01

    This book applies to the scientific area of electronic design automation (EDA) and addresses the automatic sizing of analog integrated circuits (ICs). In particular, this book presents an approach to enhance a state-of-the-art layout-aware circuit-level optimizer (GENOM-POF) by embedding statistical knowledge from an automatically generated gradient model into the multi-objective multi-constraint optimization kernel based on the NSGA-II algorithm. The results shown allow the designer to explore the different trade-offs of the solution space, both through the achieved device sizes, or the resp

  8. PDB_REDO: automated re-refinement of X-ray structure models in the PDB.

    Science.gov (United States)

    Joosten, Robbie P; Salzemann, Jean; Bloch, Vincent; Stockinger, Heinz; Berglund, Ann-Charlott; Blanchet, Christophe; Bongcam-Rudloff, Erik; Combet, Christophe; Da Costa, Ana L; Deleage, Gilbert; Diarena, Matteo; Fabbretti, Roberto; Fettahi, Géraldine; Flegel, Volker; Gisel, Andreas; Kasam, Vinod; Kervinen, Timo; Korpelainen, Eija; Mattila, Kimmo; Pagni, Marco; Reichstadt, Matthieu; Breton, Vincent; Tickle, Ian J; Vriend, Gert

    2009-06-01

    Structural biology, homology modelling and rational drug design require accurate three-dimensional macromolecular coordinates. However, the coordinates in the Protein Data Bank (PDB) have not all been obtained using the latest experimental and computational methods. In this study a method is presented for automated re-refinement of existing structure models in the PDB. A large-scale benchmark with 16 807 PDB entries showed that they can be improved in terms of fit to the deposited experimental X-ray data as well as in terms of geometric quality. The re-refinement protocol uses TLS models to describe concerted atom movement. The resulting structure models are made available through the PDB_REDO databank (http://www.cmbi.ru.nl/pdb_redo/). Grid computing techniques were used to overcome the computational requirements of this endeavour. PMID:22477769

  9. Automated Translation and Thermal Zoning of Digital Building Models for Energy Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Nathaniel L. [Cornell University; McCrone, Colin J. [Cornell University; Walter, Bruce J. [Cornell University; Pratt, Kevin B. [Cornell University; Greenberg, Donald P. [Cornell University

    2013-08-26

    Building energy simulation is valuable during the early stages of design, when decisions can have the greatest impact on energy performance. However, preparing digital design models for building energy simulation typically requires tedious manual alteration. This paper describes a series of five automated steps to translate geometric data from an unzoned CAD model into a multi-zone building energy model. First, CAD input is interpreted as geometric surfaces with materials. Second, surface pairs defining walls of various thicknesses are identified. Third, normal directions of unpaired surfaces are determined. Fourth, space boundaries are defined. Fifth, optionally, settings from previous simulations are applied, and spaces are aggregated into a smaller number of thermal zones. Building energy models created quickly using this method can offer guidance throughout the design process.
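
    The second step, pairing surfaces into walls, reduces to a geometric test. The Python sketch below flags pairs of roughly antiparallel planes whose separation lies within a plausible wall-thickness range; the plane representation and tolerances are invented for illustration, not taken from the paper's implementation.

        import numpy as np

        def find_wall_pairs(normals, offsets, max_thickness=0.5, angle_tol=0.01):
            """normals: (n, 3) unit vectors; offsets: d in the plane equation n.x = d."""
            pairs = []
            for i in range(len(normals)):
                for j in range(i + 1, len(normals)):
                    antiparallel = np.dot(normals[i], normals[j]) < -(1.0 - angle_tol)
                    thickness = abs(offsets[i] + offsets[j])  # inter-plane distance
                    if antiparallel and 0.0 < thickness <= max_thickness:
                        pairs.append((i, j, thickness))
            return pairs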

  10. A Study on Automated Context-aware Access Control Model Using Ontology

    Science.gov (United States)

    Jang, Bokman; Jang, Hyokyung; Choi, Euiin

    Applications in a context-aware computing environment are connected over wireless networks and a variety of devices. Consequently, reckless access to information resources can cause trouble for the system, so managing access authority is a very important issue, both for the information resources themselves and for adapting the system through the security policy the system needs. However, the existing security model grants access to a resource simply through a user ID and password, and is not concerned with the user's environment information. In this paper, we propose a model of automated context-aware access control using ontology that can control resources more efficiently through inference and judgment over context information, by collecting the user's information and the user's environmental context information into an ontology model.

  11. Automated Behavioral Phenotyping Reveals Presymptomatic Alterations in a SCA3 Genetrap Mouse Model

    Institute of Scientific and Technical Information of China (English)

    Jeannette Hübener; Nicolas Casadei; Peter Teismann; Mathias W. Seeliger; Maria Bj(o)rkqvist; Stephan von H(o)rsten; Olaf Riess; Huu Phuc Nguyen

    2012-01-01

    Characterization of disease models of neurodegenerative disorders requires systematic and comprehensive phenotyping in a highly standardized manner. Therefore, automated high-resolution behavior test systems such as the homecage-based LabMaster system are of particular interest. We demonstrate the power of the automated LabMaster system by discovering previously unrecognized features of a recently characterized atxn3 mutant mouse model. This model showed neurological symptoms including gait ataxia, tremor, weight loss and premature death at the age of 12 months, usually detectable just 2 weeks before the mice died. Moreover, using the LabMaster system we were able to detect hypoactivity in presymptomatic mutant mice in the dark as well as the light phase. Additionally, we analyzed inflammation, immunological and hematological parameters, which indicated a reduced immune defense in phenotypic mice. Here we demonstrate that a detailed characterization even of organ systems that are usually not affected in SCA3 is important for further studies of pathogenesis and required for preclinical therapeutic studies.

  12. Automated model-based bias field correction of MR images of the brain.

    Science.gov (United States)

    Van Leemput, K; Maes, F; Vandermeulen, D; Suetens, P

    1999-10-01

    We propose a model-based method for fully automated bias field correction of MR brain images. The MR signal is modeled as a realization of a random process with a parametric probability distribution that is corrupted by a smooth polynomial inhomogeneity or bias field. The method we propose applies an iterative expectation-maximization (EM) strategy that interleaves pixel classification with estimation of class distribution and bias field parameters, improving the likelihood of the model parameters at each iteration. The algorithm, which can handle multichannel data and slice-by-slice constant intensity offsets, is initialized with information from a digital brain atlas about the a priori expected location of tissue classes. This allows full automation of the method without need for user interaction, yielding more objective and reproducible results. We have validated the bias correction algorithm on simulated data and we illustrate its performance on various MR images with important field inhomogeneities. We also relate the proposed algorithm to other bias correction algorithms. PMID:10628948
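
    The interleaved EM structure is worth seeing in miniature. The 1-D Python sketch below alternates a soft two-class assignment with a polynomial fit to the residual as the bias estimate; the class count, polynomial order and data are toy choices, not the paper's multichannel, atlas-initialized formulation.

        import numpy as np

        rng = np.random.default_rng(3)
        x = np.linspace(-1, 1, 500)
        true_class = rng.integers(0, 2, 500)
        means = np.array([30.0, 70.0])
        observed = means[true_class] + rng.normal(0, 3, 500) + 15 * x**2  # + bias

        bias = np.zeros_like(x)
        for _ in range(20):
            corrected = observed - bias
            # E-step: soft assignment to the class means (fixed variance)
            resp = np.exp(-(corrected[:, None] - means[None, :])**2 / (2 * 3.0**2))
            resp /= resp.sum(axis=1, keepdims=True)
            # M-step: update class means, then refit the bias to the residual
            means = (resp * corrected[:, None]).sum(0) / resp.sum(0)
            bias = np.polyval(np.polyfit(x, observed - resp @ means, deg=2), x)
            bias -= bias.mean()   # pin the constant offset into the class means

        print(np.round(means, 1))   # class means recovered, bias largely removed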

  13. Semi-automated proper orthogonal decomposition reduced order model non-linear analysis for future BWR stability

    International Nuclear Information System (INIS)

    Highlights: • Techniques within the field of ROMing based on POD are reviewed regarding “well-behaved” applications. • A systematic, general, mostly automated reduction methodology based on POD is derived. • It is applicable to many classes of dynamical problems including the envisioned BWR application. • Robustness of this approach is demonstrated by a “pathological” test example. • The derived ROM accurately predicts the dynamics of transients not included in the data set. - Abstract: Thermal–hydraulic coupling between power, flow rate and density, intensified by neutronics feedback, is the main driver determining the stability behavior of a boiling water reactor (BWR). High-power low-flow conditions in connection with unfavorable power distributions can lead the BWR system into unstable regions where power oscillations can be triggered. This important threat to operational safety requires careful analysis for proper understanding. Current design rules assure admissible operating conditions via exclusion regions determined by numerical calculations and analytical methods based on non-linear states for specific transients. Analyzing an exhaustive parameter space of the non-linear BWR system becomes feasible with methodologies based on reduced order models (ROMs), saving computational cost and improving the physical understanding. A new self-contained methodology is developed, based on the general proper orthogonal decomposition (POD) reduction technique. It is mostly automated, applicable to generic partial differential equation (PDE) systems, and reduces them in a grid-free manner to a small ordinary differential equation (ODE) system able to capture even non-linear dynamics. This allows a much more extensive analysis of the represented physical system. Symbolic mathematical manipulations are performed automatically by Mathematica routines. A novel and general calibration roadmap is proposed which simplifies choices on specific POD
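
    The core POD step is compact enough to sketch. The Python fragment below extracts dominant modes from a synthetic snapshot matrix via the SVD and projects onto them; the calibration roadmap and the symbolic (Mathematica) manipulations described above sit on top of this kind of kernel, and all data here are illustrative.

        import numpy as np

        rng = np.random.default_rng(4)
        x = np.linspace(0, 1, 200)
        t = np.linspace(0, 10, 300)
        # synthetic snapshots: two spatial structures plus small noise
        U = (np.outer(np.sin(np.pi * x), np.cos(2 * t))
             + 0.3 * np.outer(np.sin(3 * np.pi * x), np.sin(5 * t))
             + 0.01 * rng.normal(size=(200, 300)))

        modes, sing_vals, _ = np.linalg.svd(U, full_matrices=False)
        energy = np.cumsum(sing_vals**2) / np.sum(sing_vals**2)
        r = int(np.searchsorted(energy, 0.999)) + 1   # modes for 99.9% energy
        basis = modes[:, :r]                # reduced basis (r = 2 here)
        amplitudes = basis.T @ U            # time coefficients a(t) = Phi^T u(t)
        # a Galerkin ROM would now evolve da/dt = Phi^T f(Phi a) instead of the PDE
        print(r, amplitudes.shape)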

  14. A conceptual model of the automated credibility assessment of the volunteered geographic information

    International Nuclear Information System (INIS)

    The use of Volunteered Geographic Information (VGI) for collecting, sharing and disseminating geospatially referenced information on the Web is increasingly common. The potential of this localized and collective information has been seen to complement the maintenance process of authoritative mapping data sources and to aid the development of Digital Earth. The main barrier to the use of these data in supporting this bottom-up approach is the credibility (trust), completeness, accuracy, and quality of both the data input and the outputs generated. The only feasible approach to assessing these data is to rely on an automated process. This paper describes a conceptual model of indicators (parameters) and practical approaches to automatically assess the credibility of information contributed through VGI, including map mashups, Geo Web and crowd-sourced applications. There are two main components proposed for assessment in the conceptual model – metadata and data. The metadata component comprises indicators for the hosting websites and the sources of the data / information. The data component comprises indicators to assess absolute and relative positioning, attribute, thematic, temporal and geometric correctness and consistency. This paper suggests approaches to assess these components. To assess the metadata component, automated text categorization using supervised machine learning is proposed. To assess correctness and consistency in the data component, we suggest a matching validation approach using current emerging technologies from Linked Data infrastructures, together with validation by third-party reviews. This study contributes to the research domain that focuses on the credibility, trust and quality issues of data contributed by web citizen providers

  15. An automated method to build groundwater model hydrostratigraphy from airborne electromagnetic data and lithological borehole logs

    Directory of Open Access Journals (Sweden)

    P. A. Marker

    2015-02-01

    Full Text Available Large-scale integrated hydrological models are important decision support tools in water resources management. The largest source of uncertainty in such models is the hydrostratigraphic model. The geometry and configuration of hydrogeological units are often poorly determined from hydrogeological data alone. Due to sparse sampling in space, lithological borehole logs may overlook structures that are important for groundwater flow at larger scales. Good spatial coverage along with high spatial resolution makes airborne time-domain electromagnetic (AEM) data valuable for the structural input to large-scale groundwater models. We present a novel method to automatically integrate large AEM datasets and lithological information into large-scale hydrological models. Clay-fraction maps are produced by translating geophysical resistivity into clay-fraction values using lithological borehole information. Voxel models of electrical resistivity and clay fraction are classified into hydrostratigraphic zones using k-means clustering. Hydraulic conductivity values of the zones are estimated by hydrological calibration using hydraulic head and stream discharge observations. The method is applied to a Danish case study. When hydrological performance was benchmarked by comparison of simulated hydrological state variables, the cluster models performed competitively. Calibrations of 11 hydrostratigraphic cluster models with 1–11 hydraulic conductivity zones showed improved hydrological performance with increasing number of clusters; beyond the 5-cluster model, hydrological performance did not improve. Due to reproducibility and the possibility of method standardization and automation, we believe that hydrostratigraphic model generation with the proposed method has important prospects for groundwater models used in water resources management.
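
    The zonation step maps directly onto standard clustering. The Python sketch below clusters voxels on (log-resistivity, clay fraction) into five zones with k-means; the arrays are synthetic stand-ins for the AEM-derived voxel model, and each resulting zone would receive one hydraulic conductivity value in the subsequent calibration.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(5)
        n = 10_000
        log_res = np.concatenate([rng.normal(1.2, 0.15, n // 2),    # clayey
                                  rng.normal(2.0, 0.20, n // 2)])   # sandy
        clay_frac = np.clip(1.8 - 0.8 * log_res + rng.normal(0, 0.05, n), 0, 1)

        features = np.column_stack([log_res, clay_frac])
        zones = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(features)
        print(np.bincount(zones))   # voxel count per hydrostratigraphic zone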

  16. Abstractions for Mechanical Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer; Wisniewski, Rafael

    2012-01-01

    mechanical system. The tangential manifolds are generated using constants of motion, which can be derived from Noether's theorem. The transversal manifolds are subsequently generated on a reduced space, given by the Routhian, via action-angle coordinates. The method fully applies to integrable systems. We... focus on a particular aspect of abstraction - partitioning the state space, as existing methods can be applied on the discretized state space to obtain an automata-based model. The contribution of the paper is to show that well-known reduction methods can be used to generate abstract models, which can...

  17. Policy-Based Automation of Dynamic Multipoint Virtual Private Network Simulation on OPNET Modeler

    Directory of Open Access Journals (Sweden)

    Ayoub BAHNASSE

    2014-12-01

    Full Text Available The simulation of large-scale networks is a challenging task, especially if the network to simulate is a Dynamic Multipoint Virtual Private Network, which requires expert knowledge to properly configure its component technologies. The study of these network architectures in a real environment is almost impossible because it requires a very large amount of equipment; however, the task is feasible in a simulation environment like OPNET Modeler, provided that one masters both the tool and the different architectures of the Dynamic Multipoint Virtual Private Network. Several research studies have been conducted to automate the generation and simulation of complex networks under various simulators; according to our research, no work has dealt with the Dynamic Multipoint Virtual Private Network. In this paper we present a simulation model of the Dynamic Multipoint Virtual Private Network in OPNET Modeler, and a WEB-based tool for project management on the same network.

  18. Improved automated diagnosis of misfire in internal combustion engines based on simulation models

    Science.gov (United States)

    Chen, Jian; Bond Randall, Robert

    2015-12-01

    In this paper, a new advance in the application of Artificial Neural Networks (ANNs) to the automated diagnosis of misfires in internal combustion (IC) engines is detailed. The automated diagnostic system comprises three stages: fault detection, fault localization and fault severity identification. In particular, in the severity identification stage, separate Multi-Layer Perceptron networks (MLPs) with saturating linear transfer functions were designed for individual speed conditions, so that they could achieve finer classification. In order to obtain sufficient data for network training, numerical simulation was used to simulate different ranges of misfires in the engine. The simulation models need to be updated and evaluated using experimental data, so a series of experiments was first carried out on the engine test rig to capture the vibration signals for both the normal condition and a range of misfires. Two methods were used for the misfire diagnosis: one is based on the torsional vibration signals of the crankshaft and the other on the angular acceleration signals (rotational motion) of the engine block. Following the signal processing of the experimental and simulation signals, the best features were selected as the inputs to the ANN networks. The ANN systems were trained using only the simulated data and tested using real experimental cases, indicating that the simulation model can be used for a wider range of faults for which it can still be considered valid. The final results show that the simulation-based diagnostic system can efficiently diagnose misfire, including location and severity.
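
    The severity-identification stage can be sketched in a few lines. The Python fragment below trains a small per-speed classifier on simulated features and applies it to a measured case; the features and labels are synthetic placeholders, and (unlike the saturating-linear networks described above) scikit-learn's default activation is used.

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(6)
        n = 600
        severity = rng.integers(0, 4, n)   # 0 = none ... 3 = severe misfire
        # toy features: harmonic amplitudes of crankshaft torsional vibration
        features = np.column_stack([0.5 * severity + rng.normal(0, 0.1, n),
                                    0.1 * severity**2 + rng.normal(0, 0.1, n)])

        clf = MLPClassifier(hidden_layer_sizes=(15,), max_iter=1000,
                            random_state=0).fit(features, severity)
        print(clf.predict([[1.1, 0.5]]))   # stand-in for an experimental case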

  19. An efficient approach to bioconversion kinetic model generation based on automated microscale experimentation integrated with model driven experimental design

    DEFF Research Database (Denmark)

    Chen, B. H.; Micheletti, M.; Baganz, F.; Woodley, John; Lye, G.J.

    2009-01-01

    . These can be both time-consuming and expensive when working with the types of non-natural chiral intermediates important in pharmaceutical syntheses. This paper presents an automated microscale approach to the rapid and cost-effective generation of reliable kinetic models useful for bioconversion... experimental design. In comparison with conventional methodology, the modelling approach enabled a nearly 4-fold decrease in the number of experiments, while the microwell experimentation enabled a 45-fold decrease in material requirements and a significant increase in experimental throughput. The approach is

  20. Automated 3D Damaged Cavity Model Builder for Lower Surface Acreage Tile on Orbiter

    Science.gov (United States)

    Belknap, Shannon; Zhang, Michael

    2013-01-01

    The 3D Automated Thermal Tool for Damaged Acreage Tile Math Model builder was developed to perform fast and accurate 3D thermal analyses on damaged lower surface acreage tiles and the structures beneath the damaged locations on a Space Shuttle Orbiter. The 3D model builder created both TRASYS geometric math models (GMMs) and SINDA thermal math models (TMMs) to simulate an idealized damaged cavity in the damaged tile(s). The GMMs are processed in TRASYS to generate radiation conductors between the surfaces in the cavity. The radiation conductors are inserted into the TMMs, which are processed in SINDA to generate temperature histories for all of the nodes on each layer of the TMM. The invention allows a thermal analyst to create a 3D model of a damaged lower surface tile on the orbiter quickly and accurately. The 3D model builder can generate a GMM and the corresponding TMM in one or two minutes, with the damaged cavity included in the tile material. A separate program creates a configuration file, which takes a couple of minutes to edit. This configuration file is read by the model builder program to determine the location of the damage, the correct tile type, tile thickness, structure thickness, and SIP thickness at the damage site, so that the model builder program can build an accurate model at the specified location. Once the models are built, they are processed by TRASYS and SINDA.

  1. TOBAGO — a semi-automated approach for the generation of 3-D building models

    Science.gov (United States)

    Gruen, Armin

    3-D city models are in increasing demand for a great number of applications. Photogrammetry is a relevant technology that can provide an abundance of geometric, topologic and semantic information concerning these models. The pressure to generate a large amount of data with a high degree of accuracy and completeness poses a great challenge to photogrammetry. The development of automated and semi-automated methods for the generation of those data sets is therefore a key issue in photogrammetric research. We present in this article a strategy and methodology for the efficient generation of even fairly complex building models. Within this concept we request the operator to measure the house roofs from a stereomodel in the form of an unstructured point cloud. According to our experience this can be done very quickly. Even a non-experienced operator can measure several hundred roofs or roof units per day. In a second step we fit generic building models fully automatically to these point clouds. The structure information is inherently included in these building models. In this way geometric, topologic and even semantic data can be handed over to a CAD system, in our case AutoCad, for further visualization and manipulation. The structuring is achieved in three steps. In the first step a classifier is initiated which recognizes the class of houses a particular roof point cloud belongs to. This recognition step is primarily based on the analysis of the number of ridge points. In the second and third steps the concrete topological relations between roof points are investigated and generic building models are fitted to the point clouds. Based on the technique of constraint-based reasoning, two geometrical parsers solve this problem. We have tested the methodology under a variety of different conditions in several pilot projects. The results indicate the good performance of our approach. In addition we demonstrate how the results can be used for visualization (texture

  2. EBS Radionuclide Transport Abstraction

    International Nuclear Information System (INIS)

    The purpose of this work is to develop the Engineered Barrier System (EBS) radionuclide transport abstraction model, as directed by a written development plan (CRWMS M&O 1999a). This abstraction is the conceptual model that will be used to determine the rate of release of radionuclides from the EBS to the unsaturated zone (UZ) in the total system performance assessment-license application (TSPA-LA). In particular, this model will be used to quantify the time-dependent radionuclide releases from a failed waste package (WP) and their subsequent transport through the EBS to the emplacement drift wall/UZ interface. The development of this conceptual model will allow Performance Assessment Operations (PAO) and its Engineered Barrier Performance Department to provide a more detailed and complete EBS flow and transport abstraction. The results from this conceptual model will allow PAO to address portions of the key technical issues (KTIs) presented in three NRC Issue Resolution Status Reports (IRSRs): (1) the Evolution of the Near-Field Environment (ENFE), Revision 2 (NRC 1999a), (2) the Container Life and Source Term (CLST), Revision 2 (NRC 1999b), and (3) the Thermal Effects on Flow (TEF), Revision 1 (NRC 1998). The conceptual model for flow and transport in the EBS will be referred to as the ''EBS RT Abstraction'' in this analysis/modeling report (AMR). The scope of this abstraction and report is limited to flow and transport processes. More specifically, this AMR does not discuss elements of the TSPA-SR and TSPA-LA that relate to the EBS but are discussed in other AMRs. These elements include corrosion processes, radionuclide solubility limits, waste form dissolution rates and concentrations of colloidal particles that are generally represented as boundary conditions or input parameters for the EBS RT Abstraction. In effect, this AMR provides the algorithms for transporting radionuclides using the flow geometry and radionuclide concentrations determined by other

  3. A Collaborative System Software Solution for Modeling Business Flows Based on Automated Semantic Web Service Composition

    Directory of Open Access Journals (Sweden)

    Ion SMEUREANU

    2009-01-01

    Full Text Available Nowadays, business interoperability is one of the key factors for assuring competitive advantage for the participating business partners. In order to implement business cooperation, scalable, distributed and portable collaborative systems have to be implemented. This article presents some of the most widely used technologies in this field. Furthermore, it presents a software application architecture based on the Business Process Modeling Notation standard and automated semantic web service coupling for modeling business flows in a collaborative manner. The main business processes will be represented in a single, hierarchic flow diagram. Each element of the diagram will represent calls to semantic web services. The business logic (the business rules and constraints) will be structured with the help of OWL (Ontology Web Language). Moreover, OWL will also be used to create the semantic web service specifications.

  4. CHANNEL MORPHOLOGY TOOL (CMT): A GIS-BASED AUTOMATED EXTRACTION MODEL FOR CHANNEL GEOMETRY

    Energy Technology Data Exchange (ETDEWEB)

    JUDI, DAVID [Los Alamos National Laboratory; KALYANAPU, ALFRED [Los Alamos National Laboratory; MCPHERSON, TIMOTHY [Los Alamos National Laboratory; BERSCHEID, ALAN [Los Alamos National Laboratory

    2007-01-17

    This paper describes an automated Channel Morphology Tool (CMT) developed in the ArcGIS 9.1 environment. The CMT creates cross-sections along a stream centerline and uses a digital elevation model (DEM) to create station points with elevations along each of the cross-sections. The generated cross-sections may then be exported into a hydraulic model. Along with rapid cross-section generation, the CMT also eliminates any cross-section overlaps that might occur due to the sinuosity of the channels, using the Cross-section Overlap Correction Algorithm (COCoA). The CMT was tested by extracting cross-sections from a 5-m DEM for a 50-km channel length in Houston, Texas. The extracted cross-sections were compared directly with surveyed cross-sections in terms of cross-section area. Results indicated that the CMT-generated cross-sections satisfactorily matched the surveyed data.
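
    As an illustration of the core sampling step, here is a minimal sketch that builds perpendicular cross-sections along a centerline and samples a DEM by bilinear interpolation. It assumes a numpy elevation grid with unit cell size; all names are hypothetical and the overlap correction (COCoA) is omitted:

        import numpy as np

        def bilinear(dem, x, y):
            """Sample a DEM raster (row 0 at y=0, unit cell size) at fractional (x, y)."""
            x0, y0 = int(np.floor(x)), int(np.floor(y))
            dx, dy = x - x0, y - y0
            z00, z10 = dem[y0, x0], dem[y0, x0 + 1]
            z01, z11 = dem[y0 + 1, x0], dem[y0 + 1, x0 + 1]
            return (z00 * (1 - dx) * (1 - dy) + z10 * dx * (1 - dy)
                    + z01 * (1 - dx) * dy + z11 * dx * dy)

        def cross_sections(centerline, dem, half_width=5.0, n_stations=11):
            """For each centerline segment midpoint, build a station-elevation
            profile along the perpendicular direction."""
            sections = []
            for (x1, y1), (x2, y2) in zip(centerline[:-1], centerline[1:]):
                mx, my = (x1 + x2) / 2.0, (y1 + y2) / 2.0
                tx, ty = x2 - x1, y2 - y1
                norm = np.hypot(tx, ty)
                nx, ny = -ty / norm, tx / norm      # unit normal to the channel
                offsets = np.linspace(-half_width, half_width, n_stations)
                sections.append([(o, bilinear(dem, mx + o * nx, my + o * ny))
                                 for o in offsets])
            return sections

        dem = np.add.outer(np.linspace(10, 0, 50), np.abs(np.linspace(-5, 5, 50)))  # toy valley
        line = [(10.0, 5.0), (20.0, 15.0), (30.0, 25.0)]
        print(cross_sections(line, dem)[0][:3])      # first three station points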

  5. Implementing the WebSocket Protocol Based on Formal Modelling and Automated Code Generation

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2014-01-01

    ...pragmatic annotations for automated code generation of protocol software. The contribution of this paper is an application of the approach as implemented in the PetriCode tool to obtain protocol software implementing the IETF WebSocket protocol. This demonstrates the scalability of our approach to real protocols. Furthermore, we perform formal verification of the CPN model prior to code generation, and test the implementation for interoperability against the Autobahn WebSocket test-suite, resulting in 97% and 99% success rates for the client and server implementations, respectively. The tests show that the causes of test failures were mostly local and trivial errors in newly written code-generation templates, and not related to the overall logical operation of the protocol as specified by the CPN model.

  6. A cellular automaton model for the change of public attitude regarding nuclear energy

    International Nuclear Information System (INIS)

    A cellular automaton model was constructed to investigate how public opinion on nuclear energy in Japan depends upon the information environment and personal communication between people. From simulations with this model, the following became clear: (i) society is a highly non-linear system with self-organizing potential; (ii) in a society composed of one type of constituent member with homogeneous characteristics, the trend of public opinion changes substantially only when the effort to improve public acceptance over a long period of time, by means such as education, persuasion and advertisement, exceeds a certain threshold; and (iii) if the amount of information on nuclear risk released by the news media is continuously reduced from now on, the acceptability of nuclear energy improves significantly, provided that the extent of the reduction exceeds a certain threshold. (author)
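
    The flavour of such a model can be conveyed by a minimal sketch: a grid of attitudes in {-1, 0, +1} updated synchronously from the local neighbourhood mean plus an external media-bias term. The rule and all parameter values below are illustrative assumptions, not the model of this record:

        import numpy as np

        rng = np.random.default_rng(0)

        def step(grid, media_bias, threshold=0.5):
            """One synchronous update: a cell adopts the sign of the combined
            neighbourhood pressure and media term when it exceeds a threshold;
            otherwise it keeps its current attitude."""
            # mean attitude of the 4-neighbourhood (periodic boundaries)
            neigh = (np.roll(grid, 1, 0) + np.roll(grid, -1, 0) +
                     np.roll(grid, 1, 1) + np.roll(grid, -1, 1)) / 4.0
            pressure = neigh + media_bias
            new = grid.copy()
            new[pressure > threshold] = 1       # accepting
            new[pressure < -threshold] = -1     # opposing
            return new

        grid = rng.choice([-1, 0, 1], size=(50, 50))   # initial public attitude
        for t in range(100):
            grid = step(grid, media_bias=0.2)          # persistent pro-acceptance information
        print("mean attitude after 100 steps:", grid.mean())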

  7. Testing abstract behavioral specifications

    NARCIS (Netherlands)

    Wong, P.Y.H.; Bubel, R.; Boer, F.S. de; Gouw, C.P.T. de; Gómez-Zamalloa, M.; Haehnle, R; Meinke, K.; Sindhu, M.A.

    2015-01-01

    We present a range of testing techniques for the Abstract Behavioral Specification (ABS) language and apply them to an industrial case study. ABS is a formal modeling language for highly variable, concurrent, component-based systems. The nature of these systems makes them susceptible to the introduc

  8. Automated Feature Based Tls Data Registration for 3d Building Modeling

    Science.gov (United States)

    Kitamura, K.; Kochi, N.; Kaneko, S.

    2012-07-01

    In this paper we present a novel method for the registration of point cloud data obtained using a terrestrial laser scanner (TLS). The final goal of our investigation is the automated reconstruction of CAD drawings and the 3D modeling of objects surveyed by TLS. Because objects are scanned from multiple positions, the individual point clouds need to be registered to the same coordinate system. We propose in this paper an automated feature-based registration procedure. Our proposed method does not require the definition of initial values or the placement of targets, and is robust against noise and background elements. A feature extraction procedure is performed for each point cloud as pre-processing. The registration of the point clouds from different viewpoints is then performed by utilizing the extracted features. The feature extraction method which we had developed previously (Kitamura, 2010) is used: planes and edges are extracted from the point cloud. By utilizing these features, the amount of information to process is reduced and the efficiency of the whole registration procedure is increased. In this paper, we describe the proposed algorithm and, in order to demonstrate its effectiveness, we show the results obtained by using real data.
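
    Once corresponding features have been matched between two scans, the rigid alignment itself is a standard least-squares problem. The sketch below applies the Kabsch/SVD solution to already-matched 3D points on synthetic data; it illustrates only this final step, not the authors' feature extraction or matching:

        import numpy as np

        def rigid_transform(src, dst):
            """Least-squares rotation R and translation t with dst ~ R @ src + t
            (Kabsch/Umeyama, for already-matched feature points)."""
            c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
            H = (src - c_src).T @ (dst - c_dst)
            U, _, Vt = np.linalg.svd(H)
            d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
            R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
            t = c_dst - R @ c_src
            return R, t

        rng = np.random.default_rng(1)
        src = rng.random((20, 3))
        angle = np.radians(30)
        R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                           [np.sin(angle),  np.cos(angle), 0],
                           [0, 0, 1]])
        dst = src @ R_true.T + np.array([0.5, -0.2, 1.0])
        R, t = rigid_transform(src, dst)
        print(np.allclose(R, R_true), np.round(t, 3))  # True [ 0.5 -0.2  1. ]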

  9. Modeling and performance optimization of automated antenna alignment for telecommunication transceivers

    Directory of Open Access Journals (Sweden)

    Md. Ahsanul Hoque

    2015-09-01

    Full Text Available Antenna alignment is very cumbersome in the telecommunication industry, and it especially affects microwave (MW) links through environmental anomalies or physical degradation over time. While the more conventional approach of redundancy has been employed in recent years, novel automation techniques are needed to ensure line-of-sight (LOS) link stability. The basic principle is to capture the desired Received Signal Level (RSL) by means of an outdoor unit installed at the tower top and to analyze the RSL in an indoor unit by means of a GUI interface. We have proposed a new smart antenna system where automation is initiated when the transceivers receive low signal strength and report the finding to a processing comparator unit. A series architecture is used that includes a loop antenna and RCX Robonics, with a LabVIEW interface coupled to a tunable external controller. Denavit–Hartenberg parameters are used in the analytical modeling, and numerous control techniques have been investigated to overcome imminent overshoot problems for the transport link. With this novel approach, a solution has been put forward for the communication industry whereby any antenna can achieve optimal directivity for the desired RSL with low overshoot and a fast steady-state response.

  10. Automated Reconstruction of Walls from Airborne LIDAR Data for Complete 3d Building Modelling

    Science.gov (United States)

    He, Y.; Zhang, C.; Awrangjeb, M.; Fraser, C. S.

    2012-07-01

    Automated 3D building model generation continues to attract research interest in photogrammetry and computer vision. Airborne Light Detection and Ranging (LIDAR) data with increasing point density and accuracy has been recognized as a valuable source for automated 3D building reconstruction. While considerable achievements have been made in roof extraction, limited research has been carried out on the modelling and reconstruction of walls, which constitute important components of a full building model. Low point density and irregular point distribution of LIDAR observations on vertical walls render this task complex. This paper develops a novel approach for wall reconstruction from airborne LIDAR data. The developed method commences with point cloud segmentation using a region growing approach. Seed points for planar segments are selected through principal component analysis, and points in the neighbourhood are collected and examined to form planar segments. Afterwards, segment-based classification is performed to identify roofs, walls and planar ground surfaces. For walls with sparse LIDAR observations, a search is conducted in the neighbourhood of each individual roof segment to collect wall points, and the walls are then reconstructed using geometrical and topological constraints. Finally, walls which were not illuminated by the LIDAR sensor are determined via both reconstructed roof data and neighbouring walls. This leads to the generation of topologically consistent, geometrically accurate and complete 3D building models. Experiments have been conducted at two test sites in the Netherlands and Australia to evaluate the performance of the proposed method. Results show that planar segments can be reliably extracted at the two reported test sites, which have different point density, and that building walls can be correctly reconstructed if they are illuminated by the LIDAR sensor.
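
    The PCA-based seed test at the heart of such region growing can be sketched in a few lines: the smallest eigenvalue of the local covariance is near zero for points lying on a plane, and the corresponding eigenvector approximates the surface normal. A toy illustration in which names and data are assumptions:

        import numpy as np

        def planarity(points):
            """Smallest-eigenvalue ratio of the local covariance: near zero for
            points lying on a plane. Returns (score, unit normal)."""
            centered = points - points.mean(axis=0)
            cov = centered.T @ centered / len(points)
            evals, evecs = np.linalg.eigh(cov)       # ascending eigenvalues
            score = evals[0] / evals.sum()
            return score, evecs[:, 0]

        rng = np.random.default_rng(2)
        xy = rng.random((200, 2))
        plane = np.c_[xy, 0.3 * xy[:, 0] - 0.1 * xy[:, 1]
                      + 0.01 * rng.standard_normal(200)]   # noisy planar patch
        blob = rng.random((200, 3))                        # unstructured points
        for name, pts in [("planar patch", plane), ("random blob", blob)]:
            s, n = planarity(pts)
            print(f"{name}: planarity={s:.4f}, normal={np.round(n, 2)}")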

  11. EBS Radionuclide Transport Abstraction

    International Nuclear Information System (INIS)

    The purpose of this report is to develop and analyze the engineered barrier system (EBS) radionuclide transport abstraction model, consistent with Level I and Level II model validation, as identified in Technical Work Plan for: Near-Field Environment and Transport: Engineered Barrier System: Radionuclide Transport Abstraction Model Report Integration (BSC 2005 [DIRS 173617]). The EBS radionuclide transport abstraction (or EBS RT Abstraction) is the conceptual model used in the total system performance assessment (TSPA) to determine the rate of radionuclide releases from the EBS to the unsaturated zone (UZ). The EBS RT Abstraction conceptual model consists of two main components: a flow model and a transport model. Both models are developed mathematically from first principles in order to show explicitly what assumptions, simplifications, and approximations are incorporated into the models used in the TSPA. The flow model defines the pathways for water flow in the EBS and specifies how the flow rate is computed in each pathway. Input to this model includes the seepage flux into a drift. The seepage flux is potentially split by the drip shield, with some (or all) of the flux being diverted by the drip shield and some passing through breaches in the drip shield that might result from corrosion or seismic damage. The flux through drip shield breaches is potentially split by the waste package, with some (or all) of the flux being diverted by the waste package and some passing through waste package breaches that might result from corrosion or seismic damage. Neither the drip shield nor the waste package survives an igneous intrusion, so the flux splitting submodel is not used in the igneous scenario class. The flow model is validated in an independent model validation technical review. The drip shield and waste package flux splitting algorithms are developed and validated using experimental data. The transport model considers advective transport and diffusive transport

  12. Toward automated model building from video in computer-assisted diagnoses in colonoscopy

    Science.gov (United States)

    Koppel, Dan; Chen, Chao-I.; Wang, Yuan-Fang; Lee, Hua; Gu, Jia; Poirson, Allen; Wolters, Rolf

    2007-03-01

    A 3D colon model is an essential component of a computer-aided diagnosis (CAD) system in colonoscopy, assisting surgeons in visualization, surgical planning and training. This research is thus aimed at developing the ability to construct a 3D colon model from endoscopic videos (or images). This paper summarizes our ongoing research in automated model building in colonoscopy. We have developed the mathematical formulations and algorithms for modeling static, localized 3D anatomic structures within a colon that can be rendered from multiple novel view points for close scrutiny and precise dimensioning. This ability is useful for the scenario when a surgeon notices some abnormal tissue growth and wants a close inspection and precise dimensioning. Our modeling system uses only video images and follows a well-established computer-vision paradigm for image-based modeling. We extract prominent features from images and establish their correspondences across multiple images by continuous tracking and discrete matching. We then use these feature correspondences to infer the camera's movement. The camera motion parameters allow us to rectify images into a standard stereo configuration and calculate pixel movements (disparity) in these images. The inferred disparity is then used to recover 3D surface depth. The inferred 3D depth, together with texture information recorded in images, allows us to construct a 3D model with both structure and appearance information that can be rendered from multiple novel view points.
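
    The final step of that pipeline, recovering depth from disparity in a rectified stereo configuration, reduces to the classical relation Z = fB/d. A minimal sketch with illustrative numbers (not calibration values from the paper):

        import numpy as np

        def depth_from_disparity(disparity_px, focal_px, baseline_m):
            """Classical rectified-stereo relation: Z = f * B / d."""
            d = np.asarray(disparity_px, dtype=float)
            with np.errstate(divide="ignore"):
                return np.where(d > 0, focal_px * baseline_m / d, np.inf)

        # e.g. features tracked across two rectified endoscope frames with an
        # effective 2 mm baseline and a 500 px focal length (illustrative numbers)
        disp = np.array([4.0, 8.0, 16.0])
        print(depth_from_disparity(disp, focal_px=500.0, baseline_m=0.002))  # metres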

  13. Hierarchical Modeling and Verification for High-speed Train Control Center by Timed Automata

    Directory of Open Access Journals (Sweden)

    Lei Yuan

    2014-06-01

    Full Text Available The Chinese Train Control System Level 3 (CTCS-3) is a major technical system in Chinese high-speed rail, and the Train Control Center (TCC) is an indispensable component of CTCS-3. Current research on the TCC is mainly based on simulation, which cannot ensure that all conditions in the TCC are tested. This paper presents a hierarchical modeling method and uses timed automata (TA) to model the TCC software. We take the design of the active balise telegram editing, a major part of the TCC software, as an example. First, the process of active balise telegram editing is analyzed to obtain a hierarchical diagram containing several layers. Then, TA are employed to build one TA model for each layer. Lastly, we use UPPAAL (a model validation tool developed by Uppsala University and Aalborg University) to construct a network of the TA models to verify the active balise telegram editing. The verification results demonstrate that this modeling method is feasible and that the model can meet the functional requirements of the TCC software.

  14. Seismic Consequence Abstraction

    Energy Technology Data Exchange (ETDEWEB)

    M. Gross

    2004-10-25

    The primary purpose of this model report is to develop abstractions for the response of engineered barrier system (EBS) components to seismic hazards at a geologic repository at Yucca Mountain, Nevada, and to define the methodology for using these abstractions in a seismic scenario class for the Total System Performance Assessment - License Application (TSPA-LA). A secondary purpose of this model report is to provide information for criticality studies related to seismic hazards. The seismic hazards addressed herein are vibratory ground motion, fault displacement, and rockfall due to ground motion. The EBS components are the drip shield, the waste package, and the fuel cladding. The requirements for development of the abstractions and the associated algorithms for the seismic scenario class are defined in ''Technical Work Plan For: Regulatory Integration Modeling of Drift Degradation, Waste Package and Drip Shield Vibratory Motion and Seismic Consequences'' (BSC 2004 [DIRS 171520]). The development of these abstractions will provide a more complete representation of flow into and transport from the EBS under disruptive events. The results from this development will also address portions of integrated subissue ENG2, Mechanical Disruption of Engineered Barriers, including the acceptance criteria for this subissue defined in Section 2.2.1.3.2.3 of the ''Yucca Mountain Review Plan, Final Report'' (NRC 2003 [DIRS 163274]).

  15. Seismic Consequence Abstraction

    International Nuclear Information System (INIS)

    The primary purpose of this model report is to develop abstractions for the response of engineered barrier system (EBS) components to seismic hazards at a geologic repository at Yucca Mountain, Nevada, and to define the methodology for using these abstractions in a seismic scenario class for the Total System Performance Assessment - License Application (TSPA-LA). A secondary purpose of this model report is to provide information for criticality studies related to seismic hazards. The seismic hazards addressed herein are vibratory ground motion, fault displacement, and rockfall due to ground motion. The EBS components are the drip shield, the waste package, and the fuel cladding. The requirements for development of the abstractions and the associated algorithms for the seismic scenario class are defined in ''Technical Work Plan For: Regulatory Integration Modeling of Drift Degradation, Waste Package and Drip Shield Vibratory Motion and Seismic Consequences'' (BSC 2004 [DIRS 171520]). The development of these abstractions will provide a more complete representation of flow into and transport from the EBS under disruptive events. The results from this development will also address portions of integrated subissue ENG2, Mechanical Disruption of Engineered Barriers, including the acceptance criteria for this subissue defined in Section 2.2.1.3.2.3 of the ''Yucca Mountain Review Plan, Final Report'' (NRC 2003 [DIRS 163274])

  16. Examining Uncertainty in Demand Response Baseline Models and Variability in Automated Response to Dynamic Pricing

    Energy Technology Data Exchange (ETDEWEB)

    Mathieu, Johanna L.; Callaway, Duncan S.; Kiliccote, Sila

    2011-08-15

    Controlling electric loads to deliver power system services presents a number of interesting challenges. For example, changes in electricity consumption of Commercial and Industrial (C&I) facilities are usually estimated using counterfactual baseline models, and model uncertainty makes it difficult to precisely quantify control responsiveness. Moreover, C&I facilities exhibit variability in their response. This paper seeks to understand baseline model error and demand-side variability in responses to open-loop control signals (i.e. dynamic prices). Using a regression-based baseline model, we define several Demand Response (DR) parameters, which characterize changes in electricity use on DR days, and then present a method for computing the error associated with DR parameter estimates. In addition to analyzing the magnitude of DR parameter error, we develop a metric to determine how much observed DR parameter variability is attributable to real event-to-event variability versus simply baseline model error. Using data from 38 C&I facilities that participated in an automated DR program in California, we find that DR parameter errors are large. For most facilities, observed DR parameter variability is likely explained by baseline model error, not real DR parameter variability; however, a number of facilities exhibit real DR parameter variability. In some cases, the aggregate population of C&I facilities exhibits real DR parameter variability, resulting in implications for the system operator with respect to both resource planning and system stability.
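
    The regression-baseline idea is compact enough to sketch: fit a time-of-week plus temperature model on non-event days, predict the counterfactual on event days, and treat baseline minus observed load as the DR parameter. In the synthetic example below no real demand response is injected, so the spread of the estimated DR reflects baseline model error alone, which is exactly the confounding the paper quantifies. All names and numbers are illustrative:

        import numpy as np

        rng = np.random.default_rng(3)

        # Synthetic hourly data: load driven by outdoor temperature and hour of day.
        hours = np.arange(24 * 60)                    # 60 days
        hod = hours % 24
        temp = 20 + 8 * np.sin(2 * np.pi * hours / 24) + rng.standard_normal(hours.size)
        load = (50 + 2.0 * temp + 10 * ((8 <= hod) & (hod < 18))
                + 3 * rng.standard_normal(hours.size))

        # Baseline model: one indicator per hour of day plus a linear temperature term.
        X = np.c_[np.eye(24)[hod], temp]
        dr_days = hours // 24 >= 55                   # last 5 days are "event" days
        beta, *_ = np.linalg.lstsq(X[~dr_days], load[~dr_days], rcond=None)

        baseline = X @ beta
        resid = load[~dr_days] - baseline[~dr_days]
        demand_response = baseline[dr_days] - load[dr_days]   # DR parameter per hour
        print("baseline RMSE (model error):", resid.std().round(2))
        print("mean estimated DR (no real DR present):", demand_response.mean().round(2))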

  17. Automated Generation of Fault Management Artifacts from a Simple System Model

    Science.gov (United States)

    Kennedy, Andrew K.; Day, John C.

    2013-01-01

    Our understanding of off-nominal behavior - failure modes and fault propagation - in complex systems is often based purely on engineering intuition; specific cases are assessed in an ad hoc fashion as a (fallible) fault management engineer sees fit. This work is an attempt to provide a more rigorous approach to this understanding and assessment by automating the creation of a fault management artifact, the Failure Modes and Effects Analysis (FMEA), through querying a representation of the system in a SysML model. This work builds on the previous development of an off-nominal behavior model for the upcoming Soil Moisture Active-Passive (SMAP) mission at the Jet Propulsion Laboratory. We further developed the previous system model to more fully incorporate the ideas of State Analysis, and it was restructured into an organizational hierarchy that models the system as layers of control systems while also incorporating the concept of "design authority". We present software that was developed to traverse the elements and relationships in this model to automatically construct an FMEA spreadsheet. We further discuss extending this model to automatically generate other typical fault management artifacts, such as fault trees, to efficiently portray system behavior and to depend less on the intuition of fault management engineers to ensure complete examination of off-nominal behavior.
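
    The core of such an artifact generator is a traversal of the system model: for each component and failure mode, walk the propagation relationships to enumerate downstream effects, one FMEA row per mode. A toy sketch in which the model structure and all names are invented for illustration (this is not the SMAP SysML model):

        from collections import deque

        # A toy system model: component -> (failure modes, downstream components).
        model = {
            "battery":   {"modes": ["cell short", "depletion"], "feeds": ["power_bus"]},
            "power_bus": {"modes": ["open circuit"], "feeds": ["radar", "computer"]},
            "radar":     {"modes": ["antenna stuck"], "feeds": []},
            "computer":  {"modes": ["watchdog reset"], "feeds": []},
        }

        def effects(component):
            """All downstream components reachable from `component` (fault propagation)."""
            seen, queue = set(), deque(model[component]["feeds"])
            while queue:
                c = queue.popleft()
                if c not in seen:
                    seen.add(c)
                    queue.extend(model[c]["feeds"])
            return sorted(seen)

        def build_fmea():
            """One FMEA row per (component, failure mode) pair."""
            return [(comp, mode, ", ".join(effects(comp)) or "local only")
                    for comp, info in model.items() for mode in info["modes"]]

        for row in build_fmea():
            print("%-10s | %-15s | affects: %s" % row)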

  18. Modelling and representation issues in automated feature extraction from aerial and satellite images

    Science.gov (United States)

    Sowmya, Arcot; Trinder, John

    New digital systems for the processing of photogrammetric and remote sensing images have led to new approaches to information extraction for mapping and Geographic Information System (GIS) applications, with the expectation that data can become more readily available at a lower cost and with greater currency. Demands for mapping and GIS data are increasing as well for environmental assessment and monitoring. Hence, researchers from the fields of photogrammetry and remote sensing, as well as computer vision and artificial intelligence, are bringing together their particular skills for automating these tasks of information extraction. The paper will review some of the approaches used in knowledge representation and modelling for machine vision, and give examples of their applications in research for image understanding of aerial and satellite imagery.

  19. Transcranial Magnetic Stimulation: An Automated Procedure to Obtain Coil-specific Models for Field Calculations

    DEFF Research Database (Denmark)

    Madsen, Kristoffer Hougaard; Ewald, Lars; Siebner, Hartwig R.;

    2015-01-01

    Background: Field calculations for transcranial magnetic stimulation (TMS) are increasingly implemented online in neuronavigation systems and in more realistic offline approaches based on finite-element methods. They are often based on simplified and/or non-validated models of the magnetic vector potential of the TMS coils. Objective: To develop an approach to reconstruct the magnetic vector potential based on automated measurements. Methods: We implemented a setup that simultaneously measures the three components of the magnetic field with high spatial resolution. This is complemented by a novel approach to determine the magnetic vector potential via volume integration of the measured field. Results: The integration approach reproduces the vector potential with very good accuracy. The vector potential distribution of a standard figure-of-eight shaped coil determined with our setup corresponds well...

  20. Modeling and matching of landmarks for automation of Mars Rover localization

    Science.gov (United States)

    Wang, Jue

    The Mars Exploration Rover (MER) mission, begun in January 2004, has been extremely successful. However, decision-making for many operation tasks of the current MER mission and the 1997 Mars Pathfinder mission is performed on Earth through a predominantly manual, time-consuming process. Unmanned planetary rover navigation is ideally expected to reduce rover idle time, diminish the need for entering safe-mode, and dynamically handle opportunistic science events without requiring communication with Earth. Successful automation of rover navigation and localization during extraterrestrial exploration requires that accurate position and attitude information can be received by a rover and that the rover has the support of simultaneous localization and mapping. An integrated approach with Bundle Adjustment (BA) and Visual Odometry (VO) can efficiently refine the rover position. However, during the MER mission, BA is done manually because of the difficulty of automating the cross-site tie point selection. This dissertation proposes an automatic approach to select cross-site tie points from multiple rover sites based on the methods of landmark extraction, landmark modeling, and landmark matching. In the first step of this approach, important landmarks such as craters and rocks are defined. Methods of automatic feature extraction and landmark modeling are then introduced. Complex models with orientation angles and simple models without those angles are compared. The results have shown that simple models can provide reasonably good results. Next, the sensitivity of different modeling parameters is analyzed. Based on this analysis, cross-site rocks are matched in two complementary stages: rock distribution pattern matching and rock model matching. In addition, a preliminary experiment on orbital and ground landmark matching is also briefly introduced. Finally, the reliability of the cross-site tie point selection is validated by fault detection, which

  1. pynoddy 1.0: an experimental platform for automated 3-D kinematic and potential field modelling

    Directory of Open Access Journals (Sweden)

    J. F. Wellmann

    2015-11-01

    Full Text Available We present a novel methodology for performing experiments with subsurface structural models using a set of flexible and extensible Python modules. We utilise the ability of kinematic modelling techniques to describe major deformational, tectonic, and magmatic events at low computational cost to develop experiments testing the interactions between multiple kinematic events, effect of uncertainty regarding event timing, and kinematic properties. These tests are simple to implement and perform, as they are automated within the Python scripting language, allowing the encapsulation of entire kinematic experiments within high-level class definitions and fully reproducible results. In addition, we provide a link to geophysical potential-field simulations to evaluate the effect of parameter uncertainties on maps of gravity and magnetics. We provide relevant fundamental information on kinematic modelling and our implementation, and showcase the application of our novel methods to investigate the interaction of multiple tectonic events on a pre-defined stratigraphy, the effect of changing kinematic parameters on simulated geophysical potential-fields, and the distribution of uncertain areas in a full 3-D kinematic model, based on estimated uncertainties in kinematic input parameters. Additional possibilities for linking kinematic modelling to subsequent process simulations are discussed, as well as additional aspects of future research. Our modules are freely available on github, including documentation and tutorial examples, and we encourage the contribution to this project.

  2. pynoddy 1.0: an experimental platform for automated 3-D kinematic and potential field modelling

    Science.gov (United States)

    Florian Wellmann, J.; Thiele, Sam T.; Lindsay, Mark D.; Jessell, Mark W.

    2016-03-01

    We present a novel methodology for performing experiments with subsurface structural models using a set of flexible and extensible Python modules. We utilize the ability of kinematic modelling techniques to describe major deformational, tectonic, and magmatic events at low computational cost to develop experiments testing the interactions between multiple kinematic events, effect of uncertainty regarding event timing, and kinematic properties. These tests are simple to implement and perform, as they are automated within the Python scripting language, allowing the encapsulation of entire kinematic experiments within high-level class definitions and fully reproducible results. In addition, we provide a link to geophysical potential-field simulations to evaluate the effect of parameter uncertainties on maps of gravity and magnetics. We provide relevant fundamental information on kinematic modelling and our implementation, and showcase the application of our novel methods to investigate the interaction of multiple tectonic events on a pre-defined stratigraphy, the effect of changing kinematic parameters on simulated geophysical potential fields, and the distribution of uncertain areas in a full 3-D kinematic model, based on estimated uncertainties in kinematic input parameters. Additional possibilities for linking kinematic modelling to subsequent process simulations are discussed, as well as additional aspects of future research. Our modules are freely available on github, including documentation and tutorial examples, and we encourage the contribution to this project.

  3. Development and Evaluation of a Model for Modular Automation in Plant Manufacturing

    OpenAIRE

    Uwe Katzke; Katja Fischer; Birgit Vogel-Heuser

    2005-01-01

    The benefit of modular concepts in plant automation is viewed ambivalently. On the one hand modularity offers advantages; on the other hand it also places requirements on the system structure as well as on the discipline of the designer. The main reasons to use modularity in systems design for automation applications in industry are reusability and the reduction of complexity, but up to now modular concepts are rare in plant automation. This paper analyses the reasons and proposes measures and solution concepts. An analysis ...

  4. Fast Model Adaptation for Automated Section Classification in Electronic Medical Records.

    Science.gov (United States)

    Ni, Jian; Delaney, Brian; Florian, Radu

    2015-01-01

    Medical information extraction is the automatic extraction of structured information from electronic medical records, where such information can be used for improving healthcare processes and medical decision making. In this paper, we study one important medical information extraction task called section classification. The objective of section classification is to automatically identify sections in a medical document and classify them into one of the pre-defined section types. Training section classification models typically requires large amounts of human-labeled training data to achieve high accuracy. Annotating institution-specific data, however, can be both expensive and time-consuming, which poses a big hurdle for adapting a section classification model to new medical institutions. In this paper, we apply two advanced machine learning techniques, active learning and distant supervision, to reduce annotation cost and achieve fast model adaptation for automated section classification in electronic medical records. Our experimental results show that active learning reduces the annotation cost and time by more than 50%, and that distant supervision can achieve good model accuracy using only weakly labeled training data. PMID:26262005
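
    A minimal sketch of the uncertainty-sampling loop underlying such active learning, using scikit-learn on synthetic data (a generic stand-in for the paper's classifier and features, with invented sizes):

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression

        X, y = make_classification(n_samples=2000, n_features=20, n_classes=3,
                                   n_informative=5, random_state=0)
        rng = np.random.default_rng(0)
        labeled = list(rng.choice(len(X), size=20, replace=False))   # tiny seed set
        pool = [i for i in range(len(X)) if i not in set(labeled)]

        clf = LogisticRegression(max_iter=1000)
        for round_ in range(5):
            clf.fit(X[labeled], y[labeled])
            proba = clf.predict_proba(X[pool])
            # margin between the two most probable classes: small = uncertain
            srt = np.sort(proba, axis=1)
            margin = srt[:, -1] - srt[:, -2]
            query = np.argsort(margin)[:20]          # least-confident examples first
            for q in sorted(map(int, query), reverse=True):
                labeled.append(pool.pop(q))          # "annotate" the queried examples
            print(f"round {round_}: accuracy={clf.score(X, y):.3f}, labeled={len(labeled)}")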

  5. Sequential Model-Based Parameter Optimization: an Experimental Investigation of Automated and Interactive Approaches

    Science.gov (United States)

    Hutter, Frank; Bartz-Beielstein, Thomas; Hoos, Holger H.; Leyton-Brown, Kevin; Murphy, Kevin P.

    This work experimentally investigates model-based approaches for optimizing the performance of parameterized randomized algorithms. Such approaches build a response surface model and use this model to find good parameter settings for the given algorithm. We evaluated two methods from the literature that are based on Gaussian process models: sequential parameter optimization (SPO) (Bartz-Beielstein et al. 2005) and sequential Kriging optimization (SKO) (Huang et al. 2006). SPO performed better "out of the box," whereas SKO was competitive when response values were log-transformed. We then investigated key design decisions within the SPO paradigm, characterizing the performance consequences of each. Based on these findings, we propose a new version of SPO, dubbed SPO+, which extends SPO with a novel intensification procedure and a log-transformed objective function. In a domain for which performance results for other (model-free) parameter optimization approaches are available, we demonstrate that SPO+ achieves state-of-the-art performance. Finally, we compare this automated parameter tuning approach to an interactive, manual process that makes use of classical
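
    The generic loop behind such sequential model-based optimization can be sketched as follows: fit a Gaussian process to the observed (parameter, performance) pairs, then evaluate the parameter setting with the highest expected improvement. This is a plain EI implementation on a toy noisy objective, not SPO+ itself; the intensification procedure is omitted:

        import numpy as np
        from scipy.stats import norm
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import Matern

        rng = np.random.default_rng(4)

        def runtime(p):
            """Stand-in for the noisy performance of a parameterized algorithm."""
            return (p - 0.3) ** 2 + 0.01 * rng.standard_normal()

        X = np.linspace(0.0, 1.0, 4).reshape(-1, 1)          # initial design
        y = np.array([runtime(p) for p in X.ravel()])
        cand = np.linspace(0.0, 1.0, 201).reshape(-1, 1)
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-4,
                                      normalize_y=True)

        for it in range(8):
            gp.fit(X, y)
            mu, sigma = gp.predict(cand, return_std=True)
            best = y.min()
            z = (best - mu) / np.maximum(sigma, 1e-9)
            ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)  # expected improvement
            p_next = cand[np.argmax(ei)]
            X = np.vstack([X, [p_next]])
            y = np.append(y, runtime(p_next[0]))

        print("best parameter found:", X[np.argmin(y)].item(),
              "cost:", y.min().round(4))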

  6. Lattice gas cellular automaton model for rippling and aggregation in myxobacteria

    Science.gov (United States)

    Alber, Mark S.; Jiang, Yi; Kiskowski, Maria A.

    2004-05-01

    A lattice gas cellular automaton (LGCA) model is used to simulate rippling and aggregation in myxobacteria. An efficient way of representing cells of different cell size, shape and orientation is presented that may be easily extended to model later stages of fruiting body formation. This LGCA model is designed to investigate whether a refractory period, a minimum response time, a maximum oscillation period and non-linear dependence of reversals of cells on C-factor are necessary assumptions for rippling. It is shown that a refractory period of 2-3 min, a minimum response time of up to 1 min and no maximum oscillation period best reproduce rippling in experiments on Myxococcus xanthus. Non-linear dependence of reversals on C-factor is critical at high cell density. Quantitative simulations demonstrate that the increase in wavelength of ripples when a culture is diluted with non-signaling cells can be explained entirely by the decreased density of C-signaling cells. This result further supports the hypothesis that levels of C-signaling quantitatively depend on and modulate cell density. Analysis of the interpenetrating high density waves shows the presence of a phase shift analogous to the phase shift of interpenetrating solitons. Finally, a model for swarming, aggregation and early fruiting body formation is presented.

  7. Semi-Automated Experimental Set-Up for CAD-oriented Low Frequency Noise Modeling of Bipolar Transistors

    OpenAIRE

    Borgarino, M.; Bogoni, A; Fantini, F.; Peroni, M.; Cetronio, A.

    2004-01-01

    The present work addresses the hardware and software development of a semi-automated experimental set-up devoted to the extraction of low frequency noise compact models of bipolar transistors for microwave circuit applications (e.g. oscillators). The obtained experimental setup is applied to GaInP/GaAs Heterojunction Bipolar Transistors.

  8. Fully Automated Design of Super-High-Rise Building Structures by a Hybrid AI Model on a Massively Parallel Machine

    OpenAIRE

    Adeli, Hojjat; Park, H. S.

    1996-01-01

    This article presents an innovative research project (sponsored by the National Science Foundation, the American Iron and Steel Institute, and the American Institute of Steel Construction) where computationally elegant algorithms based on the integration of a novel connectionist computing model, mathematical optimization, and a massively parallel computer architecture are used to automate the complex process of engineering design.

  9. Firedrake-Fluids v0.1: numerical modelling of shallow water flows using a performance-portable automated solution framework

    Directory of Open Access Journals (Sweden)

    C. T. Jacobs

    2014-08-01

    Full Text Available This model description paper introduces a new finite element model for the simulation of non-linear shallow water flows, called Firedrake-Fluids. Unlike traditional models that are written by hand in static, low-level programming languages such as Fortran or C, Firedrake-Fluids uses the Firedrake framework to automatically generate the model's code from a high-level abstract language called UFL. By coupling to the PyOP2 parallel unstructured mesh framework, Firedrake can then target the code in a performance-portable manner towards a desired hardware architecture to enable the efficient parallel execution of the model over an arbitrary computational mesh. The description of the model includes the governing equations, the methods employed to discretise and solve the governing equations, and an outline of the automated solution process. The verification and validation of the model, performed using a set of well-defined test cases, is also presented along with a roadmap for future developments and the solution of more complex fluid dynamical systems.
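
    For readers unfamiliar with the UFL style of model specification, a minimal Firedrake snippet conveys the idea: the weak form is written symbolically and the framework generates and executes the solver code. The sketch below is adapted from Firedrake's standard introductory Helmholtz example, not from Firedrake-Fluids itself, and requires a Firedrake installation:

        from firedrake import (UnitSquareMesh, FunctionSpace, TrialFunction,
                               TestFunction, Function, SpatialCoordinate,
                               inner, grad, dx, pi, cos, solve)

        mesh = UnitSquareMesh(32, 32)
        V = FunctionSpace(mesh, "CG", 1)
        u, v = TrialFunction(V), TestFunction(V)

        # Forcing chosen so the exact solution is cos(2*pi*x) * cos(2*pi*y).
        x, y = SpatialCoordinate(mesh)
        f = (1 + 8 * pi**2) * cos(2 * pi * x) * cos(2 * pi * y)

        # Weak form of the Helmholtz problem, written symbolically in UFL.
        a = (inner(grad(u), grad(v)) + u * v) * dx
        L = f * v * dx

        uh = Function(V)
        solve(a == L, uh)        # code generation and solve happen here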

  10. Semi-automated calibration method for modelling of mountain permafrost evolution in Switzerland

    Science.gov (United States)

    Marmy, A.; Rajczak, J.; Delaloye, R.; Hilbich, C.; Hoelzle, M.; Kotlarski, S.; Lambiel, C.; Noetzli, J.; Phillips, M.; Salzmann, N.; Staub, B.; Hauck, C.

    2015-09-01

    Permafrost is a widespread phenomenon in the European Alps. Many important topics, such as the future evolution of permafrost related to climate change and the detection of permafrost at potential natural hazard sites, are of major concern to our society. Numerical permafrost models are the only tools which facilitate the projection of the future evolution of permafrost. Due to the complexity of the processes involved and the heterogeneity of Alpine terrain, models must be carefully calibrated, and results should be compared with observations at the site (borehole) scale. However, a large number of local point data are necessary to obtain a broad overview of the thermal evolution of mountain permafrost over a larger area, such as the Swiss Alps, and site-specific model calibration of each point would be time-consuming. To address this issue, this paper presents a semi-automated calibration method using Generalized Likelihood Uncertainty Estimation (GLUE) as implemented in a 1-D soil model (CoupModel) and applies it to six permafrost sites in the Swiss Alps prior to long-term permafrost evolution simulations. We show that this automated calibration method is able to accurately reproduce the main thermal characteristics, with some limitations at sites with unique conditions such as 3-D air or water circulation, which have to be calibrated manually. The calibration obtained was used for RCM-based long-term simulations under the A1B climate scenario, specifically downscaled at each borehole site. The projection shows general permafrost degradation with thawing at 10 m, even partially reaching 20 m depth, by the end of the century, but with different timing among the sites. The degradation is more rapid at bedrock sites, whereas ice-rich sites with a blocky surface cover showed a reduced sensitivity to climate change. The snow cover duration is expected to be reduced drastically (between -20 and -37%), impacting the ground thermal regime. However
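
    GLUE itself is simple to sketch: sample many parameter sets, score each with an informal likelihood against observations, and keep the "behavioural" sets above a threshold. The toy model, likelihood measure and threshold below are illustrative assumptions, not CoupModel or the paper's setup:

        import numpy as np

        rng = np.random.default_rng(5)

        def soil_model(params, forcing):
            """Toy stand-in for the 1-D ground-temperature model: two uncertain
            parameters scale and offset the forcing."""
            gain, offset = params
            return gain * forcing + offset

        forcing = np.sin(np.linspace(0, 4 * np.pi, 200))
        observed = soil_model((0.8, -1.5), forcing) + 0.1 * rng.standard_normal(200)

        # GLUE: Monte Carlo sampling of parameter sets, informal likelihood,
        # retention of "behavioural" sets above a threshold.
        samples = rng.uniform([0.1, -3.0], [2.0, 0.0], size=(5000, 2))
        likelihood = np.array([
            1.0 / np.mean((soil_model(p, forcing) - observed) ** 2) for p in samples
        ])
        behavioural = samples[likelihood > np.quantile(likelihood, 0.95)]
        print("behavioural parameter ranges:")
        print("  gain  :", behavioural[:, 0].min().round(2), "-",
              behavioural[:, 0].max().round(2))
        print("  offset:", behavioural[:, 1].min().round(2), "-",
              behavioural[:, 1].max().round(2))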

  11. Automated Verification of Code Generated from Models: Comparing Specifications with Observations

    Science.gov (United States)

    Gerlich, R.; Sigg, D.; Gerlich, R.

    2008-08-01

    The interest in automatic code generation from models is increasing. A specification is expressed as a model, and verification and validation are performed in the application domain. Once the model is formally correct and complete, code can be generated automatically. The general belief is that this code should be correct as well. However, this might not be true: many parameters impact the generation of code and its correctness. It depends on conditions that change from application to application, and the properties of the code depend on the environment where it is executed. From the principles of ISVV (Independent Software Verification and Validation) it must even be doubted that the automatically generated code is correct. Therefore an additional activity is required to prove the correctness of the whole chain from the modelling level down to execution on the target platform. Certification of a code generator is the state-of-the-art approach for dealing with such risks. Scade [1] was the first code generator certified according to DO178B. The certification costs are a significant disadvantage of this approach: all code needs to be analysed manually, and this procedure has to be repeated for recertification after each maintenance step. But certification does not at all guarantee that the generated code complies with the model. Certification is based on compliance of the code of the code generator with given standards. Such compliance can never guarantee correctness of the whole chain through transformation down to the execution environment, though the belief is that certification implies well-formed code at a reduced fault rate. The approach presented here goes in a direction different from manual certification. It is guided by the idea of automated proof: each time code is generated from a model, the properties of the code when executed in its environment are compared with the properties specified in the model. This allows to conclude on the correctness of

  12. Modelling and interpreting biologically crusted dryland soil sub-surface structure using automated micropenetrometry

    Science.gov (United States)

    Hoon, Stephen R.; Felde, Vincent J. M. N. L.; Drahorad, Sylvie L.; Felix-Henningsen, Peter

    2015-04-01

    Soil penetrometers are used routinely to determine the shear strength of soils and deformable sediments both at the surface and throughout a depth profile in disciplines as diverse as soil science, agriculture, geoengineering and alpine avalanche-safety (e.g. Grunwald et al. 2001, Van Herwijnen et al. 2009). Generically, penetrometers comprise two principal components: an advancing probe, and a transducer; the latter to measure the pressure or force required to cause the probe to penetrate or advance through the soil or sediment. The force transducer employed to determine the pressure can range, for example, from a simple mechanical spring gauge to an automatically data-logged electronic transducer. Automated computer control of the penetrometer step size and probe advance rate enables precise measurements to be made down to a resolution of tens of microns (e.g. the automated electronic micropenetrometer (EMP) described by Drahorad 2012). Here we discuss the determination, modelling and interpretation of biologically crusted dryland soil sub-surface structures using automated micropenetrometry. We outline a model enabling the interpretation of depth-dependent penetration resistance (PR) profiles and their spatial differentials using the model equations $\sigma(z) = \sigma_{c0} + \sum_1^n [\sigma_n(z) + a_n z + b_n z^2]$ and $d\sigma/dz = \sum_1^n [d\sigma_n(z)/dz + Fr_n(z)]$, where $\sigma_{c0}$ and $\sigma_n$ are the plastic deformation stresses for the surface and the $n$th soil structure (e.g. soil crust, layer, horizon or void), respectively, and $Fr_n(z)\,dz$ is the frictional work done per unit volume by sliding the penetrometer rod an incremental distance $dz$ through the $n$th layer. Both $\sigma_n(z)$ and $Fr_n(z)$ are related to soil structure. They determine the form of $\sigma(z)$ measured by the EMP transducer. The model enables pores (regions of zero deformation stress) to be distinguished from changes in layer structure or probe friction. We have applied this method to both artificial calibration soils in the
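
    A direct implementation of the profile model above is straightforward; the sketch below evaluates $\sigma(z)$ for a layered profile (friction terms omitted) and flags the pore as the region where only the baseline stress remains. All layer values are invented for illustration:

        import numpy as np

        def pr_profile(z, sigma_c0, layers):
            """Penetration resistance per the model above:
            sigma(z) = sigma_c0 + sum_n [sigma_n(z) + a_n z + b_n z^2],
            with each layer active only within its depth interval."""
            sigma = np.full_like(z, sigma_c0, dtype=float)
            for (z_top, z_bot, sigma_n, a_n, b_n) in layers:
                inside = (z >= z_top) & (z < z_bot)
                sigma[inside] += sigma_n + a_n * z[inside] + b_n * z[inside] ** 2
            return sigma

        z = np.linspace(0, 30e-3, 600)               # 30 mm profile, 50 um steps
        layers = [
            (0.0,    2e-3, 0.8, 40.0, 0.0),          # brittle biological crust
            (2e-3,   9e-3, 0.3, 25.0, 300.0),        # compacted sub-crust layer
            (9e-3,  11e-3, 0.0,  0.0, 0.0),          # pore: zero deformation stress
            (11e-3, 31e-3, 0.4, 15.0, 0.0),          # underlying soil
        ]
        sigma = pr_profile(z, sigma_c0=0.05, layers=layers)
        pores = z[np.isclose(sigma, 0.05)]           # only the baseline stress remains
        print("pore detected between %.1f and %.1f mm"
              % (pores.min() * 1e3, pores.max() * 1e3))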

  13. Automating the Mapping Process of Traditional Malay Textile Knowledge Model with the Core Ontology

    Directory of Open Access Journals (Sweden)

    Syerina A.M. Nasir

    2011-01-01

    Full Text Available Problem statement: The wave of ontology has spread rapidly through the cultural heritage domain. The impact can be seen in the growing number of cultural heritage web information systems, the available textile ontologies, and harmonization work with the core ontology, CIDOC CRM. The aim of this study is to provide a basis for a common view in automating the process of mapping between the revised TMT Knowledge Model and CIDOC CRM. Approach: Manual mapping was conducted to find similar or overlapping concepts, which were aligned to each other in order to achieve ontology similarity. This was done after the TMT Knowledge Model had undergone a transformation process to match the CIDOC CRM structure. Results: Although several problems were encountered during the mapping process, the result gives an immediate view of the classes which were found to be easily mapped between both models. Conclusion/Recommendations: Future research will focus on the construction of a Batik Heritage Ontology by using the mapping results obtained in this study. Further testing, evaluation and refinement using real collections of cultural artifacts within museums will also be conducted in the near future.

  14. Fully Automated Generation of Accurate Digital Surface Models with Sub-Meter Resolution from Satellite Imagery

    Science.gov (United States)

    Wohlfeil, J.; Hirschmüller, H.; Piltz, B.; Börner, A.; Suppa, M.

    2012-07-01

    Modern pixel-wise image matching algorithms like Semi-Global Matching (SGM) are able to compute high resolution digital surface models from airborne and spaceborne stereo imagery. Although image matching itself can be performed automatically, there are prerequisites, like high geometric accuracy, which are essential for ensuring the high quality of resulting surface models. Especially for line cameras, these prerequisites currently require laborious manual interaction using standard tools, which is a growing problem due to continually increasing demand for such surface models. The tedious work includes partly or fully manual selection of tie- and/or ground control points for ensuring the required accuracy of the relative orientation of images for stereo matching. It also includes masking of large water areas that seriously reduce the quality of the results. Furthermore, a good estimate of the depth range is required, since accurate estimates can seriously reduce the processing time for stereo matching. In this paper an approach is presented that allows performing all these steps fully automated. It includes very robust and precise tie point selection, enabling the accurate calculation of the images' relative orientation via bundle adjustment. It is also shown how water masking and elevation range estimation can be performed automatically on the basis of freely available SRTM data. Extensive tests with a large number of different satellite images from QuickBird and WorldView are presented as proof of the robustness and reliability of the proposed method.

  15. Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, J.; Polly, B.; Collis, J.

    2013-09-01

    This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960's-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define 'explicit' input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.

  16. Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, Joseph [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Polly, Ben [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Collis, Jon [Colorado School of Mines, Golden, CO (United States)]

    2013-09-01

    This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960's-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define "explicit" input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.

  17. Model of Stochastic Automaton Asymptotically Optimal Behavior for Inter-budget Regulation

    Directory of Open Access Journals (Sweden)

    Elena D. Streltsova

    2013-01-01

    Full Text Available This paper is focused on the topical issue of inter-budget control in the structure ↔ by applying econometric models. To create the decision-making model, the mathematical tools of the theory of stochastic automata operating in random environments were used. On the basis of this mathematical tool, an adaptive, trainable economic-mathematical model was developed that is able to adapt to the environment, maintained by the income from the payment of federal and regional taxes and fees payable to the budget of the constituent entity of the RF and paid to the budget of a lower level in the form of budget regulation. The authors have developed the structure of the automaton, described its behavior in a random environment, and introduced the expression for the final probabilities of the automaton being in each of its states. The behavioral aspect of the automaton is presented by means of a mathematically rigorous proof of the theorem on the feasibility of its behavior and the asymptotic optimality of the proposed automaton design.

  18. AUTOMATED FORMATION OF CALCULATION MODELS OF TURBOGENERATORS FOR SOFTWARE ENVIRONMENT FEMM

    Directory of Open Access Journals (Sweden)

    V.I. Milykh

    2015-08-01

    Full Text Available Attention is paid to the popular FEMM (Finite Element Method Magnetics) program, which is effective in the numerical calculation of the magnetic fields of electrical machines. The main problem of its use - the high time cost of forming a graphical model representing the design and a physical model representing the material properties and winding currents of the machine - is solved. For this purpose, principles for the automated formation of such models are developed and presented using the example of a turbogenerator. The task is performed by a program written in the algorithmic language Lua, which is integrated into the FEMM package. The program is universal in terms of varying the geometry and dimensions of the designed turbogenerators. It uses a minimum of input information in digital form representing the design of the whole turbogenerator and its fragments. A general structure of the Lua script is provided, along with significant parts of its text, the graphic results of the work's phases, as well as explanations of the program and instructions for its use. The performance capabilities of the compiled Lua script are shown on the example of a real 340 MW turbogenerator.
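
    The idea of scripting the model formation can be conveyed by a small sketch that emits FEMM Lua commands from a parametric description. The geometry below is a drastically simplified placeholder (three concentric circles), not the paper's turbogenerator template, and the parameter names are invented:

        # Parametric description of a (drastically simplified) machine cross-section.
        design = {
            "stator_outer_radius": 1.20,   # metres
            "stator_bore_radius":  0.63,
            "rotor_radius":        0.55,
        }

        def lua_circle(r):
            """Emit FEMM Lua commands drawing a full circle of radius r as two arcs."""
            return [
                f"mi_addnode({-r}, 0)",
                f"mi_addnode({r}, 0)",
                f"mi_addarc({-r}, 0, {r}, 0, 180, 1)",
                f"mi_addarc({r}, 0, {-r}, 0, 180, 1)",
            ]

        def build_script(d):
            lines = [
                "-- auto-generated FEMM model script",
                "newdocument(0)                      -- 0 = magnetics problem",
                'mi_probdef(0, "meters", "planar", 1e-8)',
            ]
            for key in ("stator_outer_radius", "stator_bore_radius", "rotor_radius"):
                lines += lua_circle(d[key])
            lines += ["mi_addblocklabel(0, 0)", 'mi_saveas("turbo_model.fem")']
            return "\n".join(lines)

        with open("turbo_model.lua", "w") as fh:
            fh.write(build_script(design))
        print(build_script(design))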

  19. World-wide distribution automation systems

    Energy Technology Data Exchange (ETDEWEB)

    Devaney, T.M.

    1994-12-31

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include distribution management systems; substation, feeder, and customer functions; potential benefits; automation costs; planning and engineering considerations; automation trends; databases; system operation; and computer modeling of the system.

  20. Modeling and simulation of control system for electron beam machine (EBM) using programmable automation controller (PAC)

    International Nuclear Information System (INIS)

    An EBM electronic model is designed to simulate the control system of the Nissin EBM, located at Block 43 of the MINT complex at Jalan Dengkil, with a maximum output of 3 MeV, 30 mA, using a Programmable Automation Controller (PAC). This model operates like a real EBM system, where all the start-up, interlocking and stopping procedures are fully followed. It also involves formulating mathematical models to relate certain outputs to the input parameters, using data from actual operation of the EB machine. The simulation involves a set of PAC systems consisting of digital and analogue input/output modules. The program code is written using LabVIEW software (real-time version) on a PC and then downloaded into the PAC's stand-alone memory. All 23 interlocking signals required by the EB machine are manually controlled by mechanical switches and represented by LEDs. The EB parameters are manually controlled by potentiometers and displayed on analogue and digital meters. All these signals are then interfaced to the PC via a Wi-Fi wireless link built into the PAC controller. The program is developed in accordance with the specifications and requirements of the original real EB system and displays them on the panel of the model and also on the PC monitor. All possible failures arising from human error and from hardware and software malfunctions, including worst-case conditions, will be tested, evaluated and corrected. We hope that the performance of our model complies with the requirements for operating the EB machine. It is also hoped that this electronic model can replace the original PC interfacing utilized in the Nissin EBM in the near future. The system can also be used to study fault tolerance analysis and automatic re-configuration for advanced control of the EB system. (Author)

  1. Using Visual Specifications in Verification of Industrial Automation Controllers

    Directory of Open Access Journals (Sweden)

    Bouzon Gustavo

    2008-01-01

    Full Text Available Abstract This paper deals with further development of a graphical specification language resembling timing diagrams and allowing specification of partially ordered events in input and output signals. The language specifically aims at application in modular modelling of industrial automation systems and their formal verification via model-checking. The graphical specifications are translated into a model which is connected with the original model under study.

  2. What determines the take-over time? An integrated model approach of driver take-over after automated driving.

    Science.gov (United States)

    Zeeb, Kathrin; Buchner, Axel; Schrauf, Michael

    2015-05-01

    In recent years the automation level of driver assistance systems has increased continuously. One of the major challenges for highly automated driving is to ensure a safe driver take-over of vehicle guidance. This must be ensured especially when the driver is engaged in non-driving-related secondary tasks. For this purpose it is essential to find indicators of the driver's readiness to take over and to gain more knowledge about the take-over process in general. A simulator study was conducted to explore how drivers' allocation of visual attention during highly automated driving influences a take-over action in response to an emergency situation. We therefore recorded drivers' gaze behavior during automated driving while they simultaneously engaged in a visually demanding secondary task, and measured their reaction times in a take-over situation. According to their gaze behavior the drivers were categorized as "high", "medium" or "low-risk". The gaze parameters were found to be suitable for predicting the readiness to take over the vehicle, in that high-risk drivers reacted late and more often inappropriately in the take-over situation. However, there was no difference among the driver groups in the time required to establish motor readiness to intervene after the take-over request. An integrated model approach of driver behavior in emergency take-over situations during automated driving is presented. It is argued that primarily cognitive, not motor, processes determine the take-over time. Given this, insights can be derived for further research and the development of automated systems. PMID:25794922

  3. EBS Radionuclide Transport Abstraction

    Energy Technology Data Exchange (ETDEWEB)

    J.D. Schreiber

    2005-08-25

    The purpose of this report is to develop and analyze the engineered barrier system (EBS) radionuclide transport abstraction model, consistent with Level I and Level II model validation, as identified in "Technical Work Plan for: Near-Field Environment and Transport: Engineered Barrier System: Radionuclide Transport Abstraction Model Report Integration" (BSC 2005 [DIRS 173617]). The EBS radionuclide transport abstraction (or EBS RT Abstraction) is the conceptual model used in the total system performance assessment for the license application (TSPA-LA) to determine the rate of radionuclide releases from the EBS to the unsaturated zone (UZ). The EBS RT Abstraction conceptual model consists of two main components: a flow model and a transport model. Both models are developed mathematically from first principles in order to show explicitly what assumptions, simplifications, and approximations are incorporated into the models used in the TSPA-LA. The flow model defines the pathways for water flow in the EBS and specifies how the flow rate is computed in each pathway. Input to this model includes the seepage flux into a drift. The seepage flux is potentially split by the drip shield, with some (or all) of the flux being diverted by the drip shield and some passing through breaches in the drip shield that might result from corrosion or seismic damage. The flux through drip shield breaches is potentially split by the waste package, with some (or all) of the flux being diverted by the waste package and some passing through waste package breaches that might result from corrosion or seismic damage. Neither the drip shield nor the waste package survives an igneous intrusion, so the flux splitting submodel is not used in the igneous scenario class. The flow model is validated in an independent model validation technical review. The drip shield and waste package flux splitting algorithms are developed and validated using experimental data. The transport
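
    The flux-splitting logic described above (seepage partially diverted by the drip shield, then by the waste package) reduces to a simple partition of flows. A hedged sketch follows, with illustrative breach fractions rather than TSPA-LA values:

```python
def ebs_flow_split(seepage_flux, ds_breach_frac, wp_breach_frac):
    """Partition seepage flux through drip shield and waste package
    breaches; the remainder is diverted around each barrier.
    Breach fractions are illustrative placeholders, not TSPA values."""
    through_ds = seepage_flux * ds_breach_frac
    diverted_by_ds = seepage_flux - through_ds
    through_wp = through_ds * wp_breach_frac
    diverted_by_wp = through_ds - through_wp
    return {"through_drip_shield": through_ds,
            "diverted_by_drip_shield": diverted_by_ds,
            "through_waste_package": through_wp,
            "diverted_by_waste_package": diverted_by_wp}

print(ebs_flow_split(seepage_flux=10.0,   # m^3/yr into the drift
                     ds_breach_frac=0.05, wp_breach_frac=0.10))
```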

  4. An Automated BIM Model to Conceptually Design, Analyze, Simulate, and Assess Sustainable Building Projects

    Directory of Open Access Journals (Sweden)

    Farzad Jalaei

    2014-01-01

    Full Text Available Quantifying the environmental impacts and simulating the energy consumption of a building's components at the conceptual design stage are very helpful for designers needing to make decisions related to the selection of the best design alternative, one that would lead to a more energy-efficient building. Building Information Modeling (BIM) offers designers the ability to assess different design alternatives at the conceptual stage of the project so that energy and life cycle assessment (LCA) strategies and systems are attained. This paper proposes an automated model that links BIM, LCA, energy analysis, and lighting simulation tools with green building certification systems. The implementation involves developing plug-ins for a BIM tool that are capable of measuring the environmental impacts (EI) and embodied energy of building components. Using this method, designers will be provided with a new way to visualize and to identify the potential gain or loss of energy for the building as a whole and for each of its associated components. Furthermore, designers will be able to detect and evaluate the sustainability of the proposed buildings based on the Leadership in Energy and Environmental Design (LEED) rating system. An actual building project will be used to illustrate the workability of the proposed methodology.

  5. Aircraft wing structural design optimization based on automated finite element modelling and ground structure approach

    Science.gov (United States)

    Yang, Weizhu; Yue, Zhufeng; Li, Lei; Wang, Peiyan

    2016-01-01

    An optimization procedure combining an automated finite element modelling (AFEM) technique with a ground structure approach (GSA) is proposed for structural layout and sizing design of aircraft wings. The AFEM technique, based on CATIA VBA scripting and PCL programming, is used to generate models automatically considering the arrangement of inner systems. GSA is used for local structural topology optimization. The design procedure is applied to a high-aspect-ratio wing. The arrangement of the integral fuel tank, landing gear and control surfaces is considered. For the landing gear region, a non-conventional initial structural layout is adopted. The positions of components, the number of ribs and local topology in the wing box and landing gear region are optimized to obtain a minimum structural weight. Constraints include tank volume, strength, buckling and aeroelastic parameters. The results show that the combined approach leads to a greater weight saving, i.e. 26.5%, compared with three additional optimizations based on individual design approaches.

  6. An automated method for generating analogic signals that embody the Markov kinetics of model ionic channels.

    Science.gov (United States)

    Luchian, Tudor

    2005-08-30

    In this work we present an automated method for generating electrical signals which reflect the kinetics of ionic channels that have custom-tailored intermediate sub-states and intermediate reaction constants. The concept of our virtual single-channel waveform generator makes use of two software platforms, one for the numerical generation of single-channel traces stemming from a pre-defined model and another for the digital-to-analog conversion of such numerically generated single-channel traces. This technique of continuous generation and recording of the activity of a model ionic channel provides an efficient protocol to teach neophytes in the field of single-channel electrophysiology about its major phenomenological facets. Random analogic signals generated using our technique can be successfully employed in a number of applications, such as assisted learning of single-molecule kinetic investigation via electrical recordings, impedance spectroscopy, evaluation of the linear frequency response of neurons, and the study of stochastic resonance of ion channels. PMID:16054511
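
    The core of such a generator is a continuous-time Markov simulation of channel gating. A minimal two-state (closed/open) sketch with illustrative rate constants, omitting the digital-to-analog stage:

```python
import numpy as np

def two_state_channel_trace(k_open=50.0, k_close=200.0, i_open=2.0,
                            duration=1.0, dt=1e-4, seed=0):
    """Generate a single-channel current trace for a two-state
    (closed <-> open) Markov model by sampling exponential dwell
    times; rates in 1/s, current in pA. Parameters are illustrative."""
    rng = np.random.default_rng(seed)
    n = int(duration / dt)
    trace = np.zeros(n)
    t, state = 0.0, 0                   # 0 = closed, 1 = open
    while t < duration:
        rate = k_open if state == 0 else k_close
        dwell = rng.exponential(1.0 / rate)
        i0, i1 = int(t / dt), min(int((t + dwell) / dt), n)
        trace[i0:i1] = i_open if state == 1 else 0.0
        t += dwell
        state = 1 - state
    return trace

trace = two_state_channel_trace()
# Expect open probability near k_open / (k_open + k_close) = 0.2.
print("open probability ~", trace.mean() / 2.0)
```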

  7. ABSTRACTION OF DRIFT SEEPAGE

    International Nuclear Information System (INIS)

    Drift seepage refers to flow of liquid water into repository emplacement drifts, where it can potentially contribute to degradation of the engineered systems and release and transport of radionuclides within the drifts. Because of these important effects, seepage into emplacement drifts is listed as a "principal factor for the postclosure safety case" in the screening criteria for grading of data in Attachment 1 of AP-3.15Q, Rev. 2, "Managing Technical Product Inputs". Abstraction refers to distillation of the essential components of a process model into a form suitable for use in total-system performance assessment (TSPA). Thus, the purpose of this analysis/model is to put the information generated by the seepage process modeling in a form appropriate for use in the TSPA for the Site Recommendation. This report also supports the Unsaturated-Zone Flow and Transport Process Model Report. The scope of the work is discussed below. This analysis/model is governed by the "Technical Work Plan for Unsaturated Zone Flow and Transport Process Model Report" (CRWMS M&O 2000a). Details of this activity are in Addendum A of the technical work plan. The original Work Direction and Planning Document is included as Attachment 7 of Addendum A. Note that the Work Direction and Planning Document contains tasks identified for both Performance Assessment Operations (PAO) and Natural Environment Program Operations (NEPO). Only the PAO tasks are documented here. The planning for the NEPO activities is now in Addendum D of the same technical work plan and the work is documented in a separate report (CRWMS M&O 2000b). The Project has been reorganized since the document was written. The responsible organizations in the new structure are the Performance Assessment Department and the Unsaturated Zone Department, respectively. The work plan for the seepage abstraction calls for determining an appropriate abstraction methodology, determining uncertainties in seepage, and providing

  8. Automation of the Jarrell--Ash model 70-314 emission spectrometer

    International Nuclear Information System (INIS)

    Automation of the Jarrell-Ash 3.4-Meter Ebert direct-reading emission spectrometer with digital scaler readout is described. The readout is interfaced to a Data General NOVA 840 minicomputer. The automation code consists of BASIC language programs for interactive routines, data processing, and report generation. Call statements within the BASIC programs invoke assembly language routines for real-time data acquisition and control. In addition, the automation objectives as well as the spectrometer-computer system functions, coding, and operating instructions are presented

  9. Semi-Automated Object-Based Classification of Coral Reef Habitat using Discrete Choice Models

    Directory of Open Access Journals (Sweden)

    Steven Saul

    2015-11-01

    Full Text Available As for terrestrial remote sensing, pixel-based classifiers have traditionally been used to map coral reef habitats. For pixel-based classifiers, habitat assignment is based on the spectral or textural properties of each individual pixel in the scene. More recently, however, object-based classifications, those based on information from a set of contiguous pixels with similar properties, have found favor with the reef mapping community and are starting to be extensively deployed. Object-based classifiers have an advantage over pixel-based in that they are less compromised by the inevitable inhomogeneity in per-pixel spectral response caused, primarily, by variations in water depth. One aspect of the object-based classification workflow is the assignment of each image object to a habitat class on the basis of its spectral, textural, or geometric properties. While a skilled image interpreter can achieve this task accurately through manual editing, full or partial automation is desirable for large-scale reef mapping projects of the magnitude which are useful for marine spatial planning. To this end, this paper trials the use of multinomial logistic discrete choice models to classify coral reef habitats identified through object-based segmentation of satellite imagery. Our results suggest that these models can attain assignment accuracies of about 85%, while also reducing the time needed to produce the map, as compared to manual methods. Limitations of this approach include misclassification of image objects at the interface between some habitat types due to the soft gradation in nature between habitats, the robustness of the segmentation algorithm used, and the selection of a strong training dataset. Finally, due to the probabilistic nature of multinomial logistic models, the analyst can estimate a map of uncertainty associated with the habitat classifications. Quantifying uncertainty is important to the end-user when developing marine spatial
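
    As a rough sketch of the assignment step, a multinomial logistic model can be fit to labeled image objects and then used to classify new objects and map uncertainty. The snippet below uses scikit-learn's multinomial logistic regression as a stand-in for the paper's discrete choice model, with entirely synthetic features:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-object features (e.g., mean band reflectances, texture)
# and habitat labels from a training subset of segmented image objects.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(300, 5))
y_train = rng.integers(0, 4, size=300)          # 4 habitat classes

model = LogisticRegression(multi_class="multinomial", max_iter=1000)
model.fit(X_train, y_train)

X_objects = rng.normal(size=(10, 5))            # unlabeled image objects
labels = model.predict(X_objects)
probs = model.predict_proba(X_objects)          # per-class probabilities

# The probabilistic output supports an uncertainty map, e.g. 1 - max prob.
uncertainty = 1.0 - probs.max(axis=1)
print(labels, uncertainty.round(2))
```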

  10. Programme and abstracts

    International Nuclear Information System (INIS)

    Abstracts of 25 papers presented at the congress are given. The abstracts cover various topics including radiotherapy, radiopharmaceuticals, radioimmunoassay, health physics, radiation protection and nuclear medicine

  11. Piaget on Abstraction.

    Science.gov (United States)

    Moessinger, Pierre; Poulin-Dubois, Diane

    1981-01-01

    Reviews and discusses Piaget's recent work on abstract reasoning. Piaget's distinction between empirical and reflective abstraction is presented; his hypotheses are considered to be metaphorical. (Author/DB)

  12. An Analytic Model for Design of a Multivehicle Automated Guided Vehicle System

    OpenAIRE

    Eric Johnson, M; Brandeau, Margaret L.

    1993-01-01

    We consider the problem of designing a multivehicle automated guided vehicle system (AGVS) to supplement an existing nonautomated material handling system. The AGVS consists of a pool of vehicles that deliver raw components from a central storage area to workcenters throughout the factory floor. The objective is to determine which workcenters warrant automated component delivery and the number of vehicles required to service those workcenters, to maximize the benefit of the AGVS, subject to a ...

  13. Acceptance of Automated Road Transport Systems (ARTS): An adaptation of the UTAUT model

    OpenAIRE

    Madigan, Ruth; Louw, Tyron; Dziennus, Marc; Graindorge, Tatiana; Ortega, Erik; Graindorge, Matthieu; Merat, Natasha

    2016-01-01

    As research into innovative forms of automated transportation systems gains momentum, it is important that we develop an understanding of the factors that will impact the adoption of these systems. In an effort to address this issue, the European project CityMobil2 is collecting data around large-scale demonstrations of Automated Road Transport Systems (ARTS) in a number of cities across Europe. For these systems to be successful, user acceptance is vital. The current study used the Unified T...

  14. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  15. Setup time reduction: SMED-balancing integrated model for manufacturing systems with automated transfer

    Directory of Open Access Journals (Sweden)

    Maurizio Faccio

    2013-10-01

    Full Text Available The importance of short setup times is increasing in every type of industry. It has been known how to address this problem for about 20 years. The SMED method, originally developed by the Japanese industrial engineer Shigeo Shingo for reducing the time to exchange dies, gives a really straightforward approach to improve existing setups. On the other hand, in the case of complex manufacturing systems the simple application of the SMED methodology is not enough. Manufacturing systems composed of different working machines with automated transfer facilities are a good example. Technological constraints, task precedence constraints, and synchronization between different setup tasks are just some of the influencing factors that make an improved SMED desirable. The present paper, starting from an industrial case, aims to provide a heuristic methodology that integrates the traditional SMED with the workload balancing problem that is typical of assembly systems, in order to address the setup reduction problem in the case of complex manufacturing systems. An industrial case is reported to validate the proposed model and to demonstrate its practical implications.

  16. A new method for automated discontinuity trace mapping on rock mass 3D surface model

    Science.gov (United States)

    Li, Xiaojun; Chen, Jianqin; Zhu, Hehua

    2016-04-01

    This paper presents an automated discontinuity trace mapping method on a 3D surface model of rock mass. Feature points of discontinuity traces are first detected using the Normal Tensor Voting Theory, which is robust to noisy point cloud data. Discontinuity traces are then extracted from feature points in four steps: (1) trace feature point grouping, (2) trace segment growth, (3) trace segment connection, and (4) redundant trace segment removal. A sensitivity analysis is conducted to identify optimal values for the parameters used in the proposed method. The optimal triangular mesh element size is between 5 cm and 6 cm; the angle threshold in the trace segment growth step is between 70° and 90°; the angle threshold in the trace segment connection step is between 50° and 70°, and the distance threshold should be at least 15 times the mean triangular mesh element size. The method is applied to the excavation face trace mapping of a drill-and-blast tunnel. The results show that the proposed discontinuity trace mapping method is fast and effective and could be used as a supplement to traditional direct measurement of discontinuity traces.
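
    The trace segment growth step can be illustrated by a greedy sketch over detected feature points; the thresholds and data below are hypothetical, and the real method operates on 3D triangulated surfaces rather than 2D points:

```python
import numpy as np

def grow_trace(points, seed_idx, dist_thresh, angle_thresh_deg):
    """Greedy trace-segment growth over detected feature points:
    extend from the current endpoint to the nearest unused point,
    accepting it only if it is close enough and the direction change
    stays below the angle threshold. A simplified stand-in for the
    paper's four-step extraction."""
    used = {seed_idx}
    trace = [seed_idx]
    direction = None
    while True:
        tip = points[trace[-1]]
        cands = [i for i in range(len(points)) if i not in used]
        if not cands:
            break
        dists = [np.linalg.norm(points[i] - tip) for i in cands]
        j = cands[int(np.argmin(dists))]
        step = points[j] - tip
        if np.linalg.norm(step) > dist_thresh:
            break
        if direction is not None:
            cosang = step @ direction / (np.linalg.norm(step) *
                                         np.linalg.norm(direction))
            angle = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
            if angle > angle_thresh_deg:
                break
        direction = step
        used.add(j)
        trace.append(j)
    return trace

pts = np.array([[0, 0], [1, 0.1], [2, 0.15], [3, 0.1], [3.2, 2.0]])
print(grow_trace(pts, 0, dist_thresh=1.5, angle_thresh_deg=70))
```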

  17. A model-free, fully automated baseline-removal method for Raman spectra.

    Science.gov (United States)

    Schulze, H Georg; Foist, Rod B; Okuda, Kadek; Ivanov, André; Turner, Robin F B

    2011-01-01

    We present here a fully automated spectral baseline-removal procedure. The method uses a large-window moving average to estimate the baseline; thus, it is a model-free approach with a peak-stripping method to remove spectral peaks. After processing, the baseline-corrected spectrum should yield a flat baseline, and this endpoint can be verified with the χ²-statistic. The approach provides for multiple passes or iterations, based on a given χ²-statistic for convergence. If the baseline is acceptably flat given the χ²-statistic after the first pass at correction, the problem is solved. If not, the non-flat baseline (i.e., after the first effort or first pass at correction) should provide an indication of where the first pass caused too much or too little baseline to be subtracted. The second pass thus permits one to compensate for the errors incurred on the first pass. Thus, one can use a very large window so as to avoid affecting spectral peaks (even if the window is so large that the baseline is inaccurately removed), because baseline-correction errors can be assessed and compensated for on subsequent passes. We start with the largest possible window and gradually reduce it until acceptable baseline correction based on the χ²-statistic is achieved. Results, obtained on both simulated and measured Raman data, are presented and discussed. PMID:21211157
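
    A simplified, single-window version of the peak-stripping idea can be sketched as follows; the published procedure additionally shrinks the window across passes and tests flatness with the χ²-statistic:

```python
import numpy as np

def moving_average(y, window):
    """Large-window moving average with edge padding."""
    pad = window // 2
    ypad = np.pad(y, pad, mode="edge")
    kernel = np.ones(window) / window
    return np.convolve(ypad, kernel, mode="same")[pad:len(y) + pad]

def strip_baseline(spectrum, window=301, max_iter=50, tol=1e-6):
    """Iterative peak stripping: smooth the current estimate with a
    large-window moving average and clip the spectrum down to it, so
    peaks are progressively removed and the baseline remains."""
    baseline = spectrum.astype(float).copy()
    for _ in range(max_iter):
        smoothed = moving_average(baseline, window)
        clipped = np.minimum(baseline, smoothed)
        if np.max(np.abs(clipped - baseline)) < tol:
            break
        baseline = clipped
    return spectrum - baseline, baseline

# Synthetic Raman-like spectrum: two peaks on a curved baseline.
x = np.linspace(0, 1000, 1000)
true_base = 1e-4 * (x - 500) ** 2 + 50
peaks = 80 * np.exp(-(x - 300) ** 2 / 50) + 120 * np.exp(-(x - 700) ** 2 / 80)
corrected, est_base = strip_baseline(true_base + peaks)
print("mean baseline error:", np.abs(est_base - true_base).mean().round(2))
```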

  18. Thermal Edge-Effects Model for Automated Tape Placement of Thermoplastic Composites

    Science.gov (United States)

    Costen, Robert C.

    2000-01-01

    Two-dimensional thermal models for automated tape placement (ATP) of thermoplastic composites neglect the diffusive heat transport that occurs between the newly placed tape and the cool substrate beside it. Such lateral transport can cool the tape edges prematurely and weaken the bond. The three-dimensional, steady-state thermal transport equation is solved by the Green's function method for a tape of finite width being placed on an infinitely wide substrate. The isotherm for the glass transition temperature on the weld interface is used to determine the distance inward from the tape edge that is prematurely cooled, called the cooling incursion Δa. For the Langley ATP robot, Δa = 0.4 mm for a unidirectional lay-up of PEEK/carbon fiber composite, and Δa = 1.2 mm for an isotropic lay-up. A formula for Δa is developed and applied to a wide range of operating conditions. A surprise finding is that Δa need not decrease as the Peclet number Pe becomes very large, where Pe is the dimensionless ratio of inertial to diffusive heat transport. Conformable rollers that increase the consolidation length would also increase Δa, unless other changes are made, such as proportionally increasing the material speed. To compensate for premature edge cooling, the thermal input could be extended past the tape edges by the amount Δa. This method should help achieve uniform weld strength and crystallinity across the width of the tape.

  19. Automated parameter estimation for biological models using Bayesian statistical model checking

    OpenAIRE

    Hussain, Faraz; Langmead, Christopher J.; Mi, Qi; Dutta-Moscato, Joyeeta; Vodovotz, Yoram; Jha, Sumit K.

    2015-01-01

    Background Probabilistic models have gained widespread acceptance in the systems biology community as a useful way to represent complex biological systems. Such models are developed using existing knowledge of the structure and dynamics of the system, experimental observations, and inferences drawn from statistical analysis of empirical data. A key bottleneck in building such models is that some system variables cannot be measured experimentally. These variables are incorporated into the mode...

  20. Forecasting Macroeconomic Variables using Neural Network Models and Three Automated Model Selection Techniques

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    In this paper we consider the forecasting performance of a well-defined class of flexible models, the so-called single hidden-layer feedforward neural network models. A major aim of our study is to find out whether they, due to their flexibility, are as useful tools in economic forecasting as some...... the neural network model is not appropriate if the data is generated by a linear mechanism. Hence, it might be appropriate to test the null of linearity prior to building a nonlinear model. We investigate whether this kind of pretesting improves the forecast accuracy compared to the case where this is...

  1. Model-Driven Development of Automation and Control Applications: Modeling and Simulation of Control Sequences

    Directory of Open Access Journals (Sweden)

    Timo Vepsäläinen

    2014-01-01

    Full Text Available The scope and responsibilities of control applications are increasing due to, for example, the emergence of the industrial internet. To meet the challenge, model-driven development techniques have been under active research in the application domain. Simulations that have been traditionally used in the domain, however, have not yet been sufficiently integrated into model-driven control application development. In this paper, a model-driven development process that includes support for design-time simulations is complemented with support for simulating sequential control functions. The approach is implemented with open source tools and demonstrated by creating and simulating a control system model in closed-loop with a large and complex model of a paper industry process.

  2. Accurate, precise modeling of cell proliferation kinetics from time-lapse imaging and automated image analysis of agar yeast culture arrays

    Directory of Open Access Journals (Sweden)

    Zhao Lue

    2007-01-01

    Full Text Available Abstract Background Genome-wide mutant strain collections have increased demand for high throughput cellular phenotyping (HTCP. For example, investigators use HTCP to investigate interactions between gene deletion mutations and additional chemical or genetic perturbations by assessing differences in cell proliferation among the collection of 5000 S. cerevisiae gene deletion strains. Such studies have thus far been predominantly qualitative, using agar cell arrays to subjectively score growth differences. Quantitative systems level analysis of gene interactions would be enabled by more precise HTCP methods, such as kinetic analysis of cell proliferation in liquid culture by optical density. However, requirements for processing liquid cultures make them relatively cumbersome and low throughput compared to agar. To improve HTCP performance and advance capabilities for quantifying interactions, YeastXtract software was developed for automated analysis of cell array images. Results YeastXtract software was developed for kinetic growth curve analysis of spotted agar cultures. The accuracy and precision for image analysis of agar culture arrays was comparable to OD measurements of liquid cultures. Using YeastXtract, image intensity vs. biomass of spot cultures was linearly correlated over two orders of magnitude. Thus cell proliferation could be measured over about seven generations, including four to five generations of relatively constant exponential phase growth. Spot area normalization reduced the variation in measurements of total growth efficiency. A growth model, based on the logistic function, increased precision and accuracy of maximum specific rate measurements, compared to empirical methods. The logistic function model was also more robust against data sparseness, meaning that less data was required to obtain accurate, precise, quantitative growth phenotypes. Conclusion Microbial cultures spotted onto agar media are widely used for genotype
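
    The logistic growth model mentioned in the results can be fit to a spot-intensity time course with standard tools; below is a minimal sketch with synthetic data, not YeastXtract's actual implementation:

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic growth: carrying capacity K, maximum specific rate r,
    time of inflection t0."""
    return K / (1.0 + np.exp(-r * (t - t0)))

# Hypothetical spot-intensity time course from time-lapse images of an
# agar culture array (arbitrary biomass units).
t = np.linspace(0, 48, 25)                       # hours
rng = np.random.default_rng(3)
y = logistic(t, K=1.0, r=0.35, t0=20) + rng.normal(0, 0.02, t.size)

(K, r, t0), _ = curve_fit(logistic, t, y, p0=[y.max(), 0.1, t.mean()])
print(f"K={K:.2f}, max specific rate r={r:.3f}/h, inflection at {t0:.1f} h")
```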

  3. Evidence evaluation in fingerprint comparison and automated fingerprint identification systems--Modeling between finger variability.

    Science.gov (United States)

    Egli Anthonioz, N M; Champod, C

    2014-02-01

    In the context of the investigation of the use of automated fingerprint identification systems (AFIS) for the evaluation of fingerprint evidence, the current study presents investigations into the variability of scores from an AFIS system when fingermarks from a known donor are compared to fingerprints that are not from the same source. The ultimate goal is to propose a model, based on likelihood ratios, which allows the evaluation of mark-to-print comparisons. In particular, this model, through its use of AFIS technology, benefits from the possibility of using a large amount of data, as well as from an already built-in proximity measure, the AFIS score. More precisely, the numerator of the LR is obtained from scores issued from comparisons between impressions from the same source and showing the same minutia configuration. The denominator of the LR is obtained by extracting scores from comparisons of the questioned mark with a database of non-matching sources. This paper focuses solely on the assignment of the denominator of the LR. We refer to it by the generic term of between-finger variability. The issues addressed in this paper in relation to between-finger variability are the required sample size, the influence of the finger number and general pattern, as well as that of the number of minutiae included and their configuration on a given finger. Results show that reliable estimation of between-finger variability is feasible with 10,000 scores. These scores should come from the appropriate finger number/general pattern combination as defined by the mark. Furthermore, strategies of obtaining between-finger variability when these elements cannot be conclusively seen on the mark (and its position with respect to other marks for finger number) have been presented. These results immediately allow case-by-case estimation of the between-finger variability in an operational setting. PMID:24447455
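
    Schematically, once same-source and different-source score samples are in hand, the LR at a given AFIS score is a ratio of estimated densities. A hedged sketch with synthetic score distributions (the real study draws the denominator scores from the appropriate finger number/general pattern combination):

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(7)

# Hypothetical AFIS scores: same-source comparisons (numerator data)
# and 10,000 comparisons of the mark against non-matching sources
# (the between-finger variability addressed in the paper).
same_source_scores = rng.normal(3500, 400, size=500)
diff_source_scores = rng.normal(1200, 300, size=10_000)

f_same = gaussian_kde(same_source_scores)   # within-source density
f_diff = gaussian_kde(diff_source_scores)   # between-finger density

def likelihood_ratio(score):
    """LR = P(score | same source) / P(score | different source)."""
    return float(f_same(score) / f_diff(score))

print("LR at score 2500:", round(likelihood_ratio(2500.0), 2))
```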

  4. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I briefly present a real-time house automation research project, designed and implemented in both software and hardware, capable of automating a house's electricity and providing a security system to detect the presence of unexpected behavior.

  5. An automated system to simulate the River discharge in Kyushu Island using the H08 model

    Science.gov (United States)

    Maji, A.; Jeon, J.; Seto, S.

    2015-12-01

    Kyushu Island is located in the southwestern part of Japan, and it is often affected by typhoons and a Baiu front. Severe water-related disasters have been recorded on Kyushu Island. On the other hand, because of the high population density and the needs of crop growth, water resources are an important issue for Kyushu Island. The simulation of river discharge is important for water resource management and early warning of water-related disasters. This study attempts to apply the H08 model to simulate river discharge on Kyushu Island. Geospatial meteorological and topographical data were obtained from the Japanese Ministry of Land, Infrastructure, Transport and Tourism (MLIT) and the Automated Meteorological Data Acquisition System (AMeDAS) of the Japan Meteorological Agency (JMA). The number of AMeDAS observation stations is limited and not quite satisfactory for the application of water resources models in Kyushu, so it is necessary to spatially interpolate the point data to produce a gridded dataset. The meteorological grid dataset is produced by considering elevation dependence. Solar radiation is estimated from hourly sunshine duration by a conventional formula. We successfully improved the accuracy of the interpolated data simply by considering elevation dependence and found that the bias is related to geographical location. The rain/snow classification is done by the H08 model and is validated by comparing estimated and observed snow rates; the estimates tend to be larger than the corresponding observed values. A system to automatically produce a daily meteorological grid dataset is being constructed. The geospatial river network data were produced with ArcGIS and utilized in the H08 model to simulate river discharge. Firstly, this research compares simulated and measured specific discharge, which is the ratio of discharge to watershed area; significant errors between simulated and measured data were seen in some rivers. Secondly, the outputs by the coupled model including crop growth
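
    One common way to "consider elevation dependence" when gridding station temperatures is to reduce observations to sea level with a constant lapse rate, interpolate horizontally, and then restore each grid cell's elevation. Whether the study used exactly this scheme is not stated, so the sketch below is illustrative:

```python
import numpy as np

LAPSE_RATE = -0.0065   # K per m; an assumed constant lapse rate

def interpolate_temperature(stations, grid_xy, grid_z, power=2.0):
    """Inverse-distance interpolation of station temperatures with an
    elevation correction: reduce temperatures to sea level, interpolate
    horizontally, then lift back to each grid cell's elevation.
    stations: array of rows (x, y, z, T)."""
    xy, z, temp = stations[:, :2], stations[:, 2], stations[:, 3]
    t_sea = temp - LAPSE_RATE * z             # reduce to sea level
    out = np.empty(len(grid_xy))
    for i, p in enumerate(grid_xy):
        d = np.linalg.norm(xy - p, axis=1)
        w = 1.0 / np.maximum(d, 1e-6) ** power
        out[i] = np.sum(w * t_sea) / np.sum(w)
    return out + LAPSE_RATE * grid_z          # lift to grid elevation

stations = np.array([[0, 0, 10, 16.0], [10, 0, 200, 14.5], [0, 10, 800, 10.2]])
grid_xy = np.array([[5.0, 5.0], [2.0, 8.0]])
grid_z = np.array([400.0, 650.0])
print(interpolate_temperature(stations, grid_xy, grid_z))
```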

  6. Automating Supplier Selection Procedures

    OpenAIRE

    Davidrajuh, Reggie

    2001-01-01

    This dissertation describes a methodology, tools, and implementation techniques of automating supplier selection procedures of a small and medium-sized agile virtual enterprise. Firstly, a modeling approach is devised that can be used to model the supplier selection procedures of an enterprise. This modeling approach divides the supplier selection procedures broadly into three stages, the pre-selection, selection, and post-selection stages. Secondly, a methodology is presented for automating ...

  7. Automated quantification of carotid artery stenosis on contrast-enhanced MRA data using a deformable vascular tube model

    OpenAIRE

    Suinesiaputra, Avan; de Koning, Patrick J. H.; Zudilova-Seinstra, Elena; Reiber, Johan H.C.; van der Geest, Rob J.

    2011-01-01

    The purpose of this study was to develop and validate a method for automated segmentation of the carotid artery lumen from volumetric MR Angiographic (MRA) images using a deformable tubular 3D Non-Uniform Rational B-Splines (NURBS) model. A flexible 3D tubular NURBS model was designed to delineate the carotid arterial lumen. User interaction was allowed to guide the model by placement of forbidden areas. Contrast-enhanced MRA (CE-MRA) from 21 patients with carotid atherosclerotic disease were...

  8. Automation based on knowledge modeling theory and its applications in engine diagnostic systems using Space Shuttle Main Engine vibrational data. M.S. Thesis

    Science.gov (United States)

    Kim, Jonnathan H.

    1995-01-01

    Humans can perform many complicated tasks without explicit rules. This inherent and advantageous capability becomes a hurdle when a task is to be automated. Modern computers and numerical calculations require explicit rules and discrete numerical values. In order to bridge the gap between human knowledge and automating tools, a knowledge model is proposed. Knowledge modeling techniques are discussed and utilized to automate a labor and time intensive task of detecting anomalous bearing wear patterns in the Space Shuttle Main Engine (SSME) High Pressure Oxygen Turbopump (HPOTP).

  9. A novel automated behavioral test battery assessing cognitive rigidity in two genetic mouse models of autism.

    Directory of Open Access Journals (Sweden)

    Alicja Puścian

    2014-04-01

    Full Text Available Repetitive behaviors are a key feature of many pervasive developmental disorders, such as autism. As a heterogeneous group of symptoms, repetitive behaviors are conceptualized into two main subgroups: sensory/motor (lower-order) and cognitive rigidity (higher-order). Although lower-order repetitive behaviors are measured in mouse models in several paradigms, so far there have been no high-throughput tests directly measuring cognitive rigidity. We describe a novel approach for monitoring repetitive behaviors during reversal learning in mice in the automated IntelliCage system. During reward-motivated place preference reversal learning, designed to assess cognitive abilities of mice, visits to the previously rewarded places were recorded to measure cognitive flexibility. Thereafter, emotional flexibility was assessed by measuring conditioned fear extinction. Additionally, to look for neuronal correlates of cognitive impairments, we measured CA3-CA1 hippocampal long-term potentiation (LTP). To standardize the designed tests we used C57BL/6 and BALB/c mice, representing two genetic backgrounds, with autism induced by prenatal exposure to sodium valproate. We found impairments of place learning related to perseveration and no LTP impairments in C57BL/6 valproate-treated mice. In contrast, BALB/c valproate-treated mice displayed severe deficits of place learning not associated with perseverative behaviors and accompanied by hippocampal LTP impairments. Alterations of cognitive flexibility observed in C57BL/6 valproate-treated mice were related neither to a restricted exploration pattern nor to emotional flexibility. Altogether, we showed that the designed tests of cognitive performance and perseverative behaviors are efficient and highly replicable. Moreover, the results suggest that genetic background is crucial for the behavioral effects of prenatal valproate treatment.

  10. Automated modeling of ecosystem CO2 fluxes based on closed chamber measurements: A standardized conceptual and practical approach

    Science.gov (United States)

    Hoffmann, Mathias; Jurisch, Nicole; Albiac Borraz, Elisa; Hagemann, Ulrike; Sommer, Michael; Augustin, Jürgen

    2015-04-01

    Closed chamber measurements are widely used for determining the CO2 exchange of small-scale or heterogeneous ecosystems. Alongside the chamber design and operational handling, the data processing procedure is a considerable source of uncertainty in the obtained results. We developed a standardized automatic data processing algorithm, based on the language and statistical computing environment R, to (i) calculate measured CO2 flux rates, (ii) parameterize ecosystem respiration (Reco) and gross primary production (GPP) models, (iii) optionally compute an adaptive temperature model, (iv) model Reco, GPP and net ecosystem exchange (NEE), and (v) evaluate model uncertainty (calibration, validation and uncertainty prediction). The algorithm was tested for different manual and automatic chamber measurement systems (e.g., automated NEE chambers and the LI-8100A soil CO2 flux system) and ecosystems. Our study shows that even minor changes within the modelling approach may result in considerable differences in calculated flux rates, derived photosynthetically active radiation and temperature dependencies, and the subsequently modelled Reco, GPP and NEE balance of up to 25%. Thus, certain modelling implications are given, since automated and standardized data processing procedures based on clearly defined criteria, such as statistical parameters and thresholds, are a prerequisite and highly desirable to guarantee the reproducibility and traceability of modelling results and to encourage better comparability between closed chamber based CO2 measurements.
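
    Step (i), computing a flux rate from a single chamber closure, typically amounts to fitting the concentration time series and converting the slope with the ideal gas law. A minimal sketch follows; the cited R algorithm adds quality filters and the model parameterization steps:

```python
import numpy as np

R = 8.314  # J mol^-1 K^-1

def chamber_co2_flux(t_s, co2_ppm, vol_m3, area_m2, temp_k, press_pa=101325):
    """CO2 flux from a closed-chamber measurement: linear fit of the
    concentration time series, with the slope converted from ppm/s to
    micromol m^-2 s^-1 via the ideal gas law."""
    slope_ppm_s = np.polyfit(t_s, co2_ppm, 1)[0]
    mol_air = press_pa * vol_m3 / (R * temp_k)       # moles of enclosed air
    return slope_ppm_s * 1e-6 * mol_air / area_m2 * 1e6  # micromol m^-2 s^-1

# Hypothetical 3-minute chamber closure, concentration rising ~0.5 ppm/s.
t = np.arange(0, 180, 10.0)
c = 400 + 0.5 * t + np.random.default_rng(5).normal(0, 1.0, t.size)
print(chamber_co2_flux(t, c, vol_m3=0.125, area_m2=0.25, temp_k=293.15))
```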

  11. Towards Automation 2.0: A Neurocognitive Model for Environment Recognition, Decision-Making, and Action Execution

    Directory of Open Access Journals (Sweden)

    Zucker Gerhard

    2011-01-01

    Full Text Available The ongoing penetration of building automation by information technology is far from saturated. Today's systems need not only be reliable and fault tolerant, they also have to regard energy efficiency and flexibility in the overall consumption. Meeting the quality and comfort goals in building automation while at the same time optimizing towards energy, carbon footprint and cost-efficiency requires systems that are able to handle large amounts of information and negotiate system behaviour that resolves conflicting demands: a decision-making process. In recent years, research has started to focus on bionic principles for designing new concepts in this area. The information processing principles of the human mind have turned out to be of particular interest, as the mind is capable of processing huge amounts of sensory data and taking adequate decisions for (re-)actions based on these analysed data. In this paper, we discuss how a bionic approach can solve the upcoming problems of energy-optimal systems. A recently developed model for environment recognition and decision-making processes, based on research findings from different disciplines of brain research, is introduced. This model is the foundation for applications in intelligent building automation that have to deal with information from home and office environments. All of these applications have in common that they consist of a combination of communicating nodes and have many, partly contradicting goals.

  12. Automated service quality and its behavioural consequences in CRM Environment: A structural equation modeling and causal loop diagramming approach

    Directory of Open Access Journals (Sweden)

    Arup Kumar Baksi

    2012-08-01

    Full Text Available Information technology induced communications (ICTs) have revolutionized the operational aspects of the service sector and have triggered a perceptual shift in service quality, as rapid dis-intermediation has changed the access mode of services on the part of consumers. ICT-enabled services have further stimulated the perception of automated service quality with renewed dimensions and their subsequent significance in influencing the behavioural outcomes of consumers. Customer Relationship Management (CRM) has emerged as an offshoot of this technological breakthrough, as it ensures service encapsulation by integrating people, process and technology. This paper attempts to explore the relationship between automated service quality and its behavioural consequences in a relatively novel business philosophy - CRM. The study was conducted at the largest public sector bank of India, State Bank of India (SBI), at Kolkata, which successfully completed its decade-long operational automation in the year 2008. The study used structural equation modeling (SEM) to justify the proposed model construct and causal loop diagramming (CLD) to depict the negative and positive linkages between the variables.

  13. Verification of AADL Models with Timed Abstract State Machines

    Institute of Scientific and Technical Information of China (English)

    Yang Zhibin; Hu Kai; Zhao Yongwang; Ma Dianfu; Jean-Paul BODEVEIX

    2015-01-01

    This paper presents a formal verification method for AADL (architecture analysis and design language) models by translation to TASM (timed abstract state machines). The abstract syntax of the chosen subset of AADL and of TASM are given, and the translation rules are defined formally by semantic functions expressed in an ML-like language. On this basis, the AADL model verification and analysis tool AADL2TASM, which provides model checking and simulation for AADL models, was designed and implemented on top of the open source AADL modeling environment OSATE (open source AADL tool environment). Finally, a case study of a spacecraft guidance, navigation and control (GNC) system is provided.

  14. Ecosystem models. Volume 3. November, 1977--October, 1978 (a bibliography with abstracts). Report for Nov 1977--Oct 1978

    International Nuclear Information System (INIS)

    The preparation and use of ecosystem models are covered in this bibliography of Federally-funded research. Models for marine biology, wildlife, plants, water pollution, microorganisms, food chains, radioactive substances, limnology, and diseases as related to ecosystems are included

  15. Ecosystem models. Volume 2. November, 1975--1977 (a bibliography with abstracts). Report for Nov 1975--Nov 1977

    International Nuclear Information System (INIS)

    The preparation and use of ecosystem models are covered in this bibliography of Federally-funded research. Models for marine biology, wildlife, plants, water pollution, microorganisms, food chains, radioactive substances, limnology, and diseases as related to ecosystems are included

  16. Evidence evaluation in fingerprint comparison and automated fingerprint identification systems--modelling within finger variability.

    Science.gov (United States)

    Egli, Nicole M; Champod, Christophe; Margot, Pierre

    2007-04-11

    Recent challenges and errors in fingerprint identification have highlighted the need for assessing the information content of a papillary pattern in a systematic way. In particular, estimation of the statistical uncertainty associated with this type of evidence is more and more called upon. The approach used in the present study is based on the assessment of likelihood ratios (LRs). This evaluative tool weighs the likelihood of the evidence given two mutually exclusive hypotheses. The computation of likelihood ratios on a database of marks of known sources (matching the unknown mark and non-matching the unknown mark) allows an estimation of the evidential contribution of fingerprint evidence. LRs are computed taking advantage of the scores obtained from an automated fingerprint identification system and hence are based exclusively on level II features (minutiae). The AFIS system attributes a score to any comparison (fingerprint to fingerprint, mark to mark and mark to fingerprint), used here as a proximity measure between the respective arrangements of minutiae. The numerator of the LR addresses the within-finger variability and is obtained by comparing the same configurations of minutiae coming from the same source. Only comparisons where the same minutiae are visible both on the mark and on the print are therefore taken into account. The denominator of the LR is obtained by cross-comparison with a database of prints originating from non-matching sources. The estimation of the numerator of the LR is much more complex in terms of specific data requirements than the estimation of the denominator (which requires only a large database of prints from a non-associated population). Hence this paper addresses specific issues associated with the numerator, or within-finger variability. This study aims at answering the following questions: (1) how a database for modelling within-finger variability should be acquired; (2) whether or not the visualisation technique or the

  17. Enhanced Automated Canopy Characterization from Hyperspectral Data by a Novel Two Step Radiative Transfer Model Inversion Approach

    Directory of Open Access Journals (Sweden)

    Wolfgang Wagner

    2009-11-01

    Full Text Available Automated, image based methods for the retrieval of vegetation biophysical and biochemical variables are often hampered by the lack of a priori knowledge about land cover and phenology, which makes the retrieval a highly underdetermined problem. This study addresses this problem by presenting a novel approach, called CRASh, for the concurrent retrieval of leaf area index, leaf chlorophyll content, leaf water content and leaf dry matter content from high resolution solar reflective earth observation data. CRASh, which is based on the inversion of the combined PROSPECT+SAILh radiative transfer model (RTM), explores the benefits of combining semi-empirical and physically based approaches. The approach exploits novel ways to address the underdetermined problem in the context of an automated retrieval from mono-temporal high resolution data. To regularize the inverse problem in the variable domain, RTM inversion is coupled with an automated land cover classification. Model inversion is based on a two step lookup table (LUT) approach: First, a range of possible solutions is selected from a previously calculated LUT based on the analogy between measured and simulated reflectance. The final solution is determined from this subset by minimizing the difference between the variables used to simulate the spectra contained in the reduced LUT and a first guess of the solution. This first guess of the variables is derived from predictive semi-empirical relationships between classical vegetation indices and the single variables. Additional spectral regularization is obtained by the use of hyperspectral data. Results show that estimates obtained with CRASh are significantly more accurate than those obtained with a tested conventional RTM inversion and semi-empirical approach. Accuracies obtained in this study are comparable to the results obtained by various authors for better constrained inversions that assume more a priori information. The completely automated
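
    The two-step lookup-table inversion can be sketched compactly: a spectral-distance preselection followed by selection against a first guess from vegetation-index regressions. Everything below (LUT size, variables, data) is synthetic, not CRASh's actual configuration:

```python
import numpy as np

def two_step_lut_inversion(lut_spectra, lut_params, measured, first_guess,
                           subset_frac=0.05):
    """Two-step LUT inversion: (1) keep the fraction of LUT entries whose
    simulated spectra are closest to the measurement (spectral RMSE);
    (2) from that subset, return the entry whose variables are closest
    to a first guess derived from vegetation-index regressions."""
    rmse = np.sqrt(np.mean((lut_spectra - measured) ** 2, axis=1))
    n_keep = max(1, int(subset_frac * len(rmse)))
    subset = np.argsort(rmse)[:n_keep]
    # Normalize variables so distances are comparable across units.
    scale = lut_params.std(axis=0) + 1e-12
    d = np.linalg.norm((lut_params[subset] - first_guess) / scale, axis=1)
    return lut_params[subset][np.argmin(d)]

# Hypothetical LUT: 5000 simulated spectra (50 bands) for 4 variables
# (LAI, chlorophyll, water, dry matter), plus one measured spectrum.
rng = np.random.default_rng(11)
lut_params = rng.uniform([0.1, 10, 0.005, 0.002], [6, 80, 0.05, 0.02], (5000, 4))
lut_spectra = rng.normal(size=(5000, 50))        # stand-in for RTM output
measured = lut_spectra[123] + rng.normal(0, 0.05, 50)
first_guess = lut_params[123] * 1.1              # assumed VI-based prior
print(two_step_lut_inversion(lut_spectra, lut_params, measured, first_guess))
```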

  18. Abstractions on test design techniques

    OpenAIRE

    Wendland, Marc-Florian

    2014-01-01

    Automated test design is an approach to test design in which automata are utilized for generating test artifacts such as test cases and test data from a formal test basis, most often called test model. A test generator operates on such a test model to meet a certain test coverage goal. In the plethora of the approaches, tools and standards for model-based test design, the test design techniques to be applied and test coverage goals to be met are not part of the test model, which may easily le...

  19. Feasibility of rapid and automated importation of 3D echocardiographic left ventricular (LV) geometry into a finite element (FEM) analysis model

    Directory of Open Access Journals (Sweden)

    Nathan Nadia S

    2004-10-01

    Full Text Available Abstract Background Finite element method (FEM) analysis for intraoperative modeling of the left ventricle (LV) is presently not possible. Since 3D structural data of the LV are now obtainable using standard transesophageal echocardiography (TEE) devices intraoperatively, the present study describes a method to transfer these data into a commercially available FEM analysis system: ABAQUS©. Methods In this prospective study TomTec LV Analysis TEE© Software was used for semi-automatic endocardial border detection, reconstruction, and volume-rendering of the clinical 3D echocardiographic data. A newly developed software program, MVCP FemCoGen©, written in Delphi, reformats the TomTec file structures in five patients for use in ABAQUS and allows visualization of regional deformation of the LV. Results This study demonstrates that a fully automated importation of 3D TEE data into FEM modeling is feasible and can be efficiently accomplished in the operating room. Conclusion For complete intraoperative 3D LV finite element analysis, three input elements are necessary: 1. time-gated, reality-based structural information, 2. continuous LV pressure and 3. instantaneous tissue elastance. The first of these elements is now available using the methods presented herein.

  20. Program and abstracts

    International Nuclear Information System (INIS)

    Abstracts of the papers given at the conference are presented. The abstracts are arranged under sessions entitled: Theoretical Physics; Nuclear Physics; Solid State Physics; Spectroscopy; Physics Education; SANCGASS; Astronomy; Plasma Physics; Physics in Industry; Applied and General Physics

  1. Solving the AI Planning Plus Scheduling Problem Using Model Checking via Automatic Translation from the Abstract Plan Preparation Language (APPL) to the Symbolic Analysis Laboratory (SAL)

    Science.gov (United States)

    Butler, Ricky W.; Munoz, Cesar A.; Siminiceanu, Radu I.

    2007-01-01

    This paper describes a translator from a new planning language named the Abstract Plan Preparation Language (APPL) to the Symbolic Analysis Laboratory (SAL) model checker. This translator has been developed in support of the Spacecraft Autonomy for Vehicles and Habitats (SAVH) project sponsored by the Exploration Technology Development Program, which is seeking to mature autonomy technology for the vehicles and operations centers of Project Constellation.

  2. Introduction to abstract algebra

    CERN Document Server

    Nicholson, W Keith

    2012-01-01

    Praise for the Third Edition ". . . an expository masterpiece of the highest didactic value that has gained additional attractivity through the various improvements . . ."-Zentralblatt MATH The Fourth Edition of Introduction to Abstract Algebra continues to provide an accessible approach to the basic structures of abstract algebra: groups, rings, and fields. The book's unique presentation helps readers advance to abstract theory by presenting concrete examples of induction, number theory, integers modulo n, and permutations before the abstract structures are defined. Readers can immediately be

  3. The MineTool Software Suite: A Novel Data Mining Palette of Tools for Automated Modeling of Space Physics Data

    Science.gov (United States)

    Sipes, T.; Karimabadi, H.; Roberts, A.

    2009-12-01

    We present a new data mining software tool called MineTool for analysis and modeling of space physics data. MineTool is a graphical user interface implementation that merges two data mining algorithms into an easy-to-use software tool: an algorithm for analysis and modeling of static data [Karimabadi et al, 2007] and MineTool-TS, an algorithm for data mining of time series data [Karimabadi et al, 2009]. By virtue of automating the modeling process and model evaluations, MineTool makes data mining and predictive modeling more accessible to non-experts. The software is entirely in Java and is freeware. By ranking all inputs as predictors of the outcome before constructing a model, MineTool also enables inclusion of only the relevant variables. The technique aggregates the various stages of model building into a four-step process consisting of (i) data segmentation and sampling, (ii) variable pre-selection and transform generation, (iii) predictive model estimation and validation, and (iv) final model selection. Optimal strategies are chosen for each modeling step. A notable feature of the technique is that the final model is always in closed analytical form rather than the "black box" form characteristic of some other techniques. Having the analytical model enables deciphering the importance of various variables in affecting the outcome. The MineTool suite also provides capabilities for data preparation for data mining as well as visualization of the datasets. MineTool has successfully been used to develop models for automated detection of flux transfer events (FTEs) at Earth's magnetopause in the Cluster spacecraft time series data and for 3D magnetopause modeling. In this presentation, we demonstrate the ease of use of the software through examples, including how it was used in the FTE problem.

  4. AUTOMATED GIS WATERSHED ANALYSIS TOOLS FOR RUSLE/SEDMOD SOIL EROSION AND SEDIMENTATION MODELING

    Science.gov (United States)

    A comprehensive procedure for computing soil erosion and sediment delivery metrics has been developed using a suite of automated Arc Macro Language (AML) scripts and a pair of processing-intensive ANSI C++ executable programs operating on an ESRI ArcGIS 8.x Workstation platform...

  5. Automated Brain Structure Segmentation Based on Atlas Registration and Appearance Models

    DEFF Research Database (Denmark)

    van der Lijn, Fedde; de Bruijne, Marleen; Klein, Stefan;

    2012-01-01

    Accurate automated brain structure segmentation methods facilitate the analysis of large-scale neuroimaging studies. This work describes a novel method for brain structure segmentation in magnetic resonance images that combines information about a structure’s location and appearance. The spatial...

  6. An automated and simple method for brain MR image extraction

    OpenAIRE

    Zhu Zixin; Liu Jiafeng; Zhang Haiyan; Li Haiyun

    2011-01-01

    Abstract Background The extraction of brain tissue from magnetic resonance head images is an important image processing step for the analysis of neuroimage data. The authors have developed an automated and simple brain extraction method using an improved geometric active contour model. Methods The method uses an improved geometric active contour model which not only solves the boundary leakage problem but also is less sensitive to intensity inhomogeneity. The method defines the initial fu...

  7. Models, methods and software for distributed knowledge acquisition for the automated construction of integrated expert systems knowledge bases

    International Nuclear Information System (INIS)

    Based on an analysis of existing models, methods and means of acquiring knowledge, a base method of automated knowledge acquisition has been chosen. On the basis of this method, a new approach to integrating information acquired from knowledge sources of different typologies has been proposed, and the concept of distributed knowledge acquisition, aimed at the computerized formation of the most complete and consistent models of problem areas, has been introduced. An original algorithm for distributed knowledge acquisition from databases, based on the construction of binary decision trees, has been developed
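
    The abstract names binary decision trees as the core of the database acquisition step. A minimal stand-in for that idea is sketched below: fit a tree over database records and read each root-to-leaf path off as a candidate rule for the knowledge base. The scikit-learn tree is an assumption for illustration, not the authors' original algorithm.

    ```python
    # Turn database records into human-readable candidate rules via a binary tree.
    from sklearn.tree import DecisionTreeClassifier, export_text

    def acquire_rules(records, outcomes, feature_names):
        tree = DecisionTreeClassifier(max_depth=4).fit(records, outcomes)
        # Each printed root-to-leaf path is a candidate rule for the knowledge base.
        return export_text(tree, feature_names=feature_names)
    ```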

  8. Abstracts and program proceedings of the 1994 meeting of the International Society for Ecological Modelling North American Chapter

    Energy Technology Data Exchange (ETDEWEB)

    Kercher, J.R.

    1994-06-01

    This document contains information about the 1994 meeting of the International Society for Ecological Modelling North American Chapter. The topics discussed include: extinction risk assessment modelling, ecological risk analysis of uranium mining, impacts of pesticides, demography, habitats, atmospheric deposition, and climate change.

  9. (abstract) A Test of the Theoretical Models of Bipolar Outflows: The Bipolar Outflow in Mon R2

    Science.gov (United States)

    Xie, Taoling; Goldsmith, Paul; Patel, Nimesh

    1993-01-01

    We report some results of a study of the massive bipolar outflow in the central region of the relatively nearby giant molecular cloud Monoceros R2. We make a quantitative comparison of our results with the Shu et al. outflow model, which incorporates a radially directed wind sweeping up the ambient material into a shell. We find that this simple model naturally explains the shape of this thin shell. Although Shu's model in its simplest form predicts, with reasonable parameters, too much mass at very small polar angles, as previously pointed out by Masson and Chernin, it provides a reasonably good fit to the mass distribution at larger polar angles. It is possible that this discrepancy is due to inhomogeneities of the ambient molecular gas, which are not considered by the model. We also discuss the constraints imposed by these results on recent jet-driven outflow models.

  10. Work Practice Simulation of Complex Human-Automation Systems in Safety Critical Situations: The Brahms Generalized Überlingen Model

    Science.gov (United States)

    Clancey, William J.; Linde, Charlotte; Seah, Chin; Shafto, Michael

    2013-01-01

    The transition from the current air traffic system to the next generation air traffic system will require the introduction of new automated systems, including transferring some functions from air traffic controllers to on-board automation. This report describes a new design verification and validation (V&V) methodology for assessing aviation safety. The approach involves a detailed computer simulation of work practices that includes people interacting with flight-critical systems. The research is part of an effort to develop new modeling and verification methodologies that can assess the safety of flight-critical systems, system configurations, and operational concepts. The 2002 Überlingen mid-air collision was chosen for analysis and modeling because one of the main causes of the accident was one crew's response to a conflict between the instructions of the air traffic controller and the instructions of TCAS, an automated on-board Traffic Alert and Collision Avoidance System. It thus furnishes an example of the problem of authority versus autonomy. It provides a starting point for exploring authority/autonomy conflict in the larger system of organization, tools, and practices in which the participants' moment-by-moment actions take place. We have developed a general air traffic system model (not a specific simulation of the Überlingen events), called the Brahms Generalized Ueberlingen Model (Brahms-GUeM). Brahms is a multi-agent simulation system that models people, tools, facilities/vehicles, and geography to simulate the current air transportation system as a collection of distributed, interactive subsystems (e.g., airports, air-traffic control towers and personnel, aircraft, automated flight systems and air-traffic tools, instruments, crew). Brahms-GUeM can be configured in different ways, called scenarios, such that anomalous events that contributed to the Überlingen accident can be modeled as functioning according to requirements or in an

  11. Automation of measurement of heights waves around a model ship; Mokeisen mawari no hako keisoku no jidoka

    Energy Technology Data Exchange (ETDEWEB)

    Ikehata, M.; Kato, M.; Yanagida, F. [Yokohama National University, Yokohama (Japan). Faculty of Engineering

    1997-10-01

    Trial fabrication and tests were performed on an instrument to automate the measurement of the heights of waves around a model ship. The currently used electric wave height measuring instrument takes a long time for measurement and is therefore inefficient, and the method of processing optical images also has accuracy problems. Therefore, a computer-controlled system was constructed using AC servo motors to drive the X and Y axes of a traverse equipment. To automate the wave height measurement, four servo-type wave height meters were installed on a rack moving in the lateral (Y-axial) direction, so that four wave heights can be measured automatically all at once. Wave heights can also be measured continuously by moving the rack at a constant speed, verifying that wave shapes in longitudinal cross sections can be acquired with only one towing. The time required for measurements using the instrument was 40 hours of net time for fixed-point measurement and 12 hours for continuous measurement, or 52 hours in total. By contrast, fixed-point measurement may take 240 hours when the conventional all-point manual traverse equipment is used, so automating the instrument yielded enormous savings. Collection of wave height data will continue also on tankers and other types of ships. 2 refs., 8 figs., 1 tab.

  12. Modal abstractions of concurrent behavior

    DEFF Research Database (Denmark)

    Nielson, Flemming; Nanz, Sebastian; Nielson, Hanne Riis

    2011-01-01

    We present an effective algorithm for the automatic construction of finite modal transition systems as abstractions of potentially infinite concurrent processes. Modal transition systems are recognized as valuable abstractions for model checking because they allow for the validation as well as … supports the definition of a 3-valued modal logic for validating as well as refuting properties of systems. The construction is illustrated on a few examples, including the Ingemarsson-Tang-Wong key agreement protocol. © 2011 ACM.

  13. Designing Negotiating Agent for Automated Negotiations

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Traditional research in automated negotiation is focused on negotiation protocol and strategy. This paper studies automated negotiation from a new point of view, proposes a novel concept, namely the negotiating agent, and discusses its significance in the construction of automated negotiation systems, with an abstract model formally described and an architecture designed that supports both goal-directed reasoning and reactive response. A communication model was proposed to construct the interaction mechanism used by negotiating agents, in which the negotiation language used by the agents is defined. The communication model and the language are defined in a way general enough to support a wide variety of market mechanisms, thus being particularly suitable for flexible applications such as electronic business. The design and expression of the negotiation ontology is also discussed. On the basis of the theoretical model of the negotiating agent, the negotiating agent architecture and the negotiating agent communication model (NACM) are explicit and formal specifications for agents negotiating in an E-business environment; in particular, NACM defines the negotiation language template shared among all agents formally and explicitly. The novelty of the communication model is twofold.

  14. Automated ground-based remote sensing measurements of greenhouse gases at the Białystok site in comparison with collocated in-situ measurements and model data

    Directory of Open Access Journals (Sweden)

    J. Messerschmidt

    2011-12-01

    The fully automated observatory for total greenhouse gas (GHG) column measurements introduced here complements the in-situ facilities at the Białystok site in Poland. With the automated Fourier Transform Spectrometer (FTS), solar absorption measurements have been recorded nearly continuously since March 2009. This article describes the basics of the automation system, including the hardware components and the automation software. Furthermore, the first comparisons of the FTS dataset with the collocated in-situ measurements and with the Jena CO2 inversion model are presented. The model reproduces monthly variations in the total CO2 column, and the seasonal amplitude is in good agreement with the FTS measurements.

  15. A Synthesis of Decision Models for Tool Management in Automated Manufacturing

    OpenAIRE

    Ann E. Gray; Abraham (Avi) Seidmann; Kathryn E. Stecke

    1993-01-01

    The evidence is clear that a lack of attention to structured tool management has resulted in the poor performance of many manufacturing systems. Plant tooling systems affect product design options, machine loading, job batching, capacity scheduling, and real-time part routing decisions. With increasing automation in manufacturing systems, there is a growing need to integrate tool management more thoroughly into system design, planning and control. This paper critically evaluates various tool ...

  16. NeuroGPS: automated localization of neurons for brain circuits using L1 minimization model

    OpenAIRE

    Quan, Tingwei; Zheng, Ting; Yang, Zhongqing; Ding, Wenxiang; Li, Shiwei; Li, Jing; Zhou, Hang; Luo, Qingming; Gong, Hui; Zeng, Shaoqun

    2013-01-01

    Drawing the map of neuronal circuits at microscopic resolution is important for explaining how the brain works. Recent progress in fluorescence labeling and imaging techniques has enabled measuring the whole brain of a rodent such as a mouse at submicron resolution. Considering the huge volume of such datasets, automatically tracing and reconstructing the neuronal connections from the image stacks is essential for forming large-scale circuits. However, the first step among these, automated localization of the soma...

  17. Automated data evaluation and modelling of simultaneous (19)F-(1)H medium-resolution NMR spectra for online reaction monitoring.

    Science.gov (United States)

    Zientek, Nicolai; Laurain, Clément; Meyer, Klas; Paul, Andrea; Engel, Dirk; Guthausen, Gisela; Kraume, Matthias; Maiwald, Michael

    2016-06-01

    Medium-resolution nuclear magnetic resonance spectroscopy (MR-NMR) is currently developing into an important analytical tool for both quality control and process monitoring. In contrast to high-resolution online NMR (HR-NMR), MR-NMR can be operated under rough environmental conditions. A continuous re-circulating stream of reaction mixture from the reaction vessel to the NMR spectrometer enables a non-invasive, volume-integrating online analysis of reactants and products. Here, we investigate the esterification of 2,2,2-trifluoroethanol with acetic acid to 2,2,2-trifluoroethyl acetate both by (1)H HR-NMR (500 MHz) and by (1)H and (19)F MR-NMR (43 MHz) as a model system. The parallel online measurement is realised by splitting the flow, which allows the adjustment of quantitative and independent flow rates in both the HR-NMR probe and the MR-NMR probe, in addition to a fast bypass line back to the reactor. One of the fundamental acceptance criteria for online MR-NMR spectroscopy is a robust data treatment and evaluation strategy with the potential for automation. The MR-NMR spectra are treated by an automated baseline and phase correction using the minimum entropy method. The evaluation strategies comprise (i) direct integration, (ii) automated line fitting, (iii) indirect hard modelling (IHM) and (iv) partial least squares regression (PLS-R). To assess the potential of these evaluation strategies for MR-NMR, prediction results are compared with the line fitting data derived from quantitative HR-NMR spectroscopy. Although superior results are obtained from both IHM and PLS-R for (1)H MR-NMR, the latter especially demands elaborate data pretreatment, whereas the IHM models needed no prior alignment. Copyright © 2015 John Wiley & Sons, Ltd. PMID:25854892
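
    The minimum entropy phase correction mentioned above admits a compact sketch: search for zero- and first-order phase terms that minimize the entropy of the derivative of the real part of the spectrum. The exact objective and optimizer used by the authors are not stated in the abstract, so the details below are assumptions.

    ```python
    # Minimum-entropy automatic phasing of a complex NMR spectrum (sketch).
    import numpy as np
    from scipy.optimize import minimize

    def entropy_objective(phis, spectrum):
        phi0, phi1 = phis
        ramp = np.arange(len(spectrum)) / len(spectrum)
        real = np.real(spectrum * np.exp(1j * (phi0 + phi1 * ramp)))
        h = np.abs(np.diff(real))
        p = h / h.sum()
        return -np.sum(p * np.log(p + 1e-12))  # entropy of the derivative

    def autophase(spectrum):
        res = minimize(entropy_objective, x0=[0.0, 0.0], args=(spectrum,),
                       method='Nelder-Mead')
        phi0, phi1 = res.x
        ramp = np.arange(len(spectrum)) / len(spectrum)
        return spectrum * np.exp(1j * (phi0 + phi1 * ramp))
    ```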

  18. An automated approach for extracting Barrier Island morphology from digital elevation models

    Science.gov (United States)

    Wernette, Phillipe; Houser, Chris; Bishop, Michael P.

    2016-06-01

    The response and recovery of a barrier island to extreme storms depends on the elevation of the dune base and crest, both of which can vary considerably alongshore and through time. Quantifying the response to and recovery from storms requires that we first identify and differentiate the dune(s) from the beach and back-barrier, which in turn depends on accurate identification and delineation of the dune toe, crest and heel. The purpose of this paper is to introduce a multi-scale automated approach for extracting beach, dune (dune toe, dune crest and dune heel), and barrier island morphology. The approach extracts the shoreline and back-barrier shoreline based on elevation thresholds, and extracts the dune toe, dune crest and dune heel based on the average relative relief (RR) across multiple spatial scales of analysis. The multi-scale automated RR approach is more objective than traditional approaches because every pixel is analyzed across multiple computational scales and features are identified from the calculated RR values. The RR approach outperformed contemporary approaches and represents a fast, objective means to define important beach and dune features for predicting barrier island response to storms. The RR method also does not require that the dune toe, crest, or heel be spatially continuous, which is important because dune morphology is naturally variable alongshore.
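
    The average relative relief idea lends itself to a short sketch: normalize each pixel's elevation within moving windows of several sizes and average the results, so dune crests approach 1 and toes/heels sit at inflections. The window sizes and the normalization below are assumptions; the paper's rules for picking toe, crest and heel from the RR surface are omitted.

    ```python
    # Multi-scale relative relief (RR) over a DEM grid (sketch).
    import numpy as np
    from scipy.ndimage import maximum_filter, minimum_filter

    def relative_relief(dem, windows=(3, 9, 27)):
        dem = np.asarray(dem, dtype=float)
        rr = []
        for w in windows:
            zmax = maximum_filter(dem, size=w)
            zmin = minimum_filter(dem, size=w)
            span = np.where(zmax > zmin, zmax - zmin, np.nan)  # avoid divide-by-zero
            rr.append((dem - zmin) / span)  # 0 at local low, 1 at local high
        return np.nanmean(rr, axis=0)       # average RR across scales
    ```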

  19. Complex System Model Based on the Theory of Abstract Algebra

    Institute of Scientific and Technical Information of China (English)

    李顺勇; 赵深淼

    2011-01-01

    The complex system model is widely used in ecology, traditional Chinese medicine, physics, chemistry and other subjects. Here the complex system model is studied using the theory of abstract algebra. Firstly, the definitions of the complex system model and of its three relations (the equivalence relation, the transition relation and the atavism relation) given by Zhang Yingshan are introduced. Secondly, some lemmas are given and proved on the basis of abstract algebra. Finally, an equivalent definition of the complex system model is given.

  20. Library Automation

    OpenAIRE

    Dhakne, B. N.; Giri, V. V; Waghmode, S. S.

    2010-01-01

    New technologies provide libraries with several new materials, media and modes of storing and communicating information. Library automation reduces the drudgery of repeated manual effort in library routines and supports collection, storage, administration, processing, preservation, communication, etc.

  1. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Managemen

  2. Using detailed inter-network simulation and model abstraction to investigate and evaluate joint battlespace infosphere (JBI) support technologies

    Science.gov (United States)

    Green, David M.; Dallaire, Joel D.; Reaper, Jerome H.

    2004-08-01

    The Joint Battlespace Infosphere (JBI) program is performing a technology investigation into global communications, data mining and warehousing, and data fusion technologies by focusing on techniques and methodologies that support twenty-first century military distributed collaboration. Advancement of these technologies is vitally important if military decision makers are to have the right data, in the right format, at the right time and place to support making the right decisions within available timelines. A quantitative understanding of the individual and combined effects arising from the application of technologies within a framework is presently far too complex to evaluate at more than a cursory depth. To facilitate quantitative analysis under these circumstances, the Distributed Information Enterprise Modeling and Simulation (DIEMS) team was formed to apply modeling and simulation (M&S) techniques to help address JBI analysis challenges. The DIEMS team has been tasked with utilizing collaborative distributed M&S architectures to quantitatively evaluate JBI technologies and tradeoffs. This paper first presents a high-level view of the DIEMS project. Once this approach has been established, a more concentrated view of the detailed communications simulation techniques used in generating the underlying support data sets is presented.

  3. Agenda, extended abstracts, and bibliographies for a workshop on Deposit modeling, mineral resources assessment, and their role in sustainable development

    Science.gov (United States)

    Briskey, Joseph A., (Edited By); Schulz, Klaus J.

    2002-01-01

    Global demand for mineral resources continues to increase because of increasing global population and the desire and efforts to improve living standards worldwide. The ability to meet this growing demand for minerals is affected by the concerns about possible environmental degradation associated with minerals production and by competing land uses. Informed planning and decisions concerning sustainability and resource development require a long-term perspective and an integrated approach to land-use, resource, and environmental management worldwide. This, in turn, requires unbiased information on the global distribution of identified and especially undiscovered resources, the economic and political factors influencing their development, and the potential environmental consequences of their exploitation. The purpose of the IGC workshop is to review the state-of-the-art in mineral-deposit modeling and quantitative resource assessment and to examine their role in the sustainability of mineral use. The workshop will address such questions as: Which of the available mineral-deposit models and assessment methods are best suited for predicting the locations, deposit types, and amounts of undiscovered nonfuel mineral resources remaining in the world? What is the availability of global geologic, mineral deposit, and mineral-exploration information? How can mineral-resource assessments be used to address economic and environmental issues? Presentations will include overviews of assessment methods used in previous national and other small-scale assessments of large regions as well as resulting assessment products and their uses.

  4. Nuclear medicine. Abstracts; Nuklearmedizin 2000. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2000-07-01

    This issue of the journal contains the abstracts of the 183 conference papers as well as 266 posters presented at the conference. Subject fields covered are: Neurology, psychology, oncology, pediatrics, radiopharmacy, endocrinology, EDP, measuring equipment and methods, radiological protection, cardiology, and therapy. (orig./CB) [German] This journal contains the short versions of the 183 lectures given at the conference as well as of the 226 posters presented, which dealt with the following topics: neurology, psychiatry, oncology, pediatrics, radiopharmacy, endocrinology, electronic data processing, measurement technology, radiation protection, cardiology and therapy. (MG)

  5. SWAT and River-2D Modelling of Pinder River for Analysing Snow Trout Habitat under Different Flow Abstraction Scenarios

    Science.gov (United States)

    Nale, J. P.; Gosain, A. K.; Khosa, R.

    2015-12-01

    Pinder River, one of the major headstreams of the River Ganga, originates in the Pindari Glaciers of the Kumaon Himalayas and, after passing through rugged gorges, meets the Alaknanda at Karanprayag, forming one of the five celestial confluences of the Upper Ganga region. While other sub-basins of the Upper Ganga are facing severe ecological losses, the Pinder basin is still in its pristine state and is well known for its beautiful valleys besides hosting unique and rare biodiversity. A proposed 252 MW run-of-river hydroelectric project at Devsari on this river has been a major concern on account of its perceived potential for egregious environmental and social impacts. In this context, the study presented tries to analyse the expected changes in aquatic habitat conditions after this project is operational (with different operation policies). The SWAT hydrological modelling platform has been used to derive streamflow simulations under various scenarios ranging from the present to likely future conditions. To analyse the habitat conditions, a two-dimensional hydraulic-habitat model, River-2D, a module of the iRIC software, is used. Snow trout has been identified as the target keystone species, and its habitat preferences, in the form of flow depths, flow velocity and substrate condition, are obtained from diverse sources of related literature and are provided as Habitat Suitability Indices to River-2D. Bed morphology constitutes an important River-2D input and has been obtained, for the designated 1 km long study reach of the Pinder up to Karanprayag, from a combination of actual field observations supplemented by SRTM 1 Arc-Second Global digital elevation data. Monthly Weighted Usable Area for three different life stages (spawning, juvenile and adult) of snow trout is obtained corresponding to seven different flow discharges ranging from 10 cumec to 1000 cumec. Comparing the present and proposed future river flow conditions obtained from SWAT modelling, losses in Weighted Usable Area, for the

  6. Journalism Abstracts. Vol. 15.

    Science.gov (United States)

    Popovich, Mark N., Ed.

    This book, the fifteenth volume of an annual publication, contains 373 abstracts of 52 doctoral and 321 master's theses from 50 colleges and universities. The abstracts are arranged alphabetically by author, with the doctoral dissertations appearing first. These cover such topics as advertising, audience analysis, content analysis of news issues…

  7. Completeness of Lyapunov Abstraction

    DEFF Research Database (Denmark)

    Wisniewski, Rafal; Sloth, Christoffer

    This paper addresses the generation of complete abstractions of polynomial dynamical systems by timed automata. For the proposed abstraction, the state space is divided into cells by sublevel sets of functions. We identify a relation between these functions and their directional derivatives along...

  8. Program and abstracts

    International Nuclear Information System (INIS)

    Abstracts of the papers given at the conference are presented. The abstracts are arranged under sessions entitled: Theoretical Physics; Nuclear Physics; Solid State Physics; Spectroscopy; Plasma Physics; Solar-Terrestrial Physics; Astrophysics and Astronomy; Radioastronomy; General Physics; Applied Physics; Industrial Physics

  9. Designing for Mathematical Abstraction

    Science.gov (United States)

    Pratt, Dave; Noss, Richard

    2010-01-01

    Our focus is on the design of systems (pedagogical, technical, social) that encourage mathematical abstraction, a process we refer to as "designing for abstraction." In this paper, we draw on detailed design experiments from our research on children's understanding about chance and distribution to re-present this work as a case study in designing…

  10. Abstraction of Drift Seepage

    Energy Technology Data Exchange (ETDEWEB)

    J.T. Birkholzer

    2004-11-01

    This model report documents the abstraction of drift seepage, conducted to provide seepage-relevant parameters and their probability distributions for use in Total System Performance Assessment for License Application (TSPA-LA). Drift seepage refers to the flow of liquid water into waste emplacement drifts. Water that seeps into drifts may contact waste packages and potentially mobilize radionuclides, and may result in advective transport of radionuclides through breached waste packages ["Risk Information to Support Prioritization of Performance Assessment Models" (BSC 2003 [DIRS 168796], Section 3.3.2)]. The unsaturated rock layers overlying and hosting the repository form a natural barrier that reduces the amount of water entering emplacement drifts by natural subsurface processes. For example, drift seepage is limited by the capillary barrier forming at the drift crown, which decreases or even eliminates water flow from the unsaturated fractured rock into the drift. During the first few hundred years after waste emplacement, when above-boiling rock temperatures will develop as a result of heat generated by the decay of the radioactive waste, vaporization of percolation water is an additional factor limiting seepage. Estimating the effectiveness of these natural barrier capabilities and predicting the amount of seepage into drifts is an important aspect of assessing the performance of the repository. The TSPA-LA therefore includes a seepage component that calculates the amount of seepage into drifts ["Total System Performance Assessment (TSPA) Model/Analysis for the License Application" (BSC 2004 [DIRS 168504], Section 6.3.3.1)]. The TSPA-LA calculation is performed with a probabilistic approach that accounts for the spatial and temporal variability and inherent uncertainty of seepage-relevant properties and processes. Results are used for subsequent TSPA-LA components that may handle, for example, waste package

  11. Extending and applying active appearance models for automated, high precision segmentation in different image modalities

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Fisker, Rune; Ersbøll, Bjarne Kjær

    … object class description, which can be employed to rapidly search images for new object instances. The proposed extensions concern enhanced shape representation, handling of homogeneous and heterogeneous textures, refinement optimization using Simulated Annealing, and robust statistics. Finally, an initialization scheme is designed, thus making the usage of AAMs fully automated. Using these extensions it is demonstrated that AAMs can segment bone structures in radiographs, pork chops in perspective images and the left ventricle in cardiovascular magnetic resonance images in a robust, fast and accurate manner.

  12. Early Validation of Automation Plant Control Software using Simulation Based on Assumption Modeling and Validation Use Cases

    OpenAIRE

    Veronika Brandstetter; Andreas Froese; Bastian Tenbergen; Andreas Vogelsang; Jan Christoph Wehrstedt; Thorsten Weyer

    2015-01-01

    In automation plants, technical processes must be conducted in a way that products, substances, or services are produced reliably, with sufficient quality and with minimal strain on resources. A key driver in conducting these processes is the automation plant’s control software, which controls the technical plant components and thereby affects the physical, chemical, and mechanical processes that take place in automation plants. To this end, the control software of an automation plant must ad...

  13. The Conversion of Cardiovascular Conference Abstracts to Publications

    DEFF Research Database (Denmark)

    Fosbøl, Emil L.; Fosbøl, Philip Loldrup; Harrington, Robert A.;

    2012-01-01

    We performed a systematic and automated evaluation of rates, timing, and correlates of publication from scientific abstracts presented at 3 major cardiovascular conferences. Methods and Results—Using an automated computer algorithm, we searched the ISI Web of Science to identify peer-reviewed publications of abstracts. From 2006 to 2008, 11 365, 5005, and 10 838 abstracts were presented at the AHA, ACC, and ESC meetings, respectively. Overall, 30.6% of presented abstracts were published within 2 years of the conference, ranging from 34.5% for AHA to 29.5% for ACC to 27.0% for ESC (P<0.0001). Five years after conference presentation in 2005, these rates had risen slightly to 49.7% for AHA, 42.6% for ACC, and 37.6% for ESC (P<0.0001). After adjustment for abstract characteristics and contributing countries, abstracts presented at the AHA meeting remained more likely to reach publication relative to the ESC (adjusted

  14. Process automation

    International Nuclear Information System (INIS)

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  15. Automated delineation of karst sinkholes from LiDAR-derived digital elevation models

    Science.gov (United States)

    Wu, Qiusheng; Deng, Chengbin; Chen, Zuoqi

    2016-08-01

    Sinkhole mapping is critical for understanding hydrological processes and mitigating geological hazards in karst landscapes. Current methods for identifying sinkholes are primarily based on visual interpretation of low-resolution topographic maps and aerial photographs with subsequent field verification, which is labor-intensive and time-consuming. The increasing availability of high-resolution LiDAR-derived digital elevation data allows for an entirely new level of detailed delineation and analyses of small-scale geomorphologic features and landscape structures at fine scales. In this paper, we present a localized contour tree method for automated extraction of sinkholes in karst landscapes. One significant advantage of our automated approach for sinkhole extraction is that it may reduce inconsistencies and alleviate repeatability concerns associated with visual interpretation methods. In addition, the proposed method has contributed to improving the sinkhole inventory in several ways: (1) detection of non-inventoried sinkholes; (2) identification of previously inventoried sinkholes that have been filled; (3) delineation of sinkhole boundaries; and (4) characterization of sinkhole morphometric properties. We applied the method to Fillmore County in southeastern Minnesota, USA, and identified three times as many sinkholes as the existing database for the same area. The results suggest that previous visual interpretation method might significantly underestimate the number of potential sinkholes in the region. Our method holds great potential for creating and updating sinkhole inventory databases at a regional scale in a timely manner.
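
    The localized contour tree method itself is beyond a few lines, but the general task (finding closed depressions in a LiDAR DEM) can be approximated with a fill-difference pass, sketched below. This is a deliberately simpler stand-in, not the authors' contour tree algorithm, and the depth threshold is an assumption.

    ```python
    # Flag closed depressions in a DEM as candidate sinkholes (fill-difference sketch).
    import numpy as np
    from skimage.morphology import reconstruction
    from skimage.measure import label, regionprops

    def candidate_sinkholes(dem, min_depth=0.5):
        dem = np.asarray(dem, dtype=float)
        seed = dem.copy()
        seed[1:-1, 1:-1] = dem.max()                  # flood from the border inward
        filled = reconstruction(seed, dem, method='erosion')
        depth = filled - dem                          # > 0 inside depressions
        return regionprops(label(depth > min_depth))  # one region per candidate
    ```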

  16. Automation of block assignment planning using a diagram-based scenario modeling method

    Directory of Open Access Journals (Sweden)

    Hwang In Hyuck

    2014-03-01

    Most shipbuilding scheduling research so far has focused on the load level on the dock plan. This is because the dock is the least extendable resource in shipyards, and its overloading is difficult to resolve. However, once dock scheduling is completed, making a plan that makes the best use of the rest of the resources in the shipyard to minimize any additional cost is also important. Block assignment planning is one of the midterm planning tasks; it assigns a block to the facility (factory/shop or surface plate) that will actually manufacture the block according to the block characteristics and current situation of the facility. It is one of the most heavily loaded midterm planning tasks and is carried out manually by experienced workers. In this study, a method of representing the block assignment rules using a diagram was suggested through analysis of the existing manual process. A block allocation program was developed which automated the block assignment process according to the rules represented by the diagram. The planning scenario was validated through a case study that compared the manual assignment and two automated block assignment results.
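
    To make the diagram-encoded rules concrete, a hypothetical sketch follows in which each rule is an ordered (predicate, facility) pair evaluated top to bottom, mirroring a path through a decision diagram. The block attributes and facility names are invented; the real diagrams encode much richer shipyard conditions.

    ```python
    # Rule-based block assignment (toy example of diagram-encoded rules).
    RULES = [
        (lambda b: b['weight_tons'] > 300,       'heavy-block shop'),
        (lambda b: b['shape'] == 'curved',       'surface plate A'),
        (lambda b: b['outfitting_ratio'] > 0.5,  'outfitting factory'),
    ]

    def assign(block, default='general shop'):
        for predicate, facility in RULES:
            if predicate(block):
                return facility
        return default

    print(assign({'weight_tons': 120, 'shape': 'curved', 'outfitting_ratio': 0.2}))
    # -> 'surface plate A'
    ```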

  17. Computational Abstraction Steps

    DEFF Research Database (Denmark)

    Thomsen, Lone Leth; Thomsen, Bent; Nørmark, Kurt

    2010-01-01

    In this paper we discuss computational abstraction steps as a way to create class abstractions from concrete objects, and from examples. Computational abstraction steps are regarded as symmetric counterparts to computational concretisation steps, which are well-known in terms of function calls and … capturing concrete values, objects, or actions. As the next step, some of these are lifted to a higher level by computational means. In the object-oriented paradigm the target of such steps is classes. We hypothesise that the proposed approach primarily will be beneficial to novice programmers or during the...

  18. Efficient abstraction selection in reinforcement learning

    NARCIS (Netherlands)

    Seijen, H. van; Whiteson, S.; Kester, L.

    2013-01-01

    This paper introduces a novel approach for abstraction selection in reinforcement learning problems modelled as factored Markov decision processes (MDPs), for which a state is described via a set of state components. In abstraction selection, an agent must choose an abstraction from a set of candida

  19. ABSTRACTS AND KEY WORDS

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Influence of Fermented Product from Beneficial Microorganism on the Cultivation of Larvae Apostichopus japonicus Li Shuang et al (1) Abstract The fermented product from beneficial microorganism was applied in the seed rearing of sea cucumber. The result

  20. 2016 ACPA MEETING ABSTRACTS.

    Science.gov (United States)

    2016-07-01

    The peer-reviewed abstracts presented at the 73rd Annual Meeting of the ACPA are published as submitted by the authors. For financial conflict of interest disclosure, please visit http://meeting.acpa-cpf.org/disclosures.html. PMID:27447885

  1. Abstracts of SIG Sessions.

    Science.gov (United States)

    Proceedings of the ASIS Annual Meeting, 1997

    1997-01-01

    Presents abstracts of SIG Sessions. Highlights include digital collections; information retrieval methods; public interest/fair use; classification and indexing; electronic publication; funding; globalization; information technology projects; interface design; networking in developing countries; metadata; multilingual databases; networked…

  2. A high throughput MATLAB program for automated force-curve processing using the AdG polymer model.

    Science.gov (United States)

    O'Connor, Samantha; Gaddis, Rebecca; Anderson, Evan; Camesano, Terri A; Burnham, Nancy A

    2015-02-01

    Research in understanding biofilm formation depends on accurate and representative measurements of the steric forces related to the polymer brush on bacterial surfaces. A MATLAB program to analyze force curves from an AFM efficiently, accurately, and with minimal user bias has been developed. The analysis is based on a modified version of the Alexander and de Gennes (AdG) polymer model, which is a function of equilibrium polymer brush length, probe radius, temperature, separation distance, and a density variable. Automating the analysis reduces the time required to process 100 force curves from several days to less than 2 min. The use of this program to crop and fit force curves to the AdG model will allow researchers to ensure proper processing of large amounts of experimental data and reduce the time required for analysis and comparison of data, thereby enabling higher-quality results in a shorter period of time. PMID:25448021
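
    For orientation, one widely used closed form of the AdG steric force for a spherical AFM probe against a brush-coated surface is the exponential approximation F(D) ≈ 50 kB T R L0 Γ^(3/2) exp(−2πD/L0), valid at intermediate compressions. The paper fits a modified AdG version whose exact form is not given in the abstract, so treat the sketch below as illustrative only.

    ```python
    # Exponential approximation of the AdG steric force (sketch, SI units).
    import numpy as np

    KB = 1.380649e-23  # Boltzmann constant, J/K

    def adg_force(D, L0, gamma, R, T=298.0):
        """Force (N) at separation D (m): brush length L0 (m),
        grafting density gamma (chains/m^2), probe radius R (m)."""
        return 50.0 * KB * T * R * L0 * gamma**1.5 * np.exp(-2.0 * np.pi * D / L0)

    # Fitting L0 and gamma to a cropped force curve could use, e.g.:
    # scipy.optimize.curve_fit(lambda D, L0, g: adg_force(D, L0, g, R=5e-9), D, F)
    ```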

  3. A new automated method for analysis of gated-SPECT images based on a three-dimensional heart shaped model

    DEFF Research Database (Denmark)

    Lomsky, Milan; Richter, Jens; Johansson, Lena; El-Ali, Henrik; Aström, Karl; Ljungberg, Michael; Edenbrandt, Lars; El Ali, Henrik H.

    2005-01-01

    A new automated method for quantification of left ventricular function from gated single photon emission computed tomography (SPECT) images has been developed. The method for quantification of cardiac function (CAFU) is based on a heart-shaped model and the active shape algorithm. The model … The maximal differences between the CAFU estimations and the true left ventricular volumes of the digital phantoms were 11 ml for the end-diastolic volume (EDV), 3 ml for the end-systolic volume (ESV) and 3% for the ejection fraction (EF). The largest differences were seen in the smallest heart. In the patient group the EDV calculated using QGS and CAFU showed good agreement for large hearts and higher CAFU values compared with QGS for the smaller hearts. In the larger hearts, ESV was much larger for QGS than for CAFU both in the phantom and patient studies. In the smallest hearts there was good

  4. Model-based analysis of an automated changeover switching unit for a busbar. MODSAFE 2009 work report

    Energy Technology Data Exchange (ETDEWEB)

    Bjorkman, K.; Valkonen, J.; Ranta, J.

    2011-06-15

    Verification of digital instrumentation and control (I and C) systems is challenging, because programmable logic controllers enable complicated control functions and the state spaces (number of distinct values of inputs, outputs, and internal memory) of the designs easily become too large for comprehensive manual inspection. Model checking is a promising formal method that can be used for verifying the correctness of system designs. A number of efficient model checking systems are available, offering analysis tools that are able to determine automatically whether a given state machine model satisfies the desired safety properties. Model checking can also handle delays and other time-related operations, which are crucial in safety I and C systems and challenging to design and verify. The system analysed in this research project is called an 'automated changeover switching unit for a busbar'; its purpose is to switch the power feed to the stand-by power supply in the event of voltage breaks. The system is modelled as a finite state machine and some of its key properties are verified with the NuSMV model checking tool. The time-dependent components are modelled to operate in discrete fixed-length time steps, and the lengths of the timed functions are scaled to avoid state explosion and enable efficient model checking. (orig.)
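
    As a toy illustration of the discrete fixed-length time-step idea, the sketch below encodes a delayed changeover switch and checks a property over every input sequence up to a bounded horizon, a brute-force analogue of what NuSMV does symbolically. The 3-step delay and the state variables are assumptions, not the actual unit's design.

    ```python
    # Exhaustive check of a discretized delayed-changeover switch (toy model).
    from itertools import product

    DELAY = 3  # consecutive bad-voltage steps before the standby feed engages

    def step(state, voltage_ok):
        timer, on_standby = state
        timer = 0 if voltage_ok else min(timer + 1, DELAY)
        return (timer, on_standby or timer >= DELAY)

    def check(horizon=10):
        """Standby engages exactly when DELAY consecutive breaks have occurred."""
        for inputs in product([True, False], repeat=horizon):
            state, run, expected = (0, False), 0, False
            for ok in inputs:
                state = step(state, ok)
                run = 0 if ok else run + 1
                expected = expected or run >= DELAY
                if state[1] != expected:
                    return False
        return True

    print(check())  # True: the toy design satisfies the property
    ```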

  5. 07381 Abstracts Collection -- Cryptography

    OpenAIRE

    Blömer, Johannes; Boneh, Dan; Cramer, Ronald; Maurer, Ueli

    2008-01-01

    From 16.09.2007 to 21.09.2007 the Dagstuhl Seminar 07381 ``Cryptography'' was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first section describes the seminar topics and goa...

  6. Painting, abstraction, discourse

    OpenAIRE

    Besson, Christian

    2012-01-01

    Four catalogues and compilations published this year once again raise the issue of the linkage between painting and critical discourse, with abstraction, where applicable, exacerbating the tension between the two. The first essay, La Peinture après l’abstraction, is nothing less than stimulating. Certain observations made by Alain Cueff about the neglected role of poster artists in the renewed formulation of painting, between 1955 and 1965, lie at the root of the comparison--a new departure--...

  7. Abstracts of contributed papers

    Energy Technology Data Exchange (ETDEWEB)

    1994-08-01

    This volume contains 571 abstracts of contributed papers to be presented during the Twelfth US National Congress of Applied Mechanics. Abstracts are arranged in the order in which they fall in the program -- the main sessions are listed chronologically in the Table of Contents. The Author Index is in alphabetical order and lists each paper number (matching the schedule in the Final Program) with its corresponding page number in the book.

  8. 08071 Abstracts Collection -- Scheduling

    OpenAIRE

    Jane W. S. Liu; Rolf H. Möhring; Pruhs, Kirk

    2008-01-01

    From 10.02. to 15.02., the Dagstuhl Seminar 08071 ``Scheduling'' was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first section describes the seminar topics and goals in gen...

  9. 10071 Abstracts Collection -- Scheduling

    OpenAIRE

    Albers, Susanne; Baruah, Sanjoy K; Rolf H. Möhring; Pruhs, Kirk

    2010-01-01

    From 14.02. to 19.02.2010, the Dagstuhl Seminar 10071 ``Scheduling '' was held in Schloss Dagstuhl-Leibniz Center for Informatics. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first section describes the seminar topics and goals in general. Links to ext...

  10. Meeting Abstracts - Annual Meeting 2016.

    Science.gov (United States)

    2016-04-01

    The AMCP Abstracts program provides a forum through which authors can share their insights and outcomes of advanced managed care practice through publication in AMCP's Journal of Managed Care & Specialty Pharmacy (JMCP). Most of the reviewed and unreviewed abstracts are presented as posters so that interested AMCP meeting attendees can review findings and query authors. The Student/Resident/ Fellow poster presentation (unreviewed) is Wednesday, April 20, 2016, and the Professional poster presentation (reviewed) is Thursday, April 21. The Professional posters will also be displayed on Friday, April 22. The reviewed abstracts are published in the JMCP Meeting Abstracts supplement. The AMCP Managed Care & Specialty Pharmacy Annual Meeting 2016 in San Francisco, California, is expected to attract more than 3,500 managed care pharmacists and other health care professionals who manage and evaluate drug therapies, develop and manage networks, and work with medical managers and information specialists to improve the care of all individuals enrolled in managed care programs. Abstracts were submitted in the following categories: Research Report: describe completed original research on managed care pharmacy services or health care interventions. Examples include (but are not limited to) observational studies using administrative claims, reports of the impact of unique benefit design strategies, and analyses of the effects of innovative administrative or clinical programs. Economic Model: describe models that predict the effect of various benefit design or clinical decisions on a population. For example, an economic model could be used to predict the budget impact of a new pharmaceutical product on a health care system. Solving Problems in Managed Care: describe the specific steps taken to introduce a needed change, develop and implement a new system or program, plan and organize an administrative function, or solve other types of problems in managed care settings. These

  11. DSL development based on target meta-models. Using AST transformations for automating semantic analysis in a textual DSL framework

    CERN Document Server

    Breslav, Andrey

    2008-01-01

    This paper describes an approach to creating textual syntax for Domain-Specific Languages (DSL). We consider the target meta-model to be the main artifact and hence to be developed first. The key idea is to represent the analysis of textual syntax as a sequence of transformations. This is done by explicit operations on abstract syntax trees (AST), for which a simple language is proposed. The text-to-model transformation is divided into two parts: text-to-AST (developed by openArchitectureWare [1]) and AST-to-model (proposed by this paper). Our approach simplifies semantic analysis and helps to generate as much as possible.
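
    A tiny illustration of the AST-to-model step: an explicit transformation maps an AST node onto a target meta-model object. The node kinds and the Entity class are invented for the example; the paper's transformation language is richer.

    ```python
    # One AST-to-model transformation rule (toy example).
    from dataclasses import dataclass

    @dataclass
    class ASTNode:
        kind: str
        text: str
        children: list

    @dataclass
    class Entity:  # target meta-model element
        name: str
        attributes: list

    def to_model(node):
        if node.kind == 'entity':
            attrs = [c.text for c in node.children if c.kind == 'attribute']
            return Entity(name=node.text, attributes=attrs)
        raise ValueError(f'no rule for {node.kind}')

    ast = ASTNode('entity', 'Book', [ASTNode('attribute', 'title', []),
                                     ASTNode('attribute', 'isbn', [])])
    print(to_model(ast))  # Entity(name='Book', attributes=['title', 'isbn'])
    ```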

  12. Software Testing and Documenting Automation

    OpenAIRE

    Tsybin, Anton; Lyadova, Lyudmila

    2008-01-01

    This article describes some approaches to the problem of testing and documenting automation in information systems with a graphical user interface. A combination of data mining methods and the theory of finite state machines is used for testing automation. Automated creation of software documentation is based on using metadata in the documented system. The metadata is built on a graph model. The described approaches improve the performance and quality of the testing and documenting processes.

  13. Evaluation of automated statistical shape model based knee kinematics from biplane fluoroscopy

    DEFF Research Database (Denmark)

    Baka, Nora; Kaptein, Bart L.; Giphart, J. Erik;

    2014-01-01

    decrease costs and radiation dose (when eliminating CT). SSM based kinematics, however, have not yet been evaluated on clinically relevant joint motion parameters. Therefore, in this work the applicability of SSMs for computing knee kinematics from biplane fluoroscopic sequences was explored. Kinematic precision with an edge-based automated bone tracking method using SSMs was evaluated on 6 cadaveric and 10 in-vivo fluoroscopic sequences. The SSMs of the femur and the tibia-fibula were created using 61 training datasets. Kinematic precision was determined for medial-lateral tibial shift, anterior-posterior tibial drawer, joint distraction-contraction, flexion, tibial rotation and adduction. The relationship between kinematic precision and bone shape accuracy was also investigated. The SSM based kinematics resulted in sub-millimeter (0.48-0.81 mm) and approximately 1° (0.69-0.99°) median precision on the

  14. An Automated Feedback System Based on Adaptive Testing: Extending the Model

    Directory of Open Access Journals (Sweden)

    Trevor Barker

    2010-06-01

    Abstract—The results of the recent National Student Survey (NSS) revealed that a major problem in HE today is that of student feedback. Research carried out by members of the project team in the past has led to the development of an automated student feedback system for use with objective formative testing. This software relies on an 'intelligent' engine to determine the most appropriate individual feedback, based on test performance, relating not only to answers, but also to Bloom's cognitive levels. The system also recommends additional materials and challenges for each individual learner. Detailed evaluation with more than 500 students and 100 university staff has shown that the system is highly valued by learners and seen by staff as an important addition to the methods available. The software has been used on two modules so far over a two-year period

  15. Automating the development of Physical Mobile Workflows. A Model Driven Engineering approach

    OpenAIRE

    Giner Blasco, Pau

    2010-01-01

    The 'Internet of Things' vision emphasizes the integration of real-world elements with information systems. Thanks to Automatic Identification (Auto-ID) technologies such as RFID, systems can perceive objects in the physical world. When these objects participate actively in business processes, using human beings as carriers of information is avoided. The number of errors is therefore reduced and the efficiency of the processes increases. Although...

  16. Abstracting Runtime Heaps for Program Understanding

    CERN Document Server

    Marron, Mark; Su, Zhendong; Fahndrich, Manuel

    2012-01-01

    Modern programming environments provide extensive support for inspecting, analyzing, and testing programs based on the algorithmic structure of a program. Unfortunately, support for inspecting and understanding runtime data structures during execution is typically much more limited. This paper provides a general purpose technique for abstracting and summarizing entire runtime heaps. We describe the abstract heap model and the associated algorithms for transforming a concrete heap dump into the corresponding abstract model as well as algorithms for merging, comparing, and computing changes between abstract models. The abstract model is designed to emphasize high-level concepts about heap-based data structures, such as shape and size, as well as relationships between heap structures, such as sharing and connectivity. We demonstrate the utility and computational tractability of the abstract heap model by building a memory profiler. We then use this tool to check for, pinpoint, and correct sources of memory bloat...
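
    The flavor of the abstraction can be conveyed with a toy reducer that collapses a concrete object graph into per-type counts and inter-type edges; the paper's model additionally captures shape, sharing and connectivity, which this sketch omits.

    ```python
    # Collapse a concrete heap graph into a type-level summary (toy sketch).
    from collections import defaultdict

    def abstract_heap(objects):
        """objects: dict id -> (type_name, [referenced ids])."""
        counts, edges = defaultdict(int), defaultdict(int)
        for oid, (tname, refs) in objects.items():
            counts[tname] += 1
            for r in refs:
                edges[(tname, objects[r][0])] += 1
        return dict(counts), dict(edges)

    heap = {1: ('List', [2]), 2: ('Node', [3]), 3: ('Node', [])}
    print(abstract_heap(heap))
    # ({'List': 1, 'Node': 2}, {('List', 'Node'): 1, ('Node', 'Node'): 1})
    ```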

  17. Automated Camera Calibration

    Science.gov (United States)

    Chen, Siqi; Cheng, Yang; Willson, Reg

    2006-01-01

    Automated Camera Calibration (ACAL) is a computer program that automates the generation of calibration data for camera models used in machine vision systems. Machine vision camera models describe the mapping between points in three-dimensional (3D) space in front of the camera and the corresponding points in two-dimensional (2D) space in the camera's image. Calibrating a camera model requires a set of calibration data containing known 3D-to-2D point correspondences for the given camera system. Generating calibration data typically involves taking images of a calibration target where the 3D locations of the target's fiducial marks are known, and then measuring the 2D locations of the fiducial marks in the images. ACAL automates the analysis of calibration target images and greatly speeds the overall calibration process.
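
    The 3D-to-2D correspondence at the heart of this process can be sketched with a pinhole projection; the matrix names below are generic conventions, not ACAL's interface, and the closing comment names OpenCV only as one possible solver.

    ```python
    # Project known 3D fiducial marks to 2D pixels with a pinhole model (sketch).
    import numpy as np

    def project(K, R, t, points_3d):
        """K: 3x3 intrinsics, R: 3x3 rotation, t: 3-vector, points_3d: Nx3."""
        cam = R @ points_3d.T + t.reshape(3, 1)  # world -> camera frame
        uvw = K @ cam                            # camera -> homogeneous pixels
        return (uvw[:2] / uvw[2]).T              # Nx2 pixel coordinates

    # Given measured 2D fiducial locations, a routine such as
    # cv2.calibrateCamera can recover K, R, t by minimizing reprojection error.
    ```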

  18. Design And Modeling An Automated Digsilent Power System For Optimal New Load Locations

    OpenAIRE

    Mohamed Saad; Mohd Ali Tofigh; Farah Zaheeda; Ahmed N AL-Masri; Nordin Bin Othman; Muhammad Irsyad; Ahmad Abbas; Erhab Youssef

    2015-01-01

    Abstract The electric power utilities seek to take advantage of novel approaches to meet growing energy demand. Utilities are under pressure to evolve their classical topologies to increase the usage of distributed generation. Currently the electrical power engineers in many regions of the world are implementing manual methods to measure power consumption for further assessment of voltage violations. Such processes have proved to be time-consuming, costly and inaccurate. Also demand response is a grid...

  19. Metacognition and abstract reasoning.

    Science.gov (United States)

    Markovits, Henry; Thompson, Valerie A; Brisson, Janie

    2015-05-01

    The nature of people's meta-representations of deductive reasoning is critical to understanding how people control their own reasoning processes. We conducted two studies to examine whether people have a metacognitive representation of abstract validity and whether familiarity alone acts as a separate metacognitive cue. In Study 1, participants were asked to make a series of (1) abstract conditional inferences, (2) concrete conditional inferences with premises having many potential alternative antecedents and thus specifically conducive to the production of responses consistent with conditional logic, or (3) concrete problems with premises having relatively few potential alternative antecedents. Participants gave confidence ratings after each inference. Results show that confidence ratings were positively correlated with logical performance on abstract problems and concrete problems with many potential alternatives, but not with concrete problems with content less conducive to normative responses. Confidence ratings were higher with few alternatives than for abstract content. Study 2 used a generation of contrary-to-fact alternatives task to improve levels of abstract logical performance. The resulting increase in logical performance was mirrored by increases in mean confidence ratings. Results provide evidence for a metacognitive representation based on logical validity, and show that familiarity acts as a separate metacognitive cue. PMID:25416026

  20. Thermography colloquium 2015. Abstracts

    International Nuclear Information System (INIS)

    The USB stick contains 17 lectures which were held at the Thermography Colloquium 2015 in Leinfelden-Echterdingen (Germany). A selection of the topics: Thermal Chladni sound figures in nondestructive testing (M. Rahammer); Flash thermography with several flashes (R. Krankenhagen); Frequency optimization of ultrasound-induced thermography during the measurement (C. Srajbr); Worldwide introduction of a thermographic inspection system for gas turbine components (M. Goldammer); Practical aspects of automation of thermographic weld inspection (G. Mahler); Investigations to determine the crack depth with inductive thermography (B. Oswald-Tranta); Testing of spot welds with laser thermography (M. Ziegler).

  1. GoSam-2.0. A tool for automated one-loop calculations within the Standard Model and beyond

    International Nuclear Information System (INIS)

    We present the version 2.0 of the program package GoSam for the automated calculation of one-loop amplitudes. GoSam is devised to compute one-loop QCD and/or electroweak corrections to multi-particle processes within and beyond the Standard Model. The new code contains improvements in the generation and in the reduction of the amplitudes, performs better in computing time and numerical accuracy, and has an extended range of applicability. The extended version of the "Binoth-Les-Houches-Accord" interface to Monte Carlo programs is also implemented. We give a detailed description of installation and usage of the code, and illustrate the new features in dedicated examples.

  2. Toward Automated Façade Texture Generation for 3D Photorealistic City Modelling with Smartphones or Tablet PCs

    Science.gov (United States)

    Wang, S.

    2012-07-01

    An automated model-image fitting algorithm is proposed in this paper for generating façade texture images from pictures taken by smartphones or tablet PCs. Façade texture generation requires tremendous labour and has thus been the bottleneck of 3D photo-realistic city modelling. With advanced developments in micro-electro-mechanical systems (MEMS), a camera, global positioning system (GPS), and gyroscope (G-sensors) can all be integrated into a smartphone or a tablet PC. These sensors bring the possibility of direct georeferencing for the pictures taken by smartphones or tablet PCs. Since the accuracy of these sensors cannot be compared to that of surveying instruments, the image position and orientation derived from them are not accurate enough for photogrammetric measurements. This paper adopts the least-squares model-image fitting (LSMIF) algorithm to iteratively improve the image's exterior orientation. The image position from GPS and the image orientation from the gyroscope are treated as initial values. By fitting the projection of the wireframe model to the extracted edge pixels in the image, the image exterior orientation elements are solved when the optimal fit is achieved. With the exact exterior orientation elements, the wireframe model of the building can be correctly projected onto the image and, therefore, the façade texture image can be extracted from the picture.
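
    The core of such a model-image fitting step can be illustrated with a small nonlinear least-squares sketch: refine a six-parameter exterior orientation so that projected wireframe corners match extracted edge pixels. This is a toy stand-in for LSMIF; the pinhole intrinsics, synthetic data, and all numbers are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.optimize import least_squares
    from scipy.spatial.transform import Rotation

    def project(points3d, pose, f=1000.0, cx=320.0, cy=240.0):
        """Pinhole projection under pose = (rx, ry, rz, tx, ty, tz)."""
        R = Rotation.from_euler("xyz", pose[:3]).as_matrix()
        cam = points3d @ R.T + pose[3:]
        return np.column_stack((f * cam[:, 0] / cam[:, 2] + cx,
                                f * cam[:, 1] / cam[:, 2] + cy))

    def residuals(pose, model_pts, edge_pixels):
        # Mismatch between projected wireframe vertices and matched edge pixels.
        return (project(model_pts, pose) - edge_pixels).ravel()

    # Wireframe corners of a building facade (metres) and a noisy initial
    # pose, standing in for GPS/gyroscope values.
    model = np.array([[0, 0, 0], [10, 0, 0], [10, 6, 0], [0, 6, 0]], float)
    true_pose = np.array([0.02, -0.01, 0.0, -5.0, -3.0, 20.0])
    observed = project(model, true_pose) + np.random.normal(0, 0.5, (4, 2))
    initial = true_pose + np.array([0.05, 0.05, 0.02, 0.5, 0.5, 1.0])

    fit = least_squares(residuals, initial, args=(model, observed))
    print("refined pose:", np.round(fit.x, 3))
    ```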

  3. Modelling and automation of the process of phosphate ion removal from waste waters

    Directory of Open Access Journals (Sweden)

    L. Lupa

    2008-03-01

    Full Text Available Phosphate removal from waste waters has become an environmental necessity, since these phosphates stimulate the growth of aquatic plants and plankton and contribute to the eutrophication process in general. The physicochemical methods of phosphate ion removal are the most effective and reliable. This paper presents studies on the removal of phosphate ions from waste waters of the fertiliser industry, using the method of co-precipitation with iron salts and with calcium hydroxide as the neutralizing agent. The optimal process conditions were established as those that achieve the maximum degree of separation of the phosphate ions. The precipitate resulting from the co-precipitation process was analysed to determine its chemical composition, its thermal and structural stability, and the form in which the phosphate ions occur in it. Based on these considerations, the experimental data obtained in the process of phosphate ion removal from waste waters were analysed mathematically, and equations for the dependence of the degree of phosphate separation and of the residual concentration on the main process parameters were formulated. In this paper an automated scheme for phosphate ion removal from waste waters by co-precipitation is presented.

  4. Building Safe Concurrency Abstractions

    DEFF Research Database (Denmark)

    Madsen, Ole Lehrmann

    2014-01-01

    ... as well as programming, and we describe how this has had an impact on the design of the language. Although Beta supports the definition of high-level concurrency abstractions, the use of these relies on the discipline of the programmer, as is the case for Java and other mainstream OO languages. We...

  5. ABSTRACTS AND KEY WORDS

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Study of Feeding Effects of EM Fermented Feed on the Growth and Survival of Juvenile Sea Cucumber Apostichopus japonicus Gong Hai-ning et al(1) Abstract In this study, comparisons of feeding effects on the sea cucumber Apostichopus japonicus between th

  6. Abstraction through Game Play

    Science.gov (United States)

    Avraamidou, Antri; Monaghan, John; Walker, Aisha

    2012-01-01

    This paper examines the computer game play of an 11-year-old boy. In the course of building a virtual house he developed and used, without assistance, an artefact and an accompanying strategy to ensure that his house was symmetric. We argue that the creation and use of this artefact-strategy is a mathematical abstraction. The discussion…

  7. Learning Abstracts, 2001.

    Science.gov (United States)

    Wilson, Cynthia, Ed.

    2001-01-01

    Volume 4 of the League for Innovation in the Community College's Learning Abstracts include the following: (1) "Touching Students in the Digital Age: The Move Toward Learner Relationship Management (LRM)," by Mark David Milliron, which offers an overview of an organizing concept to help community colleges navigate the intersection between digital…

  8. 2002 NASPSA Conference Abstracts.

    Science.gov (United States)

    Journal of Sport & Exercise Psychology, 2002

    2002-01-01

    Contains abstracts from the 2002 conference of the North American Society for the Psychology of Sport and Physical Activity. The publication is divided into three sections: the preconference workshop, "Effective Teaching Methods in the Classroom;" symposia (motor development, motor learning and control, and sport psychology); and free…

  9. Annual Conference Abstracts

    Science.gov (United States)

    Engineering Education, 1976

    1976-01-01

    Presents the abstracts of 158 papers presented at the American Society for Engineering Education's annual conference at Knoxville, Tennessee, June 14-17, 1976. Included are engineering topics covering education, aerospace, agriculture, biomedicine, chemistry, computers, electricity, acoustics, environment, mechanics, and women. (SL)

  10. Monadic abstract interpreters

    DEFF Research Database (Denmark)

    Sergey, Ilya; Devriese, Dominique; Might, Matthew;

    2013-01-01

    Recent developments in the systematic construction of abstract interpreters hinted at the possibility of a broad unification of concepts in static analysis. We deliver that unification by showing context-sensitivity, polyvariance, flow-sensitivity, reachability pruning, heap-cloning and cardinalit...

  11. Abstracts of submitted papers

    International Nuclear Information System (INIS)

    The conference proceedings contain 152 abstracts of presented papers relating to various aspects of personnel dosimetry, the dosimetry of the working and living environment, various types of dosemeters and spectrometers, the use of radionuclides in various industrial fields, the migration of radionuclides on Czechoslovak territory after the Chernobyl accident, theoretical studies of some parameters of ionizing radiation detectors, and their calibration. (M.D.)

  12. Reasoning abstractly about resources

    Science.gov (United States)

    Clement, B.; Barrett, A.

    2001-01-01

    This paper describes a way to schedule high-level activities before distributing them across multiple rovers in order to coordinate the resultant use of shared resources, regardless of how each rover decides to perform its activities. We present an algorithm for summarizing the metric resource requirements of an abstract activity based on the resource usages of its potential refinements.

  13. SPR 2015. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2015-04-01

    The volume contains the abstracts of the SPR (Society for Pediatric Radiology) 2015 meeting covering the following issues: fetal imaging, musculoskeletal imaging, cardiac imaging, chest imaging, oncologic imaging, tools for process improvement, child abuse, contrast-enhanced ultrasound, Image Gently - update of radiation dose recording/reporting/monitoring - meaningful or useless?, pediatric thoracic imaging, ALARA.

  14. ABSTRACTS AND KEY WORDS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Morphological Variations and Discriminant Analysis of Three Populations of Mytilus coruscus Ye Ya-qiu et al. (4) Abstract The multivariate morphometrics analysis method was used to study four morphological characters of three geographical populations of Mytilus coruscus from Sheng-si, Zhou-shan, and Tai-zhou along the coast of Zhejiang province, China.

  15. ESPR 2014. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2014-06-15

    The Proceedings on ESPR 2014 include abstracts concerning the following topics: pediatric imaging: thorax, cardiovascular system, CT-technique, head and neck, perinatal imaging, molecular imaging; interventional imaging; specific focus: muscoskeletal imaging in juvenile idiopathic arthritis; radiation protection; oncology; molecular imaging - nuclear medicine; uroradiology and abdominal imaging.

  16. ABSTRACTS AND KEY WORDS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Study on the Enrichment Regularity of Semicarbazide in Algae Tian Xiu-hui et al. (1) Abstract Semicarbazide (SEM) in three kinds of representative algae (Nitzschia closterium, Tetraselmis chui and Dicrateria sp) and in seawater was determined using ultra-performance liquid chromatography tandem mass spectrometry in this work. The accumulation of semicarbazide (SEM) in algae under laboratory conditions was studied.

  17. Abstract Film and Beyond.

    Science.gov (United States)

    Le Grice, Malcolm

    A theoretical and historical account of the main preoccupations of makers of abstract films is presented in this book. The book's scope includes discussion of nonrepresentational forms as well as examination of experiments in the manipulation of time in films. The ten chapters discuss the following topics: art and cinematography, the first…

  18. Concept of the abstract program

    OpenAIRE

    Gregorics, T.

    2012-01-01

    The aim of this paper is to alter the abstract definition of the program in the theoretical programming model which has been developed at Eötvös Loránd University for many years in order to investigate methods that support designing correct programs. The motivation for this modification was to make the dynamic properties of programs appear in the model. This new definition of the program makes it possible to extend the model with the concept of subprograms while the earlier results of the original prog...

  19. Semi-automated curation of metabolic models via flux balance analysis: a case study with Mycoplasma gallisepticum.

    Directory of Open Access Journals (Sweden)

    Eddy J Bautista

    Full Text Available Primarily used for metabolic engineering and synthetic biology, genome-scale metabolic modeling shows tremendous potential as a tool for fundamental research and curation of metabolism. Through a novel integration of flux balance analysis and genetic algorithms, a strategy to curate metabolic networks and facilitate identification of metabolic pathways that may not be directly inferable solely from genome annotation was developed. Specifically, metabolites involved in unknown reactions can be determined, and potentially erroneous pathways can be identified. The procedure developed allows for new fundamental insight into metabolism, as well as acting as a semi-automated curation methodology for genome-scale metabolic modeling. To validate the methodology, a genome-scale metabolic model for the bacterium Mycoplasma gallisepticum was created. Several reactions not predicted by the genome annotation were postulated and validated via the literature. The model predicted an average growth rate of 0.358±0.12[Formula: see text], closely matching the experimentally determined growth rate of M. gallisepticum of 0.244±0.03[Formula: see text]. This work presents a powerful algorithm for facilitating the identification and curation of previously known and new metabolic pathways, as well as presenting the first genome-scale reconstruction of M. gallisepticum.
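
    The flux balance analysis at the heart of this curation strategy is a linear program: maximize a growth objective subject to steady-state stoichiometry and flux bounds. Below is a minimal, self-contained sketch on a three-reaction toy network (not the M. gallisepticum model) using scipy.optimize.linprog; a genetic algorithm of the kind the authors describe would then mutate, for example, reaction bounds or network membership and re-solve this LP as its fitness evaluation.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy network: R1 (uptake -> A), R2 (A -> B), R3 (B -> biomass).
    # Stoichiometric matrix S (rows: metabolites A, B; columns: R1..R3).
    S = np.array([[ 1, -1,  0],
                  [ 0,  1, -1]], dtype=float)

    bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake limited to 10 units
    c = np.array([0, 0, -1.0])                 # maximize biomass flux v3

    # Steady state: S v = 0; linprog minimizes, hence the negated objective.
    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
    print("fluxes:", res.x, "growth:", -res.fun)   # fluxes ~ [10, 10, 10]
    ```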

  20. A Knowledge Based Approach for Automated Modelling of Extended Wing Structures in Preliminary Aircraft Design

    OpenAIRE

    Dorbath, Felix; Nagel, Björn; Gollnick, Volker

    2011-01-01

    This paper introduces the concept of the ELWIS model generator for Finite Element models of aircraft wing structures. The physical modelling of the structure is extended beyond the wing primary structures, to increase the level of accuracy for aircraft which diverge from existing configurations. Also the impact of novel high lift technologies on structural masses can be captured already in the early stages of design by using the ELWIS models. The ELWIS model generator is able to c...

  1. Automata Learning through Counterexample Guided Abstraction Refinement

    DEFF Research Database (Denmark)

    Aarts, Fides; Heidarian, Faranak; Kuppens, Harco;

    2012-01-01

    Abstraction is the key when learning behavioral models of realistic systems. Hence, in most practical applications where automata learning is used to construct models of software components, researchers manually define abstractions which, depending on the history, map a large set of concrete even...

  2. Bounded Rationality of Generalized Abstract Fuzzy Economies

    Directory of Open Access Journals (Sweden)

    Lei Wang

    2014-01-01

    Full Text Available By using a nonlinear scalarization technique, the bounded rationality model M for generalized abstract fuzzy economies in finite continuous spaces is established. Furthermore, by using the model M, some new theorems for structural stability and robustness to (λ,ϵ)-equilibria of generalized abstract fuzzy economies are proved.

  3. Abstractions of Awareness: Aware of What?

    Science.gov (United States)

    Metaxas, Georgios; Markopoulos, Panos

    This chapter presents FN-AAR, an abstract model of awareness systems. The purpose of the model is to capture in a concise and abstract form essential aspects of awareness systems, many of which have been discussed in design essays or in the context of evaluating specific design solutions.

  4. Automated evolutionary optimization of ion channel conductances and kinetics in models of young and aged rhesus monkey pyramidal neurons.

    Science.gov (United States)

    Rumbell, Timothy H; Draguljić, Danel; Yadav, Aniruddha; Hof, Patrick R; Luebke, Jennifer I; Weaver, Christina M

    2016-08-01

    Conductance-based compartment modeling requires tuning of many parameters to fit the neuron model to target electrophysiological data. Automated parameter optimization via evolutionary algorithms (EAs) is a common approach to accomplish this task, using error functions to quantify differences between model and target. We present a three-stage EA optimization protocol for tuning ion channel conductances and kinetics in a generic neuron model with minimal manual intervention. We use the technique of Latin hypercube sampling in a new way, to choose weights for error functions automatically so that each function influences the parameter search to a similar degree. This protocol requires no specialized physiological data collection and is applicable to commonly-collected current clamp data and either single- or multi-objective optimization. We applied the protocol to two representative pyramidal neurons from layer 3 of the prefrontal cortex of rhesus monkeys, in which action potential firing rates are significantly higher in aged compared to young animals. Using an idealized dendritic topology and models with either 4 or 8 ion channels (10 or 23 free parameters respectively), we produced populations of parameter combinations fitting the target datasets in less than 80 hours of optimization each. Passive parameter differences between young and aged models were consistent with our prior results using simpler models and hand tuning. We analyzed parameter values among fits to a single neuron to facilitate refinement of the underlying model, and across fits to multiple neurons to show how our protocol will lead to predictions of parameter differences with aging in these neurons. PMID:27106692
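
    The weight-balancing idea can be sketched as follows: draw a Latin hypercube sample of the parameter space, measure each error function's typical magnitude over the sample, and weight each function by the inverse of that magnitude. The code below is an illustration under stated assumptions (toy error functions, scipy.stats.qmc), not the authors' exact protocol.

    ```python
    import numpy as np
    from scipy.stats import qmc

    def balance_error_weights(error_fns, lower, upper, n_samples=200, seed=0):
        """Choose a weight for each error function so that, over a Latin
        hypercube sample of the parameter space, every function contributes
        a similar magnitude to the total error."""
        sampler = qmc.LatinHypercube(d=len(lower), seed=seed)
        params = qmc.scale(sampler.random(n_samples), lower, upper)
        # Mean magnitude of each error function across the sampled space.
        means = np.array([np.mean([fn(p) for p in params]) for fn in error_fns])
        return 1.0 / means   # larger typical error -> smaller weight

    # Illustrative error functions on a 2-parameter model (names are toys).
    e_rate  = lambda p: abs(p[0] - 3.0)          # firing-rate mismatch
    e_shape = lambda p: 100 * abs(p[1] - 0.5)    # AP-shape mismatch, larger scale

    w = balance_error_weights([e_rate, e_shape], lower=[0, 0], upper=[10, 1])
    print("weights:", w)   # downweights the larger-scale error term
    ```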

  5. Abstracts of Main Essays

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    The Position of Capitalist Study in Marx's Social Formation Theory Yang Xue-gong, Xi Da-min The orientation and achievements of Marx's study of capitalism, or bourgeois society, are the foundation of his social formation theory. On the basis of his scientific study of capitalism, Marx develops his concept of economic social formation, the scientific methodology for researching other social formations or social forms, the threads of the development of social formations, and the abstraction of general laws, as well as his reflection on this abstraction. A full evaluation and acknowledgement of the position of capitalist study in Marx's social formation theory is crucial for revising the theory in the new era and for solving some controversial issues in social formation research.

  6. Ghana Science Abstracts

    International Nuclear Information System (INIS)

    This issue of the Ghana Science Abstracts combines in one publication all the country's bibliographic output in science and technology. The objective is to provide a quick reference source to facilitate the work of information professionals, research scientists, lecturers and policy makers. It is meant to give users an idea of the depth, scope, and results of the studies and projects carried out. The scope and coverage comprise research outputs, conference proceedings and periodical articles published in Ghana; it does not capture those published outside Ghana. Abstracts reported have been grouped under the following subject areas: Agriculture, Biochemistry, Biodiversity conservation, biological sciences, biotechnology, chemistry, dentistry, engineering, environmental management, forestry, information management, mathematics, medicine, physics, nuclear science, pharmacy, renewable energy and science education

  7. Formal Development of a Tool for Automated Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Kjær, Andreas A.; Le Bliguet, Marie

    This paper describes a tool for formally modelling relay interlocking systems and explains how it has been stepwise, formally developed using the RAISE method. The developed tool takes the circuit diagrams of a relay interlocking system as input and produces a state transition system modelling the dynamic behaviour of the interlocking system, i.e. the dynamic behaviour of the circuits depicted in the diagrams. The resulting state transition system (model) is expressed in the SAL language such that the SAL model checker can be used to model check required properties of this model of the interlocking system. The tool has been applied to the circuit diagrams of Stenstrup station in Denmark, and the resulting formal model has then been model checked to satisfy a number of required safety properties.

  8. Early Validation of Automation Plant Control Software using Simulation Based on Assumption Modeling and Validation Use Cases

    Directory of Open Access Journals (Sweden)

    Veronika Brandstetter

    2015-10-01

    Full Text Available In automation plants, technical processes must be conducted in a way that products, substances, or services are produced reliably, with sufficient quality and with minimal strain on resources. A key driver in conducting these processes is the automation plant's control software, which controls the technical plant components and thereby affects the physical, chemical, and mechanical processes that take place in automation plants. To this end, the control software of an automation plant must adhere to strict process requirements arising from the technical processes and from the physical plant design. Currently, validation of the control software often starts late in the engineering process, once the automation plant is almost completely constructed. However, as widely acknowledged, the later the control software of the automation plant is validated, the higher the effort for correcting revealed defects becomes, which can lead to serious budget overruns and project delays. In this article we propose an approach that allows the early validation of automation control software against the technical plant processes and assumptions about the physical plant design by means of simulation. We demonstrate the application of our approach on an actual plant project from the automation industry and present its technical implementation

  9. The deleuzian abstract machines

    DEFF Research Database (Denmark)

    Werner Petersen, Erik

    2005-01-01

    ... in emphasis from the three syntheses to mappings and rhizomatic diagrams that cut across semiotics or “blow apart regimes of signs”. The aim here is absolute deterritorialization. Deleuze has shown how abstract machines operate in the philosophy of Foucault, the literature of Proust and Kafka, and the painting of Bacon. We will finish our presentation by showing how these machines apply to architecture.

  10. SPR 2014. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2014-05-15

    The proceedings of the SPR 2014 meeting include abstracts on the following topics: Body imaging techniques: practical advice for clinical work; thoracic imaging: focus on the lungs; gastrointestinal imaging: focus on the pancreas and bowel; genitourinary imaging: focus on gonadal radiology; musculoskeletal imaging: focus on oncology; child abuse and non-child abuse: focus on radiography; impact of NMR and CT imaging on management of CHD; education and communication: art and practice in pediatric radiology.

  11. NPP life management (abstracts)

    International Nuclear Information System (INIS)

    Abstracts of the papers presented at the International conference of the Ukrainian Nuclear Society 'NPP Life Management'. The following problems are considered: modernization of the NPP; NPP life management; waste and spent nuclear fuel management; decommissioning issues; control systems (including radiation and ecological control systems); information and control systems; legal and regulatory framework. State nuclear regulatory control; PR in nuclear power; training of personnel; economics of nuclear power engineering

  12. Medical physics 2013. Abstracts

    International Nuclear Information System (INIS)

    The proceedings of the medical physics conference 2013 include abstracts of lectures and poster sessions concerning the following issues: Tele-therapy - application systems, nuclear medicine and molecular imaging, neuromodulation, hearing and technical support, basic dosimetry, NMR imaging - CEST (chemical exchange saturation transfer), medical robotics, magnetic particle imaging, audiology, radiation protection, phase contrast - innovative concepts, particle therapy, brachytherapy, computerized tomography, quality assurance, hybrid imaging techniques, diffusion and lung NMR imaging, image processing - visualization, cardiac and abdominal NMR imaging.

  13. SPR 2014. Abstracts

    International Nuclear Information System (INIS)

    The proceedings of the SPR 2014 meeting include abstracts on the following topics: Body imaging techniques: practical advice for clinical work; thoracic imaging: focus on the lungs; gastrointestinal imaging: focus on the pancreas and bowel; genitourinary imaging: focus on gonadal radiology; musculoskeletal imaging: focus on oncology; child abuse and non-child abuse: focus on radiography; impact of NMR and CT imaging on management of CHD; education and communication: art and practice in pediatric radiology.

  14. Abstracts of the communications

    OpenAIRE

    2014-01-01

    (P) paper, (A) abstract only Dietary patterns and habitat of the Grimm’s duiker, Sylvicapra grimmia in Benin, (P)Abdoul Razack Adjibi Oualiou, Jean Claude Codjia, Guy Apollinaire Mensah The distribution of protected areas and conservation of flora in the republic of Benin, (P)Aristide Adomou, Hounnankpon Yedomonhan, Brice Sinsin, Laurentius Josephus and Gerardus Van Der Maesen The problem of invasive plants in protected areas. Chromolaena odorata in the regeneration process of the dense, semi...

  15. Historical development of abstracting.

    Science.gov (United States)

    Skolnik, H

    1979-11-01

    The abstract, under a multitude of names, such as hypothesis, marginalia, abridgement, extract, digest, précis, resumé, and summary, has a long history, one which is concomitant with advancing scholarship. The progression of this history from the Sumerian civilization ca. 3600 B.C., through the Egyptian and Greek civilizations, the Hellenistic period, the Dark Ages, Middle Ages, Renaissance, and into the modern period is reviewed. PMID:399482

  16. WWNPQFT-2011 - Abstracts

    International Nuclear Information System (INIS)

    The object of this workshop is to consolidate and publicize new efforts in non-perturbative field theories. This year the presentations deal with quantum gravity, non-commutative geometry, fat-tailed wave-functions, strongly coupled field theories, space-times with two time-like dimensions, and multiplicative renormalization. A presentation is dedicated to the construction of a nucleon-nucleon potential from an analytical, non-perturbative, gauge-invariant QCD. This document gathers the abstracts of the presentations

  17. Using Personal Health Records for Automated Clinical Trials Recruitment: the ePaIRing Model

    OpenAIRE

    Wilcox, Adam; Natarajan, Karthik; Weng, Chunhua

    2009-01-01

    We describe the development of a model of the use of patient information to improve patient recruitment in clinical trials. This model, named ePaIRing (electronic Participant Identification and Recruitment Model), describes variations in how information flows between stakeholders, and how personal health records can specifically facilitate patient recruitment.

  18. Generic modeling and mapping languages for model management

    OpenAIRE

    Kensche, David Sebastian

    2010-01-01

    Activities in management of schemas and schema mappings are usually solved by special-purpose solutions such as coding wrapper components or manually updating view definitions. The goal of model management is to raise the level of abstraction for metadata-intensive activities by providing a set of high-level operators that automate or semi-automate such tasks. The problems of model management are aggravated by the fact that usually heterogeneous modeling languages, such as the relational data...

  19. Automated Design Space Exploration with Aspen

    Directory of Open Access Journals (Sweden)

    Kyle L. Spafford

    2015-01-01

    Full Text Available Architects and applications scientists often use performance models to explore a multidimensional design space of architectural characteristics, algorithm designs, and application parameters. With traditional performance modeling tools, these explorations forced users to first develop a performance model and then repeatedly evaluate and analyze the model manually. These manual investigations proved laborious and error-prone. More importantly, the complexity of this traditional process often forced users to simplify their investigations. To address this challenge of design space exploration, we extend our Aspen (Abstract Scalable Performance Engineering Notation) language with three new language constructs: user-defined resources, parameter ranges, and a collection of costs in the abstract machine model. Then, we use these constructs to enable automated design space exploration via a nonlinear optimization solver. We show how four interesting classes of design space exploration scenarios can be derived from Aspen models and formulated as pure nonlinear programs. The analysis tools are demonstrated using examples based on Aspen models for a three-dimensional Fast Fourier Transform, the CoMD molecular dynamics proxy application, and the DARPA Streaming Sensor Challenge Problem. Our results show that this approach can compose and solve arbitrary performance modeling questions quickly and rigorously when compared to the traditional manual approach.
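
    A design space exploration question of the kind derived from Aspen models can be posed as a small nonlinear program. The sketch below minimizes a made-up analytical runtime model under a memory constraint using scipy.optimize.minimize; the model, constants, and constraint are illustrative assumptions, not actual Aspen output.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Toy analytical performance model for a 3D-FFT-like kernel:
    # runtime(n, p) = compute time + a simple network cost term.
    def runtime(x):
        n, p = x
        flops = 5 * n**3 * np.log2(max(n, 2))          # total work
        return flops / (p * 1e9) + 1e-4 * np.sqrt(p)   # compute + comm

    # Constraint: per-node memory footprint must fit in 16 GB.
    mem_per_node = lambda x: 16e9 - (8 * x[0]**3) / x[1]

    res = minimize(runtime, x0=[512, 64],
                   bounds=[(64, 4096), (1, 10000)],
                   constraints=[{"type": "ineq", "fun": mem_per_node}])
    print("problem size %.0f on %.0f nodes -> %.3f s" % (*res.x, res.fun))
    ```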

  20. On the Application of Macros to the Automation of Different Dating Models Using 210Pb

    International Nuclear Information System (INIS)

    Different dating models based on 210Pb measurements, used for dating recent events, are shown in this report, as well as models that describe different processes affecting the vertical distribution of radionuclides in lacustrine and marine sediments. Macro-commands are programs included in calculation worksheets that allow operations to run automatically. In this report macros are used to: a) obtain 210Pb results from a database created from different sampling campaigns; b) apply different dating models automatically; c) optimise the diffusion coefficient employed by the models through standard-deviation calculations between experimental values and those obtained by the model. (Author) 21 refs
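
    As an example of the kind of dating model such macros automate, here is a sketch of the classical constant-rate-of-supply (CRS) 210Pb model, written in Python rather than as spreadsheet macros; the core data are invented for illustration.

    ```python
    import numpy as np

    LAMBDA_PB210 = np.log(2) / 22.3   # 210Pb decay constant, 1/yr

    def crs_ages(unsupported_activity, dry_mass):
        """Constant Rate of Supply (CRS) ages from unsupported 210Pb.
        A(x) is the cumulative inventory below depth x; the surface layer
        gets age 0. One of several classical 210Pb models; a sketch only."""
        inventory = unsupported_activity * dry_mass          # per-layer inventory
        below = np.cumsum(inventory[::-1])[::-1]             # A(x): below depth x
        total = below[0]                                     # A(0): whole core
        return (1.0 / LAMBDA_PB210) * np.log(total / below)  # t(x) in years

    # Illustrative core: activities declining roughly exponentially with depth.
    activity = np.array([120, 80, 50, 30, 15.0])    # Bq/kg, unsupported
    mass = np.array([0.5, 0.6, 0.6, 0.7, 0.7])      # g/cm^2 per layer
    print(np.round(crs_ages(activity, mass), 1))
    ```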

  1. Statistical colour models: an automated digital image analysis method for quantification of histological biomarkers

    OpenAIRE

    Shu, Jie; Dolman, G. E.; Duan, Jiang; Qiu, Guoping; Ilyas, Mohammad

    2016-01-01

    Background Colour is the most important feature used in quantitative immunohistochemistry (IHC) image analysis; IHC is used to provide information relating to aetiology and to confirm malignancy. Methods Statistical modelling is a technique widely used for colour detection in computer vision. We have developed a statistical model of colour detection applicable to detection of stain colour in digital IHC images. The model was first trained on a large set of colour pixels collected semi-automatically. To ...

  2. Abstract Cauchy problems three approaches

    CERN Document Server

    Melnikova, Irina V

    2001-01-01

    Although the theory of well-posed Cauchy problems is reasonably well understood, ill-posed problems, which arise in numerous mathematical models in physics, engineering, and finance, can be approached in a variety of ways. Historically, there have been three major strategies for dealing with such problems: semigroup, abstract distribution, and regularization methods. Semigroup and distribution methods restore well-posedness, in a modern weak sense. Regularization methods provide approximate solutions to ill-posed problems. Although these approaches were extensively developed over the last decades by many researchers, nowhere could one find a comprehensive treatment of all three approaches. Abstract Cauchy Problems: Three Approaches provides an innovative, self-contained account of these methods and, furthermore, demonstrates and studies some of the profound connections between them. The authors discuss the application of different methods not only to the Cauchy problem that is not well-posed in the classical sense, b...

  3. Application of Holdridge life-zone model based on the terrain factor in Xinjiang Autonomous Region

    Institute of Scientific and Technical Information of China (English)

    NI Yong-ming; OUYANG Zhi-yun; WANG Xiao-ke

    2005-01-01

    This study improved the application of the Holdridge life-zone model to simulating the distribution of desert vegetation in China, providing data to support eco-recovery and ecosystem reconstruction in desert areas. The desert vegetation was classified into four types: (1) LAD: little arbor desert; (2) SD: shrub desert; (3) HLHSD: half-shrub, little half-shrub desert; (4) LHSCD: little half-shrub cushion desert. Based on this classification of Xinjiang desert vegetation, the classical Holdridge life-zone model was used to simulate the vegetation's distribution, and the resulting Kappa coefficient was compared against a standard table of accuracy represented by Kappa values. The Kappa value of the model was only 0.19, meaning the simulation result was poor. To improve the model's application to Xinjiang desert vegetation types, a set of plot standards for terrain factors was developed, and the plot standard was used as the reclassification criterion for climate sub-regimes. The desert vegetation in Xinjiang was then simulated again. The average Kappa value of the second simulation for the respective climate regimes was 0.45, and the Kappa value of the final modeling result was 0.64, a considerably better value. This modification made the model applicable to more regions. Finally, the model's ecological relevance to the Xinjiang desert vegetation types was studied.
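
    The accuracy measure used above, the Kappa coefficient, can be computed directly from paired observed/simulated class maps. A minimal sketch with toy data, the four desert classes coded 0-3:

    ```python
    import numpy as np

    def kappa(observed, simulated, n_classes):
        """Cohen's kappa between an observed and a simulated vegetation map,
        the agreement measure used to score the life-zone model."""
        cm = np.zeros((n_classes, n_classes))
        for o, s in zip(observed, simulated):
            cm[o, s] += 1
        n = cm.sum()
        po = np.trace(cm) / n                  # observed agreement
        pe = (cm.sum(0) @ cm.sum(1)) / n**2    # agreement expected by chance
        return (po - pe) / (1 - pe)

    # Toy maps: LAD, SD, HLHSD, LHSCD coded as 0..3.
    obs = [0, 1, 1, 2, 2, 2, 3, 3]
    sim = [0, 1, 2, 2, 2, 1, 3, 3]
    print(round(kappa(obs, sim, 4), 2))   # ~0.65
    ```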

  4. Forecasting performance of three automated modelling techniques during the economic crisis 2007-2009

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    In this work we consider forecasting macroeconomic variables during an economic crisis. The focus is on a specific class of models, the so-called single hidden-layer feedforward autoregressive neural network models. What makes these models interesting in the present context is that they form a class of universal approximators and may be expected to work well during exceptional periods such as major economic crises. These models are often difficult to estimate, and we follow the idea of White (2006) to transform the specification and nonlinear estimation problem into a linear model selection and ... forecasting during the economic crisis 2007–2009. Forecast accuracy is measured by the root mean square forecast error. Hypothesis testing is also used to compare the performance of the different techniques with each other.

  5. AN OPTIMIZATION-BASED HEURISTIC FOR A CAPACITATED LOT-SIZING MODEL IN AN AUTOMATED TELLER MACHINES NETWORK

    Directory of Open Access Journals (Sweden)

    Supatchaya Chotayakul

    2013-01-01

    Full Text Available This research studies a cash inventory problem in an ATM network to satisfy customers' cash needs over multiple periods with deterministic demand. The objective is to determine the amount of money to place in Automated Teller Machines (ATMs) and cash centers for each period over a given time horizon. The algorithms are designed for a multi-echelon inventory problem with single-item capacitated lot-sizing to minimize the total costs of running the ATM network. In this study, we formulate the problem as a Mixed Integer Program (MIP) and develop an approach based on reformulating the model as a shortest-path formulation for finding a near-optimal solution of the problem. This reformulation is the same as the traditional model, except that the capacity constraints, inventory balance constraints and setup constraints related to the management of the money in ATMs are relaxed. This new formulation has more variables and constraints, but has a much tighter linear relaxation than the original and is faster to solve for short-term planning. Computational results show its effectiveness, especially for large-sized problems.
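
    To make the underlying optimization concrete, here is a minimal single-location capacitated lot-sizing MIP in PuLP. It keeps only the core ingredients of such models (inventory balance, capacity, and setup linking); the data and the single-ATM simplification are illustrative assumptions, not the paper's multi-echelon network or its shortest-path reformulation.

    ```python
    from pulp import (LpProblem, LpMinimize, LpVariable, lpSum,
                      LpBinary, PULP_CBC_CMD)

    # Decide replenishment x[t] for one ATM to meet cash demand d[t].
    T = range(4)
    d = [120, 80, 150, 100]          # demand per period (thousands)
    cap, setup, hold = 250, 50.0, 0.02

    x = LpVariable.dicts("replenish", T, lowBound=0)
    s = LpVariable.dicts("inventory", T, lowBound=0)
    y = LpVariable.dicts("visit", T, cat=LpBinary)   # 1 if refilled in t

    prob = LpProblem("atm_lot_sizing", LpMinimize)
    prob += lpSum(setup * y[t] + hold * s[t] for t in T)
    for t in T:
        prev = s[t - 1] if t > 0 else 0
        prob += prev + x[t] - d[t] == s[t]   # inventory balance
        prob += x[t] <= cap * y[t]           # capacity + setup linking

    prob.solve(PULP_CBC_CMD(msg=False))
    print([int(x[t].value()) for t in T], [int(y[t].value()) for t in T])
    ```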

  6. Automated Learning of Subcellular Variation among Punctate Protein Patterns and a Generative Model of Their Relation to Microtubules.

    Directory of Open Access Journals (Sweden)

    Gregory R Johnson

    2015-12-01

    Full Text Available Characterizing the spatial distribution of proteins directly from microscopy images is a difficult problem with numerous applications in cell biology (e.g., identifying motor-related proteins) and clinical research (e.g., identification of cancer biomarkers). Here we describe the design of a system that provides automated analysis of punctate protein patterns in microscope images, including quantification of their relationships to microtubules. We constructed the system using confocal immunofluorescence microscopy images from the Human Protein Atlas project for 11 punctate proteins in three cultured cell lines. These proteins have previously been characterized as being primarily located in punctate structures, but their images had all been annotated by visual examination as being simply "vesicular". We were able to show that these patterns could be distinguished from each other with high accuracy, and we were able to assign to one of these subclasses hundreds of proteins whose subcellular localization had not previously been well defined. In addition to providing these novel annotations, we built a generative approach to modeling of punctate distributions that captures the essential characteristics of the distinct patterns. Such models are expected to be valuable for representing and summarizing each pattern and for constructing systems biology simulations of cell behaviors.

  7. Automated estimation of the truncation of room impulse response by applying a nonlinear decay model.

    Science.gov (United States)

    Janković, Marko; Ćirić, Dejan G; Pantić, Aleksandar

    2016-03-01

    Noise represents one of the most significant disturbances in measured room impulse responses (RIRs), and it has a potentially large impact on evaluation of the decay parameters. In order to reduce noise effects, various methods have been applied, including truncation of the RIR. In this paper, a procedure for response truncation based on a nonlinear decay model of the RIR is presented. The model is represented by an exponential decay plus stationary noise. The unknown parameters of the model are calculated by an optimization that minimizes the difference between the curve generated by the model and the target curve of the response to be truncated. Different curves can be used in the optimization: the absolute value of the RIR, the logarithmic decay curve, or the Schroeder curve obtained by backward integration of the RIR. The proposed procedure is tested on various synthesized and measured impulse responses. It is compared with a procedure taken from the literature that is often applied in practice. PMID:27036242
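
    The nonlinear decay model described above, an exponential decay plus a stationary noise floor, can be sketched with a straightforward curve fit; the truncation point is then where the fitted decay term meets the fitted noise floor. The synthetic response and starting values below are assumptions for illustration.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def decay_model(t, a, tau, c):
        # Exponential energy decay plus a stationary noise floor.
        return a * np.exp(-2 * t / tau) + c

    # Synthetic squared-RIR envelope: ~1 s reverberation plus noise
    # (tau ~ RT60 / 6.91 for a 60 dB energy decay).
    fs, rt60 = 8000, 1.0
    t = np.arange(2 * fs) / fs
    tau = rt60 / 6.91
    energy = decay_model(t, 1.0, tau, 1e-4) * np.random.chisquare(1, t.size)

    p, _ = curve_fit(decay_model, t, energy, p0=[0.5, 0.2, 1e-3])
    a, tau_hat, c = p
    t_trunc = 0.5 * tau_hat * np.log(a / c)   # decay term meets noise floor
    print("truncate at %.2f s" % t_trunc)
    ```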

  8. Beyond the abstractions?

    DEFF Research Database (Denmark)

    Olesen, Henning Salling

    2006-01-01

    The anniversary of the International Journal of Lifelong Education takes place in the middle of a conceptual landslide from lifelong education to lifelong learning. Contemporary discourses of lifelong learning are, however, abstractions behind which new functions and agendas for adult education are set. The ideological discourse of recent policies seems to neglect the fact that history and resources for lifelong learning are different across Europe, and also neglects the multiplicity of adult learners. Instead of refusing the new agendas, however, adult education research should try to dissolve ... learning. Adult education research must fulfil its potential conversion from normative philosophy to critical and empirical social science.

  9. Program and abstracts

    International Nuclear Information System (INIS)

    This volume contains the program and abstracts of the conference. The following topics are included: metal vapor molecular lasers, magnetohydrodynamics, rare gas halide and nuclear pumped lasers, transfer mechanisms in arcs, kinetic processes in rare gas halide lasers, arcs and flows, XeF kinetics and lasers, fundamental processes in excimer lasers, electrode effects and vacuum arcs, electron and ion transport, ion interactions and mobilities, glow discharges, diagnostics and afterglows, dissociative recombination, electron ionization and excitation, rare gas excimers and group VI lasers, breakdown, novel laser pumping techniques, electrode-related discharge phenomena, photon interactions, attachment, plasma chemistry and infrared lasers, electron scattering, and reactions of excited species

  10. Circularity and Lambda Abstraction

    DEFF Research Database (Denmark)

    Danvy, Olivier; Thiemann, Peter; Zerny, Ian

    2013-01-01

    In this tribute to Doaitse Swierstra, we present the first transformation between lazy circular programs à la Bird and strict circular programs à la Pettorossi. Circular programs à la Bird rely on lazy recursive binding: they involve circular unknowns and make sense equationally. Circular ... unknowns from what is done to them, which we lambda-abstract with functions. The circular unknowns then become dead variables, which we eliminate. The result is a strict circular program à la Pettorossi. This transformation is reversible: given a strict circular program à la Pettorossi, we introduce...

  11. IPR 2016. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2016-05-15

    The volume on the meeting of pediatric radiology includes abstracts on the following issues: chest, cardiovascular system, neuroradiology, CT radiation DRLs (diagnostic reference levels) and dose reporting guidelines, genitourinary imaging, gastrointestinal radiology, oncology and nuclear medicine, whole body imaging, fetal/neonatal imaging, child abuse, oncology and hybrid imaging, value-added imaging, musculoskeletal imaging, dose and radiation safety, imaging children - immobilization and distraction techniques, information - education - QI and healthcare policy, ALARA, the knowledge, skills and competences for a technologist/radiographer in pediatric radiology, full exploitation of new technological features in pediatric CT, image quality issues in pediatrics, abdominal imaging, interventional radiology, MR contrast agents, tumor - mass imaging, cardiothoracic imaging, ultrasonography.

  12. ESPR 2015. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2015-05-10

    The volume includes the abstracts of the ESPR 2015 covering the following topics: PCG (postgraduate courses): radiography; fluoroscopy and general issues; nuclear medicine, interventional radiology and hybrid imaging, pediatric CT, pediatric ultrasound; MRI in childhood. Scientific sessions and task force sessions: international aspects; neuroradiology, neonatal imaging, engineering techniques to simulate injury in child abuse, CT - dose and quality, challenges in the chest, cardiovascular and chest, musculoskeletal, oncology, pediatric uroradiology and abdominal imaging, fetal and postmortem imaging, education and global challenges, neuroradiology - head and neck, gastrointestinal and genitourinary.

  13. IPR 2016. Abstracts

    International Nuclear Information System (INIS)

    The volume on the meeting of pediatric radiology includes abstracts on the following issues: chest, cardiovascular system, neuroradiology, CT radiation DRLs (diagnostic reference levels) and dose reporting guidelines, genitourinary imaging, gastrointestinal radiology, oncology and nuclear medicine, whole body imaging, fetal/neonatal imaging, child abuse, oncology and hybrid imaging, value-added imaging, musculoskeletal imaging, dose and radiation safety, imaging children - immobilization and distraction techniques, information - education - QI and healthcare policy, ALARA, the knowledge, skills and competences for a technologist/radiographer in pediatric radiology, full exploitation of new technological features in pediatric CT, image quality issues in pediatrics, abdominal imaging, interventional radiology, MR contrast agents, tumor - mass imaging, cardiothoracic imaging, ultrasonography.

  14. Abstracts of Major Articles

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    On Problems in Fujian's Present Health Insurance Professionals and Related Suggestions LIN Deng-hui, WU Xiao-nan (School of Public Health, Fujian Medical University, Fuzhou 350108, China) Abstract: Based on a statistical analysis of questionnaire survey data collected from practitioners in Fujian's medical insurance management system, the paper discusses the problems relevant to the staff's quality structure in this industry as well as mechanisms for continuing education and motivation. Finally, the authors advance such suggestions as increasing practitioners' expertise and working capacity by developing disciplinary and continuing education, and encouraging employees to get highly motivated with a well-established motivation system.

  15. Automated Test Case Generation

    CERN Document Server

    CERN. Geneva

    2015-01-01

    I would like to present the concept of automated test case generation. I work on it as part of my PhD and I think it would be interesting also for other people. It is also the topic of a workshop paper that I am introducing in Paris. (abstract below) Please note that the talk itself would be more general and not about the specifics of my PhD, but about the broad field of Automated Test Case Generation. I would introduce the main approaches (combinatorial testing, symbolic execution, adaptive random testing) and their advantages and problems. (oracle problem, combinatorial explosion, ...) Abstract of the paper: Over the last decade code-based test case generation techniques such as combinatorial testing or dynamic symbolic execution have seen growing research popularity. Most algorithms and tool implementations are based on finding assignments for input parameter values in order to maximise the execution branch coverage. Only few of them consider dependencies from outside the Code Under Test’s scope such...

  16. Forecasting performances of three automated modelling techniques during the economic crisis 2007-2009

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    2014-01-01

    In this work we consider the forecasting of macroeconomic variables during an economic crisis. The focus is on a specific class of models, the so-called single hidden-layer feed-forward autoregressive neural network models. What makes these models interesting in the present context is the fact that they form a class of universal approximators and may be expected to work well during exceptional periods such as major economic crises. Neural network models are often difficult to estimate, and we follow the idea of White (2006) of transforming the specification and nonlinear estimation problem into a ... Scandinavian ones, and focus on forecasting during the economic crisis 2007–2009. The forecast accuracy is measured using the root mean square forecast error. Hypothesis testing is also used to compare the performances of the different techniques.

  17. Automated Modeling and Simulation Using the Bond Graph Method for the Aerospace Industry

    Science.gov (United States)

    Granda, Jose J.; Montgomery, Raymond C.

    2003-01-01

    Bond graph modeling was originally developed in the late 1950s by the late Prof. Henry M. Paynter of M.I.T. Prof. Paynter was well ahead of his time, as the main advantages of his creation, beyond the modeling insight it provides and its ability to deal effectively with mechatronics, came to fruition only with the recent advent of modern computer technology and the tools derived from it, including symbolic manipulation, MATLAB, SIMULINK, and the Computer Aided Modeling Program (CAMPG). Thus, only recently have these tools been available, allowing one to fully utilize the advantages that the bond graph method has to offer. The purpose of this paper is to help fill the knowledge void concerning the use of bond graphs in the aerospace industry. The paper first presents simple examples to serve as a tutorial on bond graphs for those not familiar with the technique; the reader is given the basic understanding needed to appreciate the applications that follow. After that, several aerospace applications are developed, such as modeling of an arresting system for aircraft-carrier landings, suspension models used for landing gears, and multibody dynamics. The paper also presents an update on NASA's progress in modeling the International Space Station (ISS) using bond graph techniques, and an advanced actuation system utilizing shape memory alloys. The latter covers the mechatronics advantages of the bond graph method: applications that simultaneously involve mechanical, hydraulic, thermal, and electrical subsystem modeling.
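
    To give a flavor of what a bond graph yields, the sketch below integrates the state equations read off a textbook mass-spring-damper bond graph (an I, C, and R element on a common 1-junction driven by an effort source); the element values are arbitrary and the example is not taken from the paper.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Bond graph elements: I (mass m), C (spring compliance C), R (damper b)
    # on a common 1-junction driven by an effort source Se (applied force).
    m, C, b, Se = 1.0, 0.5, 0.4, 1.0

    def bond_graph_rhs(t, state):
        p, q = state                 # p: momentum of I, q: displacement of C
        v = p / m                    # common flow at the 1-junction
        dp = Se - b * v - q / C      # effort balance at the junction
        return [dp, v]

    sol = solve_ivp(bond_graph_rhs, (0, 20), [0, 0], max_step=0.05)
    print("steady-state displacement ~ Se*C =", round(sol.y[1, -1], 2))
    ```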

  18. Automated test generation for production systems with a model-based testing approach

    OpenAIRE

    Durand, William

    2016-01-01

    This thesis tackles the problem of testing (legacy) production systems such as those of our industrial partner Michelin, one of the three largest tire manufacturers in the world, by means of Model-based Testing. A production system is defined as a set of production machines controlled by a software, in a factory. Despite the large body of work within the field of Model-based Testing, a common issue remains the writing of models describing either the system under test or its specification. It ...

  19. Bringing Automated Model Checking to PLC Program Development - A CERN Case Study

    CERN Document Server

    Fernandez Adiego, B; Tournier, J-C; Blanco Vinuela, E; Gonzalez Suarez, V M

    2014-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. Model checking appears to be an appropriate approach for this purpose; however, this technique is not yet widely used in industry. The main obstacles encountered when trying to apply formal verification techniques at industrial installations are the difficulty of creating models out of PLC programs and of formally defining the specification requirements. In addition, models produced from real-life programs have a huge state space, preventing verification due to performance issues. Our work at CERN (European Organization for Nuclear Research) focuses on developing efficient automatic verification methods for industrial critical installations based on PLC (Programmable Logic Controller) control systems. In this paper, we present a tool that automatically generates formal models out of PLC code. The tool implements a general methodology which can support several input languages, ...

  20. Automation Security

    OpenAIRE

    Mirzoev, Dr. Timur

    2014-01-01

    Web-based Automated Process Control systems are a new type of application that uses the Internet to control industrial processes with access to real-time data. Supervisory control and data acquisition (SCADA) networks contain computers and applications that perform key functions in providing essential services and commodities (e.g., electricity, natural gas, gasoline, water, waste treatment, transportation) to all Americans. As such, they are part of the nation's critical infrastructu...

  1. Automated prostate cancer detection via comprehensive multi-parametric magnetic resonance imaging texture feature models

    International Nuclear Information System (INIS)

    Prostate cancer is the most common form of cancer among men and the second leading cause of male cancer death in North America. Auto-detection of prostate cancer can play a major role in early detection of prostate cancer, which has a significant impact on patient survival rates. While multi-parametric magnetic resonance imaging (MP-MRI) has shown promise in diagnosis of prostate cancer, the existing auto-detection algorithms do not take advantage of the abundance of data available in MP-MRI to improve detection accuracy. The goal of this research was to design a radiomics-based auto-detection method for prostate cancer that utilizes MP-MRI data. In this work, we present new MP-MRI texture feature models for radiomics-driven detection of prostate cancer. In addition to commonly used non-invasive imaging sequences in conventional MP-MRI, namely T2-weighted MRI (T2w) and diffusion-weighted imaging (DWI), our proposed MP-MRI texture feature models incorporate computed high-b DWI (CHB-DWI) and a new diffusion imaging modality called correlated diffusion imaging (CDI). Moreover, the proposed texture feature models incorporate features from individual b-value images. A comprehensive set of texture features was calculated for both the conventional MP-MRI and the new MP-MRI texture feature models. We performed feature selection analysis for each individual modality and then combined the best features from each modality to construct the optimized texture feature models. The performance of the proposed MP-MRI texture feature models was evaluated via leave-one-patient-out cross-validation using a support vector machine (SVM) classifier trained on 40,975 cancerous and healthy tissue samples obtained from real clinical MP-MRI datasets. The proposed MP-MRI texture feature models outperformed the conventional model (i.e., T2w+DWI) with regard to cancer detection accuracy. Comprehensive texture feature models were developed for improved radiomics-driven detection of prostate cancer using MP-MRI. Using a
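
    A stripped-down version of such a radiomics pipeline, GLCM texture features fed to an SVM under leave-one-patient-out cross-validation, can be sketched as follows. The random patches and labels below are placeholders for the clinical MP-MRI data, and the feature set is far smaller than the paper's comprehensive models.

    ```python
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops
    from sklearn.svm import SVC
    from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

    def texture_features(patch):
        """GLCM texture descriptors for one 8-bit grayscale image patch."""
        glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                            levels=256, symmetric=True, normed=True)
        return np.hstack([graycoprops(glcm, prop).ravel()
                          for prop in ("contrast", "homogeneity",
                                       "energy", "correlation")])

    # Synthetic stand-ins for MP-MRI patches: one patch per tissue sample.
    rng = np.random.default_rng(0)
    patches = rng.integers(0, 256, size=(60, 32, 32), dtype=np.uint8)
    X = np.array([texture_features(p) for p in patches])
    y = rng.integers(0, 2, 60)              # cancerous vs healthy (toy labels)
    patient = np.repeat(np.arange(10), 6)   # 6 samples per patient

    # Leave-one-patient-out cross-validation, as in the study design.
    scores = cross_val_score(SVC(kernel="rbf"), X, y,
                             cv=LeaveOneGroupOut(), groups=patient)
    print("mean accuracy: %.2f" % scores.mean())
    ```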

  2. Automated Generation of 3D Volcanic Gas Plume Models for Geobrowsers

    Science.gov (United States)

    Wright, T. E.; Burton, M.; Pyle, D. M.

    2007-12-01

    A network of five UV spectrometers on Etna automatically gathers column amounts of SO2 during daylight hours. Near-simultaneous scans from adjacent spectrometers, comprising 210 column amounts in total, are then converted to 2D slices showing the spatial distribution of the gas by tomographic reconstruction. The trajectory of the plume is computed using an automatically-submitted query to NOAA's HYSPLIT Trajectory Model. This also provides local estimates of air temperature, which are used to determine the atmospheric stability and therefore the degree to which the plume is dispersed by turbulence. This information is sufficient to construct an animated sequence of models which show how the plume is advected and diffused over time. These models are automatically generated in the Collada Digital Asset Exchange format and combined into a single file which displays the evolution of the plume in Google Earth. These models are useful for visualising and predicting the shape and distribution of the plume for civil defence, to assist field campaigns and as a means of communicating some of the work of volcano observatories to the public. The Simultaneous Algebraic Reconstruction Technique is used to create the 2D slices. This is a well-known method, based on iteratively updating a forward model (from 2D distribution to column amounts). Because it is based on a forward model, it also provides a simple way to quantify errors.
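
    The Simultaneous Algebraic Reconstruction Technique itself is compact enough to sketch: iterate a relaxed, row- and column-normalized correction of the forward model's residual. The toy 2x2 grid and ray geometry below are illustrative assumptions, not the Etna network's scan geometry.

    ```python
    import numpy as np

    def sart(A, b, iterations=50, relax=0.5):
        """SART: iteratively update the 2D gas distribution x so that the
        forward model A @ x matches the measured column amounts b."""
        x = np.zeros(A.shape[1])
        row_sums = A.sum(axis=1); row_sums[row_sums == 0] = 1
        col_sums = A.sum(axis=0); col_sums[col_sums == 0] = 1
        for _ in range(iterations):
            residual = (b - A @ x) / row_sums       # per-ray mismatch
            x += relax * (A.T @ residual) / col_sums
            x = np.maximum(x, 0)                    # concentrations >= 0
        return x

    # Tiny toy geometry: 4 rays through a 2x2 grid (rows are path lengths).
    A = np.array([[1, 1, 0, 0],    # ray through top row
                  [0, 0, 1, 1],    # bottom row
                  [1, 0, 1, 0],    # left column
                  [0, 1, 0, 1.0]]) # right column
    x_true = np.array([0.0, 2.0, 1.0, 3.0])
    print(np.round(sart(A, A @ x_true, iterations=200), 2))
    ```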

  3. Reference Model of Desired Yaw Angle for Automated Lane Changing Behavior of Vehicle

    Institute of Scientific and Technical Information of China (English)

    Dianbo Ren; Guanzhe Zhang; Hangzhe Wu

    2016-01-01

    This paper studies the problem of trajectory planning and tracking for the lane-changing behavior of vehicles in automated highway systems. Based on a model of yaw-angle acceleration with a positive and negative trapezoid constraint, and by analyzing the variation laws of the yaw motion of a vehicle during a lane-changing maneuver, a reference model of the desired yaw angle and yaw rate for lane changing is generated. From the yaw-angle model, the longitudinal and lateral coordinates of the lane-change trajectory are calculated. Assuming that the road curvature is constant, the differences and connections between lane-changing maneuvers on curved roads and on straight roads are analyzed, and on this basis a method for calculating the desired yaw angle for lane changing on a circular road is derived. Simulation results show that, unlike the traditional lateral-acceleration planning method with a trapezoid constraint, the trapezoidal yaw-acceleration reference model proposed in this paper yields a continuous expected yaw angular acceleration, so that step tracking of the steering angle need not be implemented. Because the desired yaw model is designed directly from the variation laws of the yaw motion of the vehicle during a lane-changing maneuver, rather than calculated indirectly from a lane-change trajectory model, the calculation steps are simplified.
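
    For illustration, a minimal sketch of the integration chain behind such a profile: a positive trapezoid in yaw acceleration followed by its negative mirror returns the yaw rate smoothly to zero and leaves a constant heading change, and a full lane change concatenates such segments. All durations and amplitudes below are made-up values, not the paper's:

      # Sketch: trapezoidal yaw-acceleration pulse pair, integrated twice.
      import numpy as np

      def trapezoid(t, t_ramp, t_flat, peak):
          """One trapezoidal pulse on [0, 2*t_ramp + t_flat]."""
          up = np.clip(t / t_ramp, 0.0, 1.0)                       # ramp up
          down = np.clip((2.0 * t_ramp + t_flat - t) / t_ramp, 0.0, 1.0)
          return peak * np.minimum(up, down)

      dt = 0.01
      t = np.arange(0.0, 6.0, dt)
      t_ramp, t_flat, peak = 0.2, 0.8, 0.05    # s, s, rad/s^2 (illustrative)
      pulse = 2.0 * t_ramp + t_flat            # duration of one trapezoid
      # positive trapezoid, then its negative mirror one pulse later
      acc = trapezoid(t, t_ramp, t_flat, peak) - trapezoid(t - pulse, t_ramp, t_flat, peak)
      yaw_rate = np.cumsum(acc) * dt           # integrate acceleration -> rate
      yaw_angle = np.cumsum(yaw_rate) * dt     # integrate rate -> heading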

  4. Automation of reverse engineering process in aircraft modeling and related optimization problems

    Science.gov (United States)

    Li, W.; Swetits, J.

    1994-01-01

    During 1994, engineering problems in aircraft modeling were studied. The initial concern was to obtain a surface model with desirable geometric characteristics. Much of the effort during the first half of the year went into finding an efficient way of solving a computationally difficult optimization model. Since the smoothing technique in the proposal 'Surface Modeling and Optimization Studies of Aerodynamic Configurations' requires solutions of a sequence of large-scale quadratic programming problems, it is important to design algorithms that can solve each quadratic program in a few iterations. This research led to three papers by Dr. W. Li, which were submitted to SIAM Journal on Optimization and Mathematical Programming; two of these papers have been accepted for publication. Even though significant progress was made during this phase of research, and computation time was reduced from 30 min. to 2 min. for a sample problem, it was not good enough for on-line processing of digitized data points. After discussion with Dr. Robert E. Smith Jr., it was decided not to enforce shape constraints in order to simplify the model. As a consequence, P. Dierckx's nonparametric spline fitting approach was adopted, in which there is only one control parameter for the fitting process - the error tolerance. At the same time, the surface modeling software developed by Imageware was tested. Research indicated that a substantially improved fit of digitized data points can be achieved if a proper parameterization of the spline surface is chosen. A winning strategy is to combine Dierckx's surface fitting with a natural parameterization for aircraft parts. The report consists of 4 chapters. Chapter 1 provides an overview of reverse engineering related to aircraft modeling and some preliminary findings from the effort in the second half of the year. Chapters 2-4 are the research results by Dr. W. Li on penalty functions and conjugate gradient methods for
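
    For illustration, a sketch of Dierckx-style smoothing-spline surface fitting with the error tolerance as the single control parameter; scipy's interpolate module wraps Dierckx's FITPACK library, but the data here are synthetic and this is not the project's actual code:

      # Sketch: fit a smoothing-spline surface to noisy digitized points.
      import numpy as np
      from scipy import interpolate

      rng = np.random.default_rng(0)
      x, y = rng.uniform(0, 1, 400), rng.uniform(0, 1, 400)
      noise = 0.01
      z = np.sin(2 * np.pi * x) * np.cos(np.pi * y) + noise * rng.normal(size=400)

      # s acts as the error tolerance: larger s gives a smoother surface.
      tck = interpolate.bisplrep(x, y, z, s=len(x) * noise**2)
      z_fit = interpolate.bisplev(0.5, 0.5, tck)   # evaluate the fitted surface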

  5. Study on dynamic model of tractor system for automated navigation applications

    Institute of Scientific and Technical Information of China (English)

    FENG Lei; HE Yong

    2005-01-01

    This research uses a dynamic model of a tractor system to support navigation system design for an automatically guided agricultural tractor. The model, built around a bicycle model of the tractor system, was implemented in the MATLAB environment and developed based on a John Deere tractor. The simulation results from this MATLAB model were validated through field navigation tests. The accuracy of the trajectory estimation is strongly affected by the determination of the cornering stiffness of the tractor; in this work, the tractor cornering stiffness was identified during simulation analysis using the MATLAB model and the recorded trajectory data. The identified values were then used in simulation analyses of various navigation operations in the field of interest. Analysis of the field validation test results indicated that the developed model can accurately estimate the wheel trajectories of a tractor operating in agricultural fields at various speeds, and can accurately determine tractor velocity and steering angle while the tractor operates in curved fields.
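
    For illustration, a minimal linear bicycle-model sketch of the kind described, advanced by Euler integration; the mass, inertia, geometry and cornering-stiffness values are placeholders, not the identified tractor parameters:

      # Sketch: linear dynamic bicycle model, state = [beta, r, psi, x, y].
      import numpy as np

      def step(state, delta, p, dt=0.01):
          beta, r, psi, x, y = state               # slip angle, yaw rate, heading, position
          af = beta + p["lf"] * r / p["v"] - delta # front tire slip angle
          ar = beta - p["lr"] * r / p["v"]         # rear tire slip angle
          Fyf, Fyr = -p["Cf"] * af, -p["Cr"] * ar  # lateral forces from cornering stiffness
          dbeta = (Fyf + Fyr) / (p["m"] * p["v"]) - r
          dr = (p["lf"] * Fyf - p["lr"] * Fyr) / p["Iz"]
          dx = p["v"] * np.cos(psi + beta)
          dy = p["v"] * np.sin(psi + beta)
          return state + dt * np.array([dbeta, dr, r, dx, dy])

      params = dict(m=4500.0, Iz=9000.0, lf=1.4, lr=1.6, Cf=8e4, Cr=9e4, v=3.0)
      state = np.zeros(5)
      for _ in range(500):                         # 5 s at a constant 5 deg steer
          state = step(state, np.deg2rad(5.0), params)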

  6. Interactional Metadiscourse in Research Article Abstracts

    Science.gov (United States)

    Gillaerts, Paul; Van de Velde, Freek

    2010-01-01

    This paper deals with interpersonality in research article abstracts analysed in terms of interactional metadiscourse. The evolution in the distribution of three prominent interactional markers comprised in Hyland's (2005a) model, viz. hedges, boosters and attitude markers, is investigated in three decades of abstract writing in the field of…

  7. Transcranial Magnetic Stimulation: An Automated Procedure to Obtain Coil-specific Models for Field Calculations

    DEFF Research Database (Denmark)

    Madsen, Kristoffer Hougaard; Ewald, Lars; Siebner, Hartwig R.; Thielscher, Axel

    2015-01-01

    Background: Field calculations for transcranial magnetic stimulation (TMS) are increasingly implemented online in neuronavigation systems and in more realistic offline approaches based on finite-element methods. They are often based on simplified and/or non-validated models of the magnetic vector...... approach to determine the magnetic vector potential via volume integration of the measured field. Results: The integration approach reproduces the vector potential with very good accuracy. The vector potential distribution of a standard figure-of-eight shaped coil determined with our setup corresponds well...... with that calculated using a model reconstructed from x-ray images. Conclusion: The setup can supply validated models for existing and newly appearing TMS coils....
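
    For illustration, one way such a volume integration can be realized numerically: in the Coulomb gauge ∇²A = −∇×B, so A(r) = (1/4π) ∫ (∇′×B)(r′)/|r−r′| dV′. The grid layout and names below are assumptions, not the authors' implementation:

      # Sketch: vector potential A from a field B sampled on a regular grid.
      import numpy as np

      def vector_potential(B, xs, ys, zs, r_obs):
          """B: (3, nx, ny, nz) measured field; r_obs: (3,) observation point."""
          dx, dy, dz = xs[1] - xs[0], ys[1] - ys[0], zs[1] - zs[0]
          g = [np.gradient(B[i], dx, dy, dz) for i in range(3)]  # g[i][j] = dB_i/dx_j
          curl = np.array([g[2][1] - g[1][2],
                           g[0][2] - g[2][0],
                           g[1][0] - g[0][1]])
          X, Y, Z = np.meshgrid(xs, ys, zs, indexing="ij")
          dist = np.sqrt((X - r_obs[0])**2 + (Y - r_obs[1])**2 + (Z - r_obs[2])**2)
          dist = np.maximum(dist, dx)              # crude guard for the singular cell
          dV = dx * dy * dz
          return np.array([(curl[i] / dist).sum() for i in range(3)]) * dV / (4 * np.pi)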

  8. Automated measurement of Drosophila wings

    Directory of Open Access Journals (Sweden)

    Mezey Jason

    2003-12-01

    Background: Many studies in evolutionary biology and genetics are limited by the rate at which phenotypic information can be acquired. The wings of Drosophila species are a favorable target for automated analysis because of the many interesting questions in evolution and development that can be addressed with them, and because of their simple structure. Results: We have developed an automated image analysis system (WINGMACHINE) that measures the positions of all the veins and the edges of the wing blade of Drosophilid flies. A video image is obtained with the aid of a simple suction device that immobilizes the wing of a live fly. Low-level processing is used to find the major intersections of the veins; high-level processing then optimizes the fit of an a priori B-spline model of wing shape. WINGMACHINE allows the measurement of one wing per minute, including handling, imaging, analysis, and data editing. The repeatabilities of 12 vein intersections averaged 86% in a sample of flies of the same species and sex. Comparison of 2400 wings of 25 Drosophilid species shows that wing shape is quite conservative within the group, but that almost all taxa are diagnosably different from one another. Wing shape retains some phylogenetic structure, although some species have shapes very different from closely related species. The WINGMACHINE system facilitates artificial selection experiments on complex aspects of wing shape. We selected on an index which is a function of 14 separate measurements of each wing; after 14 generations, we achieved a 15 S.D. difference between up- and down-selected treatments. Conclusion: WINGMACHINE enables rapid, highly repeatable measurements of wings in the family Drosophilidae. Our approach to image analysis may be applicable to a variety of biological objects that can be represented as a framework of connected lines.

  9. A LARI Experience (Abstract)

    Science.gov (United States)

    Cook, M.

    2015-12-01

    (Abstract only) In 2012, Lowell Observatory launched the Lowell Amateur Research Initiative (LARI) to formally involve amateur astronomers in scientific research by bringing them to the attention of professional astronomers and helping with their astronomical research. One of the LARI projects is the BVRI photometric monitoring of Young Stellar Objects (YSOs), wherein amateurs obtain observations to search for new outburst events and characterize the colour evolution of previously identified outbursters. A summary of the scientific and organizational aspects of this LARI project is presented, including its goals and science motivation, the process for getting involved with the project, a description of the team members, their equipment and methods of collaboration, and an overview of the programme stars, preliminary findings, and lessons learned.

  10. Abstracts of Selected Papers

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    On the Social Solidarity of Organizations: An Empirical Analysis. Li Hanlin. Abstract: Based on 2002 survey data, this paper tries to measure solidarity in organizations. The measurement is operationalized from two points of view: the degree of cohesion and the degree of vulnerability. To observe and measure the degree of cohesion, three subscales are used: social support, vertical integration and organizational identity. To observe and measure the degree of vulnerability, three further subscales are used: dissatisfaction, relative deprivation and anomie. Finally, the paper explores the conditions under which organizational behavior and behavioral orientation become similar or diverge. Key words: Organization; Cohesion; Vulnerability; Organization Behavior

  11. ABSTRACTS AND KEY WORDS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Determination of the Estrogenic Alkylphenols and Bisphenol A in Marine Sediments by Gas Chromatography-Mass Spectrometry. Deng Xu-xiu et al. (1) Abstract: Octylphenol, nonylphenol and bisphenol A are recognized environmental endocrine disruptors. A quantitative method was established for the simultaneous determination of octylphenol, nonylphenol and bisphenol A in marine sediments by gas chromatography-mass spectrometry. The test sample was extracted with methanol using an ultrasonic technique, purified with copper powder and a carbon solid-phase extraction column, and derivatized with heptafluorobutyric anhydride. The analytes were then separated on an HP-5ms column and determined by gas chromatography-mass spectrometry. The recovery of the method was between 84.3% and 94.5%, and the LOQs of 4-n-octylphenol, nonylphenol and bisphenol A were 0.25 μg/kg, 0.15 μg/kg and 0.15 μg/kg, respectively. Key words: octylphenol; nonylphenol; bisphenol A; gas chromatography-mass spectrometry

  12. Contents and Abstracts

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    [Ancient Mediterranean Civilizations] Title: On Poseidon's Image in the Homeric Epics. Author: Zhu Yizhang, Lecturer, School of History and Culture, Shandong University, Jinan, Shandong, 250100, China. Abstract: Poseidon was an important figure in the religion, myth and literature of ancient Greece. His religious functions and his mythical image in literature were mainly established by the Homeric Epics. Poseidon not only appears frequently in the Homeric Epics but also directly influences the development of their plots; he can therefore be seen as one of the most important gods in the Epics. Yet the Homeric Epics do not introduce his basic image clearly. In them, Poseidon combines divine and human aspects, with the latter emphasized, which implies that his archetype was a mortal wanax.

  13. ICENES 2007 Abstracts

    International Nuclear Information System (INIS)

    This book contains the conference program and abstracts of the 13th International Conference on Emerging Nuclear Energy Systems, held on 03-08 June 2007 in Istanbul, Turkey. The main objective of the International Conference series on Emerging Nuclear Energy Systems (ICENES) is to provide an international scientific and technical forum for scientists, engineers, industry leaders, policy makers, decision makers and young professionals who will shape future energy supply and technology, for a broad review and discussion of various advanced, innovative and non-conventional nuclear energy production systems. The main topics of the 159 accepted papers from 35 countries are fusion science and technology, fission reactors, accelerator driven systems, transmutation, lasers in nuclear technology, radiation shielding, nuclear reactions, hydrogen energy, solar energy, low energy physics and societal issues.

  14. ABSTRACTS AND KEY WORDS

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Comparative Study on the Adhesion Effect of Different Materials for Sepia esculenta. Wang Xue-mei et al. (1) Abstract: PE harness, mesh, sea cucumber seedling-box attachment, sorghum bar, tamarix (fresh and old), Artemisia annua (fresh and old) and an artificial egg-based substrate were used as spawning substrates for Sepia esculenta in a comparative study of adhesion effect during artificial breeding. The results showed that the best substrate was the artificial egg-based substrate produced by the process invented in this study, followed by old Artemisia annua and tamarix. PE harness, mesh, sea cucumber seedling-box attachment and sorghum bar were unsatisfactory as spawning substrates for Sepia esculenta. Key words: Sepia esculenta; adhesion effect; different materials

  15. Towards Automation 2.0: A Neurocognitive Model for Environment Recognition, Decision-Making, and Action Execution

    OpenAIRE

    Zucker Gerhard; Dietrich Dietmar; Velik Rosemarie

    2011-01-01

    The ongoing penetration of building automation by information technology is far from saturated. Today's systems must not only be reliable and fault tolerant, they also have to account for energy efficiency and flexibility in overall consumption. Meeting the quality and comfort goals in building automation, while at the same time optimizing for energy, carbon footprint and cost-efficiency, requires systems that are able to handle large amounts of information and negotiate system behaviour ...

  16. Economic Perspectives on Automated Demand Responsive Transportation and Shared Taxi Services - Analytical models and simulations for policy analysis

    OpenAIRE

    Jokinen, Jani-Pekka

    2016-01-01

    The automated demand responsive transportation (DRT) and modern shared taxi services provide shared trips for passengers, adapting dynamically to trip requests by routing a fleet of vehicles operating without any fixed routes or schedules. Compared with traditional public transportation, these new services provide trips without transfers and free passengers from the necessity of using timetables and maps of route networks. Furthermore, automated DRT applies real-time traffic information in ve...

  17. DATA FOR ENVIRONMENTAL MODELING (D4EM): BACKGROUND AND EXAMPLE APPLICATIONS OF DATA AUTOMATION

    Science.gov (United States)

    Data is a basic requirement for most modeling applications. Collecting data is expensive and time consuming. High speed internet connections and growing databases of online environmental data go a long way to overcoming issues of data scarcity. Among the obstacles still remaining...

  18. Natural Environment Modeling and Fault-Diagnosis for Automated Agricultural Vehicle

    DEFF Research Database (Denmark)

    Blas, Morten Rufus; Blanke, Mogens

    2008-01-01

    This paper presents results for an automatic navigation system for agricultural vehicles. The system uses stereo-vision, inertial sensors and GPS. Special emphasis has been placed on modeling the natural environment in conjunction with a fault-tolerant navigation system. The results are exemplified...

  19. Meaning-Based Scoring: A Systemic Functional Linguistics Model for Automated Test Tasks

    Science.gov (United States)

    Gleason, Jesse

    2014-01-01

    Communicative approaches to language teaching that emphasize the importance of speaking (e.g., task-based language teaching) require innovative and evidence-based means of assessing oral language. Nonetheless, research has yet to produce an adequate assessment model for oral language (Chun 2006; Downey et al. 2008). Limited by automatic speech…

  20. Automated Voxel Model from Point Clouds for Structural Analysis of Cultural Heritage

    Science.gov (United States)

    Bitelli, G.; Castellazzi, G.; D'Altri, A. M.; De Miranda, S.; Lambertini, A.; Selvaggi, I.

    2016-06-01

    In the context of cultural heritage, an accurate and comprehensive digital survey of a historical building is today essential in order to measure its geometry in detail for documentation or restoration purposes, for supporting special studies regarding materials and constructive characteristics, and finally for structural analysis. Some proven geomatic techniques, such as photogrammetry and terrestrial laser scanning, are increasingly used to survey buildings of different complexity and dimensions; one typical product is in the form of point clouds. We developed a semi-automatic procedure to convert point clouds, acquired by laser scanning or digital photogrammetry, to a filled volume model of the whole structure. The filled volume model, in a voxel format, can be useful for further analysis and also for the generation of a Finite Element Model (FEM) of the surveyed building. In this paper a new approach is presented with the aim of decreasing operator intervention in the workflow and obtaining a better description of the structure. In order to achieve this result, a voxel model with variable resolution is produced. Different parameters are compared, and different steps of the procedure are tested and validated in the case study of the North tower of the San Felice sul Panaro Fortress, a monumental historical building located in San Felice sul Panaro (Modena, Italy) that was hit by an earthquake in 2012.
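
    For illustration, a minimal occupancy-voxelization sketch of the point-cloud-to-voxel conversion described (interior filling and the variable-resolution logic of the actual procedure are omitted; names and the 0.25 m cell size are illustrative):

      # Sketch: bin survey points into a boolean occupancy voxel grid.
      import numpy as np

      def voxelize(points, voxel_size):
          """points: (N, 3) array of coordinates -> (grid, grid origin)."""
          mins = points.min(axis=0)
          idx = np.floor((points - mins) / voxel_size).astype(int)
          grid = np.zeros(idx.max(axis=0) + 1, dtype=bool)
          grid[idx[:, 0], idx[:, 1], idx[:, 2]] = True
          return grid, mins

      pts = np.random.default_rng(1).uniform(0, 10, size=(100000, 3))  # synthetic cloud
      grid, origin = voxelize(pts, 0.25)   # rerun with another size to vary resolution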

  1. Model Behavior Analysis of Stock Market Indicators and Listed Companies: Evidence from the Ghana Stock Exchange: Automated versus Floor Trading

    Directory of Open Access Journals (Sweden)

    Gladys A. A. Nabieu

    2014-10-01

    This article studies the model behavior of stock market indicators and listed companies on the Ghana Stock Exchange (GSE) over a five-year period. The Ghana Stock Exchange transferred from floor trading to automation in June 2009. The GSE operates an up-to-date market for recurrently traded securities and an auction call for rarely traded securities. Data for this study were extracted from the GSE's profile of listed companies (fact book) for the period under investigation. The study results show significant progress in stock market indicators within 2007 to 2011 in dividend yield, volume traded, share price, market capitalization and market returns following the automation in mid-2009; the equity premium, however, decreased, and no significant effects on liquidity were detected. The study further revealed that the introduction of the electronic trading system has significantly increased the tempo of trading activities on the Ghanaian stock market. The study recommends that the GSE accommodate all quoted securities, as this will promote and improve fund raising for investors.

  2. Simulation of Regional Ecological Land Based on a Cellular Automaton Model: A Case Study of Beijing, China

    Directory of Open Access Journals (Sweden)

    Xiubin Li

    2012-08-01

    Ecological land is like the "liver" of a city and is very beneficial to public health. Ecological land change is a spatially dynamic, non-linear process driven by the interaction between natural and anthropogenic factors at different scales. In this study, by setting up a natural development scenario, an object orientation scenario and an ecosystem priority scenario, a Cellular Automaton (CA) model was established to simulate the evolution pattern of ecological land in Beijing in the year 2020. Under the natural development scenario, most ecological land will be replaced by construction land and crop land. Under the object orientation and ecosystem priority scenarios, however, the ecological land area will increase, especially under the ecosystem priority scenario. When considering factors such as the total area of ecological land, the loss of key ecological land and the spatial patterns of land use, the scenarios rank, from best to worst: ecosystem priority, object orientation and natural development. Future land management policies in Beijing should therefore focus on conversion of cropland to forest, wetland protection, and prohibition of the exploitation of natural protection zones, water-source areas and forest parks, in order to maintain the safety of the regional ecosystem.
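
    For illustration, a minimal probabilistic cellular-automaton step of the general kind used in such simulations; the land-use codes, conversion rule and coefficients are made up, not the paper's calibrated transition rules:

      # Sketch: CA step where development pressure grows with developed neighbours.
      import numpy as np

      def ca_step(grid, p_dev=0.02, protect=None, rng=None):
          """grid codes: 0 = cropland, 1 = construction, 2 = ecological land."""
          rng = np.random.default_rng() if rng is None else rng
          developed = (grid == 1).astype(int)
          # developed cells in the 3x3 Moore neighbourhood (periodic borders)
          n = sum(np.roll(np.roll(developed, i, 0), j, 1)
                  for i in (-1, 0, 1) for j in (-1, 0, 1)) - developed
          convert = (rng.random(grid.shape) < p_dev * n / 8.0) & (grid != 1)
          if protect is not None:              # ecosystem-priority scenario:
              convert &= ~protect              # protected cells never convert
          out = grid.copy()
          out[convert] = 1
          return out

      rng = np.random.default_rng(2)
      grid = rng.integers(0, 3, size=(200, 200))
      protected = grid == 2                    # e.g. reserves and water-source areas
      for _ in range(10):                      # ten annual update steps
          grid = ca_step(grid, protect=protected, rng=rng)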

  3. An automated nowcasting model of significant instability events in the flight terminal area of Rio de Janeiro, Brazil

    Science.gov (United States)

    Borges França, Gutemberg; Valdonel de Almeida, Manoel; Rosette, Alessana C.

    2016-05-01

    This paper presents a novel model, based on neural network techniques, to produce short-term and location-specific forecasts of significant instability for flights in the terminal area of Galeão Airport, Rio de Janeiro, Brazil. Twelve years of data were used for neural network training/validation and testing. The data come from four sources: (1) hourly meteorological observations from surface meteorological stations at five airports distributed around the study area; (2) atmospheric profiles collected twice a day at the meteorological station at Galeão Airport; (3) rain rate data collected from a network of 29 rain gauges in the study area; and (4) lightning data regularly collected by national detection networks. An investigation was undertaken regarding the capability of a neural network to produce early warning signs - that is, to act as a nowcasting tool - for significant instability events in the study area. The automated nowcasting model was tested using five categorical statistics, whose values for forecasts of the first, second, and third hours, respectively, are given in parentheses: proportion correct (0.99, 0.97, and 0.94), BIAS (1.10, 1.42, and 2.31), probability of detection (0.79, 0.78, and 0.67), false-alarm ratio (0.28, 0.45, and 0.73), and threat score (0.61, 0.47, and 0.25). Possible sources of error related to the test procedure are presented and discussed. The test showed that the proposed model (or neural network) can capture the physical content inside the data set, and its performance is quite encouraging for nowcasting significant instability events in the study area in the first and second hours.
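
    For illustration, the five categorical scores above as computed from a 2x2 contingency table; the counts below are invented so that they roughly reproduce the quoted first-hour values:

      # Sketch: standard categorical verification scores for event forecasts.
      def categorical_scores(hits, misses, false_alarms, correct_negatives):
          total = hits + misses + false_alarms + correct_negatives
          return {
              "proportion_correct": (hits + correct_negatives) / total,
              "bias": (hits + false_alarms) / (hits + misses),
              "probability_of_detection": hits / (hits + misses),
              "false_alarm_ratio": false_alarms / (hits + false_alarms),
              "threat_score": hits / (hits + misses + false_alarms),
          }

      # illustrative counts: POD 0.79, FAR 0.28, BIAS 1.10, TS 0.60, PC 0.99
      print(categorical_scores(hits=79, misses=21, false_alarms=31,
                               correct_negatives=9869))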

  4. Automation strategies in five domains - A comparison of levels of automation, function allocation and visualisation of automatic functions

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, J. (Chalmers Univ. of Technology. Division Design and Human factors. Dept. of Product and Production Development, Goeteborg (Sweden))

    2011-01-15

    This study was conducted as a field study in which control room operators and engineers from the refinery, heat and power, aviation, shipping and nuclear domains were interviewed regarding the use of automation and the visualisation of automatic functions. The purpose of the study was to collect experiences and best practices from the five studied domains on levels of automation, function allocation and visualisation of automatic functions. In total, nine different control room settings were visited. The studied settings were compared using a systemic approach based on a human-machine systems model. The results show that the 'left over principle' is still the most commonly applied approach to function allocation, but in high-risk settings the decision whether to automate or not is more carefully considered. Regarding the visualisation of automatic functions, it was found that as long as each display type (process based, function oriented, situation oriented and task based) is applied so that it corresponds to the same level of abstraction as the technical system, the operator's mental model will be supported. No single display type can, however, readily match all levels of abstraction at the same time - all display types are still needed and serve different purposes. (Author)

  6. cmpXLatt: Westinghouse automated testing tool for nodal cross section models

    International Nuclear Information System (INIS)

    The procedure for evaluating the merits of different nodal cross section representation models is normally both cumbersome and time-consuming, and includes many manual steps when preparing appropriate benchmark problems. Therefore, a computer tool called cmpXLatt has been developed at Westinghouse in order to facilitate the process of performing comparisons between nodal diffusion theory results and corresponding transport theory results on a single-node basis. Due to the large number of state points that can be evaluated by cmpXLatt, a systematic and comprehensive way of performing verification and validation of nodal cross section models is provided. This paper presents the main features of cmpXLatt and demonstrates the benefits of using cmpXLatt in a real-life application. (author)

  7. Multi-Instance Learning Models for Automated Support of Analysts in Simulated Surveillance Environments

    Science.gov (United States)

    Birisan, Mihnea; Beling, Peter

    2011-01-01

    New generations of surveillance drones are being outfitted with numerous high definition cameras. The rapid proliferation of fielded sensors and supporting capacity for processing and displaying data will translate into ever more capable platforms, but with increased capability comes increased complexity and scale that may diminish the usefulness of such platforms to human operators. We investigate methods for alleviating strain on analysts by automatically retrieving content specific to their current task using a machine learning technique known as Multi-Instance Learning (MIL). We use MIL to create a real time model of the analysts' task and subsequently use the model to dynamically retrieve relevant content. This paper presents results from a pilot experiment in which a computer agent is assigned analyst tasks such as identifying caravanning vehicles in a simulated vehicle traffic environment. We compare agent performance between MIL aided trials and unaided trials.
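
    For illustration, a common MIL baseline (not necessarily the authors' algorithm): embed each bag of instances as the per-feature mean and max, then train an ordinary classifier on the bag embeddings:

      # Sketch: bag-level embedding reduction for multi-instance learning.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      def embed_bag(instances):
          """instances: (n_instances, n_features) -> one fixed-length vector."""
          return np.concatenate([instances.mean(axis=0), instances.max(axis=0)])

      def fit_mil(bags, labels):               # bags: list of instance arrays
          X = np.stack([embed_bag(b) for b in bags])
          return LogisticRegression(max_iter=1000).fit(X, labels)

      def relevance(model, bag):               # probability the bag matches the task
          return model.predict_proba(embed_bag(bag)[None, :])[0, 1]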

  8. Automated Object-Oriented Simulation Framework for Modelling of Superconducting Magnets at CERN

    CERN Document Server

    Maciejewski, Michał; Bartoszewicz, Andrzej

    The thesis aims at designing a flexible, extensible, user-friendly interface to model electro-thermal transients occurring in superconducting magnets. Simulations are a fundamental tool for assessing the performance of a magnet and its protection system against the effects of a quench. The application is created using a scalable and modular architecture based on the object-oriented programming paradigm, which opens an easy way for future extensions. Moreover, each model, composed of thousands of blocks, is automatically created in MATLAB/Simulink, and the user is able to automatically run sets of simulations with varying parameters. Due to its scalability and modularity, the framework can be easily used to simulate a wide range of materials and magnet configurations.

  9. An Automated BIM Model to Conceptually Design, Analyze, Simulate, and Assess Sustainable Building Projects

    OpenAIRE

    Farzad Jalaei; Ahmad Jrade

    2014-01-01

    Quantifying the environmental impacts and simulating the energy consumption of building’s components at the conceptual design stage are very helpful for designers needing to make decisions related to the selection of the best design alternative that would lead to a more energy efficient building. Building Information Modeling (BIM) offers designers the ability to assess different design alternatives at the conceptual stage of the project so that energy and life cycle assessment (LCA) strategi...

  10. Policy-Based Automation of Dynamic and Multipoint Virtual Private Network Simulation on OPNET Modeler

    OpenAIRE

    Ayoub BAHNASSE; Najib EL KAMOUN

    2014-01-01

    The simulation of large-scale networks is a challenging task, especially if the network to be simulated is a Dynamic Multipoint Virtual Private Network: it requires expert knowledge to properly configure its component technologies. The study of these network architectures in a real environment is almost impossible because it requires a very large number of equipment; however, this task is feasible in a simulation environment like OPNET Modeler, provided one masters both the tool and the different ...

  11. A novel automated behavioral test battery assessing cognitive rigidity in two genetic mouse models of autism.

    OpenAIRE

    Alicja Puścian; Szymon Łęski; Tomasz Górkiewicz; Ksenia Meyza; Hans-Peter Lipp; Ewelina Anna Knapska

    2014-01-01

    Repetitive behaviors are a key feature of many pervasive developmental disorders, such as autism. As a heterogeneous group of symptoms, repetitive behaviors are conceptualized into two main subgroups: sensory/motor (lower-order) and cognitive rigidity (higher-order). Although lower-order repetitive behaviors are measured in mouse models in several paradigms, so far there have been no high-throughput tests directly measuring cognitive rigidity. We describe a novel approach for monitoring repet...

  12. VoICE: A semi-automated pipeline for standardizing vocal analysis across models

    OpenAIRE

    Burkett, ZD; Day, NF; Peñagarikano, O; Geschwind, DH; White, SA

    2015-01-01

    The study of vocal communication in animal models provides key insight to the neurogenetic basis for speech and communication disorders. Current methods for vocal analysis suffer from a lack of standardization, creating ambiguity in cross-laboratory and cross-species comparisons. Here, we present VoICE (Vocal Inventory Clustering Engine), an approach to grouping vocal elements by creating a high dimensionality dataset through scoring spectral similarity between all vocalizations within a reco...

  13. Micro-simulation Modeling of Coordination of Automated Guided Vehicles at Intersection

    OpenAIRE

    Makarem, Laleh; Pham, Minh Hai; Dumont, André-Gilles; Gillet, Denis

    2012-01-01

    One of the challenging problems with autonomous vehicles is their performance at intersections. This paper shows an alternative control method for the coordination of autonomous vehicles at intersections. The proposed approach is grounded in multi-robot coordination and it also takes into account vehicle dynamics as well as realistic communication constraints. The existing concept of decentralized navigation functions is combined with a sensing model and a crossing strategy is developed. It i...

  14. STATISTICAL MODELS DECISION SUPPORT FOR INFORMATION SECURITY MANAGEMENT IN AN AUTOMATED SYSTEM

    OpenAIRE

    Stepanov V. V.; Kucher V. A.

    2015-01-01

    The article deals with mathematical models of management decision-making for selecting an option to protect the AU, based on sufficient statistical information about attacks on the AU. The amount of a priori uncertainty about the choice of protection option in GIS was described with Boltzmann's entropy. Introducing, within Shannon's definition of mutual information, the so-called context random variables allows removing the uncertainty regarding the actions of the enemy, and it e...

  15. Protection and Automation Devices Testing Using the Modeling Features of EUROSTAG

    OpenAIRE

    Sauhats, A; Utāns, A; Kucajevs, J; Pašņins, G; Antonovs, D; Bieļa-Dailidoviča, E

    2011-01-01

    Setting calculation for asynchronous-regime liquidation devices can be very difficult as a result of variable power system parameters. Therefore, the efficiency of device operation and the correctness of settings must be tested by experience in various regimes of power system operation. Signals typical of asynchronous regimes have a form that is difficult to recreate using traditional relay testing techniques, so these testing signals are obtained by modelling various regimes of power system w...

  16. Automated diagnosis of coronary artery disease based on data mining and fuzzy modeling.

    Science.gov (United States)

    Tsipouras, Markos G; Exarchos, Themis P; Fotiadis, Dimitrios I; Kotsia, Anna P; Vakalis, Konstantinos V; Naka, Katerina K; Michalis, Lampros K

    2008-07-01

    A fuzzy rule-based decision support system (DSS) is presented for the diagnosis of coronary artery disease (CAD). The system is automatically generated from an initial annotated dataset, using a four stage methodology: 1) induction of a decision tree from the data; 2) extraction of a set of rules from the decision tree, in disjunctive normal form and formulation of a crisp model; 3) transformation of the crisp set of rules into a fuzzy model; and 4) optimization of the parameters of the fuzzy model. The dataset used for the DSS generation and evaluation consists of 199 subjects, each one characterized by 19 features, including demographic and history data, as well as laboratory examinations. Tenfold cross validation is employed, and the average sensitivity and specificity obtained is 62% and 54%, respectively, using the set of rules extracted from the decision tree (first and second stages), while the average sensitivity and specificity increase to 80% and 65%, respectively, when the fuzzification and optimization stages are used. The system offers several advantages since it is automatically generated, it provides CAD diagnosis based on easily and noninvasively acquired features, and is able to provide interpretation for the decisions made. PMID:18632325
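
    For illustration, a sketch of the second stage - extracting a disjunctive-normal-form rule set from a fitted decision tree - using scikit-learn's tree internals; the features and data are synthetic stand-ins for the 19 clinical features:

      # Sketch: walk a fitted decision tree and emit one AND-rule per leaf.
      import numpy as np
      from sklearn.tree import DecisionTreeClassifier

      def extract_rules(clf, feature_names):
          t, rules = clf.tree_, []

          def walk(node, conds):
              if t.children_left[node] == -1:              # leaf node
                  rules.append((conds, int(t.value[node].argmax())))
                  return
              name, thr = feature_names[t.feature[node]], t.threshold[node]
              walk(t.children_left[node], conds + [f"{name} <= {thr:.3g}"])
              walk(t.children_right[node], conds + [f"{name} > {thr:.3g}"])

          walk(0, [])
          return rules                  # the crisp model is their disjunction

      X = np.random.default_rng(3).normal(size=(199, 3))
      y = (X[:, 0] + X[:, 2] > 0).astype(int)
      clf = DecisionTreeClassifier(max_depth=3).fit(X, y)
      for conds, label in extract_rules(clf, ["age", "bp", "chol"]):
          print(" AND ".join(conds) or "TRUE", "->", label)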

  17. An automated model for rooftop PV systems assessment in ArcGIS using LIDAR

    Directory of Open Access Journals (Sweden)

    Mesude Bayrakci Boz

    2015-08-01

    As photovoltaic (PV) systems have become less expensive, building rooftops have become attractive for local power production. Identifying rooftops suitable for solar energy systems over large geographic areas is needed for cities to obtain more accurate assessments of production potential and likely patterns of development. This paper presents a new method for extracting roof segments and locating suitable areas for PV systems using Light Detection and Ranging (LIDAR) data and building footprints. Rooftop segments are created using seven slope (tilt) classes, five aspect (azimuth) classes and 6 different building types. Moreover, direct-beam shading caused by nearby objects and the surrounding terrain is taken into account on a monthly basis. Finally, the method is implemented as an ArcGIS model in ModelBuilder and a tool is created. In order to show its validity, the method is applied to the city of Philadelphia, PA, USA, with the criteria of slope, aspect, shading and area used to locate suitable areas for PV system installation. The results show that 33.7% of the building footprint area and 48.6% of the rooftop segments identified are suitable for PV systems. Overall, this study provides a replicable model using commercial software that is capable of extracting individual roof segments with more detailed criteria across an urban area.
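
    For illustration, a minimal raster-style suitability filter combining slope, aspect and shading thresholds of the kind described; the threshold values are illustrative, not the paper's criteria:

      # Sketch: mask roof cells that pass slope/aspect/shading criteria.
      import numpy as np

      def suitable_mask(slope_deg, aspect_deg, shaded_fraction,
                        max_slope=45.0, max_shade=0.3):
          ok_slope = slope_deg <= max_slope
          # south-facing band for the northern hemisphere: 90-270 degrees
          ok_aspect = (aspect_deg >= 90.0) & (aspect_deg <= 270.0)
          return ok_slope & ok_aspect & (shaded_fraction <= max_shade)

      def usable_area(mask, cell_area_m2=1.0, min_area_m2=10.0):
          area = mask.sum() * cell_area_m2     # e.g. 1 m LIDAR grid cells
          return area if area >= min_area_m2 else 0.0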

  18. Modeling kinematic hardening in a dispersion strengthened aluminum alloy using a stochastic cellular automaton

    International Nuclear Information System (INIS)

    Full text: The Bauschinger effect refers to an observed asymmetry in the forward and reverse loading curves of a metal or an alloy. Typically, the absolute value of the yield stress in reverse loading is lower than the maximum stress imposed on the initial, forward loading. This difference arises either from the presence of a back stress or from the greater strength of obstacles opposing dislocation motion in the forward direction than in the reverse direction. Thus, the Bauschinger effect contributes to the phenomena referred to as kinematic hardening. In particular, dispersion-hardened systems containing strong, non-shearable particles that obstruct dislocation motion will often exhibit a large kinematic hardening component. When a material is described as a group of parallel elements (a composite) having variable yield stresses and/or Young's moduli, kinematic hardening of type KI is observed when the first element to yield on forward loading is the first element to yield on reverse loading. Kinematic hardening types KII and KIII result when the order of relaxation of the elements differs from the order of their initial yielding; the reverse loading curves for types KII and KIII generally exhibit inflection points at the initiation of yielding on reverse loading. In previous work, a micromechanics model with a detailed description of the microstructure was employed to model the effects of plastic inhomogeneity and duplicate the loading and unloading trends observed experimentally. Trends predicted by the model corresponded well to some of the expectations derived from observation, e.g. the effects related to the inclusion of different phases; however, in some cases the correlation depended on the assignment of unrealistic properties to microstructural constituents. In the current work, the material is modeled as an array of coupled elements with varying stiffnesses and strengths. A stochastic cellular automaton is then used to simulate the

  19. An automated approach to design of solid rockets utilizing a special internal ballistics model

    Science.gov (United States)

    Sforzini, R. H.

    1980-01-01

    A pattern search technique is presented, which is utilized in a computer program that minimizes the sum of the squares of the differences, at various times, between a desired thrust-time trace and that calculated with a special mathematical internal ballistics model of a solid propellant rocket motor. The program is demonstrated by matching the thrust-time trace obtained from static tests of the first Space Shuttle SRM starting with input values of 10 variables which are, in general, 10% different from the as-built SRM. It is concluded that an excellent match is obtained.
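
    For illustration, a Hooke-Jeeves-style pattern search minimizing the sum of squared thrust differences; "ballistics" is a hypothetical stand-in for the special internal ballistics model, and the step-size schedule is illustrative:

      # Sketch: coordinate pattern search over the motor design variables.
      import numpy as np

      def sse(params, ballistics, t, thrust_desired):
          """Sum of squared differences between computed and desired thrust."""
          return float(np.sum((ballistics(params, t) - thrust_desired) ** 2))

      def pattern_search(x0, ballistics, t, thrust_desired, step=0.1, tol=1e-4):
          x = np.asarray(x0, dtype=float)
          f = sse(x, ballistics, t, thrust_desired)
          while step > tol:
              improved = False
              for i in range(x.size):          # probe each variable both ways
                  for d in (step, -step):
                      trial = x.copy()
                      trial[i] += d
                      f_trial = sse(trial, ballistics, t, thrust_desired)
                      if f_trial < f:
                          x, f, improved = trial, f_trial, True
              if not improved:
                  step *= 0.5                  # no better point: contract the pattern
          return x, f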

  20. Smart Frameworks and Self-Describing Models: Model Metadata for Automated Coupling of Hydrologic Process Components (Invited)

    Science.gov (United States)

    Peckham, S. D.

    2013-12-01

    Model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that allow heterogeneous sets of process models to be assembled in a plug-and-play manner to create composite "system models". These mechanisms facilitate code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers, e.g. by requiring them to provide their output in specific forms that meet the input requirements of other models. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. If the caller is a modeling framework, it can compare the answers to these queries with similar answers from other process models in a collection and then automatically call framework
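
    For illustration, the kind of standardized interface described - control functions plus self-description queries - sketched here in the spirit of CSDMS's Basic Model Interface (the method set is abbreviated, not the full specification):

      # Sketch: an interface every process model implements for the framework.
      from abc import ABC, abstractmethod

      class ModelInterface(ABC):
          # control functions: the framework drives the model's time loop
          @abstractmethod
          def initialize(self, config_file: str) -> None: ...

          @abstractmethod
          def update(self) -> None: ...        # advance state variables one step

          @abstractmethod
          def finalize(self) -> None: ...

          # description functions: the model describes itself to the caller
          @abstractmethod
          def get_input_var_names(self) -> list: ...

          @abstractmethod
          def get_output_var_names(self) -> list: ...

          @abstractmethod
          def get_var_units(self, name: str) -> str: ...

          @abstractmethod
          def get_time_step(self) -> float: ...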